Merge branch 'release/v5.0.0'

Ivan Kravets
2020-09-03 14:43:10 +03:00
136 changed files with 8716 additions and 7392 deletions


@@ -1,6 +1,6 @@
 name: Core
-on: [push]
+on: [push, pull_request]
 jobs:
 build:
@@ -8,7 +8,7 @@ jobs:
 fail-fast: false
 matrix:
 os: [ubuntu-latest, windows-latest, macos-latest]
-python-version: [2.7, 3.7]
+python-version: [2.7, 3.7, 3.8]
 runs-on: ${{ matrix.os }}
 steps:
 - uses: actions/checkout@v2
@@ -28,8 +28,9 @@ jobs:
 tox -e lint
 - name: Integration Tests
 env:
-PLATFORMIO_TEST_ACCOUNT_LOGIN: ${{ secrets.PLATFORMIO_TEST_ACCOUNT_LOGIN }}
+TEST_EMAIL_LOGIN: ${{ secrets.TEST_EMAIL_LOGIN }}
-PLATFORMIO_TEST_ACCOUNT_PASSWORD: ${{ secrets.PLATFORMIO_TEST_ACCOUNT_PASSWORD }}
+TEST_EMAIL_PASSWORD: ${{ secrets.TEST_EMAIL_PASSWORD }}
+TEST_EMAIL_IMAP_SERVER: ${{ secrets.TEST_EMAIL_IMAP_SERVER }}
 run: |
 tox -e testcore


@@ -1,6 +1,6 @@
 name: Docs
-on: [push]
+on: [push, pull_request]
 jobs:
 build:


@@ -1,6 +1,6 @@
 name: Examples
-on: [push]
+on: [push, pull_request]
 jobs:
 build:
@@ -49,6 +49,7 @@ jobs:
 if: startsWith(matrix.os, 'windows')
 env:
 PLATFORMIO_CORE_DIR: C:/pio
+PLATFORMIO_WORKSPACE_DIR: C:/pio-workspace/$PROJECT_HASH
 PIO_INSTALL_DEVPLATFORMS_IGNORE: "ststm8,infineonxmc,riscv_gap"
 run: |
 tox -e testexamples


@@ -1,3 +1,6 @@
[REPORTS]
output-format=colorized
[MESSAGES CONTROL] [MESSAGES CONTROL]
disable= disable=
bad-continuation, bad-continuation,
@@ -12,4 +15,8 @@ disable=
useless-object-inheritance, useless-object-inheritance,
useless-import-alias, useless-import-alias,
fixme, fixme,
bad-option-value bad-option-value,
; PY2 Compat
super-with-arguments,
raise-missing-from

File diff suppressed because it is too large


@@ -1,5 +1,6 @@
 lint:
-pylint --rcfile=./.pylintrc ./platformio
+pylint -j 6 --rcfile=./.pylintrc ./platformio
+pylint -j 6 --rcfile=./.pylintrc ./tests
 isort:
 isort -rc ./platformio
@@ -27,7 +28,7 @@ clean: clean-docs
 profile:
 # Usage $ > make PIOARGS="boards" profile
-python -m cProfile -o .tox/.tmp/cprofile.prof $(shell which platformio) ${PIOARGS}
+python -m cProfile -o .tox/.tmp/cprofile.prof -m platformio ${PIOARGS}
 snakeviz .tox/.tmp/cprofile.prof
 publish:


@@ -37,7 +37,9 @@ PlatformIO
 .. image:: https://raw.githubusercontent.com/platformio/platformio-web/develop/app/images/platformio-ide-laptop.png
 :target: https://platformio.org?utm_source=github&utm_medium=core
-`PlatformIO <https://platformio.org?utm_source=github&utm_medium=core>`_ a new generation ecosystem for embedded development
+`PlatformIO <https://platformio.org?utm_source=github&utm_medium=core>`_ is a professional collaborative platform for embedded development
+**A place where Developers and Teams have true Freedom! No more vendor lock-in!**
 * Open source, maximum permissive Apache 2.0 license
 * Cross-platform IDE and Unified Debugger
@@ -64,10 +66,10 @@ Instruments
 Professional
 ------------
-* `PIO Check <https://docs.platformio.org/page/plus/pio-check.html?utm_source=github&utm_medium=core>`_
+* `Debugging <https://docs.platformio.org/page/plus/debugging.html?utm_source=github&utm_medium=core>`_
-* `PIO Remote <https://docs.platformio.org/page/plus/pio-remote.html?utm_source=github&utm_medium=core>`_
+* `Unit Testing <https://docs.platformio.org/page/plus/unit-testing.html?utm_source=github&utm_medium=core>`_
-* `PIO Unified Debugger <https://docs.platformio.org/page/plus/debugging.html?utm_source=github&utm_medium=core>`_
+* `Static Code Analysis <https://docs.platformio.org/page/plus/pio-check.html?utm_source=github&utm_medium=core>`_
-* `PIO Unit Testing <https://docs.platformio.org/page/plus/unit-testing.html?utm_source=github&utm_medium=core>`_
+* `Remote Development <https://docs.platformio.org/page/plus/pio-remote.html?utm_source=github&utm_medium=core>`_
 Registry
 --------
@@ -81,6 +83,7 @@ Development Platforms
 ---------------------
 * `Aceinna IMU <https://platformio.org/platforms/aceinna_imu?utm_source=github&utm_medium=core>`_
+* `ASR Microelectronics ASR605x <https://platformio.org/platforms/asrmicro650x?utm_source=github&utm_medium=core>`_
 * `Atmel AVR <https://platformio.org/platforms/atmelavr?utm_source=github&utm_medium=core>`_
 * `Atmel SAM <https://platformio.org/platforms/atmelsam?utm_source=github&utm_medium=core>`_
 * `Espressif 32 <https://platformio.org/platforms/espressif32?utm_source=github&utm_medium=core>`_
@@ -144,7 +147,6 @@ Share minimal diagnostics and usage information to help us make PlatformIO bette
 It is enabled by default. For more information see:
 * `Telemetry Setting <https://docs.platformio.org/page/userguide/cmd_settings.html?utm_source=github&utm_medium=core#enable-telemetry>`_
-* `SSL Setting <https://docs.platformio.org/page/userguide/cmd_settings.html?utm_source=github&utm_medium=core#strict-ssl>`_
 License
 -------


Submodule docs updated: 683415246b...03a83c996f


@@ -12,18 +12,22 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
-VERSION = (4, 3, 4)
+import sys
+VERSION = (5, 0, 0)
 __version__ = ".".join([str(s) for s in VERSION])
 __title__ = "platformio"
 __description__ = (
-"A new generation ecosystem for embedded development. "
+"A professional collaborative platform for embedded development. "
 "Cross-platform IDE and Unified Debugger. "
 "Static Code Analyzer and Remote Unit Testing. "
 "Multi-platform and Multi-architecture Build System. "
 "Firmware File Explorer and Memory Inspection. "
-"Arduino, ARM mbed, Espressif (ESP8266/ESP32), STM32, PIC32, nRF51/nRF52, "
+"IoT, Arduino, CMSIS, ESP-IDF, FreeRTOS, libOpenCM3, mbedOS, Pulp OS, SPL, "
-"RISC-V, FPGA, CMSIS, SPL, AVR, Samsung ARTIK, libOpenCM3"
+"STM32Cube, Zephyr RTOS, ARM, AVR, Espressif (ESP8266/ESP32), FPGA, "
+"MCS-51 (8051), MSP430, Nordic (nRF51/nRF52), NXP i.MX RT, PIC32, RISC-V, "
+"STMicroelectronics (STM8/STM32), Teensy"
 )
 __url__ = "https://platformio.org"
@@ -33,6 +37,29 @@ __email__ = "contact@platformio.org"
 __license__ = "Apache Software License"
 __copyright__ = "Copyright 2014-present PlatformIO"
-__apiurl__ = "https://api.platformio.org"
+__accounts_api__ = "https://api.accounts.platformio.org"
-__pioaccount_api__ = "https://api.accounts.platformio.org"
+__registry_api__ = [
+"https://api.registry.platformio.org",
+"https://api.registry.ns1.platformio.org",
+]
 __pioremote_endpoint__ = "ssl:host=remote.platformio.org:port=4413"
+__default_requests_timeout__ = (10, None) # (connect, read)
+__core_packages__ = {
+"contrib-piohome": "~3.3.0",
+"contrib-pysite": "~2.%d%d.0" % (sys.version_info.major, sys.version_info.minor),
+"tool-unity": "~1.20500.0",
+"tool-scons": "~2.20501.7" if sys.version_info.major == 2 else "~4.40001.0",
+"tool-cppcheck": "~1.210.0",
+"tool-clangtidy": "~1.100000.0",
+"tool-pvs-studio": "~7.9.0",
+}
+__check_internet_hosts__ = [
+"140.82.118.3", # Github.com
+"35.231.145.151", # Gitlab.com
+"88.198.170.159", # platformio.org
+"github.com",
+"platformio.org",
+]


@@ -12,27 +12,21 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
-import codecs
+from __future__ import absolute_import
 import getpass
 import hashlib
+import json
 import os
 import platform
 import socket
 import uuid
-from os import environ, getenv, listdir, remove
 from os.path import dirname, isdir, isfile, join, realpath
-from time import time
-import requests
+from platformio import __version__, exception, fs, proc
-from platformio import __version__, exception, fs, lockfile
 from platformio.compat import WINDOWS, dump_json_to_unicode, hashlib_encode_data
-from platformio.proc import is_ci
+from platformio.package.lockfile import LockFile
-from platformio.project.helpers import (
+from platformio.project.helpers import get_default_projects_dir, get_project_core_dir
-get_default_projects_dir,
-get_project_cache_dir,
-get_project_core_dir,
-)
 def projects_dir_validate(projects_dir):
@@ -62,10 +56,9 @@ DEFAULT_SETTINGS = {
 "value": 7,
 },
 "enable_cache": {
-"description": "Enable caching for API requests and Library Manager",
+"description": "Enable caching for HTTP API requests",
 "value": True,
 },
-"strict_ssl": {"description": "Strict SSL for PlatformIO Services", "value": False},
 "enable_telemetry": {
 "description": ("Telemetry service <http://bit.ly/pio-telemetry> (Yes/No)"),
 "value": True,
@@ -75,7 +68,7 @@ DEFAULT_SETTINGS = {
 "value": False,
 },
 "projects_dir": {
-"description": "Default location for PlatformIO projects (PIO Home)",
+"description": "Default location for PlatformIO projects (PlatformIO Home)",
 "value": get_default_projects_dir(),
 "validator": projects_dir_validate,
 },
@@ -126,7 +119,7 @@ class State(object):
 def _lock_state_file(self):
 if not self.lock:
 return
-self._lockfile = lockfile.LockFile(self.path)
+self._lockfile = LockFile(self.path)
 try:
 self._lockfile.acquire()
 except IOError:
@@ -144,6 +137,9 @@ class State(object):
 def as_dict(self):
 return self._storage
+def keys(self):
+return self._storage.keys()
 def get(self, key, default=True):
 return self._storage.get(key, default)
@@ -169,146 +165,6 @@ class State(object):
 return item in self._storage
-class ContentCache(object):
-def __init__(self, cache_dir=None):
-self.cache_dir = None
-self._db_path = None
-self._lockfile = None
-self.cache_dir = cache_dir or get_project_cache_dir()
-self._db_path = join(self.cache_dir, "db.data")
-def __enter__(self):
-self.delete()
-return self
-def __exit__(self, type_, value, traceback):
-pass
-def _lock_dbindex(self):
-if not self.cache_dir:
-os.makedirs(self.cache_dir)
-self._lockfile = lockfile.LockFile(self.cache_dir)
-try:
-self._lockfile.acquire()
-except: # pylint: disable=bare-except
-return False
-return True
-def _unlock_dbindex(self):
-if self._lockfile:
-self._lockfile.release()
-return True
-def get_cache_path(self, key):
-assert "/" not in key and "\\" not in key
-key = str(key)
-assert len(key) > 3
-return join(self.cache_dir, key[-2:], key)
-@staticmethod
-def key_from_args(*args):
-h = hashlib.md5()
-for arg in args:
-if arg:
-h.update(hashlib_encode_data(arg))
-return h.hexdigest()
-def get(self, key):
-cache_path = self.get_cache_path(key)
-if not isfile(cache_path):
-return None
-with codecs.open(cache_path, "rb", encoding="utf8") as fp:
-return fp.read()
-def set(self, key, data, valid):
-if not get_setting("enable_cache"):
-return False
-cache_path = self.get_cache_path(key)
-if isfile(cache_path):
-self.delete(key)
-if not data:
-return False
-if not isdir(self.cache_dir):
-os.makedirs(self.cache_dir)
-tdmap = {"s": 1, "m": 60, "h": 3600, "d": 86400}
-assert valid.endswith(tuple(tdmap))
-expire_time = int(time() + tdmap[valid[-1]] * int(valid[:-1]))
-if not self._lock_dbindex():
-return False
-if not isdir(dirname(cache_path)):
-os.makedirs(dirname(cache_path))
-try:
-with codecs.open(cache_path, "wb", encoding="utf8") as fp:
-fp.write(data)
-with open(self._db_path, "a") as fp:
-fp.write("%s=%s\n" % (str(expire_time), cache_path))
-except UnicodeError:
-if isfile(cache_path):
-try:
-remove(cache_path)
-except OSError:
-pass
-return self._unlock_dbindex()
-def delete(self, keys=None):
-""" Keys=None, delete expired items """
-if not isfile(self._db_path):
-return None
-if not keys:
-keys = []
-if not isinstance(keys, list):
-keys = [keys]
-paths_for_delete = [self.get_cache_path(k) for k in keys]
-found = False
-newlines = []
-with open(self._db_path) as fp:
-for line in fp.readlines():
-line = line.strip()
-if "=" not in line:
-continue
-expire, path = line.split("=")
-try:
-if (
-time() < int(expire)
-and isfile(path)
-and path not in paths_for_delete
-):
-newlines.append(line)
-continue
-except ValueError:
-pass
-found = True
-if isfile(path):
-try:
-remove(path)
-if not listdir(dirname(path)):
-fs.rmtree(dirname(path))
-except OSError:
-pass
-if found and self._lock_dbindex():
-with open(self._db_path, "w") as fp:
-fp.write("\n".join(newlines) + "\n")
-self._unlock_dbindex()
-return True
-def clean(self):
-if not self.cache_dir or not isdir(self.cache_dir):
-return
-fs.rmtree(self.cache_dir)
-def clean_cache():
-with ContentCache() as cc:
-cc.clean()
 def sanitize_setting(name, value):
 if name not in DEFAULT_SETTINGS:
 raise exception.InvalidSettingName(name)
@@ -346,8 +202,8 @@ def delete_state_item(name):
 def get_setting(name):
 _env_name = "PLATFORMIO_SETTING_%s" % name.upper()
-if _env_name in environ:
+if _env_name in os.environ:
-return sanitize_setting(name, getenv(_env_name))
+return sanitize_setting(name, os.getenv(_env_name))
 with State() as state:
 if "settings" in state and name in state["settings"]:
@@ -383,31 +239,32 @@ def is_disabled_progressbar():
 return any(
 [
 get_session_var("force_option"),
-is_ci(),
+proc.is_ci(),
-getenv("PLATFORMIO_DISABLE_PROGRESSBAR") == "true",
+os.getenv("PLATFORMIO_DISABLE_PROGRESSBAR") == "true",
 ]
 )
 def get_cid():
+# pylint: disable=import-outside-toplevel
+from platformio.clients.http import fetch_remote_content
 cid = get_state_item("cid")
 if cid:
 return cid
 uid = None
-if getenv("C9_UID"):
+if os.getenv("C9_UID"):
-uid = getenv("C9_UID")
+uid = os.getenv("C9_UID")
-elif getenv("CHE_API", getenv("CHE_API_ENDPOINT")):
+elif os.getenv("CHE_API", os.getenv("CHE_API_ENDPOINT")):
 try:
-uid = (
+uid = json.loads(
-requests.get(
+fetch_remote_content(
 "{api}/user?token={token}".format(
-api=getenv("CHE_API", getenv("CHE_API_ENDPOINT")),
+api=os.getenv("CHE_API", os.getenv("CHE_API_ENDPOINT")),
-token=getenv("USER_TOKEN"),
+token=os.getenv("USER_TOKEN"),
 )
 )
-.json()
+).get("id")
-.get("id")
-)
 except: # pylint: disable=bare-except
 pass
 if not uid:
@@ -420,7 +277,11 @@ def get_cid():
 def get_user_agent():
-data = ["PlatformIO/%s" % __version__, "CI/%d" % int(is_ci())]
+data = [
+"PlatformIO/%s" % __version__,
+"CI/%d" % int(proc.is_ci()),
+"Container/%d" % int(proc.is_container()),
+]
 if get_session_var("caller_id"):
 data.append("Caller/%s" % get_session_var("caller_id"))
 if os.getenv("PLATFORMIO_IDE"):


@@ -30,7 +30,7 @@ from SCons.Script import Variables # pylint: disable=import-error
 from platformio import compat, fs
 from platformio.compat import dump_json_to_unicode
-from platformio.managers.platform import PlatformBase
+from platformio.platform.base import PlatformBase
 from platformio.proc import get_pythonexe_path
 from platformio.project.helpers import get_project_dir
@@ -55,6 +55,7 @@ DEFAULT_ENV_OPTIONS = dict(
 "c++",
 "link",
 "platformio",
+"piotarget",
 "pioplatform",
 "pioproject",
 "piomaxlen",
@@ -159,7 +160,7 @@ env.LoadPioPlatform()
 env.SConscriptChdir(0)
 env.SConsignFile(
-join("$BUILD_DIR", ".sconsign%d%d.db" % (sys.version_info[0], sys.version_info[1]))
+join("$BUILD_DIR", ".sconsign%d%d" % (sys.version_info[0], sys.version_info[1]))
 )
 for item in env.GetExtraScripts("pre"):
@@ -217,7 +218,7 @@ if "idedata" in COMMAND_LINE_TARGETS:
 click.echo(
 "\n%s\n"
 % dump_json_to_unicode(
-projenv.DumpIDEData() # pylint: disable=undefined-variable
+projenv.DumpIDEData(env) # pylint: disable=undefined-variable
 )
 )
 env.Exit(0)


@@ -20,7 +20,7 @@ from glob import glob
 from SCons.Defaults import processDefines # pylint: disable=import-error
 from platformio.compat import glob_escape
-from platformio.managers.core import get_core_package_dir
+from platformio.package.manager.core import get_core_package_dir
 from platformio.proc import exec_command, where_is_program
@@ -45,10 +45,10 @@ def _dump_includes(env):
 # includes from toolchains
 p = env.PioPlatform()
 includes["toolchain"] = []
-for name in p.get_installed_packages():
+for pkg in p.get_installed_packages():
-if p.get_package_type(name) != "toolchain":
+if p.get_package_type(pkg.metadata.name) != "toolchain":
 continue
-toolchain_dir = glob_escape(p.get_package_dir(name))
+toolchain_dir = glob_escape(pkg.path)
 toolchain_incglobs = [
 os.path.join(toolchain_dir, "*", "include", "c++", "*"),
 os.path.join(toolchain_dir, "*", "include", "c++", "*", "*-*-*"),
@@ -143,7 +143,8 @@ def _escape_build_flag(flags):
 return [flag if " " not in flag else '"%s"' % flag for flag in flags]
-def DumpIDEData(env):
+def DumpIDEData(env, globalenv):
+""" env here is `projenv`"""
 env["__escape_build_flag"] = _escape_build_flag
@@ -169,6 +170,7 @@ def DumpIDEData(env):
 ],
 "svd_path": _get_svd_path(env),
 "compiler_type": env.GetCompilerType(),
+"targets": globalenv.DumpTargets(),
 }
 env_ = env.Clone()


@@ -12,7 +12,7 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
-# pylint: disable=no-member, no-self-use, unused-argument, too-many-lines
+# pylint: disable=no-self-use, unused-argument, too-many-lines
 # pylint: disable=too-many-instance-attributes, too-many-public-methods
 # pylint: disable=assignment-from-no-return
@@ -23,7 +23,6 @@ import io
 import os
 import re
 import sys
-from os.path import basename, commonprefix, isdir, isfile, join, realpath, sep
 import click
 import SCons.Scanner # pylint: disable=import-error
@@ -33,12 +32,15 @@ from SCons.Script import DefaultEnvironment # pylint: disable=import-error
 from platformio import exception, fs, util
 from platformio.builder.tools import platformio as piotool
+from platformio.clients.http import InternetIsOffline
 from platformio.compat import WINDOWS, hashlib_encode_data, string_types
-from platformio.managers.lib import LibraryManager
+from platformio.package.exception import UnknownPackageError
+from platformio.package.manager.library import LibraryPackageManager
 from platformio.package.manifest.parser import (
 ManifestParserError,
 ManifestParserFactory,
 )
+from platformio.package.meta import PackageItem
 from platformio.project.options import ProjectOptions
@@ -46,7 +48,7 @@ class LibBuilderFactory(object):
 @staticmethod
 def new(env, path, verbose=int(ARGUMENTS.get("PIOVERBOSE", 0))):
 clsname = "UnknownLibBuilder"
-if isfile(join(path, "library.json")):
+if os.path.isfile(os.path.join(path, "library.json")):
 clsname = "PlatformIOLibBuilder"
 else:
 used_frameworks = LibBuilderFactory.get_used_frameworks(env, path)
@@ -63,12 +65,12 @@ class LibBuilderFactory(object):
 @staticmethod
 def get_used_frameworks(env, path):
 if any(
-isfile(join(path, fname))
+os.path.isfile(os.path.join(path, fname))
 for fname in ("library.properties", "keywords.txt")
 ):
 return ["arduino"]
-if isfile(join(path, "module.json")):
+if os.path.isfile(os.path.join(path, "module.json")):
 return ["mbed"]
 include_re = re.compile(
@@ -84,7 +86,7 @@ class LibBuilderFactory(object):
 fname, piotool.SRC_BUILD_EXT + piotool.SRC_HEADER_EXT
 ):
 continue
-with io.open(join(root, fname), errors="ignore") as fp:
+with io.open(os.path.join(root, fname), errors="ignore") as fp:
 content = fp.read()
 if not content:
 continue
@@ -111,7 +113,7 @@ class LibBuilderBase(object):
 def __init__(self, env, path, manifest=None, verbose=False):
 self.env = env.Clone()
 self.envorigin = env.Clone()
-self.path = realpath(env.subst(path))
+self.path = os.path.realpath(env.subst(path))
 self.verbose = verbose
 try:
@@ -145,11 +147,11 @@ class LibBuilderBase(object):
 p2 = p2.lower()
 if p1 == p2:
 return True
-return commonprefix((p1 + sep, p2)) == p1 + sep
+return os.path.commonprefix((p1 + os.path.sep, p2)) == p1 + os.path.sep
 @property
 def name(self):
-return self._manifest.get("name", basename(self.path))
+return self._manifest.get("name", os.path.basename(self.path))
 @property
 def version(self):
@@ -170,13 +172,19 @@ class LibBuilderBase(object):
 @property
 def include_dir(self):
-if not all(isdir(join(self.path, d)) for d in ("include", "src")):
+if not all(
+os.path.isdir(os.path.join(self.path, d)) for d in ("include", "src")
+):
 return None
-return join(self.path, "include")
+return os.path.join(self.path, "include")
 @property
 def src_dir(self):
-return join(self.path, "src") if isdir(join(self.path, "src")) else self.path
+return (
+os.path.join(self.path, "src")
+if os.path.isdir(os.path.join(self.path, "src"))
+else self.path
+)
 def get_include_dirs(self):
 items = []
@@ -189,7 +197,9 @@ class LibBuilderBase(object):
 @property
 def build_dir(self):
 lib_hash = hashlib.sha1(hashlib_encode_data(self.path)).hexdigest()[:3]
-return join("$BUILD_DIR", "lib%s" % lib_hash, basename(self.path))
+return os.path.join(
+"$BUILD_DIR", "lib%s" % lib_hash, os.path.basename(self.path)
+)
 @property
 def build_flags(self):
@@ -268,7 +278,7 @@ class LibBuilderBase(object):
 if self.extra_script:
 self.env.SConscriptChdir(1)
 self.env.SConscript(
-realpath(self.extra_script),
+os.path.realpath(self.extra_script),
 exports={"env": self.env, "pio_lib_builder": self},
 )
 self.env.ProcessUnFlags(self.build_unflags)
@@ -294,14 +304,14 @@ class LibBuilderBase(object):
 def get_search_files(self):
 items = [
-join(self.src_dir, item)
+os.path.join(self.src_dir, item)
 for item in self.env.MatchSourceFiles(self.src_dir, self.src_filter)
 ]
 include_dir = self.include_dir
 if include_dir:
 items.extend(
 [
-join(include_dir, item)
+os.path.join(include_dir, item)
 for item in self.env.MatchSourceFiles(include_dir)
 ]
 )
@@ -370,7 +380,7 @@ class LibBuilderBase(object):
 continue
 _f_part = _h_path[: _h_path.rindex(".")]
 for ext in piotool.SRC_C_EXT + piotool.SRC_CXX_EXT:
-if not isfile("%s.%s" % (_f_part, ext)):
+if not os.path.isfile("%s.%s" % (_f_part, ext)):
 continue
 _c_path = self.env.File("%s.%s" % (_f_part, ext))
 if _c_path not in result:
@@ -464,23 +474,24 @@ class UnknownLibBuilder(LibBuilderBase):
 class ArduinoLibBuilder(LibBuilderBase):
 def load_manifest(self):
-manifest_path = join(self.path, "library.properties")
+manifest_path = os.path.join(self.path, "library.properties")
-if not isfile(manifest_path):
+if not os.path.isfile(manifest_path):
 return {}
 return ManifestParserFactory.new_from_file(manifest_path).as_dict()
 def get_include_dirs(self):
 include_dirs = LibBuilderBase.get_include_dirs(self)
-if isdir(join(self.path, "src")):
+if os.path.isdir(os.path.join(self.path, "src")):
 return include_dirs
-if isdir(join(self.path, "utility")):
+if os.path.isdir(os.path.join(self.path, "utility")):
-include_dirs.append(join(self.path, "utility"))
+include_dirs.append(os.path.join(self.path, "utility"))
 return include_dirs
 @property
 def src_filter(self):
-src_dir = join(self.path, "src")
+src_dir = os.path.join(self.path, "src")
-if isdir(src_dir):
+if os.path.isdir(src_dir):
+# pylint: disable=no-member
 src_filter = LibBuilderBase.src_filter.fget(self)
 for root, _, files in os.walk(src_dir, followlinks=True):
 found = False
@@ -491,50 +502,68 @@ class ArduinoLibBuilder(LibBuilderBase):
 if not found:
 continue
 rel_path = root.replace(src_dir, "")
-if rel_path.startswith(sep):
+if rel_path.startswith(os.path.sep):
-rel_path = rel_path[1:] + sep
+rel_path = rel_path[1:] + os.path.sep
 src_filter.append("-<%s*.[aA][sS][mM]>" % rel_path)
 return src_filter
 src_filter = []
-is_utility = isdir(join(self.path, "utility"))
+is_utility = os.path.isdir(os.path.join(self.path, "utility"))
 for ext in piotool.SRC_BUILD_EXT + piotool.SRC_HEADER_EXT:
 # arduino ide ignores files with .asm or .ASM extensions
 if ext.lower() == "asm":
 continue
 src_filter.append("+<*.%s>" % ext)
 if is_utility:
-src_filter.append("+<utility%s*.%s>" % (sep, ext))
+src_filter.append("+<utility%s*.%s>" % (os.path.sep, ext))
 return src_filter
+@property
+def dependencies(self):
+# do not include automatically all libraries for build
+# chain+ will decide later
+return None
+@property
+def lib_ldf_mode(self):
+# pylint: disable=no-member
+if not self._manifest.get("dependencies"):
+return LibBuilderBase.lib_ldf_mode.fget(self)
+missing = object()
+global_value = self.env.GetProjectConfig().getraw(
+"env:" + self.env["PIOENV"], "lib_ldf_mode", missing
+)
+if global_value != missing:
+return LibBuilderBase.lib_ldf_mode.fget(self)
+# automatically enable C++ Preprocessing in runtime
+# (Arduino IDE has this behavior)
+return "chain+"
 def is_frameworks_compatible(self, frameworks):
 return util.items_in_list(frameworks, ["arduino", "energia"])
 def is_platforms_compatible(self, platforms):
-items = self._manifest.get("platforms", [])
+return util.items_in_list(platforms, self._manifest.get("platforms") or ["*"])
-if not items:
-return LibBuilderBase.is_platforms_compatible(self, platforms)
-return util.items_in_list(platforms, items)
 class MbedLibBuilder(LibBuilderBase):
 def load_manifest(self):
-manifest_path = join(self.path, "module.json")
+manifest_path = os.path.join(self.path, "module.json")
-if not isfile(manifest_path):
+if not os.path.isfile(manifest_path):
 return {}
 return ManifestParserFactory.new_from_file(manifest_path).as_dict()
 @property
 def include_dir(self):
-if isdir(join(self.path, "include")):
+if os.path.isdir(os.path.join(self.path, "include")):
-return join(self.path, "include")
+return os.path.join(self.path, "include")
 return None
 @property
 def src_dir(self):
-if isdir(join(self.path, "source")):
+if os.path.isdir(os.path.join(self.path, "source")):
-return join(self.path, "source")
+return os.path.join(self.path, "source")
-return LibBuilderBase.src_dir.fget(self)
+return LibBuilderBase.src_dir.fget(self) # pylint: disable=no-member
 def get_include_dirs(self):
 include_dirs = LibBuilderBase.get_include_dirs(self)
@@ -543,13 +572,13 @@ class MbedLibBuilder(LibBuilderBase):
 # library with module.json
 for p in self._manifest.get("extraIncludes", []):
-include_dirs.append(join(self.path, p))
+include_dirs.append(os.path.join(self.path, p))
 # old mbed library without manifest, add to CPPPATH all folders
 if not self._manifest:
 for root, _, __ in os.walk(self.path):
 part = root.replace(self.path, "").lower()
-if any(s in part for s in ("%s." % sep, "test", "example")):
+if any(s in part for s in ("%s." % os.path.sep, "test", "example")):
 continue
 if root not in include_dirs:
 include_dirs.append(root)
@@ -565,7 +594,7 @@ class MbedLibBuilder(LibBuilderBase):
 def _process_mbed_lib_confs(self):
 mbed_lib_paths = [
-join(root, "mbed_lib.json")
+os.path.join(root, "mbed_lib.json")
 for root, _, files in os.walk(self.path)
 if "mbed_lib.json" in files
 ]
@@ -574,8 +603,8 @@ class MbedLibBuilder(LibBuilderBase):
 mbed_config_path = None
 for p in self.env.get("CPPPATH"):
-mbed_config_path = join(self.env.subst(p), "mbed_config.h")
+mbed_config_path = os.path.join(self.env.subst(p), "mbed_config.h")
-if isfile(mbed_config_path):
+if os.path.isfile(mbed_config_path):
 break
 mbed_config_path = None
 if not mbed_config_path:
@@ -667,30 +696,31 @@ class MbedLibBuilder(LibBuilderBase):
 class PlatformIOLibBuilder(LibBuilderBase):
 def load_manifest(self):
-manifest_path = join(self.path, "library.json")
+manifest_path = os.path.join(self.path, "library.json")
-if not isfile(manifest_path):
+if not os.path.isfile(manifest_path):
 return {}
 return ManifestParserFactory.new_from_file(manifest_path).as_dict()
 def _has_arduino_manifest(self):
-return isfile(join(self.path, "library.properties"))
+return os.path.isfile(os.path.join(self.path, "library.properties"))
 @property
 def include_dir(self):
 if "includeDir" in self._manifest.get("build", {}):
 with fs.cd(self.path):
-return realpath(self._manifest.get("build").get("includeDir"))
+return os.path.realpath(self._manifest.get("build").get("includeDir"))
-return LibBuilderBase.include_dir.fget(self)
+return LibBuilderBase.include_dir.fget(self) # pylint: disable=no-member
 @property
 def src_dir(self):
 if "srcDir" in self._manifest.get("build", {}):
 with fs.cd(self.path):
-return realpath(self._manifest.get("build").get("srcDir"))
+return os.path.realpath(self._manifest.get("build").get("srcDir"))
-return LibBuilderBase.src_dir.fget(self)
+return LibBuilderBase.src_dir.fget(self) # pylint: disable=no-member
 @property
 def src_filter(self):
+# pylint: disable=no-member
 if "srcFilter" in self._manifest.get("build", {}):
 return self._manifest.get("build").get("srcFilter")
 if self.env["SRC_FILTER"]:
@@ -703,19 +733,19 @@ class PlatformIOLibBuilder(LibBuilderBase):
 def build_flags(self):
 if "flags" in self._manifest.get("build", {}):
 return self._manifest.get("build").get("flags")
-return LibBuilderBase.build_flags.fget(self)
+return LibBuilderBase.build_flags.fget(self) # pylint: disable=no-member
 @property
 def build_unflags(self):
 if "unflags" in self._manifest.get("build", {}):
 return self._manifest.get("build").get("unflags")
-return LibBuilderBase.build_unflags.fget(self)
+return LibBuilderBase.build_unflags.fget(self) # pylint: disable=no-member
 @property
 def extra_script(self):
 if "extraScript" in self._manifest.get("build", {}):
 return self._manifest.get("build").get("extraScript")
-return LibBuilderBase.extra_script.fget(self)
+return LibBuilderBase.extra_script.fget(self) # pylint: disable=no-member
 @property
 def lib_archive(self):
@@ -727,12 +757,14 @@ class PlatformIOLibBuilder(LibBuilderBase):
 return self.env.GetProjectConfig().get(
 "env:" + self.env["PIOENV"], "lib_archive"
 )
+# pylint: disable=no-member
 return self._manifest.get("build", {}).get(
 "libArchive", LibBuilderBase.lib_archive.fget(self)
 )
 @property
 def lib_ldf_mode(self):
+# pylint: disable=no-member
 return self.validate_ldf_mode(
 self._manifest.get("build", {}).get(
 "libLDFMode", LibBuilderBase.lib_ldf_mode.fget(self)
@@ -741,6 +773,7 @@ class PlatformIOLibBuilder(LibBuilderBase):
 @property
 def lib_compat_mode(self):
+# pylint: disable=no-member
 return self.validate_compat_mode(
 self._manifest.get("build", {}).get(
 "libCompatMode", LibBuilderBase.lib_compat_mode.fget(self)
@@ -748,16 +781,10 @@ class PlatformIOLibBuilder(LibBuilderBase):
 )
 def is_platforms_compatible(self, platforms):
-items = self._manifest.get("platforms")
+return util.items_in_list(platforms, self._manifest.get("platforms") or ["*"])
-if not items:
-return LibBuilderBase.is_platforms_compatible(self, platforms)
-return util.items_in_list(platforms, items)
 def is_frameworks_compatible(self, frameworks):
-items = self._manifest.get("frameworks")
+return util.items_in_list(frameworks, self._manifest.get("frameworks") or ["*"])
-if not items:
-return LibBuilderBase.is_frameworks_compatible(self, frameworks)
-return util.items_in_list(frameworks, items)
 def get_include_dirs(self):
 include_dirs = LibBuilderBase.get_include_dirs(self)
@@ -766,10 +793,10 @@ class PlatformIOLibBuilder(LibBuilderBase):
 if (
 "build" not in self._manifest
 and self._has_arduino_manifest()
-and not isdir(join(self.path, "src"))
+and not os.path.isdir(os.path.join(self.path, "src"))
-and isdir(join(self.path, "utility"))
+and os.path.isdir(os.path.join(self.path, "utility"))
 ):
-include_dirs.append(join(self.path, "utility"))
+include_dirs.append(os.path.join(self.path, "utility"))
 for path in self.env.get("CPPPATH", []):
 if path not in self.envorigin.get("CPPPATH", []):
@@ -788,7 +815,7 @@ class ProjectAsLibBuilder(LibBuilderBase):
 @property
 def include_dir(self):
 include_dir = self.env.subst("$PROJECT_INCLUDE_DIR")
-return include_dir if isdir(include_dir) else None
+return include_dir if os.path.isdir(include_dir) else None
 @property
 def src_dir(self):
@@ -797,7 +824,7 @@ class ProjectAsLibBuilder(LibBuilderBase):
 def get_include_dirs(self):
 include_dirs = []
 project_include_dir = self.env.subst("$PROJECT_INCLUDE_DIR")
-if isdir(project_include_dir):
+if os.path.isdir(project_include_dir):
 include_dirs.append(project_include_dir)
 for include_dir in LibBuilderBase.get_include_dirs(self):
 if include_dir not in include_dirs:
@@ -811,7 +838,7 @@ class ProjectAsLibBuilder(LibBuilderBase):
 if "__test" in COMMAND_LINE_TARGETS:
 items.extend(
 [
-join("$PROJECT_TEST_DIR", item)
+os.path.join("$PROJECT_TEST_DIR", item)
 for item in self.env.MatchSourceFiles(
 "$PROJECT_TEST_DIR", "$PIOTEST_SRC_FILTER"
 )
@@ -821,7 +848,7 @@ class ProjectAsLibBuilder(LibBuilderBase):
 @property
 def lib_ldf_mode(self):
-mode = LibBuilderBase.lib_ldf_mode.fget(self)
+mode = LibBuilderBase.lib_ldf_mode.fget(self) # pylint: disable=no-member
 if not mode.startswith("chain"):
 return mode
 # parse all project files
@@ -829,6 +856,7 @@ class ProjectAsLibBuilder(LibBuilderBase):
 @property
 def src_filter(self):
+# pylint: disable=no-member
 return self.env.get("SRC_FILTER") or LibBuilderBase.src_filter.fget(self)
 @property
@@ -840,34 +868,36 @@ class ProjectAsLibBuilder(LibBuilderBase):
 pass
 def install_dependencies(self):
-def _is_builtin(uri):
+def _is_builtin(spec):
 for lb in self.env.GetLibBuilders():
-if lb.name == uri:
+if lb.name == spec:
 return True
 return False
-not_found_uri = []
+not_found_specs = []
-for uri in self.dependencies:
+for spec in self.dependencies:
 # check if built-in library
-if _is_builtin(uri):
+if _is_builtin(spec):
 continue
 found = False
 for storage_dir in self.env.GetLibSourceDirs():
-lm = LibraryManager(storage_dir)
+lm = LibraryPackageManager(storage_dir)
-if lm.get_package_dir(*lm.parse_pkg_uri(uri)):
+if lm.get_package(spec):
 found = True
 break
 if not found:
-not_found_uri.append(uri)
+not_found_specs.append(spec)
 did_install = False
-lm = LibraryManager(self.env.subst(join("$PROJECT_LIBDEPS_DIR", "$PIOENV")))
+lm = LibraryPackageManager(
-for uri in not_found_uri:
+self.env.subst(os.path.join("$PROJECT_LIBDEPS_DIR", "$PIOENV"))
+)
+for spec in not_found_specs:
 try:
-lm.install(uri)
+lm.install(spec)
 did_install = True
-except (exception.LibNotFound, exception.InternetIsOffline) as e:
+except (UnknownPackageError, InternetIsOffline) as e:
 click.secho("Warning! %s" % e, fg="yellow")
 # reset cache
@@ -875,17 +905,17 @@ class ProjectAsLibBuilder(LibBuilderBase):
 DefaultEnvironment().Replace(__PIO_LIB_BUILDERS=None)
 def process_dependencies(self): # pylint: disable=too-many-branches
-for uri in self.dependencies:
+for spec in self.dependencies:
 found = False
 for storage_dir in self.env.GetLibSourceDirs():
 if found:
 break
-lm = LibraryManager(storage_dir)
+lm = LibraryPackageManager(storage_dir)
-lib_dir = lm.get_package_dir(*lm.parse_pkg_uri(uri))
+pkg = lm.get_package(spec)
-if not lib_dir:
+if not pkg:
 continue
 for lb in self.env.GetLibBuilders():
-if lib_dir != lb.path:
+if pkg.path != lb.path:
 continue
 if lb not in self.depbuilders:
 self.depend_recursive(lb)
@@ -897,7 +927,7 @@ class ProjectAsLibBuilder(LibBuilderBase):
 # look for built-in libraries by a name
 # which don't have package manifest
 for lb in self.env.GetLibBuilders():
-if lb.name != uri:
+if lb.name != spec:
 continue
 if lb not in self.depbuilders:
 self.depend_recursive(lb)
@@ -952,12 +982,12 @@ def GetLibBuilders(env): # pylint: disable=too-many-branches
 found_incompat = False
 for storage_dir in env.GetLibSourceDirs():
-storage_dir = realpath(storage_dir)
+storage_dir = os.path.realpath(storage_dir)
-if not isdir(storage_dir):
+if not os.path.isdir(storage_dir):
 continue
 for item in sorted(os.listdir(storage_dir)):
-lib_dir = join(storage_dir, item)
+lib_dir = os.path.join(storage_dir, item)
-if item == "__cores__" or not isdir(lib_dir):
+if item == "__cores__" or not os.path.isdir(lib_dir):
 continue
 try:
 lb = LibBuilderFactory.new(env, lib_dir)
@@ -989,10 +1019,6 @@ def GetLibBuilders(env): # pylint: disable=too-many-branches
 def ConfigureProjectLibBuilder(env):
-def _get_vcs_info(lb):
-path = LibraryManager.get_src_manifest_path(lb.path)
-return fs.load_json(path) if path else None
 def _correct_found_libs(lib_builders):
 # build full dependency graph
 found_lbs = [lb for lb in lib_builders if lb.dependent]
@@ -1008,15 +1034,15 @@ def ConfigureProjectLibBuilder(env):
 margin = "| " * (level)
 for lb in root.depbuilders:
 title = "<%s>" % lb.name
-vcs_info = _get_vcs_info(lb)
+pkg = PackageItem(lb.path)
-if lb.version:
+if pkg.metadata:
+title += " %s" % pkg.metadata.version
+elif lb.version:
 title += " %s" % lb.version
-if vcs_info and vcs_info.get("version"):
-title += " #%s" % vcs_info.get("version")
 click.echo("%s|-- %s" % (margin, title), nl=False)
 if int(ARGUMENTS.get("PIOVERBOSE", 0)):
-if vcs_info:
+if pkg.metadata and pkg.metadata.spec.external:
-click.echo(" [%s]" % vcs_info.get("url"), nl=False)
+click.echo(" [%s]" % pkg.metadata.spec.url, nl=False)
 click.echo(" (", nl=False)
 click.echo(lb.path, nl=False)
 click.echo(")", nl=False)
@@ -1025,7 +1051,7 @@ def ConfigureProjectLibBuilder(env):
 _print_deps_tree(lb, level + 1)
 project = ProjectAsLibBuilder(env, "$PROJECT_DIR")
-ldf_mode = LibBuilderBase.lib_ldf_mode.fget(project)
+ldf_mode = LibBuilderBase.lib_ldf_mode.fget(project) # pylint: disable=no-member
 click.echo("LDF: Library Dependency Finder -> http://bit.ly/configure-pio-ldf")
 click.echo(


@@ -16,19 +16,16 @@ from __future__ import absolute_import
 import atexit
 import io
+import os
 import re
 import sys
-from os import environ, remove, walk
-from os.path import basename, isdir, isfile, join, realpath, relpath, sep
 from tempfile import mkstemp
 import click
-from SCons.Action import Action # pylint: disable=import-error
-from SCons.Script import ARGUMENTS # pylint: disable=import-error
 from platformio import fs, util
 from platformio.compat import get_filesystem_encoding, get_locale_encoding, glob_escape
-from platformio.managers.core import get_core_package_dir
+from platformio.package.manager.core import get_core_package_dir
 from platformio.proc import exec_command
@@ -126,11 +123,11 @@ class InoToCPPConverter(object):
 '$CXX -o "{0}" -x c++ -fpreprocessed -dD -E "{1}"'.format(
 out_file, tmp_path
 ),
-"Converting " + basename(out_file[:-4]),
+"Converting " + os.path.basename(out_file[:-4]),
 )
 )
 atexit.register(_delete_file, tmp_path)
-return isfile(out_file)
+return os.path.isfile(out_file)
 def _join_multiline_strings(self, contents):
 if "\\\n" not in contents:
@@ -233,7 +230,9 @@ class InoToCPPConverter(object):
 def ConvertInoToCpp(env):
 src_dir = glob_escape(env.subst("$PROJECT_SRC_DIR"))
-ino_nodes = env.Glob(join(src_dir, "*.ino")) + env.Glob(join(src_dir, "*.pde"))
+ino_nodes = env.Glob(os.path.join(src_dir, "*.ino")) + env.Glob(
+os.path.join(src_dir, "*.pde")
+)
 if not ino_nodes:
 return
 c = InoToCPPConverter(env)
@@ -244,8 +243,8 @@ def ConvertInoToCpp(env):
 def _delete_file(path):
 try:
-if isfile(path):
+if os.path.isfile(path):
-remove(path)
+os.remove(path)
 except: # pylint: disable=bare-except
 pass
@@ -255,7 +254,7 @@ def _get_compiler_type(env):
 if env.subst("$CC").endswith("-gcc"):
 return "gcc"
 try:
-sysenv = environ.copy()
+sysenv = os.environ.copy()
 sysenv["PATH"] = str(env["ENV"]["PATH"])
 result = exec_command([env.subst("$CC"), "-v"], env=sysenv)
 except OSError:
@@ -277,8 +276,8 @@ def GetCompilerType(env):
 def GetActualLDScript(env):
 def _lookup_in_ldpath(script):
 for d in env.get("LIBPATH", []):
-path = join(env.subst(d), script)
+path = os.path.join(env.subst(d), script)
-if isfile(path):
+if os.path.isfile(path):
 return path
 return None
@@ -297,7 +296,7 @@ def GetActualLDScript(env):
 else:
 continue
 script = env.subst(raw_script.replace('"', "").strip())
-if isfile(script):
+if os.path.isfile(script):
 return script
 path = _lookup_in_ldpath(script)
 if path:
@@ -319,29 +318,6 @@ def GetActualLDScript(env):
 env.Exit(1)
-def VerboseAction(_, act, actstr):
-if int(ARGUMENTS.get("PIOVERBOSE", 0)):
-return act
-return Action(act, actstr)
-def PioClean(env, clean_dir):
-if not isdir(clean_dir):
-print("Build environment is clean")
-env.Exit(0)
-clean_rel_path = relpath(clean_dir)
-for root, _, files in walk(clean_dir):
-for f in files:
-dst = join(root, f)
-remove(dst)
-print(
-"Removed %s" % (dst if clean_rel_path.startswith(".") else relpath(dst))
-)
-print("Done cleaning")
-fs.rmtree(clean_dir)
-env.Exit(0)
 def ConfigureDebugFlags(env):
 def _cleanup_debug_flags(scope):
 if scope not in env:
@@ -370,16 +346,16 @@ def ConfigureDebugFlags(env):
 def ConfigureTestTarget(env):
 env.Append(
 CPPDEFINES=["UNIT_TEST", "UNITY_INCLUDE_CONFIG_H"],
-CPPPATH=[join("$BUILD_DIR", "UnityTestLib")],
+CPPPATH=[os.path.join("$BUILD_DIR", "UnityTestLib")],
 )
 unitylib = env.BuildLibrary(
-join("$BUILD_DIR", "UnityTestLib"), get_core_package_dir("tool-unity")
+os.path.join("$BUILD_DIR", "UnityTestLib"), get_core_package_dir("tool-unity")
 )
 env.Prepend(LIBS=[unitylib])
 src_filter = ["+<*.cpp>", "+<*.c>"]
 if "PIOTEST_RUNNING_NAME" in env:
-src_filter.append("+<%s%s>" % (env["PIOTEST_RUNNING_NAME"], sep))
+src_filter.append("+<%s%s>" % (env["PIOTEST_RUNNING_NAME"], os.path.sep))
 env.Replace(PIOTEST_SRC_FILTER=src_filter)
@@ -393,7 +369,7 @@ def GetExtraScripts(env, scope):
 if not items:
 return items
 with fs.cd(env.subst("$PROJECT_DIR")):
-return [realpath(item) for item in items]
+return [os.path.realpath(item) for item in items]
 def exists(_):
@@ -404,8 +380,6 @@ def generate(env):
 env.AddMethod(ConvertInoToCpp)
 env.AddMethod(GetCompilerType)
 env.AddMethod(GetActualLDScript)
-env.AddMethod(VerboseAction)
-env.AddMethod(PioClean)
 env.AddMethod(ConfigureDebugFlags)
 env.AddMethod(ConfigureTestTarget)
 env.AddMethod(GetExtraScripts)


@@ -14,15 +14,18 @@
 from __future__ import absolute_import
+import os
 import sys
-from os.path import isdir, isfile, join
 from SCons.Script import ARGUMENTS # pylint: disable=import-error
 from SCons.Script import COMMAND_LINE_TARGETS # pylint: disable=import-error
-from platformio import exception, fs, util
+from platformio import fs, util
 from platformio.compat import WINDOWS
-from platformio.managers.platform import PlatformFactory
+from platformio.package.meta import PackageItem
+from platformio.package.version import get_original_version
+from platformio.platform.exception import UnknownBoard
+from platformio.platform.factory import PlatformFactory
 from platformio.project.config import ProjectOptions
 # pylint: disable=too-many-branches, too-many-locals
@@ -34,7 +37,7 @@ def PioPlatform(env):
 if "framework" in variables:
 # support PIO Core 3.0 dev/platforms
 variables["pioframework"] = variables["framework"]
-p = PlatformFactory.newPlatform(env["PLATFORM_MANIFEST"])
+p = PlatformFactory.new(os.path.dirname(env["PLATFORM_MANIFEST"]))
 p.configure_default_packages(variables, COMMAND_LINE_TARGETS)
 return p
@@ -46,7 +49,7 @@ def BoardConfig(env, board=None):
 board = board or env.get("BOARD")
 assert board, "BoardConfig: Board is not defined"
 return p.board_config(board)
-except (AssertionError, exception.UnknownBoard) as e:
+except (AssertionError, UnknownBoard) as e:
 sys.stderr.write("Error: %s\n" % str(e))
 env.Exit(1)
@@ -55,37 +58,42 @@ def GetFrameworkScript(env, framework):
 p = env.PioPlatform()
 assert p.frameworks and framework in p.frameworks
 script_path = env.subst(p.frameworks[framework]["script"])
-if not isfile(script_path):
+if not os.path.isfile(script_path):
-script_path = join(p.get_dir(), script_path)
+script_path = os.path.join(p.get_dir(), script_path)
 return script_path
 def LoadPioPlatform(env):
 p = env.PioPlatform()
-installed_packages = p.get_installed_packages()
 # Ensure real platform name
 env["PIOPLATFORM"] = p.name
 # Add toolchains and uploaders to $PATH and $*_LIBRARY_PATH
 systype = util.get_systype()
-for name in installed_packages:
+for pkg in p.get_installed_packages():
-type_ = p.get_package_type(name)
+type_ = p.get_package_type(pkg.metadata.name)
 if type_ not in ("toolchain", "uploader", "debugger"):
 continue
-pkg_dir = p.get_package_dir(name)
 env.PrependENVPath(
-"PATH", join(pkg_dir, "bin") if isdir(join(pkg_dir, "bin")) else pkg_dir
+"PATH",
+os.path.join(pkg.path, "bin")
+if os.path.isdir(os.path.join(pkg.path, "bin"))
+else pkg.path,
 )
-if not WINDOWS and isdir(join(pkg_dir, "lib")) and type_ != "toolchain":
+if (
+not WINDOWS
+and os.path.isdir(os.path.join(pkg.path, "lib"))
+and type_ != "toolchain"
+):
 env.PrependENVPath(
 "DYLD_LIBRARY_PATH" if "darwin" in systype else "LD_LIBRARY_PATH",
-join(pkg_dir, "lib"),
+os.path.join(pkg.path, "lib"),
 )
 # Platform specific LD Scripts
-if isdir(join(p.get_dir(), "ldscripts")):
+if os.path.isdir(os.path.join(p.get_dir(), "ldscripts")):
-env.Prepend(LIBPATH=[join(p.get_dir(), "ldscripts")])
+env.Prepend(LIBPATH=[os.path.join(p.get_dir(), "ldscripts")])
 if "BOARD" not in env:
 return
@@ -125,6 +133,7 @@ def LoadPioPlatform(env):
 def PrintConfiguration(env): # pylint: disable=too-many-statements
 platform = env.PioPlatform()
+pkg_metadata = PackageItem(platform.get_dir()).metadata
 board_config = env.BoardConfig() if "BOARD" in env else None
 def _get_configuration_data():
@@ -139,11 +148,19 @@ def PrintConfiguration(env): # pylint: disable=too-many-statements
 )
 def _get_plaform_data():
-data = ["PLATFORM: %s %s" % (platform.title, platform.version)]
+data = [
-if platform.src_version:
+"PLATFORM: %s (%s)"
-data.append("#" + platform.src_version)
+% (
-if int(ARGUMENTS.get("PIOVERBOSE", 0)) and platform.src_url:
+platform.title,
-data.append("(%s)" % platform.src_url)
+pkg_metadata.version if pkg_metadata else platform.version,
+)
+]
+if (
+int(ARGUMENTS.get("PIOVERBOSE", 0))
+and pkg_metadata
+and pkg_metadata.spec.external
+):
+data.append("(%s)" % pkg_metadata.spec.url)
 if board_config:
 data.extend([">", board_config.get("name")])
 return data
@@ -162,7 +179,8 @@ def PrintConfiguration(env): # pylint: disable=too-many-statements
 ram = board_config.get("upload", {}).get("maximum_ram_size")
 flash = board_config.get("upload", {}).get("maximum_size")
data.append( data.append(
"%s RAM, %s Flash" % (fs.format_filesize(ram), fs.format_filesize(flash)) "%s RAM, %s Flash"
% (fs.humanize_file_size(ram), fs.humanize_file_size(flash))
) )
return data return data
@@ -194,7 +212,7 @@ def PrintConfiguration(env): # pylint: disable=too-many-statements
def _get_packages_data(): def _get_packages_data():
data = [] data = []
for item in platform.dump_used_packages(): for item in platform.dump_used_packages():
original_version = util.get_original_version(item["version"]) original_version = get_original_version(item["version"])
info = "%s %s" % (item["name"], item["version"]) info = "%s %s" % (item["name"], item["version"])
extra = [] extra = []
if original_version: if original_version:


@@ -0,0 +1,119 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from __future__ import absolute_import
import os
from SCons.Action import Action # pylint: disable=import-error
from SCons.Script import ARGUMENTS # pylint: disable=import-error
from SCons.Script import AlwaysBuild # pylint: disable=import-error
from platformio import compat, fs
def VerboseAction(_, act, actstr):
if int(ARGUMENTS.get("PIOVERBOSE", 0)):
return act
return Action(act, actstr)
def PioClean(env, clean_dir):
def _relpath(path):
if compat.WINDOWS:
prefix = os.getcwd()[:2].lower()
if (
":" not in prefix
or not path.lower().startswith(prefix)
or os.path.relpath(path).startswith("..")
):
return path
return os.path.relpath(path)
if not os.path.isdir(clean_dir):
print("Build environment is clean")
env.Exit(0)
clean_rel_path = _relpath(clean_dir)
for root, _, files in os.walk(clean_dir):
for f in files:
dst = os.path.join(root, f)
os.remove(dst)
print(
"Removed %s"
% (dst if not clean_rel_path.startswith(".") else _relpath(dst))
)
print("Done cleaning")
fs.rmtree(clean_dir)
env.Exit(0)
def AddTarget( # pylint: disable=too-many-arguments
env,
name,
dependencies,
actions,
title=None,
description=None,
group="Generic",
always_build=True,
):
if "__PIO_TARGETS" not in env:
env["__PIO_TARGETS"] = {}
assert name not in env["__PIO_TARGETS"]
env["__PIO_TARGETS"][name] = dict(
name=name, title=title, description=description, group=group
)
target = env.Alias(name, dependencies, actions)
if always_build:
AlwaysBuild(target)
return target
def AddPlatformTarget(env, *args, **kwargs):
return env.AddTarget(group="Platform", *args, **kwargs)
def AddCustomTarget(env, *args, **kwargs):
return env.AddTarget(group="Custom", *args, **kwargs)
def DumpTargets(env):
targets = env.get("__PIO_TARGETS") or {}
# pre-fill default targets if embedded dev-platform
if env.PioPlatform().is_embedded() and not any(
t["group"] == "Platform" for t in targets.values()
):
targets["upload"] = dict(name="upload", group="Platform", title="Upload")
targets["compiledb"] = dict(
name="compiledb",
title="Compilation Database",
description="Generate compilation database `compile_commands.json`",
group="Advanced",
)
targets["clean"] = dict(name="clean", title="Clean", group="Generic")
return list(targets.values())
def exists(_):
return True
def generate(env):
env.AddMethod(VerboseAction)
env.AddMethod(PioClean)
env.AddMethod(AddTarget)
env.AddMethod(AddPlatformTarget)
env.AddMethod(AddCustomTarget)
env.AddMethod(DumpTargets)
return env
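The target helpers introduced above are meant to be called from platform builder scripts or project extra scripts. A minimal sketch, assuming a standard PlatformIO extra script (the target name, command, title, and description are illustrative, not part of this commit):

Import("env")  # extra scripts run inside SCons, which provides Import()

# Register an always-built custom target; DumpTargets() will expose it
# alongside the platform and generic targets.
env.AddCustomTarget(
    name="hello",
    dependencies=None,
    actions='python -c "print(42)"',
    title="Say Hello",
    description="Illustrative target registered through AddCustomTarget",
)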


@@ -26,9 +26,9 @@ from SCons.Script import DefaultEnvironment # pylint: disable=import-error
from SCons.Script import Export # pylint: disable=import-error from SCons.Script import Export # pylint: disable=import-error
from SCons.Script import SConscript # pylint: disable=import-error from SCons.Script import SConscript # pylint: disable=import-error
from platformio import fs from platformio import __version__, fs
from platformio.compat import string_types from platformio.compat import string_types
from platformio.util import pioversion_to_intstr from platformio.package.version import pepver_to_semver
SRC_HEADER_EXT = ["h", "hpp"] SRC_HEADER_EXT = ["h", "hpp"]
SRC_ASM_EXT = ["S", "spp", "SPP", "sx", "s", "asm", "ASM"] SRC_ASM_EXT = ["S", "spp", "SPP", "sx", "s", "asm", "ASM"]
@@ -66,7 +66,11 @@ def BuildProgram(env):
env.Prepend(LINKFLAGS=["-T", env.subst("$LDSCRIPT_PATH")]) env.Prepend(LINKFLAGS=["-T", env.subst("$LDSCRIPT_PATH")])
# enable "cyclic reference" for linker # enable "cyclic reference" for linker
if env.get("LIBS") and env.GetCompilerType() == "gcc": if (
env.get("LIBS")
and env.GetCompilerType() == "gcc"
and env.PioPlatform().is_embedded()
):
env.Prepend(_LIBFLAGS="-Wl,--start-group ") env.Prepend(_LIBFLAGS="-Wl,--start-group ")
env.Append(_LIBFLAGS=" -Wl,--end-group") env.Append(_LIBFLAGS=" -Wl,--end-group")
@@ -90,11 +94,16 @@ def BuildProgram(env):
def ProcessProgramDeps(env): def ProcessProgramDeps(env):
def _append_pio_macros(): def _append_pio_macros():
core_version = pepver_to_semver(__version__)
env.AppendUnique( env.AppendUnique(
CPPDEFINES=[ CPPDEFINES=[
( (
"PLATFORMIO", "PLATFORMIO",
int("{0:02d}{1:02d}{2:02d}".format(*pioversion_to_intstr())), int(
"{0:02d}{1:02d}{2:02d}".format(
core_version.major, core_version.minor, core_version.patch
)
),
) )
] ]
) )
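For reference, the macro derivation above packs the core version into a single integer. A small worked example based only on the code in this hunk (the concrete value assumes a 5.0.0 core):

from platformio import __version__
from platformio.package.version import pepver_to_semver

core_version = pepver_to_semver(__version__)
macro = int(
    "{0:02d}{1:02d}{2:02d}".format(
        core_version.major, core_version.minor, core_version.patch
    )
)
print(macro)  # a 5.0.0 core packs into 50000, i.e. -DPLATFORMIO=50000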
@@ -282,19 +291,22 @@ def CollectBuildFiles(
if fs.path_endswith_ext(item, SRC_BUILD_EXT): if fs.path_endswith_ext(item, SRC_BUILD_EXT):
sources.append(env.File(os.path.join(_var_dir, os.path.basename(item)))) sources.append(env.File(os.path.join(_var_dir, os.path.basename(item))))
for callback, pattern in env.get("__PIO_BUILD_MIDDLEWARES", []): middlewares = env.get("__PIO_BUILD_MIDDLEWARES")
tmp = [] if not middlewares:
for node in sources:
if pattern and not fnmatch.fnmatch(node.srcnode().get_path(), pattern):
tmp.append(node)
continue
n = callback(node)
if n:
tmp.append(n)
sources = tmp
return sources return sources
new_sources = []
for node in sources:
new_node = node
for callback, pattern in middlewares:
if pattern and not fnmatch.fnmatch(node.srcnode().get_path(), pattern):
continue
new_node = callback(new_node)
if new_node:
new_sources.append(new_node)
return new_sources
def AddBuildMiddleware(env, callback, pattern=None): def AddBuildMiddleware(env, callback, pattern=None):
env.Append(__PIO_BUILD_MIDDLEWARES=[(callback, pattern)]) env.Append(__PIO_BUILD_MIDDLEWARES=[(callback, pattern)])
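The reworked middleware chain is normally fed from an extra script through env.AddBuildMiddleware(). A hedged sketch of a callback that attaches per-file flags (the extra flag and the glob pattern are illustrative):

Import("env")

def apply_per_file_flags(node):
    # Returning a new node keeps the file in the build with extra flags;
    # returning None would exclude the file from the build entirely.
    return env.Object(node, CCFLAGS=env["CCFLAGS"] + ["-Wextra"])

# The optional fnmatch pattern limits which source nodes reach the callback.
env.AddBuildMiddleware(apply_per_file_flags, "*/src/*.cpp")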

platformio/cache.py (new file, 165 lines)

@@ -0,0 +1,165 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import codecs
import hashlib
import os
from time import time
from platformio import app, fs
from platformio.compat import hashlib_encode_data
from platformio.package.lockfile import LockFile
from platformio.project.helpers import get_project_cache_dir
class ContentCache(object):
def __init__(self, namespace=None):
self.cache_dir = os.path.join(get_project_cache_dir(), namespace or "content")
self._db_path = os.path.join(self.cache_dir, "db.data")
self._lockfile = None
if not os.path.isdir(self.cache_dir):
os.makedirs(self.cache_dir)
def __enter__(self):
# cleanup obsolete items
self.delete()
return self
def __exit__(self, type_, value, traceback):
pass
@staticmethod
def key_from_args(*args):
h = hashlib.sha1()
for arg in args:
if arg:
h.update(hashlib_encode_data(arg))
return h.hexdigest()
def get_cache_path(self, key):
assert "/" not in key and "\\" not in key
key = str(key)
assert len(key) > 3
return os.path.join(self.cache_dir, key)
def get(self, key):
cache_path = self.get_cache_path(key)
if not os.path.isfile(cache_path):
return None
with codecs.open(cache_path, "rb", encoding="utf8") as fp:
return fp.read()
def set(self, key, data, valid):
if not app.get_setting("enable_cache"):
return False
cache_path = self.get_cache_path(key)
if os.path.isfile(cache_path):
self.delete(key)
if not data:
return False
tdmap = {"s": 1, "m": 60, "h": 3600, "d": 86400}
assert valid.endswith(tuple(tdmap))
expire_time = int(time() + tdmap[valid[-1]] * int(valid[:-1]))
if not self._lock_dbindex():
return False
if not os.path.isdir(os.path.dirname(cache_path)):
os.makedirs(os.path.dirname(cache_path))
try:
with codecs.open(cache_path, "wb", encoding="utf8") as fp:
fp.write(data)
with open(self._db_path, "a") as fp:
fp.write("%s=%s\n" % (str(expire_time), os.path.basename(cache_path)))
except UnicodeError:
if os.path.isfile(cache_path):
try:
os.remove(cache_path)
except OSError:
pass
return self._unlock_dbindex()
def delete(self, keys=None):
""" Keys=None, delete expired items """
if not os.path.isfile(self._db_path):
return None
if not keys:
keys = []
if not isinstance(keys, list):
keys = [keys]
paths_for_delete = [self.get_cache_path(k) for k in keys]
found = False
newlines = []
with open(self._db_path) as fp:
for line in fp.readlines():
line = line.strip()
if "=" not in line:
continue
expire, fname = line.split("=")
path = os.path.join(self.cache_dir, fname)
try:
if (
time() < int(expire)
and os.path.isfile(path)
and path not in paths_for_delete
):
newlines.append(line)
continue
except ValueError:
pass
found = True
if os.path.isfile(path):
try:
os.remove(path)
if not os.listdir(os.path.dirname(path)):
fs.rmtree(os.path.dirname(path))
except OSError:
pass
if found and self._lock_dbindex():
with open(self._db_path, "w") as fp:
fp.write("\n".join(newlines) + "\n")
self._unlock_dbindex()
return True
def clean(self):
if not os.path.isdir(self.cache_dir):
return
fs.rmtree(self.cache_dir)
def _lock_dbindex(self):
self._lockfile = LockFile(self.cache_dir)
try:
self._lockfile.acquire()
except: # pylint: disable=bare-except
return False
return True
def _unlock_dbindex(self):
if self._lockfile:
self._lockfile.release()
return True
#
# Helpers
#
def cleanup_content_cache(namespace=None):
with ContentCache(namespace) as cc:
cc.clean()
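The cache introduced here is consumed as a context manager keyed by hashed arguments. A minimal usage sketch based on the API above (the namespace and cached payload are placeholders):

from platformio.cache import ContentCache

with ContentCache("http") as cc:
    key = cc.key_from_args("get", "/v3/packages", "query=esp32")
    data = cc.get(key)
    if data is None:
        data = '{"items": []}'          # placeholder for a real network response
        cc.set(key, data, valid="1h")   # TTL suffixes: "s", "m", "h", "d"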


@@ -0,0 +1,290 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
import time
from platformio import __accounts_api__, app
from platformio.clients.http import HTTPClient
from platformio.exception import PlatformioException
class AccountError(PlatformioException):
MESSAGE = "{0}"
class AccountNotAuthorized(AccountError):
MESSAGE = "You are not authorized! Please log in to PlatformIO Account."
class AccountAlreadyAuthorized(AccountError):
MESSAGE = "You are already authorized with {0} account."
class AccountClient(HTTPClient): # pylint:disable=too-many-public-methods
SUMMARY_CACHE_TTL = 60 * 60 * 24 * 7
def __init__(self):
super(AccountClient, self).__init__(__accounts_api__)
@staticmethod
def get_refresh_token():
try:
return app.get_state_item("account").get("auth").get("refresh_token")
except: # pylint:disable=bare-except
raise AccountNotAuthorized()
@staticmethod
def delete_local_session():
app.delete_state_item("account")
@staticmethod
def delete_local_state(key):
account = app.get_state_item("account")
if not account or key not in account:
return
del account[key]
app.set_state_item("account", account)
def send_auth_request(self, *args, **kwargs):
headers = kwargs.get("headers", {})
if "Authorization" not in headers:
token = self.fetch_authentication_token()
headers["Authorization"] = "Bearer %s" % token
kwargs["headers"] = headers
return self.fetch_json_data(*args, **kwargs)
def login(self, username, password):
try:
self.fetch_authentication_token()
except: # pylint:disable=bare-except
pass
else:
raise AccountAlreadyAuthorized(
app.get_state_item("account", {}).get("email", "")
)
data = self.fetch_json_data(
"post", "/v1/login", data={"username": username, "password": password},
)
app.set_state_item("account", data)
return data
def login_with_code(self, client_id, code, redirect_uri):
try:
self.fetch_authentication_token()
except: # pylint:disable=bare-except
pass
else:
raise AccountAlreadyAuthorized(
app.get_state_item("account", {}).get("email", "")
)
result = self.fetch_json_data(
"post",
"/v1/login/code",
data={"client_id": client_id, "code": code, "redirect_uri": redirect_uri},
)
app.set_state_item("account", result)
return result
def logout(self):
refresh_token = self.get_refresh_token()
self.delete_local_session()
try:
self.fetch_json_data(
"post", "/v1/logout", data={"refresh_token": refresh_token},
)
except AccountError:
pass
return True
def change_password(self, old_password, new_password):
return self.send_auth_request(
"post",
"/v1/password",
data={"old_password": old_password, "new_password": new_password},
)
def registration(
self, username, email, password, firstname, lastname
): # pylint:disable=too-many-arguments
try:
self.fetch_authentication_token()
except: # pylint:disable=bare-except
pass
else:
raise AccountAlreadyAuthorized(
app.get_state_item("account", {}).get("email", "")
)
return self.fetch_json_data(
"post",
"/v1/registration",
data={
"username": username,
"email": email,
"password": password,
"firstname": firstname,
"lastname": lastname,
},
)
def auth_token(self, password, regenerate):
return self.send_auth_request(
"post",
"/v1/token",
data={"password": password, "regenerate": 1 if regenerate else 0},
).get("auth_token")
def forgot_password(self, username):
return self.fetch_json_data("post", "/v1/forgot", data={"username": username},)
def get_profile(self):
return self.send_auth_request("get", "/v1/profile",)
def update_profile(self, profile, current_password):
profile["current_password"] = current_password
self.delete_local_state("summary")
response = self.send_auth_request("put", "/v1/profile", data=profile,)
return response
def get_account_info(self, offline=False):
account = app.get_state_item("account") or {}
if (
account.get("summary")
and account["summary"].get("expire_at", 0) > time.time()
):
return account["summary"]
if offline and account.get("email"):
return {
"profile": {
"email": account.get("email"),
"username": account.get("username"),
}
}
result = self.send_auth_request("get", "/v1/summary",)
account["summary"] = dict(
profile=result.get("profile"),
packages=result.get("packages"),
subscriptions=result.get("subscriptions"),
user_id=result.get("user_id"),
expire_at=int(time.time()) + self.SUMMARY_CACHE_TTL,
)
app.set_state_item("account", account)
return result
def destroy_account(self):
return self.send_auth_request("delete", "/v1/account")
def create_org(self, orgname, email, displayname):
return self.send_auth_request(
"post",
"/v1/orgs",
data={"orgname": orgname, "email": email, "displayname": displayname},
)
def get_org(self, orgname):
return self.send_auth_request("get", "/v1/orgs/%s" % orgname)
def list_orgs(self):
return self.send_auth_request("get", "/v1/orgs",)
def update_org(self, orgname, data):
return self.send_auth_request(
"put", "/v1/orgs/%s" % orgname, data={k: v for k, v in data.items() if v}
)
def destroy_org(self, orgname):
return self.send_auth_request("delete", "/v1/orgs/%s" % orgname,)
def add_org_owner(self, orgname, username):
return self.send_auth_request(
"post", "/v1/orgs/%s/owners" % orgname, data={"username": username},
)
def list_org_owners(self, orgname):
return self.send_auth_request("get", "/v1/orgs/%s/owners" % orgname,)
def remove_org_owner(self, orgname, username):
return self.send_auth_request(
"delete", "/v1/orgs/%s/owners" % orgname, data={"username": username},
)
def create_team(self, orgname, teamname, description):
return self.send_auth_request(
"post",
"/v1/orgs/%s/teams" % orgname,
data={"name": teamname, "description": description},
)
def destroy_team(self, orgname, teamname):
return self.send_auth_request(
"delete", "/v1/orgs/%s/teams/%s" % (orgname, teamname),
)
def get_team(self, orgname, teamname):
return self.send_auth_request(
"get", "/v1/orgs/%s/teams/%s" % (orgname, teamname),
)
def list_teams(self, orgname):
return self.send_auth_request("get", "/v1/orgs/%s/teams" % orgname,)
def update_team(self, orgname, teamname, data):
return self.send_auth_request(
"put",
"/v1/orgs/%s/teams/%s" % (orgname, teamname),
data={k: v for k, v in data.items() if v},
)
def add_team_member(self, orgname, teamname, username):
return self.send_auth_request(
"post",
"/v1/orgs/%s/teams/%s/members" % (orgname, teamname),
data={"username": username},
)
def remove_team_member(self, orgname, teamname, username):
return self.send_auth_request(
"delete",
"/v1/orgs/%s/teams/%s/members" % (orgname, teamname),
data={"username": username},
)
def fetch_authentication_token(self):
if os.environ.get("PLATFORMIO_AUTH_TOKEN"):
return os.environ.get("PLATFORMIO_AUTH_TOKEN")
auth = app.get_state_item("account", {}).get("auth", {})
if auth.get("access_token") and auth.get("access_token_expire"):
if auth.get("access_token_expire") > time.time():
return auth.get("access_token")
if auth.get("refresh_token"):
try:
data = self.fetch_json_data(
"post",
"/v1/login",
headers={
"Authorization": "Bearer %s" % auth.get("refresh_token")
},
)
app.set_state_item("account", data)
return data.get("auth").get("access_token")
except AccountError:
self.delete_local_session()
raise AccountNotAuthorized()
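A hedged example of driving the relocated account client (the credentials are placeholders; a real call needs network access and a PlatformIO account):

from platformio.clients.account import AccountClient, AccountError

client = AccountClient()
try:
    client.login("example-user", "secret")          # placeholder credentials
    summary = client.get_account_info(offline=False)
    print(summary.get("profile", {}).get("username"))
except AccountError as exc:
    print("Account operation failed: %s" % exc)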

platformio/clients/http.py (new file, 206 lines)

@@ -0,0 +1,206 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import json
import math
import os
import socket
import requests.adapters
from requests.packages.urllib3.util.retry import Retry # pylint:disable=import-error
from platformio import __check_internet_hosts__, __default_requests_timeout__, app, util
from platformio.cache import ContentCache
from platformio.exception import PlatformioException, UserSideException
try:
from urllib.parse import urljoin
except ImportError:
from urlparse import urljoin
class HTTPClientError(PlatformioException):
def __init__(self, message, response=None):
super(HTTPClientError, self).__init__()
self.message = message
self.response = response
def __str__(self): # pragma: no cover
return self.message
class InternetIsOffline(UserSideException):
MESSAGE = (
"You are not connected to the Internet.\n"
"PlatformIO needs the Internet connection to"
" download dependent packages or to work with PlatformIO Account."
)
class EndpointSession(requests.Session):
def __init__(self, base_url, *args, **kwargs):
super(EndpointSession, self).__init__(*args, **kwargs)
self.base_url = base_url
def request( # pylint: disable=signature-differs,arguments-differ
self, method, url, *args, **kwargs
):
# print(self.base_url, method, url, args, kwargs)
return super(EndpointSession, self).request(
method, urljoin(self.base_url, url), *args, **kwargs
)
class EndpointSessionIterator(object):
def __init__(self, endpoints):
if not isinstance(endpoints, list):
endpoints = [endpoints]
self.endpoints = endpoints
self.endpoints_iter = iter(endpoints)
self.retry = Retry(
total=math.ceil(6 / len(self.endpoints)),
backoff_factor=1,
# method_whitelist=list(Retry.DEFAULT_METHOD_WHITELIST) + ["POST"],
status_forcelist=[413, 429, 500, 502, 503, 504],
)
def __iter__(self): # pylint: disable=non-iterator-returned
return self
def next(self):
""" For Python 2 compatibility """
return self.__next__()
def __next__(self):
base_url = next(self.endpoints_iter)
session = EndpointSession(base_url)
session.headers.update({"User-Agent": app.get_user_agent()})
adapter = requests.adapters.HTTPAdapter(max_retries=self.retry)
session.mount(base_url, adapter)
return session
class HTTPClient(object):
def __init__(self, endpoints):
self._session_iter = EndpointSessionIterator(endpoints)
self._session = None
self._next_session()
def __del__(self):
if not self._session:
return
self._session.close()
self._session = None
def _next_session(self):
if self._session:
self._session.close()
self._session = next(self._session_iter)
@util.throttle(500)
def send_request(self, method, path, **kwargs):
# check Internet before and resolve issue with 60 seconds timeout
ensure_internet_on(raise_exception=True)
# set default timeout
if "timeout" not in kwargs:
kwargs["timeout"] = __default_requests_timeout__
while True:
try:
return getattr(self._session, method)(path, **kwargs)
except (
requests.exceptions.ConnectionError,
requests.exceptions.Timeout,
) as e:
try:
self._next_session()
except: # pylint: disable=bare-except
raise HTTPClientError(str(e))
def fetch_json_data(self, method, path, **kwargs):
cache_valid = kwargs.pop("cache_valid") if "cache_valid" in kwargs else None
if not cache_valid:
return self.raise_error_from_response(
self.send_request(method, path, **kwargs)
)
cache_key = ContentCache.key_from_args(
method, path, kwargs.get("params"), kwargs.get("data")
)
with ContentCache("http") as cc:
result = cc.get(cache_key)
if result is not None:
return json.loads(result)
response = self.send_request(method, path, **kwargs)
cc.set(cache_key, response.text, cache_valid)
return self.raise_error_from_response(response)
@staticmethod
def raise_error_from_response(response, expected_codes=(200, 201, 202)):
if response.status_code in expected_codes:
try:
return response.json()
except ValueError:
pass
try:
message = response.json()["message"]
except (KeyError, ValueError):
message = response.text
raise HTTPClientError(message, response)
#
# Helpers
#
@util.memoized(expire="10s")
def _internet_on():
timeout = 2
socket.setdefaulttimeout(timeout)
for host in __check_internet_hosts__:
try:
for var in ("HTTP_PROXY", "HTTPS_PROXY"):
if not os.getenv(var) and not os.getenv(var.lower()):
continue
requests.get("http://%s" % host, allow_redirects=False, timeout=timeout)
return True
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.connect((host, 80))
s.close()
return True
except: # pylint: disable=bare-except
pass
return False
def ensure_internet_on(raise_exception=False):
result = _internet_on()
if raise_exception and not result:
raise InternetIsOffline()
return result
def fetch_remote_content(*args, **kwargs):
kwargs["headers"] = kwargs.get("headers", {})
if "User-Agent" not in kwargs["headers"]:
kwargs["headers"]["User-Agent"] = app.get_user_agent()
if "timeout" not in kwargs:
kwargs["timeout"] = __default_requests_timeout__
r = requests.get(*args, **kwargs)
r.raise_for_status()
return r.text
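Concrete clients are expected to build on this base by passing their endpoint(s) to the constructor and reusing fetch_json_data. A simplified sketch with a placeholder endpoint and path:

from platformio.clients.http import HTTPClient

class ExampleClient(HTTPClient):
    def __init__(self):
        # A single base URL or a list of mirror endpoints can be passed.
        super(ExampleClient, self).__init__("https://api.example.com")

    def get_status(self):
        # cache_valid stores the JSON response in ContentCache for one hour.
        return self.fetch_json_data("get", "/v1/status", cache_valid="1h")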


@@ -0,0 +1,141 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from platformio import __registry_api__, fs
from platformio.clients.account import AccountClient
from platformio.clients.http import HTTPClient, HTTPClientError
from platformio.package.meta import PackageType
# pylint: disable=too-many-arguments
class RegistryClient(HTTPClient):
def __init__(self):
super(RegistryClient, self).__init__(__registry_api__)
def send_auth_request(self, *args, **kwargs):
headers = kwargs.get("headers", {})
if "Authorization" not in headers:
token = AccountClient().fetch_authentication_token()
headers["Authorization"] = "Bearer %s" % token
kwargs["headers"] = headers
return self.fetch_json_data(*args, **kwargs)
def publish_package(
self, archive_path, owner=None, released_at=None, private=False, notify=True
):
account = AccountClient()
if not owner:
owner = (
account.get_account_info(offline=True).get("profile").get("username")
)
with open(archive_path, "rb") as fp:
return self.send_auth_request(
"post",
"/v3/packages/%s/%s" % (owner, PackageType.from_archive(archive_path)),
params={
"private": 1 if private else 0,
"notify": 1 if notify else 0,
"released_at": released_at,
},
headers={
"Content-Type": "application/octet-stream",
"X-PIO-Content-SHA256": fs.calculate_file_hashsum(
"sha256", archive_path
),
},
data=fp,
)
def unpublish_package( # pylint: disable=redefined-builtin
self, type, name, owner=None, version=None, undo=False
):
account = AccountClient()
if not owner:
owner = (
account.get_account_info(offline=True).get("profile").get("username")
)
path = "/v3/packages/%s/%s/%s" % (owner, type, name)
if version:
path += "/" + version
return self.send_auth_request(
"delete", path, params={"undo": 1 if undo else 0},
)
def update_resource(self, urn, private):
return self.send_auth_request(
"put", "/v3/resources/%s" % urn, data={"private": int(private)},
)
def grant_access_for_resource(self, urn, client, level):
return self.send_auth_request(
"put",
"/v3/resources/%s/access" % urn,
data={"client": client, "level": level},
)
def revoke_access_from_resource(self, urn, client):
return self.send_auth_request(
"delete", "/v3/resources/%s/access" % urn, data={"client": client},
)
def list_resources(self, owner):
return self.send_auth_request(
"get", "/v3/resources", params={"owner": owner} if owner else None
)
def list_packages(self, query=None, filters=None, page=None):
assert query or filters
search_query = []
if filters:
valid_filters = (
"authors",
"keywords",
"frameworks",
"platforms",
"headers",
"ids",
"names",
"owners",
"types",
)
assert set(filters.keys()) <= set(valid_filters)
for name, values in filters.items():
for value in set(
values if isinstance(values, (list, tuple)) else [values]
):
search_query.append('%s:"%s"' % (name[:-1], value))
if query:
search_query.append(query)
params = dict(query=" ".join(search_query))
if page:
params["page"] = int(page)
return self.fetch_json_data(
"get", "/v3/packages", params=params, cache_valid="1h"
)
def get_package(self, type_, owner, name, version=None):
try:
return self.fetch_json_data(
"get",
"/v3/packages/{owner}/{type}/{name}".format(
type=type_, owner=owner.lower(), name=name.lower()
),
params=dict(version=version) if version else None,
cache_valid="1h",
)
except HTTPClientError as e:
if e.response.status_code == 404:
return None
raise e
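A short sketch of querying the registry through the new client, based on the list_packages API above (the query text and filter values are illustrative):

from platformio.clients.registry import RegistryClient

client = RegistryClient()
# query and filters are folded into the registry search syntax,
# e.g. type:"library" owner:"platformio" esp32
packages = client.list_packages(
    query="esp32",
    filters={"types": ["library"], "owners": ["platformio"]},
)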


@@ -0,0 +1,138 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# pylint: disable=unused-argument
import json
import re
import click
from tabulate import tabulate
from platformio.clients.registry import RegistryClient
from platformio.commands.account import validate_username
from platformio.commands.team import validate_orgname_teamname
def validate_client(value):
if ":" in value:
validate_orgname_teamname(value)
else:
validate_username(value)
return value
@click.group("access", short_help="Manage resource access")
def cli():
pass
def validate_urn(value):
value = str(value).strip()
if not re.match(r"^prn:reg:pkg:(\d+):(\w+)$", value, flags=re.I):
raise click.BadParameter("Invalid URN format.")
return value
@cli.command("public", short_help="Make resource public")
@click.argument(
"urn", callback=lambda _, __, value: validate_urn(value),
)
@click.option("--urn-type", type=click.Choice(["prn:reg:pkg"]), default="prn:reg:pkg")
def access_public(urn, urn_type):
client = RegistryClient()
client.update_resource(urn=urn, private=0)
return click.secho(
"The resource %s has been successfully updated." % urn, fg="green",
)
@cli.command("private", short_help="Make resource private")
@click.argument(
"urn", callback=lambda _, __, value: validate_urn(value),
)
@click.option("--urn-type", type=click.Choice(["prn:reg:pkg"]), default="prn:reg:pkg")
def access_private(urn, urn_type):
client = RegistryClient()
client.update_resource(urn=urn, private=1)
return click.secho(
"The resource %s has been successfully updated." % urn, fg="green",
)
@cli.command("grant", short_help="Grant access")
@click.argument("level", type=click.Choice(["admin", "maintainer", "guest"]))
@click.argument(
"client",
metavar="[<ORGNAME:TEAMNAME>|<USERNAME>]",
callback=lambda _, __, value: validate_client(value),
)
@click.argument(
"urn", callback=lambda _, __, value: validate_urn(value),
)
@click.option("--urn-type", type=click.Choice(["prn:reg:pkg"]), default="prn:reg:pkg")
def access_grant(level, client, urn, urn_type):
reg_client = RegistryClient()
reg_client.grant_access_for_resource(urn=urn, client=client, level=level)
return click.secho(
"Access for resource %s has been granted for %s" % (urn, client), fg="green",
)
@cli.command("revoke", short_help="Revoke access")
@click.argument(
"client",
metavar="[ORGNAME:TEAMNAME|USERNAME]",
callback=lambda _, __, value: validate_client(value),
)
@click.argument(
"urn", callback=lambda _, __, value: validate_urn(value),
)
@click.option("--urn-type", type=click.Choice(["prn:reg:pkg"]), default="prn:reg:pkg")
def access_revoke(client, urn, urn_type):
reg_client = RegistryClient()
reg_client.revoke_access_from_resource(urn=urn, client=client)
return click.secho(
"Access for resource %s has been revoked for %s" % (urn, client), fg="green",
)
@cli.command("list", short_help="List published resources")
@click.argument("owner", required=False)
@click.option("--urn-type", type=click.Choice(["prn:reg:pkg"]), default="prn:reg:pkg")
@click.option("--json-output", is_flag=True)
def access_list(owner, urn_type, json_output):
reg_client = RegistryClient()
resources = reg_client.list_resources(owner=owner)
if json_output:
return click.echo(json.dumps(resources))
if not resources:
return click.secho("You do not have any resources.", fg="yellow")
for resource in resources:
click.echo()
click.secho(resource.get("name"), fg="cyan")
click.echo("-" * len(resource.get("name")))
table_data = []
table_data.append(("URN:", resource.get("urn")))
table_data.append(("Owner:", resource.get("owner")))
table_data.append(
(
"Access level(s):",
", ".join(
(level.capitalize() for level in resource.get("access_levels"))
),
)
)
click.echo(tabulate(table_data, tablefmt="plain"))
return click.echo()
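One hedged way to exercise the new access commands programmatically is click's test runner. The sketch assumes the module is importable as platformio.commands.access and that a valid account session exists; the URN and usernames are placeholders:

from click.testing import CliRunner
from platformio.commands.access import cli as cmd_access

runner = CliRunner()
result = runner.invoke(
    cmd_access, ["grant", "guest", "example-user", "prn:reg:pkg:1234:demo"]
)
print(result.output)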


@@ -21,22 +21,23 @@ import re
import click import click
from tabulate import tabulate from tabulate import tabulate
from platformio.commands.account import exception from platformio.clients.account import AccountClient, AccountNotAuthorized
from platformio.commands.account.client import AccountClient
@click.group("account", short_help="Manage PIO Account") @click.group("account", short_help="Manage PlatformIO account")
def cli(): def cli():
pass pass
def validate_username(value): def validate_username(value, field="username"):
value = str(value).strip() value = str(value).strip()
if not re.match(r"^[a-z\d](?:[a-z\d]|-(?=[a-z\d])){3,38}$", value, flags=re.I): if not re.match(r"^[a-z\d](?:[a-z\d]|-(?=[a-z\d])){0,37}$", value, flags=re.I):
raise click.BadParameter( raise click.BadParameter(
"Invalid username format. " "Invalid %s format. "
"Username must contain at least 4 characters including single hyphens," "%s must contain only alphanumeric characters "
" and cannot begin or end with a hyphen" "or single hyphens, cannot begin or end with a hyphen, "
"and must not be longer than 38 characters."
% (field.lower(), field.capitalize())
) )
return value return value
@@ -59,7 +60,7 @@ def validate_password(value):
return value return value
@cli.command("register", short_help="Create new PIO Account") @cli.command("register", short_help="Create new PlatformIO Account")
@click.option( @click.option(
"-u", "-u",
"--username", "--username",
@@ -89,7 +90,7 @@ def account_register(username, email, password, firstname, lastname):
) )
@cli.command("login", short_help="Log in to PIO Account") @cli.command("login", short_help="Log in to PlatformIO Account")
@click.option("-u", "--username", prompt="Username or email") @click.option("-u", "--username", prompt="Username or email")
@click.option("-p", "--password", prompt=True, hide_input=True) @click.option("-p", "--password", prompt=True, hide_input=True)
def account_login(username, password): def account_login(username, password):
@@ -98,7 +99,7 @@ def account_login(username, password):
return click.secho("Successfully logged in!", fg="green") return click.secho("Successfully logged in!", fg="green")
@cli.command("logout", short_help="Log out of PIO Account") @cli.command("logout", short_help="Log out of PlatformIO Account")
def account_logout(): def account_logout():
client = AccountClient() client = AccountClient()
client.logout() client.logout()
@@ -167,7 +168,7 @@ def account_update(current_password, **kwargs):
return None return None
try: try:
client.logout() client.logout()
except exception.AccountNotAuthorized: except AccountNotAuthorized:
pass pass
if email_changed: if email_changed:
return click.secho( return click.secho(
@@ -177,7 +178,24 @@ def account_update(current_password, **kwargs):
return click.secho("Please re-login.", fg="yellow") return click.secho("Please re-login.", fg="yellow")
@cli.command("show", short_help="PIO Account information") @cli.command("destroy", short_help="Destroy account")
def account_destroy():
client = AccountClient()
click.confirm(
"Are you sure you want to delete the %s user account?\n"
"Warning! All linked data will be permanently removed and can not be restored."
% client.get_account_info().get("profile").get("username"),
abort=True,
)
client.destroy_account()
try:
client.logout()
except AccountNotAuthorized:
pass
return click.secho("User account has been destroyed.", fg="green",)
@cli.command("show", short_help="PlatformIO Account information")
@click.option("--offline", is_flag=True) @click.option("--offline", is_flag=True)
@click.option("--json-output", is_flag=True) @click.option("--json-output", is_flag=True)
def account_show(offline, json_output): def account_show(offline, json_output):


@@ -1,262 +0,0 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# pylint: disable=unused-argument
import os
import time
import requests.adapters
from requests.packages.urllib3.util.retry import Retry # pylint:disable=import-error
from platformio import __pioaccount_api__, app
from platformio.commands.account import exception
from platformio.exception import InternetIsOffline
class AccountClient(object):
SUMMARY_CACHE_TTL = 60 * 60 * 24 * 7
def __init__(
self, api_base_url=__pioaccount_api__, retries=3,
):
if api_base_url.endswith("/"):
api_base_url = api_base_url[:-1]
self.api_base_url = api_base_url
self._session = requests.Session()
self._session.headers.update({"User-Agent": app.get_user_agent()})
retry = Retry(
total=retries,
read=retries,
connect=retries,
backoff_factor=2,
method_whitelist=list(Retry.DEFAULT_METHOD_WHITELIST) + ["POST"],
)
adapter = requests.adapters.HTTPAdapter(max_retries=retry)
self._session.mount(api_base_url, adapter)
@staticmethod
def get_refresh_token():
try:
return app.get_state_item("account").get("auth").get("refresh_token")
except: # pylint:disable=bare-except
raise exception.AccountNotAuthorized()
@staticmethod
def delete_local_session():
app.delete_state_item("account")
@staticmethod
def delete_local_state(key):
account = app.get_state_item("account")
if not account or key not in account:
return
del account[key]
app.set_state_item("account", account)
def login(self, username, password):
try:
self.fetch_authentication_token()
except: # pylint:disable=bare-except
pass
else:
raise exception.AccountAlreadyAuthorized(
app.get_state_item("account", {}).get("email", "")
)
result = self.send_request(
"post",
self.api_base_url + "/v1/login",
data={"username": username, "password": password},
)
app.set_state_item("account", result)
return result
def login_with_code(self, client_id, code, redirect_uri):
try:
self.fetch_authentication_token()
except: # pylint:disable=bare-except
pass
else:
raise exception.AccountAlreadyAuthorized(
app.get_state_item("account", {}).get("email", "")
)
result = self.send_request(
"post",
self.api_base_url + "/v1/login/code",
data={"client_id": client_id, "code": code, "redirect_uri": redirect_uri},
)
app.set_state_item("account", result)
return result
def logout(self):
refresh_token = self.get_refresh_token()
self.delete_local_session()
try:
self.send_request(
"post",
self.api_base_url + "/v1/logout",
data={"refresh_token": refresh_token},
)
except exception.AccountError:
pass
return True
def change_password(self, old_password, new_password):
token = self.fetch_authentication_token()
self.send_request(
"post",
self.api_base_url + "/v1/password",
headers={"Authorization": "Bearer %s" % token},
data={"old_password": old_password, "new_password": new_password},
)
return True
def registration(
self, username, email, password, firstname, lastname
): # pylint:disable=too-many-arguments
try:
self.fetch_authentication_token()
except: # pylint:disable=bare-except
pass
else:
raise exception.AccountAlreadyAuthorized(
app.get_state_item("account", {}).get("email", "")
)
return self.send_request(
"post",
self.api_base_url + "/v1/registration",
data={
"username": username,
"email": email,
"password": password,
"firstname": firstname,
"lastname": lastname,
},
)
def auth_token(self, password, regenerate):
token = self.fetch_authentication_token()
result = self.send_request(
"post",
self.api_base_url + "/v1/token",
headers={"Authorization": "Bearer %s" % token},
data={"password": password, "regenerate": 1 if regenerate else 0},
)
return result.get("auth_token")
def forgot_password(self, username):
return self.send_request(
"post", self.api_base_url + "/v1/forgot", data={"username": username},
)
def get_profile(self):
token = self.fetch_authentication_token()
return self.send_request(
"get",
self.api_base_url + "/v1/profile",
headers={"Authorization": "Bearer %s" % token},
)
def update_profile(self, profile, current_password):
token = self.fetch_authentication_token()
profile["current_password"] = current_password
self.delete_local_state("summary")
response = self.send_request(
"put",
self.api_base_url + "/v1/profile",
headers={"Authorization": "Bearer %s" % token},
data=profile,
)
return response
def get_account_info(self, offline):
account = app.get_state_item("account")
if not account:
raise exception.AccountNotAuthorized()
if (
account.get("summary")
and account["summary"].get("expire_at", 0) > time.time()
):
return account["summary"]
if offline:
return {
"profile": {
"email": account.get("email"),
"username": account.get("username"),
}
}
token = self.fetch_authentication_token()
result = self.send_request(
"get",
self.api_base_url + "/v1/summary",
headers={"Authorization": "Bearer %s" % token},
)
account["summary"] = dict(
profile=result.get("profile"),
packages=result.get("packages"),
subscriptions=result.get("subscriptions"),
user_id=result.get("user_id"),
expire_at=int(time.time()) + self.SUMMARY_CACHE_TTL,
)
app.set_state_item("account", account)
return result
def fetch_authentication_token(self):
if "PLATFORMIO_AUTH_TOKEN" in os.environ:
return os.environ["PLATFORMIO_AUTH_TOKEN"]
auth = app.get_state_item("account", {}).get("auth", {})
if auth.get("access_token") and auth.get("access_token_expire"):
if auth.get("access_token_expire") > time.time():
return auth.get("access_token")
if auth.get("refresh_token"):
try:
result = self.send_request(
"post",
self.api_base_url + "/v1/login",
headers={
"Authorization": "Bearer %s" % auth.get("refresh_token")
},
)
app.set_state_item("account", result)
return result.get("auth").get("access_token")
except exception.AccountError:
self.delete_local_session()
raise exception.AccountNotAuthorized()
def send_request(self, method, url, headers=None, data=None):
try:
response = getattr(self._session, method)(
url, headers=headers or {}, data=data or {}
)
except requests.exceptions.ConnectionError:
raise InternetIsOffline()
return self.raise_error_from_response(response)
def raise_error_from_response(self, response, expected_codes=(200, 201, 202)):
if response.status_code in expected_codes:
try:
return response.json()
except ValueError:
pass
try:
message = response.json()["message"]
except (KeyError, ValueError):
message = response.text
if "Authorization session has been expired" in message:
self.delete_local_session()
raise exception.AccountError(message)


@@ -19,10 +19,10 @@ from tabulate import tabulate
from platformio import fs from platformio import fs
from platformio.compat import dump_json_to_unicode from platformio.compat import dump_json_to_unicode
from platformio.managers.platform import PlatformManager from platformio.package.manager.platform import PlatformPackageManager
@click.command("boards", short_help="Embedded Board Explorer") @click.command("boards", short_help="Embedded board explorer")
@click.argument("query", required=False) @click.argument("query", required=False)
@click.option("--installed", is_flag=True) @click.option("--installed", is_flag=True)
@click.option("--json-output", is_flag=True) @click.option("--json-output", is_flag=True)
@@ -59,8 +59,8 @@ def print_boards(boards):
click.style(b["id"], fg="cyan"), click.style(b["id"], fg="cyan"),
b["mcu"], b["mcu"],
"%dMHz" % (b["fcpu"] / 1000000), "%dMHz" % (b["fcpu"] / 1000000),
fs.format_filesize(b["rom"]), fs.humanize_file_size(b["rom"]),
fs.format_filesize(b["ram"]), fs.humanize_file_size(b["ram"]),
b["name"], b["name"],
) )
for b in boards for b in boards
@@ -71,7 +71,7 @@ def print_boards(boards):
def _get_boards(installed=False): def _get_boards(installed=False):
pm = PlatformManager() pm = PlatformPackageManager()
return pm.get_installed_boards() if installed else pm.get_all_boards() return pm.get_installed_boards() if installed else pm.get_all_boards()


@@ -31,7 +31,7 @@ from platformio.project.config import ProjectConfig
from platformio.project.helpers import find_project_dir_above, get_project_dir from platformio.project.helpers import find_project_dir_above, get_project_dir
@click.command("check", short_help="Run a static analysis tool on code") @click.command("check", short_help="Static code analysis")
@click.option("-e", "--environment", multiple=True) @click.option("-e", "--environment", multiple=True)
@click.option( @click.option(
"-d", "-d",


@@ -12,13 +12,12 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
import glob
import os import os
from tempfile import NamedTemporaryFile from tempfile import NamedTemporaryFile
import click import click
from platformio import fs, proc from platformio import compat, fs, proc
from platformio.commands.check.defect import DefectItem from platformio.commands.check.defect import DefectItem
from platformio.project.helpers import load_project_ide_data from platformio.project.helpers import load_project_ide_data
@@ -84,7 +83,9 @@ class CheckToolBase(object): # pylint: disable=too-many-instance-attributes
cmd = "echo | %s -x %s %s %s -dM -E -" % ( cmd = "echo | %s -x %s %s %s -dM -E -" % (
self.cc_path, self.cc_path,
language, language,
" ".join([f for f in build_flags if f.startswith(("-m", "-f"))]), " ".join(
[f for f in build_flags if f.startswith(("-m", "-f", "-std"))]
),
includes_file, includes_file,
) )
result = proc.exec_command(cmd, shell=True) result = proc.exec_command(cmd, shell=True)
@@ -183,7 +184,7 @@ class CheckToolBase(object): # pylint: disable=too-many-instance-attributes
result["c++"].append(os.path.realpath(path)) result["c++"].append(os.path.realpath(path))
for pattern in patterns: for pattern in patterns:
for item in glob.glob(pattern): for item in compat.glob_recursive(pattern):
if not os.path.isdir(item): if not os.path.isdir(item):
_add_file(item) _add_file(item)
for root, _, files in os.walk(item, followlinks=True): for root, _, files in os.walk(item, followlinks=True):


@@ -17,7 +17,7 @@ from os.path import join
from platformio.commands.check.defect import DefectItem from platformio.commands.check.defect import DefectItem
from platformio.commands.check.tools.base import CheckToolBase from platformio.commands.check.tools.base import CheckToolBase
from platformio.managers.core import get_core_package_dir from platformio.package.manager.core import get_core_package_dir
class ClangtidyCheckTool(CheckToolBase): class ClangtidyCheckTool(CheckToolBase):
@@ -63,10 +63,7 @@ class ClangtidyCheckTool(CheckToolBase):
for scope in project_files: for scope in project_files:
src_files.extend(project_files[scope]) src_files.extend(project_files[scope])
cmd.extend(flags) cmd.extend(flags + src_files + ["--"])
cmd.extend(src_files)
cmd.append("--")
cmd.extend( cmd.extend(
["-D%s" % d for d in self.cpp_defines + self.toolchain_defines["c++"]] ["-D%s" % d for d in self.cpp_defines + self.toolchain_defines["c++"]]
) )
@@ -79,6 +76,6 @@ class ClangtidyCheckTool(CheckToolBase):
continue continue
includes.append(inc) includes.append(inc)
cmd.append("--extra-arg=" + self._long_includes_hook(includes)) cmd.extend(["-I%s" % inc for inc in includes])
return cmd return cmd


@@ -19,11 +19,13 @@ import click
from platformio import proc from platformio import proc
from platformio.commands.check.defect import DefectItem from platformio.commands.check.defect import DefectItem
from platformio.commands.check.tools.base import CheckToolBase from platformio.commands.check.tools.base import CheckToolBase
from platformio.managers.core import get_core_package_dir from platformio.package.manager.core import get_core_package_dir
class CppcheckCheckTool(CheckToolBase): class CppcheckCheckTool(CheckToolBase):
def __init__(self, *args, **kwargs): def __init__(self, *args, **kwargs):
self._field_delimiter = "<&PIO&>"
self._buffer = ""
self.defect_fields = [ self.defect_fields = [
"severity", "severity",
"message", "message",
@@ -55,13 +57,15 @@ class CppcheckCheckTool(CheckToolBase):
return line return line
def parse_defect(self, raw_line): def parse_defect(self, raw_line):
if "<&PIO&>" not in raw_line or any( if self._field_delimiter not in raw_line:
f not in raw_line for f in self.defect_fields return None
):
self._buffer += raw_line
if any(f not in self._buffer for f in self.defect_fields):
return None return None
args = dict() args = dict()
for field in raw_line.split("<&PIO&>"): for field in self._buffer.split(self._field_delimiter):
field = field.strip().replace('"', "") field = field.strip().replace('"', "")
name, value = field.split("=", 1) name, value = field.split("=", 1)
args[name] = value args[name] = value
@@ -94,6 +98,7 @@ class CppcheckCheckTool(CheckToolBase):
self._bad_input = True self._bad_input = True
return None return None
self._buffer = ""
return DefectItem(**args) return DefectItem(**args)
def configure_command( def configure_command(
@@ -103,13 +108,16 @@ class CppcheckCheckTool(CheckToolBase):
cmd = [ cmd = [
tool_path, tool_path,
"--addon-python=%s" % proc.get_pythonexe_path(),
"--error-exitcode=1", "--error-exitcode=1",
"--verbose" if self.options.get("verbose") else "--quiet", "--verbose" if self.options.get("verbose") else "--quiet",
] ]
cmd.append( cmd.append(
'--template="%s"' '--template="%s"'
% "<&PIO&>".join(["{0}={{{0}}}".format(f) for f in self.defect_fields]) % self._field_delimiter.join(
["{0}={{{0}}}".format(f) for f in self.defect_fields]
)
) )
flags = self.get_flags("cppcheck") flags = self.get_flags("cppcheck")


@@ -22,7 +22,7 @@ import click
from platformio import proc, util from platformio import proc, util
from platformio.commands.check.defect import DefectItem from platformio.commands.check.defect import DefectItem
from platformio.commands.check.tools.base import CheckToolBase from platformio.commands.check.tools.base import CheckToolBase
from platformio.managers.core import get_core_package_dir from platformio.package.manager.core import get_core_package_dir
class PvsStudioCheckTool(CheckToolBase): # pylint: disable=too-many-instance-attributes class PvsStudioCheckTool(CheckToolBase): # pylint: disable=too-many-instance-attributes


@@ -12,7 +12,6 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
from glob import glob
from os import getenv, makedirs, remove from os import getenv, makedirs, remove
from os.path import basename, isdir, isfile, join, realpath from os.path import basename, isdir, isfile, join, realpath
from shutil import copyfile, copytree from shutil import copyfile, copytree
@@ -20,11 +19,10 @@ from tempfile import mkdtemp
import click import click
from platformio import app, fs from platformio import app, compat, fs
from platformio.commands.project import project_init as cmd_project_init from platformio.commands.project import project_init as cmd_project_init
from platformio.commands.project import validate_boards from platformio.commands.project import validate_boards
from platformio.commands.run.command import cli as cmd_run from platformio.commands.run.command import cli as cmd_run
from platformio.compat import glob_escape
from platformio.exception import CIBuildEnvsEmpty from platformio.exception import CIBuildEnvsEmpty
from platformio.project.config import ProjectConfig from platformio.project.config import ProjectConfig
@@ -36,7 +34,7 @@ def validate_path(ctx, param, value): # pylint: disable=unused-argument
if p.startswith("~"): if p.startswith("~"):
value[i] = fs.expanduser(p) value[i] = fs.expanduser(p)
value[i] = realpath(value[i]) value[i] = realpath(value[i])
if not glob(value[i]): if not compat.glob_recursive(value[i]):
invalid_path = p invalid_path = p
break break
try: try:
@@ -46,7 +44,7 @@ def validate_path(ctx, param, value): # pylint: disable=unused-argument
raise click.BadParameter("Found invalid path: %s" % invalid_path) raise click.BadParameter("Found invalid path: %s" % invalid_path)
@click.command("ci", short_help="Continuous Integration") @click.command("ci", short_help="Continuous integration")
@click.argument("src", nargs=-1, callback=validate_path) @click.argument("src", nargs=-1, callback=validate_path)
@click.option("-l", "--lib", multiple=True, callback=validate_path, metavar="DIRECTORY") @click.option("-l", "--lib", multiple=True, callback=validate_path, metavar="DIRECTORY")
@click.option("--exclude", multiple=True) @click.option("--exclude", multiple=True)
@@ -98,7 +96,7 @@ def cli( # pylint: disable=too-many-arguments, too-many-branches
continue continue
contents = [] contents = []
for p in patterns: for p in patterns:
contents += glob(p) contents += compat.glob_recursive(p)
_copy_contents(join(build_dir, dir_name), contents) _copy_contents(join(build_dir, dir_name), contents)
if project_conf and isfile(project_conf): if project_conf and isfile(project_conf):
@@ -159,7 +157,7 @@ def _copy_contents(dst_dir, contents):
def _exclude_contents(dst_dir, patterns): def _exclude_contents(dst_dir, patterns):
contents = [] contents = []
for p in patterns: for p in patterns:
contents += glob(join(glob_escape(dst_dir), p)) contents += compat.glob_recursive(join(compat.glob_escape(dst_dir), p))
for path in contents: for path in contents:
path = realpath(path) path = realpath(path)
if isdir(path): if isdir(path):
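
The hunk above replaces the stdlib glob calls in the ci command with the compat helpers. A minimal sketch of the same call pattern, using only names visible in this hunk (the path and pattern below are placeholders):

    from os.path import join

    from platformio import compat

    # glob_escape() neutralizes glob metacharacters in a literal directory path;
    # glob_recursive() then expands the user-supplied pattern, as in _exclude_contents().
    base = compat.glob_escape("/tmp/ci build [demo]")
    matches = compat.glob_recursive(join(base, "src", "*.cpp"))
    print(matches)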

View File

@@ -21,10 +21,10 @@ from os.path import isfile
import click import click
from platformio import app, exception, fs, proc, util from platformio import app, exception, fs, proc
from platformio.commands.debug import helpers from platformio.commands.debug import helpers
from platformio.commands.debug.exception import DebugInvalidOptionsError from platformio.commands.debug.exception import DebugInvalidOptionsError
from platformio.managers.core import inject_contrib_pysite from platformio.package.manager.core import inject_contrib_pysite
from platformio.project.config import ProjectConfig from platformio.project.config import ProjectConfig
from platformio.project.exception import ProjectEnvsNotAvailableError from platformio.project.exception import ProjectEnvsNotAvailableError
from platformio.project.helpers import is_platformio_project, load_project_ide_data from platformio.project.helpers import is_platformio_project, load_project_ide_data
@@ -33,7 +33,7 @@ from platformio.project.helpers import is_platformio_project, load_project_ide_d
@click.command( @click.command(
"debug", "debug",
context_settings=dict(ignore_unknown_options=True), context_settings=dict(ignore_unknown_options=True),
short_help="PIO Unified Debugger", short_help="Unified debugger",
) )
@click.option( @click.option(
"-d", "-d",
@@ -130,7 +130,7 @@ def cli(ctx, project_dir, project_conf, environment, verbose, interface, __unpro
nl=False, nl=False,
) )
stream = helpers.GDBMIConsoleStream() stream = helpers.GDBMIConsoleStream()
with util.capture_std_streams(stream): with proc.capture_std_streams(stream):
helpers.predebug_project(ctx, project_dir, env_name, preload, verbose) helpers.predebug_project(ctx, project_dir, env_name, preload, verbose)
stream.close() stream.close()
else: else:

View File

@@ -20,13 +20,14 @@ from hashlib import sha1
from io import BytesIO from io import BytesIO
from os.path import isfile from os.path import isfile
from platformio import exception, fs, util from platformio import fs, util
from platformio.commands import PlatformioCLI from platformio.commands import PlatformioCLI
from platformio.commands.debug.exception import DebugInvalidOptionsError from platformio.commands.debug.exception import DebugInvalidOptionsError
from platformio.commands.platform import platform_install as cmd_platform_install from platformio.commands.platform import platform_install as cmd_platform_install
from platformio.commands.run.command import cli as cmd_run from platformio.commands.run.command import cli as cmd_run
from platformio.compat import is_bytes from platformio.compat import is_bytes
from platformio.managers.platform import PlatformFactory from platformio.platform.exception import UnknownPlatform
from platformio.platform.factory import PlatformFactory
from platformio.project.config import ProjectConfig from platformio.project.config import ProjectConfig
from platformio.project.options import ProjectOptions from platformio.project.options import ProjectOptions
@@ -94,14 +95,14 @@ def validate_debug_options(cmd_ctx, env_options):
return ["$LOAD_CMDS" if item == "$LOAD_CMD" else item for item in items] return ["$LOAD_CMDS" if item == "$LOAD_CMD" else item for item in items]
try: try:
platform = PlatformFactory.newPlatform(env_options["platform"]) platform = PlatformFactory.new(env_options["platform"])
except exception.UnknownPlatform: except UnknownPlatform:
cmd_ctx.invoke( cmd_ctx.invoke(
cmd_platform_install, cmd_platform_install,
platforms=[env_options["platform"]], platforms=[env_options["platform"]],
skip_default_package=True, skip_default_package=True,
) )
platform = PlatformFactory.newPlatform(env_options["platform"]) platform = PlatformFactory.new(env_options["platform"])
board_config = platform.board_config(env_options["board"]) board_config = platform.board_config(env_options["board"])
tool_name = board_config.get_debug_tool_name(env_options.get("debug_tool")) tool_name = board_config.get_debug_tool_name(env_options.get("debug_tool"))
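
This hunk renames PlatformFactory.newPlatform() to PlatformFactory.new() and moves UnknownPlatform into platformio.platform.exception. A rough sketch of the new call pattern under those names (the wrapper function and its arguments are hypothetical):

    from platformio.platform.exception import UnknownPlatform
    from platformio.platform.factory import PlatformFactory

    def resolve_debug_tool(platform_name, board_id, debug_tool=None):
        # v5: PlatformFactory.new() replaces the v4 newPlatform() constructor
        try:
            platform = PlatformFactory.new(platform_name)
        except UnknownPlatform:
            return None  # the real command installs the missing platform instead
        board_config = platform.board_config(board_id)
        return board_config.get_debug_tool_name(debug_tool)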

View File

@@ -26,7 +26,8 @@ from twisted.internet import reactor # pylint: disable=import-error
from twisted.internet import stdio # pylint: disable=import-error from twisted.internet import stdio # pylint: disable=import-error
from twisted.internet import task # pylint: disable=import-error from twisted.internet import task # pylint: disable=import-error
from platformio import app, fs, proc, telemetry, util from platformio import fs, proc, telemetry, util
from platformio.cache import ContentCache
from platformio.commands.debug import helpers from platformio.commands.debug import helpers
from platformio.commands.debug.exception import DebugInvalidOptionsError from platformio.commands.debug.exception import DebugInvalidOptionsError
from platformio.commands.debug.initcfgs import get_gdb_init_config from platformio.commands.debug.initcfgs import get_gdb_init_config
@@ -252,7 +253,7 @@ class GDBClient(BaseProcess): # pylint: disable=too-many-instance-attributes
def _kill_previous_session(self): def _kill_previous_session(self):
assert self._session_id assert self._session_id
pid = None pid = None
with app.ContentCache() as cc: with ContentCache() as cc:
pid = cc.get(self._session_id) pid = cc.get(self._session_id)
cc.delete(self._session_id) cc.delete(self._session_id)
if not pid: if not pid:
@@ -269,11 +270,11 @@ class GDBClient(BaseProcess): # pylint: disable=too-many-instance-attributes
def _lock_session(self, pid): def _lock_session(self, pid):
if not self._session_id: if not self._session_id:
return return
with app.ContentCache() as cc: with ContentCache() as cc:
cc.set(self._session_id, str(pid), "1h") cc.set(self._session_id, str(pid), "1h")
def _unlock_session(self): def _unlock_session(self):
if not self._session_id: if not self._session_id:
return return
with app.ContentCache() as cc: with ContentCache() as cc:
cc.delete(self._session_id) cc.delete(self._session_id)
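
ContentCache now lives in platformio.cache instead of platformio.app. A small sketch of the session lock/unlock pattern used above, with a made-up cache key:

    from platformio.cache import ContentCache

    SESSION_KEY = "example-gdb-session"  # placeholder key

    def lock_session(pid):
        with ContentCache() as cc:
            cc.set(SESSION_KEY, str(pid), "1h")  # value plus a TTL, as in _lock_session()

    def pop_previous_session():
        with ContentCache() as cc:
            pid = cc.get(SESSION_KEY)
            cc.delete(SESSION_KEY)
        return pid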

View File

@@ -22,11 +22,11 @@ from serial.tools import miniterm
from platformio import exception, fs, util from platformio import exception, fs, util
from platformio.commands.device import helpers as device_helpers from platformio.commands.device import helpers as device_helpers
from platformio.compat import dump_json_to_unicode from platformio.compat import dump_json_to_unicode
from platformio.managers.platform import PlatformFactory from platformio.platform.factory import PlatformFactory
from platformio.project.exception import NotPlatformIOProjectError from platformio.project.exception import NotPlatformIOProjectError
@click.group(short_help="Monitor device or list existing") @click.group(short_help="Device manager & serial/socket monitor")
def cli(): def cli():
pass pass
@@ -192,7 +192,7 @@ def device_monitor(**kwargs): # pylint: disable=too-many-branches
platform = None platform = None
if "platform" in project_options: if "platform" in project_options:
with fs.cd(kwargs["project_dir"]): with fs.cd(kwargs["project_dir"]):
platform = PlatformFactory.newPlatform(project_options["platform"]) platform = PlatformFactory.new(project_options["platform"])
device_helpers.register_platform_filters( device_helpers.register_platform_filters(
platform, kwargs["project_dir"], kwargs["environment"] platform, kwargs["project_dir"], kwargs["environment"]
) )

View File

@@ -22,10 +22,10 @@ import click
from platformio import exception from platformio import exception
from platformio.compat import WINDOWS from platformio.compat import WINDOWS
from platformio.managers.core import get_core_package_dir, inject_contrib_pysite from platformio.package.manager.core import get_core_package_dir, inject_contrib_pysite
@click.command("home", short_help="PIO Home") @click.command("home", short_help="UI to manage PlatformIO")
@click.option("--port", type=int, default=8008, help="HTTP port, default=8008") @click.option("--port", type=int, default=8008, help="HTTP port, default=8008")
@click.option( @click.option(
"--host", "--host",

View File

@@ -14,9 +14,6 @@
# pylint: disable=keyword-arg-before-vararg,arguments-differ,signature-differs # pylint: disable=keyword-arg-before-vararg,arguments-differ,signature-differs
import os
import socket
import requests import requests
from twisted.internet import defer # pylint: disable=import-error from twisted.internet import defer # pylint: disable=import-error
from twisted.internet import reactor # pylint: disable=import-error from twisted.internet import reactor # pylint: disable=import-error
@@ -52,18 +49,3 @@ def get_core_fullpath():
return where_is_program( return where_is_program(
"platformio" + (".exe" if "windows" in util.get_systype() else "") "platformio" + (".exe" if "windows" in util.get_systype() else "")
) )
@util.memoized(expire="10s")
def is_twitter_blocked():
ip = "104.244.42.1"
timeout = 2
try:
if os.getenv("HTTP_PROXY", os.getenv("HTTPS_PROXY")):
requests.get("http://%s" % ip, allow_redirects=False, timeout=timeout)
else:
socket.socket(socket.AF_INET, socket.SOCK_STREAM).connect((ip, 80))
return False
except: # pylint: disable=bare-except
pass
return True

View File

@@ -14,7 +14,7 @@
import jsonrpc # pylint: disable=import-error import jsonrpc # pylint: disable=import-error
from platformio.commands.account.client import AccountClient from platformio.clients.account import AccountClient
class AccountRPC(object): class AccountRPC(object):

View File

@@ -17,15 +17,15 @@ import time
from twisted.internet import defer, reactor # pylint: disable=import-error from twisted.internet import defer, reactor # pylint: disable=import-error
from platformio import app from platformio.cache import ContentCache
from platformio.commands.home.rpc.handlers.os import OSRPC from platformio.commands.home.rpc.handlers.os import OSRPC
class MiscRPC(object): class MiscRPC(object):
def load_latest_tweets(self, data_url): def load_latest_tweets(self, data_url):
cache_key = app.ContentCache.key_from_args(data_url, "tweets") cache_key = ContentCache.key_from_args(data_url, "tweets")
cache_valid = "7d" cache_valid = "180d"
with app.ContentCache() as cc: with ContentCache() as cc:
cache_data = cc.get(cache_key) cache_data = cc.get(cache_key)
if cache_data: if cache_data:
cache_data = json.loads(cache_data) cache_data = json.loads(cache_data)
@@ -43,7 +43,7 @@ class MiscRPC(object):
@defer.inlineCallbacks @defer.inlineCallbacks
def _preload_latest_tweets(data_url, cache_key, cache_valid): def _preload_latest_tweets(data_url, cache_key, cache_valid):
result = json.loads((yield OSRPC.fetch_content(data_url))) result = json.loads((yield OSRPC.fetch_content(data_url)))
with app.ContentCache() as cc: with ContentCache() as cc:
cc.set( cc.set(
cache_key, cache_key,
json.dumps({"time": int(time.time()), "result": result}), json.dumps({"time": int(time.time()), "result": result}),

View File

@@ -14,7 +14,6 @@
from __future__ import absolute_import from __future__ import absolute_import
import glob
import io import io
import os import os
import shutil import shutil
@@ -23,9 +22,11 @@ from functools import cmp_to_key
import click import click
from twisted.internet import defer # pylint: disable=import-error from twisted.internet import defer # pylint: disable=import-error
from platformio import app, fs, util from platformio import __default_requests_timeout__, fs, util
from platformio.cache import ContentCache
from platformio.clients.http import ensure_internet_on
from platformio.commands.home import helpers from platformio.commands.home import helpers
from platformio.compat import PY2, get_filesystem_encoding from platformio.compat import PY2, get_filesystem_encoding, glob_recursive
class OSRPC(object): class OSRPC(object):
@@ -40,26 +41,30 @@ class OSRPC(object):
"Safari/603.3.8" "Safari/603.3.8"
) )
} }
cache_key = app.ContentCache.key_from_args(uri, data) if cache_valid else None cache_key = ContentCache.key_from_args(uri, data) if cache_valid else None
with app.ContentCache() as cc: with ContentCache() as cc:
if cache_key: if cache_key:
result = cc.get(cache_key) result = cc.get(cache_key)
if result is not None: if result is not None:
defer.returnValue(result) defer.returnValue(result)
# check internet before and resolve issue with 60 seconds timeout # check internet before and resolve issue with 60 seconds timeout
util.internet_on(raise_exception=True) ensure_internet_on(raise_exception=True)
session = helpers.requests_session() session = helpers.requests_session()
if data: if data:
r = yield session.post(uri, data=data, headers=headers) r = yield session.post(
uri, data=data, headers=headers, timeout=__default_requests_timeout__
)
else: else:
r = yield session.get(uri, headers=headers) r = yield session.get(
uri, headers=headers, timeout=__default_requests_timeout__
)
r.raise_for_status() r.raise_for_status()
result = r.text result = r.text
if cache_valid: if cache_valid:
with app.ContentCache() as cc: with ContentCache() as cc:
cc.set(cache_key, result, cache_valid) cc.set(cache_key, result, cache_valid)
defer.returnValue(result) defer.returnValue(result)
@@ -115,7 +120,9 @@ class OSRPC(object):
pathnames = [pathnames] pathnames = [pathnames]
result = set() result = set()
for pathname in pathnames: for pathname in pathnames:
result |= set(glob.glob(os.path.join(root, pathname) if root else pathname)) result |= set(
glob_recursive(os.path.join(root, pathname) if root else pathname)
)
return list(result) return list(result)
@staticmethod @staticmethod
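
The fetch path above now checks connectivity with ensure_internet_on(), passes an explicit __default_requests_timeout__, and stores responses in ContentCache. A simplified synchronous sketch of that flow, using plain requests instead of the twisted session wrapper (an approximation, not the code in this file):

    import requests

    from platformio import __default_requests_timeout__
    from platformio.cache import ContentCache
    from platformio.clients.http import ensure_internet_on

    def fetch_content_cached(uri, cache_valid="1d"):
        cache_key = ContentCache.key_from_args(uri, "get")
        with ContentCache() as cc:
            cached = cc.get(cache_key)
            if cached is not None:
                return cached
        ensure_internet_on(raise_exception=True)
        r = requests.get(uri, timeout=__default_requests_timeout__)
        r.raise_for_status()
        with ContentCache() as cc:
            cc.set(cache_key, r.text, cache_valid)
        return r.text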

View File

@@ -25,7 +25,7 @@ from platformio.commands.home.rpc.handlers.app import AppRPC
from platformio.commands.home.rpc.handlers.piocore import PIOCoreRPC from platformio.commands.home.rpc.handlers.piocore import PIOCoreRPC
from platformio.compat import PY2, get_filesystem_encoding from platformio.compat import PY2, get_filesystem_encoding
from platformio.ide.projectgenerator import ProjectGenerator from platformio.ide.projectgenerator import ProjectGenerator
from platformio.managers.platform import PlatformManager from platformio.package.manager.platform import PlatformPackageManager
from platformio.project.config import ProjectConfig from platformio.project.config import ProjectConfig
from platformio.project.exception import ProjectError from platformio.project.exception import ProjectError
from platformio.project.helpers import get_project_dir, is_platformio_project from platformio.project.helpers import get_project_dir, is_platformio_project
@@ -105,7 +105,7 @@ class ProjectRPC(object):
return (os.path.sep).join(path.split(os.path.sep)[-2:]) return (os.path.sep).join(path.split(os.path.sep)[-2:])
result = [] result = []
pm = PlatformManager() pm = PlatformPackageManager()
for project_dir in AppRPC.load_state()["storage"]["recentProjects"]: for project_dir in AppRPC.load_state()["storage"]["recentProjects"]:
if not os.path.isdir(project_dir): if not os.path.isdir(project_dir):
continue continue
@@ -148,8 +148,9 @@ class ProjectRPC(object):
@staticmethod @staticmethod
def get_project_examples(): def get_project_examples():
result = [] result = []
for manifest in PlatformManager().get_installed(): pm = PlatformPackageManager()
examples_dir = os.path.join(manifest["__pkg_dir"], "examples") for pkg in pm.get_installed():
examples_dir = os.path.join(pkg.path, "examples")
if not os.path.isdir(examples_dir): if not os.path.isdir(examples_dir):
continue continue
items = [] items = []
@@ -172,6 +173,7 @@ class ProjectRPC(object):
"description": project_description, "description": project_description,
} }
) )
manifest = pm.load_manifest(pkg)
result.append( result.append(
{ {
"platform": { "platform": {

View File

@@ -11,20 +11,3 @@
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
from platformio.exception import PlatformioException
class AccountError(PlatformioException):
MESSAGE = "{0}"
class AccountNotAuthorized(AccountError):
MESSAGE = "You are not authorized! Please log in to PIO Account."
class AccountAlreadyAuthorized(AccountError):
MESSAGE = "You are already authorized with {0} account."

View File

@@ -18,18 +18,21 @@ import os
import time import time
import click import click
import semantic_version
from tabulate import tabulate from tabulate import tabulate
from platformio import exception, fs, util from platformio import exception, fs, util
from platformio.commands import PlatformioCLI from platformio.commands import PlatformioCLI
from platformio.commands.lib.helpers import (
get_builtin_libs,
is_builtin_lib,
save_project_libdeps,
)
from platformio.compat import dump_json_to_unicode from platformio.compat import dump_json_to_unicode
from platformio.managers.lib import LibraryManager, get_builtin_libs, is_builtin_lib from platformio.package.exception import NotGlobalLibDir, UnknownPackageError
from platformio.package.manifest.parser import ManifestParserFactory from platformio.package.manager.library import LibraryPackageManager
from platformio.package.manifest.schema import ManifestSchema from platformio.package.meta import PackageItem, PackageSpec
from platformio.proc import is_ci from platformio.proc import is_ci
from platformio.project.config import ProjectConfig from platformio.project.config import ProjectConfig
from platformio.project.exception import InvalidProjectConfError
from platformio.project.helpers import get_project_dir, is_platformio_project from platformio.project.helpers import get_project_dir, is_platformio_project
try: try:
@@ -47,7 +50,7 @@ def get_project_global_lib_dir():
return ProjectConfig.get_instance().get_optional_dir("globallib") return ProjectConfig.get_instance().get_optional_dir("globallib")
@click.group(short_help="Library Manager") @click.group(short_help="Library manager")
@click.option( @click.option(
"-d", "-d",
"--storage-dir", "--storage-dir",
@@ -94,7 +97,7 @@ def cli(ctx, **options):
) )
if not storage_dirs: if not storage_dirs:
raise exception.NotGlobalLibDir( raise NotGlobalLibDir(
get_project_dir(), get_project_global_lib_dir(), ctx.invoked_subcommand get_project_dir(), get_project_global_lib_dir(), ctx.invoked_subcommand
) )
@@ -126,89 +129,106 @@ def cli(ctx, **options):
@cli.command("install", short_help="Install library") @cli.command("install", short_help="Install library")
@click.argument("libraries", required=False, nargs=-1, metavar="[LIBRARY...]") @click.argument("libraries", required=False, nargs=-1, metavar="[LIBRARY...]")
@click.option( @click.option(
"--save", "--save/--no-save",
is_flag=True, is_flag=True,
help="Save installed libraries into the `platformio.ini` dependency list", default=True,
help="Save installed libraries into the `platformio.ini` dependency list"
" (enabled by default)",
) )
@click.option("-s", "--silent", is_flag=True, help="Suppress progress reporting") @click.option("-s", "--silent", is_flag=True, help="Suppress progress reporting")
@click.option( @click.option(
"--interactive", is_flag=True, help="Allow to make a choice for all prompts" "--interactive",
is_flag=True,
help="Deprecated! Please use a strict dependency specification (owner/libname)",
) )
@click.option( @click.option(
"-f", "--force", is_flag=True, help="Reinstall/redownload library if exists" "-f", "--force", is_flag=True, help="Reinstall/redownload library if exists"
) )
@click.pass_context @click.pass_context
def lib_install( # pylint: disable=too-many-arguments def lib_install( # pylint: disable=too-many-arguments,unused-argument
ctx, libraries, save, silent, interactive, force ctx, libraries, save, silent, interactive, force
): ):
storage_dirs = ctx.meta[CTX_META_STORAGE_DIRS_KEY] storage_dirs = ctx.meta[CTX_META_STORAGE_DIRS_KEY]
storage_libdeps = ctx.meta.get(CTX_META_STORAGE_LIBDEPS_KEY, []) storage_libdeps = ctx.meta.get(CTX_META_STORAGE_LIBDEPS_KEY, [])
installed_manifests = {} installed_pkgs = {}
for storage_dir in storage_dirs: for storage_dir in storage_dirs:
if not silent and (libraries or storage_dir in storage_libdeps): if not silent and (libraries or storage_dir in storage_libdeps):
print_storage_header(storage_dirs, storage_dir) print_storage_header(storage_dirs, storage_dir)
lm = LibraryManager(storage_dir) lm = LibraryPackageManager(storage_dir)
if libraries: if libraries:
for library in libraries: installed_pkgs = {
pkg_dir = lm.install( library: lm.install(library, silent=silent, force=force)
library, silent=silent, interactive=interactive, force=force for library in libraries
) }
installed_manifests[library] = lm.load_manifest(pkg_dir)
elif storage_dir in storage_libdeps: elif storage_dir in storage_libdeps:
builtin_lib_storages = None builtin_lib_storages = None
for library in storage_libdeps[storage_dir]: for library in storage_libdeps[storage_dir]:
try: try:
pkg_dir = lm.install( lm.install(library, silent=silent, force=force)
library, silent=silent, interactive=interactive, force=force except UnknownPackageError as e:
)
installed_manifests[library] = lm.load_manifest(pkg_dir)
except exception.LibNotFound as e:
if builtin_lib_storages is None: if builtin_lib_storages is None:
builtin_lib_storages = get_builtin_libs() builtin_lib_storages = get_builtin_libs()
if not silent or not is_builtin_lib(builtin_lib_storages, library): if not silent or not is_builtin_lib(builtin_lib_storages, library):
click.secho("Warning! %s" % e, fg="yellow") click.secho("Warning! %s" % e, fg="yellow")
if not save or not libraries: if save and installed_pkgs:
return _save_deps(ctx, installed_pkgs)
def _save_deps(ctx, pkgs, action="add"):
specs = []
for library, pkg in pkgs.items():
spec = PackageSpec(library)
if spec.external:
specs.append(spec)
else:
specs.append(
PackageSpec(
owner=pkg.metadata.spec.owner,
name=pkg.metadata.spec.name,
requirements=spec.requirements
or (
("^%s" % pkg.metadata.version)
if not pkg.metadata.version.build
else pkg.metadata.version
),
)
)
input_dirs = ctx.meta.get(CTX_META_INPUT_DIRS_KEY, []) input_dirs = ctx.meta.get(CTX_META_INPUT_DIRS_KEY, [])
project_environments = ctx.meta[CTX_META_PROJECT_ENVIRONMENTS_KEY] project_environments = ctx.meta[CTX_META_PROJECT_ENVIRONMENTS_KEY]
for input_dir in input_dirs: for input_dir in input_dirs:
config = ProjectConfig.get_instance(os.path.join(input_dir, "platformio.ini")) if not is_platformio_project(input_dir):
config.validate(project_environments)
for env in config.envs():
if project_environments and env not in project_environments:
continue continue
config.expand_interpolations = False save_project_libdeps(input_dir, specs, project_environments, action=action)
try:
lib_deps = config.get("env:" + env, "lib_deps")
except InvalidProjectConfError:
lib_deps = []
for library in libraries:
if library in lib_deps:
continue
manifest = installed_manifests[library]
try:
assert library.lower() == manifest["name"].lower()
assert semantic_version.Version(manifest["version"])
lib_deps.append("{name}@^{version}".format(**manifest))
except (AssertionError, ValueError):
lib_deps.append(library)
config.set("env:" + env, "lib_deps", lib_deps)
config.save()
@cli.command("uninstall", short_help="Uninstall libraries") @cli.command("uninstall", short_help="Remove libraries")
@click.argument("libraries", nargs=-1, metavar="[LIBRARY...]") @click.argument("libraries", nargs=-1, metavar="[LIBRARY...]")
@click.option(
"--save/--no-save",
is_flag=True,
default=True,
help="Remove libraries from the `platformio.ini` dependency list and save changes"
" (enabled by default)",
)
@click.option("-s", "--silent", is_flag=True, help="Suppress progress reporting")
@click.pass_context @click.pass_context
def lib_uninstall(ctx, libraries): def lib_uninstall(ctx, libraries, save, silent):
storage_dirs = ctx.meta[CTX_META_STORAGE_DIRS_KEY] storage_dirs = ctx.meta[CTX_META_STORAGE_DIRS_KEY]
uninstalled_pkgs = {}
for storage_dir in storage_dirs: for storage_dir in storage_dirs:
print_storage_header(storage_dirs, storage_dir) print_storage_header(storage_dirs, storage_dir)
lm = LibraryManager(storage_dir) lm = LibraryPackageManager(storage_dir)
for library in libraries: uninstalled_pkgs = {
lm.uninstall(library) library: lm.uninstall(library, silent=silent) for library in libraries
}
if save and uninstalled_pkgs:
_save_deps(ctx, uninstalled_pkgs, action="remove")
@cli.command("update", short_help="Update installed libraries") @cli.command("update", short_help="Update installed libraries")
@@ -222,42 +242,58 @@ def lib_uninstall(ctx, libraries):
@click.option( @click.option(
"--dry-run", is_flag=True, help="Do not update, only check for the new versions" "--dry-run", is_flag=True, help="Do not update, only check for the new versions"
) )
@click.option("-s", "--silent", is_flag=True, help="Suppress progress reporting")
@click.option("--json-output", is_flag=True) @click.option("--json-output", is_flag=True)
@click.pass_context @click.pass_context
def lib_update(ctx, libraries, only_check, dry_run, json_output): def lib_update( # pylint: disable=too-many-arguments
ctx, libraries, only_check, dry_run, silent, json_output
):
storage_dirs = ctx.meta[CTX_META_STORAGE_DIRS_KEY] storage_dirs = ctx.meta[CTX_META_STORAGE_DIRS_KEY]
only_check = dry_run or only_check only_check = dry_run or only_check
json_result = {} json_result = {}
for storage_dir in storage_dirs: for storage_dir in storage_dirs:
if not json_output: if not json_output:
print_storage_header(storage_dirs, storage_dir) print_storage_header(storage_dirs, storage_dir)
lm = LibraryManager(storage_dir) lib_deps = ctx.meta.get(CTX_META_STORAGE_LIBDEPS_KEY, {}).get(storage_dir, [])
lm = LibraryPackageManager(storage_dir)
_libraries = libraries _libraries = libraries or lib_deps or lm.get_installed()
if not _libraries:
_libraries = [manifest["__pkg_dir"] for manifest in lm.get_installed()]
if only_check and json_output: if only_check and json_output:
result = [] result = []
for library in _libraries: for library in _libraries:
pkg_dir = library if os.path.isdir(library) else None spec = None
requirements = None pkg = None
url = None if isinstance(library, PackageItem):
if not pkg_dir: pkg = library
name, requirements, url = lm.parse_pkg_uri(library) else:
pkg_dir = lm.get_package_dir(name, requirements, url) spec = PackageSpec(library)
if not pkg_dir: pkg = lm.get_package(spec)
if not pkg:
continue continue
latest = lm.outdated(pkg_dir, requirements) outdated = lm.outdated(pkg, spec)
if not latest: if not outdated.is_outdated(allow_incompatible=True):
continue continue
manifest = lm.load_manifest(pkg_dir) manifest = lm.legacy_load_manifest(pkg)
manifest["versionLatest"] = latest manifest["versionWanted"] = (
str(outdated.wanted) if outdated.wanted else None
)
manifest["versionLatest"] = (
str(outdated.latest) if outdated.latest else None
)
result.append(manifest) result.append(manifest)
json_result[storage_dir] = result json_result[storage_dir] = result
else: else:
for library in _libraries: for library in _libraries:
lm.update(library, only_check=only_check) to_spec = (
None if isinstance(library, PackageItem) else PackageSpec(library)
)
try:
lm.update(
library, to_spec=to_spec, only_check=only_check, silent=silent
)
except UnknownPackageError as e:
if library not in lib_deps:
raise e
if json_output: if json_output:
return click.echo( return click.echo(
@@ -278,8 +314,8 @@ def lib_list(ctx, json_output):
for storage_dir in storage_dirs: for storage_dir in storage_dirs:
if not json_output: if not json_output:
print_storage_header(storage_dirs, storage_dir) print_storage_header(storage_dirs, storage_dir)
lm = LibraryManager(storage_dir) lm = LibraryPackageManager(storage_dir)
items = lm.get_installed() items = lm.legacy_get_installed()
if json_output: if json_output:
json_result[storage_dir] = items json_result[storage_dir] = items
elif items: elif items:
@@ -303,6 +339,7 @@ def lib_list(ctx, json_output):
@click.option("--json-output", is_flag=True) @click.option("--json-output", is_flag=True)
@click.option("--page", type=click.INT, default=1) @click.option("--page", type=click.INT, default=1)
@click.option("--id", multiple=True) @click.option("--id", multiple=True)
@click.option("-o", "--owner", multiple=True)
@click.option("-n", "--name", multiple=True) @click.option("-n", "--name", multiple=True)
@click.option("-a", "--author", multiple=True) @click.option("-a", "--author", multiple=True)
@click.option("-k", "--keyword", multiple=True) @click.option("-k", "--keyword", multiple=True)
@@ -315,6 +352,7 @@ def lib_list(ctx, json_output):
help="Do not prompt, automatically paginate with delay", help="Do not prompt, automatically paginate with delay",
) )
def lib_search(query, json_output, page, noninteractive, **filters): def lib_search(query, json_output, page, noninteractive, **filters):
regclient = LibraryPackageManager().get_registry_client_instance()
if not query: if not query:
query = [] query = []
if not isinstance(query, list): if not isinstance(query, list):
@@ -324,8 +362,11 @@ def lib_search(query, json_output, page, noninteractive, **filters):
for value in values: for value in values:
query.append('%s:"%s"' % (key, value)) query.append('%s:"%s"' % (key, value))
result = util.get_api_result( result = regclient.fetch_json_data(
"/v2/lib/search", dict(query=" ".join(query), page=page), cache_valid="1d" "get",
"/v2/lib/search",
params=dict(query=" ".join(query), page=page),
cache_valid="1d",
) )
if json_output: if json_output:
@@ -374,9 +415,10 @@ def lib_search(query, json_output, page, noninteractive, **filters):
time.sleep(5) time.sleep(5)
elif not click.confirm("Show next libraries?"): elif not click.confirm("Show next libraries?"):
break break
result = util.get_api_result( result = regclient.fetch_json_data(
"get",
"/v2/lib/search", "/v2/lib/search",
{"query": " ".join(query), "page": int(result["page"]) + 1}, params=dict(query=" ".join(query), page=int(result["page"]) + 1),
cache_valid="1d", cache_valid="1d",
) )
@@ -406,23 +448,20 @@ def lib_builtin(storage, json_output):
@click.argument("library", metavar="[LIBRARY]") @click.argument("library", metavar="[LIBRARY]")
@click.option("--json-output", is_flag=True) @click.option("--json-output", is_flag=True)
def lib_show(library, json_output): def lib_show(library, json_output):
lm = LibraryManager() lm = LibraryPackageManager()
name, requirements, _ = lm.parse_pkg_uri(library) lib_id = lm.reveal_registry_package_id(library, silent=json_output)
lib_id = lm.search_lib_id( regclient = lm.get_registry_client_instance()
{"name": name, "requirements": requirements}, lib = regclient.fetch_json_data("get", "/v2/lib/info/%d" % lib_id, cache_valid="1h")
silent=json_output,
interactive=not json_output,
)
lib = util.get_api_result("/lib/info/%d" % lib_id, cache_valid="1d")
if json_output: if json_output:
return click.echo(dump_json_to_unicode(lib)) return click.echo(dump_json_to_unicode(lib))
click.secho(lib["name"], fg="cyan") title = "{ownername}/{name}".format(**lib)
click.echo("=" * len(lib["name"])) click.secho(title, fg="cyan")
click.secho("#ID: %d" % lib["id"], bold=True) click.echo("=" * len(title))
click.echo(lib["description"]) click.echo(lib["description"])
click.echo() click.echo()
click.secho("ID: %d" % lib["id"])
click.echo( click.echo(
"Version: %s, released %s" "Version: %s, released %s"
% ( % (
@@ -445,7 +484,7 @@ def lib_show(library, json_output):
for author in lib.get("authors", []): for author in lib.get("authors", []):
_data = [] _data = []
for key in ("name", "email", "url", "maintainer"): for key in ("name", "email", "url", "maintainer"):
if not author[key]: if not author.get(key):
continue continue
if key == "email": if key == "email":
_data.append("<%s>" % author[key]) _data.append("<%s>" % author[key])
@@ -495,29 +534,19 @@ def lib_show(library, json_output):
return True return True
@cli.command("register", short_help="Register a new library") @cli.command("register", short_help="Deprecated")
@click.argument("config_url") @click.argument("config_url")
def lib_register(config_url): def lib_register(config_url): # pylint: disable=unused-argument
if not config_url.startswith("http://") and not config_url.startswith("https://"): raise exception.UserSideException(
raise exception.InvalidLibConfURL(config_url) "This command is deprecated. Please use `pio package publish` command."
# Validate manifest
ManifestSchema().load_manifest(
ManifestParserFactory.new_from_url(config_url).as_dict()
)
result = util.get_api_result("/lib/register", data=dict(config_url=config_url))
if "message" in result and result["message"]:
click.secho(
result["message"],
fg="green" if "successed" in result and result["successed"] else "red",
) )
@cli.command("stats", short_help="Library Registry Statistics") @cli.command("stats", short_help="Library Registry Statistics")
@click.option("--json-output", is_flag=True) @click.option("--json-output", is_flag=True)
def lib_stats(json_output): def lib_stats(json_output):
result = util.get_api_result("/lib/stats", cache_valid="1h") regclient = LibraryPackageManager().get_registry_client_instance()
result = regclient.fetch_json_data("get", "/v2/lib/stats", cache_valid="1h")
if json_output: if json_output:
return click.echo(dump_json_to_unicode(result)) return click.echo(dump_json_to_unicode(result))
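
For reference, a condensed sketch of the v5 library-manager calls that replace LibraryManager throughout this file; the owner/name spec is only an example, and a configured storage plus network access are assumed:

    from platformio.package.manager.library import LibraryPackageManager
    from platformio.package.meta import PackageSpec

    lm = LibraryPackageManager()  # global storage; a storage_dir path may be passed instead
    pkg = lm.install("bblanchon/ArduinoJson", silent=True, force=False)

    spec = PackageSpec("bblanchon/ArduinoJson")
    outdated = lm.outdated(pkg, spec)
    if outdated.is_outdated(allow_incompatible=True):
        lm.update(pkg, to_spec=spec, only_check=False, silent=False)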

View File

@@ -0,0 +1,94 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
from platformio.compat import ci_strings_are_equal
from platformio.package.manager.platform import PlatformPackageManager
from platformio.package.meta import PackageSpec
from platformio.platform.factory import PlatformFactory
from platformio.project.config import ProjectConfig
from platformio.project.exception import InvalidProjectConfError
def get_builtin_libs(storage_names=None):
# pylint: disable=import-outside-toplevel
from platformio.package.manager.library import LibraryPackageManager
items = []
storage_names = storage_names or []
pm = PlatformPackageManager()
for pkg in pm.get_installed():
p = PlatformFactory.new(pkg)
for storage in p.get_lib_storages():
if storage_names and storage["name"] not in storage_names:
continue
lm = LibraryPackageManager(storage["path"])
items.append(
{
"name": storage["name"],
"path": storage["path"],
"items": lm.legacy_get_installed(),
}
)
return items
def is_builtin_lib(storages, name):
for storage in storages or []:
if any(lib.get("name") == name for lib in storage["items"]):
return True
return False
def ignore_deps_by_specs(deps, specs):
result = []
for dep in deps:
depspec = PackageSpec(dep)
if depspec.external:
result.append(dep)
continue
ignore_conditions = []
for spec in specs:
if depspec.owner:
ignore_conditions.append(
ci_strings_are_equal(depspec.owner, spec.owner)
and ci_strings_are_equal(depspec.name, spec.name)
)
else:
ignore_conditions.append(ci_strings_are_equal(depspec.name, spec.name))
if not any(ignore_conditions):
result.append(dep)
return result
def save_project_libdeps(project_dir, specs, environments=None, action="add"):
config = ProjectConfig.get_instance(os.path.join(project_dir, "platformio.ini"))
config.validate(environments)
for env in config.envs():
if environments and env not in environments:
continue
config.expand_interpolations = False
lib_deps = []
try:
lib_deps = ignore_deps_by_specs(config.get("env:" + env, "lib_deps"), specs)
except InvalidProjectConfError:
pass
if action == "add":
lib_deps.extend(spec.as_dependency() for spec in specs)
if lib_deps:
config.set("env:" + env, "lib_deps", lib_deps)
elif config.has_option("env:" + env, "lib_deps"):
config.remove_option("env:" + env, "lib_deps")
config.save()
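
A hypothetical invocation of the helper defined above; the project path, environment name and dependency spec are placeholders:

    from platformio.commands.lib.helpers import save_project_libdeps
    from platformio.package.meta import PackageSpec

    specs = [PackageSpec("ownername/SomeLib@^1.2.3")]
    # Appends the spec to lib_deps of the "uno" environment in platformio.ini,
    # after ignore_deps_by_specs() has dropped entries matching the same spec.
    save_project_libdeps("/path/to/project", specs, environments=["uno"], action="add")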

platformio/commands/org.py (new file, 150 lines)
View File

@@ -0,0 +1,150 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# pylint: disable=unused-argument
import json
import click
from tabulate import tabulate
from platformio.clients.account import AccountClient
from platformio.commands.account import validate_email, validate_username
@click.group("org", short_help="Manage organizations")
def cli():
pass
def validate_orgname(value):
return validate_username(value, "Organization name")
@cli.command("create", short_help="Create a new organization")
@click.argument(
"orgname", callback=lambda _, __, value: validate_orgname(value),
)
@click.option(
"--email", callback=lambda _, __, value: validate_email(value) if value else value
)
@click.option("--displayname",)
def org_create(orgname, email, displayname):
client = AccountClient()
client.create_org(orgname, email, displayname)
return click.secho(
"The organization `%s` has been successfully created." % orgname, fg="green",
)
@cli.command("list", short_help="List organizations and their members")
@click.option("--json-output", is_flag=True)
def org_list(json_output):
client = AccountClient()
orgs = client.list_orgs()
if json_output:
return click.echo(json.dumps(orgs))
if not orgs:
return click.echo("You do not have any organization")
for org in orgs:
click.echo()
click.secho(org.get("orgname"), fg="cyan")
click.echo("-" * len(org.get("orgname")))
data = []
if org.get("displayname"):
data.append(("Display Name:", org.get("displayname")))
if org.get("email"):
data.append(("Email:", org.get("email")))
data.append(
(
"Owners:",
", ".join((owner.get("username") for owner in org.get("owners"))),
)
)
click.echo(tabulate(data, tablefmt="plain"))
return click.echo()
@cli.command("update", short_help="Update organization")
@click.argument("cur_orgname")
@click.option(
"--orgname",
callback=lambda _, __, value: validate_orgname(value),
help="A new orgname",
)
@click.option("--email")
@click.option("--displayname")
def org_update(cur_orgname, **kwargs):
client = AccountClient()
org = client.get_org(cur_orgname)
del org["owners"]
new_org = org.copy()
if not any(kwargs.values()):
for field in org:
new_org[field] = click.prompt(
field.replace("_", " ").capitalize(), default=org[field]
)
if field == "email":
validate_email(new_org[field])
if field == "orgname":
validate_orgname(new_org[field])
else:
new_org.update(
{key.replace("new_", ""): value for key, value in kwargs.items() if value}
)
client.update_org(cur_orgname, new_org)
return click.secho(
"The organization `%s` has been successfully updated." % cur_orgname,
fg="green",
)
@cli.command("destroy", short_help="Destroy organization")
@click.argument("orgname")
def account_destroy(orgname):
client = AccountClient()
click.confirm(
"Are you sure you want to delete the `%s` organization account?\n"
"Warning! All linked data will be permanently removed and can not be restored."
% orgname,
abort=True,
)
client.destroy_org(orgname)
return click.secho("Organization `%s` has been destroyed." % orgname, fg="green",)
@cli.command("add", short_help="Add a new owner to organization")
@click.argument("orgname",)
@click.argument("username",)
def org_add_owner(orgname, username):
client = AccountClient()
client.add_org_owner(orgname, username)
return click.secho(
"The new owner `%s` has been successfully added to the `%s` organization."
% (username, orgname),
fg="green",
)
@cli.command("remove", short_help="Remove an owner from organization")
@click.argument("orgname",)
@click.argument("username",)
def org_remove_owner(orgname, username):
client = AccountClient()
client.remove_org_owner(orgname, username)
return click.secho(
"The `%s` owner has been successfully removed from the `%s` organization."
% (username, orgname),
fg="green",
)
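
A rough sketch of the AccountClient organization calls driven by the new pio org commands above; the org and user names are placeholders and an authorized PIO account is assumed:

    from platformio.clients.account import AccountClient

    client = AccountClient()
    client.create_org("example-org", "owner@example.com", "Example Org")
    client.add_org_owner("example-org", "another-user")
    for org in client.list_orgs():
        print(org.get("orgname"), [o.get("username") for o in org.get("owners")])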

View File

@@ -0,0 +1,113 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
from datetime import datetime
import click
from platformio.clients.registry import RegistryClient
from platformio.package.meta import PackageSpec, PackageType
from platformio.package.pack import PackagePacker
def validate_datetime(ctx, param, value): # pylint: disable=unused-argument
if not value:
return value
try:
datetime.strptime(value, "%Y-%m-%d %H:%M:%S")
except ValueError as e:
raise click.BadParameter(e)
return value
@click.group("package", short_help="Package manager")
def cli():
pass
@cli.command("pack", short_help="Create a tarball from a package")
@click.argument(
"package",
required=True,
default=os.getcwd,
metavar="<source directory, tar.gz or zip>",
)
@click.option(
"-o", "--output", help="A destination path (folder or a full path to file)"
)
def package_pack(package, output):
p = PackagePacker(package)
archive_path = p.pack(output)
click.secho('Wrote a tarball to "%s"' % archive_path, fg="green")
@cli.command("publish", short_help="Publish a package to the registry")
@click.argument(
"package",
required=True,
default=os.getcwd,
metavar="<source directory, tar.gz or zip>",
)
@click.option(
"--owner",
help="PIO Account username (can be organization username). "
"Default is set to a username of the authorized PIO Account",
)
@click.option(
"--released-at",
callback=validate_datetime,
help="Custom release date and time in the next format (UTC): 2014-06-13 17:08:52",
)
@click.option("--private", is_flag=True, help="Restricted access (not a public)")
@click.option(
"--notify/--no-notify",
default=True,
help="Notify by email when package is processed",
)
def package_publish(package, owner, released_at, private, notify):
p = PackagePacker(package)
archive_path = p.pack()
response = RegistryClient().publish_package(
archive_path, owner, released_at, private, notify
)
os.remove(archive_path)
click.secho(response.get("message"), fg="green")
@cli.command("unpublish", short_help="Remove a pushed package from the registry")
@click.argument(
"package", required=True, metavar="[<organization>/]<pkgname>[@<version>]"
)
@click.option(
"--type",
type=click.Choice(list(PackageType.items().values())),
default="library",
help="Package type, default is set to `library`",
)
@click.option(
"--undo",
is_flag=True,
help="Undo a remove, putting a version back into the registry",
)
def package_unpublish(package, type, undo): # pylint: disable=redefined-builtin
spec = PackageSpec(package)
response = RegistryClient().unpublish_package(
type=type,
name=spec.name,
owner=spec.owner,
version=spec.requirements,
undo=undo,
)
click.secho(response.get("message"), fg="green")
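
A sketch of what pio package publish does under the hood, as laid out above; the source directory is a placeholder and an authorized account with publishing rights is assumed:

    import os

    from platformio.clients.registry import RegistryClient
    from platformio.package.pack import PackagePacker

    archive_path = PackagePacker("/path/to/my_package").pack()
    # Arguments mirror the CLI defaults above: owner=None (authorized account),
    # released_at=None, private=False, notify=True.
    response = RegistryClient().publish_package(archive_path, None, None, False, True)
    os.remove(archive_path)
    print(response.get("message"))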

View File

@@ -12,18 +12,21 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
from os.path import dirname, isdir import os
import click import click
from platformio import app, exception, util from platformio.cache import cleanup_content_cache
from platformio.commands.boards import print_boards from platformio.commands.boards import print_boards
from platformio.compat import dump_json_to_unicode from platformio.compat import dump_json_to_unicode
from platformio.managers.platform import PlatformFactory, PlatformManager from platformio.package.manager.platform import PlatformPackageManager
from platformio.package.pack import PackagePacker from platformio.package.meta import PackageItem, PackageSpec
from platformio.package.version import get_original_version
from platformio.platform.exception import UnknownPlatform
from platformio.platform.factory import PlatformFactory
@click.group(short_help="Platform Manager") @click.group(short_help="Platform manager")
def cli(): def cli():
pass pass
@@ -47,7 +50,7 @@ def _print_platforms(platforms):
if "version" in platform: if "version" in platform:
if "__src_url" in platform: if "__src_url" in platform:
click.echo( click.echo(
"Version: #%s (%s)" % (platform["version"], platform["__src_url"]) "Version: %s (%s)" % (platform["version"], platform["__src_url"])
) )
else: else:
click.echo("Version: " + platform["version"]) click.echo("Version: " + platform["version"])
@@ -55,31 +58,27 @@ def _print_platforms(platforms):
def _get_registry_platforms(): def _get_registry_platforms():
platforms = util.get_api_result("/platforms", cache_valid="7d") regclient = PlatformPackageManager().get_registry_client_instance()
pm = PlatformManager() return regclient.fetch_json_data("get", "/v2/platforms", cache_valid="1d")
for platform in platforms or []:
platform["versions"] = pm.get_all_repo_versions(platform["name"])
return platforms
def _get_platform_data(*args, **kwargs): def _get_platform_data(*args, **kwargs):
try: try:
return _get_installed_platform_data(*args, **kwargs) return _get_installed_platform_data(*args, **kwargs)
except exception.UnknownPlatform: except UnknownPlatform:
return _get_registry_platform_data(*args, **kwargs) return _get_registry_platform_data(*args, **kwargs)
def _get_installed_platform_data(platform, with_boards=True, expose_packages=True): def _get_installed_platform_data(platform, with_boards=True, expose_packages=True):
p = PlatformFactory.newPlatform(platform) p = PlatformFactory.new(platform)
data = dict( data = dict(
name=p.name, name=p.name,
title=p.title, title=p.title,
description=p.description, description=p.description,
version=p.version, version=p.version,
homepage=p.homepage, homepage=p.homepage,
url=p.homepage,
repository=p.repository_url, repository=p.repository_url,
url=p.vendor_url,
docs=p.docs_url,
license=p.license, license=p.license,
forDesktop=not p.is_embedded(), forDesktop=not p.is_embedded(),
frameworks=sorted(list(p.frameworks) if p.frameworks else []), frameworks=sorted(list(p.frameworks) if p.frameworks else []),
@@ -91,7 +90,9 @@ def _get_installed_platform_data(platform, with_boards=True, expose_packages=Tru
# return data # return data
# overwrite VCS version and add extra fields # overwrite VCS version and add extra fields
manifest = PlatformManager().load_manifest(dirname(p.manifest_path)) manifest = PlatformPackageManager().legacy_load_manifest(
os.path.dirname(p.manifest_path)
)
assert manifest assert manifest
for key in manifest: for key in manifest:
if key == "version" or key.startswith("__"): if key == "version" or key.startswith("__"):
@@ -104,13 +105,15 @@ def _get_installed_platform_data(platform, with_boards=True, expose_packages=Tru
return data return data
data["packages"] = [] data["packages"] = []
installed_pkgs = p.get_installed_packages() installed_pkgs = {
for name, opts in p.packages.items(): pkg.metadata.name: p.pm.load_manifest(pkg) for pkg in p.get_installed_packages()
}
for name, options in p.packages.items():
item = dict( item = dict(
name=name, name=name,
type=p.get_package_type(name), type=p.get_package_type(name),
requirements=opts.get("version"), requirements=options.get("version"),
optional=opts.get("optional") is True, optional=options.get("optional") is True,
) )
if name in installed_pkgs: if name in installed_pkgs:
for key, value in installed_pkgs[name].items(): for key, value in installed_pkgs[name].items():
@@ -118,7 +121,7 @@ def _get_installed_platform_data(platform, with_boards=True, expose_packages=Tru
continue continue
item[key] = value item[key] = value
if key == "version": if key == "version":
item["originalVersion"] = util.get_original_version(value) item["originalVersion"] = get_original_version(value)
data["packages"].append(item) data["packages"].append(item)
return data return data
@@ -137,6 +140,7 @@ def _get_registry_platform_data( # pylint: disable=unused-argument
return None return None
data = dict( data = dict(
ownername=_data.get("ownername"),
name=_data["name"], name=_data["name"],
title=_data["title"], title=_data["title"],
description=_data["description"], description=_data["description"],
@@ -147,13 +151,13 @@ def _get_registry_platform_data( # pylint: disable=unused-argument
forDesktop=_data["forDesktop"], forDesktop=_data["forDesktop"],
frameworks=_data["frameworks"], frameworks=_data["frameworks"],
packages=_data["packages"], packages=_data["packages"],
versions=_data["versions"], versions=_data.get("versions"),
) )
if with_boards: if with_boards:
data["boards"] = [ data["boards"] = [
board board
for board in PlatformManager().get_registered_boards() for board in PlatformPackageManager().get_registered_boards()
if board["platform"] == _data["name"] if board["platform"] == _data["name"]
] ]
@@ -187,8 +191,11 @@ def platform_search(query, json_output):
@click.argument("query", required=False) @click.argument("query", required=False)
@click.option("--json-output", is_flag=True) @click.option("--json-output", is_flag=True)
def platform_frameworks(query, json_output): def platform_frameworks(query, json_output):
regclient = PlatformPackageManager().get_registry_client_instance()
frameworks = [] frameworks = []
for framework in util.get_api_result("/frameworks", cache_valid="7d"): for framework in regclient.fetch_json_data(
"get", "/v2/frameworks", cache_valid="1d"
):
if query == "all": if query == "all":
query = "" query = ""
search_data = dump_json_to_unicode(framework) search_data = dump_json_to_unicode(framework)
@@ -213,12 +220,10 @@ def platform_frameworks(query, json_output):
@click.option("--json-output", is_flag=True) @click.option("--json-output", is_flag=True)
def platform_list(json_output): def platform_list(json_output):
platforms = [] platforms = []
pm = PlatformManager() pm = PlatformPackageManager()
for manifest in pm.get_installed(): for pkg in pm.get_installed():
platforms.append( platforms.append(
_get_installed_platform_data( _get_installed_platform_data(pkg, with_boards=False, expose_packages=False)
manifest["__pkg_dir"], with_boards=False, expose_packages=False
)
) )
platforms = sorted(platforms, key=lambda manifest: manifest["name"]) platforms = sorted(platforms, key=lambda manifest: manifest["name"])
@@ -234,16 +239,15 @@ def platform_list(json_output):
def platform_show(platform, json_output): # pylint: disable=too-many-branches def platform_show(platform, json_output): # pylint: disable=too-many-branches
data = _get_platform_data(platform) data = _get_platform_data(platform)
if not data: if not data:
raise exception.UnknownPlatform(platform) raise UnknownPlatform(platform)
if json_output: if json_output:
return click.echo(dump_json_to_unicode(data)) return click.echo(dump_json_to_unicode(data))
dep = "{ownername}/{name}".format(**data) if "ownername" in data else data["name"]
click.echo( click.echo(
"{name} ~ {title}".format( "{dep} ~ {title}".format(dep=click.style(dep, fg="cyan"), title=data["title"])
name=click.style(data["name"], fg="cyan"), title=data["title"]
) )
) click.echo("=" * (3 + len(dep + data["title"])))
click.echo("=" * (3 + len(data["name"] + data["title"])))
click.echo(data["description"]) click.echo(data["description"])
click.echo() click.echo()
if "version" in data: if "version" in data:
@@ -300,6 +304,7 @@ def platform_show(platform, json_output): # pylint: disable=too-many-branches
@click.option("--without-package", multiple=True) @click.option("--without-package", multiple=True)
@click.option("--skip-default-package", is_flag=True) @click.option("--skip-default-package", is_flag=True)
@click.option("--with-all-packages", is_flag=True) @click.option("--with-all-packages", is_flag=True)
@click.option("-s", "--silent", is_flag=True, help="Suppress progress reporting")
@click.option( @click.option(
"-f", "-f",
"--force", "--force",
@@ -312,21 +317,24 @@ def platform_install( # pylint: disable=too-many-arguments
without_package, without_package,
skip_default_package, skip_default_package,
with_all_packages, with_all_packages,
silent,
force, force,
): ):
pm = PlatformManager() pm = PlatformPackageManager()
for platform in platforms: for platform in platforms:
if pm.install( pkg = pm.install(
name=platform, spec=platform,
with_packages=with_package, with_packages=with_package,
without_packages=without_package, without_packages=without_package,
skip_default_package=skip_default_package, skip_default_package=skip_default_package,
with_all_packages=with_all_packages, with_all_packages=with_all_packages,
silent=silent,
force=force, force=force,
): )
if pkg and not silent:
click.secho( click.secho(
"The platform '%s' has been successfully installed!\n" "The platform '%s' has been successfully installed!\n"
"The rest of packages will be installed automatically " "The rest of the packages will be installed later "
"depending on your build environment." % platform, "depending on your build environment." % platform,
fg="green", fg="green",
) )
@@ -335,11 +343,11 @@ def platform_install( # pylint: disable=too-many-arguments
@cli.command("uninstall", short_help="Uninstall development platform") @cli.command("uninstall", short_help="Uninstall development platform")
@click.argument("platforms", nargs=-1, required=True, metavar="[PLATFORM...]") @click.argument("platforms", nargs=-1, required=True, metavar="[PLATFORM...]")
def platform_uninstall(platforms): def platform_uninstall(platforms):
pm = PlatformManager() pm = PlatformPackageManager()
for platform in platforms: for platform in platforms:
if pm.uninstall(platform): if pm.uninstall(platform):
click.secho( click.secho(
"The platform '%s' has been successfully uninstalled!" % platform, "The platform '%s' has been successfully removed!" % platform,
fg="green", fg="green",
) )
@@ -358,66 +366,60 @@ def platform_uninstall(platforms):
@click.option( @click.option(
"--dry-run", is_flag=True, help="Do not update, only check for the new versions" "--dry-run", is_flag=True, help="Do not update, only check for the new versions"
) )
@click.option("-s", "--silent", is_flag=True, help="Suppress progress reporting")
@click.option("--json-output", is_flag=True) @click.option("--json-output", is_flag=True)
def platform_update( # pylint: disable=too-many-locals def platform_update( # pylint: disable=too-many-locals, too-many-arguments
platforms, only_packages, only_check, dry_run, json_output platforms, only_packages, only_check, dry_run, silent, json_output
): ):
pm = PlatformManager() pm = PlatformPackageManager()
pkg_dir_to_name = {} platforms = platforms or pm.get_installed()
if not platforms:
platforms = []
for manifest in pm.get_installed():
platforms.append(manifest["__pkg_dir"])
pkg_dir_to_name[manifest["__pkg_dir"]] = manifest.get(
"title", manifest["name"]
)
only_check = dry_run or only_check only_check = dry_run or only_check
if only_check and json_output: if only_check and json_output:
result = [] result = []
for platform in platforms: for platform in platforms:
pkg_dir = platform if isdir(platform) else None spec = None
requirements = None pkg = None
url = None if isinstance(platform, PackageItem):
if not pkg_dir: pkg = platform
name, requirements, url = pm.parse_pkg_uri(platform) else:
pkg_dir = pm.get_package_dir(name, requirements, url) spec = PackageSpec(platform)
if not pkg_dir: pkg = pm.get_package(spec)
if not pkg:
continue continue
latest = pm.outdated(pkg_dir, requirements) outdated = pm.outdated(pkg, spec)
if ( if (
not latest not outdated.is_outdated(allow_incompatible=True)
and not PlatformFactory.newPlatform(pkg_dir).are_outdated_packages() and not PlatformFactory.new(pkg).are_outdated_packages()
): ):
continue continue
data = _get_installed_platform_data( data = _get_installed_platform_data(
pkg_dir, with_boards=False, expose_packages=False pkg, with_boards=False, expose_packages=False
)
if outdated.is_outdated(allow_incompatible=True):
data["versionLatest"] = (
str(outdated.latest) if outdated.latest else None
) )
if latest:
data["versionLatest"] = latest
result.append(data) result.append(data)
return click.echo(dump_json_to_unicode(result)) return click.echo(dump_json_to_unicode(result))
# cleanup cached board and platform lists # cleanup cached board and platform lists
app.clean_cache() cleanup_content_cache("http")
for platform in platforms: for platform in platforms:
click.echo( click.echo(
"Platform %s" "Platform %s"
% click.style(pkg_dir_to_name.get(platform, platform), fg="cyan") % click.style(
platform.metadata.name
if isinstance(platform, PackageItem)
else platform,
fg="cyan",
)
) )
click.echo("--------") click.echo("--------")
pm.update(platform, only_packages=only_packages, only_check=only_check) pm.update(
platform, only_packages=only_packages, only_check=only_check, silent=silent
)
click.echo() click.echo()
return True return True
@cli.command(
"pack", short_help="Create a tarball from development platform/tool package"
)
@click.argument("package", required=True, metavar="[source directory, tar.gz or zip]")
def platform_pack(package):
p = PackagePacker(package)
tarball_path = p.pack()
click.secho('Wrote a tarball to "%s"' % tarball_path, fg="green")
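The new "platform pack" command is a thin wrapper over PackagePacker. A rough equivalent from Python, assuming the import path below (it is not shown in this hunk):

    from platformio.package.pack import PackagePacker  # assumed import path

    tarball_path = PackagePacker(".").pack()  # source directory, tar.gz or zip
    print('Wrote a tarball to "%s"' % tarball_path)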


@@ -12,23 +12,25 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
# pylint: disable=too-many-arguments,too-many-locals, too-many-branches # pylint: disable=too-many-arguments,too-many-locals,too-many-branches,line-too-long
import json
import os import os
import click import click
from tabulate import tabulate from tabulate import tabulate
from platformio import exception, fs from platformio import fs
from platformio.commands.platform import platform_install as cli_platform_install from platformio.commands.platform import platform_install as cli_platform_install
from platformio.ide.projectgenerator import ProjectGenerator from platformio.ide.projectgenerator import ProjectGenerator
from platformio.managers.platform import PlatformManager from platformio.package.manager.platform import PlatformPackageManager
from platformio.platform.exception import UnknownBoard
from platformio.project.config import ProjectConfig from platformio.project.config import ProjectConfig
from platformio.project.exception import NotPlatformIOProjectError from platformio.project.exception import NotPlatformIOProjectError
from platformio.project.helpers import is_platformio_project from platformio.project.helpers import is_platformio_project, load_project_ide_data
@click.group(short_help="Project Manager") @click.group(short_help="Project manager")
def cli(): def cli():
pass pass
@@ -38,9 +40,7 @@ def cli():
"-d", "-d",
"--project-dir", "--project-dir",
default=os.getcwd, default=os.getcwd,
type=click.Path( type=click.Path(exists=True, file_okay=False, dir_okay=True, resolve_path=True),
exists=True, file_okay=True, dir_okay=True, writable=True, resolve_path=True
),
) )
@click.option("--json-output", is_flag=True) @click.option("--json-output", is_flag=True)
def project_config(project_dir, json_output): def project_config(project_dir, json_output):
@@ -54,7 +54,6 @@ def project_config(project_dir, json_output):
"Computed project configuration for %s" % click.style(project_dir, fg="cyan") "Computed project configuration for %s" % click.style(project_dir, fg="cyan")
) )
for section, options in config.as_tuple(): for section, options in config.as_tuple():
click.echo()
click.secho(section, fg="cyan") click.secho(section, fg="cyan")
click.echo("-" * len(section)) click.echo("-" * len(section))
click.echo( click.echo(
@@ -66,15 +65,55 @@ def project_config(project_dir, json_output):
tablefmt="plain", tablefmt="plain",
) )
) )
click.echo()
return None
@cli.command("data", short_help="Dump data intended for IDE extensions/plugins")
@click.option(
"-d",
"--project-dir",
default=os.getcwd,
type=click.Path(exists=True, file_okay=False, dir_okay=True, resolve_path=True),
)
@click.option("-e", "--environment", multiple=True)
@click.option("--json-output", is_flag=True)
def project_data(project_dir, environment, json_output):
if not is_platformio_project(project_dir):
raise NotPlatformIOProjectError(project_dir)
with fs.cd(project_dir):
config = ProjectConfig.get_instance()
config.validate(environment)
environment = list(environment or config.envs())
if json_output:
return click.echo(json.dumps(load_project_ide_data(project_dir, environment)))
for envname in environment:
click.echo("Environment: " + click.style(envname, fg="cyan", bold=True))
click.echo("=" * (13 + len(envname)))
click.echo(
tabulate(
[
(click.style(name, bold=True), "=", json.dumps(value, indent=2))
for name, value in load_project_ide_data(
project_dir, envname
).items()
],
tablefmt="plain",
)
)
click.echo()
return None return None
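The "project data" command above is essentially a pretty-printer for load_project_ide_data(); a short sketch of the same lookup for a hypothetical project location:

    from platformio import fs
    from platformio.project.config import ProjectConfig
    from platformio.project.helpers import load_project_ide_data

    project_dir = "/path/to/project"  # hypothetical path
    with fs.cd(project_dir):
        config = ProjectConfig.get_instance()
        config.validate()
        data = load_project_ide_data(project_dir, config.envs())
    print(list(data))  # one IDE-data dict per environment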
def validate_boards(ctx, param, value): # pylint: disable=W0613 def validate_boards(ctx, param, value): # pylint: disable=W0613
pm = PlatformManager() pm = PlatformPackageManager()
for id_ in value: for id_ in value:
try: try:
pm.board_config(id_) pm.board_config(id_)
except exception.UnknownBoard: except UnknownBoard:
raise click.BadParameter( raise click.BadParameter(
"`%s`. Please search for board ID using `platformio boards` " "`%s`. Please search for board ID using `platformio boards` "
"command" % id_ "command" % id_
@@ -93,6 +132,7 @@ def validate_boards(ctx, param, value): # pylint: disable=W0613
) )
@click.option("-b", "--board", multiple=True, metavar="ID", callback=validate_boards) @click.option("-b", "--board", multiple=True, metavar="ID", callback=validate_boards)
@click.option("--ide", type=click.Choice(ProjectGenerator.get_supported_ides())) @click.option("--ide", type=click.Choice(ProjectGenerator.get_supported_ides()))
@click.option("-e", "--environment", help="Update using existing environment")
@click.option("-O", "--project-option", multiple=True) @click.option("-O", "--project-option", multiple=True)
@click.option("--env-prefix", default="") @click.option("--env-prefix", default="")
@click.option("-s", "--silent", is_flag=True) @click.option("-s", "--silent", is_flag=True)
@@ -102,6 +142,7 @@ def project_init(
project_dir, project_dir,
board, board,
ide, ide,
environment,
project_option, project_option,
env_prefix, env_prefix,
silent, silent,
@@ -139,11 +180,17 @@ def project_init(
) )
if ide: if ide:
pg = ProjectGenerator(project_dir, ide, board) with fs.cd(project_dir):
config = ProjectConfig.get_instance(
os.path.join(project_dir, "platformio.ini")
)
config.validate()
pg = ProjectGenerator(
config, environment or get_best_envname(config, board), ide
)
pg.generate() pg.generate()
if is_new_project: if is_new_project:
init_ci_conf(project_dir)
init_cvs_ignore(project_dir) init_cvs_ignore(project_dir)
if silent: if silent:
@@ -233,7 +280,6 @@ https://gcc.gnu.org/onlinedocs/cpp/Header-Files.html
def init_lib_readme(lib_dir): def init_lib_readme(lib_dir):
# pylint: disable=line-too-long
with open(os.path.join(lib_dir, "README"), "w") as fp: with open(os.path.join(lib_dir, "README"), "w") as fp:
fp.write( fp.write(
""" """
@@ -290,7 +336,7 @@ def init_test_readme(test_dir):
with open(os.path.join(test_dir, "README"), "w") as fp: with open(os.path.join(test_dir, "README"), "w") as fp:
fp.write( fp.write(
""" """
This directory is intended for PIO Unit Testing and project tests. This directory is intended for PlatformIO Unit Testing and project tests.
Unit Testing is a software testing method by which individual units of Unit Testing is a software testing method by which individual units of
source code, sets of one or more MCU program modules together with associated source code, sets of one or more MCU program modules together with associated
@@ -298,89 +344,12 @@ control data, usage procedures, and operating procedures, are tested to
determine whether they are fit for use. Unit testing finds problems early determine whether they are fit for use. Unit testing finds problems early
in the development cycle. in the development cycle.
More information about PIO Unit Testing: More information about PlatformIO Unit Testing:
- https://docs.platformio.org/page/plus/unit-testing.html - https://docs.platformio.org/page/plus/unit-testing.html
""", """,
) )
def init_ci_conf(project_dir):
conf_path = os.path.join(project_dir, ".travis.yml")
if os.path.isfile(conf_path):
return
with open(conf_path, "w") as fp:
fp.write(
"""# Continuous Integration (CI) is the practice, in software
# engineering, of merging all developer working copies with a shared mainline
# several times a day < https://docs.platformio.org/page/ci/index.html >
#
# Documentation:
#
# * Travis CI Embedded Builds with PlatformIO
# < https://docs.travis-ci.com/user/integration/platformio/ >
#
# * PlatformIO integration with Travis CI
# < https://docs.platformio.org/page/ci/travis.html >
#
# * User Guide for `platformio ci` command
# < https://docs.platformio.org/page/userguide/cmd_ci.html >
#
#
# Please choose one of the following templates (proposed below) and uncomment
# it (remove "# " before each line) or use own configuration according to the
# Travis CI documentation (see above).
#
#
# Template #1: General project. Test it using existing `platformio.ini`.
#
# language: python
# python:
# - "2.7"
#
# sudo: false
# cache:
# directories:
# - "~/.platformio"
#
# install:
# - pip install -U platformio
# - platformio update
#
# script:
# - platformio run
#
# Template #2: The project is intended to be used as a library with examples.
#
# language: python
# python:
# - "2.7"
#
# sudo: false
# cache:
# directories:
# - "~/.platformio"
#
# env:
# - PLATFORMIO_CI_SRC=path/to/test/file.c
# - PLATFORMIO_CI_SRC=examples/file.ino
# - PLATFORMIO_CI_SRC=path/to/test/directory
#
# install:
# - pip install -U platformio
# - platformio update
#
# script:
# - platformio ci --lib="." --board=ID_1 --board=ID_2 --board=ID_N
""",
)
def init_cvs_ignore(project_dir): def init_cvs_ignore(project_dir):
conf_path = os.path.join(project_dir, ".gitignore") conf_path = os.path.join(project_dir, ".gitignore")
if os.path.isfile(conf_path): if os.path.isfile(conf_path):
@@ -401,7 +370,7 @@ def fill_project_envs(
if all(cond): if all(cond):
used_boards.append(config.get(section, "board")) used_boards.append(config.get(section, "board"))
pm = PlatformManager() pm = PlatformPackageManager()
used_platforms = [] used_platforms = []
modified = False modified = False
for id_ in board_ids: for id_ in board_ids:
@@ -438,9 +407,31 @@ def fill_project_envs(
def _install_dependent_platforms(ctx, platforms): def _install_dependent_platforms(ctx, platforms):
installed_platforms = [p["name"] for p in PlatformManager().get_installed()] installed_platforms = [
pkg.metadata.name for pkg in PlatformPackageManager().get_installed()
]
if set(platforms) <= set(installed_platforms): if set(platforms) <= set(installed_platforms):
return return
ctx.invoke( ctx.invoke(
cli_platform_install, platforms=list(set(platforms) - set(installed_platforms)) cli_platform_install, platforms=list(set(platforms) - set(installed_platforms))
) )
def get_best_envname(config, board_ids=None):
envname = None
default_envs = config.default_envs()
if default_envs:
envname = default_envs[0]
if not board_ids:
return envname
for env in config.envs():
if not board_ids:
return env
if not envname:
envname = env
items = config.items(env=env, as_dict=True)
if "board" in items and items.get("board") in board_ids:
return env
return envname
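A self-contained illustration of the get_best_envname() helper added above: the first entry of default_envs wins unless one of the requested boards matches an environment (the import path for the helper is assumed):

    import os
    import tempfile

    from platformio.commands.project import get_best_envname  # assumed module path
    from platformio.project.config import ProjectConfig

    ini = "\n".join([
        "[platformio]",
        "default_envs = release",
        "",
        "[env:release]",
        "board = uno",
        "",
        "[env:debug]",
        "board = nodemcuv2",
    ])
    project_dir = tempfile.mkdtemp()
    ini_path = os.path.join(project_dir, "platformio.ini")
    with open(ini_path, "w") as fp:
        fp.write(ini)

    config = ProjectConfig.get_instance(ini_path)
    print(get_best_envname(config))                           # "release"
    print(get_best_envname(config, board_ids=["nodemcuv2"]))  # "debug"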


@@ -72,7 +72,7 @@ class RemoteClientBase( # pylint: disable=too-many-instance-attributes
def connect(self): def connect(self):
self.log.info("Name: {name}", name=self.name) self.log.info("Name: {name}", name=self.name)
self.log.info("Connecting to PIO Remote Cloud") self.log.info("Connecting to PlatformIO Remote Development Cloud")
# pylint: disable=protected-access # pylint: disable=protected-access
proto, options = endpoints._parse(__pioremote_endpoint__) proto, options = endpoints._parse(__pioremote_endpoint__)


@@ -20,7 +20,7 @@ from io import BytesIO
from twisted.spread import pb # pylint: disable=import-error from twisted.spread import pb # pylint: disable=import-error
from platformio import util from platformio import fs
from platformio.commands.remote.client.async_base import AsyncClientBase from platformio.commands.remote.client.async_base import AsyncClientBase
from platformio.commands.remote.projectsync import PROJECT_SYNC_STAGE, ProjectSync from platformio.commands.remote.projectsync import PROJECT_SYNC_STAGE, ProjectSync
from platformio.compat import hashlib_encode_data from platformio.compat import hashlib_encode_data
@@ -64,7 +64,7 @@ class RunOrTestClient(AsyncClientBase):
return "%s-%s" % (os.path.basename(path), h.hexdigest()) return "%s-%s" % (os.path.basename(path), h.hexdigest())
def add_project_items(self, psync): def add_project_items(self, psync):
with util.cd(self.options["project_dir"]): with fs.cd(self.options["project_dir"]):
cfg = ProjectConfig.get_instance( cfg = ProjectConfig.get_instance(
os.path.join(self.options["project_dir"], "platformio.ini") os.path.join(self.options["project_dir"], "platformio.ini")
) )


@@ -29,18 +29,19 @@ from platformio.commands.device.command import device_monitor as cmd_device_moni
from platformio.commands.run.command import cli as cmd_run from platformio.commands.run.command import cli as cmd_run
from platformio.commands.test.command import cli as cmd_test from platformio.commands.test.command import cli as cmd_test
from platformio.compat import PY2 from platformio.compat import PY2
from platformio.managers.core import inject_contrib_pysite from platformio.package.manager.core import inject_contrib_pysite
from platformio.project.exception import NotPlatformIOProjectError from platformio.project.exception import NotPlatformIOProjectError
@click.group("remote", short_help="PIO Remote") @click.group("remote", short_help="Remote development")
@click.option("-a", "--agent", multiple=True) @click.option("-a", "--agent", multiple=True)
@click.pass_context @click.pass_context
def cli(ctx, agent): def cli(ctx, agent):
if PY2: if PY2:
raise exception.UserSideException( raise exception.UserSideException(
"PIO Remote requires Python 3.5 or above. \nPlease install the latest " "PlatformIO Remote Development requires Python 3.5 or above. \n"
"Python 3 and reinstall PlatformIO Core using installation script:\n" "Please install the latest Python 3 and reinstall PlatformIO Core using "
"installation script:\n"
"https://docs.platformio.org/page/core/installation.html" "https://docs.platformio.org/page/core/installation.html"
) )
ctx.obj = agent ctx.obj = agent


@@ -17,7 +17,7 @@ from twisted.internet import defer, protocol, reactor # pylint: disable=import-
from twisted.spread import pb # pylint: disable=import-error from twisted.spread import pb # pylint: disable=import-error
from platformio.app import get_host_id from platformio.app import get_host_id
from platformio.commands.account.client import AccountClient from platformio.clients.account import AccountClient
class RemoteClientFactory(pb.PBClientFactory, protocol.ReconnectingClientFactory): class RemoteClientFactory(pb.PBClientFactory, protocol.ReconnectingClientFactory):


@@ -12,9 +12,9 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
import operator
import os
from multiprocessing import cpu_count from multiprocessing import cpu_count
from os import getcwd
from os.path import isfile
from time import time from time import time
import click import click
@@ -26,7 +26,7 @@ from platformio.commands.run.helpers import clean_build_dir, handle_legacy_libde
from platformio.commands.run.processor import EnvironmentProcessor from platformio.commands.run.processor import EnvironmentProcessor
from platformio.commands.test.processor import CTX_META_TEST_IS_RUNNING from platformio.commands.test.processor import CTX_META_TEST_IS_RUNNING
from platformio.project.config import ProjectConfig from platformio.project.config import ProjectConfig
from platformio.project.helpers import find_project_dir_above from platformio.project.helpers import find_project_dir_above, load_project_ide_data
# pylint: disable=too-many-arguments,too-many-locals,too-many-branches # pylint: disable=too-many-arguments,too-many-locals,too-many-branches
@@ -36,14 +36,14 @@ except NotImplementedError:
DEFAULT_JOB_NUMS = 1 DEFAULT_JOB_NUMS = 1
@click.command("run", short_help="Process project environments") @click.command("run", short_help="Run project targets (build, upload, clean, etc.)")
@click.option("-e", "--environment", multiple=True) @click.option("-e", "--environment", multiple=True)
@click.option("-t", "--target", multiple=True) @click.option("-t", "--target", multiple=True)
@click.option("--upload-port") @click.option("--upload-port")
@click.option( @click.option(
"-d", "-d",
"--project-dir", "--project-dir",
default=getcwd, default=os.getcwd,
type=click.Path( type=click.Path(
exists=True, file_okay=True, dir_okay=True, writable=True, resolve_path=True exists=True, file_okay=True, dir_okay=True, writable=True, resolve_path=True
), ),
@@ -68,6 +68,7 @@ except NotImplementedError:
@click.option("-s", "--silent", is_flag=True) @click.option("-s", "--silent", is_flag=True)
@click.option("-v", "--verbose", is_flag=True) @click.option("-v", "--verbose", is_flag=True)
@click.option("--disable-auto-clean", is_flag=True) @click.option("--disable-auto-clean", is_flag=True)
@click.option("--list-targets", is_flag=True)
@click.pass_context @click.pass_context
def cli( def cli(
ctx, ctx,
@@ -80,11 +81,12 @@ def cli(
silent, silent,
verbose, verbose,
disable_auto_clean, disable_auto_clean,
list_targets,
): ):
app.set_session_var("custom_project_conf", project_conf) app.set_session_var("custom_project_conf", project_conf)
# find project directory on upper level # find project directory on upper level
if isfile(project_dir): if os.path.isfile(project_dir):
project_dir = find_project_dir_above(project_dir) project_dir = find_project_dir_above(project_dir)
is_test_running = CTX_META_TEST_IS_RUNNING in ctx.meta is_test_running = CTX_META_TEST_IS_RUNNING in ctx.meta
@@ -93,6 +95,9 @@ def cli(
config = ProjectConfig.get_instance(project_conf) config = ProjectConfig.get_instance(project_conf)
config.validate(environment) config.validate(environment)
if list_targets:
return print_target_list(list(environment) or config.envs())
# clean obsolete build dir # clean obsolete build dir
if not disable_auto_clean: if not disable_auto_clean:
build_dir = config.get_optional_dir("build") build_dir = config.get_optional_dir("build")
@@ -142,7 +147,7 @@ def cli(
command_failed = any(r.get("succeeded") is False for r in results) command_failed = any(r.get("succeeded") is False for r in results)
if not is_test_running and (command_failed or not silent) and len(results) > 1: if not is_test_running and (command_failed or not silent) and len(results) > 1:
print_processing_summary(results) print_processing_summary(results, verbose)
if command_failed: if command_failed:
raise exception.ReturnErrorCode(1) raise exception.ReturnErrorCode(1)
@@ -215,7 +220,7 @@ def print_processing_footer(result):
) )
def print_processing_summary(results): def print_processing_summary(results, verbose=False):
tabular_data = [] tabular_data = []
succeeded_nums = 0 succeeded_nums = 0
failed_nums = 0 failed_nums = 0
@@ -227,6 +232,8 @@ def print_processing_summary(results):
failed_nums += 1 failed_nums += 1
status_str = click.style("FAILED", fg="red") status_str = click.style("FAILED", fg="red")
elif result.get("succeeded") is None: elif result.get("succeeded") is None:
if not verbose:
continue
status_str = "IGNORED" status_str = "IGNORED"
else: else:
succeeded_nums += 1 succeeded_nums += 1
@@ -261,3 +268,33 @@ def print_processing_summary(results):
is_error=failed_nums, is_error=failed_nums,
fg="red" if failed_nums else "green", fg="red" if failed_nums else "green",
) )
def print_target_list(envs):
tabular_data = []
for env, data in load_project_ide_data(os.getcwd(), envs).items():
tabular_data.extend(
sorted(
[
(
click.style(env, fg="cyan"),
t["group"],
click.style(t.get("name"), fg="yellow"),
t["title"],
t.get("description"),
)
for t in data.get("targets", [])
],
key=operator.itemgetter(1, 2),
)
)
tabular_data.append((None, None, None, None, None))
click.echo(
tabulate(
tabular_data,
headers=[
click.style(s, bold=True)
for s in ("Environment", "Group", "Name", "Title", "Description")
],
),
)
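The new --list-targets flag formats targets reported by load_project_ide_data(); a small self-contained sketch of the grouping and sorting applied above (the rows are illustrative only):

    import operator

    from tabulate import tabulate

    rows = [
        ("uno", "Platform", "upload", "Upload", None),
        ("uno", "General", "clean", "Clean", None),
        ("uno", "Platform", "size", "Program Size", "Calculate program size"),
    ]
    rows = sorted(rows, key=operator.itemgetter(1, 2))  # by Group, then Name
    print(tabulate(rows, headers=["Environment", "Group", "Name", "Title", "Description"]))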


@@ -12,10 +12,10 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
from platformio import exception
from platformio.commands.platform import platform_install as cmd_platform_install from platformio.commands.platform import platform_install as cmd_platform_install
from platformio.commands.test.processor import CTX_META_TEST_RUNNING_NAME from platformio.commands.test.processor import CTX_META_TEST_RUNNING_NAME
from platformio.managers.platform import PlatformFactory from platformio.platform.exception import UnknownPlatform
from platformio.platform.factory import PlatformFactory
from platformio.project.exception import UndefinedEnvPlatformError from platformio.project.exception import UndefinedEnvPlatformError
# pylint: disable=too-many-instance-attributes # pylint: disable=too-many-instance-attributes
@@ -67,14 +67,14 @@ class EnvironmentProcessor(object):
build_targets.remove("monitor") build_targets.remove("monitor")
try: try:
p = PlatformFactory.newPlatform(self.options["platform"]) p = PlatformFactory.new(self.options["platform"])
except exception.UnknownPlatform: except UnknownPlatform:
self.cmd_ctx.invoke( self.cmd_ctx.invoke(
cmd_platform_install, cmd_platform_install,
platforms=[self.options["platform"]], platforms=[self.options["platform"]],
skip_default_package=True, skip_default_package=True,
) )
p = PlatformFactory.newPlatform(self.options["platform"]) p = PlatformFactory.new(self.options["platform"])
result = p.run(build_vars, build_targets, self.silent, self.verbose, self.jobs) result = p.run(build_vars, build_targets, self.silent, self.verbose, self.jobs)
return result["returncode"] == 0 return result["returncode"] == 0


@@ -27,7 +27,7 @@ def format_value(raw):
return str(raw) return str(raw)
@click.group(short_help="Manage PlatformIO settings") @click.group(short_help="Manage system settings")
def cli(): def cli():
pass pass


@@ -12,17 +12,26 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
import json
import os
import platform
import subprocess import subprocess
import sys
import click import click
from tabulate import tabulate
from platformio import proc from platformio import __version__, compat, fs, proc, util
from platformio.commands.system.completion import ( from platformio.commands.system.completion import (
get_completion_install_path, get_completion_install_path,
install_completion_code, install_completion_code,
uninstall_completion_code, uninstall_completion_code,
) )
from platformio.package.manager.library import LibraryPackageManager
from platformio.package.manager.platform import PlatformPackageManager
from platformio.package.manager.tool import ToolPackageManager
from platformio.project.config import ProjectConfig
from platformio.project.helpers import get_project_cache_dir
@click.group("system", short_help="Miscellaneous system commands") @click.group("system", short_help="Miscellaneous system commands")
@@ -30,6 +39,85 @@ def cli():
pass pass
@cli.command("info", short_help="Display system-wide information")
@click.option("--json-output", is_flag=True)
def system_info(json_output):
project_config = ProjectConfig()
data = {}
data["core_version"] = {"title": "PlatformIO Core", "value": __version__}
data["python_version"] = {
"title": "Python",
"value": "{0}.{1}.{2}-{3}.{4}".format(*list(sys.version_info)),
}
data["system"] = {"title": "System Type", "value": util.get_systype()}
data["platform"] = {"title": "Platform", "value": platform.platform(terse=True)}
data["filesystem_encoding"] = {
"title": "File System Encoding",
"value": compat.get_filesystem_encoding(),
}
data["locale_encoding"] = {
"title": "Locale Encoding",
"value": compat.get_locale_encoding(),
}
data["core_dir"] = {
"title": "PlatformIO Core Directory",
"value": project_config.get_optional_dir("core"),
}
data["platformio_exe"] = {
"title": "PlatformIO Core Executable",
"value": proc.where_is_program(
"platformio.exe" if proc.WINDOWS else "platformio"
),
}
data["python_exe"] = {
"title": "Python Executable",
"value": proc.get_pythonexe_path(),
}
data["global_lib_nums"] = {
"title": "Global Libraries",
"value": len(LibraryPackageManager().get_installed()),
}
data["dev_platform_nums"] = {
"title": "Development Platforms",
"value": len(PlatformPackageManager().get_installed()),
}
data["package_tool_nums"] = {
"title": "Tools & Toolchains",
"value": len(
ToolPackageManager(
project_config.get_optional_dir("packages")
).get_installed()
),
}
click.echo(
json.dumps(data)
if json_output
else tabulate([(item["title"], item["value"]) for item in data.values()])
)
@cli.command("prune", short_help="Remove unused data")
@click.option("--force", "-f", is_flag=True, help="Do not prompt for confirmation")
def system_prune(force):
click.secho("WARNING! This will remove:", fg="yellow")
click.echo(" - cached API requests")
click.echo(" - cached package downloads")
click.echo(" - temporary data")
if not force:
click.confirm("Do you want to continue?", abort=True)
reclaimed_total = 0
cache_dir = get_project_cache_dir()
if os.path.isdir(cache_dir):
reclaimed_total += fs.calculate_folder_size(cache_dir)
fs.rmtree(cache_dir)
click.secho(
"Total reclaimed space: %s" % fs.humanize_file_size(reclaimed_total), fg="green"
)
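The prune command reclaims the project cache directory; a hedged sketch of the same space accounting, reusing the fs helpers introduced later in this commit:

    import os

    from platformio import fs
    from platformio.project.helpers import get_project_cache_dir

    cache_dir = get_project_cache_dir()
    reclaimable = fs.calculate_folder_size(cache_dir) if os.path.isdir(cache_dir) else 0
    print("Total reclaimed space: %s" % fs.humanize_file_size(reclaimable))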
@cli.group("completion", short_help="Shell completion support") @cli.group("completion", short_help="Shell completion support")
def completion(): def completion():
# pylint: disable=import-error,import-outside-toplevel # pylint: disable=import-error,import-outside-toplevel

platformio/commands/team.py (new file, 203 lines added)

@@ -0,0 +1,203 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# pylint: disable=unused-argument
import json
import re
import click
from tabulate import tabulate
from platformio.clients.account import AccountClient
def validate_orgname_teamname(value, teamname_validate=False):
if ":" not in value:
raise click.BadParameter(
"Please specify organization and team name in the next"
" format - orgname:teamname. For example, mycompany:DreamTeam"
)
teamname = str(value.strip().split(":", 1)[1])
if teamname_validate:
validate_teamname(teamname)
return value
def validate_teamname(value):
if not value:
return value
value = str(value).strip()
if not re.match(r"^[a-z\d](?:[a-z\d]|[\-_ ](?=[a-z\d])){0,19}$", value, flags=re.I):
raise click.BadParameter(
"Invalid team name format. "
"Team name must only contain alphanumeric characters, "
"single hyphens, underscores, spaces. It can not "
"begin or end with a hyphen or a underscore and must"
" not be longer than 20 characters."
)
return value
@click.group("team", short_help="Manage organization teams")
def cli():
pass
@cli.command("create", short_help="Create a new team")
@click.argument(
"orgname_teamname",
metavar="ORGNAME:TEAMNAME",
callback=lambda _, __, value: validate_orgname_teamname(
value, teamname_validate=True
),
)
@click.option("--description",)
def team_create(orgname_teamname, description):
orgname, teamname = orgname_teamname.split(":", 1)
client = AccountClient()
client.create_team(orgname, teamname, description)
return click.secho(
"The team %s has been successfully created." % teamname, fg="green",
)
@cli.command("list", short_help="List teams")
@click.argument("orgname", required=False)
@click.option("--json-output", is_flag=True)
def team_list(orgname, json_output):
client = AccountClient()
data = {}
if not orgname:
for item in client.list_orgs():
teams = client.list_teams(item.get("orgname"))
data[item.get("orgname")] = teams
else:
teams = client.list_teams(orgname)
data[orgname] = teams
if json_output:
return click.echo(json.dumps(data[orgname] if orgname else data))
if not any(data.values()):
return click.secho("You do not have any teams.", fg="yellow")
for org_name in data:
for team in data[org_name]:
click.echo()
click.secho("%s:%s" % (org_name, team.get("name")), fg="cyan")
click.echo("-" * len("%s:%s" % (org_name, team.get("name"))))
table_data = []
if team.get("description"):
table_data.append(("Description:", team.get("description")))
table_data.append(
(
"Members:",
", ".join(
(member.get("username") for member in team.get("members"))
)
if team.get("members")
else "-",
)
)
click.echo(tabulate(table_data, tablefmt="plain"))
return click.echo()
@cli.command("update", short_help="Update team")
@click.argument(
"orgname_teamname",
metavar="ORGNAME:TEAMNAME",
callback=lambda _, __, value: validate_orgname_teamname(value),
)
@click.option(
"--name",
callback=lambda _, __, value: validate_teamname(value),
help="A new team name",
)
@click.option("--description",)
def team_update(orgname_teamname, **kwargs):
orgname, teamname = orgname_teamname.split(":", 1)
client = AccountClient()
team = client.get_team(orgname, teamname)
del team["id"]
del team["members"]
new_team = team.copy()
if not any(kwargs.values()):
for field in team:
new_team[field] = click.prompt(
field.replace("_", " ").capitalize(), default=team[field]
)
if field == "name":
validate_teamname(new_team[field])
else:
new_team.update({key: value for key, value in kwargs.items() if value})
client.update_team(orgname, teamname, new_team)
return click.secho(
"The team %s has been successfully updated." % teamname, fg="green",
)
@cli.command("destroy", short_help="Destroy a team")
@click.argument(
"orgname_teamname",
metavar="ORGNAME:TEAMNAME",
callback=lambda _, __, value: validate_orgname_teamname(value),
)
def team_destroy(orgname_teamname):
orgname, teamname = orgname_teamname.split(":", 1)
click.confirm(
click.style(
"Are you sure you want to destroy the %s team?" % teamname, fg="yellow"
),
abort=True,
)
client = AccountClient()
client.destroy_team(orgname, teamname)
return click.secho(
"The team %s has been successfully destroyed." % teamname, fg="green",
)
@cli.command("add", short_help="Add a new member to team")
@click.argument(
"orgname_teamname",
metavar="ORGNAME:TEAMNAME",
callback=lambda _, __, value: validate_orgname_teamname(value),
)
@click.argument("username",)
def team_add_member(orgname_teamname, username):
orgname, teamname = orgname_teamname.split(":", 1)
client = AccountClient()
client.add_team_member(orgname, teamname, username)
return click.secho(
"The new member %s has been successfully added to the %s team."
% (username, teamname),
fg="green",
)
@cli.command("remove", short_help="Remove a member from team")
@click.argument(
"orgname_teamname",
metavar="ORGNAME:TEAMNAME",
callback=lambda _, __, value: validate_orgname_teamname(value),
)
@click.argument("username")
def team_remove_owner(orgname_teamname, username):
orgname, teamname = orgname_teamname.split(":", 1)
client = AccountClient()
client.remove_team_member(orgname, teamname, username)
return click.secho(
"The %s member has been successfully removed from the %s team."
% (username, teamname),
fg="green",
)
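The team-name rule enforced by validate_teamname() above, checked against a few sample inputs (case-insensitive, at most 20 characters, no leading or trailing hyphen/underscore):

    import re

    PATTERN = r"^[a-z\d](?:[a-z\d]|[\-_ ](?=[a-z\d])){0,19}$"
    for name in ("DreamTeam", "dev-ops", "-core", "x" * 21):
        print(name, bool(re.match(PATTERN, name, flags=re.I)))
    # DreamTeam True, dev-ops True, -core False, xxx...x False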


@@ -28,7 +28,7 @@ from platformio.commands.test.native import NativeTestProcessor
from platformio.project.config import ProjectConfig from platformio.project.config import ProjectConfig
@click.command("test", short_help="Unit Testing") @click.command("test", short_help="Unit testing")
@click.option("--environment", "-e", multiple=True, metavar="<environment>") @click.option("--environment", "-e", multiple=True, metavar="<environment>")
@click.option( @click.option(
"--filter", "--filter",


@@ -19,7 +19,7 @@ import serial
from platformio import exception, util from platformio import exception, util
from platformio.commands.test.processor import TestProcessorBase from platformio.commands.test.processor import TestProcessorBase
from platformio.managers.platform import PlatformFactory from platformio.platform.factory import PlatformFactory
class EmbeddedTestProcessor(TestProcessorBase): class EmbeddedTestProcessor(TestProcessorBase):
@@ -108,7 +108,7 @@ class EmbeddedTestProcessor(TestProcessorBase):
return self.env_options.get("test_port") return self.env_options.get("test_port")
assert set(["platform", "board"]) & set(self.env_options.keys()) assert set(["platform", "board"]) & set(self.env_options.keys())
p = PlatformFactory.newPlatform(self.env_options["platform"]) p = PlatformFactory.new(self.env_options["platform"])
board_hwids = p.board_config(self.env_options["board"]).get("build.hwids", []) board_hwids = p.board_config(self.env_options["board"]).get("build.hwids", [])
port = None port = None
elapsed = 0 elapsed = 0


@@ -13,7 +13,7 @@
# limitations under the License. # limitations under the License.
import atexit import atexit
from os import remove from os import listdir, remove
from os.path import isdir, isfile, join from os.path import isdir, isfile, join
from string import Template from string import Template
@@ -25,33 +25,39 @@ TRANSPORT_OPTIONS = {
"arduino": { "arduino": {
"include": "#include <Arduino.h>", "include": "#include <Arduino.h>",
"object": "", "object": "",
"putchar": "Serial.write(c)", "putchar": "Serial.write(c);",
"flush": "Serial.flush()", "flush": "Serial.flush();",
"begin": "Serial.begin($baudrate)", "begin": "Serial.begin($baudrate);",
"end": "Serial.end()", "end": "Serial.end();",
"language": "cpp", "language": "cpp",
}, },
"mbed": { "mbed": {
"include": "#include <mbed.h>", "include": "#include <mbed.h>",
"object": "Serial pc(USBTX, USBRX);", "object": (
"putchar": "pc.putc(c)", "#if MBED_MAJOR_VERSION == 6\nUnbufferedSerial pc(USBTX, USBRX);\n"
"#else\nRawSerial pc(USBTX, USBRX);\n#endif"
),
"putchar": (
"#if MBED_MAJOR_VERSION == 6\npc.write(&c, 1);\n"
"#else\npc.putc(c);\n#endif"
),
"flush": "", "flush": "",
"begin": "pc.baud($baudrate)", "begin": "pc.baud($baudrate);",
"end": "", "end": "",
"language": "cpp", "language": "cpp",
}, },
"espidf": { "espidf": {
"include": "#include <stdio.h>", "include": "#include <stdio.h>",
"object": "", "object": "",
"putchar": "putchar(c)", "putchar": "putchar(c);",
"flush": "fflush(stdout)", "flush": "fflush(stdout);",
"begin": "", "begin": "",
"end": "", "end": "",
}, },
"zephyr": { "zephyr": {
"include": "#include <sys/printk.h>", "include": "#include <sys/printk.h>",
"object": "", "object": "",
"putchar": 'printk("%c", c)', "putchar": 'printk("%c", c);',
"flush": "", "flush": "",
"begin": "", "begin": "",
"end": "", "end": "",
@@ -59,18 +65,18 @@ TRANSPORT_OPTIONS = {
"native": { "native": {
"include": "#include <stdio.h>", "include": "#include <stdio.h>",
"object": "", "object": "",
"putchar": "putchar(c)", "putchar": "putchar(c);",
"flush": "fflush(stdout)", "flush": "fflush(stdout);",
"begin": "", "begin": "",
"end": "", "end": "",
}, },
"custom": { "custom": {
"include": '#include "unittest_transport.h"', "include": '#include "unittest_transport.h"',
"object": "", "object": "",
"putchar": "unittest_uart_putchar(c)", "putchar": "unittest_uart_putchar(c);",
"flush": "unittest_uart_flush()", "flush": "unittest_uart_flush();",
"begin": "unittest_uart_begin()", "begin": "unittest_uart_begin();",
"end": "unittest_uart_end()", "end": "unittest_uart_end();",
"language": "cpp", "language": "cpp",
}, },
} }
@@ -132,6 +138,7 @@ class TestProcessorBase(object):
return self.cmd_ctx.invoke( return self.cmd_ctx.invoke(
cmd_run, cmd_run,
project_dir=self.options["project_dir"], project_dir=self.options["project_dir"],
project_conf=self.options["project_config"].path,
upload_port=self.options["upload_port"], upload_port=self.options["upload_port"],
verbose=self.options["verbose"], verbose=self.options["verbose"],
silent=self.options["silent"], silent=self.options["silent"],
@@ -174,44 +181,50 @@ class TestProcessorBase(object):
"void output_start(unsigned int baudrate)", "void output_start(unsigned int baudrate)",
"#endif", "#endif",
"{", "{",
" $begin;", " $begin",
"}", "}",
"", "",
"void output_char(int c)", "void output_char(int c)",
"{", "{",
" $putchar;", " $putchar",
"}", "}",
"", "",
"void output_flush(void)", "void output_flush(void)",
"{", "{",
" $flush;", " $flush",
"}", "}",
"", "",
"void output_complete(void)", "void output_complete(void)",
"{", "{",
" $end;", " $end",
"}", "}",
] ]
) )
def delete_tmptest_file(file_): tmp_file_prefix = "tmp_pio_test_transport"
def delete_tmptest_files(test_dir):
for item in listdir(test_dir):
if item.startswith(tmp_file_prefix) and isfile(join(test_dir, item)):
try: try:
remove(file_) remove(join(test_dir, item))
except: # pylint: disable=bare-except except: # pylint: disable=bare-except
if isfile(file_):
click.secho( click.secho(
"Warning: Could not remove temporary file '%s'. " "Warning: Could not remove temporary file '%s'. "
"Please remove it manually." % file_, "Please remove it manually." % join(test_dir, item),
fg="yellow", fg="yellow",
) )
transport_options = TRANSPORT_OPTIONS[self.get_transport()] transport_options = TRANSPORT_OPTIONS[self.get_transport()]
tpl = Template(file_tpl).substitute(transport_options) tpl = Template(file_tpl).substitute(transport_options)
data = Template(tpl).substitute(baudrate=self.get_baudrate()) data = Template(tpl).substitute(baudrate=self.get_baudrate())
delete_tmptest_files(test_dir)
tmp_file = join( tmp_file = join(
test_dir, "output_export." + transport_options.get("language", "c") test_dir,
"%s.%s" % (tmp_file_prefix, transport_options.get("language", "c")),
) )
with open(tmp_file, "w") as fp: with open(tmp_file, "w") as fp:
fp.write(data) fp.write(data)
atexit.register(delete_tmptest_file, tmp_file) atexit.register(delete_tmptest_files, test_dir)
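The transport snippets above now include their own trailing semicolons, and the generated shim resolves them through two string.Template passes; a minimal illustration:

    from string import Template

    file_tpl = "void output_start(unsigned int baudrate)\n{\n    $begin\n}"
    transport = {"begin": "Serial.begin($baudrate);"}  # excerpt from TRANSPORT_OPTIONS["arduino"]

    tpl = Template(file_tpl).substitute(transport)    # inserts the snippet verbatim
    print(Template(tpl).substitute(baudrate=115200))  # resolves $baudrate in a second pass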


@@ -14,12 +14,12 @@
import click import click
from platformio import app from platformio.cache import cleanup_content_cache
from platformio.commands.lib import CTX_META_STORAGE_DIRS_KEY from platformio.commands.lib.command import CTX_META_STORAGE_DIRS_KEY
from platformio.commands.lib import lib_update as cmd_lib_update from platformio.commands.lib.command import lib_update as cmd_lib_update
from platformio.commands.platform import platform_update as cmd_platform_update from platformio.commands.platform import platform_update as cmd_platform_update
from platformio.managers.core import update_core_packages from platformio.package.manager.core import update_core_packages
from platformio.managers.lib import LibraryManager from platformio.package.manager.library import LibraryPackageManager
@click.command( @click.command(
@@ -38,7 +38,7 @@ from platformio.managers.lib import LibraryManager
@click.pass_context @click.pass_context
def cli(ctx, core_packages, only_check, dry_run): def cli(ctx, core_packages, only_check, dry_run):
# cleanup lib search results, cached board and platform lists # cleanup lib search results, cached board and platform lists
app.clean_cache() cleanup_content_cache("http")
only_check = dry_run or only_check only_check = dry_run or only_check
@@ -55,5 +55,5 @@ def cli(ctx, core_packages, only_check, dry_run):
click.echo() click.echo()
click.echo("Library Manager") click.echo("Library Manager")
click.echo("===============") click.echo("===============")
ctx.meta[CTX_META_STORAGE_DIRS_KEY] = [LibraryManager().package_dir] ctx.meta[CTX_META_STORAGE_DIRS_KEY] = [LibraryPackageManager().package_dir]
ctx.invoke(cmd_lib_update, only_check=only_check) ctx.invoke(cmd_lib_update, only_check=only_check)


@@ -12,14 +12,15 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
import json
import os import os
import re import re
from zipfile import ZipFile from zipfile import ZipFile
import click import click
import requests
from platformio import VERSION, __version__, app, exception from platformio import VERSION, __version__, app, exception
from platformio.clients.http import fetch_remote_content
from platformio.compat import WINDOWS from platformio.compat import WINDOWS
from platformio.proc import exec_command, get_pythonexe_path from platformio.proc import exec_command, get_pythonexe_path
from platformio.project.helpers import get_project_cache_dir from platformio.project.helpers import get_project_cache_dir
@@ -130,13 +131,11 @@ def get_latest_version():
def get_develop_latest_version(): def get_develop_latest_version():
version = None version = None
r = requests.get( content = fetch_remote_content(
"https://raw.githubusercontent.com/platformio/platformio" "https://raw.githubusercontent.com/platformio/platformio"
"/develop/platformio/__init__.py", "/develop/platformio/__init__.py"
headers={"User-Agent": app.get_user_agent()},
) )
r.raise_for_status() for line in content.split("\n"):
for line in r.text.split("\n"):
line = line.strip() line = line.strip()
if not line.startswith("VERSION"): if not line.startswith("VERSION"):
continue continue
@@ -152,9 +151,5 @@ def get_develop_latest_version():
def get_pypi_latest_version(): def get_pypi_latest_version():
r = requests.get( content = fetch_remote_content("https://pypi.org/pypi/platformio/json")
"https://pypi.org/pypi/platformio/json", return json.loads(content)["info"]["version"]
headers={"User-Agent": app.get_user_agent()},
)
r.raise_for_status()
return r.json()["info"]["version"]
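fetch_remote_content() replaces the direct requests calls removed above. A rough stand-in, under the assumption that the helper simply attaches the PlatformIO User-Agent and raises on HTTP errors (the real implementation in platformio.clients.http may differ):

    import requests

    from platformio import app

    def fetch_remote_content_sketch(url, timeout=10):
        resp = requests.get(url, headers={"User-Agent": app.get_user_agent()}, timeout=timeout)
        resp.raise_for_status()
        return resp.text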


@@ -13,8 +13,9 @@
# limitations under the License. # limitations under the License.
# pylint: disable=unused-import, no-name-in-module, import-error, # pylint: disable=unused-import, no-name-in-module, import-error,
# pylint: disable=no-member, undefined-variable # pylint: disable=no-member, undefined-variable, unexpected-keyword-arg
import glob
import inspect import inspect
import json import json
import locale import locale
@@ -49,6 +50,14 @@ def get_object_members(obj, ignore_private=True):
} }
def ci_strings_are_equal(a, b):
if a == b:
return True
if not a or not b:
return False
return a.strip().lower() == b.strip().lower()
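The new case-insensitive comparison helper in action (imported from the compat module patched here):

    from platformio.compat import ci_strings_are_equal

    print(ci_strings_are_equal("Arduino", "arduino"))  # True
    print(ci_strings_are_equal(" ESP32 ", "esp32"))    # True, surrounding whitespace is stripped
    print(ci_strings_are_equal(None, "esp32"))         # False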
if PY2: if PY2:
import imp import imp
@@ -81,6 +90,9 @@ if PY2:
_magic_check = re.compile("([*?[])") _magic_check = re.compile("([*?[])")
_magic_check_bytes = re.compile(b"([*?[])") _magic_check_bytes = re.compile(b"([*?[])")
def glob_recursive(pathname):
return glob.glob(pathname)
def glob_escape(pathname): def glob_escape(pathname):
"""Escape all special characters.""" """Escape all special characters."""
# https://github.com/python/cpython/blob/master/Lib/glob.py#L161 # https://github.com/python/cpython/blob/master/Lib/glob.py#L161
@@ -122,6 +134,9 @@ else:
return obj return obj
return json.dumps(obj, ensure_ascii=False, sort_keys=True) return json.dumps(obj, ensure_ascii=False, sort_keys=True)
def glob_recursive(pathname):
return glob.glob(pathname, recursive=True)
def load_python_module(name, pathname): def load_python_module(name, pathname):
spec = importlib.util.spec_from_file_location(name, pathname) spec = importlib.util.spec_from_file_location(name, pathname)
module = importlib.util.module_from_spec(spec) module = importlib.util.module_from_spec(spec)
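glob_recursive() gives src_filter patterns such as +<src/**/*.cpp> real "**" recursion on Python 3, while degrading to a plain glob on Python 2; the pattern below is only an example:

    from platformio.compat import glob_recursive

    for path in glob_recursive("src/**/*.cpp"):  # example pattern, relative to the CWD
        print(path)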


@@ -30,10 +30,6 @@ class ReturnErrorCode(PlatformioException):
MESSAGE = "{0}" MESSAGE = "{0}"
class LockFileTimeoutError(PlatformioException):
pass
class MinitermException(PlatformioException): class MinitermException(PlatformioException):
pass pass
@@ -47,141 +43,6 @@ class AbortedByUser(UserSideException):
MESSAGE = "Aborted by user" MESSAGE = "Aborted by user"
#
# Development Platform
#
class UnknownPlatform(PlatformioException):
MESSAGE = "Unknown development platform '{0}'"
class IncompatiblePlatform(PlatformioException):
MESSAGE = "Development platform '{0}' is not compatible with PIO Core v{1}"
class PlatformNotInstalledYet(PlatformioException):
MESSAGE = (
"The platform '{0}' has not been installed yet. "
"Use `platformio platform install {0}` command"
)
class UnknownBoard(PlatformioException):
MESSAGE = "Unknown board ID '{0}'"
class InvalidBoardManifest(PlatformioException):
MESSAGE = "Invalid board JSON manifest '{0}'"
class UnknownFramework(PlatformioException):
MESSAGE = "Unknown framework '{0}'"
# Package Manager
class PlatformIOPackageException(PlatformioException):
pass
class UnknownPackage(UserSideException):
MESSAGE = "Detected unknown package '{0}'"
class MissingPackageManifest(PlatformIOPackageException):
MESSAGE = "Could not find one of '{0}' manifest files in the package"
class UndefinedPackageVersion(PlatformIOPackageException):
MESSAGE = (
"Could not find a version that satisfies the requirement '{0}'"
" for your system '{1}'"
)
class PackageInstallError(PlatformIOPackageException):
MESSAGE = (
"Could not install '{0}' with version requirements '{1}' "
"for your system '{2}'.\n\n"
"Please try this solution -> http://bit.ly/faq-package-manager"
)
class ExtractArchiveItemError(PlatformIOPackageException):
MESSAGE = (
"Could not extract `{0}` to `{1}`. Try to disable antivirus "
"tool or check this solution -> http://bit.ly/faq-package-manager"
)
class UnsupportedArchiveType(PlatformIOPackageException):
MESSAGE = "Can not unpack file '{0}'"
class FDUnrecognizedStatusCode(PlatformIOPackageException):
MESSAGE = "Got an unrecognized status code '{0}' when downloaded {1}"
class FDSizeMismatch(PlatformIOPackageException):
MESSAGE = (
"The size ({0:d} bytes) of downloaded file '{1}' "
"is not equal to remote size ({2:d} bytes)"
)
class FDSHASumMismatch(PlatformIOPackageException):
MESSAGE = (
"The 'sha1' sum '{0}' of downloaded file '{1}' is not equal to remote '{2}'"
)
#
# Library
#
class LibNotFound(PlatformioException):
MESSAGE = (
"Library `{0}` has not been found in PlatformIO Registry.\n"
"You can ignore this message, if `{0}` is a built-in library "
"(included in framework, SDK). E.g., SPI, Wire, etc."
)
class NotGlobalLibDir(UserSideException):
MESSAGE = (
"The `{0}` is not a PlatformIO project.\n\n"
"To manage libraries in global storage `{1}`,\n"
"please use `platformio lib --global {2}` or specify custom storage "
"`platformio lib --storage-dir /path/to/storage/ {2}`.\n"
"Check `platformio lib --help` for details."
)
class InvalidLibConfURL(UserSideException):
MESSAGE = "Invalid library config URL '{0}'"
# #
# UDEV Rules # UDEV Rules
# #
@@ -194,8 +55,8 @@ class InvalidUdevRules(PlatformioException):
class MissedUdevRules(InvalidUdevRules): class MissedUdevRules(InvalidUdevRules):
MESSAGE = ( MESSAGE = (
"Warning! Please install `99-platformio-udev.rules`. \nMode details: " "Warning! Please install `99-platformio-udev.rules`. \nMore details: "
"https://docs.platformio.org/en/latest/faq.html#platformio-udev-rules" "https://docs.platformio.org/page/faq.html#platformio-udev-rules"
) )
@@ -203,8 +64,8 @@ class OutdatedUdevRules(InvalidUdevRules):
MESSAGE = ( MESSAGE = (
"Warning! Your `{0}` are outdated. Please update or reinstall them." "Warning! Your `{0}` are outdated. Please update or reinstall them."
"\n Mode details: https://docs.platformio.org" "\nMore details: "
"/en/latest/faq.html#platformio-udev-rules" "https://docs.platformio.org/page/faq.html#platformio-udev-rules"
) )
@@ -223,25 +84,6 @@ class GetLatestVersionError(PlatformioException):
MESSAGE = "Can not retrieve the latest PlatformIO version" MESSAGE = "Can not retrieve the latest PlatformIO version"
class APIRequestError(PlatformioException):
MESSAGE = "[API] {0}"
class InternetIsOffline(UserSideException):
MESSAGE = (
"You are not connected to the Internet.\n"
"PlatformIO needs the Internet connection to"
" download dependent packages or to work with PIO Account."
)
class BuildScriptNotFound(PlatformioException):
MESSAGE = "Invalid path '{0}' to build script"
class InvalidSettingName(UserSideException): class InvalidSettingName(UserSideException):
MESSAGE = "Invalid setting with the name '{0}'" MESSAGE = "Invalid setting with the name '{0}'"


@@ -12,18 +12,19 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
import hashlib
import io
import json import json
import os import os
import re import re
import shutil import shutil
import stat import stat
import sys import sys
from glob import glob
import click import click
from platformio import exception from platformio import exception
from platformio.compat import WINDOWS, glob_escape from platformio.compat import WINDOWS, glob_escape, glob_recursive
class cd(object): class cd(object):
@@ -56,7 +57,7 @@ def load_json(file_path):
raise exception.InvalidJSONFile(file_path) raise exception.InvalidJSONFile(file_path)
def format_filesize(filesize): def humanize_file_size(filesize):
base = 1024 base = 1024
unit = 0 unit = 0
suffix = "B" suffix = "B"
@@ -73,6 +74,28 @@ def format_filesize(filesize):
return "%d%sB" % ((base * filesize / unit), suffix) return "%d%sB" % ((base * filesize / unit), suffix)
def calculate_file_hashsum(algorithm, path):
h = hashlib.new(algorithm)
with io.open(path, "rb", buffering=0) as fp:
while True:
chunk = fp.read(io.DEFAULT_BUFFER_SIZE)
if not chunk:
break
h.update(chunk)
return h.hexdigest()
def calculate_folder_size(path):
assert os.path.isdir(path)
result = 0
for root, __, files in os.walk(path):
for f in files:
file_path = os.path.join(root, f)
if not os.path.islink(file_path):
result += os.path.getsize(file_path)
return result
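calculate_file_hashsum() streams the file in io.DEFAULT_BUFFER_SIZE chunks, so it copes with large artifacts; usage with a hypothetical file path:

    from platformio import fs

    print(fs.calculate_file_hashsum("sha1", "firmware.bin"))    # hex digest; path is hypothetical
    print(fs.calculate_file_hashsum("sha256", "firmware.bin"))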
def ensure_udev_rules(): def ensure_udev_rules():
from platformio.util import get_systype # pylint: disable=import-outside-toplevel from platformio.util import get_systype # pylint: disable=import-outside-toplevel
@@ -135,7 +158,7 @@ def match_src_files(src_dir, src_filter=None, src_exts=None, followlinks=True):
src_filter = src_filter.replace("/", os.sep).replace("\\", os.sep) src_filter = src_filter.replace("/", os.sep).replace("\\", os.sep)
for (action, pattern) in re.findall(r"(\+|\-)<([^>]+)>", src_filter): for (action, pattern) in re.findall(r"(\+|\-)<([^>]+)>", src_filter):
items = set() items = set()
for item in glob(os.path.join(glob_escape(src_dir), pattern)): for item in glob_recursive(os.path.join(glob_escape(src_dir), pattern)):
if os.path.isdir(item): if os.path.isdir(item):
for root, _, files in os.walk(item, followlinks=followlinks): for root, _, files in os.walk(item, followlinks=followlinks):
for f in files: for f in files:
@@ -164,6 +187,10 @@ def expanduser(path):
return os.environ["USERPROFILE"] + path[1:] return os.environ["USERPROFILE"] + path[1:]
def change_filemtime(path, mtime):
os.utime(path, (mtime, mtime))
def rmtree(path): def rmtree(path):
def _onerror(func, path, __): def _onerror(func, path, __):
try: try:


@@ -15,47 +15,31 @@
import codecs import codecs
import os import os
import sys import sys
from os.path import basename, isdir, isfile, join, realpath, relpath
import bottle import bottle
from platformio import fs, util from platformio import fs, util
from platformio.proc import where_is_program from platformio.proc import where_is_program
from platformio.project.config import ProjectConfig
from platformio.project.helpers import load_project_ide_data from platformio.project.helpers import load_project_ide_data
class ProjectGenerator(object): class ProjectGenerator(object):
def __init__(self, project_dir, ide, boards): def __init__(self, config, env_name, ide):
self.config = ProjectConfig.get_instance(join(project_dir, "platformio.ini")) self.config = config
self.config.validate() self.project_dir = os.path.dirname(config.path)
self.project_dir = project_dir self.env_name = str(env_name)
self.ide = str(ide) self.ide = str(ide)
self.env_name = str(self.get_best_envname(boards))
@staticmethod @staticmethod
def get_supported_ides(): def get_supported_ides():
tpls_dir = join(fs.get_source_dir(), "ide", "tpls") tpls_dir = os.path.join(fs.get_source_dir(), "ide", "tpls")
return sorted([d for d in os.listdir(tpls_dir) if isdir(join(tpls_dir, d))]) return sorted(
[
def get_best_envname(self, boards=None): d
envname = None for d in os.listdir(tpls_dir)
default_envs = self.config.default_envs() if os.path.isdir(os.path.join(tpls_dir, d))
if default_envs: ]
envname = default_envs[0] )
if not boards:
return envname
for env in self.config.envs():
if not boards:
return env
if not envname:
envname = env
items = self.config.items(env=env, as_dict=True)
if "board" in items and items.get("board") in boards:
return env
return envname
@staticmethod @staticmethod
def filter_includes(includes_map, ignore_scopes=None, to_unix_path=True): def filter_includes(includes_map, ignore_scopes=None, to_unix_path=True):
@@ -75,12 +59,12 @@ class ProjectGenerator(object):
tpl_vars = { tpl_vars = {
"config": self.config, "config": self.config,
"systype": util.get_systype(), "systype": util.get_systype(),
"project_name": basename(self.project_dir), "project_name": os.path.basename(self.project_dir),
"project_dir": self.project_dir, "project_dir": self.project_dir,
"env_name": self.env_name, "env_name": self.env_name,
"user_home_dir": realpath(fs.expanduser("~")), "user_home_dir": os.path.realpath(fs.expanduser("~")),
"platformio_path": sys.argv[0] "platformio_path": sys.argv[0]
if isfile(sys.argv[0]) if os.path.isfile(sys.argv[0])
else where_is_program("platformio"), else where_is_program("platformio"),
"env_path": os.getenv("PATH"), "env_path": os.getenv("PATH"),
"env_pathsep": os.pathsep, "env_pathsep": os.pathsep,
@@ -97,7 +81,7 @@ class ProjectGenerator(object):
"src_files": self.get_src_files(), "src_files": self.get_src_files(),
"project_src_dir": self.config.get_optional_dir("src"), "project_src_dir": self.config.get_optional_dir("src"),
"project_lib_dir": self.config.get_optional_dir("lib"), "project_lib_dir": self.config.get_optional_dir("lib"),
"project_libdeps_dir": join( "project_libdeps_dir": os.path.join(
self.config.get_optional_dir("libdeps"), self.env_name self.config.get_optional_dir("libdeps"), self.env_name
), ),
} }
@@ -120,12 +104,12 @@ class ProjectGenerator(object):
with fs.cd(self.project_dir): with fs.cd(self.project_dir):
for root, _, files in os.walk(self.config.get_optional_dir("src")): for root, _, files in os.walk(self.config.get_optional_dir("src")):
for f in files: for f in files:
result.append(relpath(join(root, f))) result.append(os.path.relpath(os.path.join(root, f)))
return result return result
def get_tpls(self): def get_tpls(self):
tpls = [] tpls = []
tpls_dir = join(fs.get_source_dir(), "ide", "tpls", self.ide) tpls_dir = os.path.join(fs.get_source_dir(), "ide", "tpls", self.ide)
for root, _, files in os.walk(tpls_dir): for root, _, files in os.walk(tpls_dir):
for f in files: for f in files:
if not f.endswith(".tpl"): if not f.endswith(".tpl"):
@@ -133,7 +117,7 @@ class ProjectGenerator(object):
_relpath = root.replace(tpls_dir, "") _relpath = root.replace(tpls_dir, "")
if _relpath.startswith(os.sep): if _relpath.startswith(os.sep):
_relpath = _relpath[1:] _relpath = _relpath[1:]
tpls.append((_relpath, join(root, f))) tpls.append((_relpath, os.path.join(root, f)))
return tpls return tpls
def generate(self): def generate(self):
@@ -141,12 +125,12 @@ class ProjectGenerator(object):
for tpl_relpath, tpl_path in self.get_tpls(): for tpl_relpath, tpl_path in self.get_tpls():
dst_dir = self.project_dir dst_dir = self.project_dir
if tpl_relpath: if tpl_relpath:
dst_dir = join(self.project_dir, tpl_relpath) dst_dir = os.path.join(self.project_dir, tpl_relpath)
if not isdir(dst_dir): if not os.path.isdir(dst_dir):
os.makedirs(dst_dir) os.makedirs(dst_dir)
file_name = basename(tpl_path)[:-4] file_name = os.path.basename(tpl_path)[:-4]
contents = self._render_tpl(tpl_path, tpl_vars) contents = self._render_tpl(tpl_path, tpl_vars)
self._merge_contents(join(dst_dir, file_name), contents) self._merge_contents(os.path.join(dst_dir, file_name), contents)
@staticmethod @staticmethod
def _render_tpl(tpl_path, tpl_vars): def _render_tpl(tpl_path, tpl_vars):
@@ -155,7 +139,7 @@ class ProjectGenerator(object):
@staticmethod @staticmethod
def _merge_contents(dst_path, contents): def _merge_contents(dst_path, contents):
if basename(dst_path) == ".gitignore" and isfile(dst_path): if os.path.basename(dst_path) == ".gitignore" and os.path.isfile(dst_path):
return return
with codecs.open(dst_path, "w", encoding="utf8") as fp: with codecs.open(dst_path, "w", encoding="utf8") as fp:
fp.write(contents) fp.write(contents)

View File

@@ -1,2 +1,3 @@
.pio .pio
.clang_complete .clang_complete
.ccls

View File

@@ -1,3 +1,4 @@
.pio .pio
.clang_complete .clang_complete
.gcc-flags.json .gcc-flags.json
.ccls

View File

@@ -10,10 +10,6 @@
% return to_unix_path(text).replace('"', '\\"') % return to_unix_path(text).replace('"', '\\"')
% end % end
% %
% def _escape_required(flag):
% return " " in flag and systype == "windows"
% end
%
% def split_args(args_string): % def split_args(args_string):
% return click.parser.split_arg_string(to_unix_path(args_string)) % return click.parser.split_arg_string(to_unix_path(args_string))
% end % end
@@ -53,10 +49,7 @@
% def _find_forced_includes(flags, inc_paths): % def _find_forced_includes(flags, inc_paths):
% result = [] % result = []
% include_args = ("-include", "-imacros") % include_args = ("-include", "-imacros")
% for f in flags: % for f in filter_args(flags, include_args):
% if not f.startswith(include_args):
% continue
% end
% for arg in include_args: % for arg in include_args:
% inc = "" % inc = ""
% if f.startswith(arg) and f.split(arg)[1].strip(): % if f.startswith(arg) and f.split(arg)[1].strip():
@@ -66,6 +59,7 @@
% end % end
% if inc: % if inc:
% result.append(_find_abs_path(inc, inc_paths)) % result.append(_find_abs_path(inc, inc_paths))
% break
% end % end
% end % end
% end % end
@@ -134,8 +128,7 @@
"compilerPath": "{{ cc_path }}", "compilerPath": "{{ cc_path }}",
"compilerArgs": [ "compilerArgs": [
% for flag in [ % for flag in [
% '"%s"' % _escape(f) if _escape_required(f) else f % f for f in filter_args(cc_m_flags, ["-m", "-i", "@"], ["-include", "-imacros"])
% for f in filter_args(cc_m_flags, ["-m", "-i", "@"], ["-include", "-imacros"])
% ]: % ]:
"{{ flag }}", "{{ flag }}",
% end % end
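
The hunk above tightens the template's forced-include lookup: flags go through filter_args() once and the loop now breaks after the first resolved path. As a reference, a minimal standalone Python sketch of that logic; the function name and the example flags are illustrative, and the real template helpers (filter_args, _find_abs_path) are only approximated here.

    import os

    def find_forced_includes(flags, inc_paths):
        """Simplified stand-in for the template's _find_forced_includes helper."""
        include_args = ("-include", "-imacros")
        result = []
        for flag in (f for f in flags if f.startswith(include_args)):
            for arg in include_args:
                if not (flag.startswith(arg) and flag[len(arg):].strip()):
                    continue
                value = flag[len(arg):].strip()
                # resolve a relative header against the known include dirs
                for inc_dir in inc_paths:
                    candidate = os.path.join(inc_dir, value)
                    if os.path.isfile(candidate):
                        value = candidate
                        break
                result.append(value)
                break  # stop after the first match, like the "break" added above
        return result

    # e.g. find_forced_includes(["-include sdkconfig.h", "-Os"], ["include"])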

View File

@@ -19,15 +19,21 @@ from time import time
import click import click
import semantic_version import semantic_version
from platformio import __version__, app, exception, fs, telemetry, util from platformio import __version__, app, exception, fs, telemetry
from platformio.cache import cleanup_content_cache
from platformio.clients import http
from platformio.commands import PlatformioCLI from platformio.commands import PlatformioCLI
from platformio.commands.lib import CTX_META_STORAGE_DIRS_KEY from platformio.commands.lib.command import CTX_META_STORAGE_DIRS_KEY
from platformio.commands.lib import lib_update as cmd_lib_update from platformio.commands.lib.command import lib_update as cmd_lib_update
from platformio.commands.platform import platform_update as cmd_platform_update from platformio.commands.platform import platform_update as cmd_platform_update
from platformio.commands.upgrade import get_latest_version from platformio.commands.upgrade import get_latest_version
from platformio.managers.core import update_core_packages from platformio.package.manager.core import update_core_packages
from platformio.managers.lib import LibraryManager from platformio.package.manager.library import LibraryPackageManager
from platformio.managers.platform import PlatformFactory, PlatformManager from platformio.package.manager.platform import PlatformPackageManager
from platformio.package.manager.tool import ToolPackageManager
from platformio.package.meta import PackageSpec
from platformio.package.version import pepver_to_semver
from platformio.platform.factory import PlatformFactory
from platformio.proc import is_container from platformio.proc import is_container
@@ -50,9 +56,9 @@ def on_platformio_end(ctx, result): # pylint: disable=unused-argument
check_internal_updates(ctx, "platforms") check_internal_updates(ctx, "platforms")
check_internal_updates(ctx, "libraries") check_internal_updates(ctx, "libraries")
except ( except (
exception.InternetIsOffline, http.HTTPClientError,
http.InternetIsOffline,
exception.GetLatestVersionError, exception.GetLatestVersionError,
exception.APIRequestError,
): ):
click.secho( click.secho(
"Failed to check for PlatformIO upgrades. " "Failed to check for PlatformIO upgrades. "
@@ -66,10 +72,9 @@ def on_platformio_exception(e):
def set_caller(caller=None): def set_caller(caller=None):
caller = caller or getenv("PLATFORMIO_CALLER")
if not caller: if not caller:
if getenv("PLATFORMIO_CALLER"): if getenv("VSCODE_PID") or getenv("VSCODE_NLS_CONFIG"):
caller = getenv("PLATFORMIO_CALLER")
elif getenv("VSCODE_PID") or getenv("VSCODE_NLS_CONFIG"):
caller = "vscode" caller = "vscode"
elif is_container(): elif is_container():
if getenv("C9_UID"): if getenv("C9_UID"):
@@ -83,15 +88,12 @@ def set_caller(caller=None):
class Upgrader(object): class Upgrader(object):
def __init__(self, from_version, to_version): def __init__(self, from_version, to_version):
self.from_version = semantic_version.Version.coerce( self.from_version = pepver_to_semver(from_version)
util.pepver_to_semver(from_version) self.to_version = pepver_to_semver(to_version)
)
self.to_version = semantic_version.Version.coerce(
util.pepver_to_semver(to_version)
)
self._upgraders = [ self._upgraders = [
(semantic_version.Version("3.5.0-a.2"), self._update_dev_platforms) (semantic_version.Version("3.5.0-a.2"), self._update_dev_platforms),
(semantic_version.Version("4.4.0-a.8"), self._update_pkg_metadata),
] ]
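
The rewritten constructor leans on pepver_to_semver (imported from platformio.package.version earlier in this diff) instead of wrapping semantic_version.Version.coerce by hand. A rough illustration; the exact normalization of PEP 440 pre-releases is an assumption inferred from the "4.4.0-a.8" threshold listed above.

    from platformio.package.version import pepver_to_semver

    # "4.4.0a8" (PEP 440) is assumed to normalize to a semver pre-release
    # comparable to "4.4.0-a.8", so ordinary comparisons work:
    assert pepver_to_semver("4.4.0a8") < pepver_to_semver("5.0.0")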
def run(self, ctx): def run(self, ctx):
@@ -111,6 +113,22 @@ class Upgrader(object):
ctx.invoke(cmd_platform_update) ctx.invoke(cmd_platform_update)
return True return True
@staticmethod
def _update_pkg_metadata(_):
pm = ToolPackageManager()
for pkg in pm.get_installed():
if not pkg.metadata or pkg.metadata.spec.external or pkg.metadata.spec.id:
continue
result = pm.search_registry_packages(PackageSpec(name=pkg.metadata.name))
if len(result) != 1:
continue
result = result[0]
pkg.metadata.spec = PackageSpec(
id=result["id"], owner=result["owner"]["username"], name=result["name"],
)
pkg.dump_meta()
return True
def after_upgrade(ctx): def after_upgrade(ctx):
terminal_width, _ = click.get_terminal_size() terminal_width, _ = click.get_terminal_size()
@@ -120,9 +138,7 @@ def after_upgrade(ctx):
if last_version == "0.0.0": if last_version == "0.0.0":
app.set_state_item("last_version", __version__) app.set_state_item("last_version", __version__)
elif semantic_version.Version.coerce( elif pepver_to_semver(last_version) > pepver_to_semver(__version__):
util.pepver_to_semver(last_version)
) > semantic_version.Version.coerce(util.pepver_to_semver(__version__)):
click.secho("*" * terminal_width, fg="yellow") click.secho("*" * terminal_width, fg="yellow")
click.secho( click.secho(
"Obsolete PIO Core v%s is used (previous was %s)" "Obsolete PIO Core v%s is used (previous was %s)"
@@ -132,14 +148,17 @@ def after_upgrade(ctx):
click.secho("Please remove multiple PIO Cores from a system:", fg="yellow") click.secho("Please remove multiple PIO Cores from a system:", fg="yellow")
click.secho( click.secho(
"https://docs.platformio.org/page/faq.html" "https://docs.platformio.org/page/faq.html"
"#multiple-pio-cores-in-a-system", "#multiple-platformio-cores-in-a-system",
fg="cyan", fg="cyan",
) )
click.secho("*" * terminal_width, fg="yellow") click.secho("*" * terminal_width, fg="yellow")
return return
else: else:
click.secho("Please wait while upgrading PlatformIO...", fg="yellow") click.secho("Please wait while upgrading PlatformIO...", fg="yellow")
app.clean_cache() try:
cleanup_content_cache("http")
except: # pylint: disable=bare-except
pass
# Update PlatformIO's Core packages # Update PlatformIO's Core packages
update_core_packages(silent=True) update_core_packages(silent=True)
@@ -158,7 +177,6 @@ def after_upgrade(ctx):
) )
else: else:
raise exception.UpgradeError("Auto upgrading...") raise exception.UpgradeError("Auto upgrading...")
click.echo("")
# PlatformIO banner # PlatformIO banner
click.echo("*" * terminal_width) click.echo("*" * terminal_width)
@@ -200,15 +218,13 @@ def check_platformio_upgrade():
last_check["platformio_upgrade"] = int(time()) last_check["platformio_upgrade"] = int(time())
app.set_state_item("last_check", last_check) app.set_state_item("last_check", last_check)
util.internet_on(raise_exception=True) http.ensure_internet_on(raise_exception=True)
# Update PlatformIO's Core packages # Update PlatformIO's Core packages
update_core_packages(silent=True) update_core_packages(silent=True)
latest_version = get_latest_version() latest_version = get_latest_version()
if semantic_version.Version.coerce( if pepver_to_semver(latest_version) <= pepver_to_semver(__version__):
util.pepver_to_semver(latest_version)
) <= semantic_version.Version.coerce(util.pepver_to_semver(__version__)):
return return
terminal_width, _ = click.get_terminal_size() terminal_width, _ = click.get_terminal_size()
@@ -238,7 +254,7 @@ def check_platformio_upgrade():
click.echo("") click.echo("")
def check_internal_updates(ctx, what): def check_internal_updates(ctx, what): # pylint: disable=too-many-branches
last_check = app.get_state_item("last_check", {}) last_check = app.get_state_item("last_check", {})
interval = int(app.get_setting("check_%s_interval" % what)) * 3600 * 24 interval = int(app.get_setting("check_%s_interval" % what)) * 3600 * 24
if (time() - interval) < last_check.get(what + "_update", 0): if (time() - interval) < last_check.get(what + "_update", 0):
@@ -247,22 +263,19 @@ def check_internal_updates(ctx, what):
last_check[what + "_update"] = int(time()) last_check[what + "_update"] = int(time())
app.set_state_item("last_check", last_check) app.set_state_item("last_check", last_check)
util.internet_on(raise_exception=True) http.ensure_internet_on(raise_exception=True)
pm = PlatformManager() if what == "platforms" else LibraryManager()
outdated_items = [] outdated_items = []
for manifest in pm.get_installed(): pm = PlatformPackageManager() if what == "platforms" else LibraryPackageManager()
if manifest["name"] in outdated_items: for pkg in pm.get_installed():
if pkg.metadata.name in outdated_items:
continue continue
conds = [ conds = [
pm.outdated(manifest["__pkg_dir"]), pm.outdated(pkg).is_outdated(),
what == "platforms" what == "platforms" and PlatformFactory.new(pkg).are_outdated_packages(),
and PlatformFactory.newPlatform(
manifest["__pkg_dir"]
).are_outdated_packages(),
] ]
if any(conds): if any(conds):
outdated_items.append(manifest["name"]) outdated_items.append(pkg.metadata.name)
if not outdated_items: if not outdated_items:
return return
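
Condensed, the rewritten loop above amounts to the following pattern; every call here appears in this changeset, only the variable names are new.

    from platformio.package.manager.platform import PlatformPackageManager

    pm = PlatformPackageManager()
    stale = sorted(
        pkg.metadata.name
        for pkg in pm.get_installed()
        if pm.outdated(pkg).is_outdated()
    )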

View File

@@ -1,374 +0,0 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# pylint: disable=too-many-arguments, too-many-locals, too-many-branches
# pylint: disable=too-many-return-statements
import json
from glob import glob
from os.path import isdir, join
import click
import semantic_version
from platformio import app, exception, util
from platformio.compat import glob_escape
from platformio.managers.package import BasePkgManager
from platformio.managers.platform import PlatformFactory, PlatformManager
from platformio.package.exception import ManifestException
from platformio.package.manifest.parser import ManifestParserFactory
from platformio.project.config import ProjectConfig
class LibraryManager(BasePkgManager):
FILE_CACHE_VALID = "30d" # 1 month
def __init__(self, package_dir=None):
self.config = ProjectConfig.get_instance()
super(LibraryManager, self).__init__(
package_dir or self.config.get_optional_dir("globallib")
)
@property
def manifest_names(self):
return [".library.json", "library.json", "library.properties", "module.json"]
def get_manifest_path(self, pkg_dir):
path = BasePkgManager.get_manifest_path(self, pkg_dir)
if path:
return path
# if the library has no manifest, return the first source file
src_dir = join(glob_escape(pkg_dir))
if isdir(join(pkg_dir, "src")):
src_dir = join(src_dir, "src")
chs_files = glob(join(src_dir, "*.[chS]"))
if chs_files:
return chs_files[0]
cpp_files = glob(join(src_dir, "*.cpp"))
if cpp_files:
return cpp_files[0]
return None
def max_satisfying_repo_version(self, versions, requirements=None):
def _cmp_dates(datestr1, datestr2):
date1 = util.parse_date(datestr1)
date2 = util.parse_date(datestr2)
if date1 == date2:
return 0
return -1 if date1 < date2 else 1
semver_spec = None
try:
semver_spec = (
semantic_version.SimpleSpec(requirements) if requirements else None
)
except ValueError:
pass
item = {}
for v in versions:
semver_new = self.parse_semver_version(v["name"])
if semver_spec:
# pylint: disable=unsupported-membership-test
if not semver_new or semver_new not in semver_spec:
continue
if not item or self.parse_semver_version(item["name"]) < semver_new:
item = v
elif requirements:
if requirements == v["name"]:
return v
else:
if not item or _cmp_dates(item["released"], v["released"]) == -1:
item = v
return item
def get_latest_repo_version(self, name, requirements, silent=False):
item = self.max_satisfying_repo_version(
util.get_api_result(
"/lib/info/%d"
% self.search_lib_id(
{"name": name, "requirements": requirements}, silent=silent
),
cache_valid="1h",
)["versions"],
requirements,
)
return item["name"] if item else None
def _install_from_piorepo(self, name, requirements):
assert name.startswith("id="), name
version = self.get_latest_repo_version(name, requirements)
if not version:
raise exception.UndefinedPackageVersion(
requirements or "latest", util.get_systype()
)
dl_data = util.get_api_result(
"/lib/download/" + str(name[3:]), dict(version=version), cache_valid="30d"
)
assert dl_data
return self._install_from_url(
name,
dl_data["url"].replace("http://", "https://")
if app.get_setting("strict_ssl")
else dl_data["url"],
requirements,
)
def search_lib_id( # pylint: disable=too-many-branches
self, filters, silent=False, interactive=False
):
assert isinstance(filters, dict)
assert "name" in filters
# try to find ID within installed packages
lib_id = self._get_lib_id_from_installed(filters)
if lib_id:
return lib_id
# looking in PIO Library Registry
if not silent:
click.echo(
"Looking for %s library in registry"
% click.style(filters["name"], fg="cyan")
)
query = []
for key in filters:
if key not in ("name", "authors", "frameworks", "platforms"):
continue
values = filters[key]
if not isinstance(values, list):
values = [v.strip() for v in values.split(",") if v]
for value in values:
query.append(
'%s:"%s"' % (key[:-1] if key.endswith("s") else key, value)
)
lib_info = None
result = util.get_api_result(
"/v2/lib/search", dict(query=" ".join(query)), cache_valid="1h"
)
if result["total"] == 1:
lib_info = result["items"][0]
elif result["total"] > 1:
if silent and not interactive:
lib_info = result["items"][0]
else:
click.secho(
"Conflict: More than one library has been found "
"by request %s:" % json.dumps(filters),
fg="yellow",
err=True,
)
# pylint: disable=import-outside-toplevel
from platformio.commands.lib import print_lib_item
for item in result["items"]:
print_lib_item(item)
if not interactive:
click.secho(
"Automatically chose the first available library "
"(use `--interactive` option to make a choice)",
fg="yellow",
err=True,
)
lib_info = result["items"][0]
else:
deplib_id = click.prompt(
"Please choose library ID",
type=click.Choice([str(i["id"]) for i in result["items"]]),
)
for item in result["items"]:
if item["id"] == int(deplib_id):
lib_info = item
break
if not lib_info:
if list(filters) == ["name"]:
raise exception.LibNotFound(filters["name"])
raise exception.LibNotFound(str(filters))
if not silent:
click.echo(
"Found: %s"
% click.style(
"https://platformio.org/lib/show/{id}/{name}".format(**lib_info),
fg="blue",
)
)
return int(lib_info["id"])
def _get_lib_id_from_installed(self, filters):
if filters["name"].startswith("id="):
return int(filters["name"][3:])
package_dir = self.get_package_dir(
filters["name"], filters.get("requirements", filters.get("version"))
)
if not package_dir:
return None
manifest = self.load_manifest(package_dir)
if "id" not in manifest:
return None
for key in ("frameworks", "platforms"):
if key not in filters:
continue
if key not in manifest:
return None
if not util.items_in_list(
util.items_to_list(filters[key]), util.items_to_list(manifest[key])
):
return None
if "authors" in filters:
if "authors" not in manifest:
return None
manifest_authors = manifest["authors"]
if not isinstance(manifest_authors, list):
manifest_authors = [manifest_authors]
manifest_authors = [
a["name"]
for a in manifest_authors
if isinstance(a, dict) and "name" in a
]
filter_authors = filters["authors"]
if not isinstance(filter_authors, list):
filter_authors = [filter_authors]
if not set(filter_authors) <= set(manifest_authors):
return None
return int(manifest["id"])
def install( # pylint: disable=arguments-differ
self,
name,
requirements=None,
silent=False,
after_update=False,
interactive=False,
force=False,
):
_name, _requirements, _url = self.parse_pkg_uri(name, requirements)
if not _url:
name = "id=%d" % self.search_lib_id(
{"name": _name, "requirements": _requirements},
silent=silent,
interactive=interactive,
)
requirements = _requirements
pkg_dir = BasePkgManager.install(
self,
name,
requirements,
silent=silent,
after_update=after_update,
force=force,
)
if not pkg_dir:
return None
manifest = None
try:
manifest = ManifestParserFactory.new_from_dir(pkg_dir).as_dict()
except ManifestException:
pass
if not manifest or not manifest.get("dependencies"):
return pkg_dir
if not silent:
click.secho("Installing dependencies", fg="yellow")
builtin_lib_storages = None
for filters in manifest["dependencies"]:
assert "name" in filters
# avoid circular dependencies
if not self.INSTALL_HISTORY:
self.INSTALL_HISTORY = []
history_key = str(filters)
if history_key in self.INSTALL_HISTORY:
continue
self.INSTALL_HISTORY.append(history_key)
if any(s in filters.get("version", "") for s in ("\\", "/")):
self.install(
"{name}={version}".format(**filters),
silent=silent,
after_update=after_update,
interactive=interactive,
force=force,
)
else:
try:
lib_id = self.search_lib_id(filters, silent, interactive)
except exception.LibNotFound as e:
if builtin_lib_storages is None:
builtin_lib_storages = get_builtin_libs()
if not silent or is_builtin_lib(
builtin_lib_storages, filters["name"]
):
click.secho("Warning! %s" % e, fg="yellow")
continue
if filters.get("version"):
self.install(
lib_id,
filters.get("version"),
silent=silent,
after_update=after_update,
interactive=interactive,
force=force,
)
else:
self.install(
lib_id,
silent=silent,
after_update=after_update,
interactive=interactive,
force=force,
)
return pkg_dir
def get_builtin_libs(storage_names=None):
items = []
storage_names = storage_names or []
pm = PlatformManager()
for manifest in pm.get_installed():
p = PlatformFactory.newPlatform(manifest["__pkg_dir"])
for storage in p.get_lib_storages():
if storage_names and storage["name"] not in storage_names:
continue
lm = LibraryManager(storage["path"])
items.append(
{
"name": storage["name"],
"path": storage["path"],
"items": lm.get_installed(),
}
)
return items
def is_builtin_lib(storages, name):
for storage in storages or []:
if any(l.get("name") == name for l in storage["items"]):
return True
return False

View File

@@ -1,818 +0,0 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import hashlib
import json
import os
import re
import shutil
from os.path import basename, getsize, isdir, isfile, islink, join, realpath
from tempfile import mkdtemp
import click
import requests
import semantic_version
from platformio import __version__, app, exception, fs, util
from platformio.compat import hashlib_encode_data
from platformio.downloader import FileDownloader
from platformio.lockfile import LockFile
from platformio.package.exception import ManifestException
from platformio.package.manifest.parser import ManifestParserFactory
from platformio.unpacker import FileUnpacker
from platformio.vcsclient import VCSClientFactory
# pylint: disable=too-many-arguments, too-many-return-statements
class PackageRepoIterator(object):
def __init__(self, package, repositories):
assert isinstance(repositories, list)
self.package = package
self.repositories = iter(repositories)
def __iter__(self):
return self
def __next__(self):
return self.next() # pylint: disable=not-callable
@staticmethod
@util.memoized(expire="60s")
def load_manifest(url):
r = None
try:
r = requests.get(url, headers={"User-Agent": app.get_user_agent()})
r.raise_for_status()
return r.json()
except: # pylint: disable=bare-except
pass
finally:
if r:
r.close()
return None
def next(self):
repo = next(self.repositories)
manifest = repo if isinstance(repo, dict) else self.load_manifest(repo)
if manifest and self.package in manifest:
return manifest[self.package]
return next(self)
class PkgRepoMixin(object):
PIO_VERSION = semantic_version.Version(util.pepver_to_semver(__version__))
@staticmethod
def is_system_compatible(valid_systems):
if not valid_systems or "*" in valid_systems:
return True
if not isinstance(valid_systems, list):
valid_systems = list([valid_systems])
return util.get_systype() in valid_systems
def max_satisfying_repo_version(self, versions, requirements=None):
item = None
reqspec = None
try:
reqspec = (
semantic_version.SimpleSpec(requirements) if requirements else None
)
except ValueError:
pass
for v in versions:
if not self.is_system_compatible(v.get("system")):
continue
# if "platformio" in v.get("engines", {}):
# if PkgRepoMixin.PIO_VERSION not in requirements.SimpleSpec(
# v['engines']['platformio']):
# continue
specver = semantic_version.Version(v["version"])
if reqspec and specver not in reqspec:
continue
if not item or semantic_version.Version(item["version"]) < specver:
item = v
return item
def get_latest_repo_version( # pylint: disable=unused-argument
self, name, requirements, silent=False
):
version = None
for versions in PackageRepoIterator(name, self.repositories):
pkgdata = self.max_satisfying_repo_version(versions, requirements)
if not pkgdata:
continue
if (
not version
or semantic_version.compare(pkgdata["version"], version) == 1
):
version = pkgdata["version"]
return version
def get_all_repo_versions(self, name):
result = []
for versions in PackageRepoIterator(name, self.repositories):
result.extend([semantic_version.Version(v["version"]) for v in versions])
return [str(v) for v in sorted(set(result))]
class PkgInstallerMixin(object):
SRC_MANIFEST_NAME = ".piopkgmanager.json"
TMP_FOLDER_PREFIX = "_tmp_installing-"
FILE_CACHE_VALID = None # for example, 1 week = "7d"
FILE_CACHE_MAX_SIZE = 1024 * 1024 * 50 # 50 Mb
MEMORY_CACHE = {} # cache for package manifests and read dirs
def cache_get(self, key, default=None):
return self.MEMORY_CACHE.get(key, default)
def cache_set(self, key, value):
self.MEMORY_CACHE[key] = value
def cache_reset(self):
self.MEMORY_CACHE.clear()
def read_dirs(self, src_dir):
cache_key = "read_dirs-%s" % src_dir
result = self.cache_get(cache_key)
if result:
return result
result = [
join(src_dir, name)
for name in sorted(os.listdir(src_dir))
if isdir(join(src_dir, name))
]
self.cache_set(cache_key, result)
return result
def download(self, url, dest_dir, sha1=None):
cache_key_fname = app.ContentCache.key_from_args(url, "fname")
cache_key_data = app.ContentCache.key_from_args(url, "data")
if self.FILE_CACHE_VALID:
with app.ContentCache() as cc:
fname = str(cc.get(cache_key_fname))
cache_path = cc.get_cache_path(cache_key_data)
if fname and isfile(cache_path):
dst_path = join(dest_dir, fname)
shutil.copy(cache_path, dst_path)
click.echo("Using cache: %s" % cache_path)
return dst_path
with_progress = not app.is_disabled_progressbar()
try:
fd = FileDownloader(url, dest_dir)
fd.start(with_progress=with_progress)
except IOError as e:
raise_error = not with_progress
if with_progress:
try:
fd = FileDownloader(url, dest_dir)
fd.start(with_progress=False)
except IOError:
raise_error = True
if raise_error:
click.secho(
"Error: Please read http://bit.ly/package-manager-ioerror",
fg="red",
err=True,
)
raise e
if sha1:
fd.verify(sha1)
dst_path = fd.get_filepath()
if (
not self.FILE_CACHE_VALID
or getsize(dst_path) > PkgInstallerMixin.FILE_CACHE_MAX_SIZE
):
return dst_path
with app.ContentCache() as cc:
cc.set(cache_key_fname, basename(dst_path), self.FILE_CACHE_VALID)
cc.set(cache_key_data, "DUMMY", self.FILE_CACHE_VALID)
shutil.copy(dst_path, cc.get_cache_path(cache_key_data))
return dst_path
@staticmethod
def unpack(source_path, dest_dir):
with_progress = not app.is_disabled_progressbar()
try:
with FileUnpacker(source_path) as fu:
return fu.unpack(dest_dir, with_progress=with_progress)
except IOError as e:
if not with_progress:
raise e
with FileUnpacker(source_path) as fu:
return fu.unpack(dest_dir, with_progress=False)
@staticmethod
def parse_semver_version(value, raise_exception=False):
try:
try:
return semantic_version.Version(value)
except ValueError:
if "." not in str(value) and not str(value).isdigit():
raise ValueError("Invalid SemVer version %s" % value)
return semantic_version.Version.coerce(value)
except ValueError as e:
if raise_exception:
raise e
return None
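# Traced examples of this coercion, following the branches above:
#   parse_semver_version("1.2")  -> Version("1.2.0")   (coerced)
#   parse_semver_version("5")    -> Version("5.0.0")   (bare integers are allowed)
#   parse_semver_version("abc")  -> None                (or ValueError with raise_exception=True)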
@staticmethod
def parse_pkg_uri(text, requirements=None): # pylint: disable=too-many-branches
text = str(text)
name, url = None, None
# Parse requirements
req_conditions = [
"@" in text,
not requirements,
":" not in text or text.rfind("/") < text.rfind("@"),
]
if all(req_conditions):
text, requirements = text.rsplit("@", 1)
# Handle PIO Library Registry ID
if text.isdigit():
text = "id=" + text
# Parse custom name
elif "=" in text and not text.startswith("id="):
name, text = text.split("=", 1)
# Parse URL
# if valid URL with scheme vcs+protocol://
if "+" in text and text.find("+") < text.find("://"):
url = text
elif "/" in text or "\\" in text:
git_conditions = [
# Handle GitHub URL (https://github.com/user/package)
text.startswith("https://github.com/")
and not text.endswith((".zip", ".tar.gz")),
(text.split("#", 1)[0] if "#" in text else text).endswith(".git"),
]
hg_conditions = [
# Handle Developer Mbed URL
# (https://developer.mbed.org/users/user/code/package/)
# (https://os.mbed.com/users/user/code/package/)
text.startswith("https://developer.mbed.org"),
text.startswith("https://os.mbed.com"),
]
if any(git_conditions):
url = "git+" + text
elif any(hg_conditions):
url = "hg+" + text
elif "://" not in text and (isfile(text) or isdir(text)):
url = "file://" + text
elif "://" in text:
url = text
# Handle short version of GitHub URL
elif text.count("/") == 1:
url = "git+https://github.com/" + text
# Parse name from URL
if url and not name:
_url = url.split("#", 1)[0] if "#" in url else url
if _url.endswith(("\\", "/")):
_url = _url[:-1]
name = basename(_url)
if "." in name and not name.startswith("."):
name = name.rsplit(".", 1)[0]
return (name or text, requirements, url)
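# A few traced examples of what this legacy parser returned, derived from the
# branches above (useful when mapping old URIs onto the new PackageSpec):
#   parse_pkg_uri("ArduinoJson@^6.15.0")    -> ("ArduinoJson", "^6.15.0", None)
#   parse_pkg_uri("1089")                   -> ("id=1089", None, None)
#   parse_pkg_uri("bblanchon/ArduinoJson")  -> ("ArduinoJson", None,
#                                               "git+https://github.com/bblanchon/ArduinoJson")
#   parse_pkg_uri("foo=https://example.com/pkg.zip")
#                                           -> ("foo", None, "https://example.com/pkg.zip")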
@staticmethod
def get_install_dirname(manifest):
name = re.sub(r"[^\da-z\_\-\. ]", "_", manifest["name"], flags=re.I)
if "id" in manifest:
name += "_ID%d" % manifest["id"]
return str(name)
@classmethod
def get_src_manifest_path(cls, pkg_dir):
if not isdir(pkg_dir):
return None
for item in os.listdir(pkg_dir):
if not isdir(join(pkg_dir, item)):
continue
if isfile(join(pkg_dir, item, cls.SRC_MANIFEST_NAME)):
return join(pkg_dir, item, cls.SRC_MANIFEST_NAME)
return None
def get_manifest_path(self, pkg_dir):
if not isdir(pkg_dir):
return None
for name in self.manifest_names:
manifest_path = join(pkg_dir, name)
if isfile(manifest_path):
return manifest_path
return None
def manifest_exists(self, pkg_dir):
return self.get_manifest_path(pkg_dir) or self.get_src_manifest_path(pkg_dir)
def load_manifest(self, pkg_dir): # pylint: disable=too-many-branches
cache_key = "load_manifest-%s" % pkg_dir
result = self.cache_get(cache_key)
if result:
return result
manifest = {}
src_manifest = None
manifest_path = self.get_manifest_path(pkg_dir)
src_manifest_path = self.get_src_manifest_path(pkg_dir)
if src_manifest_path:
src_manifest = fs.load_json(src_manifest_path)
if not manifest_path and not src_manifest_path:
return None
try:
manifest = ManifestParserFactory.new_from_file(manifest_path).as_dict()
except ManifestException:
pass
if src_manifest:
if "version" in src_manifest:
manifest["version"] = src_manifest["version"]
manifest["__src_url"] = src_manifest["url"]
# handle a custom package name
autogen_name = self.parse_pkg_uri(manifest["__src_url"])[0]
if "name" not in manifest or autogen_name != src_manifest["name"]:
manifest["name"] = src_manifest["name"]
if "name" not in manifest:
manifest["name"] = basename(pkg_dir)
if "version" not in manifest:
manifest["version"] = "0.0.0"
manifest["__pkg_dir"] = realpath(pkg_dir)
self.cache_set(cache_key, manifest)
return manifest
def get_installed(self):
items = []
for pkg_dir in self.read_dirs(self.package_dir):
if self.TMP_FOLDER_PREFIX in pkg_dir:
continue
manifest = self.load_manifest(pkg_dir)
if not manifest:
continue
assert "name" in manifest
items.append(manifest)
return items
def get_package(self, name, requirements=None, url=None):
pkg_id = int(name[3:]) if name.startswith("id=") else 0
best = None
for manifest in self.get_installed():
if url:
if manifest.get("__src_url") != url:
continue
elif pkg_id and manifest.get("id") != pkg_id:
continue
elif not pkg_id and manifest["name"] != name:
continue
elif not PkgRepoMixin.is_system_compatible(manifest.get("system")):
continue
# strict version or VCS HASH
if requirements and requirements == manifest["version"]:
return manifest
try:
if requirements and not semantic_version.SimpleSpec(requirements).match(
self.parse_semver_version(manifest["version"], raise_exception=True)
):
continue
if not best or (
self.parse_semver_version(manifest["version"], raise_exception=True)
> self.parse_semver_version(best["version"], raise_exception=True)
):
best = manifest
except ValueError:
pass
return best
def get_package_dir(self, name, requirements=None, url=None):
manifest = self.get_package(name, requirements, url)
return (
manifest.get("__pkg_dir")
if manifest and isdir(manifest.get("__pkg_dir"))
else None
)
def get_package_by_dir(self, pkg_dir):
for manifest in self.get_installed():
if manifest["__pkg_dir"] == realpath(pkg_dir):
return manifest
return None
def find_pkg_root(self, src_dir):
if self.manifest_exists(src_dir):
return src_dir
for root, _, _ in os.walk(src_dir):
if self.manifest_exists(root):
return root
raise exception.MissingPackageManifest(", ".join(self.manifest_names))
def _install_from_piorepo(self, name, requirements):
pkg_dir = None
pkgdata = None
versions = None
last_exc = None
for versions in PackageRepoIterator(name, self.repositories):
pkgdata = self.max_satisfying_repo_version(versions, requirements)
if not pkgdata:
continue
try:
pkg_dir = self._install_from_url(
name, pkgdata["url"], requirements, pkgdata.get("sha1")
)
break
except Exception as e: # pylint: disable=broad-except
last_exc = e
click.secho("Warning! Package Mirror: %s" % e, fg="yellow")
click.secho("Looking for another mirror...", fg="yellow")
if versions is None:
util.internet_on(raise_exception=True)
raise exception.UnknownPackage(
name + (". Error -> %s" % last_exc if last_exc else "")
)
if not pkgdata:
raise exception.UndefinedPackageVersion(
requirements or "latest", util.get_systype()
)
return pkg_dir
def _install_from_url(self, name, url, requirements=None, sha1=None, track=False):
tmp_dir = mkdtemp("-package", self.TMP_FOLDER_PREFIX, self.package_dir)
src_manifest_dir = None
src_manifest = {"name": name, "url": url, "requirements": requirements}
try:
if url.startswith("file://"):
_url = url[7:]
if isfile(_url):
self.unpack(_url, tmp_dir)
else:
fs.rmtree(tmp_dir)
shutil.copytree(_url, tmp_dir, symlinks=True)
elif url.startswith(("http://", "https://")):
dlpath = self.download(url, tmp_dir, sha1)
assert isfile(dlpath)
self.unpack(dlpath, tmp_dir)
os.remove(dlpath)
else:
vcs = VCSClientFactory.newClient(tmp_dir, url)
assert vcs.export()
src_manifest_dir = vcs.storage_dir
src_manifest["version"] = vcs.get_current_revision()
_tmp_dir = tmp_dir
if not src_manifest_dir:
_tmp_dir = self.find_pkg_root(tmp_dir)
src_manifest_dir = join(_tmp_dir, ".pio")
# write source data to a special manifest
if track:
self._update_src_manifest(src_manifest, src_manifest_dir)
return self._install_from_tmp_dir(_tmp_dir, requirements)
finally:
if isdir(tmp_dir):
fs.rmtree(tmp_dir)
return None
def _update_src_manifest(self, data, src_dir):
if not isdir(src_dir):
os.makedirs(src_dir)
src_manifest_path = join(src_dir, self.SRC_MANIFEST_NAME)
_data = {}
if isfile(src_manifest_path):
_data = fs.load_json(src_manifest_path)
_data.update(data)
with open(src_manifest_path, "w") as fp:
json.dump(_data, fp)
def _install_from_tmp_dir( # pylint: disable=too-many-branches
self, tmp_dir, requirements=None
):
tmp_manifest = self.load_manifest(tmp_dir)
assert set(["name", "version"]) <= set(tmp_manifest)
pkg_dirname = self.get_install_dirname(tmp_manifest)
pkg_dir = join(self.package_dir, pkg_dirname)
cur_manifest = self.load_manifest(pkg_dir)
tmp_semver = self.parse_semver_version(tmp_manifest["version"])
cur_semver = None
if cur_manifest:
cur_semver = self.parse_semver_version(cur_manifest["version"])
# package should satisfy requirements
if requirements:
mismatch_error = "Package version %s doesn't satisfy requirements %s" % (
tmp_manifest["version"],
requirements,
)
try:
assert tmp_semver and tmp_semver in semantic_version.SimpleSpec(
requirements
), mismatch_error
except (AssertionError, ValueError):
assert tmp_manifest["version"] == requirements, mismatch_error
# check if package already exists
if cur_manifest:
# 0-overwrite, 1-rename, 2-fix to a version
action = 0
if "__src_url" in cur_manifest:
if cur_manifest["__src_url"] != tmp_manifest.get("__src_url"):
action = 1
elif "__src_url" in tmp_manifest:
action = 2
else:
if tmp_semver and (not cur_semver or tmp_semver > cur_semver):
action = 1
elif tmp_semver and cur_semver and tmp_semver != cur_semver:
action = 2
# rename
if action == 1:
target_dirname = "%s@%s" % (pkg_dirname, cur_manifest["version"])
if "__src_url" in cur_manifest:
target_dirname = "%s@src-%s" % (
pkg_dirname,
hashlib.md5(
hashlib_encode_data(cur_manifest["__src_url"])
).hexdigest(),
)
shutil.move(pkg_dir, join(self.package_dir, target_dirname))
# fix to a version
elif action == 2:
target_dirname = "%s@%s" % (pkg_dirname, tmp_manifest["version"])
if "__src_url" in tmp_manifest:
target_dirname = "%s@src-%s" % (
pkg_dirname,
hashlib.md5(
hashlib_encode_data(tmp_manifest["__src_url"])
).hexdigest(),
)
pkg_dir = join(self.package_dir, target_dirname)
# remove previous/not-satisfied package
if isdir(pkg_dir):
fs.rmtree(pkg_dir)
shutil.copytree(tmp_dir, pkg_dir, symlinks=True)
try:
shutil.rmtree(tmp_dir)
except: # pylint: disable=bare-except
pass
assert isdir(pkg_dir)
self.cache_reset()
return pkg_dir
class BasePkgManager(PkgRepoMixin, PkgInstallerMixin):
# Handle circular dependencies
INSTALL_HISTORY = None
def __init__(self, package_dir, repositories=None):
self.repositories = repositories
self.package_dir = package_dir
if not isdir(self.package_dir):
os.makedirs(self.package_dir)
assert isdir(self.package_dir)
@property
def manifest_names(self):
raise NotImplementedError()
def print_message(self, message, nl=True):
click.echo("%s: %s" % (self.__class__.__name__, message), nl=nl)
def outdated(self, pkg_dir, requirements=None):
"""
Has 3 different results:
`None` - unknown package, VCS is detached to commit
`False` - package is up-to-date
`String` - a found latest version
"""
if not isdir(pkg_dir):
return None
latest = None
manifest = self.load_manifest(pkg_dir)
# skip detached package to a specific version
if "@" in pkg_dir and "__src_url" not in manifest and not requirements:
return None
if "__src_url" in manifest:
try:
vcs = VCSClientFactory.newClient(
pkg_dir, manifest["__src_url"], silent=True
)
except (AttributeError, exception.PlatformioException):
return None
if not vcs.can_be_updated:
return None
latest = vcs.get_latest_revision()
else:
try:
latest = self.get_latest_repo_version(
"id=%d" % manifest["id"] if "id" in manifest else manifest["name"],
requirements,
silent=True,
)
except (exception.PlatformioException, ValueError):
return None
if not latest:
return None
up_to_date = False
try:
assert "__src_url" not in manifest
up_to_date = self.parse_semver_version(
manifest["version"], raise_exception=True
) >= self.parse_semver_version(latest, raise_exception=True)
except (AssertionError, ValueError):
up_to_date = latest == manifest["version"]
return False if up_to_date else latest
def install(
self, name, requirements=None, silent=False, after_update=False, force=False
): # pylint: disable=unused-argument
pkg_dir = None
# interprocess lock
with LockFile(self.package_dir):
self.cache_reset()
name, requirements, url = self.parse_pkg_uri(name, requirements)
package_dir = self.get_package_dir(name, requirements, url)
# avoid circular dependencies
if not self.INSTALL_HISTORY:
self.INSTALL_HISTORY = []
history_key = "%s-%s-%s" % (name, requirements or "", url or "")
if history_key in self.INSTALL_HISTORY:
return package_dir
self.INSTALL_HISTORY.append(history_key)
if package_dir and force:
self.uninstall(package_dir)
package_dir = None
if not package_dir or not silent:
msg = "Installing " + click.style(name, fg="cyan")
if requirements:
msg += " @ " + requirements
self.print_message(msg)
if package_dir:
if not silent:
click.secho(
"{name} @ {version} is already installed".format(
**self.load_manifest(package_dir)
),
fg="yellow",
)
return package_dir
if url:
pkg_dir = self._install_from_url(name, url, requirements, track=True)
else:
pkg_dir = self._install_from_piorepo(name, requirements)
if not pkg_dir or not self.manifest_exists(pkg_dir):
raise exception.PackageInstallError(
name, requirements or "*", util.get_systype()
)
manifest = self.load_manifest(pkg_dir)
assert manifest
click.secho(
"{name} @ {version} has been successfully installed!".format(
**manifest
),
fg="green",
)
return pkg_dir
def uninstall(
self, package, requirements=None, after_update=False
): # pylint: disable=unused-argument
# interprocess lock
with LockFile(self.package_dir):
self.cache_reset()
if isdir(package) and self.get_package_by_dir(package):
pkg_dir = package
else:
name, requirements, url = self.parse_pkg_uri(package, requirements)
pkg_dir = self.get_package_dir(name, requirements, url)
if not pkg_dir:
raise exception.UnknownPackage(
"%s @ %s" % (package, requirements or "*")
)
manifest = self.load_manifest(pkg_dir)
click.echo(
"Uninstalling %s @ %s: \t"
% (click.style(manifest["name"], fg="cyan"), manifest["version"]),
nl=False,
)
if islink(pkg_dir):
os.unlink(pkg_dir)
else:
fs.rmtree(pkg_dir)
self.cache_reset()
# unfix package with the same name
pkg_dir = self.get_package_dir(manifest["name"])
if pkg_dir and "@" in pkg_dir:
shutil.move(
pkg_dir, join(self.package_dir, self.get_install_dirname(manifest))
)
self.cache_reset()
click.echo("[%s]" % click.style("OK", fg="green"))
return True
def update(self, package, requirements=None, only_check=False):
self.cache_reset()
if isdir(package) and self.get_package_by_dir(package):
pkg_dir = package
else:
pkg_dir = self.get_package_dir(*self.parse_pkg_uri(package))
if not pkg_dir:
raise exception.UnknownPackage("%s @ %s" % (package, requirements or "*"))
manifest = self.load_manifest(pkg_dir)
name = manifest["name"]
click.echo(
"{} {:<40} @ {:<15}".format(
"Checking" if only_check else "Updating",
click.style(manifest["name"], fg="cyan"),
manifest["version"],
),
nl=False,
)
if not util.internet_on():
click.echo("[%s]" % (click.style("Off-line", fg="yellow")))
return None
latest = self.outdated(pkg_dir, requirements)
if latest:
click.echo("[%s]" % (click.style(latest, fg="red")))
elif latest is False:
click.echo("[%s]" % (click.style("Up-to-date", fg="green")))
else:
click.echo("[%s]" % (click.style("Detached", fg="yellow")))
if only_check or not latest:
return True
if "__src_url" in manifest:
vcs = VCSClientFactory.newClient(pkg_dir, manifest["__src_url"])
assert vcs.update()
self._update_src_manifest(
dict(version=vcs.get_current_revision()), vcs.storage_dir
)
else:
self.uninstall(pkg_dir, after_update=True)
self.install(name, latest, after_update=True)
return True
class PackageManager(BasePkgManager):
@property
def manifest_names(self):
return ["package.json"]

View File

@@ -12,921 +12,5 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
# pylint: disable=too-many-public-methods, too-many-instance-attributes # Backward compatibility with legacy dev-platforms
from platformio.platform.base import PlatformBase # pylint: disable=unused-import
import base64
import os
import re
import subprocess
import sys
from os.path import basename, dirname, isdir, isfile, join
import click
import semantic_version
from platformio import __version__, app, exception, fs, proc, telemetry, util
from platformio.commands.debug.exception import (
DebugInvalidOptionsError,
DebugSupportError,
)
from platformio.compat import PY2, hashlib_encode_data, is_bytes, load_python_module
from platformio.managers.core import get_core_package_dir
from platformio.managers.package import BasePkgManager, PackageManager
from platformio.project.config import ProjectConfig
try:
from urllib.parse import quote
except ImportError:
from urllib import quote
class PlatformManager(BasePkgManager):
def __init__(self, package_dir=None, repositories=None):
if not repositories:
repositories = [
"https://dl.bintray.com/platformio/dl-platforms/manifest.json",
"{0}://dl.platformio.org/platforms/manifest.json".format(
"https" if app.get_setting("strict_ssl") else "http"
),
]
self.config = ProjectConfig.get_instance()
BasePkgManager.__init__(
self, package_dir or self.config.get_optional_dir("platforms"), repositories
)
@property
def manifest_names(self):
return ["platform.json"]
def get_manifest_path(self, pkg_dir):
if not isdir(pkg_dir):
return None
for name in self.manifest_names:
manifest_path = join(pkg_dir, name)
if isfile(manifest_path):
return manifest_path
return None
def install(
self,
name,
requirements=None,
with_packages=None,
without_packages=None,
skip_default_package=False,
with_all_packages=False,
after_update=False,
silent=False,
force=False,
**_
): # pylint: disable=too-many-arguments, arguments-differ
platform_dir = BasePkgManager.install(
self, name, requirements, silent=silent, force=force
)
p = PlatformFactory.newPlatform(platform_dir)
if with_all_packages:
with_packages = list(p.packages.keys())
# don't cleanup packages or install them after update
# we check packages for updates in def update()
if after_update:
p.install_python_packages()
p.on_installed()
return True
p.install_packages(
with_packages,
without_packages,
skip_default_package,
silent=silent,
force=force,
)
p.install_python_packages()
p.on_installed()
return self.cleanup_packages(list(p.packages))
def uninstall(self, package, requirements=None, after_update=False):
if isdir(package):
pkg_dir = package
else:
name, requirements, url = self.parse_pkg_uri(package, requirements)
pkg_dir = self.get_package_dir(name, requirements, url)
if not pkg_dir:
raise exception.UnknownPlatform(package)
p = PlatformFactory.newPlatform(pkg_dir)
BasePkgManager.uninstall(self, pkg_dir, requirements)
p.uninstall_python_packages()
p.on_uninstalled()
# don't cleanup packages or install them after update
# we check packages for updates in def update()
if after_update:
return True
return self.cleanup_packages(list(p.packages))
def update( # pylint: disable=arguments-differ
self, package, requirements=None, only_check=False, only_packages=False
):
if isdir(package):
pkg_dir = package
else:
name, requirements, url = self.parse_pkg_uri(package, requirements)
pkg_dir = self.get_package_dir(name, requirements, url)
if not pkg_dir:
raise exception.UnknownPlatform(package)
p = PlatformFactory.newPlatform(pkg_dir)
pkgs_before = list(p.get_installed_packages())
missed_pkgs = set()
if not only_packages:
BasePkgManager.update(self, pkg_dir, requirements, only_check)
p = PlatformFactory.newPlatform(pkg_dir)
missed_pkgs = set(pkgs_before) & set(p.packages)
missed_pkgs -= set(p.get_installed_packages())
p.update_packages(only_check)
self.cleanup_packages(list(p.packages))
if missed_pkgs:
p.install_packages(
with_packages=list(missed_pkgs), skip_default_package=True
)
return True
def cleanup_packages(self, names):
self.cache_reset()
deppkgs = {}
for manifest in PlatformManager().get_installed():
p = PlatformFactory.newPlatform(manifest["__pkg_dir"])
for pkgname, pkgmanifest in p.get_installed_packages().items():
if pkgname not in deppkgs:
deppkgs[pkgname] = set()
deppkgs[pkgname].add(pkgmanifest["version"])
pm = PackageManager(self.config.get_optional_dir("packages"))
for manifest in pm.get_installed():
if manifest["name"] not in names:
continue
if (
manifest["name"] not in deppkgs
or manifest["version"] not in deppkgs[manifest["name"]]
):
try:
pm.uninstall(manifest["__pkg_dir"], after_update=True)
except exception.UnknownPackage:
pass
self.cache_reset()
return True
@util.memoized(expire="5s")
def get_installed_boards(self):
boards = []
for manifest in self.get_installed():
p = PlatformFactory.newPlatform(manifest["__pkg_dir"])
for config in p.get_boards().values():
board = config.get_brief_data()
if board not in boards:
boards.append(board)
return boards
@staticmethod
def get_registered_boards():
return util.get_api_result("/boards", cache_valid="7d")
def get_all_boards(self):
boards = self.get_installed_boards()
know_boards = ["%s:%s" % (b["platform"], b["id"]) for b in boards]
try:
for board in self.get_registered_boards():
key = "%s:%s" % (board["platform"], board["id"])
if key not in know_boards:
boards.append(board)
except (exception.APIRequestError, exception.InternetIsOffline):
pass
return sorted(boards, key=lambda b: b["name"])
def board_config(self, id_, platform=None):
for manifest in self.get_installed_boards():
if manifest["id"] == id_ and (
not platform or manifest["platform"] == platform
):
return manifest
for manifest in self.get_registered_boards():
if manifest["id"] == id_ and (
not platform or manifest["platform"] == platform
):
return manifest
raise exception.UnknownBoard(id_)
class PlatformFactory(object):
@staticmethod
def get_clsname(name):
name = re.sub(r"[^\da-z\_]+", "", name, flags=re.I)
return "%s%sPlatform" % (name.upper()[0], name.lower()[1:])
@staticmethod
def load_module(name, path):
try:
return load_python_module("platformio.managers.platform.%s" % name, path)
except ImportError:
raise exception.UnknownPlatform(name)
@classmethod
def newPlatform(cls, name, requirements=None):
pm = PlatformManager()
platform_dir = None
if isdir(name):
platform_dir = name
name = pm.load_manifest(platform_dir)["name"]
elif name.endswith("platform.json") and isfile(name):
platform_dir = dirname(name)
name = fs.load_json(name)["name"]
else:
name, requirements, url = pm.parse_pkg_uri(name, requirements)
platform_dir = pm.get_package_dir(name, requirements, url)
if platform_dir:
name = pm.load_manifest(platform_dir)["name"]
if not platform_dir:
raise exception.UnknownPlatform(
name if not requirements else "%s@%s" % (name, requirements)
)
platform_cls = None
if isfile(join(platform_dir, "platform.py")):
platform_cls = getattr(
cls.load_module(name, join(platform_dir, "platform.py")),
cls.get_clsname(name),
)
else:
platform_cls = type(str(cls.get_clsname(name)), (PlatformBase,), {})
_instance = platform_cls(join(platform_dir, "platform.json"))
assert isinstance(_instance, PlatformBase)
return _instance
class PlatformPackagesMixin(object):
def install_packages( # pylint: disable=too-many-arguments
self,
with_packages=None,
without_packages=None,
skip_default_package=False,
silent=False,
force=False,
):
with_packages = set(self.find_pkg_names(with_packages or []))
without_packages = set(self.find_pkg_names(without_packages or []))
upkgs = with_packages | without_packages
ppkgs = set(self.packages)
if not upkgs.issubset(ppkgs):
raise exception.UnknownPackage(", ".join(upkgs - ppkgs))
for name, opts in self.packages.items():
version = opts.get("version", "")
if name in without_packages:
continue
if name in with_packages or not (
skip_default_package or opts.get("optional", False)
):
if ":" in version:
self.pm.install(
"%s=%s" % (name, version), silent=silent, force=force
)
else:
self.pm.install(name, version, silent=silent, force=force)
return True
def find_pkg_names(self, candidates):
result = []
for candidate in candidates:
found = False
# lookup by package types
for _name, _opts in self.packages.items():
if _opts.get("type") == candidate:
result.append(_name)
found = True
if (
self.frameworks
and candidate.startswith("framework-")
and candidate[10:] in self.frameworks
):
result.append(self.frameworks[candidate[10:]]["package"])
found = True
if not found:
result.append(candidate)
return result
def update_packages(self, only_check=False):
for name, manifest in self.get_installed_packages().items():
requirements = self.packages[name].get("version", "")
if ":" in requirements:
_, requirements, __ = self.pm.parse_pkg_uri(requirements)
self.pm.update(manifest["__pkg_dir"], requirements, only_check)
def get_installed_packages(self):
items = {}
for name in self.packages:
pkg_dir = self.get_package_dir(name)
if pkg_dir:
items[name] = self.pm.load_manifest(pkg_dir)
return items
def are_outdated_packages(self):
for name, manifest in self.get_installed_packages().items():
requirements = self.packages[name].get("version", "")
if ":" in requirements:
_, requirements, __ = self.pm.parse_pkg_uri(requirements)
if self.pm.outdated(manifest["__pkg_dir"], requirements):
return True
return False
def get_package_dir(self, name):
version = self.packages[name].get("version", "")
if ":" in version:
return self.pm.get_package_dir(
*self.pm.parse_pkg_uri("%s=%s" % (name, version))
)
return self.pm.get_package_dir(name, version)
def get_package_version(self, name):
pkg_dir = self.get_package_dir(name)
if not pkg_dir:
return None
return self.pm.load_manifest(pkg_dir).get("version")
def dump_used_packages(self):
result = []
for name, options in self.packages.items():
if options.get("optional"):
continue
pkg_dir = self.get_package_dir(name)
if not pkg_dir:
continue
manifest = self.pm.load_manifest(pkg_dir)
item = {"name": manifest["name"], "version": manifest["version"]}
if manifest.get("__src_url"):
item["src_url"] = manifest.get("__src_url")
result.append(item)
return result
class PlatformRunMixin(object):
LINE_ERROR_RE = re.compile(r"(^|\s+)error:?\s+", re.I)
@staticmethod
def encode_scons_arg(value):
data = base64.urlsafe_b64encode(hashlib_encode_data(value))
return data.decode() if is_bytes(data) else data
@staticmethod
def decode_scons_arg(data):
value = base64.urlsafe_b64decode(data)
return value.decode() if is_bytes(value) else value
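# A quick round trip through the two helpers above; the encoded string is
# plain URL-safe base64 of the UTF-8 text, so the values can be checked
# independently:
#   PlatformRunMixin.encode_scons_arg("my value")      -> "bXkgdmFsdWU="
#   PlatformRunMixin.decode_scons_arg("bXkgdmFsdWU=")  -> "my value"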
def run( # pylint: disable=too-many-arguments
self, variables, targets, silent, verbose, jobs
):
assert isinstance(variables, dict)
assert isinstance(targets, list)
options = self.config.items(env=variables["pioenv"], as_dict=True)
if "framework" in options:
# support PIO Core 3.0 dev/platforms
options["pioframework"] = options["framework"]
self.configure_default_packages(options, targets)
self.install_packages(silent=True)
self._report_non_sensitive_data(options, targets)
self.silent = silent
self.verbose = verbose or app.get_setting("force_verbose")
if "clean" in targets:
targets = ["-c", "."]
variables["platform_manifest"] = self.manifest_path
if "build_script" not in variables:
variables["build_script"] = self.get_build_script()
if not isfile(variables["build_script"]):
raise exception.BuildScriptNotFound(variables["build_script"])
result = self._run_scons(variables, targets, jobs)
assert "returncode" in result
return result
def _report_non_sensitive_data(self, options, targets):
topts = options.copy()
topts["platform_packages"] = [
dict(name=item["name"], version=item["version"])
for item in self.dump_used_packages()
]
topts["platform"] = {"name": self.name, "version": self.version}
if self.src_version:
topts["platform"]["src_version"] = self.src_version
telemetry.send_run_environment(topts, targets)
def _run_scons(self, variables, targets, jobs):
args = [
proc.get_pythonexe_path(),
join(get_core_package_dir("tool-scons"), "script", "scons"),
"-Q",
"--warn=no-no-parallel-support",
"--jobs",
str(jobs),
"--sconstruct",
join(fs.get_source_dir(), "builder", "main.py"),
]
args.append("PIOVERBOSE=%d" % (1 if self.verbose else 0))
# pylint: disable=protected-access
args.append("ISATTY=%d" % (1 if click._compat.isatty(sys.stdout) else 0))
args += targets
# encode and append variables
for key, value in variables.items():
args.append("%s=%s" % (key.upper(), self.encode_scons_arg(value)))
proc.copy_pythonpath_to_osenv()
if targets and "menuconfig" in targets:
return proc.exec_command(
args, stdout=sys.stdout, stderr=sys.stderr, stdin=sys.stdin
)
if click._compat.isatty(sys.stdout):
def _write_and_flush(stream, data):
try:
stream.write(data)
stream.flush()
except IOError:
pass
return proc.exec_command(
args,
stdout=proc.BuildAsyncPipe(
line_callback=self._on_stdout_line,
data_callback=lambda data: _write_and_flush(sys.stdout, data),
),
stderr=proc.BuildAsyncPipe(
line_callback=self._on_stderr_line,
data_callback=lambda data: _write_and_flush(sys.stderr, data),
),
)
return proc.exec_command(
args,
stdout=proc.LineBufferedAsyncPipe(line_callback=self._on_stdout_line),
stderr=proc.LineBufferedAsyncPipe(line_callback=self._on_stderr_line),
)
def _on_stdout_line(self, line):
if "`buildprog' is up to date." in line:
return
self._echo_line(line, level=1)
def _on_stderr_line(self, line):
is_error = self.LINE_ERROR_RE.search(line) is not None
self._echo_line(line, level=3 if is_error else 2)
a_pos = line.find("fatal error:")
b_pos = line.rfind(": No such file or directory")
if a_pos == -1 or b_pos == -1:
return
self._echo_missed_dependency(line[a_pos + 12 : b_pos].strip())
def _echo_line(self, line, level):
if line.startswith("scons: "):
line = line[7:]
assert 1 <= level <= 3
if self.silent and (level < 2 or not line):
return
fg = (None, "yellow", "red")[level - 1]
if level == 1 and "is up to date" in line:
fg = "green"
click.secho(line, fg=fg, err=level > 1, nl=False)
@staticmethod
def _echo_missed_dependency(filename):
if "/" in filename or not filename.endswith((".h", ".hpp")):
return
banner = """
{dots}
* Looking for {filename_styled} dependency? Check our library registry!
*
* CLI > platformio lib search "header:{filename}"
* Web > {link}
*
{dots}
""".format(
filename=filename,
filename_styled=click.style(filename, fg="cyan"),
link=click.style(
"https://platformio.org/lib/search?query=header:%s"
% quote(filename, safe=""),
fg="blue",
),
dots="*" * (56 + len(filename)),
)
click.echo(banner, err=True)
class PlatformBase(PlatformPackagesMixin, PlatformRunMixin):
PIO_VERSION = semantic_version.Version(util.pepver_to_semver(__version__))
_BOARDS_CACHE = {}
def __init__(self, manifest_path):
self.manifest_path = manifest_path
self.silent = False
self.verbose = False
self._manifest = fs.load_json(manifest_path)
self._BOARDS_CACHE = {}
self._custom_packages = None
self.config = ProjectConfig.get_instance()
self.pm = PackageManager(
self.config.get_optional_dir("packages"), self.package_repositories
)
self._src_manifest = None
src_manifest_path = self.pm.get_src_manifest_path(self.get_dir())
if src_manifest_path:
self._src_manifest = fs.load_json(src_manifest_path)
# if self.engines and "platformio" in self.engines:
# if self.PIO_VERSION not in semantic_version.SimpleSpec(
# self.engines['platformio']):
# raise exception.IncompatiblePlatform(self.name,
# str(self.PIO_VERSION))
@property
def name(self):
return self._manifest["name"]
@property
def title(self):
return self._manifest["title"]
@property
def description(self):
return self._manifest["description"]
@property
def version(self):
return self._manifest["version"]
@property
def src_version(self):
return self._src_manifest.get("version") if self._src_manifest else None
@property
def src_url(self):
return self._src_manifest.get("url") if self._src_manifest else None
@property
def homepage(self):
return self._manifest.get("homepage")
@property
def vendor_url(self):
return self._manifest.get("url")
@property
def docs_url(self):
return self._manifest.get("docs")
@property
def repository_url(self):
return self._manifest.get("repository", {}).get("url")
@property
def license(self):
return self._manifest.get("license")
@property
def frameworks(self):
return self._manifest.get("frameworks")
@property
def engines(self):
return self._manifest.get("engines")
@property
def package_repositories(self):
return self._manifest.get("packageRepositories")
@property
def manifest(self):
return self._manifest
@property
def packages(self):
packages = self._manifest.get("packages", {})
for item in self._custom_packages or []:
name = item
version = "*"
if "@" in item:
name, version = item.split("@", 1)
name = name.strip()
if name not in packages:
packages[name] = {}
packages[name].update({"version": version.strip(), "optional": False})
return packages
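# Sketch of how a "platform_packages" override entry is folded into the manifest
# packages above; the package names and versions here are purely illustrative.
custom_packages = ["toolchain-gccarmnoneeabi@~1.90301.0", "framework-arduino-avr"]
for item in custom_packages:
    name, version = (item.split("@", 1) + ["*"])[:2]
    print(name.strip(), "->", version.strip())  # becomes a non-optional package entry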
@property
def python_packages(self):
return self._manifest.get("pythonPackages")
def get_dir(self):
return dirname(self.manifest_path)
def get_build_script(self):
main_script = join(self.get_dir(), "builder", "main.py")
if isfile(main_script):
return main_script
raise NotImplementedError()
def is_embedded(self):
for opts in self.packages.values():
if opts.get("type") == "uploader":
return True
return False
def get_boards(self, id_=None):
def _append_board(board_id, manifest_path):
config = PlatformBoardConfig(manifest_path)
if "platform" in config and config.get("platform") != self.name:
return
if "platforms" in config and self.name not in config.get("platforms"):
return
config.manifest["platform"] = self.name
self._BOARDS_CACHE[board_id] = config
bdirs = [
self.config.get_optional_dir("boards"),
join(self.config.get_optional_dir("core"), "boards"),
join(self.get_dir(), "boards"),
]
if id_ is None:
for boards_dir in bdirs:
if not isdir(boards_dir):
continue
for item in sorted(os.listdir(boards_dir)):
_id = item[:-5]
if not item.endswith(".json") or _id in self._BOARDS_CACHE:
continue
_append_board(_id, join(boards_dir, item))
else:
if id_ not in self._BOARDS_CACHE:
for boards_dir in bdirs:
if not isdir(boards_dir):
continue
manifest_path = join(boards_dir, "%s.json" % id_)
if isfile(manifest_path):
_append_board(id_, manifest_path)
break
if id_ not in self._BOARDS_CACHE:
raise exception.UnknownBoard(id_)
return self._BOARDS_CACHE[id_] if id_ else self._BOARDS_CACHE
def board_config(self, id_):
return self.get_boards(id_)
def get_package_type(self, name):
return self.packages[name].get("type")
def configure_default_packages(self, options, targets):
# override user custom packages
self._custom_packages = options.get("platform_packages")
# enable used frameworks
for framework in options.get("framework", []):
if not self.frameworks:
continue
framework = framework.lower().strip()
if not framework or framework not in self.frameworks:
continue
_pkg_name = self.frameworks[framework].get("package")
if _pkg_name:
self.packages[_pkg_name]["optional"] = False
# enable upload tools for upload targets
if any(["upload" in t for t in targets] + ["program" in targets]):
for name, opts in self.packages.items():
if opts.get("type") == "uploader":
self.packages[name]["optional"] = False
# skip all packages in "nobuild" mode
# allow only upload tools and frameworks
elif "nobuild" in targets and opts.get("type") != "framework":
self.packages[name]["optional"] = True
def get_lib_storages(self):
storages = {}
for opts in (self.frameworks or {}).values():
if "package" not in opts:
continue
pkg_dir = self.get_package_dir(opts["package"])
if not pkg_dir or not isdir(join(pkg_dir, "libraries")):
continue
libs_dir = join(pkg_dir, "libraries")
storages[libs_dir] = opts["package"]
libcores_dir = join(libs_dir, "__cores__")
if not isdir(libcores_dir):
continue
for item in os.listdir(libcores_dir):
libcore_dir = join(libcores_dir, item)
if not isdir(libcore_dir):
continue
storages[libcore_dir] = "%s-core-%s" % (opts["package"], item)
return [dict(name=name, path=path) for path, name in storages.items()]
def on_installed(self):
pass
def on_uninstalled(self):
pass
def install_python_packages(self):
if not self.python_packages:
return None
click.echo(
"Installing Python packages: %s"
% ", ".join(list(self.python_packages.keys())),
)
args = [proc.get_pythonexe_path(), "-m", "pip", "install", "--upgrade"]
for name, requirements in self.python_packages.items():
if any(c in requirements for c in ("<", ">", "=")):
args.append("%s%s" % (name, requirements))
else:
args.append("%s==%s" % (name, requirements))
try:
return subprocess.call(args) == 0
except Exception as e: # pylint: disable=broad-except
click.secho(
"Could not install Python packages -> %s" % e, fg="red", err=True
)
def uninstall_python_packages(self):
if not self.python_packages:
return
click.echo("Uninstalling Python packages")
args = [proc.get_pythonexe_path(), "-m", "pip", "uninstall", "--yes"]
args.extend(list(self.python_packages.keys()))
try:
return subprocess.call(args) == 0
except Exception as e: # pylint: disable=broad-except
click.secho(
"Could not uninstall Python packages -> %s" % e, fg="red", err=True
)
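# Illustrative only: for a "pythonPackages" manifest section like the dict below,
# install_python_packages() builds a single "pip install --upgrade" command line.
# The package names and version requirements are examples, not taken from this commit.
python_packages = {"pyelftools": ">=0.25,<1", "intelhex": "2.3.0"}
args = ["python", "-m", "pip", "install", "--upgrade"]
for name, requirements in python_packages.items():
    if any(c in requirements for c in ("<", ">", "=")):
        args.append("%s%s" % (name, requirements))
    else:
        args.append("%s==%s" % (name, requirements))
print(args)  # e.g. [..., 'pyelftools>=0.25,<1', 'intelhex==2.3.0']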
class PlatformBoardConfig(object):
def __init__(self, manifest_path):
self._id = basename(manifest_path)[:-5]
assert isfile(manifest_path)
self.manifest_path = manifest_path
try:
self._manifest = fs.load_json(manifest_path)
except ValueError:
raise exception.InvalidBoardManifest(manifest_path)
if not set(["name", "url", "vendor"]) <= set(self._manifest):
raise exception.PlatformioException(
"Please specify name, url and vendor fields for " + manifest_path
)
def get(self, path, default=None):
try:
value = self._manifest
for k in path.split("."):
value = value[k]
# pylint: disable=undefined-variable
if PY2 and isinstance(value, unicode):
# cast to plain string from unicode for PY2, resolves issue in
# dev/platform when BoardConfig.get() is used in pair with
# os.path.join(file_encoding, unicode_encoding)
try:
value = value.encode("utf-8")
except UnicodeEncodeError:
pass
return value
except KeyError:
if default is not None:
return default
raise KeyError("Invalid board option '%s'" % path)
def update(self, path, value):
newdict = None
for key in path.split(".")[::-1]:
if newdict is None:
newdict = {key: value}
else:
newdict = {key: newdict}
util.merge_dicts(self._manifest, newdict)
def __contains__(self, key):
try:
self.get(key)
return True
except KeyError:
return False
@property
def id(self):
return self._id
@property
def id_(self):
return self.id
@property
def manifest(self):
return self._manifest
def get_brief_data(self):
return {
"id": self.id,
"name": self._manifest["name"],
"platform": self._manifest.get("platform"),
"mcu": self._manifest.get("build", {}).get("mcu", "").upper(),
"fcpu": int(
"".join(
[
c
for c in str(self._manifest.get("build", {}).get("f_cpu", "0L"))
if c.isdigit()
]
)
),
"ram": self._manifest.get("upload", {}).get("maximum_ram_size", 0),
"rom": self._manifest.get("upload", {}).get("maximum_size", 0),
"connectivity": self._manifest.get("connectivity"),
"frameworks": self._manifest.get("frameworks"),
"debug": self.get_debug_data(),
"vendor": self._manifest["vendor"],
"url": self._manifest["url"],
}
def get_debug_data(self):
if not self._manifest.get("debug", {}).get("tools"):
return None
tools = {}
for name, options in self._manifest["debug"]["tools"].items():
tools[name] = {}
for key, value in options.items():
if key in ("default", "onboard"):
tools[name][key] = value
return {"tools": tools}
def get_debug_tool_name(self, custom=None):
debug_tools = self._manifest.get("debug", {}).get("tools")
tool_name = custom
if tool_name == "custom":
return tool_name
if not debug_tools:
telemetry.send_event("Debug", "Request", self.id)
raise DebugSupportError(self._manifest["name"])
if tool_name:
if tool_name in debug_tools:
return tool_name
raise DebugInvalidOptionsError(
"Unknown debug tool `%s`. Please use one of `%s` or `custom`"
% (tool_name, ", ".join(sorted(list(debug_tools))))
)
# automatically select best tool
data = {"default": [], "onboard": [], "external": []}
for key, value in debug_tools.items():
if value.get("default"):
data["default"].append(key)
elif value.get("onboard"):
data["onboard"].append(key)
data["external"].append(key)
for key, value in data.items():
if not value:
continue
return sorted(value)[0]
assert any(item for item in data)
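# Hypothetical usage of the board-config API above; the manifest path and the
# option names are examples, not taken from this commit.
board = PlatformBoardConfig("boards/genericSTM32F103C8.json")
mcu = board.get("build.mcu", "unknown")      # dotted-path lookup with a default
board.update("upload.maximum_size", 65536)   # deep-merges a single nested option
tool = board.get_debug_tool_name()           # default -> onboard -> external order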

View File

@@ -12,10 +12,8 @@
# See the License for the specific language governing permissions and
# limitations under the License.
-import hashlib
import io
import math
-import sys
from email.utils import parsedate_tz
from os.path import getsize, join
from time import mktime
@@ -23,12 +21,8 @@ from time import mktime
import click
import requests
-from platformio import app, util
-from platformio.exception import (
-FDSHASumMismatch,
-FDSizeMismatch,
-FDUnrecognizedStatusCode,
-)
+from platformio import __default_requests_timeout__, app, fs
+from platformio.package.exception import PackageException
class FileDownloader(object):
@@ -39,10 +33,14 @@ class FileDownloader(object):
url,
stream=True,
headers={"User-Agent": app.get_user_agent()},
-verify=sys.version_info >= (2, 7, 9),
+timeout=__default_requests_timeout__,
)
if self._request.status_code != 200:
-raise FDUnrecognizedStatusCode(self._request.status_code, url)
+raise PackageException(
+"Got the unrecognized status code '{0}' when downloaded {1}".format(
+self._request.status_code, url
+)
+)
disposition = self._request.headers.get("content-disposition")
if disposition and "filename=" in disposition:
@@ -75,21 +73,21 @@ class FileDownloader(object):
def start(self, with_progress=True, silent=False):
label = "Downloading"
itercontent = self._request.iter_content(chunk_size=io.DEFAULT_BUFFER_SIZE)
-f = open(self._destination, "wb")
+fp = open(self._destination, "wb")
try:
if not with_progress or self.get_size() == -1:
if not silent:
click.echo("%s..." % label)
for chunk in itercontent:
if chunk:
-f.write(chunk)
+fp.write(chunk)
else:
chunks = int(math.ceil(self.get_size() / float(io.DEFAULT_BUFFER_SIZE)))
with click.progressbar(length=chunks, label=label) as pb:
for _ in pb:
-f.write(next(itercontent))
+fp.write(next(itercontent))
finally:
-f.close()
+fp.close()
self._request.close()
if self.get_lmtime():
@@ -97,29 +95,46 @@ class FileDownloader(object):
return True
-def verify(self, sha1=None):
+def verify(self, checksum=None):
_dlsize = getsize(self._destination)
if self.get_size() != -1 and _dlsize != self.get_size():
-raise FDSizeMismatch(_dlsize, self._fname, self.get_size())
-if not sha1:
-return None
-checksum = hashlib.sha1()
-with io.open(self._destination, "rb", buffering=0) as fp:
-while True:
-chunk = fp.read(io.DEFAULT_BUFFER_SIZE)
-if not chunk:
-break
-checksum.update(chunk)
-if sha1.lower() != checksum.hexdigest().lower():
-raise FDSHASumMismatch(checksum.hexdigest(), self._fname, sha1)
+raise PackageException(
+(
+"The size ({0:d} bytes) of downloaded file '{1}' "
+"is not equal to remote size ({2:d} bytes)"
+).format(_dlsize, self._fname, self.get_size())
+)
+if not checksum:
+return True
+checksum_len = len(checksum)
+hash_algo = None
+if checksum_len == 32:
+hash_algo = "md5"
+elif checksum_len == 40:
+hash_algo = "sha1"
+elif checksum_len == 64:
+hash_algo = "sha256"
+if not hash_algo:
+raise PackageException(
+"Could not determine checksum algorithm by %s" % checksum
+)
+dl_checksum = fs.calculate_file_hashsum(hash_algo, self._destination)
+if checksum.lower() != dl_checksum.lower():
+raise PackageException(
+"The checksum '{0}' of the downloaded file '{1}' "
+"does not match to the remote '{2}'".format(
+dl_checksum, self._fname, checksum
+)
+)
return True
def _preserve_filemtime(self, lmdate):
timedata = parsedate_tz(lmdate)
lmtime = mktime(timedata[:9])
-util.change_filemtime(self._destination, lmtime)
+fs.change_filemtime(self._destination, lmtime)
def __del__(self):
if self._request:
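# Sketch of the checksum-length heuristic introduced in verify() above:
# 32 hex characters -> md5, 40 -> sha1, 64 -> sha256.
import hashlib
for algo in ("md5", "sha1", "sha256"):
    digest = hashlib.new(algo, b"example").hexdigest()
    assert {32: "md5", 40: "sha1", 64: "sha256"}[len(digest)] == algo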

View File

@@ -12,7 +12,8 @@
# See the License for the specific language governing permissions and
# limitations under the License.
-from platformio.exception import PlatformioException
+from platformio import util
+from platformio.exception import PlatformioException, UserSideException
class PackageException(PlatformioException):
@@ -44,3 +45,27 @@ class ManifestValidationError(ManifestException):
"https://docs.platformio.org/page/librarymanager/config.html"
% self.messages
)
class MissingPackageManifestError(ManifestException):
MESSAGE = "Could not find one of '{0}' manifest files in the package"
class UnknownPackageError(UserSideException):
MESSAGE = (
"Could not find the package with '{0}' requirements for your system '%s'"
% util.get_systype()
)
class NotGlobalLibDir(UserSideException):
MESSAGE = (
"The `{0}` is not a PlatformIO project.\n\n"
"To manage libraries in global storage `{1}`,\n"
"please use `platformio lib --global {2}` or specify custom storage "
"`platformio lib --storage-dir /path/to/storage/ {2}`.\n"
"Check `platformio lib --help` for details."
)

View File

@@ -15,7 +15,7 @@
import os
from time import sleep, time
-from platformio import exception
+from platformio.exception import PlatformioException
LOCKFILE_TIMEOUT = 3600  # in seconds, 1 hour
LOCKFILE_DELAY = 0.2
@@ -36,7 +36,11 @@ except ImportError:
LOCKFILE_CURRENT_INTERFACE = None
-class LockFileExists(Exception):
+class LockFileExists(PlatformioException):
+pass
+class LockFileTimeoutError(PlatformioException):
pass
@@ -88,7 +92,7 @@ class LockFile(object):
sleep(self.delay)
elapsed += self.delay
-raise exception.LockFileTimeoutError()
+raise LockFileTimeoutError()
def release(self):
self._unlock()

View File

@@ -0,0 +1,13 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

View File

@@ -0,0 +1,95 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import hashlib
import os
import tempfile
import time
from platformio import app, compat
from platformio.package.download import FileDownloader
from platformio.package.lockfile import LockFile
class PackageManagerDownloadMixin(object):
DOWNLOAD_CACHE_EXPIRE = 86400 * 30 # keep package in a local cache for 1 month
def compute_download_path(self, *args):
request_hash = hashlib.new("sha1")
for arg in args:
request_hash.update(compat.hashlib_encode_data(arg))
dl_path = os.path.join(self.get_download_dir(), request_hash.hexdigest())
return dl_path
def get_download_usagedb_path(self):
return os.path.join(self.get_download_dir(), "usage.db")
def set_download_utime(self, path, utime=None):
with app.State(self.get_download_usagedb_path(), lock=True) as state:
state[os.path.basename(path)] = int(time.time() if not utime else utime)
def cleanup_expired_downloads(self):
with app.State(self.get_download_usagedb_path(), lock=True) as state:
# remove outdated
for fname in list(state.keys()):
if state[fname] > (time.time() - self.DOWNLOAD_CACHE_EXPIRE):
continue
del state[fname]
dl_path = os.path.join(self.get_download_dir(), fname)
if os.path.isfile(dl_path):
os.remove(dl_path)
def download(self, url, checksum=None, silent=False):
dl_path = self.compute_download_path(url, checksum or "")
if os.path.isfile(dl_path):
self.set_download_utime(dl_path)
return dl_path
with_progress = not silent and not app.is_disabled_progressbar()
tmp_fd, tmp_path = tempfile.mkstemp(dir=self.get_download_dir())
try:
with LockFile(dl_path):
try:
fd = FileDownloader(url)
fd.set_destination(tmp_path)
fd.start(with_progress=with_progress, silent=silent)
except IOError as e:
raise_error = not with_progress
if with_progress:
try:
fd = FileDownloader(url)
fd.set_destination(tmp_path)
fd.start(with_progress=False, silent=silent)
except IOError:
raise_error = True
if raise_error:
self.print_message(
"Error: Please read http://bit.ly/package-manager-ioerror",
fg="red",
err=True,
)
raise e
if checksum:
fd.verify(checksum)
os.close(tmp_fd)
os.rename(tmp_path, dl_path)
finally:
if os.path.isfile(tmp_path):
os.close(tmp_fd)
os.remove(tmp_path)
assert os.path.isfile(dl_path)
self.set_download_utime(dl_path)
return dl_path
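# Illustrative computation of the cache file name used by download() above:
# the key is the SHA1 over the URL plus the optional checksum string.
# The URL below is hypothetical.
import hashlib
url = "https://example.com/pkg.tar.gz"
checksum = ""  # empty when no checksum is supplied
key = hashlib.new("sha1")
for arg in (url, checksum):
    key.update(arg.encode("utf-8"))
print(key.hexdigest())  # file name inside "<cache>/downloads/"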

View File

@@ -0,0 +1,242 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import hashlib
import os
import shutil
import tempfile
import click
from platformio import app, compat, fs, util
from platformio.package.exception import PackageException
from platformio.package.meta import PackageItem
from platformio.package.unpack import FileUnpacker
from platformio.package.vcsclient import VCSClientFactory
class PackageManagerInstallMixin(object):
_INSTALL_HISTORY = None # avoid circle dependencies
@staticmethod
def unpack(src, dst):
with_progress = not app.is_disabled_progressbar()
try:
with FileUnpacker(src) as fu:
return fu.unpack(dst, with_progress=with_progress)
except IOError as e:
if not with_progress:
raise e
with FileUnpacker(src) as fu:
return fu.unpack(dst, with_progress=False)
def install(self, spec, silent=False, skip_dependencies=False, force=False):
try:
self.lock()
pkg = self._install(
spec, silent=silent, skip_dependencies=skip_dependencies, force=force
)
self.memcache_reset()
self.cleanup_expired_downloads()
return pkg
finally:
self.unlock()
def _install( # pylint: disable=too-many-arguments
self,
spec,
search_filters=None,
silent=False,
skip_dependencies=False,
force=False,
):
spec = self.ensure_spec(spec)
# avoid circle dependencies
if not self._INSTALL_HISTORY:
self._INSTALL_HISTORY = {}
if spec in self._INSTALL_HISTORY:
return self._INSTALL_HISTORY[spec]
# check if package is already installed
pkg = self.get_package(spec)
# if a forced installation
if pkg and force:
self.uninstall(pkg, silent=silent)
pkg = None
if pkg:
if not silent:
self.print_message(
"{name} @ {version} is already installed".format(
**pkg.metadata.as_dict()
),
fg="yellow",
)
return pkg
if not silent:
self.print_message(
"Installing %s" % click.style(spec.humanize(), fg="cyan")
)
if spec.external:
pkg = self.install_from_url(spec.url, spec, silent=silent)
else:
pkg = self.install_from_registry(spec, search_filters, silent=silent)
if not pkg or not pkg.metadata:
raise PackageException(
"Could not install package '%s' for '%s' system"
% (spec.humanize(), util.get_systype())
)
if not silent:
self.print_message(
"{name} @ {version} has been installed!".format(
**pkg.metadata.as_dict()
),
fg="green",
)
self.memcache_reset()
if not skip_dependencies:
self.install_dependencies(pkg, silent)
self._INSTALL_HISTORY[spec] = pkg
return pkg
def install_dependencies(self, pkg, silent=False):
pass
def install_from_url(self, url, spec, checksum=None, silent=False):
spec = self.ensure_spec(spec)
tmp_dir = tempfile.mkdtemp(prefix="pkg-installing-", dir=self.get_tmp_dir())
vcs = None
try:
if url.startswith("file://"):
_url = url[7:]
if os.path.isfile(_url):
self.unpack(_url, tmp_dir)
else:
fs.rmtree(tmp_dir)
shutil.copytree(_url, tmp_dir, symlinks=True)
elif url.startswith(("http://", "https://")):
dl_path = self.download(url, checksum, silent=silent)
assert os.path.isfile(dl_path)
self.unpack(dl_path, tmp_dir)
else:
vcs = VCSClientFactory.new(tmp_dir, url)
assert vcs.export()
root_dir = self.find_pkg_root(tmp_dir, spec)
pkg_item = PackageItem(
root_dir,
self.build_metadata(
root_dir, spec, vcs.get_current_revision() if vcs else None
),
)
pkg_item.dump_meta()
return self._install_tmp_pkg(pkg_item)
finally:
if os.path.isdir(tmp_dir):
fs.rmtree(tmp_dir)
def _install_tmp_pkg(self, tmp_pkg):
assert isinstance(tmp_pkg, PackageItem)
# validate package version and declared requirements
if (
tmp_pkg.metadata.spec.requirements
and tmp_pkg.metadata.version not in tmp_pkg.metadata.spec.requirements
):
raise PackageException(
"Package version %s doesn't satisfy requirements %s based on %s"
% (
tmp_pkg.metadata.version,
tmp_pkg.metadata.spec.requirements,
tmp_pkg.metadata,
)
)
dst_pkg = PackageItem(
os.path.join(self.package_dir, tmp_pkg.get_safe_dirname())
)
# what to do with existing package?
action = "overwrite"
if tmp_pkg.metadata.spec.has_custom_name():
action = "overwrite"
dst_pkg = PackageItem(
os.path.join(self.package_dir, tmp_pkg.metadata.spec.name)
)
elif dst_pkg.metadata:
if dst_pkg.metadata.spec.external:
if dst_pkg.metadata.spec.url != tmp_pkg.metadata.spec.url:
action = "detach-existing"
elif (
dst_pkg.metadata.version != tmp_pkg.metadata.version
or dst_pkg.metadata.spec.owner != tmp_pkg.metadata.spec.owner
):
action = (
"detach-existing"
if tmp_pkg.metadata.version > dst_pkg.metadata.version
else "detach-new"
)
def _cleanup_dir(path):
if os.path.isdir(path):
fs.rmtree(path)
if action == "detach-existing":
target_dirname = "%s@%s" % (
tmp_pkg.get_safe_dirname(),
dst_pkg.metadata.version,
)
if dst_pkg.metadata.spec.url:
target_dirname = "%s@src-%s" % (
tmp_pkg.get_safe_dirname(),
hashlib.md5(
compat.hashlib_encode_data(dst_pkg.metadata.spec.url)
).hexdigest(),
)
# move existing into the new place
pkg_dir = os.path.join(self.package_dir, target_dirname)
_cleanup_dir(pkg_dir)
shutil.move(dst_pkg.path, pkg_dir)
# move new source to the destination location
_cleanup_dir(dst_pkg.path)
shutil.move(tmp_pkg.path, dst_pkg.path)
return PackageItem(dst_pkg.path)
if action == "detach-new":
target_dirname = "%s@%s" % (
tmp_pkg.get_safe_dirname(),
tmp_pkg.metadata.version,
)
if tmp_pkg.metadata.spec.external:
target_dirname = "%s@src-%s" % (
tmp_pkg.get_safe_dirname(),
hashlib.md5(
compat.hashlib_encode_data(tmp_pkg.metadata.spec.url)
).hexdigest(),
)
pkg_dir = os.path.join(self.package_dir, target_dirname)
_cleanup_dir(pkg_dir)
shutil.move(tmp_pkg.path, pkg_dir)
return PackageItem(pkg_dir)
# otherwise, overwrite existing
_cleanup_dir(dst_pkg.path)
shutil.move(tmp_pkg.path, dst_pkg.path)
return PackageItem(dst_pkg.path)
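# Sketch of the "detached" directory names produced above when an existing
# package has to be kept next to a new one; the name, version and URL are
# illustrative.
import hashlib
safe_dirname = "Some_Library"
by_version = "%s@%s" % (safe_dirname, "1.2.3")
by_source = "%s@src-%s" % (
    safe_dirname,
    hashlib.md5(b"https://github.com/example/repo.git").hexdigest(),
)
print(by_version, by_source)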

View File

@@ -0,0 +1,61 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
from platformio import fs
from platformio.package.meta import PackageItem, PackageSpec
class PackageManagerLegacyMixin(object):
def build_legacy_spec(self, pkg_dir):
# find src manifest
src_manifest_name = ".piopkgmanager.json"
src_manifest_path = None
for name in os.listdir(pkg_dir):
if not os.path.isfile(os.path.join(pkg_dir, name, src_manifest_name)):
continue
src_manifest_path = os.path.join(pkg_dir, name, src_manifest_name)
break
if src_manifest_path:
src_manifest = fs.load_json(src_manifest_path)
return PackageSpec(
name=src_manifest.get("name"),
url=src_manifest.get("url"),
requirements=src_manifest.get("requirements"),
)
# fall back to a package manifest
manifest = self.load_manifest(pkg_dir)
return PackageSpec(name=manifest.get("name"))
def legacy_load_manifest(self, pkg):
if not isinstance(pkg, PackageItem):
assert os.path.isdir(pkg)
pkg = PackageItem(pkg)
manifest = self.load_manifest(pkg)
manifest["__pkg_dir"] = pkg.path
for key in ("name", "version"):
if not manifest.get(key):
manifest[key] = str(getattr(pkg.metadata, key))
if pkg.metadata and pkg.metadata.spec and pkg.metadata.spec.external:
manifest["__src_url"] = pkg.metadata.spec.url
manifest["version"] = str(pkg.metadata.version)
if pkg.metadata and pkg.metadata.spec.owner:
manifest["ownername"] = pkg.metadata.spec.owner
return manifest
def legacy_get_installed(self):
return [self.legacy_load_manifest(pkg) for pkg in self.get_installed()]

View File

@@ -0,0 +1,229 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import time
import click
from platformio.clients.http import HTTPClient
from platformio.clients.registry import RegistryClient
from platformio.package.exception import UnknownPackageError
from platformio.package.meta import PackageSpec
from platformio.package.version import cast_version_to_semver
try:
from urllib.parse import urlparse
except ImportError:
from urlparse import urlparse
class RegistryFileMirrorIterator(object):
HTTP_CLIENT_INSTANCES = {}
def __init__(self, download_url):
self.download_url = download_url
self._url_parts = urlparse(download_url)
self._mirror = "%s://%s" % (self._url_parts.scheme, self._url_parts.netloc)
self._visited_mirrors = []
def __iter__(self): # pylint: disable=non-iterator-returned
return self
def next(self):
""" For Python 2 compatibility """
return self.__next__()
def __next__(self):
http = self.get_http_client()
response = http.send_request(
"head",
self._url_parts.path,
allow_redirects=False,
params=dict(bypass=",".join(self._visited_mirrors))
if self._visited_mirrors
else None,
)
stop_conditions = [
response.status_code not in (302, 307),
not response.headers.get("Location"),
not response.headers.get("X-PIO-Mirror"),
response.headers.get("X-PIO-Mirror") in self._visited_mirrors,
]
if any(stop_conditions):
raise StopIteration
self._visited_mirrors.append(response.headers.get("X-PIO-Mirror"))
return (
response.headers.get("Location"),
response.headers.get("X-PIO-Content-SHA256"),
)
def get_http_client(self):
if self._mirror not in RegistryFileMirrorIterator.HTTP_CLIENT_INSTANCES:
RegistryFileMirrorIterator.HTTP_CLIENT_INSTANCES[self._mirror] = HTTPClient(
self._mirror
)
return RegistryFileMirrorIterator.HTTP_CLIENT_INSTANCES[self._mirror]
class PackageManageRegistryMixin(object):
def install_from_registry(self, spec, search_filters=None, silent=False):
if spec.owner and spec.name and not search_filters:
package = self.fetch_registry_package(spec)
if not package:
raise UnknownPackageError(spec.humanize())
version = self.pick_best_registry_version(package["versions"], spec)
else:
packages = self.search_registry_packages(spec, search_filters)
if not packages:
raise UnknownPackageError(spec.humanize())
if len(packages) > 1 and not silent:
self.print_multi_package_issue(packages, spec)
package, version = self.find_best_registry_version(packages, spec)
if not package or not version:
raise UnknownPackageError(spec.humanize())
pkgfile = self._pick_compatible_pkg_file(version["files"]) if version else None
if not pkgfile:
raise UnknownPackageError(spec.humanize())
for url, checksum in RegistryFileMirrorIterator(pkgfile["download_url"]):
try:
return self.install_from_url(
url,
PackageSpec(
owner=package["owner"]["username"],
id=package["id"],
name=package["name"],
),
checksum or pkgfile["checksum"]["sha256"],
silent=silent,
)
except Exception as e: # pylint: disable=broad-except
self.print_message("Warning! Package Mirror: %s" % e, fg="yellow")
self.print_message("Looking for another mirror...", fg="yellow")
return None
def get_registry_client_instance(self):
if not self._registry_client:
self._registry_client = RegistryClient()
return self._registry_client
def search_registry_packages(self, spec, filters=None):
assert isinstance(spec, PackageSpec)
filters = filters or {}
if spec.id:
filters["ids"] = str(spec.id)
else:
filters["types"] = self.pkg_type
filters["names"] = spec.name.lower()
if spec.owner:
filters["owners"] = spec.owner.lower()
return self.get_registry_client_instance().list_packages(filters=filters)[
"items"
]
def fetch_registry_package(self, spec):
assert isinstance(spec, PackageSpec)
result = None
regclient = self.get_registry_client_instance()
if spec.owner and spec.name:
result = regclient.get_package(self.pkg_type, spec.owner, spec.name)
if not result and (spec.id or (spec.name and not spec.owner)):
packages = self.search_registry_packages(spec)
if packages:
result = regclient.get_package(
self.pkg_type, packages[0]["owner"]["username"], packages[0]["name"]
)
if not result:
raise UnknownPackageError(spec.humanize())
return result
def reveal_registry_package_id(self, spec, silent=False):
spec = self.ensure_spec(spec)
if spec.id:
return spec.id
packages = self.search_registry_packages(spec)
if not packages:
raise UnknownPackageError(spec.humanize())
if len(packages) > 1 and not silent:
self.print_multi_package_issue(packages, spec)
click.echo("")
return packages[0]["id"]
def print_multi_package_issue(self, packages, spec):
self.print_message(
"Warning! More than one package has been found by ", fg="yellow", nl=False
)
click.secho(spec.humanize(), fg="cyan", nl=False)
click.secho(" requirements:", fg="yellow")
for item in packages:
click.echo(
" - {owner}/{name} @ {version}".format(
owner=click.style(item["owner"]["username"], fg="cyan"),
name=item["name"],
version=item["version"]["name"],
)
)
self.print_message(
"Please specify detailed REQUIREMENTS using package owner and version "
"(showed above) to avoid name conflicts",
fg="yellow",
)
def find_best_registry_version(self, packages, spec):
for package in packages:
# find compatible version within the latest package versions
version = self.pick_best_registry_version([package["version"]], spec)
if version:
return (package, version)
# if the custom version requirements, check ALL package versions
version = self.pick_best_registry_version(
self.fetch_registry_package(
PackageSpec(
id=package["id"],
owner=package["owner"]["username"],
name=package["name"],
)
).get("versions"),
spec,
)
if version:
return (package, version)
time.sleep(1)
return (None, None)
def pick_best_registry_version(self, versions, spec=None):
assert not spec or isinstance(spec, PackageSpec)
best = None
for version in versions:
semver = cast_version_to_semver(version["name"])
if spec and spec.requirements and semver not in spec.requirements:
continue
if not any(
self.is_system_compatible(f.get("system")) for f in version["files"]
):
continue
if not best or (semver > cast_version_to_semver(best["name"])):
best = version
return best
def _pick_compatible_pkg_file(self, version_files):
for item in version_files:
if self.is_system_compatible(item.get("system")):
return item
return None
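# Hypothetical registry data showing what pick_best_registry_version() would do:
# versions failing the spec requirements or lacking a file compatible with the
# host system are skipped, then the highest remaining semver wins.
versions = [
    {"name": "2.0.0", "files": [{"system": ["linux_x86_64"]}]},
    {"name": "2.1.0", "files": [{"system": "*"}]},
    {"name": "1.9.9", "files": [{"system": "*"}]},
]
# With a spec requiring "^2.0.0" on a linux_x86_64 host, "2.1.0" is selected.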

View File

@@ -0,0 +1,78 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
import shutil
import click
from platformio import fs
from platformio.package.exception import UnknownPackageError
from platformio.package.meta import PackageSpec
class PackageManagerUninstallMixin(object):
def uninstall(self, spec, silent=False, skip_dependencies=False):
try:
self.lock()
return self._uninstall(spec, silent, skip_dependencies)
finally:
self.unlock()
def _uninstall(self, spec, silent=False, skip_dependencies=False):
pkg = self.get_package(spec)
if not pkg or not pkg.metadata:
raise UnknownPackageError(spec)
if not silent:
self.print_message(
"Removing %s @ %s"
% (click.style(pkg.metadata.name, fg="cyan"), pkg.metadata.version),
)
# firstly, remove dependencies
if not skip_dependencies:
self.uninstall_dependencies(pkg, silent)
if os.path.islink(pkg.path):
os.unlink(pkg.path)
else:
fs.rmtree(pkg.path)
self.memcache_reset()
# unfix detached-package with the same name
detached_pkg = self.get_package(PackageSpec(name=pkg.metadata.name))
if (
detached_pkg
and "@" in detached_pkg.path
and not os.path.isdir(
os.path.join(self.package_dir, detached_pkg.get_safe_dirname())
)
):
shutil.move(
detached_pkg.path,
os.path.join(self.package_dir, detached_pkg.get_safe_dirname()),
)
self.memcache_reset()
if not silent:
self.print_message(
"{name} @ {version} has been removed!".format(**pkg.metadata.as_dict()),
fg="green",
)
return pkg
def uninstall_dependencies(self, pkg, silent=False):
pass

View File

@@ -0,0 +1,169 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
import click
from platformio.clients.http import ensure_internet_on
from platformio.package.exception import UnknownPackageError
from platformio.package.meta import PackageItem, PackageOutdatedResult, PackageSpec
from platformio.package.vcsclient import VCSBaseException, VCSClientFactory
class PackageManagerUpdateMixin(object):
def outdated(self, pkg, spec=None):
assert isinstance(pkg, PackageItem)
assert not spec or isinstance(spec, PackageSpec)
assert os.path.isdir(pkg.path) and pkg.metadata
# skip detached package to a specific version
detached_conditions = [
"@" in pkg.path,
pkg.metadata.spec and not pkg.metadata.spec.external,
not spec,
]
if all(detached_conditions):
return PackageOutdatedResult(current=pkg.metadata.version, detached=True)
latest = None
wanted = None
if pkg.metadata.spec.external:
latest = self._fetch_vcs_latest_version(pkg)
else:
try:
reg_pkg = self.fetch_registry_package(pkg.metadata.spec)
latest = (
self.pick_best_registry_version(reg_pkg["versions"]) or {}
).get("name")
if spec:
wanted = (
self.pick_best_registry_version(reg_pkg["versions"], spec) or {}
).get("name")
if not wanted: # wrong library
latest = None
except UnknownPackageError:
pass
return PackageOutdatedResult(
current=pkg.metadata.version, latest=latest, wanted=wanted
)
def _fetch_vcs_latest_version(self, pkg):
vcs = None
try:
vcs = VCSClientFactory.new(pkg.path, pkg.metadata.spec.url, silent=True)
except VCSBaseException:
return None
if not vcs.can_be_updated:
return None
return str(
self.build_metadata(
pkg.path, pkg.metadata.spec, vcs_revision=vcs.get_latest_revision()
).version
)
def update( # pylint: disable=too-many-arguments
self,
from_spec,
to_spec=None,
only_check=False,
silent=False,
show_incompatible=True,
):
pkg = self.get_package(from_spec)
if not pkg or not pkg.metadata:
raise UnknownPackageError(from_spec)
if not silent:
click.echo(
"{} {:<45} {:<35}".format(
"Checking" if only_check else "Updating",
click.style(pkg.metadata.spec.humanize(), fg="cyan"),
"%s @ %s" % (pkg.metadata.version, to_spec.requirements)
if to_spec and to_spec.requirements
else str(pkg.metadata.version),
),
nl=False,
)
if not ensure_internet_on():
if not silent:
click.echo("[%s]" % (click.style("Off-line", fg="yellow")))
return pkg
outdated = self.outdated(pkg, to_spec)
if not silent:
self.print_outdated_state(outdated, show_incompatible)
if only_check or not outdated.is_outdated(allow_incompatible=False):
return pkg
try:
self.lock()
return self._update(pkg, outdated, silent=silent)
finally:
self.unlock()
@staticmethod
def print_outdated_state(outdated, show_incompatible=True):
if outdated.detached:
return click.echo("[%s]" % (click.style("Detached", fg="yellow")))
if (
not outdated.latest
or outdated.current == outdated.latest
or (not show_incompatible and outdated.current == outdated.wanted)
):
return click.echo("[%s]" % (click.style("Up-to-date", fg="green")))
if outdated.wanted and outdated.current == outdated.wanted:
return click.echo(
"[%s]" % (click.style("Incompatible %s" % outdated.latest, fg="yellow"))
)
return click.echo(
"[%s]"
% (
click.style(
"Outdated %s" % str(outdated.wanted or outdated.latest), fg="red"
)
)
)
def _update(self, pkg, outdated, silent=False):
if pkg.metadata.spec.external:
vcs = VCSClientFactory.new(pkg.path, pkg.metadata.spec.url)
assert vcs.update()
pkg.metadata.version = self._fetch_vcs_latest_version(pkg)
pkg.dump_meta()
return pkg
new_pkg = self.install(
PackageSpec(
id=pkg.metadata.spec.id,
owner=pkg.metadata.spec.owner,
name=pkg.metadata.spec.name,
requirements=outdated.wanted or outdated.latest,
),
silent=silent,
)
if new_pkg:
old_pkg = self.get_package(
PackageSpec(
id=pkg.metadata.spec.id,
owner=pkg.metadata.spec.owner,
name=pkg.metadata.name,
requirements=pkg.metadata.version,
)
)
if old_pkg:
self.uninstall(old_pkg, silent=silent, skip_dependencies=True)
return new_pkg
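# Rough mapping (hypothetical values) from outdated-check results to the labels
# printed by print_outdated_state() above; the namedtuple merely stands in for
# PackageOutdatedResult.
from collections import namedtuple
Outdated = namedtuple("Outdated", "current latest wanted detached")
expected_labels = [
    (Outdated("1.0.0", None, None, True), "Detached"),
    (Outdated("1.0.0", "1.0.0", None, False), "Up-to-date"),
    (Outdated("1.0.0", "2.0.0", "1.0.0", False), "Incompatible 2.0.0"),
    (Outdated("1.0.0", "2.0.0", "1.2.0", False), "Outdated 1.2.0"),
]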

View File

@@ -0,0 +1,268 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
from datetime import datetime
import click
import semantic_version
from platformio import util
from platformio.commands import PlatformioCLI
from platformio.compat import ci_strings_are_equal
from platformio.package.exception import ManifestException, MissingPackageManifestError
from platformio.package.lockfile import LockFile
from platformio.package.manager._download import PackageManagerDownloadMixin
from platformio.package.manager._install import PackageManagerInstallMixin
from platformio.package.manager._legacy import PackageManagerLegacyMixin
from platformio.package.manager._registry import PackageManageRegistryMixin
from platformio.package.manager._uninstall import PackageManagerUninstallMixin
from platformio.package.manager._update import PackageManagerUpdateMixin
from platformio.package.manifest.parser import ManifestParserFactory
from platformio.package.meta import (
PackageItem,
PackageMetaData,
PackageSpec,
PackageType,
)
from platformio.project.helpers import get_project_cache_dir
class BasePackageManager( # pylint: disable=too-many-public-methods
PackageManagerDownloadMixin,
PackageManageRegistryMixin,
PackageManagerInstallMixin,
PackageManagerUninstallMixin,
PackageManagerUpdateMixin,
PackageManagerLegacyMixin,
):
_MEMORY_CACHE = {}
def __init__(self, pkg_type, package_dir):
self.pkg_type = pkg_type
self.package_dir = self.ensure_dir_exists(package_dir)
self._MEMORY_CACHE = {}
self._lockfile = None
self._download_dir = None
self._tmp_dir = None
self._registry_client = None
def lock(self):
if self._lockfile:
return
self._lockfile = LockFile(self.package_dir)
self._lockfile.acquire()
def unlock(self):
if hasattr(self, "_lockfile") and self._lockfile:
self._lockfile.release()
self._lockfile = None
def __del__(self):
self.unlock()
def memcache_get(self, key, default=None):
return self._MEMORY_CACHE.get(key, default)
def memcache_set(self, key, value):
self._MEMORY_CACHE[key] = value
def memcache_reset(self):
self._MEMORY_CACHE.clear()
@staticmethod
def is_system_compatible(value):
if not value or "*" in value:
return True
return util.items_in_list(value, util.get_systype())
@staticmethod
def ensure_dir_exists(path):
if not os.path.isdir(path):
os.makedirs(path)
assert os.path.isdir(path)
return path
@staticmethod
def ensure_spec(spec):
return spec if isinstance(spec, PackageSpec) else PackageSpec(spec)
@property
def manifest_names(self):
raise NotImplementedError
def print_message(self, message, **kwargs):
click.echo(
"%s: " % str(self.__class__.__name__).replace("Package", " "), nl=False
)
click.secho(message, **kwargs)
def get_download_dir(self):
if not self._download_dir:
self._download_dir = self.ensure_dir_exists(
os.path.join(get_project_cache_dir(), "downloads")
)
return self._download_dir
def get_tmp_dir(self):
if not self._tmp_dir:
self._tmp_dir = self.ensure_dir_exists(
os.path.join(get_project_cache_dir(), "tmp")
)
return self._tmp_dir
def find_pkg_root(self, path, spec): # pylint: disable=unused-argument
if self.manifest_exists(path):
return path
for root, _, _ in os.walk(path):
if self.manifest_exists(root):
return root
raise MissingPackageManifestError(", ".join(self.manifest_names))
def get_manifest_path(self, pkg_dir):
if not os.path.isdir(pkg_dir):
return None
for name in self.manifest_names:
manifest_path = os.path.join(pkg_dir, name)
if os.path.isfile(manifest_path):
return manifest_path
return None
def manifest_exists(self, pkg_dir):
return self.get_manifest_path(pkg_dir)
def load_manifest(self, src):
path = src.path if isinstance(src, PackageItem) else src
cache_key = "load_manifest-%s" % path
result = self.memcache_get(cache_key)
if result:
return result
candidates = (
[os.path.join(path, name) for name in self.manifest_names]
if os.path.isdir(path)
else [path]
)
for item in candidates:
if not os.path.isfile(item):
continue
try:
result = ManifestParserFactory.new_from_file(item).as_dict()
self.memcache_set(cache_key, result)
return result
except ManifestException as e:
if not PlatformioCLI.in_silence():
self.print_message(str(e), fg="yellow")
raise MissingPackageManifestError(", ".join(self.manifest_names))
@staticmethod
def generate_rand_version():
return datetime.now().strftime("0.0.0+%Y%m%d%H%M%S")
def build_metadata(self, pkg_dir, spec, vcs_revision=None):
manifest = self.load_manifest(pkg_dir)
metadata = PackageMetaData(
type=self.pkg_type,
name=manifest.get("name"),
version=manifest.get("version"),
spec=spec,
)
if not metadata.name or spec.has_custom_name():
metadata.name = spec.name
if vcs_revision:
metadata.version = "%s+sha.%s" % (
metadata.version if metadata.version else "0.0.0",
vcs_revision,
)
if not metadata.version:
metadata.version = self.generate_rand_version()
return metadata
def get_installed(self):
cache_key = "get_installed"
if self.memcache_get(cache_key):
return self.memcache_get(cache_key)
result = []
for name in sorted(os.listdir(self.package_dir)):
if name.startswith("_tmp_installing"): # legacy tmp folder
continue
pkg_dir = os.path.join(self.package_dir, name)
if not os.path.isdir(pkg_dir):
continue
pkg = PackageItem(pkg_dir)
if not pkg.metadata:
try:
spec = self.build_legacy_spec(pkg_dir)
pkg.metadata = self.build_metadata(pkg_dir, spec)
except MissingPackageManifestError:
pass
if not pkg.metadata:
continue
if self.pkg_type == PackageType.TOOL:
try:
if not self.is_system_compatible(
self.load_manifest(pkg).get("system")
):
continue
except MissingPackageManifestError:
pass
result.append(pkg)
self.memcache_set(cache_key, result)
return result
def get_package(self, spec):
if isinstance(spec, PackageItem):
return spec
spec = self.ensure_spec(spec)
best = None
for pkg in self.get_installed():
if not self.test_pkg_spec(pkg, spec):
continue
assert isinstance(pkg.metadata.version, semantic_version.Version)
if spec.requirements and pkg.metadata.version not in spec.requirements:
continue
if not best or (pkg.metadata.version > best.metadata.version):
best = pkg
return best
@staticmethod
def test_pkg_spec(pkg, spec):
# "id" mismatch
if spec.id and spec.id != pkg.metadata.spec.id:
return False
# external "URL" mismatch
if spec.external:
# local folder mismatch
if os.path.realpath(spec.url) == os.path.realpath(pkg.path) or (
spec.url.startswith("file://")
and os.path.realpath(pkg.path) == os.path.realpath(spec.url[7:])
):
return True
if spec.url != pkg.metadata.spec.url:
return False
# "owner" mismatch
elif spec.owner and not ci_strings_are_equal(
spec.owner, pkg.metadata.spec.owner
):
return False
# "name" mismatch
elif not spec.id and not ci_strings_are_equal(spec.name, pkg.metadata.name):
return False
return True
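# Hypothetical lookup through the manager above; the owner/name/version
# requirement in the spec string are examples, and the default tool storage
# location is assumed.
from platformio.package.manager.tool import ToolPackageManager
pm = ToolPackageManager()
pkg = pm.get_package("platformio/tool-scons@>=2.0.0")
if pkg:
    print(pkg.metadata.name, pkg.metadata.version, pkg.path)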

View File

@@ -17,89 +17,58 @@ import os
import subprocess
import sys
-from platformio import exception, util
+from platformio import __core_packages__, exception, fs, util
from platformio.compat import PY2
-from platformio.managers.package import PackageManager
+from platformio.package.manager.tool import ToolPackageManager
+from platformio.package.meta import PackageSpec
from platformio.proc import get_pythonexe_path
-from platformio.project.config import ProjectConfig
-CORE_PACKAGES = {
-"contrib-piohome": "~3.2.1",
-"contrib-pysite": "~2.%d%d.0" % (sys.version_info.major, sys.version_info.minor),
-"tool-unity": "~1.20500.0",
-"tool-scons": "~2.20501.7" if PY2 else "~3.30102.0",
-"tool-cppcheck": "~1.190.0",
-"tool-clangtidy": "~1.100000.0",
-"tool-pvs-studio": "~7.7.0",
-}
-# pylint: disable=arguments-differ,signature-differs
-class CorePackageManager(PackageManager):
-def __init__(self):
-config = ProjectConfig.get_instance()
-packages_dir = config.get_optional_dir("packages")
-super(CorePackageManager, self).__init__(
-packages_dir,
-[
-"https://dl.bintray.com/platformio/dl-packages/manifest.json",
-"http%s://dl.platformio.org/packages/manifest.json"
-% ("" if sys.version_info < (2, 7, 9) else "s"),
-],
-)
-def install(  # pylint: disable=keyword-arg-before-vararg
-self, name, requirements=None, *args, **kwargs
-):
-PackageManager.install(self, name, requirements, *args, **kwargs)
-self.cleanup_packages()
-return self.get_package_dir(name, requirements)
-def update(self, *args, **kwargs):
-result = PackageManager.update(self, *args, **kwargs)
-self.cleanup_packages()
-return result
-def cleanup_packages(self):
-self.cache_reset()
-best_pkg_versions = {}
-for name, requirements in CORE_PACKAGES.items():
-pkg_dir = self.get_package_dir(name, requirements)
-if not pkg_dir:
-continue
-best_pkg_versions[name] = self.load_manifest(pkg_dir)["version"]
-for manifest in self.get_installed():
-if manifest["name"] not in best_pkg_versions:
-continue
-if manifest["version"] != best_pkg_versions[manifest["name"]]:
-self.uninstall(manifest["__pkg_dir"], after_update=True)
-self.cache_reset()
-return True
def get_core_package_dir(name):
-if name not in CORE_PACKAGES:
-raise exception.PlatformioException("Please upgrade PIO Core")
-requirements = CORE_PACKAGES[name]
-pm = CorePackageManager()
-pkg_dir = pm.get_package_dir(name, requirements)
-if pkg_dir:
-return pkg_dir
-return pm.install(name, requirements)
+if name not in __core_packages__:
+raise exception.PlatformioException("Please upgrade PlatformIO Core")
+pm = ToolPackageManager()
+spec = PackageSpec(
+owner="platformio", name=name, requirements=__core_packages__[name]
+)
+pkg = pm.get_package(spec)
+if pkg:
+return pkg.path
+assert pm.install(spec)
+_remove_unnecessary_packages()
+return pm.get_package(spec).path
def update_core_packages(only_check=False, silent=False):
-pm = CorePackageManager()
-for name, requirements in CORE_PACKAGES.items():
-pkg_dir = pm.get_package_dir(name)
-if not pkg_dir:
+pm = ToolPackageManager()
+for name, requirements in __core_packages__.items():
+spec = PackageSpec(owner="platformio", name=name, requirements=requirements)
+pkg = pm.get_package(spec)
+if not pkg:
continue
-if not silent or pm.outdated(pkg_dir, requirements):
-pm.update(name, requirements, only_check=only_check)
+if not silent or pm.outdated(pkg, spec).is_outdated():
+pm.update(pkg, spec, only_check=only_check)
+if not only_check:
+_remove_unnecessary_packages()
return True
+def _remove_unnecessary_packages():
+pm = ToolPackageManager()
+best_pkg_versions = {}
+for name, requirements in __core_packages__.items():
+spec = PackageSpec(owner="platformio", name=name, requirements=requirements)
+pkg = pm.get_package(spec)
+if not pkg:
+continue
+best_pkg_versions[pkg.metadata.name] = pkg.metadata.version
+for pkg in pm.get_installed():
+if pkg.metadata.name not in best_pkg_versions:
+continue
+if pkg.metadata.version != best_pkg_versions[pkg.metadata.name]:
+pm.uninstall(pkg)
def inject_contrib_pysite(verify_openssl=False):
# pylint: disable=import-outside-toplevel
from site import addsitedir
@@ -124,7 +93,7 @@ def inject_contrib_pysite(verify_openssl=False):
def build_contrib_pysite_deps(target_dir):
if os.path.isdir(target_dir):
-util.rmtree_(target_dir)
+fs.rmtree(target_dir)
os.makedirs(target_dir)
with open(os.path.join(target_dir, "package.json"), "w") as fp:
json.dump(

View File

@@ -0,0 +1,107 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import json
import os
from platformio.package.exception import MissingPackageManifestError
from platformio.package.manager.base import BasePackageManager
from platformio.package.meta import PackageItem, PackageSpec, PackageType
from platformio.project.helpers import get_project_global_lib_dir
class LibraryPackageManager(BasePackageManager): # pylint: disable=too-many-ancestors
def __init__(self, package_dir=None):
super(LibraryPackageManager, self).__init__(
PackageType.LIBRARY, package_dir or get_project_global_lib_dir()
)
@property
def manifest_names(self):
return PackageType.get_manifest_map()[PackageType.LIBRARY]
def find_pkg_root(self, path, spec):
try:
return super(LibraryPackageManager, self).find_pkg_root(path, spec)
except MissingPackageManifestError:
pass
assert isinstance(spec, PackageSpec)
root_dir = self.find_library_root(path)
# automatically generate library manifest
with open(os.path.join(root_dir, "library.json"), "w") as fp:
json.dump(
dict(name=spec.name, version=self.generate_rand_version(),),
fp,
indent=2,
)
return root_dir
@staticmethod
def find_library_root(path):
for root, dirs, files in os.walk(path):
if not files and len(dirs) == 1:
continue
for fname in files:
if not fname.endswith((".c", ".cpp", ".h", ".S")):
continue
if os.path.isdir(os.path.join(os.path.dirname(root), "src")):
return os.path.dirname(root)
return root
return path
def install_dependencies(self, pkg, silent=False):
assert isinstance(pkg, PackageItem)
manifest = self.load_manifest(pkg)
if not manifest.get("dependencies"):
return
if not silent:
self.print_message("Installing dependencies...")
for dependency in manifest.get("dependencies"):
if not self._install_dependency(dependency, silent) and not silent:
self.print_message(
"Warning! Could not install dependency %s for package '%s'"
% (dependency, pkg.metadata.name),
fg="yellow",
)
def _install_dependency(self, dependency, silent=False):
spec = PackageSpec(
name=dependency.get("name"), requirements=dependency.get("version")
)
search_filters = {
key: value
for key, value in dependency.items()
if key in ("authors", "platforms", "frameworks")
}
return self._install(spec, search_filters=search_filters or None, silent=silent)
def uninstall_dependencies(self, pkg, silent=False):
assert isinstance(pkg, PackageItem)
manifest = self.load_manifest(pkg)
if not manifest.get("dependencies"):
return
if not silent:
self.print_message("Removing dependencies...", fg="yellow")
for dependency in manifest.get("dependencies"):
pkg = self.get_package(
PackageSpec(
name=dependency.get("name"), requirements=dependency.get("version")
)
)
if not pkg:
continue
self._uninstall(pkg, silent=silent)
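# Small self-contained check (hypothetical layout, module path assumed) of the
# root-detection rule above: sources living next to a "src" folder put the
# library root one level up.
import os
import tempfile
from platformio.package.manager.library import LibraryPackageManager
tmp = tempfile.mkdtemp()
os.makedirs(os.path.join(tmp, "FooLib-1.0", "src"))
open(os.path.join(tmp, "FooLib-1.0", "src", "FooLib.cpp"), "w").close()
print(LibraryPackageManager.find_library_root(tmp))  # .../FooLib-1.0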

View File

@@ -0,0 +1,195 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from platformio import util
from platformio.clients.http import HTTPClientError, InternetIsOffline
from platformio.package.exception import UnknownPackageError
from platformio.package.manager.base import BasePackageManager
from platformio.package.manager.tool import ToolPackageManager
from platformio.package.meta import PackageType
from platformio.platform.exception import IncompatiblePlatform, UnknownBoard
from platformio.platform.factory import PlatformFactory
from platformio.project.config import ProjectConfig
class PlatformPackageManager(BasePackageManager): # pylint: disable=too-many-ancestors
def __init__(self, package_dir=None):
self.config = ProjectConfig.get_instance()
super(PlatformPackageManager, self).__init__(
PackageType.PLATFORM,
package_dir or self.config.get_optional_dir("platforms"),
)
@property
def manifest_names(self):
return PackageType.get_manifest_map()[PackageType.PLATFORM]
def install( # pylint: disable=arguments-differ, too-many-arguments
self,
spec,
with_packages=None,
without_packages=None,
skip_default_package=False,
with_all_packages=False,
silent=False,
force=False,
):
pkg = super(PlatformPackageManager, self).install(
spec, silent=silent, force=force, skip_dependencies=True
)
try:
p = PlatformFactory.new(pkg)
p.ensure_engine_compatible()
except IncompatiblePlatform as e:
super(PlatformPackageManager, self).uninstall(
pkg, silent=silent, skip_dependencies=True
)
raise e
if with_all_packages:
with_packages = list(p.packages)
p.install_packages(
with_packages,
without_packages,
skip_default_package,
silent=silent,
force=force,
)
p.install_python_packages()
p.on_installed()
self.cleanup_packages(list(p.packages))
return pkg
def uninstall(self, spec, silent=False, skip_dependencies=False):
pkg = self.get_package(spec)
if not pkg or not pkg.metadata:
raise UnknownPackageError(spec)
p = PlatformFactory.new(pkg)
assert super(PlatformPackageManager, self).uninstall(
pkg, silent=silent, skip_dependencies=True
)
if not skip_dependencies:
p.uninstall_python_packages()
p.on_uninstalled()
self.cleanup_packages(list(p.packages))
return pkg
def update( # pylint: disable=arguments-differ, too-many-arguments
self,
from_spec,
to_spec=None,
only_check=False,
silent=False,
show_incompatible=True,
only_packages=False,
):
pkg = self.get_package(from_spec)
if not pkg or not pkg.metadata:
raise UnknownPackageError(from_spec)
p = PlatformFactory.new(pkg)
pkgs_before = [item.metadata.name for item in p.get_installed_packages()]
new_pkg = None
missed_pkgs = set()
if not only_packages:
new_pkg = super(PlatformPackageManager, self).update(
from_spec,
to_spec,
only_check=only_check,
silent=silent,
show_incompatible=show_incompatible,
)
p = PlatformFactory.new(new_pkg)
missed_pkgs = set(pkgs_before) & set(p.packages)
missed_pkgs -= set(
item.metadata.name for item in p.get_installed_packages()
)
p.update_packages(only_check)
self.cleanup_packages(list(p.packages))
if missed_pkgs:
p.install_packages(
with_packages=list(missed_pkgs), skip_default_package=True
)
return new_pkg or pkg
def cleanup_packages(self, names):
self.memcache_reset()
deppkgs = {}
for platform in PlatformPackageManager().get_installed():
p = PlatformFactory.new(platform)
for pkg in p.get_installed_packages():
if pkg.metadata.name not in deppkgs:
deppkgs[pkg.metadata.name] = set()
deppkgs[pkg.metadata.name].add(pkg.metadata.version)
pm = ToolPackageManager()
for pkg in pm.get_installed():
if pkg.metadata.name not in names:
continue
if (
pkg.metadata.name not in deppkgs
or pkg.metadata.version not in deppkgs[pkg.metadata.name]
):
try:
pm.uninstall(pkg.metadata.spec)
except UnknownPackageError:
pass
self.memcache_reset()
return True
@util.memoized(expire="5s")
def get_installed_boards(self):
boards = []
for pkg in self.get_installed():
p = PlatformFactory.new(pkg)
for config in p.get_boards().values():
board = config.get_brief_data()
if board not in boards:
boards.append(board)
return boards
def get_registered_boards(self):
return self.get_registry_client_instance().fetch_json_data(
"get", "/v2/boards", cache_valid="1d"
)
def get_all_boards(self):
boards = self.get_installed_boards()
know_boards = ["%s:%s" % (b["platform"], b["id"]) for b in boards]
try:
for board in self.get_registered_boards():
key = "%s:%s" % (board["platform"], board["id"])
if key not in know_boards:
boards.append(board)
except (HTTPClientError, InternetIsOffline):
pass
return sorted(boards, key=lambda b: b["name"])
def board_config(self, id_, platform=None):
for manifest in self.get_installed_boards():
if manifest["id"] == id_ and (
not platform or manifest["platform"] == platform
):
return manifest
for manifest in self.get_registered_boards():
if manifest["id"] == id_ and (
not platform or manifest["platform"] == platform
):
return manifest
raise UnknownBoard(id_)
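A hedged usage sketch for the board helpers above; it assumes at least one development platform is already installed locally and that "uno" is a known board id:

    from platformio.package.manager.platform import PlatformPackageManager

    pm = PlatformPackageManager()
    for board in pm.get_installed_boards():
        print(board["platform"], board["id"], board["name"])

    # falls back to the registry data and raises UnknownBoard if the id is unknown
    config = pm.board_config("uno")
    print(config["name"])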

View File

@@ -0,0 +1,28 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from platformio.package.manager.base import BasePackageManager
from platformio.package.meta import PackageType
from platformio.project.config import ProjectConfig
class ToolPackageManager(BasePackageManager): # pylint: disable=too-many-ancestors
def __init__(self, package_dir=None):
if not package_dir:
package_dir = ProjectConfig.get_instance().get_optional_dir("packages")
super(ToolPackageManager, self).__init__(PackageType.TOOL, package_dir)
@property
def manifest_names(self):
return PackageType.get_manifest_map()[PackageType.TOOL]

View File

@@ -17,10 +17,10 @@ import io
import json import json
import os import os
import re import re
import tarfile
import requests
from platformio import util from platformio import util
from platformio.clients.http import fetch_remote_content
from platformio.compat import get_object_members, string_types from platformio.compat import get_object_members, string_types
from platformio.package.exception import ManifestParserError, UnknownManifestError from platformio.package.exception import ManifestParserError, UnknownManifestError
from platformio.project.helpers import is_platformio_project from platformio.project.helpers import is_platformio_project
@@ -40,7 +40,7 @@ class ManifestFileType(object):
@classmethod @classmethod
def items(cls): def items(cls):
return get_object_members(ManifestFileType) return get_object_members(cls)
@classmethod @classmethod
def from_uri(cls, uri): def from_uri(cls, uri):
@@ -60,8 +60,14 @@ class ManifestFileType(object):
class ManifestParserFactory(object): class ManifestParserFactory(object):
@staticmethod @staticmethod
def read_manifest_contents(path): def read_manifest_contents(path):
with io.open(path, encoding="utf-8") as fp: last_err = None
for encoding in ("utf-8", "latin-1"):
try:
with io.open(path, encoding=encoding) as fp:
return fp.read() return fp.read()
except UnicodeDecodeError as e:
last_err = e
raise last_err # pylint: disable=raising-bad-type
@classmethod @classmethod
def new_from_file(cls, path, remote_url=False): def new_from_file(cls, path, remote_url=False):
@@ -101,14 +107,26 @@ class ManifestParserFactory(object):
@staticmethod @staticmethod
def new_from_url(remote_url): def new_from_url(remote_url):
r = requests.get(remote_url) content = fetch_remote_content(remote_url)
r.raise_for_status()
return ManifestParserFactory.new( return ManifestParserFactory.new(
r.text, content,
ManifestFileType.from_uri(remote_url) or ManifestFileType.LIBRARY_JSON, ManifestFileType.from_uri(remote_url) or ManifestFileType.LIBRARY_JSON,
remote_url, remote_url,
) )
@staticmethod
def new_from_archive(path):
assert path.endswith("tar.gz")
with tarfile.open(path, mode="r:gz") as tf:
for t in sorted(ManifestFileType.items().values()):
try:
return ManifestParserFactory.new(
tf.extractfile(t).read().decode(), t
)
except KeyError:
pass
raise UnknownManifestError("Unknown manifest file type in %s archive" % path)
@staticmethod @staticmethod
def new( # pylint: disable=redefined-builtin def new( # pylint: disable=redefined-builtin
contents, type, remote_url=None, package_dir=None contents, type, remote_url=None, package_dir=None
@@ -148,10 +166,27 @@ class BaseManifestParser(object):
return self._data return self._data
@staticmethod @staticmethod
def normalize_author(author): def str_to_list(value, sep=",", lowercase=True):
if isinstance(value, string_types):
value = value.split(sep)
assert isinstance(value, list)
result = []
for item in value:
item = item.strip()
if not item:
continue
if lowercase:
item = item.lower()
result.append(item)
return result
@staticmethod
def cleanup_author(author):
assert isinstance(author, dict) assert isinstance(author, dict)
if author.get("email"): if author.get("email"):
author["email"] = re.sub(r"\s+[aA][tT]\s+", "@", author["email"]) author["email"] = re.sub(r"\s+[aA][tT]\s+", "@", author["email"])
if "@" not in author["email"]:
author["email"] = None
for key in list(author.keys()): for key in list(author.keys()):
if author[key] is None: if author[key] is None:
del author[key] del author[key]
@@ -163,10 +198,13 @@ class BaseManifestParser(object):
return (None, None) return (None, None)
name = raw name = raw
email = None email = None
for ldel, rdel in [("<", ">"), ("(", ")")]: ldel = "<"
rdel = ">"
if ldel in raw and rdel in raw: if ldel in raw and rdel in raw:
name = raw[: raw.index(ldel)] name = raw[: raw.index(ldel)]
email = raw[raw.index(ldel) + 1 : raw.index(rdel)] email = raw[raw.index(ldel) + 1 : raw.index(rdel)]
if "(" in name:
name = name.split("(")[0]
return (name.strip(), email.strip() if email else None) return (name.strip(), email.strip() if email else None)
@staticmethod @staticmethod
@@ -284,7 +322,7 @@ class LibraryJsonManifestParser(BaseManifestParser):
# normalize Union[str, list] fields # normalize Union[str, list] fields
for k in ("keywords", "platforms", "frameworks"): for k in ("keywords", "platforms", "frameworks"):
if k in data: if k in data:
data[k] = self._str_to_list(data[k], sep=",") data[k] = self.str_to_list(data[k], sep=",")
if "authors" in data: if "authors" in data:
data["authors"] = self._parse_authors(data["authors"]) data["authors"] = self._parse_authors(data["authors"])
@@ -297,21 +335,6 @@ class LibraryJsonManifestParser(BaseManifestParser):
return data return data
@staticmethod
def _str_to_list(value, sep=",", lowercase=True):
if isinstance(value, string_types):
value = value.split(sep)
assert isinstance(value, list)
result = []
for item in value:
item = item.strip()
if not item:
continue
if lowercase:
item = item.lower()
result.append(item)
return result
@staticmethod @staticmethod
def _process_renamed_fields(data): def _process_renamed_fields(data):
if "url" in data: if "url" in data:
@@ -334,7 +357,7 @@ class LibraryJsonManifestParser(BaseManifestParser):
# normalize Union[dict, list] fields # normalize Union[dict, list] fields
if not isinstance(raw, list): if not isinstance(raw, list):
raw = [raw] raw = [raw]
return [self.normalize_author(author) for author in raw] return [self.cleanup_author(author) for author in raw]
@staticmethod @staticmethod
def _parse_platforms(raw): def _parse_platforms(raw):
@@ -372,8 +395,6 @@ class LibraryJsonManifestParser(BaseManifestParser):
for k, v in dependency.items(): for k, v in dependency.items():
if k not in ("platforms", "frameworks", "authors"): if k not in ("platforms", "frameworks", "authors"):
continue continue
if "*" in v:
del raw[i][k]
raw[i][k] = util.items_to_list(v) raw[i][k] = util.items_to_list(v)
else: else:
raw[i] = {"name": dependency} raw[i] = {"name": dependency}
@@ -399,6 +420,8 @@ class ModuleJsonManifestParser(BaseManifestParser):
del data["licenses"] del data["licenses"]
if "dependencies" in data: if "dependencies" in data:
data["dependencies"] = self._parse_dependencies(data["dependencies"]) data["dependencies"] = self._parse_dependencies(data["dependencies"])
if "keywords" in data:
data["keywords"] = self.str_to_list(data["keywords"], sep=",")
return data return data
def _parse_authors(self, raw): def _parse_authors(self, raw):
@@ -409,7 +432,7 @@ class ModuleJsonManifestParser(BaseManifestParser):
name, email = self.parse_author_name_and_email(author) name, email = self.parse_author_name_and_email(author)
if not name: if not name:
continue continue
result.append(self.normalize_author(dict(name=name, email=email))) result.append(self.cleanup_author(dict(name=name, email=email)))
return result return result
@staticmethod @staticmethod
@@ -450,7 +473,9 @@ class LibraryPropertiesManifestParser(BaseManifestParser):
) )
if "author" in data: if "author" in data:
data["authors"] = self._parse_authors(data) data["authors"] = self._parse_authors(data)
del data["author"] for key in ("author", "maintainer"):
if key in data:
del data[key]
if "depends" in data: if "depends" in data:
data["dependencies"] = self._parse_dependencies(data["depends"]) data["dependencies"] = self._parse_dependencies(data["depends"])
return data return data
@@ -466,6 +491,8 @@ class LibraryPropertiesManifestParser(BaseManifestParser):
if line.startswith("#"): if line.startswith("#"):
continue continue
key, value = line.split("=", 1) key, value = line.split("=", 1)
if not value.strip():
continue
data[key.strip()] = value.strip() data[key.strip()] = value.strip()
return data return data
@@ -521,7 +548,7 @@ class LibraryPropertiesManifestParser(BaseManifestParser):
name, email = self.parse_author_name_and_email(author) name, email = self.parse_author_name_and_email(author)
if not name: if not name:
continue continue
authors.append(self.normalize_author(dict(name=name, email=email))) authors.append(self.cleanup_author(dict(name=name, email=email)))
for author in properties.get("maintainer", "").split(","): for author in properties.get("maintainer", "").split(","):
name, email = self.parse_author_name_and_email(author) name, email = self.parse_author_name_and_email(author)
if not name: if not name:
@@ -532,11 +559,11 @@ class LibraryPropertiesManifestParser(BaseManifestParser):
continue continue
found = True found = True
item["maintainer"] = True item["maintainer"] = True
if not item.get("email") and email: if not item.get("email") and email and "@" in email:
item["email"] = email item["email"] = email
if not found: if not found:
authors.append( authors.append(
self.normalize_author(dict(name=name, email=email, maintainer=True)) self.cleanup_author(dict(name=name, email=email, maintainer=True))
) )
return authors return authors
@@ -605,6 +632,8 @@ class PlatformJsonManifestParser(BaseManifestParser):
def parse(self, contents): def parse(self, contents):
data = json.loads(contents) data = json.loads(contents)
if "keywords" in data:
data["keywords"] = self.str_to_list(data["keywords"], sep=",")
if "frameworks" in data: if "frameworks" in data:
data["frameworks"] = self._parse_frameworks(data["frameworks"]) data["frameworks"] = self._parse_frameworks(data["frameworks"])
if "packages" in data: if "packages" in data:
@@ -629,8 +658,11 @@ class PackageJsonManifestParser(BaseManifestParser):
def parse(self, contents): def parse(self, contents):
data = json.loads(contents) data = json.loads(contents)
if "keywords" in data:
data["keywords"] = self.str_to_list(data["keywords"], sep=",")
data = self._parse_system(data) data = self._parse_system(data)
data = self._parse_homepage(data) data = self._parse_homepage(data)
data = self._parse_repository(data)
return data return data
@staticmethod @staticmethod
@@ -651,3 +683,14 @@ class PackageJsonManifestParser(BaseManifestParser):
data["homepage"] = data["url"] data["homepage"] = data["url"]
del data["url"] del data["url"]
return data return data
@staticmethod
def _parse_repository(data):
if isinstance(data.get("repository", {}), dict):
return data
data["repository"] = dict(type="git", url=str(data["repository"]))
if data["repository"]["url"].startswith(("github:", "gitlab:", "bitbucket:")):
data["repository"]["url"] = "https://{0}.com/{1}".format(
*(data["repository"]["url"].split(":", 1))
)
return data

View File

@@ -14,11 +14,14 @@
# pylint: disable=too-many-ancestors # pylint: disable=too-many-ancestors
import json
import marshmallow import marshmallow
import requests import requests
import semantic_version import semantic_version
from marshmallow import Schema, ValidationError, fields, validate, validates from marshmallow import Schema, ValidationError, fields, validate, validates
from platformio.clients.http import fetch_remote_content
from platformio.package.exception import ManifestValidationError from platformio.package.exception import ManifestValidationError
from platformio.util import memoized from platformio.util import memoized
@@ -84,7 +87,7 @@ class StrictListField(fields.List):
class AuthorSchema(StrictSchema): class AuthorSchema(StrictSchema):
name = fields.Str(required=True, validate=validate.Length(min=1, max=50)) name = fields.Str(required=True, validate=validate.Length(min=1, max=100))
email = fields.Email(validate=validate.Length(min=1, max=50)) email = fields.Email(validate=validate.Length(min=1, max=50))
maintainer = fields.Bool(default=False) maintainer = fields.Bool(default=False)
url = fields.Url(validate=validate.Length(min=1, max=255)) url = fields.Url(validate=validate.Length(min=1, max=255))
@@ -149,7 +152,15 @@ class ExampleSchema(StrictSchema):
class ManifestSchema(BaseSchema): class ManifestSchema(BaseSchema):
# Required fields # Required fields
name = fields.Str(required=True, validate=validate.Length(min=1, max=100)) name = fields.Str(
required=True,
validate=[
validate.Length(min=1, max=100),
validate.Regexp(
r"^[^:;/,@\<\>]+$", error="The next chars [:;/,@<>] are not allowed"
),
],
)
version = fields.Str(required=True, validate=validate.Length(min=1, max=50)) version = fields.Str(required=True, validate=validate.Length(min=1, max=50))
# Optional fields # Optional fields
@@ -240,9 +251,9 @@ class ManifestSchema(BaseSchema):
@staticmethod @staticmethod
@memoized(expire="1h") @memoized(expire="1h")
def load_spdx_licenses(): def load_spdx_licenses():
r = requests.get( version = "3.10"
spdx_data_url = (
"https://raw.githubusercontent.com/spdx/license-list-data" "https://raw.githubusercontent.com/spdx/license-list-data"
"/v3.9/json/licenses.json" "/v%s/json/licenses.json" % version
) )
r.raise_for_status() return json.loads(fetch_remote_content(spdx_data_url))
return r.json()

433
platformio/package/meta.py Normal file
View File

@@ -0,0 +1,433 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import json
import os
import re
import tarfile
from binascii import crc32
import semantic_version
from platformio.compat import get_object_members, hashlib_encode_data, string_types
from platformio.package.manifest.parser import ManifestFileType
from platformio.package.version import cast_version_to_semver
try:
from urllib.parse import urlparse
except ImportError:
from urlparse import urlparse
class PackageType(object):
LIBRARY = "library"
PLATFORM = "platform"
TOOL = "tool"
@classmethod
def items(cls):
return get_object_members(cls)
@classmethod
def get_manifest_map(cls):
return {
cls.PLATFORM: (ManifestFileType.PLATFORM_JSON,),
cls.LIBRARY: (
ManifestFileType.LIBRARY_JSON,
ManifestFileType.LIBRARY_PROPERTIES,
ManifestFileType.MODULE_JSON,
),
cls.TOOL: (ManifestFileType.PACKAGE_JSON,),
}
@classmethod
def from_archive(cls, path):
assert path.endswith("tar.gz")
manifest_map = cls.get_manifest_map()
with tarfile.open(path, mode="r:gz") as tf:
for t in sorted(cls.items().values()):
for manifest in manifest_map[t]:
try:
if tf.getmember(manifest):
return t
except KeyError:
pass
return None
class PackageOutdatedResult(object):
def __init__(self, current, latest=None, wanted=None, detached=False):
self.current = current
self.latest = latest
self.wanted = wanted
self.detached = detached
def __repr__(self):
return (
"PackageOutdatedResult <current={current} latest={latest} wanted={wanted} "
"detached={detached}>".format(
current=self.current,
latest=self.latest,
wanted=self.wanted,
detached=self.detached,
)
)
def __setattr__(self, name, value):
if (
value
and name in ("current", "latest", "wanted")
and not isinstance(value, semantic_version.Version)
):
value = cast_version_to_semver(str(value))
return super(PackageOutdatedResult, self).__setattr__(name, value)
def is_outdated(self, allow_incompatible=False):
if self.detached or not self.latest or self.current == self.latest:
return False
if allow_incompatible:
return self.current != self.latest
if self.wanted:
return self.current != self.wanted
return True
class PackageSpec(object): # pylint: disable=too-many-instance-attributes
def __init__( # pylint: disable=redefined-builtin,too-many-arguments
self, raw=None, owner=None, id=None, name=None, requirements=None, url=None
):
self.owner = owner
self.id = id
self.name = name
self._requirements = None
self.url = url
self.raw = raw
if requirements:
self.requirements = requirements
self._name_is_custom = False
self._parse(raw)
def __eq__(self, other):
return all(
[
self.owner == other.owner,
self.id == other.id,
self.name == other.name,
self.requirements == other.requirements,
self.url == other.url,
]
)
def __hash__(self):
return crc32(
hashlib_encode_data(
"%s-%s-%s-%s-%s"
% (self.owner, self.id, self.name, self.requirements, self.url)
)
)
def __repr__(self):
return (
"PackageSpec <owner={owner} id={id} name={name} "
"requirements={requirements} url={url}>".format(**self.as_dict())
)
@property
def external(self):
return bool(self.url)
@property
def requirements(self):
return self._requirements
@requirements.setter
def requirements(self, value):
if not value:
self._requirements = None
return
self._requirements = (
value
if isinstance(value, semantic_version.SimpleSpec)
else semantic_version.SimpleSpec(str(value))
)
def humanize(self):
result = ""
if self.url:
result = self.url
elif self.name:
if self.owner:
result = self.owner + "/"
result += self.name
elif self.id:
result = "id:%d" % self.id
if self.requirements:
result += " @ " + str(self.requirements)
return result
def has_custom_name(self):
return self._name_is_custom
def as_dict(self):
return dict(
owner=self.owner,
id=self.id,
name=self.name,
requirements=str(self.requirements) if self.requirements else None,
url=self.url,
)
def as_dependency(self):
if self.url:
return self.raw or self.url
result = ""
if self.name:
result = "%s/%s" % (self.owner, self.name) if self.owner else self.name
elif self.id:
result = str(self.id)
assert result
if self.requirements:
result = "%s@%s" % (result, self.requirements)
return result
def _parse(self, raw):
if raw is None:
return
if not isinstance(raw, string_types):
raw = str(raw)
raw = raw.strip()
parsers = (
self._parse_requirements,
self._parse_custom_name,
self._parse_id,
self._parse_owner,
self._parse_url,
)
for parser in parsers:
if raw is None:
break
raw = parser(raw)
# if name is not custom, parse it from URL
if not self.name and self.url:
self.name = self._parse_name_from_url(self.url)
elif raw:
# the leftover is a package name
self.name = raw
def _parse_requirements(self, raw):
if "@" not in raw:
return raw
if raw.startswith("file://") and os.path.exists(raw[7:]):
return raw
tokens = raw.rsplit("@", 1)
if any(s in tokens[1] for s in (":", "/")):
return raw
self.requirements = tokens[1].strip()
return tokens[0].strip()
def _parse_custom_name(self, raw):
if "=" not in raw or raw.startswith("id="):
return raw
tokens = raw.split("=", 1)
if "/" in tokens[0]:
return raw
self.name = tokens[0].strip()
self._name_is_custom = True
return tokens[1].strip()
def _parse_id(self, raw):
if raw.isdigit():
self.id = int(raw)
return None
if raw.startswith("id="):
return self._parse_id(raw[3:])
return raw
def _parse_owner(self, raw):
if raw.count("/") != 1 or "@" in raw:
return raw
tokens = raw.split("/", 1)
self.owner = tokens[0].strip()
self.name = tokens[1].strip()
return None
def _parse_url(self, raw):
if not any(s in raw for s in ("@", ":", "/")):
return raw
self.url = raw.strip()
parts = urlparse(self.url)
# if local file or valid URL with scheme vcs+protocol://
if parts.scheme == "file" or "+" in parts.scheme or self.url.startswith("git+"):
return None
# parse VCS
git_conditions = [
parts.path.endswith(".git"),
# Handle GitHub URL (https://github.com/user/package)
parts.netloc in ("github.com", "gitlab.com", "bitbucket.com")
and not parts.path.endswith((".zip", ".tar.gz")),
]
hg_conditions = [
# Handle Developer Mbed URL
# (https://developer.mbed.org/users/user/code/package/)
# (https://os.mbed.com/users/user/code/package/)
parts.netloc
in ("mbed.com", "os.mbed.com", "developer.mbed.org")
]
if any(git_conditions):
self.url = "git+" + self.url
elif any(hg_conditions):
self.url = "hg+" + self.url
return None
@staticmethod
def _parse_name_from_url(url):
if url.endswith("/"):
url = url[:-1]
stop_chars = ["#", "?"]
if url.startswith("file://"):
stop_chars.append("@") # detached path
for c in stop_chars:
if c in url:
url = url[: url.index(c)]
# parse real repository name from Github
parts = urlparse(url)
if parts.netloc == "github.com" and parts.path.count("/") > 2:
return parts.path.split("/")[2]
name = os.path.basename(url)
if "." in name:
return name.split(".", 1)[0].strip()
return name
class PackageMetaData(object):
def __init__( # pylint: disable=redefined-builtin
self, type, name, version, spec=None
):
# assert type in PackageType.items().values()
if spec:
assert isinstance(spec, PackageSpec)
self.type = type
self.name = name
self._version = None
self.version = version
self.spec = spec
def __repr__(self):
return (
"PackageMetaData <type={type} name={name} version={version} "
"spec={spec}".format(**self.as_dict())
)
def __eq__(self, other):
return all(
[
self.type == other.type,
self.name == other.name,
self.version == other.version,
self.spec == other.spec,
]
)
@property
def version(self):
return self._version
@version.setter
def version(self, value):
if not value:
self._version = None
return
self._version = (
value
if isinstance(value, semantic_version.Version)
else cast_version_to_semver(value)
)
def as_dict(self):
return dict(
type=self.type,
name=self.name,
version=str(self.version),
spec=self.spec.as_dict() if self.spec else None,
)
def dump(self, path):
with open(path, "w") as fp:
return json.dump(self.as_dict(), fp)
@staticmethod
def load(path):
with open(path) as fp:
data = json.load(fp)
if data["spec"]:
data["spec"] = PackageSpec(**data["spec"])
return PackageMetaData(**data)
class PackageItem(object):
METAFILE_NAME = ".piopm"
def __init__(self, path, metadata=None):
self.path = path
self.metadata = metadata
if not self.metadata and self.exists():
self.metadata = self.load_meta()
def __repr__(self):
return "PackageItem <path={path} metadata={metadata}".format(
path=self.path, metadata=self.metadata
)
def __eq__(self, other):
return all([self.path == other.path, self.metadata == other.metadata])
def exists(self):
return os.path.isdir(self.path)
def get_safe_dirname(self):
assert self.metadata
return re.sub(r"[^\da-z\_\-\. ]", "_", self.metadata.name, flags=re.I)
def get_metafile_locations(self):
return [
os.path.join(self.path, ".git"),
os.path.join(self.path, ".hg"),
os.path.join(self.path, ".svn"),
self.path,
]
def load_meta(self):
assert self.exists()
for location in self.get_metafile_locations():
manifest_path = os.path.join(location, self.METAFILE_NAME)
if os.path.isfile(manifest_path):
return PackageMetaData.load(manifest_path)
return None
def dump_meta(self):
assert self.exists()
location = None
for location in self.get_metafile_locations():
if os.path.isdir(location):
break
assert location
return self.metadata.dump(os.path.join(location, self.METAFILE_NAME))
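A few round-trips through the new meta classes above; the owner/package names and the temporary file path are illustrative:

    import os
    import tempfile

    from platformio.package.meta import PackageMetaData, PackageSpec, PackageType

    # registry spec: owner, name and a SemVer requirement are split out
    spec = PackageSpec("bblanchon/ArduinoJson @ ^6.17.0")
    print(spec.owner, spec.name, spec.requirements)  # bblanchon ArduinoJson ^6.17.0

    # external spec: a plain GitHub URL is promoted to a "git+" URL
    spec = PackageSpec("https://github.com/bblanchon/ArduinoJson.git")
    print(spec.url)   # git+https://github.com/bblanchon/ArduinoJson.git
    print(spec.name)  # ArduinoJson (derived from the repository name)

    # metadata is persisted as JSON (the ".piopm" file inside an installed package)
    meta = PackageMetaData(PackageType.LIBRARY, "ArduinoJson", "6.17.0", spec=spec)
    path = os.path.join(tempfile.mkdtemp(), ".piopm")
    meta.dump(path)
    print(PackageMetaData.load(path) == meta)  # True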

View File

@@ -12,6 +12,7 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
import json
import os import os
import re import re
import shutil import shutil
@@ -22,17 +23,21 @@ from platformio import fs
from platformio.package.exception import PackageException from platformio.package.exception import PackageException
from platformio.package.manifest.parser import ManifestFileType, ManifestParserFactory from platformio.package.manifest.parser import ManifestFileType, ManifestParserFactory
from platformio.package.manifest.schema import ManifestSchema from platformio.package.manifest.schema import ManifestSchema
from platformio.unpacker import FileUnpacker from platformio.package.meta import PackageItem
from platformio.package.unpack import FileUnpacker
class PackagePacker(object): class PackagePacker(object):
EXCLUDE_DEFAULT = [ EXCLUDE_DEFAULT = [
"._*", "._*",
"__*",
".DS_Store", ".DS_Store",
".git", ".git/",
".hg", ".hg/",
".svn", ".svn/",
".pio", ".pio/",
"**/.pio/",
PackageItem.METAFILE_NAME,
] ]
INCLUDE_DEFAULT = ManifestFileType.items().values() INCLUDE_DEFAULT = ManifestFileType.items().values()
@@ -40,6 +45,16 @@ class PackagePacker(object):
self.package = package self.package = package
self.manifest_uri = manifest_uri self.manifest_uri = manifest_uri
@staticmethod
def get_archive_name(name, version, system=None):
return re.sub(
r"[^\da-zA-Z\-\._\+]+",
"",
"{name}{system}-{version}.tar.gz".format(
name=name, system=("-" + system) if system else "", version=version,
),
)
def pack(self, dst=None): def pack(self, dst=None):
tmp_dir = tempfile.mkdtemp() tmp_dir = tempfile.mkdtemp()
try: try:
@@ -54,14 +69,10 @@ class PackagePacker(object):
src = self.find_source_root(src) src = self.find_source_root(src)
manifest = self.load_manifest(src) manifest = self.load_manifest(src)
filename = re.sub( filename = self.get_archive_name(
r"[^\da-zA-Z\-\._]+", manifest["name"],
"", manifest["version"],
"{name}{system}-{version}.tar.gz".format( manifest["system"][0] if "system" in manifest else None,
name=manifest["name"],
system="-" + manifest["system"][0] if "system" in manifest else "",
version=manifest["version"],
),
) )
if not dst: if not dst:
@@ -69,12 +80,7 @@ class PackagePacker(object):
elif os.path.isdir(dst): elif os.path.isdir(dst):
dst = os.path.join(dst, filename) dst = os.path.join(dst, filename)
return self._create_tarball( return self._create_tarball(src, dst, manifest)
src,
dst,
include=manifest.get("export", {}).get("include"),
exclude=manifest.get("export", {}).get("exclude"),
)
finally: finally:
shutil.rmtree(tmp_dir) shutil.rmtree(tmp_dir)
@@ -106,7 +112,9 @@ class PackagePacker(object):
return src return src
def _create_tarball(self, src, dst, include=None, exclude=None): def _create_tarball(self, src, dst, manifest):
include = manifest.get("export", {}).get("include")
exclude = manifest.get("export", {}).get("exclude")
# remap root # remap root
if ( if (
include include
@@ -114,6 +122,10 @@ class PackagePacker(object):
and os.path.isdir(os.path.join(src, include[0])) and os.path.isdir(os.path.join(src, include[0]))
): ):
src = os.path.join(src, include[0]) src = os.path.join(src, include[0])
with open(os.path.join(src, "library.json"), "w") as fp:
manifest_updated = manifest.copy()
del manifest_updated["export"]["include"]
json.dump(manifest_updated, fp, indent=2, ensure_ascii=False)
include = None include = None
src_filters = self.compute_src_filters(include, exclude) src_filters = self.compute_src_filters(include, exclude)
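The new get_archive_name() helper above centralizes tarball naming; a quick sketch (package name, version and system values are illustrative):

    from platformio.package.pack import PackagePacker

    print(PackagePacker.get_archive_name("tool-openocd", "2.1000.0", "linux_x86_64"))
    # tool-openocd-linux_x86_64-2.1000.0.tar.gz
    print(PackagePacker.get_archive_name("My Library!", "1.0.0"))
    # MyLibrary-1.0.0.tar.gz  (characters outside [0-9A-Za-z-._+] are stripped)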

View File

@@ -19,10 +19,19 @@ from zipfile import ZipFile
import click import click
from platformio import exception, util from platformio import fs
from platformio.package.exception import PackageException
class ArchiveBase(object): class ExtractArchiveItemError(PackageException):
MESSAGE = (
"Could not extract `{0}` to `{1}`. Try to disable antivirus "
"tool or check this solution -> http://bit.ly/faq-package-manager"
)
class BaseArchiver(object):
def __init__(self, arhfileobj): def __init__(self, arhfileobj):
self._afo = arhfileobj self._afo = arhfileobj
@@ -46,9 +55,9 @@ class ArchiveBase(object):
self._afo.close() self._afo.close()
class TARArchive(ArchiveBase): class TARArchiver(BaseArchiver):
def __init__(self, archpath): def __init__(self, archpath):
super(TARArchive, self).__init__(tarfile_open(archpath)) super(TARArchiver, self).__init__(tarfile_open(archpath))
def get_items(self): def get_items(self):
return self._afo.getmembers() return self._afo.getmembers()
@@ -79,7 +88,7 @@ class TARArchive(ArchiveBase):
self.is_link(item) and self.is_bad_link(item, dest_dir), self.is_link(item) and self.is_bad_link(item, dest_dir),
] ]
if not any(bad_conds): if not any(bad_conds):
super(TARArchive, self).extract_item(item, dest_dir) super(TARArchiver, self).extract_item(item, dest_dir)
else: else:
click.secho( click.secho(
"Blocked insecure item `%s` from TAR archive" % item.name, "Blocked insecure item `%s` from TAR archive" % item.name,
@@ -88,9 +97,9 @@ class TARArchive(ArchiveBase):
) )
class ZIPArchive(ArchiveBase): class ZIPArchiver(BaseArchiver):
def __init__(self, archpath): def __init__(self, archpath):
super(ZIPArchive, self).__init__(ZipFile(archpath)) super(ZIPArchiver, self).__init__(ZipFile(archpath))
@staticmethod @staticmethod
def preserve_permissions(item, dest_dir): def preserve_permissions(item, dest_dir):
@@ -100,7 +109,7 @@ class ZIPArchive(ArchiveBase):
@staticmethod @staticmethod
def preserve_mtime(item, dest_dir): def preserve_mtime(item, dest_dir):
util.change_filemtime( fs.change_filemtime(
os.path.join(dest_dir, item.filename), os.path.join(dest_dir, item.filename),
mktime(tuple(item.date_time) + tuple([0, 0, 0])), mktime(tuple(item.date_time) + tuple([0, 0, 0])),
) )
@@ -121,48 +130,59 @@ class ZIPArchive(ArchiveBase):
class FileUnpacker(object): class FileUnpacker(object):
def __init__(self, archpath): def __init__(self, path):
self.archpath = archpath self.path = path
self._unpacker = None self._archiver = None
def _init_archiver(self):
magic_map = {
b"\x1f\x8b\x08": TARArchiver,
b"\x42\x5a\x68": TARArchiver,
b"\x50\x4b\x03\x04": ZIPArchiver,
}
magic_len = max(len(k) for k in magic_map)
with open(self.path, "rb") as fp:
data = fp.read(magic_len)
for magic, archiver in magic_map.items():
if data.startswith(magic):
return archiver(self.path)
raise PackageException("Unknown archive type '%s'" % self.path)
def __enter__(self): def __enter__(self):
if self.archpath.lower().endswith((".gz", ".bz2", ".tar")): self._archiver = self._init_archiver()
self._unpacker = TARArchive(self.archpath)
elif self.archpath.lower().endswith(".zip"):
self._unpacker = ZIPArchive(self.archpath)
if not self._unpacker:
raise exception.UnsupportedArchiveType(self.archpath)
return self return self
def __exit__(self, *args): def __exit__(self, *args):
if self._unpacker: if self._archiver:
self._unpacker.close() self._archiver.close()
def unpack( def unpack(
self, dest_dir=".", with_progress=True, check_unpacked=True, silent=False self, dest_dir=None, with_progress=True, check_unpacked=True, silent=False
): ):
assert self._unpacker assert self._archiver
if not dest_dir:
dest_dir = os.getcwd()
if not with_progress or silent: if not with_progress or silent:
if not silent: if not silent:
click.echo("Unpacking...") click.echo("Unpacking...")
for item in self._unpacker.get_items(): for item in self._archiver.get_items():
self._unpacker.extract_item(item, dest_dir) self._archiver.extract_item(item, dest_dir)
else: else:
items = self._unpacker.get_items() items = self._archiver.get_items()
with click.progressbar(items, label="Unpacking") as pb: with click.progressbar(items, label="Unpacking") as pb:
for item in pb: for item in pb:
self._unpacker.extract_item(item, dest_dir) self._archiver.extract_item(item, dest_dir)
if not check_unpacked: if not check_unpacked:
return True return True
# check on disk # check on disk
for item in self._unpacker.get_items(): for item in self._archiver.get_items():
filename = self._unpacker.get_item_filename(item) filename = self._archiver.get_item_filename(item)
item_path = os.path.join(dest_dir, filename) item_path = os.path.join(dest_dir, filename)
try: try:
if not self._unpacker.is_link(item) and not os.path.exists(item_path): if not self._archiver.is_link(item) and not os.path.exists(item_path):
raise exception.ExtractArchiveItemError(filename, dest_dir) raise ExtractArchiveItemError(filename, dest_dir)
except NotImplementedError: except NotImplementedError:
pass pass
return True return True
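A minimal sketch of the renamed FileUnpacker above; archive detection is now based on magic bytes rather than the file extension (the paths are illustrative):

    from platformio.package.unpack import FileUnpacker

    with FileUnpacker("downloads/tool-openocd-2.1000.0.tar.gz") as fu:
        # gzip/bzip2 tarballs and ZIPs are dispatched by their magic bytes;
        # anything else raises PackageException("Unknown archive type ...")
        fu.unpack("packages/tool-openocd")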

View File

@@ -17,7 +17,11 @@ from os.path import join
from subprocess import CalledProcessError, check_call from subprocess import CalledProcessError, check_call
from sys import modules from sys import modules
from platformio.exception import PlatformioException, UserSideException from platformio.package.exception import (
PackageException,
PlatformioException,
UserSideException,
)
from platformio.proc import exec_command from platformio.proc import exec_command
try: try:
@@ -26,9 +30,13 @@ except ImportError:
from urlparse import urlparse from urlparse import urlparse
class VCSBaseException(PackageException):
pass
class VCSClientFactory(object): class VCSClientFactory(object):
@staticmethod @staticmethod
def newClient(src_dir, remote_url, silent=False): def new(src_dir, remote_url, silent=False):
result = urlparse(remote_url) result = urlparse(remote_url)
type_ = result.scheme type_ = result.scheme
tag = None tag = None
@@ -41,12 +49,15 @@ class VCSClientFactory(object):
if "#" in remote_url: if "#" in remote_url:
remote_url, tag = remote_url.rsplit("#", 1) remote_url, tag = remote_url.rsplit("#", 1)
if not type_: if not type_:
raise PlatformioException("VCS: Unknown repository type %s" % remote_url) raise VCSBaseException("VCS: Unknown repository type %s" % remote_url)
try:
obj = getattr(modules[__name__], "%sClient" % type_.title())( obj = getattr(modules[__name__], "%sClient" % type_.title())(
src_dir, remote_url, tag, silent src_dir, remote_url, tag, silent
) )
assert isinstance(obj, VCSClientBase) assert isinstance(obj, VCSClientBase)
return obj return obj
except (AttributeError, AssertionError):
raise VCSBaseException("VCS: Unknown repository type %s" % remote_url)
class VCSClientBase(object): class VCSClientBase(object):
@@ -101,7 +112,7 @@ class VCSClientBase(object):
check_call(args, **kwargs) check_call(args, **kwargs)
return True return True
except CalledProcessError as e: except CalledProcessError as e:
raise PlatformioException("VCS: Could not process command %s" % e.cmd) raise VCSBaseException("VCS: Could not process command %s" % e.cmd)
def get_cmd_output(self, args, **kwargs): def get_cmd_output(self, args, **kwargs):
args = [self.command] + args args = [self.command] + args
@@ -110,7 +121,7 @@ class VCSClientBase(object):
result = exec_command(args, **kwargs) result = exec_command(args, **kwargs)
if result["returncode"] == 0: if result["returncode"] == 0:
return result["out"].strip() return result["out"].strip()
raise PlatformioException( raise VCSBaseException(
"VCS: Could not receive an output from `%s` command (%s)" % (args, result) "VCS: Could not receive an output from `%s` command (%s)" % (args, result)
) )
@@ -227,7 +238,6 @@ class SvnClient(VCSClientBase):
return self.run_cmd(args) return self.run_cmd(args)
def update(self): def update(self):
args = ["update"] args = ["update"]
return self.run_cmd(args) return self.run_cmd(args)
@@ -239,4 +249,4 @@ class SvnClient(VCSClientBase):
line = line.strip() line = line.strip()
if line.startswith("Revision:"): if line.startswith("Revision:"):
return line.split(":", 1)[1].strip() return line.split(":", 1)[1].strip()
raise PlatformioException("Could not detect current SVN revision") raise VCSBaseException("Could not detect current SVN revision")
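A hedged sketch of the renamed factory above; the repository URL and destination are illustrative, and export() is assumed from the client classes defined later in the same module:

    from platformio.package.vcs import VCSClientFactory

    # "#v6.17.0" is split off as the tag/branch to check out; the "git+" prefix
    # selects the Git client, and an unknown scheme now raises VCSBaseException
    client = VCSClientFactory.new(
        "/tmp/ArduinoJson", "git+https://github.com/bblanchon/ArduinoJson.git#v6.17.0"
    )
    client.export()  # assumed API: clones the repository into /tmp/ArduinoJson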

View File

@@ -0,0 +1,53 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import re
import semantic_version
def cast_version_to_semver(value, force=True, raise_exception=False):
assert value
try:
return semantic_version.Version(value)
except ValueError:
pass
if force:
try:
return semantic_version.Version.coerce(value)
except ValueError:
pass
if raise_exception:
raise ValueError("Invalid SemVer version %s" % value)
# parse commit hash
if re.match(r"^[\da-f]+$", value, flags=re.I):
return semantic_version.Version("0.0.0+sha." + value)
return semantic_version.Version("0.0.0+" + value)
def pepver_to_semver(pepver):
return cast_version_to_semver(
re.sub(r"(\.\d+)\.?(dev|a|b|rc|post)", r"\1-\2.", pepver, 1)
)
def get_original_version(version):
if version.count(".") != 2:
return None
_, raw = version.split(".")[:2]
if int(raw) <= 99:
return None
if int(raw) <= 9999:
return "%s.%s" % (raw[:-2], int(raw[-2:]))
return "%s.%s.%s" % (raw[:-4], int(raw[-4:-2]), int(raw[-2:]))

View File

@@ -0,0 +1,13 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

View File

@@ -0,0 +1,137 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from platformio.package.exception import UnknownPackageError
from platformio.package.meta import PackageSpec
class PlatformPackagesMixin(object):
def get_package_spec(self, name):
version = self.packages[name].get("version", "")
if any(c in version for c in (":", "/", "@")):
return PackageSpec("%s=%s" % (name, version))
return PackageSpec(
owner=self.packages[name].get("owner"), name=name, requirements=version
)
def get_package(self, name):
if not name:
return None
return self.pm.get_package(self.get_package_spec(name))
def get_package_dir(self, name):
pkg = self.get_package(name)
return pkg.path if pkg else None
def get_package_version(self, name):
pkg = self.get_package(name)
return str(pkg.metadata.version) if pkg else None
def get_installed_packages(self):
result = []
for name in self.packages:
pkg = self.get_package(name)
if pkg:
result.append(pkg)
return result
def dump_used_packages(self):
result = []
for name, options in self.packages.items():
if options.get("optional"):
continue
pkg = self.get_package(name)
if not pkg or not pkg.metadata:
continue
item = {"name": pkg.metadata.name, "version": str(pkg.metadata.version)}
if pkg.metadata.spec.external:
item["src_url"] = pkg.metadata.spec.url
result.append(item)
return result
def autoinstall_runtime_packages(self):
for name, options in self.packages.items():
if options.get("optional", False):
continue
if self.get_package(name):
continue
self.pm.install(self.get_package_spec(name))
return True
def install_packages( # pylint: disable=too-many-arguments
self,
with_packages=None,
without_packages=None,
skip_default_package=False,
silent=False,
force=False,
):
with_packages = set(self._find_pkg_names(with_packages or []))
without_packages = set(self._find_pkg_names(without_packages or []))
upkgs = with_packages | without_packages
ppkgs = set(self.packages)
if not upkgs.issubset(ppkgs):
raise UnknownPackageError(", ".join(upkgs - ppkgs))
for name, options in self.packages.items():
if name in without_packages:
continue
if name in with_packages or not (
skip_default_package or options.get("optional", False)
):
self.pm.install(self.get_package_spec(name), silent=silent, force=force)
return True
def _find_pkg_names(self, candidates):
result = []
for candidate in candidates:
found = False
# lookup by package types
for _name, _opts in self.packages.items():
if _opts.get("type") == candidate:
result.append(_name)
found = True
if (
self.frameworks
and candidate.startswith("framework-")
and candidate[10:] in self.frameworks
):
result.append(self.frameworks[candidate[10:]]["package"])
found = True
if not found:
result.append(candidate)
return result
def update_packages(self, only_check=False):
for pkg in self.get_installed_packages():
self.pm.update(
pkg,
to_spec=self.get_package_spec(pkg.metadata.name),
only_check=only_check,
show_incompatible=False,
)
def are_outdated_packages(self):
for pkg in self.get_installed_packages():
if self.pm.outdated(
pkg, self.get_package_spec(pkg.metadata.name)
).is_outdated(allow_incompatible=False):
return True
return False
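A small illustrative stub for the get_package_spec() logic above; the package names, owner and URL are made up, a real platform would also provide self.pm and self.frameworks, and the import path assumes the mixin module is importable as platformio.platform._packages:

    from platformio.platform._packages import PlatformPackagesMixin

    class StubPlatform(PlatformPackagesMixin):
        packages = {
            "framework-arduino-avr": {"owner": "platformio", "version": "~5.0.0"},
            "tool-custom": {"version": "https://github.com/example/tool-custom.git"},
        }

    stub = StubPlatform()
    print(stub.get_package_spec("framework-arduino-avr"))
    # PackageSpec <owner=platformio ... requirements=~5.0.0 url=None>
    print(stub.get_package_spec("tool-custom").url)
    # git+https://github.com/example/tool-custom.git (built from the "name=url" form)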

199
platformio/platform/_run.py Normal file
View File

@@ -0,0 +1,199 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import base64
import os
import re
import sys
import click
from platformio import app, fs, proc, telemetry
from platformio.compat import PY2, hashlib_encode_data, is_bytes
from platformio.package.manager.core import get_core_package_dir
from platformio.platform.exception import BuildScriptNotFound
try:
from urllib.parse import quote
except ImportError:
from urllib import quote
class PlatformRunMixin(object):
LINE_ERROR_RE = re.compile(r"(^|\s+)error:?\s+", re.I)
@staticmethod
def encode_scons_arg(value):
data = base64.urlsafe_b64encode(hashlib_encode_data(value))
return data.decode() if is_bytes(data) else data
@staticmethod
def decode_scons_arg(data):
value = base64.urlsafe_b64decode(data)
return value.decode() if is_bytes(value) else value
def run( # pylint: disable=too-many-arguments
self, variables, targets, silent, verbose, jobs
):
assert isinstance(variables, dict)
assert isinstance(targets, list)
self.ensure_engine_compatible()
options = self.config.items(env=variables["pioenv"], as_dict=True)
if "framework" in options:
# support PIO Core 3.0 dev/platforms
options["pioframework"] = options["framework"]
self.configure_default_packages(options, targets)
self.autoinstall_runtime_packages()
self._report_non_sensitive_data(options, targets)
self.silent = silent
self.verbose = verbose or app.get_setting("force_verbose")
if "clean" in targets:
targets = ["-c", "."]
variables["platform_manifest"] = self.manifest_path
if "build_script" not in variables:
variables["build_script"] = self.get_build_script()
if not os.path.isfile(variables["build_script"]):
raise BuildScriptNotFound(variables["build_script"])
result = self._run_scons(variables, targets, jobs)
assert "returncode" in result
return result
def _report_non_sensitive_data(self, options, targets):
topts = options.copy()
topts["platform_packages"] = [
dict(name=item["name"], version=item["version"])
for item in self.dump_used_packages()
]
topts["platform"] = {"name": self.name, "version": self.version}
telemetry.send_run_environment(topts, targets)
def _run_scons(self, variables, targets, jobs):
scons_dir = get_core_package_dir("tool-scons")
script_path = (
os.path.join(scons_dir, "script", "scons")
if PY2
else os.path.join(scons_dir, "scons.py")
)
args = [
proc.get_pythonexe_path(),
script_path,
"-Q",
"--warn=no-no-parallel-support",
"--jobs",
str(jobs),
"--sconstruct",
os.path.join(fs.get_source_dir(), "builder", "main.py"),
]
args.append("PIOVERBOSE=%d" % (1 if self.verbose else 0))
# pylint: disable=protected-access
args.append("ISATTY=%d" % (1 if click._compat.isatty(sys.stdout) else 0))
args += targets
# encode and append variables
for key, value in variables.items():
args.append("%s=%s" % (key.upper(), self.encode_scons_arg(value)))
proc.copy_pythonpath_to_osenv()
if targets and "menuconfig" in targets:
return proc.exec_command(
args, stdout=sys.stdout, stderr=sys.stderr, stdin=sys.stdin
)
if click._compat.isatty(sys.stdout):
def _write_and_flush(stream, data):
try:
stream.write(data)
stream.flush()
except IOError:
pass
return proc.exec_command(
args,
stdout=proc.BuildAsyncPipe(
line_callback=self._on_stdout_line,
data_callback=lambda data: _write_and_flush(sys.stdout, data),
),
stderr=proc.BuildAsyncPipe(
line_callback=self._on_stderr_line,
data_callback=lambda data: _write_and_flush(sys.stderr, data),
),
)
return proc.exec_command(
args,
stdout=proc.LineBufferedAsyncPipe(line_callback=self._on_stdout_line),
stderr=proc.LineBufferedAsyncPipe(line_callback=self._on_stderr_line),
)
def _on_stdout_line(self, line):
if "`buildprog' is up to date." in line:
return
self._echo_line(line, level=1)
def _on_stderr_line(self, line):
is_error = self.LINE_ERROR_RE.search(line) is not None
self._echo_line(line, level=3 if is_error else 2)
a_pos = line.find("fatal error:")
b_pos = line.rfind(": No such file or directory")
if a_pos == -1 or b_pos == -1:
return
self._echo_missed_dependency(line[a_pos + 12 : b_pos].strip())
def _echo_line(self, line, level):
if line.startswith("scons: "):
line = line[7:]
assert 1 <= level <= 3
if self.silent and (level < 2 or not line):
return
fg = (None, "yellow", "red")[level - 1]
if level == 1 and "is up to date" in line:
fg = "green"
click.secho(line, fg=fg, err=level > 1, nl=False)
@staticmethod
def _echo_missed_dependency(filename):
if "/" in filename or not filename.endswith((".h", ".hpp")):
return
banner = """
{dots}
* Looking for {filename_styled} dependency? Check our library registry!
*
* CLI > platformio lib search "header:{filename}"
* Web > {link}
*
{dots}
""".format(
filename=filename,
filename_styled=click.style(filename, fg="cyan"),
link=click.style(
"https://platformio.org/lib/search?query=header:%s"
% quote(filename, safe=""),
fg="blue",
),
dots="*" * (56 + len(filename)),
)
click.echo(banner, err=True)
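SCons variables are shuttled to the build script as URL-safe base64; a quick round-trip with the helpers above (the variable value is illustrative):

    from platformio.platform._run import PlatformRunMixin

    encoded = PlatformRunMixin.encode_scons_arg("monitor_speed=115200")  # URL-safe base64 text
    print(PlatformRunMixin.decode_scons_arg(encoded))                    # monitor_speed=115200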

Some files were not shown because too many files have changed in this diff.