Compare commits


86 Commits

Author SHA1 Message Date
Ivan Kravets
1aa256d63c Merge branch 'release/v6.1.18' 2025-03-11 21:47:32 +02:00
Ivan Kravets
3a133af1a6 Bump version to 6.1.18 2025-03-11 21:47:22 +02:00
Ivan Kravets
f93d3d509b Resolved a regression issue that prevented PIO Home from opening external links // Resolve #5084 2025-03-10 19:57:13 +02:00
Ivan Kravets
145142ea6c Update deps 2025-02-24 13:15:19 +02:00
David Hotham
b4b02982d6 build and publish wheel (#5088) 2025-02-14 12:15:12 +02:00
Ivan Kravets
841489c154 Sync docs 2025-02-14 12:14:31 +02:00
Ivan Kravets
23c142dffd Bump version to 6.1.18a1 2025-02-13 13:58:19 +02:00
Ivan Kravets
fc946baa93 Merge branch 'release/v6.1.17' 2025-02-13 13:08:40 +02:00
Ivan Kravets
a447022e7f Merge tag 'v6.1.17' into develop
Bump version to 6.1.17
2025-02-13 13:08:40 +02:00
Ivan Kravets
4c697d9032 Bump version to 6.1.17 2025-02-13 13:08:31 +02:00
Deen-Weible
a71443a2ee Clarified language and fixed some grammar issues (#5085) 2025-02-13 13:04:48 +02:00
Ivan Kravets
20e076191e Bump version to 6.1.17rc1 2025-02-09 12:24:37 +02:00
Ivan Kravets
d907ecb9e9 Sync docs 2025-02-09 12:22:58 +02:00
Ivan Kravets
c950d6d366 CI: Disable cleaning of ubuntu instance 2025-02-08 21:51:50 +02:00
Ivan Kravets
29cd2d2bdb Update GH actions 2025-02-08 21:24:54 +02:00
Ivan Kravets
a584a6bce3 Fix spell issues 2025-02-08 19:27:58 +02:00
Christian Clauss
4dc7ea5bd0 Fix typos discovered by codespell (#5078)
* Fix typos discovered by codespell

* codespell-project/actions-codespell@v2
2025-02-08 17:21:12 +02:00
Ivan Kravets
1be6e10f99 Introduced the PLATFORMIO_RUN_JOBS environment variable // Resolve #5077 2025-02-08 17:15:34 +02:00
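The environment variable introduced above can be read roughly as follows (a minimal sketch, not PlatformIO's actual parsing code; the `int()` conversion and default fallback are assumptions):

```python
import os

def get_run_jobs(default=None):
    # PLATFORMIO_RUN_JOBS overrides the number of parallel build jobs;
    # parsing and fallback behaviour here are assumptions for illustration
    value = os.environ.get("PLATFORMIO_RUN_JOBS")
    return int(value) if value else default

os.environ["PLATFORMIO_RUN_JOBS"] = "4"
print(get_run_jobs())  # 4
```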
Ivan Kravets
c9016d6939 Simplify PyPi dependencies // Resolve #5059 2025-01-03 12:56:03 +02:00
Ivan Kravets
baab25a48c Update SPDX license list to v3.24.0 2025-01-03 12:43:51 +02:00
Ivan Kravets
4d4f5a217b Bump version to 6.1.17b2 2024-12-18 14:39:48 +02:00
Ivan Kravets
b6d1f4d769 Resolved an issue where the |LDF| occasionally excluded bundled platform libraries from the dependency graph // Issue #4940 2024-12-18 14:38:56 +02:00
Ivan Kravets
90fc36cf2d Update deps 2024-12-18 14:38:24 +02:00
vortigont
9be0a8248d LDF: refresh lib dependency after recursive search (#4941)
LDF might mistakenly remove recursive dependency libs from a graph
usually platform bundled ones

Closes #4940
2024-12-18 14:24:35 +02:00
Ivan Kravets
d15314689d Resolved an issue where the `--project-dir` flag did not function correctly with the check and debug commands // Resolve #5029 2024-12-13 13:01:40 +02:00
Ivan Kravets
1d4b5c8051 Upgrade docs to the sphinx 8 2024-12-13 13:01:21 +02:00
Ivan Kravets
47a87c57f2 Bump version to 6.1.17b1 2024-12-12 19:56:35 +02:00
Ivan Kravets
ec2d01f277 Ensured that dependencies of private libraries are no longer unnecessarily re-installed // Resolve #4987 2024-12-12 19:55:29 +02:00
Ivan Kravets
4e05309e02 Added support for `tar.xz` tarball dependencies // Issue #4974 2024-12-12 19:49:47 +02:00
Jason2866
1fd3a4061f add support for tar.xz tarballs (#4974)
* tar.xz

* add magic bytes for `tar.xz`
2024-12-12 19:47:23 +02:00
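The "magic bytes" check mentioned in the commit above can be sketched like this (not the repository's actual implementation; every `.xz` stream starts with the six-byte signature `FD 37 7A 58 5A 00`):

```python
import io
import os
import tarfile
import tempfile

XZ_MAGIC = b"\xfd7zXZ\x00"  # six-byte signature at the start of any .xz stream

def looks_like_tar_xz(path):
    """Sniff the container by magic bytes instead of trusting the extension."""
    with open(path, "rb") as fp:
        return fp.read(len(XZ_MAGIC)) == XZ_MAGIC

# Demo: build a tiny tar.xz archive and sniff it
path = os.path.join(tempfile.mkdtemp(), "demo.tar.xz")
with tarfile.open(path, "w:xz") as tar:
    payload = b"hello"
    info = tarfile.TarInfo("hello.txt")
    info.size = len(payload)
    tar.addfile(info, io.BytesIO(payload))
print(looks_like_tar_xz(path))  # True
```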
Ivan Kravets
014ac79c87 Resolved an issue with incorrect path resolution when linking static libraries via the build_flags // Resolve #5004 2024-12-12 19:46:11 +02:00
Ivan Kravets
dd3fe909a1 Better handling of the missed args for exec command // Resolve #5047 2024-12-12 16:32:43 +02:00
Maximilian Gerhardt
c1afb364e9 Allow HTTP 203 as successful response (#5043)
Fixes downloading library dependencies from Azure Devops repositories.
2024-12-09 20:07:27 +02:00
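HTTP 203 (Non-Authoritative Information) belongs to the 2xx success class, so accepting the whole 2xx range covers it; a sketch (the function name is illustrative, not PlatformIO's API):

```python
def is_successful_response(status_code):
    # 203 can be returned by transforming proxies (e.g. in front of
    # Azure DevOps) for otherwise valid downloads, so accept all of 2xx
    return 200 <= status_code < 300

print(is_successful_response(203))  # True
print(is_successful_response(404))  # False
```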
Ivan Kravets
f3c27eadf6 Switch to named argument (PY 3.13+) 2024-12-02 21:32:44 +02:00
Ivan Kravets
fe2fd5e880 Sync docs 2024-12-02 21:32:10 +02:00
Ivan Kravets
07e7dc4717 Update deps 2024-12-02 20:38:05 +02:00
Ivan Kravets
a94e5bd5ab Bump version to 6.1.17a2 2024-10-17 12:33:15 +03:00
Ivan Kravets
f5ab0e5ddd Resolve an issue where the `compiledb` target failed to properly escape compiler executable paths containing spaces // Resolve #4998 2024-10-17 12:32:52 +03:00
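Escaping a compiler path containing spaces can be sketched with `shlex.quote` (illustrative only; the path below is hypothetical and SCons may escape differently internally):

```python
import shlex

cc = "/opt/tool chain/bin/gcc"  # hypothetical compiler path with a space
command = " ".join([shlex.quote(cc), "-c", "main.c", "-o", "main.o"])
print(command)  # '/opt/tool chain/bin/gcc' -c main.c -o main.o
```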
Ivan Kravets
3e20abec90 Disable temporary "test_custom_testing_command" 2024-10-17 11:10:58 +03:00
Ivan Kravets
a4276b4ea6 Update dependencies.py 2024-10-15 22:43:13 +03:00
Ben Beasley
cade63fba5 Allow Starlette 0.40.x (#5000)
This release of Starlette contains a fix for a security bug:

- GHSA-f96h-pmfr-66vw:
  https://github.com/encode/starlette/security/advisories/GHSA-f96h-pmfr-66vw
- CVE-2024-47874: https://nvd.nist.gov/vuln/detail/CVE-2024-47874
2024-10-15 22:41:10 +03:00
Ivan Kravets
3a57661230 Switch to the stable Python 3.13 for CI 2024-10-13 21:46:22 +03:00
Ivan Kravets
33fadd028d Sync docs 2024-10-13 11:54:21 +03:00
Ivan Kravets
647b131d9b Sync docs 2024-10-04 12:22:54 +03:00
Ivan Kravets
b537004a75 Bump version to 6.1.17a1 2024-09-26 13:40:24 +03:00
Ivan Kravets
67b2759be2 Merge branch 'release/v6.1.16' 2024-09-26 13:14:01 +03:00
Ivan Kravets
fe2e8a0a40 Merge tag 'v6.1.16' into develop
Bump version to 6.1.16
2024-09-26 13:14:01 +03:00
Ivan Kravets
03e84fe325 Bump version to 6.1.16 2024-09-26 13:13:50 +03:00
Ivan Kravets
b45cdc9cb6 Declare pip dependencies statically // Resolve #4955 2024-09-26 13:13:31 +03:00
Ivan Kravets
3aed8e1259 Drop Python 3.6 & 3.7 from CI 2024-09-23 23:33:25 +03:00
Ivan Kravets
2d4a87238a Add support for Python 3.13 2024-09-23 23:25:46 +03:00
Ivan Kravets
023b58e9f0 Fix PyLint warnings 2024-09-23 16:02:07 +03:00
Ivan Kravets
3211a2b91b Bump version to 6.1.16rc1 2024-09-23 15:34:04 +03:00
Ivan Kravets
4b61de0136 Update deps 2024-09-23 15:33:06 +03:00
Ivan Kravets
e6ae18ab0d Enhanced internet connection checks by falling back to HTTPS protocol when HTTP (port 80) fails // Resolve #4980 2024-09-23 15:32:23 +03:00
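The fallback described above can be sketched as trying port 80 first and port 443 second (hostname and timeout are placeholders, not PlatformIO's actual endpoints):

```python
import socket

def has_internet(host="example.com", timeout=2.0):
    # Try plain HTTP (port 80) first; fall back to HTTPS (port 443)
    # when port 80 is blocked by a firewall or proxy
    for port in (80, 443):
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            continue
    return False
```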
Ivan Kravets
4230b223d2 Update bottle to 0.13.* 2024-09-16 20:45:30 +03:00
Ivan Kravets
d224ae658d Sync docs 2024-09-14 13:14:26 +03:00
Ivan Kravets
20dc006345 Bump version to 6.1.16b2 2024-09-04 11:48:59 +03:00
Ivan Kravets
13035ced59 Upgrade the build engine to the latest version of SCons (4.8.1) 2024-09-04 11:48:30 +03:00
Ivan Kravets
b9d27240b5 Drop ESPHome from CI 2024-08-30 11:06:38 +03:00
Ivan Kravets
2441d47321 Upgrade the build engine to the latest version of SCons (4.8.0) 2024-08-29 15:58:40 +03:00
Ivan Kravets
cf497e8829 Update tests 2024-08-29 15:58:12 +03:00
Ivan Kravets
013153718d Allow manual override of system type 2024-08-29 11:30:18 +03:00
Chris
f1726843a2 allow manual override of system type (#4952)
* allow manual override of system type
https://community.platformio.org/t/windows-on-arm64-problem-installing-xtensa-toolchain/25497

* fix lint
2024-08-29 11:28:52 +03:00
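The override above can be sketched as follows (the variable name comes from the release notes; the auto-detection format is an assumption for illustration):

```python
import os
import platform

def get_systype():
    # PLATFORMIO_SYSTEM_TYPE lets users on mis-detected hosts
    # (e.g. Windows-on-ARM64) force a specific system type
    override = os.environ.get("PLATFORMIO_SYSTEM_TYPE")
    if override:
        return override
    return "%s_%s" % (platform.system().lower(), platform.machine().lower())

os.environ["PLATFORMIO_SYSTEM_TYPE"] = "windows_amd64"
print(get_systype())  # windows_amd64
```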
Ivan Kravets
44ef6e3469 Use owner-based platform declaration // Resolve #4962 2024-08-28 19:43:52 +03:00
Ivan Kravets
eeb5ac456e Upgrade Doctest and GoogleTest testing frameworks to the latest versions 2024-08-01 14:30:00 +03:00
Ivan Kravets
aea9075d4b Bump version to 6.1.16b1 2024-07-29 15:33:33 +03:00
Ivan Kravets
11a8d9ff7a Updated Unity testing framework to v2.6.0 // Resolve #4871 2024-07-29 15:32:39 +03:00
Ivan Kravets
7b587ba8bf Corrected an issue where the incorrect public class was imported for the `DoctestTestRunner` // Resolve #4949 2024-07-29 14:43:03 +03:00
Ben Beasley
9eb6e5166d Allow Starlette 0.38.1 (#4953) 2024-07-25 14:33:32 +03:00
Ivan Kravets
aa580360e8 Sync docs 2024-07-11 12:53:55 +03:00
Jean Alinei
4c490cc63c Adding Zephyr framework USB VID PID to udev rules. (#4947)
As Zephyr is officially supported by PlatformIO, it makes sense to add its default VID:PID to udev rules.
2024-07-11 12:48:35 +03:00
Ivan Kravets
882d4da8cb Sync docs 2024-07-10 23:09:07 +03:00
Ivan Kravets
781114f026 Update deps 2024-06-30 12:44:48 +03:00
valeros
7cf8d1d696 Exclude Python 3.6 and 3.7 from CI matrix for MacOS 2024-06-03 15:06:46 +03:00
Ivan Kravets
fd1333f031 Update SPDX licenses to 3.24.0 2024-05-24 09:25:30 +03:00
Ivan Kravets
8e21259222 Disable macOS runner for Python 3.6, 3.7 2024-05-24 09:04:00 +03:00
Ivan Kravets
9899547b73 Sync docs 2024-05-24 09:03:35 +03:00
Ivan Kravets
4075789a32 Sync docs 2024-05-10 12:45:07 +03:00
Ivan Kravets
ff364610c5 Sync docs 2024-05-07 22:22:26 +03:00
Ivan Kravets
e5940673d7 Bump version to 6.1.16a1 2024-04-25 11:41:53 +03:00
Ivan Kravets
fe140b0566 Merge branch 'release/v6.1.15' 2024-04-25 11:37:53 +03:00
Ivan Kravets
2ec5a3154e Merge tag 'v6.1.15' into develop
Bump version to 6.1.15
2024-04-25 11:37:53 +03:00
Ivan Kravets
956f21b639 Bump version to 6.1.15 2024-04-25 11:37:28 +03:00
Ivan Kravets
cdac7d497c Resolved an issue related to the inaccurate detection of the Clang compiler // Resolve #4897 2024-04-24 23:08:00 +03:00
Ivan Kravets
591b377e4a Sync docs 2024-03-29 21:33:54 +02:00
134 changed files with 1424 additions and 1379 deletions

View File

@@ -7,8 +7,8 @@ jobs:
strategy:
fail-fast: false
matrix:
os: [ubuntu-20.04, windows-latest, macos-latest]
python-version: ["3.6", "3.7", "3.11", "3.12"]
os: [ubuntu-latest, windows-latest, macos-latest]
python-version: ["3.11", "3.12", "3.13"]
runs-on: ${{ matrix.os }}
@@ -18,7 +18,7 @@ jobs:
submodules: "recursive"
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v4
uses: actions/setup-python@v5
with:
python-version: ${{ matrix.python-version }}
@@ -27,15 +27,16 @@ jobs:
python -m pip install --upgrade pip
pip install tox
- name: Run "codespell" on Linux
if: startsWith(matrix.os, 'ubuntu')
run: |
python -m pip install codespell
make codespell
- name: Core System Info
run: |
tox -e py
- name: Python Lint
if: ${{ matrix.python-version != '3.6' }}
run: |
tox -e lint
- name: Integration Tests
if: ${{ matrix.python-version == '3.11' }}
run: |

View File

@@ -24,7 +24,7 @@ jobs:
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install tox wheel
pip install tox build
- name: Deployment Tests
env:
@@ -34,9 +34,8 @@ jobs:
run: |
tox -e testcore
- name: Build Python source tarball
# run: python setup.py sdist bdist_wheel
run: python setup.py sdist
- name: Build Python distributions
run: python -m build
- name: Publish package to PyPI
if: ${{ github.ref == 'refs/heads/master' }}

View File

@@ -11,7 +11,7 @@ jobs:
with:
submodules: "recursive"
- name: Set up Python
uses: actions/setup-python@v4
uses: actions/setup-python@v5
with:
python-version: "3.11"
- name: Install dependencies
@@ -40,7 +40,7 @@ jobs:
- name: Save artifact
if: ${{ github.event_name == 'push' }}
uses: actions/upload-artifact@v3
uses: actions/upload-artifact@v4
with:
name: docs
path: ./docs.tar.gz
@@ -49,15 +49,15 @@ jobs:
name: Deploy Docs
needs: build
runs-on: ubuntu-latest
if: ${{ github.event_name == 'push' && (github.ref == 'refs/heads/develop' || github.ref == 'refs/heads/master') }}
env:
DOCS_REPO: platformio/platformio-docs
DOCS_DIR: platformio-docs
LATEST_DOCS_DIR: latest-docs
RELEASE_BUILD: ${{ startsWith(github.ref, 'refs/tags/v') }}
if: ${{ github.event_name == 'push' }}
steps:
- name: Download artifact
uses: actions/download-artifact@v3
uses: actions/download-artifact@v4
with:
name: docs
- name: Unpack artifact
@@ -65,7 +65,7 @@ jobs:
mkdir ./${{ env.LATEST_DOCS_DIR }}
tar -xzf ./docs.tar.gz -C ./${{ env.LATEST_DOCS_DIR }}
- name: Delete Artifact
uses: geekyeggo/delete-artifact@v2
uses: geekyeggo/delete-artifact@v5
with:
name: docs
- name: Select Docs type
@@ -101,7 +101,7 @@ jobs:
exit 1
fi
- name: Deploy to Github Pages
uses: peaceiris/actions-gh-pages@v3
uses: peaceiris/actions-gh-pages@v4
with:
personal_token: ${{ secrets.DEPLOY_GH_DOCS_TOKEN }}
external_repository: ${{ env.DOCS_REPO }}

View File

@@ -34,7 +34,7 @@ jobs:
run: |
# Free space
sudo apt clean
docker rmi $(docker image ls -aq)
# docker rmi $(docker image ls -aq)
df -h
tox -e testexamples

View File

@@ -13,11 +13,6 @@ jobs:
folder: "Marlin"
config_dir: "Marlin"
env_name: "mega2560"
- esphome:
repository: "esphome/esphome"
folder: "esphome"
config_dir: "esphome"
env_name: "esp32-arduino"
- smartknob:
repository: "scottbez1/smartknob"
folder: "smartknob"
@@ -34,9 +29,6 @@ jobs:
config_dir: "OpenMQTTGateway"
env_name: "esp32-m5atom-lite"
os: [ubuntu-latest, windows-latest, macos-latest]
exclude:
- os: windows-latest
project: {"esphome": "", "repository": "esphome/esphome", "folder": "esphome", "config_dir": "esphome", "env_name": "esp32-arduino"}
runs-on: ${{ matrix.os }}
steps:
@@ -59,11 +51,6 @@ jobs:
repository: ${{ matrix.project.repository }}
path: ${{ matrix.project.folder }}
- name: Install ESPHome dependencies
# Requires esptool package as it's used in a custom prescript
if: ${{ contains(matrix.project.repository, 'esphome') }}
run: pip install esptool==3.*
- name: Compile ${{ matrix.project.repository }}
run: pio run -d ${{ matrix.project.config_dir }} -e ${{ matrix.project.env_name }}

View File

@@ -8,6 +8,7 @@ Release Notes
.. |UNITTESTING| replace:: `Unit Testing <https://docs.platformio.org/en/latest/advanced/unit-testing/index.html>`__
.. |DEBUGGING| replace:: `Debugging <https://docs.platformio.org/en/latest/plus/debugging.html>`__
.. |STATICCODEANALYSIS| replace:: `Static Code Analysis <https://docs.platformio.org/en/latest/advanced/static-code-analysis/index.html>`__
.. |PIOHOME| replace:: `PIO Home <https://docs.platformio.org/en/latest/home/index.html>`__
.. _release_notes_6:
@@ -18,10 +19,37 @@ Unlock the true potential of embedded software development with
PlatformIO's collaborative ecosystem, embracing declarative principles,
test-driven methodologies, and modern toolchains for unrivaled success.
6.1.15 (2024-??-??)
6.1.18 (2025-03-11)
~~~~~~~~~~~~~~~~~~~
* Resolved a regression issue that prevented |PIOHOME| from opening external links (`issue #5084 <https://github.com/platformio/platformio-core/issues/5084>`_)
6.1.17 (2025-02-13)
~~~~~~~~~~~~~~~~~~~
* Introduced the `PLATFORMIO_RUN_JOBS <https://docs.platformio.org/en/latest/envvars.html#envvar-PLATFORMIO_RUN_JOBS>`__ environment variable, allowing manual override of the number of parallel build jobs (`issue #5077 <https://github.com/platformio/platformio-core/issues/5077>`_)
* Added support for ``tar.xz`` tarball dependencies (`pull #4974 <https://github.com/platformio/platformio-core/pull/4974>`_)
* Ensured that dependencies of private libraries are no longer unnecessarily re-installed, optimizing dependency management and reducing redundant operations (`issue #4987 <https://github.com/platformio/platformio-core/issues/4987>`_)
* Resolved an issue where the ``compiledb`` target failed to properly escape compiler executable paths containing spaces (`issue #4998 <https://github.com/platformio/platformio-core/issues/4998>`_)
* Resolved an issue with incorrect path resolution when linking static libraries via the `build_flags <https://docs.platformio.org/en/latest/projectconf/sections/env/options/build/build_flags.html>`__ option (`issue #5004 <https://github.com/platformio/platformio-core/issues/5004>`_)
* Resolved an issue where the ``--project-dir`` flag did not function correctly with the `pio check <https://docs.platformio.org/en/latest/core/userguide/cmd_check.html>`__ and `pio debug <https://docs.platformio.org/en/latest/core/userguide/cmd_debug.html>`__ commands (`issue #5029 <https://github.com/platformio/platformio-core/issues/5029>`_)
* Resolved an issue where the |LDF| occasionally excluded bundled platform libraries from the dependency graph (`pull #4941 <https://github.com/platformio/platformio-core/pull/4941>`_)
6.1.16 (2024-09-26)
~~~~~~~~~~~~~~~~~~~
* Added support for Python 3.13
* Introduced the `PLATFORMIO_SYSTEM_TYPE <https://docs.platformio.org/en/latest/envvars.html#envvar-PLATFORMIO_SYSTEM_TYPE>`__ environment variable, enabling manual override of the detected system type for greater flexibility and control in custom build environments
* Enhanced internet connection checks by falling back to HTTPS protocol when HTTP (port 80) fails (`issue #4980 <https://github.com/platformio/platformio-core/issues/4980>`_)
* Upgraded the build engine to the latest version of SCons (4.8.1) to improve build performance, reliability, and compatibility with other tools and systems (`release notes <https://github.com/SCons/scons/releases/tag/4.8.1>`__)
* Upgraded the `Doctest <https://docs.platformio.org/en/latest/advanced/unit-testing/frameworks/doctest.html>`__ testing framework to version 2.4.11, `GoogleTest <https://docs.platformio.org/en/latest/advanced/unit-testing/frameworks/googletest.html>`__ to version 1.15.2, and `Unity <https://docs.platformio.org/en/latest/advanced/unit-testing/frameworks/unity.html>`__ to version 2.6.0, incorporating the latest features and improvements for enhanced testing capabilities
* Corrected an issue where the incorrect public class was imported for the ``DoctestTestRunner`` (`issue #4949 <https://github.com/platformio/platformio-core/issues/4949>`_)
6.1.15 (2024-04-25)
~~~~~~~~~~~~~~~~~~~
* Resolved an issue where the |LDF| couldn't locate a library dependency declared via version control system repository (`issue #4885 <https://github.com/platformio/platformio-core/issues/4885>`_)
* Resolved an issue related to the inaccurate detection of the Clang compiler (`pull #4897 <https://github.com/platformio/platformio-core/pull/4897>`_)
6.1.14 (2024-03-21)
~~~~~~~~~~~~~~~~~~~
@@ -65,7 +93,7 @@ test-driven methodologies, and modern toolchains for unrivaled success.
~~~~~~~~~~~~~~~~~~~
* Resolved a possible issue that may cause generated projects for `PlatformIO IDE for VSCode <https://docs.platformio.org/en/latest/integration/ide/vscode.html>`__ to fail to launch a debug session because of a missing "objdump" binary when GDB is not part of the toolchain package
* Resolved a regression issue that resulted in the malfunction of the Memory Inspection feature within `PIO Home <https://docs.platformio.org/en/latest/home/index.html>`__
* Resolved a regression issue that resulted in the malfunction of the Memory Inspection feature within |PIOHOME|
6.1.10 (2023-08-11)
~~~~~~~~~~~~~~~~~~~

View File

@@ -10,10 +10,13 @@ format:
black ./platformio
black ./tests
test:
py.test --verbose --exitfirst -n 6 --dist=loadscope tests --ignore tests/test_examples.py
codespell:
codespell --skip "./build,./docs/_build" -L "AtLeast,TRE,ans,dout,homestate,ser"
before-commit: isort format lint
test:
pytest --verbose --exitfirst -n 6 --dist=loadscope tests --ignore tests/test_examples.py
before-commit: codespell isort format lint
clean-docs:
rm -rf docs/_build

Submodule docs updated: 670721e923...70ab7ee27b

View File

@@ -12,7 +12,7 @@
# See the License for the specific language governing permissions and
# limitations under the License.
VERSION = (6, 1, "15a1")
VERSION = (6, 1, 18)
__version__ = ".".join([str(s) for s in VERSION])
__title__ = "platformio"

View File

@@ -66,6 +66,15 @@ def configure():
if IS_CYGWIN:
raise exception.CygwinEnvDetected()
# https://urllib3.readthedocs.org
# /en/latest/security.html#insecureplatformwarning
try:
import urllib3 # pylint: disable=import-outside-toplevel
urllib3.disable_warnings()
except (AttributeError, ImportError):
pass
# Handle IOError issue with VSCode's Terminal (Windows)
click_echo_origin = [click.echo, click.secho]

View File

@@ -17,7 +17,7 @@ import time
from platformio import __accounts_api__, app
from platformio.exception import PlatformioException, UserSideException
from platformio.http import HttpApiClient, HttpClientApiError
from platformio.http import HTTPClient, HTTPClientError
class AccountError(PlatformioException):
@@ -32,7 +32,7 @@ class AccountAlreadyAuthorized(AccountError, UserSideException):
MESSAGE = "You are already authorized with {0} account."
class AccountClient(HttpApiClient): # pylint:disable=too-many-public-methods
class AccountClient(HTTPClient): # pylint:disable=too-many-public-methods
SUMMARY_CACHE_TTL = 60 * 60 * 24 * 7
def __init__(self):
@@ -60,7 +60,7 @@ class AccountClient(HttpApiClient): # pylint:disable=too-many-public-methods
def fetch_json_data(self, *args, **kwargs):
try:
return super().fetch_json_data(*args, **kwargs)
except HttpClientApiError as exc:
except HTTPClientError as exc:
raise AccountError(exc) from exc
def fetch_authentication_token(self):
@@ -144,7 +144,7 @@ class AccountClient(HttpApiClient): # pylint:disable=too-many-public-methods
def registration(
self, username, email, password, firstname, lastname
): # pylint:disable=too-many-arguments
): # pylint: disable=too-many-arguments,too-many-positional-arguments
try:
self.fetch_authentication_token()
except: # pylint:disable=bare-except

View File

@@ -19,18 +19,18 @@ from platformio.account.client import AccountClient, AccountNotAuthorized
@click.command("destroy", short_help="Destroy account")
def account_destroy_cmd():
with AccountClient() as client:
click.confirm(
"Are you sure you want to delete the %s user account?\n"
"Warning! All linked data will be permanently removed and can not be restored."
% client.get_logged_username(),
abort=True,
)
client.destroy_account()
try:
client.logout()
except AccountNotAuthorized:
pass
client = AccountClient()
click.confirm(
"Are you sure you want to delete the %s user account?\n"
"Warning! All linked data will be permanently removed and can not be restored."
% client.get_logged_username(),
abort=True,
)
client.destroy_account()
try:
client.logout()
except AccountNotAuthorized:
pass
click.secho(
"User account has been destroyed.",
fg="green",

View File

@@ -20,8 +20,8 @@ from platformio.account.client import AccountClient
@click.command("forgot", short_help="Forgot password")
@click.option("--username", prompt="Username or email")
def account_forgot_cmd(username):
with AccountClient() as client:
client.forgot_password(username)
client = AccountClient()
client.forgot_password(username)
click.secho(
"If this account is registered, we will send the "
"further instructions to your email.",

View File

@@ -21,6 +21,6 @@ from platformio.account.client import AccountClient
@click.option("-u", "--username", prompt="Username or email")
@click.option("-p", "--password", prompt=True, hide_input=True)
def account_login_cmd(username, password):
with AccountClient() as client:
client.login(username, password)
client = AccountClient()
client.login(username, password)
click.secho("Successfully logged in!", fg="green")

View File

@@ -19,6 +19,6 @@ from platformio.account.client import AccountClient
@click.command("logout", short_help="Log out of PlatformIO Account")
def account_logout_cmd():
with AccountClient() as client:
client.logout()
client = AccountClient()
client.logout()
click.secho("Successfully logged out!", fg="green")

View File

@@ -21,6 +21,6 @@ from platformio.account.client import AccountClient
@click.option("--old-password", prompt=True, hide_input=True)
@click.option("--new-password", prompt=True, hide_input=True, confirmation_prompt=True)
def account_password_cmd(old_password, new_password):
with AccountClient() as client:
client.change_password(old_password, new_password)
client = AccountClient()
client.change_password(old_password, new_password)
click.secho("Password successfully changed!", fg="green")

View File

@@ -43,8 +43,8 @@ from platformio.account.validate import (
@click.option("--firstname", prompt=True)
@click.option("--lastname", prompt=True)
def account_register_cmd(username, email, password, firstname, lastname):
with AccountClient() as client:
client.registration(username, email, password, firstname, lastname)
client = AccountClient()
client.registration(username, email, password, firstname, lastname)
click.secho(
"An account has been successfully created. "
"Please check your mail to activate your account and verify your email address.",

View File

@@ -25,8 +25,8 @@ from platformio.account.client import AccountClient
@click.option("--offline", is_flag=True)
@click.option("--json-output", is_flag=True)
def account_show_cmd(offline, json_output):
with AccountClient() as client:
info = client.get_account_info(offline)
client = AccountClient()
info = client.get_account_info(offline)
if json_output:
click.echo(json.dumps(info))
return

View File

@@ -24,8 +24,8 @@ from platformio.account.client import AccountClient
@click.option("--regenerate", is_flag=True)
@click.option("--json-output", is_flag=True)
def account_token_cmd(password, regenerate, json_output):
with AccountClient() as client:
auth_token = client.auth_token(password, regenerate)
client = AccountClient()
auth_token = client.auth_token(password, regenerate)
if json_output:
click.echo(json.dumps({"status": "success", "result": auth_token}))
return

View File

@@ -25,8 +25,8 @@ from platformio.account.validate import validate_email, validate_username
@click.option("--firstname")
@click.option("--lastname")
def account_update_cmd(current_password, **kwargs):
with AccountClient() as client:
profile = client.get_profile()
client = AccountClient()
profile = client.get_profile()
new_profile = profile.copy()
if not any(kwargs.values()):
for field in profile:

View File

@@ -25,8 +25,8 @@ from platformio.account.client import AccountClient
"username",
)
def org_add_cmd(orgname, username):
with AccountClient() as client:
client.add_org_owner(orgname, username)
client = AccountClient()
client.add_org_owner(orgname, username)
return click.secho(
"The new owner `%s` has been successfully added to the `%s` organization."
% (username, orgname),

View File

@@ -30,8 +30,8 @@ from platformio.account.validate import validate_email, validate_orgname
"--displayname",
)
def org_create_cmd(orgname, email, displayname):
with AccountClient() as client:
client.create_org(orgname, email, displayname)
client = AccountClient()
client.create_org(orgname, email, displayname)
return click.secho(
"The organization `%s` has been successfully created." % orgname,
fg="green",

View File

@@ -20,14 +20,14 @@ from platformio.account.client import AccountClient
@click.command("destroy", short_help="Destroy organization")
@click.argument("orgname")
def org_destroy_cmd(orgname):
with AccountClient() as client:
click.confirm(
"Are you sure you want to delete the `%s` organization account?\n"
"Warning! All linked data will be permanently removed and can not be restored."
% orgname,
abort=True,
)
client.destroy_org(orgname)
client = AccountClient()
click.confirm(
"Are you sure you want to delete the `%s` organization account?\n"
"Warning! All linked data will be permanently removed and can not be restored."
% orgname,
abort=True,
)
client.destroy_org(orgname)
return click.secho(
"Organization `%s` has been destroyed." % orgname,
fg="green",

View File

@@ -23,8 +23,8 @@ from platformio.account.client import AccountClient
@click.command("list", short_help="List organizations and their members")
@click.option("--json-output", is_flag=True)
def org_list_cmd(json_output):
with AccountClient() as client:
orgs = client.list_orgs()
client = AccountClient()
orgs = client.list_orgs()
if json_output:
return click.echo(json.dumps(orgs))
if not orgs:

View File

@@ -25,8 +25,8 @@ from platformio.account.client import AccountClient
"username",
)
def org_remove_cmd(orgname, username):
with AccountClient() as client:
client.remove_org_owner(orgname, username)
client = AccountClient()
client.remove_org_owner(orgname, username)
return click.secho(
"The `%s` owner has been successfully removed from the `%s` organization."
% (username, orgname),

View File

@@ -31,8 +31,8 @@ from platformio.account.validate import validate_email, validate_orgname
)
@click.option("--displayname")
def org_update_cmd(cur_orgname, **kwargs):
with AccountClient() as client:
org = client.get_org(cur_orgname)
client = AccountClient()
org = client.get_org(cur_orgname)
new_org = {
key: value if value is not None else org[key] for key, value in kwargs.items()
}

View File

@@ -29,8 +29,8 @@ from platformio.account.validate import validate_orgname_teamname
)
def team_add_cmd(orgname_teamname, username):
orgname, teamname = orgname_teamname.split(":", 1)
with AccountClient() as client:
client.add_team_member(orgname, teamname, username)
client = AccountClient()
client.add_team_member(orgname, teamname, username)
return click.secho(
"The new member %s has been successfully added to the %s team."
% (username, teamname),

View File

@@ -29,8 +29,8 @@ from platformio.account.validate import validate_orgname_teamname
)
def team_create_cmd(orgname_teamname, description):
orgname, teamname = orgname_teamname.split(":", 1)
with AccountClient() as client:
client.create_team(orgname, teamname, description)
client = AccountClient()
client.create_team(orgname, teamname, description)
return click.secho(
"The team %s has been successfully created." % teamname,
fg="green",

View File

@@ -32,8 +32,8 @@ def team_destroy_cmd(orgname_teamname):
),
abort=True,
)
with AccountClient() as client:
client.destroy_team(orgname, teamname)
client = AccountClient()
client.destroy_team(orgname, teamname)
return click.secho(
"The team %s has been successfully destroyed." % teamname,
fg="green",

View File

@@ -24,22 +24,19 @@ from platformio.account.client import AccountClient
@click.argument("orgname", required=False)
@click.option("--json-output", is_flag=True)
def team_list_cmd(orgname, json_output):
with AccountClient() as client:
data = {}
if not orgname:
for item in client.list_orgs():
teams = client.list_teams(item.get("orgname"))
data[item.get("orgname")] = teams
else:
teams = client.list_teams(orgname)
data[orgname] = teams
client = AccountClient()
data = {}
if not orgname:
for item in client.list_orgs():
teams = client.list_teams(item.get("orgname"))
data[item.get("orgname")] = teams
else:
teams = client.list_teams(orgname)
data[orgname] = teams
if json_output:
return click.echo(json.dumps(data[orgname] if orgname else data))
if not any(data.values()):
return click.secho("You do not have any teams.", fg="yellow")
for org_name, teams in data.items():
for team in teams:
click.echo()

View File

@@ -27,8 +27,8 @@ from platformio.account.validate import validate_orgname_teamname
@click.argument("username")
def team_remove_cmd(orgname_teamname, username):
orgname, teamname = orgname_teamname.split(":", 1)
with AccountClient() as client:
client.remove_team_member(orgname, teamname, username)
client = AccountClient()
client.remove_team_member(orgname, teamname, username)
return click.secho(
"The %s member has been successfully removed from the %s team."
% (username, teamname),

View File

@@ -34,8 +34,8 @@ from platformio.account.validate import validate_orgname_teamname, validate_team
)
def team_update_cmd(orgname_teamname, **kwargs):
orgname, teamname = orgname_teamname.split(":", 1)
with AccountClient() as client:
team = client.get_team(orgname, teamname)
client = AccountClient()
team = client.get_team(orgname, teamname)
new_team = {
key: value if value is not None else team[key] for key, value in kwargs.items()
}

View File

@@ -258,6 +258,10 @@ def get_cid():
return cid
def get_project_id(project_dir):
return hashlib.sha1(hashlib_encode_data(project_dir)).hexdigest()
def get_user_agent():
data = [
"PlatformIO/%s" % __version__,

View File

@@ -178,3 +178,6 @@ ATTRS{idVendor}=="03eb", ATTRS{idProduct}=="2107", MODE="0666", ENV{ID_MM_DEVICE
# Espressif USB JTAG/serial debug unit
ATTRS{idVendor}=="303a", ATTRS{idProduct}=="1001", MODE="0666", ENV{ID_MM_DEVICE_IGNORE}="1", ENV{ID_MM_PORT_IGNORE}="1"
# Zephyr framework USB CDC-ACM
ATTRS{idVendor}=="2fe3", ATTRS{idProduct}=="0100", MODE="0666", ENV{ID_MM_DEVICE_IGNORE}="1", ENV{ID_MM_PORT_IGNORE}="1"

View File

@@ -15,7 +15,7 @@
import json
import os
import sys
import time
from time import time
import click
from SCons.Script import ARGUMENTS # pylint: disable=import-error
@@ -31,7 +31,7 @@ from SCons.Script import Variables # pylint: disable=import-error
from platformio import app, fs
from platformio.platform.base import PlatformBase
from platformio.proc import get_pythonexe_path
from platformio.project.helpers import get_build_type, get_project_dir
from platformio.project.helpers import get_project_dir
AllowSubstExceptions(NameError)
@@ -61,7 +61,7 @@ DEFAULT_ENV_OPTIONS = dict(
"piotarget",
"piolib",
"pioupload",
"piomemusage",
"piosize",
"pioino",
"piomisc",
"piointegration",
@@ -71,7 +71,15 @@ DEFAULT_ENV_OPTIONS = dict(
variables=clivars,
# Propagating External Environment
ENV=os.environ,
UNIX_TIME=int(time.time()),
UNIX_TIME=int(time()),
BUILD_DIR=os.path.join("$PROJECT_BUILD_DIR", "$PIOENV"),
BUILD_SRC_DIR=os.path.join("$BUILD_DIR", "src"),
BUILD_TEST_DIR=os.path.join("$BUILD_DIR", "test"),
COMPILATIONDB_PATH=os.path.join("$PROJECT_DIR", "compile_commands.json"),
LIBPATH=["$BUILD_DIR"],
PROGNAME="program",
PROGPATH=os.path.join("$BUILD_DIR", "$PROGNAME$PROGSUFFIX"),
PROG_PATH="$PROGPATH", # deprecated
PYTHONEXE=get_pythonexe_path(),
)
@@ -118,21 +126,13 @@ env.Replace(
PROJECT_DATA_DIR=config.get("platformio", "data_dir"),
PROJECTDATA_DIR="$PROJECT_DATA_DIR", # legacy for dev/platform
PROJECT_BUILD_DIR=config.get("platformio", "build_dir"),
BUILD_TYPE=get_build_type(config, env["PIOENV"], COMMAND_LINE_TARGETS),
BUILD_DIR=os.path.join("$PROJECT_BUILD_DIR", "$PIOENV", "$BUILD_TYPE"),
BUILD_SRC_DIR=os.path.join("$BUILD_DIR", "src"),
BUILD_TEST_DIR=os.path.join("$BUILD_DIR", "test"),
BUILD_TYPE=env.GetBuildType(),
BUILD_CACHE_DIR=config.get("platformio", "build_cache_dir"),
LIBPATH=["$BUILD_DIR"],
LIBSOURCE_DIRS=[
config.get("platformio", "lib_dir"),
os.path.join("$PROJECT_LIBDEPS_DIR", "$PIOENV"),
config.get("platformio", "globallib_dir"),
],
COMPILATIONDB_PATH=os.path.join("$PROJECT_DIR", "compile_commands.json"),
PROGNAME="program",
PROGPATH=os.path.join("$BUILD_DIR", "$PROGNAME$PROGSUFFIX"),
PROG_PATH="$PROGPATH", # deprecated
)
if int(ARGUMENTS.get("ISATTY", 0)):
@@ -147,13 +147,13 @@ if env.subst("$BUILD_CACHE_DIR"):
if not int(ARGUMENTS.get("PIOVERBOSE", 0)):
click.echo("Verbose mode can be enabled via `-v, --verbose` option")
if not os.path.isdir(env.subst("$BUILD_DIR")):
os.makedirs(env.subst("$BUILD_DIR"))
# Dynamically load dependent tools
if "compiledb" in COMMAND_LINE_TARGETS:
env.Tool("compilation_db")
if not os.path.isdir(env.subst("$BUILD_DIR")):
os.makedirs(env.subst("$BUILD_DIR"))
env.LoadProjectOptions()
env.LoadPioPlatform()
@@ -183,7 +183,7 @@ env.SConscript(env.GetExtraScripts("post"), exports="env")
# Checking program size
if env.get("SIZETOOL") and not (
set(["nobuild", "__memusage"]) & set(COMMAND_LINE_TARGETS)
set(["nobuild", "sizedata"]) & set(COMMAND_LINE_TARGETS)
):
env.Depends("upload", "checkprogsize")
# Replace platform's "size" target with our
@@ -224,27 +224,24 @@ if env.IsIntegrationDump():
data = projenv.DumpIntegrationData(env)
# dump to file for the further reading by project.helpers.load_build_metadata
with open(
projenv.subst(os.path.join("$BUILD_DIR", "metadata.json")),
projenv.subst(os.path.join("$BUILD_DIR", "idedata.json")),
mode="w",
encoding="utf8",
) as fp:
json.dump(data, fp)
click.echo(
"Metadata has been saved to the following location: %s"
% projenv.subst(os.path.join("$BUILD_DIR", "metadata.json"))
)
click.echo("\n%s\n" % json.dumps(data)) # pylint: disable=undefined-variable
env.Exit(0)
if "__memusage" in COMMAND_LINE_TARGETS:
if "sizedata" in COMMAND_LINE_TARGETS:
AlwaysBuild(
env.Alias(
"__memusage",
"sizedata",
DEFAULT_TARGETS,
env.VerboseAction(env.DumpMemoryUsage, "Generating memory usage report..."),
env.VerboseAction(env.DumpSizeData, "Generating memory usage report..."),
)
)
Default("__memusage")
Default("sizedata")
# issue #4604: process targets sequentially
for index, target in enumerate(

View File

@@ -58,6 +58,7 @@ def GetBuildType(env):
def BuildProgram(env):
env.ProcessCompileDbToolchainOption()
env.ProcessProgramDeps()
env.ProcessProjectDeps()
@@ -90,6 +91,26 @@ def BuildProgram(env):
return program
def ProcessCompileDbToolchainOption(env):
if "compiledb" not in COMMAND_LINE_TARGETS:
return
# Resolve absolute path of toolchain
for cmd in ("CC", "CXX", "AS"):
if cmd not in env:
continue
if os.path.isabs(env[cmd]) or '"' in env[cmd]:
continue
env[cmd] = where_is_program(env.subst("$%s" % cmd), env.subst("${ENV['PATH']}"))
if " " in env[cmd]: # issue #4998: Space in compilator path
env[cmd] = f'"{env[cmd]}"'
if env.get("COMPILATIONDB_INCLUDE_TOOLCHAIN"):
print("Warning! `COMPILATIONDB_INCLUDE_TOOLCHAIN` is scoping")
for scope, includes in env.DumpIntegrationIncludes().items():
if scope in ("toolchain",):
env.Append(CPPPATH=includes)
def ProcessProgramDeps(env):
def _append_pio_macros():
core_version = pepver_to_semver(__version__)
@@ -126,27 +147,6 @@ def ProcessProgramDeps(env):
# remove specified flags
env.ProcessUnFlags(env.get("BUILD_UNFLAGS"))
env.ProcessCompileDbToolchainOption()
def ProcessCompileDbToolchainOption(env):
if "compiledb" in COMMAND_LINE_TARGETS:
# Resolve absolute path of toolchain
for cmd in ("CC", "CXX", "AS"):
if cmd not in env:
continue
if os.path.isabs(env[cmd]):
continue
env[cmd] = where_is_program(
env.subst("$%s" % cmd), env.subst("${ENV['PATH']}")
)
if env.get("COMPILATIONDB_INCLUDE_TOOLCHAIN"):
print("Warning! `COMPILATIONDB_INCLUDE_TOOLCHAIN` is scoping")
for scope, includes in env.DumpIntegrationIncludes().items():
if scope in ("toolchain",):
env.Append(CPPPATH=includes)
def ProcessProjectDeps(env):
plb = env.ConfigureProjectLibBuilder()
@@ -219,6 +219,11 @@ def ParseFlagsExtended(env, flags): # pylint: disable=too-many-branches
if os.path.isdir(p):
result[k][i] = os.path.abspath(p)
# fix relative LIBs
for i, l in enumerate(result.get("LIBS", [])):
if isinstance(l, FS.File):
result["LIBS"][i] = os.path.abspath(l.get_path())
# fix relative path for "-include"
for i, f in enumerate(result.get("CCFLAGS", [])):
if isinstance(f, tuple) and f[0] == "-include":

View File

@@ -23,7 +23,7 @@ from platformio.proc import exec_command, where_is_program
def IsIntegrationDump(_):
return set(["__idedata", "__metadata"]) & set(COMMAND_LINE_TARGETS)
return set(["__idedata", "idedata"]) & set(COMMAND_LINE_TARGETS)
def DumpIntegrationIncludes(env):
@@ -141,7 +141,7 @@ def _split_flags_string(env, s):
def DumpIntegrationData(*args):
projenv, globalenv = args[0:2] # pylint: disable=unbalanced-tuple-unpacking
data = {
"build_type": globalenv["BUILD_TYPE"],
"build_type": globalenv.GetBuildType(),
"env_name": globalenv["PIOENV"],
"libsource_dirs": [
globalenv.subst(item) for item in globalenv.GetLibSourceDirs()

View File

@@ -29,7 +29,7 @@ from SCons.Script import DefaultEnvironment # pylint: disable=import-error
from platformio import exception, fs
from platformio.builder.tools import piobuild
from platformio.compat import IS_WINDOWS, hashlib_encode_data, string_types
from platformio.http import HttpClientApiError, InternetConnectionError
from platformio.http import HTTPClientError, InternetConnectionError
from platformio.package.exception import (
MissingPackageManifestError,
UnknownPackageError,
@@ -1005,7 +1005,7 @@ class ProjectAsLibBuilder(LibBuilderBase):
lm.install(spec)
did_install = True
except (
HttpClientApiError,
HTTPClientError,
UnknownPackageError,
InternetConnectionError,
) as exc:
@@ -1159,6 +1159,8 @@ def ConfigureProjectLibBuilder(env):
for lb in lib_builders:
if lb in found_lbs:
lb.search_deps_recursive(lb.get_search_files())
# refill found libs after recursive search
found_lbs = [lb for lb in lib_builders if lb.is_dependent]
for lb in lib_builders:
for deplb in lb.depbuilders[:]:
if deplb not in found_lbs:

View File

@@ -20,19 +20,23 @@ from platformio.proc import exec_command
@util.memoized()
def GetCompilerType(env):
if env.subst("$CC").endswith("-gcc"):
def GetCompilerType(env): # pylint: disable=too-many-return-statements
CC = env.subst("$CC")
if CC.endswith("-gcc"):
return "gcc"
if os.path.basename(CC) == "clang":
return "clang"
try:
sysenv = os.environ.copy()
sysenv["PATH"] = str(env["ENV"]["PATH"])
result = exec_command([env.subst("$CC"), "-v"], env=sysenv)
result = exec_command([CC, "-v"], env=sysenv)
except OSError:
return None
if result["returncode"] != 0:
return None
output = "".join([result["out"], result["err"]]).lower()
if "clang" in output and "LLVM" in output:
if "clang version" in output:
return "clang"
if "gcc" in output:
return "gcc"

View File

@@ -14,33 +14,33 @@
# pylint: disable=too-many-locals
import os
import json
import sys
import time
from os import environ, makedirs, remove
from os.path import isdir, join, splitdrive
from elftools.elf.descriptions import describe_sh_flags
from elftools.elf.elffile import ELFFile
from platformio.compat import IS_WINDOWS
from platformio.proc import exec_command
from platformio.project.memusage import save_report
def _run_tool(cmd, env, tool_args):
sysenv = os.environ.copy()
sysenv = environ.copy()
sysenv["PATH"] = str(env["ENV"]["PATH"])
build_dir = env.subst("$BUILD_DIR")
if not os.path.isdir(build_dir):
os.makedirs(build_dir)
tmp_file = os.path.join(build_dir, "size-data-longcmd.txt")
if not isdir(build_dir):
makedirs(build_dir)
tmp_file = join(build_dir, "size-data-longcmd.txt")
with open(tmp_file, mode="w", encoding="utf8") as fp:
fp.write("\n".join(tool_args))
cmd.append("@" + tmp_file)
result = exec_command(cmd, env=sysenv)
os.remove(tmp_file)
remove(tmp_file)
return result
@@ -92,8 +92,8 @@ def _collect_sections_info(env, elffile):
}
sections[section.name] = section_data
sections[section.name]["in_flash"] = env.memusageIsFlashSection(section_data)
sections[section.name]["in_ram"] = env.memusageIsRamSection(section_data)
sections[section.name]["in_flash"] = env.pioSizeIsFlashSection(section_data)
sections[section.name]["in_ram"] = env.pioSizeIsRamSection(section_data)
return sections
@@ -106,7 +106,7 @@ def _collect_symbols_info(env, elffile, elf_path, sections):
sys.stderr.write("Couldn't find symbol table. Is ELF file stripped?")
env.Exit(1)
sysenv = os.environ.copy()
sysenv = environ.copy()
sysenv["PATH"] = str(env["ENV"]["PATH"])
symbol_addrs = []
@@ -117,7 +117,7 @@ def _collect_symbols_info(env, elffile, elf_path, sections):
symbol_size = s["st_size"]
symbol_type = symbol_info["type"]
if not env.memusageIsValidSymbol(s.name, symbol_type, symbol_addr):
if not env.pioSizeIsValidSymbol(s.name, symbol_type, symbol_addr):
continue
symbol = {
@@ -126,7 +126,7 @@ def _collect_symbols_info(env, elffile, elf_path, sections):
"name": s.name,
"type": symbol_type,
"size": symbol_size,
"section": env.memusageDetermineSection(sections, symbol_addr),
"section": env.pioSizeDetermineSection(sections, symbol_addr),
}
if s.name.startswith("_Z"):
@@ -144,8 +144,8 @@ def _collect_symbols_info(env, elffile, elf_path, sections):
if not location or "?" in location:
continue
if IS_WINDOWS:
drive, tail = os.path.splitdrive(location)
location = os.path.join(drive.upper(), tail)
drive, tail = splitdrive(location)
location = join(drive.upper(), tail)
symbol["file"] = location
symbol["line"] = 0
if ":" in location:
@@ -156,7 +156,7 @@ def _collect_symbols_info(env, elffile, elf_path, sections):
return symbols
def memusageDetermineSection(_, sections, symbol_addr):
def pioSizeDetermineSection(_, sections, symbol_addr):
for section, info in sections.items():
if not info.get("in_flash", False) and not info.get("in_ram", False):
continue
@@ -165,22 +165,22 @@ def memusageDetermineSection(_, sections, symbol_addr):
return "unknown"
def memusageIsValidSymbol(_, symbol_name, symbol_type, symbol_address):
def pioSizeIsValidSymbol(_, symbol_name, symbol_type, symbol_address):
return symbol_name and symbol_address != 0 and symbol_type != "STT_NOTYPE"
def memusageIsRamSection(_, section):
def pioSizeIsRamSection(_, section):
return (
section.get("type", "") in ("SHT_NOBITS", "SHT_PROGBITS")
and section.get("flags", "") == "WA"
)
def memusageIsFlashSection(_, section):
def pioSizeIsFlashSection(_, section):
return section.get("type", "") == "SHT_PROGBITS" and "A" in section.get("flags", "")
def memusageCalculateFirmwareSize(_, sections):
def pioSizeCalculateFirmwareSize(_, sections):
flash_size = ram_size = 0
for section_info in sections.values():
if section_info.get("in_flash", False):
@@ -191,22 +191,20 @@ def memusageCalculateFirmwareSize(_, sections):
return ram_size, flash_size
def DumpMemoryUsage(_, target, source, env): # pylint: disable=unused-argument
result = {"version": 1, "timestamp": int(time.time()), "device": {}, "memory": {}}
def DumpSizeData(_, target, source, env): # pylint: disable=unused-argument
data = {"device": {}, "memory": {}, "version": 1}
board = env.BoardConfig()
if board:
result["device"] = {
data["device"] = {
"mcu": board.get("build.mcu", ""),
"cpu": board.get("build.cpu", ""),
"frequency": board.get("build.f_cpu"),
"flash": int(board.get("upload.maximum_size", 0)),
"ram": int(board.get("upload.maximum_ram_size", 0)),
}
if result["device"]["frequency"] and result["device"]["frequency"].endswith(
"L"
):
result["device"]["frequency"] = int(result["device"]["frequency"][0:-1])
if data["device"]["frequency"] and data["device"]["frequency"].endswith("L"):
data["device"]["frequency"] = int(data["device"]["frequency"][0:-1])
elf_path = env.subst("$PIOMAINPROG")
@@ -218,16 +216,16 @@ def DumpMemoryUsage(_, target, source, env): # pylint: disable=unused-argument
env.Exit(1)
sections = _collect_sections_info(env, elffile)
firmware_ram, firmware_flash = env.memusageCalculateFirmwareSize(sections)
result["memory"]["total"] = {
firmware_ram, firmware_flash = env.pioSizeCalculateFirmwareSize(sections)
data["memory"]["total"] = {
"ram_size": firmware_ram,
"flash_size": firmware_flash,
"sections": sections,
}
result["memory"]["sections"] = sections
files = {}
for symbol in _collect_symbols_info(env, elffile, elf_path, sections):
file_path = symbol.pop("file", "unknown")
file_path = symbol.get("file") or "unknown"
if not files.get(file_path, {}):
files[file_path] = {"symbols": [], "ram_size": 0, "flash_size": 0}
@@ -242,16 +240,16 @@ def DumpMemoryUsage(_, target, source, env): # pylint: disable=unused-argument
files[file_path]["symbols"].append(symbol)
result["memory"]["files"] = []
data["memory"]["files"] = []
for k, v in files.items():
file_data = {"path": k}
file_data.update(v)
result["memory"]["files"].append(file_data)
data["memory"]["files"].append(file_data)
print(
"Memory usage report has been saved to the following location: "
f"\"{save_report(os.getcwd(), env['PIOENV'], result)}\""
)
with open(
join(env.subst("$BUILD_DIR"), "sizedata.json"), mode="w", encoding="utf8"
) as fp:
fp.write(json.dumps(data))
def exists(_):
@@ -259,10 +257,10 @@ def exists(_):
def generate(env):
env.AddMethod(memusageIsRamSection)
env.AddMethod(memusageIsFlashSection)
env.AddMethod(memusageCalculateFirmwareSize)
env.AddMethod(memusageDetermineSection)
env.AddMethod(memusageIsValidSymbol)
env.AddMethod(DumpMemoryUsage)
env.AddMethod(pioSizeIsRamSection)
env.AddMethod(pioSizeIsFlashSection)
env.AddMethod(pioSizeCalculateFirmwareSize)
env.AddMethod(pioSizeDetermineSection)
env.AddMethod(pioSizeIsValidSymbol)
env.AddMethod(DumpSizeData)
return env

View File

@@ -61,7 +61,7 @@ def CleanProject(env, fullclean=False):
print("Done cleaning")
def AddTarget( # pylint: disable=too-many-arguments
def AddTarget( # pylint: disable=too-many-arguments,too-many-positional-arguments
env,
name,
dependencies,

View File

@@ -19,7 +19,6 @@ import json
import os
import shutil
from collections import Counter
from os.path import dirname, isfile
from time import time
import click
@@ -60,7 +59,7 @@ from platformio.project.helpers import find_project_dir_above, get_project_dir
type=click.Choice(DefectItem.SEVERITY_LABELS.values()),
)
@click.option("--skip-packages", is_flag=True)
def cli(
def cli( # pylint: disable=too-many-positional-arguments
environment,
project_dir,
project_conf,
@@ -77,7 +76,7 @@ def cli(
app.set_session_var("custom_project_conf", project_conf)
# find project directory on upper level
if isfile(project_dir):
if os.path.isfile(project_dir):
project_dir = find_project_dir_above(project_dir)
results = []
@@ -150,7 +149,7 @@ def cli(
print_processing_header(tool, envname, env_dump)
ct = CheckToolFactory.new(
tool, project_dir, config, envname, tool_options
tool, os.getcwd(), config, envname, tool_options
)
result = {"env": envname, "tool": tool, "duration": time()}
@@ -250,12 +249,12 @@ def collect_component_stats(result):
components[component].update({DefectItem.SEVERITY_LABELS[defect.severity]: 1})
for defect in result.get("defects", []):
component = dirname(defect.file) or defect.file
component = os.path.dirname(defect.file) or defect.file
_append_defect(component, defect)
if component.lower().startswith(get_project_dir().lower()):
while os.sep in component:
component = dirname(component)
component = os.path.dirname(component)
_append_defect(component, defect)
return components

View File

@@ -29,7 +29,7 @@ class DefectItem:
SEVERITY_LOW = 4
SEVERITY_LABELS = {4: "low", 2: "medium", 1: "high"}
def __init__(
def __init__( # pylint: disable=too-many-positional-arguments
self,
severity,
category,

0
platformio/check/tools/base.py Executable file → Normal file
View File

View File

@@ -63,7 +63,7 @@ def validate_path(ctx, param, value): # pylint: disable=unused-argument
@click.option("-e", "--environment", "environments", multiple=True)
@click.option("-v", "--verbose", is_flag=True)
@click.pass_context
def cli( # pylint: disable=too-many-arguments, too-many-branches
def cli( # pylint: disable=too-many-arguments,too-many-positional-arguments, too-many-branches
ctx,
src,
lib,

View File

@@ -152,7 +152,7 @@ def cli(ctx, **options):
"-f", "--force", is_flag=True, help="Reinstall/redownload library if exists"
)
@click.pass_context
def lib_install( # pylint: disable=too-many-arguments,unused-argument
def lib_install( # pylint: disable=too-many-arguments,too-many-positional-arguments,unused-argument
ctx, libraries, save, silent, interactive, force
):
click.secho(
@@ -210,7 +210,7 @@ def lib_uninstall(ctx, libraries, save, silent):
@click.option("-s", "--silent", is_flag=True, help="Suppress progress reporting")
@click.option("--json-output", is_flag=True)
@click.pass_context
def lib_update( # pylint: disable=too-many-arguments
def lib_update( # pylint: disable=too-many-arguments,too-many-positional-arguments
ctx, libraries, only_check, dry_run, silent, json_output
):
only_check = dry_run or only_check

View File

@@ -159,7 +159,7 @@ def platform_show(ctx, platform, json_output): # pylint: disable=too-many-branc
help="Reinstall/redownload dev/platform and its packages if exist",
)
@click.pass_context
def platform_install( # pylint: disable=too-many-arguments
def platform_install( # pylint: disable=too-many-arguments,too-many-positional-arguments
ctx,
platforms,
with_package,
@@ -224,7 +224,7 @@ def platform_uninstall(ctx, platforms):
@click.option("-s", "--silent", is_flag=True, help="Suppress progress reporting")
@click.option("--json-output", is_flag=True)
@click.pass_context
def platform_update( # pylint: disable=too-many-locals, too-many-arguments
def platform_update( # pylint: disable=too-many-locals,too-many-arguments,too-many-positional-arguments
ctx, platforms, only_check, dry_run, silent, json_output, **_
):
only_check = dry_run or only_check

View File

@@ -76,5 +76,5 @@ def settings_set(ctx, name, value):
@click.pass_context
def settings_reset(ctx):
app.reset_settings()
click.secho("The settings have been reseted!", fg="green")
click.secho("The settings have been reset!", fg="green")
ctx.invoke(settings_get)

View File

@@ -20,7 +20,7 @@ import click
from platformio import VERSION, __version__, app, exception
from platformio.dependencies import get_pip_dependencies
from platformio.http import fetch_http_content
from platformio.http import fetch_remote_content
from platformio.package.manager.core import update_core_packages
from platformio.proc import get_pythonexe_path
@@ -133,7 +133,7 @@ def get_latest_version():
def get_develop_latest_version():
version = None
content = fetch_http_content(DEVELOP_INIT_SCRIPT_URL)
content = fetch_remote_content(DEVELOP_INIT_SCRIPT_URL)
for line in content.split("\n"):
line = line.strip()
if not line.startswith("VERSION"):
@@ -150,5 +150,5 @@ def get_develop_latest_version():
def get_pypi_latest_version():
content = fetch_http_content(PYPI_JSON_URL)
content = fetch_remote_content(PYPI_JSON_URL)
return json.loads(content)["info"]["version"]

View File

@@ -57,7 +57,7 @@ from platformio.project.options import ProjectOptions
@click.option("--interface", type=click.Choice(["gdb"]))
@click.argument("client_extra_args", nargs=-1, type=click.UNPROCESSED)
@click.pass_context
def cli(
def cli( # pylint: disable=too-many-positional-arguments
ctx,
project_dir,
project_conf,
@@ -86,7 +86,7 @@ def cli(
if not interface:
return helpers.predebug_project(
ctx, project_dir, project_config, env_name, False, verbose
ctx, os.getcwd(), project_config, env_name, False, verbose
)
configure_args = (
@@ -106,12 +106,14 @@ def cli(
else:
debug_config = _configure(*configure_args)
_run(project_dir, debug_config, client_extra_args)
_run(os.getcwd(), debug_config, client_extra_args)
return None
def _configure(ctx, project_config, env_name, load_mode, verbose, client_extra_args):
def _configure(
ctx, project_config, env_name, load_mode, verbose, client_extra_args
): # pylint: disable=too-many-positional-arguments
platform = PlatformFactory.from_env(env_name, autoinstall=True)
debug_config = DebugConfigFactory.new(
platform,

View File

@@ -149,7 +149,7 @@ class DebugConfigBase: # pylint: disable=too-many-instance-attributes
def _load_build_data(self):
data = load_build_metadata(
os.getcwd(), self.env_name, cache=True, force_targets=["__debug"]
os.getcwd(), self.env_name, cache=True, build_type="debug"
)
if not data:
raise DebugInvalidOptionsError("Could not load a build configuration")

View File

@@ -76,7 +76,7 @@ def get_default_debug_env(config):
def predebug_project(
ctx, project_dir, project_config, env_name, preload, verbose
): # pylint: disable=too-many-arguments
): # pylint: disable=too-many-arguments,too-many-positional-arguments
debug_testname = project_config.get("env:" + env_name, "debug_test")
if debug_testname:
test_names = list_test_names(project_config)

View File

@@ -12,16 +12,14 @@
# See the License for the specific language governing permissions and
# limitations under the License.
import platform
from platformio.compat import PY36, is_proxy_set
from platformio.compat import is_proxy_set
def get_core_dependencies():
return {
"contrib-piohome": "~3.4.2",
"contrib-pioremote": "~1.0.0",
"tool-scons": "~4.40700.0",
"tool-scons": "~4.40801.0",
"tool-cppcheck": "~1.21100.0",
"tool-clangtidy": "~1.150005.0",
"tool-pvs-studio": "~7.18.0",
@@ -30,13 +28,13 @@ def get_core_dependencies():
def get_pip_dependencies():
core = [
"bottle == 0.12.*",
"click >=8.0.4, <9",
"bottle == 0.13.*",
"click >=8.0.4, <8.1.8",
"colorama",
"httpx%s >=0.22.0, <0.28" % ("[socks]" if is_proxy_set(socks=True) else ""),
"marshmallow == 3.*",
"pyelftools >=0.27, <1",
"pyserial == 3.5.*", # keep in sync "device/monitor/terminal.py"
"requests%s == 2.*" % ("[socks]" if is_proxy_set(socks=True) else ""),
"semantic_version == 2.10.*",
"tabulate == 0.*",
]
@@ -44,16 +42,16 @@ def get_pip_dependencies():
home = [
# PIO Home requirements
"ajsonrpc == 1.2.*",
"starlette >=0.19, <0.38",
"uvicorn %s" % ("== 0.16.0" if PY36 else ">=0.16, <0.30"),
"starlette >=0.19, <0.47",
"uvicorn >=0.16, <0.35",
"wsproto == 1.*",
]
extra = []
# issue #4702; Broken "requests/charset_normalizer" on macOS ARM
if platform.system() == "Darwin" and "arm" in platform.machine().lower():
extra.append("chardet>=3.0.2,<6")
extra.append(
'chardet >= 3.0.2,<6; platform_system == "Darwin" and "arm" in platform_machine'
)
# issue 4614: urllib3 v2.0 only supports OpenSSL 1.1.1+
try:

View File

@@ -89,7 +89,7 @@ def is_serial_port_ready(port, timeout=1):
class SerialPortFinder:
def __init__( # pylint: disable=too-many-arguments
def __init__( # pylint: disable=too-many-arguments,too-many-positional-arguments
self,
board_config=None,
upload_protocol=None,

View File

@@ -29,16 +29,15 @@ from platformio.compat import IS_WINDOWS
class cd:
def __init__(self, path):
self.path = path
self._old_cwd = []
def __init__(self, new_path):
self.new_path = new_path
self.prev_path = os.getcwd()
def __enter__(self):
self._old_cwd.append(os.getcwd())
os.chdir(self.path)
os.chdir(self.new_path)
def __exit__(self, *excinfo):
os.chdir(self._old_cwd.pop())
def __exit__(self, etype, value, traceback):
os.chdir(self.prev_path)
def get_source_dir():

View File

@@ -12,14 +12,19 @@
# See the License for the specific language governing permissions and
# limitations under the License.
from ajsonrpc.core import JSONRPC20DispatchException
from platformio.account.client import AccountClient
from platformio.home.rpc.handlers.base import BaseRPCHandler
class AccountRPC(BaseRPCHandler):
NAMESPACE = "account"
@staticmethod
def call_client(method, *args, **kwargs):
with AccountClient() as client:
try:
client = AccountClient()
return getattr(client, method)(*args, **kwargs)
except Exception as exc: # pylint: disable=bare-except
raise JSONRPC20DispatchException(
code=5000, message="PIO Account Call Error", data=str(exc)
) from exc

View File

@@ -20,7 +20,6 @@ from platformio.project.helpers import is_platformio_project
class AppRPC(BaseRPCHandler):
NAMESPACE = "app"
IGNORE_STORAGE_KEYS = [
"cid",
"coreVersion",

View File

@@ -14,6 +14,4 @@
class BaseRPCHandler:
NAMESPACE = None
factory = None

View File

@@ -1,123 +0,0 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import asyncio
import functools
import os
from platformio import __main__, __version__, app, proc, util
from platformio.compat import (
IS_WINDOWS,
aio_create_task,
aio_get_running_loop,
get_locale_encoding,
shlex_join,
)
from platformio.exception import UserSideException
from platformio.home.rpc.handlers.base import BaseRPCHandler
class PIOCoreCallError(UserSideException):
MESSAGE = 'An error occured while executing PIO Core command: "{0}"\n\n{1}'
class PIOCoreProtocol(asyncio.SubprocessProtocol):
def __init__(self, exit_future, on_data_callback=None):
self.exit_future = exit_future
self.on_data_callback = on_data_callback
self.stdout = ""
self.stderr = ""
self._is_exited = False
self._encoding = get_locale_encoding()
def pipe_data_received(self, fd, data):
data = data.decode(self._encoding, "replace")
pipe = ["stdin", "stdout", "stderr"][fd]
if pipe == "stdout":
self.stdout += data
if pipe == "stderr":
self.stderr += data
if self.on_data_callback:
self.on_data_callback(pipe=pipe, data=data)
def connection_lost(self, exc):
self.process_exited()
def process_exited(self):
if self._is_exited:
return
self.exit_future.set_result(True)
self._is_exited = True
@util.memoized(expire="60s")
def get_core_fullpath():
return proc.where_is_program("platformio" + (".exe" if IS_WINDOWS else ""))
class CoreRPC(BaseRPCHandler):
NAMESPACE = "core"
@staticmethod
def version():
return __version__
async def exec(self, args, options=None, raise_exception=True):
options = options or {}
loop = aio_get_running_loop()
exit_future = loop.create_future()
data_callback = functools.partial(
self._on_exec_data_received, exec_options=options
)
if args[0] != "--caller" and app.get_session_var("caller_id"):
args = ["--caller", app.get_session_var("caller_id")] + args
kwargs = options.get("spawn", {})
if "force_ansi" in options:
environ = kwargs.get("env", os.environ.copy())
environ["PLATFORMIO_FORCE_ANSI"] = "true"
kwargs["env"] = environ
transport, protocol = await loop.subprocess_exec(
lambda: PIOCoreProtocol(exit_future, data_callback),
get_core_fullpath(),
*args,
stdin=None,
**kwargs,
)
await exit_future
transport.close()
return_code = transport.get_returncode()
if return_code != 0 and raise_exception:
raise PIOCoreCallError(
shlex_join(["pio"] + args), f"{protocol.stdout}\n{protocol.stderr}"
)
return {
"stdout": protocol.stdout,
"stderr": protocol.stderr,
"returncode": return_code,
}
def _on_exec_data_received(self, exec_options, pipe, data):
notification_method = exec_options.get(f"{pipe}NotificationMethod")
if not notification_method:
return
aio_create_task(
self.factory.notify_clients(
method=notification_method,
params=[data],
actor="frontend",
)
)

View File

@@ -22,7 +22,6 @@ from platformio.home.rpc.handlers.base import BaseRPCHandler
class IDERPC(BaseRPCHandler):
NAMESPACE = "ide"
COMMAND_TIMEOUT = 1.5 # in seconds
def __init__(self):

View File

@@ -1,127 +0,0 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import functools
import os
from platformio.home.rpc.handlers.base import BaseRPCHandler
from platformio.project import memusage
class MemUsageRPC(BaseRPCHandler):
NAMESPACE = "memusage"
async def profile(self, project_dir, env, options=None):
options = options or {}
report_dir = memusage.get_report_dir(project_dir, env)
if options.get("lazy"):
existing_reports = memusage.list_reports(report_dir)
if existing_reports:
return existing_reports[-1]
await self.factory.manager.dispatcher["core.exec"](
["run", "-d", project_dir, "-e", env, "-t", "__memusage"],
options=options.get("exec"),
)
return memusage.list_reports(report_dir)[-1]
@staticmethod
def load_report(path):
return memusage.read_report(path)
def summary(self, report_path):
max_top_items = 10
report_dir = os.path.dirname(report_path)
existing_reports = memusage.list_reports(report_dir)
current_report = memusage.read_report(report_path)
previous_report = None
try:
current_index = existing_reports.index(report_path)
if current_index > 0:
previous_report = memusage.read_report(
existing_reports[current_index - 1]
)
except ValueError:
pass
return dict(
timestamp=dict(
current=current_report["timestamp"],
previous=previous_report["timestamp"] if previous_report else None,
),
device=current_report["device"],
trend=dict(
current=current_report["memory"]["total"],
previous=(
previous_report["memory"]["total"] if previous_report else None
),
),
top=dict(
files=self._calculate_top_files(current_report["memory"]["files"])[
0:max_top_items
],
symbols=self._calculate_top_symbols(current_report["memory"]["files"])[
0:max_top_items
],
sections=sorted(
current_report["memory"]["sections"].values(),
key=lambda item: item["size"],
reverse=True,
)[0:max_top_items],
),
)
@staticmethod
def _calculate_top_files(items):
return [
{"path": item["path"], "ram": item["ram_size"], "flash": item["flash_size"]}
for item in sorted(
items,
key=lambda item: item["ram_size"] + item["flash_size"],
reverse=True,
)
]
@staticmethod
def _calculate_top_symbols(files):
symbols = functools.reduce(
lambda result, filex: result
+ [
{
"name": s["name"],
"type": s["type"],
"size": s["size"],
"file": filex["path"],
"line": s.get("line"),
}
for s in filex["symbols"]
],
files,
[],
)
return sorted(symbols, key=lambda item: item["size"], reverse=True)
async def history(self, project_dir, env, nums=10):
result = []
report_dir = memusage.get_report_dir(project_dir, env)
reports = memusage.list_reports(report_dir)[nums * -1 :]
for path in reports:
data = memusage.read_report(path)
result.append(
{
"timestamp": data["timestamp"],
"ram": data["memory"]["total"]["ram_size"],
"flash": data["memory"]["total"]["flash_size"],
}
)
return result
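The ranking behind `_calculate_top_files` can be sketched in isolation: sort by the combined RAM + flash footprint, largest first, then take the top N. The `files` records below are hypothetical stand-ins for the report's per-file entries:

```python
# Hypothetical per-file records mirroring the report structure used above.
files = [
    {"path": "src/main.cpp", "ram_size": 120, "flash_size": 4000},
    {"path": "lib/net.cpp", "ram_size": 900, "flash_size": 1500},
    {"path": "lib/ui.cpp", "ram_size": 50, "flash_size": 300},
]

def top_files(items, limit=10):
    # Rank by combined RAM + flash footprint, largest first.
    ranked = sorted(
        items, key=lambda i: i["ram_size"] + i["flash_size"], reverse=True
    )
    return [
        {"path": i["path"], "ram": i["ram_size"], "flash": i["flash_size"]}
        for i in ranked[:limit]
    ]

print([f["path"] for f in top_files(files, 2)])  # → ['src/main.cpp', 'lib/net.cpp']
```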

View File

@@ -22,8 +22,6 @@ from platformio.home.rpc.handlers.os import OSRPC
class MiscRPC(BaseRPCHandler):
NAMESPACE = "misc"
async def load_latest_tweets(self, data_url):
cache_key = ContentCache.key_from_args(data_url, "tweets")
cache_valid = "180d"

View File

@@ -15,22 +15,32 @@
import glob
import io
import os
import shutil
from functools import cmp_to_key
import click
from platformio import fs
from platformio.cache import ContentCache
from platformio.compat import aio_to_thread
from platformio.device.list.util import list_logical_devices
from platformio.home.rpc.handlers.base import BaseRPCHandler
from platformio.http import HTTPSession, ensure_internet_on
class HTTPAsyncSession(HTTPSession):
async def request( # pylint: disable=signature-differs,invalid-overridden-method
self, *args, **kwargs
):
func = super().request
return await aio_to_thread(func, *args, **kwargs)
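`HTTPAsyncSession` offloads each blocking request to a worker thread via `aio_to_thread`, so the event loop stays responsive. The same pattern can be sketched with the standard-library `asyncio.to_thread` (assumed here to behave like `aio_to_thread`); `blocking_fetch` is a hypothetical stand-in for a real HTTP call:

```python
import asyncio
import time

def blocking_fetch(url):
    # Hypothetical stand-in for a blocking HTTP request.
    time.sleep(0.01)
    return f"body of {url}"

async def main():
    # Run the blocking callable in a worker thread without blocking the loop,
    # which is what aio_to_thread is assumed to do in the wrapper above.
    return await asyncio.to_thread(blocking_fetch, "http://example.com")

print(asyncio.run(main()))  # → body of http://example.com
```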
class OSRPC(BaseRPCHandler):
NAMESPACE = "os"
_http_session = None
@classmethod
def fetch_content(cls, url, data=None, headers=None, cache_valid=None):
async def fetch_content(cls, url, data=None, headers=None, cache_valid=None):
if not headers:
headers = {
"User-Agent": (
@@ -42,33 +52,35 @@ class OSRPC(BaseRPCHandler):
cache_key = ContentCache.key_from_args(url, data) if cache_valid else None
with ContentCache() as cc:
if cache_key:
content = cc.get(cache_key)
if content is not None:
return content
result = cc.get(cache_key)
if result is not None:
return result
# check the Internet connection first to avoid a 60-second request timeout
ensure_internet_on(raise_exception=True)
with HTTPSession() as session:
if data:
response = session.post(url, data=data, headers=headers)
else:
response = session.get(url, headers=headers)
if not cls._http_session:
cls._http_session = HTTPAsyncSession()
response.raise_for_status()
content = response.text
if cache_valid:
with ContentCache() as cc:
cc.set(cache_key, content, cache_valid)
return content
if data:
r = await cls._http_session.post(url, data=data, headers=headers)
else:
r = await cls._http_session.get(url, headers=headers)
@classmethod
def request_content(cls, uri, data=None, headers=None, cache_valid=None):
r.raise_for_status()
result = r.text
if cache_valid:
with ContentCache() as cc:
cc.set(cache_key, result, cache_valid)
return result
async def request_content(self, uri, data=None, headers=None, cache_valid=None):
if uri.startswith("http"):
return cls.fetch_content(uri, data, headers, cache_valid)
return await self.fetch_content(uri, data, headers, cache_valid)
local_path = uri[7:] if uri.startswith("file://") else uri
with io.open(local_path, encoding="utf-8") as fp:
return fp.read()
return None
@staticmethod
def open_url(url):
@@ -98,10 +110,22 @@ class OSRPC(BaseRPCHandler):
def is_dir(path):
return os.path.isdir(path)
@staticmethod
def make_dirs(path):
return os.makedirs(path)
@staticmethod
def get_file_mtime(path):
return os.path.getmtime(path)
@staticmethod
def rename(src, dst):
return os.rename(src, dst)
@staticmethod
def copy(src, dst):
return shutil.copytree(src, dst, symlinks=True)
@staticmethod
def glob(pathnames, root=None):
if not isinstance(pathnames, list):

View File

@@ -0,0 +1,229 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import asyncio
import functools
import io
import json
import os
import sys
import threading
import click
from ajsonrpc.core import JSONRPC20DispatchException
from platformio import __main__, __version__, app, fs, proc, util
from platformio.compat import (
IS_WINDOWS,
aio_create_task,
aio_get_running_loop,
aio_to_thread,
get_locale_encoding,
is_bytes,
)
from platformio.exception import PlatformioException
from platformio.home.rpc.handlers.base import BaseRPCHandler
class PIOCoreProtocol(asyncio.SubprocessProtocol):
def __init__(self, exit_future, on_data_callback=None):
self.exit_future = exit_future
self.on_data_callback = on_data_callback
self.stdout = ""
self.stderr = ""
self._is_exited = False
self._encoding = get_locale_encoding()
def pipe_data_received(self, fd, data):
data = data.decode(self._encoding, "replace")
pipe = ["stdin", "stdout", "stderr"][fd]
if pipe == "stdout":
self.stdout += data
if pipe == "stderr":
self.stderr += data
if self.on_data_callback:
self.on_data_callback(pipe=pipe, data=data)
def connection_lost(self, exc):
self.process_exited()
def process_exited(self):
if self._is_exited:
return
self.exit_future.set_result(True)
self._is_exited = True
class MultiThreadingStdStream:
def __init__(self, parent_stream):
self._buffers = {threading.get_ident(): parent_stream}
def __getattr__(self, name):
thread_id = threading.get_ident()
self._ensure_thread_buffer(thread_id)
return getattr(self._buffers[thread_id], name)
def _ensure_thread_buffer(self, thread_id):
if thread_id not in self._buffers:
self._buffers[thread_id] = io.StringIO()
def write(self, value):
thread_id = threading.get_ident()
self._ensure_thread_buffer(thread_id)
return self._buffers[thread_id].write(
value.decode() if is_bytes(value) else value
)
def get_value_and_reset(self):
result = ""
try:
result = self.getvalue()
self.seek(0)
self.truncate(0)
except AttributeError:
pass
return result
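`MultiThreadingStdStream` keys a separate `io.StringIO` buffer per thread so that concurrent inline PIO Core calls do not interleave their captured output. A minimal sketch of the same idea:

```python
import io
import threading

class ThreadLocalBuffer:
    # Simplified sketch of the per-thread buffering above:
    # each thread writes to its own StringIO, keyed by thread id.
    def __init__(self):
        self._buffers = {}

    def _buf(self):
        return self._buffers.setdefault(threading.get_ident(), io.StringIO())

    def write(self, value):
        return self._buf().write(value)

    def getvalue(self):
        return self._buf().getvalue()

buf = ThreadLocalBuffer()
results = {}

def worker(name):
    # Each thread sees only its own writes.
    buf.write(f"hello from {name}")
    results[name] = buf.getvalue()

threads = [threading.Thread(target=worker, args=(n,)) for n in ("a", "b")]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(results)
```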
@util.memoized(expire="60s")
def get_core_fullpath():
return proc.where_is_program("platformio" + (".exe" if IS_WINDOWS else ""))
class PIOCoreRPC(BaseRPCHandler):
@staticmethod
def version():
return __version__
async def exec(self, args, options=None):
loop = aio_get_running_loop()
exit_future = loop.create_future()
data_callback = functools.partial(
self._on_exec_data_received, exec_options=options
)
if args[0] != "--caller" and app.get_session_var("caller_id"):
args = ["--caller", app.get_session_var("caller_id")] + args
transport, protocol = await loop.subprocess_exec(
lambda: PIOCoreProtocol(exit_future, data_callback),
get_core_fullpath(),
*args,
stdin=None,
**options.get("spawn", {}),
)
await exit_future
transport.close()
return {
"stdout": protocol.stdout,
"stderr": protocol.stderr,
"returncode": transport.get_returncode(),
}
def _on_exec_data_received(self, exec_options, pipe, data):
notification_method = exec_options.get(f"{pipe}NotificationMethod")
if not notification_method:
return
aio_create_task(
self.factory.notify_clients(
method=notification_method,
params=[data],
actor="frontend",
)
)
@staticmethod
def setup_multithreading_std_streams():
if isinstance(sys.stdout, MultiThreadingStdStream):
return
PIOCoreRPC.thread_stdout = MultiThreadingStdStream(sys.stdout)
PIOCoreRPC.thread_stderr = MultiThreadingStdStream(sys.stderr)
sys.stdout = PIOCoreRPC.thread_stdout
sys.stderr = PIOCoreRPC.thread_stderr
@staticmethod
async def call(args, options=None):
for i, arg in enumerate(args):
if not isinstance(arg, str):
args[i] = str(arg)
options = options or {}
to_json = "--json-output" in args
try:
if options.get("force_subprocess"):
result = await PIOCoreRPC._call_subprocess(args, options)
return PIOCoreRPC._process_result(result, to_json)
result = await PIOCoreRPC._call_inline(args, options)
try:
return PIOCoreRPC._process_result(result, to_json)
except ValueError:
# fall back to the subprocess method
result = await PIOCoreRPC._call_subprocess(args, options)
return PIOCoreRPC._process_result(result, to_json)
except Exception as exc:  # pylint: disable=broad-except
raise JSONRPC20DispatchException(
code=5000, message="PIO Core Call Error", data=str(exc)
) from exc
@staticmethod
async def _call_subprocess(args, options):
result = await aio_to_thread(
proc.exec_command,
[get_core_fullpath()] + args,
cwd=options.get("cwd") or os.getcwd(),
)
return (result["out"], result["err"], result["returncode"])
@staticmethod
async def _call_inline(args, options):
PIOCoreRPC.setup_multithreading_std_streams()
def _thread_safe_call(args, cwd):
with fs.cd(cwd):
exit_code = __main__.main(["-c"] + args)
return (
PIOCoreRPC.thread_stdout.get_value_and_reset(),
PIOCoreRPC.thread_stderr.get_value_and_reset(),
exit_code,
)
return await aio_to_thread(
_thread_safe_call, args=args, cwd=options.get("cwd") or os.getcwd()
)
@staticmethod
def _process_result(result, to_json=False):
out, err, code = result
if out and is_bytes(out):
out = out.decode(get_locale_encoding())
if err and is_bytes(err):
err = err.decode(get_locale_encoding())
text = ("%s\n\n%s" % (out, err)).strip()
if code != 0:
raise PlatformioException(text)
if not to_json:
return text
try:
return json.loads(out)
except ValueError as exc:
click.secho("%s => `%s`" % (exc, out), fg="red", err=True)
# PIO Core may print unhandled warnings before the JSON payload
for line in out.split("\n"):
line = line.strip()
if not line:
continue
try:
return json.loads(line)
except ValueError:
pass
raise exc
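`_process_result` can recover a JSON payload even when PIO Core prints warnings ahead of it, by retrying each output line individually. A self-contained sketch of that fallback:

```python
import json

def extract_json(out):
    # Mirrors the fallback above: if the full output is not valid JSON
    # (e.g. warnings were printed first), try each non-empty line in turn.
    try:
        return json.loads(out)
    except ValueError:
        for line in out.split("\n"):
            line = line.strip()
            if not line:
                continue
            try:
                return json.loads(line)
            except ValueError:
                pass
        raise

noisy = 'Warning: deprecated option\n{"ok": true}'
print(extract_json(noisy))  # → {'ok': True}
```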

View File

@@ -14,8 +14,8 @@
import os.path
from platformio.compat import aio_to_thread
from platformio.home.rpc.handlers.base import BaseRPCHandler
from platformio.home.rpc.handlers.registry import RegistryRPC
from platformio.package.manager.platform import PlatformPackageManager
from platformio.package.manifest.parser import ManifestParserFactory
from platformio.package.meta import PackageSpec
@@ -23,13 +23,15 @@ from platformio.platform.factory import PlatformFactory
class PlatformRPC(BaseRPCHandler):
NAMESPACE = "platform"
def fetch_platforms(self, search_query=None, page=0, force_installed=False):
async def fetch_platforms(self, search_query=None, page=0, force_installed=False):
if force_installed:
return {"items": self._load_installed_platforms(search_query)}
return {
"items": await aio_to_thread(
self._load_installed_platforms, search_query
)
}
search_result = RegistryRPC.call_client(
search_result = await self.factory.manager.dispatcher["registry.call_client"](
method="list_packages",
query=search_query,
qualifiers={
@@ -86,17 +88,17 @@ class PlatformRPC(BaseRPCHandler):
)
return items
def fetch_boards(self, platform_spec):
async def fetch_boards(self, platform_spec):
spec = PackageSpec(platform_spec)
if spec.owner:
return RegistryRPC.call_client(
return await self.factory.manager.dispatcher["registry.call_client"](
method="get_package",
typex="platform",
owner=spec.owner,
name=spec.name,
extra_path="/boards",
)
return self._load_installed_boards(spec)
return await aio_to_thread(self._load_installed_boards, spec)
@staticmethod
def _load_installed_boards(platform_spec):
@@ -106,17 +108,17 @@ class PlatformRPC(BaseRPCHandler):
key=lambda item: item["name"],
)
def fetch_examples(self, platform_spec):
async def fetch_examples(self, platform_spec):
spec = PackageSpec(platform_spec)
if spec.owner:
return RegistryRPC.call_client(
return await self.factory.manager.dispatcher["registry.call_client"](
method="get_package",
typex="platform",
owner=spec.owner,
name=spec.name,
extra_path="/examples",
)
return self._load_installed_examples(spec)
return await aio_to_thread(self._load_installed_examples, spec)
@staticmethod
def _load_installed_examples(platform_spec):

View File

@@ -13,24 +13,29 @@
# limitations under the License.
import os
import shutil
import time
from pathlib import Path
import semantic_version
from ajsonrpc.core import JSONRPC20DispatchException
from platformio import app, fs
from platformio import app, exception, fs
from platformio.home.rpc.handlers.app import AppRPC
from platformio.home.rpc.handlers.base import BaseRPCHandler
from platformio.home.rpc.handlers.piocore import PIOCoreRPC
from platformio.package.manager.platform import PlatformPackageManager
from platformio.platform.factory import PlatformFactory
from platformio.project.config import ProjectConfig
from platformio.project.helpers import get_project_dir
from platformio.project.exception import ProjectError
from platformio.project.helpers import get_project_dir, is_platformio_project
from platformio.project.integration.generator import ProjectGenerator
from platformio.project.options import get_config_options_schema
class ProjectRPC(BaseRPCHandler):
NAMESPACE = "project"
@staticmethod
async def config_call(init_kwargs, method, *args):
def config_call(init_kwargs, method, *args):
assert isinstance(init_kwargs, dict)
assert "path" in init_kwargs
if os.path.isdir(init_kwargs["path"]):
@@ -43,20 +48,249 @@ class ProjectRPC(BaseRPCHandler):
with fs.cd(project_dir):
return getattr(ProjectConfig(**init_kwargs), method)(*args)
async def init(self, configuration, options=None):
@staticmethod
def config_load(path):
return ProjectConfig(
path, parse_extra=False, expand_interpolations=False
).as_tuple()
@staticmethod
def config_dump(path, data):
config = ProjectConfig(path, parse_extra=False, expand_interpolations=False)
config.update(data, clear=True)
return config.save()
@staticmethod
def config_update_description(path, text):
config = ProjectConfig(path, parse_extra=False, expand_interpolations=False)
if not config.has_section("platformio"):
config.add_section("platformio")
if text:
config.set("platformio", "description", text)
else:
if config.has_option("platformio", "description"):
config.remove_option("platformio", "description")
if not config.options("platformio"):
config.remove_section("platformio")
return config.save()
@staticmethod
def get_config_schema():
return get_config_options_schema()
@staticmethod
def get_projects():
def _get_project_data():
data = {"boards": [], "envLibdepsDirs": [], "libExtraDirs": []}
config = ProjectConfig()
data["envs"] = config.envs()
data["description"] = config.get("platformio", "description")
data["libExtraDirs"].extend(config.get("platformio", "lib_extra_dirs", []))
libdeps_dir = config.get("platformio", "libdeps_dir")
for section in config.sections():
if not section.startswith("env:"):
continue
data["envLibdepsDirs"].append(os.path.join(libdeps_dir, section[4:]))
if config.has_option(section, "board"):
data["boards"].append(config.get(section, "board"))
data["libExtraDirs"].extend(config.get(section, "lib_extra_dirs", []))
# skip non-existing folders and resolve full paths
for key in ("envLibdepsDirs", "libExtraDirs"):
data[key] = [
fs.expanduser(d) if d.startswith("~") else os.path.abspath(d)
for d in data[key]
if os.path.isdir(d)
]
return data
def _path_to_name(path):
return (os.path.sep).join(path.split(os.path.sep)[-2:])
result = []
pm = PlatformPackageManager()
for project_dir in AppRPC.load_state()["storage"]["recentProjects"]:
if not os.path.isdir(project_dir):
continue
data = {}
boards = []
try:
with fs.cd(project_dir):
data = _get_project_data()
except ProjectError:
continue
for board_id in data.get("boards", []):
name = board_id
try:
name = pm.board_config(board_id)["name"]
except exception.PlatformioException:
pass
boards.append({"id": board_id, "name": name})
result.append(
{
"path": project_dir,
"name": _path_to_name(project_dir),
"modified": int(os.path.getmtime(project_dir)),
"boards": boards,
"description": data.get("description"),
"envs": data.get("envs", []),
"envLibStorages": [
{"name": os.path.basename(d), "path": d}
for d in data.get("envLibdepsDirs", [])
],
"extraLibStorages": [
{"name": _path_to_name(d), "path": d}
for d in data.get("libExtraDirs", [])
],
}
)
return result
@staticmethod
def get_project_examples():
result = []
pm = PlatformPackageManager()
for pkg in pm.get_installed():
examples_dir = os.path.join(pkg.path, "examples")
if not os.path.isdir(examples_dir):
continue
items = []
for project_dir, _, __ in os.walk(examples_dir):
project_description = None
try:
config = ProjectConfig(os.path.join(project_dir, "platformio.ini"))
config.validate(silent=True)
project_description = config.get("platformio", "description")
except ProjectError:
continue
path_tokens = project_dir.split(os.path.sep)
items.append(
{
"name": "/".join(
path_tokens[path_tokens.index("examples") + 1 :]
),
"path": project_dir,
"description": project_description,
}
)
manifest = pm.load_manifest(pkg)
result.append(
{
"platform": {
"title": manifest["title"],
"version": manifest["version"],
},
"items": sorted(items, key=lambda item: item["name"]),
}
)
return sorted(result, key=lambda data: data["platform"]["title"])
async def init(self, board, framework, project_dir):
assert project_dir
if not os.path.isdir(project_dir):
os.makedirs(project_dir)
args = ["init", "--board", board, "--sample-code"]
if framework:
args.extend(["--project-option", "framework = %s" % framework])
ide = app.get_session_var("caller_id")
if ide in ProjectGenerator.get_supported_ides():
args.extend(["--ide", ide])
await PIOCoreRPC.call(
args, options={"cwd": project_dir, "force_subprocess": True}
)
return project_dir
@staticmethod
async def import_arduino(board, use_arduino_libs, arduino_project_dir):
board = str(board)
# don't import an existing PlatformIO project
if is_platformio_project(arduino_project_dir):
return arduino_project_dir
is_arduino_project = any(
os.path.isfile(
os.path.join(
arduino_project_dir,
"%s.%s" % (os.path.basename(arduino_project_dir), ext),
)
)
for ext in ("ino", "pde")
)
if not is_arduino_project:
raise JSONRPC20DispatchException(
code=4000, message="Not an Arduino project: %s" % arduino_project_dir
)
state = AppRPC.load_state()
project_dir = os.path.join(
state["storage"]["projectsDir"], time.strftime("%y%m%d-%H%M%S-") + board
)
if not os.path.isdir(project_dir):
os.makedirs(project_dir)
args = ["init", "--board", board]
args.extend(["--project-option", "framework = arduino"])
if use_arduino_libs:
args.extend(
["--project-option", "lib_extra_dirs = ~/Documents/Arduino/libraries"]
)
ide = app.get_session_var("caller_id")
if ide in ProjectGenerator.get_supported_ides():
args.extend(["--ide", ide])
await PIOCoreRPC.call(
args, options={"cwd": project_dir, "force_subprocess": True}
)
with fs.cd(project_dir):
config = ProjectConfig()
src_dir = config.get("platformio", "src_dir")
if os.path.isdir(src_dir):
fs.rmtree(src_dir)
shutil.copytree(arduino_project_dir, src_dir, symlinks=True)
return project_dir
@staticmethod
async def import_pio(project_dir):
if not project_dir or not is_platformio_project(project_dir):
raise JSONRPC20DispatchException(
code=4001, message="Not an PlatformIO project: %s" % project_dir
)
new_project_dir = os.path.join(
AppRPC.load_state()["storage"]["projectsDir"],
time.strftime("%y%m%d-%H%M%S-") + os.path.basename(project_dir),
)
shutil.copytree(project_dir, new_project_dir, symlinks=True)
args = ["init"]
ide = app.get_session_var("caller_id")
if ide in ProjectGenerator.get_supported_ides():
args.extend(["--ide", ide])
await PIOCoreRPC.call(
args, options={"cwd": new_project_dir, "force_subprocess": True}
)
return new_project_dir
async def init_v2(self, configuration, options=None):
project_dir = os.path.join(configuration["location"], configuration["name"])
if not os.path.isdir(project_dir):
os.makedirs(project_dir)
args = ["project", "init", "-d", project_dir]
envclone = os.environ.copy()
envclone["PLATFORMIO_FORCE_ANSI"] = "true"
options = options or {}
options["spawn"] = {"env": envclone, "cwd": project_dir}
args = ["project", "init"]
ide = app.get_session_var("caller_id")
if ide in ProjectGenerator.get_supported_ides():
args.extend(["--ide", ide])
exec_options = options.get("exec", {})
if configuration.get("example"):
await self.factory.notify_clients(
method=exec_options.get("stdoutNotificationMethod"),
method=options.get("stdoutNotificationMethod"),
params=["Copying example files...\n"],
actor="frontend",
)
@@ -64,9 +298,7 @@ class ProjectRPC(BaseRPCHandler):
else:
args.extend(self._pre_init_empty(configuration))
return await self.factory.manager.dispatcher["core.exec"](
args, options=exec_options, raise_exception=False
)
return await self.factory.manager.dispatcher["core.exec"](args, options=options)
@staticmethod
def _pre_init_empty(configuration):
@@ -110,10 +342,10 @@ class ProjectRPC(BaseRPCHandler):
return []
@staticmethod
async def configuration(project_dir, env):
def configuration(project_dir, env):
assert is_platformio_project(project_dir)
with fs.cd(project_dir):
config = ProjectConfig.get_instance()
config.validate(envs=[env])
config = ProjectConfig(os.path.join(project_dir, "platformio.ini"))
platform = PlatformFactory.from_env(env, autoinstall=True)
platform_pkg = PlatformPackageManager().get_package(platform.get_dir())
board_id = config.get(f"env:{env}", "board", None)

View File

@@ -12,14 +12,20 @@
# See the License for the specific language governing permissions and
# limitations under the License.
from ajsonrpc.core import JSONRPC20DispatchException
from platformio.compat import aio_to_thread
from platformio.home.rpc.handlers.base import BaseRPCHandler
from platformio.registry.client import RegistryClient
class RegistryRPC(BaseRPCHandler):
NAMESPACE = "registry"
@staticmethod
def call_client(method, *args, **kwargs):
with RegistryClient() as client:
return getattr(client, method)(*args, **kwargs)
async def call_client(method, *args, **kwargs):
try:
client = RegistryClient()
return await aio_to_thread(getattr(client, method), *args, **kwargs)
except Exception as exc:  # pylint: disable=broad-except
raise JSONRPC20DispatchException(
code=5000, message="Registry Call Error", data=str(exc)
) from exc

View File

@@ -12,24 +12,22 @@
# See the License for the specific language governing permissions and
# limitations under the License.
import functools
import inspect
from urllib.parse import parse_qs
import ajsonrpc.manager
import ajsonrpc.utils
import click
from ajsonrpc.core import JSONRPC20Error, JSONRPC20Request
from ajsonrpc.dispatcher import Dispatcher
from ajsonrpc.manager import AsyncJSONRPCResponseManager, JSONRPC20Response
from starlette.endpoints import WebSocketEndpoint
from platformio.compat import aio_create_task, aio_get_running_loop, aio_to_thread
from platformio.compat import aio_create_task, aio_get_running_loop
from platformio.http import InternetConnectionError
from platformio.proc import force_exit
# Remove this line when PR is merged
# https://github.com/pavlov99/ajsonrpc/pull/22
ajsonrpc.manager.is_invalid_params = lambda *args, **kwargs: False
ajsonrpc.utils.is_invalid_params = lambda: False
class JSONRPCServerFactoryBase:
@@ -46,18 +44,9 @@ class JSONRPCServerFactoryBase:
def __call__(self, *args, **kwargs):
raise NotImplementedError
def add_object_handler(self, obj):
obj.factory = self
namespace = obj.NAMESPACE or obj.__class__.__name__
for name in dir(obj):
method = getattr(obj, name)
if name.startswith("_") or not (
inspect.ismethod(method) or inspect.isfunction(method)
):
continue
if not inspect.iscoroutinefunction(method):
method = functools.partial(aio_to_thread, method)
self.manager.dispatcher.add_function(method, name=f"{namespace}.{name}")
def add_object_handler(self, handler, namespace):
handler.factory = self
self.manager.dispatcher.add_object(handler, prefix="%s." % namespace)
def on_client_connect(self, connection, actor=None):
self._clients[connection] = {"actor": actor}

View File

@@ -28,11 +28,10 @@ from platformio.compat import aio_get_running_loop
from platformio.exception import PlatformioException
from platformio.home.rpc.handlers.account import AccountRPC
from platformio.home.rpc.handlers.app import AppRPC
from platformio.home.rpc.handlers.core import CoreRPC
from platformio.home.rpc.handlers.ide import IDERPC
from platformio.home.rpc.handlers.memusage import MemUsageRPC
from platformio.home.rpc.handlers.misc import MiscRPC
from platformio.home.rpc.handlers.os import OSRPC
from platformio.home.rpc.handlers.piocore import PIOCoreRPC
from platformio.home.rpc.handlers.platform import PlatformRPC
from platformio.home.rpc.handlers.project import ProjectRPC
from platformio.home.rpc.handlers.registry import RegistryRPC
@@ -68,16 +67,15 @@ def run_server(host, port, no_open, shutdown_timeout, home_url):
raise PlatformioException("Invalid path to PIO Home Contrib")
ws_rpc_factory = WebSocketJSONRPCServerFactory(shutdown_timeout)
ws_rpc_factory.add_object_handler(AccountRPC())
ws_rpc_factory.add_object_handler(AppRPC())
ws_rpc_factory.add_object_handler(IDERPC())
ws_rpc_factory.add_object_handler(MemUsageRPC())
ws_rpc_factory.add_object_handler(MiscRPC())
ws_rpc_factory.add_object_handler(OSRPC())
ws_rpc_factory.add_object_handler(CoreRPC())
ws_rpc_factory.add_object_handler(ProjectRPC())
ws_rpc_factory.add_object_handler(PlatformRPC())
ws_rpc_factory.add_object_handler(RegistryRPC())
ws_rpc_factory.add_object_handler(AccountRPC(), namespace="account")
ws_rpc_factory.add_object_handler(AppRPC(), namespace="app")
ws_rpc_factory.add_object_handler(IDERPC(), namespace="ide")
ws_rpc_factory.add_object_handler(MiscRPC(), namespace="misc")
ws_rpc_factory.add_object_handler(OSRPC(), namespace="os")
ws_rpc_factory.add_object_handler(PIOCoreRPC(), namespace="core")
ws_rpc_factory.add_object_handler(ProjectRPC(), namespace="project")
ws_rpc_factory.add_object_handler(PlatformRPC(), namespace="platform")
ws_rpc_factory.add_object_handler(RegistryRPC(), namespace="registry")
path = urlparse(home_url).path
routes = [

View File

@@ -12,25 +12,22 @@
# See the License for the specific language governing permissions and
# limitations under the License.
import contextlib
import itertools
import json
import socket
import time
from urllib.parse import urljoin
import httpx
import requests.adapters
from urllib3.util.retry import Retry
from platformio import __check_internet_hosts__, app, util
from platformio.cache import ContentCache, cleanup_content_cache
from platformio.compat import is_proxy_set
from platformio.exception import PlatformioException, UserSideException
RETRIES_BACKOFF_FACTOR = 2 # 0s, 2s, 4s, 8s, etc.
RETRIES_METHOD_WHITELIST = ["GET"]
RETRIES_STATUS_FORCELIST = [429, 500, 502, 503, 504]
__default_requests_timeout__ = (10, None) # (connect, read)
class HttpClientApiError(UserSideException):
class HTTPClientError(UserSideException):
def __init__(self, message, response=None):
super().__init__()
self.message = message
@@ -43,138 +40,86 @@ class HttpClientApiError(UserSideException):
class InternetConnectionError(UserSideException):
MESSAGE = (
"You are not connected to the Internet.\n"
"PlatformIO needs the Internet connection to "
"download dependent packages or to work with PlatformIO Account."
"PlatformIO needs the Internet connection to"
" download dependent packages or to work with PlatformIO Account."
)
def exponential_backoff(factor):
yield 0
for n in itertools.count(2):
yield factor * (2 ** (n - 2))
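With `RETRIES_BACKOFF_FACTOR = 2` this generator yields exactly the delays noted in the constant's comment (0s, 2s, 4s, 8s, ...): no wait before the first attempt, then doubling. A quick self-contained check:

```python
import itertools

def exponential_backoff(factor):
    # Same shape as the generator above: no delay for the first attempt,
    # then factor * 2**k for each successive retry.
    yield 0
    for n in itertools.count(2):
        yield factor * (2 ** (n - 2))

delays = list(itertools.islice(exponential_backoff(2), 5))
print(delays)  # → [0, 2, 4, 8, 16]
```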
def apply_default_kwargs(kwargs=None):
kwargs = kwargs or {}
# enable redirects by default
kwargs["follow_redirects"] = kwargs.get("follow_redirects", True)
try:
kwargs["verify"] = kwargs.get(
"verify", app.get_setting("enable_proxy_strict_ssl")
)
except PlatformioException:
kwargs["verify"] = True
headers = kwargs.pop("headers", {})
if "User-Agent" not in headers:
headers.update({"User-Agent": app.get_user_agent()})
kwargs["headers"] = headers
retry = kwargs.pop("retry", None)
if retry:
kwargs["transport"] = HTTPRetryTransport(verify=kwargs["verify"], **retry)
return kwargs
class HTTPRetryTransport(httpx.HTTPTransport):
def __init__( # pylint: disable=too-many-arguments
self,
verify=True,
retries=1,
backoff_factor=None,
status_forcelist=None,
method_whitelist=None,
):
super().__init__(verify=verify)
self._retries = retries
self._backoff_factor = backoff_factor or RETRIES_BACKOFF_FACTOR
self._status_forcelist = status_forcelist or RETRIES_STATUS_FORCELIST
self._method_whitelist = method_whitelist or RETRIES_METHOD_WHITELIST
def handle_request(self, request):
retries_left = self._retries
delays = exponential_backoff(factor=RETRIES_BACKOFF_FACTOR)
while retries_left > 0:
retries_left -= 1
try:
response = super().handle_request(request)
if response.status_code in RETRIES_STATUS_FORCELIST:
if request.method.upper() not in self._method_whitelist:
return response
raise httpx.HTTPStatusError(
f"Server error '{response.status_code} {response.reason_phrase}' "
f"for url '{request.url}'\n",
request=request,
response=response,
)
return response
except httpx.HTTPError:
if retries_left == 0:
raise
time.sleep(next(delays) or 1)
raise httpx.RequestError(
f"Could not process '{request.url}' request", request=request
)
class HTTPSession(httpx.Client):
class HTTPSession(requests.Session):
def __init__(self, *args, **kwargs):
super().__init__(*args, **apply_default_kwargs(kwargs))
self._x_base_url = kwargs.pop("x_base_url") if "x_base_url" in kwargs else None
super().__init__(*args, **kwargs)
self.headers.update({"User-Agent": app.get_user_agent()})
try:
self.verify = app.get_setting("enable_proxy_strict_ssl")
except PlatformioException:
self.verify = True
def request( # pylint: disable=signature-differs,arguments-differ
self, method, url, *args, **kwargs
):
# print("HTTPSession::request", self._x_base_url, method, url, args, kwargs)
if "timeout" not in kwargs:
kwargs["timeout"] = __default_requests_timeout__
return super().request(
method,
(
url
if url.startswith("http") or not self._x_base_url
else urljoin(self._x_base_url, url)
),
*args,
**kwargs
)
class HttpEndpointPool:
def __init__(self, endpoints, session_retry=None):
class HTTPSessionIterator:
def __init__(self, endpoints):
if not isinstance(endpoints, list):
endpoints = [endpoints]
self.endpoints = endpoints
self.session_retry = session_retry
self._endpoints_iter = iter(endpoints)
self._session = None
self.next()
def close(self):
if self._session:
self._session.close()
def next(self):
if self._session:
self._session.close()
self._session = HTTPSession(
base_url=next(self._endpoints_iter), retry=self.session_retry
self.endpoints_iter = iter(endpoints)
# https://urllib3.readthedocs.io/en/stable/reference/urllib3.util.html
self.retry = Retry(
total=5,
backoff_factor=1, # [0, 2, 4, 8, 16] secs
# method_whitelist=list(Retry.DEFAULT_METHOD_WHITELIST) + ["POST"],
status_forcelist=[413, 429, 500, 502, 503, 504],
)
def request(self, method, *args, **kwargs):
while True:
try:
return self._session.request(method, *args, **kwargs)
except httpx.HTTPError as exc:
try:
self.next()
except StopIteration as exc2:
raise exc from exc2
def __iter__(self): # pylint: disable=non-iterator-returned
return self
def __next__(self):
base_url = next(self.endpoints_iter)
session = HTTPSession(x_base_url=base_url)
adapter = requests.adapters.HTTPAdapter(max_retries=self.retry)
session.mount(base_url, adapter)
return session
class HttpApiClient(contextlib.AbstractContextManager):
class HTTPClient:
def __init__(self, endpoints):
self._endpoint = HttpEndpointPool(endpoints, session_retry=dict(retries=5))
def __exit__(self, *excinfo):
self.close()
self._session_iter = HTTPSessionIterator(endpoints)
self._session = None
self._next_session()
def __del__(self):
self.close()
if not self._session:
return
try:
self._session.close()
except: # pylint: disable=bare-except
pass
self._session = None
def close(self):
if getattr(self, "_endpoint"):
self._endpoint.close()
def _next_session(self):
if self._session:
self._session.close()
self._session = next(self._session_iter)
@util.throttle(500)
def send_request(self, method, *args, **kwargs):
def send_request(self, method, path, **kwargs):
# check the Internet connection first to avoid a 60-second request timeout
ensure_internet_on(raise_exception=True)
@@ -188,28 +133,23 @@ class HttpApiClient(contextlib.AbstractContextManager):
# pylint: disable=import-outside-toplevel
from platformio.account.client import AccountClient
with AccountClient() as client:
headers["Authorization"] = (
"Bearer %s" % client.fetch_authentication_token()
)
headers["Authorization"] = (
"Bearer %s" % AccountClient().fetch_authentication_token()
)
kwargs["headers"] = headers
try:
return self._endpoint.request(method, *args, **kwargs)
except httpx.HTTPError as exc:
raise HttpClientApiError(str(exc)) from exc
while True:
try:
return getattr(self._session, method)(path, **kwargs)
except requests.exceptions.RequestException as exc:
try:
self._next_session()
except Exception as exc2:
raise HTTPClientError(str(exc2)) from exc
def fetch_json_data(self, method, path, **kwargs):
if method not in ("get", "head", "options"):
cleanup_content_cache("http")
# remove empty params
if kwargs.get("params"):
kwargs["params"] = {
key: value
for key, value in kwargs.get("params").items()
if value is not None
}
cache_valid = kwargs.pop("x_cache_valid") if "x_cache_valid" in kwargs else None
if not cache_valid:
return self._parse_json_response(self.send_request(method, path, **kwargs))
@@ -239,7 +179,7 @@ class HttpApiClient(contextlib.AbstractContextManager):
message = response.json()["message"]
except (KeyError, ValueError):
message = response.text
raise HttpClientApiError(message, response)
raise HTTPClientError(message, response)
#
@@ -250,11 +190,12 @@ class HttpApiClient(contextlib.AbstractContextManager):
@util.memoized(expire="10s")
def _internet_on():
timeout = 2
use_proxy = is_proxy_set()
socket.setdefaulttimeout(timeout)
for host in __check_internet_hosts__:
try:
if is_proxy_set():
httpx.get("http://%s" % host, follow_redirects=False, timeout=timeout)
if use_proxy:
requests.get("http://%s" % host, allow_redirects=False, timeout=timeout)
return True
# try to resolve `host` for both AF_INET and AF_INET6, and then try to connect
# to all possible addresses (IPv4 and IPv6) in turn until a connection succeeds:
@@ -263,6 +204,15 @@ def _internet_on():
return True
except: # pylint: disable=bare-except
pass
# falling back to HTTPs, issue #4980
for host in __check_internet_hosts__:
try:
requests.get("https://%s" % host, allow_redirects=False, timeout=timeout)
except requests.exceptions.RequestException:
pass
return True
return False
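The `_internet_on` check above tries a cheap TCP connect to each probe host (over both IPv4 and IPv6) before falling back to HTTPS. A self-contained sketch of the raw-socket part, with a hypothetical host list in place of `__check_internet_hosts__`:

```python
import socket


def internet_on(hosts=("192.0.2.1",), port=80, timeout=2):
    # Resolve each host for every available address family and try to
    # connect to each resulting address in turn; the first successful
    # TCP handshake means we are online.
    for host in hosts:
        try:
            for family, _, _, _, sockaddr in socket.getaddrinfo(
                host, port, proto=socket.IPPROTO_TCP
            ):
                with socket.socket(family, socket.SOCK_STREAM) as sock:
                    sock.settimeout(timeout)
                    sock.connect(sockaddr)
                    return True
        except OSError:
            # Resolution or connection failure: try the next host
            continue
    return False
```

A raw connect avoids a full HTTP round trip, which matters because this probe runs before every API request; the HTTPS fallback in the diff (issue #4980) covers networks that block plain port-80 traffic.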
@@ -273,8 +223,9 @@ def ensure_internet_on(raise_exception=False):
return result
def fetch_http_content(*args, **kwargs):
with HTTPSession() as session:
response = session.get(*args, **kwargs)
response.raise_for_status()
return response.text
def fetch_remote_content(*args, **kwargs):
with HTTPSession() as s:
r = s.get(*args, **kwargs)
r.raise_for_status()
r.close()
return r.text


@@ -23,11 +23,7 @@ from platformio import __version__, app, exception, fs, telemetry
from platformio.cache import cleanup_content_cache
from platformio.cli import PlatformioCLI
from platformio.commands.upgrade import get_latest_version
from platformio.http import (
HttpClientApiError,
InternetConnectionError,
ensure_internet_on,
)
from platformio.http import HTTPClientError, InternetConnectionError, ensure_internet_on
from platformio.package.manager.core import update_core_packages
from platformio.package.version import pepver_to_semver
from platformio.system.prune import calculate_unnecessary_system_data
@@ -50,7 +46,7 @@ def on_cmd_end():
check_platformio_upgrade()
check_prune_system()
except (
HttpClientApiError,
HTTPClientError,
InternetConnectionError,
exception.GetLatestVersionError,
):


@@ -54,7 +54,7 @@ def package_exec_cmd(obj, package, call, args):
os.environ["PIO_PYTHON_EXE"] = get_pythonexe_path()
# inject current python interpreter on Windows
if args[0].endswith(".py"):
if args and args[0].endswith(".py"):
args = [os.environ["PIO_PYTHON_EXE"]] + list(args)
if not os.path.exists(args[1]):
args[1] = where_is_program(args[1])


@@ -297,7 +297,11 @@ def _install_project_private_library_deps(private_pkg, private_lm, env_lm, optio
if not spec.external and not spec.owner:
continue
pkg = private_lm.get_package(spec)
if not pkg and not env_lm.get_package(spec):
if (
not pkg
and not private_lm.get_package(spec)
and not env_lm.get_package(spec)
):
pkg = env_lm.install(
spec,
skip_dependencies=True,


@@ -12,8 +12,8 @@
# See the License for the specific language governing permissions and
# limitations under the License.
import json
import os
from typing import List
import click
@@ -21,13 +21,13 @@ from platformio import fs
from platformio.package.manager.library import LibraryPackageManager
from platformio.package.manager.platform import PlatformPackageManager
from platformio.package.manager.tool import ToolPackageManager
from platformio.package.meta import PackageInfo, PackageItem, PackageSpec
from platformio.package.meta import PackageItem, PackageSpec
from platformio.platform.exception import UnknownPlatform
from platformio.platform.factory import PlatformFactory
from platformio.project.config import ProjectConfig
@click.command("list", short_help="List project packages")
@click.command("list", short_help="List installed packages")
@click.option(
"-d",
"--project-dir",
@@ -48,116 +48,79 @@ from platformio.project.config import ProjectConfig
@click.option("--only-platforms", is_flag=True, help="List only platform packages")
@click.option("--only-tools", is_flag=True, help="List only tool packages")
@click.option("--only-libraries", is_flag=True, help="List only library packages")
@click.option("--json-output", is_flag=True)
@click.option("-v", "--verbose", is_flag=True)
def package_list_cmd(**options):
data = (
if options.get("global"):
list_global_packages(options)
if options.get("global")
else list_project_packages(options)
)
if options.get("json_output"):
return click.echo(_dump_to_json(data, options))
def _print_items(typex, items):
click.secho(typex.capitalize(), bold=True)
print_dependency_tree(items, verbose=options.get("verbose"))
click.echo()
if options.get("global"):
for typex, items in data.items():
_print_items(typex, items)
else:
for env, env_data in data.items():
click.echo("Resolving %s dependencies..." % click.style(env, fg="cyan"))
for typex, items in env_data.items():
_print_items(typex, items)
return None
list_project_packages(options)
def _dump_to_json(data, options):
result = {}
if options.get("global"):
for typex, items in data.items():
result[typex] = [info.as_dict(with_manifest=True) for info in items]
else:
for env, env_data in data.items():
result[env] = {}
for typex, items in env_data.items():
result[env][typex] = [
info.as_dict(with_manifest=True) for info in items
]
return json.dumps(result)
def build_package_info(pm, specs=None, filter_specs=None, resolve_dependencies=True):
filtered_pkgs = [
pm.get_package(spec) for spec in filter_specs if pm.get_package(spec)
def humanize_package(pkg, spec=None, verbose=False):
if spec and not isinstance(spec, PackageSpec):
spec = PackageSpec(spec)
data = [
click.style(pkg.metadata.name, fg="cyan"),
click.style(f"@ {str(pkg.metadata.version)}", bold=True),
]
candidates = []
extra_data = ["required: %s" % (spec.humanize() if spec else "Any")]
if verbose:
extra_data.append(pkg.path)
data.append("(%s)" % ", ".join(extra_data))
return " ".join(data)
def print_dependency_tree(pm, specs=None, filter_specs=None, level=0, verbose=False):
filtered_pkgs = [
pm.get_package(spec) for spec in filter_specs or [] if pm.get_package(spec)
]
candidates = {}
if specs:
for spec in specs:
candidates.append(
PackageInfo(
spec if isinstance(spec, PackageSpec) else PackageSpec(spec),
pm.get_package(spec),
)
)
else:
candidates = [PackageInfo(pkg.metadata.spec, pkg) for pkg in pm.get_installed()]
if not candidates:
return []
candidates = sorted(
candidates,
key=lambda info: info.item.metadata.name if info.item else info.spec.humanize(),
)
result = []
for info in candidates:
if filter_specs and (
not info.item or not _pkg_tree_contains(pm, info.item, filtered_pkgs)
):
continue
if not info.item:
if not info.spec.external and not info.spec.owner: # built-in library?
pkg = pm.get_package(spec)
if not pkg:
continue
result.append(info)
continue
candidates[pkg.path] = (pkg, spec)
else:
candidates = {pkg.path: (pkg, pkg.metadata.spec) for pkg in pm.get_installed()}
if not candidates:
return
candidates = sorted(candidates.values(), key=lambda item: item[0].metadata.name)
visited_pkgs = pm.memcache_get("__visited_pkgs", [])
if visited_pkgs and info.item.path in visited_pkgs:
for index, (pkg, spec) in enumerate(candidates):
if filtered_pkgs and not _pkg_tree_contains(pm, pkg, filtered_pkgs):
continue
visited_pkgs.append(info.item.path)
pm.memcache_set("__visited_pkgs", visited_pkgs)
printed_pkgs = pm.memcache_get("__printed_pkgs", [])
if printed_pkgs and pkg.path in printed_pkgs:
continue
printed_pkgs.append(pkg.path)
pm.memcache_set("__printed_pkgs", printed_pkgs)
result.append(
PackageInfo(
info.spec,
info.item,
(
build_package_info(
pm,
specs=[
pm.dependency_to_spec(item)
for item in pm.get_pkg_dependencies(info.item)
],
filter_specs=filter_specs,
resolve_dependencies=True,
)
if resolve_dependencies and pm.get_pkg_dependencies(info.item)
else []
click.echo(
"%s%s %s"
% (
"   " * level,
"├──" if index < len(candidates) - 1 else "└──",
humanize_package(
pkg,
spec=spec,
verbose=verbose,
),
)
)
return result
dependencies = pm.get_pkg_dependencies(pkg)
if dependencies:
print_dependency_tree(
pm,
specs=[pm.dependency_to_spec(item) for item in dependencies],
filter_specs=filter_specs,
level=level + 1,
verbose=verbose,
)
def _pkg_tree_contains(pm, root: PackageItem, children: list[PackageItem]):
def _pkg_tree_contains(pm, root: PackageItem, children: List[PackageItem]):
if root in children:
return True
for dependency in pm.get_pkg_dependencies(root) or []:
@@ -176,7 +139,6 @@ def list_global_packages(options):
only_packages = any(
options.get(typex) or options.get(f"only_{typex}") for (typex, _) in data
)
result = {}
for typex, pm in data:
skip_conds = [
only_packages
@@ -186,115 +148,82 @@ def list_global_packages(options):
]
if any(skip_conds):
continue
result[typex] = build_package_info(pm, filter_specs=options.get(typex))
return result
click.secho(typex.capitalize(), bold=True)
print_dependency_tree(
pm, filter_specs=options.get(typex), verbose=options.get("verbose")
)
click.echo()
def list_project_packages(options):
environments = options["environments"]
only_filtered_packages = any(
only_packages = any(
options.get(typex) or options.get(f"only_{typex}")
for typex in ("platforms", "tools", "libraries")
)
only_platform_package = options.get("platforms") or options.get("only_platforms")
only_tool_packages = options.get("tools") or options.get("only_tools")
only_platform_packages = any(
options.get(typex) or options.get(f"only_{typex}")
for typex in ("platforms", "tools")
)
only_library_packages = options.get("libraries") or options.get("only_libraries")
result = {}
with fs.cd(options["project_dir"]):
config = ProjectConfig.get_instance()
config.validate(environments)
for env in config.envs():
if environments and env not in environments:
continue
result[env] = {}
if not only_filtered_packages or only_platform_package:
result[env]["platforms"] = list_project_env_platform_package(
env, options
)
if not only_filtered_packages or only_tool_packages:
result[env]["tools"] = list_project_env_tool_packages(env, options)
if not only_filtered_packages or only_library_packages:
result[env]["libraries"] = list_project_env_library_packages(
env, options
)
return result
click.echo("Resolving %s dependencies..." % click.style(env, fg="cyan"))
found = False
if not only_packages or only_platform_packages:
_found = print_project_env_platform_packages(env, options)
found = found or _found
if not only_packages or only_library_packages:
_found = print_project_env_library_packages(env, options)
found = found or _found
if not found:
click.echo("No packages")
if (not environments and len(config.envs()) > 1) or len(environments) > 1:
click.echo()
def list_project_env_platform_package(project_env, options):
pm = PlatformPackageManager()
return build_package_info(
pm,
specs=[PackageSpec(pm.config.get(f"env:{project_env}", "platform"))],
filter_specs=options.get("platforms"),
resolve_dependencies=False,
)
def list_project_env_tool_packages(project_env, options):
def print_project_env_platform_packages(project_env, options):
try:
p = PlatformFactory.from_env(project_env, targets=["upload"])
p = PlatformFactory.from_env(project_env)
except UnknownPlatform:
return []
return build_package_info(
return None
click.echo(
"Platform %s"
% (
humanize_package(
PlatformPackageManager().get_package(p.get_dir()),
p.config.get(f"env:{project_env}", "platform"),
verbose=options.get("verbose"),
)
)
)
print_dependency_tree(
p.pm,
specs=[
p.get_package_spec(name)
for name, options in p.packages.items()
if not options.get("optional")
],
specs=[p.get_package_spec(name) for name in p.packages],
filter_specs=options.get("tools"),
)
click.echo()
return True
def list_project_env_library_packages(project_env, options):
def print_project_env_library_packages(project_env, options):
config = ProjectConfig.get_instance()
lib_deps = config.get(f"env:{project_env}", "lib_deps")
lm = LibraryPackageManager(
os.path.join(config.get("platformio", "libdeps_dir"), project_env)
)
return build_package_info(
if not lib_deps or not lm.get_installed():
return None
click.echo("Libraries")
print_dependency_tree(
lm,
lib_deps,
filter_specs=options.get("libraries"),
verbose=options.get("verbose"),
)
def humanize_package(info, verbose=False):
data = (
[
click.style(info.item.metadata.name, fg="cyan"),
click.style(f"@ {str(info.item.metadata.version)}", bold=True),
]
if info.item
else ["Not installed"]
)
extra_data = ["required: %s" % (info.spec.humanize() if info.spec else "Any")]
if verbose and info.item:
extra_data.append(info.item.path)
data.append("(%s)" % ", ".join(extra_data))
return " ".join(data)
def print_dependency_tree(items, verbose=False, level=0):
for index, info in enumerate(items):
click.echo(
"%s%s %s"
% (
"   " * level,
"├──" if index < len(items) - 1 else "└──",
humanize_package(
info,
verbose=verbose,
),
)
)
if info.dependencies:
print_dependency_tree(
info.dependencies,
verbose=verbose,
level=level + 1,
)
return True
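The rewritten `print_dependency_tree` recurses with an increasing `level` and picks `├──` or `└──` depending on whether an entry is the last sibling. A minimal sketch of that rendering logic, using a plain dict of name → children instead of package-manager objects:

```python
def render_tree(deps, level=0):
    # deps: {"name": {child_name: {...}, ...}}; returns the printed lines
    items = sorted(deps.items())
    lines = []
    for index, (name, children) in enumerate(items):
        # Last sibling gets the closing corner, others get a tee
        branch = "├──" if index < len(items) - 1 else "└──"
        lines.append("%s%s %s" % ("   " * level, branch, name))
        # Recurse one level deeper for this package's dependencies
        lines.extend(render_tree(children, level + 1))
    return lines


for line in render_tree({"Foo": {"Bar": {}}, "Baz": {}}):
    print(line)
```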


@@ -82,14 +82,13 @@ def validate_datetime(ctx, param, value): # pylint: disable=unused-argument
help="Do not show interactive prompt",
hidden=True,
)
def package_publish_cmd( # pylint: disable=too-many-arguments, too-many-locals
def package_publish_cmd( # pylint: disable=too-many-arguments,too-many-positional-arguments,too-many-locals
package, owner, typex, released_at, private, notify, no_interactive, non_interactive
):
click.secho("Preparing a package...", fg="cyan")
package = os.path.abspath(package)
no_interactive = no_interactive or non_interactive
with AccountClient() as client:
owner = owner or client.get_logged_username()
owner = owner or AccountClient().get_logged_username()
do_not_pack = (
not os.path.isdir(package)
and isinstance(FileUnpacker.new_archiver(package), TARArchiver)
@@ -147,10 +146,9 @@ def package_publish_cmd( # pylint: disable=too-many-arguments, too-many-locals
fg="yellow",
)
click.echo("Publishing...")
with RegistryClient() as client:
response = client.publish_package(
owner, typex, archive_path, released_at, private, notify
)
response = RegistryClient().publish_package(
owner, typex, archive_path, released_at, private, notify
)
if not do_not_pack:
os.remove(archive_path)
click.secho(response.get("message"), fg="green")


@@ -29,8 +29,8 @@ from platformio.registry.client import RegistryClient
type=click.Choice(["relevance", "popularity", "trending", "added", "updated"]),
)
def package_search_cmd(query, page, sort):
with RegistryClient() as client:
result = client.list_packages(query, page=page, sort=sort)
client = RegistryClient()
result = client.list_packages(query, page=page, sort=sort)
if not result["total"]:
click.secho("Nothing has been found by your request", fg="yellow")
click.echo(


@@ -124,31 +124,31 @@ def package_show_cmd(spec, pkg_type):
def fetch_package_data(spec, pkg_type=None):
assert isinstance(spec, PackageSpec)
with RegistryClient() as client:
if pkg_type and spec.owner and spec.name:
return client.get_package(
pkg_type, spec.owner, spec.name, version=spec.requirements
)
qualifiers = {}
if spec.id:
qualifiers["ids"] = str(spec.id)
if spec.name:
qualifiers["names"] = spec.name.lower()
if pkg_type:
qualifiers["types"] = pkg_type
if spec.owner:
qualifiers["owners"] = spec.owner.lower()
packages = client.list_packages(qualifiers=qualifiers)["items"]
if not packages:
return None
if len(packages) > 1:
PackageManagerRegistryMixin.print_multi_package_issue(
click.echo, packages, spec
)
return None
client = RegistryClient()
if pkg_type and spec.owner and spec.name:
return client.get_package(
packages[0]["type"],
packages[0]["owner"]["username"],
packages[0]["name"],
version=spec.requirements,
pkg_type, spec.owner, spec.name, version=spec.requirements
)
qualifiers = {}
if spec.id:
qualifiers["ids"] = str(spec.id)
if spec.name:
qualifiers["names"] = spec.name.lower()
if pkg_type:
qualifiers["types"] = pkg_type
if spec.owner:
qualifiers["owners"] = spec.owner.lower()
packages = client.list_packages(qualifiers=qualifiers)["items"]
if not packages:
return None
if len(packages) > 1:
PackageManagerRegistryMixin.print_multi_package_issue(
click.echo, packages, spec
)
return None
return client.get_package(
packages[0]["type"],
packages[0]["owner"]["username"],
packages[0]["name"],
version=spec.requirements,
)


@@ -111,7 +111,7 @@ def uninstall_project_env_dependencies(project_env, options=None):
uninstalled_conds.append(
_uninstall_project_env_custom_tools(project_env, options)
)
# custom ibraries
# custom libraries
if options.get("libraries"):
uninstalled_conds.append(
_uninstall_project_env_custom_libraries(project_env, options)


@@ -36,14 +36,11 @@ from platformio.registry.client import RegistryClient
)
def package_unpublish_cmd(package, type, undo): # pylint: disable=redefined-builtin
spec = PackageSpec(package)
with AccountClient() as client:
owner = spec.owner or client.get_logged_username()
with RegistryClient() as client:
response = client.unpublish_package(
owner=owner,
type=type,
name=spec.name,
version=str(spec.requirements),
undo=undo,
)
click.secho(response.get("message"), fg="green")
response = RegistryClient().unpublish_package(
owner=spec.owner or AccountClient().get_logged_username(),
type=type,
name=spec.name,
version=str(spec.requirements),
undo=undo,
)
click.secho(response.get("message"), fg="green")


@@ -110,7 +110,7 @@ def update_project_env_dependencies(project_env, options=None):
# custom tools
if options.get("tools"):
updated_conds.append(_update_project_env_custom_tools(project_env, options))
# custom ibraries
# custom libraries
if options.get("libraries"):
updated_conds.append(_update_project_env_custom_libraries(project_env, options))
# declared dependencies


@@ -12,28 +12,48 @@
# See the License for the specific language governing permissions and
# limitations under the License.
import os
import tempfile
import time
import io
from email.utils import parsedate
from urllib.parse import urlparse
from os.path import getsize, join
from time import mktime
import click
import httpx
from platformio import fs
from platformio.compat import is_terminal
from platformio.http import apply_default_kwargs
from platformio.http import HTTPSession
from platformio.package.exception import PackageException
class FileDownloader:
def __init__(self, url, dst_dir=None):
self.url = url
self.dst_dir = dst_dir
self._destination = None
def __init__(self, url, dest_dir=None):
self._http_session = HTTPSession()
self._http_response = None
# make connection
self._http_response = self._http_session.get(
url,
stream=True,
)
if self._http_response.status_code not in (200, 203):
raise PackageException(
"Got the unrecognized status code '{0}' when downloaded {1}".format(
self._http_response.status_code, url
)
)
disposition = self._http_response.headers.get("content-disposition")
if disposition and "filename=" in disposition:
self._fname = (
disposition[disposition.index("filename=") + 9 :]
.replace('"', "")
.replace("'", "")
)
else:
self._fname = [p for p in url.split("/") if p][-1]
self._fname = str(self._fname)
self._destination = self._fname
if dest_dir:
self.set_destination(join(dest_dir, self._fname))
def set_destination(self, destination):
self._destination = destination
@@ -49,34 +69,18 @@ class FileDownloader:
return -1
return int(self._http_response.headers["content-length"])
def get_disposition_filname(self):
disposition = self._http_response.headers.get("content-disposition")
if disposition and "filename=" in disposition:
return (
disposition[disposition.index("filename=") + 9 :]
.replace('"', "")
.replace("'", "")
)
return [p for p in urlparse(self.url).path.split("/") if p][-1]
def start(self, with_progress=True, silent=False):
label = "Downloading"
with httpx.stream("GET", self.url, **apply_default_kwargs()) as response:
if response.status_code != 200:
raise PackageException(
f"Got the unrecognized status code '{response.status_code}' "
"when downloading '{self.url}'"
)
self._http_response = response
total_size = self.get_size()
if not self._destination:
assert self.dst_dir
file_size = self.get_size()
itercontent = self._http_response.iter_content(
chunk_size=io.DEFAULT_BUFFER_SIZE
)
try:
with open(self._destination, "wb") as fp:
if total_size == -1 or not with_progress or silent:
if file_size == -1 or not with_progress or silent:
if not silent:
click.echo(f"{label}...")
for chunk in response.iter_bytes():
for chunk in itercontent:
fp.write(chunk)
elif not is_terminal():
@@ -84,10 +88,10 @@ class FileDownloader:
print_percent_step = 10
printed_percents = 0
downloaded_size = 0
for chunk in response.iter_bytes():
for chunk in itercontent:
fp.write(chunk)
downloaded_size += len(chunk)
if (downloaded_size / total_size * 100) >= (
if (downloaded_size / file_size * 100) >= (
printed_percents + print_percent_step
):
printed_percents += print_percent_step
@@ -96,39 +100,33 @@ class FileDownloader:
else:
with click.progressbar(
length=total_size,
iterable=response.iter_bytes(),
length=file_size,
iterable=itercontent,
label=label,
update_min_steps=min(
256 * 1024, total_size / 100
256 * 1024, file_size / 100
), # every 256Kb or less
) as pb:
for chunk in pb:
pb.update(len(chunk))
fp.write(chunk)
finally:
self._http_response.close()
self._http_session.close()
last_modified = self.get_lmtime()
if last_modified:
self._preserve_filemtime(last_modified)
if self.get_lmtime():
self._preserve_filemtime(self.get_lmtime())
return True
def _set_tmp_destination(self):
dst_dir = self.dst_dir or tempfile.mkdtemp()
self.set_destination(os.path.join(dst_dir, self.get_disposition_filname()))
def _preserve_filemtime(self, lmdate):
lmtime = time.mktime(parsedate(lmdate))
fs.change_filemtime(self._destination, lmtime)
def verify(self, checksum=None):
remote_size = self.get_size()
downloaded_size = os.path.getsize(self._destination)
if remote_size not in (-1, downloaded_size):
_dlsize = getsize(self._destination)
if self.get_size() != -1 and _dlsize != self.get_size():
raise PackageException(
f"The size ({downloaded_size} bytes) of downloaded file "
f"'{self._destination}' is not equal to remote size "
f"({remote_size} bytes)"
(
"The size ({0:d} bytes) of downloaded file '{1}' "
"is not equal to remote size ({2:d} bytes)"
).format(_dlsize, self._fname, self.get_size())
)
if not checksum:
return True
@@ -144,7 +142,7 @@ class FileDownloader:
if not hash_algo:
raise PackageException(
f"Could not determine checksum algorithm by {checksum}"
"Could not determine checksum algorithm by %s" % checksum
)
dl_checksum = fs.calculate_file_hashsum(hash_algo, self._destination)
@@ -152,7 +150,16 @@ class FileDownloader:
raise PackageException(
"The checksum '{0}' of the downloaded file '{1}' "
"does not match to the remote '{2}'".format(
dl_checksum, self._destination, checksum
dl_checksum, self._fname, checksum
)
)
return True
def _preserve_filemtime(self, lmdate):
lmtime = mktime(parsedate(lmdate))
fs.change_filemtime(self._destination, lmtime)
def __del__(self):
self._http_session.close()
if self._http_response:
self._http_response.close()


@@ -15,7 +15,6 @@
import time
import click
import httpx
from platformio.package.exception import UnknownPackageError
from platformio.package.meta import PackageSpec
@@ -58,7 +57,7 @@ class PackageManagerRegistryMixin:
),
checksum or pkgfile["checksum"]["sha256"],
)
except httpx.HTTPError as exc:
except Exception as exc: # pylint: disable=broad-except
self.log.warning(
click.style("Warning! Package Mirror: %s" % exc, fg="yellow")
)


@@ -15,7 +15,7 @@
import os
from platformio import util
from platformio.http import HttpClientApiError, InternetConnectionError
from platformio.http import HTTPClientError, InternetConnectionError
from platformio.package.exception import UnknownPackageError
from platformio.package.manager.base import BasePackageManager
from platformio.package.manager.core import get_installed_core_packages
@@ -38,7 +38,7 @@ class PlatformPackageManager(BasePackageManager): # pylint: disable=too-many-an
def manifest_names(self):
return PackageType.get_manifest_map()[PackageType.PLATFORM]
def install( # pylint: disable=arguments-differ,too-many-arguments
def install( # pylint: disable=arguments-differ,too-many-arguments,too-many-positional-arguments
self,
spec,
skip_dependencies=False,
@@ -128,7 +128,7 @@ class PlatformPackageManager(BasePackageManager): # pylint: disable=too-many-an
key = "%s:%s" % (board["platform"], board["id"])
if key not in know_boards:
boards.append(board)
except (HttpClientApiError, InternetConnectionError):
except (HTTPClientError, InternetConnectionError):
pass
return sorted(boards, key=lambda b: b["name"])


@@ -22,7 +22,7 @@ from urllib.parse import urlparse
from platformio import util
from platformio.compat import get_object_members, string_types
from platformio.http import fetch_http_content
from platformio.http import fetch_remote_content
from platformio.package.exception import ManifestParserError, UnknownManifestError
from platformio.project.helpers import is_platformio_project
@@ -103,7 +103,7 @@ class ManifestParserFactory:
@staticmethod
def new_from_url(remote_url):
content = fetch_http_content(remote_url)
content = fetch_remote_content(remote_url)
return ManifestParserFactory.new(
content,
ManifestFileType.from_uri(remote_url) or ManifestFileType.LIBRARY_JSON,


@@ -17,12 +17,12 @@
import json
import re
import httpx
import marshmallow
import requests
import semantic_version
from marshmallow import Schema, ValidationError, fields, validate, validates
from platformio.http import fetch_http_content
from platformio.http import fetch_remote_content
from platformio.package.exception import ManifestValidationError
from platformio.util import memoized
@@ -252,7 +252,7 @@ class ManifestSchema(BaseSchema):
def validate_license(self, value):
try:
spdx = self.load_spdx_licenses()
except httpx.HTTPError as exc:
except requests.exceptions.RequestException as exc:
raise ValidationError(
"Could not load SPDX licenses for validation"
) from exc
@@ -276,9 +276,9 @@ class ManifestSchema(BaseSchema):
@staticmethod
@memoized(expire="1h")
def load_spdx_licenses():
version = "3.23"
version = "3.26.0"
spdx_data_url = (
"https://raw.githubusercontent.com/spdx/license-list-data/"
f"v{version}/json/licenses.json"
)
return json.loads(fetch_http_content(spdx_data_url))
return json.loads(fetch_remote_content(spdx_data_url))


@@ -23,7 +23,7 @@ import semantic_version
from platformio import fs
from platformio.compat import get_object_members, hashlib_encode_data, string_types
from platformio.package.manifest.parser import ManifestFileType, ManifestParserFactory
from platformio.package.manifest.parser import ManifestFileType
from platformio.package.version import SemanticVersionError, cast_version_to_semver
from platformio.util import items_in_list
@@ -196,7 +196,7 @@ class PackageOutdatedResult:
class PackageSpec: # pylint: disable=too-many-instance-attributes
def __init__( # pylint: disable=redefined-builtin,too-many-arguments
def __init__( # pylint: disable=redefined-builtin,too-many-arguments,too-many-positional-arguments
self, raw=None, owner=None, id=None, name=None, requirements=None, uri=None
):
self._requirements = None
@@ -396,7 +396,7 @@ class PackageSpec: # pylint: disable=too-many-instance-attributes
parts.path.endswith(".git"),
# Handle GitHub URL (https://github.com/user/package)
parts.netloc in ("github.com", "gitlab.com", "bitbucket.com")
and not parts.path.endswith((".zip", ".tar.gz")),
and not parts.path.endswith((".zip", ".tar.gz", ".tar.xz")),
]
hg_conditions = [
# Handle Developer Mbed URL
@@ -561,29 +561,3 @@ class PackageItem:
break
assert location
return self.metadata.dump(os.path.join(location, self.METAFILE_NAME))
def as_dict(self):
return {"path": self.path, "metadata": self.metadata.as_dict()}
class PackageInfo:
def __init__(self, spec: PackageSpec, item: PackageItem = None, dependencies=None):
assert isinstance(spec, PackageSpec)
self.spec = spec
self.item = item
self.dependencies = dependencies or []
def as_dict(self, with_manifest=False):
result = {
"spec": self.spec.as_dict(),
"item": self.item.as_dict() if self.item else None,
"dependencies": [d.as_dict() for d in self.dependencies],
}
if with_manifest:
result["manifest"] = (
ManifestParserFactory.new_from_dir(self.item.path).as_dict()
if self.item
else None
)
return result


@@ -152,6 +152,7 @@ class FileUnpacker:
magic_map = {
b"\x1f\x8b\x08": TARArchiver,
b"\x42\x5a\x68": TARArchiver,
b"\xfd\x37\x7a\x58\x5a\x00": TARArchiver,
b"\x50\x4b\x03\x04": ZIPArchiver,
}
magic_len = max(len(k) for k in magic_map)
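The hunk above adds the xz magic number so `.tar.xz` archives dispatch to the TAR handler. The detection idea, sketched with plain strings standing in for the archiver classes:

```python
# Map of leading file bytes ("magic numbers") to an archive handler.
MAGIC_MAP = {
    b"\x1f\x8b\x08": "tar",              # gzip
    b"\x42\x5a\x68": "tar",              # bzip2
    b"\xfd\x37\x7a\x58\x5a\x00": "tar",  # xz (the entry added in this diff)
    b"\x50\x4b\x03\x04": "zip",          # zip local file header
}
# Read only as many bytes as the longest known signature
MAGIC_LEN = max(len(k) for k in MAGIC_MAP)


def detect_archiver(path):
    with open(path, "rb") as fp:
        head = fp.read(MAGIC_LEN)
    for magic, archiver in MAGIC_MAP.items():
        if head.startswith(magic):
            return archiver
    return None
```

Sniffing magic bytes is more reliable than trusting file extensions, since registry downloads may arrive with generic or misleading names.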


@@ -44,7 +44,7 @@ def cast_version_to_semver(value, force=True, raise_exception=False):
def pepver_to_semver(pepver):
return cast_version_to_semver(
re.sub(r"(\.\d+)\.?(dev|a|b|rc|post)", r"\1-\2.", pepver, 1)
re.sub(r"(\.\d+)\.?(dev|a|b|rc|post)", r"\1-\2.", pepver, count=1)
)
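The change above only switches the positional `1` to the keyword `count=1` (the positional form is deprecated in recent Python). The substitution itself rewrites a PEP 440 pre-release tag into a semver prerelease, for example:

```python
import re


def pepver_to_semver(pepver):
    # Insert a semver prerelease separator before dev/a/b/rc/post tags:
    # "6.1.18a1" -> "6.1.18-a.1"; count=1 replaces only the first match
    return re.sub(r"(\.\d+)\.?(dev|a|b|rc|post)", r"\1-\2.", pepver, count=1)
```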


@@ -44,7 +44,7 @@ class PlatformRunMixin:
value = json.loads(value)
return value
def run( # pylint: disable=too-many-arguments
def run( # pylint: disable=too-many-arguments,too-many-positional-arguments
self, variables, targets, silent, verbose, jobs
):
assert isinstance(variables, dict)


@@ -16,8 +16,6 @@ import os
import re
import sys
import httpx
from platformio import fs
from platformio.compat import load_python_module
from platformio.package.meta import PackageItem
@@ -33,16 +31,13 @@ class PlatformFactory:
name = re.sub(r"[^\da-z\_]+", "", name, flags=re.I)
return "%sPlatform" % name.lower().capitalize()
@classmethod
def load_platform_module(cls, name, path):
# backward compatibiility with the legacy dev-platforms
@staticmethod
def load_platform_module(name, path):
# backward compatibility with the legacy dev-platforms
sys.modules["platformio.managers.platform"] = base
try:
return load_python_module("platformio.platform.%s" % name, path)
except ImportError as exc:
if exc.name == "requests" and not sys.modules.get("requests"):
sys.modules["requests"] = httpx
return cls.load_platform_module(name, path)
raise UnknownPlatform(name) from exc
@classmethod


@@ -68,7 +68,7 @@ def validate_boards(ctx, param, value): # pylint: disable=unused-argument
@click.option("--no-install-dependencies", is_flag=True)
@click.option("--env-prefix", default="")
@click.option("-s", "--silent", is_flag=True)
def project_init_cmd(
def project_init_cmd( # pylint: disable=too-many-positional-arguments
project_dir,
boards,
ide,
@@ -201,9 +201,7 @@ new version when next recompiled. The header file eliminates the labor of
finding and changing all the copies as well as the risk that a failure to
find one copy will result in inconsistencies within a program.
In C, the usual convention is to give header files names that end with `.h'.
It is most portable to use only letters, digits, dashes, and underscores in
header file names, and at most one dot.
In C, the convention is to give header files names that end with `.h'.
Read more about using header files in official GCC documentation:
@@ -222,12 +220,12 @@ def init_lib_readme(lib_dir):
fp.write(
"""
This directory is intended for project specific (private) libraries.
PlatformIO will compile them to static libraries and link into executable file.
PlatformIO will compile them to static libraries and link into the executable file.
The source code of each library should be placed in an own separate directory
("lib/your_library_name/[here are source files]").
The source code of each library should be placed in a separate directory
("lib/your_library_name/[Code]").
For example, see a structure of the following two libraries `Foo` and `Bar`:
For example, see the structure of the following example libraries `Foo` and `Bar`:
|--lib
| |
@@ -237,7 +235,7 @@ For example, see a structure of the following two libraries `Foo` and `Bar`:
| | |--src
| | |- Bar.c
| | |- Bar.h
| | |- library.json (optional, custom build options, etc) https://docs.platformio.org/page/librarymanager/config.html
| | |- library.json (optional; for custom build options, etc.) https://docs.platformio.org/page/librarymanager/config.html
| |
| |--Foo
| | |- Foo.c
@@ -249,7 +247,7 @@ For example, see a structure of the following two libraries `Foo` and `Bar`:
|--src
|- main.c
and a contents of `src/main.c`:
Example contents of `src/main.c` using Foo and Bar:
```
#include <Foo.h>
#include <Bar.h>
@@ -261,8 +259,8 @@ int main (void)
```
PlatformIO Library Dependency Finder will find automatically dependent
libraries scanning project source files.
The PlatformIO Library Dependency Finder will automatically find dependent
libraries by scanning project source files.
More information about PlatformIO Library Dependency Finder
- https://docs.platformio.org/page/librarymanager/ldf.html

platformio/project/commands/metadata.py Executable file → Normal file

@@ -347,7 +347,7 @@ class ProjectConfigBase:
if section is None:
if option in self.BUILTIN_VARS:
return self.BUILTIN_VARS[option]()
# SCons varaibles
# SCons variables
return f"${{{option}}}"
# handle system environment variables
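The fallback branch above re-emits an unrecognized option as a SCons-style `${...}` reference via a triple-brace f-string. A standalone sketch of that formatting trick:

```python
# Sketch of the fallback above: an unknown option is re-emitted as a
# SCons-style ${...} reference for later expansion by the build system.
# In an f-string, "{{" and "}}" are literal braces, so f"${{{option}}}"
# yields "$" + "{" + option + "}".
def as_scons_ref(option: str) -> str:
    return f"${{{option}}}"

print(as_scons_ref("BUILD_DIR"))
```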


@@ -12,29 +12,22 @@
# See the License for the specific language governing permissions and
# limitations under the License.
import hashlib
import os
import re
import subprocess
from hashlib import sha1
from click.testing import CliRunner
from platformio import __version__, exception, fs
from platformio.compat import IS_MACOS, IS_WINDOWS, hashlib_encode_data
from platformio.project.config import ProjectConfig
from platformio.project.options import ProjectOptions
def get_project_dir():
return os.getcwd()
def get_project_id(project_dir=None):
return hashlib.sha1(
hashlib_encode_data(project_dir or get_project_dir())
).hexdigest()
def is_platformio_project(project_dir=None):
if not project_dir:
project_dir = get_project_dir()
@@ -99,7 +92,7 @@ def get_default_projects_dir():
def compute_project_checksum(config):
# rebuild when PIO Core version changes
checksum = hashlib.sha1(hashlib_encode_data(__version__))
checksum = sha1(hashlib_encode_data(__version__))
# configuration file state
config_data = config.to_json()
@@ -138,59 +131,49 @@ def compute_project_checksum(config):
return checksum.hexdigest()
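Seeding the checksum with the Core version, as above, means every Core upgrade changes the project checksum and forces a rebuild. A minimal sketch of that seed (the version string below is illustrative):

```python
# Minimal sketch of the checksum seed above: hash the running Core
# version so any upgrade invalidates previously built artifacts.
from hashlib import sha1

def seed_checksum(version: str):
    # hashlib_encode_data in the real code reduces to UTF-8 encoding here
    return sha1(version.encode("utf8"))

digest = seed_checksum("6.1.18").hexdigest()
print(digest)
```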
def get_build_type(config, env, run_targets=None):
types = []
run_targets = run_targets or []
env_build_type = config.get(f"env:{env}", "build_type")
if set(["__debug", "__memusage"]) & set(run_targets) or env_build_type == "debug":
types.append("debug")
if "__test" in run_targets or env_build_type == "test":
types.append("test")
return ", ".join(types or [ProjectOptions["env.build_type"].default])
def load_build_metadata(project_dir, env_or_envs, cache=False, force_targets=None):
def load_build_metadata(project_dir, env_or_envs, cache=False, build_type=None):
assert env_or_envs
envs = env_or_envs
if not isinstance(envs, list):
envs = [envs]
with fs.cd(project_dir or os.getcwd()):
result = _get_cached_build_metadata(envs, force_targets) if cache else {}
missed_envs = set(envs) - set(result.keys())
if missed_envs:
result.update(_load_build_metadata(missed_envs, force_targets))
env_names = env_or_envs
if not isinstance(env_names, list):
env_names = [env_names]
with fs.cd(project_dir):
result = _get_cached_build_metadata(env_names) if cache else {}
# incompatible build-type data
for env_name in list(result.keys()):
if build_type is None:
build_type = ProjectConfig.get_instance().get(
f"env:{env_name}", "build_type"
)
if result[env_name].get("build_type", "") != build_type:
del result[env_name]
missed_env_names = set(env_names) - set(result.keys())
if missed_env_names:
result.update(
_load_build_metadata(project_dir, missed_env_names, build_type)
)
if not isinstance(env_or_envs, list) and env_or_envs in result:
return result[env_or_envs]
return result or None
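The new cache-invalidation loop above drops any cached per-environment entry whose recorded `build_type` no longer matches the requested one, so stale idedata from a different build type is never returned. A self-contained sketch of just that filtering step (the env names and metadata are made up):

```python
# Sketch of the cache-invalidation step above: cached per-env metadata
# is discarded when its recorded build_type mismatches the request.
cached = {
    "uno": {"build_type": "release", "includes": ["src"]},
    "esp32dev": {"build_type": "debug", "includes": ["src"]},
}

def filter_cache(result, build_type):
    # list() copies the keys so we can delete entries while iterating
    for env_name in list(result):
        if result[env_name].get("build_type", "") != build_type:
            del result[env_name]
    return result

filtered = filter_cache(dict(cached), "debug")
print(sorted(filtered))
```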
# Backward compatibiility with dev-platforms
# Backward compatibility with dev-platforms
load_project_ide_data = load_build_metadata
def _get_cached_build_metadata(envs, force_targets=None):
config = ProjectConfig.get_instance(os.path.join(os.getcwd(), "platformio.ini"))
build_dir = config.get("platformio", "build_dir")
result = {}
for env in envs:
build_type = get_build_type(config, env, force_targets)
json_path = os.path.join(build_dir, env, build_type, "metadata.json")
if os.path.isfile(json_path):
result[env] = fs.load_json(json_path)
return result
def _load_build_metadata(envs, force_targets=None):
def _load_build_metadata(project_dir, env_names, build_type=None):
# pylint: disable=import-outside-toplevel
from platformio import app
from platformio.run.cli import cli as cmd_run
args = ["--target", "__metadata"]
for target in force_targets or []:
args.extend(["--target", target])
for env in envs:
args.extend(["-e", env])
args = ["--project-dir", project_dir, "--target", "__idedata"]
if build_type == "debug":
args.extend(["--target", "__debug"])
# if build_type == "test":
# args.extend(["--target", "__test"])
for name in env_names:
args.extend(["-e", name])
app.set_session_var("pause_telemetry", True)
result = CliRunner().invoke(cmd_run, args)
app.set_session_var("pause_telemetry", False)
@@ -198,6 +181,18 @@ def _load_build_metadata(envs, force_targets=None):
result.exception, exception.ReturnErrorCode
):
raise result.exception
if "Metadata has been saved to the following location" not in result.output:
if '"includes":' not in result.output:
raise exception.UserSideException(result.output)
return _get_cached_build_metadata(envs, force_targets)
return _get_cached_build_metadata(env_names)
def _get_cached_build_metadata(env_names):
build_dir = ProjectConfig.get_instance().get("platformio", "build_dir")
result = {}
for env_name in env_names:
if not os.path.isfile(os.path.join(build_dir, env_name, "idedata.json")):
continue
result[env_name] = fs.load_json(
os.path.join(build_dir, env_name, "idedata.json")
)
return result
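The argument list that `_load_build_metadata` above hands to the run command can be assembled and inspected without invoking the CLI; a sketch of just that assembly (the helper name is invented, the flags are those visible in the diff):

```python
# Sketch of the run-command argument assembly in _load_build_metadata:
# always request the __idedata target, add __debug for debug builds,
# then one "-e" flag per environment.
def build_run_args(project_dir, env_names, build_type=None):
    args = ["--project-dir", project_dir, "--target", "__idedata"]
    if build_type == "debug":
        args.extend(["--target", "__debug"])
    for name in env_names:
        args.extend(["-e", name])
    return args

print(build_run_args("/tmp/proj", ["uno", "esp32dev"], build_type="debug"))
```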

platformio/project/integration/generator.py Executable file → Normal file

@@ -17,7 +17,7 @@
# common.symbolFiles=<Symbol Files loaded by debugger>
# (This value is overwritten by a launcher specific symbolFiles value if the latter exists)
#
# In runDir, symbolFiles and env fields you can use these macroses:
# In runDir, symbolFiles and env fields you can use these macros:
# ${PROJECT_DIR} - project directory absolute path
# ${OUTPUT_PATH} - linker output path (relative to project directory path)
# ${OUTPUT_BASENAME}- linker output filename
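The `${...}` macros listed above are the standard shell-style substitution pattern; a sketch of expanding them with the stdlib (the sample values are made up):

```python
# Illustrative expansion of the ${...} macros documented above, using
# string.Template, whose ${VAR} syntax matches the macro notation.
import string

template = "${PROJECT_DIR}/.pio/build/${OUTPUT_BASENAME}"
values = {
    "PROJECT_DIR": "/home/user/proj",      # project directory absolute path
    "OUTPUT_BASENAME": "firmware.elf",     # linker output filename
}
expanded = string.Template(template).substitute(values)
print(expanded)
```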


@@ -1,58 +0,0 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import gzip
import json
import os
import time
from platformio import fs
from platformio.project.config import ProjectConfig
def get_report_dir(project_dir, env):
with fs.cd(project_dir):
return os.path.join(
ProjectConfig.get_instance().get("platformio", "memusage_dir"), env
)
def list_reports(report_dir):
if not os.path.isdir(report_dir):
return []
return [os.path.join(report_dir, item) for item in sorted(os.listdir(report_dir))]
def read_report(path):
with gzip.open(path, mode="rt", encoding="utf8") as fp:
return json.load(fp)
def save_report(project_dir, env, data):
report_dir = get_report_dir(project_dir, env)
if not os.path.isdir(report_dir):
os.makedirs(report_dir)
report_path = os.path.join(report_dir, f"{int(time.time())}.json.gz")
with gzip.open(report_path, mode="wt", encoding="utf8") as fp:
json.dump(data, fp)
rotate_reports(report_dir)
return report_path
def rotate_reports(report_dir, max_reports=100):
reports = os.listdir(report_dir)
if len(reports) < max_reports:
return
for fname in sorted(reports)[0 : len(reports) - max_reports]:
os.remove(os.path.join(report_dir, fname))
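The module deleted above stored each memory-usage report as a gzip-compressed JSON file. That round trip is plain stdlib; a sketch writing and reading one report in a temporary directory (the payload keys are illustrative):

```python
# Round-trip sketch of the gzip-JSON report format used by the removed
# module above: dump JSON through gzip in text mode, read it back.
import gzip
import json
import os
import tempfile

with tempfile.TemporaryDirectory() as report_dir:
    path = os.path.join(report_dir, "1700000000.json.gz")
    with gzip.open(path, mode="wt", encoding="utf8") as fp:
        json.dump({"ram": 1024, "flash": 4096}, fp)
    with gzip.open(path, mode="rt", encoding="utf8") as fp:
        data = json.load(fp)

print(data)
```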


@@ -23,7 +23,7 @@ from platformio import fs
from platformio.compat import IS_WINDOWS
class ConfigOption: # pylint: disable=too-many-instance-attributes
class ConfigOption: # pylint: disable=too-many-instance-attributes,too-many-positional-arguments
def __init__(
self,
scope,
@@ -240,17 +240,6 @@ ProjectOptions = OrderedDict(
default=os.path.join("${platformio.workspace_dir}", "libdeps"),
validate=validate_dir,
),
ConfigPlatformioOption(
group="directory",
name="memusage_dir",
description=(
"A location where PlatformIO Core will store "
"project memory usage reports"
),
sysenvvar="PLATFORMIO_MEMUSAGE_DIR",
default=os.path.join("${platformio.workspace_dir}", "memusage"),
validate=validate_dir,
),
ConfigPlatformioOption(
group="directory",
name="include_dir",


@@ -17,14 +17,13 @@
from platformio.device.list.util import list_logical_devices, list_serial_ports
from platformio.device.monitor.filters.base import DeviceMonitorFilterBase
from platformio.fs import to_unix_path
from platformio.http import fetch_http_content
from platformio.platform.base import PlatformBase
from platformio.project.config import ProjectConfig
from platformio.project.helpers import get_project_watch_lib_dirs, load_build_metadata
from platformio.project.options import get_config_options_schema
from platformio.test.result import TestCase, TestCaseSource, TestStatus
from platformio.test.runners.base import TestRunnerBase
from platformio.test.runners.doctest import DoctestTestCaseParser
from platformio.test.runners.doctest import DoctestTestRunner
from platformio.test.runners.googletest import GoogletestTestRunner
from platformio.test.runners.unity import UnityTestRunner
from platformio.util import get_systype

Some files were not shown because too many files have changed in this diff.