Compare commits

...

1347 Commits

SHA1 Message Date
7a8c061d79 Added CONFIG += c++17 to get correct autocompletions 2020-10-30 18:09:37 +01:00
8d4cde4534 Merge tag 'v5.0.2' into develop
Bump version to 5.0.2
2020-10-30 18:10:54 +02:00
7292024ee6 Merge branch 'release/v5.0.2' 2020-10-30 18:10:54 +02:00
d6df6cbb5d Bump version to 5.0.2 2020-10-30 18:10:47 +02:00
344e94d8a1 Bump version to 5.0.2rc2 2020-10-30 17:51:02 +02:00
5cf73a9165 Remove all hooks when dumping data to JSON and Python 3 is used 2020-10-30 17:50:43 +02:00
96b1a1c79c Fixed an issue with a "wrong" timestamp in device monitor output using "time" filter // Resolve #3712 2020-10-30 14:11:27 +02:00
0bbe7f8c73 Sync docs 2020-10-29 23:48:44 +02:00
e333bb1cca Tests: skip dev-platforms without examples 2020-10-29 23:42:15 +02:00
454cd8d784 Bump version to 5.0.2rc1 2020-10-29 23:18:39 +02:00
743a43ae17 Fixed an issue where multiple `pio lib install` commands with the same local library result in duplicates in `lib_deps` // Resolve #3715 2020-10-29 23:17:47 +02:00
5a1b0e19b2 Initialize a new project or update an existing one, passing a working environment name and its options // Resolve #3686 2020-10-29 22:59:48 +02:00
da6cde5cbd Sync docs 2020-10-29 18:09:08 +02:00
5ea864da39 Add py39 env 2020-10-29 18:08:58 +02:00
175448deda Fix tests on PY2 2020-10-29 14:37:50 +02:00
16f90dd821 Ignore possible empty defines when exporting IDE data // Resolve #3690 2020-10-29 12:22:26 +02:00
9efac669e6 Bump version to 5.0.2b5 2020-10-28 22:53:29 +02:00
adf9ba29df Fixed an issue when "pio package publish" command removes original archive after submitting to the registry // Resolve #3716 2020-10-28 22:52:48 +02:00
cacddb9abb Support package packing on Python 3+ only 2020-10-28 22:33:24 +02:00
edbe213410 Sync docs 2020-10-28 22:32:48 +02:00
891f78be37 Use "ensure_python3" util 2020-10-28 22:32:27 +02:00
175be346a8 Extend package filters 2020-10-28 20:57:26 +02:00
9ae981614f Add pack/sdist target 2020-10-28 20:56:53 +02:00
16f5f3ef46 Do not pack binary files and docs to the package source archive 2020-10-28 14:18:09 +02:00
2cd19b0273 Bump version to 5.0.2b4 2020-10-27 23:00:33 +02:00
e158e54a26 Fix issue with data decoding when calling PIO Core via PIO Home 2020-10-27 22:57:51 +02:00
63a6fe9133 Improved "core.call" RPC for PlatformIO Home // Resolve #3671 2020-10-27 21:07:02 +02:00
779eaee310 Bump version to 5.0.2b3 2020-10-26 22:25:47 +02:00
0ecfe8105f Docs: Unify CLI commands to use "pio" short version command 2020-10-26 22:24:05 +02:00
b8cc867ba4 Allow dev-platform to provide extra debug configuration using BasePlatform::configure_debug_options API 2020-10-26 18:24:46 +02:00
7230556d1b Move extra IDE data to "extra" section 2020-10-26 18:23:28 +02:00
afd79f4655 Improve initiating manifest parser from a package archive 2020-10-22 19:08:20 +03:00
5d87fb8757 Add "Articles" section to Zephyr RTOS 2020-10-22 19:07:44 +03:00
23e9596506 Automatically build PlatformIO Core extra Python dependencies on a host machine if they are missing in the registry // Resolve #3700 2020-10-20 21:06:53 +03:00
428f46fafe Typo test 2020-10-16 17:05:19 +03:00
ee847e03a6 Fix an issue with "'NoneType' object has no attribute 'status_code'" 2020-10-16 14:23:10 +03:00
a870981266 Docs: Fix custom "platform_packages" for ESP8266/32 2020-10-14 23:21:59 +03:00
411bf1107d Disable "lattice_ice40" examples for macOS 2020-10-14 22:33:04 +03:00
5b74c8a942 Minor fixes 2020-10-14 19:53:30 +03:00
a24bab0a27 Fix badge 2020-10-14 17:55:45 +03:00
1cb7764b0e Highlight PlatformIO Labs Technology 2020-10-14 17:53:41 +03:00
d835f52a18 Sync docs 2020-10-10 20:48:56 +03:00
9c20ab81cb fix quoting of defines in ccls template (#3692) 2020-10-02 13:28:02 +03:00
14de3e79c5 Sync docs 2020-09-26 22:52:43 +03:00
21c12030d5 Bump version to 5.0.2b2 2020-09-19 19:30:40 +03:00
2370e16f1b Fixed an "AssertionError: ensure_dir_exists" when checking library updates from simultaneous subprocesses // Resolve #3677 2020-09-19 19:29:51 +03:00
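The race described in the entry above happens when two subprocesses both see the directory as missing and then both try to create it. A minimal sketch of the tolerant pattern, standard library only (the helper name is illustrative, not the actual PlatformIO API):

```python
import os

def ensure_dir_exists(path):
    # Another process may create the directory between the existence check
    # and the mkdir call, so treat "already exists" as success instead of
    # asserting or crashing.
    try:
        os.makedirs(path)
    except OSError:
        if not os.path.isdir(path):
            raise
    return path
```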
a384411a28 Bump version to 5.0.2b1 2020-09-17 20:41:10 +03:00
1e0ca8f79c Fixed an issue with GCC linker when "native" dev-platform is used in pair with library dependencies // Resolve #3669 2020-09-17 20:40:11 +03:00
2b5e590819 Docs: Explain how to install custom Python packages // Resolve #3673 2020-09-17 19:21:12 +03:00
bf57b777bf Docs: Update docs for PlatformIO IDE 2.0 for VSCode 2020-09-16 19:33:55 +03:00
f656d19ed5 Docs: Added new section about Arduino STM32L0 core 2020-09-14 22:31:11 +03:00
eb09af06ed Bump version to 5.0.2a2 2020-09-12 23:21:33 +03:00
687c339f20 Fixed a "PermissionError: [WinError 5]" on Windows when external repository is used with lib_deps option // Resolve #3664 2020-09-12 23:20:46 +03:00
7bc170a53e Fixed an issue with "KeyError: 'versions'" when dependency does not exist in the registry // Resolve #3666 2020-09-11 21:16:18 +03:00
65297c24d4 Merge branch 'release/v5.0.1' 2020-09-10 17:46:56 +03:00
ea21f3fba0 Merge tag 'v5.0.1' into develop
Bump version to 5.0.1
2020-09-10 17:46:56 +03:00
b515a004d3 Bump version to 5.0.1 2020-09-10 17:46:49 +03:00
7d3fc1ec1a Catch exception if folder already exists 2020-09-09 18:38:55 +03:00
6987d6c1c6 Fixed an issue when an external dev-platform cannot be updated or removed using PlatformIO Home // Resolve #3663 2020-09-09 17:53:04 +03:00
de2b5ea905 Bump version to 5.0.1b1 2020-09-09 16:27:47 +03:00
f946a0bc08 Reformat code with black==20.8b1 2020-09-09 16:27:36 +03:00
4f47ca5742 Fixed an issue with "Invalid simple block (semantic_version)" from library dependency that refers to an external source (repository, ZIP/Tar archives) // Resolve #3658 2020-09-09 16:13:39 +03:00
54b51fc2fd Merge branch 'develop' of https://github.com/platformio/platformio-core into develop
# Conflicts:
#	HISTORY.rst
2020-09-09 14:38:29 +03:00
1f284e853d Fixed an issue when the package manager tries to install a built-in library from the registry // Resolve #3662 2020-09-09 14:36:01 +03:00
2a30ad0fdf Allow in-progress language standards in IDE templates // Resolve #3653
Note: VS Code only supports finalized names
2020-09-09 13:56:00 +03:00
c454ae336d Added support for "owner" requirement when declaring dependencies using `library.json` 2020-09-09 13:10:42 +03:00
cd59c829e0 Fixed an issue when pio package unpublish command crashes // Resolve #3660 2020-09-09 12:17:09 +03:00
429f416b38 Generate current date for a custom contrib-pysite package 2020-09-07 16:38:20 +03:00
0a881d582d Docs: Add info on how to list published packages 2020-09-07 16:37:37 +03:00
65b1029216 Host SPDX licenses on Bintray since GitHub is blocked in multiple countries 2020-09-07 13:16:08 +03:00
c7758fd30e Docs: Minor fixes 2020-09-06 21:06:16 +03:00
46f300d62f Docs: Add "Publishing" section to the instruction on how to create own dev-platform 2020-09-06 21:00:45 +03:00
4234dfb6f9 Fixed an issue with "ImportError: cannot import name '_get_backend' from 'cryptography.hazmat.backends'" when using Remote Development // Resolve #3652 2020-09-06 18:37:27 +03:00
9695720343 Bump version to 5.0.1a1 2020-09-04 22:48:21 +03:00
1f28056459 Fixed an issue when using a custom git/ssh package with platform_packages // Resolve #3624 2020-09-04 22:47:49 +03:00
7dacceef04 Exclude tests from python package (#3650) 2020-09-04 18:55:30 +03:00
39883e8d68 Docs: Document "--without-testing" option for pio test command 2020-09-04 18:39:52 +03:00
949ef2c48a Merge tag 'v5.0.0' into develop
Bump version to 5.0.0
2020-09-03 14:43:11 +03:00
ada3f8b270 Merge branch 'release/v5.0.0' 2020-09-03 14:43:10 +03:00
cf4b835b0c Bump version to 5.0.0 2020-09-03 14:42:59 +03:00
fec4569ada Docs: Update docs with new owner-based dependency form 2020-09-03 14:37:24 +03:00
083edc4c76 Refactor to os.path 2020-09-02 20:52:11 +03:00
fe4112a2a3 Follow SemVer-compliant version constraints when checking library updates // Resolve #1281 2020-09-02 20:36:56 +03:00
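PlatformIO Core depends on the `semantic_version` package, so a constraint check along these lines is plausible; this is only a sketch under that assumption, not the actual update logic:

```python
import semantic_version

def update_available(installed, latest, constraint=None):
    # Only offer an update if the latest version satisfies the declared
    # constraint, e.g. ">=1.2.0,<2.0.0", and is newer than what is installed.
    installed = semantic_version.Version(installed)
    latest = semantic_version.Version(latest)
    if constraint and not semantic_version.SimpleSpec(constraint).match(latest):
        return False
    return latest > installed

print(update_available("1.2.3", "1.9.0", ">=1.2.0,<2.0.0"))  # True
print(update_available("1.2.3", "2.0.0", ">=1.2.0,<2.0.0"))  # False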
c8ea64edab Fix link to FAQ sections (#3642)
* Fix link to FAQ sections

Consistently use the same host and URL, and fix one unmatched anchor.

* Update HISTORY.rst

Co-authored-by: Ivan Kravets <me@ikravets.com>
2020-09-02 19:13:20 +03:00
6e5198f373 Minor improvements 2020-09-02 18:49:00 +03:00
44c2b65372 Show ignored project environments only in the verbose mode // Resolve #3641 2020-09-02 17:31:32 +03:00
5cc21511ad Show owner name for packages 2020-09-02 16:07:16 +03:00
2edd7ae649 Update PVS-Studio to the latest v7.09 2020-08-31 15:40:25 +03:00
7a49a74135 Bump version to 5.0.0b3 2020-08-28 21:55:55 +03:00
be487019f5 Fix broken handling of multi-configuration projects // Resolve #3615 2020-08-28 21:54:47 +03:00
5dee0a31e6 Do not test for package owner if resource is external 2020-08-28 21:40:17 +03:00
9f2c134e44 Do not detach a new package even if it comes from external source 2020-08-28 21:24:48 +03:00
cdbb837948 Minor fixes 2020-08-28 18:45:52 +03:00
80c1774a19 Docs: PlatformIO Core 5.0: new commands, migration guide, other improvements 2020-08-28 14:08:26 +03:00
1aaa9b6707 Update changelog with static analysis section 2020-08-26 17:44:01 +03:00
4a7f578649 Sync docs and history 2020-08-26 15:40:24 +03:00
d59416431d Parse npm-like "repository" data from a package manifest // Resolve #3637 2020-08-26 15:40:03 +03:00
8625fdc571 Minor improvements 2020-08-26 14:51:53 +03:00
3c91e3c1e1 Move build dir to the disk root (should fix issue with long path for Zephyr RTOS on Windows) 2020-08-26 14:51:01 +03:00
1560fb724c Bump version to 5.0.0b2 2020-08-26 06:40:46 +03:00
0db39ccfbd Automatically accept PIO Core 4.0 compatible dev-platforms 2020-08-26 06:40:22 +03:00
5086b96ede Bump version to 5.0.0b1 2020-08-25 22:22:35 +03:00
210cd76042 Rename "idedata" sub-command to "data" 2020-08-25 22:01:22 +03:00
f77978a295 Apply formatting 2020-08-25 22:01:08 +03:00
3e72f098fe Updates for PIO Check (#3640)
* Update check tools to the latest versions

* Use language standard when exporting defines to check tools

* Buffer Cppcheck output to detect multiline messages

* Add new test for PIO Check

* Pass include paths to Clang-Tidy as individual compiler arguments

Clang-Tidy doesn't support response files, which are normally used to work
around command-length limitations on Windows

* Simplify tests for PIO Check

* Update history

* Sync changelog
2020-08-25 21:19:21 +03:00
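A sketch of the idea behind the Clang-Tidy bullet in the PR above: instead of writing include paths into a response file (which Clang-Tidy does not read), each path is appended as its own `-I` argument after the `--` separator. Names are illustrative:

```python
def build_clang_tidy_cmd(src_file, include_dirs, extra_flags=None):
    # Everything after "--" is treated as compiler arguments; include paths
    # are passed individually rather than via an @response-file.
    cmd = ["clang-tidy", src_file] + list(extra_flags or []) + ["--"]
    for inc in include_dirs:
        cmd.append("-I%s" % inc)
    return cmd

print(build_clang_tidy_cmd("src/main.cpp", ["include", "lib/Foo/src"]))
```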
b9fe493336 Sync docs 2020-08-25 19:18:26 +03:00
79bfac29ba Update history and sync docs 2020-08-25 18:57:20 +03:00
2ea80d91f8 Minor fixes 2020-08-25 15:55:17 +03:00
fa90251714 Fixed an issue when Unit Testing engine fails with a custom project configuration file // Resolve #3583 2020-08-25 14:35:01 +03:00
ff19109787 Fix test 2020-08-25 14:34:03 +03:00
091ba4346d Bump version to 4.4.0b6 2020-08-24 23:11:43 +03:00
e43176e33a Typo fix 2020-08-24 23:11:24 +03:00
655e2856d1 Bump version to 4.4.0b5 2020-08-24 23:05:01 +03:00
c6a37ef880 Get real path of just installed core-package 2020-08-24 23:04:17 +03:00
6af2bad123 Make PIO Core 4.0 automatically compatible with dev-platforms for PIO Core 2.0 & 3.0 // Resolve #3638 2020-08-24 22:56:31 +03:00
3e7e9e2b3d Remove unused data using a new `pio system prune` command // Resolve #3522 2020-08-24 15:22:05 +03:00
13db51a556 Install/Uninstall dependencies only for library-type packages // Resolve #3637 2020-08-24 15:10:38 +03:00
d6d95e05e8 Rename "fs.format_filesize" to "fs.humanize_file_size" 2020-08-24 15:09:37 +03:00
b44bc80bd1 PyLint fix for PY2 2020-08-23 21:41:53 +03:00
f39c9fb597 Bump version to 4.4.0b4 2020-08-23 21:07:40 +03:00
24f85a337f Fix "AttributeError: module 'platformio.exception' has no attribute 'InternetIsOffline'" 2020-08-23 21:07:14 +03:00
a069bae1fb Fix a bug with package updating when version is not in SemVer format // Resolve #3635 2020-08-23 15:26:58 +03:00
1c8aca2f6a Check ALL possible versions for the first matched package 2020-08-23 15:25:03 +03:00
620241e067 Move package "version" related things to "platformio.package.version" module 2020-08-23 15:24:31 +03:00
da179cb33f Enhance configuration variables 2020-08-23 14:29:31 +03:00
8ea10a18d3 Bump version to 4.4.0b3 2020-08-23 13:22:38 +03:00
e2bb81bae4 Restore legacy util.cd API 2020-08-23 13:22:11 +03:00
dcf91c49ac Remove debug code 2020-08-22 22:56:26 +03:00
c2caf8b839 Bump version to 4.4.0b2 2020-08-22 22:53:41 +03:00
95151062f5 Implement mirroring for HTTP client 2020-08-22 22:52:29 +03:00
7e4bfb1959 Move CacheContent API to "cache.py" module 2020-08-22 20:05:14 +03:00
abae9c7e77 Cache base registry requests 2020-08-22 17:52:12 +03:00
102aa5f22b Port legacy API requests to the new registry client 2020-08-22 17:49:29 +03:00
d92c1d3442 Refactor HTTP related operations 2020-08-22 17:48:49 +03:00
aa186382a8 Upgraded to SCons 4.0 2020-08-22 14:22:37 +03:00
70366d34b9 Sync docs 2020-08-22 13:57:18 +03:00
49b70f44ca Ignore legacy tmp pkg folders 2020-08-22 13:56:57 +03:00
f79fb4190e Sync docs 2020-08-21 14:25:59 +03:00
d980194600 Bump version to 4.4.0b1 2020-08-17 15:34:02 +03:00
fb6e1fd33c PyLint fixes 2020-08-17 15:33:08 +03:00
6f7fc638c7 Fix PyLint errors in tests 2020-08-17 12:56:57 +03:00
2459e85c1d Fix a bug with the custom platform packages // Resolve #3628 2020-08-17 12:13:25 +03:00
74e27a2edc Enable "cyclic reference" for GCC linker only for the embedded dev-platforms // Resolve #3570 2020-08-16 20:26:59 +03:00
808852f4cc Set default timeout for http requests // Resolve #3623 2020-08-16 20:21:30 +03:00
67e6d177b4 Minor fixes for dev-platform factory 2020-08-16 18:48:05 +03:00
04694b4126 Switch legacy platform manager to the new one 2020-08-15 23:11:01 +03:00
bb6fb3fdf8 Fix bug with parsing detached packages 2020-08-15 15:24:35 +03:00
4ec64f8980 Fix a test for examples 2020-08-14 17:00:18 +03:00
332874cd4b Fix relative import of platform module on Py27 2020-08-14 16:48:12 +03:00
276ca61cde Refactor dev-platform API 2020-08-14 16:39:15 +03:00
5f3ad70190 Rename meta.PackageSourceItem to PackageItem 2020-08-14 16:38:46 +03:00
ff8ec43a28 Ensure tool-type package is compatible with a host system 2020-08-13 21:46:46 +03:00
ecc369c2f8 Minor fixes 2020-08-13 20:19:27 +03:00
26fdd0a62c Bump version to 4.4.0a8 2020-08-13 18:30:33 +03:00
64ff6a0ff5 Switch legacy core package manager to the new one 2020-08-13 18:30:04 +03:00
fd7dba1d74 Package Manifest: increase package author.name field to 100 chars 2020-08-13 17:50:44 +03:00
38ec517200 Update history 2020-08-12 21:09:42 +03:00
20a74d1654 Merge branch 'feature/pkg-next' into develop 2020-08-12 20:09:18 +03:00
d5451756fd Minor improvements 2020-08-12 20:09:10 +03:00
893ca1b328 Switch library manager to the new package manager 2020-08-12 13:27:05 +03:00
2dd69e21c0 Implement package removing with dependencies 2020-08-01 20:17:07 +03:00
a01b3a2473 Do not raise exception when package is not found (404), return None 2020-08-01 19:58:59 +03:00
6ac538fba4 Remove unused import 2020-08-01 15:49:10 +03:00
41c2d64ef0 Fix "PermissionError: [WinError 32] The process cannot access the file" on Windows 2020-08-01 15:36:28 +03:00
a1970bbfe3 Allow a forced package installation, removing the existing package 2020-08-01 14:38:28 +03:00
d329aef876 Initial version of a new package manager 2020-07-31 15:42:26 +03:00
abc0489ac6 Update changelog 2020-07-28 15:59:02 +03:00
2bc47f4e97 PyLint fix 2020-07-28 15:55:25 +03:00
933a09f981 Update unit testing support for mbed framework
- Take into account Mbed OS6 API changes
- RawSerial is used with Mbed OS 5 since Serial doesn't support putc with the bare-metal profile
2020-07-28 15:22:36 +03:00
adc2d5fe7c Update VSCode template
Starting with cpptools v0.29, escaped paths in the compilerArgs field don't work on Windows.
2020-07-28 15:10:52 +03:00
def149a29e Use updated registry API 2020-07-25 17:13:05 +03:00
39cb23813f Allow ignoring "platforms" and "frameworks" fields in "library.json" and treat a library as compatible with all 2020-07-25 11:51:47 +03:00
85f5a6a84a Bump version to 4.4.0a7 2020-07-24 21:00:58 +03:00
6ace5668b8 Update the registry publish endpoints 2020-07-24 20:57:18 +03:00
c193a4ceb7 Handle proxy environment variables in lower case // Resolve #3606 2020-07-23 19:07:29 +03:00
1abc110f8a Merge branch 'develop' of https://github.com/platformio/platformio-core into develop 2020-07-23 17:57:00 +03:00
73740aea89 Sync docs and examples 2020-07-23 17:56:41 +03:00
83110975fa Docs: Sync 2020-07-23 17:42:46 +03:00
881c5ea308 Remove unused code 2020-07-23 17:37:23 +03:00
22f1b94062 Bump version to 4.4.0a6 2020-07-21 12:42:26 +03:00
ea30d94324 Automatically enable LDF dependency chain+ mode (evaluates C/C++ Preprocessor conditional syntax) for Arduino library when “library.properties” has “depends” field // Resolve #3607 2020-07-21 12:41:38 +03:00
1ed462a29a PyLint fix 2020-07-16 01:00:38 +03:00
a2efd7f7c5 Bump version to 4.4.0a5 2020-07-15 23:18:07 +03:00
ca33058637 New commands for the registry package management (pack, publish, unpublish) 2020-07-15 23:16:46 +03:00
a6f143d1ca Dump data intended for IDE extensions/plugins using a new platformio project idedata command 2020-07-15 14:20:29 +03:00
1368fa4c3b Implement new fields (id, ownername, url, requirements) for PackageSpec API 2020-07-14 21:07:09 +03:00
cca3099d13 Ensure that module.json keywords are lowercased 2020-07-14 18:55:29 +03:00
368c66727b Fix issue with package packing when re-map is used and the manifest is missing in "include" (copy it now) 2020-07-12 22:39:32 +03:00
a688edbdf1 Fix an issue with manifest parser when "new_from_archive" API is used 2020-07-09 21:53:46 +03:00
e570aadd72 Docs: Sync 2020-07-09 17:17:34 +03:00
f85cf61d68 Revert max length of author name back to 50 chars 2020-07-08 23:23:14 +03:00
f27c71a0d4 Increase author name length to 100 chars for manifest 2020-07-08 22:56:14 +03:00
940682255d Lock Python's isort package to isort<5 2020-07-08 22:16:52 +03:00
a00722bef4 Ignore maintainer's broken email in library.properties manifest 2020-07-08 21:53:28 +03:00
84132d9459 Fix tests 2020-07-08 21:52:34 +03:00
42fd284560 Improve parsing "author" field of library.properties manifest 2020-07-08 20:21:10 +03:00
40d6847c96 Add option to pass a custom path where to save package archive 2020-07-08 13:46:36 +03:00
abd3f8b3b5 Docs: Remove legacy library dependency syntax for github 2020-07-07 22:53:01 +03:00
3c986ed681 Recursively remove .pio folders when packing a package 2020-07-07 16:28:51 +03:00
8b24b0f657 Sync docs & examples 2020-07-06 23:37:28 +03:00
0f8042eeb4 Implement PackagePacker.get_archive_name API 2020-07-06 15:57:49 +03:00
f97632202b Fix issue with KeyError 2020-07-06 15:57:10 +03:00
a79e933c37 Ignore author's broken email in a package manifest 2020-07-06 14:22:35 +03:00
ef53bcf601 Ignore empty fields in library.properties manifest 2020-07-06 14:17:00 +03:00
08a87f3a21 Do not allow [;.<>] chars for a package name 2020-07-03 19:14:58 +03:00
b3dabb221d Allow "+" in a package name 2020-07-03 16:07:36 +03:00
899a6734ee Add .ccls to .gitignore (vim and emacs) (#3576)
* Add .ccls to .gitignore (vim)

* Add .ccls to .gitignore (emacs)
2020-06-30 21:48:44 +03:00
7f48c8c14e Fix PyLint for PY 2.7 2020-06-30 15:06:40 +03:00
2c24e9eff6 Fall back to latin-1 encoding when failed with UTF-8 while parsing manifest 2020-06-30 14:28:37 +03:00
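A minimal sketch of the fallback strategy in the entry above for reading a manifest whose encoding is unknown (helper name hypothetical):

```python
import io

def read_manifest_text(path):
    # Try strict UTF-8 first; fall back to latin-1, which maps every byte to
    # a code point and therefore never raises UnicodeDecodeError.
    try:
        with io.open(path, encoding="utf-8") as fp:
            return fp.read()
    except UnicodeDecodeError:
        with io.open(path, encoding="latin-1") as fp:
            return fp.read()
```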
5cdca9d490 Optimize tests 2020-06-29 21:14:34 +03:00
1ac6c50334 Update multi-environment test for PIO test command 2020-06-29 20:52:15 +03:00
4cbad399f7 Remove mbed framework from several tests 2020-06-29 19:22:22 +03:00
2b8aebbdf9 Extend test for parsing package manifest when "system" is used as a list 2020-06-29 15:06:21 +03:00
e9a15b4e9b Parse package.json manifest keywords 2020-06-27 21:42:13 +03:00
dd18abcac3 Fix tests 2020-06-27 12:59:12 +03:00
b046f21e0d Fix "RuntimeError: dictionary keys changed during iteration" when parsing "library.json" dependencies 2020-06-27 12:46:04 +03:00
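The RuntimeError above is raised on Python 3 when a dict is mutated while it is being iterated. A sketch of the usual fix, iterating over a snapshot of the keys (the dependency shape shown is illustrative):

```python
def normalize_dependencies(deps):
    # Renaming keys in-place while iterating raises
    # "RuntimeError: dictionary keys changed during iteration" on Python 3,
    # so iterate over a copied key list instead.
    for name in list(deps.keys()):
        if "/" in name:  # e.g. "owner/LibraryName" -> "LibraryName"
            deps[name.split("/", 1)[1]] = deps.pop(name)
    return deps

print(normalize_dependencies({"bblanchon/ArduinoJson": "^6.0.0"}))
```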
29fb803be1 Enable PIO Core tests on Python 3.8 2020-06-27 12:36:57 +03:00
bc2eb0d79f Parse dev-platform keywords 2020-06-26 19:49:25 +03:00
0bec1f1585 Extend system info with "file system" and "locale" encodings 2020-06-26 18:38:17 +03:00
a1ec3e0a22 Remove "vendor_url" and "docs_url" from Platform API 2020-06-25 23:23:55 +03:00
7bc22353cc Docs: Sync dev-platforms 2020-06-25 18:04:04 +03:00
efc2242046 Remove empty data from board information 2020-06-25 14:51:53 +03:00
5dadb8749e Change slogan to "PlatformIO is a professional collaborative platform for embedded development" 2020-06-23 12:33:00 +03:00
82735dd571 Fixed an issue with improper processing of source files added via multiple Build Middlewares // Resolve #3531 2020-06-23 11:46:00 +03:00
9fb4cde2a5 Do not generate ".travis.yml" for a new project, let the user have a choice 2020-06-23 11:26:22 +03:00
164ae2bcbc Extend system info with Python and PIO Core executables // Issue #3521 2020-06-23 11:20:29 +03:00
a172a17c81 Bump version to 4.4.0a4 2020-06-22 23:09:28 +03:00
5ee90f4e61 Display system-wide information using platformio system info command // Resolve #3521 2020-06-22 23:04:36 +03:00
3aae791bee Change slogan to "collaborative platform" 2020-06-22 20:02:43 +03:00
9f05519ccd List available project targets with a new "platformio run --list-targets" command // Resolve #3544 2020-06-22 19:53:31 +03:00
f19491f909 Docs: Sync articles 2020-06-22 17:55:02 +03:00
967a856061 Do not allow ":" and "/" chars in a package name 2020-06-22 15:25:02 +03:00
87d5997b46 Add a test that ensures setUp and tearDown functions can be compiled 2020-06-22 14:42:45 +03:00
c20a1f24cd Don't print relative paths with double-dot 2020-06-18 20:36:59 +03:00
260c36727c fix pio access urn format 2020-06-17 23:56:22 +03:00
03d9965758 Replace urn with prn (#3565)
* Replace urn with prn

* fix

* fix text
2020-06-17 23:46:50 +03:00
e853d61e16 Add orgname filter for access list (#3564)
* add orgname filter for access list

* fix

* fix namings
2020-06-17 18:55:40 +03:00
42e8ea29ff CLI to manage access level on PlatformIO resources. Resolve #3534 (#3563) 2020-06-17 13:53:53 +03:00
1e90c821dc Disable package upload test (#3562) 2020-06-17 00:24:55 +03:00
cad0ae0113 Update slogan to "No more vendor lock-in!" 2020-06-16 15:06:04 +03:00
21f3dd11f4 Fix printing relative paths on Windows // Resolve #3542
Fixes "ValueError" when running "clean" target if "build_dir"
points to a folder on a different logical drive
2020-06-16 12:27:49 +03:00
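On Windows, `os.path.relpath()` raises ValueError when the two paths live on different drives, which is what broke the "clean" target above. A hedged sketch of the fallback:

```python
import os

def print_safe_path(path, base=None):
    # Prefer a short relative path, but fall back to the absolute path when
    # the target sits on a different logical drive (relpath raises ValueError).
    base = base or os.getcwd()
    try:
        return os.path.relpath(path, base)
    except ValueError:
        return os.path.abspath(path)
```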
a9c13aa20e Implement "ManifestParserFactory.new_from_archive" API 2020-06-15 22:05:59 +03:00
d3fd115743 Black format 2020-06-15 22:05:28 +03:00
df0e6016bb Handle possible NodeList in source files when processing Middlewares // Resolve #3531
env.Object() returns a list of objects that breaks the processing of
subsequent middlewares since we only expected File nodes.
2020-06-15 21:25:24 +03:00
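Since `env.Object()` may hand back a node list rather than a single File node, the middleware loop has to flatten its input before processing. A simplified sketch of that flattening, written without the SCons dependency:

```python
def iter_file_nodes(nodes):
    # env.Object() can return a NodeList (a sequence of nodes) instead of a
    # single File node; yield individual nodes either way.
    for node in nodes:
        if isinstance(node, (list, tuple)):
            for item in node:
                yield item
        else:
            yield node

# list(iter_file_nodes(["main.cpp", ["a.o", "b.o"]])) -> ["main.cpp", "a.o", "b.o"]
```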
cb70e51016 Update changelog for Custom Targets 2020-06-13 16:21:15 +03:00
cf2fa37e56 Bump version to 4.4.0a3 2020-06-13 13:18:54 +03:00
28d9f25f9a Added a new "-e, --environment" option to "platformio project init" command 2020-06-12 23:47:12 +03:00
fdb83c24be Clean autogenerated files before running tests // Resolve #3523
Fixes possible conflicts between auxiliary test transport files when
project contains multiple environments with different platforms
2020-06-11 23:53:52 +03:00
660b57cdd3 Update PIO Home front-end to 3.2.3 2020-06-11 21:16:06 +03:00
405dcda824 Feature/update account tests (#3556)
* update account tests

* change second user

* refactoring

* clean

* fix tests email receiving

* fix
2020-06-11 16:02:38 +03:00
266612bbdf Run CI on pull requests 2020-06-11 15:27:51 +03:00
2722e27415 Sync docs 2020-06-11 15:15:46 +03:00
f571ad9d47 Sync docs 2020-06-11 11:03:48 +03:00
ef8a9835b0 Bump version to 4.4.0a2 2020-06-10 14:26:48 +03:00
b71b939307 Rename "AddSystemTarget" to "AddPlatformTarget" 2020-06-10 14:25:53 +03:00
9e3ba11e8a skip account tests 2020-06-10 12:36:07 +03:00
91e9406304 cleaning 2020-06-10 12:22:28 +03:00
0d8272890c merge account, org and team tests into one file 2020-06-10 12:02:34 +03:00
a182cca5e9 tests fix (#3555)
* replace timestamp with randint in tests

* replace pop3 with imap
2020-06-10 11:07:19 +03:00
e6fbd6acf1 Remove debug code 2020-06-09 23:26:49 +03:00
062a82c89e Sync docs 2020-06-09 20:59:23 +03:00
89cc6f9bf3 Bump version to 4.4.0a1 2020-06-09 18:44:49 +03:00
3c8e0b17a7 Added support for custom targets 2020-06-09 18:43:50 +03:00
e0023bb908 increase tests email receiving time 2020-06-09 17:05:11 +03:00
a5547491ed Add account and org destroy commands. Fix tests (#3552)
* Add account and org destroy commands. Fix tests

* fix tests

* fix

* fix texts
2020-06-09 15:50:37 +03:00
78546e9246 Docs: Add "TensorFlow, Meet The ESP32" to articles list 2020-06-08 19:26:48 +03:00
7457ef043b Docs: Sync ASR Micro dev-platform 2020-06-08 12:00:19 +03:00
e0e97a3629 Cache the latest news in PIO Home for 180 days 2020-06-05 18:29:11 +03:00
f5e6820903 Bump version to 4.3.5a2 2020-06-05 14:18:24 +03:00
27fd3b0b14 Improve detecting if PlatformIO Core is run in container 2020-06-05 14:17:19 +03:00
ced244d30a Sync docs 2020-06-05 11:30:15 +03:00
6fa7cb4af5 Add new dev-platform "ASR Microelectronics ASR605x" 2020-06-04 22:59:05 +03:00
94cb808285 CLI to manage teams. Resolve #3533 (#3547)
* CLI to manage teams.Minor fixes. Resolve #3533

* fix teams tests

* disable org and team tests

* minor fixes. fix error texts

* fix split compatibility
2020-06-04 19:31:30 +03:00
42df3c9c3f Sync docs 2020-06-04 15:27:46 +03:00
0c4c113b0a Fix account show command when PLATFORMIO_AUTH_TOKEN is used 2020-06-04 14:09:42 +03:00
3c1b08daab Ignore empty PLATFORMIO_AUTH_TOKEN 2020-06-04 13:57:56 +03:00
d7f4eb5955 Minor grammar fix 2020-06-03 22:40:37 +03:00
87b5fbd237 More cosmetic changes to Org CLI 2020-06-03 22:34:37 +03:00
6c97cc6192 Cosmetic changes to Org CLI 2020-06-03 22:22:13 +03:00
cbcd3f7c4d Fix cmd.org test 2020-06-03 21:40:03 +03:00
f7dceb782c Fix PY2.7 when PermissionError is not available 2020-06-03 21:24:01 +03:00
140fff9c23 CLI to manage organizations. Resolve #3532 (#3540)
* CLI to manage organizations. Resolve #3532

* fix tests

* fix test

* add org owner test

* fix org test

* fix invalid username/orgname error text

* refactor auth request in clients

* fix

* fix send auth request

* fix regexp

* remove duplicated code. minor fixes.

* Remove space

Co-authored-by: Ivan Kravets <me@ikravets.com>
2020-06-03 17:41:30 +03:00
8c586dc360 Sync docs 2020-06-03 17:16:59 +03:00
fe52f60389 Bypass PermissionError when cleaning the cache 2020-06-03 14:33:53 +03:00
9064fcbc77 Sync docs 2020-06-03 14:33:03 +03:00
9a1d2970cc Sync docs 2020-05-30 01:10:04 +03:00
26ba6e4756 Add new option to package publishing CLI which allows disabling email notification 2020-05-28 17:06:36 +03:00
ae58cc74bd Rename checksum header to X-PIO-Content-SHA256 2020-05-28 16:08:37 +03:00
37e795d539 Send package checksum when publishing 2020-05-28 16:07:20 +03:00
49960b257d Implement fs.calculate_file_hashsum 2020-05-28 16:07:02 +03:00
25a421402b fix package type detector 2020-05-28 12:49:32 +03:00
8e72c48319 fix datetime validation in package publish command 2020-05-27 22:30:16 +03:00
c1965b607b Add binary stream to package publishing request 2020-05-27 17:27:05 +03:00
d38f5aca5c Fix metavar for package CLI 2020-05-27 16:20:02 +03:00
c06859aa9f Add package type to unpublish command 2020-05-27 14:30:27 +03:00
e706a2cfe2 Refactor pio account client. Resolve #3525 (#3529) 2020-05-27 13:39:58 +03:00
0c301b2f5d Fix order of arguments 2020-05-27 01:14:07 +03:00
deb12972fb Implement "package unpublish" CLI 2020-05-27 01:10:35 +03:00
8346b9822d Implement "package pack" command 2020-05-26 22:17:55 +03:00
19cdc7d34a Initial support for package publishing into the registry 2020-05-26 22:01:32 +03:00
49cc5d606b Sync docs 2020-05-26 21:58:58 +03:00
58470e8911 PY2 lint fix 2020-05-26 14:30:43 +03:00
38699cca8f Bump version to 4.3.5a1 2020-05-26 14:26:42 +03:00
0eb8895959 Add support for “globstar/**” (recursive) pattern 2020-05-26 14:25:28 +03:00
99d4e0c390 Merge branch 'release/v4.3.4' 2020-05-23 20:35:59 +03:00
6d32aeb310 Merge tag 'v4.3.4' into develop
Bump version to 4.3.4
2020-05-23 20:35:59 +03:00
8184308755 Bump version to 4.3.4 2020-05-23 20:33:13 +03:00
b68953b733 Bump version to 4.3.4b1 2020-05-23 20:01:25 +03:00
7dce494ad6 Rename "misc" command to "system", do not append completion code for Fish shell // Resolve #3435 2020-05-23 20:00:56 +03:00
4921bf8b6a PyLint fix 2020-05-22 14:22:41 +03:00
32cb0d6e4d Handle possible issue on Python 2.x when writing to thread buffer
The problem happens when the value has type "unicode", which shouldn't be decoded
2020-05-22 14:17:17 +03:00
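A sketch of the guard implied by the entry above: on Python 2 the incoming value may already be `unicode`, so only raw byte strings should be decoded (class and names are illustrative):

```python
class ThreadBuffer(object):
    def __init__(self):
        self.chunks = []

    def write(self, value):
        # Decode only raw byte strings; a value that is already text
        # (Python 2 "unicode" / Python 3 "str") must not be decoded again.
        if isinstance(value, bytes):
            value = value.decode("utf-8", "replace")
        self.chunks.append(value)
```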
e2c5a3c498 Add Python 3.8 for Tox 2020-05-22 14:12:27 +03:00
ec34a65cff Bump version to 4.3.4a5 2020-05-21 15:40:38 +03:00
9296615dbf Merge branch 'develop' of https://github.com/platformio/platformio-core into develop 2020-05-21 15:39:59 +03:00
56795940b9 Sync teensy dev-platform 2020-05-21 15:39:24 +03:00
09a5952248 Add new record to history log
Mention issues about permission error on Windows when cloning
package from Git repository
2020-05-20 21:51:13 +03:00
735435306d Copy and remove cloned package instead of moving // Resolve #2844, Resolve #3328
On Windows, it’s not possible to move a file which is used by another
process (e.g. Git extension in VSCode)
2020-05-20 21:32:55 +03:00
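A minimal sketch of the copy-then-remove approach from the entry above, preserving symlinks as the adjacent copytree entry notes; paths and the helper name are illustrative:

```python
import shutil

def install_from_tmp(tmp_pkg_dir, dst_dir):
    # On Windows a plain move can fail with PermissionError if another
    # process (e.g. the VSCode Git extension) still holds files open,
    # so copy the tree first and only then remove the temporary clone.
    shutil.copytree(tmp_pkg_dir, dst_dir, symlinks=True)
    shutil.rmtree(tmp_pkg_dir, ignore_errors=True)
    return dst_dir
```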
bdd57bf356 Ensure that copytree preserves symlinks 2020-05-20 20:57:55 +03:00
8840b28968 Handle possible issue on Python 2.x when writing to thread buffer
The problem happens when the value has type "unicode", which shouldn't be decoded
2020-05-20 17:04:50 +03:00
e31591a35e Print warning about an issue with mapped network drives on Windows // Issue #3417
Starting with Python 3.8, paths to mapped network drives are resolved
to their real path in the system, e.g. "Z:\path" becomes "\\path", which
causes weird errors in the default terminal with a message that UNC
paths are not supported
2020-05-19 22:37:05 +03:00
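A hedged sketch of the detection behind that warning, based on the Python 3.8 `os.path.realpath()` behavior described in the entry above:

```python
import os

def resolves_to_unc(project_dir):
    # On Python 3.8+ (Windows), realpath() can turn a mapped drive such as
    # "Z:\\project" into its UNC form "\\\\server\\share\\project"; warn the
    # user because the default terminal refuses to work with UNC paths.
    real = os.path.realpath(project_dir)
    return real.startswith("\\\\") and not project_dir.startswith("\\\\")
```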
457a218723 Sync docs 2020-05-19 13:27:54 +03:00
9724660dda Update SPDX licenses to 3.9 2020-05-19 13:27:44 +03:00
eac6c1c552 Handle error when internet is offline. Resolve #3503 (#3505)
* Handle error when internet is offline.

* Fix

* minor fix
2020-05-17 22:44:27 +03:00
54d73e834b Github Actions: Checkout submodules recursive 2020-05-14 18:08:51 +03:00
099e3c7198 Use original MongoDB license for "compilation_db.py" 2020-05-13 00:48:11 +03:00
96a68c6b14 Docs: Sync Atmel AVR dev-platform 2020-05-11 22:10:32 +03:00
2a0a1247e3 Revert "Add initialization config for new simavr tool"
This reverts commit 16966a4957.
2020-05-11 18:21:48 +03:00
7555d66748 Revert "Add special debug port for simavr tool"
This reverts commit 7b43444d81.
2020-05-11 18:21:26 +03:00
c76940f7ce PyLint fix 2020-05-10 19:11:06 +03:00
b2ed027bc3 Bump version to 4.3.4a4 2020-05-10 18:36:16 +03:00
01a1981ca1 Added PlatformIO CLI Shell Completion for Fish, Zsh, Bash, and PowerShell // Resolve #3435 2020-05-10 18:35:50 +03:00
03228c528e Bump version to 4.3.4a3 2020-05-09 16:36:05 +03:00
ac510c1553 fix summary caching (#3500) 2020-05-09 16:35:21 +03:00
a1ff5e1a4f Save summary data to local session. (#3497)
* Save summary data to local session.

* naming

* fix account summary test

* add ttl for summary cache

* refactoring get_account_info

* fix
2020-05-08 21:34:52 +03:00
7b43444d81 Add special debug port for simavr tool 2020-05-08 12:31:48 +03:00
16966a4957 Add initialization config for new simavr tool 2020-05-08 01:14:15 +03:00
7e7a6d7807 Skip account tests if env variables not presented (#3494)
* added skip if env variables not presented. fix exception texts

* fix texts

* fix texts
2020-05-06 19:25:24 +03:00
f32dbeeb6d Bump version to 4.3.4a2 2020-05-06 12:30:41 +03:00
f78ffaded0 Remove local PIO Account session from PIO Remote when token is broken 2020-05-06 12:29:01 +03:00
8480ebde89 Rename "authenticating" to "authorizing" wording for PIO Account 2020-05-06 12:27:29 +03:00
44ee7d6a6b Docs: Sync ESP8266 dev-platform 2020-05-04 15:50:11 +03:00
2ab47b7968 Remove account state item if refreshing token failed (#3487) 2020-05-04 13:14:52 +03:00
7181b7632b Docs: Increase content width 2020-05-03 11:06:26 +03:00
b82eaca45e Enable caching for PIP when building contrib-pysite 2020-05-02 15:29:24 +03:00
fd04f31c5f Update link to CLA provider 2020-05-01 23:37:38 +03:00
75abe8a0af Sync docs 2020-05-01 23:36:29 +03:00
f7995ce49a PyLint fix 2020-04-29 12:56:54 +03:00
2c2309acac Bump version to 4.3.4a1 2020-04-29 12:41:56 +03:00
5f79ab34f5 Automatically build `contrib-pysite` package on a target machine when pre-built package is not compatible // Resolve #3482 2020-04-29 12:40:04 +03:00
5bcbee7423 Merge branch 'release/v4.3.3' 2020-04-28 18:06:53 +03:00
961049cf9b Merge tag 'v4.3.3' into develop
Bump version to 4.3.3
2020-04-28 18:06:53 +03:00
72e7492a78 Bump version to 4.3.3 2020-04-28 18:06:46 +03:00
5e4b4bbacd Fix "UnicodeDecodeError: 'utf-8' codec can't decode byte" when non-Latin chars are used in project path // Resolve #3481 2020-04-28 18:05:08 +03:00
a64d368de2 Merge branch 'release/v4.3.2' 2020-04-28 13:27:23 +03:00
6146b58520 Merge tag 'v4.3.2' into develop
Bump version to 4.3.2
2020-04-28 13:27:23 +03:00
f35e6e99af Bump version to 4.3.2 2020-04-28 13:25:52 +03:00
5d8440fdd1 PyLint fixes 2020-04-28 12:48:15 +03:00
d1b394b20a Bump version to 4.3.2rc2 2020-04-27 23:42:22 +03:00
520d6decac Nominate some exceptions to UserSideException 2020-04-27 23:42:02 +03:00
4a251f0ab0 Fix JSONDecodeError when bottle.SimpleTemplate is used 2020-04-27 23:41:36 +03:00
c215abb50c Bump version to 4.3.2rc1 2020-04-26 19:46:33 +03:00
31ca47837d Disable build cache for Github Actions 2020-04-26 12:58:32 +03:00
560699fc6b Apply formatting 2020-04-26 12:58:05 +03:00
51ec94f78c Add new test for PIO Check with --skip-packages option 2020-04-26 01:38:25 +03:00
ac1210fbea Add -imacros files to forcedInclude field in VSCode template 2020-04-26 00:35:22 +03:00
c03f93521b Refactor PIO Check feature (#3478)
* Add new option --skip-packages for check command

Check tools might fail if they're not able to preprocess source files
(for example, Cppcheck uses a custom preprocessor that is not able to
parse complex preprocessor code in the Zephyr framework). Instead, users
can specify this option to skip headers included from packages and only
check project sources.

* Fix toolchain built-in include paths order
C++ and fixed directories should have higher priority

* Refactor check feature

The main purpose is to prepare a more comprehensive build environment.
It's crucial for cppcheck to be able to check complex frameworks like
zephyr, esp-idf, etc. Also detect a special case when cppcheck fails to check
the entire project (e.g. a syntax error due to custom preprocessor)

* Add new test for check feature

Tests the ststm32 platform with all tools and the main frameworks

* Update check tools to the latest available versions

* Test check tools and Zephyr framework only with Python 3

* Tidy up code

* Add history entry
2020-04-26 00:10:41 +03:00
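A simplified sketch of what `--skip-packages` in the PR above implies: include paths that live under the PlatformIO packages directory are filtered out before the check tool is invoked (directory names are illustrative):

```python
import os

def filter_project_includes(include_dirs, packages_dir):
    # Keep only include paths that are NOT inside the packages directory, so
    # tools such as Cppcheck never try to preprocess framework headers.
    packages_dir = os.path.abspath(packages_dir)
    return [
        inc for inc in include_dirs
        if not os.path.abspath(inc).startswith(packages_dir + os.sep)
    ]

print(filter_project_includes(
    ["include", "/home/user/.platformio/packages/framework-zephyr/include"],
    "/home/user/.platformio/packages"))
```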
62ede23b0e Bump version to 4.3.2b1 2020-04-25 15:48:55 +03:00
60f28599d9 Echo what is typed when "send_on_enter" device monitor filter is used // Resolve #3452 2020-04-25 15:48:37 +03:00
629f23c4f3 Bump version to 4.3.2a4 2020-04-25 13:20:56 +03:00
27db344739 Bump version to 4.3.2a3 2020-04-25 13:15:55 +03:00
777a47fd99 Minor improvements 2020-04-25 13:14:54 +03:00
e913159cb4 Sync docs 2020-04-24 21:48:57 +03:00
b285c3137a Extend remote hosts with PlatformIO when checking internet connection 2020-04-24 16:25:02 +03:00
f0576ddcd9 Docs: Fix incorrect type for library.json "libCompatMode" field 2020-04-24 13:14:03 +03:00
6e2cc333f2 disable pio account change password and username update tests 2020-04-24 11:57:17 +03:00
18c7c5a9be Refactor pio account tests. (#3473) 2020-04-24 11:25:09 +03:00
01945716d3 Sync docs 2020-04-24 00:43:32 +03:00
2f5b231dc3 Disable pio account tests (#3472)
* minor fix pio account test

* disable pio account change password and username update tests
2020-04-24 00:13:46 +03:00
75c1aafaef fix pio account tests 2020-04-23 20:45:51 +03:00
b9714d0ac1 Add pio account tests (#3470)
* add pio account tests

* update tests
2020-04-23 16:05:00 +03:00
5774654582 Switch to Github Actions (#3471) 2020-04-23 16:04:15 +03:00
0a46b8ab6a add login with code method for account client. add new account rpc handler. (#3468) 2020-04-21 22:44:32 +03:00
a556573a4f Move env dependent directories to appropriate CMAKE_BUILD_TYPE // Issue #3460
This will allow dynamically populating the list of sources depending on
the selected environment. At the same time, the "src" and "lib" folders
remain common for all environments
2020-04-21 22:01:07 +03:00
fd91819b2c Fix missing include paths for check tools
Includes are now split by scopes and imported as a dictionary
2020-04-21 19:26:00 +03:00
24c04057e9 CLion: Add paths to libraries specified via lib_extra_dirs option (#3463)
* Add paths to libraries specified via lib_extra_dirs option

Besides, global folders in SRC_LIST seem a bit unnecessary
since there might be unused libraries in these folders

* Refactor processing of includes when exporting IDE/Editor projects

Split includes according to their source. That will help export includes in a more flexible way.
For example some IDEs don't need include paths from toolchains

* Add new record to history log

* Typo fix
2020-04-21 17:37:55 +03:00
2960b73da5 Fix an issue when PIO Remote agent was not reconnected automatically 2020-04-21 12:32:03 +03:00
c4645a9a96 Sync docs 2020-04-21 10:43:34 +03:00
877e84ea1d Bump version to 4.3.2a2 2020-04-19 20:05:21 +03:00
cb1058c693 New PIO Account with "username" and profile support 2020-04-19 20:03:46 +03:00
be6bf5052e Open source PIO Remote client 2020-04-19 19:26:56 +03:00
7780003d01 New Account Management System (#3443)
* add login for PIO account to account cli

* Remove PyJWT lib. Fixes.

* Add password change for account

* Refactoring. Add Account Client.

* Fixes.

* http -> https.

* adding error handling for expired session.

* Change broker requests from json to form-data.

* Add pio account register command. Fixes

* Fixes.

* Fixes.

* Add username and password validation

* fixes

* Add token, forgot commands to pio account

* fix domain

* add update command for pio account

* fixes

* refactor profile update output

* lint

* Update exception text.

* Fix logout

* Add custom user-agent for pio account

* add profile show command. minor fixes.

* Fix pio account show output format.

* Move account related exceptions

* cleaning

* minor fix

* Remove try except for account command authenticated/non-authenticated errors

* fix profile update cli command

* rename first name and last name vars to 'firstname' and 'lastname'
2020-04-19 19:06:06 +03:00
445ca937fd Sync docs 2020-04-18 22:40:27 +03:00
1f4aff7f27 Sync docs 2020-04-16 16:07:02 +03:00
788351a0cd Fixed an incorrect node path used for pattern matching when processing middleware nodes 2020-04-13 16:41:03 +03:00
5ba7753bfa Sync docs 2020-04-12 17:43:27 +03:00
ae57829190 Generate user agent based on PIO Core environment 2020-04-10 17:59:58 +03:00
030ddf4ea1 Apply black formatting 2020-04-10 17:08:16 +03:00
ccc43633b7 Support for a new dev-platform NXP i.MX RT 2020-04-10 13:14:10 +03:00
aba2ea9746 Temporarily disable "infineonxmc" from CI due to a broken dev-platform 2020-04-09 12:44:52 +03:00
d5ebbb99a7 Dynamically choose extension for file with unit test transports (#3454)
C file should be used by default as only Arduino and mbed require C++ files.
There might be a lot of legacy projects so custom transport is also set to use C++.
2020-04-09 12:02:38 +03:00
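A sketch of the selection rule described in the entry above; the helper and its inputs are hypothetical:

```python
def test_transport_file_ext(framework, custom_transport=False):
    # Plain C is the default; Arduino and mbed transports need C++, and a
    # custom transport stays C++ to keep legacy projects building.
    if custom_transport or framework in ("arduino", "mbed"):
        return ".cpp"
    return ".c"

print(test_transport_file_ext("espidf"))        # .c
print(test_transport_file_ext("arduino"))       # .cpp
print(test_transport_file_ext("espidf", True))  # .cpp
```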
a636a60e00 Sort examples 2020-04-08 22:34:04 +03:00
ad7e3f83aa Fix tests/commands/test_init.py 2020-04-08 17:18:59 +03:00
baa7aab1d7 Specify C++ as the language for .ino files when preprocessing them for PVS-Studio // Resolve #3450 2020-04-07 11:35:17 +03:00
2e320c01b3 Fix test 2020-04-06 18:19:34 +03:00
3cd6c618a4 Docs: Sync STM32 dev-platform 2020-04-06 16:56:05 +03:00
5ea759bc3e Document PIO Core: Integration with custom applications (extensions, plugins) 2020-04-05 19:48:50 +03:00
11cb3a1bf7 Docs: Add info about stm32pio tool for STM32Cube framework 2020-04-04 00:45:45 +03:00
7412cf586b Update docs for Zephyr RTOS 2.2 2020-04-02 00:59:35 +03:00
f976cf7ae5 Docs: Extend tutorials list 2020-03-30 17:15:18 +03:00
e92b498b68 Fixed an issue when saving libraries in new project results in error "No option 'lib_deps' in section" // Resolve #3442 2020-03-27 13:34:14 +02:00
1b0810ec87 Docs: Fix broken link for creating dev-platform // Resolve #123 2020-03-26 22:31:15 +02:00
45e523a468 Docs: Sync with Atmel SAM dev-platform 2020-03-25 17:01:12 +02:00
d42481d196 Sync docs 2020-03-24 18:03:23 +02:00
11c946bfe4 Sync Espressif 32 dev-platform 2020-03-23 19:52:57 +02:00
589d6f9e12 Docs: Sync Espressif 32 dev-platform 2020-03-23 19:35:18 +02:00
79b3a232fc Move debug client and server implementations to "process" folder 2020-03-21 22:00:14 +02:00
f95230b86e Fixed UnicodeDecodeError on Windows when network drive (NAS) is used // Resolve #3417 2020-03-21 21:53:42 +02:00
fc9a16aa81 Merge branch 'feature/issue-3417-unicodeerror-nas' into develop 2020-03-21 21:44:13 +02:00
81a4d28918 Docs: Remove duplicate demo image of PlatformIO for CLion 2020-03-21 16:39:17 +02:00
fd137fe054 Bump version to 4.3.2a1 2020-03-21 13:28:31 +02:00
efd3b244e1 Force PIPE reader to UTF-8 on Windows // Issue #3417 2020-03-21 13:27:46 +02:00
dbeaaf270c fix typo in URL (#3432) 2020-03-21 00:02:50 +02:00
32642b7ec8 Fix broken link to Renode in history 2020-03-20 17:16:49 +02:00
096c2f6165 Typo fix in docs 2020-03-20 17:11:31 +02:00
91ae8b4cc7 Fixed typo in history 2020-03-20 15:15:30 +02:00
cc52890d45 Merge branch 'release/v4.3.1' 2020-03-20 15:13:46 +02:00
5a12f1f56e Merge tag 'v4.3.1' into develop
Bump version to 4.3.1
2020-03-20 15:13:46 +02:00
b7b9ee5a80 Bump version to 4.3.1 2020-03-20 15:13:40 +02:00
97a0cbdd18 Skip Click 7.1 and 7.1.1 on Windows due to broken releases 2020-03-20 15:11:14 +02:00
b8f43732fe Docs: update What's PlatformIO and PIO IDE pages 2020-03-20 14:44:24 +02:00
658b3df123 Fixed a TypeError "super(type, obj): obj must be an instance or subtype of type" when device monitor is used with a custom dev-platform filter // Resolve #3431 2020-03-20 13:56:30 +02:00
d32312e738 Fixed an issue when lib_archive = no was not honored in "platformio.ini" 2020-03-20 13:34:35 +02:00
20023f8d8a Bump version to 4.3.1a1 2020-03-20 13:02:11 +02:00
6b2ff04bbf Fixed an error "SyntaxError: 'return' with argument inside generator" for PIO Unified Debugger when Python 2.7 is used 2020-03-20 13:01:33 +02:00
d80a9c820d Merge branch 'release/v4.3.0' 2020-03-19 22:38:05 +02:00
4b62af1675 Merge tag 'v4.3.0' into develop
Bump version to 4.3.0
2020-03-19 22:38:05 +02:00
6414e1d9e3 Bump version to 4.3.0 2020-03-19 22:37:16 +02:00
a55f04dc28 Warn when a socket can't be allocated for PIO Home 2020-03-19 22:36:55 +02:00
2d68e28a70 Fix auto-ready logic for debugging server 2020-03-19 21:33:23 +02:00
4c2a157dce Bump version to 4.3.0rc1 2020-03-19 19:28:13 +02:00
d9647dec95 Add support for debugging server "ready_pattern" 2020-03-19 19:17:54 +02:00
15647c81f0 New standalone (1-script) PlatformIO Core Installer 2020-03-19 18:26:30 +02:00
24a0d9123e Update history with initial support for Renode 2020-03-19 17:04:05 +02:00
720c29350d Add docs for Renode debugging tool // Issue #3401 2020-03-19 16:58:18 +02:00
aa939b07b1 Update default init config for Renode 2020-03-19 16:17:51 +02:00
0e3c3abf73 GDB init commands for Renode simulation framework // Issue #3401 2020-03-19 15:16:55 +02:00
a8606f4efa Refactor debug GDB initial configurations 2020-03-19 14:49:25 +02:00
475f898222 Replace installer script with a new one // Resolve #3420 (#3428)
* Replace installer script with a new one. Resolve #3420

* temp file name fix

* get-platformio.py script update.

* small fix
2020-03-19 13:26:51 +02:00
69f5fdf8e1 Remove debug code 2020-03-19 01:05:12 +02:00
fe1ad35cad Merge branch 'feature/issue-3401-renode-support' into develop 2020-03-19 00:50:21 +02:00
352a0b7377 Wait for an output from debug server 2020-03-19 00:46:23 +02:00
52689bc5e8 Wait until debug server is ready 2020-03-19 00:19:59 +02:00
3dd3ea1c35 Show a hexadecimal representation of the data (code point of each character) with `hexlify` filter 2020-03-18 18:55:54 +02:00
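A rough sketch of what the `hexlify` filter in the entry above does to incoming data; the class shape only loosely mirrors the real Device Monitor Filter API:

```python
class HexlifyFilter(object):
    NAME = "hexlify"

    def rx(self, text):
        # Replace each incoming character with the hex value of its code point
        return " ".join("%02X" % ord(c) for c in text) + " "

    def tx(self, text):
        return text  # outgoing data is passed through unchanged

print(HexlifyFilter().rx("OK\r\n"))  # "4F 4B 0D 0A "
```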
fff33d8c29 Do not send CR+NL for "send_on_enter" device monitor filter 2020-03-18 17:25:40 +02:00
db9829a11e Sync docs 2020-03-18 00:36:07 +02:00
9a1b5d869d Bump version to 4.3.0b2 2020-03-18 00:13:03 +02:00
605cd36e27 Send a text to device on ENTER with `send_on_enter` filter // Resolve #926 2020-03-18 00:09:40 +02:00
24a23b67dd Fix formatting issue 2020-03-17 23:10:06 +02:00
0df72411a0 Device Monitor Filter API, implement "time" and "log2file" filters // Resolve #981 Resolve #670 2020-03-17 23:08:57 +02:00
5a72033622 Fixed an issue when unknown transport is used for PIO Unit Testing // Resolve #3422 2020-03-17 17:42:54 +02:00
4e6095ca13 Update docs and history 2020-03-17 17:39:11 +02:00
f81b0b2a84 Ensure all commands in compilation_commands.json use absolute paths. (#3415)
* Fix resolving of absolute path for toolchain

By placing the `where_is_program` call into this function, all references to the compiler will be made absolute, instead of just ones in the top environment. Previously, all references to the compiler for user source code would not use the full path in the compilation database, which broke `clangd`'s detection of system includes.

* Linting issue
2020-03-17 16:30:28 +02:00
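A hedged sketch of the resolution step the PR above describes: the compiler name from the build environment is turned into an absolute path before each compilation-database entry is written. The helper mirrors the `where_is_program` call mentioned in the PR and assumes a SCons environment:

```python
import os

def where_is_program(env, program):
    # Look the program up on the build environment's PATH so that every
    # compile_commands.json entry references an absolute compiler path,
    # which clangd needs in order to locate system includes.
    found = env.WhereIs(program, env["ENV"].get("PATH"))
    return os.path.abspath(found) if found else program
```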
314f634e16 Docs: Improvements for CLion docs. 2020-03-15 00:41:16 +02:00
ba040ba2ba Docs: Workaround for ReadTheDocs bug 2020-03-14 20:32:42 +02:00
a22ed40256 Added initial support for an official "PlatformIO for CLion IDE" plugin // Resolve #2201 2020-03-14 19:31:00 +02:00
58a4ff8246 Skip broken Click 7.1 & 7.1.1, see Click's issue #1501 2020-03-14 12:18:00 +02:00
9a5ebfb642 Bump version to 4.3.0b1 2020-03-12 15:10:25 +02:00
5d0faaa5a8 Refactor docs structure 2020-03-12 15:09:20 +02:00
108b892e30 Control device monitor output with filters and text transformations 2020-03-12 14:28:54 +02:00
0ff37c9999 Implement universal "get_object_members" helper 2020-03-12 14:24:20 +02:00
8c3de609ab Add ESP crash trace decoding to monitor (#3383)
* Implement mechanism for adding platform filters into miniterm

Updates platformio/platform-espressif8266#31

* DeviceMonitorFilter: fixes for Windows and Python2
2020-03-11 13:22:01 +02:00
073efef2a1 Explicitly use Python-x64 with Appveyor CI 2020-03-10 15:54:01 +02:00
b9fd97dae4 Changes required for CLion PlatformIO plugin (#3298) 2020-03-09 15:47:41 +02:00
60a7af6a8c Docs: Update recent articles 2020-03-09 14:58:35 +02:00
0f02b3b653 Improved support for Arduino "library.properties" `depends` field 2020-03-07 17:44:28 +02:00
620335631f Bump version to 4.2.2b1 2020-03-06 22:08:38 +02:00
3ef96cb215 Minor fixes 2020-03-06 00:43:57 +02:00
59e1c88726 Fixed an issue when `"libArchive": false` in "library.json" does not work // Resolve #3403 2020-03-06 00:37:48 +02:00
3a27fbc883 Fixed an issue when Python 2 does not keep encoding when converting .INO file // Resolve #3393 2020-03-05 23:52:46 +02:00
ce6b96ea84 Use native open/io.open for file contents reading/writing 2020-03-05 23:52:13 +02:00
3275bb59bf Fix test 2020-03-04 18:14:51 +02:00
fbb62fa8a6 Bump version to 4.2.2a3 2020-03-03 23:10:54 +02:00
261c46d4ef Add support for Arm Mbed "module.json" `dependencies` field // Resolve #3400 2020-03-03 23:10:19 +02:00
0c0ceb2caa Sync docs 2020-03-03 23:03:14 +02:00
de60f20c21 Sync docs 2020-03-03 14:59:03 +02:00
314fe7d309 Initial support for menuconfig target 2020-03-03 00:58:07 +02:00
a271143c52 Sync docs 2020-03-02 23:25:28 +02:00
2d4a3db250 Fixed an issue with expanding $WORKSPACE_DIR for library manager 2020-02-29 23:08:08 +02:00
7fba6f78d6 Bump version to 4.2.2a2 2020-02-29 21:59:58 +02:00
eee12b9b66 Fixed an issue "the JSON object must be str, not 'bytes'" when PIO Home is used with Python 3.5 // Resolve #3396 2020-02-29 21:59:10 +02:00
d3e151feeb Sync docs 2020-02-29 18:44:37 +02:00
dd1fe74956 PyLint fix 2020-02-21 15:44:55 +02:00
49aed34325 Rename PIO Plus to Professional 2020-02-21 15:44:24 +02:00
81ba2a5a74 Sync docs 2020-02-20 18:22:12 +02:00
1c87f83463 Parse package dependencies declared as a list of strings 2020-02-18 21:55:01 +02:00
e15f227c48 Docs: Sync Atmel SAM dev-platform 2020-02-18 14:45:54 +02:00
ea5f2742f8 Bump version to 4.2.2a1 2020-02-18 00:05:20 +02:00
9fd0943b75 Fixed an issue when quitting from PlatformIO IDE does not shutdown PIO Home server 2020-02-18 00:03:23 +02:00
80acd52fc2 Merge branch 'release/v4.2.1' 2020-02-17 14:25:27 +02:00
b8312d545c Merge tag 'v4.2.1' into develop
Bump version to 4.2.1
2020-02-17 14:25:27 +02:00
82f36a1ac3 Bump version to 4.2.1 2020-02-17 14:25:20 +02:00
9f7c827572 Resolve absolute path of toolchain when generating compilation database 2020-02-17 13:52:25 +02:00
6328206e78 Fixed an issue when generating of compilation database "compile_commands.json" does not work with Python 2.7 // Resolve #3378 2020-02-17 13:05:01 +02:00
154be7fa81 Improve VSCode template structure (#3385)
* Switch to click argument parser

* Typo fix

* Tidy up VSCode template

Co-authored-by: Ivan Kravets <me@ikravets.com>
2020-02-17 12:19:00 +02:00
b8c9eee8af Force docs to HTTPS 2020-02-16 21:25:30 +02:00
43664672fc Bump version to 4.2.1a3 2020-02-14 22:59:16 +02:00
5cc9a328ab Fixed an issue when Library Dependency Finder (LDF) ignores custom "libLDFMode" and "libCompatMode" options in library.json 2020-02-14 22:57:51 +02:00
6556c37e58 Bump version to 4.2.1a2 2020-02-14 20:57:40 +02:00
292049199a Add new record to history log 2020-02-14 17:24:52 +02:00
ed4452b115 Get rid of direct imports 2020-02-14 17:09:48 +02:00
fbfbf340c1 Add "forceInclude" field to VSCode template
VSCode doesn't recognize header files included via "-include" flag in "compilerArgs" field.
Instead, absolute paths to these files should be specified in a special section "forceInclude".
2020-02-14 16:43:20 +02:00
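A sketch of the extraction step implied by the entry above: `-include` flags are pulled out of the compiler arguments and re-emitted as absolute paths for the template's forced-include list (helper name illustrative):

```python
import os

def collect_forced_includes(cc_flags, project_dir):
    # cpptools ignores "-include header.h" inside "compilerArgs", so the
    # headers are listed separately as absolute paths instead.
    forced = []
    flags = iter(cc_flags)
    for flag in flags:
        if flag == "-include":
            header = next(flags, None)
            if header:
                forced.append(os.path.abspath(os.path.join(project_dir, header)))
    return forced

print(collect_forced_includes(["-Os", "-include", "mbed_config.h"], "/tmp/proj"))
```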
a57ea79bf8 Froze "marshmallow" dependency to 2.X for Python 2 // Resolve #3380 2020-02-14 13:49:41 +02:00
22e8e02f3d Automatically rebuild contrib-pysite package when import fails // Resolve #3313 2020-02-13 22:06:46 +02:00
a10625a052 Automatically rebuild contrib-pysite package when import fails // Issue #3313 2020-02-13 15:53:42 +02:00
42020e2498 Bump version to 4.2.1a1 2020-02-13 13:35:38 +02:00
36a2228220 Fixed "TypeError: unsupported operand type(s)" when system environment variable is used by project configuration parser // Resolve #3377 2020-02-13 13:34:34 +02:00
206054b35f Docs: Sync dev-platforms 2020-02-12 23:58:39 +02:00
b87048020d Add warning that overriding board data will not work for device monitor command // Issue #3349 2020-02-12 21:08:08 +02:00
f7d4bf5fa8 Merge branch 'release/v4.2.0' 2020-02-12 16:42:13 +02:00
f0bf531e1b Merge tag 'v4.2.0' into develop
Bump version to 4.2.0
2020-02-12 16:42:13 +02:00
176cf17f9f Bump version to 4.2.0 2020-02-12 16:42:06 +02:00
b41262a20e Fix broken "init" command 2020-02-12 16:34:38 +02:00
d0a6861369 Fix "TypeError: TypeError: write() argument 1 must be unicode" when generating project on Windows/Python 2 2020-02-12 15:14:58 +02:00
0bc872eafd Remap command before calling Click 2020-02-12 13:47:42 +02:00
edc2d10574 Update SPDX License List to 3.8 2020-02-11 18:59:35 +02:00
a033e2d1fe Docs: Update supported upload/debug protocols by Nuclei dev-platform 2020-02-11 18:37:48 +02:00
c0aefe4c62 Add support for Nuclei RISC-V Processor development platform // Resolve #3340 2020-02-10 18:52:31 +02:00
86069ab7c6 Typo fix 2020-02-08 21:37:18 +02:00
86f2dde6f3 Do not overwrite custom items in VSCode's "extensions.json" // Resolve #3374 2020-02-08 21:36:32 +02:00
96f60aab66 Bump version to 4.2.0rc2 2020-02-08 20:35:43 +02:00
2763853d8d Fixed an issue when no error is raised if referred parameter (interpolation) is missing in a project configuration file // Resolve #3279 2020-02-08 19:10:48 +02:00
a78b461d45 Code formatting 2020-02-08 19:10:15 +02:00
2fa20b5c52 Typo fix 2020-02-07 11:28:32 +02:00
0b0b63aa7d Update templates for Atom, VSCode, CLion (#3371)
* Wrap flags with whitespace chars when exporting data for IDEs

* Update IDEs templates

Take into account compiler flags that can contain whitespace characters (e.g. -iprefix)

* Update template for VSCode

* Add history record
2020-02-07 11:26:45 +02:00
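A minimal sketch of the "wrap flags with whitespace chars" idea from the first bullet of the PR above, so that values such as a path passed to `-iprefix` survive the round-trip through an IDE template:

```python
def escape_flag(flag):
    # Quote any flag (or flag value) that contains whitespace so IDEs that
    # re-split the command line keep it as a single argument.
    return '"%s"' % flag if any(ch.isspace() for ch in flag) else flag

flags = ["-Os", "-iprefix", "C:\\Program Files\\toolchain\\include"]
print(" ".join(escape_flag(f) for f in flags))
```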
fe174e35c8 Suppress warnings from packages dir instead of core dir for CppCheck 2020-02-06 23:48:13 +02:00
88e40e28fc Bump version to 4.2.0rc1 2020-02-06 23:33:27 +02:00
73ce3c94e9 Initial support for Project Manager // Resolve #3335 2020-02-06 23:32:43 +02:00
d2033aacea Remove the entire folder with temp files used by pvs-studio tool 2020-02-06 23:21:27 +02:00
dfb47a089b Add license file to sdist using MANIFEST.in (#3325) 2020-02-06 17:48:08 +02:00
81960ce051 Fix test 2020-02-06 17:41:35 +02:00
2ecceb8ed2 Generate absolute path for compilation DB item 2020-02-06 17:30:55 +02:00
00a9a2c04d Generate compilation database "compile_commands.json" // Resolve #2990 2020-02-06 17:19:48 +02:00
7716fe1d1c Autodetect monitor port for boards with specified HWIDs // Resolves #3349 (#3369)
* Autodetect serial port for boards with specified hwids

* Add new record to history log
2020-02-05 22:33:05 +02:00
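A hedged sketch of the autodetection in the PR above: the VID:PID pairs declared for a board are matched against the connected serial ports reported by pyserial (the manifest shape shown is an assumption):

```python
from serial.tools import list_ports

def autodetect_monitor_port(board_hwids):
    # board_hwids: e.g. [["0x2341", "0x0043"]] as found in a board JSON manifest
    wanted = {
        "VID:PID=%s:%s" % (vid.replace("0x", "").upper(), pid.replace("0x", "").upper())
        for vid, pid in board_hwids
    }
    for port in list_ports.comports():
        hwid = (port.hwid or "").upper()
        if any(pair in hwid for pair in wanted):
            return port.device
    return None
```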
09b3df5520 Fixed a "UnicodeDecodeError" when listing built-in libraries on macOS with Python 2.7 // Resolve #3370 2020-02-05 22:25:06 +02:00
ee2e4896d2 Fixed an issue when Project Inspector crashes when flash use > 100% // Resolve #3368 2020-02-05 18:15:44 +02:00
390f1935d6 Fix parsing dependency in a legacy format 2020-02-05 15:43:39 +02:00
365c3eaf4b Fixed an issue when invalid CLI command does not return non-zero exit code 2020-02-05 15:31:32 +02:00
b83121a951 Fix package parser test 2020-02-05 14:34:40 +02:00
efe8e599fd Added support for Arduino's library.properties `depends` field // Resolve #2781 2020-02-05 00:04:16 +02:00
bc69259dd1 Update tool-unity package to v2.5.0 // Resolve #3362 2020-01-31 15:10:45 +02:00
607e8eb477 Improve detecting of active Internet connection // Resolve #3359 2020-01-29 18:54:30 +02:00
139171a79f Sync docs 2020-01-29 18:53:33 +02:00
848e525919 Bump PIO Home to 3.1.0-rc.2 2020-01-26 00:31:55 +02:00
b805822eea Remove debug code 2020-01-25 20:51:20 +02:00
13e9306753 Bump version to 4.1.1b9 2020-01-25 20:50:26 +02:00
880d5bb8b0 Fix project saving 2020-01-25 20:47:10 +02:00
f9de23b16f Bump version to 4.1.1b8 2020-01-25 15:48:14 +02:00
e5aa71e4e1 Fix config saving when PY2 is used 2020-01-25 15:47:45 +02:00
ba441ca77c Remove double blank lines when saving project config // Resolve #3293 2020-01-24 22:15:33 +02:00
decf482367 Use io.DEFAULT_BUFFER_SIZE as default chunk size 2020-01-24 22:05:24 +02:00
253e4b13b5 Disable file buffering when calculating checksum for a downloaded file 2020-01-24 21:40:12 +02:00
04ca6621f1 Verify downloaded package checksum with native Python's hashlib and support all OS 2020-01-24 20:43:05 +02:00
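A sketch of the checksum verification using only the standard library, reading in `io.DEFAULT_BUFFER_SIZE` chunks as the nearby entries suggest (the function name echoes `fs.calculate_file_hashsum` but the body is illustrative):

```python
import hashlib
import io

def calculate_file_hashsum(algorithm, path, chunk_size=io.DEFAULT_BUFFER_SIZE):
    # Stream the downloaded archive in chunks so huge packages never have to
    # be loaded into memory at once; works the same on every OS.
    checksum = hashlib.new(algorithm)
    with open(path, "rb") as fp:
        for chunk in iter(lambda: fp.read(chunk_size), b""):
            checksum.update(chunk)
    return checksum.hexdigest()
```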
20334772b5 Install a dev-platform with ALL declared packages using a new `--with-all-packages` option // Resolve #3345 2020-01-24 20:20:56 +02:00
a62bc3846e Update IPs when checking for Internet 2020-01-24 19:50:55 +02:00
3b092f28c3 Added support for "pythonPackages" in platform.json 2020-01-24 19:47:47 +02:00
2de46f545f Formatter 2020-01-24 19:47:20 +02:00
8fce660afa Add notice about VID:PID pairs // Resolve #3349 2020-01-24 14:47:23 +02:00
dbeffe426f Update docs for CI 2020-01-23 22:48:22 +02:00
d4c42bd546 Added support for PVS-Studio static code analyzer 2020-01-23 20:51:34 +02:00
fad5d1d744 Sync docs 2020-01-23 14:06:36 +02:00
46a9c1b6b2 Add initial support for PVS-Studio check tool (#3357)
* Add initial support for PVS-Studio check tool

* Enable all available PVS-Studio analyzers by default

* Add tests for PVS-Studio check tool

* Improve handling of check tool extra flags that contain a colon symbol
2020-01-23 12:57:54 +02:00
5ac1e9454f pio-test: pass --verbose to the run command context (#3338)
* pio-test: pass --verbose to the run command context

* restore old output behavior
2020-01-23 12:56:08 +02:00
17f9d57207 Control debug flags and optimization level with a new "debug_build_flags" option 2020-01-22 22:20:24 +02:00
5bdec19f31 Add new debug_build_flags option (#3355)
This will allow users to override default debug flags, for example in cases
when the final binary built in debug mode is too large to be loaded on the target
2020-01-22 20:41:42 +02:00
90b80083e8 Sync docs 2020-01-22 12:58:16 +02:00
8d02e8b8f7 Docs: Update FAQ for installing Python 2020-01-16 11:55:58 +02:00
7e41841a74 Allow calling the Downloader API in silent mode 2020-01-11 15:23:36 +02:00
0f296e7e37 Skip broken example definitions in package manifest 2020-01-10 21:28:19 +02:00
9c32ff278c Sync docs 2020-01-09 23:32:55 +02:00
60139035d8 Include hidden files by default in a package 2020-01-05 18:29:19 +02:00
5ab34436ec Cleanup package file name when packing 2020-01-04 23:48:06 +02:00
178080fd12 Improve boards search 2020-01-04 13:53:08 +02:00
915b9145f6 Filter boards by ID+Name 2020-01-04 11:46:25 +02:00
6020faf970 Docs: Add "v" suffix to SparkFun RISC-V board ids 2020-01-04 11:38:13 +02:00
ec82fc82a2 Support packages with nested folders and with a custom "root" 2020-01-03 22:55:56 +02:00
8d7b775875 Implement package packer 2020-01-03 15:52:54 +02:00
682114d6f1 Bump version to 4.1.1b7 2019-12-31 20:11:14 +02:00
0ac5cd6789 Skip empty "export" values when parsing library.json manifest 2019-12-31 10:21:20 +02:00
209040fdc1 Initial support for Zephyr framework // Issue #1613 2019-12-30 14:11:27 +02:00
643f290057 Fix issue with library.properties manifest parser when author email is not specified 2019-12-29 17:43:50 +02:00
ea0b462d0b PyLint fixes 2019-12-29 14:18:43 +02:00
442a7e3576 Made package ManifestSchema compatible with marshmallow >= 3 // Resolve #3296 2019-12-29 13:13:04 +02:00
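The marshmallow >= 3 compatibility fix above reflects the library's API change: Schema.load() now returns the deserialized data directly and raises ValidationError instead of returning a (data, errors) pair. A minimal sketch with an illustrative subset of manifest fields, not the real ManifestSchema:
```
from marshmallow import Schema, fields, ValidationError

class ManifestSchema(Schema):
    # illustrative fields only; the real schema validates many more keys
    name = fields.Str(required=True)
    version = fields.Str(required=True)
    description = fields.Str()

try:
    manifest = ManifestSchema().load({"name": "MyLib", "version": "1.0.0"})
    print(manifest)
except ValidationError as exc:  # marshmallow 3 raises instead of returning errors
    print(exc.messages)
```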
f7385e8e88 Bump version to 4.1.1b6 2019-12-25 00:44:01 +02:00
20a10c7fc5 Fixed an issue "Import of non-existent variable 'projenv'" // Resolve #3315 2019-12-24 23:43:21 +02:00
2f05040081 Fixed an issue with LDF when header files not found if "libdeps_dir" is within a subdirectory of "lib_extra_dirs" // Resolve #3311 2019-12-24 14:36:44 +02:00
7e0fb43dbe Normalise repository url when parsing package manifest 2019-12-22 23:45:12 +02:00
26e7069099 Fixed default PIO Unified Debugger configuration for J-Link probe 2019-12-22 12:33:08 +02:00
7d90c468ae Updated SCons tool to 3.1.2 2019-12-22 01:27:51 +02:00
f9494c940e Add support for "downloadUrl" in package manifest 2019-12-20 20:20:09 +02:00
135ef8701c Better representation of dependent packages 2019-12-20 01:19:23 +02:00
1bd6e898ad Fix tests 2019-12-18 14:03:10 +02:00
b15ddc00a5 Pass extra files with compiler options to VSCode IntelliSense tool 2019-12-17 22:40:28 +02:00
b4088a6d00 Drop support for Samsung Artik dev/platform // Resolve #3309 2019-12-17 22:39:15 +02:00
c57f68aee3 Change order between board/framework for configuration file 2019-12-17 15:22:02 +02:00
31eed6c5e5 Improve detection of a running PIO Home Server 2019-12-09 14:47:00 -08:00
09852dcada Sync docs 2019-12-09 14:46:28 -08:00
5768fcd429 Inform that PlatformIO Home server is already started in another process 2019-12-02 19:34:19 +02:00
d901cc875a Minor improvements for docs 2019-12-02 19:28:21 +02:00
4f1ccfe58f Bump version to 4.1.1b5 2019-11-28 18:58:09 +02:00
f852f9fa89 Small refactoring 2019-11-28 18:23:00 +02:00
5c388d4271 Fix issue with invalid checking used port for PIO Home server 2019-11-28 18:14:53 +02:00
f9cae60225 Fix tests 2019-11-28 18:14:10 +02:00
49ceadc6ad Move exceptions to their components 2019-11-28 16:15:54 +02:00
47469e8759 Fix issue when None value is passed to config.set 2019-11-27 18:08:32 +02:00
742f60b48d Bump version to 4.1.1b4 2019-11-26 18:00:33 +02:00
0615ff6dd8 Docs: Change "Global/Local scope" configuration to "Common/Environment" 2019-11-26 17:40:09 +02:00
2fbe33bca0 Move "Board" options to "Platform" 2019-11-26 14:44:54 +02:00
fdd73552ea Docs for Zephyr // Issue #1613 2019-11-26 14:44:29 +02:00
17b4f0b4dd Docs: Sync Atmel AVR dev/platform 2019-11-25 13:56:33 +02:00
f87322591d Bump version to 4.1.1b3 2019-11-22 20:49:44 +02:00
47099190f4 Bump PIO Home version to 3.1.0-beta.2 2019-11-22 20:49:09 +02:00
9cf777b4e5 Fixed an issue with "start-group/end-group" linker flags on Native development platform // Resolve #3282 2019-11-21 14:19:32 +02:00
8675d3fa46 Use proper CMake variables for setting compilation flags 2019-11-21 00:39:05 +02:00
2b9d8dba5f Bump version to 4.1.1b2 2019-11-20 23:32:19 +02:00
b37814976c Fix project generator for CLion 2019-11-20 23:31:35 +02:00
895aa0cb34 Update PlatformIO slogan 2019-11-19 20:01:50 +02:00
18a9866a02 An example with custom debug flags in advanced scripting 2019-11-19 19:56:10 +02:00
88ead0aed1 Catch all debug errors when killing debug server 2019-11-19 18:55:21 +02:00
f19c7dc575 Support for Atmel megaAVR 2019-11-18 17:59:19 +02:00
7eab5d567e Fix CLion generator when one env is used 2019-11-17 00:35:56 +02:00
5b1c6daa2a Fix a space in config header 2019-11-16 23:09:18 +02:00
464e1a509f Add PIO Home "os.get_file_mtime" RPC method 2019-11-16 22:57:56 +02:00
c1394b290d Fix issue with unknown dev/platform when symlink is used 2019-11-16 22:43:25 +02:00
0029c7fe09 Advanced Serial Monitor with UI 2019-11-16 17:45:57 +02:00
e9f9871c1e Show the user the last exception when a package can't be installed 2019-11-16 17:25:27 +02:00
d1b46c838e Revert back "Flash" instead of "ROM" 2019-11-16 14:09:16 +02:00
a7b9187234 Bump version to 4.1.1b1 2019-11-15 20:05:52 +02:00
c7202154de Project Manager and Project Configuration UI for "platformio.ini" 2019-11-15 20:05:09 +02:00
6809da0353 Replace os.path.abspath by realpath 2019-11-15 16:02:15 +02:00
fbdcbf17c7 Rename Data/Flash to RAM/ROM 2019-11-15 15:52:39 +02:00
44a9de6dcb Pass -m and -i flags to VSCode Intellisense analyzer 2019-11-15 15:11:13 +02:00
a077081e46 Init generic C/C++ SCons tools by default 2019-11-15 15:10:43 +02:00
728fd7f5b9 Catch exception when a default locale cannot be determined 2019-11-13 16:48:04 +02:00
053160a6eb Sync docs 2019-11-13 16:47:39 +02:00
9bbaba3d59 Bump version to 4.1.1a2 2019-11-13 15:35:32 +02:00
b1577d101c Update PIO Home to 3.0.1 2019-11-13 15:32:57 +02:00
53e6cf3e4a Drop support for "project_dirs" argument to Project RPC 2019-11-13 15:32:14 +02:00
a9f9f4ef04 Fixed an issue when `env.BoardConfig()` does not work for custom boards in extra scripts of libraries // Resolve #3264 2019-11-12 23:52:43 +02:00
15f142fc70 Ensure content cache key does not contain path separators 2019-11-12 23:48:39 +02:00
6e9429dbbf Split BuildProgram into ProcessProgramDeps and ProcessProjectDeps 2019-11-12 18:52:25 +02:00
be628051a7 Fix typo 2019-11-12 18:35:07 +02:00
f0eb177a8e Check if directory exists before fetching project info 2019-11-12 18:32:10 +02:00
7c481291dc Warn about a broken library manifest when scanning dependencies // Resolve #3268 2019-11-12 18:14:06 +02:00
f1d20f591a Sync docs 2019-11-12 13:41:54 +02:00
c6a8e03367 Fixed an issue when `env.BoardConfig()` does not work for custom boards in extra scripts of libraries // Resolve #3264 2019-11-12 13:41:39 +02:00
cbb7869da1 Fixed an issue with the broken latest news for PIO Home 2019-11-12 13:09:35 +02:00
1f796ca0e5 Bump version to 4.1.1a1 2019-11-11 23:22:17 +02:00
703b29a05e Fixed missed descriptions for project options 2019-11-11 23:19:47 +02:00
56ceee220b Fix invalid build status for unit test when remote is used 2019-11-11 22:48:29 +02:00
0328037b49 Handle project configuration (monitor, test, and upload options) for PIO Remote commands // Resolve #2591 2019-11-11 22:38:16 +02:00
3c796ca7c8 Cosmetic fixes 2019-11-08 17:40:11 +02:00
e6e14be528 Fix framework name shakti-sdk 2019-11-07 16:58:57 +02:00
f42d1a89f2 Merge tag 'v4.1.0' into develop
Bump version to 4.1.0
2019-11-07 16:54:21 +02:00
5a89388fb0 Merge branch 'release/v4.1.0' 2019-11-07 16:54:20 +02:00
d043412e0f Bump version to 4.1.0 2019-11-07 16:54:12 +02:00
71168b1a5f Replace IoT with Embedded 2019-11-07 16:49:34 +02:00
95c1b0214c Rename "check_pattern" to "check_patterns" 2019-11-07 15:24:47 +02:00
2408c0a4c7 Fix incorrect info about build_type 2019-11-06 23:53:38 +02:00
4b3f593df9 Bump version to 4.1.0rc9 2019-11-06 23:24:49 +02:00
67aea4db3f Fix issue with GDB/MI Stream Records for PIO Debugger 2019-11-06 22:30:58 +02:00
6b44a8ae75 Use BuildAsyncPipe only if TTY stream 2019-11-06 22:25:14 +02:00
70f4fa2665 Remove platformio.description if a None value is passed 2019-11-06 20:06:30 +02:00
17ff3250c9 Export uppercased driver name in sizedata report on Windows 2019-11-06 12:34:43 +02:00
2cce47a13d Fix "pio_reset_run_target" for JLink debug probe 2019-11-06 12:20:42 +02:00
c1f62f8ead Bump version to 4.1.0rc8 2019-11-06 01:01:07 +02:00
e0c174b9b6 Improve PIO Home Misc RPC 2019-11-06 00:48:28 +02:00
e5ec4de3a4 Extend PIO Home RPC with "project.config_update_description(path, text)" method 2019-11-06 00:05:17 +02:00
bcf09964ab Better formatting for multi-line values in config option 2019-11-06 00:03:08 +02:00
f3992f8e53 Create dummy target for CLION 2019-11-05 15:59:06 +02:00
66cc557d2f Export SVD Path to CLion 2019-11-05 15:16:50 +02:00
9786b3e1b9 Fix CLion integration when project name contains a space 2019-11-05 15:16:35 +02:00
30bc691c95 Fix test with a missed library 2019-11-05 12:08:29 +02:00
83110326f4 Rename "check_pattern" option 2019-11-05 12:02:12 +02:00
182835fabf Rename check_patterns option to check_pattern 2019-11-05 11:36:20 +02:00
7345d3ea19 Improve dump of config data 2019-11-05 00:17:39 +02:00
3f4aa320c2 Sync docs 2019-11-04 21:52:42 +02:00
dfd853fa87 Update tests for check command with a new flag "pattern" that supersedes "filter" 2019-11-04 21:34:39 +02:00
3289e84b21 Refactor PIO Check from "check_filters" to "check_patterns" 2019-11-04 18:22:28 +02:00
39639d45fe Bump version to 4.1.0rc7 2019-11-04 15:36:45 +02:00
b45abf67a5 Fix broken debug configuration 2019-11-04 15:36:23 +02:00
e57871cab7 Print a building mode 2019-11-03 22:22:40 +02:00
484ea15959 Bump version to 4.1.0rc6 2019-11-02 23:14:16 +02:00
40109263f0 Fix initial debug configuration for J-Link 2019-11-02 23:09:56 +02:00
b45261c3dc Change initial debug configuration to: reset/halt, load, init break points 2019-11-02 22:56:11 +02:00
73bcf18498 Fix broken debugger 2019-11-02 22:54:57 +02:00
0dcc6f350d Bump version to 4.1.0rc5 2019-11-02 19:49:34 +02:00
0488cc4086 Typo fix 2019-11-02 19:48:41 +02:00
7784743cb1 Switch to default values from project configuration options 2019-11-02 19:44:28 +02:00
0a4bc1d4e3 Add "description" for project config options, configure "default" values 2019-11-02 19:41:39 +02:00
3630084a64 Docs: Sync Kendryte K210 dev/platform 2019-11-01 18:53:00 +02:00
53c561e895 Bump version to 4.1.0rc4 2019-11-01 18:33:21 +02:00
88db253515 Ignore duplicate library storages 2019-11-01 18:28:20 +02:00
da928efb43 Added "--shutdown-timeout" option to PIO Home Server 2019-11-01 14:40:03 +02:00
cd3d638337 Disable parsing of extra configs for PIO Home Project RPC load/dump methods 2019-11-01 12:05:13 +02:00
3de2d84e2b Fixed an issue with a GCC Linter for PlatformIO IDE for Atom // Resolve #3218 2019-10-31 22:42:22 +02:00
1d5d09feab Fixed an issue when Project Config Parser does not remove in-line comments when Python 3 is used // Resolve #3213 2019-10-31 22:04:57 +02:00
2c2b419685 Docs: Sync nRF52 dev/platform 2019-10-31 18:52:26 +02:00
a7f8838d9a Format code 2019-10-31 18:52:13 +02:00
a18f8b2a4c Use default values from project options 2019-10-31 15:28:02 +02:00
9b65a091da Export config dump/load and schema to PIO Home Project.RPC 2019-10-31 15:27:34 +02:00
8ccf9d2e53 Implement project config "update" with "clear" option 2019-10-31 15:26:34 +02:00
cd6137bdb0 Bump version to 4.1.0rc3 2019-10-31 00:43:44 +02:00
6d69c25a2f Use locale encoding to decode subprocess output // Resolve #2890 2019-10-30 20:43:37 +02:00
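The commit above decodes subprocess output with the locale's preferred encoding instead of assuming UTF-8 (a commit further down the log tried the file system encoding first). A minimal sketch of the idea; the function name and the "replace" error handling are illustrative, not the actual PIO Core code:
```
import locale
import subprocess
import sys

def run_and_decode(cmd):
    # Capture raw bytes, then decode them with the locale's preferred encoding
    raw = subprocess.run(cmd, stdout=subprocess.PIPE, check=False).stdout
    encoding = locale.getpreferredencoding() or "utf-8"
    return raw.decode(encoding, errors="replace")

print(run_and_decode([sys.executable, "--version"]))
```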
7b6bab7f4e Update memory usage banner 2019-10-30 20:40:26 +02:00
257a8c63d2 Sync docs 2019-10-30 20:28:38 +02:00
3146ab5d12 Allow export project config data as Tuple 2019-10-30 19:09:32 +02:00
2d4722477e Automatically shutdown PIO Home server when no clients for 1 hour 2019-10-30 18:58:49 +02:00
d815daed29 Allow specifying defect level that will cause failure 2019-10-30 13:38:46 +02:00
c4e7674585 Don't pass header files to Cppcheck 2019-10-30 12:23:33 +02:00
94f565db84 Show warning about restarting the IDE to apply PIO Home changes 2019-10-29 18:11:09 +02:00
6e03aa3a3d Bump version to 4.1.0rc2 2019-10-29 18:03:07 +02:00
737c29b510 Update PIO Home to 3.0.0-beta.2 2019-10-29 18:02:32 +02:00
0222c56c4d Use file system encoding when decoding subprocess output // Resolve #2890 2019-10-29 17:43:48 +02:00
8d0584aa59 Add new IDE RPC "open_text_document" method for PIO Home 2019-10-29 17:37:09 +02:00
7dbeab11a5 Add new OS.RPC "open_file" method for PIO Home 2019-10-29 17:36:36 +02:00
7cad06ea18 Minor test fixes 2019-10-29 17:12:18 +02:00
3236fb6b3d Return file+line for sizedata instead of "location" 2019-10-29 17:01:20 +02:00
0194e09410 Use simple abspath to get absolute path to file with defect 2019-10-28 18:37:14 +02:00
187e30d055 Export full path to file with defect 2019-10-28 18:30:22 +02:00
39a7062503 Fix types of defect fields column and line 2019-10-28 18:12:39 +02:00
4ff7c868ef Sync docs 2019-10-28 16:34:26 +02:00
5573c3871c Allow cppcheck to suppress individual defects by default 2019-10-28 13:38:46 +02:00
d620579247 Fix tests for check command according to updated exit codes 2019-10-25 21:04:30 +03:00
48651286b6 Automatically detect C++ standard version when invoking cppcheck 2019-10-25 20:59:36 +03:00
0e7a2b3141 Automatically detect source files language when invoking cppcheck 2019-10-25 20:08:04 +03:00
f3b8ae4224 Bump version to 4.1.0rc1 2019-10-25 20:02:01 +03:00
a2451a716d PIO Home 3.0 with Project Inspect 2019-10-25 20:01:31 +03:00
2e5dabb913 Fix issue with custom board_ options 2019-10-25 19:33:22 +03:00
4e43e7d3c3 Fix code formatting 2019-10-25 17:43:52 +03:00
49acf4bdb9 Minimum supported version of PIO Plus Core is 2.5.8 2019-10-25 17:27:51 +03:00
f3d8c30f95 Skip ignored environments when exporting check report in JSON format 2019-10-25 15:50:19 +03:00
4486a85d4c Introduce new flag --fail-on-defect to pio check 2019-10-25 15:40:50 +03:00
8a6892bf3c Fixed an issue with invalid encoding when generating project for Visual Studio // Issue #3183 2019-10-25 14:33:22 +03:00
087a8f6dd0 Fix Visual Studio template files encoding // Resolve #3183 2019-10-25 14:27:47 +03:00
5e681ec03c ProjectRPC.config_call accepts first argument as dict/kwargs for Config.init 2019-10-25 14:01:46 +03:00
784a5cd349 Add support for "Build Middlewares" 2019-10-25 00:33:04 +03:00
5345dd2674 Give a proper name to method that converts defect item to dict 2019-10-24 21:35:04 +03:00
8127fd9960 Export correct stats for each check tool 2019-10-24 20:44:34 +03:00
3177aaf591 Fixed an issue when booleans in "platformio.ini" are not parsed properly // Resolve #3022 2019-10-24 19:43:13 +03:00
70b484a2c2 Escape "\" char in GDB console output 2019-10-24 17:34:49 +03:00
ed6c9a08ce Add custom "PLATFORMIO_BUILD_DEBUG" target for CLion 2019-10-24 17:21:02 +03:00
601989c5ff Escape "\" char in GDB console output 2019-10-24 16:56:28 +03:00
234585dc97 Fixed an issue with project generator when `src_build_flags` were not respected // Resolve #3137 2019-10-24 16:39:11 +03:00
2388b2a62b Fixed security issue when extracting items from TAR archive // Issue #2995 2019-10-24 16:24:53 +03:00
69d9438c71 Temporarily disable security checking for Tar items 2019-10-24 15:39:41 +03:00
0b500dba54 Handle legacy "system": "all" for package manifest 2019-10-24 15:10:11 +03:00
798b12ce7b Fixed security issue when extracting items from TAR archive // Resolve #2995 2019-10-24 14:55:45 +03:00
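The TAR security fixes above guard against archive members whose names escape the destination directory (path traversal via "../" or absolute paths). A minimal sketch of such a check, assuming offending members are simply skipped; the real fix may differ in details:
```
import os
import tarfile

def safe_extract(archive_path, dest_dir):
    dest_dir = os.path.realpath(dest_dir)
    with tarfile.open(archive_path) as tar:
        for member in tar.getmembers():
            target = os.path.realpath(os.path.join(dest_dir, member.name))
            # Skip members that would be written outside of dest_dir
            if target != dest_dir and not target.startswith(dest_dir + os.sep):
                continue
            tar.extract(member, dest_dir)
```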
334d50c367 Use package parser for package manager and LDF 2019-10-24 13:42:46 +03:00
dd1da95a40 Fix issue when wrong library was picked up by LDF when framework is not declared 2019-10-24 00:28:03 +03:00
6684ac5a57 LDF: Check project include dirs before looking for dependencies 2019-10-23 22:55:02 +03:00
b533d7a1dd LDF: Check global CPPPATH when looking for dependencies 2019-10-23 22:31:26 +03:00
95d1f43799 Sync docs with ST STM32 dev/platform 2019-10-23 18:49:08 +03:00
9c7cc87c5f Move command related modules to "commands" package 2019-10-23 16:05:27 +03:00
374379ba03 Skip .debug sections when generating memory use report 2019-10-22 21:52:55 +03:00
56ac577b0a Fix case with empty arguments when generating sizedata report 2019-10-22 12:10:48 +03:00
941c0f4297 Improve the speed of memory use report generation 2019-10-21 23:26:28 +03:00
f34745bef9 Parse device frequency in int format for size data 2019-10-21 15:57:34 +03:00
9fef7f0ba9 Docs: Sync TI MSP430 dev/platform 2019-10-21 15:53:25 +03:00
971cd2ca0f Export device info in pair with sizedata 2019-10-21 00:12:04 +03:00
6bf8bec22d Bump version to 4.1.0b5 2019-10-19 12:43:43 +03:00
d771816b02 Automatically change dir to project for RPC "config_call"; add "envs" and "description" for project entities 2019-10-19 12:42:43 +03:00
f78a1a7b15 Show encoding error when can't read a file // Issue #2796 2019-10-18 22:00:28 +03:00
77f8414c63 Better explanation about encoding error // Resolve #2796 2019-10-18 15:56:50 +03:00
4d84d03a63 Black 2019-10-18 15:56:41 +03:00
065607b68c Disable PyLint's "import-outside-toplevel" 2019-10-18 15:41:52 +03:00
f5807364e8 Force to "backslashreplace" when UnicodeEncodeError arises when writing file // Issue #2796 2019-10-18 15:20:52 +03:00
92d86192aa Substitute LDSCRIPT with real value 2019-10-18 15:05:11 +03:00
d44c60614d Use direct LDSCRIPT_PATH only if script resolves 2019-10-17 23:40:30 +03:00
19a8326f0f Fix test for package manifest 2019-10-17 21:19:04 +03:00
be9aaf8902 Be compatible with Python 3.8, on Windows skip HOME and check for USERPROFILE 2019-10-17 20:57:40 +03:00
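For the Python 3.8/Windows compatibility commit above, the general pattern is to prefer USERPROFILE over HOME when resolving the user's home directory on Windows. A minimal illustrative sketch, not the actual helper used by PIO Core:
```
import os

def user_home_dir():
    # On Windows, HOME is often unset or stale; prefer USERPROFILE
    if os.name == "nt":
        return os.environ.get("USERPROFILE") or os.path.expanduser("~")
    return os.environ.get("HOME") or os.path.expanduser("~")

print(user_home_dir())
```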
5cfa2b7fdd Fix typo with UnicodeEncodeError // Issue #2796 2019-10-17 19:28:57 +03:00
6218b773fd Better support for file contents writing // Issue #2796 2019-10-17 18:48:59 +03:00
7bcfea13fb Fixed an issue with linking process when `$LDSCRIPT` contains a space in path 2019-10-17 16:52:18 +03:00
89843c0d65 Fix issue with parsing library.properties when export field is used 2019-10-17 15:48:18 +03:00
31d4a5c72e Add collective stats info about project components to check report 2019-10-17 13:42:00 +03:00
83f25cbc16 Fix tests 2019-10-17 12:38:35 +03:00
27fc19d6b3 Switch to Marshmallow ODM framework 2019-10-17 00:17:16 +03:00
9cfccc5cd4 Minor fixes to manifest parser 2019-10-16 13:58:50 +03:00
9da19fbf54 Use isolated SCons sign DB per Python interpreter 2019-10-16 12:09:53 +03:00
a481a5deda Fix issue with "remote test" // Resolve #3127 2019-10-15 23:30:02 +03:00
e8692334f6 Replace deprecated in python3: iteritems with items and basestr with str (#3119) 2019-10-15 22:00:48 +03:00
239befa4ee New Shakti dev/platform 2019-10-15 13:05:56 +03:00
2e9b0066de Capture manifest parser exceptions 2019-10-14 23:36:15 +03:00
55d905a0d0 Add a new RPC method "project.config_call" for Home API 2019-10-12 20:00:12 +03:00
181adb277f Sync docs 2019-10-12 19:58:34 +03:00
82ec0164b0 Update docs on how to install Python 3.7 on Windows 2019-10-10 23:35:59 +03:00
c8354b100e Bump version to 4.1.0b4 2019-10-10 14:51:14 +03:00
4366719ed2 Restore missed project helpers needed by "platformio-node-helpers" 2019-10-10 14:50:34 +03:00
971eb8e35c Revert back unix style directory separator in sizedata report 2019-10-09 17:37:24 +03:00
a785c238b1 Use OS-native directory separator in sizedata report 2019-10-09 00:39:57 +03:00
eda02750ae Export files as list instead of dict for sizedata target 2019-10-08 13:45:36 +03:00
e5d50eb45c Docs: RV-LINK debug tool, sync GDV32 dev/platform 2019-10-08 11:49:04 +03:00
b66bf5f4c0 Ignore symbolic links for package examples 2019-10-07 20:35:01 +03:00
d1c8cc38f2 Cast semver to string when validating version 2019-10-05 23:40:27 +03:00
10bada0bcc ManifestParser: handle examples from "Examples" folder 2019-10-05 20:21:39 +03:00
5d7e7b1796 DataModel: capture exceptions from failed models in non-strict mode 2019-10-04 23:52:06 +03:00
0f7fe260d1 Docs: Sync ESP32 dev platform 2019-10-04 21:15:37 +03:00
46be56af43 Bump version to 4.1.0b3 2019-10-04 20:51:05 +03:00
dce2655004 Fix broken serial monitor called via run target while uploading // Resolve #3081 2019-10-04 20:50:39 +03:00
36acdd7797 DataModel: allow valid values in non-strict mode for TypeOfList and TypeOfDict 2019-10-04 18:30:48 +03:00
47e297fecb Use less verbose debug output 2019-10-04 13:27:02 +03:00
9ce19c7e83 Skip debug sections when calculating sizedata 2019-10-04 10:52:55 +03:00
9954900a0e Return back LINKFLAGS for debug mode 2019-10-03 18:16:55 +03:00
a7855ae664 ManifestParser: init from dir using name of file in remote url if provided 2019-10-03 16:14:51 +03:00
76865a1730 ManifestParser fixes (#3080)
* Skip broken examples declaration

* Allow dots in keywords

* Allow "+" in keywords
2019-10-03 14:55:04 +03:00
8febdc19ea ManifestParser: normalize example names 2019-10-03 12:47:41 +03:00
85a814c21a Allow dot in manifest example name 2019-10-03 10:33:11 +03:00
ab5650f84b Use max line length hooks for all systems 2019-10-02 23:46:42 +03:00
77c591ce81 Fix RTD conf 2019-10-02 21:35:13 +03:00
dc067642b2 Fix RTD conf 2019-10-02 21:33:40 +03:00
d0ee0c2919 Sync docs 2019-10-02 21:32:31 +03:00
6d50aa2e25 Remove RTD confs 2019-10-02 20:54:36 +03:00
b68b9794ec Fix docs with "htmlzip" format 2019-10-02 20:51:39 +03:00
e6ea4cb613 PackageManifest: Ignore hidden files for examples 2019-10-02 20:42:01 +03:00
47ba127733 Add ReadTheDocs config 2019-10-02 18:15:17 +03:00
bbd694c5ea ManifestParser: automatically generate examples from package dir 2019-10-02 17:54:59 +03:00
7ba2a7cd3d Bump version to 4.1.0b2 2019-10-02 13:33:12 +03:00
a1ed99962c Better handling of non-dict values passed to DataModel 2019-10-02 12:34:50 +03:00
c2970631a5 Add "--force" for git update // Issue #3060 2019-10-02 12:34:20 +03:00
d38c843574 Fixed an issue when installing a package using custom Git tag and submodules were not updated correctly // Resolve #3060 2019-10-02 11:52:14 +03:00
a2213a1aa4 Change "examples" field in package manifest to ListOf(ExampleModel) 2019-10-02 11:04:29 +03:00
dee2d2c538 Add manifest parsers for platform.json and package.json 2019-10-01 22:03:23 +03:00
fec19849b5 Docs: Add info about ignoring individual parts of mbed framework 2019-10-01 22:02:55 +03:00
5b77adccb1 DataModels: fix issue when traversing using model fields 2019-10-01 18:10:48 +03:00
a82c4666d4 DataModel: add support for DictOfType, extend base manifest with ExamplesModel 2019-10-01 17:37:11 +03:00
df6a8da290 DataModel: add support for silent validation and "get_exceptions" API 2019-10-01 16:13:25 +03:00
39c8996093 Fix docs typos 2019-10-01 16:11:55 +03:00
af1a0f3587 Allow to build a manifest parser from directory 2019-10-01 00:11:31 +03:00
703912fdc9 Strict manifest validation when submitting to Registry, more tests for manifest model 2019-09-30 23:45:03 +03:00
744881da59 Refactor DataModel with a strict type declaration 2019-09-30 19:44:03 +03:00
5f55c18373 Introduce DataModel, package manifest parser and base manifest model 2019-09-30 17:59:06 +03:00
2137eb1794 Do not append debug flags to linker stage 2019-09-30 13:27:34 +03:00
3dcf1784fb Update PIO Remote to 2.5.5 2019-09-27 19:36:49 +03:00
9a3dcd3daa PY2 fix with absolute import 2019-09-27 18:53:58 +03:00
1b74f380a6 Refactor appending of debugging flags 2019-09-27 17:22:21 +03:00
cd2a4ea535 Update copyrights 2019-09-27 17:21:35 +03:00
536a9566da Feature/pio size data (#3056)
* Add initial support for detailed memory usage report

* Tidy up sizedata target

* Add toolchain to environment paths

* Make sizedata target a bit more readable
2019-09-27 14:18:35 +03:00
d2abac9b18 Fixed an issue when configuration file options are partly ignored when a custom `--project-conf` is used // Resolve #3034 (#3055)
* Fixed an issue when configuration file options are partly ignored when using a custom `--project-conf` // Resolve #3034

* Py2 compatible makedirs

* Fix circular dependency

* Fix broken import in test examples

* Fix history

* Remove YAPF markers

* PyLint fix

* Fix invalid project conf path

* Move PIO Core to the root on Windows, issue with long CPPPATHs

* Respect global PLATFORMIO_BUILD_CACHE_DIR env var

* Fix Appveyor paths

* Minor changes
2019-09-27 14:13:53 +03:00
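One bullet in the pull request above mentions a Py2-compatible makedirs. Since Python 2 has no os.makedirs(..., exist_ok=True), the usual pattern (a hedged sketch, not necessarily the exact code landed here) is to ignore the error only when the directory already exists:
```
import os

def makedirs_safe(path):
    # Python 2 lacks exist_ok, so swallow the error only if the directory exists
    try:
        os.makedirs(path)
    except OSError:
        if not os.path.isdir(path):
            raise
```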
94f8afec38 udev: Add GD32V DFU Bootloader (#3032) 2019-09-24 11:02:26 +03:00
3d5c1411c0 Fix PyLint for PY2 2019-09-24 00:28:23 +03:00
9a7e5d86fc Install Black only for Python 3.6+ 2019-09-24 00:21:16 +03:00
ca29b4e370 Fixed "DeprecationWarning: the imp module is deprecated in favour of importlib" PY2/PY3 2019-09-24 00:17:08 +03:00
392fe1cbd0 Move Run to the root 2019-09-24 00:12:21 +03:00
aa955819b0 Move PIO Check to the root 2019-09-23 23:44:42 +03:00
b1f190a7f8 Move PIO Unit Testing to the root 2019-09-23 23:44:28 +03:00
5453df94e4 Move PIO Unified Debugger to the root 2019-09-23 23:27:55 +03:00
7b314b58a4 Move PIO Home to the root of source code 2019-09-23 23:23:11 +03:00
7c41c7c2f3 Introduce Black to automate code formatting 2019-09-23 23:13:48 +03:00
5e144a2c98 Add PIO Check to changelog 2019-09-23 21:57:31 +03:00
61b6eea52c New "--no-ansi" flag for PIO Core 2019-09-23 20:51:02 +03:00
cd8dc24454 Docs: Sync Espressif32 dev/platform 2019-09-18 18:47:55 +03:00
6531dcbc78 Allow to skip checking of unpacked data 2019-09-16 21:38:47 +03:00
123963f760 UTF8 decoding should ignore invalid characters (#3026)
Some boards, like ESP32-based boards, emit unintelligible data when connecting to them via Serial. This is sometimes data sent at the wrong baud rate (hard-baked into the boot loader), or something else; it's hard to prevent this from happening. When a build is uploaded to the ESP board for unit testing, the decoding of the incoming stream should not fail the test due to some garbled content. Since the read data is validated on line 95, any garbage is automatically ignored and only output to the console.
2019-09-16 21:02:07 +03:00
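The change above (#3026) makes the serial reader tolerant of garbage bytes by ignoring invalid sequences during UTF-8 decoding instead of raising. A minimal sketch; the function name and sample bytes are illustrative:
```
def decode_test_output(raw_bytes):
    # Drop byte sequences that are not valid UTF-8 instead of raising UnicodeDecodeError
    return raw_bytes.decode("utf-8", errors="ignore")

print(decode_test_output(b"\xff\xfeTest summary: 3 Tests 0 Failures 0 Ignored"))
```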
08a94b6f7c New article "Arduino In-circuit Debugging with PlatformIO" 2019-09-16 18:58:29 +03:00
43ae62afd8 Sync Aceinna and GD32V dev/platforms. 2019-09-13 16:01:42 +03:00
e08dc5f0d7 Docs: Sync Microchip PIC32 dev/platform 2019-09-10 17:48:47 +03:00
1e26feb566 Bump version to 4.1.0b1 2019-09-09 23:34:44 +03:00
96567dea4d PyLint fix 2019-09-08 23:44:18 +03:00
c720933d34 Refactor PIO Check 2019-09-08 23:33:25 +03:00
f61d03ec8f PIO Check (#2982) 2019-09-08 18:04:41 +03:00
b7bc4401eb Use isolated SCons DB per build environment 2019-09-08 14:01:41 +03:00
7a07a2e63e Generate `.ccls` LSP file for Emacs 2019-09-03 15:31:33 +03:00
2c242944c7 Fixed default PIO Unified Debugger configuration for J-Link probe 2019-09-02 16:48:33 +03:00
6265233903 Optimize udev rules 2019-09-02 16:01:15 +03:00
be3e26c202 Cleanup UDEV rules 2019-09-02 14:24:35 +03:00
9f76293684 Cleanup Segger UDEV rules 2019-09-02 14:13:58 +03:00
1be2e510da Sync nRF52 dev/platform 2019-09-02 12:50:55 +03:00
af049eecc9 Bump version to 4.1.0a1 2019-08-31 23:40:28 +03:00
fe237f15aa Implement "extends" for project configuration // Resolve #2953 2019-08-31 23:39:41 +03:00
bdce78ba6f Stop ModemManager corrupting Arduino uploads (#2966)
On boards like the Arduino Micro, when in bootloader mode it appears ModemManager interferes with the programming process and results in a catastrophic failure with no end of different errors including, but not limited to:
```
error: programmer did not respond to command: write block
error: butterfly programmer uses avr_write_page() but does not provide a cmd() method.
error: programmer did not respond to command: set addr
```
After this, the device could appear to be completely non-functional, refusing to enumerate or appear for programming; thankfully a double-reset will usually recover it, but the underlying ModemManager issue will still prevent successful programming. Hence the additional rules.

This affects not only PlatformIO, but also the Arduino IDE (on Linux).
2019-08-31 11:47:32 +03:00
f26e3c42dd Sync docs 2019-08-31 11:40:16 +03:00
92cd03cf2a Sync docs 2019-08-30 18:12:26 +03:00
e7da3d7f5f Bump version to 4.0.4a1 2019-08-30 16:41:17 +03:00
f966eeb604 Fixed an issue with project generator for CLion IDE when 2 environments were used // Resolve #2824 2019-08-30 16:40:44 +03:00
34176f974b Fix generator for CLion when project is empty // Issue #2824 2019-08-30 15:45:21 +03:00
60f0f775ef Merge branch 'release/v4.0.3' 2019-08-30 15:41:59 +03:00
5f044a7948 Merge tag 'v4.0.3' into develop
Bump version to 4.0.3
2019-08-30 15:41:59 +03:00
9f1dd3dd5d Bump version to 4.0.3 2019-08-30 15:41:49 +03:00
db6f983364 Fix issue for CLion project generator when environment contains space // Issue #2824 2019-08-30 10:55:13 +03:00
386883fbe5 Bump version to 4.0.3rc1 2019-08-29 17:18:36 +03:00
e08527a0af Cleanup CLion project generator 2019-08-29 16:58:18 +03:00
4a6d5e8395 Added support for multi-environment PlatformIO project for CLion IDE // Resolve #2824 Resolve #2944 2019-08-29 16:26:51 +03:00
83bf34fb77 Extend "load_project_ide_data" API to return IDE data for more than one environment 2019-08-29 16:01:36 +03:00
1c8666e946 Clion integration, resolves #2824 (#2944)
* Better environment integration:
 - Environment can be selected in the build target menu of CLion
 - PlatformIO target runs on the selected environment
 - Changing environment changes defined preprocessor variables and includes accordingly
 - Added 'All' build profile that runs targets on all environments if there are multiple of them (original behaviour)

* Calling get_project_dir() only once.

* Fixed include path not being converted to unix style.
Removed duplicate and non-normalized definitions
2019-08-29 15:01:50 +03:00
0440b7a2f7 Disable TTY coloring with "PLATFORMIO_DISABLE_COLOR" system environment // Resolve #2956 2019-08-29 14:34:51 +03:00
223a85baca CCLS LSP for VIM // Resolve #2952 2019-08-29 14:20:24 +03:00
ed39a755bc Update to semantic_version 2.8.0 2019-08-29 13:49:52 +03:00
519156512c Strict versions for "semantic_version" and "tabulate" 2019-08-28 23:01:39 +03:00
9fa424ea9b Remove ProjectConfig from cache on saving 2019-08-28 22:43:34 +03:00
883a97a38c Fixed an issue when --upload-port CLI flag does not override declared upload_port option in "platformio.ini" 2019-08-28 19:56:09 +03:00
c671a8e235 Bump version to 4.0.3b1 2019-08-27 20:35:25 +03:00
55a44aecc3 Remove debug code 2019-08-27 20:26:44 +03:00
81fc1c9010 Fixed an issue with PIO Unified Debugger on Windows when debug server is piped 2019-08-27 20:23:03 +03:00
8037bef847 Move "to_unix_path" helper to FS module 2019-08-27 20:21:53 +03:00
98ec287797 Docs: Remove non-existing project examples 2019-08-27 16:28:38 +03:00
bc2765eb1f Fix issue with SemVer when library version has incompatible format // Resolve #2950 2019-08-27 14:05:01 +03:00
94644c2863 Update SCons tool to 3.1.1 2019-08-27 00:15:58 +03:00
fa090131ae Do not parse visited source files for LDF 2019-08-27 00:15:12 +03:00
48b46d74cf PIO Home: Improve description for project examples // Resolve #2713 2019-08-25 20:40:28 +03:00
66b22a218a Update PIO Home to 2.3.0 // Resolve #2614 Resolve #2819 2019-08-25 19:27:44 +03:00
3d18d4f9ce Sync docs 2019-08-25 18:37:54 +03:00
cba2f4d7b6 Remove ProjectConfig cache when "platformio.ini" was modified outside 2019-08-25 18:37:14 +03:00
54f14c64b5 Merge branch 'release/v4.0.2' 2019-08-23 16:24:28 +03:00
785be3cb26 Merge tag 'v4.0.2' into develop
Bump version to 4.0.2
2019-08-23 16:24:28 +03:00
20a9522542 Bump version to 4.0.2 2019-08-23 16:17:11 +03:00
f8d957a705 Sync docs 2019-08-23 16:10:47 +03:00
2eecbf966c Fixed an issue with a broken LDF when checking for framework compatibility // Resolve #2940 2019-08-23 15:45:45 +03:00
cef778731e Merge branch 'release/v4.0.1' 2019-08-22 14:25:40 +03:00
adde5e6a7e Merge tag 'v4.0.1' into develop
Bump version to 4.0.1
2019-08-22 14:25:40 +03:00
37863df67e Bump version to 4.0.1 2019-08-22 14:25:27 +03:00
0e56a155f8 Fixed an issue when printing settings and file path contains non-ASCII chars // Resolve #2934 2019-08-22 14:21:54 +03:00
c6de3ebea0 Refactor "humanize_duration_time" to "00:00:00.000" format 2019-08-20 16:49:18 +03:00
a25c1155c2 Docs: Add support for Teensy 4.0 2019-08-19 18:16:53 +03:00
42ee6fe96e Fixed an issue when library.json had priority over project configuration for LDF // Resolve #2867 2019-08-19 15:54:07 +03:00
a33fd6de27 Bump version to 4.0.1rc2 2019-08-19 14:44:04 +03:00
a2830dd527 Automatically normalize file system paths to UNIX-style for Project Generator // Resolve #2857 2019-08-19 14:43:39 +03:00
f46e1b7a3a Fix broken reference to VSCode Settings docs 2019-08-19 12:57:40 +03:00
bf59dda01b Better handling of fs.rmtree errors on Windows // Issue #2916 2019-08-19 12:48:57 +03:00
1dc15326c9 Export ProjectConfig instance to templates generator // Issue #2824 2019-08-18 11:06:52 +03:00
1a3720cfb9 Ability to set "databaseFilename" for VSCode and C/C++ extension // Resolve #2825 2019-08-18 00:40:59 +03:00
5ea3f2bbc6 Use board ID when InternetIsOffline for getting recent projects // Resolve #2866 2019-08-17 23:58:05 +03:00
08103cfc59 Improve settings printing 2019-08-17 23:57:12 +03:00
78e88b6115 Update changelog 2019-08-17 22:31:41 +03:00
26bb74afd5 Bump version to 4.0.1rc1 2019-08-17 22:03:52 +03:00
0587f5b964 Add __attribute__((unused)) to generated test code (#2906)
Without this, the compiler will generate unused parameter warnings in
the native version:

test/output_export.cpp: In function ‘void output_start(unsigned int)’:
test/output_export.cpp:6:32: error: unused parameter ‘baudrate’ [-Werror=unused-parameter]
 void output_start(unsigned int baudrate)
                                ^~~~~~~~
cc1plus: all warnings being treated as errors
2019-08-17 21:07:37 +03:00
e0ec4ff435 Encode PIO Core version before hashing // Resolve #2916 2019-08-17 21:06:15 +03:00
0677bcecb9 Improve printing of tabulated results 2019-08-17 20:55:16 +03:00
9023358d9e YAPF=0.28.0 2019-08-17 17:54:31 +03:00
3edf7e6ca8 Fixed an issue with PIO Home's "No JSON object could be decoded" // Resolve #2823 2019-08-15 21:44:07 +03:00
59114bbd86 Fixes for creating dev/platform 2019-08-15 17:13:45 +03:00
fac45d37f8 Article by Tech Explorations 2019-08-13 20:48:19 +03:00
d0c73af459 Fix docs when only Python 2 was mentioned for PIO Core 2019-08-13 20:44:52 +03:00
fcd1862f40 Add FT231XS ids for Sparkfun ESP32 Thing (#2886)
Signed-off-by: Yusuf Soyipek <ysoyipek@iora.com.tr>
2019-08-12 23:22:55 +03:00
a6d42bedc1 Minor fix 2019-08-12 23:00:25 +03:00
04ebdf428b Move "match_src_files" to FS module 2019-08-12 20:32:26 +03:00
6a90388649 Move FS related helpers to fs.py module 2019-08-12 19:44:37 +03:00
1b2e410f12 New article by Jean-Claude Wippler about PlatformIO CLI 2019-07-24 13:08:51 +03:00
f7647438ef Docs: Fix case sensitive files 2019-07-24 12:15:08 +03:00
436a6bc521 Updated contributing guidelines // Resolve #2820 2019-07-24 12:08:21 +03:00
8ea3909a64 Updated contributing guidelines // Resolve #2820 2019-07-24 12:03:28 +03:00
ba0ab796c6 Updated contributing guidelines // Resolve #2820 2019-07-23 19:11:22 +03:00
0e61020652 Docs: Sync Atmel AVR boards 2019-07-22 20:03:54 +03:00
c51c35018d Fix test 2019-07-22 00:57:01 +03:00
5505c6c0e3 Switch Energia's projects to Arduino framework 2019-07-21 22:31:39 +03:00
d6a8360b29 Fix platformio ide search query for VSCode 2019-07-19 14:57:00 +03:00
0687ceb8a4 Bump version to 4.0.1b3 2019-07-19 13:00:52 +03:00
6d7854d113 Session based IDE integration with PIO Home 2019-07-19 12:59:50 +03:00
aebe891895 Skip modified homeState 2019-07-18 14:17:30 +03:00
e69d5e5873 Sync docs and examples 2019-07-18 13:16:23 +03:00
6ea49910d5 Fixed an issue when "debug", "home", "run", and "test" commands were not shown in "platformio --help" CLI 2019-07-18 00:19:30 +03:00
b17e318373 Add Gitlab CI to known list 2019-07-17 14:03:48 +03:00
b0aa4c6682 Renamed "enable_ssl" setting to "strict_ssl" 2019-07-17 00:53:40 +03:00
4d6c452d79 add telemetry / privacy policy reference (#2798) 2019-07-17 00:20:14 +03:00
d06d5def72 Docs: Use renamed "default_envs" option 2019-07-16 16:55:23 +03:00
aa3c943651 Fix typo in docs 2019-07-16 16:50:44 +03:00
1369b10e76 Bump version to 4.0.1b2 2019-07-16 15:48:06 +03:00
8059e04499 Improved computing of the project checksum (structure, configuration) to avoid unnecessary rebuilding 2019-07-16 15:47:33 +03:00
71c4201487 Do not save unnecessary data for PIO Home 2019-07-16 14:15:48 +03:00
425c1fb0a8 Do not shutdown PIO Home Server for "upgrade" operations // Resolve #2784 2019-07-15 18:19:40 +03:00
bbf829fe92 Bump version to 4.0.1b1 2019-07-15 14:56:18 +03:00
6f2779fe5d Remove unnecessary code 2019-07-15 14:46:23 +03:00
b51f2ae722 Fixed an issue with incorrect escaping of Windows slashes for PIO Unified Debugger 2019-07-15 14:20:14 +03:00
d5dd4d4b3a Add global "env" group to extra_configs example 2019-07-13 13:51:04 +03:00
eaecad3d49 Add docs for Kendryte FreeRTOS SDK 2019-07-12 15:21:04 +03:00
a8a0cbbbf3 Remove PIO Plus link 2019-07-12 14:16:27 +03:00
94e21bf8a1 Update badges 2019-07-12 14:12:38 +03:00
02c3d870ff Sync Kendryte dev/platform 2019-07-12 13:57:12 +03:00
91f0217d39 Make scripts compatible with Python 3 // Resolve #2779 2019-07-12 13:38:37 +03:00
5e73348263 Print debug tool name in debugging session 2019-07-11 14:09:15 +03:00
2f871db3ae Add docs about configuring MCUdude's cores 2019-07-11 12:12:54 +03:00
28972b838b Add "publish" target 2019-07-10 16:52:04 +03:00
0473405b92 Merge tag 'v4.0.0' into develop
Bump version to 4.0.0
2019-07-10 16:22:53 +03:00
0d00d0534a Merge branch 'release/v4.0.0' 2019-07-10 16:22:52 +03:00
fe210aadcf Bump version to 4.0.0 2019-07-10 16:22:37 +03:00
6fcc76e199 Docs for VSCode: Proxy Server Support 2019-07-10 16:07:13 +03:00
e927669632 Document how to override board configuration 2019-07-10 13:49:59 +03:00
24d2b94e79 Drop Python 2.7 limitation from a docs // Resolve #1991 2019-07-10 13:17:42 +03:00
f81b1089c1 Control a number of parallel build jobs with a new -j, --jobs option 2019-07-10 13:00:51 +03:00
fee748d384 Bump version to 4.0.0rc6 2019-07-10 00:20:43 +03:00
d003dffc5a Fixed an issue with broken internal call for PIO Account // Resolve #2748 2019-07-10 00:20:23 +03:00
a300168658 Bump version to 4.0.0rc5 2019-07-09 00:14:46 +03:00
0a638b7ea5 Fix issues with Unicode arguments when calling inline PIO Core 2019-07-09 00:14:11 +03:00
ffcf6b873a Use native Windows API for getting My Documents folder path 2019-07-08 17:21:28 +03:00
c69e80249d Append system PATH to overridden in CDT project // Resolve #810 2019-07-04 22:56:54 +03:00
2a4b25705c More pre-configured target for Eclipse IDE 2019-07-04 22:53:36 +03:00
412a1f78cd Fixed an issue when generating invalid "Eclipse CDT Cross GCC Built-in Compiler Settings" if a custom PLATFORMIO_CORE_DIR is used // Resolve #806 2019-07-04 20:07:44 +03:00
73dacd9418 Add "make profile" hotkey 2019-07-04 19:02:13 +03:00
caf5159002 Fixed an issue when Ctrl+C(SIGINT) terminates debugging session instead of halting // Resolve #2733 2019-07-04 17:47:26 +03:00
d9b6842b6a Update history with refactored PIO Home back-end 2019-07-04 16:49:29 +03:00
affa54e5fc Fix an issue when lib_compat_mode = strict does not ignore libraries incompatible with a project framework 2019-07-04 16:46:54 +03:00
c3ac10fe29 Update docs and examples for SiFive dev/platform 2019-07-03 22:58:27 +03:00
0a907627be Bump version to 4.0.0rc4 2019-07-03 15:22:22 +03:00
26dda104dd Remove debug code 2019-07-03 15:22:04 +03:00
7c6fabaee2 Fix an issue with unhandled warnings from PIO Core when calling it internally // Resolved #2727 2019-07-03 15:10:37 +03:00
dc07ea56d2 Bump version to 4.0.0rc3 2019-07-03 14:01:14 +03:00
b68e5db46b Fix broken example when using custom uploader // Resolve #2735 2019-07-03 12:37:00 +03:00
fd9ca0cd15 Ensure buffer is created for thread stream 2019-07-02 22:35:35 +03:00
6c47c7506e Do not remove thread buffers, reset them instead 2019-07-02 22:26:01 +03:00
18d93dfcc9 Use StringIO for PY3 2019-07-02 21:04:43 +03:00
900a4d463f Bump version to 4.0.0rc2 2019-07-02 20:49:06 +03:00
4b24d6e3e4 Minor changes 2019-07-02 20:48:43 +03:00
2988724456 Setup MultiThreadingStdStreams from child threads 2019-07-02 20:44:33 +03:00
c79d5f0cf1 Fix an issue saving modified State 2019-07-02 15:52:12 +03:00
148b7dccfd Refactor ThreadSafeStdBuffer 2019-07-02 15:47:03 +03:00
83f64cebbd Remove test code 2019-07-02 12:23:35 +03:00
69de40c409 Fix an issue when empty multiple values for a project option were not cast to a list // Resolve #2734 2019-07-02 12:21:06 +03:00
bf9552bd56 Free lock when state is deleted 2019-07-02 00:45:35 +03:00
5afaa6d0ee Merge branch 'feature/performance-fixes' into develop 2019-07-02 00:42:04 +03:00
d2c86ab71c Refactor state to a proxied dictionary 2019-07-02 00:41:47 +03:00
6d9de80f12 Better comparison for app state changes 2019-07-01 20:42:23 +03:00
bf77d70d82 Thread safe internal PIO Core calling for PIO Home 2019-07-01 20:39:52 +03:00
8525fd6ae8 Docs: Sync Atmel SAMD boards 2019-07-01 18:33:05 +03:00
dfca7f0b68 Speedup PIO Home via internal calling of PIO Core CLI 2019-07-01 15:55:42 +03:00
500b3e08fe Docs: Fix default OTA port for ESP32 2019-06-29 13:29:10 +03:00
6739d5a570 In the current version of this template all build targets result (#2714)
in

   pio -c -f eclipse debug run target <mytarget>

The commit fixes this to be

   pio -c -f eclipse run target <mytarget>

See also discussion in forum:
   https://community.platformio.org/t/pio-4-0-0b3-potential-bug-in-cprojet-tpl/8390
2019-06-29 12:48:24 +03:00
f8dd90c9a9 Docs: Fix invalid references to Sipeed boards 2019-06-28 23:34:08 +03:00
8d459d86d3 Bump version to 4.0.0rc1 2019-06-28 19:09:47 +03:00
f30bd18bdc Skip async output when empty byte is given 2019-06-28 19:07:59 +03:00
0035f56e15 Sync docs and examples 2019-06-28 19:07:44 +03:00
4123aa4c23 Look for "lib_deps" in all declared library storages // Resolve #2708 2019-06-28 13:27:05 +03:00
1bff3c6615 Use isolated Core folder for test_maintenance 2019-06-28 13:21:11 +03:00
936b04e075 Remove upgrade hooks from Core 2.0 to Core 3.0 2019-06-28 13:04:34 +03:00
4d23ad03c3 Remove upgrade hooks from Core 2.0 to Core 3.0 2019-06-28 13:03:49 +03:00
a75823227d Use :1234 as default GDB port for QEMU 2019-06-27 15:13:16 +03:00
96d337388b Debug: Initial configuration for QEMU 2019-06-27 15:07:13 +03:00
c89793eab9 Use piped openOCD for RISC-V 2019-06-27 15:06:12 +03:00
e27b40390d Look for project dir in sys env variables (hooks for Eclipse, CLion) 2019-06-27 14:57:05 +03:00
1d80914070 Remove upgrade hooks from Core 2.0 to Core 3.0 2019-06-27 14:56:21 +03:00
e61caa37a8 Fix issue when CMakeListsUser.txt is not included // Resolve #2712 2019-06-27 14:04:16 +03:00
d07a1d265e Log stdout of GDB client 2019-06-27 13:38:46 +03:00
54921c5dbd Bump version to 4.0.0b3 2019-06-26 01:02:08 +03:00
5cb2a970fa Suppress IOError 2019-06-26 00:52:22 +03:00
89b951e7d5 Switch to news from registry 2019-06-26 00:29:49 +03:00
6daf387c90 Missing parentheses and deprecated syntax (#2700)
Fixes
```
  File "get-platformio.py", line 93
    print r['out']
          ^
SyntaxError: Missing parentheses in call to 'print'. Did you mean print(r['out'])?
```

and 

```
  File "get-platformio.py", line 146
    except Exception, e:
                    ^
SyntaxError: invalid syntax
```
2019-06-25 23:50:06 +03:00
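The installer fix above (#2700) boils down to the two Python 3 syntax changes shown in the tracebacks: parenthesized print calls and "except ... as ..." clauses. A self-contained illustration; the values are placeholders, not the actual get-platformio.py code:
```
r = {"out": "installer output"}
print(r["out"])            # Python 3: print is a function, parentheses required

try:
    raise RuntimeError("example failure")
except Exception as e:     # Python 3 syntax; "except Exception, e:" no longer parses
    print(e)
```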
2c3d8ce695 Bump version to 4.0.0b2 2019-06-23 21:23:44 +03:00
17fa5f77d5 Fix LDF recursive behaviour 2019-06-23 21:23:19 +03:00
c0a9ae5c70 Better handling of library.properties "architectures" meta data 2019-06-23 21:21:34 +03:00
4b7916c2af Bump version to 4.0.0b1 2019-06-21 17:50:18 +03:00
6d968a7093 Docs: Add info about using custom targets with mbed framework 2019-06-21 15:11:51 +03:00
634afdcf8a Update FUNDING.yml 2019-06-20 19:11:50 +03:00
3ebeb1bab2 Create FUNDING.yml 2019-06-20 19:10:21 +03:00
5ab564a6d0 Add doc page about ULP programming 2019-06-20 17:41:06 +03:00
04dc6230e7 Open source PIO Unified Debugger and Unit Testing 2019-06-19 00:29:21 +03:00
51115c1254 Ignore vscode-cpptools cache (#2623)
One of the recent updates to the vscode-cpptools extension made it start caching in .vscode/ipch, and this cache can be quite big for even small projects. Since they're cache files, they should be ignored by default. This may be a short-lived suggestion, as there is already some mention that the IntelliSense cache folder may be moved to workspace storage, rather than 'per project' storage... although the 'when' is anyone's guess. 

See here for more: https://github.com/microsoft/vscode-cpptools/issues/3347
2019-06-16 12:37:32 +03:00
40c8046546 Fix Teensy serial lockout (#2676)
Fixes the `[Errno 16] Device or resource busy: '/dev/ttyACM0'` issue on Teensy 3.2 and 3.5 (at minimum) due to ModemManager probing the devices and making them unavailable. Updated rules sourced from the current https://www.pjrc.com/teensy/49-teensy.rules configuration.
2019-06-16 12:36:41 +03:00
fa761f9616 Use build cache for AppVeyor CI examples 2019-06-15 22:06:55 +03:00
5fabadd059 Use build cache for CI examples 2019-06-15 21:12:40 +03:00
38408c1e1f Use only 1 example per framework for CI 2019-06-15 21:09:52 +03:00
fbdfe31f17 Shared cache directory for the build derived files // Resolve #2674 2019-06-15 18:53:13 +03:00
bd8ba738cf Improve docs how to enable PIO Core dev/version in PIO IDE 2019-06-13 23:46:53 +03:00
8b8b6c3b9e Bump version to 4.0.0a23 2019-06-13 22:57:53 +03:00
bd3b29c304 Fix converting to real version 2019-06-13 20:24:55 +03:00
1339924c2e Print installation progress for "lib_deps" after LDF banner 2019-06-13 19:54:40 +03:00
46eab99888 Typo fix 2019-06-13 18:31:03 +03:00
461d71c2c7 Look firstly in built-in library storages for a missing dependency instead of PlatformIO Registry // Resolve #1654 2019-06-13 18:22:36 +03:00
1ccc526960 Revert "Revert back "Look firstly in built-in library storages""
This reverts commit 4ae302762a.
2019-06-13 13:08:53 +03:00
5338a9caa3 Bump version to 4.0.0a22 2019-06-13 00:42:24 +03:00
4ae302762a Revert back "Look firstly in built-in library storages" 2019-06-13 00:42:10 +03:00
98513c9967 Fix nested import 2019-06-12 23:47:22 +03:00
b6688db8b7 Bump version to 4.0.0a21 2019-06-12 22:03:28 +03:00
d5c98e4f27 Look firstly in built-in library storages for a missing dependency instead of PlatformIO Registry // Resolve #1654 2019-06-12 22:02:59 +03:00
b7e9bcb609 Do not check udev rules for gdb --version 2019-06-11 20:30:06 +03:00
9de7297d38 Fix "UnicodeEncodeError: 'ascii' codec can't encode characters" // Resolve #2644 2019-06-10 19:44:18 +03:00
97acf23a6d Docs: Add info how to enable built-in PIO Core for VSCode 2019-06-10 16:58:34 +03:00
80718ebb95 Fix "UnicodeEncodeError: 'ascii' codec can't encode characters" // Resolve #2644 2019-06-10 15:25:59 +03:00
643d118062 Sync docs 2019-06-07 17:42:33 +03:00
15b5a14995 Bump version to 4.0.0a20 2019-06-07 17:24:07 +03:00
68a3b3f9e7 Custom platform_packages per build environment with an option to override the defaults // Resolve #1367 2019-06-07 17:22:02 +03:00
5f1bd286c7 Fix "ValueError: invalid literal for int() with base 10: '0.0'" // Resolve #2646 2019-06-07 15:12:24 +03:00
d18b4f12d0 Sync docs 2019-06-07 15:01:52 +03:00
d9010230a4 Make internal in-memory cache for package manager to be instance related 2019-06-07 15:01:27 +03:00
686d615639 Cast env_name to string // Resolve #2644 2019-06-07 13:14:14 +03:00
d205370e9b Docs: Minor changes to migration guide 2019-06-07 01:06:16 +03:00
ce66033190 Docs: update migration from 3.0 to 4.0 2019-06-07 00:18:34 +03:00
bcff26d4d7 Refactor using "@util.memoized" 2019-06-06 00:13:04 +03:00
898d79956d Bump version to 4.0.0a19 2019-06-05 18:28:23 +03:00
522f814811 Show detailed info about a platform when it is installed from a local folder or VCS // Resolve #2081 2019-06-05 18:26:39 +03:00
394d272324 Fix numerous issues related to "UnicodeDecodeError" and international locales, or when project path contains non-ASCII chars // Resolve #143, Resolve #1342, Resolve #1959, Resolve #2100 2019-06-05 17:57:22 +03:00
84ce7db3e3 Fixed an issue when library keeps reinstalling for non-latin path // Resolve #1252 2019-06-05 17:53:02 +03:00
f873bd41f8 Better printing of relative path for removed objects 2019-06-05 16:47:02 +03:00
5c8c10e7d3 Do not check that lib_extra_dirs exist // Resolve #2624 2019-06-04 13:55:11 +03:00
a504a13fa8 Fix broken example with ConfigParser using // Resolve #2616 2019-06-04 00:50:45 +03:00
d09964a897 Use common IDE data loading for IDE and DEBUG 2019-06-03 19:20:10 +03:00
4416c12747 Fix numerous issues related to "UnicodeDecodeError" and international locales, or when project path contains non-ASCII chars // Resolve #2100 2019-06-03 17:44:41 +03:00
80a1b95887 Sync docs 2019-06-03 14:48:05 +03:00
9eb18ca72d PyLint fix 2019-06-03 14:29:22 +03:00
37653d8446 Better decoding SCons arguments 2019-06-03 13:57:58 +03:00
e269c91d26 Improve compatibility with hashlib Py2/Py3 2019-06-03 13:30:35 +03:00
ac3236693f Bump version to 4.0.0a18 2019-06-03 01:14:02 +03:00
d0b3c5ee86 Switch between Build Configurations (release and debug) with a new project configuration option build_type // Resolve #2184 2019-06-02 14:11:31 +03:00
23a2022f04 Add support for PLATFORMIO_DEFAULT_ENVS system environment variable // Resolve #1967 2019-06-01 22:43:44 +03:00
c5177efd0b Minor fixes 2019-06-01 22:24:38 +03:00
d51cd9c277 Bump version to 4.0.0a17 2019-06-01 19:48:21 +03:00
6257480d0d Print platform package details, such as version, VCS source and commit // Resolve #2155 2019-06-01 19:44:45 +03:00
4af615a49c Maintain renamed options when reading configuration file 2019-06-01 16:58:14 +03:00
6186b425d4 Typo fix 2019-06-01 15:38:55 +03:00
c038074489 Override default development platform upload command with a custom one // Resolve #2599 2019-06-01 14:36:07 +03:00
d25f1ddc21 Add project folder prefix to $PROJECT_HASH 2019-05-31 21:47:50 +03:00
5011e47709 Added support for "shared_dir" 2019-05-31 21:18:37 +03:00
33d16bfcf0 Use named context meta vars for unit testing 2019-05-31 15:47:25 +03:00
bae21f1cdd Allow to pass multiple load commands to PIO Unified Debugger 2019-05-31 14:45:48 +03:00
5f9fd9260e New project configuration parser with a strict options typing 2019-05-31 14:45:01 +03:00
61db0f1d6a YAPF 0.27.0 2019-05-30 23:42:15 +03:00
1598c8197e Fix "clean" target 2019-05-30 23:33:57 +03:00
01db26f204 Add a link to PIO Unified Debugger options 2019-05-30 22:26:42 +03:00
12876c5c2b Merge branch 'feature/refactor-project-options' into develop 2019-05-30 22:15:16 +03:00
5c60d922ca Skip "arduino-mock" from CI 2019-05-30 22:15:06 +03:00
0570fc6c48 Don't override custom "core_dir" on Windows platform 2019-05-30 21:56:55 +03:00
f3c8277572 Fix broken tests 2019-05-30 21:27:12 +03:00
1dbaed5beb Implement "silent" mode for config.validate() 2019-05-30 21:26:51 +03:00
19c1574993 Use the latest TOX for Travis.CI 2019-05-30 17:37:48 +03:00
346579b93c Improve type converting for config options 2019-05-30 17:34:44 +03:00
0ce2343836 Do not pass project settings as SCons arguments // Resolve #1637 2019-05-30 17:08:00 +03:00
d5e277b7cc Minor improvements to unit testing engine 2019-05-30 16:39:17 +03:00
3cc4af1723 Refactor project config options 2019-05-30 16:38:04 +03:00
8d05903bf3 Log ONLY non-sensitive data (used board, platform, and framework) 2019-05-30 14:36:04 +03:00
7f845ab943 Sync docs and examples 2019-05-30 14:32:49 +03:00
ddc8a353cb Sync docs 2019-05-28 12:53:54 +03:00
9ce9171a36 Fix typo 2019-05-28 01:19:51 +03:00
dec43bec9d Fix test 2019-05-28 00:09:20 +03:00
99377130eb Enhance unit testing summary 2019-05-27 22:25:48 +03:00
3df01405a1 Remove unused Python imports 2019-05-27 22:25:22 +03:00
3adcf66453 Docs: Use native Python ConfigParser for extra scripting examples 2019-05-27 19:03:31 +03:00
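The docs commit above switches the advanced-scripting examples to Python's built-in configparser for reading "platformio.ini". A minimal sketch of that usage; the section and option names are illustrative:
```
import configparser

config = configparser.ConfigParser()
config.read("platformio.ini")

# Read an option from a hypothetical environment section, with a fallback value
monitor_speed = config.get("env:myboard", "monitor_speed", fallback="9600")
print(monitor_speed)
```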
b88c262a9d Use the latest version of tox for AppVeyor CI 2019-05-27 18:49:35 +03:00
5999bcee3f Update history 2019-05-27 18:45:24 +03:00
078b0af312 Test only 1 project example per dev/platform for AppVeyor CI 2019-05-27 18:17:57 +03:00
a0fb88e28a Implement "envLibdepsDirs" per project for PIO Home 2019-05-27 17:57:46 +03:00
3cd4b005d8 Open sourcing PIO Unified Debugger, PIO Unit Testing Engine, and PIO Home Server 2019-05-27 17:19:33 +03:00
0a523fc06c Docs: Minor tweak 2019-05-27 14:33:38 +03:00
16864509af Document "Override package files" 2019-05-27 14:21:48 +03:00
cb8af5add9 Print "No items found" when there are no packages for listing 2019-05-27 12:28:04 +03:00
3f96dc1432 Fixed an issue when package cache (Library Manager) expires too fast // Resolve #2559 2019-05-27 12:24:20 +03:00
e1aa29cb36 Bump version to 4.0.0a16 2019-05-25 22:18:23 +03:00
6e87089ded Add support for Unix shell-style wildcards for "monitor_port" option // Resolve #2541 2019-05-25 22:14:38 +03:00
a84195bb5a Add user-definable monitor options to platformio.ini // Resolve #2165 2019-05-25 21:49:51 +03:00
70a0bd72c0 Sync "include" directory for PIO Remote // Resolve #2210 2019-05-25 21:06:08 +03:00
fea7e97112 Fix an issue with hardcoded C standard version when generating project for CLion IDE // Resolve #2527 2019-05-25 20:47:39 +03:00
7beb332b31 Support custom CMake configuration for CLion IDE using `CMakeListsUser.txt` file 2019-05-25 20:46:56 +03:00
7b2c1f27fc Sync docs 2019-05-25 20:46:23 +03:00
67f7b6cda3 Bump version to 4.0.0a15 2019-05-25 01:18:28 +03:00
4266cba53b Cleanup ".piolibdeps" 2019-05-25 01:10:35 +03:00
19725fec04 Add options to override default locations used by PlatformIO Core // Resolve #1615 2019-05-24 20:49:05 +03:00
a6e5a0c7f5 Fix an issue for Project Generator when include path search order is inconsistent with what is passed to the compiler // Resolve #2509 2019-05-24 16:06:27 +03:00
2baea815fe Update history 2019-05-24 15:06:33 +03:00
b38c57bcf9 Fix an issue when `-U` in build_flags does not remove a macro previously defined via the `-D` flag // Resolve #2508 2019-05-24 14:57:59 +03:00
e6d1805f0b Save library requirements when using --save option // Issue #1028 2019-05-24 14:09:25 +03:00
9a95b0df56 Fix handling custom includeDir and srcDir for library.json // Resolve #2518 2019-05-24 01:15:47 +03:00
70a5d32925 Add "--save" flag to "platformio lib install" command // Resolve #1028 2019-05-23 19:39:04 +03:00
c2a549b0c2 Install all project dependencies declared via "lib_deps" option using "platformio lib install" command // Resolve #2147 2019-05-23 18:37:08 +03:00
0fda79a075 Switch to Click meta context for lib CLI 2019-05-23 13:05:44 +03:00
21e2ac6695 Use isolated library dependency storage per project build environment // Resolve #1696 2019-05-23 00:23:24 +03:00
e7d75d1412 Remove debug code 2019-05-21 21:47:20 +03:00
4386dc56ea Move "in_silence" to PlatformioCLI 2019-05-21 13:18:11 +03:00
a30b79c5fc Sync docs 2019-05-21 12:01:14 +03:00
f29a74042f Drop support for "lib_extra_dirs" in "platformio" section 2019-05-20 21:12:45 +03:00
c46643f0fd Custom project "***_dir" options declared in the "platformio" section of "platformio.ini" have higher priority than environment variables 2019-05-20 17:07:59 +03:00
5fe4de626b Implement unified project workspace storage ".pio" // Resolve #1778 2019-05-20 17:01:54 +03:00
774380c2ef Bump version to 4.0.0a14 2019-05-20 12:38:49 +03:00
8643f0454e Move "glob_escape" and "get_file_contents" helpers to "compat" module 2019-05-17 13:18:15 +03:00
f844d9cb47 Remove line-buffering from "platformio run" command which was leading to omitting progress bar from upload tools // Resolve #856, Resolve #857 2019-05-17 12:53:51 +03:00
f94fbb951a Typo fix 2019-05-16 21:27:32 +03:00
899de600e4 Fix broken "util.string_types" 2019-05-16 21:11:21 +03:00
971049b41c Move process related helpers to "proc" module 2019-05-16 21:03:15 +03:00
aaf61082c1 Replace "--only-check" CLI option by "--dry-run" 2019-05-16 20:02:45 +03:00
b14abeff48 Bump version to 4.0.0a13 2019-05-13 22:37:45 +03:00
f26553b451 ESP8266, docs for "SSL Support" 2019-05-13 22:37:24 +03:00
8b93ad00a2 Docs: ESP32: Implement "espota" protocol 2019-05-11 22:18:35 +03:00
5e1a931145 Switch Python or Platform dependent code to "compat" module 2019-05-10 17:50:08 +03:00
abfee8308e Switch Python or Platform dependent code to "compat" module 2019-05-10 17:47:02 +03:00
d2449762c2 Move Python or Platform dependent code to "compat" module 2019-05-10 17:27:04 +03:00
59848c3115 Check unknown build environment passed by user 2019-05-10 17:26:47 +03:00
834206ff20 Move Python or Platform dependent code to "compat" module 2019-05-10 17:26:10 +03:00
ce4ed18ceb Check unknown build environment passed by user 2019-05-10 15:49:01 +03:00
b710bbd80e Allow to skip ProjectConfig option validation with new "validate_options" argument 2019-05-10 15:47:45 +03:00
446176bf5e Allow to skip ProjectConfig option validation with new "validate_options" argument 2019-05-10 15:47:17 +03:00
0b2d780618 Switch to the new ProjectConfig API 2019-05-10 15:45:52 +03:00
d9b0364aa8 Allow overriding a default project "platformio.ini" configuration file 2019-05-10 13:12:41 +03:00
76818448e2 Ensure PIO Home mimetypes are known 2019-05-10 13:01:52 +03:00
131144ec34 Resolve PyLint "import-error" 2019-05-10 13:00:53 +03:00
a21d75b273 Merge branch 'develop' into feature/pio-plus-oss 2019-05-10 01:10:51 +03:00
c79b3ff7f1 Override the default "platformio.ini" with a custom one using the "-c, --project-conf" option // Resolve #1913 2019-05-10 00:01:10 +03:00
2b5ac57fd0 Bump version to 4.0.0a12 2019-05-09 18:40:19 +03:00
32d317d3cb Fix PlatformIO CLI 2019-05-09 18:39:27 +03:00
71f606912a Implement ProjectConfig.getlist() 2019-05-09 14:14:19 +03:00
62b80c396b Added support for the latest Python "Click" package (CLI Builder) // Resolve #349 2019-05-09 00:51:28 +03:00
7687a0a929 Fix PyLint "not-an-iterable" error 2019-05-08 21:02:23 +03:00
f63fe1699b Bump version to 4.0.0a11 2019-05-08 20:20:56 +03:00
4f98a3fd42 Share common (global) options between declared build environments using "[env]" section // Resolve #1643 Resolve #790 2019-05-08 20:19:39 +03:00
693304590c Fix PyLint "not-an-iterable" error 2019-05-08 12:41:11 +03:00
947e31ca8d Fix some PyLint errors 2019-05-07 23:51:46 +03:00
45d4b92678 Export get_projectdata_dir for util, fix riscv_gap dev/platform 2019-05-07 23:00:01 +03:00
07a2a49d93 Refactor project helpers 2019-05-07 22:13:21 +03:00
7ddd22209f Enable PyLint "import-error" 2019-05-07 21:16:42 +03:00
6cd4484be9 Init new project using new ProjectConfig API 2019-05-07 19:57:24 +03:00
c235974eb6 Switch to the new ProjectConfig API 2019-05-07 17:51:50 +03:00
1b4f945907 Remove support for renamed dev/platforms 2019-05-07 15:59:09 +03:00
4fdd51e190 Docs: Sync MCS51 boards 2019-05-07 13:26:10 +03:00
95b9ae9f24 Docs: Sync Atmel SAM boards 2019-05-06 13:38:48 +03:00
94e580bf4e Override default source and include directories for a library via "library.json" manifest using "includeDir" and "srcDir" fields 2019-05-04 13:36:27 +03:00
e4dca37874 Bump version to 4.0.0a10 2019-05-04 12:23:51 +03:00
7f607b742f Fix issue when handling dynamic variables 2019-05-04 12:23:23 +03:00
2c0e0b2619 Bump version to 4.0.0a9 2019-05-03 21:14:08 +03:00
8e55c9e4d0 Include external configuration files with "extra_configs" option // Resolve #1590 2019-05-03 21:03:36 +03:00
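For illustration, a minimal sketch of the "extra_configs" option introduced above; the included file names are hypothetical:

    [platformio]
    ; sections from the listed files are merged into this platformio.ini
    extra_configs =
        extra_envs.ini
        extra_debug.ini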
41ff1b0188 Support for Kendryte K210 // Resolve #2233 2019-05-03 16:56:44 +03:00
48c1aeae03 Fix "systemd-udevd" warnings in 99-platformio-udev.rules // Resolve #2442 2019-05-03 13:11:03 +03:00
eab2fd91fd Lowercase SHA sum for package manager 2019-04-25 12:49:22 +03:00
ba6d120cf4 Docs: Sync Atmel AVR dev/platform 2019-04-24 00:14:53 +03:00
e9df6166ee Update SCons to 3.0.5 2019-04-24 00:07:27 +03:00
9bdc85fd52 Bump version to 4.0.0a8 2019-04-23 12:49:36 +03:00
fa48a6460f Merge tag 'v3.6.7' into develop 2019-04-23 12:49:10 +03:00
f07854879a Merge branch 'hotfix/v3.6.7' 2019-04-23 12:47:27 +03:00
9ca53c57f4 Bump version to 3.6.7 2019-04-23 12:32:18 +03:00
ee420cc35e Bump version to 3.6.7 2019-04-23 12:31:58 +03:00
d49d91269d Update history with upcoming 3.6.7 release 2019-04-23 12:29:29 +03:00
a59efc2fc0 Update history with upcoming 3.6.7 release 2019-04-23 12:29:16 +03:00
20f28383a0 Fix links in changelog 2019-04-23 01:07:36 +03:00
d3c3491a91 Fix links in changelog 2019-04-23 01:07:21 +03:00
e6a7cc2036 Update core package dependencies 2019-04-23 01:05:11 +03:00
4d615416f3 Improve debugging in debug_load_mode = modified and fix an issue with useless project rebuilding 2019-04-23 00:50:32 +03:00
137a5d1c42 Improve debugging in debug_load_mode = modified and fix an issue with useless project rebuilding 2019-04-23 00:49:53 +03:00
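A sketch of the option mentioned above, assuming an ST-Link debug setup; board and tool names are placeholders:

    [env:disco_f407vg]
    platform = ststm32
    board = disco_f407vg
    debug_tool = stlink
    ; reload firmware only when the build output has changed
    debug_load_mode = modified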
65354e995d Initial commit of PIO Home 2019-04-22 21:07:28 +03:00
3032cade17 PyLint fixes 2019-04-19 20:46:28 +03:00
948a977fa5 Initial commit of PIO Unit Testing 2019-04-19 20:33:31 +03:00
c7d8b50474 Use generic GDB_MSPDEBUG_INIT_CONFIG 2019-04-19 19:58:34 +03:00
f1da544279 Initial commit of PIO Unified Debugger 2019-04-19 19:56:16 +03:00
40d1bb204c Docs: New boards by ESP32 dev/platform 2019-04-19 13:26:25 +03:00
21a36f8ee9 Fix UnicodeEncodeError when converting INO to CPP 2019-04-18 16:20:01 +03:00
be24c6ab4d Fix an issue when an invalid "env_default" results in unhandled errors // Resolve #2265 2019-04-18 14:18:27 +03:00
3d96e584fb Fix an issue when an invalid "env_default" results in unhandled errors // Resolve #2265 2019-04-18 14:17:22 +03:00
963eabc3f5 Docs: Rename "Download" button to INSTALL 2019-04-18 01:01:26 +03:00
f63041a402 Add get_original_version to public utils API 2019-04-17 23:20:45 +03:00
c084db1619 Enable AppVeyor CI only for Windows x64 2019-04-17 20:09:05 +03:00
4edfb8f6cc Cleanup PING_INTERNET_IPS 2019-04-17 20:06:01 +03:00
6501c1f171 Cleanup PING_INTERNET_IPS 2019-04-17 20:05:51 +03:00
41ab97203a Docs: Add community video tutorials 2019-04-16 13:55:47 +03:00
a51a03843d Docs: Use jQuery from Sphinx theme (fix search) 2019-04-16 00:33:21 +03:00
e9b8478942 Sync docs 2019-04-07 00:52:20 +03:00
29cf1c8596 Fix "ValueError: invalid literal for int() with base 10" for click.get_terminal_size 2019-04-05 19:49:36 +03:00
e992e156bf Fix "ValueError: invalid literal for int() with base 10" for click.get_terminal_size 2019-04-05 19:48:04 +03:00
cf13ec4035 Fix an "IndexError: list index out of range" for Arduino sketch preprocessor // Resolve #2268 2019-04-01 18:50:40 +03:00
c1d01dbe34 Update info about PIO IDE for VSCode 2019-04-01 18:35:00 +03:00
ff5da3c3cc Use stable dev/platforms for CI 2019-03-30 13:54:29 +02:00
7d9e10095e Set manifest version of VSCode C/C++ configuration file 2019-03-30 13:00:27 +02:00
bb17630571 Set manifest version of VSCode C/C++ configuration file 2019-03-30 13:00:10 +02:00
7746f7eeee Project Generator: fixed a VSCode C/C++ "Cannot find" warning when a CPPPATH folder does not exist 2019-03-29 21:27:08 +02:00
e089c4a546 Project Generator: fixed a VSCode C/C++ "Cannot find" warning when a CPPPATH folder does not exist 2019-03-29 21:26:45 +02:00
8f29d951cb YAPF 0.26.0 2019-03-29 12:55:05 +02:00
f182a6dcae Bump version to 3.6.7a1 2019-03-29 12:46:24 +02:00
0c76116948 Merge tag 'v3.6.6' into develop 2019-03-29 12:44:54 +02:00
6b17183cff Merge branch 'hotfix/v3.6.6' 2019-03-29 12:43:41 +02:00
8cd35fb537 Bump version to 3.6.6 2019-03-29 12:43:16 +02:00
2d4421e8e5 New article "Automated unit testing in the metal" 2019-03-25 16:04:05 +02:00
760499c095 Append configuration settings from "mbed_lib.json" to "mbed_config.h" // Resolve #2164 2019-03-24 23:55:06 +02:00
c141189883 Append configuration settings from "mbed_lib.json" to "mbed_config.h" // Resolve #2164 2019-03-24 23:54:53 +02:00
00b162608e Fix an issue with incorrect order of project "include" and "src" paths in `CPPPATH` // Resolve #1914 2019-03-23 17:45:28 +02:00
51cb87bc6f Bump version to 4.0.0a7 2019-03-23 17:44:30 +02:00
cae708f2d7 Fix an issue with incorrect order of project "include" and "src" paths in `CPPPATH` // Resolve #1914 2019-03-23 17:44:01 +02:00
e33b0fe291 Fix an issue when PlatformIO Build System does not pick up "mbed_lib.json" files from libraries // Resolve #2164 2019-03-23 14:33:34 +02:00
3a77bed0d4 Fix an issue when PlatformIO Build System does not pick up "mbed_lib.json" files from libraries // Resolve #2164 2019-03-23 14:24:43 +02:00
06fe557a20 Fix "SameFileError" when CI is used in pair with --keep-build-dir // Resolve #2227 2019-03-22 21:25:12 +02:00
10a7367b33 Fix "SameFileError" when CI is used in pair with --keep-build-dir // Resolve #2227 2019-03-22 21:24:43 +02:00
9c1cc97776 Project Generator: fixed a warning "Property !!! WARNING !!! is not allowed" for VSCode // Resolve #2243 2019-03-22 21:17:26 +02:00
dea6551841 Bump version to 4.0.0a6 2019-03-22 21:16:33 +02:00
4ca71e7df1 Project Generator: fixed a warning "Property !!! WARNING !!! is not allowed" for VSCode // Resolve #2243 2019-03-22 21:16:02 +02:00
d7e2d05f60 Fix error with conflicting declaration of a prototype (Arduino sketch preprocessor) 2019-03-20 18:58:06 +02:00
d9e6111ac3 Fix error with conflicting declaration of a prototype (Arduino sketch preprocessor) 2019-03-20 18:49:20 +02:00
86e4641101 Bump version to 3.6.6a1 2019-03-14 20:16:15 +02:00
326eb4a681 Fix "FileExistsError" when "platformio ci" command is used in pair with "--keep-build-dir" option 2019-03-14 20:14:55 +02:00
d4af985eb8 Bump version to 4.0.0a5 2019-03-14 20:12:52 +02:00
0d904ad1cc Fix "FileExistsError" when "platformio ci" command is used in pair with "--keep-build-dir" option 2019-03-14 20:12:18 +02:00
b3ac567b53 Docs: New article "Getting started with the STM32F407VG and STM32Cube" 2019-03-14 14:46:44 +02:00
03aec79cc1 Drop mbed OS support for Adafruit boards 2019-03-13 22:22:48 +02:00
9a6d148bdc Update docs for PIO Plus products 2019-03-08 18:01:35 +02:00
a9288e5a5b Docs: Cosmetic changes to PIO Unified Debugging and Unit Testing 2019-03-07 17:26:22 +02:00
8280fd557b Bump version to 4.0.0a4 2019-03-07 14:12:45 +02:00
15a1cbf95a Tag docs with v3.6.5 2019-03-07 14:09:50 +02:00
775797b4e5 Merge tag '3.6.5' into develop 2019-03-07 14:02:35 +02:00
4c8df44a5a Fix an issue when the `$PROJECT_HASH` template was not expanded for other directory `***_dir` options in "platformio.ini" // Resolve #2170 2019-03-07 12:40:55 +02:00
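Illustrative use of the "$PROJECT_HASH" template from the fix above; the target path is a placeholder (the same template appears in the Windows CI job further down):

    [platformio]
    ; $PROJECT_HASH expands to a hash unique to this project
    build_dir = /tmp/pio-workspace/$PROJECT_HASH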
3f52a6d5ba YAPF 0.26.0 2019-03-07 12:36:17 +02:00
237d55208c Fix "Unnecessary "else/elif" after "raise"" by PyLint 2019-03-07 12:35:34 +02:00
5a72e3f2a1 Update "dl.bintray.com" IP address 2019-03-06 22:39:24 +02:00
1ce913ea74 Sync docs 2019-03-06 21:04:01 +02:00
9911a04232 Sync docs 2019-03-06 12:29:03 +02:00
f1549e1f0e Docs: Sync new boards by STM32 2019-03-05 13:20:19 +02:00
4659a44bd6 New boards: Adafruit Bluefruit nRF52832 Feather, Feather nRF52840 Express, and Nordic nRF52840-DK (Adafruit BSP) 2019-03-04 13:50:34 +02:00
e39d4f5c32 Sync docs 2019-03-04 13:12:16 +02:00
fb9fe8c77c Document upload_protocol = espota option 2019-03-04 13:03:22 +02:00
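A hedged platformio.ini sketch for the "espota" (over-the-air) upload protocol documented above; the IP address is a placeholder:

    [env:esp32dev]
    platform = espressif32
    board = esp32dev
    framework = arduino
    upload_protocol = espota
    ; device IP address or mDNS host name
    upload_port = 192.168.0.42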
ab914e1566 PyLint fix 2019-02-24 11:45:14 +02:00
e0b9d080fa Project Generator: add new targets for CLion IDE "BUILD_VERBOSE" and "MONITOR" (serial port monitor) // Resolve #359 2019-02-23 22:34:36 +02:00
42c22758bb Use NodeMCUv2 as default board for wiring example 2019-02-23 21:04:43 +02:00
522b42c2a9 Fix an issue when platformio ci recompiles project if `--keep-build-dir` option is passed // Resolve #2109 2019-02-23 18:39:54 +02:00
5a707a4849 CI only for Python 2.7 & 3.6 2019-02-23 16:00:24 +02:00
1bed2650f3 Sync docs 2019-02-23 14:44:13 +02:00
65f30dd7b4 Sync docs 2019-02-23 01:10:41 +02:00
59b27d5d0a Document switching between Arduino cores 2019-02-21 22:53:33 +02:00
6575ddab26 Improve docs for pio ci command when multiple --project-option are used 2019-02-21 00:25:14 +02:00
dc4a5df2af Drop support for RFduino 2019-02-20 00:13:56 +02:00
2e4c9411af Docs: Update VSCode Custom Task example 2019-02-20 00:00:41 +02:00
e477d1fbcf Docs: Move images from root of _static to _static/images 2019-02-19 23:25:13 +02:00
6059e0dce6 Fix an issue with slow updating of PlatformIO Core packages on Windows 2019-02-19 20:12:07 +02:00
234bb75b9b Sync docs & examples 2019-02-19 18:11:21 +02:00
38dfa40e32 Docs: Sync Atmel SAM boards 2019-02-18 18:57:37 +02:00
5add5f1cfb Document lwIP profiles for ESP8266 2019-02-18 15:51:35 +02:00
f1b809f8dd Improve debugging instruction for ESP32 tutorial // Resolve #1871 2019-02-09 19:11:51 +02:00
47aa63fc04 Update docs for mbed framework 2019-02-07 20:36:45 +02:00
08000a6f62 Sync docs & examples 2019-02-07 19:01:51 +02:00
31d4706abd Debug: Update wiring scheme for Mini-Module 2019-02-02 19:21:12 +02:00
e23ca2c109 Docs: Allow to control firmware optimization for Teensy 2019-01-31 16:42:44 +02:00
17efa89047 Disable STM8 from CI tests 2019-01-30 18:00:40 +02:00
527efe3359 RISC-V GAP, document AutoTiler 2019-01-30 18:00:12 +02:00
5aed0efd61 Docs: Add Alorium Hinj board 2019-01-29 22:20:41 +02:00
27ac17e8c3 Release docs to 3.6.4 2019-01-23 21:24:29 +02:00
d2c545eb27 Bump version to 4.0.0a3 2019-01-23 20:55:18 +02:00
035ab14202 Merge tag 'v3.6.4' into develop
Bump version to 3.6.4

# Conflicts:
#	HISTORY.rst
#	platformio/__init__.py
2019-01-23 20:54:21 +02:00
eb57e14ac1 Fix "ValueError: invalid literal for int() with base 10" // Resolve #2061 2019-01-23 17:54:45 +02:00
3cc996d89f Fix "ValueError: invalid literal for int() with base 10" // Resolve #2058 2019-01-22 21:59:26 +02:00
d7d981f522 Add "Variable Format" section to VSCode 2019-01-22 21:05:38 +02:00
eedc1c3ccd Bump version to 4.0.0a2 2019-01-17 18:18:28 +02:00
63075c9607 CLion: Improve project portability using "${CMAKE_CURRENT_LIST_DIR}" instead of USER_HOME 2019-01-17 18:04:52 +02:00
b6ccda3568 Fix cmd_lib test 2019-01-17 17:56:21 +02:00
35f96a534a Drop "do-not-modify-files-here.url" from build_dir 2019-01-17 02:07:46 +02:00
7e8349d45e Docs: Sync Atmel AVR boards 2019-01-16 15:34:31 +02:00
88d06b4437 Ignore examples for ststm8 on Linux 2019-01-12 01:08:13 +02:00
3019e35724 Initial support for STM8 2019-01-11 16:52:01 +02:00
38bb2c61c1 Docs: Sort upload protocols for board 2019-01-11 14:35:54 +02:00
c7949ecd07 Temporarily disable checking for PlatformIO Core engine (allow PIO Core 3 dev/platforms for PIO Core 4) 2019-01-11 14:11:54 +02:00
2b467f3fee Fix PyLint warning 2019-01-11 14:07:35 +02:00
a750b06fc8 Fix PY3 Lint "consider-using-set-comprehension" 2019-01-11 13:01:53 +02:00
286a53991c Use GCC C++ compiler for Eclipse project indexer // Issue #1010 2019-01-11 12:52:00 +02:00
b92a8467c9 Update SCons to 3.0.3 2019-01-10 23:47:43 +02:00
a092f87c50 Eclipse: Provide language standard to a project C/C++ indexer // Resolve #1010 2019-01-10 21:57:39 +02:00
02937216b0 Fix "TypeError : startswith first arg" when checking udev rules with PY3 // Resolve #2000 2019-01-10 19:33:15 +02:00
e02d7528ad Fix an error "Could not extract item..." when extracting TAR archive with symbolic items on Windows platform // Resolve #2015 2019-01-10 19:14:03 +02:00
2ae41c8434 Fix "Runtime Error: Dictionary size changed during iteration" // Resolve #2003 2019-01-09 16:34:39 +02:00
aa2bc4a63b Implement "get_file_contents" helper 2018-12-27 14:48:22 +02:00
74218f4f93 Fix PyLint warning for Windows 2018-12-26 22:33:21 +02:00
a60c57ac58 Initial support for Python 3.5+ // Resolve #895 Resolve #1365 2018-12-26 20:54:29 +02:00
249 changed files with 26651 additions and 11032 deletions


@ -1,27 +0,0 @@
build: off
platform:
- x86
- x64
environment:
matrix:
- TOXENV: "py27"
install:
- cmd: git submodule update --init --recursive
- cmd: SET PATH=C:\MinGW\bin;%PATH%
- if %PLATFORM% == x64 SET PATH=C:\Python27-x64;C:\Python27-x64\Scripts;%PATH%
- if %PLATFORM% == x86 SET PATH=C:\Python27;C:\Python27\Scripts;%PATH%
- cmd: pip install tox
test_script:
- cmd: tox
notifications:
- provider: Slack
incoming_webhook:
secure: E9H0SU0Ju7WLDvgxsV8cs3J62T3nTTX7QkEjsczN0Sto/c9hWkVfhc5gGWUkxhlD975cokHByKGJIdwYwCewqOI+7BrcT8U+nlga4Uau7J8=
on_build_success: false
on_build_failure: true
on_build_status_changed: true

.github/FUNDING.yml (new file, 1 line changed)

@ -0,0 +1 @@
custom: https://platformio.org/donate

.github/workflows/core.yml (new file, 44 lines changed)

@ -0,0 +1,44 @@
name: Core
on: [push, pull_request]
jobs:
build:
strategy:
fail-fast: false
matrix:
os: [ubuntu-latest, windows-latest, macos-latest]
python-version: [2.7, 3.7, 3.8]
runs-on: ${{ matrix.os }}
steps:
- uses: actions/checkout@v2
with:
submodules: "recursive"
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v1
with:
python-version: ${{ matrix.python-version }}
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install tox
- name: Python Lint
run: |
tox -e lint
- name: Integration Tests
env:
TEST_EMAIL_LOGIN: ${{ secrets.TEST_EMAIL_LOGIN }}
TEST_EMAIL_PASSWORD: ${{ secrets.TEST_EMAIL_PASSWORD }}
TEST_EMAIL_IMAP_SERVER: ${{ secrets.TEST_EMAIL_IMAP_SERVER }}
run: |
tox -e testcore
- name: Slack Notification
uses: homoluctus/slatify@master
if: failure()
with:
type: ${{ job.status }}
job_name: '*Core*'
commit: true
url: ${{ secrets.SLACK_BUILD_WEBHOOK }}

.github/workflows/docs.yml (new file, 32 lines changed)

@ -0,0 +1,32 @@
name: Docs
on: [push, pull_request]
jobs:
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
with:
submodules: "recursive"
- name: Set up Python
uses: actions/setup-python@v1
with:
python-version: 3.7
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install tox
- name: Build docs
run: |
tox -e docs
- name: Slack Notification
uses: homoluctus/slatify@master
if: failure()
with:
type: ${{ job.status }}
job_name: '*Docs*'
commit: true
url: ${{ secrets.SLACK_BUILD_WEBHOOK }}

.github/workflows/examples.yml (new file, 64 lines changed)

@ -0,0 +1,64 @@
name: Examples
on: [push, pull_request]
jobs:
build:
strategy:
fail-fast: false
matrix:
os: [ubuntu-16.04, windows-latest, macos-latest]
python-version: [2.7, 3.7]
runs-on: ${{ matrix.os }}
steps:
- uses: actions/checkout@v2
with:
submodules: "recursive"
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v1
with:
python-version: ${{ matrix.python-version }}
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install tox
- name: Run on Linux
if: startsWith(matrix.os, 'ubuntu')
env:
PIO_INSTALL_DEVPLATFORMS_IGNORE: "ststm8,infineonxmc,siwigsm,intel_mcs51,aceinna_imu"
run: |
# ChipKIT issue: install 32-bit support for GCC PIC32
sudo apt-get install libc6-i386
# Free space
sudo apt clean
docker rmi $(docker image ls -aq)
df -h
# Run
tox -e testexamples
- name: Run on macOS
if: startsWith(matrix.os, 'macos')
env:
PIO_INSTALL_DEVPLATFORMS_IGNORE: "ststm8,infineonxmc,siwigsm,microchippic32,gd32v,nuclei,lattice_ice40"
run: |
df -h
tox -e testexamples
- name: Run on Windows
if: startsWith(matrix.os, 'windows')
env:
PLATFORMIO_CORE_DIR: C:/pio
PLATFORMIO_WORKSPACE_DIR: C:/pio-workspace/$PROJECT_HASH
PIO_INSTALL_DEVPLATFORMS_IGNORE: "ststm8,infineonxmc,siwigsm,riscv_gap"
run: |
tox -e testexamples
- name: Slack Notification
uses: homoluctus/slatify@master
if: failure()
with:
type: ${{ job.status }}
job_name: '*Examples*'
commit: true
url: ${{ secrets.SLACK_BUILD_WEBHOOK }}


@ -1,3 +1,3 @@
[settings]
line_length=79
known_third_party=bottle,click,pytest,requests,SCons,semantic_version,serial
line_length=88
known_third_party=OpenSSL, SCons, autobahn, jsonrpc, twisted, zope


@ -1,23 +1,22 @@
[REPORTS]
output-format=colorized
[MESSAGES CONTROL]
disable=
bad-continuation,
bad-whitespace,
missing-docstring,
ungrouped-imports,
invalid-name,
cyclic-import,
duplicate-code,
superfluous-parens,
too-few-public-methods,
useless-object-inheritance,
useless-import-alias,
fixme,
bad-option-value,
# Only show warnings with the listed confidence levels. Leave empty to show
# all. Valid levels: HIGH, INFERENCE, INFERENCE_FAILURE, UNDEFINED
confidence=
# Enable the message, report, category or checker with the given id(s). You can
# either give multiple identifier separated by comma (,) or put this option
# multiple time. See also the "--disable" option for examples.
#enable=
# Disable the message, report, category or checker with the given id(s). You
# can either give multiple identifiers separated by comma (,) or put this
# option multiple times (only on the command line, not in the configuration
# file where it should appear only once).You can also use "--disable=all" to
# disable everything first and then reenable specific checks. For example, if
# you want to run only the similarities checker, you can use "--disable=all
# --enable=similarities". If you want to run only the classes checker, but have
# no Warning level messages displayed, use"--disable=all --enable=classes
# --disable=W"
# disable=import-star-module-level,old-octal-literal,oct-method,print-statement,unpacking-in-except,parameter-unpacking,backtick,old-raise-syntax,old-ne-operator,long-suffix,dict-view-method,dict-iter-method,metaclass-assignment,next-method-called,raising-string,indexing-exception,raw_input-builtin,long-builtin,file-builtin,execfile-builtin,coerce-builtin,cmp-builtin,buffer-builtin,basestring-builtin,apply-builtin,filter-builtin-not-iterating,using-cmp-argument,useless-suppression,range-builtin-not-iterating,suppressed-message,no-absolute-import,old-division,cmp-method,reload-builtin,zip-builtin-not-iterating,intern-builtin,unichr-builtin,reduce-builtin,standarderror-builtin,unicode-builtin,xrange-builtin,coerce-method,delslice-method,getslice-method,setslice-method,input-builtin,round-builtin,hex-method,nonzero-method,map-builtin-not-iterating
disable=locally-disabled,missing-docstring,invalid-name,too-few-public-methods,redefined-variable-type,import-error,similarities,unsupported-membership-test,unsubscriptable-object,ungrouped-imports,cyclic-import,superfluous-parens
; PY2 Compat
super-with-arguments,
raise-missing-from

.readthedocs.yml (new file, 12 lines changed)

@ -0,0 +1,12 @@
# See https://docs.readthedocs.io/en/stable/config-file/index.html
version: 2
sphinx:
configuration: docs/conf.py
formats:
- pdf
submodules:
include: all


@ -1,3 +0,0 @@
[style]
blank_line_before_nested_class_or_def = true
allow_multiline_lambdas = true


@ -1,39 +0,0 @@
language: python
matrix:
include:
- os: linux
sudo: false
python: 2.7
env: TOX_ENV=docs
- os: linux
sudo: false
python: 2.7
env: TOX_ENV=lint
- os: linux
sudo: required
python: 2.7
env: TOX_ENV=py27
- os: osx
language: generic
env: TOX_ENV=skipexamples
install:
- git submodule update --init --recursive
- if [[ "$TRAVIS_OS_NAME" == "osx" ]]; then curl -fsSL https://bootstrap.pypa.io/get-pip.py | sudo python; fi
- if [[ "$TRAVIS_OS_NAME" == "osx" ]]; then sudo pip install "tox==3.0.0"; else pip install -U tox; fi
# ChipKIT issue: install 32-bit support for GCC PIC32
- if [[ "$TOX_ENV" == "py27" ]] && [[ "$TRAVIS_OS_NAME" == "linux" ]]; then sudo apt-get install libc6-i386; fi
script:
- tox -e $TOX_ENV
notifications:
email: false
slack:
rooms:
secure: JD6VGfN4+SLU2CwDdiIOr1VgwD+zbYUCE/srwyGuHavnjIkPItkl6T6Bn8Y4VrU6ysbuKotfdV2TAJJ82ivFbY8BvZBc7FBcYp/AGQ4FaCCV5ySv8RDAcQgdE12oaGzMdODiLqsB85f65zOlAFa+htaXyEiRTcotn6Y2hupatrI=
on_failure: always
on_success: change

.vscode/settings.json (15 lines changed)

@ -1,15 +0,0 @@
{
"python.pythonPath": "${workspaceRoot}/.tox/develop/bin/python",
"python.formatting.provider": "yapf",
"files.exclude": {
"**/*.pyc": true,
"*.egg-info": true,
".cache": true,
"build": true,
"dist": true
},
"editor.rulers": [79],
"restructuredtext.builtDocumentationPath": "${workspaceRoot}/docs/_build/html",
"restructuredtext.confPath": "${workspaceRoot}/docs",
"restructuredtext.linter.executablePath": "${workspaceRoot}/.tox/docs/bin/restructuredtext-lint"
}


@ -1,21 +1,21 @@
Contributing
------------
To get started, <a href="https://www.clahub.com/agreements/platformio/platformio-core">sign the Contributor License Agreement</a>.
To get started, <a href="https://cla-assistant.io/platformio/platformio-core">sign the Contributor License Agreement</a>.
1. Fork the repository on GitHub.
2. Make a branch off of ``develop``
3. Run ``pip install tox``
4. Go to the root of project where is located ``tox.ini`` and run ``tox -e develop``
2. Clone repository `git clone --recursive https://github.com/YourGithubUsername/platformio-core.git`
3. Run `pip install tox`
4. Go to the root of project where is located `tox.ini` and run `tox -e py37`
5. Activate current development environment:
* Windows: ``.tox\develop\Scripts\activate``
* Bash/ZSH: ``source .tox/develop/bin/activate``
* Fish: ``source .tox/bin/activate.fish``
* Windows: `.tox\py37\Scripts\activate`
* Bash/ZSH: `source .tox/py37/bin/activate`
* Fish: `source .tox/py37/bin/activate.fish`
6. Make changes to code, documentation, etc.
7. Lint source code ``tox -e lint``
8. Run the tests ``tox -e py27``
9. Build documentation ``tox -e docs`` (creates a directory _build under docs where you can find the html)
7. Lint source code `make before-commit`
8. Run the tests `make test`
9. Build documentation `tox -e docs` (creates a directory _build under docs where you can find the html)
10. Commit changes to your forked repository
11. Submit a Pull Request on GitHub.

File diff suppressed because it is too large

MANIFEST.in (new file, 1 line changed)

@ -0,0 +1 @@
include LICENSE


@ -1,18 +1,19 @@
lint:
pylint --rcfile=./.pylintrc ./platformio
pylint -j 6 --rcfile=./.pylintrc ./platformio
pylint -j 6 --rcfile=./.pylintrc ./tests
isort:
isort -rc ./platformio
isort -rc ./tests
yapf:
yapf --recursive --in-place platformio/
format:
black --target-version py27 ./platformio
black --target-version py27 ./tests
test:
py.test -v -s -n 3 --dist=loadscope tests --ignore tests/test_examples.py --ignore tests/test_pkgmanifest.py
py.test --verbose --capture=no --exitfirst -n 6 --dist=loadscope tests --ignore tests/test_examples.py
before-commit: isort yapf lint test
before-commit: isort format lint
clean-docs:
rm -rf docs/_build
@ -23,4 +24,15 @@ clean: clean-docs
rm -rf .cache
rm -rf build
rm -rf htmlcov
rm -f .coverage
rm -f .coverage
profile:
# Usage $ > make PIOARGS="boards" profile
python -m cProfile -o .tox/.tmp/cprofile.prof -m platformio ${PIOARGS}
snakeviz .tox/.tmp/cprofile.prof
pack:
python setup.py sdist
publish:
python setup.py sdist upload


@ -1,71 +1,71 @@
PlatformIO
==========
.. image:: https://travis-ci.org/platformio/platformio-core.svg?branch=develop
:target: https://travis-ci.org/platformio/platformio-core
:alt: Travis.CI Build Status
.. image:: https://ci.appveyor.com/api/projects/status/unnpw0n3c5k14btn/branch/develop?svg=true
:target: https://ci.appveyor.com/project/ivankravets/platformio-core
:alt: AppVeyor.CI Build Status
.. image:: https://github.com/platformio/platformio-core/workflows/Core/badge.svg
:target: https://docs.platformio.org/page/core/index.html
:alt: CI Build for PlatformIO Core
.. image:: https://github.com/platformio/platformio-core/workflows/Examples/badge.svg
:target: https://github.com/platformio/platformio-examples
:alt: CI Build for dev-platform examples
.. image:: https://github.com/platformio/platformio-core/workflows/Docs/badge.svg
:target: https://docs.platformio.org?utm_source=github&utm_medium=core
:alt: CI Build for Docs
.. image:: https://img.shields.io/pypi/v/platformio.svg
:target: https://pypi.python.org/pypi/platformio/
:alt: Latest Version
.. image:: https://img.shields.io/pypi/l/platformio.svg
.. image:: https://img.shields.io/badge/license-Apache%202.0-blue.svg
:target: https://pypi.python.org/pypi/platformio/
:alt: License
.. image:: https://img.shields.io/PlatformIO/Community.png
:alt: Community Forums
:target: https://community.platformio.org?utm_source=github&utm_medium=core
.. image:: https://img.shields.io/PIO/Plus.png?color=orange
:alt: PIO Plus: Professional solutions for an awesome open source PlatformIO ecosystem
:target: https://platformio.org/pricing?utm_source=github&utm_medium=core
.. image:: https://img.shields.io/badge/PlatformIO-Labs-orange.svg
:alt: Community Labs
:target: https://piolabs.com/?utm_source=github&utm_medium=core
**Quick Links:** `Web <https://platformio.org?utm_source=github&utm_medium=core>`_ |
`PIO Plus <https://platformio.org/pricing?utm_source=github&utm_medium=core>`_ |
`PlatformIO IDE <https://platformio.org/platformio-ide?utm_source=github&utm_medium=core>`_ |
`Project Examples <https://github.com/platformio/platformio-examples/>`_ |
`Project Examples <https://github.com/platformio/platformio-examples/>`__ |
`Docs <https://docs.platformio.org?utm_source=github&utm_medium=core>`_ |
`Donate <https://platformio.org/donate?utm_source=github&utm_medium=core>`_ |
`Contact Us <https://platformio.org/contact?utm_source=github&utm_medium=core>`_
`Contact Us <https://piolabs.com/?utm_source=github&utm_medium=core>`_
**Social:** `Twitter <https://twitter.com/PlatformIO_Org>`_ |
`LinkedIn <https://www.linkedin.com/company/platformio/>`_ |
**Social:** `LinkedIn <https://www.linkedin.com/company/platformio/>`_ |
`Twitter <https://twitter.com/PlatformIO_Org>`_ |
`Facebook <https://www.facebook.com/platformio>`_ |
`Hackaday <https://hackaday.io/project/7980-platformio>`_ |
`Bintray <https://bintray.com/platformio>`_ |
`Community <https://community.platformio.org?utm_source=github&utm_medium=core>`_
`Community Forums <https://community.platformio.org?utm_source=github&utm_medium=core>`_
.. image:: https://raw.githubusercontent.com/platformio/platformio-web/develop/app/images/platformio-ide-laptop.png
:target: https://platformio.org?utm_source=github&utm_medium=core
`PlatformIO <https://platformio.org?utm_source=github&utm_medium=core>`_ is an open source ecosystem for IoT
development. Cross-platform IDE and unified debugger. Remote unit testing and
firmware updates.
`PlatformIO <https://platformio.org?utm_source=github&utm_medium=core>`_ is a professional collaborative platform for embedded development
**A place where Developers and Teams have true Freedom! No more vendor lock-in!**
* Open source, maximum permissive Apache 2.0 license
* Cross-platform IDE and Unified Debugger
* Static Code Analyzer and Remote Unit Testing
* Multi-platform and Multi-architecture Build System
* Firmware File Explorer and Memory Inspection.
Get Started
-----------
* `What is PlatformIO? <https://docs.platformio.org/en/latest/what-is-platformio.html?utm_source=github&utm_medium=core>`_
Open Source
-----------
* `What is PlatformIO? <https://docs.platformio.org/page/what-is-platformio.html?utm_source=github&utm_medium=core>`_
* `PlatformIO IDE <https://platformio.org/platformio-ide?utm_source=github&utm_medium=core>`_
* `PlatformIO Core (CLI) <https://docs.platformio.org/en/latest/core.html?utm_source=github&utm_medium=core>`_
* `PlatformIO Core (CLI) <https://docs.platformio.org/page/core.html?utm_source=github&utm_medium=core>`_
* `Project Examples <https://github.com/platformio/platformio-examples?utm_source=github&utm_medium=core>`__
Solutions
---------
* `Library Management <https://docs.platformio.org/page/librarymanager/index.html?utm_source=github&utm_medium=core>`_
* `Project Examples <https://github.com/platformio/platformio-examples?utm_source=github&utm_medium=core>`_
* `Desktop IDEs Integration <https://docs.platformio.org/page/ide.html?utm_source=github&utm_medium=core>`_
* `Continuous Integration <https://docs.platformio.org/page/ci/index.html?utm_source=github&utm_medium=core>`_
* `Advanced Scripting API <https://docs.platformio.org/page/projectconf/advanced_scripting.html?utm_source=github&utm_medium=core>`_
PIO Plus
--------
**Advanced**
* `PIO Remote <https://docs.platformio.org/page/plus/pio-remote.html?utm_source=github&utm_medium=core>`_
* `PIO Unified Debugger <https://docs.platformio.org/page/plus/debugging.html?utm_source=github&utm_medium=core>`_
* `PIO Unit Testing <https://docs.platformio.org/en/latest/plus/unit-testing.html?utm_source=github&utm_medium=core>`_
* `Cloud IDEs Integration <https://docs.platformio.org/en/latest/ide.html?utm_source=github&utm_medium=core#solution-pio-delivery>`_
* `Integration Services <https://platformio.org/pricing?utm_source=github&utm_medium=core#enterprise-features>`_
* `Debugging <https://docs.platformio.org/page/plus/debugging.html?utm_source=github&utm_medium=core>`_
* `Unit Testing <https://docs.platformio.org/page/plus/unit-testing.html?utm_source=github&utm_medium=core>`_
* `Static Code Analysis <https://docs.platformio.org/page/plus/pio-check.html?utm_source=github&utm_medium=core>`_
* `Remote Development <https://docs.platformio.org/page/plus/pio-remote.html?utm_source=github&utm_medium=core>`_
Registry
--------
@ -78,6 +78,8 @@ Registry
Development Platforms
---------------------
* `Aceinna IMU <https://platformio.org/platforms/aceinna_imu?utm_source=github&utm_medium=core>`_
* `ASR Microelectronics ASR605x <https://platformio.org/platforms/asrmicro650x?utm_source=github&utm_medium=core>`_
* `Atmel AVR <https://platformio.org/platforms/atmelavr?utm_source=github&utm_medium=core>`_
* `Atmel SAM <https://platformio.org/platforms/atmelsam?utm_source=github&utm_medium=core>`_
* `Espressif 32 <https://platformio.org/platforms/espressif32?utm_source=github&utm_medium=core>`_
@ -86,16 +88,20 @@ Development Platforms
* `Infineon XMC <https://platformio.org/platforms/infineonxmc?utm_source=github&utm_medium=core>`_
* `Intel ARC32 <https://platformio.org/platforms/intel_arc32?utm_source=github&utm_medium=core>`_
* `Intel MCS-51 (8051) <https://platformio.org/platforms/intel_mcs51?utm_source=github&utm_medium=core>`_
* `Kendryte K210 <https://platformio.org/platforms/kendryte210?utm_source=github&utm_medium=core>`_
* `Lattice iCE40 <https://platformio.org/platforms/lattice_ice40?utm_source=github&utm_medium=core>`_
* `Maxim 32 <https://platformio.org/platforms/maxim32?utm_source=github&utm_medium=core>`_
* `Microchip PIC32 <https://platformio.org/platforms/microchippic32?utm_source=github&utm_medium=core>`_
* `Nordic nRF51 <https://platformio.org/platforms/nordicnrf51?utm_source=github&utm_medium=core>`_
* `Nordic nRF52 <https://platformio.org/platforms/nordicnrf52?utm_source=github&utm_medium=core>`_
* `Nuclei <https://platformio.org/platforms/nuclei?utm_source=github&utm_medium=core>`_
* `NXP LPC <https://platformio.org/platforms/nxplpc?utm_source=github&utm_medium=core>`_
* `RISC-V <https://platformio.org/platforms/riscv?utm_source=github&utm_medium=core>`_
* `Samsung ARTIK <https://platformio.org/platforms/samsung_artik?utm_source=github&utm_medium=core>`_
* `RISC-V GAP <https://platformio.org/platforms/riscv_gap?utm_source=github&utm_medium=core>`_
* `Shakti <https://platformio.org/platforms/shakti?utm_source=github&utm_medium=core>`_
* `Silicon Labs EFM32 <https://platformio.org/platforms/siliconlabsefm32?utm_source=github&utm_medium=core>`_
* `ST STM32 <https://platformio.org/platforms/ststm32?utm_source=github&utm_medium=core>`_
* `ST STM8 <https://platformio.org/platforms/ststm8?utm_source=github&utm_medium=core>`_
* `Teensy <https://platformio.org/platforms/teensy?utm_source=github&utm_medium=core>`_
* `TI MSP430 <https://platformio.org/platforms/timsp430?utm_source=github&utm_medium=core>`_
* `TI Tiva <https://platformio.org/platforms/titiva?utm_source=github&utm_medium=core>`_
@ -105,26 +111,39 @@ Frameworks
----------
* `Arduino <https://platformio.org/frameworks/arduino?utm_source=github&utm_medium=core>`_
* `ARTIK SDK <https://platformio.org/frameworks/artik-sdk?utm_source=github&utm_medium=core>`_
* `CMSIS <https://platformio.org/frameworks/cmsis?utm_source=github&utm_medium=core>`_
* `Energia <https://platformio.org/frameworks/energia?utm_source=github&utm_medium=core>`_
* `ESP-IDF <https://platformio.org/frameworks/espidf?utm_source=github&utm_medium=core>`_
* `ESP8266 Non-OS SDK <https://platformio.org/frameworks/esp8266-nonos-sdk?utm_source=github&utm_medium=core>`_
* `ESP8266 RTOS SDK <https://platformio.org/frameworks/esp8266-rtos-sdk?utm_source=github&utm_medium=core>`_
* `Freedom E SDK <https://platformio.org/frameworks/freedom-e-sdk?utm_source=github&utm_medium=core>`_
* `GigaDevice GD32V SDK <https://platformio.org/frameworks/gd32vf103-sdk?utm_source=github&utm_medium=core>`_
* `Kendryte Standalone SDK <https://platformio.org/frameworks/kendryte-standalone-sdk?utm_source=github&utm_medium=core>`_
* `Kendryte FreeRTOS SDK <https://platformio.org/frameworks/kendryte-freertos-sdk?utm_source=github&utm_medium=core>`_
* `libOpenCM3 <https://platformio.org/frameworks/libopencm3?utm_source=github&utm_medium=core>`_
* `mbed <https://platformio.org/frameworks/mbed?utm_source=github&utm_medium=core>`_
* `Mbed <https://platformio.org/frameworks/mbed?utm_source=github&utm_medium=core>`_
* `Nuclei SDK <https://platformio.org/frameworks/nuclei-sdk?utm_source=github&utm_medium=core>`_
* `PULP OS <https://platformio.org/frameworks/pulp-os?utm_source=github&utm_medium=core>`_
* `Pumbaa <https://platformio.org/frameworks/pumbaa?utm_source=github&utm_medium=core>`_
* `Shakti SDK <https://platformio.org/frameworks/shakti-sdk?utm_source=github&utm_medium=core>`_
* `Simba <https://platformio.org/frameworks/simba?utm_source=github&utm_medium=core>`_
* `SPL <https://platformio.org/frameworks/spl?utm_source=github&utm_medium=core>`_
* `STM32Cube <https://platformio.org/frameworks/stm32cube?utm_source=github&utm_medium=core>`_
* `Tizen RT <https://platformio.org/frameworks/tizenrt?utm_source=github&utm_medium=core>`_
* `WiringPi <https://platformio.org/frameworks/wiringpi?utm_source=github&utm_medium=core>`_
* `Zephyr <https://platformio.org/frameworks/zephyr?utm_source=github&utm_medium=core>`_
Contributing
------------
See `contributing guidelines <https://github.com/platformio/platformio/blob/develop/CONTRIBUTING.md>`_.
Telemetry / Privacy Policy
--------------------------
Share minimal diagnostics and usage information to help us make PlatformIO better.
It is enabled by default. For more information see:
* `Telemetry Setting <https://docs.platformio.org/page/userguide/cmd_settings.html?utm_source=github&utm_medium=core#enable-telemetry>`_
License
-------

docs (submodule updated: 768ccfd2b4...deae09a880)


@ -14,16 +14,21 @@
import sys
VERSION = (3, 6, 5)
VERSION = (5, 0, 2)
__version__ = ".".join([str(s) for s in VERSION])
__title__ = "platformio"
__description__ = (
"An open source ecosystem for IoT development. "
"Cross-platform IDE and unified debugger. "
"Remote unit testing and firmware updates. "
"Arduino, ARM mbed, Espressif (ESP8266/ESP32), STM32, PIC32, nRF51/nRF52, "
"FPGA, CMSIS, SPL, AVR, Samsung ARTIK, libOpenCM3")
"A professional collaborative platform for embedded development. "
"Cross-platform IDE and Unified Debugger. "
"Static Code Analyzer and Remote Unit Testing. "
"Multi-platform and Multi-architecture Build System. "
"Firmware File Explorer and Memory Inspection. "
"IoT, Arduino, CMSIS, ESP-IDF, FreeRTOS, libOpenCM3, mbedOS, Pulp OS, SPL, "
"STM32Cube, Zephyr RTOS, ARM, AVR, Espressif (ESP8266/ESP32), FPGA, "
"MCS-51 (8051), MSP430, Nordic (nRF51/nRF52), NXP i.MX RT, PIC32, RISC-V, "
"STMicroelectronics (STM8/STM32), Teensy"
)
__url__ = "https://platformio.org"
__author__ = "PlatformIO"
@ -32,11 +37,28 @@ __email__ = "contact@platformio.org"
__license__ = "Apache Software License"
__copyright__ = "Copyright 2014-present PlatformIO"
__apiurl__ = "https://api.platformio.org"
__accounts_api__ = "https://api.accounts.platformio.org"
__registry_api__ = [
"https://api.registry.platformio.org",
"https://api.registry.ns1.platformio.org",
]
__pioremote_endpoint__ = "ssl:host=remote.platformio.org:port=4413"
if sys.version_info < (2, 7, 0) or sys.version_info >= (3, 0, 0):
msg = ("PlatformIO Core v%s does not run under Python version %s.\n"
"Minimum supported version is 2.7, please upgrade Python.\n"
"Python 3 is not yet supported.\n")
sys.stderr.write(msg % (__version__, sys.version))
sys.exit(1)
__default_requests_timeout__ = (10, None) # (connect, read)
__core_packages__ = {
"contrib-piohome": "~3.3.1",
"contrib-pysite": "~2.%d%d.0" % (sys.version_info.major, sys.version_info.minor),
"tool-unity": "~1.20500.0",
"tool-scons": "~2.20501.7" if sys.version_info.major == 2 else "~4.40001.0",
"tool-cppcheck": "~1.210.0",
"tool-clangtidy": "~1.100000.0",
"tool-pvs-studio": "~7.9.0",
}
__check_internet_hosts__ = [
"185.199.110.153", # Github.com
"88.198.170.159", # platformio.org
"github.com",
"platformio.org",
]


@ -14,92 +14,75 @@
import os
import sys
from os.path import join
from platform import system
from traceback import format_exc
import click
from platformio import __version__, exception, maintenance
from platformio.util import get_source_dir
from platformio import __version__, exception, maintenance, util
from platformio.commands import PlatformioCLI
from platformio.compat import CYGWIN
try:
import click_completion # pylint: disable=import-error
class PlatformioCLI(click.MultiCommand): # pylint: disable=R0904
def list_commands(self, ctx):
cmds = []
for filename in os.listdir(join(get_source_dir(), "commands")):
if filename.startswith("__init__"):
continue
if filename.endswith(".py"):
cmds.append(filename[:-3])
cmds.sort()
return cmds
def get_command(self, ctx, cmd_name):
mod = None
try:
mod = __import__("platformio.commands." + cmd_name, None, None,
["cli"])
except ImportError:
try:
return self._handle_obsolate_command(cmd_name)
except AttributeError:
raise click.UsageError('No such command "%s"' % cmd_name, ctx)
return mod.cli
@staticmethod
def _handle_obsolate_command(name):
if name == "platforms":
from platformio.commands import platform
return platform.cli
elif name == "serialports":
from platformio.commands import device
return device.cli
raise AttributeError()
click_completion.init()
except: # pylint: disable=bare-except
pass
@click.command(
cls=PlatformioCLI,
context_settings=dict(help_option_names=["-h", "--help"]))
cls=PlatformioCLI, context_settings=dict(help_option_names=["-h", "--help"])
)
@click.version_option(__version__, prog_name="PlatformIO")
@click.option(
"--force",
"-f",
is_flag=True,
help="Force to accept any confirmation prompts.")
@click.option("--caller", "-c", help="Caller ID (service).")
@click.option("--force", "-f", is_flag=True, help="DEPRECATE")
@click.option("--caller", "-c", help="Caller ID (service)")
@click.option("--no-ansi", is_flag=True, help="Do not print ANSI control characters")
@click.pass_context
def cli(ctx, force, caller):
def cli(ctx, force, caller, no_ansi):
try:
if (
no_ansi
or str(
os.getenv("PLATFORMIO_NO_ANSI", os.getenv("PLATFORMIO_DISABLE_COLOR"))
).lower()
== "true"
):
# pylint: disable=protected-access
click._compat.isatty = lambda stream: False
elif (
str(
os.getenv("PLATFORMIO_FORCE_ANSI", os.getenv("PLATFORMIO_FORCE_COLOR"))
).lower()
== "true"
):
# pylint: disable=protected-access
click._compat.isatty = lambda stream: True
except: # pylint: disable=bare-except
pass
maintenance.on_platformio_start(ctx, force, caller)
@cli.resultcallback()
@click.pass_context
def process_result(ctx, result, force, caller): # pylint: disable=W0613
def process_result(ctx, result, *_, **__):
maintenance.on_platformio_end(ctx, result)
@util.memoized()
def configure():
if "cygwin" in system().lower():
if CYGWIN:
raise exception.CygwinEnvDetected()
# https://urllib3.readthedocs.org
# /en/latest/security.html#insecureplatformwarning
try:
import urllib3
import urllib3 # pylint: disable=import-outside-toplevel
urllib3.disable_warnings()
except (AttributeError, ImportError):
pass
# handle PLATFORMIO_FORCE_COLOR
if str(os.getenv("PLATFORMIO_FORCE_COLOR", "")).lower() == "true":
try:
# pylint: disable=protected-access
click._compat.isatty = lambda stream: True
except: # pylint: disable=bare-except
pass
# Handle IOError issue with VSCode's Terminal (Windows)
click_echo_origin = [click.echo, click.secho]
@ -108,16 +91,25 @@ def configure():
click_echo_origin[origin](*args, **kwargs)
except IOError:
(sys.stderr.write if kwargs.get("err") else sys.stdout.write)(
"%s\n" % (args[0] if args else ""))
"%s\n" % (args[0] if args else "")
)
click.echo = lambda *args, **kwargs: _safe_echo(0, *args, **kwargs)
click.secho = lambda *args, **kwargs: _safe_echo(1, *args, **kwargs)
def main():
def main(argv=None):
exit_code = 0
prev_sys_argv = sys.argv[:]
if argv:
assert isinstance(argv, list)
sys.argv = argv
try:
configure()
cli(None, None, None)
cli() # pylint: disable=no-value-for-parameter
except SystemExit as e:
if e.code and str(e.code).isdigit():
exit_code = int(e.code)
except Exception as e: # pylint: disable=broad-except
if not isinstance(e, exception.ReturnErrorCode):
maintenance.on_platformio_exception(e)
@ -143,13 +135,13 @@ An unexpected error occurred. Further steps:
============================================================
"""
click.secho(error_str, fg="red", err=True)
return int(str(e)) if str(e).isdigit() else 1
return 0
exit_code = int(str(e)) if str(e).isdigit() else 1
sys.argv = prev_sys_argv
return exit_code
def debug_gdb_main():
sys.argv = [sys.argv[0], "debug", "--interface", "gdb"] + sys.argv[1:]
return main()
return main([sys.argv[0], "debug", "--interface", "gdb"] + sys.argv[1:])
if __name__ == "__main__":


@ -12,110 +12,114 @@
# See the License for the specific language governing permissions and
# limitations under the License.
import codecs
from __future__ import absolute_import
import getpass
import hashlib
import json
import os
import platform
import socket
import uuid
from copy import deepcopy
from os import environ, getenv, listdir, remove
from os.path import abspath, dirname, expanduser, isdir, isfile, join
from time import time
from os.path import dirname, isdir, isfile, join, realpath
import requests
from platformio import exception, lockfile, util
from platformio import __version__, exception, fs, proc
from platformio.compat import WINDOWS, dump_json_to_unicode, hashlib_encode_data
from platformio.package.lockfile import LockFile
from platformio.project.helpers import get_default_projects_dir, get_project_core_dir
def projects_dir_validate(projects_dir):
assert isdir(projects_dir)
return abspath(projects_dir)
return realpath(projects_dir)
DEFAULT_SETTINGS = {
"auto_update_libraries": {
"description": "Automatically update libraries (Yes/No)",
"value": False
"value": False,
},
"auto_update_platforms": {
"description": "Automatically update platforms (Yes/No)",
"value": False
"value": False,
},
"check_libraries_interval": {
"description": "Check for the library updates interval (days)",
"value": 7
"value": 7,
},
"check_platformio_interval": {
"description": "Check for the new PlatformIO interval (days)",
"value": 3
"value": 3,
},
"check_platforms_interval": {
"description": "Check for the platform updates interval (days)",
"value": 7
"value": 7,
},
"enable_cache": {
"description": "Enable caching for API requests and Library Manager",
"value": True
},
"enable_ssl": {
"description": "Enable SSL for PlatformIO Services",
"value": False
"description": "Enable caching for HTTP API requests",
"value": True,
},
"enable_telemetry": {
"description":
("Telemetry service <https://docs.platformio.org/page/"
"userguide/cmd_settings.html?#enable-telemetry> (Yes/No)"),
"value":
True
"description": ("Telemetry service <http://bit.ly/pio-telemetry> (Yes/No)"),
"value": True,
},
"force_verbose": {
"description": "Force verbose output when processing environments",
"value": False
"value": False,
},
"projects_dir": {
"description": "Default location for PlatformIO projects (PIO Home)",
"value": join(expanduser("~"), "Documents", "PlatformIO", "Projects"),
"validator": projects_dir_validate
"description": "Default location for PlatformIO projects (PlatformIO Home)",
"value": get_default_projects_dir(),
"validator": projects_dir_validate,
},
}
SESSION_VARS = {"command_ctx": None, "force_option": False, "caller_id": None}
SESSION_VARS = {
"command_ctx": None,
"force_option": False,
"caller_id": None,
"custom_project_conf": None,
}
class State(object):
def __init__(self, path=None, lock=False):
self.path = path
self.lock = lock
if not self.path:
self.path = join(util.get_home_dir(), "appstate.json")
self._state = {}
self._prev_state = {}
self.path = join(get_project_core_dir(), "appstate.json")
self._storage = {}
self._lockfile = None
self.modified = False
def __enter__(self):
try:
self._lock_state_file()
if isfile(self.path):
self._state = util.load_json(self.path)
except exception.PlatformioException:
self._state = {}
self._prev_state = deepcopy(self._state)
return self._state
self._storage = fs.load_json(self.path)
assert isinstance(self._storage, dict)
except (
AssertionError,
ValueError,
UnicodeDecodeError,
exception.InvalidJSONFile,
):
self._storage = {}
return self
def __exit__(self, type_, value, traceback):
if self._prev_state != self._state:
if self.modified:
try:
with codecs.open(self.path, "w", encoding="utf8") as fp:
json.dump(self._state, fp)
with open(self.path, "w") as fp:
fp.write(dump_json_to_unicode(self._storage))
except IOError:
raise exception.HomeDirPermissionsError(util.get_home_dir())
raise exception.HomeDirPermissionsError(get_project_core_dir())
self._unlock_state_file()
def _lock_state_file(self):
if not self.lock:
return
self._lockfile = lockfile.LockFile(self.path)
self._lockfile = LockFile(self.path)
try:
self._lockfile.acquire()
except IOError:
@ -128,137 +132,37 @@ class State(object):
def __del__(self):
self._unlock_state_file()
# Dictionary Proxy
class ContentCache(object):
def as_dict(self):
return self._storage
def __init__(self, cache_dir=None):
self.cache_dir = None
self._db_path = None
self._lockfile = None
def keys(self):
return self._storage.keys()
self.cache_dir = cache_dir or util.get_cache_dir()
self._db_path = join(self.cache_dir, "db.data")
def get(self, key, default=True):
return self._storage.get(key, default)
def __enter__(self):
self.delete()
return self
def update(self, *args, **kwargs):
self.modified = True
return self._storage.update(*args, **kwargs)
def __exit__(self, type_, value, traceback):
pass
def clear(self):
return self._storage.clear()
def _lock_dbindex(self):
if not self.cache_dir:
os.makedirs(self.cache_dir)
self._lockfile = lockfile.LockFile(self.cache_dir)
try:
self._lockfile.acquire()
except: # pylint: disable=bare-except
return False
def __getitem__(self, key):
return self._storage[key]
return True
def __setitem__(self, key, value):
self.modified = True
self._storage[key] = value
def _unlock_dbindex(self):
if self._lockfile:
self._lockfile.release()
return True
def __delitem__(self, key):
self.modified = True
del self._storage[key]
def get_cache_path(self, key):
assert len(key) > 3
return join(self.cache_dir, key[-2:], key)
@staticmethod
def key_from_args(*args):
h = hashlib.md5()
for data in args:
h.update(str(data))
return h.hexdigest()
def get(self, key):
cache_path = self.get_cache_path(key)
if not isfile(cache_path):
return None
with codecs.open(cache_path, "rb", encoding="utf8") as fp:
return fp.read()
def set(self, key, data, valid):
if not get_setting("enable_cache"):
return False
cache_path = self.get_cache_path(key)
if isfile(cache_path):
self.delete(key)
if not data:
return False
if not isdir(self.cache_dir):
os.makedirs(self.cache_dir)
tdmap = {"s": 1, "m": 60, "h": 3600, "d": 86400}
assert valid.endswith(tuple(tdmap.keys()))
expire_time = int(time() + tdmap[valid[-1]] * int(valid[:-1]))
if not self._lock_dbindex():
return False
if not isdir(dirname(cache_path)):
os.makedirs(dirname(cache_path))
try:
with codecs.open(cache_path, "wb", encoding="utf8") as fp:
fp.write(data)
with open(self._db_path, "a") as fp:
fp.write("%s=%s\n" % (str(expire_time), cache_path))
except UnicodeError:
if isfile(cache_path):
try:
remove(cache_path)
except OSError:
pass
return self._unlock_dbindex()
def delete(self, keys=None):
""" Keys=None, delete expired items """
if not isfile(self._db_path):
return None
if not keys:
keys = []
if not isinstance(keys, list):
keys = [keys]
paths_for_delete = [self.get_cache_path(k) for k in keys]
found = False
newlines = []
with open(self._db_path) as fp:
for line in fp.readlines():
line = line.strip()
if "=" not in line:
continue
expire, path = line.split("=")
if time() < int(expire) and isfile(path) and \
path not in paths_for_delete:
newlines.append(line)
continue
found = True
if isfile(path):
try:
remove(path)
if not listdir(dirname(path)):
util.rmtree_(dirname(path))
except OSError:
pass
if found and self._lock_dbindex():
with open(self._db_path, "w") as fp:
fp.write("\n".join(newlines) + "\n")
self._unlock_dbindex()
return True
def clean(self):
if not self.cache_dir or not isdir(self.cache_dir):
return
util.rmtree_(self.cache_dir)
def clean_cache():
with ContentCache() as cc:
cc.clean()
def __contains__(self, item):
return item in self._storage
def sanitize_setting(name, value):
@ -268,11 +172,11 @@ def sanitize_setting(name, value):
defdata = DEFAULT_SETTINGS[name]
try:
if "validator" in defdata:
value = defdata['validator'](value)
elif isinstance(defdata['value'], bool):
value = defdata["validator"](value)
elif isinstance(defdata["value"], bool):
if not isinstance(value, bool):
value = str(value).lower() in ("true", "yes", "y", "1")
elif isinstance(defdata['value'], int):
elif isinstance(defdata["value"], int):
value = int(value)
except Exception:
raise exception.InvalidSettingValue(value, name)
@ -280,44 +184,46 @@ def sanitize_setting(name, value):
def get_state_item(name, default=None):
with State() as data:
return data.get(name, default)
with State() as state:
return state.get(name, default)
def set_state_item(name, value):
with State(lock=True) as data:
data[name] = value
with State(lock=True) as state:
state[name] = value
state.modified = True
def delete_state_item(name):
with State(lock=True) as data:
if name in data:
del data[name]
with State(lock=True) as state:
if name in state:
del state[name]
def get_setting(name):
_env_name = "PLATFORMIO_SETTING_%s" % name.upper()
if _env_name in environ:
return sanitize_setting(name, getenv(_env_name))
if _env_name in os.environ:
return sanitize_setting(name, os.getenv(_env_name))
with State() as data:
if "settings" in data and name in data['settings']:
return data['settings'][name]
with State() as state:
if "settings" in state and name in state["settings"]:
return state["settings"][name]
return DEFAULT_SETTINGS[name]['value']
return DEFAULT_SETTINGS[name]["value"]
def set_setting(name, value):
with State(lock=True) as data:
if "settings" not in data:
data['settings'] = {}
data['settings'][name] = sanitize_setting(name, value)
with State(lock=True) as state:
if "settings" not in state:
state["settings"] = {}
state["settings"][name] = sanitize_setting(name, value)
state.modified = True
def reset_settings():
with State(lock=True) as data:
if "settings" in data:
del data['settings']
with State(lock=True) as state:
if "settings" in state:
del state["settings"]
def get_session_var(name, default=None):
@ -330,30 +236,70 @@ def set_session_var(name, value):
def is_disabled_progressbar():
return any([
get_session_var("force_option"),
util.is_ci(),
getenv("PLATFORMIO_DISABLE_PROGRESSBAR") == "true"
])
return any(
[
get_session_var("force_option"),
proc.is_ci(),
os.getenv("PLATFORMIO_DISABLE_PROGRESSBAR") == "true",
]
)
def get_cid():
# pylint: disable=import-outside-toplevel
from platformio.clients.http import fetch_remote_content
cid = get_state_item("cid")
if not cid:
_uid = None
if getenv("C9_UID"):
_uid = getenv("C9_UID")
elif getenv("CHE_API", getenv("CHE_API_ENDPOINT")):
try:
_uid = requests.get("{api}/user?token={token}".format(
api=getenv("CHE_API", getenv("CHE_API_ENDPOINT")),
token=getenv("USER_TOKEN"))).json().get("id")
except: # pylint: disable=bare-except
pass
cid = str(
uuid.UUID(
bytes=hashlib.md5(str(
_uid if _uid else uuid.getnode())).digest()))
if "windows" in util.get_systype() or os.getuid() > 0:
set_state_item("cid", cid)
if cid:
return cid
uid = None
if os.getenv("C9_UID"):
uid = os.getenv("C9_UID")
elif os.getenv("CHE_API", os.getenv("CHE_API_ENDPOINT")):
try:
uid = json.loads(
fetch_remote_content(
"{api}/user?token={token}".format(
api=os.getenv("CHE_API", os.getenv("CHE_API_ENDPOINT")),
token=os.getenv("USER_TOKEN"),
)
)
).get("id")
except: # pylint: disable=bare-except
pass
if not uid:
uid = uuid.getnode()
cid = uuid.UUID(bytes=hashlib.md5(hashlib_encode_data(uid)).digest())
cid = str(cid)
if WINDOWS or os.getuid() > 0: # pylint: disable=no-member
set_state_item("cid", cid)
return cid
def get_user_agent():
data = [
"PlatformIO/%s" % __version__,
"CI/%d" % int(proc.is_ci()),
"Container/%d" % int(proc.is_container()),
]
if get_session_var("caller_id"):
data.append("Caller/%s" % get_session_var("caller_id"))
if os.getenv("PLATFORMIO_IDE"):
data.append("IDE/%s" % os.getenv("PLATFORMIO_IDE"))
data.append("Python/%s" % platform.python_version())
data.append("Platform/%s" % platform.platform())
return " ".join(data)
def get_host_id():
h = hashlib.sha1(hashlib_encode_data(get_cid()))
try:
username = getpass.getuser()
h.update(hashlib_encode_data(username))
except: # pylint: disable=bare-except
pass
return h.hexdigest()
def get_host_name():
return str(socket.gethostname())[:255]


@ -12,156 +12,157 @@
# See the License for the specific language governing permissions and
# limitations under the License.
import base64
import json
import sys
from os import environ
from os.path import expanduser, join
from os import environ, makedirs
from os.path import isdir, join
from time import time
from SCons.Script import (ARGUMENTS, COMMAND_LINE_TARGETS, DEFAULT_TARGETS,
AllowSubstExceptions, AlwaysBuild, Default,
DefaultEnvironment, Variables)
import click
from SCons.Script import ARGUMENTS # pylint: disable=import-error
from SCons.Script import COMMAND_LINE_TARGETS # pylint: disable=import-error
from SCons.Script import DEFAULT_TARGETS # pylint: disable=import-error
from SCons.Script import AllowSubstExceptions # pylint: disable=import-error
from SCons.Script import AlwaysBuild # pylint: disable=import-error
from SCons.Script import Default # pylint: disable=import-error
from SCons.Script import DefaultEnvironment # pylint: disable=import-error
from SCons.Script import Import # pylint: disable=import-error
from SCons.Script import Variables # pylint: disable=import-error
from platformio import util
from platformio import compat, fs
from platformio.compat import dump_json_to_unicode
from platformio.platform.base import PlatformBase
from platformio.proc import get_pythonexe_path
from platformio.project.helpers import get_project_dir
AllowSubstExceptions(NameError)
# allow common variables from INI file
commonvars = Variables(None)
commonvars.AddVariables(
# append CLI arguments to build environment
clivars = Variables(None)
clivars.AddVariables(
("PLATFORM_MANIFEST",),
("BUILD_SCRIPT",),
("EXTRA_SCRIPTS",),
("PROJECT_CONFIG",),
("PIOENV",),
("PIOTEST",),
("PIOPLATFORM",),
("PIOFRAMEWORK",),
# build options
("BUILD_FLAGS",),
("SRC_BUILD_FLAGS",),
("BUILD_UNFLAGS",),
("SRC_FILTER",),
# library options
("LIB_LDF_MODE",),
("LIB_COMPAT_MODE",),
("LIB_DEPS",),
("LIB_IGNORE",),
("LIB_EXTRA_DIRS",),
("LIB_ARCHIVE",),
# board options
("BOARD",),
# deprecated options, use board_{object.path} instead
("BOARD_MCU",),
("BOARD_F_CPU",),
("BOARD_F_FLASH",),
("BOARD_FLASH_MODE",),
# end of deprecated options
# upload options
("PIOTEST_RUNNING_NAME",),
("UPLOAD_PORT",),
("UPLOAD_PROTOCOL",),
("UPLOAD_SPEED",),
("UPLOAD_FLAGS",),
("UPLOAD_RESETMETHOD",),
# test options
("TEST_BUILD_PROJECT_SRC",),
# debug options
("DEBUG_TOOL",),
("DEBUG_SVD_PATH",),
) # yapf: disable
MULTILINE_VARS = [
"EXTRA_SCRIPTS", "PIOFRAMEWORK", "BUILD_FLAGS", "SRC_BUILD_FLAGS",
"BUILD_UNFLAGS", "UPLOAD_FLAGS", "SRC_FILTER", "LIB_DEPS", "LIB_IGNORE",
"LIB_EXTRA_DIRS"
]
)
DEFAULT_ENV_OPTIONS = dict(
tools=[
"ar", "gas", "gcc", "g++", "gnulink", "platformio", "pioplatform",
"piowinhooks", "piolib", "pioupload", "piomisc", "pioide"
], # yapf: disable
toolpath=[join(util.get_source_dir(), "builder", "tools")],
variables=commonvars,
"ar",
"as",
"cc",
"c++",
"link",
"platformio",
"piotarget",
"pioplatform",
"pioproject",
"piomaxlen",
"piolib",
"pioupload",
"piomisc",
"pioide",
"piosize",
],
toolpath=[join(fs.get_source_dir(), "builder", "tools")],
variables=clivars,
# Propagating External Environment
PIOVARIABLES=commonvars.keys(),
ENV=environ,
UNIX_TIME=int(time()),
PIOHOME_DIR=util.get_home_dir(),
PROJECT_DIR=util.get_project_dir(),
PROJECTINCLUDE_DIR=util.get_projectinclude_dir(),
PROJECTSRC_DIR=util.get_projectsrc_dir(),
PROJECTTEST_DIR=util.get_projecttest_dir(),
PROJECTDATA_DIR=util.get_projectdata_dir(),
PROJECTBUILD_DIR=util.get_projectbuild_dir(),
BUILD_DIR=join("$PROJECTBUILD_DIR", "$PIOENV"),
BUILDSRC_DIR=join("$BUILD_DIR", "src"),
BUILDTEST_DIR=join("$BUILD_DIR", "test"),
BUILD_DIR=join("$PROJECT_BUILD_DIR", "$PIOENV"),
BUILD_SRC_DIR=join("$BUILD_DIR", "src"),
BUILD_TEST_DIR=join("$BUILD_DIR", "test"),
COMPILATIONDB_PATH=join("$BUILD_DIR", "compile_commands.json"),
LIBPATH=["$BUILD_DIR"],
LIBSOURCE_DIRS=[
util.get_projectlib_dir(),
util.get_projectlibdeps_dir(),
join("$PIOHOME_DIR", "lib")
],
PROGNAME="program",
PROG_PATH=join("$BUILD_DIR", "$PROGNAME$PROGSUFFIX"),
PYTHONEXE=util.get_pythonexe_path())
PYTHONEXE=get_pythonexe_path(),
IDE_EXTRA_DATA={},
)
if not int(ARGUMENTS.get("PIOVERBOSE", 0)):
DEFAULT_ENV_OPTIONS['ARCOMSTR'] = "Archiving $TARGET"
DEFAULT_ENV_OPTIONS['LINKCOMSTR'] = "Linking $TARGET"
DEFAULT_ENV_OPTIONS['RANLIBCOMSTR'] = "Indexing $TARGET"
DEFAULT_ENV_OPTIONS["ARCOMSTR"] = "Archiving $TARGET"
DEFAULT_ENV_OPTIONS["LINKCOMSTR"] = "Linking $TARGET"
DEFAULT_ENV_OPTIONS["RANLIBCOMSTR"] = "Indexing $TARGET"
for k in ("ASCOMSTR", "ASPPCOMSTR", "CCCOMSTR", "CXXCOMSTR"):
DEFAULT_ENV_OPTIONS[k] = "Compiling $TARGET"
env = DefaultEnvironment(**DEFAULT_ENV_OPTIONS)
# decode common variables
for k in commonvars.keys():
if k in env:
env[k] = base64.b64decode(env[k])
if k in MULTILINE_VARS:
env[k] = util.parse_conf_multi_values(env[k])
# Load variables from CLI
env.Replace(
**{
key: PlatformBase.decode_scons_arg(env[key])
for key in list(clivars.keys())
if key in env
}
)
if env.GetOption('clean'):
# Set up project optional directories
config = env.GetProjectConfig()
env.Replace(
PROJECT_DIR=get_project_dir(),
PROJECT_CORE_DIR=config.get_optional_dir("core"),
PROJECT_PACKAGES_DIR=config.get_optional_dir("packages"),
PROJECT_WORKSPACE_DIR=config.get_optional_dir("workspace"),
PROJECT_LIBDEPS_DIR=config.get_optional_dir("libdeps"),
PROJECT_INCLUDE_DIR=config.get_optional_dir("include"),
PROJECT_SRC_DIR=config.get_optional_dir("src"),
PROJECTSRC_DIR=config.get_optional_dir("src"), # legacy for dev/platform
PROJECT_TEST_DIR=config.get_optional_dir("test"),
PROJECT_DATA_DIR=config.get_optional_dir("data"),
PROJECTDATA_DIR=config.get_optional_dir("data"), # legacy for dev/platform
PROJECT_BUILD_DIR=config.get_optional_dir("build"),
BUILD_CACHE_DIR=config.get_optional_dir("build_cache"),
LIBSOURCE_DIRS=[
config.get_optional_dir("lib"),
join("$PROJECT_LIBDEPS_DIR", "$PIOENV"),
config.get_optional_dir("globallib"),
],
)
if (
compat.WINDOWS
and sys.version_info >= (3, 8)
and env["PROJECT_DIR"].startswith("\\\\")
):
click.secho(
"There is a known issue with Python 3.8+ and mapped network drives on "
"Windows.\nPlease downgrade Python to the latest 3.7. More details at:\n"
"https://github.com/platformio/platformio-core/issues/3417",
fg="yellow",
)
if env.subst("$BUILD_CACHE_DIR"):
if not isdir(env.subst("$BUILD_CACHE_DIR")):
makedirs(env.subst("$BUILD_CACHE_DIR"))
env.CacheDir("$BUILD_CACHE_DIR")
if int(ARGUMENTS.get("ISATTY", 0)):
# pylint: disable=protected-access
click._compat.isatty = lambda stream: True
if env.GetOption("clean"):
env.PioClean(env.subst("$BUILD_DIR"))
env.Exit(0)
elif not int(ARGUMENTS.get("PIOVERBOSE", 0)):
print("Verbose mode can be enabled via `-v, --verbose` option")
click.echo("Verbose mode can be enabled via `-v, --verbose` option")
# Handle custom variables from system environment
for var in ("BUILD_FLAGS", "SRC_BUILD_FLAGS", "SRC_FILTER", "EXTRA_SCRIPTS",
"UPLOAD_PORT", "UPLOAD_FLAGS", "LIB_EXTRA_DIRS"):
k = "PLATFORMIO_%s" % var
if k not in environ:
continue
if var in ("UPLOAD_PORT", ):
env[var] = environ.get(k)
continue
env.Append(**{var: util.parse_conf_multi_values(environ.get(k))})
# Dynamically load dependent tools
if "compiledb" in COMMAND_LINE_TARGETS:
env.Tool("compilation_db")
# Configure extra library source directories for LDF
if util.get_project_optional_dir("lib_extra_dirs"):
env.Prepend(
LIBSOURCE_DIRS=util.parse_conf_multi_values(
util.get_project_optional_dir("lib_extra_dirs")))
env.Prepend(LIBSOURCE_DIRS=env.get("LIB_EXTRA_DIRS", []))
env['LIBSOURCE_DIRS'] = [
expanduser(d) if d.startswith("~") else d for d in env['LIBSOURCE_DIRS']
]
if not isdir(env.subst("$BUILD_DIR")):
makedirs(env.subst("$BUILD_DIR"))
env.LoadPioPlatform(commonvars)
env.LoadProjectOptions()
env.LoadPioPlatform()
env.SConscriptChdir(0)
env.SConsignFile(join("$PROJECTBUILD_DIR", ".sconsign.dblite"))
env.SConsignFile(
join("$BUILD_DIR", ".sconsign%d%d" % (sys.version_info[0], sys.version_info[1]))
)
for item in env.GetExtraScripts("pre"):
env.SConscript(item, exports="env")
@ -170,6 +171,8 @@ env.SConscript("$BUILD_SCRIPT")
if "UPLOAD_FLAGS" in env:
env.Prepend(UPLOADERFLAGS=["$UPLOAD_FLAGS"])
if env.GetProjectOption("upload_command"):
env.Replace(UPLOADCMD=env.GetProjectOption("upload_command"))
for item in env.GetExtraScripts("post"):
env.SConscript(item, exports="env")
@ -177,7 +180,9 @@ for item in env.GetExtraScripts("post"):
##############################################################################
# Checking program size
if env.get("SIZETOOL") and "nobuild" not in COMMAND_LINE_TARGETS:
if env.get("SIZETOOL") and not (
set(["nobuild", "sizedata"]) & set(COMMAND_LINE_TARGETS)
):
env.Depends(["upload", "program"], "checkprogsize")
# Replace platform's "size" target with ours
_new_targets = [t for t in DEFAULT_TARGETS if str(t) != "size"]
@ -185,31 +190,47 @@ if env.get("SIZETOOL") and "nobuild" not in COMMAND_LINE_TARGETS:
Default(_new_targets)
Default("checkprogsize")
if "compiledb" in COMMAND_LINE_TARGETS:
env.Alias("compiledb", env.CompilationDatabase("$COMPILATIONDB_PATH"))
# Print configured protocols
env.AddPreAction(
["upload", "program"],
env.VerboseAction(lambda source, target, env: env.PrintUploadInfo(),
"Configuring upload protocol..."))
env.VerboseAction(
lambda source, target, env: env.PrintUploadInfo(),
"Configuring upload protocol...",
),
)
AlwaysBuild(env.Alias("debug", DEFAULT_TARGETS))
AlwaysBuild(env.Alias("__debug", DEFAULT_TARGETS))
AlwaysBuild(env.Alias("__test", DEFAULT_TARGETS))
##############################################################################
if "envdump" in COMMAND_LINE_TARGETS:
print(env.Dump())
click.echo(env.Dump())
env.Exit(0)
if "idedata" in COMMAND_LINE_TARGETS:
try:
print("\n%s\n" % util.path_to_unicode(
json.dumps(env.DumpIDEData(), ensure_ascii=False)))
env.Exit(0)
except UnicodeDecodeError:
sys.stderr.write(
"\nUnicodeDecodeError: Non-ASCII characters found in build "
"environment\n"
"See explanation in FAQ > Troubleshooting > Building\n"
"https://docs.platformio.org/page/faq.html\n\n")
env.Exit(1)
Import("projenv")
except: # pylint: disable=bare-except
projenv = env
click.echo(
"\n%s\n"
% dump_json_to_unicode(
projenv.DumpIDEData(env) # pylint: disable=undefined-variable
)
)
env.Exit(0)
if "sizedata" in COMMAND_LINE_TARGETS:
AlwaysBuild(
env.Alias(
"sizedata",
DEFAULT_TARGETS,
env.VerboseAction(env.DumpSizeData, "Generating memory usage report..."),
)
)
Default("sizedata")

View File

@ -0,0 +1,226 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
# Copyright 2020 MongoDB Inc.
#
# Permission is hereby granted, free of charge, to any person obtaining
# a copy of this software and associated documentation files (the
# "Software"), to deal in the Software without restriction, including
# without limitation the rights to use, copy, modify, merge, publish,
# distribute, sublicense, and/or sell copies of the Software, and to
# permit persons to whom the Software is furnished to do so, subject to
# the following conditions:
#
# The above copyright notice and this permission notice shall be included
# in all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY
# KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE
# WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
# NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE
# LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
# OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION
# WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
# pylint: disable=unused-argument, protected-access, unused-variable, import-error
# Original: https://github.com/mongodb/mongo/blob/master/site_scons/site_tools/compilation_db.py
from __future__ import absolute_import
import itertools
import json
import os
import SCons
from platformio.builder.tools.platformio import SRC_ASM_EXT, SRC_C_EXT, SRC_CXX_EXT
from platformio.proc import where_is_program
# Implements the ability for SCons to emit a compilation database for the MongoDB project. See
# http://clang.llvm.org/docs/JSONCompilationDatabase.html for details on what a compilation
# database is, and why you might want one. The only user visible entry point here is
# 'env.CompilationDatabase'. This method takes an optional 'target' to name the file that
# should hold the compilation database, otherwise, the file defaults to compile_commands.json,
# which is the name that most clang tools search for by default.
# TODO: Is there a better way to do this than this global? Right now this exists so that the
# emitter we add can record all of the things it emits, so that the scanner for the top level
# compilation database can access the complete list, and also so that the writer has easy
# access to write all of the files. But it seems clunky. How can the emitter and the scanner
# communicate more gracefully?
__COMPILATION_DB_ENTRIES = []
# We make no effort to avoid rebuilding the entries. Someday we could, and could even
# integrate with the cache, but there doesn't seem to be much call for it.
class __CompilationDbNode(SCons.Node.Python.Value):
def __init__(self, value):
SCons.Node.Python.Value.__init__(self, value)
self.Decider(changed_since_last_build_node)
def changed_since_last_build_node(*args, **kwargs):
""" Dummy decider to force always building"""
return True
def makeEmitCompilationDbEntry(comstr):
"""
Effectively this creates a lambda function to capture:
* command line
* source
* target
:param comstr: unevaluated command line
:return: an emitter which has captured the above
"""
user_action = SCons.Action.Action(comstr)
def EmitCompilationDbEntry(target, source, env):
"""
This emitter will be added to each c/c++ object build to capture the info needed
for clang tools
:param target: target node(s)
:param source: source node(s)
:param env: Environment for use building this node
:return: target(s), source(s)
"""
# Resolve absolute path of toolchain
for cmd in ("CC", "CXX", "AS"):
if cmd not in env:
continue
if os.path.isabs(env[cmd]):
continue
env[cmd] = where_is_program(
env.subst("$%s" % cmd), env.subst("${ENV['PATH']}")
)
dbtarget = __CompilationDbNode(source)
entry = env.__COMPILATIONDB_Entry(
target=dbtarget,
source=[],
__COMPILATIONDB_UTARGET=target,
__COMPILATIONDB_USOURCE=source,
__COMPILATIONDB_UACTION=user_action,
__COMPILATIONDB_ENV=env,
)
# TODO: Technically, these next two lines should not be required: it should be fine to
# cache the entries. However, they don't seem to update properly. Since they are quick
# to re-generate, disable caching and sidestep this problem.
env.AlwaysBuild(entry)
env.NoCache(entry)
__COMPILATION_DB_ENTRIES.append(dbtarget)
return target, source
return EmitCompilationDbEntry
def CompilationDbEntryAction(target, source, env, **kw):
"""
Create a dictionary with evaluated command line, target, source
and store that info as an attribute on the target
(Which has been stored in __COMPILATION_DB_ENTRIES array
:param target: target node(s)
:param source: source node(s)
:param env: Environment for use building this node
:param kw:
:return: None
"""
command = env["__COMPILATIONDB_UACTION"].strfunction(
target=env["__COMPILATIONDB_UTARGET"],
source=env["__COMPILATIONDB_USOURCE"],
env=env["__COMPILATIONDB_ENV"],
)
entry = {
"directory": env.Dir("#").abspath,
"command": command,
"file": str(env["__COMPILATIONDB_USOURCE"][0]),
}
target[0].write(entry)
def WriteCompilationDb(target, source, env):
entries = []
for s in __COMPILATION_DB_ENTRIES:
item = s.read()
item["file"] = os.path.abspath(item["file"])
entries.append(item)
with open(str(target[0]), "w") as target_file:
json.dump(
entries, target_file, sort_keys=True, indent=4, separators=(",", ": ")
)
def ScanCompilationDb(node, env, path):
return __COMPILATION_DB_ENTRIES
def generate(env, **kwargs):
static_obj, shared_obj = SCons.Tool.createObjBuilders(env)
env["COMPILATIONDB_COMSTR"] = kwargs.get(
"COMPILATIONDB_COMSTR", "Building compilation database $TARGET"
)
components_by_suffix = itertools.chain(
itertools.product(
[".%s" % ext for ext in SRC_C_EXT],
[
(static_obj, SCons.Defaults.StaticObjectEmitter, "$CCCOM"),
(shared_obj, SCons.Defaults.SharedObjectEmitter, "$SHCCCOM"),
],
),
itertools.product(
[".%s" % ext for ext in SRC_CXX_EXT],
[
(static_obj, SCons.Defaults.StaticObjectEmitter, "$CXXCOM"),
(shared_obj, SCons.Defaults.SharedObjectEmitter, "$SHCXXCOM"),
],
),
itertools.product(
[".%s" % ext for ext in SRC_ASM_EXT],
[(static_obj, SCons.Defaults.StaticObjectEmitter, "$ASCOM")],
),
)
for entry in components_by_suffix:
suffix = entry[0]
builder, base_emitter, command = entry[1]
# Assumes a dictionary emitter
emitter = builder.emitter[suffix]
builder.emitter[suffix] = SCons.Builder.ListEmitter(
[emitter, makeEmitCompilationDbEntry(command)]
)
env["BUILDERS"]["__COMPILATIONDB_Entry"] = SCons.Builder.Builder(
action=SCons.Action.Action(CompilationDbEntryAction, None),
)
env["BUILDERS"]["__COMPILATIONDB_Database"] = SCons.Builder.Builder(
action=SCons.Action.Action(WriteCompilationDb, "$COMPILATIONDB_COMSTR"),
target_scanner=SCons.Scanner.Scanner(
function=ScanCompilationDb, node_class=None
),
)
def CompilationDatabase(env, target):
result = env.__COMPILATIONDB_Database(target=target, source=[])
env.AlwaysBuild(result)
env.NoCache(result)
return result
env.AddMethod(CompilationDatabase, "CompilationDatabase")
def exists(env):
return True
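For readers unfamiliar with the output this tool produces: a compilation database is just a JSON array of {directory, command, file} records, one per translation unit (see the clang JSONCompilationDatabase page referenced in the comments above). A hypothetical single-entry example, serialized the same way WriteCompilationDb() writes its entries:

import json
import os

entries = [
    {
        "directory": os.getcwd(),
        "command": "gcc -c -o src/main.o src/main.c",  # placeholder command line
        "file": os.path.abspath("src/main.c"),
    }
]
with open("compile_commands.json", "w") as target_file:
    json.dump(entries, target_file, sort_keys=True, indent=4, separators=(",", ": "))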

View File

@ -14,70 +14,71 @@
from __future__ import absolute_import
import os
from glob import glob
from os import environ
from os.path import abspath, isfile, join
from SCons.Defaults import processDefines
from SCons.Defaults import processDefines # pylint: disable=import-error
from platformio import util
from platformio.managers.core import get_core_package_dir
from platformio.compat import glob_escape
from platformio.package.manager.core import get_core_package_dir
from platformio.proc import exec_command, where_is_program
def _dump_includes(env):
includes = []
includes = {}
for item in env.get("CPPPATH", []):
includes.append(env.subst(item))
includes["build"] = [
env.subst("$PROJECT_INCLUDE_DIR"),
env.subst("$PROJECT_SRC_DIR"),
]
includes["build"].extend(
[os.path.realpath(env.subst(item)) for item in env.get("CPPPATH", [])]
)
# installed libs
includes["compatlib"] = []
for lb in env.GetLibBuilders():
includes.extend(lb.get_include_dirs())
includes["compatlib"].extend(
[os.path.realpath(inc) for inc in lb.get_include_dirs()]
)
# includes from toolchains
p = env.PioPlatform()
for name in p.get_installed_packages():
if p.get_package_type(name) != "toolchain":
includes["toolchain"] = []
for pkg in p.get_installed_packages():
if p.get_package_type(pkg.metadata.name) != "toolchain":
continue
toolchain_dir = util.glob_escape(p.get_package_dir(name))
toolchain_dir = glob_escape(pkg.path)
toolchain_incglobs = [
join(toolchain_dir, "*", "include*"),
join(toolchain_dir, "*", "include", "c++", "*"),
join(toolchain_dir, "*", "include", "c++", "*", "*-*-*"),
join(toolchain_dir, "lib", "gcc", "*", "*", "include*")
os.path.join(toolchain_dir, "*", "include", "c++", "*"),
os.path.join(toolchain_dir, "*", "include", "c++", "*", "*-*-*"),
os.path.join(toolchain_dir, "lib", "gcc", "*", "*", "include*"),
os.path.join(toolchain_dir, "*", "include*"),
]
for g in toolchain_incglobs:
includes.extend(glob(g))
includes["toolchain"].extend([os.path.realpath(inc) for inc in glob(g)])
includes["unity"] = []
unity_dir = get_core_package_dir("tool-unity")
if unity_dir:
includes.append(unity_dir)
includes["unity"].append(unity_dir)
includes.extend(
[env.subst("$PROJECTINCLUDE_DIR"),
env.subst("$PROJECTSRC_DIR")])
# remove duplicates
result = []
for item in includes:
if item not in result:
result.append(abspath(item))
return result
return includes
def _get_gcc_defines(env):
items = []
try:
sysenv = environ.copy()
sysenv['PATH'] = str(env['ENV']['PATH'])
result = util.exec_command(
"echo | %s -dM -E -" % env.subst("$CC"), env=sysenv, shell=True)
sysenv = os.environ.copy()
sysenv["PATH"] = str(env["ENV"]["PATH"])
result = exec_command(
"echo | %s -dM -E -" % env.subst("$CC"), env=sysenv, shell=True
)
except OSError:
return items
if result['returncode'] != 0:
if result["returncode"] != 0:
return items
for line in result['out'].split("\n"):
for line in result["out"].split("\n"):
tokens = line.strip().split(" ", 2)
if not tokens or tokens[0] != "#define":
continue
@ -92,17 +93,24 @@ def _dump_defines(env):
defines = []
# global symbols
for item in processDefines(env.get("CPPDEFINES", [])):
defines.append(env.subst(item).replace('\\', ''))
item = item.strip()
if item:
defines.append(env.subst(item).replace("\\", ""))
# special symbol for Atmel AVR MCU
if env['PIOPLATFORM'] == "atmelavr":
if env["PIOPLATFORM"] == "atmelavr":
board_mcu = env.get("BOARD_MCU")
if not board_mcu and "BOARD" in env:
board_mcu = env.BoardConfig().get("build.mcu")
if board_mcu:
defines.append(
str("__AVR_%s__" % board_mcu.upper().replace(
"ATMEGA", "ATmega").replace("ATTINY", "ATtiny")))
str(
"__AVR_%s__"
% board_mcu.upper()
.replace("ATMEGA", "ATmega")
.replace("ATTINY", "ATtiny")
)
)
# built-in GCC macros
# if env.GetCompilerType() == "gcc":
@ -112,9 +120,9 @@ def _dump_defines(env):
def _get_svd_path(env):
svd_path = env.subst("$DEBUG_SVD_PATH")
svd_path = env.GetProjectOption("debug_svd_path")
if svd_path:
return abspath(svd_path)
return os.path.realpath(svd_path)
if "BOARD" not in env:
return None
@ -124,47 +132,51 @@ def _get_svd_path(env):
except (AssertionError, KeyError):
return None
# custom path to SVD file
if isfile(svd_path):
if os.path.isfile(svd_path):
return svd_path
# default file from ./platform/misc/svd folder
p = env.PioPlatform()
if isfile(join(p.get_dir(), "misc", "svd", svd_path)):
return abspath(join(p.get_dir(), "misc", "svd", svd_path))
if os.path.isfile(os.path.join(p.get_dir(), "misc", "svd", svd_path)):
return os.path.realpath(os.path.join(p.get_dir(), "misc", "svd", svd_path))
return None
def DumpIDEData(env):
LINTCCOM = "$CFLAGS $CCFLAGS $CPPFLAGS"
LINTCXXCOM = "$CXXFLAGS $CCFLAGS $CPPFLAGS"
def _escape_build_flag(flags):
return [flag if " " not in flag else '"%s"' % flag for flag in flags]
def DumpIDEData(env, globalenv):
""" env here is `projenv`"""
env["__escape_build_flag"] = _escape_build_flag
LINTCCOM = (
"${__escape_build_flag(CFLAGS)} ${__escape_build_flag(CCFLAGS)} $CPPFLAGS"
)
LINTCXXCOM = (
"${__escape_build_flag(CXXFLAGS)} ${__escape_build_flag(CCFLAGS)} $CPPFLAGS"
)
data = {
"libsource_dirs":
[env.subst(l) for l in env.get("LIBSOURCE_DIRS", [])],
"defines":
_dump_defines(env),
"includes":
_dump_includes(env),
"cc_flags":
env.subst(LINTCCOM),
"cxx_flags":
env.subst(LINTCXXCOM),
"cc_path":
util.where_is_program(env.subst("$CC"), env.subst("${ENV['PATH']}")),
"cxx_path":
util.where_is_program(env.subst("$CXX"), env.subst("${ENV['PATH']}")),
"gdb_path":
util.where_is_program(env.subst("$GDB"), env.subst("${ENV['PATH']}")),
"prog_path":
env.subst("$PROG_PATH"),
"flash_extra_images": [{
"offset": item[0],
"path": env.subst(item[1])
} for item in env.get("FLASH_EXTRA_IMAGES", [])],
"svd_path":
_get_svd_path(env),
"compiler_type":
env.GetCompilerType()
"env_name": env["PIOENV"],
"libsource_dirs": [env.subst(l) for l in env.GetLibSourceDirs()],
"defines": _dump_defines(env),
"includes": _dump_includes(env),
"cc_path": where_is_program(env.subst("$CC"), env.subst("${ENV['PATH']}")),
"cxx_path": where_is_program(env.subst("$CXX"), env.subst("${ENV['PATH']}")),
"gdb_path": where_is_program(env.subst("$GDB"), env.subst("${ENV['PATH']}")),
"prog_path": env.subst("$PROG_PATH"),
"svd_path": _get_svd_path(env),
"compiler_type": env.GetCompilerType(),
"targets": globalenv.DumpTargets(),
"extra": dict(
flash_images=[
{"offset": item[0], "path": env.subst(item[1])}
for item in env.get("FLASH_EXTRA_IMAGES", [])
]
),
}
data["extra"].update(env.get("IDE_EXTRA_DATA", {}))
env_ = env.Clone()
# https://github.com/platformio/platformio-atom-ide/issues/34
@ -177,10 +189,7 @@ def DumpIDEData(env):
_new_defines.append(item)
env_.Replace(CPPDEFINES=_new_defines)
data.update({
"cc_flags": env_.subst(LINTCCOM),
"cxx_flags": env_.subst(LINTCXXCOM)
})
data.update({"cc_flags": env_.subst(LINTCCOM), "cxx_flags": env_.subst(LINTCXXCOM)})
return data
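DumpIDEData() is what backs the "idedata" target used by IDE integrations. A rough illustration of the shape of the returned dictionary; only a subset of the keys is shown and every value below is a made-up placeholder:

import json

ide_data = {
    "env_name": "uno",
    "defines": ["F_CPU=16000000L"],
    "includes": {"build": ["include", "src"], "compatlib": [], "toolchain": [], "unity": []},
    "cc_path": "/path/to/avr-gcc",
    "cxx_path": "/path/to/avr-g++",
    "gdb_path": "/path/to/avr-gdb",
    "cc_flags": "-Os -Wall",
    "cxx_flags": "-Os -Wall -fno-exceptions",
    "prog_path": ".pio/build/uno/firmware.elf",
    "svd_path": None,
    "compiler_type": "gcc",
    "extra": {"flash_images": []},
}
print(json.dumps(ide_data, indent=2))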

File diff suppressed because it is too large

View File

@ -12,19 +12,22 @@
# See the License for the specific language governing permissions and
# limitations under the License.
from __future__ import absolute_import
from hashlib import md5
from os import makedirs
from os.path import isdir, isfile, join
from platform import system
from platformio.compat import WINDOWS, hashlib_encode_data
# Windows CLI has limit with command length to 8192
# Leave 2000 chars for flags and other options
MAX_SOURCES_LENGTH = 6000
MAX_LINE_LENGTH = 6000 if WINDOWS else 128072
def long_sources_hook(env, sources):
_sources = str(sources).replace("\\", "/")
if len(str(_sources)) < MAX_SOURCES_LENGTH:
if len(str(_sources)) < MAX_LINE_LENGTH:
return sources
# fix space in paths
@ -40,7 +43,7 @@ def long_sources_hook(env, sources):
def long_incflags_hook(env, incflags):
_incflags = env.subst(incflags).replace("\\", "/")
if len(_incflags) < MAX_SOURCES_LENGTH:
if len(_incflags) < MAX_LINE_LENGTH:
return incflags
# fix space in paths
@ -58,7 +61,9 @@ def _file_long_data(env, data):
build_dir = env.subst("$BUILD_DIR")
if not isdir(build_dir):
makedirs(build_dir)
tmp_file = join(build_dir, "longcmd-%s" % md5(data).hexdigest())
tmp_file = join(
build_dir, "longcmd-%s" % md5(hashlib_encode_data(data)).hexdigest()
)
if isfile(tmp_file):
return tmp_file
with open(tmp_file, "w") as fp:
@ -71,18 +76,17 @@ def exists(_):
def generate(env):
if system() != "Windows":
return None
env.Replace(_long_sources_hook=long_sources_hook)
env.Replace(_long_incflags_hook=long_incflags_hook)
coms = {}
for key in ("ARCOM", "LINKCOM"):
coms[key] = env.get(key, "").replace(
"$SOURCES", "${_long_sources_hook(__env__, SOURCES)}")
"$SOURCES", "${_long_sources_hook(__env__, SOURCES)}"
)
for key in ("_CCCOMCOM", "ASPPCOM"):
coms[key] = env.get(key, "").replace(
"$_CPPINCFLAGS", "${_long_incflags_hook(__env__, _CPPINCFLAGS)}")
"$_CPPINCFLAGS", "${_long_incflags_hook(__env__, _CPPINCFLAGS)}"
)
env.Replace(**coms)
return env
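The hooks above work around the Windows command-length limit by moving $SOURCES and $_CPPINCFLAGS into a temporary response file and passing "@<file>" instead. A minimal stand-alone sketch of that technique (names are illustrative, not the real hook API):

import hashlib
import os
import tempfile

def maybe_use_response_file(args, limit=6000):
    # Keep short command lines untouched; otherwise dump the arguments into
    # a "longcmd-<md5>" file and return an @-reference to it, which GCC-style
    # tools expand back into arguments.
    joined = " ".join(args)
    if len(joined) < limit:
        return args
    digest = hashlib.md5(joined.encode("utf-8")).hexdigest()
    path = os.path.join(tempfile.gettempdir(), "longcmd-%s" % digest)
    with open(path, "w") as fp:
        fp.write("\n".join(args))
    return ["@" + path]

print(maybe_use_response_file(["obj%d.o" % i for i in range(2000)]))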

View File

@ -15,17 +15,18 @@
from __future__ import absolute_import
import atexit
import io
import os
import re
import sys
from os import environ, remove, walk
from os.path import basename, isdir, isfile, join, realpath, relpath, sep
from tempfile import mkstemp
from SCons.Action import Action
from SCons.Script import ARGUMENTS
import click
from platformio import util
from platformio.managers.core import get_core_package_dir
from platformio import fs, util
from platformio.compat import get_filesystem_encoding, get_locale_encoding, glob_escape
from platformio.package.manager.core import get_core_package_dir
from platformio.proc import exec_command
class InoToCPPConverter(object):
@ -33,17 +34,52 @@ class InoToCPPConverter(object):
PROTOTYPE_RE = re.compile(
r"""^(
(?:template\<.*\>\s*)? # template
([a-z_\d\&]+\*?\s+){1,2} # return type
([a-z_\d\&]+\*?\s+){1,2} # return type
([a-z_\d]+\s*) # name of prototype
\([a-z_,\.\*\&\[\]\s\d]*\) # arguments
)\s*\{ # must end with {
""", re.X | re.M | re.I)
)\s*(\{|;) # must end with `{` or `;`
""",
re.X | re.M | re.I,
)
DETECTMAIN_RE = re.compile(r"void\s+(setup|loop)\s*\(", re.M | re.I)
PROTOPTRS_TPLRE = r"\([^&\(]*&(%s)[^\)]*\)"
def __init__(self, env):
self.env = env
self._main_ino = None
self._safe_encoding = None
def read_safe_contents(self, path):
error_reported = False
for encoding in (
"utf-8",
None,
get_filesystem_encoding(),
get_locale_encoding(),
"latin-1",
):
try:
with io.open(path, encoding=encoding) as fp:
contents = fp.read()
self._safe_encoding = encoding
return contents
except UnicodeDecodeError:
if not error_reported:
error_reported = True
click.secho(
"Unicode decode error has occurred, please remove invalid "
"(non-ASCII or non-UTF8) characters from %s file or convert it to UTF-8"
% path,
fg="yellow",
err=True,
)
return ""
def write_safe_contents(self, path, contents):
with io.open(
path, "w", encoding=self._safe_encoding, errors="backslashreplace"
) as fp:
return fp.write(contents)
def is_main_node(self, contents):
return self.DETECTMAIN_RE.search(contents)
@ -58,10 +94,8 @@ class InoToCPPConverter(object):
assert nodes
lines = []
for node in nodes:
contents = node.get_text_contents()
_lines = [
'# 1 "%s"' % node.get_path().replace("\\", "/"), contents
]
contents = self.read_safe_contents(node.get_path())
_lines = ['# 1 "%s"' % node.get_path().replace("\\", "/"), contents]
if self.is_main_node(contents):
lines = _lines + lines
self._main_ino = node.get_path()
@ -76,24 +110,24 @@ class InoToCPPConverter(object):
def process(self, contents):
out_file = self._main_ino + ".cpp"
assert self._gcc_preprocess(contents, out_file)
with open(out_file) as fp:
contents = fp.read()
contents = self.read_safe_contents(out_file)
contents = self._join_multiline_strings(contents)
with open(out_file, "w") as fp:
fp.write(self.append_prototypes(contents))
self.write_safe_contents(out_file, self.append_prototypes(contents))
return out_file
def _gcc_preprocess(self, contents, out_file):
tmp_path = mkstemp()[1]
with open(tmp_path, "w") as fp:
fp.write(contents)
self.write_safe_contents(tmp_path, contents)
self.env.Execute(
self.env.VerboseAction(
'$CXX -o "{0}" -x c++ -fpreprocessed -dD -E "{1}"'.format(
out_file, tmp_path),
"Converting " + basename(out_file[:-4])))
out_file, tmp_path
),
"Converting " + os.path.basename(out_file[:-4]),
)
)
atexit.register(_delete_file, tmp_path)
return isfile(out_file)
return os.path.isfile(out_file)
def _join_multiline_strings(self, contents):
if "\\\n" not in contents:
@ -113,14 +147,15 @@ class InoToCPPConverter(object):
stropen = True
newlines.append(line[:-1])
continue
elif stropen:
if stropen:
newlines[len(newlines) - 1] += line[:-1]
continue
elif stropen and line.endswith(('",', '";')):
newlines[len(newlines) - 1] += line
stropen = False
newlines.append('#line %d "%s"' %
(linenum, self._main_ino.replace("\\", "/")))
newlines.append(
'#line %d "%s"' % (linenum, self._main_ino.replace("\\", "/"))
)
continue
newlines.append(line)
@ -140,8 +175,10 @@ class InoToCPPConverter(object):
prototypes = []
reserved_keywords = set(["if", "else", "while"])
for match in self.PROTOTYPE_RE.finditer(contents):
if (set([match.group(2).strip(),
match.group(3).strip()]) & reserved_keywords):
if (
set([match.group(2).strip(), match.group(3).strip()])
& reserved_keywords
):
continue
prototypes.append(match)
return prototypes
@ -158,31 +195,44 @@ class InoToCPPConverter(object):
return total
def append_prototypes(self, contents):
prototypes = self._parse_prototypes(contents)
prototypes = self._parse_prototypes(contents) or []
# skip already declared prototypes
declared = set(m.group(1).strip() for m in prototypes if m.group(4) == ";")
prototypes = [m for m in prototypes if m.group(1).strip() not in declared]
if not prototypes:
return contents
prototype_names = set([m.group(3).strip() for m in prototypes])
prototype_names = set(m.group(3).strip() for m in prototypes)
split_pos = prototypes[0].start()
match_ptrs = re.search(
self.PROTOPTRS_TPLRE % ("|".join(prototype_names)),
contents[:split_pos], re.M)
contents[:split_pos],
re.M,
)
if match_ptrs:
split_pos = contents.rfind("\n", 0, match_ptrs.start()) + 1
result = []
result.append(contents[:split_pos].strip())
result.append("%s;" % ";\n".join([m.group(1) for m in prototypes]))
result.append('#line %d "%s"' % (self._get_total_lines(
contents[:split_pos]), self._main_ino.replace("\\", "/")))
result.append(
'#line %d "%s"'
% (
self._get_total_lines(contents[:split_pos]),
self._main_ino.replace("\\", "/"),
)
)
result.append(contents[split_pos:].strip())
return "\n".join(result)
def ConvertInoToCpp(env):
src_dir = util.glob_escape(env.subst("$PROJECTSRC_DIR"))
ino_nodes = (
env.Glob(join(src_dir, "*.ino")) + env.Glob(join(src_dir, "*.pde")))
src_dir = glob_escape(env.subst("$PROJECT_SRC_DIR"))
ino_nodes = env.Glob(os.path.join(src_dir, "*.ino")) + env.Glob(
os.path.join(src_dir, "*.pde")
)
if not ino_nodes:
return
c = InoToCPPConverter(env)
@ -193,26 +243,28 @@ def ConvertInoToCpp(env):
def _delete_file(path):
try:
if isfile(path):
remove(path)
if os.path.isfile(path):
os.remove(path)
except: # pylint: disable=bare-except
pass
@util.memoized()
def _get_compiler_type(env):
if env.subst("$CC").endswith("-gcc"):
return "gcc"
try:
sysenv = environ.copy()
sysenv['PATH'] = str(env['ENV']['PATH'])
result = util.exec_command([env.subst("$CC"), "-v"], env=sysenv)
sysenv = os.environ.copy()
sysenv["PATH"] = str(env["ENV"]["PATH"])
result = exec_command([env.subst("$CC"), "-v"], env=sysenv)
except OSError:
return None
if result['returncode'] != 0:
if result["returncode"] != 0:
return None
output = "".join([result['out'], result['err']]).lower()
output = "".join([result["out"], result["err"]]).lower()
if "clang" in output and "LLVM" in output:
return "clang"
elif "gcc" in output:
if "gcc" in output:
return "gcc"
return None
@ -222,11 +274,10 @@ def GetCompilerType(env):
def GetActualLDScript(env):
def _lookup_in_ldpath(script):
for d in env.get("LIBPATH", []):
path = join(env.subst(d), script)
if isfile(path):
path = os.path.join(env.subst(d), script)
if os.path.isfile(path):
return path
return None
@ -237,7 +288,7 @@ def GetActualLDScript(env):
if f == "-T":
script_in_next = True
continue
elif script_in_next:
if script_in_next:
script_in_next = False
raw_script = f
elif f.startswith("-Wl,-T"):
@ -245,7 +296,7 @@ def GetActualLDScript(env):
else:
continue
script = env.subst(raw_script.replace('"', "").strip())
if isfile(script):
if os.path.isfile(script):
return script
path = _lookup_in_ldpath(script)
if path:
@ -253,12 +304,13 @@ def GetActualLDScript(env):
if script:
sys.stderr.write(
"Error: Could not find '%s' LD script in LDPATH '%s'\n" %
(script, env.subst("$LIBPATH")))
"Error: Could not find '%s' LD script in LDPATH '%s'\n"
% (script, env.subst("$LIBPATH"))
)
env.Exit(1)
if not script and "LDSCRIPT_PATH" in env:
path = _lookup_in_ldpath(env['LDSCRIPT_PATH'])
path = _lookup_in_ldpath(env["LDSCRIPT_PATH"])
if path:
return path
@ -266,63 +318,58 @@ def GetActualLDScript(env):
env.Exit(1)
def VerboseAction(_, act, actstr):
if int(ARGUMENTS.get("PIOVERBOSE", 0)):
return act
return Action(act, actstr)
def ConfigureDebugFlags(env):
def _cleanup_debug_flags(scope):
if scope not in env:
return
unflags = ["-Os", "-g"]
for level in [0, 1, 2, 3]:
for flag in ("O", "g", "ggdb"):
unflags.append("-%s%d" % (flag, level))
env[scope] = [f for f in env.get(scope, []) if f not in unflags]
env.Append(CPPDEFINES=["__PLATFORMIO_BUILD_DEBUG__"])
for scope in ("ASFLAGS", "CCFLAGS", "LINKFLAGS"):
_cleanup_debug_flags(scope)
debug_flags = env.ParseFlags(env.GetProjectOption("debug_build_flags"))
env.MergeFlags(debug_flags)
optimization_flags = [
f for f in debug_flags.get("CCFLAGS", []) if f.startswith(("-O", "-g"))
]
if optimization_flags:
env.AppendUnique(ASFLAGS=optimization_flags, LINKFLAGS=optimization_flags)
def PioClean(env, clean_dir):
if not isdir(clean_dir):
print("Build environment is clean")
env.Exit(0)
for root, _, files in walk(clean_dir):
for file_ in files:
remove(join(root, file_))
print("Removed %s" % relpath(join(root, file_)))
print("Done cleaning")
util.rmtree_(clean_dir)
env.Exit(0)
def ProcessDebug(env):
if not env.subst("$PIODEBUGFLAGS"):
env.Replace(PIODEBUGFLAGS=["-Og", "-g3", "-ggdb3"])
env.Append(
PIODEBUGFLAGS=["-D__PLATFORMIO_DEBUG__"],
BUILD_FLAGS=env.get("PIODEBUGFLAGS", []))
unflags = ["-Os"]
for level in [0, 1, 2]:
for flag in ("O", "g", "ggdb"):
unflags.append("-%s%d" % (flag, level))
env.Append(BUILD_UNFLAGS=unflags)
def ProcessTest(env):
def ConfigureTestTarget(env):
env.Append(
CPPDEFINES=["UNIT_TEST", "UNITY_INCLUDE_CONFIG_H"],
CPPPATH=[join("$BUILD_DIR", "UnityTestLib")])
CPPPATH=[os.path.join("$BUILD_DIR", "UnityTestLib")],
)
unitylib = env.BuildLibrary(
join("$BUILD_DIR", "UnityTestLib"), get_core_package_dir("tool-unity"))
os.path.join("$BUILD_DIR", "UnityTestLib"), get_core_package_dir("tool-unity")
)
env.Prepend(LIBS=[unitylib])
src_filter = ["+<*.cpp>", "+<*.c>"]
if "PIOTEST" in env:
src_filter.append("+<%s%s>" % (env['PIOTEST'], sep))
if "PIOTEST_RUNNING_NAME" in env:
src_filter.append("+<%s%s>" % (env["PIOTEST_RUNNING_NAME"], os.path.sep))
env.Replace(PIOTEST_SRC_FILTER=src_filter)
def GetExtraScripts(env, scope):
items = []
for item in env.get("EXTRA_SCRIPTS", []):
for item in env.GetProjectOption("extra_scripts", []):
if scope == "post" and ":" not in item:
items.append(item)
elif item.startswith("%s:" % scope):
items.append(item[len(scope) + 1:])
items.append(item[len(scope) + 1 :])
if not items:
return items
with util.cd(env.subst("$PROJECT_DIR")):
return [realpath(item) for item in items]
with fs.cd(env.subst("$PROJECT_DIR")):
return [os.path.realpath(item) for item in items]
def exists(_):
@ -333,9 +380,7 @@ def generate(env):
env.AddMethod(ConvertInoToCpp)
env.AddMethod(GetCompilerType)
env.AddMethod(GetActualLDScript)
env.AddMethod(VerboseAction)
env.AddMethod(PioClean)
env.AddMethod(ProcessDebug)
env.AddMethod(ProcessTest)
env.AddMethod(ConfigureDebugFlags)
env.AddMethod(ConfigureTestTarget)
env.AddMethod(GetExtraScripts)
return env
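The core of InoToCPPConverter is the prototype generation driven by PROTOTYPE_RE. A much simplified, stand-alone illustration of that idea (the real regex also covers templates, pointers and existing declarations):

import re

PROTO_RE = re.compile(r"^([a-z_\d]+\s+[a-z_\d]+\s*\([^)]*\))\s*\{", re.M | re.I)

sketch = """
void loop() {}
int readSensor(int pin) { return analogRead(pin); }
void setup() { readSensor(0); }
"""

# Emit one forward declaration per detected function definition.
prototypes = ["%s;" % m.group(1).strip() for m in PROTO_RE.finditer(sketch)]
print("\n".join(prototypes))
# void loop();
# int readSensor(int pin);
# void setup();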

View File

@ -14,169 +14,228 @@
from __future__ import absolute_import
import base64
import os
import sys
from os.path import isdir, isfile, join
from SCons.Script import COMMAND_LINE_TARGETS
from SCons.Script import ARGUMENTS # pylint: disable=import-error
from SCons.Script import COMMAND_LINE_TARGETS # pylint: disable=import-error
from platformio import exception, util
from platformio.managers.platform import PlatformFactory
from platformio import fs, util
from platformio.compat import WINDOWS
from platformio.package.meta import PackageItem
from platformio.package.version import get_original_version
from platformio.platform.exception import UnknownBoard
from platformio.platform.factory import PlatformFactory
from platformio.project.config import ProjectOptions
# pylint: disable=too-many-branches, too-many-locals
@util.memoized()
def initPioPlatform(name):
return PlatformFactory.newPlatform(name)
def PioPlatform(env):
variables = {}
for name in env['PIOVARIABLES']:
if name in env:
variables[name.lower()] = env[name]
p = initPioPlatform(env['PLATFORM_MANIFEST'])
variables = env.GetProjectOptions(as_dict=True)
if "framework" in variables:
# support PIO Core 3.0 dev/platforms
variables["pioframework"] = variables["framework"]
p = PlatformFactory.new(os.path.dirname(env["PLATFORM_MANIFEST"]))
p.configure_default_packages(variables, COMMAND_LINE_TARGETS)
return p
def BoardConfig(env, board=None):
p = initPioPlatform(env['PLATFORM_MANIFEST'])
try:
board = board or env.get("BOARD")
assert board, "BoardConfig: Board is not defined"
config = p.board_config(board)
except (AssertionError, exception.UnknownBoard) as e:
sys.stderr.write("Error: %s\n" % str(e))
env.Exit(1)
return config
with fs.cd(env.subst("$PROJECT_DIR")):
try:
p = env.PioPlatform()
board = board or env.get("BOARD")
assert board, "BoardConfig: Board is not defined"
return p.board_config(board)
except (AssertionError, UnknownBoard) as e:
sys.stderr.write("Error: %s\n" % str(e))
env.Exit(1)
def GetFrameworkScript(env, framework):
p = env.PioPlatform()
assert p.frameworks and framework in p.frameworks
script_path = env.subst(p.frameworks[framework]['script'])
if not isfile(script_path):
script_path = join(p.get_dir(), script_path)
script_path = env.subst(p.frameworks[framework]["script"])
if not os.path.isfile(script_path):
script_path = os.path.join(p.get_dir(), script_path)
return script_path
def LoadPioPlatform(env, variables):
def LoadPioPlatform(env):
p = env.PioPlatform()
installed_packages = p.get_installed_packages()
# Ensure real platform name
env['PIOPLATFORM'] = p.name
env["PIOPLATFORM"] = p.name
# Add toolchains and uploaders to $PATH and $*_LIBRARY_PATH
systype = util.get_systype()
for name in installed_packages:
type_ = p.get_package_type(name)
for pkg in p.get_installed_packages():
type_ = p.get_package_type(pkg.metadata.name)
if type_ not in ("toolchain", "uploader", "debugger"):
continue
pkg_dir = p.get_package_dir(name)
env.PrependENVPath(
"PATH",
join(pkg_dir, "bin") if isdir(join(pkg_dir, "bin")) else pkg_dir)
if ("windows" not in systype and isdir(join(pkg_dir, "lib"))
and type_ != "toolchain"):
os.path.join(pkg.path, "bin")
if os.path.isdir(os.path.join(pkg.path, "bin"))
else pkg.path,
)
if (
not WINDOWS
and os.path.isdir(os.path.join(pkg.path, "lib"))
and type_ != "toolchain"
):
env.PrependENVPath(
"DYLD_LIBRARY_PATH"
if "darwin" in systype else "LD_LIBRARY_PATH",
join(pkg_dir, "lib"))
"DYLD_LIBRARY_PATH" if "darwin" in systype else "LD_LIBRARY_PATH",
os.path.join(pkg.path, "lib"),
)
# Platform specific LD Scripts
if isdir(join(p.get_dir(), "ldscripts")):
env.Prepend(LIBPATH=[join(p.get_dir(), "ldscripts")])
if os.path.isdir(os.path.join(p.get_dir(), "ldscripts")):
env.Prepend(LIBPATH=[os.path.join(p.get_dir(), "ldscripts")])
if "BOARD" not in env:
# handle _MCU and _F_CPU variables for AVR native
for key, value in variables.UnknownVariables().items():
if not key.startswith("BOARD_"):
continue
env.Replace(**{
key.upper().replace("BUILD.", ""):
base64.b64decode(value)
})
return
# update board manifest with a custom data
# update board manifest with overridden data from INI config
board_config = env.BoardConfig()
for key, value in variables.UnknownVariables().items():
if not key.startswith("BOARD_"):
for option, value in env.GetProjectOptions():
if not option.startswith("board_"):
continue
board_config.update(key.lower()[6:], base64.b64decode(value))
option = option.lower()[6:]
try:
if isinstance(board_config.get(option), bool):
value = str(value).lower() in ("1", "yes", "true")
elif isinstance(board_config.get(option), int):
value = int(value)
except KeyError:
pass
board_config.update(option, value)
# update default environment variables
for key in variables.keys():
if key in env or \
not any([key.startswith("BOARD_"), key.startswith("UPLOAD_")]):
# load default variables from board config
for option_meta in ProjectOptions.values():
if not option_meta.buildenvvar or option_meta.buildenvvar in env:
continue
_opt, _val = key.lower().split("_", 1)
if _opt == "board":
_opt = "build"
if _val in board_config.get(_opt):
env.Replace(**{key: board_config.get("%s.%s" % (_opt, _val))})
data_path = (
option_meta.name[6:]
if option_meta.name.startswith("board_")
else option_meta.name.replace("_", ".")
)
try:
env[option_meta.buildenvvar] = board_config.get(data_path)
except KeyError:
pass
if "build.ldscript" in board_config:
env.Replace(LDSCRIPT_PATH=board_config.get("build.ldscript"))
def PrintConfiguration(env):
def PrintConfiguration(env): # pylint: disable=too-many-statements
platform = env.PioPlatform()
platform_data = ["PLATFORM: %s >" % platform.title]
hardware_data = ["HARDWARE:"]
configuration_data = ["CONFIGURATION:"]
mcu = env.subst("$BOARD_MCU")
f_cpu = env.subst("$BOARD_F_CPU")
if mcu:
hardware_data.append(mcu.upper())
if f_cpu:
f_cpu = int("".join([c for c in str(f_cpu) if c.isdigit()]))
hardware_data.append("%dMHz" % (f_cpu / 1000000))
pkg_metadata = PackageItem(platform.get_dir()).metadata
board_config = env.BoardConfig() if "BOARD" in env else None
debug_tools = None
if "BOARD" in env:
board_config = env.BoardConfig()
platform_data.append(board_config.get("name"))
def _get_configuration_data():
return (
None
if not board_config
else [
"CONFIGURATION:",
"https://docs.platformio.org/page/boards/%s/%s.html"
% (platform.name, board_config.id),
]
)
debug_tools = board_config.get("debug", {}).get("tools")
def _get_platform_data():
data = [
"PLATFORM: %s (%s)"
% (
platform.title,
pkg_metadata.version if pkg_metadata else platform.version,
)
]
if (
int(ARGUMENTS.get("PIOVERBOSE", 0))
and pkg_metadata
and pkg_metadata.spec.external
):
data.append("(%s)" % pkg_metadata.spec.url)
if board_config:
data.extend([">", board_config.get("name")])
return data
def _get_hardware_data():
data = ["HARDWARE:"]
mcu = env.subst("$BOARD_MCU")
f_cpu = env.subst("$BOARD_F_CPU")
if mcu:
data.append(mcu.upper())
if f_cpu:
f_cpu = int("".join([c for c in str(f_cpu) if c.isdigit()]))
data.append("%dMHz," % (f_cpu / 1000000))
if not board_config:
return data
ram = board_config.get("upload", {}).get("maximum_ram_size")
flash = board_config.get("upload", {}).get("maximum_size")
hardware_data.append(
"%s RAM (%s Flash)" % (util.format_filesize(ram),
util.format_filesize(flash)))
configuration_data.append(
"https://docs.platformio.org/page/boards/%s/%s.html" %
(platform.name, board_config.id))
data.append(
"%s RAM, %s Flash"
% (fs.humanize_file_size(ram), fs.humanize_file_size(flash))
)
return data
for data in (configuration_data, platform_data, hardware_data):
if len(data) > 1:
def _get_debug_data():
debug_tools = (
board_config.get("debug", {}).get("tools") if board_config else None
)
if not debug_tools:
return None
data = [
"DEBUG:",
"Current",
"(%s)"
% board_config.get_debug_tool_name(env.GetProjectOption("debug_tool")),
]
onboard = []
external = []
for key, value in debug_tools.items():
if value.get("onboard"):
onboard.append(key)
else:
external.append(key)
if onboard:
data.extend(["On-board", "(%s)" % ", ".join(sorted(onboard))])
if external:
data.extend(["External", "(%s)" % ", ".join(sorted(external))])
return data
def _get_packages_data():
data = []
for item in platform.dump_used_packages():
original_version = get_original_version(item["version"])
info = "%s %s" % (item["name"], item["version"])
extra = []
if original_version:
extra.append(original_version)
if "src_url" in item and int(ARGUMENTS.get("PIOVERBOSE", 0)):
extra.append(item["src_url"])
if extra:
info += " (%s)" % ", ".join(extra)
data.append(info)
if not data:
return None
return ["PACKAGES:"] + ["\n - %s" % d for d in sorted(data)]
for data in (
_get_configuration_data(),
_get_platform_data(),
_get_hardware_data(),
_get_debug_data(),
_get_packages_data(),
):
if data and len(data) > 1:
print(" ".join(data))
# Debugging
if not debug_tools:
return
data = [
"CURRENT(%s)" % board_config.get_debug_tool_name(
env.subst("$DEBUG_TOOL"))
]
onboard = []
external = []
for key, value in debug_tools.items():
if value.get("onboard"):
onboard.append(key)
else:
external.append(key)
if onboard:
data.append("ON-BOARD(%s)" % ", ".join(sorted(onboard)))
if external:
data.append("EXTERNAL(%s)" % ", ".join(sorted(external)))
print("DEBUG: %s" % " ".join(data))
def exists(_):
return True
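One detail worth noting in LoadPioPlatform() above: board_* overrides coming from platformio.ini arrive as strings and are coerced to the type already stored in the board manifest before being applied. A tiny stand-alone sketch of that coercion rule (function name is hypothetical):

def coerce_board_override(current_value, raw_value):
    # Booleans accept "1", "yes" or "true"; integers are parsed; everything
    # else is taken verbatim.
    if isinstance(current_value, bool):
        return str(raw_value).lower() in ("1", "yes", "true")
    if isinstance(current_value, int):
        return int(raw_value)
    return raw_value

print(coerce_board_override(16000000, "8000000"))  # -> 8000000 (int)
print(coerce_board_override(False, "yes"))         # -> True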

View File

@ -0,0 +1,53 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from __future__ import absolute_import
from platformio.project.config import MISSING, ProjectConfig, ProjectOptions
def GetProjectConfig(env):
return ProjectConfig.get_instance(env["PROJECT_CONFIG"])
def GetProjectOptions(env, as_dict=False):
return env.GetProjectConfig().items(env=env["PIOENV"], as_dict=as_dict)
def GetProjectOption(env, option, default=MISSING):
return env.GetProjectConfig().get("env:" + env["PIOENV"], option, default)
def LoadProjectOptions(env):
for option, value in env.GetProjectOptions():
option_meta = ProjectOptions.get("env." + option)
if (
not option_meta
or not option_meta.buildenvvar
or option_meta.buildenvvar in env
):
continue
env[option_meta.buildenvvar] = value
def exists(_):
return True
def generate(env):
env.AddMethod(GetProjectConfig)
env.AddMethod(GetProjectOptions)
env.AddMethod(GetProjectOption)
env.AddMethod(LoadProjectOptions)
return env
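The methods registered here are what project extra scripts typically call. A hypothetical extra_scripts file consuming them; Import() and env only exist when SCons executes this inside "pio run", so it is not runnable on its own:

Import("env")  # pylint: disable=undefined-variable

# GetProjectOption() was added via env.AddMethod() above; the second
# argument is a default returned when the option is missing.
monitor_speed = env.GetProjectOption("monitor_speed", "9600")
print("Configured monitor speed: %s" % monitor_speed)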

View File

@ -0,0 +1,254 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# pylint: disable=too-many-locals
from __future__ import absolute_import
import sys
from os import environ, makedirs, remove
from os.path import isdir, join, splitdrive
from elftools.elf.descriptions import describe_sh_flags
from elftools.elf.elffile import ELFFile
from platformio.compat import dump_json_to_unicode
from platformio.proc import exec_command
from platformio.util import get_systype
def _run_tool(cmd, env, tool_args):
sysenv = environ.copy()
sysenv["PATH"] = str(env["ENV"]["PATH"])
build_dir = env.subst("$BUILD_DIR")
if not isdir(build_dir):
makedirs(build_dir)
tmp_file = join(build_dir, "size-data-longcmd.txt")
with open(tmp_file, "w") as fp:
fp.write("\n".join(tool_args))
cmd.append("@" + tmp_file)
result = exec_command(cmd, env=sysenv)
remove(tmp_file)
return result
def _get_symbol_locations(env, elf_path, addrs):
if not addrs:
return {}
cmd = [env.subst("$CC").replace("-gcc", "-addr2line"), "-e", elf_path]
result = _run_tool(cmd, env, addrs)
locations = [line for line in result["out"].split("\n") if line]
assert len(addrs) == len(locations)
return dict(zip(addrs, [l.strip() for l in locations]))
def _get_demangled_names(env, mangled_names):
if not mangled_names:
return {}
result = _run_tool(
[env.subst("$CC").replace("-gcc", "-c++filt")], env, mangled_names
)
demangled_names = [line for line in result["out"].split("\n") if line]
assert len(mangled_names) == len(demangled_names)
return dict(
zip(
mangled_names,
[dn.strip().replace("::__FUNCTION__", "") for dn in demangled_names],
)
)
def _determine_section(sections, symbol_addr):
for section, info in sections.items():
if not _is_flash_section(info) and not _is_ram_section(info):
continue
if symbol_addr in range(info["start_addr"], info["start_addr"] + info["size"]):
return section
return "unknown"
def _is_ram_section(section):
return (
section.get("type", "") in ("SHT_NOBITS", "SHT_PROGBITS")
and section.get("flags", "") == "WA"
)
def _is_flash_section(section):
return section.get("type", "") == "SHT_PROGBITS" and "A" in section.get("flags", "")
def _is_valid_symbol(symbol_name, symbol_type, symbol_address):
return symbol_name and symbol_address != 0 and symbol_type != "STT_NOTYPE"
def _collect_sections_info(elffile):
sections = {}
for section in elffile.iter_sections():
if section.is_null() or section.name.startswith(".debug"):
continue
section_type = section["sh_type"]
section_flags = describe_sh_flags(section["sh_flags"])
section_size = section.data_size
sections[section.name] = {
"size": section_size,
"start_addr": section["sh_addr"],
"type": section_type,
"flags": section_flags,
}
return sections
def _collect_symbols_info(env, elffile, elf_path, sections):
symbols = []
symbol_section = elffile.get_section_by_name(".symtab")
if symbol_section.is_null():
sys.stderr.write("Couldn't find symbol table. Is ELF file stripped?")
env.Exit(1)
sysenv = environ.copy()
sysenv["PATH"] = str(env["ENV"]["PATH"])
symbol_addrs = []
mangled_names = []
for s in symbol_section.iter_symbols():
symbol_info = s.entry["st_info"]
symbol_addr = s["st_value"]
symbol_size = s["st_size"]
symbol_type = symbol_info["type"]
if not _is_valid_symbol(s.name, symbol_type, symbol_addr):
continue
symbol = {
"addr": symbol_addr,
"bind": symbol_info["bind"],
"name": s.name,
"type": symbol_type,
"size": symbol_size,
"section": _determine_section(sections, symbol_addr),
}
if s.name.startswith("_Z"):
mangled_names.append(s.name)
symbol_addrs.append(hex(symbol_addr))
symbols.append(symbol)
symbol_locations = _get_symbol_locations(env, elf_path, symbol_addrs)
demangled_names = _get_demangled_names(env, mangled_names)
for symbol in symbols:
if symbol["name"].startswith("_Z"):
symbol["demangled_name"] = demangled_names.get(symbol["name"])
location = symbol_locations.get(hex(symbol["addr"]))
if not location or "?" in location:
continue
if "windows" in get_systype():
drive, tail = splitdrive(location)
location = join(drive.upper(), tail)
symbol["file"] = location
symbol["line"] = 0
if ":" in location:
file_, line = location.rsplit(":", 1)
if line.isdigit():
symbol["file"] = file_
symbol["line"] = int(line)
return symbols
def _calculate_firmware_size(sections):
flash_size = ram_size = 0
for section_info in sections.values():
if _is_flash_section(section_info):
flash_size += section_info.get("size", 0)
if _is_ram_section(section_info):
ram_size += section_info.get("size", 0)
return ram_size, flash_size
def DumpSizeData(_, target, source, env): # pylint: disable=unused-argument
data = {"device": {}, "memory": {}, "version": 1}
board = env.BoardConfig()
if board:
data["device"] = {
"mcu": board.get("build.mcu", ""),
"cpu": board.get("build.cpu", ""),
"frequency": board.get("build.f_cpu"),
"flash": int(board.get("upload.maximum_size", 0)),
"ram": int(board.get("upload.maximum_ram_size", 0)),
}
if data["device"]["frequency"] and data["device"]["frequency"].endswith("L"):
data["device"]["frequency"] = int(data["device"]["frequency"][0:-1])
elf_path = env.subst("$PIOMAINPROG")
with open(elf_path, "rb") as fp:
elffile = ELFFile(fp)
if not elffile.has_dwarf_info():
sys.stderr.write("Elf file doesn't contain DWARF information")
env.Exit(1)
sections = _collect_sections_info(elffile)
firmware_ram, firmware_flash = _calculate_firmware_size(sections)
data["memory"]["total"] = {
"ram_size": firmware_ram,
"flash_size": firmware_flash,
"sections": sections,
}
files = dict()
for symbol in _collect_symbols_info(env, elffile, elf_path, sections):
file_path = symbol.get("file") or "unknown"
if not files.get(file_path, {}):
files[file_path] = {"symbols": [], "ram_size": 0, "flash_size": 0}
symbol_size = symbol.get("size", 0)
section = sections.get(symbol.get("section", ""), {})
if _is_ram_section(section):
files[file_path]["ram_size"] += symbol_size
if _is_flash_section(section):
files[file_path]["flash_size"] += symbol_size
files[file_path]["symbols"].append(symbol)
data["memory"]["files"] = list()
for k, v in files.items():
file_data = {"path": k}
file_data.update(v)
data["memory"]["files"].append(file_data)
with open(join(env.subst("$BUILD_DIR"), "sizedata.json"), "w") as fp:
fp.write(dump_json_to_unicode(data))
def exists(_):
return True
def generate(env):
env.AddMethod(DumpSizeData)
return env
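DumpSizeData() walks the ELF sections and symbols with pyelftools. A minimal sketch of just the section-size part, using the same flash/RAM classification as _is_flash_section()/_is_ram_section() above (the ELF path is a placeholder):

from elftools.elf.descriptions import describe_sh_flags
from elftools.elf.elffile import ELFFile

def firmware_size(elf_path):
    # Allocatable PROGBITS sections count as flash; writable
    # NOBITS/PROGBITS sections count as RAM.
    flash = ram = 0
    with open(elf_path, "rb") as fp:
        for section in ELFFile(fp).iter_sections():
            if section.is_null() or section.name.startswith(".debug"):
                continue
            flags = describe_sh_flags(section["sh_flags"])
            if section["sh_type"] == "SHT_PROGBITS" and "A" in flags:
                flash += section.data_size
            if section["sh_type"] in ("SHT_NOBITS", "SHT_PROGBITS") and flags == "WA":
                ram += section.data_size
    return ram, flash

# ram, flash = firmware_size(".pio/build/uno/firmware.elf")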

View File

@ -0,0 +1,119 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from __future__ import absolute_import
import os
from SCons.Action import Action # pylint: disable=import-error
from SCons.Script import ARGUMENTS # pylint: disable=import-error
from SCons.Script import AlwaysBuild # pylint: disable=import-error
from platformio import compat, fs
def VerboseAction(_, act, actstr):
if int(ARGUMENTS.get("PIOVERBOSE", 0)):
return act
return Action(act, actstr)
def PioClean(env, clean_dir):
def _relpath(path):
if compat.WINDOWS:
prefix = os.getcwd()[:2].lower()
if (
":" not in prefix
or not path.lower().startswith(prefix)
or os.path.relpath(path).startswith("..")
):
return path
return os.path.relpath(path)
if not os.path.isdir(clean_dir):
print("Build environment is clean")
env.Exit(0)
clean_rel_path = _relpath(clean_dir)
for root, _, files in os.walk(clean_dir):
for f in files:
dst = os.path.join(root, f)
os.remove(dst)
print(
"Removed %s"
% (dst if not clean_rel_path.startswith(".") else _relpath(dst))
)
print("Done cleaning")
fs.rmtree(clean_dir)
env.Exit(0)
def AddTarget( # pylint: disable=too-many-arguments
env,
name,
dependencies,
actions,
title=None,
description=None,
group="Generic",
always_build=True,
):
if "__PIO_TARGETS" not in env:
env["__PIO_TARGETS"] = {}
assert name not in env["__PIO_TARGETS"]
env["__PIO_TARGETS"][name] = dict(
name=name, title=title, description=description, group=group
)
target = env.Alias(name, dependencies, actions)
if always_build:
AlwaysBuild(target)
return target
def AddPlatformTarget(env, *args, **kwargs):
return env.AddTarget(group="Platform", *args, **kwargs)
def AddCustomTarget(env, *args, **kwargs):
return env.AddTarget(group="Custom", *args, **kwargs)
def DumpTargets(env):
targets = env.get("__PIO_TARGETS") or {}
# pre-fill default targets if this is an embedded dev-platform
if env.PioPlatform().is_embedded() and not any(
t["group"] == "Platform" for t in targets.values()
):
targets["upload"] = dict(name="upload", group="Platform", title="Upload")
targets["compiledb"] = dict(
name="compiledb",
title="Compilation Database",
description="Generate compilation database `compile_commands.json`",
group="Advanced",
)
targets["clean"] = dict(name="clean", title="Clean", group="Generic")
return list(targets.values())
def exists(_):
return True
def generate(env):
env.AddMethod(VerboseAction)
env.AddMethod(PioClean)
env.AddMethod(AddTarget)
env.AddMethod(AddPlatformTarget)
env.AddMethod(AddCustomTarget)
env.AddMethod(DumpTargets)
return env
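AddCustomTarget() is the hook that lets projects and platforms register extra "pio run -t <name>" tasks. A hypothetical post-script using it; Import()/env exist only inside a PlatformIO build, and $OBJCOPY is assumed to be provided by the dev-platform:

Import("env")  # pylint: disable=undefined-variable

env.AddCustomTarget(
    name="gen_hex",
    dependencies="$BUILD_DIR/${PROGNAME}.elf",
    actions="$OBJCOPY -O ihex $BUILD_DIR/${PROGNAME}.elf $BUILD_DIR/${PROGNAME}.hex",
    title="Generate HEX",
    description="Convert the firmware ELF into an Intel HEX image",
)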

View File

@ -22,10 +22,12 @@ from os.path import isfile, join
from shutil import copyfile
from time import sleep
from SCons.Script import ARGUMENTS
from SCons.Script import ARGUMENTS # pylint: disable=import-error
from serial import Serial, SerialException
from platformio import exception, util
from platformio import exception, fs, util
from platformio.compat import WINDOWS
from platformio.proc import exec_command
# pylint: disable=unused-argument
@ -58,9 +60,9 @@ def WaitForNewSerialPort(env, before):
prev_port = env.subst("$UPLOAD_PORT")
new_port = None
elapsed = 0
before = [p['port'] for p in before]
before = [p["port"] for p in before]
while elapsed < 5 and new_port is None:
now = [p['port'] for p in util.get_serial_ports()]
now = [p["port"] for p in util.get_serial_ports()]
for p in now:
if p not in before:
new_port = p
@@ -82,10 +84,12 @@ def WaitForNewSerialPort(env, before):
sleep(1)
if not new_port:
sys.stderr.write("Error: Couldn't find a board on the selected port. "
"Check that you have the correct port selected. "
"If it is correct, try pressing the board's reset "
"button after initiating the upload.\n")
sys.stderr.write(
"Error: Couldn't find a board on the selected port. "
"Check that you have the correct port selected. "
"If it is correct, try pressing the board's reset "
"button after initiating the upload.\n"
)
env.Exit(1)
return new_port
@@ -97,8 +101,8 @@ def AutodetectUploadPort(*args, **kwargs):
def _get_pattern():
if "UPLOAD_PORT" not in env:
return None
if set(["*", "?", "[", "]"]) & set(env['UPLOAD_PORT']):
return env['UPLOAD_PORT']
if set(["*", "?", "[", "]"]) & set(env["UPLOAD_PORT"]):
return env["UPLOAD_PORT"]
return None
def _is_match_pattern(port):
@@ -110,17 +114,13 @@ def AutodetectUploadPort(*args, **kwargs):
def _look_for_mbed_disk():
msdlabels = ("mbed", "nucleo", "frdm", "microbit")
for item in util.get_logical_devices():
if item['path'].startswith("/net") or not _is_match_pattern(
item['path']):
if item["path"].startswith("/net") or not _is_match_pattern(item["path"]):
continue
mbed_pages = [
join(item['path'], n) for n in ("mbed.htm", "mbed.html")
]
mbed_pages = [join(item["path"], n) for n in ("mbed.htm", "mbed.html")]
if any(isfile(p) for p in mbed_pages):
return item['path']
if item['name'] \
and any(l in item['name'].lower() for l in msdlabels):
return item['path']
return item["path"]
if item["name"] and any(l in item["name"].lower() for l in msdlabels):
return item["path"]
return None
def _look_for_serial_port():
@@ -130,18 +130,17 @@ def AutodetectUploadPort(*args, **kwargs):
if "BOARD" in env and "build.hwids" in env.BoardConfig():
board_hwids = env.BoardConfig().get("build.hwids")
for item in util.get_serial_ports(filter_hwid=True):
if not _is_match_pattern(item['port']):
if not _is_match_pattern(item["port"]):
continue
port = item['port']
port = item["port"]
if upload_protocol.startswith("blackmagic"):
if "windows" in util.get_systype() and \
port.startswith("COM") and len(port) > 4:
if WINDOWS and port.startswith("COM") and len(port) > 4:
port = "\\\\.\\%s" % port
if "GDB" in item['description']:
if "GDB" in item["description"]:
return port
for hwid in board_hwids:
hwid_str = ("%s:%s" % (hwid[0], hwid[1])).replace("0x", "")
if hwid_str in item['hwid']:
if hwid_str in item["hwid"]:
return port
return port
@@ -149,13 +148,13 @@ def AutodetectUploadPort(*args, **kwargs):
print(env.subst("Use manually specified: $UPLOAD_PORT"))
return
if (env.subst("$UPLOAD_PROTOCOL") == "mbed"
or ("mbed" in env.subst("$PIOFRAMEWORK")
and not env.subst("$UPLOAD_PROTOCOL"))):
if env.subst("$UPLOAD_PROTOCOL") == "mbed" or (
"mbed" in env.subst("$PIOFRAMEWORK") and not env.subst("$UPLOAD_PROTOCOL")
):
env.Replace(UPLOAD_PORT=_look_for_mbed_disk())
else:
try:
util.ensure_udev_rules()
fs.ensure_udev_rules()
except exception.InvalidUdevRules as e:
sys.stderr.write("\n%s\n\n" % e)
env.Replace(UPLOAD_PORT=_look_for_serial_port())
@@ -167,7 +166,8 @@ def AutodetectUploadPort(*args, **kwargs):
"Error: Please specify `upload_port` for environment or use "
"global `--upload-port` option.\n"
"For some development platforms it can be a USB flash "
"drive (i.e. /media/<user>/<device name>)\n")
"drive (i.e. /media/<user>/<device name>)\n"
)
env.Exit(1)
@@ -178,16 +178,17 @@ def UploadToDisk(_, target, source, env):
fpath = join(env.subst("$BUILD_DIR"), "%s.%s" % (progname, ext))
if not isfile(fpath):
continue
copyfile(fpath,
join(env.subst("$UPLOAD_PORT"), "%s.%s" % (progname, ext)))
print("Firmware has been successfully uploaded.\n"
"(Some boards may require manual hard reset)")
copyfile(fpath, join(env.subst("$UPLOAD_PORT"), "%s.%s" % (progname, ext)))
print(
"Firmware has been successfully uploaded.\n"
"(Some boards may require manual hard reset)"
)
def CheckUploadSize(_, target, source, env):
check_conditions = [
env.get("BOARD"),
env.get("SIZETOOL") or env.get("SIZECHECKCMD")
env.get("SIZETOOL") or env.get("SIZECHECKCMD"),
]
if not all(check_conditions):
return
@@ -200,7 +201,8 @@ def CheckUploadSize(_, target, source, env):
env.Replace(
SIZECHECKCMD="$SIZETOOL -B -d $SOURCES",
SIZEPROGREGEXP=r"^(\d+)\s+(\d+)\s+\d+\s",
SIZEDATAREGEXP=r"^\d+\s+(\d+)\s+(\d+)\s+\d+")
SIZEDATAREGEXP=r"^\d+\s+(\d+)\s+(\d+)\s+\d+",
)
def _get_size_output():
cmd = env.get("SIZECHECKCMD")
@@ -210,11 +212,11 @@ def CheckUploadSize(_, target, source, env):
cmd = cmd.split()
cmd = [arg.replace("$SOURCES", str(source[0])) for arg in cmd if arg]
sysenv = environ.copy()
sysenv['PATH'] = str(env['ENV']['PATH'])
result = util.exec_command(env.subst(cmd), env=sysenv)
if result['returncode'] != 0:
sysenv["PATH"] = str(env["ENV"]["PATH"])
result = exec_command(env.subst(cmd), env=sysenv)
if result["returncode"] != 0:
return None
return result['out'].strip()
return result["out"].strip()
def _calculate_size(output, pattern):
if not output or not pattern:
@@ -238,7 +240,8 @@ def CheckUploadSize(_, target, source, env):
if used_blocks > blocks_per_progress:
used_blocks = blocks_per_progress
return "[{:{}}] {: 6.1%} (used {:d} bytes from {:d} bytes)".format(
"=" * used_blocks, blocks_per_progress, percent_raw, value, total)
"=" * used_blocks, blocks_per_progress, percent_raw, value, total
)
if not env.get("SIZECHECKCMD") and not env.get("SIZEPROGREGEXP"):
_configure_defaults()
@@ -246,12 +249,11 @@ def CheckUploadSize(_, target, source, env):
program_size = _calculate_size(output, env.get("SIZEPROGREGEXP"))
data_size = _calculate_size(output, env.get("SIZEDATAREGEXP"))
print("Memory Usage -> http://bit.ly/pio-memory-usage")
print('Advanced Memory Usage is available via "PlatformIO Home > Project Inspect"')
if data_max_size and data_size > -1:
print("DATA: %s" % _format_availale_bytes(data_size, data_max_size))
print("RAM: %s" % _format_availale_bytes(data_size, data_max_size))
if program_size > -1:
print("PROGRAM: %s" % _format_availale_bytes(program_size,
program_max_size))
print("Flash: %s" % _format_availale_bytes(program_size, program_max_size))
if int(ARGUMENTS.get("PIOVERBOSE", 0)):
print(output)
@@ -262,9 +264,10 @@ def CheckUploadSize(_, target, source, env):
# "than maximum allowed (%s bytes)\n" % (data_size, data_max_size))
# env.Exit(1)
if program_size > program_max_size:
sys.stderr.write("Error: The program size (%d bytes) is greater "
"than maximum allowed (%s bytes)\n" %
(program_size, program_max_size))
sys.stderr.write(
"Error: The program size (%d bytes) is greater "
"than maximum allowed (%s bytes)\n" % (program_size, program_max_size)
)
env.Exit(1)
@@ -272,8 +275,7 @@ def PrintUploadInfo(env):
configured = env.subst("$UPLOAD_PROTOCOL")
available = [configured] if configured else []
if "BOARD" in env:
available.extend(env.BoardConfig().get("upload", {}).get(
"protocols", []))
available.extend(env.BoardConfig().get("upload", {}).get("protocols", []))
if available:
print("AVAILABLE: %s" % ", ".join(sorted(set(available))))
if configured:
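CheckUploadSize above feeds the `size -B -d` output through the two regular expressions set in _configure_defaults. A standalone sketch of that kind of parsing (the sample output line is fabricated, and summing the captured groups per line is an assumption since _calculate_size itself is not shown in this hunk):
import re

SIZEPROGREGEXP = r"^(\d+)\s+(\d+)\s+\d+\s"       # text + data -> flash
SIZEDATAREGEXP = r"^\d+\s+(\d+)\s+(\d+)\s+\d+"   # data + bss  -> RAM

# Fabricated `size -B -d` output: text, data, bss, dec, hex, filename
SAMPLE = """   text    data     bss     dec     hex filename
  10528     344    1468   12340    3034 firmware.elf"""

def calculate_size(output, pattern):
    total = 0
    for line in output.splitlines():
        match = re.search(pattern, line.strip())
        if match:
            total += sum(int(group) for group in match.groups())
    return total

print("PROGRAM:", calculate_size(SAMPLE, SIZEPROGREGEXP))  # 10872 bytes
print("DATA:   ", calculate_size(SAMPLE, SIZEDATAREGEXP))  # 1812 bytes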

@@ -14,23 +14,28 @@
from __future__ import absolute_import
import re
import fnmatch
import os
import sys
from glob import glob
from os import sep, walk
from os.path import basename, dirname, isdir, join, realpath
from SCons import Builder, Util
from SCons.Script import (COMMAND_LINE_TARGETS, AlwaysBuild,
DefaultEnvironment, Export, SConscript)
from SCons import Builder, Util # pylint: disable=import-error
from SCons.Node import FS # pylint: disable=import-error
from SCons.Script import COMMAND_LINE_TARGETS # pylint: disable=import-error
from SCons.Script import AlwaysBuild # pylint: disable=import-error
from SCons.Script import DefaultEnvironment # pylint: disable=import-error
from SCons.Script import Export # pylint: disable=import-error
from SCons.Script import SConscript # pylint: disable=import-error
from platformio.util import glob_escape, pioversion_to_intstr
from platformio import __version__, fs
from platformio.compat import MACOS, string_types
from platformio.package.version import pepver_to_semver
SRC_HEADER_EXT = ["h", "hpp"]
SRC_C_EXT = ["c", "cc", "cpp"]
SRC_BUILD_EXT = SRC_C_EXT + ["S", "spp", "SPP", "sx", "s", "asm", "ASM"]
SRC_FILTER_DEFAULT = ["+<*>", "-<.git%s>" % sep, "-<svn%s>" % sep]
SRC_FILTER_PATTERNS_RE = re.compile(r"(\+|\-)<([^>]+)>")
SRC_ASM_EXT = ["S", "spp", "SPP", "sx", "s", "asm", "ASM"]
SRC_C_EXT = ["c"]
SRC_CXX_EXT = ["cc", "cpp", "cxx", "c++"]
SRC_BUILD_EXT = SRC_C_EXT + SRC_CXX_EXT + SRC_ASM_EXT
SRC_FILTER_DEFAULT = ["+<*>", "-<.git%s>" % os.sep, "-<.svn%s>" % os.sep]
def scons_patched_match_splitext(path, suffixes=None):
@@ -41,49 +46,67 @@ def scons_patched_match_splitext(path, suffixes=None):
return tokens
def _build_project_deps(env):
project_lib_builder = env.ConfigureProjectLibBuilder()
# prepend project libs to the beginning of list
env.Prepend(LIBS=project_lib_builder.build())
# prepend extra linker related options from libs
env.PrependUnique(
**{
key: project_lib_builder.env.get(key)
for key in ("LIBS", "LIBPATH", "LINKFLAGS")
if project_lib_builder.env.get(key)
})
projenv = env.Clone()
# CPPPATH from dependencies
projenv.PrependUnique(CPPPATH=project_lib_builder.env.get("CPPPATH"))
# extra build flags from `platformio.ini`
projenv.ProcessFlags(env.get("SRC_BUILD_FLAGS"))
is_test = "__test" in COMMAND_LINE_TARGETS
if is_test:
projenv.BuildSources("$BUILDTEST_DIR", "$PROJECTTEST_DIR",
"$PIOTEST_SRC_FILTER")
if not is_test or env.get("TEST_BUILD_PROJECT_SRC") == "true":
projenv.BuildSources("$BUILDSRC_DIR", "$PROJECTSRC_DIR",
env.get("SRC_FILTER"))
if not env.get("PIOBUILDFILES") and not COMMAND_LINE_TARGETS:
sys.stderr.write(
"Error: Nothing to build. Please put your source code files "
"to '%s' folder\n" % env.subst("$PROJECTSRC_DIR"))
env.Exit(1)
Export("projenv")
def GetBuildType(env):
return (
"debug"
if (
set(["debug", "sizedata"]) & set(COMMAND_LINE_TARGETS)
or env.GetProjectOption("build_type") == "debug"
)
else "release"
)
def BuildProgram(env):
env.ProcessProgramDeps()
env.ProcessProjectDeps()
# append into the beginning a main LD script
if env.get("LDSCRIPT_PATH") and not any("-Wl,-T" in f for f in env["LINKFLAGS"]):
env.Prepend(LINKFLAGS=["-T", env.subst("$LDSCRIPT_PATH")])
# enable "cyclic reference" for linker
if (
env.get("LIBS")
and env.GetCompilerType() == "gcc"
and (env.PioPlatform().is_embedded() or not MACOS)
):
env.Prepend(_LIBFLAGS="-Wl,--start-group ")
env.Append(_LIBFLAGS=" -Wl,--end-group")
program = env.Program(
os.path.join("$BUILD_DIR", env.subst("$PROGNAME")), env["PIOBUILDFILES"]
)
env.Replace(PIOMAINPROG=program)
AlwaysBuild(
env.Alias(
"checkprogsize",
program,
env.VerboseAction(env.CheckUploadSize, "Checking size $PIOMAINPROG"),
)
)
print("Building in %s mode" % env.GetBuildType())
return program
def ProcessProgramDeps(env):
def _append_pio_macros():
env.AppendUnique(CPPDEFINES=[(
"PLATFORMIO",
int("{0:02d}{1:02d}{2:02d}".format(*pioversion_to_intstr())))])
core_version = pepver_to_semver(__version__)
env.AppendUnique(
CPPDEFINES=[
(
"PLATFORMIO",
int(
"{0:02d}{1:02d}{2:02d}".format(
core_version.major, core_version.minor, core_version.patch
)
),
)
]
)
_append_pio_macros()
@@ -93,9 +116,6 @@ def BuildProgram(env):
if not Util.case_sensitive_suffixes(".s", ".S"):
env.Replace(AS="$CC", ASCOM="$ASPPCOM")
if set(["__debug", "debug"]) & set(COMMAND_LINE_TARGETS):
env.ProcessDebug()
# process extra flags from board
if "BOARD" in env and "build.extra_flags" in env.BoardConfig():
env.ProcessFlags(env.BoardConfig().get("build.extra_flags"))
@@ -106,39 +126,55 @@ def BuildProgram(env):
# process framework scripts
env.BuildFrameworks(env.get("PIOFRAMEWORK"))
# restore PIO macros if it was deleted by framework
_append_pio_macros()
if env.GetBuildType() == "debug":
env.ConfigureDebugFlags()
# remove specified flags
env.ProcessUnFlags(env.get("BUILD_UNFLAGS"))
if "__test" in COMMAND_LINE_TARGETS:
env.ProcessTest()
env.ConfigureTestTarget()
# build project with dependencies
_build_project_deps(env)
# append into the beginning a main LD script
if (env.get("LDSCRIPT_PATH")
and not any("-Wl,-T" in f for f in env['LINKFLAGS'])):
env.Prepend(LINKFLAGS=["-T", "$LDSCRIPT_PATH"])
def ProcessProjectDeps(env):
project_lib_builder = env.ConfigureProjectLibBuilder()
# enable "cyclic reference" for linker
if env.get("LIBS") and env.GetCompilerType() == "gcc":
env.Prepend(_LIBFLAGS="-Wl,--start-group ")
env.Append(_LIBFLAGS=" -Wl,--end-group")
# prepend project libs to the beginning of list
env.Prepend(LIBS=project_lib_builder.build())
# prepend extra linker related options from libs
env.PrependUnique(
**{
key: project_lib_builder.env.get(key)
for key in ("LIBS", "LIBPATH", "LINKFLAGS")
if project_lib_builder.env.get(key)
}
)
program = env.Program(
join("$BUILD_DIR", env.subst("$PROGNAME")), env['PIOBUILDFILES'])
env.Replace(PIOMAINPROG=program)
projenv = env.Clone()
AlwaysBuild(
env.Alias(
"checkprogsize", program,
env.VerboseAction(env.CheckUploadSize,
"Checking size $PIOMAINPROG")))
# CPPPATH from dependencies
projenv.PrependUnique(CPPPATH=project_lib_builder.env.get("CPPPATH"))
# extra build flags from `platformio.ini`
projenv.ProcessFlags(env.get("SRC_BUILD_FLAGS"))
return program
is_test = "__test" in COMMAND_LINE_TARGETS
if is_test:
projenv.BuildSources(
"$BUILD_TEST_DIR", "$PROJECT_TEST_DIR", "$PIOTEST_SRC_FILTER"
)
if not is_test or env.GetProjectOption("test_build_project_src"):
projenv.BuildSources(
"$BUILD_SRC_DIR", "$PROJECT_SRC_DIR", env.get("SRC_FILTER")
)
if not env.get("PIOBUILDFILES") and not COMMAND_LINE_TARGETS:
sys.stderr.write(
"Error: Nothing to build. Please put your source code files "
"to '%s' folder\n" % env.subst("$PROJECT_SRC_DIR")
)
env.Exit(1)
Export("projenv")
def ParseFlagsExtended(env, flags): # pylint: disable=too-many-branches
@@ -152,30 +188,30 @@ def ParseFlagsExtended(env, flags): # pylint: disable=too-many-branches
result[key].extend(value)
cppdefines = []
for item in result['CPPDEFINES']:
for item in result["CPPDEFINES"]:
if not Util.is_Sequence(item):
cppdefines.append(item)
continue
name, value = item[:2]
if '\"' in value:
value = value.replace('\"', '\\\"')
if '"' in value:
value = value.replace('"', '\\"')
elif value.isdigit():
value = int(value)
elif value.replace(".", "", 1).isdigit():
value = float(value)
cppdefines.append((name, value))
result['CPPDEFINES'] = cppdefines
result["CPPDEFINES"] = cppdefines
# fix relative CPPPATH & LIBPATH
for k in ("CPPPATH", "LIBPATH"):
for i, p in enumerate(result.get(k, [])):
if isdir(p):
result[k][i] = realpath(p)
if os.path.isdir(p):
result[k][i] = os.path.realpath(p)
# fix relative path for "-include"
for i, f in enumerate(result.get("CCFLAGS", [])):
if isinstance(f, tuple) and f[0] == "-include":
result['CCFLAGS'][i] = (f[0], env.File(realpath(f[1].get_path())))
result["CCFLAGS"][i] = (f[0], env.File(os.path.realpath(f[1].get_path())))
return result
@@ -188,12 +224,15 @@ def ProcessFlags(env, flags): # pylint: disable=too-many-branches
# Cancel any previous definition of name, either built in or
# provided with a -U option // Issue #191
undefines = [
u for u in env.get("CCFLAGS", [])
if isinstance(u, basestring) and u.startswith("-U")
u
for u in env.get("CCFLAGS", [])
if isinstance(u, string_types) and u.startswith("-U")
]
if undefines:
for undef in undefines:
env['CCFLAGS'].remove(undef)
env["CCFLAGS"].remove(undef)
if undef[2:] in env["CPPDEFINES"]:
env["CPPDEFINES"].remove(undef[2:])
env.Append(_CPPDEFFLAGS=" %s" % " ".join(undefines))
@@ -216,78 +255,61 @@ def ProcessUnFlags(env, flags):
for current in env.get(key, []):
conditions = [
unflag == current,
isinstance(current, (tuple, list))
and unflag[0] == current[0]
isinstance(current, (tuple, list)) and unflag[0] == current[0],
]
if any(conditions):
env[key].remove(current)
def IsFileWithExt(env, file_, ext): # pylint: disable=W0613
if basename(file_).startswith("."):
return False
for e in ext:
if file_.endswith(".%s" % e):
return True
return False
def MatchSourceFiles(env, src_dir, src_filter=None):
def _append_build_item(items, item, src_dir):
if env.IsFileWithExt(item, SRC_BUILD_EXT + SRC_HEADER_EXT):
items.add(item.replace(src_dir + sep, ""))
src_dir = env.subst(src_dir)
src_filter = env.subst(src_filter) if src_filter else None
src_filter = src_filter or SRC_FILTER_DEFAULT
if isinstance(src_filter, (list, tuple)):
src_filter = " ".join(src_filter)
matches = set()
# correct fs directory separator
src_filter = src_filter.replace("/", sep).replace("\\", sep)
for (action, pattern) in SRC_FILTER_PATTERNS_RE.findall(src_filter):
items = set()
for item in glob(join(glob_escape(src_dir), pattern)):
if isdir(item):
for root, _, files in walk(item, followlinks=True):
for f in files:
_append_build_item(items, join(root, f), src_dir)
else:
_append_build_item(items, item, src_dir)
if action == "+":
matches |= items
else:
matches -= items
return sorted(list(matches))
return fs.match_src_files(
env.subst(src_dir), src_filter, SRC_BUILD_EXT + SRC_HEADER_EXT
)
def CollectBuildFiles(env,
variant_dir,
src_dir,
src_filter=None,
duplicate=False):
def CollectBuildFiles(
env, variant_dir, src_dir, src_filter=None, duplicate=False
): # pylint: disable=too-many-locals
sources = []
variants = []
src_dir = env.subst(src_dir)
if src_dir.endswith(sep):
if src_dir.endswith(os.sep):
src_dir = src_dir[:-1]
for item in env.MatchSourceFiles(src_dir, src_filter):
_reldir = dirname(item)
_src_dir = join(src_dir, _reldir) if _reldir else src_dir
_var_dir = join(variant_dir, _reldir) if _reldir else variant_dir
_reldir = os.path.dirname(item)
_src_dir = os.path.join(src_dir, _reldir) if _reldir else src_dir
_var_dir = os.path.join(variant_dir, _reldir) if _reldir else variant_dir
if _var_dir not in variants:
variants.append(_var_dir)
env.VariantDir(_var_dir, _src_dir, duplicate)
if env.IsFileWithExt(item, SRC_BUILD_EXT):
sources.append(env.File(join(_var_dir, basename(item))))
if fs.path_endswith_ext(item, SRC_BUILD_EXT):
sources.append(env.File(os.path.join(_var_dir, os.path.basename(item))))
return sources
middlewares = env.get("__PIO_BUILD_MIDDLEWARES")
if not middlewares:
return sources
new_sources = []
for node in sources:
new_node = node
for callback, pattern in middlewares:
if pattern and not fnmatch.fnmatch(node.srcnode().get_path(), pattern):
continue
new_node = callback(new_node)
if new_node:
new_sources.append(new_node)
return new_sources
def AddBuildMiddleware(env, callback, pattern=None):
env.Append(__PIO_BUILD_MIDDLEWARES=[(callback, pattern)])
def BuildFrameworks(env, frameworks):
@@ -295,8 +317,10 @@ def BuildFrameworks(env, frameworks):
return
if "BOARD" not in env:
sys.stderr.write("Please specify `board` in `platformio.ini` to use "
"with '%s' framework\n" % ", ".join(frameworks))
sys.stderr.write(
"Please specify `board` in `platformio.ini` to use "
"with '%s' framework\n" % ", ".join(frameworks)
)
env.Exit(1)
board_frameworks = env.BoardConfig().get("frameworks", [])
@@ -304,12 +328,11 @@ def BuildFrameworks(env, frameworks):
if board_frameworks:
frameworks.insert(0, board_frameworks[0])
else:
sys.stderr.write(
"Error: Please specify `board` in `platformio.ini`\n")
sys.stderr.write("Error: Please specify `board` in `platformio.ini`\n")
env.Exit(1)
for f in frameworks:
if f in ("arduino", "energia"):
if f == "arduino":
# Arduino IDE appends .o the end of filename
Builder.match_splitext = scons_patched_match_splitext
if "nobuild" not in COMMAND_LINE_TARGETS:
@@ -318,22 +341,24 @@ def BuildFrameworks(env, frameworks):
if f in board_frameworks:
SConscript(env.GetFrameworkScript(f), exports="env")
else:
sys.stderr.write(
"Error: This board doesn't support %s framework!\n" % f)
sys.stderr.write("Error: This board doesn't support %s framework!\n" % f)
env.Exit(1)
def BuildLibrary(env, variant_dir, src_dir, src_filter=None):
env.ProcessUnFlags(env.get("BUILD_UNFLAGS"))
return env.StaticLibrary(
env.subst(variant_dir),
env.CollectBuildFiles(variant_dir, src_dir, src_filter))
env.subst(variant_dir), env.CollectBuildFiles(variant_dir, src_dir, src_filter)
)
def BuildSources(env, variant_dir, src_dir, src_filter=None):
nodes = env.CollectBuildFiles(variant_dir, src_dir, src_filter)
DefaultEnvironment().Append(
PIOBUILDFILES=[env.Object(node) for node in nodes])
PIOBUILDFILES=[
env.Object(node) if isinstance(node, FS.File) else node for node in nodes
]
)
def exists(_):
@@ -341,13 +366,16 @@ def exists(_):
def generate(env):
env.AddMethod(GetBuildType)
env.AddMethod(BuildProgram)
env.AddMethod(ProcessProgramDeps)
env.AddMethod(ProcessProjectDeps)
env.AddMethod(ParseFlagsExtended)
env.AddMethod(ProcessFlags)
env.AddMethod(ProcessUnFlags)
env.AddMethod(IsFileWithExt)
env.AddMethod(MatchSourceFiles)
env.AddMethod(CollectBuildFiles)
env.AddMethod(AddBuildMiddleware)
env.AddMethod(BuildFrameworks)
env.AddMethod(BuildLibrary)
env.AddMethod(BuildSources)
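The new AddBuildMiddleware/CollectBuildFiles pair above lets an extra script intercept source nodes before they become object files: each registered callback receives a node whose path matches the optional fnmatch pattern, and returning None drops that node from the build. A small sketch of such a middleware (the script name and pattern are illustrative):
# extra_script.py (hypothetical), referenced from `extra_scripts`
Import("env")

def skip_vendor_asm(node):
    # Returning None excludes the node from the build;
    # returning a (possibly wrapped) node keeps it.
    return None

env.AddBuildMiddleware(skip_vendor_asm, "*/vendor/*.S")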

platformio/cache.py (new file, 165 lines)
@@ -0,0 +1,165 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import codecs
import hashlib
import os
from time import time
from platformio import app, fs
from platformio.compat import hashlib_encode_data
from platformio.package.lockfile import LockFile
from platformio.project.helpers import get_project_cache_dir
class ContentCache(object):
def __init__(self, namespace=None):
self.cache_dir = os.path.join(get_project_cache_dir(), namespace or "content")
self._db_path = os.path.join(self.cache_dir, "db.data")
self._lockfile = None
if not os.path.isdir(self.cache_dir):
os.makedirs(self.cache_dir)
def __enter__(self):
# cleanup obsolete items
self.delete()
return self
def __exit__(self, type_, value, traceback):
pass
@staticmethod
def key_from_args(*args):
h = hashlib.sha1()
for arg in args:
if arg:
h.update(hashlib_encode_data(arg))
return h.hexdigest()
def get_cache_path(self, key):
assert "/" not in key and "\\" not in key
key = str(key)
assert len(key) > 3
return os.path.join(self.cache_dir, key)
def get(self, key):
cache_path = self.get_cache_path(key)
if not os.path.isfile(cache_path):
return None
with codecs.open(cache_path, "rb", encoding="utf8") as fp:
return fp.read()
def set(self, key, data, valid):
if not app.get_setting("enable_cache"):
return False
cache_path = self.get_cache_path(key)
if os.path.isfile(cache_path):
self.delete(key)
if not data:
return False
tdmap = {"s": 1, "m": 60, "h": 3600, "d": 86400}
assert valid.endswith(tuple(tdmap))
expire_time = int(time() + tdmap[valid[-1]] * int(valid[:-1]))
if not self._lock_dbindex():
return False
if not os.path.isdir(os.path.dirname(cache_path)):
os.makedirs(os.path.dirname(cache_path))
try:
with codecs.open(cache_path, "wb", encoding="utf8") as fp:
fp.write(data)
with open(self._db_path, "a") as fp:
fp.write("%s=%s\n" % (str(expire_time), os.path.basename(cache_path)))
except UnicodeError:
if os.path.isfile(cache_path):
try:
os.remove(cache_path)
except OSError:
pass
return self._unlock_dbindex()
def delete(self, keys=None):
""" Keys=None, delete expired items """
if not os.path.isfile(self._db_path):
return None
if not keys:
keys = []
if not isinstance(keys, list):
keys = [keys]
paths_for_delete = [self.get_cache_path(k) for k in keys]
found = False
newlines = []
with open(self._db_path) as fp:
for line in fp.readlines():
line = line.strip()
if "=" not in line:
continue
expire, fname = line.split("=")
path = os.path.join(self.cache_dir, fname)
try:
if (
time() < int(expire)
and os.path.isfile(path)
and path not in paths_for_delete
):
newlines.append(line)
continue
except ValueError:
pass
found = True
if os.path.isfile(path):
try:
os.remove(path)
if not os.listdir(os.path.dirname(path)):
fs.rmtree(os.path.dirname(path))
except OSError:
pass
if found and self._lock_dbindex():
with open(self._db_path, "w") as fp:
fp.write("\n".join(newlines) + "\n")
self._unlock_dbindex()
return True
def clean(self):
if not os.path.isdir(self.cache_dir):
return
fs.rmtree(self.cache_dir)
def _lock_dbindex(self):
self._lockfile = LockFile(self.cache_dir)
try:
self._lockfile.acquire()
except: # pylint: disable=bare-except
return False
return True
def _unlock_dbindex(self):
if self._lockfile:
self._lockfile.release()
return True
#
# Helpers
#
def cleanup_content_cache(namespace=None):
with ContentCache(namespace) as cc:
cc.clean()
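A short usage sketch of the new ContentCache: keys are derived from arbitrary arguments, entries are plain UTF-8 text, and the `valid` argument uses the s/m/h/d suffixes handled in set(). The cached payload below is a placeholder:
from platformio.cache import ContentCache

with ContentCache("http") as cc:          # same namespace the HTTP client uses
    key = cc.key_from_args("get", "/v3/packages", "query=json")
    cached = cc.get(key)
    if cached is None:
        cc.set(key, '{"items": []}', valid="1h")  # expire after one hour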

@@ -0,0 +1,13 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

@@ -0,0 +1,326 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
import time
from platformio import __accounts_api__, app
from platformio.clients.http import HTTPClient
from platformio.exception import PlatformioException
class AccountError(PlatformioException):
MESSAGE = "{0}"
class AccountNotAuthorized(AccountError):
MESSAGE = "You are not authorized! Please log in to PlatformIO Account."
class AccountAlreadyAuthorized(AccountError):
MESSAGE = "You are already authorized with {0} account."
class AccountClient(HTTPClient): # pylint:disable=too-many-public-methods
SUMMARY_CACHE_TTL = 60 * 60 * 24 * 7
def __init__(self):
super(AccountClient, self).__init__(__accounts_api__)
@staticmethod
def get_refresh_token():
try:
return app.get_state_item("account").get("auth").get("refresh_token")
except: # pylint:disable=bare-except
raise AccountNotAuthorized()
@staticmethod
def delete_local_session():
app.delete_state_item("account")
@staticmethod
def delete_local_state(key):
account = app.get_state_item("account")
if not account or key not in account:
return
del account[key]
app.set_state_item("account", account)
def send_auth_request(self, *args, **kwargs):
headers = kwargs.get("headers", {})
if "Authorization" not in headers:
token = self.fetch_authentication_token()
headers["Authorization"] = "Bearer %s" % token
kwargs["headers"] = headers
return self.fetch_json_data(*args, **kwargs)
def login(self, username, password):
try:
self.fetch_authentication_token()
except: # pylint:disable=bare-except
pass
else:
raise AccountAlreadyAuthorized(
app.get_state_item("account", {}).get("email", "")
)
data = self.fetch_json_data(
"post",
"/v1/login",
data={"username": username, "password": password},
)
app.set_state_item("account", data)
return data
def login_with_code(self, client_id, code, redirect_uri):
try:
self.fetch_authentication_token()
except: # pylint:disable=bare-except
pass
else:
raise AccountAlreadyAuthorized(
app.get_state_item("account", {}).get("email", "")
)
result = self.fetch_json_data(
"post",
"/v1/login/code",
data={"client_id": client_id, "code": code, "redirect_uri": redirect_uri},
)
app.set_state_item("account", result)
return result
def logout(self):
refresh_token = self.get_refresh_token()
self.delete_local_session()
try:
self.fetch_json_data(
"post",
"/v1/logout",
data={"refresh_token": refresh_token},
)
except AccountError:
pass
return True
def change_password(self, old_password, new_password):
return self.send_auth_request(
"post",
"/v1/password",
data={"old_password": old_password, "new_password": new_password},
)
def registration(
self, username, email, password, firstname, lastname
): # pylint:disable=too-many-arguments
try:
self.fetch_authentication_token()
except: # pylint:disable=bare-except
pass
else:
raise AccountAlreadyAuthorized(
app.get_state_item("account", {}).get("email", "")
)
return self.fetch_json_data(
"post",
"/v1/registration",
data={
"username": username,
"email": email,
"password": password,
"firstname": firstname,
"lastname": lastname,
},
)
def auth_token(self, password, regenerate):
return self.send_auth_request(
"post",
"/v1/token",
data={"password": password, "regenerate": 1 if regenerate else 0},
).get("auth_token")
def forgot_password(self, username):
return self.fetch_json_data(
"post",
"/v1/forgot",
data={"username": username},
)
def get_profile(self):
return self.send_auth_request(
"get",
"/v1/profile",
)
def update_profile(self, profile, current_password):
profile["current_password"] = current_password
self.delete_local_state("summary")
response = self.send_auth_request(
"put",
"/v1/profile",
data=profile,
)
return response
def get_account_info(self, offline=False):
account = app.get_state_item("account") or {}
if (
account.get("summary")
and account["summary"].get("expire_at", 0) > time.time()
):
return account["summary"]
if offline and account.get("email"):
return {
"profile": {
"email": account.get("email"),
"username": account.get("username"),
}
}
result = self.send_auth_request(
"get",
"/v1/summary",
)
account["summary"] = dict(
profile=result.get("profile"),
packages=result.get("packages"),
subscriptions=result.get("subscriptions"),
user_id=result.get("user_id"),
expire_at=int(time.time()) + self.SUMMARY_CACHE_TTL,
)
app.set_state_item("account", account)
return result
def destroy_account(self):
return self.send_auth_request("delete", "/v1/account")
def create_org(self, orgname, email, displayname):
return self.send_auth_request(
"post",
"/v1/orgs",
data={"orgname": orgname, "email": email, "displayname": displayname},
)
def get_org(self, orgname):
return self.send_auth_request("get", "/v1/orgs/%s" % orgname)
def list_orgs(self):
return self.send_auth_request(
"get",
"/v1/orgs",
)
def update_org(self, orgname, data):
return self.send_auth_request(
"put", "/v1/orgs/%s" % orgname, data={k: v for k, v in data.items() if v}
)
def destroy_org(self, orgname):
return self.send_auth_request(
"delete",
"/v1/orgs/%s" % orgname,
)
def add_org_owner(self, orgname, username):
return self.send_auth_request(
"post",
"/v1/orgs/%s/owners" % orgname,
data={"username": username},
)
def list_org_owners(self, orgname):
return self.send_auth_request(
"get",
"/v1/orgs/%s/owners" % orgname,
)
def remove_org_owner(self, orgname, username):
return self.send_auth_request(
"delete",
"/v1/orgs/%s/owners" % orgname,
data={"username": username},
)
def create_team(self, orgname, teamname, description):
return self.send_auth_request(
"post",
"/v1/orgs/%s/teams" % orgname,
data={"name": teamname, "description": description},
)
def destroy_team(self, orgname, teamname):
return self.send_auth_request(
"delete",
"/v1/orgs/%s/teams/%s" % (orgname, teamname),
)
def get_team(self, orgname, teamname):
return self.send_auth_request(
"get",
"/v1/orgs/%s/teams/%s" % (orgname, teamname),
)
def list_teams(self, orgname):
return self.send_auth_request(
"get",
"/v1/orgs/%s/teams" % orgname,
)
def update_team(self, orgname, teamname, data):
return self.send_auth_request(
"put",
"/v1/orgs/%s/teams/%s" % (orgname, teamname),
data={k: v for k, v in data.items() if v},
)
def add_team_member(self, orgname, teamname, username):
return self.send_auth_request(
"post",
"/v1/orgs/%s/teams/%s/members" % (orgname, teamname),
data={"username": username},
)
def remove_team_member(self, orgname, teamname, username):
return self.send_auth_request(
"delete",
"/v1/orgs/%s/teams/%s/members" % (orgname, teamname),
data={"username": username},
)
def fetch_authentication_token(self):
if os.environ.get("PLATFORMIO_AUTH_TOKEN"):
return os.environ.get("PLATFORMIO_AUTH_TOKEN")
auth = app.get_state_item("account", {}).get("auth", {})
if auth.get("access_token") and auth.get("access_token_expire"):
if auth.get("access_token_expire") > time.time():
return auth.get("access_token")
if auth.get("refresh_token"):
try:
data = self.fetch_json_data(
"post",
"/v1/login",
headers={
"Authorization": "Bearer %s" % auth.get("refresh_token")
},
)
app.set_state_item("account", data)
return data.get("auth").get("access_token")
except AccountError:
self.delete_local_session()
raise AccountNotAuthorized()
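A hedged sketch of driving AccountClient directly, as the `pio account` commands further below do. The credentials are placeholders, and a network connection plus a registered account are assumed:
from platformio.clients.account import AccountClient, AccountError

client = AccountClient()
try:
    client.login("jane-doe", "S3cretPass")        # placeholder credentials
    info = client.get_account_info(offline=True)  # cached/offline summary
    print(info["profile"]["username"])
except AccountError as exc:
    print("Account operation failed: %s" % exc)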

platformio/clients/http.py (new file, 205 lines)
@@ -0,0 +1,205 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import json
import math
import os
import socket
import requests.adapters
from requests.packages.urllib3.util.retry import Retry # pylint:disable=import-error
from platformio import __check_internet_hosts__, __default_requests_timeout__, app, util
from platformio.cache import ContentCache
from platformio.exception import PlatformioException, UserSideException
try:
from urllib.parse import urljoin
except ImportError:
from urlparse import urljoin
class HTTPClientError(PlatformioException):
def __init__(self, message, response=None):
super(HTTPClientError, self).__init__()
self.message = message
self.response = response
def __str__(self): # pragma: no cover
return self.message
class InternetIsOffline(UserSideException):
MESSAGE = (
"You are not connected to the Internet.\n"
"PlatformIO needs the Internet connection to"
" download dependent packages or to work with PlatformIO Account."
)
class EndpointSession(requests.Session):
def __init__(self, base_url, *args, **kwargs):
super(EndpointSession, self).__init__(*args, **kwargs)
self.base_url = base_url
def request( # pylint: disable=signature-differs,arguments-differ
self, method, url, *args, **kwargs
):
# print(self.base_url, method, url, args, kwargs)
return super(EndpointSession, self).request(
method, urljoin(self.base_url, url), *args, **kwargs
)
class EndpointSessionIterator(object):
def __init__(self, endpoints):
if not isinstance(endpoints, list):
endpoints = [endpoints]
self.endpoints = endpoints
self.endpoints_iter = iter(endpoints)
self.retry = Retry(
total=math.ceil(6 / len(self.endpoints)),
backoff_factor=1,
# method_whitelist=list(Retry.DEFAULT_METHOD_WHITELIST) + ["POST"],
status_forcelist=[413, 429, 500, 502, 503, 504],
)
def __iter__(self): # pylint: disable=non-iterator-returned
return self
def next(self):
""" For Python 2 compatibility """
return self.__next__()
def __next__(self):
base_url = next(self.endpoints_iter)
session = EndpointSession(base_url)
session.headers.update({"User-Agent": app.get_user_agent()})
adapter = requests.adapters.HTTPAdapter(max_retries=self.retry)
session.mount(base_url, adapter)
return session
class HTTPClient(object):
def __init__(self, endpoints):
self._session_iter = EndpointSessionIterator(endpoints)
self._session = None
self._next_session()
def __del__(self):
if not self._session:
return
self._session.close()
self._session = None
def _next_session(self):
if self._session:
self._session.close()
self._session = next(self._session_iter)
@util.throttle(500)
def send_request(self, method, path, **kwargs):
# check Internet before and resolve issue with 60 seconds timeout
ensure_internet_on(raise_exception=True)
# set default timeout
if "timeout" not in kwargs:
kwargs["timeout"] = __default_requests_timeout__
while True:
try:
return getattr(self._session, method)(path, **kwargs)
except (
requests.exceptions.ConnectionError,
requests.exceptions.Timeout,
) as e:
try:
self._next_session()
except: # pylint: disable=bare-except
raise HTTPClientError(str(e))
def fetch_json_data(self, method, path, **kwargs):
cache_valid = kwargs.pop("cache_valid") if "cache_valid" in kwargs else None
if not cache_valid:
return self._parse_json_response(self.send_request(method, path, **kwargs))
cache_key = ContentCache.key_from_args(
method, path, kwargs.get("params"), kwargs.get("data")
)
with ContentCache("http") as cc:
result = cc.get(cache_key)
if result is not None:
return json.loads(result)
response = self.send_request(method, path, **kwargs)
data = self._parse_json_response(response)
cc.set(cache_key, response.text, cache_valid)
return data
@staticmethod
def _parse_json_response(response, expected_codes=(200, 201, 202)):
if response.status_code in expected_codes:
try:
return response.json()
except ValueError:
pass
try:
message = response.json()["message"]
except (KeyError, ValueError):
message = response.text
raise HTTPClientError(message, response)
#
# Helpers
#
@util.memoized(expire="10s")
def _internet_on():
timeout = 2
socket.setdefaulttimeout(timeout)
for host in __check_internet_hosts__:
try:
for var in ("HTTP_PROXY", "HTTPS_PROXY"):
if not os.getenv(var) and not os.getenv(var.lower()):
continue
requests.get("http://%s" % host, allow_redirects=False, timeout=timeout)
return True
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.connect((host, 80))
s.close()
return True
except: # pylint: disable=bare-except
pass
return False
def ensure_internet_on(raise_exception=False):
result = _internet_on()
if raise_exception and not result:
raise InternetIsOffline()
return result
def fetch_remote_content(*args, **kwargs):
kwargs["headers"] = kwargs.get("headers", {})
if "User-Agent" not in kwargs["headers"]:
kwargs["headers"]["User-Agent"] = app.get_user_agent()
if "timeout" not in kwargs:
kwargs["timeout"] = __default_requests_timeout__
r = requests.get(*args, **kwargs)
r.raise_for_status()
return r.text
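A minimal sketch of the HTTPClient base class defined above. Real subclasses pass `__accounts_api__` or `__registry_api__` as the endpoint list, so the URL here is only a placeholder:
from platformio.clients.http import HTTPClient, HTTPClientError, ensure_internet_on

ensure_internet_on(raise_exception=False)       # cheap connectivity probe

client = HTTPClient("https://api.example.org")  # placeholder endpoint
try:
    data = client.fetch_json_data(
        "get", "/v3/packages", params={"query": "json"}, cache_valid="1h"
    )
except HTTPClientError as exc:
    data = None
    print("Request failed: %s" % exc)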

@@ -0,0 +1,147 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from platformio import __registry_api__, fs
from platformio.clients.account import AccountClient
from platformio.clients.http import HTTPClient, HTTPClientError
from platformio.package.meta import PackageType
# pylint: disable=too-many-arguments
class RegistryClient(HTTPClient):
def __init__(self):
super(RegistryClient, self).__init__(__registry_api__)
def send_auth_request(self, *args, **kwargs):
headers = kwargs.get("headers", {})
if "Authorization" not in headers:
token = AccountClient().fetch_authentication_token()
headers["Authorization"] = "Bearer %s" % token
kwargs["headers"] = headers
return self.fetch_json_data(*args, **kwargs)
def publish_package(
self, archive_path, owner=None, released_at=None, private=False, notify=True
):
account = AccountClient()
if not owner:
owner = (
account.get_account_info(offline=True).get("profile").get("username")
)
with open(archive_path, "rb") as fp:
return self.send_auth_request(
"post",
"/v3/packages/%s/%s" % (owner, PackageType.from_archive(archive_path)),
params={
"private": 1 if private else 0,
"notify": 1 if notify else 0,
"released_at": released_at,
},
headers={
"Content-Type": "application/octet-stream",
"X-PIO-Content-SHA256": fs.calculate_file_hashsum(
"sha256", archive_path
),
},
data=fp,
)
def unpublish_package( # pylint: disable=redefined-builtin
self, type, name, owner=None, version=None, undo=False
):
account = AccountClient()
if not owner:
owner = (
account.get_account_info(offline=True).get("profile").get("username")
)
path = "/v3/packages/%s/%s/%s" % (owner, type, name)
if version:
path += "/" + version
return self.send_auth_request(
"delete",
path,
params={"undo": 1 if undo else 0},
)
def update_resource(self, urn, private):
return self.send_auth_request(
"put",
"/v3/resources/%s" % urn,
data={"private": int(private)},
)
def grant_access_for_resource(self, urn, client, level):
return self.send_auth_request(
"put",
"/v3/resources/%s/access" % urn,
data={"client": client, "level": level},
)
def revoke_access_from_resource(self, urn, client):
return self.send_auth_request(
"delete",
"/v3/resources/%s/access" % urn,
data={"client": client},
)
def list_resources(self, owner):
return self.send_auth_request(
"get", "/v3/resources", params={"owner": owner} if owner else None
)
def list_packages(self, query=None, filters=None, page=None):
assert query or filters
search_query = []
if filters:
valid_filters = (
"authors",
"keywords",
"frameworks",
"platforms",
"headers",
"ids",
"names",
"owners",
"types",
)
assert set(filters.keys()) <= set(valid_filters)
for name, values in filters.items():
for value in set(
values if isinstance(values, (list, tuple)) else [values]
):
search_query.append('%s:"%s"' % (name[:-1], value))
if query:
search_query.append(query)
params = dict(query=" ".join(search_query))
if page:
params["page"] = int(page)
return self.fetch_json_data(
"get", "/v3/packages", params=params, cache_valid="1h"
)
def get_package(self, type_, owner, name, version=None):
try:
return self.fetch_json_data(
"get",
"/v3/packages/{owner}/{type}/{name}".format(
type=type_, owner=owner.lower(), name=name.lower()
),
params=dict(version=version) if version else None,
cache_valid="1h",
)
except HTTPClientError as e:
if e.response is not None and e.response.status_code == 404:
return None
raise e
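A sketch of querying the registry through the client above. The filter keys come straight from valid_filters in list_packages, while the owner/name passed to get_package are purely illustrative:
from platformio.clients.registry import RegistryClient

client = RegistryClient()
result = client.list_packages(
    query="json",
    filters={"types": ["library"], "platforms": ["espressif32"]},
)
package = client.get_package("library", "example-owner", "ExampleLib")  # illustrative values
print(result, package)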

@@ -11,3 +11,70 @@
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
import click
class PlatformioCLI(click.MultiCommand):
leftover_args = []
def __init__(self, *args, **kwargs):
super(PlatformioCLI, self).__init__(*args, **kwargs)
self._pio_cmds_dir = os.path.dirname(__file__)
@staticmethod
def in_silence():
args = PlatformioCLI.leftover_args
return args and any(
[
args[0] == "debug" and "--interpreter" in " ".join(args),
args[0] == "upgrade",
"--json-output" in args,
"--version" in args,
]
)
def invoke(self, ctx):
PlatformioCLI.leftover_args = ctx.args
if hasattr(ctx, "protected_args"):
PlatformioCLI.leftover_args = ctx.protected_args + ctx.args
return super(PlatformioCLI, self).invoke(ctx)
def list_commands(self, ctx):
cmds = []
for cmd_name in os.listdir(self._pio_cmds_dir):
if cmd_name.startswith("__init__"):
continue
if os.path.isfile(os.path.join(self._pio_cmds_dir, cmd_name, "command.py")):
cmds.append(cmd_name)
elif cmd_name.endswith(".py"):
cmds.append(cmd_name[:-3])
cmds.sort()
return cmds
def get_command(self, ctx, cmd_name):
mod = None
try:
mod_path = "platformio.commands." + cmd_name
if os.path.isfile(os.path.join(self._pio_cmds_dir, cmd_name, "command.py")):
mod_path = "platformio.commands.%s.command" % cmd_name
mod = __import__(mod_path, None, None, ["cli"])
except ImportError:
try:
return self._handle_obsolate_command(cmd_name)
except AttributeError:
pass
raise click.UsageError('No such command "%s"' % cmd_name, ctx)
return mod.cli
@staticmethod
def _handle_obsolate_command(name):
# pylint: disable=import-outside-toplevel
if name == "init":
from platformio.commands.project import project_init
return project_init
raise AttributeError()
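list_commands/get_command above discover commands purely by file layout: any platformio/commands/<name>.py exposing a `cli` object (or a <name>/command.py module) is registered automatically. A hypothetical single-file command that would be picked up:
# Hypothetical platformio/commands/hello.py
import click

@click.command("hello", short_help="Print a greeting")
@click.argument("name", required=False)
def cli(name):
    click.echo("Hello, %s!" % (name or "world"))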

@@ -0,0 +1,146 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# pylint: disable=unused-argument
import json
import re
import click
from tabulate import tabulate
from platformio.clients.registry import RegistryClient
from platformio.commands.account import validate_username
from platformio.commands.team import validate_orgname_teamname
def validate_client(value):
if ":" in value:
validate_orgname_teamname(value)
else:
validate_username(value)
return value
@click.group("access", short_help="Manage resource access")
def cli():
pass
def validate_urn(value):
value = str(value).strip()
if not re.match(r"^prn:reg:pkg:(\d+):(\w+)$", value, flags=re.I):
raise click.BadParameter("Invalid URN format.")
return value
@cli.command("public", short_help="Make resource public")
@click.argument(
"urn",
callback=lambda _, __, value: validate_urn(value),
)
@click.option("--urn-type", type=click.Choice(["prn:reg:pkg"]), default="prn:reg:pkg")
def access_public(urn, urn_type):
client = RegistryClient()
client.update_resource(urn=urn, private=0)
return click.secho(
"The resource %s has been successfully updated." % urn,
fg="green",
)
@cli.command("private", short_help="Make resource private")
@click.argument(
"urn",
callback=lambda _, __, value: validate_urn(value),
)
@click.option("--urn-type", type=click.Choice(["prn:reg:pkg"]), default="prn:reg:pkg")
def access_private(urn, urn_type):
client = RegistryClient()
client.update_resource(urn=urn, private=1)
return click.secho(
"The resource %s has been successfully updated." % urn,
fg="green",
)
@cli.command("grant", short_help="Grant access")
@click.argument("level", type=click.Choice(["admin", "maintainer", "guest"]))
@click.argument(
"client",
metavar="[<ORGNAME:TEAMNAME>|<USERNAME>]",
callback=lambda _, __, value: validate_client(value),
)
@click.argument(
"urn",
callback=lambda _, __, value: validate_urn(value),
)
@click.option("--urn-type", type=click.Choice(["prn:reg:pkg"]), default="prn:reg:pkg")
def access_grant(level, client, urn, urn_type):
reg_client = RegistryClient()
reg_client.grant_access_for_resource(urn=urn, client=client, level=level)
return click.secho(
"Access for resource %s has been granted for %s" % (urn, client),
fg="green",
)
@cli.command("revoke", short_help="Revoke access")
@click.argument(
"client",
metavar="[ORGNAME:TEAMNAME|USERNAME]",
callback=lambda _, __, value: validate_client(value),
)
@click.argument(
"urn",
callback=lambda _, __, value: validate_urn(value),
)
@click.option("--urn-type", type=click.Choice(["prn:reg:pkg"]), default="prn:reg:pkg")
def access_revoke(client, urn, urn_type):
reg_client = RegistryClient()
reg_client.revoke_access_from_resource(urn=urn, client=client)
return click.secho(
"Access for resource %s has been revoked for %s" % (urn, client),
fg="green",
)
@cli.command("list", short_help="List published resources")
@click.argument("owner", required=False)
@click.option("--urn-type", type=click.Choice(["prn:reg:pkg"]), default="prn:reg:pkg")
@click.option("--json-output", is_flag=True)
def access_list(owner, urn_type, json_output):
reg_client = RegistryClient()
resources = reg_client.list_resources(owner=owner)
if json_output:
return click.echo(json.dumps(resources))
if not resources:
return click.secho("You do not have any resources.", fg="yellow")
for resource in resources:
click.echo()
click.secho(resource.get("name"), fg="cyan")
click.echo("-" * len(resource.get("name")))
table_data = []
table_data.append(("URN:", resource.get("urn")))
table_data.append(("Owner:", resource.get("owner")))
table_data.append(
(
"Access level(s):",
", ".join(
(level.capitalize() for level in resource.get("access_levels"))
),
)
)
click.echo(tabulate(table_data, tablefmt="plain"))
return click.echo()
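The `pio access` commands above can also be exercised programmatically with Click's test runner. The module path and URN below are assumptions (the file path is not shown in this diff), and an authenticated PlatformIO account is required for the call to succeed:
from click.testing import CliRunner
from platformio.commands.access import cli as access_cli  # assumed module path

runner = CliRunner()
result = runner.invoke(
    access_cli,
    ["grant", "guest", "some-user", "prn:reg:pkg:4284:examplelib"],  # placeholder client and URN
)
print(result.exit_code, result.output)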

@@ -14,59 +14,286 @@
# pylint: disable=unused-argument
import sys
import datetime
import json
import re
import click
from tabulate import tabulate
from platformio.managers.core import pioplus_call
from platformio.clients.account import AccountClient, AccountNotAuthorized
@click.group("account", short_help="Manage PIO Account")
@click.group("account", short_help="Manage PlatformIO account")
def cli():
pass
@cli.command("register", short_help="Create new PIO Account")
@click.option("-u", "--username")
def account_register(**kwargs):
pioplus_call(sys.argv[1:])
def validate_username(value, field="username"):
value = str(value).strip()
if not re.match(r"^[a-z\d](?:[a-z\d]|-(?=[a-z\d])){0,37}$", value, flags=re.I):
raise click.BadParameter(
"Invalid %s format. "
"%s must contain only alphanumeric characters "
"or single hyphens, cannot begin or end with a hyphen, "
"and must not be longer than 38 characters."
% (field.lower(), field.capitalize())
)
return value
@cli.command("login", short_help="Log in to PIO Account")
@click.option("-u", "--username")
@click.option("-p", "--password")
def account_login(**kwargs):
pioplus_call(sys.argv[1:])
def validate_email(value):
value = str(value).strip()
if not re.match(r"^[a-z\d_.+-]+@[a-z\d\-]+\.[a-z\d\-.]+$", value, flags=re.I):
raise click.BadParameter("Invalid email address")
return value
@cli.command("logout", short_help="Log out of PIO Account")
def validate_password(value):
value = str(value).strip()
if not re.match(r"^(?=.*[a-z])(?=.*\d).{8,}$", value):
raise click.BadParameter(
"Invalid password format. "
"Password must contain at least 8 characters"
" including a number and a lowercase letter"
)
return value
@cli.command("register", short_help="Create new PlatformIO Account")
@click.option(
"-u",
"--username",
prompt=True,
callback=lambda _, __, value: validate_username(value),
)
@click.option(
"-e", "--email", prompt=True, callback=lambda _, __, value: validate_email(value)
)
@click.option(
"-p",
"--password",
prompt=True,
hide_input=True,
confirmation_prompt=True,
callback=lambda _, __, value: validate_password(value),
)
@click.option("--firstname", prompt=True)
@click.option("--lastname", prompt=True)
def account_register(username, email, password, firstname, lastname):
client = AccountClient()
client.registration(username, email, password, firstname, lastname)
return click.secho(
"An account has been successfully created. "
"Please check your mail to activate your account and verify your email address.",
fg="green",
)
@cli.command("login", short_help="Log in to PlatformIO Account")
@click.option("-u", "--username", prompt="Username or email")
@click.option("-p", "--password", prompt=True, hide_input=True)
def account_login(username, password):
client = AccountClient()
client.login(username, password)
return click.secho("Successfully logged in!", fg="green")
@cli.command("logout", short_help="Log out of PlatformIO Account")
def account_logout():
pioplus_call(sys.argv[1:])
client = AccountClient()
client.logout()
return click.secho("Successfully logged out!", fg="green")
@cli.command("password", short_help="Change password")
@click.option("--old-password")
@click.option("--new-password")
def account_password(**kwargs):
pioplus_call(sys.argv[1:])
@click.option("--old-password", prompt=True, hide_input=True)
@click.option("--new-password", prompt=True, hide_input=True, confirmation_prompt=True)
def account_password(old_password, new_password):
client = AccountClient()
client.change_password(old_password, new_password)
return click.secho("Password successfully changed!", fg="green")
@cli.command("token", short_help="Get or regenerate Authentication Token")
@click.option("-p", "--password")
@click.option("-p", "--password", prompt=True, hide_input=True)
@click.option("--regenerate", is_flag=True)
@click.option("--json-output", is_flag=True)
def account_token(**kwargs):
pioplus_call(sys.argv[1:])
def account_token(password, regenerate, json_output):
client = AccountClient()
auth_token = client.auth_token(password, regenerate)
if json_output:
return click.echo(json.dumps({"status": "success", "result": auth_token}))
return click.secho("Personal Authentication Token: %s" % auth_token, fg="green")
@cli.command("forgot", short_help="Forgot password")
@click.option("-u", "--username")
def account_forgot(**kwargs):
pioplus_call(sys.argv[1:])
@click.option("--username", prompt="Username or email")
def account_forgot(username):
client = AccountClient()
client.forgot_password(username)
return click.secho(
"If this account is registered, we will send the "
"further instructions to your email.",
fg="green",
)
@cli.command("show", short_help="PIO Account information")
@cli.command("update", short_help="Update profile information")
@click.option("--current-password", prompt=True, hide_input=True)
@click.option("--username")
@click.option("--email")
@click.option("--firstname")
@click.option("--lastname")
def account_update(current_password, **kwargs):
client = AccountClient()
profile = client.get_profile()
new_profile = profile.copy()
if not any(kwargs.values()):
for field in profile:
new_profile[field] = click.prompt(
field.replace("_", " ").capitalize(), default=profile[field]
)
if field == "email":
validate_email(new_profile[field])
if field == "username":
validate_username(new_profile[field])
else:
new_profile.update({key: value for key, value in kwargs.items() if value})
client.update_profile(new_profile, current_password)
click.secho("Profile successfully updated!", fg="green")
username_changed = new_profile["username"] != profile["username"]
email_changed = new_profile["email"] != profile["email"]
if not username_changed and not email_changed:
return None
try:
client.logout()
except AccountNotAuthorized:
pass
if email_changed:
return click.secho(
"Please check your mail to verify your new email address and re-login. ",
fg="yellow",
)
return click.secho("Please re-login.", fg="yellow")
@cli.command("destroy", short_help="Destroy account")
def account_destroy():
client = AccountClient()
click.confirm(
"Are you sure you want to delete the %s user account?\n"
"Warning! All linked data will be permanently removed and can not be restored."
% client.get_account_info().get("profile").get("username"),
abort=True,
)
client.destroy_account()
try:
client.logout()
except AccountNotAuthorized:
pass
return click.secho(
"User account has been destroyed.",
fg="green",
)
@cli.command("show", short_help="PlatformIO Account information")
@click.option("--offline", is_flag=True)
@click.option("--json-output", is_flag=True)
def account_show(**kwargs):
pioplus_call(sys.argv[1:])
def account_show(offline, json_output):
client = AccountClient()
info = client.get_account_info(offline)
if json_output:
return click.echo(json.dumps(info))
click.echo()
if info.get("profile"):
print_profile(info["profile"])
if info.get("packages"):
print_packages(info["packages"])
if info.get("subscriptions"):
print_subscriptions(info["subscriptions"])
return click.echo()
def print_profile(profile):
click.secho("Profile", fg="cyan", bold=True)
click.echo("=" * len("Profile"))
data = []
if profile.get("username"):
data.append(("Username:", profile["username"]))
if profile.get("email"):
data.append(("Email:", profile["email"]))
if profile.get("firstname"):
data.append(("First name:", profile["firstname"]))
if profile.get("lastname"):
data.append(("Last name:", profile["lastname"]))
click.echo(tabulate(data, tablefmt="plain"))
def print_packages(packages):
click.echo()
click.secho("Packages", fg="cyan")
click.echo("=" * len("Packages"))
for package in packages:
click.echo()
click.secho(package.get("name"), bold=True)
click.echo("-" * len(package.get("name")))
if package.get("description"):
click.echo(package.get("description"))
data = []
expire = "-"
if "subscription" in package:
expire = datetime.datetime.strptime(
(
package["subscription"].get("end_at")
or package["subscription"].get("next_bill_at")
),
"%Y-%m-%dT%H:%M:%SZ",
).strftime("%Y-%m-%d")
data.append(("Expire:", expire))
services = []
for key in package:
if not key.startswith("service."):
continue
if isinstance(package[key], dict):
services.append(package[key].get("title"))
else:
services.append(package[key])
if services:
data.append(("Services:", ", ".join(services)))
click.echo(tabulate(data, tablefmt="plain"))
def print_subscriptions(subscriptions):
click.echo()
click.secho("Subscriptions", fg="cyan")
click.echo("=" * len("Subscriptions"))
for subscription in subscriptions:
click.echo()
click.secho(subscription.get("product_name"), bold=True)
click.echo("-" * len(subscription.get("product_name")))
data = [("State:", subscription.get("status"))]
begin_at = datetime.datetime.strptime(
subscription.get("begin_at"), "%Y-%m-%dT%H:%M:%SZ"
).strftime("%Y-%m-%d %H:%M:%S")
data.append(("Start date:", begin_at or "-"))
end_at = subscription.get("end_at")
if end_at:
end_at = datetime.datetime.strptime(
subscription.get("end_at"), "%Y-%m-%dT%H:%M:%SZ"
).strftime("%Y-%m-%d %H:%M:%S")
data.append(("End date:", end_at or "-"))
next_bill_at = subscription.get("next_bill_at")
if next_bill_at:
next_bill_at = datetime.datetime.strptime(
subscription.get("next_bill_at"), "%Y-%m-%dT%H:%M:%SZ"
).strftime("%Y-%m-%d %H:%M:%S")
data.append(("Next payment:", next_bill_at or "-"))
data.append(
("Edit:", click.style(subscription.get("update_url"), fg="blue") or "-")
)
data.append(
("Cancel:", click.style(subscription.get("cancel_url"), fg="blue") or "-")
)
click.echo(tabulate(data, tablefmt="plain"))


@@ -15,12 +15,14 @@
import json
import click
from tabulate import tabulate
from platformio import util
from platformio.managers.platform import PlatformManager
from platformio import fs
from platformio.compat import dump_json_to_unicode
from platformio.package.manager.platform import PlatformPackageManager
@click.command("boards", short_help="Embedded Board Explorer")
@click.command("boards", short_help="Embedded board explorer")
@click.argument("query", required=False)
@click.option("--installed", is_flag=True)
@click.option("--json-output", is_flag=True)
@@ -30,49 +32,46 @@ def cli(query, installed, json_output): # pylint: disable=R0912
grpboards = {}
for board in _get_boards(installed):
if query and query.lower() not in json.dumps(board).lower():
if query and not any(
query.lower() in str(board.get(k, "")).lower()
for k in ("id", "name", "mcu", "vendor", "platform", "frameworks")
):
continue
if board['platform'] not in grpboards:
grpboards[board['platform']] = []
grpboards[board['platform']].append(board)
if board["platform"] not in grpboards:
grpboards[board["platform"]] = []
grpboards[board["platform"]].append(board)
terminal_width, _ = click.get_terminal_size()
for (platform, boards) in sorted(grpboards.items()):
click.echo("")
click.echo("Platform: ", nl=False)
click.secho(platform, bold=True)
click.echo("-" * terminal_width)
click.echo("=" * terminal_width)
print_boards(boards)
return True
def print_boards(boards):
terminal_width, _ = click.get_terminal_size()
BOARDLIST_TPL = ("{type:<30} {mcu:<14} {frequency:<8} "
" {flash:<7} {ram:<6} {name}")
click.echo(
BOARDLIST_TPL.format(
type=click.style("ID", fg="cyan"),
mcu="MCU",
frequency="Frequency",
flash="Flash",
ram="RAM",
name="Name"))
click.echo("-" * terminal_width)
for board in boards:
click.echo(
BOARDLIST_TPL.format(
type=click.style(board['id'], fg="cyan"),
mcu=board['mcu'],
frequency="%dMHz" % (board['fcpu'] / 1000000),
flash=util.format_filesize(board['rom']),
ram=util.format_filesize(board['ram']),
name=board['name']))
tabulate(
[
(
click.style(b["id"], fg="cyan"),
b["mcu"],
"%dMHz" % (b["fcpu"] / 1000000),
fs.humanize_file_size(b["rom"]),
fs.humanize_file_size(b["ram"]),
b["name"],
)
for b in boards
],
headers=["ID", "MCU", "Frequency", "Flash", "RAM", "Name"],
)
)
def _get_boards(installed=False):
pm = PlatformManager()
pm = PlatformPackageManager()
return pm.get_installed_boards() if installed else pm.get_all_boards()
@@ -80,8 +79,8 @@ def _print_boards_json(query, installed=False):
result = []
for board in _get_boards(installed):
if query:
search_data = "%s %s" % (board['id'], json.dumps(board).lower())
search_data = "%s %s" % (board["id"], json.dumps(board).lower())
if query.lower() not in search_data.lower():
continue
result.append(board)
click.echo(json.dumps(result))
click.echo(dump_json_to_unicode(result))
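
The rewritten print_boards above hands row layout entirely to tabulate instead of the old hand-rolled template string. A minimal sketch of that call shape, using one made-up board record instead of real registry data and raw byte counts instead of fs.humanize_file_size:

from tabulate import tabulate

# one fabricated board entry, only to show the row/header layout
sample_boards = [
    {"id": "uno", "mcu": "ATMEGA328P", "fcpu": 16000000, "rom": 32256, "ram": 2048, "name": "Arduino Uno"}
]
print(
    tabulate(
        [
            (b["id"], b["mcu"], "%dMHz" % (b["fcpu"] / 1000000), b["rom"], b["ram"], b["name"])
            for b in sample_boards
        ],
        headers=["ID", "MCU", "Frequency", "Flash", "RAM", "Name"],
    )
)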


@@ -0,0 +1,13 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.


@@ -0,0 +1,319 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# pylint: disable=too-many-arguments,too-many-locals,too-many-branches
# pylint: disable=redefined-builtin,too-many-statements
import os
from collections import Counter
from os.path import dirname, isfile
from time import time
import click
from tabulate import tabulate
from platformio import app, exception, fs, util
from platformio.commands.check.defect import DefectItem
from platformio.commands.check.tools import CheckToolFactory
from platformio.compat import dump_json_to_unicode
from platformio.project.config import ProjectConfig
from platformio.project.helpers import find_project_dir_above, get_project_dir
@click.command("check", short_help="Static code analysis")
@click.option("-e", "--environment", multiple=True)
@click.option(
"-d",
"--project-dir",
default=os.getcwd,
type=click.Path(
exists=True, file_okay=True, dir_okay=True, writable=True, resolve_path=True
),
)
@click.option(
"-c",
"--project-conf",
type=click.Path(
exists=True, file_okay=True, dir_okay=False, readable=True, resolve_path=True
),
)
@click.option("--pattern", multiple=True)
@click.option("--flags", multiple=True)
@click.option(
"--severity", multiple=True, type=click.Choice(DefectItem.SEVERITY_LABELS.values())
)
@click.option("-s", "--silent", is_flag=True)
@click.option("-v", "--verbose", is_flag=True)
@click.option("--json-output", is_flag=True)
@click.option(
"--fail-on-defect",
multiple=True,
type=click.Choice(DefectItem.SEVERITY_LABELS.values()),
)
@click.option("--skip-packages", is_flag=True)
def cli(
environment,
project_dir,
project_conf,
pattern,
flags,
severity,
silent,
verbose,
json_output,
fail_on_defect,
skip_packages,
):
app.set_session_var("custom_project_conf", project_conf)
# when a file is passed, look for the project directory above it
if isfile(project_dir):
project_dir = find_project_dir_above(project_dir)
results = []
with fs.cd(project_dir):
config = ProjectConfig.get_instance(project_conf)
config.validate(environment)
default_envs = config.default_envs()
for envname in config.envs():
skipenv = any(
[
environment and envname not in environment,
not environment and default_envs and envname not in default_envs,
]
)
env_options = config.items(env=envname, as_dict=True)
env_dump = []
for k, v in env_options.items():
if k not in ("platform", "framework", "board"):
continue
env_dump.append(
"%s: %s" % (k, ", ".join(v) if isinstance(v, list) else v)
)
default_patterns = [
config.get_optional_dir("src"),
config.get_optional_dir("include"),
]
tool_options = dict(
verbose=verbose,
silent=silent,
patterns=pattern or env_options.get("check_patterns", default_patterns),
flags=flags or env_options.get("check_flags"),
severity=[DefectItem.SEVERITY_LABELS[DefectItem.SEVERITY_HIGH]]
if silent
else severity or config.get("env:" + envname, "check_severity"),
skip_packages=skip_packages or env_options.get("check_skip_packages"),
)
for tool in config.get("env:" + envname, "check_tool"):
if skipenv:
results.append({"env": envname, "tool": tool})
continue
if not silent and not json_output:
print_processing_header(tool, envname, env_dump)
ct = CheckToolFactory.new(
tool, project_dir, config, envname, tool_options
)
result = {"env": envname, "tool": tool, "duration": time()}
rc = ct.check(
on_defect_callback=None
if (json_output or verbose)
else lambda defect: click.echo(repr(defect))
)
result["defects"] = ct.get_defects()
result["duration"] = time() - result["duration"]
result["succeeded"] = rc == 0
if fail_on_defect:
result["succeeded"] = rc == 0 and not any(
DefectItem.SEVERITY_LABELS[d.severity] in fail_on_defect
for d in result["defects"]
)
result["stats"] = collect_component_stats(result)
results.append(result)
if verbose:
click.echo("\n".join(repr(d) for d in result["defects"]))
if not json_output and not silent:
if rc != 0:
click.echo(
"Error: %s failed to perform check! Please "
"examine tool output in verbose mode." % tool
)
elif not result["defects"]:
click.echo("No defects found")
print_processing_footer(result)
if json_output:
click.echo(dump_json_to_unicode(results_to_json(results)))
elif not silent:
print_check_summary(results)
command_failed = any(r.get("succeeded") is False for r in results)
if command_failed:
raise exception.ReturnErrorCode(1)
def results_to_json(raw):
results = []
for item in raw:
if item.get("succeeded") is None:
continue
item.update(
{
"succeeded": bool(item.get("succeeded")),
"defects": [d.as_dict() for d in item.get("defects", [])],
}
)
results.append(item)
return results
def print_processing_header(tool, envname, envdump):
click.echo(
"Checking %s > %s (%s)"
% (click.style(envname, fg="cyan", bold=True), tool, "; ".join(envdump))
)
terminal_width, _ = click.get_terminal_size()
click.secho("-" * terminal_width, bold=True)
def print_processing_footer(result):
is_failed = not result.get("succeeded")
util.print_labeled_bar(
"[%s] Took %.2f seconds"
% (
(
click.style("FAILED", fg="red", bold=True)
if is_failed
else click.style("PASSED", fg="green", bold=True)
),
result["duration"],
),
is_error=is_failed,
)
def collect_component_stats(result):
components = dict()
def _append_defect(component, defect):
if not components.get(component):
components[component] = Counter()
components[component].update({DefectItem.SEVERITY_LABELS[defect.severity]: 1})
for defect in result.get("defects", []):
component = dirname(defect.file) or defect.file
_append_defect(component, defect)
if component.lower().startswith(get_project_dir().lower()):
while os.sep in component:
component = dirname(component)
_append_defect(component, defect)
return components
def print_defects_stats(results):
if not results:
return
component_stats = {}
for r in results:
for k, v in r.get("stats", {}).items():
if not component_stats.get(k):
component_stats[k] = Counter()
component_stats[k].update(r["stats"][k])
if not component_stats:
return
severity_labels = list(DefectItem.SEVERITY_LABELS.values())
severity_labels.reverse()
tabular_data = list()
for k, v in component_stats.items():
tool_defect = [v.get(s, 0) for s in severity_labels]
tabular_data.append([k] + tool_defect)
total = ["Total"] + [sum(d) for d in list(zip(*tabular_data))[1:]]
tabular_data.sort()
tabular_data.append([]) # Empty line as delimiter
tabular_data.append(total)
headers = ["Component"]
headers.extend([l.upper() for l in severity_labels])
headers = [click.style(h, bold=True) for h in headers]
click.echo(tabulate(tabular_data, headers=headers, numalign="center"))
click.echo()
def print_check_summary(results):
click.echo()
tabular_data = []
succeeded_nums = 0
failed_nums = 0
duration = 0
print_defects_stats(results)
for result in results:
duration += result.get("duration", 0)
if result.get("succeeded") is False:
failed_nums += 1
status_str = click.style("FAILED", fg="red")
elif result.get("succeeded") is None:
status_str = "IGNORED"
else:
succeeded_nums += 1
status_str = click.style("PASSED", fg="green")
tabular_data.append(
(
click.style(result["env"], fg="cyan"),
result["tool"],
status_str,
util.humanize_duration_time(result.get("duration")),
)
)
click.echo(
tabulate(
tabular_data,
headers=[
click.style(s, bold=True)
for s in ("Environment", "Tool", "Status", "Duration")
],
),
err=failed_nums,
)
util.print_labeled_bar(
"%s%d succeeded in %s"
% (
"%d failed, " % failed_nums if failed_nums else "",
succeeded_nums,
util.humanize_duration_time(duration),
),
is_error=failed_nums,
fg="red" if failed_nums else "green",
)


@@ -0,0 +1,95 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
import click
from platformio.project.helpers import get_project_dir
# pylint: disable=too-many-instance-attributes, redefined-builtin
# pylint: disable=too-many-arguments
class DefectItem(object):
SEVERITY_HIGH = 1
SEVERITY_MEDIUM = 2
SEVERITY_LOW = 4
SEVERITY_LABELS = {4: "low", 2: "medium", 1: "high"}
def __init__(
self,
severity,
category,
message,
file="unknown",
line=0,
column=0,
id=None,
callstack=None,
cwe=None,
):
assert severity in (self.SEVERITY_HIGH, self.SEVERITY_MEDIUM, self.SEVERITY_LOW)
self.severity = severity
self.category = category
self.message = message
self.line = int(line)
self.column = int(column)
self.callstack = callstack
self.cwe = cwe
self.id = id
self.file = file
if file.lower().startswith(get_project_dir().lower()):
self.file = os.path.relpath(file, get_project_dir())
def __repr__(self):
defect_color = None
if self.severity == self.SEVERITY_HIGH:
defect_color = "red"
elif self.severity == self.SEVERITY_MEDIUM:
defect_color = "yellow"
format_str = "{file}:{line}: [{severity}:{category}] {message} {id}"
return format_str.format(
severity=click.style(self.SEVERITY_LABELS[self.severity], fg=defect_color),
category=click.style(self.category.lower(), fg=defect_color),
file=click.style(self.file, bold=True),
message=self.message,
line=self.line,
id="%s" % "[%s]" % self.id if self.id else "",
)
def __or__(self, defect):
return self.severity | defect.severity
@staticmethod
def severity_to_int(label):
for key, value in DefectItem.SEVERITY_LABELS.items():
if label == value:
return key
raise Exception("Unknown severity label -> %s" % label)
def as_dict(self):
return {
"severity": self.SEVERITY_LABELS[self.severity],
"category": self.category,
"message": self.message,
"file": os.path.realpath(self.file),
"line": self.line,
"column": self.column,
"callstack": self.callstack,
"id": self.id,
"cwe": self.cwe,
}
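
A quick, hedged usage sketch of the class above: the file path, message, and defect id are made up, the snippet assumes platformio-core is importable, and the output in the comments is approximate (terminal colors stripped).

from platformio.commands.check.defect import DefectItem

defect = DefectItem(
    severity=DefectItem.SEVERITY_MEDIUM,
    category="warning",
    message="unused variable 'x'",
    file="src/main.cpp",
    line=42,
    column=5,
    id="clang-diagnostic-unused-variable",
)
# roughly: src/main.cpp:42: [medium:warning] unused variable 'x' [clang-diagnostic-unused-variable]
print(repr(defect))
print(defect.as_dict()["severity"])  # "medium"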


@@ -0,0 +1,33 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from platformio import exception
from platformio.commands.check.tools.clangtidy import ClangtidyCheckTool
from platformio.commands.check.tools.cppcheck import CppcheckCheckTool
from platformio.commands.check.tools.pvsstudio import PvsStudioCheckTool
class CheckToolFactory(object):
@staticmethod
def new(tool, project_dir, config, envname, options):
cls = None
if tool == "cppcheck":
cls = CppcheckCheckTool
elif tool == "clangtidy":
cls = ClangtidyCheckTool
elif tool == "pvs-studio":
cls = PvsStudioCheckTool
else:
raise exception.PlatformioException("Unknown check tool `%s`" % tool)
return cls(project_dir, config, envname, options)
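
The factory above is a plain name-to-class dispatch, so wiring in a new analyzer means one more branch plus a tools/<name>.py module. A simplified, standalone restatement of that lookup (not the PlatformIO API itself):

_TOOL_CLASSES = {
    "cppcheck": "CppcheckCheckTool",
    "clangtidy": "ClangtidyCheckTool",
    "pvs-studio": "PvsStudioCheckTool",
}

def resolve_tool(name):
    # mirrors the if/elif chain in CheckToolFactory.new()
    try:
        return _TOOL_CLASSES[name]
    except KeyError:
        raise ValueError("Unknown check tool `%s`" % name)

assert resolve_tool("cppcheck") == "CppcheckCheckTool"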


@@ -0,0 +1,216 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
from tempfile import NamedTemporaryFile
import click
from platformio import compat, fs, proc
from platformio.commands.check.defect import DefectItem
from platformio.project.helpers import load_project_ide_data
class CheckToolBase(object): # pylint: disable=too-many-instance-attributes
def __init__(self, project_dir, config, envname, options):
self.config = config
self.envname = envname
self.options = options
self.cc_flags = []
self.cxx_flags = []
self.cpp_includes = []
self.cpp_defines = []
self.toolchain_defines = []
self._tmp_files = []
self.cc_path = None
self.cxx_path = None
self._defects = []
self._on_defect_callback = None
self._bad_input = False
self._load_cpp_data(project_dir)
# detect all defects by default
if not self.options.get("severity"):
self.options["severity"] = [
DefectItem.SEVERITY_LOW,
DefectItem.SEVERITY_MEDIUM,
DefectItem.SEVERITY_HIGH,
]
# cast severity labels to their integer IDs
self.options["severity"] = [
s if isinstance(s, int) else DefectItem.severity_to_int(s)
for s in self.options["severity"]
]
def _load_cpp_data(self, project_dir):
data = load_project_ide_data(project_dir, self.envname)
if not data:
return
self.cc_flags = click.parser.split_arg_string(data.get("cc_flags", ""))
self.cxx_flags = click.parser.split_arg_string(data.get("cxx_flags", ""))
self.cpp_includes = self._dump_includes(data.get("includes", {}))
self.cpp_defines = data.get("defines", [])
self.cc_path = data.get("cc_path")
self.cxx_path = data.get("cxx_path")
self.toolchain_defines = self._get_toolchain_defines()
def get_flags(self, tool):
result = []
flags = self.options.get("flags") or []
for flag in flags:
if ":" not in flag or flag.startswith("-"):
result.extend([f for f in flag.split(" ") if f])
elif flag.startswith("%s:" % tool):
result.extend([f for f in flag.split(":", 1)[1].split(" ") if f])
return result
def _get_toolchain_defines(self):
def _extract_defines(language, includes_file):
build_flags = self.cxx_flags if language == "c++" else self.cc_flags
defines = []
cmd = "echo | %s -x %s %s %s -dM -E -" % (
self.cc_path,
language,
" ".join(
[f for f in build_flags if f.startswith(("-m", "-f", "-std"))]
),
includes_file,
)
result = proc.exec_command(cmd, shell=True)
for line in result["out"].split("\n"):
tokens = line.strip().split(" ", 2)
if not tokens or tokens[0] != "#define":
continue
if len(tokens) > 2:
defines.append("%s=%s" % (tokens[1], tokens[2]))
else:
defines.append(tokens[1])
return defines
incflags_file = self._long_includes_hook(self.cpp_includes)
return {lang: _extract_defines(lang, incflags_file) for lang in ("c", "c++")}
def _create_tmp_file(self, data):
with NamedTemporaryFile("w", delete=False) as fp:
fp.write(data)
self._tmp_files.append(fp.name)
return fp.name
def _long_includes_hook(self, includes):
data = []
for inc in includes:
data.append('-I"%s"' % fs.to_unix_path(inc))
return '@"%s"' % self._create_tmp_file(" ".join(data))
@staticmethod
def _dump_includes(includes_map):
result = []
for includes in includes_map.values():
for include in includes:
if include not in result:
result.append(include)
return result
@staticmethod
def is_flag_set(flag, flags):
return any(flag in f for f in flags)
def get_defects(self):
return self._defects
def configure_command(self):
raise NotImplementedError
def on_tool_output(self, line):
line = self.tool_output_filter(line)
if not line:
return
defect = self.parse_defect(line)
if not isinstance(defect, DefectItem):
if self.options.get("verbose"):
click.echo(line)
return
if defect.severity not in self.options["severity"]:
return
self._defects.append(defect)
if self._on_defect_callback:
self._on_defect_callback(defect)
@staticmethod
def tool_output_filter(line):
return line
@staticmethod
def parse_defect(raw_line):
return raw_line
def clean_up(self):
for f in self._tmp_files:
if os.path.isfile(f):
os.remove(f)
@staticmethod
def get_project_target_files(patterns):
c_extension = (".c",)
cpp_extensions = (".cc", ".cpp", ".cxx", ".ino")
header_extensions = (".h", ".hh", ".hpp", ".hxx")
result = {"c": [], "c++": [], "headers": []}
def _add_file(path):
if path.endswith(header_extensions):
result["headers"].append(os.path.realpath(path))
elif path.endswith(c_extension):
result["c"].append(os.path.realpath(path))
elif path.endswith(cpp_extensions):
result["c++"].append(os.path.realpath(path))
for pattern in patterns:
for item in compat.glob_recursive(pattern):
if not os.path.isdir(item):
_add_file(item)
for root, _, files in os.walk(item, followlinks=True):
for f in files:
_add_file(os.path.join(root, f))
return result
def check(self, on_defect_callback=None):
self._on_defect_callback = on_defect_callback
cmd = self.configure_command()
if cmd:
if self.options.get("verbose"):
click.echo(" ".join(cmd))
proc.exec_command(
cmd,
stdout=proc.LineBufferedAsyncPipe(self.on_tool_output),
stderr=proc.LineBufferedAsyncPipe(self.on_tool_output),
)
else:
if self.options.get("verbose"):
click.echo("Error: Couldn't configure command")
self._bad_input = True
self.clean_up()
return self._bad_input
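
get_flags() above implements the check_flags scoping convention: a bare flag applies to every tool, while a "tool:" prefix restricts it to one analyzer. A standalone restatement of that rule with made-up check_flags values (the helper scoped_flags is not part of PlatformIO):

def scoped_flags(all_flags, tool):
    result = []
    for flag in all_flags:
        if ":" not in flag or flag.startswith("-"):
            # unscoped flag: applies to every tool
            result.extend(f for f in flag.split(" ") if f)
        elif flag.startswith("%s:" % tool):
            # "tool: ..." prefix: applies only to that tool
            result.extend(f for f in flag.split(":", 1)[1].split(" ") if f)
    return result

check_flags = ["-DNDEBUG", "clangtidy: --checks=-*,cert-*", "cppcheck: --std=c99"]
assert scoped_flags(check_flags, "clangtidy") == ["-DNDEBUG", "--checks=-*,cert-*"]
assert scoped_flags(check_flags, "cppcheck") == ["-DNDEBUG", "--std=c99"]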


@@ -0,0 +1,81 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import re
from os.path import join
from platformio.commands.check.defect import DefectItem
from platformio.commands.check.tools.base import CheckToolBase
from platformio.package.manager.core import get_core_package_dir
class ClangtidyCheckTool(CheckToolBase):
def tool_output_filter(self, line):
if not self.options.get("verbose") and "[clang-diagnostic-error]" in line:
return ""
if "[CommonOptionsParser]" in line:
self._bad_input = True
return line
if any(d in line for d in ("note: ", "error: ", "warning: ")):
return line
return ""
def parse_defect(self, raw_line):
match = re.match(r"^(.*):(\d+):(\d+):\s+([^:]+):\s(.+)\[([^]]+)\]$", raw_line)
if not match:
return raw_line
file_, line, column, category, message, defect_id = match.groups()
severity = DefectItem.SEVERITY_LOW
if category == "error":
severity = DefectItem.SEVERITY_HIGH
elif category == "warning":
severity = DefectItem.SEVERITY_MEDIUM
return DefectItem(severity, category, message, file_, line, column, defect_id)
def configure_command(self):
tool_path = join(get_core_package_dir("tool-clangtidy"), "clang-tidy")
cmd = [tool_path, "--quiet"]
flags = self.get_flags("clangtidy")
if not self.is_flag_set("--checks", flags):
cmd.append("--checks=*")
project_files = self.get_project_target_files(self.options["patterns"])
src_files = []
for scope in project_files:
src_files.extend(project_files[scope])
cmd.extend(flags + src_files + ["--"])
cmd.extend(
["-D%s" % d for d in self.cpp_defines + self.toolchain_defines["c++"]]
)
includes = []
for inc in self.cpp_includes:
if self.options.get("skip_packages") and inc.lower().startswith(
self.config.get_optional_dir("packages").lower()
):
continue
includes.append(inc)
cmd.extend(["-I%s" % inc for inc in includes])
return cmd
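
The regular expression in parse_defect() above splits a clang-tidy report line into file, line, column, category, message, and check id. A hedged sketch on a fabricated output line (not captured from a real run):

import re

line = "src/main.cpp:10:5: warning: unused variable 'x' [clang-diagnostic-unused-variable]"
match = re.match(r"^(.*):(\d+):(\d+):\s+([^:]+):\s(.+)\[([^]]+)\]$", line)
file_, lineno, column, category, message, defect_id = match.groups()
assert (file_, category, defect_id) == (
    "src/main.cpp",
    "warning",
    "clang-diagnostic-unused-variable",
)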


@@ -0,0 +1,249 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
import click
from platformio import proc
from platformio.commands.check.defect import DefectItem
from platformio.commands.check.tools.base import CheckToolBase
from platformio.package.manager.core import get_core_package_dir
class CppcheckCheckTool(CheckToolBase):
def __init__(self, *args, **kwargs):
self._field_delimiter = "<&PIO&>"
self._buffer = ""
self.defect_fields = [
"severity",
"message",
"file",
"line",
"column",
"callstack",
"cwe",
"id",
]
super(CppcheckCheckTool, self).__init__(*args, **kwargs)
def tool_output_filter(self, line):
if (
not self.options.get("verbose")
and "--suppress=unmatchedSuppression:" in line
):
return ""
if any(
msg in line
for msg in (
"No C or C++ source files found",
"unrecognized command line option",
)
):
self._bad_input = True
return line
def parse_defect(self, raw_line):
if self._field_delimiter not in raw_line:
return None
self._buffer += raw_line
if any(f not in self._buffer for f in self.defect_fields):
return None
args = dict()
for field in self._buffer.split(self._field_delimiter):
field = field.strip().replace('"', "")
name, value = field.split("=", 1)
args[name] = value
args["category"] = args["severity"]
if args["severity"] == "error":
args["severity"] = DefectItem.SEVERITY_HIGH
elif args["severity"] == "warning":
args["severity"] = DefectItem.SEVERITY_MEDIUM
else:
args["severity"] = DefectItem.SEVERITY_LOW
# Skip defects found in third-party packages, but keep in mind that such defects
# can break the checking process, so defects from project files may not be reported
breaking_defect_ids = ("preprocessorErrorDirective", "syntaxError")
if (
args.get("file", "")
.lower()
.startswith(self.config.get_optional_dir("packages").lower())
):
if args["id"] in breaking_defect_ids:
if self.options.get("verbose"):
click.echo(
"Error: Found a breaking defect '%s' in %s:%s\n"
"Please note: check results might not be valid!\n"
"Try adding --skip-packages"
% (args.get("message"), args.get("file"), args.get("line"))
)
click.echo()
self._bad_input = True
return None
self._buffer = ""
return DefectItem(**args)
def configure_command(
self, language, src_files
): # pylint: disable=arguments-differ
tool_path = os.path.join(get_core_package_dir("tool-cppcheck"), "cppcheck")
cmd = [
tool_path,
"--addon-python=%s" % proc.get_pythonexe_path(),
"--error-exitcode=1",
"--verbose" if self.options.get("verbose") else "--quiet",
]
cmd.append(
'--template="%s"'
% self._field_delimiter.join(
["{0}={{{0}}}".format(f) for f in self.defect_fields]
)
)
flags = self.get_flags("cppcheck")
if not flags:
# by default, users can suppress reporting of individual defects
# directly in code: // cppcheck-suppress warningID
cmd.append("--inline-suppr")
if not self.is_flag_set("--platform", flags):
cmd.append("--platform=unspecified")
if not self.is_flag_set("--enable", flags):
enabled_checks = [
"warning",
"style",
"performance",
"portability",
"unusedFunction",
]
cmd.append("--enable=%s" % ",".join(enabled_checks))
if not self.is_flag_set("--language", flags):
cmd.append("--language=" + language)
build_flags = self.cxx_flags if language == "c++" else self.cc_flags
for flag in build_flags:
if "-std" in flag:
# Standards with GNU extensions are not allowed
cmd.append("-" + flag.replace("gnu", "c"))
cmd.extend(
["-D%s" % d for d in self.cpp_defines + self.toolchain_defines[language]]
)
cmd.extend(flags)
cmd.extend(
"--include=" + inc
for inc in self.get_forced_includes(build_flags, self.cpp_includes)
)
cmd.append("--file-list=%s" % self._generate_src_file(src_files))
cmd.append("--includes-file=%s" % self._generate_inc_file())
return cmd
@staticmethod
def get_forced_includes(build_flags, includes):
def _extract_filepath(flag, include_options, build_flags):
path = ""
for option in include_options:
if not flag.startswith(option):
continue
if flag.split(option)[1].strip():
path = flag.split(option)[1].strip()
elif build_flags.index(flag) + 1 < len(build_flags):
path = build_flags[build_flags.index(flag) + 1]
return path
def _search_include_dir(filepath, include_paths):
for inc_path in include_paths:
path = os.path.join(inc_path, filepath)
if os.path.isfile(path):
return path
return ""
result = []
include_options = ("-include", "-imacros")
for f in build_flags:
if f.startswith(include_options):
filepath = _extract_filepath(f, include_options, build_flags)
if not os.path.isabs(filepath):
filepath = _search_include_dir(filepath, includes)
if os.path.isfile(filepath):
result.append(filepath)
return result
def _generate_src_file(self, src_files):
return self._create_tmp_file("\n".join(src_files))
def _generate_inc_file(self):
result = []
for inc in self.cpp_includes:
if self.options.get("skip_packages") and inc.lower().startswith(
self.config.get_optional_dir("packages").lower()
):
continue
result.append(inc)
return self._create_tmp_file("\n".join(result))
def clean_up(self):
super(CppcheckCheckTool, self).clean_up()
# delete temporary dump files generated by addons
if not self.is_flag_set("--addon", self.get_flags("cppcheck")):
return
for files in self.get_project_target_files(self.options["patterns"]).values():
for f in files:
dump_file = f + ".dump"
if os.path.isfile(dump_file):
os.remove(dump_file)
def check(self, on_defect_callback=None):
self._on_defect_callback = on_defect_callback
project_files = self.get_project_target_files(self.options["patterns"])
languages = ("c", "c++")
if not any([project_files[t] for t in languages]):
click.echo("Error: Nothing to check.")
return True
for language in languages:
if not project_files[language]:
continue
cmd = self.configure_command(language, project_files[language])
if not cmd:
self._bad_input = True
continue
if self.options.get("verbose"):
click.echo(" ".join(cmd))
proc.exec_command(
cmd,
stdout=proc.LineBufferedAsyncPipe(self.on_tool_output),
stderr=proc.LineBufferedAsyncPipe(self.on_tool_output),
)
self.clean_up()
return self._bad_input
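
The "<&PIO&>" delimiter above turns cppcheck's --template output into a machine-parseable record: configure_command() serializes the defect fields into the template, and parse_defect() splits them back out. A hedged round-trip sketch with a fabricated report line (not real cppcheck output):

delimiter = "<&PIO&>"
fields = ["severity", "message", "file", "line", "column", "callstack", "cwe", "id"]

template = delimiter.join("{0}={{{0}}}".format(f) for f in fields)
# cppcheck is launched with --template="severity={severity}<&PIO&>message={message}<&PIO&>..."

report_line = delimiter.join(
    [
        "severity=warning",
        "message=Variable 'x' is assigned a value that is never used.",
        "file=src/main.cpp",
        "line=12",
        "column=9",
        "callstack=",
        "cwe=563",
        "id=unreadVariable",
    ]
)
args = dict(chunk.split("=", 1) for chunk in report_line.split(delimiter))
assert args["id"] == "unreadVariable" and args["line"] == "12"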


@@ -0,0 +1,233 @@
# Copyright (c) 2020-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
import shutil
import tempfile
from xml.etree.ElementTree import fromstring
import click
from platformio import proc, util
from platformio.commands.check.defect import DefectItem
from platformio.commands.check.tools.base import CheckToolBase
from platformio.package.manager.core import get_core_package_dir
class PvsStudioCheckTool(CheckToolBase): # pylint: disable=too-many-instance-attributes
def __init__(self, *args, **kwargs):
self._tmp_dir = tempfile.mkdtemp(prefix="piocheck")
self._tmp_preprocessed_file = self._generate_tmp_file_path() + ".i"
self._tmp_output_file = self._generate_tmp_file_path() + ".pvs"
self._tmp_cfg_file = self._generate_tmp_file_path() + ".cfg"
self._tmp_cmd_file = self._generate_tmp_file_path() + ".cmd"
self.tool_path = os.path.join(
get_core_package_dir("tool-pvs-studio"),
"x64" if "windows" in util.get_systype() else "bin",
"pvs-studio",
)
super(PvsStudioCheckTool, self).__init__(*args, **kwargs)
with open(self._tmp_cfg_file, "w") as fp:
fp.write(
"exclude-path = "
+ self.config.get_optional_dir("packages").replace("\\", "/")
)
with open(self._tmp_cmd_file, "w") as fp:
fp.write(
" ".join(
['-I"%s"' % inc.replace("\\", "/") for inc in self.cpp_includes]
)
)
def _process_defects(self, defects):
for defect in defects:
if not isinstance(defect, DefectItem):
return
if defect.severity not in self.options["severity"]:
return
self._defects.append(defect)
if self._on_defect_callback:
self._on_defect_callback(defect)
def _demangle_report(self, output_file):
converter_tool = os.path.join(
get_core_package_dir("tool-pvs-studio"),
"HtmlGenerator"
if "windows" in util.get_systype()
else os.path.join("bin", "plog-converter"),
)
cmd = (
converter_tool,
"-t",
"xml",
output_file,
"-m",
"cwe",
"-m",
"misra",
"-a",
# Enable all possible analyzers and defect levels
"GA:1,2,3;64:1,2,3;OP:1,2,3;CS:1,2,3;MISRA:1,2,3",
"--cerr",
)
result = proc.exec_command(cmd)
if result["returncode"] != 0:
click.echo(result["err"])
self._bad_input = True
return result["err"]
def parse_defects(self, output_file):
defects = []
report = self._demangle_report(output_file)
if not report:
self._bad_input = True
return []
try:
defects_data = fromstring(report)
except: # pylint: disable=bare-except
click.echo("Error: Couldn't decode generated report!")
self._bad_input = True
return []
for table in defects_data.iter("PVS-Studio_Analysis_Log"):
message = table.find("Message").text
category = table.find("ErrorType").text
line = table.find("Line").text
file_ = table.find("File").text
defect_id = table.find("ErrorCode").text
cwe = table.find("CWECode")
cwe_id = None
if cwe is not None:
cwe_id = cwe.text.lower().replace("cwe-", "")
misra = table.find("MISRA")
if misra is not None:
message += " [%s]" % misra.text
severity = DefectItem.SEVERITY_LOW
if category == "error":
severity = DefectItem.SEVERITY_HIGH
elif category == "warning":
severity = DefectItem.SEVERITY_MEDIUM
defects.append(
DefectItem(
severity, category, message, file_, line, id=defect_id, cwe=cwe_id
)
)
return defects
def configure_command(self, src_file): # pylint: disable=arguments-differ
if os.path.isfile(self._tmp_output_file):
os.remove(self._tmp_output_file)
if not os.path.isfile(self._tmp_preprocessed_file):
click.echo("Error: Missing preprocessed file for '%s'" % src_file)
return ""
cmd = [
self.tool_path,
"--skip-cl-exe",
"yes",
"--language",
"C" if src_file.endswith(".c") else "C++",
"--preprocessor",
"gcc",
"--cfg",
self._tmp_cfg_file,
"--source-file",
src_file,
"--i-file",
self._tmp_preprocessed_file,
"--output-file",
self._tmp_output_file,
]
flags = self.get_flags("pvs-studio")
if not self.is_flag_set("--platform", flags):
cmd.append("--platform=arm")
cmd.extend(flags)
return cmd
def _generate_tmp_file_path(self):
# pylint: disable=protected-access
return os.path.join(self._tmp_dir, next(tempfile._get_candidate_names()))
def _prepare_preprocessed_file(self, src_file):
if os.path.isfile(self._tmp_preprocessed_file):
os.remove(self._tmp_preprocessed_file)
flags = self.cxx_flags
compiler = self.cxx_path
if src_file.endswith(".c"):
flags = self.cc_flags
compiler = self.cc_path
cmd = [compiler, src_file, "-E", "-o", self._tmp_preprocessed_file]
cmd.extend([f for f in flags if f])
cmd.extend(["-D%s" % d for d in self.cpp_defines])
cmd.append('@"%s"' % self._tmp_cmd_file)
# Explicitly specify C++ as the language used in .ino files
if src_file.endswith(".ino"):
cmd.insert(1, "-xc++")
result = proc.exec_command(" ".join(cmd), shell=True)
if result["returncode"] != 0 or result["err"]:
if self.options.get("verbose"):
click.echo(" ".join(cmd))
click.echo(result["err"])
self._bad_input = True
def clean_up(self):
super(PvsStudioCheckTool, self).clean_up()
if os.path.isdir(self._tmp_dir):
shutil.rmtree(self._tmp_dir)
def check(self, on_defect_callback=None):
self._on_defect_callback = on_defect_callback
for scope, files in self.get_project_target_files(
self.options["patterns"]
).items():
if scope not in ("c", "c++"):
continue
for src_file in files:
self._prepare_preprocessed_file(src_file)
cmd = self.configure_command(src_file)
if self.options.get("verbose"):
click.echo(" ".join(cmd))
if not cmd:
self._bad_input = True
continue
result = proc.exec_command(cmd)
# pylint: disable=unsupported-membership-test
if result["returncode"] != 0 or "license" in result["err"].lower():
self._bad_input = True
click.echo(result["err"])
continue
self._process_defects(self.parse_defects(self._tmp_output_file))
self.clean_up()
return self._bad_input
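
parse_defects() above walks the XML produced by plog-converter and reads the Message, ErrorType, Line, File, and ErrorCode elements per entry. A hedged sketch of that traversal on a hand-written fragment (the wrapper element name and the diagnostic text are invented for illustration, not real converter output):

from xml.etree.ElementTree import fromstring

report = """
<NewDataSet>
  <PVS-Studio_Analysis_Log>
    <ErrorType>warning</ErrorType>
    <ErrorCode>V501</ErrorCode>
    <Message>There are identical sub-expressions to the left and to the right of the operator.</Message>
    <Line>27</Line>
    <File>src/main.cpp</File>
  </PVS-Studio_Analysis_Log>
</NewDataSet>
"""
root = fromstring(report)
for table in root.iter("PVS-Studio_Analysis_Log"):
    print(table.find("ErrorCode").text, table.find("File").text, table.find("Line").text)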


@@ -12,19 +12,19 @@
# See the License for the specific language governing permissions and
# limitations under the License.
from glob import glob
from os import getenv, makedirs, remove
from os.path import abspath, basename, expanduser, isdir, isfile, join
from os.path import basename, isdir, isfile, join, realpath
from shutil import copyfile, copytree
from tempfile import mkdtemp
import click
from platformio import app, util
from platformio.commands.init import cli as cmd_init
from platformio.commands.init import validate_boards
from platformio.commands.run import cli as cmd_run
from platformio import app, compat, fs
from platformio.commands.project import project_init as cmd_project_init
from platformio.commands.project import validate_boards
from platformio.commands.run.command import cli as cmd_run
from platformio.exception import CIBuildEnvsEmpty
from platformio.project.config import ProjectConfig
def validate_path(ctx, param, value): # pylint: disable=unused-argument
@@ -32,9 +32,9 @@ def validate_path(ctx, param, value): # pylint: disable=unused-argument
value = list(value)
for i, p in enumerate(value):
if p.startswith("~"):
value[i] = expanduser(p)
value[i] = abspath(value[i])
if not glob(value[i]):
value[i] = fs.expanduser(p)
value[i] = realpath(value[i])
if not compat.glob_recursive(value[i]):
invalid_path = p
break
try:
@@ -44,38 +44,39 @@ def validate_path(ctx, param, value): # pylint: disable=unused-argument
raise click.BadParameter("Found invalid path: %s" % invalid_path)
@click.command("ci", short_help="Continuous Integration")
@click.command("ci", short_help="Continuous integration")
@click.argument("src", nargs=-1, callback=validate_path)
@click.option(
"-l", "--lib", multiple=True, callback=validate_path, metavar="DIRECTORY")
@click.option("-l", "--lib", multiple=True, callback=validate_path, metavar="DIRECTORY")
@click.option("--exclude", multiple=True)
@click.option(
"-b", "--board", multiple=True, metavar="ID", callback=validate_boards)
@click.option("-b", "--board", multiple=True, metavar="ID", callback=validate_boards)
@click.option(
"--build-dir",
default=mkdtemp,
type=click.Path(
exists=True,
file_okay=False,
dir_okay=True,
writable=True,
resolve_path=True))
type=click.Path(file_okay=False, dir_okay=True, writable=True, resolve_path=True),
)
@click.option("--keep-build-dir", is_flag=True)
@click.option(
"-C",
"-c",
"--project-conf",
type=click.Path(
exists=True,
file_okay=True,
dir_okay=False,
readable=True,
resolve_path=True))
exists=True, file_okay=True, dir_okay=False, readable=True, resolve_path=True
),
)
@click.option("-O", "--project-option", multiple=True)
@click.option("-v", "--verbose", is_flag=True)
@click.pass_context
def cli( # pylint: disable=too-many-arguments, too-many-branches
ctx, src, lib, exclude, board, build_dir, keep_build_dir, project_conf,
project_option, verbose):
ctx,
src,
lib,
exclude,
board,
build_dir,
keep_build_dir,
project_conf,
project_option,
verbose,
):
if not src and getenv("PLATFORMIO_CI_SRC"):
src = validate_path(ctx, None, getenv("PLATFORMIO_CI_SRC").split(":"))
@@ -86,7 +87,7 @@ def cli( # pylint: disable=too-many-arguments, too-many-branches
app.set_session_var("force_option", True)
if not keep_build_dir and isdir(build_dir):
util.rmtree_(build_dir)
fs.rmtree(build_dir)
if not isdir(build_dir):
makedirs(build_dir)
@@ -95,7 +96,7 @@ def cli( # pylint: disable=too-many-arguments, too-many-branches
continue
contents = []
for p in patterns:
contents += glob(p)
contents += compat.glob_recursive(p)
_copy_contents(join(build_dir, dir_name), contents)
if project_conf and isfile(project_conf):
@@ -108,16 +109,17 @@ def cli( # pylint: disable=too-many-arguments, too-many-branches
# initialise project
ctx.invoke(
cmd_init,
cmd_project_init,
project_dir=build_dir,
board=board,
project_option=project_option)
project_option=project_option,
)
# process project
ctx.invoke(cmd_run, project_dir=build_dir, verbose=verbose)
finally:
if not keep_build_dir:
util.rmtree_(build_dir)
fs.rmtree(build_dir)
def _copy_contents(dst_dir, contents):
@@ -125,44 +127,47 @@ def _copy_contents(dst_dir, contents):
for path in contents:
if isdir(path):
items['dirs'].add(path)
items["dirs"].add(path)
elif isfile(path):
items['files'].add(path)
items["files"].add(path)
dst_dir_name = basename(dst_dir)
if dst_dir_name == "src" and len(items['dirs']) == 1:
copytree(list(items['dirs']).pop(), dst_dir, symlinks=True)
if dst_dir_name == "src" and len(items["dirs"]) == 1:
copytree(list(items["dirs"]).pop(), dst_dir, symlinks=True)
else:
makedirs(dst_dir)
for d in items['dirs']:
if not isdir(dst_dir):
makedirs(dst_dir)
for d in items["dirs"]:
copytree(d, join(dst_dir, basename(d)), symlinks=True)
if not items['files']:
if not items["files"]:
return
if dst_dir_name == "lib":
dst_dir = join(dst_dir, mkdtemp(dir=dst_dir))
for f in items['files']:
copyfile(f, join(dst_dir, basename(f)))
for f in items["files"]:
dst_file = join(dst_dir, basename(f))
if f == dst_file:
continue
copyfile(f, dst_file)
def _exclude_contents(dst_dir, patterns):
contents = []
for p in patterns:
contents += glob(join(util.glob_escape(dst_dir), p))
contents += compat.glob_recursive(join(compat.glob_escape(dst_dir), p))
for path in contents:
path = abspath(path)
path = realpath(path)
if isdir(path):
util.rmtree_(path)
fs.rmtree(path)
elif isfile(path):
remove(path)
def _copy_project_conf(build_dir, project_conf):
config = util.load_project_config(project_conf)
config = ProjectConfig(project_conf, parse_extra=False)
if config.has_section("platformio"):
config.remove_section("platformio")
with open(join(build_dir, "platformio.ini"), "w") as fp:
config.write(fp)
config.save(join(build_dir, "platformio.ini"))


@@ -1,42 +0,0 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import sys
from os import getcwd
import click
from platformio.managers.core import pioplus_call
@click.command(
"debug",
context_settings=dict(ignore_unknown_options=True),
short_help="PIO Unified Debugger")
@click.option(
"-d",
"--project-dir",
default=getcwd,
type=click.Path(
exists=True,
file_okay=False,
dir_okay=True,
writable=True,
resolve_path=True))
@click.option("--environment", "-e", metavar="<environment>")
@click.option("--verbose", "-v", is_flag=True)
@click.option("--interface", type=click.Choice(["gdb"]))
@click.argument("__unprocessed", nargs=-1, type=click.UNPROCESSED)
def cli(*args, **kwargs): # pylint: disable=unused-argument
pioplus_call(sys.argv[1:])


@@ -0,0 +1,13 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.


@@ -0,0 +1,175 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# pylint: disable=too-many-arguments, too-many-statements
# pylint: disable=too-many-locals, too-many-branches
import os
import signal
from os.path import isfile
import click
from platformio import app, exception, fs, proc
from platformio.commands.debug import helpers
from platformio.commands.debug.exception import DebugInvalidOptionsError
from platformio.commands.platform import platform_install as cmd_platform_install
from platformio.package.manager.core import inject_contrib_pysite
from platformio.platform.exception import UnknownPlatform
from platformio.platform.factory import PlatformFactory
from platformio.project.config import ProjectConfig
from platformio.project.exception import ProjectEnvsNotAvailableError
from platformio.project.helpers import is_platformio_project, load_project_ide_data
@click.command(
"debug",
context_settings=dict(ignore_unknown_options=True),
short_help="Unified debugger",
)
@click.option(
"-d",
"--project-dir",
default=os.getcwd,
type=click.Path(
exists=True, file_okay=False, dir_okay=True, writable=True, resolve_path=True
),
)
@click.option(
"-c",
"--project-conf",
type=click.Path(
exists=True, file_okay=True, dir_okay=False, readable=True, resolve_path=True
),
)
@click.option("--environment", "-e", metavar="<environment>")
@click.option("--verbose", "-v", is_flag=True)
@click.option("--interface", type=click.Choice(["gdb"]))
@click.argument("__unprocessed", nargs=-1, type=click.UNPROCESSED)
@click.pass_context
def cli(ctx, project_dir, project_conf, environment, verbose, interface, __unprocessed):
app.set_session_var("custom_project_conf", project_conf)
# use env variables from Eclipse or CLion
for sysenv in ("CWD", "PWD", "PLATFORMIO_PROJECT_DIR"):
if is_platformio_project(project_dir):
break
if os.getenv(sysenv):
project_dir = os.getenv(sysenv)
with fs.cd(project_dir):
config = ProjectConfig.get_instance(project_conf)
config.validate(envs=[environment] if environment else None)
env_name = environment or helpers.get_default_debug_env(config)
env_options = config.items(env=env_name, as_dict=True)
if not set(env_options.keys()) >= set(["platform", "board"]):
raise ProjectEnvsNotAvailableError()
try:
platform = PlatformFactory.new(env_options["platform"])
except UnknownPlatform:
ctx.invoke(
cmd_platform_install,
platforms=[env_options["platform"]],
skip_default_package=True,
)
platform = PlatformFactory.new(env_options["platform"])
debug_options = helpers.configure_initial_debug_options(platform, env_options)
assert debug_options
if not interface:
return helpers.predebug_project(ctx, project_dir, env_name, False, verbose)
ide_data = load_project_ide_data(project_dir, env_name)
if not ide_data:
raise DebugInvalidOptionsError("Could not load a build configuration")
if "--version" in __unprocessed:
result = proc.exec_command([ide_data["gdb_path"], "--version"])
if result["returncode"] == 0:
return click.echo(result["out"])
raise exception.PlatformioException("\n".join([result["out"], result["err"]]))
try:
fs.ensure_udev_rules()
except exception.InvalidUdevRules as e:
click.echo(
helpers.escape_gdbmi_stream("~", str(e) + "\n")
if helpers.is_gdbmi_mode()
else str(e) + "\n",
nl=False,
)
try:
debug_options = platform.configure_debug_options(debug_options, ide_data)
except NotImplementedError:
# legacy for ESP32 dev-platform <=2.0.0
debug_options["load_cmds"] = helpers.configure_esp32_load_cmds(
debug_options, ide_data
)
rebuild_prog = False
preload = debug_options["load_cmds"] == ["preload"]
load_mode = debug_options["load_mode"]
if load_mode == "always":
rebuild_prog = preload or not helpers.has_debug_symbols(ide_data["prog_path"])
elif load_mode == "modified":
rebuild_prog = helpers.is_prog_obsolete(
ide_data["prog_path"]
) or not helpers.has_debug_symbols(ide_data["prog_path"])
else:
rebuild_prog = not isfile(ide_data["prog_path"])
if preload or (not rebuild_prog and load_mode != "always"):
# don't load firmware through debug server
debug_options["load_cmds"] = []
if rebuild_prog:
if helpers.is_gdbmi_mode():
click.echo(
helpers.escape_gdbmi_stream(
"~", "Preparing firmware for debugging...\n"
),
nl=False,
)
stream = helpers.GDBMIConsoleStream()
with proc.capture_std_streams(stream):
helpers.predebug_project(ctx, project_dir, env_name, preload, verbose)
stream.close()
else:
click.echo("Preparing firmware for debugging...")
helpers.predebug_project(ctx, project_dir, env_name, preload, verbose)
# save the SHA-1 sum of the newly built program
if load_mode == "modified":
helpers.is_prog_obsolete(ide_data["prog_path"])
if not isfile(ide_data["prog_path"]):
raise DebugInvalidOptionsError("Program/firmware is missed")
# run debugging client
inject_contrib_pysite()
# pylint: disable=import-outside-toplevel
from platformio.commands.debug.process.client import GDBClient, reactor
client = GDBClient(project_dir, __unprocessed, debug_options, env_options)
client.spawn(ide_data["gdb_path"], ide_data["prog_path"])
signal.signal(signal.SIGINT, lambda *args, **kwargs: None)
reactor.run()
return True
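
The debug_load_mode block in cli() above decides when the firmware is rebuilt before a debug session. A simplified, standalone restatement of that decision (the helper needs_rebuild and its boolean parameters are invented here; has_debug_symbols and is_prog_obsolete stand in for the helpers of the same name):

def needs_rebuild(load_mode, preload, prog_exists, has_debug_symbols, is_prog_obsolete):
    if load_mode == "always":
        return preload or not has_debug_symbols
    if load_mode == "modified":
        return is_prog_obsolete or not has_debug_symbols
    # any other mode: rebuild only when there is no program at all
    return not prog_exists

assert needs_rebuild("modified", False, True, True, False) is False
assert needs_rebuild("always", True, True, True, False) is True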


@@ -0,0 +1,33 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from platformio.exception import PlatformioException, UserSideException
class DebugError(PlatformioException):
pass
class DebugSupportError(DebugError, UserSideException):
MESSAGE = (
"Currently, PlatformIO does not support debugging for `{0}`.\n"
"Please request support at https://github.com/platformio/"
"platformio-core/issues \nor visit -> https://docs.platformio.org"
"/page/plus/debugging.html"
)
class DebugInvalidOptionsError(DebugError, UserSideException):
pass


@@ -0,0 +1,302 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import re
import sys
import time
from fnmatch import fnmatch
from hashlib import sha1
from io import BytesIO
from os.path import isfile
from platformio import fs, util
from platformio.commands import PlatformioCLI
from platformio.commands.debug.exception import DebugInvalidOptionsError
from platformio.commands.run.command import cli as cmd_run
from platformio.compat import is_bytes
from platformio.project.config import ProjectConfig
from platformio.project.options import ProjectOptions
class GDBMIConsoleStream(BytesIO): # pylint: disable=too-few-public-methods
STDOUT = sys.stdout
def write(self, text):
self.STDOUT.write(escape_gdbmi_stream("~", text))
self.STDOUT.flush()
def is_gdbmi_mode():
return "--interpreter" in " ".join(PlatformioCLI.leftover_args)
def escape_gdbmi_stream(prefix, stream):
bytes_stream = False
if is_bytes(stream):
bytes_stream = True
stream = stream.decode()
if not stream:
return b"" if bytes_stream else ""
ends_nl = stream.endswith("\n")
stream = re.sub(r"\\+", "\\\\\\\\", stream)
stream = stream.replace('"', '\\"')
stream = stream.replace("\n", "\\n")
stream = '%s"%s"' % (prefix, stream)
if ends_nl:
stream += "\n"
return stream.encode() if bytes_stream else stream
def get_default_debug_env(config):
default_envs = config.default_envs()
all_envs = config.envs()
for env in default_envs:
if config.get("env:" + env, "build_type") == "debug":
return env
for env in all_envs:
if config.get("env:" + env, "build_type") == "debug":
return env
return default_envs[0] if default_envs else all_envs[0]
def predebug_project(ctx, project_dir, env_name, preload, verbose):
ctx.invoke(
cmd_run,
project_dir=project_dir,
environment=[env_name],
target=["debug"] + (["upload"] if preload else []),
verbose=verbose,
)
if preload:
time.sleep(5)
def configure_initial_debug_options(platform, env_options):
def _cleanup_cmds(items):
items = ProjectConfig.parse_multi_values(items)
return ["$LOAD_CMDS" if item == "$LOAD_CMD" else item for item in items]
board_config = platform.board_config(env_options["board"])
tool_name = board_config.get_debug_tool_name(env_options.get("debug_tool"))
tool_settings = board_config.get("debug", {}).get("tools", {}).get(tool_name, {})
server_options = None
# pick the server configuration specific to the host system
if isinstance(tool_settings.get("server", {}), list):
for item in tool_settings["server"][:]:
tool_settings["server"] = item
if util.get_systype() in item.get("system", []):
break
# a user-defined debug_server option overrides the board's default server
if env_options.get("debug_server"):
server_options = {
"cwd": None,
"executable": None,
"arguments": env_options.get("debug_server"),
}
server_options["executable"] = server_options["arguments"][0]
server_options["arguments"] = server_options["arguments"][1:]
elif "server" in tool_settings:
server_options = tool_settings["server"]
server_package = server_options.get("package")
server_package_dir = (
platform.get_package_dir(server_package) if server_package else None
)
if server_package and not server_package_dir:
platform.install_packages(
with_packages=[server_package], skip_default_package=True, silent=True
)
server_package_dir = platform.get_package_dir(server_package)
server_options.update(
dict(
cwd=server_package_dir if server_package else None,
executable=server_options.get("executable"),
arguments=[
a.replace("$PACKAGE_DIR", server_package_dir)
if server_package_dir
else a
for a in server_options.get("arguments", [])
],
)
)
extra_cmds = _cleanup_cmds(env_options.get("debug_extra_cmds"))
extra_cmds.extend(_cleanup_cmds(tool_settings.get("extra_cmds")))
result = dict(
tool=tool_name,
upload_protocol=env_options.get(
"upload_protocol", board_config.get("upload", {}).get("protocol")
),
load_cmds=_cleanup_cmds(
env_options.get(
"debug_load_cmds",
tool_settings.get(
"load_cmds",
tool_settings.get(
"load_cmd", ProjectOptions["env.debug_load_cmds"].default
),
),
)
),
load_mode=env_options.get(
"debug_load_mode",
tool_settings.get(
"load_mode", ProjectOptions["env.debug_load_mode"].default
),
),
init_break=env_options.get(
"debug_init_break",
tool_settings.get(
"init_break", ProjectOptions["env.debug_init_break"].default
),
),
init_cmds=_cleanup_cmds(
env_options.get("debug_init_cmds", tool_settings.get("init_cmds"))
),
extra_cmds=extra_cmds,
require_debug_port=tool_settings.get("require_debug_port", False),
port=reveal_debug_port(
env_options.get("debug_port", tool_settings.get("port")),
tool_name,
tool_settings,
),
server=server_options,
)
return result
def configure_esp32_load_cmds(debug_options, configuration):
"""
DEPRECATED: Moved to ESP32 dev-platform
See platform.py::configure_debug_options
"""
flash_images = configuration.get("extra", {}).get("flash_images")
ignore_conds = [
debug_options["load_cmds"] != ["load"],
"xtensa-esp32" not in configuration.get("cc_path", ""),
not flash_images,
not all([isfile(item["path"]) for item in flash_images]),
]
if any(ignore_conds):
return debug_options["load_cmds"]
mon_cmds = [
'monitor program_esp32 "{{{path}}}" {offset} verify'.format(
path=fs.to_unix_path(item["path"]), offset=item["offset"]
)
for item in flash_images
]
mon_cmds.append(
'monitor program_esp32 "{%s.bin}" 0x10000 verify'
% fs.to_unix_path(configuration["prog_path"][:-4])
)
return mon_cmds
def has_debug_symbols(prog_path):
if not isfile(prog_path):
return False
matched = {
b".debug_info": False,
b".debug_abbrev": False,
b" -Og": False,
b" -g": False,
b"__PLATFORMIO_BUILD_DEBUG__": False,
}
with open(prog_path, "rb") as fp:
last_data = b""
while True:
data = fp.read(1024)
if not data:
break
for pattern, found in matched.items():
if found:
continue
if pattern in last_data + data:
matched[pattern] = True
last_data = data
return all(matched.values())
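The check above is a heuristic: it streams the program file in 1 KB chunks and requires every byte pattern (the DWARF section names, the " -Og"/" -g" flags and the __PLATFORMIO_BUILD_DEBUG__ marker) to appear somewhere in it. A hedged usage sketch; the firmware path is hypothetical:

# Hypothetical path of a firmware built with build_type = debug
elf_path = ".pio/build/debug/firmware.elf"
if not has_debug_symbols(elf_path):
    print("No debug info detected; rebuild the environment with build_type = debug")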
def is_prog_obsolete(prog_path):
prog_hash_path = prog_path + ".sha1"
if not isfile(prog_path):
return True
shasum = sha1()
with open(prog_path, "rb") as fp:
while True:
data = fp.read(1024)
if not data:
break
shasum.update(data)
new_digest = shasum.hexdigest()
old_digest = None
if isfile(prog_hash_path):
with open(prog_hash_path) as fp:
old_digest = fp.read()
if new_digest == old_digest:
return False
with open(prog_hash_path, "w") as fp:
fp.write(new_digest)
return True
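is_prog_obsolete() keeps a ".sha1" sidecar next to the program, rewrites it on every change, and returns True whenever the stored digest no longer matches. A small self-contained sketch using a temporary file as a stand-in firmware image:

import os, tempfile
# Temporary stand-in for a firmware binary (illustration only)
tmp = tempfile.NamedTemporaryFile(delete=False)
tmp.write(b"firmware v1")
tmp.close()
print(is_prog_obsolete(tmp.name))  # True: no .sha1 sidecar yet
print(is_prog_obsolete(tmp.name))  # False: binary unchanged
with open(tmp.name, "ab") as fp:
    fp.write(b" patched")
print(is_prog_obsolete(tmp.name))  # True: digest differs again
os.remove(tmp.name)
os.remove(tmp.name + ".sha1")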
def reveal_debug_port(env_debug_port, tool_name, tool_settings):
def _get_pattern():
if not env_debug_port:
return None
if set(["*", "?", "[", "]"]) & set(env_debug_port):
return env_debug_port
return None
def _is_match_pattern(port):
pattern = _get_pattern()
if not pattern:
return True
return fnmatch(port, pattern)
def _look_for_serial_port(hwids):
for item in util.get_serialports(filter_hwid=True):
if not _is_match_pattern(item["port"]):
continue
port = item["port"]
if tool_name.startswith("blackmagic"):
if (
"windows" in util.get_systype()
and port.startswith("COM")
and len(port) > 4
):
port = "\\\\.\\%s" % port
if "GDB" in item["description"]:
return port
for hwid in hwids:
hwid_str = ("%s:%s" % (hwid[0], hwid[1])).replace("0x", "")
if hwid_str in item["hwid"]:
return port
return None
if env_debug_port and not _get_pattern():
return env_debug_port
if not tool_settings.get("require_debug_port"):
return None
debug_port = _look_for_serial_port(tool_settings.get("hwids", []))
if not debug_port:
raise DebugInvalidOptionsError("Please specify `debug_port` for environment")
return debug_port

View File

@@ -0,0 +1,161 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
GDB_DEFAULT_INIT_CONFIG = """
define pio_reset_halt_target
monitor reset halt
end
define pio_reset_run_target
monitor reset
end
target extended-remote $DEBUG_PORT
monitor init
$LOAD_CMDS
pio_reset_halt_target
$INIT_BREAK
"""
GDB_STUTIL_INIT_CONFIG = """
define pio_reset_halt_target
monitor reset
monitor halt
end
define pio_reset_run_target
monitor reset
end
target extended-remote $DEBUG_PORT
$LOAD_CMDS
pio_reset_halt_target
$INIT_BREAK
"""
GDB_JLINK_INIT_CONFIG = """
define pio_reset_halt_target
monitor reset
monitor halt
end
define pio_reset_run_target
monitor clrbp
monitor reset
monitor go
end
target extended-remote $DEBUG_PORT
monitor clrbp
monitor speed auto
pio_reset_halt_target
$LOAD_CMDS
$INIT_BREAK
"""
GDB_BLACKMAGIC_INIT_CONFIG = """
define pio_reset_halt_target
set language c
set *0xE000ED0C = 0x05FA0004
set $busy = (*0xE000ED0C & 0x4)
while ($busy)
set $busy = (*0xE000ED0C & 0x4)
end
set language auto
end
define pio_reset_run_target
pio_reset_halt_target
end
target extended-remote $DEBUG_PORT
monitor swdp_scan
attach 1
set mem inaccessible-by-default off
$LOAD_CMDS
$INIT_BREAK
set language c
set *0xE000ED0C = 0x05FA0004
set $busy = (*0xE000ED0C & 0x4)
while ($busy)
set $busy = (*0xE000ED0C & 0x4)
end
set language auto
"""
GDB_MSPDEBUG_INIT_CONFIG = """
define pio_reset_halt_target
end
define pio_reset_run_target
end
target extended-remote $DEBUG_PORT
monitor erase
$LOAD_CMDS
pio_reset_halt_target
$INIT_BREAK
"""
GDB_QEMU_INIT_CONFIG = """
define pio_reset_halt_target
monitor system_reset
end
define pio_reset_run_target
monitor system_reset
end
target extended-remote $DEBUG_PORT
$LOAD_CMDS
pio_reset_halt_target
$INIT_BREAK
"""
GDB_RENODE_INIT_CONFIG = """
define pio_reset_halt_target
monitor machine Reset
$LOAD_CMDS
monitor start
end
define pio_reset_run_target
pio_reset_halt_target
end
target extended-remote $DEBUG_PORT
$LOAD_CMDS
$INIT_BREAK
monitor start
"""
TOOL_TO_CONFIG = {
"jlink": GDB_JLINK_INIT_CONFIG,
"mspdebug": GDB_MSPDEBUG_INIT_CONFIG,
"qemu": GDB_QEMU_INIT_CONFIG,
"blackmagic": GDB_BLACKMAGIC_INIT_CONFIG,
"renode": GDB_RENODE_INIT_CONFIG,
}
def get_gdb_init_config(debug_options):
tool = debug_options.get("tool")
if tool and tool in TOOL_TO_CONFIG:
return TOOL_TO_CONFIG[tool]
server_exe = (debug_options.get("server") or {}).get("executable", "").lower()
if "st-util" in server_exe:
return GDB_STUTIL_INIT_CONFIG
return GDB_DEFAULT_INIT_CONFIG
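Selection order in get_gdb_init_config(): a known tool name wins, then an "st-util" server executable maps to the ST-Link config, and everything else gets the generic OpenOCD-style default. A minimal sketch with hypothetical option dicts shaped like configure_initial_debug_options() output:

# Hypothetical debug_options dicts (only the keys the function reads)
print(get_gdb_init_config({"tool": "jlink"}) is GDB_JLINK_INIT_CONFIG)  # True
print(get_gdb_init_config({"tool": "custom",
                           "server": {"executable": "st-util"}}) is GDB_STUTIL_INIT_CONFIG)  # True
print(get_gdb_init_config({"tool": None, "server": None}) is GDB_DEFAULT_INIT_CONFIG)  # True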

View File

@@ -0,0 +1,13 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

View File

@@ -0,0 +1,93 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import signal
import time
import click
from twisted.internet import protocol # pylint: disable=import-error
from platformio import fs
from platformio.compat import string_types
from platformio.proc import get_pythonexe_path
from platformio.project.helpers import get_project_core_dir
class BaseProcess(protocol.ProcessProtocol, object):
STDOUT_CHUNK_SIZE = 2048
LOG_FILE = None
COMMON_PATTERNS = {
"PLATFORMIO_HOME_DIR": get_project_core_dir(),
"PLATFORMIO_CORE_DIR": get_project_core_dir(),
"PYTHONEXE": get_pythonexe_path(),
}
def __init__(self):
self._last_activity = 0
def apply_patterns(self, source, patterns=None):
_patterns = self.COMMON_PATTERNS.copy()
_patterns.update(patterns or {})
for key, value in _patterns.items():
if key.endswith(("_DIR", "_PATH")):
_patterns[key] = fs.to_unix_path(value)
def _replace(text):
for key, value in _patterns.items():
pattern = "$%s" % key
text = text.replace(pattern, value or "")
return text
if isinstance(source, string_types):
source = _replace(source)
elif isinstance(source, (list, dict)):
items = enumerate(source) if isinstance(source, list) else source.items()
for key, value in items:
if isinstance(value, string_types):
source[key] = _replace(value)
elif isinstance(value, (list, dict)):
source[key] = self.apply_patterns(value, patterns)
return source
def onStdInData(self, data):
self._last_activity = time.time()
if self.LOG_FILE:
with open(self.LOG_FILE, "ab") as fp:
fp.write(data)
def outReceived(self, data):
self._last_activity = time.time()
if self.LOG_FILE:
with open(self.LOG_FILE, "ab") as fp:
fp.write(data)
while data:
chunk = data[: self.STDOUT_CHUNK_SIZE]
click.echo(chunk, nl=False)
data = data[self.STDOUT_CHUNK_SIZE :]
def errReceived(self, data):
self._last_activity = time.time()
if self.LOG_FILE:
with open(self.LOG_FILE, "ab") as fp:
fp.write(data)
click.echo(data, nl=False, err=True)
def processEnded(self, _):
self._last_activity = time.time()
# Allow terminating via SIGINT/CTRL+C
signal.signal(signal.SIGINT, signal.default_int_handler)
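A sketch of how apply_patterns() above substitutes $-prefixed placeholders recursively in strings, lists and dicts (keys ending in _DIR/_PATH are normalized to unix-style paths first); the sample values are hypothetical:

# Illustration only; the paths are made up.
p = BaseProcess()
source = {"cwd": "$PROJECT_DIR", "args": ["--elf", "$PROG_PATH"]}
patterns = {"PROJECT_DIR": "C:\\work\\blink", "PROG_PATH": "C:\\work\\blink\\firmware.elf"}
print(p.apply_patterns(source, patterns))
# -> {'cwd': 'C:/work/blink', 'args': ['--elf', 'C:/work/blink/firmware.elf']}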

View File

@@ -0,0 +1,280 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
import re
import signal
import time
from hashlib import sha1
from os.path import basename, dirname, isdir, join, realpath, splitext
from tempfile import mkdtemp
from twisted.internet import defer # pylint: disable=import-error
from twisted.internet import protocol # pylint: disable=import-error
from twisted.internet import reactor # pylint: disable=import-error
from twisted.internet import stdio # pylint: disable=import-error
from twisted.internet import task # pylint: disable=import-error
from platformio import fs, proc, telemetry, util
from platformio.cache import ContentCache
from platformio.commands.debug import helpers
from platformio.commands.debug.exception import DebugInvalidOptionsError
from platformio.commands.debug.initcfgs import get_gdb_init_config
from platformio.commands.debug.process.base import BaseProcess
from platformio.commands.debug.process.server import DebugServer
from platformio.compat import hashlib_encode_data, is_bytes
from platformio.project.helpers import get_project_cache_dir
class GDBClient(BaseProcess): # pylint: disable=too-many-instance-attributes
PIO_SRC_NAME = ".pioinit"
INIT_COMPLETED_BANNER = "PlatformIO: Initialization completed"
def __init__(self, project_dir, args, debug_options, env_options):
super(GDBClient, self).__init__()
self.project_dir = project_dir
self.args = list(args)
self.debug_options = debug_options
self.env_options = env_options
self._debug_server = DebugServer(debug_options, env_options)
self._session_id = None
if not isdir(get_project_cache_dir()):
os.makedirs(get_project_cache_dir())
self._gdbsrc_dir = mkdtemp(dir=get_project_cache_dir(), prefix=".piodebug-")
self._target_is_run = False
self._auto_continue_timer = None
self._errors_buffer = b""
@defer.inlineCallbacks
def spawn(self, gdb_path, prog_path):
session_hash = gdb_path + prog_path
self._session_id = sha1(hashlib_encode_data(session_hash)).hexdigest()
self._kill_previous_session()
patterns = {
"PROJECT_DIR": self.project_dir,
"PROG_PATH": prog_path,
"PROG_DIR": dirname(prog_path),
"PROG_NAME": basename(splitext(prog_path)[0]),
"DEBUG_PORT": self.debug_options["port"],
"UPLOAD_PROTOCOL": self.debug_options["upload_protocol"],
"INIT_BREAK": self.debug_options["init_break"] or "",
"LOAD_CMDS": "\n".join(self.debug_options["load_cmds"] or []),
}
yield self._debug_server.spawn(patterns)
if not patterns["DEBUG_PORT"]:
patterns["DEBUG_PORT"] = self._debug_server.get_debug_port()
self.generate_pioinit(self._gdbsrc_dir, patterns)
# start GDB client
args = [
"piogdb",
"-q",
"--directory",
self._gdbsrc_dir,
"--directory",
self.project_dir,
"-l",
"10",
]
args.extend(self.args)
if not gdb_path:
raise DebugInvalidOptionsError("GDB client is not configured")
gdb_data_dir = self._get_data_dir(gdb_path)
if gdb_data_dir:
args.extend(["--data-directory", gdb_data_dir])
args.append(patterns["PROG_PATH"])
transport = reactor.spawnProcess(
self, gdb_path, args, path=self.project_dir, env=os.environ
)
defer.returnValue(transport)
@staticmethod
def _get_data_dir(gdb_path):
if "msp430" in gdb_path:
return None
gdb_data_dir = realpath(join(dirname(gdb_path), "..", "share", "gdb"))
return gdb_data_dir if isdir(gdb_data_dir) else None
def generate_pioinit(self, dst_dir, patterns):
# default GDB init commands depending on debug tool
commands = get_gdb_init_config(self.debug_options).split("\n")
if self.debug_options["init_cmds"]:
commands = self.debug_options["init_cmds"]
commands.extend(self.debug_options["extra_cmds"])
if not any("define pio_reset_run_target" in cmd for cmd in commands):
commands = [
"define pio_reset_run_target",
" echo Warning! Undefined pio_reset_run_target command\\n",
" monitor reset",
"end",
] + commands
if not any("define pio_reset_halt_target" in cmd for cmd in commands):
commands = [
"define pio_reset_halt_target",
" echo Warning! Undefined pio_reset_halt_target command\\n",
" monitor reset halt",
"end",
] + commands
if not any("define pio_restart_target" in cmd for cmd in commands):
commands += [
"define pio_restart_target",
" pio_reset_halt_target",
" $INIT_BREAK",
" %s" % ("continue" if patterns["INIT_BREAK"] else "next"),
"end",
]
banner = [
"echo PlatformIO Unified Debugger -> http://bit.ly/pio-debug\\n",
"echo PlatformIO: debug_tool = %s\\n" % self.debug_options["tool"],
"echo PlatformIO: Initializing remote target...\\n",
]
footer = ["echo %s\\n" % self.INIT_COMPLETED_BANNER]
commands = banner + commands + footer
with open(join(dst_dir, self.PIO_SRC_NAME), "w") as fp:
fp.write("\n".join(self.apply_patterns(commands, patterns)))
def connectionMade(self):
self._lock_session(self.transport.pid)
p = protocol.Protocol()
p.dataReceived = self.onStdInData
stdio.StandardIO(p)
def onStdInData(self, data):
super(GDBClient, self).onStdInData(data)
if b"-exec-run" in data:
if self._target_is_run:
token, _ = data.split(b"-", 1)
self.outReceived(token + b"^running\n")
return
data = data.replace(b"-exec-run", b"-exec-continue")
if b"-exec-continue" in data:
self._target_is_run = True
if b"-gdb-exit" in data or data.strip() in (b"q", b"quit"):
# Allow terminating via SIGINT/CTRL+C
signal.signal(signal.SIGINT, signal.default_int_handler)
self.transport.write(b"pio_reset_run_target\n")
self.transport.write(data)
def processEnded(self, reason): # pylint: disable=unused-argument
self._unlock_session()
if self._gdbsrc_dir and isdir(self._gdbsrc_dir):
fs.rmtree(self._gdbsrc_dir)
if self._debug_server:
self._debug_server.terminate()
reactor.stop()
def outReceived(self, data):
super(GDBClient, self).outReceived(data)
self._handle_error(data)
# go to init break automatically
if self.INIT_COMPLETED_BANNER.encode() in data:
telemetry.send_event(
"Debug", "Started", telemetry.dump_run_environment(self.env_options)
)
self._auto_continue_timer = task.LoopingCall(self._auto_exec_continue)
self._auto_continue_timer.start(0.1)
def errReceived(self, data):
super(GDBClient, self).errReceived(data)
self._handle_error(data)
def console_log(self, msg):
if helpers.is_gdbmi_mode():
msg = helpers.escape_gdbmi_stream("~", msg)
self.outReceived(msg if is_bytes(msg) else msg.encode())
def _auto_exec_continue(self):
auto_exec_delay = 0.5 # in seconds
if self._last_activity > (time.time() - auto_exec_delay):
return
if self._auto_continue_timer:
self._auto_continue_timer.stop()
self._auto_continue_timer = None
if not self.debug_options["init_break"] or self._target_is_run:
return
self.console_log(
"PlatformIO: Resume the execution to `debug_init_break = %s`\n"
% self.debug_options["init_break"]
)
self.console_log(
"PlatformIO: More configuration options -> http://bit.ly/pio-debug\n"
)
self.transport.write(
b"0-exec-continue\n" if helpers.is_gdbmi_mode() else b"continue\n"
)
self._target_is_run = True
def _handle_error(self, data):
self._errors_buffer = (self._errors_buffer + data)[-8192:] # keep last 8 KBytes
if not (
self.PIO_SRC_NAME.encode() in self._errors_buffer
and b"Error in sourced" in self._errors_buffer
):
return
last_erros = self._errors_buffer.decode()
last_erros = " ".join(reversed(last_erros.split("\n")))
last_erros = re.sub(r'((~|&)"|\\n\"|\\t)', " ", last_erros, flags=re.M)
err = "%s -> %s" % (
telemetry.dump_run_environment(self.env_options),
last_erros,
)
telemetry.send_exception("DebugInitError: %s" % err)
self.transport.loseConnection()
def _kill_previous_session(self):
assert self._session_id
pid = None
with ContentCache() as cc:
pid = cc.get(self._session_id)
cc.delete(self._session_id)
if not pid:
return
if "windows" in util.get_systype():
kill = ["Taskkill", "/PID", pid, "/F"]
else:
kill = ["kill", pid]
try:
proc.exec_command(kill)
except: # pylint: disable=bare-except
pass
def _lock_session(self, pid):
if not self._session_id:
return
with ContentCache() as cc:
cc.set(self._session_id, str(pid), "1h")
def _unlock_session(self):
if not self._session_id:
return
with ContentCache() as cc:
cc.delete(self._session_id)

View File

@@ -0,0 +1,166 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
import time
from os.path import isdir, isfile, join
from twisted.internet import defer # pylint: disable=import-error
from twisted.internet import reactor # pylint: disable=import-error
from platformio import fs, util
from platformio.commands.debug.exception import DebugInvalidOptionsError
from platformio.commands.debug.helpers import escape_gdbmi_stream, is_gdbmi_mode
from platformio.commands.debug.process.base import BaseProcess
from platformio.proc import where_is_program
class DebugServer(BaseProcess):
def __init__(self, debug_options, env_options):
super(DebugServer, self).__init__()
self.debug_options = debug_options
self.env_options = env_options
self._debug_port = ":3333"
self._transport = None
self._process_ended = False
self._ready = False
@defer.inlineCallbacks
def spawn(self, patterns): # pylint: disable=too-many-branches
systype = util.get_systype()
server = self.debug_options.get("server")
if not server:
defer.returnValue(None)
server = self.apply_patterns(server, patterns)
server_executable = server["executable"]
if not server_executable:
defer.returnValue(None)
if server["cwd"]:
server_executable = join(server["cwd"], server_executable)
if (
"windows" in systype
and not server_executable.endswith(".exe")
and isfile(server_executable + ".exe")
):
server_executable = server_executable + ".exe"
if not isfile(server_executable):
server_executable = where_is_program(server_executable)
if not isfile(server_executable):
raise DebugInvalidOptionsError(
"\nCould not launch Debug Server '%s'. Please check that it "
"is installed and is included in a system PATH\n\n"
"See documentation or contact contact@platformio.org:\n"
"https://docs.platformio.org/page/plus/debugging.html\n"
% server_executable
)
openocd_pipe_allowed = all(
[not self.debug_options["port"], "openocd" in server_executable]
)
if openocd_pipe_allowed:
args = []
if server["cwd"]:
args.extend(["-s", server["cwd"]])
args.extend(
["-c", "gdb_port pipe; tcl_port disabled; telnet_port disabled"]
)
args.extend(server["arguments"])
str_args = " ".join(
[arg if arg.startswith("-") else '"%s"' % arg for arg in args]
)
self._debug_port = '| "%s" %s' % (server_executable, str_args)
self._debug_port = fs.to_unix_path(self._debug_port)
defer.returnValue(self._debug_port)
env = os.environ.copy()
# prepend server "lib" folder to LD path
if (
"windows" not in systype
and server["cwd"]
and isdir(join(server["cwd"], "lib"))
):
ld_key = "DYLD_LIBRARY_PATH" if "darwin" in systype else "LD_LIBRARY_PATH"
env[ld_key] = join(server["cwd"], "lib")
if os.environ.get(ld_key):
env[ld_key] = "%s:%s" % (env[ld_key], os.environ.get(ld_key))
# prepend BIN to PATH
if server["cwd"] and isdir(join(server["cwd"], "bin")):
env["PATH"] = "%s%s%s" % (
join(server["cwd"], "bin"),
os.pathsep,
os.environ.get("PATH", os.environ.get("Path", "")),
)
self._transport = reactor.spawnProcess(
self,
server_executable,
[server_executable] + server["arguments"],
path=server["cwd"],
env=env,
)
if "mspdebug" in server_executable.lower():
self._debug_port = ":2000"
elif "jlink" in server_executable.lower():
self._debug_port = ":2331"
elif "qemu" in server_executable.lower():
self._debug_port = ":1234"
yield self._wait_until_ready()
defer.returnValue(self._debug_port)
@defer.inlineCallbacks
def _wait_until_ready(self):
timeout = 10
elapsed = 0
delay = 0.5
auto_ready_delay = 0.5
while not self._ready and not self._process_ended and elapsed < timeout:
yield self.async_sleep(delay)
if not self.debug_options.get("server", {}).get("ready_pattern"):
self._ready = self._last_activity < (time.time() - auto_ready_delay)
elapsed += delay
@staticmethod
def async_sleep(secs):
d = defer.Deferred()
reactor.callLater(secs, d.callback, None)
return d
def get_debug_port(self):
return self._debug_port
def outReceived(self, data):
super(DebugServer, self).outReceived(
escape_gdbmi_stream("@", data) if is_gdbmi_mode() else data
)
if self._ready:
return
ready_pattern = self.debug_options.get("server", {}).get("ready_pattern")
if ready_pattern:
self._ready = ready_pattern.encode() in data
def processEnded(self, reason):
self._process_ended = True
super(DebugServer, self).processEnded(reason)
def terminate(self):
if self._process_ended or not self._transport:
return
try:
self._transport.signalProcess("KILL")
except: # pylint: disable=bare-except
pass

View File

@@ -1,228 +0,0 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import json
import sys
from os import getcwd
import click
from serial.tools import miniterm
from platformio import exception, util
@click.group(short_help="Monitor device or list existing")
def cli():
pass
@cli.command("list", short_help="List devices")
@click.option("--serial", is_flag=True, help="List serial ports, default")
@click.option("--logical", is_flag=True, help="List logical devices")
@click.option("--mdns", is_flag=True, help="List multicast DNS services")
@click.option("--json-output", is_flag=True)
def device_list( # pylint: disable=too-many-branches
serial, logical, mdns, json_output):
if not logical and not mdns:
serial = True
data = {}
if serial:
data['serial'] = util.get_serial_ports()
if logical:
data['logical'] = util.get_logical_devices()
if mdns:
data['mdns'] = util.get_mdns_services()
single_key = data.keys()[0] if len(data.keys()) == 1 else None
if json_output:
return click.echo(json.dumps(data[single_key] if single_key else data))
titles = {
"serial": "Serial Ports",
"logical": "Logical Devices",
"mdns": "Multicast DNS Services"
}
for key, value in data.items():
if not single_key:
click.secho(titles[key], bold=True)
click.echo("=" * len(titles[key]))
if key == "serial":
for item in value:
click.secho(item['port'], fg="cyan")
click.echo("-" * len(item['port']))
click.echo("Hardware ID: %s" % item['hwid'])
click.echo("Description: %s" % item['description'])
click.echo("")
if key == "logical":
for item in value:
click.secho(item['path'], fg="cyan")
click.echo("-" * len(item['path']))
click.echo("Name: %s" % item['name'])
click.echo("")
if key == "mdns":
for item in value:
click.secho(item['name'], fg="cyan")
click.echo("-" * len(item['name']))
click.echo("Type: %s" % item['type'])
click.echo("IP: %s" % item['ip'])
click.echo("Port: %s" % item['port'])
if item['properties']:
click.echo("Properties: %s" % ("; ".join([
"%s=%s" % (k, v)
for k, v in item['properties'].items()
])))
click.echo("")
if single_key:
click.echo("")
return True
@cli.command("monitor", short_help="Monitor device (Serial)")
@click.option("--port", "-p", help="Port, a number or a device name")
@click.option("--baud", "-b", type=int, help="Set baud rate, default=9600")
@click.option(
"--parity",
default="N",
type=click.Choice(["N", "E", "O", "S", "M"]),
help="Set parity, default=N")
@click.option(
"--rtscts", is_flag=True, help="Enable RTS/CTS flow control, default=Off")
@click.option(
"--xonxoff",
is_flag=True,
help="Enable software flow control, default=Off")
@click.option(
"--rts",
default=None,
type=click.IntRange(0, 1),
help="Set initial RTS line state")
@click.option(
"--dtr",
default=None,
type=click.IntRange(0, 1),
help="Set initial DTR line state")
@click.option("--echo", is_flag=True, help="Enable local echo, default=Off")
@click.option(
"--encoding",
default="UTF-8",
help="Set the encoding for the serial port (e.g. hexlify, "
"Latin1, UTF-8), default: UTF-8")
@click.option("--filter", "-f", multiple=True, help="Add text transformation")
@click.option(
"--eol",
default="CRLF",
type=click.Choice(["CR", "LF", "CRLF"]),
help="End of line mode, default=CRLF")
@click.option(
"--raw", is_flag=True, help="Do not apply any encodings/transformations")
@click.option(
"--exit-char",
type=int,
default=3,
help="ASCII code of special character that is used to exit "
"the application, default=3 (Ctrl+C)")
@click.option(
"--menu-char",
type=int,
default=20,
help="ASCII code of special character that is used to "
"control miniterm (menu), default=20 (DEC)")
@click.option(
"--quiet",
is_flag=True,
help="Diagnostics: suppress non-error messages, default=Off")
@click.option(
"-d",
"--project-dir",
default=getcwd,
type=click.Path(
exists=True, file_okay=False, dir_okay=True, resolve_path=True))
@click.option(
"-e",
"--environment",
help="Load configuration from `platformio.ini` and specified environment")
def device_monitor(**kwargs): # pylint: disable=too-many-branches
try:
project_options = get_project_options(kwargs['project_dir'],
kwargs['environment'])
monitor_options = {k: v for k, v in project_options or []}
if monitor_options:
for k in ("port", "baud", "speed", "rts", "dtr"):
k2 = "monitor_%s" % k
if k == "speed":
k = "baud"
if kwargs[k] is None and k2 in monitor_options:
kwargs[k] = monitor_options[k2]
if k != "port":
kwargs[k] = int(kwargs[k])
except exception.NotPlatformIOProject:
pass
if not kwargs['port']:
ports = util.get_serial_ports(filter_hwid=True)
if len(ports) == 1:
kwargs['port'] = ports[0]['port']
sys.argv = ["monitor"]
for k, v in kwargs.items():
if k in ("port", "baud", "rts", "dtr", "environment", "project_dir"):
continue
k = "--" + k.replace("_", "-")
if isinstance(v, bool):
if v:
sys.argv.append(k)
elif isinstance(v, tuple):
for i in v:
sys.argv.extend([k, i])
else:
sys.argv.extend([k, str(v)])
try:
miniterm.main(
default_port=kwargs['port'],
default_baudrate=kwargs['baud'] or 9600,
default_rts=kwargs['rts'],
default_dtr=kwargs['dtr'])
except Exception as e:
raise exception.MinitermException(e)
def get_project_options(project_dir, environment):
config = util.load_project_config(project_dir)
if not config.sections():
return None
known_envs = [s[4:] for s in config.sections() if s.startswith("env:")]
if environment:
if environment in known_envs:
return config.items("env:%s" % environment)
raise exception.UnknownEnvNames(environment, ", ".join(known_envs))
if not known_envs:
return None
if config.has_option("platformio", "env_default"):
env_default = config.get("platformio",
"env_default").split(", ")[0].strip()
if env_default and env_default in known_envs:
return config.items("env:%s" % env_default)
return config.items("env:%s" % known_envs[0])

View File

@@ -0,0 +1,15 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from platformio.commands.device.filters.base import DeviceMonitorFilter

View File

@@ -0,0 +1,245 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
import sys
from fnmatch import fnmatch
import click
from serial.tools import miniterm
from platformio import exception, fs, util
from platformio.commands.device import helpers as device_helpers
from platformio.compat import dump_json_to_unicode
from platformio.platform.factory import PlatformFactory
from platformio.project.exception import NotPlatformIOProjectError
@click.group(short_help="Device manager & serial/socket monitor")
def cli():
pass
@cli.command("list", short_help="List devices")
@click.option("--serial", is_flag=True, help="List serial ports, default")
@click.option("--logical", is_flag=True, help="List logical devices")
@click.option("--mdns", is_flag=True, help="List multicast DNS services")
@click.option("--json-output", is_flag=True)
def device_list( # pylint: disable=too-many-branches
serial, logical, mdns, json_output
):
if not logical and not mdns:
serial = True
data = {}
if serial:
data["serial"] = util.get_serial_ports()
if logical:
data["logical"] = util.get_logical_devices()
if mdns:
data["mdns"] = util.get_mdns_services()
single_key = list(data)[0] if len(list(data)) == 1 else None
if json_output:
return click.echo(
dump_json_to_unicode(data[single_key] if single_key else data)
)
titles = {
"serial": "Serial Ports",
"logical": "Logical Devices",
"mdns": "Multicast DNS Services",
}
for key, value in data.items():
if not single_key:
click.secho(titles[key], bold=True)
click.echo("=" * len(titles[key]))
if key == "serial":
for item in value:
click.secho(item["port"], fg="cyan")
click.echo("-" * len(item["port"]))
click.echo("Hardware ID: %s" % item["hwid"])
click.echo("Description: %s" % item["description"])
click.echo("")
if key == "logical":
for item in value:
click.secho(item["path"], fg="cyan")
click.echo("-" * len(item["path"]))
click.echo("Name: %s" % item["name"])
click.echo("")
if key == "mdns":
for item in value:
click.secho(item["name"], fg="cyan")
click.echo("-" * len(item["name"]))
click.echo("Type: %s" % item["type"])
click.echo("IP: %s" % item["ip"])
click.echo("Port: %s" % item["port"])
if item["properties"]:
click.echo(
"Properties: %s"
% (
"; ".join(
[
"%s=%s" % (k, v)
for k, v in item["properties"].items()
]
)
)
)
click.echo("")
if single_key:
click.echo("")
return True
@cli.command("monitor", short_help="Monitor device (Serial)")
@click.option("--port", "-p", help="Port, a number or a device name")
@click.option("--baud", "-b", type=int, help="Set baud rate, default=9600")
@click.option(
"--parity",
default="N",
type=click.Choice(["N", "E", "O", "S", "M"]),
help="Set parity, default=N",
)
@click.option("--rtscts", is_flag=True, help="Enable RTS/CTS flow control, default=Off")
@click.option(
"--xonxoff", is_flag=True, help="Enable software flow control, default=Off"
)
@click.option(
"--rts", default=None, type=click.IntRange(0, 1), help="Set initial RTS line state"
)
@click.option(
"--dtr", default=None, type=click.IntRange(0, 1), help="Set initial DTR line state"
)
@click.option("--echo", is_flag=True, help="Enable local echo, default=Off")
@click.option(
"--encoding",
default="UTF-8",
help="Set the encoding for the serial port (e.g. hexlify, "
"Latin1, UTF-8), default: UTF-8",
)
@click.option("--filter", "-f", multiple=True, help="Add filters/text transformations")
@click.option(
"--eol",
default="CRLF",
type=click.Choice(["CR", "LF", "CRLF"]),
help="End of line mode, default=CRLF",
)
@click.option("--raw", is_flag=True, help="Do not apply any encodings/transformations")
@click.option(
"--exit-char",
type=int,
default=3,
help="ASCII code of special character that is used to exit "
"the application, default=3 (Ctrl+C)",
)
@click.option(
"--menu-char",
type=int,
default=20,
help="ASCII code of special character that is used to "
"control miniterm (menu), default=20 (DEC)",
)
@click.option(
"--quiet",
is_flag=True,
help="Diagnostics: suppress non-error messages, default=Off",
)
@click.option(
"-d",
"--project-dir",
default=os.getcwd,
type=click.Path(exists=True, file_okay=False, dir_okay=True, resolve_path=True),
)
@click.option(
"-e",
"--environment",
help="Load configuration from `platformio.ini` and specified environment",
)
def device_monitor(**kwargs): # pylint: disable=too-many-branches
# load default monitor filters
filters_dir = os.path.join(fs.get_source_dir(), "commands", "device", "filters")
for name in os.listdir(filters_dir):
if not name.endswith(".py"):
continue
device_helpers.load_monitor_filter(os.path.join(filters_dir, name))
project_options = {}
try:
with fs.cd(kwargs["project_dir"]):
project_options = device_helpers.get_project_options(kwargs["environment"])
kwargs = device_helpers.apply_project_monitor_options(kwargs, project_options)
except NotPlatformIOProjectError:
pass
platform = None
if "platform" in project_options:
with fs.cd(kwargs["project_dir"]):
platform = PlatformFactory.new(project_options["platform"])
device_helpers.register_platform_filters(
platform, kwargs["project_dir"], kwargs["environment"]
)
if not kwargs["port"]:
ports = util.get_serial_ports(filter_hwid=True)
if len(ports) == 1:
kwargs["port"] = ports[0]["port"]
elif "platform" in project_options and "board" in project_options:
board_hwids = device_helpers.get_board_hwids(
kwargs["project_dir"],
platform,
project_options["board"],
)
for item in ports:
for hwid in board_hwids:
hwid_str = ("%s:%s" % (hwid[0], hwid[1])).replace("0x", "")
if hwid_str in item["hwid"]:
kwargs["port"] = item["port"]
break
if kwargs["port"]:
break
elif kwargs["port"] and (set(["*", "?", "[", "]"]) & set(kwargs["port"])):
for item in util.get_serial_ports():
if fnmatch(item["port"], kwargs["port"]):
kwargs["port"] = item["port"]
break
# override system argv with patched options
sys.argv = ["monitor"] + device_helpers.options_to_argv(
kwargs,
project_options,
ignore=("port", "baud", "rts", "dtr", "environment", "project_dir"),
)
if not kwargs["quiet"]:
click.echo(
"--- Available filters and text transformations: %s"
% ", ".join(sorted(miniterm.TRANSFORMATIONS.keys()))
)
click.echo("--- More details at http://bit.ly/pio-monitor-filters")
try:
miniterm.main(
default_port=kwargs["port"],
default_baudrate=kwargs["baud"] or 9600,
default_rts=kwargs["rts"],
default_dtr=kwargs["dtr"],
)
except Exception as e:
raise exception.MinitermException(e)

View File

@@ -0,0 +1,13 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

View File

@@ -0,0 +1,42 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from serial.tools import miniterm
from platformio.project.config import ProjectConfig
class DeviceMonitorFilter(miniterm.Transform):
def __init__(self, project_dir=None, environment=None):
""" Called by PlatformIO to pass context """
miniterm.Transform.__init__(self)
self.project_dir = project_dir
self.environment = environment
self.config = ProjectConfig.get_instance()
if not self.environment:
default_envs = self.config.default_envs()
if default_envs:
self.environment = default_envs[0]
elif self.config.envs():
self.environment = self.config.envs()[0]
def __call__(self):
""" Called by the miniterm library when the filter is actually used """
return self
@property
def NAME(self):
raise NotImplementedError("Please declare NAME attribute for the filter class")

View File

@@ -0,0 +1,38 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import serial
from platformio.commands.device import DeviceMonitorFilter
class Hexlify(DeviceMonitorFilter):
NAME = "hexlify"
def __init__(self, *args, **kwargs):
super(Hexlify, self).__init__(*args, **kwargs)
self._counter = 0
def rx(self, text):
result = ""
for b in serial.iterbytes(text):
if (self._counter % 16) == 0:
result += "\n{:04X} | ".format(self._counter)
asciicode = ord(b)
if asciicode <= 255:
result += "{:02X} ".format(asciicode)
else:
result += "?? "
self._counter += 1
return result
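The hexlify filter renders incoming bytes as a 16-bytes-per-row dump with a running offset counter. A standalone sketch of the same formatting logic (kept outside the filter class so it does not need the ProjectConfig plumbing of DeviceMonitorFilter):

def hexlify_preview(data, counter=0):
    # Mirrors Hexlify.rx(): 4-digit hex offset, 16 bytes per row
    result = ""
    for b in data:
        if counter % 16 == 0:
            result += "\n{:04X} | ".format(counter)
        result += "{:02X} ".format(b)
        counter += 1
    return result

print(hexlify_preview(b"Hello, PlatformIO!"))
# prints roughly:
# 0000 | 48 65 6C 6C 6F 2C 20 50 6C 61 74 66 6F 72 6D 49
# 0010 | 4F 21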

View File

@@ -0,0 +1,44 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import io
import os.path
from datetime import datetime
from platformio.commands.device import DeviceMonitorFilter
class LogToFile(DeviceMonitorFilter):
NAME = "log2file"
def __init__(self, *args, **kwargs):
super(LogToFile, self).__init__(*args, **kwargs)
self._log_fp = None
def __call__(self):
log_file_name = "platformio-device-monitor-%s.log" % datetime.now().strftime(
"%y%m%d-%H%M%S"
)
print("--- Logging an output to %s" % os.path.abspath(log_file_name))
self._log_fp = io.open(log_file_name, "w", encoding="utf-8")
return self
def __del__(self):
if self._log_fp:
self._log_fp.close()
def rx(self, text):
self._log_fp.write(text)
self._log_fp.flush()
return text

View File

@@ -12,20 +12,20 @@
# See the License for the specific language governing permissions and
# limitations under the License.
import sys
import click
from platformio.managers.core import pioplus_call
from platformio.commands.device import DeviceMonitorFilter
@click.command("home", short_help="PIO Home")
@click.option("--port", type=int, default=8008, help="HTTP port, default=8008")
@click.option(
"--host",
default="127.0.0.1",
help="HTTP host, default=127.0.0.1. "
"You can open PIO Home for inbound connections with --host=0.0.0.0")
@click.option("--no-open", is_flag=True)
def cli(*args, **kwargs): # pylint: disable=unused-argument
pioplus_call(sys.argv[1:])
class SendOnEnter(DeviceMonitorFilter):
NAME = "send_on_enter"
def __init__(self, *args, **kwargs):
super(SendOnEnter, self).__init__(*args, **kwargs)
self._buffer = ""
def tx(self, text):
self._buffer += text
if self._buffer.endswith("\r\n"):
text = self._buffer[:-2]
self._buffer = ""
return text
return ""

View File

@@ -0,0 +1,37 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from datetime import datetime
from platformio.commands.device import DeviceMonitorFilter
class Timestamp(DeviceMonitorFilter):
NAME = "time"
def __init__(self, *args, **kwargs):
super(Timestamp, self).__init__(*args, **kwargs)
self._line_started = False
def rx(self, text):
if self._line_started and "\n" not in text:
return text
timestamp = datetime.now().strftime("%H:%M:%S.%f")[:-3]
if not self._line_started:
self._line_started = True
text = "%s > %s" % (timestamp, text)
if text.endswith("\n"):
self._line_started = False
return text[:-1].replace("\n", "\n%s > " % timestamp) + "\n"
return text.replace("\n", "\n%s > " % timestamp)
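The "time" filter prefixes every fresh line of device output with an HH:MM:SS.mmm timestamp and remembers between rx() calls whether a line is still in progress. A condensed sketch for the simple case where a chunk both starts and ends a line:

from datetime import datetime

def stamp_lines(chunk):
    # Minimal sketch of the "time" filter output for a complete chunk
    ts = datetime.now().strftime("%H:%M:%S.%f")[:-3]
    body = ("%s > %s" % (ts, chunk))[:-1]
    return body.replace("\n", "\n%s > " % ts) + "\n"

print(stamp_lines("boot ok\nsensor ready\n"), end="")
# example output:
# 14:03:22.117 > boot ok
# 14:03:22.117 > sensor ready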

View File

@@ -0,0 +1,106 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import inspect
import os
from serial.tools import miniterm
from platformio import fs
from platformio.commands.device import DeviceMonitorFilter
from platformio.compat import get_object_members, load_python_module
from platformio.project.config import ProjectConfig
def apply_project_monitor_options(cli_options, project_options):
for k in ("port", "speed", "rts", "dtr"):
k2 = "monitor_%s" % k
if k == "speed":
k = "baud"
if cli_options[k] is None and k2 in project_options:
cli_options[k] = project_options[k2]
if k != "port":
cli_options[k] = int(cli_options[k])
return cli_options
def options_to_argv(cli_options, project_options, ignore=None):
confmon_flags = project_options.get("monitor_flags", [])
result = confmon_flags[::]
for f in project_options.get("monitor_filters", []):
result.extend(["--filter", f])
for k, v in cli_options.items():
if v is None or (ignore and k in ignore):
continue
k = "--" + k.replace("_", "-")
if k in confmon_flags:
continue
if isinstance(v, bool):
if v:
result.append(k)
elif isinstance(v, tuple):
for i in v:
result.extend([k, i])
else:
result.extend([k, str(v)])
return result
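options_to_argv() above merges the project-level monitor_flags/monitor_filters with the remaining CLI options and rebuilds a miniterm argv. A minimal sketch with hypothetical option values:

# Hypothetical CLI kwargs and platformio.ini monitor options
cli_options = {"baud": 115200, "echo": True, "eol": "LF", "port": "/dev/ttyUSB0"}
project_options = {"monitor_flags": ["--parity", "N"], "monitor_filters": ["time", "log2file"]}
print(options_to_argv(cli_options, project_options, ignore=("port",)))
# -> ['--parity', 'N', '--filter', 'time', '--filter', 'log2file',
#     '--baud', '115200', '--echo', '--eol', 'LF']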
def get_project_options(environment=None):
config = ProjectConfig.get_instance()
config.validate(envs=[environment] if environment else None)
if not environment:
default_envs = config.default_envs()
if default_envs:
environment = default_envs[0]
else:
environment = config.envs()[0]
return config.items(env=environment, as_dict=True)
def get_board_hwids(project_dir, platform, board):
with fs.cd(project_dir):
return platform.board_config(board).get("build.hwids", [])
def load_monitor_filter(path, project_dir=None, environment=None):
name = os.path.basename(path)
name = name[: name.find(".")]
module = load_python_module("platformio.commands.device.filters.%s" % name, path)
for cls in get_object_members(module).values():
if (
not inspect.isclass(cls)
or not issubclass(cls, DeviceMonitorFilter)
or cls == DeviceMonitorFilter
):
continue
obj = cls(project_dir, environment)
miniterm.TRANSFORMATIONS[obj.NAME] = obj
return True
def register_platform_filters(platform, project_dir, environment):
monitor_dir = os.path.join(platform.get_dir(), "monitor")
if not os.path.isdir(monitor_dir):
return
for name in os.listdir(monitor_dir):
if not name.startswith("filter_") or not name.endswith(".py"):
continue
path = os.path.join(monitor_dir, name)
if not os.path.isfile(path):
continue
load_monitor_filter(path, project_dir, environment)

View File

@@ -0,0 +1,13 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

View File

@@ -0,0 +1,152 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# pylint: disable=too-many-locals,too-many-statements
import mimetypes
import socket
from os.path import isdir
import click
from platformio import exception
from platformio.compat import WINDOWS
from platformio.package.manager.core import get_core_package_dir, inject_contrib_pysite
@click.command("home", short_help="UI to manage PlatformIO")
@click.option("--port", type=int, default=8008, help="HTTP port, default=8008")
@click.option(
"--host",
default="127.0.0.1",
help=(
"HTTP host, default=127.0.0.1. You can open PIO Home for inbound "
"connections with --host=0.0.0.0"
),
)
@click.option("--no-open", is_flag=True)
@click.option(
"--shutdown-timeout",
default=0,
type=int,
help=(
"Automatically shutdown server on timeout (in seconds) when no clients "
"are connected. Default is 0 which means never auto shutdown"
),
)
def cli(port, host, no_open, shutdown_timeout):
# pylint: disable=import-error, import-outside-toplevel
# import contrib modules
inject_contrib_pysite()
from autobahn.twisted.resource import WebSocketResource
from twisted.internet import reactor
from twisted.web import server
from twisted.internet.error import CannotListenError
from platformio.commands.home.rpc.handlers.app import AppRPC
from platformio.commands.home.rpc.handlers.ide import IDERPC
from platformio.commands.home.rpc.handlers.misc import MiscRPC
from platformio.commands.home.rpc.handlers.os import OSRPC
from platformio.commands.home.rpc.handlers.piocore import PIOCoreRPC
from platformio.commands.home.rpc.handlers.project import ProjectRPC
from platformio.commands.home.rpc.handlers.account import AccountRPC
from platformio.commands.home.rpc.server import JSONRPCServerFactory
from platformio.commands.home.web import WebRoot
factory = JSONRPCServerFactory(shutdown_timeout)
factory.addHandler(AppRPC(), namespace="app")
factory.addHandler(IDERPC(), namespace="ide")
factory.addHandler(MiscRPC(), namespace="misc")
factory.addHandler(OSRPC(), namespace="os")
factory.addHandler(PIOCoreRPC(), namespace="core")
factory.addHandler(ProjectRPC(), namespace="project")
factory.addHandler(AccountRPC(), namespace="account")
contrib_dir = get_core_package_dir("contrib-piohome")
if not isdir(contrib_dir):
raise exception.PlatformioException("Invalid path to PIO Home Contrib")
# Ensure PIO Home mimetypes are known
mimetypes.add_type("text/html", ".html")
mimetypes.add_type("text/css", ".css")
mimetypes.add_type("application/javascript", ".js")
root = WebRoot(contrib_dir)
root.putChild(b"wsrpc", WebSocketResource(factory))
site = server.Site(root)
# hook for `platformio-node-helpers`
if host == "__do_not_start__":
return
already_started = is_port_used(host, port)
home_url = "http://%s:%d" % (host, port)
if not no_open:
if already_started:
click.launch(home_url)
else:
reactor.callLater(1, lambda: click.launch(home_url))
click.echo(
"\n".join(
[
"",
" ___I_",
" /\\-_--\\ PlatformIO Home",
"/ \\_-__\\",
"|[]| [] | %s" % home_url,
"|__|____|______________%s" % ("_" * len(host)),
]
)
)
click.echo("")
click.echo("Open PlatformIO Home in your browser by this URL => %s" % home_url)
try:
reactor.listenTCP(port, site, interface=host)
except CannotListenError as e:
click.secho(str(e), fg="red", err=True)
already_started = True
if already_started:
click.secho(
"PlatformIO Home server is already started in another process.", fg="yellow"
)
return
click.echo("PIO Home has been started. Press Ctrl+C to shutdown.")
reactor.run()
def is_port_used(host, port):
socket.setdefaulttimeout(1)
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
if WINDOWS:
try:
s.bind((host, port))
s.close()
return False
except (OSError, socket.error):
pass
else:
try:
s.connect((host, port))
s.close()
except socket.error:
return False
return True
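is_port_used() probes the address differently per OS: on Windows it tries to bind (binding fails when the port is busy), elsewhere it tries to connect (connecting succeeds when the port is busy). A quick usage sketch against the default PIO Home address:

# Check whether another PIO Home instance already listens on the default address
if is_port_used("127.0.0.1", 8008):
    print("PIO Home is already running -> open http://127.0.0.1:8008")
else:
    print("Port 8008 is free; a new server can be started")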

View File

@@ -0,0 +1,51 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# pylint: disable=keyword-arg-before-vararg,arguments-differ,signature-differs
import requests
from twisted.internet import defer # pylint: disable=import-error
from twisted.internet import reactor # pylint: disable=import-error
from twisted.internet import threads # pylint: disable=import-error
from platformio import util
from platformio.proc import where_is_program
class AsyncSession(requests.Session):
def __init__(self, n=None, *args, **kwargs):
if n:
pool = reactor.getThreadPool()
pool.adjustPoolsize(0, n)
super(AsyncSession, self).__init__(*args, **kwargs)
def request(self, *args, **kwargs):
func = super(AsyncSession, self).request
return threads.deferToThread(func, *args, **kwargs)
def wrap(self, *args, **kwargs): # pylint: disable=no-self-use
return defer.ensureDeferred(*args, **kwargs)
@util.memoized(expire="60s")
def requests_session():
return AsyncSession(n=5)
@util.memoized(expire="60s")
def get_core_fullpath():
return where_is_program(
"platformio" + (".exe" if "windows" in util.get_systype() else "")
)

View File

@@ -0,0 +1,13 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

View File

@@ -0,0 +1,13 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

View File

@@ -0,0 +1,29 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import jsonrpc # pylint: disable=import-error
from platformio.clients.account import AccountClient
class AccountRPC(object):
@staticmethod
def call_client(method, *args, **kwargs):
try:
client = AccountClient()
return getattr(client, method)(*args, **kwargs)
except Exception as e: # pylint: disable=bare-except
raise jsonrpc.exceptions.JSONRPCDispatchException(
code=4003, message="PIO Account Call Error", data=str(e)
)

View File

@@ -0,0 +1,82 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from __future__ import absolute_import
from os.path import join
from platformio import __version__, app, fs, util
from platformio.project.helpers import get_project_core_dir, is_platformio_project
class AppRPC(object):
APPSTATE_PATH = join(get_project_core_dir(), "homestate.json")
IGNORE_STORAGE_KEYS = [
"cid",
"coreVersion",
"coreSystype",
"coreCaller",
"coreSettings",
"homeDir",
"projectsDir",
]
@staticmethod
def load_state():
with app.State(AppRPC.APPSTATE_PATH, lock=True) as state:
storage = state.get("storage", {})
# base data
caller_id = app.get_session_var("caller_id")
storage["cid"] = app.get_cid()
storage["coreVersion"] = __version__
storage["coreSystype"] = util.get_systype()
storage["coreCaller"] = str(caller_id).lower() if caller_id else None
storage["coreSettings"] = {
name: {
"description": data["description"],
"default_value": data["value"],
"value": app.get_setting(name),
}
for name, data in app.DEFAULT_SETTINGS.items()
}
storage["homeDir"] = fs.expanduser("~")
storage["projectsDir"] = storage["coreSettings"]["projects_dir"]["value"]
# skip non-existing recent projects
storage["recentProjects"] = [
p for p in storage.get("recentProjects", []) if is_platformio_project(p)
]
state["storage"] = storage
state.modified = False # skip saving extra fields
return state.as_dict()
@staticmethod
def get_state():
return AppRPC.load_state()
@staticmethod
def save_state(state):
with app.State(AppRPC.APPSTATE_PATH, lock=True) as s:
s.clear()
s.update(state)
storage = s.get("storage", {})
for k in AppRPC.IGNORE_STORAGE_KEYS:
if k in storage:
del storage[k]
return True
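
A short sketch of the state round-trip implemented above: `load_state` merges the persisted `homestate.json` with injected core fields, and `save_state` strips those injected keys again before writing (the import path matches the one used elsewhere in this changeset):

```python
from platformio.commands.home.rpc.handlers.app import AppRPC

state = AppRPC.get_state()
storage = state["storage"]
print(storage["coreVersion"], storage["projectsDir"])

# Persist a change made by the front-end; keys listed in
# AppRPC.IGNORE_STORAGE_KEYS ("cid", "coreVersion", ...) are dropped on save.
storage["recentProjects"] = storage.get("recentProjects", [])
AppRPC.save_state(state)
```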

View File

@@ -0,0 +1,47 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import time
import jsonrpc # pylint: disable=import-error
from twisted.internet import defer # pylint: disable=import-error
class IDERPC(object):
def __init__(self):
self._queue = {}
def send_command(self, sid, command, params):
if not self._queue.get(sid):
raise jsonrpc.exceptions.JSONRPCDispatchException(
code=4005, message="PIO Home IDE agent is not started"
)
while self._queue[sid]:
self._queue[sid].pop().callback(
{"id": time.time(), "method": command, "params": params}
)
def listen_commands(self, sid=0):
if sid not in self._queue:
self._queue[sid] = []
self._queue[sid].append(defer.Deferred())
return self._queue[sid][-1]
def open_project(self, sid, project_dir):
return self.send_command(sid, "open_project", project_dir)
def open_text_document(self, sid, path, line=None, column=None):
return self.send_command(
sid, "open_text_document", dict(path=path, line=line, column=column)
)
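
The command queue above is easiest to see with a plain Deferred round-trip; this sketch runs without a reactor because Deferred callbacks fire synchronously (module path and project path are assumptions):

```python
from platformio.commands.home.rpc.handlers.ide import IDERPC  # assumed module path

ide = IDERPC()

# The IDE agent subscribes first and receives a Deferred that fires
# with the next command queued for this session id.
d = ide.listen_commands(sid=0)
d.addCallback(lambda cmd: print("agent received:", cmd["method"], cmd["params"]))

# PIO Home then asks the agent to open a project; the queued Deferred fires.
ide.open_project(sid=0, project_dir="/path/to/project")
```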

View File

@@ -0,0 +1,52 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import json
import time
from twisted.internet import defer, reactor # pylint: disable=import-error
from platformio.cache import ContentCache
from platformio.commands.home.rpc.handlers.os import OSRPC
class MiscRPC(object):
def load_latest_tweets(self, data_url):
cache_key = ContentCache.key_from_args(data_url, "tweets")
cache_valid = "180d"
with ContentCache() as cc:
cache_data = cc.get(cache_key)
if cache_data:
cache_data = json.loads(cache_data)
# automatically update cache in background every 12 hours
if cache_data["time"] < (time.time() - (3600 * 12)):
reactor.callLater(
5, self._preload_latest_tweets, data_url, cache_key, cache_valid
)
return cache_data["result"]
result = self._preload_latest_tweets(data_url, cache_key, cache_valid)
return result
@staticmethod
@defer.inlineCallbacks
def _preload_latest_tweets(data_url, cache_key, cache_valid):
result = json.loads((yield OSRPC.fetch_content(data_url)))
with ContentCache() as cc:
cc.set(
cache_key,
json.dumps({"time": int(time.time()), "result": result}),
cache_valid,
)
defer.returnValue(result)
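
The caching policy above keeps tweets for 180 days but schedules a background refresh once the entry is older than 12 hours. Stripped to its essence, the refresh rule is just a timestamp comparison; this standalone helper is illustrative and not part of the module:

```python
import time

CACHE_TTL_SECONDS = 3600 * 12  # background refresh threshold used above

def is_stale(cached_at):
    """True when the cached payload should be refreshed via reactor.callLater."""
    return cached_at < (time.time() - CACHE_TTL_SECONDS)
```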

View File

@@ -0,0 +1,163 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from __future__ import absolute_import
import io
import os
import shutil
from functools import cmp_to_key
import click
from twisted.internet import defer # pylint: disable=import-error
from platformio import __default_requests_timeout__, fs, util
from platformio.cache import ContentCache
from platformio.clients.http import ensure_internet_on
from platformio.commands.home import helpers
from platformio.compat import PY2, get_filesystem_encoding, glob_recursive
class OSRPC(object):
@staticmethod
@defer.inlineCallbacks
def fetch_content(uri, data=None, headers=None, cache_valid=None):
if not headers:
headers = {
"User-Agent": (
"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_12_6) "
"AppleWebKit/603.3.8 (KHTML, like Gecko) Version/10.1.2 "
"Safari/603.3.8"
)
}
cache_key = ContentCache.key_from_args(uri, data) if cache_valid else None
with ContentCache() as cc:
if cache_key:
result = cc.get(cache_key)
if result is not None:
defer.returnValue(result)
# check internet connectivity first to avoid hanging on a 60-second timeout
ensure_internet_on(raise_exception=True)
session = helpers.requests_session()
if data:
r = yield session.post(
uri, data=data, headers=headers, timeout=__default_requests_timeout__
)
else:
r = yield session.get(
uri, headers=headers, timeout=__default_requests_timeout__
)
r.raise_for_status()
result = r.text
if cache_valid:
with ContentCache() as cc:
cc.set(cache_key, result, cache_valid)
defer.returnValue(result)
def request_content(self, uri, data=None, headers=None, cache_valid=None):
if uri.startswith("http"):
return self.fetch_content(uri, data, headers, cache_valid)
if os.path.isfile(uri):
with io.open(uri, encoding="utf-8") as fp:
return fp.read()
return None
@staticmethod
def open_url(url):
return click.launch(url)
@staticmethod
def reveal_file(path):
return click.launch(
path.encode(get_filesystem_encoding()) if PY2 else path, locate=True
)
@staticmethod
def open_file(path):
return click.launch(path.encode(get_filesystem_encoding()) if PY2 else path)
@staticmethod
def is_file(path):
return os.path.isfile(path)
@staticmethod
def is_dir(path):
return os.path.isdir(path)
@staticmethod
def make_dirs(path):
return os.makedirs(path)
@staticmethod
def get_file_mtime(path):
return os.path.getmtime(path)
@staticmethod
def rename(src, dst):
return os.rename(src, dst)
@staticmethod
def copy(src, dst):
return shutil.copytree(src, dst, symlinks=True)
@staticmethod
def glob(pathnames, root=None):
if not isinstance(pathnames, list):
pathnames = [pathnames]
result = set()
for pathname in pathnames:
result |= set(
glob_recursive(os.path.join(root, pathname) if root else pathname)
)
return list(result)
@staticmethod
def list_dir(path):
def _cmp(x, y):
if x[1] and not y[1]:
return -1
if not x[1] and y[1]:
return 1
if x[0].lower() > y[0].lower():
return 1
if x[0].lower() < y[0].lower():
return -1
return 0
items = []
if path.startswith("~"):
path = fs.expanduser(path)
if not os.path.isdir(path):
return items
for item in os.listdir(path):
try:
item_is_dir = os.path.isdir(os.path.join(path, item))
if item_is_dir:
os.listdir(os.path.join(path, item))
items.append((item, item_is_dir))
except OSError:
pass
return sorted(items, key=cmp_to_key(_cmp))
@staticmethod
def get_logical_devices():
items = []
for item in util.get_logical_devices():
if item["name"]:
item["name"] = item["name"]
items.append(item)
return items
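
A usage sketch for the filesystem helpers above (paths and patterns are illustrative): `glob` accepts a single pattern or a list and resolves it against an optional `root`, while `list_dir` sorts directories first and silently skips unreadable entries:

```python
from platformio.commands.home.rpc.handlers.os import OSRPC

# Recursive glob rooted at a project directory
sources = OSRPC.glob(["src/**/*.cpp", "include/**/*.h"], root="/path/to/project")
print(len(sources), "matched files")

# Directory listing: (name, is_dir) tuples, directories first
for name, is_dir in OSRPC.list_dir("~/projects"):
    print(("[DIR] " if is_dir else "      ") + name)
```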

View File

@@ -0,0 +1,175 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from __future__ import absolute_import
import json
import os
import sys
from io import BytesIO, StringIO
import click
import jsonrpc # pylint: disable=import-error
from twisted.internet import defer # pylint: disable=import-error
from twisted.internet import threads # pylint: disable=import-error
from twisted.internet import utils # pylint: disable=import-error
from platformio import __main__, __version__, fs
from platformio.commands.home import helpers
from platformio.compat import (
PY2,
get_filesystem_encoding,
get_locale_encoding,
is_bytes,
string_types,
)
try:
from thread import get_ident as thread_get_ident
except ImportError:
from threading import get_ident as thread_get_ident
class MultiThreadingStdStream(object):
def __init__(self, parent_stream):
self._buffers = {thread_get_ident(): parent_stream}
def __getattr__(self, name):
thread_id = thread_get_ident()
self._ensure_thread_buffer(thread_id)
return getattr(self._buffers[thread_id], name)
def _ensure_thread_buffer(self, thread_id):
if thread_id not in self._buffers:
self._buffers[thread_id] = BytesIO() if PY2 else StringIO()
def write(self, value):
thread_id = thread_get_ident()
self._ensure_thread_buffer(thread_id)
if PY2 and isinstance(value, unicode): # pylint: disable=undefined-variable
value = value.encode()
return self._buffers[thread_id].write(
value.decode() if is_bytes(value) else value
)
def get_value_and_reset(self):
result = ""
try:
result = self.getvalue()
self.seek(0)
self.truncate(0)
except AttributeError:
pass
return result
class PIOCoreRPC(object):
@staticmethod
def version():
return __version__
@staticmethod
def setup_multithreading_std_streams():
if isinstance(sys.stdout, MultiThreadingStdStream):
return
PIOCoreRPC.thread_stdout = MultiThreadingStdStream(sys.stdout)
PIOCoreRPC.thread_stderr = MultiThreadingStdStream(sys.stderr)
sys.stdout = PIOCoreRPC.thread_stdout
sys.stderr = PIOCoreRPC.thread_stderr
@staticmethod
def call(args, options=None):
return defer.maybeDeferred(PIOCoreRPC._call_generator, args, options)
@staticmethod
@defer.inlineCallbacks
def _call_generator(args, options=None):
for i, arg in enumerate(args):
if isinstance(arg, string_types):
args[i] = arg.encode(get_filesystem_encoding()) if PY2 else arg
else:
args[i] = str(arg)
options = options or {}
to_json = "--json-output" in args
try:
if options.get("force_subprocess"):
result = yield PIOCoreRPC._call_subprocess(args, options)
defer.returnValue(PIOCoreRPC._process_result(result, to_json))
else:
result = yield PIOCoreRPC._call_inline(args, options)
try:
defer.returnValue(PIOCoreRPC._process_result(result, to_json))
except ValueError:
# fall-back to subprocess method
result = yield PIOCoreRPC._call_subprocess(args, options)
defer.returnValue(PIOCoreRPC._process_result(result, to_json))
except Exception as e: # pylint: disable=bare-except
raise jsonrpc.exceptions.JSONRPCDispatchException(
code=4003, message="PIO Core Call Error", data=str(e)
)
@staticmethod
def _call_inline(args, options):
PIOCoreRPC.setup_multithreading_std_streams()
cwd = options.get("cwd") or os.getcwd()
def _thread_task():
with fs.cd(cwd):
exit_code = __main__.main(["-c"] + args)
return (
PIOCoreRPC.thread_stdout.get_value_and_reset(),
PIOCoreRPC.thread_stderr.get_value_and_reset(),
exit_code,
)
return threads.deferToThread(_thread_task)
@staticmethod
def _call_subprocess(args, options):
cwd = (options or {}).get("cwd") or os.getcwd()
return utils.getProcessOutputAndValue(
helpers.get_core_fullpath(),
args,
path=cwd,
env={k: v for k, v in os.environ.items() if "%" not in k},
)
@staticmethod
def _process_result(result, to_json=False):
out, err, code = result
if out and is_bytes(out):
out = out.decode(get_locale_encoding())
if err and is_bytes(err):
err = err.decode(get_locale_encoding())
text = ("%s\n\n%s" % (out, err)).strip()
if code != 0:
raise Exception(text)
if not to_json:
return text
try:
return json.loads(out)
except ValueError as e:
click.secho("%s => `%s`" % (e, out), fg="red", err=True)
# if PIO Core prints unhandled warnings
for line in out.split("\n"):
line = line.strip()
if not line:
continue
try:
return json.loads(line)
except ValueError:
pass
raise e
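
A sketch of calling PIO Core through the wrapper above from a Twisted context, as PIO Home does; `--json-output` makes `_process_result` return parsed JSON, and `force_subprocess` keeps the caller's std streams untouched. The board query is illustrative:

```python
from twisted.internet import reactor
from platformio.commands.home.rpc.handlers.piocore import PIOCoreRPC

def on_result(boards):
    print("found %d boards" % len(boards))
    reactor.stop()

def on_error(failure):
    print("PIO Core call failed:", failure.getErrorMessage())
    reactor.stop()

d = PIOCoreRPC.call(["boards", "--json-output", "uno"],
                    options={"force_subprocess": True})
d.addCallbacks(on_result, on_error)
reactor.run()
```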

View File

@@ -0,0 +1,335 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from __future__ import absolute_import
import os
import shutil
import time
import jsonrpc # pylint: disable=import-error
from platformio import exception, fs
from platformio.commands.home.rpc.handlers.app import AppRPC
from platformio.commands.home.rpc.handlers.piocore import PIOCoreRPC
from platformio.compat import PY2, get_filesystem_encoding
from platformio.ide.projectgenerator import ProjectGenerator
from platformio.package.manager.platform import PlatformPackageManager
from platformio.project.config import ProjectConfig
from platformio.project.exception import ProjectError
from platformio.project.helpers import get_project_dir, is_platformio_project
from platformio.project.options import get_config_options_schema
class ProjectRPC(object):
@staticmethod
def config_call(init_kwargs, method, *args):
assert isinstance(init_kwargs, dict)
assert "path" in init_kwargs
project_dir = get_project_dir()
if os.path.isfile(init_kwargs["path"]):
project_dir = os.path.dirname(init_kwargs["path"])
with fs.cd(project_dir):
return getattr(ProjectConfig(**init_kwargs), method)(*args)
@staticmethod
def config_load(path):
return ProjectConfig(
path, parse_extra=False, expand_interpolations=False
).as_tuple()
@staticmethod
def config_dump(path, data):
config = ProjectConfig(path, parse_extra=False, expand_interpolations=False)
config.update(data, clear=True)
return config.save()
@staticmethod
def config_update_description(path, text):
config = ProjectConfig(path, parse_extra=False, expand_interpolations=False)
if not config.has_section("platformio"):
config.add_section("platformio")
if text:
config.set("platformio", "description", text)
else:
if config.has_option("platformio", "description"):
config.remove_option("platformio", "description")
if not config.options("platformio"):
config.remove_section("platformio")
return config.save()
@staticmethod
def get_config_schema():
return get_config_options_schema()
@staticmethod
def get_projects():
def _get_project_data():
data = {"boards": [], "envLibdepsDirs": [], "libExtraDirs": []}
config = ProjectConfig()
data["envs"] = config.envs()
data["description"] = config.get("platformio", "description")
data["libExtraDirs"].extend(config.get("platformio", "lib_extra_dirs", []))
libdeps_dir = config.get_optional_dir("libdeps")
for section in config.sections():
if not section.startswith("env:"):
continue
data["envLibdepsDirs"].append(os.path.join(libdeps_dir, section[4:]))
if config.has_option(section, "board"):
data["boards"].append(config.get(section, "board"))
data["libExtraDirs"].extend(config.get(section, "lib_extra_dirs", []))
# skip non-existent folders and resolve full paths
for key in ("envLibdepsDirs", "libExtraDirs"):
data[key] = [
fs.expanduser(d) if d.startswith("~") else os.path.realpath(d)
for d in data[key]
if os.path.isdir(d)
]
return data
def _path_to_name(path):
return (os.path.sep).join(path.split(os.path.sep)[-2:])
result = []
pm = PlatformPackageManager()
for project_dir in AppRPC.load_state()["storage"]["recentProjects"]:
if not os.path.isdir(project_dir):
continue
data = {}
boards = []
try:
with fs.cd(project_dir):
data = _get_project_data()
except ProjectError:
continue
for board_id in data.get("boards", []):
name = board_id
try:
name = pm.board_config(board_id)["name"]
except exception.PlatformioException:
pass
boards.append({"id": board_id, "name": name})
result.append(
{
"path": project_dir,
"name": _path_to_name(project_dir),
"modified": int(os.path.getmtime(project_dir)),
"boards": boards,
"description": data.get("description"),
"envs": data.get("envs", []),
"envLibStorages": [
{"name": os.path.basename(d), "path": d}
for d in data.get("envLibdepsDirs", [])
],
"extraLibStorages": [
{"name": _path_to_name(d), "path": d}
for d in data.get("libExtraDirs", [])
],
}
)
return result
@staticmethod
def get_project_examples():
result = []
pm = PlatformPackageManager()
for pkg in pm.get_installed():
examples_dir = os.path.join(pkg.path, "examples")
if not os.path.isdir(examples_dir):
continue
items = []
for project_dir, _, __ in os.walk(examples_dir):
project_description = None
try:
config = ProjectConfig(os.path.join(project_dir, "platformio.ini"))
config.validate(silent=True)
project_description = config.get("platformio", "description")
except ProjectError:
continue
path_tokens = project_dir.split(os.path.sep)
items.append(
{
"name": "/".join(
path_tokens[path_tokens.index("examples") + 1 :]
),
"path": project_dir,
"description": project_description,
}
)
manifest = pm.load_manifest(pkg)
result.append(
{
"platform": {
"title": manifest["title"],
"version": manifest["version"],
},
"items": sorted(items, key=lambda item: item["name"]),
}
)
return sorted(result, key=lambda data: data["platform"]["title"])
def init(self, board, framework, project_dir):
assert project_dir
state = AppRPC.load_state()
if not os.path.isdir(project_dir):
os.makedirs(project_dir)
args = ["init", "--board", board]
if framework:
args.extend(["--project-option", "framework = %s" % framework])
if (
state["storage"]["coreCaller"]
and state["storage"]["coreCaller"] in ProjectGenerator.get_supported_ides()
):
args.extend(["--ide", state["storage"]["coreCaller"]])
d = PIOCoreRPC.call(
args, options={"cwd": project_dir, "force_subprocess": True}
)
d.addCallback(self._generate_project_main, project_dir, framework)
return d
@staticmethod
def _generate_project_main(_, project_dir, framework):
main_content = None
if framework == "arduino":
main_content = "\n".join(
[
"#include <Arduino.h>",
"",
"void setup() {",
" // put your setup code here, to run once:",
"}",
"",
"void loop() {",
" // put your main code here, to run repeatedly:",
"}",
"",
]
)
elif framework == "mbed":
main_content = "\n".join(
[
"#include <mbed.h>",
"",
"int main() {",
"",
" // put your setup code here, to run once:",
"",
" while(1) {",
" // put your main code here, to run repeatedly:",
" }",
"}",
"",
]
)
if not main_content:
return project_dir
with fs.cd(project_dir):
config = ProjectConfig()
src_dir = config.get_optional_dir("src")
main_path = os.path.join(src_dir, "main.cpp")
if os.path.isfile(main_path):
return project_dir
if not os.path.isdir(src_dir):
os.makedirs(src_dir)
with open(main_path, "w") as fp:
fp.write(main_content.strip())
return project_dir
def import_arduino(self, board, use_arduino_libs, arduino_project_dir):
board = str(board)
if arduino_project_dir and PY2:
arduino_project_dir = arduino_project_dir.encode(get_filesystem_encoding())
# don't import PIO Project
if is_platformio_project(arduino_project_dir):
return arduino_project_dir
is_arduino_project = any(
[
os.path.isfile(
os.path.join(
arduino_project_dir,
"%s.%s" % (os.path.basename(arduino_project_dir), ext),
)
)
for ext in ("ino", "pde")
]
)
if not is_arduino_project:
raise jsonrpc.exceptions.JSONRPCDispatchException(
code=4000, message="Not an Arduino project: %s" % arduino_project_dir
)
state = AppRPC.load_state()
project_dir = os.path.join(
state["storage"]["projectsDir"], time.strftime("%y%m%d-%H%M%S-") + board
)
if not os.path.isdir(project_dir):
os.makedirs(project_dir)
args = ["init", "--board", board]
args.extend(["--project-option", "framework = arduino"])
if use_arduino_libs:
args.extend(
["--project-option", "lib_extra_dirs = ~/Documents/Arduino/libraries"]
)
if (
state["storage"]["coreCaller"]
and state["storage"]["coreCaller"] in ProjectGenerator.get_supported_ides()
):
args.extend(["--ide", state["storage"]["coreCaller"]])
d = PIOCoreRPC.call(
args, options={"cwd": project_dir, "force_subprocess": True}
)
d.addCallback(self._finalize_arduino_import, project_dir, arduino_project_dir)
return d
@staticmethod
def _finalize_arduino_import(_, project_dir, arduino_project_dir):
with fs.cd(project_dir):
config = ProjectConfig()
src_dir = config.get_optional_dir("src")
if os.path.isdir(src_dir):
fs.rmtree(src_dir)
shutil.copytree(arduino_project_dir, src_dir, symlinks=True)
return project_dir
@staticmethod
def import_pio(project_dir):
if not project_dir or not is_platformio_project(project_dir):
raise jsonrpc.exceptions.JSONRPCDispatchException(
code=4001, message="Not a PlatformIO project: %s" % project_dir
)
new_project_dir = os.path.join(
AppRPC.load_state()["storage"]["projectsDir"],
time.strftime("%y%m%d-%H%M%S-") + os.path.basename(project_dir),
)
shutil.copytree(project_dir, new_project_dir, symlinks=True)
state = AppRPC.load_state()
args = ["init"]
if (
state["storage"]["coreCaller"]
and state["storage"]["coreCaller"] in ProjectGenerator.get_supported_ides()
):
args.extend(["--ide", state["storage"]["coreCaller"]])
d = PIOCoreRPC.call(
args, options={"cwd": new_project_dir, "force_subprocess": True}
)
d.addCallback(lambda _: new_project_dir)
return d
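
The project handler drives PIO Core asynchronously. A minimal sketch of kicking off `init` follows; the board id, framework, and path are illustrative, and a running reactor plus a valid PIO Home state are assumed:

```python
from twisted.internet import reactor
from platformio.commands.home.rpc.handlers.project import ProjectRPC

def on_done(project_dir):
    print("project ready:", project_dir)  # main.cpp stub generated for Arduino
    reactor.stop()

def on_error(failure):
    print("init failed:", failure.getErrorMessage())
    reactor.stop()

d = ProjectRPC().init(board="uno", framework="arduino",
                      project_dir="/tmp/pio-home-demo")
d.addCallbacks(on_done, on_error)
reactor.run()
```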

View File

@@ -0,0 +1,101 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# pylint: disable=import-error
import click
import jsonrpc
from autobahn.twisted.websocket import WebSocketServerFactory, WebSocketServerProtocol
from jsonrpc.exceptions import JSONRPCDispatchException
from twisted.internet import defer, reactor
from platformio.compat import PY2, dump_json_to_unicode, is_bytes
class JSONRPCServerProtocol(WebSocketServerProtocol):
def onOpen(self):
self.factory.connection_nums += 1
if self.factory.shutdown_timer:
self.factory.shutdown_timer.cancel()
self.factory.shutdown_timer = None
def onClose(self, wasClean, code, reason): # pylint: disable=unused-argument
self.factory.connection_nums -= 1
if self.factory.connection_nums == 0:
self.factory.shutdownByTimeout()
def onMessage(self, payload, isBinary): # pylint: disable=unused-argument
# click.echo("> %s" % payload)
response = jsonrpc.JSONRPCResponseManager.handle(
payload, self.factory.dispatcher
).data
# if error
if "result" not in response:
self.sendJSONResponse(response)
return None
d = defer.maybeDeferred(lambda: response["result"])
d.addCallback(self._callback, response)
d.addErrback(self._errback, response)
return None
def _callback(self, result, response):
response["result"] = result
self.sendJSONResponse(response)
def _errback(self, failure, response):
if isinstance(failure.value, JSONRPCDispatchException):
e = failure.value
else:
e = JSONRPCDispatchException(code=4999, message=failure.getErrorMessage())
del response["result"]
response["error"] = e.error._data # pylint: disable=protected-access
self.sendJSONResponse(response)
def sendJSONResponse(self, response):
# click.echo("< %s" % response)
if "error" in response:
click.secho("Error: %s" % response["error"], fg="red", err=True)
response = dump_json_to_unicode(response)
if not PY2 and not is_bytes(response):
response = response.encode("utf-8")
self.sendMessage(response)
class JSONRPCServerFactory(WebSocketServerFactory):
protocol = JSONRPCServerProtocol
connection_nums = 0
shutdown_timer = 0
def __init__(self, shutdown_timeout=0):
super(JSONRPCServerFactory, self).__init__()
self.shutdown_timeout = shutdown_timeout
self.dispatcher = jsonrpc.Dispatcher()
def shutdownByTimeout(self):
if self.shutdown_timeout < 1:
return
def _auto_shutdown_server():
click.echo("Automatically shutdown server on timeout")
reactor.stop()
self.shutdown_timer = reactor.callLater(
self.shutdown_timeout, _auto_shutdown_server
)
def addHandler(self, handler, namespace):
self.dispatcher.build_method_map(handler, prefix="%s." % namespace)
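
A simplified, illustrative wiring of the factory above: handlers are registered under namespaces and served over TCP. The actual `pio home` command embeds the factory in its web application, so the port and the subset of handlers here are assumptions:

```python
from twisted.internet import reactor

from platformio.commands.home.rpc.handlers.app import AppRPC
from platformio.commands.home.rpc.handlers.piocore import PIOCoreRPC
from platformio.commands.home.rpc.server import JSONRPCServerFactory  # assumed module path

factory = JSONRPCServerFactory(shutdown_timeout=0)  # 0 disables the idle shutdown
factory.addHandler(AppRPC(), namespace="app")       # exposed as "app.get_state", ...
factory.addHandler(PIOCoreRPC(), namespace="core")  # exposed as "core.call", ...

reactor.listenTCP(8008, factory)  # illustrative port
reactor.run()
```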

View File

@@ -0,0 +1,28 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from twisted.internet import reactor # pylint: disable=import-error
from twisted.web import static # pylint: disable=import-error
class WebRoot(static.File):
def render_GET(self, request):
if request.args.get(b"__shutdown__", False):
reactor.stop()
return "Server has been stopped"
request.setHeader("cache-control", "no-cache, no-store, must-revalidate")
request.setHeader("pragma", "no-cache")
request.setHeader("expires", "0")
return static.File.render_GET(self, request)
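
A sketch of serving static content through the class above with twisted.web; the module path (`platformio.commands.home.web`) and the directory are assumptions. Requesting `/?__shutdown__` would stop the reactor, as implemented in `render_GET`:

```python
from twisted.internet import reactor
from twisted.web import server

from platformio.commands.home.web import WebRoot  # assumed module path

root = WebRoot("/path/to/pio-home/dist")  # hypothetical front-end build output
reactor.listenTCP(8008, server.Site(root))
reactor.run()
```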

View File

@@ -1,423 +0,0 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# pylint: disable=too-many-arguments,too-many-locals, too-many-branches
from os import getcwd, makedirs
from os.path import isdir, isfile, join
from shutil import copyfile
import click
from platformio import exception, util
from platformio.commands.platform import \
platform_install as cli_platform_install
from platformio.ide.projectgenerator import ProjectGenerator
from platformio.managers.platform import PlatformManager
def validate_boards(ctx, param, value): # pylint: disable=W0613
pm = PlatformManager()
for id_ in value:
try:
pm.board_config(id_)
except exception.UnknownBoard:
raise click.BadParameter(
"`%s`. Please search for board ID using `platformio boards` "
"command" % id_)
return value
@click.command(
"init", short_help="Initialize PlatformIO project or update existing")
@click.option(
"--project-dir",
"-d",
default=getcwd,
type=click.Path(
exists=True,
file_okay=False,
dir_okay=True,
writable=True,
resolve_path=True))
@click.option(
"-b", "--board", multiple=True, metavar="ID", callback=validate_boards)
@click.option(
"--ide", type=click.Choice(ProjectGenerator.get_supported_ides()))
@click.option("-O", "--project-option", multiple=True)
@click.option("--env-prefix", default="")
@click.option("-s", "--silent", is_flag=True)
@click.pass_context
def cli(
ctx, # pylint: disable=R0913
project_dir,
board,
ide,
project_option,
env_prefix,
silent):
if not silent:
if project_dir == getcwd():
click.secho(
"\nThe current working directory", fg="yellow", nl=False)
click.secho(" %s " % project_dir, fg="cyan", nl=False)
click.secho("will be used for the project.", fg="yellow")
click.echo("")
click.echo("The next files/directories have been created in %s" %
click.style(project_dir, fg="cyan"))
click.echo("%s - Put project header files here" % click.style(
"include", fg="cyan"))
click.echo("%s - Put here project specific (private) libraries" %
click.style("lib", fg="cyan"))
click.echo("%s - Put project source files here" % click.style(
"src", fg="cyan"))
click.echo("%s - Project Configuration File" % click.style(
"platformio.ini", fg="cyan"))
is_new_project = not util.is_platformio_project(project_dir)
init_base_project(project_dir)
if board:
fill_project_envs(ctx, project_dir, board, project_option, env_prefix,
ide is not None)
if ide:
env_name = get_best_envname(project_dir, board)
if not env_name:
raise exception.BoardNotDefined()
pg = ProjectGenerator(project_dir, ide, env_name)
pg.generate()
if is_new_project:
init_ci_conf(project_dir)
init_cvs_ignore(project_dir)
if silent:
return
if ide:
click.secho(
"\nProject has been successfully %s including configuration files "
"for `%s` IDE." % ("initialized" if is_new_project else "updated",
ide),
fg="green")
else:
click.secho(
"\nProject has been successfully %s! Useful commands:\n"
"`pio run` - process/build project from the current directory\n"
"`pio run --target upload` or `pio run -t upload` "
"- upload firmware to a target\n"
"`pio run --target clean` - clean project (remove compiled files)"
"\n`pio run --help` - additional information" %
("initialized" if is_new_project else "updated"),
fg="green")
def get_best_envname(project_dir, boards=None):
config = util.load_project_config(project_dir)
env_default = None
if config.has_option("platformio", "env_default"):
env_default = config.get("platformio",
"env_default").split(", ")[0].strip()
if env_default:
return env_default
section = None
for section in config.sections():
if not section.startswith("env:"):
continue
elif config.has_option(section, "board") and (not boards or config.get(
section, "board") in boards):
break
return section[4:] if section else None
def init_base_project(project_dir):
if util.is_platformio_project(project_dir):
return
copyfile(
join(util.get_source_dir(), "projectconftpl.ini"),
join(project_dir, "platformio.ini"))
with util.cd(project_dir):
dir_to_readme = [
(util.get_projectsrc_dir(), None),
(util.get_projectinclude_dir(), init_include_readme),
(util.get_projectlib_dir(), init_lib_readme),
(util.get_projecttest_dir(), init_test_readme),
]
for (path, cb) in dir_to_readme:
if isdir(path):
continue
makedirs(path)
if cb:
cb(path)
def init_include_readme(include_dir):
with open(join(include_dir, "README"), "w") as f:
f.write("""
This directory is intended for project header files.
A header file is a file containing C declarations and macro definitions
to be shared between several project source files. You request the use of a
header file in your project source file (C, C++, etc.) located in the `src` folder
by including it with the C preprocessing directive `#include`.
```src/main.c
#include "header.h"
int main (void)
{
...
}
```
Including a header file produces the same results as copying the header file
into each source file that needs it. Such copying would be time-consuming
and error-prone. With a header file, the related declarations appear
in only one place. If they need to be changed, they can be changed in one
place, and programs that include the header file will automatically use the
new version when next recompiled. The header file eliminates the labor of
finding and changing all the copies as well as the risk that a failure to
find one copy will result in inconsistencies within a program.
In C, the usual convention is to give header files names that end with `.h`.
It is most portable to use only letters, digits, dashes, and underscores in
header file names, and at most one dot.
Read more about using header files in official GCC documentation:
* Include Syntax
* Include Operation
* Once-Only Headers
* Computed Includes
https://gcc.gnu.org/onlinedocs/cpp/Header-Files.html
""")
def init_lib_readme(lib_dir):
with open(join(lib_dir, "README"), "w") as f:
f.write("""
This directory is intended for project-specific (private) libraries.
PlatformIO will compile them to static libraries and link them into the executable file.
The source code of each library should be placed in its own separate directory
("lib/your_library_name/[here are source files]").
For example, see a structure of the following two libraries `Foo` and `Bar`:
|--lib
| |
| |--Bar
| | |--docs
| | |--examples
| | |--src
| | |- Bar.c
| | |- Bar.h
| | |- library.json (optional, custom build options, etc) https://docs.platformio.org/page/librarymanager/config.html
| |
| |--Foo
| | |- Foo.c
| | |- Foo.h
| |
| |- README --> THIS FILE
|
|- platformio.ini
|--src
|- main.c
and a contents of `src/main.c`:
```
#include <Foo.h>
#include <Bar.h>
int main (void)
{
...
}
```
PlatformIO's Library Dependency Finder will automatically find dependent
libraries by scanning project source files.
More information about the PlatformIO Library Dependency Finder:
- https://docs.platformio.org/page/librarymanager/ldf.html
""")
def init_test_readme(test_dir):
with open(join(test_dir, "README"), "w") as f:
f.write("""
This directory is intended for PIO Unit Testing and project tests.
Unit Testing is a software testing method by which individual units of
source code, sets of one or more MCU program modules together with associated
control data, usage procedures, and operating procedures, are tested to
determine whether they are fit for use. Unit testing finds problems early
in the development cycle.
More information about PIO Unit Testing:
- https://docs.platformio.org/page/plus/unit-testing.html
""")
def init_ci_conf(project_dir):
conf_path = join(project_dir, ".travis.yml")
if isfile(conf_path):
return
with open(conf_path, "w") as f:
f.write("""# Continuous Integration (CI) is the practice, in software
# engineering, of merging all developer working copies with a shared mainline
# several times a day < https://docs.platformio.org/page/ci/index.html >
#
# Documentation:
#
# * Travis CI Embedded Builds with PlatformIO
# < https://docs.travis-ci.com/user/integration/platformio/ >
#
# * PlatformIO integration with Travis CI
# < https://docs.platformio.org/page/ci/travis.html >
#
# * User Guide for `platformio ci` command
# < https://docs.platformio.org/page/userguide/cmd_ci.html >
#
#
# Please choose one of the following templates (proposed below) and uncomment
# it (remove "# " before each line) or use own configuration according to the
# Travis CI documentation (see above).
#
#
# Template #1: General project. Test it using existing `platformio.ini`.
#
# language: python
# python:
# - "2.7"
#
# sudo: false
# cache:
# directories:
# - "~/.platformio"
#
# install:
# - pip install -U platformio
# - platformio update
#
# script:
# - platformio run
#
# Template #2: The project is intended to be used as a library with examples.
#
# language: python
# python:
# - "2.7"
#
# sudo: false
# cache:
# directories:
# - "~/.platformio"
#
# env:
# - PLATFORMIO_CI_SRC=path/to/test/file.c
# - PLATFORMIO_CI_SRC=examples/file.ino
# - PLATFORMIO_CI_SRC=path/to/test/directory
#
# install:
# - pip install -U platformio
# - platformio update
#
# script:
# - platformio ci --lib="." --board=ID_1 --board=ID_2 --board=ID_N
""")
def init_cvs_ignore(project_dir):
conf_path = join(project_dir, ".gitignore")
if isfile(conf_path):
return
with open(conf_path, "w") as fp:
fp.writelines([".pio\n", ".pioenvs\n", ".piolibdeps\n"])
def fill_project_envs(ctx, project_dir, board_ids, project_option, env_prefix,
force_download):
content = []
used_boards = []
used_platforms = []
config = util.load_project_config(project_dir)
for section in config.sections():
cond = [
section.startswith("env:"),
config.has_option(section, "board")
]
if all(cond):
used_boards.append(config.get(section, "board"))
pm = PlatformManager()
for id_ in board_ids:
board_config = pm.board_config(id_)
used_platforms.append(board_config['platform'])
if id_ in used_boards:
continue
used_boards.append(id_)
envopts = {"platform": board_config['platform'], "board": id_}
# find default framework for board
frameworks = board_config.get("frameworks")
if frameworks:
envopts['framework'] = frameworks[0]
for item in project_option:
if "=" not in item:
continue
_name, _value = item.split("=", 1)
envopts[_name.strip()] = _value.strip()
content.append("")
content.append("[env:%s%s]" % (env_prefix, id_))
for name, value in envopts.items():
content.append("%s = %s" % (name, value))
if force_download and used_platforms:
_install_dependent_platforms(ctx, used_platforms)
if not content:
return
with open(join(project_dir, "platformio.ini"), "a") as f:
content.append("")
f.write("\n".join(content))
def _install_dependent_platforms(ctx, platforms):
installed_platforms = [
p['name'] for p in PlatformManager().get_installed()
]
if set(platforms) <= set(installed_platforms):
return
ctx.invoke(
cli_platform_install,
platforms=list(set(platforms) - set(installed_platforms)))

View File

@@ -1,472 +0,0 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# pylint: disable=too-many-branches, too-many-locals
import json
import time
from os.path import isdir, join
from urllib import quote
import click
from platformio import exception, util
from platformio.managers.lib import LibraryManager, get_builtin_libs
from platformio.util import get_api_result
@click.group(short_help="Library Manager")
@click.option(
"-g",
"--global",
is_flag=True,
help="Manage global PlatformIO library storage")
@click.option(
"-d",
"--storage-dir",
default=None,
type=click.Path(
exists=True,
file_okay=False,
dir_okay=True,
writable=True,
resolve_path=True),
help="Manage custom library storage")
@click.pass_context
def cli(ctx, **options):
non_storage_cmds = ("search", "show", "register", "stats", "builtin")
# skip commands that don't need storage folder
if ctx.invoked_subcommand in non_storage_cmds or \
(len(ctx.args) == 2 and ctx.args[1] in ("-h", "--help")):
return
storage_dir = options['storage_dir']
if not storage_dir:
if options['global']:
storage_dir = join(util.get_home_dir(), "lib")
elif util.is_platformio_project():
storage_dir = util.get_projectlibdeps_dir()
elif util.is_ci():
storage_dir = join(util.get_home_dir(), "lib")
click.secho(
"Warning! Global library storage is used automatically. "
"Please use `platformio lib --global %s` command to remove "
"this warning." % ctx.invoked_subcommand,
fg="yellow")
elif util.is_platformio_project(storage_dir):
with util.cd(storage_dir):
storage_dir = util.get_projectlibdeps_dir()
if not storage_dir and not util.is_platformio_project():
raise exception.NotGlobalLibDir(util.get_project_dir(),
join(util.get_home_dir(), "lib"),
ctx.invoked_subcommand)
ctx.obj = LibraryManager(storage_dir)
if "--json-output" not in ctx.args:
click.echo("Library Storage: " + storage_dir)
@cli.command("install", short_help="Install library")
@click.argument("libraries", required=False, nargs=-1, metavar="[LIBRARY...]")
# @click.option(
# "--save",
# is_flag=True,
# help="Save installed libraries into the project's platformio.ini "
# "library dependencies")
@click.option(
"-s", "--silent", is_flag=True, help="Suppress progress reporting")
@click.option(
"--interactive",
is_flag=True,
help="Allow to make a choice for all prompts")
@click.option(
"-f",
"--force",
is_flag=True,
help="Reinstall/redownload library if exists")
@click.pass_obj
def lib_install(lm, libraries, silent, interactive, force):
# @TODO: "save" option
for library in libraries:
lm.install(
library, silent=silent, interactive=interactive, force=force)
@cli.command("uninstall", short_help="Uninstall libraries")
@click.argument("libraries", nargs=-1, metavar="[LIBRARY...]")
@click.pass_obj
def lib_uninstall(lm, libraries):
for library in libraries:
lm.uninstall(library)
@cli.command("update", short_help="Update installed libraries")
@click.argument("libraries", required=False, nargs=-1, metavar="[LIBRARY...]")
@click.option(
"-c",
"--only-check",
is_flag=True,
help="Do not update, only check for new version")
@click.option("--json-output", is_flag=True)
@click.pass_obj
def lib_update(lm, libraries, only_check, json_output):
if not libraries:
libraries = [manifest['__pkg_dir'] for manifest in lm.get_installed()]
if only_check and json_output:
result = []
for library in libraries:
pkg_dir = library if isdir(library) else None
requirements = None
url = None
if not pkg_dir:
name, requirements, url = lm.parse_pkg_uri(library)
pkg_dir = lm.get_package_dir(name, requirements, url)
if not pkg_dir:
continue
latest = lm.outdated(pkg_dir, requirements)
if not latest:
continue
manifest = lm.load_manifest(pkg_dir)
manifest['versionLatest'] = latest
result.append(manifest)
return click.echo(json.dumps(result))
else:
for library in libraries:
lm.update(library, only_check=only_check)
return True
def print_lib_item(item):
click.secho(item['name'], fg="cyan")
click.echo("=" * len(item['name']))
if "id" in item:
click.secho("#ID: %d" % item['id'], bold=True)
if "description" in item or "url" in item:
click.echo(item.get("description", item.get("url", "")))
click.echo()
for key in ("version", "homepage", "license", "keywords"):
if key not in item or not item[key]:
continue
if isinstance(item[key], list):
click.echo("%s: %s" % (key.title(), ", ".join(item[key])))
else:
click.echo("%s: %s" % (key.title(), item[key]))
for key in ("frameworks", "platforms"):
if key not in item:
continue
click.echo("Compatible %s: %s" % (key, ", ".join(
[i['title'] if isinstance(i, dict) else i for i in item[key]])))
if "authors" in item or "authornames" in item:
click.echo("Authors: %s" % ", ".join(
item.get("authornames",
[a.get("name", "") for a in item.get("authors", [])])))
if "__src_url" in item:
click.secho("Source: %s" % item['__src_url'])
click.echo()
@cli.command("search", short_help="Search for a library")
@click.argument("query", required=False, nargs=-1)
@click.option("--json-output", is_flag=True)
@click.option("--page", type=click.INT, default=1)
@click.option("--id", multiple=True)
@click.option("-n", "--name", multiple=True)
@click.option("-a", "--author", multiple=True)
@click.option("-k", "--keyword", multiple=True)
@click.option("-f", "--framework", multiple=True)
@click.option("-p", "--platform", multiple=True)
@click.option("-i", "--header", multiple=True)
@click.option(
"--noninteractive",
is_flag=True,
help="Do not prompt, automatically paginate with delay")
def lib_search(query, json_output, page, noninteractive, **filters):
if not query:
query = []
if not isinstance(query, list):
query = list(query)
for key, values in filters.items():
for value in values:
query.append('%s:"%s"' % (key, value))
result = get_api_result(
"/v2/lib/search",
dict(query=" ".join(query), page=page),
cache_valid="1d")
if json_output:
click.echo(json.dumps(result))
return
if result['total'] == 0:
click.secho(
"Nothing was found for your request\n"
"Try a less specific search or use the truncation (wildcard) "
"operator",
fg="yellow",
nl=False)
click.secho(" *", fg="green")
click.secho("For example: DS*, PCA*, DHT*, etc.\n", fg="yellow")
click.echo("For more examples and advanced search syntax, "
"please use documentation:")
click.secho(
"https://docs.platformio.org/page/userguide/lib/cmd_search.html\n",
fg="cyan")
return
click.secho(
"Found %d libraries:\n" % result['total'],
fg="green" if result['total'] else "yellow")
while True:
for item in result['items']:
print_lib_item(item)
if (int(result['page']) * int(result['perpage']) >= int(
result['total'])):
break
if noninteractive:
click.echo()
click.secho(
"Loading next %d libraries... Press Ctrl+C to stop!" %
result['perpage'],
fg="yellow")
click.echo()
time.sleep(5)
elif not click.confirm("Show next libraries?"):
break
result = get_api_result(
"/v2/lib/search", {
"query": " ".join(query),
"page": int(result['page']) + 1
},
cache_valid="1d")
@cli.command("list", short_help="List installed libraries")
@click.option("--json-output", is_flag=True)
@click.pass_obj
def lib_list(lm, json_output):
items = lm.get_installed()
if json_output:
return click.echo(json.dumps(items))
if not items:
return None
for item in sorted(items, key=lambda i: i['name']):
print_lib_item(item)
return True
@cli.command("builtin", short_help="List built-in libraries")
@click.option("--storage", multiple=True)
@click.option("--json-output", is_flag=True)
def lib_builtin(storage, json_output):
items = get_builtin_libs(storage)
if json_output:
return click.echo(json.dumps(items))
for storage_ in items:
if not storage_['items']:
continue
click.secho(storage_['name'], fg="green")
click.echo("*" * len(storage_['name']))
click.echo()
for item in sorted(storage_['items'], key=lambda i: i['name']):
print_lib_item(item)
return True
@cli.command("show", short_help="Show detailed info about a library")
@click.argument("library", metavar="[LIBRARY]")
@click.option("--json-output", is_flag=True)
def lib_show(library, json_output):
lm = LibraryManager()
name, requirements, _ = lm.parse_pkg_uri(library)
lib_id = lm.search_lib_id({
"name": name,
"requirements": requirements
},
silent=json_output,
interactive=not json_output)
lib = get_api_result("/lib/info/%d" % lib_id, cache_valid="1d")
if json_output:
return click.echo(json.dumps(lib))
click.secho(lib['name'], fg="cyan")
click.echo("=" * len(lib['name']))
click.secho("#ID: %d" % lib['id'], bold=True)
click.echo(lib['description'])
click.echo()
click.echo(
"Version: %s, released %s" %
(lib['version']['name'],
time.strftime("%c", util.parse_date(lib['version']['released']))))
click.echo("Manifest: %s" % lib['confurl'])
for key in ("homepage", "repository", "license"):
if key not in lib or not lib[key]:
continue
if isinstance(lib[key], list):
click.echo("%s: %s" % (key.title(), ", ".join(lib[key])))
else:
click.echo("%s: %s" % (key.title(), lib[key]))
blocks = []
_authors = []
for author in lib.get("authors", []):
_data = []
for key in ("name", "email", "url", "maintainer"):
if not author[key]:
continue
if key == "email":
_data.append("<%s>" % author[key])
elif key == "maintainer":
_data.append("(maintainer)")
else:
_data.append(author[key])
_authors.append(" ".join(_data))
if _authors:
blocks.append(("Authors", _authors))
blocks.append(("Keywords", lib['keywords']))
for key in ("frameworks", "platforms"):
if key not in lib or not lib[key]:
continue
blocks.append(("Compatible %s" % key, [i['title'] for i in lib[key]]))
blocks.append(("Headers", lib['headers']))
blocks.append(("Examples", lib['examples']))
blocks.append(("Versions", [
"%s, released %s" %
(v['name'], time.strftime("%c", util.parse_date(v['released'])))
for v in lib['versions']
]))
blocks.append(("Unique Downloads", [
"Today: %s" % lib['dlstats']['day'],
"Week: %s" % lib['dlstats']['week'],
"Month: %s" % lib['dlstats']['month']
]))
for (title, rows) in blocks:
click.echo()
click.secho(title, bold=True)
click.echo("-" * len(title))
for row in rows:
click.echo(row)
return True
@cli.command("register", short_help="Register a new library")
@click.argument("config_url")
def lib_register(config_url):
if (not config_url.startswith("http://")
and not config_url.startswith("https://")):
raise exception.InvalidLibConfURL(config_url)
result = get_api_result("/lib/register", data=dict(config_url=config_url))
if "message" in result and result['message']:
click.secho(
result['message'],
fg="green"
if "successed" in result and result['successed'] else "red")
@cli.command("stats", short_help="Library Registry Statistics")
@click.option("--json-output", is_flag=True)
def lib_stats(json_output):
result = get_api_result("/lib/stats", cache_valid="1h")
if json_output:
return click.echo(json.dumps(result))
printitem_tpl = "{name:<33} {url}"
printitemdate_tpl = "{name:<33} {date:23} {url}"
def _print_title(title):
click.secho(title.upper(), bold=True)
click.echo("*" * len(title))
def _print_header(with_date=False):
click.echo((printitemdate_tpl if with_date else printitem_tpl).format(
name=click.style("Name", fg="cyan"),
date="Date",
url=click.style("Url", fg="blue")))
terminal_width, _ = click.get_terminal_size()
click.echo("-" * terminal_width)
def _print_lib_item(item):
date = str(
time.strftime("%c", util.parse_date(item['date'])) if "date" in
item else "")
url = click.style(
"https://platformio.org/lib/show/%s/%s" % (item['id'],
quote(item['name'])),
fg="blue")
click.echo(
(printitemdate_tpl if "date" in item else printitem_tpl).format(
name=click.style(item['name'], fg="cyan"), date=date, url=url))
def _print_tag_item(name):
click.echo(
printitem_tpl.format(
name=click.style(name, fg="cyan"),
url=click.style(
"https://platformio.org/lib/search?query=" + quote(
"keyword:%s" % name),
fg="blue")))
for key in ("updated", "added"):
_print_title("Recently " + key)
_print_header(with_date=True)
for item in result.get(key, []):
_print_lib_item(item)
click.echo()
_print_title("Recent keywords")
_print_header(with_date=False)
for item in result.get("lastkeywords"):
_print_tag_item(item)
click.echo()
_print_title("Popular keywords")
_print_header(with_date=False)
for item in result.get("topkeywords"):
_print_tag_item(item)
click.echo()
for key, title in (("dlday", "Today"), ("dlweek", "Week"), ("dlmonth",
"Month")):
_print_title("Featured: " + title)
_print_header(with_date=False)
for item in result.get(key, []):
_print_lib_item(item)
click.echo()
return True

View File

@@ -0,0 +1,13 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

View File

@@ -0,0 +1,651 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# pylint: disable=too-many-branches, too-many-locals
import os
import time
import click
from tabulate import tabulate
from platformio import exception, fs, util
from platformio.commands import PlatformioCLI
from platformio.commands.lib.helpers import get_builtin_libs, save_project_libdeps
from platformio.compat import dump_json_to_unicode
from platformio.package.exception import NotGlobalLibDir, UnknownPackageError
from platformio.package.manager.library import LibraryPackageManager
from platformio.package.meta import PackageItem, PackageSpec
from platformio.proc import is_ci
from platformio.project.config import ProjectConfig
from platformio.project.helpers import get_project_dir, is_platformio_project
try:
from urllib.parse import quote
except ImportError:
from urllib import quote
CTX_META_INPUT_DIRS_KEY = __name__ + ".input_dirs"
CTX_META_PROJECT_ENVIRONMENTS_KEY = __name__ + ".project_environments"
CTX_META_STORAGE_DIRS_KEY = __name__ + ".storage_dirs"
CTX_META_STORAGE_LIBDEPS_KEY = __name__ + ".storage_lib_deps"
def get_project_global_lib_dir():
return ProjectConfig.get_instance().get_optional_dir("globallib")
@click.group(short_help="Library manager")
@click.option(
"-d",
"--storage-dir",
multiple=True,
default=None,
type=click.Path(
exists=True, file_okay=False, dir_okay=True, writable=True, resolve_path=True
),
help="Manage custom library storage",
)
@click.option(
"-g", "--global", is_flag=True, help="Manage global PlatformIO library storage"
)
@click.option(
"-e",
"--environment",
multiple=True,
help=(
"Manage libraries for the specific project build environments "
"declared in `platformio.ini`"
),
)
@click.pass_context
def cli(ctx, **options):
storage_cmds = ("install", "uninstall", "update", "list")
# skip commands that don't need storage folder
if ctx.invoked_subcommand not in storage_cmds or (
len(ctx.args) == 2 and ctx.args[1] in ("-h", "--help")
):
return
storage_dirs = list(options["storage_dir"])
if options["global"]:
storage_dirs.append(get_project_global_lib_dir())
if not storage_dirs:
if is_platformio_project():
storage_dirs = [get_project_dir()]
elif is_ci():
storage_dirs = [get_project_global_lib_dir()]
click.secho(
"Warning! Global library storage is used automatically. "
"Please use `platformio lib --global %s` command to remove "
"this warning." % ctx.invoked_subcommand,
fg="yellow",
)
if not storage_dirs:
raise NotGlobalLibDir(
get_project_dir(), get_project_global_lib_dir(), ctx.invoked_subcommand
)
in_silence = PlatformioCLI.in_silence()
ctx.meta[CTX_META_PROJECT_ENVIRONMENTS_KEY] = options["environment"]
ctx.meta[CTX_META_INPUT_DIRS_KEY] = storage_dirs
ctx.meta[CTX_META_STORAGE_DIRS_KEY] = []
ctx.meta[CTX_META_STORAGE_LIBDEPS_KEY] = {}
for storage_dir in storage_dirs:
if not is_platformio_project(storage_dir):
ctx.meta[CTX_META_STORAGE_DIRS_KEY].append(storage_dir)
continue
with fs.cd(storage_dir):
config = ProjectConfig.get_instance(
os.path.join(storage_dir, "platformio.ini")
)
config.validate(options["environment"], silent=in_silence)
libdeps_dir = config.get_optional_dir("libdeps")
for env in config.envs():
if options["environment"] and env not in options["environment"]:
continue
storage_dir = os.path.join(libdeps_dir, env)
ctx.meta[CTX_META_STORAGE_DIRS_KEY].append(storage_dir)
ctx.meta[CTX_META_STORAGE_LIBDEPS_KEY][storage_dir] = config.get(
"env:" + env, "lib_deps", []
)
@cli.command("install", short_help="Install library")
@click.argument("libraries", required=False, nargs=-1, metavar="[LIBRARY...]")
@click.option(
"--save/--no-save",
is_flag=True,
default=True,
help="Save installed libraries into the `platformio.ini` dependency list"
" (enabled by default)",
)
@click.option("-s", "--silent", is_flag=True, help="Suppress progress reporting")
@click.option(
"--interactive",
is_flag=True,
help="Deprecated! Please use a strict dependency specification (owner/libname)",
)
@click.option(
"-f", "--force", is_flag=True, help="Reinstall/redownload library if exists"
)
@click.pass_context
def lib_install( # pylint: disable=too-many-arguments,unused-argument
ctx, libraries, save, silent, interactive, force
):
storage_dirs = ctx.meta[CTX_META_STORAGE_DIRS_KEY]
storage_libdeps = ctx.meta.get(CTX_META_STORAGE_LIBDEPS_KEY, [])
installed_pkgs = {}
for storage_dir in storage_dirs:
if not silent and (libraries or storage_dir in storage_libdeps):
print_storage_header(storage_dirs, storage_dir)
lm = LibraryPackageManager(storage_dir)
if libraries:
installed_pkgs = {
library: lm.install(library, silent=silent, force=force)
for library in libraries
}
elif storage_dir in storage_libdeps:
for library in storage_libdeps[storage_dir]:
lm.install(library, silent=silent, force=force)
if save and installed_pkgs:
_save_deps(ctx, installed_pkgs)
def _save_deps(ctx, pkgs, action="add"):
specs = []
for library, pkg in pkgs.items():
spec = PackageSpec(library)
if spec.external:
specs.append(spec)
else:
specs.append(
PackageSpec(
owner=pkg.metadata.spec.owner,
name=pkg.metadata.spec.name,
requirements=spec.requirements
or (
("^%s" % pkg.metadata.version)
if not pkg.metadata.version.build
else pkg.metadata.version
),
)
)
input_dirs = ctx.meta.get(CTX_META_INPUT_DIRS_KEY, [])
project_environments = ctx.meta[CTX_META_PROJECT_ENVIRONMENTS_KEY]
for input_dir in input_dirs:
if not is_platformio_project(input_dir):
continue
save_project_libdeps(input_dir, specs, project_environments, action=action)
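
Because the new `lib` group is a regular Click command, the install flow above can be exercised programmatically; a sketch using Click's test runner, where the module path and the library spec are assumptions:

```python
from click.testing import CliRunner

from platformio.commands.lib.command import cli as cmd_lib  # assumed module path

runner = CliRunner()
# Install into the global storage; with project storages in scope,
# --save (the default) would also update the platformio.ini dependency lists.
result = runner.invoke(cmd_lib, ["--global", "install", "bblanchon/ArduinoJson"])
print(result.exit_code)
print(result.output)
```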
@cli.command("uninstall", short_help="Remove libraries")
@click.argument("libraries", nargs=-1, metavar="[LIBRARY...]")
@click.option(
"--save/--no-save",
is_flag=True,
default=True,
help="Remove libraries from the `platformio.ini` dependency list and save changes"
" (enabled by default)",
)
@click.option("-s", "--silent", is_flag=True, help="Suppress progress reporting")
@click.pass_context
def lib_uninstall(ctx, libraries, save, silent):
storage_dirs = ctx.meta[CTX_META_STORAGE_DIRS_KEY]
uninstalled_pkgs = {}
for storage_dir in storage_dirs:
print_storage_header(storage_dirs, storage_dir)
lm = LibraryPackageManager(storage_dir)
uninstalled_pkgs = {
library: lm.uninstall(library, silent=silent) for library in libraries
}
if save and uninstalled_pkgs:
_save_deps(ctx, uninstalled_pkgs, action="remove")
@cli.command("update", short_help="Update installed libraries")
@click.argument("libraries", required=False, nargs=-1, metavar="[LIBRARY...]")
@click.option(
"-c",
"--only-check",
is_flag=True,
help="DEPRECATED. Please use `--dry-run` instead",
)
@click.option(
"--dry-run", is_flag=True, help="Do not update, only check for the new versions"
)
@click.option("-s", "--silent", is_flag=True, help="Suppress progress reporting")
@click.option("--json-output", is_flag=True)
@click.pass_context
def lib_update( # pylint: disable=too-many-arguments
ctx, libraries, only_check, dry_run, silent, json_output
):
storage_dirs = ctx.meta[CTX_META_STORAGE_DIRS_KEY]
only_check = dry_run or only_check
json_result = {}
for storage_dir in storage_dirs:
if not json_output:
print_storage_header(storage_dirs, storage_dir)
lib_deps = ctx.meta.get(CTX_META_STORAGE_LIBDEPS_KEY, {}).get(storage_dir, [])
lm = LibraryPackageManager(storage_dir)
_libraries = libraries or lib_deps or lm.get_installed()
if only_check and json_output:
result = []
for library in _libraries:
spec = None
pkg = None
if isinstance(library, PackageItem):
pkg = library
else:
spec = PackageSpec(library)
pkg = lm.get_package(spec)
if not pkg:
continue
outdated = lm.outdated(pkg, spec)
if not outdated.is_outdated(allow_incompatible=True):
continue
manifest = lm.legacy_load_manifest(pkg)
manifest["versionWanted"] = (
str(outdated.wanted) if outdated.wanted else None
)
manifest["versionLatest"] = (
str(outdated.latest) if outdated.latest else None
)
result.append(manifest)
json_result[storage_dir] = result
else:
for library in _libraries:
to_spec = (
None if isinstance(library, PackageItem) else PackageSpec(library)
)
try:
lm.update(
library, to_spec=to_spec, only_check=only_check, silent=silent
)
except UnknownPackageError as e:
if library not in lib_deps:
raise e
if json_output:
return click.echo(
dump_json_to_unicode(
json_result[storage_dirs[0]] if len(storage_dirs) == 1 else json_result
)
)
return True
@cli.command("list", short_help="List installed libraries")
@click.option("--json-output", is_flag=True)
@click.pass_context
def lib_list(ctx, json_output):
storage_dirs = ctx.meta[CTX_META_STORAGE_DIRS_KEY]
json_result = {}
for storage_dir in storage_dirs:
if not json_output:
print_storage_header(storage_dirs, storage_dir)
lm = LibraryPackageManager(storage_dir)
items = lm.legacy_get_installed()
if json_output:
json_result[storage_dir] = items
elif items:
for item in sorted(items, key=lambda i: i["name"]):
print_lib_item(item)
else:
click.echo("No items found")
if json_output:
return click.echo(
dump_json_to_unicode(
json_result[storage_dirs[0]] if len(storage_dirs) == 1 else json_result
)
)
return True
@cli.command("search", short_help="Search for a library")
@click.argument("query", required=False, nargs=-1)
@click.option("--json-output", is_flag=True)
@click.option("--page", type=click.INT, default=1)
@click.option("--id", multiple=True)
@click.option("-o", "--owner", multiple=True)
@click.option("-n", "--name", multiple=True)
@click.option("-a", "--author", multiple=True)
@click.option("-k", "--keyword", multiple=True)
@click.option("-f", "--framework", multiple=True)
@click.option("-p", "--platform", multiple=True)
@click.option("-i", "--header", multiple=True)
@click.option(
"--noninteractive",
is_flag=True,
help="Do not prompt, automatically paginate with delay",
)
def lib_search(query, json_output, page, noninteractive, **filters):
regclient = LibraryPackageManager().get_registry_client_instance()
if not query:
query = []
if not isinstance(query, list):
query = list(query)
for key, values in filters.items():
for value in values:
query.append('%s:"%s"' % (key, value))
result = regclient.fetch_json_data(
"get",
"/v2/lib/search",
params=dict(query=" ".join(query), page=page),
cache_valid="1d",
)
if json_output:
click.echo(dump_json_to_unicode(result))
return
if result["total"] == 0:
click.secho(
"Nothing has been found by your request\n"
"Try a less-specific search or use truncation (or wildcard) "
"operator",
fg="yellow",
nl=False,
)
click.secho(" *", fg="green")
click.secho("For example: DS*, PCA*, DHT* and etc.\n", fg="yellow")
click.echo(
"For more examples and advanced search syntax, please use documentation:"
)
click.secho(
"https://docs.platformio.org/page/userguide/lib/cmd_search.html\n",
fg="cyan",
)
return
click.secho(
"Found %d libraries:\n" % result["total"],
fg="green" if result["total"] else "yellow",
)
while True:
for item in result["items"]:
print_lib_item(item)
if int(result["page"]) * int(result["perpage"]) >= int(result["total"]):
break
if noninteractive:
click.echo()
click.secho(
"Loading next %d libraries... Press Ctrl+C to stop!"
% result["perpage"],
fg="yellow",
)
click.echo()
time.sleep(5)
elif not click.confirm("Show next libraries?"):
break
result = regclient.fetch_json_data(
"get",
"/v2/lib/search",
params=dict(query=" ".join(query), page=int(result["page"]) + 1),
cache_valid="1d",
)
@cli.command("builtin", short_help="List built-in libraries")
@click.option("--storage", multiple=True)
@click.option("--json-output", is_flag=True)
def lib_builtin(storage, json_output):
items = get_builtin_libs(storage)
if json_output:
return click.echo(dump_json_to_unicode(items))
for storage_ in items:
if not storage_["items"]:
continue
click.secho(storage_["name"], fg="green")
click.echo("*" * len(storage_["name"]))
click.echo()
for item in sorted(storage_["items"], key=lambda i: i["name"]):
print_lib_item(item)
return True
@cli.command("show", short_help="Show detailed info about a library")
@click.argument("library", metavar="[LIBRARY]")
@click.option("--json-output", is_flag=True)
def lib_show(library, json_output):
lm = LibraryPackageManager()
lib_id = lm.reveal_registry_package_id(library, silent=json_output)
regclient = lm.get_registry_client_instance()
lib = regclient.fetch_json_data("get", "/v2/lib/info/%d" % lib_id, cache_valid="1h")
if json_output:
return click.echo(dump_json_to_unicode(lib))
title = "{ownername}/{name}".format(**lib)
click.secho(title, fg="cyan")
click.echo("=" * len(title))
click.echo(lib["description"])
click.echo()
click.secho("ID: %d" % lib["id"])
click.echo(
"Version: %s, released %s"
% (
lib["version"]["name"],
time.strftime("%c", util.parse_date(lib["version"]["released"])),
)
)
click.echo("Manifest: %s" % lib["confurl"])
for key in ("homepage", "repository", "license"):
if key not in lib or not lib[key]:
continue
if isinstance(lib[key], list):
click.echo("%s: %s" % (key.title(), ", ".join(lib[key])))
else:
click.echo("%s: %s" % (key.title(), lib[key]))
blocks = []
_authors = []
for author in lib.get("authors", []):
_data = []
for key in ("name", "email", "url", "maintainer"):
if not author.get(key):
continue
if key == "email":
_data.append("<%s>" % author[key])
elif key == "maintainer":
_data.append("(maintainer)")
else:
_data.append(author[key])
_authors.append(" ".join(_data))
if _authors:
blocks.append(("Authors", _authors))
blocks.append(("Keywords", lib["keywords"]))
for key in ("frameworks", "platforms"):
if key not in lib or not lib[key]:
continue
blocks.append(("Compatible %s" % key, [i["title"] for i in lib[key]]))
blocks.append(("Headers", lib["headers"]))
blocks.append(("Examples", lib["examples"]))
blocks.append(
(
"Versions",
[
"%s, released %s"
% (v["name"], time.strftime("%c", util.parse_date(v["released"])))
for v in lib["versions"]
],
)
)
blocks.append(
(
"Unique Downloads",
[
"Today: %s" % lib["dlstats"]["day"],
"Week: %s" % lib["dlstats"]["week"],
"Month: %s" % lib["dlstats"]["month"],
],
)
)
for (title, rows) in blocks:
click.echo()
click.secho(title, bold=True)
click.echo("-" * len(title))
for row in rows:
click.echo(row)
return True
@cli.command("register", short_help="Deprecated")
@click.argument("config_url")
def lib_register(config_url): # pylint: disable=unused-argument
raise exception.UserSideException(
"This command is deprecated. Please use `pio package publish` command."
)
@cli.command("stats", short_help="Library Registry Statistics")
@click.option("--json-output", is_flag=True)
def lib_stats(json_output):
regclient = LibraryPackageManager().get_registry_client_instance()
result = regclient.fetch_json_data("get", "/v2/lib/stats", cache_valid="1h")
if json_output:
return click.echo(dump_json_to_unicode(result))
for key in ("updated", "added"):
tabular_data = [
(
click.style(item["name"], fg="cyan"),
time.strftime("%c", util.parse_date(item["date"])),
"https://platformio.org/lib/show/%s/%s"
% (item["id"], quote(item["name"])),
)
for item in result.get(key, [])
]
table = tabulate(
tabular_data,
headers=[click.style("RECENTLY " + key.upper(), bold=True), "Date", "URL"],
)
click.echo(table)
click.echo()
for key in ("lastkeywords", "topkeywords"):
tabular_data = [
(
click.style(name, fg="cyan"),
"https://platformio.org/lib/search?query=" + quote("keyword:%s" % name),
)
for name in result.get(key, [])
]
table = tabulate(
tabular_data,
headers=[
click.style(
("RECENT" if key == "lastkeywords" else "POPULAR") + " KEYWORDS",
bold=True,
),
"URL",
],
)
click.echo(table)
click.echo()
for key, title in (("dlday", "Today"), ("dlweek", "Week"), ("dlmonth", "Month")):
tabular_data = [
(
click.style(item["name"], fg="cyan"),
"https://platformio.org/lib/show/%s/%s"
% (item["id"], quote(item["name"])),
)
for item in result.get(key, [])
]
table = tabulate(
tabular_data,
headers=[click.style("FEATURED: " + title.upper(), bold=True), "URL"],
)
click.echo(table)
click.echo()
return True
def print_storage_header(storage_dirs, storage_dir):
if storage_dirs and storage_dirs[0] != storage_dir:
click.echo("")
click.echo(
click.style("Library Storage: ", bold=True)
+ click.style(storage_dir, fg="blue")
)
def print_lib_item(item):
click.secho(item["name"], fg="cyan")
click.echo("=" * len(item["name"]))
if "id" in item:
click.secho("#ID: %d" % item["id"], bold=True)
if "description" in item or "url" in item:
click.echo(item.get("description", item.get("url", "")))
click.echo()
for key in ("version", "homepage", "license", "keywords"):
if key not in item or not item[key]:
continue
if isinstance(item[key], list):
click.echo("%s: %s" % (key.title(), ", ".join(item[key])))
else:
click.echo("%s: %s" % (key.title(), item[key]))
for key in ("frameworks", "platforms"):
if key not in item:
continue
click.echo(
"Compatible %s: %s"
% (
key,
", ".join(
[i["title"] if isinstance(i, dict) else i for i in item[key]]
),
)
)
if "authors" in item or "authornames" in item:
click.echo(
"Authors: %s"
% ", ".join(
item.get(
"authornames", [a.get("name", "") for a in item.get("authors", [])]
)
)
)
if "__src_url" in item:
click.secho("Source: %s" % item["__src_url"])
click.echo()

@@ -0,0 +1,102 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
from platformio.compat import ci_strings_are_equal
from platformio.package.manager.platform import PlatformPackageManager
from platformio.package.meta import PackageSpec
from platformio.platform.factory import PlatformFactory
from platformio.project.config import ProjectConfig
from platformio.project.exception import InvalidProjectConfError
def get_builtin_libs(storage_names=None):
# pylint: disable=import-outside-toplevel
from platformio.package.manager.library import LibraryPackageManager
items = []
storage_names = storage_names or []
pm = PlatformPackageManager()
for pkg in pm.get_installed():
p = PlatformFactory.new(pkg)
for storage in p.get_lib_storages():
if storage_names and storage["name"] not in storage_names:
continue
lm = LibraryPackageManager(storage["path"])
items.append(
{
"name": storage["name"],
"path": storage["path"],
"items": lm.legacy_get_installed(),
}
)
return items
def is_builtin_lib(name, storages=None):
for storage in storages or get_builtin_libs():
for lib in storage["items"]:
if lib.get("name") == name:
return True
return False
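A short, hedged usage sketch of the two helpers above; the library name "SPI" is only an illustrative assumption.

```python
# List the libraries bundled with every installed dev-platform, then check one by name.
storages = get_builtin_libs()           # one entry per platform-provided storage
print(is_builtin_lib("SPI", storages))  # True if any storage ships a library named "SPI"
```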
def ignore_deps_by_specs(deps, specs):
result = []
for dep in deps:
depspec = PackageSpec(dep)
if depspec.external:
result.append(dep)
continue
ignore_conditions = []
for spec in specs:
if depspec.owner:
ignore_conditions.append(
ci_strings_are_equal(depspec.owner, spec.owner)
and ci_strings_are_equal(depspec.name, spec.name)
)
else:
ignore_conditions.append(ci_strings_are_equal(depspec.name, spec.name))
if not any(ignore_conditions):
result.append(dep)
return result
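A minimal sketch of how this filter behaves, assuming the helper above is in scope; the dependency strings are illustrative only.

```python
from platformio.package.meta import PackageSpec

deps = [
    "bblanchon/ArduinoJson@^6.17.2",           # registry spec with an owner
    "https://github.com/example/SomeLib.git",  # external VCS spec
]
specs = [PackageSpec("bblanchon/ArduinoJson")]
# The external entry is kept untouched; the registry entry that matches the
# owner/name of the given spec is dropped.
print(ignore_deps_by_specs(deps, specs))  # ["https://github.com/example/SomeLib.git"]
```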
def save_project_libdeps(project_dir, specs, environments=None, action="add"):
config = ProjectConfig.get_instance(os.path.join(project_dir, "platformio.ini"))
config.validate(environments)
for env in config.envs():
if environments and env not in environments:
continue
config.expand_interpolations = False
candidates = []
try:
candidates = ignore_deps_by_specs(
config.get("env:" + env, "lib_deps"), specs
)
except InvalidProjectConfError:
pass
if action == "add":
candidates.extend(spec.as_dependency() for spec in specs)
if candidates:
result = []
for item in candidates:
item = item.strip()
if item and item not in result:
result.append(item)
config.set("env:" + env, "lib_deps", result)
elif config.has_option("env:" + env, "lib_deps"):
config.remove_option("env:" + env, "lib_deps")
config.save()
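A hedged usage sketch, assuming a project with a `platformio.ini` at an illustrative path and an `uno` environment defined in it.

```python
from platformio.package.meta import PackageSpec

save_project_libdeps(
    "/path/to/project",                              # must contain platformio.ini
    [PackageSpec("bblanchon/ArduinoJson@^6.17.2")],  # specs to merge into lib_deps
    environments=["uno"],                            # limit the update to one env
    action="add",
)
```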

platformio/commands/org.py
@@ -0,0 +1,165 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# pylint: disable=unused-argument
import json
import click
from tabulate import tabulate
from platformio.clients.account import AccountClient
from platformio.commands.account import validate_email, validate_username
@click.group("org", short_help="Manage organizations")
def cli():
pass
def validate_orgname(value):
return validate_username(value, "Organization name")
@cli.command("create", short_help="Create a new organization")
@click.argument(
"orgname",
callback=lambda _, __, value: validate_orgname(value),
)
@click.option(
"--email", callback=lambda _, __, value: validate_email(value) if value else value
)
@click.option(
"--displayname",
)
def org_create(orgname, email, displayname):
client = AccountClient()
client.create_org(orgname, email, displayname)
return click.secho(
"The organization `%s` has been successfully created." % orgname,
fg="green",
)
@cli.command("list", short_help="List organizations and their members")
@click.option("--json-output", is_flag=True)
def org_list(json_output):
client = AccountClient()
orgs = client.list_orgs()
if json_output:
return click.echo(json.dumps(orgs))
if not orgs:
return click.echo("You do not have any organization")
for org in orgs:
click.echo()
click.secho(org.get("orgname"), fg="cyan")
click.echo("-" * len(org.get("orgname")))
data = []
if org.get("displayname"):
data.append(("Display Name:", org.get("displayname")))
if org.get("email"):
data.append(("Email:", org.get("email")))
data.append(
(
"Owners:",
", ".join((owner.get("username") for owner in org.get("owners"))),
)
)
click.echo(tabulate(data, tablefmt="plain"))
return click.echo()
@cli.command("update", short_help="Update organization")
@click.argument("cur_orgname")
@click.option(
"--orgname",
callback=lambda _, __, value: validate_orgname(value),
help="A new orgname",
)
@click.option("--email")
@click.option("--displayname")
def org_update(cur_orgname, **kwargs):
client = AccountClient()
org = client.get_org(cur_orgname)
del org["owners"]
new_org = org.copy()
if not any(kwargs.values()):
for field in org:
new_org[field] = click.prompt(
field.replace("_", " ").capitalize(), default=org[field]
)
if field == "email":
validate_email(new_org[field])
if field == "orgname":
validate_orgname(new_org[field])
else:
new_org.update(
{key.replace("new_", ""): value for key, value in kwargs.items() if value}
)
client.update_org(cur_orgname, new_org)
return click.secho(
"The organization `%s` has been successfully updated." % cur_orgname,
fg="green",
)
@cli.command("destroy", short_help="Destroy organization")
@click.argument("orgname")
def account_destroy(orgname):
client = AccountClient()
click.confirm(
"Are you sure you want to delete the `%s` organization account?\n"
"Warning! All linked data will be permanently removed and can not be restored."
% orgname,
abort=True,
)
client.destroy_org(orgname)
return click.secho(
"Organization `%s` has been destroyed." % orgname,
fg="green",
)
@cli.command("add", short_help="Add a new owner to organization")
@click.argument(
"orgname",
)
@click.argument(
"username",
)
def org_add_owner(orgname, username):
client = AccountClient()
client.add_org_owner(orgname, username)
return click.secho(
"The new owner `%s` has been successfully added to the `%s` organization."
% (username, orgname),
fg="green",
)
@cli.command("remove", short_help="Remove an owner from organization")
@click.argument(
"orgname",
)
@click.argument(
"username",
)
def org_remove_owner(orgname, username):
client = AccountClient()
client.remove_org_owner(orgname, username)
return click.secho(
"The `%s` owner has been successfully removed from the `%s` organization."
% (username, orgname),
fg="green",
)

@@ -0,0 +1,119 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
import tempfile
from datetime import datetime
import click
from platformio import fs
from platformio.clients.registry import RegistryClient
from platformio.compat import ensure_python3
from platformio.package.meta import PackageSpec, PackageType
from platformio.package.pack import PackagePacker
def validate_datetime(ctx, param, value): # pylint: disable=unused-argument
if not value:
return value
try:
datetime.strptime(value, "%Y-%m-%d %H:%M:%S")
except ValueError as e:
raise click.BadParameter(e)
return value
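A minimal sketch of the timestamp shape this callback accepts; the values are illustrative.

```python
from datetime import datetime

# Mirrors the callback above: only "YYYY-MM-DD HH:MM:SS" (UTC) is accepted.
datetime.strptime("2014-06-13 17:08:52", "%Y-%m-%d %H:%M:%S")  # parses fine
# "2014-06-13T17:08:52" would raise ValueError here and, via the callback,
# surface to the user as click.BadParameter.
```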
@click.group("package", short_help="Package manager")
def cli():
pass
@cli.command("pack", short_help="Create a tarball from a package")
@click.argument(
"package",
required=True,
default=os.getcwd,
metavar="<source directory, tar.gz or zip>",
)
@click.option(
"-o", "--output", help="A destination path (folder or a full path to file)"
)
def package_pack(package, output):
p = PackagePacker(package)
archive_path = p.pack(output)
click.secho('Wrote a tarball to "%s"' % archive_path, fg="green")
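For illustration, a hedged in-process invocation using Click's test runner; the paths and printed filename are assumptions, and in normal use this is simply `pio package pack`.

```python
from click.testing import CliRunner

runner = CliRunner()
# Pack an illustrative library folder and drop the tarball into /tmp/.
result = runner.invoke(cli, ["pack", "/path/to/my_library", "--output", "/tmp/"])
print(result.output)  # e.g. 'Wrote a tarball to "/tmp/my_library-1.0.0.tar.gz"'
```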
@cli.command("publish", short_help="Publish a package to the registry")
@click.argument(
"package",
required=True,
default=os.getcwd,
metavar="<source directory, tar.gz or zip>",
)
@click.option(
"--owner",
help="PIO Account username (can be organization username). "
"Default is set to a username of the authorized PIO Account",
)
@click.option(
"--released-at",
callback=validate_datetime,
help="Custom release date and time in the next format (UTC): 2014-06-13 17:08:52",
)
@click.option("--private", is_flag=True, help="Restricted access (not a public)")
@click.option(
"--notify/--no-notify",
default=True,
help="Notify by email when package is processed",
)
def package_publish(package, owner, released_at, private, notify):
assert ensure_python3()
with tempfile.TemporaryDirectory() as tmp_dir: # pylint: disable=no-member
with fs.cd(tmp_dir):
p = PackagePacker(package)
archive_path = p.pack()
response = RegistryClient().publish_package(
archive_path, owner, released_at, private, notify
)
os.remove(archive_path)
click.secho(response.get("message"), fg="green")
@cli.command("unpublish", short_help="Remove a pushed package from the registry")
@click.argument(
"package", required=True, metavar="[<organization>/]<pkgname>[@<version>]"
)
@click.option(
"--type",
type=click.Choice(list(PackageType.items().values())),
default="library",
help="Package type, default is set to `library`",
)
@click.option(
"--undo",
is_flag=True,
help="Undo a remove, putting a version back into the registry",
)
def package_unpublish(package, type, undo): # pylint: disable=redefined-builtin
spec = PackageSpec(package)
response = RegistryClient().unpublish_package(
type=type,
name=spec.name,
owner=spec.owner,
version=str(spec.requirements),
undo=undo,
)
click.secho(response.get("message"), fg="green")

@@ -12,129 +12,127 @@
# See the License for the specific language governing permissions and
# limitations under the License.
import json
from os.path import dirname, isdir
import os
import click
from platformio import app, exception, util
from platformio.cache import cleanup_content_cache
from platformio.commands.boards import print_boards
from platformio.managers.platform import PlatformFactory, PlatformManager
from platformio.compat import dump_json_to_unicode
from platformio.package.manager.platform import PlatformPackageManager
from platformio.package.meta import PackageItem, PackageSpec
from platformio.package.version import get_original_version
from platformio.platform.exception import UnknownPlatform
from platformio.platform.factory import PlatformFactory
@click.group(short_help="Platform Manager")
@click.group(short_help="Platform manager")
def cli():
pass
def _print_platforms(platforms):
for platform in platforms:
click.echo("{name} ~ {title}".format(
name=click.style(platform['name'], fg="cyan"),
title=platform['title']))
click.echo("=" * (3 + len(platform['name'] + platform['title'])))
click.echo(platform['description'])
click.echo(
"{name} ~ {title}".format(
name=click.style(platform["name"], fg="cyan"), title=platform["title"]
)
)
click.echo("=" * (3 + len(platform["name"] + platform["title"])))
click.echo(platform["description"])
click.echo()
if "homepage" in platform:
click.echo("Home: %s" % platform['homepage'])
if "frameworks" in platform and platform['frameworks']:
click.echo("Frameworks: %s" % ", ".join(platform['frameworks']))
click.echo("Home: %s" % platform["homepage"])
if "frameworks" in platform and platform["frameworks"]:
click.echo("Frameworks: %s" % ", ".join(platform["frameworks"]))
if "packages" in platform:
click.echo("Packages: %s" % ", ".join(platform['packages']))
click.echo("Packages: %s" % ", ".join(platform["packages"]))
if "version" in platform:
click.echo("Version: " + platform['version'])
if "__src_url" in platform:
click.echo(
"Version: %s (%s)" % (platform["version"], platform["__src_url"])
)
else:
click.echo("Version: " + platform["version"])
click.echo()
def _get_registry_platforms():
platforms = util.get_api_result("/platforms", cache_valid="7d")
pm = PlatformManager()
for platform in platforms or []:
platform['versions'] = pm.get_all_repo_versions(platform['name'])
return platforms
def _original_version(version):
if version.count(".") != 2:
return None
_, y = version.split(".")[:2]
if int(y) < 100:
return None
if len(y) % 2 != 0:
y = "0" + y
parts = [str(int(y[i * 2:i * 2 + 2])) for i in range(len(y) / 2)]
return ".".join(parts)
regclient = PlatformPackageManager().get_registry_client_instance()
return regclient.fetch_json_data("get", "/v2/platforms", cache_valid="1d")
def _get_platform_data(*args, **kwargs):
try:
return _get_installed_platform_data(*args, **kwargs)
except exception.UnknownPlatform:
except UnknownPlatform:
return _get_registry_platform_data(*args, **kwargs)
def _get_installed_platform_data(platform,
with_boards=True,
expose_packages=True):
p = PlatformFactory.newPlatform(platform)
def _get_installed_platform_data(platform, with_boards=True, expose_packages=True):
p = PlatformFactory.new(platform)
data = dict(
name=p.name,
title=p.title,
description=p.description,
version=p.version,
homepage=p.homepage,
url=p.homepage,
repository=p.repository_url,
url=p.vendor_url,
docs=p.docs_url,
license=p.license,
forDesktop=not p.is_embedded(),
frameworks=sorted(p.frameworks.keys() if p.frameworks else []),
packages=p.packages.keys() if p.packages else [])
frameworks=sorted(list(p.frameworks) if p.frameworks else []),
packages=list(p.packages) if p.packages else [],
)
# if dump to API
# del data['version']
# return data
# overwrite VCS version and add extra fields
manifest = PlatformManager().load_manifest(dirname(p.manifest_path))
manifest = PlatformPackageManager().legacy_load_manifest(
os.path.dirname(p.manifest_path)
)
assert manifest
for key in manifest:
if key == "version" or key.startswith("__"):
data[key] = manifest[key]
if with_boards:
data['boards'] = [c.get_brief_data() for c in p.get_boards().values()]
data["boards"] = [c.get_brief_data() for c in p.get_boards().values()]
if not data['packages'] or not expose_packages:
if not data["packages"] or not expose_packages:
return data
data['packages'] = []
installed_pkgs = p.get_installed_packages()
for name, opts in p.packages.items():
data["packages"] = []
installed_pkgs = {
pkg.metadata.name: p.pm.load_manifest(pkg) for pkg in p.get_installed_packages()
}
for name, options in p.packages.items():
item = dict(
name=name,
type=p.get_package_type(name),
requirements=opts.get("version"),
optional=opts.get("optional") is True)
requirements=options.get("version"),
optional=options.get("optional") is True,
)
if name in installed_pkgs:
for key, value in installed_pkgs[name].items():
if key not in ("url", "version", "description"):
continue
item[key] = value
if key == "version":
item["originalVersion"] = _original_version(value)
data['packages'].append(item)
item["originalVersion"] = get_original_version(value)
data["packages"].append(item)
return data
def _get_registry_platform_data( # pylint: disable=unused-argument
platform,
with_boards=True,
expose_packages=True):
platform, with_boards=True, expose_packages=True
):
_data = None
for p in _get_registry_platforms():
if p['name'] == platform:
if p["name"] == platform:
_data = p
break
@@ -142,22 +140,25 @@ def _get_registry_platform_data( # pylint: disable=unused-argument
return None
data = dict(
name=_data['name'],
title=_data['title'],
description=_data['description'],
homepage=_data['homepage'],
repository=_data['repository'],
url=_data['url'],
license=_data['license'],
forDesktop=_data['forDesktop'],
frameworks=_data['frameworks'],
packages=_data['packages'],
versions=_data['versions'])
ownername=_data.get("ownername"),
name=_data["name"],
title=_data["title"],
description=_data["description"],
homepage=_data["homepage"],
repository=_data["repository"],
url=_data["url"],
license=_data["license"],
forDesktop=_data["forDesktop"],
frameworks=_data["frameworks"],
packages=_data["packages"],
versions=_data.get("versions"),
)
if with_boards:
data['boards'] = [
board for board in PlatformManager().get_registered_boards()
if board['platform'] == _data['name']
data["boards"] = [
board
for board in PlatformPackageManager().get_registered_boards()
if board["platform"] == _data["name"]
]
return data
@@ -171,15 +172,17 @@ def platform_search(query, json_output):
for platform in _get_registry_platforms():
if query == "all":
query = ""
search_data = json.dumps(platform)
search_data = dump_json_to_unicode(platform)
if query and query.lower() not in search_data.lower():
continue
platforms.append(
_get_registry_platform_data(
platform['name'], with_boards=False, expose_packages=False))
platform["name"], with_boards=False, expose_packages=False
)
)
if json_output:
click.echo(json.dumps(platforms))
click.echo(dump_json_to_unicode(platforms))
else:
_print_platforms(platforms)
@@ -188,24 +191,27 @@ def platform_search(query, json_output):
@click.argument("query", required=False)
@click.option("--json-output", is_flag=True)
def platform_frameworks(query, json_output):
regclient = PlatformPackageManager().get_registry_client_instance()
frameworks = []
for framework in util.get_api_result("/frameworks", cache_valid="7d"):
for framework in regclient.fetch_json_data(
"get", "/v2/frameworks", cache_valid="1d"
):
if query == "all":
query = ""
search_data = json.dumps(framework)
search_data = dump_json_to_unicode(framework)
if query and query.lower() not in search_data.lower():
continue
framework['homepage'] = (
"https://platformio.org/frameworks/" + framework['name'])
framework['platforms'] = [
platform['name'] for platform in _get_registry_platforms()
if framework['name'] in platform['frameworks']
framework["homepage"] = "https://platformio.org/frameworks/" + framework["name"]
framework["platforms"] = [
platform["name"]
for platform in _get_registry_platforms()
if framework["name"] in platform["frameworks"]
]
frameworks.append(framework)
frameworks = sorted(frameworks, key=lambda manifest: manifest['name'])
frameworks = sorted(frameworks, key=lambda manifest: manifest["name"])
if json_output:
click.echo(json.dumps(frameworks))
click.echo(dump_json_to_unicode(frameworks))
else:
_print_platforms(frameworks)
@@ -214,17 +220,15 @@ def platform_frameworks(query, json_output):
@click.option("--json-output", is_flag=True)
def platform_list(json_output):
platforms = []
pm = PlatformManager()
for manifest in pm.get_installed():
pm = PlatformPackageManager()
for pkg in pm.get_installed():
platforms.append(
_get_installed_platform_data(
manifest['__pkg_dir'],
with_boards=False,
expose_packages=False))
_get_installed_platform_data(pkg, with_boards=False, expose_packages=False)
)
platforms = sorted(platforms, key=lambda manifest: manifest['name'])
platforms = sorted(platforms, key=lambda manifest: manifest["name"])
if json_output:
click.echo(json.dumps(platforms))
click.echo(dump_json_to_unicode(platforms))
else:
_print_platforms(platforms)
@@ -235,58 +239,61 @@ def platform_list(json_output):
def platform_show(platform, json_output): # pylint: disable=too-many-branches
data = _get_platform_data(platform)
if not data:
raise exception.UnknownPlatform(platform)
raise UnknownPlatform(platform)
if json_output:
return click.echo(json.dumps(data))
return click.echo(dump_json_to_unicode(data))
click.echo("{name} ~ {title}".format(
name=click.style(data['name'], fg="cyan"), title=data['title']))
click.echo("=" * (3 + len(data['name'] + data['title'])))
click.echo(data['description'])
dep = "{ownername}/{name}".format(**data) if "ownername" in data else data["name"]
click.echo(
"{dep} ~ {title}".format(dep=click.style(dep, fg="cyan"), title=data["title"])
)
click.echo("=" * (3 + len(dep + data["title"])))
click.echo(data["description"])
click.echo()
if "version" in data:
click.echo("Version: %s" % data['version'])
if data['homepage']:
click.echo("Home: %s" % data['homepage'])
if data['repository']:
click.echo("Repository: %s" % data['repository'])
if data['url']:
click.echo("Vendor: %s" % data['url'])
if data['license']:
click.echo("License: %s" % data['license'])
if data['frameworks']:
click.echo("Frameworks: %s" % ", ".join(data['frameworks']))
click.echo("Version: %s" % data["version"])
if data["homepage"]:
click.echo("Home: %s" % data["homepage"])
if data["repository"]:
click.echo("Repository: %s" % data["repository"])
if data["url"]:
click.echo("Vendor: %s" % data["url"])
if data["license"]:
click.echo("License: %s" % data["license"])
if data["frameworks"]:
click.echo("Frameworks: %s" % ", ".join(data["frameworks"]))
if not data['packages']:
if not data["packages"]:
return None
if not isinstance(data['packages'][0], dict):
click.echo("Packages: %s" % ", ".join(data['packages']))
if not isinstance(data["packages"][0], dict):
click.echo("Packages: %s" % ", ".join(data["packages"]))
else:
click.echo()
click.secho("Packages", bold=True)
click.echo("--------")
for item in data['packages']:
for item in data["packages"]:
click.echo()
click.echo("Package %s" % click.style(item['name'], fg="yellow"))
click.echo("-" * (8 + len(item['name'])))
if item['type']:
click.echo("Type: %s" % item['type'])
click.echo("Requirements: %s" % item['requirements'])
click.echo("Installed: %s" %
("Yes" if item.get("version") else "No (optional)"))
click.echo("Package %s" % click.style(item["name"], fg="yellow"))
click.echo("-" * (8 + len(item["name"])))
if item["type"]:
click.echo("Type: %s" % item["type"])
click.echo("Requirements: %s" % item["requirements"])
click.echo(
"Installed: %s" % ("Yes" if item.get("version") else "No (optional)")
)
if "version" in item:
click.echo("Version: %s" % item['version'])
click.echo("Version: %s" % item["version"])
if "originalVersion" in item:
click.echo("Original version: %s" % item['originalVersion'])
click.echo("Original version: %s" % item["originalVersion"])
if "description" in item:
click.echo("Description: %s" % item['description'])
click.echo("Description: %s" % item["description"])
if data['boards']:
if data["boards"]:
click.echo()
click.secho("Boards", bold=True)
click.echo("------")
print_boards(data['boards'])
print_boards(data["boards"])
return True
@@ -296,93 +303,123 @@ def platform_show(platform, json_output): # pylint: disable=too-many-branches
@click.option("--with-package", multiple=True)
@click.option("--without-package", multiple=True)
@click.option("--skip-default-package", is_flag=True)
@click.option("--with-all-packages", is_flag=True)
@click.option("-s", "--silent", is_flag=True, help="Suppress progress reporting")
@click.option(
"-f",
"--force",
is_flag=True,
help="Reinstall/redownload dev/platform and its packages if exist")
def platform_install(platforms, with_package, without_package,
skip_default_package, force):
pm = PlatformManager()
help="Reinstall/redownload dev/platform and its packages if exist",
)
def platform_install( # pylint: disable=too-many-arguments
platforms,
with_package,
without_package,
skip_default_package,
with_all_packages,
silent,
force,
):
pm = PlatformPackageManager()
for platform in platforms:
if pm.install(
name=platform,
with_packages=with_package,
without_packages=without_package,
skip_default_package=skip_default_package,
force=force):
pkg = pm.install(
spec=platform,
with_packages=with_package,
without_packages=without_package,
skip_default_package=skip_default_package,
with_all_packages=with_all_packages,
silent=silent,
force=force,
)
if pkg and not silent:
click.secho(
"The platform '%s' has been successfully installed!\n"
"The rest of packages will be installed automatically "
"The rest of the packages will be installed later "
"depending on your build environment." % platform,
fg="green")
fg="green",
)
@cli.command("uninstall", short_help="Uninstall development platform")
@click.argument("platforms", nargs=-1, required=True, metavar="[PLATFORM...]")
def platform_uninstall(platforms):
pm = PlatformManager()
pm = PlatformPackageManager()
for platform in platforms:
if pm.uninstall(platform):
click.secho(
"The platform '%s' has been successfully "
"uninstalled!" % platform,
fg="green")
"The platform '%s' has been successfully removed!" % platform,
fg="green",
)
@cli.command("update", short_help="Update installed development platforms")
@click.argument("platforms", nargs=-1, required=False, metavar="[PLATFORM...]")
@click.option(
"-p",
"--only-packages",
is_flag=True,
help="Update only the platform packages")
"-p", "--only-packages", is_flag=True, help="Update only the platform packages"
)
@click.option(
"-c",
"--only-check",
is_flag=True,
help="Do not update, only check for a new version")
help="DEPRECATED. Please use `--dry-run` instead",
)
@click.option(
"--dry-run", is_flag=True, help="Do not update, only check for the new versions"
)
@click.option("-s", "--silent", is_flag=True, help="Suppress progress reporting")
@click.option("--json-output", is_flag=True)
def platform_update(platforms, only_packages, only_check, json_output):
pm = PlatformManager()
pkg_dir_to_name = {}
if not platforms:
platforms = []
for manifest in pm.get_installed():
platforms.append(manifest['__pkg_dir'])
pkg_dir_to_name[manifest['__pkg_dir']] = manifest.get(
"title", manifest['name'])
def platform_update( # pylint: disable=too-many-locals, too-many-arguments
platforms, only_packages, only_check, dry_run, silent, json_output
):
pm = PlatformPackageManager()
platforms = platforms or pm.get_installed()
only_check = dry_run or only_check
if only_check and json_output:
result = []
for platform in platforms:
pkg_dir = platform if isdir(platform) else None
requirements = None
url = None
if not pkg_dir:
name, requirements, url = pm.parse_pkg_uri(platform)
pkg_dir = pm.get_package_dir(name, requirements, url)
if not pkg_dir:
spec = None
pkg = None
if isinstance(platform, PackageItem):
pkg = platform
else:
spec = PackageSpec(platform)
pkg = pm.get_package(spec)
if not pkg:
continue
latest = pm.outdated(pkg_dir, requirements)
if (not latest and not PlatformFactory.newPlatform(
pkg_dir).are_outdated_packages()):
outdated = pm.outdated(pkg, spec)
if (
not outdated.is_outdated(allow_incompatible=True)
and not PlatformFactory.new(pkg).are_outdated_packages()
):
continue
data = _get_installed_platform_data(
pkg_dir, with_boards=False, expose_packages=False)
if latest:
data['versionLatest'] = latest
pkg, with_boards=False, expose_packages=False
)
if outdated.is_outdated(allow_incompatible=True):
data["versionLatest"] = (
str(outdated.latest) if outdated.latest else None
)
result.append(data)
return click.echo(json.dumps(result))
else:
# cleanup cached board and platform lists
app.clean_cache()
for platform in platforms:
click.echo("Platform %s" % click.style(
pkg_dir_to_name.get(platform, platform), fg="cyan"))
click.echo("--------")
pm.update(
platform, only_packages=only_packages, only_check=only_check)
click.echo()
return click.echo(dump_json_to_unicode(result))
# cleanup cached board and platform lists
cleanup_content_cache("http")
for platform in platforms:
click.echo(
"Platform %s"
% click.style(
platform.metadata.name
if isinstance(platform, PackageItem)
else platform,
fg="cyan",
)
)
click.echo("--------")
pm.update(
platform, only_packages=only_packages, only_check=only_check, silent=silent
)
click.echo()
return True

@@ -0,0 +1,459 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# pylint: disable=too-many-arguments,too-many-locals,too-many-branches,line-too-long
import json
import os
import click
from tabulate import tabulate
from platformio import fs
from platformio.commands.platform import platform_install as cli_platform_install
from platformio.ide.projectgenerator import ProjectGenerator
from platformio.package.manager.platform import PlatformPackageManager
from platformio.platform.exception import UnknownBoard
from platformio.project.config import ProjectConfig
from platformio.project.exception import NotPlatformIOProjectError
from platformio.project.helpers import is_platformio_project, load_project_ide_data
@click.group(short_help="Project manager")
def cli():
pass
@cli.command("config", short_help="Show computed configuration")
@click.option(
"-d",
"--project-dir",
default=os.getcwd,
type=click.Path(exists=True, file_okay=False, dir_okay=True, resolve_path=True),
)
@click.option("--json-output", is_flag=True)
def project_config(project_dir, json_output):
if not is_platformio_project(project_dir):
raise NotPlatformIOProjectError(project_dir)
with fs.cd(project_dir):
config = ProjectConfig.get_instance()
if json_output:
return click.echo(config.to_json())
click.echo(
"Computed project configuration for %s" % click.style(project_dir, fg="cyan")
)
for section, options in config.as_tuple():
click.secho(section, fg="cyan")
click.echo("-" * len(section))
click.echo(
tabulate(
[
(name, "=", "\n".join(value) if isinstance(value, list) else value)
for name, value in options
],
tablefmt="plain",
)
)
click.echo()
return None
@cli.command("data", short_help="Dump data intended for IDE extensions/plugins")
@click.option(
"-d",
"--project-dir",
default=os.getcwd,
type=click.Path(exists=True, file_okay=False, dir_okay=True, resolve_path=True),
)
@click.option("-e", "--environment", multiple=True)
@click.option("--json-output", is_flag=True)
def project_data(project_dir, environment, json_output):
if not is_platformio_project(project_dir):
raise NotPlatformIOProjectError(project_dir)
with fs.cd(project_dir):
config = ProjectConfig.get_instance()
config.validate(environment)
environment = list(environment or config.envs())
if json_output:
return click.echo(json.dumps(load_project_ide_data(project_dir, environment)))
for envname in environment:
click.echo("Environment: " + click.style(envname, fg="cyan", bold=True))
click.echo("=" * (13 + len(envname)))
click.echo(
tabulate(
[
(click.style(name, bold=True), "=", json.dumps(value, indent=2))
for name, value in load_project_ide_data(
project_dir, envname
).items()
],
tablefmt="plain",
)
)
click.echo()
return None
def validate_boards(ctx, param, value): # pylint: disable=W0613
pm = PlatformPackageManager()
for id_ in value:
try:
pm.board_config(id_)
except UnknownBoard:
raise click.BadParameter(
"`%s`. Please search for the board ID using the `platformio boards` "
"command" % id_
)
return value
@cli.command("init", short_help="Initialize a project or update existing")
@click.option(
"--project-dir",
"-d",
default=os.getcwd,
type=click.Path(
exists=True, file_okay=False, dir_okay=True, writable=True, resolve_path=True
),
)
@click.option("-b", "--board", multiple=True, metavar="ID", callback=validate_boards)
@click.option("--ide", type=click.Choice(ProjectGenerator.get_supported_ides()))
@click.option("-e", "--environment", help="Update using existing environment")
@click.option("-O", "--project-option", multiple=True)
@click.option("--env-prefix", default="")
@click.option("-s", "--silent", is_flag=True)
@click.pass_context
def project_init(
ctx, # pylint: disable=R0913
project_dir,
board,
ide,
environment,
project_option,
env_prefix,
silent,
):
if not silent:
if project_dir == os.getcwd():
click.secho("\nThe current working directory", fg="yellow", nl=False)
click.secho(" %s " % project_dir, fg="cyan", nl=False)
click.secho("will be used for the project.", fg="yellow")
click.echo("")
click.echo(
"The next files/directories have been created in %s"
% click.style(project_dir, fg="cyan")
)
click.echo(
"%s - Put project header files here" % click.style("include", fg="cyan")
)
click.echo(
"%s - Put here project specific (private) libraries"
% click.style("lib", fg="cyan")
)
click.echo("%s - Put project source files here" % click.style("src", fg="cyan"))
click.echo(
"%s - Project Configuration File" % click.style("platformio.ini", fg="cyan")
)
is_new_project = not is_platformio_project(project_dir)
if is_new_project:
init_base_project(project_dir)
if environment:
update_project_env(project_dir, environment, project_option)
elif board:
update_board_envs(
ctx, project_dir, board, project_option, env_prefix, ide is not None
)
if ide:
with fs.cd(project_dir):
config = ProjectConfig.get_instance(
os.path.join(project_dir, "platformio.ini")
)
config.validate()
pg = ProjectGenerator(
config, environment or get_best_envname(config, board), ide
)
pg.generate()
if is_new_project:
init_cvs_ignore(project_dir)
if silent:
return
if ide:
click.secho(
"\nProject has been successfully %s including configuration files "
"for `%s` IDE." % ("initialized" if is_new_project else "updated", ide),
fg="green",
)
else:
click.secho(
"\nProject has been successfully %s! Useful commands:\n"
"`pio run` - process/build project from the current directory\n"
"`pio run --target upload` or `pio run -t upload` "
"- upload firmware to a target\n"
"`pio run --target clean` - clean project (remove compiled files)"
"\n`pio run --help` - additional information"
% ("initialized" if is_new_project else "updated"),
fg="green",
)
def init_base_project(project_dir):
with fs.cd(project_dir):
config = ProjectConfig()
config.save()
dir_to_readme = [
(config.get_optional_dir("src"), None),
(config.get_optional_dir("include"), init_include_readme),
(config.get_optional_dir("lib"), init_lib_readme),
(config.get_optional_dir("test"), init_test_readme),
]
for (path, cb) in dir_to_readme:
if os.path.isdir(path):
continue
os.makedirs(path)
if cb:
cb(path)
def init_include_readme(include_dir):
with open(os.path.join(include_dir, "README"), "w") as fp:
fp.write(
"""
This directory is intended for project header files.
A header file is a file containing C declarations and macro definitions
to be shared between several project source files. You request the use of a
header file in your project source file (C, C++, etc) located in `src` folder
by including it, with the C preprocessing directive `#include'.
```src/main.c
#include "header.h"
int main (void)
{
...
}
```
Including a header file produces the same results as copying the header file
into each source file that needs it. Such copying would be time-consuming
and error-prone. With a header file, the related declarations appear
in only one place. If they need to be changed, they can be changed in one
place, and programs that include the header file will automatically use the
new version when next recompiled. The header file eliminates the labor of
finding and changing all the copies as well as the risk that a failure to
find one copy will result in inconsistencies within a program.
In C, the usual convention is to give header files names that end with `.h'.
It is most portable to use only letters, digits, dashes, and underscores in
header file names, and at most one dot.
Read more about using header files in official GCC documentation:
* Include Syntax
* Include Operation
* Once-Only Headers
* Computed Includes
https://gcc.gnu.org/onlinedocs/cpp/Header-Files.html
""",
)
def init_lib_readme(lib_dir):
with open(os.path.join(lib_dir, "README"), "w") as fp:
fp.write(
"""
This directory is intended for project-specific (private) libraries.
PlatformIO will compile them to static libraries and link them into the executable file.
The source code of each library should be placed in its own separate directory
("lib/your_library_name/[here are source files]").
For example, see a structure of the following two libraries `Foo` and `Bar`:
|--lib
| |
| |--Bar
| | |--docs
| | |--examples
| | |--src
| | |- Bar.c
| | |- Bar.h
| | |- library.json (optional, custom build options, etc) https://docs.platformio.org/page/librarymanager/config.html
| |
| |--Foo
| | |- Foo.c
| | |- Foo.h
| |
| |- README --> THIS FILE
|
|- platformio.ini
|--src
|- main.c
and the contents of `src/main.c`:
```
#include <Foo.h>
#include <Bar.h>
int main (void)
{
...
}
```
PlatformIO's Library Dependency Finder will automatically find dependent
libraries by scanning project source files.
More information about PlatformIO Library Dependency Finder
- https://docs.platformio.org/page/librarymanager/ldf.html
""",
)
def init_test_readme(test_dir):
with open(os.path.join(test_dir, "README"), "w") as fp:
fp.write(
"""
This directory is intended for PlatformIO Unit Testing and project tests.
Unit Testing is a software testing method by which individual units of
source code, sets of one or more MCU program modules together with associated
control data, usage procedures, and operating procedures, are tested to
determine whether they are fit for use. Unit testing finds problems early
in the development cycle.
More information about PlatformIO Unit Testing:
- https://docs.platformio.org/page/plus/unit-testing.html
""",
)
def init_cvs_ignore(project_dir):
conf_path = os.path.join(project_dir, ".gitignore")
if os.path.isfile(conf_path):
return
with open(conf_path, "w") as fp:
fp.write(".pio\n")
def update_board_envs(
ctx, project_dir, board_ids, project_option, env_prefix, force_download
):
config = ProjectConfig(
os.path.join(project_dir, "platformio.ini"), parse_extra=False
)
used_boards = []
for section in config.sections():
cond = [section.startswith("env:"), config.has_option(section, "board")]
if all(cond):
used_boards.append(config.get(section, "board"))
pm = PlatformPackageManager()
used_platforms = []
modified = False
for id_ in board_ids:
board_config = pm.board_config(id_)
used_platforms.append(board_config["platform"])
if id_ in used_boards:
continue
used_boards.append(id_)
modified = True
envopts = {"platform": board_config["platform"], "board": id_}
# find default framework for board
frameworks = board_config.get("frameworks")
if frameworks:
envopts["framework"] = frameworks[0]
for item in project_option:
if "=" not in item:
continue
_name, _value = item.split("=", 1)
envopts[_name.strip()] = _value.strip()
section = "env:%s%s" % (env_prefix, id_)
config.add_section(section)
for option, value in envopts.items():
config.set(section, option, value)
if force_download and used_platforms:
_install_dependent_platforms(ctx, used_platforms)
if modified:
config.save()
def _install_dependent_platforms(ctx, platforms):
installed_platforms = [
pkg.metadata.name for pkg in PlatformPackageManager().get_installed()
]
if set(platforms) <= set(installed_platforms):
return
ctx.invoke(
cli_platform_install, platforms=list(set(platforms) - set(installed_platforms))
)
def update_project_env(project_dir, environment, project_option):
if not project_option:
return
config = ProjectConfig(
os.path.join(project_dir, "platformio.ini"), parse_extra=False
)
section = "env:%s" % environment
if not config.has_section(section):
config.add_section(section)
for item in project_option:
if "=" not in item:
continue
_name, _value = item.split("=", 1)
config.set(section, _name.strip(), _value.strip())
config.save()
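A short sketch of what this helper does for a single environment; the directory and options are illustrative assumptions.

```python
update_project_env(
    "/path/to/project",
    "uno",
    ["board_build.mcu=atmega328p", "upload_speed=115200"],
)
# Each "name=value" pair is written into the [env:uno] section of platformio.ini;
# items without "=" are silently skipped.
```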
def get_best_envname(config, board_ids=None):
envname = None
default_envs = config.default_envs()
if default_envs:
envname = default_envs[0]
if not board_ids:
return envname
for env in config.envs():
if not board_ids:
return env
if not envname:
envname = env
items = config.items(env=env, as_dict=True)
if "board" in items and items.get("board") in board_ids:
return env
return envname
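A hedged sketch of the selection order, assuming an existing project configuration at an illustrative path.

```python
from platformio.project.config import ProjectConfig

config = ProjectConfig("/path/to/project/platformio.ini")
# When board_ids is given, an environment whose "board" matches wins; otherwise
# the first default_envs entry is used; otherwise the first declared environment.
print(get_best_envname(config, board_ids=["uno"]))
```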

@@ -1,209 +0,0 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import sys
import threading
from os import getcwd
from os.path import isfile, join
from tempfile import mkdtemp
from time import sleep
import click
from platformio import exception, util
from platformio.commands.device import device_monitor as cmd_device_monitor
from platformio.managers.core import pioplus_call
# pylint: disable=unused-argument
@click.group("remote", short_help="PIO Remote")
@click.option("-a", "--agent", multiple=True)
def cli(**kwargs):
pass
@cli.group("agent", short_help="Start new agent or list active")
def remote_agent():
pass
@remote_agent.command("start", short_help="Start agent")
@click.option("-n", "--name")
@click.option("-s", "--share", multiple=True, metavar="E-MAIL")
@click.option(
"-d",
"--working-dir",
envvar="PLATFORMIO_REMOTE_AGENT_DIR",
type=click.Path(
        file_okay=False, dir_okay=True, writable=True, resolve_path=True))
def remote_agent_start(**kwargs):
    pioplus_call(sys.argv[1:])


@remote_agent.command("reload", short_help="Reload agents")
def remote_agent_reload():
    pioplus_call(sys.argv[1:])


@remote_agent.command("list", short_help="List active agents")
def remote_agent_list():
    pioplus_call(sys.argv[1:])


@cli.command(
    "update", short_help="Update installed Platforms, Packages and Libraries")
@click.option(
    "-c",
    "--only-check",
    is_flag=True,
    help="Do not update, only check for new version")
def remote_update(only_check):
    pioplus_call(sys.argv[1:])


@cli.command("run", short_help="Process project environments remotely")
@click.option("-e", "--environment", multiple=True)
@click.option("-t", "--target", multiple=True)
@click.option("--upload-port")
@click.option(
    "-d",
    "--project-dir",
    default=getcwd,
    type=click.Path(
        exists=True,
        file_okay=True,
        dir_okay=True,
        writable=True,
        resolve_path=True))
@click.option("--disable-auto-clean", is_flag=True)
@click.option("-r", "--force-remote", is_flag=True)
@click.option("-s", "--silent", is_flag=True)
@click.option("-v", "--verbose", is_flag=True)
def remote_run(**kwargs):
    pioplus_call(sys.argv[1:])


@cli.command("test", short_help="Remote Unit Testing")
@click.option("--environment", "-e", multiple=True, metavar="<environment>")
@click.option("--ignore", "-i", multiple=True, metavar="<pattern>")
@click.option("--upload-port")
@click.option("--test-port")
@click.option(
    "-d",
    "--project-dir",
    default=getcwd,
    type=click.Path(
        exists=True,
        file_okay=False,
        dir_okay=True,
        writable=True,
        resolve_path=True))
@click.option("-r", "--force-remote", is_flag=True)
@click.option("--without-building", is_flag=True)
@click.option("--without-uploading", is_flag=True)
@click.option("--verbose", "-v", is_flag=True)
def remote_test(**kwargs):
    pioplus_call(sys.argv[1:])


@cli.group("device", short_help="Monitor remote device or list existing")
def remote_device():
    pass


@remote_device.command("list", short_help="List remote devices")
@click.option("--json-output", is_flag=True)
def device_list(json_output):
    pioplus_call(sys.argv[1:])


@remote_device.command("monitor", short_help="Monitor remote device")
@click.option("--port", "-p", help="Port, a number or a device name")
@click.option(
    "--baud", "-b", type=int, default=9600, help="Set baud rate, default=9600")
@click.option(
    "--parity",
    default="N",
    type=click.Choice(["N", "E", "O", "S", "M"]),
    help="Set parity, default=N")
@click.option(
    "--rtscts", is_flag=True, help="Enable RTS/CTS flow control, default=Off")
@click.option(
    "--xonxoff",
    is_flag=True,
    help="Enable software flow control, default=Off")
@click.option(
    "--rts",
    default=None,
    type=click.IntRange(0, 1),
    help="Set initial RTS line state")
@click.option(
    "--dtr",
    default=None,
    type=click.IntRange(0, 1),
    help="Set initial DTR line state")
@click.option("--echo", is_flag=True, help="Enable local echo, default=Off")
@click.option(
    "--encoding",
    default="UTF-8",
    help="Set the encoding for the serial port (e.g. hexlify, "
    "Latin1, UTF-8), default: UTF-8")
@click.option("--filter", "-f", multiple=True, help="Add text transformation")
@click.option(
    "--eol",
    default="CRLF",
    type=click.Choice(["CR", "LF", "CRLF"]),
    help="End of line mode, default=CRLF")
@click.option(
    "--raw", is_flag=True, help="Do not apply any encodings/transformations")
@click.option(
    "--exit-char",
    type=int,
    default=3,
    help="ASCII code of special character that is used to exit "
    "the application, default=3 (Ctrl+C)")
@click.option(
    "--menu-char",
    type=int,
    default=20,
    help="ASCII code of special character that is used to "
    "control miniterm (menu), default=20 (DEC)")
@click.option(
    "--quiet",
    is_flag=True,
    help="Diagnostics: suppress non-error messages, default=Off")
@click.pass_context
def device_monitor(ctx, **kwargs):

    def _tx_target(sock_dir):
        try:
            pioplus_call(sys.argv[1:] + ["--sock", sock_dir])
        except exception.ReturnErrorCode:
            pass

    sock_dir = mkdtemp(suffix="pioplus")
    sock_file = join(sock_dir, "sock")
    try:
        t = threading.Thread(target=_tx_target, args=(sock_dir, ))
        t.start()
        while t.is_alive() and not isfile(sock_file):
            sleep(0.1)
        if not t.is_alive():
            return
        kwargs['port'] = open(sock_file).read()
        ctx.invoke(cmd_device_monitor, **kwargs)
        t.join(2)
    finally:
        util.rmtree_(sock_dir)
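
The hand-off at the end of device_monitor works through a side channel: the pioplus helper is expected to write the forwarded port name into a "sock" file inside a temporary directory, and the foreground thread polls for that file before invoking the local monitor. A minimal, standalone sketch of the same pattern follows, with a hypothetical start_helper() standing in for pioplus_call() and only the standard library assumed:

import os
import shutil
import tempfile
import threading
import time


def start_helper(sock_dir):
    # Hypothetical stand-in for pioplus_call(): publish the forwarded port
    # name by writing it into <sock_dir>/sock.
    with open(os.path.join(sock_dir, "sock"), "w") as fp:
        fp.write("/dev/ttyUSB0")


sock_dir = tempfile.mkdtemp(suffix="pioplus")
sock_file = os.path.join(sock_dir, "sock")
try:
    t = threading.Thread(target=start_helper, args=(sock_dir,))
    t.start()
    # Poll until the helper publishes the port or exits without doing so.
    while t.is_alive() and not os.path.isfile(sock_file):
        time.sleep(0.1)
    if os.path.isfile(sock_file):
        with open(sock_file) as fp:
            print("forwarded port:", fp.read())
    t.join(2)
finally:
    shutil.rmtree(sock_dir, ignore_errors=True)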


@@ -0,0 +1,13 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.


@@ -0,0 +1,13 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.


@@ -0,0 +1,91 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from twisted.internet import defer  # pylint: disable=import-error
from twisted.spread import pb  # pylint: disable=import-error


class AsyncCommandBase(object):

    MAX_BUFFER_SIZE = 1024 * 1024  # 1Mb

    def __init__(self, options=None, on_end_callback=None):
        self.options = options or {}
        self.on_end_callback = on_end_callback
        self._buffer = b""
        self._return_code = None
        self._d = None
        self._paused = False

        try:
            self.start()
        except Exception as e:
            raise pb.Error(str(e))

    @property
    def id(self):
        return id(self)

    def pause(self):
        self._paused = True
        self.stop()

    def unpause(self):
        self._paused = False
        self.start()

    def start(self):
        raise NotImplementedError

    def stop(self):
        self.transport.loseConnection()  # pylint: disable=no-member

    def _ac_ended(self):
        if self.on_end_callback:
            self.on_end_callback()
        if not self._d or self._d.called:
            self._d = None
            return
        if self._buffer:
            self._d.callback(self._buffer)
        else:
            self._d.callback(None)

    def _ac_ondata(self, data):
        self._buffer += data
        if len(self._buffer) > self.MAX_BUFFER_SIZE:
            self._buffer = self._buffer[-1 * self.MAX_BUFFER_SIZE :]
        if self._paused:
            return
        if self._d and not self._d.called:
            self._d.callback(self._buffer)
            self._buffer = b""

    def ac_read(self):
        if self._buffer:
            result = self._buffer
            self._buffer = b""
            return result
        if self._return_code is None:
            self._d = defer.Deferred()
            return self._d
        return None

    def ac_write(self, data):
        self.transport.write(data)  # pylint: disable=no-member
        return len(data)

    def ac_close(self):
        self.stop()
        return self._return_code
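
AsyncCommandBase defines the contract the remote agent's async commands share: a subclass implements start(), pushes output through _ac_ondata(), and signals completion via _ac_ended(); consumers then drive it with ac_read()/ac_write()/ac_close(), where ac_read() hands back buffered bytes immediately or a Deferred that fires on the next chunk. A toy subclass, not part of PlatformIO and shown only to illustrate that contract (Twisted must be installed for the base-class imports):

from platformio.commands.remote.ac.base import AsyncCommandBase


class EchoAsyncCmd(AsyncCommandBase):
    # Hypothetical example command: echoes written bytes back to readers.

    def start(self):
        # Nothing to launch; data is produced on demand in ac_write().
        pass

    def stop(self):
        # No transport to close for this toy command; just finish up.
        self._ac_ended()

    def ac_write(self, data):
        # Route incoming bytes through the shared buffering machinery so a
        # pending ac_read() Deferred (if any) fires immediately.
        self._ac_ondata(data)
        return len(data)


cmd = EchoAsyncCmd()
cmd.ac_write(b"hello")
assert cmd.ac_read() == b"hello"  # buffered data is returned synchronously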


@@ -0,0 +1,42 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os

from twisted.internet import protocol, reactor  # pylint: disable=import-error

from platformio.commands.remote.ac.base import AsyncCommandBase


class ProcessAsyncCmd(protocol.ProcessProtocol, AsyncCommandBase):
    def start(self):
        env = dict(os.environ).copy()
        env.update({"PLATFORMIO_FORCE_ANSI": "true"})
        reactor.spawnProcess(
            self, self.options["executable"], self.options["args"], env
        )

    def outReceived(self, data):
        self._ac_ondata(data)

    def errReceived(self, data):
        self._ac_ondata(data)

    def processExited(self, reason):
        self._return_code = reason.value.exitCode

    def processEnded(self, reason):
        if self._return_code is None:
            self._return_code = reason.value.exitCode
        self._ac_ended()
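
ProcessAsyncCmd is configured entirely through the options dict: start() spawns options["executable"] with options["args"] (argv-style, so args[0] conventionally repeats the program name) under the Twisted reactor, stdout and stderr are funnelled into the shared buffer, and the exit code is captured for ac_close(). A rough usage sketch outside the remote agent, assuming this class lives at platformio/commands/remote/ac/process.py and that a default Twisted reactor is available (Unix paths used for brevity):

from twisted.internet import reactor

from platformio.commands.remote.ac.process import ProcessAsyncCmd


def dump(chunk):
    # ac_read() callbacks receive buffered bytes, or None when the command
    # ended with an empty buffer.
    if chunk:
        print(chunk.decode(errors="replace"), end="")


def launch():
    cmd = ProcessAsyncCmd(
        options={"executable": "/bin/echo", "args": ["echo", "hello"]}
    )
    result = cmd.ac_read()
    if isinstance(result, bytes):
        dump(result)
    else:
        result.addCallback(dump)
    reactor.callLater(1.0, reactor.stop)


reactor.callWhenRunning(launch)
reactor.run()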


@@ -0,0 +1,66 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import json
import os
import zlib
from io import BytesIO

from platformio.commands.remote.ac.base import AsyncCommandBase
from platformio.commands.remote.projectsync import PROJECT_SYNC_STAGE, ProjectSync


class ProjectSyncAsyncCmd(AsyncCommandBase):
    def __init__(self, *args, **kwargs):
        self.psync = None
        self._upstream = None
        super(ProjectSyncAsyncCmd, self).__init__(*args, **kwargs)

    def start(self):
        project_dir = os.path.join(
            self.options["agent_working_dir"], "projects", self.options["id"]
        )
        self.psync = ProjectSync(project_dir)
        for name in self.options["items"]:
            self.psync.add_item(os.path.join(project_dir, name), name)

    def stop(self):
        self.psync = None
        self._upstream = None
        self._return_code = PROJECT_SYNC_STAGE.COMPLETED.value

    def ac_write(self, data):
        stage = PROJECT_SYNC_STAGE.lookupByValue(data.get("stage"))
        if stage is PROJECT_SYNC_STAGE.DBINDEX:
            self.psync.rebuild_dbindex()
            return zlib.compress(json.dumps(self.psync.get_dbindex()).encode())
        if stage is PROJECT_SYNC_STAGE.DELETE:
            return self.psync.delete_dbindex(
                json.loads(zlib.decompress(data["dbindex"]))
            )
        if stage is PROJECT_SYNC_STAGE.UPLOAD:
            if not self._upstream:
                self._upstream = BytesIO()
            self._upstream.write(data["chunk"])
            if self._upstream.tell() == data["total"]:
                self.psync.decompress_items(self._upstream)
                self._upstream = None
                return PROJECT_SYNC_STAGE.EXTRACTED.value
            return PROJECT_SYNC_STAGE.UPLOAD.value
        return None
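
Read from the agent side shown in ac_write() above, the UPLOAD stage implies a client that streams one compressed archive in pieces, each message carrying the stage value, the next chunk, and the total byte count, and that treats an EXTRACTED reply as completion. The following caller is only a hedged sketch of that exchange, with a hypothetical send() standing in for the actual agent RPC and an arbitrary chunk size:

from platformio.commands.remote.projectsync import PROJECT_SYNC_STAGE

CHUNK_SIZE = 64 * 1024  # hypothetical; the real chunk size is not shown here


def upload_archive(send, archive_bytes):
    # send(message) is a stand-in for whatever delivers the dict to
    # ProjectSyncAsyncCmd.ac_write() on the agent.
    total = len(archive_bytes)
    reply = None
    for offset in range(0, total, CHUNK_SIZE):
        reply = send({
            "stage": PROJECT_SYNC_STAGE.UPLOAD.value,
            "chunk": archive_bytes[offset:offset + CHUNK_SIZE],
            "total": total,
        })
    # Intermediate replies echo UPLOAD ("keep sending"); the final reply
    # should be EXTRACTED once tell() == total on the agent side.
    return reply == PROJECT_SYNC_STAGE.EXTRACTED.value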

Some files were not shown because too many files have changed in this diff.