Compare commits

..

487 Commits

Author SHA1 Message Date
Ivan Kravets
cc52890d45 Merge branch 'release/v4.3.1' 2020-03-20 15:13:46 +02:00
Ivan Kravets
b7b9ee5a80 Bump version to 4.3.1 2020-03-20 15:13:40 +02:00
Ivan Kravets
97a0cbdd18 Skip Click 7.1 and 7.1.1 on Windows due to broken releases 2020-03-20 15:11:14 +02:00
Ivan Kravets
b8f43732fe Docs: update What's PlatformIO and PIO IDE pages 2020-03-20 14:44:24 +02:00
Ivan Kravets
658b3df123 Fixed a TypeError "super(type, obj): obj must be an instance or subtype of type" when device monitor is used with a custom dev-platform filter // Resolve #3431 2020-03-20 13:56:30 +02:00
Ivan Kravets
d32312e738 Fixed an issue when lib_archive = no was not honored in "platformio.ini" 2020-03-20 13:34:35 +02:00
Ivan Kravets
20023f8d8a Bump version to 4.3.1a1 2020-03-20 13:02:11 +02:00
Ivan Kravets
6b2ff04bbf Fixed an error "SyntaxError: 'return' with argument inside generator" for PIO Unified Debugger when Python 2.7 is used 2020-03-20 13:01:33 +02:00
Ivan Kravets
d80a9c820d Merge branch 'release/v4.3.0' 2020-03-19 22:38:05 +02:00
Ivan Kravets
4b62af1675 Merge tag 'v4.3.0' into develop
Bump version to 4.3.0
2020-03-19 22:38:05 +02:00
Ivan Kravets
6414e1d9e3 Bump version to 4.3.0 2020-03-19 22:37:16 +02:00
Ivan Kravets
a55f04dc28 Warn when a socket for PIO Home cannot be allocated 2020-03-19 22:36:55 +02:00
Ivan Kravets
2d68e28a70 Fix auto-ready logic for debugging server 2020-03-19 21:33:23 +02:00
Ivan Kravets
4c2a157dce Bump version to 4.3.0rc1 2020-03-19 19:28:13 +02:00
Ivan Kravets
d9647dec95 Add support for debugging server "ready_pattern" 2020-03-19 19:17:54 +02:00
Ivan Kravets
15647c81f0 New standalone (1-script) PlatformIO Core Installer 2020-03-19 18:26:30 +02:00
Ivan Kravets
24a0d9123e Update history with initial support for Renode 2020-03-19 17:04:05 +02:00
Ivan Kravets
720c29350d Add docs for Renode debugging tool // Issue #3401 2020-03-19 16:58:18 +02:00
valeros
aa939b07b1 Update default init config for Renode 2020-03-19 16:17:51 +02:00
Ivan Kravets
0e3c3abf73 GDB init commands for Renode simulation framework // Issue #3401 2020-03-19 15:16:55 +02:00
Ivan Kravets
a8606f4efa Refactor debug GDB initial configurations 2020-03-19 14:49:25 +02:00
ShahRustam
475f898222 Replace installer script with a new one // Resolve #3420 (#3428)
* Replace installer script with a new one. Resolve #3420

* temp file name fix

* get-platformio.py script update.

* small fix
2020-03-19 13:26:51 +02:00
Ivan Kravets
69f5fdf8e1 Remove debug code 2020-03-19 01:05:12 +02:00
Ivan Kravets
fe1ad35cad Merge branch 'feature/issue-3401-renode-support' into develop 2020-03-19 00:50:21 +02:00
Ivan Kravets
352a0b7377 Wait for an output from debug server 2020-03-19 00:46:23 +02:00
Ivan Kravets
52689bc5e8 Wait until debug server is ready 2020-03-19 00:19:59 +02:00
Ivan Kravets
3dd3ea1c35 Show a hexadecimal representation of the data (code point of each character) with `hexlify` filter 2020-03-18 18:55:54 +02:00
Ivan Kravets
fff33d8c29 Do not send CR+NL for "send_on_enter" device monitor filter 2020-03-18 17:25:40 +02:00
Ivan Kravets
db9829a11e Sync docs 2020-03-18 00:36:07 +02:00
Ivan Kravets
9a1b5d869d Bump version to 4.3.0b2 2020-03-18 00:13:03 +02:00
Ivan Kravets
605cd36e27 Send a text to device on ENTER with `send_on_enter` filter // Resolve #926 2020-03-18 00:09:40 +02:00
Ivan Kravets
24a23b67dd Fix formatting issue 2020-03-17 23:10:06 +02:00
Ivan Kravets
0df72411a0 Device Monitor Filter API, implement "time" and "log2file" filters // Resolve #981 Resolve #670 2020-03-17 23:08:57 +02:00
Ivan Kravets
5a72033622 Fixed an issue when unknown transport is used for PIO Unit Testing // Resolve #3422 2020-03-17 17:42:54 +02:00
Ivan Kravets
4e6095ca13 Update docs and history 2020-03-17 17:39:11 +02:00
Matthew Mirvish
f81b0b2a84 Ensure all commands in compilation_commands.json use absolute paths. (#3415)
* Fix resolving of absolute path for toolchain

By placing the `where_is_program` call into this function, all references to the compiler are made absolute, instead of just the ones in the top environment. Previously, references to the compiler for user source code did not use the full path in the compilation database, which broke `clangd`'s detection of system includes.

* Linting issue
2020-03-17 16:30:28 +02:00
Ivan Kravets
314f634e16 Docs: Improvements for CLion docs. 2020-03-15 00:41:16 +02:00
Ivan Kravets
ba040ba2ba Docs: Workaround for ReadTheDocs bug 2020-03-14 20:32:42 +02:00
Ivan Kravets
a22ed40256 Added initial support for an official "PlatformIO for CLion IDE" plugin // Resolve #2201 2020-03-14 19:31:00 +02:00
Ivan Kravets
58a4ff8246 Skip broken Click 7.1 & 7.1.1, see Click's issue #1501 2020-03-14 12:18:00 +02:00
Ivan Kravets
9a5ebfb642 Bump version to 4.3.0b1 2020-03-12 15:10:25 +02:00
Ivan Kravets
5d0faaa5a8 Refactor docs structure 2020-03-12 15:09:20 +02:00
Ivan Kravets
108b892e30 Control device monitor output with filters and text transformations 2020-03-12 14:28:54 +02:00
Ivan Kravets
0ff37c9999 Implement universal "get_object_members" helper 2020-03-12 14:24:20 +02:00
Vojtěch Boček
8c3de609ab Add ESP crash trace decoding to monitor (#3383)
* Implement mechanism for adding platform filters into miniterm

Updates platformio/platform-espressif8266#31

* DeviceMonitorFilter: fixes for Windows and Python2
2020-03-11 13:22:01 +02:00
valeros
073efef2a1 Explicitly use Python-x64 with Appveyor CI 2020-03-10 15:54:01 +02:00
Ilia Motornyi
b9fd97dae4 Changes required for CLion PlatformIO plugin (#3298) 2020-03-09 15:47:41 +02:00
Ivan Kravets
60a7af6a8c Docs: Update recent articles 2020-03-09 14:58:35 +02:00
Ivan Kravets
0f02b3b653 Improved support for Arduino "library.properties" `depends` field 2020-03-07 17:44:28 +02:00
Ivan Kravets
620335631f Bump version to 4.2.2b1 2020-03-06 22:08:38 +02:00
Ivan Kravets
3ef96cb215 Minor fixes 2020-03-06 00:43:57 +02:00
Ivan Kravets
59e1c88726 Fixed an issue when `"libArchive": false` in "library.json" does not work // Resolve #3403 2020-03-06 00:37:48 +02:00
Ivan Kravets
3a27fbc883 Fixed an issue when Python 2 does not keep encoding when converting .INO file // Resolve #3393 2020-03-05 23:52:46 +02:00
Ivan Kravets
ce6b96ea84 Use native open/io.open for file contents reading/writing 2020-03-05 23:52:13 +02:00
Ivan Kravets
3275bb59bf Fix test 2020-03-04 18:14:51 +02:00
Ivan Kravets
fbb62fa8a6 Bump version to 4.2.2a3 2020-03-03 23:10:54 +02:00
Ivan Kravets
261c46d4ef Add support for Arm Mbed "module.json" `dependencies` field // Resolve #3400 2020-03-03 23:10:19 +02:00
Ivan Kravets
0c0ceb2caa Sync docs 2020-03-03 23:03:14 +02:00
Ivan Kravets
de60f20c21 Sync docs 2020-03-03 14:59:03 +02:00
valeros
314fe7d309 Initial support for menuconfig target 2020-03-03 00:58:07 +02:00
Ivan Kravets
a271143c52 Sync docs 2020-03-02 23:25:28 +02:00
Ivan Kravets
2d4a3db250 Fixed an issue with expanding $WORKSPACE_DIR for library manager 2020-02-29 23:08:08 +02:00
Ivan Kravets
7fba6f78d6 Bump version to 4.2.2a2 2020-02-29 21:59:58 +02:00
Ivan Kravets
eee12b9b66 Fixed an issue "the JSON object must be str, not 'bytes'" when PIO Home is used with Python 3.5 // Resolve #3396 2020-02-29 21:59:10 +02:00
Ivan Kravets
d3e151feeb Sync docs 2020-02-29 18:44:37 +02:00
Ivan Kravets
dd1fe74956 PyLint fix 2020-02-21 15:44:55 +02:00
Ivan Kravets
49aed34325 Rename PIO Plus to Professional 2020-02-21 15:44:24 +02:00
Ivan Kravets
81ba2a5a74 Sync docs 2020-02-20 18:22:12 +02:00
Ivan Kravets
1c87f83463 Parse package dependencies declared as a list of strings 2020-02-18 21:55:01 +02:00
Ivan Kravets
e15f227c48 Docs: Sync Atmel SAM dev-platform 2020-02-18 14:45:54 +02:00
Ivan Kravets
ea5f2742f8 Bump version to 4.2.2a1 2020-02-18 00:05:20 +02:00
Ivan Kravets
9fd0943b75 Fixed an issue when quitting from PlatformIO IDE does not shutdown PIO Home server 2020-02-18 00:03:23 +02:00
Ivan Kravets
80acd52fc2 Merge branch 'release/v4.2.1' 2020-02-17 14:25:27 +02:00
Ivan Kravets
b8312d545c Merge tag 'v4.2.1' into develop
Bump version to 4.2.1
2020-02-17 14:25:27 +02:00
Ivan Kravets
82f36a1ac3 Bump version to 4.2.1 2020-02-17 14:25:20 +02:00
Ivan Kravets
9f7c827572 Resolve absolute path of toolchain when generating compilation database 2020-02-17 13:52:25 +02:00
Ivan Kravets
6328206e78 Fixed an issue when generating the compilation database "compile_commands.json" does not work with Python 2.7 // Resolve #3378 2020-02-17 13:05:01 +02:00
Valerii Koval
154be7fa81 Improve VSCode template structure (#3385)
* Switch to click argument parser

* Typo fix

* Tidy up VSCode template

Co-authored-by: Ivan Kravets <me@ikravets.com>
2020-02-17 12:19:00 +02:00
Ivan Kravets
b8c9eee8af Force docs to HTTPS 2020-02-16 21:25:30 +02:00
Ivan Kravets
43664672fc Bump version to 4.2.1a3 2020-02-14 22:59:16 +02:00
Ivan Kravets
5cc9a328ab Fixed an issue when Library Dependency Finder (LDF) ignores custom "libLDFMode" and "libCompatMode" options in library.json 2020-02-14 22:57:51 +02:00
Ivan Kravets
6556c37e58 Bump version to 4.2.1a2 2020-02-14 20:57:40 +02:00
valeros
292049199a Add new record to history log 2020-02-14 17:24:52 +02:00
valeros
ed4452b115 Get rid of direct imports 2020-02-14 17:09:48 +02:00
valeros
fbfbf340c1 Add "forceInclude" field to VSCode template
VSCode doesn't recognize header files included via the "-include" flag in the "compilerArgs" field.
Instead, absolute paths to these files should be specified in a special "forceInclude" section.
2020-02-14 16:43:20 +02:00
Ivan Kravets
a57ea79bf8 Froze "marshmallow" dependency to 2.X for Python 2 // Resolve #3380 2020-02-14 13:49:41 +02:00
Ivan Kravets
22e8e02f3d Automatically rebuild contrib-pysite package when import fails // Resolve #3313 2020-02-13 22:06:46 +02:00
Ivan Kravets
a10625a052 Automatically rebuild contrib-pysite package when import fails // Issue #3313 2020-02-13 15:53:42 +02:00
Ivan Kravets
42020e2498 Bump version to 4.2.1a1 2020-02-13 13:35:38 +02:00
Ivan Kravets
36a2228220 Fixed "TypeError: unsupported operand type(s)" when system environment variable is used by project configuration parser // Resolve #3377 2020-02-13 13:34:34 +02:00
Ivan Kravets
206054b35f Docs: Sync dev-platforms 2020-02-12 23:58:39 +02:00
Ivan Kravets
b87048020d Add warning that overriding board data will not work for device monitor command // Issue #3349 2020-02-12 21:08:08 +02:00
Ivan Kravets
f7d4bf5fa8 Merge branch 'release/v4.2.0' 2020-02-12 16:42:13 +02:00
Ivan Kravets
f0bf531e1b Merge tag 'v4.2.0' into develop
Bump version to 4.2.0
2020-02-12 16:42:13 +02:00
Ivan Kravets
176cf17f9f Bump version to 4.2.0 2020-02-12 16:42:06 +02:00
Ivan Kravets
b41262a20e Fix broken "init" command 2020-02-12 16:34:38 +02:00
Ivan Kravets
d0a6861369 Fix "TypeError: TypeError: write() argument 1 must be unicode" when generating project on Windows/Python 2 2020-02-12 15:14:58 +02:00
Ivan Kravets
0bc872eafd Remap command before calling Click 2020-02-12 13:47:42 +02:00
Ivan Kravets
edc2d10574 Update SPDX License List to 3.8 2020-02-11 18:59:35 +02:00
Ivan Kravets
a033e2d1fe Docs: Update supported upload/debug protocols by Nuclei dev-platform 2020-02-11 18:37:48 +02:00
Ivan Kravets
c0aefe4c62 Add support for Nuclei RISC-V Processor development platform // Resolve #3340 2020-02-10 18:52:31 +02:00
Ivan Kravets
86069ab7c6 Typo fix 2020-02-08 21:37:18 +02:00
Ivan Kravets
86f2dde6f3 Do not overwrite custom items in VSCode's "extensions.json" // Resolve #3374 2020-02-08 21:36:32 +02:00
Ivan Kravets
96f60aab66 Bump version to 4.2.0rc2 2020-02-08 20:35:43 +02:00
Ivan Kravets
2763853d8d Fixed an issue when no error is raised if referred parameter (interpolation) is missing in a project configuration file // Resolve #3279 2020-02-08 19:10:48 +02:00
Ivan Kravets
a78b461d45 Code formatting 2020-02-08 19:10:15 +02:00
Ivan Kravets
2fa20b5c52 Typo fix 2020-02-07 11:28:32 +02:00
Valerii Koval
0b0b63aa7d Update templates for Atom, VSCode, CLion (#3371)
* Wrap flags with whitespace chars when exporting data for IDEs

* Update IDEs templates

Take into account compiler flags that can contain whitespace characters (e.g. -iprefix)

* Update template for VSCode

* Add history record
2020-02-07 11:26:45 +02:00
valeros
fe174e35c8 Suppress warnings from packages dir instead of core dir for CppCheck 2020-02-06 23:48:13 +02:00
Ivan Kravets
88e40e28fc Bump version to 4.2.0rc1 2020-02-06 23:33:27 +02:00
Ivan Kravets
73ce3c94e9 Initial support for Project Manager // Resolve #3335 2020-02-06 23:32:43 +02:00
valeros
d2033aacea Remove the entire folder with temp files used by pvs-studio tool 2020-02-06 23:21:27 +02:00
Mark Harfouche
dfb47a089b Add license file to sdist using MANIFEST.in (#3325) 2020-02-06 17:48:08 +02:00
Ivan Kravets
81960ce051 Fix test 2020-02-06 17:41:35 +02:00
Ivan Kravets
2ecceb8ed2 Generate absolute path for compilation DB item 2020-02-06 17:30:55 +02:00
Ivan Kravets
00a9a2c04d Generate compilation database "compile_commands.json" // Resolve #2990 2020-02-06 17:19:48 +02:00
Valerii Koval
7716fe1d1c Autodetect monitor port for boards with specified HWIDs // Resolves #3349 (#3369)
* Autodetect serial port for boards with specified hwids

* Add new record to history log
2020-02-05 22:33:05 +02:00
Ivan Kravets
09b3df5520 Fixed a "UnicodeDecodeError" when listing built-in libraries on macOS with Python 2.7 // Resolve #3370 2020-02-05 22:25:06 +02:00
Ivan Kravets
ee2e4896d2 Fixed an issue when Project Inspector crashes when flash use > 100% // Resolve #3368 2020-02-05 18:15:44 +02:00
Ivan Kravets
390f1935d6 Fix parsing dependency in a legacy format 2020-02-05 15:43:39 +02:00
Ivan Kravets
365c3eaf4b Fixed an issue when invalid CLI command does not return non-zero exit code 2020-02-05 15:31:32 +02:00
Ivan Kravets
b83121a951 Fix package parser test 2020-02-05 14:34:40 +02:00
Ivan Kravets
efe8e599fd Added support for Arduino's library.properties `depends` field // Resolve #2781 2020-02-05 00:04:16 +02:00
valeros
bc69259dd1 Update tool-unity package to v2.5.0 // Resolve #3362 2020-01-31 15:10:45 +02:00
Ivan Kravets
607e8eb477 Improve detecting of active Internet connection // Resolve #3359 2020-01-29 18:54:30 +02:00
Ivan Kravets
139171a79f Sync docs 2020-01-29 18:53:33 +02:00
Ivan Kravets
848e525919 Bump PIO Home to 3.1.0-rc.2 2020-01-26 00:31:55 +02:00
Ivan Kravets
b805822eea Remove debug code 2020-01-25 20:51:20 +02:00
Ivan Kravets
13e9306753 Bump version to 4.1.1b9 2020-01-25 20:50:26 +02:00
Ivan Kravets
880d5bb8b0 Fix project saving 2020-01-25 20:47:10 +02:00
Ivan Kravets
f9de23b16f Bump version to 4.1.1b8 2020-01-25 15:48:14 +02:00
Ivan Kravets
e5aa71e4e1 Fix config saving when PY2 is used 2020-01-25 15:47:45 +02:00
Ivan Kravets
ba441ca77c Remove double blank lines when saving project config // Resolve #3293 2020-01-24 22:15:33 +02:00
Ivan Kravets
decf482367 Use io.DEFAULT_BUFFER_SIZE as default chunk size 2020-01-24 22:05:24 +02:00
Ivan Kravets
253e4b13b5 Disable file buffering when calculating the checksum of a downloaded file 2020-01-24 21:40:12 +02:00
Ivan Kravets
04ca6621f1 Verify downloaded package checksum with native Python's hashlib and support all OS 2020-01-24 20:43:05 +02:00
Ivan Kravets
20334772b5 Install a dev-platform with ALL declared packages using a new `--with-all-packages` option // Resolve #3345 2020-01-24 20:20:56 +02:00
Ivan Kravets
a62bc3846e Update IPs when checking for Internet 2020-01-24 19:50:55 +02:00
Ivan Kravets
3b092f28c3 Added support for "pythonPackages" in platform.json 2020-01-24 19:47:47 +02:00
Ivan Kravets
2de46f545f Formatter 2020-01-24 19:47:20 +02:00
Ivan Kravets
8fce660afa Add notice about VID:PID pairs // Resolve #3349 2020-01-24 14:47:23 +02:00
Ivan Kravets
dbeffe426f Update docs for CI 2020-01-23 22:48:22 +02:00
Ivan Kravets
d4c42bd546 Added support for PVS-Studio static code analyzer 2020-01-23 20:51:34 +02:00
Ivan Kravets
fad5d1d744 Sync docs 2020-01-23 14:06:36 +02:00
Valerii Koval
46a9c1b6b2 Add initial support for PVS-Studio check tool (#3357)
* Add initial support for PVS-Studio check tool

* Enable all available PVS-Studio analyzers by default

* Add tests for PVS-Studio check tool

* Improve handling check tool extra flags that contain colon symbol
2020-01-23 12:57:54 +02:00
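As context for this check tool, here is a hedged `platformio.ini` sketch of how PVS-Studio can be selected for `pio check`; the environment, platform, board, and the analyzer flag below are illustrative placeholders rather than values taken from this commit:

```ini
; Hedged sketch: selecting PVS-Studio for "pio check".
; check_flags uses the "<tool>: <flags>" prefix syntax whose colon handling
; this commit improves; the flag value itself is a placeholder.
[env:nucleo_f401re]
platform = ststm32
board = nucleo_f401re
framework = stm32cube
check_tool = pvs-studio
check_flags =
  pvs-studio: --exclude-path=lib/third_party
```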
Max Prokhorov
5ac1e9454f pio-test: pass --verbose to the run command context (#3338)
* pio-test: pass --verbose to the run command context

* restore old output behavior
2020-01-23 12:56:08 +02:00
Ivan Kravets
17f9d57207 Control debug flags and optimization level with a new "debug_build_flags" option 2020-01-22 22:20:24 +02:00
Valerii Koval
5bdec19f31 Add new debug_build_flags option (#3355)
This will allow users to override the default debug flags, for example in cases when
the final binary built in debug mode is too large to be loaded on the target
2020-01-22 20:41:42 +02:00
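A hedged `platformio.ini` sketch of the option introduced here; the environment, platform, board, and flag values are illustrative placeholders, not PlatformIO defaults:

```ini
; Hedged sketch: override the default debug flags so that a debug build
; still fits on a small target (all values are illustrative).
[env:uno_debug]
platform = atmelavr
board = uno
framework = arduino
build_type = debug
debug_build_flags = -Os -g2
```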
Ivan Kravets
90b80083e8 Sync docs 2020-01-22 12:58:16 +02:00
Ivan Kravets
8d02e8b8f7 Docs: Update FAQ for installing Python 2020-01-16 11:55:58 +02:00
Ivan Kravets
7e41841a74 Allow to call Downloader API in silent mode 2020-01-11 15:23:36 +02:00
Ivan Kravets
0f296e7e37 Skip broken example definitions in package manifest 2020-01-10 21:28:19 +02:00
Ivan Kravets
9c32ff278c Sync docs 2020-01-09 23:32:55 +02:00
Ivan Kravets
60139035d8 Include hidden files by default in a package 2020-01-05 18:29:19 +02:00
Ivan Kravets
5ab34436ec Cleanup package file name when packing 2020-01-04 23:48:06 +02:00
Ivan Kravets
178080fd12 Improve boards search 2020-01-04 13:53:08 +02:00
Ivan Kravets
915b9145f6 Filter boards by ID+Name 2020-01-04 11:46:25 +02:00
Ivan Kravets
6020faf970 Docs: Add "v" suffix to SparkFun RISC-V board ids 2020-01-04 11:38:13 +02:00
Ivan Kravets
ec82fc82a2 Support packages with nested folders and with a custom "root" 2020-01-03 22:55:56 +02:00
Ivan Kravets
8d7b775875 Implement package packer 2020-01-03 15:52:54 +02:00
Ivan Kravets
682114d6f1 Bump version to 4.1.1b7 2019-12-31 20:11:14 +02:00
Ivan Kravets
0ac5cd6789 Skip empty "export" values when parsing library.json manifest 2019-12-31 10:21:20 +02:00
Ivan Kravets
209040fdc1 Initial support for Zephyr framework // Issue #1613 2019-12-30 14:11:27 +02:00
Ivan Kravets
643f290057 Fix issue with library.properties manifest parser when author email is not specified 2019-12-29 17:43:50 +02:00
Ivan Kravets
ea0b462d0b PyLint fixes 2019-12-29 14:18:43 +02:00
Ivan Kravets
442a7e3576 Made package ManifestSchema compatible with marshmallow >= 3 // Resolve #3296 2019-12-29 13:13:04 +02:00
Ivan Kravets
f7385e8e88 Bump version to 4.1.1b6 2019-12-25 00:44:01 +02:00
Ivan Kravets
20a10c7fc5 Fixed an issue "Import of non-existent variable 'projenv''" // Resolve #3315 2019-12-24 23:43:21 +02:00
Ivan Kravets
2f05040081 Fixed an issue with LDF when header files not found if "libdeps_dir" is within a subdirectory of "lib_extra_dirs" // Resolve #3311 2019-12-24 14:36:44 +02:00
Ivan Kravets
7e0fb43dbe Normalise repository url when parsing package manifest 2019-12-22 23:45:12 +02:00
Ivan Kravets
26e7069099 Fixed default PIO Unified Debugger configuration for J-Link probe 2019-12-22 12:33:08 +02:00
Ivan Kravets
7d90c468ae Updated SCons tool to 3.1.2 2019-12-22 01:27:51 +02:00
Ivan Kravets
f9494c940e Add support for "downloadUrl" in package manifest 2019-12-20 20:20:09 +02:00
Ivan Kravets
135ef8701c Better representation of dependent packages 2019-12-20 01:19:23 +02:00
Ivan Kravets
1bd6e898ad Fix tests 2019-12-18 14:03:10 +02:00
Ivan Kravets
b15ddc00a5 Pass extra files with compiler options to VSCode IntelliSense tool 2019-12-17 22:40:28 +02:00
Ivan Kravets
b4088a6d00 Drop support for Samsung Artik dev/platform // Resolve #3309 2019-12-17 22:39:15 +02:00
Ivan Kravets
c57f68aee3 Change order between board/framework for configuration file 2019-12-17 15:22:02 +02:00
Ivan Kravets
31eed6c5e5 Improve detection of a running PIO Home Server 2019-12-09 14:47:00 -08:00
Ivan Kravets
09852dcada Sync docs 2019-12-09 14:46:28 -08:00
Ivan Kravets
5768fcd429 Inform that PlatformIO Home server is already started in another process 2019-12-02 19:34:19 +02:00
Ivan Kravets
d901cc875a Minor improvements for docs 2019-12-02 19:28:21 +02:00
Ivan Kravets
4f1ccfe58f Bump version to 4.1.1b5 2019-11-28 18:58:09 +02:00
Ivan Kravets
f852f9fa89 Small refactoring 2019-11-28 18:23:00 +02:00
Ivan Kravets
5c388d4271 Fix issue with invalid check of the port used by the PIO Home server 2019-11-28 18:14:53 +02:00
Ivan Kravets
f9cae60225 Fix tests 2019-11-28 18:14:10 +02:00
Ivan Kravets
49ceadc6ad Move exceptions to their components 2019-11-28 16:15:54 +02:00
Ivan Kravets
47469e8759 Fix issue when None value is passed to config.set 2019-11-27 18:08:32 +02:00
Ivan Kravets
742f60b48d Bump version to 4.1.1b4 2019-11-26 18:00:33 +02:00
Ivan Kravets
0615ff6dd8 Docs: Change "Global/Local scope" configuration to "Common/Environment" 2019-11-26 17:40:09 +02:00
Ivan Kravets
2fbe33bca0 Move "Board" Options to "Platform 2019-11-26 14:44:54 +02:00
Ivan Kravets
fdd73552ea Docs for Zephyr // Issue #1613 2019-11-26 14:44:29 +02:00
Ivan Kravets
17b4f0b4dd Docs: Sync Atmel AVR dev/platform 2019-11-25 13:56:33 +02:00
Ivan Kravets
f87322591d Bump version to 4.1.1b3 2019-11-22 20:49:44 +02:00
Ivan Kravets
47099190f4 Bump PIO Home version to 3.1.0-beta.2 2019-11-22 20:49:09 +02:00
Ivan Kravets
9cf777b4e5 Fixed an issue with "start-group/end-group" linker flags on Native development platform // Resolve #3282 2019-11-21 14:19:32 +02:00
valeros
8675d3fa46 Use proper CMake variables for setting compilation flags 2019-11-21 00:39:05 +02:00
Ivan Kravets
2b9d8dba5f Bump version to 4.1.1b2 2019-11-20 23:32:19 +02:00
Ivan Kravets
b37814976c Fix project generator for CLion 2019-11-20 23:31:35 +02:00
Ivan Kravets
895aa0cb34 Update PlatformIO slogan 2019-11-19 20:01:50 +02:00
Ivan Kravets
18a9866a02 An example with custom debug flags in advanced scripting 2019-11-19 19:56:10 +02:00
Ivan Kravets
88ead0aed1 Catch all debug errors when killing debug server 2019-11-19 18:55:21 +02:00
Ivan Kravets
f19c7dc575 Support for Atmel megaAVR 2019-11-18 17:59:19 +02:00
Ivan Kravets
7eab5d567e Fix CLion generator when one env is used 2019-11-17 00:35:56 +02:00
Ivan Kravets
5b1c6daa2a Fix a space in config header 2019-11-16 23:09:18 +02:00
Ivan Kravets
464e1a509f Add PIO Home "os.get_file_mtime" RPC method 2019-11-16 22:57:56 +02:00
Ivan Kravets
c1394b290d Fix issue with unknown dev/platform when symlink is used 2019-11-16 22:43:25 +02:00
Ivan Kravets
0029c7fe09 Advanced Serial Monitor with UI 2019-11-16 17:45:57 +02:00
Ivan Kravets
e9f9871c1e Show the user the last exception when a package can't be installed 2019-11-16 17:25:27 +02:00
Ivan Kravets
d1b46c838e Revert back "Flash" instead of "ROM" 2019-11-16 14:09:16 +02:00
Ivan Kravets
a7b9187234 Bump version to 4.1.1b1 2019-11-15 20:05:52 +02:00
Ivan Kravets
c7202154de Project Manager and Project Configuration UI for "platformio.ini" 2019-11-15 20:05:09 +02:00
Ivan Kravets
6809da0353 Replace os.path.abspath by realpath 2019-11-15 16:02:15 +02:00
Ivan Kravets
fbdcbf17c7 Rename Data/Flash to RAM/ROM 2019-11-15 15:52:39 +02:00
Ivan Kravets
44a9de6dcb Pass -m and -i flags to VSCode Intellisense analyzer 2019-11-15 15:11:13 +02:00
Ivan Kravets
a077081e46 Init generic C/C++ SCons tools by default 2019-11-15 15:10:43 +02:00
Ivan Kravets
728fd7f5b9 Catch exception when a default locale cannot be determined 2019-11-13 16:48:04 +02:00
Ivan Kravets
053160a6eb Sync docs 2019-11-13 16:47:39 +02:00
Ivan Kravets
9bbaba3d59 Bump version to 4.1.1a2 2019-11-13 15:35:32 +02:00
Ivan Kravets
b1577d101c Update PIO Home to 3.0.1 2019-11-13 15:32:57 +02:00
Ivan Kravets
53e6cf3e4a Drop support for "project_dirs" argument to Project RPC 2019-11-13 15:32:14 +02:00
Ivan Kravets
a9f9f4ef04 Fixed an issue when `env.BoardConfig()` does not work for custom boards in extra scripts of libraries // Resolve #3264 2019-11-12 23:52:43 +02:00
Ivan Kravets
15f142fc70 Ensure content cache key does not contain path separators 2019-11-12 23:48:39 +02:00
Ivan Kravets
6e9429dbbf Split BuildProgram into ProcessProgramDeps and ProcessProjectDeps 2019-11-12 18:52:25 +02:00
Ivan Kravets
be628051a7 Fix typo 2019-11-12 18:35:07 +02:00
Ivan Kravets
f0eb177a8e Check if directory exists before fetching project info 2019-11-12 18:32:10 +02:00
Ivan Kravets
7c481291dc Warn about broken library manifest when scanning dependencies // Resolve #3268 2019-11-12 18:14:06 +02:00
Ivan Kravets
f1d20f591a Sync docs 2019-11-12 13:41:54 +02:00
Ivan Kravets
c6a8e03367 Fixed an issue when `env.BoardConfig()` does not work for custom boards in extra scripts of libraries // Resolve #3264 2019-11-12 13:41:39 +02:00
Ivan Kravets
cbb7869da1 Fixed an issue with the broken latest news for PIO Home 2019-11-12 13:09:35 +02:00
Ivan Kravets
1f796ca0e5 Bump version to 4.1.1a1 2019-11-11 23:22:17 +02:00
Ivan Kravets
703b29a05e Fixed missed descriptions for project options 2019-11-11 23:19:47 +02:00
Ivan Kravets
56ceee220b Fix invalid build status for unit test when remote is used 2019-11-11 22:48:29 +02:00
Ivan Kravets
0328037b49 Handle project configuration (monitor, test, and upload options) for PIO Remote commands // Resolve #2591 2019-11-11 22:38:16 +02:00
Ivan Kravets
3c796ca7c8 Cosmetic fixes 2019-11-08 17:40:11 +02:00
Ivan Kravets
e6e14be528 Fix framework name shakti-sdk 2019-11-07 16:58:57 +02:00
Ivan Kravets
f42d1a89f2 Merge tag 'v4.1.0' into develop
Bump version to 4.1.0
2019-11-07 16:54:21 +02:00
Ivan Kravets
5a89388fb0 Merge branch 'release/v4.1.0' 2019-11-07 16:54:20 +02:00
Ivan Kravets
d043412e0f Bump version to 4.1.0 2019-11-07 16:54:12 +02:00
Ivan Kravets
71168b1a5f Replace IoT with Embedded 2019-11-07 16:49:34 +02:00
Ivan Kravets
95c1b0214c Rename "check_pattern" to "check_patterns" 2019-11-07 15:24:47 +02:00
Ivan Kravets
2408c0a4c7 Fix incorrect info about build_type 2019-11-06 23:53:38 +02:00
Ivan Kravets
4b3f593df9 Bump version to 4.1.0rc9 2019-11-06 23:24:49 +02:00
Ivan Kravets
67aea4db3f Fix issue with GDB/MI Stream Records for PIO Debugger 2019-11-06 22:30:58 +02:00
Ivan Kravets
6b44a8ae75 Use BuildAsyncPipe only if TTY stream 2019-11-06 22:25:14 +02:00
Ivan Kravets
70f4fa2665 Remove platformio.description if none value is passed 2019-11-06 20:06:30 +02:00
valeros
17ff3250c9 Export uppercased driver name in sizedata report on Windows 2019-11-06 12:34:43 +02:00
Ivan Kravets
2cce47a13d Fix "pio_reset_run_target" for JLink debug probe 2019-11-06 12:20:42 +02:00
Ivan Kravets
c1f62f8ead Bump version to 4.1.0rc8 2019-11-06 01:01:07 +02:00
Ivan Kravets
e0c174b9b6 Improve PIO Home Misc RPC 2019-11-06 00:48:28 +02:00
Ivan Kravets
e5ec4de3a4 Extend PIO Home RPC with "project.config_update_description(path, text)" method 2019-11-06 00:05:17 +02:00
Ivan Kravets
bcf09964ab Better formatting for multi-line values in config option 2019-11-06 00:03:08 +02:00
Ivan Kravets
f3992f8e53 Create dummy target for CLion 2019-11-05 15:59:06 +02:00
Ivan Kravets
66cc557d2f Export SVD Path to CLion 2019-11-05 15:16:50 +02:00
Ivan Kravets
9786b3e1b9 Fix CLion integration when project name contains a space 2019-11-05 15:16:35 +02:00
Ivan Kravets
30bc691c95 Fix test with a missed library 2019-11-05 12:08:29 +02:00
Ivan Kravets
83110326f4 Rename "check_pattern" option 2019-11-05 12:02:12 +02:00
valeros
182835fabf Rename check_patterns option to check_pattern 2019-11-05 11:36:20 +02:00
Ivan Kravets
7345d3ea19 Improve dump of config data 2019-11-05 00:17:39 +02:00
Ivan Kravets
3f4aa320c2 Sync docs 2019-11-04 21:52:42 +02:00
valeros
dfd853fa87 Update tests for check command with a new flag "pattern" that supersedes "filter" 2019-11-04 21:34:39 +02:00
Ivan Kravets
3289e84b21 Refactor PIO Check from "check_filters" to "check_patterns" 2019-11-04 18:22:28 +02:00
Ivan Kravets
39639d45fe Bump version to 4.1.0rc7 2019-11-04 15:36:45 +02:00
Ivan Kravets
b45abf67a5 Fix broken debug configuration 2019-11-04 15:36:23 +02:00
Ivan Kravets
e57871cab7 Print a building mode 2019-11-03 22:22:40 +02:00
Ivan Kravets
484ea15959 Bump version to 4.1.0rc6 2019-11-02 23:14:16 +02:00
Ivan Kravets
40109263f0 Fix initial debug configuration for J-Link 2019-11-02 23:09:56 +02:00
Ivan Kravets
b45261c3dc Change initial debug configuration to: reset/halt, load, init break points 2019-11-02 22:56:11 +02:00
Ivan Kravets
73bcf18498 Fix broken debugger 2019-11-02 22:54:57 +02:00
Ivan Kravets
0dcc6f350d Bump version to 4.1.0rc5 2019-11-02 19:49:34 +02:00
Ivan Kravets
0488cc4086 Typo fix 2019-11-02 19:48:41 +02:00
Ivan Kravets
7784743cb1 Switch to default values from project configuration options 2019-11-02 19:44:28 +02:00
Ivan Kravets
0a4bc1d4e3 Add "description" for project config options, configure "default" values 2019-11-02 19:41:39 +02:00
Ivan Kravets
3630084a64 Docs: Sync Kendryte K210 dev/platform 2019-11-01 18:53:00 +02:00
Ivan Kravets
53c561e895 Bump version to 4.1.0rc4 2019-11-01 18:33:21 +02:00
Ivan Kravets
88db253515 Ignore duplicate library storages 2019-11-01 18:28:20 +02:00
Ivan Kravets
da928efb43 Added "--shutdown-timeout" option to PIO Home Server 2019-11-01 14:40:03 +02:00
Ivan Kravets
cd3d638337 Disable parsing of extra configs for PIO Home Project RPC load/dump methods 2019-11-01 12:05:13 +02:00
Ivan Kravets
3de2d84e2b Fixed an issue with a GCC Linter for PlatformIO IDE for Atom // Resolve #3218 2019-10-31 22:42:22 +02:00
Ivan Kravets
1d5d09feab Fixed an issue when Project Config Parser does not remove in-line comments when Python 3 is used // Resolve #3213 2019-10-31 22:04:57 +02:00
Ivan Kravets
2c2b419685 Docs: Sync nRF52 dev/platform 2019-10-31 18:52:26 +02:00
Ivan Kravets
a7f8838d9a Format code 2019-10-31 18:52:13 +02:00
Ivan Kravets
a18f8b2a4c Use default values from project options 2019-10-31 15:28:02 +02:00
Ivan Kravets
9b65a091da Export config dump/load and schema to PIO Home Project.RPC 2019-10-31 15:27:34 +02:00
Ivan Kravets
8ccf9d2e53 Implement project config "update" with "clear" option 2019-10-31 15:26:34 +02:00
Ivan Kravets
cd6137bdb0 Bump version to 4.1.0rc3 2019-10-31 00:43:44 +02:00
Ivan Kravets
6d69c25a2f Use locale encoding to decode subprocess output // Resolve #2890 2019-10-30 20:43:37 +02:00
Ivan Kravets
7b6bab7f4e Update memory usage banner 2019-10-30 20:40:26 +02:00
Ivan Kravets
257a8c63d2 Sync docs 2019-10-30 20:28:38 +02:00
Ivan Kravets
3146ab5d12 Allow export project config data as Tuple 2019-10-30 19:09:32 +02:00
Ivan Kravets
2d4722477e Automatically shutdown PIO Home server when no clients for 1 hour 2019-10-30 18:58:49 +02:00
valeros
d815daed29 Allow specifying defect level that will cause failure 2019-10-30 13:38:46 +02:00
valeros
c4e7674585 Don't pass header files to Cppcheck 2019-10-30 12:23:33 +02:00
Ivan Kravets
94f565db84 Show warning that the IDE must be restarted for PIO Home changes to take effect 2019-10-29 18:11:09 +02:00
Ivan Kravets
6e03aa3a3d Bump version to 4.1.0rc2 2019-10-29 18:03:07 +02:00
Ivan Kravets
737c29b510 Update PIO Home to 3.0.0-beta.2 2019-10-29 18:02:32 +02:00
Ivan Kravets
0222c56c4d Use file system encoding when decoding subprocess output // Resolve #2890 2019-10-29 17:43:48 +02:00
Ivan Kravets
8d0584aa59 Add new IDE RPC "open_text_document" method for PIO Home 2019-10-29 17:37:09 +02:00
Ivan Kravets
7dbeab11a5 Add new OS.RPC "open_file" method for PIO Home 2019-10-29 17:36:36 +02:00
Ivan Kravets
7cad06ea18 Minor test fixes 2019-10-29 17:12:18 +02:00
Ivan Kravets
3236fb6b3d Return file+line for sizedata instead of "location" 2019-10-29 17:01:20 +02:00
valeros
0194e09410 Use simple abspath to get absolute path to file with defect 2019-10-28 18:37:14 +02:00
valeros
187e30d055 Export full path to file with defect 2019-10-28 18:30:22 +02:00
valeros
39a7062503 Fix types of defect fields column and line 2019-10-28 18:12:39 +02:00
Ivan Kravets
4ff7c868ef Sync docs 2019-10-28 16:34:26 +02:00
valeros
5573c3871c Allow cppcheck to suppress individual defects by default 2019-10-28 13:38:46 +02:00
valeros
d620579247 Fix tests for check command according to updated exit codes 2019-10-25 21:04:30 +03:00
valeros
48651286b6 Automatically detect C++ standard version when invoking cppcheck 2019-10-25 20:59:36 +03:00
valeros
0e7a2b3141 Automatically detect source files language when invoking cppcheck 2019-10-25 20:08:04 +03:00
Ivan Kravets
f3b8ae4224 Bump version to 4.1.0rc1 2019-10-25 20:02:01 +03:00
Ivan Kravets
a2451a716d PIO Home 3.0 with Project Inspect 2019-10-25 20:01:31 +03:00
Ivan Kravets
2e5dabb913 Fix issue with custom board_ options 2019-10-25 19:33:22 +03:00
valeros
4e43e7d3c3 Fix code formatting 2019-10-25 17:43:52 +03:00
Ivan Kravets
49acf4bdb9 Minimum supported version of PIO Plus Core is 2.5.8 2019-10-25 17:27:51 +03:00
valeros
f3d8c30f95 Skip ignored environments when exporting check report in JSON format 2019-10-25 15:50:19 +03:00
valeros
4486a85d4c Introduce new flag --fail-on-defect to pio check 2019-10-25 15:40:50 +03:00
Ivan Kravets
8a6892bf3c Fixed an issue with invalid encoding when generating project for Visual Studio // Issue #3183 2019-10-25 14:33:22 +03:00
valeros
087a8f6dd0 Fix Visual Studio template files encoding // Resolve #3183 2019-10-25 14:27:47 +03:00
Ivan Kravets
5e681ec03c ProjectRPC.config_call accepts first argument as dict/kwargs for Config.init 2019-10-25 14:01:46 +03:00
Ivan Kravets
784a5cd349 Add support for "Build Middlewares" 2019-10-25 00:33:04 +03:00
valeros
5345dd2674 Give a proper name to method that converts defect item to dict 2019-10-24 21:35:04 +03:00
valeros
8127fd9960 Export correct stats for each check tool 2019-10-24 20:44:34 +03:00
Ivan Kravets
3177aaf591 Fixed an issue when booleans in "platformio.ini" are not parsed properly // Resolve #3022 2019-10-24 19:43:13 +03:00
Ivan Kravets
70b484a2c2 Escape "\" char in GDB console output 2019-10-24 17:34:49 +03:00
Ivan Kravets
ed6c9a08ce Add custom "PLATFORMIO_BUILD_DEBUG" target for CLion 2019-10-24 17:21:02 +03:00
Ivan Kravets
601989c5ff Escape "\" char in GDB console output 2019-10-24 16:56:28 +03:00
Ivan Kravets
234585dc97 Fixed an issue with project generator when `src_build_flags` were not respected // Resolve #3137 2019-10-24 16:39:11 +03:00
Ivan Kravets
2388b2a62b Fixed security issue when extracting items from TAR archive // Issue #2995 2019-10-24 16:24:53 +03:00
Ivan Kravets
69d9438c71 Temporarily disable security checking for Tar items 2019-10-24 15:39:41 +03:00
Ivan Kravets
0b500dba54 Handle legacy "system": "all" for package manifest 2019-10-24 15:10:11 +03:00
Ivan Kravets
798b12ce7b Fixed security issue when extracting items from TAR archive // Resolve #2995 2019-10-24 14:55:45 +03:00
Ivan Kravets
334d50c367 Use package parser for package manager and LDF 2019-10-24 13:42:46 +03:00
Ivan Kravets
dd1da95a40 Fix issue when wrong library was picked up by LDF when framework is not declared 2019-10-24 00:28:03 +03:00
Ivan Kravets
6684ac5a57 LDF: Check project include dirs before looking for dependencies 2019-10-23 22:55:02 +03:00
Ivan Kravets
b533d7a1dd LDF: Check global CPPPATH when looking for dependencies 2019-10-23 22:31:26 +03:00
Ivan Kravets
95d1f43799 Sync docs with ST STM32 dev/platform 2019-10-23 18:49:08 +03:00
Ivan Kravets
9c7cc87c5f Move command related modules to "commands" package 2019-10-23 16:05:27 +03:00
valeros
374379ba03 Skip .debug sections when generating memory use report 2019-10-22 21:52:55 +03:00
valeros
56ac577b0a Fix case with empty arguments when generating sizedata report 2019-10-22 12:10:48 +03:00
valeros
941c0f4297 Improve the speed of memory use report generation 2019-10-21 23:26:28 +03:00
Ivan Kravets
f34745bef9 Parse device frequency in int format for size data 2019-10-21 15:57:34 +03:00
Ivan Kravets
9fef7f0ba9 Docs: Sync TI MSP430 dev/platform 2019-10-21 15:53:25 +03:00
Ivan Kravets
971cd2ca0f Export device info in pair with sizedata 2019-10-21 00:12:04 +03:00
Ivan Kravets
6bf8bec22d Bump version to 4.1.0b5 2019-10-19 12:43:43 +03:00
Ivan Kravets
d771816b02 Automatically change dir to project for RPC "config_call"; add "envs" and "description" for project entities 2019-10-19 12:42:43 +03:00
Ivan Kravets
f78a1a7b15 Show encoding error when can't read a file // Issue #2796 2019-10-18 22:00:28 +03:00
Ivan Kravets
77f8414c63 Better explanation about encoding error // Resolve #2796 2019-10-18 15:56:50 +03:00
Ivan Kravets
4d84d03a63 Black 2019-10-18 15:56:41 +03:00
Ivan Kravets
065607b68c Disable PyLint's "import-outside-toplevel" 2019-10-18 15:41:52 +03:00
Ivan Kravets
f5807364e8 Force to "backslashreplace" when UnicodeEncodeError arises when writing file // Issue #2796 2019-10-18 15:20:52 +03:00
Ivan Kravets
92d86192aa Substitute LDSCRIPT with real value 2019-10-18 15:05:11 +03:00
Ivan Kravets
d44c60614d Use direct LDSCRIPT_PATH only if script resolves 2019-10-17 23:40:30 +03:00
Ivan Kravets
19a8326f0f Fix test for package manifest 2019-10-17 21:19:04 +03:00
Ivan Kravets
be9aaf8902 Be compatible with Python 3.8, on Windows skip HOME and check for USERPROFILE 2019-10-17 20:57:40 +03:00
Ivan Kravets
5cfa2b7fdd Fix typo with UnicodeEncodeError // Issue #2796 2019-10-17 19:28:57 +03:00
Ivan Kravets
6218b773fd Better support for file contents writing // Issue #2796 2019-10-17 18:48:59 +03:00
Ivan Kravets
7bcfea13fb Fixed an issue with linking process when `$LDSCRIPT` contains a space in path 2019-10-17 16:52:18 +03:00
Ivan Kravets
89843c0d65 Fix issue with parsing library.properties when export field is used 2019-10-17 15:48:18 +03:00
valeros
31d4a5c72e Add collective stats info about project components to check report 2019-10-17 13:42:00 +03:00
Ivan Kravets
83f25cbc16 Fix tests 2019-10-17 12:38:35 +03:00
Ivan Kravets
27fc19d6b3 Switch to Marshmallow ODM framework 2019-10-17 00:17:16 +03:00
Ivan Kravets
9cfccc5cd4 Minor fixes to manifest parser 2019-10-16 13:58:50 +03:00
Ivan Kravets
9da19fbf54 Use isolated SCons sign DB per Python interpreter 2019-10-16 12:09:53 +03:00
Ivan Kravets
a481a5deda Fix issue with "remote test" // Resolve #3127 2019-10-15 23:30:02 +03:00
Matt McCartney
e8692334f6 Replace deprecated in python3: iteritems with items and basestr with str (#3119) 2019-10-15 22:00:48 +03:00
Ivan Kravets
239befa4ee New Shakti dev/platform 2019-10-15 13:05:56 +03:00
Ivan Kravets
2e9b0066de Capture manifest parser exceptions 2019-10-14 23:36:15 +03:00
Ivan Kravets
55d905a0d0 Add a new RPC method "project.config_call" for Home API 2019-10-12 20:00:12 +03:00
Ivan Kravets
181adb277f Sync docs 2019-10-12 19:58:34 +03:00
Ivan Kravets
82ec0164b0 Update docs on how to install Python 3.7 on Windows 2019-10-10 23:35:59 +03:00
Ivan Kravets
c8354b100e Bump version to 4.1.0b4 2019-10-10 14:51:14 +03:00
Ivan Kravets
4366719ed2 Restore missed project helpers needed by "platformio-node-helpers" 2019-10-10 14:50:34 +03:00
valeros
971eb8e35c Revert back unix style directory separator in sizedata report 2019-10-09 17:37:24 +03:00
valeros
a785c238b1 Use OS-native directory separator in sizedata report 2019-10-09 00:39:57 +03:00
valeros
eda02750ae Export files as list instead of dict for sizedata target 2019-10-08 13:45:36 +03:00
Ivan Kravets
e5d50eb45c Docs: RV-LINK debug tool, sync GD32V dev/platform 2019-10-08 11:49:04 +03:00
Ivan Kravets
b66bf5f4c0 Ignore symbolic links for package examples 2019-10-07 20:35:01 +03:00
Ivan Kravets
d1c8cc38f2 Cast semver to string when validating version 2019-10-05 23:40:27 +03:00
Ivan Kravets
10bada0bcc ManifestParser: handle examples from "Examples" folder 2019-10-05 20:21:39 +03:00
Ivan Kravets
5d7e7b1796 DataModel: capture exceptions from failed models in non-strict mode 2019-10-04 23:52:06 +03:00
Ivan Kravets
0f7fe260d1 Docs: Sync ESP32 dev platform 2019-10-04 21:15:37 +03:00
Ivan Kravets
46be56af43 Bump version to 4.1.0b3 2019-10-04 20:51:05 +03:00
Ivan Kravets
dce2655004 Fix broken serial monitor called via run target while uploading // Resolve #3081 2019-10-04 20:50:39 +03:00
Ivan Kravets
36acdd7797 DataModel: allow valid values in non-strict mode for TypeOfList and TypeOfDict 2019-10-04 18:30:48 +03:00
valeros
47e297fecb Use less verbose debug output 2019-10-04 13:27:02 +03:00
valeros
9ce19c7e83 Skip debug sections when calculating sizedata 2019-10-04 10:52:55 +03:00
Ivan Kravets
9954900a0e Return back LINKFLAGS for debug mode 2019-10-03 18:16:55 +03:00
Ivan Kravets
a7855ae664 ManifestParser: init from dir using name of file in remote url if provided 2019-10-03 16:14:51 +03:00
Ivan Kravets
76865a1730 ManifestParser fixes (#3080)
* Skip broken examples declaration

* Allow dots in keywords

* Allow "+" in keywords
2019-10-03 14:55:04 +03:00
Ivan Kravets
8febdc19ea ManifestParser: normalize example names 2019-10-03 12:47:41 +03:00
Ivan Kravets
85a814c21a Allow dot in manifest example name 2019-10-03 10:33:11 +03:00
Ivan Kravets
ab5650f84b Use max line length hooks for all systems 2019-10-02 23:46:42 +03:00
Ivan Kravets
77c591ce81 Fix RTD conf 2019-10-02 21:35:13 +03:00
Ivan Kravets
dc067642b2 Fix RTD conf 2019-10-02 21:33:40 +03:00
Ivan Kravets
d0ee0c2919 Sync docs 2019-10-02 21:32:31 +03:00
Ivan Kravets
6d50aa2e25 Remove RTD confs 2019-10-02 20:54:36 +03:00
Ivan Kravets
b68b9794ec Fix docs with "htmlzip" format 2019-10-02 20:51:39 +03:00
Ivan Kravets
e6ea4cb613 PackageManifest: Ignore hidden files for examples 2019-10-02 20:42:01 +03:00
Ivan Kravets
47ba127733 Add ReadTheDocs config 2019-10-02 18:15:17 +03:00
Ivan Kravets
bbd694c5ea ManifestParser: automatically generate examples from package dir 2019-10-02 17:54:59 +03:00
Ivan Kravets
7ba2a7cd3d Bump version to 4.1.0b2 2019-10-02 13:33:12 +03:00
Ivan Kravets
a1ed99962c Better handling of non-dict values passed to DataModel 2019-10-02 12:34:50 +03:00
Ivan Kravets
c2970631a5 Add "--force" for git update // Issue #3060 2019-10-02 12:34:20 +03:00
Ivan Kravets
d38c843574 Fixed an issue when installing a package using custom Git tag and submodules were not updated correctly // Resolve #3060 2019-10-02 11:52:14 +03:00
Ivan Kravets
a2213a1aa4 Change "examples" field in package manifest to ListOf(ExampleModel) 2019-10-02 11:04:29 +03:00
Ivan Kravets
dee2d2c538 Add manifest parsers for platform.json and package.json 2019-10-01 22:03:23 +03:00
Ivan Kravets
fec19849b5 Docs: Add info about ignoring individual parts of mbed framework 2019-10-01 22:02:55 +03:00
Ivan Kravets
5b77adccb1 DataModels: fix issue when traversing using model fields 2019-10-01 18:10:48 +03:00
Ivan Kravets
a82c4666d4 DataModel: add support for DictOfType, extend base manifest with ExampelsModel 2019-10-01 17:37:11 +03:00
Ivan Kravets
df6a8da290 DataModel: add support for silent validation and "get_exceptions" API 2019-10-01 16:13:25 +03:00
Ivan Kravets
39c8996093 Fix docs typos 2019-10-01 16:11:55 +03:00
Ivan Kravets
af1a0f3587 Allow to build a manifest parser from directory 2019-10-01 00:11:31 +03:00
Ivan Kravets
703912fdc9 Strict manifest validation when submitting to Registry, more tests for manifest model 2019-09-30 23:45:03 +03:00
Ivan Kravets
744881da59 Refactor DataModel with a strict type declaration 2019-09-30 19:44:03 +03:00
Ivan Kravets
5f55c18373 Introduce DataModel, package manifest parser and base manifest model 2019-09-30 17:59:06 +03:00
Ivan Kravets
2137eb1794 Do not append debug flags to linker stage 2019-09-30 13:27:34 +03:00
Ivan Kravets
3dcf1784fb Update PIO Remote to 2.5.5 2019-09-27 19:36:49 +03:00
Ivan Kravets
9a3dcd3daa PY2 fix with absolute import 2019-09-27 18:53:58 +03:00
Ivan Kravets
1b74f380a6 Refactor appending of debugging flags 2019-09-27 17:22:21 +03:00
Ivan Kravets
cd2a4ea535 Update copyrights 2019-09-27 17:21:35 +03:00
Ivan Kravets
536a9566da Feature/pio size data (#3056)
* Add initial support for detailed memory usage report

* Tidy up sizedata target

* Add toolchain to environment paths

* Make sizedata target a bit more readable
2019-09-27 14:18:35 +03:00
Ivan Kravets
d2abac9b18 Fixed an issue when configuration file options partly ignored when `--project-conf` // Resolve #3034 (#3055)
* Fixed an issue when configuration file options partly ignored when using custom ``--project-conf`` // Resolve #3034

* Py2 compatible makedirs

* Fix circle dependency

* Fix broken import in test examples

* Fix history

* Remove YAPF markers

* PyLint fix

* Fix invalid project conf path

* Move PIO Core to the root on Windows, issue with long CPPPATHs

* Respect global PLATFORMIO_BUILD_CACHE_DIR env var

* Fix Appveyor paths

* Minor changes
2019-09-27 14:13:53 +03:00
Florian Knodt
94f8afec38 udev: Add GD32V DFU Bootloader (#3032) 2019-09-24 11:02:26 +03:00
Ivan Kravets
3d5c1411c0 Fix PyLint for PY2 2019-09-24 00:28:23 +03:00
Ivan Kravets
9a7e5d86fc Install Black only for Python 3.6+ 2019-09-24 00:21:16 +03:00
Ivan Kravets
ca29b4e370 Fixed "DeprecationWarning: the imp module is deprecated in favour of importlib" PY2/PY3 2019-09-24 00:17:08 +03:00
Ivan Kravets
392fe1cbd0 Move Run to the root 2019-09-24 00:12:21 +03:00
Ivan Kravets
aa955819b0 Move PIO Check to the root 2019-09-23 23:44:42 +03:00
Ivan Kravets
b1f190a7f8 Move PIO Unit Testing to the root 2019-09-23 23:44:28 +03:00
Ivan Kravets
5453df94e4 Move PIO Unified Debugger to the root 2019-09-23 23:27:55 +03:00
Ivan Kravets
7b314b58a4 Move PIO Home to the root of source code 2019-09-23 23:23:11 +03:00
Ivan Kravets
7c41c7c2f3 Introduce Black to automate code formatting 2019-09-23 23:13:48 +03:00
Ivan Kravets
5e144a2c98 Add PIO Check to changelog 2019-09-23 21:57:31 +03:00
Ivan Kravets
61b6eea52c New "--no-ansi" flag for PIO Core 2019-09-23 20:51:02 +03:00
Ivan Kravets
cd8dc24454 Docs: Sync Espressif32 dev/platform 2019-09-18 18:47:55 +03:00
Ivan Kravets
6531dcbc78 Allow to skip checking of unpacked data 2019-09-16 21:38:47 +03:00
Thomas Bleijendaal
123963f760 UTF8 decoding should ignore invalid characters (#3026)
Some boards, like ESP32-based boards, emit unintelligible data when connecting to them via Serial. This is sometimes data sent at the wrong baud rate (hard-baked into the boot loader), or something else, and it is hard to prevent. When a build is uploaded to the ESP board for unit testing, decoding of the incoming stream should not fail the test because of garbled content. Since the read data is validated on line 95, any garbage is automatically ignored and only written to the console.
2019-09-16 21:02:07 +03:00
Ivan Kravets
08a94b6f7c New article "Arduino In-circuit Debugging with PlatformIO" 2019-09-16 18:58:29 +03:00
Ivan Kravets
43ae62afd8 Sync Aceinna and GD32V dev/platforms. 2019-09-13 16:01:42 +03:00
Ivan Kravets
e08dc5f0d7 Docs: Sync Microchip PIC32 dev/platform 2019-09-10 17:48:47 +03:00
Ivan Kravets
1e26feb566 Bump version to 4.1.0b1 2019-09-09 23:34:44 +03:00
Ivan Kravets
96567dea4d PyLint fix 2019-09-08 23:44:18 +03:00
Ivan Kravets
c720933d34 Refactor PIO Check 2019-09-08 23:33:25 +03:00
Ivan Kravets
f61d03ec8f PIO Check (#2982) 2019-09-08 18:04:41 +03:00
Ivan Kravets
b7bc4401eb Use isolated SCons DB per build environment 2019-09-08 14:01:41 +03:00
Ivan Kravets
7a07a2e63e Generate `.ccls` LSP file for Emacs 2019-09-03 15:31:33 +03:00
Ivan Kravets
2c242944c7 Fixed default PIO Unified Debugger configuration for J-Link probe 2019-09-02 16:48:33 +03:00
Ivan Kravets
6265233903 Optimize udev rules 2019-09-02 16:01:15 +03:00
Ivan Kravets
be3e26c202 Cleanup UDEV rules 2019-09-02 14:24:35 +03:00
Ivan Kravets
9f76293684 Cleanup Segger UDEV rules 2019-09-02 14:13:58 +03:00
Ivan Kravets
1be2e510da Sync nRF52 dev/platform 2019-09-02 12:50:55 +03:00
Ivan Kravets
af049eecc9 Bump version to 4.1.0a1 2019-08-31 23:40:28 +03:00
Ivan Kravets
fe237f15aa Implement "extends" for project configuration // Resolve #2953 2019-08-31 23:39:41 +03:00
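A hedged sketch of how the new `extends` option can look in `platformio.ini`; the section names, boards, and option values are placeholders:

```ini
; Hedged sketch: two environments inherit shared options from a custom
; base section via "extends" (all names and values are placeholders).
[common]
framework = arduino
monitor_speed = 115200

[env:uno]
extends = common
platform = atmelavr
board = uno

[env:nano]
extends = common
platform = atmelavr
board = nanoatmega328
```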
Peter
bdce78ba6f Stop ModemManager corrupting Arduino uploads (#2966)
On boards like the Arduino Micro, when in bootloader mode, ModemManager appears to interfere with the programming process, resulting in a catastrophic failure with no end of different errors, including but not limited to:
```
error: programmer did not respond to command: write block
error: butterfly programmer uses avr_write_page() but does not provide a cmd() method.
error: programmer did not respond to command: set addr
```
After this, the device can appear completely non-functional, refusing to enumerate or show up for programming. Thankfully a double-reset will usually recover it, but the underlying ModemManager issue will still prevent successful programming. Hence the additional rules.

This affects not only PlatformIO, but also the Arduino IDE (on Linux).
2019-08-31 11:47:32 +03:00
Ivan Kravets
f26e3c42dd Sync docs 2019-08-31 11:40:16 +03:00
Ivan Kravets
92cd03cf2a Sync docs 2019-08-30 18:12:26 +03:00
Ivan Kravets
e7da3d7f5f Bump version to 4.0.4a1 2019-08-30 16:41:17 +03:00
Ivan Kravets
f966eeb604 Fixed an issue with project generator for CLion IDE when 2 environments were used // Resolve #2824 2019-08-30 16:40:44 +03:00
Ivan Kravets
34176f974b Fix generator for CLion when project is empty // Issue #2824 2019-08-30 15:45:21 +03:00
Ivan Kravets
60f0f775ef Merge branch 'release/v4.0.3' 2019-08-30 15:41:59 +03:00
Ivan Kravets
5f044a7948 Merge tag 'v4.0.3' into develop
Bump version to 4.0.3
2019-08-30 15:41:59 +03:00
Ivan Kravets
9f1dd3dd5d Bump version to 4.0.3 2019-08-30 15:41:49 +03:00
Ivan Kravets
db6f983364 Fix issue for CLion project generator when environment contains space // Issue #2824 2019-08-30 10:55:13 +03:00
Ivan Kravets
386883fbe5 Bump version to 4.0.3rc1 2019-08-29 17:18:36 +03:00
Ivan Kravets
e08527a0af Cleanup CLion project generator 2019-08-29 16:58:18 +03:00
Ivan Kravets
4a6d5e8395 Added support for multi-environment PlatformIO project for CLion IDE // Resolve #2824 Resolve #2944 2019-08-29 16:26:51 +03:00
Ivan Kravets
83bf34fb77 Extend "load_project_ide_data" API to return IDE data for more than one environment 2019-08-29 16:01:36 +03:00
Teo-CD
1c8666e946 CLion integration, resolves #2824 (#2944)
* Better environment integration:
 - The environment can be selected in the build target menu of CLion
 - The PlatformIO target runs on the selected environment
 - Changing the environment updates the defined preprocessor variables and includes accordingly
 - Added an 'All' build profile that runs targets on all environments if there are multiple of them (original behaviour)

* Calling get_project_dir() only once.

* Fixed include path not being converted to Unix style.
Removed duplicate and non-normalized definition
2019-08-29 15:01:50 +03:00
Ivan Kravets
0440b7a2f7 Disable TTY coloring with "PLATFORMIO_DISABLE_COLOR" system environment // Resolve #2956 2019-08-29 14:34:51 +03:00
Ivan Kravets
223a85baca CCLS LSP for VIM // Resolve #2952 2019-08-29 14:20:24 +03:00
Ivan Kravets
ed39a755bc Update to semantic_version 2.8.0 2019-08-29 13:49:52 +03:00
Ivan Kravets
519156512c Strict versions for "semantic_version" and "tabulate" 2019-08-28 23:01:39 +03:00
Ivan Kravets
9fa424ea9b Remove ProjectConfig from cache on saving 2019-08-28 22:43:34 +03:00
Ivan Kravets
883a97a38c Fixed an issue when --upload-port CLI flag does not override declared upload_port option in "platformio.ini" 2019-08-28 19:56:09 +03:00
Ivan Kravets
c671a8e235 Bump version to 4.0.3b1 2019-08-27 20:35:25 +03:00
Ivan Kravets
55a44aecc3 Remove debug code 2019-08-27 20:26:44 +03:00
Ivan Kravets
81fc1c9010 Fixed an issue with PIO Unified Debugger on Windows when debug server is piped 2019-08-27 20:23:03 +03:00
Ivan Kravets
8037bef847 Move "to_unix_path" helper to FS module 2019-08-27 20:21:53 +03:00
Ivan Kravets
98ec287797 Docs: Remove non-existing project examples 2019-08-27 16:28:38 +03:00
Ivan Kravets
bc2765eb1f Fix issue with SemVer when library version has incompatible format // Resolve #2950 2019-08-27 14:05:01 +03:00
Ivan Kravets
94644c2863 Update SCons tool to 3.1.1 2019-08-27 00:15:58 +03:00
Ivan Kravets
fa090131ae Do not parse visited source files for LDF 2019-08-27 00:15:12 +03:00
Ivan Kravets
48b46d74cf PIO Home: Improve description for project examples // Resolve #2713 2019-08-25 20:40:28 +03:00
Ivan Kravets
66b22a218a Update PIO Home to 2.3.0 // Resolve #2614 Resolve #2819 2019-08-25 19:27:44 +03:00
Ivan Kravets
3d18d4f9ce Sync docs 2019-08-25 18:37:54 +03:00
Ivan Kravets
cba2f4d7b6 Remove ProjectConfig cache when "platformio.ini" was modified outside 2019-08-25 18:37:14 +03:00
Ivan Kravets
785be3cb26 Merge tag 'v4.0.2' into develop
Bump version to 4.0.2
2019-08-23 16:24:28 +03:00
147 changed files with 11524 additions and 5345 deletions


@@ -6,14 +6,17 @@ platform:
 environment:
   matrix:
     - TOXENV: "py27"
-      PLATFORMIO_BUILD_CACHE_DIR: C:/Temp/PIO_Build_Cache_P2_{build}
+      PLATFORMIO_BUILD_CACHE_DIR: C:\Temp\PIO_Build_Cache_P2_{build}
+      PYTHON_DIRS: C:\Python27-x64;C:\Python27-x64\Scripts
     - TOXENV: "py36"
-      PLATFORMIO_BUILD_CACHE_DIR: C:/Temp/PIO_Build_Cache_P3_{build}
+      PLATFORMIO_BUILD_CACHE_DIR: C:\Temp\PIO_Build_Cache_P3_{build}
+      PYTHON_DIRS: C:\Python36-x64;C:\Python36-x64\Scripts
 install:
   - cmd: git submodule update --init --recursive
-  - cmd: SET PATH=C:\MinGW\bin;%PATH%
+  - cmd: SET PATH=%PYTHON_DIRS%;C:\MinGW\bin;%PATH%
+  - cmd: SET PLATFORMIO_CORE_DIR=C:\.pio
   - cmd: pip install --force-reinstall tox
 test_script:


@@ -1,3 +1,3 @@
 [settings]
-line_length=79
-known_third_party=bottle,click,pytest,requests,SCons,semantic_version,serial,twisted,autobahn,jsonrpc,tabulate
+line_length=88
+known_third_party=SCons, twisted, autobahn, jsonrpc


@@ -1,5 +1,7 @@
 [MESSAGES CONTROL]
 disable=
+    bad-continuation,
+    bad-whitespace,
     missing-docstring,
     ungrouped-imports,
     invalid-name,
@@ -9,4 +11,5 @@ disable=
     too-few-public-methods,
     useless-object-inheritance,
     useless-import-alias,
-    fixme
+    fixme,
+    bad-option-value

.readthedocs.yml (new file)

@@ -0,0 +1,12 @@
+# See https://docs.readthedocs.io/en/stable/config-file/index.html
+version: 2
+sphinx:
+  configuration: docs/conf.py
+formats:
+  - pdf
+submodules:
+  include: all


@@ -1,3 +0,0 @@
-[style]
-blank_line_before_nested_class_or_def = true
-allow_multiline_lambdas = true


@@ -1,10 +1,149 @@
 Release Notes
 =============
-.. _release_notes_4_0:
+.. _release_notes_4:
-PlatformIO 4.0
---------------
+PlatformIO Core 4
+-----------------
4.3.1 (2020-03-20)
~~~~~~~~~~~~~~~~~~
* Fixed a SyntaxError "'return' with argument inside generator" for PIO Unified Debugger when Python 2.7 is used
* Fixed an issue when ``lib_archive = no`` was not honored in `"platformio.ini" <https://docs.platformio.org/page/projectconf.html>`__
* Fixed a TypeError "super(type, obj): obj must be an instance or subtype of type" when device monitor is used with a custom dev-platform filter (`issue #3431 <https://github.com/platformio/platformio-core/issues/3431>`_)
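To illustrate the `lib_archive` fix above, a minimal hedged `platformio.ini` sketch (platform, board, and framework values are placeholders):

```ini
; Hedged sketch: with lib_archive = no, library objects are linked
; directly instead of being collected into static .a archives first.
[env:uno]
platform = atmelavr
board = uno
framework = arduino
lib_archive = no
```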
4.3.0 (2020-03-19)
~~~~~~~~~~~~~~~~~~
* Initial support for an official `PlatformIO for CLion IDE <https://docs.platformio.org/page/integration/ide/clion.html>`__ plugin:
- Smart C and C++ editor
- Code refactoring
- On-the-fly code analysis
- "New PlatformIO Project" wizard
- Building, Uploading, Testing
- Integrated debugger (inline variable view, conditional breakpoints, expressions, watchpoints, peripheral registers, multi-thread support, etc.)
* `Device Monitor 2.0 <https://docs.platformio.org/page/core/userguide/device/cmd_monitor.html>`__
- Added **PlatformIO Device Monitor Filter API** (dev-platforms can extend base device monitor with a custom functionality, such as exception decoding) (`pull #3383 <https://github.com/platformio/platformio-core/pull/3383>`_)
- Configure project device monitor with `monitor_filters <https://docs.platformio.org/page/projectconf/section_env_monitor.html#monitor-filters>`__ option (see the sketch after this list)
- `Capture device monitor output to a file <https://docs.platformio.org/page/core/userguide/device/cmd_monitor.html#capture-output-to-a-file>`__ with ``log2file`` filter (`issue #670 <https://github.com/platformio/platformio-core/issues/670>`_)
- Show a timestamp for each new line with ``time`` filter (`issue #981 <https://github.com/platformio/platformio-core/issues/981>`_)
- Send a text to device on ENTER with ``send_on_enter`` filter (`issue #926 <https://github.com/platformio/platformio-core/issues/926>`_)
- Show a hexadecimal representation of the data (code point of each character) with ``hexlify`` filter
* New standalone (1-script) `PlatformIO Core Installer <https://github.com/platformio/platformio-core-installer>`_
* Initial support for `Renode <https://docs.platformio.org/page/plus/debug-tools/qemu.html>`__ simulation framework (`issue #3401 <https://github.com/platformio/platformio-core/issues/3401>`_)
* Added support for Arm Mbed "module.json" ``dependencies`` field (`issue #3400 <https://github.com/platformio/platformio-core/issues/3400>`_)
* Improved support for Arduino "library.properties" ``depends`` field
* Fixed an issue when quitting from PlatformIO IDE does not shutdown PIO Home server
* Fixed an issue "the JSON object must be str, not 'bytes'" when PIO Home is used with Python 3.5 (`issue #3396 <https://github.com/platformio/platformio-core/issues/3396>`_)
* Fixed an issue when Python 2 does not keep encoding when converting ".ino" (`issue #3393 <https://github.com/platformio/platformio-core/issues/3393>`_)
* Fixed an issue when ``"libArchive": false`` in "library.json" does not work (`issue #3403 <https://github.com/platformio/platformio-core/issues/3403>`_)
* Fixed an issue when not all commands in `compilation database "compile_commands.json" <https://docs.platformio.org/page/integration/compile_commands.html>`__ use absolute paths (`pull #3415 <https://github.com/platformio/platformio-core/pull/3415>`_)
* Fixed an issue when unknown transport is used for `PIO Unit Testing <https://docs.platformio.org/page/plus/unit-testing.html>`__ engine (`issue #3422 <https://github.com/platformio/platformio-core/issues/3422>`_)
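A minimal sketch of a custom device monitor filter built on the new Filter API. The base-class name, module path, and ``rx``/``tx`` hook signatures below are assumptions for illustration, not taken from this changeset; filters are enabled per environment via the ``monitor_filters`` option in "platformio.ini" (for example ``monitor_filters = time, log2file``)::

    # Illustrative sketch only: the import path and the DeviceMonitorFilter
    # base class are assumptions about the Device Monitor Filter API, not
    # verbatim from this changeset.
    from datetime import datetime

    from platformio.commands.device.filters.base import DeviceMonitorFilter


    class Stamped(DeviceMonitorFilter):
        NAME = "stamped"  # hypothetical name to use in monitor_filters

        def rx(self, text):
            # Prefix every chunk received from the device with a timestamp.
            return "%s > %s" % (datetime.now().strftime("%H:%M:%S.%f")[:-3], text)

        def tx(self, text):
            # Pass data sent to the device through unchanged.
            return text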
4.2.1 (2020-02-17)
~~~~~~~~~~~~~~~~~~
* Improved VSCode template with a special ``forceInclude`` field for direct includes via the ``-include`` flag (`issue #3379 <https://github.com/platformio/platformio-core/issues/3379>`_)
* Improved support for PIO Home on card-sized PCs (Raspberry Pi, etc.) (`issue #3313 <https://github.com/platformio/platformio-core/issues/3313>`_)
* Froze "marshmallow" dependency to 2.X for Python 2 (`issue #3380 <https://github.com/platformio/platformio-core/issues/3380>`_)
* Fixed "TypeError: unsupported operand type(s)" when system environment variable is used by project configuration parser (`issue #3377 <https://github.com/platformio/platformio-core/issues/3377>`_)
* Fixed an issue when Library Dependency Finder (LDF) ignores custom "libLDFMode" and "libCompatMode" options in `library.json <http://docs.platformio.org/page/librarymanager/config.html>`__
* Fixed an issue when generating the compilation database "compile_commands.json" does not work with Python 2.7 (`issue #3378 <https://github.com/platformio/platformio-core/issues/3378>`_)
4.2.0 (2020-02-12)
~~~~~~~~~~~~~~~~~~
* `PlatformIO Home 3.1 <http://docs.platformio.org/page/home/index.html>`__:
- Project Manager
- Project Configuration UI for `"platformio.ini" <https://docs.platformio.org/page/projectconf.html>`__
* `PIO Check <http://docs.platformio.org/page/plus/pio-check.html>`__ automated code analysis without hassle:
- Added support for `PVS-Studio <https://docs.platformio.org/page/plus/check-tools/pvs-studio.html>`__ static code analyzer
* Initial support for `Project Manager <https://docs.platformio.org/page/userguide/project/index.html>`_ CLI:
- Show the computed project configuration with a new `platformio project config <https://docs.platformio.org/page/userguide/project/cmd_config.html>`_ command or dump it to JSON with ``platformio project config --json-output`` (`issue #3335 <https://github.com/platformio/platformio-core/issues/3335>`_); see the usage sketch after this list
- Moved ``platformio init`` command to `platformio project init <https://docs.platformio.org/page/userguide/project/cmd_init.html>`_
* Generate `compilation database "compile_commands.json" <https://docs.platformio.org/page/integration/compile_commands.html>`__ (`issue #2990 <https://github.com/platformio/platformio-core/issues/2990>`_)
* Control debug flags and optimization level with a new `debug_build_flags <https://docs.platformio.org/page/projectconf/section_env_debug.html#debug-build-flags>`__ option
* Install a dev-platform with ALL declared packages using a new ``--with-all-packages`` option for `pio platform install <https://docs.platformio.org/page/userguide/platforms/cmd_install.html>`__ command (`issue #3345 <https://github.com/platformio/platformio-core/issues/3345>`_)
* Added support for "pythonPackages" in `platform.json <https://docs.platformio.org/page/platforms/creating_platform.html#manifest-file-platform-json>`__ manifest (PlatformIO Package Manager will install dependent Python packages from PyPi registry automatically when dev-platform is installed)
* Handle project configuration (monitor, test, and upload options) for PIO Remote commands (`issue #2591 <https://github.com/platformio/platformio-core/issues/2591>`_)
* Added support for Arduino's library.properties ``depends`` field (`issue #2781 <https://github.com/platformio/platformio-core/issues/2781>`_)
* Autodetect monitor port for boards with specified HWIDs (`issue #3349 <https://github.com/platformio/platformio-core/issues/3349>`_)
* Updated SCons tool to 3.1.2
* Updated Unity tool to 2.5.0
* Made package ManifestSchema compatible with marshmallow >= 3 (`issue #3296 <https://github.com/platformio/platformio-core/issues/3296>`_)
* Warn about broken library manifest when scanning dependencies (`issue #3268 <https://github.com/platformio/platformio-core/issues/3268>`_)
* Do not overwrite custom items in VSCode's "extensions.json" (`issue #3374 <https://github.com/platformio/platformio-core/issues/3374>`_)
* Fixed an issue when ``env.BoardConfig()`` does not work for custom boards in extra scripts of libraries (`issue #3264 <https://github.com/platformio/platformio-core/issues/3264>`_)
* Fixed an issue with "start-group/end-group" linker flags on Native development platform (`issue #3282 <https://github.com/platformio/platformio-core/issues/3282>`_)
* Fixed default PIO Unified Debugger configuration for `J-Link probe <http://docs.platformio.org/page/plus/debug-tools/jlink.html>`__
* Fixed an issue with LDF when header files were not found if "libdeps_dir" is within a subdirectory of "lib_extra_dirs" (`issue #3311 <https://github.com/platformio/platformio-core/issues/3311>`_)
* Fixed an issue "Import of non-existent variable 'projenv'" when a development platform does not call "env.BuildProgram()" (`issue #3315 <https://github.com/platformio/platformio-core/issues/3315>`_)
* Fixed an issue when invalid CLI command does not return non-zero exit code
* Fixed an issue when Project Inspector crashes when flash usage exceeds 100% (`issue #3368 <https://github.com/platformio/platformio-core/issues/3368>`_)
* Fixed a "UnicodeDecodeError" when listing built-in libraries on macOS with Python 2.7 (`issue #3370 <https://github.com/platformio/platformio-core/issues/3370>`_)
* Fixed an issue with improperly handled compiler flags containing spaces in the VSCode template (`issue #3364 <https://github.com/platformio/platformio-core/issues/3364>`_)
* Fixed an issue when no error is raised if a referenced parameter (interpolation) is missing in a project configuration file (`issue #3279 <https://github.com/platformio/platformio-core/issues/3279>`_)
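As a small usage sketch for the new ``platformio project config`` command above, the computed configuration can be dumped to JSON and inspected from a script; the JSON layout is not described in this changelog, so the snippet only pretty-prints it::

    import json
    import subprocess

    # Dump the computed project configuration via the new
    # "platformio project config --json-output" command.
    raw = subprocess.check_output(
        ["platformio", "project", "config", "--json-output"]
    )

    # Pretty-print the result for inspection; the exact structure is
    # intentionally not assumed here.
    print(json.dumps(json.loads(raw), indent=2))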
4.1.0 (2019-11-07)
~~~~~~~~~~~~~~~~~~
* `PIO Check <http://docs.platformio.org/page/plus/pio-check.html>`__ automated code analysis without hassle:
- Potential NULL pointer dereferences
- Possible indexing beyond array bounds
- Suspicious assignments
- Reads of potentially uninitialized objects
- Unused variables or functions
- Out of scope memory usage.
* `PlatformIO Home 3.0 <http://docs.platformio.org/page/home/index.html>`__:
- Project Inspection
- Static Code Analysis
- Firmware File Explorer
- Firmware Memory Inspection
- Firmware Sections & Symbols Viewer.
* Added support for `Build Middlewares <http://docs.platformio.org/page/projectconf/advanced_scripting.html#build-middlewares>`__: configure custom build flags per specific file, skip any build nodes from a framework, replace a build file with another on the fly, etc. (see the extra-script sketch after this list)
* Extend project environment configuration in "platformio.ini" with other sections using a new `extends <http://docs.platformio.org/page/projectconf/section_env_advanced.html#extends>`__ option (`issue #2953 <https://github.com/platformio/platformio-core/issues/2953>`_)
* Generate ``.ccls`` LSP file for `Emacs <https://docs.platformio.org/page/ide/emacs.html>`__ cross references, hierarchies, completion and semantic highlighting
* Added ``--no-ansi`` flag for `PIO Core <http://docs.platformio.org/page/userguide/index.html>`__ to disable ANSI control characters
* Added ``--shutdown-timeout`` option to `PIO Home Server <http://docs.platformio.org/page/userguide/cmd_home.html>`__
* Fixed an issue with project generator for `CLion IDE <http://docs.platformio.org/page/ide/clion.html>`__ when 2 environments were used (`issue #2824 <https://github.com/platformio/platformio-core/issues/2824>`_)
* Fixed default PIO Unified Debugger configuration for `J-Link probe <http://docs.platformio.org/page/plus/debug-tools/jlink.html>`__
* Fixed an issue when configuration file options were partly ignored when using a custom ``--project-conf`` (`issue #3034 <https://github.com/platformio/platformio-core/issues/3034>`_)
* Fixed an issue when installing a package using custom Git tag and submodules were not updated correctly (`issue #3060 <https://github.com/platformio/platformio-core/issues/3060>`_)
* Fixed an issue with the linking process when ``$LDSCRIPT`` contains a space in its path
* Fixed a security issue when extracting items from a TAR archive (`issue #2995 <https://github.com/platformio/platformio-core/issues/2995>`_)
* Fixed an issue with project generator when ``src_build_flags`` were not respected (`issue #3137 <https://github.com/platformio/platformio-core/issues/3137>`_)
* Fixed an issue when booleans in "platformio.ini" are not parsed properly (`issue #3022 <https://github.com/platformio/platformio-core/issues/3022>`_)
* Fixed an issue with invalid encoding when generating project for Visual Studio (`issue #3183 <https://github.com/platformio/platformio-core/issues/3183>`_)
* Fixed an issue when Project Config Parser does not remove in-line comments when Python 3 is used (`issue #3213 <https://github.com/platformio/platformio-core/issues/3213>`_)
* Fixed an issue with a GCC Linter for PlatformIO IDE for Atom (`issue #3218 <https://github.com/platformio/platformio-core/issues/3218>`_)
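A minimal extra-script sketch for the Build Middlewares feature above, assuming the ``env.AddBuildMiddleware(callback, pattern)`` helper described in the Advanced Scripting documentation; the file pattern and extra flag are illustrative only::

    # extra_script.py, referenced from "platformio.ini" via the
    # extra_scripts option. "env" is injected by PlatformIO's SCons runner.
    Import("env")


    def add_o3_for_main(node):
        # Return a replacement build node that compiles this one file
        # with an additional optimization flag.
        return env.Object(node, CCFLAGS=env["CCFLAGS"] + ["-O3"])


    # Assumed helper from the Build Middlewares docs: apply the callback to
    # every build node whose path matches the glob pattern.
    env.AddBuildMiddleware(add_o3_for_main, "*/src/main.cpp")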
4.0.3 (2019-08-30)
~~~~~~~~~~~~~~~~~~
* Added support for multi-environment PlatformIO project for `CLion IDE <http://docs.platformio.org/page/ide/clion.html>`__ (`issue #2824 <https://github.com/platformio/platformio-core/issues/2824>`_)
* Generate ``.ccls`` LSP file for `Vim <http://docs.platformio.org/en/page/vim.html>`__ cross references, hierarchies, completion and semantic highlighting (`issue #2952 <https://github.com/platformio/platformio-core/issues/2952>`_)
* Added support for the `PLATFORMIO_DISABLE_COLOR <http://docs.platformio.org/page/envvars.html#envvar-PLATFORMIO_DISABLE_COLOR>`__ system environment variable, which disables color ANSI codes in terminal output (`issue #2956 <https://github.com/platformio/platformio-core/issues/2956>`_); see the usage sketch after this list
* Updated SCons tool to 3.1.1
* Remove ProjectConfig cache when "platformio.ini" was modified externally
* Fixed an issue with PIO Unified Debugger on Windows OS when debug server is piped
* Fixed an issue when `--upload-port <http://docs.platformio.org/page/userguide/cmd_run.html#cmdoption-platformio-run-upload-port>`__ CLI flag does not override declared `upload_port <http://docs.platformio.org/page/projectconf/section_env_upload.html#upload-port>`__ option in `"platformio.ini" (Project Configuration File) <https://docs.platformio.org/page/projectconf.html>`__
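As a usage sketch for the ``PLATFORMIO_DISABLE_COLOR`` variable above, a wrapper script can set it before invoking the CLI, which is handy for CI logs; the build command is just an example::

    import os
    import subprocess

    # Run a PlatformIO build with ANSI color codes disabled.
    env = dict(os.environ, PLATFORMIO_DISABLE_COLOR="true")
    subprocess.check_call(["platformio", "run"], env=env)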
4.0.2 (2019-08-23)
~~~~~~~~~~~~~~~~~~
@@ -93,8 +232,8 @@ PlatformIO 4.0
- Fixed "systemd-udevd" warnings in `99-platformio-udev.rules <http://docs.platformio.org/page/faq.html#platformio-udev-rules>`__ (`issue #2442 <https://github.com/platformio/platformio-core/issues/2442>`_)
- Fixed an issue when package cache (Library Manager) expires too fast (`issue #2559 <https://github.com/platformio/platformio-core/issues/2559>`_)
PlatformIO 3.0
--------------
PlatformIO Core 3
-----------------
3.6.7 (2019-04-23)
~~~~~~~~~~~~~~~~~~
@@ -694,8 +833,8 @@ PlatformIO 3.0
(`issue #742 <https://github.com/platformio/platformio-core/issues/742>`_)
* Stopped supporting Python 2.6
PlatformIO 2.0
--------------
PlatformIO Core 2
-----------------
2.11.2 (2016-08-02)
~~~~~~~~~~~~~~~~~~~
@@ -1480,8 +1619,8 @@ PlatformIO 2.0
* Fixed bug with creating copies of source files
(`issue #177 <https://github.com/platformio/platformio-core/issues/177>`_)
PlatformIO 1.0
--------------
PlatformIO Core 1
-----------------
1.5.0 (2015-05-15)
~~~~~~~~~~~~~~~~~~
@@ -1671,8 +1810,8 @@ PlatformIO 1.0
error (`issue #81 <https://github.com/platformio/platformio-core/issues/81>`_)
* Several bug fixes, increased stability and performance improvements
PlatformIO 0.0
--------------
PlatformIO Core Preview
-----------------------
0.10.2 (2015-01-06)
~~~~~~~~~~~~~~~~~~~

1
MANIFEST.in Normal file
View File

@@ -0,0 +1 @@
include LICENSE

View File

@@ -5,13 +5,14 @@ isort:
isort -rc ./platformio
isort -rc ./tests
yapf:
yapf --recursive --in-place platformio/
format:
black --target-version py27 ./platformio
black --target-version py27 ./tests
test:
py.test --verbose --capture=no --exitfirst -n 3 --dist=loadscope tests --ignore tests/test_examples.py --ignore tests/test_pkgmanifest.py
py.test --verbose --capture=no --exitfirst -n 6 --dist=loadscope tests --ignore tests/test_examples.py
before-commit: isort yapf lint test
before-commit: isort format lint test
clean-docs:
rm -rf docs/_build

View File

@@ -34,16 +34,20 @@ PlatformIO
.. image:: https://raw.githubusercontent.com/platformio/platformio-web/develop/app/images/platformio-ide-laptop.png
:target: https://platformio.org?utm_source=github&utm_medium=core
`PlatformIO <https://platformio.org?utm_source=github&utm_medium=core>`_ is an open source ecosystem for IoT
development. Cross-platform IDE and unified debugger. Remote unit testing and
firmware updates.
`PlatformIO <https://platformio.org?utm_source=github&utm_medium=core>`_ is a new generation ecosystem for embedded development:
* Open source, maximum permissive Apache 2.0 license
* Cross-platform IDE and Unified Debugger
* Static Code Analyzer and Remote Unit Testing
* Multi-platform and Multi-architecture Build System
* Firmware File Explorer and Memory Inspection.
Get Started
-----------
* `What is PlatformIO? <https://docs.platformio.org/en/latest/what-is-platformio.html?utm_source=github&utm_medium=core>`_
Open Source
Instruments
-----------
* `PlatformIO IDE <https://platformio.org/platformio-ide?utm_source=github&utm_medium=core>`_
@@ -54,14 +58,13 @@ Open Source
* `Continuous Integration <https://docs.platformio.org/page/ci/index.html?utm_source=github&utm_medium=core>`_
* `Advanced Scripting API <https://docs.platformio.org/page/projectconf/advanced_scripting.html?utm_source=github&utm_medium=core>`_
PIO Plus
--------
Professional
------------
* `PIO Check <https://docs.platformio.org/page/plus/pio-check.html?utm_source=github&utm_medium=core>`_
* `PIO Remote <https://docs.platformio.org/page/plus/pio-remote.html?utm_source=github&utm_medium=core>`_
* `PIO Unified Debugger <https://docs.platformio.org/page/plus/debugging.html?utm_source=github&utm_medium=core>`_
* `PIO Unit Testing <https://docs.platformio.org/en/latest/plus/unit-testing.html?utm_source=github&utm_medium=core>`_
* `Cloud IDEs Integration <https://docs.platformio.org/en/latest/ide.html?utm_source=github&utm_medium=core#solution-pio-delivery>`_
* `Integration Services <https://platformio.org/pricing?utm_source=github&utm_medium=core#enterprise-features>`_
Registry
--------
@@ -89,10 +92,11 @@ Development Platforms
* `Microchip PIC32 <https://platformio.org/platforms/microchippic32?utm_source=github&utm_medium=core>`_
* `Nordic nRF51 <https://platformio.org/platforms/nordicnrf51?utm_source=github&utm_medium=core>`_
* `Nordic nRF52 <https://platformio.org/platforms/nordicnrf52?utm_source=github&utm_medium=core>`_
* `Nuclei <https://platformio.org/platforms/nuclei?utm_source=github&utm_medium=core>`_
* `NXP LPC <https://platformio.org/platforms/nxplpc?utm_source=github&utm_medium=core>`_
* `RISC-V <https://platformio.org/platforms/riscv?utm_source=github&utm_medium=core>`_
* `RISC-V GAP <https://platformio.org/platforms/riscv_gap?utm_source=github&utm_medium=core>`_
* `Samsung ARTIK <https://platformio.org/platforms/samsung_artik?utm_source=github&utm_medium=core>`_
* `Shakti <https://platformio.org/platforms/shakti?utm_source=github&utm_medium=core>`_
* `Silicon Labs EFM32 <https://platformio.org/platforms/siliconlabsefm32?utm_source=github&utm_medium=core>`_
* `ST STM32 <https://platformio.org/platforms/ststm32?utm_source=github&utm_medium=core>`_
* `ST STM8 <https://platformio.org/platforms/ststm8?utm_source=github&utm_medium=core>`_
@@ -105,23 +109,25 @@ Frameworks
----------
* `Arduino <https://platformio.org/frameworks/arduino?utm_source=github&utm_medium=core>`_
* `ARTIK SDK <https://platformio.org/frameworks/artik-sdk?utm_source=github&utm_medium=core>`_
* `CMSIS <https://platformio.org/frameworks/cmsis?utm_source=github&utm_medium=core>`_
* `Energia <https://platformio.org/frameworks/energia?utm_source=github&utm_medium=core>`_
* `ESP-IDF <https://platformio.org/frameworks/espidf?utm_source=github&utm_medium=core>`_
* `ESP8266 Non-OS SDK <https://platformio.org/frameworks/esp8266-nonos-sdk?utm_source=github&utm_medium=core>`_
* `ESP8266 RTOS SDK <https://platformio.org/frameworks/esp8266-rtos-sdk?utm_source=github&utm_medium=core>`_
* `Freedom E SDK <https://platformio.org/frameworks/freedom-e-sdk?utm_source=github&utm_medium=core>`_
* `GigaDevice GD32V SDK <https://platformio.org/frameworks/gd32vf103-sdk?utm_source=github&utm_medium=core>`_
* `Kendryte Standalone SDK <https://platformio.org/frameworks/kendryte-standalone-sdk?utm_source=github&utm_medium=core>`_
* `Kendryte FreeRTOS SDK <https://platformio.org/frameworks/kendryte-freertos-sdk?utm_source=github&utm_medium=core>`_
* `libOpenCM3 <https://platformio.org/frameworks/libopencm3?utm_source=github&utm_medium=core>`_
* `mbed <https://platformio.org/frameworks/mbed?utm_source=github&utm_medium=core>`_
* `Mbed <https://platformio.org/frameworks/mbed?utm_source=github&utm_medium=core>`_
* `Nuclei SDK <https://platformio.org/frameworks/nuclei-sdk?utm_source=github&utm_medium=core>`_
* `PULP OS <https://platformio.org/frameworks/pulp-os?utm_source=github&utm_medium=core>`_
* `Pumbaa <https://platformio.org/frameworks/pumbaa?utm_source=github&utm_medium=core>`_
* `Shakti SDK <https://platformio.org/frameworks/shakti-sdk?utm_source=github&utm_medium=core>`_
* `Simba <https://platformio.org/frameworks/simba?utm_source=github&utm_medium=core>`_
* `SPL <https://platformio.org/frameworks/spl?utm_source=github&utm_medium=core>`_
* `STM32Cube <https://platformio.org/frameworks/stm32cube?utm_source=github&utm_medium=core>`_
* `Tizen RT <https://platformio.org/frameworks/tizenrt?utm_source=github&utm_medium=core>`_
* `WiringPi <https://platformio.org/frameworks/wiringpi?utm_source=github&utm_medium=core>`_
* `Zephyr <https://platformio.org/frameworks/zephyr?utm_source=github&utm_medium=core>`_
Contributing
------------

2
docs

Submodule docs updated: 3f5d12ca25...d97117eb2e

View File

@@ -12,16 +12,19 @@
# See the License for the specific language governing permissions and
# limitations under the License.
VERSION = (4, 0, 2)
VERSION = (4, 3, 1)
__version__ = ".".join([str(s) for s in VERSION])
__title__ = "platformio"
__description__ = (
"An open source ecosystem for IoT development. "
"Cross-platform IDE and unified debugger. "
"Remote unit testing and firmware updates. "
"A new generation ecosystem for embedded development. "
"Cross-platform IDE and Unified Debugger. "
"Static Code Analyzer and Remote Unit Testing. "
"Multi-platform and Multi-architecture Build System. "
"Firmware File Explorer and Memory Inspection. "
"Arduino, ARM mbed, Espressif (ESP8266/ESP32), STM32, PIC32, nRF51/nRF52, "
"FPGA, CMSIS, SPL, AVR, Samsung ARTIK, libOpenCM3")
"RISC-V, FPGA, CMSIS, SPL, AVR, Samsung ARTIK, libOpenCM3"
)
__url__ = "https://platformio.org"
__author__ = "PlatformIO"

View File

@@ -23,22 +23,42 @@ from platformio.commands import PlatformioCLI
from platformio.compat import CYGWIN
@click.command(cls=PlatformioCLI,
context_settings=dict(help_option_names=["-h", "--help"]))
@click.command(
cls=PlatformioCLI, context_settings=dict(help_option_names=["-h", "--help"])
)
@click.version_option(__version__, prog_name="PlatformIO")
@click.option("--force",
"-f",
is_flag=True,
help="Force to accept any confirmation prompts.")
@click.option("--caller", "-c", help="Caller ID (service).")
@click.option("--force", "-f", is_flag=True, help="DEPRECATE")
@click.option("--caller", "-c", help="Caller ID (service)")
@click.option("--no-ansi", is_flag=True, help="Do not print ANSI control characters")
@click.pass_context
def cli(ctx, force, caller):
def cli(ctx, force, caller, no_ansi):
try:
if (
no_ansi
or str(
os.getenv("PLATFORMIO_NO_ANSI", os.getenv("PLATFORMIO_DISABLE_COLOR"))
).lower()
== "true"
):
# pylint: disable=protected-access
click._compat.isatty = lambda stream: False
elif (
str(
os.getenv("PLATFORMIO_FORCE_ANSI", os.getenv("PLATFORMIO_FORCE_COLOR"))
).lower()
== "true"
):
# pylint: disable=protected-access
click._compat.isatty = lambda stream: True
except: # pylint: disable=bare-except
pass
maintenance.on_platformio_start(ctx, force, caller)
@cli.resultcallback()
@click.pass_context
def process_result(ctx, result, force, caller): # pylint: disable=W0613
def process_result(ctx, result, *_, **__):
maintenance.on_platformio_end(ctx, result)
@@ -50,19 +70,12 @@ def configure():
# https://urllib3.readthedocs.org
# /en/latest/security.html#insecureplatformwarning
try:
import urllib3
import urllib3 # pylint: disable=import-outside-toplevel
urllib3.disable_warnings()
except (AttributeError, ImportError):
pass
# handle PLATFORMIO_FORCE_COLOR
if str(os.getenv("PLATFORMIO_FORCE_COLOR", "")).lower() == "true":
try:
# pylint: disable=protected-access
click._compat.isatty = lambda stream: True
except: # pylint: disable=bare-except
pass
# Handle IOError issue with VSCode's Terminal (Windows)
click_echo_origin = [click.echo, click.secho]
@@ -71,7 +84,8 @@ def configure():
click_echo_origin[origin](*args, **kwargs)
except IOError:
(sys.stderr.write if kwargs.get("err") else sys.stdout.write)(
"%s\n" % (args[0] if args else ""))
"%s\n" % (args[0] if args else "")
)
click.echo = lambda *args, **kwargs: _safe_echo(0, *args, **kwargs)
click.secho = lambda *args, **kwargs: _safe_echo(1, *args, **kwargs)
@@ -85,9 +99,10 @@ def main(argv=None):
sys.argv = argv
try:
configure()
cli(None, None, None)
except SystemExit:
pass
cli() # pylint: disable=no-value-for-parameter
except SystemExit as e:
if e.code and str(e.code).isdigit():
exit_code = int(e.code)
except Exception as e: # pylint: disable=broad-except
if not isinstance(e, exception.ReturnErrorCode):
maintenance.on_platformio_exception(e)

View File

@@ -17,87 +17,76 @@ import hashlib
import os
import uuid
from os import environ, getenv, listdir, remove
from os.path import abspath, dirname, expanduser, isdir, isfile, join
from os.path import dirname, isdir, isfile, join, realpath
from time import time
import requests
from platformio import exception, fs, lockfile
from platformio.compat import (WINDOWS, dump_json_to_unicode,
hashlib_encode_data)
from platformio.compat import WINDOWS, dump_json_to_unicode, hashlib_encode_data
from platformio.proc import is_ci
from platformio.project.helpers import (get_project_cache_dir,
get_project_core_dir)
def get_default_projects_dir():
docs_dir = join(expanduser("~"), "Documents")
try:
assert WINDOWS
import ctypes.wintypes
buf = ctypes.create_unicode_buffer(ctypes.wintypes.MAX_PATH)
ctypes.windll.shell32.SHGetFolderPathW(None, 5, None, 0, buf)
docs_dir = buf.value
except: # pylint: disable=bare-except
pass
return join(docs_dir, "PlatformIO", "Projects")
from platformio.project.helpers import (
get_default_projects_dir,
get_project_cache_dir,
get_project_core_dir,
)
def projects_dir_validate(projects_dir):
assert isdir(projects_dir)
return abspath(projects_dir)
return realpath(projects_dir)
DEFAULT_SETTINGS = {
"auto_update_libraries": {
"description": "Automatically update libraries (Yes/No)",
"value": False
"value": False,
},
"auto_update_platforms": {
"description": "Automatically update platforms (Yes/No)",
"value": False
"value": False,
},
"check_libraries_interval": {
"description": "Check for the library updates interval (days)",
"value": 7
"value": 7,
},
"check_platformio_interval": {
"description": "Check for the new PlatformIO interval (days)",
"value": 3
"value": 3,
},
"check_platforms_interval": {
"description": "Check for the platform updates interval (days)",
"value": 7
"value": 7,
},
"enable_cache": {
"description": "Enable caching for API requests and Library Manager",
"value": True
},
"strict_ssl": {
"description": "Strict SSL for PlatformIO Services",
"value": False
"value": True,
},
"strict_ssl": {"description": "Strict SSL for PlatformIO Services", "value": False},
"enable_telemetry": {
"description":
("Telemetry service <http://bit.ly/pio-telemetry> (Yes/No)"),
"value": True
"description": ("Telemetry service <http://bit.ly/pio-telemetry> (Yes/No)"),
"value": True,
},
"force_verbose": {
"description": "Force verbose output when processing environments",
"value": False
"value": False,
},
"projects_dir": {
"description": "Default location for PlatformIO projects (PIO Home)",
"value": get_default_projects_dir(),
"validator": projects_dir_validate
"validator": projects_dir_validate,
},
}
SESSION_VARS = {"command_ctx": None, "force_option": False, "caller_id": None}
SESSION_VARS = {
"command_ctx": None,
"force_option": False,
"caller_id": None,
"custom_project_conf": None,
}
class State(object):
def __init__(self, path=None, lock=False):
self.path = path
self.lock = lock
@@ -113,8 +102,12 @@ class State(object):
if isfile(self.path):
self._storage = fs.load_json(self.path)
assert isinstance(self._storage, dict)
except (AssertionError, ValueError, UnicodeDecodeError,
exception.InvalidJSONFile):
except (
AssertionError,
ValueError,
UnicodeDecodeError,
exception.InvalidJSONFile,
):
self._storage = {}
return self
@@ -174,7 +167,6 @@ class State(object):
class ContentCache(object):
def __init__(self, cache_dir=None):
self.cache_dir = None
self._db_path = None
@@ -207,6 +199,7 @@ class ContentCache(object):
return True
def get_cache_path(self, key):
assert "/" not in key and "\\" not in key
key = str(key)
assert len(key) > 3
return join(self.cache_dir, key[-2:], key)
@@ -277,8 +270,11 @@ class ContentCache(object):
continue
expire, path = line.split("=")
try:
if time() < int(expire) and isfile(path) and \
path not in paths_for_delete:
if (
time() < int(expire)
and isfile(path)
and path not in paths_for_delete
):
newlines.append(line)
continue
except ValueError:
@@ -317,11 +313,11 @@ def sanitize_setting(name, value):
defdata = DEFAULT_SETTINGS[name]
try:
if "validator" in defdata:
value = defdata['validator'](value)
elif isinstance(defdata['value'], bool):
value = defdata["validator"](value)
elif isinstance(defdata["value"], bool):
if not isinstance(value, bool):
value = str(value).lower() in ("true", "yes", "y", "1")
elif isinstance(defdata['value'], int):
elif isinstance(defdata["value"], int):
value = int(value)
except Exception:
raise exception.InvalidSettingValue(value, name)
@@ -351,24 +347,24 @@ def get_setting(name):
return sanitize_setting(name, getenv(_env_name))
with State() as state:
if "settings" in state and name in state['settings']:
return state['settings'][name]
if "settings" in state and name in state["settings"]:
return state["settings"][name]
return DEFAULT_SETTINGS[name]['value']
return DEFAULT_SETTINGS[name]["value"]
def set_setting(name, value):
with State(lock=True) as state:
if "settings" not in state:
state['settings'] = {}
state['settings'][name] = sanitize_setting(name, value)
state["settings"] = {}
state["settings"][name] = sanitize_setting(name, value)
state.modified = True
def reset_settings():
with State(lock=True) as state:
if "settings" in state:
del state['settings']
del state["settings"]
def get_session_var(name, default=None):
@@ -381,11 +377,13 @@ def set_session_var(name, value):
def is_disabled_progressbar():
return any([
get_session_var("force_option"),
is_ci(),
getenv("PLATFORMIO_DISABLE_PROGRESSBAR") == "true"
])
return any(
[
get_session_var("force_option"),
is_ci(),
getenv("PLATFORMIO_DISABLE_PROGRESSBAR") == "true",
]
)
def get_cid():
@@ -397,15 +395,22 @@ def get_cid():
uid = getenv("C9_UID")
elif getenv("CHE_API", getenv("CHE_API_ENDPOINT")):
try:
uid = requests.get("{api}/user?token={token}".format(
api=getenv("CHE_API", getenv("CHE_API_ENDPOINT")),
token=getenv("USER_TOKEN"))).json().get("id")
uid = (
requests.get(
"{api}/user?token={token}".format(
api=getenv("CHE_API", getenv("CHE_API_ENDPOINT")),
token=getenv("USER_TOKEN"),
)
)
.json()
.get("id")
)
except: # pylint: disable=bare-except
pass
if not uid:
uid = uuid.getnode()
cid = uuid.UUID(bytes=hashlib.md5(hashlib_encode_data(uid)).digest())
cid = str(cid)
if WINDOWS or os.getuid() > 0: # yapf: disable pylint: disable=no-member
if WINDOWS or os.getuid() > 0: # pylint: disable=no-member
set_state_item("cid", cid)
return cid

View File

@@ -12,6 +12,7 @@
# See the License for the specific language governing permissions and
# limitations under the License.
import sys
from os import environ, makedirs
from os.path import isdir, join
from time import time
@@ -28,10 +29,10 @@ from SCons.Script import Import # pylint: disable=import-error
from SCons.Script import Variables # pylint: disable=import-error
from platformio import fs
from platformio.compat import PY2, dump_json_to_unicode
from platformio.compat import dump_json_to_unicode
from platformio.managers.platform import PlatformBase
from platformio.proc import get_pythonexe_path
from platformio.project import helpers as project_helpers
from platformio.project.helpers import get_project_dir
AllowSubstExceptions(NameError)
@@ -43,48 +44,45 @@ clivars.AddVariables(
("PROJECT_CONFIG",),
("PIOENV",),
("PIOTEST_RUNNING_NAME",),
("UPLOAD_PORT",)
) # yapf: disable
("UPLOAD_PORT",),
)
DEFAULT_ENV_OPTIONS = dict(
tools=[
"ar", "gas", "gcc", "g++", "gnulink", "platformio", "pioplatform",
"pioproject", "piowinhooks", "piolib", "pioupload", "piomisc", "pioide"
"ar",
"as",
"cc",
"c++",
"link",
"platformio",
"pioplatform",
"pioproject",
"piomaxlen",
"piolib",
"pioupload",
"piomisc",
"pioide",
"piosize",
],
toolpath=[join(fs.get_source_dir(), "builder", "tools")],
variables=clivars,
# Propagating External Environment
ENV=environ,
UNIX_TIME=int(time()),
PROJECT_DIR=project_helpers.get_project_dir(),
PROJECTCORE_DIR=project_helpers.get_project_core_dir(),
PROJECTPACKAGES_DIR=project_helpers.get_project_packages_dir(),
PROJECTWORKSPACE_DIR=project_helpers.get_project_workspace_dir(),
PROJECTLIBDEPS_DIR=project_helpers.get_project_libdeps_dir(),
PROJECTINCLUDE_DIR=project_helpers.get_project_include_dir(),
PROJECTSRC_DIR=project_helpers.get_project_src_dir(),
PROJECTTEST_DIR=project_helpers.get_project_test_dir(),
PROJECTDATA_DIR=project_helpers.get_project_data_dir(),
PROJECTBUILD_DIR=project_helpers.get_project_build_dir(),
BUILDCACHE_DIR=project_helpers.get_project_optional_dir("build_cache_dir"),
BUILD_DIR=join("$PROJECTBUILD_DIR", "$PIOENV"),
BUILDSRC_DIR=join("$BUILD_DIR", "src"),
BUILDTEST_DIR=join("$BUILD_DIR", "test"),
BUILD_DIR=join("$PROJECT_BUILD_DIR", "$PIOENV"),
BUILD_SRC_DIR=join("$BUILD_DIR", "src"),
BUILD_TEST_DIR=join("$BUILD_DIR", "test"),
COMPILATIONDB_PATH=join("$BUILD_DIR", "compile_commands.json"),
LIBPATH=["$BUILD_DIR"],
LIBSOURCE_DIRS=[
project_helpers.get_project_lib_dir(),
join("$PROJECTLIBDEPS_DIR", "$PIOENV"),
project_helpers.get_project_global_lib_dir()
],
PROGNAME="program",
PROG_PATH=join("$BUILD_DIR", "$PROGNAME$PROGSUFFIX"),
PYTHONEXE=get_pythonexe_path())
PYTHONEXE=get_pythonexe_path(),
)
if not int(ARGUMENTS.get("PIOVERBOSE", 0)):
DEFAULT_ENV_OPTIONS['ARCOMSTR'] = "Archiving $TARGET"
DEFAULT_ENV_OPTIONS['LINKCOMSTR'] = "Linking $TARGET"
DEFAULT_ENV_OPTIONS['RANLIBCOMSTR'] = "Indexing $TARGET"
DEFAULT_ENV_OPTIONS["ARCOMSTR"] = "Archiving $TARGET"
DEFAULT_ENV_OPTIONS["LINKCOMSTR"] = "Linking $TARGET"
DEFAULT_ENV_OPTIONS["RANLIBCOMSTR"] = "Indexing $TARGET"
for k in ("ASCOMSTR", "ASPPCOMSTR", "CCCOMSTR", "CXXCOMSTR"):
DEFAULT_ENV_OPTIONS[k] = "Compiling $TARGET"
@@ -94,31 +92,63 @@ env = DefaultEnvironment(**DEFAULT_ENV_OPTIONS)
env.Replace(
**{
key: PlatformBase.decode_scons_arg(env[key])
for key in list(clivars.keys()) if key in env
})
for key in list(clivars.keys())
if key in env
}
)
if env.subst("$BUILDCACHE_DIR"):
if not isdir(env.subst("$BUILDCACHE_DIR")):
makedirs(env.subst("$BUILDCACHE_DIR"))
env.CacheDir("$BUILDCACHE_DIR")
# Setup project optional directories
config = env.GetProjectConfig()
env.Replace(
PROJECT_DIR=get_project_dir(),
PROJECT_CORE_DIR=config.get_optional_dir("core"),
PROJECT_PACKAGES_DIR=config.get_optional_dir("packages"),
PROJECT_WORKSPACE_DIR=config.get_optional_dir("workspace"),
PROJECT_LIBDEPS_DIR=config.get_optional_dir("libdeps"),
PROJECT_INCLUDE_DIR=config.get_optional_dir("include"),
PROJECT_SRC_DIR=config.get_optional_dir("src"),
PROJECTSRC_DIR=config.get_optional_dir("src"), # legacy for dev/platform
PROJECT_TEST_DIR=config.get_optional_dir("test"),
PROJECT_DATA_DIR=config.get_optional_dir("data"),
PROJECTDATA_DIR=config.get_optional_dir("data"), # legacy for dev/platform
PROJECT_BUILD_DIR=config.get_optional_dir("build"),
BUILD_CACHE_DIR=config.get_optional_dir("build_cache"),
LIBSOURCE_DIRS=[
config.get_optional_dir("lib"),
join("$PROJECT_LIBDEPS_DIR", "$PIOENV"),
config.get_optional_dir("globallib"),
],
)
if env.subst("$BUILD_CACHE_DIR"):
if not isdir(env.subst("$BUILD_CACHE_DIR")):
makedirs(env.subst("$BUILD_CACHE_DIR"))
env.CacheDir("$BUILD_CACHE_DIR")
if int(ARGUMENTS.get("ISATTY", 0)):
# pylint: disable=protected-access
click._compat.isatty = lambda stream: True
if env.GetOption('clean'):
if env.GetOption("clean"):
env.PioClean(env.subst("$BUILD_DIR"))
env.Exit(0)
elif not int(ARGUMENTS.get("PIOVERBOSE", 0)):
print("Verbose mode can be enabled via `-v, --verbose` option")
click.echo("Verbose mode can be enabled via `-v, --verbose` option")
# Dynamically load dependent tools
if "compiledb" in COMMAND_LINE_TARGETS:
env.Tool("compilation_db")
if not isdir(env.subst("$BUILD_DIR")):
makedirs(env.subst("$BUILD_DIR"))
env.LoadProjectOptions()
env.LoadPioPlatform()
env.SConscriptChdir(0)
env.SConsignFile(
join("$PROJECTBUILD_DIR",
".sconsign.dblite" if PY2 else ".sconsign3.dblite"))
join("$BUILD_DIR", ".sconsign.py%d%d" % (sys.version_info[0], sys.version_info[1]))
)
for item in env.GetExtraScripts("pre"):
env.SConscript(item, exports="env")
@@ -136,7 +166,9 @@ for item in env.GetExtraScripts("post"):
##############################################################################
# Checking program size
if env.get("SIZETOOL") and "nobuild" not in COMMAND_LINE_TARGETS:
if env.get("SIZETOOL") and not (
set(["nobuild", "sizedata"]) & set(COMMAND_LINE_TARGETS)
):
env.Depends(["upload", "program"], "checkprogsize")
# Replace platform's "size" target with our
_new_targets = [t for t in DEFAULT_TARGETS if str(t) != "size"]
@@ -144,11 +176,17 @@ if env.get("SIZETOOL") and "nobuild" not in COMMAND_LINE_TARGETS:
Default(_new_targets)
Default("checkprogsize")
if "compiledb" in COMMAND_LINE_TARGETS:
env.Alias("compiledb", env.CompilationDatabase("$COMPILATIONDB_PATH"))
# Print configured protocols
env.AddPreAction(["upload", "program"],
env.VerboseAction(
lambda source, target, env: env.PrintUploadInfo(),
"Configuring upload protocol..."))
env.AddPreAction(
["upload", "program"],
env.VerboseAction(
lambda source, target, env: env.PrintUploadInfo(),
"Configuring upload protocol...",
),
)
AlwaysBuild(env.Alias("debug", DEFAULT_TARGETS))
AlwaysBuild(env.Alias("__test", DEFAULT_TARGETS))
@@ -156,12 +194,29 @@ AlwaysBuild(env.Alias("__test", DEFAULT_TARGETS))
##############################################################################
if "envdump" in COMMAND_LINE_TARGETS:
print(env.Dump())
click.echo(env.Dump())
env.Exit(0)
if "idedata" in COMMAND_LINE_TARGETS:
Import("projenv")
print("\n%s\n" % dump_json_to_unicode(
env.DumpIDEData(projenv) # pylint: disable=undefined-variable
))
try:
Import("projenv")
except: # pylint: disable=bare-except
projenv = env
click.echo(
"\n%s\n"
% dump_json_to_unicode(
projenv.DumpIDEData() # pylint: disable=undefined-variable
)
)
env.Exit(0)
if "sizedata" in COMMAND_LINE_TARGETS:
AlwaysBuild(
env.Alias(
"sizedata",
DEFAULT_TARGETS,
env.VerboseAction(env.DumpSizeData, "Generating memory usage report..."),
)
)
Default("sizedata")

View File

@@ -0,0 +1,219 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
# Copyright 2015 MongoDB Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# pylint: disable=unused-argument, protected-access, unused-variable, import-error
# Original: https://github.com/mongodb/mongo/blob/master/site_scons/site_tools/compilation_db.py
from __future__ import absolute_import
import itertools
import json
import os
import SCons
from platformio.builder.tools.platformio import SRC_ASM_EXT, SRC_C_EXT, SRC_CXX_EXT
from platformio.proc import where_is_program
# Implements the ability for SCons to emit a compilation database for the MongoDB project. See
# http://clang.llvm.org/docs/JSONCompilationDatabase.html for details on what a compilation
# database is, and why you might want one. The only user visible entry point here is
# 'env.CompilationDatabase'. This method takes an optional 'target' to name the file that
# should hold the compilation database, otherwise, the file defaults to compile_commands.json,
# which is the name that most clang tools search for by default.
# TODO: Is there a better way to do this than this global? Right now this exists so that the
# emitter we add can record all of the things it emits, so that the scanner for the top level
# compilation database can access the complete list, and also so that the writer has easy
# access to write all of the files. But it seems clunky. How can the emitter and the scanner
# communicate more gracefully?
__COMPILATION_DB_ENTRIES = []
# We make no effort to avoid rebuilding the entries. Someday, perhaps we could and even
# integrate with the cache, but there doesn't seem to be much call for it.
class __CompilationDbNode(SCons.Node.Python.Value):
def __init__(self, value):
SCons.Node.Python.Value.__init__(self, value)
self.Decider(changed_since_last_build_node)
def changed_since_last_build_node(*args, **kwargs):
""" Dummy decider to force always building"""
return True
def makeEmitCompilationDbEntry(comstr):
"""
Effectively this creates a lambda function to capture:
* command line
* source
* target
:param comstr: unevaluated command line
:return: an emitter which has captured the above
"""
user_action = SCons.Action.Action(comstr)
def EmitCompilationDbEntry(target, source, env):
"""
This emitter will be added to each c/c++ object build to capture the info needed
for clang tools
:param target: target node(s)
:param source: source node(s)
:param env: Environment for use building this node
:return: target(s), source(s)
"""
# Resolve absolute path of toolchain
for cmd in ("CC", "CXX", "AS"):
if cmd not in env:
continue
if os.path.isabs(env[cmd]):
continue
env[cmd] = where_is_program(
env.subst("$%s" % cmd), env.subst("${ENV['PATH']}")
)
dbtarget = __CompilationDbNode(source)
entry = env.__COMPILATIONDB_Entry(
target=dbtarget,
source=[],
__COMPILATIONDB_UTARGET=target,
__COMPILATIONDB_USOURCE=source,
__COMPILATIONDB_UACTION=user_action,
__COMPILATIONDB_ENV=env,
)
# TODO: Technically, these next two lines should not be required: it should be fine to
# cache the entries. However, they don't seem to update properly. Since they are quick
# to re-generate disable caching and sidestep this problem.
env.AlwaysBuild(entry)
env.NoCache(entry)
__COMPILATION_DB_ENTRIES.append(dbtarget)
return target, source
return EmitCompilationDbEntry
def CompilationDbEntryAction(target, source, env, **kw):
"""
Create a dictionary with evaluated command line, target, source
and store that info as an attribute on the target
(Which has been stored in __COMPILATION_DB_ENTRIES array
:param target: target node(s)
:param source: source node(s)
:param env: Environment for use building this node
:param kw:
:return: None
"""
command = env["__COMPILATIONDB_UACTION"].strfunction(
target=env["__COMPILATIONDB_UTARGET"],
source=env["__COMPILATIONDB_USOURCE"],
env=env["__COMPILATIONDB_ENV"],
)
entry = {
"directory": env.Dir("#").abspath,
"command": command,
"file": str(env["__COMPILATIONDB_USOURCE"][0]),
}
target[0].write(entry)
def WriteCompilationDb(target, source, env):
entries = []
for s in __COMPILATION_DB_ENTRIES:
item = s.read()
item["file"] = os.path.abspath(item["file"])
entries.append(item)
with open(str(target[0]), "w") as target_file:
json.dump(
entries, target_file, sort_keys=True, indent=4, separators=(",", ": ")
)
def ScanCompilationDb(node, env, path):
return __COMPILATION_DB_ENTRIES
def generate(env, **kwargs):
static_obj, shared_obj = SCons.Tool.createObjBuilders(env)
env["COMPILATIONDB_COMSTR"] = kwargs.get(
"COMPILATIONDB_COMSTR", "Building compilation database $TARGET"
)
components_by_suffix = itertools.chain(
itertools.product(
[".%s" % ext for ext in SRC_C_EXT],
[
(static_obj, SCons.Defaults.StaticObjectEmitter, "$CCCOM"),
(shared_obj, SCons.Defaults.SharedObjectEmitter, "$SHCCCOM"),
],
),
itertools.product(
[".%s" % ext for ext in SRC_CXX_EXT],
[
(static_obj, SCons.Defaults.StaticObjectEmitter, "$CXXCOM"),
(shared_obj, SCons.Defaults.SharedObjectEmitter, "$SHCXXCOM"),
],
),
itertools.product(
[".%s" % ext for ext in SRC_ASM_EXT],
[(static_obj, SCons.Defaults.StaticObjectEmitter, "$ASCOM")],
),
)
for entry in components_by_suffix:
suffix = entry[0]
builder, base_emitter, command = entry[1]
# Assumes a dictionary emitter
emitter = builder.emitter[suffix]
builder.emitter[suffix] = SCons.Builder.ListEmitter(
[emitter, makeEmitCompilationDbEntry(command)]
)
env["BUILDERS"]["__COMPILATIONDB_Entry"] = SCons.Builder.Builder(
action=SCons.Action.Action(CompilationDbEntryAction, None),
)
env["BUILDERS"]["__COMPILATIONDB_Database"] = SCons.Builder.Builder(
action=SCons.Action.Action(WriteCompilationDb, "$COMPILATIONDB_COMSTR"),
target_scanner=SCons.Scanner.Scanner(
function=ScanCompilationDb, node_class=None
),
)
def CompilationDatabase(env, target):
result = env.__COMPILATIONDB_Database(target=target, source=[])
env.AlwaysBuild(result)
env.NoCache(result)
return result
env.AddMethod(CompilationDatabase, "CompilationDatabase")
def exists(env):
return True
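As the module comment above notes, the only user-visible entry point of this new tool is ``env.CompilationDatabase``; the builder script in this changeset wires it up roughly as in the following sketch (paraphrasing the "compiledb" handling shown in the main.py hunks above)::

    # "env" is the SCons DefaultEnvironment created by the builder script,
    # and COMMAND_LINE_TARGETS comes from SCons.Script.
    if "compiledb" in COMMAND_LINE_TARGETS:
        env.Tool("compilation_db")  # load this tool on demand
        # Emit compile_commands.json at $COMPILATIONDB_PATH.
        env.Alias("compiledb", env.CompilationDatabase("$COMPILATIONDB_PATH"))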

View File

@@ -14,9 +14,8 @@
from __future__ import absolute_import
import os
from glob import glob
from os import environ
from os.path import abspath, isfile, join
from SCons.Defaults import processDefines # pylint: disable=import-error
@@ -25,11 +24,11 @@ from platformio.managers.core import get_core_package_dir
from platformio.proc import exec_command, where_is_program
def _dump_includes(env, projenv):
def _dump_includes(env):
includes = []
for item in projenv.get("CPPPATH", []):
includes.append(projenv.subst(item))
for item in env.get("CPPPATH", []):
includes.append(env.subst(item))
# installed libs
for lb in env.GetLibBuilders():
@@ -42,10 +41,10 @@ def _dump_includes(env, projenv):
continue
toolchain_dir = glob_escape(p.get_package_dir(name))
toolchain_incglobs = [
join(toolchain_dir, "*", "include*"),
join(toolchain_dir, "*", "include", "c++", "*"),
join(toolchain_dir, "*", "include", "c++", "*", "*-*-*"),
join(toolchain_dir, "lib", "gcc", "*", "*", "include*")
os.path.join(toolchain_dir, "*", "include*"),
os.path.join(toolchain_dir, "*", "include", "c++", "*"),
os.path.join(toolchain_dir, "*", "include", "c++", "*", "*-*-*"),
os.path.join(toolchain_dir, "lib", "gcc", "*", "*", "include*"),
]
for g in toolchain_incglobs:
includes.extend(glob(g))
@@ -54,15 +53,14 @@ def _dump_includes(env, projenv):
if unity_dir:
includes.append(unity_dir)
includes.extend(
[env.subst("$PROJECTINCLUDE_DIR"),
env.subst("$PROJECTSRC_DIR")])
includes.extend([env.subst("$PROJECT_INCLUDE_DIR"), env.subst("$PROJECT_SRC_DIR")])
# remove duplicates
result = []
for item in includes:
item = os.path.realpath(item)
if item not in result:
result.append(abspath(item))
result.append(item)
return result
@@ -70,16 +68,16 @@ def _dump_includes(env, projenv):
def _get_gcc_defines(env):
items = []
try:
sysenv = environ.copy()
sysenv['PATH'] = str(env['ENV']['PATH'])
result = exec_command("echo | %s -dM -E -" % env.subst("$CC"),
env=sysenv,
shell=True)
sysenv = os.environ.copy()
sysenv["PATH"] = str(env["ENV"]["PATH"])
result = exec_command(
"echo | %s -dM -E -" % env.subst("$CC"), env=sysenv, shell=True
)
except OSError:
return items
if result['returncode'] != 0:
if result["returncode"] != 0:
return items
for line in result['out'].split("\n"):
for line in result["out"].split("\n"):
tokens = line.strip().split(" ", 2)
if not tokens or tokens[0] != "#define":
continue
@@ -94,17 +92,22 @@ def _dump_defines(env):
defines = []
# global symbols
for item in processDefines(env.get("CPPDEFINES", [])):
defines.append(env.subst(item).replace('\\', ''))
defines.append(env.subst(item).replace("\\", ""))
# special symbol for Atmel AVR MCU
if env['PIOPLATFORM'] == "atmelavr":
if env["PIOPLATFORM"] == "atmelavr":
board_mcu = env.get("BOARD_MCU")
if not board_mcu and "BOARD" in env:
board_mcu = env.BoardConfig().get("build.mcu")
if board_mcu:
defines.append(
str("__AVR_%s__" % board_mcu.upper().replace(
"ATMEGA", "ATmega").replace("ATTINY", "ATtiny")))
str(
"__AVR_%s__"
% board_mcu.upper()
.replace("ATMEGA", "ATmega")
.replace("ATTINY", "ATtiny")
)
)
# built-in GCC marcos
# if env.GetCompilerType() == "gcc":
@@ -116,7 +119,7 @@ def _dump_defines(env):
def _get_svd_path(env):
svd_path = env.GetProjectOption("debug_svd_path")
if svd_path:
return abspath(svd_path)
return os.path.realpath(svd_path)
if "BOARD" not in env:
return None
@@ -126,45 +129,47 @@ def _get_svd_path(env):
except (AssertionError, KeyError):
return None
# custom path to SVD file
if isfile(svd_path):
if os.path.isfile(svd_path):
return svd_path
# default file from ./platform/misc/svd folder
p = env.PioPlatform()
if isfile(join(p.get_dir(), "misc", "svd", svd_path)):
return abspath(join(p.get_dir(), "misc", "svd", svd_path))
if os.path.isfile(os.path.join(p.get_dir(), "misc", "svd", svd_path)):
return os.path.realpath(os.path.join(p.get_dir(), "misc", "svd", svd_path))
return None
def DumpIDEData(env, projenv):
LINTCCOM = "$CFLAGS $CCFLAGS $CPPFLAGS"
LINTCXXCOM = "$CXXFLAGS $CCFLAGS $CPPFLAGS"
def _escape_build_flag(flags):
return [flag if " " not in flag else '"%s"' % flag for flag in flags]
def DumpIDEData(env):
env["__escape_build_flag"] = _escape_build_flag
LINTCCOM = (
"${__escape_build_flag(CFLAGS)} ${__escape_build_flag(CCFLAGS)} $CPPFLAGS"
)
LINTCXXCOM = (
"${__escape_build_flag(CXXFLAGS)} ${__escape_build_flag(CCFLAGS)} $CPPFLAGS"
)
data = {
"env_name": env["PIOENV"],
"libsource_dirs": [env.subst(l) for l in env.GetLibSourceDirs()],
"defines":
_dump_defines(env),
"includes":
_dump_includes(env, projenv),
"cc_flags":
env.subst(LINTCCOM),
"cxx_flags":
env.subst(LINTCXXCOM),
"cc_path":
where_is_program(env.subst("$CC"), env.subst("${ENV['PATH']}")),
"cxx_path":
where_is_program(env.subst("$CXX"), env.subst("${ENV['PATH']}")),
"gdb_path":
where_is_program(env.subst("$GDB"), env.subst("${ENV['PATH']}")),
"prog_path":
env.subst("$PROG_PATH"),
"flash_extra_images": [{
"offset": item[0],
"path": env.subst(item[1])
} for item in env.get("FLASH_EXTRA_IMAGES", [])],
"svd_path":
_get_svd_path(env),
"compiler_type":
env.GetCompilerType()
"defines": _dump_defines(env),
"includes": _dump_includes(env),
"cc_flags": env.subst(LINTCCOM),
"cxx_flags": env.subst(LINTCXXCOM),
"cc_path": where_is_program(env.subst("$CC"), env.subst("${ENV['PATH']}")),
"cxx_path": where_is_program(env.subst("$CXX"), env.subst("${ENV['PATH']}")),
"gdb_path": where_is_program(env.subst("$GDB"), env.subst("${ENV['PATH']}")),
"prog_path": env.subst("$PROG_PATH"),
"flash_extra_images": [
{"offset": item[0], "path": env.subst(item[1])}
for item in env.get("FLASH_EXTRA_IMAGES", [])
],
"svd_path": _get_svd_path(env),
"compiler_type": env.GetCompilerType(),
}
env_ = env.Clone()
@@ -178,10 +183,7 @@ def DumpIDEData(env, projenv):
_new_defines.append(item)
env_.Replace(CPPDEFINES=_new_defines)
data.update({
"cc_flags": env_.subst(LINTCCOM),
"cxx_flags": env_.subst(LINTCXXCOM)
})
data.update({"cc_flags": env_.subst(LINTCCOM), "cxx_flags": env_.subst(LINTCXXCOM)})
return data

View File

@@ -17,13 +17,12 @@
from __future__ import absolute_import
import codecs
import hashlib
import io
import os
import re
import sys
from os.path import (basename, commonprefix, expanduser, isdir, isfile, join,
realpath, sep)
from os.path import basename, commonprefix, isdir, isfile, join, realpath, sep
import click
import SCons.Scanner # pylint: disable=import-error
@@ -33,13 +32,16 @@ from SCons.Script import DefaultEnvironment # pylint: disable=import-error
from platformio import exception, fs, util
from platformio.builder.tools import platformio as piotool
from platformio.compat import (WINDOWS, get_file_contents, hashlib_encode_data,
string_types)
from platformio.compat import WINDOWS, hashlib_encode_data, string_types
from platformio.managers.lib import LibraryManager
from platformio.package.manifest.parser import (
ManifestParserError,
ManifestParserFactory,
)
from platformio.project.options import ProjectOptions
class LibBuilderFactory(object):
@staticmethod
def new(env, path, verbose=int(ARGUMENTS.get("PIOVERBOSE", 0))):
clsname = "UnknownLibBuilder"
@@ -47,31 +49,30 @@ class LibBuilderFactory(object):
clsname = "PlatformIOLibBuilder"
else:
used_frameworks = LibBuilderFactory.get_used_frameworks(env, path)
common_frameworks = (set(env.get("PIOFRAMEWORK", []))
& set(used_frameworks))
common_frameworks = set(env.get("PIOFRAMEWORK", [])) & set(used_frameworks)
if common_frameworks:
clsname = "%sLibBuilder" % list(common_frameworks)[0].title()
elif used_frameworks:
clsname = "%sLibBuilder" % used_frameworks[0].title()
obj = getattr(sys.modules[__name__], clsname)(env,
path,
verbose=verbose)
obj = getattr(sys.modules[__name__], clsname)(env, path, verbose=verbose)
assert isinstance(obj, LibBuilderBase)
return obj
@staticmethod
def get_used_frameworks(env, path):
if any(
isfile(join(path, fname))
for fname in ("library.properties", "keywords.txt")):
isfile(join(path, fname))
for fname in ("library.properties", "keywords.txt")
):
return ["arduino"]
if isfile(join(path, "module.json")):
return ["mbed"]
include_re = re.compile(r'^#include\s+(<|")(Arduino|mbed)\.h(<|")',
flags=re.MULTILINE)
include_re = re.compile(
r'^#include\s+(<|")(Arduino|mbed)\.h(<|")', flags=re.MULTILINE
)
# check source files
for root, _, files in os.walk(path, followlinks=True):
@@ -79,9 +80,11 @@ class LibBuilderFactory(object):
return ["mbed"]
for fname in files:
if not fs.path_endswith_ext(
fname, piotool.SRC_BUILD_EXT + piotool.SRC_HEADER_EXT):
fname, piotool.SRC_BUILD_EXT + piotool.SRC_HEADER_EXT
):
continue
content = get_file_contents(join(root, fname))
with io.open(join(root, fname), errors="ignore") as fp:
content = fp.read()
if not content:
continue
if "Arduino.h" in content and include_re.search(content):
@@ -93,12 +96,6 @@ class LibBuilderFactory(object):
class LibBuilderBase(object):
LDF_MODES = ["off", "chain", "deep", "chain+", "deep+"]
LDF_MODE_DEFAULT = "chain"
COMPAT_MODES = ["off", "soft", "strict"]
COMPAT_MODE_DEFAULT = "soft"
CLASSIC_SCANNER = SCons.Scanner.C.CScanner()
CCONDITIONAL_SCANNER = SCons.Scanner.C.CConditionalScanner()
# Max depth of nested includes:
@@ -116,7 +113,14 @@ class LibBuilderBase(object):
self.path = realpath(env.subst(path))
self.verbose = verbose
self._manifest = manifest if manifest else self.load_manifest()
try:
self._manifest = manifest if manifest else self.load_manifest()
except ManifestParserError:
click.secho(
"Warning! Ignoring broken library manifest in " + self.path, fg="yellow"
)
self._manifest = {}
self._is_dependent = False
self._is_built = False
self._depbuilders = list()
@@ -124,7 +128,7 @@ class LibBuilderBase(object):
self._processed_files = list()
# reset source filter, could be overridden with extra script
self.env['SRC_FILTER'] = ""
self.env["SRC_FILTER"] = ""
# process extra options and append to build environment
self.process_extra_options()
@@ -152,8 +156,7 @@ class LibBuilderBase(object):
@property
def dependencies(self):
return LibraryManager.normalize_dependencies(
self._manifest.get("dependencies", []))
return self._manifest.get("dependencies")
@property
def src_filter(self):
@@ -161,7 +164,7 @@ class LibBuilderBase(object):
"-<example%s>" % os.sep,
"-<examples%s>" % os.sep,
"-<test%s>" % os.sep,
"-<tests%s>" % os.sep
"-<tests%s>" % os.sep,
]
@property
@@ -172,8 +175,7 @@ class LibBuilderBase(object):
@property
def src_dir(self):
return (join(self.path, "src")
if isdir(join(self.path, "src")) else self.path)
return join(self.path, "src") if isdir(join(self.path, "src")) else self.path
def get_include_dirs(self):
items = []
@@ -214,40 +216,41 @@ class LibBuilderBase(object):
@property
def lib_archive(self):
return self.env.GetProjectOption("lib_archive", True)
return self.env.GetProjectOption("lib_archive")
@property
def lib_ldf_mode(self):
return self.env.GetProjectOption("lib_ldf_mode", self.LDF_MODE_DEFAULT)
return self.env.GetProjectOption("lib_ldf_mode")
@staticmethod
def validate_ldf_mode(mode):
ldf_modes = ProjectOptions["env.lib_ldf_mode"].type.choices
if isinstance(mode, string_types):
mode = mode.strip().lower()
if mode in LibBuilderBase.LDF_MODES:
if mode in ldf_modes:
return mode
try:
return LibBuilderBase.LDF_MODES[int(mode)]
return ldf_modes[int(mode)]
except (IndexError, ValueError):
pass
return LibBuilderBase.LDF_MODE_DEFAULT
return ProjectOptions["env.lib_ldf_mode"].default
@property
def lib_compat_mode(self):
return self.env.GetProjectOption("lib_compat_mode",
self.COMPAT_MODE_DEFAULT)
return self.env.GetProjectOption("lib_compat_mode")
@staticmethod
def validate_compat_mode(mode):
compat_modes = ProjectOptions["env.lib_compat_mode"].type.choices
if isinstance(mode, string_types):
mode = mode.strip().lower()
if mode in LibBuilderBase.COMPAT_MODES:
if mode in compat_modes:
return mode
try:
return LibBuilderBase.COMPAT_MODES[int(mode)]
return compat_modes[int(mode)]
except (IndexError, ValueError):
pass
return LibBuilderBase.COMPAT_MODE_DEFAULT
return ProjectOptions["env.lib_compat_mode"].default
def is_platforms_compatible(self, platforms):
return True
@@ -263,11 +266,10 @@ class LibBuilderBase(object):
self.env.ProcessFlags(self.build_flags)
if self.extra_script:
self.env.SConscriptChdir(1)
self.env.SConscript(realpath(self.extra_script),
exports={
"env": self.env,
"pio_lib_builder": self
})
self.env.SConscript(
realpath(self.extra_script),
exports={"env": self.env, "pio_lib_builder": self},
)
self.env.ProcessUnFlags(self.build_unflags)
def process_dependencies(self):
@@ -276,7 +278,7 @@ class LibBuilderBase(object):
for item in self.dependencies:
found = False
for lb in self.env.GetLibBuilders():
if item['name'] != lb.name:
if item["name"] != lb.name:
continue
found = True
if lb not in self.depbuilders:
@@ -284,65 +286,79 @@ class LibBuilderBase(object):
break
if not found and self.verbose:
sys.stderr.write("Warning: Ignored `%s` dependency for `%s` "
"library\n" % (item['name'], self.name))
sys.stderr.write(
"Warning: Ignored `%s` dependency for `%s` "
"library\n" % (item["name"], self.name)
)
def get_search_files(self):
items = [
join(self.src_dir, item) for item in self.env.MatchSourceFiles(
self.src_dir, self.src_filter)
join(self.src_dir, item)
for item in self.env.MatchSourceFiles(self.src_dir, self.src_filter)
]
include_dir = self.include_dir
if include_dir:
items.extend([
join(include_dir, item)
for item in self.env.MatchSourceFiles(include_dir)
])
items.extend(
[
join(include_dir, item)
for item in self.env.MatchSourceFiles(include_dir)
]
)
return items
def _validate_search_files(self, search_files=None):
if not search_files:
search_files = []
assert isinstance(search_files, list)
_search_files = []
for path in search_files:
if path not in self._processed_files:
_search_files.append(path)
self._processed_files.append(path)
return _search_files
def _get_found_includes(self, search_files=None):
def _get_found_includes( # pylint: disable=too-many-branches
self, search_files=None
):
# all include directories
if not LibBuilderBase._INCLUDE_DIRS_CACHE:
LibBuilderBase._INCLUDE_DIRS_CACHE = []
LibBuilderBase._INCLUDE_DIRS_CACHE = [
self.env.Dir(d)
for d in ProjectAsLibBuilder(
self.envorigin, "$PROJECT_DIR"
).get_include_dirs()
]
for lb in self.env.GetLibBuilders():
LibBuilderBase._INCLUDE_DIRS_CACHE.extend(
[self.env.Dir(d) for d in lb.get_include_dirs()])
[self.env.Dir(d) for d in lb.get_include_dirs()]
)
# append self include directories
include_dirs = [self.env.Dir(d) for d in self.get_include_dirs()]
include_dirs.extend(LibBuilderBase._INCLUDE_DIRS_CACHE)
result = []
for path in self._validate_search_files(search_files):
for path in search_files or []:
if path in self._processed_files:
continue
self._processed_files.append(path)
try:
assert "+" in self.lib_ldf_mode
candidates = LibBuilderBase.CCONDITIONAL_SCANNER(
self.env.File(path),
self.env,
tuple(include_dirs),
depth=self.CCONDITIONAL_SCANNER_DEPTH)
depth=self.CCONDITIONAL_SCANNER_DEPTH,
)
# mark candidates already processed via Conditional Scanner
self._processed_files.extend(
[
c.get_abspath()
for c in candidates
if c.get_abspath() not in self._processed_files
]
)
except Exception as e: # pylint: disable=broad-except
if self.verbose and "+" in self.lib_ldf_mode:
sys.stderr.write(
"Warning! Classic Pre Processor is used for `%s`, "
"advanced has failed with `%s`\n" % (path, e))
"advanced has failed with `%s`\n" % (path, e)
)
candidates = LibBuilderBase.CLASSIC_SCANNER(
self.env.File(path), self.env, tuple(include_dirs))
self.env.File(path), self.env, tuple(include_dirs)
)
# print(path, map(lambda n: n.get_abspath(), candidates))
# print(path, [c.get_abspath() for c in candidates])
for item in candidates:
if item not in result:
result.append(item)
@@ -351,8 +367,8 @@ class LibBuilderBase(object):
_h_path = item.get_abspath()
if not fs.path_endswith_ext(_h_path, piotool.SRC_HEADER_EXT):
continue
_f_part = _h_path[:_h_path.rindex(".")]
for ext in piotool.SRC_C_EXT:
_f_part = _h_path[: _h_path.rindex(".")]
for ext in piotool.SRC_C_EXT + piotool.SRC_CXX_EXT:
if not isfile("%s.%s" % (_f_part, ext)):
continue
_c_path = self.env.File("%s.%s" % (_f_part, ext))
@@ -362,7 +378,6 @@ class LibBuilderBase(object):
return result
def depend_recursive(self, lb, search_files=None):
def _already_depends(_lb):
if self in _lb.depbuilders:
return True
@@ -375,9 +390,10 @@ class LibBuilderBase(object):
if self != lb:
if _already_depends(lb):
if self.verbose:
sys.stderr.write("Warning! Circular dependencies detected "
"between `%s` and `%s`\n" %
(self.path, lb.path))
sys.stderr.write(
"Warning! Circular dependencies detected "
"between `%s` and `%s`\n" % (self.path, lb.path)
)
self._circular_deps.append(lb)
elif lb not in self._depbuilders:
self._depbuilders.append(lb)
@@ -434,11 +450,10 @@ class LibBuilderBase(object):
if self.lib_archive:
libs.append(
self.env.BuildLibrary(self.build_dir, self.src_dir,
self.src_filter))
self.env.BuildLibrary(self.build_dir, self.src_dir, self.src_filter)
)
else:
self.env.BuildSources(self.build_dir, self.src_dir,
self.src_filter)
self.env.BuildSources(self.build_dir, self.src_dir, self.src_filter)
return libs
@@ -447,19 +462,11 @@ class UnknownLibBuilder(LibBuilderBase):
class ArduinoLibBuilder(LibBuilderBase):
def load_manifest(self):
manifest = {}
if not isfile(join(self.path, "library.properties")):
return manifest
manifest_path = join(self.path, "library.properties")
with codecs.open(manifest_path, encoding="utf-8") as fp:
for line in fp.readlines():
if "=" not in line:
continue
key, value = line.split("=", 1)
manifest[key.strip()] = value.strip()
return manifest
if not isfile(manifest_path):
return {}
return ManifestParserFactory.new_from_file(manifest_path).as_dict()
def get_include_dirs(self):
include_dirs = LibBuilderBase.get_include_dirs(self)
@@ -503,35 +510,18 @@ class ArduinoLibBuilder(LibBuilderBase):
return util.items_in_list(frameworks, ["arduino", "energia"])
def is_platforms_compatible(self, platforms):
platforms_map = {
"avr": ["atmelavr"],
"sam": ["atmelsam"],
"samd": ["atmelsam"],
"esp8266": ["espressif8266"],
"esp32": ["espressif32"],
"arc32": ["intel_arc32"],
"stm32": ["ststm32"],
"nrf5": ["nordicnrf51", "nordicnrf52"]
}
items = []
for arch in self._manifest.get("architectures", "").split(","):
arch = arch.strip().lower()
if arch == "*":
items = "*"
break
if arch in platforms_map:
items.extend(platforms_map[arch])
items = self._manifest.get("platforms", [])
if not items:
return LibBuilderBase.is_platforms_compatible(self, platforms)
return util.items_in_list(platforms, items)
class MbedLibBuilder(LibBuilderBase):
def load_manifest(self):
if not isfile(join(self.path, "module.json")):
manifest_path = join(self.path, "module.json")
if not isfile(manifest_path):
return {}
return fs.load_json(join(self.path, "module.json"))
return ManifestParserFactory.new_from_file(manifest_path).as_dict()
@property
def include_dir(self):
@@ -586,8 +576,7 @@ class MbedLibBuilder(LibBuilderBase):
mbed_config_path = join(self.env.subst(p), "mbed_config.h")
if isfile(mbed_config_path):
break
else:
mbed_config_path = None
mbed_config_path = None
if not mbed_config_path:
return None
@@ -614,14 +603,15 @@ class MbedLibBuilder(LibBuilderBase):
# default macros
for macro in manifest.get("macros", []):
macro = self._mbed_normalize_macro(macro)
macros[macro['name']] = macro
macros[macro["name"]] = macro
# configuration items
for key, options in manifest.get("config", {}).items():
if "value" not in options:
continue
macros[key] = dict(name=options.get("macro_name"),
value=options.get("value"))
macros[key] = dict(
name=options.get("macro_name"), value=options.get("value")
)
# overridden items per target
for target, options in manifest.get("target_overrides", {}).items():
@@ -629,25 +619,23 @@ class MbedLibBuilder(LibBuilderBase):
continue
for macro in options.get("target.macros_add", []):
macro = self._mbed_normalize_macro(macro)
macros[macro['name']] = macro
macros[macro["name"]] = macro
for key, value in options.items():
if not key.startswith("target.") and key in macros:
macros[key]['value'] = value
macros[key]["value"] = value
# normalize macro names
for key, macro in macros.items():
if not macro['name']:
macro['name'] = key
if "." not in macro['name']:
macro['name'] = "%s.%s" % (manifest.get("name"),
macro['name'])
macro['name'] = re.sub(r"[^a-z\d]+",
"_",
macro['name'],
flags=re.I).upper()
macro['name'] = "MBED_CONF_" + macro['name']
if isinstance(macro['value'], bool):
macro['value'] = 1 if macro['value'] else 0
if not macro["name"]:
macro["name"] = key
if "." not in macro["name"]:
macro["name"] = "%s.%s" % (manifest.get("name"), macro["name"])
macro["name"] = re.sub(
r"[^a-z\d]+", "_", macro["name"], flags=re.I
).upper()
macro["name"] = "MBED_CONF_" + macro["name"]
if isinstance(macro["value"], bool):
macro["value"] = 1 if macro["value"] else 0
return {macro["name"]: macro["value"] for macro in macros.values()}
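The normalization loop above turns mbed config keys into MBED_CONF_* defines. A standalone sketch of the same string transformation, using a hypothetical module name and option key so the intermediate steps are visible:

import re

# Hypothetical inputs: a module "my-lib" exposing a boolean "stack-guard" option.
module_name = "my-lib"
macro = {"name": "stack-guard", "value": True}

if "." not in macro["name"]:  # qualify the key with the module name
    macro["name"] = "%s.%s" % (module_name, macro["name"])
macro["name"] = "MBED_CONF_" + re.sub(  # collapse non-alphanumeric runs
    r"[^a-z\d]+", "_", macro["name"], flags=re.I
).upper()
if isinstance(macro["value"], bool):  # booleans become 0/1 for the preprocessor
    macro["value"] = 1 if macro["value"] else 0

print(macro)  # {'name': 'MBED_CONF_MY_LIB_STACK_GUARD', 'value': 1}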
@@ -657,13 +645,13 @@ class MbedLibBuilder(LibBuilderBase):
for line in fp.readlines():
line = line.strip()
if line == "#endif":
lines.append(
"// PlatformIO Library Dependency Finder (LDF)")
lines.extend([
"#define %s %s" %
(name, value if value is not None else "")
for name, value in macros.items()
])
lines.append("// PlatformIO Library Dependency Finder (LDF)")
lines.extend(
[
"#define %s %s" % (name, value if value is not None else "")
for name, value in macros.items()
]
)
lines.append("")
if not line.startswith("#define"):
lines.append(line)
@@ -677,22 +665,13 @@ class MbedLibBuilder(LibBuilderBase):
class PlatformIOLibBuilder(LibBuilderBase):
def load_manifest(self):
assert isfile(join(self.path, "library.json"))
manifest = fs.load_json(join(self.path, "library.json"))
assert "name" in manifest
manifest_path = join(self.path, "library.json")
if not isfile(manifest_path):
return {}
return ManifestParserFactory.new_from_file(manifest_path).as_dict()
# replace "espressif" old name dev/platform with ESP8266
if "platforms" in manifest:
manifest['platforms'] = [
"espressif8266" if p == "espressif" else p
for p in util.items_to_list(manifest['platforms'])
]
return manifest
def _is_arduino_manifest(self):
def _has_arduino_manifest(self):
return isfile(join(self.path, "library.properties"))
@property
@@ -713,9 +692,9 @@ class PlatformIOLibBuilder(LibBuilderBase):
def src_filter(self):
if "srcFilter" in self._manifest.get("build", {}):
return self._manifest.get("build").get("srcFilter")
if self.env['SRC_FILTER']:
return self.env['SRC_FILTER']
if self._is_arduino_manifest():
if self.env["SRC_FILTER"]:
return self.env["SRC_FILTER"]
if self._has_arduino_manifest():
return ArduinoLibBuilder.src_filter.fget(self)
return LibBuilderBase.src_filter.fget(self)
@@ -739,28 +718,33 @@ class PlatformIOLibBuilder(LibBuilderBase):
@property
def lib_archive(self):
global_value = self.env.GetProjectOption("lib_archive")
if global_value is not None:
return global_value
missing = object()
global_value = self.env.GetProjectConfig().getraw(
"env:" + self.env["PIOENV"], "lib_archive", missing
)
if global_value != missing:
return self.env.GetProjectConfig().get(
"env:" + self.env["PIOENV"], "lib_archive"
)
return self._manifest.get("build", {}).get(
"libArchive", LibBuilderBase.lib_archive.fget(self))
"libArchive", LibBuilderBase.lib_archive.fget(self)
)
@property
def lib_ldf_mode(self):
return self.validate_ldf_mode(
self.env.GetProjectOption(
"lib_ldf_mode",
self._manifest.get("build", {}).get(
"libLDFMode", LibBuilderBase.lib_ldf_mode.fget(self))))
self._manifest.get("build", {}).get(
"libLDFMode", LibBuilderBase.lib_ldf_mode.fget(self)
)
)
@property
def lib_compat_mode(self):
return self.validate_compat_mode(
self.env.GetProjectOption(
"lib_compat_mode",
self._manifest.get("build", {}).get(
"libCompatMode",
LibBuilderBase.lib_compat_mode.fget(self))))
self._manifest.get("build", {}).get(
"libCompatMode", LibBuilderBase.lib_compat_mode.fget(self)
)
)
def is_platforms_compatible(self, platforms):
items = self._manifest.get("platforms")
@@ -778,9 +762,12 @@ class PlatformIOLibBuilder(LibBuilderBase):
include_dirs = LibBuilderBase.get_include_dirs(self)
# backwards compatibility with PlatformIO 2.0
if ("build" not in self._manifest and self._is_arduino_manifest()
and not isdir(join(self.path, "src"))
and isdir(join(self.path, "utility"))):
if (
"build" not in self._manifest
and self._has_arduino_manifest()
and not isdir(join(self.path, "src"))
and isdir(join(self.path, "utility"))
):
include_dirs.append(join(self.path, "utility"))
for path in self.env.get("CPPPATH", []):
@@ -791,25 +778,24 @@ class PlatformIOLibBuilder(LibBuilderBase):
class ProjectAsLibBuilder(LibBuilderBase):
def __init__(self, env, *args, **kwargs):
# backup original value, will be reset in base.__init__
project_src_filter = env.get("SRC_FILTER")
super(ProjectAsLibBuilder, self).__init__(env, *args, **kwargs)
self.env['SRC_FILTER'] = project_src_filter
self.env["SRC_FILTER"] = project_src_filter
@property
def include_dir(self):
include_dir = self.env.subst("$PROJECTINCLUDE_DIR")
include_dir = self.env.subst("$PROJECT_INCLUDE_DIR")
return include_dir if isdir(include_dir) else None
@property
def src_dir(self):
return self.env.subst("$PROJECTSRC_DIR")
return self.env.subst("$PROJECT_SRC_DIR")
def get_include_dirs(self):
include_dirs = []
project_include_dir = self.env.subst("$PROJECTINCLUDE_DIR")
project_include_dir = self.env.subst("$PROJECT_INCLUDE_DIR")
if isdir(project_include_dir):
include_dirs.append(project_include_dir)
for include_dir in LibBuilderBase.get_include_dirs(self):
@@ -822,11 +808,14 @@ class ProjectAsLibBuilder(LibBuilderBase):
items = LibBuilderBase.get_search_files(self)
# test files
if "__test" in COMMAND_LINE_TARGETS:
items.extend([
join("$PROJECTTEST_DIR",
item) for item in self.env.MatchSourceFiles(
"$PROJECTTEST_DIR", "$PIOTEST_SRC_FILTER")
])
items.extend(
[
join("$PROJECT_TEST_DIR", item)
for item in self.env.MatchSourceFiles(
"$PROJECT_TEST_DIR", "$PIOTEST_SRC_FILTER"
)
]
)
return items
@property
@@ -839,8 +828,7 @@ class ProjectAsLibBuilder(LibBuilderBase):
@property
def src_filter(self):
return (self.env.get("SRC_FILTER")
or LibBuilderBase.src_filter.fget(self))
return self.env.get("SRC_FILTER") or LibBuilderBase.src_filter.fget(self)
@property
def dependencies(self):
@@ -851,7 +839,6 @@ class ProjectAsLibBuilder(LibBuilderBase):
pass
def install_dependencies(self):
def _is_builtin(uri):
for lb in self.env.GetLibBuilders():
if lb.name == uri:
@@ -874,8 +861,7 @@ class ProjectAsLibBuilder(LibBuilderBase):
not_found_uri.append(uri)
did_install = False
lm = LibraryManager(
self.env.subst(join("$PROJECTLIBDEPS_DIR", "$PIOENV")))
lm = LibraryManager(self.env.subst(join("$PROJECT_LIBDEPS_DIR", "$PIOENV")))
for uri in not_found_uri:
try:
lm.install(uri)
@@ -898,7 +884,7 @@ class ProjectAsLibBuilder(LibBuilderBase):
if not lib_dir:
continue
for lb in self.env.GetLibBuilders():
if lib_dir not in lb:
if lib_dir != lb.path:
continue
if lb not in self.depbuilders:
self.depend_recursive(lb)
@@ -926,28 +912,26 @@ class ProjectAsLibBuilder(LibBuilderBase):
def GetLibSourceDirs(env):
items = env.GetProjectOption("lib_extra_dirs", [])
items.extend(env['LIBSOURCE_DIRS'])
items.extend(env["LIBSOURCE_DIRS"])
return [
env.subst(expanduser(item) if item.startswith("~") else item)
env.subst(fs.expanduser(item) if item.startswith("~") else item)
for item in items
]
def IsCompatibleLibBuilder(env,
lb,
verbose=int(ARGUMENTS.get("PIOVERBOSE", 0))):
def IsCompatibleLibBuilder(env, lb, verbose=int(ARGUMENTS.get("PIOVERBOSE", 0))):
compat_mode = lb.lib_compat_mode
if lb.name in env.GetProjectOption("lib_ignore", []):
if verbose:
sys.stderr.write("Ignored library %s\n" % lb.path)
return None
if compat_mode == "strict" and not lb.is_platforms_compatible(
env['PIOPLATFORM']):
if compat_mode == "strict" and not lb.is_platforms_compatible(env["PIOPLATFORM"]):
if verbose:
sys.stderr.write("Platform incompatible library %s\n" % lb.path)
return False
if (compat_mode in ("soft", "strict") and "PIOFRAMEWORK" in env
and not lb.is_frameworks_compatible(env.get("PIOFRAMEWORK", []))):
if compat_mode in ("soft", "strict") and not lb.is_frameworks_compatible(
env.get("PIOFRAMEWORK", [])
):
if verbose:
sys.stderr.write("Framework incompatible library %s\n" % lb.path)
return False
@@ -956,8 +940,10 @@ def IsCompatibleLibBuilder(env,
def GetLibBuilders(env): # pylint: disable=too-many-branches
if DefaultEnvironment().get("__PIO_LIB_BUILDERS", None) is not None:
return sorted(DefaultEnvironment()['__PIO_LIB_BUILDERS'],
key=lambda lb: 0 if lb.dependent else 1)
return sorted(
DefaultEnvironment()["__PIO_LIB_BUILDERS"],
key=lambda lb: 0 if lb.dependent else 1,
)
DefaultEnvironment().Replace(__PIO_LIB_BUILDERS=[])
@@ -977,7 +963,8 @@ def GetLibBuilders(env): # pylint: disable=too-many-branches
except exception.InvalidJSONFile:
if verbose:
sys.stderr.write(
"Skip library with broken manifest: %s\n" % lib_dir)
"Skip library with broken manifest: %s\n" % lib_dir
)
continue
if env.IsCompatibleLibBuilder(lb):
DefaultEnvironment().Append(__PIO_LIB_BUILDERS=[lb])
@@ -992,15 +979,15 @@ def GetLibBuilders(env): # pylint: disable=too-many-branches
if verbose and found_incompat:
sys.stderr.write(
"More details about \"Library Compatibility Mode\": "
'More details about "Library Compatibility Mode": '
"https://docs.platformio.org/page/librarymanager/ldf.html#"
"ldf-compat-mode\n")
"ldf-compat-mode\n"
)
return DefaultEnvironment()['__PIO_LIB_BUILDERS']
return DefaultEnvironment()["__PIO_LIB_BUILDERS"]
def ConfigureProjectLibBuilder(env):
def _get_vcs_info(lb):
path = LibraryManager.get_src_manifest_path(lb.path)
return fs.load_json(path) if path else None
@@ -1025,40 +1012,42 @@ def ConfigureProjectLibBuilder(env):
title += " %s" % lb.version
if vcs_info and vcs_info.get("version"):
title += " #%s" % vcs_info.get("version")
sys.stdout.write("%s|-- %s" % (margin, title))
click.echo("%s|-- %s" % (margin, title), nl=False)
if int(ARGUMENTS.get("PIOVERBOSE", 0)):
if vcs_info:
sys.stdout.write(" [%s]" % vcs_info.get("url"))
sys.stdout.write(" (")
sys.stdout.write(lb.path)
sys.stdout.write(")")
sys.stdout.write("\n")
click.echo(" [%s]" % vcs_info.get("url"), nl=False)
click.echo(" (", nl=False)
click.echo(lb.path, nl=False)
click.echo(")", nl=False)
click.echo("")
if lb.depbuilders:
_print_deps_tree(lb, level + 1)
project = ProjectAsLibBuilder(env, "$PROJECT_DIR")
ldf_mode = LibBuilderBase.lib_ldf_mode.fget(project)
print("LDF: Library Dependency Finder -> http://bit.ly/configure-pio-ldf")
print("LDF Modes: Finder ~ %s, Compatibility ~ %s" %
(ldf_mode, project.lib_compat_mode))
click.echo("LDF: Library Dependency Finder -> http://bit.ly/configure-pio-ldf")
click.echo(
"LDF Modes: Finder ~ %s, Compatibility ~ %s"
% (ldf_mode, project.lib_compat_mode)
)
project.install_dependencies()
lib_builders = env.GetLibBuilders()
print("Found %d compatible libraries" % len(lib_builders))
click.echo("Found %d compatible libraries" % len(lib_builders))
print("Scanning dependencies...")
click.echo("Scanning dependencies...")
project.search_deps_recursive()
if ldf_mode.startswith("chain") and project.depbuilders:
_correct_found_libs(lib_builders)
if project.depbuilders:
print("Dependency Graph")
click.echo("Dependency Graph")
_print_deps_tree(project)
else:
print("No dependencies")
click.echo("No dependencies")
return project

View File

@@ -22,12 +22,12 @@ from platformio.compat import WINDOWS, hashlib_encode_data
# Windows CLI has limit with command length to 8192
# Leave 2000 chars for flags and other options
MAX_SOURCES_LENGTH = 6000
MAX_LINE_LENGTH = 6000 if WINDOWS else 128072
def long_sources_hook(env, sources):
_sources = str(sources).replace("\\", "/")
if len(str(_sources)) < MAX_SOURCES_LENGTH:
if len(str(_sources)) < MAX_LINE_LENGTH:
return sources
# fix space in paths
@@ -43,7 +43,7 @@ def long_sources_hook(env, sources):
def long_incflags_hook(env, incflags):
_incflags = env.subst(incflags).replace("\\", "/")
if len(_incflags) < MAX_SOURCES_LENGTH:
if len(_incflags) < MAX_LINE_LENGTH:
return incflags
# fix space in paths
@@ -61,8 +61,9 @@ def _file_long_data(env, data):
build_dir = env.subst("$BUILD_DIR")
if not isdir(build_dir):
makedirs(build_dir)
tmp_file = join(build_dir,
"longcmd-%s" % md5(hashlib_encode_data(data)).hexdigest())
tmp_file = join(
build_dir, "longcmd-%s" % md5(hashlib_encode_data(data)).hexdigest()
)
if isfile(tmp_file):
return tmp_file
with open(tmp_file, "w") as fp:
@@ -75,18 +76,17 @@ def exists(_):
def generate(env):
if not WINDOWS:
return None
env.Replace(_long_sources_hook=long_sources_hook)
env.Replace(_long_incflags_hook=long_incflags_hook)
coms = {}
for key in ("ARCOM", "LINKCOM"):
coms[key] = env.get(key, "").replace(
"$SOURCES", "${_long_sources_hook(__env__, SOURCES)}")
"$SOURCES", "${_long_sources_hook(__env__, SOURCES)}"
)
for key in ("_CCCOMCOM", "ASPPCOM"):
coms[key] = env.get(key, "").replace(
"$_CPPINCFLAGS", "${_long_incflags_hook(__env__, _CPPINCFLAGS)}")
"$_CPPINCFLAGS", "${_long_incflags_hook(__env__, _CPPINCFLAGS)}"
)
env.Replace(**coms)
return env
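The hooks above work around the Windows command-length limit by dumping long argument lists to a file and passing it back with an "@" prefix, which GCC binutils expand as a response file. A standalone sketch of that technique, with illustrative tool and file names that are not part of this module:

import subprocess
import tempfile

def run_with_response_file(tool, args, limit=6000):
    # Short command lines run directly; long ones go through an @response file.
    if len(" ".join(args)) < limit:
        return subprocess.call([tool] + args)
    with tempfile.NamedTemporaryFile("w", suffix=".rsp", delete=False) as fp:
        fp.write("\n".join(args))
        rsp_path = fp.name
    return subprocess.call([tool, "@" + rsp_path])

# e.g. run_with_response_file("arm-none-eabi-ar", ["rc", "libfoo.a"] + object_files)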

View File

@@ -15,17 +15,19 @@
from __future__ import absolute_import
import atexit
import io
import re
import sys
from os import environ, remove, walk
from os.path import basename, isdir, isfile, join, realpath, relpath, sep
from tempfile import mkstemp
import click
from SCons.Action import Action # pylint: disable=import-error
from SCons.Script import ARGUMENTS # pylint: disable=import-error
from platformio import fs, util
from platformio.compat import get_file_contents, glob_escape
from platformio.compat import get_filesystem_encoding, get_locale_encoding, glob_escape
from platformio.managers.core import get_core_package_dir
from platformio.proc import exec_command
@@ -39,13 +41,48 @@ class InoToCPPConverter(object):
([a-z_\d]+\s*) # name of prototype
\([a-z_,\.\*\&\[\]\s\d]*\) # arguments
)\s*(\{|;) # must end with `{` or `;`
""", re.X | re.M | re.I)
""",
re.X | re.M | re.I,
)
DETECTMAIN_RE = re.compile(r"void\s+(setup|loop)\s*\(", re.M | re.I)
PROTOPTRS_TPLRE = r"\([^&\(]*&(%s)[^\)]*\)"
def __init__(self, env):
self.env = env
self._main_ino = None
self._safe_encoding = None
def read_safe_contents(self, path):
error_reported = False
for encoding in (
"utf-8",
None,
get_filesystem_encoding(),
get_locale_encoding(),
"latin-1",
):
try:
with io.open(path, encoding=encoding) as fp:
contents = fp.read()
self._safe_encoding = encoding
return contents
except UnicodeDecodeError:
if not error_reported:
error_reported = True
click.secho(
"Unicode decode error has occurred, please remove invalid "
"(non-ASCII or non-UTF8) characters from %s file or convert it to UTF-8"
% path,
fg="yellow",
err=True,
)
return ""
def write_safe_contents(self, path, contents):
with io.open(
path, "w", encoding=self._safe_encoding, errors="backslashreplace"
) as fp:
return fp.write(contents)
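read_safe_contents() above walks a list of candidate encodings until one decodes the file, then remembers that encoding so write_safe_contents() can round-trip the output. A condensed, standalone sketch of the same fallback idea, using the stdlib equivalents of the compat helpers:

import io
import locale
import sys

def read_with_fallback(path):
    # Try UTF-8, the interpreter default (None), the platform encodings, and
    # finally latin-1, which accepts any byte sequence.
    for encoding in ("utf-8", None, sys.getfilesystemencoding(),
                     locale.getpreferredencoding(), "latin-1"):
        try:
            with io.open(path, encoding=encoding) as fp:
                return fp.read(), encoding
        except UnicodeDecodeError:
            continue
    return "", None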
def is_main_node(self, contents):
return self.DETECTMAIN_RE.search(contents)
@@ -60,10 +97,8 @@ class InoToCPPConverter(object):
assert nodes
lines = []
for node in nodes:
contents = get_file_contents(node.get_path())
_lines = [
'# 1 "%s"' % node.get_path().replace("\\", "/"), contents
]
contents = self.read_safe_contents(node.get_path())
_lines = ['# 1 "%s"' % node.get_path().replace("\\", "/"), contents]
if self.is_main_node(contents):
lines = _lines + lines
self._main_ino = node.get_path()
@@ -78,21 +113,22 @@ class InoToCPPConverter(object):
def process(self, contents):
out_file = self._main_ino + ".cpp"
assert self._gcc_preprocess(contents, out_file)
contents = get_file_contents(out_file)
contents = self.read_safe_contents(out_file)
contents = self._join_multiline_strings(contents)
with open(out_file, "w") as fp:
fp.write(self.append_prototypes(contents))
self.write_safe_contents(out_file, self.append_prototypes(contents))
return out_file
def _gcc_preprocess(self, contents, out_file):
tmp_path = mkstemp()[1]
with open(tmp_path, "w") as fp:
fp.write(contents)
self.write_safe_contents(tmp_path, contents)
self.env.Execute(
self.env.VerboseAction(
'$CXX -o "{0}" -x c++ -fpreprocessed -dD -E "{1}"'.format(
out_file, tmp_path),
"Converting " + basename(out_file[:-4])))
out_file, tmp_path
),
"Converting " + basename(out_file[:-4]),
)
)
atexit.register(_delete_file, tmp_path)
return isfile(out_file)
@@ -114,14 +150,15 @@ class InoToCPPConverter(object):
stropen = True
newlines.append(line[:-1])
continue
elif stropen:
if stropen:
newlines[len(newlines) - 1] += line[:-1]
continue
elif stropen and line.endswith(('",', '";')):
newlines[len(newlines) - 1] += line
stropen = False
newlines.append('#line %d "%s"' %
(linenum, self._main_ino.replace("\\", "/")))
newlines.append(
'#line %d "%s"' % (linenum, self._main_ino.replace("\\", "/"))
)
continue
newlines.append(line)
@@ -141,8 +178,10 @@ class InoToCPPConverter(object):
prototypes = []
reserved_keywords = set(["if", "else", "while"])
for match in self.PROTOTYPE_RE.finditer(contents):
if (set([match.group(2).strip(),
match.group(3).strip()]) & reserved_keywords):
if (
set([match.group(2).strip(), match.group(3).strip()])
& reserved_keywords
):
continue
prototypes.append(match)
return prototypes
@@ -162,11 +201,8 @@ class InoToCPPConverter(object):
prototypes = self._parse_prototypes(contents) or []
# skip already declared prototypes
declared = set(
m.group(1).strip() for m in prototypes if m.group(4) == ";")
prototypes = [
m for m in prototypes if m.group(1).strip() not in declared
]
declared = set(m.group(1).strip() for m in prototypes if m.group(4) == ";")
prototypes = [m for m in prototypes if m.group(1).strip() not in declared]
if not prototypes:
return contents
@@ -175,23 +211,29 @@ class InoToCPPConverter(object):
split_pos = prototypes[0].start()
match_ptrs = re.search(
self.PROTOPTRS_TPLRE % ("|".join(prototype_names)),
contents[:split_pos], re.M)
contents[:split_pos],
re.M,
)
if match_ptrs:
split_pos = contents.rfind("\n", 0, match_ptrs.start()) + 1
result = []
result.append(contents[:split_pos].strip())
result.append("%s;" % ";\n".join([m.group(1) for m in prototypes]))
result.append('#line %d "%s"' % (self._get_total_lines(
contents[:split_pos]), self._main_ino.replace("\\", "/")))
result.append(
'#line %d "%s"'
% (
self._get_total_lines(contents[:split_pos]),
self._main_ino.replace("\\", "/"),
)
)
result.append(contents[split_pos:].strip())
return "\n".join(result)
def ConvertInoToCpp(env):
src_dir = glob_escape(env.subst("$PROJECTSRC_DIR"))
ino_nodes = (env.Glob(join(src_dir, "*.ino")) +
env.Glob(join(src_dir, "*.pde")))
src_dir = glob_escape(env.subst("$PROJECT_SRC_DIR"))
ino_nodes = env.Glob(join(src_dir, "*.ino")) + env.Glob(join(src_dir, "*.pde"))
if not ino_nodes:
return
c = InoToCPPConverter(env)
@@ -214,13 +256,13 @@ def _get_compiler_type(env):
return "gcc"
try:
sysenv = environ.copy()
sysenv['PATH'] = str(env['ENV']['PATH'])
sysenv["PATH"] = str(env["ENV"]["PATH"])
result = exec_command([env.subst("$CC"), "-v"], env=sysenv)
except OSError:
return None
if result['returncode'] != 0:
if result["returncode"] != 0:
return None
output = "".join([result['out'], result['err']]).lower()
output = "".join([result["out"], result["err"]]).lower()
if "clang" in output and "LLVM" in output:
return "clang"
if "gcc" in output:
@@ -233,7 +275,6 @@ def GetCompilerType(env):
def GetActualLDScript(env):
def _lookup_in_ldpath(script):
for d in env.get("LIBPATH", []):
path = join(env.subst(d), script)
@@ -248,7 +289,7 @@ def GetActualLDScript(env):
if f == "-T":
script_in_next = True
continue
elif script_in_next:
if script_in_next:
script_in_next = False
raw_script = f
elif f.startswith("-Wl,-T"):
@@ -264,12 +305,13 @@ def GetActualLDScript(env):
if script:
sys.stderr.write(
"Error: Could not find '%s' LD script in LDPATH '%s'\n" %
(script, env.subst("$LIBPATH")))
"Error: Could not find '%s' LD script in LDPATH '%s'\n"
% (script, env.subst("$LIBPATH"))
)
env.Exit(1)
if not script and "LDSCRIPT_PATH" in env:
path = _lookup_in_ldpath(env['LDSCRIPT_PATH'])
path = _lookup_in_ldpath(env["LDSCRIPT_PATH"])
if path:
return path
@@ -292,35 +334,52 @@ def PioClean(env, clean_dir):
for f in files:
dst = join(root, f)
remove(dst)
print("Removed %s" %
(dst if clean_rel_path.startswith(".") else relpath(dst)))
print(
"Removed %s" % (dst if clean_rel_path.startswith(".") else relpath(dst))
)
print("Done cleaning")
fs.rmtree(clean_dir)
env.Exit(0)
def ProcessDebug(env):
if not env.subst("$PIODEBUGFLAGS"):
env.Replace(PIODEBUGFLAGS=["-Og", "-g3", "-ggdb3"])
env.Append(BUILD_FLAGS=list(env['PIODEBUGFLAGS']) +
["-D__PLATFORMIO_BUILD_DEBUG__"])
unflags = ["-Os"]
for level in [0, 1, 2]:
for flag in ("O", "g", "ggdb"):
unflags.append("-%s%d" % (flag, level))
env.Append(BUILD_UNFLAGS=unflags)
def ConfigureDebugFlags(env):
def _cleanup_debug_flags(scope):
if scope not in env:
return
unflags = ["-Os", "-g"]
for level in [0, 1, 2, 3]:
for flag in ("O", "g", "ggdb"):
unflags.append("-%s%d" % (flag, level))
env[scope] = [f for f in env.get(scope, []) if f not in unflags]
env.Append(CPPDEFINES=["__PLATFORMIO_BUILD_DEBUG__"])
for scope in ("ASFLAGS", "CCFLAGS", "LINKFLAGS"):
_cleanup_debug_flags(scope)
debug_flags = env.ParseFlags(env.GetProjectOption("debug_build_flags"))
env.MergeFlags(debug_flags)
optimization_flags = [
f for f in debug_flags.get("CCFLAGS", []) if f.startswith(("-O", "-g"))
]
if optimization_flags:
env.AppendUnique(ASFLAGS=optimization_flags, LINKFLAGS=optimization_flags)
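For reference, the unflags list built by _cleanup_debug_flags() above expands to a fixed set of optimization and debug levels, which are stripped from each scope before the debug build flags are merged back in:

unflags = ["-Os", "-g"]
for level in [0, 1, 2, 3]:
    for flag in ("O", "g", "ggdb"):
        unflags.append("-%s%d" % (flag, level))

print(unflags)
# ['-Os', '-g',
#  '-O0', '-g0', '-ggdb0', '-O1', '-g1', '-ggdb1',
#  '-O2', '-g2', '-ggdb2', '-O3', '-g3', '-ggdb3']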
def ProcessTest(env):
env.Append(CPPDEFINES=["UNIT_TEST", "UNITY_INCLUDE_CONFIG_H"],
CPPPATH=[join("$BUILD_DIR", "UnityTestLib")])
unitylib = env.BuildLibrary(join("$BUILD_DIR", "UnityTestLib"),
get_core_package_dir("tool-unity"))
def ConfigureTestTarget(env):
env.Append(
CPPDEFINES=["UNIT_TEST", "UNITY_INCLUDE_CONFIG_H"],
CPPPATH=[join("$BUILD_DIR", "UnityTestLib")],
)
unitylib = env.BuildLibrary(
join("$BUILD_DIR", "UnityTestLib"), get_core_package_dir("tool-unity")
)
env.Prepend(LIBS=[unitylib])
src_filter = ["+<*.cpp>", "+<*.c>"]
if "PIOTEST_RUNNING_NAME" in env:
src_filter.append("+<%s%s>" % (env['PIOTEST_RUNNING_NAME'], sep))
src_filter.append("+<%s%s>" % (env["PIOTEST_RUNNING_NAME"], sep))
env.Replace(PIOTEST_SRC_FILTER=src_filter)
@@ -330,7 +389,7 @@ def GetExtraScripts(env, scope):
if scope == "post" and ":" not in item:
items.append(item)
elif item.startswith("%s:" % scope):
items.append(item[len(scope) + 1:])
items.append(item[len(scope) + 1 :])
if not items:
return items
with fs.cd(env.subst("$PROJECT_DIR")):
@@ -347,7 +406,7 @@ def generate(env):
env.AddMethod(GetActualLDScript)
env.AddMethod(VerboseAction)
env.AddMethod(PioClean)
env.AddMethod(ProcessDebug)
env.AddMethod(ProcessTest)
env.AddMethod(ConfigureDebugFlags)
env.AddMethod(ConfigureTestTarget)
env.AddMethod(GetExtraScripts)
return env

View File

@@ -33,28 +33,28 @@ def PioPlatform(env):
variables = env.GetProjectOptions(as_dict=True)
if "framework" in variables:
# support PIO Core 3.0 dev/platforms
variables['pioframework'] = variables['framework']
p = PlatformFactory.newPlatform(env['PLATFORM_MANIFEST'])
variables["pioframework"] = variables["framework"]
p = PlatformFactory.newPlatform(env["PLATFORM_MANIFEST"])
p.configure_default_packages(variables, COMMAND_LINE_TARGETS)
return p
def BoardConfig(env, board=None):
p = env.PioPlatform()
try:
board = board or env.get("BOARD")
assert board, "BoardConfig: Board is not defined"
config = p.board_config(board)
except (AssertionError, exception.UnknownBoard) as e:
sys.stderr.write("Error: %s\n" % str(e))
env.Exit(1)
return config
with fs.cd(env.subst("$PROJECT_DIR")):
try:
p = env.PioPlatform()
board = board or env.get("BOARD")
assert board, "BoardConfig: Board is not defined"
return p.board_config(board)
except (AssertionError, exception.UnknownBoard) as e:
sys.stderr.write("Error: %s\n" % str(e))
env.Exit(1)
def GetFrameworkScript(env, framework):
p = env.PioPlatform()
assert p.frameworks and framework in p.frameworks
script_path = env.subst(p.frameworks[framework]['script'])
script_path = env.subst(p.frameworks[framework]["script"])
if not isfile(script_path):
script_path = join(p.get_dir(), script_path)
return script_path
@@ -65,7 +65,7 @@ def LoadPioPlatform(env):
installed_packages = p.get_installed_packages()
# Ensure real platform name
env['PIOPLATFORM'] = p.name
env["PIOPLATFORM"] = p.name
# Add toolchains and uploaders to $PATH and $*_LIBRARY_PATH
systype = util.get_systype()
@@ -75,14 +75,13 @@ def LoadPioPlatform(env):
continue
pkg_dir = p.get_package_dir(name)
env.PrependENVPath(
"PATH",
join(pkg_dir, "bin") if isdir(join(pkg_dir, "bin")) else pkg_dir)
if (not WINDOWS and isdir(join(pkg_dir, "lib"))
and type_ != "toolchain"):
"PATH", join(pkg_dir, "bin") if isdir(join(pkg_dir, "bin")) else pkg_dir
)
if not WINDOWS and isdir(join(pkg_dir, "lib")) and type_ != "toolchain":
env.PrependENVPath(
"DYLD_LIBRARY_PATH"
if "darwin" in systype else "LD_LIBRARY_PATH",
join(pkg_dir, "lib"))
"DYLD_LIBRARY_PATH" if "darwin" in systype else "LD_LIBRARY_PATH",
join(pkg_dir, "lib"),
)
# Platform specific LD Scripts
if isdir(join(p.get_dir(), "ldscripts")):
@@ -94,16 +93,27 @@ def LoadPioPlatform(env):
# update board manifest with overridden data from INI config
board_config = env.BoardConfig()
for option, value in env.GetProjectOptions():
if option.startswith("board_"):
board_config.update(option.lower()[6:], value)
if not option.startswith("board_"):
continue
option = option.lower()[6:]
try:
if isinstance(board_config.get(option), bool):
value = str(value).lower() in ("1", "yes", "true")
elif isinstance(board_config.get(option), int):
value = int(value)
except KeyError:
pass
board_config.update(option, value)
# load default variables from board config
for option_meta in ProjectOptions.values():
if not option_meta.buildenvvar or option_meta.buildenvvar in env:
continue
data_path = (option_meta.name[6:]
if option_meta.name.startswith("board_") else
option_meta.name.replace("_", "."))
data_path = (
option_meta.name[6:]
if option_meta.name.startswith("board_")
else option_meta.name.replace("_", ".")
)
try:
env[option_meta.buildenvvar] = board_config.get(data_path)
except KeyError:
@@ -118,22 +128,25 @@ def PrintConfiguration(env): # pylint: disable=too-many-statements
board_config = env.BoardConfig() if "BOARD" in env else None
def _get_configuration_data():
return None if not board_config else [
"CONFIGURATION:",
"https://docs.platformio.org/page/boards/%s/%s.html" %
(platform.name, board_config.id)
]
return (
None
if not board_config
else [
"CONFIGURATION:",
"https://docs.platformio.org/page/boards/%s/%s.html"
% (platform.name, board_config.id),
]
)
def _get_plaform_data():
data = ["PLATFORM: %s %s" % (platform.title, platform.version)]
src_manifest_path = platform.pm.get_src_manifest_path(
platform.get_dir())
src_manifest_path = platform.pm.get_src_manifest_path(platform.get_dir())
if src_manifest_path:
src_manifest = fs.load_json(src_manifest_path)
if "version" in src_manifest:
data.append("#" + src_manifest['version'])
data.append("#" + src_manifest["version"])
if int(ARGUMENTS.get("PIOVERBOSE", 0)):
data.append("(%s)" % src_manifest['url'])
data.append("(%s)" % src_manifest["url"])
if board_config:
data.extend([">", board_config.get("name")])
return data
@@ -151,19 +164,22 @@ def PrintConfiguration(env): # pylint: disable=too-many-statements
return data
ram = board_config.get("upload", {}).get("maximum_ram_size")
flash = board_config.get("upload", {}).get("maximum_size")
data.append("%s RAM, %s Flash" %
(fs.format_filesize(ram), fs.format_filesize(flash)))
data.append(
"%s RAM, %s Flash" % (fs.format_filesize(ram), fs.format_filesize(flash))
)
return data
def _get_debug_data():
debug_tools = board_config.get(
"debug", {}).get("tools") if board_config else None
debug_tools = (
board_config.get("debug", {}).get("tools") if board_config else None
)
if not debug_tools:
return None
data = [
"DEBUG:", "Current",
"(%s)" % board_config.get_debug_tool_name(
env.GetProjectOption("debug_tool"))
"DEBUG:",
"Current",
"(%s)"
% board_config.get_debug_tool_name(env.GetProjectOption("debug_tool")),
]
onboard = []
external = []
@@ -187,21 +203,27 @@ def PrintConfiguration(env): # pylint: disable=too-many-statements
if not pkg_dir:
continue
manifest = platform.pm.load_manifest(pkg_dir)
original_version = util.get_original_version(manifest['version'])
info = "%s %s" % (manifest['name'], manifest['version'])
original_version = util.get_original_version(manifest["version"])
info = "%s %s" % (manifest["name"], manifest["version"])
extra = []
if original_version:
extra.append(original_version)
if "__src_url" in manifest and int(ARGUMENTS.get("PIOVERBOSE", 0)):
extra.append(manifest['__src_url'])
extra.append(manifest["__src_url"])
if extra:
info += " (%s)" % ", ".join(extra)
data.append(info)
return ["PACKAGES:", ", ".join(data)]
if not data:
return None
return ["PACKAGES:"] + ["\n - %s" % d for d in sorted(data)]
for data in (_get_configuration_data(), _get_plaform_data(),
_get_hardware_data(), _get_debug_data(),
_get_packages_data()):
for data in (
_get_configuration_data(),
_get_plaform_data(),
_get_hardware_data(),
_get_debug_data(),
_get_packages_data(),
):
if data and len(data) > 1:
print(" ".join(data))

View File

@@ -14,25 +14,29 @@
from __future__ import absolute_import
from platformio.project.config import ProjectConfig, ProjectOptions
from platformio.project.config import MISSING, ProjectConfig, ProjectOptions
def GetProjectConfig(env):
return ProjectConfig.get_instance(env['PROJECT_CONFIG'])
return ProjectConfig.get_instance(env["PROJECT_CONFIG"])
def GetProjectOptions(env, as_dict=False):
return env.GetProjectConfig().items(env=env['PIOENV'], as_dict=as_dict)
return env.GetProjectConfig().items(env=env["PIOENV"], as_dict=as_dict)
def GetProjectOption(env, option, default=None):
return env.GetProjectConfig().get("env:" + env['PIOENV'], option, default)
def GetProjectOption(env, option, default=MISSING):
return env.GetProjectConfig().get("env:" + env["PIOENV"], option, default)
def LoadProjectOptions(env):
for option, value in env.GetProjectOptions():
option_meta = ProjectOptions.get("env." + option)
if not option_meta or not option_meta.buildenvvar:
if (
not option_meta
or not option_meta.buildenvvar
or option_meta.buildenvvar in env
):
continue
env[option_meta.buildenvvar] = value
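GetProjectOption() now defaults to the MISSING sentinel imported above, so a caller can tell "option not configured" apart from an option explicitly set to a falsy value. A minimal, self-contained illustration of that sentinel pattern (not the ProjectConfig implementation itself):

MISSING = object()  # unique marker; never equal to a real option value

def get_option(options, name, default=MISSING):
    if name in options:
        return options[name]
    if default is MISSING:
        raise KeyError("undefined option '%s'" % name)
    return default

opts = {"lib_archive": False}
print(get_option(opts, "lib_archive"))            # False -- a real, falsy value
print(get_option(opts, "lib_ldf_mode", "chain"))  # falls back to "chain"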

View File

@@ -0,0 +1,254 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# pylint: disable=too-many-locals
from __future__ import absolute_import
import sys
from os import environ, makedirs, remove
from os.path import isdir, join, splitdrive
from elftools.elf.descriptions import describe_sh_flags
from elftools.elf.elffile import ELFFile
from platformio.compat import dump_json_to_unicode
from platformio.proc import exec_command
from platformio.util import get_systype
def _run_tool(cmd, env, tool_args):
sysenv = environ.copy()
sysenv["PATH"] = str(env["ENV"]["PATH"])
build_dir = env.subst("$BUILD_DIR")
if not isdir(build_dir):
makedirs(build_dir)
tmp_file = join(build_dir, "size-data-longcmd.txt")
with open(tmp_file, "w") as fp:
fp.write("\n".join(tool_args))
cmd.append("@" + tmp_file)
result = exec_command(cmd, env=sysenv)
remove(tmp_file)
return result
def _get_symbol_locations(env, elf_path, addrs):
if not addrs:
return {}
cmd = [env.subst("$CC").replace("-gcc", "-addr2line"), "-e", elf_path]
result = _run_tool(cmd, env, addrs)
locations = [line for line in result["out"].split("\n") if line]
assert len(addrs) == len(locations)
return dict(zip(addrs, [l.strip() for l in locations]))
def _get_demangled_names(env, mangled_names):
if not mangled_names:
return {}
result = _run_tool(
[env.subst("$CC").replace("-gcc", "-c++filt")], env, mangled_names
)
demangled_names = [line for line in result["out"].split("\n") if line]
assert len(mangled_names) == len(demangled_names)
return dict(
zip(
mangled_names,
[dn.strip().replace("::__FUNCTION__", "") for dn in demangled_names],
)
)
def _determine_section(sections, symbol_addr):
for section, info in sections.items():
if not _is_flash_section(info) and not _is_ram_section(info):
continue
if symbol_addr in range(info["start_addr"], info["start_addr"] + info["size"]):
return section
return "unknown"
def _is_ram_section(section):
return (
section.get("type", "") in ("SHT_NOBITS", "SHT_PROGBITS")
and section.get("flags", "") == "WA"
)
def _is_flash_section(section):
return section.get("type", "") == "SHT_PROGBITS" and "A" in section.get("flags", "")
def _is_valid_symbol(symbol_name, symbol_type, symbol_address):
return symbol_name and symbol_address != 0 and symbol_type != "STT_NOTYPE"
def _collect_sections_info(elffile):
sections = {}
for section in elffile.iter_sections():
if section.is_null() or section.name.startswith(".debug"):
continue
section_type = section["sh_type"]
section_flags = describe_sh_flags(section["sh_flags"])
section_size = section.data_size
sections[section.name] = {
"size": section_size,
"start_addr": section["sh_addr"],
"type": section_type,
"flags": section_flags,
}
return sections
def _collect_symbols_info(env, elffile, elf_path, sections):
symbols = []
symbol_section = elffile.get_section_by_name(".symtab")
if symbol_section.is_null():
sys.stderr.write("Couldn't find symbol table. Is ELF file stripped?")
env.Exit(1)
sysenv = environ.copy()
sysenv["PATH"] = str(env["ENV"]["PATH"])
symbol_addrs = []
mangled_names = []
for s in symbol_section.iter_symbols():
symbol_info = s.entry["st_info"]
symbol_addr = s["st_value"]
symbol_size = s["st_size"]
symbol_type = symbol_info["type"]
if not _is_valid_symbol(s.name, symbol_type, symbol_addr):
continue
symbol = {
"addr": symbol_addr,
"bind": symbol_info["bind"],
"name": s.name,
"type": symbol_type,
"size": symbol_size,
"section": _determine_section(sections, symbol_addr),
}
if s.name.startswith("_Z"):
mangled_names.append(s.name)
symbol_addrs.append(hex(symbol_addr))
symbols.append(symbol)
symbol_locations = _get_symbol_locations(env, elf_path, symbol_addrs)
demangled_names = _get_demangled_names(env, mangled_names)
for symbol in symbols:
if symbol["name"].startswith("_Z"):
symbol["demangled_name"] = demangled_names.get(symbol["name"])
location = symbol_locations.get(hex(symbol["addr"]))
if not location or "?" in location:
continue
if "windows" in get_systype():
drive, tail = splitdrive(location)
location = join(drive.upper(), tail)
symbol["file"] = location
symbol["line"] = 0
if ":" in location:
file_, line = location.rsplit(":", 1)
if line.isdigit():
symbol["file"] = file_
symbol["line"] = int(line)
return symbols
def _calculate_firmware_size(sections):
flash_size = ram_size = 0
for section_info in sections.values():
if _is_flash_section(section_info):
flash_size += section_info.get("size", 0)
if _is_ram_section(section_info):
ram_size += section_info.get("size", 0)
return ram_size, flash_size
def DumpSizeData(_, target, source, env): # pylint: disable=unused-argument
data = {"device": {}, "memory": {}, "version": 1}
board = env.BoardConfig()
if board:
data["device"] = {
"mcu": board.get("build.mcu", ""),
"cpu": board.get("build.cpu", ""),
"frequency": board.get("build.f_cpu"),
"flash": int(board.get("upload.maximum_size", 0)),
"ram": int(board.get("upload.maximum_ram_size", 0)),
}
if data["device"]["frequency"] and data["device"]["frequency"].endswith("L"):
data["device"]["frequency"] = int(data["device"]["frequency"][0:-1])
elf_path = env.subst("$PIOMAINPROG")
with open(elf_path, "rb") as fp:
elffile = ELFFile(fp)
if not elffile.has_dwarf_info():
sys.stderr.write("Elf file doesn't contain DWARF information")
env.Exit(1)
sections = _collect_sections_info(elffile)
firmware_ram, firmware_flash = _calculate_firmware_size(sections)
data["memory"]["total"] = {
"ram_size": firmware_ram,
"flash_size": firmware_flash,
"sections": sections,
}
files = dict()
for symbol in _collect_symbols_info(env, elffile, elf_path, sections):
file_path = symbol.get("file") or "unknown"
if not files.get(file_path, {}):
files[file_path] = {"symbols": [], "ram_size": 0, "flash_size": 0}
symbol_size = symbol.get("size", 0)
section = sections.get(symbol.get("section", ""), {})
if _is_ram_section(section):
files[file_path]["ram_size"] += symbol_size
if _is_flash_section(section):
files[file_path]["flash_size"] += symbol_size
files[file_path]["symbols"].append(symbol)
data["memory"]["files"] = list()
for k, v in files.items():
file_data = {"path": k}
file_data.update(v)
data["memory"]["files"].append(file_data)
with open(join(env.subst("$BUILD_DIR"), "sizedata.json"), "w") as fp:
fp.write(dump_json_to_unicode(data))
def exists(_):
return True
def generate(env):
env.AddMethod(DumpSizeData)
return env
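The sizedata tool above classifies ELF sections with pyelftools and sums them into RAM and flash totals. A trimmed, standalone sketch of that classification, reusing the same pyelftools calls, that can be pointed at any firmware ELF:

import sys

from elftools.elf.descriptions import describe_sh_flags
from elftools.elf.elffile import ELFFile

def firmware_size(elf_path):
    ram = flash = 0
    with open(elf_path, "rb") as fp:
        for section in ELFFile(fp).iter_sections():
            if section.is_null() or section.name.startswith(".debug"):
                continue
            sh_type = section["sh_type"]
            flags = describe_sh_flags(section["sh_flags"])
            # allocatable program data counts towards flash (.text, .rodata, .data)
            if sh_type == "SHT_PROGBITS" and "A" in flags:
                flash += section.data_size
            # writable allocatable sections count towards RAM (.data, .bss)
            if sh_type in ("SHT_NOBITS", "SHT_PROGBITS") and flags == "WA":
                ram += section.data_size
    return ram, flash

if __name__ == "__main__":
    print("RAM: %d bytes, Flash: %d bytes" % firmware_size(sys.argv[1]))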

View File

@@ -60,9 +60,9 @@ def WaitForNewSerialPort(env, before):
prev_port = env.subst("$UPLOAD_PORT")
new_port = None
elapsed = 0
before = [p['port'] for p in before]
before = [p["port"] for p in before]
while elapsed < 5 and new_port is None:
now = [p['port'] for p in util.get_serial_ports()]
now = [p["port"] for p in util.get_serial_ports()]
for p in now:
if p not in before:
new_port = p
@@ -84,10 +84,12 @@ def WaitForNewSerialPort(env, before):
sleep(1)
if not new_port:
sys.stderr.write("Error: Couldn't find a board on the selected port. "
"Check that you have the correct port selected. "
"If it is correct, try pressing the board's reset "
"button after initiating the upload.\n")
sys.stderr.write(
"Error: Couldn't find a board on the selected port. "
"Check that you have the correct port selected. "
"If it is correct, try pressing the board's reset "
"button after initiating the upload.\n"
)
env.Exit(1)
return new_port
@@ -99,8 +101,8 @@ def AutodetectUploadPort(*args, **kwargs):
def _get_pattern():
if "UPLOAD_PORT" not in env:
return None
if set(["*", "?", "[", "]"]) & set(env['UPLOAD_PORT']):
return env['UPLOAD_PORT']
if set(["*", "?", "[", "]"]) & set(env["UPLOAD_PORT"]):
return env["UPLOAD_PORT"]
return None
def _is_match_pattern(port):
@@ -112,17 +114,13 @@ def AutodetectUploadPort(*args, **kwargs):
def _look_for_mbed_disk():
msdlabels = ("mbed", "nucleo", "frdm", "microbit")
for item in util.get_logical_devices():
if item['path'].startswith("/net") or not _is_match_pattern(
item['path']):
if item["path"].startswith("/net") or not _is_match_pattern(item["path"]):
continue
mbed_pages = [
join(item['path'], n) for n in ("mbed.htm", "mbed.html")
]
mbed_pages = [join(item["path"], n) for n in ("mbed.htm", "mbed.html")]
if any(isfile(p) for p in mbed_pages):
return item['path']
if item['name'] \
and any(l in item['name'].lower() for l in msdlabels):
return item['path']
return item["path"]
if item["name"] and any(l in item["name"].lower() for l in msdlabels):
return item["path"]
return None
def _look_for_serial_port():
@@ -132,17 +130,17 @@ def AutodetectUploadPort(*args, **kwargs):
if "BOARD" in env and "build.hwids" in env.BoardConfig():
board_hwids = env.BoardConfig().get("build.hwids")
for item in util.get_serial_ports(filter_hwid=True):
if not _is_match_pattern(item['port']):
if not _is_match_pattern(item["port"]):
continue
port = item['port']
port = item["port"]
if upload_protocol.startswith("blackmagic"):
if WINDOWS and port.startswith("COM") and len(port) > 4:
port = "\\\\.\\%s" % port
if "GDB" in item['description']:
if "GDB" in item["description"]:
return port
for hwid in board_hwids:
hwid_str = ("%s:%s" % (hwid[0], hwid[1])).replace("0x", "")
if hwid_str in item['hwid']:
if hwid_str in item["hwid"]:
return port
return port
@@ -150,9 +148,9 @@ def AutodetectUploadPort(*args, **kwargs):
print(env.subst("Use manually specified: $UPLOAD_PORT"))
return
if (env.subst("$UPLOAD_PROTOCOL") == "mbed"
or ("mbed" in env.subst("$PIOFRAMEWORK")
and not env.subst("$UPLOAD_PROTOCOL"))):
if env.subst("$UPLOAD_PROTOCOL") == "mbed" or (
"mbed" in env.subst("$PIOFRAMEWORK") and not env.subst("$UPLOAD_PROTOCOL")
):
env.Replace(UPLOAD_PORT=_look_for_mbed_disk())
else:
try:
@@ -168,7 +166,8 @@ def AutodetectUploadPort(*args, **kwargs):
"Error: Please specify `upload_port` for environment or use "
"global `--upload-port` option.\n"
"For some development platforms it can be a USB flash "
"drive (i.e. /media/<user>/<device name>)\n")
"drive (i.e. /media/<user>/<device name>)\n"
)
env.Exit(1)
@@ -179,16 +178,17 @@ def UploadToDisk(_, target, source, env):
fpath = join(env.subst("$BUILD_DIR"), "%s.%s" % (progname, ext))
if not isfile(fpath):
continue
copyfile(fpath,
join(env.subst("$UPLOAD_PORT"), "%s.%s" % (progname, ext)))
print("Firmware has been successfully uploaded.\n"
"(Some boards may require manual hard reset)")
copyfile(fpath, join(env.subst("$UPLOAD_PORT"), "%s.%s" % (progname, ext)))
print(
"Firmware has been successfully uploaded.\n"
"(Some boards may require manual hard reset)"
)
def CheckUploadSize(_, target, source, env):
check_conditions = [
env.get("BOARD"),
env.get("SIZETOOL") or env.get("SIZECHECKCMD")
env.get("SIZETOOL") or env.get("SIZECHECKCMD"),
]
if not all(check_conditions):
return
@@ -198,9 +198,11 @@ def CheckUploadSize(_, target, source, env):
return
def _configure_defaults():
env.Replace(SIZECHECKCMD="$SIZETOOL -B -d $SOURCES",
SIZEPROGREGEXP=r"^(\d+)\s+(\d+)\s+\d+\s",
SIZEDATAREGEXP=r"^\d+\s+(\d+)\s+(\d+)\s+\d+")
env.Replace(
SIZECHECKCMD="$SIZETOOL -B -d $SOURCES",
SIZEPROGREGEXP=r"^(\d+)\s+(\d+)\s+\d+\s",
SIZEDATAREGEXP=r"^\d+\s+(\d+)\s+(\d+)\s+\d+",
)
def _get_size_output():
cmd = env.get("SIZECHECKCMD")
@@ -210,11 +212,11 @@ def CheckUploadSize(_, target, source, env):
cmd = cmd.split()
cmd = [arg.replace("$SOURCES", str(source[0])) for arg in cmd if arg]
sysenv = environ.copy()
sysenv['PATH'] = str(env['ENV']['PATH'])
sysenv["PATH"] = str(env["ENV"]["PATH"])
result = exec_command(env.subst(cmd), env=sysenv)
if result['returncode'] != 0:
if result["returncode"] != 0:
return None
return result['out'].strip()
return result["out"].strip()
def _calculate_size(output, pattern):
if not output or not pattern:
@@ -238,7 +240,8 @@ def CheckUploadSize(_, target, source, env):
if used_blocks > blocks_per_progress:
used_blocks = blocks_per_progress
return "[{:{}}] {: 6.1%} (used {:d} bytes from {:d} bytes)".format(
"=" * used_blocks, blocks_per_progress, percent_raw, value, total)
"=" * used_blocks, blocks_per_progress, percent_raw, value, total
)
if not env.get("SIZECHECKCMD") and not env.get("SIZEPROGREGEXP"):
_configure_defaults()
@@ -246,12 +249,11 @@ def CheckUploadSize(_, target, source, env):
program_size = _calculate_size(output, env.get("SIZEPROGREGEXP"))
data_size = _calculate_size(output, env.get("SIZEDATAREGEXP"))
print("Memory Usage -> http://bit.ly/pio-memory-usage")
print('Advanced Memory Usage is available via "PlatformIO Home > Project Inspect"')
if data_max_size and data_size > -1:
print("DATA: %s" % _format_availale_bytes(data_size, data_max_size))
print("RAM: %s" % _format_availale_bytes(data_size, data_max_size))
if program_size > -1:
print("PROGRAM: %s" %
_format_availale_bytes(program_size, program_max_size))
print("Flash: %s" % _format_availale_bytes(program_size, program_max_size))
if int(ARGUMENTS.get("PIOVERBOSE", 0)):
print(output)
@@ -262,9 +264,10 @@ def CheckUploadSize(_, target, source, env):
# "than maximum allowed (%s bytes)\n" % (data_size, data_max_size))
# env.Exit(1)
if program_size > program_max_size:
sys.stderr.write("Error: The program size (%d bytes) is greater "
"than maximum allowed (%s bytes)\n" %
(program_size, program_max_size))
sys.stderr.write(
"Error: The program size (%d bytes) is greater "
"than maximum allowed (%s bytes)\n" % (program_size, program_max_size)
)
env.Exit(1)
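A worked example of the default Berkeley "size -B -d" parsing configured by _configure_defaults() above, assuming (as the default patterns suggest) that the captured groups of every matching line are summed; the sample output is illustrative:

import re

size_output = """\
   text    data     bss     dec     hex filename
  26352    1548     736   28636    6fdc firmware.elf"""

SIZEPROGREGEXP = r"^(\d+)\s+(\d+)\s+\d+\s"      # text + data -> program (flash)
SIZEDATAREGEXP = r"^\d+\s+(\d+)\s+(\d+)\s+\d+"  # data + bss  -> data (RAM)

def summed(pattern, output):
    total = 0
    for line in output.splitlines():
        match = re.search(pattern, line.strip())
        if match:
            total += sum(int(g) for g in match.groups())
    return total

print("PROGRAM:", summed(SIZEPROGREGEXP, size_output))  # 26352 + 1548 = 27900
print("DATA:   ", summed(SIZEDATAREGEXP, size_output))  # 1548  + 736  = 2284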
@@ -272,8 +275,7 @@ def PrintUploadInfo(env):
configured = env.subst("$UPLOAD_PROTOCOL")
available = [configured] if configured else []
if "BOARD" in env:
available.extend(env.BoardConfig().get("upload",
{}).get("protocols", []))
available.extend(env.BoardConfig().get("upload", {}).get("protocols", []))
if available:
print("AVAILABLE: %s" % ", ".join(sorted(set(available))))
if configured:

View File

@@ -14,11 +14,12 @@
from __future__ import absolute_import
import fnmatch
import os
import sys
from os.path import basename, dirname, isdir, join, realpath
from SCons import Builder, Util # pylint: disable=import-error
from SCons.Node import FS # pylint: disable=import-error
from SCons.Script import COMMAND_LINE_TARGETS # pylint: disable=import-error
from SCons.Script import AlwaysBuild # pylint: disable=import-error
from SCons.Script import DefaultEnvironment # pylint: disable=import-error
@@ -30,8 +31,10 @@ from platformio.compat import string_types
from platformio.util import pioversion_to_intstr
SRC_HEADER_EXT = ["h", "hpp"]
SRC_C_EXT = ["c", "cc", "cpp"]
SRC_BUILD_EXT = SRC_C_EXT + ["S", "spp", "SPP", "sx", "s", "asm", "ASM"]
SRC_ASM_EXT = ["S", "spp", "SPP", "sx", "s", "asm", "ASM"]
SRC_C_EXT = ["c"]
SRC_CXX_EXT = ["cc", "cpp", "cxx", "c++"]
SRC_BUILD_EXT = SRC_C_EXT + SRC_CXX_EXT + SRC_ASM_EXT
SRC_FILTER_DEFAULT = ["+<*>", "-<.git%s>" % os.sep, "-<.svn%s>" % os.sep]
@@ -43,49 +46,58 @@ def scons_patched_match_splitext(path, suffixes=None):
return tokens
def _build_project_deps(env):
project_lib_builder = env.ConfigureProjectLibBuilder()
# prepend project libs to the beginning of list
env.Prepend(LIBS=project_lib_builder.build())
# prepend extra linker related options from libs
env.PrependUnique(
**{
key: project_lib_builder.env.get(key)
for key in ("LIBS", "LIBPATH", "LINKFLAGS")
if project_lib_builder.env.get(key)
})
projenv = env.Clone()
# CPPPATH from dependencies
projenv.PrependUnique(CPPPATH=project_lib_builder.env.get("CPPPATH"))
# extra build flags from `platformio.ini`
projenv.ProcessFlags(env.get("SRC_BUILD_FLAGS"))
is_test = "__test" in COMMAND_LINE_TARGETS
if is_test:
projenv.BuildSources("$BUILDTEST_DIR", "$PROJECTTEST_DIR",
"$PIOTEST_SRC_FILTER")
if not is_test or env.GetProjectOption("test_build_project_src", False):
projenv.BuildSources("$BUILDSRC_DIR", "$PROJECTSRC_DIR",
env.get("SRC_FILTER"))
if not env.get("PIOBUILDFILES") and not COMMAND_LINE_TARGETS:
sys.stderr.write(
"Error: Nothing to build. Please put your source code files "
"to '%s' folder\n" % env.subst("$PROJECTSRC_DIR"))
env.Exit(1)
Export("projenv")
def GetBuildType(env):
return (
"debug"
if (
set(["debug", "sizedata"]) & set(COMMAND_LINE_TARGETS)
or env.GetProjectOption("build_type") == "debug"
)
else "release"
)
def BuildProgram(env):
env.ProcessProgramDeps()
env.ProcessProjectDeps()
# append into the beginning a main LD script
if env.get("LDSCRIPT_PATH") and not any("-Wl,-T" in f for f in env["LINKFLAGS"]):
env.Prepend(LINKFLAGS=["-T", env.subst("$LDSCRIPT_PATH")])
# enable "cyclic reference" for linker
if env.get("LIBS") and env.GetCompilerType() == "gcc":
env.Prepend(_LIBFLAGS="-Wl,--start-group ")
env.Append(_LIBFLAGS=" -Wl,--end-group")
program = env.Program(
os.path.join("$BUILD_DIR", env.subst("$PROGNAME")), env["PIOBUILDFILES"]
)
env.Replace(PIOMAINPROG=program)
AlwaysBuild(
env.Alias(
"checkprogsize",
program,
env.VerboseAction(env.CheckUploadSize, "Checking size $PIOMAINPROG"),
)
)
print("Building in %s mode" % env.GetBuildType())
return program
def ProcessProgramDeps(env):
def _append_pio_macros():
env.AppendUnique(CPPDEFINES=[(
"PLATFORMIO",
int("{0:02d}{1:02d}{2:02d}".format(*pioversion_to_intstr())))])
env.AppendUnique(
CPPDEFINES=[
(
"PLATFORMIO",
int("{0:02d}{1:02d}{2:02d}".format(*pioversion_to_intstr())),
)
]
)
_append_pio_macros()
@@ -95,10 +107,6 @@ def BuildProgram(env):
if not Util.case_sensitive_suffixes(".s", ".S"):
env.Replace(AS="$CC", ASCOM="$ASPPCOM")
if ("debug" in COMMAND_LINE_TARGETS
or env.GetProjectOption("build_type") == "debug"):
env.ProcessDebug()
# process extra flags from board
if "BOARD" in env and "build.extra_flags" in env.BoardConfig():
env.ProcessFlags(env.BoardConfig().get("build.extra_flags"))
@@ -109,39 +117,55 @@ def BuildProgram(env):
# process framework scripts
env.BuildFrameworks(env.get("PIOFRAMEWORK"))
# restore PIO macros if it was deleted by framework
_append_pio_macros()
if env.GetBuildType() == "debug":
env.ConfigureDebugFlags()
# remove specified flags
env.ProcessUnFlags(env.get("BUILD_UNFLAGS"))
if "__test" in COMMAND_LINE_TARGETS:
env.ProcessTest()
env.ConfigureTestTarget()
# build project with dependencies
_build_project_deps(env)
# append into the beginning a main LD script
if (env.get("LDSCRIPT_PATH")
and not any("-Wl,-T" in f for f in env['LINKFLAGS'])):
env.Prepend(LINKFLAGS=["-T", "$LDSCRIPT_PATH"])
def ProcessProjectDeps(env):
project_lib_builder = env.ConfigureProjectLibBuilder()
# enable "cyclic reference" for linker
if env.get("LIBS") and env.GetCompilerType() == "gcc":
env.Prepend(_LIBFLAGS="-Wl,--start-group ")
env.Append(_LIBFLAGS=" -Wl,--end-group")
# prepend project libs to the beginning of list
env.Prepend(LIBS=project_lib_builder.build())
# prepend extra linker related options from libs
env.PrependUnique(
**{
key: project_lib_builder.env.get(key)
for key in ("LIBS", "LIBPATH", "LINKFLAGS")
if project_lib_builder.env.get(key)
}
)
program = env.Program(join("$BUILD_DIR", env.subst("$PROGNAME")),
env['PIOBUILDFILES'])
env.Replace(PIOMAINPROG=program)
projenv = env.Clone()
AlwaysBuild(
env.Alias(
"checkprogsize", program,
env.VerboseAction(env.CheckUploadSize,
"Checking size $PIOMAINPROG")))
# CPPPATH from dependencies
projenv.PrependUnique(CPPPATH=project_lib_builder.env.get("CPPPATH"))
# extra build flags from `platformio.ini`
projenv.ProcessFlags(env.get("SRC_BUILD_FLAGS"))
return program
is_test = "__test" in COMMAND_LINE_TARGETS
if is_test:
projenv.BuildSources(
"$BUILD_TEST_DIR", "$PROJECT_TEST_DIR", "$PIOTEST_SRC_FILTER"
)
if not is_test or env.GetProjectOption("test_build_project_src"):
projenv.BuildSources(
"$BUILD_SRC_DIR", "$PROJECT_SRC_DIR", env.get("SRC_FILTER")
)
if not env.get("PIOBUILDFILES") and not COMMAND_LINE_TARGETS:
sys.stderr.write(
"Error: Nothing to build. Please put your source code files "
"to '%s' folder\n" % env.subst("$PROJECT_SRC_DIR")
)
env.Exit(1)
Export("projenv")
def ParseFlagsExtended(env, flags): # pylint: disable=too-many-branches
@@ -155,30 +179,30 @@ def ParseFlagsExtended(env, flags): # pylint: disable=too-many-branches
result[key].extend(value)
cppdefines = []
for item in result['CPPDEFINES']:
for item in result["CPPDEFINES"]:
if not Util.is_Sequence(item):
cppdefines.append(item)
continue
name, value = item[:2]
if '\"' in value:
value = value.replace('\"', '\\\"')
if '"' in value:
value = value.replace('"', '\\"')
elif value.isdigit():
value = int(value)
elif value.replace(".", "", 1).isdigit():
value = float(value)
cppdefines.append((name, value))
result['CPPDEFINES'] = cppdefines
result["CPPDEFINES"] = cppdefines
# fix relative CPPPATH & LIBPATH
for k in ("CPPPATH", "LIBPATH"):
for i, p in enumerate(result.get(k, [])):
if isdir(p):
result[k][i] = realpath(p)
if os.path.isdir(p):
result[k][i] = os.path.realpath(p)
# fix relative path for "-include"
for i, f in enumerate(result.get("CCFLAGS", [])):
if isinstance(f, tuple) and f[0] == "-include":
result['CCFLAGS'][i] = (f[0], env.File(realpath(f[1].get_path())))
result["CCFLAGS"][i] = (f[0], env.File(os.path.realpath(f[1].get_path())))
return result
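The CPPDEFINES handling above coerces define values parsed from build flags: embedded quotes are escaped and numeric strings become int or float. A condensed illustration with made-up defines (the real code additionally goes through SCons' Util.is_Sequence and ParseFlags):

defines = [("VERSION", "1.2"), ("RETRIES", "3"), ("NAME", '"blink"'), "DEBUG"]

cppdefines = []
for item in defines:
    if not isinstance(item, tuple):
        cppdefines.append(item)  # bare names pass through unchanged
        continue
    name, value = item
    if '"' in value:
        value = value.replace('"', '\\"')
    elif value.isdigit():
        value = int(value)
    elif value.replace(".", "", 1).isdigit():
        value = float(value)
    cppdefines.append((name, value))

print(cppdefines)
# [('VERSION', 1.2), ('RETRIES', 3), ('NAME', '\\"blink\\"'), 'DEBUG']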
@@ -191,14 +215,15 @@ def ProcessFlags(env, flags): # pylint: disable=too-many-branches
# Cancel any previous definition of name, either built in or
# provided with a -U option // Issue #191
undefines = [
u for u in env.get("CCFLAGS", [])
u
for u in env.get("CCFLAGS", [])
if isinstance(u, string_types) and u.startswith("-U")
]
if undefines:
for undef in undefines:
env['CCFLAGS'].remove(undef)
if undef[2:] in env['CPPDEFINES']:
env['CPPDEFINES'].remove(undef[2:])
env["CCFLAGS"].remove(undef)
if undef[2:] in env["CPPDEFINES"]:
env["CPPDEFINES"].remove(undef[2:])
env.Append(_CPPDEFFLAGS=" %s" % " ".join(undefines))
@@ -221,8 +246,7 @@ def ProcessUnFlags(env, flags):
for current in env.get(key, []):
conditions = [
unflag == current,
isinstance(current, (tuple, list))
and unflag[0] == current[0]
isinstance(current, (tuple, list)) and unflag[0] == current[0],
]
if any(conditions):
env[key].remove(current)
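The two unflag conditions above amount to: a plain flag must match exactly, while a tuple flag such as ("-include", ...) is removed whenever the first elements agree. A minimal restatement for clarity:
def _unflag_matches(unflag, current):
    if unflag == current:
        return True
    return isinstance(current, (tuple, list)) and unflag[0] == current[0]

assert _unflag_matches("-Os", "-Os")
assert _unflag_matches(("-include", "cfg.h"), ("-include", "other.h"))
assert not _unflag_matches("-Os", "-O2")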
@@ -231,15 +255,14 @@ def ProcessUnFlags(env, flags):
def MatchSourceFiles(env, src_dir, src_filter=None):
src_filter = env.subst(src_filter) if src_filter else None
src_filter = src_filter or SRC_FILTER_DEFAULT
return fs.match_src_files(env.subst(src_dir), src_filter,
SRC_BUILD_EXT + SRC_HEADER_EXT)
return fs.match_src_files(
env.subst(src_dir), src_filter, SRC_BUILD_EXT + SRC_HEADER_EXT
)
def CollectBuildFiles(env,
variant_dir,
src_dir,
src_filter=None,
duplicate=False):
def CollectBuildFiles(
env, variant_dir, src_dir, src_filter=None, duplicate=False
): # pylint: disable=too-many-locals
sources = []
variants = []
@@ -248,27 +271,44 @@ def CollectBuildFiles(env,
src_dir = src_dir[:-1]
for item in env.MatchSourceFiles(src_dir, src_filter):
_reldir = dirname(item)
_src_dir = join(src_dir, _reldir) if _reldir else src_dir
_var_dir = join(variant_dir, _reldir) if _reldir else variant_dir
_reldir = os.path.dirname(item)
_src_dir = os.path.join(src_dir, _reldir) if _reldir else src_dir
_var_dir = os.path.join(variant_dir, _reldir) if _reldir else variant_dir
if _var_dir not in variants:
variants.append(_var_dir)
env.VariantDir(_var_dir, _src_dir, duplicate)
if fs.path_endswith_ext(item, SRC_BUILD_EXT):
sources.append(env.File(join(_var_dir, basename(item))))
sources.append(env.File(os.path.join(_var_dir, os.path.basename(item))))
for callback, pattern in env.get("__PIO_BUILD_MIDDLEWARES", []):
tmp = []
for node in sources:
if pattern and not fnmatch.fnmatch(node.get_path(), pattern):
tmp.append(node)
continue
n = callback(node)
if n:
tmp.append(n)
sources = tmp
return sources
def AddBuildMiddleware(env, callback, pattern=None):
env.Append(__PIO_BUILD_MIDDLEWARES=[(callback, pattern)])
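AddBuildMiddleware simply records (callback, pattern) pairs that CollectBuildFiles applies to every matched source node above. A hedged example of wiring it up from an extra_scripts file (the pattern and define are illustrative):
Import("env")

def apply_extra_define(node):
    # return a replacement node; returning None would drop the source entirely
    return env.Object(
        node,
        CPPDEFINES=env["CPPDEFINES"] + [("VIA_MIDDLEWARE", 1)],
    )

# only nodes whose path matches the fnmatch pattern go through the callback
env.AddBuildMiddleware(apply_extra_define, "*/src/*.cpp")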
def BuildFrameworks(env, frameworks):
if not frameworks:
return
if "BOARD" not in env:
sys.stderr.write("Please specify `board` in `platformio.ini` to use "
"with '%s' framework\n" % ", ".join(frameworks))
sys.stderr.write(
"Please specify `board` in `platformio.ini` to use "
"with '%s' framework\n" % ", ".join(frameworks)
)
env.Exit(1)
board_frameworks = env.BoardConfig().get("frameworks", [])
@@ -276,8 +316,7 @@ def BuildFrameworks(env, frameworks):
if board_frameworks:
frameworks.insert(0, board_frameworks[0])
else:
sys.stderr.write(
"Error: Please specify `board` in `platformio.ini`\n")
sys.stderr.write("Error: Please specify `board` in `platformio.ini`\n")
env.Exit(1)
for f in frameworks:
@@ -290,22 +329,24 @@ def BuildFrameworks(env, frameworks):
if f in board_frameworks:
SConscript(env.GetFrameworkScript(f), exports="env")
else:
sys.stderr.write(
"Error: This board doesn't support %s framework!\n" % f)
sys.stderr.write("Error: This board doesn't support %s framework!\n" % f)
env.Exit(1)
def BuildLibrary(env, variant_dir, src_dir, src_filter=None):
env.ProcessUnFlags(env.get("BUILD_UNFLAGS"))
return env.StaticLibrary(
env.subst(variant_dir),
env.CollectBuildFiles(variant_dir, src_dir, src_filter))
env.subst(variant_dir), env.CollectBuildFiles(variant_dir, src_dir, src_filter)
)
def BuildSources(env, variant_dir, src_dir, src_filter=None):
nodes = env.CollectBuildFiles(variant_dir, src_dir, src_filter)
DefaultEnvironment().Append(
PIOBUILDFILES=[env.Object(node) for node in nodes])
PIOBUILDFILES=[
env.Object(node) if isinstance(node, FS.File) else node for node in nodes
]
)
def exists(_):
@@ -313,12 +354,16 @@ def exists(_):
def generate(env):
env.AddMethod(GetBuildType)
env.AddMethod(BuildProgram)
env.AddMethod(ProcessProgramDeps)
env.AddMethod(ProcessProjectDeps)
env.AddMethod(ParseFlagsExtended)
env.AddMethod(ProcessFlags)
env.AddMethod(ProcessUnFlags)
env.AddMethod(MatchSourceFiles)
env.AddMethod(CollectBuildFiles)
env.AddMethod(AddBuildMiddleware)
env.AddMethod(BuildFrameworks)
env.AddMethod(BuildLibrary)
env.AddMethod(BuildSources)

View File

@@ -13,7 +13,6 @@
# limitations under the License.
import os
from os.path import dirname, isfile, join
import click
@@ -22,13 +21,21 @@ class PlatformioCLI(click.MultiCommand):
leftover_args = []
def __init__(self, *args, **kwargs):
super(PlatformioCLI, self).__init__(*args, **kwargs)
self._pio_cmds_dir = os.path.dirname(__file__)
@staticmethod
def in_silence():
args = PlatformioCLI.leftover_args
return args and any([
args[0] == "debug" and "--interpreter" in " ".join(args),
args[0] == "upgrade", "--json-output" in args, "--version" in args
])
return args and any(
[
args[0] == "debug" and "--interpreter" in " ".join(args),
args[0] == "upgrade",
"--json-output" in args,
"--version" in args,
]
)
def invoke(self, ctx):
PlatformioCLI.leftover_args = ctx.args
@@ -38,35 +45,36 @@ class PlatformioCLI(click.MultiCommand):
def list_commands(self, ctx):
cmds = []
cmds_dir = dirname(__file__)
for name in os.listdir(cmds_dir):
if name.startswith("__init__"):
for cmd_name in os.listdir(self._pio_cmds_dir):
if cmd_name.startswith("__init__"):
continue
if isfile(join(cmds_dir, name, "command.py")):
cmds.append(name)
elif name.endswith(".py"):
cmds.append(name[:-3])
if os.path.isfile(os.path.join(self._pio_cmds_dir, cmd_name, "command.py")):
cmds.append(cmd_name)
elif cmd_name.endswith(".py"):
cmds.append(cmd_name[:-3])
cmds.sort()
return cmds
def get_command(self, ctx, cmd_name):
mod = None
try:
mod = __import__("platformio.commands." + cmd_name, None, None,
["cli"])
mod_path = "platformio.commands." + cmd_name
if os.path.isfile(os.path.join(self._pio_cmds_dir, cmd_name, "command.py")):
mod_path = "platformio.commands.%s.command" % cmd_name
mod = __import__(mod_path, None, None, ["cli"])
except ImportError:
try:
return self._handle_obsolate_command(cmd_name)
except AttributeError:
raise click.UsageError('No such command "%s"' % cmd_name, ctx)
pass
raise click.UsageError('No such command "%s"' % cmd_name, ctx)
return mod.cli
@staticmethod
def _handle_obsolate_command(name):
if name == "platforms":
from platformio.commands import platform
return platform.cli
if name == "serialports":
from platformio.commands import device
return device.cli
# pylint: disable=import-outside-toplevel
if name == "init":
from platformio.commands.project import project_init
return project_init
raise AttributeError()
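The command lookup above prefers a package-style "command.py" and falls back to a flat module, importing lazily and returning the module's "cli" attribute. A rough sketch of that strategy with a hypothetical helper class:
import importlib
import os

class LazyCommandLoader:  # hypothetical helper, not part of PlatformIO
    def __init__(self, cmds_dir, package="platformio.commands"):
        self.cmds_dir = cmds_dir
        self.package = package

    def load(self, cmd_name):
        mod_path = "%s.%s" % (self.package, cmd_name)
        if os.path.isfile(os.path.join(self.cmds_dir, cmd_name, "command.py")):
            mod_path += ".command"
        return importlib.import_module(mod_path).cli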

View File

@@ -32,11 +32,14 @@ def cli(query, installed, json_output): # pylint: disable=R0912
grpboards = {}
for board in _get_boards(installed):
if query and query.lower() not in json.dumps(board).lower():
if query and not any(
query.lower() in str(board.get(k, "")).lower()
for k in ("id", "name", "mcu", "vendor", "platform", "frameworks")
):
continue
if board['platform'] not in grpboards:
grpboards[board['platform']] = []
grpboards[board['platform']].append(board)
if board["platform"] not in grpboards:
grpboards[board["platform"]] = []
grpboards[board["platform"]].append(board)
terminal_width, _ = click.get_terminal_size()
for (platform, boards) in sorted(grpboards.items()):
@@ -50,11 +53,21 @@ def cli(query, installed, json_output): # pylint: disable=R0912
def print_boards(boards):
click.echo(
tabulate([(click.style(b['id'], fg="cyan"), b['mcu'], "%dMHz" %
(b['fcpu'] / 1000000), fs.format_filesize(
b['rom']), fs.format_filesize(b['ram']), b['name'])
for b in boards],
headers=["ID", "MCU", "Frequency", "Flash", "RAM", "Name"]))
tabulate(
[
(
click.style(b["id"], fg="cyan"),
b["mcu"],
"%dMHz" % (b["fcpu"] / 1000000),
fs.format_filesize(b["rom"]),
fs.format_filesize(b["ram"]),
b["name"],
)
for b in boards
],
headers=["ID", "MCU", "Frequency", "Flash", "RAM", "Name"],
)
)
def _get_boards(installed=False):
@@ -66,7 +79,7 @@ def _print_boards_json(query, installed=False):
result = []
for board in _get_boards(installed):
if query:
search_data = "%s %s" % (board['id'], json.dumps(board).lower())
search_data = "%s %s" % (board["id"], json.dumps(board).lower())
if query.lower() not in search_data.lower():
continue
result.append(board)
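The per-field query matching added earlier in this file limits the search to a few board attributes instead of the full JSON dump. A tiny self-check with a made-up board record:
board = {
    "id": "uno", "name": "Arduino Uno", "mcu": "ATMEGA328P",
    "vendor": "Arduino", "platform": "atmelavr", "frameworks": ["arduino"],
}
query = "avr"
matched = any(
    query.lower() in str(board.get(k, "")).lower()
    for k in ("id", "name", "mcu", "vendor", "platform", "frameworks")
)
assert matched  # "avr" matches the "atmelavr" platform field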

View File

@@ -0,0 +1,13 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

View File

@@ -0,0 +1,316 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# pylint: disable=too-many-arguments,too-many-locals,too-many-branches
# pylint: disable=redefined-builtin,too-many-statements
import os
from collections import Counter
from os.path import dirname, isfile
from time import time
import click
from tabulate import tabulate
from platformio import app, exception, fs, util
from platformio.commands.check.defect import DefectItem
from platformio.commands.check.tools import CheckToolFactory
from platformio.compat import dump_json_to_unicode
from platformio.project.config import ProjectConfig
from platformio.project.helpers import find_project_dir_above, get_project_dir
@click.command("check", short_help="Run a static analysis tool on code")
@click.option("-e", "--environment", multiple=True)
@click.option(
"-d",
"--project-dir",
default=os.getcwd,
type=click.Path(
exists=True, file_okay=True, dir_okay=True, writable=True, resolve_path=True
),
)
@click.option(
"-c",
"--project-conf",
type=click.Path(
exists=True, file_okay=True, dir_okay=False, readable=True, resolve_path=True
),
)
@click.option("--pattern", multiple=True)
@click.option("--flags", multiple=True)
@click.option(
"--severity", multiple=True, type=click.Choice(DefectItem.SEVERITY_LABELS.values())
)
@click.option("-s", "--silent", is_flag=True)
@click.option("-v", "--verbose", is_flag=True)
@click.option("--json-output", is_flag=True)
@click.option(
"--fail-on-defect",
multiple=True,
type=click.Choice(DefectItem.SEVERITY_LABELS.values()),
)
def cli(
environment,
project_dir,
project_conf,
pattern,
flags,
severity,
silent,
verbose,
json_output,
fail_on_defect,
):
app.set_session_var("custom_project_conf", project_conf)
# find the project directory at an upper level
if isfile(project_dir):
project_dir = find_project_dir_above(project_dir)
results = []
with fs.cd(project_dir):
config = ProjectConfig.get_instance(project_conf)
config.validate(environment)
default_envs = config.default_envs()
for envname in config.envs():
skipenv = any(
[
environment and envname not in environment,
not environment and default_envs and envname not in default_envs,
]
)
env_options = config.items(env=envname, as_dict=True)
env_dump = []
for k, v in env_options.items():
if k not in ("platform", "framework", "board"):
continue
env_dump.append(
"%s: %s" % (k, ", ".join(v) if isinstance(v, list) else v)
)
default_patterns = [
config.get_optional_dir("src"),
config.get_optional_dir("include"),
]
tool_options = dict(
verbose=verbose,
silent=silent,
patterns=pattern or env_options.get("check_patterns", default_patterns),
flags=flags or env_options.get("check_flags"),
severity=[DefectItem.SEVERITY_LABELS[DefectItem.SEVERITY_HIGH]]
if silent
else severity or config.get("env:" + envname, "check_severity"),
)
for tool in config.get("env:" + envname, "check_tool"):
if skipenv:
results.append({"env": envname, "tool": tool})
continue
if not silent and not json_output:
print_processing_header(tool, envname, env_dump)
ct = CheckToolFactory.new(
tool, project_dir, config, envname, tool_options
)
result = {"env": envname, "tool": tool, "duration": time()}
rc = ct.check(
on_defect_callback=None
if (json_output or verbose)
else lambda defect: click.echo(repr(defect))
)
result["defects"] = ct.get_defects()
result["duration"] = time() - result["duration"]
result["succeeded"] = rc == 0
if fail_on_defect:
result["succeeded"] = rc == 0 and not any(
DefectItem.SEVERITY_LABELS[d.severity] in fail_on_defect
for d in result["defects"]
)
result["stats"] = collect_component_stats(result)
results.append(result)
if verbose:
click.echo("\n".join(repr(d) for d in result["defects"]))
if not json_output and not silent:
if rc != 0:
click.echo(
"Error: %s failed to perform check! Please "
"examine tool output in verbose mode." % tool
)
elif not result["defects"]:
click.echo("No defects found")
print_processing_footer(result)
if json_output:
click.echo(dump_json_to_unicode(results_to_json(results)))
elif not silent:
print_check_summary(results)
command_failed = any(r.get("succeeded") is False for r in results)
if command_failed:
raise exception.ReturnErrorCode(1)
def results_to_json(raw):
results = []
for item in raw:
if item.get("succeeded") is None:
continue
item.update(
{
"succeeded": bool(item.get("succeeded")),
"defects": [d.as_dict() for d in item.get("defects", [])],
}
)
results.append(item)
return results
def print_processing_header(tool, envname, envdump):
click.echo(
"Checking %s > %s (%s)"
% (click.style(envname, fg="cyan", bold=True), tool, "; ".join(envdump))
)
terminal_width, _ = click.get_terminal_size()
click.secho("-" * terminal_width, bold=True)
def print_processing_footer(result):
is_failed = not result.get("succeeded")
util.print_labeled_bar(
"[%s] Took %.2f seconds"
% (
(
click.style("FAILED", fg="red", bold=True)
if is_failed
else click.style("PASSED", fg="green", bold=True)
),
result["duration"],
),
is_error=is_failed,
)
def collect_component_stats(result):
components = dict()
def _append_defect(component, defect):
if not components.get(component):
components[component] = Counter()
components[component].update({DefectItem.SEVERITY_LABELS[defect.severity]: 1})
for defect in result.get("defects", []):
component = dirname(defect.file) or defect.file
_append_defect(component, defect)
if component.startswith(get_project_dir()):
while os.sep in component:
component = dirname(component)
_append_defect(component, defect)
return components
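collect_component_stats buckets defects by directory and, for project files, also rolls counts up to parent components. A condensed sketch of the per-directory bookkeeping (parent roll-up omitted; the defect objects are stand-ins):
from collections import Counter

SEVERITY_LABELS = {4: "low", 2: "medium", 1: "high"}

class FakeDefect:  # stand-in for DefectItem, illustration only
    def __init__(self, file, severity):
        self.file = file
        self.severity = severity

defects = [FakeDefect("src/main.cpp", 1), FakeDefect("src/util/fs.cpp", 2)]
components = {}
for d in defects:
    component = d.file.rsplit("/", 1)[0]
    components.setdefault(component, Counter()).update(
        {SEVERITY_LABELS[d.severity]: 1}
    )

assert components["src"]["high"] == 1
assert components["src/util"]["medium"] == 1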
def print_defects_stats(results):
if not results:
return
component_stats = {}
for r in results:
for k, v in r.get("stats", {}).items():
if not component_stats.get(k):
component_stats[k] = Counter()
component_stats[k].update(r["stats"][k])
if not component_stats:
return
severity_labels = list(DefectItem.SEVERITY_LABELS.values())
severity_labels.reverse()
tabular_data = list()
for k, v in component_stats.items():
tool_defect = [v.get(s, 0) for s in severity_labels]
tabular_data.append([k] + tool_defect)
total = ["Total"] + [sum(d) for d in list(zip(*tabular_data))[1:]]
tabular_data.sort()
tabular_data.append([]) # Empty line as delimiter
tabular_data.append(total)
headers = ["Component"]
headers.extend([l.upper() for l in severity_labels])
headers = [click.style(h, bold=True) for h in headers]
click.echo(tabulate(tabular_data, headers=headers, numalign="center"))
click.echo()
def print_check_summary(results):
click.echo()
tabular_data = []
succeeded_nums = 0
failed_nums = 0
duration = 0
print_defects_stats(results)
for result in results:
duration += result.get("duration", 0)
if result.get("succeeded") is False:
failed_nums += 1
status_str = click.style("FAILED", fg="red")
elif result.get("succeeded") is None:
status_str = "IGNORED"
else:
succeeded_nums += 1
status_str = click.style("PASSED", fg="green")
tabular_data.append(
(
click.style(result["env"], fg="cyan"),
result["tool"],
status_str,
util.humanize_duration_time(result.get("duration")),
)
)
click.echo(
tabulate(
tabular_data,
headers=[
click.style(s, bold=True)
for s in ("Environment", "Tool", "Status", "Duration")
],
),
err=failed_nums,
)
util.print_labeled_bar(
"%s%d succeeded in %s"
% (
"%d failed, " % failed_nums if failed_nums else "",
succeeded_nums,
util.humanize_duration_time(duration),
),
is_error=failed_nums,
fg="red" if failed_nums else "green",
)

View File

@@ -0,0 +1,95 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
import click
from platformio.project.helpers import get_project_dir
# pylint: disable=too-many-instance-attributes, redefined-builtin
# pylint: disable=too-many-arguments
class DefectItem(object):
SEVERITY_HIGH = 1
SEVERITY_MEDIUM = 2
SEVERITY_LOW = 4
SEVERITY_LABELS = {4: "low", 2: "medium", 1: "high"}
def __init__(
self,
severity,
category,
message,
file="unknown",
line=0,
column=0,
id=None,
callstack=None,
cwe=None,
):
assert severity in (self.SEVERITY_HIGH, self.SEVERITY_MEDIUM, self.SEVERITY_LOW)
self.severity = severity
self.category = category
self.message = message
self.line = int(line)
self.column = int(column)
self.callstack = callstack
self.cwe = cwe
self.id = id
self.file = file
if file.startswith(get_project_dir()):
self.file = os.path.relpath(file, get_project_dir())
def __repr__(self):
defect_color = None
if self.severity == self.SEVERITY_HIGH:
defect_color = "red"
elif self.severity == self.SEVERITY_MEDIUM:
defect_color = "yellow"
format_str = "{file}:{line}: [{severity}:{category}] {message} {id}"
return format_str.format(
severity=click.style(self.SEVERITY_LABELS[self.severity], fg=defect_color),
category=click.style(self.category.lower(), fg=defect_color),
file=click.style(self.file, bold=True),
message=self.message,
line=self.line,
id="%s" % "[%s]" % self.id if self.id else "",
)
def __or__(self, defect):
return self.severity | defect.severity
@staticmethod
def severity_to_int(label):
for key, value in DefectItem.SEVERITY_LABELS.items():
if label == value:
return key
raise Exception("Unknown severity label -> %s" % label)
def as_dict(self):
return {
"severity": self.SEVERITY_LABELS[self.severity],
"category": self.category,
"message": self.message,
"file": os.path.realpath(self.file),
"line": self.line,
"column": self.column,
"callstack": self.callstack,
"id": self.id,
"cwe": self.cwe,
}
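An illustrative round-trip through DefectItem, assuming the import path used elsewhere in this changeset; the path, message, and defect id are made up:
from platformio.commands.check.defect import DefectItem

defect = DefectItem(
    severity=DefectItem.SEVERITY_MEDIUM,
    category="warning",
    message="variable 'x' is assigned but never used",
    file="src/main.cpp",
    line=42,
    column=5,
    id="unusedVariable",
)
print(repr(defect))                   # src/main.cpp:42: [medium:warning] ... [unusedVariable]
print(defect.as_dict()["severity"])   # "medium"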

View File

@@ -0,0 +1,33 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from platformio import exception
from platformio.commands.check.tools.clangtidy import ClangtidyCheckTool
from platformio.commands.check.tools.cppcheck import CppcheckCheckTool
from platformio.commands.check.tools.pvsstudio import PvsStudioCheckTool
class CheckToolFactory(object):
@staticmethod
def new(tool, project_dir, config, envname, options):
cls = None
if tool == "cppcheck":
cls = CppcheckCheckTool
elif tool == "clangtidy":
cls = ClangtidyCheckTool
elif tool == "pvs-studio":
cls = PvsStudioCheckTool
else:
raise exception.PlatformioException("Unknown check tool `%s`" % tool)
return cls(project_dir, config, envname, options)

View File

@@ -0,0 +1,177 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import glob
import os
import click
from platformio import fs, proc
from platformio.commands.check.defect import DefectItem
from platformio.project.helpers import get_project_dir, load_project_ide_data
class CheckToolBase(object): # pylint: disable=too-many-instance-attributes
def __init__(self, project_dir, config, envname, options):
self.config = config
self.envname = envname
self.options = options
self.cc_flags = []
self.cxx_flags = []
self.cpp_includes = []
self.cpp_defines = []
self.toolchain_defines = []
self.cc_path = None
self.cxx_path = None
self._defects = []
self._on_defect_callback = None
self._bad_input = False
self._load_cpp_data(project_dir, envname)
# detect all defects by default
if not self.options.get("severity"):
self.options["severity"] = [
DefectItem.SEVERITY_LOW,
DefectItem.SEVERITY_MEDIUM,
DefectItem.SEVERITY_HIGH,
]
# cast severity labels to integer IDs
self.options["severity"] = [
s if isinstance(s, int) else DefectItem.severity_to_int(s)
for s in self.options["severity"]
]
def _load_cpp_data(self, project_dir, envname):
data = load_project_ide_data(project_dir, envname)
if not data:
return
self.cc_flags = data.get("cc_flags", "").split(" ")
self.cxx_flags = data.get("cxx_flags", "").split(" ")
self.cpp_includes = data.get("includes", [])
self.cpp_defines = data.get("defines", [])
self.cc_path = data.get("cc_path")
self.cxx_path = data.get("cxx_path")
self.toolchain_defines = self._get_toolchain_defines(self.cc_path)
def get_flags(self, tool):
result = []
flags = self.options.get("flags") or []
for flag in flags:
if ":" not in flag or flag.startswith("-"):
result.extend([f for f in flag.split(" ") if f])
elif flag.startswith("%s:" % tool):
result.extend([f for f in flag.split(":", 1)[1].split(" ") if f])
return result
@staticmethod
def _get_toolchain_defines(cc_path):
defines = []
result = proc.exec_command("echo | %s -dM -E -x c++ -" % cc_path, shell=True)
for line in result["out"].split("\n"):
tokens = line.strip().split(" ", 2)
if not tokens or tokens[0] != "#define":
continue
if len(tokens) > 2:
defines.append("%s=%s" % (tokens[1], tokens[2]))
else:
defines.append(tokens[1])
return defines
@staticmethod
def is_flag_set(flag, flags):
return any(flag in f for f in flags)
def get_defects(self):
return self._defects
def configure_command(self):
raise NotImplementedError
def on_tool_output(self, line):
line = self.tool_output_filter(line)
if not line:
return
defect = self.parse_defect(line)
if not isinstance(defect, DefectItem):
if self.options.get("verbose"):
click.echo(line)
return
if defect.severity not in self.options["severity"]:
return
self._defects.append(defect)
if self._on_defect_callback:
self._on_defect_callback(defect)
@staticmethod
def tool_output_filter(line):
return line
@staticmethod
def parse_defect(raw_line):
return raw_line
def clean_up(self):
pass
def get_project_target_files(self):
allowed_extensions = (".h", ".hpp", ".c", ".cc", ".cpp", ".ino")
result = []
def _add_file(path):
if not path.endswith(allowed_extensions):
return
result.append(os.path.realpath(path))
for pattern in self.options["patterns"]:
for item in glob.glob(pattern):
if not os.path.isdir(item):
_add_file(item)
for root, _, files in os.walk(item, followlinks=True):
for f in files:
_add_file(os.path.join(root, f))
return result
def get_source_language(self):
with fs.cd(get_project_dir()):
for _, __, files in os.walk(self.config.get_optional_dir("src")):
for name in files:
if "." not in name:
continue
if os.path.splitext(name)[1].lower() in (".cpp", ".cxx", ".ino"):
return "c++"
return "c"
def check(self, on_defect_callback=None):
self._on_defect_callback = on_defect_callback
cmd = self.configure_command()
if self.options.get("verbose"):
click.echo(" ".join(cmd))
proc.exec_command(
cmd,
stdout=proc.LineBufferedAsyncPipe(self.on_tool_output),
stderr=proc.LineBufferedAsyncPipe(self.on_tool_output),
)
self.clean_up()
return self._bad_input
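The check_flags scoping handled by get_flags above lets one option mix global flags with tool-prefixed ones. A standalone restatement of that rule:
def scoped_flags(tool, flags):
    result = []
    for flag in flags:
        if ":" not in flag or flag.startswith("-"):
            result.extend(f for f in flag.split(" ") if f)
        elif flag.startswith("%s:" % tool):
            result.extend(f for f in flag.split(":", 1)[1].split(" ") if f)
    return result

flags = ["--std=c99", "cppcheck: --enable=all -j 4", "clangtidy: --fix"]
assert scoped_flags("cppcheck", flags) == ["--std=c99", "--enable=all", "-j", "4"]
assert scoped_flags("clangtidy", flags) == ["--std=c99", "--fix"]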

View File

@@ -0,0 +1,67 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import re
from os.path import join
from platformio.commands.check.defect import DefectItem
from platformio.commands.check.tools.base import CheckToolBase
from platformio.managers.core import get_core_package_dir
class ClangtidyCheckTool(CheckToolBase):
def tool_output_filter(self, line):
if not self.options.get("verbose") and "[clang-diagnostic-error]" in line:
return ""
if "[CommonOptionsParser]" in line:
self._bad_input = True
return line
if any(d in line for d in ("note: ", "error: ", "warning: ")):
return line
return ""
def parse_defect(self, raw_line):
match = re.match(r"^(.*):(\d+):(\d+):\s+([^:]+):\s(.+)\[([^]]+)\]$", raw_line)
if not match:
return raw_line
file_, line, column, category, message, defect_id = match.groups()
severity = DefectItem.SEVERITY_LOW
if category == "error":
severity = DefectItem.SEVERITY_HIGH
elif category == "warning":
severity = DefectItem.SEVERITY_MEDIUM
return DefectItem(severity, category, message, file_, line, column, defect_id)
def configure_command(self):
tool_path = join(get_core_package_dir("tool-clangtidy"), "clang-tidy")
cmd = [tool_path, "--quiet"]
flags = self.get_flags("clangtidy")
if not self.is_flag_set("--checks", flags):
cmd.append("--checks=*")
cmd.extend(flags)
cmd.extend(self.get_project_target_files())
cmd.append("--")
cmd.extend(["-D%s" % d for d in self.cpp_defines + self.toolchain_defines])
cmd.extend(["-I%s" % inc for inc in self.cpp_includes])
return cmd
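For reference, a quick check of the clang-tidy defect regex above against a fabricated but typical diagnostic line:
import re

line = ("src/main.cpp:10:5: warning: variable 'x' is unused "
        "[clang-diagnostic-unused-variable]")
match = re.match(r"^(.*):(\d+):(\d+):\s+([^:]+):\s(.+)\[([^]]+)\]$", line)
assert match
file_, lineno, column, category, message, defect_id = match.groups()
assert (file_, lineno, column) == ("src/main.cpp", "10", "5")
assert category == "warning"
assert defect_id == "clang-diagnostic-unused-variable"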

View File

@@ -0,0 +1,158 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from os import remove
from os.path import isfile, join
from tempfile import NamedTemporaryFile
from platformio.commands.check.defect import DefectItem
from platformio.commands.check.tools.base import CheckToolBase
from platformio.managers.core import get_core_package_dir
class CppcheckCheckTool(CheckToolBase):
def __init__(self, *args, **kwargs):
self._tmp_files = []
self.defect_fields = [
"severity",
"message",
"file",
"line",
"column",
"callstack",
"cwe",
"id",
]
super(CppcheckCheckTool, self).__init__(*args, **kwargs)
def tool_output_filter(self, line):
if (
not self.options.get("verbose")
and "--suppress=unmatchedSuppression:" in line
):
return ""
if any(
msg in line
for msg in (
"No C or C++ source files found",
"unrecognized command line option",
)
):
self._bad_input = True
return line
def parse_defect(self, raw_line):
if "<&PIO&>" not in raw_line or any(
f not in raw_line for f in self.defect_fields
):
return None
args = dict()
for field in raw_line.split("<&PIO&>"):
field = field.strip().replace('"', "")
name, value = field.split("=", 1)
args[name] = value
args["category"] = args["severity"]
if args["severity"] == "error":
args["severity"] = DefectItem.SEVERITY_HIGH
elif args["severity"] == "warning":
args["severity"] = DefectItem.SEVERITY_MEDIUM
else:
args["severity"] = DefectItem.SEVERITY_LOW
return DefectItem(**args)
def configure_command(self):
tool_path = join(get_core_package_dir("tool-cppcheck"), "cppcheck")
cmd = [
tool_path,
"--error-exitcode=1",
"--verbose" if self.options.get("verbose") else "--quiet",
]
cmd.append(
'--template="%s"'
% "<&PIO&>".join(["{0}={{{0}}}".format(f) for f in self.defect_fields])
)
flags = self.get_flags("cppcheck")
if not flags:
# by default, users can suppress individual defects directly
# in code via // cppcheck-suppress warningID
cmd.append("--inline-suppr")
if not self.is_flag_set("--platform", flags):
cmd.append("--platform=unspecified")
if not self.is_flag_set("--enable", flags):
enabled_checks = [
"warning",
"style",
"performance",
"portability",
"unusedFunction",
]
cmd.append("--enable=%s" % ",".join(enabled_checks))
if not self.is_flag_set("--language", flags):
if self.get_source_language() == "c++":
cmd.append("--language=c++")
if not self.is_flag_set("--std", flags):
for f in self.cxx_flags + self.cc_flags:
if "-std" in f:
# Standards with GNU extensions are not allowed
cmd.append("-" + f.replace("gnu", "c"))
cmd.extend(["-D%s" % d for d in self.cpp_defines + self.toolchain_defines])
cmd.extend(flags)
cmd.append("--file-list=%s" % self._generate_src_file())
cmd.append("--includes-file=%s" % self._generate_inc_file())
core_dir = self.config.get_optional_dir("packages")
cmd.append("--suppress=*:%s*" % core_dir)
cmd.append("--suppress=unmatchedSuppression:%s*" % core_dir)
return cmd
def _create_tmp_file(self, data):
with NamedTemporaryFile("w", delete=False) as fp:
fp.write(data)
self._tmp_files.append(fp.name)
return fp.name
def _generate_src_file(self):
src_files = [
f for f in self.get_project_target_files() if not f.endswith((".h", ".hpp"))
]
return self._create_tmp_file("\n".join(src_files))
def _generate_inc_file(self):
return self._create_tmp_file("\n".join(self.cpp_includes))
def clean_up(self):
for f in self._tmp_files:
if isfile(f):
remove(f)
# delete temporary dump files generated by addons
if not self.is_flag_set("--addon", self.get_flags("cppcheck")):
return
for f in self.get_project_target_files():
dump_file = f + ".dump"
if isfile(dump_file):
remove(dump_file)
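cppcheck output is rendered through the custom --template above, so each defect arrives as name=value fields joined by the <&PIO&> marker. A small decoding sketch with fabricated field values:
raw_line = "<&PIO&>".join([
    "severity=warning",
    "message=Variable 'x' is not initialized",
    "file=src/main.cpp",
    "line=12",
    "column=3",
    "callstack=",
    "cwe=457",
    "id=uninitvar",
])
args = {}
for field in raw_line.split("<&PIO&>"):
    name, value = field.strip().replace('"', "").split("=", 1)
    args[name] = value

assert args["severity"] == "warning" and args["id"] == "uninitvar"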

View File

@@ -0,0 +1,226 @@
# Copyright (c) 2020-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
import shutil
import tempfile
from xml.etree.ElementTree import fromstring
import click
from platformio import proc, util
from platformio.commands.check.defect import DefectItem
from platformio.commands.check.tools.base import CheckToolBase
from platformio.managers.core import get_core_package_dir
class PvsStudioCheckTool(CheckToolBase): # pylint: disable=too-many-instance-attributes
def __init__(self, *args, **kwargs):
self._tmp_dir = tempfile.mkdtemp(prefix="piocheck")
self._tmp_preprocessed_file = self._generate_tmp_file_path() + ".i"
self._tmp_output_file = self._generate_tmp_file_path() + ".pvs"
self._tmp_cfg_file = self._generate_tmp_file_path() + ".cfg"
self._tmp_cmd_file = self._generate_tmp_file_path() + ".cmd"
self.tool_path = os.path.join(
get_core_package_dir("tool-pvs-studio"),
"x64" if "windows" in util.get_systype() else "bin",
"pvs-studio",
)
super(PvsStudioCheckTool, self).__init__(*args, **kwargs)
with open(self._tmp_cfg_file, "w") as fp:
fp.write(
"exclude-path = "
+ self.config.get_optional_dir("packages").replace("\\", "/")
)
with open(self._tmp_cmd_file, "w") as fp:
fp.write(
" ".join(
['-I"%s"' % inc.replace("\\", "/") for inc in self.cpp_includes]
)
)
def _process_defects(self, defects):
for defect in defects:
if not isinstance(defect, DefectItem):
return
if defect.severity not in self.options["severity"]:
return
self._defects.append(defect)
if self._on_defect_callback:
self._on_defect_callback(defect)
def _demangle_report(self, output_file):
converter_tool = os.path.join(
get_core_package_dir("tool-pvs-studio"),
"HtmlGenerator"
if "windows" in util.get_systype()
else os.path.join("bin", "plog-converter"),
)
cmd = (
converter_tool,
"-t",
"xml",
output_file,
"-m",
"cwe",
"-m",
"misra",
"-a",
# Enable all possible analyzers and defect levels
"GA:1,2,3;64:1,2,3;OP:1,2,3;CS:1,2,3;MISRA:1,2,3",
"--cerr",
)
result = proc.exec_command(cmd)
if result["returncode"] != 0:
click.echo(result["err"])
self._bad_input = True
return result["err"]
def parse_defects(self, output_file):
defects = []
report = self._demangle_report(output_file)
if not report:
self._bad_input = True
return []
try:
defects_data = fromstring(report)
except: # pylint: disable=bare-except
click.echo("Error: Couldn't decode generated report!")
self._bad_input = True
return []
for table in defects_data.iter("PVS-Studio_Analysis_Log"):
message = table.find("Message").text
category = table.find("ErrorType").text
line = table.find("Line").text
file_ = table.find("File").text
defect_id = table.find("ErrorCode").text
cwe = table.find("CWECode")
cwe_id = None
if cwe is not None:
cwe_id = cwe.text.lower().replace("cwe-", "")
misra = table.find("MISRA")
if misra is not None:
message += " [%s]" % misra.text
severity = DefectItem.SEVERITY_LOW
if category == "error":
severity = DefectItem.SEVERITY_HIGH
elif category == "warning":
severity = DefectItem.SEVERITY_MEDIUM
defects.append(
DefectItem(
severity, category, message, file_, line, id=defect_id, cwe=cwe_id
)
)
return defects
def configure_command(self, src_file): # pylint: disable=arguments-differ
if os.path.isfile(self._tmp_output_file):
os.remove(self._tmp_output_file)
if not os.path.isfile(self._tmp_preprocessed_file):
click.echo(
"Error: Missing preprocessed file '%s'" % (self._tmp_preprocessed_file)
)
return ""
cmd = [
self.tool_path,
"--skip-cl-exe",
"yes",
"--language",
"C" if src_file.endswith(".c") else "C++",
"--preprocessor",
"gcc",
"--cfg",
self._tmp_cfg_file,
"--source-file",
src_file,
"--i-file",
self._tmp_preprocessed_file,
"--output-file",
self._tmp_output_file,
]
flags = self.get_flags("pvs-studio")
if not self.is_flag_set("--platform", flags):
cmd.append("--platform=arm")
cmd.extend(flags)
return cmd
def _generate_tmp_file_path(self):
# pylint: disable=protected-access
return os.path.join(self._tmp_dir, next(tempfile._get_candidate_names()))
def _prepare_preprocessed_file(self, src_file):
flags = self.cxx_flags
compiler = self.cxx_path
if src_file.endswith(".c"):
flags = self.cc_flags
compiler = self.cc_path
cmd = [compiler, src_file, "-E", "-o", self._tmp_preprocessed_file]
cmd.extend([f for f in flags if f])
cmd.extend(["-D%s" % d for d in self.cpp_defines])
cmd.append('@"%s"' % self._tmp_cmd_file)
result = proc.exec_command(" ".join(cmd), shell=True)
if result["returncode"] != 0:
if self.options.get("verbose"):
click.echo(" ".join(cmd))
click.echo(result["err"])
self._bad_input = True
def clean_up(self):
if os.path.isdir(self._tmp_dir):
shutil.rmtree(self._tmp_dir)
def check(self, on_defect_callback=None):
self._on_defect_callback = on_defect_callback
src_files = [
f for f in self.get_project_target_files() if not f.endswith((".h", ".hpp"))
]
for src_file in src_files:
self._prepare_preprocessed_file(src_file)
cmd = self.configure_command(src_file)
if self.options.get("verbose"):
click.echo(" ".join(cmd))
if not cmd:
self._bad_input = True
continue
result = proc.exec_command(cmd)
# pylint: disable=unsupported-membership-test
if result["returncode"] != 0 or "License was not entered" in result["err"]:
self._bad_input = True
click.echo(result["err"])
continue
self._process_defects(self.parse_defects(self._tmp_output_file))
self.clean_up()
return self._bad_input

View File

@@ -14,16 +14,16 @@
from glob import glob
from os import getenv, makedirs, remove
from os.path import abspath, basename, expanduser, isdir, isfile, join
from os.path import basename, isdir, isfile, join, realpath
from shutil import copyfile, copytree
from tempfile import mkdtemp
import click
from platformio import app, fs
from platformio.commands.init import cli as cmd_init
from platformio.commands.init import validate_boards
from platformio.commands.run import cli as cmd_run
from platformio.commands.project import project_init as cmd_project_init
from platformio.commands.project import validate_boards
from platformio.commands.run.command import cli as cmd_run
from platformio.compat import glob_escape
from platformio.exception import CIBuildEnvsEmpty
from platformio.project.config import ProjectConfig
@@ -34,8 +34,8 @@ def validate_path(ctx, param, value): # pylint: disable=unused-argument
value = list(value)
for i, p in enumerate(value):
if p.startswith("~"):
value[i] = expanduser(p)
value[i] = abspath(value[i])
value[i] = fs.expanduser(p)
value[i] = realpath(value[i])
if not glob(value[i]):
invalid_path = p
break
@@ -48,37 +48,37 @@ def validate_path(ctx, param, value): # pylint: disable=unused-argument
@click.command("ci", short_help="Continuous Integration")
@click.argument("src", nargs=-1, callback=validate_path)
@click.option("-l",
"--lib",
multiple=True,
callback=validate_path,
metavar="DIRECTORY")
@click.option("-l", "--lib", multiple=True, callback=validate_path, metavar="DIRECTORY")
@click.option("--exclude", multiple=True)
@click.option("-b",
"--board",
multiple=True,
metavar="ID",
callback=validate_boards)
@click.option("--build-dir",
default=mkdtemp,
type=click.Path(file_okay=False,
dir_okay=True,
writable=True,
resolve_path=True))
@click.option("-b", "--board", multiple=True, metavar="ID", callback=validate_boards)
@click.option(
"--build-dir",
default=mkdtemp,
type=click.Path(file_okay=False, dir_okay=True, writable=True, resolve_path=True),
)
@click.option("--keep-build-dir", is_flag=True)
@click.option("-c",
"--project-conf",
type=click.Path(exists=True,
file_okay=True,
dir_okay=False,
readable=True,
resolve_path=True))
@click.option(
"-c",
"--project-conf",
type=click.Path(
exists=True, file_okay=True, dir_okay=False, readable=True, resolve_path=True
),
)
@click.option("-O", "--project-option", multiple=True)
@click.option("-v", "--verbose", is_flag=True)
@click.pass_context
def cli( # pylint: disable=too-many-arguments, too-many-branches
ctx, src, lib, exclude, board, build_dir, keep_build_dir, project_conf,
project_option, verbose):
ctx,
src,
lib,
exclude,
board,
build_dir,
keep_build_dir,
project_conf,
project_option,
verbose,
):
if not src and getenv("PLATFORMIO_CI_SRC"):
src = validate_path(ctx, None, getenv("PLATFORMIO_CI_SRC").split(":"))
@@ -110,10 +110,12 @@ def cli( # pylint: disable=too-many-arguments, too-many-branches
_exclude_contents(build_dir, exclude)
# initialise project
ctx.invoke(cmd_init,
project_dir=build_dir,
board=board,
project_option=project_option)
ctx.invoke(
cmd_project_init,
project_dir=build_dir,
board=board,
project_option=project_option,
)
# process project
ctx.invoke(cmd_run, project_dir=build_dir, verbose=verbose)
@@ -127,27 +129,27 @@ def _copy_contents(dst_dir, contents):
for path in contents:
if isdir(path):
items['dirs'].add(path)
items["dirs"].add(path)
elif isfile(path):
items['files'].add(path)
items["files"].add(path)
dst_dir_name = basename(dst_dir)
if dst_dir_name == "src" and len(items['dirs']) == 1:
copytree(list(items['dirs']).pop(), dst_dir, symlinks=True)
if dst_dir_name == "src" and len(items["dirs"]) == 1:
copytree(list(items["dirs"]).pop(), dst_dir, symlinks=True)
else:
if not isdir(dst_dir):
makedirs(dst_dir)
for d in items['dirs']:
for d in items["dirs"]:
copytree(d, join(dst_dir, basename(d)), symlinks=True)
if not items['files']:
if not items["files"]:
return
if dst_dir_name == "lib":
dst_dir = join(dst_dir, mkdtemp(dir=dst_dir))
for f in items['files']:
for f in items["files"]:
dst_file = join(dst_dir, basename(f))
if f == dst_file:
continue
@@ -159,7 +161,7 @@ def _exclude_contents(dst_dir, patterns):
for p in patterns:
contents += glob(join(glob_escape(dst_dir), p))
for path in contents:
path = abspath(path)
path = realpath(path)
if isdir(path):
fs.rmtree(path)
elif isfile(path):

View File

@@ -11,5 +11,3 @@
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from platformio.commands.debug.command import cli

View File

@@ -12,29 +12,28 @@
# See the License for the specific language governing permissions and
# limitations under the License.
import json
import os
import re
import signal
import time
from hashlib import sha1
from os.path import abspath, basename, dirname, isdir, join, splitext
from os.path import basename, dirname, isdir, join, realpath, splitext
from tempfile import mkdtemp
from twisted.internet import defer # pylint: disable=import-error
from twisted.internet import protocol # pylint: disable=import-error
from twisted.internet import reactor # pylint: disable=import-error
from twisted.internet import stdio # pylint: disable=import-error
from twisted.internet import task # pylint: disable=import-error
from platformio import app, exception, fs, proc, util
from platformio.commands.debug import helpers, initcfgs
from platformio import app, fs, proc, telemetry, util
from platformio.commands.debug import helpers
from platformio.commands.debug.exception import DebugInvalidOptionsError
from platformio.commands.debug.initcfgs import get_gdb_init_config
from platformio.commands.debug.process import BaseProcess
from platformio.commands.debug.server import DebugServer
from platformio.compat import hashlib_encode_data
from platformio.compat import hashlib_encode_data, is_bytes
from platformio.project.helpers import get_project_cache_dir
from platformio.telemetry import MeasurementProtocol
LOG_FILE = None
class GDBClient(BaseProcess): # pylint: disable=too-many-instance-attributes
@@ -43,6 +42,7 @@ class GDBClient(BaseProcess): # pylint: disable=too-many-instance-attributes
INIT_COMPLETED_BANNER = "PlatformIO: Initialization completed"
def __init__(self, project_dir, args, debug_options, env_options):
super(GDBClient, self).__init__()
self.project_dir = project_dir
self.args = list(args)
self.debug_options = debug_options
@@ -53,13 +53,13 @@ class GDBClient(BaseProcess): # pylint: disable=too-many-instance-attributes
if not isdir(get_project_cache_dir()):
os.makedirs(get_project_cache_dir())
self._gdbsrc_dir = mkdtemp(dir=get_project_cache_dir(),
prefix=".piodebug-")
self._gdbsrc_dir = mkdtemp(dir=get_project_cache_dir(), prefix=".piodebug-")
self._target_is_run = False
self._last_server_activity = 0
self._auto_continue_timer = None
self._errors_buffer = b""
@defer.inlineCallbacks
def spawn(self, gdb_path, prog_path):
session_hash = gdb_path + prog_path
self._session_id = sha1(hashlib_encode_data(session_hash)).hexdigest()
@@ -70,95 +70,84 @@ class GDBClient(BaseProcess): # pylint: disable=too-many-instance-attributes
"PROG_PATH": prog_path,
"PROG_DIR": dirname(prog_path),
"PROG_NAME": basename(splitext(prog_path)[0]),
"DEBUG_PORT": self.debug_options['port'],
"UPLOAD_PROTOCOL": self.debug_options['upload_protocol'],
"INIT_BREAK": self.debug_options['init_break'] or "",
"LOAD_CMDS": "\n".join(self.debug_options['load_cmds'] or []),
"DEBUG_PORT": self.debug_options["port"],
"UPLOAD_PROTOCOL": self.debug_options["upload_protocol"],
"INIT_BREAK": self.debug_options["init_break"] or "",
"LOAD_CMDS": "\n".join(self.debug_options["load_cmds"] or []),
}
self._debug_server.spawn(patterns)
yield self._debug_server.spawn(patterns)
if not patterns["DEBUG_PORT"]:
patterns["DEBUG_PORT"] = self._debug_server.get_debug_port()
if not patterns['DEBUG_PORT']:
patterns['DEBUG_PORT'] = self._debug_server.get_debug_port()
self.generate_pioinit(self._gdbsrc_dir, patterns)
# start GDB client
args = [
"piogdb",
"-q",
"--directory", self._gdbsrc_dir,
"--directory", self.project_dir,
"-l", "10"
] # yapf: disable
"--directory",
self._gdbsrc_dir,
"--directory",
self.project_dir,
"-l",
"10",
]
args.extend(self.args)
if not gdb_path:
raise exception.DebugInvalidOptions("GDB client is not configured")
raise DebugInvalidOptionsError("GDB client is not configured")
gdb_data_dir = self._get_data_dir(gdb_path)
if gdb_data_dir:
args.extend(["--data-directory", gdb_data_dir])
args.append(patterns['PROG_PATH'])
args.append(patterns["PROG_PATH"])
return reactor.spawnProcess(self,
gdb_path,
args,
path=self.project_dir,
env=os.environ)
transport = reactor.spawnProcess(
self, gdb_path, args, path=self.project_dir, env=os.environ
)
defer.returnValue(transport)
@staticmethod
def _get_data_dir(gdb_path):
if "msp430" in gdb_path:
return None
gdb_data_dir = abspath(join(dirname(gdb_path), "..", "share", "gdb"))
gdb_data_dir = realpath(join(dirname(gdb_path), "..", "share", "gdb"))
return gdb_data_dir if isdir(gdb_data_dir) else None
def generate_pioinit(self, dst_dir, patterns):
server_exe = (self.debug_options.get("server")
or {}).get("executable", "").lower()
if "jlink" in server_exe:
cfg = initcfgs.GDB_JLINK_INIT_CONFIG
elif "st-util" in server_exe:
cfg = initcfgs.GDB_STUTIL_INIT_CONFIG
elif "mspdebug" in server_exe:
cfg = initcfgs.GDB_MSPDEBUG_INIT_CONFIG
elif "qemu" in server_exe:
cfg = initcfgs.GDB_QEMU_INIT_CONFIG
elif self.debug_options['require_debug_port']:
cfg = initcfgs.GDB_BLACKMAGIC_INIT_CONFIG
else:
cfg = initcfgs.GDB_DEFAULT_INIT_CONFIG
commands = cfg.split("\n")
# default GDB init commands depending on debug tool
commands = get_gdb_init_config(self.debug_options).split("\n")
if self.debug_options['init_cmds']:
commands = self.debug_options['init_cmds']
commands.extend(self.debug_options['extra_cmds'])
if self.debug_options["init_cmds"]:
commands = self.debug_options["init_cmds"]
commands.extend(self.debug_options["extra_cmds"])
if not any("define pio_reset_target" in cmd for cmd in commands):
if not any("define pio_reset_run_target" in cmd for cmd in commands):
commands = [
"define pio_reset_target",
" echo Warning! Undefined pio_reset_target command\\n",
" mon reset",
"end"
] + commands # yapf: disable
"define pio_reset_run_target",
" echo Warning! Undefined pio_reset_run_target command\\n",
" monitor reset",
"end",
] + commands
if not any("define pio_reset_halt_target" in cmd for cmd in commands):
commands = [
"define pio_reset_halt_target",
" echo Warning! Undefined pio_reset_halt_target command\\n",
" mon reset halt",
"end"
] + commands # yapf: disable
" monitor reset halt",
"end",
] + commands
if not any("define pio_restart_target" in cmd for cmd in commands):
commands += [
"define pio_restart_target",
" pio_reset_halt_target",
" $INIT_BREAK",
" %s" % ("continue" if patterns['INIT_BREAK'] else "next"),
"end"
] # yapf: disable
" %s" % ("continue" if patterns["INIT_BREAK"] else "next"),
"end",
]
banner = [
"echo PlatformIO Unified Debugger -> http://bit.ly/pio-debug\\n",
"echo PlatformIO: debug_tool = %s\\n" % self.debug_options['tool'],
"echo PlatformIO: Initializing remote target...\\n"
"echo PlatformIO: debug_tool = %s\\n" % self.debug_options["tool"],
"echo PlatformIO: Initializing remote target...\\n",
]
footer = ["echo %s\\n" % self.INIT_COMPLETED_BANNER]
commands = banner + commands + footer
@@ -174,12 +163,7 @@ class GDBClient(BaseProcess): # pylint: disable=too-many-instance-attributes
stdio.StandardIO(p)
def onStdInData(self, data):
if LOG_FILE:
with open(LOG_FILE, "ab") as fp:
fp.write(data)
self._last_server_activity = time.time()
super(GDBClient, self).onStdInData(data)
if b"-exec-run" in data:
if self._target_is_run:
token, _ = data.split(b"-", 1)
@@ -192,7 +176,7 @@ class GDBClient(BaseProcess): # pylint: disable=too-many-instance-attributes
if b"-gdb-exit" in data or data.strip() in (b"q", b"quit"):
# Allow terminating via SIGINT/CTRL+C
signal.signal(signal.SIGINT, signal.default_int_handler)
self.transport.write(b"pio_reset_target\n")
self.transport.write(b"pio_reset_run_target\n")
self.transport.write(data)
def processEnded(self, reason): # pylint: disable=unused-argument
@@ -205,17 +189,14 @@ class GDBClient(BaseProcess): # pylint: disable=too-many-instance-attributes
reactor.stop()
def outReceived(self, data):
if LOG_FILE:
with open(LOG_FILE, "ab") as fp:
fp.write(data)
self._last_server_activity = time.time()
super(GDBClient, self).outReceived(data)
self._handle_error(data)
# go to init break automatically
if self.INIT_COMPLETED_BANNER.encode() in data:
self._auto_continue_timer = task.LoopingCall(
self._auto_exec_continue)
telemetry.send_event(
"Debug", "Started", telemetry.encode_run_environment(self.env_options)
)
self._auto_continue_timer = task.LoopingCall(self._auto_exec_continue)
self._auto_continue_timer.start(0.1)
def errReceived(self, data):
@@ -223,43 +204,46 @@ class GDBClient(BaseProcess): # pylint: disable=too-many-instance-attributes
self._handle_error(data)
def console_log(self, msg):
if helpers.is_mi_mode(self.args):
self.outReceived(('~"%s\\n"\n' % msg).encode())
else:
self.outReceived(("%s\n" % msg).encode())
if helpers.is_gdbmi_mode():
msg = helpers.escape_gdbmi_stream("~", msg)
self.outReceived(msg if is_bytes(msg) else msg.encode())
def _auto_exec_continue(self):
auto_exec_delay = 0.5 # in seconds
if self._last_server_activity > (time.time() - auto_exec_delay):
if self._last_activity > (time.time() - auto_exec_delay):
return
if self._auto_continue_timer:
self._auto_continue_timer.stop()
self._auto_continue_timer = None
if not self.debug_options['init_break'] or self._target_is_run:
if not self.debug_options["init_break"] or self._target_is_run:
return
self.console_log(
"PlatformIO: Resume the execution to `debug_init_break = %s`" %
self.debug_options['init_break'])
self.console_log("PlatformIO: More configuration options -> "
"http://bit.ly/pio-debug")
self.transport.write(b"0-exec-continue\n" if helpers.
is_mi_mode(self.args) else b"continue\n")
"PlatformIO: Resume the execution to `debug_init_break = %s`\n"
% self.debug_options["init_break"]
)
self.console_log(
"PlatformIO: More configuration options -> http://bit.ly/pio-debug\n"
)
self.transport.write(
b"0-exec-continue\n" if helpers.is_gdbmi_mode() else b"continue\n"
)
self._target_is_run = True
def _handle_error(self, data):
if (self.PIO_SRC_NAME.encode() not in data
or b"Error in sourced" not in data):
self._errors_buffer += data
if self.PIO_SRC_NAME.encode() not in data or b"Error in sourced" not in data:
return
configuration = {"debug": self.debug_options, "env": self.env_options}
exd = re.sub(r'\\(?!")', "/", json.dumps(configuration))
exd = re.sub(r'"(?:[a-z]\:)?((/[^"/]+)+)"',
lambda m: '"%s"' % join(*m.group(1).split("/")[-2:]), exd,
re.I | re.M)
mp = MeasurementProtocol()
mp['exd'] = "DebugGDBPioInitError: %s" % exd
mp['exf'] = 1
mp.send("exception")
last_erros = self._errors_buffer.decode()
last_erros = " ".join(reversed(last_erros.split("\n")))
last_erros = re.sub(r'((~|&)"|\\n\"|\\t)', " ", last_erros, flags=re.M)
err = "%s -> %s" % (
telemetry.encode_run_environment(self.env_options),
last_erros,
)
telemetry.send_exception("DebugInitError: %s" % err)
self.transport.loseConnection()
def _kill_previous_session(self):

View File

@@ -17,43 +17,47 @@
import os
import signal
from os.path import isfile, join
from os.path import isfile
import click
from platformio import exception, fs, proc, util
from platformio import app, exception, fs, proc, util
from platformio.commands.debug import helpers
from platformio.commands.debug.exception import DebugInvalidOptionsError
from platformio.managers.core import inject_contrib_pysite
from platformio.project.config import ProjectConfig
from platformio.project.helpers import (is_platformio_project,
load_project_ide_data)
from platformio.project.exception import ProjectEnvsNotAvailableError
from platformio.project.helpers import is_platformio_project, load_project_ide_data
@click.command("debug",
context_settings=dict(ignore_unknown_options=True),
short_help="PIO Unified Debugger")
@click.option("-d",
"--project-dir",
default=os.getcwd,
type=click.Path(exists=True,
file_okay=False,
dir_okay=True,
writable=True,
resolve_path=True))
@click.option("-c",
"--project-conf",
type=click.Path(exists=True,
file_okay=True,
dir_okay=False,
readable=True,
resolve_path=True))
@click.command(
"debug",
context_settings=dict(ignore_unknown_options=True),
short_help="PIO Unified Debugger",
)
@click.option(
"-d",
"--project-dir",
default=os.getcwd,
type=click.Path(
exists=True, file_okay=False, dir_okay=True, writable=True, resolve_path=True
),
)
@click.option(
"-c",
"--project-conf",
type=click.Path(
exists=True, file_okay=True, dir_okay=False, readable=True, resolve_path=True
),
)
@click.option("--environment", "-e", metavar="<environment>")
@click.option("--verbose", "-v", is_flag=True)
@click.option("--interface", type=click.Choice(["gdb"]))
@click.argument("__unprocessed", nargs=-1, type=click.UNPROCESSED)
@click.pass_context
def cli(ctx, project_dir, project_conf, environment, verbose, interface,
__unprocessed):
def cli(ctx, project_dir, project_conf, environment, verbose, interface, __unprocessed):
app.set_session_var("custom_project_conf", project_conf)
# use env variables from Eclipse or CLion
for sysenv in ("CWD", "PWD", "PLATFORMIO_PROJECT_DIR"):
if is_platformio_project(project_dir):
@@ -62,88 +66,92 @@ def cli(ctx, project_dir, project_conf, environment, verbose, interface,
project_dir = os.getenv(sysenv)
with fs.cd(project_dir):
config = ProjectConfig.get_instance(
project_conf or join(project_dir, "platformio.ini"))
config = ProjectConfig.get_instance(project_conf)
config.validate(envs=[environment] if environment else None)
env_name = environment or helpers.get_default_debug_env(config)
env_options = config.items(env=env_name, as_dict=True)
if not set(env_options.keys()) >= set(["platform", "board"]):
raise exception.ProjectEnvsNotAvailable()
raise ProjectEnvsNotAvailableError()
debug_options = helpers.validate_debug_options(ctx, env_options)
assert debug_options
if not interface:
return helpers.predebug_project(ctx, project_dir, env_name, False,
verbose)
return helpers.predebug_project(ctx, project_dir, env_name, False, verbose)
configuration = load_project_ide_data(project_dir, env_name)
if not configuration:
raise exception.DebugInvalidOptions(
"Could not load debug configuration")
raise DebugInvalidOptionsError("Could not load debug configuration")
if "--version" in __unprocessed:
result = proc.exec_command([configuration['gdb_path'], "--version"])
if result['returncode'] == 0:
return click.echo(result['out'])
raise exception.PlatformioException("\n".join(
[result['out'], result['err']]))
result = proc.exec_command([configuration["gdb_path"], "--version"])
if result["returncode"] == 0:
return click.echo(result["out"])
raise exception.PlatformioException("\n".join([result["out"], result["err"]]))
try:
fs.ensure_udev_rules()
except exception.InvalidUdevRules as e:
for line in str(e).split("\n") + [""]:
click.echo(
('~"%s\\n"' if helpers.is_mi_mode(__unprocessed) else "%s") %
line)
click.echo(
helpers.escape_gdbmi_stream("~", str(e) + "\n")
if helpers.is_gdbmi_mode()
else str(e) + "\n",
nl=False,
)
debug_options['load_cmds'] = helpers.configure_esp32_load_cmds(
debug_options, configuration)
debug_options["load_cmds"] = helpers.configure_esp32_load_cmds(
debug_options, configuration
)
rebuild_prog = False
preload = debug_options['load_cmds'] == ["preload"]
load_mode = debug_options['load_mode']
preload = debug_options["load_cmds"] == ["preload"]
load_mode = debug_options["load_mode"]
if load_mode == "always":
rebuild_prog = (
preload
or not helpers.has_debug_symbols(configuration['prog_path']))
rebuild_prog = preload or not helpers.has_debug_symbols(
configuration["prog_path"]
)
elif load_mode == "modified":
rebuild_prog = (
helpers.is_prog_obsolete(configuration['prog_path'])
or not helpers.has_debug_symbols(configuration['prog_path']))
rebuild_prog = helpers.is_prog_obsolete(
configuration["prog_path"]
) or not helpers.has_debug_symbols(configuration["prog_path"])
else:
rebuild_prog = not isfile(configuration['prog_path'])
rebuild_prog = not isfile(configuration["prog_path"])
if preload or (not rebuild_prog and load_mode != "always"):
# don't load firmware through debug server
debug_options['load_cmds'] = []
debug_options["load_cmds"] = []
if rebuild_prog:
if helpers.is_mi_mode(__unprocessed):
click.echo('~"Preparing firmware for debugging...\\n"')
output = helpers.GDBBytesIO()
with util.capture_std_streams(output):
helpers.predebug_project(ctx, project_dir, env_name, preload,
verbose)
output.close()
if helpers.is_gdbmi_mode():
click.echo(
helpers.escape_gdbmi_stream(
"~", "Preparing firmware for debugging...\n"
),
nl=False,
)
stream = helpers.GDBMIConsoleStream()
with util.capture_std_streams(stream):
helpers.predebug_project(ctx, project_dir, env_name, preload, verbose)
stream.close()
else:
click.echo("Preparing firmware for debugging...")
helpers.predebug_project(ctx, project_dir, env_name, preload,
verbose)
helpers.predebug_project(ctx, project_dir, env_name, preload, verbose)
# save SHA sum of newly created prog
if load_mode == "modified":
helpers.is_prog_obsolete(configuration['prog_path'])
helpers.is_prog_obsolete(configuration["prog_path"])
if not isfile(configuration['prog_path']):
raise exception.DebugInvalidOptions("Program/firmware is missed")
if not isfile(configuration["prog_path"]):
raise DebugInvalidOptionsError("Program/firmware is missed")
# run debugging client
inject_contrib_pysite()
# pylint: disable=import-outside-toplevel
from platformio.commands.debug.client import GDBClient, reactor
client = GDBClient(project_dir, __unprocessed, debug_options, env_options)
client.spawn(configuration['gdb_path'], configuration['prog_path'])
client.spawn(configuration["gdb_path"], configuration["prog_path"])
signal.signal(signal.SIGINT, lambda *args, **kwargs: None)
reactor.run()
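To summarize the branching above, here is a condensed sketch of the rebuild decision only; helpers.has_debug_symbols() and helpers.is_prog_obsolete() are replaced with plain booleans, and the mapping of debug_load_mode values follows the code shown in this diff.

from os.path import isfile

def needs_rebuild(load_mode, prog_path, preload, has_debug_symbols, is_obsolete):
    # "always": rebuild when preloading or when the program lacks debug symbols
    if load_mode == "always":
        return preload or not has_debug_symbols
    # "modified": rebuild when the program changed or lacks debug symbols
    if load_mode == "modified":
        return is_obsolete or not has_debug_symbols
    # any other load mode: rebuild only if the program file is missing
    return not isfile(prog_path)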


@@ -0,0 +1,33 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from platformio.exception import PlatformioException, UserSideException
class DebugError(PlatformioException):
pass
class DebugSupportError(DebugError, UserSideException):
MESSAGE = (
"Currently, PlatformIO does not support debugging for `{0}`.\n"
"Please request support at https://github.com/platformio/"
"platformio-core/issues \nor visit -> https://docs.platformio.org"
"/page/plus/debugging.html"
)
class DebugInvalidOptionsError(DebugError, UserSideException):
pass


@@ -12,6 +12,7 @@
# See the License for the specific language governing permissions and
# limitations under the License.
import re
import sys
import time
from fnmatch import fnmatch
@@ -19,29 +20,48 @@ from hashlib import sha1
from io import BytesIO
from os.path import isfile
from platformio import exception, util
from platformio.commands.platform import \
platform_install as cmd_platform_install
from platformio.commands.run import cli as cmd_run
from platformio import exception, fs, util
from platformio.commands import PlatformioCLI
from platformio.commands.debug.exception import DebugInvalidOptionsError
from platformio.commands.platform import platform_install as cmd_platform_install
from platformio.commands.run.command import cli as cmd_run
from platformio.compat import is_bytes
from platformio.managers.platform import PlatformFactory
from platformio.project.config import ProjectConfig
from platformio.project.options import ProjectOptions
class GDBBytesIO(BytesIO): # pylint: disable=too-few-public-methods
class GDBMIConsoleStream(BytesIO): # pylint: disable=too-few-public-methods
STDOUT = sys.stdout
def write(self, text):
if "\n" in text:
for line in text.strip().split("\n"):
self.STDOUT.write('~"%s\\n"\n' % line)
else:
self.STDOUT.write('~"%s"' % text)
self.STDOUT.write(escape_gdbmi_stream("~", text))
self.STDOUT.flush()
def is_mi_mode(args):
return "--interpreter" in " ".join(args)
def is_gdbmi_mode():
return "--interpreter" in " ".join(PlatformioCLI.leftover_args)
def escape_gdbmi_stream(prefix, stream):
bytes_stream = False
if is_bytes(stream):
bytes_stream = True
stream = stream.decode()
if not stream:
return b"" if bytes_stream else ""
ends_nl = stream.endswith("\n")
stream = re.sub(r"\\+", "\\\\\\\\", stream)
stream = stream.replace('"', '\\"')
stream = stream.replace("\n", "\\n")
stream = '%s"%s"' % (prefix, stream)
if ends_nl:
stream += "\n"
return stream.encode() if bytes_stream else stream
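For reference, the console-record escaping above can be exercised on its own; a minimal standalone sketch (standard library only, mirroring the function shown here):

import re

def escape_gdbmi_stream(prefix, stream):
    # Wrap text as a GDB/MI stream record: backslashes and double quotes are
    # escaped, embedded newlines become \n, and the payload is quoted after
    # the record prefix ("~" for console output, "@" for target output).
    if not stream:
        return ""
    ends_nl = stream.endswith("\n")
    stream = re.sub(r"\\+", "\\\\\\\\", stream)
    stream = stream.replace('"', '\\"')
    stream = stream.replace("\n", "\\n")
    stream = '%s"%s"' % (prefix, stream)
    if ends_nl:
        stream += "\n"
    return stream

print(escape_gdbmi_stream("~", "Preparing firmware for debugging...\n"))
# -> ~"Preparing firmware for debugging...\n"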
def get_default_debug_env(config):
@@ -57,41 +77,41 @@ def get_default_debug_env(config):
def predebug_project(ctx, project_dir, env_name, preload, verbose):
ctx.invoke(cmd_run,
project_dir=project_dir,
environment=[env_name],
target=["debug"] + (["upload"] if preload else []),
verbose=verbose)
ctx.invoke(
cmd_run,
project_dir=project_dir,
environment=[env_name],
target=["debug"] + (["upload"] if preload else []),
verbose=verbose,
)
if preload:
time.sleep(5)
def validate_debug_options(cmd_ctx, env_options):
def _cleanup_cmds(items):
items = ProjectConfig.parse_multi_values(items)
return [
"$LOAD_CMDS" if item == "$LOAD_CMD" else item for item in items
]
return ["$LOAD_CMDS" if item == "$LOAD_CMD" else item for item in items]
try:
platform = PlatformFactory.newPlatform(env_options['platform'])
platform = PlatformFactory.newPlatform(env_options["platform"])
except exception.UnknownPlatform:
cmd_ctx.invoke(cmd_platform_install,
platforms=[env_options['platform']],
skip_default_package=True)
platform = PlatformFactory.newPlatform(env_options['platform'])
cmd_ctx.invoke(
cmd_platform_install,
platforms=[env_options["platform"]],
skip_default_package=True,
)
platform = PlatformFactory.newPlatform(env_options["platform"])
board_config = platform.board_config(env_options['board'])
board_config = platform.board_config(env_options["board"])
tool_name = board_config.get_debug_tool_name(env_options.get("debug_tool"))
tool_settings = board_config.get("debug", {}).get("tools",
{}).get(tool_name, {})
tool_settings = board_config.get("debug", {}).get("tools", {}).get(tool_name, {})
server_options = None
# specific server per a system
if isinstance(tool_settings.get("server", {}), list):
for item in tool_settings['server'][:]:
tool_settings['server'] = item
for item in tool_settings["server"][:]:
tool_settings["server"] = item
if util.get_systype() in item.get("system", []):
break
@@ -100,76 +120,101 @@ def validate_debug_options(cmd_ctx, env_options):
server_options = {
"cwd": None,
"executable": None,
"arguments": env_options.get("debug_server")
"arguments": env_options.get("debug_server"),
}
server_options['executable'] = server_options['arguments'][0]
server_options['arguments'] = server_options['arguments'][1:]
server_options["executable"] = server_options["arguments"][0]
server_options["arguments"] = server_options["arguments"][1:]
elif "server" in tool_settings:
server_package = tool_settings['server'].get("package")
server_package_dir = platform.get_package_dir(
server_package) if server_package else None
server_options = tool_settings["server"]
server_package = server_options.get("package")
server_package_dir = (
platform.get_package_dir(server_package) if server_package else None
)
if server_package and not server_package_dir:
platform.install_packages(with_packages=[server_package],
skip_default_package=True,
silent=True)
platform.install_packages(
with_packages=[server_package], skip_default_package=True, silent=True
)
server_package_dir = platform.get_package_dir(server_package)
server_options = dict(
cwd=server_package_dir if server_package else None,
executable=tool_settings['server'].get("executable"),
arguments=[
a.replace("$PACKAGE_DIR", server_package_dir)
if server_package_dir else a
for a in tool_settings['server'].get("arguments", [])
])
server_options.update(
dict(
cwd=server_package_dir if server_package else None,
executable=server_options.get("executable"),
arguments=[
a.replace("$PACKAGE_DIR", server_package_dir)
if server_package_dir
else a
for a in server_options.get("arguments", [])
],
)
)
extra_cmds = _cleanup_cmds(env_options.get("debug_extra_cmds"))
extra_cmds.extend(_cleanup_cmds(tool_settings.get("extra_cmds")))
result = dict(
tool=tool_name,
upload_protocol=env_options.get(
"upload_protocol",
board_config.get("upload", {}).get("protocol")),
"upload_protocol", board_config.get("upload", {}).get("protocol")
),
load_cmds=_cleanup_cmds(
env_options.get(
"debug_load_cmds",
tool_settings.get("load_cmds",
tool_settings.get("load_cmd", "load")))),
load_mode=env_options.get("debug_load_mode",
tool_settings.get("load_mode", "always")),
tool_settings.get(
"load_cmds",
tool_settings.get(
"load_cmd", ProjectOptions["env.debug_load_cmds"].default
),
),
)
),
load_mode=env_options.get(
"debug_load_mode",
tool_settings.get(
"load_mode", ProjectOptions["env.debug_load_mode"].default
),
),
init_break=env_options.get(
"debug_init_break", tool_settings.get("init_break",
"tbreak main")),
"debug_init_break",
tool_settings.get(
"init_break", ProjectOptions["env.debug_init_break"].default
),
),
init_cmds=_cleanup_cmds(
env_options.get("debug_init_cmds",
tool_settings.get("init_cmds"))),
env_options.get("debug_init_cmds", tool_settings.get("init_cmds"))
),
extra_cmds=extra_cmds,
require_debug_port=tool_settings.get("require_debug_port", False),
port=reveal_debug_port(
env_options.get("debug_port", tool_settings.get("port")),
tool_name, tool_settings),
server=server_options)
tool_name,
tool_settings,
),
server=server_options,
)
return result
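For orientation, the dictionary assembled above ends up with roughly this shape; all values below are hypothetical placeholders, not defaults.

debug_options = {
    "tool": "jlink",                 # resolved debug tool name
    "upload_protocol": "jlink",
    "load_cmds": ["load"],
    "load_mode": "always",
    "init_break": "tbreak main",
    "init_cmds": [],                 # tool/board specific GDB init commands
    "extra_cmds": [],                # debug_extra_cmds + tool "extra_cmds"
    "require_debug_port": False,
    "port": None,                    # filled by reveal_debug_port() when required
    "server": {
        "cwd": "<package dir>",
        "executable": "<gdb server binary>",
        "arguments": ["<args with $PACKAGE_DIR expanded>"],
    },
}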
def configure_esp32_load_cmds(debug_options, configuration):
ignore_conds = [
debug_options['load_cmds'] != ["load"],
debug_options["load_cmds"] != ["load"],
"xtensa-esp32" not in configuration.get("cc_path", ""),
not configuration.get("flash_extra_images"), not all([
isfile(item['path'])
for item in configuration.get("flash_extra_images")
])
not configuration.get("flash_extra_images"),
not all(
[isfile(item["path"]) for item in configuration.get("flash_extra_images")]
),
]
if any(ignore_conds):
return debug_options['load_cmds']
return debug_options["load_cmds"]
mon_cmds = [
'monitor program_esp32 "{{{path}}}" {offset} verify'.format(
path=item['path'], offset=item['offset'])
path=fs.to_unix_path(item["path"]), offset=item["offset"]
)
for item in configuration.get("flash_extra_images")
]
mon_cmds.append('monitor program_esp32 "{%s.bin}" 0x10000 verify' %
configuration['prog_path'][:-4])
mon_cmds.append(
'monitor program_esp32 "{%s.bin}" 0x10000 verify'
% fs.to_unix_path(configuration["prog_path"][:-4])
)
return mon_cmds
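For illustration, with a hypothetical ESP32 build the rewritten load commands look like this; the paths and offsets are made up, real values come from load_project_ide_data():

flash_extra_images = [
    {"path": "/tmp/demo/.pio/build/esp32dev/bootloader.bin", "offset": "0x1000"},
    {"path": "/tmp/demo/.pio/build/esp32dev/partitions.bin", "offset": "0x8000"},
]
prog_path = "/tmp/demo/.pio/build/esp32dev/firmware.elf"

mon_cmds = [
    'monitor program_esp32 "{{{path}}}" {offset} verify'.format(
        path=item["path"], offset=item["offset"]
    )
    for item in flash_extra_images
]
mon_cmds.append('monitor program_esp32 "{%s.bin}" 0x10000 verify' % prog_path[:-4])
# mon_cmds[-1] ->
#   monitor program_esp32 "{/tmp/demo/.pio/build/esp32dev/firmware.bin}" 0x10000 verify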
@@ -181,7 +226,7 @@ def has_debug_symbols(prog_path):
b".debug_abbrev": False,
b" -Og": False,
b" -g": False,
b"__PLATFORMIO_BUILD_DEBUG__": False
b"__PLATFORMIO_BUILD_DEBUG__": False,
}
with open(prog_path, "rb") as fp:
last_data = b""
@@ -212,7 +257,7 @@ def is_prog_obsolete(prog_path):
new_digest = shasum.hexdigest()
old_digest = None
if isfile(prog_hash_path):
with open(prog_hash_path, "r") as fp:
with open(prog_hash_path) as fp:
old_digest = fp.read()
if new_digest == old_digest:
return False
@@ -222,7 +267,6 @@ def is_prog_obsolete(prog_path):
def reveal_debug_port(env_debug_port, tool_name, tool_settings):
def _get_pattern():
if not env_debug_port:
return None
@@ -238,18 +282,21 @@ def reveal_debug_port(env_debug_port, tool_name, tool_settings):
def _look_for_serial_port(hwids):
for item in util.get_serialports(filter_hwid=True):
if not _is_match_pattern(item['port']):
if not _is_match_pattern(item["port"]):
continue
port = item['port']
port = item["port"]
if tool_name.startswith("blackmagic"):
if "windows" in util.get_systype() and \
port.startswith("COM") and len(port) > 4:
if (
"windows" in util.get_systype()
and port.startswith("COM")
and len(port) > 4
):
port = "\\\\.\\%s" % port
if "GDB" in item['description']:
if "GDB" in item["description"]:
return port
for hwid in hwids:
hwid_str = ("%s:%s" % (hwid[0], hwid[1])).replace("0x", "")
if hwid_str in item['hwid']:
if hwid_str in item["hwid"]:
return port
return None
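The Windows special case above only applies to Black Magic Probe ports numbered COM10 and higher, which must be addressed via the extended device path; a short illustration:

port = "COM12"
if port.startswith("COM") and len(port) > 4:
    port = "\\\\.\\%s" % port      # same expression as in the code above
print(port)                        # -> \\.\COM12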
@@ -260,6 +307,5 @@ def reveal_debug_port(env_debug_port, tool_name, tool_settings):
debug_port = _look_for_serial_port(tool_settings.get("hwids", []))
if not debug_port:
raise exception.DebugInvalidOptions(
"Please specify `debug_port` for environment")
raise DebugInvalidOptionsError("Please specify `debug_port` for environment")
return debug_port


@@ -17,50 +17,51 @@ define pio_reset_halt_target
monitor reset halt
end
define pio_reset_target
define pio_reset_run_target
monitor reset
end
target extended-remote $DEBUG_PORT
$INIT_BREAK
pio_reset_halt_target
$LOAD_CMDS
monitor init
$LOAD_CMDS
pio_reset_halt_target
$INIT_BREAK
"""
GDB_STUTIL_INIT_CONFIG = """
define pio_reset_halt_target
monitor halt
monitor reset
monitor halt
end
define pio_reset_target
define pio_reset_run_target
monitor reset
end
target extended-remote $DEBUG_PORT
$INIT_BREAK
pio_reset_halt_target
$LOAD_CMDS
pio_reset_halt_target
$INIT_BREAK
"""
GDB_JLINK_INIT_CONFIG = """
define pio_reset_halt_target
monitor halt
monitor reset
monitor halt
end
define pio_reset_target
define pio_reset_run_target
monitor clrbp
monitor reset
monitor go
end
target extended-remote $DEBUG_PORT
$INIT_BREAK
monitor clrbp
monitor speed auto
pio_reset_halt_target
$LOAD_CMDS
pio_reset_halt_target
$INIT_BREAK
"""
GDB_BLACKMAGIC_INIT_CONFIG = """
@@ -74,7 +75,7 @@ define pio_reset_halt_target
set language auto
end
define pio_reset_target
define pio_reset_run_target
pio_reset_halt_target
end
@@ -82,8 +83,8 @@ target extended-remote $DEBUG_PORT
monitor swdp_scan
attach 1
set mem inaccessible-by-default off
$INIT_BREAK
$LOAD_CMDS
$INIT_BREAK
set language c
set *0xE000ED0C = 0x05FA0004
@@ -98,14 +99,14 @@ GDB_MSPDEBUG_INIT_CONFIG = """
define pio_reset_halt_target
end
define pio_reset_target
define pio_reset_run_target
end
target extended-remote $DEBUG_PORT
$INIT_BREAK
monitor erase
$LOAD_CMDS
pio_reset_halt_target
$INIT_BREAK
"""
GDB_QEMU_INIT_CONFIG = """
@@ -113,12 +114,48 @@ define pio_reset_halt_target
monitor system_reset
end
define pio_reset_target
define pio_reset_run_target
monitor system_reset
end
target extended-remote $DEBUG_PORT
$LOAD_CMDS
pio_reset_halt_target
$INIT_BREAK
"""
GDB_RENODE_INIT_CONFIG = """
define pio_reset_halt_target
monitor machine Reset
$LOAD_CMDS
monitor start
end
define pio_reset_run_target
pio_reset_halt_target
end
target extended-remote $DEBUG_PORT
$INIT_BREAK
$LOAD_CMDS
pio_reset_halt_target
$INIT_BREAK
monitor start
"""
TOOL_TO_CONFIG = {
"jlink": GDB_JLINK_INIT_CONFIG,
"mspdebug": GDB_MSPDEBUG_INIT_CONFIG,
"qemu": GDB_QEMU_INIT_CONFIG,
"blackmagic": GDB_BLACKMAGIC_INIT_CONFIG,
"renode": GDB_RENODE_INIT_CONFIG,
}
def get_gdb_init_config(debug_options):
tool = debug_options.get("tool")
if tool and tool in TOOL_TO_CONFIG:
return TOOL_TO_CONFIG[tool]
server_exe = (debug_options.get("server") or {}).get("executable", "").lower()
if "st-util" in server_exe:
return GDB_STUTIL_INIT_CONFIG
return GDB_DEFAULT_INIT_CONFIG
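The selection order above is: an explicit tool match first, then a fallback based on the server executable name, then the default script. A rough standalone sketch with placeholder strings instead of the real GDB init templates:

TOOL_TO_CONFIG = {"jlink": "<jlink script>", "renode": "<renode script>"}  # abridged

def get_gdb_init_config(debug_options):
    tool = debug_options.get("tool")
    if tool and tool in TOOL_TO_CONFIG:
        return TOOL_TO_CONFIG[tool]
    server_exe = (debug_options.get("server") or {}).get("executable", "").lower()
    if "st-util" in server_exe:
        return "<st-util script>"
    return "<default script>"

assert get_gdb_init_config({"tool": "renode"}) == "<renode script>"
assert get_gdb_init_config({"tool": "custom",
                            "server": {"executable": "st-util"}}) == "<st-util script>"
assert get_gdb_init_config({"tool": "custom", "server": None}) == "<default script>"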


@@ -13,31 +13,39 @@
# limitations under the License.
import signal
import time
import click
from twisted.internet import protocol # pylint: disable=import-error
from platformio import fs
from platformio.compat import string_types
from platformio.proc import get_pythonexe_path
from platformio.project.helpers import get_project_core_dir
LOG_FILE = None
class BaseProcess(protocol.ProcessProtocol, object):
STDOUT_CHUNK_SIZE = 2048
LOG_FILE = None
COMMON_PATTERNS = {
"PLATFORMIO_HOME_DIR": get_project_core_dir(),
"PLATFORMIO_CORE_DIR": get_project_core_dir(),
"PYTHONEXE": get_pythonexe_path()
"PYTHONEXE": get_pythonexe_path(),
}
def __init__(self):
self._last_activity = 0
def apply_patterns(self, source, patterns=None):
_patterns = self.COMMON_PATTERNS.copy()
_patterns.update(patterns or {})
for key, value in _patterns.items():
if key.endswith(("_DIR", "_PATH")):
_patterns[key] = fs.to_unix_path(value)
def _replace(text):
for key, value in _patterns.items():
pattern = "$%s" % key
@@ -47,8 +55,7 @@ class BaseProcess(protocol.ProcessProtocol, object):
if isinstance(source, string_types):
source = _replace(source)
elif isinstance(source, (list, dict)):
items = enumerate(source) if isinstance(source,
list) else source.items()
items = enumerate(source) if isinstance(source, list) else source.items()
for key, value in items:
if isinstance(value, string_types):
source[key] = _replace(value)
@@ -57,23 +64,30 @@ class BaseProcess(protocol.ProcessProtocol, object):
return source
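The hunk above elides the body of _replace(); conceptually the substitution works like the following simplified sketch (hypothetical pattern values; the real method also normalizes *_DIR/*_PATH values to forward slashes and handles Python 2 string types):

def apply_patterns(source, patterns):
    # Simplified sketch: replace "$KEY" placeholders in strings, lists and dicts
    def _replace(text):
        for key, value in patterns.items():
            text = text.replace("$%s" % key, value)
        return text

    if isinstance(source, str):
        return _replace(source)
    if isinstance(source, (list, dict)):
        items = enumerate(source) if isinstance(source, list) else source.items()
        for key, value in items:
            if isinstance(value, str):
                source[key] = _replace(value)
    return source

patterns = {"PLATFORMIO_CORE_DIR": "/home/dev/.platformio",
            "PYTHONEXE": "/usr/bin/python3"}            # hypothetical values
print(apply_patterns(["$PYTHONEXE", "-m", "some_gdb_server"], patterns))
# -> ['/usr/bin/python3', '-m', 'some_gdb_server']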
def onStdInData(self, data):
self._last_activity = time.time()
if self.LOG_FILE:
with open(self.LOG_FILE, "ab") as fp:
fp.write(data)
def outReceived(self, data):
if LOG_FILE:
with open(LOG_FILE, "ab") as fp:
self._last_activity = time.time()
if self.LOG_FILE:
with open(self.LOG_FILE, "ab") as fp:
fp.write(data)
while data:
chunk = data[:self.STDOUT_CHUNK_SIZE]
chunk = data[: self.STDOUT_CHUNK_SIZE]
click.echo(chunk, nl=False)
data = data[self.STDOUT_CHUNK_SIZE:]
data = data[self.STDOUT_CHUNK_SIZE :]
@staticmethod
def errReceived(data):
if LOG_FILE:
with open(LOG_FILE, "ab") as fp:
def errReceived(self, data):
self._last_activity = time.time()
if self.LOG_FILE:
with open(self.LOG_FILE, "ab") as fp:
fp.write(data)
click.echo(data, nl=False, err=True)
@staticmethod
def processEnded(_):
def processEnded(self, _):
self._last_activity = time.time()
# Allow terminating via SIGINT/CTRL+C
signal.signal(signal.SIGINT, signal.default_int_handler)


@@ -13,102 +13,146 @@
# limitations under the License.
import os
import time
from os.path import isdir, isfile, join
from twisted.internet import error # pylint: disable=import-error
from twisted.internet import defer # pylint: disable=import-error
from twisted.internet import reactor # pylint: disable=import-error
from platformio import exception, util
from platformio import fs, util
from platformio.commands.debug.exception import DebugInvalidOptionsError
from platformio.commands.debug.helpers import escape_gdbmi_stream, is_gdbmi_mode
from platformio.commands.debug.process import BaseProcess
from platformio.proc import where_is_program
class DebugServer(BaseProcess):
def __init__(self, debug_options, env_options):
super(DebugServer, self).__init__()
self.debug_options = debug_options
self.env_options = env_options
self._debug_port = None
self._debug_port = ":3333"
self._transport = None
self._process_ended = False
self._ready = False
@defer.inlineCallbacks
def spawn(self, patterns): # pylint: disable=too-many-branches
systype = util.get_systype()
server = self.debug_options.get("server")
if not server:
return None
defer.returnValue(None)
server = self.apply_patterns(server, patterns)
server_executable = server['executable']
server_executable = server["executable"]
if not server_executable:
return None
if server['cwd']:
server_executable = join(server['cwd'], server_executable)
if ("windows" in systype and not server_executable.endswith(".exe")
and isfile(server_executable + ".exe")):
defer.returnValue(None)
if server["cwd"]:
server_executable = join(server["cwd"], server_executable)
if (
"windows" in systype
and not server_executable.endswith(".exe")
and isfile(server_executable + ".exe")
):
server_executable = server_executable + ".exe"
if not isfile(server_executable):
server_executable = where_is_program(server_executable)
if not isfile(server_executable):
raise exception.DebugInvalidOptions(
raise DebugInvalidOptionsError(
"\nCould not launch Debug Server '%s'. Please check that it "
"is installed and is included in a system PATH\n\n"
"See documentation or contact contact@platformio.org:\n"
"http://docs.platformio.org/page/plus/debugging.html\n" %
server_executable)
"https://docs.platformio.org/page/plus/debugging.html\n"
% server_executable
)
self._debug_port = ":3333"
openocd_pipe_allowed = all([
not self.debug_options['port'],
"openocd" in server_executable
]) # yapf: disable
openocd_pipe_allowed = all(
[not self.debug_options["port"], "openocd" in server_executable]
)
if openocd_pipe_allowed:
args = []
if server['cwd']:
args.extend(["-s", server['cwd']])
args.extend([
"-c", "gdb_port pipe; tcl_port disabled; telnet_port disabled"
])
args.extend(server['arguments'])
if server["cwd"]:
args.extend(["-s", server["cwd"]])
args.extend(
["-c", "gdb_port pipe; tcl_port disabled; telnet_port disabled"]
)
args.extend(server["arguments"])
str_args = " ".join(
[arg if arg.startswith("-") else '"%s"' % arg for arg in args])
[arg if arg.startswith("-") else '"%s"' % arg for arg in args]
)
self._debug_port = '| "%s" %s' % (server_executable, str_args)
self._debug_port = self._debug_port.replace("\\", "\\\\")
else:
env = os.environ.copy()
# prepend server "lib" folder to LD path
if ("windows" not in systype and server['cwd']
and isdir(join(server['cwd'], "lib"))):
ld_key = ("DYLD_LIBRARY_PATH"
if "darwin" in systype else "LD_LIBRARY_PATH")
env[ld_key] = join(server['cwd'], "lib")
if os.environ.get(ld_key):
env[ld_key] = "%s:%s" % (env[ld_key],
os.environ.get(ld_key))
# prepend BIN to PATH
if server['cwd'] and isdir(join(server['cwd'], "bin")):
env['PATH'] = "%s%s%s" % (
join(server['cwd'], "bin"), os.pathsep,
os.environ.get("PATH", os.environ.get("Path", "")))
self._debug_port = fs.to_unix_path(self._debug_port)
defer.returnValue(self._debug_port)
self._transport = reactor.spawnProcess(
self,
server_executable, [server_executable] + server['arguments'],
path=server['cwd'],
env=env)
if "mspdebug" in server_executable.lower():
self._debug_port = ":2000"
elif "jlink" in server_executable.lower():
self._debug_port = ":2331"
elif "qemu" in server_executable.lower():
self._debug_port = ":1234"
env = os.environ.copy()
# prepend server "lib" folder to LD path
if (
"windows" not in systype
and server["cwd"]
and isdir(join(server["cwd"], "lib"))
):
ld_key = "DYLD_LIBRARY_PATH" if "darwin" in systype else "LD_LIBRARY_PATH"
env[ld_key] = join(server["cwd"], "lib")
if os.environ.get(ld_key):
env[ld_key] = "%s:%s" % (env[ld_key], os.environ.get(ld_key))
# prepend BIN to PATH
if server["cwd"] and isdir(join(server["cwd"], "bin")):
env["PATH"] = "%s%s%s" % (
join(server["cwd"], "bin"),
os.pathsep,
os.environ.get("PATH", os.environ.get("Path", "")),
)
return self._transport
self._transport = reactor.spawnProcess(
self,
server_executable,
[server_executable] + server["arguments"],
path=server["cwd"],
env=env,
)
if "mspdebug" in server_executable.lower():
self._debug_port = ":2000"
elif "jlink" in server_executable.lower():
self._debug_port = ":2331"
elif "qemu" in server_executable.lower():
self._debug_port = ":1234"
yield self._wait_until_ready()
defer.returnValue(self._debug_port)
@defer.inlineCallbacks
def _wait_until_ready(self):
timeout = 10
elapsed = 0
delay = 0.5
auto_ready_delay = 0.5
while not self._ready and not self._process_ended and elapsed < timeout:
yield self.async_sleep(delay)
if not self.debug_options.get("server", {}).get("ready_pattern"):
self._ready = self._last_activity < (time.time() - auto_ready_delay)
elapsed += delay
@staticmethod
def async_sleep(secs):
d = defer.Deferred()
reactor.callLater(secs, d.callback, None)
return d
def get_debug_port(self):
return self._debug_port
def outReceived(self, data):
super(DebugServer, self).outReceived(
escape_gdbmi_stream("@", data) if is_gdbmi_mode() else data
)
if self._ready:
return
ready_pattern = self.debug_options.get("server", {}).get("ready_pattern")
if ready_pattern:
self._ready = ready_pattern.encode() in data
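Putting _wait_until_ready() and outReceived() together, the readiness rule is: if the tool defines a server "ready_pattern", wait until that pattern appears in the server output; otherwise treat roughly half a second of output silence as "ready". A synchronous sketch of that heuristic (the real code runs on Twisted deferreds; read_output() is a hypothetical helper returning new server output as bytes):

import time

def wait_until_server_ready(read_output, ready_pattern=None,
                            timeout=10, delay=0.5, auto_ready_delay=0.5):
    last_activity = time.time()
    elapsed = 0
    while elapsed < timeout:
        data = read_output()
        if data:
            last_activity = time.time()
            if ready_pattern and ready_pattern.encode() in data:
                return True      # explicit pattern matched
        if not ready_pattern and last_activity < (time.time() - auto_ready_delay):
            return True          # no pattern configured: quiet long enough
        time.sleep(delay)
        elapsed += delay
    return False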
def processEnded(self, reason):
self._process_ended = True
super(DebugServer, self).processEnded(reason)
@@ -118,5 +162,5 @@ class DebugServer(BaseProcess):
return
try:
self._transport.signalProcess("KILL")
except (OSError, error.ProcessExitedAlready):
except: # pylint: disable=bare-except
pass


@@ -1,221 +0,0 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import sys
from fnmatch import fnmatch
from os import getcwd
from os.path import join
import click
from serial.tools import miniterm
from platformio import exception, util
from platformio.compat import dump_json_to_unicode
from platformio.project.config import ProjectConfig
@click.group(short_help="Monitor device or list existing")
def cli():
pass
@cli.command("list", short_help="List devices")
@click.option("--serial", is_flag=True, help="List serial ports, default")
@click.option("--logical", is_flag=True, help="List logical devices")
@click.option("--mdns", is_flag=True, help="List multicast DNS services")
@click.option("--json-output", is_flag=True)
def device_list( # pylint: disable=too-many-branches
serial, logical, mdns, json_output):
if not logical and not mdns:
serial = True
data = {}
if serial:
data['serial'] = util.get_serial_ports()
if logical:
data['logical'] = util.get_logical_devices()
if mdns:
data['mdns'] = util.get_mdns_services()
single_key = list(data)[0] if len(list(data)) == 1 else None
if json_output:
return click.echo(
dump_json_to_unicode(data[single_key] if single_key else data))
titles = {
"serial": "Serial Ports",
"logical": "Logical Devices",
"mdns": "Multicast DNS Services"
}
for key, value in data.items():
if not single_key:
click.secho(titles[key], bold=True)
click.echo("=" * len(titles[key]))
if key == "serial":
for item in value:
click.secho(item['port'], fg="cyan")
click.echo("-" * len(item['port']))
click.echo("Hardware ID: %s" % item['hwid'])
click.echo("Description: %s" % item['description'])
click.echo("")
if key == "logical":
for item in value:
click.secho(item['path'], fg="cyan")
click.echo("-" * len(item['path']))
click.echo("Name: %s" % item['name'])
click.echo("")
if key == "mdns":
for item in value:
click.secho(item['name'], fg="cyan")
click.echo("-" * len(item['name']))
click.echo("Type: %s" % item['type'])
click.echo("IP: %s" % item['ip'])
click.echo("Port: %s" % item['port'])
if item['properties']:
click.echo("Properties: %s" % ("; ".join([
"%s=%s" % (k, v)
for k, v in item['properties'].items()
])))
click.echo("")
if single_key:
click.echo("")
return True
@cli.command("monitor", short_help="Monitor device (Serial)")
@click.option("--port", "-p", help="Port, a number or a device name")
@click.option("--baud", "-b", type=int, help="Set baud rate, default=9600")
@click.option("--parity",
default="N",
type=click.Choice(["N", "E", "O", "S", "M"]),
help="Set parity, default=N")
@click.option("--rtscts",
is_flag=True,
help="Enable RTS/CTS flow control, default=Off")
@click.option("--xonxoff",
is_flag=True,
help="Enable software flow control, default=Off")
@click.option("--rts",
default=None,
type=click.IntRange(0, 1),
help="Set initial RTS line state")
@click.option("--dtr",
default=None,
type=click.IntRange(0, 1),
help="Set initial DTR line state")
@click.option("--echo", is_flag=True, help="Enable local echo, default=Off")
@click.option("--encoding",
default="UTF-8",
help="Set the encoding for the serial port (e.g. hexlify, "
"Latin1, UTF-8), default: UTF-8")
@click.option("--filter", "-f", multiple=True, help="Add text transformation")
@click.option("--eol",
default="CRLF",
type=click.Choice(["CR", "LF", "CRLF"]),
help="End of line mode, default=CRLF")
@click.option("--raw",
is_flag=True,
help="Do not apply any encodings/transformations")
@click.option("--exit-char",
type=int,
default=3,
help="ASCII code of special character that is used to exit "
"the application, default=3 (Ctrl+C)")
@click.option("--menu-char",
type=int,
default=20,
help="ASCII code of special character that is used to "
"control miniterm (menu), default=20 (DEC)")
@click.option("--quiet",
is_flag=True,
help="Diagnostics: suppress non-error messages, default=Off")
@click.option("-d",
"--project-dir",
default=getcwd,
type=click.Path(exists=True,
file_okay=False,
dir_okay=True,
resolve_path=True))
@click.option(
"-e",
"--environment",
help="Load configuration from `platformio.ini` and specified environment")
def device_monitor(**kwargs): # pylint: disable=too-many-branches
env_options = {}
try:
env_options = get_project_options(kwargs['project_dir'],
kwargs['environment'])
for k in ("port", "speed", "rts", "dtr"):
k2 = "monitor_%s" % k
if k == "speed":
k = "baud"
if kwargs[k] is None and k2 in env_options:
kwargs[k] = env_options[k2]
if k != "port":
kwargs[k] = int(kwargs[k])
except exception.NotPlatformIOProject:
pass
if not kwargs['port']:
ports = util.get_serial_ports(filter_hwid=True)
if len(ports) == 1:
kwargs['port'] = ports[0]['port']
sys.argv = ["monitor"] + env_options.get("monitor_flags", [])
for k, v in kwargs.items():
if k in ("port", "baud", "rts", "dtr", "environment", "project_dir"):
continue
k = "--" + k.replace("_", "-")
if k in env_options.get("monitor_flags", []):
continue
if isinstance(v, bool):
if v:
sys.argv.append(k)
elif isinstance(v, tuple):
for i in v:
sys.argv.extend([k, i])
else:
sys.argv.extend([k, str(v)])
if kwargs['port'] and (set(["*", "?", "[", "]"]) & set(kwargs['port'])):
for item in util.get_serial_ports():
if fnmatch(item['port'], kwargs['port']):
kwargs['port'] = item['port']
break
try:
miniterm.main(default_port=kwargs['port'],
default_baudrate=kwargs['baud'] or 9600,
default_rts=kwargs['rts'],
default_dtr=kwargs['dtr'])
except Exception as e:
raise exception.MinitermException(e)
def get_project_options(project_dir, environment=None):
config = ProjectConfig.get_instance(join(project_dir, "platformio.ini"))
config.validate(envs=[environment] if environment else None)
if not environment:
default_envs = config.default_envs()
if default_envs:
environment = default_envs[0]
else:
environment = config.envs()[0]
return config.items(env=environment, as_dict=True)


@@ -0,0 +1,15 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from platformio.commands.device.filters.base import DeviceMonitorFilter


@@ -0,0 +1,243 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
import sys
from fnmatch import fnmatch
import click
from serial.tools import miniterm
from platformio import exception, fs, util
from platformio.commands.device import helpers as device_helpers
from platformio.compat import dump_json_to_unicode
from platformio.managers.platform import PlatformFactory
from platformio.project.exception import NotPlatformIOProjectError
@click.group(short_help="Monitor device or list existing")
def cli():
pass
@cli.command("list", short_help="List devices")
@click.option("--serial", is_flag=True, help="List serial ports, default")
@click.option("--logical", is_flag=True, help="List logical devices")
@click.option("--mdns", is_flag=True, help="List multicast DNS services")
@click.option("--json-output", is_flag=True)
def device_list( # pylint: disable=too-many-branches
serial, logical, mdns, json_output
):
if not logical and not mdns:
serial = True
data = {}
if serial:
data["serial"] = util.get_serial_ports()
if logical:
data["logical"] = util.get_logical_devices()
if mdns:
data["mdns"] = util.get_mdns_services()
single_key = list(data)[0] if len(list(data)) == 1 else None
if json_output:
return click.echo(
dump_json_to_unicode(data[single_key] if single_key else data)
)
titles = {
"serial": "Serial Ports",
"logical": "Logical Devices",
"mdns": "Multicast DNS Services",
}
for key, value in data.items():
if not single_key:
click.secho(titles[key], bold=True)
click.echo("=" * len(titles[key]))
if key == "serial":
for item in value:
click.secho(item["port"], fg="cyan")
click.echo("-" * len(item["port"]))
click.echo("Hardware ID: %s" % item["hwid"])
click.echo("Description: %s" % item["description"])
click.echo("")
if key == "logical":
for item in value:
click.secho(item["path"], fg="cyan")
click.echo("-" * len(item["path"]))
click.echo("Name: %s" % item["name"])
click.echo("")
if key == "mdns":
for item in value:
click.secho(item["name"], fg="cyan")
click.echo("-" * len(item["name"]))
click.echo("Type: %s" % item["type"])
click.echo("IP: %s" % item["ip"])
click.echo("Port: %s" % item["port"])
if item["properties"]:
click.echo(
"Properties: %s"
% (
"; ".join(
[
"%s=%s" % (k, v)
for k, v in item["properties"].items()
]
)
)
)
click.echo("")
if single_key:
click.echo("")
return True
@cli.command("monitor", short_help="Monitor device (Serial)")
@click.option("--port", "-p", help="Port, a number or a device name")
@click.option("--baud", "-b", type=int, help="Set baud rate, default=9600")
@click.option(
"--parity",
default="N",
type=click.Choice(["N", "E", "O", "S", "M"]),
help="Set parity, default=N",
)
@click.option("--rtscts", is_flag=True, help="Enable RTS/CTS flow control, default=Off")
@click.option(
"--xonxoff", is_flag=True, help="Enable software flow control, default=Off"
)
@click.option(
"--rts", default=None, type=click.IntRange(0, 1), help="Set initial RTS line state"
)
@click.option(
"--dtr", default=None, type=click.IntRange(0, 1), help="Set initial DTR line state"
)
@click.option("--echo", is_flag=True, help="Enable local echo, default=Off")
@click.option(
"--encoding",
default="UTF-8",
help="Set the encoding for the serial port (e.g. hexlify, "
"Latin1, UTF-8), default: UTF-8",
)
@click.option("--filter", "-f", multiple=True, help="Add filters/text transformations")
@click.option(
"--eol",
default="CRLF",
type=click.Choice(["CR", "LF", "CRLF"]),
help="End of line mode, default=CRLF",
)
@click.option("--raw", is_flag=True, help="Do not apply any encodings/transformations")
@click.option(
"--exit-char",
type=int,
default=3,
help="ASCII code of special character that is used to exit "
"the application, default=3 (Ctrl+C)",
)
@click.option(
"--menu-char",
type=int,
default=20,
help="ASCII code of special character that is used to "
"control miniterm (menu), default=20 (DEC)",
)
@click.option(
"--quiet",
is_flag=True,
help="Diagnostics: suppress non-error messages, default=Off",
)
@click.option(
"-d",
"--project-dir",
default=os.getcwd,
type=click.Path(exists=True, file_okay=False, dir_okay=True, resolve_path=True),
)
@click.option(
"-e",
"--environment",
help="Load configuration from `platformio.ini` and specified environment",
)
def device_monitor(**kwargs): # pylint: disable=too-many-branches
# load default monitor filters
filters_dir = os.path.join(fs.get_source_dir(), "commands", "device", "filters")
for name in os.listdir(filters_dir):
if not name.endswith(".py"):
continue
device_helpers.load_monitor_filter(os.path.join(filters_dir, name))
project_options = {}
try:
with fs.cd(kwargs["project_dir"]):
project_options = device_helpers.get_project_options(kwargs["environment"])
kwargs = device_helpers.apply_project_monitor_options(kwargs, project_options)
except NotPlatformIOProjectError:
pass
platform = None
if "platform" in project_options:
with fs.cd(kwargs["project_dir"]):
platform = PlatformFactory.newPlatform(project_options["platform"])
device_helpers.register_platform_filters(
platform, kwargs["project_dir"], kwargs["environment"]
)
if not kwargs["port"]:
ports = util.get_serial_ports(filter_hwid=True)
if len(ports) == 1:
kwargs["port"] = ports[0]["port"]
elif "platform" in project_options and "board" in project_options:
board_hwids = device_helpers.get_board_hwids(
kwargs["project_dir"], platform, project_options["board"],
)
for item in ports:
for hwid in board_hwids:
hwid_str = ("%s:%s" % (hwid[0], hwid[1])).replace("0x", "")
if hwid_str in item["hwid"]:
kwargs["port"] = item["port"]
break
if kwargs["port"]:
break
elif kwargs["port"] and (set(["*", "?", "[", "]"]) & set(kwargs["port"])):
for item in util.get_serial_ports():
if fnmatch(item["port"], kwargs["port"]):
kwargs["port"] = item["port"]
break
# override system argv with patched options
sys.argv = ["monitor"] + device_helpers.options_to_argv(
kwargs,
project_options,
ignore=("port", "baud", "rts", "dtr", "environment", "project_dir"),
)
if not kwargs["quiet"]:
click.echo(
"--- Available filters and text transformations: %s"
% ", ".join(sorted(miniterm.TRANSFORMATIONS.keys()))
)
click.echo("--- More details at http://bit.ly/pio-monitor-filters")
try:
miniterm.main(
default_port=kwargs["port"],
default_baudrate=kwargs["baud"] or 9600,
default_rts=kwargs["rts"],
default_dtr=kwargs["dtr"],
)
except Exception as e:
raise exception.MinitermException(e)
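The port auto-detection above first takes the single connected serial device, then falls back to matching the board's USB VID:PID pairs against the ports reported by pyserial. A small standalone sketch of that matching step (the hwid string format is illustrative):

def match_port_by_hwids(ports, board_hwids):
    # ports: entries as returned by util.get_serial_ports()
    # board_hwids: [["0x2341", "0x0043"], ...] as returned by get_board_hwids()
    for item in ports:
        for hwid in board_hwids:
            hwid_str = ("%s:%s" % (hwid[0], hwid[1])).replace("0x", "")
            if hwid_str in item["hwid"]:
                return item["port"]
    return None

ports = [{"port": "/dev/ttyACM0", "hwid": "USB VID:PID=2341:0043 SER=7523"}]
print(match_port_by_hwids(ports, [["0x2341", "0x0043"]]))   # -> /dev/ttyACM0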


@@ -0,0 +1,13 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.


@@ -0,0 +1,42 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from serial.tools import miniterm
from platformio.project.config import ProjectConfig
class DeviceMonitorFilter(miniterm.Transform):
def __init__(self, project_dir=None, environment=None):
""" Called by PlatformIO to pass context """
miniterm.Transform.__init__(self)
self.project_dir = project_dir
self.environment = environment
self.config = ProjectConfig.get_instance()
if not self.environment:
default_envs = self.config.default_envs()
if default_envs:
self.environment = default_envs[0]
elif self.config.envs():
self.environment = self.config.envs()[0]
def __call__(self):
""" Called by the miniterm library when the filter is actually used """
return self
@property
def NAME(self):
raise NotImplementedError("Please declare NAME attribute for the filter class")
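As an illustration of the new Device Monitor Filter API, a hypothetical user-defined filter only needs a NAME and the rx()/tx() hooks inherited from miniterm.Transform; it is then discovered by load_monitor_filter() / register_platform_filters() (shown further below) and enabled with --filter <name> or the monitor_filters project option:

from platformio.commands.device import DeviceMonitorFilter


class Lowercase(DeviceMonitorFilter):   # hypothetical example filter
    NAME = "lowercase"

    def rx(self, text):
        # transform data received from the device before it is displayed
        return text.lower()

    def tx(self, text):
        # transform data typed by the user before it is sent to the device
        return text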


@@ -0,0 +1,38 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import serial
from platformio.commands.device import DeviceMonitorFilter
class Hexlify(DeviceMonitorFilter):
NAME = "hexlify"
def __init__(self, *args, **kwargs):
super(Hexlify, self).__init__(*args, **kwargs)
self._counter = 0
def rx(self, text):
result = ""
for b in serial.iterbytes(text):
if (self._counter % 16) == 0:
result += "\n{:04X} | ".format(self._counter)
asciicode = ord(b)
if asciicode <= 255:
result += "{:02X} ".format(asciicode)
else:
result += "?? "
self._counter += 1
return result
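A minimal dependency-free sketch of the same transform, just to show what the hexlify view produces for incoming text:

class HexlifySketch(object):
    # Mirrors Hexlify.rx() above without the pyserial/PlatformIO dependencies
    def __init__(self):
        self._counter = 0

    def rx(self, text):
        result = ""
        for ch in text:                      # serial.iterbytes() in the real filter
            if (self._counter % 16) == 0:
                result += "\n{:04X} | ".format(self._counter)
            code = ord(ch)
            result += "{:02X} ".format(code) if code <= 255 else "?? "
            self._counter += 1
        return result

print(HexlifySketch().rx("OK\r\n"))
# -> 0000 | 4F 4B 0D 0A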


@@ -0,0 +1,44 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import io
import os.path
from datetime import datetime
from platformio.commands.device import DeviceMonitorFilter
class LogToFile(DeviceMonitorFilter):
NAME = "log2file"
def __init__(self, *args, **kwargs):
super(LogToFile, self).__init__(*args, **kwargs)
self._log_fp = None
def __call__(self):
log_file_name = "platformio-device-monitor-%s.log" % datetime.now().strftime(
"%y%m%d-%H%M%S"
)
print("--- Logging an output to %s" % os.path.abspath(log_file_name))
self._log_fp = io.open(log_file_name, "w", encoding="utf-8")
return self
def __del__(self):
if self._log_fp:
self._log_fp.close()
def rx(self, text):
self._log_fp.write(text)
self._log_fp.flush()
return text


@@ -0,0 +1,31 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from platformio.commands.device import DeviceMonitorFilter
class SendOnEnter(DeviceMonitorFilter):
NAME = "send_on_enter"
def __init__(self, *args, **kwargs):
super(SendOnEnter, self).__init__(*args, **kwargs)
self._buffer = ""
def tx(self, text):
self._buffer += text
if self._buffer.endswith("\r\n"):
text = self._buffer[:-2]
self._buffer = ""
return text
return ""


@@ -0,0 +1,34 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from datetime import datetime
from platformio.commands.device import DeviceMonitorFilter
class Timestamp(DeviceMonitorFilter):
NAME = "time"
def __init__(self, *args, **kwargs):
super(Timestamp, self).__init__(*args, **kwargs)
self._first_text_received = False
def rx(self, text):
if self._first_text_received and "\n" not in text:
return text
timestamp = datetime.now().strftime("%H:%M:%S.%f")[:-3]
if not self._first_text_received:
self._first_text_received = True
return "%s > %s" % (timestamp, text)
return text.replace("\n", "\n%s > " % timestamp)


@@ -0,0 +1,106 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import inspect
import os
from serial.tools import miniterm
from platformio import fs
from platformio.commands.device import DeviceMonitorFilter
from platformio.compat import get_object_members, load_python_module
from platformio.project.config import ProjectConfig
def apply_project_monitor_options(cli_options, project_options):
for k in ("port", "speed", "rts", "dtr"):
k2 = "monitor_%s" % k
if k == "speed":
k = "baud"
if cli_options[k] is None and k2 in project_options:
cli_options[k] = project_options[k2]
if k != "port":
cli_options[k] = int(cli_options[k])
return cli_options
def options_to_argv(cli_options, project_options, ignore=None):
confmon_flags = project_options.get("monitor_flags", [])
result = confmon_flags[::]
for f in project_options.get("monitor_filters", []):
result.extend(["--filter", f])
for k, v in cli_options.items():
if v is None or (ignore and k in ignore):
continue
k = "--" + k.replace("_", "-")
if k in confmon_flags:
continue
if isinstance(v, bool):
if v:
result.append(k)
elif isinstance(v, tuple):
for i in v:
result.extend([k, i])
else:
result.extend([k, str(v)])
return result
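A usage example for the helper above, assuming PlatformIO is importable; the option values are hypothetical. Flags already present in monitor_flags are not duplicated, and each entry in monitor_filters contributes an extra --filter argument:

from platformio.commands.device.helpers import options_to_argv  # module shown above

cli_options = {"echo": True, "filter": ("time",), "parity": None}
project_options = {"monitor_flags": ["--baud", "115200"],
                   "monitor_filters": ["log2file"]}

argv = options_to_argv(cli_options, project_options,
                       ignore=("port", "baud", "rts", "dtr"))
print(argv)
# -> ['--baud', '115200', '--filter', 'log2file', '--echo', '--filter', 'time']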
def get_project_options(environment=None):
config = ProjectConfig.get_instance()
config.validate(envs=[environment] if environment else None)
if not environment:
default_envs = config.default_envs()
if default_envs:
environment = default_envs[0]
else:
environment = config.envs()[0]
return config.items(env=environment, as_dict=True)
def get_board_hwids(project_dir, platform, board):
with fs.cd(project_dir):
return platform.board_config(board).get("build.hwids", [])
def load_monitor_filter(path, project_dir=None, environment=None):
name = os.path.basename(path)
name = name[: name.find(".")]
module = load_python_module("platformio.commands.device.filters.%s" % name, path)
for cls in get_object_members(module).values():
if (
not inspect.isclass(cls)
or not issubclass(cls, DeviceMonitorFilter)
or cls == DeviceMonitorFilter
):
continue
obj = cls(project_dir, environment)
miniterm.TRANSFORMATIONS[obj.NAME] = obj
return True
def register_platform_filters(platform, project_dir, environment):
monitor_dir = os.path.join(platform.get_dir(), "monitor")
if not os.path.isdir(monitor_dir):
return
for name in os.listdir(monitor_dir):
if not name.startswith("filter_") or not name.endswith(".py"):
continue
path = os.path.join(monitor_dir, name)
if not os.path.isfile(path):
continue
load_monitor_filter(path, project_dir, environment)


@@ -11,5 +11,3 @@
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from platformio.commands.home.command import cli


@@ -12,6 +12,8 @@
# See the License for the specific language governing permissions and
# limitations under the License.
# pylint: disable=too-many-locals,too-many-statements
import mimetypes
import socket
from os.path import isdir
@@ -19,8 +21,12 @@ from os.path import isdir
import click
from platformio import exception
from platformio.managers.core import (get_core_package_dir,
inject_contrib_pysite)
from platformio.compat import WINDOWS
from platformio.managers.core import (
build_contrib_pysite_deps,
get_core_package_dir,
inject_contrib_pysite,
)
@click.command("home", short_help="PIO Home")
@@ -28,17 +34,37 @@ from platformio.managers.core import (get_core_package_dir,
@click.option(
"--host",
default="127.0.0.1",
help="HTTP host, default=127.0.0.1. "
"You can open PIO Home for inbound connections with --host=0.0.0.0")
@click.option("--no-open", is_flag=True) # pylint: disable=too-many-locals
def cli(port, host, no_open):
help=(
"HTTP host, default=127.0.0.1. You can open PIO Home for inbound "
"connections with --host=0.0.0.0"
),
)
@click.option("--no-open", is_flag=True)
@click.option(
"--shutdown-timeout",
default=0,
type=int,
help=(
"Automatically shutdown server on timeout (in seconds) when no clients "
"are connected. Default is 0 which means never auto shutdown"
),
)
def cli(port, host, no_open, shutdown_timeout):
# pylint: disable=import-error, import-outside-toplevel
# import contrib modules
inject_contrib_pysite()
# pylint: disable=import-error
from autobahn.twisted.resource import WebSocketResource
try:
from autobahn.twisted.resource import WebSocketResource
except: # pylint: disable=bare-except
build_contrib_pysite_deps(get_core_package_dir("contrib-pysite"))
from autobahn.twisted.resource import WebSocketResource
from twisted.internet import reactor
from twisted.web import server
# pylint: enable=import-error
from twisted.internet.error import CannotListenError
from platformio.commands.home.rpc.handlers.app import AppRPC
from platformio.commands.home.rpc.handlers.ide import IDERPC
from platformio.commands.home.rpc.handlers.misc import MiscRPC
@@ -48,7 +74,7 @@ def cli(port, host, no_open):
from platformio.commands.home.rpc.server import JSONRPCServerFactory
from platformio.commands.home.web import WebRoot
factory = JSONRPCServerFactory()
factory = JSONRPCServerFactory(shutdown_timeout)
factory.addHandler(AppRPC(), namespace="app")
factory.addHandler(IDERPC(), namespace="ide")
factory.addHandler(MiscRPC(), namespace="misc")
@@ -73,15 +99,7 @@ def cli(port, host, no_open):
if host == "__do_not_start__":
return
# if already started
already_started = False
socket.setdefaulttimeout(1)
try:
socket.socket(socket.AF_INET, socket.SOCK_STREAM).connect((host, port))
already_started = True
except: # pylint: disable=bare-except
pass
already_started = is_port_used(host, port)
home_url = "http://%s:%d" % (host, port)
if not no_open:
if already_started:
@@ -89,21 +107,53 @@ def cli(port, host, no_open):
else:
reactor.callLater(1, lambda: click.launch(home_url))
click.echo("\n".join([
"",
" ___I_",
" /\\-_--\\ PlatformIO Home",
"/ \\_-__\\",
"|[]| [] | %s" % home_url,
"|__|____|______________%s" % ("_" * len(host)),
]))
click.echo(
"\n".join(
[
"",
" ___I_",
" /\\-_--\\ PlatformIO Home",
"/ \\_-__\\",
"|[]| [] | %s" % home_url,
"|__|____|______________%s" % ("_" * len(host)),
]
)
)
click.echo("")
click.echo("Open PIO Home in your browser by this URL => %s" % home_url)
click.echo("Open PlatformIO Home in your browser by this URL => %s" % home_url)
try:
reactor.listenTCP(port, site, interface=host)
except CannotListenError as e:
click.secho(str(e), fg="red", err=True)
already_started = True
if already_started:
click.secho(
"PlatformIO Home server is already started in another process.", fg="yellow"
)
return
click.echo("PIO Home has been started. Press Ctrl+C to shutdown.")
reactor.listenTCP(port, site, interface=host)
reactor.run()
def is_port_used(host, port):
socket.setdefaulttimeout(1)
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
if WINDOWS:
try:
s.bind((host, port))
s.close()
return False
except (OSError, socket.error):
pass
else:
try:
s.connect((host, port))
s.close()
except socket.error:
return False
return True


@@ -27,7 +27,6 @@ from platformio.proc import where_is_program
class AsyncSession(requests.Session):
def __init__(self, n=None, *args, **kwargs):
if n:
pool = reactor.getThreadPool()
@@ -51,7 +50,8 @@ def requests_session():
@util.memoized(expire="60s")
def get_core_fullpath():
return where_is_program(
"platformio" + (".exe" if "windows" in util.get_systype() else ""))
"platformio" + (".exe" if "windows" in util.get_systype() else "")
)
@util.memoized(expire="10s")
@@ -60,9 +60,7 @@ def is_twitter_blocked():
timeout = 2
try:
if os.getenv("HTTP_PROXY", os.getenv("HTTPS_PROXY")):
requests.get("http://%s" % ip,
allow_redirects=False,
timeout=timeout)
requests.get("http://%s" % ip, allow_redirects=False, timeout=timeout)
else:
socket.socket(socket.AF_INET, socket.SOCK_STREAM).connect((ip, 80))
return False


@@ -14,11 +14,10 @@
from __future__ import absolute_import
from os.path import expanduser, join
from os.path import join
from platformio import __version__, app, util
from platformio.project.helpers import (get_project_core_dir,
is_platformio_project)
from platformio import __version__, app, fs, util
from platformio.project.helpers import get_project_core_dir, is_platformio_project
class AppRPC(object):
@@ -26,8 +25,13 @@ class AppRPC(object):
APPSTATE_PATH = join(get_project_core_dir(), "homestate.json")
IGNORE_STORAGE_KEYS = [
"cid", "coreVersion", "coreSystype", "coreCaller", "coreSettings",
"homeDir", "projectsDir"
"cid",
"coreVersion",
"coreSystype",
"coreCaller",
"coreSettings",
"homeDir",
"projectsDir",
]
@staticmethod
@@ -37,31 +41,28 @@ class AppRPC(object):
# base data
caller_id = app.get_session_var("caller_id")
storage['cid'] = app.get_cid()
storage['coreVersion'] = __version__
storage['coreSystype'] = util.get_systype()
storage['coreCaller'] = (str(caller_id).lower()
if caller_id else None)
storage['coreSettings'] = {
storage["cid"] = app.get_cid()
storage["coreVersion"] = __version__
storage["coreSystype"] = util.get_systype()
storage["coreCaller"] = str(caller_id).lower() if caller_id else None
storage["coreSettings"] = {
name: {
"description": data['description'],
"default_value": data['value'],
"value": app.get_setting(name)
"description": data["description"],
"default_value": data["value"],
"value": app.get_setting(name),
}
for name, data in app.DEFAULT_SETTINGS.items()
}
storage['homeDir'] = expanduser("~")
storage['projectsDir'] = storage['coreSettings']['projects_dir'][
'value']
storage["homeDir"] = fs.expanduser("~")
storage["projectsDir"] = storage["coreSettings"]["projects_dir"]["value"]
# skip non-existing recent projects
storage['recentProjects'] = [
p for p in storage.get("recentProjects", [])
if is_platformio_project(p)
storage["recentProjects"] = [
p for p in storage.get("recentProjects", []) if is_platformio_project(p)
]
state['storage'] = storage
state["storage"] = storage
state.modified = False # skip saving extra fields
return state.as_dict()


@@ -19,20 +19,18 @@ from twisted.internet import defer # pylint: disable=import-error
class IDERPC(object):
def __init__(self):
self._queue = {}
def send_command(self, command, params, sid=0):
def send_command(self, sid, command, params):
if not self._queue.get(sid):
raise jsonrpc.exceptions.JSONRPCDispatchException(
code=4005, message="PIO Home IDE agent is not started")
code=4005, message="PIO Home IDE agent is not started"
)
while self._queue[sid]:
self._queue[sid].pop().callback({
"id": time.time(),
"method": command,
"params": params
})
self._queue[sid].pop().callback(
{"id": time.time(), "method": command, "params": params}
)
def listen_commands(self, sid=0):
if sid not in self._queue:
@@ -40,5 +38,10 @@ class IDERPC(object):
self._queue[sid].append(defer.Deferred())
return self._queue[sid][-1]
def open_project(self, project_dir, sid=0):
return self.send_command("open_project", project_dir, sid)
def open_project(self, sid, project_dir):
return self.send_command(sid, "open_project", project_dir)
def open_text_document(self, sid, path, line=None, column=None):
return self.send_command(
sid, "open_text_document", dict(path=path, line=line, column=column)
)
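
A minimal in-process sketch (not part of the diff) of the reordered IDERPC API above, where the session id now comes first; the sid value and file path are hypothetical.

ide = IDERPC()                          # IDERPC as defined in the hunk above

# an IDE agent first registers a listener for its session id
d = ide.listen_commands(sid=1)
d.addCallback(lambda cmd: print(cmd["method"], cmd["params"]))

# another handler can then ask that session to open a file at a given position
ide.open_text_document(1, "/path/to/platformio.ini", line=10, column=1)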

View File

@@ -22,33 +22,31 @@ from platformio.commands.home.rpc.handlers.os import OSRPC
class MiscRPC(object):
def load_latest_tweets(self, username):
cache_key = "piohome_latest_tweets_" + str(username)
def load_latest_tweets(self, data_url):
cache_key = app.ContentCache.key_from_args(data_url, "tweets")
cache_valid = "7d"
with app.ContentCache() as cc:
cache_data = cc.get(cache_key)
if cache_data:
cache_data = json.loads(cache_data)
# automatically update cache in background every 12 hours
if cache_data['time'] < (time.time() - (3600 * 12)):
reactor.callLater(5, self._preload_latest_tweets, username,
cache_key, cache_valid)
return cache_data['result']
if cache_data["time"] < (time.time() - (3600 * 12)):
reactor.callLater(
5, self._preload_latest_tweets, data_url, cache_key, cache_valid
)
return cache_data["result"]
result = self._preload_latest_tweets(username, cache_key, cache_valid)
result = self._preload_latest_tweets(data_url, cache_key, cache_valid)
return result
@staticmethod
@defer.inlineCallbacks
def _preload_latest_tweets(username, cache_key, cache_valid):
result = yield OSRPC.fetch_content(
"https://api.platformio.org/tweets/" + username)
result = json.loads(result)
def _preload_latest_tweets(data_url, cache_key, cache_valid):
result = json.loads((yield OSRPC.fetch_content(data_url)))
with app.ContentCache() as cc:
cc.set(cache_key,
json.dumps({
"time": int(time.time()),
"result": result
}), cache_valid)
cc.set(
cache_key,
json.dumps({"time": int(time.time()), "result": result}),
cache_valid,
)
defer.returnValue(result)
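
For context (not part of the diff): load_latest_tweets() now takes a data URL instead of a username and, per the hunk above, may return either a cached list or a Deferred, so a caller would normalize the result; the feed URL below is hypothetical.

from twisted.internet import defer

misc = MiscRPC()                        # MiscRPC as defined in the hunk above
d = defer.maybeDeferred(
    misc.load_latest_tweets,
    "https://example.com/piohome/tweets.json",  # hypothetical feed URL
)
d.addCallback(lambda tweets: print("tweets payload:", tweets))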

View File

@@ -14,35 +14,33 @@
from __future__ import absolute_import
import codecs
import glob
import io
import os
import shutil
from functools import cmp_to_key
from os.path import expanduser, isdir, isfile, join
import click
from twisted.internet import defer # pylint: disable=import-error
from platformio import app, util
from platformio import app, fs, util
from platformio.commands.home import helpers
from platformio.compat import PY2, get_filesystem_encoding
class OSRPC(object):
@staticmethod
@defer.inlineCallbacks
def fetch_content(uri, data=None, headers=None, cache_valid=None):
if not headers:
headers = {
"User-Agent":
("Mozilla/5.0 (Macintosh; Intel Mac OS X 10_12_6) "
"AppleWebKit/603.3.8 (KHTML, like Gecko) Version/10.1.2 "
"Safari/603.3.8")
"User-Agent": (
"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_12_6) "
"AppleWebKit/603.3.8 (KHTML, like Gecko) Version/10.1.2 "
"Safari/603.3.8"
)
}
cache_key = (app.ContentCache.key_from_args(uri, data)
if cache_valid else None)
cache_key = app.ContentCache.key_from_args(uri, data) if cache_valid else None
with app.ContentCache() as cc:
if cache_key:
result = cc.get(cache_key)
@@ -66,12 +64,12 @@ class OSRPC(object):
defer.returnValue(result)
def request_content(self, uri, data=None, headers=None, cache_valid=None):
if uri.startswith('http'):
if uri.startswith("http"):
return self.fetch_content(uri, data, headers, cache_valid)
if not isfile(uri):
return None
with codecs.open(uri, encoding="utf-8") as fp:
return fp.read()
if os.path.isfile(uri):
with io.open(uri, encoding="utf-8") as fp:
return fp.read()
return None
@staticmethod
def open_url(url):
@@ -80,21 +78,29 @@ class OSRPC(object):
@staticmethod
def reveal_file(path):
return click.launch(
path.encode(get_filesystem_encoding()) if PY2 else path,
locate=True)
path.encode(get_filesystem_encoding()) if PY2 else path, locate=True
)
@staticmethod
def open_file(path):
return click.launch(path.encode(get_filesystem_encoding()) if PY2 else path)
@staticmethod
def is_file(path):
return isfile(path)
return os.path.isfile(path)
@staticmethod
def is_dir(path):
return isdir(path)
return os.path.isdir(path)
@staticmethod
def make_dirs(path):
return os.makedirs(path)
@staticmethod
def get_file_mtime(path):
return os.path.getmtime(path)
@staticmethod
def rename(src, dst):
return os.rename(src, dst)
@@ -109,13 +115,11 @@ class OSRPC(object):
pathnames = [pathnames]
result = set()
for pathname in pathnames:
result |= set(
glob.glob(join(root, pathname) if root else pathname))
result |= set(glob.glob(os.path.join(root, pathname) if root else pathname))
return list(result)
@staticmethod
def list_dir(path):
def _cmp(x, y):
if x[1] and not y[1]:
return -1
@@ -129,14 +133,14 @@ class OSRPC(object):
items = []
if path.startswith("~"):
path = expanduser(path)
if not isdir(path):
path = fs.expanduser(path)
if not os.path.isdir(path):
return items
for item in os.listdir(path):
try:
item_is_dir = isdir(join(path, item))
item_is_dir = os.path.isdir(os.path.join(path, item))
if item_is_dir:
os.listdir(join(path, item))
os.listdir(os.path.join(path, item))
items.append((item, item_is_dir))
except OSError:
pass
@@ -146,7 +150,7 @@ class OSRPC(object):
def get_logical_devices():
items = []
for item in util.get_logical_devices():
if item['name']:
item['name'] = item['name']
if item["name"]:
item["name"] = item["name"]
items.append(item)
return items
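
A minimal sketch (not part of the diff) exercising a few OSRPC helpers touched above, including the newly added get_file_mtime(); all paths are hypothetical.

os_rpc = OSRPC()                        # OSRPC as defined in the hunk above

os_rpc.make_dirs("/tmp/pio-home-demo")
print(os_rpc.is_dir("/tmp/pio-home-demo"))           # True
print(os_rpc.is_file("/tmp/pio-home-demo"))          # False
print(os_rpc.get_file_mtime("/tmp/pio-home-demo"))   # modification time of the new directory
print(os_rpc.list_dir("~"))                          # [(name, is_dir), ...] for the home directory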

View File

@@ -27,8 +27,7 @@ from twisted.internet import utils # pylint: disable=import-error
from platformio import __main__, __version__, fs
from platformio.commands.home import helpers
from platformio.compat import (PY2, get_filesystem_encoding, is_bytes,
string_types)
from platformio.compat import PY2, get_filesystem_encoding, is_bytes, string_types
try:
from thread import get_ident as thread_get_ident
@@ -37,7 +36,6 @@ except ImportError:
class MultiThreadingStdStream(object):
def __init__(self, parent_stream):
self._buffers = {thread_get_ident(): parent_stream}
@@ -54,7 +52,8 @@ class MultiThreadingStdStream(object):
thread_id = thread_get_ident()
self._ensure_thread_buffer(thread_id)
return self._buffers[thread_id].write(
value.decode() if is_bytes(value) else value)
value.decode() if is_bytes(value) else value
)
def get_value_and_reset(self):
result = ""
@@ -68,7 +67,6 @@ class MultiThreadingStdStream(object):
class PIOCoreRPC(object):
@staticmethod
def version():
return __version__
@@ -104,16 +102,15 @@ class PIOCoreRPC(object):
else:
result = yield PIOCoreRPC._call_inline(args, options)
try:
defer.returnValue(
PIOCoreRPC._process_result(result, to_json))
defer.returnValue(PIOCoreRPC._process_result(result, to_json))
except ValueError:
# fall-back to subprocess method
result = yield PIOCoreRPC._call_subprocess(args, options)
defer.returnValue(
PIOCoreRPC._process_result(result, to_json))
defer.returnValue(PIOCoreRPC._process_result(result, to_json))
except Exception as e: # pylint: disable=bare-except
raise jsonrpc.exceptions.JSONRPCDispatchException(
code=4003, message="PIO Core Call Error", data=str(e))
code=4003, message="PIO Core Call Error", data=str(e)
)
@staticmethod
def _call_inline(args, options):
@@ -123,8 +120,11 @@ class PIOCoreRPC(object):
def _thread_task():
with fs.cd(cwd):
exit_code = __main__.main(["-c"] + args)
return (PIOCoreRPC.thread_stdout.get_value_and_reset(),
PIOCoreRPC.thread_stderr.get_value_and_reset(), exit_code)
return (
PIOCoreRPC.thread_stdout.get_value_and_reset(),
PIOCoreRPC.thread_stderr.get_value_and_reset(),
exit_code,
)
return threads.deferToThread(_thread_task)
@@ -135,8 +135,8 @@ class PIOCoreRPC(object):
helpers.get_core_fullpath(),
args,
path=cwd,
env={k: v
for k, v in os.environ.items() if "%" not in k})
env={k: v for k, v in os.environ.items() if "%" not in k},
)
@staticmethod
def _process_result(result, to_json=False):
@@ -146,6 +146,8 @@ class PIOCoreRPC(object):
raise Exception(text)
if not to_json:
return text
if is_bytes(out):
out = out.decode()
try:
return json.loads(out)
except ValueError as e:
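
For context (not part of the diff): PIOCoreRPC.call() is the bridge the other handlers use to run PIO Core commands (see the options={"cwd": ...} calls in the ProjectRPC hunks below); a minimal sketch with hypothetical arguments.

d = PIOCoreRPC.call(["platform", "list"], options={"cwd": "/path/to/project"})  # hypothetical cwd
d.addCallback(lambda output: print(output))  # processed Core output
d.addErrback(lambda failure: print("PIO Core call failed:", failure.getErrorMessage()))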

View File

@@ -17,8 +17,6 @@ from __future__ import absolute_import
import os
import shutil
import time
from os.path import (basename, expanduser, getmtime, isdir, isfile, join,
realpath, sep)
import jsonrpc # pylint: disable=import-error
@@ -29,137 +27,175 @@ from platformio.compat import PY2, get_filesystem_encoding
from platformio.ide.projectgenerator import ProjectGenerator
from platformio.managers.platform import PlatformManager
from platformio.project.config import ProjectConfig
from platformio.project.helpers import (get_project_libdeps_dir,
get_project_src_dir,
is_platformio_project)
from platformio.project.exception import ProjectError
from platformio.project.helpers import get_project_dir, is_platformio_project
from platformio.project.options import get_config_options_schema
class ProjectRPC(object):
@staticmethod
def config_call(init_kwargs, method, *args):
assert isinstance(init_kwargs, dict)
assert "path" in init_kwargs
project_dir = get_project_dir()
if os.path.isfile(init_kwargs["path"]):
project_dir = os.path.dirname(init_kwargs["path"])
with fs.cd(project_dir):
return getattr(ProjectConfig(**init_kwargs), method)(*args)
@staticmethod
def _get_projects(project_dirs=None):
def config_load(path):
return ProjectConfig(
path, parse_extra=False, expand_interpolations=False
).as_tuple()
def _get_project_data(project_dir):
@staticmethod
def config_dump(path, data):
config = ProjectConfig(path, parse_extra=False, expand_interpolations=False)
config.update(data, clear=True)
return config.save()
@staticmethod
def config_update_description(path, text):
config = ProjectConfig(path, parse_extra=False, expand_interpolations=False)
if not config.has_section("platformio"):
config.add_section("platformio")
if text:
config.set("platformio", "description", text)
else:
if config.has_option("platformio", "description"):
config.remove_option("platformio", "description")
if not config.options("platformio"):
config.remove_section("platformio")
return config.save()
@staticmethod
def get_config_schema():
return get_config_options_schema()
@staticmethod
def get_projects():
def _get_project_data():
data = {"boards": [], "envLibdepsDirs": [], "libExtraDirs": []}
config = ProjectConfig(join(project_dir, "platformio.ini"))
libdeps_dir = get_project_libdeps_dir()
data['libExtraDirs'].extend(
config.get("platformio", "lib_extra_dirs", []))
config = ProjectConfig()
data["envs"] = config.envs()
data["description"] = config.get("platformio", "description")
data["libExtraDirs"].extend(config.get("platformio", "lib_extra_dirs", []))
libdeps_dir = config.get_optional_dir("libdeps")
for section in config.sections():
if not section.startswith("env:"):
continue
data['envLibdepsDirs'].append(join(libdeps_dir, section[4:]))
data["envLibdepsDirs"].append(os.path.join(libdeps_dir, section[4:]))
if config.has_option(section, "board"):
data['boards'].append(config.get(section, "board"))
data['libExtraDirs'].extend(
config.get(section, "lib_extra_dirs", []))
data["boards"].append(config.get(section, "board"))
data["libExtraDirs"].extend(config.get(section, "lib_extra_dirs", []))
# skip non existing folders and resolve full path
for key in ("envLibdepsDirs", "libExtraDirs"):
data[key] = [
expanduser(d) if d.startswith("~") else realpath(d)
for d in data[key] if isdir(d)
fs.expanduser(d) if d.startswith("~") else os.path.realpath(d)
for d in data[key]
if os.path.isdir(d)
]
return data
def _path_to_name(path):
return (sep).join(path.split(sep)[-2:])
if not project_dirs:
project_dirs = AppRPC.load_state()['storage']['recentProjects']
return (os.path.sep).join(path.split(os.path.sep)[-2:])
result = []
pm = PlatformManager()
for project_dir in project_dirs:
for project_dir in AppRPC.load_state()["storage"]["recentProjects"]:
if not os.path.isdir(project_dir):
continue
data = {}
boards = []
try:
with fs.cd(project_dir):
data = _get_project_data(project_dir)
except exception.PlatformIOProjectException:
data = _get_project_data()
except ProjectError:
continue
for board_id in data.get("boards", []):
name = board_id
try:
name = pm.board_config(board_id)['name']
name = pm.board_config(board_id)["name"]
except exception.PlatformioException:
pass
boards.append({"id": board_id, "name": name})
result.append({
"path":
project_dir,
"name":
_path_to_name(project_dir),
"modified":
int(getmtime(project_dir)),
"boards":
boards,
"envLibStorages": [{
"name": basename(d),
"path": d
} for d in data.get("envLibdepsDirs", [])],
"extraLibStorages": [{
"name": _path_to_name(d),
"path": d
} for d in data.get("libExtraDirs", [])]
})
result.append(
{
"path": project_dir,
"name": _path_to_name(project_dir),
"modified": int(os.path.getmtime(project_dir)),
"boards": boards,
"description": data.get("description"),
"envs": data.get("envs", []),
"envLibStorages": [
{"name": os.path.basename(d), "path": d}
for d in data.get("envLibdepsDirs", [])
],
"extraLibStorages": [
{"name": _path_to_name(d), "path": d}
for d in data.get("libExtraDirs", [])
],
}
)
return result
def get_projects(self, project_dirs=None):
return self._get_projects(project_dirs)
@staticmethod
def get_project_examples():
result = []
for manifest in PlatformManager().get_installed():
examples_dir = join(manifest['__pkg_dir'], "examples")
if not isdir(examples_dir):
examples_dir = os.path.join(manifest["__pkg_dir"], "examples")
if not os.path.isdir(examples_dir):
continue
items = []
for project_dir, _, __ in os.walk(examples_dir):
project_description = None
try:
config = ProjectConfig(join(project_dir, "platformio.ini"))
config = ProjectConfig(os.path.join(project_dir, "platformio.ini"))
config.validate(silent=True)
project_description = config.get("platformio",
"description")
except exception.PlatformIOProjectException:
project_description = config.get("platformio", "description")
except ProjectError:
continue
path_tokens = project_dir.split(sep)
items.append({
"name":
"/".join(path_tokens[path_tokens.index("examples") + 1:]),
"path":
project_dir,
"description":
project_description
})
result.append({
"platform": {
"title": manifest['title'],
"version": manifest['version']
},
"items": sorted(items, key=lambda item: item['name'])
})
return sorted(result, key=lambda data: data['platform']['title'])
path_tokens = project_dir.split(os.path.sep)
items.append(
{
"name": "/".join(
path_tokens[path_tokens.index("examples") + 1 :]
),
"path": project_dir,
"description": project_description,
}
)
result.append(
{
"platform": {
"title": manifest["title"],
"version": manifest["version"],
},
"items": sorted(items, key=lambda item: item["name"]),
}
)
return sorted(result, key=lambda data: data["platform"]["title"])
def init(self, board, framework, project_dir):
assert project_dir
state = AppRPC.load_state()
if not isdir(project_dir):
if not os.path.isdir(project_dir):
os.makedirs(project_dir)
args = ["init", "--board", board]
if framework:
args.extend(["--project-option", "framework = %s" % framework])
if (state['storage']['coreCaller'] and state['storage']['coreCaller']
in ProjectGenerator.get_supported_ides()):
args.extend(["--ide", state['storage']['coreCaller']])
if (
state["storage"]["coreCaller"]
and state["storage"]["coreCaller"] in ProjectGenerator.get_supported_ides()
):
args.extend(["--ide", state["storage"]["coreCaller"]])
d = PIOCoreRPC.call(args, options={"cwd": project_dir})
d.addCallback(self._generate_project_main, project_dir, framework)
return d
@@ -168,90 +204,101 @@ class ProjectRPC(object):
def _generate_project_main(_, project_dir, framework):
main_content = None
if framework == "arduino":
main_content = "\n".join([
"#include <Arduino.h>",
"",
"void setup() {",
" // put your setup code here, to run once:",
"}",
"",
"void loop() {",
" // put your main code here, to run repeatedly:",
"}"
""
]) # yapf: disable
main_content = "\n".join(
[
"#include <Arduino.h>",
"",
"void setup() {",
" // put your setup code here, to run once:",
"}",
"",
"void loop() {",
" // put your main code here, to run repeatedly:",
"}",
"",
]
)
elif framework == "mbed":
main_content = "\n".join([
"#include <mbed.h>",
"",
"int main() {",
"",
" // put your setup code here, to run once:",
"",
" while(1) {",
" // put your main code here, to run repeatedly:",
" }",
"}",
""
]) # yapf: disable
main_content = "\n".join(
[
"#include <mbed.h>",
"",
"int main() {",
"",
" // put your setup code here, to run once:",
"",
" while(1) {",
" // put your main code here, to run repeatedly:",
" }",
"}",
"",
]
)
if not main_content:
return project_dir
with fs.cd(project_dir):
src_dir = get_project_src_dir()
main_path = join(src_dir, "main.cpp")
if isfile(main_path):
config = ProjectConfig()
src_dir = config.get_optional_dir("src")
main_path = os.path.join(src_dir, "main.cpp")
if os.path.isfile(main_path):
return project_dir
if not isdir(src_dir):
if not os.path.isdir(src_dir):
os.makedirs(src_dir)
with open(main_path, "w") as f:
f.write(main_content.strip())
with open(main_path, "w") as fp:
fp.write(main_content.strip())
return project_dir
def import_arduino(self, board, use_arduino_libs, arduino_project_dir):
board = str(board)
if arduino_project_dir and PY2:
arduino_project_dir = arduino_project_dir.encode(
get_filesystem_encoding())
arduino_project_dir = arduino_project_dir.encode(get_filesystem_encoding())
# don't import PIO Project
if is_platformio_project(arduino_project_dir):
return arduino_project_dir
is_arduino_project = any([
isfile(
join(arduino_project_dir,
"%s.%s" % (basename(arduino_project_dir), ext)))
for ext in ("ino", "pde")
])
is_arduino_project = any(
[
os.path.isfile(
os.path.join(
arduino_project_dir,
"%s.%s" % (os.path.basename(arduino_project_dir), ext),
)
)
for ext in ("ino", "pde")
]
)
if not is_arduino_project:
raise jsonrpc.exceptions.JSONRPCDispatchException(
code=4000,
message="Not an Arduino project: %s" % arduino_project_dir)
code=4000, message="Not an Arduino project: %s" % arduino_project_dir
)
state = AppRPC.load_state()
project_dir = join(state['storage']['projectsDir'],
time.strftime("%y%m%d-%H%M%S-") + board)
if not isdir(project_dir):
project_dir = os.path.join(
state["storage"]["projectsDir"], time.strftime("%y%m%d-%H%M%S-") + board
)
if not os.path.isdir(project_dir):
os.makedirs(project_dir)
args = ["init", "--board", board]
args.extend(["--project-option", "framework = arduino"])
if use_arduino_libs:
args.extend([
"--project-option",
"lib_extra_dirs = ~/Documents/Arduino/libraries"
])
if (state['storage']['coreCaller'] and state['storage']['coreCaller']
in ProjectGenerator.get_supported_ides()):
args.extend(["--ide", state['storage']['coreCaller']])
args.extend(
["--project-option", "lib_extra_dirs = ~/Documents/Arduino/libraries"]
)
if (
state["storage"]["coreCaller"]
and state["storage"]["coreCaller"] in ProjectGenerator.get_supported_ides()
):
args.extend(["--ide", state["storage"]["coreCaller"]])
d = PIOCoreRPC.call(args, options={"cwd": project_dir})
d.addCallback(self._finalize_arduino_import, project_dir,
arduino_project_dir)
d.addCallback(self._finalize_arduino_import, project_dir, arduino_project_dir)
return d
@staticmethod
def _finalize_arduino_import(_, project_dir, arduino_project_dir):
with fs.cd(project_dir):
src_dir = get_project_src_dir()
if isdir(src_dir):
config = ProjectConfig()
src_dir = config.get_optional_dir("src")
if os.path.isdir(src_dir):
fs.rmtree(src_dir)
shutil.copytree(arduino_project_dir, src_dir)
return project_dir
@@ -260,18 +307,21 @@ class ProjectRPC(object):
def import_pio(project_dir):
if not project_dir or not is_platformio_project(project_dir):
raise jsonrpc.exceptions.JSONRPCDispatchException(
code=4001,
message="Not an PlatformIO project: %s" % project_dir)
new_project_dir = join(
AppRPC.load_state()['storage']['projectsDir'],
time.strftime("%y%m%d-%H%M%S-") + basename(project_dir))
code=4001, message="Not an PlatformIO project: %s" % project_dir
)
new_project_dir = os.path.join(
AppRPC.load_state()["storage"]["projectsDir"],
time.strftime("%y%m%d-%H%M%S-") + os.path.basename(project_dir),
)
shutil.copytree(project_dir, new_project_dir)
state = AppRPC.load_state()
args = ["init"]
if (state['storage']['coreCaller'] and state['storage']['coreCaller']
in ProjectGenerator.get_supported_ides()):
args.extend(["--ide", state['storage']['coreCaller']])
if (
state["storage"]["coreCaller"]
and state["storage"]["coreCaller"] in ProjectGenerator.get_supported_ides()
):
args.extend(["--ide", state["storage"]["coreCaller"]])
d = PIOCoreRPC.call(args, options={"cwd": new_project_dir})
d.addCallback(lambda _: new_project_dir)
return d
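
A minimal sketch (not part of the diff) of the new configuration helpers added to ProjectRPC above; the platformio.ini path and description text are hypothetical.

ini_path = "/path/to/project/platformio.ini"        # hypothetical project

# raw config as (section, options) tuples, without interpolation
print(ProjectRPC.config_load(ini_path))

# set (or clear, when text is empty) the [platformio] description
ProjectRPC.config_update_description(ini_path, "Blink example")

# proxy an arbitrary ProjectConfig method through config_call()
print(ProjectRPC.config_call({"path": ini_path}, "envs"))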

View File

@@ -16,49 +16,58 @@
import click
import jsonrpc
from autobahn.twisted.websocket import (WebSocketServerFactory,
WebSocketServerProtocol)
from autobahn.twisted.websocket import WebSocketServerFactory, WebSocketServerProtocol
from jsonrpc.exceptions import JSONRPCDispatchException
from twisted.internet import defer
from twisted.internet import defer, reactor
from platformio.compat import PY2, dump_json_to_unicode, is_bytes
class JSONRPCServerProtocol(WebSocketServerProtocol):
def onOpen(self):
self.factory.connection_nums += 1
if self.factory.shutdown_timer:
self.factory.shutdown_timer.cancel()
self.factory.shutdown_timer = None
def onClose(self, wasClean, code, reason): # pylint: disable=unused-argument
self.factory.connection_nums -= 1
if self.factory.connection_nums == 0:
self.factory.shutdownByTimeout()
def onMessage(self, payload, isBinary): # pylint: disable=unused-argument
# click.echo("> %s" % payload)
response = jsonrpc.JSONRPCResponseManager.handle(
payload, self.factory.dispatcher).data
payload, self.factory.dispatcher
).data
# if error
if "result" not in response:
self.sendJSONResponse(response)
return None
d = defer.maybeDeferred(lambda: response['result'])
d = defer.maybeDeferred(lambda: response["result"])
d.addCallback(self._callback, response)
d.addErrback(self._errback, response)
return None
def _callback(self, result, response):
response['result'] = result
response["result"] = result
self.sendJSONResponse(response)
def _errback(self, failure, response):
if isinstance(failure.value, JSONRPCDispatchException):
e = failure.value
else:
e = JSONRPCDispatchException(code=4999,
message=failure.getErrorMessage())
e = JSONRPCDispatchException(code=4999, message=failure.getErrorMessage())
del response["result"]
response['error'] = e.error._data # pylint: disable=protected-access
response["error"] = e.error._data # pylint: disable=protected-access
self.sendJSONResponse(response)
def sendJSONResponse(self, response):
# click.echo("< %s" % response)
if "error" in response:
click.secho("Error: %s" % response['error'], fg="red", err=True)
click.secho("Error: %s" % response["error"], fg="red", err=True)
response = dump_json_to_unicode(response)
if not PY2 and not is_bytes(response):
response = response.encode("utf-8")
@@ -68,10 +77,25 @@ class JSONRPCServerProtocol(WebSocketServerProtocol):
class JSONRPCServerFactory(WebSocketServerFactory):
protocol = JSONRPCServerProtocol
connection_nums = 0
shutdown_timer = 0
def __init__(self):
def __init__(self, shutdown_timeout=0):
super(JSONRPCServerFactory, self).__init__()
self.shutdown_timeout = shutdown_timeout
self.dispatcher = jsonrpc.Dispatcher()
def shutdownByTimeout(self):
if self.shutdown_timeout < 1:
return
def _auto_shutdown_server():
click.echo("Automatically shutdown server on timeout")
reactor.stop()
self.shutdown_timer = reactor.callLater(
self.shutdown_timeout, _auto_shutdown_server
)
def addHandler(self, handler, namespace):
self.dispatcher.build_method_map(handler, prefix="%s." % namespace)
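
Not part of the diff: a minimal sketch of wiring the factory with the new auto-shutdown behaviour; the timeout, handler, namespace, and port are illustrative.

factory = JSONRPCServerFactory(shutdown_timeout=3600)   # stop ~1 hour after the last client leaves
factory.addHandler(IDERPC(), namespace="ide")           # any RPC handler from the hunks above
# reactor.listenTCP(8008, factory)
# reactor.run()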

View File

@@ -17,14 +17,12 @@ from twisted.web import static # pylint: disable=import-error
class WebRoot(static.File):
def render_GET(self, request):
if request.args.get("__shutdown__", False):
if request.args.get(b"__shutdown__", False):
reactor.stop()
return "Server has been stopped"
request.setHeader("cache-control",
"no-cache, no-store, must-revalidate")
request.setHeader("cache-control", "no-cache, no-store, must-revalidate")
request.setHeader("pragma", "no-cache")
request.setHeader("expires", "0")
return static.File.render_GET(self, request)
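
For context (not part of the diff): WebRoot above subclasses twisted.web.static.File, and under Python 3 twisted.web exposes query arguments with byte-string keys, hence the b"__shutdown__" check. A rough sketch of serving an asset directory through it (path and port hypothetical):

from twisted.web import server

root = WebRoot("/path/to/piohome/dist")   # hypothetical asset directory
site = server.Site(root)
# reactor.listenTCP(8008, site)           # a request with ?__shutdown__=1 stops the reactor
# reactor.run()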

View File

@@ -14,8 +14,8 @@
# pylint: disable=too-many-branches, too-many-locals
import os
import time
from os.path import isdir, join
import click
import semantic_version
@@ -24,14 +24,12 @@ from tabulate import tabulate
from platformio import exception, fs, util
from platformio.commands import PlatformioCLI
from platformio.compat import dump_json_to_unicode
from platformio.managers.lib import (LibraryManager, get_builtin_libs,
is_builtin_lib)
from platformio.managers.lib import LibraryManager, get_builtin_libs, is_builtin_lib
from platformio.package.manifest.parser import ManifestParserFactory
from platformio.package.manifest.schema import ManifestSchema
from platformio.proc import is_ci
from platformio.project.config import ProjectConfig
from platformio.project.helpers import (get_project_dir,
get_project_global_lib_dir,
get_project_libdeps_dir,
is_platformio_project)
from platformio.project.helpers import get_project_dir, is_platformio_project
try:
from urllib.parse import quote
@@ -44,36 +42,43 @@ CTX_META_STORAGE_DIRS_KEY = __name__ + ".storage_dirs"
CTX_META_STORAGE_LIBDEPS_KEY = __name__ + ".storage_lib_deps"
def get_project_global_lib_dir():
return ProjectConfig.get_instance().get_optional_dir("globallib")
@click.group(short_help="Library Manager")
@click.option("-d",
"--storage-dir",
multiple=True,
default=None,
type=click.Path(exists=True,
file_okay=False,
dir_okay=True,
writable=True,
resolve_path=True),
help="Manage custom library storage")
@click.option("-g",
"--global",
is_flag=True,
help="Manage global PlatformIO library storage")
@click.option(
"-d",
"--storage-dir",
multiple=True,
default=None,
type=click.Path(
exists=True, file_okay=False, dir_okay=True, writable=True, resolve_path=True
),
help="Manage custom library storage",
)
@click.option(
"-g", "--global", is_flag=True, help="Manage global PlatformIO library storage"
)
@click.option(
"-e",
"--environment",
multiple=True,
help=("Manage libraries for the specific project build environments "
"declared in `platformio.ini`"))
help=(
"Manage libraries for the specific project build environments "
"declared in `platformio.ini`"
),
)
@click.pass_context
def cli(ctx, **options):
storage_cmds = ("install", "uninstall", "update", "list")
# skip commands that don't need storage folder
if ctx.invoked_subcommand not in storage_cmds or \
(len(ctx.args) == 2 and ctx.args[1] in ("-h", "--help")):
if ctx.invoked_subcommand not in storage_cmds or (
len(ctx.args) == 2 and ctx.args[1] in ("-h", "--help")
):
return
storage_dirs = list(options['storage_dir'])
if options['global']:
storage_dirs = list(options["storage_dir"])
if options["global"]:
storage_dirs.append(get_project_global_lib_dir())
if not storage_dirs:
if is_platformio_project():
@@ -84,15 +89,16 @@ def cli(ctx, **options):
"Warning! Global library storage is used automatically. "
"Please use `platformio lib --global %s` command to remove "
"this warning." % ctx.invoked_subcommand,
fg="yellow")
fg="yellow",
)
if not storage_dirs:
raise exception.NotGlobalLibDir(get_project_dir(),
get_project_global_lib_dir(),
ctx.invoked_subcommand)
raise exception.NotGlobalLibDir(
get_project_dir(), get_project_global_lib_dir(), ctx.invoked_subcommand
)
in_silence = PlatformioCLI.in_silence()
ctx.meta[CTX_META_PROJECT_ENVIRONMENTS_KEY] = options['environment']
ctx.meta[CTX_META_PROJECT_ENVIRONMENTS_KEY] = options["environment"]
ctx.meta[CTX_META_INPUT_DIRS_KEY] = storage_dirs
ctx.meta[CTX_META_STORAGE_DIRS_KEY] = []
ctx.meta[CTX_META_STORAGE_LIBDEPS_KEY] = {}
@@ -101,17 +107,19 @@ def cli(ctx, **options):
ctx.meta[CTX_META_STORAGE_DIRS_KEY].append(storage_dir)
continue
with fs.cd(storage_dir):
libdeps_dir = get_project_libdeps_dir()
config = ProjectConfig.get_instance(join(storage_dir,
"platformio.ini"))
config.validate(options['environment'], silent=in_silence)
for env in config.envs():
if options['environment'] and env not in options['environment']:
continue
storage_dir = join(libdeps_dir, env)
ctx.meta[CTX_META_STORAGE_DIRS_KEY].append(storage_dir)
ctx.meta[CTX_META_STORAGE_LIBDEPS_KEY][storage_dir] = config.get(
"env:" + env, "lib_deps", [])
config = ProjectConfig.get_instance(
os.path.join(storage_dir, "platformio.ini")
)
config.validate(options["environment"], silent=in_silence)
libdeps_dir = config.get_optional_dir("libdeps")
for env in config.envs():
if options["environment"] and env not in options["environment"]:
continue
storage_dir = os.path.join(libdeps_dir, env)
ctx.meta[CTX_META_STORAGE_DIRS_KEY].append(storage_dir)
ctx.meta[CTX_META_STORAGE_LIBDEPS_KEY][storage_dir] = config.get(
"env:" + env, "lib_deps", []
)
@cli.command("install", short_help="Install library")
@@ -119,21 +127,19 @@ def cli(ctx, **options):
@click.option(
"--save",
is_flag=True,
help="Save installed libraries into the `platformio.ini` dependency list")
@click.option("-s",
"--silent",
is_flag=True,
help="Suppress progress reporting")
@click.option("--interactive",
is_flag=True,
help="Allow to make a choice for all prompts")
@click.option("-f",
"--force",
is_flag=True,
help="Reinstall/redownload library if exists")
help="Save installed libraries into the `platformio.ini` dependency list",
)
@click.option("-s", "--silent", is_flag=True, help="Suppress progress reporting")
@click.option(
"--interactive", is_flag=True, help="Allow to make a choice for all prompts"
)
@click.option(
"-f", "--force", is_flag=True, help="Reinstall/redownload library if exists"
)
@click.pass_context
def lib_install( # pylint: disable=too-many-arguments
ctx, libraries, save, silent, interactive, force):
ctx, libraries, save, silent, interactive, force
):
storage_dirs = ctx.meta[CTX_META_STORAGE_DIRS_KEY]
storage_libdeps = ctx.meta.get(CTX_META_STORAGE_LIBDEPS_KEY, [])
@@ -144,25 +150,22 @@ def lib_install( # pylint: disable=too-many-arguments
lm = LibraryManager(storage_dir)
if libraries:
for library in libraries:
pkg_dir = lm.install(library,
silent=silent,
interactive=interactive,
force=force)
pkg_dir = lm.install(
library, silent=silent, interactive=interactive, force=force
)
installed_manifests[library] = lm.load_manifest(pkg_dir)
elif storage_dir in storage_libdeps:
builtin_lib_storages = None
for library in storage_libdeps[storage_dir]:
try:
pkg_dir = lm.install(library,
silent=silent,
interactive=interactive,
force=force)
pkg_dir = lm.install(
library, silent=silent, interactive=interactive, force=force
)
installed_manifests[library] = lm.load_manifest(pkg_dir)
except exception.LibNotFound as e:
if builtin_lib_storages is None:
builtin_lib_storages = get_builtin_libs()
if not silent or not is_builtin_lib(
builtin_lib_storages, library):
if not silent or not is_builtin_lib(builtin_lib_storages, library):
click.secho("Warning! %s" % e, fg="yellow")
if not save or not libraries:
@@ -171,7 +174,7 @@ def lib_install( # pylint: disable=too-many-arguments
input_dirs = ctx.meta.get(CTX_META_INPUT_DIRS_KEY, [])
project_environments = ctx.meta[CTX_META_PROJECT_ENVIRONMENTS_KEY]
for input_dir in input_dirs:
config = ProjectConfig.get_instance(join(input_dir, "platformio.ini"))
config = ProjectConfig.get_instance(os.path.join(input_dir, "platformio.ini"))
config.validate(project_environments)
for env in config.envs():
if project_environments and env not in project_environments:
@@ -183,8 +186,8 @@ def lib_install( # pylint: disable=too-many-arguments
continue
manifest = installed_manifests[library]
try:
assert library.lower() == manifest['name'].lower()
assert semantic_version.Version(manifest['version'])
assert library.lower() == manifest["name"].lower()
assert semantic_version.Version(manifest["version"])
lib_deps.append("{name}@^{version}".format(**manifest))
except (AssertionError, ValueError):
lib_deps.append(library)
@@ -206,13 +209,15 @@ def lib_uninstall(ctx, libraries):
@cli.command("update", short_help="Update installed libraries")
@click.argument("libraries", required=False, nargs=-1, metavar="[LIBRARY...]")
@click.option("-c",
"--only-check",
is_flag=True,
help="DEPRECATED. Please use `--dry-run` instead")
@click.option("--dry-run",
is_flag=True,
help="Do not update, only check for the new versions")
@click.option(
"-c",
"--only-check",
is_flag=True,
help="DEPRECATED. Please use `--dry-run` instead",
)
@click.option(
"--dry-run", is_flag=True, help="Do not update, only check for the new versions"
)
@click.option("--json-output", is_flag=True)
@click.pass_context
def lib_update(ctx, libraries, only_check, dry_run, json_output):
@@ -226,14 +231,12 @@ def lib_update(ctx, libraries, only_check, dry_run, json_output):
_libraries = libraries
if not _libraries:
_libraries = [
manifest['__pkg_dir'] for manifest in lm.get_installed()
]
_libraries = [manifest["__pkg_dir"] for manifest in lm.get_installed()]
if only_check and json_output:
result = []
for library in _libraries:
pkg_dir = library if isdir(library) else None
pkg_dir = library if os.path.isdir(library) else None
requirements = None
url = None
if not pkg_dir:
@@ -245,7 +248,7 @@ def lib_update(ctx, libraries, only_check, dry_run, json_output):
if not latest:
continue
manifest = lm.load_manifest(pkg_dir)
manifest['versionLatest'] = latest
manifest["versionLatest"] = latest
result.append(manifest)
json_result[storage_dir] = result
else:
@@ -254,8 +257,10 @@ def lib_update(ctx, libraries, only_check, dry_run, json_output):
if json_output:
return click.echo(
dump_json_to_unicode(json_result[storage_dirs[0]]
if len(storage_dirs) == 1 else json_result))
dump_json_to_unicode(
json_result[storage_dirs[0]] if len(storage_dirs) == 1 else json_result
)
)
return True
@@ -274,15 +279,17 @@ def lib_list(ctx, json_output):
if json_output:
json_result[storage_dir] = items
elif items:
for item in sorted(items, key=lambda i: i['name']):
for item in sorted(items, key=lambda i: i["name"]):
print_lib_item(item)
else:
click.echo("No items found")
if json_output:
return click.echo(
dump_json_to_unicode(json_result[storage_dirs[0]]
if len(storage_dirs) == 1 else json_result))
dump_json_to_unicode(
json_result[storage_dirs[0]] if len(storage_dirs) == 1 else json_result
)
)
return True
@@ -298,9 +305,11 @@ def lib_list(ctx, json_output):
@click.option("-f", "--framework", multiple=True)
@click.option("-p", "--platform", multiple=True)
@click.option("-i", "--header", multiple=True)
@click.option("--noninteractive",
is_flag=True,
help="Do not prompt, automatically paginate with delay")
@click.option(
"--noninteractive",
is_flag=True,
help="Do not prompt, automatically paginate with delay",
)
def lib_search(query, json_output, page, noninteractive, **filters):
if not query:
query = []
@@ -311,55 +320,61 @@ def lib_search(query, json_output, page, noninteractive, **filters):
for value in values:
query.append('%s:"%s"' % (key, value))
result = util.get_api_result("/v2/lib/search",
dict(query=" ".join(query), page=page),
cache_valid="1d")
result = util.get_api_result(
"/v2/lib/search", dict(query=" ".join(query), page=page), cache_valid="1d"
)
if json_output:
click.echo(dump_json_to_unicode(result))
return
if result['total'] == 0:
if result["total"] == 0:
click.secho(
"Nothing has been found by your request\n"
"Try a less-specific search or use truncation (or wildcard) "
"operator",
fg="yellow",
nl=False)
nl=False,
)
click.secho(" *", fg="green")
click.secho("For example: DS*, PCA*, DHT* and etc.\n", fg="yellow")
click.echo("For more examples and advanced search syntax, "
"please use documentation:")
click.echo(
"For more examples and advanced search syntax, please use documentation:"
)
click.secho(
"https://docs.platformio.org/page/userguide/lib/cmd_search.html\n",
fg="cyan")
fg="cyan",
)
return
click.secho("Found %d libraries:\n" % result['total'],
fg="green" if result['total'] else "yellow")
click.secho(
"Found %d libraries:\n" % result["total"],
fg="green" if result["total"] else "yellow",
)
while True:
for item in result['items']:
for item in result["items"]:
print_lib_item(item)
if (int(result['page']) * int(result['perpage']) >= int(
result['total'])):
if int(result["page"]) * int(result["perpage"]) >= int(result["total"]):
break
if noninteractive:
click.echo()
click.secho("Loading next %d libraries... Press Ctrl+C to stop!" %
result['perpage'],
fg="yellow")
click.secho(
"Loading next %d libraries... Press Ctrl+C to stop!"
% result["perpage"],
fg="yellow",
)
click.echo()
time.sleep(5)
elif not click.confirm("Show next libraries?"):
break
result = util.get_api_result("/v2/lib/search", {
"query": " ".join(query),
"page": int(result['page']) + 1
},
cache_valid="1d")
result = util.get_api_result(
"/v2/lib/search",
{"query": " ".join(query), "page": int(result["page"]) + 1},
cache_valid="1d",
)
@cli.command("builtin", short_help="List built-in libraries")
@@ -371,13 +386,13 @@ def lib_builtin(storage, json_output):
return click.echo(dump_json_to_unicode(items))
for storage_ in items:
if not storage_['items']:
if not storage_["items"]:
continue
click.secho(storage_['name'], fg="green")
click.echo("*" * len(storage_['name']))
click.secho(storage_["name"], fg="green")
click.echo("*" * len(storage_["name"]))
click.echo()
for item in sorted(storage_['items'], key=lambda i: i['name']):
for item in sorted(storage_["items"], key=lambda i: i["name"]):
print_lib_item(item)
return True
@@ -389,27 +404,29 @@ def lib_builtin(storage, json_output):
def lib_show(library, json_output):
lm = LibraryManager()
name, requirements, _ = lm.parse_pkg_uri(library)
lib_id = lm.search_lib_id({
"name": name,
"requirements": requirements
},
silent=json_output,
interactive=not json_output)
lib_id = lm.search_lib_id(
{"name": name, "requirements": requirements},
silent=json_output,
interactive=not json_output,
)
lib = util.get_api_result("/lib/info/%d" % lib_id, cache_valid="1d")
if json_output:
return click.echo(dump_json_to_unicode(lib))
click.secho(lib['name'], fg="cyan")
click.echo("=" * len(lib['name']))
click.secho("#ID: %d" % lib['id'], bold=True)
click.echo(lib['description'])
click.secho(lib["name"], fg="cyan")
click.echo("=" * len(lib["name"]))
click.secho("#ID: %d" % lib["id"], bold=True)
click.echo(lib["description"])
click.echo()
click.echo(
"Version: %s, released %s" %
(lib['version']['name'],
time.strftime("%c", util.parse_date(lib['version']['released']))))
click.echo("Manifest: %s" % lib['confurl'])
"Version: %s, released %s"
% (
lib["version"]["name"],
time.strftime("%c", util.parse_date(lib["version"]["released"])),
)
)
click.echo("Manifest: %s" % lib["confurl"])
for key in ("homepage", "repository", "license"):
if key not in lib or not lib[key]:
continue
@@ -436,23 +453,33 @@ def lib_show(library, json_output):
if _authors:
blocks.append(("Authors", _authors))
blocks.append(("Keywords", lib['keywords']))
blocks.append(("Keywords", lib["keywords"]))
for key in ("frameworks", "platforms"):
if key not in lib or not lib[key]:
continue
blocks.append(("Compatible %s" % key, [i['title'] for i in lib[key]]))
blocks.append(("Headers", lib['headers']))
blocks.append(("Examples", lib['examples']))
blocks.append(("Versions", [
"%s, released %s" %
(v['name'], time.strftime("%c", util.parse_date(v['released'])))
for v in lib['versions']
]))
blocks.append(("Unique Downloads", [
"Today: %s" % lib['dlstats']['day'],
"Week: %s" % lib['dlstats']['week'],
"Month: %s" % lib['dlstats']['month']
]))
blocks.append(("Compatible %s" % key, [i["title"] for i in lib[key]]))
blocks.append(("Headers", lib["headers"]))
blocks.append(("Examples", lib["examples"]))
blocks.append(
(
"Versions",
[
"%s, released %s"
% (v["name"], time.strftime("%c", util.parse_date(v["released"])))
for v in lib["versions"]
],
)
)
blocks.append(
(
"Unique Downloads",
[
"Today: %s" % lib["dlstats"]["day"],
"Week: %s" % lib["dlstats"]["week"],
"Month: %s" % lib["dlstats"]["month"],
],
)
)
for (title, rows) in blocks:
click.echo()
@@ -467,16 +494,20 @@ def lib_show(library, json_output):
@cli.command("register", short_help="Register a new library")
@click.argument("config_url")
def lib_register(config_url):
if (not config_url.startswith("http://")
and not config_url.startswith("https://")):
if not config_url.startswith("http://") and not config_url.startswith("https://"):
raise exception.InvalidLibConfURL(config_url)
result = util.get_api_result("/lib/register",
data=dict(config_url=config_url))
if "message" in result and result['message']:
click.secho(result['message'],
fg="green" if "successed" in result and result['successed']
else "red")
# Validate manifest
ManifestSchema().load_manifest(
ManifestParserFactory.new_from_url(config_url).as_dict()
)
result = util.get_api_result("/lib/register", data=dict(config_url=config_url))
if "message" in result and result["message"]:
click.secho(
result["message"],
fg="green" if "successed" in result and result["successed"] else "red",
)
@cli.command("stats", short_help="Library Registry Statistics")
@@ -488,46 +519,56 @@ def lib_stats(json_output):
return click.echo(dump_json_to_unicode(result))
for key in ("updated", "added"):
tabular_data = [(click.style(item['name'], fg="cyan"),
time.strftime("%c", util.parse_date(item['date'])),
"https://platformio.org/lib/show/%s/%s" %
(item['id'], quote(item['name'])))
for item in result.get(key, [])]
table = tabulate(tabular_data,
headers=[
click.style("RECENTLY " + key.upper(), bold=True),
"Date", "URL"
])
tabular_data = [
(
click.style(item["name"], fg="cyan"),
time.strftime("%c", util.parse_date(item["date"])),
"https://platformio.org/lib/show/%s/%s"
% (item["id"], quote(item["name"])),
)
for item in result.get(key, [])
]
table = tabulate(
tabular_data,
headers=[click.style("RECENTLY " + key.upper(), bold=True), "Date", "URL"],
)
click.echo(table)
click.echo()
for key in ("lastkeywords", "topkeywords"):
tabular_data = [(click.style(name, fg="cyan"),
"https://platformio.org/lib/search?query=" +
quote("keyword:%s" % name))
for name in result.get(key, [])]
tabular_data = [
(
click.style(name, fg="cyan"),
"https://platformio.org/lib/search?query=" + quote("keyword:%s" % name),
)
for name in result.get(key, [])
]
table = tabulate(
tabular_data,
headers=[
click.style(
("RECENT" if key == "lastkeywords" else "POPULAR") +
" KEYWORDS",
bold=True), "URL"
])
("RECENT" if key == "lastkeywords" else "POPULAR") + " KEYWORDS",
bold=True,
),
"URL",
],
)
click.echo(table)
click.echo()
for key, title in (("dlday", "Today"), ("dlweek", "Week"), ("dlmonth",
"Month")):
tabular_data = [(click.style(item['name'], fg="cyan"),
"https://platformio.org/lib/show/%s/%s" %
(item['id'], quote(item['name'])))
for item in result.get(key, [])]
table = tabulate(tabular_data,
headers=[
click.style("FEATURED: " + title.upper(),
bold=True), "URL"
])
for key, title in (("dlday", "Today"), ("dlweek", "Week"), ("dlmonth", "Month")):
tabular_data = [
(
click.style(item["name"], fg="cyan"),
"https://platformio.org/lib/show/%s/%s"
% (item["id"], quote(item["name"])),
)
for item in result.get(key, [])
]
table = tabulate(
tabular_data,
headers=[click.style("FEATURED: " + title.upper(), bold=True), "URL"],
)
click.echo(table)
click.echo()
@@ -538,15 +579,16 @@ def print_storage_header(storage_dirs, storage_dir):
if storage_dirs and storage_dirs[0] != storage_dir:
click.echo("")
click.echo(
click.style("Library Storage: ", bold=True) +
click.style(storage_dir, fg="blue"))
click.style("Library Storage: ", bold=True)
+ click.style(storage_dir, fg="blue")
)
def print_lib_item(item):
click.secho(item['name'], fg="cyan")
click.echo("=" * len(item['name']))
click.secho(item["name"], fg="cyan")
click.echo("=" * len(item["name"]))
if "id" in item:
click.secho("#ID: %d" % item['id'], bold=True)
click.secho("#ID: %d" % item["id"], bold=True)
if "description" in item or "url" in item:
click.echo(item.get("description", item.get("url", "")))
click.echo()
@@ -562,14 +604,26 @@ def print_lib_item(item):
for key in ("frameworks", "platforms"):
if key not in item:
continue
click.echo("Compatible %s: %s" % (key, ", ".join(
[i['title'] if isinstance(i, dict) else i for i in item[key]])))
click.echo(
"Compatible %s: %s"
% (
key,
", ".join(
[i["title"] if isinstance(i, dict) else i for i in item[key]]
),
)
)
if "authors" in item or "authornames" in item:
click.echo("Authors: %s" % ", ".join(
item.get("authornames",
[a.get("name", "") for a in item.get("authors", [])])))
click.echo(
"Authors: %s"
% ", ".join(
item.get(
"authornames", [a.get("name", "") for a in item.get("authors", [])]
)
)
)
if "__src_url" in item:
click.secho("Source: %s" % item['__src_url'])
click.secho("Source: %s" % item["__src_url"])
click.echo()
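
Not part of the diff: lib_register above now validates the manifest before calling the registry API; a standalone sketch of that validation step (the config URL is hypothetical).

from platformio.package.manifest.parser import ManifestParserFactory
from platformio.package.manifest.schema import ManifestSchema

config_url = "https://raw.githubusercontent.com/example/SomeLib/master/library.json"  # hypothetical
manifest = ManifestParserFactory.new_from_url(config_url).as_dict()
ManifestSchema().load_manifest(manifest)   # the same validation call the command performs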

View File

@@ -20,6 +20,7 @@ from platformio import app, exception, util
from platformio.commands.boards import print_boards
from platformio.compat import dump_json_to_unicode
from platformio.managers.platform import PlatformFactory, PlatformManager
from platformio.package.pack import PackagePacker
@click.group(short_help="Platform Manager")
@@ -29,24 +30,27 @@ def cli():
def _print_platforms(platforms):
for platform in platforms:
click.echo("{name} ~ {title}".format(name=click.style(platform['name'],
fg="cyan"),
title=platform['title']))
click.echo("=" * (3 + len(platform['name'] + platform['title'])))
click.echo(platform['description'])
click.echo(
"{name} ~ {title}".format(
name=click.style(platform["name"], fg="cyan"), title=platform["title"]
)
)
click.echo("=" * (3 + len(platform["name"] + platform["title"])))
click.echo(platform["description"])
click.echo()
if "homepage" in platform:
click.echo("Home: %s" % platform['homepage'])
if "frameworks" in platform and platform['frameworks']:
click.echo("Frameworks: %s" % ", ".join(platform['frameworks']))
click.echo("Home: %s" % platform["homepage"])
if "frameworks" in platform and platform["frameworks"]:
click.echo("Frameworks: %s" % ", ".join(platform["frameworks"]))
if "packages" in platform:
click.echo("Packages: %s" % ", ".join(platform['packages']))
click.echo("Packages: %s" % ", ".join(platform["packages"]))
if "version" in platform:
if "__src_url" in platform:
click.echo("Version: #%s (%s)" %
(platform['version'], platform['__src_url']))
click.echo(
"Version: #%s (%s)" % (platform["version"], platform["__src_url"])
)
else:
click.echo("Version: " + platform['version'])
click.echo("Version: " + platform["version"])
click.echo()
@@ -54,7 +58,7 @@ def _get_registry_platforms():
platforms = util.get_api_result("/platforms", cache_valid="7d")
pm = PlatformManager()
for platform in platforms or []:
platform['versions'] = pm.get_all_repo_versions(platform['name'])
platform["versions"] = pm.get_all_repo_versions(platform["name"])
return platforms
@@ -65,22 +69,22 @@ def _get_platform_data(*args, **kwargs):
return _get_registry_platform_data(*args, **kwargs)
def _get_installed_platform_data(platform,
with_boards=True,
expose_packages=True):
def _get_installed_platform_data(platform, with_boards=True, expose_packages=True):
p = PlatformFactory.newPlatform(platform)
data = dict(name=p.name,
title=p.title,
description=p.description,
version=p.version,
homepage=p.homepage,
repository=p.repository_url,
url=p.vendor_url,
docs=p.docs_url,
license=p.license,
forDesktop=not p.is_embedded(),
frameworks=sorted(list(p.frameworks) if p.frameworks else []),
packages=list(p.packages) if p.packages else [])
data = dict(
name=p.name,
title=p.title,
description=p.description,
version=p.version,
homepage=p.homepage,
repository=p.repository_url,
url=p.vendor_url,
docs=p.docs_url,
license=p.license,
forDesktop=not p.is_embedded(),
frameworks=sorted(list(p.frameworks) if p.frameworks else []),
packages=list(p.packages) if p.packages else [],
)
# if dump to API
# del data['version']
@@ -94,18 +98,20 @@ def _get_installed_platform_data(platform,
data[key] = manifest[key]
if with_boards:
data['boards'] = [c.get_brief_data() for c in p.get_boards().values()]
data["boards"] = [c.get_brief_data() for c in p.get_boards().values()]
if not data['packages'] or not expose_packages:
if not data["packages"] or not expose_packages:
return data
data['packages'] = []
data["packages"] = []
installed_pkgs = p.get_installed_packages()
for name, opts in p.packages.items():
item = dict(name=name,
type=p.get_package_type(name),
requirements=opts.get("version"),
optional=opts.get("optional") is True)
item = dict(
name=name,
type=p.get_package_type(name),
requirements=opts.get("version"),
optional=opts.get("optional") is True,
)
if name in installed_pkgs:
for key, value in installed_pkgs[name].items():
if key not in ("url", "version", "description"):
@@ -113,40 +119,42 @@ def _get_installed_platform_data(platform,
item[key] = value
if key == "version":
item["originalVersion"] = util.get_original_version(value)
data['packages'].append(item)
data["packages"].append(item)
return data
def _get_registry_platform_data( # pylint: disable=unused-argument
platform,
with_boards=True,
expose_packages=True):
platform, with_boards=True, expose_packages=True
):
_data = None
for p in _get_registry_platforms():
if p['name'] == platform:
if p["name"] == platform:
_data = p
break
if not _data:
return None
data = dict(name=_data['name'],
title=_data['title'],
description=_data['description'],
homepage=_data['homepage'],
repository=_data['repository'],
url=_data['url'],
license=_data['license'],
forDesktop=_data['forDesktop'],
frameworks=_data['frameworks'],
packages=_data['packages'],
versions=_data['versions'])
data = dict(
name=_data["name"],
title=_data["title"],
description=_data["description"],
homepage=_data["homepage"],
repository=_data["repository"],
url=_data["url"],
license=_data["license"],
forDesktop=_data["forDesktop"],
frameworks=_data["frameworks"],
packages=_data["packages"],
versions=_data["versions"],
)
if with_boards:
data['boards'] = [
board for board in PlatformManager().get_registered_boards()
if board['platform'] == _data['name']
data["boards"] = [
board
for board in PlatformManager().get_registered_boards()
if board["platform"] == _data["name"]
]
return data
@@ -164,9 +172,10 @@ def platform_search(query, json_output):
if query and query.lower() not in search_data.lower():
continue
platforms.append(
_get_registry_platform_data(platform['name'],
with_boards=False,
expose_packages=False))
_get_registry_platform_data(
platform["name"], with_boards=False, expose_packages=False
)
)
if json_output:
click.echo(dump_json_to_unicode(platforms))
@@ -185,15 +194,15 @@ def platform_frameworks(query, json_output):
search_data = dump_json_to_unicode(framework)
if query and query.lower() not in search_data.lower():
continue
framework['homepage'] = ("https://platformio.org/frameworks/" +
framework['name'])
framework['platforms'] = [
platform['name'] for platform in _get_registry_platforms()
if framework['name'] in platform['frameworks']
framework["homepage"] = "https://platformio.org/frameworks/" + framework["name"]
framework["platforms"] = [
platform["name"]
for platform in _get_registry_platforms()
if framework["name"] in platform["frameworks"]
]
frameworks.append(framework)
frameworks = sorted(frameworks, key=lambda manifest: manifest['name'])
frameworks = sorted(frameworks, key=lambda manifest: manifest["name"])
if json_output:
click.echo(dump_json_to_unicode(frameworks))
else:
@@ -207,11 +216,12 @@ def platform_list(json_output):
pm = PlatformManager()
for manifest in pm.get_installed():
platforms.append(
_get_installed_platform_data(manifest['__pkg_dir'],
with_boards=False,
expose_packages=False))
_get_installed_platform_data(
manifest["__pkg_dir"], with_boards=False, expose_packages=False
)
)
platforms = sorted(platforms, key=lambda manifest: manifest['name'])
platforms = sorted(platforms, key=lambda manifest: manifest["name"])
if json_output:
click.echo(dump_json_to_unicode(platforms))
else:
@@ -228,55 +238,58 @@ def platform_show(platform, json_output): # pylint: disable=too-many-branches
if json_output:
return click.echo(dump_json_to_unicode(data))
click.echo("{name} ~ {title}".format(name=click.style(data['name'],
fg="cyan"),
title=data['title']))
click.echo("=" * (3 + len(data['name'] + data['title'])))
click.echo(data['description'])
click.echo(
"{name} ~ {title}".format(
name=click.style(data["name"], fg="cyan"), title=data["title"]
)
)
click.echo("=" * (3 + len(data["name"] + data["title"])))
click.echo(data["description"])
click.echo()
if "version" in data:
click.echo("Version: %s" % data['version'])
if data['homepage']:
click.echo("Home: %s" % data['homepage'])
if data['repository']:
click.echo("Repository: %s" % data['repository'])
if data['url']:
click.echo("Vendor: %s" % data['url'])
if data['license']:
click.echo("License: %s" % data['license'])
if data['frameworks']:
click.echo("Frameworks: %s" % ", ".join(data['frameworks']))
click.echo("Version: %s" % data["version"])
if data["homepage"]:
click.echo("Home: %s" % data["homepage"])
if data["repository"]:
click.echo("Repository: %s" % data["repository"])
if data["url"]:
click.echo("Vendor: %s" % data["url"])
if data["license"]:
click.echo("License: %s" % data["license"])
if data["frameworks"]:
click.echo("Frameworks: %s" % ", ".join(data["frameworks"]))
if not data['packages']:
if not data["packages"]:
return None
if not isinstance(data['packages'][0], dict):
click.echo("Packages: %s" % ", ".join(data['packages']))
if not isinstance(data["packages"][0], dict):
click.echo("Packages: %s" % ", ".join(data["packages"]))
else:
click.echo()
click.secho("Packages", bold=True)
click.echo("--------")
for item in data['packages']:
for item in data["packages"]:
click.echo()
click.echo("Package %s" % click.style(item['name'], fg="yellow"))
click.echo("-" * (8 + len(item['name'])))
if item['type']:
click.echo("Type: %s" % item['type'])
click.echo("Requirements: %s" % item['requirements'])
click.echo("Installed: %s" %
("Yes" if item.get("version") else "No (optional)"))
click.echo("Package %s" % click.style(item["name"], fg="yellow"))
click.echo("-" * (8 + len(item["name"])))
if item["type"]:
click.echo("Type: %s" % item["type"])
click.echo("Requirements: %s" % item["requirements"])
click.echo(
"Installed: %s" % ("Yes" if item.get("version") else "No (optional)")
)
if "version" in item:
click.echo("Version: %s" % item['version'])
click.echo("Version: %s" % item["version"])
if "originalVersion" in item:
click.echo("Original version: %s" % item['originalVersion'])
click.echo("Original version: %s" % item["originalVersion"])
if "description" in item:
click.echo("Description: %s" % item['description'])
click.echo("Description: %s" % item["description"])
if data['boards']:
if data["boards"]:
click.echo()
click.secho("Boards", bold=True)
click.echo("------")
print_boards(data['boards'])
print_boards(data["boards"])
return True
@@ -286,24 +299,37 @@ def platform_show(platform, json_output): # pylint: disable=too-many-branches
@click.option("--with-package", multiple=True)
@click.option("--without-package", multiple=True)
@click.option("--skip-default-package", is_flag=True)
@click.option("--with-all-packages", is_flag=True)
@click.option(
"-f",
"--force",
is_flag=True,
help="Reinstall/redownload dev/platform and its packages if exist")
def platform_install(platforms, with_package, without_package,
skip_default_package, force):
help="Reinstall/redownload dev/platform and its packages if exist",
)
def platform_install( # pylint: disable=too-many-arguments
platforms,
with_package,
without_package,
skip_default_package,
with_all_packages,
force,
):
pm = PlatformManager()
for platform in platforms:
if pm.install(name=platform,
with_packages=with_package,
without_packages=without_package,
skip_default_package=skip_default_package,
force=force):
click.secho("The platform '%s' has been successfully installed!\n"
"The rest of packages will be installed automatically "
"depending on your build environment." % platform,
fg="green")
if pm.install(
name=platform,
with_packages=with_package,
without_packages=without_package,
skip_default_package=skip_default_package,
with_all_packages=with_all_packages,
force=force,
):
click.secho(
"The platform '%s' has been successfully installed!\n"
"The rest of packages will be installed automatically "
"depending on your build environment." % platform,
fg="green",
)
@cli.command("uninstall", short_help="Uninstall development platform")
@@ -312,35 +338,39 @@ def platform_uninstall(platforms):
pm = PlatformManager()
for platform in platforms:
if pm.uninstall(platform):
click.secho("The platform '%s' has been successfully "
"uninstalled!" % platform,
fg="green")
click.secho(
"The platform '%s' has been successfully uninstalled!" % platform,
fg="green",
)
@cli.command("update", short_help="Update installed development platforms")
@click.argument("platforms", nargs=-1, required=False, metavar="[PLATFORM...]")
@click.option("-p",
"--only-packages",
is_flag=True,
help="Update only the platform packages")
@click.option("-c",
"--only-check",
is_flag=True,
help="DEPRECATED. Please use `--dry-run` instead")
@click.option("--dry-run",
is_flag=True,
help="Do not update, only check for the new versions")
@click.option(
"-p", "--only-packages", is_flag=True, help="Update only the platform packages"
)
@click.option(
"-c",
"--only-check",
is_flag=True,
help="DEPRECATED. Please use `--dry-run` instead",
)
@click.option(
"--dry-run", is_flag=True, help="Do not update, only check for the new versions"
)
@click.option("--json-output", is_flag=True)
def platform_update( # pylint: disable=too-many-locals
platforms, only_packages, only_check, dry_run, json_output):
platforms, only_packages, only_check, dry_run, json_output
):
pm = PlatformManager()
pkg_dir_to_name = {}
if not platforms:
platforms = []
for manifest in pm.get_installed():
platforms.append(manifest['__pkg_dir'])
pkg_dir_to_name[manifest['__pkg_dir']] = manifest.get(
"title", manifest['name'])
platforms.append(manifest["__pkg_dir"])
pkg_dir_to_name[manifest["__pkg_dir"]] = manifest.get(
"title", manifest["name"]
)
only_check = dry_run or only_check
@@ -356,14 +386,16 @@ def platform_update( # pylint: disable=too-many-locals
if not pkg_dir:
continue
latest = pm.outdated(pkg_dir, requirements)
if (not latest and not PlatformFactory.newPlatform(
pkg_dir).are_outdated_packages()):
if (
not latest
and not PlatformFactory.newPlatform(pkg_dir).are_outdated_packages()
):
continue
data = _get_installed_platform_data(pkg_dir,
with_boards=False,
expose_packages=False)
data = _get_installed_platform_data(
pkg_dir, with_boards=False, expose_packages=False
)
if latest:
data['versionLatest'] = latest
data["versionLatest"] = latest
result.append(data)
return click.echo(dump_json_to_unicode(result))
@@ -371,10 +403,21 @@ def platform_update( # pylint: disable=too-many-locals
app.clean_cache()
for platform in platforms:
click.echo(
"Platform %s" %
click.style(pkg_dir_to_name.get(platform, platform), fg="cyan"))
"Platform %s"
% click.style(pkg_dir_to_name.get(platform, platform), fg="cyan")
)
click.echo("--------")
pm.update(platform, only_packages=only_packages, only_check=only_check)
click.echo()
return True
@cli.command(
"pack", short_help="Create a tarball from development platform/tool package"
)
@click.argument("package", required=True, metavar="[source directory, tar.gz or zip]")
def platform_pack(package):
p = PackagePacker(package)
tarball_path = p.pack()
click.secho('Wrote a tarball to "%s"' % tarball_path, fg="green")
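The diff above adds a `--with-all-packages` flag to `platformio platform install` and a new `platformio platform pack` command built on `PackagePacker`. Below is a minimal sketch of driving the same install path programmatically; it assumes `PlatformManager.install()` accepts exactly the keyword arguments shown in the diff, and the platform name is only an example.

# Hedged sketch: call the same install path the CLI command above uses.
# Assumes PlatformManager.install() accepts the keyword arguments shown in
# the diff; "atmelavr" is only an example platform name.
from platformio.managers.platform import PlatformManager

pm = PlatformManager()
pm.install(
    name="atmelavr",
    with_packages=[],
    without_packages=[],
    skip_default_package=False,
    with_all_packages=True,  # new in this diff: fetch every optional package up front
    force=False,
)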

View File

@@ -14,22 +14,59 @@
# pylint: disable=too-many-arguments,too-many-locals, too-many-branches
from os import getcwd, makedirs
from os.path import isdir, isfile, join
import os
import click
from tabulate import tabulate
from platformio import exception, fs
from platformio.commands.platform import \
platform_install as cli_platform_install
from platformio.commands.platform import platform_install as cli_platform_install
from platformio.ide.projectgenerator import ProjectGenerator
from platformio.managers.platform import PlatformManager
from platformio.project.config import ProjectConfig
from platformio.project.helpers import (get_project_include_dir,
get_project_lib_dir,
get_project_src_dir,
get_project_test_dir,
is_platformio_project)
from platformio.project.exception import NotPlatformIOProjectError
from platformio.project.helpers import is_platformio_project
@click.group(short_help="Project Manager")
def cli():
pass
@cli.command("config", short_help="Show computed configuration")
@click.option(
"-d",
"--project-dir",
default=os.getcwd,
type=click.Path(
exists=True, file_okay=True, dir_okay=True, writable=True, resolve_path=True
),
)
@click.option("--json-output", is_flag=True)
def project_config(project_dir, json_output):
if not is_platformio_project(project_dir):
raise NotPlatformIOProjectError(project_dir)
with fs.cd(project_dir):
config = ProjectConfig.get_instance()
if json_output:
return click.echo(config.to_json())
click.echo(
"Computed project configuration for %s" % click.style(project_dir, fg="cyan")
)
for section, options in config.as_tuple():
click.echo()
click.secho(section, fg="cyan")
click.echo("-" * len(section))
click.echo(
tabulate(
[
(name, "=", "\n".join(value) if isinstance(value, list) else value)
for name, value in options
],
tablefmt="plain",
)
)
return None
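The new `project config` command above resolves the effective configuration by entering the project directory and querying `ProjectConfig`. A minimal sketch of the same flow outside Click, assuming a valid `platformio.ini` exists in the (illustrative) project directory; the value joining here uses ", " purely for compact output:

# Hedged sketch of the "computed configuration" flow from the command above.
# Assumes the directory contains a valid platformio.ini; the path is illustrative.
from platformio import fs
from platformio.project.config import ProjectConfig

project_dir = "/path/to/project"  # illustrative
with fs.cd(project_dir):
    config = ProjectConfig.get_instance()
    # as_tuple() yields (section, [(name, value), ...]) pairs, as used above
    for section, options in config.as_tuple():
        print(section)
        for name, value in options:
            print("  %s = %s" % (name, ", ".join(value) if isinstance(value, list) else value))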
def validate_boards(ctx, param, value): # pylint: disable=W0613
@@ -40,66 +77,66 @@ def validate_boards(ctx, param, value): # pylint: disable=W0613
except exception.UnknownBoard:
raise click.BadParameter(
"`%s`. Please search for board ID using `platformio boards` "
"command" % id_)
"command" % id_
)
return value
@click.command("init",
short_help="Initialize PlatformIO project or update existing")
@click.option("--project-dir",
"-d",
default=getcwd,
type=click.Path(exists=True,
file_okay=False,
dir_okay=True,
writable=True,
resolve_path=True))
@click.option("-b",
"--board",
multiple=True,
metavar="ID",
callback=validate_boards)
@click.option("--ide",
type=click.Choice(ProjectGenerator.get_supported_ides()))
@cli.command("init", short_help="Initialize a project or update existing")
@click.option(
"--project-dir",
"-d",
default=os.getcwd,
type=click.Path(
exists=True, file_okay=False, dir_okay=True, writable=True, resolve_path=True
),
)
@click.option("-b", "--board", multiple=True, metavar="ID", callback=validate_boards)
@click.option("--ide", type=click.Choice(ProjectGenerator.get_supported_ides()))
@click.option("-O", "--project-option", multiple=True)
@click.option("--env-prefix", default="")
@click.option("-s", "--silent", is_flag=True)
@click.pass_context
def cli(
ctx, # pylint: disable=R0913
project_dir,
board,
ide,
project_option,
env_prefix,
silent):
def project_init(
ctx, # pylint: disable=R0913
project_dir,
board,
ide,
project_option,
env_prefix,
silent,
):
if not silent:
if project_dir == getcwd():
click.secho("\nThe current working directory",
fg="yellow",
nl=False)
if project_dir == os.getcwd():
click.secho("\nThe current working directory", fg="yellow", nl=False)
click.secho(" %s " % project_dir, fg="cyan", nl=False)
click.secho("will be used for the project.", fg="yellow")
click.echo("")
click.echo("The next files/directories have been created in %s" %
click.style(project_dir, fg="cyan"))
click.echo("%s - Put project header files here" %
click.style("include", fg="cyan"))
click.echo("%s - Put here project specific (private) libraries" %
click.style("lib", fg="cyan"))
click.echo("%s - Put project source files here" %
click.style("src", fg="cyan"))
click.echo("%s - Project Configuration File" %
click.style("platformio.ini", fg="cyan"))
click.echo(
"The next files/directories have been created in %s"
% click.style(project_dir, fg="cyan")
)
click.echo(
"%s - Put project header files here" % click.style("include", fg="cyan")
)
click.echo(
"%s - Put here project specific (private) libraries"
% click.style("lib", fg="cyan")
)
click.echo("%s - Put project source files here" % click.style("src", fg="cyan"))
click.echo(
"%s - Project Configuration File" % click.style("platformio.ini", fg="cyan")
)
is_new_project = not is_platformio_project(project_dir)
if is_new_project:
init_base_project(project_dir)
if board:
fill_project_envs(ctx, project_dir, board, project_option, env_prefix,
ide is not None)
fill_project_envs(
ctx, project_dir, board, project_option, env_prefix, ide is not None
)
if ide:
pg = ProjectGenerator(project_dir, ide, board)
@@ -115,9 +152,9 @@ def cli(
if ide:
click.secho(
"\nProject has been successfully %s including configuration files "
"for `%s` IDE." %
("initialized" if is_new_project else "updated", ide),
fg="green")
"for `%s` IDE." % ("initialized" if is_new_project else "updated", ide),
fg="green",
)
else:
click.secho(
"\nProject has been successfully %s! Useful commands:\n"
@@ -125,31 +162,34 @@ def cli(
"`pio run --target upload` or `pio run -t upload` "
"- upload firmware to a target\n"
"`pio run --target clean` - clean project (remove compiled files)"
"\n`pio run --help` - additional information" %
("initialized" if is_new_project else "updated"),
fg="green")
"\n`pio run --help` - additional information"
% ("initialized" if is_new_project else "updated"),
fg="green",
)
def init_base_project(project_dir):
ProjectConfig(join(project_dir, "platformio.ini")).save()
with fs.cd(project_dir):
config = ProjectConfig()
config.save()
dir_to_readme = [
(get_project_src_dir(), None),
(get_project_include_dir(), init_include_readme),
(get_project_lib_dir(), init_lib_readme),
(get_project_test_dir(), init_test_readme),
(config.get_optional_dir("src"), None),
(config.get_optional_dir("include"), init_include_readme),
(config.get_optional_dir("lib"), init_lib_readme),
(config.get_optional_dir("test"), init_test_readme),
]
for (path, cb) in dir_to_readme:
if isdir(path):
if os.path.isdir(path):
continue
makedirs(path)
os.makedirs(path)
if cb:
cb(path)
def init_include_readme(include_dir):
with open(join(include_dir, "README"), "w") as f:
f.write("""
with open(os.path.join(include_dir, "README"), "w") as fp:
fp.write(
"""
This directory is intended for project header files.
A header file is a file containing C declarations and macro definitions
@@ -188,12 +228,15 @@ Read more about using header files in official GCC documentation:
* Computed Includes
https://gcc.gnu.org/onlinedocs/cpp/Header-Files.html
""")
""",
)
def init_lib_readme(lib_dir):
with open(join(lib_dir, "README"), "w") as f:
f.write("""
# pylint: disable=line-too-long
with open(os.path.join(lib_dir, "README"), "w") as fp:
fp.write(
"""
This directory is intended for project specific (private) libraries.
PlatformIO will compile them to static libraries and link into executable file.
@@ -239,12 +282,14 @@ libraries scanning project source files.
More information about PlatformIO Library Dependency Finder
- https://docs.platformio.org/page/librarymanager/ldf.html
""")
""",
)
def init_test_readme(test_dir):
with open(join(test_dir, "README"), "w") as f:
f.write("""
with open(os.path.join(test_dir, "README"), "w") as fp:
fp.write(
"""
This directory is intended for PIO Unit Testing and project tests.
Unit Testing is a software testing method by which individual units of
@@ -255,15 +300,17 @@ in the development cycle.
More information about PIO Unit Testing:
- https://docs.platformio.org/page/plus/unit-testing.html
""")
""",
)
def init_ci_conf(project_dir):
conf_path = join(project_dir, ".travis.yml")
if isfile(conf_path):
conf_path = os.path.join(project_dir, ".travis.yml")
if os.path.isfile(conf_path):
return
with open(conf_path, "w") as f:
f.write("""# Continuous Integration (CI) is the practice, in software
with open(conf_path, "w") as fp:
fp.write(
"""# Continuous Integration (CI) is the practice, in software
# engineering, of merging all developer working copies with a shared mainline
# several times a day < https://docs.platformio.org/page/ci/index.html >
#
@@ -330,27 +377,27 @@ def init_ci_conf(project_dir):
#
# script:
# - platformio ci --lib="." --board=ID_1 --board=ID_2 --board=ID_N
""")
""",
)
def init_cvs_ignore(project_dir):
conf_path = join(project_dir, ".gitignore")
if isfile(conf_path):
conf_path = os.path.join(project_dir, ".gitignore")
if os.path.isfile(conf_path):
return
with open(conf_path, "w") as fp:
fp.write(".pio\n")
def fill_project_envs(ctx, project_dir, board_ids, project_option, env_prefix,
force_download):
config = ProjectConfig(join(project_dir, "platformio.ini"),
parse_extra=False)
def fill_project_envs(
ctx, project_dir, board_ids, project_option, env_prefix, force_download
):
config = ProjectConfig(
os.path.join(project_dir, "platformio.ini"), parse_extra=False
)
used_boards = []
for section in config.sections():
cond = [
section.startswith("env:"),
config.has_option(section, "board")
]
cond = [section.startswith("env:"), config.has_option(section, "board")]
if all(cond):
used_boards.append(config.get(section, "board"))
@@ -359,17 +406,17 @@ def fill_project_envs(ctx, project_dir, board_ids, project_option, env_prefix,
modified = False
for id_ in board_ids:
board_config = pm.board_config(id_)
used_platforms.append(board_config['platform'])
used_platforms.append(board_config["platform"])
if id_ in used_boards:
continue
used_boards.append(id_)
modified = True
envopts = {"platform": board_config['platform'], "board": id_}
envopts = {"platform": board_config["platform"], "board": id_}
# find default framework for board
frameworks = board_config.get("frameworks")
if frameworks:
envopts['framework'] = frameworks[0]
envopts["framework"] = frameworks[0]
for item in project_option:
if "=" not in item:
@@ -388,14 +435,12 @@ def fill_project_envs(ctx, project_dir, board_ids, project_option, env_prefix,
if modified:
config.save()
config.reset_instances()
def _install_dependent_platforms(ctx, platforms):
installed_platforms = [
p['name'] for p in PlatformManager().get_installed()
]
installed_platforms = [p["name"] for p in PlatformManager().get_installed()]
if set(platforms) <= set(installed_platforms):
return
ctx.invoke(cli_platform_install,
platforms=list(set(platforms) - set(installed_platforms)))
ctx.invoke(
cli_platform_install, platforms=list(set(platforms) - set(installed_platforms))
)

View File

@@ -12,19 +12,19 @@
# See the License for the specific language governing permissions and
# limitations under the License.
import os
import sys
import threading
from os import getcwd
from os.path import isfile, join
from tempfile import mkdtemp
from time import sleep
import click
from platformio import exception, fs
from platformio.commands.device import device_monitor as cmd_device_monitor
from platformio.compat import get_file_contents
from platformio.commands.device import helpers as device_helpers
from platformio.commands.device.command import device_monitor as cmd_device_monitor
from platformio.managers.core import pioplus_call
from platformio.project.exception import NotPlatformIOProjectError
# pylint: disable=unused-argument
@@ -43,13 +43,12 @@ def remote_agent():
@remote_agent.command("start", short_help="Start agent")
@click.option("-n", "--name")
@click.option("-s", "--share", multiple=True, metavar="E-MAIL")
@click.option("-d",
"--working-dir",
envvar="PLATFORMIO_REMOTE_AGENT_DIR",
type=click.Path(file_okay=False,
dir_okay=True,
writable=True,
resolve_path=True))
@click.option(
"-d",
"--working-dir",
envvar="PLATFORMIO_REMOTE_AGENT_DIR",
type=click.Path(file_okay=False, dir_okay=True, writable=True, resolve_path=True),
)
def remote_agent_start(**kwargs):
pioplus_call(sys.argv[1:])
@@ -64,15 +63,16 @@ def remote_agent_list():
pioplus_call(sys.argv[1:])
@cli.command("update",
short_help="Update installed Platforms, Packages and Libraries")
@click.option("-c",
"--only-check",
is_flag=True,
help="DEPRECATED. Please use `--dry-run` instead")
@click.option("--dry-run",
is_flag=True,
help="Do not update, only check for the new versions")
@cli.command("update", short_help="Update installed Platforms, Packages and Libraries")
@click.option(
"-c",
"--only-check",
is_flag=True,
help="DEPRECATED. Please use `--dry-run` instead",
)
@click.option(
"--dry-run", is_flag=True, help="Do not update, only check for the new versions"
)
def remote_update(only_check, dry_run):
pioplus_call(sys.argv[1:])
@@ -81,14 +81,14 @@ def remote_update(only_check, dry_run):
@click.option("-e", "--environment", multiple=True)
@click.option("-t", "--target", multiple=True)
@click.option("--upload-port")
@click.option("-d",
"--project-dir",
default=getcwd,
type=click.Path(exists=True,
file_okay=True,
dir_okay=True,
writable=True,
resolve_path=True))
@click.option(
"-d",
"--project-dir",
default=os.getcwd,
type=click.Path(
exists=True, file_okay=True, dir_okay=True, writable=True, resolve_path=True
),
)
@click.option("--disable-auto-clean", is_flag=True)
@click.option("-r", "--force-remote", is_flag=True)
@click.option("-s", "--silent", is_flag=True)
@@ -102,14 +102,14 @@ def remote_run(**kwargs):
@click.option("--ignore", "-i", multiple=True, metavar="<pattern>")
@click.option("--upload-port")
@click.option("--test-port")
@click.option("-d",
"--project-dir",
default=getcwd,
type=click.Path(exists=True,
file_okay=False,
dir_okay=True,
writable=True,
resolve_path=True))
@click.option(
"-d",
"--project-dir",
default=os.getcwd,
type=click.Path(
exists=True, file_okay=False, dir_okay=True, writable=True, resolve_path=True
),
)
@click.option("-r", "--force-remote", is_flag=True)
@click.option("--without-building", is_flag=True)
@click.option("--without-uploading", is_flag=True)
@@ -131,74 +131,100 @@ def device_list(json_output):
@remote_device.command("monitor", short_help="Monitor remote device")
@click.option("--port", "-p", help="Port, a number or a device name")
@click.option("--baud",
"-b",
type=int,
default=9600,
help="Set baud rate, default=9600")
@click.option("--parity",
default="N",
type=click.Choice(["N", "E", "O", "S", "M"]),
help="Set parity, default=N")
@click.option("--rtscts",
is_flag=True,
help="Enable RTS/CTS flow control, default=Off")
@click.option("--xonxoff",
is_flag=True,
help="Enable software flow control, default=Off")
@click.option("--rts",
default=None,
type=click.IntRange(0, 1),
help="Set initial RTS line state")
@click.option("--dtr",
default=None,
type=click.IntRange(0, 1),
help="Set initial DTR line state")
@click.option("--baud", "-b", type=int, help="Set baud rate, default=9600")
@click.option(
"--parity",
default="N",
type=click.Choice(["N", "E", "O", "S", "M"]),
help="Set parity, default=N",
)
@click.option("--rtscts", is_flag=True, help="Enable RTS/CTS flow control, default=Off")
@click.option(
"--xonxoff", is_flag=True, help="Enable software flow control, default=Off"
)
@click.option(
"--rts", default=None, type=click.IntRange(0, 1), help="Set initial RTS line state"
)
@click.option(
"--dtr", default=None, type=click.IntRange(0, 1), help="Set initial DTR line state"
)
@click.option("--echo", is_flag=True, help="Enable local echo, default=Off")
@click.option("--encoding",
default="UTF-8",
help="Set the encoding for the serial port (e.g. hexlify, "
"Latin1, UTF-8), default: UTF-8")
@click.option(
"--encoding",
default="UTF-8",
help="Set the encoding for the serial port (e.g. hexlify, "
"Latin1, UTF-8), default: UTF-8",
)
@click.option("--filter", "-f", multiple=True, help="Add text transformation")
@click.option("--eol",
default="CRLF",
type=click.Choice(["CR", "LF", "CRLF"]),
help="End of line mode, default=CRLF")
@click.option("--raw",
is_flag=True,
help="Do not apply any encodings/transformations")
@click.option("--exit-char",
type=int,
default=3,
help="ASCII code of special character that is used to exit "
"the application, default=3 (Ctrl+C)")
@click.option("--menu-char",
type=int,
default=20,
help="ASCII code of special character that is used to "
"control miniterm (menu), default=20 (DEC)")
@click.option("--quiet",
is_flag=True,
help="Diagnostics: suppress non-error messages, default=Off")
@click.option(
"--eol",
default="CRLF",
type=click.Choice(["CR", "LF", "CRLF"]),
help="End of line mode, default=CRLF",
)
@click.option("--raw", is_flag=True, help="Do not apply any encodings/transformations")
@click.option(
"--exit-char",
type=int,
default=3,
help="ASCII code of special character that is used to exit "
"the application, default=3 (Ctrl+C)",
)
@click.option(
"--menu-char",
type=int,
default=20,
help="ASCII code of special character that is used to "
"control miniterm (menu), default=20 (DEC)",
)
@click.option(
"--quiet",
is_flag=True,
help="Diagnostics: suppress non-error messages, default=Off",
)
@click.option(
"-d",
"--project-dir",
default=os.getcwd,
type=click.Path(exists=True, file_okay=False, dir_okay=True, resolve_path=True),
)
@click.option(
"-e",
"--environment",
help="Load configuration from `platformio.ini` and specified environment",
)
@click.pass_context
def device_monitor(ctx, **kwargs):
project_options = {}
try:
with fs.cd(kwargs["project_dir"]):
project_options = device_helpers.get_project_options(kwargs["environment"])
kwargs = device_helpers.apply_project_monitor_options(kwargs, project_options)
except NotPlatformIOProjectError:
pass
kwargs["baud"] = kwargs["baud"] or 9600
def _tx_target(sock_dir):
pioplus_argv = ["remote", "device", "monitor"]
pioplus_argv.extend(device_helpers.options_to_argv(kwargs, project_options))
pioplus_argv.extend(["--sock", sock_dir])
try:
pioplus_call(sys.argv[1:] + ["--sock", sock_dir])
pioplus_call(pioplus_argv)
except exception.ReturnErrorCode:
pass
sock_dir = mkdtemp(suffix="pioplus")
sock_file = join(sock_dir, "sock")
sock_file = os.path.join(sock_dir, "sock")
try:
t = threading.Thread(target=_tx_target, args=(sock_dir, ))
t = threading.Thread(target=_tx_target, args=(sock_dir,))
t.start()
while t.is_alive() and not isfile(sock_file):
while t.is_alive() and not os.path.isfile(sock_file):
sleep(0.1)
if not t.is_alive():
return
kwargs['port'] = get_file_contents(sock_file)
with open(sock_file) as fp:
kwargs["port"] = fp.read()
ctx.invoke(cmd_device_monitor, **kwargs)
t.join(2)
finally:

View File

@@ -11,5 +11,3 @@
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from platformio.commands.run.command import cli

View File

@@ -14,21 +14,19 @@
from multiprocessing import cpu_count
from os import getcwd
from os.path import isfile, join
from os.path import isfile
from time import time
import click
from tabulate import tabulate
from platformio import exception, fs, util
from platformio.commands.device import device_monitor as cmd_device_monitor
from platformio.commands.run.helpers import (clean_build_dir,
handle_legacy_libdeps)
from platformio import app, exception, fs, util
from platformio.commands.device.command import device_monitor as cmd_device_monitor
from platformio.commands.run.helpers import clean_build_dir, handle_legacy_libdeps
from platformio.commands.run.processor import EnvironmentProcessor
from platformio.commands.test.processor import CTX_META_TEST_IS_RUNNING
from platformio.project.config import ProjectConfig
from platformio.project.helpers import (find_project_dir_above,
get_project_build_dir)
from platformio.project.helpers import find_project_dir_above
# pylint: disable=too-many-arguments,too-many-locals,too-many-branches
@@ -42,34 +40,49 @@ except NotImplementedError:
@click.option("-e", "--environment", multiple=True)
@click.option("-t", "--target", multiple=True)
@click.option("--upload-port")
@click.option("-d",
"--project-dir",
default=getcwd,
type=click.Path(exists=True,
file_okay=True,
dir_okay=True,
writable=True,
resolve_path=True))
@click.option("-c",
"--project-conf",
type=click.Path(exists=True,
file_okay=True,
dir_okay=False,
readable=True,
resolve_path=True))
@click.option("-j",
"--jobs",
type=int,
default=DEFAULT_JOB_NUMS,
help=("Allow N jobs at once. "
"Default is a number of CPUs in a system (N=%d)" %
DEFAULT_JOB_NUMS))
@click.option(
"-d",
"--project-dir",
default=getcwd,
type=click.Path(
exists=True, file_okay=True, dir_okay=True, writable=True, resolve_path=True
),
)
@click.option(
"-c",
"--project-conf",
type=click.Path(
exists=True, file_okay=True, dir_okay=False, readable=True, resolve_path=True
),
)
@click.option(
"-j",
"--jobs",
type=int,
default=DEFAULT_JOB_NUMS,
help=(
"Allow N jobs at once. "
"Default is a number of CPUs in a system (N=%d)" % DEFAULT_JOB_NUMS
),
)
@click.option("-s", "--silent", is_flag=True)
@click.option("-v", "--verbose", is_flag=True)
@click.option("--disable-auto-clean", is_flag=True)
@click.pass_context
def cli(ctx, environment, target, upload_port, project_dir, project_conf, jobs,
silent, verbose, disable_auto_clean):
def cli(
ctx,
environment,
target,
upload_port,
project_dir,
project_conf,
jobs,
silent,
verbose,
disable_auto_clean,
):
app.set_session_var("custom_project_conf", project_conf)
# find project directory on upper level
if isfile(project_dir):
project_dir = find_project_dir_above(project_dir)
@@ -77,47 +90,58 @@ def cli(ctx, environment, target, upload_port, project_dir, project_conf, jobs,
is_test_running = CTX_META_TEST_IS_RUNNING in ctx.meta
with fs.cd(project_dir):
config = ProjectConfig.get_instance(
project_conf or join(project_dir, "platformio.ini"))
config = ProjectConfig.get_instance(project_conf)
config.validate(environment)
# clean obsolete build dir
if not disable_auto_clean:
build_dir = config.get_optional_dir("build")
try:
clean_build_dir(get_project_build_dir(), config)
clean_build_dir(build_dir, config)
except: # pylint: disable=bare-except
click.secho(
"Can not remove temporary directory `%s`. Please remove "
"it manually to avoid build issues" %
get_project_build_dir(force=True),
fg="yellow")
"it manually to avoid build issues" % build_dir,
fg="yellow",
)
handle_legacy_libdeps(project_dir, config)
default_envs = config.default_envs()
results = []
for env in config.envs():
skipenv = any([
environment and env not in environment, not environment
and default_envs and env not in default_envs
])
skipenv = any(
[
environment and env not in environment,
not environment and default_envs and env not in default_envs,
]
)
if skipenv:
results.append({"env": env})
continue
# print empty line between multi environment project
if not silent and any(
r.get("succeeded") is not None for r in results):
if not silent and any(r.get("succeeded") is not None for r in results):
click.echo()
results.append(
process_env(ctx, env, config, environment, target, upload_port,
silent, verbose, jobs, is_test_running))
process_env(
ctx,
env,
config,
environment,
target,
upload_port,
silent,
verbose,
jobs,
is_test_running,
)
)
command_failed = any(r.get("succeeded") is False for r in results)
if (not is_test_running and (command_failed or not silent)
and len(results) > 1):
if not is_test_running and (command_failed or not silent) and len(results) > 1:
print_processing_summary(results)
if command_failed:
@@ -125,24 +149,39 @@ def cli(ctx, environment, target, upload_port, project_dir, project_conf, jobs,
return True
def process_env(ctx, name, config, environments, targets, upload_port, silent,
verbose, jobs, is_test_running):
def process_env(
ctx,
name,
config,
environments,
targets,
upload_port,
silent,
verbose,
jobs,
is_test_running,
):
if not is_test_running and not silent:
print_processing_header(name, config, verbose)
ep = EnvironmentProcessor(ctx, name, config, targets, upload_port, silent,
verbose, jobs)
ep = EnvironmentProcessor(
ctx, name, config, targets, upload_port, silent, verbose, jobs
)
result = {"env": name, "duration": time(), "succeeded": ep.process()}
result['duration'] = time() - result['duration']
result["duration"] = time() - result["duration"]
# print footer on error or when is not unit testing
if not is_test_running and (not silent or not result['succeeded']):
if not is_test_running and (not silent or not result["succeeded"]):
print_processing_footer(result)
if (result['succeeded'] and "monitor" in ep.get_build_targets()
and "nobuild" not in ep.get_build_targets()):
ctx.invoke(cmd_device_monitor,
environment=environments[0] if environments else None)
if (
result["succeeded"]
and "monitor" in ep.get_build_targets()
and "nobuild" not in ep.get_build_targets()
):
ctx.invoke(
cmd_device_monitor, environment=environments[0] if environments else None
)
return result
@@ -151,10 +190,11 @@ def print_processing_header(env, config, verbose=False):
env_dump = []
for k, v in config.items(env=env):
if verbose or k in ("platform", "framework", "board"):
env_dump.append("%s: %s" %
(k, ", ".join(v) if isinstance(v, list) else v))
click.echo("Processing %s (%s)" %
(click.style(env, fg="cyan", bold=True), "; ".join(env_dump)))
env_dump.append("%s: %s" % (k, ", ".join(v) if isinstance(v, list) else v))
click.echo(
"Processing %s (%s)"
% (click.style(env, fg="cyan", bold=True), "; ".join(env_dump))
)
terminal_width, _ = click.get_terminal_size()
click.secho("-" * terminal_width, bold=True)
@@ -162,10 +202,17 @@ def print_processing_header(env, config, verbose=False):
def print_processing_footer(result):
is_failed = not result.get("succeeded")
util.print_labeled_bar(
"[%s] Took %.2f seconds" %
((click.style("FAILED", fg="red", bold=True) if is_failed else
click.style("SUCCESS", fg="green", bold=True)), result['duration']),
is_error=is_failed)
"[%s] Took %.2f seconds"
% (
(
click.style("FAILED", fg="red", bold=True)
if is_failed
else click.style("SUCCESS", fg="green", bold=True)
),
result["duration"],
),
is_error=is_failed,
)
def print_processing_summary(results):
@@ -186,20 +233,31 @@ def print_processing_summary(results):
status_str = click.style("SUCCESS", fg="green")
tabular_data.append(
(click.style(result['env'], fg="cyan"), status_str,
util.humanize_duration_time(result.get("duration"))))
(
click.style(result["env"], fg="cyan"),
status_str,
util.humanize_duration_time(result.get("duration")),
)
)
click.echo()
click.echo(tabulate(tabular_data,
headers=[
click.style(s, bold=True)
for s in ("Environment", "Status", "Duration")
]),
err=failed_nums)
click.echo(
tabulate(
tabular_data,
headers=[
click.style(s, bold=True) for s in ("Environment", "Status", "Duration")
],
),
err=failed_nums,
)
util.print_labeled_bar(
"%s%d succeeded in %s" %
("%d failed, " % failed_nums if failed_nums else "", succeeded_nums,
util.humanize_duration_time(duration)),
"%s%d succeeded in %s"
% (
"%d failed, " % failed_nums if failed_nums else "",
succeeded_nums,
util.humanize_duration_time(duration),
),
is_error=failed_nums,
fg="red" if failed_nums else "green")
fg="red" if failed_nums else "green",
)
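The `pio run` refactor above also shows the rule for which environments get processed: an environment is skipped when it is missing from an explicit `-e/--environment` list, or, with no such list, when it is not one of `default_envs`. A standalone sketch of that condition (the names are illustrative):

# Hedged sketch of the environment-selection rule used in the command above.
def should_skip(env, environment, default_envs):
    return any(
        [
            environment and env not in environment,
            not environment and default_envs and env not in default_envs,
        ]
    )

assert should_skip("native", environment=("uno",), default_envs=[])
assert not should_skip("uno", environment=(), default_envs=["uno"])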

View File

@@ -18,15 +18,14 @@ from os.path import isdir, isfile, join
import click
from platformio import fs
from platformio.project.helpers import (compute_project_checksum,
get_project_dir,
get_project_libdeps_dir)
from platformio.project.helpers import compute_project_checksum, get_project_dir
def handle_legacy_libdeps(project_dir, config):
legacy_libdeps_dir = join(project_dir, ".piolibdeps")
if (not isdir(legacy_libdeps_dir)
or legacy_libdeps_dir == get_project_libdeps_dir()):
if not isdir(legacy_libdeps_dir) or legacy_libdeps_dir == config.get_optional_dir(
"libdeps"
):
return
if not config.has_section("env"):
config.add_section("env")
@@ -37,9 +36,10 @@ def handle_legacy_libdeps(project_dir, config):
"DEPRECATED! A legacy library storage `{0}` has been found in a "
"project. \nPlease declare project dependencies in `platformio.ini`"
" file using `lib_deps` option and remove `{0}` folder."
"\nMore details -> http://docs.platformio.org/page/projectconf/"
"\nMore details -> https://docs.platformio.org/page/projectconf/"
"section_env_library.html#lib-deps".format(legacy_libdeps_dir),
fg="yellow")
fg="yellow",
)
def clean_build_dir(build_dir, config):
@@ -54,11 +54,11 @@ def clean_build_dir(build_dir, config):
if isdir(build_dir):
# check project structure
if isfile(checksum_file):
with open(checksum_file) as f:
if f.read() == checksum:
with open(checksum_file) as fp:
if fp.read() == checksum:
return
fs.rmtree(build_dir)
makedirs(build_dir)
with open(checksum_file, "w") as f:
f.write(checksum)
with open(checksum_file, "w") as fp:
fp.write(checksum)
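`clean_build_dir` above keeps the build directory only while a stored checksum of the project structure still matches, and wipes it otherwise. A minimal sketch of that guard, assuming `compute_project_checksum(config)` (imported in this file) returns the digest string; the checksum file name below is illustrative:

# Hedged sketch of the checksum guard used by clean_build_dir above.
import os
from platformio import fs
from platformio.project.helpers import compute_project_checksum

def ensure_clean_build_dir(build_dir, config):
    checksum = compute_project_checksum(config)
    checksum_file = os.path.join(build_dir, "project.checksum")  # illustrative name
    if os.path.isdir(build_dir):
        if os.path.isfile(checksum_file):
            with open(checksum_file) as fp:
                if fp.read() == checksum:
                    return  # unchanged project structure: keep cached artifacts
        fs.rmtree(build_dir)  # stale layout: drop and rebuild from scratch
    os.makedirs(build_dir)
    with open(checksum_file, "w") as fp:
        fp.write(checksum)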

View File

@@ -13,19 +13,18 @@
# limitations under the License.
from platformio import exception, telemetry
from platformio.commands.platform import \
platform_install as cmd_platform_install
from platformio.commands.platform import platform_install as cmd_platform_install
from platformio.commands.test.processor import CTX_META_TEST_RUNNING_NAME
from platformio.managers.platform import PlatformFactory
from platformio.project.exception import UndefinedEnvPlatformError
# pylint: disable=too-many-instance-attributes
class EnvironmentProcessor(object):
def __init__( # pylint: disable=too-many-arguments
self, cmd_ctx, name, config, targets, upload_port, silent, verbose,
jobs):
self, cmd_ctx, name, config, targets, upload_port, silent, verbose, jobs
):
self.cmd_ctx = cmd_ctx
self.name = name
self.config = config
@@ -40,40 +39,44 @@ class EnvironmentProcessor(object):
variables = {"pioenv": self.name, "project_config": self.config.path}
if CTX_META_TEST_RUNNING_NAME in self.cmd_ctx.meta:
variables['piotest_running_name'] = self.cmd_ctx.meta[
CTX_META_TEST_RUNNING_NAME]
variables["piotest_running_name"] = self.cmd_ctx.meta[
CTX_META_TEST_RUNNING_NAME
]
if self.upload_port:
# override upload port with a custom from CLI
variables['upload_port'] = self.upload_port
variables["upload_port"] = self.upload_port
return variables
def get_build_targets(self):
if self.targets:
return [t for t in self.targets]
return self.config.get("env:" + self.name, "targets", [])
return (
self.targets
if self.targets
else self.config.get("env:" + self.name, "targets", [])
)
def process(self):
if "platform" not in self.options:
raise exception.UndefinedEnvPlatform(self.name)
raise UndefinedEnvPlatformError(self.name)
build_vars = self.get_build_variables()
build_targets = self.get_build_targets()
build_targets = list(self.get_build_targets())
telemetry.on_run_environment(self.options, build_targets)
telemetry.send_run_environment(self.options, build_targets)
# skip monitor target, we call it above
if "monitor" in build_targets:
build_targets.remove("monitor")
try:
p = PlatformFactory.newPlatform(self.options['platform'])
p = PlatformFactory.newPlatform(self.options["platform"])
except exception.UnknownPlatform:
self.cmd_ctx.invoke(cmd_platform_install,
platforms=[self.options['platform']],
skip_default_package=True)
p = PlatformFactory.newPlatform(self.options['platform'])
self.cmd_ctx.invoke(
cmd_platform_install,
platforms=[self.options["platform"]],
skip_default_package=True,
)
p = PlatformFactory.newPlatform(self.options["platform"])
result = p.run(build_vars, build_targets, self.silent, self.verbose,
self.jobs)
return result['returncode'] == 0
result = p.run(build_vars, build_targets, self.silent, self.verbose, self.jobs)
return result["returncode"] == 0

View File

@@ -42,20 +42,24 @@ def settings_get(name):
raw_value = app.get_setting(key)
formatted_value = format_value(raw_value)
if raw_value != options['value']:
default_formatted_value = format_value(options['value'])
if raw_value != options["value"]:
default_formatted_value = format_value(options["value"])
formatted_value += "%s" % (
"\n" if len(default_formatted_value) > 10 else " ")
formatted_value += "[%s]" % click.style(default_formatted_value,
fg="yellow")
"\n" if len(default_formatted_value) > 10 else " "
)
formatted_value += "[%s]" % click.style(
default_formatted_value, fg="yellow"
)
tabular_data.append(
(click.style(key,
fg="cyan"), formatted_value, options['description']))
(click.style(key, fg="cyan"), formatted_value, options["description"])
)
click.echo(
tabulate(tabular_data,
headers=["Name", "Current value [Default]", "Description"]))
tabulate(
tabular_data, headers=["Name", "Current value [Default]", "Description"]
)
)
@cli.command("set", short_help="Set new value for the setting")

View File

@@ -11,5 +11,3 @@
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from platformio.commands.test.command import cli

View File

@@ -22,71 +22,93 @@ from time import time
import click
from tabulate import tabulate
from platformio import exception, fs, util
from platformio import app, exception, fs, util
from platformio.commands.test.embedded import EmbeddedTestProcessor
from platformio.commands.test.native import NativeTestProcessor
from platformio.project.config import ProjectConfig
from platformio.project.helpers import get_project_test_dir
@click.command("test", short_help="Unit Testing")
@click.option("--environment", "-e", multiple=True, metavar="<environment>")
@click.option("--filter",
"-f",
multiple=True,
metavar="<pattern>",
help="Filter tests by a pattern")
@click.option("--ignore",
"-i",
multiple=True,
metavar="<pattern>",
help="Ignore tests by a pattern")
@click.option(
"--filter",
"-f",
multiple=True,
metavar="<pattern>",
help="Filter tests by a pattern",
)
@click.option(
"--ignore",
"-i",
multiple=True,
metavar="<pattern>",
help="Ignore tests by a pattern",
)
@click.option("--upload-port")
@click.option("--test-port")
@click.option("-d",
"--project-dir",
default=getcwd,
type=click.Path(exists=True,
file_okay=False,
dir_okay=True,
writable=True,
resolve_path=True))
@click.option("-c",
"--project-conf",
type=click.Path(exists=True,
file_okay=True,
dir_okay=False,
readable=True,
resolve_path=True))
@click.option(
"-d",
"--project-dir",
default=getcwd,
type=click.Path(
exists=True, file_okay=False, dir_okay=True, writable=True, resolve_path=True
),
)
@click.option(
"-c",
"--project-conf",
type=click.Path(
exists=True, file_okay=True, dir_okay=False, readable=True, resolve_path=True
),
)
@click.option("--without-building", is_flag=True)
@click.option("--without-uploading", is_flag=True)
@click.option("--without-testing", is_flag=True)
@click.option("--no-reset", is_flag=True)
@click.option("--monitor-rts",
default=None,
type=click.IntRange(0, 1),
help="Set initial RTS line state for Serial Monitor")
@click.option("--monitor-dtr",
default=None,
type=click.IntRange(0, 1),
help="Set initial DTR line state for Serial Monitor")
@click.option(
"--monitor-rts",
default=None,
type=click.IntRange(0, 1),
help="Set initial RTS line state for Serial Monitor",
)
@click.option(
"--monitor-dtr",
default=None,
type=click.IntRange(0, 1),
help="Set initial DTR line state for Serial Monitor",
)
@click.option("--verbose", "-v", is_flag=True)
@click.pass_context
def cli( # pylint: disable=redefined-builtin
ctx, environment, ignore, filter, upload_port, test_port, project_dir,
project_conf, without_building, without_uploading, without_testing,
no_reset, monitor_rts, monitor_dtr, verbose):
ctx,
environment,
ignore,
filter,
upload_port,
test_port,
project_dir,
project_conf,
without_building,
without_uploading,
without_testing,
no_reset,
monitor_rts,
monitor_dtr,
verbose,
):
app.set_session_var("custom_project_conf", project_conf)
with fs.cd(project_dir):
test_dir = get_project_test_dir()
config = ProjectConfig.get_instance(project_conf)
config.validate(envs=environment)
test_dir = config.get_optional_dir("test")
if not isdir(test_dir):
raise exception.TestDirNotExists(test_dir)
test_names = get_test_names(test_dir)
config = ProjectConfig.get_instance(
project_conf or join(project_dir, "platformio.ini"))
config.validate(envs=environment)
click.echo("Verbose mode can be enabled via `-v, --verbose` option")
if not verbose:
click.echo("Verbose mode can be enabled via `-v, --verbose` option")
click.secho("Collected %d items" % len(test_names), bold=True)
results = []
@@ -99,19 +121,16 @@ def cli( # pylint: disable=redefined-builtin
# filter and ignore patterns
patterns = dict(filter=list(filter), ignore=list(ignore))
for key in patterns:
patterns[key].extend(
config.get(section, "test_%s" % key, []))
patterns[key].extend(config.get(section, "test_%s" % key, []))
skip_conditions = [
environment and envname not in environment,
not environment and default_envs
and envname not in default_envs,
testname != "*" and patterns['filter'] and
not any([fnmatch(testname, p)
for p in patterns['filter']]),
not environment and default_envs and envname not in default_envs,
testname != "*"
and any([fnmatch(testname, p)
for p in patterns['ignore']]),
and patterns["filter"]
and not any([fnmatch(testname, p) for p in patterns["filter"]]),
testname != "*"
and any([fnmatch(testname, p) for p in patterns["ignore"]]),
]
if any(skip_conditions):
results.append({"env": envname, "test": testname})
@@ -120,29 +139,37 @@ def cli( # pylint: disable=redefined-builtin
click.echo()
print_processing_header(testname, envname)
cls = (NativeTestProcessor
if config.get(section, "platform") == "native" else
EmbeddedTestProcessor)
cls = (
NativeTestProcessor
if config.get(section, "platform") == "native"
else EmbeddedTestProcessor
)
tp = cls(
ctx, testname, envname,
dict(project_config=config,
project_dir=project_dir,
upload_port=upload_port,
test_port=test_port,
without_building=without_building,
without_uploading=without_uploading,
without_testing=without_testing,
no_reset=no_reset,
monitor_rts=monitor_rts,
monitor_dtr=monitor_dtr,
verbose=verbose))
ctx,
testname,
envname,
dict(
project_config=config,
project_dir=project_dir,
upload_port=upload_port,
test_port=test_port,
without_building=without_building,
without_uploading=without_uploading,
without_testing=without_testing,
no_reset=no_reset,
monitor_rts=monitor_rts,
monitor_dtr=monitor_dtr,
verbose=verbose,
silent=not verbose,
),
)
result = {
"env": envname,
"test": testname,
"duration": time(),
"succeeded": tp.process()
"succeeded": tp.process(),
}
result['duration'] = time() - result['duration']
result["duration"] = time() - result["duration"]
results.append(result)
print_processing_footer(result)
@@ -168,8 +195,13 @@ def get_test_names(test_dir):
def print_processing_header(test, env):
click.echo("Processing %s in %s environment" % (click.style(
test, fg="yellow", bold=True), click.style(env, fg="cyan", bold=True)))
click.echo(
"Processing %s in %s environment"
% (
click.style(test, fg="yellow", bold=True),
click.style(env, fg="cyan", bold=True),
)
)
terminal_width, _ = click.get_terminal_size()
click.secho("-" * terminal_width, bold=True)
@@ -177,10 +209,17 @@ def print_processing_header(test, env):
def print_processing_footer(result):
is_failed = not result.get("succeeded")
util.print_labeled_bar(
"[%s] Took %.2f seconds" %
((click.style("FAILED", fg="red", bold=True) if is_failed else
click.style("PASSED", fg="green", bold=True)), result['duration']),
is_error=is_failed)
"[%s] Took %.2f seconds"
% (
(
click.style("FAILED", fg="red", bold=True)
if is_failed
else click.style("PASSED", fg="green", bold=True)
),
result["duration"],
),
is_error=is_failed,
)
def print_testing_summary(results):
@@ -203,20 +242,32 @@ def print_testing_summary(results):
status_str = click.style("PASSED", fg="green")
tabular_data.append(
(result['test'], click.style(result['env'], fg="cyan"), status_str,
util.humanize_duration_time(result.get("duration"))))
(
result["test"],
click.style(result["env"], fg="cyan"),
status_str,
util.humanize_duration_time(result.get("duration")),
)
)
click.echo(tabulate(tabular_data,
headers=[
click.style(s, bold=True)
for s in ("Test", "Environment", "Status",
"Duration")
]),
err=failed_nums)
click.echo(
tabulate(
tabular_data,
headers=[
click.style(s, bold=True)
for s in ("Test", "Environment", "Status", "Duration")
],
),
err=failed_nums,
)
util.print_labeled_bar(
"%s%d succeeded in %s" %
("%d failed, " % failed_nums if failed_nums else "", succeeded_nums,
util.humanize_duration_time(duration)),
"%s%d succeeded in %s"
% (
"%d failed, " % failed_nums if failed_nums else "",
succeeded_nums,
util.humanize_duration_time(duration),
),
is_error=failed_nums,
fg="red" if failed_nums else "green")
fg="red" if failed_nums else "green",
)
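The `pio test` changes above keep the shell-style pattern matching for `--filter`/`--ignore` (now also merged with `test_filter`/`test_ignore` options from `platformio.ini`). A standalone sketch of just the pattern part of the skip logic, with illustrative pattern values:

# Hedged sketch of the fnmatch-based filter/ignore matching used above.
from fnmatch import fnmatch

def should_skip_test(testname, filter_patterns, ignore_patterns):
    if testname == "*":
        return False  # the wildcard entry is never filtered out
    if filter_patterns and not any(fnmatch(testname, p) for p in filter_patterns):
        return True   # misses every --filter pattern
    return any(fnmatch(testname, p) for p in ignore_patterns)  # hits an --ignore pattern

assert not should_skip_test("test_driver", ["test_*"], [])
assert should_skip_test("test_driver", [], ["*driver*"])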

View File

@@ -27,47 +27,50 @@ class EmbeddedTestProcessor(TestProcessorBase):
SERIAL_TIMEOUT = 600
def process(self):
if not self.options['without_building']:
if not self.options["without_building"]:
self.print_progress("Building...")
target = ["__test"]
if self.options['without_uploading']:
if self.options["without_uploading"]:
target.append("checkprogsize")
if not self.build_or_upload(target):
return False
if not self.options['without_uploading']:
if not self.options["without_uploading"]:
self.print_progress("Uploading...")
target = ["upload"]
if self.options['without_building']:
if self.options["without_building"]:
target.append("nobuild")
else:
target.append("__test")
if not self.build_or_upload(target):
return False
if self.options['without_testing']:
return None
if self.options["without_testing"]:
return True
self.print_progress("Testing...")
return self.run()
def run(self):
click.echo("If you don't see any output for the first 10 secs, "
"please reset board (press reset button)")
click.echo(
"If you don't see any output for the first 10 secs, "
"please reset board (press reset button)"
)
click.echo()
try:
ser = serial.Serial(baudrate=self.get_baudrate(),
timeout=self.SERIAL_TIMEOUT)
ser = serial.Serial(
baudrate=self.get_baudrate(), timeout=self.SERIAL_TIMEOUT
)
ser.port = self.get_test_port()
ser.rts = self.options['monitor_rts']
ser.dtr = self.options['monitor_dtr']
ser.rts = self.options["monitor_rts"]
ser.dtr = self.options["monitor_dtr"]
ser.open()
except serial.SerialException as e:
click.secho(str(e), fg="red", err=True)
return False
if not self.options['no_reset']:
if not self.options["no_reset"]:
ser.flushInput()
ser.setDTR(False)
ser.setRTS(False)
@@ -90,7 +93,7 @@ class EmbeddedTestProcessor(TestProcessorBase):
if not line:
continue
if isinstance(line, bytes):
line = line.decode("utf8")
line = line.decode("utf8", "ignore")
self.on_run_out(line)
if all([l in line for l in ("Tests", "Failures", "Ignored")]):
break
@@ -105,17 +108,16 @@ class EmbeddedTestProcessor(TestProcessorBase):
return self.env_options.get("test_port")
assert set(["platform", "board"]) & set(self.env_options.keys())
p = PlatformFactory.newPlatform(self.env_options['platform'])
board_hwids = p.board_config(self.env_options['board']).get(
"build.hwids", [])
p = PlatformFactory.newPlatform(self.env_options["platform"])
board_hwids = p.board_config(self.env_options["board"]).get("build.hwids", [])
port = None
elapsed = 0
while elapsed < 5 and not port:
for item in util.get_serialports():
port = item['port']
port = item["port"]
for hwid in board_hwids:
hwid_str = ("%s:%s" % (hwid[0], hwid[1])).replace("0x", "")
if hwid_str in item['hwid']:
if hwid_str in item["hwid"]:
return port
# check if port is already configured
@@ -131,5 +133,6 @@ class EmbeddedTestProcessor(TestProcessorBase):
if not port:
raise exception.PlatformioException(
"Please specify `test_port` for environment or use "
"global `--test-port` option.")
"global `--test-port` option."
)
return port
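The embedded test processor above resolves the serial test port by matching the board's `build.hwids` pairs against the `hwid` string of each detected serial port. A standalone sketch of that lookup; the VID/PID pair below is illustrative:

# Hedged sketch of the HWID-based port lookup used above: every (VID, PID)
# pair from the board manifest is normalized to "VID:PID" without "0x" and
# searched for inside each serial port's hwid string.
from platformio import util

def find_port_by_hwids(board_hwids):
    for item in util.get_serialports():
        for hwid in board_hwids:
            hwid_str = ("%s:%s" % (hwid[0], hwid[1])).replace("0x", "")
            if hwid_str in item["hwid"]:
                return item["port"]
    return None

port = find_port_by_hwids([["0x2341", "0x0043"]])  # illustrative Uno-style VID/PID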

View File

@@ -14,30 +14,28 @@
from os.path import join
from platformio import fs, proc
from platformio import proc
from platformio.commands.test.processor import TestProcessorBase
from platformio.proc import LineBufferedAsyncPipe
from platformio.project.helpers import get_project_build_dir
class NativeTestProcessor(TestProcessorBase):
def process(self):
if not self.options['without_building']:
if not self.options["without_building"]:
self.print_progress("Building...")
if not self.build_or_upload(["__test"]):
return False
if self.options['without_testing']:
if self.options["without_testing"]:
return None
self.print_progress("Testing...")
return self.run()
def run(self):
with fs.cd(self.options['project_dir']):
build_dir = get_project_build_dir()
build_dir = self.options["project_config"].get_optional_dir("build")
result = proc.exec_command(
[join(build_dir, self.env_name, "program")],
stdout=LineBufferedAsyncPipe(self.on_run_out),
stderr=LineBufferedAsyncPipe(self.on_run_out))
stderr=LineBufferedAsyncPipe(self.on_run_out),
)
assert "returncode" in result
return result['returncode'] == 0 and not self._run_failed
return result["returncode"] == 0 and not self._run_failed

View File

@@ -20,7 +20,6 @@ from string import Template
import click
from platformio import exception
from platformio.project.helpers import get_project_test_dir
TRANSPORT_OPTIONS = {
"arduino": {
@@ -29,7 +28,7 @@ TRANSPORT_OPTIONS = {
"putchar": "Serial.write(c)",
"flush": "Serial.flush()",
"begin": "Serial.begin($baudrate)",
"end": "Serial.end()"
"end": "Serial.end()",
},
"mbed": {
"include": "#include <mbed.h>",
@@ -37,7 +36,7 @@ TRANSPORT_OPTIONS = {
"putchar": "pc.putc(c)",
"flush": "",
"begin": "pc.baud($baudrate)",
"end": ""
"end": "",
},
"espidf": {
"include": "#include <stdio.h>",
@@ -45,7 +44,7 @@ TRANSPORT_OPTIONS = {
"putchar": "putchar(c)",
"flush": "fflush(stdout)",
"begin": "",
"end": ""
"end": "",
},
"native": {
"include": "#include <stdio.h>",
@@ -53,7 +52,7 @@ TRANSPORT_OPTIONS = {
"putchar": "putchar(c)",
"flush": "fflush(stdout)",
"begin": "",
"end": ""
"end": "",
},
"custom": {
"include": '#include "unittest_transport.h"',
@@ -61,8 +60,8 @@ TRANSPORT_OPTIONS = {
"putchar": "unittest_uart_putchar(c)",
"flush": "unittest_uart_flush()",
"begin": "unittest_uart_begin()",
"end": "unittest_uart_end()"
}
"end": "unittest_uart_end()",
},
}
CTX_META_TEST_IS_RUNNING = __name__ + ".test_running"
@@ -79,21 +78,24 @@ class TestProcessorBase(object):
self.test_name = testname
self.options = options
self.env_name = envname
self.env_options = options['project_config'].items(env=envname,
as_dict=True)
self.env_options = options["project_config"].items(env=envname, as_dict=True)
self._run_failed = False
self._outputcpp_generated = False
def get_transport(self):
transport = None
if self.env_options.get("platform") == "native":
transport = "native"
elif "framework" in self.env_options:
transport = self.env_options.get("framework")[0]
if "test_transport" in self.env_options:
transport = self.env_options['test_transport']
transport = self.env_options["test_transport"]
if transport not in TRANSPORT_OPTIONS:
raise exception.PlatformioException(
"Unknown Unit Test transport `%s`" % transport)
"Unknown Unit Test transport `%s`. Please check a documentation how "
"to create an own 'Test Transport':\n"
"- https://docs.platformio.org/page/plus/unit-testing.html" % transport
)
return transport.lower()
def get_baudrate(self):
@@ -104,21 +106,28 @@ class TestProcessorBase(object):
def build_or_upload(self, target):
if not self._outputcpp_generated:
self.generate_outputcpp(get_project_test_dir())
self.generate_outputcpp(
self.options["project_config"].get_optional_dir("test")
)
self._outputcpp_generated = True
if self.test_name != "*":
self.cmd_ctx.meta[CTX_META_TEST_RUNNING_NAME] = self.test_name
try:
from platformio.commands.run import cli as cmd_run
return self.cmd_ctx.invoke(cmd_run,
project_dir=self.options['project_dir'],
upload_port=self.options['upload_port'],
silent=not self.options['verbose'],
environment=[self.env_name],
disable_auto_clean="nobuild" in target,
target=target)
# pylint: disable=import-outside-toplevel
from platformio.commands.run.command import cli as cmd_run
return self.cmd_ctx.invoke(
cmd_run,
project_dir=self.options["project_dir"],
upload_port=self.options["upload_port"],
verbose=self.options["verbose"],
silent=self.options["silent"],
environment=[self.env_name],
disable_auto_clean="nobuild" in target,
target=target,
)
except exception.ReturnErrorCode:
return False
@@ -131,8 +140,7 @@ class TestProcessorBase(object):
def on_run_out(self, line):
line = line.strip()
if line.endswith(":PASS"):
click.echo("%s\t[%s]" %
(line[:-5], click.style("PASSED", fg="green")))
click.echo("%s\t[%s]" % (line[:-5], click.style("PASSED", fg="green")))
elif ":FAIL" in line:
self._run_failed = True
click.echo("%s\t[%s]" % (line, click.style("FAILED", fg="red")))
@@ -142,36 +150,38 @@ class TestProcessorBase(object):
def generate_outputcpp(self, test_dir):
assert isdir(test_dir)
cpp_tpl = "\n".join([
"$include",
"#include <output_export.h>",
"",
"$object",
"",
"#ifdef __GNUC__",
"void output_start(unsigned int baudrate __attribute__((unused)))",
"#else",
"void output_start(unsigned int baudrate)",
"#endif",
"{",
" $begin;",
"}",
"",
"void output_char(int c)",
"{",
" $putchar;",
"}",
"",
"void output_flush(void)",
"{",
" $flush;",
"}",
"",
"void output_complete(void)",
"{",
" $end;",
"}"
]) # yapf: disable
cpp_tpl = "\n".join(
[
"$include",
"#include <output_export.h>",
"",
"$object",
"",
"#ifdef __GNUC__",
"void output_start(unsigned int baudrate __attribute__((unused)))",
"#else",
"void output_start(unsigned int baudrate)",
"#endif",
"{",
" $begin;",
"}",
"",
"void output_char(int c)",
"{",
" $putchar;",
"}",
"",
"void output_flush(void)",
"{",
" $flush;",
"}",
"",
"void output_complete(void)",
"{",
" $end;",
"}",
]
)
def delete_tmptest_file(file_):
try:
@@ -181,14 +191,14 @@ class TestProcessorBase(object):
click.secho(
"Warning: Could not remove temporary file '%s'. "
"Please remove it manually." % file_,
fg="yellow")
fg="yellow",
)
tpl = Template(cpp_tpl).substitute(
TRANSPORT_OPTIONS[self.get_transport()])
tpl = Template(cpp_tpl).substitute(TRANSPORT_OPTIONS[self.get_transport()])
data = Template(tpl).substitute(baudrate=self.get_baudrate())
tmp_file = join(test_dir, "output_export.cpp")
with open(tmp_file, "w") as f:
f.write(data)
with open(tmp_file, "w") as fp:
fp.write(data)
atexit.register(delete_tmptest_file, tmp_file)
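`generate_outputcpp` above renders the test output transport shim in two passes of `string.Template`: transport snippets first, baud rate second. A trimmed sketch with an illustrative transport dict and baud rate (the real code pulls these from `TRANSPORT_OPTIONS` and the environment settings):

# Hedged sketch of the two-stage Template substitution used above to render
# output_export.cpp: transport snippets first, then the baud rate.
# The transport dict below is a trimmed, illustrative subset.
from string import Template

cpp_tpl = "\n".join(
    [
        "$include",
        "void output_start(unsigned int baudrate) { $begin; }",
        "void output_char(int c) { $putchar; }",
    ]
)
transport = {
    "include": "#include <Arduino.h>",
    "begin": "Serial.begin($baudrate)",
    "putchar": "Serial.write(c)",
}
tpl = Template(cpp_tpl).substitute(transport)      # first pass: transport snippets
data = Template(tpl).substitute(baudrate=115200)   # second pass: baud rate
print(data)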

View File

@@ -22,18 +22,19 @@ from platformio.managers.core import update_core_packages
from platformio.managers.lib import LibraryManager
@click.command("update",
short_help="Update installed platforms, packages and libraries")
@click.option("--core-packages",
is_flag=True,
help="Update only the core packages")
@click.option("-c",
"--only-check",
is_flag=True,
help="DEPRECATED. Please use `--dry-run` instead")
@click.option("--dry-run",
is_flag=True,
help="Do not update, only check for the new versions")
@click.command(
"update", short_help="Update installed platforms, packages and libraries"
)
@click.option("--core-packages", is_flag=True, help="Update only the core packages")
@click.option(
"-c",
"--only-check",
is_flag=True,
help="DEPRECATED. Please use `--dry-run` instead",
)
@click.option(
"--dry-run", is_flag=True, help="Do not update, only check for the new versions"
)
@click.pass_context
def cli(ctx, core_packages, only_check, dry_run):
# cleanup lib search results, cached board and platform lists

View File

@@ -19,27 +19,29 @@ from zipfile import ZipFile
import click
import requests
from platformio import VERSION, __version__, exception, util
from platformio import VERSION, __version__, app, exception, util
from platformio.compat import WINDOWS
from platformio.proc import exec_command, get_pythonexe_path
from platformio.project.helpers import get_project_cache_dir
@click.command("upgrade",
short_help="Upgrade PlatformIO to the latest version")
@click.command("upgrade", short_help="Upgrade PlatformIO to the latest version")
@click.option("--dev", is_flag=True, help="Use development branch")
def cli(dev):
if not dev and __version__ == get_latest_version():
return click.secho(
"You're up-to-date!\nPlatformIO %s is currently the "
"newest version available." % __version__,
fg="green")
fg="green",
)
click.secho("Please wait while upgrading PlatformIO ...", fg="yellow")
to_develop = dev or not all(c.isdigit() for c in __version__ if c != ".")
cmds = (["pip", "install", "--upgrade",
get_pip_package(to_develop)], ["platformio", "--version"])
cmds = (
["pip", "install", "--upgrade", get_pip_package(to_develop)],
["platformio", "--version"],
)
cmd = None
r = {}
@@ -49,26 +51,30 @@ def cli(dev):
r = exec_command(cmd)
# try pip with disabled cache
if r['returncode'] != 0 and cmd[2] == "pip":
if r["returncode"] != 0 and cmd[2] == "pip":
cmd.insert(3, "--no-cache-dir")
r = exec_command(cmd)
assert r['returncode'] == 0
assert "version" in r['out']
actual_version = r['out'].strip().split("version", 1)[1].strip()
click.secho("PlatformIO has been successfully upgraded to %s" %
actual_version,
fg="green")
assert r["returncode"] == 0
assert "version" in r["out"]
actual_version = r["out"].strip().split("version", 1)[1].strip()
click.secho(
"PlatformIO has been successfully upgraded to %s" % actual_version,
fg="green",
)
click.echo("Release notes: ", nl=False)
click.secho("https://docs.platformio.org/en/latest/history.html",
fg="cyan")
click.secho("https://docs.platformio.org/en/latest/history.html", fg="cyan")
if app.get_session_var("caller_id"):
click.secho(
"Warning! Please restart IDE to affect PIO Home changes", fg="yellow"
)
except Exception as e: # pylint: disable=broad-except
if not r:
raise exception.UpgradeError("\n".join([str(cmd), str(e)]))
permission_errors = ("permission denied", "not permitted")
if (any(m in r['err'].lower() for m in permission_errors)
and not WINDOWS):
click.secho("""
if any(m in r["err"].lower() for m in permission_errors) and not WINDOWS:
click.secho(
"""
-----------------
Permission denied
-----------------
@@ -78,10 +84,11 @@ You need the `sudo` permission to install Python packages. Try
WARNING! Don't use `sudo` for the rest PlatformIO commands.
""",
fg="yellow",
err=True)
fg="yellow",
err=True,
)
raise exception.ReturnErrorCode(1)
raise exception.UpgradeError("\n".join([str(cmd), r['out'], r['err']]))
raise exception.UpgradeError("\n".join([str(cmd), r["out"], r["err"]]))
return True
@@ -89,18 +96,17 @@ WARNING! Don't use `sudo` for the rest PlatformIO commands.
def get_pip_package(to_develop):
if not to_develop:
return "platformio"
dl_url = ("https://github.com/platformio/"
"platformio-core/archive/develop.zip")
dl_url = "https://github.com/platformio/platformio-core/archive/develop.zip"
cache_dir = get_project_cache_dir()
if not os.path.isdir(cache_dir):
os.makedirs(cache_dir)
pkg_name = os.path.join(cache_dir, "piocoredevelop.zip")
try:
with open(pkg_name, "w") as fp:
r = exec_command(["curl", "-fsSL", dl_url],
stdout=fp,
universal_newlines=True)
assert r['returncode'] == 0
r = exec_command(
["curl", "-fsSL", dl_url], stdout=fp, universal_newlines=True
)
assert r["returncode"] == 0
# check ZIP structure
with ZipFile(pkg_name) as zp:
assert zp.testzip() is None
@@ -127,7 +133,8 @@ def get_develop_latest_version():
r = requests.get(
"https://raw.githubusercontent.com/platformio/platformio"
"/develop/platformio/__init__.py",
headers=util.get_request_defheaders())
headers=util.get_request_defheaders(),
)
r.raise_for_status()
for line in r.text.split("\n"):
line = line.strip()
@@ -145,7 +152,8 @@ def get_develop_latest_version():
def get_pypi_latest_version():
r = requests.get("https://pypi.org/pypi/platformio/json",
headers=util.get_request_defheaders())
r = requests.get(
"https://pypi.org/pypi/platformio/json", headers=util.get_request_defheaders()
)
r.raise_for_status()
return r.json()['info']['version']
return r.json()["info"]["version"]
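
The hunk above only reformats the PyPI version lookup into a single `requests.get` call. As a minimal, self-contained sketch of what that lookup does — assuming `requests` is installed, and substituting a plain header dict for the `util.get_request_defheaders()` helper used in the real code:

import requests

def get_pypi_latest_version(package="platformio"):
    # Ask PyPI's JSON API for the package metadata and return the latest release string.
    resp = requests.get(
        "https://pypi.org/pypi/%s/json" % package,
        headers={"User-Agent": "pio-upgrade-sketch"},  # stand-in for util.get_request_defheaders()
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["info"]["version"]

if __name__ == "__main__":
    print(get_pypi_latest_version())

The upgrade command compares this value against the running `__version__` before deciding whether to invoke pip.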

View File

@@ -12,24 +12,46 @@
# See the License for the specific language governing permissions and
# limitations under the License.
# pylint: disable=unused-import
# pylint: disable=unused-import, no-name-in-module, import-error,
# pylint: disable=no-member, undefined-variable
import inspect
import json
import locale
import os
import re
import sys
PY2 = sys.version_info[0] == 2
CYGWIN = sys.platform.startswith('cygwin')
WINDOWS = sys.platform.startswith('win')
CYGWIN = sys.platform.startswith("cygwin")
WINDOWS = sys.platform.startswith("win")
def get_filesystem_encoding():
return sys.getfilesystemencoding() or sys.getdefaultencoding()
def get_locale_encoding():
try:
return locale.getdefaultlocale()[1]
except ValueError:
return None
def get_object_members(obj, ignore_private=True):
members = inspect.getmembers(obj, lambda a: not inspect.isroutine(a))
if not ignore_private:
return members
return {
item[0]: item[1]
for item in members
if not (item[0].startswith("__") and item[0].endswith("__"))
}
if PY2:
# pylint: disable=undefined-variable
import imp
string_types = (str, unicode)
def is_bytes(x):
@@ -38,11 +60,7 @@ if PY2:
def path_to_unicode(path):
if isinstance(path, unicode):
return path
return path.decode(get_filesystem_encoding()).encode("utf-8")
def get_file_contents(path):
with open(path) as f:
return f.read()
return path.decode(get_filesystem_encoding())
def hashlib_encode_data(data):
if is_bytes(data):
@@ -56,13 +74,12 @@ if PY2:
def dump_json_to_unicode(obj):
if isinstance(obj, unicode):
return obj
return json.dumps(obj,
encoding=get_filesystem_encoding(),
ensure_ascii=False,
sort_keys=True).encode("utf8")
return json.dumps(
obj, encoding=get_filesystem_encoding(), ensure_ascii=False, sort_keys=True
).encode("utf8")
_magic_check = re.compile('([*?[])')
_magic_check_bytes = re.compile(b'([*?[])')
_magic_check = re.compile("([*?[])")
_magic_check_bytes = re.compile(b"([*?[])")
def glob_escape(pathname):
"""Escape all special characters."""
@@ -72,14 +89,20 @@ if PY2:
# escaped.
drive, pathname = os.path.splitdrive(pathname)
if isinstance(pathname, bytes):
pathname = _magic_check_bytes.sub(br'[\1]', pathname)
pathname = _magic_check_bytes.sub(br"[\1]", pathname)
else:
pathname = _magic_check.sub(r'[\1]', pathname)
pathname = _magic_check.sub(r"[\1]", pathname)
return drive + pathname
else:
from glob import escape as glob_escape # pylint: disable=no-name-in-module
string_types = (str, )
def load_python_module(name, pathname):
return imp.load_source(name, pathname)
else:
import importlib.util
from glob import escape as glob_escape
string_types = (str,)
def is_bytes(x):
return isinstance(x, (bytes, memoryview, bytearray))
@@ -87,14 +110,6 @@ else:
def path_to_unicode(path):
return path
def get_file_contents(path):
try:
with open(path) as f:
return f.read()
except UnicodeDecodeError:
with open(path, encoding="latin-1") as f:
return f.read()
def hashlib_encode_data(data):
if is_bytes(data):
return data
@@ -106,3 +121,9 @@ else:
if isinstance(obj, string_types):
return obj
return json.dumps(obj, ensure_ascii=False, sort_keys=True)
def load_python_module(name, pathname):
spec = importlib.util.spec_from_file_location(name, pathname)
module = importlib.util.module_from_spec(spec)
spec.loader.exec_module(module)
return module
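
The Python 3 branch above replaces the deprecated `imp.load_source` with an `importlib.util`-based loader. A small round-trip sketch of that pattern, using a throwaway module file created just for illustration:

import importlib.util
import os
import tempfile

def load_python_module(name, pathname):
    # Same importlib-based loader the diff introduces for Python 3.
    spec = importlib.util.spec_from_file_location(name, pathname)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    return module

# Hypothetical usage: write a tiny module to disk and load it by path.
with tempfile.TemporaryDirectory() as tmp:
    path = os.path.join(tmp, "extra.py")
    with open(path, "w") as fp:
        fp.write("ANSWER = 42\n")
    mod = load_python_module("extra", path)
    assert mod.ANSWER == 42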

View File

@@ -12,39 +12,45 @@
# See the License for the specific language governing permissions and
# limitations under the License.
import hashlib
import io
import math
import sys
from email.utils import parsedate_tz
from math import ceil
from os.path import getsize, join
from sys import version_info
from time import mktime
import click
import requests
from platformio import util
from platformio.exception import (FDSHASumMismatch, FDSizeMismatch,
FDUnrecognizedStatusCode)
from platformio.proc import exec_command
from platformio.exception import (
FDSHASumMismatch,
FDSizeMismatch,
FDUnrecognizedStatusCode,
)
class FileDownloader(object):
CHUNK_SIZE = 1024
def __init__(self, url, dest_dir=None):
self._request = None
# make connection
self._request = requests.get(url,
stream=True,
headers=util.get_request_defheaders(),
verify=version_info >= (2, 7, 9))
self._request = requests.get(
url,
stream=True,
headers=util.get_request_defheaders(),
verify=sys.version_info >= (2, 7, 9),
)
if self._request.status_code != 200:
raise FDUnrecognizedStatusCode(self._request.status_code, url)
disposition = self._request.headers.get("content-disposition")
if disposition and "filename=" in disposition:
self._fname = disposition[disposition.index("filename=") +
9:].replace('"', "").replace("'", "")
self._fname = (
disposition[disposition.index("filename=") + 9 :]
.replace('"', "")
.replace("'", "")
)
else:
self._fname = [p for p in url.split("/") if p][-1]
self._fname = str(self._fname)
@@ -64,20 +70,21 @@ class FileDownloader(object):
def get_size(self):
if "content-length" not in self._request.headers:
return -1
return int(self._request.headers['content-length'])
return int(self._request.headers["content-length"])
def start(self, with_progress=True):
def start(self, with_progress=True, silent=False):
label = "Downloading"
itercontent = self._request.iter_content(chunk_size=self.CHUNK_SIZE)
itercontent = self._request.iter_content(chunk_size=io.DEFAULT_BUFFER_SIZE)
f = open(self._destination, "wb")
try:
if not with_progress or self.get_size() == -1:
click.echo("%s..." % label)
if not silent:
click.echo("%s..." % label)
for chunk in itercontent:
if chunk:
f.write(chunk)
else:
chunks = int(ceil(self.get_size() / float(self.CHUNK_SIZE)))
chunks = int(math.ceil(self.get_size() / float(io.DEFAULT_BUFFER_SIZE)))
with click.progressbar(length=chunks, label=label) as pb:
for _ in pb:
f.write(next(itercontent))
@@ -94,25 +101,19 @@ class FileDownloader(object):
_dlsize = getsize(self._destination)
if self.get_size() != -1 and _dlsize != self.get_size():
raise FDSizeMismatch(_dlsize, self._fname, self.get_size())
if not sha1:
return None
dlsha1 = None
try:
result = exec_command(["sha1sum", self._destination])
dlsha1 = result['out']
except (OSError, ValueError):
try:
result = exec_command(["shasum", "-a", "1", self._destination])
dlsha1 = result['out']
except (OSError, ValueError):
pass
if not dlsha1:
return None
dlsha1 = dlsha1[1:41] if dlsha1.startswith("\\") else dlsha1[:40]
if sha1.lower() != dlsha1.lower():
raise FDSHASumMismatch(dlsha1, self._fname, sha1)
checksum = hashlib.sha1()
with io.open(self._destination, "rb", buffering=0) as fp:
while True:
chunk = fp.read(io.DEFAULT_BUFFER_SIZE)
if not chunk:
break
checksum.update(chunk)
if sha1.lower() != checksum.hexdigest().lower():
raise FDSHASumMismatch(checksum.hexdigest(), self._fname, sha1)
return True
def _preserve_filemtime(self, lmdate):
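
The verification hunk above drops the `sha1sum`/`shasum` subprocess calls in favor of an in-process `hashlib` digest computed in `io.DEFAULT_BUFFER_SIZE` chunks, which also removes the dependency on external checksum binaries. A standalone sketch of that check, assuming a local file path and an expected hex digest supplied by the caller:

import hashlib
import io

def verify_sha1(path, expected_sha1):
    # Stream the file in buffered chunks so large downloads are never read into memory at once.
    checksum = hashlib.sha1()
    with io.open(path, "rb", buffering=0) as fp:
        while True:
            chunk = fp.read(io.DEFAULT_BUFFER_SIZE)
            if not chunk:
                break
            checksum.update(chunk)
    if expected_sha1.lower() != checksum.hexdigest().lower():
        raise ValueError("SHA1 mismatch for %s" % path)
    return True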

View File

@@ -64,8 +64,10 @@ class IncompatiblePlatform(PlatformioException):
class PlatformNotInstalledYet(PlatformioException):
MESSAGE = ("The platform '{0}' has not been installed yet. "
"Use `platformio platform install {0}` command")
MESSAGE = (
"The platform '{0}' has not been installed yet. "
"Use `platformio platform install {0}` command"
)
class UnknownBoard(PlatformioException):
@@ -102,22 +104,27 @@ class MissingPackageManifest(PlatformIOPackageException):
class UndefinedPackageVersion(PlatformIOPackageException):
MESSAGE = ("Could not find a version that satisfies the requirement '{0}'"
" for your system '{1}'")
MESSAGE = (
"Could not find a version that satisfies the requirement '{0}'"
" for your system '{1}'"
)
class PackageInstallError(PlatformIOPackageException):
MESSAGE = ("Could not install '{0}' with version requirements '{1}' "
"for your system '{2}'.\n\n"
"Please try this solution -> http://bit.ly/faq-package-manager")
MESSAGE = (
"Could not install '{0}' with version requirements '{1}' "
"for your system '{2}'.\n\n"
"Please try this solution -> http://bit.ly/faq-package-manager"
)
class ExtractArchiveItemError(PlatformIOPackageException):
MESSAGE = (
"Could not extract `{0}` to `{1}`. Try to disable antivirus "
"tool or check this solution -> http://bit.ly/faq-package-manager")
"tool or check this solution -> http://bit.ly/faq-package-manager"
)
class UnsupportedArchiveType(PlatformIOPackageException):
@@ -132,56 +139,17 @@ class FDUnrecognizedStatusCode(PlatformIOPackageException):
class FDSizeMismatch(PlatformIOPackageException):
MESSAGE = ("The size ({0:d} bytes) of downloaded file '{1}' "
"is not equal to remote size ({2:d} bytes)")
MESSAGE = (
"The size ({0:d} bytes) of downloaded file '{1}' "
"is not equal to remote size ({2:d} bytes)"
)
class FDSHASumMismatch(PlatformIOPackageException):
MESSAGE = ("The 'sha1' sum '{0}' of downloaded file '{1}' "
"is not equal to remote '{2}'")
#
# Project
#
class PlatformIOProjectException(PlatformioException):
pass
class NotPlatformIOProject(PlatformIOProjectException):
MESSAGE = (
"Not a PlatformIO project. `platformio.ini` file has not been "
"found in current working directory ({0}). To initialize new project "
"please use `platformio init` command")
class InvalidProjectConf(PlatformIOProjectException):
MESSAGE = ("Invalid '{0}' (project configuration file): '{1}'")
class UndefinedEnvPlatform(PlatformIOProjectException):
MESSAGE = "Please specify platform for '{0}' environment"
class ProjectEnvsNotAvailable(PlatformIOProjectException):
MESSAGE = "Please setup environments in `platformio.ini` file"
class UnknownEnvNames(PlatformIOProjectException):
MESSAGE = "Unknown environment names '{0}'. Valid names are '{1}'"
class ProjectOptionValueError(PlatformIOProjectException):
MESSAGE = "{0} for option `{1}` in section [{2}]"
"The 'sha1' sum '{0}' of downloaded file '{1}' is not equal to remote '{2}'"
)
#
@@ -191,9 +159,11 @@ class ProjectOptionValueError(PlatformIOProjectException):
class LibNotFound(PlatformioException):
MESSAGE = ("Library `{0}` has not been found in PlatformIO Registry.\n"
"You can ignore this message, if `{0}` is a built-in library "
"(included in framework, SDK). E.g., SPI, Wire, etc.")
MESSAGE = (
"Library `{0}` has not been found in PlatformIO Registry.\n"
"You can ignore this message, if `{0}` is a built-in library "
"(included in framework, SDK). E.g., SPI, Wire, etc."
)
class NotGlobalLibDir(UserSideException):
@@ -203,7 +173,8 @@ class NotGlobalLibDir(UserSideException):
"To manage libraries in global storage `{1}`,\n"
"please use `platformio lib --global {2}` or specify custom storage "
"`platformio lib --storage-dir /path/to/storage/ {2}`.\n"
"Check `platformio lib --help` for details.")
"Check `platformio lib --help` for details."
)
class InvalidLibConfURL(PlatformioException):
@@ -224,7 +195,8 @@ class MissedUdevRules(InvalidUdevRules):
MESSAGE = (
"Warning! Please install `99-platformio-udev.rules`. \nMode details: "
"https://docs.platformio.org/en/latest/faq.html#platformio-udev-rules")
"https://docs.platformio.org/en/latest/faq.html#platformio-udev-rules"
)
class OutdatedUdevRules(InvalidUdevRules):
@@ -232,7 +204,8 @@ class OutdatedUdevRules(InvalidUdevRules):
MESSAGE = (
"Warning! Your `{0}` are outdated. Please update or reinstall them."
"\n Mode details: https://docs.platformio.org"
"/en/latest/faq.html#platformio-udev-rules")
"/en/latest/faq.html#platformio-udev-rules"
)
#
@@ -260,7 +233,8 @@ class InternetIsOffline(UserSideException):
MESSAGE = (
"You are not connected to the Internet.\n"
"If you build a project first time, we need Internet connection "
"to install all dependencies and toolchains.")
"to install all dependencies and toolchains."
)
class BuildScriptNotFound(PlatformioException):
@@ -285,9 +259,11 @@ class InvalidJSONFile(PlatformioException):
class CIBuildEnvsEmpty(PlatformioException):
MESSAGE = ("Can't find PlatformIO build environments.\n"
"Please specify `--board` or path to `platformio.ini` with "
"predefined environments using `--project-conf` option")
MESSAGE = (
"Can't find PlatformIO build environments.\n"
"Please specify `--board` or path to `platformio.ini` with "
"predefined environments using `--project-conf` option"
)
class UpgradeError(PlatformioException):
@@ -300,39 +276,31 @@ class UpgradeError(PlatformioException):
"""
class HomeDirPermissionsError(PlatformioException):
class HomeDirPermissionsError(UserSideException):
MESSAGE = (
"The directory `{0}` or its parent directory is not owned by the "
"current user and PlatformIO can not store configuration data.\n"
"Please check the permissions and owner of that directory.\n"
"Otherwise, please remove manually `{0}` directory and PlatformIO "
"will create new from the current user.")
"will create new from the current user."
)
class CygwinEnvDetected(PlatformioException):
MESSAGE = ("PlatformIO does not work within Cygwin environment. "
"Use native Terminal instead.")
class DebugSupportError(PlatformioException):
MESSAGE = (
"Currently, PlatformIO does not support debugging for `{0}`.\n"
"Please request support at https://github.com/platformio/"
"platformio-core/issues \nor visit -> https://docs.platformio.org"
"/page/plus/debugging.html")
class DebugInvalidOptions(PlatformioException):
pass
"PlatformIO does not work within Cygwin environment. "
"Use native Terminal instead."
)
class TestDirNotExists(PlatformioException):
MESSAGE = "A test folder '{0}' does not exist.\nPlease create 'test' "\
"directory in project's root and put a test set.\n"\
"More details about Unit "\
"Testing: http://docs.platformio.org/page/plus/"\
"unit-testing.html"
MESSAGE = (
"A test folder '{0}' does not exist.\nPlease create 'test' "
"directory in project's root and put a test set.\n"
"More details about Unit "
"Testing: https://docs.platformio.org/page/plus/"
"unit-testing.html"
)

View File

@@ -23,11 +23,10 @@ from glob import glob
import click
from platformio import exception
from platformio.compat import get_file_contents, glob_escape
from platformio.compat import WINDOWS, glob_escape
class cd(object):
def __init__(self, new_path):
self.new_path = new_path
self.prev_path = os.getcwd()
@@ -40,7 +39,7 @@ class cd(object):
def get_source_dir():
curpath = os.path.abspath(__file__)
curpath = os.path.realpath(__file__)
if not os.path.isfile(curpath):
for p in sys.path:
if os.path.isfile(os.path.join(p, __file__)):
@@ -65,34 +64,40 @@ def format_filesize(filesize):
if filesize < base:
return "%d%s" % (filesize, suffix)
for i, suffix in enumerate("KMGTPEZY"):
unit = base**(i + 2)
unit = base ** (i + 2)
if filesize >= unit:
continue
if filesize % (base**(i + 1)):
if filesize % (base ** (i + 1)):
return "%.2f%sB" % ((base * filesize / unit), suffix)
break
return "%d%sB" % ((base * filesize / unit), suffix)
def ensure_udev_rules():
from platformio.util import get_systype
from platformio.util import get_systype # pylint: disable=import-outside-toplevel
def _rules_to_set(rules_path):
return set(l.strip() for l in get_file_contents(rules_path).split("\n")
if l.strip() and not l.startswith("#"))
result = set()
with open(rules_path) as fp:
for line in fp.readlines():
line = line.strip()
if not line or line.startswith("#"):
continue
result.add(line)
return result
if "linux" not in get_systype():
return None
installed_rules = [
"/etc/udev/rules.d/99-platformio-udev.rules",
"/lib/udev/rules.d/99-platformio-udev.rules"
"/lib/udev/rules.d/99-platformio-udev.rules",
]
if not any(os.path.isfile(p) for p in installed_rules):
raise exception.MissedUdevRules
origin_path = os.path.abspath(
os.path.join(get_source_dir(), "..", "scripts",
"99-platformio-udev.rules"))
origin_path = os.path.realpath(
os.path.join(get_source_dir(), "..", "scripts", "99-platformio-udev.rules")
)
if not os.path.isfile(origin_path):
return None
@@ -116,11 +121,10 @@ def path_endswith_ext(path, extensions):
return False
def match_src_files(src_dir, src_filter=None, src_exts=None):
def match_src_files(src_dir, src_filter=None, src_exts=None, followlinks=True):
def _append_build_item(items, item, src_dir):
if not src_exts or path_endswith_ext(item, src_exts):
items.add(item.replace(src_dir + os.sep, ""))
items.add(os.path.relpath(item, src_dir))
src_filter = src_filter or ""
if isinstance(src_filter, (list, tuple)):
@@ -133,10 +137,9 @@ def match_src_files(src_dir, src_filter=None, src_exts=None):
items = set()
for item in glob(os.path.join(glob_escape(src_dir), pattern)):
if os.path.isdir(item):
for root, _, files in os.walk(item, followlinks=True):
for root, _, files in os.walk(item, followlinks=followlinks):
for f in files:
_append_build_item(items, os.path.join(root, f),
src_dir)
_append_build_item(items, os.path.join(root, f), src_dir)
else:
_append_build_item(items, item, src_dir)
if action == "+":
@@ -146,8 +149,22 @@ def match_src_files(src_dir, src_filter=None, src_exts=None):
return sorted(list(matches))
def rmtree(path):
def to_unix_path(path):
if not WINDOWS or not path:
return path
return re.sub(r"[\\]+", "/", path)
def expanduser(path):
"""
Be compatible with Python 3.8, on Windows skip HOME and check for USERPROFILE
"""
if not WINDOWS or not path.startswith("~") or "USERPROFILE" not in os.environ:
return os.path.expanduser(path)
return os.environ["USERPROFILE"] + path[1:]
def rmtree(path):
def _onerror(func, path, __):
try:
st_mode = os.stat(path).st_mode
@@ -155,9 +172,10 @@ def rmtree(path):
os.chmod(path, st_mode | stat.S_IWRITE)
func(path)
except Exception as e: # pylint: disable=broad-except
click.secho("%s \nPlease manually remove the file `%s`" %
(str(e), path),
fg="red",
err=True)
click.secho(
"%s \nPlease manually remove the file `%s`" % (str(e), path),
fg="red",
err=True,
)
return shutil.rmtree(path, onerror=_onerror)
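
The fs hunk above introduces two small path helpers: `to_unix_path`, which collapses backslashes on Windows, and an `expanduser` that prefers `USERPROFILE` there for Python 3.8 compatibility. A minimal sketch of the same logic outside the module, with the `WINDOWS` flag detected locally for illustration:

import os
import re
import sys

WINDOWS = sys.platform.startswith("win")

def to_unix_path(path):
    # Collapse backslash runs into forward slashes; a no-op on non-Windows hosts.
    if not WINDOWS or not path:
        return path
    return re.sub(r"[\\]+", "/", path)

def expanduser(path):
    # Per the diff's docstring: on Windows skip HOME and use USERPROFILE (Python 3.8 behavior).
    if not WINDOWS or not path.startswith("~") or "USERPROFILE" not in os.environ:
        return os.path.expanduser(path)
    return os.environ["USERPROFILE"] + path[1:]

print(to_unix_path(r"C:\Users\dev\project"))  # -> C:/Users/dev/project when run on Windows
print(expanduser("~/project"))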

View File

@@ -14,27 +14,20 @@
import codecs
import os
import re
import sys
from os.path import abspath, basename, expanduser, isdir, isfile, join, relpath
from os.path import basename, isdir, isfile, join, realpath, relpath
import bottle
from platformio import fs, util
from platformio.compat import WINDOWS, get_file_contents
from platformio.proc import where_is_program
from platformio.project.config import ProjectConfig
from platformio.project.helpers import (get_project_lib_dir,
get_project_libdeps_dir,
get_project_src_dir,
load_project_ide_data)
from platformio.project.helpers import load_project_ide_data
class ProjectGenerator(object):
def __init__(self, project_dir, ide, boards):
self.config = ProjectConfig.get_instance(
join(project_dir, "platformio.ini"))
self.config = ProjectConfig.get_instance(join(project_dir, "platformio.ini"))
self.config.validate()
self.project_dir = project_dir
self.ide = str(ide)
@@ -43,8 +36,7 @@ class ProjectGenerator(object):
@staticmethod
def get_supported_ides():
tpls_dir = join(fs.get_source_dir(), "ide", "tpls")
return sorted(
[d for d in os.listdir(tpls_dir) if isdir(join(tpls_dir, d))])
return sorted([d for d in os.listdir(tpls_dir) if isdir(join(tpls_dir, d))])
def get_best_envname(self, boards=None):
envname = None
@@ -72,51 +64,46 @@ class ProjectGenerator(object):
"project_name": basename(self.project_dir),
"project_dir": self.project_dir,
"env_name": self.env_name,
"user_home_dir": abspath(expanduser("~")),
"platformio_path":
sys.argv[0] if isfile(sys.argv[0])
else where_is_program("platformio"),
"user_home_dir": realpath(fs.expanduser("~")),
"platformio_path": sys.argv[0]
if isfile(sys.argv[0])
else where_is_program("platformio"),
"env_path": os.getenv("PATH"),
"env_pathsep": os.pathsep
} # yapf: disable
"env_pathsep": os.pathsep,
}
# default env configuration
tpl_vars.update(self.config.items(env=self.env_name, as_dict=True))
# build data
tpl_vars.update(
load_project_ide_data(self.project_dir, self.env_name) or {})
tpl_vars.update(load_project_ide_data(self.project_dir, self.env_name) or {})
with fs.cd(self.project_dir):
tpl_vars.update({
"src_files": self.get_src_files(),
"project_src_dir": get_project_src_dir(),
"project_lib_dir": get_project_lib_dir(),
"project_libdeps_dir": join(
get_project_libdeps_dir(), self.env_name)
}) # yapf: disable
tpl_vars.update(
{
"src_files": self.get_src_files(),
"project_src_dir": self.config.get_optional_dir("src"),
"project_lib_dir": self.config.get_optional_dir("lib"),
"project_libdeps_dir": join(
self.config.get_optional_dir("libdeps"), self.env_name
),
}
)
for key, value in tpl_vars.items():
if key.endswith(("_path", "_dir")):
tpl_vars[key] = self.to_unix_path(value)
tpl_vars[key] = fs.to_unix_path(value)
for key in ("includes", "src_files", "libsource_dirs"):
if key not in tpl_vars:
continue
tpl_vars[key] = [self.to_unix_path(inc) for inc in tpl_vars[key]]
tpl_vars[key] = [fs.to_unix_path(inc) for inc in tpl_vars[key]]
tpl_vars['to_unix_path'] = self.to_unix_path
tpl_vars["to_unix_path"] = fs.to_unix_path
return tpl_vars
@staticmethod
def to_unix_path(path):
if not WINDOWS or not path:
return path
return re.sub(r"[\\]+", "/", path)
def get_src_files(self):
result = []
with fs.cd(self.project_dir):
for root, _, files in os.walk(get_project_src_dir()):
for root, _, files in os.walk(self.config.get_optional_dir("src")):
for f in files:
result.append(relpath(join(root, f)))
return result
@@ -142,14 +129,14 @@ class ProjectGenerator(object):
dst_dir = join(self.project_dir, tpl_relpath)
if not isdir(dst_dir):
os.makedirs(dst_dir)
file_name = basename(tpl_path)[:-4]
contents = self._render_tpl(tpl_path, tpl_vars)
self._merge_contents(join(dst_dir, file_name), contents)
@staticmethod
def _render_tpl(tpl_path, tpl_vars):
return bottle.template(get_file_contents(tpl_path), **tpl_vars)
with codecs.open(tpl_path, "r", encoding="utf8") as fp:
return bottle.SimpleTemplate(fp.read()).render(**tpl_vars)
@staticmethod
def _merge_contents(dst_path, contents):
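
The `_render_tpl` change above stops routing templates through the removed `get_file_contents` helper and instead reads the file with `codecs.open(..., encoding="utf8")` and renders it via `bottle.SimpleTemplate`. A small sketch of that rendering path, assuming `bottle` is installed and using an inline template string in place of a .tpl file:

import bottle

# Inline stand-in for a template file read with codecs.open(..., encoding="utf8").
tpl_source = "Project: {{ project_name }} (env: {{ env_name }})"

rendered = bottle.SimpleTemplate(tpl_source).render(
    project_name="blink", env_name="uno"
)
print(rendered)  # Project: blink (env: uno)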

View File

@@ -1,8 +1,8 @@
% _defines = " ".join(["-D%s" % d for d in defines])
% _defines = " ".join(["-D%s" % d.replace(" ", "\\\\ ") for d in defines])
{
"execPath": "{{ cxx_path }}",
"gccDefaultCFlags": "-fsyntax-only {{! cc_flags.replace(' -MMD ', ' ').replace('"', '\\"') }} {{ !_defines.replace('"', '\\"') }}",
"gccDefaultCppFlags": "-fsyntax-only {{! cxx_flags.replace(' -MMD ', ' ').replace('"', '\\"') }} {{ !_defines.replace('"', '\\"') }}",
"gccDefaultCFlags": "-fsyntax-only {{! to_unix_path(cc_flags).replace(' -MMD ', ' ').replace('"', '\\"') }} {{ !_defines.replace('"', '\\"') }}",
"gccDefaultCppFlags": "-fsyntax-only {{! to_unix_path(cxx_flags).replace(' -MMD ', ' ').replace('"', '\\"') }} {{ !_defines.replace('"', '\\"') }}",
"gccErrorLimit": 15,
"gccIncludePaths": "{{ ','.join(includes) }}",
"gccSuppressWarnings": false

View File

@@ -1,2 +1,3 @@
.pio
CMakeListsPrivate.txt
cmake-build-*/

View File

@@ -1,8 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<module type="CPP_MODULE" version="4">
<component name="NewModuleRootManager">
<content url="file://$MODULE_DIR$" />
<orderEntry type="inheritedJdk" />
<orderEntry type="sourceFolder" forTests="false" />
</component>
</module>

View File

@@ -1,16 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<project version="4">
<component name="CMakeWorkspace" PROJECT_DIR="$PROJECT_DIR$" />
<component name="CidrRootsConfiguration">
<sourceRoots>
<file path="$PROJECT_DIR$/src" />
</sourceRoots>
<libraryRoots>
<file path="$PROJECT_DIR$/lib" />
<file path="$PROJECT_DIR$/.pio/libdeps" />
</libraryRoots>
<excludeRoots>
<file path="$PROJECT_DIR$/.pio" />
</excludeRoots>
</component>
</project>

View File

@@ -1,9 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<project version="4">
<component name="ProjectModuleManager">
<modules>
<module fileurl="file://$PROJECT_DIR$/.idea/clion.iml" filepath="$PROJECT_DIR$/.idea/clion.iml" />
<module fileurl="file://$PROJECT_DIR$/.idea/platformio.iml" filepath="$PROJECT_DIR$/.idea/platformio.iml" />
</modules>
</component>
</project>

View File

@@ -1,8 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<module type="CPP_MODULE" version="4">
<component name="NewModuleRootManager">
<content url="file://$MODULE_DIR$" />
<orderEntry type="inheritedJdk" />
<orderEntry type="sourceFolder" forTests="false" />
</component>
</module>

View File

@@ -1,30 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<project version="4">
<component name="ProjectTasksOptions">
<TaskOptions isEnabled="true">
<option name="arguments" value="-f -c clion init --ide clion" />
<option name="checkSyntaxErrors" value="true" />
<option name="description" />
<option name="exitCodeBehavior" value="NEVER" />
<option name="fileExtension" value="ini" />
<option name="immediateSync" value="false" />
<option name="name" value="Monitor platformio.ini" />
<option name="output" value="" />
<option name="outputFilters">
<array>
<FilterInfo>
<option name="description" value="" />
<option name="name" value="PIO Conf" />
<option name="regExp" value="$FILE_PATH$:^platformio" />
</FilterInfo>
</array>
</option>
<option name="outputFromStdout" value="false" />
<option name="program" value="{{platformio_path}}" />
<option name="scopeName" value="Project Files" />
<option name="trackOnlyRoot" value="false" />
<option name="workingDir" value="$PROJECT_DIR$" />
<envs />
</TaskOptions>
</component>
</project>

View File

@@ -1,266 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<project version="4">
<component name="CMakeRunConfigurationManager" shouldGenerate="true" assignedExecutableTargets="true" buildAllGenerated="true">
<generated>
<config projectName="{{project_name}}" targetName="PLATFORMIO" />
<config projectName="{{project_name}}" targetName="{{project_name}}" />
<config projectName="{{project_name}}" targetName="PLATFORMIO_BUILD" />
<config projectName="{{project_name}}" targetName="PLATFORMIO_UPLOAD" />
<config projectName="{{project_name}}" targetName="PLATFORMIO_CLEAN" />
<config projectName="{{project_name}}" targetName="PLATFORMIO_TEST" />
<config projectName="{{project_name}}" targetName="PLATFORMIO_PROGRAM" />
<config projectName="{{project_name}}" targetName="PLATFORMIO_UPLOADFS" />
<config projectName="{{project_name}}" targetName="PLATFORMIO_UPDATE_ALL" />
<config projectName="{{project_name}}" targetName="PLATFORMIO_REBUILD_PROJECT_INDEX" />
<config projectName="{{project_name}}" targetName="DEBUG" />
</generated>
</component>
<component name="CMakeSettings" AUTO_RELOAD="true" GENERATION_PASS_SYSTEM_ENVIRONMENT="true">
<ADDITIONAL_GENERATION_ENVIRONMENT>
<envs />
</ADDITIONAL_GENERATION_ENVIRONMENT>
</component>
<component name="ChangeListManager">
<list default="true" id="ec922180-b3d3-40f1-af0b-2568113a9075" name="Default" comment="" />
<ignored path="platformio.iws" />
<ignored path=".idea/workspace.xml" />
<option name="EXCLUDED_CONVERTED_TO_IGNORED" value="true" />
<option name="TRACKING_ENABLED" value="true" />
<option name="SHOW_DIALOG" value="false" />
<option name="HIGHLIGHT_CONFLICTS" value="true" />
<option name="HIGHLIGHT_NON_ACTIVE_CHANGELIST" value="false" />
<option name="LAST_RESOLUTION" value="IGNORE" />
</component>
<component name="ChangesViewManager" flattened_view="true" show_ignored="false" />
<component name="CreatePatchCommitExecutor">
<option name="PATCH_PATH" value="" />
</component>
<component name="ExecutionTargetManager" SELECTED_TARGET="default_target" />
<component name="FavoritesManager">
<favorites_list name="{{project_name}}" />
</component>
<component name="FileEditorManager">
<leaf>
<file leaf-file-name="platformio.ini" pinned="false" current-in-tab="true">
<entry file="file://$PROJECT_DIR$/platformio.ini"></entry>
</file>
% for file in src_files:
<file leaf-file-name="file://$PROJECT_DIR$/{{file}}" pinned="false" current-in-tab="false">
<entry file="file://$PROJECT_DIR/${{file}}"></entry>
</file>
% end
</leaf>
</component>
<component name="JsBuildToolGruntFileManager" detection-done="true" />
<component name="JsGulpfileManager">
<detection-done>true</detection-done>
</component>
<component name="NamedScopeManager">
<order />
</component>
<component name="ProjectFrameBounds">
<option name="x" value="252" />
<option name="y" value="21" />
<option name="width" value="1400" />
<option name="height" value="1000" />
</component>
<component name="ProjectInspectionProfilesVisibleTreeState">
<entry key="Project Default">
<profile-state>
<expanded-state>
<State>
<id />
</State>
</expanded-state>
<selected-state>
<State>
<id>C/C++</id>
</State>
</selected-state>
</profile-state>
</entry>
</component>
<component name="ProjectLevelVcsManager" settingsEditedManually="false">
<OptionsSetting value="true" id="Add" />
<OptionsSetting value="true" id="Remove" />
<OptionsSetting value="true" id="Checkout" />
<OptionsSetting value="true" id="Update" />
<OptionsSetting value="true" id="Status" />
<OptionsSetting value="true" id="Edit" />
<ConfirmationsSetting value="0" id="Add" />
<ConfirmationsSetting value="0" id="Remove" />
</component>
<component name="ProjectView">
<navigator currentView="ProjectPane" proportions="" version="1">
<flattenPackages />
<showMembers />
<showModules />
<showLibraryContents />
<hideEmptyPackages />
<abbreviatePackageNames />
<autoscrollToSource />
<autoscrollFromSource />
<sortByType />
<manualOrder />
<foldersAlwaysOnTop value="true" />
</navigator>
<panes>
<pane id="ProjectPane">
<subPane>
<PATH>
<PATH_ELEMENT>
<option name="myItemId" value="{{project_name}}" />
<option name="myItemType" value="com.jetbrains.cidr.projectView.CidrFilesViewHelper$MyProjectTreeStructure$1" />
</PATH_ELEMENT>
</PATH>
<PATH>
<PATH_ELEMENT>
<option name="myItemId" value="{{project_name}}" />
<option name="myItemType" value="com.jetbrains.cidr.projectView.CidrFilesViewHelper$MyProjectTreeStructure$1" />
</PATH_ELEMENT>
<PATH_ELEMENT>
<option name="myItemId" value="{{project_name}}" />
<option name="myItemType" value="com.intellij.ide.projectView.impl.nodes.PsiDirectoryNode" />
</PATH_ELEMENT>
</PATH>
<PATH>
<PATH_ELEMENT>
<option name="myItemId" value="{{project_name}}" />
<option name="myItemType" value="com.jetbrains.cidr.projectView.CidrFilesViewHelper$MyProjectTreeStructure$1" />
</PATH_ELEMENT>
<PATH_ELEMENT>
<option name="myItemId" value="{{project_name}}" />
<option name="myItemType" value="com.intellij.ide.projectView.impl.nodes.PsiDirectoryNode" />
</PATH_ELEMENT>
<PATH_ELEMENT>
<option name="myItemId" value="src" />
<option name="myItemType" value="com.intellij.ide.projectView.impl.nodes.PsiDirectoryNode" />
</PATH_ELEMENT>
</PATH>
</subPane>
</pane>
</panes>
</component>
<component name="PropertiesComponent">
<property name="recentsLimit" value="5" />
<property name="settings.editor.selected.configurable" value="CPPToolchains" />
<property name="settings.editor.splitter.proportion" value="0.2" />
<property name="last_opened_file_path" value="$PROJECT_DIR$/platformio.ini" />
<property name="restartRequiresConfirmation" value="true" />
<property name="FullScreen" value="false" />
</component>
<component name="RunManager" selected="Application.PLATFORMIO_BUILD">
<configuration default="true" type="CMakeRunConfiguration" factoryName="Application" PASS_PARENT_ENVS="FALSE" PROJECT_NAME="{{project_name}}" TARGET_NAME="{{project_name}}" CONFIG_NAME="Debug">
<envs />
<method />
</configuration>
<configuration default="true" type="js.build_tools.gulp" factoryName="Gulp.js">
<node-options />
<gulpfile />
<tasks />
<arguments />
<pass-parent-envs>true</pass-parent-envs>
<envs />
<method />
</configuration>
<configuration default="false" name="Build All" type="CMakeRunConfiguration" factoryName="Application" WORKING_DIR="" PASS_PARENT_ENVS="FALSE" CONFIG_NAME="Debug" EXPLICIT_BUILD_TARGET_NAME="all">
<envs />
<method />
</configuration>
<configuration default="false" name="PLATFORMIO_BUILD" type="CMakeRunConfiguration" factoryName="Application" WORKING_DIR="" PASS_PARENT_ENVS="FALSE" PROJECT_NAME="{{project_name}}" TARGET_NAME="PLATFORMIO_BUILD" CONFIG_NAME="Debug">
<envs />
<method />
</configuration>
<configuration default="false" name="PLATFORMIO_CLEAN" type="CMakeRunConfiguration" factoryName="Application" WORKING_DIR="" PASS_PARENT_ENVS="FALSE" PROJECT_NAME="{{project_name}}" TARGET_NAME="PLATFORMIO_CLEAN" CONFIG_NAME="Debug">
<envs />
<method />
</configuration>
<configuration default="false" name="PLATFORMIO_TEST" type="CMakeRunConfiguration" factoryName="Application" WORKING_DIR="" PASS_PARENT_ENVS="FALSE" PROJECT_NAME="{{project_name}}" TARGET_NAME="PLATFORMIO_TEST" CONFIG_NAME="Debug">
<envs />
<method />
</configuration>
<configuration default="false" name="PLATFORMIO_UPLOAD" type="CMakeRunConfiguration" factoryName="Application" WORKING_DIR="" PASS_PARENT_ENVS="FALSE" PROJECT_NAME="{{project_name}}" TARGET_NAME="PLATFORMIO_UPLOAD" CONFIG_NAME="Debug">
<envs />
<method />
</configuration>
<configuration default="false" name="PLATFORMIO_UPLOADFS" type="CMakeRunConfiguration" factoryName="Application" WORKING_DIR="" PASS_PARENT_ENVS="FALSE" PROJECT_NAME="{{project_name}}" TARGET_NAME="PLATFORMIO_UPLOADFS" CONFIG_NAME="Debug">
<envs />
<method />
</configuration>
<configuration default="false" name="PLATFORMIO_PROGRAM" type="CMakeRunConfiguration" factoryName="Application" WORKING_DIR="" PASS_PARENT_ENVS="FALSE" PROJECT_NAME="{{project_name}}" TARGET_NAME="PLATFORMIO_PROGRAM" CONFIG_NAME="Debug">
<envs />
<method />
</configuration>
<configuration default="false" name="PLATFORMIO_UPDATE" type="CMakeRunConfiguration" factoryName="Application" WORKING_DIR="" PASS_PARENT_ENVS="FALSE" PROJECT_NAME="{{project_name}}" TARGET_NAME="PLATFORMIO_UPDATE_ALL" CONFIG_NAME="Debug">
<envs />
<method />
</configuration>
<configuration default="false" name="PLATFORMIO_REBUILD_PROJECT_INDEX" type="CMakeRunConfiguration" factoryName="Application" WORKING_DIR="" PASS_PARENT_ENVS="FALSE" PROJECT_NAME="{{project_name}}" TARGET_NAME="PLATFORMIO_REBUILD_PROJECT_INDEX" CONFIG_NAME="Debug">
<envs />
<method />
</configuration>
<list size="9">
<item index="0" class="java.lang.String" itemvalue="Application.Build All" />
<item index="1" class="java.lang.String" itemvalue="Application.PLATFORMIO_BUILD" />
<item index="2" class="java.lang.String" itemvalue="Application.PLATFORMIO_UPLOAD" />
<item index="3" class="java.lang.String" itemvalue="Application.PLATFORMIO_CLEAN" />
<item index="4" class="java.lang.String" itemvalue="Application.PLATFORMIO_TEST" />
<item index="5" class="java.lang.String" itemvalue="Application.PLATFORMIO_PROGRAM" />
<item index="6" class="java.lang.String" itemvalue="Application.PLATFORMIO_UPLOADFS" />
<item index="7" class="java.lang.String" itemvalue="Application.PLATFORMIO_UPDATE" />
<item index="8" class="java.lang.String" itemvalue="Application.PLATFORMIO_REBUILD_PROJECT_INDEX" />
</list>
</component>
<component name="ShelveChangesManager" show_recycled="false" />
<component name="SvnConfiguration">
<configuration />
</component>
<component name="TaskManager">
<task active="true" id="Default" summary="Default task">
<changelist id="ec922180-b3d3-40f1-af0b-2568113a9075" name="Default" comment="" />
<created>1435919971910</created>
<option name="number" value="Default" />
<updated>1435919971910</updated>
</task>
<servers />
</component>
<component name="ToolWindowManager">
<frame x="181" y="23" width="1400" height="1000" extended-state="0" />
<editor active="true" />
<layout>
<window_info id="Project" active="false" anchor="left" auto_hide="false" internal_type="DOCKED" type="DOCKED" visible="true" show_stripe_button="true" weight="0.24945612" sideWeight="0.5" order="0" side_tool="false" content_ui="tabs" />
</layout>
</component>
<component name="Vcs.Log.UiProperties">
<option name="RECENTLY_FILTERED_USER_GROUPS">
<collection />
</option>
<option name="RECENTLY_FILTERED_BRANCH_GROUPS">
<collection />
</option>
</component>
<component name="VcsContentAnnotationSettings">
<option name="myLimit" value="2678400000" />
</component>
<component name="XDebuggerManager">
<breakpoint-manager>
<option name="time" value="4" />
</breakpoint-manager>
<watches-manager />
</component>
<component name="masterDetails">
<states>
<state key="ScopeChooserConfigurable.UI">
<settings>
<splitter-proportions>
<option name="proportions">
<list>
<option value="0.2" />
</list>
</option>
</splitter-proportions>
</settings>
</state>
</states>
</component>
</project>

View File

@@ -5,8 +5,12 @@
# please create `CMakeListsUser.txt` in the root of project.
# The `CMakeListsUser.txt` will not be overwritten by PlatformIO.
cmake_minimum_required(VERSION 3.2)
project({{project_name}})
cmake_minimum_required(VERSION 3.13)
set(CMAKE_SYSTEM_NAME Generic)
set(CMAKE_C_COMPILER_WORKS 1)
set(CMAKE_CXX_COMPILER_WORKS 1)
project("{{project_name}}" C CXX)
include(CMakeListsPrivate.txt)
@@ -15,63 +19,15 @@ include(CMakeListsUser.txt)
endif()
add_custom_target(
PLATFORMIO_BUILD ALL
COMMAND ${PLATFORMIO_CMD} -f -c clion run
Production ALL
COMMAND platformio -c clion run "$<$<NOT:$<CONFIG:All>>:-e${CMAKE_BUILD_TYPE}>"
WORKING_DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR}
)
add_custom_target(
PLATFORMIO_BUILD_VERBOSE ALL
COMMAND ${PLATFORMIO_CMD} -f -c clion run --verbose
Debug ALL
COMMAND platformio -c clion run --target debug "$<$<NOT:$<CONFIG:All>>:-e${CMAKE_BUILD_TYPE}>"
WORKING_DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR}
)
add_custom_target(
PLATFORMIO_UPLOAD ALL
COMMAND ${PLATFORMIO_CMD} -f -c clion run --target upload
WORKING_DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR}
)
add_custom_target(
PLATFORMIO_CLEAN ALL
COMMAND ${PLATFORMIO_CMD} -f -c clion run --target clean
WORKING_DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR}
)
add_custom_target(
PLATFORMIO_MONITOR ALL
COMMAND ${PLATFORMIO_CMD} -f -c clion device monitor
WORKING_DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR}
)
add_custom_target(
PLATFORMIO_TEST ALL
COMMAND ${PLATFORMIO_CMD} -f -c clion test
WORKING_DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR}
)
add_custom_target(
PLATFORMIO_PROGRAM ALL
COMMAND ${PLATFORMIO_CMD} -f -c clion run --target program
WORKING_DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR}
)
add_custom_target(
PLATFORMIO_UPLOADFS ALL
COMMAND ${PLATFORMIO_CMD} -f -c clion run --target uploadfs
WORKING_DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR}
)
add_custom_target(
PLATFORMIO_UPDATE_ALL ALL
COMMAND ${PLATFORMIO_CMD} -f -c clion update
WORKING_DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR}
)
add_custom_target(
PLATFORMIO_REBUILD_PROJECT_INDEX ALL
COMMAND ${PLATFORMIO_CMD} -f -c clion init --ide clion
WORKING_DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR}
)
add_executable(${PROJECT_NAME} ${SRC_LIST})
add_executable(Z_DUMMY_TARGET ${SRC_LIST})

View File

@@ -5,6 +5,8 @@
# please create `CMakeListsUser.txt` in the root of project.
# The `CMakeListsUser.txt` will not be overwritten by PlatformIO.
% from platformio.project.helpers import (load_project_ide_data)
%
% import re
%
% def _normalize_path(path):
@@ -19,13 +21,27 @@
% end
% return path
% end
%
% def _escape(text):
% return to_unix_path(text).replace('"', '\\"')
% end
%
% envs = config.envs()
set(PLATFORMIO_CMD "{{ _normalize_path(platformio_path) }}")
% if len(envs) > 1:
set(CMAKE_CONFIGURATION_TYPES "{{ ";".join(envs) }};" CACHE STRING "Build Types reflect PlatformIO Environments" FORCE)
% else:
set(CMAKE_CONFIGURATION_TYPES "{{ env_name }}" CACHE STRING "Build Types reflect PlatformIO Environments" FORCE)
% end
% if svd_path:
set(CLION_SVD_FILE_PATH "{{ _normalize_path(svd_path) }}" CACHE FILEPATH "Peripheral Registers Definitions File" FORCE)
% end
SET(CMAKE_C_COMPILER "{{ _normalize_path(cc_path) }}")
SET(CMAKE_CXX_COMPILER "{{ _normalize_path(cxx_path) }}")
SET(CMAKE_CXX_FLAGS_DISTRIBUTION "{{cxx_flags}}")
SET(CMAKE_C_FLAGS_DISTRIBUTION "{{cc_flags}}")
SET(CMAKE_CXX_FLAGS "{{ _normalize_path(to_unix_path(cxx_flags)) }}")
SET(CMAKE_C_FLAGS "{{ _normalize_path(to_unix_path(cc_flags)) }}")
% STD_RE = re.compile(r"\-std=[a-z\+]+(\d+)")
% cc_stds = STD_RE.findall(cc_flags)
@@ -37,12 +53,32 @@ SET(CMAKE_C_STANDARD {{ cc_stds[-1] }})
set(CMAKE_CXX_STANDARD {{ cxx_stds[-1] }})
% end
% for define in defines:
add_definitions(-D'{{!re.sub(r"([\"\(\)#])", r"\\\1", define)}}')
% end
if (CMAKE_BUILD_TYPE MATCHES "{{ env_name }}")
%for define in defines:
add_definitions(-D'{{!re.sub(r"([\"\(\)#])", r"\\\1", define)}}')
%end
% for include in includes:
include_directories("{{ _normalize_path(include) }}")
% end
%for include in includes:
include_directories("{{ _normalize_path(to_unix_path(include)) }}")
%end
endif()
% leftover_envs = list(set(envs) ^ set([env_name]))
%
% ide_data = {}
% if leftover_envs:
% ide_data = load_project_ide_data(project_dir, leftover_envs)
% end
%
% for env, data in ide_data.items():
if (CMAKE_BUILD_TYPE MATCHES "{{ env }}")
% for define in data["defines"]:
add_definitions(-D'{{!re.sub(r"([\"\(\)#])", r"\\\1", define)}}')
% end
% for include in data["includes"]:
include_directories("{{ _normalize_path(to_unix_path(include)) }}")
% end
endif()
% end
FILE(GLOB_RECURSE SRC_LIST "{{ _normalize_path(project_src_dir) }}/*.*" "{{ _normalize_path(project_lib_dir) }}/*.*" "{{ _normalize_path(project_libdeps_dir) }}/*.*")

View File

@@ -0,0 +1,22 @@
% import re
% STD_RE = re.compile(r"\-std=[a-z\+]+(\d+)")
% cc_stds = STD_RE.findall(cc_flags)
% cxx_stds = STD_RE.findall(cxx_flags)
%
%
clang
% if cc_stds:
{{"%c"}} -std=c{{ cc_stds[-1] }}
% end
% if cxx_stds:
{{"%cpp"}} -std=c++{{ cxx_stds[-1] }}
% end
% for include in includes:
-I{{ include }}
% end
% for define in defines:
-D{{ define }}
% end

View File

@@ -0,0 +1,22 @@
% import re
% STD_RE = re.compile(r"\-std=[a-z\+]+(\d+)")
% cc_stds = STD_RE.findall(cc_flags)
% cxx_stds = STD_RE.findall(cxx_flags)
%
%
clang
% if cc_stds:
{{"%c"}} -std=c{{ cc_stds[-1] }}
% end
% if cxx_stds:
{{"%cpp"}} -std=c++{{ cxx_stds[-1] }}
% end
% for include in includes:
-I{{ include }}
% end
% for define in defines:
-D{{ define }}
% end
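
Both the CMake template and the ccls templates above derive the C and C++ language standards from the compiler flags with the same `-std=` regex. A quick standalone check of that extraction, using made-up flag strings:

import re

STD_RE = re.compile(r"\-std=[a-z\+]+(\d+)")

cc_flags = "-Os -std=gnu11 -Wall"          # hypothetical C flags
cxx_flags = "-Os -std=gnu++17 -fno-rtti"   # hypothetical C++ flags

cc_stds = STD_RE.findall(cc_flags)
cxx_stds = STD_RE.findall(cxx_flags)

print(cc_stds[-1])   # 11 -> SET(CMAKE_C_STANDARD 11)
print(cxx_stds[-1])  # 17 -> set(CMAKE_CXX_STANDARD 17)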

View File

@@ -1,9 +1,9 @@
% _defines = " ".join(["-D%s" % d for d in defines])
% _defines = " ".join(["-D%s" % d.replace(" ", "\\\\ ") for d in defines])
{
"execPath": "{{ cxx_path }}",
"gccDefaultCFlags": "-fsyntax-only {{! cc_flags.replace(' -MMD ', ' ').replace('"', '\\"') }} {{ !_defines.replace('"', '\\"') }}",
"gccDefaultCppFlags": "-fsyntax-only {{! cxx_flags.replace(' -MMD ', ' ').replace('"', '\\"') }} {{ !_defines.replace('"', '\\"') }}",
"gccErrorLimit": 15,
"gccIncludePaths": "{{! ','.join("'{}'".format(inc) for inc in includes)}}",
"gccIncludePaths": "{{ ','.join(includes) }}",
"gccSuppressWarnings": false
}

View File

@@ -1,4 +1,4 @@
<?xml version="1.0" encoding="utf-8"?>
<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<ItemGroup>
<Filter Include="Source Files">

Some files were not shown because too many files have changed in this diff.