Compare commits

...

733 Commits

Author SHA1 Message Date
a76e445ed9 Merge branch 'release/v6.0.1' 2022-05-17 19:23:31 +03:00
cb7148d018 Bump version to 6.0.1 2022-05-17 19:23:00 +03:00
38afa07dbe Use Marshmallow v3.14.1 for Python 3.6 2022-05-17 19:10:54 +03:00
92073a4ccd Deprecate "pio update", "pio lib", and "pio platform" commands 2022-05-17 18:57:40 +03:00
abf6304818 Fixed an issue when using "Interpolation of Values" and merging str+int options // Resolve #4271 2022-05-17 16:03:33 +03:00
9a86175701 Bump version to 6.0.1b1 2022-05-17 13:34:03 +03:00
b764a2220f Improved support for the renamed configuration options // Resolve #4270 2022-05-17 13:33:25 +03:00
3776233233 Rename "shared" module to the "public" 2022-05-16 16:56:01 +03:00
0d92e8fc17 Bump version to 6.0.0a1 2022-05-16 14:46:52 +03:00
40422eac2e Fixed an issue when calling built-in pio device monitor filter 2022-05-16 14:46:37 +03:00
0fb4b1e109 Merge tag 'v6.0.0' into develop
Bump version to 6.0.0
2022-05-16 14:22:08 +03:00
44ecc7c666 Merge branch 'release/v6.0.0' 2022-05-16 14:22:07 +03:00
26d659c433 Bump version to 6.0.0 2022-05-16 14:21:57 +03:00
58c4145809 Refactor library management docs 2022-05-16 14:18:45 +03:00
fe08ce7795 Implement shared API 2022-05-16 11:39:18 +03:00
9163e9e67d Rename pio project data to the pio project metadata command 2022-05-15 16:57:27 +03:00
7acae6461e Merge branch 'develop' of https://github.com/platformio/platformio-core into develop 2022-05-15 15:35:07 +03:00
e7a172b8dd qtcreator: add project-update makefile target (#4267)
* qtcreator: add project-update makefile target

* add prompt and delete .pio/

* formatting

* forced rm

* remove workaround of deleting .pio/
2022-05-15 15:34:57 +03:00
b90e89a791 no message 2022-05-15 14:54:07 +03:00
db11244f49 qtcreator IDE gitignore tweaks (#4266)
* add .gitignore to project files
  * exclude qtc_clangd
  * don't exclude user project config file
2022-05-15 13:52:59 +03:00
54f0748201 Cache a build metadata only for debugging // Resolve #4267 2022-05-15 13:52:11 +03:00
575f0ae300 Bump version to 6.0.0rc3 2022-05-15 13:47:32 +03:00
7a100fb0b0 Use device finder for automatic detection of upload port 2022-05-15 13:46:44 +03:00
d01d314f47 Pick the last USB device port 2022-05-15 13:13:45 +03:00
e5e2210768 Improved automatic detection of a testing serial port // Resolve #4076 2022-05-14 23:30:36 +03:00
d22b479bd3 Regroup device command 2022-05-14 18:21:44 +03:00
19853b0b66 Implement config.get_default_env() 2022-05-14 17:55:36 +03:00
ce62514a17 Resolve project dependencies with pio project init command 2022-05-14 16:31:08 +03:00
4a4ba5594b Rename "load_project_ide_data" to the "load_build_metadata" 2022-05-14 16:30:20 +03:00
af5a820862 Rename "load_project_ide_data" to the "load_build_metadata" 2022-05-14 16:29:41 +03:00
40e4e38e0c Do not override CWD when executing a package command 2022-05-14 16:23:36 +03:00
cb1c825747 Merge branch 'develop' of https://github.com/platformio/platformio-core into develop 2022-05-14 15:27:13 +03:00
8c27754045 qtcreator IDE template now generates a "generic" Qt project (#4262)
* Create qtcreator-generic IDE template.

* Fix case of #define in qtcreator-generic template .config file.

* follow directory move

* * fix includes output
  * fixup -mlong-calls for clang
  * add Makefile to files output

* fix escaping in config output

* Makefile improvements:
  * support any platformio run target
  * remove platformio deprecated -f option
  * remove explicit default target (first is always default)

* replace qtcreator rather than making another IDE target

Co-authored-by: Donna Whisnant <dewhisna@users.noreply.github.com>
2022-05-14 15:26:04 +03:00
3247e661e9 Regroup "pio project" command 2022-05-14 13:41:20 +03:00
7c93167d52 Docs: Document double hyphen for "pio debug" // Resolve #4260 2022-05-13 21:04:44 +03:00
79b2bfdefe Fix an issue with multiple symbol definitions when framework uses own Unity // Resolve #4259 2022-05-12 15:34:50 +03:00
de7d710943 Look for custom "unity_config.h" only in the "test" dir 2022-05-12 14:17:45 +03:00
b88a29e652 Bump version to 6.0.0rc2 2022-05-12 13:41:45 +03:00
ed0b12dcf9 Improve project config parser to resolve renamed options // Issue #4259 2022-05-12 13:24:27 +03:00
280bede0e9 Bump version to 6.0.0rc1 2022-05-10 20:22:36 +03:00
e6938f8f39 List available project tests with a new "pio test --list-tests" option 2022-05-10 20:21:49 +03:00
6d705172f5 Docs: Extend migration guide with Unit Testing solution 2022-05-10 19:18:36 +03:00
8fff7084db Rename pio test --output-{format} options to --{format}-output 2022-05-10 18:25:26 +03:00
e75bf27b5f Add "-pthread" to the LINKFLAGS 2022-05-10 17:23:03 +03:00
2c99607d3d Pass "-pthread" flag to GoogleTest only on Unix OS 2022-05-10 16:46:48 +03:00
c09af13b7f Add "-pthread" flag for GoogleTest 2022-05-10 16:13:30 +03:00
ee6b498ca9 Optimize unit testing report CLI 2022-05-10 15:25:30 +03:00
65f2f02d93 Add support for GoogleTest testing and mocking framework // Resolve #3572 2022-05-10 14:30:02 +03:00
960edb5611 Use full testing program path on Windows 2022-05-10 11:59:59 +03:00
cda7a97e67 Do not automatically generate JSON report 2022-05-09 22:32:16 +03:00
c520700276 Export testcase file & line to JUnit XML 2022-05-09 19:20:33 +03:00
a7654a6098 Move Unity code parts to the Unity runner 2022-05-09 18:58:43 +03:00
814679522a Do not override embedded std flag 2022-05-09 18:49:15 +03:00
4249349c2b Add hint about verbose output 2022-05-09 18:40:46 +03:00
d065646d3e Update SPDX license list to v3.17 2022-05-09 10:08:08 +03:00
0cf7aeeec9 Fix test on Github Actions 2022-05-08 14:42:07 +03:00
277ccdafb6 Bump version to 6.0.0b1 2022-05-07 17:58:42 +03:00
5b00f6fb95 Skip "test_doctest_framework" from Github Actions / Windows 2022-05-07 17:55:32 +03:00
3f46a97b6b Fix LDF lib resolving 2022-05-07 16:44:11 +03:00
3989979ca3 Pass extra arguments to the native program with a new "pio run --program-arg" option // Resolve #4246 2022-05-07 16:22:05 +03:00
50eda82e27 Fix test 2022-05-07 14:09:11 +03:00
daa3481862 Pass extra arguments to the testing program with a new "pio test --program-arg" option // Resolve #3132 2022-05-07 13:31:19 +03:00
2d94000dd5 Rename source.file to source.file name and report project folder 2022-05-07 13:24:27 +03:00
e3eb155d76 Improve doctest results parser 2022-05-07 13:23:03 +03:00
f95e23118c Fix test 2022-05-06 21:57:39 +03:00
82778473fe New: "doctest" testing framework // Resolve #4240 2022-05-06 20:00:23 +03:00
dae3b9665b Implement TestCase.humanize 2022-05-06 19:56:39 +03:00
f19058df65 Try to resolve paths if the common part is not found 2022-05-06 19:40:00 +03:00
3c7bec7c61 Exclude SVG files by default 2022-05-06 19:39:21 +03:00
c4388a6904 Fixed an issue when LDF ignores build_src_flags in the “deep+” mode // Resolve #4253 2022-05-06 10:31:34 +03:00
6d1e637518 Add support for Semihosting and Unit Testing // Resolve #3516 2022-05-05 17:36:15 +03:00
bbd56d6eb0 Document using QEMU, Renode, SimAVR simulators with Unit Testing // Resolve #4238 2022-05-05 15:33:39 +03:00
0b317ef04b Implement buffering for the testing output 2022-05-05 13:02:27 +03:00
c0cfbe2ce0 Using hardware Simulators for Unit Testing // Issue #4238 2022-05-04 23:20:37 +03:00
3ed5d41df5 Strip ANSI codes from Unity output 2022-05-04 18:56:57 +03:00
517ee6532f Move "strip_ansi_codes" to the util 2022-05-04 18:55:34 +03:00
653f22f85b Fix issue with nested interpolation 2022-05-04 14:52:11 +03:00
38906478d3 Professional collaborative platform for safety-critical and declarative embedded development 2022-05-03 22:09:25 +03:00
e81d83b8c2 Added support for a Custom Unity Library // Resolve #3980 2022-05-03 21:47:20 +03:00
b12d9f62b9 Show list of failed tests in the summary // Resolve #4251 2022-05-03 19:30:15 +03:00
0849e5faad Rename "src_filter" and "src_build_flags" options // Resolve #4245 2022-05-03 18:39:49 +03:00
1a4419059d Added support for "socket://" and "rfc2217://" protocols using "test_port" option // Resolve #4229 2022-05-03 18:11:23 +03:00
4ef1333abc Refactor test runner mixins to the test output readers 2022-05-03 15:21:53 +03:00
2b11f64ef1 New Custom Testing Framework 2022-05-03 14:30:15 +03:00
5b98f432f2 Update deps 2022-05-03 14:25:29 +03:00
76779e6af4 Sync docs 2022-05-01 23:00:25 +03:00
738d537266 Docs: Sync Intel MCS51 dev-platform 2022-05-01 20:10:25 +03:00
327d5990d6 Docs: Minor improvements 2022-04-29 21:51:35 +03:00
16021d0df7 Added support for "Test Hierarchies" // Issue #4135 2022-04-29 20:46:43 +03:00
b37a74dfd9 Refactor Unit Testing documentation 2022-04-29 20:46:04 +03:00
d02f02731f Rename the "test_build_project_src" project configuration option to "test_build_src" 2022-04-29 20:44:28 +03:00
4295c54c67 Sync docs and examples 2022-04-29 14:50:15 +03:00
fb1e4fa02b Add "--filter" option to the pio remote test command 2022-04-28 22:02:16 +03:00
62b8a63b80 Add --filter to remote test (#4244) 2022-04-28 18:25:43 +03:00
ab3c832f5e Pylint fix 2022-04-27 21:15:08 +03:00
d380e7ea01 Update Cppcheck and PVS-Studio tools to the latest available 2022-04-27 20:47:13 +03:00
e69fd5e682 Minor improvements to check tools
- Better handling of unusual macro for PVS-Studio
- Fail the analysis if Cppcheck exited with an internal error
2022-04-27 20:45:21 +03:00
285f19e132 Properly handle cases when path to a file with a defect is unknown
Resolves #4237
2022-04-27 20:40:55 +03:00
4151f53e14 Rename unit testing module to "test" 2022-04-26 15:09:51 +03:00
5895fb9faf Bump version to 6.0.0a2 2022-04-25 22:11:50 +03:00
19e22d74f3 Fix unit testing case 2022-04-25 15:30:54 +03:00
26ed6a5548 Implement required setUp/tearDown functions for the latest Unity testing framework 2022-04-25 13:23:33 +03:00
05dd7dd811 Revert back showing test cases status before 2022-04-24 21:08:49 +03:00
8b694f3734 Unity: show test case status before stdout 2022-04-24 11:28:07 +03:00
c9026a1b9c Generate reports in JUnit and JSON formats // Resolve #2891 2022-04-23 19:19:25 +03:00
9b221a06c8 Unity: Avoid "weak" attributes on Windows 2022-04-23 11:05:28 +03:00
f88904e246 Export "ConfigureDebugFlags" to build env (backward compatibility with Zephyr build script) 2022-04-22 18:14:28 +02:00
e3533dcb01 Added support for test hierarchies (nested test suites) // Resolve #4135 2022-04-22 15:19:12 +03:00
8edb5ffe20 Use unsigned long for unityOutputStart 2022-04-22 10:55:59 +03:00
90e6cd7b46 Fixed an issue when command line parameters do not override values // Resolve #3845 2022-04-21 20:23:30 +03:00
1fa73fb632 Typo fixes 2022-04-21 20:22:57 +03:00
a615af233a Provide more information when the native program crashed on a host (errored with a negative return code) // Resolve #3429 2022-04-21 19:32:12 +03:00
4817e13823 PyLint fixes 2022-04-21 19:30:55 +03:00
ee43b86742 Introduce a new PlatformIO Unit Testing engine 2022-04-21 18:11:49 +03:00
93bfc57dea Merge branch 'develop' of https://github.com/platformio/platformio-core into develop 2022-04-21 17:12:31 +03:00
a568a5c356 Keep recursive for the glob 2022-04-21 17:10:38 +03:00
0b21977e48 Sync docs 2022-04-21 17:07:21 +03:00
2f7668aef5 Improve src matcher for the symbolic links 2022-04-21 16:31:40 +03:00
72fa6eebba Switch to FS JSON loader 2022-04-21 16:30:55 +03:00
2f6a417168 Move test 2022-04-20 18:54:40 +03:00
faa63727ab Revert back to title() 2022-04-20 18:48:26 +03:00
a2b1a0a0a7 Use capitalize instead of title 2022-04-20 18:36:28 +03:00
0d7bc09c49 Cache DL requests 2022-04-20 18:33:46 +03:00
f57ca747a9 Add support for DL mirrors 2022-04-20 18:03:55 +03:00
624421e4b0 Memoize dev-platform instance across the cloned build envs 2022-04-19 13:51:43 +03:00
943c6bc59c Move INO converter to a separate tool 2022-04-19 11:36:05 +03:00
9ce0b0e25b Use builtin "title()" 2022-04-19 11:33:56 +03:00
df3a13fc61 Move MISSING to the compat 2022-04-19 11:32:36 +03:00
5a0a215bfc Use PY3 super() zero-argument syntax 2022-04-15 14:44:30 +03:00
eaff7f307c Avoid RecursionError for circular_dependencies // Resolve #4228 2022-04-15 14:17:21 +03:00
8d63591ce8 Extend "library.json" with an example for passing flags to library dependencies // Resolve #1941 2022-04-13 18:55:44 +03:00
0e3aa29689 Introduce PlatformIO Core 6.0 2022-04-13 15:32:05 +03:00
a56b19ff65 Improve pio exec command on Windows 2022-04-13 13:58:31 +03:00
62b7ec271f Keep PY2 for backward compatibility with ESP8266/ESP32 // Resolve #4226 2022-04-13 12:51:13 +03:00
5515bef3d7 Add backward compatibility with ESP-IDF build script // Resolve #4225 2022-04-13 12:47:17 +03:00
092f5de231 Fix removing temporary debugging data on Windows 2022-04-12 18:17:38 +03:00
81fdd75aac Report problematic file before publishing package to the registry 2022-04-12 12:30:49 +03:00
f63b2f79e0 Fixed an issue when GCC preprocessor was applied to the ".s" assembly files on case-sensitive OS such as Windows OS // Resolve #3917 2022-04-10 19:21:03 +03:00
0501d55c8f Fixed an issue with calling an extra script located outside a project // Resolve #4220 2022-04-10 19:09:29 +03:00
fe6f51369e Autoinstall dev-platform for the "clean" target 2022-04-10 13:56:44 +03:00
8f454c7e9c Bump version to 5.3.0b5 2022-04-09 20:31:40 +03:00
965feccfdc Extended Interpolation of Values with "${this}" pattern // Resolve #3953 2022-04-09 20:31:06 +03:00
5e18f9bbda Finally removed all traces of Python 2.7 2022-04-09 17:46:21 +03:00
541fcbf015 Added a new build variable (COMPILATIONDB_INCLUDE_TOOLCHAIN) to include toolchain paths in the compilation database // Resolve #3735 2022-04-09 12:53:22 +03:00
16f5374474 Typo fix 2022-04-08 21:58:29 +03:00
b414745aa1 Fixed an issue when LDF ignores the project "lib_deps" while resolving library dependencies // Resolve #3598 2022-04-08 18:37:16 +03:00
696d95bf1b Black formatter 2022-04-08 18:36:43 +03:00
1269ce064a Improved detection of a package type from the tarball archive // Resolve #3828 2022-04-08 13:58:40 +03:00
9097d455db Avoid working with detached / non-existent git branches when checking for updates (#4217)
* Avoid working with detached / non-existent git branches when checking for updates

b/c we can't use `pull` anyway in that situation
Otherwise, ask for the specific branch via `refs/heads/{branch}` and
also fail when it is not available

* Update vcsclient.py

Co-authored-by: Ivan Kravets <me@ikravets.com>
2022-04-08 13:15:35 +03:00
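A minimal sketch of the approach described in the commit above, assuming a hypothetical fetch_branch() helper rather than the actual vcsclient.py API:

    import subprocess

    def fetch_branch(repo_dir, branch):
        # Ask the remote for the specific branch via refs/heads/{branch}
        # instead of relying on "git pull", which cannot be used on a
        # detached or non-existent branch; fail when the ref is missing.
        ref = "refs/heads/%s" % branch
        result = subprocess.run(
            ["git", "-C", repo_dir, "fetch", "origin", ref],
            capture_output=True,
            text=True,
        )
        if result.returncode != 0:
            raise RuntimeError("Branch '%s' is not available upstream" % branch)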
1615159014 Fix test 2022-04-08 12:03:31 +03:00
e4e1e72c30 Bump version to 5.3.0b4 2022-04-07 23:10:35 +03:00
43329b7748 Minor improvements for symlink support // Issue #3348 2022-04-07 23:03:40 +03:00
2280865936 Resolve symlink based on the saved cwd 2022-04-05 09:11:10 +03:00
fb2f3c8836 Resolve symlink based on the saved cwd 2022-04-05 09:07:44 +03:00
e2f21212b7 Added support for symbolic links allowing pointing the local source folder to the Package Manager // Resolve #3348 2022-04-04 23:14:19 +03:00
d7597d0992 Cache downloads cleanup 2022-04-04 22:45:25 +03:00
c21876ebe3 Typo fix in class name 2022-04-04 22:22:22 +03:00
76bea5b7a7 Cache downloads cleanup 2022-04-04 22:21:06 +03:00
a03d82ff1a Replace package meta URL with URI 2022-04-04 14:18:11 +03:00
f555656c92 Bump version to 5.3.0b3 2022-04-03 23:18:01 +03:00
f289ebd1f3 Revert back lib deps tree to ascii chars 2022-04-03 23:17:29 +03:00
41b3646012 Bump version to 5.3.0b2 2022-04-03 19:54:03 +03:00
8de5db4b48 Added support for “scripts” in package manifest // Resolve #485 2022-04-03 19:53:34 +03:00
d8be12dcdd PyLint fix 2022-04-03 10:54:23 +03:00
71f9401e23 Fixed an issue when manually removed dependencies were not uninstalled from the storage // Resolve #3076 2022-04-02 22:30:35 +03:00
cdd63dec65 Do not process package that was installed into the "env" storage // Resolve #2910 2022-04-02 16:38:54 +03:00
279fdfc47a Show project dependency licenses when building in the verbose mode 2022-04-02 16:28:40 +03:00
feda42f18f Added support for multi-licensed packages in library.json using SPDX Expressions // Resolve #4037 2022-04-02 14:19:24 +03:00
d86f7fc25e Added ability to override a tool version using the "platform_packages" option // Resolve #3798 2022-04-01 22:05:30 +03:00
e4fb675d5f Install only missed dependencies for the private libraries // Resolve #2910 2022-04-01 17:25:40 +03:00
25e786e6a5 Docs: Sync with dev-platforms 2022-04-01 14:29:38 +03:00
fd01e98cb1 Fix an issue with automatic installation of debug dependencies 2022-04-01 13:47:07 +03:00
2a88cdb8df Bump version to 5.3.0b1 2022-03-31 19:26:21 +03:00
be8f842061 Automatically install dependencies of the local (private) libraries // Resolve #2910 2022-03-31 19:25:44 +03:00
fcb81ae074 Update docs with the new Package Specifications // Resolve #3373 2022-03-31 15:44:16 +03:00
7d9c018b44 Implement Click logging handler for package manager 2022-03-30 21:40:59 +03:00
a6e12532f8 Implement pio pkg search command // Issue #3373 2022-03-30 17:32:05 +03:00
bd202f55ce Rename search "filters" to "qualifiers" 2022-03-30 14:43:02 +03:00
f7b5a7bed8 Added support for the custom Clang-Tidy configuration file // issue #4186 2022-03-30 12:01:17 +03:00
6123d6f9bf Don't append --checks=* when the --config or --config-file flags are set (#4210)
Appending --checks=* causes clang-tidy to ignore the flags --config
and --config-file, which breaks the ability to use a clang-tidy file
2022-03-30 11:47:14 +03:00
6c8173d1aa Implement pio pkg show command // Issue #3373 2022-03-29 16:39:48 +03:00
d2f857d176 Lock "click" dependency for Python 3.6 2022-03-28 20:56:23 +03:00
1e2afafbc4 Use parse_datetime API 2022-03-28 18:18:51 +03:00
927c5c5e36 Do not install any dependencies on the "clean" target 2022-03-28 00:05:20 +03:00
b2ea96b4a7 Resolve package path 2022-03-27 22:34:43 +03:00
6afb53dd7d PyLint fixes 2022-03-27 22:34:22 +03:00
d7477833d6 PyLint fixes 2022-03-24 14:29:32 +02:00
7624645626 Implement pio pkg list command // Issue #3373 2022-03-24 14:17:18 +02:00
53753c0127 Do not install dependencies that are built-in libraries 2022-03-23 18:01:23 +02:00
95604ff66a Minor enhancements 2022-03-23 18:00:31 +02:00
99e0d1071a Add package METAVAR for CLI 2022-03-23 17:57:18 +02:00
13aacbcc05 Dump only required toolchains 2022-03-23 17:56:15 +02:00
b137b25169 Enhance library dependency tree 2022-03-23 17:55:27 +02:00
b44fb101c4 Remove deprecated code 2022-03-21 18:38:36 +02:00
accc8ac254 Add test for "pio pkg outdated" command 2022-03-21 16:00:29 +02:00
435a526140 Implement pio pkg update command // Issue #3373 2022-03-20 15:40:44 +02:00
346580d955 Do not warn about unknown packages if they are built-in libraries 2022-03-19 18:13:29 +02:00
81f343dbe8 Cleanup dev-platform package installer 2022-03-19 18:12:36 +02:00
fa443f2e5f Strict PackageItem comparison 2022-03-19 18:08:34 +02:00
a25a86e42f Init dev-platform with autoinstallation 2022-03-19 18:07:19 +02:00
1ffa924483 Fix test 2022-03-16 18:17:21 +02:00
463a16a68f Implement "pio pkg uninstall" command // Issue #3373 2022-03-16 16:23:09 +02:00
d2adca8d68 Minor improvements 2022-03-16 16:18:59 +02:00
057bf89894 Sync "asrmicro650x" dev-platform 2022-03-16 12:36:22 +02:00
c9037982d7 Save tool deps into the "platformio.ini" // Issue #3373 2022-03-14 13:37:47 +02:00
ce1264564f Ensure default libs are saved 2022-03-14 12:31:48 +02:00
61ffab376d Split code 2022-03-14 12:18:05 +02:00
f3bcaae4e4 Update deps 2022-03-13 17:54:13 +02:00
2201214717 Allow to skip saving of package dependencies to the "platformio.ini" // Issue #3373 2022-03-09 19:07:11 +02:00
eba4231cdc Move test 2022-03-09 19:01:37 +02:00
de0a810fcf Update "wsproto" dependencies to the "1.1.*" 2022-03-09 14:18:09 +02:00
644fc36c32 Revert back to using TOX tmp dir for PyTest 2022-03-08 18:29:54 +02:00
41144bffeb Reset custom project config per command 2022-03-08 18:00:10 +02:00
c84709dd9d Switch to the new "pio pkg install" command 2022-03-08 15:57:25 +02:00
f28651eaf7 Ensure package dependencies are installed // Resolve #2573 2022-03-08 14:59:12 +02:00
9e40eb992e Implement unified "pio pkg install" CLI // Issue #3373 2022-03-08 14:58:01 +02:00
f445cb7895 Ignore Python3 "__pycache__" binaries 2022-03-06 16:00:01 +02:00
dfc0ecdf69 #StandWithUkraine (#4195) 2022-03-06 13:20:54 +02:00
6f11f812f8 Ignore files according to the patterns declared in ".gitignore" when using pio package pack // Resolve #4188 2022-02-23 18:46:53 +02:00
4191a9bc3c Fixed issue linked to package refactoring // Resolve #4189 2022-02-23 13:37:02 +02:00
f2fbdafe64 Use the latest PIO Remote dependencies on non-ARM platforms // Issue #3865 2022-02-22 13:36:11 +02:00
22a037b213 Better handling of the failed tests using "Unit Testing" solution 2022-02-22 13:02:10 +02:00
dbe3ab6c97 Docs: Fix platformio.ini contents for Zephyr and Nordic nRF52-DK tutorial 2022-02-21 19:27:05 +02:00
6bed610af3 Check for invalid version with leading zeros 2022-02-21 18:02:56 +02:00
4d9547066b Show package size before publishing to the registry 2022-02-21 15:00:13 +02:00
54c18ae0c6 Fix test on Win 2022-02-19 21:10:57 +02:00
e49fb9f0d0 Minor Py.Test fixes 2022-02-19 20:45:37 +02:00
33da2af31e Improve pio pkg exec test 2022-02-19 19:22:40 +02:00
bcb3678055 Add test for pio pkg exec command 2022-02-18 21:03:12 +02:00
28da2d245b Handle "BlockingIOError" when locking file resource 2022-02-18 18:51:03 +02:00
e6864adfb6 Minor improvements 2022-02-18 18:34:50 +02:00
8562319638 Do not handle built-in libraries when using package manager 2022-02-18 18:34:24 +02:00
6be17cec37 Added support for dependencies declared in a "tool" type package 2022-02-18 17:51:07 +02:00
f34e6e9c4c Port package management "print_message" to the Python logging system 2022-02-18 12:57:30 +02:00
e8051838a3 Dropped support for "pythonPackages" field in "platform.json" manifest in favor of "Extra Python Dependencies" 2022-02-17 17:25:21 +02:00
f1f5497d8d Fix test 2022-02-16 22:33:16 +02:00
1b44ba4ce0 Dropped automatic updates of global libraries and development platforms // Resolve #4179 2022-02-16 21:53:18 +02:00
a4d2dc856c Do not check for "system prune" for newest PlatformIO Core installation 2022-02-16 21:08:13 +02:00
7964d1c2bf Docs: Add community book "Developing IoT Projects with ESP32" 2022-02-15 20:49:26 +02:00
5df5dd155f Bump version to 5.3.0a3 2022-02-12 23:14:16 +02:00
89cce21161 Move "pio exec" command to "pio pkg exec" // Issue #4163 2022-02-12 23:13:17 +02:00
0bdef36e2a pio pkg outdated - check for project outdated packages // Issue #3373 2022-02-12 23:06:10 +02:00
e549a07901 Typo fix 2022-02-12 23:01:20 +02:00
98603dad66 Configure platform instance with project packages using "configure_project_packages" API 2022-02-12 21:59:27 +02:00
c37fbda7a8 Bump version to 5.3.0a2 2022-02-11 22:42:50 +02:00
34ea4d8f41 Move "debug" command to its main module 2022-02-11 22:42:02 +02:00
452a76105f Update command titles 2022-02-11 22:33:33 +02:00
4982676ca8 Rename "package" command to "pkg" 2022-02-11 22:24:37 +02:00
83d115acca Ensure that platform directory path is string or bytes 2022-02-11 22:22:20 +02:00
86bd0f7c37 Show current working directory, not a path to platformio.ini 2022-02-11 22:21:44 +02:00
83fe00a0cf Revert "Run library extra script only at a build process" (breaks mbed framework) // Issue #3915 2022-02-11 17:00:33 +02:00
526abc6a9f Improved PIO Remote setup on credit-card sized computers (Raspberry Pi, BeagleBone, etc) // Resolve #3865 2022-02-11 14:42:17 +02:00
63feda6efc Simplify dependency on "zeroconf" package // Resolve #4177 2022-02-11 12:15:47 +02:00
c9b3dedbb0 Merge tag 'v5.2.5' into develop
Bump version to 5.2.5

# Conflicts:
#	HISTORY.rst
#	docs
#	platformio/__init__.py
2022-02-10 21:02:47 +02:00
dae8dfe1fc Merge branch 'release/v5.2.5' 2022-02-10 20:59:25 +02:00
100def7609 Bump version to 5.2.5 2022-02-10 20:59:16 +02:00
8594012fa1 Update deps 2022-02-10 20:55:38 +02:00
27400f66a9 Strip the path to userhome dir on Linux // Resolve #4173 Issue #4158 2022-02-10 20:55:31 +02:00
bb1e590222 Update SPDX License List to 3.16 2022-02-10 20:55:18 +02:00
a4b414010d Removing inconsistent dot at README.rst, HISTORY.rst and CONTRIBUTING.md (#4172)
* Removing inconsistent dot at README list

* Removing inconsistent dot at HISTORY file

* Removing inconsistent dot at CONTRIBUTING file
2022-02-10 20:55:08 +02:00
1d72a96654 Merge tag 'v5.2.5' into develop
Bump version to 5.2.5

# Conflicts:
#	docs
#	platformio/__init__.py
2022-02-10 20:52:16 +02:00
9b85ed86a9 fix: Added udev rule for FireBeetle-ESP32. (#4168) 2022-02-10 20:50:55 +02:00
e36066a9a2 Move package's related commands to "package" sub-folder 2022-02-10 15:22:20 +02:00
8082158a16 Update deps 2022-02-08 17:40:50 +02:00
1a8567a6da Sync docs 2022-02-08 17:33:58 +02:00
b17cbe30e2 Strip the path to userhome dir on Linux // Resolve #4173 Issue #4158 2022-02-08 17:21:13 +02:00
8aadc88dd5 Update SPDX License List to 3.16 2022-02-07 13:46:47 +02:00
f3d26fae64 Removing inconsistent dot at README.rst, HISTORY.rst and CONTRIBUTING.md (#4172)
* Removing inconsistent dot at README list

* Removing inconsistent dot at HISTORY file

* Removing inconsistent dot at CONTRIBUTING file
2022-02-07 13:45:56 +02:00
828d6f5baf Fixed a "module 'asyncio' has no attribute 'run'" error when launching PIO Home using Python 3.6 // Resolve #4169 2022-02-05 20:00:37 +02:00
2003806481 fix: Added udev rule for FireBeetle-ESP32. (#4168) 2022-02-05 13:13:43 +02:00
362823c1e1 Bump version to 5.2.5b1 2022-02-04 19:15:55 +02:00
9c10e00234 Run command from a PlatformIO package with a new pio exec command // Resolve #4163 2022-02-04 19:15:31 +02:00
a4cef2fbd8 Bump version to 5.2.5a7 2022-02-03 15:33:30 +02:00
e5fca99b52 Run library extra script only at a build process // Resolve #3915 2022-02-03 15:33:03 +02:00
f4c692eed2 Bump PIO Home to 3.4.1 2022-02-02 17:42:28 +02:00
2e0688db5f Fix test 2022-02-02 12:42:31 +02:00
ac2b358f87 Docs: generate docs from the registry 2022-02-01 21:56:53 +02:00
251a2c9fa4 Docs: link packages with the registry 2022-02-01 15:38:15 +02:00
0064d4b2c5 Docs: remove deprecated links to "boards" page 2022-02-01 15:01:58 +02:00
ebbac6b483 Use "black" profile 2022-02-01 15:00:47 +02:00
d5373a62f4 Docs: Sync dev-platforms 2022-01-28 14:24:25 +02:00
681b91a6a4 Update deps 2022-01-23 14:17:22 +02:00
8c66352994 Fixed wrong path (#4158)
* Fixed wrong path

On linux, "Documents" doesn't have to be the right folder. It depends on the language selected when installing the operating system.

* Refactor code

* Update HISTORY.rst

Co-authored-by: Ivan Kravets <me@ikravets.com>
2022-01-20 12:19:30 +02:00
4e1ec1215a Bump version to 5.2.5a6 2022-01-19 17:16:44 +02:00
6981894060 Minor updates 2022-01-19 17:16:23 +02:00
57c92e877c Respect disabling debugging server from platformio.ini 2022-01-19 16:53:31 +02:00
e8c0b8504a Ignore annoying "ms-vscode.cpptools-extension-pack" for VSCode and C/C++ files 2022-01-15 22:27:30 +02:00
93bbe8f2a3 Update deps 2022-01-15 15:00:55 +02:00
c78bb1f572 Docs: Remove icons from navbar 2022-01-11 14:11:32 +02:00
7256102785 Unix line-endings for extensions.json (#4153) 2022-01-09 13:58:39 +02:00
fc907c568d Improved checking of available Internet connection for IPv6-only workstations // Issue #4151 2022-01-08 15:08:39 +02:00
9e078ff4d7 Sync docs 2022-01-08 15:00:35 +02:00
5658e7f718 _internet_on: try IPv4, if not acceptable — try IPv6 (#4151)
* _internet_on: try IPv4, if not acceptable — try IPv6

* _internet_on: replace IPv4 `socket.socket` + IPv6 `socket.socket` with one universal `socket.create_connection`
2022-01-08 14:59:47 +02:00
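A minimal sketch of the single-call connectivity check mentioned above, assuming an illustrative host and port rather than the endpoints PlatformIO actually probes:

    import socket

    def internet_on(host="registry.platformio.org", port=443, timeout=2.0):
        # socket.create_connection() resolves the host with getaddrinfo()
        # and tries each returned address in turn, so one call covers both
        # IPv4-only and IPv6-only workstations.
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False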
111eb55a9f Docs: Update "platformio.ini" examples 2022-01-05 15:00:41 +02:00
0630ec5503 Bump version to 5.2.5a5 2022-01-04 17:18:14 +02:00
38cc493eb7 Minor improvements 2022-01-04 17:17:51 +02:00
254507c3a3 Escape custom request arguments 2022-01-04 15:02:48 +02:00
7cdcc9099b Escape custom request arguments 2022-01-04 14:53:34 +02:00
fb046c43ea Require authorization for package downloading 2022-01-04 14:46:51 +02:00
73ddf80fc1 Refactor authentication part for clients 2022-01-04 14:45:14 +02:00
a5a224ac6f Sync docs 2022-01-03 13:05:53 +02:00
c56dfda833 Minor fixes 2022-01-02 23:08:21 +02:00
6081f9ff1b Switch to the universal Twisted 2022-01-02 23:08:12 +02:00
f3c7d71b3b Sync docs 2022-01-02 19:46:52 +02:00
5748bf9549 Extend packing filters 2021-12-29 15:03:43 +02:00
84a0a6a418 Update deps 2021-12-24 18:14:29 +02:00
1ee9f183cc Fix test 2021-12-24 18:14:18 +02:00
55e8523925 Improve docs for "dependencies" field of library.json 2021-12-24 15:04:54 +02:00
c9efe24959 Switch to the new registry 2021-12-22 22:36:32 +02:00
69aff39205 Warn about package publishing time 2021-12-20 20:57:18 +02:00
f6e9e15253 Bump version to 5.2.5a4 2021-12-20 19:28:28 +02:00
b7f685ed62 Fix a bug with expired account session 2021-12-20 19:27:56 +02:00
6e03eff303 Handle base AccountError 2021-12-20 19:05:12 +02:00
3e0b95e1e1 Fix tests 2021-12-18 14:17:22 +02:00
a32997ceba Bump version to 5.2.5a3 2021-12-18 13:54:09 +02:00
63674d85e8 Ignore private packages if user not authorized 2021-12-18 13:53:54 +02:00
56848ece7a Bump version to 5.2.5a2 2021-12-18 13:45:51 +02:00
449722f08c Improved support for private packages in PlatformIO Registry 2021-12-18 13:45:26 +02:00
949b4562c7 Packaging: exclude extras from Arduino libraries 2021-12-15 13:46:30 +02:00
75f68c8be1 Bump version to 5.2.5a1 2021-12-15 12:46:28 +02:00
1b117712cf Merge branch 'release/v5.2.4' 2021-12-15 12:19:59 +02:00
11356af502 Merge tag 'v5.2.4' into develop
Bump version to 5.2.4
2021-12-15 12:19:59 +02:00
9dbdf7fc8d Bump version to 5.2.4 2021-12-15 12:19:51 +02:00
dec38273b6 Cleanup code 2021-12-15 11:59:19 +02:00
5098f5f420 Minor improvements // Issue #3865 2021-12-14 22:55:48 +02:00
d32fd72d13 Bump version to 5.2.4rc1 2021-12-14 22:38:57 +02:00
a4692d5457 Improved PIO Remote setup on credit-card sized computers (Raspberry Pi, BeagleBone, etc) // Resolve #3865 2021-12-14 22:38:31 +02:00
24ea7aaede Update title to PlatformIO Core 2021-12-14 22:37:42 +02:00
b7f10982c3 Update PIO Remote deps // Issue #3865 2021-12-14 21:14:11 +02:00
8f28d1ad43 Update uvicorn to 0.16 2021-12-08 18:45:43 +02:00
d5db2f0eb7 Apply formatting 2021-12-08 18:45:29 +02:00
fe69f3de04 Bump version to 5.2.4b4 2021-12-08 18:40:37 +02:00
5534394b06 Fixed an issue with wrong detecting Windows architecture when Python 32bit is used // Resolve #4134 2021-12-08 18:40:07 +02:00
24fc2f7e14 Sync docs 2021-12-08 18:38:16 +02:00
5b23c9a294 Add basic test for CLion integration 2021-12-08 13:48:35 +02:00
7338a02b48 Do not pack Python bytecode by default 2021-12-07 15:05:42 +02:00
8555e83cb1 Sync docs 2021-12-07 15:05:25 +02:00
39494d18bf Revert "Revert "Lock "cryptography" to RUST-less 3.3.2 version""
This reverts commit 24e63e7a02.
2021-12-06 20:59:31 +02:00
aab42c3cff Skip library.properties "paragraph" if total len >= 1000 2021-12-03 20:05:37 +02:00
f5a23c3817 Bump version to 5.2.4b3 2021-12-03 17:02:05 +02:00
b3eb81c3b4 Typo fix 2021-12-03 17:01:42 +02:00
4f4c88aca9 Use SCons vars for deprecated variables 2021-12-02 22:16:37 +02:00
c3ad3ebb57 Properly replace Home Directory in CLion template on Windows
Issue #4071
2021-12-02 20:56:18 +02:00
f13734dda4 Convert Home Directory path into a cmake-style path on Windows
Resolve #4071
2021-12-02 20:05:35 +02:00
24e63e7a02 Revert "Lock "cryptography" to RUST-less 3.3.2 version"
This reverts commit 3828e6d15e.
2021-12-02 19:30:19 +02:00
a163048396 Bump version to 5.2.4b2 2021-12-02 16:34:35 +02:00
55f8471aff Improved tab completion support for Bash, ZSH, and Fish shells // Resolve #4114 2021-12-02 16:34:05 +02:00
04e9f38e0e Check for default core dir in run-time (solves issue with tests) 2021-12-02 15:06:58 +02:00
90972e9ce0 Sync docs 2021-12-02 14:55:48 +02:00
6e8f60a27a Bump version to 5.2.4b1 2021-12-02 14:20:46 +02:00
014090c407 Fixed an issue when referencing "*_dir" option from a custom project configuration environment // Resolve #4110 2021-12-02 14:19:54 +02:00
e40b251c06 Fixed a bug when the system environment variable does not override a project configuration option // Resolve #4125 2021-12-02 13:13:07 +02:00
414a194c9d Do not claim that library.properties packages are compatible with any dev-platform if "architectures" field is not defined 2021-11-29 20:02:53 +02:00
7bffe3993d Update deps 2021-11-29 20:01:27 +02:00
3828e6d15e Lock "cryptography" to RUST-less 3.3.2 version 2021-11-29 14:31:38 +02:00
85c582bc93 Use "/v3//search" endpoint when searching for packages in registry 2021-11-27 15:00:10 +02:00
ea1c9dec12 Typo fix 2021-11-26 14:21:06 +02:00
6753121a6a Better cleanup package manifest fields 2021-11-26 14:13:06 +02:00
f63d899c42 Ignore duplicated manifest values 2021-11-25 22:35:44 +02:00
7219c9f806 Ignore duplicated manifest values 2021-11-25 22:19:47 +02:00
df2f1d10fd Sync docs 2021-11-25 22:19:01 +02:00
3f71067b67 Update zeroconf deps to 0.37.* 2021-11-22 22:08:57 +02:00
8dc68a01fd Do not print empty errors 2021-11-22 22:08:10 +02:00
9e0ded958c Bump version to 5.2.4a4 2021-11-18 17:56:18 +02:00
68243aa95b Added support for a new "headers" field in "library.json" 2021-11-18 17:55:35 +02:00
507df1f507 Extend platform manifest test with a package owner 2021-11-18 13:31:49 +02:00
1800c29b44 Upgraded build engine to the SCons 4.3 2021-11-18 13:17:26 +02:00
0343548f6e Sync docs 2021-11-18 13:14:55 +02:00
5cb5c9713e Wrap the path to PlatformIO core in the NetBeans project template
This fixes a possible issue when the path to PlatformIO contains a whitespace

Resolve #4096
2021-11-15 19:22:41 +02:00
5e2c5c793f SPDX License List v3.15 2021-11-15 11:28:57 +02:00
3022cb6955 Bump version to 5.2.4a3 2021-11-12 15:17:55 +02:00
4687665ff3 Improved support for projects located on a network share // Resolve #3417, Resolve #3926, Resolve #4102 2021-11-12 15:17:25 +02:00
001f075a49 Bump version to 5.2.4a2 2021-11-09 22:49:21 +02:00
7d78e4a60a Fixed an issue with the CLion project generator when a macro contains a space // Resolve #4102 2021-11-09 22:49:00 +02:00
2786bfbeb8 Escape spaces in CLion CMakeListsPrivate template - FIXES #4085 (#4105)
This fix adds spaces to the regex substitutions on CMakeListsPrivate.txt add_definitions.

Fixes #4102
2021-11-09 22:45:12 +02:00
d3049a8d62 Fix test 2021-11-08 20:08:18 +02:00
831a2582ed Sync docs 2021-11-08 19:31:49 +02:00
0919019123 Bump version to 5.2.4a1 2021-11-05 23:19:22 +02:00
7dd9c99c91 Merge tag 'v5.2.3' into develop
Bump version to 5.2.3
2021-11-05 17:31:41 +02:00
326c24911a Merge branch 'release/v5.2.3' 2021-11-05 17:31:40 +02:00
133fa1495b Bump version to 5.2.3 2021-11-05 17:31:23 +02:00
7c040ed99f Normalize Windows path with Python's pathlib 2021-11-05 17:21:15 +02:00
f88a2de8a9 Filter duplicated recent projects on Windows 2021-11-05 17:05:30 +02:00
a24ec8b07a Grammar fixes 2021-11-05 16:57:44 +02:00
d6ad6f96e8 Bump version to 5.2.3rc1 2021-11-05 16:29:18 +02:00
411764854b Add support for custom device monitor filters located in package folders // Issue #3924 2021-11-05 16:28:49 +02:00
973f77012f Fixed an issue when VSCode's debugger does not honor default environment // Resolve #4098 2021-11-05 14:46:57 +02:00
1d80da2559 Add "inc" as sign that it's the root of the library (#4093)
* Add "inc" as sign that it's the root of the library

* Add "inc" and "Inc"

Co-authored-by: Ivan Kravets <me@ikravets.com>
2021-11-05 14:16:36 +02:00
00d298935a Bump version to 5.2.3b5 2021-11-05 12:58:12 +02:00
4a9a478243 Refactor PIO Home IDE RPC 2021-11-05 12:57:09 +02:00
9040bbb75a Update deps 2021-11-05 12:56:39 +02:00
abcc4c0a12 Bump version to 5.2.3b4 2021-11-02 20:06:08 +02:00
ceb3a19b81 Automatically synchronize active projects between IDE and PlatformIO Home 2021-11-02 20:05:40 +02:00
2a2f7825cc Sync docs 2021-11-01 16:21:47 +02:00
a0e9f6a92d Docs: Sync dev-platforms 2021-11-01 15:57:17 +02:00
dbc73f5086 Use Rust-less "cryptography" dependency for PIO Remote 2021-10-30 14:30:30 +03:00
78a67b754e Docs: Extend a project configuration example with the common "[env]" section 2021-10-26 16:01:50 +03:00
de4b02eaf1 Remove unused module 2021-10-26 15:52:16 +03:00
751c82fd29 Bump version to 5.2.3b3 2021-10-26 15:42:05 +03:00
8c8a94fc71 Run config option validation even in raw mode 2021-10-26 15:41:41 +03:00
1174958e8b Add project.helpers.get_project_all_lib_dirs API (used by platformio-node-helpers) 2021-10-26 14:36:18 +03:00
6399de7a66 Removed deprecated project.helpers API 2021-10-26 14:35:28 +03:00
c0f2275b61 Restore ProjectConfig.get_optional_dir API, "platformio-node-helpers" depends on it 2021-10-26 14:34:32 +03:00
256a9ee45d Revert "Pass system STDIN stream to SCons subprocess"
This reverts commit d7b7d2de6e.
2021-10-26 13:54:49 +03:00
c835ce780a Fixed "UnicodeEncodeError" when a build output contains non-ASCII characters // Resolve #3971 2021-10-25 22:01:11 +03:00
d7b7d2de6e Pass system STDIN stream to SCons subprocess 2021-10-25 21:12:29 +03:00
1dd0635e5e Use secured bitly 2021-10-25 20:25:23 +03:00
67506511c3 Update token for docs/deploy 2021-10-25 19:45:47 +03:00
3fbb4cde36 Bump version to 5.2.3b2 2021-10-25 18:45:04 +03:00
9aaa80a213 Cast Python warnings to errors when running "pytest" 2021-10-25 18:36:10 +03:00
acb6cbffa0 Add "arduplot" to the "Community Filters" // Resolve #4058 2021-10-25 15:54:06 +03:00
6a70ab74bc Update history 2021-10-25 15:24:24 +03:00
852c252302 Added support for custom device monitor filters // Resolve #3924 2021-10-25 15:18:18 +03:00
3a670b55b6 Update CMakeLists.txt.tpl (#4089) 2021-10-25 14:56:12 +03:00
d01435f4f2 Bump version to 5.2.3b1 2021-10-25 13:28:57 +03:00
f1638c9cd7 Fixed an issue when PIO Remote device monitor crashes on the first keypress // Resolve #3832 2021-10-25 13:24:36 +03:00
4943504898 Bump version to 5.2.3a3 2021-10-24 23:17:30 +03:00
7d7480c120 Show human-readable message when infinite recursion is detected while processing "Interpolation of Values" // Resolve #3883 2021-10-24 22:21:15 +03:00
78182fea0a Disabled resolving of SCons variables when preprocessing "Interpolation of Values" // Resolve #3933 2021-10-24 21:27:25 +03:00
947e57b5b4 Bump version to 5.2.3a2 2021-10-24 20:00:30 +03:00
e0e4a594e9 Fix conf tests on Windows 2021-10-24 19:59:52 +03:00
4839fe37a3 Improved PlatformIO directory interpolation (${platformio.***_dir}) in “platformio.ini” configuration file // Resolve #3934 2021-10-24 18:19:40 +03:00
9914b7ea38 Typo (#4087)
showed > shown
2021-10-23 13:01:48 +03:00
f86ed97820 Bump version to 5.2.3a1 2021-10-22 19:14:17 +03:00
8d8b0807e2 Fixed an issue when the "$PROJECT_DIR" gets the full path to "platformio.ini", not the directory name // Resolve #4086 2021-10-22 19:13:24 +03:00
e3c6237430 Remove unused files 2021-10-20 23:29:34 +03:00
e964c7fa5c Merge branch 'release/v5.2.2' 2021-10-20 18:44:28 +03:00
f1e84e145c Merge tag 'v5.2.2' into develop
Bump version to 5.2.2
2021-10-20 18:44:28 +03:00
2e2773fa6b Bump version to 5.2.2 2021-10-20 18:44:20 +03:00
a9c7a27d47 Fix CLion 2021.3 support (#4085)
New CMake behavior crashes CLion with apostrophe symbols in `add_definitions` clause
see https://youtrack.jetbrains.com/issue/CPP-26719
2021-10-20 18:08:22 +03:00
e41ecb19cf Resolve an issue with interrupting a running program 2021-10-20 16:21:48 +03:00
5b091b602f Fixed a “TypeError” issue when extending configuration option in “platformio.ini” with the multi-line default value // Resolve #4082 2021-10-20 15:35:01 +03:00
768681c4f2 Remove debugging code // Resolve #4083 2021-10-19 19:27:20 +03:00
2e4e5c1873 Temporary disable CI for Windows+Python 3.10 2021-10-19 19:26:13 +03:00
4a61806e60 Quote Python versions 2021-10-19 18:52:30 +03:00
883187f9ac Bump version to 5.2.2a1 2021-10-19 18:21:28 +03:00
2d9a5031e9 Test PlatformIO Core on Python 3.10 2021-10-19 18:21:21 +03:00
39c93f6512 Override debugging firmware loading mode using `--load-mode` option for `pio debug` command 2021-10-19 18:20:01 +03:00
a7905b373e Skip CI for macOS & Py 3.6 2021-10-11 16:00:09 +03:00
a7c82ff9b9 Merge branch 'release/v5.2.1' 2021-10-11 15:07:19 +03:00
5b4b4a4051 Merge tag 'v5.2.1' into develop
Bump version to 5.2.1
2021-10-11 15:07:19 +03:00
c348fec609 Bump version to 5.2.1 2021-10-11 15:07:04 +03:00
4af17356f3 Handle ".hpp" files when looking for a library root 2021-10-11 15:01:42 +03:00
384e5052bc Bump version to 5.2.1rc2 2021-10-10 14:09:59 +03:00
a5adae1491 Skip broken Click 8.0.2 release // Resolve #4078 2021-10-10 14:09:17 +03:00
fe62b810db Bump version to 5.2.1rc1 2021-10-08 19:03:12 +03:00
ee78496058 Clean a build environment and installed library dependencies using a new `cleanall` target // Resolve #4062 2021-10-08 19:02:45 +03:00
8afe4bae87 Typo fix 2021-10-08 15:31:26 +03:00
b04bb2b740 Fix Click's "DeprecationWarning: 'resultcallback' has been renamed to 'result_callback'" // Resolve #4075 2021-10-08 15:18:34 +03:00
3d46f0d72f Drop support for Click < 7.1.2 2021-10-08 15:18:19 +03:00
a65d973660 Extend library root signs with "include" and "src" dirs // Resolve #4073 2021-10-08 15:00:05 +03:00
df83d90c06 Handle upper-cased "Include" & "Src" folders 2021-10-08 14:58:41 +03:00
a1d55f2529 Ignore telemetry on "idedata" target 2021-10-08 14:40:23 +03:00
aa097f3fd6 Update Cppcheck to v2.6.0 // Resolve #3942 2021-10-07 16:43:06 +03:00
e0b72202fd Bump version to 5.2.1b4 2021-09-29 19:21:55 +03:00
e8769fff7d Improved handling of a library root based on "Conan" or "CMake" build systems // Resolve #3887 2021-09-29 19:21:31 +03:00
ed33652534 Handle "test" folder as a part of CLion project // Resolve #4005 2021-09-29 15:44:52 +03:00
d1c1f972a6 Propagate agent option to remote device monitor command (#4065)
Signed-off-by: Christophe PAVOT <christophe.pavot@wiifor.com>
2021-09-29 14:47:11 +03:00
6008275aae Properly handle in-progress C++ standards when invoking Cppcheck // Resolve #3944 (#4070) 2021-09-29 14:46:02 +03:00
edf8bb3945 Bump version to 5.2.1b3 2021-09-27 22:59:58 +03:00
dd7d133263 Dump "embedded_result.output" 2021-09-27 22:59:36 +03:00
b6f783674b Allowed to override a default library builder via a new `builder` field in a build group of `library.json` // Resolve #3957 2021-09-26 15:27:41 +03:00
eab70fae3b Properly handle "--keep-build-dir" option in platformio ci command (#4061)
This fixes #4011 and possible "FileExists" errors in the "platformio ci"
command by safely copying sources to the build folder
2021-09-23 23:26:42 +03:00
fed40ef104 Add debug information when a test fails on Win/Py3.8 2021-09-17 21:06:08 +03:00
6d087f5a38 Bump version to 5.2.1b2 2021-09-16 22:07:01 +03:00
0edcf33547 Use "ubuntu-18.04" for project examples (CI) 2021-09-16 22:06:45 +03:00
443417b0f4 PyLint fix 2021-09-16 21:56:09 +03:00
369e994b0d Check for "build.mcu" and "build.cpu" when looking for precompiled library // Issue #405 2021-09-16 21:51:53 +03:00
55469327c6 Bump version to 5.2.1b1 2021-09-16 21:16:21 +03:00
27f326673c Fixed a "KeyError: Invalid board option 'build.cpu'" when using a precompiled library with a board that does not have a CPU field in the manifest // Resolve #405 2021-09-16 21:13:54 +03:00
e6fd766fff Bump version to 5.2.1a1 2021-09-14 13:03:47 +03:00
7da3ccfacb Merge tag 'v5.2.0' into develop
Bump version to 5.2.0
2021-09-13 19:00:10 +03:00
624d6b3b0b Merge branch 'release/v5.2.0' 2021-09-13 19:00:09 +03:00
9528083a66 Bump version to 5.2.0 2021-09-13 18:59:53 +03:00
55408f6ccb Fixed an issue when PlatformIO archives a library that does not contain C/C++ source files // Resolve #4019 2021-09-13 14:56:24 +03:00
dce5a39b10 Process "precompiled" and "ldflags" properties of the "library.properties" manifest // Resolve #3994 2021-09-13 14:48:48 +03:00
03a23876a7 Fixed an issue when PlatformIO archives a library that does not contain C/C++ source files // Resolve #4019 2021-09-13 14:04:33 +03:00
775357dd94 Better error handling if git is not installed // Resolve #4013 2021-09-13 13:31:53 +03:00
d10cbb2823 Fix link to clang-tidy (#4049) 2021-09-13 12:36:56 +03:00
63a2465bac Update check tools to the latest available // Resolve #4041 2021-09-10 18:11:48 +03:00
d97ed52e91 Sync docs 2021-09-07 15:17:59 +03:00
e1dc12c14d Docs: Document "platformio-ide.pioHomeServerHttpHost" setting for VSCode 2021-09-02 12:47:17 +03:00
7c755d4e2d Sync docs 2021-08-31 16:23:24 +03:00
55b786d9f0 Use byte-mode for writing binary file 2021-08-28 13:21:46 +03:00
131f4be4ea Fix PyLint's "use-dict-literal" and "use-list-literal" 2021-08-28 13:14:40 +03:00
d819617d2b Specify encoding for "open()" functions 2021-08-28 13:10:07 +03:00
b9219a2b62 Update "zeroconf" deps to 0.36 2021-08-28 12:31:02 +03:00
554e378dd6 Sync docs 2021-08-28 12:30:38 +03:00
cc11402bc9 Sync docs 2021-08-14 15:41:44 +03:00
40220f92c1 Sync docs 2021-08-14 15:25:25 +03:00
8c4d9021c2 Update deps 2021-08-14 12:53:49 +03:00
efefb02d86 Sync docs 2021-08-14 12:53:30 +03:00
3ee281aaf9 Update SPDX License List to 3.14 2021-08-09 17:46:56 +03:00
097b6d5097 PyLint fixes 2021-08-05 18:13:22 +03:00
6cdaf05f98 Sync docs 2021-08-05 18:13:00 +03:00
3be0f58c30 Sync docs 2021-08-04 14:58:54 +03:00
f3489a3b01 Sync docs 2021-08-02 13:52:06 +03:00
173dbeb24a Bump version to 5.2.0b1 2021-08-02 13:11:23 +03:00
0607b86818 Upgraded build engine to the SCons 4.2 2021-08-02 13:10:37 +03:00
1282a65bcb Update Arduino udev rule to include latest Portenta board
Resolves #4014
2021-08-02 12:12:52 +03:00
45d3207dfe Docs: Sync dev-platforms 2021-07-31 18:48:08 +03:00
76b46f59e9 Fix lib test 2021-07-30 20:13:53 +03:00
19fa108f61 Docs: Add "Copy" button to CODE blocks 2021-07-30 17:32:22 +03:00
2372d06591 Sync docs 2021-07-26 19:26:33 +03:00
7015375892 Docs: Revert "html_favicon" path 2021-07-23 15:32:02 +03:00
e9bf2b361f Update deps and sync docs 2021-07-23 15:05:01 +03:00
51b790b767 Bump version to 5.2.0a9 2021-07-12 15:06:42 +03:00
ac84431361 Take into account package's "system" when checking for duplicates 2021-07-12 15:06:06 +03:00
7dc8463da9 Fix charmap error (#3998)
* Fix charmap error

Fix charmap error on Cyrillic characters in platformio.ini file #3493

* Update config.py

Co-authored-by: Ivan Kravets <me@ikravets.com>
2021-07-07 18:25:55 +03:00
71ae579bc0 PyLint fix 2021-07-05 16:06:02 +03:00
5036d25b60 Enable Python version auto-detection for Black formatter 2021-07-05 13:31:23 +03:00
ff6d169862 Fix PyLint for v2.9.3 2021-07-05 13:30:37 +03:00
dde8898aae Bump zeroconf to 0.32.* (#3991) 2021-07-05 12:57:30 +03:00
72cc23ef46 Fix PyLint warning with "No exception type(s) specified (bare-except)" 2021-06-29 18:25:20 +03:00
5390b4ed42 Add Github token for Slack notification 2021-06-29 18:24:47 +03:00
17c7d90d52 Sync docs 2021-06-29 18:11:08 +03:00
5c3b5be613 Fix TypeError: 'NoneType' object is not callable 2021-06-29 18:07:45 +03:00
5ab7769745 Bump version to 5.2.0a8 2021-06-24 16:43:00 +03:00
05374d1145 Match buffered data from debugging server 2021-06-24 16:42:45 +03:00
311e10f91e Ensure all patterns are replaced in debug init script 2021-06-24 16:00:13 +03:00
2b94791387 Bump version to 5.2.0a7 2021-06-22 14:28:40 +03:00
fbcae11cd0 Fix project generator 2021-06-22 14:28:04 +03:00
0d6eff2a9a Sync docs 2021-06-22 14:27:33 +03:00
6a9b7fdb6d Update SPDX License List to 3.13 2021-06-03 16:32:53 +03:00
e8f703648a Docs: Use Python 3 for CI integration 2021-06-01 18:24:17 +03:00
710f82de0f Up uvicorn to 0.14 & click to 8.0 2021-06-01 17:59:18 +03:00
bee35acfa6 Sync docs 2021-06-01 17:56:55 +03:00
90fdaf80e4 Sync docs 2021-05-31 18:25:54 +03:00
27feb1ddd7 Added support for Click 8.0; updated other deps 2021-05-19 19:43:41 +03:00
2be7e0f7e6 Docs: Promote PlatformIO Labs blog posts 2021-05-13 15:28:09 +03:00
186ab70bf9 Add udev rule for Raspberry Pi Pico boards 2021-05-10 11:38:05 +03:00
0fa9006e45 Sync docs: CircleCI updates 2021-05-03 22:34:43 +03:00
60c83bae93 Docs: Sync dev-platforms 2021-05-01 13:44:28 +03:00
553c398c8e Show package "system" info before publishing 2021-04-30 18:06:35 +03:00
1c90bb383f Sync docs 2021-04-29 19:46:17 +03:00
4281225b02 Sync docs 2021-04-29 19:24:44 +03:00
14dc9c6c43 Sync docs 2021-04-29 18:38:44 +03:00
c9e10b1a3e Fix issue with broken redirect 2021-04-29 14:43:27 +03:00
915c850760 Docs: Fix JS redirect URL 2021-04-29 12:47:57 +03:00
2c3f430203 Tidy up Docs CI 2021-04-28 20:59:01 +03:00
1a152ed7fa Add deploy step to CI configuration 2021-04-28 20:18:23 +03:00
5953480807 Docs: Fix broken link for RTD page 2021-04-28 20:16:01 +03:00
b5c1a195be Fix PyLint issues: consider-using-with 2021-04-28 19:59:37 +03:00
310cc086c6 Docs: Minor fixes to "redirect" page generator 2021-04-28 19:59:12 +03:00
61d6cd3c18 Apply black formatter 2021-04-28 19:58:50 +03:00
cccabf5330 Add missed "sphinx-notfound-page" package for docs 2021-04-28 13:19:49 +03:00
6f33460afd Remove debugging code 2021-04-28 13:17:22 +03:00
603d524aaf Refactor docs to be deployed as a static content 2021-04-28 13:10:19 +03:00
eb2cd001b6 Use private "_idedata" target when fetching data for debugging 2021-04-24 18:01:35 +03:00
b5b57790be Validate package manifest when packing archive or publishing a package 2021-04-23 22:02:07 +03:00
286f4ef961 Bump version to 5.2.0a6 2021-04-21 20:52:27 +03:00
ad28d1906c Improve a package publishing process 2021-04-21 20:51:54 +03:00
dfdccac67d Remove unnecessary "ensure_python3()" blocks 2021-04-20 20:28:49 +03:00
b8c2752237 Docs: Add information on how to avoid extra script running when IDE fetches metadata 2021-04-16 13:36:53 +03:00
834c7b0def Bump version to 5.2.0a5 2021-04-12 22:38:56 +03:00
5bfe70142e Switch to project directory before starting debugging process 2021-04-12 22:38:21 +03:00
b35c5a22bb Fix a broken support for custom configuration file for pio debug command // Resolve #3922 2021-04-11 22:21:01 +03:00
eecc825c90 PyLint 2021-04-11 22:20:09 +03:00
3823c22dad Update Release Notes 2021-04-07 21:30:06 +03:00
551bd3dbfe Explicitly specify PROGSUFFIX when compiling final binary (#3918)
Resolves #3906
2021-04-02 17:09:38 +03:00
7e9956963a Remove a note with using pio ci for uploading // Resolve #3903 2021-04-02 15:23:34 +03:00
80c24a1993 Fixed an issue when "main.cpp" was generated for a new project for 8-bit development platforms // Resolve #3872 2021-04-02 15:19:18 +03:00
66091bae24 Disable GDB "startup-with-shell" only on Unix platform 2021-04-02 14:44:38 +03:00
73d4f10f4b Bump version to 5.2.0a4 2021-04-01 21:16:42 +03:00
ee7ea77fc3 Fixed an error "Unknown development platform" when running unit tests on a clean machine // Resolve #3901 2021-04-01 21:15:14 +03:00
32e1cbe2a3 Provide solution for issue #3417 2021-03-31 18:28:06 +03:00
3539724843 Update "zeroconf" dependency to 0.29 2021-03-31 17:33:26 +03:00
940b25f158 Sync docs & examples 2021-03-31 17:32:57 +03:00
37e601e5b5 Ensure that a serial port is ready before running unit tests on a remote target // Resolve #3742 2021-03-24 19:07:40 +02:00
0230374709 Document new VSCode settings: activateProjectOnTextEditorChange & autoOpenPlatformIOIniFile 2021-03-24 13:04:20 +02:00
86db237e5d Update Cppcheck and PVS-Studio packages // Resolve #3898 2021-03-23 21:17:32 +02:00
1542b1cebb Bump version to 5.2.0a3 2021-03-20 10:32:14 +02:00
990071af5c Fix issue with missed compat.path_to_unicode // Resolve #3894 2021-03-20 10:31:55 +02:00
f543e00307 Bump version to 5.2.0a2 2021-03-19 20:26:26 +02:00
34b4f8265a Debug unit tests created with PlatformIO Unit Testing solution // Resolve #948 2021-03-19 20:25:30 +02:00
a366d1af2a Use "target remote" for mpsdebug 2021-03-19 18:26:09 +02:00
ebe5785a91 Allow overriding default debugging flags from dev-platform 2021-03-19 17:11:25 +02:00
887d46725b Debug native (desktop) application on a host machine // Resolve #980 2021-03-19 17:02:11 +02:00
a326b718f2 Handle legacy $LOAD_CMD "init_cmds" 2021-03-19 16:09:38 +02:00
c14b298cb9 Fixed an issue with silent hanging when a custom debug server is not found // Resolve #3756 2021-03-19 15:55:42 +02:00
9cca8f3f55 Split debugging client to base and GDB // Resolve #3757 2021-03-19 15:47:20 +02:00
f5cee56740 Fix issue when disabling "debug_init_break" did not work 2021-03-19 14:09:43 +02:00
972d183d85 Use a cached build configuration 2021-03-19 13:46:54 +02:00
eebdf04357 Load "idedata" configuration from a dumped file 2021-03-19 13:46:27 +02:00
9ede20a367 Disable checking for "__PLATFORMIO_BUILD_DEBUG__" that is not available in g2 mode 2021-03-19 13:10:29 +02:00
b0c3e22a52 Configure a custom pattern to determine when debugging server is started with a new debug_server_ready_pattern option 2021-03-19 12:30:16 +02:00
a78db17784 Drop support for Python 2 2021-03-19 00:21:44 +02:00
dbb9998f69 Refactor debugging configuration, add support for server_ready_pattern // Resolve #3401 2021-03-18 23:42:54 +02:00
2745dbd124 PyLint fix 2021-03-17 23:14:22 +02:00
c0357daf01 Remove Python 2 code 2021-03-17 21:08:06 +02:00
064fa6027d Bump version to 5.2.0a1 2021-03-17 20:07:26 +02:00
779e02a05e Use "connect_read_pipe" on Unix 2021-03-17 20:06:52 +02:00
e222d0356a Merge branch 'feature/debug-async' into develop 2021-03-17 18:25:47 +02:00
d2ae333bb8 Merge branch 'release/v5.1.1' 2021-03-17 18:17:46 +02:00
764c42a810 Merge tag 'v5.1.1' into develop
Bump version to 5.1.1
2021-03-17 18:17:46 +02:00
18b18f1c3d Bump version to 5.1.1 2021-03-17 18:17:40 +02:00
b54a8b40a4 Refactor Unified Debugger to native Python Asynchronous I/O stack // Resolve #3793 , Resolve #3595 2021-03-17 17:42:11 +02:00
edf724d20d Sync docs 2021-03-15 17:01:44 +02:00
622a190a61 Avoid "rustup" when building cryptography for contrib-pysite // Resolve #3865 2021-03-15 17:00:16 +02:00
5b4a78ba20 Bump version to 5.1.1b1 2021-03-11 14:49:20 +02:00
44b85f6e4b Switch Cppcheck to analyze project per file // Issue #3797
Cppcheck doesn't provide a proper report when one of the files in the check list is broken.
If we run the analysis on a per-file basis, then Cppcheck will be able to report at least the defects
from valid source files.
2021-03-11 13:49:27 +02:00
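A minimal sketch of the per-file analysis idea described above; the flags and report handling are illustrative and not the exact options PlatformIO passes to Cppcheck:

    import subprocess

    def run_cppcheck_per_file(files, flags=("--enable=warning",)):
        # Invoke Cppcheck once per source file so that a file it cannot
        # parse only loses its own results instead of aborting the report
        # for every other file in the check list.
        report = {}
        for path in files:
            proc = subprocess.run(
                ["cppcheck", *flags, path],
                capture_output=True,
                text=True,
            )
            report[path] = proc.stderr  # Cppcheck prints defects to stderr
        return report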
7f1f760645 Preserve user-specified debug configurations in VSCode integration (#3878)
* Preserve user-specified debug configurations in VSCode integration

Issue #3824

* Tidy up Python code
2021-03-10 14:54:52 +02:00
54d8c96c30 Update SPDX license list to 3.12 2021-03-09 22:01:58 +02:00
c6ab7827e7 Fixed incorrect size of unnecessary data // Resolve #3830 2021-03-09 19:26:22 +02:00
ae26079e2e Fixed an issue when code inspection fails with "Bad JSON" // Resolve #3790 2021-03-09 19:20:30 +02:00
3e993156f2 Suppress printing unnecessary info in silent mode // Resolve #3837 2021-03-08 12:16:53 +02:00
3b2fafd789 Add new test for check command and project with whitespace 2021-03-04 22:27:00 +02:00
72ebaddcb8 Handle possible whitespaces in project path for PVS-Studio (#3849) 2021-03-04 22:22:09 +02:00
5a9950cc19 Sync docs 2021-03-04 18:52:12 +02:00
cf29d7e400 Typo fix 2021-03-04 18:52:02 +02:00
244dba3614 JFrog shuts down Bintray 2021-03-03 21:31:42 +02:00
21886517e1 Bump version to 5.1.1a3 2021-03-01 17:59:58 +02:00
3996236729 Report detailed server error to PIO Home frontend 2021-03-01 17:59:40 +02:00
560cb3ac82 Sync docs 2021-02-27 19:57:40 +02:00
81c7e23ae9 Bump version to 5.1.1a2 2021-02-27 19:44:11 +02:00
0b8bd6d4fc Migrate to Async JSON-RPC package 2021-02-27 19:43:43 +02:00
7c271c8207 Better detecting of native dev-platform for unit testing // Resolve #3851 2021-02-27 18:53:26 +02:00
58947d91a6 PyLint fixes 2021-02-27 17:13:30 +02:00
20096be990 Sync docs 2021-02-26 13:39:13 +02:00
7c8508b651 Fixed an issue with device monitor when the “send_on_enter” filter didn’t send EOL chars // Resolve #3787 2021-02-10 14:43:50 +02:00
b56d0fdd9b Sync docs & examples 2021-02-10 14:43:12 +02:00
d0cc06f766 Move isort settings to "tox.ini" 2021-02-06 16:56:44 +02:00
d8d2b215d1 Minor improvement 2021-02-03 23:11:47 +02:00
c478d383b4 Sync docs 2021-02-03 23:10:01 +02:00
e01cd1c037 Bump version to 5.1.1a1 2021-02-01 13:01:31 +02:00
e63019c469 Fixed a "The command line is too long" issue with a linking process on Windows // Resolve #3827 2021-02-01 12:52:00 +02:00
90a325a1b2 Merge branch 'release/v5.1.0' 2021-01-28 19:23:14 +02:00
698594525f Merge tag 'v5.1.0' into develop
Bump version to 5.1.0
2021-01-28 19:23:14 +02:00
fd540148f3 Bump version to 5.1.0 2021-01-28 19:23:06 +02:00
078a024931 Configure default debug_speed 2021-01-28 13:52:11 +02:00
f8193b2419 Bump version to 5.1.0rc3 2021-01-27 23:06:42 +02:00
808ba603c5 Fixed an issue when "pio device monitor --eol" and “send_on_enter” filter do not work properly // Resolve #3787 2021-01-27 23:06:18 +02:00
61d70fa688 Include Unity framework for IDE data only if there are tests in project 2021-01-27 22:40:19 +02:00
493a33e754 Drop support for Python 2 2021-01-27 22:25:42 +02:00
bd75c3e559 Bump version to 5.1.0rc2 2021-01-27 20:58:13 +02:00
cb9e72a879 Dump build flags using SCons.Subst.SUBST_CMD 2021-01-27 20:57:53 +02:00
9d2fd4982f Cleanup code 2021-01-27 20:40:25 +02:00
eed9a0e376 Merge branch 'feature/3792-maxleng-cmd' into develop 2021-01-27 20:30:39 +02:00
d77dbb2cca Use "TEMPFILEARGESCFUNC" for GCC workaround on Windows 2021-01-27 20:30:28 +02:00
7810946484 Use project build folder for tempfile workaround with command maxlen 2021-01-27 18:47:54 +02:00
e2906e3be5 Refactored a workaround for a maximum command line character limitation // Resolve #3792 2021-01-27 16:10:13 +02:00
0a8b66ee95 Configure a custom debug adapter speed using a new debug_speed option // Resolve #3799 2021-01-26 21:21:41 +02:00
8ff270c5f7 Skip non-existing packages when checking for updates // Resolve #3818 2021-01-26 17:05:37 +02:00
4012a86cac Fixed a "ValueError: Invalid simple block" when uninstalling a package with a custom name and external source // Resolve #3816 2021-01-26 16:15:11 +02:00
dd4fff3a79 Bump version to 5.1.0rc1 2021-01-25 23:50:41 +02:00
0ed99b7687 Added a new `--session-id` option to `pio home` // Resolve #3397 2021-01-25 23:44:26 +02:00
2c389ae11e Added new check_prune_system_threshold setting 2021-01-24 17:21:22 +02:00
15ff8f9d2a Bump version to 5.0.5b5 2021-01-24 15:58:07 +02:00
bd4d3b914b Revert "lib_compat_mode" changes // Resolve #3811 Resolve #3806 2021-01-24 15:49:56 +02:00
59b02120b6 New options for system prune command: remove unnecessary core and development platform packages // Resolve #923 2021-01-23 23:20:53 +02:00
92655c30c1 Disabled automatic removal of unnecessary development platform packages // Resolve #3708, Resolve #3770 2021-01-23 22:34:48 +02:00
484567f242 Project's "lib_compat_mode" has higher priority than "library.json" 2021-01-23 15:54:52 +02:00
ef6e70a38b Fixed an issue when unnecessary packages were removed in `update --dry-run` mode // Resolve #3809 2021-01-23 15:24:32 +02:00
e695e30a9b Fixed an issue with compiler driver for ".ccls" language server // Resolve #3808 2021-01-23 14:44:53 +02:00
65e67b64bd Remove unnecessary dependencies from contrib-pysite 2021-01-22 22:55:45 +02:00
ddbe339541 Update to iSort 5.0 2021-01-22 22:55:02 +02:00
b2c0e6a8c2 Sync docs 2021-01-22 22:46:09 +02:00
f9384ded27 Fixed an issue when “strict” compatibility mode was not used for a library with custom “platforms” field in library.json manifest // Resolve #3806 2021-01-22 22:45:36 +02:00
4488f25ce0 Bump version to 5.0.5b4 2021-01-20 23:26:22 +02:00
52b22b5784 Fixed a "UnicodeDecodeError: 'utf-8' codec can't decode byte" // Resolve #3804, Resolve #3417 2021-01-20 20:45:23 +02:00
5a356140d6 Sync examples and docs 2021-01-20 20:44:43 +02:00
e79de0108c Upgraded build engine to SCons 4.1 2021-01-20 16:15:05 +02:00
985f31877c Automatically install tool-unity when there are tests and "idedata" target is called 2021-01-20 15:14:45 +02:00
11a71b7fbb Bump version to 5.0.5b3 2021-01-20 14:37:19 +02:00
7f26c11c9d Fix an issue with "'coroutine' object has no attribute 'addCallback'" 2021-01-20 14:36:45 +02:00
9b93fcd947 Do not install tool-unity even for non-test projects 2021-01-20 14:27:03 +02:00
733ca5174b Bump version to 5.0.5b2 2021-01-18 21:19:57 +02:00
bd897d780b Implement "__shutdown__" endpoint for PIO Home server 2021-01-18 21:19:15 +02:00
429065d2b9 Legacy support for PIO Home "__shutdown__" query request 2021-01-18 20:53:19 +02:00
b90734f1e2 List multicast DNS services only when PY3 2021-01-18 20:51:50 +02:00
db97a7d9d3 Bump version to 5.0.5b1 2021-01-18 18:21:27 +02:00
6ff67aeadf Significantly speedup PlatformIO Home loading time by migrating to native Python 3 Asynchronous I/O 2021-01-18 18:20:26 +02:00
dd7d282d17 Improved listing of multicast DNS services 2021-01-18 18:17:10 +02:00
4e637ae58a Drop Python 2 from PIO Core test 2021-01-18 18:15:15 +02:00
1ec2e55322 Add udev rule for Atmel AVR Dragon (#3786) 2021-01-04 13:46:09 +02:00
556eb3f8c1 Docs: Update "Wiring Connections" section for ST-Link debugging probe 2020-12-31 13:47:05 +02:00
76b49ebc95 Increase timeout to 60sec when starting debug server and "ready_pattern" is used 2020-12-30 14:38:18 +02:00
e82443a302 Bump version to 5.0.5a1 2020-12-30 14:29:41 +02:00
5de86a6416 Check for debug server's "ready_pattern" in "stderr" 2020-12-30 14:29:19 +02:00
3f3c8cabb8 Merge branch 'release/v5.0.4' 2020-12-30 13:23:11 +02:00
cd59aa9afb Merge tag 'v5.0.4' into develop
Bump version to 5.0.4
2020-12-30 13:23:11 +02:00
34e12e575b Bump version to 5.0.4 2020-12-30 13:23:04 +02:00
4c8c261ab4 Raise an exception when trying to pack a package from tar.gz on Windows // Resolve #3776 2020-12-28 20:12:53 +02:00
099bb3b9ff Sync dev-platforms: docs + examples 2020-12-28 13:51:34 +02:00
c623a6aacc Fixed an issue with package publishing on Windows when Unix permissions are not preserved // Resolve #3776 2020-12-28 13:08:12 +02:00
ce7356794d Test examples from the official dev-platforms 2020-12-26 21:43:41 +02:00
523494f9cf Ignore CI tests from tokisaki dev-platform 2020-12-26 20:18:15 +02:00
0edc867d45 Bump version to 5.0.4rc1 2020-12-26 16:10:44 +02:00
ce4c45a075 Show a warning message about deprecated support for Python 2 and Python 3.5 2020-12-26 16:10:07 +02:00
e29941e3eb Update release notes with check tools updates 2020-12-22 21:30:01 +02:00
86ce3595f6 Update check tools packages // Resolve #3758
Updated tools: Cppcheck v2.3, PVS-Studio v7.11
2020-12-22 00:44:09 +02:00
6e958b8415 Handle possible issues when check tool cannot be executed // Resolve #3753
Now, each tool individually decides under what conditions the check is considered failed.
2020-12-22 00:21:32 +02:00
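The note above describes the failure policy introduced here: each check tool decides for itself when a run counts as failed. A rough, self-contained sketch of that shape (class and method names are invented, not PlatformIO's internal API):

    import subprocess

    class CheckToolBase:
        def run_tool(self, cmd):
            # Execute the external checker and return (exit code, defect lines).
            result = subprocess.run(cmd, capture_output=True, text=True)
            defects = [line for line in result.stderr.splitlines() if line.strip()]
            return result.returncode, defects

        def is_failed(self, returncode, defects):
            # Default policy: any non-zero exit code means the check failed.
            return returncode != 0

        def execute(self, cmd):
            returncode, defects = self.run_tool(cmd)
            if self.is_failed(returncode, defects):
                raise RuntimeError(self.__class__.__name__ + " check failed")
            return defects

    class CppcheckTool(CheckToolBase):
        def is_failed(self, returncode, defects):
            # Cppcheck may exit non-zero when a file cannot be parsed;
            # treat the run as failed only if nothing useful was reported.
            return returncode != 0 and not defects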
d485703768 Use "Updating to X.Y.Z" instead of "Outdated" when doing a real package update 2020-12-11 17:53:48 +02:00
109e2107d1 Sync docs 2020-12-11 16:14:08 +02:00
3469905365 Decode subprocess output only for byte-strings 2020-12-02 15:15:17 +02:00
75b3846f8f Sync docs & examples 2020-12-02 15:15:02 +02:00
a9ec38208c Bump version to 5.0.4b1 2020-11-30 20:24:45 +02:00
c38b9a4144 Fixed a "git-sh-setup: file not found" error when installing project dependencies from Git VCS // Resolve #3740 2020-11-30 20:23:30 +02:00
b6128aeaa1 Apply formatting 2020-11-22 22:32:03 +02:00
881782be05 Allow spaces and dots in example's name (package manifest) 2020-11-22 21:42:25 +02:00
0c05930501 Sync docs 2020-11-22 21:41:47 +02:00
b96f2a19b5 Bump version to 5.0.4a2 2020-11-14 20:10:45 +02:00
c1906714ee Give a constant "PlatformIO" name for the C/C++ configuration 2020-11-14 20:10:22 +02:00
32181d1bd2 Improved `.ccls` configuration file for Emacs, Vim, and Sublime Text integrations // Issue #3735 2020-11-14 19:55:24 +02:00
7dfb413d87 Typo fix 2020-11-12 21:42:53 +02:00
7934a96ad1 Added "Core" suffix when showing PlatformIO Core version using `pio --version` command 2020-11-12 20:42:27 +02:00
abddbf9c7d Bump version to 5.0.4a1 2020-11-12 18:56:55 +02:00
77e66241f7 Do not provide "intelliSenseMode" option when generating configuration for VSCode C/C++ extension 2020-11-12 18:56:34 +02:00
4b3f2e19a4 Merge branch 'release/v5.0.3' 2020-11-12 17:57:30 +02:00
b29c6485a8 Merge tag 'v5.0.3' into develop
Bump version to 5.0.3
2020-11-12 17:57:30 +02:00
f4dba7a68c Bump version to 5.0.3 2020-11-12 17:56:12 +02:00
2817408db3 Fixed an issue when pio package pack ignores some folders // Resolve #3730 2020-11-12 16:06:54 +02:00
9ff3c758eb Fix tests 2020-11-12 15:35:37 +02:00
3dcc189740 Use custom Pre-Debug task only for multi-env project 2020-11-12 15:35:19 +02:00
4a12d1954e Fixed an issue when the package manager tries to install a built-in library from the registry // Resolve #3662 2020-11-12 15:27:34 +02:00
e4d645110a Merge branch 'develop' of https://github.com/platformio/platformio-core into develop
# Conflicts:
#	HISTORY.rst
2020-11-12 15:25:51 +02:00
01a32067d5 Print ignored environments and test suites only in verbose mode
Resolve #3726
2020-11-12 15:22:47 +02:00
fc5ce4739c Added an error selector for Sublime Text build runner // Resolve #3733 2020-11-12 15:05:01 +02:00
ae7b8f9ecf Fix tests 2020-11-11 20:52:23 +02:00
0f5d2d6821 Sync docs 2020-11-11 19:44:39 +02:00
48eca22a00 Force VSCode's intelliSenseMode to "gcc-x64" when GCC toolchain is used 2020-11-11 14:19:58 +02:00
5e164493a8 Sync docs 2020-11-09 11:39:26 +02:00
ead99208f2 Increase example name length in manifest to 255 chars 2020-11-09 11:38:46 +02:00
4f5ad05792 Docs: Document "Introducing Strict SSL/TLS" in migration 2020-11-04 14:07:40 +02:00
bc52e72605 Bump version to 5.0.3a2 2020-11-03 15:11:52 +02:00
038674835a Workaround for a broken locale 2020-11-02 12:27:17 +02:00
00f21c17ca Merge branch 'develop' of https://github.com/platformio/platformio-core into develop 2020-11-01 21:06:47 +02:00
818a1508a0 Docs: Use native ProjectConfig in the advanced scripting examples 2020-11-01 21:06:23 +02:00
2d9480a6a7 Support for GitPod environment 2020-11-01 21:05:03 +02:00
0bec4e25c8 Add support for C++ language standard in QtCreator template
Resolve #3719
2020-11-01 19:03:14 +02:00
950a540df4 Bump version to 5.0.3a1 2020-10-31 19:07:45 +02:00
2e66c5f807 Generate a working "projectEnvName" for PlatformIO IDE's debugger for VSCode 2020-10-31 19:07:04 +02:00
7033c2616b Docs: Add info how to access PlatformIO Core CLI in VSCode 2020-10-31 12:44:37 +02:00
278 changed files with 15168 additions and 7245 deletions

View File

@ -8,14 +8,19 @@ jobs:
fail-fast: false
matrix:
os: [ubuntu-latest, windows-latest, macos-latest]
python-version: [2.7, 3.7, 3.8]
python-version: ["3.6", "3.7", "3.8", "3.9", "3.10"]
exclude:
- os: macos-latest
python-version: "3.6"
- os: windows-latest
python-version: "3.10"
runs-on: ${{ matrix.os }}
steps:
- uses: actions/checkout@v2
with:
submodules: "recursive"
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v1
uses: actions/setup-python@v2
with:
python-version: ${{ matrix.python-version }}
- name: Install dependencies
@ -42,3 +47,4 @@ jobs:
job_name: '*Core*'
commit: true
url: ${{ secrets.SLACK_BUILD_WEBHOOK }}
token: ${{ secrets.SLACK_GITHUB_TOKEN }}

View File

@ -4,13 +4,14 @@ on: [push, pull_request]
jobs:
build:
name: Build Docs
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
with:
submodules: "recursive"
- name: Set up Python
uses: actions/setup-python@v1
uses: actions/setup-python@v2
with:
python-version: 3.7
- name: Install dependencies
@ -29,4 +30,80 @@ jobs:
type: ${{ job.status }}
job_name: '*Docs*'
commit: true
url: ${{ secrets.SLACK_BUILD_WEBHOOK }}
url: ${{ secrets.SLACK_BUILD_WEBHOOK }}
token: ${{ secrets.SLACK_GITHUB_TOKEN }}
- name: Preserve Docs
if: ${{ github.event_name == 'push' }}
run: |
tar -czvf docs.tar.gz -C docs/_build html rtdpage
- name: Save artifact
if: ${{ github.event_name == 'push' }}
uses: actions/upload-artifact@v2
with:
name: docs
path: ./docs.tar.gz
deploy:
name: Deploy Docs
needs: build
runs-on: ubuntu-latest
env:
DOCS_REPO: platformio/platformio-docs
DOCS_DIR: platformio-docs
LATEST_DOCS_DIR: latest-docs
RELEASE_BUILD: ${{ startsWith(github.ref, 'refs/tags/v') }}
if: ${{ github.event_name == 'push' }}
steps:
- name: Download artifact
uses: actions/download-artifact@v2
with:
name: docs
- name: Unpack artifact
run: |
mkdir ./${{ env.LATEST_DOCS_DIR }}
tar -xzf ./docs.tar.gz -C ./${{ env.LATEST_DOCS_DIR }}
- name: Delete Artifact
uses: geekyeggo/delete-artifact@v1
with:
name: docs
- name: Select Docs type
id: get-destination-dir
run: |
if [[ ${{ env.RELEASE_BUILD }} == true ]]; then
echo "::set-output name=dst_dir::stable"
else
echo "::set-output name=dst_dir::latest"
fi
- name: Checkout latest Docs
continue-on-error: true
uses: actions/checkout@v2
with:
repository: ${{ env.DOCS_REPO }}
path: ${{ env.DOCS_DIR }}
ref: gh-pages
- name: Synchronize Docs
run: |
rm -rf ${{ env.DOCS_DIR }}/.git
rm -rf ${{ env.DOCS_DIR }}/en/${{ steps.get-destination-dir.outputs.dst_dir }}
mkdir -p ${{ env.DOCS_DIR }}/en/${{ steps.get-destination-dir.outputs.dst_dir }}
cp -rf ${{ env.LATEST_DOCS_DIR }}/html/* ${{ env.DOCS_DIR }}/en/${{ steps.get-destination-dir.outputs.dst_dir }}
if [[ ${{ env.RELEASE_BUILD }} == false ]]; then
rm -rf ${{ env.DOCS_DIR }}/page
mkdir -p ${{ env.DOCS_DIR }}/page
cp -rf ${{ env.LATEST_DOCS_DIR }}/rtdpage/* ${{ env.DOCS_DIR }}/page
fi
- name: Validate Docs
run: |
if [ -z "$(ls -A ${{ env.DOCS_DIR }})" ]; then
echo "Docs folder is empty. Aborting!"
exit 1
fi
- name: Deploy to Github Pages
uses: peaceiris/actions-gh-pages@v3
with:
personal_token: ${{ secrets.DEPLOY_GH_DOCS_TOKEN }}
external_repository: ${{ env.DOCS_REPO }}
publish_dir: ./${{ env.DOCS_DIR }}
commit_message: Sync Docs

View File

@ -7,15 +7,15 @@ jobs:
strategy:
fail-fast: false
matrix:
os: [ubuntu-16.04, windows-latest, macos-latest]
python-version: [2.7, 3.7]
os: [ubuntu-18.04, windows-latest, macos-latest]
python-version: [3.7]
runs-on: ${{ matrix.os }}
steps:
- uses: actions/checkout@v2
with:
submodules: "recursive"
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v1
uses: actions/setup-python@v2
with:
python-version: ${{ matrix.python-version }}
- name: Install dependencies
@ -26,7 +26,8 @@ jobs:
- name: Run on Linux
if: startsWith(matrix.os, 'ubuntu')
env:
PIO_INSTALL_DEVPLATFORMS_IGNORE: "ststm8,infineonxmc,siwigsm,intel_mcs51,aceinna_imu"
PIO_INSTALL_DEVPLATFORMS_OWNERNAMES: "platformio"
PIO_INSTALL_DEVPLATFORMS_IGNORE: "ststm8,infineonxmc,intel_mcs51"
run: |
# ChipKIT issue: install 32-bit support for GCC PIC32
sudo apt-get install libc6-i386
@ -40,7 +41,8 @@ jobs:
- name: Run on macOS
if: startsWith(matrix.os, 'macos')
env:
PIO_INSTALL_DEVPLATFORMS_IGNORE: "ststm8,infineonxmc,siwigsm,microchippic32,gd32v,nuclei,lattice_ice40"
PIO_INSTALL_DEVPLATFORMS_OWNERNAMES: "platformio"
PIO_INSTALL_DEVPLATFORMS_IGNORE: "ststm8,infineonxmc,microchippic32,lattice_ice40,gd32v"
run: |
df -h
tox -e testexamples
@ -50,7 +52,8 @@ jobs:
env:
PLATFORMIO_CORE_DIR: C:/pio
PLATFORMIO_WORKSPACE_DIR: C:/pio-workspace/$PROJECT_HASH
PIO_INSTALL_DEVPLATFORMS_IGNORE: "ststm8,infineonxmc,siwigsm,riscv_gap"
PIO_INSTALL_DEVPLATFORMS_OWNERNAMES: "platformio"
PIO_INSTALL_DEVPLATFORMS_IGNORE: "ststm8,infineonxmc,riscv_gap"
run: |
tox -e testexamples
@ -62,3 +65,4 @@ jobs:
job_name: '*Examples*'
commit: true
url: ${{ secrets.SLACK_BUILD_WEBHOOK }}
token: ${{ secrets.SLACK_GITHUB_TOKEN }}

.gitignore vendored
View File

@ -1,6 +1,6 @@
*.egg-info
*.pyc
.pioenvs
__pycache__
.tox
docs/_build
dist

View File

@ -1,3 +0,0 @@
[settings]
line_length=88
known_third_party=OpenSSL, SCons, autobahn, jsonrpc, twisted, zope

View File

@ -14,8 +14,9 @@ disable=
too-few-public-methods,
useless-object-inheritance,
useless-import-alias,
fixme,
bad-option-value,
consider-using-dict-items,
consider-using-f-string,
; PY2 Compat
super-with-arguments,

View File

@ -1,12 +0,0 @@
# See https://docs.readthedocs.io/en/stable/config-file/index.html
version: 2
sphinx:
configuration: docs/conf.py
formats:
- pdf
submodules:
include: all

View File

@ -3,7 +3,7 @@ Contributing
To get started, <a href="https://cla-assistant.io/platformio/platformio-core">sign the Contributor License Agreement</a>.
1. Fork the repository on GitHub.
1. Fork the repository on GitHub
2. Clone repository `git clone --recursive https://github.com/YourGithubUsername/platformio-core.git`
3. Run `pip install tox`
4. Go to the project root (where `tox.ini` is located) and run `tox -e py37`
@ -18,4 +18,4 @@ To get started, <a href="https://cla-assistant.io/platformio/platformio-core">si
8. Run the tests `make test`
9. Build documentation `tox -e docs` (creates a directory _build under docs where you can find the html)
10. Commit changes to your forked repository
11. Submit a Pull Request on GitHub.
11. Submit a Pull Request on GitHub

View File

@ -1,148 +1,146 @@
Release Notes
=============
.. |PIOCONF| replace:: `"platformio.ini" <https://docs.platformio.org/en/latest/projectconf.html>`__ configuration file
.. |LDF| replace:: `LDF <https://docs.platformio.org/en/latest/librarymanager/ldf.html>`__
.. |INTERPOLATION| replace:: `Interpolation of Values <https://docs.platformio.org/en/latest/projectconf/interpolation.html>`__
.. _release_notes_6:
PlatformIO Core 6
-----------------
**A professional collaborative platform for declarative, safety-critical, and test-driven embedded development.**
6.0.1 (2022-05-17)
~~~~~~~~~~~~~~~~~~
* Improved support for the renamed configuration options (`issue #4270 <https://github.com/platformio/platformio-core/issues/4270>`_)
* Fixed an issue when calling the built-in `pio device monitor <https://docs.platformio.org/en/latest/core/userguide/device/cmd_monitor.html#filters>`__ filters
* Fixed an issue when using |INTERPOLATION| and merging str+int options (`issue #4271 <https://github.com/platformio/platformio-core/issues/4271>`_)
6.0.0 (2022-05-16)
~~~~~~~~~~~~~~~~~~
Please check the `Migration guide from 5.x to 6.0 <https://docs.platformio.org/en/latest/core/migration.html>`__.
* **Package Management**
- New unified Package Management CLI (``pio pkg``):
* `pio pkg exec <https://docs.platformio.org/en/latest/core/userguide/pkg/cmd_exec.html>`_ - run command from package tool (`issue #4163 <https://github.com/platformio/platformio-core/issues/4163>`_)
* `pio pkg install <https://docs.platformio.org/en/latest/core/userguide/pkg/cmd_install.html>`_ - install the project dependencies or custom packages
* `pio pkg list <https://docs.platformio.org/en/latest/core/userguide/pkg/cmd_list.html>`__ - list installed packages
* `pio pkg outdated <https://docs.platformio.org/en/latest/core/userguide/pkg/cmd_outdated.html>`__ - check for project outdated packages
* `pio pkg search <https://docs.platformio.org/en/latest/core/userguide/pkg/cmd_search.html>`__ - search for packages
* `pio pkg show <https://docs.platformio.org/en/latest/core/userguide/pkg/cmd_show.html>`__ - show package information
* `pio pkg uninstall <https://docs.platformio.org/en/latest/core/userguide/pkg/cmd_uninstall.html>`_ - uninstall the project dependencies or custom packages
* `pio pkg update <https://docs.platformio.org/en/latest/core/userguide/pkg/cmd_update.html>`__ - update the project dependencies or custom packages
- Package Manifest
* Added support for `"scripts" <https://docs.platformio.org/en/latest/librarymanager/config.html#scripts>`__ (`issue #485 <https://github.com/platformio/platformio-core/issues/485>`_)
* Added support for `multi-licensed <https://docs.platformio.org/en/latest/librarymanager/config.html#license>`__ packages using SPDX Expressions (`issue #4037 <https://github.com/platformio/platformio-core/issues/4037>`_)
* Added support for `"dependencies" <https://docs.platformio.org/en/latest/librarymanager/config.html#dependencies>`__ declared in a "tool" package manifest
- Added support for `symbolic links <https://docs.platformio.org/en/latest/core/userguide/pkg/cmd_install.html#local-folder>`__ that allow pointing the Package Manager to a local source folder (`issue #3348 <https://github.com/platformio/platformio-core/issues/3348>`_)
- Automatically install dependencies of the local (private) project libraries (`issue #2910 <https://github.com/platformio/platformio-core/issues/2910>`_)
- Improved detection of a package type from the tarball archive (`issue #3828 <https://github.com/platformio/platformio-core/issues/3828>`_)
- Ignore files according to the patterns declared in ".gitignore" when using the `pio package pack <https://docs.platformio.org/en/latest/core/userguide/pkg/cmd_pack.html>`__ command (`issue #4188 <https://github.com/platformio/platformio-core/issues/4188>`_)
- Dropped automatic updates of global libraries and development platforms (`issue #4179 <https://github.com/platformio/platformio-core/issues/4179>`_)
- Dropped support for the "pythonPackages" field in "platform.json" manifest in favor of `Extra Python Dependencies <https://docs.platformio.org/en/latest/scripting/examples/extra_python_packages.html>`__
- Fixed an issue when dependencies manually removed from the |PIOCONF| were not uninstalled from the storage (`issue #3076 <https://github.com/platformio/platformio-core/issues/3076>`_)
* **Unit Testing**
- Refactored the `Unit Testing <https://docs.platformio.org/en/latest/advanced/unit-testing/index.html>`_ solution and its documentation from scratch
- New: `Test Hierarchy <https://docs.platformio.org/en/latest/advanced/unit-testing/structure.html>`_ (`issue #4135 <https://github.com/platformio/platformio-core/issues/4135>`_)
- New: `Doctest <https://docs.platformio.org/en/latest/advanced/unit-testing/frameworks/doctest.html>`__ testing framework (`issue #4240 <https://github.com/platformio/platformio-core/issues/4240>`_)
- New: `GoogleTest <https://docs.platformio.org/en/latest/advanced/unit-testing/frameworks/googletest.html>`__ testing and mocking framework (`issue #3572 <https://github.com/platformio/platformio-core/issues/3572>`_)
- New: `Semihosting <https://docs.platformio.org/en/latest/advanced/unit-testing/semihosting.html>`__ (`issue #3516 <https://github.com/platformio/platformio-core/issues/3516>`_)
- New: Hardware `Simulators <https://docs.platformio.org/en/latest/advanced/unit-testing/simulators/index.html>`__ for Unit Testing (QEMU, Renode, SimAVR, and custom solutions)
- New: ``test`` `build configuration <https://docs.platformio.org/en/latest/projectconf/build_configurations.html>`__
- Added support for a `custom testing framework <https://docs.platformio.org/en/latest/advanced/unit-testing/frameworks/custom/index.html>`_
- Added support for a custom `testing command <https://docs.platformio.org/en/latest/projectconf/section_env_test.html#test-testing-command>`__
- Added support for a `custom Unity library <https://docs.platformio.org/en/latest/advanced/unit-testing/frameworks/custom/examples/custom_unity_library.html>`__ (`issue #3980 <https://github.com/platformio/platformio-core/issues/3980>`_)
- Added support for the ``socket://`` and ``rfc2217://`` protocols using `test_port <https://docs.platformio.org/en/latest/projectconf/section_env_test.html#test-port>`__ option (`issue #4229 <https://github.com/platformio/platformio-core/issues/4229>`_)
- List available project tests with a new `pio test --list-tests <https://docs.platformio.org/en/latest/core/userguide/cmd_test.html#cmdoption-pio-test-list-tests>`__ option
- Pass extra arguments to the testing program with a new `pio test --program-arg <https://docs.platformio.org/en/latest/core/userguide/cmd_test.html#cmdoption-pio-test-a>`__ option (`issue #3132 <https://github.com/platformio/platformio-core/issues/3132>`_)
- Generate reports in JUnit and JSON formats using the `pio test <https://docs.platformio.org/en/latest/core/userguide/cmd_test.html>`__ command (`issue #2891 <https://github.com/platformio/platformio-core/issues/2891>`_)
- Provide more information when the native program crashes on the host (errors with a non-zero return code) (`issue #3429 <https://github.com/platformio/platformio-core/issues/3429>`_)
- Improved automatic detection of a testing serial port (`issue #4076 <https://github.com/platformio/platformio-core/issues/4076>`_)
- Fixed an issue when command line parameters (``--ignore``, ``--filter``) do not override values defined in the |PIOCONF| (`issue #3845 <https://github.com/platformio/platformio-core/issues/3845>`_)
- Renamed the "test_build_project_src" project configuration option to the `test_build_src <https://docs.platformio.org/en/latest//projectconf/section_env_test.html#test-build-src>`__
- Removed the "test_transport" option in favor of the `Custom "unity_config.h" <https://docs.platformio.org/en/latest/advanced/unit-testing/frameworks/unity.html>`_
* **Static Code Analysis**
- Updated analysis tools:
* `Cppcheck <https://docs.platformio.org/en/latest/plus/check-tools/cppcheck.html>`__ v2.7 with various checker improvements and fixed false positives
* `PVS-Studio <https://docs.platformio.org/en/latest/plus/check-tools/pvs-studio.html>`__ v7.18 with improved and updated semantic analysis system
- Added support for the custom `Clang-Tidy <https://docs.platformio.org/en/latest/plus/check-tools/clang-tidy.html>`__ configuration file (`issue #4186 <https://github.com/platformio/platformio-core/issues/4186>`_)
- Added ability to override a tool version using the `platform_packages <https://docs.platformio.org/en/latest/projectconf/section_env_platform.html#platform-packages>`__ option (`issue #3798 <https://github.com/platformio/platformio-core/issues/3798>`_)
- Fixed an issue with improper handling of defects that don't specify a source file (`issue #4237 <https://github.com/platformio/platformio-core/issues/4237>`_)
* **Build System**
- Show project dependency licenses when building in the verbose mode
- Fixed an issue when |LDF| ignores the project `lib_deps <https://docs.platformio.org/en/latest/projectconf/section_env_library.html#lib-deps>`__ while resolving library dependencies (`issue #3598 <https://github.com/platformio/platformio-core/issues/3598>`_)
- Fixed an issue with calling an extra script located outside a project (`issue #4220 <https://github.com/platformio/platformio-core/issues/4220>`_)
- Fixed an issue when the GCC preprocessor was applied to ".s" assembly files on a case-insensitive OS such as Windows (`issue #3917 <https://github.com/platformio/platformio-core/issues/3917>`_)
- Fixed an issue when |LDF| ignores `build_src_flags <https://docs.platformio.org/en/latest/projectconf/section_env_build.html#build-src-flags>`__ in the "deep+" mode (`issue #4253 <https://github.com/platformio/platformio-core/issues/4253>`_)
* **Integration**
- Added a new build variable (``COMPILATIONDB_INCLUDE_TOOLCHAIN``) to include toolchain paths in the compilation database (`issue #3735 <https://github.com/platformio/platformio-core/issues/3735>`_); see the extra-script sketch after these notes
- Changed a default path for compilation database `compile_commands.json <https://docs.platformio.org/en/latest/integration/compile_commands.html>`__ to the project root
- Enhanced integration for Qt Creator (`issue #3046 <https://github.com/platformio/platformio-core/issues/3046>`_)
* **Project Configuration**
- Extended |INTERPOLATION| with ``${this}`` pattern (`issue #3953 <https://github.com/platformio/platformio-core/issues/3953>`_)
- Embed environment name of the current section in the |PIOCONF| using ``${this.__env__}`` pattern
- Renamed the "src_build_flags" project configuration option to the `build_src_flags <https://docs.platformio.org/en/latest/projectconf/section_env_build.html#build-src-flags>`__
- Renamed the "src_filter" project configuration option to the `build_src_filter <https://docs.platformio.org/en/latest/projectconf/section_env_build.html#build-src-filter>`__
* **Miscellaneous**
- Pass extra arguments to the `native <https://docs.platformio.org/en/latest/platforms/native.html>`__ program with a new `pio run --program-arg <https://docs.platformio.org/en/latest/core/userguide/cmd_run.html#cmdoption-pio-run-a>`__ option (`issue #4246 <https://github.com/platformio/platformio-core/issues/4246>`_)
- Improved PIO Remote setup on credit-card sized computers (Raspberry Pi, BeagleBone, etc.) (`issue #3865 <https://github.com/platformio/platformio-core/issues/3865>`_)
- Finally removed all traces of Python 2.7; Python 3.6 is now the minimum supported version.
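The ``COMPILATIONDB_INCLUDE_TOOLCHAIN`` note in the Integration list above refers to PlatformIO's extra-scripts mechanism. A minimal sketch of such a script (the file name and the ``extra_scripts`` wiring in "platformio.ini" are assumed, not shown in this changelog):

    # compiledb_toolchain.py -- referenced from "extra_scripts" in "platformio.ini"
    Import("env")  # "Import" is injected by the PlatformIO/SCons build runner

    # Ask the compilation-database generator to also record toolchain include paths
    env.Replace(COMPILATIONDB_INCLUDE_TOOLCHAIN=True)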
.. _release_notes_5:
PlatformIO Core 5
-----------------
**A professional collaborative platform for embedded development**
5.0.2 (2020-10-30)
~~~~~~~~~~~~~~~~~~
- Initialize a new project or update an existing one, passing a working environment name and its options (`issue #3686 <https://github.com/platformio/platformio-core/issues/3686>`_)
- Automatically build PlatformIO Core extra Python dependencies on a host machine if they are missing in the registry (`issue #3700 <https://github.com/platformio/platformio-core/issues/3700>`_)
- Improved "core.call" RPC for PlatformIO Home (`issue #3671 <https://github.com/platformio/platformio-core/issues/3671>`_)
- Fixed a "PermissionError: [WinError 5]" on Windows when an external repository is used with `lib_deps <https://docs.platformio.org/page/projectconf/section_env_library.html#lib-deps>`__ option (`issue #3664 <https://github.com/platformio/platformio-core/issues/3664>`_)
- Fixed a "KeyError: 'versions'" when dependency does not exist in the registry (`issue #3666 <https://github.com/platformio/platformio-core/issues/3666>`_)
- Fixed an issue with GCC linker when "native" dev-platform is used in pair with library dependencies (`issue #3669 <https://github.com/platformio/platformio-core/issues/3669>`_)
- Fixed an "AssertionError: ensure_dir_exists" when checking library updates from simultaneous subprocesses (`issue #3677 <https://github.com/platformio/platformio-core/issues/3677>`_)
- Fixed an issue when `pio package publish <https://docs.platformio.org/page/core/userguide/package/cmd_publish.html>`__ command removes original archive after submitting to the registry (`issue #3716 <https://github.com/platformio/platformio-core/issues/3716>`_)
- Fixed an issue when multiple `pio lib install <https://docs.platformio.org/page/core/userguide/lib/cmd_install.html>`__ command with the same local library results in duplicates in ``lib_deps`` (`issue #3715 <https://github.com/platformio/platformio-core/issues/3715>`_)
- Fixed an issue with a "wrong" timestamp in device monitor output using `"time" filter <https://docs.platformio.org/page/core/userguide/device/cmd_monitor.html#filters>`__ (`issue #3712 <https://github.com/platformio/platformio-core/issues/3712>`_)
5.0.1 (2020-09-10)
~~~~~~~~~~~~~~~~~~
- Added support for "owner" requirement when declaring ``dependencies`` using `library.json <https://docs.platformio.org/page/librarymanager/config.html#dependencies>`__
- Fixed an issue when using a custom git/ssh package with `platform_packages <https://docs.platformio.org/page/projectconf/section_env_platform.html#platform-packages>`__ option (`issue #3624 <https://github.com/platformio/platformio-core/issues/3624>`_)
- Fixed an issue with "ImportError: cannot import name '_get_backend' from 'cryptography.hazmat.backends'" when using `Remote Development <https://docs.platformio.org/page/plus/pio-remote.html>`__ on RaspberryPi device (`issue #3652 <https://github.com/platformio/platformio-core/issues/3652>`_)
- Fixed an issue when `pio package unpublish <https://docs.platformio.org/page/core/userguide/package/cmd_unpublish.html>`__ command crashes (`issue #3660 <https://github.com/platformio/platformio-core/issues/3660>`_)
- Fixed an issue when the package manager tries to install a built-in library from the registry (`issue #3662 <https://github.com/platformio/platformio-core/issues/3662>`_)
- Fixed an issue with incorrect value for C++ language standard in IDE projects when an in-progress language standard is used (`issue #3653 <https://github.com/platformio/platformio-core/issues/3653>`_)
- Fixed an issue with "Invalid simple block (semantic_version)" from library dependency that refs to an external source (repository, ZIP/Tar archives) (`issue #3658 <https://github.com/platformio/platformio-core/issues/3658>`_)
- Fixed an issue when an external dev-platform could not be updated or removed using PlatformIO Home (`issue #3663 <https://github.com/platformio/platformio-core/issues/3663>`_)
5.0.0 (2020-09-03)
~~~~~~~~~~~~~~~~~~
Please check `Migration guide from 4.x to 5.0 <https://docs.platformio.org/page/core/migration.html>`__.
* Integration with the new **PlatformIO Trusted Registry**
- Enterprise-grade package storage with high availability (multi replicas)
- Secure, fast, and reliable global content delivery network (CDN)
- Universal support for all packages:
* Libraries
* Development platforms
* Toolchains
- Built-in fine-grained access control (role-based, teams, organizations)
- New CLI commands:
* `pio package <https://docs.platformio.org/page/core/userguide/package/index.html>`__ manage packages in the registry
* `pio access <https://docs.platformio.org/page/core/userguide/access/index.html>`__ manage package access for users, teams, and maintainers
* Integration with the new **Account Management System**
- `Manage organizations <https://docs.platformio.org/page/core/userguide/org/index.html>`__
- `Manage teams and team memberships <https://docs.platformio.org/page/core/userguide/team/index.html>`__
* New **Package Management System**
- Integrated PlatformIO Core with the new PlatformIO Registry
- Support for owner-based dependency declaration (resolves name conflicts) (`issue #1824 <https://github.com/platformio/platformio-core/issues/1824>`_)
- Automatically save dependencies to `"platformio.ini" <https://docs.platformio.org/page/projectconf.html>`__ when installing using PlatformIO CLI (`issue #2964 <https://github.com/platformio/platformio-core/issues/2964>`_)
- Follow SemVer-compliant version constraints when checking library updates (`issue #1281 <https://github.com/platformio/platformio-core/issues/1281>`_)
- Dropped support for "packageRepositories" section in "platform.json" manifest (please publish packages directly to the registry)
* **Build System**
- Upgraded build engine to the `SCons 4.0 - a next-generation software construction tool <https://scons.org/>`__
* `Configuration files are Python scripts <https://docs.platformio.org/page/projectconf/advanced_scripting.html>`__ use the power of a real programming language to solve build problems
* Built-in reliable and automatic dependency analysis
* Improved support for parallel builds
* Ability to `share built files in a cache <https://docs.platformio.org/page/projectconf/section_platformio.html#projectconf-pio-build-cache-dir>`__ to speed up multiple builds
- New `Custom Targets <https://docs.platformio.org/page/projectconf/advanced_scripting.html#custom-targets>`__ (see the sketch after these notes)
* Pre/Post processing based on dependent sources (another target, source file, etc.)
* Command launcher with own arguments
* Launch command with custom options declared in `"platformio.ini" <https://docs.platformio.org/page/projectconf.html>`__
* Python callback as a target (use the power of Python interpreter and PlatformIO Build API)
* List available project targets (including dev-platform specific and custom targets) with a new `pio run --list-targets <https://docs.platformio.org/page/core/userguide/cmd_run.html#cmdoption-platformio-run-list-targets>`__ command (`issue #3544 <https://github.com/platformio/platformio-core/issues/3544>`_)
- Enable "cyclic reference" for GCC linker only for the embedded dev-platforms (`issue #3570 <https://github.com/platformio/platformio-core/issues/3570>`_)
- Automatically enable LDF dependency `chain+ mode (evaluates C/C++ Preprocessor conditional syntax) <https://docs.platformio.org/page/librarymanager/ldf.html#dependency-finder-mode>`__ for an Arduino library when "library.properties" has a "depends" field (`issue #3607 <https://github.com/platformio/platformio-core/issues/3607>`_)
- Fixed an issue with improper processing of source files added via multiple Build Middlewares (`issue #3531 <https://github.com/platformio/platformio-core/issues/3531>`_)
- Fixed an issue with the ``clean`` target on Windows when project and build directories are located on different logical drives (`issue #3542 <https://github.com/platformio/platformio-core/issues/3542>`_)
* **Project Management**
- Added support for "globstar/`**`" (recursive) pattern for the different commands and configuration options (`pio ci <https://docs.platformio.org/page/core/userguide/cmd_ci.html>`__, `src_filter <https://docs.platformio.org/page/projectconf/section_env_build.html#src-filter>`__, `check_patterns <https://docs.platformio.org/page/projectconf/section_env_check.html#check-patterns>`__, `library.json > srcFilter <https://docs.platformio.org/page/librarymanager/config.html#srcfilter>`__). Python 3.5+ is required
- Added a new ``-e, --environment`` option to `pio project init <https://docs.platformio.org/page/core/userguide/project/cmd_init.html#cmdoption-platformio-project-init-e>`__ command that helps to update a PlatformIO project using the existing environment
- Dump build system data intended for IDE extensions/plugins using a new `pio project data <https://docs.platformio.org/page/core/userguide/project/cmd_data.html>`__ command
- Do not generate ".travis.yml" for a new project, let the user have a choice
* **Unit Testing**
- Updated PIO Unit Testing support for Mbed framework and added compatibility with Mbed OS 6
- Fixed an issue when running multiple test environments (`issue #3523 <https://github.com/platformio/platformio-core/issues/3523>`_)
- Fixed an issue when Unit Testing engine fails with a custom project configuration file (`issue #3583 <https://github.com/platformio/platformio-core/issues/3583>`_)
* **Static Code Analysis**
- Updated analysis tools:
* `Cppcheck <https://docs.platformio.org/page/plus/check-tools/cppcheck.html>`__ v2.1 with a new "soundy" analysis option and improved code parser
* `PVS-Studio <https://docs.platformio.org/page/plus/check-tools/pvs-studio.html>`__ v7.09 with a new file list analysis mode and an extended list of analysis diagnostics
- Added Cppcheck package for ARM-based single-board computers (`issue #3559 <https://github.com/platformio/platformio-core/issues/3559>`_)
- Fixed an issue with PIO Check when a defect with a multiline error message is not reported in verbose mode (`issue #3631 <https://github.com/platformio/platformio-core/issues/3631>`_)
* **Miscellaneous**
- Display system-wide information using a new `pio system info <https://docs.platformio.org/page/core/userguide/system/cmd_info.html>`__ command (`issue #3521 <https://github.com/platformio/platformio-core/issues/3521>`_)
- Remove unused data using a new `pio system prune <https://docs.platformio.org/page/core/userguide/system/cmd_prune.html>`__ command (`issue #3522 <https://github.com/platformio/platformio-core/issues/3522>`_)
- Show ignored project environments only in the verbose mode (`issue #3641 <https://github.com/platformio/platformio-core/issues/3641>`_)
- Do not escape compiler arguments in VSCode template on Windows.
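A short sketch of the Custom Targets feature flagged above, using the documented ``env.AddCustomTarget`` helper in an extra script (the target name and command are examples only):

    Import("env")  # injected by the PlatformIO/SCons build runner

    # Registers a "sysenv" target; it appears in "pio run --list-targets"
    # and runs with "pio run -t sysenv".
    env.AddCustomTarget(
        name="sysenv",
        dependencies=None,
        actions='python -c "import os; print(os.environ)"',
        title="System Environment",
        description="Print system environment variables",
    )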
See `PlatformIO Core 5.0 history <https://github.com/platformio/platformio-core/blob/v5.2.5/HISTORY.rst>`__.
.. _release_notes_4:
PlatformIO Core 4
-----------------
See `PlatformIO Core 4.0 history <https://docs.platformio.org/en/v4.3.4/core/history.html#platformio-core-4>`__.
See `PlatformIO Core 4.0 history <https://github.com/platformio/platformio-core/blob/v4.3.4/HISTORY.rst>`__.
PlatformIO Core 3
-----------------
See `PlatformIO Core 3.0 history <https://docs.platformio.org/en/v4.3.4/core/history.html#platformio-core-3>`__.
See `PlatformIO Core 3.0 history <https://github.com/platformio/platformio-core/blob/v3.6.7/HISTORY.rst>`__.
PlatformIO Core 2
-----------------
See `PlatformIO Core 2.0 history <https://docs.platformio.org/en/v4.3.4/core/history.html#platformio-core-2>`__.
See `PlatformIO Core 2.0 history <https://github.com/platformio/platformio-core/blob/v2.11.2/HISTORY.rst>`__.
PlatformIO Core 1
-----------------
See `PlatformIO Core 1.0 history <https://docs.platformio.org/en/v4.3.4/core/history.html#platformio-core-1>`__.
See `PlatformIO Core 1.0 history <https://github.com/platformio/platformio-core/blob/v1.5.0/HISTORY.rst>`__.
PlatformIO Core Preview
-----------------------
See `PlatformIO Core Preview history <https://docs.platformio.org/en/v4.3.4/core/history.html#platformio-core-preview>`__.
See `PlatformIO Core Preview history <https://github.com/platformio/platformio-core/blob/v0.10.2/HISTORY.rst>`__.

View File

@ -1,17 +1,17 @@
lint:
pylint -j 6 --rcfile=./.pylintrc ./platformio
pylint -j 6 --rcfile=./.pylintrc ./tests
pylint -j 6 --rcfile=./.pylintrc ./platformio
isort:
isort -rc ./platformio
isort -rc ./tests
isort ./platformio
isort ./tests
format:
black --target-version py27 ./platformio
black --target-version py27 ./tests
black ./platformio
black ./tests
test:
py.test --verbose --capture=no --exitfirst -n 6 --dist=loadscope tests --ignore tests/test_examples.py
py.test --verbose --exitfirst -n 6 --dist=loadscope tests --ignore tests/test_examples.py
before-commit: isort format lint

View File

@ -1,8 +1,12 @@
PlatformIO
==========
.. image:: https://raw.githubusercontent.com/vshymanskyy/StandWithUkraine/main/banner-direct.svg
:target: https://github.com/vshymanskyy/StandWithUkraine/blob/main/docs/README.md
:alt: SWUbanner
PlatformIO Core
===============
.. image:: https://github.com/platformio/platformio-core/workflows/Core/badge.svg
:target: https://docs.platformio.org/page/core/index.html
:target: https://docs.platformio.org/en/latest/core/index.html
:alt: CI Build for PlatformIO Core
.. image:: https://github.com/platformio/platformio-core/workflows/Examples/badge.svg
:target: https://github.com/platformio/platformio-examples
@ -17,11 +21,12 @@ PlatformIO
:target: https://pypi.python.org/pypi/platformio/
:alt: License
.. image:: https://img.shields.io/badge/PlatformIO-Labs-orange.svg
:alt: Community Labs
:alt: PlatformIO Labs
:target: https://piolabs.com/?utm_source=github&utm_medium=core
**Quick Links:** `Web <https://platformio.org?utm_source=github&utm_medium=core>`_ |
**Quick Links:** `Homepage <https://platformio.org?utm_source=github&utm_medium=core>`_ |
`PlatformIO IDE <https://platformio.org/platformio-ide?utm_source=github&utm_medium=core>`_ |
`Registry <https://registry.platformio.org?utm_source=github&utm_medium=core>`_ |
`Project Examples <https://github.com/platformio/platformio-examples/>`__ |
`Docs <https://docs.platformio.org?utm_source=github&utm_medium=core>`_ |
`Donate <https://platformio.org/donate?utm_source=github&utm_medium=core>`_ |
@ -35,7 +40,7 @@ PlatformIO
.. image:: https://raw.githubusercontent.com/platformio/platformio-web/develop/app/images/platformio-ide-laptop.png
:target: https://platformio.org?utm_source=github&utm_medium=core
`PlatformIO <https://platformio.org?utm_source=github&utm_medium=core>`_ is a professional collaborative platform for embedded development
`PlatformIO <https://platformio.org>`_ is a professional collaborative platform for embedded development.
**A place where Developers and Teams have true Freedom! No more vendor lock-in!**
@ -43,93 +48,36 @@ PlatformIO
* Cross-platform IDE and Unified Debugger
* Static Code Analyzer and Remote Unit Testing
* Multi-platform and Multi-architecture Build System
* Firmware File Explorer and Memory Inspection.
* Firmware File Explorer and Memory Inspection
Get Started
-----------
* `What is PlatformIO? <https://docs.platformio.org/page/what-is-platformio.html?utm_source=github&utm_medium=core>`_
* `What is PlatformIO? <https://docs.platformio.org/en/latest/what-is-platformio.html?utm_source=github&utm_medium=core>`_
* `PlatformIO IDE <https://platformio.org/platformio-ide?utm_source=github&utm_medium=core>`_
* `PlatformIO Core (CLI) <https://docs.platformio.org/page/core.html?utm_source=github&utm_medium=core>`_
* `PlatformIO Core (CLI) <https://docs.platformio.org/en/latest/core.html?utm_source=github&utm_medium=core>`_
* `Project Examples <https://github.com/platformio/platformio-examples?utm_source=github&utm_medium=core>`__
Solutions
---------
* `Library Management <https://docs.platformio.org/page/librarymanager/index.html?utm_source=github&utm_medium=core>`_
* `Desktop IDEs Integration <https://docs.platformio.org/page/ide.html?utm_source=github&utm_medium=core>`_
* `Continuous Integration <https://docs.platformio.org/page/ci/index.html?utm_source=github&utm_medium=core>`_
* `Library Management <https://docs.platformio.org/en/latest/librarymanager/index.html?utm_source=github&utm_medium=core>`_
* `Desktop IDEs Integration <https://docs.platformio.org/en/latest/ide.html?utm_source=github&utm_medium=core>`_
* `Continuous Integration <https://docs.platformio.org/en/latest/ci/index.html?utm_source=github&utm_medium=core>`_
**Advanced**
* `Debugging <https://docs.platformio.org/page/plus/debugging.html?utm_source=github&utm_medium=core>`_
* `Unit Testing <https://docs.platformio.org/page/plus/unit-testing.html?utm_source=github&utm_medium=core>`_
* `Static Code Analysis <https://docs.platformio.org/page/plus/pio-check.html?utm_source=github&utm_medium=core>`_
* `Remote Development <https://docs.platformio.org/page/plus/pio-remote.html?utm_source=github&utm_medium=core>`_
* `Debugging <https://docs.platformio.org/en/latest/plus/debugging.html?utm_source=github&utm_medium=core>`_
* `Unit Testing <https://docs.platformio.org/en/latest/advanced/unit-testing/index.html?utm_source=github&utm_medium=core>`_
* `Static Code Analysis <https://docs.platformio.org/en/latest/plus/pio-check.html?utm_source=github&utm_medium=core>`_
* `Remote Development <https://docs.platformio.org/en/latest/plus/pio-remote.html?utm_source=github&utm_medium=core>`_
Registry
--------
* `Libraries <https://platformio.org/lib?utm_source=github&utm_medium=core>`_
* `Development Platforms <https://platformio.org/platforms?utm_source=github&utm_medium=core>`_
* `Frameworks <https://platformio.org/frameworks?utm_source=github&utm_medium=core>`_
* `Embedded Boards <https://platformio.org/boards?utm_source=github&utm_medium=core>`_
Development Platforms
---------------------
* `Aceinna IMU <https://platformio.org/platforms/aceinna_imu?utm_source=github&utm_medium=core>`_
* `ASR Microelectronics ASR605x <https://platformio.org/platforms/asrmicro650x?utm_source=github&utm_medium=core>`_
* `Atmel AVR <https://platformio.org/platforms/atmelavr?utm_source=github&utm_medium=core>`_
* `Atmel SAM <https://platformio.org/platforms/atmelsam?utm_source=github&utm_medium=core>`_
* `Espressif 32 <https://platformio.org/platforms/espressif32?utm_source=github&utm_medium=core>`_
* `Espressif 8266 <https://platformio.org/platforms/espressif8266?utm_source=github&utm_medium=core>`_
* `Freescale Kinetis <https://platformio.org/platforms/freescalekinetis?utm_source=github&utm_medium=core>`_
* `Infineon XMC <https://platformio.org/platforms/infineonxmc?utm_source=github&utm_medium=core>`_
* `Intel ARC32 <https://platformio.org/platforms/intel_arc32?utm_source=github&utm_medium=core>`_
* `Intel MCS-51 (8051) <https://platformio.org/platforms/intel_mcs51?utm_source=github&utm_medium=core>`_
* `Kendryte K210 <https://platformio.org/platforms/kendryte210?utm_source=github&utm_medium=core>`_
* `Lattice iCE40 <https://platformio.org/platforms/lattice_ice40?utm_source=github&utm_medium=core>`_
* `Maxim 32 <https://platformio.org/platforms/maxim32?utm_source=github&utm_medium=core>`_
* `Microchip PIC32 <https://platformio.org/platforms/microchippic32?utm_source=github&utm_medium=core>`_
* `Nordic nRF51 <https://platformio.org/platforms/nordicnrf51?utm_source=github&utm_medium=core>`_
* `Nordic nRF52 <https://platformio.org/platforms/nordicnrf52?utm_source=github&utm_medium=core>`_
* `Nuclei <https://platformio.org/platforms/nuclei?utm_source=github&utm_medium=core>`_
* `NXP LPC <https://platformio.org/platforms/nxplpc?utm_source=github&utm_medium=core>`_
* `RISC-V <https://platformio.org/platforms/riscv?utm_source=github&utm_medium=core>`_
* `RISC-V GAP <https://platformio.org/platforms/riscv_gap?utm_source=github&utm_medium=core>`_
* `Shakti <https://platformio.org/platforms/shakti?utm_source=github&utm_medium=core>`_
* `Silicon Labs EFM32 <https://platformio.org/platforms/siliconlabsefm32?utm_source=github&utm_medium=core>`_
* `ST STM32 <https://platformio.org/platforms/ststm32?utm_source=github&utm_medium=core>`_
* `ST STM8 <https://platformio.org/platforms/ststm8?utm_source=github&utm_medium=core>`_
* `Teensy <https://platformio.org/platforms/teensy?utm_source=github&utm_medium=core>`_
* `TI MSP430 <https://platformio.org/platforms/timsp430?utm_source=github&utm_medium=core>`_
* `TI Tiva <https://platformio.org/platforms/titiva?utm_source=github&utm_medium=core>`_
* `WIZNet W7500 <https://platformio.org/platforms/wiznet7500?utm_source=github&utm_medium=core>`_
Frameworks
----------
* `Arduino <https://platformio.org/frameworks/arduino?utm_source=github&utm_medium=core>`_
* `CMSIS <https://platformio.org/frameworks/cmsis?utm_source=github&utm_medium=core>`_
* `ESP-IDF <https://platformio.org/frameworks/espidf?utm_source=github&utm_medium=core>`_
* `ESP8266 Non-OS SDK <https://platformio.org/frameworks/esp8266-nonos-sdk?utm_source=github&utm_medium=core>`_
* `ESP8266 RTOS SDK <https://platformio.org/frameworks/esp8266-rtos-sdk?utm_source=github&utm_medium=core>`_
* `Freedom E SDK <https://platformio.org/frameworks/freedom-e-sdk?utm_source=github&utm_medium=core>`_
* `GigaDevice GD32V SDK <https://platformio.org/frameworks/gd32vf103-sdk?utm_source=github&utm_medium=core>`_
* `Kendryte Standalone SDK <https://platformio.org/frameworks/kendryte-standalone-sdk?utm_source=github&utm_medium=core>`_
* `Kendryte FreeRTOS SDK <https://platformio.org/frameworks/kendryte-freertos-sdk?utm_source=github&utm_medium=core>`_
* `libOpenCM3 <https://platformio.org/frameworks/libopencm3?utm_source=github&utm_medium=core>`_
* `Mbed <https://platformio.org/frameworks/mbed?utm_source=github&utm_medium=core>`_
* `Nuclei SDK <https://platformio.org/frameworks/nuclei-sdk?utm_source=github&utm_medium=core>`_
* `PULP OS <https://platformio.org/frameworks/pulp-os?utm_source=github&utm_medium=core>`_
* `Pumbaa <https://platformio.org/frameworks/pumbaa?utm_source=github&utm_medium=core>`_
* `Shakti SDK <https://platformio.org/frameworks/shakti-sdk?utm_source=github&utm_medium=core>`_
* `Simba <https://platformio.org/frameworks/simba?utm_source=github&utm_medium=core>`_
* `SPL <https://platformio.org/frameworks/spl?utm_source=github&utm_medium=core>`_
* `STM32Cube <https://platformio.org/frameworks/stm32cube?utm_source=github&utm_medium=core>`_
* `WiringPi <https://platformio.org/frameworks/wiringpi?utm_source=github&utm_medium=core>`_
* `Zephyr <https://platformio.org/frameworks/zephyr?utm_source=github&utm_medium=core>`_
* `Libraries <https://registry.platformio.org/search?t=library&utm_source=github&utm_medium=core>`_
* `Development Platforms <https://registry.platformio.org/search?t=platform&utm_source=github&utm_medium=core>`_
* `Development Tools <https://registry.platformio.org/search?t=tool&utm_source=github&utm_medium=core>`_
Contributing
------------
@ -142,7 +90,7 @@ Telemetry / Privacy Policy
Share minimal diagnostics and usage information to help us make PlatformIO better.
It is enabled by default. For more information see:
* `Telemetry Setting <https://docs.platformio.org/page/userguide/cmd_settings.html?utm_source=github&utm_medium=core#enable-telemetry>`_
* `Telemetry Setting <https://docs.platformio.org/en/latest/userguide/cmd_settings.html?utm_source=github&utm_medium=core#enable-telemetry>`_
License
-------

docs

Submodule docs updated: deae09a880...5bf0037c66

View File

@ -14,7 +14,7 @@
import sys
VERSION = (5, 0, 2)
VERSION = (6, 0, 1)
__version__ = ".".join([str(s) for s in VERSION])
__title__ = "platformio"
@ -31,34 +31,32 @@ __description__ = (
)
__url__ = "https://platformio.org"
__author__ = "PlatformIO"
__email__ = "contact@platformio.org"
__author__ = "PlatformIO Labs"
__email__ = "contact@piolabs.com"
__license__ = "Apache Software License"
__copyright__ = "Copyright 2014-present PlatformIO"
__copyright__ = "Copyright 2014-present PlatformIO Labs"
__accounts_api__ = "https://api.accounts.platformio.org"
__registry_api__ = [
"https://api.registry.platformio.org",
"https://api.registry.ns1.platformio.org",
__registry_mirror_hosts__ = [
"registry.platformio.org",
"registry.nm1.platformio.org",
]
__pioremote_endpoint__ = "ssl:host=remote.platformio.org:port=4413"
__default_requests_timeout__ = (10, None) # (connect, read)
__core_packages__ = {
"contrib-piohome": "~3.3.1",
"contrib-piohome": "~3.4.1",
"contrib-pysite": "~2.%d%d.0" % (sys.version_info.major, sys.version_info.minor),
"tool-unity": "~1.20500.0",
"tool-scons": "~2.20501.7" if sys.version_info.major == 2 else "~4.40001.0",
"tool-cppcheck": "~1.210.0",
"tool-clangtidy": "~1.100000.0",
"tool-pvs-studio": "~7.9.0",
"tool-scons": "~4.40300.0",
"tool-cppcheck": "~1.270.0",
"tool-clangtidy": "~1.120001.0",
"tool-pvs-studio": "~7.18.0",
}
__check_internet_hosts__ = [
"185.199.110.153", # Github.com
"88.198.170.159", # platformio.org
"github.com",
"platformio.org",
]
] + __registry_mirror_hosts__

View File

@ -18,23 +18,16 @@ from traceback import format_exc
import click
from platformio import __version__, exception, maintenance, util
from platformio import __version__, exception, maintenance
from platformio.commands import PlatformioCLI
from platformio.compat import CYGWIN
try:
import click_completion # pylint: disable=import-error
click_completion.init()
except: # pylint: disable=bare-except
pass
from platformio.compat import IS_CYGWIN, ensure_python3
@click.command(
cls=PlatformioCLI, context_settings=dict(help_option_names=["-h", "--help"])
)
@click.version_option(__version__, prog_name="PlatformIO")
@click.option("--force", "-f", is_flag=True, help="DEPRECATE")
@click.version_option(__version__, prog_name="PlatformIO Core")
@click.option("--force", "-f", is_flag=True, help="DEPRECATED")
@click.option("--caller", "-c", help="Caller ID (service)")
@click.option("--no-ansi", is_flag=True, help="Do not print ANSI control characters")
@click.pass_context
@ -63,15 +56,14 @@ def cli(ctx, force, caller, no_ansi):
maintenance.on_platformio_start(ctx, force, caller)
@cli.resultcallback()
@cli.result_callback()
@click.pass_context
def process_result(ctx, result, *_, **__):
maintenance.on_platformio_end(ctx, result)
@util.memoized()
def configure():
if CYGWIN:
if IS_CYGWIN:
raise exception.CygwinEnvDetected()
# https://urllib3.readthedocs.org
@ -105,6 +97,7 @@ def main(argv=None):
assert isinstance(argv, list)
sys.argv = argv
try:
ensure_python3(raise_exception=True)
configure()
cli() # pylint: disable=no-value-for-parameter
except SystemExit as e:

View File

@ -21,46 +21,34 @@ import os
import platform
import socket
import uuid
from os.path import dirname, isdir, isfile, join, realpath
from platformio import __version__, exception, fs, proc
from platformio.compat import WINDOWS, dump_json_to_unicode, hashlib_encode_data
from platformio.compat import IS_WINDOWS, hashlib_encode_data
from platformio.package.lockfile import LockFile
from platformio.project.helpers import get_default_projects_dir, get_project_core_dir
from platformio.project.config import ProjectConfig
from platformio.project.helpers import get_default_projects_dir
def projects_dir_validate(projects_dir):
assert isdir(projects_dir)
return realpath(projects_dir)
assert os.path.isdir(projects_dir)
return os.path.abspath(projects_dir)
DEFAULT_SETTINGS = {
"auto_update_libraries": {
"description": "Automatically update libraries (Yes/No)",
"value": False,
},
"auto_update_platforms": {
"description": "Automatically update platforms (Yes/No)",
"value": False,
},
"check_libraries_interval": {
"description": "Check for the library updates interval (days)",
"value": 7,
},
"check_platformio_interval": {
"description": "Check for the new PlatformIO interval (days)",
"value": 3,
},
"check_platforms_interval": {
"description": "Check for the platform updates interval (days)",
"description": "Check for the new PlatformIO Core interval (days)",
"value": 7,
},
"check_prune_system_threshold": {
"description": "Check for pruning unnecessary data threshold (megabytes)",
"value": 1024,
},
"enable_cache": {
"description": "Enable caching for HTTP API requests",
"value": True,
},
"enable_telemetry": {
"description": ("Telemetry service <http://bit.ly/pio-telemetry> (Yes/No)"),
"description": ("Telemetry service <https://bit.ly/pio-telemetry> (Yes/No)"),
"value": True,
},
"force_verbose": {
@ -87,7 +75,10 @@ class State(object):
self.path = path
self.lock = lock
if not self.path:
self.path = join(get_project_core_dir(), "appstate.json")
core_dir = ProjectConfig.get_instance().get("platformio", "core_dir")
if not os.path.isdir(core_dir):
os.makedirs(core_dir)
self.path = os.path.join(core_dir, "appstate.json")
self._storage = {}
self._lockfile = None
self.modified = False
@ -95,7 +86,7 @@ class State(object):
def __enter__(self):
try:
self._lock_state_file()
if isfile(self.path):
if os.path.isfile(self.path):
self._storage = fs.load_json(self.path)
assert isinstance(self._storage, dict)
except (
@ -110,10 +101,10 @@ class State(object):
def __exit__(self, type_, value, traceback):
if self.modified:
try:
with open(self.path, "w") as fp:
fp.write(dump_json_to_unicode(self._storage))
with open(self.path, mode="w", encoding="utf8") as fp:
fp.write(json.dumps(self._storage))
except IOError:
raise exception.HomeDirPermissionsError(get_project_core_dir())
raise exception.HomeDirPermissionsError(os.path.dirname(self.path))
self._unlock_state_file()
def _lock_state_file(self):
@ -123,7 +114,7 @@ class State(object):
try:
self._lockfile.acquire()
except IOError:
raise exception.HomeDirPermissionsError(dirname(self.path))
raise exception.HomeDirPermissionsError(os.path.dirname(self.path))
def _unlock_state_file(self):
if hasattr(self, "_lockfile") and self._lockfile:
@ -246,32 +237,19 @@ def is_disabled_progressbar():
def get_cid():
# pylint: disable=import-outside-toplevel
from platformio.clients.http import fetch_remote_content
cid = get_state_item("cid")
if cid:
return cid
uid = None
if os.getenv("C9_UID"):
uid = os.getenv("C9_UID")
elif os.getenv("CHE_API", os.getenv("CHE_API_ENDPOINT")):
try:
uid = json.loads(
fetch_remote_content(
"{api}/user?token={token}".format(
api=os.getenv("CHE_API", os.getenv("CHE_API_ENDPOINT")),
token=os.getenv("USER_TOKEN"),
)
)
).get("id")
except: # pylint: disable=bare-except
pass
if os.getenv("GITHUB_USER"):
uid = os.getenv("GITHUB_USER")
elif os.getenv("GITPOD_GIT_USER_NAME"):
uid = os.getenv("GITPOD_GIT_USER_NAME")
if not uid:
uid = uuid.getnode()
cid = uuid.UUID(bytes=hashlib.md5(hashlib_encode_data(uid)).digest())
cid = str(cid)
if WINDOWS or os.getuid() > 0: # pylint: disable=no-member
if IS_WINDOWS or os.getuid() > 0: # pylint: disable=no-member
set_state_item("cid", cid)
return cid
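# Illustrative note (not part of the diff): the client ID is a deterministic UUID
# derived from whatever user identifier is detected (C9_UID, Eclipse Che,
# GITHUB_USER, Gitpod) or from the MAC-based uuid.getnode() fallback, e.g.
#   str(uuid.UUID(bytes=hashlib.md5(hashlib_encode_data("jdoe")).digest()))
# always yields the same UUID string for the same input, so the ID stays stable
# between runs; it is persisted in the app state only on Windows or for non-root users.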


@ -12,9 +12,9 @@
# See the License for the specific language governing permissions and
# limitations under the License.
import json
import os
import sys
from os import environ, makedirs
from os.path import isdir, join
from time import time
import click
@ -29,7 +29,6 @@ from SCons.Script import Import # pylint: disable=import-error
from SCons.Script import Variables # pylint: disable=import-error
from platformio import compat, fs
from platformio.compat import dump_json_to_unicode
from platformio.platform.base import PlatformBase
from platformio.proc import get_pythonexe_path
from platformio.project.helpers import get_project_dir
@ -45,48 +44,58 @@ clivars.AddVariables(
("PIOENV",),
("PIOTEST_RUNNING_NAME",),
("UPLOAD_PORT",),
("PROGRAM_ARGS",),
)
DEFAULT_ENV_OPTIONS = dict(
tools=[
"ar",
"as",
"cc",
"c++",
"link",
"pioasm",
"platformio",
"piotarget",
"pioplatform",
"pioproject",
"pioplatform",
"piotest",
"piotarget",
"piomaxlen",
"piolib",
"pioupload",
"piomisc",
"pioide",
"piosize",
"pioino",
"piomisc",
"piointegration",
],
toolpath=[join(fs.get_source_dir(), "builder", "tools")],
toolpath=[os.path.join(fs.get_source_dir(), "builder", "tools")],
variables=clivars,
# Propagating External Environment
ENV=environ,
ENV=os.environ,
UNIX_TIME=int(time()),
BUILD_DIR=join("$PROJECT_BUILD_DIR", "$PIOENV"),
BUILD_SRC_DIR=join("$BUILD_DIR", "src"),
BUILD_TEST_DIR=join("$BUILD_DIR", "test"),
COMPILATIONDB_PATH=join("$BUILD_DIR", "compile_commands.json"),
BUILD_DIR=os.path.join("$PROJECT_BUILD_DIR", "$PIOENV"),
BUILD_SRC_DIR=os.path.join("$BUILD_DIR", "src"),
BUILD_TEST_DIR=os.path.join("$BUILD_DIR", "test"),
COMPILATIONDB_PATH=os.path.join("$PROJECT_DIR", "compile_commands.json"),
LIBPATH=["$BUILD_DIR"],
PROGNAME="program",
PROG_PATH=join("$BUILD_DIR", "$PROGNAME$PROGSUFFIX"),
PROG_PATH=os.path.join("$BUILD_DIR", "$PROGNAME$PROGSUFFIX"),
PYTHONEXE=get_pythonexe_path(),
IDE_EXTRA_DATA={},
)
# Declare command verbose messages
command_strings = dict(
ARCOM="Archiving",
LINKCOM="Linking",
RANLIBCOM="Indexing",
ASCOM="Compiling",
ASPPCOM="Compiling",
CCCOM="Compiling",
CXXCOM="Compiling",
)
if not int(ARGUMENTS.get("PIOVERBOSE", 0)):
DEFAULT_ENV_OPTIONS["ARCOMSTR"] = "Archiving $TARGET"
DEFAULT_ENV_OPTIONS["LINKCOMSTR"] = "Linking $TARGET"
DEFAULT_ENV_OPTIONS["RANLIBCOMSTR"] = "Indexing $TARGET"
for k in ("ASCOMSTR", "ASPPCOMSTR", "CCCOMSTR", "CXXCOMSTR"):
DEFAULT_ENV_OPTIONS[k] = "Compiling $TARGET"
for name, value in command_strings.items():
DEFAULT_ENV_OPTIONS["%sSTR" % name] = "%s $TARGET" % (value)
env = DefaultEnvironment(**DEFAULT_ENV_OPTIONS)
@ -103,65 +112,70 @@ env.Replace(
config = env.GetProjectConfig()
env.Replace(
PROJECT_DIR=get_project_dir(),
PROJECT_CORE_DIR=config.get_optional_dir("core"),
PROJECT_PACKAGES_DIR=config.get_optional_dir("packages"),
PROJECT_WORKSPACE_DIR=config.get_optional_dir("workspace"),
PROJECT_LIBDEPS_DIR=config.get_optional_dir("libdeps"),
PROJECT_INCLUDE_DIR=config.get_optional_dir("include"),
PROJECT_SRC_DIR=config.get_optional_dir("src"),
PROJECTSRC_DIR=config.get_optional_dir("src"), # legacy for dev/platform
PROJECT_TEST_DIR=config.get_optional_dir("test"),
PROJECT_DATA_DIR=config.get_optional_dir("data"),
PROJECTDATA_DIR=config.get_optional_dir("data"), # legacy for dev/platform
PROJECT_BUILD_DIR=config.get_optional_dir("build"),
BUILD_CACHE_DIR=config.get_optional_dir("build_cache"),
PROJECT_CORE_DIR=config.get("platformio", "core_dir"),
PROJECT_PACKAGES_DIR=config.get("platformio", "packages_dir"),
PROJECT_WORKSPACE_DIR=config.get("platformio", "workspace_dir"),
PROJECT_LIBDEPS_DIR=config.get("platformio", "libdeps_dir"),
PROJECT_INCLUDE_DIR=config.get("platformio", "include_dir"),
PROJECT_SRC_DIR=config.get("platformio", "src_dir"),
PROJECTSRC_DIR="$PROJECT_SRC_DIR", # legacy for dev/platform
PROJECT_TEST_DIR=config.get("platformio", "test_dir"),
PROJECT_DATA_DIR=config.get("platformio", "data_dir"),
PROJECTDATA_DIR="$PROJECT_DATA_DIR", # legacy for dev/platform
PROJECT_BUILD_DIR=config.get("platformio", "build_dir"),
BUILD_CACHE_DIR=config.get("platformio", "build_cache_dir"),
LIBSOURCE_DIRS=[
config.get_optional_dir("lib"),
join("$PROJECT_LIBDEPS_DIR", "$PIOENV"),
config.get_optional_dir("globallib"),
config.get("platformio", "lib_dir"),
os.path.join("$PROJECT_LIBDEPS_DIR", "$PIOENV"),
config.get("platformio", "globallib_dir"),
],
)
if (
compat.WINDOWS
and sys.version_info >= (3, 8)
and env["PROJECT_DIR"].startswith("\\\\")
):
click.secho(
"There is a known issue with Python 3.8+ and mapped network drives on "
"Windows.\nPlease downgrade Python to the latest 3.7. More details at:\n"
"https://github.com/platformio/platformio-core/issues/3417",
fg="yellow",
)
if env.subst("$BUILD_CACHE_DIR"):
if not isdir(env.subst("$BUILD_CACHE_DIR")):
makedirs(env.subst("$BUILD_CACHE_DIR"))
env.CacheDir("$BUILD_CACHE_DIR")
if int(ARGUMENTS.get("ISATTY", 0)):
# pylint: disable=protected-access
click._compat.isatty = lambda stream: True
if env.GetOption("clean"):
env.PioClean(env.subst("$BUILD_DIR"))
if compat.IS_WINDOWS and sys.version_info >= (3, 8) and os.getcwd().startswith("\\\\"):
click.secho("!!! WARNING !!!\t\t" * 3, fg="red")
click.secho(
"Your project is located on a mapped network drive but the "
"current command-line shell does not support the UNC paths.",
fg="yellow",
)
click.secho(
"Please move your project to a physical drive or check this workaround: "
"https://bit.ly/3kuU5mP\n",
fg="yellow",
)
if env.subst("$BUILD_CACHE_DIR"):
if not os.path.isdir(env.subst("$BUILD_CACHE_DIR")):
os.makedirs(env.subst("$BUILD_CACHE_DIR"))
env.CacheDir("$BUILD_CACHE_DIR")
is_clean_all = "cleanall" in COMMAND_LINE_TARGETS
if env.GetOption("clean") or is_clean_all:
env.PioClean(is_clean_all)
env.Exit(0)
elif not int(ARGUMENTS.get("PIOVERBOSE", 0)):
if not int(ARGUMENTS.get("PIOVERBOSE", 0)):
click.echo("Verbose mode can be enabled via `-v, --verbose` option")
# Dynamically load dependent tools
if "compiledb" in COMMAND_LINE_TARGETS:
env.Tool("compilation_db")
if not isdir(env.subst("$BUILD_DIR")):
makedirs(env.subst("$BUILD_DIR"))
if not os.path.isdir(env.subst("$BUILD_DIR")):
os.makedirs(env.subst("$BUILD_DIR"))
env.LoadProjectOptions()
env.LoadPioPlatform()
env.SConscriptChdir(0)
env.SConsignFile(
join("$BUILD_DIR", ".sconsign%d%d" % (sys.version_info[0], sys.version_info[1]))
os.path.join(
"$BUILD_DIR", ".sconsign%d%d" % (sys.version_info[0], sys.version_info[1])
)
)
for item in env.GetExtraScripts("pre"):
@ -202,7 +216,7 @@ env.AddPreAction(
),
)
AlwaysBuild(env.Alias("debug", DEFAULT_TARGETS))
AlwaysBuild(env.Alias("__debug", DEFAULT_TARGETS))
AlwaysBuild(env.Alias("__test", DEFAULT_TARGETS))
##############################################################################
@ -211,17 +225,21 @@ if "envdump" in COMMAND_LINE_TARGETS:
click.echo(env.Dump())
env.Exit(0)
if "idedata" in COMMAND_LINE_TARGETS:
if set(["_idedata", "idedata"]) & set(COMMAND_LINE_TARGETS):
projenv = None
try:
Import("projenv")
except: # pylint: disable=bare-except
projenv = env
click.echo(
"\n%s\n"
% dump_json_to_unicode(
projenv.DumpIDEData(env) # pylint: disable=undefined-variable
)
)
data = projenv.DumpIntegrationData(env)
# dump metadata to a file so that project.helpers.load_build_metadata can read it later
with open(
projenv.subst(os.path.join("$BUILD_DIR", "idedata.json")),
mode="w",
encoding="utf8",
) as fp:
json.dump(data, fp)
click.echo("\n%s\n" % json.dumps(data)) # pylint: disable=undefined-variable
env.Exit(0)
if "sizedata" in COMMAND_LINE_TARGETS:


@ -41,7 +41,7 @@ from platformio.proc import where_is_program
# should hold the compilation database, otherwise, the file defaults to compile_commands.json,
# which is the name that most clang tools search for by default.
# TODO: Is there a better way to do this than this global? Right now this exists so that the
# Is there a better way to do this than this global? Right now this exists so that the
# emitter we add can record all of the things it emits, so that the scanner for the top level
# compilation database can access the complete list, and also so that the writer has easy
# access to write all of the files. But it seems clunky. How can the emitter and the scanner
@ -58,7 +58,7 @@ class __CompilationDbNode(SCons.Node.Python.Value):
def changed_since_last_build_node(*args, **kwargs):
""" Dummy decider to force always building"""
"""Dummy decider to force always building"""
return True
@ -104,7 +104,7 @@ def makeEmitCompilationDbEntry(comstr):
__COMPILATIONDB_ENV=env,
)
# TODO: Technically, these next two lines should not be required: it should be fine to
# Technically, these next two lines should not be required: it should be fine to
# cache the entries. However, they don't seem to update properly. Since they are quick
# to re-generate disable caching and sidestep this problem.
env.AlwaysBuild(entry)
@ -152,7 +152,7 @@ def WriteCompilationDb(target, source, env):
item["file"] = os.path.abspath(item["file"])
entries.append(item)
with open(str(target[0]), "w") as target_file:
with open(str(target[0]), mode="w", encoding="utf8") as target_file:
json.dump(
entries, target_file, sort_keys=True, indent=4, separators=(",", ": ")
)


@ -12,17 +12,20 @@
# See the License for the specific language governing permissions and
# limitations under the License.
from twisted.internet import reactor # pylint: disable=import-error
from twisted.web import static # pylint: disable=import-error
from __future__ import absolute_import
import SCons.Tool.asm # pylint: disable=import-error
#
# Resolve https://github.com/platformio/platformio-core/issues/3917
# Avoid forcing .S to bare assembly on Windows OS
#
if ".S" in SCons.Tool.asm.ASSuffixes:
SCons.Tool.asm.ASSuffixes.remove(".S")
if ".S" not in SCons.Tool.asm.ASPPSuffixes:
SCons.Tool.asm.ASPPSuffixes.append(".S")
class WebRoot(static.File):
def render_GET(self, request):
if request.args.get(b"__shutdown__", False):
reactor.stop()
return "Server has been stopped"
request.setHeader("cache-control", "no-cache, no-store, must-revalidate")
request.setHeader("pragma", "no-cache")
request.setHeader("expires", "0")
return static.File.render_GET(self, request)
generate = SCons.Tool.asm.generate
exists = SCons.Tool.asm.exists


@ -0,0 +1,254 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from __future__ import absolute_import
import atexit
import glob
import io
import os
import re
import tempfile
import click
from platformio.compat import get_filesystem_encoding, get_locale_encoding
class InoToCPPConverter(object):
PROTOTYPE_RE = re.compile(
r"""^(
(?:template\<.*\>\s*)? # template
([a-z_\d\&]+\*?\s+){1,2} # return type
([a-z_\d]+\s*) # name of prototype
\([a-z_,\.\*\&\[\]\s\d]*\) # arguments
)\s*(\{|;) # must end with `{` or `;`
""",
re.X | re.M | re.I,
)
DETECTMAIN_RE = re.compile(r"void\s+(setup|loop)\s*\(", re.M | re.I)
PROTOPTRS_TPLRE = r"\([^&\(]*&(%s)[^\)]*\)"
def __init__(self, env):
self.env = env
self._main_ino = None
self._safe_encoding = None
def read_safe_contents(self, path):
error_reported = False
for encoding in (
"utf-8",
None,
get_filesystem_encoding(),
get_locale_encoding(),
"latin-1",
):
try:
with io.open(path, encoding=encoding) as fp:
contents = fp.read()
self._safe_encoding = encoding
return contents
except UnicodeDecodeError:
if not error_reported:
error_reported = True
click.secho(
"Unicode decode error has occurred, please remove invalid "
"(non-ASCII or non-UTF8) characters from %s file or convert it to UTF-8"
% path,
fg="yellow",
err=True,
)
return ""
def write_safe_contents(self, path, contents):
with io.open(
path, "w", encoding=self._safe_encoding, errors="backslashreplace"
) as fp:
return fp.write(contents)
def is_main_node(self, contents):
return self.DETECTMAIN_RE.search(contents)
def convert(self, nodes):
contents = self.merge(nodes)
if not contents:
return None
return self.process(contents)
def merge(self, nodes):
assert nodes
lines = []
for node in nodes:
contents = self.read_safe_contents(node.get_path())
_lines = ['# 1 "%s"' % node.get_path().replace("\\", "/"), contents]
if self.is_main_node(contents):
lines = _lines + lines
self._main_ino = node.get_path()
else:
lines.extend(_lines)
if not self._main_ino:
self._main_ino = nodes[0].get_path()
return "\n".join(["#include <Arduino.h>"] + lines) if lines else None
def process(self, contents):
out_file = self._main_ino + ".cpp"
assert self._gcc_preprocess(contents, out_file)
contents = self.read_safe_contents(out_file)
contents = self._join_multiline_strings(contents)
self.write_safe_contents(out_file, self.append_prototypes(contents))
return out_file
def _gcc_preprocess(self, contents, out_file):
tmp_path = tempfile.mkstemp()[1]
self.write_safe_contents(tmp_path, contents)
self.env.Execute(
self.env.VerboseAction(
'$CXX -o "{0}" -x c++ -fpreprocessed -dD -E "{1}"'.format(
out_file, tmp_path
),
"Converting " + os.path.basename(out_file[:-4]),
)
)
atexit.register(_delete_file, tmp_path)
return os.path.isfile(out_file)
def _join_multiline_strings(self, contents):
if "\\\n" not in contents:
return contents
newlines = []
linenum = 0
stropen = False
for line in contents.split("\n"):
_linenum = self._parse_preproc_line_num(line)
if _linenum is not None:
linenum = _linenum
else:
linenum += 1
if line.endswith("\\"):
if line.startswith('"'):
stropen = True
newlines.append(line[:-1])
continue
if stropen:
newlines[len(newlines) - 1] += line[:-1]
continue
elif stropen and line.endswith(('",', '";')):
newlines[len(newlines) - 1] += line
stropen = False
newlines.append(
'#line %d "%s"' % (linenum, self._main_ino.replace("\\", "/"))
)
continue
newlines.append(line)
return "\n".join(newlines)
@staticmethod
def _parse_preproc_line_num(line):
if not line.startswith("#"):
return None
tokens = line.split(" ", 3)
if len(tokens) > 2 and tokens[1].isdigit():
return int(tokens[1])
return None
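# Illustrative behaviour of the parser above on GCC "-E" line markers:
#   '# 42 "main.ino"'  -> 42
#   'int x = 0;'       -> None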
def _parse_prototypes(self, contents):
prototypes = []
reserved_keywords = set(["if", "else", "while"])
for match in self.PROTOTYPE_RE.finditer(contents):
if (
set([match.group(2).strip(), match.group(3).strip()])
& reserved_keywords
):
continue
prototypes.append(match)
return prototypes
def _get_total_lines(self, contents):
total = 0
if contents.endswith("\n"):
contents = contents[:-1]
for line in contents.split("\n")[::-1]:
linenum = self._parse_preproc_line_num(line)
if linenum is not None:
return total + linenum
total += 1
return total
def append_prototypes(self, contents):
prototypes = self._parse_prototypes(contents) or []
# skip already declared prototypes
declared = set(m.group(1).strip() for m in prototypes if m.group(4) == ";")
prototypes = [m for m in prototypes if m.group(1).strip() not in declared]
if not prototypes:
return contents
prototype_names = set(m.group(3).strip() for m in prototypes)
split_pos = prototypes[0].start()
match_ptrs = re.search(
self.PROTOPTRS_TPLRE % ("|".join(prototype_names)),
contents[:split_pos],
re.M,
)
if match_ptrs:
split_pos = contents.rfind("\n", 0, match_ptrs.start()) + 1
result = []
result.append(contents[:split_pos].strip())
result.append("%s;" % ";\n".join([m.group(1) for m in prototypes]))
result.append(
'#line %d "%s"'
% (
self._get_total_lines(contents[:split_pos]),
self._main_ino.replace("\\", "/"),
)
)
result.append(contents[split_pos:].strip())
return "\n".join(result)
def ConvertInoToCpp(env):
src_dir = glob.escape(env.subst("$PROJECT_SRC_DIR"))
ino_nodes = env.Glob(os.path.join(src_dir, "*.ino")) + env.Glob(
os.path.join(src_dir, "*.pde")
)
if not ino_nodes:
return
c = InoToCPPConverter(env)
out_file = c.convert(ino_nodes)
atexit.register(_delete_file, out_file)
def _delete_file(path):
try:
if os.path.isfile(path):
os.remove(path)
except: # pylint: disable=bare-except
pass
def generate(env):
env.AddMethod(ConvertInoToCpp)
def exists(_):
return True


@ -14,41 +14,40 @@
from __future__ import absolute_import
import glob
import os
from glob import glob
from SCons.Defaults import processDefines # pylint: disable=import-error
import SCons.Defaults # pylint: disable=import-error
import SCons.Subst # pylint: disable=import-error
from platformio.compat import glob_escape
from platformio.package.manager.core import get_core_package_dir
from platformio.proc import exec_command, where_is_program
def _dump_includes(env):
includes = {}
def DumpIntegrationIncludes(env):
result = dict(build=[], compatlib=[], toolchain=[])
includes["build"] = [
env.subst("$PROJECT_INCLUDE_DIR"),
env.subst("$PROJECT_SRC_DIR"),
]
includes["build"].extend(
[os.path.realpath(env.subst(item)) for item in env.get("CPPPATH", [])]
result["build"].extend(
[
env.subst("$PROJECT_INCLUDE_DIR"),
env.subst("$PROJECT_SRC_DIR"),
]
)
result["build"].extend(
[os.path.abspath(env.subst(item)) for item in env.get("CPPPATH", [])]
)
# installed libs
includes["compatlib"] = []
for lb in env.GetLibBuilders():
includes["compatlib"].extend(
[os.path.realpath(inc) for inc in lb.get_include_dirs()]
result["compatlib"].extend(
[os.path.abspath(inc) for inc in lb.get_include_dirs()]
)
# includes from toolchains
p = env.PioPlatform()
includes["toolchain"] = []
for pkg in p.get_installed_packages():
for pkg in p.get_installed_packages(with_optional=False):
if p.get_package_type(pkg.metadata.name) != "toolchain":
continue
toolchain_dir = glob_escape(pkg.path)
toolchain_dir = glob.escape(pkg.path)
toolchain_incglobs = [
os.path.join(toolchain_dir, "*", "include", "c++", "*"),
os.path.join(toolchain_dir, "*", "include", "c++", "*", "*-*-*"),
@ -56,14 +55,9 @@ def _dump_includes(env):
os.path.join(toolchain_dir, "*", "include*"),
]
for g in toolchain_incglobs:
includes["toolchain"].extend([os.path.realpath(inc) for inc in glob(g)])
result["toolchain"].extend([os.path.abspath(inc) for inc in glob.glob(g)])
includes["unity"] = []
unity_dir = get_core_package_dir("tool-unity")
if unity_dir:
includes["unity"].append(unity_dir)
return includes
return result
def _get_gcc_defines(env):
@ -92,7 +86,7 @@ def _get_gcc_defines(env):
def _dump_defines(env):
defines = []
# global symbols
for item in processDefines(env.get("CPPDEFINES", [])):
for item in SCons.Defaults.processDefines(env.get("CPPDEFINES", [])):
item = item.strip()
if item:
defines.append(env.subst(item).replace("\\", ""))
@ -122,7 +116,7 @@ def _dump_defines(env):
def _get_svd_path(env):
svd_path = env.GetProjectOption("debug_svd_path")
if svd_path:
return os.path.realpath(svd_path)
return os.path.abspath(svd_path)
if "BOARD" not in env:
return None
@ -137,31 +131,23 @@ def _get_svd_path(env):
# default file from ./platform/misc/svd folder
p = env.PioPlatform()
if os.path.isfile(os.path.join(p.get_dir(), "misc", "svd", svd_path)):
return os.path.realpath(os.path.join(p.get_dir(), "misc", "svd", svd_path))
return os.path.abspath(os.path.join(p.get_dir(), "misc", "svd", svd_path))
return None
def _escape_build_flag(flags):
return [flag if " " not in flag else '"%s"' % flag for flag in flags]
def _subst_cmd(env, cmd):
args = env.subst_list(cmd, SCons.Subst.SUBST_CMD)[0]
return " ".join([SCons.Subst.quote_spaces(arg) for arg in args])
def DumpIDEData(env, globalenv):
""" env here is `projenv`"""
env["__escape_build_flag"] = _escape_build_flag
LINTCCOM = (
"${__escape_build_flag(CFLAGS)} ${__escape_build_flag(CCFLAGS)} $CPPFLAGS"
)
LINTCXXCOM = (
"${__escape_build_flag(CXXFLAGS)} ${__escape_build_flag(CCFLAGS)} $CPPFLAGS"
)
def DumpIntegrationData(env, globalenv):
"""env here is `projenv`"""
data = {
"env_name": env["PIOENV"],
"libsource_dirs": [env.subst(l) for l in env.GetLibSourceDirs()],
"libsource_dirs": [env.subst(item) for item in env.GetLibSourceDirs()],
"defines": _dump_defines(env),
"includes": _dump_includes(env),
"includes": env.DumpIntegrationIncludes(),
"cc_path": where_is_program(env.subst("$CC"), env.subst("${ENV['PATH']}")),
"cxx_path": where_is_program(env.subst("$CXX"), env.subst("${ENV['PATH']}")),
"gdb_path": where_is_program(env.subst("$GDB"), env.subst("${ENV['PATH']}")),
@ -181,7 +167,7 @@ def DumpIDEData(env, globalenv):
env_ = env.Clone()
# https://github.com/platformio/platformio-atom-ide/issues/34
_new_defines = []
for item in processDefines(env_.get("CPPDEFINES", [])):
for item in SCons.Defaults.processDefines(env_.get("CPPDEFINES", [])):
item = item.replace('\\"', '"')
if " " in item:
_new_defines.append(item.replace(" ", "\\\\ "))
@ -189,7 +175,13 @@ def DumpIDEData(env, globalenv):
_new_defines.append(item)
env_.Replace(CPPDEFINES=_new_defines)
data.update({"cc_flags": env_.subst(LINTCCOM), "cxx_flags": env_.subst(LINTCXXCOM)})
# export C/C++ build flags
data.update(
{
"cc_flags": _subst_cmd(env_, "$CFLAGS $CCFLAGS $CPPFLAGS"),
"cxx_flags": _subst_cmd(env_, "$CXXFLAGS $CCFLAGS $CPPFLAGS"),
}
)
return data
@ -199,5 +191,6 @@ def exists(_):
def generate(env):
env.AddMethod(DumpIDEData)
env.AddMethod(DumpIntegrationIncludes)
env.AddMethod(DumpIntegrationData)
return env


@ -27,14 +27,16 @@ import sys
import click
import SCons.Scanner # pylint: disable=import-error
from SCons.Script import ARGUMENTS # pylint: disable=import-error
from SCons.Script import COMMAND_LINE_TARGETS # pylint: disable=import-error
from SCons.Script import DefaultEnvironment # pylint: disable=import-error
from platformio import exception, fs, util
from platformio.builder.tools import platformio as piotool
from platformio.clients.http import InternetIsOffline
from platformio.compat import WINDOWS, hashlib_encode_data, string_types
from platformio.package.exception import UnknownPackageError
from platformio.clients.http import HTTPClientError, InternetIsOffline
from platformio.compat import IS_WINDOWS, hashlib_encode_data, string_types
from platformio.package.exception import (
MissingPackageManifestError,
UnknownPackageError,
)
from platformio.package.manager.library import LibraryPackageManager
from platformio.package.manifest.parser import (
ManifestParserError,
@ -54,11 +56,21 @@ class LibBuilderFactory(object):
used_frameworks = LibBuilderFactory.get_used_frameworks(env, path)
common_frameworks = set(env.get("PIOFRAMEWORK", [])) & set(used_frameworks)
if common_frameworks:
clsname = "%sLibBuilder" % list(common_frameworks)[0].title()
clsname = "%sLibBuilder" % list(common_frameworks)[0].capitalize()
elif used_frameworks:
clsname = "%sLibBuilder" % used_frameworks[0].title()
clsname = "%sLibBuilder" % used_frameworks[0].capitalize()
obj = getattr(sys.modules[__name__], clsname)(env, path, verbose=verbose)
# Handle PlatformIOLibBuilder.manifest.build.builder
# pylint: disable=protected-access
if isinstance(obj, PlatformIOLibBuilder) and obj._manifest.get("build", {}).get(
"builder"
):
obj = getattr(
sys.modules[__name__], obj._manifest.get("build", {}).get("builder")
)(env, path, verbose=verbose)
assert isinstance(obj, LibBuilderBase)
return obj
@ -86,7 +98,9 @@ class LibBuilderFactory(object):
fname, piotool.SRC_BUILD_EXT + piotool.SRC_HEADER_EXT
):
continue
with io.open(os.path.join(root, fname), errors="ignore") as fp:
with io.open(
os.path.join(root, fname), encoding="utf8", errors="ignore"
) as fp:
content = fp.read()
if not content:
continue
@ -113,7 +127,7 @@ class LibBuilderBase(object):
def __init__(self, env, path, manifest=None, verbose=False):
self.env = env.Clone()
self.envorigin = env.Clone()
self.path = os.path.realpath(env.subst(path))
self.path = os.path.abspath(env.subst(path))
self.verbose = verbose
try:
@ -124,11 +138,13 @@ class LibBuilderBase(object):
)
self._manifest = {}
self._is_dependent = False
self._is_built = False
self._depbuilders = list()
self._circular_deps = list()
self._processed_files = list()
self.is_dependent = False
self.is_built = False
self.depbuilders = []
self._deps_are_processed = False
self._circular_deps = []
self._processed_files = []
# reset source filter, could be overridden with extra script
self.env["SRC_FILTER"] = ""
@ -142,12 +158,17 @@ class LibBuilderBase(object):
def __contains__(self, path):
p1 = self.path
p2 = path
if WINDOWS:
if IS_WINDOWS:
p1 = p1.lower()
p2 = p2.lower()
if p1 == p2:
return True
return os.path.commonprefix((p1 + os.path.sep, p2)) == p1 + os.path.sep
if os.path.commonprefix([p1 + os.path.sep, p2]) == p1 + os.path.sep:
return True
# try to resolve paths
p1 = os.path.realpath(p1)
p2 = os.path.realpath(p2)
return os.path.commonprefix([p1 + os.path.sep, p2]) == p1 + os.path.sep
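# Illustrative behaviour of the containment check above (case-insensitive on
# Windows, hypothetical paths): with lb.path == "/libs/Foo",
#   "/libs/Foo/src/Foo.cpp" is reported as inside,
#   "/libs/FooBar/util.h"   is not (the trailing os.path.sep avoids the prefix
#   false positive), and symlinked locations are re-checked after realpath().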
@property
def name(self):
@ -157,6 +178,11 @@ class LibBuilderBase(object):
def version(self):
return self._manifest.get("version")
@property
def dependent(self):
"""Backward compatibility with ESP-IDF"""
return self.is_dependent
@property
def dependencies(self):
return self._manifest.get("dependencies")
@ -172,19 +198,19 @@ class LibBuilderBase(object):
@property
def include_dir(self):
if not all(
os.path.isdir(os.path.join(self.path, d)) for d in ("include", "src")
):
return None
return os.path.join(self.path, "include")
for name in ("include", "Include"):
d = os.path.join(self.path, name)
if os.path.isdir(d):
return d
return None
@property
def src_dir(self):
return (
os.path.join(self.path, "src")
if os.path.isdir(os.path.join(self.path, "src"))
else self.path
)
for name in ("src", "Src"):
d = os.path.join(self.path, name)
if os.path.isdir(d):
return d
return self.path
def get_include_dirs(self):
items = []
@ -213,18 +239,6 @@ class LibBuilderBase(object):
def extra_script(self):
return None
@property
def depbuilders(self):
return self._depbuilders
@property
def dependent(self):
return self._is_dependent
@property
def is_built(self):
return self._is_built
@property
def lib_archive(self):
return self.env.GetProjectOption("lib_archive")
@ -278,14 +292,15 @@ class LibBuilderBase(object):
if self.extra_script:
self.env.SConscriptChdir(1)
self.env.SConscript(
os.path.realpath(self.extra_script),
os.path.abspath(self.extra_script),
exports={"env": self.env, "pio_lib_builder": self},
)
self.env.ProcessUnFlags(self.build_unflags)
def process_dependencies(self):
if not self.dependencies:
if not self.dependencies or self._deps_are_processed:
return
self._deps_are_processed = True
for item in self.dependencies:
found = False
for lb in self.env.GetLibBuilders():
@ -293,7 +308,7 @@ class LibBuilderBase(object):
continue
found = True
if lb not in self.depbuilders:
self.depend_recursive(lb)
self.depend_on(lb)
break
if not found and self.verbose:
@ -388,7 +403,29 @@ class LibBuilderBase(object):
return result
def depend_recursive(self, lb, search_files=None):
def search_deps_recursive(self, search_files=None):
self.process_dependencies()
# when LDF is disabled
if self.lib_ldf_mode == "off":
return
if self.lib_ldf_mode.startswith("deep"):
search_files = self.get_search_files()
lib_inc_map = {}
for inc in self._get_found_includes(search_files):
for lb in self.env.GetLibBuilders():
if inc.get_abspath() in lb:
if lb not in lib_inc_map:
lib_inc_map[lb] = []
lib_inc_map[lb].append(inc.get_abspath())
break
for lb, lb_search_files in lib_inc_map.items():
self.depend_on(lb, search_files=lb_search_files)
def depend_on(self, lb, search_files=None, recursive=True):
def _already_depends(_lb):
if self in _lb.depbuilders:
return True
@ -406,38 +443,17 @@ class LibBuilderBase(object):
"between `%s` and `%s`\n" % (self.path, lb.path)
)
self._circular_deps.append(lb)
elif lb not in self._depbuilders:
self._depbuilders.append(lb)
elif lb not in self.depbuilders:
self.depbuilders.append(lb)
lb.is_dependent = True
LibBuilderBase._INCLUDE_DIRS_CACHE = None
lb.search_deps_recursive(search_files)
def search_deps_recursive(self, search_files=None):
if not self._is_dependent:
self._is_dependent = True
self.process_dependencies()
if self.lib_ldf_mode.startswith("deep"):
search_files = self.get_search_files()
# when LDF is disabled
if self.lib_ldf_mode == "off":
return
lib_inc_map = {}
for inc in self._get_found_includes(search_files):
for lb in self.env.GetLibBuilders():
if inc.get_abspath() in lb:
if lb not in lib_inc_map:
lib_inc_map[lb] = []
lib_inc_map[lb].append(inc.get_abspath())
break
for lb, lb_search_files in lib_inc_map.items():
self.depend_recursive(lb, lb_search_files)
if recursive:
lb.search_deps_recursive(search_files)
def build(self):
libs = []
for lb in self._depbuilders:
for lb in self.depbuilders:
libs.extend(lb.build())
# copy shared information to self env
for key in ("CPPPATH", "LIBPATH", "LIBS", "LINKFLAGS"):
@ -446,9 +462,9 @@ class LibBuilderBase(object):
for lb in self._circular_deps:
self.env.PrependUnique(CPPPATH=lb.get_include_dirs())
if self._is_built:
if self.is_built:
return libs
self._is_built = True
self.is_built = True
self.env.PrependUnique(CPPPATH=self.get_include_dirs())
@ -459,12 +475,22 @@ class LibBuilderBase(object):
for key in ("CPPPATH", "LIBPATH", "LIBS", "LINKFLAGS"):
self.env.PrependUnique(**{key: lb.env.get(key)})
if self.lib_archive:
libs.append(
self.env.BuildLibrary(self.build_dir, self.src_dir, self.src_filter)
do_not_archive = not self.lib_archive
if not do_not_archive:
nodes = self.env.CollectBuildFiles(
self.build_dir, self.src_dir, self.src_filter
)
else:
if nodes:
libs.append(
self.env.BuildLibrary(
self.build_dir, self.src_dir, self.src_filter, nodes
)
)
else:
do_not_archive = True
if do_not_archive:
self.env.BuildSources(self.build_dir, self.src_dir, self.src_filter)
return libs
@ -479,6 +505,14 @@ class ArduinoLibBuilder(LibBuilderBase):
return {}
return ManifestParserFactory.new_from_file(manifest_path).as_dict()
@property
def include_dir(self):
if not all(
os.path.isdir(os.path.join(self.path, d)) for d in ("include", "src")
):
return None
return os.path.join(self.path, "include")
def get_include_dirs(self):
include_dirs = LibBuilderBase.get_include_dirs(self)
if os.path.isdir(os.path.join(self.path, "src")):
@ -545,6 +579,24 @@ class ArduinoLibBuilder(LibBuilderBase):
def is_platforms_compatible(self, platforms):
return util.items_in_list(platforms, self._manifest.get("platforms") or ["*"])
@property
def build_flags(self):
ldflags = [
LibBuilderBase.build_flags.fget(self), # pylint: disable=no-member
self._manifest.get("ldflags"),
]
if self._manifest.get("precompiled") in ("true", "full"):
# add to LDPATH {build.mcu} folder
board_config = self.env.BoardConfig()
for key in ("build.mcu", "build.cpu"):
libpath = os.path.join(self.src_dir, board_config.get(key, ""))
if not os.path.isdir(libpath):
continue
self.env.PrependUnique(LIBPATH=libpath)
break
ldflags = [flag for flag in ldflags if flag] # remove empty
return " ".join(ldflags) if ldflags else None
class MbedLibBuilder(LibBuilderBase):
def load_manifest(self):
@ -553,12 +605,6 @@ class MbedLibBuilder(LibBuilderBase):
return {}
return ManifestParserFactory.new_from_file(manifest_path).as_dict()
@property
def include_dir(self):
if os.path.isdir(os.path.join(self.path, "include")):
return os.path.join(self.path, "include")
return None
@property
def src_dir(self):
if os.path.isdir(os.path.join(self.path, "source")):
@ -590,7 +636,7 @@ class MbedLibBuilder(LibBuilderBase):
def process_extra_options(self):
self._process_mbed_lib_confs()
return super(MbedLibBuilder, self).process_extra_options()
return super().process_extra_options()
def _process_mbed_lib_confs(self):
mbed_lib_paths = [
@ -671,7 +717,7 @@ class MbedLibBuilder(LibBuilderBase):
def _mbed_conf_append_macros(self, mbed_config_path, macros):
lines = []
with open(mbed_config_path) as fp:
with open(mbed_config_path, encoding="utf8") as fp:
for line in fp.readlines():
line = line.strip()
if line == "#endif":
@ -690,7 +736,7 @@ class MbedLibBuilder(LibBuilderBase):
if len(tokens) < 2 or tokens[1] not in macros:
lines.append(line)
lines.append("")
with open(mbed_config_path, "w") as fp:
with open(mbed_config_path, mode="w", encoding="utf8") as fp:
fp.write("\n".join(lines))
@ -708,14 +754,14 @@ class PlatformIOLibBuilder(LibBuilderBase):
def include_dir(self):
if "includeDir" in self._manifest.get("build", {}):
with fs.cd(self.path):
return os.path.realpath(self._manifest.get("build").get("includeDir"))
return os.path.abspath(self._manifest.get("build").get("includeDir"))
return LibBuilderBase.include_dir.fget(self) # pylint: disable=no-member
@property
def src_dir(self):
if "srcDir" in self._manifest.get("build", {}):
with fs.cd(self.path):
return os.path.realpath(self._manifest.get("build").get("srcDir"))
return os.path.abspath(self._manifest.get("build").get("srcDir"))
return LibBuilderBase.src_dir.fget(self) # pylint: disable=no-member
@property
@ -809,7 +855,7 @@ class ProjectAsLibBuilder(LibBuilderBase):
def __init__(self, env, *args, **kwargs):
# backup original value, will be reset in base.__init__
project_src_filter = env.get("SRC_FILTER")
super(ProjectAsLibBuilder, self).__init__(env, *args, **kwargs)
super().__init__(env, *args, **kwargs)
self.env["SRC_FILTER"] = project_src_filter
@property
@ -835,7 +881,7 @@ class ProjectAsLibBuilder(LibBuilderBase):
# project files
items = LibBuilderBase.get_search_files(self)
# test files
if "__test" in COMMAND_LINE_TARGETS:
if "test" in self.env.GetBuildType():
items.extend(
[
os.path.join("$PROJECT_TEST_DIR", item)
@ -859,13 +905,19 @@ class ProjectAsLibBuilder(LibBuilderBase):
# pylint: disable=no-member
return self.env.get("SRC_FILTER") or LibBuilderBase.src_filter.fget(self)
@property
def build_flags(self):
# pylint: disable=no-member
return self.env.get("SRC_BUILD_FLAGS") or LibBuilderBase.build_flags.fget(self)
@property
def dependencies(self):
return self.env.GetProjectOption("lib_deps", [])
def process_extra_options(self):
# skip for project, options are already processed
pass
with fs.cd(self.path):
self.env.ProcessFlags(self.build_flags)
self.env.ProcessUnFlags(self.build_unflags)
def install_dependencies(self):
def _is_builtin(spec):
@ -897,7 +949,7 @@ class ProjectAsLibBuilder(LibBuilderBase):
try:
lm.install(spec)
did_install = True
except (UnknownPackageError, InternetIsOffline) as e:
except (HTTPClientError, UnknownPackageError, InternetIsOffline) as e:
click.secho("Warning! %s" % e, fg="yellow")
# reset cache
@ -905,6 +957,7 @@ class ProjectAsLibBuilder(LibBuilderBase):
DefaultEnvironment().Replace(__PIO_LIB_BUILDERS=None)
def process_dependencies(self): # pylint: disable=too-many-branches
found_lbs = []
for spec in self.dependencies:
found = False
for storage_dir in self.env.GetLibSourceDirs():
@ -918,7 +971,8 @@ class ProjectAsLibBuilder(LibBuilderBase):
if pkg.path != lb.path:
continue
if lb not in self.depbuilders:
self.depend_recursive(lb)
self.depend_on(lb, recursive=False)
found_lbs.append(lb)
found = True
break
if found:
@ -930,12 +984,16 @@ class ProjectAsLibBuilder(LibBuilderBase):
if lb.name != spec:
continue
if lb not in self.depbuilders:
self.depend_recursive(lb)
self.depend_on(lb)
found = True
break
# process library dependencies
for lb in found_lbs:
lb.search_deps_recursive()
def build(self):
self._is_built = True # do not build Project now
self.is_built = True # do not build Project now
result = LibBuilderBase.build(self)
self.env.PrependUnique(CPPPATH=self.get_include_dirs())
return result
@ -973,7 +1031,7 @@ def GetLibBuilders(env): # pylint: disable=too-many-branches
if DefaultEnvironment().get("__PIO_LIB_BUILDERS", None) is not None:
return sorted(
DefaultEnvironment()["__PIO_LIB_BUILDERS"],
key=lambda lb: 0 if lb.dependent else 1,
key=lambda lb: 0 if lb.is_dependent else 1,
)
DefaultEnvironment().Replace(__PIO_LIB_BUILDERS=[])
@ -982,12 +1040,16 @@ def GetLibBuilders(env): # pylint: disable=too-many-branches
found_incompat = False
for storage_dir in env.GetLibSourceDirs():
storage_dir = os.path.realpath(storage_dir)
storage_dir = os.path.abspath(storage_dir)
if not os.path.isdir(storage_dir):
continue
for item in sorted(os.listdir(storage_dir)):
lib_dir = os.path.join(storage_dir, item)
if item == "__cores__" or not os.path.isdir(lib_dir):
if item == "__cores__":
continue
if LibraryPackageManager.is_symlink(lib_dir):
lib_dir, _ = LibraryPackageManager.resolve_symlink(lib_dir)
if not lib_dir or not os.path.isdir(lib_dir):
continue
try:
lb = LibBuilderFactory.new(env, lib_dir)
@ -1019,9 +1081,21 @@ def GetLibBuilders(env): # pylint: disable=too-many-branches
def ConfigureProjectLibBuilder(env):
_pm_storage = {}
def _get_lib_license(pkg):
storage_dir = os.path.dirname(os.path.dirname(pkg.path))
if storage_dir not in _pm_storage:
_pm_storage[storage_dir] = LibraryPackageManager(storage_dir)
try:
return (_pm_storage[storage_dir].load_manifest(pkg) or {}).get("license")
except MissingPackageManifestError:
pass
return None
def _correct_found_libs(lib_builders):
# build full dependency graph
found_lbs = [lb for lb in lib_builders if lb.dependent]
found_lbs = [lb for lb in lib_builders if lb.is_dependent]
for lb in lib_builders:
if lb in found_lbs:
lb.search_deps_recursive(lb.get_search_files())
@ -1033,18 +1107,20 @@ def ConfigureProjectLibBuilder(env):
def _print_deps_tree(root, level=0):
margin = "| " * (level)
for lb in root.depbuilders:
title = "<%s>" % lb.name
title = lb.name
pkg = PackageItem(lb.path)
if pkg.metadata:
title += " %s" % pkg.metadata.version
title += " @ %s" % pkg.metadata.version
elif lb.version:
title += " %s" % lb.version
title += " @ %s" % lb.version
click.echo("%s|-- %s" % (margin, title), nl=False)
if int(ARGUMENTS.get("PIOVERBOSE", 0)):
click.echo(
" (License: %s, " % (_get_lib_license(pkg) or "Unknown"), nl=False
)
if pkg.metadata and pkg.metadata.spec.external:
click.echo(" [%s]" % pkg.metadata.spec.url, nl=False)
click.echo(" (", nl=False)
click.echo(lb.path, nl=False)
click.echo("URI: %s, " % pkg.metadata.spec.uri, nl=False)
click.echo("Path: %s" % lb.path, nl=False)
click.echo(")", nl=False)
click.echo("")
if lb.depbuilders:
@ -1053,7 +1129,7 @@ def ConfigureProjectLibBuilder(env):
project = ProjectAsLibBuilder(env, "$PROJECT_DIR")
ldf_mode = LibBuilderBase.lib_ldf_mode.fget(project) # pylint: disable=no-member
click.echo("LDF: Library Dependency Finder -> http://bit.ly/configure-pio-ldf")
click.echo("LDF: Library Dependency Finder -> https://bit.ly/configure-pio-ldf")
click.echo(
"LDF Modes: Finder ~ %s, Compatibility ~ %s"
% (ldf_mode, project.lib_compat_mode)


@ -14,15 +14,30 @@
from __future__ import absolute_import
from hashlib import md5
from os import makedirs
from os.path import isdir, isfile, join
import hashlib
import os
import re
from platformio.compat import WINDOWS, hashlib_encode_data
from SCons.Platform import TempFileMunge # pylint: disable=import-error
from SCons.Subst import quote_spaces # pylint: disable=import-error
# Windows CLI has limit with command length to 8192
# Leave 2000 chars for flags and other options
MAX_LINE_LENGTH = 6000 if WINDOWS else 128072
from platformio.compat import IS_WINDOWS, hashlib_encode_data
# Command-line length limits depend on the platform:
# - Windows = 8192
# - Unix = 131072
# Reserve ~512 characters for the compiler and temporary file paths
MAX_LINE_LENGTH = (8192 if IS_WINDOWS else 131072) - 512
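# Worked numbers (illustrative): the effective thresholds are
#   Windows: 8192 - 512 = 7680 characters
#   Unix:  131072 - 512 = 130560 characters
# Commands longer than MAX_LINE_LENGTH are spilled into a temporary response file
# by SCons' TempFileMunge configured below.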
WINPATHSEP_RE = re.compile(r"\\([^\"'\\]|$)")
def tempfile_arg_esc_func(arg):
arg = quote_spaces(arg)
if not IS_WINDOWS:
return arg
# GCC requires double Windows slashes, let's use UNIX separator
return WINPATHSEP_RE.sub(r"/\1", arg)
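# Illustrative example (Windows only): the escape hook rewrites backslash path
# separators before arguments go into the temp response file, e.g.
#   tempfile_arg_esc_func(r"C:\project\src\main.cpp") -> "C:/project/src/main.cpp"
# while escaped quotes and trailing backslashes are left untouched by the regex.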
def long_sources_hook(env, sources):
@ -41,32 +56,16 @@ def long_sources_hook(env, sources):
return '@"%s"' % _file_long_data(env, " ".join(data))
def long_incflags_hook(env, incflags):
_incflags = env.subst(incflags).replace("\\", "/")
if len(_incflags) < MAX_LINE_LENGTH:
return incflags
# fix space in paths
data = []
for line in _incflags.split(" -I"):
line = line.strip()
if not line.startswith("-I"):
line = "-I" + line
data.append('-I"%s"' % line[2:])
return '@"%s"' % _file_long_data(env, " ".join(data))
def _file_long_data(env, data):
build_dir = env.subst("$BUILD_DIR")
if not isdir(build_dir):
makedirs(build_dir)
tmp_file = join(
build_dir, "longcmd-%s" % md5(hashlib_encode_data(data)).hexdigest()
if not os.path.isdir(build_dir):
os.makedirs(build_dir)
tmp_file = os.path.join(
build_dir, "longcmd-%s" % hashlib.md5(hashlib_encode_data(data)).hexdigest()
)
if isfile(tmp_file):
if os.path.isfile(tmp_file):
return tmp_file
with open(tmp_file, "w") as fp:
with open(tmp_file, mode="w", encoding="utf8") as fp:
fp.write(data)
return tmp_file
@ -76,17 +75,21 @@ def exists(_):
def generate(env):
env.Replace(_long_sources_hook=long_sources_hook)
env.Replace(_long_incflags_hook=long_incflags_hook)
coms = {}
for key in ("ARCOM", "LINKCOM"):
coms[key] = env.get(key, "").replace(
"$SOURCES", "${_long_sources_hook(__env__, SOURCES)}"
)
for key in ("_CCCOMCOM", "ASPPCOM"):
coms[key] = env.get(key, "").replace(
"$_CPPINCFLAGS", "${_long_incflags_hook(__env__, _CPPINCFLAGS)}"
)
env.Replace(**coms)
kwargs = dict(
_long_sources_hook=long_sources_hook,
TEMPFILE=TempFileMunge,
MAXLINELENGTH=MAX_LINE_LENGTH,
TEMPFILEARGESCFUNC=tempfile_arg_esc_func,
TEMPFILESUFFIX=".tmp",
TEMPFILEDIR="$BUILD_DIR",
)
for name in ("LINKCOM", "ASCOM", "ASPPCOM", "CCCOM", "CXXCOM"):
kwargs[name] = "${TEMPFILE('%s','$%sSTR')}" % (env.get(name), name)
kwargs["ARCOM"] = env.get("ARCOM", "").replace(
"$SOURCES", "${_long_sources_hook(__env__, SOURCES)}"
)
env.Replace(**kwargs)
return env


@ -14,243 +14,15 @@
from __future__ import absolute_import
import atexit
import io
import os
import re
import sys
from tempfile import mkstemp
import click
from platformio import fs, util
from platformio.compat import get_filesystem_encoding, get_locale_encoding, glob_escape
from platformio.package.manager.core import get_core_package_dir
from platformio.proc import exec_command
class InoToCPPConverter(object):
PROTOTYPE_RE = re.compile(
r"""^(
(?:template\<.*\>\s*)? # template
([a-z_\d\&]+\*?\s+){1,2} # return type
([a-z_\d]+\s*) # name of prototype
\([a-z_,\.\*\&\[\]\s\d]*\) # arguments
)\s*(\{|;) # must end with `{` or `;`
""",
re.X | re.M | re.I,
)
DETECTMAIN_RE = re.compile(r"void\s+(setup|loop)\s*\(", re.M | re.I)
PROTOPTRS_TPLRE = r"\([^&\(]*&(%s)[^\)]*\)"
def __init__(self, env):
self.env = env
self._main_ino = None
self._safe_encoding = None
def read_safe_contents(self, path):
error_reported = False
for encoding in (
"utf-8",
None,
get_filesystem_encoding(),
get_locale_encoding(),
"latin-1",
):
try:
with io.open(path, encoding=encoding) as fp:
contents = fp.read()
self._safe_encoding = encoding
return contents
except UnicodeDecodeError:
if not error_reported:
error_reported = True
click.secho(
"Unicode decode error has occurred, please remove invalid "
"(non-ASCII or non-UTF8) characters from %s file or convert it to UTF-8"
% path,
fg="yellow",
err=True,
)
return ""
def write_safe_contents(self, path, contents):
with io.open(
path, "w", encoding=self._safe_encoding, errors="backslashreplace"
) as fp:
return fp.write(contents)
def is_main_node(self, contents):
return self.DETECTMAIN_RE.search(contents)
def convert(self, nodes):
contents = self.merge(nodes)
if not contents:
return None
return self.process(contents)
def merge(self, nodes):
assert nodes
lines = []
for node in nodes:
contents = self.read_safe_contents(node.get_path())
_lines = ['# 1 "%s"' % node.get_path().replace("\\", "/"), contents]
if self.is_main_node(contents):
lines = _lines + lines
self._main_ino = node.get_path()
else:
lines.extend(_lines)
if not self._main_ino:
self._main_ino = nodes[0].get_path()
return "\n".join(["#include <Arduino.h>"] + lines) if lines else None
def process(self, contents):
out_file = self._main_ino + ".cpp"
assert self._gcc_preprocess(contents, out_file)
contents = self.read_safe_contents(out_file)
contents = self._join_multiline_strings(contents)
self.write_safe_contents(out_file, self.append_prototypes(contents))
return out_file
def _gcc_preprocess(self, contents, out_file):
tmp_path = mkstemp()[1]
self.write_safe_contents(tmp_path, contents)
self.env.Execute(
self.env.VerboseAction(
'$CXX -o "{0}" -x c++ -fpreprocessed -dD -E "{1}"'.format(
out_file, tmp_path
),
"Converting " + os.path.basename(out_file[:-4]),
)
)
atexit.register(_delete_file, tmp_path)
return os.path.isfile(out_file)
def _join_multiline_strings(self, contents):
if "\\\n" not in contents:
return contents
newlines = []
linenum = 0
stropen = False
for line in contents.split("\n"):
_linenum = self._parse_preproc_line_num(line)
if _linenum is not None:
linenum = _linenum
else:
linenum += 1
if line.endswith("\\"):
if line.startswith('"'):
stropen = True
newlines.append(line[:-1])
continue
if stropen:
newlines[len(newlines) - 1] += line[:-1]
continue
elif stropen and line.endswith(('",', '";')):
newlines[len(newlines) - 1] += line
stropen = False
newlines.append(
'#line %d "%s"' % (linenum, self._main_ino.replace("\\", "/"))
)
continue
newlines.append(line)
return "\n".join(newlines)
@staticmethod
def _parse_preproc_line_num(line):
if not line.startswith("#"):
return None
tokens = line.split(" ", 3)
if len(tokens) > 2 and tokens[1].isdigit():
return int(tokens[1])
return None
def _parse_prototypes(self, contents):
prototypes = []
reserved_keywords = set(["if", "else", "while"])
for match in self.PROTOTYPE_RE.finditer(contents):
if (
set([match.group(2).strip(), match.group(3).strip()])
& reserved_keywords
):
continue
prototypes.append(match)
return prototypes
def _get_total_lines(self, contents):
total = 0
if contents.endswith("\n"):
contents = contents[:-1]
for line in contents.split("\n")[::-1]:
linenum = self._parse_preproc_line_num(line)
if linenum is not None:
return total + linenum
total += 1
return total
def append_prototypes(self, contents):
prototypes = self._parse_prototypes(contents) or []
# skip already declared prototypes
declared = set(m.group(1).strip() for m in prototypes if m.group(4) == ";")
prototypes = [m for m in prototypes if m.group(1).strip() not in declared]
if not prototypes:
return contents
prototype_names = set(m.group(3).strip() for m in prototypes)
split_pos = prototypes[0].start()
match_ptrs = re.search(
self.PROTOPTRS_TPLRE % ("|".join(prototype_names)),
contents[:split_pos],
re.M,
)
if match_ptrs:
split_pos = contents.rfind("\n", 0, match_ptrs.start()) + 1
result = []
result.append(contents[:split_pos].strip())
result.append("%s;" % ";\n".join([m.group(1) for m in prototypes]))
result.append(
'#line %d "%s"'
% (
self._get_total_lines(contents[:split_pos]),
self._main_ino.replace("\\", "/"),
)
)
result.append(contents[split_pos:].strip())
return "\n".join(result)
def ConvertInoToCpp(env):
src_dir = glob_escape(env.subst("$PROJECT_SRC_DIR"))
ino_nodes = env.Glob(os.path.join(src_dir, "*.ino")) + env.Glob(
os.path.join(src_dir, "*.pde")
)
if not ino_nodes:
return
c = InoToCPPConverter(env)
out_file = c.convert(ino_nodes)
atexit.register(_delete_file, out_file)
def _delete_file(path):
try:
if os.path.isfile(path):
os.remove(path)
except: # pylint: disable=bare-except
pass
@util.memoized()
def _get_compiler_type(env):
def GetCompilerType(env):
if env.subst("$CC").endswith("-gcc"):
return "gcc"
try:
@ -269,10 +41,6 @@ def _get_compiler_type(env):
return None
def GetCompilerType(env):
return _get_compiler_type(env)
def GetActualLDScript(env):
def _lookup_in_ldpath(script):
for d in env.get("LIBPATH", []):
@ -318,7 +86,7 @@ def GetActualLDScript(env):
env.Exit(1)
def ConfigureDebugFlags(env):
def ConfigureDebugTarget(env):
def _cleanup_debug_flags(scope):
if scope not in env:
return
@ -333,7 +101,13 @@ def ConfigureDebugFlags(env):
for scope in ("ASFLAGS", "CCFLAGS", "LINKFLAGS"):
_cleanup_debug_flags(scope)
debug_flags = env.ParseFlags(env.GetProjectOption("debug_build_flags"))
debug_flags = env.ParseFlags(
env.get("PIODEBUGFLAGS")
if env.get("PIODEBUGFLAGS")
and not env.GetProjectOptions(as_dict=True).get("debug_build_flags")
else env.GetProjectOption("debug_build_flags")
)
env.MergeFlags(debug_flags)
optimization_flags = [
f for f in debug_flags.get("CCFLAGS", []) if f.startswith(("-O", "-g"))
@ -343,22 +117,6 @@ def ConfigureDebugFlags(env):
env.AppendUnique(ASFLAGS=optimization_flags, LINKFLAGS=optimization_flags)
def ConfigureTestTarget(env):
env.Append(
CPPDEFINES=["UNIT_TEST", "UNITY_INCLUDE_CONFIG_H"],
CPPPATH=[os.path.join("$BUILD_DIR", "UnityTestLib")],
)
unitylib = env.BuildLibrary(
os.path.join("$BUILD_DIR", "UnityTestLib"), get_core_package_dir("tool-unity")
)
env.Prepend(LIBS=[unitylib])
src_filter = ["+<*.cpp>", "+<*.c>"]
if "PIOTEST_RUNNING_NAME" in env:
src_filter.append("+<%s%s>" % (env["PIOTEST_RUNNING_NAME"], os.path.sep))
env.Replace(PIOTEST_SRC_FILTER=src_filter)
def GetExtraScripts(env, scope):
items = []
for item in env.GetProjectOption("extra_scripts", []):
@ -369,18 +127,17 @@ def GetExtraScripts(env, scope):
if not items:
return items
with fs.cd(env.subst("$PROJECT_DIR")):
return [os.path.realpath(item) for item in items]
return [os.path.abspath(env.subst(item)) for item in items]
def generate(env):
env.AddMethod(GetCompilerType)
env.AddMethod(GetActualLDScript)
env.AddMethod(ConfigureDebugTarget)
env.AddMethod(GetExtraScripts)
# backward compatibility with the Zephyr build script
env.AddMethod(ConfigureDebugTarget, "ConfigureDebugFlags")
def exists(_):
return True
def generate(env):
env.AddMethod(ConvertInoToCpp)
env.AddMethod(GetCompilerType)
env.AddMethod(GetActualLDScript)
env.AddMethod(ConfigureDebugFlags)
env.AddMethod(ConfigureTestTarget)
env.AddMethod(GetExtraScripts)
return env


@ -19,9 +19,10 @@ import sys
from SCons.Script import ARGUMENTS # pylint: disable=import-error
from SCons.Script import COMMAND_LINE_TARGETS # pylint: disable=import-error
from SCons.Script import DefaultEnvironment # pylint: disable=import-error
from platformio import fs, util
from platformio.compat import WINDOWS
from platformio.compat import IS_MACOS, IS_WINDOWS
from platformio.package.meta import PackageItem
from platformio.package.version import get_original_version
from platformio.platform.exception import UnknownBoard
@ -32,16 +33,17 @@ from platformio.project.config import ProjectOptions
@util.memoized()
def PioPlatform(env):
variables = env.GetProjectOptions(as_dict=True)
if "framework" in variables:
# support PIO Core 3.0 dev/platforms
variables["pioframework"] = variables["framework"]
def _PioPlatform():
env = DefaultEnvironment()
p = PlatformFactory.new(os.path.dirname(env["PLATFORM_MANIFEST"]))
p.configure_default_packages(variables, COMMAND_LINE_TARGETS)
p.configure_project_packages(env["PIOENV"], COMMAND_LINE_TARGETS)
return p
def PioPlatform(_):
return _PioPlatform()
def BoardConfig(env, board=None):
with fs.cd(env.subst("$PROJECT_DIR")):
try:
@ -52,6 +54,7 @@ def BoardConfig(env, board=None):
except (AssertionError, UnknownBoard) as e:
sys.stderr.write("Error: %s\n" % str(e))
env.Exit(1)
return None
def GetFrameworkScript(env, framework):
@ -70,7 +73,6 @@ def LoadPioPlatform(env):
env["PIOPLATFORM"] = p.name
# Add toolchains and uploaders to $PATH and $*_LIBRARY_PATH
systype = util.get_systype()
for pkg in p.get_installed_packages():
type_ = p.get_package_type(pkg.metadata.name)
if type_ not in ("toolchain", "uploader", "debugger"):
@ -82,12 +84,12 @@ def LoadPioPlatform(env):
else pkg.path,
)
if (
not WINDOWS
not IS_WINDOWS
and os.path.isdir(os.path.join(pkg.path, "lib"))
and type_ != "toolchain"
):
env.PrependENVPath(
"DYLD_LIBRARY_PATH" if "darwin" in systype else "LD_LIBRARY_PATH",
"DYLD_LIBRARY_PATH" if IS_MACOS else "LD_LIBRARY_PATH",
os.path.join(pkg.path, "lib"),
)
@ -160,7 +162,7 @@ def PrintConfiguration(env): # pylint: disable=too-many-statements
and pkg_metadata
and pkg_metadata.spec.external
):
data.append("(%s)" % pkg_metadata.spec.url)
data.append("(%s)" % pkg_metadata.spec.uri)
if board_config:
data.extend([">", board_config.get("name")])
return data
@ -213,7 +215,7 @@ def PrintConfiguration(env): # pylint: disable=too-many-statements
data = []
for item in platform.dump_used_packages():
original_version = get_original_version(item["version"])
info = "%s %s" % (item["name"], item["version"])
info = "%s @ %s" % (item["name"], item["version"])
extra = []
if original_version:
extra.append(original_version)


@ -14,7 +14,8 @@
from __future__ import absolute_import
from platformio.project.config import MISSING, ProjectConfig, ProjectOptions
from platformio.compat import MISSING
from platformio.project.config import ProjectConfig
def GetProjectConfig(env):
@ -30,15 +31,17 @@ def GetProjectOption(env, option, default=MISSING):
def LoadProjectOptions(env):
for option, value in env.GetProjectOptions():
option_meta = ProjectOptions.get("env." + option)
config = env.GetProjectConfig()
section = "env:" + env["PIOENV"]
for option in config.options(section):
option_meta = config.find_option_meta(section, option)
if (
not option_meta
or not option_meta.buildenvvar
or option_meta.buildenvvar in env
):
continue
env[option_meta.buildenvvar] = value
env[option_meta.buildenvvar] = config.get(section, option)
def exists(_):


@ -16,6 +16,7 @@
from __future__ import absolute_import
import json
import sys
from os import environ, makedirs, remove
from os.path import isdir, join, splitdrive
@ -23,9 +24,8 @@ from os.path import isdir, join, splitdrive
from elftools.elf.descriptions import describe_sh_flags
from elftools.elf.elffile import ELFFile
from platformio.compat import dump_json_to_unicode
from platformio.compat import IS_WINDOWS
from platformio.proc import exec_command
from platformio.util import get_systype
def _run_tool(cmd, env, tool_args):
@ -37,7 +37,7 @@ def _run_tool(cmd, env, tool_args):
makedirs(build_dir)
tmp_file = join(build_dir, "size-data-longcmd.txt")
with open(tmp_file, "w") as fp:
with open(tmp_file, mode="w", encoding="utf8") as fp:
fp.write("\n".join(tool_args))
cmd.append("@" + tmp_file)
@ -164,7 +164,7 @@ def _collect_symbols_info(env, elffile, elf_path, sections):
location = symbol_locations.get(hex(symbol["addr"]))
if not location or "?" in location:
continue
if "windows" in get_systype():
if IS_WINDOWS:
drive, tail = splitdrive(location)
location = join(drive.upper(), tail)
symbol["file"] = location
@ -220,7 +220,7 @@ def DumpSizeData(_, target, source, env): # pylint: disable=unused-argument
"sections": sections,
}
files = dict()
files = {}
for symbol in _collect_symbols_info(env, elffile, elf_path, sections):
file_path = symbol.get("file") or "unknown"
if not files.get(file_path, {}):
@ -235,14 +235,16 @@ def DumpSizeData(_, target, source, env): # pylint: disable=unused-argument
files[file_path]["symbols"].append(symbol)
data["memory"]["files"] = list()
data["memory"]["files"] = []
for k, v in files.items():
file_data = {"path": k}
file_data.update(v)
data["memory"]["files"].append(file_data)
with open(join(env.subst("$BUILD_DIR"), "sizedata.json"), "w") as fp:
fp.write(dump_json_to_unicode(data))
with open(
join(env.subst("$BUILD_DIR"), "sizedata.json"), mode="w", encoding="utf8"
) as fp:
fp.write(json.dumps(data))
def exists(_):


@ -29,9 +29,9 @@ def VerboseAction(_, act, actstr):
return Action(act, actstr)
def PioClean(env, clean_dir):
def PioClean(env, clean_all=False):
def _relpath(path):
if compat.WINDOWS:
if compat.IS_WINDOWS:
prefix = os.getcwd()[:2].lower()
if (
":" not in prefix
@ -41,21 +41,30 @@ def PioClean(env, clean_dir):
return path
return os.path.relpath(path)
if not os.path.isdir(clean_dir):
def _clean_dir(path):
clean_rel_path = _relpath(path)
for root, _, files in os.walk(path):
for f in files:
dst = os.path.join(root, f)
os.remove(dst)
print(
"Removed %s"
% (dst if not clean_rel_path.startswith(".") else _relpath(dst))
)
build_dir = env.subst("$BUILD_DIR")
libdeps_dir = env.subst("$PROJECT_LIBDEPS_DIR")
if os.path.isdir(build_dir):
_clean_dir(build_dir)
fs.rmtree(build_dir)
else:
print("Build environment is clean")
env.Exit(0)
clean_rel_path = _relpath(clean_dir)
for root, _, files in os.walk(clean_dir):
for f in files:
dst = os.path.join(root, f)
os.remove(dst)
print(
"Removed %s"
% (dst if not clean_rel_path.startswith(".") else _relpath(dst))
)
if clean_all and os.path.isdir(libdeps_dir):
_clean_dir(libdeps_dir)
fs.rmtree(libdeps_dir)
print("Done cleaning")
fs.rmtree(clean_dir)
env.Exit(0)
def AddTarget( # pylint: disable=too-many-arguments
@ -65,7 +74,7 @@ def AddTarget( # pylint: disable=too-many-arguments
actions,
title=None,
description=None,
group="Generic",
group="General",
always_build=True,
):
if "__PIO_TARGETS" not in env:
@ -101,7 +110,13 @@ def DumpTargets(env):
description="Generate compilation database `compile_commands.json`",
group="Advanced",
)
targets["clean"] = dict(name="clean", title="Clean", group="Generic")
targets["clean"] = dict(name="clean", title="Clean", group="General")
targets["cleanall"] = dict(
name="cleanall",
title="Clean All",
group="General",
description="Clean a build environment and installed library dependencies",
)
return list(targets.values())
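With the reworked PioClean above, the plain "clean" target removes only $BUILD_DIR, while the new "cleanall" target also wipes $PROJECT_LIBDEPS_DIR. A rough, SCons-free sketch of the same walk-then-remove flow (directory arguments stand in for the substituted SCons variables):

import os
import shutil

def pio_clean(build_dir, libdeps_dir, clean_all=False):
    def _clean_dir(path):
        # Report every removed file before the tree itself is dropped
        for root, _, files in os.walk(path):
            for name in files:
                dst = os.path.join(root, name)
                os.remove(dst)
                print("Removed %s" % dst)

    if os.path.isdir(build_dir):
        _clean_dir(build_dir)
        shutil.rmtree(build_dir)
    else:
        print("Build environment is clean")
    if clean_all and os.path.isdir(libdeps_dir):
        _clean_dir(libdeps_dir)
        shutil.rmtree(libdeps_dir)
    print("Done cleaning")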

View File

@ -0,0 +1,63 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from __future__ import absolute_import
import os
from platformio.builder.tools import platformio as piotool
from platformio.test.result import TestSuite
from platformio.test.runners.factory import TestRunnerFactory
def ConfigureTestTarget(env):
env.Append(
CPPDEFINES=["UNIT_TEST", "PIO_UNIT_TESTING"],
PIOTEST_SRC_FILTER=[f"+<*.{ext}>" for ext in piotool.SRC_BUILD_EXT],
)
env.Prepend(CPPPATH=["$PROJECT_TEST_DIR"])
if "PIOTEST_RUNNING_NAME" in env:
test_name = env["PIOTEST_RUNNING_NAME"]
while True:
test_name = os.path.dirname(test_name) # parent dir
# skip nested tests (user's side issue?)
if not test_name or os.path.basename(test_name).startswith("test_"):
break
env.Prepend(
PIOTEST_SRC_FILTER=[
f"+<{test_name}{os.path.sep}*.{ext}>"
for ext in piotool.SRC_BUILD_EXT
],
CPPPATH=[os.path.join("$PROJECT_TEST_DIR", test_name)],
)
env.Prepend(
PIOTEST_SRC_FILTER=[f"+<$PIOTEST_RUNNING_NAME{os.path.sep}>"],
CPPPATH=[os.path.join("$PROJECT_TEST_DIR", "$PIOTEST_RUNNING_NAME")],
)
test_runner = TestRunnerFactory.new(
TestSuite(env["PIOENV"], env.get("PIOTEST_RUNNING_NAME", "*")),
env.GetProjectConfig(),
)
test_runner.configure_build_env(env)
def generate(env):
env.AddMethod(ConfigureTestTarget)
def exists(_):
return True
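For a nested test such as embedded/test_math, ConfigureTestTarget() above widens PIOTEST_SRC_FILTER to cover the test root, every parent directory below the first test_* folder, and the running test's own directory. A loose sketch of that filter construction without SCons (extension list and test name are examples only):

import os

def build_test_src_filter(running_name, exts=("c", "cc", "cpp")):
    # Base filter: every source file placed directly in the test root
    src_filter = [f"+<*.{ext}>" for ext in exts]
    parent = os.path.dirname(running_name)
    while parent and not os.path.basename(parent).startswith("test_"):
        # Pull in shared sources from intermediate directories of a nested test
        src_filter += [f"+<{parent}{os.path.sep}*.{ext}>" for ext in exts]
        parent = os.path.dirname(parent)
    # Finally include everything under the running test itself
    src_filter.append(f"+<{running_name}{os.path.sep}>")
    return src_filter

print(build_test_src_filter(os.path.join("embedded", "test_math")))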

View File

@ -12,25 +12,24 @@
# See the License for the specific language governing permissions and
# limitations under the License.
# pylint: disable=unused-argument
from __future__ import absolute_import
import os
import re
import sys
from fnmatch import fnmatch
from os import environ
from os.path import isfile, join
from shutil import copyfile
from time import sleep
from SCons.Script import ARGUMENTS # pylint: disable=import-error
from serial import Serial, SerialException
from platformio import exception, fs, util
from platformio.compat import WINDOWS
from platformio import exception, fs
from platformio.device.finder import find_mbed_disk, find_serial_port, is_pattern_port
from platformio.device.list import list_serial_ports
from platformio.proc import exec_command
# pylint: disable=unused-argument
def FlushSerialBuffer(env, port):
s = Serial(env.subst(port))
@ -62,7 +61,7 @@ def WaitForNewSerialPort(env, before):
elapsed = 0
before = [p["port"] for p in before]
while elapsed < 5 and new_port is None:
now = [p["port"] for p in util.get_serial_ports()]
now = [p["port"] for p in list_serial_ports()]
for p in now:
if p not in before:
new_port = p
@ -97,67 +96,28 @@ def WaitForNewSerialPort(env, before):
def AutodetectUploadPort(*args, **kwargs):
env = args[0]
def _get_pattern():
if "UPLOAD_PORT" not in env:
return None
if set(["*", "?", "[", "]"]) & set(env["UPLOAD_PORT"]):
return env["UPLOAD_PORT"]
return None
def _is_match_pattern(port):
pattern = _get_pattern()
if not pattern:
return True
return fnmatch(port, pattern)
def _look_for_mbed_disk():
msdlabels = ("mbed", "nucleo", "frdm", "microbit")
for item in util.get_logical_devices():
if item["path"].startswith("/net") or not _is_match_pattern(item["path"]):
continue
mbed_pages = [join(item["path"], n) for n in ("mbed.htm", "mbed.html")]
if any(isfile(p) for p in mbed_pages):
return item["path"]
if item["name"] and any(l in item["name"].lower() for l in msdlabels):
return item["path"]
return None
def _look_for_serial_port():
port = None
board_hwids = []
upload_protocol = env.subst("$UPLOAD_PROTOCOL")
if "BOARD" in env and "build.hwids" in env.BoardConfig():
board_hwids = env.BoardConfig().get("build.hwids")
for item in util.get_serial_ports(filter_hwid=True):
if not _is_match_pattern(item["port"]):
continue
port = item["port"]
if upload_protocol.startswith("blackmagic"):
if WINDOWS and port.startswith("COM") and len(port) > 4:
port = "\\\\.\\%s" % port
if "GDB" in item["description"]:
return port
for hwid in board_hwids:
hwid_str = ("%s:%s" % (hwid[0], hwid[1])).replace("0x", "")
if hwid_str in item["hwid"]:
return port
return port
if "UPLOAD_PORT" in env and not _get_pattern():
print(env.subst("Use manually specified: $UPLOAD_PORT"))
initial_port = env.subst("$UPLOAD_PORT")
upload_protocol = env.subst("$UPLOAD_PROTOCOL")
if initial_port and not is_pattern_port(initial_port):
print(env.subst("Using manually specified: $UPLOAD_PORT"))
return
if env.subst("$UPLOAD_PROTOCOL") == "mbed" or (
"mbed" in env.subst("$PIOFRAMEWORK") and not env.subst("$UPLOAD_PROTOCOL")
if upload_protocol == "mbed" or (
"mbed" in env.subst("$PIOFRAMEWORK") and not upload_protocol
):
env.Replace(UPLOAD_PORT=_look_for_mbed_disk())
env.Replace(UPLOAD_PORT=find_mbed_disk(initial_port))
else:
try:
fs.ensure_udev_rules()
except exception.InvalidUdevRules as e:
sys.stderr.write("\n%s\n\n" % e)
env.Replace(UPLOAD_PORT=_look_for_serial_port())
env.Replace(
UPLOAD_PORT=find_serial_port(
initial_port=initial_port,
board_config=env.BoardConfig() if "BOARD" in env else None,
upload_protocol=upload_protocol,
)
)
if env.subst("$UPLOAD_PORT"):
print(env.subst("Auto-detected: $UPLOAD_PORT"))
@ -175,10 +135,12 @@ def UploadToDisk(_, target, source, env):
assert "UPLOAD_PORT" in env
progname = env.subst("$PROGNAME")
for ext in ("bin", "hex"):
fpath = join(env.subst("$BUILD_DIR"), "%s.%s" % (progname, ext))
if not isfile(fpath):
fpath = os.path.join(env.subst("$BUILD_DIR"), "%s.%s" % (progname, ext))
if not os.path.isfile(fpath):
continue
copyfile(fpath, join(env.subst("$UPLOAD_PORT"), "%s.%s" % (progname, ext)))
copyfile(
fpath, os.path.join(env.subst("$UPLOAD_PORT"), "%s.%s" % (progname, ext))
)
print(
"Firmware has been successfully uploaded.\n"
"(Some boards may require manual hard reset)"
@ -211,7 +173,7 @@ def CheckUploadSize(_, target, source, env):
if not isinstance(cmd, list):
cmd = cmd.split()
cmd = [arg.replace("$SOURCES", str(source[0])) for arg in cmd if arg]
sysenv = environ.copy()
sysenv = os.environ.copy()
sysenv["PATH"] = str(env["ENV"]["PATH"])
result = exec_command(env.subst(cmd), env=sysenv)
if result["returncode"] != 0:
@ -236,9 +198,9 @@ def CheckUploadSize(_, target, source, env):
def _format_availale_bytes(value, total):
percent_raw = float(value) / float(total)
blocks_per_progress = 10
used_blocks = int(round(blocks_per_progress * percent_raw))
if used_blocks > blocks_per_progress:
used_blocks = blocks_per_progress
used_blocks = min(
int(round(blocks_per_progress * percent_raw)), blocks_per_progress
)
return "[{:{}}] {: 6.1%} (used {:d} bytes from {:d} bytes)".format(
"=" * used_blocks, blocks_per_progress, percent_raw, value, total
)
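Upload-port autodetection now delegates to the platformio.device.finder helpers imported above. Roughly, the decision order is: honor a concrete UPLOAD_PORT, fall back to an mbed mass-storage disk for mbed uploads, otherwise ask the serial-port finder. A sketch of that order (not the actual SCons method; assumes platformio-core 6.x is importable, and `frameworks` stands for the substituted $PIOFRAMEWORK string):

def autodetect_upload_port(initial_port, upload_protocol, frameworks, board_config=None):
    from platformio.device.finder import find_mbed_disk, find_serial_port, is_pattern_port

    if initial_port and not is_pattern_port(initial_port):
        return initial_port  # manually specified port wins
    if upload_protocol == "mbed" or ("mbed" in frameworks and not upload_protocol):
        return find_mbed_disk(initial_port)  # mbed boards expose a USB disk
    return find_serial_port(
        initial_port=initial_port,
        board_config=board_config,
        upload_protocol=upload_protocol,
    )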

View File

@ -27,7 +27,7 @@ from SCons.Script import Export # pylint: disable=import-error
from SCons.Script import SConscript # pylint: disable=import-error
from platformio import __version__, fs
from platformio.compat import MACOS, string_types
from platformio.compat import IS_MACOS, string_types
from platformio.package.version import pepver_to_semver
SRC_HEADER_EXT = ["h", "hpp"]
@ -47,14 +47,16 @@ def scons_patched_match_splitext(path, suffixes=None):
def GetBuildType(env):
return (
"debug"
if (
set(["debug", "sizedata"]) & set(COMMAND_LINE_TARGETS)
or env.GetProjectOption("build_type") == "debug"
)
else "release"
)
modes = []
if (
set(["__debug", "sizedata"]) # sizedata = for memory inspection
& set(COMMAND_LINE_TARGETS)
or env.GetProjectOption("build_type") == "debug"
):
modes.append("debug")
if "__test" in COMMAND_LINE_TARGETS or env.GetProjectOption("build_type") == "test":
modes.append("test")
return "+".join(modes or ["release"])
def BuildProgram(env):
@ -69,13 +71,14 @@ def BuildProgram(env):
if (
env.get("LIBS")
and env.GetCompilerType() == "gcc"
and (env.PioPlatform().is_embedded() or not MACOS)
and (env.PioPlatform().is_embedded() or not IS_MACOS)
):
env.Prepend(_LIBFLAGS="-Wl,--start-group ")
env.Append(_LIBFLAGS=" -Wl,--end-group")
program = env.Program(
os.path.join("$BUILD_DIR", env.subst("$PROGNAME")), env["PIOBUILDFILES"]
os.path.join("$BUILD_DIR", env.subst("$PROGNAME$PROGSUFFIX")),
env["PIOBUILDFILES"],
)
env.Replace(PIOMAINPROG=program)
@ -112,10 +115,6 @@ def ProcessProgramDeps(env):
env.PrintConfiguration()
# fix ASM handling under non case-sensitive OS
if not Util.case_sensitive_suffixes(".s", ".S"):
env.Replace(AS="$CC", ASCOM="$ASPPCOM")
# process extra flags from board
if "BOARD" in env and "build.extra_flags" in env.BoardConfig():
env.ProcessFlags(env.BoardConfig().get("build.extra_flags"))
@ -126,14 +125,20 @@ def ProcessProgramDeps(env):
# process framework scripts
env.BuildFrameworks(env.get("PIOFRAMEWORK"))
if env.GetBuildType() == "debug":
env.ConfigureDebugFlags()
if "debug" in env.GetBuildType():
env.ConfigureDebugTarget()
if "test" in env.GetBuildType():
env.ConfigureTestTarget()
# remove specified flags
env.ProcessUnFlags(env.get("BUILD_UNFLAGS"))
if "__test" in COMMAND_LINE_TARGETS:
env.ConfigureTestTarget()
if "compiledb" in COMMAND_LINE_TARGETS and env.get(
"COMPILATIONDB_INCLUDE_TOOLCHAIN"
):
for scope, includes in env.DumpIntegrationIncludes().items():
if scope in ("toolchain",):
env.Append(CPPPATH=includes)
def ProcessProjectDeps(env):
@ -157,12 +162,11 @@ def ProcessProjectDeps(env):
# extra build flags from `platformio.ini`
projenv.ProcessFlags(env.get("SRC_BUILD_FLAGS"))
is_test = "__test" in COMMAND_LINE_TARGETS
if is_test:
if "test" in env.GetBuildType():
projenv.BuildSources(
"$BUILD_TEST_DIR", "$PROJECT_TEST_DIR", "$PIOTEST_SRC_FILTER"
)
if not is_test or env.GetProjectOption("test_build_project_src"):
if "test" not in env.GetBuildType() or env.GetProjectOption("test_build_src"):
projenv.BuildSources(
"$BUILD_SRC_DIR", "$PROJECT_SRC_DIR", env.get("SRC_FILTER")
)
@ -206,12 +210,12 @@ def ParseFlagsExtended(env, flags): # pylint: disable=too-many-branches
for k in ("CPPPATH", "LIBPATH"):
for i, p in enumerate(result.get(k, [])):
if os.path.isdir(p):
result[k][i] = os.path.realpath(p)
result[k][i] = os.path.abspath(p)
# fix relative path for "-include"
for i, f in enumerate(result.get("CCFLAGS", [])):
if isinstance(f, tuple) and f[0] == "-include":
result["CCFLAGS"][i] = (f[0], env.File(os.path.realpath(f[1].get_path())))
result["CCFLAGS"][i] = (f[0], env.File(os.path.abspath(f[1].get_path())))
return result
@ -345,11 +349,10 @@ def BuildFrameworks(env, frameworks):
env.Exit(1)
def BuildLibrary(env, variant_dir, src_dir, src_filter=None):
def BuildLibrary(env, variant_dir, src_dir, src_filter=None, nodes=None):
env.ProcessUnFlags(env.get("BUILD_UNFLAGS"))
return env.StaticLibrary(
env.subst(variant_dir), env.CollectBuildFiles(variant_dir, src_dir, src_filter)
)
nodes = nodes or env.CollectBuildFiles(variant_dir, src_dir, src_filter)
return env.StaticLibrary(env.subst(variant_dir), nodes)
def BuildSources(env, variant_dir, src_dir, src_filter=None):
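GetBuildType() above no longer returns a binary debug/release answer; it composes modes, so memory inspection plus unit testing yields "debug+test". A standalone approximation with the SCons globals passed in explicitly:

def get_build_type(command_line_targets, build_type_option="release"):
    modes = []
    # the "__debug"/"sizedata" targets or build_type = debug enable debug mode
    if {"__debug", "sizedata"} & set(command_line_targets) or build_type_option == "debug":
        modes.append("debug")
    # the hidden "__test" target or build_type = test enables test mode
    if "__test" in command_line_targets or build_type_option == "test":
        modes.append("test")
    return "+".join(modes or ["release"])

assert get_build_type(["__debug", "__test"]) == "debug+test"
assert get_build_type([]) == "release"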

View File

@ -78,9 +78,9 @@ class ContentCache(object):
if not os.path.isdir(os.path.dirname(cache_path)):
os.makedirs(os.path.dirname(cache_path))
try:
with codecs.open(cache_path, "wb", encoding="utf8") as fp:
with codecs.open(cache_path, mode="wb", encoding="utf8") as fp:
fp.write(data)
with open(self._db_path, "a") as fp:
with open(self._db_path, mode="a", encoding="utf8") as fp:
fp.write("%s=%s\n" % (str(expire_time), os.path.basename(cache_path)))
except UnicodeError:
if os.path.isfile(cache_path):
@ -92,7 +92,7 @@ class ContentCache(object):
return self._unlock_dbindex()
def delete(self, keys=None):
""" Keys=None, delete expired items """
"""Keys=None, delete expired items"""
if not os.path.isfile(self._db_path):
return None
if not keys:
@ -102,7 +102,7 @@ class ContentCache(object):
paths_for_delete = [self.get_cache_path(k) for k in keys]
found = False
newlines = []
with open(self._db_path) as fp:
with open(self._db_path, encoding="utf8") as fp:
for line in fp.readlines():
line = line.strip()
if "=" not in line:
@ -129,7 +129,7 @@ class ContentCache(object):
pass
if found and self._lock_dbindex():
with open(self._db_path, "w") as fp:
with open(self._db_path, mode="w", encoding="utf8") as fp:
fp.write("\n".join(newlines) + "\n")
self._unlock_dbindex()
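The cache index touched above is a plain text file where each line stores "<expire_unix_ts>=<cache file basename>". A hedged sketch of appending and pruning that index with the same explicit UTF-8 handles (function names are local, not ContentCache methods):

import os
import time

def append_cache_entry(db_path, cache_path, ttl_seconds):
    # One "<expire>=<basename>" record per line, mirroring the writes above
    expire_time = int(time.time() + ttl_seconds)
    with open(db_path, mode="a", encoding="utf8") as fp:
        fp.write("%s=%s\n" % (expire_time, os.path.basename(cache_path)))

def prune_expired(db_path):
    if not os.path.isfile(db_path):
        return
    newlines = []
    with open(db_path, encoding="utf8") as fp:
        for line in fp.readlines():
            line = line.strip()
            if "=" not in line:
                continue
            expire, _ = line.split("=", 1)
            if int(expire) > time.time():
                newlines.append(line)
    with open(db_path, mode="w", encoding="utf8") as fp:
        fp.write("\n".join(newlines) + "\n")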

View File

@ -16,7 +16,7 @@ import os
import time
from platformio import __accounts_api__, app
from platformio.clients.http import HTTPClient
from platformio.clients.http import HTTPClient, HTTPClientError
from platformio.exception import PlatformioException
@ -40,7 +40,7 @@ class AccountClient(HTTPClient): # pylint:disable=too-many-public-methods
SUMMARY_CACHE_TTL = 60 * 60 * 24 * 7
def __init__(self):
super(AccountClient, self).__init__(__accounts_api__)
super().__init__(__accounts_api__)
@staticmethod
def get_refresh_token():
@ -61,13 +61,33 @@ class AccountClient(HTTPClient): # pylint:disable=too-many-public-methods
del account[key]
app.set_state_item("account", account)
def send_auth_request(self, *args, **kwargs):
headers = kwargs.get("headers", {})
if "Authorization" not in headers:
token = self.fetch_authentication_token()
headers["Authorization"] = "Bearer %s" % token
kwargs["headers"] = headers
return self.fetch_json_data(*args, **kwargs)
def fetch_json_data(self, *args, **kwargs):
try:
return super().fetch_json_data(*args, **kwargs)
except HTTPClientError as exc:
raise AccountError(exc) from exc
def fetch_authentication_token(self):
if os.environ.get("PLATFORMIO_AUTH_TOKEN"):
return os.environ.get("PLATFORMIO_AUTH_TOKEN")
auth = app.get_state_item("account", {}).get("auth", {})
if auth.get("access_token") and auth.get("access_token_expire"):
if auth.get("access_token_expire") > time.time():
return auth.get("access_token")
if auth.get("refresh_token"):
try:
data = self.fetch_json_data(
"post",
"/v1/login",
headers={
"Authorization": "Bearer %s" % auth.get("refresh_token")
},
)
app.set_state_item("account", data)
return data.get("auth").get("access_token")
except AccountError:
self.delete_local_session()
raise AccountNotAuthorized()
def login(self, username, password):
try:
@ -119,10 +139,11 @@ class AccountClient(HTTPClient): # pylint:disable=too-many-public-methods
return True
def change_password(self, old_password, new_password):
return self.send_auth_request(
return self.fetch_json_data(
"post",
"/v1/password",
data={"old_password": old_password, "new_password": new_password},
x_with_authorization=True,
)
def registration(
@ -150,10 +171,11 @@ class AccountClient(HTTPClient): # pylint:disable=too-many-public-methods
)
def auth_token(self, password, regenerate):
return self.send_auth_request(
return self.fetch_json_data(
"post",
"/v1/token",
data={"password": password, "regenerate": 1 if regenerate else 0},
x_with_authorization=True,
).get("auth_token")
def forgot_password(self, username):
@ -164,18 +186,20 @@ class AccountClient(HTTPClient): # pylint:disable=too-many-public-methods
)
def get_profile(self):
return self.send_auth_request(
return self.fetch_json_data(
"get",
"/v1/profile",
x_with_authorization=True,
)
def update_profile(self, profile, current_password):
profile["current_password"] = current_password
self.delete_local_state("summary")
response = self.send_auth_request(
response = self.fetch_json_data(
"put",
"/v1/profile",
data=profile,
x_with_authorization=True,
)
return response
@ -193,9 +217,10 @@ class AccountClient(HTTPClient): # pylint:disable=too-many-public-methods
"username": account.get("username"),
}
}
result = self.send_auth_request(
result = self.fetch_json_data(
"get",
"/v1/summary",
x_with_authorization=True,
)
account["summary"] = dict(
profile=result.get("profile"),
@ -207,120 +232,125 @@ class AccountClient(HTTPClient): # pylint:disable=too-many-public-methods
app.set_state_item("account", account)
return result
def get_logged_username(self):
return self.get_account_info(offline=True).get("profile").get("username")
def destroy_account(self):
return self.send_auth_request("delete", "/v1/account")
return self.fetch_json_data(
"delete",
"/v1/account",
x_with_authorization=True,
)
def create_org(self, orgname, email, displayname):
return self.send_auth_request(
return self.fetch_json_data(
"post",
"/v1/orgs",
data={"orgname": orgname, "email": email, "displayname": displayname},
x_with_authorization=True,
)
def get_org(self, orgname):
return self.send_auth_request("get", "/v1/orgs/%s" % orgname)
return self.fetch_json_data(
"get",
"/v1/orgs/%s" % orgname,
x_with_authorization=True,
)
def list_orgs(self):
return self.send_auth_request(
return self.fetch_json_data(
"get",
"/v1/orgs",
x_with_authorization=True,
)
def update_org(self, orgname, data):
return self.send_auth_request(
"put", "/v1/orgs/%s" % orgname, data={k: v for k, v in data.items() if v}
return self.fetch_json_data(
"put",
"/v1/orgs/%s" % orgname,
data={k: v for k, v in data.items() if v},
x_with_authorization=True,
)
def destroy_org(self, orgname):
return self.send_auth_request(
return self.fetch_json_data(
"delete",
"/v1/orgs/%s" % orgname,
x_with_authorization=True,
)
def add_org_owner(self, orgname, username):
return self.send_auth_request(
return self.fetch_json_data(
"post",
"/v1/orgs/%s/owners" % orgname,
data={"username": username},
x_with_authorization=True,
)
def list_org_owners(self, orgname):
return self.send_auth_request(
return self.fetch_json_data(
"get",
"/v1/orgs/%s/owners" % orgname,
x_with_authorization=True,
)
def remove_org_owner(self, orgname, username):
return self.send_auth_request(
return self.fetch_json_data(
"delete",
"/v1/orgs/%s/owners" % orgname,
data={"username": username},
x_with_authorization=True,
)
def create_team(self, orgname, teamname, description):
return self.send_auth_request(
return self.fetch_json_data(
"post",
"/v1/orgs/%s/teams" % orgname,
data={"name": teamname, "description": description},
x_with_authorization=True,
)
def destroy_team(self, orgname, teamname):
return self.send_auth_request(
return self.fetch_json_data(
"delete",
"/v1/orgs/%s/teams/%s" % (orgname, teamname),
x_with_authorization=True,
)
def get_team(self, orgname, teamname):
return self.send_auth_request(
return self.fetch_json_data(
"get",
"/v1/orgs/%s/teams/%s" % (orgname, teamname),
x_with_authorization=True,
)
def list_teams(self, orgname):
return self.send_auth_request(
return self.fetch_json_data(
"get",
"/v1/orgs/%s/teams" % orgname,
x_with_authorization=True,
)
def update_team(self, orgname, teamname, data):
return self.send_auth_request(
return self.fetch_json_data(
"put",
"/v1/orgs/%s/teams/%s" % (orgname, teamname),
data={k: v for k, v in data.items() if v},
x_with_authorization=True,
)
def add_team_member(self, orgname, teamname, username):
return self.send_auth_request(
return self.fetch_json_data(
"post",
"/v1/orgs/%s/teams/%s/members" % (orgname, teamname),
data={"username": username},
x_with_authorization=True,
)
def remove_team_member(self, orgname, teamname, username):
return self.send_auth_request(
return self.fetch_json_data(
"delete",
"/v1/orgs/%s/teams/%s/members" % (orgname, teamname),
data={"username": username},
x_with_authorization=True,
)
def fetch_authentication_token(self):
if os.environ.get("PLATFORMIO_AUTH_TOKEN"):
return os.environ.get("PLATFORMIO_AUTH_TOKEN")
auth = app.get_state_item("account", {}).get("auth", {})
if auth.get("access_token") and auth.get("access_token_expire"):
if auth.get("access_token_expire") > time.time():
return auth.get("access_token")
if auth.get("refresh_token"):
try:
data = self.fetch_json_data(
"post",
"/v1/login",
headers={
"Authorization": "Bearer %s" % auth.get("refresh_token")
},
)
app.set_state_item("account", data)
return data.get("auth").get("access_token")
except AccountError:
self.delete_local_session()
raise AccountNotAuthorized()
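fetch_authentication_token() above prefers the PLATFORMIO_AUTH_TOKEN environment variable, then a still-valid cached access token, and only then trades the refresh token for a new session. The same priority order as a plain function; the state dict and the login callable are stand-ins for app.get_state_item("account") and the /v1/login request:

import os
import time

def resolve_auth_token(state, login_with_refresh_token):
    # Priority 1: explicit environment override
    if os.environ.get("PLATFORMIO_AUTH_TOKEN"):
        return os.environ["PLATFORMIO_AUTH_TOKEN"]
    auth = state.get("auth", {})
    # Priority 2: cached access token that has not expired yet
    if auth.get("access_token") and auth.get("access_token_expire", 0) > time.time():
        return auth["access_token"]
    # Priority 3: exchange the refresh token for a fresh session
    if auth.get("refresh_token"):
        data = login_with_refresh_token(auth["refresh_token"])
        return data["auth"]["access_token"]
    raise RuntimeError("Not authorized")  # AccountNotAuthorized in the real client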

View File

@ -16,23 +16,19 @@ import json
import math
import os
import socket
from urllib.parse import urljoin
import requests.adapters
from requests.packages.urllib3.util.retry import Retry # pylint:disable=import-error
from platformio import __check_internet_hosts__, __default_requests_timeout__, app, util
from platformio.cache import ContentCache
from platformio.cache import ContentCache, cleanup_content_cache
from platformio.exception import PlatformioException, UserSideException
try:
from urllib.parse import urljoin
except ImportError:
from urlparse import urljoin
class HTTPClientError(PlatformioException):
def __init__(self, message, response=None):
super(HTTPClientError, self).__init__()
super().__init__()
self.message = message
self.response = response
@ -51,16 +47,14 @@ class InternetIsOffline(UserSideException):
class EndpointSession(requests.Session):
def __init__(self, base_url, *args, **kwargs):
super(EndpointSession, self).__init__(*args, **kwargs)
super().__init__(*args, **kwargs)
self.base_url = base_url
def request( # pylint: disable=signature-differs,arguments-differ
self, method, url, *args, **kwargs
):
# print(self.base_url, method, url, args, kwargs)
return super(EndpointSession, self).request(
method, urljoin(self.base_url, url), *args, **kwargs
)
return super().request(method, urljoin(self.base_url, url), *args, **kwargs)
class EndpointSessionIterator(object):
@ -79,10 +73,6 @@ class EndpointSessionIterator(object):
def __iter__(self): # pylint: disable=non-iterator-returned
return self
def next(self):
""" For Python 2 compatibility """
return self.__next__()
def __next__(self):
base_url = next(self.endpoints_iter)
session = EndpointSession(base_url)
@ -101,7 +91,10 @@ class HTTPClient(object):
def __del__(self):
if not self._session:
return
self._session.close()
try:
self._session.close()
except: # pylint: disable=bare-except
pass
self._session = None
def _next_session(self):
@ -114,6 +107,21 @@ class HTTPClient(object):
# check Internet before and resolve issue with 60 seconds timeout
ensure_internet_on(raise_exception=True)
headers = kwargs.get("headers", {})
with_authorization = (
kwargs.pop("x_with_authorization")
if "x_with_authorization" in kwargs
else False
)
if with_authorization and "Authorization" not in headers:
# pylint: disable=import-outside-toplevel
from platformio.clients.account import AccountClient
headers["Authorization"] = (
"Bearer %s" % AccountClient().fetch_authentication_token()
)
kwargs["headers"] = headers
# set default timeout
if "timeout" not in kwargs:
kwargs["timeout"] = __default_requests_timeout__
@ -131,7 +139,9 @@ class HTTPClient(object):
raise HTTPClientError(str(e))
def fetch_json_data(self, method, path, **kwargs):
cache_valid = kwargs.pop("cache_valid") if "cache_valid" in kwargs else None
if method not in ("get", "head", "options"):
cleanup_content_cache("http")
cache_valid = kwargs.pop("x_cache_valid") if "x_cache_valid" in kwargs else None
if not cache_valid:
return self._parse_json_response(self.send_request(method, path, **kwargs))
cache_key = ContentCache.key_from_args(
@ -176,8 +186,9 @@ def _internet_on():
continue
requests.get("http://%s" % host, allow_redirects=False, timeout=timeout)
return True
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.connect((host, 80))
# try to resolve `host` for both AF_INET and AF_INET6, and then try to connect
# to all possible addresses (IPv4 and IPv6) in turn until a connection succeeds:
s = socket.create_connection((host, 80))
s.close()
return True
except: # pylint: disable=bare-except
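Callers opt into the behaviours added above through two pseudo-kwargs: x_with_authorization=True makes send_request() attach a Bearer token obtained from AccountClient, and x_cache_valid (e.g. "1h") lets fetch_json_data() serve GET responses from the content cache, while any non-GET request first invalidates the "http" cache. A hedged usage sketch against the new API, assuming a logged-in PlatformIO account:

from platformio.clients.account import AccountClient

client = AccountClient()
# The Bearer token is injected by send_request() because of x_with_authorization;
# adding x_cache_valid="1h" would additionally serve the GET from the content cache.
profile = client.fetch_json_data(
    "get",
    "/v1/profile",
    x_with_authorization=True,
)
print(profile)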

View File

@ -12,38 +12,43 @@
# See the License for the specific language governing permissions and
# limitations under the License.
from platformio import __registry_api__, fs
from platformio.clients.account import AccountClient
from platformio import __registry_mirror_hosts__, fs
from platformio.clients.account import AccountClient, AccountError
from platformio.clients.http import HTTPClient, HTTPClientError
from platformio.package.meta import PackageType
# pylint: disable=too-many-arguments
class RegistryClient(HTTPClient):
def __init__(self):
super(RegistryClient, self).__init__(__registry_api__)
endpoints = [f"https://api.{host}" for host in __registry_mirror_hosts__]
super().__init__(endpoints)
def send_auth_request(self, *args, **kwargs):
headers = kwargs.get("headers", {})
if "Authorization" not in headers:
token = AccountClient().fetch_authentication_token()
headers["Authorization"] = "Bearer %s" % token
kwargs["headers"] = headers
return self.fetch_json_data(*args, **kwargs)
@staticmethod
def allowed_private_packages():
private_permissions = set(
[
"service.registry.publish-private-tool",
"service.registry.publish-private-platform",
"service.registry.publish-private-library",
]
)
try:
info = AccountClient().get_account_info() or {}
for item in info.get("packages", []):
if set(item.keys()) & private_permissions:
return True
except AccountError:
pass
return False
def publish_package(
self, archive_path, owner=None, released_at=None, private=False, notify=True
def publish_package( # pylint: disable=redefined-builtin
self, owner, type, archive_path, released_at=None, private=False, notify=True
):
account = AccountClient()
if not owner:
owner = (
account.get_account_info(offline=True).get("profile").get("username")
)
with open(archive_path, "rb") as fp:
return self.send_auth_request(
return self.fetch_json_data(
"post",
"/v3/packages/%s/%s" % (owner, PackageType.from_archive(archive_path)),
"/v3/packages/%s/%s" % (owner, type),
params={
"private": 1 if private else 0,
"notify": 1 if notify else 0,
@ -56,56 +61,56 @@ class RegistryClient(HTTPClient):
),
},
data=fp,
x_with_authorization=True,
)
def unpublish_package( # pylint: disable=redefined-builtin
self, type, name, owner=None, version=None, undo=False
self, owner, type, name, version=None, undo=False
):
account = AccountClient()
if not owner:
owner = (
account.get_account_info(offline=True).get("profile").get("username")
)
path = "/v3/packages/%s/%s/%s" % (owner, type, name)
if version:
path += "/" + version
return self.send_auth_request(
"delete",
path,
params={"undo": 1 if undo else 0},
return self.fetch_json_data(
"delete", path, params={"undo": 1 if undo else 0}, x_with_authorization=True
)
def update_resource(self, urn, private):
return self.send_auth_request(
return self.fetch_json_data(
"put",
"/v3/resources/%s" % urn,
data={"private": int(private)},
x_with_authorization=True,
)
def grant_access_for_resource(self, urn, client, level):
return self.send_auth_request(
return self.fetch_json_data(
"put",
"/v3/resources/%s/access" % urn,
data={"client": client, "level": level},
x_with_authorization=True,
)
def revoke_access_from_resource(self, urn, client):
return self.send_auth_request(
return self.fetch_json_data(
"delete",
"/v3/resources/%s/access" % urn,
data={"client": client},
x_with_authorization=True,
)
def list_resources(self, owner):
return self.send_auth_request(
"get", "/v3/resources", params={"owner": owner} if owner else None
return self.fetch_json_data(
"get",
"/v3/resources",
params={"owner": owner} if owner else None,
x_cache_valid="1h",
x_with_authorization=True,
)
def list_packages(self, query=None, filters=None, page=None):
assert query or filters
def list_packages(self, query=None, qualifiers=None, page=None, sort=None):
search_query = []
if filters:
valid_filters = (
if qualifiers:
valid_qualifiers = (
"authors",
"keywords",
"frameworks",
@ -116,8 +121,8 @@ class RegistryClient(HTTPClient):
"owners",
"types",
)
assert set(filters.keys()) <= set(valid_filters)
for name, values in filters.items():
assert set(qualifiers.keys()) <= set(valid_qualifiers)
for name, values in qualifiers.items():
for value in set(
values if isinstance(values, (list, tuple)) else [values]
):
@ -127,8 +132,14 @@ class RegistryClient(HTTPClient):
params = dict(query=" ".join(search_query))
if page:
params["page"] = int(page)
if sort:
params["sort"] = sort
return self.fetch_json_data(
"get", "/v3/packages", params=params, cache_valid="1h"
"get",
"/v3/search",
params=params,
x_cache_valid="1h",
x_with_authorization=self.allowed_private_packages(),
)
def get_package(self, type_, owner, name, version=None):
@ -139,7 +150,8 @@ class RegistryClient(HTTPClient):
type=type_, owner=owner.lower(), name=name.lower()
),
params=dict(version=version) if version else None,
cache_valid="1h",
x_cache_valid="1h",
x_with_authorization=self.allowed_private_packages(),
)
except HTTPClientError as e:
if e.response is not None and e.response.status_code == 404:
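list_packages() above now takes qualifiers instead of filters and hits the new /v3/search endpoint on the registry mirror hosts. A hedged call sketch (qualifier values are examples, and the sort key is an assumption rather than a documented value):

from platformio.clients.registry import RegistryClient

client = RegistryClient()
result = client.list_packages(
    query="json",  # free-text part of the search query
    qualifiers={"types": "library", "frameworks": ["arduino", "mbed"]},
    page=1,
    sort="popularity",  # assumed sort key, not verified against the registry API
)
print(result)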

View File

@ -22,7 +22,7 @@ class PlatformioCLI(click.MultiCommand):
leftover_args = []
def __init__(self, *args, **kwargs):
super(PlatformioCLI, self).__init__(*args, **kwargs)
super().__init__(*args, **kwargs)
self._pio_cmds_dir = os.path.dirname(__file__)
@staticmethod
@ -41,7 +41,7 @@ class PlatformioCLI(click.MultiCommand):
PlatformioCLI.leftover_args = ctx.args
if hasattr(ctx, "protected_args"):
PlatformioCLI.leftover_args = ctx.protected_args + ctx.args
return super(PlatformioCLI, self).invoke(ctx)
return super().invoke(ctx)
def list_commands(self, ctx):
cmds = []
@ -74,7 +74,13 @@ class PlatformioCLI(click.MultiCommand):
def _handle_obsolate_command(name):
# pylint: disable=import-outside-toplevel
if name == "init":
from platformio.commands.project import project_init
from platformio.project.commands.init import project_init_cmd
return project_init_cmd
if name == "package":
from platformio.commands.pkg import cli
return cli
return project_init
raise AttributeError()

View File

@ -134,6 +134,14 @@ def access_list(owner, urn_type, json_output):
table_data = []
table_data.append(("URN:", resource.get("urn")))
table_data.append(("Owner:", resource.get("owner")))
table_data.append(
(
"Access:",
click.style("Private", fg="red")
if resource.get("private", False)
else "Public",
)
)
table_data.append(
(
"Access level(s):",

View File

@ -14,13 +14,13 @@
# pylint: disable=unused-argument
import datetime
import json
import re
import click
from tabulate import tabulate
from platformio import util
from platformio.clients.account import AccountClient, AccountNotAuthorized
@ -184,7 +184,7 @@ def account_destroy():
click.confirm(
"Are you sure you want to delete the %s user account?\n"
"Warning! All linked data will be permanently removed and can not be restored."
% client.get_account_info().get("profile").get("username"),
% client.get_logged_username(),
abort=True,
)
client.destroy_account()
@ -244,12 +244,9 @@ def print_packages(packages):
data = []
expire = "-"
if "subscription" in package:
expire = datetime.datetime.strptime(
(
package["subscription"].get("end_at")
or package["subscription"].get("next_bill_at")
),
"%Y-%m-%dT%H:%M:%SZ",
expire = util.parse_datetime(
package["subscription"].get("end_at")
or package["subscription"].get("next_bill_at")
).strftime("%Y-%m-%d")
data.append(("Expire:", expire))
services = []
@ -274,21 +271,17 @@ def print_subscriptions(subscriptions):
click.secho(subscription.get("product_name"), bold=True)
click.echo("-" * len(subscription.get("product_name")))
data = [("State:", subscription.get("status"))]
begin_at = datetime.datetime.strptime(
subscription.get("begin_at"), "%Y-%m-%dT%H:%M:%SZ"
).strftime("%Y-%m-%d %H:%M:%S")
begin_at = util.parse_datetime(subscription.get("begin_at")).strftime("%c")
data.append(("Start date:", begin_at or "-"))
end_at = subscription.get("end_at")
if end_at:
end_at = datetime.datetime.strptime(
subscription.get("end_at"), "%Y-%m-%dT%H:%M:%SZ"
).strftime("%Y-%m-%d %H:%M:%S")
end_at = util.parse_datetime(subscription.get("end_at")).strftime("%c")
data.append(("End date:", end_at or "-"))
next_bill_at = subscription.get("next_bill_at")
if next_bill_at:
next_bill_at = datetime.datetime.strptime(
subscription.get("next_bill_at"), "%Y-%m-%dT%H:%M:%SZ"
).strftime("%Y-%m-%d %H:%M:%S")
next_bill_at = util.parse_datetime(
subscription.get("next_bill_at")
).strftime("%c")
data.append(("Next payment:", next_bill_at or "-"))
data.append(
("Edit:", click.style(subscription.get("update_url"), fg="blue") or "-")
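The strptime boilerplate above is replaced by util.parse_datetime(); the account API timestamps are "%Y-%m-%dT%H:%M:%SZ" strings, so an equivalent plain-stdlib conversion looks like this (the helper name below is local, not PlatformIO's):

import datetime

def iso_to_local_label(value):
    # e.g. "2022-05-17T19:23:31Z" -> locale-formatted date/time string
    return datetime.datetime.strptime(value, "%Y-%m-%dT%H:%M:%SZ").strftime("%c")

print(iso_to_local_label("2022-05-17T19:23:31Z"))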

View File

@ -13,16 +13,16 @@
# limitations under the License.
import json
import shutil
import click
from tabulate import tabulate
from platformio import fs
from platformio.compat import dump_json_to_unicode
from platformio.package.manager.platform import PlatformPackageManager
@click.command("boards", short_help="Embedded board explorer")
@click.command("boards", short_help="Board Explorer")
@click.argument("query", required=False)
@click.option("--installed", is_flag=True)
@click.option("--json-output", is_flag=True)
@ -41,7 +41,7 @@ def cli(query, installed, json_output): # pylint: disable=R0912
grpboards[board["platform"]] = []
grpboards[board["platform"]].append(board)
terminal_width, _ = click.get_terminal_size()
terminal_width, _ = shutil.get_terminal_size()
for (platform, boards) in sorted(grpboards.items()):
click.echo("")
click.echo("Platform: ", nl=False)
@ -83,4 +83,4 @@ def _print_boards_json(query, installed=False):
if query.lower() not in search_data.lower():
continue
result.append(board)
click.echo(dump_json_to_unicode(result))
click.echo(json.dumps(result))

View File

@ -15,7 +15,9 @@
# pylint: disable=too-many-arguments,too-many-locals,too-many-branches
# pylint: disable=redefined-builtin,too-many-statements
import json
import os
import shutil
from collections import Counter
from os.path import dirname, isfile
from time import time
@ -26,12 +28,11 @@ from tabulate import tabulate
from platformio import app, exception, fs, util
from platformio.commands.check.defect import DefectItem
from platformio.commands.check.tools import CheckToolFactory
from platformio.compat import dump_json_to_unicode
from platformio.project.config import ProjectConfig
from platformio.project.helpers import find_project_dir_above, get_project_dir
@click.command("check", short_help="Static code analysis")
@click.command("check", short_help="Static Code Analysis")
@click.option("-e", "--environment", multiple=True)
@click.option(
"-d",
@ -105,8 +106,8 @@ def cli(
)
default_patterns = [
config.get_optional_dir("src"),
config.get_optional_dir("include"),
config.get("platformio", "src_dir"),
config.get("platformio", "include_dir"),
]
tool_options = dict(
verbose=verbose,
@ -117,6 +118,7 @@ def cli(
if silent
else severity or config.get("env:" + envname, "check_severity"),
skip_packages=skip_packages or env_options.get("check_skip_packages"),
platform_packages=env_options.get("platform_packages"),
)
for tool in config.get("env:" + envname, "check_tool"):
@ -163,9 +165,12 @@ def cli(
print_processing_footer(result)
if json_output:
click.echo(dump_json_to_unicode(results_to_json(results)))
click.echo(json.dumps(results_to_json(results)))
elif not silent:
print_check_summary(results)
print_check_summary(results, verbose=verbose)
# Reset custom project config
app.set_session_var("custom_project_conf", None)
command_failed = any(r.get("succeeded") is False for r in results)
if command_failed:
@ -193,7 +198,7 @@ def print_processing_header(tool, envname, envdump):
"Checking %s > %s (%s)"
% (click.style(envname, fg="cyan", bold=True), tool, "; ".join(envdump))
)
terminal_width, _ = click.get_terminal_size()
terminal_width, _ = shutil.get_terminal_size()
click.secho("-" * terminal_width, bold=True)
@ -214,7 +219,7 @@ def print_processing_footer(result):
def collect_component_stats(result):
components = dict()
components = {}
def _append_defect(component, defect):
if not components.get(component):
@ -249,7 +254,7 @@ def print_defects_stats(results):
severity_labels = list(DefectItem.SEVERITY_LABELS.values())
severity_labels.reverse()
tabular_data = list()
tabular_data = []
for k, v in component_stats.items():
tool_defect = [v.get(s, 0) for s in severity_labels]
tabular_data.append([k] + tool_defect)
@ -266,7 +271,7 @@ def print_defects_stats(results):
click.echo()
def print_check_summary(results):
def print_check_summary(results, verbose=False):
click.echo()
tabular_data = []
@ -283,6 +288,8 @@ def print_check_summary(results):
status_str = click.style("FAILED", fg="red")
elif result.get("succeeded") is None:
status_str = "IGNORED"
if not verbose:
continue
else:
succeeded_nums += 1
status_str = click.style("PASSED", fg="green")

View File

@ -34,7 +34,7 @@ class DefectItem(object):
severity,
category,
message,
file="unknown",
file=None,
line=0,
column=0,
id=None,
@ -50,7 +50,7 @@ class DefectItem(object):
self.callstack = callstack
self.cwe = cwe
self.id = id
self.file = file
self.file = file or "unknown"
if file.lower().startswith(get_project_dir().lower()):
self.file = os.path.relpath(file, get_project_dir())
@ -86,7 +86,7 @@ class DefectItem(object):
"severity": self.SEVERITY_LABELS[self.severity],
"category": self.category,
"message": self.message,
"file": os.path.realpath(self.file),
"file": os.path.abspath(self.file),
"line": self.line,
"column": self.column,
"callstack": self.callstack,

View File

@ -12,14 +12,17 @@
# See the License for the specific language governing permissions and
# limitations under the License.
import glob
import os
from tempfile import NamedTemporaryFile
import tempfile
import click
from platformio import compat, fs, proc
from platformio import fs, proc
from platformio.commands.check.defect import DefectItem
from platformio.project.helpers import load_project_ide_data
from platformio.package.manager.core import get_core_package_dir
from platformio.package.meta import PackageSpec
from platformio.project.helpers import load_build_metadata
class CheckToolBase(object): # pylint: disable=too-many-instance-attributes
@ -54,7 +57,7 @@ class CheckToolBase(object): # pylint: disable=too-many-instance-attributes
]
def _load_cpp_data(self, project_dir):
data = load_project_ide_data(project_dir, self.envname)
data = load_build_metadata(project_dir, self.envname)
if not data:
return
self.cc_flags = click.parser.split_arg_string(data.get("cc_flags", ""))
@ -65,6 +68,13 @@ class CheckToolBase(object): # pylint: disable=too-many-instance-attributes
self.cxx_path = data.get("cxx_path")
self.toolchain_defines = self._get_toolchain_defines()
def get_tool_dir(self, pkg_name):
for spec in self.options["platform_packages"] or []:
spec = PackageSpec(spec)
if spec.name == pkg_name:
return get_core_package_dir(pkg_name, spec=spec)
return get_core_package_dir(pkg_name)
def get_flags(self, tool):
result = []
flags = self.options.get("flags") or []
@ -104,7 +114,7 @@ class CheckToolBase(object): # pylint: disable=too-many-instance-attributes
return {lang: _extract_defines(lang, incflags_file) for lang in ("c", "c++")}
def _create_tmp_file(self, data):
with NamedTemporaryFile("w", delete=False) as fp:
with tempfile.NamedTemporaryFile("w", delete=False) as fp:
fp.write(data)
self._tmp_files.append(fp.name)
return fp.name
@ -167,6 +177,29 @@ class CheckToolBase(object): # pylint: disable=too-many-instance-attributes
if os.path.isfile(f):
os.remove(f)
@staticmethod
def is_check_successful(cmd_result):
return cmd_result["returncode"] == 0
def execute_check_cmd(self, cmd):
result = proc.exec_command(
cmd,
stdout=proc.LineBufferedAsyncPipe(self.on_tool_output),
stderr=proc.LineBufferedAsyncPipe(self.on_tool_output),
)
if not self.is_check_successful(result):
click.echo(
"\nError: Failed to execute check command! Exited with code %d."
% result["returncode"]
)
if self.options.get("verbose"):
click.echo(result["out"])
click.echo(result["err"])
self._bad_input = True
return result
@staticmethod
def get_project_target_files(patterns):
c_extension = (".c",)
@ -177,14 +210,14 @@ class CheckToolBase(object): # pylint: disable=too-many-instance-attributes
def _add_file(path):
if path.endswith(header_extensions):
result["headers"].append(os.path.realpath(path))
result["headers"].append(os.path.abspath(path))
elif path.endswith(c_extension):
result["c"].append(os.path.realpath(path))
result["c"].append(os.path.abspath(path))
elif path.endswith(cpp_extensions):
result["c++"].append(os.path.realpath(path))
result["c++"].append(os.path.abspath(path))
for pattern in patterns:
for item in compat.glob_recursive(pattern):
for item in glob.glob(pattern, recursive=True):
if not os.path.isdir(item):
_add_file(item)
for root, _, files in os.walk(item, followlinks=True):
@ -200,11 +233,7 @@ class CheckToolBase(object): # pylint: disable=too-many-instance-attributes
if self.options.get("verbose"):
click.echo(" ".join(cmd))
proc.exec_command(
cmd,
stdout=proc.LineBufferedAsyncPipe(self.on_tool_output),
stderr=proc.LineBufferedAsyncPipe(self.on_tool_output),
)
self.execute_check_cmd(cmd)
else:
if self.options.get("verbose"):
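get_tool_dir() above lets a check tool honor a platform_packages override from platformio.ini before falling back to the core-managed package. The same lookup outside the class (the spec string is an example, not a recommendation):

import os

from platformio.package.manager.core import get_core_package_dir
from platformio.package.meta import PackageSpec

def resolve_tool_dir(pkg_name, platform_packages):
    # platform_packages entries come straight from platformio.ini,
    # e.g. "platformio/tool-cppcheck@1.270.0" (illustrative version)
    for raw_spec in platform_packages or []:
        spec = PackageSpec(raw_spec)
        if spec.name == pkg_name:
            # a custom version/source was requested for this tool
            return get_core_package_dir(pkg_name, spec=spec)
    # otherwise use the bundled core package
    return get_core_package_dir(pkg_name)

tool_path = os.path.join(resolve_tool_dir("tool-cppcheck", []), "cppcheck")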

View File

@ -17,11 +17,10 @@ from os.path import join
from platformio.commands.check.defect import DefectItem
from platformio.commands.check.tools.base import CheckToolBase
from platformio.package.manager.core import get_core_package_dir
class ClangtidyCheckTool(CheckToolBase):
def tool_output_filter(self, line):
def tool_output_filter(self, line): # pylint: disable=arguments-differ
if not self.options.get("verbose") and "[clang-diagnostic-error]" in line:
return ""
@ -34,7 +33,7 @@ class ClangtidyCheckTool(CheckToolBase):
return ""
def parse_defect(self, raw_line):
def parse_defect(self, raw_line): # pylint: disable=arguments-differ
match = re.match(r"^(.*):(\d+):(\d+):\s+([^:]+):\s(.+)\[([^]]+)\]$", raw_line)
if not match:
return raw_line
@ -49,12 +48,20 @@ class ClangtidyCheckTool(CheckToolBase):
return DefectItem(severity, category, message, file_, line, column, defect_id)
@staticmethod
def is_check_successful(cmd_result):
# Note: Clang-Tidy returns 1 for not critical compilation errors,
# so 0 and 1 are only acceptable values
return cmd_result["returncode"] < 2
def configure_command(self):
tool_path = join(get_core_package_dir("tool-clangtidy"), "clang-tidy")
tool_path = join(self.get_tool_dir("tool-clangtidy"), "clang-tidy")
cmd = [tool_path, "--quiet"]
flags = self.get_flags("clangtidy")
if not self.is_flag_set("--checks", flags):
if not (
self.is_flag_set("--checks", flags) or self.is_flag_set("--config", flags)
):
cmd.append("--checks=*")
project_files = self.get_project_target_files(self.options["patterns"])
@ -71,7 +78,7 @@ class ClangtidyCheckTool(CheckToolBase):
includes = []
for inc in self.cpp_includes:
if self.options.get("skip_packages") and inc.lower().startswith(
self.config.get_optional_dir("packages").lower()
self.config.get("platformio", "packages_dir").lower()
):
continue
includes.append(inc)

View File

@ -19,11 +19,11 @@ import click
from platformio import proc
from platformio.commands.check.defect import DefectItem
from platformio.commands.check.tools.base import CheckToolBase
from platformio.package.manager.core import get_core_package_dir
class CppcheckCheckTool(CheckToolBase):
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
self._field_delimiter = "<&PIO&>"
self._buffer = ""
self.defect_fields = [
@ -36,9 +36,8 @@ class CppcheckCheckTool(CheckToolBase):
"cwe",
"id",
]
super(CppcheckCheckTool, self).__init__(*args, **kwargs)
def tool_output_filter(self, line):
def tool_output_filter(self, line): # pylint: disable=arguments-differ
if (
not self.options.get("verbose")
and "--suppress=unmatchedSuppression:" in line
@ -50,13 +49,14 @@ class CppcheckCheckTool(CheckToolBase):
for msg in (
"No C or C++ source files found",
"unrecognized command line option",
"there was an internal error",
)
):
self._bad_input = True
return line
def parse_defect(self, raw_line):
def parse_defect(self, raw_line): # pylint: disable=arguments-differ
if self._field_delimiter not in raw_line:
return None
@ -64,7 +64,7 @@ class CppcheckCheckTool(CheckToolBase):
if any(f not in self._buffer for f in self.defect_fields):
return None
args = dict()
args = {}
for field in self._buffer.split(self._field_delimiter):
field = field.strip().replace('"', "")
name, value = field.split("=", 1)
@ -84,7 +84,7 @@ class CppcheckCheckTool(CheckToolBase):
if (
args.get("file", "")
.lower()
.startswith(self.config.get_optional_dir("packages").lower())
.startswith(self.config.get("platformio", "packages_dir").lower())
):
if args["id"] in breaking_defect_ids:
if self.options.get("verbose"):
@ -96,20 +96,19 @@ class CppcheckCheckTool(CheckToolBase):
)
click.echo()
self._bad_input = True
self._buffer = ""
return None
self._buffer = ""
return DefectItem(**args)
def configure_command(
self, language, src_files
): # pylint: disable=arguments-differ
tool_path = os.path.join(get_core_package_dir("tool-cppcheck"), "cppcheck")
def configure_command(self, language, src_file): # pylint: disable=arguments-differ
tool_path = os.path.join(self.get_tool_dir("tool-cppcheck"), "cppcheck")
cmd = [
tool_path,
"--addon-python=%s" % proc.get_pythonexe_path(),
"--error-exitcode=1",
"--error-exitcode=3",
"--verbose" if self.options.get("verbose") else "--quiet",
]
@ -142,10 +141,11 @@ class CppcheckCheckTool(CheckToolBase):
build_flags = self.cxx_flags if language == "c++" else self.cc_flags
for flag in build_flags:
if "-std" in flag:
# Standards with GNU extensions are not allowed
cmd.append("-" + flag.replace("gnu", "c"))
if not self.is_flag_set("--std", flags):
# Try to guess the standard version from the build flags
for flag in build_flags:
if "-std" in flag:
cmd.append("-" + self.convert_language_standard(flag))
cmd.extend(
["-D%s" % d for d in self.cpp_defines + self.toolchain_defines[language]]
@ -157,8 +157,8 @@ class CppcheckCheckTool(CheckToolBase):
"--include=" + inc
for inc in self.get_forced_includes(build_flags, self.cpp_includes)
)
cmd.append("--file-list=%s" % self._generate_src_file(src_files))
cmd.append("--includes-file=%s" % self._generate_inc_file())
cmd.append('"%s"' % src_file)
return cmd
@ -201,14 +201,14 @@ class CppcheckCheckTool(CheckToolBase):
result = []
for inc in self.cpp_includes:
if self.options.get("skip_packages") and inc.lower().startswith(
self.config.get_optional_dir("packages").lower()
self.config.get("platformio", "packages_dir").lower()
):
continue
result.append(inc)
return self._create_tmp_file("\n".join(result))
def clean_up(self):
super(CppcheckCheckTool, self).clean_up()
super().clean_up()
# delete temporary dump files generated by addons
if not self.is_flag_set("--addon", self.get_flags("cppcheck")):
@ -220,29 +220,47 @@ class CppcheckCheckTool(CheckToolBase):
if os.path.isfile(dump_file):
os.remove(dump_file)
@staticmethod
def is_check_successful(cmd_result):
# Cppcheck is configured to return '3' if a defect is found
return cmd_result["returncode"] in (0, 3)
@staticmethod
def convert_language_standard(flag):
cpp_standards_map = {
"0x": "11",
"1y": "14",
"1z": "17",
"2a": "20",
}
standard = flag[-2:]
# Note: GNU extensions are not supported and converted to regular standards
return flag.replace("gnu", "c").replace(
standard, cpp_standards_map.get(standard, standard)
)
def check(self, on_defect_callback=None):
self._on_defect_callback = on_defect_callback
project_files = self.get_project_target_files(self.options["patterns"])
languages = ("c", "c++")
if not any([project_files[t] for t in languages]):
project_files = self.get_project_target_files(self.options["patterns"])
src_files_scope = ("c", "c++")
if not any(project_files[t] for t in src_files_scope):
click.echo("Error: Nothing to check.")
return True
for language in languages:
if not project_files[language]:
continue
cmd = self.configure_command(language, project_files[language])
if not cmd:
self._bad_input = True
continue
if self.options.get("verbose"):
click.echo(" ".join(cmd))
proc.exec_command(
cmd,
stdout=proc.LineBufferedAsyncPipe(self.on_tool_output),
stderr=proc.LineBufferedAsyncPipe(self.on_tool_output),
)
for scope, files in project_files.items():
if scope not in src_files_scope:
continue
for src_file in files:
cmd = self.configure_command(scope, src_file)
if not cmd:
self._bad_input = True
continue
if self.options.get("verbose"):
click.echo(" ".join(cmd))
self.execute_check_cmd(cmd)
self.clean_up()
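Cppcheck does not understand GNU dialects or the pre-release standard names GCC accepts, so convert_language_standard() above rewrites e.g. "-std=gnu++1z" into "-std=c++17" before the standard is guessed from the build flags. The mapping in isolation:

def convert_language_standard(flag):
    # e.g. "-std=gnu++1z" -> "-std=c++17"; GNU extensions fold into plain standards
    cpp_standards_map = {"0x": "11", "1y": "14", "1z": "17", "2a": "20"}
    standard = flag[-2:]
    return flag.replace("gnu", "c").replace(
        standard, cpp_standards_map.get(standard, standard)
    )

assert convert_language_standard("-std=gnu++1z") == "-std=c++17"
assert convert_language_standard("-std=c++14") == "-std=c++14"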

View File

@ -19,39 +19,50 @@ from xml.etree.ElementTree import fromstring
import click
from platformio import proc, util
from platformio import proc
from platformio.commands.check.defect import DefectItem
from platformio.commands.check.tools.base import CheckToolBase
from platformio.package.manager.core import get_core_package_dir
from platformio.compat import IS_WINDOWS
class PvsStudioCheckTool(CheckToolBase): # pylint: disable=too-many-instance-attributes
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
self._tmp_dir = tempfile.mkdtemp(prefix="piocheck")
self._tmp_preprocessed_file = self._generate_tmp_file_path() + ".i"
self._tmp_output_file = self._generate_tmp_file_path() + ".pvs"
self._tmp_cfg_file = self._generate_tmp_file_path() + ".cfg"
self._tmp_cmd_file = self._generate_tmp_file_path() + ".cmd"
self.tool_path = os.path.join(
get_core_package_dir("tool-pvs-studio"),
"x64" if "windows" in util.get_systype() else "bin",
self.get_tool_dir("tool-pvs-studio"),
"x64" if IS_WINDOWS else "bin",
"pvs-studio",
)
super(PvsStudioCheckTool, self).__init__(*args, **kwargs)
with open(self._tmp_cfg_file, "w") as fp:
with open(self._tmp_cfg_file, mode="w", encoding="utf8") as fp:
fp.write(
"exclude-path = "
+ self.config.get_optional_dir("packages").replace("\\", "/")
+ self.config.get("platformio", "packages_dir").replace("\\", "/")
)
with open(self._tmp_cmd_file, "w") as fp:
with open(self._tmp_cmd_file, mode="w", encoding="utf8") as fp:
fp.write(
" ".join(
['-I"%s"' % inc.replace("\\", "/") for inc in self.cpp_includes]
)
)
def tool_output_filter(self, line): # pylint: disable=arguments-differ
if any(
err_msg in line.lower()
for err_msg in (
"license was not entered",
"license information is incorrect",
)
):
self._bad_input = True
return line
def _process_defects(self, defects):
for defect in defects:
if not isinstance(defect, DefectItem):
@ -64,10 +75,8 @@ class PvsStudioCheckTool(CheckToolBase): # pylint: disable=too-many-instance-at
def _demangle_report(self, output_file):
converter_tool = os.path.join(
get_core_package_dir("tool-pvs-studio"),
"HtmlGenerator"
if "windows" in util.get_systype()
else os.path.join("bin", "plog-converter"),
self.get_tool_dir("tool-pvs-studio"),
"HtmlGenerator" if IS_WINDOWS else os.path.join("bin", "plog-converter"),
)
cmd = (
@ -182,9 +191,15 @@ class PvsStudioCheckTool(CheckToolBase): # pylint: disable=too-many-instance-at
flags = self.cc_flags
compiler = self.cc_path
cmd = [compiler, src_file, "-E", "-o", self._tmp_preprocessed_file]
cmd = [
compiler,
'"%s"' % src_file,
"-E",
"-o",
'"%s"' % self._tmp_preprocessed_file,
]
cmd.extend([f for f in flags if f])
cmd.extend(["-D%s" % d for d in self.cpp_defines])
cmd.extend(['"-D%s"' % d.replace('"', '\\"') for d in self.cpp_defines])
cmd.append('@"%s"' % self._tmp_cmd_file)
# Explicitly specify C++ as the language used in .ino files
@ -199,10 +214,16 @@ class PvsStudioCheckTool(CheckToolBase): # pylint: disable=too-many-instance-at
self._bad_input = True
def clean_up(self):
super(PvsStudioCheckTool, self).clean_up()
super().clean_up()
if os.path.isdir(self._tmp_dir):
shutil.rmtree(self._tmp_dir)
@staticmethod
def is_check_successful(cmd_result):
return (
"license" not in cmd_result["err"].lower() and cmd_result["returncode"] == 0
)
def check(self, on_defect_callback=None):
self._on_defect_callback = on_defect_callback
for scope, files in self.get_project_target_files(
@ -219,11 +240,8 @@ class PvsStudioCheckTool(CheckToolBase): # pylint: disable=too-many-instance-at
self._bad_input = True
continue
result = proc.exec_command(cmd)
# pylint: disable=unsupported-membership-test
if result["returncode"] != 0 or "license" in result["err"].lower():
self._bad_input = True
click.echo(result["err"])
result = self.execute_check_cmd(cmd)
if result["returncode"] != 0:
continue
self._process_defects(self.parse_defects(self._tmp_output_file))

View File

@ -12,18 +12,17 @@
# See the License for the specific language governing permissions and
# limitations under the License.
from os import getenv, makedirs, remove
from os.path import basename, isdir, isfile, join, realpath
from shutil import copyfile, copytree
from tempfile import mkdtemp
import glob
import os
import shutil
import tempfile
import click
from platformio import app, compat, fs
from platformio.commands.project import project_init as cmd_project_init
from platformio.commands.project import validate_boards
from platformio import app, fs
from platformio.commands.run.command import cli as cmd_run
from platformio.exception import CIBuildEnvsEmpty
from platformio.project.commands.init import project_init_cmd, validate_boards
from platformio.project.config import ProjectConfig
@ -33,8 +32,8 @@ def validate_path(ctx, param, value): # pylint: disable=unused-argument
for i, p in enumerate(value):
if p.startswith("~"):
value[i] = fs.expanduser(p)
value[i] = realpath(value[i])
if not compat.glob_recursive(value[i]):
value[i] = os.path.abspath(value[i])
if not glob.glob(value[i], recursive=True):
invalid_path = p
break
try:
@ -44,14 +43,14 @@ def validate_path(ctx, param, value): # pylint: disable=unused-argument
raise click.BadParameter("Found invalid path: %s" % invalid_path)
@click.command("ci", short_help="Continuous integration")
@click.command("ci", short_help="Continuous Integration")
@click.argument("src", nargs=-1, callback=validate_path)
@click.option("-l", "--lib", multiple=True, callback=validate_path, metavar="DIRECTORY")
@click.option("--exclude", multiple=True)
@click.option("-b", "--board", multiple=True, metavar="ID", callback=validate_boards)
@click.option(
"--build-dir",
default=mkdtemp,
default=tempfile.mkdtemp,
type=click.Path(file_okay=False, dir_okay=True, writable=True, resolve_path=True),
)
@click.option("--keep-build-dir", is_flag=True)
@ -78,28 +77,28 @@ def cli( # pylint: disable=too-many-arguments, too-many-branches
verbose,
):
if not src and getenv("PLATFORMIO_CI_SRC"):
src = validate_path(ctx, None, getenv("PLATFORMIO_CI_SRC").split(":"))
if not src and os.getenv("PLATFORMIO_CI_SRC"):
src = validate_path(ctx, None, os.getenv("PLATFORMIO_CI_SRC").split(":"))
if not src:
raise click.BadParameter("Missing argument 'src'")
try:
app.set_session_var("force_option", True)
if not keep_build_dir and isdir(build_dir):
if not keep_build_dir and os.path.isdir(build_dir):
fs.rmtree(build_dir)
if not isdir(build_dir):
makedirs(build_dir)
if not os.path.isdir(build_dir):
os.makedirs(build_dir)
for dir_name, patterns in dict(lib=lib, src=src).items():
if not patterns:
continue
contents = []
for p in patterns:
contents += compat.glob_recursive(p)
_copy_contents(join(build_dir, dir_name), contents)
contents += glob.glob(p, recursive=True)
_copy_contents(os.path.join(build_dir, dir_name), contents)
if project_conf and isfile(project_conf):
if project_conf and os.path.isfile(project_conf):
_copy_project_conf(build_dir, project_conf)
elif not board:
raise CIBuildEnvsEmpty()
@ -109,7 +108,7 @@ def cli( # pylint: disable=too-many-arguments, too-many-branches
# initialise project
ctx.invoke(
cmd_project_init,
project_init_cmd,
project_dir=build_dir,
board=board,
project_option=project_option,
@ -122,52 +121,55 @@ def cli( # pylint: disable=too-many-arguments, too-many-branches
fs.rmtree(build_dir)
def _copy_contents(dst_dir, contents):
def _copy_contents(dst_dir, contents): # pylint: disable=too-many-branches
items = {"dirs": set(), "files": set()}
for path in contents:
if isdir(path):
if os.path.isdir(path):
items["dirs"].add(path)
elif isfile(path):
elif os.path.isfile(path):
items["files"].add(path)
dst_dir_name = basename(dst_dir)
dst_dir_name = os.path.basename(dst_dir)
if dst_dir_name == "src" and len(items["dirs"]) == 1:
copytree(list(items["dirs"]).pop(), dst_dir, symlinks=True)
if not os.path.isdir(dst_dir):
shutil.copytree(list(items["dirs"]).pop(), dst_dir, symlinks=True)
else:
if not isdir(dst_dir):
makedirs(dst_dir)
if not os.path.isdir(dst_dir):
os.makedirs(dst_dir)
for d in items["dirs"]:
copytree(d, join(dst_dir, basename(d)), symlinks=True)
src_dst_dir = os.path.join(dst_dir, os.path.basename(d))
if not os.path.isdir(src_dst_dir):
shutil.copytree(d, src_dst_dir, symlinks=True)
if not items["files"]:
return
if dst_dir_name == "lib":
dst_dir = join(dst_dir, mkdtemp(dir=dst_dir))
dst_dir = os.path.join(dst_dir, tempfile.mkdtemp(dir=dst_dir))
for f in items["files"]:
dst_file = join(dst_dir, basename(f))
dst_file = os.path.join(dst_dir, os.path.basename(f))
if f == dst_file:
continue
copyfile(f, dst_file)
shutil.copyfile(f, dst_file)
def _exclude_contents(dst_dir, patterns):
contents = []
for p in patterns:
contents += compat.glob_recursive(join(compat.glob_escape(dst_dir), p))
contents += glob.glob(os.path.join(glob.escape(dst_dir), p), recursive=True)
for path in contents:
path = realpath(path)
if isdir(path):
path = os.path.abspath(path)
if os.path.isdir(path):
fs.rmtree(path)
elif isfile(path):
remove(path)
elif os.path.isfile(path):
os.remove(path)
def _copy_project_conf(build_dir, project_conf):
config = ProjectConfig(project_conf, parse_extra=False)
if config.has_section("platformio"):
config.remove_section("platformio")
config.save(join(build_dir, "platformio.ini"))
config.save(os.path.join(build_dir, "platformio.ini"))
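The hunks above drop PlatformIO's own path helpers (compat.glob_recursive, glob_escape, the bare os.path imports) in favour of the standard library. A minimal standalone sketch of the exclusion step, using only stdlib calls (shutil.rmtree stands in for PlatformIO's fs.rmtree):

    import glob
    import os
    import shutil

    def remove_excluded(dst_dir, patterns):
        # glob.escape() protects special characters in the base directory;
        # recursive=True lets "**" patterns descend into subdirectories.
        for pattern in patterns:
            for path in glob.glob(os.path.join(glob.escape(dst_dir), pattern), recursive=True):
                path = os.path.abspath(path)
                if os.path.isdir(path):
                    shutil.rmtree(path)
                elif os.path.isfile(path):
                    os.remove(path)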


@ -12,10 +12,6 @@
# See the License for the specific language governing permissions and
# limitations under the License.
[report]
# Regexes for lines to exclude from consideration
exclude_lines =
pragma: no cover
def __repr__
raise AssertionError
raise NotImplementedError
# pylint: disable=unused-import
from platformio.debug.command import debug_cmd as cli


@ -1,175 +0,0 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# pylint: disable=too-many-arguments, too-many-statements
# pylint: disable=too-many-locals, too-many-branches
import os
import signal
from os.path import isfile
import click
from platformio import app, exception, fs, proc
from platformio.commands.debug import helpers
from platformio.commands.debug.exception import DebugInvalidOptionsError
from platformio.commands.platform import platform_install as cmd_platform_install
from platformio.package.manager.core import inject_contrib_pysite
from platformio.platform.exception import UnknownPlatform
from platformio.platform.factory import PlatformFactory
from platformio.project.config import ProjectConfig
from platformio.project.exception import ProjectEnvsNotAvailableError
from platformio.project.helpers import is_platformio_project, load_project_ide_data
@click.command(
"debug",
context_settings=dict(ignore_unknown_options=True),
short_help="Unified debugger",
)
@click.option(
"-d",
"--project-dir",
default=os.getcwd,
type=click.Path(
exists=True, file_okay=False, dir_okay=True, writable=True, resolve_path=True
),
)
@click.option(
"-c",
"--project-conf",
type=click.Path(
exists=True, file_okay=True, dir_okay=False, readable=True, resolve_path=True
),
)
@click.option("--environment", "-e", metavar="<environment>")
@click.option("--verbose", "-v", is_flag=True)
@click.option("--interface", type=click.Choice(["gdb"]))
@click.argument("__unprocessed", nargs=-1, type=click.UNPROCESSED)
@click.pass_context
def cli(ctx, project_dir, project_conf, environment, verbose, interface, __unprocessed):
app.set_session_var("custom_project_conf", project_conf)
# use env variables from Eclipse or CLion
for sysenv in ("CWD", "PWD", "PLATFORMIO_PROJECT_DIR"):
if is_platformio_project(project_dir):
break
if os.getenv(sysenv):
project_dir = os.getenv(sysenv)
with fs.cd(project_dir):
config = ProjectConfig.get_instance(project_conf)
config.validate(envs=[environment] if environment else None)
env_name = environment or helpers.get_default_debug_env(config)
env_options = config.items(env=env_name, as_dict=True)
if not set(env_options.keys()) >= set(["platform", "board"]):
raise ProjectEnvsNotAvailableError()
try:
platform = PlatformFactory.new(env_options["platform"])
except UnknownPlatform:
ctx.invoke(
cmd_platform_install,
platforms=[env_options["platform"]],
skip_default_package=True,
)
platform = PlatformFactory.new(env_options["platform"])
debug_options = helpers.configure_initial_debug_options(platform, env_options)
assert debug_options
if not interface:
return helpers.predebug_project(ctx, project_dir, env_name, False, verbose)
ide_data = load_project_ide_data(project_dir, env_name)
if not ide_data:
raise DebugInvalidOptionsError("Could not load a build configuration")
if "--version" in __unprocessed:
result = proc.exec_command([ide_data["gdb_path"], "--version"])
if result["returncode"] == 0:
return click.echo(result["out"])
raise exception.PlatformioException("\n".join([result["out"], result["err"]]))
try:
fs.ensure_udev_rules()
except exception.InvalidUdevRules as e:
click.echo(
helpers.escape_gdbmi_stream("~", str(e) + "\n")
if helpers.is_gdbmi_mode()
else str(e) + "\n",
nl=False,
)
try:
debug_options = platform.configure_debug_options(debug_options, ide_data)
except NotImplementedError:
# legacy for ESP32 dev-platform <=2.0.0
debug_options["load_cmds"] = helpers.configure_esp32_load_cmds(
debug_options, ide_data
)
rebuild_prog = False
preload = debug_options["load_cmds"] == ["preload"]
load_mode = debug_options["load_mode"]
if load_mode == "always":
rebuild_prog = preload or not helpers.has_debug_symbols(ide_data["prog_path"])
elif load_mode == "modified":
rebuild_prog = helpers.is_prog_obsolete(
ide_data["prog_path"]
) or not helpers.has_debug_symbols(ide_data["prog_path"])
else:
rebuild_prog = not isfile(ide_data["prog_path"])
if preload or (not rebuild_prog and load_mode != "always"):
# don't load firmware through debug server
debug_options["load_cmds"] = []
if rebuild_prog:
if helpers.is_gdbmi_mode():
click.echo(
helpers.escape_gdbmi_stream(
"~", "Preparing firmware for debugging...\n"
),
nl=False,
)
stream = helpers.GDBMIConsoleStream()
with proc.capture_std_streams(stream):
helpers.predebug_project(ctx, project_dir, env_name, preload, verbose)
stream.close()
else:
click.echo("Preparing firmware for debugging...")
helpers.predebug_project(ctx, project_dir, env_name, preload, verbose)
# save SHA sum of newly created prog
if load_mode == "modified":
helpers.is_prog_obsolete(ide_data["prog_path"])
if not isfile(ide_data["prog_path"]):
raise DebugInvalidOptionsError("Program/firmware is missed")
# run debugging client
inject_contrib_pysite()
# pylint: disable=import-outside-toplevel
from platformio.commands.debug.process.client import GDBClient, reactor
client = GDBClient(project_dir, __unprocessed, debug_options, env_options)
client.spawn(ide_data["gdb_path"], ide_data["prog_path"])
signal.signal(signal.SIGINT, lambda *args, **kwargs: None)
reactor.run()
return True


@ -1,302 +0,0 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import re
import sys
import time
from fnmatch import fnmatch
from hashlib import sha1
from io import BytesIO
from os.path import isfile
from platformio import fs, util
from platformio.commands import PlatformioCLI
from platformio.commands.debug.exception import DebugInvalidOptionsError
from platformio.commands.run.command import cli as cmd_run
from platformio.compat import is_bytes
from platformio.project.config import ProjectConfig
from platformio.project.options import ProjectOptions
class GDBMIConsoleStream(BytesIO): # pylint: disable=too-few-public-methods
STDOUT = sys.stdout
def write(self, text):
self.STDOUT.write(escape_gdbmi_stream("~", text))
self.STDOUT.flush()
def is_gdbmi_mode():
return "--interpreter" in " ".join(PlatformioCLI.leftover_args)
def escape_gdbmi_stream(prefix, stream):
bytes_stream = False
if is_bytes(stream):
bytes_stream = True
stream = stream.decode()
if not stream:
return b"" if bytes_stream else ""
ends_nl = stream.endswith("\n")
stream = re.sub(r"\\+", "\\\\\\\\", stream)
stream = stream.replace('"', '\\"')
stream = stream.replace("\n", "\\n")
stream = '%s"%s"' % (prefix, stream)
if ends_nl:
stream += "\n"
return stream.encode() if bytes_stream else stream
def get_default_debug_env(config):
default_envs = config.default_envs()
all_envs = config.envs()
for env in default_envs:
if config.get("env:" + env, "build_type") == "debug":
return env
for env in all_envs:
if config.get("env:" + env, "build_type") == "debug":
return env
return default_envs[0] if default_envs else all_envs[0]
def predebug_project(ctx, project_dir, env_name, preload, verbose):
ctx.invoke(
cmd_run,
project_dir=project_dir,
environment=[env_name],
target=["debug"] + (["upload"] if preload else []),
verbose=verbose,
)
if preload:
time.sleep(5)
def configure_initial_debug_options(platform, env_options):
def _cleanup_cmds(items):
items = ProjectConfig.parse_multi_values(items)
return ["$LOAD_CMDS" if item == "$LOAD_CMD" else item for item in items]
board_config = platform.board_config(env_options["board"])
tool_name = board_config.get_debug_tool_name(env_options.get("debug_tool"))
tool_settings = board_config.get("debug", {}).get("tools", {}).get(tool_name, {})
server_options = None
# specific server per system
if isinstance(tool_settings.get("server", {}), list):
for item in tool_settings["server"][:]:
tool_settings["server"] = item
if util.get_systype() in item.get("system", []):
break
# user overwrites debug server
if env_options.get("debug_server"):
server_options = {
"cwd": None,
"executable": None,
"arguments": env_options.get("debug_server"),
}
server_options["executable"] = server_options["arguments"][0]
server_options["arguments"] = server_options["arguments"][1:]
elif "server" in tool_settings:
server_options = tool_settings["server"]
server_package = server_options.get("package")
server_package_dir = (
platform.get_package_dir(server_package) if server_package else None
)
if server_package and not server_package_dir:
platform.install_packages(
with_packages=[server_package], skip_default_package=True, silent=True
)
server_package_dir = platform.get_package_dir(server_package)
server_options.update(
dict(
cwd=server_package_dir if server_package else None,
executable=server_options.get("executable"),
arguments=[
a.replace("$PACKAGE_DIR", server_package_dir)
if server_package_dir
else a
for a in server_options.get("arguments", [])
],
)
)
extra_cmds = _cleanup_cmds(env_options.get("debug_extra_cmds"))
extra_cmds.extend(_cleanup_cmds(tool_settings.get("extra_cmds")))
result = dict(
tool=tool_name,
upload_protocol=env_options.get(
"upload_protocol", board_config.get("upload", {}).get("protocol")
),
load_cmds=_cleanup_cmds(
env_options.get(
"debug_load_cmds",
tool_settings.get(
"load_cmds",
tool_settings.get(
"load_cmd", ProjectOptions["env.debug_load_cmds"].default
),
),
)
),
load_mode=env_options.get(
"debug_load_mode",
tool_settings.get(
"load_mode", ProjectOptions["env.debug_load_mode"].default
),
),
init_break=env_options.get(
"debug_init_break",
tool_settings.get(
"init_break", ProjectOptions["env.debug_init_break"].default
),
),
init_cmds=_cleanup_cmds(
env_options.get("debug_init_cmds", tool_settings.get("init_cmds"))
),
extra_cmds=extra_cmds,
require_debug_port=tool_settings.get("require_debug_port", False),
port=reveal_debug_port(
env_options.get("debug_port", tool_settings.get("port")),
tool_name,
tool_settings,
),
server=server_options,
)
return result
def configure_esp32_load_cmds(debug_options, configuration):
"""
DEPRECATED: Moved to ESP32 dev-platform
See platform.py::configure_debug_options
"""
flash_images = configuration.get("extra", {}).get("flash_images")
ignore_conds = [
debug_options["load_cmds"] != ["load"],
"xtensa-esp32" not in configuration.get("cc_path", ""),
not flash_images,
not all([isfile(item["path"]) for item in flash_images]),
]
if any(ignore_conds):
return debug_options["load_cmds"]
mon_cmds = [
'monitor program_esp32 "{{{path}}}" {offset} verify'.format(
path=fs.to_unix_path(item["path"]), offset=item["offset"]
)
for item in flash_images
]
mon_cmds.append(
'monitor program_esp32 "{%s.bin}" 0x10000 verify'
% fs.to_unix_path(configuration["prog_path"][:-4])
)
return mon_cmds
def has_debug_symbols(prog_path):
if not isfile(prog_path):
return False
matched = {
b".debug_info": False,
b".debug_abbrev": False,
b" -Og": False,
b" -g": False,
b"__PLATFORMIO_BUILD_DEBUG__": False,
}
with open(prog_path, "rb") as fp:
last_data = b""
while True:
data = fp.read(1024)
if not data:
break
for pattern, found in matched.items():
if found:
continue
if pattern in last_data + data:
matched[pattern] = True
last_data = data
return all(matched.values())
def is_prog_obsolete(prog_path):
prog_hash_path = prog_path + ".sha1"
if not isfile(prog_path):
return True
shasum = sha1()
with open(prog_path, "rb") as fp:
while True:
data = fp.read(1024)
if not data:
break
shasum.update(data)
new_digest = shasum.hexdigest()
old_digest = None
if isfile(prog_hash_path):
with open(prog_hash_path) as fp:
old_digest = fp.read()
if new_digest == old_digest:
return False
with open(prog_hash_path, "w") as fp:
fp.write(new_digest)
return True
def reveal_debug_port(env_debug_port, tool_name, tool_settings):
def _get_pattern():
if not env_debug_port:
return None
if set(["*", "?", "[", "]"]) & set(env_debug_port):
return env_debug_port
return None
def _is_match_pattern(port):
pattern = _get_pattern()
if not pattern:
return True
return fnmatch(port, pattern)
def _look_for_serial_port(hwids):
for item in util.get_serialports(filter_hwid=True):
if not _is_match_pattern(item["port"]):
continue
port = item["port"]
if tool_name.startswith("blackmagic"):
if (
"windows" in util.get_systype()
and port.startswith("COM")
and len(port) > 4
):
port = "\\\\.\\%s" % port
if "GDB" in item["description"]:
return port
for hwid in hwids:
hwid_str = ("%s:%s" % (hwid[0], hwid[1])).replace("0x", "")
if hwid_str in item["hwid"]:
return port
return None
if env_debug_port and not _get_pattern():
return env_debug_port
if not tool_settings.get("require_debug_port"):
return None
debug_port = _look_for_serial_port(tool_settings.get("hwids", []))
if not debug_port:
raise DebugInvalidOptionsError("Please specify `debug_port` for environment")
return debug_port
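reveal_debug_port() above treats a debug_port value containing glob characters as a wildcard and matches detected serial ports against it with fnmatch. A small self-contained sketch of that matching rule (not the PlatformIO implementation itself):

    from fnmatch import fnmatch

    def port_matches(port, debug_port_setting):
        # No setting -> accept any port; a plain value must match exactly;
        # a value with glob characters is treated as an fnmatch pattern.
        if not debug_port_setting:
            return True
        if set("*?[]") & set(debug_port_setting):
            return fnmatch(port, debug_port_setting)
        return port == debug_port_setting

    assert port_matches("/dev/cu.usbmodem14201", "/dev/cu.usbmodem*")
    assert not port_matches("COM3", "COM4")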


@ -1,161 +0,0 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
GDB_DEFAULT_INIT_CONFIG = """
define pio_reset_halt_target
monitor reset halt
end
define pio_reset_run_target
monitor reset
end
target extended-remote $DEBUG_PORT
monitor init
$LOAD_CMDS
pio_reset_halt_target
$INIT_BREAK
"""
GDB_STUTIL_INIT_CONFIG = """
define pio_reset_halt_target
monitor reset
monitor halt
end
define pio_reset_run_target
monitor reset
end
target extended-remote $DEBUG_PORT
$LOAD_CMDS
pio_reset_halt_target
$INIT_BREAK
"""
GDB_JLINK_INIT_CONFIG = """
define pio_reset_halt_target
monitor reset
monitor halt
end
define pio_reset_run_target
monitor clrbp
monitor reset
monitor go
end
target extended-remote $DEBUG_PORT
monitor clrbp
monitor speed auto
pio_reset_halt_target
$LOAD_CMDS
$INIT_BREAK
"""
GDB_BLACKMAGIC_INIT_CONFIG = """
define pio_reset_halt_target
set language c
set *0xE000ED0C = 0x05FA0004
set $busy = (*0xE000ED0C & 0x4)
while ($busy)
set $busy = (*0xE000ED0C & 0x4)
end
set language auto
end
define pio_reset_run_target
pio_reset_halt_target
end
target extended-remote $DEBUG_PORT
monitor swdp_scan
attach 1
set mem inaccessible-by-default off
$LOAD_CMDS
$INIT_BREAK
set language c
set *0xE000ED0C = 0x05FA0004
set $busy = (*0xE000ED0C & 0x4)
while ($busy)
set $busy = (*0xE000ED0C & 0x4)
end
set language auto
"""
GDB_MSPDEBUG_INIT_CONFIG = """
define pio_reset_halt_target
end
define pio_reset_run_target
end
target extended-remote $DEBUG_PORT
monitor erase
$LOAD_CMDS
pio_reset_halt_target
$INIT_BREAK
"""
GDB_QEMU_INIT_CONFIG = """
define pio_reset_halt_target
monitor system_reset
end
define pio_reset_run_target
monitor system_reset
end
target extended-remote $DEBUG_PORT
$LOAD_CMDS
pio_reset_halt_target
$INIT_BREAK
"""
GDB_RENODE_INIT_CONFIG = """
define pio_reset_halt_target
monitor machine Reset
$LOAD_CMDS
monitor start
end
define pio_reset_run_target
pio_reset_halt_target
end
target extended-remote $DEBUG_PORT
$LOAD_CMDS
$INIT_BREAK
monitor start
"""
TOOL_TO_CONFIG = {
"jlink": GDB_JLINK_INIT_CONFIG,
"mspdebug": GDB_MSPDEBUG_INIT_CONFIG,
"qemu": GDB_QEMU_INIT_CONFIG,
"blackmagic": GDB_BLACKMAGIC_INIT_CONFIG,
"renode": GDB_RENODE_INIT_CONFIG,
}
def get_gdb_init_config(debug_options):
tool = debug_options.get("tool")
if tool and tool in TOOL_TO_CONFIG:
return TOOL_TO_CONFIG[tool]
server_exe = (debug_options.get("server") or {}).get("executable", "").lower()
if "st-util" in server_exe:
return GDB_STUTIL_INIT_CONFIG
return GDB_DEFAULT_INIT_CONFIG


@ -1,93 +0,0 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import signal
import time
import click
from twisted.internet import protocol # pylint: disable=import-error
from platformio import fs
from platformio.compat import string_types
from platformio.proc import get_pythonexe_path
from platformio.project.helpers import get_project_core_dir
class BaseProcess(protocol.ProcessProtocol, object):
STDOUT_CHUNK_SIZE = 2048
LOG_FILE = None
COMMON_PATTERNS = {
"PLATFORMIO_HOME_DIR": get_project_core_dir(),
"PLATFORMIO_CORE_DIR": get_project_core_dir(),
"PYTHONEXE": get_pythonexe_path(),
}
def __init__(self):
self._last_activity = 0
def apply_patterns(self, source, patterns=None):
_patterns = self.COMMON_PATTERNS.copy()
_patterns.update(patterns or {})
for key, value in _patterns.items():
if key.endswith(("_DIR", "_PATH")):
_patterns[key] = fs.to_unix_path(value)
def _replace(text):
for key, value in _patterns.items():
pattern = "$%s" % key
text = text.replace(pattern, value or "")
return text
if isinstance(source, string_types):
source = _replace(source)
elif isinstance(source, (list, dict)):
items = enumerate(source) if isinstance(source, list) else source.items()
for key, value in items:
if isinstance(value, string_types):
source[key] = _replace(value)
elif isinstance(value, (list, dict)):
source[key] = self.apply_patterns(value, patterns)
return source
def onStdInData(self, data):
self._last_activity = time.time()
if self.LOG_FILE:
with open(self.LOG_FILE, "ab") as fp:
fp.write(data)
def outReceived(self, data):
self._last_activity = time.time()
if self.LOG_FILE:
with open(self.LOG_FILE, "ab") as fp:
fp.write(data)
while data:
chunk = data[: self.STDOUT_CHUNK_SIZE]
click.echo(chunk, nl=False)
data = data[self.STDOUT_CHUNK_SIZE :]
def errReceived(self, data):
self._last_activity = time.time()
if self.LOG_FILE:
with open(self.LOG_FILE, "ab") as fp:
fp.write(data)
click.echo(data, nl=False, err=True)
def processEnded(self, _):
self._last_activity = time.time()
# Allow terminating via SIGINT/CTRL+C
signal.signal(signal.SIGINT, signal.default_int_handler)


@ -1,280 +0,0 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
import re
import signal
import time
from hashlib import sha1
from os.path import basename, dirname, isdir, join, realpath, splitext
from tempfile import mkdtemp
from twisted.internet import defer # pylint: disable=import-error
from twisted.internet import protocol # pylint: disable=import-error
from twisted.internet import reactor # pylint: disable=import-error
from twisted.internet import stdio # pylint: disable=import-error
from twisted.internet import task # pylint: disable=import-error
from platformio import fs, proc, telemetry, util
from platformio.cache import ContentCache
from platformio.commands.debug import helpers
from platformio.commands.debug.exception import DebugInvalidOptionsError
from platformio.commands.debug.initcfgs import get_gdb_init_config
from platformio.commands.debug.process.base import BaseProcess
from platformio.commands.debug.process.server import DebugServer
from platformio.compat import hashlib_encode_data, is_bytes
from platformio.project.helpers import get_project_cache_dir
class GDBClient(BaseProcess): # pylint: disable=too-many-instance-attributes
PIO_SRC_NAME = ".pioinit"
INIT_COMPLETED_BANNER = "PlatformIO: Initialization completed"
def __init__(self, project_dir, args, debug_options, env_options):
super(GDBClient, self).__init__()
self.project_dir = project_dir
self.args = list(args)
self.debug_options = debug_options
self.env_options = env_options
self._debug_server = DebugServer(debug_options, env_options)
self._session_id = None
if not isdir(get_project_cache_dir()):
os.makedirs(get_project_cache_dir())
self._gdbsrc_dir = mkdtemp(dir=get_project_cache_dir(), prefix=".piodebug-")
self._target_is_run = False
self._auto_continue_timer = None
self._errors_buffer = b""
@defer.inlineCallbacks
def spawn(self, gdb_path, prog_path):
session_hash = gdb_path + prog_path
self._session_id = sha1(hashlib_encode_data(session_hash)).hexdigest()
self._kill_previous_session()
patterns = {
"PROJECT_DIR": self.project_dir,
"PROG_PATH": prog_path,
"PROG_DIR": dirname(prog_path),
"PROG_NAME": basename(splitext(prog_path)[0]),
"DEBUG_PORT": self.debug_options["port"],
"UPLOAD_PROTOCOL": self.debug_options["upload_protocol"],
"INIT_BREAK": self.debug_options["init_break"] or "",
"LOAD_CMDS": "\n".join(self.debug_options["load_cmds"] or []),
}
yield self._debug_server.spawn(patterns)
if not patterns["DEBUG_PORT"]:
patterns["DEBUG_PORT"] = self._debug_server.get_debug_port()
self.generate_pioinit(self._gdbsrc_dir, patterns)
# start GDB client
args = [
"piogdb",
"-q",
"--directory",
self._gdbsrc_dir,
"--directory",
self.project_dir,
"-l",
"10",
]
args.extend(self.args)
if not gdb_path:
raise DebugInvalidOptionsError("GDB client is not configured")
gdb_data_dir = self._get_data_dir(gdb_path)
if gdb_data_dir:
args.extend(["--data-directory", gdb_data_dir])
args.append(patterns["PROG_PATH"])
transport = reactor.spawnProcess(
self, gdb_path, args, path=self.project_dir, env=os.environ
)
defer.returnValue(transport)
@staticmethod
def _get_data_dir(gdb_path):
if "msp430" in gdb_path:
return None
gdb_data_dir = realpath(join(dirname(gdb_path), "..", "share", "gdb"))
return gdb_data_dir if isdir(gdb_data_dir) else None
def generate_pioinit(self, dst_dir, patterns):
# default GDB init commands depending on debug tool
commands = get_gdb_init_config(self.debug_options).split("\n")
if self.debug_options["init_cmds"]:
commands = self.debug_options["init_cmds"]
commands.extend(self.debug_options["extra_cmds"])
if not any("define pio_reset_run_target" in cmd for cmd in commands):
commands = [
"define pio_reset_run_target",
" echo Warning! Undefined pio_reset_run_target command\\n",
" monitor reset",
"end",
] + commands
if not any("define pio_reset_halt_target" in cmd for cmd in commands):
commands = [
"define pio_reset_halt_target",
" echo Warning! Undefined pio_reset_halt_target command\\n",
" monitor reset halt",
"end",
] + commands
if not any("define pio_restart_target" in cmd for cmd in commands):
commands += [
"define pio_restart_target",
" pio_reset_halt_target",
" $INIT_BREAK",
" %s" % ("continue" if patterns["INIT_BREAK"] else "next"),
"end",
]
banner = [
"echo PlatformIO Unified Debugger -> http://bit.ly/pio-debug\\n",
"echo PlatformIO: debug_tool = %s\\n" % self.debug_options["tool"],
"echo PlatformIO: Initializing remote target...\\n",
]
footer = ["echo %s\\n" % self.INIT_COMPLETED_BANNER]
commands = banner + commands + footer
with open(join(dst_dir, self.PIO_SRC_NAME), "w") as fp:
fp.write("\n".join(self.apply_patterns(commands, patterns)))
def connectionMade(self):
self._lock_session(self.transport.pid)
p = protocol.Protocol()
p.dataReceived = self.onStdInData
stdio.StandardIO(p)
def onStdInData(self, data):
super(GDBClient, self).onStdInData(data)
if b"-exec-run" in data:
if self._target_is_run:
token, _ = data.split(b"-", 1)
self.outReceived(token + b"^running\n")
return
data = data.replace(b"-exec-run", b"-exec-continue")
if b"-exec-continue" in data:
self._target_is_run = True
if b"-gdb-exit" in data or data.strip() in (b"q", b"quit"):
# Allow terminating via SIGINT/CTRL+C
signal.signal(signal.SIGINT, signal.default_int_handler)
self.transport.write(b"pio_reset_run_target\n")
self.transport.write(data)
def processEnded(self, reason): # pylint: disable=unused-argument
self._unlock_session()
if self._gdbsrc_dir and isdir(self._gdbsrc_dir):
fs.rmtree(self._gdbsrc_dir)
if self._debug_server:
self._debug_server.terminate()
reactor.stop()
def outReceived(self, data):
super(GDBClient, self).outReceived(data)
self._handle_error(data)
# go to init break automatically
if self.INIT_COMPLETED_BANNER.encode() in data:
telemetry.send_event(
"Debug", "Started", telemetry.dump_run_environment(self.env_options)
)
self._auto_continue_timer = task.LoopingCall(self._auto_exec_continue)
self._auto_continue_timer.start(0.1)
def errReceived(self, data):
super(GDBClient, self).errReceived(data)
self._handle_error(data)
def console_log(self, msg):
if helpers.is_gdbmi_mode():
msg = helpers.escape_gdbmi_stream("~", msg)
self.outReceived(msg if is_bytes(msg) else msg.encode())
def _auto_exec_continue(self):
auto_exec_delay = 0.5 # in seconds
if self._last_activity > (time.time() - auto_exec_delay):
return
if self._auto_continue_timer:
self._auto_continue_timer.stop()
self._auto_continue_timer = None
if not self.debug_options["init_break"] or self._target_is_run:
return
self.console_log(
"PlatformIO: Resume the execution to `debug_init_break = %s`\n"
% self.debug_options["init_break"]
)
self.console_log(
"PlatformIO: More configuration options -> http://bit.ly/pio-debug\n"
)
self.transport.write(
b"0-exec-continue\n" if helpers.is_gdbmi_mode() else b"continue\n"
)
self._target_is_run = True
def _handle_error(self, data):
self._errors_buffer = (self._errors_buffer + data)[-8192:] # keep last 8 KBytes
if not (
self.PIO_SRC_NAME.encode() in self._errors_buffer
and b"Error in sourced" in self._errors_buffer
):
return
last_errors = self._errors_buffer.decode()
last_errors = " ".join(reversed(last_errors.split("\n")))
last_errors = re.sub(r'((~|&)"|\\n\"|\\t)', " ", last_errors, flags=re.M)
err = "%s -> %s" % (
telemetry.dump_run_environment(self.env_options),
last_errors,
)
telemetry.send_exception("DebugInitError: %s" % err)
self.transport.loseConnection()
def _kill_previous_session(self):
assert self._session_id
pid = None
with ContentCache() as cc:
pid = cc.get(self._session_id)
cc.delete(self._session_id)
if not pid:
return
if "windows" in util.get_systype():
kill = ["Taskkill", "/PID", pid, "/F"]
else:
kill = ["kill", pid]
try:
proc.exec_command(kill)
except: # pylint: disable=bare-except
pass
def _lock_session(self, pid):
if not self._session_id:
return
with ContentCache() as cc:
cc.set(self._session_id, str(pid), "1h")
def _unlock_session(self):
if not self._session_id:
return
with ContentCache() as cc:
cc.delete(self._session_id)


@ -1,166 +0,0 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
import time
from os.path import isdir, isfile, join
from twisted.internet import defer # pylint: disable=import-error
from twisted.internet import reactor # pylint: disable=import-error
from platformio import fs, util
from platformio.commands.debug.exception import DebugInvalidOptionsError
from platformio.commands.debug.helpers import escape_gdbmi_stream, is_gdbmi_mode
from platformio.commands.debug.process.base import BaseProcess
from platformio.proc import where_is_program
class DebugServer(BaseProcess):
def __init__(self, debug_options, env_options):
super(DebugServer, self).__init__()
self.debug_options = debug_options
self.env_options = env_options
self._debug_port = ":3333"
self._transport = None
self._process_ended = False
self._ready = False
@defer.inlineCallbacks
def spawn(self, patterns): # pylint: disable=too-many-branches
systype = util.get_systype()
server = self.debug_options.get("server")
if not server:
defer.returnValue(None)
server = self.apply_patterns(server, patterns)
server_executable = server["executable"]
if not server_executable:
defer.returnValue(None)
if server["cwd"]:
server_executable = join(server["cwd"], server_executable)
if (
"windows" in systype
and not server_executable.endswith(".exe")
and isfile(server_executable + ".exe")
):
server_executable = server_executable + ".exe"
if not isfile(server_executable):
server_executable = where_is_program(server_executable)
if not isfile(server_executable):
raise DebugInvalidOptionsError(
"\nCould not launch Debug Server '%s'. Please check that it "
"is installed and is included in a system PATH\n\n"
"See documentation or contact contact@platformio.org:\n"
"https://docs.platformio.org/page/plus/debugging.html\n"
% server_executable
)
openocd_pipe_allowed = all(
[not self.debug_options["port"], "openocd" in server_executable]
)
if openocd_pipe_allowed:
args = []
if server["cwd"]:
args.extend(["-s", server["cwd"]])
args.extend(
["-c", "gdb_port pipe; tcl_port disabled; telnet_port disabled"]
)
args.extend(server["arguments"])
str_args = " ".join(
[arg if arg.startswith("-") else '"%s"' % arg for arg in args]
)
self._debug_port = '| "%s" %s' % (server_executable, str_args)
self._debug_port = fs.to_unix_path(self._debug_port)
defer.returnValue(self._debug_port)
env = os.environ.copy()
# prepend server "lib" folder to LD path
if (
"windows" not in systype
and server["cwd"]
and isdir(join(server["cwd"], "lib"))
):
ld_key = "DYLD_LIBRARY_PATH" if "darwin" in systype else "LD_LIBRARY_PATH"
env[ld_key] = join(server["cwd"], "lib")
if os.environ.get(ld_key):
env[ld_key] = "%s:%s" % (env[ld_key], os.environ.get(ld_key))
# prepend BIN to PATH
if server["cwd"] and isdir(join(server["cwd"], "bin")):
env["PATH"] = "%s%s%s" % (
join(server["cwd"], "bin"),
os.pathsep,
os.environ.get("PATH", os.environ.get("Path", "")),
)
self._transport = reactor.spawnProcess(
self,
server_executable,
[server_executable] + server["arguments"],
path=server["cwd"],
env=env,
)
if "mspdebug" in server_executable.lower():
self._debug_port = ":2000"
elif "jlink" in server_executable.lower():
self._debug_port = ":2331"
elif "qemu" in server_executable.lower():
self._debug_port = ":1234"
yield self._wait_until_ready()
defer.returnValue(self._debug_port)
@defer.inlineCallbacks
def _wait_until_ready(self):
timeout = 10
elapsed = 0
delay = 0.5
auto_ready_delay = 0.5
while not self._ready and not self._process_ended and elapsed < timeout:
yield self.async_sleep(delay)
if not self.debug_options.get("server", {}).get("ready_pattern"):
self._ready = self._last_activity < (time.time() - auto_ready_delay)
elapsed += delay
@staticmethod
def async_sleep(secs):
d = defer.Deferred()
reactor.callLater(secs, d.callback, None)
return d
def get_debug_port(self):
return self._debug_port
def outReceived(self, data):
super(DebugServer, self).outReceived(
escape_gdbmi_stream("@", data) if is_gdbmi_mode() else data
)
if self._ready:
return
ready_pattern = self.debug_options.get("server", {}).get("ready_pattern")
if ready_pattern:
self._ready = ready_pattern.encode() in data
def processEnded(self, reason):
self._process_ended = True
super(DebugServer, self).processEnded(reason)
def terminate(self):
if self._process_ended or not self._transport:
return
try:
self._transport.signalProcess("KILL")
except: # pylint: disable=bare-except
pass


@ -12,4 +12,7 @@
# See the License for the specific language governing permissions and
# limitations under the License.
from platformio.commands.device.filters.base import DeviceMonitorFilter
# pylint: disable=unused-import
from platformio.device.filters.base import (
DeviceMonitorFilterBase as DeviceMonitorFilter,
)
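The shim above keeps the old import location working, so existing filters that subclass DeviceMonitorFilter keep loading. A hypothetical custom filter, assuming the legacy import path and pyserial's miniterm Transform API (rx() receives text read from the device); the class and NAME are illustrative only:

    import datetime

    from platformio.commands.device import DeviceMonitorFilter


    class Timestamp(DeviceMonitorFilter):
        NAME = "my_timestamp"  # hypothetical name: pio device monitor -f my_timestamp

        def rx(self, text):
            # Prefix each received line with a wall-clock timestamp.
            stamp = datetime.datetime.now().strftime("%H:%M:%S.%f")[:-3]
            return "".join(
                "%s > %s" % (stamp, line) if line.strip() else line
                for line in text.splitlines(keepends=True)
            )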


@ -12,234 +12,19 @@
# See the License for the specific language governing permissions and
# limitations under the License.
import os
import sys
from fnmatch import fnmatch
import click
from serial.tools import miniterm
from platformio import exception, fs, util
from platformio.commands.device import helpers as device_helpers
from platformio.compat import dump_json_to_unicode
from platformio.platform.factory import PlatformFactory
from platformio.project.exception import NotPlatformIOProjectError
from platformio.device.commands.list import device_list_cmd
from platformio.device.commands.monitor import device_monitor_cmd
@click.group(short_help="Device manager & serial/socket monitor")
@click.group(
"device",
commands=[
device_list_cmd,
device_monitor_cmd,
],
short_help="Device manager & Serial/Socket monitor",
)
def cli():
pass
@cli.command("list", short_help="List devices")
@click.option("--serial", is_flag=True, help="List serial ports, default")
@click.option("--logical", is_flag=True, help="List logical devices")
@click.option("--mdns", is_flag=True, help="List multicast DNS services")
@click.option("--json-output", is_flag=True)
def device_list( # pylint: disable=too-many-branches
serial, logical, mdns, json_output
):
if not logical and not mdns:
serial = True
data = {}
if serial:
data["serial"] = util.get_serial_ports()
if logical:
data["logical"] = util.get_logical_devices()
if mdns:
data["mdns"] = util.get_mdns_services()
single_key = list(data)[0] if len(list(data)) == 1 else None
if json_output:
return click.echo(
dump_json_to_unicode(data[single_key] if single_key else data)
)
titles = {
"serial": "Serial Ports",
"logical": "Logical Devices",
"mdns": "Multicast DNS Services",
}
for key, value in data.items():
if not single_key:
click.secho(titles[key], bold=True)
click.echo("=" * len(titles[key]))
if key == "serial":
for item in value:
click.secho(item["port"], fg="cyan")
click.echo("-" * len(item["port"]))
click.echo("Hardware ID: %s" % item["hwid"])
click.echo("Description: %s" % item["description"])
click.echo("")
if key == "logical":
for item in value:
click.secho(item["path"], fg="cyan")
click.echo("-" * len(item["path"]))
click.echo("Name: %s" % item["name"])
click.echo("")
if key == "mdns":
for item in value:
click.secho(item["name"], fg="cyan")
click.echo("-" * len(item["name"]))
click.echo("Type: %s" % item["type"])
click.echo("IP: %s" % item["ip"])
click.echo("Port: %s" % item["port"])
if item["properties"]:
click.echo(
"Properties: %s"
% (
"; ".join(
[
"%s=%s" % (k, v)
for k, v in item["properties"].items()
]
)
)
)
click.echo("")
if single_key:
click.echo("")
return True
@cli.command("monitor", short_help="Monitor device (Serial)")
@click.option("--port", "-p", help="Port, a number or a device name")
@click.option("--baud", "-b", type=int, help="Set baud rate, default=9600")
@click.option(
"--parity",
default="N",
type=click.Choice(["N", "E", "O", "S", "M"]),
help="Set parity, default=N",
)
@click.option("--rtscts", is_flag=True, help="Enable RTS/CTS flow control, default=Off")
@click.option(
"--xonxoff", is_flag=True, help="Enable software flow control, default=Off"
)
@click.option(
"--rts", default=None, type=click.IntRange(0, 1), help="Set initial RTS line state"
)
@click.option(
"--dtr", default=None, type=click.IntRange(0, 1), help="Set initial DTR line state"
)
@click.option("--echo", is_flag=True, help="Enable local echo, default=Off")
@click.option(
"--encoding",
default="UTF-8",
help="Set the encoding for the serial port (e.g. hexlify, "
"Latin1, UTF-8), default: UTF-8",
)
@click.option("--filter", "-f", multiple=True, help="Add filters/text transformations")
@click.option(
"--eol",
default="CRLF",
type=click.Choice(["CR", "LF", "CRLF"]),
help="End of line mode, default=CRLF",
)
@click.option("--raw", is_flag=True, help="Do not apply any encodings/transformations")
@click.option(
"--exit-char",
type=int,
default=3,
help="ASCII code of special character that is used to exit "
"the application, default=3 (Ctrl+C)",
)
@click.option(
"--menu-char",
type=int,
default=20,
help="ASCII code of special character that is used to "
"control miniterm (menu), default=20 (DEC)",
)
@click.option(
"--quiet",
is_flag=True,
help="Diagnostics: suppress non-error messages, default=Off",
)
@click.option(
"-d",
"--project-dir",
default=os.getcwd,
type=click.Path(exists=True, file_okay=False, dir_okay=True, resolve_path=True),
)
@click.option(
"-e",
"--environment",
help="Load configuration from `platformio.ini` and specified environment",
)
def device_monitor(**kwargs): # pylint: disable=too-many-branches
# load default monitor filters
filters_dir = os.path.join(fs.get_source_dir(), "commands", "device", "filters")
for name in os.listdir(filters_dir):
if not name.endswith(".py"):
continue
device_helpers.load_monitor_filter(os.path.join(filters_dir, name))
project_options = {}
try:
with fs.cd(kwargs["project_dir"]):
project_options = device_helpers.get_project_options(kwargs["environment"])
kwargs = device_helpers.apply_project_monitor_options(kwargs, project_options)
except NotPlatformIOProjectError:
pass
platform = None
if "platform" in project_options:
with fs.cd(kwargs["project_dir"]):
platform = PlatformFactory.new(project_options["platform"])
device_helpers.register_platform_filters(
platform, kwargs["project_dir"], kwargs["environment"]
)
if not kwargs["port"]:
ports = util.get_serial_ports(filter_hwid=True)
if len(ports) == 1:
kwargs["port"] = ports[0]["port"]
elif "platform" in project_options and "board" in project_options:
board_hwids = device_helpers.get_board_hwids(
kwargs["project_dir"],
platform,
project_options["board"],
)
for item in ports:
for hwid in board_hwids:
hwid_str = ("%s:%s" % (hwid[0], hwid[1])).replace("0x", "")
if hwid_str in item["hwid"]:
kwargs["port"] = item["port"]
break
if kwargs["port"]:
break
elif kwargs["port"] and (set(["*", "?", "[", "]"]) & set(kwargs["port"])):
for item in util.get_serial_ports():
if fnmatch(item["port"], kwargs["port"]):
kwargs["port"] = item["port"]
break
# override system argv with patched options
sys.argv = ["monitor"] + device_helpers.options_to_argv(
kwargs,
project_options,
ignore=("port", "baud", "rts", "dtr", "environment", "project_dir"),
)
if not kwargs["quiet"]:
click.echo(
"--- Available filters and text transformations: %s"
% ", ".join(sorted(miniterm.TRANSFORMATIONS.keys()))
)
click.echo("--- More details at http://bit.ly/pio-monitor-filters")
try:
miniterm.main(
default_port=kwargs["port"],
default_baudrate=kwargs["baud"] or 9600,
default_rts=kwargs["rts"],
default_dtr=kwargs["dtr"],
)
except Exception as e:
raise exception.MinitermException(e)


@ -1,42 +0,0 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from serial.tools import miniterm
from platformio.project.config import ProjectConfig
class DeviceMonitorFilter(miniterm.Transform):
def __init__(self, project_dir=None, environment=None):
""" Called by PlatformIO to pass context """
miniterm.Transform.__init__(self)
self.project_dir = project_dir
self.environment = environment
self.config = ProjectConfig.get_instance()
if not self.environment:
default_envs = self.config.default_envs()
if default_envs:
self.environment = default_envs[0]
elif self.config.envs():
self.environment = self.config.envs()[0]
def __call__(self):
""" Called by the miniterm library when the filter is actually used """
return self
@property
def NAME(self):
raise NotImplementedError("Please declare NAME attribute for the filter class")


@ -1,106 +0,0 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import inspect
import os
from serial.tools import miniterm
from platformio import fs
from platformio.commands.device import DeviceMonitorFilter
from platformio.compat import get_object_members, load_python_module
from platformio.project.config import ProjectConfig
def apply_project_monitor_options(cli_options, project_options):
for k in ("port", "speed", "rts", "dtr"):
k2 = "monitor_%s" % k
if k == "speed":
k = "baud"
if cli_options[k] is None and k2 in project_options:
cli_options[k] = project_options[k2]
if k != "port":
cli_options[k] = int(cli_options[k])
return cli_options
def options_to_argv(cli_options, project_options, ignore=None):
confmon_flags = project_options.get("monitor_flags", [])
result = confmon_flags[::]
for f in project_options.get("monitor_filters", []):
result.extend(["--filter", f])
for k, v in cli_options.items():
if v is None or (ignore and k in ignore):
continue
k = "--" + k.replace("_", "-")
if k in confmon_flags:
continue
if isinstance(v, bool):
if v:
result.append(k)
elif isinstance(v, tuple):
for i in v:
result.extend([k, i])
else:
result.extend([k, str(v)])
return result
def get_project_options(environment=None):
config = ProjectConfig.get_instance()
config.validate(envs=[environment] if environment else None)
if not environment:
default_envs = config.default_envs()
if default_envs:
environment = default_envs[0]
else:
environment = config.envs()[0]
return config.items(env=environment, as_dict=True)
def get_board_hwids(project_dir, platform, board):
with fs.cd(project_dir):
return platform.board_config(board).get("build.hwids", [])
def load_monitor_filter(path, project_dir=None, environment=None):
name = os.path.basename(path)
name = name[: name.find(".")]
module = load_python_module("platformio.commands.device.filters.%s" % name, path)
for cls in get_object_members(module).values():
if (
not inspect.isclass(cls)
or not issubclass(cls, DeviceMonitorFilter)
or cls == DeviceMonitorFilter
):
continue
obj = cls(project_dir, environment)
miniterm.TRANSFORMATIONS[obj.NAME] = obj
return True
def register_platform_filters(platform, project_dir, environment):
monitor_dir = os.path.join(platform.get_dir(), "monitor")
if not os.path.isdir(monitor_dir):
return
for name in os.listdir(monitor_dir):
if not name.startswith("filter_") or not name.endswith(".py"):
continue
path = os.path.join(monitor_dir, name)
if not os.path.isfile(path):
continue
load_monitor_filter(path, project_dir, environment)
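options_to_argv() above rebuilds a miniterm argv from the merged CLI and project options. A trimmed, standalone sketch of the translation rule (booleans become bare flags, tuples repeat the flag, everything else becomes a flag/value pair):

    def options_to_argv(options, ignore=()):
        argv = []
        for key, value in options.items():
            if value is None or key in ignore:
                continue
            flag = "--" + key.replace("_", "-")
            if isinstance(value, bool):
                if value:
                    argv.append(flag)
            elif isinstance(value, tuple):
                for item in value:
                    argv.extend([flag, str(item)])
            else:
                argv.extend([flag, str(value)])
        return argv

    print(options_to_argv({"baud": 115200, "echo": True, "filter": ("time", "log2file")}))
    # ['--baud', '115200', '--echo', '--filter', 'time', '--filter', 'log2file']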


@ -12,20 +12,15 @@
# See the License for the specific language governing permissions and
# limitations under the License.
# pylint: disable=too-many-locals,too-many-statements
import mimetypes
import socket
from os.path import isdir
import click
from platformio import exception
from platformio.compat import WINDOWS
from platformio.package.manager.core import get_core_package_dir, inject_contrib_pysite
from platformio.commands.home.helpers import is_port_used
from platformio.commands.home.run import run_server
@click.command("home", short_help="UI to manage PlatformIO")
@click.command("home", short_help="GUI to manage PlatformIO")
@click.option("--port", type=int, default=8008, help="HTTP port, default=8008")
@click.option(
"--host",
@ -45,61 +40,28 @@ from platformio.package.manager.core import get_core_package_dir, inject_contrib
"are connected. Default is 0 which means never auto shutdown"
),
)
def cli(port, host, no_open, shutdown_timeout):
# pylint: disable=import-error, import-outside-toplevel
# import contrib modules
inject_contrib_pysite()
from autobahn.twisted.resource import WebSocketResource
from twisted.internet import reactor
from twisted.web import server
from twisted.internet.error import CannotListenError
from platformio.commands.home.rpc.handlers.app import AppRPC
from platformio.commands.home.rpc.handlers.ide import IDERPC
from platformio.commands.home.rpc.handlers.misc import MiscRPC
from platformio.commands.home.rpc.handlers.os import OSRPC
from platformio.commands.home.rpc.handlers.piocore import PIOCoreRPC
from platformio.commands.home.rpc.handlers.project import ProjectRPC
from platformio.commands.home.rpc.handlers.account import AccountRPC
from platformio.commands.home.rpc.server import JSONRPCServerFactory
from platformio.commands.home.web import WebRoot
factory = JSONRPCServerFactory(shutdown_timeout)
factory.addHandler(AppRPC(), namespace="app")
factory.addHandler(IDERPC(), namespace="ide")
factory.addHandler(MiscRPC(), namespace="misc")
factory.addHandler(OSRPC(), namespace="os")
factory.addHandler(PIOCoreRPC(), namespace="core")
factory.addHandler(ProjectRPC(), namespace="project")
factory.addHandler(AccountRPC(), namespace="account")
contrib_dir = get_core_package_dir("contrib-piohome")
if not isdir(contrib_dir):
raise exception.PlatformioException("Invalid path to PIO Home Contrib")
@click.option(
"--session-id",
help=(
"A unique session identifier to keep PIO Home isolated from other instances "
"and protect from 3rd party access"
),
)
def cli(port, host, no_open, shutdown_timeout, session_id):
# Ensure PIO Home mimetypes are known
mimetypes.add_type("text/html", ".html")
mimetypes.add_type("text/css", ".css")
mimetypes.add_type("application/javascript", ".js")
root = WebRoot(contrib_dir)
root.putChild(b"wsrpc", WebSocketResource(factory))
site = server.Site(root)
# hook for `platformio-node-helpers`
if host == "__do_not_start__":
return
already_started = is_port_used(host, port)
home_url = "http://%s:%d" % (host, port)
if not no_open:
if already_started:
click.launch(home_url)
else:
reactor.callLater(1, lambda: click.launch(home_url))
home_url = "http://%s:%d%s" % (
host,
port,
("/session/%s/" % session_id) if session_id else "/",
)
click.echo(
"\n".join(
[
@ -108,45 +70,25 @@ def cli(port, host, no_open, shutdown_timeout):
" /\\-_--\\ PlatformIO Home",
"/ \\_-__\\",
"|[]| [] | %s" % home_url,
"|__|____|______________%s" % ("_" * len(host)),
"|__|____|__%s" % ("_" * len(home_url)),
]
)
)
click.echo("")
click.echo("Open PlatformIO Home in your browser by this URL => %s" % home_url)
try:
reactor.listenTCP(port, site, interface=host)
except CannotListenError as e:
click.secho(str(e), fg="red", err=True)
already_started = True
if already_started:
if is_port_used(host, port):
click.secho(
"PlatformIO Home server is already started in another process.", fg="yellow"
)
if not no_open:
click.launch(home_url)
return
click.echo("PIO Home has been started. Press Ctrl+C to shutdown.")
reactor.run()
def is_port_used(host, port):
socket.setdefaulttimeout(1)
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
if WINDOWS:
try:
s.bind((host, port))
s.close()
return False
except (OSError, socket.error):
pass
else:
try:
s.connect((host, port))
s.close()
except socket.error:
return False
return True
run_server(
host=host,
port=port,
no_open=no_open,
shutdown_timeout=shutdown_timeout,
home_url=home_url,
)


@ -12,40 +12,49 @@
# See the License for the specific language governing permissions and
# limitations under the License.
# pylint: disable=keyword-arg-before-vararg,arguments-differ,signature-differs
import socket
import requests
from twisted.internet import defer # pylint: disable=import-error
from twisted.internet import reactor # pylint: disable=import-error
from twisted.internet import threads # pylint: disable=import-error
from starlette.concurrency import run_in_threadpool
from platformio import util
from platformio.compat import IS_WINDOWS
from platformio.proc import where_is_program
class AsyncSession(requests.Session):
def __init__(self, n=None, *args, **kwargs):
if n:
pool = reactor.getThreadPool()
pool.adjustPoolsize(0, n)
super(AsyncSession, self).__init__(*args, **kwargs)
def request(self, *args, **kwargs):
func = super(AsyncSession, self).request
return threads.deferToThread(func, *args, **kwargs)
def wrap(self, *args, **kwargs): # pylint: disable=no-self-use
return defer.ensureDeferred(*args, **kwargs)
async def request( # pylint: disable=signature-differs,invalid-overridden-method
self, *args, **kwargs
):
func = super().request
return await run_in_threadpool(func, *args, **kwargs)
@util.memoized(expire="60s")
def requests_session():
return AsyncSession(n=5)
return AsyncSession()
@util.memoized(expire="60s")
def get_core_fullpath():
return where_is_program(
"platformio" + (".exe" if "windows" in util.get_systype() else "")
)
return where_is_program("platformio" + (".exe" if IS_WINDOWS else ""))
def is_port_used(host, port):
socket.setdefaulttimeout(1)
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
if IS_WINDOWS:
try:
s.bind((host, port))
s.close()
return False
except (OSError, socket.error):
pass
else:
try:
s.connect((host, port))
s.close()
except socket.error:
return False
return True
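The reworked AsyncSession above replaces Twisted's deferToThread with starlette's run_in_threadpool, which turns requests' blocking Session.request into an awaitable. A hedged usage sketch (the URL and timeout are placeholders, not values from the diff):

    import asyncio

    import requests
    from starlette.concurrency import run_in_threadpool


    class AsyncSession(requests.Session):
        async def request(self, *args, **kwargs):
            # Run the blocking requests call in a worker thread and await it.
            return await run_in_threadpool(super().request, *args, **kwargs)


    async def main():
        session = AsyncSession()
        resp = await session.get("https://example.com", timeout=10)  # placeholder URL
        print(resp.status_code)

    asyncio.run(main())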


@ -12,18 +12,18 @@
# See the License for the specific language governing permissions and
# limitations under the License.
import jsonrpc # pylint: disable=import-error
from ajsonrpc.core import JSONRPC20DispatchException
from platformio.clients.account import AccountClient
class AccountRPC(object):
class AccountRPC:
@staticmethod
def call_client(method, *args, **kwargs):
try:
client = AccountClient()
return getattr(client, method)(*args, **kwargs)
except Exception as e: # pylint: disable=bare-except
raise jsonrpc.exceptions.JSONRPCDispatchException(
raise JSONRPC20DispatchException(
code=4003, message="PIO Account Call Error", data=str(e)
)


@ -14,15 +14,15 @@
from __future__ import absolute_import
from os.path import join
import os
from pathlib import Path
from platformio import __version__, app, fs, util
from platformio.project.helpers import get_project_core_dir, is_platformio_project
from platformio.project.config import ProjectConfig
from platformio.project.helpers import is_platformio_project
class AppRPC(object):
APPSTATE_PATH = join(get_project_core_dir(), "homestate.json")
class AppRPC:
IGNORE_STORAGE_KEYS = [
"cid",
@ -34,9 +34,16 @@ class AppRPC(object):
"projectsDir",
]
@staticmethod
def get_state_path():
core_dir = ProjectConfig.get_instance().get("platformio", "core_dir")
if not os.path.isdir(core_dir):
os.makedirs(core_dir)
return os.path.join(core_dir, "homestate.json")
@staticmethod
def load_state():
with app.State(AppRPC.APPSTATE_PATH, lock=True) as state:
with app.State(AppRPC.get_state_path(), lock=True) as state:
storage = state.get("storage", {})
# base data
@ -58,9 +65,13 @@ class AppRPC(object):
storage["projectsDir"] = storage["coreSettings"]["projects_dir"]["value"]
# skip non-existing recent projects
storage["recentProjects"] = [
p for p in storage.get("recentProjects", []) if is_platformio_project(p)
]
storage["recentProjects"] = list(
set(
str(Path(p).resolve())
for p in storage.get("recentProjects", [])
if is_platformio_project(p)
)
)
state["storage"] = storage
state.modified = False # skip saving extra fields
@ -72,7 +83,7 @@ class AppRPC(object):
@staticmethod
def save_state(state):
with app.State(AppRPC.APPSTATE_PATH, lock=True) as s:
with app.State(AppRPC.get_state_path(), lock=True) as s:
s.clear()
s.update(state)
storage = s.get("storage", {})
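The load_state() change above normalizes recentProjects: each entry is resolved to an absolute path, duplicates are dropped, and only directories that still look like PlatformIO projects are kept. A standalone approximation (here a project is detected simply by the presence of platformio.ini, standing in for is_platformio_project):

    import os
    from pathlib import Path

    def normalize_recent_projects(paths):
        # Resolve, deduplicate, and keep only existing project directories.
        return list(
            {
                str(Path(p).resolve())
                for p in paths
                if os.path.isfile(os.path.join(p, "platformio.ini"))
            }
        )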


@ -13,35 +13,73 @@
# limitations under the License.
import time
from pathlib import Path
import jsonrpc # pylint: disable=import-error
from twisted.internet import defer # pylint: disable=import-error
from ajsonrpc.core import JSONRPC20DispatchException
from platformio.compat import aio_get_running_loop
class IDERPC(object):
class IDERPC:
COMMAND_TIMEOUT = 1.5 # in seconds
def __init__(self):
self._queue = {}
self._ide_queue = []
self._cmd_queue = {}
def send_command(self, sid, command, params):
if not self._queue.get(sid):
raise jsonrpc.exceptions.JSONRPCDispatchException(
code=4005, message="PIO Home IDE agent is not started"
)
while self._queue[sid]:
self._queue[sid].pop().callback(
{"id": time.time(), "method": command, "params": params}
)
async def listen_commands(self):
f = aio_get_running_loop().create_future()
self._ide_queue.append(f)
self._process_commands()
return await f
def listen_commands(self, sid=0):
if sid not in self._queue:
self._queue[sid] = []
self._queue[sid].append(defer.Deferred())
return self._queue[sid][-1]
def open_project(self, sid, project_dir):
return self.send_command(sid, "open_project", project_dir)
def open_text_document(self, sid, path, line=None, column=None):
return self.send_command(
sid, "open_text_document", dict(path=path, line=line, column=column)
async def send_command(self, command, params=None):
cmd_id = f"ide-{command}-{time.time()}"
self._cmd_queue[cmd_id] = {
"method": command,
"params": params,
"time": time.time(),
"future": aio_get_running_loop().create_future(),
}
self._process_commands()
# in case the IDE agent has not been started
aio_get_running_loop().call_later(
self.COMMAND_TIMEOUT + 0.1, self._process_commands
)
return await self._cmd_queue[cmd_id]["future"]
def on_command_result(self, cmd_id, value):
if cmd_id not in self._cmd_queue:
return
if self._cmd_queue[cmd_id]["method"] == "get_pio_project_dirs":
value = [str(Path(p).resolve()) for p in value]
self._cmd_queue[cmd_id]["future"].set_result(value)
del self._cmd_queue[cmd_id]
def _process_commands(self):
for cmd_id in list(self._cmd_queue):
cmd_data = self._cmd_queue[cmd_id]
if cmd_data["future"].done():
del self._cmd_queue[cmd_id]
continue
if (
not self._ide_queue
and (time.time() - cmd_data["time"]) > self.COMMAND_TIMEOUT
):
cmd_data["future"].set_exception(
JSONRPC20DispatchException(
code=4005, message="PIO Home IDE agent is not started"
)
)
continue
while self._ide_queue:
self._ide_queue.pop().set_result(
{
"id": cmd_id,
"method": cmd_data["method"],
"params": cmd_data["params"],
}
)


@ -15,14 +15,13 @@
import json
import time
from twisted.internet import defer, reactor # pylint: disable=import-error
from platformio.cache import ContentCache
from platformio.commands.home.rpc.handlers.os import OSRPC
from platformio.compat import aio_create_task
class MiscRPC(object):
def load_latest_tweets(self, data_url):
class MiscRPC:
async def load_latest_tweets(self, data_url):
cache_key = ContentCache.key_from_args(data_url, "tweets")
cache_valid = "180d"
with ContentCache() as cc:
@ -31,22 +30,20 @@ class MiscRPC(object):
cache_data = json.loads(cache_data)
# automatically update cache in background every 12 hours
if cache_data["time"] < (time.time() - (3600 * 12)):
reactor.callLater(
5, self._preload_latest_tweets, data_url, cache_key, cache_valid
aio_create_task(
self._preload_latest_tweets(data_url, cache_key, cache_valid)
)
return cache_data["result"]
result = self._preload_latest_tweets(data_url, cache_key, cache_valid)
return result
return await self._preload_latest_tweets(data_url, cache_key, cache_valid)
@staticmethod
@defer.inlineCallbacks
def _preload_latest_tweets(data_url, cache_key, cache_valid):
result = json.loads((yield OSRPC.fetch_content(data_url)))
async def _preload_latest_tweets(data_url, cache_key, cache_valid):
result = json.loads((await OSRPC.fetch_content(data_url)))
with ContentCache() as cc:
cc.set(
cache_key,
json.dumps({"time": int(time.time()), "result": result}),
cache_valid,
)
defer.returnValue(result)
return result
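load_latest_tweets() now returns stale cache entries immediately and refreshes them in the background with aio_create_task() instead of Twisted's reactor.callLater(). A rough sketch of that stale-while-revalidate pattern over a plain dict (the helper names are illustrative, not the PlatformIO API):

import asyncio
import time

async def cached_fetch(cache, key, fetch, max_age=12 * 3600):
    entry = cache.get(key)
    if entry is None:
        # nothing cached yet: the caller has to wait for the first fetch
        return await _refresh(cache, key, fetch)
    if time.time() - entry["time"] > max_age:
        # stale entry: answer right away, refresh in the background
        asyncio.get_running_loop().create_task(_refresh(cache, key, fetch))
    return entry["result"]

async def _refresh(cache, key, fetch):
    result = await fetch()
    cache[key] = {"time": time.time(), "result": result}
    return result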


@ -14,25 +14,24 @@
from __future__ import absolute_import
import glob
import io
import os
import shutil
from functools import cmp_to_key
import click
from twisted.internet import defer # pylint: disable=import-error
from platformio import __default_requests_timeout__, fs, util
from platformio import __default_requests_timeout__, fs
from platformio.cache import ContentCache
from platformio.clients.http import ensure_internet_on
from platformio.commands.home import helpers
from platformio.compat import PY2, get_filesystem_encoding, glob_recursive
from platformio.device.list import list_logical_devices
class OSRPC(object):
class OSRPC:
@staticmethod
@defer.inlineCallbacks
def fetch_content(uri, data=None, headers=None, cache_valid=None):
async def fetch_content(uri, data=None, headers=None, cache_valid=None):
if not headers:
headers = {
"User-Agent": (
@ -46,18 +45,18 @@ class OSRPC(object):
if cache_key:
result = cc.get(cache_key)
if result is not None:
defer.returnValue(result)
return result
# check the internet connection first to avoid hanging on the 60-second request timeout
ensure_internet_on(raise_exception=True)
session = helpers.requests_session()
if data:
r = yield session.post(
r = await session.post(
uri, data=data, headers=headers, timeout=__default_requests_timeout__
)
else:
r = yield session.get(
r = await session.get(
uri, headers=headers, timeout=__default_requests_timeout__
)
@ -66,11 +65,11 @@ class OSRPC(object):
if cache_valid:
with ContentCache() as cc:
cc.set(cache_key, result, cache_valid)
defer.returnValue(result)
return result
def request_content(self, uri, data=None, headers=None, cache_valid=None):
async def request_content(self, uri, data=None, headers=None, cache_valid=None):
if uri.startswith("http"):
return self.fetch_content(uri, data, headers, cache_valid)
return await self.fetch_content(uri, data, headers, cache_valid)
if os.path.isfile(uri):
with io.open(uri, encoding="utf-8") as fp:
return fp.read()
@ -82,13 +81,11 @@ class OSRPC(object):
@staticmethod
def reveal_file(path):
return click.launch(
path.encode(get_filesystem_encoding()) if PY2 else path, locate=True
)
return click.launch(path, locate=True)
@staticmethod
def open_file(path):
return click.launch(path.encode(get_filesystem_encoding()) if PY2 else path)
return click.launch(path)
@staticmethod
def is_file(path):
@ -121,7 +118,9 @@ class OSRPC(object):
result = set()
for pathname in pathnames:
result |= set(
glob_recursive(os.path.join(root, pathname) if root else pathname)
glob.glob(
os.path.join(root, pathname) if root else pathname, recursive=True
)
)
return list(result)
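The compat helper glob_recursive() is dropped in favor of the stdlib glob.glob(pattern, recursive=True), which expands "**" across nested directories. For example (paths and patterns are illustrative):

import glob
import os

def expand_patterns(root, patterns):
    # recursive=True makes "**" match any number of nested directories
    result = set()
    for pattern in patterns:
        result |= set(glob.glob(os.path.join(root, pattern), recursive=True))
    return sorted(result)

print(expand_patterns("src", ["**/*.c", "**/*.cpp"]))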
@ -156,7 +155,7 @@ class OSRPC(object):
@staticmethod
def get_logical_devices():
items = []
for item in util.get_logical_devices():
for item in list_logical_devices():
if item["name"]:
item["name"] = item["name"]
items.append(item)


@ -14,51 +14,37 @@
from __future__ import absolute_import
import io
import json
import os
import sys
from io import BytesIO, StringIO
import threading
import click
import jsonrpc # pylint: disable=import-error
from twisted.internet import defer # pylint: disable=import-error
from twisted.internet import threads # pylint: disable=import-error
from twisted.internet import utils # pylint: disable=import-error
from ajsonrpc.core import JSONRPC20DispatchException
from starlette.concurrency import run_in_threadpool
from platformio import __main__, __version__, fs
from platformio import __main__, __version__, fs, proc
from platformio.commands.home import helpers
from platformio.compat import (
PY2,
get_filesystem_encoding,
get_locale_encoding,
is_bytes,
string_types,
)
try:
from thread import get_ident as thread_get_ident
except ImportError:
from threading import get_ident as thread_get_ident
from platformio.compat import get_locale_encoding, is_bytes
class MultiThreadingStdStream(object):
def __init__(self, parent_stream):
self._buffers = {thread_get_ident(): parent_stream}
self._buffers = {threading.get_ident(): parent_stream}
def __getattr__(self, name):
thread_id = thread_get_ident()
thread_id = threading.get_ident()
self._ensure_thread_buffer(thread_id)
return getattr(self._buffers[thread_id], name)
def _ensure_thread_buffer(self, thread_id):
if thread_id not in self._buffers:
self._buffers[thread_id] = BytesIO() if PY2 else StringIO()
self._buffers[thread_id] = io.StringIO()
def write(self, value):
thread_id = thread_get_ident()
thread_id = threading.get_ident()
self._ensure_thread_buffer(thread_id)
if PY2 and isinstance(value, unicode): # pylint: disable=undefined-variable
value = value.encode()
return self._buffers[thread_id].write(
value.decode() if is_bytes(value) else value
)
@ -74,7 +60,7 @@ class MultiThreadingStdStream(object):
return result
class PIOCoreRPC(object):
class PIOCoreRPC:
@staticmethod
def version():
return __version__
@ -89,16 +75,9 @@ class PIOCoreRPC(object):
sys.stderr = PIOCoreRPC.thread_stderr
@staticmethod
def call(args, options=None):
return defer.maybeDeferred(PIOCoreRPC._call_generator, args, options)
@staticmethod
@defer.inlineCallbacks
def _call_generator(args, options=None):
async def call(args, options=None):
for i, arg in enumerate(args):
if isinstance(arg, string_types):
args[i] = arg.encode(get_filesystem_encoding()) if PY2 else arg
else:
if not isinstance(arg, str):
args[i] = str(arg)
options = options or {}
@ -106,27 +85,34 @@ class PIOCoreRPC(object):
try:
if options.get("force_subprocess"):
result = yield PIOCoreRPC._call_subprocess(args, options)
defer.returnValue(PIOCoreRPC._process_result(result, to_json))
else:
result = yield PIOCoreRPC._call_inline(args, options)
try:
defer.returnValue(PIOCoreRPC._process_result(result, to_json))
except ValueError:
# fall-back to subprocess method
result = yield PIOCoreRPC._call_subprocess(args, options)
defer.returnValue(PIOCoreRPC._process_result(result, to_json))
result = await PIOCoreRPC._call_subprocess(args, options)
return PIOCoreRPC._process_result(result, to_json)
result = await PIOCoreRPC._call_inline(args, options)
try:
return PIOCoreRPC._process_result(result, to_json)
except ValueError:
# fall-back to subprocess method
result = await PIOCoreRPC._call_subprocess(args, options)
return PIOCoreRPC._process_result(result, to_json)
except Exception as e: # pylint: disable=bare-except
raise jsonrpc.exceptions.JSONRPCDispatchException(
raise JSONRPC20DispatchException(
code=4003, message="PIO Core Call Error", data=str(e)
)
@staticmethod
def _call_inline(args, options):
PIOCoreRPC.setup_multithreading_std_streams()
cwd = options.get("cwd") or os.getcwd()
async def _call_subprocess(args, options):
result = await run_in_threadpool(
proc.exec_command,
[helpers.get_core_fullpath()] + args,
cwd=options.get("cwd") or os.getcwd(),
)
return (result["out"], result["err"], result["returncode"])
def _thread_task():
@staticmethod
async def _call_inline(args, options):
PIOCoreRPC.setup_multithreading_std_streams()
def _thread_safe_call(args, cwd):
with fs.cd(cwd):
exit_code = __main__.main(["-c"] + args)
return (
@ -135,16 +121,8 @@ class PIOCoreRPC(object):
exit_code,
)
return threads.deferToThread(_thread_task)
@staticmethod
def _call_subprocess(args, options):
cwd = (options or {}).get("cwd") or os.getcwd()
return utils.getProcessOutputAndValue(
helpers.get_core_fullpath(),
args,
path=cwd,
env={k: v for k, v in os.environ.items() if "%" not in k},
return await run_in_threadpool(
_thread_safe_call, args=args, cwd=options.get("cwd") or os.getcwd()
)
@staticmethod

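PIOCoreRPC.call() is now a coroutine: the blocking work, whether a separate PIO Core subprocess or an in-process __main__.main() call, is pushed to a worker thread through starlette.concurrency.run_in_threadpool, replacing threads.deferToThread and utils.getProcessOutputAndValue. A minimal sketch of off-loading a blocking call this way (the command being run is just an example):

import asyncio
import subprocess
import sys

from starlette.concurrency import run_in_threadpool

def run_blocking(args):
    # plain blocking code; it executes on a worker thread, not the event loop
    proc = subprocess.run(args, capture_output=True, text=True)
    return proc.stdout, proc.stderr, proc.returncode

async def main():
    out, err, code = await run_in_threadpool(run_blocking, [sys.executable, "--version"])
    print(code, out or err)

asyncio.run(main())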

@ -18,21 +18,20 @@ import os
import shutil
import time
import jsonrpc # pylint: disable=import-error
from ajsonrpc.core import JSONRPC20DispatchException
from platformio import exception, fs
from platformio.commands.home.rpc.handlers.app import AppRPC
from platformio.commands.home.rpc.handlers.piocore import PIOCoreRPC
from platformio.compat import PY2, get_filesystem_encoding
from platformio.ide.projectgenerator import ProjectGenerator
from platformio.package.manager.platform import PlatformPackageManager
from platformio.project.config import ProjectConfig
from platformio.project.exception import ProjectError
from platformio.project.generator import ProjectGenerator
from platformio.project.helpers import get_project_dir, is_platformio_project
from platformio.project.options import get_config_options_schema
class ProjectRPC(object):
class ProjectRPC:
@staticmethod
def config_call(init_kwargs, method, *args):
assert isinstance(init_kwargs, dict)
@ -82,7 +81,7 @@ class ProjectRPC(object):
data["description"] = config.get("platformio", "description")
data["libExtraDirs"].extend(config.get("platformio", "lib_extra_dirs", []))
libdeps_dir = config.get_optional_dir("libdeps")
libdeps_dir = config.get("platformio", "libdeps_dir")
for section in config.sections():
if not section.startswith("env:"):
continue
@ -94,7 +93,7 @@ class ProjectRPC(object):
# skip non existing folders and resolve full path
for key in ("envLibdepsDirs", "libExtraDirs"):
data[key] = [
fs.expanduser(d) if d.startswith("~") else os.path.realpath(d)
fs.expanduser(d) if d.startswith("~") else os.path.abspath(d)
for d in data[key]
if os.path.isdir(d)
]
@ -185,7 +184,7 @@ class ProjectRPC(object):
)
return sorted(result, key=lambda data: data["platform"]["title"])
def init(self, board, framework, project_dir):
async def init(self, board, framework, project_dir):
assert project_dir
state = AppRPC.load_state()
if not os.path.isdir(project_dir):
@ -198,14 +197,13 @@ class ProjectRPC(object):
and state["storage"]["coreCaller"] in ProjectGenerator.get_supported_ides()
):
args.extend(["--ide", state["storage"]["coreCaller"]])
d = PIOCoreRPC.call(
await PIOCoreRPC.call(
args, options={"cwd": project_dir, "force_subprocess": True}
)
d.addCallback(self._generate_project_main, project_dir, framework)
return d
return self._generate_project_main(project_dir, board, framework)
@staticmethod
def _generate_project_main(_, project_dir, framework):
def _generate_project_main(project_dir, board, framework):
main_content = None
if framework == "arduino":
main_content = "\n".join(
@ -240,39 +238,51 @@ class ProjectRPC(object):
)
if not main_content:
return project_dir
is_cpp_project = True
pm = PlatformPackageManager()
try:
board = pm.board_config(board)
platforms = board.get("platforms", board.get("platform"))
if not isinstance(platforms, list):
platforms = [platforms]
c_based_platforms = ["intel_mcs51", "ststm8"]
is_cpp_project = not (set(platforms) & set(c_based_platforms))
except exception.PlatformioException:
pass
with fs.cd(project_dir):
config = ProjectConfig()
src_dir = config.get_optional_dir("src")
main_path = os.path.join(src_dir, "main.cpp")
src_dir = config.get("platformio", "src_dir")
main_path = os.path.join(
src_dir, "main.%s" % ("cpp" if is_cpp_project else "c")
)
if os.path.isfile(main_path):
return project_dir
if not os.path.isdir(src_dir):
os.makedirs(src_dir)
with open(main_path, "w") as fp:
with open(main_path, mode="w", encoding="utf8") as fp:
fp.write(main_content.strip())
return project_dir
def import_arduino(self, board, use_arduino_libs, arduino_project_dir):
@staticmethod
async def import_arduino(board, use_arduino_libs, arduino_project_dir):
board = str(board)
if arduino_project_dir and PY2:
arduino_project_dir = arduino_project_dir.encode(get_filesystem_encoding())
# don't import PIO Project
if is_platformio_project(arduino_project_dir):
return arduino_project_dir
is_arduino_project = any(
[
os.path.isfile(
os.path.join(
arduino_project_dir,
"%s.%s" % (os.path.basename(arduino_project_dir), ext),
)
os.path.isfile(
os.path.join(
arduino_project_dir,
"%s.%s" % (os.path.basename(arduino_project_dir), ext),
)
for ext in ("ino", "pde")
]
)
for ext in ("ino", "pde")
)
if not is_arduino_project:
raise jsonrpc.exceptions.JSONRPCDispatchException(
raise JSONRPC20DispatchException(
code=4000, message="Not an Arduino project: %s" % arduino_project_dir
)
@ -293,26 +303,21 @@ class ProjectRPC(object):
and state["storage"]["coreCaller"] in ProjectGenerator.get_supported_ides()
):
args.extend(["--ide", state["storage"]["coreCaller"]])
d = PIOCoreRPC.call(
await PIOCoreRPC.call(
args, options={"cwd": project_dir, "force_subprocess": True}
)
d.addCallback(self._finalize_arduino_import, project_dir, arduino_project_dir)
return d
@staticmethod
def _finalize_arduino_import(_, project_dir, arduino_project_dir):
with fs.cd(project_dir):
config = ProjectConfig()
src_dir = config.get_optional_dir("src")
src_dir = config.get("platformio", "src_dir")
if os.path.isdir(src_dir):
fs.rmtree(src_dir)
shutil.copytree(arduino_project_dir, src_dir, symlinks=True)
return project_dir
@staticmethod
def import_pio(project_dir):
async def import_pio(project_dir):
if not project_dir or not is_platformio_project(project_dir):
raise jsonrpc.exceptions.JSONRPCDispatchException(
raise JSONRPC20DispatchException(
code=4001, message="Not a PlatformIO project: %s" % project_dir
)
new_project_dir = os.path.join(
@ -328,8 +333,7 @@ class ProjectRPC(object):
and state["storage"]["coreCaller"] in ProjectGenerator.get_supported_ides()
):
args.extend(["--ide", state["storage"]["coreCaller"]])
d = PIOCoreRPC.call(
await PIOCoreRPC.call(
args, options={"cwd": new_project_dir, "force_subprocess": True}
)
d.addCallback(lambda _: new_project_dir)
return d
return new_project_dir
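Project initialization now picks the generated main file extension from the board's platform: boards whose platform is in c_based_platforms get src/main.c, everything else gets src/main.cpp. A hedged sketch of just that decision (the helper below is illustrative and skips the PlatformPackageManager lookup):

import os

C_BASED_PLATFORMS = {"intel_mcs51", "ststm8"}

def write_project_main(src_dir, platforms, content):
    # use C when the board targets one of the C-only platforms, C++ otherwise
    is_cpp = not (set(platforms) & C_BASED_PLATFORMS)
    main_path = os.path.join(src_dir, "main.cpp" if is_cpp else "main.c")
    if not os.path.isfile(main_path):
        os.makedirs(src_dir, exist_ok=True)
        with open(main_path, mode="w", encoding="utf8") as fp:
            fp.write(content.strip() + "\n")
    return main_path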


@ -12,90 +12,86 @@
# See the License for the specific language governing permissions and
# limitations under the License.
# pylint: disable=import-error
import click
import jsonrpc
from autobahn.twisted.websocket import WebSocketServerFactory, WebSocketServerProtocol
from jsonrpc.exceptions import JSONRPCDispatchException
from twisted.internet import defer, reactor
from ajsonrpc.dispatcher import Dispatcher
from ajsonrpc.manager import AsyncJSONRPCResponseManager
from starlette.endpoints import WebSocketEndpoint
from platformio.compat import PY2, dump_json_to_unicode, is_bytes
from platformio.compat import aio_create_task, aio_get_running_loop
from platformio.proc import force_exit
class JSONRPCServerProtocol(WebSocketServerProtocol):
def onOpen(self):
self.factory.connection_nums += 1
if self.factory.shutdown_timer:
self.factory.shutdown_timer.cancel()
self.factory.shutdown_timer = None
class JSONRPCServerFactoryBase:
def onClose(self, wasClean, code, reason): # pylint: disable=unused-argument
self.factory.connection_nums -= 1
if self.factory.connection_nums == 0:
self.factory.shutdownByTimeout()
def onMessage(self, payload, isBinary): # pylint: disable=unused-argument
# click.echo("> %s" % payload)
response = jsonrpc.JSONRPCResponseManager.handle(
payload, self.factory.dispatcher
).data
# if error
if "result" not in response:
self.sendJSONResponse(response)
return None
d = defer.maybeDeferred(lambda: response["result"])
d.addCallback(self._callback, response)
d.addErrback(self._errback, response)
return None
def _callback(self, result, response):
response["result"] = result
self.sendJSONResponse(response)
def _errback(self, failure, response):
if isinstance(failure.value, JSONRPCDispatchException):
e = failure.value
else:
e = JSONRPCDispatchException(code=4999, message=failure.getErrorMessage())
del response["result"]
response["error"] = e.error._data # pylint: disable=protected-access
self.sendJSONResponse(response)
def sendJSONResponse(self, response):
# click.echo("< %s" % response)
if "error" in response:
click.secho("Error: %s" % response["error"], fg="red", err=True)
response = dump_json_to_unicode(response)
if not PY2 and not is_bytes(response):
response = response.encode("utf-8")
self.sendMessage(response)
class JSONRPCServerFactory(WebSocketServerFactory):
protocol = JSONRPCServerProtocol
connection_nums = 0
shutdown_timer = 0
shutdown_timer = None
def __init__(self, shutdown_timeout=0):
super(JSONRPCServerFactory, self).__init__()
self.shutdown_timeout = shutdown_timeout
self.dispatcher = jsonrpc.Dispatcher()
self.manager = AsyncJSONRPCResponseManager(
Dispatcher(), is_server_error_verbose=True
)
def shutdownByTimeout(self):
def __call__(self, *args, **kwargs):
raise NotImplementedError
def add_object_handler(self, handler, namespace):
self.manager.dispatcher.add_object(handler, prefix="%s." % namespace)
def on_client_connect(self):
self.connection_nums += 1
if self.shutdown_timer:
self.shutdown_timer.cancel()
self.shutdown_timer = None
def on_client_disconnect(self):
self.connection_nums -= 1
if self.connection_nums < 1:
self.connection_nums = 0
if self.connection_nums == 0:
self.shutdown_by_timeout()
async def on_shutdown(self):
pass
def shutdown_by_timeout(self):
if self.shutdown_timeout < 1:
return
def _auto_shutdown_server():
click.echo("Automatically shutdown server on timeout")
reactor.stop()
force_exit()
self.shutdown_timer = reactor.callLater(
self.shutdown_timer = aio_get_running_loop().call_later(
self.shutdown_timeout, _auto_shutdown_server
)
def addHandler(self, handler, namespace):
self.dispatcher.build_method_map(handler, prefix="%s." % namespace)
class WebSocketJSONRPCServerFactory(JSONRPCServerFactoryBase):
def __call__(self, *args, **kwargs):
ws = WebSocketJSONRPCServer(*args, **kwargs)
ws.factory = self
return ws
class WebSocketJSONRPCServer(WebSocketEndpoint):
encoding = "text"
factory: WebSocketJSONRPCServerFactory = None
async def on_connect(self, websocket):
await websocket.accept()
self.factory.on_client_connect() # pylint: disable=no-member
async def on_receive(self, websocket, data):
aio_create_task(self._handle_rpc(websocket, data))
async def on_disconnect(self, websocket, close_code):
self.factory.on_client_disconnect() # pylint: disable=no-member
async def _handle_rpc(self, websocket, data):
# pylint: disable=no-member
response = await self.factory.manager.get_response_for_payload(data)
if response.error and response.error.data:
click.secho("Error: %s" % response.error.data, fg="red", err=True)
await websocket.send_text(self.factory.manager.serialize(response.body))
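The Twisted/autobahn WebSocket protocol is replaced by a Starlette WebSocketEndpoint whose three hooks (on_connect, on_receive, on_disconnect) drive connection counting and JSON-RPC dispatch. A bare-bones echo endpoint using the same hooks, just to show the shape (not the PIO Home handler itself):

import uvicorn
from starlette.applications import Starlette
from starlette.endpoints import WebSocketEndpoint
from starlette.routing import WebSocketRoute

class Echo(WebSocketEndpoint):
    encoding = "text"  # on_receive gets decoded text frames

    async def on_connect(self, websocket):
        await websocket.accept()

    async def on_receive(self, websocket, data):
        await websocket.send_text(data.upper())

    async def on_disconnect(self, websocket, close_code):
        pass

app = Starlette(routes=[WebSocketRoute("/ws", Echo)])

if __name__ == "__main__":
    uvicorn.run(app, port=8000)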


@ -0,0 +1,99 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
from urllib.parse import urlparse
import click
import uvicorn
from starlette.applications import Starlette
from starlette.middleware import Middleware
from starlette.responses import PlainTextResponse
from starlette.routing import Mount, Route, WebSocketRoute
from starlette.staticfiles import StaticFiles
from starlette.status import HTTP_403_FORBIDDEN
from platformio.commands.home.rpc.handlers.account import AccountRPC
from platformio.commands.home.rpc.handlers.app import AppRPC
from platformio.commands.home.rpc.handlers.ide import IDERPC
from platformio.commands.home.rpc.handlers.misc import MiscRPC
from platformio.commands.home.rpc.handlers.os import OSRPC
from platformio.commands.home.rpc.handlers.piocore import PIOCoreRPC
from platformio.commands.home.rpc.handlers.project import ProjectRPC
from platformio.commands.home.rpc.server import WebSocketJSONRPCServerFactory
from platformio.compat import aio_get_running_loop
from platformio.exception import PlatformioException
from platformio.package.manager.core import get_core_package_dir
from platformio.proc import force_exit
class ShutdownMiddleware:
def __init__(self, app):
self.app = app
async def __call__(self, scope, receive, send):
if scope["type"] == "http" and b"__shutdown__" in scope.get("query_string", {}):
await shutdown_server()
await self.app(scope, receive, send)
async def shutdown_server(_=None):
aio_get_running_loop().call_later(0.5, force_exit)
return PlainTextResponse("Server has been shutdown!")
async def protected_page(_):
return PlainTextResponse(
"Protected PlatformIO Home session", status_code=HTTP_403_FORBIDDEN
)
def run_server(host, port, no_open, shutdown_timeout, home_url):
contrib_dir = get_core_package_dir("contrib-piohome")
if not os.path.isdir(contrib_dir):
raise PlatformioException("Invalid path to PIO Home Contrib")
ws_rpc_factory = WebSocketJSONRPCServerFactory(shutdown_timeout)
ws_rpc_factory.add_object_handler(AccountRPC(), namespace="account")
ws_rpc_factory.add_object_handler(AppRPC(), namespace="app")
ws_rpc_factory.add_object_handler(IDERPC(), namespace="ide")
ws_rpc_factory.add_object_handler(MiscRPC(), namespace="misc")
ws_rpc_factory.add_object_handler(OSRPC(), namespace="os")
ws_rpc_factory.add_object_handler(PIOCoreRPC(), namespace="core")
ws_rpc_factory.add_object_handler(ProjectRPC(), namespace="project")
path = urlparse(home_url).path
routes = [
WebSocketRoute(path + "wsrpc", ws_rpc_factory, name="wsrpc"),
Route(path + "__shutdown__", shutdown_server, methods=["POST"]),
Mount(path, StaticFiles(directory=contrib_dir, html=True), name="static"),
]
if path != "/":
routes.append(Route("/", protected_page))
uvicorn.run(
Starlette(
middleware=[Middleware(ShutdownMiddleware)],
routes=routes,
on_startup=[
lambda: click.echo(
"PIO Home has been started. Press Ctrl+C to shutdown."
),
lambda: None if no_open else click.launch(home_url),
],
),
host=host,
port=port,
log_level="warning",
)
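Every handler registered via add_object_handler() becomes a namespaced JSON-RPC 2.0 method on the wsrpc WebSocket route, so PIO Home can be scripted from outside. A hedged client sketch, assuming the server listens on the default 127.0.0.1:8008 with a home_url path of "/" and using the third-party websockets package:

import asyncio
import json

import websockets  # pip install websockets

async def main():
    async with websockets.connect("ws://127.0.0.1:8008/wsrpc") as ws:
        await ws.send(json.dumps({
            "jsonrpc": "2.0",
            "id": 1,
            "method": "core.version",  # PIOCoreRPC.version() via the "core" namespace
            "params": [],
        }))
        print(json.loads(await ws.recv()))

asyncio.run(main())

The __shutdown__ route shown above can likewise be hit with a plain HTTP POST to stop a running instance.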


@ -14,8 +14,11 @@
# pylint: disable=too-many-branches, too-many-locals
import json
import logging
import os
import time
from urllib.parse import quote
import click
from tabulate import tabulate
@ -23,7 +26,6 @@ from tabulate import tabulate
from platformio import exception, fs, util
from platformio.commands import PlatformioCLI
from platformio.commands.lib.helpers import get_builtin_libs, save_project_libdeps
from platformio.compat import dump_json_to_unicode
from platformio.package.exception import NotGlobalLibDir, UnknownPackageError
from platformio.package.manager.library import LibraryPackageManager
from platformio.package.meta import PackageItem, PackageSpec
@ -31,11 +33,6 @@ from platformio.proc import is_ci
from platformio.project.config import ProjectConfig
from platformio.project.helpers import get_project_dir, is_platformio_project
try:
from urllib.parse import quote
except ImportError:
from urllib import quote
CTX_META_INPUT_DIRS_KEY = __name__ + ".input_dirs"
CTX_META_PROJECT_ENVIRONMENTS_KEY = __name__ + ".project_environments"
CTX_META_STORAGE_DIRS_KEY = __name__ + ".storage_dirs"
@ -43,10 +40,10 @@ CTX_META_STORAGE_LIBDEPS_KEY = __name__ + ".storage_lib_deps"
def get_project_global_lib_dir():
return ProjectConfig.get_instance().get_optional_dir("globallib")
return ProjectConfig.get_instance().get("platformio", "globallib_dir")
@click.group(short_help="Library manager")
@click.group(short_help="Library manager", hidden=True)
@click.option(
"-d",
"--storage-dir",
@ -71,6 +68,14 @@ def get_project_global_lib_dir():
)
@click.pass_context
def cli(ctx, **options):
in_silence = PlatformioCLI.in_silence()
if not in_silence:
click.secho(
"\nWARNING!!! This command is deprecated and will be removed in "
"the next releases. \nPlease use `pio pkg` instead.\n",
fg="yellow",
)
storage_cmds = ("install", "uninstall", "update", "list")
# skip commands that don't need storage folder
if ctx.invoked_subcommand not in storage_cmds or (
@ -97,7 +102,6 @@ def cli(ctx, **options):
get_project_dir(), get_project_global_lib_dir(), ctx.invoked_subcommand
)
in_silence = PlatformioCLI.in_silence()
ctx.meta[CTX_META_PROJECT_ENVIRONMENTS_KEY] = options["environment"]
ctx.meta[CTX_META_INPUT_DIRS_KEY] = storage_dirs
ctx.meta[CTX_META_STORAGE_DIRS_KEY] = []
@ -111,7 +115,7 @@ def cli(ctx, **options):
os.path.join(storage_dir, "platformio.ini")
)
config.validate(options["environment"], silent=in_silence)
libdeps_dir = config.get_optional_dir("libdeps")
libdeps_dir = config.get("platformio", "libdeps_dir")
for env in config.envs():
if options["environment"] and env not in options["environment"]:
continue
@ -152,16 +156,16 @@ def lib_install( # pylint: disable=too-many-arguments,unused-argument
if not silent and (libraries or storage_dir in storage_libdeps):
print_storage_header(storage_dirs, storage_dir)
lm = LibraryPackageManager(storage_dir)
lm.set_log_level(logging.WARN if silent else logging.DEBUG)
if libraries:
installed_pkgs = {
library: lm.install(library, silent=silent, force=force)
for library in libraries
library: lm.install(library, force=force) for library in libraries
}
elif storage_dir in storage_libdeps:
for library in storage_libdeps[storage_dir]:
lm.install(library, silent=silent, force=force)
lm.install(library, force=force)
if save and installed_pkgs:
_save_deps(ctx, installed_pkgs)
@ -212,9 +216,8 @@ def lib_uninstall(ctx, libraries, save, silent):
for storage_dir in storage_dirs:
print_storage_header(storage_dirs, storage_dir)
lm = LibraryPackageManager(storage_dir)
uninstalled_pkgs = {
library: lm.uninstall(library, silent=silent) for library in libraries
}
lm.set_log_level(logging.WARN if silent else logging.DEBUG)
uninstalled_pkgs = {library: lm.uninstall(library) for library in libraries}
if save and uninstalled_pkgs:
_save_deps(ctx, uninstalled_pkgs, action="remove")
@ -237,14 +240,20 @@ def lib_uninstall(ctx, libraries, save, silent):
def lib_update( # pylint: disable=too-many-arguments
ctx, libraries, only_check, dry_run, silent, json_output
):
storage_dirs = ctx.meta[CTX_META_STORAGE_DIRS_KEY]
only_check = dry_run or only_check
if only_check and not json_output:
raise exception.UserSideException(
"This command is deprecated, please use `pio pkg outdated` instead"
)
storage_dirs = ctx.meta[CTX_META_STORAGE_DIRS_KEY]
json_result = {}
for storage_dir in storage_dirs:
if not json_output:
print_storage_header(storage_dirs, storage_dir)
lib_deps = ctx.meta.get(CTX_META_STORAGE_LIBDEPS_KEY, {}).get(storage_dir, [])
lm = LibraryPackageManager(storage_dir)
lm.set_log_level(logging.WARN if silent else logging.DEBUG)
_libraries = libraries or lib_deps or lm.get_installed()
if only_check and json_output:
@ -277,16 +286,14 @@ def lib_update( # pylint: disable=too-many-arguments
None if isinstance(library, PackageItem) else PackageSpec(library)
)
try:
lm.update(
library, to_spec=to_spec, only_check=only_check, silent=silent
)
lm.update(library, to_spec=to_spec)
except UnknownPackageError as e:
if library not in lib_deps:
raise e
if json_output:
return click.echo(
dump_json_to_unicode(
json.dumps(
json_result[storage_dirs[0]] if len(storage_dirs) == 1 else json_result
)
)
@ -315,7 +322,7 @@ def lib_list(ctx, json_output):
if json_output:
return click.echo(
dump_json_to_unicode(
json.dumps(
json_result[storage_dirs[0]] if len(storage_dirs) == 1 else json_result
)
)
@ -355,11 +362,11 @@ def lib_search(query, json_output, page, noninteractive, **filters):
"get",
"/v2/lib/search",
params=dict(query=" ".join(query), page=page),
cache_valid="1d",
x_cache_valid="1d",
)
if json_output:
click.echo(dump_json_to_unicode(result))
click.echo(json.dumps(result))
return
if result["total"] == 0:
@ -408,7 +415,7 @@ def lib_search(query, json_output, page, noninteractive, **filters):
"get",
"/v2/lib/search",
params=dict(query=" ".join(query), page=int(result["page"]) + 1),
cache_valid="1d",
x_cache_valid="1d",
)
@ -418,7 +425,7 @@ def lib_search(query, json_output, page, noninteractive, **filters):
def lib_builtin(storage, json_output):
items = get_builtin_libs(storage)
if json_output:
return click.echo(dump_json_to_unicode(items))
return click.echo(json.dumps(items))
for storage_ in items:
if not storage_["items"]:
@ -438,11 +445,14 @@ def lib_builtin(storage, json_output):
@click.option("--json-output", is_flag=True)
def lib_show(library, json_output):
lm = LibraryPackageManager()
lib_id = lm.reveal_registry_package_id(library, silent=json_output)
lm.set_log_level(logging.ERROR if json_output else logging.DEBUG)
lib_id = lm.reveal_registry_package_id(library)
regclient = lm.get_registry_client_instance()
lib = regclient.fetch_json_data("get", "/v2/lib/info/%d" % lib_id, cache_valid="1h")
lib = regclient.fetch_json_data(
"get", "/v2/lib/info/%d" % lib_id, x_cache_valid="1h"
)
if json_output:
return click.echo(dump_json_to_unicode(lib))
return click.echo(json.dumps(lib))
title = "{ownername}/{name}".format(**lib)
click.secho(title, fg="cyan")
@ -455,7 +465,7 @@ def lib_show(library, json_output):
"Version: %s, released %s"
% (
lib["version"]["name"],
time.strftime("%c", util.parse_date(lib["version"]["released"])),
util.parse_datetime(lib["version"]["released"]).strftime("%c"),
)
)
click.echo("Manifest: %s" % lib["confurl"])
@ -463,9 +473,9 @@ def lib_show(library, json_output):
if key not in lib or not lib[key]:
continue
if isinstance(lib[key], list):
click.echo("%s: %s" % (key.title(), ", ".join(lib[key])))
click.echo("%s: %s" % (key.capitalize(), ", ".join(lib[key])))
else:
click.echo("%s: %s" % (key.title(), lib[key]))
click.echo("%s: %s" % (key.capitalize(), lib[key]))
blocks = []
@ -497,7 +507,7 @@ def lib_show(library, json_output):
"Versions",
[
"%s, released %s"
% (v["name"], time.strftime("%c", util.parse_date(v["released"])))
% (v["name"], util.parse_datetime(v["released"]).strftime("%c"))
for v in lib["versions"]
],
)
@ -527,7 +537,7 @@ def lib_show(library, json_output):
@click.argument("config_url")
def lib_register(config_url): # pylint: disable=unused-argument
raise exception.UserSideException(
"This command is deprecated. Please use `pio package publish` command."
"This command is deprecated. Please use `pio pkg publish` command."
)
@ -535,16 +545,16 @@ def lib_register(config_url): # pylint: disable=unused-argument
@click.option("--json-output", is_flag=True)
def lib_stats(json_output):
regclient = LibraryPackageManager().get_registry_client_instance()
result = regclient.fetch_json_data("get", "/v2/lib/stats", cache_valid="1h")
result = regclient.fetch_json_data("get", "/v2/lib/stats", x_cache_valid="1h")
if json_output:
return click.echo(dump_json_to_unicode(result))
return click.echo(json.dumps(result))
for key in ("updated", "added"):
tabular_data = [
(
click.style(item["name"], fg="cyan"),
time.strftime("%c", util.parse_date(item["date"])),
util.parse_datetime(item["date"]).strftime("%c"),
"https://platformio.org/lib/show/%s/%s"
% (item["id"], quote(item["name"])),
)
@ -619,9 +629,9 @@ def print_lib_item(item):
if key not in item or not item[key]:
continue
if isinstance(item[key], list):
click.echo("%s: %s" % (key.title(), ", ".join(item[key])))
click.echo("%s: %s" % (key.capitalize(), ", ".join(item[key])))
else:
click.echo("%s: %s" % (key.title(), item[key]))
click.echo("%s: %s" % (key.capitalize(), item[key]))
for key in ("frameworks", "platforms"):
if key not in item:


@ -14,6 +14,7 @@
import os
from platformio import util
from platformio.compat import ci_strings_are_equal
from platformio.package.manager.platform import PlatformPackageManager
from platformio.package.meta import PackageSpec
@ -22,6 +23,7 @@ from platformio.project.config import ProjectConfig
from platformio.project.exception import InvalidProjectConfError
@util.memoized(expire="60s")
def get_builtin_libs(storage_names=None):
# pylint: disable=import-outside-toplevel
from platformio.package.manager.library import LibraryPackageManager
@ -45,8 +47,8 @@ def get_builtin_libs(storage_names=None):
return items
def is_builtin_lib(name, storages=None):
for storage in storages or get_builtin_libs():
def is_builtin_lib(name):
for storage in get_builtin_libs():
for lib in storage["items"]:
if lib.get("name") == name:
return True


@ -1,119 +0,0 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
import tempfile
from datetime import datetime
import click
from platformio import fs
from platformio.clients.registry import RegistryClient
from platformio.compat import ensure_python3
from platformio.package.meta import PackageSpec, PackageType
from platformio.package.pack import PackagePacker
def validate_datetime(ctx, param, value): # pylint: disable=unused-argument
if not value:
return value
try:
datetime.strptime(value, "%Y-%m-%d %H:%M:%S")
except ValueError as e:
raise click.BadParameter(e)
return value
@click.group("package", short_help="Package manager")
def cli():
pass
@cli.command("pack", short_help="Create a tarball from a package")
@click.argument(
"package",
required=True,
default=os.getcwd,
metavar="<source directory, tar.gz or zip>",
)
@click.option(
"-o", "--output", help="A destination path (folder or a full path to file)"
)
def package_pack(package, output):
p = PackagePacker(package)
archive_path = p.pack(output)
click.secho('Wrote a tarball to "%s"' % archive_path, fg="green")
@cli.command("publish", short_help="Publish a package to the registry")
@click.argument(
"package",
required=True,
default=os.getcwd,
metavar="<source directory, tar.gz or zip>",
)
@click.option(
"--owner",
help="PIO Account username (can be organization username). "
"Default is set to a username of the authorized PIO Account",
)
@click.option(
"--released-at",
callback=validate_datetime,
help="Custom release date and time in the next format (UTC): 2014-06-13 17:08:52",
)
@click.option("--private", is_flag=True, help="Restricted access (not a public)")
@click.option(
"--notify/--no-notify",
default=True,
help="Notify by email when package is processed",
)
def package_publish(package, owner, released_at, private, notify):
assert ensure_python3()
with tempfile.TemporaryDirectory() as tmp_dir: # pylint: disable=no-member
with fs.cd(tmp_dir):
p = PackagePacker(package)
archive_path = p.pack()
response = RegistryClient().publish_package(
archive_path, owner, released_at, private, notify
)
os.remove(archive_path)
click.secho(response.get("message"), fg="green")
@cli.command("unpublish", short_help="Remove a pushed package from the registry")
@click.argument(
"package", required=True, metavar="[<organization>/]<pkgname>[@<version>]"
)
@click.option(
"--type",
type=click.Choice(list(PackageType.items().values())),
default="library",
help="Package type, default is set to `library`",
)
@click.option(
"--undo",
is_flag=True,
help="Undo a remove, putting a version back into the registry",
)
def package_unpublish(package, type, undo): # pylint: disable=redefined-builtin
spec = PackageSpec(package)
response = RegistryClient().unpublish_package(
type=type,
name=spec.name,
owner=spec.owner,
version=str(spec.requirements),
undo=undo,
)
click.secho(response.get("message"), fg="green")


@ -0,0 +1,48 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import click
from platformio.package.commands.exec import package_exec_cmd
from platformio.package.commands.install import package_install_cmd
from platformio.package.commands.list import package_list_cmd
from platformio.package.commands.outdated import package_outdated_cmd
from platformio.package.commands.pack import package_pack_cmd
from platformio.package.commands.publish import package_publish_cmd
from platformio.package.commands.search import package_search_cmd
from platformio.package.commands.show import package_show_cmd
from platformio.package.commands.uninstall import package_uninstall_cmd
from platformio.package.commands.unpublish import package_unpublish_cmd
from platformio.package.commands.update import package_update_cmd
@click.group(
"pkg",
commands=[
package_exec_cmd,
package_install_cmd,
package_list_cmd,
package_outdated_cmd,
package_pack_cmd,
package_publish_cmd,
package_search_cmd,
package_show_cmd,
package_uninstall_cmd,
package_unpublish_cmd,
package_update_cmd,
],
short_help="Unified Package Manager",
)
def cli():
pass
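The new unified "pio pkg" group is assembled by passing already-built commands straight to click.group(commands=[...]) instead of decorating nested functions. A tiny sketch of the same composition pattern with click 8 (names are made up):

import click

@click.command("greet", short_help="Say hello")
@click.argument("name")
def greet_cmd(name):
    click.echo("Hello, %s!" % name)

@click.group("tools", commands=[greet_cmd], short_help="Example unified group")
def cli():
    pass

if __name__ == "__main__":
    cli()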


@ -12,13 +12,16 @@
# See the License for the specific language governing permissions and
# limitations under the License.
import json
import logging
import os
import click
from platformio.cache import cleanup_content_cache
from platformio.commands import PlatformioCLI
from platformio.commands.boards import print_boards
from platformio.compat import dump_json_to_unicode
from platformio.exception import UserSideException
from platformio.package.exception import UnknownPackageError
from platformio.package.manager.platform import PlatformPackageManager
from platformio.package.meta import PackageItem, PackageSpec
from platformio.package.version import get_original_version
@ -26,9 +29,315 @@ from platformio.platform.exception import UnknownPlatform
from platformio.platform.factory import PlatformFactory
@click.group(short_help="Platform manager")
@click.group(short_help="Platform manager", hidden=True)
def cli():
pass
if not PlatformioCLI.in_silence():
click.secho(
"\nWARNING!!! This command is deprecated and will be removed in "
"the next releases. \nPlease use `pio pkg` instead.\n",
fg="yellow",
)
@cli.command("search", short_help="Search for development platform")
@click.argument("query", required=False)
@click.option("--json-output", is_flag=True)
def platform_search(query, json_output):
platforms = []
for platform in _get_registry_platforms():
if query == "all":
query = ""
search_data = json.dumps(platform)
if query and query.lower() not in search_data.lower():
continue
platforms.append(
_get_registry_platform_data(
platform["name"], with_boards=False, expose_packages=False
)
)
if json_output:
click.echo(json.dumps(platforms))
else:
_print_platforms(platforms)
@cli.command("frameworks", short_help="List supported frameworks, SDKs")
@click.argument("query", required=False)
@click.option("--json-output", is_flag=True)
def platform_frameworks(query, json_output):
regclient = PlatformPackageManager().get_registry_client_instance()
frameworks = []
for framework in regclient.fetch_json_data(
"get", "/v2/frameworks", x_cache_valid="1d"
):
if query == "all":
query = ""
search_data = json.dumps(framework)
if query and query.lower() not in search_data.lower():
continue
framework["homepage"] = "https://platformio.org/frameworks/" + framework["name"]
framework["platforms"] = [
platform["name"]
for platform in _get_registry_platforms()
if framework["name"] in platform["frameworks"]
]
frameworks.append(framework)
frameworks = sorted(frameworks, key=lambda manifest: manifest["name"])
if json_output:
click.echo(json.dumps(frameworks))
else:
_print_platforms(frameworks)
@cli.command("list", short_help="List installed development platforms")
@click.option("--json-output", is_flag=True)
def platform_list(json_output):
platforms = []
pm = PlatformPackageManager()
for pkg in pm.get_installed():
platforms.append(
_get_installed_platform_data(pkg, with_boards=False, expose_packages=False)
)
platforms = sorted(platforms, key=lambda manifest: manifest["name"])
if json_output:
click.echo(json.dumps(platforms))
else:
_print_platforms(platforms)
@cli.command("show", short_help="Show details about development platform")
@click.argument("platform")
@click.option("--json-output", is_flag=True)
def platform_show(platform, json_output): # pylint: disable=too-many-branches
data = _get_platform_data(platform)
if not data:
raise UnknownPlatform(platform)
if json_output:
return click.echo(json.dumps(data))
dep = "{ownername}/{name}".format(**data) if "ownername" in data else data["name"]
click.echo(
"{dep} ~ {title}".format(dep=click.style(dep, fg="cyan"), title=data["title"])
)
click.echo("=" * (3 + len(dep + data["title"])))
click.echo(data["description"])
click.echo()
if "version" in data:
click.echo("Version: %s" % data["version"])
if data["homepage"]:
click.echo("Home: %s" % data["homepage"])
if data["repository"]:
click.echo("Repository: %s" % data["repository"])
if data["url"]:
click.echo("Vendor: %s" % data["url"])
if data["license"]:
click.echo("License: %s" % data["license"])
if data["frameworks"]:
click.echo("Frameworks: %s" % ", ".join(data["frameworks"]))
if not data["packages"]:
return None
if not isinstance(data["packages"][0], dict):
click.echo("Packages: %s" % ", ".join(data["packages"]))
else:
click.echo()
click.secho("Packages", bold=True)
click.echo("--------")
for item in data["packages"]:
click.echo()
click.echo("Package %s" % click.style(item["name"], fg="yellow"))
click.echo("-" * (8 + len(item["name"])))
if item["type"]:
click.echo("Type: %s" % item["type"])
click.echo("Requirements: %s" % item["requirements"])
click.echo(
"Installed: %s" % ("Yes" if item.get("version") else "No (optional)")
)
if "version" in item:
click.echo("Version: %s" % item["version"])
if "originalVersion" in item:
click.echo("Original version: %s" % item["originalVersion"])
if "description" in item:
click.echo("Description: %s" % item["description"])
if data["boards"]:
click.echo()
click.secho("Boards", bold=True)
click.echo("------")
print_boards(data["boards"])
return True
@cli.command("install", short_help="Install new development platform")
@click.argument("platforms", nargs=-1, required=True, metavar="[PLATFORM...]")
@click.option("--with-package", multiple=True)
@click.option("--without-package", multiple=True)
@click.option("--skip-default-package", is_flag=True)
@click.option("--with-all-packages", is_flag=True)
@click.option("-s", "--silent", is_flag=True, help="Suppress progress reporting")
@click.option(
"-f",
"--force",
is_flag=True,
help="Reinstall/redownload dev/platform and its packages if exist",
)
def platform_install( # pylint: disable=too-many-arguments,too-many-locals
platforms,
with_package,
without_package,
skip_default_package,
with_all_packages,
silent,
force,
):
def _find_pkg_names(p, candidates):
result = []
for candidate in candidates:
found = False
# lookup by package types
for _name, _opts in p.packages.items():
if _opts.get("type") == candidate:
result.append(_name)
found = True
if (
p.frameworks
and candidate.startswith("framework-")
and candidate[10:] in p.frameworks
):
result.append(p.frameworks[candidate[10:]]["package"])
found = True
if not found:
result.append(candidate)
return result
pm = PlatformPackageManager()
pm.set_log_level(logging.WARN if silent else logging.DEBUG)
for platform in platforms:
if with_package or without_package or with_all_packages:
pkg = pm.install(platform, skip_dependencies=True)
p = PlatformFactory.new(pkg)
if with_all_packages:
with_package = list(p.packages)
with_package = set(_find_pkg_names(p, with_package or []))
without_package = set(_find_pkg_names(p, without_package or []))
upkgs = with_package | without_package
ppkgs = set(p.packages)
if not upkgs.issubset(ppkgs):
raise UnknownPackageError(", ".join(upkgs - ppkgs))
for name, options in p.packages.items():
if name in without_package:
continue
if name in with_package or not (
skip_default_package or options.get("optional", False)
):
p.pm.install(p.get_package_spec(name), force=force)
else:
pkg = pm.install(platform, skip_dependencies=skip_default_package)
if pkg and not silent:
click.secho(
"The platform '%s' has been successfully installed!\n"
"The rest of the packages will be installed later "
"depending on your build environment." % platform,
fg="green",
)
@cli.command("uninstall", short_help="Uninstall development platform")
@click.argument("platforms", nargs=-1, required=True, metavar="[PLATFORM...]")
def platform_uninstall(platforms):
pm = PlatformPackageManager()
pm.set_log_level(logging.DEBUG)
for platform in platforms:
if pm.uninstall(platform):
click.secho(
"The platform '%s' has been successfully removed!" % platform,
fg="green",
)
@cli.command("update", short_help="Update installed development platforms")
@click.argument("platforms", nargs=-1, required=False, metavar="[PLATFORM...]")
@click.option(
"-p", "--only-packages", is_flag=True, help="Update only the platform packages"
)
@click.option(
"-c",
"--only-check",
is_flag=True,
help="DEPRECATED. Please use `--dry-run` instead",
)
@click.option(
"--dry-run", is_flag=True, help="Do not update, only check for the new versions"
)
@click.option("-s", "--silent", is_flag=True, help="Suppress progress reporting")
@click.option("--json-output", is_flag=True)
def platform_update( # pylint: disable=too-many-locals, too-many-arguments
platforms, only_check, dry_run, silent, json_output, **_
):
if only_check and not json_output:
raise UserSideException(
"This command is deprecated, please use `pio pkg outdated` instead"
)
pm = PlatformPackageManager()
pm.set_log_level(logging.WARN if silent else logging.DEBUG)
platforms = platforms or pm.get_installed()
only_check = dry_run or only_check
if only_check and json_output:
result = []
for platform in platforms:
spec = None
pkg = None
if isinstance(platform, PackageItem):
pkg = platform
else:
spec = PackageSpec(platform)
pkg = pm.get_package(spec)
if not pkg:
continue
outdated = pm.outdated(pkg, spec)
if (
not outdated.is_outdated(allow_incompatible=True)
and not PlatformFactory.new(pkg).are_outdated_packages()
):
continue
data = _get_installed_platform_data(
pkg, with_boards=False, expose_packages=False
)
if outdated.is_outdated(allow_incompatible=True):
data["versionLatest"] = (
str(outdated.latest) if outdated.latest else None
)
result.append(data)
return click.echo(json.dumps(result))
for platform in platforms:
click.echo(
"Platform %s"
% click.style(
platform.metadata.name
if isinstance(platform, PackageItem)
else platform,
fg="cyan",
)
)
click.echo("--------")
pm.update(platform)
click.echo()
return True
#
# Helpers
#
def _print_platforms(platforms):
@ -59,7 +368,7 @@ def _print_platforms(platforms):
def _get_registry_platforms():
regclient = PlatformPackageManager().get_registry_client_instance()
return regclient.fetch_json_data("get", "/v2/platforms", cache_valid="1d")
return regclient.fetch_json_data("get", "/v2/platforms", x_cache_valid="1d")
def _get_platform_data(*args, **kwargs):
@ -162,264 +471,3 @@ def _get_registry_platform_data( # pylint: disable=unused-argument
]
return data
@cli.command("search", short_help="Search for development platform")
@click.argument("query", required=False)
@click.option("--json-output", is_flag=True)
def platform_search(query, json_output):
platforms = []
for platform in _get_registry_platforms():
if query == "all":
query = ""
search_data = dump_json_to_unicode(platform)
if query and query.lower() not in search_data.lower():
continue
platforms.append(
_get_registry_platform_data(
platform["name"], with_boards=False, expose_packages=False
)
)
if json_output:
click.echo(dump_json_to_unicode(platforms))
else:
_print_platforms(platforms)
@cli.command("frameworks", short_help="List supported frameworks, SDKs")
@click.argument("query", required=False)
@click.option("--json-output", is_flag=True)
def platform_frameworks(query, json_output):
regclient = PlatformPackageManager().get_registry_client_instance()
frameworks = []
for framework in regclient.fetch_json_data(
"get", "/v2/frameworks", cache_valid="1d"
):
if query == "all":
query = ""
search_data = dump_json_to_unicode(framework)
if query and query.lower() not in search_data.lower():
continue
framework["homepage"] = "https://platformio.org/frameworks/" + framework["name"]
framework["platforms"] = [
platform["name"]
for platform in _get_registry_platforms()
if framework["name"] in platform["frameworks"]
]
frameworks.append(framework)
frameworks = sorted(frameworks, key=lambda manifest: manifest["name"])
if json_output:
click.echo(dump_json_to_unicode(frameworks))
else:
_print_platforms(frameworks)
@cli.command("list", short_help="List installed development platforms")
@click.option("--json-output", is_flag=True)
def platform_list(json_output):
platforms = []
pm = PlatformPackageManager()
for pkg in pm.get_installed():
platforms.append(
_get_installed_platform_data(pkg, with_boards=False, expose_packages=False)
)
platforms = sorted(platforms, key=lambda manifest: manifest["name"])
if json_output:
click.echo(dump_json_to_unicode(platforms))
else:
_print_platforms(platforms)
@cli.command("show", short_help="Show details about development platform")
@click.argument("platform")
@click.option("--json-output", is_flag=True)
def platform_show(platform, json_output): # pylint: disable=too-many-branches
data = _get_platform_data(platform)
if not data:
raise UnknownPlatform(platform)
if json_output:
return click.echo(dump_json_to_unicode(data))
dep = "{ownername}/{name}".format(**data) if "ownername" in data else data["name"]
click.echo(
"{dep} ~ {title}".format(dep=click.style(dep, fg="cyan"), title=data["title"])
)
click.echo("=" * (3 + len(dep + data["title"])))
click.echo(data["description"])
click.echo()
if "version" in data:
click.echo("Version: %s" % data["version"])
if data["homepage"]:
click.echo("Home: %s" % data["homepage"])
if data["repository"]:
click.echo("Repository: %s" % data["repository"])
if data["url"]:
click.echo("Vendor: %s" % data["url"])
if data["license"]:
click.echo("License: %s" % data["license"])
if data["frameworks"]:
click.echo("Frameworks: %s" % ", ".join(data["frameworks"]))
if not data["packages"]:
return None
if not isinstance(data["packages"][0], dict):
click.echo("Packages: %s" % ", ".join(data["packages"]))
else:
click.echo()
click.secho("Packages", bold=True)
click.echo("--------")
for item in data["packages"]:
click.echo()
click.echo("Package %s" % click.style(item["name"], fg="yellow"))
click.echo("-" * (8 + len(item["name"])))
if item["type"]:
click.echo("Type: %s" % item["type"])
click.echo("Requirements: %s" % item["requirements"])
click.echo(
"Installed: %s" % ("Yes" if item.get("version") else "No (optional)")
)
if "version" in item:
click.echo("Version: %s" % item["version"])
if "originalVersion" in item:
click.echo("Original version: %s" % item["originalVersion"])
if "description" in item:
click.echo("Description: %s" % item["description"])
if data["boards"]:
click.echo()
click.secho("Boards", bold=True)
click.echo("------")
print_boards(data["boards"])
return True
@cli.command("install", short_help="Install new development platform")
@click.argument("platforms", nargs=-1, required=True, metavar="[PLATFORM...]")
@click.option("--with-package", multiple=True)
@click.option("--without-package", multiple=True)
@click.option("--skip-default-package", is_flag=True)
@click.option("--with-all-packages", is_flag=True)
@click.option("-s", "--silent", is_flag=True, help="Suppress progress reporting")
@click.option(
"-f",
"--force",
is_flag=True,
help="Reinstall/redownload dev/platform and its packages if exist",
)
def platform_install( # pylint: disable=too-many-arguments
platforms,
with_package,
without_package,
skip_default_package,
with_all_packages,
silent,
force,
):
pm = PlatformPackageManager()
for platform in platforms:
pkg = pm.install(
spec=platform,
with_packages=with_package,
without_packages=without_package,
skip_default_package=skip_default_package,
with_all_packages=with_all_packages,
silent=silent,
force=force,
)
if pkg and not silent:
click.secho(
"The platform '%s' has been successfully installed!\n"
"The rest of the packages will be installed later "
"depending on your build environment." % platform,
fg="green",
)
@cli.command("uninstall", short_help="Uninstall development platform")
@click.argument("platforms", nargs=-1, required=True, metavar="[PLATFORM...]")
def platform_uninstall(platforms):
pm = PlatformPackageManager()
for platform in platforms:
if pm.uninstall(platform):
click.secho(
"The platform '%s' has been successfully removed!" % platform,
fg="green",
)
@cli.command("update", short_help="Update installed development platforms")
@click.argument("platforms", nargs=-1, required=False, metavar="[PLATFORM...]")
@click.option(
"-p", "--only-packages", is_flag=True, help="Update only the platform packages"
)
@click.option(
"-c",
"--only-check",
is_flag=True,
help="DEPRECATED. Please use `--dry-run` instead",
)
@click.option(
"--dry-run", is_flag=True, help="Do not update, only check for the new versions"
)
@click.option("-s", "--silent", is_flag=True, help="Suppress progress reporting")
@click.option("--json-output", is_flag=True)
def platform_update( # pylint: disable=too-many-locals, too-many-arguments
platforms, only_packages, only_check, dry_run, silent, json_output
):
pm = PlatformPackageManager()
platforms = platforms or pm.get_installed()
only_check = dry_run or only_check
if only_check and json_output:
result = []
for platform in platforms:
spec = None
pkg = None
if isinstance(platform, PackageItem):
pkg = platform
else:
spec = PackageSpec(platform)
pkg = pm.get_package(spec)
if not pkg:
continue
outdated = pm.outdated(pkg, spec)
if (
not outdated.is_outdated(allow_incompatible=True)
and not PlatformFactory.new(pkg).are_outdated_packages()
):
continue
data = _get_installed_platform_data(
pkg, with_boards=False, expose_packages=False
)
if outdated.is_outdated(allow_incompatible=True):
data["versionLatest"] = (
str(outdated.latest) if outdated.latest else None
)
result.append(data)
return click.echo(dump_json_to_unicode(result))
# cleanup cached board and platform lists
cleanup_content_cache("http")
for platform in platforms:
click.echo(
"Platform %s"
% click.style(
platform.metadata.name
if isinstance(platform, PackageItem)
else platform,
fg="cyan",
)
)
click.echo("--------")
pm.update(
platform, only_packages=only_packages, only_check=only_check, silent=silent
)
click.echo()
return True


@ -12,448 +12,21 @@
# See the License for the specific language governing permissions and
# limitations under the License.
# pylint: disable=too-many-arguments,too-many-locals,too-many-branches,line-too-long
import json
import os
import click
from tabulate import tabulate
from platformio import fs
from platformio.commands.platform import platform_install as cli_platform_install
from platformio.ide.projectgenerator import ProjectGenerator
from platformio.package.manager.platform import PlatformPackageManager
from platformio.platform.exception import UnknownBoard
from platformio.project.config import ProjectConfig
from platformio.project.exception import NotPlatformIOProjectError
from platformio.project.helpers import is_platformio_project, load_project_ide_data
from platformio.project.commands.config import project_config_cmd
from platformio.project.commands.init import project_init_cmd
from platformio.project.commands.metadata import project_metadata_cmd
@click.group(short_help="Project manager")
@click.group(
"project",
commands=[
project_config_cmd,
project_init_cmd,
project_metadata_cmd,
],
short_help="Project Manager",
)
def cli():
pass
@cli.command("config", short_help="Show computed configuration")
@click.option(
"-d",
"--project-dir",
default=os.getcwd,
type=click.Path(exists=True, file_okay=False, dir_okay=True, resolve_path=True),
)
@click.option("--json-output", is_flag=True)
def project_config(project_dir, json_output):
if not is_platformio_project(project_dir):
raise NotPlatformIOProjectError(project_dir)
with fs.cd(project_dir):
config = ProjectConfig.get_instance()
if json_output:
return click.echo(config.to_json())
click.echo(
"Computed project configuration for %s" % click.style(project_dir, fg="cyan")
)
for section, options in config.as_tuple():
click.secho(section, fg="cyan")
click.echo("-" * len(section))
click.echo(
tabulate(
[
(name, "=", "\n".join(value) if isinstance(value, list) else value)
for name, value in options
],
tablefmt="plain",
)
)
click.echo()
return None
@cli.command("data", short_help="Dump data intended for IDE extensions/plugins")
@click.option(
"-d",
"--project-dir",
default=os.getcwd,
type=click.Path(exists=True, file_okay=False, dir_okay=True, resolve_path=True),
)
@click.option("-e", "--environment", multiple=True)
@click.option("--json-output", is_flag=True)
def project_data(project_dir, environment, json_output):
if not is_platformio_project(project_dir):
raise NotPlatformIOProjectError(project_dir)
with fs.cd(project_dir):
config = ProjectConfig.get_instance()
config.validate(environment)
environment = list(environment or config.envs())
if json_output:
return click.echo(json.dumps(load_project_ide_data(project_dir, environment)))
for envname in environment:
click.echo("Environment: " + click.style(envname, fg="cyan", bold=True))
click.echo("=" * (13 + len(envname)))
click.echo(
tabulate(
[
(click.style(name, bold=True), "=", json.dumps(value, indent=2))
for name, value in load_project_ide_data(
project_dir, envname
).items()
],
tablefmt="plain",
)
)
click.echo()
return None
def validate_boards(ctx, param, value): # pylint: disable=W0613
pm = PlatformPackageManager()
for id_ in value:
try:
pm.board_config(id_)
except UnknownBoard:
raise click.BadParameter(
"`%s`. Please search for board ID using `platformio boards` "
"command" % id_
)
return value
@cli.command("init", short_help="Initialize a project or update existing")
@click.option(
"--project-dir",
"-d",
default=os.getcwd,
type=click.Path(
exists=True, file_okay=False, dir_okay=True, writable=True, resolve_path=True
),
)
@click.option("-b", "--board", multiple=True, metavar="ID", callback=validate_boards)
@click.option("--ide", type=click.Choice(ProjectGenerator.get_supported_ides()))
@click.option("-e", "--environment", help="Update using existing environment")
@click.option("-O", "--project-option", multiple=True)
@click.option("--env-prefix", default="")
@click.option("-s", "--silent", is_flag=True)
@click.pass_context
def project_init(
ctx, # pylint: disable=R0913
project_dir,
board,
ide,
environment,
project_option,
env_prefix,
silent,
):
if not silent:
if project_dir == os.getcwd():
click.secho("\nThe current working directory", fg="yellow", nl=False)
click.secho(" %s " % project_dir, fg="cyan", nl=False)
click.secho("will be used for the project.", fg="yellow")
click.echo("")
click.echo(
"The next files/directories have been created in %s"
% click.style(project_dir, fg="cyan")
)
click.echo(
"%s - Put project header files here" % click.style("include", fg="cyan")
)
click.echo(
"%s - Put here project specific (private) libraries"
% click.style("lib", fg="cyan")
)
click.echo("%s - Put project source files here" % click.style("src", fg="cyan"))
click.echo(
"%s - Project Configuration File" % click.style("platformio.ini", fg="cyan")
)
is_new_project = not is_platformio_project(project_dir)
if is_new_project:
init_base_project(project_dir)
if environment:
update_project_env(project_dir, environment, project_option)
elif board:
update_board_envs(
ctx, project_dir, board, project_option, env_prefix, ide is not None
)
if ide:
with fs.cd(project_dir):
config = ProjectConfig.get_instance(
os.path.join(project_dir, "platformio.ini")
)
config.validate()
pg = ProjectGenerator(
config, environment or get_best_envname(config, board), ide
)
pg.generate()
if is_new_project:
init_cvs_ignore(project_dir)
if silent:
return
if ide:
click.secho(
"\nProject has been successfully %s including configuration files "
"for `%s` IDE." % ("initialized" if is_new_project else "updated", ide),
fg="green",
)
else:
click.secho(
"\nProject has been successfully %s! Useful commands:\n"
"`pio run` - process/build project from the current directory\n"
"`pio run --target upload` or `pio run -t upload` "
"- upload firmware to a target\n"
"`pio run --target clean` - clean project (remove compiled files)"
"\n`pio run --help` - additional information"
% ("initialized" if is_new_project else "updated"),
fg="green",
)
def init_base_project(project_dir):
with fs.cd(project_dir):
config = ProjectConfig()
config.save()
dir_to_readme = [
(config.get_optional_dir("src"), None),
(config.get_optional_dir("include"), init_include_readme),
(config.get_optional_dir("lib"), init_lib_readme),
(config.get_optional_dir("test"), init_test_readme),
]
for (path, cb) in dir_to_readme:
if os.path.isdir(path):
continue
os.makedirs(path)
if cb:
cb(path)
def init_include_readme(include_dir):
with open(os.path.join(include_dir, "README"), "w") as fp:
fp.write(
"""
This directory is intended for project header files.
A header file is a file containing C declarations and macro definitions
to be shared between several project source files. You request the use of a
header file in your project source file (C, C++, etc) located in the `src` folder
by including it, with the C preprocessing directive `#include'.
```src/main.c
#include "header.h"
int main (void)
{
...
}
```
Including a header file produces the same results as copying the header file
into each source file that needs it. Such copying would be time-consuming
and error-prone. With a header file, the related declarations appear
in only one place. If they need to be changed, they can be changed in one
place, and programs that include the header file will automatically use the
new version when next recompiled. The header file eliminates the labor of
finding and changing all the copies as well as the risk that a failure to
find one copy will result in inconsistencies within a program.
In C, the usual convention is to give header files names that end with `.h'.
It is most portable to use only letters, digits, dashes, and underscores in
header file names, and at most one dot.
Read more about using header files in official GCC documentation:
* Include Syntax
* Include Operation
* Once-Only Headers
* Computed Includes
https://gcc.gnu.org/onlinedocs/cpp/Header-Files.html
""",
)
def init_lib_readme(lib_dir):
with open(os.path.join(lib_dir, "README"), "w") as fp:
fp.write(
"""
This directory is intended for project specific (private) libraries.
PlatformIO will compile them to static libraries and link them into the executable file.
The source code of each library should be placed in its own separate directory
("lib/your_library_name/[here are source files]").
For example, see a structure of the following two libraries `Foo` and `Bar`:
|--lib
| |
| |--Bar
| | |--docs
| | |--examples
| | |--src
| | |- Bar.c
| | |- Bar.h
| | |- library.json (optional, custom build options, etc) https://docs.platformio.org/page/librarymanager/config.html
| |
| |--Foo
| | |- Foo.c
| | |- Foo.h
| |
| |- README --> THIS FILE
|
|- platformio.ini
|--src
|- main.c
and the contents of `src/main.c`:
```
#include <Foo.h>
#include <Bar.h>
int main (void)
{
...
}
```
The PlatformIO Library Dependency Finder will automatically find dependent
libraries by scanning project source files.
More information about PlatformIO Library Dependency Finder
- https://docs.platformio.org/page/librarymanager/ldf.html
""",
)
def init_test_readme(test_dir):
with open(os.path.join(test_dir, "README"), "w") as fp:
fp.write(
"""
This directory is intended for PlatformIO Unit Testing and project tests.
Unit Testing is a software testing method by which individual units of
source code, sets of one or more MCU program modules together with associated
control data, usage procedures, and operating procedures, are tested to
determine whether they are fit for use. Unit testing finds problems early
in the development cycle.
More information about PlatformIO Unit Testing:
- https://docs.platformio.org/page/plus/unit-testing.html
""",
)
def init_cvs_ignore(project_dir):
conf_path = os.path.join(project_dir, ".gitignore")
if os.path.isfile(conf_path):
return
with open(conf_path, "w") as fp:
fp.write(".pio\n")
def update_board_envs(
ctx, project_dir, board_ids, project_option, env_prefix, force_download
):
config = ProjectConfig(
os.path.join(project_dir, "platformio.ini"), parse_extra=False
)
used_boards = []
for section in config.sections():
cond = [section.startswith("env:"), config.has_option(section, "board")]
if all(cond):
used_boards.append(config.get(section, "board"))
pm = PlatformPackageManager()
used_platforms = []
modified = False
for id_ in board_ids:
board_config = pm.board_config(id_)
used_platforms.append(board_config["platform"])
if id_ in used_boards:
continue
used_boards.append(id_)
modified = True
envopts = {"platform": board_config["platform"], "board": id_}
# find default framework for board
frameworks = board_config.get("frameworks")
if frameworks:
envopts["framework"] = frameworks[0]
for item in project_option:
if "=" not in item:
continue
_name, _value = item.split("=", 1)
envopts[_name.strip()] = _value.strip()
section = "env:%s%s" % (env_prefix, id_)
config.add_section(section)
for option, value in envopts.items():
config.set(section, option, value)
if force_download and used_platforms:
_install_dependent_platforms(ctx, used_platforms)
if modified:
config.save()
def _install_dependent_platforms(ctx, platforms):
installed_platforms = [
pkg.metadata.name for pkg in PlatformPackageManager().get_installed()
]
if set(platforms) <= set(installed_platforms):
return
ctx.invoke(
cli_platform_install, platforms=list(set(platforms) - set(installed_platforms))
)
def update_project_env(project_dir, environment, project_option):
if not project_option:
return
config = ProjectConfig(
os.path.join(project_dir, "platformio.ini"), parse_extra=False
)
section = "env:%s" % environment
if not config.has_section(section):
config.add_section(section)
for item in project_option:
if "=" not in item:
continue
_name, _value = item.split("=", 1)
config.set(section, _name.strip(), _value.strip())
config.save()
def get_best_envname(config, board_ids=None):
envname = None
default_envs = config.default_envs()
if default_envs:
envname = default_envs[0]
if not board_ids:
return envname
for env in config.envs():
if not board_ids:
return env
if not envname:
envname = env
items = config.items(env=env, as_dict=True)
if "board" in items and items.get("board") in board_ids:
return env
return envname
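
For readers skimming the removed helper above, a tiny illustration of the precedence `get_best_envname()` implements (an explicit `default_envs` entry wins, otherwise the environment whose board matches), using a hypothetical stand-in config object rather than PlatformIO's real `ProjectConfig`:

```python
# Hypothetical stand-in, only for illustrating get_best_envname() precedence;
# assumes get_best_envname() from above is in scope.
class FakeConfig:
    def __init__(self, envs, default=None):
        self._envs = envs              # {"env_name": {"board": "..."}}
        self._default = default or []

    def default_envs(self):
        return self._default

    def envs(self):
        return list(self._envs)

    def items(self, env=None, as_dict=False):
        return dict(self._envs[env])


cfg = FakeConfig(
    {"uno": {"board": "uno"}, "d1_mini": {"board": "d1_mini"}},
    default=["uno"],
)
# No boards requested -> first default env; board requested -> matching env.
print(get_best_envname(cfg), get_best_envname(cfg, ["d1_mini"]))  # uno d1_mini
```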

View File

@ -25,7 +25,7 @@ class ProjectSyncAsyncCmd(AsyncCommandBase):
def __init__(self, *args, **kwargs):
self.psync = None
self._upstream = None
super(ProjectSyncAsyncCmd, self).__init__(*args, **kwargs)
super().__init__(*args, **kwargs)
def start(self):
project_dir = os.path.join(

View File

@ -13,27 +13,28 @@
# limitations under the License.
import os
from os.path import getatime, getmtime, isdir, isfile, join
from twisted.logger import LogLevel # pylint: disable=import-error
from twisted.spread import pb # pylint: disable=import-error
from platformio import proc, util
from platformio import proc
from platformio.commands.remote.ac.process import ProcessAsyncCmd
from platformio.commands.remote.ac.psync import ProjectSyncAsyncCmd
from platformio.commands.remote.ac.serial import SerialPortAsyncCmd
from platformio.commands.remote.client.base import RemoteClientBase
from platformio.device.list import list_serial_ports
from platformio.project.config import ProjectConfig
from platformio.project.exception import NotPlatformIOProjectError
from platformio.project.helpers import get_project_core_dir
class RemoteAgentService(RemoteClientBase):
def __init__(self, name, share, working_dir=None):
RemoteClientBase.__init__(self)
self.log_level = LogLevel.info
self.working_dir = working_dir or join(get_project_core_dir(), "remote")
if not isdir(self.working_dir):
self.working_dir = working_dir or os.path.join(
ProjectConfig.get_instance().get("platformio", "core_dir"), "remote"
)
if not os.path.isdir(self.working_dir):
os.makedirs(self.working_dir)
if name:
self.name = str(name)[:50]
@ -84,11 +85,11 @@ class RemoteAgentService(RemoteClientBase):
return (self.id, ac.id)
def _process_cmd_device_list(self, _):
return (self.name, util.get_serialports())
return (self.name, list_serial_ports())
def _process_cmd_device_monitor(self, options):
if not options["port"]:
for item in util.get_serialports():
for item in list_serial_ports():
if "VID:PID" in item["hwid"]:
options["port"] = item["port"]
break
@ -138,14 +139,14 @@ class RemoteAgentService(RemoteClientBase):
self, command, options
):
assert options and "project_id" in options
project_dir = join(self.working_dir, "projects", options["project_id"])
origin_pio_ini = join(project_dir, "platformio.ini")
back_pio_ini = join(project_dir, "platformio.ini.bak")
project_dir = os.path.join(self.working_dir, "projects", options["project_id"])
origin_pio_ini = os.path.join(project_dir, "platformio.ini")
back_pio_ini = os.path.join(project_dir, "platformio.ini.bak")
# remove insecure project options
try:
conf = ProjectConfig(origin_pio_ini)
if isfile(back_pio_ini):
if os.path.isfile(back_pio_ini):
os.remove(back_pio_ini)
os.rename(origin_pio_ini, back_pio_ini)
# cleanup
@ -159,7 +160,10 @@ class RemoteAgentService(RemoteClientBase):
conf.save(origin_pio_ini)
# restore A/M times
os.utime(origin_pio_ini, (getatime(back_pio_ini), getmtime(back_pio_ini)))
os.utime(
origin_pio_ini,
(os.path.getatime(back_pio_ini), os.path.getmtime(back_pio_ini)),
)
except NotPlatformIOProjectError as e:
raise pb.Error(str(e))
@ -194,8 +198,8 @@ class RemoteAgentService(RemoteClientBase):
paused_acs.append(ac)
def _cb_on_end():
if isfile(back_pio_ini):
if isfile(origin_pio_ini):
if os.path.isfile(back_pio_ini):
if os.path.isfile(origin_pio_ini):
os.remove(origin_pio_ini)
os.rename(back_pio_ini, origin_pio_ini)
for ac in paused_acs:

View File

@ -84,7 +84,7 @@ class DeviceMonitorClient( # pylint: disable=too-many-instance-attributes
self._ac_id = None
self._d_acread = None
self._d_acwrite = None
self._acwrite_buffer = ""
self._acwrite_buffer = b""
def agent_pool_ready(self):
d = task.deferLater(
@ -173,7 +173,11 @@ class DeviceMonitorClient( # pylint: disable=too-many-instance-attributes
address = port.getHost()
self.log.debug("Serial Bridge is started on {address!r}", address=address)
if "sock" in self.cmd_options:
with open(os.path.join(self.cmd_options["sock"], "sock"), "w") as fp:
with open(
os.path.join(self.cmd_options["sock"], "sock"),
mode="w",
encoding="utf8",
) as fp:
fp.write("socket://localhost:%d" % address.port)
def client_terminal_stopped(self):
@ -222,7 +226,7 @@ class DeviceMonitorClient( # pylint: disable=too-many-instance-attributes
return
data = self._acwrite_buffer
self._acwrite_buffer = ""
self._acwrite_buffer = b""
try:
d = self.agentpool.callRemote("acwrite", self._agent_id, self._ac_id, data)
d.addCallback(self.cb_acwrite_result)
@ -233,4 +237,4 @@ class DeviceMonitorClient( # pylint: disable=too-many-instance-attributes
def cb_acwrite_result(self, result):
assert result > 0
if self._acwrite_buffer:
self.acwrite_data("")
self.acwrite_data(b"")

View File

@ -69,8 +69,8 @@ class RunOrTestClient(AsyncClientBase):
os.path.join(self.options["project_dir"], "platformio.ini")
)
psync.add_item(cfg.path, "platformio.ini")
psync.add_item(cfg.get_optional_dir("shared"), "shared")
psync.add_item(cfg.get_optional_dir("boards"), "boards")
psync.add_item(cfg.get("platformio", "shared_dir"), "shared")
psync.add_item(cfg.get("platformio", "boards_dir"), "boards")
if self.options["force_remote"]:
self._add_project_source_items(cfg, psync)
@ -78,26 +78,26 @@ class RunOrTestClient(AsyncClientBase):
self._add_project_binary_items(cfg, psync)
if self.command == "test":
psync.add_item(cfg.get_optional_dir("test"), "test")
psync.add_item(cfg.get("platformio", "test_dir"), "test")
def _add_project_source_items(self, cfg, psync):
psync.add_item(cfg.get_optional_dir("lib"), "lib")
psync.add_item(cfg.get("platformio", "lib_dir"), "lib")
psync.add_item(
cfg.get_optional_dir("include"),
cfg.get("platformio", "include_dir"),
"include",
cb_filter=self._cb_tarfile_filter,
)
psync.add_item(
cfg.get_optional_dir("src"), "src", cb_filter=self._cb_tarfile_filter
cfg.get("platformio", "src_dir"), "src", cb_filter=self._cb_tarfile_filter
)
if set(["buildfs", "uploadfs", "uploadfsota"]) & set(
self.options.get("target", [])
):
psync.add_item(cfg.get_optional_dir("data"), "data")
psync.add_item(cfg.get("platformio", "data_dir"), "data")
@staticmethod
def _add_project_binary_items(cfg, psync):
build_dir = cfg.get_optional_dir("build")
build_dir = cfg.get("platformio", "build_dir")
for env_name in os.listdir(build_dir):
env_dir = os.path.join(build_dir, env_name)
if not os.path.isdir(env_dir):

View File

@ -24,22 +24,25 @@ from time import sleep
import click
from platformio import fs, proc
from platformio.commands.device import helpers as device_helpers
from platformio.commands.device.command import device_monitor as cmd_device_monitor
from platformio.commands.run.command import cli as cmd_run
from platformio.commands.test.command import cli as cmd_test
from platformio.compat import ensure_python3
from platformio.device.commands.monitor import (
apply_project_monitor_options,
device_monitor_cmd,
get_project_options,
project_options_to_monitor_argv,
)
from platformio.package.manager.core import inject_contrib_pysite
from platformio.project.exception import NotPlatformIOProjectError
from platformio.project.options import ProjectOptions
from platformio.test.command import test_cmd
@click.group("remote", short_help="Remote development")
@click.group("remote", short_help="Remote Development")
@click.option("-a", "--agent", multiple=True)
@click.pass_context
def cli(ctx, agent):
assert ensure_python3()
ctx.obj = agent
inject_contrib_pysite(verify_openssl=True)
inject_contrib_pysite()
@cli.group("agent", short_help="Start a new agent or list active")
@ -165,7 +168,20 @@ def remote_run(
@cli.command("test", short_help="Remote Unit Testing")
@click.option("--environment", "-e", multiple=True, metavar="<environment>")
@click.option("--ignore", "-i", multiple=True, metavar="<pattern>")
@click.option(
"--filter",
"-f",
multiple=True,
metavar="<pattern>",
help="Filter tests by a pattern",
)
@click.option(
"--ignore",
"-i",
multiple=True,
metavar="<pattern>",
help="Ignore tests by a pattern",
)
@click.option("--upload-port")
@click.option("--test-port")
@click.option(
@ -182,10 +198,11 @@ def remote_run(
@click.option("--verbose", "-v", is_flag=True)
@click.pass_obj
@click.pass_context
def remote_test(
def remote_test( # pylint: disable=redefined-builtin
ctx,
agents,
environment,
filter,
ignore,
upload_port,
test_port,
@ -203,6 +220,7 @@ def remote_test(
agents,
dict(
environment=environment,
filter=filter,
ignore=ignore,
upload_port=upload_port,
test_port=test_port,
@ -219,8 +237,9 @@ def remote_test(
click.secho("Building project locally", bold=True)
ctx.invoke(
cmd_test,
test_cmd,
environment=environment,
filter=filter,
ignore=ignore,
project_dir=project_dir,
without_uploading=True,
@ -251,7 +270,12 @@ def device_list(agents, json_output):
@remote_device.command("monitor", short_help="Monitor remote device")
@click.option("--port", "-p", help="Port, a number or a device name")
@click.option("--baud", "-b", type=int, help="Set baud rate, default=9600")
@click.option(
"--baud",
"-b",
type=int,
help="Set baud rate, default=%d" % ProjectOptions["env.monitor_speed"].default,
)
@click.option(
"--parity",
default="N",
@ -330,16 +354,19 @@ def device_monitor(ctx, agents, **kwargs):
project_options = {}
try:
with fs.cd(kwargs["project_dir"]):
project_options = device_helpers.get_project_options(kwargs["environment"])
kwargs = device_helpers.apply_project_monitor_options(kwargs, project_options)
project_options = get_project_options(kwargs["environment"])
kwargs = apply_project_monitor_options(kwargs, project_options)
except NotPlatformIOProjectError:
pass
kwargs["baud"] = kwargs["baud"] or 9600
kwargs["baud"] = kwargs["baud"] or ProjectOptions["env.monitor_speed"].default
def _tx_target(sock_dir):
subcmd_argv = ["remote", "device", "monitor"]
subcmd_argv.extend(device_helpers.options_to_argv(kwargs, project_options))
subcmd_argv = ["remote"]
for agent in agents:
subcmd_argv.extend(["--agent", agent])
subcmd_argv.extend(["device", "monitor"])
subcmd_argv.extend(project_options_to_monitor_argv(kwargs, project_options))
subcmd_argv.extend(["--sock", sock_dir])
subprocess.call([proc.where_is_program("platformio")] + subcmd_argv)
@ -352,9 +379,9 @@ def device_monitor(ctx, agents, **kwargs):
sleep(0.1)
if not t.is_alive():
return
with open(sock_file) as fp:
with open(sock_file, encoding="utf8") as fp:
kwargs["port"] = fp.read()
ctx.invoke(cmd_device_monitor, **kwargs)
ctx.invoke(device_monitor_cmd, **kwargs)
t.join(2)
finally:
fs.rmtree(sock_dir)

View File

@ -23,7 +23,7 @@ class SSLContextFactory(ssl.ClientContextFactory):
self.certificate_verified = False
def getContext(self):
ctx = super(SSLContextFactory, self).getContext()
ctx = super().getContext()
ctx.set_verify(
SSL.VERIFY_PEER | SSL.VERIFY_FAIL_IF_NO_PEER_CERT, self.verifyHostname
)

View File

@ -14,6 +14,7 @@
import operator
import os
import shutil
from multiprocessing import cpu_count
from time import time
@ -21,12 +22,12 @@ import click
from tabulate import tabulate
from platformio import app, exception, fs, util
from platformio.commands.device.command import device_monitor as cmd_device_monitor
from platformio.commands.run.helpers import clean_build_dir, handle_legacy_libdeps
from platformio.commands.run.processor import EnvironmentProcessor
from platformio.commands.test.processor import CTX_META_TEST_IS_RUNNING
from platformio.device.commands.monitor import device_monitor_cmd
from platformio.project.config import ProjectConfig
from platformio.project.helpers import find_project_dir_above, load_project_ide_data
from platformio.project.helpers import find_project_dir_above, load_build_metadata
from platformio.test.runners.base import CTX_META_TEST_IS_RUNNING
# pylint: disable=too-many-arguments,too-many-locals,too-many-branches
@ -65,10 +66,17 @@ except NotImplementedError:
"Default is a number of CPUs in a system (N=%d)" % DEFAULT_JOB_NUMS
),
)
@click.option("-s", "--silent", is_flag=True)
@click.option("-v", "--verbose", is_flag=True)
@click.option(
"-a",
"--program-arg",
"program_args",
multiple=True,
help="A program argument (multiple are allowed)",
)
@click.option("--disable-auto-clean", is_flag=True)
@click.option("--list-targets", is_flag=True)
@click.option("-s", "--silent", is_flag=True)
@click.option("-v", "--verbose", is_flag=True)
@click.pass_context
def cli(
ctx,
@ -78,10 +86,11 @@ def cli(
project_dir,
project_conf,
jobs,
silent,
verbose,
program_args,
disable_auto_clean,
list_targets,
silent,
verbose,
):
app.set_session_var("custom_project_conf", project_conf)
@ -91,6 +100,7 @@ def cli(
is_test_running = CTX_META_TEST_IS_RUNNING in ctx.meta
results = []
with fs.cd(project_dir):
config = ProjectConfig.get_instance(project_conf)
config.validate(environment)
@ -100,7 +110,7 @@ def cli(
# clean obsolete build dir
if not disable_auto_clean:
build_dir = config.get_optional_dir("build")
build_dir = config.get("platformio", "build_dir")
try:
clean_build_dir(build_dir, config)
except: # pylint: disable=bare-except
@ -113,7 +123,6 @@ def cli(
handle_legacy_libdeps(project_dir, config)
default_envs = config.default_envs()
results = []
for env in config.envs():
skipenv = any(
[
@ -137,21 +146,25 @@ def cli(
environment,
target,
upload_port,
jobs,
program_args,
is_test_running,
silent,
verbose,
jobs,
is_test_running,
)
)
command_failed = any(r.get("succeeded") is False for r in results)
command_failed = any(r.get("succeeded") is False for r in results)
if not is_test_running and (command_failed or not silent) and len(results) > 1:
print_processing_summary(results, verbose)
if not is_test_running and (command_failed or not silent) and len(results) > 1:
print_processing_summary(results, verbose)
if command_failed:
raise exception.ReturnErrorCode(1)
return True
# Reset custom project config
app.set_session_var("custom_project_conf", None)
if command_failed:
raise exception.ReturnErrorCode(1)
return True
def process_env(
@ -161,16 +174,25 @@ def process_env(
environments,
targets,
upload_port,
jobs,
program_args,
is_test_running,
silent,
verbose,
jobs,
is_test_running,
):
if not is_test_running and not silent:
print_processing_header(name, config, verbose)
ep = EnvironmentProcessor(
ctx, name, config, targets, upload_port, silent, verbose, jobs
ctx,
name,
config,
targets,
upload_port,
jobs,
program_args,
silent,
verbose,
)
result = {"env": name, "duration": time(), "succeeded": ep.process()}
result["duration"] = time() - result["duration"]
@ -185,7 +207,7 @@ def process_env(
and "nobuild" not in ep.get_build_targets()
):
ctx.invoke(
cmd_device_monitor, environment=environments[0] if environments else None
device_monitor_cmd, environment=environments[0] if environments else None
)
return result
@ -200,7 +222,7 @@ def print_processing_header(env, config, verbose=False):
"Processing %s (%s)"
% (click.style(env, fg="cyan", bold=True), "; ".join(env_dump))
)
terminal_width, _ = click.get_terminal_size()
terminal_width, _ = shutil.get_terminal_size()
click.secho("-" * terminal_width, bold=True)
@ -272,7 +294,7 @@ def print_processing_summary(results, verbose=False):
def print_target_list(envs):
tabular_data = []
for env, data in load_project_ide_data(os.getcwd(), envs).items():
for env, data in load_build_metadata(os.getcwd(), envs).items():
tabular_data.extend(
sorted(
[

View File

@ -23,8 +23,8 @@ from platformio.project.helpers import compute_project_checksum, get_project_dir
def handle_legacy_libdeps(project_dir, config):
legacy_libdeps_dir = join(project_dir, ".piolibdeps")
if not isdir(legacy_libdeps_dir) or legacy_libdeps_dir == config.get_optional_dir(
"libdeps"
if not isdir(legacy_libdeps_dir) or legacy_libdeps_dir == config.get(
"platformio", "libdeps_dir"
):
return
if not config.has_section("env"):
@ -54,11 +54,11 @@ def clean_build_dir(build_dir, config):
if isdir(build_dir):
# check project structure
if isfile(checksum_file):
with open(checksum_file) as fp:
with open(checksum_file, encoding="utf8") as fp:
if fp.read() == checksum:
return
fs.rmtree(build_dir)
makedirs(build_dir)
with open(checksum_file, "w") as fp:
with open(checksum_file, mode="w", encoding="utf8") as fp:
fp.write(checksum)
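
Several hunks in this changeset swap `config.get_optional_dir("<name>")` for plain option lookups in the `[platformio]` section. A minimal sketch of the new access pattern, assuming it runs against a real project directory (the path below is hypothetical):

```python
from platformio import fs
from platformio.project.config import ProjectConfig

# Sketch: resolve project directories via plain [platformio] options,
# mirroring the replacements made throughout this changeset.
with fs.cd("/path/to/project"):  # hypothetical project location
    config = ProjectConfig.get_instance()
    print(config.get("platformio", "build_dir"))
    print(config.get("platformio", "libdeps_dir"))
```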

View File

@ -12,31 +12,44 @@
# See the License for the specific language governing permissions and
# limitations under the License.
from platformio.commands.platform import platform_install as cmd_platform_install
from platformio.commands.test.processor import CTX_META_TEST_RUNNING_NAME
from platformio.platform.exception import UnknownPlatform
from platformio.package.commands.install import install_project_env_dependencies
from platformio.platform.factory import PlatformFactory
from platformio.project.exception import UndefinedEnvPlatformError
from platformio.test.runners.base import CTX_META_TEST_RUNNING_NAME
# pylint: disable=too-many-instance-attributes
class EnvironmentProcessor(object):
def __init__( # pylint: disable=too-many-arguments
self, cmd_ctx, name, config, targets, upload_port, silent, verbose, jobs
self,
cmd_ctx,
name,
config,
targets,
upload_port,
jobs,
program_args,
silent,
verbose,
):
self.cmd_ctx = cmd_ctx
self.name = name
self.config = config
self.targets = [str(t) for t in targets]
self.upload_port = upload_port
self.jobs = jobs
self.program_args = program_args
self.silent = silent
self.verbose = verbose
self.jobs = jobs
self.options = config.items(env=name, as_dict=True)
def get_build_variables(self):
variables = {"pioenv": self.name, "project_config": self.config.path}
variables = dict(
pioenv=self.name,
project_config=self.config.path,
program_args=self.program_args,
)
if CTX_META_TEST_RUNNING_NAME in self.cmd_ctx.meta:
variables["piotest_running_name"] = self.cmd_ctx.meta[
@ -66,15 +79,16 @@ class EnvironmentProcessor(object):
if "monitor" in build_targets:
build_targets.remove("monitor")
try:
p = PlatformFactory.new(self.options["platform"])
except UnknownPlatform:
self.cmd_ctx.invoke(
cmd_platform_install,
platforms=[self.options["platform"]],
skip_default_package=True,
if "clean" not in build_targets:
install_project_env_dependencies(
self.name,
{
"project_targets": build_targets,
"piotest_running_name": build_vars.get("piotest_running_name"),
},
)
p = PlatformFactory.new(self.options["platform"])
result = p.run(build_vars, build_targets, self.silent, self.verbose, self.jobs)
result = PlatformFactory.new(self.options["platform"], autoinstall=True).run(
build_vars, build_targets, self.silent, self.verbose, self.jobs
)
return result["returncode"] == 0

View File

@ -13,9 +13,7 @@
# limitations under the License.
import json
import os
import platform
import subprocess
import sys
import click
@ -23,15 +21,20 @@ from tabulate import tabulate
from platformio import __version__, compat, fs, proc, util
from platformio.commands.system.completion import (
ShellType,
get_completion_install_path,
install_completion_code,
uninstall_completion_code,
)
from platformio.commands.system.prune import (
prune_cached_data,
prune_core_packages,
prune_platform_packages,
)
from platformio.package.manager.library import LibraryPackageManager
from platformio.package.manager.platform import PlatformPackageManager
from platformio.package.manager.tool import ToolPackageManager
from platformio.project.config import ProjectConfig
from platformio.project.helpers import get_project_cache_dir
@click.group("system", short_help="Miscellaneous system commands")
@ -61,12 +64,12 @@ def system_info(json_output):
}
data["core_dir"] = {
"title": "PlatformIO Core Directory",
"value": project_config.get_optional_dir("core"),
"value": project_config.get("platformio", "core_dir"),
}
data["platformio_exe"] = {
"title": "PlatformIO Core Executable",
"value": proc.where_is_program(
"platformio.exe" if proc.WINDOWS else "platformio"
"platformio.exe" if compat.IS_WINDOWS else "platformio"
),
}
data["python_exe"] = {
@ -85,7 +88,7 @@ def system_info(json_output):
"title": "Tools & Toolchains",
"value": len(
ToolPackageManager(
project_config.get_optional_dir("packages")
project_config.get("platformio", "packages_dir")
).get_installed()
),
}
@ -99,44 +102,59 @@ def system_info(json_output):
@cli.command("prune", short_help="Remove unused data")
@click.option("--force", "-f", is_flag=True, help="Do not prompt for confirmation")
def system_prune(force):
click.secho("WARNING! This will remove:", fg="yellow")
click.echo(" - cached API requests")
click.echo(" - cached package downloads")
click.echo(" - temporary data")
if not force:
click.confirm("Do you want to continue?", abort=True)
@click.option(
"--dry-run", is_flag=True, help="Do not prune, only show data that will be removed"
)
@click.option("--cache", is_flag=True, help="Prune only cached data")
@click.option(
"--core-packages", is_flag=True, help="Prune only unnecessary core packages"
)
@click.option(
"--platform-packages",
is_flag=True,
help="Prune only unnecessary development platform packages",
)
def system_prune(force, dry_run, cache, core_packages, platform_packages):
if dry_run:
click.secho(
"Dry run mode (do not prune, only show data that will be removed)",
fg="yellow",
)
click.echo()
reclaimed_total = 0
cache_dir = get_project_cache_dir()
if os.path.isdir(cache_dir):
reclaimed_total += fs.calculate_folder_size(cache_dir)
fs.rmtree(cache_dir)
reclaimed_cache = 0
reclaimed_core_packages = 0
reclaimed_platform_packages = 0
prune_all = not any([cache, core_packages, platform_packages])
if cache or prune_all:
reclaimed_cache = prune_cached_data(force, dry_run)
click.echo()
if core_packages or prune_all:
reclaimed_core_packages = prune_core_packages(force, dry_run)
click.echo()
if platform_packages or prune_all:
reclaimed_platform_packages = prune_platform_packages(force, dry_run)
click.echo()
click.secho(
"Total reclaimed space: %s" % fs.humanize_file_size(reclaimed_total), fg="green"
"Total reclaimed space: %s"
% fs.humanize_file_size(
reclaimed_cache + reclaimed_core_packages + reclaimed_platform_packages
),
fg="green",
)
@cli.group("completion", short_help="Shell completion support")
def completion():
# pylint: disable=import-error,import-outside-toplevel
try:
import click_completion # pylint: disable=unused-import,unused-variable
except ImportError:
click.echo("Installing dependent packages...")
subprocess.check_call(
[proc.get_pythonexe_path(), "-m", "pip", "install", "click-completion"],
)
pass
@completion.command("install", short_help="Install shell completion files/code")
@click.option(
"--shell",
default=None,
type=click.Choice(["fish", "bash", "zsh", "powershell", "auto"]),
help="The shell type, default=auto",
)
@click.argument("shell", type=click.Choice([t.value for t in ShellType]))
@click.option(
"--path",
type=click.Path(file_okay=True, dir_okay=False, readable=True, resolve_path=True),
@ -144,26 +162,18 @@ def completion():
"The standard installation path is used by default.",
)
def completion_install(shell, path):
import click_completion # pylint: disable=import-outside-toplevel,import-error
shell = shell or click_completion.get_auto_shell()
shell = ShellType(shell)
path = path or get_completion_install_path(shell)
install_completion_code(shell, path)
click.echo(
"PlatformIO CLI completion has been installed for %s shell to %s \n"
"Please restart a current shell session."
% (click.style(shell, fg="cyan"), click.style(path, fg="blue"))
% (click.style(shell.name, fg="cyan"), click.style(path, fg="blue"))
)
@completion.command("uninstall", short_help="Uninstall shell completion files/code")
@click.option(
"--shell",
default=None,
type=click.Choice(["fish", "bash", "zsh", "powershell", "auto"]),
help="The shell type, default=auto",
)
@click.argument("shell", type=click.Choice([t.value for t in ShellType]))
@click.option(
"--path",
type=click.Path(file_okay=True, dir_okay=False, readable=True, resolve_path=True),
@ -171,14 +181,11 @@ def completion_install(shell, path):
"The standard installation path is used by default.",
)
def completion_uninstall(shell, path):
import click_completion # pylint: disable=import-outside-toplevel,import-error
shell = shell or click_completion.get_auto_shell()
shell = ShellType(shell)
path = path or get_completion_install_path(shell)
uninstall_completion_code(shell, path)
click.echo(
"PlatformIO CLI completion has been uninstalled for %s shell from %s \n"
"Please restart a current shell session."
% (click.style(shell, fg="cyan"), click.style(path, fg="blue"))
% (click.style(shell.name, fg="cyan"), click.style(path, fg="blue"))
)

View File

@ -13,61 +13,75 @@
# limitations under the License.
import os
import subprocess
from enum import Enum
import click
from platformio.compat import IS_MACOS
class ShellType(Enum):
FISH = "fish"
ZSH = "zsh"
BASH = "bash"
def get_completion_install_path(shell):
home_dir = os.path.expanduser("~")
prog_name = click.get_current_context().find_root().info_name
if shell == "fish":
if shell == ShellType.FISH:
return os.path.join(
home_dir, ".config", "fish", "completions", "%s.fish" % prog_name
)
if shell == "bash":
return os.path.join(home_dir, ".bash_completion")
if shell == "zsh":
if shell == ShellType.ZSH:
return os.path.join(home_dir, ".zshrc")
if shell == "powershell":
return subprocess.check_output(
["powershell", "-NoProfile", "echo $profile"]
).strip()
if shell == ShellType.BASH:
return os.path.join(home_dir, ".bash_completion")
raise click.ClickException("%s is not supported." % shell)
def get_completion_code(shell):
if shell == ShellType.FISH:
return "eval (env _PIO_COMPLETE=fish_source pio)"
if shell == ShellType.ZSH:
code = "autoload -Uz compinit\ncompinit\n" if IS_MACOS else ""
return code + 'eval "$(_PIO_COMPLETE=zsh_source pio)"'
if shell == ShellType.BASH:
return 'eval "$(_PIO_COMPLETE=bash_source pio)"'
raise click.ClickException("%s is not supported." % shell)
def is_completion_code_installed(shell, path):
if shell == "fish" or not os.path.exists(path):
if shell == ShellType.FISH or not os.path.exists(path):
return False
import click_completion # pylint: disable=import-error,import-outside-toplevel
with open(path) as fp:
return click_completion.get_code(shell=shell) in fp.read()
with open(path, encoding="utf8") as fp:
return get_completion_code(shell) in fp.read()
def install_completion_code(shell, path):
import click_completion # pylint: disable=import-error,import-outside-toplevel
if is_completion_code_installed(shell, path):
return None
return click_completion.install(shell=shell, path=path, append=shell != "fish")
append = shell != ShellType.FISH
with open(path, mode="a" if append else "w", encoding="utf8") as fp:
if append:
fp.write("\n\n# Begin: PlatformIO Core completion support\n")
fp.write(get_completion_code(shell))
if append:
fp.write("\n# End: PlatformIO Core completion support\n\n")
return True
def uninstall_completion_code(shell, path):
if not os.path.exists(path):
return True
if shell == "fish":
if shell == ShellType.FISH:
os.remove(path)
return True
import click_completion # pylint: disable=import-error,import-outside-toplevel
with open(path, "r+") as fp:
with open(path, "r+", encoding="utf8") as fp:
contents = fp.read()
fp.seek(0)
fp.truncate()
fp.write(contents.replace(click_completion.get_code(shell=shell), ""))
fp.write(contents.replace(get_completion_code(shell), ""))
return True
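
A short sketch of driving the new completion helpers directly; the bash install path mirrors `get_completion_install_path()` above, and importing these helpers outside the `pio system completion` command is an assumption of the sketch:

```python
import os

from platformio.commands.system.completion import (
    ShellType,
    get_completion_code,
    install_completion_code,
)

# Print the bash completion snippet, then append it to ~/.bash_completion
# (the same path get_completion_install_path() returns for bash).
print(get_completion_code(ShellType.BASH))
install_completion_code(ShellType.BASH, os.path.expanduser("~/.bash_completion"))
```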

View File

@ -0,0 +1,98 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
from operator import itemgetter
import click
from tabulate import tabulate
from platformio import fs
from platformio.package.manager.core import remove_unnecessary_core_packages
from platformio.package.manager.platform import remove_unnecessary_platform_packages
from platformio.project.helpers import get_project_cache_dir
def prune_cached_data(force=False, dry_run=False, silent=False):
reclaimed_space = 0
if not silent:
click.secho("Prune cached data:", bold=True)
click.echo(" - cached API requests")
click.echo(" - cached package downloads")
click.echo(" - temporary data")
cache_dir = get_project_cache_dir()
if os.path.isdir(cache_dir):
reclaimed_space += fs.calculate_folder_size(cache_dir)
if not dry_run:
if not force:
click.confirm("Do you want to continue?", abort=True)
fs.rmtree(cache_dir)
if not silent:
click.secho("Space on disk: %s" % fs.humanize_file_size(reclaimed_space))
return reclaimed_space
def prune_core_packages(force=False, dry_run=False, silent=False):
if not silent:
click.secho("Prune unnecessary core packages:", bold=True)
return _prune_packages(force, dry_run, silent, remove_unnecessary_core_packages)
def prune_platform_packages(force=False, dry_run=False, silent=False):
if not silent:
click.secho("Prune unnecessary development platform packages:", bold=True)
return _prune_packages(force, dry_run, silent, remove_unnecessary_platform_packages)
def _prune_packages(force, dry_run, silent, handler):
if not silent:
click.echo("Calculating...")
items = [
(
pkg,
fs.calculate_folder_size(pkg.path),
)
for pkg in handler(dry_run=True)
]
items = sorted(items, key=itemgetter(1), reverse=True)
reclaimed_space = sum([item[1] for item in items])
if items and not silent:
click.echo(
tabulate(
[
(
pkg.metadata.spec.humanize(),
str(pkg.metadata.version),
fs.humanize_file_size(size),
)
for (pkg, size) in items
],
headers=["Package", "Version", "Size"],
)
)
if not dry_run:
if not force:
click.confirm("Do you want to continue?", abort=True)
handler(dry_run=False)
if not silent:
click.secho("Space on disk: %s" % fs.humanize_file_size(reclaimed_space))
return reclaimed_space
def calculate_unnecessary_system_data():
return (
prune_cached_data(force=True, dry_run=True, silent=True)
+ prune_core_packages(force=True, dry_run=True, silent=True)
+ prune_platform_packages(force=True, dry_run=True, silent=True)
)
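
A one-line sketch showing how the helper above can be used for a non-destructive estimate, assuming PlatformIO Core is importable:

```python
from platformio import fs
from platformio.commands.system.prune import calculate_unnecessary_system_data

# Dry-run estimate of the space `pio system prune` could reclaim; nothing is deleted.
print("Reclaimable:", fs.humanize_file_size(calculate_unnecessary_system_data()))
```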

View File

@ -0,0 +1,17 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# pylint: disable=unused-import
from platformio.test.command import test_cmd as cli

View File

@ -1,273 +0,0 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# pylint: disable=too-many-arguments, too-many-locals, too-many-branches
from fnmatch import fnmatch
from os import getcwd, listdir
from os.path import isdir, join
from time import time
import click
from tabulate import tabulate
from platformio import app, exception, fs, util
from platformio.commands.test.embedded import EmbeddedTestProcessor
from platformio.commands.test.native import NativeTestProcessor
from platformio.project.config import ProjectConfig
@click.command("test", short_help="Unit testing")
@click.option("--environment", "-e", multiple=True, metavar="<environment>")
@click.option(
"--filter",
"-f",
multiple=True,
metavar="<pattern>",
help="Filter tests by a pattern",
)
@click.option(
"--ignore",
"-i",
multiple=True,
metavar="<pattern>",
help="Ignore tests by a pattern",
)
@click.option("--upload-port")
@click.option("--test-port")
@click.option(
"-d",
"--project-dir",
default=getcwd,
type=click.Path(
exists=True, file_okay=False, dir_okay=True, writable=True, resolve_path=True
),
)
@click.option(
"-c",
"--project-conf",
type=click.Path(
exists=True, file_okay=True, dir_okay=False, readable=True, resolve_path=True
),
)
@click.option("--without-building", is_flag=True)
@click.option("--without-uploading", is_flag=True)
@click.option("--without-testing", is_flag=True)
@click.option("--no-reset", is_flag=True)
@click.option(
"--monitor-rts",
default=None,
type=click.IntRange(0, 1),
help="Set initial RTS line state for Serial Monitor",
)
@click.option(
"--monitor-dtr",
default=None,
type=click.IntRange(0, 1),
help="Set initial DTR line state for Serial Monitor",
)
@click.option("--verbose", "-v", is_flag=True)
@click.pass_context
def cli( # pylint: disable=redefined-builtin
ctx,
environment,
ignore,
filter,
upload_port,
test_port,
project_dir,
project_conf,
without_building,
without_uploading,
without_testing,
no_reset,
monitor_rts,
monitor_dtr,
verbose,
):
app.set_session_var("custom_project_conf", project_conf)
with fs.cd(project_dir):
config = ProjectConfig.get_instance(project_conf)
config.validate(envs=environment)
test_dir = config.get_optional_dir("test")
if not isdir(test_dir):
raise exception.TestDirNotExists(test_dir)
test_names = get_test_names(test_dir)
if not verbose:
click.echo("Verbose mode can be enabled via `-v, --verbose` option")
click.secho("Collected %d items" % len(test_names), bold=True)
results = []
default_envs = config.default_envs()
for testname in test_names:
for envname in config.envs():
section = "env:%s" % envname
# filter and ignore patterns
patterns = dict(filter=list(filter), ignore=list(ignore))
for key in patterns:
patterns[key].extend(config.get(section, "test_%s" % key, []))
skip_conditions = [
environment and envname not in environment,
not environment and default_envs and envname not in default_envs,
testname != "*"
and patterns["filter"]
and not any([fnmatch(testname, p) for p in patterns["filter"]]),
testname != "*"
and any([fnmatch(testname, p) for p in patterns["ignore"]]),
]
if any(skip_conditions):
results.append({"env": envname, "test": testname})
continue
click.echo()
print_processing_header(testname, envname)
cls = (
NativeTestProcessor
if config.get(section, "platform") == "native"
else EmbeddedTestProcessor
)
tp = cls(
ctx,
testname,
envname,
dict(
project_config=config,
project_dir=project_dir,
upload_port=upload_port,
test_port=test_port,
without_building=without_building,
without_uploading=without_uploading,
without_testing=without_testing,
no_reset=no_reset,
monitor_rts=monitor_rts,
monitor_dtr=monitor_dtr,
verbose=verbose,
silent=not verbose,
),
)
result = {
"env": envname,
"test": testname,
"duration": time(),
"succeeded": tp.process(),
}
result["duration"] = time() - result["duration"]
results.append(result)
print_processing_footer(result)
if without_testing:
return
print_testing_summary(results)
command_failed = any(r.get("succeeded") is False for r in results)
if command_failed:
raise exception.ReturnErrorCode(1)
def get_test_names(test_dir):
names = []
for item in sorted(listdir(test_dir)):
if isdir(join(test_dir, item)):
names.append(item)
if not names:
names = ["*"]
return names
def print_processing_header(test, env):
click.echo(
"Processing %s in %s environment"
% (
click.style(test, fg="yellow", bold=True),
click.style(env, fg="cyan", bold=True),
)
)
terminal_width, _ = click.get_terminal_size()
click.secho("-" * terminal_width, bold=True)
def print_processing_footer(result):
is_failed = not result.get("succeeded")
util.print_labeled_bar(
"[%s] Took %.2f seconds"
% (
(
click.style("FAILED", fg="red", bold=True)
if is_failed
else click.style("PASSED", fg="green", bold=True)
),
result["duration"],
),
is_error=is_failed,
)
def print_testing_summary(results):
click.echo()
tabular_data = []
succeeded_nums = 0
failed_nums = 0
duration = 0
for result in results:
duration += result.get("duration", 0)
if result.get("succeeded") is False:
failed_nums += 1
status_str = click.style("FAILED", fg="red")
elif result.get("succeeded") is None:
status_str = "IGNORED"
else:
succeeded_nums += 1
status_str = click.style("PASSED", fg="green")
tabular_data.append(
(
result["test"],
click.style(result["env"], fg="cyan"),
status_str,
util.humanize_duration_time(result.get("duration")),
)
)
click.echo(
tabulate(
tabular_data,
headers=[
click.style(s, bold=True)
for s in ("Test", "Environment", "Status", "Duration")
],
),
err=failed_nums,
)
util.print_labeled_bar(
"%s%d succeeded in %s"
% (
"%d failed, " % failed_nums if failed_nums else "",
succeeded_nums,
util.humanize_duration_time(duration),
),
is_error=failed_nums,
fg="red" if failed_nums else "green",
)

View File

@ -1,138 +0,0 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from time import sleep
import click
import serial
from platformio import exception, util
from platformio.commands.test.processor import TestProcessorBase
from platformio.platform.factory import PlatformFactory
class EmbeddedTestProcessor(TestProcessorBase):
SERIAL_TIMEOUT = 600
def process(self):
if not self.options["without_building"]:
self.print_progress("Building...")
target = ["__test"]
if self.options["without_uploading"]:
target.append("checkprogsize")
if not self.build_or_upload(target):
return False
if not self.options["without_uploading"]:
self.print_progress("Uploading...")
target = ["upload"]
if self.options["without_building"]:
target.append("nobuild")
else:
target.append("__test")
if not self.build_or_upload(target):
return False
if self.options["without_testing"]:
return True
self.print_progress("Testing...")
return self.run()
def run(self):
click.echo(
"If you don't see any output for the first 10 secs, "
"please reset board (press reset button)"
)
click.echo()
try:
ser = serial.Serial(
baudrate=self.get_baudrate(), timeout=self.SERIAL_TIMEOUT
)
ser.port = self.get_test_port()
ser.rts = self.options["monitor_rts"]
ser.dtr = self.options["monitor_dtr"]
ser.open()
except serial.SerialException as e:
click.secho(str(e), fg="red", err=True)
return False
if not self.options["no_reset"]:
ser.flushInput()
ser.setDTR(False)
ser.setRTS(False)
sleep(0.1)
ser.setDTR(True)
ser.setRTS(True)
sleep(0.1)
while True:
line = ser.readline().strip()
# fix non-ascii output from device
for i, c in enumerate(line[::-1]):
if not isinstance(c, int):
c = ord(c)
if c > 127:
line = line[-i:]
break
if not line:
continue
if isinstance(line, bytes):
line = line.decode("utf8", "ignore")
self.on_run_out(line)
if all([l in line for l in ("Tests", "Failures", "Ignored")]):
break
ser.close()
return not self._run_failed
def get_test_port(self):
# if test port is specified manually or in config
if self.options.get("test_port"):
return self.options.get("test_port")
if self.env_options.get("test_port"):
return self.env_options.get("test_port")
assert set(["platform", "board"]) & set(self.env_options.keys())
p = PlatformFactory.new(self.env_options["platform"])
board_hwids = p.board_config(self.env_options["board"]).get("build.hwids", [])
port = None
elapsed = 0
while elapsed < 5 and not port:
for item in util.get_serialports():
port = item["port"]
for hwid in board_hwids:
hwid_str = ("%s:%s" % (hwid[0], hwid[1])).replace("0x", "")
if hwid_str in item["hwid"]:
return port
# check if port is already configured
try:
serial.Serial(port, timeout=self.SERIAL_TIMEOUT).close()
except serial.SerialException:
port = None
if not port:
sleep(0.25)
elapsed += 0.25
if not port:
raise exception.PlatformioException(
"Please specify `test_port` for environment or use "
"global `--test-port` option."
)
return port

View File

@ -1,41 +0,0 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from os.path import join
from platformio import proc
from platformio.commands.test.processor import TestProcessorBase
from platformio.proc import LineBufferedAsyncPipe
class NativeTestProcessor(TestProcessorBase):
def process(self):
if not self.options["without_building"]:
self.print_progress("Building...")
if not self.build_or_upload(["__test"]):
return False
if self.options["without_testing"]:
return None
self.print_progress("Testing...")
return self.run()
def run(self):
build_dir = self.options["project_config"].get_optional_dir("build")
result = proc.exec_command(
[join(build_dir, self.env_name, "program")],
stdout=LineBufferedAsyncPipe(self.on_run_out),
stderr=LineBufferedAsyncPipe(self.on_run_out),
)
assert "returncode" in result
return result["returncode"] == 0 and not self._run_failed

View File

@ -1,230 +0,0 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import atexit
from os import listdir, remove
from os.path import isdir, isfile, join
from string import Template
import click
from platformio import exception
TRANSPORT_OPTIONS = {
"arduino": {
"include": "#include <Arduino.h>",
"object": "",
"putchar": "Serial.write(c);",
"flush": "Serial.flush();",
"begin": "Serial.begin($baudrate);",
"end": "Serial.end();",
"language": "cpp",
},
"mbed": {
"include": "#include <mbed.h>",
"object": (
"#if MBED_MAJOR_VERSION == 6\nUnbufferedSerial pc(USBTX, USBRX);\n"
"#else\nRawSerial pc(USBTX, USBRX);\n#endif"
),
"putchar": (
"#if MBED_MAJOR_VERSION == 6\npc.write(&c, 1);\n"
"#else\npc.putc(c);\n#endif"
),
"flush": "",
"begin": "pc.baud($baudrate);",
"end": "",
"language": "cpp",
},
"espidf": {
"include": "#include <stdio.h>",
"object": "",
"putchar": "putchar(c);",
"flush": "fflush(stdout);",
"begin": "",
"end": "",
},
"zephyr": {
"include": "#include <sys/printk.h>",
"object": "",
"putchar": 'printk("%c", c);',
"flush": "",
"begin": "",
"end": "",
},
"native": {
"include": "#include <stdio.h>",
"object": "",
"putchar": "putchar(c);",
"flush": "fflush(stdout);",
"begin": "",
"end": "",
},
"custom": {
"include": '#include "unittest_transport.h"',
"object": "",
"putchar": "unittest_uart_putchar(c);",
"flush": "unittest_uart_flush();",
"begin": "unittest_uart_begin();",
"end": "unittest_uart_end();",
"language": "cpp",
},
}
CTX_META_TEST_IS_RUNNING = __name__ + ".test_running"
CTX_META_TEST_RUNNING_NAME = __name__ + ".test_running_name"
class TestProcessorBase(object):
DEFAULT_BAUDRATE = 115200
def __init__(self, cmd_ctx, testname, envname, options):
self.cmd_ctx = cmd_ctx
self.cmd_ctx.meta[CTX_META_TEST_IS_RUNNING] = True
self.test_name = testname
self.options = options
self.env_name = envname
self.env_options = options["project_config"].items(env=envname, as_dict=True)
self._run_failed = False
self._output_file_generated = False
def get_transport(self):
transport = None
if self.env_options.get("platform") == "native":
transport = "native"
elif "framework" in self.env_options:
transport = self.env_options.get("framework")[0]
if "test_transport" in self.env_options:
transport = self.env_options["test_transport"]
if transport not in TRANSPORT_OPTIONS:
raise exception.PlatformioException(
"Unknown Unit Test transport `%s`. Please check a documentation how "
"to create an own 'Test Transport':\n"
"- https://docs.platformio.org/page/plus/unit-testing.html" % transport
)
return transport.lower()
def get_baudrate(self):
return int(self.env_options.get("test_speed", self.DEFAULT_BAUDRATE))
def print_progress(self, text):
click.secho(text, bold=self.options.get("verbose"))
def build_or_upload(self, target):
if not self._output_file_generated:
self.generate_output_file(
self.options["project_config"].get_optional_dir("test")
)
self._output_file_generated = True
if self.test_name != "*":
self.cmd_ctx.meta[CTX_META_TEST_RUNNING_NAME] = self.test_name
try:
# pylint: disable=import-outside-toplevel
from platformio.commands.run.command import cli as cmd_run
return self.cmd_ctx.invoke(
cmd_run,
project_dir=self.options["project_dir"],
project_conf=self.options["project_config"].path,
upload_port=self.options["upload_port"],
verbose=self.options["verbose"],
silent=self.options["silent"],
environment=[self.env_name],
disable_auto_clean="nobuild" in target,
target=target,
)
except exception.ReturnErrorCode:
return False
def process(self):
raise NotImplementedError
def run(self):
raise NotImplementedError
def on_run_out(self, line):
line = line.strip()
if line.endswith(":PASS"):
click.echo("%s\t[%s]" % (line[:-5], click.style("PASSED", fg="green")))
elif ":FAIL" in line:
self._run_failed = True
click.echo("%s\t[%s]" % (line, click.style("FAILED", fg="red")))
else:
click.echo(line)
def generate_output_file(self, test_dir):
assert isdir(test_dir)
file_tpl = "\n".join(
[
"$include",
"#include <output_export.h>",
"",
"$object",
"",
"#ifdef __GNUC__",
"void output_start(unsigned int baudrate __attribute__((unused)))",
"#else",
"void output_start(unsigned int baudrate)",
"#endif",
"{",
" $begin",
"}",
"",
"void output_char(int c)",
"{",
" $putchar",
"}",
"",
"void output_flush(void)",
"{",
" $flush",
"}",
"",
"void output_complete(void)",
"{",
" $end",
"}",
]
)
tmp_file_prefix = "tmp_pio_test_transport"
def delete_tmptest_files(test_dir):
for item in listdir(test_dir):
if item.startswith(tmp_file_prefix) and isfile(join(test_dir, item)):
try:
remove(join(test_dir, item))
except: # pylint: disable=bare-except
click.secho(
"Warning: Could not remove temporary file '%s'. "
"Please remove it manually." % join(test_dir, item),
fg="yellow",
)
transport_options = TRANSPORT_OPTIONS[self.get_transport()]
tpl = Template(file_tpl).substitute(transport_options)
data = Template(tpl).substitute(baudrate=self.get_baudrate())
delete_tmptest_files(test_dir)
tmp_file = join(
test_dir,
"%s.%s" % (tmp_file_prefix, transport_options.get("language", "c")),
)
with open(tmp_file, "w") as fp:
fp.write(data)
atexit.register(delete_tmptest_files, test_dir)
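Editor's note: generate_output_file() runs the template through string.Template twice, first filling in the transport snippets, then the $baudrate value, and writes the result to a temporary tmp_pio_test_transport.* file in the test directory. A minimal sketch of that two-pass substitution, shortened to two placeholders and using the "espidf" entry from TRANSPORT_OPTIONS above (illustrative only, not part of the diff):

# Illustrative sketch: the two-pass Template substitution used above.
from string import Template

file_tpl = "$include\nvoid output_char(int c)\n{\n    $putchar\n}"
espidf = {"include": "#include <stdio.h>", "putchar": "putchar(c);",
          "flush": "fflush(stdout);", "begin": "", "end": ""}

tpl = Template(file_tpl).substitute(espidf)        # first pass: transport snippets
data = Template(tpl).substitute(baudrate=115200)   # second pass: $baudrate (unused here)
print(data)
# #include <stdio.h>
# void output_char(int c)
# {
#     putchar(c);
# }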


@@ -14,16 +14,11 @@
import click
from platformio.cache import cleanup_content_cache
from platformio.commands.lib.command import CTX_META_STORAGE_DIRS_KEY
from platformio.commands.lib.command import lib_update as cmd_lib_update
from platformio.commands.platform import platform_update as cmd_platform_update
from platformio.package.manager.core import update_core_packages
from platformio.package.manager.library import LibraryPackageManager
@click.command(
"update", short_help="Update installed platforms, packages and libraries"
"update",
short_help="Update installed platforms, packages and libraries",
hidden=True,
)
@click.option("--core-packages", is_flag=True, help="Update only the core packages")
@click.option(
@@ -35,25 +30,9 @@ from platformio.package.manager.library import LibraryPackageManager
@click.option(
"--dry-run", is_flag=True, help="Do not update, only check for the new versions"
)
@click.pass_context
def cli(ctx, core_packages, only_check, dry_run):
# cleanup lib search results, cached board and platform lists
cleanup_content_cache("http")
only_check = dry_run or only_check
update_core_packages(only_check)
if core_packages:
return
click.echo()
click.echo("Platform Manager")
click.echo("================")
ctx.invoke(cmd_platform_update, only_check=only_check)
click.echo()
click.echo("Library Manager")
click.echo("===============")
ctx.meta[CTX_META_STORAGE_DIRS_KEY] = [LibraryPackageManager().package_dir]
ctx.invoke(cmd_lib_update, only_check=only_check)
def cli(*_, **__):
click.secho(
"This command is deprecated and will be removed in the next releases. \n"
"Please use `pio pkg update` instead.",
fg="yellow",
)


@@ -21,14 +21,16 @@ import click
from platformio import VERSION, __version__, app, exception
from platformio.clients.http import fetch_remote_content
from platformio.compat import WINDOWS
from platformio.compat import IS_WINDOWS
from platformio.package.manager.core import update_core_packages
from platformio.proc import exec_command, get_pythonexe_path
from platformio.project.helpers import get_project_cache_dir
@click.command("upgrade", short_help="Upgrade PlatformIO to the latest version")
@click.command("upgrade", short_help="Upgrade PlatformIO Core to the latest version")
@click.option("--dev", is_flag=True, help="Use development branch")
def cli(dev):
update_core_packages()
if not dev and __version__ == get_latest_version():
return click.secho(
"You're up-to-date!\nPlatformIO %s is currently the "
@@ -40,7 +42,7 @@ def cli(dev):
to_develop = dev or not all(c.isdigit() for c in __version__ if c != ".")
cmds = (
["pip", "install", "--upgrade", get_pip_package(to_develop)],
["pip", "install", "--upgrade", download_dist_package(to_develop)],
["platformio", "--version"],
)
@@ -73,7 +75,7 @@ def cli(dev):
if not r:
raise exception.UpgradeError("\n".join([str(cmd), str(e)]))
permission_errors = ("permission denied", "not permitted")
if any(m in r["err"].lower() for m in permission_errors) and not WINDOWS:
if any(m in r["err"].lower() for m in permission_errors) and not IS_WINDOWS:
click.secho(
"""
-----------------
@@ -94,7 +96,7 @@ WARNING! Don't use `sudo` for the rest PlatformIO commands.
return True
def get_pip_package(to_develop):
def download_dist_package(to_develop):
if not to_develop:
return "platformio"
dl_url = "https://github.com/platformio/platformio-core/archive/develop.zip"
@@ -103,7 +105,7 @@ def get_pip_package(to_develop):
os.makedirs(cache_dir)
pkg_name = os.path.join(cache_dir, "piocoredevelop.zip")
try:
with open(pkg_name, "w") as fp:
with open(pkg_name, "wb") as fp:
r = exec_command(
["curl", "-fsSL", dl_url], stdout=fp, universal_newlines=True
)


@@ -12,23 +12,56 @@
# See the License for the specific language governing permissions and
# limitations under the License.
# pylint: disable=unused-import, no-name-in-module, import-error,
# pylint: disable=no-member, undefined-variable, unexpected-keyword-arg
# pylint: disable=unused-import,no-name-in-module
import glob
import importlib.util
import inspect
import json
import locale
import os
import re
import sys
from platformio.exception import UserSideException
PY2 = sys.version_info[0] == 2
CYGWIN = sys.platform.startswith("cygwin")
WINDOWS = sys.platform.startswith("win")
MACOS = sys.platform.startswith("darwin")
if sys.version_info >= (3, 7):
from asyncio import create_task as aio_create_task
from asyncio import get_running_loop as aio_get_running_loop
else:
from asyncio import ensure_future as aio_create_task
from asyncio import get_event_loop as aio_get_running_loop
PY2 = sys.version_info[0] == 2 # DO NOT REMOVE IT. ESP8266/ESP32 depend on it
IS_CYGWIN = sys.platform.startswith("cygwin")
IS_WINDOWS = WINDOWS = sys.platform.startswith("win")
IS_MACOS = sys.platform.startswith("darwin")
MISSING = object()
string_types = (str,)
def is_bytes(x):
return isinstance(x, (bytes, memoryview, bytearray))
def ci_strings_are_equal(a, b):
if a == b:
return True
if not a or not b:
return False
return a.strip().lower() == b.strip().lower()
def hashlib_encode_data(data):
if is_bytes(data):
return data
if not isinstance(data, string_types):
data = str(data)
return data.encode()
def load_python_module(name, pathname):
spec = importlib.util.spec_from_file_location(name, pathname)
module = importlib.util.module_from_spec(spec)
spec.loader.exec_module(module)
return module
def get_filesystem_encoding():
@@ -53,106 +86,21 @@ def get_object_members(obj, ignore_private=True):
}
def ci_strings_are_equal(a, b):
if a == b:
return True
if not a or not b:
return False
return a.strip().lower() == b.strip().lower()
def ensure_python3(raise_exception=True):
if not raise_exception or not PY2:
return not PY2
compatible = sys.version_info >= (3, 6)
if not raise_exception or compatible:
return compatible
raise UserSideException(
"Python 3.5 or later is required for this operation. \n"
"Please install the latest Python 3 and reinstall PlatformIO Core using "
"installation script:\n"
"https://docs.platformio.org/page/core/installation.html"
"Python 3.6 or later is required for this operation. \n"
"Please check a migration guide:\n"
"https://docs.platformio.org/en/latest/core/migration.html"
"#drop-support-for-python-2-and-3-5"
)
if PY2:
import imp
string_types = (str, unicode)
def is_bytes(x):
return isinstance(x, (buffer, bytearray))
def path_to_unicode(path):
if isinstance(path, unicode):
return path
return path.decode(get_filesystem_encoding())
def hashlib_encode_data(data):
if is_bytes(data):
return data
if isinstance(data, unicode):
data = data.encode(get_filesystem_encoding())
elif not isinstance(data, string_types):
data = str(data)
return data
def dump_json_to_unicode(obj):
if isinstance(obj, unicode):
return obj
return json.dumps(
obj, encoding=get_filesystem_encoding(), ensure_ascii=False
).encode("utf8")
_magic_check = re.compile("([*?[])")
_magic_check_bytes = re.compile(b"([*?[])")
def glob_recursive(pathname):
return glob.glob(pathname)
def glob_escape(pathname):
"""Escape all special characters."""
# https://github.com/python/cpython/blob/master/Lib/glob.py#L161
# Escaping is done by wrapping any of "*?[" between square brackets.
# Metacharacters do not work in the drive part and shouldn't be
# escaped.
drive, pathname = os.path.splitdrive(pathname)
if isinstance(pathname, bytes):
pathname = _magic_check_bytes.sub(br"[\1]", pathname)
else:
pathname = _magic_check.sub(r"[\1]", pathname)
return drive + pathname
def load_python_module(name, pathname):
return imp.load_source(name, pathname)
else:
import importlib.util
from glob import escape as glob_escape
string_types = (str,)
def is_bytes(x):
return isinstance(x, (bytes, memoryview, bytearray))
def path_to_unicode(path):
return path
def hashlib_encode_data(data):
if is_bytes(data):
return data
if not isinstance(data, string_types):
data = str(data)
return data.encode()
def dump_json_to_unicode(obj):
if isinstance(obj, string_types):
return obj
return json.dumps(obj)
def glob_recursive(pathname):
return glob.glob(pathname, recursive=True)
def load_python_module(name, pathname):
spec = importlib.util.spec_from_file_location(name, pathname)
module = importlib.util.module_from_spec(spec)
spec.loader.exec_module(module)
return module
def path_to_unicode(path):
"""
Deprecated: Compatibility with dev-platforms,
and custom device monitor filters
"""
return path
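The refactored compat module keeps a single Python 3 code path; the former PY2/PY3 split shown above collapses into the Python 3 implementations, with only path_to_unicode left as a no-op for backward compatibility. A quick sketch of how the retained helpers behave (illustrative, not part of the diff):

# Illustrative sketch: behaviour of the retained compat helpers.
assert ci_strings_are_equal("Uno", " uno ")         # case/whitespace-insensitive match
assert not ci_strings_are_equal("", "uno")          # empty values never match
assert hashlib_encode_data("esp32") == b"esp32"     # str is encoded for hashlib
assert hashlib_encode_data(42) == b"42"             # non-strings are stringified first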

platformio/debug/command.py (new file, 203 lines)

@@ -0,0 +1,203 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# pylint: disable=too-many-arguments, too-many-locals
# pylint: disable=too-many-branches, too-many-statements
import asyncio
import os
import signal
import subprocess
import click
from platformio import app, exception, fs, proc
from platformio.compat import IS_WINDOWS
from platformio.debug import helpers
from platformio.debug.config.factory import DebugConfigFactory
from platformio.debug.exception import DebugInvalidOptionsError
from platformio.debug.process.gdb import GDBClientProcess
from platformio.platform.factory import PlatformFactory
from platformio.project.config import ProjectConfig
from platformio.project.exception import ProjectEnvsNotAvailableError
from platformio.project.helpers import is_platformio_project
from platformio.project.options import ProjectOptions
@click.command(
"debug",
context_settings=dict(ignore_unknown_options=True),
short_help="Unified Debugger",
)
@click.option(
"-d",
"--project-dir",
default=os.getcwd,
type=click.Path(
exists=True, file_okay=False, dir_okay=True, writable=True, resolve_path=True
),
)
@click.option(
"-c",
"--project-conf",
type=click.Path(
exists=True, file_okay=True, dir_okay=False, readable=True, resolve_path=True
),
)
@click.option("--environment", "-e", metavar="<environment>")
@click.option("--load-mode", type=ProjectOptions["env.debug_load_mode"].type)
@click.option("--verbose", "-v", is_flag=True)
@click.option("--interface", type=click.Choice(["gdb"]))
@click.argument("__unprocessed", nargs=-1, type=click.UNPROCESSED)
@click.pass_context
def debug_cmd(
ctx,
project_dir,
project_conf,
environment,
load_mode,
verbose,
interface,
__unprocessed,
):
app.set_session_var("custom_project_conf", project_conf)
# use env variables from Eclipse or CLion
for name in ("CWD", "PWD", "PLATFORMIO_PROJECT_DIR"):
if is_platformio_project(project_dir):
break
if os.getenv(name):
project_dir = os.getenv(name)
with fs.cd(project_dir):
return _debug_in_project_dir(
ctx,
project_dir,
project_conf,
environment,
load_mode,
verbose,
interface,
__unprocessed,
)
def _debug_in_project_dir(
ctx,
project_dir,
project_conf,
environment,
load_mode,
verbose,
interface,
__unprocessed,
):
project_config = ProjectConfig.get_instance(project_conf)
project_config.validate(envs=[environment] if environment else None)
env_name = environment or helpers.get_default_debug_env(project_config)
if not interface:
return helpers.predebug_project(
ctx, project_dir, project_config, env_name, False, verbose
)
env_options = project_config.items(env=env_name, as_dict=True)
if "platform" not in env_options:
raise ProjectEnvsNotAvailableError()
debug_config = DebugConfigFactory.new(
PlatformFactory.new(env_options["platform"], autoinstall=True),
project_config,
env_name,
)
if "--version" in __unprocessed:
return subprocess.run(
[debug_config.client_executable_path, "--version"], check=True
)
try:
fs.ensure_udev_rules()
except exception.InvalidUdevRules as e:
click.echo(
helpers.escape_gdbmi_stream("~", str(e) + "\n")
if helpers.is_gdbmi_mode()
else str(e) + "\n",
nl=False,
)
rebuild_prog = False
preload = debug_config.load_cmds == ["preload"]
load_mode = load_mode or debug_config.load_mode
if load_mode == "always":
rebuild_prog = preload or not helpers.has_debug_symbols(
debug_config.program_path
)
elif load_mode == "modified":
rebuild_prog = helpers.is_prog_obsolete(
debug_config.program_path
) or not helpers.has_debug_symbols(debug_config.program_path)
if not (debug_config.program_path and os.path.isfile(debug_config.program_path)):
rebuild_prog = True
if preload or (not rebuild_prog and load_mode != "always"):
# don't load firmware through debug server
debug_config.load_cmds = []
if rebuild_prog:
if helpers.is_gdbmi_mode():
click.echo(
helpers.escape_gdbmi_stream(
"~", "Preparing firmware for debugging...\n"
),
nl=False,
)
stream = helpers.GDBMIConsoleStream()
with proc.capture_std_streams(stream):
helpers.predebug_project(
ctx, project_dir, project_config, env_name, preload, verbose
)
stream.close()
else:
click.echo("Preparing firmware for debugging...")
helpers.predebug_project(
ctx, project_dir, project_config, env_name, preload, verbose
)
# save SHA sum of newly created prog
if load_mode == "modified":
helpers.is_prog_obsolete(debug_config.program_path)
if not os.path.isfile(debug_config.program_path):
raise DebugInvalidOptionsError("Program/firmware is missing")
loop = asyncio.ProactorEventLoop() if IS_WINDOWS else asyncio.get_event_loop()
asyncio.set_event_loop(loop)
client = GDBClientProcess(project_dir, debug_config)
coro = client.run(__unprocessed)
try:
signal.signal(signal.SIGINT, signal.SIG_IGN)
loop.run_until_complete(coro)
if IS_WINDOWS:
client.close()
# there is an issue with the `asyncio` executor and STDIN,
# it cannot be closed gracefully
proc.force_exit()
finally:
client.close()
loop.close()
return True
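The firmware rebuild decision in _debug_in_project_dir() depends on debug_load_mode. As a reading aid, the same branching expressed as a standalone function (a hypothetical helper, not part of the codebase):

def should_rebuild(load_mode, preload, prog_exists, prog_obsolete, has_debug_symbols):
    # Missing program/firmware always forces a rebuild.
    if not prog_exists:
        return True
    if load_mode == "always":
        return preload or not has_debug_symbols
    if load_mode == "modified":
        return prog_obsolete or not has_debug_symbols
    return False

# With debug_load_mode = "modified" and an unchanged ELF that still contains
# debug symbols, nothing is rebuilt and load_cmds is cleared, so the firmware
# is not re-uploaded through the debug server.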


@@ -0,0 +1,249 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import json
import os
from platformio import fs, proc, util
from platformio.compat import string_types
from platformio.debug.exception import DebugInvalidOptionsError
from platformio.debug.helpers import reveal_debug_port
from platformio.project.config import ProjectConfig
from platformio.project.helpers import load_build_metadata
from platformio.project.options import ProjectOptions
class DebugConfigBase: # pylint: disable=too-many-instance-attributes
def __init__(self, platform, project_config, env_name):
self.platform = platform
self.project_config = project_config
self.env_name = env_name
self.env_options = project_config.items(env=env_name, as_dict=True)
self.build_data = self._load_build_data()
self.tool_name = None
self.board_config = {}
self.tool_settings = {}
if "board" in self.env_options:
self.board_config = platform.board_config(self.env_options["board"])
self.tool_name = self.board_config.get_debug_tool_name(
self.env_options.get("debug_tool")
)
self.tool_settings = (
self.board_config.get("debug", {})
.get("tools", {})
.get(self.tool_name, {})
)
self._load_cmds = None
self._port = None
self.server = self._configure_server()
try:
platform.configure_debug_session(self)
except NotImplementedError:
pass
@staticmethod
def cleanup_cmds(items):
items = ProjectConfig.parse_multi_values(items)
return ["$LOAD_CMDS" if item == "$LOAD_CMD" else item for item in items]
@property
def program_path(self):
return self.build_data["prog_path"]
@property
def client_executable_path(self):
return self.build_data["gdb_path"]
@property
def load_cmds(self):
if self._load_cmds is not None:
return self._load_cmds
result = self.env_options.get("debug_load_cmds")
if not result:
result = self.tool_settings.get("load_cmds")
if not result:
# legacy
result = self.tool_settings.get("load_cmd")
if not result:
result = ProjectOptions["env.debug_load_cmds"].default
return self.cleanup_cmds(result)
@load_cmds.setter
def load_cmds(self, cmds):
self._load_cmds = cmds
@property
def load_mode(self):
result = self.env_options.get("debug_load_mode")
if not result:
result = self.tool_settings.get("load_mode")
return result or ProjectOptions["env.debug_load_mode"].default
@property
def init_break(self):
missed = object()
result = self.env_options.get("debug_init_break", missed)
if result != missed:
return result
result = None
if not result:
result = self.tool_settings.get("init_break")
return result or ProjectOptions["env.debug_init_break"].default
@property
def init_cmds(self):
return self.cleanup_cmds(
self.env_options.get("debug_init_cmds", self.tool_settings.get("init_cmds"))
)
@property
def extra_cmds(self):
return self.cleanup_cmds(
self.env_options.get("debug_extra_cmds")
) + self.cleanup_cmds(self.tool_settings.get("extra_cmds"))
@property
def port(self):
return reveal_debug_port(
self.env_options.get("debug_port", self.tool_settings.get("port"))
or self._port,
self.tool_name,
self.tool_settings,
)
@port.setter
def port(self, value):
self._port = value
@property
def upload_protocol(self):
return self.env_options.get(
"upload_protocol", self.board_config.get("upload", {}).get("protocol")
)
@property
def speed(self):
return self.env_options.get("debug_speed", self.tool_settings.get("speed"))
@property
def server_ready_pattern(self):
return self.env_options.get(
"debug_server_ready_pattern", (self.server or {}).get("ready_pattern")
)
def _load_build_data(self):
data = load_build_metadata(os.getcwd(), self.env_name, cache=True)
if data:
return data
raise DebugInvalidOptionsError("Could not load a build configuration")
def _configure_server(self):
# user disabled server in platformio.ini
if "debug_server" in self.env_options and not self.env_options.get(
"debug_server"
):
return None
result = None
# specific server per system
if isinstance(self.tool_settings.get("server", {}), list):
for item in self.tool_settings["server"][:]:
self.tool_settings["server"] = item
if util.get_systype() in item.get("system", []):
break
# user overwrites debug server
if self.env_options.get("debug_server"):
result = {
"cwd": None,
"executable": None,
"arguments": self.env_options.get("debug_server"),
}
result["executable"] = result["arguments"][0]
result["arguments"] = result["arguments"][1:]
elif "server" in self.tool_settings:
result = self.tool_settings["server"]
server_package = result.get("package")
server_package_dir = (
self.platform.get_package_dir(server_package)
if server_package
else None
)
if server_package and not server_package_dir:
self.platform.install_package(server_package)
server_package_dir = self.platform.get_package_dir(server_package)
result.update(
dict(
cwd=server_package_dir if server_package else None,
executable=result.get("executable"),
arguments=[
a.replace("$PACKAGE_DIR", server_package_dir)
if server_package_dir
else a
for a in result.get("arguments", [])
],
)
)
return self.reveal_patterns(result) if result else None
def get_init_script(self, debugger):
try:
return getattr(self, "%s_INIT_SCRIPT" % debugger.upper())
except AttributeError:
raise NotImplementedError
def reveal_patterns(self, source, recursive=True):
program_path = self.program_path or ""
patterns = {
"PLATFORMIO_CORE_DIR": self.project_config.get("platformio", "core_dir"),
"PYTHONEXE": proc.get_pythonexe_path(),
"PROJECT_DIR": os.getcwd(),
"PROG_PATH": program_path,
"PROG_DIR": os.path.dirname(program_path),
"PROG_NAME": os.path.basename(os.path.splitext(program_path)[0]),
"DEBUG_PORT": self.port,
"UPLOAD_PROTOCOL": self.upload_protocol,
"INIT_BREAK": self.init_break or "",
"LOAD_CMDS": "\n".join(self.load_cmds or []),
}
for key, value in patterns.items():
if key.endswith(("_DIR", "_PATH")):
patterns[key] = fs.to_unix_path(value)
def _replace(text):
for key, value in patterns.items():
pattern = "$%s" % key
text = text.replace(pattern, value or "")
return text
if isinstance(source, string_types):
source = _replace(source)
elif isinstance(source, (list, dict)):
items = enumerate(source) if isinstance(source, list) else source.items()
for key, value in items:
if isinstance(value, string_types):
source[key] = _replace(value)
elif isinstance(value, (list, dict)) and recursive:
source[key] = self.reveal_patterns(value, patterns)
data = json.dumps(source)
if any(("$" + key) in data for key in patterns):
source = self.reveal_patterns(source, patterns)
return source
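reveal_patterns() walks strings, lists and dicts and replaces $-prefixed placeholders with values from the current build. A reduced sketch of that substitution with hypothetical values (illustrative only, not part of the diff):

# Illustrative sketch: the placeholder substitution performed by reveal_patterns().
patterns = {
    "PROG_PATH": "/project/.pio/build/uno/firmware.elf",   # hypothetical values
    "DEBUG_PORT": ":3333",
    "LOAD_CMDS": "load",
    "INIT_BREAK": "tbreak main",
}

def _replace(text):
    for key, value in patterns.items():
        text = text.replace("$" + key, value or "")
    return text

server = {"arguments": ["-ex", "target extended-remote $DEBUG_PORT", "$LOAD_CMDS"]}
server["arguments"] = [_replace(a) for a in server["arguments"]]
# -> ["-ex", "target extended-remote :3333", "load"]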


@@ -0,0 +1,49 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from platformio.debug.config.base import DebugConfigBase
class BlackmagicDebugConfig(DebugConfigBase):
GDB_INIT_SCRIPT = """
define pio_reset_halt_target
set language c
set *0xE000ED0C = 0x05FA0004
set $busy = (*0xE000ED0C & 0x4)
while ($busy)
set $busy = (*0xE000ED0C & 0x4)
end
set language auto
end
define pio_reset_run_target
pio_reset_halt_target
end
target extended-remote $DEBUG_PORT
monitor swdp_scan
attach 1
set mem inaccessible-by-default off
$LOAD_CMDS
$INIT_BREAK
set language c
set *0xE000ED0C = 0x05FA0004
set $busy = (*0xE000ED0C & 0x4)
while ($busy)
set $busy = (*0xE000ED0C & 0x4)
end
set language auto
"""


@@ -0,0 +1,48 @@
# Copyright (c) 2014-present PlatformIO <contact@platformio.org>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import importlib
import re
from platformio.debug.config.generic import GenericDebugConfig
from platformio.debug.config.native import NativeDebugConfig
class DebugConfigFactory(object):
@staticmethod
def get_clsname(name):
name = re.sub(r"[^\da-z\_\-]+", "", name, flags=re.I)
return "%sDebugConfig" % name.lower().capitalize()
@classmethod
def new(cls, platform, project_config, env_name):
board_config = platform.board_config(
project_config.get("env:" + env_name, "board")
)
tool_name = (
board_config.get_debug_tool_name(
project_config.get("env:" + env_name, "debug_tool")
)
if board_config
else None
)
config_cls = None
try:
mod = importlib.import_module("platformio.debug.config.%s" % tool_name)
config_cls = getattr(mod, cls.get_clsname(tool_name))
except ModuleNotFoundError:
config_cls = (
GenericDebugConfig if platform.is_embedded() else NativeDebugConfig
)
return config_cls(platform, project_config, env_name)
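The factory derives the config class name from the resolved debug tool and imports the matching platformio.debug.config.<tool> module; tools without a dedicated module fall back to GenericDebugConfig (embedded platforms) or NativeDebugConfig. A short sketch of the name mapping (illustrative, not part of the diff):

# Illustrative sketch: tool name to config class name mapping.
assert DebugConfigFactory.get_clsname("blackmagic") == "BlackmagicDebugConfig"
assert DebugConfigFactory.get_clsname("jlink") == "JlinkDebugConfig"
# e.g. debug_tool = blackmagic resolves to the BlackmagicDebugConfig class
# defined in the file above; a tool without its own module raises
# ModuleNotFoundError and falls back to GenericDebugConfig or NativeDebugConfig.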

Some files were not shown because too many files have changed in this diff.