unity: add option to enable 64-bit formatting support

This option is not enabled by default because many existing tests
use integer assertions to check pointers:

   TEST_ASSERT_EQUAL(NULL, pointer)

This causes a "cast from pointer to integer of different size"
(-Wpointer-to-int-cast) warning, because Unity first converts every
argument to UNITY_UINT, and with 64-bit support enabled UNITY_UINT
becomes a 64-bit unsigned type.
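For illustration, a sketch of a warning-free rewrite. TEST_ASSERT_NULL and
TEST_ASSERT_EQUAL_PTR are Unity's pointer-aware assertion macros; the test
function itself is hypothetical:

    #include "unity.h"

    /* Hypothetical test case showing pointer-check alternatives. */
    void test_pointer_checks(void)
    {
        void *pointer = NULL;

        /* TEST_ASSERT_EQUAL(NULL, pointer);
           ...converts both sides to UNITY_UINT, which triggers
           -Wpointer-to-int-cast on 32-bit targets once UNITY_UINT
           becomes 64 bits wide. */

        /* Pointer-aware assertions avoid the integer cast entirely. */
        TEST_ASSERT_NULL(pointer);
        TEST_ASSERT_EQUAL_PTR(NULL, pointer);
    }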
Author: Ivan Grokhotkov
Date:   2021-05-07 12:39:01 +02:00
Parent: fae335dc68
Commit: 71f711976d

2 changed files with 12 additions and 1 deletion

@@ -13,6 +13,14 @@ menu "Unity unit testing library"
         help
             If not set, assertions on double arguments will not be available.
 
+    config UNITY_ENABLE_64BIT
+        bool "Support for 64-bit integer types"
+        default n
+        help
+            If not set, assertions on 64-bit integer types will always fail.
+            If this feature is enabled, take care not to pass pointers (which are 32 bit)
+            to UNITY_ASSERT_EQUAL, as that will cause pointer-to-int-cast warnings.
+
     config UNITY_ENABLE_COLOR
         bool "Colorize test output"
         default n
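With the Kconfig entry in place, enabling the option amounts to setting the
generated symbol in the project configuration. Assuming an ESP-IDF-style
project (suggested by the CONFIG_-prefixed symbols in the header below), that
would look like:

    # sdkconfig (or sdkconfig.defaults) -- file names assume an ESP-IDF project
    CONFIG_UNITY_ENABLE_64BIT=y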

@@ -21,13 +21,16 @@
 #define UNITY_EXCLUDE_DOUBLE
 #endif //CONFIG_UNITY_ENABLE_DOUBLE
 
+#ifdef CONFIG_UNITY_ENABLE_64BIT
+#define UNITY_SUPPORT_64
+#endif
+
 #ifdef CONFIG_UNITY_ENABLE_COLOR
 #define UNITY_OUTPUT_COLOR
 #endif
 
 #define UNITY_EXCLUDE_TIME_H
 
 void unity_flush(void);
 void unity_putc(int c);
 void unity_gets(char* dst, size_t len);
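Once UNITY_SUPPORT_64 is defined, Unity's 64-bit assertion macros such as
TEST_ASSERT_EQUAL_UINT64 stop failing unconditionally and compare full 64-bit
values. A minimal sketch, with a made-up value and a hypothetical test name:

    #include <stdint.h>
    #include "unity.h"

    /* Hypothetical test case: needs CONFIG_UNITY_ENABLE_64BIT=y.
       Without it, per the Kconfig help text above, 64-bit
       assertions always fail. */
    void test_64bit_compare(void)
    {
        uint64_t value = 0x123456789ABCDEF0ULL;
        TEST_ASSERT_EQUAL_UINT64(0x123456789ABCDEF0ULL, value);
    }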