When debugging with VisualGDB, the signedness shown by IntelliSense seems to be wrong. Using the following code:
<Type> difference;
<Type> desiredVal = 10;
<Type> actualVal = 21;
difference = desiredVal - actualVal;
difference should be -11 for signed types and a large wrapped-around value (type_max - 10) for unsigned types. However, the debugger does not follow these rules for some reason.
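For concreteness, here is a minimal sketch (not my actual project code) with the <Type> placeholder filled in as the 8-bit fixed-width types, showing the values I expect:

#include <stdint.h>
#include <stdio.h>

int main(void)
{
    int8_t  s_diff = (int8_t)10 - (int8_t)21;   /* signed: -11 */
    uint8_t u_diff = (uint8_t)10 - (uint8_t)21; /* unsigned: wraps to 245, i.e. UINT8_MAX - 10 */

    printf("s_diff = %d, u_diff = %u\n", s_diff, (unsigned)u_diff); /* prints -11 and 245 */
    return 0;
}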
From the limited debugging I have done, the “(un)signed int” and “(un)signed char” types display correctly, and I assume other “standard” types like these will work correctly as well.
However, when using the “(u)intN_t” types, this behaviour seems to be flipped: with “uint8_t” the debugger shows -11, and with “int8_t” it shows 245.
It should be noted that the code itself behaves normally at runtime. For example, if the following code is appended to the snippet above:
if (difference < -10)
int a = 2;
The “intN_t” types will hit the int a = 2 line, while the “uintN_t” types will, correctly, skip it.
The issue appears to be only visual.
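For reference, here is a self-contained version of the repro I am stepping through (a sketch with <Type> filled in as uint8_t and wrapped in a main() for completeness; the int8_t case is analogous):

#include <stdint.h>

int main(void)
{
    uint8_t difference;
    uint8_t desiredVal = 10;
    uint8_t actualVal = 21;

    difference = desiredVal - actualVal;  /* runtime value: 245, but the watch window shows -11 */

    if (difference < -10)                 /* correctly NOT taken for uint8_t */
    {
        int a = 2;                        /* with int8_t this line is hit, as expected */
        (void)a;
    }

    return 0;
}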
I am using VisualGDB on an STM32F411VET (devboard) with the following package versions:
ARM toolchain: 6.2.0/7.12/r3
STM32 Devices: 4.2
OpenOCD: 20161025
*Edited to fix code blocks (twice :))