ARMClang problems with optimization levels

  • #23815
    spsftd
    Participant

    Hi,

    My VisualGDB project was imported from my Keil project. When the VS Project Configuration Type is Debug, the default C/C++ -> Optimization Level is Disabled (-O0).

    When I load it onto my board, it no longer works.

    But when I change the optimization level to any other option, it works.

    What causes this?

    #23817
    support
    Keymaster

    Hi,

    VisualGDB sets the optimization level to “none” for debug builds because this makes it easier to debug the project. Debugging optimized code can be very complicated, as the compiler often reuses common chunks of generated code (complicating stepping) or discards the values of variables once they are no longer needed (preventing you from evaluating most of them in the debugger).
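
    If you need to debug one specific spot in an otherwise optimized build, a common workaround is to disable optimization for just that function with Clang’s optnone attribute, or to keep a value observable with volatile. Below is a minimal sketch; the function and variable names are made up purely for illustration:

    ```c
    /* Sketch only: process_sample() and debug_snapshot are hypothetical
     * names used for illustration. */

    /* Clang's optnone attribute disables optimization for this one function,
     * so stepping and variable inspection behave as they would at -O0. */
    __attribute__((optnone))
    static int process_sample(int raw)
    {
        int scaled = raw * 3;      /* stays live and inspectable */
        int offset = scaled + 17;
        return offset;
    }

    /* A volatile variable is never optimized away, so its value can always
     * be read in the debugger, even at -O2/-O3. */
    static volatile int debug_snapshot;

    int run_once(int raw)
    {
        int result = process_sample(raw);
        debug_snapshot = result;   /* observable checkpoint */
        return result;
    }
    ```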

    If your project only works with some optimization levels, but not with others, it likely contains bugs triggered by a specific memory layout, or runs out of stack/heap space. Either way, we would advise investigating this as it might cause further trouble in production code.
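
    As a starting point for the stack-exhaustion case, a common bare-metal technique is a stack “watermark”: paint the stack with a known pattern at startup and later check how much of it was never touched. The sketch below assumes GNU-style linker symbols named __StackLimit and __StackTop; substitute whatever your linker script actually defines:

    ```c
    #include <stdint.h>

    extern uint32_t __StackLimit;   /* lowest address of the stack region (assumption) */
    extern uint32_t __StackTop;     /* highest address / initial SP (assumption)       */

    #define STACK_FILL 0xDEADBEEFu

    /* Call very early (e.g. from the reset handler) to paint the unused
     * part of the stack with a known pattern. */
    void stack_paint(void)
    {
        uint32_t *p  = &__StackLimit;
        uint32_t *sp = (uint32_t *)__builtin_frame_address(0);
        /* Stop well below the current frame so live stack data is untouched. */
        while (p < sp - 16)
            *p++ = STACK_FILL;
    }

    /* Call later (or evaluate from the debugger) to see how many bytes of
     * stack were never used; a value near 0 strongly suggests an overflow. */
    uint32_t stack_unused_bytes(void)
    {
        uint32_t *p = &__StackLimit;
        while (p < &__StackTop && *p == STACK_FILL)
            ++p;
        return (uint32_t)((uintptr_t)p - (uintptr_t)&__StackLimit);
    }
    ```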

    #23823
    spsftd
    Participant

    Any debugging suggestions?

    How can I locate the problem?

    Or how can I rule out the conditions you just mentioned?

    #23824
    spsftd
    Participant

    Also, how can I tell where breakpoints can be set when debugging?

    And sometimes a breakpoint automatically moves to a different location from where I set it while debugging.

    #23832
    support
    Keymaster

    According to our records, your trial has expired, so unfortunately we won’t be able to provide any further support unless you purchase a VisualGDB license. Sorry.
