BitFlipper

Forum Replies Created

Viewing 11 posts - 76 through 86 (of 86 total)
  • in reply to: Setting a breakpoint causes the application to crash #2569
    BitFlipper
    Participant

    @ket wrote:

    Hi,

    What happens if you press on ‘Break all’ during execution?

    In that case VS acts as if I stopped debugging altogether. The application exits and my client application also loses the connection immediately. I turned on GDB logging and saw there were some library issues; see the log output below. I’m not interested in debugging the libraries without debug info, so in theory it should not matter. Maybe someone with more experience with this can suggest a way to resolve this. I will try the “zypper” commands shown below in the meantime.

    &"Missing separate debuginfo for /lib64/libm.so.6\n"
    &"Try: zypper install -C \"debuginfo(build-id)=b10c3cae031ba5a87e715c117a83cd3bef83ebd2\"\n"
    &"Missing separate debuginfo for /lib64/librt.so.1\n"
    &"Try: zypper install -C \"debuginfo(build-id)=44612b93c19e6567318299411987b113d2387081\"\n"
    &"Missing separate debuginfo for /lib64/libdl.so.2\n"
    &"Try: zypper install -C \"debuginfo(build-id)=3e4f6bfee9fdf77ca975b77b8c325347d9228bb8\"\n"
    &"Missing separate debuginfo for /lib64/libpthread.so.0\n"
    &"Try: zypper install -C \"debuginfo(build-id)=368b7757bc9f9d7e2e93678c63cb3e5587a9934f\"\n"
    ~"[Thread debugging using libthread_db enabled]\n"
    &"Missing separate debuginfo for /lib64/libc.so.6\n"
    &"Try: zypper install -C \"debuginfo(build-id)=72e7b043935a2bd0b80d325f7f166a132cf37140\"\n"
    ~"Stopped due to shared library event\n"
    *stopped,thread-id="1",stopped-threads="all",core="0"
    -break-list
    ^done,BreakpointTable={nr_rows="0",nr_cols="6",hdr=[{width="7",alignment="-1",col_name="number",colhdr="Num"},{width="14",alignment="-1",col_name="type",colhdr="Type"},{width="4",alignment="-1",col_name="disp",colhdr="Disp"},{width="3",alignment="-1",col_name="enabled",colhdr="Enb"},{width="10",alignment="-1",col_name="addr",colhdr="Address"},{width="40",alignment="2",col_name="what",colhdr="What"}],body=[]}
    info shared
    &"info shared\n"
    ~"From                To                  Syms Read   Shared Object Library\n"
    ~"0x00007ffff7ddeb00  0x00007ffff7df6514  Yes         /lib64/ld-linux-x86-64.so.2\n"
    ~"0x00007ffff7b69580  0x00007ffff7ba5a58  Yes         /lib64/libm.so.6\n"
    ~"0x00007ffff795e2e0  0x00007ffff7961e38  Yes         /lib64/librt.so.1\n"
    ~"0x00007ffff7758df0  0x00007ffff7759948  Yes         /lib64/libdl.so.2\n"
    ~"0x00007ffff75403f0  0x00007ffff754c358  Yes         /lib64/libpthread.so.0\n"
    ~"0x00007ffff71e58c0  0x00007ffff72e8ef8  Yes         /lib64/libc.so.6\n"
    ^done
    -exec-continue
    ^running
    *running,thread-id="1"
    =thread-created,id="2",group-id="i1"
    ~"[New Thread 0x7ffff6e85700 (LWP 21692)]\n"
    *running,thread-id="all"
    =thread-created,id="3",group-id="i1"
    ~"[New Thread 0x7ffff67ba700 (LWP 21693)]\n"
    *running,thread-id="all"
    ~"[Thread 0x7ffff67ba700 (LWP 21693) exited]\n"
    =thread-exited,id="3",group-id="i1"
    ~"[Thread 0x7ffff6e85700 (LWP 21692) exited]\n"
    =thread-exited,id="2",group-id="i1"
    =thread-exited,id="1",group-id="i1"
    =thread-group-exited,id="i1"
    *stopped,reason="signal-received",signal-name="SIGPWR",signal-meaning="Power fail/restart",reason="signal-received",signal-name="SIGXCPU",signal-meaning="CPU time limit exceeded",reason="signal-received",signal-name="SIGPWR",signal-meaning="Power fail/restart",reason="signal-received",signal-name="SIGXCPU",signal-meaning="CPU time limit exceeded",reason="signal-received",signal-name="SIGPWR",signal-meaning="Power fail/restart",reason="signal-received",signal-name="SIGXCPU",signal-meaning="CPU time limit exceeded",reason="signal-received",signal-name="SIGINT",signal-meaning="Interrupt",reason="exited-signalled",signal-name="SIGINT",signal-meaning="Interrupt"
    -gdb-exit
    ^exit
    
    in reply to: Need VisualGDB to not change Make commands #2468
    BitFlipper
    Participant

    OK, thanks for the info. For now I’m just going to ignore the warning. I’m setting up a system that will be used by other developers as well, and they will have to be made aware that they should not click on the warning link.

    Due to the complexity of our project, I have written my own configure, sync and build system that bypasses all of VGDB’s functionality. The only thing it is used for now is debugging. VGDB is a good product, but if you need something a bit out of the ordinary it has some limitations. Since I need it to work this way, here are some things that would make it more flexible:

    1. Recognize environment variables in the list of SSH Hosts. Then we can have something like $(BUILD_USER)@$(BUILD_HOST) which makes the VGDB project portable (currently I’m experimenting with modifying VGDB’s list of host files with some luck).
    2. Allow the build steps to be completely disabled, including checking/changing the make commands in the VS project.

    in reply to: How can I do the following…? #2434
    BitFlipper
    Participant

    Thanks for the detailed reply, much appreciated.

    So let me try to explain the problem I have in more detail… As mentioned before, my source code tree is very large, consisting of many different products that can each be compiled out of this tree depending on which parameters are passed into make. My product’s dependencies are spread out all over this tree, and manually trying to come up with a subset of dependencies is an almost impossible task. So what we all do is just get the whole source code tree and let make sort it out. Note that the same source code tree can be built on Windows and Linux. Windows builds happen locally on my dev system.

    Next, I’m in a remote office from where the Linux build machine is. Our network is not super slow, but not super fast either. We are discouraged from using shared folders with these large source trees for the purpose of syncing or building across the network since it would use large amounts of bandwidth.

    As for syncing the sources, there are different scenarios. Every now and then we get the “latest” sources from source code control, and the whole tree needs to be synced to the build machine. But for a typical edit/build/debug cycle, it is possible to isolate just a small subset of the whole tree, because we know exactly which subfolders our own code is in. It is this subset of folders that my sync utility focuses on, which is why it can be fast.

    What I’m finding is that VisualGDB is very slow to determine which files have changed (if any), simply because of the tree size. I could probably set up the same subset of folders in VisualGDB that I use in my scan utility, but it would end up being something like 30 different entries, which would be difficult to set up and maintain. My scan utility stores the list of subfolders in a config file, which is easier to maintain when there are many entries.

    So the problem is a bit complicated. I’m currently trying to come up with additional ways to handle these different syncing scenarios.
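    For illustration, the subset-scan idea above can be sketched in a few lines of Python. This is a minimal sketch under assumed names (the manifest file, the config-driven subfolder list), not the actual utility:

```python
import json
import os


def load_manifest(path):
    """Load the previous {file: mtime} snapshot, or start empty."""
    try:
        with open(path) as f:
            return json.load(f)
    except FileNotFoundError:
        return {}


def scan_subfolders(subfolders):
    """Walk only the configured subfolders (not the whole tree)
    and record each file's modification time."""
    snapshot = {}
    for folder in subfolders:
        for root, _dirs, files in os.walk(folder):
            for name in files:
                full = os.path.join(root, name)
                snapshot[full] = os.path.getmtime(full)
    return snapshot


def changed_files(manifest_path, subfolders):
    """Return files that are new or whose mtime differs from the
    last run, then save the fresh snapshot as the new manifest."""
    old = load_manifest(manifest_path)
    new = scan_subfolders(subfolders)
    changed = [f for f, mtime in new.items() if old.get(f) != mtime]
    with open(manifest_path, "w") as f:
        json.dump(new, f)
    return changed
```

    Because only the configured subfolders are walked, the scan cost scales with the subset, not with the full tree.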

    Here are some features that would make VisualGDB even better:

    1. The ability to enable/disable individual actions in the list of actions. This would be especially helpful while setting up complicated actions that need some experimentation.

    2. Ability to disable the feature that adds new source files to the make files. In my case I would never want it to modify the make files, yet with the large number of files this step slows everything down.

    3. Support for custom commands that can be executed on demand (macros). For instance, after getting the latest sources for my whole source code tree, I would like to press a mapped toolbar button or keyboard combination that runs a pre-configured VGDB command (or list of commands). One can roughly do the same thing by creating multiple VGDB build configurations, but this is cumbersome and difficult to maintain.

    Thanks for the tip about the environment variables. I played with these and it seems it would do what I want. I have not tried this yet but I assume the environment variables would also work in the SSH Connection Manager?

    Once I have everything working, I’ll take a look at the visualizers. I’ve created standard Visual Studio visualizers before and was interested to read the documentation about VGDB’s support for visualizers.

    I’ll also be very interested in testing a pre-release version as long as I can go back to a previous version if for some reason it isn’t working properly.

    Anyway, VisualGDB is a great product and I’m interested in making it even better… 🙂

    in reply to: How can I do the following…? #2432
    BitFlipper
    Participant

    OK, I figured out a way. In my scan utility I create a bash script at runtime that gets copied to the build server. VisualGDB then executes this bash script, so I can now put whatever I want in there. If no files changed, the bash script is simply empty.

    This also gives me great flexibility to run additional commands from the build machine via this bash script that would otherwise have been tricky to configure via VisualGDB. So now VisualGDB simply runs two commands… A local command that runs my scan utility, and then the bash script that was created by this utility and which it already copied to the build machine.
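    The script-generation step can be sketched like this (names are hypothetical, and the real utility also handles copying the script to the build machine):

```python
def write_build_script(commands, script_path):
    """Write a bash script containing the given commands. An empty
    command list produces a script that does nothing, so executing
    it on the build machine is a cheap no-op when no files changed."""
    body = "\n".join(commands)
    with open(script_path, "w") as f:
        f.write("#!/bin/bash\nset -e\n" + body + ("\n" if body else ""))
```

    VisualGDB itself only ever sees two fixed steps: run the local scan utility, then run whatever script it produced.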

    BitFlipper
    Participant

    Anyone?

    BTW I tried playing around with the “Files to transfer” field but I can’t figure out how to exclude a specific folder. Does anyone know if this is even possible, and if so, what is the correct syntax?

    BitFlipper
    Participant

    OK figured it out. I needed to add it to the environment variables listed under the GDB Customize settings.

    in reply to: How to get rid of these annoying exceptions? #2119
    BitFlipper
    Participant

    OK it turns out I can disable these by setting GDB up to not stop on them. Under “GDB startup commands”, I can add something like this:

    handle SIGPWR nostop pass
    handle SIGXCPU nostop pass
    handle SIG35 nostop pass

    Here is some documentation on setting up exceptions under GDB: http://idlebox.net/2010/apidocs/gdb-7.0.zip/gdb_6.html#SEC44

    BitFlipper
    Participant

    @ket wrote:

    The main binary and the main deployable executable need to be set to one file, not several files. This executable is given to gdb for debugging and gdb expects one file. The rest of the files this binary depends on should be moved to the deployment machine separately.

    OK makes sense. However in my case the binary that gdb needs to execute is not any of the binaries being built in my project. So I guess I can just pick any one of the many binaries and it should at least copy them all.

    @ket wrote:

    Consider converting the project from Linux to Custom, custom projects allow executing arbitrary sequences of commands for building and debugging. We have a tutorial at http://visualgdb.com/tutorials/linux/convert/ where we convert a project to custom to move a generated output file from the deployment machine to the windows development file, in your case you need to add the copying of the deployment binaries to the ‘actions before launching GDB’.

    Yes, I converted it to a custom project now. It seems I have much more control this way. I do run into a problem that seems to be related to permissions when copying the files to the remote system: I get the error “LIBSSH2_ERROR_SCP_PROTOCOL”. I’m trying to figure out what to change in order to have the files transferred successfully.

    in reply to: Importing a very large project – Help!! #2066
    BitFlipper
    Participant

    @Nox Metus wrote:

    @BitFlipper wrote:

    Unfortunately the project I need to work on pulls in many files all over the place so it would be a nightmare to try and do it manually in VS.

    Write a simple Python script that parses .d files and creates the corresponding “” and “Source Files” strings for a chosen subset of your project.

    Hmmm, interesting. I’ll look into this. Unfortunately right now I’m struggling to figure out how to copy all the output files from the build machine to the debug machine.
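    A minimal sketch of that .d-parsing idea (just the dependency extraction; generating the project-file strings is left out):

```python
def parse_d_file(text):
    """Parse a make-style .d dependency file: join backslash-continued
    lines, then return (target, [prerequisites]) pairs."""
    joined = text.replace("\\\n", " ")
    rules = []
    for line in joined.splitlines():
        if ":" not in line:
            continue
        target, _, deps = line.partition(":")
        rules.append((target.strip(), deps.split()))
    return rules
```

    Feeding every .d file produced by a build through this gives the file subset that actually participates in the product, which could then be emitted as solution entries.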

    in reply to: Importing a very large project – Help!! #2064
    BitFlipper
    Participant

    @Nox Metus wrote:

    Probably we work in the same company. I ended up manually creating a solution only for one project I’m working on. Well, almost manually: I tracked the files compiled when I run make, then added only those files that belong to my project to the solution. I chose “makefile” type of the solution configuration and added all include directories to the NMake section of configuration for IntelliSense. Also I use plink to initiate build: it supports authentication through pageant in contrast to VisualGDB. And finally you might want to specify another location for IntelliSense database, since most likely you project is located on a network share. You can do it in Tools|Options|Text Editor|C/C++|Advanced|Fallback location.

    Thanks for the info. Unfortunately the project I need to work on pulls in many files all over the place so it would be a nightmare to try and do it manually in VS. It takes over an hour to build from scratch just to give you an idea of the number of files involved.

    I can now successfully build the project from VS. However I’m a bit stuck on how to set everything else up. I build the project on one machine, then deploy on another for debugging. Right now it is deploying just the main binary I specified instead of all the files in the build folder. I’m guessing instead of specifying just the one file I need to use wildcards. Trial and error is getting me closer…

    But back to the performance issues with large projects… It would be good to be able to do the following (maybe there is already a way to do some of these):

    1. Have multiple source file sync options. For instance, even if nothing changes, it still takes a long time for VisualGDB to get past the sync stage since it is checking all the files. What would help is to have multiple sync groups. I can have an “All” sync group, in which case all files are synced after I got new sources for my whole tree. Then I can also have a “Debug” sync group that only contains files/folders that I know I will be working on. This way, when VisualGDB tries to sync the files, it will be much faster since it only checks the sub-group of files in my “Debug” list.

    2. Can I start debugging without going through a build stage? Right now if I press F5, it syncs the files (see above), then initiates the remote build. However that build is really slow, even if everything is up to date (~10 minutes). So this is a real productivity killer if all I want to do is start debugging but I know the project is already up to date. I guess I can select to attach to an already running project. I’ll need to play with all the options to see what will work.

    Anyway I have not gotten to the point of actual debugging but if it works anything like it says on the tin then that would be great. Hopefully the sheer size of the project won’t make things too slow during debugging. Keeping my fingers crossed…

    in reply to: Importing a very large project – Help!! #2068
    BitFlipper
    Participant

    Agreed this is most likely a limitation in VS. Even so, it finally completed and all the files were in the project. However everything was quite sluggish so I deleted the whole project. I’m now starting from scratch with the suggestion to not import sources into Solution Explorer.

    So while the source files will now not be added automatically, I assume it should not be too difficult to add sub-branches from the source code into Solution Explorer, right? It would still be useful to use Solution Explorer to browse the sections of the source files that I am interested in.

    Thanks for your help!
