sigh
In my third attempt at introducing Linux into my everyday life, I'm still bouncing hard off programming.
I have a solution for my major thesis' program created in Visual Studio (five projects in one solution: 4 static libs, one executable), and I've managed it with VS since its beginning. It's hosted in Git, so I thought I'd try changing and compiling it on my laptop with Ubuntu 18.04. While changing it was a no-effort task (Linux text editors are neat, as opposed to the almost non-existent IDEs), compiling, on the other hand, was the point where I started doubting all my skills.
The first problem I've encountered is that I have no idea how to manage multiple projects outside VS. I mean, there are .vcxproj files with the relevant data, but from what I saw they're not very useful here.
The second problem is references: I have no idea how to point #include at specific directories without writing ../MainFolder/subfolder/file.h, which is extremely unaesthetic.
I expect those are just the tips of the iceberg and that I will run into a massive number of problems in the future, but for now: can anybody give me an idea of how to manage such a project on Linux?
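For what it's worth, the include-path half of this has a one-flag fix on Linux. Below is a minimal sketch, assuming the folder names from the question; the g++ -I flag is one common way to do it (a Makefile's CXXFLAGS or CMake's target_include_directories sets the same thing):

```
// main.cpp -- illustrative only; "MainFolder" and "subfolder/file.h" are the
// hypothetical names from the question.
//
// Compile with the project root on the include path, e.g.:
//   g++ -I/path/to/MainFolder -c main.cpp
//
// With that flag, the include below resolves without any ../ climbing:
#include "subfolder/file.h"

int main() {
    return 0;
}
```

The multi-project layout itself (4 static libs + 1 executable) maps naturally onto one build target per .vcxproj, with the executable target linking the four libraries.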
I'm working on Windows 10 Pro 64-bit.
When I run my Delphi project, the Smadav (2019) rev 13.2 antivirus kicks in and shows a message that a virus (malwez.767) has been found.
How can I fix this?
There are two possibilities:
Either your Delphi project is a virus,
or your Delphi project is not a virus.
I imagine you're dealing with the second case. Then it might be a good idea to contact the virus scanner's vendor and ask them why this false positive occurs.
Also, some virus scanners can exclude certain directories/files from scanning. That way, you might avoid having your Delphi executable scanned at all.
I'm a little late with this question, but better late than never. I've been using Visual Studio 6.0 since it came out, but recently switched to VS 2013 on a new PC.
I've gotten my projects to build under 2013, but the executables it produces are consistently bigger than the ones VS 6.0 produced. I've seen a similar thread on here about that happening in the transition from VS2008 to VS2010, and the comments and suggestions there all seem to attribute the change to changes in the statically linked MFC libraries. But my projects are straight C code. No C++, let alone MFC. And the 'Use of MFC' option on my project is set to "Use Standard Windows Libraries" (presumably set by the import tool that generated the 2013-compatible project). The only non-standard library it uses is wsock32.lib.
The extra size isn't a killer, but it's significant relative to the size of the whole app. My biggest .exe goes from 980 KB to 1.3 MB - about a 35% increase for an app whose small size was a selling point (i.e. install this tiny app and you have access to all of our goodies). That's without debugging info - the increase on the debug version is even bigger - but I don't really care about that.
Any ideas on how to strip out the new cruft - or even how to find out what it is?
This is a good manual on how to make your binaries smaller.
The basic ideas are the following (a short sketch follows the list):
Don't forget about Release mode
Define WIN32_LEAN_AND_MEAN before including <windows.h>
Dynamically link to the C++ runtime
Compile the executable without debugging information
Compile with /O1, an 'optimize for size' flag
Remove the iostream and fstream headers; use lower-level I/O instead if possible
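To make the code-level items concrete, here is a minimal sketch; the file name and the exact cl switches are illustrative, not taken from the question's project:

```
// main.c -- sketch of the two source-level items from the list above.

// Trim <windows.h> down before including it (the define must come first):
#define WIN32_LEAN_AND_MEAN
#include <windows.h>

// Prefer C stdio over <iostream>/<fstream>; it pulls in far less code:
#include <stdio.h>

int main(void) {
    // The remaining items are build settings, roughly:
    //   cl /O1 /MD main.c    (/O1 = optimize for size, /MD = dynamic CRT,
    //                          release build, no debug info)
    printf("small\n");
    return 0;
}
```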
Typically you generate a MAP file on both systems and compare them, to figure out which sections contribute the most to the growth.
Anton's answer reminds me: first check if they are both linked the same way (both static or both dynamic, otherwise it is apples and oranges).
I work from 2 different machines. One is Windows and the other is Linux. If I alternately work on the same project but switch between both OSes, will I eventually run into compilation errors? I ask because maybe there are standards supported by one but not by the other.
That question is a pretty broad one and it depends, strictly speaking, on your toolchain. If you were to use the same toolchain (e.g. GCC/MinGW or Clang), you'd be minimizing the chance of this class of errors. If you were to use Visual Studio on Windows and GCC or Clang on the Linux side, you'd run into more issues, if only because some of the headers differ. So once your program leaves the realm of strict ANSI C (C89), you'll be on your own.
However, if you aren't careful you may run into a lot of other, more mundane errors, such as the compiler on Linux choking on Windows line endings if you didn't tell your editor on the Windows side to use Unix ones.
Ah, and also keep in mind that if you want to actually cross-compile, GCC may be the best choice and therefore the first part I mentioned in my answer becomes a moot point. GCC is a proven choice on both ends. And given your question it's unlikely that you are trying to write something like a kernel mode driver - which would be fundamentally different.
That will only be a problem if your application uses some platform-specific API.
It is entirely possible to write code that compiles and works on both platforms. It is, however, not without some difficulties. Compilers allow non-standard extensions, and it's often hard to do fancier user interfaces (even if they're still just text), because as soon as you want to do more than "read a line of text as it is entered in a shell", you're in non-standard land.
If you do find yourself needing to do more than what the standard C library can do, make sure you isolate those parts of the code into a separate file (or a couple of files, one for Linux/Unix style systems and one for Windows systems).
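A minimal sketch of that isolation, using a single file with preprocessor branches (the wrapper name sleep_ms is made up for illustration; the separate-file-per-OS approach works the same way, with each branch in its own file chosen by the build system):

```
// platform.cpp -- illustrative portability wrapper.
#ifdef _WIN32
#include <windows.h>
void sleep_ms(unsigned ms) { Sleep(ms); }           // Win32 API
#else
#include <unistd.h>
void sleep_ms(unsigned ms) { usleep(ms * 1000u); }  // POSIX API
#endif
```

The rest of the code then calls sleep_ms() and never touches the OS headers directly.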
Using the same compiler (gcc) on both would help avoid problems of the "compiler B doesn't compile code that works fine in compiler A" kind.
But it's far from an absolute necessity - just make sure you compile the code on both platforms, and with all of your "supported" compilers, often enough that you discover "it's not working on the other system" before you've dug a very deep hole that is hard to get out of. It certainly helps if you have (at least) a virtual machine running the other OS, so you can easily try both variants.
Ideally, you want to set up an automated system, such that when you change the code [and feel that the changes are "complete"], it automatically gets built on both platforms and all compilers you want to use. And if possible, also automatically tested!
I would also seriously consider using version control - that way, when something breaks on one side or the other, you can go back and look at what the code looked like before it stopped working, and (hopefully) find the reason it broke much quicker than "Hmm, I think it's the change I made to foo.c, let's take that out... No, not that one, OK, how about the change here...". With version control, you can say "OK, so version 1234 doesn't work, let's try version 1220 - OK, that works. Now try 1228, still works - so the change is between 1229 and 1234 - try 1232, ah, it's broken...". No editing of files, and you can still go to any other version you like with very little difficulty. I have used Mercurial quite a bit, git a little, some Subversion, and worked on a project in Perforce for a few years. All of these are good - personally, I think I prefer Mercurial.
As a side effect: most version control systems also deal with filenames and line endings in a saner way than handling them manually.
If you combine your version control system with an automated build-and-test system such as Jenkins, you can get everything very automated. Jenkins is free and runs on both Windows and Linux, and you can use it to automatically build and test your code as and when you submit it to the version control system.
It will not create a problem as long as you recompile the source code on the respective OS. If you want to run a compiled file generated on Windows (.exe or .obj) on Linux, or vice versa, that will definitely be a problem and won't work. But you can move your source code (files with extension .c/.cpp) to either OS. Different header files can also cause problems, so take care of that as well. Best practice is to use a single OS for your entire project and avoid multiple OSes unless it is truly necessary.
I would like to know if there are good practices or good tools for moving a bunch of Windows makefile projects to some MSBuild (VS 2010) format.
If you think it's not a good idea to do it with a tool, maybe you know of something like a dependency analyser to help build a checklist?
Having recently converted a legacy "make"-based build to MSBuild, I'd have to say that there is no real easy way. Granted, the legacy build I was working on was actually calling msbuild to build .sln files (I believe the build engineer who put the other process in place was old-school and was using the toolset that best suited him, rather than .Net).
However, what I noticed was that the make tools (specifically nmake.exe/build.exe) were directory based - subdirs were "built" before parent dirs - whereas that is not the case for msbuild: it's solution and project based.
Get your code into Visual Studio projects living in a "flat" directory structure (having all projects as children of a single "Source" folder will really make your life easier in the long run - don't have projects that live several dirs "down the tree").
Use multiple solutions to break the build into "tiers", and order the building of the .sln files in a helper .bat file - this will help you in the long term if you convert to TeamBuild.
(My answer started to get out of control - your question reminds me of that joke about the American visiting Ireland who gets lost and asks a local, "How do you get to Killarney?", and the local replies, "Well, I wouldn't start from here." Can you give a bit more detail about what you are actually building? Is it .Net code? I'm sure there is countless advice I and others could give you, but we don't know what you are working with.)
So my company uses a delightfully buggy program called Rational Purify (as a plugin to Microsoft Visual Studio) to manage memory leaks. The program is designed to let you click on a memory leak after you have encountered it, and then jump to the line where the leak occurs.
Unfortunately, Purify is malfunctioning: it will not jump to the place where the leak occurred, it only mentions the class and method the leak occurs in. Unfortunately, sometimes this is about as useful as hiring a guide to help you hunt bears and having him point to the forest and tell you there are bears there.
Does anyone with Purify experience have any idea how I might fix this problem, or know of a good manual to look through?
Generally you have two options. One: exclude module DLLs from instrumentation in Purify; that sometimes helps. Two: get BoundsChecker; it does compile-time instrumentation, which is much slower, but the level of detail is an order of magnitude better.
We generally use Purify for check-in sanity checking, and BoundsChecker when we know a bug/crash exists.
BoundsChecker has some nice features, like instrumenting only files A.cpp and B.cpp and excluding all the rest.
Be aware that neither of these applications functions on 64-bit operating systems, and BoundsChecker will not even install on a 64-bit OS. Most frustrating if you make the switch to native 64-bit development with a 32-bit back port!
Purify is like a Swiss Army knife. If you know how to use it, you will get some results - not the best, but still results. If you don't, it will crash, because it is just another program running on Windows.
In the end you will need a lot of patience, rebuilds and a bit of luck.
Purify comes with a script called ScanVSSolutionForPurifyPlus.pl which will ensure that your project files have all the right settings for Purify to work properly. If you haven't run it, give it a go.
(I've personally used ScanVSSolutionForPurifyPlus.pl on a large solution, and it worked like a charm. One caveat: when you give it the name of your .sln file, you might need to give it the full pathname.)
Are you sure you have a debug build? Or rather, do you have all PDBs enabled? Try WinDbg on your executable and check with the !lmi command what is visible.
Is the whole codebase properly instrumented?
Also consider using something else, like the free Visual Leak Detector or Microsoft's LeakDiag tool.
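If you try Visual Leak Detector, getting started is essentially a single include in a debug build. A minimal sketch (the leak is deliberate, and VLD must be installed and on the include/lib paths):

```
// main.cpp -- minimal Visual Leak Detector example for a debug build.
#include <vld.h>   // enables VLD; it reports leaks when the program exits

int main() {
    int* leaked = new int[100];  // deliberately leaked allocation
    (void)leaked;                // silence "unused variable" warnings
    return 0;                    // VLD prints the leak's call stack on exit
}
```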
I used Purify about 5 years ago. It was really flaky then. They kept promising to fix all the bugs in the 'next release'. We gave up on it in the end. One can only wonder if they used their own QA tools on their products. Oh the irony...