Converting Makefiles to MSBuild - visual-c++

I would like to know if there are good practices or good tools for moving a bunch of Windows makefile projects to the MSBuild (VS 2010) format.
If you think it's not a good idea to do this with a tool, do you perhaps know of something like a dependency analyser that would help me put together a checklist?

Having recently converted a legacy "make"-based build to MSBuild, I'd have to say that there is no real easy way. Granted, the legacy build I was working on was actually calling msbuild to build .sln files (I believe that the build engineer who put the original process in place was old-school and was using the toolset that best suited him, rather than .NET).
However, what I noticed was that the make tools (specifically nmake.exe/build.exe) were directory based - subdirs were "built" before parent dirs - whereas msbuild is not: it's solution and project based.
Get your code into Visual Studio projects living in a "flat" directory structure (having all projects as children of a single "Source" folder will really make your life easier in the long run - don't have projects that live several dirs "down the tree").
Use multiple solutions to break the build into "tiers", and order the build of the .slns in a helper .bat file (see the sketch below) - this will help you in the long run when you convert to TeamBuild.
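For the "tiers" point, the helper script only needs to run msbuild on each solution in dependency order and stop at the first failure. A plain .bat with one msbuild line per solution does the job; here is the same idea sketched in Python (the solution names, paths and configuration below are placeholders, not part of the original advice):

    import subprocess
    import sys

    # Build order matters: each "tier" may depend on the outputs of the previous one.
    # Solution names and configuration are placeholders - substitute your own tiers.
    TIERS = [
        r"Source\Tier1.Core.sln",
        r"Source\Tier2.Services.sln",
        r"Source\Tier3.Apps.sln",
    ]

    def build_all(configuration="Release"):
        for sln in TIERS:
            print(f"=== Building {sln} ({configuration}) ===")
            # msbuild must be on PATH (e.g. run from a Visual Studio command prompt)
            result = subprocess.run(["msbuild", sln, f"/p:Configuration={configuration}", "/m"])
            if result.returncode != 0:
                sys.exit(f"Build failed in {sln} - stopping the chain.")

    if __name__ == "__main__":
        build_all()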
(My answer started to get out of control - your question reminds me of the joke about the American visiting Ireland who gets lost and asks a local "how do you get to Killarney?", and the local replies "well, I wouldn't start from here". Can you give a bit more detail about what you are actually building? Is it .NET code? I'm sure there is countless advice I and others could give you, but we don't know what you are working with.)
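As for the "dependency analyser to make a checklist" part of the question: a low-tech option is a small script that walks the makefile tree and dumps each target with its prerequisites, so you can tick them off as you port them into projects. A rough sketch in Python - the "target: prerequisites" regex is deliberately naive (no variables, inference rules or line continuations) and the file pattern is an assumption about how your makefiles are named:

    import re
    from pathlib import Path

    # Naive "target: prerequisites" matcher - ignores variables, inference
    # rules and line continuations, but it is enough for a porting checklist.
    RULE = re.compile(r"^([^\t#=][^:=]*?)\s*:\s*(.*)$")

    def scan_makefiles(root="."):
        checklist = {}
        for mk in sorted(Path(root).rglob("[Mm]akefile*")):
            for line in mk.read_text(errors="replace").splitlines():
                if line.startswith("\t") or line.lstrip().startswith("#"):
                    continue  # skip recipe lines and comments
                m = RULE.match(line)
                if m:
                    target = m.group(1).strip()
                    deps = m.group(2).split()
                    checklist.setdefault(str(mk), []).append((target, deps))
        return checklist

    if __name__ == "__main__":
        for makefile, rules in scan_makefiles().items():
            print(makefile)
            for target, deps in rules:
                print(f"  [ ] {target}  <-  {' '.join(deps) if deps else '(no prerequisites)'}")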

Related

How to manage developing a former Visual Studio solution in Linux?

sigh
In my third attempt at introducing Linux into my everyday life, I'm still bouncing hard off programming.
I have a solution for my major thesis program created in Visual Studio (five projects in one solution: four static libs, one executable), and I have managed it with VS since its beginning. It's in a Git repository, so I thought I'd try changing and compiling it on my laptop under Ubuntu 18.04. While changing it was a no-effort task (Linux text editors are neat, as opposed to the almost non-existent IDEs), compiling, on the other hand, was the point where I doubted all my skills.
The first problem I encountered is that I have no idea how to manage multiple projects outside VS. I mean, there are .vcxproj files with data in them, but from what I saw they are not very useful outside Visual Studio.
The second problem is references - I have no idea how to point #include directives at specific locations without writing ../MainFolder/subfolder/file.h, which is extremely unaesthetic.
I expect that these are just the tips of the iceberg and that I will run into a massive number of problems in the future, but for now - can anybody give me an idea of how to manage such a project in Linux?

Anjuta/Glade Tutorials or Better IDE?

I am attempting to develop a GUI application for Tails. I'm doing the initial development on Debian 8 since development directly in Tails can be a pain.
I started out using Anjuta, but the documentation is essentially non-existent. The Anjuta website has nothing at all about how Glade is integrated or how to use it. I can't even track down documentation on how to change the main window title. The only tutorial I found has you start a project and build it using the default files that are generated for a GTKmm project.
Is there a good book or online tutorial out there for doing GUI development in Anjuta?
This is maybe not a complete answer, but it's too large to put in as a comment. I use Anjuta fairly regularly, but I share your feeling about the missing documentation (which is, by the way, not unique to Anjuta). I appreciate Anjuta (and Glade) very much, so don't take the following as criticism of either program.
I would recommend you consider using PyGTK for GUI creation; it is a lot more productive. You can design the GUI in Glade - exactly the same way you would for C/C++ - and then implement the code in Python, which you can also edit and manage from Anjuta (a minimal sketch follows below). There are plenty of code examples, for instance on the nullege code search engine.
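If it helps, here is roughly what that Glade + Python workflow looks like. This is a minimal sketch using the GObject-introspection bindings (PyGObject, the successor to PyGTK) and GTK 3; it assumes a Glade file called app.ui containing a GtkWindow with id "main_window" and a GtkButton with id "quit_button" - those names are made up for the example:

    import gi
    gi.require_version("Gtk", "3.0")
    from gi.repository import Gtk


    class App:
        def __init__(self):
            builder = Gtk.Builder()
            builder.add_from_file("app.ui")   # the XML file that Glade saves
            builder.connect_signals(self)     # hook up handlers named in Glade
            self.window = builder.get_object("main_window")
            self.window.set_title("My Tails Tool")  # e.g. changing the main window title
            self.window.connect("destroy", Gtk.main_quit)

        def on_quit_button_clicked(self, button):  # handler name assigned in Glade
            Gtk.main_quit()

        def run(self):
            self.window.show_all()
            Gtk.main()


    if __name__ == "__main__":
        App().run()

The same split applies in C/C++ with GtkBuilder; only the language around the XML that Glade generates changes.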
About the workflow in Anjuta (for C/C++): it is based mainly on the Autotools system, so you should really read up a little on make, Makefiles, and related tools. Though in principle Anjuta manages this for you, you will sooner or later hit a problem, and some knowledge of Autotools will take you a long way (see also this tutorial or this one; this slide series is interesting, probably because it is more graphical, and there are even some video tutorials, like this one).
There is no real necessity to use Glade from inside Anjuta. In fact, Glade has gone through a long process of distancing itself from 'code generation'; it now only generates an XML description of the UI, and it can be run separately. I find the screen space left for Glade inside Anjuta insufficient for comfortable work anyway.
So, in conclusion: If you mainly need a GUI, consider Python + Gtk. If you do need C or C++, Anjuta is a great IDE, but look at Gtk Development examples (like this one). Following those, the use of Anjuta should be a lot clearer.
EDIT:
Very useful answer. I have some underlying legacy code that has to be C++. Is there a way to mix Python and C++ in Anjuta, or do you know of any guideposts or tutorials for such?
You can open a C++ project in Anjuta - maybe even import your legacy code directly as a Makefile project. You can also add new files to your C/C++ project and create them as Python files. I've never tried to do that, though, and I'm not sure how Anjuta would treat them in, for example, the Makefile(s). I don't have large projects mixing languages at the moment, but for small projects I like Geany, because it doesn't get in the way. You do have to maintain the Makefiles manually.
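On the mixing question specifically: independent of the IDE, a common pattern is to compile the legacy C++ into a shared library and call it from Python. A minimal sketch using ctypes from the Python standard library - the library name liblegacy.so and the function compute() are hypothetical and would need a small extern "C" wrapper on the C++ side:

    # Sketch: calling legacy C++ from Python via ctypes.
    # Assumes the C++ was built as a shared library, e.g.
    #   g++ -shared -fPIC -o liblegacy.so legacy_wrapper.cpp
    # where legacy_wrapper.cpp exposes:
    #   extern "C" double compute(double x, double y);
    # Both the library name and the function are placeholders for this example.
    import ctypes

    lib = ctypes.CDLL("./liblegacy.so")
    lib.compute.argtypes = [ctypes.c_double, ctypes.c_double]
    lib.compute.restype = ctypes.c_double

    print("legacy C++ returned:", lib.compute(2.0, 3.0))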

MonoTouch: Adding DLL references in sub projects

In a way, I am looking for best practice here.
I have a common project that is shared by many of my apps. This project has the FlurryAnalytics and ATMHud DLLs as references.
If I do not also reference these DLLs in the main project, the apps will often, but not always, fail in the debug-to-device test. When debugging to the simulator I don't need to add these DLLs to the main project.
So, the question is: Do I have to include references to DLLs in the main project that I have in sub projects all the time?
Whenever possible I use references to project files (csproj files) over references to assemblies (.dll). It makes a lot of things easier, like:
code navigation (IDE);
automatic build dependency (the source code you're reading is the one you're building, not something potentially out-of-sync);
source-level debugging (you can have it without project references, but this way you're sure to be in sync);
(easier) switch between Debug|Release|... configurations;
changing defines (or any project-level option);
E.g.
Solution1.sln
    Project1a.csproj
    MonoTouch.Dialog.csproj   (link to ../Common/MonoTouch.Dialog.csproj)
Solution2.sln
    Project2a.csproj
    MonoTouch.Dialog.csproj   (link to ../Common/MonoTouch.Dialog.csproj)
Common.sln
    MonoTouch.Dialog.csproj
Large solutions might suffer a bit from doing this (build performance, searching across files...). The larger they get, the less likely it is that everyone needs to know about every part of them, so there are diminishing returns on the advantages while the inconvenience grows with each project added.
E.g. I would not want to have project references to every framework assembly inside Mono (but personally I could live with all the SDK assemblies of MonoTouch ;-)
Note: working with assembly references should not cause you random errors while debugging on device. If you can create such a test case, please file a bug report :-)

Is there any good build tool that stays out of my way?

As the title says, I want a build tool that pretty much stays out of my way.
I would rather specify rules than steps in the build process. I want to say that I want a binary with a given name placed in the root directory of my project, that .o files should go into an obj/tmp dir, and that the source lives in the Source directory.
I do NOT want to tell it about this and that particular file, as I keep adding new files rather quickly; it should just scan the source directory (and its subdirectories) looking for Ragel (.rl) and C++ (.cxx) code and do whatever is necessary to turn it all into an executable.
I have looked into many tools, like auto{make,conf,header} (I did not really like having to place the files it wanted in a subdir of the project root; Eclipse did not like that either) and CMake (it seems I have to add all source files myself, and to my eyes it is pretty much a variation on autotools). I have also read about Ant, Maven and others on Wikipedia (I am also allergic to XML; it's a good format for serializing data for applications, not so much for humans - I would prefer YAML). And I have seen tools that seem good but that need to be set up as a web server, which is kind of overkill.
Also, I really need to be able to work offline, without an internet connection!
Right now it seems like the best option is to make a little script that finds all .cxx files, writes a Unity.cxx, and builds that with G++ - which is probably quite fast, but too much of an ugly hack, I guess.
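For what it's worth, that "ugly hack" is only a handful of lines. A sketch in Python of the unity-build idea described above - the directory names, compiler flags and output name are assumptions, and the Ragel (.rl) step and incremental rebuilds are deliberately left out:

    import subprocess
    from pathlib import Path

    SOURCE_DIR = Path("Source")            # assumed source layout
    UNITY_FILE = Path("obj/tmp/Unity.cxx")
    OUTPUT = "myprogram"                   # placeholder binary name

    def write_unity_file():
        # Include every .cxx under Source/ into one big translation unit.
        UNITY_FILE.parent.mkdir(parents=True, exist_ok=True)
        includes = [f'#include "{cxx.resolve()}"'
                    for cxx in sorted(SOURCE_DIR.rglob("*.cxx"))]
        UNITY_FILE.write_text("\n".join(includes) + "\n")

    def build():
        write_unity_file()
        # One compiler invocation for the whole program.
        subprocess.run(["g++", "-O2", "-o", OUTPUT, str(UNITY_FILE)], check=True)

    if __name__ == "__main__":
        build()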
Bonus Points:
Fast builds
Ability to type build test-1 or something and it will build and directly run test-1
Multi-core builds (i.e. faster builds)
Really does not interrupt my train of thought
CMake is great. It's free, cross-platform, and reasonably well documented. It supports "out of source builds", meaning none of the build files are placed in the source directory. That makes source control a bit easier. It can be set up to find new files (globbing). Fast?...It generates make files...after that it's up to your compiler. Multicore...again, more a function of the compiler. I've used CMake on Windows, Linux, and Mac...it just works.
Another that I haven't tried but have read about and plan to test is premake... http://industriousone.com/sample-script
cake from CoffeeScript is quite good, and I'm writing a similar tool using Lua myself.
CMake and premake aren't build/make tools, they are build/make-descriptor generators, which may fit a large number of projects that aren't changing too much - but not a project where rapid prototyping is key.
Right now, I'm doing a project where the browser updates when you hit the save button in your text editor; you do not need to go to the browser and hit F5 (which would cause a small delay while the browser loads everything again, and you would most likely lose the state of the page - say you have a menu open and want to tweak its look; you would be forced to navigate back there again in your RIA).

Create setup for Linux C project

I want to create a setup for my project so that it can be installed on any PC without installing the header files.
How can I do that?
There are two general ways to distribute programs:
Source Distribution (source code to be built). The most common way is to use GNU autotools to generate a configure script so that your project can be installed by doing ./configure && make install
Binary Distribution (prebuilt). Instead of shipping source, you ship binaries. There are a couple of competing standards; the two main ones are the RPM and DEB file formats.
You just changed your question (appreciated, it was kind of vague), so my answer no longer applies...
make sure you have a C compiler
I'd be surprised if you didn't, Linux normally has one
find an editor you are comfortable with
vi and emacs are the classics
write your first program and compile it
learn about makefiles
learn about sub projects and libraries
In many respects, your question is too vague to be answerable. You will need to describe more what you have in mind. All else apart, if you are using an integrated development environment (IDE), then what you do should be coloured strongly by what the IDE encourages you to do. (Fighting your IDE is counter-productive; I've just never found an IDE that doesn't make me want to fight it.)
However, for a typical project on Linux, you will create a directory to hold the materials. For a small project (up to a few thousand lines of code in a few - say 5-20 - files), you might not need any more structure than a single directory. For bigger projects, you will segregate sub-sections of the project into separate sub-directories under the main project directory.
Depending on your build mechanisms, you may have a single makefile at the top of the project hierarchy (or the only directory in the 'hierarchy'). This is in line with the 'Recursive Make Considered Harmful' paper (P. Miller). Alternatively, you can create a separate makefile for each sub-directory and have the top-level makefile simply coordinate builds across directories.
You should also consider which version control system (VCS) you will use.
