Options for using MSVC static libraries with MinGW (reimp was unable to convert them) - visual-c++

I work for a company that recently purchased a piece of hardware accompanied by an SDK. Unfortunately, all the SDK libraries were compiled with a Microsoft Visual C++ compiler (I don't know which one) and cannot be used by MinGW (I develop in Code::Blocks using the MinGW C++ compiler).
I've attempted to convert the libraries using reimp (from the MinGW utilities collection), which has worked in the past with static libraries from other vendors, but in this case the converted libraries result in "undefined reference" errors when linked.
The def files generated by reimp for each library during the conversion process don't look very good (they're filled with lines like ??0nameOfFunction@@QEAA@AEBV0@@Z, while the def files generated during a successful conversion contain lines similar to nameOfFunction@32), so it seems that the vendor's libraries are simply of a type that reimp can't convert.
Are there any other options that would allow me to use these libraries with MinGW? If not, is it reasonable to request that the vendor recompile their libraries with g++ (i.e., is it something they could feasibly do given that the libraries were originally developed using MSVC)?
Any comments or suggestions are appreciated!
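As an aside, those ??0...@@QEAA... entries are MSVC-mangled C++ symbols (constructors, member functions and the like), while nameOfFunction@32 is just a stdcall-decorated C export - which is why reimp copes with the latter kind of library but not with this one. If you have MSVC's undname.exe or LLVM's llvm-undname at hand, you can confirm this by demangling one of the entries (the symbol below is the placeholder from above):

undname "??0nameOfFunction@@QEAA@AEBV0@@Z"

This should print something along the lines of public: __cdecl nameOfFunction::nameOfFunction(class nameOfFunction const &), i.e. a C++ class interface rather than a set of plain C functions.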

Related

create MSVC import library with MinGW tools

We are currently switching the W32 build-process of a cross-platform (linux, osx, w32) project from VisualStudio to mingw.
One of the problems we are facing is that our project creates a dynamic library (foo.dll), against which 3rd-party projects can link. For this to work on W32/MSVC, an import library is required (foo.lib).
Now, following the documentation it is pretty easy to create a .def file which holds all the information required for importing the library:
gcc -shared -o foo.dll foo-*.o -Wl,--output-def,foo.def
In order to use the foo.def file, the docs tell me to use the Microsoft LIB tool to build a foo.lib from it:
lib /machine:i386 /def:testdll.def
This obviously requires me to have (a subset of) MSVC installed on the build computer.
However, we'd like to cross-compile the entire thing on our linux systems (probably even on some CI), which makes the installation of MSVC rather tedious.
So I wonder, whether there's a native MinGW way to convert the foo.def file into a foo.lib import library?
(We are aware that in the end only MSVC users will require the import library and that they will have the lib tool ready at hand. However, since we've always shipped the foo.lib file, switching to foo.def would break 3rd parties' build systems - something we would like to avoid.)
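For reference, the foo.def that --output-def emits is a plain-text module-definition file along these lines (the export names below are placeholders, not taken from the real project):

LIBRARY foo.dll
EXPORTS
    foo_init
    foo_process
    foo_shutdown

This is the file that the Microsoft LIB tool - or one of the alternatives suggested below - turns into the foo.lib import library.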
To produce an import library that is similar to the one generated by Microsoft's link.exe, you can use llvm-dlltool (part of the LLVM compiler project):
llvm-dlltool -m i386:x86-64 -d foo.def -l foo.lib
Replace i386:x86-64 with i386 if you would like to create a 32-bit library. For more details see this answer to How to generate an import library (LIB-file) from a DLL?.
Note that some MinGW projects generate a .dll.a file (as produced by binutils dlltool). While this could be renamed to .lib and function as an import library, I found that it would result in broken binaries if an MSVC project links to multiple .dll.a libraries. So stick to llvm-dlltool instead for improved compatibility.
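As a quick sanity check of the result, an MSVC consumer builds against the generated import library in the usual way (consumer.c stands in for the 3rd-party code):

cl consumer.c foo.lib

and the resulting consumer.exe then loads foo.dll at runtime.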
we'd like to cross-compile the entire thing on our linux systems
I'm not aware of any MS LIB clone portable to Linux, however POLIB from Pelles C distribution is free, small, self-contained and compatible with MS tool. It has no dependencies other than kernel32.dll, so, I believe, it will run under Wine too.

Use C++ DLLs from the same VS compiled at different times/teams - ABI compatibility?

To repeat: I'm looking for ABI compatibility between libraries of the same Visual-C++ version!
We want to mix and match some internal C++ DLLs from different teams - built at different times with different project files. Because of long build times, we specifically want to avoid large monolithic builds where each team re-compiles the source code of another team's library.
When consuming C++ DLLs with C++ interfaces, it is rather clear that you can only do this if all DLLs are compiled with the same compiler / Visual Studio version.
What is not readily apparent to me is what, exactly, needs to be the same to get ABI compatibility.
Obviously debug (_DEBUG) and release (NDEBUG) cannot be mixed -- but that's also apparent from the fact that these link to different versions of the shared runtime.
Do you need the exact same compiler version, or is it sufficient that the resulting DLL links to the same shared C++ runtime -- that is, basically to the same redistributable? (I think static doesn't fly when passing full C++ objects around)
Is there a documented list of compiler (and linker) options that need to be the same for two C++ DLLs of the same vc++ version to be compatible?
For example, is the same /O switch necessary - does the optimization level affect ABI compatibility? (I'm pretty sure not.)
Or do both versions have to use the same /EH switch?
Or /volatile:ms|iso ... ?
Essentially, I'd like to come up with a set of (meta-)data to associate with a Visual-C++ DLL that describes its ABI compatibility.
If differences exist, my focus is on VS2015 only at the moment.
I've been thinking this through over the last few days, and what I did was look for existing use cases where devs have already needed to categorize their C++ builds to make sure the binaries are compatible.
One such place is the native packages from NuGet. So I looked at one package there, specifically the cpprestsdk:
The binaries in the downloadable package are split like this:
native\v120\windesktop\msvcstl\dyn\rt-dyn\x64\Release\
Reading the path components: v120 is the VS version; windesktop distinguishes desktop builds (as opposed to WinXP or WinApp/WinRT); msvcstl I'm not sure about; dyn means the lib itself is dynamic (as opposed to static); rt-dyn means it uses the cpp-runtime dynamically; then platform and configuration.
I pulled this out from this example, because I couldn't find any other docs. I also know that the boost binaries build directory is separated in a similar way.
So, to get to a list of meta data to identify the ABI compatibility, I can preliminarily list the following:
VC version (that is, the version of the C and CPP runtime libraries used)
one point here is that e.g. vc140 should be enough nowadays - given how the CRT is linked in, all possible bugfixes to the versioned CRT components must be ABI compatible anyway, so it shouldn't matter which version a given precompiled library was built with.
pure native | managed (/CLI) | WinRT
how the CRT is consumed (statically / dynamically)
bitness / platform (Win32, x64, ARM, etc.)
Release or Debug version (i.e. which version of the CRT we link to)
plus: _ITERATOR_DEBUG_LEVEL ... if everyone goes with the defaults, fine, if a project does not, it must declare so
Additionally my best guess as to the following items:
/O must not matter - we constantly mix&match binaries with different optimization settings - specifically, this is even working for object files within the same binary
/volatile - since this is a code-gen thing, I have a hard time imagining how this could break an ABI
/EH - except for the option to disable all exceptions, in which case you obviously can't call anything that throws, I'm pretty confident this is safe from an ABI perspective: there are possible pitfalls here, but I don't think they can really be categorized as ABI compat issues. (Maybe some complex callback chains could be said to be ABI incompatible, not sure.)
Others:
Default calling convention (/G..) : I think this would break at link time, when mangled export symbols and header declarations don't match up.
/Zc:wchar_t - will break at link time (it's actually ABI compatible, but the symbols won't match).
Enable RTTI (/GR) - not too sure about this one - I've never worked with this disabled.
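One way to make a list like the one above enforceable rather than merely documented is sketched below, using MSVC's #pragma detect_mismatch - the same mechanism the CRT headers use for _ITERATOR_DEBUG_LEVEL and RuntimeLibrary. The key/value names are invented for illustration:

// abi_tag.h - included by every translation unit of every participating DLL.
// The linker refuses to combine objects whose values for the same key differ.
#pragma once

#if defined(_DEBUG)
#pragma detect_mismatch("team_abi_config", "Debug")
#else
#pragma detect_mismatch("team_abi_config", "Release")
#endif

#if defined(_DLL)   // defined for /MD and /MDd, i.e. the dynamic CRT
#pragma detect_mismatch("team_abi_crt", "dynamic")
#else
#pragma detect_mismatch("team_abi_crt", "static")
#endif

With this in place, a mismatch shows up as a hard linker error (LNK2038) instead of a subtle runtime problem.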

Why are C/C++ 'obj' files valid only for a specific compiler?

After much struggling with attempting to link a C++ project to a 3rd-party static library (.lib), I think I have solved the issue by verifying that I compile my project using the same compiler with which the .lib file was compiled (MSVC11).
Until this point, I assumed all .obj files were equivalent, thus I could take a .lib file (containing various .objs), and use that with any other project I might want to develop in the future. However, this was an incorrect assumption.
So I'm curious as to why (in the context of MSVC) .obj files differ from one version of the compiler to the next. Assuming you're targeting an x86 application, shouldn't the obj files be comprised of the same types of instructions regardless of whether or not you compiled using MSVC11/12/14?
TLDR; Why can't I link a project to an .obj that was created using a different MSVC compiler?
That's because it could be linked to another version of Visual C++ runtime libraries, which is incompatible with the version you are using.
This problem can occur even with DLLs if you try to expose C++ objects from them.
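If you want to see what a given .lib actually expects, MSVC's dumpbin can list the linker directives embedded in its object files (the library name below is a placeholder):

dumpbin /directives thirdparty.lib

The /DEFAULTLIB entries show which CRT flavour it was built against, and with newer compilers /FAILIFMISMATCH entries record things like _MSC_VER and _ITERATOR_DEBUG_LEVEL - which is where mismatch errors such as LNK2038 come from.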

arm-linux-gnueabi toolchain vs arm-linux-androideabi toolchain.

Can I compile files (e.g. C or C++ source code) for my Android device using the arm-linux-gnueabi-* toolchain?
My question might seem a bit silly, but will I get the same result as compiling with the arm-linux-androideabi-* toolchain?
Compilation can mean more than just converting source code to binary. A compiler like GCC also provides certain libraries, in this case libgcc, for handling what the hardware can't handle. When a compiler becomes a toolchain, it also provides the runtime libraries standardised by the programming language, similar to the ones provided on the target system. In arm-linux-gnueabi-'s case that might be libc, and for arm-linux-androideabi- that's Bionic.
You can produce compatible object files to be used by different compilers; that's what ELF is for.
You can produce static executables, which can be mighty in size, and they should work on any matching hardware/kernel, because in that case the toolchains aim for that.
But if you produce dynamic executables, those can only run on systems that support their dependencies. Because of that, a simple "hello world" application that isn't statically built by arm-linux-gnueabi- won't work on an Android system, since Android provides Bionic, not libc.

LLVM and visual studio .obj binary incompatibility

Does anyone know if LLVM binary compatibility is planned for Visual Studio compiled .obj and static .lib files?
Right now I can only link LLVM-made .obj files with dynamic libs that load a DLL at runtime (compiled with Visual Studio).
While there is probably only a very small chance that binary compatibility will happen between the two compilers, does anybody know why it is so difficult to achieve this between compilers for one platform?
As Neil already said, the compatibility includes stuff like calling convention, name mangling, etc. Though these two are the smallest possible problems. LLVM already knows about all windows-specific calling conventions (stdcall, fastcall, thiscall), this is why you can call stuff from .dll's.
If we speak about C++ code then the main problem is the C++ ABI: vtable layout, RTTI implementation, etc. clang follows the Itanium C++ ABI (which gcc uses, for example, among others), VC++ doesn't, and all of this is undocumented, unfortunately. There is some work going on in clang in this direction, so things might gradually start to work. Note that most probably some parts will never be covered, e.g. SEH-based exception handling on win32, because it's patented.
Linking with pure C code has worked for ages, so you might work around these C++ ABI-related issues via C stubs / wrappers.
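A minimal sketch of such a wrapper, assuming a hypothetical C++ class Widget living inside the MSVC-built DLL (all names are invented for illustration, and export/import decoration is simplified):

/* widget_c.h - flat C interface, usable from MSVC, clang or MinGW alike */
#ifdef __cplusplus
extern "C" {
#endif

typedef struct widget_t widget_t;   /* opaque handle */

__declspec(dllexport) widget_t *widget_create(int size);
__declspec(dllexport) int       widget_value(const widget_t *w);
__declspec(dllexport) void      widget_destroy(widget_t *w);

#ifdef __cplusplus
}
#endif

// widget_c.cpp - compiled with MSVC inside the DLL, forwards to the real class
#include "widget_c.h"
#include "Widget.hpp"

extern "C" widget_t *widget_create(int size)          { return reinterpret_cast<widget_t *>(new Widget(size)); }
extern "C" int       widget_value(const widget_t *w)  { return reinterpret_cast<const Widget *>(w)->value(); }
extern "C" void      widget_destroy(widget_t *w)      { delete reinterpret_cast<Widget *>(w); }

Only C types cross the DLL boundary and all construction/destruction stays inside the DLL, so the MSVC C++ ABI never leaks to the other compiler.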
Apart from anything else, such as calling conventions, register usage etc., for C++ code to be binary compatible the two compilers must use the same name-mangling scheme. These schemes are proprietary (so MS does not release the details of its scheme) and are in any case in a constant state of flux.
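As a small illustration of how far apart the schemes are (the declaration is arbitrary):

int foo(int);   // MSVC mangles this to ?foo@@YAHH@Z, the Itanium ABI used by gcc/clang to _Z3fooi

which is also why extern "C" interfaces remain the usual meeting point between the two compilers.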
