VC++ 2005 project option to include STL?

I'm working on a cross platform project that uses STL. The other compiler includes STL support by default, but in VS2005 I need to add the following before the class definitions that use STL items:
#include <cstdlib>
using namespace std;
Is there a VS2005 option that would set this automatically? It's just a bit tedious to work around, and I'm trying to avoid lots of #ifdefs in the source.
EDIT: The other compiler is the IAR workbench for the ARM 926x family. Perhaps I should get them to explicitly do the includes?
Also, is "std::map<>" preferred over "using namespace std; map<>"?

All compilers should require you to include those lines. If they don't, then they're just encouraging you to write non-portable code because you're relying on certain headers to be included automatically and you're relying on certain names to be in scope implicitly.
I don't mean to say that those two lines should always be required, though. I only mean that if the rest of your code is written to use things declared in the cstdlib header and in the std namespace, then those two lines need to appear first, and the compiler shouldn't act as though they are there when they really aren't.
Check whether your other compiler has some settings to disable this implicit code. If it doesn't, then it's probably a very, very old compiler, and you should consider not using it and not supporting it anymore.

Try referring to STL components by their namespace-qualified name (i.e. std::vector).
Doing a global 'using namespace std' is usually a bad idea.
Or maybe I'm not understanding the question.

The IAR compiler does not support the std namespace (I'm not sure why, because it does support namespaces in general if I remember right).
If you look in the runtime headers for IAR you'll see that they do some macro gymnastics to work around that (the runtime is licensed from Dinkumware, who provide runtimes for quite a few compilers).
You may need to do something similar if you want your stuff to work in multiple environments. A possible cleaner alternative is to just include the "using namespace std;" directive. I might be wrong, but I think the IAR compiler essentially ignored it (it didn't mind that you were using a namespace it didn't know about). A lot of people will think that's ugly, but sometimes you gotta do what the compiler you have wants you to do.
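For example, a minimal sketch of that approach (the header name is made up): collect the directive into one small shared header that every cross-platform source file includes, so it lives in exactly one place.
// stl_compat.h -- illustrative sketch only
#include <cstdlib>
#include <map>        // plus whatever other STL headers the shared code needs
using namespace std;  // needed for VS2005; IAR reportedly tolerates/ignores it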

In general you should avoid "using namespace X", especially in header files (because everyone who includes your header gets that namespace too whether they want it or not), and especially for namespace std (because it's so big and the potential for name collisions is big).
Instead, in header files refer to names by their fully qualified form, e.g.:
// for plain functions (note that std::map takes both a key and a value type)
void foo(std::map<int, int> intMap);

// for classes
class person {
    std::string name_;
public:
    person(std::string name);
    // ...
};
Then, in code files, you can do "using", but prefer using specific items in the namespace rather than pulling in the entire namespace. e.g.:
using std::map;
using std::string;

void foo(map<int, int> intMap) { /* ... */ }
person::person(string name) : name_(name) { /* ... */ }
etc. This way you avoid impacting others who include your headers, and you avoid pulling in potentially zillions of names that might collide with other code.

Related

How do I get haxe to generate externs?

I am writing Haxe code which I want to compile to an arbitrary target as a module and then use the results from another module compiled for this same target. I don’t want to handle this the “Haxe way” (which is to fully inline all libraries at compile time). Instead I want to be able to write distinct Haxe modules and reference them with full type safety without inlining between the modules. The natural way to do this would be to have both source Haxe files and a separate directory of “headers” filled with externs describing the public API of my module, with these externs somehow automatically generated so that they don’t need to be manually maintained.
I cannot figure out how to get Haxe to emit externs. It would make sense to me if haxe-externs were an actual “target platform” so that I could do something like:
$ haxe ClassName -hxe externsoutdir
It would make less sense but still be acceptable if one of the -D flags like -D dump (which seems to sort of get one part of the way there) or some imaginary, nonexistent -D dump-externs existed. Then you could generate externs while compiling to your favorite target:
$ haxe ClassName -js outfile.js -D shallow-expose -D dump-externs=externsoutdir
The idea is to take a class definition like this:
@:expose
class ClassName {
function quack() {
trace('quack');
}
}
and emit something like this in a separate directory:
extern class ClassName {
function quack():Void;
}
so that I can consume it from another module like this:
@:expose
class MyClassName extends ClassName {
override function quack() {
super.quack();
trace('…and again I say “quack”');
}
}
$ haxe -cp path\to\externsoutdir MyClassName -js outfile.js -D shallow-expose
It would only make sense to generate externs for things annotated with @:expose or some other metadata.
I will figure out how to wrap the emitted modules to load each other correctly. That’s easy. The hard part is generating the extern definitions—shouldn’t Haxe already have a way to do this?
Is there already some tool or built-in way I’m missing to do this? When Googling, all I see are projects that supposedly help with generating externs for existing JavaScript libraries. But that’s not my use case…
Update: --gen-hx-classes was removed sometime around Haxe 4.0.0-rc3. Apparently the functionality still exists secretly as -D gen-hx-classes, but beware: if you rely on this, it seems like it's going away.
I believe the --gen-hx-classes option might be what you're looking for. Oddly, I don't see it in the compiler flags list.
I use it in a modular JavaScript build system that is similar to what you're talking about.
I believe it creates a directory of .hx files that are externs for every class generated by the build (including those from the Haxe standard library). Actually, getting duplicates of the classes in the standard library may be a problem you will face.
You may also need to use @:keep (or the related macro) to ensure dead code elimination doesn't remove things the other build will need.
You might also need to exclude a class from one or the other build, e.g. --macro 'exclude("haxe.io.Input")' (or excludeFile, which is actually more performant for a whole list of exclusions).

How to prevent dead-code removal of utility libraries in Haxe?

I've been tasked with creating conformance tests of user input; the task is fairly tricky and we need very high levels of reliability. The server runs on PHP, the client runs on JS, and I thought Haxe might reduce the duplicated work.
However, I'm having trouble with dead-code removal. Since I am just creating helper functions (utilObject.isMeaningOfLife(42)), I don't have a main program that calls each one. I tried adding @:keep to a utility class, but it was cut out anyway.
I tried to specify that utility class through the -main switch, but I had to add a dummy main() method and this doesn't scale beyond that single class.
You can force all the files defined in a given package and its sub-packages to be included in the build using a compiler argument:
haxe --macro include('my.package') ...
This is a shortcut to the macro.Compiler.include function.
As you can see, the signature of this function lets you include recursively and also exclude packages:
static include(pack:String, rec:Bool = true, ?ignore:Array<String>, ?classPaths:Array<String>):Void
I don't think you have to use @:keep for each library class in that case.
I'm not sure if this is what you are looking for; I hope it helps.
Otherwise, these checks could be helpful:
Is it actually bad that the code is cut away if you don't use it?
Could it be that some code is inlined in the final output?
Compile your code using the compiler flag -dce std, as mentioned in the comments.
If you are using the static analyzer, try turning it off.
Add @:keep and reference the class and function somewhere.
Otherwise, provide a minimal setup if you can reproduce the issue.

fatal error LNK1179: invalid or corrupt file: duplicate COMDAT '_IID_IXMLDOMImplementation'

I am getting this single error when I am linking my project,
COMMUNICATION.obj : fatal error LNK1179: invalid or corrupt file:
duplicate COMDAT '_IID_IXMLDOMImplementation'
What is the source of the problem?
This is a tricky one.
The issue is that the generated symbol(s) are too long, and an ambiguity exists:
//...
void MyVeryLongFunctionNameUnique_0(void);
void MyVeryLongFunctionNameUnique_1(void);
// ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
// (example max-symbol-length-seen-by-linker)
In this case, the linker "sees" these two functions as the "same", because the part that makes them "unique" is longer-than-the-max-symbol length.
This can happen in at least three cases:
Your symbol names are "too-long" to be considered unique to the linker, but may have been fine for the compiler (such as when you expand-out from many nested templates)
You did some "trickery" that is invalid C++, and it passed the compiler, but you now have an invalid *.obj, and it chokes the linker.
You specified duplicate "unnamed" classes/structs, and the linker cannot resolve them.
===[UPDATE]===, It's not your fault, it's an internal problem with the compiler and/or linker (see below for possible work-arounds).
Depending on the issue (above), you can "increase" your symbol-length (by limiting-your-decrease-of-symbol-length), or fix your code to make it valid (unambiguous) C++.
This error is (minimally) described by Microsoft at:
http://msdn.microsoft.com/en-us/library/cddbs9aw(v=vs.90).aspx
NOTE: This max-symbol-length can be set with the /H option, see: http://msdn.microsoft.com/en-us/library/bc2y4ddf(v=vs.90).aspx
RECOMMEND: Check to see if /H is used on your command-line. If it is, delete it (do not specify max-symbol-length, it will default to 2,047, the /H can only DECREASE this length, not increase it).
However, you probably triggered it through the /Gy option (function-level-linking), which was probably implied through one of /Z7, /Zi, or /ZI: http://msdn.microsoft.com/en-us/library/958x11bc(v=vs.90).aspx
One MSDN thread that talks about this issue is:
http://social.msdn.microsoft.com/Forums/en-US/vcmfcatl/thread/57e3207e-9fab-4b83-b264-79a8a717a8a7
This thread suggests that it's possible to trigger this issue with "invalid-C++-code-that-compiles" (you get your *.obj), but that invalid-*.obj chokes the linker (this example attempts to use main as both a function and as a template):
http://www.cplusplus.com/forum/lounge/46361/
===[UPDATE]===
I should have said this before, because I suspected it, but I now have more information: it might not be your fault; there seems to be an issue in the compiler and/or linker that triggers this error. This is despite the fact that the only common denominator in all your failed relationships is you.
Recall that the "above-list" applies (it MIGHT be your fault). However, in the case where, "it's not your fault", here's the current-running-list (I'm confident this list is NOT complete).
There is an internal error/corruption in your *.ilk file (intermediate-link-file). Delete it and rebuild.
You have /INCREMENTAL turned on for linking, but somehow incremental linking is not working for your project, so you should turn it off and rebuild (Project Properties => Configuration Properties => Linker => General => Enable Incremental Linking, set to "No" (/INCREMENTAL:NO)).
There's a problem with "Optimization" for "COMDAT Folding" in your use. You can "Remove Redundant COMDATs" by going to Project Properties => Configuration Properties => Linker => Optimization => Enable COMDAT Folding, and setting it to "Remove Redundant COMDATs" (/OPT:ICF).
Here's an interesting thread from a guy who sometimes can link, and sometimes not, by commenting in/out a couple lines of code. It's not the code that is the problem -- he just cannot link consistently, and it looks like the compiler and/or linker has an internal problem under some obscure use case:
http://www.pcreview.co.uk/forums/fatal-error-lnk1179-t1430593.html
Other observations from a non-trivial web search:
this problem appears to be non-rare
it seems to be related to some form of template<> use
others seem to see this problem with the "Release" build when the "Debug" build did not have it (but it is also seen on "Debug" builds in many cases)
if the link "fails" on one machine, it may "succeed" on another build machine (not sure why, a "clean-build" appears to have no effect)
if you comment in/out a particularly significant couple-lines-of-code, and finish your build, and keep doing this until all the code is un-commented again, your link may succeed (this appears to be repeatable)
if you get this error with MSVC2008, and you port your code to MSVC2010, you will still get this error
===[PETITION TO THE GOOD PEOPLE OF THE WORLD]===
If you have other observations on this error, please list them below (as other answers, or as comments below this answer). I have a similar problem, and it's not my fault, and none of these work-arounds worked for me (although they did appear to work for others in their projects in some cases).
I'm adding a bounty because this is driving me nuts.
===[UPDATE+2]===
(sigh) Here are more things to try (which apparently work for others, but did not work for me):
this guy changed his compile settings, and it worked (from thread at http://forums.codeguru.com/showthread.php?249603.html):
Project->Settings->C++ tab, Debug category: Inline function expansion: change from 'None' to 'Only __inline'.
the above thread references another thread where they had to re-install MSVC
it is possibly related to linking modules with "subtle differences" in possibly-incompatible compiler and/or linker switches; check that all the "contributing libs" are built with exactly the same switches
Here's some more symptoms/observations on this error/bug:
list(s) for above issues still apply
the issue seems to "start-showing-up" with MSVC2005, and continues with the same behavior for MSVC2008 and MSVC2010 (error still occurs after porting code to newer compilers)
restarting IDE, rebooting machine doesn't seem to work for anybody
one guy said an explicit "clean" followed by a recompile worked for him, but many others say it did not work for them
is often related to "incremental linking" (e.g., turn it off)
Status: No joy.
===[UPDATE+3 : LINK SUCCESS]===
Super-wacky-makes-no-sense fix to successfully link discovered!
This is a variation on (above), where you "fiddle-with-the-code-until-the-compiler-and/or-linker-behaves". NOT GOOD that one might need to do this.
Specific single linker-error (LNK1179) was for MyMainBody<>():
#include "MyClassA.hpp"
#include "MyClassB.hpp"
#include "MyClassC.hpp"
#include "MyClassD.hpp"
#include "MyMainBody.hpp"
int main(int argc, char* argv[])
{
    // Use a function template for the "main-body",
    // implementation is "mostly-simple", instantiates
    // some local "MyClass" instances, they reference
    // each other, and do some initialization,
    // (~50 lines of code)
    //
    // !!! LNK1179 for `MyMainBody<>()`, mangled name is ~236 chars
    //
    return MyMainBody<MyClassA,MyClassB,MyClassC,MyClassD>(argc,argv);
}
THE FIX:
Convert MyMainBody<>() from a "template<>" to an explicit function, LINK SUCCESS.
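As a rough sketch of what that conversion looks like (the names follow the example above, and the _ABCD suffix is made up; the body is whatever the template version contained):
// Explicit, non-template replacement -- the mangled name stays short,
// at the cost of duplicating this setup for each utility.
int MyMainBody_ABCD(int argc, char* argv[])
{
    // ... the same ~50 lines of instantiation/setup that
    // MyMainBody<MyClassA,MyClassB,MyClassC,MyClassD>() performed ...
    return 0;
}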
THIS FIX SUX, as I need the EXACT-SAME-CODE for other types in other utilities, and the MyMainBody<>() implementation is non-trivial (but mostly simple) instantiations-and-setups that must be done in a specific way, in a specific order.
But hey, it's a temporary work-around for now: Confirmed on MSVC2008 and MSVC2010 compilers (same LNK1179 error for each, successful link on each after applying the work-around).
THIS IS A COMPILER AND/OR LINKER ERROR, as the code is "simple/proper-C++" (not even C++11).
So, I'm happy (that I got a link after suffering full-time for 2+weeks). But, disappointed (that the compiler and/or linker has a STUPID GLARING PROBLEM with linking a SIMPLE TEMPLATE<> in this use-case that I couldn't figure out how to address).
FURTHER, the "Bounty Ended", but nobody else wanted to take this on (no other answers?), so looks like "+100" goes to nobody. (heavy-sigh)
This question has a lot of answers but none of them quite capture what was happening in my codebase, and what I suspect the OP was seeing back in 2012 when this question was asked.
The Problem
The COMDAT error on an IID_* type is easy to accidentally reproduce by using the #import directive with both the rename_namespace and named_guids attributes.
If two #imported type libraries contain the same interface, as is likely the case for OP's IXMLDOMImplementation, then the generated .tlh files will declare IID_IXMLDOMImplementation in both namespaces, leading to the duplicate.
For example, the code generated for:
#import <foo.tlb> rename_namespace("FOO") named_guids;
#import <bar.tlb> rename_namespace("BAR") named_guids;
...could be simplified into something like this:
namespace FOO {
    extern "C" __declspec(selectany) const GUID IID_IFOOBAR = {0};
}
namespace BAR {
    extern "C" __declspec(selectany) const GUID IID_IFOOBAR = {0};
}
Here's a simple RexTester reproduction of the problem: https://rextester.com/OLAC10112
The named_guids attribute causes the IID_* to be generated and the rename_namespace attribute wraps it in the namespace.
Unfortunately, in this case, extern "C" does not seem to work as expected when it appears inside a C++ namespace. This causes the compiler to generate multiple definitions for IID_IFOOBAR in the same .obj file.
DUMPBIN /SYMBOLS or a hex editor confirms the duplicate symbols.
The linker sees these multiple definitions and issues a duplicate COMDAT diagnostic.
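If you want to confirm this yourself, something along these lines (run from a Visual Studio command prompt; the object name is the one from the error message above) should list the symbol more than once:
dumpbin /SYMBOLS COMMUNICATION.obj | findstr IID_IXMLDOMImplementation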
A Solution
Knowing that rename_namespace doesn't play well with named_guids, the obvious solution is to simply not use them together. It's probably easiest to remove the named_guids attribute and instead use the __uuidof() operator.
After removing named_guids from the #import directives and touching up the code, replacing all uses of FOO::IID_IFooBar with __uuidof(FOO::IFooBar), my COM-heavy codebase is back to building again.
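For instance, a hedged sketch of the change, reusing the hypothetical FOO::IFooBar interface from the simplified example above (the helper function is just an illustration, not part of the original code):
// pUnknown is any IUnknown* you already hold on the object.
HRESULT GetFooBar(IUnknown* pUnknown, FOO::IFooBar** ppFooBar)
{
    // Previously: pUnknown->QueryInterface(FOO::IID_IFooBar, ...) via named_guids.
    // __uuidof() reads the IID from the interface's __declspec(uuid(...)) instead.
    return pUnknown->QueryInterface(__uuidof(FOO::IFooBar),
                                    reinterpret_cast<void**>(ppFooBar));
}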
This issue was also reported as a bug in some specific versions of Visual Studio 2017; updating to 15.9.1 or later should fix it.
Reported issue in VS 15.8 Preview 4
Resolved (patched) in VS 15.9 Preview 2
I encountered this problem whilst porting some code (1) from MSVC to GCC. To get the build to link on GCC, I had to provide empty implementations for some specialised templated functions (2), and this resulted in LNK1179 on MSVC. I was able to resolve by inlining the functions (3), i.e.
// (1) the original declaration, which was fine under MSVC:
template<> template<> void LongName1<LongName2>::FunctionName(boost::library::type1 & a, const unsigned int b);
// (2) the empty implementation added so GCC would link -- this triggered LNK1179 under MSVC:
template<> template<> void LongName1<LongName2>::FunctionName(boost::library::type1 & a, const unsigned int b) {};
// (3) the inlined empty implementation -- links under both:
template<> template<> inline void LongName1<LongName2>::FunctionName(boost::library::type1 & a, const unsigned int b) {};
I had to do
C/C++ -> Code Generation -> Enable Function-Level Linking -> No (/Gy-)
Hopefully my lame workaround will help someone: I make sure to manually delete ALL .obj AND intermediate build files (including at least .pch, .pdb, .tlog, .lastbuildstate and anything else just hanging out looking suspicious) and rebuild from scratch.
I suggest without evidence that having some files left over from a previous build tends to cause the problem to happen more frequently. In my specific build system, I delete and recreate the .vcxproj and .sln files from scratch as well.
My own personal suspicion is that some kind of race condition exists in the build/link process between the time that intermediate files are read and the time they are written in a large project. Again, I have no evidence this is true, but this is my only guess that seems to fit all the known facts of the bug.
I wrote Outlook addins years ago and I was asked to write another. Right off the bat, I ran into this problem and through a little process of elimination, I fixed mine.
It turns out that when you choose an extensibility project (I hand-coded mine back in the day), it creates and saves two objects that I was unaware of: DTE and DTE80. To create the interfaces that manipulate these objects, they are imported directly from the DLLs in stdafx.h. Since I'm working on Outlook, I also needed to import a couple of interfaces: Office and Outlook.
So, seeing as this error popped up almost immediately after writing my first tidbits of code, I started over, and added one thing at a time. The project blew chunks in the described way right after I added:
//Added mvc
//The following #import imports MSO based on its LIBID
#import "libid:2DF8D04C-5BFA-101B-BDE5-00AA0044DE52" version("2.2") lcid("0") rename_namespace("Office") raw_interfaces_only named_guids
using namespace Office;
//The following #import imports Outlook's Object lib based on its LIBID
#import "libid:00062FFF-0000-0000-C000-000000000046" rename_namespace("Outlook") raw_interfaces_only named_guids
using namespace Outlook;
So, seeing as I had no intention of figuring out the DTE stuff, I just commented them out, along with anything having to do with them:
//The following #import imports VS Command Bars based on its LIBID
// #import "libid:1CBA492E-7263-47BB-87FE-639000619B15" version("8.0") lcid("0") raw_interfaces_only named_guids
//The following #import imports DTE based on its LIBID
// #import "libid:80cc9f66-e7d8-4ddd-85b6-d9e6cd0e93e2" version("8.0") lcid("0") raw_interfaces_only named_guids
//The following #import imports DTE80 based on its LIBID
// #import "libid:1A31287A-4D7D-413e-8E32-3B374931BD89" version("8.0") lcid("0") raw_interfaces_only named_guids
After wandering around fixing the compile errors, it compiled and linked just fine. I'm not suggesting this will work for everybody, but it worked for me. Good luck to any who pass by here....
I got this error and was really confused about it. I ended up commenting out everything in the referenced .cpp and reintroducing things in small batches until the file was back in the same state as when I started, and now I don't get the error anymore. To me this points to it being a compiler bug in my case, but since I can't reproduce it anymore I can't take it any further than that.
I'm on:
Microsoft Visual Studio Professional 2019
Version 16.11.3

Which is efficient to use #pragma once or #ifndef #endif?

To avoid multiple inclusion of a header file, one of my friends suggested the following approach:
#ifndef _INTERFACEMESSAGE_HPP
#define _INTERFACEMESSAGE_HPP

class CInterfaceMessage
{
    // Declaration of class goes here,
    // i.e. declaration of member variables and methods
private:
    int m_nCount;
    CString m_cStrMessage;
public:
    CString foo(int);
};

#endif
where _INTERFACEMESSAGE_HPP is just an identifier.
But when I declare a class using the Visual Studio 2005 IDE, I get the statement
#pragma once
at the start of the class definition.
When I looked up the purpose of #pragma once on MSDN, it gave me the following explanation:
"Specifies that the file will be included (opened) only once by the compiler when compiling a source code file."
Can someone please tell me which is the right approach? If both are correct, what is the difference, and is one approach better than the other?
gcc treats #pragma once as deprecated. You should use the standard include guards. All pragma directives are by definition implementation-defined, so if you want portability, don't use them.
Pragmas are compiler-specific, so I'd use #ifndef.
Preprocessor directives are resolved during (actually, before) compilation, so they make no difference at runtime; the only possible difference is in compile time.
However, you will probably never notice a compile-time difference between these two alternatives unless you use them many thousands of times.
The first approach is the generic approach that works with all compilers and is also the older one around. The #pragma once approach is compiler specific.
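For what it's worth, a common pattern is to use both in the same header, so compilers that optimize on #pragma once benefit while everything else falls back to the guard. A hedged sketch (the guard name is just an example; note it drops the leading underscore, since identifiers starting with an underscore and a capital letter are reserved):
// InterfaceMessage.hpp
#pragma once                       // fast path on compilers that support it
#ifndef INTERFACEMESSAGE_HPP       // portable include guard as a fallback
#define INTERFACEMESSAGE_HPP

class CInterfaceMessage { /* ... */ };

#endif // INTERFACEMESSAGE_HPP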

Why are there so many libraries in MSVC and why do I have to recompile the code again

In every platform there are various versions of a given library: multi-threaded, debug, dynamic, etc..
Correct me if I am wrong here, but in Linux an object can link to any version of a library just fine, regardless of how its compiled. For example, there is no need to use any special flags at compile time to specify whether the link will eventually be to a dynamic or a static version of the run-time libraries (clarification: I am not talking about creating dynamic/static libraries, I am talking about linking to them - so -fPIC doesn't apply). Same goes for debug or optimized version of libraries.
Why, in MSVC (and on Windows in general with other compilers, true?), do I need to recompile the code each time in order to link against different versions of the libraries? I am talking about the /MD, /MT, /MTd, /MDd, etc. flags. Is the code actually using different system headers each time? If so, why?
I would really appreciate any pointers to solid documentation that discusses these library matters in Windows for a C/C++ programmer..
thanks!
The compiler setting does very little other than simply change some macro definitions. It's Microsoft's C runtime header files that change their behaviour based on the runtime selected.
First, the header files use a #pragma directive to embed in the object file a directive specifying which .lib file to link, choosing one of msvcrt.lib, msvcrtd.lib, libcmt.lib and libcmtd.lib.
The directives look like this
// one of four such blocks -- illustrative, the real headers use internal macros
#if defined(_DLL) && !defined(_DEBUG)   // i.e. the release DLL runtime, /MD
#pragma comment(lib, "msvcrt.lib")
#endif
Next, they also modify a macro definition used on all C runtime functions that adds the __declspec(dllimport) directive if a DLL runtime was selected. The effect of this directive is to change the imported symbol from, say, '_strcmp' to '__imp__strcmp'.
The dll import libraries (msvcrt.lib and msvcrtd.lib) export their symbols (to the linker) as __imp_<function name>, which means that, in the Visual C++ world, once you have compiled code to link against the dll runtimes you cannot change your mind - they will NOT link against a static runtime.
Of course, the reverse is not the case - dll import libraries actually export their public symbols both ways: with and without the __imp_ prefix.
This means that code built against a static runtime CAN later be coerced into linking with either the DLL or the static runtime.
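As a rough sketch of that header-side mechanism (the macro name _CRTIMP matches what older CRT headers use, but treat the exact spelling and condition as an assumption rather than the literal CRT source):
#ifdef _DLL                                // defined by /MD and /MDd
    #define _CRTIMP __declspec(dllimport)  // call resolves through __imp__strcmp in msvcrt(d).lib
#else                                      // /MT and /MTd
    #define _CRTIMP                        // plain _strcmp pulled from libcmt(d).lib
#endif

_CRTIMP int __cdecl strcmp(const char* lhs, const char* rhs);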
If you are building a static library for other consumers, you should ensure that your compiler settings include:
One of the static library settings, so that consumers of your .lib can choose themselves which c-runtime to use, and
Set the 'Omit Default Library Name' (/Zl) flag. This tells the compiler to ignore the #pragma comment(lib, ...) directives, so the .obj files and the resulting .lib do NOT have any kind of implicit runtime dependency. If you don't do this, users of your lib who choose a different runtime setting will see confusing messages about duplicate symbols in libc.lib and msvcrt.lib, which they will have to bypass by using the ignore-default-libraries flag.
Using these compiler options has two effects. They automatically #define a macro that may be used by header files (and your own code) to do different things. This affects only a small part of the C runtime, and you can check the headers to see if it's happening in your case.
The other thing is that the C++ compiler embeds a comment in your object file that tells the linker to automatically include a particular flavor of the MSVC runtime, whether you specify that library at link time or not.
This is convenient for small programs, where you simply type at a command prompt cl myprogram.cpp to compile and link, producing myprogram.exe.
You can defeat automatic linking of the commented-in flavor of the C runtime by passing /nodefaultlib to the linker, and then specify a different flavor of the C runtime instead. This will work if you are careful not to depend on the #defines for _MT and _DLL (keep in mind that the standard C headers might be looking at these also).
I don't recommend this, but if you have a reason to need to do this, it can be made to work in most cases.
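A rough sketch of the kind of command line that describes (the flags are real cl/link options, but whether the resulting mix actually behaves depends on the caveats above):
cl /c /MT myprogram.cpp
link myprogram.obj /NODEFAULTLIB:libcmt.lib msvcrt.lib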
If you want to know what parts of the C header files behave differently, you should just search for _MT and _DLL in the headers and see.
All of the options use the same header files; however, each implies different #defines, which affect the header files, so the code needs to be recompiled.
The switches also link to the appropriate library, but the recompile is not because of the linking.
See here for a list of what is defined when you use each.
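For reference, a small compile-time sketch of what each flag implies, based on my reading of the MSVC documentation (verify against your compiler version):
// Rough mapping of runtime flags to predefined macros:
//   /MT  -> _MT
//   /MTd -> _MT, _DEBUG
//   /MD  -> _MT, _DLL
//   /MDd -> _MT, _DLL, _DEBUG
#if defined(_DLL)
    #pragma message("building against a DLL runtime (/MD or /MDd)")
#else
    #pragma message("building against a static runtime (/MT or /MTd)")
#endif
#if defined(_DEBUG)
    #pragma message("debug runtime selected")
#endif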
