I am in the process of updating an older Windows driver. I am using Build.exe and the associated tool set included in the WinDDK (7600.16385.1). Reviewing the SOURCES file I came across the following macro: USE_CTRLDLL=1. I cannot find any documentation related to this on MSDN (https://msdn.microsoft.com/en-us/library/ms910176.aspx) or on third-party sites. Any idea what this macro actually tells the tool set to do?
The following answer was provided by Don Burn in the Windows Dev Center Forums (What does USE_CTRLDLL=1 in SOURCES file do?):
I suspect someone typo'd, meaning to put in USE_CRTDLL, which is obsolete and instead should be USE_MSVCRT.
Removing this macro has no apparent effect on the compilation, linking, or execution of the driver. As Don implies, it is likely the result of a typo made during a maintenance update.
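For comparison, a minimal sketch of how the C-runtime selection usually appears in a SOURCES file for a user-mode component built with Build.exe (the target and file names here are hypothetical):

TARGETNAME=mytool
TARGETTYPE=PROGRAM
UMTYPE=console

# Link against the DLL version of the C runtime (msvcrt.dll); per Don's answer,
# USE_MSVCRT is the current replacement for the obsolete USE_CRTDLL.
USE_MSVCRT=1

SOURCES=main.c \
        util.c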
Can anyone please give me some pointers on creating sparse files (holey files) in the latest vdbench, 50404rc2? It seems this feature is only supported in the latest release.
Link for more info:
https://community.oracle.com/thread/3759500?start=0&tstart=0
The answer was given by Henk, but on the Oracle vdbench forum, so I am posting the excerpt from it here; the link to the forum post is above.
This is EXPERIMENTAL, so it will work for now, but once I get feedback and decide that this experiment was successful I will change the instructions to activate it.
That means that the '-d86' info below will no longer work.
To activate truncate, add '-d86' as an execution parameter, or, add 'debug=86' at the top of your parameter file.
(For experiments, adding another 'debug=' parameter is much easier than fiddling with the Vdbench parameter parser. If I decide to make this permanently available I'll worry about adding a more 'official' parameter.)
This uses the Unix 'ftruncate()' function (and a similar Windows function) during file creation.
This will create ONLY sparse files during the format; not one block of data is written until further Vdbench workloads are run against these files.
The definition file:
debug=86
fsd=fsd1,anchor=/tmp/sparsedir,depth=1,width=1,files=10,size=40k
fwd=fwd1,fsd=fsd1,operation=read,xfersize=4k,fileio=sequential,fileselect=random,threads=1
rd=rd1,fwd=fwd1,fwdrate=max,format=yes,elapsed=10,interval=1
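A hedged usage sketch (sparse.parm is a hypothetical name for the definition file above): either of the first two commands starts the run, the second one activating truncate via the -d86 execution parameter instead of the debug=86 line; the ls afterwards lets you confirm that the allocated block count (first column) is near zero for the freshly formatted files.

./vdbench -f sparse.parm
./vdbench -f sparse.parm -d86
ls -lsR /tmp/sparsedir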
I am using Node.js Tools for Visual Studio.
When I open a project it takes some time to load because of the Node.js analysis process.
Another problem is that .ntvs_analysis.dat keeps growing larger and larger.
What is it and do I need it?
To my understanding, the NTVS extension analyzes your code to provide IntelliSense support. The result of that analysis is stored in .ntvs_analysis.dat. However, it doesn't only analyze your code but also all installed npm modules and their dependencies (and theirs, and theirs...). So installing more modules will make your .ntvs_analysis.dat grow really fast.
There is an open issue about this on GitHub: https://github.com/Microsoft/nodejstools/issues/88. The file is getting really big for some people, including myself.
One proposed solution in the discussion is to reduce the depth of scanned folders. According to the discussion, turning off IntelliSense would also help keep the file smaller.
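As far as I can tell the file is just a locally generated analysis cache, so it is normally kept out of source control and can be deleted (it gets rebuilt on the next project load). A minimal .gitignore entry, assuming a git repository:

# NTVS IntelliSense analysis cache (regenerated on project load)
*.ntvs_analysis.dat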
I have a C++/CLI solution which isn't mixed with native C++ (although we have that type too). It consists of three projects, two of which are relevant to my question.
The first one is a static library (.lib) that deals with Active Directory matters.
The second one is the main executable project (.exe), which depends on the other projects.
I'm new to Visual Studio 2012 and want to take advantage of tools like Code Analysis. Running Code Analysis over the solution reveals several CA2122 warnings:
CA2122: Do not indirectly expose methods with link demands
I understand the security concerns related to this warning and I think I understand how to deal with them, although I'm also new to this security stuff. These warnings relate to the Active Directory code and only appear when the whole solution is examined; when only the lib project is examined, they do not appear and everything seems to be OK.
Now to the core of the problem:
1. I tried to mark all methods I am warned about with the SecuritySafeCritical attribute.
--> No changes, same warnings.
2. I have solved this warning in another project by marking the whole assembly as SecurityCritical and marking the problematic methods as SecuritySafeCritical (see the sketch at the end of this question). That does not work here: adding an AssemblyInfo.cpp that marks the assembly as SecurityCritical has no effect on the problem. (I know that *.cpp files seem to be obsolete in managed static libraries, since the code seems to have to live completely in the header files, making this kind of project obsolete... but we don't want a .dll for every small part, and we also want this stuff encapsulated in a project of its own instead of having some loose header files or mixing it with other regions.)
3. After that I tried to mark the whole assembly of the main project as SecurityTransparent, because as far as I understand it, SecuritySafeCritical code can be called by SecurityTransparent or SecurityCritical code (which, to me, covers every kind of security).
--> My methods marked SecuritySafeCritical are now flagged with CA2141 warnings, and many other methods produce new warnings (most of them related to exception handling):
CA2141: Transparent methods must not satisfy LinkDemands
CA2140: Transparent code must not reference security critical items
4. So I decided to try marking this assembly as SecurityCritical too.
--> My SecuritySafeCritical methods finally produce no warnings, but all the other warnings from the methods with exception handling remain.
So I don't know how to solve this problem. I assume that having a managed static library is the problem, and with a plain DLL project I could maybe solve it as mentioned in 2., but I want to avoid shipping yet another *.dll project with our programs.
I searched for a solution but found nothing that helps in this case. Information on this topic is also rare, out of date (because it relates to .NET Framework 2.0, while the whole security model seems to have changed massively with .NET Framework 4.0), or hard for me to understand. So I hope someone has an idea of what I could try or what I should do.
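For illustration, the assembly-level/method-level combination described in attempt 2 would look roughly like this in C++/CLI (AdHelper and QueryUsers are hypothetical names; this is a sketch of the pattern, not code from the project):

// AssemblyInfo.cpp (in the library project)
using namespace System::Security;

// Everything in the assembly is security-critical by default...
[assembly:SecurityCritical];

// AdHelper.h - hypothetical wrapper around the Active Directory calls
#pragma once

public ref class AdHelper
{
public:
    // ...except the explicitly safe-critical entry points, which transparent
    // callers may use even though they call link-demand-protected code.
    [System::Security::SecuritySafeCritical]
    void QueryUsers();
};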
I'm using NuGet (as many of you are) a lot for referencing external and internal component assemblies.
For debugging purposes, it would be nice to be able to exchange the NuGet assembly for its source code.
Unfortunately, a certain "core-lib" is used pretty often by the solution itself and also by some NuGet-referenced packages (used by the sln). Simply removing the reference via VS and adding its source project often causes an ambiguous relation to the "core-lib", because both (the sln and the package source) use the "core-lib", mostly in different versions.
The only way I know to solve that issue is to update all references to the same version (usually the most recent one). That can be pretty annoying, especially in bigger projects.
Maybe there is a way to make referencing more flexible, e.g. by using wildcards in the hint path?
Thanks for all suggestions!
Did you try using symbol packages? More details at http://docs.nuget.org/docs/creating-packages/creating-and-publishing-a-symbol-package
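For reference, a symbol package is typically produced alongside the regular package with the -Symbols switch; a minimal sketch, assuming the standard nuget.exe CLI and a hypothetical CoreLib project and version:

nuget pack CoreLib.csproj -Symbols
# produces CoreLib.1.2.3.nupkg and CoreLib.1.2.3.symbols.nupkg; per the linked
# docs, pushing the regular package also publishes the symbols package to the
# configured symbol source
nuget push CoreLib.1.2.3.nupkg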
I have updated my development system to the new MonoTouch (6.0.1) and now, whenever I reference zxing.Monotouch types, I get a MissingMethodException on the constructor.
System.MissingMethodException: Method not found: 'MyClass..ctor'.
It's been 3 days now...
Anyone got any idea? I'm even willing to give up zxing if that's what it takes (even though it's a wonderful library).
Edit
When I include zxing.Monotouch in the solution and reference it as a project, the problem does not reproduce. If that's a clue, I've missed it...
It's likely that the binary version of zxing.Monotouch is trying to access something that does not exist in 6.0.1. That's uncommon, as we try to maintain source/binary compatibility unless the code is really broken (e.g. it would cause a crash anyway). I cannot be more precise without more data (e.g. a full build log).
If you include zxing.Monotouch as a project reference then it will be rebuilt. If it works, then it really looks like source compatibility was preserved (but not binary compatibility).
Whenever you have the source code available I encourage you to use .csproj (not .dll) references. It has a few advantages, including the source/binary compatibility point above and the fact that it makes things easier to debug from your project.
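To make that concrete, swapping the assembly reference for a project reference means the consuming .csproj ends up with something like the fragment below instead of a <Reference> with a HintPath (the path, project file name, and GUID are hypothetical placeholders):

<ItemGroup>
  <ProjectReference Include="..\zxing.monotouch\zxing.MonoTouch.csproj">
    <Project>{00000000-0000-0000-0000-000000000000}</Project>
    <Name>zxing.MonoTouch</Name>
  </ProjectReference>
</ItemGroup>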