How can we protect ourselves from other third parties installing DLLs with the same names as some of ours into C:\WINDOWS?

Our product includes several DLLs built from open source into files with default names as delivered by the open source developers. We're careful to install the files in our own directories and we carefully manage the search path (only for our processes) to keep the loader happy.
Another developer -- a towering intellect -- decided it would be easier to install their own build of some of the same open source into C:\WINDOWS under the same default DLL filenames. Consequently, when we launch a process which depends on these open source DLLs, the system searches C:\WINDOWS before our directories and finds the DLLs installed by the other developer. And they are, of course, incompatible.
Ideas which have occurred to me so far:
Rename all our DLLs to avoid the default names, which would only make collisions less likely.
Load all our DLLs by full path so the loader captures their names into RAM and doesn't search anywhere else the next time they are requested.
For various reasons, neither of these options is palatable at the moment.
What else can we do to defend ourselves against the towering intellects of the world?

You've got only two options: deploy the DLL in the same directory as the EXE (that's where Windows looks first), or use manifests and deploy the DLL to the Windows side-by-side cache (WinSxS). I don't think the latter option is common in the open source world, but it is the only real fix if you want to share DLLs between different apps.

To add to the already excellent answers, you have a couple more choices:
The preferred solution to this problem, supported since Windows XP, is to turn your DLLs into a Win32 assembly. (They don't have to be .NET, but the documentation on creating Win32 assemblies with strong names is appallingly light, so it's easy to get confused and think this is a .NET-only technology.)
An assembly is nothing more complicated than a folder (with the name of the assembly) containing the DLLs and a .manifest file (also with the name of the assembly) that contains an assemblyIdentity element and a file node for each DLL in the assembly.
Assembly based searching works even when dlls are statically linked!
The easiest option is to create unversioned assemblies and store them in the same folder as your .exe files (assuming all your EXEs are in a single folder).
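For illustration, here is roughly what such a private assembly looks like on disk. This is a minimal sketch: the assembly name MyOssLibs and the DLL file names are hypothetical, and the version attribute is required by the manifest schema even though a private assembly deployed this way is effectively unversioned.

    <!-- MyOssLibs\MyOssLibs.manifest : a folder named MyOssLibs next to app.exe -->
    <assembly xmlns="urn:schemas-microsoft-com:asm.v1" manifestVersion="1.0">
      <assemblyIdentity type="win32" name="MyOssLibs"
                        version="1.0.0.0" processorArchitecture="x86"/>
      <file name="libfoo.dll"/>
      <file name="libbar.dll"/>
    </assembly>

    <!-- app.exe.manifest (embedded as RT_MANIFEST or placed next to app.exe):
         declares the dependency so the loader resolves libfoo.dll and libbar.dll
         from the MyOssLibs folder instead of searching C:\WINDOWS -->
    <assembly xmlns="urn:schemas-microsoft-com:asm.v1" manifestVersion="1.0">
      <dependency>
        <dependentAssembly>
          <assemblyIdentity type="win32" name="MyOssLibs"
                            version="1.0.0.0" processorArchitecture="x86"/>
        </dependentAssembly>
      </dependency>
    </assembly>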
If the exe's are in different folders, then there are two ways to access shared assemblies:
You can store your assemblies in a private alternate location if you expect your application to be used on Windows 7 and higher. Create an app.exe.config file for each of your EXEs, and point a probing privatePath element to a common folder where you are storing the assemblies (see the config sketch after these options).
If you are OK with requiring administrative access to perform installs (via MSIs), then you can deal with the appallingly bad documentation (well, absent documentation) on giving your assemblies a strong name, and then store the assembly in WinSxS.
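A sketch of the first option's config file, assuming the shared assemblies live in a sibling folder called SharedAssemblies (the folder name is hypothetical; the element names follow the native side-by-side probing schema, and privatePath is resolved relative to the application's directory):

    <!-- app.exe.config, placed next to app.exe -->
    <configuration>
      <windows>
        <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
          <!-- also look for private assemblies in ..\SharedAssemblies -->
          <probing privatePath="..\SharedAssemblies"/>
        </assemblyBinding>
      </windows>
    </configuration>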
If you can't, or do not want to, bundle your DLLs as assemblies, then this page covers the DLL search order.
Using functions like SetDllDirectory is only going to help for DLLs loaded dynamically at runtime (via LoadLibrary).
The DLL search order used to be:
Directory containing the process EXE
Current directory
Various Windows folders
PATH
Which you could have used to your advantage: launch each EXE with the "current" directory set to the folder containing the OSS DLLs.
With the advent of SafeDllSearchMode the search order now is:
Directory containing the process EXE
Various Windows folders
Current directory
PATH
Meaning there's now less control than ever :( because the search reaches the "untrusted" C:\WINDOWS and System32 folders even sooner.
Again, if the initial DLL is being loaded via LoadLibrary, and it's the dependent DLLs that are the problem, LoadLibraryEx with the LOAD_WITH_ALTERED_SEARCH_PATH flag will cause the following search order (assuming you pass a full path to LoadLibraryEx; see the sketch after this list):
Directory part of the DLL path passed to LoadLibraryEx
Various Windows folders
Current directory
PATH
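A minimal sketch of that call, assuming a hypothetical install path and DLL name; the point is that dependent DLLs sitting next to abc.dll win over any copies dropped into C:\WINDOWS:

    // Sketch only: the path and DLL name are hypothetical.
    #include <windows.h>
    #include <stdio.h>

    int main()
    {
        // A full path is required for LOAD_WITH_ALTERED_SEARCH_PATH to change anything:
        // the directory of the loaded DLL is then searched first for its dependencies.
        HMODULE mod = LoadLibraryExW(L"C:\\Program Files\\OurProduct\\libs\\abc.dll",
                                     NULL, LOAD_WITH_ALTERED_SEARCH_PATH);
        if (mod == NULL)
        {
            fprintf(stderr, "LoadLibraryEx failed: %lu\n", GetLastError());
            return 1;
        }
        // ... resolve exports with GetProcAddress and use the library ...
        FreeLibrary(mod);
        return 0;
    }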

The directory from which the application loaded is normally the first directory searched when you load a DLL. You can, however, use SetDllDirectory to insert a directory of your own into the search: the directory you specify takes the current directory's place in the search order, so it is checked right after the application's directory and before the Windows and System32 folders.
There is also a SafeDllSearchMode setting that affects this to a degree. When it is enabled (the default on modern Windows), the current directory is demoted to after the system folders; calling SetDllDirectory with a non-NULL path removes the current directory from the search entirely.
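A short sketch of that call, with a hypothetical private library folder; note it only influences DLLs loaded after the call, not the EXE's static imports, which are bound before main runs:

    // Sketch only: the folder path and DLL name are hypothetical.
    #include <windows.h>

    int main()
    {
        // Add our private library folder to the search order (it takes the
        // current directory's slot, right after the application directory).
        SetDllDirectoryW(L"C:\\Program Files\\OurProduct\\libs");

        // DLLs loaded dynamically from here on are found in that folder
        // before the Windows and System32 directories are consulted.
        HMODULE mod = LoadLibraryW(L"abc.dll");
        if (mod != NULL)
        {
            // ... GetProcAddress(...) and use the library ...
            FreeLibrary(mod);
        }
        return 0;
    }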

Maybe just compile them to a static library?
Why not?
Also, the application directory (the folder the EXE was loaded from) is searched before C:\WINDOWS.

Related

How to specify output directory for nuget binaries?

I have a nuget package which has .dlls files that it automatically copies to a directory when the program is compiled. Is there a way to specify where the dlls files will be copied to?
I know it's possible to modify the .targets file of the package but assume I don't have access to the package. The project is managed through git so I'd rather not have to distribute binaries along with the licenses included in the package. Currently when someone downloads/compiles the source, the nuget package is automatically downloaded through nuget restore. Is there a way to override the .targets file of the package?
I'd prefer not to use scripts to manage the dll files if possible. Also this is for a C++ project and I know nuget has restrictions based on the languages used.
If your question is asking if there's a simple configuration file or something similar where you can write "copy contents of package X to location Y", then no.
To most people, the differences between the various components in the build system are not important, so it doesn't matter to them whether NuGet copies something, their project's SDK copies something, or MSBuild copies something. However, since you're now trying to do something more advanced, these differences may be important. NuGet only writes/copies files at restore (and therefore only to the packages folder, not the project output folder); after restore, NuGet doesn't run at all. NuGet just makes the files known to the rest of the build system, and those components are responsible for deciding what to do, for example where to copy files.
Since both C++ and .NET projects use MSBuild, the same debugging techniques can be used. From a "developer command prompt", build your project using the -bl switch to generate a "msbuild.binlog" file. You can open this file with the MSBuild structured log viewer. You can then use the search to find where in the build each dll is copied, and what the copy arguments (including destination) are. You can also look to find where the item that defined the file to be copied was created.
Then, you can write an MSBuild target in your project file (or another file that gets imported by your project file) that runs at an appropriate time and updates the item to set the destination you want the file to be copied to. But MSBuild is a scripting language, and you said you didn't want to write a script, so you might not like this approach. And if you're not already knowledgeable about MSBuild scripting, it's probably more effort than writing a PowerShell script. But at least it would happen automatically as part of the build (and therefore happen when you build and debug in Visual Studio), and not be some other process that needs to be manually run.
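As a rough illustration of what such a target might look like. The item name ReferenceCopyLocalPaths, the file name SomePackageLibrary, and the redist\ destination below are assumptions for the sketch; C++ packages may flow through different items, so confirm the real item and metadata in the msbuild.binlog before retargeting anything:

    <!-- Inside the project file, or in a Directory.Build.targets it imports -->
    <Target Name="RedirectPackageDlls" AfterTargets="ResolveReferences">
      <ItemGroup>
        <!-- Update metadata on the existing copy-local item for this DLL.
             DestinationSubDirectory is relative to the normal output folder. -->
        <ReferenceCopyLocalPaths Condition="'%(Filename)' == 'SomePackageLibrary'">
          <DestinationSubDirectory>redist\</DestinationSubDirectory>
        </ReferenceCopyLocalPaths>
      </ItemGroup>
    </Target>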

How should I go about using a temporarily changed copy of a DLL locally when it's been checked in to TFS?

We have a Libraries folder where we keep third-party DLLs and our own utility DLLs for all applications to reference. I want to do development against one of our utility DLLs and an application that consumes it at the same time. But if I check out the library DLL to change it for temporary local use, TFS insists on checking it out exclusively, which trips other people up. I understand the reasoning behind it doing that (hard/impossible to merge a DLL, so two people shouldn't be working on one at the same time), but I just want to mess with my local copy while I'm working on the library it represents.
I suppose I could delete my application's reference to the DLL and recreate the reference pointing to some other place, but of course this just begs for me to forget and check it in like that, which would obviously be bad. Not to mention that this is a pain in the neck.
How should I proceed in such a situation?
You are using a server workspace, which does not allow editing outside of TFS. In TFS 2012 local workspaces were introduced; they do not set the read-only flag on files, so you are free to edit at will.
You can change your existing workspace in a few clicks: http://msdn.microsoft.com/en-us/library/bb892960.aspx
You could just go into the file system and mark the file as writeable. Once you are happy the binary is good you could check it out, copy the new version of the file over and check it back in again. TFS marks binary files like this as locked for good reason, as you can't merge them in the way you can with textual content.
The best approach would be to use a NuGet repository to manage your binary dependencies, instead of relying on binaries checked into source control.

Why is it not recommended to keep the shared libraries in the Executable location

I'm fairly new to Linux and software development. It was suggested to me that shared libraries should be placed separately from the executable's location on Linux, but on Windows I can see that all the files, including the DLLs and the EXE, live in the same folder. What kind of problems might it cause on Linux if the executable and the shared library are in the same folder?
When the executable and the library are in the same folder, you can't really say the library is shared, right? And you'll probably end up with dozens of copies of the same libraries across the system, which is a waste of space. On Windows there's no organized way of storing libraries system-wide like /usr/lib or /lib on Linux: it's not OK to put them in System32, and there are no other standard directories on PATH, so there's no real choice to make.

.dll / .so loading in Windows and Linux environments

My question is about dynamic library loading (abc.dll or abc.so) in both Windows and Linux environments.
I have a shared library, and two applications that have to use it (abc.dll or abc.so).
I have placed a copy of this library (abc.dll or abc.so) in each executable's folder:
/folder-one/app1.exe
/folder-one/abc.dll (resp. abc.so)
/folder-two/app2.exe
/folder-two/abc.dll (resp. abc.so)
When I run app1.exe, it loads the abc library (abc.dll or abc.so) from folder-one and runs.
When I run app2.exe, it loads the abc library (abc.dll or abc.so) from folder-two and runs.
Q-1: When both applications run, will there be two copies of the library loaded?
The loader loads the shared library (abc.dll or abc.so) into memory in both the Linux and Windows environments.
http://tldp.org/HOWTO/Program-Library-HOWTO/shared-libraries.html
Q-2: Is there a disadvantage to having individual copies of the shared library (abc.dll or abc.so) in the respective folders?
Q-3: If I want to load a single DLL from both applications, what should the common location be (so both applications can find it)?
Q-1: When both applications run, will there be two copies of the library loaded?
Yes, the library will be loaded twice, because the two copies live in different locations (the name by itself is not enough to make a library unique).
Q-2: Is there a disadvantage to having individual copies of the library (abc.dll or abc.so) in the respective folders?
Primarily memory consumption, because the code will be duplicated (each copy has its own data in any case). This is a simplification, because sharing code is not mandatory (and the details differ between Linux and Windows); in general, read-only sections are shared and read/write sections are private (and therefore duplicated).
Moreover, loading time is higher because of the basic overhead that occurs during loading (memory allocation, address relocation, dependency resolution and so on).
Do not forget that this also applies to each dependency (if they are deployed in the same folder as the library you're using).
Finally, you should consider deployment and updates. This can be a pro or a con, but don't forget that a shared library can be updated just once and that upgrades all dependent applications; it's a pro if it's done carefully, but it's a con if an update can break existing code. This issue can be managed, for example, by including a version number in the name (when you change the version, compatibility isn't guaranteed).
Q-3: If I want to load the DLL from both applications, should it be at a common location?
Yes. A library is uniquely identified by its name and its location; if both applications load it from the same location, it won't be loaded into memory twice.
On Windows, if a DLL is already loaded and you load a DLL with the same module name, the system will return the already-loaded one and not bother searching.
Before the system searches for a DLL, it checks the following:
If a DLL with the same module name is already loaded in memory, the system uses the loaded DLL, no matter which directory it is in.
The system does not search for the DLL.
as described in this very long MSDN page (which appears at first glance to be solely about Windows Store apps, but it does talk about desktop mode, which I guess hasn't changed since Win7 at least)
However, I think this depends on a lot of things nowadays: whether you are running a Windows Store app, a .NET DLL, or a Win32 DLL. I know .NET uses a system of 'probing' to locate a DLL. And then you also have to determine whether your DLL is held as a side-by-side component.
The old days of checking memory first, then the current directory, then the path were easy to understand. I think Linux uses this approach.
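A quick way to observe the Windows behaviour described above, as a sketch; the path and the DLL name abc.dll are just the hypothetical ones from the question:

    // Sketch: once a module with a given name is in the process, a later
    // LoadLibrary call on that name returns the same HMODULE and just bumps
    // its reference count; no directory search takes place.
    #include <windows.h>
    #include <stdio.h>

    int main()
    {
        HMODULE first  = LoadLibraryW(L"C:\\folder-one\\abc.dll"); // loaded from disk
        HMODULE second = LoadLibraryW(L"abc.dll");                 // same module name: no search

        printf("same module: %s\n", (first && first == second) ? "yes" : "no");

        // Balance each successful LoadLibrary with a FreeLibrary.
        if (second) FreeLibrary(second);
        if (first)  FreeLibrary(first);
        return 0;
    }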

How does Visual Studio process the App_Code folder specially?

How does Visual Studio process the App_Code folder when a change is made or detected in it? Not IIS or ASP.NET.
I want to gain a better understanding of why Visual Studio freezes for long periods of time whenever I save a code file inside a large App_Code folder of a website project. Alternatively, I could ask: why does Visual Studio not exhibit these same freezes when processing a file inside a class library that is equally large?
Ideally I would like to see official documentation cited from Microsoft of the issue at hand of processing the App_Code folder in Visual Studio and what happens that differs from processing a class library for example.
The App_Code folder is not explicitly marked as containing files written in any one programming language. Instead, ASP.NET infers which compiler to invoke for the App_Code folder based on the files it contains. If the App_Code folder contains .vb files, ASP.NET uses the Visual Basic compiler; if it contains .cs files, ASP.NET uses the C# compiler, and so on.
If the App_Code folder contains only files where the programming language is ambiguous, such as a .wsdl file, ASP.NET uses the default compiler for Web applications, as established in the compilation element of the application Web.config file or the machine-level Web.config file. Compilers are named build providers, and a build provider is specified for each file extension in an extension element.
See the documentation here.
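For reference, the compilation element mentioned above looks roughly like this in Web.config. This is a sketch: the .wsdl mapping mirrors the kind of entry the machine-level config ships with (real entries use the assembly-qualified type name), and defaultLanguage is what the "default compiler" falls back on:

    <!-- Web.config sketch (application or machine level) -->
    <configuration>
      <system.web>
        <!-- defaultLanguage is the compiler ASP.NET uses when the language
             of the App_Code contents is ambiguous -->
        <compilation defaultLanguage="c#">
          <buildProviders>
            <!-- maps a file extension to the build provider that compiles it -->
            <add extension=".wsdl" type="System.Web.Compilation.WsdlBuildProvider"/>
          </buildProviders>
        </compilation>
      </system.web>
    </configuration>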
ASP.NET recompiles all the code in this folder into a separate assembly, then references this assembly in your project.
You should be aware that a double reference can occur if you also include these files as compilable items in your project. In that case the files are compiled into a separate assembly (with a temp name) which is referenced, and at the same time compiled into the assembly in the bin folder. This is the start of the horror show ...
These performance notes about the App_Code folder are slightly dated but likely still apply to the project type:
2) Keep the number of files in your /app_code directory small. If you end up having a lot of class files within this directory, I'd recommend you instead add a separate class library project to your VS solution and move these classes within that instead, since class library projects compile faster than compiling classes in the /app_code directory. This isn't usually an issue if you just have a small number of files in /app_code, but if you have lots of directories or dozens of files you will be able to get speed improvements by moving these files into a separate class library project and then reference that project from your web-site instead.
One other thing to be aware of is that whenever you switch from source to design-view within the VS HTML designer, the designer causes the /app_code directory to be compiled before the designer surface loads. The reason for this is so that you can host controls defined within /app_code in the designer. If you don't have an /app_code directory, or only have a few files defined within it, the page designer will be able to load much quicker (since it doesn't need to perform a big compilation first).
-- http://weblogs.asp.net/scottgu/archive/2006/09/22/Tip_2F00_Trick_3A00_-Optimizing-ASP.NET-2.0-Web-Project-Build-Performance-with-VS-2005.aspx
