Update a DLL without replacing it? - c#-4.0

Let's say I have a C# DLL that is 100 MB. As far as I know, if I wanted to update this DLL I would need to download a new copy, delete the old DLL, and move the new one into its place to "update" it (let's assume the assemblies that reference this DLL do not care about its version).
Is there a way to create a file that contains only the classes and items that have changed (ONLY the stuff that has changed) and recompile the DLL with these changes, or somehow update the DLL with those changes?
The goal is to have a way to update the DLL through small patches without having to re-download the entire DLL.
Thank You

You can use the same mechanism version control systems use: SVN, for example, happily stores only the differences between binary check-ins, and on some DLLs that can turn a multi-megabyte DLL into a few kilobytes of delta (obviously your mileage may vary).
Generally, binary diffs are not nearly as efficient as text diffs, which is why the technique is not often used, but it can work in some cases. It depends on how the binary is laid out and whether the compilation process produces output that is largely the same rather than completely different.
You'll need some code on the client to pull the changes down and apply them; either use the GNU patch tools or something like Windows BITS.
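To give a flavour of what the client-side "apply" step involves, here is a minimal C sketch that patches the old file in place. The record format ([offset][length][bytes]) is invented purely for this illustration; in practice you would use an established delta format such as bsdiff or xdelta and only ship the small patch file (downloading it, for example, via BITS).

```c
/* Minimal sketch of a client-side patcher, assuming a naive delta format
 * invented here for illustration: a sequence of records
 *   [8-byte offset][4-byte length][length bytes of new data]
 * written in native byte order and applied in place to the old DLL. */
#include <stdio.h>
#include <stdint.h>

static int apply_patch(const char *target_path, const char *patch_path)
{
    FILE *target = fopen(target_path, "r+b");
    FILE *patch  = fopen(patch_path, "rb");
    if (!target || !patch)
        return -1;

    uint64_t offset;
    uint32_t length;
    unsigned char buf[4096];

    /* Read records until the patch file is exhausted. */
    while (fread(&offset, sizeof offset, 1, patch) == 1 &&
           fread(&length, sizeof length, 1, patch) == 1) {
        if (fseek(target, (long)offset, SEEK_SET) != 0)
            return -1;
        while (length > 0) {
            uint32_t chunk = length < sizeof buf ? length : (uint32_t)sizeof buf;
            if (fread(buf, 1, chunk, patch) != chunk)
                return -1;
            if (fwrite(buf, 1, chunk, target) != chunk)
                return -1;
            length -= chunk;
        }
    }

    fclose(patch);
    fclose(target);
    return 0;
}

int main(int argc, char **argv)
{
    if (argc != 3) {
        fprintf(stderr, "usage: %s <dll-to-patch> <patch-file>\n", argv[0]);
        return 1;
    }
    return apply_patch(argv[1], argv[2]) == 0 ? 0 : 1;
}
```

Note that a real patcher should write to a temporary copy and swap it in only after verifying a checksum of the result, so a failed download never leaves a half-patched DLL behind.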

Related

Keep installer size down by reusing files / components

Let's say I have 2 features, both use abc.dll, and both reference it from their respective current directories.
So the output will look like this:
Feature1
    abc.dll
Feature2
    abc.dll
I've created 2 components for this. In reality I have many features and many DLLs that are shared, and my installer size is nearly 1 GB.
What I am looking for is a smarter way to do this, using IS 2015 professional.
What I've looked at so far:
Merge modules: Not sure if this would work; it also means I would need to maintain the merge modules manually should files be upgraded.
DuplicateFile, via the Direct Editor, but this wouldn't work because there is no way to bind it to a feature, only to a component.
A hidden feature which would install the shared files to the target system, then a post-install script which would copy these files to their respective feature directories and delete the hidden feature's folder.
Is there a best practice method to implement what I need?
The most suitable approach in this case is, indeed, merge modules. I am not sure why you are concerned about maintaining them - you should have an automated build process that creates all the merge modules and then builds your main installer with the newly created modules.
However, in my opinion, merge modules are a bit cumbersome to use if you have a lot of custom actions.
An alternative to merge modules - assuming you are using a Windows Installer project - is using small MSI packages which you "chain" to your main installer (you can chain multiple packages with different conditions and supply different properties). Here too, you should have a build process which builds all those small MSI packages and then builds the main installer.
If you don't want to have this kind of 'sub-project', then the option of a hidden feature with a post-install action is acceptable; I've seen it, and done it, a few times. Note that if you target Windows 7 or later, instead of physically copying the files and then deleting them, you can use symbolic links (via the mklink command), which reduces the installation's footprint on the target system (and makes patching easier: you replace the original file, and all its links pick up the change automatically).
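For illustration only, a minimal C sketch of what such a post-install step could do with CreateSymbolicLinkW (the API behind mklink); the paths and layout here are made up for the example, and creating symbolic links normally requires an elevated process, which an installer usually is.

```c
/* Replace per-feature physical copies of abc.dll with symbolic links to a
 * single shared copy. Paths are hypothetical; requires Windows Vista+ and,
 * by default, elevation. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    /* One shared physical copy... */
    const wchar_t *shared = L"C:\\Program Files\\MyApp\\Shared\\abc.dll";

    /* ...and per-feature symbolic links pointing at it. */
    const wchar_t *links[] = {
        L"C:\\Program Files\\MyApp\\Feature1\\abc.dll",
        L"C:\\Program Files\\MyApp\\Feature2\\abc.dll",
    };

    for (int i = 0; i < 2; i++) {
        /* dwFlags = 0 creates a file (not directory) symbolic link. */
        if (!CreateSymbolicLinkW(links[i], shared, 0)) {
            fwprintf(stderr, L"failed to link %ls (error %lu)\n",
                     links[i], GetLastError());
            return 1;
        }
    }
    return 0;
}
```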

How can I convert dynamically linked application to statically one?

I have an application, say gedit, which is dynamically linked, and I don't have the source code, so I cannot compile it the way I like. What I want to do is make it statically linked and move it to a system which doesn't have the necessary libraries to run that application. Is this possible, and if so, how?
It is theoretically possible. You basically have to do the same job that the dynamic linker does, with some modifications, i.e.
dump all sections from the original file
resolve symbols
locate libraries
instead of loading them into memory, assemble them into a "virtual image"
resolve internal links
dump the whole thing into an independent file.
So objdump, readelf, and objcopy will be some of your friends.
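As a very small starting point for the "locate libraries" step, here is a sketch (assuming glibc on Linux) that uses dl_iterate_phdr to list every shared object the dynamic linker has mapped into the running process, together with its load address - essentially the inventory you would need before assembling a "virtual image".

```c
/* List every shared object mapped into the current process.
 * Linux/glibc-specific; dl_iterate_phdr lives in libc, so no extra
 * libraries are needed at link time. */
#define _GNU_SOURCE
#include <link.h>
#include <stdio.h>

static int print_object(struct dl_phdr_info *info, size_t size, void *data)
{
    (void)size;
    (void)data;
    printf("%-50s loaded at %p (%d segments)\n",
           info->dlpi_name[0] ? info->dlpi_name : "[main executable]",
           (void *)info->dlpi_addr, (int)info->dlpi_phnum);
    return 0; /* keep iterating */
}

int main(void)
{
    dl_iterate_phdr(print_object, NULL);
    return 0;
}
```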
The task is not easy and the result will be neither automatic, nor (probably) stable.
You may want to check out this code by someone else who tried the same thing by actually intercepting the dynamic linker (i.e. all the steps above except the last) and dumping the results to disk.
It is based on this tool, so it's anyone's bet whether it works on the newest kernels.
(It probably doesn't, and you would at least need to patch it to reflect the new structures. This is my attempt at doing so. Caveat emptor.)

Finding the shared library name to use with dlload

In my open-source project Artha I use libnotify for showing passive desktop notifications to the user.
Instead of linking against libnotify at build time, a lookup for the shared object (.so) file is made at runtime via dlload; if it is available on the target machine, Artha exposes the notification feature in its GUI. On app start, a call to dlload is made with libnotify.so.1 as the filename parameter, and if it returns a non-null pointer the feature is exposed.
A recurring problem with this model is that every time the version number of the library is bumped, Artha's code needs to be updated; currently libnotify.so.4 is the latest to require such a change.
Is there a Linux system call (irrespective of the distro the app is running on) which can tell me whether a particular library's shared object is available at runtime? I know there is the brute-force option of enumerating the version number from 1 to, say, 10, but I find that solution ugly and inelegant.
Also, if this can be addressed via autoconf, that solution is welcome too, i.e. at build time, based on the target machine, the generated configure.h should have the right .so name to pass to dlload.
P.S.: I think good distros follow the style of creating links to libnotify.so.x so that a programmer can just do dlload("libnotify.so", RTLD_LAZY) and the right version numbered .so is loaded; unfortunately not all distros follow this, including Ubuntu.
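For reference, the runtime lookup described above boils down to something like the sketch below; notify_init is used as the probe symbol, and its exact prototype should be treated as an assumption to verify against the headers of the soversion you target. The answer that follows explains why chasing soversion numbers this way is fragile.

```c
/* Probe for a specific versioned soname at runtime and only enable the
 * feature if it is present. Build with -ldl. The prototype assumed for
 * notify_init is illustrative; check it against the library's headers. */
#include <dlfcn.h>
#include <stdio.h>

int main(void)
{
    void *handle = dlopen("libnotify.so.4", RTLD_LAZY);
    if (!handle) {
        fprintf(stderr, "libnotify.so.4 not found: %s\n", dlerror());
        return 1; /* feature stays disabled */
    }

    /* Look up one entry point to confirm the library is usable. */
    int (*init)(const char *) =
        (int (*)(const char *))dlsym(handle, "notify_init");
    if (init)
        printf("notifications available\n");

    dlclose(handle);
    return 0;
}
```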
The answer is: you don't.
dlopen() is not designed to deal with things like that, and trying to load whichever soversion you find on the system just because it happens to have the symbols you need is not a good way to do it.
Different sonames have different ABIs, and a different ABI means that you may be calling the exact same symbol name while it expects a different set (or a different size) of parameters, which will cause crashes or misbehaviour that are extremely difficult to debug.
You should read up on how shared object versions work and what an ABI is.
The libfoo.so link is there for the link editor (ld) and is usually installed with the -devel packages for that reason; it might also very well not be a link but rather a text file containing a linker script, oftentimes on purpose, precisely to avoid what you're trying to do.

MonoTouch: Adding DLL references in sub projects

In a way I am looking for best-practice here.
I have a common project that is shared by many of my apps. This project has the FlurryAnalytics and ATMHud DLLs as references.
If I do not also reference these DLLs in the main project, the apps will often, but not always, fail the debug-to-device test. With debug-to-simulator I don't need to add these DLLs to the main project.
So, the question is: Do I have to include references to DLLs in the main project that I have in sub projects all the time?
Whenever possible I use references to project files (csproj files) over references to assemblies (.dll). It makes a lot of things easier, like:
code navigation (IDE);
automatic build dependency (the source code you're reading is the one you're building, not something potentially out-of-sync);
source-level debugging (even if you can have it without it, you're sure to be in-sync);
(easier) switch between Debug|Release|... configurations;
changing defines (or any project-level option);
E.g.
Solution1.sln
    Project1a.csproj
    MonoTouch.Dialog.csproj (link to ../Common/MonoTouch.Dialog.csproj)
Solution2.sln
    Project2a.csproj
    MonoTouch.Dialog.csproj (link to ../Common/MonoTouch.Dialog.csproj)
Common.sln
    MonoTouch.Dialog.csproj
Large solutions might suffer a bit from doing this (build performance, searching across files...). The larger they get, the less likely it is that everyone needs to know about every part of them, so there are diminishing returns on the advantages while the inconvenience grows with each project added.
E.g. I would not want to have references to every framework assembly inside Mono (but personally I could live with all the SDK assemblies of MonoTouch ;-).
Note: working with assembly references should not cause you random errors while debugging on device. If you can create such a test case, please file a bug report :-)

How to make a fix in one of the shared libraries (.so) in a project on Linux?

I want to make a quick fix to one of the project's .so libraries. Is it safe to just recompile the .so and replace the original? Or do I have to rebuild and reinstall the whole project? Or does it depend?
It depends. The shared library needs to be binary-compatible with your executable.
For example:
If you changed the behaviour of one of the library's internal functions, you probably don't need to recompile the application.
If you changed the size of a struct (e.g. by adding a member) that is known to the application, you will need to recompile, otherwise the application will still think the struct is smaller than it is, and things will crash when the library tries to read an extra, uninitialized member that the application didn't write to (see the sketch below).
If you change the type or the position of arguments of any function visible to the application, you do need to recompile, because the library will try to read more arguments off the stack than the application has put on it (this is the case with C; in C++ argument types are part of the function signature, so the app will refuse to run rather than crash).
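Here is a tiny, self-contained illustration of the struct case; the struct names and layouts are made up for the example, with the "old" layout standing for what the application was compiled against and the "new" one for what the rebuilt library expects.

```c
/* Why growing a struct that crosses the library boundary breaks the ABI.
 * The names and layouts are invented for this example. */
#include <stdio.h>
#include <stddef.h>

struct config_old {   /* layout baked into the application */
    int width;
    int height;
};

struct config_new {   /* layout after the library added a member */
    int width;
    int height;
    int depth;        /* the application never writes this */
};

int main(void)
{
    printf("application allocates %zu bytes, library now expects %zu\n",
           sizeof(struct config_old), sizeof(struct config_new));
    printf("library reads 'depth' at offset %zu - past the caller's object\n",
           offsetof(struct config_new, depth));
    /* If the application hands a config_old* to a library function that was
     * rebuilt to expect config_new*, the read of 'depth' touches memory the
     * caller never initialized (or doesn't even own) - hence the need to
     * recompile the application as well. */
    return 0;
}
```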
The rule of thumb (for production releases) is that, if you are not consciously aware that you are maintaining binary compatibility, or not sure what binary compatibility is, you should recompile.
That's certainly the intent of using dynamic libraries: if something in the library needs updating, then you just update the library, and programs that use it don't need to be changed. If the signature of the function you're changing doesn't change, and it accomplishes the same thing, then this will in general be fine.
There are of course always edge cases where a program depends on some undocumented side-effect of a function, and then changing that function's implementation might change the side-effect and break the program; but c'est la vie.
If you have not changed the ABI of the shared library, you can just rebuild and replace the library.
It depends, yes.
However, assuming you have the exact same source and compiler that built everything else, if you only change something in a .cpp file, it is fine.
Other things, e.g. changing an interface (between the shared lib and the rest of the system) in a header file, are not fine.
If you don't change your library's binary interface, it's OK to recompile and redeploy only the shared library.
Good references:
How To Write Shared Libraries
The Little Manual of API Design
