Keep installer size down by reusing files / components - InstallShield

Let's say I have 2 features, both use abc.dll, and both reference it from their respective current directories.
So the output will look like this :
Feature1
abc.dll
Feature2
abc.dll
I've created 2 components for this. In reality I have many features and many DLLs that are shared, and my installer size is nearly 1 GB.
What I am looking for is a smarter way to do this, using IS 2015 professional.
What I've looked at so far:
Merge modules: not sure if this would work, and it also means I would need to maintain the merge modules manually should files be upgraded.
The DuplicateFile table, via the Direct Editor, but this wouldn't work because it can only be bound to a component, not a feature.
A hidden feature that installs the shared files to the target system, followed by a post-install script that copies these files to their respective feature folders and then deletes the hidden feature's folder.
Is there a best practice method to implement what I need?

The most suitable approach in this case is, indeed, merge modules. I am not sure why there is a concern about maintaining them - you should have an automated build process that creates all merge modules and then builds your main installer with the newly created modules.
However, in my opinion, merge modules are a bit cumbersome to use if you have a lot of custom actions.
An alternative to merge modules - assuming you are using a Windows Installer project - is to use small MSI packages which you "chain" to your main installer (you can chain multiple packages with different conditions and supply different properties). Here too, you should have a build process that builds all those small MSI packages and then builds the main installer.
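For either option, an automated build along these lines keeps the pieces in sync (ISCmdBld.exe is InstallShield's standalone command-line builder; the project names here are hypothetical):
:: rebuild the shared modules/packages first, then the main installer
ISCmdBld.exe -p SharedComponents.ism
ISCmdBld.exe -p MainInstaller.ism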
If you don't want to have this kind of 'sub-project', then the option of a hidden feature with a post action is acceptable; I've seen it, and done it, a few times. Note that if you target Windows 7 or later, instead of physically copying the files and deleting them, you can use symbolic links (using the mklink command), which helps reduce the installation's footprint on the target system (and makes patching easier - you replace the original file, and all its links are updated automatically).
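As a sketch of that symbolic-link approach (all paths here are hypothetical; mklink requires elevation, which an installer typically runs with):
:: install abc.dll once into a shared folder, then link it into each feature's folder
mklink "C:\Program Files\MyApp\Feature1\abc.dll" "C:\Program Files\MyApp\Shared\abc.dll"
mklink "C:\Program Files\MyApp\Feature2\abc.dll" "C:\Program Files\MyApp\Shared\abc.dll"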

Related

Where to install multiple compiler-specific libraries on UNIX-like systems

I need to install the same C++/Fortran library compiled with different compilers on the system with CMake. Is there a standard location to install the different compiler-specific versions of the same library on the system? For example, assuming that lib.so and lib.a have already been installed using the system package manager under /usr/, is it good practice to install each of the additional compiler-specific versions in a different folder under, let's say, /usr/local? Or is there a better way of doing this that you can advise?
It depends on how many compilers/libraries/versions you have. If you have just a few of them, I think that (almost) any choice of location is fine, though I personally prefer /opt paths for manually installed code. But if you start having several combinations of them, you easily get into trouble. Besides, I think the question of the "best" location is related to the question of the "best" way to switch from using one library to another, ideally avoiding having to manually set LD_LIBRARY_PATH, the libraries to link, and similar things.
Here are some personal recommendations, based on my experience with systems that must support many libraries/applications built with many compilers/versions and provide them to many users:
Do not use the root user to install compiled software: use an "installer account" instead, and grant read and execute permissions to other users as needed
Select a path for compiled software, e.g. /opt, and define two subfolders, /opt/build and /opt/install: the first for your sources and the place where you compile them, the second as the installation target
Create some subfolders based on categories, e.g. /compilers, /libraries, /applications, ... under both /opt/build and /opt/install
Start by preparing compilers under /compilers, e.g. /compilers/gnu/6.3 or /compilers/intel/2017. When possible compile them, e.g. from /opt/build/compilers/gnu/6.3 to /opt/install/compilers/gnu/6.3, or just put them into the /opt/install folder, e.g. /opt/install/compilers/intel/2017
Prepare the tree for libraries (or applications) by adding subfolders that specify the version, the compiler and the compiler version, e.g. compile in /opt/build/libraries/boost/1.64.0/gnu/6.3 and install to /opt/install/libraries/boost/1.64.0/gnu/6.3
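Since the question mentions CMake, a minimal sketch of building one library variant into such a tree could look like this (the library name and paths are just examples, not a prescription):
# configure, build and install one compiler-specific variant into its own prefix
cd /opt/build/libraries/mylib/1.0/gnu/6.3
cmake -DCMAKE_CXX_COMPILER=/opt/install/compilers/gnu/6.3/bin/g++ \
      -DCMAKE_INSTALL_PREFIX=/opt/install/libraries/mylib/1.0/gnu/6.3 \
      /path/to/mylib-source
make && make install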
At this stage, things are well organized. But:
It is difficult to select which library you want to use: you have to set LD_LIBRARY_PATH or manually link the right one, and the situation is worse when you also deal with applications
You are not considering dependencies between libraries: how can you force the use of g++ 6.3 when linking against boost/1.64.0/gnu/6.3?
To address these and many other issues, a good approach is to use a tool that can help you, e.g. Environment Modules (http://modules.sourceforge.net/), so that you can easily switch from one library to another, enforce dependencies, get help, and in general have something less error-prone in daily usage.
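With Environment Modules in place, switching becomes a matter of loading the right modulefiles; assuming module names that mirror the hypothetical tree above:
# load a compiler and a matching library build; swap replaces one loaded module with another
module load compilers/gnu/6.3
module load libraries/boost/1.64.0/gnu/6.3
module swap compilers/gnu/6.3 compilers/intel/2017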

Perforce and handling switching different versions of a framework/library

I have two versions of a framework both stored under a "thirdparty" directory in my depot. One is in beta which I'm evaluating, and the other is stable. When I first made my workspace, I had it set up to use the stable one, but now I'd like to switch it to use the beta one for testing. I've got a few questions:
Let's say the frameworks are named Framework-2.0-beta and Framework-1.0-stable. Ideally I'd like them to just simply map to a "framework" directory on my local machine, so that I don't have to change all my include paths and such in my project files. Then, in theory, if I wanted to swap back and forth between frameworks, I'd just simply change which one from the depot I'm pulling and then do an update again. How do I do this? I tried at first just mapping them like I mentioned above, but I seem to be getting some errors using this method.
Is this the best way to go about something like this? Like, am I supposed to instead just use a unique workspace for use with one version of the framework vs. another?
Thanks for your help.
The most straightforward way, using just Perforce, is to submit both versions of the framework to the depot and map one of them in the client view of your project.
For example submit the frameworks to places like this:
//thirdparty/framework-2.0-beta/...
//thirdparty/framework-1.0-stable/...
In your project's client view you map one of the two to a fixed target path, e.g.:
//thirdparty/framework-2.0-beta/... //yourclient/framework/...
So far so good.
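Switching versions is then just a client-spec edit followed by a sync:
# edit the View line to map framework-1.0-stable instead of framework-2.0-beta
p4 client
# re-sync the mapped path so the local framework directory is replaced
p4 sync //yourclient/framework/...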
But in larger environments (with several people developing the same project) you will definitely run into problems with that approach, because:
the compile/test/performance results of your workspace are not necessarily the same as those of other people working on the same project (depending on their client views)
having several modules (third-party or not) and handling them this way will be hard to manage and will lead to problems with cross-dependencies (e.g. module A version 2 requires module B version > 3, but that doesn't work with certain other modules, etc.)
There are tools to solve these dependency issues. Look for Apache Ivy or Maven.

Update a DLL without replacing it?

Let's say I have a C# DLL that is 100 MB. Now if I wanted to update this DLL, as far as I know I would need to download another DLL, delete the old DLL, and then move the new one into its place to "update" the DLL (let's assume the assemblies that reference this DLL do not care about its version).
Is there a way to create a file that has all the classes and items that have changed (ONLY the stuff that has changed) and recompile the DLL to have these changes, or somehow update the DLL with those changes?
The goal is to have a way to update the DLL through small patches without having to re-download the entire DLL.
Thank you
You use the same mechanism version control systems use - SVN, for example, happily stores only the changes between binary check-ins, and for some DLLs that can turn an MB-sized DLL into a few KB (obviously your mileage may vary).
Generally, binary diffs are not nearly as efficient as text diffs, which is why they are not often used, but they can work in some cases - it depends on how the binary is laid out, and whether the compilation process produces something that is mostly the same rather than completely different.
You'll need some code on the client to pull the changes down; either use the GNU patch tools, or use something like Windows BITS.
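As a sketch, the bsdiff/bspatch tools do exactly this kind of binary delta (file names here are hypothetical):
# on the build server: compute a small delta between the old and new DLL
bsdiff abc_v1.dll abc_v2.dll abc.patch
# on the client: reconstruct the new DLL from the old one plus the delta
bspatch abc_v1.dll abc_v2.dll abc.patch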

VC++ merge multiple COM DLLs into one

Let's say we have multiple libraries (DLLs) whose features one wants to use in an application, but as a single DLL.
Is it possible to merge the DLLs into a single one, with all the features packed into it? I am not looking at the option of writing a wrapper.
EDIT:
I've revisited the problem. Now all I want to do is bring all the projects under one solution and get a single DLL as the output instead of each project having its own independent output. Is this possible?
You can't literally merge several compiled .dll files into one. Your best bet is to put all the source files into a single project and recompile as a single library. You will likely have conflicts you'll have to resolve manually.
If you really have several COM in-proc servers, you will also have to merge the plumbing that provides the class factories and COM registration, and you will have to do that manually.

Create setup for Linux C project

I want to create a setup for my project so that it can be installed on any PC without installing the header files.
How can I do that?
There are two general ways to distribute programs:
Source Distribution (source code to be built). The most common way is to use GNU autotools to generate a configure script so that your project can be installed by doing ./configure && make install (see the sketch after this list)
Binary Distribution (prebuilt). Instead of shipping source, you ship binaries. There are a couple of competing standards, the two main ones being the RPM and DEB formats.
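For the source-distribution route, the sequence on the user's machine typically looks like this (assuming an autotools-based project):
# typical autotools install; the prefix is optional and defaults to /usr/local
./configure --prefix=/usr/local
make
sudo make install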
You just changed your question (appreciated, it was kind of vague), so my answer no longer applies...
make sure you have a C compiler
I'd be surprised if you didn't; Linux normally has one
find an editor you are comfortable with
vi and emacs are the classics
write your first program and compile it (see the example after this list)
learn about makefiles
learn about sub-projects and libraries
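For the 'first program' step above, the classic minimal cycle is:
# compile hello.c with the system C compiler and run the result
cc -Wall -o hello hello.c
./hello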
In many respects, your question is too vague to be answerable. You will need to describe more what you have in mind. All else apart, if you are using an integrated development environment (IDE), then what you do should be coloured strongly by what the IDE encourages you to do. (Fighting your IDE is counter-productive; I've just never found an IDE that doesn't make me want to fight it.)
However, for a typical project on Linux, you will create a directory to hold the materials. For a small project (up to a few thousand lines of code in a few - say 5-20 - files), you might not need any more structure than a single directory. For bigger projects, you will segregate sub-sections of the project into separate sub-directories under the main project directory.
Depending on your build mechanisms, you may have a single makefile at the top of the project hierarchy (or the only directory in the 'hierarchy'). This is in line with the 'Recursive Make Considered Harmful' paper (P. Miller). Alternatively, you can create a separate makefile for each sub-directory and have the top-level makefile simply coordinate builds across directories.
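As a sketch of the single top-level makefile approach (file and target names are hypothetical; recipe lines must start with a tab):
# minimal non-recursive makefile: one makefile at the project root
CC     = gcc
CFLAGS = -Wall -O2
OBJS   = src/main.o src/util.o

myapp: $(OBJS)
	$(CC) $(CFLAGS) -o $@ $(OBJS)

clean:
	rm -f myapp $(OBJS)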
You should also consider which version control system (VCS) you will use.
