Uninstalling files deployed by a merge module - InstallShield

I am working on an InstallShield 2015 project. Version 1 of my product ships some files via a merge module. Now I am working on version 5 of the same product. During the upgrade from v1 to v5, I want to remove the files deployed by the merge module in v1 and deploy the new set of files from the merge module in v5.
How can I achieve this? How should I remove the files deployed by the v1 merge module?

Merge Modules: Merge modules are merged into your package at MSI compile time. They are intended to be merged into any MSI package that needs the components they contain. As such, they are a distribution mechanism for shared components, runtimes, or whatever else many packages need to consume.
Merged Content: Merge modules become part of the packages they are merged into, and their components are hence reference counted by MSI itself - not by some custom means - so that the components are only uninstalled when no other MSI packages depend on them. Components can also be set permanent, in which case they are never removed.
Updates: If you want to update the files from the merge module, you basically need to create a new version of the merge module and merge that into your v5 package. When the v5 setup performs its major upgrade and removes the v1 product, the components that came from the v1 merge module are removed with it by normal MSI reference counting, as long as they are not marked permanent and no other installed product still references them.

Related

How to maintain two versions of an app (pro and lite) in Flutter

I am building an app in Flutter. I need to make two versions of it - a pro and a lite version. I use Android Studio. What would be the best way to maintain two versions of an app, so I don't have to create two different projects and update the code in both projects?
You could achieve this using flavors, which let you create two binaries from the same code base with different app IDs. Another way, although I have not tried it myself, is to use the Flutter tools to create multiple different builds for each platform.
From there you can have two entry files, main.dart and main_lite.dart, each configured for its flavor. In them you can specify which features are enabled for each version.
I ran into a similar issue with an app for work. The solution I am using is to simply have a single project for both apps but a separate Git branch for the 2nd app.
So for instance my main app is on master and the 2nd app is on the second_app branch.
When I make a significant change, I make sure to merge it in master first.
And then I just git merge master into second_app.
So, it's not 100% automatic, but it works fine for most cases. And you have the whole git merge workflow at your disposal so you can easily resolve conflicts when you merge.

Node global npm package, keeping up to date

I have published a global node package via npm to generate boilerplate templates for projects at my company.
I would like to compare the current version with the latest published in order to exit the process if it’s not the latest.
What node libraries would you recommend to check for the latest version?
Is there a way to auto-update the global package if a new version is detected?
Remember this is an internal tool for my company, so it's critical that people are creating projects with the latest templates, and I'd like them to be able to update as automatically or easily as possible.
Personal Suggestion
Instead of forcing the user to upgrade, another option is to publish your templates (as zip archives) on a remote static server (e.g. S3). In that case, you can often update the zip to the latest template without upgrading the template generator itself. The generator would then be invoked with a tag, for example:
generate-template angularjs-template:latest
generate-template angularjs-template:4.3
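Purely as an illustration of that approach, here is a sketch of how the generator could resolve such a tag and download the matching archive; the host name, URL layout and file naming below are assumptions, not part of the original suggestion.
const https = require('https');
const fs = require('fs');

// Download a URL to a local file, rejecting on HTTP errors.
function download(url, dest) {
  return new Promise((resolve, reject) => {
    https.get(url, res => {
      if (res.statusCode !== 200) {
        reject(new Error(`Download failed with status ${res.statusCode}`));
        return;
      }
      const file = fs.createWriteStream(dest);
      res.pipe(file);
      file.on('finish', () => file.close(resolve));
    }).on('error', reject);
  });
}

// e.g. "generate-template angularjs-template:latest"
const [name, tag = 'latest'] = (process.argv[2] || '').split(':');
if (!name) {
  console.error('Usage: generate-template <template-name>[:tag]');
  process.exit(1);
}

// Hypothetical bucket layout: https://<host>/<template>/<tag>.zip
const url = `https://templates.example-company.com/${name}/${tag}.zip`;
download(url, `${name}-${tag}.zip`)
  .then(() => console.log(`Fetched ${name}:${tag}, ready to unpack into the new project`))
  .catch(err => { console.error(err.message); process.exit(1); });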
Answering Your Questions
What node libraries would you recommend to check for the latest version?
I am not sure if there is a library for this. However, you can build one very easily (a sketch follows these steps).
Create a JSON file which contains the package information (e.g. latest stable version, deprecation message, etc.).
Upload the JSON file to a remote static server.
Whenever the user runs your program, download the JSON file and check against the current package.json.
Show a deprecation warning if the user should upgrade.
process.exit() the application if the user must upgrade.
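A minimal sketch of those steps, assuming the JSON file is hosted at a URL like the one below and has the shape { "latest": "2.1.0", "minimum": "2.0.0" } (both the URL and the field names are assumptions):
const https = require('https');
const pkg = require('./package.json');

// Hypothetical location of the package-information JSON file.
const INFO_URL = 'https://static.example-company.com/my-generator/version.json';

// Naive semver comparison; a real implementation could use the "semver" package.
function isOlderThan(current, required) {
  const a = current.split('.').map(Number);
  const b = required.split('.').map(Number);
  for (let i = 0; i < 3; i++) {
    if (a[i] !== b[i]) return a[i] < b[i];
  }
  return false;
}

https.get(INFO_URL, res => {
  let body = '';
  res.on('data', chunk => (body += chunk));
  res.on('end', () => {
    const info = JSON.parse(body);
    if (pkg.version !== info.latest) {
      console.warn(`A newer version (${info.latest}) is available; you are running ${pkg.version}.`);
    }
    if (isOlderThan(pkg.version, info.minimum)) {
      console.error(`Version ${pkg.version} is no longer supported, please upgrade.`);
      process.exit(1); // the user must upgrade
    }
    // Otherwise continue with normal template generation.
  });
});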
Is there a way to auto-update the global package if a new version is detected?
I think it is better to leave the control to the user, because there could be reasons why they would prefer not to upgrade. For example, if the user has a bunch of projects started 10 months ago, they might want to use the same template for newer projects.
But if you really want to automate it, you might use the following code (not tested).
const { execSync } = require('child_process');
const pkg = require('./package.json');

// Re-install the latest published version of this package globally,
// then exit so the next run picks up the freshly installed version.
execSync(`npm update -g ${pkg.name}`, { stdio: 'inherit' });
process.exit();

InstallShield patch issue for individual feature upgrade

We have a project that contains three features, and each feature contains its own component files. We have also created patches for this install project, but we have run into strange issues.
The three features, call them A, B, and C, can be installed individually on several machines. On machine 1 I installed feature B only and applied patches 1, 2, and 3. But when I uninstall patch 3, it brings in component files from the other features - almost every file from them - as if it had installed those features; checking the install directory confirms this. How could that happen? Does anyone have a fix for this? Thanks in advance.
After re-analysing my project, I restructured the features and moved the common components into a separate feature, and that resolved the issue.

npm module versioning with auto merging in git

I'm currently struggling with automatic merges of a semantically versioned node project. In my current setup I have to maintain multiple older (minor) versions of the application. To ensure that bug fixes in older versions are also applied to newer versions, I'm using release branches in combination with Bitbucket's auto-merge feature. It works great apart from constant auto-merge conflicts on the application version stored in package.json: each time an auto-merge happens, there is a version conflict with the newer release versions.
Is there any way to avoid those merge conflicts? I fiddled around with a custom merge driver (https://gist.github.com/jphaas/ad7823b3469aac112a52); it kind of works, but in my opinion there should be an easier solution, like storing the version in a dedicated file (e.g. .npmversion) and using built-in merge drivers.
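For what it's worth, the custom merge-driver idea referenced above can be reproduced in a few lines of Node. A rough sketch, assuming the driver is registered for package.json in .gitattributes and the local git config (the placeholder version string and the re-serialisation of the file are choices made here, not something the gist prescribes); note that a custom driver only affects merges run where it is configured, i.e. local clones:
#!/usr/bin/env node
// git passes the paths of the ancestor (%O), ours (%A) and theirs (%B) versions.
const { execFileSync } = require('child_process');
const fs = require('fs');

const [ancestor, ours, theirs] = process.argv.slice(2);
const ourVersion = JSON.parse(fs.readFileSync(ours, 'utf8')).version;

// Neutralise the "version" field in all three inputs so it can never conflict.
for (const file of [ancestor, ours, theirs]) {
  const pkg = JSON.parse(fs.readFileSync(file, 'utf8'));
  pkg.version = '0.0.0-merge';
  fs.writeFileSync(file, JSON.stringify(pkg, null, 2) + '\n');
}

// Let git perform the normal 3-way merge on everything else.
let conflict = false;
try {
  execFileSync('git', ['merge-file', ours, ancestor, theirs]);
} catch (e) {
  conflict = true; // genuine conflicts outside the version field remain
}

if (!conflict) {
  // Restore the current branch's version so auto-merges never touch it.
  const merged = JSON.parse(fs.readFileSync(ours, 'utf8'));
  merged.version = ourVersion;
  fs.writeFileSync(ours, JSON.stringify(merged, null, 2) + '\n');
}
process.exit(conflict ? 1 : 0);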

Project references v NuGet dependencies

I am in the process of introducing NuGet into our software dev process, both for external binaries (e.g. Moq, NUnit) and for internal library projects containing shared functionality.
TeamCity is producing NuGet packages from our internal library projects, and publishing them to a local repository. My modified solution files use the local repository for accessing the NuGet packages.
Consider the following source code solutions:
Company.Interfaces.sln builds Company.Interfaces.1.2.3.7654.nupkg.
Company.Common.sln contains a reference to Company.Interfaces via its NuGet package, and builds Company.Common.1.1.1.7655.nupkg, with Company.Interfaces.1.2.3.7654 included as a dependency.
The Company.DataAccess.sln uses the Company.Common nupkg to add Company.Interfaces and Company.Common as references. It builds Company.DataAccess.1.0.8.7660.nupkg, including Company.Common.1.1.1.7655 as a dependent component.
Company.Product.A is a website solution that contains references to all three library projects (added by selecting the Company.DataAccess NuGet package).
Questions:
If there is a source code change to Company.Interfaces, do I always need to renumber and rebuild the intermediate packages (Company.Common and Company.DataAccess) and update the packages in Company.Product.A?
Or does that depend on whether the source code change was
a bug fix, or
a new feature, or
a breaking change?
In reality, I have 8 levels of dependent library packages. Is there tooling support for updating an entire tree of packages, should that be necessary?
I know about Semantic Versioning.
We are using VS2012, C#4.0, TeamCity 7.1.5.
It is a good idea to update everything on each check-in, in order to test it early.
What you're describing can be easily managed using artifact dependencies (http://confluence.jetbrains.com/display/TCD7/Artifact+Dependencies) and "Finish Build" build triggers (or even solely "Nuget Dependency Trigger").
We wrote our own build configuration on the base project (would be Company.Interfaces.sln in this case) which builds and updates the whole tree in one go. It checks in updated packages.config files and .nuspec files along the way. I can't say how much of a time-saver this ended up being for us, even if it might sound like overkill at the beginning.
One thing to watch out for: the script we wrote checks in the files even if the chain fails somewhere in between, to give us the chance to fix it on our local machines, check in the fix, and restart the publishing.
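The original setup is a TeamCity build configuration rather than a stand-alone script, but purely to illustrate the idea, a sketch of a script walking the tree bottom-up could look roughly like this (the solution order, feed URL, and exact nuget/msbuild arguments are assumptions):
const { execSync } = require('child_process');
const run = cmd => execSync(cmd, { stdio: 'inherit' });

// Bottom-up build order, most fundamental package first.
const solutions = [
  'Company.Interfaces.sln',
  'Company.Common.sln',
  'Company.DataAccess.sln',
  // ...remaining levels up to Company.Product.A
];

const localFeed = 'http://nuget.example.local/feed'; // hypothetical internal repository

for (const sln of solutions) {
  run(`nuget update ${sln} -Source ${localFeed}`); // pull the newest internal packages
  run(`msbuild ${sln} /p:Configuration=Release`);  // rebuild against them
  // Packing and pushing the freshly built .nupkg would happen here,
  // followed by committing the updated packages.config and .nuspec files.
}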

Resources