Tool to manage and track versions of IDE, programming languages, plugins, etc. - resources

Can you suggest a tool (if one exists) to keep track of the versions of all the resources I use to develop a project?
For example: Project: MyProject
IDE: Eclipse
PHP: 5.0.2
iReport Plugin: 1.2.1
...
Thanks.

One approach is to use source control: check in the binaries (or installers) for all the tools you use along with your project. When you upgrade a tool, check in the upgraded version, so the source control history includes everything that is needed to build a particular version of your project. If you want to build last year's version of the code, everything you need is then in source control.
(Disk space is cheap, so just keep a copy of everything)
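For example, the repository for a project like the one above might carry a tools area next to the source; the layout and file names below are purely illustrative:
MyProject/
  src/                             (the project source code)
  tools/
    eclipse/eclipse-installer.zip
    php/php-5.0.2-installer.tar.gz
    ireport-plugin/iReport-1.2.1.zip
    versions.txt                   (one line per tool: name and exact version in use)
Tagging the repository at each release then ties a specific set of tool versions to a specific version of the code.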

Related

Could Origen updater changes impact generated test flows or patterns?

As I was going through and locking down specific tag versions in my app's Gemfile, I noticed the origen_updater gem had no version associated with it.
gem 'origen_updater'
Why is that? I see that it is version controlled. Could changes to this gem impact modeling or generation?
Thanks.
No, the only thing this gem does is copy its fix_my_workspace script to the application's bin directory.
See here for more background - http://origen-sdk.org/origen//guides/starting/workspace/?highlight=fix_my_workspace#Fix_My_Workspace
It is not versioned for the following reasons:
The latest version of this script will always be the best, which, by definition, will have the best chance of getting your workspace going again if it is in a broken state.
This script is intended to be called directly at times when either Origen or Bundler fail to launch. Finding out that you need a newer version of the script to fix your current problem is not very helpful if you are in a state where Bundler cannot be used to pull it in.
Therefore, the recommendation is not to lock this to a particular version and instead allow it to pull in the latest and greatest anytime you do a bundle update.
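For example, a Gemfile following this advice pins the gems that affect modeling and generation but deliberately leaves origen_updater floating (the version numbers below are just placeholders):
# Pin the gems whose behaviour affects your generated output...
gem 'origen', '0.60.0'
gem 'origen_testers', '0.45.0'
# ...but leave origen_updater unpinned so that 'bundle update'
# always pulls in the newest fix_my_workspace script
gem 'origen_updater'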

GitVersion – selective versioning of multiple assemblies of the same project

I’m on a .NET C# project composed of a solution with several class library projects.
Source control is managed by Git, using GitFlow as the branching model.
We have decided to implement semantic versioning (http://semver.org/) for the project in order to follow a standard way of communicating our releases.
For that we are using GitVersionTask (via NuGet) which works pretty well with gitflow.
Every time we tag a release and perform a build from the master branch, the versions of all assemblies are updated and a new release goes out for delivery.
Only one of the assemblies has a public API; all the others are for internal consumption. I would like to know if this is the correct way to manage the versions of multiple assemblies of the same project. I mean, isn’t it wrong to change the version of every assembly when only a couple (or even just one) of them have changed? To make things more complicated, there is a strong possibility that some of the “internal” assemblies will be used by other projects, so I believe it is not very wise to increment the major version of an assembly that didn’t change just because another assembly of the same project is introducing breaking changes. Should each assembly project be managed in its own repository?
Thanks in advance.
I know this is a bit of an old question, still:
I want to share a workaround that seems to be working:
GitVersion uses $(Build.SourcesDirectory) to see where the sources are located (src).
We can change this using logging commands*
The workaround is to set Build.SourcesDirectory before the GitVersion task runs.
GitVersion then uses the GitVersion.yml from the project folder (Build.SourcesDirectory) and, voilà, it works.
After that you may or may not want to roll back the change, depending on your needs. For me, it is nice to scope down to just the one NuGet package from the collection of NuGet packages in our nugetPackages monorepo.
See the GitVersion issue and comment.
*Example PowerShell command (a standard PowerShell task, set to an inline script):
Write-Host "##vso[task.setvariable variable=Build_SourcesDirectory;]$(Build.SourcesDirectory)\$(NugetProjectName)"
There is certainly nothing in GitVersion that would help with having separate projects within the same repository. The guidance that we would offer here is that you should use different repositories for the different parts of your application. That way they can be versioned/updated at their own cadence.

Project references vs NuGet dependencies

I am in the process of introducing NuGet into our software dev process, both for external binaries (e.g. Moq, NUnit) and for internal library projects containing shared functionality.
TeamCity is producing NuGet packages from our internal library projects, and publishing them to a local repository. My modified solution files use the local repository for accessing the NuGet packages.
Consider the following source code solutions:
Company.Interfaces.sln builds Company.Interfaces.1.2.3.7654.nupkg.
Company.Common.sln contains a reference to Company.Interfaces via its NuGet package, and builds Company.Common.1.1.1.7655.nupkg, with Company.Interfaces.1.2.3.7654 included as a dependency.
The Company.DataAccess.sln uses the Company.Common nupkg to add Company.Interfaces and Company.Common as references. It builds Company.DataAccess.1.0.8.7660.nupkg, including Company.Common.1.1.1.7655 as a dependent component.
Company.Product.A is a website solution that contains references to all three library projects (added by selecting the Company.DataAccess NuGet package).
Questions:
If there is a source code change to Company.Interfaces, do I always need to renumber and rebuild the intermediate packages (Company.Common and Company.DataAccess) and update the packages in Company.Product.A?
Or does that depend on whether the source code change was
a bug fix, or
a new feature, or
a breaking change?
In reality, I have 8 levels of dependent library packages. Is there tooling support for updating an entire tree of packages, should that be necessary?
I know about Semantic Versioning.
We are using VS2012, C#4.0, TeamCity 7.1.5.
It is a good idea to update everything on each check-in, in order to test it early.
What you're describing can be easily managed using artifact dependencies (http://confluence.jetbrains.com/display/TCD7/Artifact+Dependencies) and "Finish Build" build triggers (or even solely "Nuget Dependency Trigger").
We wrote our own build configuration on the base project (would be Company.Interfaces.sln in this case) which builds and updates the whole tree in one go. It checks in updated packages.config files and .nuspec files along the way. I can't say how much of a time-saver this ended up being for us, even if it might sound like overkill at the beginning.
One thing to watch out for: the script we wrote checks in the files even if the chain fails somewhere in between, to give us the chance of fixing it on a local machine, checking in the fix, and restarting the publishing.
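As an illustration only, here is a much-simplified sketch of such a cascade, using the solution names from the question; the feed path, the folder layout, and the choice of plain nuget.exe/msbuild calls are assumptions, not the actual script described above:
$feed = '\\buildserver\LocalNuGetFeed'   # assumed local package repository
$solutions = 'Company.Interfaces', 'Company.Common', 'Company.DataAccess', 'Company.Product.A'
foreach ($name in $solutions) {
    # Pull in the packages published by earlier iterations of this loop
    nuget restore "$name\$name.sln" -Source $feed
    nuget update  "$name\$name.sln" -Source $feed      # rewrites packages.config to the latest versions
    msbuild "$name\$name.sln" /p:Configuration=Release
    if ($name -ne 'Company.Product.A') {
        # Publish the freshly built library as a package for the next level up
        nuget pack "$name\$name\$name.csproj" -Properties Configuration=Release -OutputDirectory $feed
    }
    git commit -am "Bump NuGet dependencies for $name"  # check in the updated packages.config files
}
A real version of this would also bump the version in each .nuspec along the way and publish to the TeamCity feed rather than a file share.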

Tools to help manage sets of multiple versions of executables on Linux?

We are in a networked Linux environment and what I'm looking for is a FOSS or generic system-level method for managing which versions of executables and libraries get used, per session. The executables would preferably be installed on the network. The executables will be in-house tools and installs of commercial packages like Houdini, Maya and Nuke.
The need for this is that we'd prefer to have multiple versions of the software installed and available for the artists but there needs to be an easy way to select which version to use. As an added benefit, I'd like to be able to track the version of software used to generate a given output as metadata. I've worked at studios that did this successfully but I was not 100% up to speed on how it was achieved. Every executable in a given set was assigned a single uber version for the set. That way, the "approved packages" of the studio tools were all collapsed into a single package of tools that were known to work together.
Due to the way they install, some programs make setting this up easy (it's as simple as adding their install directories to $PATH). Other programs don't make it quite so easy. I'm particularly worried about how to handle the libraries a program might install. What's needed is a generic access method I can use to wrap everything into a clean front end.
Does anyone know of such a system available in the wild or am I going to have to implement it from scratch? Google hasn't been very helpful in finding a solution.
Thanks!
Check out the "modules" system at http://modules.sourceforge.net/ ; it's quite widely used in HPC.
There is also eselect. I have only used it on Funtoo (an offspring of Gentoo), but it seems to do what you need. It is also written entirely in Bash, so it should be quite possible to port it to other distros.

Version Roll Back

I am working on a concept in Linux in which I want to do a version rollback for an installed application. Is it possible?
For example, I have an application named X at version 1.1.
I get an update that changes it to version 1.2.
I note which of the app's packages are going to be modified.
Then I save them and apply the changes.
Now, after some time, due to some problems I want to switch back to version 1.1.
If I undo the changes and rebuild the entire solution, will the rollback be done?
The easiest and most common way on Unix is to install each version in its own directory,
e.g. "/usr/bin/MyApp.1.2.3" and "/usr/bin/MyApp.1.2.4", and then create a link, "/usr/bin/MyApp", pointing to the one to use.
Changing versions is then just a matter of moving the link.
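For example (the paths and version numbers are just the illustrative ones from above):
ln -sfn /usr/bin/MyApp.1.2.4 /usr/bin/MyApp   # switch to 1.2.4
ln -sfn /usr/bin/MyApp.1.2.3 /usr/bin/MyApp   # roll back to 1.2.3
The -n flag makes ln replace the existing symlink itself rather than following it into the directory it points to.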
You don't need to invent anything. Just keep the packages you install around. If you want to go back, uninstall the current version and install the previous package again.
