SSIS: Shared database connection strings between parent and child packages (Visual Studio 2012)

I want to be able to build 30+ packages in SSIS and be able to test/develop them in isolation. I also want to be able to run these from a Master/Parent package.
When it comes to delivering the SSIS parent package I want to be able to change the connection string once and have this trickle down to all child packages. Other developers will be building and testing without using the master package and want to be able to develop these in isolation.
I've seen many articles on XML configurations, parameter mappings, etc., but I've not seen any definitive guide on how this should be done or what best practice is.
The project we have created also only allows packages to be linked in the solution as an external reference rather than as project links (is this the legacy format?). I'm wondering if this type of project could hamper the ability to achieve shared connection strings.

Answering this myself for reference. Basically there is no streamlined way of doing this in the Package Deployment model. It is much easier to achieve this using the Project Deployment model which is the default in VS2012. However, we don't have this luxury.
I had to create some parent variables in the master package, which are set from the XML configuration. The child packages then have direct "Parent package variable" configurations that map those variables to the ConnectionString properties of their connection managers.
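For reference, the master package can read such a variable from a .dtsConfig file along these lines. This is only a sketch: the variable name, server, and database below are made-up examples, and a generated .dtsConfig will also contain a heading element.

```xml
<?xml version="1.0"?>
<DTSConfiguration>
  <!-- Sets the master package's User::ConnStr variable. Child packages
       pick it up via "Parent package variable" configurations mapped to
       their connection managers' ConnectionString property. -->
  <Configuration ConfiguredType="Property"
                 Path="\Package.Variables[User::ConnStr].Properties[Value]"
                 ValueType="String">
    <ConfiguredValue>Data Source=MYSERVER;Initial Catalog=MyDb;Integrated Security=SSPI;</ConfiguredValue>
  </Configuration>
</DTSConfiguration>
```

Changing the connection string in this one file then trickles down to every child package at run time.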

Related

How to add custom fields to Kentico Modules?

I want to use Kentico's Polls module, but I need to add a "tooltip" field and some extra HTML. Is there a way to clone the module with the same functionality that would allow me to edit/add fields in the module classes?
I was trying to edit the module but it shows a message saying Classes cannot be created or deleted in installed modules.
Are you trying to modify the class in the module and it's giving that error? Or modify the module info itself?
Some module classes allow edits and additions. If that's not the case for Polls, you have two options.
1: Clone or recreate the module, then clone the web parts that use it and point them at your new module. This is the safest but longest approach, and it may require digging into the Kentico assemblies with a decompiler such as JustDecompile to find any other pieces of code you need to clone and modify.
2: Open the database in SQL Server Management Studio and edit the CMS_Class row for that class, manually changing the boolean flags that control whether it can be edited. This carries the risk that a future upgrade may break it, but it's a small risk.
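For illustration only, option 2 would look something like the following. The exact flag columns vary by Kentico version, so inspect the CMS_Class row first and back up the database; the column name in the UPDATE is hypothetical.

```sql
-- Find the class row for the Polls module (names vary by version).
SELECT ClassID, ClassName
FROM CMS_Class
WHERE ClassName LIKE 'polls%';

-- Hypothetical column name: check your CMS_Class schema for the actual
-- flag(s) that lock editing before running anything like this.
UPDATE CMS_Class
SET ClassIsCustomizable = 1   -- hypothetical flag
WHERE ClassID = 123;          -- the ClassID found above
```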

Importing Go packages locally rather than remotely

In a node.js project, I'm using Go for a critical part that node isn't well suited to handle. I want to split the Go code into a sockets package and a main package, with sockets containing the structs/interfaces the main package needs to run. The problem is that, from what I can gather from Go's documentation, I can only use external packages like sockets remotely from github/gopkg. I don't want to split the project into one repository containing the Go code and another containing the node code. How can I make the sockets package available for main to import locally, while still being able to rebuild the binaries for both packages whenever their source changes?
Edit: importing the packages is no longer an issue, but rebuilding packages on update still remains
The same thing happened to my team, and we ended up using the vendor directory; it makes managing all the external packages pretty easy. Anyone who checks out your repo will have all the packages inside vendor/.
Understanding and using the vendor folder
This article also covers the many other options out there:
Golang Package Management Tools
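On the local-import part of the question: with a GOPATH-era layout (this question predates Go modules), one repository can hold both codebases, since a Go import path is resolved under $GOPATH/src even when it looks like a remote URL. The layout below is a hypothetical sketch; running `go build ./...` from the Go root rebuilds any package whose source changed.

```
$GOPATH/src/github.com/you/myproject/   # hypothetical repo path, one repo for both
├── server.js                           # node code
└── gosrc/
    ├── main.go       # package main; imports "github.com/you/myproject/gosrc/sockets"
    └── sockets/
        └── sockets.go  # package sockets: the shared structs/interfaces
```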

Exclude Certain Database Objects from the Build Depending on Configuration Settings

I have a database project in Visual Studio 2012 with SSDT (latest as of this writing). The project contains a schema called "UNITTEST" with tons of stored procedures that create, destroy, and provide other helper functionality for our unit tests. We do this because it lets us control our test data centrally rather than inside each unit test. That's all fine; however, I don't want to publish this schema, or any of the objects in it, to production.
So my question.. Is there a way to stop SSDT/VS2012 from including the UNITTEST schema in the production build deployment script?
I'm thinking there should be a way to do it depending on the solution configuration settings and publish profiles. If my configuration is set to "Release" then I want the build to perform a bit differently.
Builds are very new to me. I found this question: build-different-scripts-depending-on-build-configuration, but I can't get its answer to solve my problem. This question is very similar but also doesn't help: bind-the-deploy-and-publish-destination.
Is anyone else managing something like this? The other developers in my team are just modifying the published script to remove these objects but I HATE manual work, there HAS to be a solution! :)
Thanks all!
One of my schemas references a lot of sys.* objects which created a lot of errors in the build. I created another project in the solution and moved that schema to the new project.
Luckily you can build and publish at the project level.
This allows me to keep the other schema in change control at least.
(It may also help to set the Properties on the SQL files to Build Action: None)
Partial/composite projects might be useful here: the main project contains all of the DB objects your apps need to run, while the composite project references the main project and contains all of the "Test" code.
Here are a couple of options from Jamie Thomson:
http://sqlblog.com/blogs/jamie_thomson/archive/2013/03/10/deployment-of-client-specific-database-code-using-ssdt.aspx --This may be the simplest way to handle this
http://sqlblog.com/blogs/jamie_thomson/archive/2012/01/01/implementing-sql-server-solutions-using-visual-studio-2010-database-projects-a-compendium-of-project-experiences.aspx --Lots of good information in this post and most of it also applies to SSDT SQL Projects.
http://msdn.microsoft.com/en-us/library/dd193415.aspx - Composite projects for larger DBs. This could potentially work for you as well.
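Along the lines of the first link, here is one hedged sketch of the configuration-conditional approach. SSDT lists each script as a Build item in the .sqlproj, so the UNITTEST entries can be moved into an MSBuild ItemGroup that only applies outside Release; the file paths below are hypothetical examples.

```xml
<!-- In the .sqlproj file: include the UNITTEST scripts only when the
     configuration is not Release, so they never reach the production
     build/deployment script. Paths are hypothetical. -->
<ItemGroup Condition=" '$(Configuration)' != 'Release' ">
  <Build Include="UNITTEST\UNITTEST.schema.sql" />
  <Build Include="UNITTEST\Procedures\test_SetupData.sql" />
</ItemGroup>
```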

Automatic way to update component code in InstallShield Basic MSI project

I have InstallShield 2013 Basic MSI project
Is there an automatic way, using a tool or script, to update all of the component codes (GUIDs) in the project?
First question: why do you want to do that? A component GUID is set in stone for all absolute paths it references. See a description of this here: Change my component GUID in wix?
If you are familiar with COM automation, you should be able to automate the generation of new GUIDs in your project using the Installshield automation interface.
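If you do go that route, the idea might look something like the VBScript below. This is a hedged sketch, not a tested solution: the ProgID ISWiAuto20 is my assumption for InstallShield 2013, the project path is hypothetical, and you should check the automation interface reference for your exact version and property names before running anything like this against a backed-up copy of the project.

```vbscript
' Sketch only: iterate components and assign fresh GUIDs.
' "ISWiAuto20" assumed for InstallShield 2013 - verify for your version.
Dim pProject, pComponent
Set pProject = CreateObject("ISWiAuto20.ISWiProject")
pProject.OpenProject "C:\Projects\MySetup.ism"   ' hypothetical path

For Each pComponent In pProject.ISWiComponents
    ' Scriptlet.TypeLib generates a new GUID; keep the first 38 chars.
    pComponent.ComponentCode = Left(CreateObject("Scriptlet.TypeLib").Guid, 38)
Next

pProject.SaveProject
pProject.CloseProject
```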
There are only a few cases where such an operation is logical and valid. If you are looking to install the same product many times, you can check out instance transforms. I have no real experience with them, and I dislike the concept, but here are some pointers:
Installing Multiple Instances with Instance Transforms
Authoring Multiple Instances with Instance Transforms
Configuring and Building a Release that Includes Multiple-Instance Support
Multiple Instances Tab for a Product Configuration

How can one create package for ClearQuest?

I am modifying the ClearQuest database schema and I wonder whether there is a way to create a package for future deployment. If there isn't, what are best practices for tracking and deploying schema modifications?
In your CQ installation path, look for the "cqload" tool.
Basically:
cqload exportintegration - exports a specific schema version into a text file
cqload importintegration - imports the exported text file into a CQ schema
Common practice is to have at least two CQ environments: a Dev/QA environment and a Production environment. Developers work in Dev/QA (checkout/modify/checkin) until they are happy, QA verifies the changes in the same environment, and then the implementor uses the cqload commands to transfer the changes to production.
Personally I think this workflow is clumsy, as it requires so much manual process, and it doesn't work properly if you have additional "Packages" upgrades or installations within CQ (e.g. the UCM package). Unfortunately, I don't think this is going to change anytime soon.