Currently we have a development cloud service (acme-dev-service) and a production cloud service (acme-prod-service). Our solution has a cloud service project called acme.application that uses transformation of the .cscfg and .csdef files to deploy the project to the two environments (production and development). I don't like the transformation method because it feels like a bit of a hack to me. After doing some research, it seems that you can have multiple configuration files, which solves some of the issues, but I am running into problems because you are only allowed one service definition. This doesn't work for us because the production environment requires extra certificates as well as different hostHeader bindings than our dev environment does.
So it seems we can't really get away from using the transformations. I guess my question boils down to: am I looking at the Azure service project files in the wrong light? Should we really be mapping one Azure project to one Azure cloud service? Should I have one Azure project for production and a second Azure project for development?
Is there a better way to do this? Or a best practice for working with multiple environments in Azure?
The CSDefinition file is the real kicker here. If you have a value you need to be different between two environments (dev/test/stage/production, etc.) then you really have three options:
1) Manually modify the value before a deployment. Err... okay, you have two options.
1) Tap into the MSBuild process and determine which cloud configuration you have selected (the one used to determine which version of the .cscfg file will be used), and then have the build modify the .csdef after the build and prior to packaging (there is a point when the file has been copied to a different directory just before packaging, and this is where you want to make the change). This can be tricky, though I've seen it done and have even done so myself in the early SDK days. Here is a blog post explaining one example where the author uses WebConfigTransformRunner to do just that: http://fabriccontroller.net/blog/posts/apply-xdt-transforms-to-your-servicedefinition-csdef-file/. I don't really think this is your best option because it is opaque: it's not evident what is going on, and someone who comes along after you to maintain the code will not know about this little gem and will spend forever trying to figure out why some value they put into the .csdef is somehow getting overwritten after they publish to a different environment. (A rough sketch of this kind of build hook follows after this list.)
2) Use the two Azure project approach you mentioned. You can set up build definitions in your build tool of choice that determine which of the Azure projects you want to build and publish. Personally I think this is the best way to deal with different .csdef files. It's straightforward and doesn't require modifying the csproj files. I'm not opposed to changing csproj files; it's just not overly obvious when it's been done, and, speaking as someone who has inherited things like that, it's not easy to find when people do that kind of thing and aren't around to tell you about it.
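Regarding option 1, here's a rough sketch of what such a build hook might look like in the cloud service's project file. Treat everything here as an assumption to adapt, not a drop-in solution: the hook point (BeforeTargets="CorePublish"), the $(TargetProfile) property for the selected cloud configuration, and the WebConfigTransformRunner path can all differ across SDK versions.

<!-- Hypothetical project-file snippet: transform the staged .csdef just before packaging. -->
<Target Name="TransformServiceDefinition" BeforeTargets="CorePublish">
  <!-- Apply ServiceDefinition.$(TargetProfile).csdef as an XDT transform over the
       base ServiceDefinition.csdef; the tool path below is a placeholder. -->
  <Exec Command="..\packages\WebConfigTransformRunner\tools\WebConfigTransformRunner.exe ServiceDefinition.csdef ServiceDefinition.$(TargetProfile).csdef $(IntermediateOutputPath)ServiceDefinition.csdef" />
</Target>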
Related
Our ASP.NET MVC application is published automatically using Octopus Deploy. We use web.config transformations, and we always end up with additional environment-specific files in the installation folder, e.g.
Web.development.config
Web.test.config
Web.preprod.config
There is a slight advantage in having these files deployed since we can easily compare values between different environments when troubleshooting.
Is there a security risk in having these config files deployed to a production environment?
IIS should be configured to prevent the download of .config files by default, but depending on how tight your security needs are, it might be worth getting rid of them (e.g. so that if someone compromises a test server, they do not gain access to production).
If you do want to get rid of them, you can write a PostDeploy.ps1 script to remove Web.*.config.
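A minimal sketch of such a script, assuming Octopus Deploy's Octopus.Action.Package.InstallationDirectoryPath variable points at the deployed folder (adjust the variable name to your setup):

# PostDeploy.ps1 - remove leftover environment transform files after deployment.
# $OctopusParameters and the variable name below are assumptions about the
# Octopus Deploy conventions in use.
$installDir = $OctopusParameters["Octopus.Action.Package.InstallationDirectoryPath"]
Get-ChildItem -Path $installDir -Filter "Web.*.config" |
    Where-Object { $_.Name -ne "Web.config" } |   # extra safety for the real config
    Remove-Item -Verbose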
These files can be removed using the community-contributed step File System - Clean Configuration Transforms.
If you want them to be available for diagnostic purposes, redeploy the Release, but switch off this step.
We have a Website project that's hosted in Azure, and we use web.config transforms for setting environment variables. However, our current approach to building the system for different environments is to build the project multiple times (currently three), which is inefficient.
We'd like to move to using Web Deploy, as this would then set us up nicely for using Release Manager.
Our issue is around using Web Deploy parameters instead of web.config transforms; we need to substitute multiple xml elements, rather than single values.
After much research, I found these two articles, which detail almost exactly what I'm trying to do:
http://blogs.iis.net/elliotth/web-deploy-xml-file-parameterization
http://www.iis.net/learn/publish/using-web-deploy/parameterization-improvements-in-web-deploy-v3
Essentially I'm trying to replicate Scenario 5, but using a separate Set Parameter file for the value.
Unfortunately, in the examples, referencing an external XML file only works if it is on the target machine. Some testing with a colleague confirmed this: it works on a local machine, but not on Azure.
Is there a way I can force Web Deploy to look in a particular location for the external configuration files?
As you've already noticed, Web Deploy is only able to read replacement values on the local machine or on a UNC share. It can't read that specific file over HTTP.
If you're deploying to an Azure Web App, then one thing you could try would be to use Kudu/FTP to manually upload that file one level above your wwwroot folder. Then you could specify the file location like so:
D:\home\site\prices.xml:://book[@name='book1']/price
Of course this implies that you'd have to pre-upload this file before publishing to your site, so it's not a perfect solution, but it should work for what you're trying to accomplish.
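To make that concrete, a hypothetical SetParameters.xml entry for the scenario above might look like this (the parameter name and XPath are illustrative; the matching parameter declaration still lives in parameters.xml, as described in the articles above):

<!-- Web Deploy reads the value using the "<file path>::<XPath>" syntax,
     pulling it from an XML file that already exists on the target machine. -->
<parameters>
  <setParameter name="BookPrice"
                value="D:\home\site\prices.xml:://book[@name='book1']/price" />
</parameters>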
Background
I work in QA at a software company.
We have about half a dozen different web applications, each of which may require, at any given site, some customised settings added to its web.config file.
These can range from which Oracle database/schema(s) the app connects to, to how many search results to cache, to which hierarchy to use when sorting items on a web page.
We make use of Microsoft's Deploy package to get new releases installed/updated on client sites.
When we put out a new release, some of these customised settings may have been added to or removed from the given web app's web.config file, but using Deploy to import the new release over the top of the old one will clobber any customisations that have been made.
Alternatives
There are ways of handling this manually, such as merging via a plain text comparison of the old and new web.config files, but these are cumbersome and prone to human error.
I was reading about transformations and thought they could be of some use.
There is also the capability to use external files (tip #8), which seems like a good way to go; a sketch of that approach follows below.
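As a quick illustration of the external-files approach (the file and key names here are made up), web.config's appSettings element can point at a site-local file that survives a redeploy:

<!-- In web.config: keys in the external file are merged in and override
     same-named keys here, and the external file need not ship with a release. -->
<appSettings file="Site.appSettings.config">
  <add key="SearchResultCacheSize" value="100" />
</appSettings>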
Improvement?
Should our programmers be providing some sort of semi-automated merge facility for this web.config file? Does the Deploy package provide this somehow?
Should we be making use of the external config files, as a best practice?
Is the concept of customising this web.config file at each site so fundamentally flawed that the whole thing needs to be re-thought?
Microsoft provides web.config transformations as the de facto way to do this. You can create different deployment configurations within Visual Studio and the web projects; then, when you (or your build server) build with a particular configuration, the web.config is transformed to contain the settings you want to see.
View more about web.config transforms here.
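For example, a Web.Release.config transform that points a connection string at the production database looks roughly like this (names illustrative):

<!-- Web.Release.config: applied at build/publish time when the Release
     configuration is selected, replacing the matching entry in Web.config. -->
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <connectionStrings>
    <add name="Main"
         connectionString="Server=prod-db;Database=App;Integrated Security=true"
         xdt:Transform="SetAttributes" xdt:Locator="Match(name)" />
  </connectionStrings>
</configuration>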
We are hosting three different web applications on a machine in Azure, and we are using CI to push changes from our TFS build server in Azure to our machines.
The trouble is with the physicalDirectory property, as we could not find any way to use build variables there, so we have to use a relative path like this:
physicalDirectory="..\..\..\..\..\..\bin\_PublishedWebsites\xxxxxx"
We have two issues with this: one is that the local build and the hosted build need different relative paths, and the second is that our two branches are not at the same level.
I tried
physicalDirectory="$(OutpuPath)\_PublishedWebsites\xxxxxx"
as well as
physicalDirectory="%OutpuPath%\_PublishedWebsites\xxxxxx"
had no success with any of those.
Is this possible at all? If not, is there any other way to replace these values?
thanks
almir
You can define your own environment variables in Azure; this approach may work for you:
http://msdn.microsoft.com/en-us/library/windowsazure/gg432991.aspx
I'm looking for something that can copy (preferably only changed) files from a development machine to a staging machine and finally to a set of production machines.
A "what if" mode would be nice as would the capability to "rollback" the last deployment. Database migrations aren't a necessary feature.
UPDATE: A free/low-cost tool would be great, but cost isn't the only concern. A tool that could actually manage deployment from one environment to the next (dev->staging->production instead of from a development machine to each environment) would also be ideal.
The other big nice-to-have is the ability to only copy changed files - some of our older sites contain hundreds of .asp files.
@Sean Carpenter, can you tell us a little more about your environment? Should the solution be free? Simple?
I find robocopy to be pretty slick for this sort of thing. Wrap it up in a batch file and you are good to go. It's a glorified xcopy, but deploying my website isn't really hard. Just copy out the files.
As far as rollbacks... you are using source control, right? Just pull the old source out of there. Or, in your batch file, ALSO copy the deployment to another folder called website yyyy.mm.dd so you have a lovely folder ready to go in an emergency.
Look at the for command for details on how to get the parts of the date.
robocopy.exe
for /?
Yeah, it's a total "hack" but it moves the files nicely.
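A minimal sketch of that batch file (the paths are placeholders, and note that parsing %DATE% is locale-dependent):

@echo off
rem Build a yyyy.mm.dd stamp from %DATE% (assumes a MM/DD/YYYY locale).
for /f "tokens=2-4 delims=/ " %%a in ("%DATE%") do set STAMP=%%c.%%a.%%b
rem Keep a dated copy of what's live now, ready to restore in an emergency.
robocopy \\webserver\wwwroot D:\backups\website_%STAMP% /MIR
rem Mirror the build output to the server, copying only changed files.
robocopy C:\build\website \\webserver\wwwroot /MIR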
For some scenarios I used a freeware product called SyncBack.
It provides complex, multi-step file synchronization (filesystem or FTP etc., compression etc.). The program has a nice graphical user interface. You can define profiles and group/execute them together.
You can set filters on file types, names, etc., and execute commands/programs after the job execution. There is also a job log, provided as an HTML report, which can be emailed to you if you schedule the job.
There is also a professional version of the software, but for common tasks the freeware should do fine.
You don't specify if you are using Visual Studio .NET, but there are a few built-in tools in Visual Studio 2005 and 2008:
Copy Website tool -- basically a visual synchronization tool, it highlights files and lets you copy from one to the other. Manual, built into Visual Studio.
aspnet_compiler.exe -- lets you precompile websites.
Of course you can create a web deployment package and deploy as an MSI as well.
I have used a combination of Cruise Control.NET, nant and MSBuild to compile, and swap out configuration files for specific environments and copy the files to a build output directory. Then we had another nant script to do the file copying (and run database scripts if necessary).
For a rollback, we would save all prior deployments, so theoretically rolling back just involved redeploying the last working build (and restoring the database).
We used UnleashIt (unfortunate name I know) which was nicely customizable and allowed you to save profiles for deploying to different servers. It also has a "backup" feature which will backup your production files before deployment so rollback should be pretty easy.
I've given up trying to find a good free product that works.
I then found Microsoft's SyncToy 2.0, which, while lacking in options, works well.
BUT I need to deploy to a remote server.
Since I connect with Terminal Services, I realized I can select my local hard drive when I connect, and then in Explorer on the remote server I can open \\tsclient\S\MyWebsite.
I then use SyncToy with that path and synchronize it with my server. It seems to work pretty well and fast so far...
Maybe rsync plus some custom scripts will do the trick.
Try Repliweb. It handles full rollback to previous versions of files. I used it whilst working for a client who demanded its use, and I've become a big fan of it, particularly:
Rollback to previous versions of code
Authentication and rules for different user roles
Deploy to multiple environments
Full reporting to the user via email/logs stating what has changed, what the current version is, etc.