How do you handle code promotion in a SharePoint environment?

In a typical enterprise scenario with in-house development, you might have dev, staging, and production environments. You might use SVN to contain ongoing development work in a trunk, with patches being stored in branches and your released code going into appropriately named tags. Migrating binaries from one environment to the next may be as simple as copying them to middleware servers, GAC'ing the things that need to be GAC'ed, and so on. In coordination with new revisions of binaries, databases are updated, usually by adding stored procedures and views and adding or adjusting table schemas.
In a SharePoint environment, you might use a similar version control scheme. Custom code (assemblies) ends up in features that get installed either manually or via various setup programs. However, some of what needs to be promoted from dev to staging, and then on to production, might be database content that supports the custom code bits.
If you've managed an enterprise SharePoint environment, please share your thoughts on how you manage promotion of code and content changes between environments, while protecting your work and your users, and keeping your sanity.

I assume that when you talk about database content you are referring to the actual content contained in a site or a list.
Probably the best way to do this is to use the stsadm import and export commands to export and import content from one environment to another. (Don't use backup/restore when going from one environment to another.)
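For example, something along these lines would move a site collection's content from dev to staging (the URLs and file names are placeholders):

    stsadm -o export -url http://dev-server/sites/teamsite -filename teamsite.cmp -includeusersecurity -versions 4
    stsadm -o import -url http://staging-server/sites/teamsite -filename teamsite.cmp -includeusersecurity

The -versions 4 switch exports all version history rather than just the most recent versions.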

For any file changes (assemblies, .aspx files) you can use Features and keep track of the installers. You would install the Feature and then do an upgrade to push changes.
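If the Features ship inside a solution package (.wsp), the install and upgrade steps can be scripted with stsadm as well; a rough sketch (the solution, feature, and URL names are placeholders):

    stsadm -o addsolution -filename MySolution.wsp
    stsadm -o deploysolution -name MySolution.wsp -immediate -allowgacdeployment
    stsadm -o activatefeature -name MyFeature -url http://staging-server/sites/teamsite

    rem later, to push a new build of the same solution:
    stsadm -o upgradesolution -name MySolution.wsp -filename MySolution.wsp -immediate -allowgacdeployment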
There's no easy way to sync the data... you can use the stsadm import/export commands as John pointed out, but this may not be straightforward, especially if the servers are configured differently.
There's also the Data Sync Studio product (http://www.simego.net/DataSync_Studio.aspx) that you can try.

Depending on what form the database content takes, I would keep the creation of it in code so it's all in one place (your Visual Studio project) and can also be managed via source control. Deployment of the content could be via a console application or, even better, a feature receiver.
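A rough sketch of the feature receiver approach (the list name and values are made up for illustration, and the feature is assumed to be web-scoped):

    using Microsoft.SharePoint;

    public class ContentDeploymentReceiver : SPFeatureReceiver
    {
        public override void FeatureActivated(SPFeatureReceiverProperties properties)
        {
            // Parent is an SPWeb for web-scoped features (an SPSite for site-scoped ones).
            SPWeb web = (SPWeb)properties.Feature.Parent;

            // Seed the list content that the custom code expects to find.
            SPList list = web.Lists["Lookup Values"];
            SPListItem item = list.Items.Add();
            item["Title"] = "Default configuration entry";
            item.Update();
        }

        // WSS 3.0 declares these as abstract, so they must be overridden even if empty.
        public override void FeatureDeactivating(SPFeatureReceiverProperties properties) { }
        public override void FeatureInstalled(SPFeatureReceiverProperties properties) { }
        public override void FeatureUninstalling(SPFeatureReceiverProperties properties) { }
    }

The receiver is then wired up in feature.xml via the ReceiverAssembly and ReceiverClass attributes, so the content gets created wherever the feature is activated.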
You might also like to read this blog post and look at the tool mentioned there for another approach.

The best resource I can point you to is Eric's paper:
http://msdn.microsoft.com/en-us/library/bb428899.aspx
I was part of a team working to better the story around development of WSS and MOSS solutions with TFS, but I don't know where that stands.

Related

Azure cloud service project configuration (.csdef and .cscfg) in multiple environments

Currently we have a development cloud service (acme-dev-service) and a production cloud service (acme-prod-service). Our current setup in our solution has a cloud service project called acme.application that uses transformation of the .cscfg and .csdef files for deploying the project to the two environments (production and development). I don't like the transformation method because it feels like a bit of a hack to me. So after doing some research it seems that you can have multiple configuration files, which solves part of the issue, but I am running into problems because you are only allowed one service definition. This doesn't work for us because the production environment requires extra certificates as well as different hostHeader bindings than our dev environment does.
So it seems we can't really get away from using the transformations. So I guess my question boils down to this: am I looking at the Azure service project files in the wrong light? Should we really be mapping one Azure project to one Azure cloud service? Should I have an Azure project for production and a second Azure project for development?
Is there a better way to do this? Or a best practice for working with multiple environments in Azure?
The CSDefinition file is the real kicker here. If you have a value you need to be different between two environments (dev/test/stage/production, etc.) then you really have three options:
1) Manually modify the value before a deployment. Errr....Okay....you have two options.
1) Tap into the MS Build process and determine which cloud configuration you have selected (the one used to determine which version of the .cscfg file will be used) and then have the build modify the .csdef after the build and prior to packaging (there is a time when the file has been copied to a different directory just before packaging and this is where you want to make the change). This can be tricky, though I've seen it done and have even done so myself in the early SDK days. Here is a blog post explaining one example where he's using WebConfigTransformRunner to do just that: http://fabriccontroller.net/blog/posts/apply-xdt-transforms-to-your-servicedefinition-csdef-file/. I don't really think this is your best option because it is opaque. It's not evident what is going on and someone who comes along after you to maintain the code will not know about this little gem and will spend forever trying to figure out why some value they put into the csdef somewhere is somehow getting overwritten after they publish to a different environment.
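To give a flavour of what that looks like (this is not the markup from that post; the role, site, and binding names are invented), an XDT transform file applied to the .csdef could set a production host header like so:

    <?xml version="1.0"?>
    <ServiceDefinition name="AcmeApplication"
        xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition"
        xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
      <WebRole name="AcmeWebRole" xdt:Locator="Match(name)">
        <Sites>
          <Site name="Web" xdt:Locator="Match(name)">
            <Bindings>
              <Binding name="HttpIn" hostHeader="www.acme.com"
                       xdt:Transform="SetAttributes(hostHeader)" xdt:Locator="Match(name)" />
            </Bindings>
          </Site>
        </Sites>
      </WebRole>
    </ServiceDefinition>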
2) Use the two Azure project approach you mentioned. You can set up build definitions in your build tool of choice that determine which of the Azure projects you want to build and publish. Personally I think this is the best way to deal with different .csdef files. It's straightforward and doesn't require modifying the csproj files. I'm not opposed to changing csproj files; it's just not overly obvious it was done and, speaking as someone who has inherited things like that, it's not easy to find when people do that kind of thing and they aren't around to tell you about it.

In IIS, how should environment/site-specific WEB.CONFIG settings be preserved, when using MSDeploy?

Background
I work in QA at a software company.
We have about half a dozen different web applications, each of which may require, at any given site, some customised settings added to its web.config file.
These can range from which Oracle database/schema(s) the app connects to, to how many search results to cache, to which hierarchy to use when sorting items on a web page.
We make use of Microsoft's Deploy package, to get the new releases installed/updated on client sites.
When we put out a new release, some of these customised settings may have been added to or removed from the given web app's web.config file, but using Deploy to import the new release over the top of the old one will clobber any customisations that may have been made.
Alternatives
There are ways of handling this manually, such as merging via a plain text comparison of the old and new web.config files, but these are cumbersome and prone to human error.
I was reading about transformations and thought they could be of some use.
There is also the capability to use external files (tip #8) which seems like a good way to go.
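For what it's worth, the external-file approach usually boils down to something like this in web.config (the key and file names are only examples), with the per-site file kept out of the deployment package so it never gets clobbered:

    <appSettings file="AppSettings.site.config">
      <add key="SearchResultCacheSize" value="100" />
    </appSettings>

and AppSettings.site.config on each client site holding just that site's values:

    <appSettings>
      <add key="SortHierarchy" value="CategoryThenName" />
    </appSettings>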
Improvement?
Should our programmers be providing some sort of semi-automated merge facility for this web.config file? Does the Deploy package provide this somehow?
Should we be making use of the external config files, as a best practice?
Is the concept of customising this web.config file at each site so fundamentally flawed that the whole thing needs to be re-thought?
Microsoft provides web.config transformations as the de facto way to do this. You can create different deployment configurations within Visual Studio and the web projects. Then, when you build (or your build server builds) with that particular configuration, the web.config is transformed to contain the settings you want to see.
View more about web.config transforms here.
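As a minimal illustration (the key name is invented), a Web.Release.config transform that swaps one setting at build/publish time looks like this:

    <?xml version="1.0"?>
    <configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
      <appSettings>
        <add key="SearchResultCacheSize" value="500"
             xdt:Transform="SetAttributes(value)" xdt:Locator="Match(key)" />
      </appSettings>
    </configuration>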

Trouble syncing file-based templates to database using MSM and config bootstrap

I had started my typical EE build (using a bootstrapped config) for a client when they announced they wanted an additional site using the MSM module (le sigh).
So I added the MSM module, commented out $config['site_url'] and $config['cp_url'], and set those in index.php instead using $assign_to_config.
That's when I discovered this bug where MSM config file settings are not recognized, which is a pain but I can work around it. However, I noticed that when I created the secondary site, it wouldn't recognise my custom location for add-ons, so I had to add that to index.php as well via $assign_to_config['third_party_path'] = "../assets/third_party/";.
Then I discovered that when I create or modify a template file, it won't automatically sync, so I need to do that manually each time, which is a real PITA.
Why would my templates not be syncing to the database? Is this related to the MSM config bug?
While I haven't tried bootstrapping the third party path yet, I've definitely been able to bootstrap the template path for MSM sites... What bootstrap method are you using?
Are your sites on subdomains or subfolders? I've only had experience with subfolders so perhaps that makes a difference (although it shouldn't).
Could you maybe walk through in a bit more detail what's happening? Your first site (site_id = 1) templates sync automatically from filesystem edits, but your second site does not? Yet if you go to CP > Design > Synchronize Templates, that works?
The $assign_to_config portion of MSM setup is definitely a weak spot when it comes to bootstrapping... I wonder if we need to work up an additional bootstrap for the MSM + CP environment, where it looks at the CP cookie ($_COOKIE['exp_cp_last_site_id']) and sets values based on that.
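Untested sketch of that idea, assuming the bootstrap merges an overrides array into $config and that the template base path is what varies per site (the site IDs and paths are made up):

    // Pick MSM overrides in the CP based on the last-viewed-site cookie; default to site 1.
    $msm_site_id = isset($_COOKIE['exp_cp_last_site_id']) ? (int) $_COOKIE['exp_cp_last_site_id'] : 1;

    $msm_overrides = array(
        1 => array('tmpl_file_basepath' => $_SERVER['DOCUMENT_ROOT'].'/assets/templates/default_site/'),
        2 => array('tmpl_file_basepath' => $_SERVER['DOCUMENT_ROOT'].'/assets/templates/second_site/'),
    );

    if (isset($msm_overrides[$msm_site_id]))
    {
        $config = array_merge($config, $msm_overrides[$msm_site_id]);
    }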
It may be helpful if you let us know which bootstrap you are using. For example, if you look at this bootstrap the site_url and cp_url are set using the HTTP_HOST server variable, so this shouldn't clash with your MSM install (and multiple domains) at all.
Perhaps you could try using that bootstrap file instead, and see if it fixes your issue with template syncing?
Finally, if you're going to use the EE template manager, you don't really need to store templates as files. Conversely, if you want to save templates as files, it's probably much easier editing them using Sublime Text or another editor, rather than the clunky built-in editor (which is really only useful for small/simple changes).

How do I move ExpressionEngine (EE) to another server?

What are the best steps to take to prevent bugs and/or data loss in moving servers?
EDIT: Solved, but I should specify that I mean in the typical shared hosting environment, e.g. DreamHost or GoDaddy.
Bootstrap config is the smartest method (Newism has a free bootstrap config module). I think it works best on fresh installs myself, but ymmv.
If you've been given an existing EE system and need to move it, there are a few simple tools that can help:
REElocate: all the EE 2.x path and config options, in one place. Swap one URL for another in setup, check what's being set and push the button.
Greenery: Again, one module to rule them all. I've not used this but it's got a good rating.
So install, set permissions, move files and DB, and then use either free module. If you find that not all of the images or CSS instantly come back online, check your template base paths (in template prefs) and permissions.
I'm also presuming you have access to the old DB. If not, and you can't add something simple like PHPMyAdmin to back it up, try:
Backup Pro(ish): A free backup module for files and db. Easy enough that you should introduce it to the site users (most never consider backups). All done through the EE CP. The zipped output can easily be moved to the new server.
The EE User Guide offers a reasonably extensive guide to Moving ExpressionEngine to Another Server and if you follow all of these steps then you will have everything you need to try again if any bugs or data loss occur.
Verify Server Compatibility
Synchronize Templates
Back-up Database and Files
Prepare the New Database
Copy Files and Folders
Verify File Permissions
Update database.php
Verify index.php and admin.php
Log In and Update Paths
Clear Caches
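For the "Update database.php" step, the EE 2.x file (system/expressionengine/config/database.php) only needs a handful of values pointed at the new server; the credentials below are placeholders:

    $db['expressionengine']['hostname'] = 'localhost';
    $db['expressionengine']['username'] = 'new_db_user';
    $db['expressionengine']['password'] = 'new_db_password';
    $db['expressionengine']['database'] = 'new_db_name';
    $db['expressionengine']['dbprefix'] = 'exp_';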
As suggested by Bitmanic, a dynamic config.php file helps with moving environments tremendously. Check out Leevi Graham's Config Bootstrap for a quick and simple solution. This is helpful for dev/staging/prod environments too!
I'd say the answer is the same as any other system -- export your entire database, and download all of your files (both system and anything uploaded by users - images, etc). Then, mirror this process by importing/uploading to the new server.
Before I run my export, I like to use the Deeploy Helper module to change all of my file paths in EE to the new server's settings.
Preventing data loss primarily revolves around the database and upload directories.
Does your website allow users to interact with the database? If so, at some point you'll need to turn off EE to prevent DB changes. If not, you don't have too much to worry about, as you can track any changes on the database end between the old and new servers.
Both Philip and Derek offer good advice for migrating EE. I've also found that having a bootstrap config file helps tremendously - especially since you can configure your file upload directories directly via config values now (as of EE2.4, I think).
For related information, please check out the answers to this similar Stack Overflow question.

What is a good deployment tool for websites on Windows?

I'm looking for something that can copy (preferably only changed) files from a development machine to a staging machine and finally to a set of production machines.
A "what if" mode would be nice as would the capability to "rollback" the last deployment. Database migrations aren't a necessary feature.
UPDATE: A free/low-cost tool would be great, but cost isn't the only concern. A tool that could actually manage deployment from one environment to the next (dev->staging->production instead of from a development machine to each environment) would also be ideal.
The other big nice-to-have is the ability to only copy changed files - some of our older sites contain hundreds of .asp files.
@Sean Carpenter can you tell us a little more about your environment? Should the solution be free? Simple?
I find robocopy to be pretty slick for this sort of thing. Wrap it up in a batch file and you are good to go. It's a glorified xcopy, but deploying my website isn't really hard. Just copy out the files.
As far as rollbacks... you are using source control, right? Just pull the old source out of there. Or, in your batch file, ALSO copy the deployment to another folder called website yyyy.mm.dd so you have a lovely folder ready to go in an emergency.
Look at the for command for details on how to get the parts of the date.
robocopy.exe
for /?
Yeah, it's a total "hack" but it moves the files nicely.
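A stripped-down version of such a batch file might look like this (the paths are placeholders, and the %date% slicing assumes a yyyy-mm-dd style locale, so adjust for yours):

    @echo off
    rem Keep a dated copy of what is currently live, then mirror the new build across.
    set STAMP=%date:~0,4%.%date:~5,2%.%date:~8,2%
    robocopy \\webserver\wwwroot\MySite D:\Backups\website_%STAMP% /MIR /LOG:backup_%STAMP%.log
    robocopy C:\Build\MySite \\webserver\wwwroot\MySite /MIR /LOG:deploy_%STAMP%.log

Add /L to the second robocopy line for a "what if" run that only lists the changes it would make.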
For some scenarios I used a freeware product called SyncBack (Download here).
It provides complex, multi-step file synchronization (file system or FTP, with compression, etc.). The program has a nice graphical user interface. You can define profiles and group/execute them together.
You can set filters on file types, names, etc., and execute commands/programs after the job runs. There is also a job log provided as an HTML report, which can be emailed to you if you schedule the job.
There is also a professional version of the software, but for common tasks the freeware should do fine.
You don't specify if you are using Visual Studio .NET, but there are a few built-in tools in Visual Studio 2005 and 2008:
Copy Website tool -- basically a visual synchronization tool, it highlights files and lets you copy from one to the other. Manual, built into Visual Studio.
aspnet_compiler.exe -- lets you precompile websites.
Of course you can create a web deployment package and deploy as an MSI as well.
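For example, precompiling a site from the command line is a one-liner (the paths are placeholders):

    aspnet_compiler -v / -p C:\Source\MyWebSite -u -f C:\Build\MyWebSite_Precompiled

The -f switch overwrites the target directory and -u keeps the output updatable, so web.config can still be edited on the target.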
I have used a combination of CruiseControl.NET, NAnt, and MSBuild to compile, swap out configuration files for specific environments, and copy the files to a build output directory. Then we had another NAnt script to do the file copying (and run database scripts if necessary).
For a rollback, we would save all prior deployments, so theoretically rolling back just involved redeploying the last working build (and restoring the database).
We used UnleashIt (unfortunate name, I know), which was nicely customizable and allowed you to save profiles for deploying to different servers. It also has a "backup" feature which will back up your production files before deployment, so rollback should be pretty easy.
I've given up trying to find a good free product that works.
I then found Microsoft's SyncToy 2.0, which, while lacking in options, works well.
BUT I need to deploy to a remote server.
Since I connect with Terminal Services, I realized I can select my local hard drive when I connect, and then in Explorer on the remote server I can open \\tsclient\S\MyWebsite.
I then use SyncToy with that path and synchronize it with my server. Seems to work pretty well and fast so far...
Maybe rsync plus some custom scripts will do the trick.
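If you go down that road, rsync's dry-run flag covers the "what if" requirement; for example (the host and paths are invented):

    rsync -avz --delete --dry-run ./site/ deploy@staging-server:/var/www/site/

Drop --dry-run for the real copy; only changed files get transferred, and --delete removes files that no longer exist in the source.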
Try RepliWeb. It handles full rollback to previous versions of files. I've used it whilst working for a client who demanded its use and I've become a big fan of it, particularly for:
Rollback to previous versions of code
Authentication and rules for different user roles
Deploy to multiple environments
Full reporting to the user via email / logs stating what has changed, what the current version is, etc.
