What is a good deployment tool for websites on Windows (IIS)?

I'm looking for something that can copy (preferably only changed) files from a development machine to a staging machine and finally to a set of production machines.
A "what if" mode would be nice as would the capability to "rollback" the last deployment. Database migrations aren't a necessary feature.
UPDATE: A free/low-cost tool would be great, but cost isn't the only concern. A tool that could actually manage deployment from one environment to the next (dev->staging->production instead of from a development machine to each environment) would also be ideal.
The other big nice-to-have is the ability to only copy changed files - some of our older sites contain hundreds of .asp files.

@Sean Carpenter: can you tell us a little more about your environment? Should the solution be free? Simple?
I find robocopy to be pretty slick for this sort of thing. Wrap it up in a batch file and you are good to go. It's a glorified xcopy, but deploying my website isn't really hard. Just copy out the files.
As far as rollbacks go... you are using source control, right? Just pull the old source out of there. Or, in your batch file, ALSO copy the deployment to another folder called website yyyy.mm.dd so you have a lovely folder ready to go in an emergency.
Look at the for command for details on how to get the parts of the date.
robocopy.exe
for /?
Yeah, it's a total "hack" but it moves the files nicely.
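A minimal sketch of such a batch file, assuming made-up paths and a "Ddd MM/DD/YYYY" style %date% (adjust the for tokens/delims for your locale):
@echo off
rem Placeholder paths - adjust for your environment.
set SRC=C:\dev\MyWebsite
set DEST=\\staging\wwwroot\MyWebsite
set BACKUPROOT=\\staging\backups
rem Build a yyyy.mm.dd stamp from %date% (assumes "Ddd MM/DD/YYYY"; see for /?).
for /f "tokens=2-4 delims=/ " %%a in ("%date%") do set STAMP=%%c.%%a.%%b
rem Keep a dated copy of what is about to go out, for easy rollback.
robocopy "%SRC%" "%BACKUPROOT%\website %STAMP%" /E
rem Mirror to the target - robocopy only copies changed files; add /L for a "what if" dry run.
robocopy "%SRC%" "%DEST%" /MIR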

For some scenarios I have used a freeware product called SyncBack.
It provides complex, multi-step file synchronization (file system, FTP, etc., with optional compression). The program has a nice graphical user interface. You can define profiles and group/execute them together.
You can set filters on file types, names, etc. and execute commands/programs after the job runs. A job log is also provided as an HTML report, which can be emailed to you if you schedule the job.
There is also a professional version of the software, but for common tasks the freeware should do fine.

You don't specify if you are using Visual Studio .NET, but there are a few built-in tools in Visual Studio 2005 and 2008:
Copy Website tool -- basically a visual synchronization tool; it highlights files and lets you copy from one side to the other. Manual, but built into Visual Studio.
aspnet_compiler.exe -- lets you precompile websites (a command-line example follows below).
Of course you can create a web deployment package and deploy it as an MSI as well.
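For example, precompiling a site from the command line looks like this (paths are placeholders; the framework folder depends on your .NET version):
rem Precompile the site at C:\dev\MyWebsite into C:\build\PrecompiledSite
"%windir%\Microsoft.NET\Framework\v4.0.30319\aspnet_compiler.exe" -p "C:\dev\MyWebsite" -v / "C:\build\PrecompiledSite"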
I have used a combination of CruiseControl.NET, NAnt and MSBuild to compile, swap out configuration files for specific environments, and copy the files to a build output directory. Then we had another NAnt script to do the file copying (and run database scripts if necessary).
For a rollback, we would save all prior deployments, so theoretically rolling back just involved redeploying the last working build (and restoring the database).

We used UnleashIt (unfortunate name, I know), which was nicely customizable and allowed you to save profiles for deploying to different servers. It also has a "backup" feature which will back up your production files before deployment, so rollback should be pretty easy.

I've given up trying to find a good free product that works.
I then found Microsoft's SyncToy 2.0, which, while lacking in options, works well.
BUT I need to deploy to a remote server.
Since I connect with Terminal Services, I realized I can select my local hard drive when I connect, and then in Explorer on the remote server I can open \\tsclient\S\MyWebsite.
I then use SyncToy with that path and synchronize it with my server. It seems to work pretty well and fast so far...

Maybe rsync plus some custom scripts will do the trick.
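A rough sketch of what that could look like with an rsync port such as cwRsync (host, user and paths are placeholders; local paths use the Unix-style /cygdrive form):
rem Dry run first (-n) to see what would change - the "what if" mode; drop -n to copy for real.
rsync -azn --delete /cygdrive/c/dev/MyWebsite/ deployuser@staging:/sites/mywebsite/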

Try Repliweb. It handles full rollback to previous versions of files. I've used it while working for a client who demanded its use, and I've become a big fan of it, particularly:
Rollback to previous versions of code
Authentication and rules for different user roles
Deploy to multiple environments
Full reporting to the user via email/logs stating what has changed, what the current version is, etc.

Sync Android Studio projects across multiple workstations

I want to be able to work across multiple workstations, jumping seamlessly from one to the other, without having to worry about committing.
I have personal and work Windows desktops and a Mac OS X laptop. At the moment, I point my project to a cloud directory and have the local install of Android Studio pointing to a Gradle offline cache in another cloud directory. This keeps failing, telling me that the path to Gradle is invalid, which I understand, because Gradle is referenced in a different location on each machine (given the differing file systems on Mac OS X and Windows 7).
Edit: When I try to open the project, it brings up the "Import Project from Gradle" screen, which has the option to select "Use local gradle distribution" and choose the Gradle home directory. I pointed it to the cache directory, and it tells me:
Cannot Save Settings
Gradle location is incorrect.
Location:C:/Users/Username/.gradle
All my research (including these answers here and here) suggests that VCS is the way to go. However, I don't see this as a solution to my problem. I'm not looking for version control; I'm looking to transition seamlessly across workstations. Of course I will still use a version control system to save working versions of my code and to share it with other developers, but there has to be a better way when I simply want to keep all workstations in sync.
I come from web development, where I synchronise my local AMPPS environment across multiple computers without any issue. This means I can transition between my personal desktop, laptop, and work desktop instantly. It frustrates me to have to remember to commit every time I move around. If I have to do this 20 times a day, and it takes about a minute each time, that's 20 minutes that could have been spent writing a couple of functions. And what if I forget to commit? Then I get to work, or home, and the day is wasted because I won't actually have the current, up-to-date code...
So the question remains: is there a way to instantly synchronise Android Studio projects? How do I keep my entire code base (i.e. Gradle) in sync?
OK, thanks to the comments above, which pointed me in the right direction.
Android Studio creates some local files that are specific to the machine you are on. Following this principle, to sync the "source" files (the files specific to your application only), you must ignore all these local files. This is similar to what you would store on GitHub. I followed the answer to this question to apply the ignore rules.
Having ignored all the "local files", when I create a new project, the source files are synchronised across all my workstations. In order to establish a local version, I need to "import" the project first. Once it has been imported, "local files" will be created for that particular machine. From then on, I can "open" the project locally.
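For reference, the machine-specific files to exclude are essentially the ones a standard Android Studio .gitignore lists (exact entries can vary by version):
.gradle/
/build/
/local.properties
/.idea/workspace.xml
/.idea/libraries
*.iml
/captures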
To summarise:
Set your sync to ignore files as per .gitignore or refer to this question.
Create a project on one of your workstations and save it in the cloud.
When you are ready to work on the project for the first time on another workstation, "import" the project.
Once the project has been imported, all local files should have been created.
From then on, use the "open" option to continue working on the project.
I hope this helps somebody else and saves hours of googling.

How should I go about using a temporarily changed copy of a DLL locally when it's been checked in to TFS?

We have a Libraries folder where we keep third-party DLLs and our own utility DLLs for all applications to reference. I want to do development against one of our utility DLLs and an application that consumes it at the same time. But if I check out the library DLL to change it for temporary local use, TFS insists on checking it out exclusively, which trips other people up. I understand the reasoning behind it doing that (hard/impossible to merge a DLL, so two people shouldn't be working on one at the same time), but I just want to mess with my local copy while I'm working on the library it represents.
I suppose I could delete my application's reference to the DLL and recreate the reference pointing to some other place, but of course this just begs for me to forget and check it in like that, which would obviously be bad. Not to mention that this is a pain in the neck.
How should I proceed in such a situation?
You are using a server workspace, which does not allow editing outside of TFS. TFS 2012 introduced local workspaces, which do not set a read-only flag on files, so you are free to edit at will.
You can change your existing workspace in a few clicks: http://msdn.microsoft.com/en-us/library/bb892960.aspx
You could just go into the file system and mark the file as writable. Once you are happy the binary is good, you can check it out, copy the new version of the file over it, and check it back in again. TFS marks binary files like this as locked for good reason: you can't merge them the way you can with textual content.
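In command-line terms, that workflow looks roughly like this (paths are placeholders):
rem Make the local copy writable and experiment with it locally.
attrib -R "C:\Projects\Libraries\OurUtility.dll"
rem When the new binary is good: check it out, overwrite it, and check it back in.
tf checkout "C:\Projects\Libraries\OurUtility.dll"
copy /y "C:\Projects\Utility\bin\Release\OurUtility.dll" "C:\Projects\Libraries\OurUtility.dll"
tf checkin "C:\Projects\Libraries\OurUtility.dll" /comment:"Updated utility DLL"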
The best approach would be to use a NuGet repository to manage your binary dependencies, instead of relying on binaries checked into source control.
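A sketch of that approach with nuget.exe, using a hypothetical internal feed and package name:
rem Pull the utility package from an internal feed instead of referencing a checked-in DLL.
nuget.exe install OurCompany.Utilities -Source http://nuget.internal.example/feed -OutputDirectory packages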

Why does VS2013 publish all website files when using a different machine?

I have a home machine and office machine I use to publish websites using Visual Studio 2013. If I make a change from the same machine, and re-publish, just the changes are published, not all files.
However, when using my clone machine at the office, even if I do a get-latest, make one small change, and re-publish, all files are published, not just the ones that changed and not just the ones that have been recompiled. ALL DLL files, even third-party DLLs that have not changed or been recompiled with a new date, are republished. The same thing happens if my cohort publishes a small change from his machine after I did the last publish. It's not a problem when publishing twice from the same machine, as then only the changed files are published.
Is there any way to prevent complete republishing just because a different machine is used to publish than the one used for the last publish? Thanks.
This seems to make "Determining Changes" a lot slower, but for .NET 4.5 [and up(?)], use this info from https://msdn.microsoft.com/en-us/library/ee942158:
To configure Web Deploy to use checksums instead of dates to determine which files need to be copied to the server, add the following element to the .pubxml file (Publish Settings):
<MSDeployUseChecksum>true</MSDeployUseChecksum>
Like this:
<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <MSDeployUseChecksum>true</MSDeployUseChecksum>
    <!-- other settings omitted to keep the example short -->
    <PublishDatabaseSettings>
      <!-- this section omitted to keep the example short -->
    </PublishDatabaseSettings>
  </PropertyGroup>
</Project>
First of all, I do not completely understand the behavior of MSBuild + VS2013 and the publishing feature, as I have just started using the publishing feature myself. I'm looking for a way to speed up publishing via FTP web deploy in VS2013. (I'm not using TFS for a get-latest, though.)
I would say this is partially explained in a different context at this SO question about the MSBuild process. Timestamps of certain files are compared, which indicates to MSBuild/VS2013 whether a target (build output) is up to date or not. Files that are not up to date are then recompiled.
As you all work on different machines, timestamps are likely to be different quite soon.
To find out what is actually going on during builds/publishes, set the build output verbosity to detailed or diagnostic for a moment:
VS2013 menu - Tools - Options... - Projects and Solutions - Build and Run - MSBuild project build output verbosity - set it to detailed or diagnostic. Run the build and check the Output panel/tab in VS2013 (select "Show output from: Build" if the results aren't already visible). Don't forget to restore the original setting after checking the build details, as it can slow the build down a bit.
But why are even unchanged third-party DLLs being republished? Possibly because those DLLs ARE actually overwritten during a build. You might have the assembly reference property "Copy Local" set to true so that your website runs without any manual uploads, or you might be using a command-line copy with the overwrite parameter explicitly set to true in the project's post-build event (like 'copy /y ...' or 'xcopy /y ...'). Then the timestamp of the file to be published is certainly overwritten; see the "obj\Release\Package\PackageTmp" folder (or, for example, "\Debug" instead of "\Release" if a Debug build is set for the selected publish profile).
Furthermore, by default VS2013 does NOT check timestamp differences on the target web server if you are using FTP publishing, at least in my experience. As for the other publishing options like Web Deploy, I don't know yet. But the differences you experience seem like normal behavior to me, as you run builds and publish files from different machines. So timestamps are likely to differ... which, again, marks files as 'changed' and to be published.
As I'm curious how this question should be solved, I was looking into this myself in the first place:
Maybe a TFS build server is an option for you, configured with a rolling build. But I read in a specific SO post (sorry, can't add more links at this moment as I've just registered) a suggestion to do clean builds to prevent new problems. And that would force full publishing again, as all files are changed by the clean build... so I don't think that will work.
As an answer, you might want to use these FAQ answers on MSDN for web deployments. See the questions:
"Can I exclude specific files or folders from deployment?" (.NET 4.0/4.5) and/or
"How to make Web Deploy use file checksums instead of dates to determine which files were changed?" (.NET 4.5 only!)
The first option is to exclude files.
The second option is to use checksums to compare files instead of timestamps, though, as the FAQ says, that could slow the build process down somewhat. Note the first few lines on those FAQ pages, which explain how to edit your publishing profile to apply one or both of these elements.
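For the exclude option, the FAQ boils down to .pubxml properties along these lines (file and folder names here are placeholders):
<PropertyGroup>
  <ExcludeFilesFromDeployment>ReadMe.txt;internal-notes.txt</ExcludeFilesFromDeployment>
  <ExcludeFoldersFromDeployment>TestFiles</ExcludeFoldersFromDeployment>
</PropertyGroup>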
It is also an option to put the third-party DLLs in a separate project, which you could then include in the deployment only for a certain solution configuration (VS2013 menu - Build - Configuration Manager; see the checkboxes in the 'Deploy' column there for every project). Though I'm not sure this is part of the VS2013 'publish' feature as a web deployment, because the Deploy column checkboxes are greyed out for my solution's projects for some reason I don't understand yet... so I can't test it to verify this option.
Though it sounds obvious, don't forget to create backups/copies/screenshots first before you change any settings or publishing profiles, and then change the same settings/configuration files on the other machines you and your colleagues work on.

All files are marked as read only after check in to TFS from Visual Studio 2012

I have a solution with two projects. I just selected the solution and checked it in, and now all files have a little blue padlock icon on the left side, which obviously means they are marked as read-only.
I've been working only with TortoiseSVN until now and this is my very first check-in to TFS, so why does this happen? Or, if that's too complicated to answer here, at least how can I return the state of my files (the entire solution, maybe) to normal?
Also, I've been struggling to find a good read/tutorial on how to perform the basic TFS tasks from Visual Studio 2012, so if someone could share a good source of information on the topic it would be much appreciated.
This is, in fact, normal. You are using a "server workspace", or connecting to a TFS server from before TFS 2012. Team Foundation Server has multiple modes of working:
A Checkout/Edit/Checkin system (via "server workspaces") means that you will need to explicitly check a file out to begin editing it. Files are kept read-only in order to indicate to you quickly what files are checked out and which files need to be checked out. If you simply start typing in an IDE or editor that understands TFS version control (Visual Studio, Eclipse) then the IDE will check the file out for you. Otherwise, you will need to check the file out manually (by selecting "Check Out for Edit" in Source Control Explorer or by running tf checkout <filename>.)
This type of system is very useful with teams that have very large repositories or very large files in those repositories; by explicitly instructing the source control system that you are editing a file, you avoid the need to scan the filesystem.
An Edit/Merge/Commit system (via "local workspaces") means that you do not need to take any explicit action to check a file out; when you query your pending changes, the disk is scanned to determine what changes you have made. Local workspaces are the default in TFS 2012, though you or your administrator may change this default.
This is similar to the way Subversion operates and is generally suitable for most repositories; however, if you keep large binaries in your tree, it is probably not a good option.
A Distributed Version Control system (via git) means that you have a complete clone of the repository locally, allowing you to work completely independently while offline and to share your changes or receive other people's changes as you see fit. Git is new in TFS 2013 and Visual Studio 2013.
This type of system is very useful for highly distributed teams and teams that want to take advantage of novel branching strategies but may not be appropriate for teams who have very complex requirements around fine-grained permissions.
If your server is TFS 2012 or better and you want to convert your existing server workspace to a local workspace, you can open the "Edit Workspace" dialog and in the advanced settings, change the type of your workspace. This will make all your files writable and you will continue working in a Subversion-like mode.
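From the command line (run from a folder mapped to the workspace), you can list your workspaces and open the same Edit Workspace dialog; the workspace name here is a placeholder:
rem List workspaces for the current user, then open the edit dialog; Advanced > Location can be switched to Local there.
tf workspaces
tf workspace MyWorkspaceName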
This is quite normal; having the files locked (read-only) is expected.
However, you can configure different check-in/check-out strategies for source control.
Link: http://msdn.microsoft.com/en-us/library/ms181237(v=vs.90).aspx
For the general documentation:
Source control : http://msdn.microsoft.com/en-us/library/vstudio/ms181368.aspx
Tfs global : http://msdn.microsoft.com/en-us/library/vstudio/hh529827(v=vs.110).aspx

How do you handle code promotion in a Sharepoint environment?

In a typical enterprise scenario with in-house development, you might have dev, staging, and production environments. You might use SVN to contain ongoing development work in a trunk, with patches being stored in branches, and your released code going into appropriately named tags. Migrating binaries from one environment to the next may be as simple as copying them to middle-ware servers, GAC'ing things that need to be GAC'ed, etc. In coordination with new revisions of binaries, databases are updated, usually by adding stored procedures, views, and adding/adjusting table schema.
In a Sharepoint environment, you might use a similar version control scheme. Custom code (assemblies) ends up in features that get installed either manually or via various setup programs. However, some of what needs to be promoted from dev to staging, and then onto production might be database content that supports the custom code bits.
If you've managed an enterprise Sharepoint environment, please share thoughts on how you manage promotion of code and content changes between environments, while protecting your work and your users, and keeping your sanity.
I assume when you talk about database content you are referring to the actual contents contained in a site or a list.
Probably the best way to do this is to use the stsadm import and export commands to export and import content from one environment to another. (Don't use backup/restore when going from one environment to another.)
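A sketch of the export/import round trip (site URLs and file names are placeholders):
rem On the source farm
stsadm -o export -url http://dev/sites/intranet -filename intranet.cmp -includeusersecurity
rem On the target farm
stsadm -o import -url http://staging/sites/intranet -filename intranet.cmp -includeusersecurity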
For any file changes (assemblies, aspx) you can use Features and then keep track of the installers. You would install the feature and do an upgrade to push changes.
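The Feature/solution upgrade can be scripted too; a sketch with a placeholder solution name:
rem Push an updated WSP to the farm and run the pending deployment jobs.
stsadm -o upgradesolution -name MySolution.wsp -filename MySolution.wsp -immediate -allowgacdeployment
stsadm -o execadmsvcjobs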
There's no easy way to sync the data... you can use the stsadm import/export commands as John pointed out. But this may not be straightforward, especially if the servers are configured differently.
There's also Data Sync Studio product (http://www.simego.net/DataSync_Studio.aspx) you can try.
Depending on what form the database content takes, I would keep the creation of it in code so it's all in one place (your Visual Studio project) and can also be managed via source control. Deployment of the content could be via a console application or, even better, a feature receiver.
You might also like to read this blog post and look at the tool mentioned there for another approach.
The best resource I can point you to is Eric's paper:
http://msdn.microsoft.com/en-us/library/bb428899.aspx
I was part of a team working to better the story around development of WSS and MOSS solutions with TFS, but I don't know where that stands.
