The tasks I do manually for updating my web site:
Stop IIS 7
Copy source files from a folder to the virtual directory of my web site
Start IIS 7
There are many ways to approach this, but here is one way.
I am assuming you don't want every single file in your source repository to exist on your destination server. The best way to reliably extract what you need from your source on a regular basis is through a build file. Two options for achieving this are nant and msbuild.
Once you have the set of files you want to deploy, you now need a way to distribute them to your destination server & to stop and start IIS. Again, there are options, but I would personally recommend powershell (with the IIS snapin) for this.
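As a very rough sketch of what that PowerShell step could look like (the site name, app pool name and paths are placeholders, and this assumes the WebAdministration module that comes with the IIS snapin):

# Sketch only -- site, app pool and paths are placeholders
Import-Module WebAdministration

Stop-Website -Name "MySite"
Stop-WebAppPool -Name "MySiteAppPool"

# Copy the build output into the site's virtual directory
Copy-Item -Path "C:\build\output\*" -Destination "C:\inetpub\wwwroot\MySite" -Recurse -Force

Start-WebAppPool -Name "MySiteAppPool"
Start-Website -Name "MySite"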
If you want this to happen regularly, consider a batch file executed by some timer, such as a scheduled task, or even better, a CI solution such as TeamCity.
For a full rundown, there are examples within my PowerUp project that do this.
It depends where you are updating from, but you could have your virtual directory pointing to a local, read-only working copy of your source code and create a task that runs a batch file/PowerShell script/etc. every day to update that working copy (via an svn update, git pull, etc.).
That assumes you have a branch that always contains the latest releasable code.
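A minimal sketch of what that scheduled update script could look like, assuming a Git working copy at a placeholder path:

# Sketch only -- the path and branch are placeholders
Set-Location "C:\inetpub\wwwroot\MySite"   # the working copy the virtual directory points at
git pull origin master                     # or: svn update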
You have to create a batch file with the following content:
Stop WWW publishing service
Delete your old files
Copy the new files
Start WWW publishing service
You can start/stop services like this:
net stop "World Wide Web Publishing Service"
When you have your batch file, you can create a task in the Task Scheduler that calls your batch file at a regular time interval (e.g. every day).
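For example, a daily task could be registered along these lines (the task name, script path and start time are placeholders):

schtasks /Create /TN "Deploy website" /TR "C:\scripts\deploy.bat" /SC DAILY /ST 03:00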
I am trying to create a logic app that will transfer files as they are created from my FTP server to my Azure file share. The folder my trigger is watching is structured by date (see below). Each day that a file is added, a new folder is created, so I need the trigger to check new subfolders, but I don't want to go into the app every day to change which folder the trigger looks at. Is this possible?
Here's what my folder (called DATA) structure looks like:
-DATA-
2016-10-01
2016-10-02
2016-10-03
...
The FTP Connector uses configurable polling, where you set how often it should look for a file. The trigger currently does not support dynamic folders. However, what you could try is the following:
Trigger your logic app by recurrence (same principle as the FTP trigger in fact)
Action: Create a variable to store the date/time in the format used in your folder naming (e.g. an expression along the lines of formatDateTime(utcNow(), 'yyyy-MM-dd') for the 2016-10-01 style above)
Action: Do a "List files in folder" (here you should be able to dynamically set the folder name using the variable you created)
For-each file in folder
Action: Get File Content
Whatever you need to do with the file (calling a nested logic app is smart in case you need to do multiple processing actions on each file, or if you need to handle resubmits of the flow per file)
In order to avoid picking up every file on each run, you will need to find a way to exclude files which have been processed in an earlier run. So either rename the file after it's processed to an extension you can exclude in the next run, or move the file to a subfolder "Processed\datetime" in the root.
This solution will require more actions and thus will be more expensive. I haven't tried it out, but I think this should work. Or at least it's the approach I would try to set up.
Unfortunately, what you're asking is not possible with the current FTP Connector. And there aren't any really great solutions right now... :(
As an aside, I've seen this pattern several times and, as you are seeing, it just causes more problems than it could solve, which realistically is zero. :)
If you own the FTP Server, the best thing to do is put the files in one folder.
If you do not own the FTP Server, politely mention to the owner that this pattern is causing problems and doesn't help you in any way, so please put the files in one folder ;)
I have successfully set up Jenkins on local domain as a test. It builds from SCM, zips the build, extracts to a unique timestamp folder, and then copies over the files to the IIS folder.
I now have to set it up to deploy to an Azure VM. Now things are getting hairy.
I can get the file to copy across, but it takes a long time, and unzipping literally takes an hour.
Cross-domain user rights are also making things difficult, as the user running the Jenkins service does not exist on the production boxes, which are on Azure domains.
What are my options?
Should I install a slave node on the production box, "activate" the slave from the master, and then let the slave:
1. perhaps copy the file over from Azure storage to the production box?
2. extract the files
3. Copy the files to the IIS folder.
Well, there's no clear answer to this; try what works best for you. The main options I see are:
1. Use a slave node in Azure, upload the zip to some place (an Azure storage account or whatever) and let the slave node handle the download/unpacking/etc.
2. Use remote PowerShell to connect directly to the servers in Azure, download the zip from the web (or Azure storage or whatever) and extract it (see the sketch after this list).
3. Use a tool like Octopus, which does literally the same, but is built with deployments in mind.
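A minimal sketch of option 2, assuming WinRM/remote PowerShell is enabled on the Azure VM, PowerShell 5+ on the server for Expand-Archive, and placeholder names for the server, URL and paths:

# Sketch only -- server, credentials, URL and paths are placeholders
$session = New-PSSession -ComputerName "myapp.cloudapp.net" -Credential (Get-Credential)

Invoke-Command -Session $session -ScriptBlock {
    # Download the build artifact and unpack it on the VM itself
    New-Item -ItemType Directory -Path "C:\deploy" -Force | Out-Null
    Invoke-WebRequest -Uri "https://mystorage.blob.core.windows.net/builds/site.zip" -OutFile "C:\deploy\site.zip"
    Expand-Archive -Path "C:\deploy\site.zip" -DestinationPath "C:\inetpub\wwwroot\MySite" -Force
}

Remove-PSSession $session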
I have a powershell command which deletes a folder (i.e. Summer) from the wwwroot directory and recreates the folder with the necessary files (images, css, dll etc.) in it. The problem is that every once in a while IIS tends to lock some of the images or files in the directory, so the powershell command fails to delete the file. I do recycle/stop the app pool used by the site before running the powershell script, but the problem still persists. The issue is random, i.e. the powershell script can delete the folder sometimes, while other times it can't. The weird thing is that if I start deleting the contents (subfolders, files) inside 'Summer', at the end I am able to delete the 'Summer' folder, but that is a manual process which is tedious.
Is there any command I can put in a powershell or batch file to delete the 'Summer' folder even when it is locked by IIS?
I agree with #Lynn Crumbling and recommend iisreset.
Sysinternals has two tools that provide other options:
The ProcExp tool allows you to find which processes have open handles to a given file, and allows you to close that handle. The downside of this tool is that it's not a command line tool.
The MoveFile tool allows you to schedule the file to be removed after reboot.
You can use the IIS PowerShell cmdlets to start and stop app pools, web sites, etc.:
Import-Module WebAdministration;
Stop-WebAppPool ${appPoolName}
Stop-WebSite ${webSiteName}
You can then start them again afterwards using the opposite commands:
Start-WebAppPool ${appPoolName}
Start-WebSite ${webSiteName}
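Put together for the scenario in the question, a rough sketch could look like this (the app pool, site and folder names are placeholders):

# Sketch only -- names and paths are placeholders
Import-Module WebAdministration

Stop-WebAppPool "SummerAppPool"
Stop-Website "Summer"

# With the worker process stopped, the folder should no longer be locked
Remove-Item "C:\inetpub\wwwroot\Summer" -Recurse -Force
Copy-Item "C:\build\Summer" "C:\inetpub\wwwroot\Summer" -Recurse

Start-WebAppPool "SummerAppPool"
Start-Website "Summer"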
As mentioned in the comments, fully stopping IIS using iisreset /stop would work.
Also, you may want to stop only the application from which you are trying to delete files. Look at the Administration Guide.
I want to clear out the working directory in a CruiseControl.NET build after the site has been deployed because space is an issue and there's no requirement to keep it.
The way things are set up at the moment everything is on 1 machine (that's unlikely to change), this is acting as both Mercurial repository server, testing web server and CruiseControl.NET build server.
So on C:\Repositories\ and C:\inetpub\wwwroot\ we have a folder per website. Also in C:\CCNet\Projects we have a folder per website per type of build (Test and Live) - so that means we've got at least 4 copies of each website on the server, and at around 100 MB per site x 100 sites that adds up to a lot of disk space.
What I thought I would like to do is to simply delete the Working Directory on successful build, it only takes 5-10 seconds to completely get a fresh copy (one small advantage to the build server being the same machine as the hg server) and only keep a handful of relatively active projects current. Of the 100 or so sites we'll probably work on no more than 10 in a week (team of 5).
I have experimented with a task that runs cmd.exe to del /s /q the Working Directory folder. Sometimes this will complete successfully, other times it will fail with the message "The process cannot access the file because it is being used by another process". When it does complete OK, the build kicks off again, presumably because the WD is not found and needs to be recreated, so I'm finding I'm in a never-ending loop there.
Are there any ways I can reduce the amount of space required to run these builds or do I need to put together a business case for increasing hosting costs for our servers?
You need to create your own ccnet task and build the logic into it.
Create a new project called ccnet.[pluginname].plugin.
Use the artifact cleanup task source as a base to get going quickly
Change the base directory from result.ArtifactDirectory to whatever you need it to be
Compile it and copy \bin\Debug\ccnet.[pluginname].plugin.dll to C:\Program Files\CruiseControl.NET\server or wherever CCNet is installed.
Restart the service and you should be able to use your task in a very similar way to the artifact cleanup task.
I'm looking for something that can copy (preferably only changed) files from a development machine to a staging machine and finally to a set of production machines.
A "what if" mode would be nice as would the capability to "rollback" the last deployment. Database migrations aren't a necessary feature.
UPDATE: A free/low-cost tool would be great, but cost isn't the only concern. A tool that could actually manage deployment from one environment to the next (dev->staging->production instead of from a development machine to each environment) would also be ideal.
The other big nice-to-have is the ability to only copy changed files - some of our older sites contain hundreds of .asp files.
#Sean Carpenter can you tell us a little more about your environment? Should the solution be free? simple?
I find robocopy to be pretty slick for this sort of thing. Wrap it up in a batch file and you are good to go. It's a glorified xcopy, but deploying my website isn't really hard. Just copy out the files.
As far as rollbacks... You are using source control, right? Just pull the old source out of there. Or, in your batch file, ALSO copy the deployment to another folder called "website yyyy.mm.dd" so you have a lovely folder ready to go in an emergency.
Look at the for command for details on how to get the parts of the date.
robocopy.exe
for /?
Yeah, it's a total "hack" but it moves the files nicely.
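A rough PowerShell take on the same idea, using Get-Date for the dated backup folder instead of the for trick (the paths here are placeholders):

# Sketch only -- source and target paths are placeholders
$source = "C:\dev\MyWebsite"
$target = "\\webserver\wwwroot\MyWebsite"
$backup = "\\webserver\backups\website " + (Get-Date -Format "yyyy.MM.dd")

# Keep a copy of what is currently deployed, then mirror the changes across
robocopy $target $backup /E
robocopy $source $target /MIR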
For some scenarios I have used a freeware product called SyncBack.
It provides complex, multi-step file synchronization (file system or FTP etc., compression etc.). The program has a nice graphical user interface. You can define profiles and group/execute them together.
You can set filters on file types, names etc. and execute commands/programs after the job execution. There is also a job log provided as an HTML report, which can be emailed to you if you schedule the job.
There is also a professional version of the software, but for common tasks the freeware should do fine.
You don't specify if you are using Visual Studio .NET, but there are a few built-in tools in Visual Studio 2005 and 2008:
Copy Website tool -- basically a visual synchronization tool, it highlights files and lets you copy from one to the other. Manual, built into Visual Studio.
aspnet_compiler.exe -- lets you precompile websites (see the command-line example below).
Of course you can create a web deployment package and deploy as an MSI as well.
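For example, precompiling from the command line could look roughly like this (the paths are placeholders; aspnet_compiler.exe lives in the .NET Framework directory):

aspnet_compiler -v / -p "C:\source\MyWebsite" "C:\output\MyWebsite"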
I have used a combination of CruiseControl.NET, nant and MSBuild to compile, swap out configuration files for specific environments, and copy the files to a build output directory. Then we had another nant script to do the file copying (and run database scripts if necessary).
For a rollback, we would save all prior deployments, so theoretically rolling back just involved redeploying the last working build (and restoring the database).
We used UnleashIt (unfortunate name, I know), which was nicely customizable and allowed you to save profiles for deploying to different servers. It also has a "backup" feature which will back up your production files before deployment, so rollback should be pretty easy.
I've given up trying to find a good free product that works.
I then found Microsoft's SyncToy 2.0, which, while lacking in options, works well.
BUT I need to deploy to a remote server.
Since I connect with Terminal Services, I realized I can select my local hard drive when I connect, and then in Explorer on the remote server I can open \\tsclient\S\MyWebsite.
I then use SyncToy with that path and synchronize it with my server. It seems to work pretty well and fast so far...
Maybe rsync plus some custom scripts will do the trick.
Try repliweb. It handles full rollback to previous versions of files. I've used it whilst working for a client who demanded its use, and I've become a big fan of it, particularly:
Rollback to previous versions of code
Authentication and rules for different user roles
Deploy to multiple environments
Full reporting to the user via email / logs stating what has changed, what the current version is, etc.