I have a pipeline that deploys code to an IIS folder on an on-premises server. I'm trying to figure out the best way to delete old release folders, and I'm not seeing anything obvious within DevOps.
Is there a native way of doing this? Or should I roll my own PowerShell script to delete old releases?
Is there a native way of doing this?
Yes, of course there is.
Open your deploy task, go to Advanced Deployment Options, and enable the option Remove Additional Files at Destination.
Note: this "delete" operation does not clear all the files in the local IIS folder. It only deletes files at the destination that have no corresponding file in the package being deployed.
In short, files that also exist in the new package are overwritten with the latest versions, and any leftover files from a previous deployment that are no longer required are deleted.
If you do not trust this option and want to clear the previous files completely, you can also add a PowerShell task before the IIS management task and run a delete script.
Here is the sample script to delete local files:
# delete everything under the site folder, including subdirectories
Remove-Item -Path "D:\Websites2\*" -Recurse -Force
You can replace "D:\Websites2\*" with your local website file path.
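If what you actually want is to prune old release folders while keeping the most recent ones, here is a minimal PowerShell sketch; the releases path and the number of folders to keep are assumptions, so adjust them to your layout:

$releasesRoot = "D:\Websites2\releases"   # placeholder: your releases root
$keep = 5                                  # placeholder: how many to keep

# sort release folders newest-first, skip the ones to keep, delete the rest
Get-ChildItem -Path $releasesRoot -Directory |
    Sort-Object CreationTime -Descending |
    Select-Object -Skip $keep |
    Remove-Item -Recurse -Force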
The current setup is that we're using gulp to build our VS solution using MSBuild and an Azure DevOps release pipeline to deploy our build artifacts via the Kudu Zip Deploy API (via PowerShell) to our Azure App Service.
Kudu appears to copy files that are unchanged, which causes unnecessary slowness because the copy makes the target server restart.
The contents of these files (including several binaries) have not changed; what has probably changed is the timestamp, due to the way we're generating/regenerating some of these artifact files.
I have tried to see if Kudu can be configured to ignore timestamps, but there doesn't seem to be an option for it, and it might also not be a good solution. According to the Kudu zip deploy docs:
Efficient file copy: Files will only be copied if their timestamps don't match what is already deployed. Generating a zip using a build process that caches outputs can result in faster deployments.
Other possibilities include a misconfiguration in the solution/file settings, or an issue with the way we're building via gulp. Any ideas on how I can prevent these unchanged files from being copied?
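One workaround, not from the Kudu docs but a sketch that follows from the timestamp rule quoted above, is to carry the previous build's timestamps forward for byte-identical files before zipping, so Kudu's comparison sees them as unchanged. It assumes you can keep the prior build output around (e.g. restored from a pipeline cache); the .\output and .\previous paths are placeholders:

# .\output holds the new build artifacts, .\previous the prior build's
$outputRoot = (Resolve-Path .\output).Path
Get-ChildItem -Path $outputRoot -Recurse -File | ForEach-Object {
    $relative = $_.FullName.Substring($outputRoot.Length)
    $old = Join-Path .\previous $relative
    # if the bytes are identical, reuse the old timestamp so Kudu's
    # timestamp comparison treats the file as already deployed
    if ((Test-Path $old) -and
        ((Get-FileHash $_.FullName).Hash -eq (Get-FileHash $old).Hash)) {
        $_.LastWriteTimeUtc = (Get-Item $old).LastWriteTimeUtc
    }
}
# then zip .\output as usual; the archive records the adjusted timestamps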
We have a new project in which we are trying to make use of the built-in continuous integration (CI) in Kentico for tracking changes to templates, page types, transformations, etc.
We have managed to get this working locally between two instances of a Kentico database: making changes in one, syncing the changes through CI, and then restoring them to the second database using the Continuous Integration application that sits in the bin folder of the Kentico site.
The issue we are having is when it comes to deploying our changes to our dev and live environments.
Our sites are hosted as Azure App Services and we deploy to them using VSTS (Azure DevOps) build and release workflows. However, as these tasks run on an agent, any PowerShell script we try to run to trigger the CI application fails because it is not running in the site/server context.
My question is, has anyone managed to successfully run Kentico CI in the context of an Azure App Service? Alternatively, how can I trigger a PowerShell script on the site following a deployment?
Yes, I've got this running in Azure DevOps within the release pipeline itself. It's something that we're currently rolling out as a business where I work.
The key steps to getting this working for me were as follows:
1. I don't want to deploy the ContinuousIntegration.exe or the repository folders, so I need to create a second artefact set from source control (this is only possible at the moment with Azure Repos and GitHub, to my knowledge).
2. Unzip your deployment package and copy the CMS folder to a working directory; this is where you're going to run CI. I did this because I need the built assemblies available.
3. From the repo artefact in step 1, copy the ContinuousIntegration.exe and CI repository folders into the correct place in your unzipped working folder.
4. Ensure that the connection string actually works for the DB in your unzipped folder; if necessary, you may want to change your VS build options with regard to how the web.config is handled.
From here, you should be able to run CI in the new working folder against your target database.
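For illustration, the PowerShell step I run at that point looks roughly like this. It's a sketch: the ci-work folder name is an assumption, $(System.DefaultWorkingDirectory) is expanded by the pipeline before the script runs, and error handling is minimal:

# assumed layout: the unzipped package plus the copied-in CI files live
# under this working folder, as described in the steps above
$workDir = "$(System.DefaultWorkingDirectory)\ci-work"

# run the restore from the bin folder so the exe can load the built
# assemblies and read the site's connection string
Set-Location (Join-Path $workDir 'CMS\bin')
& .\ContinuousIntegration.exe -r   # -r restores the CI repository to the DB
if ($LASTEXITCODE -ne 0) { throw "Kentico CI restore failed" }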
In my instance, as I don't have CI running on the target site, everything is restored every time.
I'm in the process of writing this up in more detail, so I'll share it here when I've done that.
Edit:
- I finally wrote this up in more detail: https://www.ridgeway.com/blog/web-development/using-kentico-12-mvc-with-azure-devops
We do, but without CI: VSTS + Git. We store virtual objects in the file system and use Git for version control. We have our own custom library that does import/export of the Kentico objects (the ones not controlled by Git). Essentially, we have a JSON file, a "publishing manifest", where we specify which objects need to be exported (i.e. moved between environments).
There is a task from Microsoft, 'PowerShell on Target Machines'; I guess you can look into that.
P.S. Take a look also at Three Ways to Manage Data in Kentico Using PowerShell
Deploy your CI files to the Azure App Service, and then use an Azure WebJob to run ContinuousIntegration.exe.
If you place a file called KenticoCI.bat in the directory \App_Data\jobs\triggered\ContinuousIntegration, this will automatically create a WebJob that you can trigger:
KenticoCI.bat
cd D:\home\site\wwwroot
rem # Renames 'App_Offline.bak' to 'App_Offline.htm' to take the site offline during the restore
ren App_Offline.bak App_Offline.htm
rem # run the Kentico CI integration
cd D:\home\site\wwwroot\bin
ContinuousIntegration.exe -r
rem # Removes the 'App_Offline.htm' file to bring the site back online
cd D:\home\site\wwwroot
ren App_Offline.htm App_Offline.bak
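To kick the job off from the release pipeline rather than triggering it manually, you can POST to Kudu's triggered-WebJobs API. A sketch, assuming $user/$pass hold the app's deployment credentials and <your-app> is your App Service name:

# run the triggered WebJob on demand via the Kudu API
$auth = [Convert]::ToBase64String(
    [Text.Encoding]::ASCII.GetBytes("$($user):$($pass)"))
Invoke-RestMethod -Method Post `
    -Uri "https://<your-app>.scm.azurewebsites.net/api/triggeredwebjobs/ContinuousIntegration/run" `
    -Headers @{ Authorization = "Basic $auth" }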
I'm deploying an Azure Function app from a package (using this guide: https://learn.microsoft.com/en-us/azure/azure-functions/run-functions-from-deployment-package) and the initial deployment works. Yet I can't seem to get an update deployed. Even after I upload a new package, the changes are not picked up by the Azure Function app. I tried stopping/starting the app to no avail.
How can I force it to pick up changes?
I ran into this issue with a newly deployed function on Functions v3, so this still seems to be an issue.
Short answer:
Remove the .zip in D:\home\data\SitePackages and then redeploy and your changes will get picked up.
Long answer:
My setup is using ZIP deployment and WEBSITE_RUN_FROM_PACKAGE = 1.
It helps to know what happens when you use ZIP deployment:
The zip file is not written to wwwroot of your site, but instead to D:\home\data\SitePackages (source: https://learn.microsoft.com/en-us/azure/azure-functions/run-functions-from-deployment-package)
Then the contents of the zip file are mounted to D:\home\site\wwwroot and run from there.
For some reason, the .zip in D:\home\data\SitePackages was not being replaced, it was still the old version. To fix that, I used the Console of the App Service to delete the file before redeploying.
Navigate to your App Service function and open the Console (under Development tools section)
Run cd D:\home\data\SitePackages and then ls to see the files in the folder.
Run a rm command to remove both the zip and text file.
Redeploy your function, and you will see that changes are picked up.
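Put together, the console session looks roughly like this (the Kudu console's Unix-style aliases work in its PowerShell mode; the .txt wildcard stands in for whatever pointer file is in your folder):

cd D:\home\data\SitePackages
ls            # list the current package files
rm *.zip      # remove the stale package
rm *.txt      # remove the pointer file that names the active package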
We had a very weird problem using Azure Function App Deploy from Azure DevOps.
This worked perfectly until one day we made some code changes to the Azure Function that worked locally but not on our dev server. We started looking at the .zip file and the release pipeline, but everything looked good there. We could also see that nothing had changed in our azure-pipelines.yml or release pipeline:
Git command: git log -p azure-pipelines.yml
For releases we used the Azure Functions task by Microsoft Corporation.
Looking at the release logs, everything seemed good as well.
We then logged in to Kudu (Advanced Tools) and used PowerShell to look at the deployed files.
https://<your-function>.scm.azurewebsites.net/DebugConsole/?shell=powershell
Running the command dir D:\home\site\wwwroot, we could see that the files had not been updated, and when we looked at dir D:\home\data\SitePackages we could not see a new .zip file either.
We confirmed the wrong .zip was active by running the command Get-Content D:\home\data\SitePackages\packagename.txt to see which .zip was being used.
We then went back to Azure DevOps and tried to create a new release, but the files still did not update. Then I tried cloning the Azure Function App Deploy step that had previously worked and disabled the original one. I tried a new release and now everything worked.
I think this must be a Microsoft bug since we did not really change any values at all. Hope this can help someone else and that Microsoft fixes this.
If you replace the old package with a new one under the same package name (to leverage the same SAS URL), make sure the old one is actually overwritten. You also have to click the refresh button next to the Function app to sync triggers along with the changes.
Update
I recommend using the publish command (func azure functionapp publish <functionAppName>) provided by Azure Functions Core Tools (CLI). The v2 CLI benefits from Run From Package as well and automates the whole process for us (zip the folder, upload, create the app setting, sync triggers).
The command gets the publishing info (username and password for deployment) first, then:
1. Archives the function project.
2. Uploads the zip file (named in the format UTCTime-GUID.zip) to the function-releases container in the Storage account specified by the AzureWebJobsStorage app setting.
3. Creates an app setting WEBSITE_RUN_FROM_ZIP (the original name of WEBSITE_RUN_FROM_PACKAGE; both work) with the SAS URL.
4. Syncs triggers to pick up the changes.
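A quick usage sketch (MyFunctionApp and my-rg are placeholders; func requires Core Tools, and the check requires an Azure CLI login):

# publish: zips the project, uploads it, sets the run-from-package
# app setting, and syncs triggers
func azure functionapp publish MyFunctionApp

# verify which package setting the command put in place
az functionapp config appsettings list --name MyFunctionApp --resource-group my-rg `
    --query "[?name=='WEBSITE_RUN_FROM_ZIP' || name=='WEBSITE_RUN_FROM_PACKAGE']"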
After all my efforts, the only thing I really determined was that you need to restart, and then wait. About 15-20 minutes after the restart, the changes were automagically there.
If you deploy a new JAR/ZIP to the function app, the changes should be reflected within 1-2 minutes.
* Make sure that you are using the correct value for WEBSITE_RUN_FROM_PACKAGE.
* Download the content from the console and check whether it's the same or not.
I had the same problem after changing the deployment to use WEBSITE_RUN_FROM_PACKAGE. I think the problem is that it was using a cached deployment tool, so it ran the previous deployment script rather than a new one when I changed how I wanted it to deploy.
If you look at the deploy log in Azure DevOps, there is a link to the deploy log in the Kudu interface.
I found this answer which explains how to fix it.
I have an App Service in Azure and I have connected it with my source control on GitLab. Everything works fine except one thing: when I deploy from Visual Studio, I can tell it that App_Data should not be replaced, and that works. However, deploying from GitLab (I used this tutorial: https://christianliebel.com/2016/05/auto-deploying-to-azure-app-services-from-gitlab/) just replaces all the files with what I have in source control, effectively removing customer data from App_Data.
I presume that this is just a simple FTP replace (as I have to run my migrations on App_Start). Is there a way to avoid replacing the App_Data folder on the App Service when deploying from GitLab? Keeping App_Data synchronized with source control is not acceptable.
Thank you
I resolved the issue by downloading deployment.zip from the App Service's Kudu.
Then I edited the downloaded deployment.cmd so that KuduSync ignores App_Data.
Then I placed the modified deployment.cmd and the .deployment file in the root of my git repository.
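For reference, the relevant edit is the KuduSync call inside deploy.cmd. This is a sketch based on the standard Kudu-generated script (the exact variables in your generated file may differ), with App_Data appended to the -i ignore list; the .deployment file simply points Kudu at the custom script (command = deploy.cmd under a [config] section):

rem add App_Data to KuduSync's -i list so it is neither copied nor deleted
call :ExecuteCmd "%KUDU_SYNC_CMD%" -v 50 ^
  -f "%DEPLOYMENT_TEMP%" -t "%DEPLOYMENT_TARGET%" ^
  -n "%NEXT_MANIFEST_PATH%" -p "%PREVIOUS_MANIFEST_PATH%" ^
  -i ".git;.hg;.deployment;deploy.cmd;App_Data"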
In the Azure release definition I publish the build artifact to the UAT WebApp using Azure web deploy. However, this deletes any previously user-uploaded files (e.g. images).
How can I release to UAT and preserve the user uploaded files?
Do I somehow need to perform the equivalent of extracting the .zip file over the existing files rather than replacing the entire website directory with the contents of the .zip?
You can add the following MSBuild property to your build
/p:SkipExtraFilesOnServer=true
OR add the MSDeploy provider flag:
-enableRule:DoNotDelete
https://dotnetcatch.com/2016/02/01/webdeploymsdeploy-quick-tip-keep-existing-files-during-deployment/
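For context, here is a sketch of where the MSDeploy flag goes when invoking msdeploy.exe directly; the server, site name, and credentials are all placeholders:

rem sync the package without deleting files that exist only on the server
msdeploy.exe -verb:sync ^
  -source:package="MySite.zip" ^
  -dest:auto,computerName="https://uat-server:8172/msdeploy.axd?site=MySite",userName="deployUser",password="***",authType="Basic" ^
  -enableRule:DoNotDelete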
Did you try using the -skip:Directory option to exclude the directory where the images are uploaded? See: https://technet.microsoft.com/en-us/library/dd569089(WS.10).aspx