We have a new project in which we are trying to make use of the built-in continuous integration (CI) in Kentico for tracking changes to templates, page types, transformations, etc.
We have managed to get this working locally between two instances of a Kentico database: making changes in one, syncing the changes through CI, and then restoring them to the second database using the ContinuousIntegration application that sits in the bin folder of the Kentico site.
The issue we are having comes when deploying our changes to our dev and live environments.
Our sites are hosted as Azure App Services, and we deploy to them using VSTS (Azure DevOps) build and release workflows. However, because these tasks run on an agent, any PowerShell script we try to run to trigger the CI application fails, since it is not running in the site/server context.
My question is: has anyone managed to successfully run Kentico CI in the context of an Azure App Service? Alternatively, how can I trigger a PowerShell script on the site following a deployment?
Yes, I've got this running in Azure DevOps within the release pipeline itself. It's something that we're currently rolling out as a business where I work.
The key steps to getting this working for me were as follows:
1. I don't want to deploy ContinuousIntegration.exe or the repository folders, so I create a second artefact set from source control (to my knowledge this is currently only possible with Azure Repos and GitHub).
2. Unzip your deployment package and copy the CMS folder to a working directory; this is where you're going to run CI. I did this because I need the built assemblies available.
3. From the repo artefact in step 1, copy ContinuousIntegration.exe and the CI repository folders into the correct place in your unzipped working folder.
4. Ensure that the connection string in your unzipped folder actually works for the target DB; if necessary, you may want to change your VS build options with regard to how the web.config is handled.
From here, you should be able to run CI in the new working folder against your target database.
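For illustration, here is a rough PowerShell sketch of that final step as a pipeline script task. The working-folder path is my own placeholder; only the -r restore switch comes from the CI tool itself:

# A sketch only: $workingDir stands in for wherever the CMS folder was copied.
$workingDir = "$env:SYSTEM_DEFAULTWORKINGDIRECTORY\ci-working\CMS"
Set-Location "$workingDir\bin"
# -r restores the CI repository into the database named by the connection
# string in the working folder's web.config (see step 4 above).
.\ContinuousIntegration.exe -r
if ($LASTEXITCODE -ne 0) { throw "Kentico CI restore failed" }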
In my instance, as I don't have CI running on the target site, everything is restored every time.
I'm in the process of writing this up in more detail, so I'll share it here when I've done that.
Edit:
- I finally wrote this up in more detail: https://www.ridgeway.com/blog/web-development/using-kentico-12-mvc-with-azure-devops
We do, but no CI. VSTS + Git. We store virtual objects in the file system and use Git for version control. We have our own custom library that does import/export of the Kentico objects (the ones that are not controlled by Git). Essentially, we have a JSON file, a "publishing manifest", where we specify which objects need to be exported (i.e. moved between environments).
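For illustration only, a sketch of what reading such a manifest might look like; the file name, the fields, and the export call are all hypothetical, since the library is our own:

# Hypothetical manifest format; the real custom library is not public,
# so every name here is illustrative.
$manifest = Get-Content ".\publishing-manifest.json" -Raw | ConvertFrom-Json
foreach ($item in $manifest.objects) {
    # Each entry names a Kentico object type and code name to move between environments.
    Write-Host "Exporting $($item.type) '$($item.codeName)'"
    # ...invoke the custom import/export library here...
}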
There is a task from Microsoft, "PowerShell on Target Machines"; I guess you can look into that.
P.S. Also take a look at "Three Ways to Manage Data in Kentico Using PowerShell".
Deploy your CI files to the Azure App Service, and then use an Azure WebJob to run ContinuousIntegration.exe.
If you place a file called KenticoCI.bat in the directory \App_Data\jobs\triggered\ContinuousIntegration, this will automatically create a triggered WebJob that you can run:
KenticoCI.bat
cd D:\home\site\wwwroot
rem # Renames 'App_Offline.bak' to 'App_Offline.htm' to take the site offline
ren App_Offline.bak App_Offline.htm
rem # Runs the Kentico CI integration (restore the CI repository)
cd D:\home\site\wwwroot\bin
ContinuousIntegration.exe -r
rem # Renames 'App_Offline.htm' back to bring the site back online
cd D:\home\site\wwwroot
ren App_Offline.htm App_Offline.bak
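Once deployed, you can also fire that triggered WebJob remotely, e.g. from a release step. A rough sketch via the Kudu WebJobs API (site name and publish-profile credentials are placeholders):

# A sketch only: POST to the Kudu API to run the triggered WebJob.
$token = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes('$my-site:publish-password'))
Invoke-RestMethod -Method Post `
  -Uri "https://my-site.scm.azurewebsites.net/api/triggeredwebjobs/ContinuousIntegration/run" `
  -Headers @{ Authorization = "Basic $token" }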
I wrote two Azure pipelines. One pipeline builds my app as a tool and publishes it to an artifact feed.
My second pipeline runs that app from the artifact feed. The problem is that my app does not see appsettings.json. When I printed the current directory, I saw that it is the repo root folder. How can I solve this?
What I have tried is to set workingDirectory in the task that runs the tool, for example:
$(Build.BinariesDirectory)
$(Agent.BuildDirectory)
etc..
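Roughly, the step I am running looks like this sketch (hedged; the tool name is a placeholder):

# Print what the tool will see, then run it from an explicit location.
Write-Host "Current directory: $(Get-Location)"   # on Azure this shows the repo root
Push-Location "$env:BUILD_BINARIESDIRECTORY"      # one of the locations I tried
.\my-tool.exe                                     # the tool looks for appsettings.json here
Pop-Location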
When I run the tool locally and print the current directory's files, I get all the DLLs included; but on Azure I get the repo root.
Is my approach good or completely wrong?
We have a site design that makes use of modules that are developed separately from the master site. Through reflection, we pick up the modules when the main app starts.
This works fine in local development and on a normal web server. But in the Azure environment, when we try to use FTP to deploy the modules to our Azure-hosted site, we are unable to, because the main Azure deployment is read-only (it is running from a package).
Is it possible to not have the main site running from a package? Is it acceptable to run it that way?
Is there another way to deploy DLLs to the Azure-hosted site without having them be part of the main site's build and deploy? Ultimately, we are trying to avoid rebuilding the main site every time we want to add a module.
"...our Azure-hosted site we are unable to because the main Azure deployment is read-only (because it is running from a package)."
You could set WEBSITE_RUN_FROM_PACKAGE=0 in your app settings to make it not read-only; WEBSITE_RUN_FROM_PACKAGE=1 is read-only.
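For illustration, a rough sketch of flipping that setting with the Az PowerShell module (resource group and app names are placeholders; Set-AzWebApp replaces the whole app-settings collection, so the existing values are merged in first):

# A sketch only: read the current settings, change one value, write them back.
$app = Get-AzWebApp -ResourceGroupName "my-rg" -Name "my-app"
$settings = @{}
foreach ($s in $app.SiteConfig.AppSettings) { $settings[$s.Name] = $s.Value }
$settings["WEBSITE_RUN_FROM_PACKAGE"] = "0"   # 0 = writable, 1 = read-only (run from package)
Set-AzWebApp -ResourceGroupName "my-rg" -Name "my-app" -AppSettings $settings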
"Is it possible to not have the main site running from a package? Is it acceptable to run it that way?"
You could consider switching your deployment method to Zip Deploy so that your Azure-hosted site is not read-only.
Refer to this doc.
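For illustration, a rough sketch of a Zip Deploy from PowerShell (the cmdlet is from the Az module; resource names and the archive path are placeholders):

# A sketch only: Zip Deploy extracts the package on the server, so wwwroot
# stays writable (unlike run-from-package, which mounts the zip read-only).
Publish-AzWebApp -ResourceGroupName "my-rg" -Name "my-app" -ArchivePath ".\site.zip" -Force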
What can help in this situation is to build and publish the modules using Azure Artifacts.
There are several approaches; please check the best-practices page here:
https://learn.microsoft.com/en-us/azure/devops/artifacts/concepts/best-practices?view=azure-devops
Depending on your approach, the build, the release, and also local development can use these published modules.
Example
You can, for example, use a private NuGet feed:
Publish the modules from your modules pipeline:
https://learn.microsoft.com/en-us/azure/devops/pipelines/artifacts/nuget?toc=%2Fazure%2Fdevops%2Fartifacts%2Ftoc.json&view=azure-devops&tabs=yaml
Consume these from visual studio:
https://learn.microsoft.com/en-us/azure/devops/artifacts/nuget/consume?view=azure-devops&tabs=windows
And consume them from the website pipeline; this can be a build, but also a release if you want to side-load them:
https://learn.microsoft.com/en-us/azure/devops/pipelines/packages/nuget-restore?source=recommendations&view=azure-devops
To keep a record of which modules are used in the website, I advise building or releasing the website whenever the modules change.
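For illustration, a rough sketch of the publish step as it might appear in a pipeline script (organisation, feed, and package names are placeholders; authentication is assumed to come from the Azure Artifacts credential provider):

# A sketch only: push a built module package to the private feed.
dotnet nuget push "bin/Release/MyModule.1.0.0.nupkg" `
  --source "https://pkgs.dev.azure.com/my-org/_packaging/my-feed/nuget/v3/index.json" `
  --api-key az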
I have a MEAN application that will be deployed on Azure with continuous delivery.
We have defined in VSTS a build script that compiles the TypeScript to JS files, cleans up unneeded files, runs a gulp task that zips the files, and finally copies the zip files to the drop folder.
Then, at the release level, we will have a script for the IT environment and the staging environment.
We have an Azure Web App Deployment task; it takes the zip files and decompresses them correctly.
But if I leave the node_modules folder in, it takes more than 2 hours...
I am looking for a solution to launch an 'npm install' after this task.
I know it's possible to go to Kudu and launch the npm install.
I've tested a solution with a deploy.sh (seen on the net), but the task is not launched automatically as far as I can see.
I've tried to launch a PowerShell script, but it failed...
I have the feeling that this is not possible without manual intervention via Kudu or something similar.
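For reference, the manual Kudu step I would like to automate could look something like this sketch (site name and publish-profile credentials are placeholders; I have not verified it end to end):

# A sketch only: run 'npm install' on the site through the Kudu command API.
$token = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes('$my-site:publish-password'))
$body  = @{ command = "npm install --production"; dir = "site\wwwroot" } | ConvertTo-Json
Invoke-RestMethod -Method Post -Uri "https://my-site.scm.azurewebsites.net/api/command" `
  -Headers @{ Authorization = "Basic $token" } -ContentType "application/json" -Body $body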
There is a way to do this: in the Azure portal, go to your Web App and select "Deployment Options". There you will see ways to load your app automatically from GitHub, Visual Studio Team Services, OneDrive, and more.
It's all automatic, from the copying of the code to the npm install. It works great for us across development, test, and production environments using VSTS, and we have a Node/Express/Angular app.
Here are the docs for setting up automatic deployment:
https://learn.microsoft.com/en-us/azure/app-service-web/web-sites-deploy
I have a solution containing a .NET MVC website and a WebJob.
I deploy using Git, so on git push to Azure, my website is upgraded. I'm now adding a console application that will be run on a schedule, and I'm trying to work out how to deploy it with the website when I git push, but I'm not sure how to do this.
I know I could create a folder website\app_data\jobs\triggered\webjob and copy the files into there (say, from a post-build event on the WebJob), but that would mean committing all those files to the Git repo for the deploy to pick them up, which would also mean that every time I build, Git would prompt me to commit them again. Ugh.
Is there a nicer way to do this, where I can just push my repo to Azure and it will deploy my website correctly AND my WebJob?
Thanks
Yes, you can do this without having to put the actual EXEs and project output into the folder explicitly. This blog post from the Azure blog documents the workaround to enable Git or command-line deployment of a web application inclusive of WebJobs.
http://azure.microsoft.com/blog/2014/08/18/enabling-command-line-or-continuous-delivery-of-azure-webjobs/
If this doesn't unblock you, please post an update and I'll help diagnose any other issues you run into. You may also want to update the WebJob publishing NuGet package to the most recent one on NuGet here: https://www.nuget.org/packages/Microsoft.Web.WebJobs.Publish/1.0.2
As of 9/15/2015, this appears to be as simple as some context menus inside Visual Studio.
If you want your WebJob to automatically deploy whenever your Website is deployed, in Visual Studio you can right-click on the Website and select "Add->Existing Project as Azure WebJob".
More details here, in particular the "Enable automatic WebJobs deployment with a web project" section.
I was struggling with this, but I've got it working now.
It appears that WebJobs.Publish 1.0.2 must be used; 1.0.1 was not working for me, and it worked as soon as I updated.
I had also tried adding the webjobs.props files as indicated here by David Ebbo, but that didn't work for 1.0.1; I've now removed those files and it's working under 1.0.2 without them.
Using WebJobs.Publish creates a webjob-publish-settings.json (in the WebJob project) and a webjobs-list.json (in the MVC app), and that would seem to be all that is needed.
The only thing that does not work is creating the schedule for a scheduled job; continuous and triggered jobs deploy just fine. There's a thread here where David Ebbo mentions that this is a current limitation.
I am trying to create an automated build to publish a folder of files onto an Azure website, and I cannot accomplish this.
I am NOT publishing a solution (.sln), but rather a folder with files. I am using VS2013 and Visual Studio Online.
I have experience with TFS web publishing, so I published solutions many times.
So, here is what I have done so far:
1. Created an MSBuild build.xml file that just copies files from the folder to the output.
2. Created a build definition based on AzureContinuousDeployment.11.xaml.
3. Specified build.xml in my build definition (Process tab, in the "Solution to build" parameter).
If I build my project, it is correctly built, files are copied to the output, etc. (I can verify this by opening the drop location; all the files are there).
Then, I:
4. Created a website in Azure and linked it to my TFS subscription.
5. Downloaded a publish profile (.PublishSettings) from the website.
6. Created a web publish profile (.pubxml) in Visual Studio, based on the .PublishSettings file.
7. Specified the Web Deploy Publish Profile and Deployment Settings Name.
But now I am getting an error during build:
Exception Message: Please specify a Visual Studio Solution (.sln) to build. (type BuildFromSolutionException)
So it asks me for a Visual Studio solution, but earlier (after step 3) it worked perfectly with the MSBuild file.
I tried to rename my .xml to .sln (probably not what I should have done), and the build now says: "There was no Windows Azure project (.ccproj) detected in the solution. Continuous delivery to an Azure Cloud Service requires an Azure project. (type CCProjNotFoundException)"
If I don't specify a "Deployment Settings Name", the build completes without errors, but again there is no publishing to Azure.
So, the question is: how do I publish a custom MSBuild build, without a solution, onto Azure? Is TFS continuous Azure publishing for solutions only? I expected it to be as simple as publishing folders from local Git to the website, which worked without any hassle.
What should I do?
There are a few confused ideas in your question. First, there is no relationship between an automated build and Git. You are using Team Foundation Build to run the deployment workflow, and it is that workflow (in effect, the build and deployment script) that is not working for you. In fact, the script you are using works with both Git and TFVC, so that is not the issue.
That specific script is designed for building an Azure project that is then continuously delivered to Azure, and you likely can't use it as you are. You can, however, create another script and use that. I would suggest you instead try the Default build script and use a PowerShell script within the build to collect the files and then push them to Azure, as sketched below.
If you want to go a little more advanced, you could create a copy of the default script and make one that does not require MSBuild at all.
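As a rough illustration of that suggestion (site name and publish-profile credentials are placeholders, and Kudu's zipdeploy endpoint is a modern stand-in for what was available at the time):

# A sketch only: collect the folder into a zip, then push it with zip deploy.
Compress-Archive -Path ".\SiteFiles\*" -DestinationPath ".\site.zip" -Force
$token = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes('$my-site:publish-password'))
Invoke-RestMethod -Method Post -Uri "https://my-site.scm.azurewebsites.net/api/zipdeploy" `
  -Headers @{ Authorization = "Basic $token" } -InFile ".\site.zip" -ContentType "application/zip"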