Kentico MVC site deployment

We are using Kentico 13 and developing a new site with ASP.NET Core. For deployment we use Azure DevOps pipelines. Every time we deploy the new site, all data in App_Data is wiped, including the smart search indexes.
So after each deployment we have to trigger a rebuild of the indexes. I was thinking we could add a script that copies the existing App_Data folder to a temp directory before deployment and copies it back after the deployment finishes.
Is this a good approach, or is there another way to solve this problem in your experience?

We use Azure DevOps pipelines but the App_Data folder is persisted between deployments.
Which DevOps yml task are you using?
AzureRmWebAppDeployment@4 has settings to ensure this folder is not removed, specifically RemoveAdditionalFilesFlag and ExcludeFilesFromAppDataFlag.
If you don't want to use those, Zach's method works.
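For reference, here is a minimal sketch of that copy-out/copy-back idea, assuming the site runs on an Azure App Service and the pipeline has the Kudu deployment credentials from the publish profile (site name and credentials below are placeholders):

# Placeholder site name and Kudu deployment credentials.
$site = "my-kentico-site"
$user = '$my-kentico-site'
$pass = "publish-profile-password"
$auth = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes("$($user):$pass"))
$headers = @{ Authorization = "Basic $auth" }
$appData = "https://$site.scm.azurewebsites.net/api/zip/site/wwwroot/App_Data/"
$backup = Join-Path $env:AGENT_TEMPDIRECTORY "app_data.zip"

# Before the deploy task: download App_Data as a zip.
Invoke-RestMethod -Uri $appData -Headers $headers -Method Get -OutFile $backup

# ... the deployment task runs between these two steps ...

# After the deploy task: push the zip back. The Kudu zip API only
# overwrites existing files; it does not delete anything.
Invoke-RestMethod -Uri $appData -Headers $headers -Method Put -InFile $backup -ContentType "application/zip"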
You can also use an ASP.NET Core startup filter, which the Dancing Goat sample site uses to rebuild the index on startup.
services.AddSingleton<IStartupFilter>(new SmartSearchIndexRebuildStartupFilter());

public class SmartSearchIndexRebuildStartupFilter : IStartupFilter
{
    public Action<IApplicationBuilder> Configure(Action<IApplicationBuilder> next)
    {
        return builder =>
        {
            // Ensures smart search index rebuild upon installation or deployment
            builder.UseMiddleware<SmartSearchIndexRebuildMiddleware>();
            next(builder);
        };
    }
}

In previous versions I have used Kentico's global events to trigger a smart search rebuild on app start. So after you deploy your code, the smart search rebuilds as soon as the app starts. If you use a web farm, though, you could potentially have this rebuilding more often than you want.

Related

How to configure a webjob to access the web app's referenced project files in Azure

I have a .NET Core web app that I deploy to Azure using Azure DevOps build/release pipelines. This project references a Business project and a Models project that are part of the three-tier solution. The Models project consists of Entity Framework 6 code-first models (including migrations).
Recently I have had to deploy a triggered webjob in order to accomplish a long-running task. This was just created as a normal .NET console app and then published from within Visual Studio 2017 by selecting "Publish as Azure Web Job". This webjob is published to and runs under the .NET Core web app service mentioned above. It references the same Models and Business projects that the .NET Core web app references.
My issue is that whenever the model is changed by introducing DB migrations, the webjob must also be updated, since the models.dll that is published as part of the webjob project resides separately in a directory app_data\jobs\triggered\webjob under the main web app.
Is there any way to configure my webjob so that the models.dll and business.dll are directly referenced from that of the main web app? Failing that, how can I modify the Azure devops process to copy these files to the directory of the webjob upon successful deploy? Is there a guide for this?
how can I modify the Azure devops process to copy these files to the directory of the webjob upon successful deploy? Is there a guide for this?
AFAIK, you could use a PowerShell script and git commands to check whether the modified files include models.dll or business.dll, like:
$editedFiles = git diff HEAD HEAD~ --name-only
$editedFiles | ForEach-Object {
    Switch -Wildcard ($_) {
        'SubFolderA/models.dll*' { Write-Output "##vso[task.setvariable variable=UpdateFile]True" }
        # The rest of your path filters
    }
}
Code comes from here.
Then add a custom condition to the next task in the build pipeline:
and(succeeded(), eq(variables['UpdateFile'], 'True'))
In the next task, you can use the Kudu API to copy these files into the webjob's directory upon successful deploy:
GET /api/zip/{path}/
Zip up and download the specified folder. The zip doesn't include the top folder itself. Make sure you include the trailing slash!
PUT /api/zip/{path}/
Upload a zip file which gets expanded into the specified folder. Existing files are not deleted unless they need to be overwritten by files in the zip. The path can be nested (e.g. `folder1/folder2`), but needs to exist.
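As an illustration only (site name, credentials, and paths are placeholders), pushing the rebuilt DLLs into the webjob folder with the zip API from a PowerShell task could look like this:

# Placeholder Kudu credentials and the webjob folder under the main web app.
$auth = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes('$my-web-app:publish-password'))
$uri = "https://my-web-app.scm.azurewebsites.net/api/zip/site/wwwroot/app_data/jobs/triggered/webjob/"

# Zip only the assemblies that changed and expand them into the webjob folder.
Compress-Archive -Path "models.dll", "business.dll" -DestinationPath "update.zip" -Force
Invoke-RestMethod -Uri $uri -Method Put -InFile "update.zip" -ContentType "application/zip" -Headers @{ Authorization = "Basic $auth" }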
You could check the similar thread and the document for some details.
Hope this helps.

Can I use VSTS Release Manager to deploy a single web job under my app service without deploying the entire app service?

I have an Azure App Service which contains 5 web jobs. I have VSTS Release Manager set up to deploy the entire app service, which successfully updates my web jobs as well.
However, I want to deploy only a single web job without deploying the entire app service. I have the build set up successfully for the web job. But I am struggling with the configuration of the release pipeline. I've tried two methods:
1. Copy Files
Using this method, I am using $(build.artifactstagingdirectory)/app_data/jobs/triggered/[my-web-job-name] as the target folder. But when I go to find the files in the production environment (using Kudu console), they are not there. Since this completes "successfully", I think the target folder might be set up incorrectly. What target can I specify to get the files in the production environment? (as a side-note, do I need to do something special to have it deploy the contents of the drop.zip file?)
2. Azure App Service Deploy
This method seems to target the entire app service, and not a single web job. I have not tried running it using this method, as I am concerned it might wipe out my entire app service and replace it with my single web job. My thought is that there may be a way to set up a "sub-folder" of the app service to deploy into. But I'm not seeing any setting like that in any of the options. Is there a way to set up the "Azure App Service Deploy" to deploy to a single web job folder?
Or, is there an entirely different way to deploy a single web job?
I think you need to make sure the structure of your artifact matches the subfolders exactly. See here: http://www.bravegeek.com/2016/12/03/Deploy-WebJobs-from-Team-Services/
Relevant part:
In your Build definition, add a Copy Files step after the build step.
Set these properties
Source Folder: src/WebJobTest/bin/$(BuildConfiguration)/
Contents: **
Target Folder: $(build.artifactstagingdirectory)\WebJobTest\App_Data\jobs\continuous\WebJobTest
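If it helps to see that staging step in script form, here is a rough PowerShell equivalent of the Copy Files step above (the project name and Release configuration are placeholders to adjust):

# Mirror the App Service layout inside the artifact so the deploy task
# drops the webjob into App_Data\jobs\continuous\<name> on the server.
$target = "$env:BUILD_ARTIFACTSTAGINGDIRECTORY\WebJobTest\App_Data\jobs\continuous\WebJobTest"
New-Item -ItemType Directory -Path $target -Force | Out-Null
Copy-Item -Path "src\WebJobTest\bin\Release\*" -Destination $target -Recurse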

How to update a folder in an Azure AppService upon Checkin into a VSO TFS repository

I have a website hosted as an Azure web app. It's an ASP.NET website that's in a VS solution. One folder of that website holds my product's documentation, all of it static resources (HTML and images). These static resources are located in a folder in another VS solution (the actual product's solution). Both solutions are TFS-based in VSO.
As of now, I have a webjob running in the context of the website that basically does a "tfs get" on the documentation folder and places its contents into the documentation folder of the website. This is working; however, the VMs the website runs on change quite frequently, and the mechanism to create a workspace is bound to the machine, not to the disk drive. Thus, I cannot get only the changes but must always get all the content, which right now takes about 20 minutes and creates unnecessary load on the website. (This is why I'm only running this webjob once a week.)
Now I'm looking for a better way to do this. I would like to get only the files that have changed, making this a lot faster and less CPU/drive costly.
I did not find a way to create a workspace on the webserver that doesn't vanish each time the webserver's VM changes. (If it were possible to somehow attach the workspace to the drive instead of to the machine name, that would solve the problem.)
I was also looking at the continuous build definition that I have running for the product's solution. As part of that, I think it's possible to create a deployment where the documentation folder is copied to the app service's documentation folder. This way I could get rid of the "special" webjob, but I'd still copy all the docs files each time. (Also, the build agent for that runs on premises, so I'd have to copy those files from the premises up to the cloud when they're actually already there inside VSO.) So basically, I don't think this option is of much use for my case.
Obviously, if I moved the static docs resources from the product's solution to the website's solution, I could simply use the automatic deployment that is available for website projects from VSO to an Azure web app. Unfortunately, for various other reasons (one of which being that the static resources are partially generated automatically from the .cs sources in the product's solution), I simply cannot move the docs folder from the product's solution to the website's solution.
So does anybody have a suggestion for a method by which I could update the documentation folder in the web app based on changes in the corresponding VSO folder?
You can upload the updated files to the Azure app service by using the Kudu API.
Simple steps:
1. Create a continuous integration build
2. Check the Allow Scripts to Access OAuth Token option in the Options tab
3. Add a PowerShell step/task to check for changes with the REST API (refer to Calling VSTS APIs with PowerShell; a sketch follows below)
4. Add an Azure PowerShell step/task to upload files to the app service by using the Kudu API (refer to Remove files and folders on Azure before a new deploy from VSTS)
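As a rough sketch of step 3, a PowerShell task can call the TFVC REST API with the build's OAuth token to find what changed; the account name, item path, and API version below are assumptions to adapt:

# Query the latest changeset that touched the docs folder, using the
# build's OAuth token (requires "Allow Scripts to Access OAuth Token").
# Account and item path are placeholders.
$headers = @{ Authorization = "Bearer $env:SYSTEM_ACCESSTOKEN" }
$uri = "https://myaccount.visualstudio.com/DefaultCollection/_apis/tfvc/changesets" +
       "?searchCriteria.itemPath=$/MyProject/Docs&`$top=1&api-version=1.0"
$latest = Invoke-RestMethod -Uri $uri -Headers $headers
Write-Output "Latest docs changeset: $($latest.value[0].changesetId)"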
Here is what I ended up doing:
Created a service hook in VSO that is wired to "Web Hooks". The hook is called upon each check-in and filtered based on the directory I want. (All of this can be done using the existing functionality in VSO.)
The hook calls an Azure function app (which is easy to do, because function apps have an "HttpTrigger" mechanism that fits in nicely here).
The hook passes the ID of the check-in's changeset to the function app.
The function app puts that ID into an Azure (Storage) queue.
This triggers an Azure webjob which listens on that queue. That webjob uses the changeset ID to get the changes from VSO and acts on the change type for each change (e.g. downloads or deletes a file).

How to deploy a static website to Azure from Visual Studio Team Services

I have an existing website that I would like to deploy on Azure, using Visual Studio Team Services. The website is made up of static files, there's no ASP.NET or anything else involved.
Within Visual Studio Team Services, I created a build which executes npm install and a gulp build. This results in a dist folder containing all the files for the website. In Azure, everything is set up correctly (subscription, web app,...).
However, I'm unsure on how to push my code to Azure. Exploring the options in the Release tab in VSTS, an 'artifact' always seems to be required, but I just have a bunch of files. I need to publish the files in the dist folder and make sure index.html is served.
How can I do that?
This question is related to this one, however, the answers all state to start from Azure, and do not mention how to deploy existing code using Visual Studio Team Services.
The trick is to create the artifact yourself, which can be as simple as a zip file containing the static website files. The zip file should be copied as an artifact to the $(build.artifactstagingdirectory) target directory. Next, you can use a simple Web App deployment task to publish the zip file to Azure. If index.html is in the root directory, Azure is smart enough to serve it.
Below is a working build and deploy flow. It assumes gulp is used to build the website and write the build output (i.e. the static files) to a dist folder.
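A minimal sketch of that flow as an inline PowerShell build step, assuming the gulp output lands in a dist folder (names are illustrative):

# Create the zip artifact from the gulp build output in ./dist.
$artifact = Join-Path $env:BUILD_ARTIFACTSTAGINGDIRECTORY "site.zip"
Compress-Archive -Path "dist\*" -DestinationPath $artifact -Force

The zip in $(Build.ArtifactStagingDirectory) is then published as the build artifact, and the release's Web App deployment task pushes it to Azure, which serves index.html from the site root.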
The easiest way is to deploy from a source control, if you take a look under "Settings" for your Website in the Azure portal you will probably see "Continuous deployment".
From there you can deploy from Visual Studio Team Services, Github, etc.
Every check-in will be deployed, including broken ones, so you may want to introduce a staging environment as a deployment slot as well, where you can swap staging with production whenever you feel your site is ready for production.
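For illustration, the swap itself can be scripted with the AzureRM PowerShell module once the staging slot looks good (resource group and app names are placeholders):

# Swap the staging slot into production.
Switch-AzureRmWebAppSlot -ResourceGroupName "my-rg" -Name "my-static-site" -SourceSlotName "staging" -DestinationSlotName "production"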
Without the need to create an artifact, another solution could be FTP deployment after creating a service endpoint in VSTS.

Azure VSO Continuous Delivery: Deploy specific site in multi-site solution

We have a VSO repository with multiple sites in the same solution. We want to be able to deploy our sites independently of each other to Azure with continuous delivery. Right now the first site alphabetically is deployed to all our sites, which of course is not desirable.
Is this possible to achieve?
I have tried to set the Project key to the correct csproj in App Settings as suggested here: https://github.com/projectkudu/kudu/wiki/Customizing-deployments, without any success. Maybe Kudu is not used for VSO?
Any help would be greatly appreciated!
You should move away from trying to do this in a build, especially if you want that level of control.
You have Release Management Online provided with your VSO account and can use the Release Management client to configure your releases.
I believe that you can right-click on your build and have an appropriate starter release template created for you once it's configured.
http://nakedalm.com/create-release-management-pipeline-professional-developers/
Here is an example of an end-to-end deployment with a web app. Deploying to Azure with RM is child's play...
