Azure web hosting using FTP / MS WebDeploy

Can I host a web application built on .NET Core 2.1, with SQL Server as the database, on Azure App Service using CI tools / MS WebDeploy?
These are the points I want to take care of:
The application uses the file system for temp storage and file storage
Deployment should be managed by a CI tool such as Jenkins
After deployment, the app settings file should be modified with some keys/server details
Log files (stored in the app root) should be accessible to the application administrator
Is there a way to create a virtual directory, as in IIS, and upload the files using FTP or a similar protocol?

Everything you ask about deploying a .NET Core 2.1 web app is achievable.
Assume the project is complete and pushed to GitHub.
Taking your concerns one by one:
Regarding the database connection configuration, you can configure it directly in web.config. If you are using Azure SQL, find the connection string, set up the firewall, and confirm the connection passes an SSMS test; then you can test the connection from code. The connection string can also be added in the portal under Configuration -> Application settings -> Connection strings. A value added there takes priority over the one in web.config: it overrides the configuration at runtime without modifying the web.config file. A sketch follows below.
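A minimal sketch of how the application side reads that value, assuming a connection string named "DefaultConnection" and a hypothetical AppDbContext; on Azure the portal value wins, while local debugging falls back to the file:

    using Microsoft.EntityFrameworkCore;
    using Microsoft.Extensions.Configuration;
    using Microsoft.Extensions.DependencyInjection;

    public class Startup
    {
        private readonly IConfiguration _configuration;

        public Startup(IConfiguration configuration) => _configuration = configuration;

        public void ConfigureServices(IServiceCollection services)
        {
            // Resolves to the portal's Connection strings entry on Azure,
            // and to the value in the local config file otherwise.
            var connectionString = _configuration.GetConnectionString("DefaultConnection");

            // AppDbContext is a placeholder for your own EF Core context.
            services.AddDbContext<AppDbContext>(options => options.UseSqlServer(connectionString));
        }
    }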
Regarding file storage, you may or may not need Azure Storage services; it depends on the business. For example, very small pictures, documents, and similar files can be stored in the application's running directory, which keeps the code consistent with the original development. When publishing, you need to include the MyFiles folder in the published output, or add the folder manually in Kudu after publishing completes, or have the program create it itself. Having the program create it is recommended, so that a later upgrade does not lose data. A sketch of that option follows below.
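A minimal sketch of the "let the program create it" option, assuming the MyFiles folder mentioned above lives under the content root:

    using System.IO;
    using Microsoft.AspNetCore.Hosting;

    // Call once at startup (for example, from Startup.Configure).
    static void EnsureFileStore(IHostingEnvironment env)
    {
        // CreateDirectory is a no-op when the folder already exists, so a
        // later upgrade or redeployment never has to recreate it by hand.
        Directory.CreateDirectory(Path.Combine(env.ContentRootPath, "MyFiles"));
    }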
The confidential information in the app settings file can be configured in web.config or appsettings.json. Make sure the project runs properly when you debug locally, then publish it. The rest is configured in the portal, as in the first point.
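For reference, a sketch of the usual configuration layering in ASP.NET Core 2.1 (WebHost.CreateDefaultBuilder already wires this up for you): environment variables are added after the JSON file, which is how App Service Application settings override file values without modifying the file:

    using System.IO;
    using Microsoft.Extensions.Configuration;

    var config = new ConfigurationBuilder()
        .SetBasePath(Directory.GetCurrentDirectory())
        .AddJsonFile("appsettings.json", optional: false, reloadOnChange: true)
        .AddEnvironmentVariables() // App Service settings surface here
        .Build();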
Storing log files can fully achieve the effect you want; setting the owner permissions on the App Service should be enough. For details, please refer to the official documentation.
For virtual directories and virtual applications, I have a fuller answer in another post here; you can refer to it.
Steps:
First of all, create a web app in the portal and select .NET Core 2.1. Once the App Service is created, click Deployment Center.
Follow the prompts step by step, and wait until the GitHub Action completes and the release succeeds.

Related

Best approach to manage serilog.json location in Azure environment

We have a web application, developed with ASP.NET Core and deployed on Azure using a Docker container. The web app uses the Serilog library (https://serilog.net/) as its internal logging engine. The configuration of Serilog is stored in the file serilog.json, included in the Visual Studio solution and, obviously, in the final Docker container.
Using a special admin panel included in the web app, an administrator can change the active Serilog log level to any supported value (Fatal, Error, Warning, and so on). This feature simply updates the file serilog.json, and ASP.NET Core reloads the new log level on the fly.
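A minimal sketch of how such a reloadable setup is typically wired, assuming the Serilog.Settings.Configuration package; reloadOnChange: true is what lets an edit to serilog.json take effect without a restart:

    using Microsoft.Extensions.Configuration;
    using Serilog;

    var configuration = new ConfigurationBuilder()
        .AddJsonFile("serilog.json", optional: false, reloadOnChange: true)
        .Build();

    Log.Logger = new LoggerConfiguration()
        .ReadFrom.Configuration(configuration) // picks up MinimumLevel etc.
        .CreateLogger();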
It works like a charm, but now the problems begin.
We use Azure DevOps to deploy a new version of our web app every night, so every night the current Docker container is overwritten with the newest one, including the file serilog.json, and we lose the log level configuration.
What is wrong with this approach?
Do we need to move serilog.json outside the Docker container and store it in another location?
Any idea?
Thanks for your help, support and discussion!

How to update a folder in an Azure AppService upon Checkin into a VSO TFS repository

I have a website hosted as an Azure web app. It's an ASP.NET website that's in a VS solution. One folder of that website is my product's documentation, all of it static resources (HTML and images). These static resources are located in a folder in another VS solution (the actual product's solution). Both solutions are TFS-based in VSO.
As of now, I have a webjob running in the context of the website that is basically doing a "tfs get" on the documentation folder and placing its contents into the documentation folder on the website. This works; however, the VMs the website runs on change quite frequently, and the mechanism to create a workspace is bound to the machine, not to the disk drive. Thus, I cannot get only the changes; I must always get all the content, which right now takes about 20 minutes and creates unnecessary load on the website. (This is why I only run this webjob once a week.)
Now I'm looking for a better way to do this. I would like to get only the files that have changed, making this a lot faster and less CPU/drive costly.
I did not find a way to create a workspace on the web server that doesn't vanish each time the web server's VM changes. (If it were possible to somehow attach the workspace to the drive instead of to the machine name, that would solve the problem.)
I was also looking at the continuous build definition I have running for the product's solution. As part of that, I think it's possible to create a deployment where the documentation folder is copied to the App Service's documentation folder. This way I could get rid of the "special" webjob, but I'd still copy all the docs files each time. (Also, the build agent for that runs on premises, so I'd also have to copy those files from premises up to the cloud when they're actually already there inside VSO.) So basically, I don't think this option is of much use in my case.
Obviously, if I moved the static docs resources from the product's solution to the website's solution, I could simply use the automatic deployment that is available for website projects from VSO to an Azure web app. Unfortunately, for various other reasons (one of which being that the static resources are partially created automatically from the .cs sources in the product's solution), I simply cannot move the docs folder from the product's solution to the website's solution.
So does anybody have a suggestion for a method by which I could update the documentation folder in the web app based on changes in the corresponding VSO folder?
You can upload the updated files to the Azure App Service by using the Kudu API.
Simple steps:
Create a Continuous Integration build
Check the Allow Scripts to Access OAuth Token option in the Options tab
Add a PowerShell step/task to check the changes with the REST API. (Refer to Calling VSTS APIs with PowerShell)
Add an Azure PowerShell step/task to upload the files to the app service by using the Kudu API. (Refer to Remove files and folders on Azure before a new deploy from VSTS; a sketch of the upload call follows below.)
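As referenced above, a minimal sketch of the Kudu VFS upload call; the site name, deployment credentials, and paths are placeholders:

    using System;
    using System.IO;
    using System.Net.Http;
    using System.Net.Http.Headers;
    using System.Text;
    using System.Threading.Tasks;

    static async Task UploadAsync(string localFile, string remotePath)
    {
        // Deployment credentials, e.g. "$mysite:password" (placeholder).
        var creds = Convert.ToBase64String(Encoding.ASCII.GetBytes("$mysite:password"));
        using (var client = new HttpClient())
        {
            client.DefaultRequestHeaders.Authorization =
                new AuthenticationHeaderValue("Basic", creds);
            // "If-Match: *" tells Kudu to overwrite regardless of the ETag.
            client.DefaultRequestHeaders.Add("If-Match", "*");

            var url = "https://mysite.scm.azurewebsites.net/api/vfs/site/wwwroot/" + remotePath;
            using (var content = new StreamContent(File.OpenRead(localFile)))
            {
                var response = await client.PutAsync(url, content);
                response.EnsureSuccessStatusCode();
            }
        }
    }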
Here is what I ended up doing:
Created a ServiceHook in VSO that is wired to "Web Hooks". The hook is called upon each check-in and filtered based on the directory I want. (All of this can be done using existing functionality in VSO.)
The hook calls an Azure function app (which is easy to do, because function apps have an HttpTrigger mechanism that fits in nicely here).
The hook passes the ID of the check-in's changeset to the function app.
The function app puts that ID into an Azure Storage queue.
This triggers an Azure webjob that listens on that queue. The webjob uses the changeset ID to get the changes from VSO and acts on the change type of each change (e.g., downloads or deletes a file). A sketch of that webjob follows below.
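A minimal sketch of that queue-listening webjob using the Azure WebJobs SDK; the queue name and the VSO call are illustrative:

    using System.IO;
    using Microsoft.Azure.WebJobs;

    public class Functions
    {
        // Invoked by the WebJobs SDK whenever the function app enqueues a
        // changeset ID on the "changesets" queue (placeholder name).
        public static void ProcessChangeSet(
            [QueueTrigger("changesets")] string changeSetId,
            TextWriter log)
        {
            log.WriteLine($"Fetching changes for changeset {changeSetId} from VSO...");
            // Call the VSO REST API here, then download or delete each
            // changed file depending on its change type.
        }
    }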

FTP'ing a Suave app to Azure

Having never used Azure before, I'm attempting to deploy a simple F# Suave app to Azure using FTP. Ultimately I want to deploy via GitHub, but I initially thought FTP'ing it would be the easy first step. According to https://suave.io/azure-app-service.html it should be straightforward.
These are the steps I followed:
Created a new web app in Azure, including a resource group and an App Service plan, all on the Free tier.
Downloaded the publishsettings XML file that Azure created.
Cloned this repo: https://github.com/isaacabraham/fsharp-demonstrator
Used FileZilla to connect via FTP using the creds from step 2.
Uploaded the files (via FTP) from fsharp-demonstrator/src/SuaveHost (which includes the necessary web.config file) in the repo cloned at step 3 to site\wwwroot on Azure.
Navigated to the Azure site URL.
Then I receive the error:
The specified CGI application encountered an error and the server terminated the process.
(When I look at the folders on Azure under site\wwwroot I don't see any obj or bin folders. I don't think any MSBuild process occurred. That doesn't seem right.)
Anybody got any idea what the problem is?
I suspect the issue is that when you deploy via FTP, Azure does not automatically run the deploy script specified in the .deployment file.
The build.fsx script uses the Kudu service to deploy the built files, so it might be easier to just use GitHub deployment rather than FTP; this way, Azure will do the deployment for you.
If you want to deploy via FTP, you'll need to build the project locally and upload the output. I'm not sure how best to do this with Isaac's Kudu-based demo, though (ultimately, you need a web.config that points to your built executable like this).
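For reference, a sketch of the general shape such a web.config takes with the HttpPlatformHandler approach the Suave guide relies on; the executable path is a placeholder for your locally built binary:

    <?xml version="1.0" encoding="utf-8"?>
    <!-- Sketch only: processPath must point at your built executable. -->
    <configuration>
      <system.webServer>
        <handlers>
          <add name="httpPlatformHandler" path="*" verb="*"
               modules="httpPlatformHandler" resourceType="Unspecified" />
        </handlers>
        <httpPlatform processPath="%HOME%\site\wwwroot\SuaveHost.exe"
                      arguments="%HTTP_PLATFORM_PORT%"
                      stdoutLogEnabled="true" />
      </system.webServer>
    </configuration>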

Patching a website on Azure web roles

Sometimes issues come up in our website, which is deployed on Azure web roles, related to small bugs in JavaScript and HTML. We go to all instances of the web roles and fix these JS and HTML files on the machines.
But I was looking into some automated way of doing this: downloading the files to patch from some central location and replacing the files on all Azure web roles. I am using ASP.NET MVC for the website.
It is possible to redeploy the website with the patch in the package, but we don't want to wait through the long deployment time. Please let me know if it is possible via some internal web API that replaces the content on all Azure web roles.
There are two ways to deploy a new web role:
redeploy
in-place update
The first one is the slowest, meaning new VMs are booted.
With an in-place upgrade (https://azure.microsoft.com/en-us/documentation/articles/cloud-services-update-azure-service/), the new application package is mounted on a new drive (usually F: instead of E:) and the IIS website is swapped to the new drive.
You can try this by going to the old portal and uploading a new application package. In just a few seconds/minutes the update is done.
After digging through many things on Stack Overflow, I crafted my own solution: create a topic and subscribe to it in code when the website starts. When I want to patch the web app, I send a message to the topic to start patching; each machine in the web role then gets the notification from the topic and patches itself. Patching itself is very easy: go to web storage, download the files from there, and replace the files in approot.
When Azure maintenance happens this patching may be lost, so I also made the patching work run at website startup.
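A minimal sketch of the subscribe-and-patch wiring using the classic Microsoft.ServiceBus client; the connection string, topic name, and ApplyPatchFromBlobStorage helper are placeholders, and each role instance would use its own subscription:

    using System;
    using Microsoft.ServiceBus.Messaging;

    var connectionString = "Endpoint=sb://..."; // placeholder
    // One subscription per instance so every machine sees the message.
    var client = SubscriptionClient.CreateFromConnectionString(
        connectionString, "patch-topic", Environment.MachineName);

    client.OnMessage(message =>
    {
        // Download the changed JS/HTML files from storage, replace the
        // copies in approot, then mark the message as handled.
        ApplyPatchFromBlobStorage(); // placeholder for the patch routine
        message.Complete();
    });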
Cloud Service deployment packages tend to be slow since they are basically a recipe for how to build and configure your deployment. The deployment not only puts the recipe out in Azure (so it can be used again if your machine needs to be moved), but also follows the recipe to build out a VM for your Cloud Service (WebRoles/WorkerRoles are platform as a service, so you don't have to worry about the OS and infrastructure level as you would with the Virtual Machines product, but they do still run in VMs on physical hardware).
What you are looking to do is update the recipe (your cloud service package) and your deployment after it is already out and running; there is no simple way to do that in Cloud Services.
However, yes, you could create a startup script that pulls the site files from blob storage or some other centralized location; this is comparable to how applications (Fiddler, for example) check for updates and then update and replace themselves. For that sort of feature you will likely need to run code as an elevated user. One nice thing about startup tasks is that they can run elevated, so they can do about anything you need done on a machine (but they will require you to restart the instance for them to run). Basically, you would need to write some code that allows your site to update itself. This link may help: https://azure.microsoft.com/en-us/documentation/articles/cloud-services-startup-tasks/
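A minimal sketch of the download-and-replace step such a startup script (or the code it launches) could perform, using the classic WindowsAzure.Storage client; the connection string, container name, and target path are placeholders:

    using System.IO;
    using Microsoft.WindowsAzure.Storage;
    using Microsoft.WindowsAzure.Storage.Blob;

    var account = CloudStorageAccount.Parse("DefaultEndpointsProtocol=..."); // placeholder
    var container = account.CreateCloudBlobClient().GetContainerReference("site-files");
    var approotPath = @"E:\approot"; // placeholder target

    // A flat listing walks every blob in the container, including "subfolders".
    foreach (var item in container.ListBlobs(useFlatBlobListing: true))
    {
        var blob = (CloudBlockBlob)item;
        var target = Path.Combine(approotPath, blob.Name.Replace('/', '\\'));
        Directory.CreateDirectory(Path.GetDirectoryName(target));
        blob.DownloadToFile(target, FileMode.Create);
    }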
If you have the ability to migrate to WebApps and WebJobs, I would recommend looking into that since that compute product solves your problem really well.
Here is a useful answer of the differences between WebApps and Cloud Services: What is the difference between an Azure Web Site and an Azure Web Role

Publishing "baked" Orchard CMS to Azure CloudService

I am using Orchard CMS 1.6 with the goal of deploying it to an Azure Cloud Service. I have followed the steps in the documentation, Deploying Orchard to Windows Azure (http://docs.orchardproject.net/Documentation/Deploying-Orchard-to-Windows-Azure).
However, it hit a timeout error again and again while cooking the recipe (I have tried with a Small VM).
My idea: instead of running the setup process during initial deployment, I would like to deploy a "baked", ready copy of Orchard to the Cloud Service (and manually deploy the DB scripts to SQL Azure).
I tried working on Orchard.Azure.sln and building the package again with ClickToBuildAzurePackage.cmd, but now I get the error:
"The type 'Orchard.Environment.Configuration.AzureBlobTenantManager' could not be found. It may require assembly qualification, e.g. "MyType, MyAssembly"."
Any idea or experience to share?
Thanks.
Finally I made it work on the Cloud Service.
My idea is to cook the recipe on my local machine instead of on Azure itself, to avoid the timeout problem. That way we end up with the ready-cooked structure in Azure Storage and the database schema in place.
Then build the package with ClickToBuildAzurePackage.cmd and deploy it to the Azure Cloud Service. The instance will skip the setup process, since Azure Storage contains the required information.
Below is my workaround:
Download Orchard.Source.1.6.zip from the Orchard CodePlex site
Extract and open Orchard.Source.1.6\src\Orchard.Azure\Orchard.Azure.sln
In the solution, edit your Orchard.Azure.Web role: change the Data Connection setting and Diagnostics to your production/development storage account.
Create an empty database in your SQL Azure (assuming you are using SQL Server)
Hit F5 to start the application and enter the setup information to start cooking the recipe.
You can watch Orchard cooking in the browser.
Once Orchard is cooked, check your Azure Storage (with CloudBerry, for example); you should have the following folders:
media
site
wad-control-container
wad-iis-logfiles
Follow the instructions from Deploying Orchard to Windows Azure.
You should now have an instance of Orchard running without it kicking off the setup process.
