Azure web app - syncing files from external process

This question feels so simple that I'm confused why I can't find an answer!
We have an Azure web app, typically running on 2 to 5 instances.
At the moment we manually run a fairly intensive PHP script a few times a day on a local computer to generate a folder of files. (The resulting folder isn't huge - typically about 10MB in size and a few hundred files in total.) We then sync them via GitHub and they deploy to the website. Easy.
That process is fine, but we want to move the PHP script to Azure so we can remove the dependency on running it locally and instead run it as a cron job.
How can we reliably sync the output folder from our script into our web app?

One option is to use a triggered WebJob with a cron schedule. Your WebJob can contain just your PHP script. Or if it needs a special command line to run, include a run.cmd batch file with the full PHP command line.
In your PHP script, do whatever you need to gather the right set of files, and then just copy them to %home%\site\wwwroot\json-data.
For this to work, everything you need to do within your PHP script needs to be runnable within the App Service sandbox. You should first try this directly from Kudu Console before moving it to a WebJob, to make sure everything can run.
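For illustration, a minimal triggered WebJob could be just two files: a settings.job holding the cron schedule and a run.cmd that runs the script and copies its output. The script name, schedule, and relative output folder below are assumptions for this sketch; the json-data target is the one mentioned above.

settings.job
{ "schedule": "0 0 */6 * * *" }

run.cmd
rem Entry point for the triggered WebJob (runs on the cron schedule above)
php.exe generate-files.php
rem Copy the generated folder into the site content (assumed output folder)
xcopy /Y /E /I output %home%\site\wwwroot\json-data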

Set your web app deployment source to Local Git, then push your code to the web app with git push. I am not sure about the PHP build process, but when you push your code to an Azure web app via the Local Git method, it builds, restores all dependencies, and deploys it. For a custom build script you can refer to https://github.com/projectkudu/kudu/wiki/Custom-Deployment-Script and
https://medium.com/@trstringer/custom-build-logic-post-git-push-with-azure-app-service-and-kudu-for-a-node-js-web-app-1b2719598916
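As a rough illustration of the custom deployment script approach from the Kudu wiki linked above (the variables are the ones Kudu sets for its generated scripts; treat this as a sketch, not the exact script for your app):

.deployment
[config]
command = deploy.cmd

deploy.cmd
rem Sync the pushed repository to wwwroot, excluding deployment metadata
call %KUDU_SYNC_CMD% -v 50 -f "%DEPLOYMENT_SOURCE%" -t "%DEPLOYMENT_TARGET%" -n "%NEXT_MANIFEST_PATH%" -p "%PREVIOUS_MANIFEST_PATH%" -i ".git;.deployment;deploy.cmd"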

Related

Running Kentico continuous integration on Azure app services

We have a new project in which we are trying to make use of the built in continuous integration in Kentico for tracking changes to templates, page types, transformations etc.
We have managed to get this working locally between two instances of a Kentico database: making changes in one, syncing the changes through CI, and then restoring them to the second database using the Continuous Integration application that sits in the bin folder of the Kentico site.
The issue we are having is when it comes to deploying our changes to our dev and live environments.
Our sites are hosted as Azure App Services and we deploy to them using VSTS (Azure DevOps) build and release workflows. However, as these tasks run on an agent, any PowerShell script we try to run to trigger the CI application fails because it is not running in the site/server context.
My question is: has anyone managed to successfully run Kentico CI in the context of an Azure App Service? Alternatively, how can I trigger a PowerShell script on the site following a deployment?
Yes, I've got this running in Azure DevOps within the release pipeline itself. It's something that we're currently rolling out as a business where I work.
The key steps to getting this working for me were as follows:
1. I don't want to deploy the ContinuousIntegration.exe or the repository folders, so I need to create a second artefact set from source control (to my knowledge this is only possible at the moment with Azure Repos and GitHub).
2. Unzip your deployment package and copy the CMS folder to a working directory; this is where you're going to run CI. I did this because I need the built assemblies available.
3. From the repo artefact in step 1, copy the ContinuousIntegration.exe and CI repository folders into the correct place in your unzipped working folder.
4. Ensure that the connection string actually works for the DB in your unzipped folder; if necessary, you may want to change your VS build options with regard to how the web.config is handled.
5. From here, you should be able to run CI in the new working folder against your target database.
In my instance, as I don't have CI running on the target site, everything is restored every time.
I'm in the process of writing this up in more detail, so I'll share it here when I've done that.
Edit:
- I finally wrote this up in more detail: https://www.ridgeway.com/blog/web-development/using-kentico-12-mvc-with-azure-devops
We do, but with no CI - VSTS + Git. We store virtual objects in the file system and use Git for version control. We have our own custom library that does import/export of the Kentico objects (the ones not controlled by Git). Essentially we have a JSON "publishing manifest" file where we specify which objects need to be exported (i.e. moved between environments).
There is a step from Microsoft, 'PowerShell on Target Machines'; I guess you can look into that.
P.S. Also take a look at Three Ways to Manage Data in Kentico Using PowerShell.
Deploy your CI files to the Azure App Service, and then use an Azure WebJob to run ContinuousIntegration.exe.
If you place a file called KenticoCI.bat in the directory \App_Data\jobs\triggered\ContinuousIntegration, this will automatically create a web job that you can trigger:
KenticoCI.bat
cd D:\home\site\wwwroot
rem # Renames 'App_Offline.bak' to 'App_Offline.htm' to take the site offline
ren App_Offline.bak App_Offline.htm
rem # Run the Kentico CI integration (restores objects from the CI repository)
cd D:\home\site\wwwroot\bin
ContinuousIntegration.exe -r
rem # Removes the 'App_Offline.htm' file to bring the site back online
cd D:\home\site\wwwroot
ren App_Offline.htm App_Offline.bak
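To answer the "trigger after a deployment" part: a triggered WebJob can be invoked over Kudu's REST API from the release pipeline. A rough sketch using curl from a batch step (the site name and the deployment credential variables are placeholders you'd supply from your pipeline):

rem Trigger the WebJob created above via the Kudu REST API
curl -u %DEPLOY_USER%:%DEPLOY_PASSWORD% -X POST https://yoursite.scm.azurewebsites.net/api/triggeredwebjobs/ContinuousIntegration/run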

Azure DevTestLab Artifacts - Folder not accessible

I created an artifact - "TempArtifact". I added a sub-folder inside it called scripts, which contains all the scripts I will call from my main script. But it looks like this folder is not accessible.
When I add an "ls" command to my main script, it shows only my main script and the Artifacts.json (no folders).
Should I be doing something extra to access the subfolder inside my artifact, or is this a feature that is not supported by DTL artifacts?
Are you using VSO or GitHub? I know that for VSO, only first-level files will be copied. I think it is the same for GitHub, but I'm not completely sure.
For Windows systems, the artifacts are copied from source control to the remote VM locally at "C:\Packages\Plugins\Microsoft.Compute.CustomScriptExtension\1.8\Downloads", so you can review the scripts being executed. If the execution fails, the scripts are kept after execution for review - have a look and see what files you have there.
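If you have RDP access to the lab VM, a quick way to check what actually got copied (the extension version folder may differ on your VM):

rem List everything the Custom Script Extension downloaded for the artifact
dir /s "C:\Packages\Plugins\Microsoft.Compute.CustomScriptExtension\1.8\Downloads"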

MEAN application on Azure Web App

I have a MEAN application that will be deployed to Azure via continuous delivery.
We have defined a VSTS build script that compiles the TypeScript to JS files and cleans up unneeded files, plus a gulp task that zips the files and finally copies the zip to the drop folder.
Then, at the release level, we have a script per environment (IT, staging).
We have an Azure Web App Deployment task; it takes the zip file and decompresses the files correctly.
But if I include the node_modules folder, it takes more than 2 hours...
I'm looking for a way to launch an 'npm install' after this task.
I know it's possible to go to Kudu and launch the npm install there.
I've tested a solution with a deploy.sh (found on the net), but the task doesn't launch automatically as far as I can see.
I've tried to launch a PowerShell script, but it failed...
I have the feeling it's not possible without manual intervention via Kudu or something similar.
There is a way to do this. In the Azure Portal, go to your Web App and select "Deployment Options". In there you will see ways to load your app automatically from GitHub, Visual Studio Team Services, OneDrive, and more.
It's all automatic, from the copying of the code to the npm install. It works great for us across development, test, and production environments using VSTS, and we have a node/express/angular app.
Here are the docs for setting up automatic deployment:
https://learn.microsoft.com/en-us/azure/app-service-web/web-sites-deploy
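If you'd rather keep the zip-based release and just run npm install on the App Service after the files land, a custom Kudu deployment script can do that. A minimal sketch of the relevant fragment (variable names as in Kudu's generated Node deploy scripts; the --production flag is an assumption):

rem After the files have been copied to the target, restore packages there
pushd "%DEPLOYMENT_TARGET%"
call npm install --production
popd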

Build web application before or after deployment?

Context
Web application project has a /build (or /dist) folder with front-end files, generated during build (by Gulp). This folder is not under source control (see, for example: React.js Starter Kit)
The server-side code doesn't require bundling or compilation step, so the /src folder from your project can be deployed as it is (these source files are used to run Node.js or ASP.NET vNext server)
Web application is deployed via Git (see Git-based deployment options in Heroku or Windows Azure as an example)
Questions
Is it better to build (bundle and minify) front-end files before or after deployment?
If before, you may end up having a separate repository (or branch) with the /build folder under source control alongside the rest of the project files. This repo is used solely for deployment purposes.
If after, the deployment time may increase due to the time needed to download the additional npm modules used in the build process, and the server's CPU may spike to 100% during the build, potentially harming your web application's responsiveness.
Is it better to build front-end files on the remote server before or after running KuduSync command?
If you deploy your web application to Windows Azure with Kudu, should the deployment script copy only the contents of the /build folder (with public, front-end files like .js, .html, .css) to /wwwroot? As opposed to copying all the project files (server-side source code and front-end bundles), which it does by default.
By default, Azure's deployment script copies all the project files from the D:\home\site\repository folder to the D:\home\site\wwwroot folder, and the Node.js app is then started from there. Is that a necessary step? Why not start the Node.js (or ASP.NET vNext) app from the D:\home\site\repository folder? And if it should indeed be copied to a separate folder, why are the source files placed in wwwroot? Maybe it would be better to copy them to another folder outside wwwroot.
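For concreteness, in a custom Kudu deployment script the "copy only /build" variant being asked about amounts to pointing KuduSync's source at the build folder rather than the repository root (a sketch, using the variables Kudu sets for its generated scripts):

rem Copy only the built front-end files into wwwroot
call %KUDU_SYNC_CMD% -v 50 -f "%DEPLOYMENT_SOURCE%\build" -t "%DEPLOYMENT_TARGET%" -n "%NEXT_MANIFEST_PATH%" -p "%PREVIOUS_MANIFEST_PATH%" -i ".git;.deployment;deploy.cmd"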
I am not familiar with either Azure or Heroku, so I can't give any ideas about those specific deployment options.
In my setup (4 dedicated servers, 2 of them solely for serving static files), the option to build the bundled and minified JavaScript files (for the front end) and add all those files to the main repository has several advantages:
You only need to run it once (either on your dev machine or on a staging server, whichever way you want). This is particularly helpful when you have to run multiple static servers, since you don't have to run the build command on each server. One might argue that you could use something like GlusterFS to synchronise files from one static server to all the others, so that the build process only needs to run once; however, that kind of setup is a whole different story.
It makes your deployment process simple, just pull new code and restart the server(s) if necessary (assuming that you have some mechanism to increase the static file version so that all your clients will receive the latest version)
Avoid unnecessary dependencies on your production servers. This might sound weird to some people, but I just don't want to install any extra libraries on my production servers unless they are absolutely necessary. With the build process run locally on my dev machine, my production servers have only what they need to run the production code and nothing else.
However, this approach also has some disadvantages:
When more than one developer on your team (accidentally) runs the build process and commits the code, you will end up with a crazy list of conflicts. However, this can be solved by simply running the build process again after you merge all the changes from your teammates. This is more about the workflow.
Your repository will be bigger. I personally don't think this is a big issue, considering the few extra MB of my bundled and minified files. If your front-end JavaScript is big enough for this to be an issue, then that is another story.

CruiseControl.net - Is there a way to run a batch file that will copy (msi) files to another server in a different domain?

I have an issue where I am trying to copy files (msi) from our build server to our test server in CruiseControl. Once these are copied over, we are planning on having a Scheduled Task that will run silent installs nightly. I need to be able to push the status of that build back to CruiseControl.
I am having issues copying these files from a batch file that is being run in our CruiseControl project. I'm pretty sure it's a permissions issue.
Also, is there a way to push the build status back to CruiseControl so that it could tell us when the install failed?
There's no simple solution available, I'm afraid. The only time I saw anything similar done, it was using a Python script to invoke commands on a remote system over SSH.
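That said, a batch file can often get around the cross-domain permissions issue by mapping the remote share with explicit credentials, and CruiseControl.NET's exec task will mark the build as failed when the script exits non-zero. A sketch with placeholder server, share, and account names:

copy-msi.bat
rem Map the remote share with credentials valid in the other domain
net use \\testserver\installers /user:TESTDOMAIN\builduser P@ssw0rd || exit /b 1
rem Copy the installers; remember the result before cleaning up
xcopy /Y "C:\builds\output\*.msi" "\\testserver\installers\"
set COPYRESULT=%ERRORLEVEL%
net use \\testserver\installers /delete
rem A non-zero exit code reports the failure back to CruiseControl
exit /b %COPYRESULT%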
