Azure DevTest Labs Artifacts - Folder not accessible

I created an artifact, "TempArtifact", and added a sub-folder inside it called scripts, which contains all the scripts that I will call from my main script. But it looks like this folder is not accessible.
When I add an "ls" command to my main script, it only shows my main script and the Artifacts.json (no folders).
Should I be doing something extra to access the subfolder inside my artifact, or is this something that is not supported by DTL artifacts?

Are you using VSO or GitHub? I know that for VSO, only first-level files will be copied. I think it is the same for GitHub, but I am not completely sure.
For Windows systems, the artifacts are copied from source control to the remote VM locally at ":\Packages\Plugins\Microsoft.Compute.CustomScriptExtension\1.8\Downloads", so you can review the scripts being executed. If the execution fails, the scripts are kept after execution for review - have a look and see what files you have there.
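If you want to poke around, a quick PowerShell check on the VM might look like this (a sketch only, assuming the extension lands on the C: drive, which the path above leaves out):
Get-ChildItem -Path 'C:\Packages\Plugins\Microsoft.Compute.CustomScriptExtension' -Recurse |
    Select-Object FullName   # lists everything the extension downloaded, so you can see whether the scripts sub-folder made it across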

Related

How to set the bin directory in Azure Pipelines?

I wrote two Azure pipelines. One pipeline builds my app as a tool and publishes it to an artifact feed.
My second pipeline runs that app from the artifact feed. The problem is that my app does not see appsettings.json. When I tried to print the current directory, I saw that it is the repo root folder. How do I solve this?
What I have tried is to set workingDirectory in the task that runs the tool to:
$(Build.BinariesDirectory)
$(Agent.BuildDirectory)
etc.
So, when I run the tool locally and print the files in the current directory, I get all the DLLs included. But on Azure I get the repo root.
Is my approach good or completely wrong?
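One thing that might help narrow it down is to print the relevant locations from a PowerShell step right before the tool runs; a small diagnostic sketch using the predefined pipeline variables mentioned above (exposed to scripts as environment variables):
Write-Host "Current directory:       $(Get-Location)"
Write-Host "Build.BinariesDirectory: $env:BUILD_BINARIESDIRECTORY"
Write-Host "Agent.BuildDirectory:    $env:AGENT_BUILDDIRECTORY"
# check whether the tool's DLLs and appsettings.json actually sit in the folder you point workingDirectory at
Get-ChildItem -Path $env:BUILD_BINARIESDIRECTORY -Recurse | Select-Object FullName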

What is the scope of the pipeline tasks (for example Command Line/Delete Files) running on a hosted agent?

When the pipeline runs, it first downloads the repo files (Get Sources). Then I can unit test/build the project using a command line task if required. This is followed by deleting the files using the 'Delete Files' task. Any environment variables are scoped to this pipeline only and are cleared when pipeline execution completes. Any files created remain forever (unless deleted using the 'Delete Files' task or a command line task).
What is the scope of the Command Line and Delete Files pipeline tasks running on a hosted agent? Can they access the entire hard disk of the agent (all drives/files)? Are they scoped to the build folder, the agent folder, or the entire hard disk of the agent?
The scope is the entire hard disk.
You can use the Command Line task, or any other task, to perform operations anywhere on the agent's file system, not only in the build agent folder.
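As a quick illustration of that scope, a minimal inline PowerShell sketch that reads well outside the build folder (the drive letters are just what typical hosted Windows agents expose):
Get-PSDrive -PSProvider FileSystem               # every drive the task can see, not just the workspace
Get-ChildItem -Path 'C:\' | Select-Object Name   # listing the root of the system drive works, well outside the build folder
Write-Host "Build sources folder: $env:BUILD_SOURCESDIRECTORY"   # the pipeline's own workspace is just one folder on that disk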

Azure DevOps - Clean up old releases on on-premise server?

I have a pipeline that deploys code to an IIS folder on an on-premise server. I'm trying to figure out how to best delete old release folders. I'm not seeing anything obvious within DevOps.
Is there a native way of doing this? Or should I roll my own PowerShell script to delete old releases?
Is there a native way of doing this?
Yes, of course there is.
Open your deploy task, go to Advanced Deployment Options, and then enable the option Remove Additional Files at Destination.
Note: this "delete" operation does not mean that all the files in the local IIS folder are cleared. It only deletes files at the destination that have no corresponding file in the package being deployed.
In short, files that also exist in the new package are overwritten with the latest versions, and any leftover files from a previous deployment that are no longer required are deleted.
If you do not trust this option and want to clear the previous files completely, you can also add a PowerShell task before the IIS management task and run a delete script.
Here is a sample script to delete the local files:
Remove-Item -Path "D:\Websites2\*" -Recurse -Force
Replace "D:\Websites2\*" with your local website file path.
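If the goal is to prune old release folders rather than wipe the whole site, a rolled-your-own sketch along these lines might be closer to what you want (the D:\Websites2\releases layout and the retention count of 3 are only assumptions for illustration):
$releaseRoot = 'D:\Websites2\releases'   # hypothetical root with one sub-folder per release
$keep = 3                                # how many of the newest releases to retain
Get-ChildItem -Path $releaseRoot -Directory |
    Sort-Object CreationTime -Descending |
    Select-Object -Skip $keep |
    Remove-Item -Recurse -Force          # everything older than the newest $keep folders is removed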

Running Kentico continuous integration on Azure app services

We have a new project in which we are trying to make use of the built in continuous integration in Kentico for tracking changes to templates, page types, transformations etc.
We have managed to get this working locally between two instances of a Kentico database: making changes in one, syncing the changes through CI, and then restoring them to the second database using the Continuous Integration application that sits in the bin folder of the Kentico site.
The issue we are having is when it comes to deploying our changes to our dev and live environments.
Our sites are hosted as Azure App Services and we deploy to them using VSTS (Azure DevOps) build and release workflows. However, as these tasks run on an agent, any PowerShell script we try to run to trigger the CI application fails because it is not running in the site/server context.
My question is: has anyone managed to successfully run Kentico CI in the context of an Azure App Service? Alternatively, how can I trigger a PowerShell script on the site following a deployment?
Yes, I've got this running in Azure DevOps within the release pipeline itself. It's something that we're currently rolling out as a business where I work.
The key steps to getting this working for me were as follows:
1. I don't want to deploy the ContinuousIntegration.exe or the repository folders, so I need to create a second artefact set from source control (to my knowledge this is currently only possible with Azure Repos and GitHub).
2. Unzip your deployment package and copy the CMS folder to a working directory; this is where you're going to run CI. I did this because I need the built assemblies available.
3. From the repo artefact in step 1, copy the ContinuousIntegration.exe and CI repository folders into the correct place in your unzipped working folder.
4. Ensure that the connection string actually works for the DB in your unzipped folder; if necessary, you may want to change your VS build options with regard to how the web.config is handled.
From here, you should be able to run CI in the new working folder against your target database, as sketched below.
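A rough PowerShell sketch of that final restore step, as it might appear in a release pipeline script (the working folder path is hypothetical, and the exit-code check is just one way to fail the stage if the restore fails):
$workDir = 'C:\agent\_work\kentico-ci\CMS'   # hypothetical working folder holding the unzipped CMS folder plus the copied CI repository
Set-Location (Join-Path $workDir 'bin')      # ContinuousIntegration.exe lives in the Kentico bin folder
.\ContinuousIntegration.exe -r               # -r restores the CI repository into the database named in the connection string
if ($LASTEXITCODE -ne 0) { throw "Kentico CI restore failed with exit code $LASTEXITCODE" }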
In my instance, as I don't have CI running on the target site, everything is restored every time.
I'm in the process of writing this up in more detail, so I'll share it here when I've done that.
Edit: I finally wrote this up in more detail: https://www.ridgeway.com/blog/web-development/using-kentico-12-mvc-with-azure-devops
We do, but without CI: VSTS + Git. We store virtual objects in the file system and use Git for version control. We have our own custom library that does import/export of the Kentico objects (the ones that are not controlled by Git). Essentially, we have a JSON file, a "publishing manifest", where we specify which objects need to be exported (i.e. moved between environments).
There is a task from Microsoft, 'PowerShell on Target Machines'; I guess you can look into that.
P.S. Take a look also at Three Ways to Manage Data in Kentico Using PowerShell
Deploy your CI files to the Azure App Service, and then use an Azure WebJob to run ContinuousIntegration.exe.
If you place a file called KenticoCI.bat in the directory \App_Data\jobs\triggered\ContinuousIntegration, this will automatically create a triggered web job that you can run:
KenticoCI.bat
cd D:\home\site\wwwroot
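rem # Take the site offline by renaming App_Offline.bak to App_Offline.htm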
ren App_Offline.bak App_Offline.htm
rem # Run the Kentico CI restore (ContinuousIntegration.exe -r)
cd D:\home\site\wwwroot\bin
ContinuousIntegration.exe -r
rem # Rename App_Offline.htm back to App_Offline.bak to bring the site back online
cd D:\home\site\wwwroot
ren App_Offline.htm App_Offline.bak

Azure web app - syncing files from external process

This question feels so simple that I'm confused why I can't find an answer!
We have an Azure web app, typically running on 2 to 5 instances.
At the moment we manually run a quite intensive PHP script a few times a day on a local computer to generate a folder of files. (The resulting folder isn't huge - typically about 10MB in size and a few hundred files in total.) We then sync them via GitHub and they deploy to the website. Easy.
That process is fine, but we want to move the PHP script to Azure so we can remove the dependency on running it locally and instead run it as a cron job.
How can we reliably sync the outputted folder from our script into our web app?
One option is to use a triggered WebJob with a cron schedule. Your WebJob can contain just your PHP script. Or if it needs a special command line to run, include a run.cmd batch file with the full PHP command line.
In your PHP script, do whatever you need to gather the right set of files, and then just copy them to %home%\site\wwwroot\json-data.
For this to work, everything you need to do within your PHP script needs to be runnable within the App Service sandbox. You should first try this directly from Kudu Console before moving it to a WebJob, to make sure everything can run.
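For the copy step itself, a minimal sketch (shown in PowerShell for illustration, though the same could be done from the PHP script; the .\output folder is a hypothetical location where the script gathers its files):
$target = Join-Path $env:HOME 'site\wwwroot\json-data'             # %home% resolves to D:\home on Windows App Service
New-Item -ItemType Directory -Path $target -Force | Out-Null
Copy-Item -Path '.\output\*' -Destination $target -Recurse -Force  # overwrite the previous run's files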
Set your web app deployment source to Local Git. After that, push your code to the web app with git push. I am not sure about the PHP build process, but when you push your code to an Azure web app via the Local Git method, it builds, restores all dependencies, and deploys it. For a custom build script, you can refer to https://github.com/projectkudu/kudu/wiki/Custom-Deployment-Script.
https://medium.com/#trstringer/custom-build-logic-post-git-push-with-azure-app-service-and-kudu-for-a-node-js-web-app-1b2719598916
