How to upload an artifact to a folder on a remote machine - Azure

I'm completely new to GitHub Actions and not sure in which direction I should search for help, so I'm posting my problem here.
I have a VM running on Azure. I have a GitHub Action which generates an artifact on push, let's say myPackage.zip. I want to extract it and copy it into a folder C:/path/to/my/app/ on the VM.
I can access my VM using Remote Desktop, as it's a Windows-based machine.

You can use this action: https://github.com/actions/upload-artifact
It allows uploading any file from the runner's file system as a build artifact of the workflow run.
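For example, a minimal workflow sketch (the trigger, runner, build step and artifact name are assumptions based on your description):

.github/workflows/build.yml
name: Build
on: push
jobs:
  build:
    runs-on: windows-latest
    steps:
      - uses: actions/checkout@v4
      # ... your build steps that produce myPackage.zip go here ...
      - name: Upload the package as a build artifact
        uses: actions/upload-artifact@v4
        with:
          name: myPackage
          path: myPackage.zip

This stores myPackage.zip with the workflow run; it can then be downloaded from the run's page or with actions/download-artifact in a later job. Getting it onto the Azure VM itself (for example via a self-hosted runner on that machine) is a separate step.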

Related

Create a local file list from Azure DevOps Repos

I am working on a script solution to capture status changes of an Azure DevOps repo.
Basically here is what we need:
1. The script is supposed to create a file list that contains ALL files in an Azure repo and save it to a local drive.
2. If the newly generated file list has changes, i.e., a file gets removed/added, the script should create a separate file list that records these changes.
Step 2 I can take care of myself.
But since I am not familiar with the DevOps API, could anyone help me on this?
Thank you in advance
For the first requirement, you can fetch the repo to a local drive and list the files in that directory.
Once you have fetched the repo to the local drive, you can run git log -1 to get the latest change commit.

Automatically update a local repository

I'm having trouble creating some sort of automatic deployment function with GitHub.
What I have is a repository on GitHub and a local folder on my Ubuntu machine connected to that repository. What I want to achieve is that every time I upload/add a new file to the repository on GitHub, I can somehow run a script that updates the local folder on my Ubuntu machine with those new files.
So to sum it up:
1. Upload new files to the repo on GitHub.
2. Run a script on the local Ubuntu machine.
3. The newly uploaded files in the GitHub repo get added to the local folder on the Ubuntu machine.
Is there any way to achieve this? Thanks!
That seems to be a case for a webhook, which comes with a constraint: your machine must be reachable from github.com.
If that is the case, you can set up a listener (for instance, alexandru/github-webhook-listener), which will detect the JSON payload sent by GitHub on each push to your repository.
That local listener can then trigger a simple git pull in your local repository, updating its content that way.

Publishing using MSDeploy to Azure App from a different computer updates all the files

Currently in the process of potentially moving our sites to Azure. As it stands we are testing deployment to Azure App Service; everything works and publishes fine using one computer. However, if someone else runs a publish from a different computer with an identical build, the publish operation sees fit to 'update' all of the files, of which there are a lot. Then when the next publish occurs from the original computer, the same happens there. Further publishes from the same computer do not generate this 'updating' of all the files, which takes a long time.
Never had this issue previously when publishing to IIS on our Rackspace servers. Why is MSDeploy choosing to update these files even though they have not changed at all, and seemingly only because the publish is coming from a different computer than the last publish?
Can anyone explain how I can stop this?
It seems your project is in a local or personal repository, and was probably published using zip deploy or Visual Studio.
Deploying like that creates a connection between Azure and the location of your project. If you deploy from another computer, or from another repository, a fresh connection is made to the new location, which updates all the files as if you were publishing a new project.
You could consider deploying continuously from a remote repository such as GitHub, which you can access from any computer.
Here are some samples you could have a look at:
Deploy using GitHub Action
Deploy using DevOps
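For the GitHub Actions route, a rough workflow sketch (the app name, package path and secret name are assumptions; the publish profile is downloaded from the App Service in the portal and stored as a repository secret):

.github/workflows/deploy.yml
name: Deploy to Azure App Service
on:
  push:
    branches: [ main ]
jobs:
  deploy:
    runs-on: windows-latest
    steps:
      - uses: actions/checkout@v4
      # ... build/publish steps that produce the deployable output go here ...
      - name: Deploy to the Web App
        uses: azure/webapps-deploy@v2
        with:
          app-name: my-app-service                                  # assumed App Service name
          publish-profile: ${{ secrets.AZURE_WEBAPP_PUBLISH_PROFILE }}
          package: ./publish                                        # assumed publish output folder

The Azure DevOps equivalent is the 'Azure Web App' (AzureWebApp) task in a build or release pipeline.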

Running Kentico continuous integration on Azure app services

We have a new project in which we are trying to make use of the built-in continuous integration in Kentico for tracking changes to templates, page types, transformations etc.
We have managed to get this working locally between two instances of a Kentico database: making changes in one, syncing the changes through CI and then restoring them to the second database using the Continuous Integration application that sits in the bin folder of the Kentico site.
The issue we are having is when it comes to deploying our changes to our dev and live environments.
Our sites are hosted as Azure App Services and we deploy to them using VSTS (Azure DevOps) build and release workflows. However, as these tasks run on an agent, any PowerShell script we try to run to trigger the CI application fails because it is not running in the site/server context.
My question is, has anyone managed to successfully run Kentico CI in the context of an Azure App Service? Alternatively, how can I trigger a PowerShell script on the site following a deployment?
Yes, I've got this running in Azure DevOps within the release pipeline itself. It's something that we're currently rolling out as a business where I work.
The key steps to getting this working for me were as follows (a rough pipeline sketch follows below):
1. I don't want to deploy the ContinuousIntegration.exe or the repository folders, so I need to create a second artefact set from source control (to my knowledge this is only possible at the moment with Azure Repos and GitHub).
2. Unzip your deployment package and copy the CMS folder to a working directory; this is where you're going to run CI. I did this because I need the built assemblies available.
3. From the repo artefact in step 1, copy the ContinuousIntegration.exe and CI repository folders into the correct place in your unzipped working folder.
4. Ensure that the connection string actually works for the DB from your unzipped folder; if necessary, you may want to change your VS build options with regard to how the web.config is handled.
5. From here, you should be able to run CI in the new working folder against your target database.
In my instance, as I don't have CI running on the target site, everything is restored every time.
I'm in the process of writing this up in more detail, so I'll share it here when I've done that.
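A very rough Azure Pipelines YAML sketch of those steps; the task inputs, artifact names and folder layout are assumptions and will need adjusting to your own build output:

kentico-ci-restore.yml (fragment)
steps:
  # Step 2: unzip the deployment package into a working folder on the agent
  - task: ExtractFiles@1
    inputs:
      archiveFilePatterns: '$(Pipeline.Workspace)/drop/*.zip'
      destinationFolder: '$(Agent.TempDirectory)/working'
  # Step 3: copy ContinuousIntegration.exe from the source-control artefact into the working bin folder
  - task: CopyFiles@2
    inputs:
      SourceFolder: '$(Pipeline.Workspace)/repo/CMS/bin'
      Contents: 'ContinuousIntegration.exe'
      TargetFolder: '$(Agent.TempDirectory)/working/CMS/bin'
  # Step 3: copy the CI repository folder from the source-control artefact
  - task: CopyFiles@2
    inputs:
      SourceFolder: '$(Pipeline.Workspace)/repo/CMS/App_Data/CIRepository'
      TargetFolder: '$(Agent.TempDirectory)/working/CMS/App_Data/CIRepository'
  # Steps 4-5: run the CI restore against the target database (connection string in the working web.config)
  - script: ContinuousIntegration.exe -r
    workingDirectory: '$(Agent.TempDirectory)/working/CMS/bin'
    displayName: 'Kentico CI restore'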
Edit: I finally wrote this up in more detail: https://www.ridgeway.com/blog/web-development/using-kentico-12-mvc-with-azure-devops
We do, but without Kentico CI. We use VSTS + Git. We store virtual objects in the file system and use Git for version control. We have our own custom library that does import/export of the Kentico objects (the ones that are not controlled by Git). Essentially we have a JSON file, a "publishing manifest", where we specify what objects need to be exported (i.e. moved between environments).
There is a step from Microsoft called 'PowerShell on Target Machines'; I guess you could look into that.
P.S. Also take a look at Three Ways to Manage Data in Kentico Using PowerShell.
Deploy your CI files to the Azure App Service, and then use an Azure WebJob to run ContinuousIntegration.exe.
If you place a file called KenticoCI.bat in the directory \App_Data\jobs\triggered\ContinuousIntegration, this will automatically create a web job that you can trigger:
KenticoCI.bat
cd D:\home\site\wwwroot
rem # Takes the site offline by renaming App_Offline.bak to App_Offline.htm
ren App_Offline.bak App_Offline.htm
rem # Runs the Kentico CI integration (restore)
cd D:\home\site\wwwroot\bin
ContinuousIntegration.exe -r
rem # Renames App_Offline.htm back to App_Offline.bak to bring the site back online
cd D:\home\site\wwwroot
ren App_Offline.htm App_Offline.bak

Node.js/GitLab CI: How to serve a decentralized production application

I'm developing a Node.js application using Next.js and Express, and I'm using our own GitLab instance for managing the Git repository.
But the application should not be deployed to a web server in the end; instead I need to create a decentralized production application. To make it a bit clearer:
Developing the application locally
Push application to my remote server
My customers should be able to get the production app code from my remote server
Customers will run the application in their local environment and should be able to pull new versions from the remote server
So the application itself won't run on my remote server, but on the customers' local servers.
Normally I would use my CI to test and build the application (which is done by npm run build). Then I build a Docker image which I use to run the application on my server. But all of that normally happens on the same server.
In this case I need to build the application and serve it to the customers / the customers should be able to pull the production code. How can this be done?
Maybe I can't see the wood for the trees... and that's why I'm asking for help/hints.
There are a number of ways you can do this and a number of tools you can use as well. You probably want a pipeline similar to the following:
1. Code is developed locally, committed, and pushed to the self-hosted GitLab.
2. GitLab CI (or any other CI you have configured) then runs the CI for your code.
3. The final step of the CI is to create a "bundle" of your application. This is probably a .zip or similar, and it is pushed to a remote storage location. It is also possible to ensure that this is done only when pushing to specific branches (such as master).
You can use a number of things as your remote storage location, such as some sort of AWS S3 bucket, or something more complex such as Nexus (there are many free alternatives).
You would then want to give your customers access to either this storage location (if you're using something like S3, or Digital Ocean Block Storage, etc), or access to your distribution repository (such as Nexus).
You should be able to generate some sort of credential (an access key or SSH key, depending on the target) that you can put on your GitLab CI server and use to publish to these places. It should then be a simple case of making an HTTP call to upload a file to the relevant destination. This would often be done when everything has been successful, and only for specific branches. For example, if all your tests pass and you're on the master branch, zip up all your code and make an HTTP call to push the new zip file to AWS S3, which your customers have access to.
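As a minimal sketch of that flow in .gitlab-ci.yml, assuming a Next.js build, an S3 bucket name in a DISTRIBUTION_BUCKET CI variable and AWS credentials in the usual AWS_* CI variables:

.gitlab-ci.yml
stages:
  - build
  - publish

build:
  stage: build
  image: node:18
  script:
    - npm ci
    - npm run build
    # Bundle the build output; adjust the paths to whatever your build produces
    - tar -czf bundle.tar.gz .next public package.json package-lock.json
  artifacts:
    paths:
      - bundle.tar.gz

publish:
  stage: publish
  image:
    name: amazon/aws-cli
    entrypoint: [""]
  script:
    # Push the bundle to the storage location your customers can read from
    - aws s3 cp bundle.tar.gz "s3://$DISTRIBUTION_BUCKET/bundle-$CI_COMMIT_SHORT_SHA.tar.gz"
  only:
    - master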
For further ideas, you could make your storage/distribution location an FTP server if you wanted to, or a local network drive, depending on what your needs are for distribution. If you're just dealing with Docker for your customers, then I'd suggest building a Docker image and self-hosting a Docker registry. Push to that registry after you've built the image, and that would be the end of your CI run.
As a side note, if your customers are using Docker, you could create a Docker image and either push it to a registry or export it as a .tar and upload it to a file storage location (S3 for example). This would make things simple for your customers and ensure you control the image creation step (if that's something you want to manage).
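And a sketch of the Docker variant, assuming a self-hosted registry at registry.example.com with credentials stored in REGISTRY_USER / REGISTRY_PASSWORD CI variables:

.gitlab-ci.yml (additional job)
publish-image:
  stage: publish
  image: docker:24
  services:
    - docker:24-dind
  script:
    # Authenticate against the self-hosted registry, then build and push the image
    - docker login -u "$REGISTRY_USER" -p "$REGISTRY_PASSWORD" registry.example.com
    - docker build -t registry.example.com/my-app:$CI_COMMIT_SHORT_SHA .
    - docker push registry.example.com/my-app:$CI_COMMIT_SHORT_SHA
  only:
    - master

Customers can then docker pull the tag they want from that registry.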
The GitLab CI docs might help you with the specifics of uploading artifacts to various locations.
