Can I use Azure Functions to automate Linux tasks?

Here is a diagram explaining what I do manually and what I want to automate:
connect to an on-premise gitlab-ce instance
start a GitLab backup and wait for it to finish
copy the backup files to Azure Blob Storage and wait for it to finish
copy the backup files from Blob Storage to an Azure VM and wait for it to finish
connect to the Azure VM and restore the backup
I want to automate these tasks with an Azure tool.
Can someone point me in a direction where I can start looking?
Can Azure Functions perform those kinds of tasks?
Thanks.

Thank you Matt Stannett and Anand Sowmithiran. Posting your suggestions as an answer to help other community members.
You can achieve this using a Durable Azure Function with the function chaining pattern.
You need to automate initiating the GitLab backup and pushing it as a blob to Azure Blob Storage; this part needs to run on-prem, so you could use PowerShell to do it. Then your Azure Blob Storage trigger function can pick up the backup blob and perform the rest of the steps by initiating a durable function, which uses the Azure SDK to restore the backup inside the Azure VM.
You can refer to Create your first durable function in Python, Storage Providers for Azure Durable Functions, Azure Functions: How to manage Durable Functions with Blob Triggers?, and Automate Linux VM Customization Tasks Using CustomScript Extension
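For reference, here is a minimal sketch of the chaining pattern in Python (v1 programming model). The starter is a blob trigger, and the activity names (copy_blob_to_vm, restore_backup_on_vm) are placeholders for your own steps, not real functions:

# starter/__init__.py - blob trigger that kicks off the orchestration (sketch)
import azure.durable_functions as df
import azure.functions as func

async def main(backup: func.InputStream, starter: str):
    client = df.DurableOrchestrationClient(starter)
    await client.start_new("orchestrator", client_input=backup.name)

# orchestrator/__init__.py - chains the remaining steps so each waits for the previous one
import azure.durable_functions as df

def orchestrator_function(context: df.DurableOrchestrationContext):
    backup_blob = context.get_input()                                      # blob name passed by the starter
    vm_path = yield context.call_activity("copy_blob_to_vm", backup_blob)  # placeholder activity
    result = yield context.call_activity("restore_backup_on_vm", vm_path)  # placeholder activity
    return result

main = df.Orchestrator.create(orchestrator_function)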

Related

Copying new Azure blobs to different container

We have 5 vendors that are SFTPing files to Blob Storage. When the files come in, I need to copy them to another container and create a folder in that container named with the date to put the files in. From the second container, I need to copy the files to a file share on an Azure server. What is the best way to go about this?
I'm very new to Azure and unsure what the best way is to accomplish what I am being asked to do. Any help would be greatly appreciated.
I'd recommend using Azure Synapse for this task. It will let you move data to and from different storage securely and with little-to-no code.
Specifically, I'd put a blob storage trigger on the SFTP blob container so that the Synapse Pipeline to move data automatically runs when your vendors drop their files.
Note that when you look for documentation on how to do things in Synapse, most of the time the Azure Data Factory documentation will also be applicable, since most of Data Factory's functionality is now in Synapse.
The ADF and Synapse YouTube channels are excellent resources, as well as the Microsoft Learn courses on Data Engineering.
I need to copy them to another container and create a folder in that container named with the date to put the files in.
You can use AzCopy to copy files to another container by using a SAS token.
command:
azcopy copy 'https://<storage account>.blob.core.windows.net/test/files?SAS' 'https://<storage account>.blob.core.windows.net/mycontainer/12-01-2023?SAS' --recursive
I need to copy the files to a file share on an Azure server
You can also copy the files from a container to a file share by using AzCopy.
Command:
azcopy copy 'https://<storage account>.blob.core.windows.net/test?SAS' 'https://<storage account>.file.core.windows.net/fileshare/12-01-2023?SAS' --recursive
You can get the SAS token through portal:
Go to the portal -> your storage account -> Shared access signature -> check the resource types -> click Generate SAS and connection string.
AzCopy is probably a good way to move all or part of the blobs from one container to another, but I would suggest automating it with Azure Functions. It can be automated by triggering an Azure Function every time a blob or set of blobs (Azure can process a batch of blobs) is uploaded to the source container.
Note that, depending on the quantity of blobs to be moved and the time it could take, durable functions are the better solution for avoiding timeout exceptions: a durable function returns an immediate response but keeps running in the background.
Consider this article for a good approach to this solution:
https://build5nines.com/azure-functions-copy-blob-between-azure-storage-accounts-in-c/
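If you go the Functions route, a minimal Python sketch of that blob-triggered copy could look like the following. The destination container name "archive" and the date-folder layout are assumptions made to match the question above:

import datetime
import os
import azure.functions as func
from azure.storage.blob import BlobServiceClient

def main(newblob: func.InputStream):
    # Build the destination path: assumed container "archive" plus a virtual folder named with today's date
    date_folder = datetime.date.today().strftime("%m-%d-%Y")
    blob_name = newblob.name.split("/", 1)[1]       # drop the source container prefix from the trigger name
    service = BlobServiceClient.from_connection_string(os.environ["AzureWebJobsStorage"])
    dest = service.get_blob_client("archive", f"{date_folder}/{blob_name}")
    # Fine for small files; for large blobs prefer dest.start_copy_from_url(<source SAS URL>)
    dest.upload_blob(newblob.read(), overwrite=True)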

copy file to Azure File Share (Azure Storage)

In addition to my question Code sync from Azure Scale Set VM To Azure Storage: is there any way to copy files from one particular Scale Set VM to an Azure File Share (Azure Storage) through ADO pipelines? Since it is a Scale Set, I can't push from the same VM every time. For example, there may be 100 VMSS servers in the pool; when I push code through the pipeline, it should pick one server from the pool and push the code from that one. Is this possible?
No, we do not have this kind of built-in task. There is an Azure File Copy task, which copies application files and other artifacts to Microsoft Azure storage blobs or virtual machines (VMs).
When the target is Azure VMs, the files are first copied to an automatically generated Azure blob container and then downloaded into the VMs. The container is deleted after the files have been successfully copied to the VMs.
As a workaround, please use a custom script to copy files to Azure File Storage. In other words, if you are able to do the same thing locally, you should also be able to achieve it through an Azure DevOps pipeline.
You may have to build your own extension or use scripts in the pipeline. Take a look at this similar third-party task: AzureDevOpsReleaseToFileStorage.
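As an illustration of the custom-script idea, here is a small Python sketch that a pipeline step could run on whichever VMSS agent is picked. The share name, file paths, and the STORAGE_CONNECTION_STRING secret variable are placeholders:

# upload_to_share.py - run as a script step on the selected agent (sketch)
import os
from azure.storage.fileshare import ShareFileClient

conn_str = os.environ["STORAGE_CONNECTION_STRING"]      # assumed pipeline secret variable
local_file = "app/build.zip"                            # hypothetical artifact built on this agent

file_client = ShareFileClient.from_connection_string(
    conn_str, share_name="code-share", file_path="releases/build.zip")
with open(local_file, "rb") as data:
    file_client.upload_file(data)                       # uploads the local file to the Azure file share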

Azure Function, Blob Trigger : Copy data file from blob to a server

My requirement is to create a blob-triggered Azure Function that fires when a specific file (say, trigger.txt) is copied into a container. Once it is triggered, the PowerShell function should copy this trigger.txt onto an Azure Windows VM in the same subscription and resource group.
I can see that the function is triggered if an example trigger.txt file is present.
What do I need to do to copy this blob from the container to the Azure VM? I see that azcopy does not work.
You could consider one of the following approaches
Using Azure File Storage
Mount Azure File Storage onto Windows VM
The Azure Function would create a new file in File Storage using the content from the input blob. Since there is no binding support for File Storage, you will have to use the File Storage SDK directly.
You could also consider using Logic Apps which has connectors for both blob storage and file storage. But do note that there are file size restrictions that you may run into depending on your use case.
And finally, you might want to consider using blob events to reduce on polling costs for both approaches.
Use PowerShell Functions to remote into the VM, as suggested by @silent in the comments, and run azcopy to download the file
There is an official doc which showcases how you can run commands using remoting that you can refer to. The doc takes the context of remoting to an on-premises machine via a hybrid connection which you can ignore for your use case.
Also, if your VM doesn't have a public endpoint, you will have to use Premium Functions which support VNET Integration.
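To illustrate the second option, here is a rough Python sketch that remotes into the VM with the Run Command API and runs azcopy there. The resource names and the SAS URL are placeholders, and the function's identity would need permissions on the VM:

from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient
from azure.mgmt.compute.models import RunCommandInput

compute = ComputeManagementClient(DefaultAzureCredential(), "<subscription-id>")

# azcopy runs on the VM itself, pulling the triggering blob down with a SAS URL supplied by the function
script = ['azcopy copy "https://<storage account>.blob.core.windows.net/uploads/trigger.txt?<SAS>" "C:\\incoming\\trigger.txt"']

poller = compute.virtual_machines.begin_run_command(
    "<resource-group>", "<vm-name>",
    RunCommandInput(command_id="RunPowerShellScript", script=script))
print(poller.result().value[0].message)                  # status / output of the remote command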

MarkLogic - Can we configure scheduled backup on Azure Blob

We want to configure schedule backup for database.
We have set storage account and access key for Azure Blob in security-> Credentials for Azure.
In the backup directory field, we enter azure://containerName.
This container exists in the given storage account.
In response it says
The directory azure://backup/ does not exist on host ml01. Would you like to try and create it?
Can anybody please help me to configure?
It sounds like you want to create a job which backs up the data of your MarkLogic database to Azure Blob Storage, triggered on a time schedule. Right? I do not completely understand what you said, so here are just my suggestions below.
I'm not familiar with MarkLogic, but after reading the tag info for marklogic, I see it supports client APIs for Node and Java, so I think you can write a Node.js script or a Java program to do the backup work.
As far as I know, there are normally three ways to deploy it on Azure if you implement the backup programmatically.
You can deploy it as a WebJob with a cron expression to trigger the backup work; please refer to the official document Run Background tasks with WebJobs in Azure App Service.
You can deploy it as a Web API on Azure using a service like Web App, and use Azure Scheduler to trigger it.
You can deploy it as an Azure Function and trigger it with a timer trigger; please refer to the official document Create a function in Azure that is triggered by a timer (see the sketch below).
Of course, there are other services that can help meet your needs. I don't know what is best for you. If you have any concerns, please feel free to let me know.
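For the timer trigger option, a rough Python sketch could look like the following. The Management API endpoint, the request body, and the database name are my assumptions from the MarkLogic REST documentation, so verify them against your version:

# Timer-triggered Azure Function (sketch): ask MarkLogic to back up straight to the Azure Blob directory
import os
import requests
from requests.auth import HTTPDigestAuth
import azure.functions as func

def main(mytimer: func.TimerRequest) -> None:
    host = os.environ["ML_HOST"]                          # e.g. ml01; 8002 is the default Management API port
    resp = requests.post(
        f"http://{host}:8002/manage/v2/databases/Documents",         # database name is a placeholder
        json={"operation": "backup-database",
              "backup-dir": "azure://<containername>/<folder>"},     # assumed request body
        auth=HTTPDigestAuth(os.environ["ML_USER"], os.environ["ML_PASSWORD"]))
    resp.raise_for_status()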
I was running into similar issue and was able to resolve it.
Create something inside your container, within a folder, so your structure looks like this:
azure://containername/folder
I was able to resolve my issue by doing that.

Launch Azure Container Service on Upload to Blob Storage

I have a use case where I'd like to launch a job on an Azure Container Service cluster to process a file being uploaded to Blob storage. I know that I can trigger an Azure Functions instance from the upload, but I haven't been able to find examples in the documentation of starting a job within Functions.
This diagram illustrates the AWS equivalent of what I want:
Thanks!
The Azure Event Grid feature is what you need. It is still in preview, but you can subscribe to the Blob Created event. You can set the subscriber endpoint to an Azure Function that puts a message in a queue to trigger your job, or you can expose a service on your cluster that will accept the request and do whatever you need done.
Microsoft provides a guide at https://learn.microsoft.com/en-us/azure/storage/blobs/storage-blob-event-quickstart?toc=%2fazure%2fevent-grid%2ftoc.json#create-a-message-endpoint
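For the first variant (an Azure Function that puts a message in a queue), a minimal Python sketch of the Event Grid subscriber might look like this. The queue output binding name and the message shape are placeholders you would declare in function.json:

import json
import azure.functions as func

def main(event: func.EventGridEvent, outmsg: func.Out[str]) -> None:
    data = event.get_json()                               # Blob Created event payload
    blob_url = data.get("url")                            # URL of the uploaded blob
    outmsg.set(json.dumps({"blobUrl": blob_url}))         # queue message your cluster job consumes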
