MarkLogic - Can we configure scheduled backup on Azure Blob?

We want to configure a scheduled backup for a database.
We have set the storage account and access key for Azure Blob under Security -> Credentials for Azure.
In the backup directory field, we enter azure://containerName.
This container exists in the given storage account.
The response says:
The directory azure://backup/ does not exist on host ml01. Would you like to try and create it?
Can anybody please help me configure this?

It sounds like you want to create a job that backs up the data of your MarkLogic database to Azure Blob Storage, triggered on a schedule. Right? I don't completely understand your setup, so here are just my suggestions.
I'm not familiar with MarkLogic, but after reading the tag info for marklogic I see it supports client APIs for Node.js and Java, so you could write a Node.js script or a Java program to do the backup work.
As far as I know, there are normally three ways to deploy it on Azure once you have the backup working in code.
You can deploy it as a WebJob with a cron expression to trigger the backup work; please refer to the official document Run Background tasks with WebJobs in Azure App Service.
You can deploy it as a Web API on Azure using a service like Web App, and use Azure Scheduler to trigger it.
You can deploy it as an Azure Function and trigger it with a timer trigger; please refer to the official document Create a function in Azure that is triggered by a timer.
Of course, there are other services that can help meet your needs. I don't know which is best for you. If you have any concerns, please feel free to let me know.
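For the timer-trigger option, a minimal Python sketch could call MarkLogic's Management REST API (port 8002) to start the backup. This follows the documented backup-database operation, but the host name ml01, the database name, and the container path are placeholders, and a real function would attach digest-auth credentials from app settings.

```python
# Sketch: a timer-triggered Azure Function asking MarkLogic to start a
# backup via its Management REST API. Host, database, and backup-dir
# values below are example placeholders, not taken from the question.
import json
import urllib.request

def backup_request(database: str, backup_dir: str):
    """Build the Management API URL and JSON body for a backup-database call."""
    url = f"http://ml01:8002/manage/v2/databases/{database}"
    body = json.dumps({
        "operation": "backup-database",
        "backup-dir": backup_dir,   # e.g. azure://containername/folder
    }).encode("utf-8")
    return url, body

def main(mytimer) -> None:
    """Azure Functions timer-trigger entry point."""
    url, body = backup_request("Documents", "azure://containername/folder")
    req = urllib.request.Request(
        url, data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    # MarkLogic uses digest auth by default; a real function would add an
    # auth handler and credentials before opening the request.
    urllib.request.urlopen(req)
```

The heavy lifting (streaming data to Blob Storage) stays inside MarkLogic; the function only kicks the job off on schedule.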

I was running into a similar issue and was able to resolve it.
Create something inside your container, within a folder, so your path looks like this:
azure://containername/folder
I was able to resolve my issue by doing that.
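One reason this fix works: Blob Storage has no real directories, so a "folder" exists only as a blob-name prefix, and uploading a small placeholder blob under the prefix makes it visible. A hedged Python sketch (the container and folder names are examples, and the SDK upload is only indicated in a comment):

```python
# Helpers for making an azure://containername/folder path resolvable.
# Blob Storage "folders" are just name prefixes; a placeholder blob
# makes the prefix exist. Names below are illustrative only.
def placeholder_blob(folder: str, name: str = ".keep") -> str:
    """Blob name whose presence makes the folder prefix exist."""
    return f"{folder.strip('/')}/{name}"

def backup_dir(container: str, folder: str) -> str:
    """The value to enter in the MarkLogic backup-directory field."""
    return f"azure://{container}/{folder.strip('/')}"

# With the azure-storage-blob package the upload would be roughly:
#   from azure.storage.blob import BlobClient
#   BlobClient.from_connection_string(conn, "containername",
#       placeholder_blob("folder")).upload_blob(b"")
```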

Related

Can I use Azure Functions to automate Linux tasks?

Here is a diagram explaining what I do manually and what I want to automate:
connect to an on-premises GitLab CE instance
start a GitLab backup and wait for it to finish
copy the backup files to Azure Blob and wait for it to finish
copy the backup files from Blob to an Azure VM and wait for it to finish
connect to the Azure VM and restore the backup
I want to automate these tasks with an Azure tool.
Can someone point me in a direction where I can start looking?
Can Azure Functions perform those kinds of tasks?
Thanks.
Thank you Matt Stannett and Anand Sowmithiran. Posting your suggestions as an answer to help other community members.
You can achieve this using a Durable Azure Function with the function chaining pattern.
You need to automate the initiation of the GitLab backup and push it as a blob to Azure Blob Storage - this part needs to run on-prem, so you could use PowerShell to do this. Then your Azure Blob Storage trigger function can pick up the backup blob and do the rest of the steps by initiating a durable function, which has to use the Azure SDK to restore the backup inside the Azure VM.
You can refer to Create your first durable function in Python, Storage Providers for Azure Durable Functions, Azure Functions: How to manage Durable Functions with Blob Triggers?, and Automate Linux VM Customization Tasks Using CustomScript Extension
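The chained steps can be sketched in plain Python so the flow is runnable locally. In a real Durable Function each step would be an activity invoked with context.call_activity inside the orchestrator; the step names here are made up for illustration.

```python
# Function-chaining sketch: run each step in order, feeding each result
# into the next, exactly as a Durable Functions orchestrator would with
# `result = yield context.call_activity(step, result)`.
STEPS = [
    "start_gitlab_backup",   # runs on-prem (e.g. via PowerShell/SSH)
    "copy_backup_to_blob",   # push the backup archive to Blob Storage
    "copy_blob_to_vm",       # pull the archive onto the Azure VM
    "restore_backup_on_vm",  # run the GitLab restore inside the VM
]

def run_chain(call_activity):
    """Invoke each step in sequence, waiting for each to finish."""
    result = None
    for step in STEPS:
        result = call_activity(step, result)
    return result

# Fake "activity" so the sketch runs locally: it just records the order.
log = []
run_chain(lambda step, _input: log.append(step) or step)
```

The point of the chaining pattern is exactly the "wait for it to finish" requirement in the question: each activity completes before the next one starts.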

Can Azure azcopy pull a file from an on-prem server?

I am very new to Azure, so I am posting this query. I just want to copy a flat file from an on-prem server to Azure Blob storage on a daily basis. I thought azcopy would be a good solution for this requirement. But the challenge is that we cannot configure the azcopy utility on that on-prem server and cannot schedule any cron job, as the server is owned by our client. So we thought of running the azcopy utility from one of the Unix VMs in the Azure cloud to pull the file and transfer it to Azure Blob storage.
It is not clear to me from the documentation whether azcopy can work in a pull manner or not.
Can anyone help me understand whether my approach will work or not? And if not, please give me some idea of how to do this.
Please see below diagram depicting what I want to achieve.
AzCopy is not designed to be used in that way.
If there's any way for you to access the file, that will be up to your customer and you'd have to ask their network security folks how they would expect it to be done. But most customers (or at least all the ones I worked with before joining Microsoft) prefer push not pull models, since the security for push is more standard and easier to set up.
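In the push model described here, azcopy runs on the machine that has the file and pushes it to Blob Storage. A small Python sketch that assembles the documented `azcopy copy <source> <destination-url>` command (the account, container, and SAS token below are placeholders, and actually running it requires azcopy installed on that host):

```python
# Build the azcopy push command: upload a local file to a container,
# authenticating with a SAS token appended to the destination URL.
import subprocess

def azcopy_push_cmd(local_path: str, account: str, container: str, sas: str):
    """Return the argv list for `azcopy copy <file> <container-url?sas>`."""
    dest = f"https://{account}.blob.core.windows.net/{container}?{sas}"
    return ["azcopy", "copy", local_path, dest]

cmd = azcopy_push_cmd("/data/export/daily.csv", "myaccount", "backups",
                      "sv=...&sig=...")   # SAS token elided on purpose
# subprocess.run(cmd, check=True)   # uncomment on a host with azcopy
```

If the client's server cannot run azcopy at all, the alternative is a pull over whatever protocol the client does expose (SFTP, SMB), run from the Azure VM, followed by an azcopy or SDK upload from that VM.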

Is there any way to know when a new file gets into Azure File Storage?

I am trying to download and perform some action whenever there is a new file in File Storage,
but I have found no way to do it.
I tried mapping the drive onto a VM and using the inotifywait tool (failed to get notifications).
I tried a Logic App, but SFTP to the file share was not able to connect.
I have checked all over the internet but found nothing.
Can anyone suggest an alternative?
Thanks
I am trying to download and perform some action whenever there is a new file in file storage.
As far as I know, an Azure Storage file trigger is not supported right now. I also found related Azure feedback on this.
can anyone suggest any alternative?
Based on my experience, if Azure Blob Storage is an option, you could use a blob trigger instead. Or you could use a WebJob timer trigger to check whether there is any update on the storage file share.
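The timer-trigger polling idea can be sketched in plain Python: snapshot the file list on each run and diff it against the previous snapshot. The lister is injected here so the sketch runs locally; against a real share it would wrap list_directories_and_files from the azure-storage-file-share package (an assumption on my part, not something from the question).

```python
# Polling sketch: detect files that appeared since the last scheduled run.
def find_new_files(list_files, seen: set) -> list:
    """Return files new since the last poll, updating `seen` in place."""
    current = set(list_files())
    new = sorted(current - seen)
    seen |= current
    return new

seen = set()
find_new_files(lambda: ["a.csv"], seen)                  # first poll
new = find_new_files(lambda: ["a.csv", "b.csv"], seen)   # second poll
```

In a real WebJob, `seen` would need to be persisted between runs (for example in a small state blob), since each timer invocation starts fresh.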

How to make code on an Azure VM trigger on a storage blob change (like Functions do)

I've got some image processing code that I need to run in Azure. It's perfect for an Azure Function, but unfortunately it requires a component with a complex installation procedure and therefore will need to run in a VM.
However, I'd like to make it behave much like an Azure Function, and trigger whenever new items arrive in blob storage.
My question is: Does Azure provide me with any handy way of doing this, or do I have to write code that polls the blob storage looking for new items?
Have a look at the Azure WebJobs SDK. It shares its API model with Functions, but you can host it in any .NET application. See Blob Trigger.

Notification about changes in azure storage

Currently, I am working on a task to sync files in Azure with file storage in a custom data center. I need a way to get a notification if something changes inside Azure File Storage.
For example, in AWS I can configure notifications through a Lambda function. Is there any similar way to do this in Azure?
As of today, this feature is not available, since an Azure Files binding is not supported. There is an open ticket on GitHub regarding this: https://github.com/Azure/azure-webjobs-sdk-extensions/issues/14. It is available for Blob Storage, though (that's why I asked in my comment).
For a list of available bindings, please see this: https://learn.microsoft.com/en-us/azure/azure-functions/functions-triggers-bindings.
