Notification about changes in Azure Storage

Currently, I'm working on a task to sync files in Azure with file storage in a custom data center. I need a way to get a notification when something changes inside Azure File Storage.
For example, in AWS I can configure notifications through a Lambda function. Is there a similar way to do this in Azure?

As of today, this feature is not available, as an Azure Files binding is not supported. There is an open ticket on GitHub regarding this: https://github.com/Azure/azure-webjobs-sdk-extensions/issues/14. It is available for Blob Storage, though (that's why I asked in my comment).
For a list of available bindings, please see: https://learn.microsoft.com/en-us/azure/azure-functions/functions-triggers-bindings.
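If the files can live in Blob Storage rather than Azure Files, a blob-triggered Azure Function is the usual pattern. A minimal C# sketch, assuming the in-process model (Microsoft.Azure.WebJobs) and an example container named files-to-sync; the sync logic itself is a placeholder:

using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class SyncOnBlobChange
{
    // Fires whenever a blob is added or updated in the "files-to-sync" container (example name).
    [FunctionName("SyncOnBlobChange")]
    public static void Run([BlobTrigger("files-to-sync/{name}")] Stream blob, string name, ILogger log)
    {
        log.LogInformation($"Blob changed: {name} ({blob.Length} bytes)");
        // Placeholder: push the changed file to the on-premises file store here.
    }
}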

Related

Search Azure File Share

Is there a way to use Azure Search against Azure File Shares? I only see blob storage as an option. We have on-prem servers that sync files to Azure File Shares, and we would like to search inside those files in a web application.
At the moment, there's no way to do this unless you manually query and push file content to your Azure Cognitive Search index. In the future, the hope is that you'll be able to trigger an Azure Function using this type of binding, which would make your life easier. You can follow / vote for this feature at the following link:
https://github.com/Azure/azure-webjobs-sdk-extensions/issues/14
Per the UserVoice page for Azure Search: https://feedback.azure.com/forums/263029-azure-search/suggestions/14274261-indexer-for-azure-file-shares#{toggle_previous_statuses}, an Azure File indexer is available in private preview (in fact, it has been in this stage for almost two years now :)).
The Search team would like you to reach out to them in case you're interested.
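For the "manually push" route, a minimal C# sketch using the Azure.Search.Documents library might look like the following. The endpoint, index name, admin key, and document fields are placeholders that must match your own search service and index schema ("id" is assumed to be the key field):

using System;
using System.IO;
using Azure;
using Azure.Search.Documents;

public static class FileShareIndexer
{
    public static void IndexFile(string localPath)
    {
        // Placeholder endpoint, index name and key; replace with your own service values.
        var client = new SearchClient(
            new Uri("https://<your-service>.search.windows.net"),
            "fileshare-index",
            new AzureKeyCredential("<admin-key>"));

        var document = new
        {
            id = Path.GetFileNameWithoutExtension(localPath),
            fileName = Path.GetFileName(localPath),
            content = File.ReadAllText(localPath) // real code would extract text per file type
        };

        client.UploadDocuments(new[] { document });
    }
}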

Moving locally stored documents to Azure

I want to spike whether Azure and the cloud are a good fit for us.
We have a website where users upload documents to our currently hosted website.
Every document has an equivalent record in a database.
I am using terraform to create the azure infrastructure.
What is my best way of migrating the documents from the local file path on the server to azure?
Should I be using File Storage or Blob Storage? I am confused about the difference.
Is there anything in terraform that can help with this?
Based on your comments, I would recommend storing them in Blob Storage. This service is suited for storing and serving unstructured data like files and images. There are many other features like redundancy, archiving etc. that you may find useful in your scenario.
File Storage is more suitable in Lift-and-Shift kind of scenarios where you're moving an on-prem application to the cloud and the application writes data to either local or network attached disk.
You may also find this article useful: https://learn.microsoft.com/en-us/azure/storage/common/storage-decide-blobs-files-disks
UPDATE
Regarding uploading files from local computer to Azure Storage, there are actually many options available:
Use a storage explorer tool such as Microsoft's Azure Storage Explorer.
Use AzCopy command-line tool.
Use Azure PowerShell Cmdlets.
Use Azure CLI.
Write your own code using any of the available storage client libraries, or consume the REST API directly (a minimal sketch follows below).
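As an illustration of that last option, here is a minimal C# sketch using the Azure.Storage.Blobs library; the connection string, container name, and local folder are placeholders you'd replace with your own:

using System.IO;
using Azure.Storage.Blobs;

class UploadDocuments
{
    static void Main()
    {
        // Placeholder values: supply your own connection string, container and source folder.
        var connectionString = "<storage-connection-string>";
        var localFolder = @"C:\site\uploads";

        var container = new BlobContainerClient(connectionString, "documents");
        container.CreateIfNotExists();

        foreach (var path in Directory.EnumerateFiles(localFolder))
        {
            // One blob per local file, named after the file itself.
            var blob = container.GetBlobClient(Path.GetFileName(path));
            blob.Upload(path, overwrite: true);
        }
    }
}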

MarkLogic - Can we configure scheduled backup on Azure Blob

We want to configure a scheduled backup for a database.
We have set the storage account and access key for Azure Blob under Security -> Credentials for Azure.
In the backup directory field, we entered azure://containerName.
This container exists in the given storage account.
The response says:
The directory azure://backup/ does not exist on host ml01. Would you like to try and create it?
Can anybody please help me configure this?
It sounds like you want to create a job that backs up the data of your MarkLogic database to Azure Blob Storage, triggered on a schedule. Is that right? I don't completely understand what you said, so here are my suggestions.
I'm not familiar with MarkLogic, but I think you can write a Node.js script or a Java program to do the backup work; after reading the tag info for marklogic, I see it supports client APIs for Node and Java.
As far as I know, there are normally three ways to deploy this on Azure if you implement the backup in code (a timer-triggered sketch follows below):
You can deploy it as a WebJob with a cron expression to trigger the backup work; please refer to the official document Run Background tasks with WebJobs in Azure App Service.
You can deploy it as a Web API on Azure using a service such as a Web App, and use Azure Scheduler to trigger it.
You can deploy it as an Azure Function with a timer trigger; please refer to the official document Create a function in Azure that is triggered by a timer.
Of course, there are other services that can help meet your needs. I don't know what the best fit for you is. If you have any concerns, please feel free to let me know.
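For the Azure Function route, a minimal C# sketch of a timer-triggered function might look like this; the schedule is an example, and the MarkLogic backup call is a hypothetical placeholder that you would replace with your own client-API invocation:

using System;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class NightlyMarkLogicBackup
{
    // Runs every day at 02:00 UTC (CRON format: second minute hour day month day-of-week).
    [FunctionName("NightlyMarkLogicBackup")]
    public static void Run([TimerTrigger("0 0 2 * * *")] TimerInfo timer, ILogger log)
    {
        log.LogInformation($"Backup started at {DateTime.UtcNow:O}");

        // Hypothetical helper: call the MarkLogic client API here to start a
        // database backup targeting your Blob container, e.g.:
        // StartMarkLogicBackup("azure://containername/folder");
    }
}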
I was running into a similar issue and was able to resolve it.
Create something inside your container within a folder, so your path looks like this:
azure://containername/folder
I was able to resolve my issue by doing that.

Azure Cloud Service (Classic) - Any way to log Diagnostic.Trace logs to BLOB storage

I've been asked to change an old Azure Cloud Service worker's logging to the System.Diagnostics.Trace logging style of logging. I've done that and now I'm about ready to deploy it to azure.
The client requirement is that these logs should appear in blob storage, similar to how the more modern app service logs can be configured to write their diagnostics to blob storage. There is an expectation that logs can be batched up and uploaded periodically (perhaps time or number of lines based).
Is there a NuGet package, library, or configuration I should enable to connect the application to blob storage? I've spent about 20 minutes searching here and online for a solution, but the information seems to mainly talk about writing logs to Table Storage.
Edit: More detail:
This is an existing app (C# .Net Framework 4.5) that used to use an external logging service.
I assumed (incorrectly, I think) that the logging to blob storage was something I could configure in the Azure Portal.
As things are right now, NO log file of any kind is generated, but when I run the code in Visual Studio, I can see some Output from the logging statements
I have updated the code to use a standard (custom) logging system that eventually boils down to statements like the one below:
Trace.TraceInformation($"DEBUG: {message}");
Here are some links I found with related information:
Streaming from command line
Trace listener question
Adding Trace to existing website
Performance Impact of Logging
Smarx Library
The logging is configured by the diagnostics.wadcfgx file which you can see in your solution.
This holds all of the diagnostic information that you want to collect. This can be controlled via the "Properties" of the Web\Worker role (right-click -> Properties).
From there, there is also an option to specify the storage account.
This isn't always ideal if you are deploying to multiple environments, so you should be able to alter the configuration from the Azure Portal, by downloading and uploading new configuration, following these instructions.
For logging to blob storage, think of it as uploading existing files to blob storage. If your current app creates log files, you can use Put Blob or Append Block operations to add those files to blob storage, which means you must interact with the storage SDK to make these transactions. You could also leverage Logic Apps, which has connectors to blob storage and can perform actions based on specific triggers (timestamps and other conditions).
If you would like to see Azure's own generated logs in Azure Storage, you'll have to enable Azure diagnostics, but those logs pertain to the storage account itself, not your app.
Since you mentioned that you can see the output, you have to capture that output as an object (e.g. a text file) and then upload it to the storage account. You can find SDK information for C# here. I hope this helps.
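As a rough illustration of that last point, here is a minimal sketch of a custom TraceListener that appends trace output to an append blob, using the classic Microsoft.WindowsAzure.Storage package (a reasonable fit for a .NET Framework 4.5 worker). The container and blob names are examples, and a real implementation would buffer lines and flush them in batches rather than appending on every write:

using System;
using System.Diagnostics;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

// Appends each trace line to an append blob in a "worker-logs" container (example name).
public class AppendBlobTraceListener : TraceListener
{
    private readonly CloudAppendBlob _blob;

    public AppendBlobTraceListener(string connectionString)
    {
        var account = CloudStorageAccount.Parse(connectionString);
        var container = account.CreateCloudBlobClient().GetContainerReference("worker-logs");
        container.CreateIfNotExists();

        _blob = container.GetAppendBlobReference($"trace-{DateTime.UtcNow:yyyy-MM-dd}.log");
        if (!_blob.Exists())
            _blob.CreateOrReplace();
    }

    public override void Write(string message) => _blob.AppendText(message);

    public override void WriteLine(string message) => _blob.AppendText(message + Environment.NewLine);
}

You would then register it once at role startup, e.g. Trace.Listeners.Add(new AppendBlobTraceListener(connectionString)); so that the existing Trace.TraceInformation calls flow to the blob.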

How to make code on an Azure VM trigger from storage blob change (like Functions do)

I've got some image processing code that I need to run in Azure. It's perfect for an Azure Function, but unfortunately requires a component with a complex installation procedure and therefore will need to run in a VM.
However, I'd like to make it behave much like an Azure Function, and trigger whenever new items arrive in blob storage.
My question is: Does Azure provide me with any handy way of doing this, or do I have to write code that polls the blob storage looking for new items?
Have a look at the Azure WebJobs SDK. It shares its programming model with Functions, but you can host it in any .NET application; see its Blob Trigger support (a sketch follows below).
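A minimal sketch of hosting the WebJobs SDK (3.x) in a console app running on the VM, assuming the Microsoft.Azure.WebJobs.Extensions.Storage package, an AzureWebJobsStorage connection string in configuration, and an example container named images:

using System.IO;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Logging;

public class Program
{
    public static async Task Main()
    {
        // Host the WebJobs SDK inside a plain console app on the VM.
        var builder = new HostBuilder()
            .ConfigureWebJobs(b =>
            {
                b.AddAzureStorageCoreServices();
                b.AddAzureStorage(); // blob and queue triggers
            })
            .ConfigureLogging(b => b.AddConsole());

        await builder.Build().RunAsync();
    }
}

public class Functions
{
    // Fires whenever a new blob lands in the "images" container (example name).
    public void ProcessImage([BlobTrigger("images/{name}")] Stream blob, string name, ILogger log)
    {
        log.LogInformation($"Processing blob: {name}, {blob.Length} bytes");
        // Placeholder: run the image-processing component installed on the VM here.
    }
}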
