Saving JSON data from an Azure Function

I have integrated Azure Service Bus with an Azure Function to receive a message and then update a SQL DB.
I want to save a JSON document, built from a query against the same database, to an Azure Blob.
My questions are:
I can save the JSON by calling the Azure Blob REST API. Is calling one service from another service like this a cloud-native pattern?
Is sending the JSON to Azure Service Bus and having another Azure Function save the data to the Blob an optimal approach?
Is there a resource other than Azure Blob for saving the JSON data from an Azure Function that would make the integration easier?

There are many ways of saving a file to Azure Blob Storage. If you want to save over HTTP, use the Azure Blob REST API. You can also use the Microsoft Azure Storage SDK in your application; there are storage clients for many languages (.NET, Python, JavaScript, Go, etc.). And if you are using an Azure Function, you can use an output binding.
It depends... Blob Storage is not the only location where you can save JSON; you could also save the JSON straight into a SQL database, for instance.
The easiest way to save from an Azure Function is to use the Azure Blob Storage output binding for Azure Functions.
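For the output-binding route, a minimal sketch in the in-process C# model might look like this (the queue name, container, connection setting names, and the placeholder JSON are assumptions for illustration, not anything from the question):

```csharp
// Sketch only: a Service Bus-triggered function that writes a JSON string to a
// blob through an output binding. Queue/container/connection names and the
// placeholder JSON are assumed.
using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class SaveJsonToBlob
{
    [FunctionName("SaveJsonToBlob")]
    public static void Run(
        [ServiceBusTrigger("incoming-queue", Connection = "ServiceBusConnection")] string message,
        [Blob("json-output/{rand-guid}.json", FileAccess.Write, Connection = "AzureWebJobsStorage")] out string jsonBlob,
        ILogger log)
    {
        // 1) Update the SQL DB from the message (omitted here).
        // 2) Build the JSON from the follow-up query; assigning it to the
        //    output-binding parameter is all that is needed to save the blob.
        jsonBlob = $"{{ \"status\": \"processed\", \"messageLength\": {message.Length} }}";
        log.LogInformation("JSON written to blob storage via output binding.");
    }
}
```

The appeal of this approach is that the function body stays free of storage plumbing; the connection details live in app settings and the runtime handles the upload.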

Related

How to get and parse an .xlsx file in an Azure Function called from an Azure Logic App?

I am working on a logic app that gets a file from Azure File Shares, adds it to Azure Blob Storage, and then calls an Azure Function that receives that blob (an .xlsx file).
After the blob is created, the Azure Function will parse the data in the blob and insert it into a MS Dynamics CRM entity.
My question is: how can I access the blob in the Azure Function and parse it so I can get the data that will be stored in the entity?
I have successfully created the logic app that performs the steps mentioned above.
From your screenshot, it seems the function your logic app calls is HTTP-triggered, so you can use the Azure Functions blob input binding to access the blob in storage:
https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-storage-blob-input?tabs=csharp
(The blob name comes in as JSON, so the binding can pick the name up directly.)
Regarding parsing, there are plenty of code samples that can be used for reference, so there is no need to include one here. If you need one, let me know (and please tell me which language you are using).
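As a rough illustration of that binding (not the asker's actual code), the sketch below uses an HTTP-triggered function in the in-process C# model: the JSON body carries the blob name, the blob input binding resolves it, and ClosedXML reads the sheet. The "uploads" container, the blobName property, the column layout, and the choice of ClosedXML are all assumptions.

```csharp
// Rough sketch: HTTP trigger + blob input binding + ClosedXML.
// The "uploads" container, "blobName" JSON property and column layout are assumed.
using System.IO;
using System.Linq;
using ClosedXML.Excel;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public class BlobInfo
{
    public string blobName { get; set; }
}

public static class ParseXlsxBlob
{
    [FunctionName("ParseXlsxBlob")]
    public static IActionResult Run(
        [HttpTrigger(AuthorizationLevel.Function, "post")] BlobInfo info,
        [Blob("uploads/{blobName}", FileAccess.Read)] Stream xlsxBlob,   // resolved from the JSON body
        ILogger log)
    {
        using var workbook = new XLWorkbook(xlsxBlob);
        var sheet = workbook.Worksheet(1);

        foreach (var row in sheet.RowsUsed().Skip(1))   // skip the header row
        {
            var name = row.Cell(1).GetString();
            var email = row.Cell(2).GetString();
            // Map the row to your Dynamics CRM entity and insert it here.
            log.LogInformation("Parsed row: {Name} / {Email}", name, email);
        }

        return new OkResult();
    }
}
```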

How to move events from Azure Event Hub to Azure Blob Storage using a Logic App?

I want to move events from Azure Event Hub to Azure Blob Storage using a Logic App. Can anyone suggest how to do this with the Logic App connector, triggers, and actions, with a designer example?
Note:
Events are JSON and need to be stored in Blob Storage.
You can start with a blank logic app, and use the search assistant to find what you're looking for.
Typing "event hub" brings up the Event Hubs trigger, where you can provide the connection by giving it a name.
Save the content in a variable.
You can then use the SaveInitialJsonToBlobStorage step to store this JSON in Blob Storage.

Logic Apps - Get Blob Content Using Path

I have an event-driven logic app (blob event) which reads a block blob using the path and uploads the content to Azure Data Lake. I noticed the logic app is failing with 413 (RequestEntityTooLarge) when reading a large file (~6 GB). I understand that Logic Apps has a limitation of 1024 MB - https://learn.microsoft.com/en-us/connectors/azureblob/ - but is there any workaround to handle this type of situation? The alternative solution I am working on is moving this step to an Azure Function and getting the content from the blob there. Thanks for your suggestions!
If you want to use an Azure Function, I would suggest you have a look at this article:
Copy data from Azure Storage Blobs to Data Lake Store
There is a standalone version of the AdlCopy tool that you can deploy to your Azure function.
So your logic app will call this function, which will run a command to copy the file from Blob Storage to your Data Lake Store. I would suggest using a PowerShell function.
Another option would be to use Azure Data Factory to copy file to Data Lake:
Copy data to or from Azure Data Lake Store by using Azure Data Factory
You can create a job that copies the file from Blob Storage:
Copy data to or from Azure Blob storage by using Azure Data Factory
There is a connector to trigger a Data Factory run from a Logic App, so you may not need an Azure Function, but it seems there are still some limitations:
Trigger Azure Data Factory Pipeline from Logic App w/ Parameter
You should consider using the Azure Files connector: https://learn.microsoft.com/en-us/connectors/azurefile/
It is currently in preview; the advantage it has over Blob is that it doesn't have a size limit. The link above includes more information about it.
For the benefit of others who might be looking for a solution of this sort:
I ended up creating an Azure Function in C#, as my design dynamically parses the blob name and creates the ADL structure based on it. I used chunked memory streaming for reading the blob and writing it to ADL, with multi-threading to address the Azure Functions timeout of 10 minutes.
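For anyone curious what that chunked-streaming idea looks like in code, here is a minimal, single-threaded sketch (not the answerer's actual function). It uses the Azure.Storage.Blobs and Azure.Storage.Files.DataLake SDKs, i.e. ADLS Gen2 rather than the Gen1 store and AdlCopy tool mentioned above; the connection strings, names, and 8 MB chunk size are assumptions.

```csharp
// Sketch: stream a large block blob into Azure Data Lake Storage Gen2 in
// fixed-size chunks so the whole file never has to fit in memory.
// Connection strings, container/file-system names and chunk size are assumed.
using System;
using System.IO;
using Azure.Storage.Blobs;
using Azure.Storage.Files.DataLake;

class BlobToDataLakeCopy
{
    static void Main()
    {
        var blobClient = new BlobClient(
            Environment.GetEnvironmentVariable("BLOB_CONNECTION_STRING"),
            "source-container", "bigfile.csv");

        var fileClient = new DataLakeServiceClient(
                Environment.GetEnvironmentVariable("ADLS_CONNECTION_STRING"))
            .GetFileSystemClient("target-filesystem")
            .GetFileClient("landing/bigfile.csv");

        fileClient.CreateIfNotExists();

        const int chunkSize = 8 * 1024 * 1024; // 8 MB per append
        var buffer = new byte[chunkSize];
        long offset = 0;

        using Stream source = blobClient.OpenRead();
        int read;
        while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
        {
            // Append each chunk at the current offset, then advance.
            using var chunk = new MemoryStream(buffer, 0, read, writable: false);
            fileClient.Append(chunk, offset);
            offset += read;
        }

        // Flush commits all appended data to the file.
        fileClient.Flush(position: offset);
    }
}
```

In a function, the same loop would run per-chunk rather than holding the whole 6 GB file in memory, which is what keeps it within the function's memory limits.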

SQL to Azure Blob to Logic App

I am new to Azure Functions. One of my tasks is to read data from a SQL database and upload that data as a CSV file to Azure Blob Storage using an Azure Function, and then use a Logic App to retrieve it. I am stuck on the SQL-to-file-to-Blob part.
I would start with the Azure Functions documentation. A quick internet search turned up this article on how to access a SQL database from an Azure Function: https://learn.microsoft.com/en-us/azure/azure-functions/functions-scenario-database-table-cleanup
Here is another article which shows how to upload content to blob storage: https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-storage-blob#output
Apply your learnings from both and you should be able to accomplish this task.
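Putting those two articles together, a minimal sketch in the in-process C# model could look like the following; the schedule, table, columns, connection-setting name, and the {DateTime} blob path are assumptions:

```csharp
// Sketch only: timer-triggered function that queries SQL and writes a CSV blob
// via an output binding. Schedule, table, columns and setting names are assumed.
using System;
using System.IO;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Data.SqlClient;
using Microsoft.Extensions.Logging;

public static class ExportSqlToCsv
{
    [FunctionName("ExportSqlToCsv")]
    public static async Task Run(
        [TimerTrigger("0 0 * * * *")] TimerInfo timer,                          // hourly
        [Blob("exports/{DateTime}.csv", FileAccess.Write)] TextWriter csvBlob,  // output CSV blob
        ILogger log)
    {
        var connectionString = Environment.GetEnvironmentVariable("SqlConnectionString");

        using var connection = new SqlConnection(connectionString);
        await connection.OpenAsync();

        using var command = new SqlCommand("SELECT Id, Name, CreatedAt FROM dbo.Orders", connection);
        using var reader = await command.ExecuteReaderAsync();

        await csvBlob.WriteLineAsync("Id,Name,CreatedAt");                      // header row
        while (await reader.ReadAsync())
        {
            await csvBlob.WriteLineAsync(
                $"{reader.GetInt32(0)},{reader.GetString(1)},{reader.GetDateTime(2):O}");
        }

        log.LogInformation("CSV export written to blob storage.");
    }
}
```

The Logic App can then pick up the exported blob from the container on its own schedule or via a blob trigger.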
What about creating a trigger instead, so the logic app starts when something happens in your DB? There is an interesting article here: https://flow.microsoft.com/en-us/blog/introducing-triggers-in-the-sql-connector/
You can then pass the information to a function to process the data and push the new CSV file to storage: https://learn.microsoft.com/en-us/azure/storage/blobs/storage-quickstart-blobs-dotnet?tabs=windows
Optionally, you might need to transform what the SQL trigger returns; for that, you can use Logic Apps to transform the input: https://learn.microsoft.com/en-us/azure/logic-apps/logic-apps-enterprise-integration-transform
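If the function writes the file with the SDK from that quickstart rather than a binding, the upload itself is only a few lines. This sketch assumes an "exports" container and reuses the AzureWebJobsStorage connection string:

```csharp
// Sketch of the quickstart approach: upload a generated CSV string with the
// Azure.Storage.Blobs SDK. Container name and connection setting are assumed.
using System;
using System.IO;
using System.Text;
using System.Threading.Tasks;
using Azure.Storage.Blobs;

class UploadCsvToBlob
{
    static async Task Main()
    {
        string csv = "Id,Name\n1,Contoso\n2,Fabrikam\n";   // placeholder CSV content

        var container = new BlobContainerClient(
            Environment.GetEnvironmentVariable("AzureWebJobsStorage"), "exports");
        await container.CreateIfNotExistsAsync();

        using var content = new MemoryStream(Encoding.UTF8.GetBytes(csv));
        await container.UploadBlobAsync($"export-{DateTime.UtcNow:yyyyMMddHHmmss}.csv", content);
    }
}
```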

Is there a way to continuously pipe data from Azure Blob into BigQuery?

I have a bunch of files in Azure Blob Storage, and it's constantly getting new ones. I was wondering if there is a way to first move all the data I currently have in Blob over to BigQuery, and then keep a script or job running so that all new data gets sent over to BigQuery as well?
BigQuery offers support for querying data directly from these external data sources: Google Cloud Bigtable, Google Cloud Storage, and Google Drive. Azure Blob Storage is not included. As Adam Lydick mentioned, as a workaround you could copy the data/files from Azure Blob Storage to Google Cloud Storage (or another BigQuery-supported external data source).
To copy data from Azure Blob Storage to Google Cloud Storage, you can run WebJobs (or Azure Functions): a blob-triggered WebJob can trigger a function when a blob is created or updated, and in the WebJob function you can access the blob content and write/upload it to Google Cloud Storage.
Note: you can install the Google.Cloud.Storage library to perform common operations in client code, and this blog explains how to use the Google.Cloud.Storage SDK in Azure Functions.
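A hedged sketch of that idea (assumed container and bucket names, with credentials supplied via GOOGLE_APPLICATION_CREDENTIALS) could look like this in the in-process C# model:

```csharp
// Sketch only: a blob-triggered Azure Function that copies each new or updated
// blob to a Google Cloud Storage bucket with the Google.Cloud.Storage.V1 client.
// Container name, bucket name and credentials setup are assumed.
using System.IO;
using Google.Cloud.Storage.V1;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class CopyBlobToGcs
{
    [FunctionName("CopyBlobToGcs")]
    public static void Run(
        [BlobTrigger("incoming/{name}")] Stream blob,   // fires on blob create/update
        string name,
        ILogger log)
    {
        var gcs = StorageClient.Create();   // uses GOOGLE_APPLICATION_CREDENTIALS
        gcs.UploadObject("my-bigquery-staging-bucket", name, "application/octet-stream", blob);
        log.LogInformation("Copied {Name} to Google Cloud Storage.", name);
    }
}
```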
I'm not aware of anything out-of-the-box (on Google's infrastructure) that can accomplish this.
I'd probably set up a tiny VM to:
Scan your Azure blob storage looking for new content.
Copy new content into GCS (or local disk).
Kick off a LOAD job periodically to add the new data to BigQuery (see the sketch after this list).
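For step 3, a load job is only a few lines with the Google.Cloud.BigQuery.V2 client. This is a sketch under assumed project, dataset, table, and bucket names, and it assumes the files are newline-delimited JSON:

```csharp
// Sketch only: kick off a BigQuery LOAD job for newline-delimited JSON files
// already copied into a GCS bucket. Project, dataset, table and URI are assumed.
using Google.Cloud.BigQuery.V2;

class LoadGcsIntoBigQuery
{
    static void Main()
    {
        var client = BigQueryClient.Create("my-gcp-project");
        var destination = client.GetTableReference("staging_dataset", "azure_blob_data");

        // Wildcard URI picks up every file landed in the bucket; pass a null
        // schema and let autodetect infer it from the JSON.
        var job = client.CreateLoadJob(
            "gs://my-bigquery-staging-bucket/*.json",
            destination,
            null,
            new CreateLoadJobOptions
            {
                SourceFormat = FileFormat.NewlineDelimitedJson,
                Autodetect = true
            });

        job.PollUntilCompleted().ThrowOnAnyError();
    }
}
```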
If you used GCS instead of Azure Blob Storage, you could eliminate the VM and just have a Cloud Function that is triggered on new items being added to your GCS bucket (assuming your blob is in a form that BigQuery knows how to read). I presume this is part of an existing solution that you'd prefer not to modify though.
