Error calling the Azure Function endpoint from Azure Data Factory

I have linked an Azure Function in a Data Factory pipeline that writes a text file to Blob Storage.
The Azure Function works fine when executed independently and writes the file to Blob Storage.
But I am getting the error below when I run the Azure Function from Data Factory:
{
"errorCode": "3600",
"message": "Error calling the endpoint.",
"failureType": "UserError",
"target": "Azure Function1"
}
I have configured the Azure Function to access the blob with a blob endpoint and Shared Access Signature as shown below:
"AzureWebJobsStorage": "DefaultEndpointsProtocol=https;AccountName=XYZ;AccountKey=XYZ;BlobEndpoint=ABC;SharedAccessSignature=AAA"
Please let me know if I need to change any additional properties in Blob Storage so the Azure Function can be called successfully from Data Factory.

What is the trigger in your Azure Function? An HTTP trigger?
Also, how is your Azure Function protected?
If it is protected using AAD, you need a Bearer token.
If you are using keys, you need the x-functions-key header.
https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-http-webhook#authorization-keys
Here is a video from Channel 9 that might help:
Run Azure Functions from Azure Data Factory pipelines
https://channel9.msdn.com/Shows/Azure-Friday/Run-Azure-Functions-from-Azure-Data-Factory-pipelines
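For the key-based case, the Azure Function linked service in ADF carries the key and sends it as the x-functions-key header for you. A minimal sketch, assuming a placeholder function app name and key:

{
    "name": "AzureFunctionLinkedService",
    "properties": {
        "type": "AzureFunction",
        "typeProperties": {
            "functionAppUrl": "https://<your-function-app>.azurewebsites.net",
            "functionKey": {
                "type": "SecureString",
                "value": "<function or host key>"
            }
        }
    }
}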

The Azure Function Activity in the ADF pipeline expects the Azure Function to return a JSON object instead of an HttpResponseMessage.
Here is how we solved it:
https://microsoft-bitools.blogspot.com/2019/01/introducing-azure-function-activity-to.html
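As an illustration (not the code from the post above), a minimal HTTP-triggered function that writes the file and then returns a plain JSON object instead of an HttpResponseMessage might look like this:

using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public static class WriteToBlob
{
    [FunctionName("WriteToBlob")]
    public static IActionResult Run(
        [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req,
        ILogger log)
    {
        log.LogInformation("Called from the ADF Azure Function activity.");

        // ... write the text file to Blob Storage here ...

        // Return a JSON object; ADF's Azure Function activity reports error 3600
        // if the response body is not valid JSON.
        return new OkObjectResult(new { status = "file written" });
    }
}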

Related

Azure ADF using Azure Batch throws Shared Access Signature generation error

I am working on a simple Azure Data Factory pipeline where I have simply added a Batch Service and in it specified the Batch Service account (which I have created through a linked service and tested that the connection is working). In the command I am just running a simple "ls" command, and when I do a debug run I get this error: "Cannot create Shared Access Signature unless Account Key credentials are used." I have the following linked services: "Azure Batch", "Azure Blob Storage" and Key Vault (where we store the access key). All linked service connections are working properly.
Any help on how to fix this error: "Cannot create Shared Access Signature unless Account Key credentials are used."
Azure Batch linked service: (screenshot)
Azure Storage linked service: (screenshot)
Azure Data Factory pipeline: (screenshot)
The issue happens because you use "Managed Identity" to connect ADF to the storage. It will say "successful" when doing a connection test on the linked service, but when this storage is used for Batch, it needs to have the "Account Key" authentication type (see here).
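A minimal sketch of the Blob Storage linked service switched to account-key authentication (account name and key are placeholders; the key can still be pulled from Key Vault):

{
    "name": "AzureBlobStorageLinkedService",
    "properties": {
        "type": "AzureBlobStorage",
        "typeProperties": {
            "connectionString": "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<account key>;EndpointSuffix=core.windows.net"
        }
    }
}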

Send data from an Azure blob container to an Azure Function using an Azure Data Factory activity

I have copied data to an Azure blob container using the Copy activity. I was able to use that to trigger my Azure Function using a Blob trigger. However, my requirement is to call the Azure Function activity that can be configured in an Azure Data Factory pipeline. For that I need to pass the blob container path so that the HTTP-triggered Azure Function can read from this path. The Blob trigger works but isn't allowed. Any idea how to get the path of the container and pass it to the Azure Function activity?
Edit:
I added this (screenshot), and the output of the path in the request sent to the HTTP trigger of the Azure Function looks like this (screenshot).
This is where I need the fully formed path after the copy, say folder/myfolder/2010/10/01.
However, I don't get it.
-----------------------UPDATE----------------------------------------------------
This is the sink dataset (screenshot),
with the connection of the dataset (sink) like this (screenshot),
and my copy pipeline looks like this (screenshot).
I ran the debug, and the copy gives folder/myfolder/#variables('data') instead of folder/myfolder/2020/10/01.
According to the description of your question, it seems you do not know the target blob path of the "Copy" activity. I guess you use a pipeline parameter to input the blob path in your data factory, something like below:
So in the HTTP trigger function request body, you just need to choose the testPath.
If your function request body needs to be like {"path":"xxx"}, you can use the "concat()" function in Data Factory to join the strings together.
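For example, assuming the parameter is called testPath as above, the request body expression could look like:

@concat('{"path":"', pipeline().parameters.testPath, '"}')

which would send a body such as {"path":"folder/myfolder/2020/10/01"} to the HTTP-triggered function.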

Azure Function can write files to Data Lake when it is bound to Event Grid, but not when it is called from Azure Data Factory

I have an Azure Function that should process zip files, convert their contents to CSV files, and save them to a Data Lake Gen 1.
I have enabled the managed identity of this Azure Function, and then added this managed identity as an OWNER on the Access control (IAM) of the Data Lake.
First scenario:
I call this Azure Function from Azure Data Factory and send the file URI of the zip files, which are persisted in a storage account, from Azure Data Factory to the Azure Function. When the Azure Function processes the files by saving CSV files to the Data Lake, I get this error:
Error in creating file root\folder1\folder2\folder3\folder4\test.csv.
Operation: CREATE failed with HttpStatus: Unauthorized. Token Length: 1162
Unknown Error: Unexpected type of exception in JSON error output. Expected: RemoteException Actual: error
Source: Microsoft.Azure.DataLake.Store
StackTrace: at Microsoft.Azure.DataLake.Store.WebTransport.ParseRemoteError(Byte[] errorBytes, Int32 errorBytesLength, OperationResponse resp, String contentType).
RemoteJsonErrorResponse: Content-Type of error response: application/json; charset=utf-8.
Error: {"error":{"code":"AuthenticationFailed","message":"The access token in the 'Authorization' header is expired.
Second scenario:
I have set up an Event Grid trigger for this Azure Function. After dropping zip files into the storage account that is bound to the Azure Function, the zip files are processed and the CSV files are successfully saved in the Data Lake.
Good to know: the Azure Function and the Data Lake are in the same VNet.
Could someone explain to me why my function works fine with Event Grid but doesn't work when I call it from Azure Data Factory (saving CSV files in Data Lake Gen 1)?
Have you shown all of the error?
From the error it seems to be related to Azure Active Directory authentication (Data Lake Storage Gen1 uses Azure Active Directory for authentication). Do you use a Bearer token in your Azure Function when you try to send something to Data Lake Storage Gen1?
Please show the code of your Azure Function, otherwise it will be hard to find the cause of the error.
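Without the code it is only a guess, but a minimal sketch of fetching a fresh token through the function's managed identity and writing to Data Lake Gen1 could look like this (the account name, file path, and exact SDK overloads are assumptions to verify against your SDK version):

using System.IO;
using System.Threading.Tasks;
using Microsoft.Azure.DataLake.Store;
using Microsoft.Azure.Services.AppAuthentication;

public static class DataLakeWriter
{
    public static async Task WriteCsvAsync(string csvContent)
    {
        // Get a fresh AAD token for Data Lake Store via the managed identity,
        // rather than reusing a token that may already have expired.
        var tokenProvider = new AzureServiceTokenProvider();
        string token = await tokenProvider.GetAccessTokenAsync("https://datalake.azure.net/");

        // "mydatalake" is a placeholder for the Data Lake Gen1 account name.
        AdlsClient client = AdlsClient.CreateClient("mydatalake.azuredatalakestore.net", "Bearer " + token);

        // "/folder1/test.csv" is a placeholder target path.
        using (var stream = client.CreateFile("/folder1/test.csv", IfExists.Overwrite))
        using (var writer = new StreamWriter(stream))
        {
            await writer.WriteAsync(csvContent);
        }
    }
}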

Azure Data Factory Web Activity

I have developed a Function App and published it via Azure APIM. I am trying to call the API (APIM) from a Data Factory Web activity, which fails with the error below. However, when I use the Azure Function URL + function key directly in the Web activity, it works fine.
So I want to be able to call a REST endpoint internal to my organization from Azure Data Factory using a Web activity. I appreciate any direction on this.
{
"errorCode": "2108",
"message": "Error calling the endpoint. Response status code: ",
"failureType": "UserError"
}
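For reference, the direct call that works uses the function URL plus the function key in the Web activity, roughly like this (app, function, and activity names are placeholders):

{
    "name": "CallFunctionDirect",
    "type": "WebActivity",
    "typeProperties": {
        "url": "https://<your-function-app>.azurewebsites.net/api/<FunctionName>?code=<function key>",
        "method": "POST",
        "body": {}
    }
}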

How to use an Azure blob-only storage account with Azure Functions - Trying to create blob snapshot

I'm trying to set up a function to take a snapshot of a blob container every time a change is pushed to it. There is some pretty simple functionality in Azure Functions to do this, but it only works for general purpose storage accounts. I'm trying to do this with a blob only storage account. I'm very new to Azure so I may be approaching this all wrong, but I haven't been able to find much helpful information. Is there any way to do this?
As #joy-wang mentioned, the Azure Functions runtime requires a general-purpose storage account.
A general-purpose storage account is required to configure the AzureWebJobsStorage and the AzureWebJobsDashboard settings (local.settings.json or the App settings blade in the Azure portal):
{
"IsEncrypted": false,
"Values": {
"AzureWebJobsStorage": "my general purpose storage account connection string",
"AzureWebJobsDashboard": "my general purpose storage account connection string",
"MyOtherStorageAccountConnectionstring": "my blob only storage connection string"
}
}
If you want to create a BlobTrigger function, you can specify another connection string and create a snapshot every time a blob is created/updated:
[FunctionName("Function1")]
public static async Task Run(
    [BlobTrigger("test-container/{name}", Connection = "MyOtherStorageAccountConnectionstring")] CloudBlockBlob myBlob,
    string name, TraceWriter log)
{
    log.Info($"C# Blob trigger function Processed blob\n Name:{name}");
    await myBlob.CreateSnapshotAsync();
}
In Visual Studio:
I have tried to create a snapshot for a blob-only storage account named joyblobstorage, but it failed. I suppose you should get the same error as in the screenshot.
As the error information says: Microsoft.Azure.WebJobs.Host: Storage account 'joyblobstorage' is of unsupported type 'Blob-Only/ZRS'. Supported types are 'General Purpose'.
In the portal:
I tried to create a Function App and use the existing storage, but it could not find my blob-only storage account. The Azure Function setup in the portal does not allow you to select a blob-only storage account. Please refer to the screenshot.
Conclusion:
It is not possible to create a snapshot for a blob-only storage account. In the official documentation, you can see the storage account requirements:
When creating a function app in App Service, you must create or link to a general-purpose Azure Storage account that supports Blob, Queue, and Table storage.
Also, in the App settings reference, you can see:
AzureWebJobsStorage
The Azure Functions runtime uses this storage account connection string for all functions except for HTTP triggered functions. The storage account must be a general-purpose one that supports blobs, queues, and tables.
AzureWebJobsDashboard
Optional storage account connection string for storing logs and displaying them in the Monitor tab in the portal. The storage account must be a general-purpose one that supports blobs, queues, and tables.
Here is the feedback item where the Azure App Service team explained the requirements on the storage account; you can refer to it.
