I am calling an HTTP-triggered Azure Function app from a Data Factory pipeline using the ADF Azure Function activity. It executes successfully in debug mode, but when I publish the pipeline and run the same code via a Data Factory trigger, I get the error below:
{
    "errorCode": "3600",
    "message": "Object reference not set to an instance of an object.",
    "failureType": "UserError",
    "target": "AzureFunction"
}
Please let me know if I need to change any additional properties or if I am missing anything here. Also, is there any way to see what URL gets generated when I call the function app through the Azure Function activity in ADF?
I have tried calling the same function app using a Web activity in ADF, and that works fine in both debug and trigger mode.
Linked service JSON for the Azure Function:
{
    "name": "linkedservicesAzureFunctions",
    "type": "Microsoft.DataFactory/factories/linkedservices",
    "properties": {
        "typeProperties": {
            "functionAppUrl": "https://xyz.azurewebsites.net",
            "functionKey": {
                "type": "AzureKeyVaultSecret",
                "store": {
                    "type": "LinkedServiceReference",
                    "referenceName": "linkedservicesKeyVault"
                },
                "secretName": "functions-host-key-default"
            }
        },
        "type": "AzureFunction"
    }
}
There is a known bug in Azure Data Factory, and Microsoft is working on it. For now, if you are creating the Data Factory pipeline using the .NET SDK, you need to set Headers explicitly on the Azure Function activity, like this:
new AzureFunctionActivity
{
    Name = "CopyFromBlobToSnowFlake",
    LinkedServiceName = new LinkedServiceReference(pipelineStructure.AzureFunctionLinkedService),
    Method = "POST",
    Body = body,
    FunctionName = "LoadBlobsIntoSnowFlake",
    // Initialize Headers explicitly; leaving it null is what causes the
    // "Object reference not set to an instance of an object" error at runtime.
    Headers = new Dictionary<string, string> { },
    DependsOn = new List<ActivityDependency>
    {
        new ActivityDependency
        {
            Activity = "CopyFromOPSqlServerToBlob",
            DependencyConditions = new List<string> { "Succeeded" }
        }
    }
}
If you are creating the Azure Function activity through the UI, just update the description of the activity and then publish; the headers property will get initialized automatically.
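For reference, once the workaround is applied, the published activity JSON should contain an initialized (even if empty) headers object. A minimal sketch of what that could look like; the activity name and body payload here are illustrative, not taken from the question:
{
    "name": "AzureFunction1",
    "type": "AzureFunctionActivity",
    "linkedServiceName": {
        "referenceName": "linkedservicesAzureFunctions",
        "type": "LinkedServiceReference"
    },
    "typeProperties": {
        "functionName": "LoadBlobsIntoSnowFlake",
        "method": "POST",
        "headers": {},
        "body": { "sample": "payload" }
    }
}
The key point is that "headers" is present as an empty object rather than missing or null.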
I am trying to implement a Get Metadata activity to return the column count of files I have in a single blob storage container.
Get Metadata activity is returning this error:
[Error screenshot: (404) Not Found]
I'm fairly new to Azure Data Factory and cannot solve this. Here's what I have:
Dataset: Source dataset
Name: ten_eighty_split_CSV
Connection: Blob storage
Schema: imported from blob storage file
Parameters: "FileName"; string; "@pipeline().parameters.SourceFile"
Pipeline:
Name: ten eighty split
Parameters: "SourceFile"; string; "@pipeline().parameters.SourceFile"
Settings: Concurrency: 1
Get Metadata activity: Get Metadata
Only argument is "Column count"
It throws the error upon debugging. I am not sure what to do; (404) Not Found is so broad that I could not ascertain a specific solution. Thanks!
The error occurs because you have given an incorrect file name, or the name of a file that does not exist.
Since you are trying to use a blob-created event trigger to find the column count, you can use the procedure below:
After configuring the Get Metadata activity, create a storage event trigger: go to Add trigger -> Choose trigger -> Create new.
Click Continue. You will get a Trigger Run Parameters tab. In it, give the value as @triggerBody().fileName.
Complete the trigger creation and publish the pipeline. Now, whenever a file is uploaded into your container (the one on which you created the storage event trigger), it will trigger the pipeline automatically (no need to debug). If the container is empty and you try to debug by giving some value for the sourceFile parameter, you will get the same error.
Upload a sample file to your container. It will trigger the pipeline and give the desired result.
The following is the trigger JSON that I created for my container:
{
    "name": "trigger1",
    "properties": {
        "annotations": [],
        "runtimeState": "Started",
        "pipelines": [
            {
                "pipelineReference": {
                    "referenceName": "pipeline1",
                    "type": "PipelineReference"
                },
                "parameters": {
                    "sourceFile": "@triggerBody().fileName"
                }
            }
        ],
        "type": "BlobEventsTrigger",
        "typeProperties": {
            "blobPathBeginsWith": "/data/blobs/",
            "blobPathEndsWith": ".csv",
            "ignoreEmptyBlobs": true,
            "scope": "/subscriptions/b83c1ed3-c5b6-44fb-b5ba-2b83a074c23f/resourceGroups/<user>/providers/Microsoft.Storage/storageAccounts/blb1402",
            "events": [
                "Microsoft.Storage.BlobCreated"
            ]
        }
    }
}
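For completeness, the FileName dataset parameter also needs to be consumed in the dataset's file path so the value passed from the trigger is actually used. A minimal sketch of such a dataset definition, assuming a DelimitedText dataset; the linked service name and container below are assumptions, not taken from the question:
{
    "name": "ten_eighty_split_CSV",
    "properties": {
        "linkedServiceName": {
            "referenceName": "AzureBlobStorageLinkedService",
            "type": "LinkedServiceReference"
        },
        "parameters": {
            "FileName": {
                "type": "string"
            }
        },
        "type": "DelimitedText",
        "typeProperties": {
            "location": {
                "type": "AzureBlobStorageLocation",
                "fileName": {
                    "value": "@dataset().FileName",
                    "type": "Expression"
                },
                "container": "data"
            }
        }
    }
}
The Get Metadata activity then passes @pipeline().parameters.SourceFile into the dataset's FileName parameter.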
I have an Azure Functions (.NET Core) app running on a Linux consumption plan in West US. The function seems to be running OK; I had a previous .NET Standard 2.0 version of the same function running fine in Azure. However, when I try to create a subscription to an Event Grid topic, I get the following error:
Deployment has failed with the following error: {"code":"Url validation","message":"The attempt to validate the provided endpoint https://insysfunctiongetweathercore.azurewebsites.net/runtime/webhooks/EventGrid failed. For more details, visit https://aka.ms/esvalidation."}
I am using an EventGridTrigger, so I should not have to do anything to handle validation; this should happen automatically:
public static async Task Run([EventGridTrigger] EventGridEvent eventGridEvent, ILogger log)
Any ideas on what I might need to do differently when the function is hosted on a Linux consumption plan?
EDIT
Here's the Create Event Subscription form with the error (screenshot omitted), and the resulting JSON:
{
    "name": "InSysWeatherPull",
    "properties": {
        "topic": "/subscriptions/xxxxxxxxxxx/resourceGroups/InergySystemsWest/providers/Microsoft.EventGrid/Topics/InSysEventGridWest",
        "destination": {
            "endpointType": "WebHook",
            "properties": {
                "endpointUrl": "https://insysfunctiongetweathercore.azurewebsites.net/runtime/webhooks/EventGrid?functionName=ProcessWeatherRequest&code=xxxxxxxxxxxx"
            }
        },
        "filter": {
            "includedEventTypes": [
                "weather-zip-request"
            ],
            "advancedFilters": []
        },
        "labels": [],
        "eventDeliverySchema": "EventGridSchema"
    }
}
Is there an option to get the Event Grid trigger URL + key as an output value from the deployment of an Azure Function?
The scenario we would like to implement is as follows:
- We deploy a Function App in a VSTS release via ARM.
- With the Function App deployed, we deploy the Event Grid subscription.
Thanks,
Shraddha Agrawal
Yes, there is a way using the REST API to obtain a function access code. The following are the steps:
Let's assume the name of the function is EventGridTrigger2, with the following run.csx:
#r "Newtonsoft.Json"
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;
public static void Run(JObject eventGridEvent, TraceWriter log)
{
    log.Info(eventGridEvent.ToString(Formatting.Indented));
}
and the function.json file:
{
    "bindings": [
        {
            "type": "eventGridTrigger",
            "name": "eventGridEvent",
            "direction": "in"
        }
    ],
    "disabled": false
}
As you can see, the above binding is untyped, so it will work for any event schema such as InputEventSchema, EventGridSchema (the default schema), and CloudEventV01Schema (after some bug fixes).
The destination property of the created Subscription looks like the following:
"destination": {
"properties": {
"endpointUrl": null,
"endpointBaseUrl": "https://myFunctionApp.azurewebsites.net/admin/extensions/EventGridExtensionConfig"
},
"endpointType": "WebHook"
},
Note that the full subscriberUrl for the Azure Event Grid trigger has the following format, where the query string contains the parameters for routing the request to the proper function:
https://{FunctionApp}.azurewebsites.net/admin/extensions/EventGridExtensionConfig?functionName={FunctionName}&code={masterKey}
To create a subscriber we have to use its full subscriberUrl, including the query string. At this point, the only unknown value is the masterKey.
To obtain a Function App (host) masterKey, we have to use a management REST API call:
https://management.azure.com/subscriptions/{mySubscriptionId}/resourceGroups/{myResGroup}/providers/Microsoft.Web/sites/{myFunctionApp}/functions/admin/masterkey?api-version=2016-08-01
the response has the following format:
{
    "masterKey": "*************************************************"
}
Note that an authentication Bearer token is required for this call.
Once we have a masterKey for the FunctionApp (host), we can use it for any function within this host.
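Putting it together, the create/update call for the event subscription then carries the full endpoint URL built from the pieces above. A rough sketch of the request body, where the function app name, function name, and masterKey are placeholders:
{
    "properties": {
        "destination": {
            "endpointType": "WebHook",
            "properties": {
                "endpointUrl": "https://myFunctionApp.azurewebsites.net/admin/extensions/EventGridExtensionConfig?functionName=EventGridTrigger2&code={masterKey}"
            }
        }
    }
}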
I think you are asking: "how can I deploy an Azure Function with a step in a VSTS Release using ARM and get its trigger url so that I can use the trigger url in the next VSTS Release step?"
It's not very well documented, but using the official docs, this blog post and some trial and error, we've figured out how.
This is what the ARM template should look like:
{
    "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {},
    "variables": {},
    "resources": [],
    "outputs": {
        "triggerUrl": {
            "type": "string",
            "value": "[listsecrets(resourceId('Microsoft.Web/sites/functions', 'functionAppName', 'functionName'),'2015-08-01').trigger_url]"
        }
    }
}
You deploy it with an "Azure Resource Group Deployment" step; make sure that you enter a variable name in the "Deployment outputs" text box, let's say triggerUrl.
Example output:
{"triggerUrl":{"type":"String","value":"https://functionAppName.azurewebsites.net/api/functionName?code=1234"}}
Then you put a PowerShell step (or an Azure PowerShell step) afterwards that picks up the value from the variable.
$environmentVariableName = "triggerUrl"
$outputJson = (Get-Item env:$environmentVariableName).Value
# The variable holds the JSON shown above, so parse out the actual URL
$triggerUrl = ($outputJson | ConvertFrom-Json).triggerUrl.value
Then do something with it.
With the update of the Functions runtime to V2.0.12050, the URI of the Event Grid trigger is a little different: it uses the /runtime/webhooks/EventGrid form (as seen in the endpointUrl in the question above) instead of /admin/extensions/EventGridExtensionConfig.
I am creating a custom activity pipeline in Azure Data Factory, and as part of that I need to create an AzureBatchLinkedService using the JSON below:
{
    "name": "AzureBatchLinkedService",
    "properties": {
        "hubName": "name_hub",
        "type": "AzureBatch",
        "typeProperties": {
            "accountName": "accountname",
            "accessKey": "**********",
            "poolName": "?????????",
            "batchUri": "https://northeurope.batch.azure.com",
            "linkedServiceName": "AzureStorageLinkedService"
        }
    }
}
When I created this the first time, I didn't have to create a Batch pool; it was created for me automatically.
My question is: how do I get Azure to create this "pool" for me? If I go into the Batch account and click "Add+" to add a pool manually, it has all these config options and I don't know what to put in any of those fields.
I am a newbie to Azure Logic Apps. My aim is to send some variables to a Logic App (via Java service code, which in turn invokes the Request trigger through the provided POST URL as a REST API) and obtain the response as JSON.
Currently I have created a Request trigger, and the JSON schema looks as follows:
{
    "$schema": "http://json-schema.org/draft-04/schema#",
    "definitions": {},
    "id": "http://example.com/example.json",
    "properties": {
        "CustomerName": {
            "type": "string"
        },
        "InvoiceFee": {
            "type": "integer"
        },
        "InvoiceNo": {
            "type": "integer"
        }
    },
    "required": [
        "CustomerName",
        "InvoiceFee",
        "InvoiceNo"
    ],
    "type": "object"
}
From the Request trigger, I am directing to a Response action, with the following to be returned as the JSON response:
{
    "CustomerName": @{triggerBody()['CustomerName']},
    "InvoiceFee": @{triggerBody()['InvoiceFee']},
    "InvoiceNo": @{triggerBody()['InvoiceNo']}
}
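For reference, in the Logic App code view this Response action would look roughly like the sketch below (the statusCode, headers, and the choice between @{...} string interpolation and a bare @triggerBody() expression, which preserves the integer type, are illustrative assumptions):
"Response": {
    "type": "Response",
    "kind": "Http",
    "inputs": {
        "statusCode": 200,
        "headers": {
            "Content-Type": "application/json"
        },
        "body": {
            "CustomerName": "@{triggerBody()['CustomerName']}",
            "InvoiceFee": "@triggerBody()['InvoiceFee']",
            "InvoiceNo": "@triggerBody()['InvoiceNo']"
        }
    },
    "runAfter": {}
}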
Could you please provide me some reference links on how to access a Logic App from a Java service?
I don't know how to pass the custom-created object such that its parameters map to the "CustomerName", "InvoiceNo", and "InvoiceFee" properties.
My Java service code is as follows:
InvoiceDTO invoiceDTOObject2 = new InvoiceDTO();
invoiceDTOObject2.setCustomerName("Sakthivel");
invoiceDTOObject2.setInvoiceNo(123);
invoiceDTOObject2.setInvoiceFee(4000);
ResteasyClient client = new ResteasyClientBuilder().build();
ResteasyWebTarget target = client.target("URL TO PROVID").resolveTemplate("properties", invoiceDTOObject2);
Response response = target.request().get();
String jsonResponse = response.readEntity(String.class);
System.out.println("JSON Reponse "+jsonResponse);
Looking at your code
Response response = target.request().get();
You are doing a GET operation. Your Logic App HTTP trigger requires you to perform a POST operation, using your InvoiceDTO entity as the body (serialized as JSON).
So it should look something like this:
Response response = target.request().post(Entity.entity(invoiceDTOObject2, MediaType.APPLICATION_JSON)); // Entity is javax.ws.rs.client.Entity
Not sure if it's 100% correct, my java is a little rusty, but that's the general idea.