I have an Azure Timer Trigger Function that should do some calculations and write the results to a json file in a pre-existing blob. How do I reference the pre-existing blob from within the Timer Triggered function?
I can't seem to find any documentation that provides a code sample. Can someone provide one?
First, you need to update your function.json configuration file to bind the blob to the CloudBlockBlob instance you'll be using in your .csx code. You can edit it in the Azure Portal via the "Integrate" option (the one with the lightning icon) under your function, in the Function Apps menu. On the top right of that page is a link that reads "Advanced Editor". Clicking that link will take you to your function's function.json file:
You'll see a JSON array named "bindings" that contains a JSON object that configures your timer. You'll want to add another JSON object to that array to bind your blob to a CloudBlockBlob instance that you'll be referencing in your function. Your function.json file will look something like this:
{
  "bindings": [
    {
      "name": "myTimer",
      "type": "timerTrigger",
      "direction": "in",
      "schedule": "0 */5 * * * *"
    },
    {
      "type": "blob",
      "name": "myBlob",
      "path": "your-container-name/your_blob_filename.json",
      "connection": "AzureWebJobsStorage",
      "direction": "inout"
    }
  ],
  "disabled": false
}
Now you just need to update your function's Run method's signature. It looks like this by default:
public static void Run(TimerInfo myTimer, TraceWriter log)
Add your blob variable to the end of that signature (and also add the necessary references and using directives):
#r "Microsoft.WindowsAzure.Storage"
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;
public static void Run(TimerInfo myTimer, TraceWriter log, CloudBlockBlob myBlob)
And you're all set! "myBlob" is bound to the blob "your_blob_filename.json" in your "your-container-name" container.
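From there, writing your results is a matter of serializing them and uploading the JSON to myBlob. A minimal sketch (the results object below is a placeholder for your own calculation output):

```csharp
#r "Microsoft.WindowsAzure.Storage"
#r "Newtonsoft.Json"

using Microsoft.WindowsAzure.Storage.Blob;
using Newtonsoft.Json;

public static async Task Run(TimerInfo myTimer, TraceWriter log, CloudBlockBlob myBlob)
{
    // Placeholder for your real calculation results
    var results = new { computedAt = DateTime.UtcNow, value = 42 };

    // Overwrite the blob's contents with the new JSON
    await myBlob.UploadTextAsync(JsonConvert.SerializeObject(results));

    log.Info($"Wrote results to {myBlob.Name}");
}
```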
I am trying to create an Azure Function that uses an Azure table for storage, which I then read in the function. I am able to run the function when I specify the signature below:
public static async System.Threading.Tasks.Task RunAsync([TimerTrigger("0 */1 * * * *")]TimerInfo myTimer, [Table("CovidUpdateTable")]CloudTable ServiceFormTable, [Blob("outcontainer/{DateTime}", FileAccess.Write)] Stream outputBlob, ILogger log)
Here, I have to specify the table name in the Table attribute even though I have already specified it in the binding config below:
{
  "type": "table",
  "name": "ServiceFormTable",
  "tableName": "CovidUpdateTable",
  "take": 50,
  "connection": "AzureWebJobsStorage",
  "direction": "in"
}
In portal C# script I can bind directly to CloudTable, but in Visual Studio it throws an error if I remove the Table attribute and use just CloudTable. I am not sure what the purpose of tableName is in this config when I have to specify the name in the Table attribute.
When we create the function in the Azure portal, it creates a C# script function (.csx file) and generates the function.json file for us, and the function reads its configuration from function.json automatically. So we can configure the bindings directly in that file and do not need to configure anything in code. But when we create the function in Visual Studio, it creates a C# class-library function (.cs file) and does not generate a function.json file for us, and the function does not read configuration from function.json automatically. So we need to configure these settings with attributes. For more details, please refer to the documentation.
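For example, assuming the same table and connection names as the config above, the attribute-based equivalent in a Visual Studio project would look roughly like this (note the "take" setting has no CloudTable counterpart, since with CloudTable you run the query yourself):

```csharp
// Sketch: attribute-based equivalent of the function.json table binding
[FunctionName("CovidUpdate")]
public static async Task RunAsync(
    [TimerTrigger("0 */1 * * * *")] TimerInfo myTimer,
    [Table("CovidUpdateTable", Connection = "AzureWebJobsStorage")] CloudTable serviceFormTable,
    ILogger log)
{
    log.LogInformation($"Bound to table: {serviceFormTable.Name}");
}
```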
Update
If you want to use the binding properties in local.settings.json, please refer to the following steps
local.settings.json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "TableConnection": "<your azure table connection string>",
    "Tablename": "<your table name>"
  }
}
Configure the code. Use [Table("%Tablename%", Connection = "TableConnection")] CloudTable cloudTable to bind the Azure table.
For example
public static async Task<IActionResult> Run(
    [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)] HttpRequest req,
    [Table("%Tablename%", Connection = "TableConnection")] CloudTable cloudTable,
    ILogger log)
{
    log.LogInformation("C# HTTP trigger function processed a request.");
    log.LogInformation(cloudTable.Name);
    var query = new TableQuery<DynamicTableEntity>();
    foreach (var entity in
        await cloudTable.ExecuteQuerySegmentedAsync(query, null))
    {
        log.LogInformation(
            $"{entity.PartitionKey}\t{entity.RowKey}\t{entity.Timestamp}");
    }
    ....
}
For more details, please refer to the documentation.
I have data incoming from different devices to IoT Hub, and from there I use Stream Analytics to process it and store it in blob storage.
I know we can add {date} and {time} to the path in the needed format; can we also add deviceId to that path?
Example: for 2018/10/30/01 (year/month/day/hour), can I add /deviceId to that path while storing to the blob?
The following is an example of a workaround for your case. It's based on using an Azure Function (HttpTrigger) as an output of the ASA job to append data to a specific blob in a push manner.
Note that this workaround sets the output's Max batch count (the number of events delivered to the Azure Function per invocation) to 1, i.e. one telemetry message at a time.
ASA job query:
SELECT
System.Timestamp as [time], *
INTO outAF
FROM
iot TIMESTAMP BY time
Azure Function (HttpTrigger):
run.csx
#r "Newtonsoft.Json"
#r "Microsoft.WindowsAzure.Storage"
using System.Net;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Primitives;
using Microsoft.WindowsAzure.Storage.Blob;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;
public static async Task<IActionResult> Run(string body, CloudBlobContainer blobContainer, ILogger log)
{
    log.LogInformation($"{body}");
    var jtoken = JToken.Parse(body);
    var jobject = jtoken is JArray ? jtoken.SingleOrDefault<JToken>() : jtoken;
    if (jobject != null)
    {
        var jtext = jobject.ToString(Formatting.None);
        var data = JsonConvert.DeserializeAnonymousType(jtext, new { IoTHub = new { ConnectionDeviceId = "" } });
        // "HH" (24-hour clock) avoids 1 AM and 1 PM colliding in the same path
        var blobName = $"{DateTime.UtcNow.ToString("yyyy/MM/dd/HH")}/{data.IoTHub.ConnectionDeviceId}";
        var blob = blobContainer.GetAppendBlobReference(blobName);
        if (!await blob.ExistsAsync())
        {
            await blob.CreateOrReplaceAsync();
        }
        await blob.AppendTextAsync(jtext + "\r\n");
    }
    return new NoContentResult();
}
function.json
{
  "bindings": [
    {
      "authLevel": "function",
      "name": "body",
      "type": "httpTrigger",
      "direction": "in",
      "methods": [
        "get",
        "post"
      ]
    },
    {
      "name": "blobContainer",
      "type": "blob",
      "path": "myContainer",
      "connection": "mySTORAGE",
      "direction": "out"
    },
    {
      "name": "$return",
      "type": "http",
      "direction": "out"
    }
  ]
}
I know we can add {date} and {time} to the path in the needed format; can we also add deviceId to that path?
As @Peter Bons mentioned in the comments, custom variables in the output path are not supported so far.
As a workaround, you could use a blob-triggered Azure Function: pass the deviceId in the ASA output columns, read it in the blob trigger function, then use the blob SDK to copy the blob into a /deviceId path and delete the original blob.
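A rough sketch of that blob-triggered function (the container names and the "deviceId" property are assumptions; your ASA query must include a deviceId column so it appears in the blob's JSON):

```csharp
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;
using Microsoft.WindowsAzure.Storage.Blob;
using Newtonsoft.Json.Linq;

public static class MoveToDeviceFolder
{
    [FunctionName("MoveToDeviceFolder")]
    public static async Task Run(
        [BlobTrigger("asa-output/{name}", Connection = "AzureWebJobsStorage")] CloudBlockBlob inputBlob,
        [Blob("device-output", Connection = "AzureWebJobsStorage")] CloudBlobContainer outContainer,
        string name,
        ILogger log)
    {
        var text = await inputBlob.DownloadTextAsync();

        // ASA blob output is line-delimited JSON; this sketch assumes one
        // device per blob and takes the deviceId from the first record
        var firstLine = text.Split('\n')[0];
        var deviceId = (string)JObject.Parse(firstLine)["deviceId"] ?? "unknown";

        // Write to <deviceId>/<original name> in another container, then remove the original
        var target = outContainer.GetBlockBlobReference($"{deviceId}/{name}");
        await target.UploadTextAsync(text);
        await inputBlob.DeleteIfExistsAsync();

        log.LogInformation($"Moved {name} to {target.Name}");
    }
}
```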
I have an activity function that should store a message in Blob storage. I can overwrite a file in blob storage, but I need to store the data under a different name each time. How do I do that? Azure Functions doesn't support dynamic bindings in Node.js.
Here is one workaround; see whether it's useful.
Along with the blob output binding, there's an activity trigger that receives the message msg; we can put a self-defined blob name in msg for the blob binding path to consume.
In your orchestrator function, which calls the activity function:
yield context.df.callActivity("YourActivity", {'body':'messagecontent','blobName':'myblob'});
Then the activity function code should be modified:
context.bindings.myOutputBlob = context.bindings.msg.body;
And its function.json can use {blobName} in the path as expected:
{
  "bindings": [
    {
      "name": "msg",
      "type": "activityTrigger",
      "direction": "in"
    },
    {
      "name": "myOutputBlob",
      "direction": "out",
      "type": "blob",
      "connection": "AzureWebJobsStorage",
      "path": "azureblob/{blobName}"
    }
  ],
  "disabled": false
}
This question already has an answer here: Azure Function blob binding (1 answer). Closed 5 years ago.
I'm trying to remake the Azure Event Grid Image Resize example in C# using Visual Studio, but I'm having issues getting the Azure Function to be triggered by Event Grid and bind to blob storage.
Current Code:
using Microsoft.Azure.WebJobs.Extensions.EventGrid;
using Microsoft.Azure.WebJobs.Host;
using Microsoft.Azure.WebJobs;
using Microsoft.WindowsAzure.Storage.Blob;
namespace FunctionApp
{
    public static class CreateIndex
    {
        [FunctionName("CreateIndex")]
        [StorageAccount("backup_STORAGE")]
        public static void Run(
            [EventGridTrigger()] EventGridEvent myEvent,
            [Blob("{data.url}")] CloudBlockBlob inputBlob,
            TraceWriter log)
        {
            log.Info(myEvent.ToString());
            log.Info(inputBlob.ToString());
        }
    }
}
Generated function.json:
{
  "generatedBy": "Microsoft.NET.Sdk.Functions.Generator-1.0.6",
  "configurationSource": "attributes",
  "bindings": [
    {
      "type": "eventGridTrigger",
      "name": "myEvent"
    }
  ],
  "disabled": false,
  "scriptFile": "../bin/FunctionApp.dll",
  "entryPoint": "FunctionApp.CreateIndex.Run"
}
The binding is working for the Event Grid trigger but not the blob input.
Expected function.json:
{
  "bindings": [
    {
      "type": "EventGridTrigger",
      "name": "myEvent",
      "direction": "in"
    },
    {
      "type": "blob",
      "name": "inputBlob",
      "path": "{data.url}",
      "connection": "myblobstorage_STORAGE",
      "direction": "in"
    }
  ],
  "disabled": false
}
Precompiled functions generate function.json for you, but they only put the trigger binding inside it. It's OK that your blob binding is not in this file.
The input blob binding will still work: the runtime will pick it up based on your attributes.
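If you want to confirm at runtime that the blob input actually resolved from {data.url}, one quick sketch is to fetch the blob's attributes in the function body (this round-trips to storage, so it fails fast if the binding didn't resolve):

```csharp
[FunctionName("CreateIndex")]
[StorageAccount("backup_STORAGE")]
public static async Task Run(
    [EventGridTrigger] EventGridEvent myEvent,
    [Blob("{data.url}")] CloudBlockBlob inputBlob,
    TraceWriter log)
{
    // Succeeds only if the bound blob exists and is reachable
    await inputBlob.FetchAttributesAsync();
    log.Info($"Bound to {inputBlob.Name} ({inputBlob.Properties.Length} bytes)");
}
```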
I have an Azure Function that I created in the Azure portal and now want to recreate it using the Visual Studio Azure tooling in VS2017 Preview.
My function is timer-triggered, and also has an input binding for Azure DocumentDB (with a query) and an output binding to an Azure Service Bus queue.
Here's the function.json definition from the portal:
{
  "bindings": [
    {
      "name": "myTimer",
      "type": "timerTrigger",
      "direction": "in",
      "schedule": "0 */5 * * * *"
    },
    {
      "type": "serviceBus",
      "name": "outputQueue",
      "queueName": "test-output-requests",
      "connection": "send-refresh-request",
      "accessRights_": "Send",
      "direction": "out"
    },
    {
      "type": "documentDB",
      "name": "incomingDocuments",
      "databaseName": "test-db-dev",
      "collectionName": "TestingCollection",
      "sqlQuery": "select * from c where c.docType = \"Test\"",
      "connection": "my-testing_DOCUMENTDB",
      "direction": "in"
    }
  ],
  "disabled": false
}
In VS2017, I create an Azure Functions project, then an Azure Function using the timer-triggered template:
public static void Run(
    [TimerTrigger("0 */5 * * * *")] TimerInfo myTimer,
    TraceWriter log,
    ICollector<dynamic> outputQueue,
    IEnumerable<dynamic> incomingDocuments)
{
    log.Info($"C# Timer trigger function executed at: {DateTime.Now}");
    //Inspect incomingDocuments .. push messages to outputQueue
}
Running locally, the timer is triggered as expected - but how do I recreate the input and output bindings in code? I'm not sure what attributes I should use, and what I need in json config files to wire it up.
Add a reference to the following NuGet package:
https://www.nuget.org/packages/Microsoft.Azure.WebJobs.Extensions.DocumentDB/
Add using Microsoft.Azure.WebJobs;
Update the DocumentDB parameter as follows (add the other properties as well):
[DocumentDB(ConnectionStringSetting = "")] IEnumerable<dynamic> incomingDocuments
Update the ServiceBus parameter as follows:
[ServiceBus("test-output-requests", Connection = "ConnectionValue")]
Verify the generated function.json in bin/<functionName>/function.json.
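Putting those steps together, a sketch of the full attribute-based signature matching the portal's function.json might look like this (the function name is arbitrary, and setting names such as "my-testing_DOCUMENTDB" and "send-refresh-request" must exist in your local.settings.json or app settings):

```csharp
[FunctionName("RefreshRequests")]
public static void Run(
    [TimerTrigger("0 */5 * * * *")] TimerInfo myTimer,
    [DocumentDB("test-db-dev", "TestingCollection",
        ConnectionStringSetting = "my-testing_DOCUMENTDB",
        SqlQuery = "select * from c where c.docType = \"Test\"")] IEnumerable<dynamic> incomingDocuments,
    [ServiceBus("test-output-requests", Connection = "send-refresh-request")] ICollector<dynamic> outputQueue,
    TraceWriter log)
{
    log.Info($"C# Timer trigger function executed at: {DateTime.Now}");
    // Inspect incomingDocuments, then push messages to outputQueue
}
```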