Azure Blob Trigger - Dynamic BlobOutput Binding Name Based on Input Container - azure

[Function("Function1")]
[BlobOutput("test-samples-output/{name}", Connection = "ConnectionString1")]
public string Run([BlobTrigger("test-samples-trigger/{name}", Connection = "ConnectionString1")] string myBlob,
    string name, string blobTrigger)
{
    _logger.LogInformation($"C# Blob trigger function Processed blob\n Name: {name} \n Data: {myBlob}");
    return myBlob;
}
I have a blob trigger set to 'test-samples-trigger/{name}'. I want the BlobOutput to use the input container name, i.e. '{input-container-name}-output/{name}'. Is there a way to set the BlobOutput path string to point to this location dynamically?

After reproducing this on my end, one way to achieve your requirement is to reference an app setting in the output binding path using the %settingName% syntax, where the value is read from local.settings.json (the same value can also be read in code with GetEnvironmentVariable). Below is the complete code that worked for me.
Function1.cs
using System;
using System.IO;
using System.Text;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;
using Microsoft.Extensions.Logging;
namespace FunctionApp13
{
    public class Function1
    {
        [FunctionName("Function1")]
        public void Run([BlobTrigger("samples-workitems/{name}", Connection = "connstr")] Stream myBlob,
            [Blob("%outputContainer%/{name}", FileAccess.Write, Connection = "connstr")] Stream outputBlob,
            string name, ILogger log)
        {
            log.LogInformation($"C# Blob trigger function Processed blob\n Name:{name} \n Size: {myBlob.Length} Bytes");
            // The %outputContainer% expression in the binding resolves to the app setting of the same name;
            // the value can also be read in code if needed.
            string outputContainer = Environment.GetEnvironmentVariable("outputContainer");
            // Copy the triggering blob to the output blob.
            myBlob.CopyTo(outputBlob);
        }
    }
}
local.settings.json
{
"IsEncrypted": false,
"Values": {
"AzureWebJobsStorage": "<ConnectionString>",
"connstr": "<ConnectionString>",
"FUNCTIONS_WORKER_RUNTIME": "dotnet",
"outputContainer": "sample"
}
}
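The same %settingName% binding expression should also work with the isolated worker model shown in the question; here is a minimal, untested sketch, assuming an app setting named outputContainer that holds the output container name:
[Function("Function1")]
[BlobOutput("%outputContainer%/{name}", Connection = "ConnectionString1")]
public string Run([BlobTrigger("test-samples-trigger/{name}", Connection = "ConnectionString1")] string myBlob,
    string name)
{
    // The returned value is written to <outputContainer>/{name}.
    _logger.LogInformation($"C# Blob trigger function processed blob\n Name: {name}");
    return myBlob;
}
Note that the container part of the output path comes from an app setting, so it is configured per function app rather than derived from the triggering container at runtime.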
Results:

Related

Azure Function (Service Bus Trigger) Not Getting started when a new message comes into the service bus queue

I created a Service Bus triggered Azure Function in Visual Studio and published it to Azure from Visual Studio.
Whenever a message goes to the queue, the function runs fine locally when run manually, but the expectation is that the function should trigger automatically when a message arrives in the queue.
I add a new message manually and check the logs to see whether the function was triggered automatically, but it is not. When I checked Application Insights I found the following error:
The listener for function 'ProcessVideos' was unable to start. Service Bus account connection string 'connection' does not exist. Make sure that it is a defined App Setting.
Code for local.settings.json where the Service Bus connection string is set.
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "connection": "Endpoint=sb://videoupload10000.servicebus.windows.net/;SharedAccessKeyName=Listen;SharedAccessKey=80n8a0MCmh+3UZN4+4B7gDy4gp3hKCxfDI/9urDmaP8=;"
  }
}
Code for the actual function.
using System;
using System.Text;
using System.Text.Json;
using System.Threading.Tasks;
using Azure.Messaging.ServiceBus;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;
using Microsoft.Azure.Cosmos;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;
using Microsoft.Extensions.Logging;
namespace ReceiveMessages
{
    public static class Process
    {
        private static string blob_connection_string = "DefaultEndpointsProtocol=https;AccountName=videostorage1000;AccountKey=y6CVtXafqKuShZuv6BMbVj9DrymzVdNDpjDVxp6hZMvuRRjcCz/i8TrOGfM5T/JCvfG33sY3xqqW+ASt3p6V+Q==;EndpointSuffix=core.windows.net";
        private static string source_container_name = "unprocessed";
        private static string destination_container_name = "processed";
        private static readonly string _connection_string = "AccountEndpoint=https://videodbupdate.documents.azure.com:443/;AccountKey=gmR051bG7uq7o2i519m7J9nh6tb4LLctfOQ3nPMUxMu9QJWsmh1SPiY8ylvxoY3bn7kWR4cS2qwanBdIoXSrpg==;";
        private static readonly string _database_name = "appdb";
        private static readonly string _container_name = "video";

        [FunctionName("ProcessVideos")]
        public static async Task Run([ServiceBusTrigger("videoqueue", Connection = "connection")] ServiceBusReceivedMessage myQueueItem, ILogger log)
        {
            ReceivedMessage _message = JsonSerializer.Deserialize<ReceivedMessage>(Encoding.UTF8.GetString(myQueueItem.Body));
            BlobServiceClient _client = new BlobServiceClient(blob_connection_string);
            BlobContainerClient _source_container_client = _client.GetBlobContainerClient(source_container_name);
            BlobClient _source_blob_client = _source_container_client.GetBlobClient(_message.VideoName);
            BlobContainerClient _destination_container_client = _client.GetBlobContainerClient(destination_container_name);
            BlobClient _destination_blob_client = _destination_container_client.GetBlobClient(_message.VideoName);
            CosmosClient _cosmosclient = new CosmosClient(_connection_string, new CosmosClientOptions());
            Container _container = _cosmosclient.GetContainer(_database_name, _container_name);
            BlobDownloadInfo _info = _source_blob_client.Download();
            // Copy the blob to the destination container
            await _destination_blob_client.StartCopyFromUriAsync(_source_blob_client.Uri);
            log.LogInformation(_info.Details.LastModified.ToString());
            log.LogInformation(_info.ContentLength.ToString());
            BlobDetails _blobdetails = new BlobDetails();
            _blobdetails.BlobName = _message.VideoName;
            _blobdetails.BlobLocation = "https://videostorage100.blob.core.windows.net/processed/" + _message.VideoName;
            _blobdetails.ContentLength = _info.ContentLength.ToString();
            _blobdetails.LastModified = _info.Details.LastModified.ToString();
            _blobdetails.id = Guid.NewGuid().ToString();
            //_container.CreateItemAsync(_blobdetails, new PartitionKey(_message.VideoName)).GetAwaiter().GetResult();
            // await _container.CreateItemAsync(_blobdetails, new PartitionKey(_message.VideoName));
            Console.WriteLine("Item created");
            // Delete the blob from the unprocessed container
            _source_blob_client.Delete();
            // Add the details of the blob to an Azure Cosmos DB account
        }
    }
}
The local settings are not uploaded to the cloud. To add your connection string, go to your function app in Azure and select "Configuration" under "Settings" in the left-hand menu. On this screen, click "+ New application setting". When the popup opens, add "connection" as the name and your connection string as the value. Click "OK", and on the next screen click "Save" to apply the settings. Hope this helps.
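Alternatively, the same application setting can be added from the Azure CLI; a quick sketch, with the function app and resource group names as placeholders:
az functionapp config appsettings set \
    --name <function-app-name> \
    --resource-group <resource-group-name> \
    --settings "connection=<service-bus-connection-string>"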
For a Python project, the connection value in function.json needs to refer to the key in local.settings.json; it should be similar for you:
function.json:
"connection": "AzureWebJobsMyServiceBus"
local.settings.json:
"AzureWebJobsMyServiceBus": "Endpoint=sb://..."

How to get multiple filenames from a blob trigger Azure Function in C#

How can I get multiple filenames from a container using a blob trigger Azure Function in C#?
Update:
Sample:
using System;
using System.IO;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;
using Microsoft.Extensions.Logging;
using Microsoft.WindowsAzure.Storage.Blob;
namespace FunctionApp116
{
    public static class Function1
    {
        [FunctionName("Function1")]
        public static void Run([BlobTrigger("test/{name}", Connection = "str")] CloudBlockBlob myBlob, ILogger log)
        {
            string blobname = myBlob.Name;
            string containername = myBlob.Container.Name;
            BlobServiceClient blobServiceClient = new BlobServiceClient(Environment.GetEnvironmentVariable("str"));
            BlobContainerClient containerClient = blobServiceClient.GetBlobContainerClient(containername);
            log.LogInformation($"C# Blob trigger function Processed blob\n Name:{blobname}" + $" Container Name is {containername}");
            foreach (BlobItem blobItem in containerClient.GetBlobs())
            {
                log.LogInformation("\t" + blobItem.Name);
            }
        }
    }
}
Original Answer:
A blob trigger can only bind a single blob as its input.
But you can use the Blob Storage SDK to get multiple blobs from the same container.
https://learn.microsoft.com/en-us/azure/storage/blobs/storage-quickstart-blobs-dotnet#code-examples
If you tell me which language you are using, I can post a sample.

Time trigger function with CosmosDB inputs

I have created a timer trigger function in Azure Functions and added a Cosmos DB input as shown below.
Below is the .csx file
#r "Microsoft.Azure.Documents.Client"
using System;
using Microsoft.Azure.Documents;
using Microsoft.Azure.Documents.Client;
public static async Task Run(TimerInfo myTimer, string[] inputDocument, TraceWriter log)
{
    log.Info($"C# Timer trigger function executed at: {DateTime.Now}");
    // string [] = bindings.inputDocument;
    DocumentClient client;
}
How do I get the input documents from Cosmos DB into this .csx file?
I am not familiar with C#; in JavaScript we would use var Data = context.bindings.DataInput;
How do I do the same in C#?
You can use it like the snippet below:
public static void Run(TimerInfo myTimer, IEnumerable<dynamic> documents)
{
    foreach (var doc in documents)
    {
        // operate on each document
    }
}
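For a .csx function, the binding itself is declared in function.json; a rough sketch, assuming the Cosmos DB extension (v2+) and placeholder database, collection, and connection setting names:
{
  "bindings": [
    {
      "name": "myTimer",
      "type": "timerTrigger",
      "direction": "in",
      "schedule": "0 */5 * * * *"
    },
    {
      "name": "documents",
      "type": "cosmosDB",
      "direction": "in",
      "databaseName": "mydatabase",
      "collectionName": "mycollection",
      "connectionStringSetting": "CosmosDBConnection",
      "sqlQuery": "SELECT * FROM c"
    }
  ]
}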
More examples in documentation
Questions from comments
If we have more than one Cosmos DB input, do we need to add it as below?
No. Even if you have more than one input, the IEnumerable<dynamic> documents parameter is used, and you can iterate over the list.
How do we add a Cosmos DB output?
An out object parameter is used, which points to your output binding.
public static void Run(string myQueueItem, out object employeeDocument, ILogger log)
{
    log.LogInformation($"C# Queue trigger function processed: {myQueueItem}");
    dynamic employee = JObject.Parse(myQueueItem);
    employeeDocument = new {
        id = employee.name + "-" + employee.employeeId,
        name = employee.name,
        employeeId = employee.employeeId,
        address = employee.address
    };
}
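The matching Cosmos DB output binding for the out object parameter lives in function.json as well; a sketch with placeholder names:
{
  "name": "employeeDocument",
  "type": "cosmosDB",
  "direction": "out",
  "databaseName": "mydatabase",
  "collectionName": "employees",
  "createIfNotExists": true,
  "connectionStringSetting": "CosmosDBConnection"
}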
More information on Output

Why Azure function is not writing to Service Bus Topic

My function looks like this:
[FunctionName("MyFunctionName")]
[return: ServiceBus("mytopic", Connection = "ServiceBusConnectionString")]
public static async Task<string> MyFunctionAsync([QueueTrigger("my-input-queue")] string msgIn, TraceWriter log)
{
My local.settings.json has
{
  "IsEncrypted": false,
  "Values": {
    "ServiceBusConnectionString": "[my connection string]"
  }
}
where [my connection string] is copy-pasted from a Primary Connecting String under one of the Shared access policies with a Send claim.
This just silently fails: messages get stuck in my-input-queue and no errors are written to log streaming. However, I'm 100% sure the attribute is the issue, because I've deployed 100 different combinations of this trying to make it work :).
Any ideas?
Based on my test, it should work with the ServiceBus attribute. The following is my test code.
[return: ServiceBus("topicName", Connection = "ServiceBusConnectionString", EntityType = EntityType.Topic)]
public static async Task<string> Run([QueueTrigger("queueName")] string myQueueItem, TraceWriter log)
{
    ...
    return myQueueItem; // write to the Topic.
}
local.settings.json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "xxxxxx",
    "AzureWebJobsDashboard": "xxxxxxxx",
    "ServiceBusConnectionString": "xxxxxx"
  }
}
You could get more information about the Azure Service Bus output binding from this tutorial. You could also do it the following way:
[FunctionName("ServiceBusOutput")]
public static void Run([QueueTrigger("queueName")] string myQueueItem,
    TraceWriter log,
    [ServiceBus("topicName", Connection = "ServiceBusConnectionString", EntityType = EntityType.Topic)] out string queueMessage)
{
    log.Info("Azure Function Demo - Azure Service Bus Queue Topic");
    queueMessage = myQueueItem;
}
You are missing the required settings for your QueueTrigger, so your function isn't triggering on new items in the queue. You should have values for AzureWebJobsStorage and AzureWebJobsDashboard, and your QueueTrigger should have a value for the Connection field.
For more information about how to wire up QueueTriggers and test locally, see this answer.
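For local testing, that means local.settings.json should contain entries along these lines (all values are placeholders):
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "<storage-connection-string>",
    "AzureWebJobsDashboard": "<storage-connection-string>",
    "ServiceBusConnectionString": "<service-bus-connection-string>"
  }
}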

Azure Functions and DocumentDB triggers

Is it possible to specify that DocumentDB should fire triggers when writing to DocumentDB?
I have an Azure function that pulls JSON messages off a Service Bus Queue and puts them into DocumentDB like so:
using System;
using System.Threading.Tasks;
public static string Run(string myQueueItem, TraceWriter log)
{
    log.Info($"C# ServiceBus queue trigger function processed message: {myQueueItem}");
    return myQueueItem;
}
This inserts new documents into the database as they are added to the Service Bus queue; however, I need DocumentDB to process them as they are added and add attachments. That cannot be done with the present setup, so I would like to tell DocumentDB to fire a trigger.
I have tried something like this:
using System;
using System.Threading.Tasks;
public static string Run(string myQueueItem, TraceWriter log)
{
    log.Info($"C# ServiceBus queue trigger function processed message: {myQueueItem}");
    return "x-ms-documentdb-post-trigger-include: addDocument\n" + myQueueItem;
}
It doesn't work and gives me errors like this:
Exception while executing function: Functions.ServiceBusQueueTriggerCSharp1. Microsoft.Azure.WebJobs.Host: Error while handling parameter _return after function returned:. Newtonsoft.Json: Unexpected character encountered while parsing value: x. Path '', line 0, position 0.
I like this setup because I can saturate the queue with requests to add records and they simply buffer until the database can deal with them, which handles spikes in demand; data can be offloaded from the client machine as fast as the network can carry it, and the queue/database combination catches up when demand drops again.
You could refer to the following code sample to create a document with a trigger enabled in Azure Functions.
using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.Azure.Documents;
using Microsoft.Azure.Documents.Client;
public static void Run(string myQueueItem, TraceWriter log)
{
    string EndpointUri = "https://{documentdb account name}.documents.azure.com:443/";
    string PrimaryKey = "{PrimaryKey}";
    DocumentClient client = new DocumentClient(new Uri(EndpointUri), PrimaryKey);
    client.CreateDocumentAsync(UriFactory.CreateDocumentCollectionUri("{databaseid}", "{collectionid}"),
        new MyChunk { MyProperty = "hello" },
        new RequestOptions
        {
            // Run the pre-trigger registered on the collection when the document is created.
            PreTriggerInclude = new List<string> { "YourTriggerName" },
        }).Wait();
    log.Info($"C# ServiceBus queue trigger function processed message: {myQueueItem}");
}
public class MyChunk
{
    public string MyProperty { get; set; }
}
Note: to use the Microsoft.Azure.DocumentDB NuGet package in a C# script function, please upload a project.json file to the function's folder in the function app's file system.
project.json
{
  "frameworks": {
    "net46": {
      "dependencies": {
        "Microsoft.Azure.DocumentDB": "1.13.1"
      }
    }
  }
}
Besides, please make sure you have created the triggers in your DocumentDB account; for details about creating triggers, please refer to this article.
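If it helps, here is a rough sketch of registering such a pre-trigger with the same SDK; the trigger name and the addDocument.js body file are illustrative assumptions:
using System;
using System.IO;
using System.Threading.Tasks;
using Microsoft.Azure.Documents;
using Microsoft.Azure.Documents.Client;
public static async Task RegisterTriggerAsync(DocumentClient client)
{
    Trigger trigger = new Trigger
    {
        Id = "YourTriggerName",                    // the name used in PreTriggerInclude above
        TriggerType = TriggerType.Pre,             // run before the write
        TriggerOperation = TriggerOperation.Create,
        Body = File.ReadAllText("addDocument.js")  // JavaScript trigger body (assumed file)
    };
    await client.CreateTriggerAsync(
        UriFactory.CreateDocumentCollectionUri("{databaseid}", "{collectionid}"), trigger);
}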
