Azure Function - Trigger blob storage copy when last modified

First time working with Azure Functions. In my run.csx I have:
public static void Run(Stream myBlob, string name, ILogger log)
{
log.LogInformation($"C# Blob trigger function Processed blob\n Name:{name} \n Size: {myBlob.Length} Bytes");
}
and in my function.json I have:
{
"bindings": [
{
"name": "myBlob",
"type": "blobTrigger",
"direction": "in",
"path": "insights-logs-auditlogs/{name}",
"connection": "stnhnmspsplunkmgmt_STORAGE"
},
{
"name": "Output",
"path": "test/{rand-guid}",
"connection": "sttestfuncapptest_STORAGE",
"direction": "out",
"type": "blob"
}
]
}
This works fine when I edit existing files in the containers, but I can't figure out how to do the copy when a change has happened. Can anyone point me in the right direction? Greatly appreciated.

The Blob storage trigger starts a function when a new or updated blob is detected, so your trigger already fires on edits as well as on new blobs (the docs sample, for instance, writes a log when a blob is added or updated in the samples-workitems container).
To do the copy, create a SAS URL for the blob with the required permissions (read at minimum), then use that URL for the file copy in C# and publish to Azure.
Please check this SO reference:
[FunctionName("MyBlobTrigger")]
public static async Task Run([BlobTrigger("uploads/{name}", Connection = "UploadStorageAccount")] CloudBlockBlob myBlob, string name, ILogger log, CancellationToken cancellationToken)
{
    // Get a read-only SAS token for the blob, valid for one hour
    var sasToken = myBlob.GetSharedAccessSignature(new SharedAccessBlobPolicy
    {
        Permissions = SharedAccessBlobPermissions.Read,
        SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddHours(1)
    });
    // SAS URI
    var blobSasUrl = $"{myBlob.Uri.AbsoluteUri}{sasToken}";
    // storageConnection and fileShareName come from app settings (Azure.Storage.Files.Shares SDK)
    ShareClient share = new ShareClient(storageConnection, fileShareName);
    ShareDirectoryClient directory = share.GetRootDirectoryClient();
    ShareFileClient fileShare = directory.GetFileClient(name);
    try
    {
        fileShare.Create(myBlob.Properties.Length);
        // Copy the blob's contents to the storage file using an async server-side copy
        await fileShare.StartCopyAsync(new Uri(blobSasUrl));
    }
    catch (Exception ex)
    {
        log.LogError(ex, $"Copy of {name} failed");
    }
}
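Alternatively, since your function.json already declares a blob output binding to the second storage account, the copy can be done entirely through bindings: add an output Stream parameter to Run and copy the trigger stream into it. A minimal sketch, assuming the output binding is renamed from Output to outputBlob so that it matches the parameter name (binding names and C# script parameter names must line up):
public static void Run(Stream myBlob, string name, Stream outputBlob, ILogger log)
{
    log.LogInformation($"Copying blob {name}, {myBlob.Length} bytes");
    // Whatever is written to outputBlob lands in test/{rand-guid}
    // in the destination account configured in function.json
    myBlob.CopyTo(outputBlob);
}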
Other references:
c# - Azure Function Blob Trigger copy file to File Share - Stack Overflow
ADF: copy last modified blob - Microsoft Q&A / Microsoft Docs
LastModifiedDate with Azure Data Factory | Azure Blog and Updates | Microsoft Azure

Related

Passing cloud table as an input binding

I am trying to create an Azure Function which uses an Azure table as storage, which I then read in the function. I am able to run the function when I specify the signature below:
public static async System.Threading.Tasks.Task RunAsync([TimerTrigger("0 */1 * * * *")]TimerInfo myTimer, [Table("CovidUpdateTable")]CloudTable ServiceFormTable, [Blob("outcontainer/{DateTime}", FileAccess.Write)] Stream outputBlob, ILogger log)
Here, I have to specify the table name in the Table attribute even though I have specified it in the binding config below:
{
"type": "table",
"name": "ServiceFormTable",
"tableName": "CovidUpdateTable",
"take": 50,
"connection": "AzureWebJobsStorage",
"direction": "in"
}
In portal C# script I can directly bind to CloudTable, but in Visual Studio it throws an error if I remove the Table attribute and use just CloudTable. I am not sure what the purpose of tableName in this config is when I have to specify the name in the Table attribute anyway.
When we create a function in the Azure portal, it creates a C# script function (.csx file) and generates a function.json file for us, and the function reads its configuration from function.json automatically. So we can configure the bindings directly in that file and do not need to configure anything in the code. But when we create a function in Visual Studio, it creates a C# function (.cs file) and does not generate a function.json file, and the function will not read its configuration from function.json automatically. So we need to configure these settings with attributes. For more details, please refer to the document.
Update
If you want to use the binding properties in local.settings.json, please refer to the following steps
local.settings.json
{
"IsEncrypted": false,
"Values": {
"AzureWebJobsStorage": "",
"FUNCTIONS_WORKER_RUNTIME": "dotnet",
"TableConnection": "<your azure table connection string>",
"Tablename": "<your table name>"
}
}
Configure the code. You should use [Table("%Tablename%", Connection = "TableConnection")]CloudTable cloudTable to bind the Azure table.
For example
public static async Task<IActionResult> Run(
[HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)] HttpRequest req,
[Table("%Tablename%",Connection = "TableConnection")]CloudTable cloudTable,
ILogger log)
{
log.LogInformation("C# HTTP trigger function processed a request.");
log.LogInformation(cloudTable.Name);
var query = new TableQuery<DynamicTableEntity>();
foreach (var entity in
await cloudTable.ExecuteQuerySegmentedAsync(query, null))
{
log.LogInformation(
$"{entity.PartitionKey}\t{entity.RowKey}\t{entity.Timestamp}");
}
....
}
For more details, please refer to the document

Can I add DeviceID in the path while storing data from Azure Stream Analytics to Blob storage?

I have Data incoming from Different devices to IoT hub from there using Stream Analytics to process it and store it in blob storage.
I know we can add {date}/{time} in the path according to the needed format; can we add deviceId in that path too?
Example: for 2018/10/30/01 (year/month/day/hour), can we add /deviceId to that path while storing to blob?
The following is an example of a workaround for your case. It's based on using an Azure Function (HttpTrigger) as the output of the ASA job, appending data to a specific blob in a push manner.
Note that this workaround sets the Max batch count for delivering events to the Azure Function to 1 (one telemetry message at a time).
ASA job query:
SELECT
System.Timestamp as [time], *
INTO outAF
FROM
iot TIMESTAMP BY time
Azure Function (HttpTrigger):
run.csx
#r "Newtonsoft.Json"
#r "Microsoft.WindowsAzure.Storage"
using System.Net;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Primitives;
using Microsoft.WindowsAzure.Storage.Blob;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;
public static async Task<IActionResult> Run(string body, CloudBlobContainer blobContainer, ILogger log)
{
log.LogInformation($"{body}");
var jtoken = JToken.Parse(body);
var jobject = jtoken is JArray ? jtoken.SingleOrDefault<JToken>() : jtoken;
if(jobject != null)
{
var jtext = jobject.ToString(Formatting.None);
var data = JsonConvert.DeserializeAnonymousType(jtext, new {IoTHub = new { ConnectionDeviceId = ""}});
var blobName = $"{DateTime.UtcNow.ToString("yyyy/MM/dd/HH")}/{data.IoTHub.ConnectionDeviceId}"; // HH: 24-hour clock, matching the date/hour path
var blob = blobContainer.GetAppendBlobReference(blobName);
if(!await blob.ExistsAsync())
{
await blob.CreateOrReplaceAsync();
}
await blob.AppendTextAsync(jtext + "\r\n");
}
return new NoContentResult();
}
function.json
{
"bindings": [
{
"authLevel": "function",
"name": "body",
"type": "httpTrigger",
"direction": "in",
"methods": [
"get",
"post"
]
},
{
"name": "blobContainer",
"type": "blob",
"path": "myContainer",
"connection": "mySTORAGE",
"direction": "out"
},
{
"name": "$return",
"type": "http",
"direction": "out"
}
]
}
I know we can add {date}/{time} in the path according to the needed format; can we add deviceId in that path too?
As @Peter Bons mentioned in the comments, variable names in the output path are not supported so far.
As a workaround, you could use a blob-triggered Azure Function: pass the deviceId in the ASA output columns, read it in the blob trigger function, then use the blob SDK to copy the blob into a /deviceId directory and delete the previous blob, as in the sketch below.
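A minimal sketch of that blob-triggered move, assuming the ASA output writes one JSON record per blob and includes a deviceId column; the binding names (inputBlob, container) and the deviceId property name are illustrative, not from the original answer:
#r "Microsoft.WindowsAzure.Storage"
#r "Newtonsoft.Json"
using Microsoft.WindowsAzure.Storage.Blob;
using Newtonsoft.Json.Linq;
public static async Task Run(CloudBlockBlob inputBlob, string name, CloudBlobContainer container, ILogger log)
{
    // Read the ASA output record and pull the deviceId column from the payload
    var text = await inputBlob.DownloadTextAsync();
    var deviceId = (string)JObject.Parse(text)["deviceId"]; // assumes a single JSON object per blob
    // Copy the blob under a virtual /deviceId directory, then remove the original
    var target = container.GetBlockBlobReference($"{deviceId}/{name}");
    await target.StartCopyAsync(inputBlob);
    await inputBlob.DeleteAsync();
    log.LogInformation($"Moved {name} to {deviceId}/{name}");
}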

How to dynamically set the blob name when storing to Blob storage in an Azure Function (Node.js)?

I have an activity function that should store a message in Blob storage. I can overwrite a file in blob storage, but I need to store the data under a different name. How do I do that? Azure Functions doesn't support dynamic bindings in Node.js.
Here's one workaround; see whether it's useful.
Along with the blob output binding, there's an activity trigger that receives the message msg; we can put a self-defined blob name in msg for the blob binding path to consume.
In your orchestrator function, call the activity function like this:
yield context.df.callActivity("YourActivity", {'body':'messagecontent','blobName':'myblob'});
Then the activity function code should be modified:
context.bindings.myOutputBlob = context.bindings.msg.body;
And its function.json can use blobName as expected
{
"bindings": [
{
"name": "msg",
"type": "activityTrigger",
"direction": "in"
},
{
"name":"myOutputBlob",
"direction": "out",
"type": "blob",
"connection": "AzureWebJobsStorage",
"path": "azureblob/{blobName}"
}
],
"disabled": false
}

Trying to upload a file with FTP using Azure Functions

I am trying to send a file using the external file protocol and an FTP API connection. The configuration and code are straightforward and the app runs successfully; however, no data is sent to the FTP server, and I cannot see any trace that the function even tried to send data over FTP. What is wrong? And more importantly: where can I monitor the progress of the external file API?
My code follows (note: I have tried both Stream and string as input and output).
run.csx
public static void Run(Stream myBlobInput, string name, out Stream myFTPOutput, TraceWriter log)
{
myFTPOutput = myBlobInput;
//log.Info($"C# Blob trigger function Processed blob\n Name:{name} \n Content:{myBlob}");
log.Info($"C# Blob trigger function Processed blob\n Name:{name} \n Size:{myBlobInput.Length} \n Content:{myBlobInput.ToString()}");
}
function.json
"bindings": [
{
"name": "myBlobInput",
"type": "blobTrigger",
"direction": "in",
"path": "input/{name}",
"connection": "blob_STORAGE"
},
{
"name": "myFTPOutput",
"type": "apiHubFile",
"direction": "out",
"path": "/output/{name}",
"connection": "ftp_FTP"
}
],
"disabled": false
}
I could make it work:
If we want to have the same file content on the output FTP server and the same file name, then here are the code and function.json.
public static void Run(string myBlob, string name, TraceWriter log, out string outputFile)
{
log.Info($"2..C# Blob trigger function Processed blob\n Name:{name} \n Size:{myBlob.Length} \n Content:{myBlob.ToString()}");
outputFile=myBlob;
}
Also here is the function.json
{
"bindings": [
{
"name": "myBlob",
"type": "blobTrigger",
"direction": "in",
"path": "myblobcontainer/{name}",
"connection": "AzureWebJobsDashboard"
},
{
"type": "apiHubFile",
"name": "outputFile",
"path": "LogFiles/{name}",
"connection": "ftp_FTP",
"direction": "out"
}
],
"disabled": false
}
The input binding should have a valid container name from the blob account as the path (here, myblobcontainer/{name}):
(screenshot: blob container structure as path)
Also, in the output binding for FTP, the path should be a folder in the root of the FTP server (what you see in the FTP login UI/console) followed by the filename, in this case {name}, which keeps the output file name the same as the input blob name.
OK, so I changed the FTP connection to some other server and it worked like a charm. That means the original server was refusing connections from the triggered Azure Function through its firewall. The sad thing is that no error message was raised that I could spot. Thanks for all the support.

How do you reference a blob from an Azure Timer Trigger Function?

I have an Azure timer trigger function that should do some calculations and write the results to a JSON file in a pre-existing blob. How do I reference the pre-existing blob from within the timer-triggered function?
I can't seem to find any documentation that provides a code sample. Can someone provide one?
First, you need to update your function.json configuration file to bind the blob to the CloudBlockBlob instance you'll be using in your .csx code. You can edit it in the Azure portal via the "Integrate" option (the one with the lightning icon) under your function, in the Function Apps menu. On the top right of that page is a link that reads "Advanced editor"; clicking it will take you to your function's function.json file.
You'll see a JSON array named "bindings" that contains a JSON object that configures your timer. You'll want to add another JSON object to that array to bind your blob to a CloudBlockBlob instance that you'll be referencing in your function. Your function.json file will look something like this:
{
"bindings": [
{
"name": "myTimer",
"type": "timerTrigger",
"direction": "in",
"schedule": "0 */5 * * * *"
},
{
"type": "blob",
"name": "myBlob",
"path": "your-container-name/your_blob_filename.json",
"connection": "AzureWebJobsStorage",
"direction": "inout"
}
],
"disabled": false
}
Now you just need to update your function's Run method's signature. It looks like this by default:
public static void Run(TimerInfo myTimer, TraceWriter log)
Add your blob variable to the end of that signature (and also add the necessary includes):
#r "Microsoft.WindowsAzure.Storage"
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;
public static void Run(TimerInfo myTimer, TraceWriter log, CloudBlockBlob myBlob)
And you're all set! "myBlob" is bound to the blob "your_blob_filename.json" in your "your-container-name" container.
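From there, writing the calculation results into that JSON blob is a one-liner. A minimal sketch, assuming Newtonsoft.Json is available and using a hypothetical results object in place of your real calculation:
#r "Microsoft.WindowsAzure.Storage"
#r "Newtonsoft.Json"
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;
using Newtonsoft.Json;
public static async Task Run(TimerInfo myTimer, TraceWriter log, CloudBlockBlob myBlob)
{
    // Hypothetical calculation result; replace with your own computation
    var results = new { computedAt = DateTime.UtcNow, value = 42 };
    // Serialize and overwrite the pre-existing blob's JSON content
    await myBlob.UploadTextAsync(JsonConvert.SerializeObject(results));
    log.Info($"Wrote results to {myBlob.Name}");
}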
