I am building a timer-triggered Azure Function that uses a PowerShell script, with an Azure Table Storage input binding. To work with newly added rows in my script, I was hoping to use the automatic Timestamp value that gets generated whenever an entity is added to the table. However, the Timestamp comes through empty when I run the function, even though the values are present in the table.
My test code:
# Input bindings are passed in via param block.
param($Timer, $inputTable)

$newValues = $inputTable.GetEnumerator()
$newValues | ForEach-Object {
    Write-Host $_.Timestamp
}
Output when using Write-Host $_.Timestamp:
2022-06-17T07:16:55.538 [Information] OUTPUT:
2022-06-17T07:16:55.614 [Information] INFORMATION:
2022-06-17T07:16:55.616 [Information] INFORMATION:
2022-06-17T07:16:55.621 [Information] INFORMATION:
Output for any other value, e.g. Write-Host $_.PartitionKey:
2022-06-17T07:17:34.230 [Information] OUTPUT:
2022-06-17T07:17:34.310 [Information] INFORMATION: partition1
2022-06-17T07:17:34.312 [Information] INFORMATION: partition1
2022-06-17T07:17:34.318 [Information] INFORMATION: partition1
If you're using the Table Storage input binding, then I don't think the Timestamp is present in the returned data.
Example
I have a function with an HTTP input binding and a Table Storage input binding that looks up a list of users from table storage when invoked.
Table Storage
function.json
{
    "bindings": [
        {
            "authLevel": "function",
            "type": "httpTrigger",
            "direction": "in",
            "name": "Request",
            "methods": [
                "get"
            ],
            "route": "get/user"
        },
        {
            "name": "PersonEntity",
            "type": "table",
            "tableName": "users",
            "connection": "MyStorageConnectionAppSetting",
            "direction": "in"
        },
        {
            "type": "http",
            "direction": "out",
            "name": "Response"
        }
    ]
}
run.ps1
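A minimal run.ps1 sketch consistent with the bindings above (illustrative, not the original code):

using namespace System.Net

# Bindings from function.json: $Request (HTTP trigger) and
# $PersonEntity (the rows bound from the "users" table).
param($Request, $PersonEntity)

# Return the bound table rows as the HTTP response body.
Push-OutputBinding -Name Response -Value ([HttpResponseContext]@{
    StatusCode = [HttpStatusCode]::OK
    Body       = $PersonEntity
})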
When I trigger the function it returns all users, but the Timestamp property is not returned from table storage. I don't think there's a way to retrieve it using the input binding.
Debugging Output
Also, it looks like someone logged an issue here:
https://github.com/Azure/azure-functions-nodejs-worker/issues/320
The same happens with the Python and Node.js bindings, so this is not unique to PowerShell.
You could use the AzTable module that was linked in the other answer, or there is another module (AzBobbyTables) in the PowerShell Gallery which is newer, written in C#, and supposed to be much more performant:
https://www.powershellgallery.com/packages/AzBobbyTables/2.0.0
https://twitter.com/PalmEmanuel/status/1462818044959461382
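For example, a rough sketch with AzBobbyTables (cmdlet and parameter names as documented for that module, so verify against your installed version; the table name, filter, and connection string are illustrative assumptions):

Import-Module AzBobbyTables

# Entities returned by the underlying Azure.Data.Tables SDK carry the
# system Timestamp, so it should come back here as well.
$rows = Get-AzDataTableEntity -ConnectionString $env:AzureWebJobsStorage `
    -TableName 'users' -Filter "PartitionKey eq 'partition1'"
$rows | ForEach-Object { Write-Host $_.Timestamp }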
If you are using Get-AzTableRow in your Azure Function, it will not give you the timestamp directly.
Get-AzTableRow returns the table columns along with PartitionKey and RowKey.
If you are trying to get the timestamp you have to convert it into a string, like below:
Type 1:
[String]$_.Timestamp

Type 2:
$_.Timestamp.ToString("s")
I have added the same thing to your code:

$newValues | ForEach-Object {
    Write-Host $_.Timestamp.ToString("s")
}
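If you query the table yourself with the AzTable module instead of relying on the binding, note that it surfaces the system timestamp as TableTimestamp rather than Timestamp. A rough sketch (the table name and connection string are illustrative assumptions):

Import-Module AzTable

# Build a storage context and get a reference to the table.
$storageCtx = New-AzStorageContext -ConnectionString $env:AzureWebJobsStorage
$cloudTable = (Get-AzStorageTable -Name 'yourtable' -Context $storageCtx).CloudTable

Get-AzTableRow -Table $cloudTable | ForEach-Object {
    # AzTable exposes the system timestamp as TableTimestamp
    Write-Host $_.TableTimestamp.ToString('s')
}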
Related
I am trying to implement a Get Metadata activity to return the column count of files I have in a single blob storage container.
Get Metadata activity is returning this error:
Error
I'm fairly new to Azure Data Factory and cannot solve this. Here's what I have:
Dataset: Source dataset
Name: ten_eighty_split_CSV
Connection: Blob storage
Schema: imported from blob storage file
Parameters: "FileName"; string; "#pipeline().parameters.SourceFile"
Pipeline:
Name: ten eighty split
Parameters: "SourceFile"; string; "#pipeline().parameters.SourceFile"
Settings: Concurrency: 1
Get Metadata activity: Get Metadata
Only argument is "Column count"
It throws the error upon debugging. I am not sure what to do; "(404) not found" is so broad I could not ascertain a specific solution. Thanks!
The error occurs because you have given an incorrect file name or the name of a file that does not exist.
Since you are trying to use a blob-created event trigger to find the column count, you can use the procedure below:
After configuring the Get Metadata activity, create a storage event trigger: go to Add trigger -> Choose trigger -> Create new.
Click Continue. You will get a Trigger Run Parameters tab; there, set the value to #triggerBody().fileName.
Complete the trigger creation and publish the pipeline. Now whenever a file is uploaded into your container (the one on which you created the storage event trigger), it will trigger the pipeline automatically (no need to debug). If the container is empty and you try to debug by giving some value for the sourceFile parameter, it will give the same error.
Upload a sample file to your container. It will trigger the pipeline and give the desired result.
The following is the trigger JSON that I created for my container:
{
    "name": "trigger1",
    "properties": {
        "annotations": [],
        "runtimeState": "Started",
        "pipelines": [
            {
                "pipelineReference": {
                    "referenceName": "pipeline1",
                    "type": "PipelineReference"
                },
                "parameters": {
                    "sourceFile": "#triggerBody().fileName"
                }
            }
        ],
        "type": "BlobEventsTrigger",
        "typeProperties": {
            "blobPathBeginsWith": "/data/blobs/",
            "blobPathEndsWith": ".csv",
            "ignoreEmptyBlobs": true,
            "scope": "/subscriptions/b83c1ed3-c5b6-44fb-b5ba-2b83a074c23f/resourceGroups/<user>/providers/Microsoft.Storage/storageAccounts/blb1402",
            "events": [
                "Microsoft.Storage.BlobCreated"
            ]
        }
    }
}
I'm new to the Azure environment. I want to execute an .exe file, passing a blob storage file as input and writing the output back to blob storage.
Below are my function.json and PowerShell code. I am not sure how to pass the input blob storage file as a parameter and get the output into blob storage.
I tried to run the code below but I am getting an error.
Function.json
{
    "bindings": [
        {
            "name": "Timer",
            "type": "timerTrigger",
            "direction": "in",
            "schedule": "0 */5 * * * *"
        },
        {
            "name": "inputBlob",
            "direction": "in",
            "type": "blob",
            "path": "inputfilepath/{name}",
            "connection": "test23_STORAGE"
        },
        {
            "name": "outputBlob",
            "direction": "out",
            "type": "blob",
            "path": "outputfilepath/{name}",
            "connection": " test23_STORAGE"
        }
    ]
}
Run.ps1
# Input bindings are passed in via param block.
param($Timer, $inputBlob, $outputBlob)

# Get the current universal time in the default string format.
$currentUTCtime = (Get-Date).ToUniversalTime()

# The 'IsPastDue' property is 'true' when the current function invocation is later than scheduled.
if ($Timer.IsPastDue) {
    Write-Host "PowerShell timer is running late!"
}

# Write an information log with the current time.
Write-Host "PowerShell timer trigger function ran! TIME: $currentUTCtime"

Set-Location "C:\home\site\wwwroot\TimerTrigger1"
.\Extractor.exe -F $inputBlob -E $outputBlob
Error
2022-09-10T07:09:45Z [Information] Executing 'Functions.TimerTrigger1' (Reason='This function was programmatically called via the host APIs.', Id=07f70ccc-3e39-4e98-a830-73bfda54d101)
2022-09-10T07:09:45Z [Error] Executed 'Functions.TimerTrigger1' (Failed, Id=07f70ccc-3e39-4e98-a830-73bfda54d101, Duration=7ms)
AFAIK, we cannot use multiple triggers in one function, but we can use multiple input/output bindings alongside a single trigger.
I want to execute an .exe file by passing blob storage as an input and output parameter
This can be done in two ways: either fetch the .exe file from the storage blob container path and run it, or use WebJobs with a CRON expression for these kinds of tasks.
From Azure Functions, define one function that triggers the .exe at runtime and another function that stores the output of that .exe application in blob storage.
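As a rough sketch of the single-function approach (assuming fixed blob paths in the bindings, since a {name} expression has no trigger metadata to resolve against on a timer trigger; the temp-file names and exe switches are illustrative):

param($Timer, $inputBlob)

# Write the bound input blob (a byte array by default) to a temp file
# so the exe can read it from disk.
$inPath  = Join-Path $env:TEMP 'input.dat'
$outPath = Join-Path $env:TEMP 'output.dat'
[System.IO.File]::WriteAllBytes($inPath, $inputBlob)

Set-Location "C:\home\site\wwwroot\TimerTrigger1"
.\Extractor.exe -F $inPath -E $outPath

# Push the exe's result to the blob output binding.
Push-OutputBinding -Name outputBlob -Value ([System.IO.File]::ReadAllBytes($outPath))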
I can't quite seem to get the output binding to save a file to blob storage. I have created an Azure Function using Python that uses a Cosmos DB change feed trigger, and I need to save each document to blob storage.
I've set-up the function.json file as follows:
{
    "scriptFile": "__init__.py",
    "bindings": [
        {
            "type": "cosmosDBTrigger",
            "name": "documents",
            "direction": "in",
            "leaseCollectionName": "leases",
            "connectionStringSetting": "cosmos_dev",
            "databaseName": "MyDatabase",
            "collectionName": "MyCollection",
            "createLeaseCollectionIfNotExists": "true"
        },
        {
            "type": "blob",
            "direction": "out",
            "name": "outputBlob",
            "path": "raw/changefeedOutput/{blobname}",
            "connection": "blobStorageConnection"
        }
    ]
}
So the trigger will receive documents like the following:

{
    "id": "documentId-12345",
    ...other sections here...
    "entity": "customer"
}
In the __init__.py file I have this base code:
import logging
import azure.functions as func

def main(documents: func.DocumentList) -> func.Document:
    logging.info(f"CosmosDB trigger executed!")
    for doc in documents:
        blobName = doc['id'] + '.json'
        blobFolder = doc['entity']
        blobData = doc.to_json()
I think I need to add something like outputBlob: func.Out to the def, but I'm unsure how to proceed.
Looking at the examples on GitHub
https://github.com/yokawasa/azure-functions-python-samples/tree/master/v2functions/blob-trigger-watermark-blob-out-binding
it looks like I have to call
outputBlob.set(something)
So I'm looking for how to set up the def part and send the blob to the location I've set, using the data in the Cosmos DB document.
I have tried the following:
def main(documents: func.DocumentList, outputBlob: func.Out[str]) -> func.Document:
    logging.info(f"CosmosDB trigger executed!")
    for doc in documents:
        blobName = doc['id'] + '.json'
        outputBlob.set(blobName)
and get the result:
CosmosDB trigger executed!
Executed 'Functions.CosmosTrigger_py' (Failed, Id=XXXXX)
System.Private.CoreLib: Exception while executing function: Functions.CosmosTrigger_py. Microsoft.Azure.WebJobs.Host: No value for named parameter 'blobname'.
I could just get the connection string from os.environ instead, I think, and use the standard create_blob_from_text with a location, name, and blob data:
block_blob_service.create_blob_from_text(blobLocation, blobName, formattedBlob)
Any pointers would be great
We are using a Cosmos DB trigger for Azure Functions and are unable to use environment variables in our function.json for sticky slot settings, the way we can with other trigger types.
Has anyone else had success using environment variables in function.json with the Cosmos DB trigger type?
function.json
{
    "bindings": [
        {
            "type": "cosmosDBTrigger",
            "name": "inputDocs",
            "direction": "in",
            "leaseDatabaseName": "leases",
            "leaseCollectionName": "MyCosmosCollection-myFunction",
            "connectionStringSetting": "CosmosTriggers-SourceAdapter",
            "databaseName": "%cosmos-triggers-database-name%",
            "collectionName": "MyCosmosCollection",
            "createLeaseCollectionIfNotExists": true
        }
    ],
    "disabled": false
}
Azure Function Error
Function ($myFunction) Error: The listener for function 'Functions.myFunction' was unable to start.
Microsoft.Azure.WebJobs.Extensions.DocumentDB: Either the source collection 'MyCosmosCollection' (in database '%cosmos-triggers-database-name%') or the lease collection 'MyCosmosCollection-myFunction' (in database 'leases') does not exist. Both collections must exist before the listener starts. To automatically create the lease collection, set 'CreateLeaseCollectionIfNotExists' to 'true'. Microsoft.Azure.Documents.Client: Message: {"Errors":["Resource Not Found"]}
ActivityId: b00f7802-fccb-47eb-972d-0bd70ec896c1, Request URI: rntbd://bn6prdddc05-docdb-1.documents.azure.com:14639/apps/6628b461-75d4-4e4a-9897-ada4076dc30c/services/1b0fc27a-de15-45cf-a1b2-ebfce044d1e2/partitions/34cfee55-54aa-4e31-81f4-08cf1bfdf62f/replicas/131523094168492638s/.
Session Id: 092ccb7ce9104407bf56c26a5cc8b119
Timestamp: 2017-10-31T19:13:03.914Z
My flow is very simple: I want an Azure Function that runs once a day, and then use its output to create a file in Dropbox.
The function does some processing and returns an object with two properties, a FileName and a FileContent, both strings:
return new AzureFunctionResponse
{
    FileName = $"TestFile-{DateTimeOffset.UtcNow.ToUnixTimeMilliseconds()}",
    FileContent = "This is the file content"
};
My problem is that I don't know how to use those two properties to set up my Dropbox connector.
Here's my LogicApp flow:
I'd like to use the FileName and FileContent returned from my Azure Function to populate the respective fields in the Dropbox connector, but I have no idea how to set this up. I've looked for documentation, but maybe I'm not looking in the right place because I'm not finding anything.
Also here are the bindings in my function.json file, if that can be of any help.
{
    "disabled": false,
    "bindings": [
        {
            "type": "httpTrigger",
            "direction": "in",
            "webHookType": "genericJson",
            "name": "req"
        },
        {
            "type": "http",
            "direction": "out",
            "name": "res"
        }
    ]
}
Using the Parse JSON action after the function should do exactly what you need. It will parse the output and make the properties available for you in the next step.
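For reference, a minimal schema sketch for the Parse JSON action, assuming the function returns just the two string properties shown above:

{
    "type": "object",
    "properties": {
        "FileName": { "type": "string" },
        "FileContent": { "type": "string" }
    }
}

You can also paste a sample response into the action's "Use sample payload to generate schema" option and let it generate the schema for you.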
As an alternative, you can implement the whole thing without using Logic Apps.
Make an Azure Function with a timer trigger and an API Hub File output binding. No HTTP bindings are needed.
See this question for an example.
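A rough function.json sketch of that setup (the binding names, schedule, and connection value are illustrative assumptions; the connection should name the API connection app setting the portal creates for Dropbox):

{
    "bindings": [
        {
            "name": "myTimer",
            "type": "timerTrigger",
            "direction": "in",
            "schedule": "0 0 6 * * *"
        },
        {
            "name": "outputFile",
            "type": "apiHubFile",
            "direction": "out",
            "path": "TestFile-{rand-guid}.txt",
            "connection": "dropbox_DROPBOX"
        }
    ]
}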