Pass a Parameter to an Azure PowerShell Function

I'm new to the Azure environment. I want to execute an .exe file, passing a blob storage file as an input parameter and writing the result back to blob storage as output.
Below are my function.json and my PowerShell code. I am not sure how to pass the input blob storage file as a parameter and get the output onto blob storage.
I tried to run the code below, but I am getting an error.
Function.json
{
    "bindings": [
        {
            "name": "Timer",
            "type": "timerTrigger",
            "direction": "in",
            "schedule": "0 */5 * * * *"
        },
        {
            "name": "inputBlob",
            "direction": "in",
            "type": "blob",
            "path": "inputfilepath/{name}",
            "connection": "test23_STORAGE"
        },
        {
            "name": "outputBlob",
            "direction": "out",
            "type": "blob",
            "path": "outputfilepath/{name}",
            "connection": "test23_STORAGE"
        }
    ]
}
Run.ps1
# Input bindings are passed in via param block.
param($Timer, $inputBlob, $outputBlob)

# Get the current universal time in the default string format.
$currentUTCtime = (Get-Date).ToUniversalTime()

# The 'IsPastDue' property is 'true' when the current function invocation is later than scheduled.
if ($Timer.IsPastDue) {
    Write-Host "PowerShell timer is running late!"
}

# Write an information log with the current time.
Write-Host "PowerShell timer trigger function ran! TIME: $currentUTCtime"

Set-Location "C:\home\site\wwwroot\TimerTrigger1"
.\Extractor.exe -F $inputBlob -E $outputBlob
Error
2022-09-10T07:09:45Z [Information] Executing 'Functions.TimerTrigger1' (Reason='This function was programmatically called via the host APIs.', Id=07f70ccc-3e39-4e98-a830-73bfda54d101)
2022-09-10T07:09:45Z [Error] Executed 'Functions.TimerTrigger1' (Failed, Id=07f70ccc-3e39-4e98-a830-73bfda54d101, Duration=7ms)

AFAIK, you cannot use multiple triggers in one function, but you can use multiple input/output bindings alongside a single trigger.

I want to execute an .exe file by passing blob storage as an input and output parameter

This can be done in two ways: either fetch the .exe file from the storage blob container path and run it, or use WebJobs with a CRON expression for these kinds of tasks.
From Azure Functions, define one function that triggers the .exe file at runtime and another function that stores the output of that .exe application in blob storage.
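The likely cause of the error above is also worth spelling out: both blob bindings use the {name} binding expression, but binding expressions are filled in from trigger metadata, and a timer trigger supplies no blob name, so the function fails before the script runs. A stdlib-only sketch of that resolution rule (resolve_binding_path is a hypothetical helper for illustration, not part of the Functions runtime):

```python
import re

def resolve_binding_path(template: str, trigger_metadata: dict) -> str:
    """Substitute {token} placeholders in a binding path from trigger metadata."""
    def repl(match):
        key = match.group(1)
        if key not in trigger_metadata:
            raise KeyError(f"No value for named parameter '{key}'.")
        return trigger_metadata[key]
    return re.sub(r"\{(\w+)\}", repl, template)

# A blob trigger supplies "name" in its metadata, so the path resolves:
print(resolve_binding_path("inputfilepath/{name}", {"name": "data.csv"}))
# -> inputfilepath/data.csv

# A timer trigger supplies no blob metadata, so resolution fails,
# much like the binding error in the question:
try:
    resolve_binding_path("inputfilepath/{name}", {})
except KeyError as err:
    print(err)
```

With a timer trigger you would instead use a fixed blob path (no {name} token), or read and write the blobs yourself inside the script.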

Related

Azure Functions PowerShell - Accessing Timestamp value from Azure Storage Table

I am building a timer-triggered Azure Function that uses a PowerShell script, with Azure Table Storage as the input binding. To use newly added values in my script, I was hoping to use the automatic Timestamp value that gets generated on each addition to the table. However, the Timestamp comes back empty when I run the function, even though the values are present in the table.
My test code -
# Input bindings are passed in via param block.
param($Timer, $inputTable)

$newValues = $inputTable.GetEnumerator()
$newValues | ForEach-Object {
    Write-Host $_.Timestamp
}
Output for Write-Host $_.Timestamp:
2022-06-17T07:16:55.538 [Information] OUTPUT:
2022-06-17T07:16:55.614 [Information] INFORMATION:
2022-06-17T07:16:55.616 [Information] INFORMATION:
2022-06-17T07:16:55.621 [Information] INFORMATION:
Output for any other value, e.g. Write-Host $_.PartitionKey:
2022-06-17T07:17:34.230 [Information] OUTPUT:
2022-06-17T07:17:34.310 [Information] INFORMATION: partition1
2022-06-17T07:17:34.312 [Information] INFORMATION: partition1
2022-06-17T07:17:34.318 [Information] INFORMATION: partition1
If you're using the Table Storage input binding, then I don't think the Timestamp is present in the returned data.
Example
I have a function with an HTTP trigger and a Table Storage input binding that looks up a list of users from Table Storage when invoked.
Table Storage
function.json
{
    "bindings": [
        {
            "authLevel": "function",
            "type": "httpTrigger",
            "direction": "in",
            "name": "Request",
            "methods": [
                "get"
            ],
            "route": "get/user"
        },
        {
            "name": "PersonEntity",
            "type": "table",
            "tableName": "users",
            "connection": "MyStorageConnectionAppSetting",
            "direction": "in"
        },
        {
            "type": "http",
            "direction": "out",
            "name": "Response"
        }
    ]
}
run.ps1
When I trigger the function it returns all users, but the Timestamp property is not returned from Table Storage. I don't think there's a way to retrieve it using the input binding.
Debugging Output
Also looks like someone logged an issue here:
https://github.com/Azure/azure-functions-nodejs-worker/issues/320
And this is the same as with Python and NodeJs bindings so not unique to PowerShell.
You could use the AzTable module that was linked in the other answer, or there is another module (AzBobbyTables) in the PowerShell Gallery, which is newer, written in C#, and supposed to be much more performant:
https://www.powershellgallery.com/packages/AzBobbyTables/2.0.0
https://twitter.com/PalmEmanuel/status/1462818044959461382
If you are using Get-AzTableRow in your Azure Function, it will not give you the timestamp.
Get-AzTableRow returns the table columns plus the PartitionKey and RowKey.
If you are trying to get the timestamp, you have to convert it to a string, like below.
Option 1:
[String]$_.Timestamp
Option 2:
$_.Timestamp.ToString("s")
I have added the same thing to your code:
$newValues | ForEach-Object {
    Write-Host $_.Timestamp.ToString("s")
}
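As a side note, the "s" format specifier produces the ISO 8601 "sortable" pattern (yyyy-MM-ddTHH:mm:ss). For comparison, a small Python illustration of the same output shape (the datetime value here is made up):

```python
from datetime import datetime

# ISO 8601 "sortable" formatting, the equivalent of .NET's ToString("s").
ts = datetime(2022, 6, 17, 7, 16, 55)
print(ts.isoformat(timespec="seconds"))  # -> 2022-06-17T07:16:55
```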

How to set the output bindings for name and location in an Azure Function using Python?

I can't quite seem to get the output bindings to save a file to blob storage. I have created an Azure Function in Python that uses a Cosmos DB change feed trigger, and I need to save the triggering document to blob storage.
I've set up the function.json file as follows:
{
    "scriptFile": "__init__.py",
    "bindings": [
        {
            "type": "cosmosDBTrigger",
            "name": "documents",
            "direction": "in",
            "leaseCollectionName": "leases",
            "connectionStringSetting": "cosmos_dev",
            "databaseName": "MyDatabase",
            "collectionName": "MyCollection",
            "createLeaseCollectionIfNotExists": "true"
        },
        {
            "type": "blob",
            "direction": "out",
            "name": "outputBlob",
            "path": "raw/changefeedOutput/{blobname}",
            "connection": "blobStorageConnection"
        }
    ]
}
So the trigger will get documents like the following:
{
    "id": "documentId-12345",
    other sections here
    "entity": "customer"
}
In the __init__.py file I have the base code of:
def main(documents: func.DocumentList) -> func.Document:
    logging.info(f"CosmosDB trigger executed!")
    for doc in documents:
        blobName = doc['id'] + '.json'
        blobFolder = doc['entity']
        blobData = doc.to_json()
I think I need to add something like outputBlob: func.Out to the def, but I'm unsure how to proceed.
Looking at the examples on GitHub
https://github.com/yokawasa/azure-functions-python-samples/tree/master/v2functions/blob-trigger-watermark-blob-out-binding
it looks like I have to call
outputBlob.set(something)
So I'm looking for how to set up the def part and send the blob to the location I've set, using the data in the Cosmos DB document.
I have tried the following:
def main(documents: func.DocumentList, outputBlob: func.Out[str]) -> func.Document:
    logging.info(f"CosmosDB trigger executed!")
    for doc in documents:
        blobName = doc['id'] + '.json'
        outputBlob.set(blobName)
and get the result:
CosmosDB trigger executed!
Executed 'Functions.CosmosTrigger_py' (Failed, Id=XXXXX)
System.Private.CoreLib: Exception while executing function: Functions.CosmosTrigger_py. Microsoft.Azure.WebJobs.Host: No value for named parameter 'blobname'.
I could just get the connection string from os.environ and use the standard create_blob_from_text with location, name, and blob data:
block_blob_service.create_blob_from_text(blobLocation, blobName, formattedBlob)
Any pointers would be great.
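The binding error ("No value for named parameter 'blobname'") happens because nothing in the Cosmos DB trigger's metadata supplies a value for {blobname} in the output path. If you go the SDK route the question suggests, the part you can compute up front is the blob path and payload; here is a minimal stdlib-only sketch (prepare_blob is a hypothetical helper; the actual upload call, e.g. create_blob_from_text, is omitted):

```python
import json

def prepare_blob(doc: dict, container: str = "raw") -> tuple:
    """Build a blob path and a JSON payload from a change-feed document.

    doc is a plain dict standing in for func.Document here.
    """
    blob_path = f"{container}/{doc['entity']}/{doc['id']}.json"
    return blob_path, json.dumps(doc)

path, payload = prepare_blob({"id": "documentId-12345", "entity": "customer"})
print(path)  # -> raw/customer/documentId-12345.json
```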

How to dynamically set the blob name to store in Blob Storage in an Azure Function (Node.js)?

I have an activity function that should store a message in Blob Storage. I can overwrite a file in blob storage, but I need to store the data under a different name. How can I do that? Azure Functions doesn't support dynamic bindings in Node.js.

Found one workaround; see whether it's useful.
Along with the blob output binding, there's an activity trigger to receive the message msg; we can put a self-defined blob name in msg for the blob binding path to consume.
In your orchestrator function, which calls the activity function:
yield context.df.callActivity("YourActivity", {'body':'messagecontent','blobName':'myblob'});
Then the activity function code should be modified:
context.bindings.myOutputBlob = context.bindings.msg.body;
And its function.json can use blobName as expected:
{
    "bindings": [
        {
            "name": "msg",
            "type": "activityTrigger",
            "direction": "in"
        },
        {
            "name": "myOutputBlob",
            "direction": "out",
            "type": "blob",
            "connection": "AzureWebJobsStorage",
            "path": "azureblob/{blobName}"
        }
    ],
    "disabled": false
}

Trying to upload a file with FTP using Azure Functions

I am trying to send a file using the external file protocol and an FTP API connection. The configuration and code are straightforward and the app runs successfully; however, no data is sent to the FTP server, and I cannot see any trace that the function even tried to send data using FTP. What is wrong? And more importantly: where can I monitor the progress of the external file API?
My code follows (note: I have tried both Stream and string as input and output).
run.csx
public static void Run(Stream myBlobInput, string name, out Stream myFTPOutput, TraceWriter log)
{
    myFTPOutput = myBlobInput;
    //log.Info($"C# Blob trigger function Processed blob\n Name:{name} \n Content:{myBlob}");
    log.Info($"C# Blob trigger function Processed blob\n Name:{name} \n Size:{myBlobInput.Length} \n Content:{myBlobInput.ToString()}");
}
function.json
{
    "bindings": [
        {
            "name": "myBlobInput",
            "type": "blobTrigger",
            "direction": "in",
            "path": "input/{name}",
            "connection": "blob_STORAGE"
        },
        {
            "name": "myFTPOutput",
            "type": "apiHubFile",
            "direction": "out",
            "path": "/output/{name}",
            "connection": "ftp_FTP"
        }
    ],
    "disabled": false
}
I could make it work.
If we want the same file content and the same file name on the output FTP server, then here are the code and function.json.
public static void Run(string myBlob, string name, TraceWriter log, out string outputFile)
{
    log.Info($"2..C# Blob trigger function Processed blob\n Name:{name} \n Size:{myBlob.Length} \n Content:{myBlob.ToString()}");
    outputFile = myBlob;
}
Also here is the function.json
{
    "bindings": [
        {
            "name": "myBlob",
            "type": "blobTrigger",
            "direction": "in",
            "path": "myblobcontainer/{name}",
            "connection": "AzureWebJobsDashboard"
        },
        {
            "type": "apiHubFile",
            "name": "outputFile",
            "path": "LogFiles/{name}",
            "connection": "ftp_FTP",
            "direction": "out"
        }
    ],
    "disabled": false
}
The input binding should use a valid container name from the blob account as the path (i.e. the blob container structure in the storage account).
Also, in the output binding for FTP, the path should be a folder in the root of the FTP server (what you see in the FTP login UI/console), followed by the filename, in this case {name}, which keeps the output file name the same as the input blob name.
OK, so I changed the FTP connection to some other server and it worked like a charm. That means there was some firewall refusal on the Azure Function side. The sad thing is that no error message I could spot was ever raised. Thanks for all the support.

Using Azure Function output parameters with the Dropbox connector

My flow is very simple: I want an Azure Function that runs once a day and then uses its output to create a file in Dropbox.
The function does some processing and returns an object with two properties, a FileName and a FileContent, both strings:
return new AzureFunctionResponse
{
    FileName = $"TestFile-{DateTimeOffset.UtcNow.ToUnixTimeMilliseconds()}",
    FileContent = "This is the file content"
};
My problem is that I don't know how to use those two properties to set up my Dropbox connector.
Here's my Logic App flow:
I'd like to use the FileName and FileContent returned from my Azure Function to populate the respective fields in the Dropbox connector, but I have no idea how to set this up. I've looked for documentation, but maybe I'm not looking in the right place, because I'm not finding anything.
Also here are the bindings in my function.json file, if that can be of any help.
{
    "disabled": false,
    "bindings": [
        {
            "type": "httpTrigger",
            "direction": "in",
            "webHookType": "genericJson",
            "name": "req"
        },
        {
            "type": "http",
            "direction": "out",
            "name": "res"
        }
    ]
}
Using the Parse JSON action after the function should do exactly what you need. It will parse the output and make the properties available to you in the next step.
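The Parse JSON action needs a schema matching the function's response; given the two string properties above, something like the following should work (Logic Apps can also generate this for you from a sample payload):

```json
{
    "type": "object",
    "properties": {
        "FileName": { "type": "string" },
        "FileContent": { "type": "string" }
    }
}
```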
As an alternative, you can implement the whole thing without using Logic Apps:
make an Azure Function with a timer trigger and an API Hub File output binding. No HTTP bindings are needed.
See this question for an example.
