Logic App 'When a resource event occurs' won't trigger - Azure

I have a blob storage account with two containers called input and output. When a file gets uploaded to input, a Function App (BlobTrigger) works on it and saves the result in the output container.
Now I need to trigger a workflow in Azure Logic Apps. I didn't create any Event Grid subscription outside of this workflow, and I'm trying to trigger it when a file gets uploaded (created) in the output container.
However, my Logic App won't trigger. What should I do?

I have reproduced this in my environment: I triggered an event when a blob was uploaded, and the Logic App fired.
Please find the approach below to fix your issue:
Then I uploaded a blob like below:
Output:
EDIT:
I also uploaded into a subfolder:
Then in the output subfolder:

I solved it.
Make sure your storage account is version 2 (it's really important - check it).
Mine was V1, so I had to change it here:
Use this as a filter for your specific container (for more info, check the Microsoft docs):
/blobServices/default/containers/MyContainer/
In my case it would be:
/blobServices/default/containers/output/
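For reference, the 'When a resource event occurs' trigger creates an Event Grid subscription behind the scenes, and the container filter above ends up as that subscription's subject prefix. A minimal sketch of the resulting filter (property names from the Event Grid event-subscription schema; the values are assumptions matching this setup):

{
  "filter": {
    "includedEventTypes": [ "Microsoft.Storage.BlobCreated" ],
    "subjectBeginsWith": "/blobServices/default/containers/output/"
  }
}

To narrow it further to a subfolder, the prefix can continue past the container, e.g. /blobServices/default/containers/output/blobs/myfolder/, since blob event subjects always include a /blobs/ segment before the blob name.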


Redeployment of Azure Function does not happen

I am experimenting with an Azure Function in PowerShell to answer a simple HTTP request.
I am editing directly in the Azure Portal.
I do not understand how deployment works. I assume it should directly redeploy after "save".
When I change my code and test it within Azure Portal I get debug results in the console/log and expected results in the HTTP-Output.
However: When I call the function from a browser, it returns different results. I think these are old results, so either the function is not redeployed or some cache / proxy is fooling my browser.
How can I see if redeployment took place after "save" in the Azure Portal?
How can I check debug / console output of a real invocation?
First, I ran with the default code from the HTTP trigger function (PowerShell runtime) - i.e., I created it in the Azure Portal, edited the function code in the Portal itself > saved > clicked Run again.
Saving the function only saves the changes to the code. You have to run the function again for changes in the code logic to show up in the response.
You can see how many times your function in the Function App was executed under Monitor > Invocations, with more information for every invocation (function run), such as the response code and the execution time.
I updated the function code twice after the first run of the default code. Here in the Activity Log you can see the run.ps1 code updates registered:
Updated Answer:
I ran the Azure Function App PowerShell HTTP trigger in a browser tab with its function URL 4 times, and the invocations are displayed after about 4 minutes because my Function App is hosted on the Consumption plan:
Seems the function was "disabled" - after enabling it again, all works as expected!
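In case anyone hits the same thing: disabling a function in the portal is backed by an app setting, so you can also check and clear it from the Azure CLI. A minimal sketch (HttpTrigger1 and the resource names are placeholders, not taken from the question):

# The portal's enable/disable toggle sets "AzureWebJobs.<function-name>.Disabled"; "false" re-enables it
az functionapp config appsettings set \
    --name <app-name> \
    --resource-group <resource-group> \
    --settings "AzureWebJobs.HttpTrigger1.Disabled=false"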

How to get multiple files using GetBlobContent and add as attachments to an email in Azure Logic App?

Hi, I am working in Azure Logic Apps. I am trying to get multiple files from Azure Data Lake Gen2 and attach these files to an email. As a first step I have added an HTTP request, and I am giving the required information along with the file path. It works fine for one file, but I want to input a folder path, get all the files inside that folder, and attach them to the email.
Logic App flow diagram
Added a sample screenshot for the attachment
Tried to add the attachment
In the above diagram, the Get blob content step works fine for one file, but I am finding it difficult to attach multiple files to the email. Can someone help me figure out a solution? Any help would be appreciated. Thank you.
You can use the List blobs action to list all blobs in the folder you want:
Then you can define an array variable to hold the attachments.
Use a For each to loop over the blobs from the List blobs action. Within the For each, use Get blob content to get each blob's content, and then use Append to array variable to append the attachment.
The expressions for Path, DisplayName and File Content are as follows:
Path: items('For_each')?['Path']
DisplayName: items('For_each')?['DisplayName']
File Content: body('Get_blob_content')
Finally, fill the variable into the attachments field of the email:
Update:
If sending the email fails with a 400 response, use this expression for File Content in Append to array variable instead:
base64(body('Get_blob_content'))
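Putting the pieces together, each item appended to the array variable should match the Outlook connector's attachment shape (Name / ContentBytes). A sketch, assuming the default action names used above:

{
  "Name": "@{items('For_each')?['DisplayName']}",
  "ContentBytes": "@{base64(body('Get_blob_content'))}"
}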

Logic Apps - for each loop with liquid from blob storage

I'm learning Logic Apps and I'm trying to create a simple flow from Azure Blob Storage: perform a Liquid parsing, then save the parsed file to another blob container.
How it should work:
1. Whenever a new file is added to the blob container ("from") [containing XML files]
2. A Liquid action takes place (XML -> JSON)
3. The new .json file is saved to the blob container ("to") :)
What I have learned:
1. I managed to write a Liquid template for the XML files - tested, working
2. I know how to copy a file between blob containers - tested, working
For each loop:
https://i.imgur.com/ImaT3tf.jpg "FE loop"
Completed:
https://i.imgur.com/g6M9eLJ.jpg "Completed..."
Current Logic App:
https://i.imgur.com/ImaT3tf.jpg "Current"
What I don't know how to do:
1. How do I "insert" the current file content in the For each into the Liquid action? It looks like Logic Apps is skipping that step.
The main problem is that you cannot use Current item as the XML content. You need to get the content with a Get blob content action inside the For each, then parse the XML to JSON with the Liquid transform. After that, create the blob in the other container with the JSON value.
You could refer to my workflow.
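As a sketch of the expressions involved (the action names For_each, Get_blob_content and Transform_XML_to_JSON are assumptions based on the flow described above):

Blob (Get blob content): items('For_each')?['Path']
Content (Transform XML to JSON): body('Get_blob_content')
Blob name (Create blob): replace(items('For_each')?['DisplayName'], '.xml', '.json')
Blob content (Create blob): body('Transform_XML_to_JSON')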

Why am I unable to set Amazon S3 as a trigger for my Serverless Lambda Function?

I am attempting to set a NodeJS Lambda function to be triggered when an image is uploaded to an Amazon S3 bucket. I have seen multiple tutorials and have the yml file set up as shown. Below is the YML config file:
functions:
  image-read:
    handler: handler.imageRead
    events:
      - s3:
          bucket: <bucket-name-here>
          event: s3:ObjectCreated:*
Is there something I am missing for the configuration? Is there something I need to do in an IAM role to set this up properly?
The YAML that you have here looks good, but there may be some other problems.
Just to get you started:
- Are you deploying the function using the right credentials? (I've seen it many times that people deploy to some other account than they think - verify in the web console that it's there.)
- Can you invoke the function in some other way (from the Serverless command line, using an HTTP trigger, etc.)?
- Do you see anything in the logs of that function? (Add console.log statements to see if anything is being run.)
- Do you see the trigger installed in the web console?
- Can you add the trigger manually in the web console?
Try adding a simple function that only prints a log line when it runs, and try to add a trigger for that function manually. If that works, try to do the same with the Serverless command line - start with a simple function with just one log statement, and if it works, go from there.
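As a sketch of that debugging loop with the Serverless CLI (the function name comes from the config above; stage/region flags depend on your setup):

# Deploy, then invoke the deployed function directly, bypassing the S3 trigger
serverless deploy
serverless invoke --function image-read --log

# Tail the CloudWatch logs while you upload a test object to the bucket
serverless logs --function image-read --tail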
See also this post for more hints - S3 trigger is not registered after deployment:
https://forum.serverless.com/t/s3-trigger-is-not-registered-after-deployment/1858
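One more gotcha worth checking (an assumption, since the question doesn't say how the bucket was created): by default the Serverless Framework creates the bucket it attaches the trigger to, and deployment fails if the bucket already exists outside the stack. Recent framework versions can attach to a pre-existing bucket instead:

functions:
  image-read:
    handler: handler.imageRead
    events:
      - s3:
          bucket: <bucket-name-here>
          event: s3:ObjectCreated:*
          existing: true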

Azure: Downloading from Blob Storage results in permissions error?

I’ve uploaded some files to Blob storage, and now I’m using the OnStart method to retrieve those files and run them. Right now I’m working locally.
Using the following code:
using (var fileStream = System.IO.File.OpenWrite(@"C:\testfolder"))
{
    blob.DownloadToStream(fileStream);
}
Results in an "Access to the path 'C:\testfolder' is denied." error.
What do you think is causing this? And - will this be an issue once the project is actually pushed up to Azure? I can change permissions locally, but I'm hoping that once it's actually in a live worker role, it won't be an issue.
Any help would be awesome :)
Scratch that - it looks like C:\testfolder should specify the file name, not just the folder. I've changed it to C:\testfolder\test.txt and it works just fine :).
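For completeness, a corrected sketch of the snippet above (test.txt is just the example name from the fix; DownloadToStream is the classic storage SDK method the question already uses):

// OpenWrite expects a full file path, not a directory
var localPath = @"C:\testfolder\test.txt";
System.IO.Directory.CreateDirectory(System.IO.Path.GetDirectoryName(localPath));
using (var fileStream = System.IO.File.OpenWrite(localPath))
{
    blob.DownloadToStream(fileStream);
}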
