Want to create an Azure Logic App expression - azure

In an Azure Logic App, I am using a blob Event Grid trigger. I get an event whenever a blob is added to or deleted from a storage account, and in that event I am getting the responses below.
In the Subject I am getting a value like '/blobServices/default/containers/james/blobs/catputvendor/Capture.PNG'.
Now I need to write an expression which produces the result below:
'/james/catputvendor/Capture.PNG'
Which expression is best?
I got the expression below working in C#, but Logic App expressions don't have a Remove method. How do I do this in a Logic App?
var withoutBlobs = str.Remove(str.IndexOf("/blobs"), "/blobs".Length); // drop the "/blobs" segment
var subStri1 = withoutBlobs.Substring(withoutBlobs.LastIndexOf("/containers") + "/containers".Length); // keep everything after "/containers"

The Subject in the Dynamic content is the absolute path of the blob, so you can use the split expression to get the path you want.
The expression would be like this: split(triggerBody()?['subject'], '/')?[4]. My subject path is /blobServices/default/containers/firstcontainer/blobs/test/Snipaste_2018-11-13_10-08-08.png, so this expression returns the container name firstcontainer.
So the whole expression will be:
#{split(triggerBody()?['subject'], '/')?[4]}/#{split(triggerBody()?['subject'], '/')?[6]}/#{split(triggerBody()?['subject'], '/')?[7]}
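If you prefer a single expression instead of the designer's #{...} token form, a minimal sketch using concat would be (this assumes the same subject layout as above, i.e. the blob path is exactly one folder deep):
concat('/', split(triggerBody()?['subject'], '/')?[4], '/', split(triggerBody()?['subject'], '/')?[6], '/', split(triggerBody()?['subject'], '/')?[7])
With the subject from the question this evaluates to /james/catputvendor/Capture.PNG.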
Here is my flow and the result page.
Hope this helps; if you still have other questions, please let me know.


How to use FHIR client search to include all resources, i.e. $everything

How would you search for all resources for a given patient, e.g. encounter, appointment, consent?
I know you can search for it via a Postman request (http://localhost:9090/organId/Patient/12345/$everything) and get the result, but I want to be able to execute the search query from my Java program.
This is what I have so far, but I know the include part is not right and does not work. Googling didn't return any results.
Bundle bundle = myFhirClient
    .search()
    .forResource(Patient.class)
    .returnBundle(Bundle.class)
    .where(new NumberClientParam(Patient.SP_RES_ID).exactly().number(patientId))
    .include(new Include("$everything")) // this is the part that does not work
    .sort(new SortSpec().setOrder(SortOrderEnum.DESC).setParamName(Patient.SP_RES_ID))
    .execute();
Any help is much appreciated
I had to use the FHIR client operation instead of search. This returns all the referenced resources for the given patientId.
Parameters outParams = myFhirClient
    .operation()
    .onInstance(new IdType("Patient", patientId))
    .named("$everything")
    .withNoParameters(Parameters.class) // no input parameters
    .execute();
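If you then need the individual resources, a minimal sketch for unwrapping the result (assuming R4 model classes, where the server returns the $everything Bundle as the first parameter) could look like this:
Bundle everything = (Bundle) outParams.getParameterFirstRep().getResource();
for (Bundle.BundleEntryComponent entry : everything.getEntry()) {
    // prints e.g. Encounter/123, Appointment/456, Consent/789
    System.out.println(entry.getResource().fhirType() + "/" + entry.getResource().getIdElement().getIdPart());
}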

Azure Functions Orchestration using Logic App

I have multiple Azure Functions which carry out small tasks. I would like to orchestrate those tasks together using Logic Apps, as you can see here:
Logic App Flow
I am taking the output of Function 1 and feeding parts of it into Function 2. As I was creating the logic app, I realized I have to parse the response of Function 1 as JSON in order to access the specific parameters I need. Parse JSON requires me to provide an example schema, however, and I need to be able to parse the response as JSON without this manual step.
One solution I thought would work was to register Function 1 with APIM and provide a response schema, but this doesn't seem to be any different from calling the Function directly.
Does anyone have any suggestions for how to get the response of a Function as a JSON/XML?
You can run JavaScript code snippets and dynamically parse the response from Function 1 without providing a sample schema.
e.g.
var data = Object.keys(workflowContext.trigger.outputs.body.Body);
var key = data.filter(s => s.includes('Property')).toString(); // find the key of the 'Property' element without needing a schema
return workflowContext.trigger.outputs.body.Body[key];
https://learn.microsoft.com/en-us/azure/logic-apps/logic-apps-add-run-inline-code?tabs=consumption
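As an aside, if Function 1 returns its payload with Content-Type application/json, the Logic App usually treats the body as JSON already, so a plain expression may be enough without Parse JSON or inline code. A hedged sketch, assuming the action is named Function_1 and the field is called propertyName:
body('Function_1')?['propertyName']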

Issue with validating Arm template using DeploymentsOperations.StartValidate

I am currently working on a project where I deploy multiple ARM templates, each deploying a VM and performing a few operations on it. I wanted to handle quota issues by calling template validation before triggering the first deployment. So I created a template which contains the logic to create the required VMs, and I am using this template only for validation (to check that the quota will not be exceeded).
Since our code already has the ResourceManagementClient, I tried the following code:
Deployment parameters = new Deployment(
    new DeploymentProperties(DeploymentMode.Incremental)
    {
        Template = templateFile,
        Parameters = parameterFile,
    });
DeploymentsValidateOperation dp = deployments.StartValidate(groupName, "validation", parameters);
But when I try to access the Value from the variable dp, I keep getting the following exception:
Generic Exception System.InvalidOperationException: The operation has not completed yet.
   at Azure.Core.ArmOperationHelpers`1.get_Value()
   at Azure.ResourceManager.Resources.DeploymentsValidateOperation.get_Value()
   at DeployTemplate.Program.d__3.MoveNext() in \Program.cs:line 88
I even added a loop after StartValidate to wait until dp.HasCompleted is set to true, but this seems to run indefinitely. I also tried the StartValidateAsync method, which seems to have the same issue.
I wanted to understand whether I am using this method correctly, or whether there is a better way to do template validation. I could not find any examples of this method's usage; if possible, please share any code snippet where this method is used, for my reference.
Note: since this is not working, I am currently testing the Fluent API way. That seems to be working, but it requires a lot of changes in our code, as it creates ambiguity with many classes in Azure.ResourceManager.Resources which are already used for other operations.
I found that even though the deployment operation's HasCompleted field is not set, when I call dp.GetRawResponse(), it returns the exact errors expected.
I now use this to validate my templates.
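A minimal sketch of that workaround, assuming the same preview Azure.ResourceManager.Resources package as in the question (the success check on the status code and the Console/StreamReader handling are my additions):
using System;
using System.IO;
using Azure;

DeploymentsValidateOperation dp = deployments.StartValidate(groupName, "validation", parameters);
Response raw = dp.GetRawResponse();
if (raw.Status == 200)
{
    Console.WriteLine("Template validation passed.");
}
else
{
    // The validation errors (e.g. quota exceeded) are in the response body.
    using var reader = new StreamReader(raw.ContentStream);
    Console.WriteLine($"Validation failed ({raw.Status}): {reader.ReadToEnd()}");
}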

'Delay until' finish time of 'Queue a new build' not working in Azure Logic App

I'm triggering an Azure Logic App from an https webhook for a docker image in Azure Container Registry.
The workflow is roughly:
When a HTTP request is received
Queue a new build
Delay until
FinishTime of Queue a new build
See: Workflow image
The Delay until action doesn't work, in that the queried FinishTime is 0001-01-01T00:00:00.
It complains about the wrong format, so I manually added a Z after the FinishTime keyword.
Now the time stamp is in the right format, however, the timestamp 0001-01-01T00:00:00Z obviously doesn't make sense and subsequent steps are executed without delay.
Anything that I am missing?
edit: Queue a new build queues an Azure pipeline build. I.e. the FinishTime property comes from the pipeline.
You need to set a timestamp in the future; the timestamp 0001-01-01T00:00:00Z you passed to the "Delay until" action is not a future time. If you set a timestamp such as 2020-04-02T07:30:00Z, the "Delay until" action will take effect.
Update:
I don't think the "Delay until" can do what you expect, but maybe you can refer to the operations below. Just add a "Condition" action to judge if the FinishTime is greater than current time.
The expression in the "Condition" is:
sub(ticks(variables('FinishTime')), ticks(utcNow()))
In a word, if the FinishTime is greater than current time --> do the "Delay until" aciton. If the FinishTime is less than current time --> do anything else which you want.(By the way you need to pay attention to the time zone of your timestamp, maybe you need to convert all of the time zone to UTC)
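For completeness, a sketch of how that check could be written as a single condition expression (my wording; it just tests that the tick difference above is positive):
@greater(sub(ticks(variables('FinishTime')), ticks(utcNow())), 0)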
I've been in touch with an Azure support engineer, who has confirmed that the Delay until action should work as I intended to use it, but that the FinishTime property will not hold a value that I can use.
In the meantime, I have found a workaround using some logic and quite a few additional steps. It is inconvenient, but at least it does what I want.
Here are the most important steps that are executed after the workflow gets triggered from a webhook (docker base image update in Azure Container Registry).
Essentially, I'm initializing the following variables and queuing a new build:
buildStatusCompleted: String value containing the target value completed
jarsBuildStatus: String value containing the initial value notStarted
jarsBuildResult: String value containing the default value failed
Then, I'm using an Until action to monitor when the jarsBuildStatus value switches to completed.
In the Until action, I'm repeating the following steps until jarsBuildStatus changes its value to buildStatusCompleted:
Delay for 15 seconds
HTTP request to the Azure DevOps build, authenticating with a personal access token (see the sketch after this list)
Parse JSON body of previous raw HTTP output for status and result keywords
Set jarsBuildStatus = status
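A hedged sketch of that polling request (the organization/project/buildId placeholders and the api-version are mine; status and result are the fields the following steps read):
GET https://dev.azure.com/{organization}/{project}/_apis/build/builds/{buildId}?api-version=6.0
Relevant fields in the JSON response: "status" (e.g. inProgress, completed) and "result" (e.g. succeeded, failed).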
After breaking out of the Until action (loop), the jarsBuildResult is set to the parsed result.
All these steps are part of a larger build orchestration workflow, where I'm repeating the given steps multiple times for several different Azure DevOps build pipelines.
The final action in the workflow is sending all the status, result and other relevant data as a build summary to Azure DevOps.
To me, this is only a workaround and I'll leave this question open to see if others have suggestions as well or in case the Azure support engineers can give more insight into the Delay until action.
Here's an image of the final workflow (at least, the part where I implemented the Delay until action):
edit: Turns out, I can simplify the workflow because there's a dedicated Azure DevOps action in the Logic App called Send an HTTP request to Azure DevOps, which omits the need for manual authentication (Azure support engineer pointed this out).
The workflow now looks like this:
That is, I can query the build status directly and set the jarsBuildStatus as
#{body('Send_an_HTTP_request_to_Azure_DevOps:_jar''s')['status']}
The code snippet above is automagically converted to a value for the Set variable action. Thus, no need to use an additional Parse JSON action.
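Presumably the build result can be read the same way; this counterpart is my assumption, mirroring the status expression above:
#{body('Send_an_HTTP_request_to_Azure_DevOps:_jar''s')['result']}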

ScriptError using Google Apps Script Execution API

Following these guides, https://developers.google.com/apps-script/guides/rest/quickstart/target-script and https://developers.google.com/apps-script/guides/rest/quickstart/nodejs, I am trying to use the Execution API in Node to return some data that is in a Google Spreadsheet.
I have set the script ID to be the Project Key of the Apps Script file. I have also verified that running the function in the Script Editor works successfully.
However, when running the script locally with node, I get this error:
The API returned an error: Error: ScriptError
I have also made sure the script is associated with the project that I use to authenticate with the Google APIs.
Does anyone have any suggestions on what I can do to debug/fix this issue? The error is so generic that I am not sure where to look.
UPDATE: I've included a copy of the code in this JSBin (the year function is the entry point)
https://jsbin.com/zanefitasi/edit?js
UPDATE 2: The error seems to be caused by the inclusion of this line
var spreadsheet = SpreadsheetApp.open(DriveApp.getFileById(docID));
It turned out that I didn't request the right scopes. The Node.js example includes 'https://www.googleapis.com/auth/drive', but I also needed to include 'https://www.googleapis.com/auth/spreadsheets' in the SCOPES array. The error message ScriptError is not very informative here.
In order to find which scopes you need, go to the Script Editor > File > Project Properties > Scopes. Remember to delete the old credentials (~/.credentials/old-credential.json) so that the script will request new ones.
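For reference, a minimal sketch of the SCOPES array from the Node.js quickstart with both scopes added (the variable name follows that quickstart):
var SCOPES = [
  'https://www.googleapis.com/auth/drive',
  'https://www.googleapis.com/auth/spreadsheets'
];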
EDIT: With the updated information I took a closer look and saw you are returning a non-basic type. Specifically, you are returning a Sheet object.
The basic types in Apps Script are similar to the basic types in JavaScript: strings, arrays, objects, numbers and booleans. The Execution API can only take and return values corresponding to these basic types -- more complex Apps Script objects (like a Document or Sheet) cannot be passed by the API.
https://developers.google.com/apps-script/guides/rest/api
In your Account "Class"
this.report = spreadsheet.getSheetByName(data.reportSheet);
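A hedged way around that (my suggestion, not part of the original answer) is to return plain values instead of the Sheet object, for example:
this.report = spreadsheet.getSheetByName(data.reportSheet).getDataRange().getValues();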
old answer:
'data.business_exp' will be null in this context. You need to load the data from somewhere: every time a script is called, a new instance of the script is created, and at the end of the execution chain it is destroyed, so any data stored in global objects is lost. You need to save that data to a permanent location, such as the script/user properties, and reload it on each script execution.
https://developers.google.com/apps-script/reference/properties/
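A minimal sketch of persisting such data with the Properties Service (the property name business_exp here just mirrors the field mentioned above):
// save once, e.g. at the end of a run
var props = PropertiesService.getScriptProperties();
props.setProperty('business_exp', JSON.stringify(data.business_exp));
// reload at the start of the next execution
var restored = JSON.parse(props.getProperty('business_exp') || 'null');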
