How to convert Base64 to a string in Logic App inline code (JavaScript) - Node.js

Summary: Logic App inline code (which uses Node.js) is missing the Buffer class.
Detailed: I am trying to trigger a logic app when content is pushed to SFTP. I want to add some metadata and save the details in Cosmos DB.
The issue is that the name of the file is received as a base64-encoded string in the inline code, and Buffer is not available to decode it.
I even tried to create a Set variable step (and decode the filename there), but I am unable to pass this variable to the inline code step. (Not supported)
The final option would be to use an Azure Function instead of inline code, which I am trying to avoid.
Looking for a workaround for the conversion.
[Logic App error screenshot]
From the Microsoft docs on inline code limitations:
• Doesn't support require() statements
• Doesn't work with variables

Inline code can only perform fairly simple JavaScript operations, so Buffer may not be available.
As for passing the base64-encoded string, you can put it in a Compose action first and then pass that into the inline code.
I suggest you first try the base64 functions built into Azure Logic Apps, such as base64ToString(), in a Compose action.
If that does not meet your needs, you can create an Azure Function and call it from the Logic App.
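If you have to do the conversion inside the inline code action itself, a plain-JavaScript decoder sidesteps the missing Buffer class. A minimal sketch, assuming an ASCII filename; the workflowContext path at the end is illustrative and depends on your trigger's output shape:

// Minimal base64 decoder for inline code, where Buffer and require()
// are unavailable. Byte-wise decode; assumes the result is ASCII text.
const CHARS = 'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/';

function base64Decode(input) {
    const clean = input.replace(/[=\s]/g, '');  // strip padding and whitespace
    let bits = 0, bitCount = 0, output = '';
    for (const ch of clean) {
        bits = (bits << 6) | CHARS.indexOf(ch); // shift in the next 6 bits
        bitCount += 6;
        if (bitCount >= 8) {                    // a full byte is available
            bitCount -= 8;
            output += String.fromCharCode((bits >> bitCount) & 0xFF);
        }
    }
    return output;
}

// Hypothetical path to the file name; adjust to your trigger's outputs.
return base64Decode(workflowContext.trigger.outputs.body.Name);

Alternatively, a Compose action with the expression base64ToString(triggerOutputs()?['body']?['Name']) does the conversion before the inline code runs, avoiding the decoder entirely.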

Related

Generate html email body in Azure Function

In my application, I need to send templated HTML emails from an Azure Function.
The content is fairly large, and getting the HTML to work in all clients can be pretty hard, even with the help of something like https://litmus.com/pre-send-testing. For this reason, I do not want to use string concatenation/interpolation. I want to have the email content in a file I can view/edit in an IDE.
I need to replace some content with text for the specific recipient.
Ideally, I would like to have conditional logic in the template to avoid too much duplication (although this is not essential).
I have used the excellent https://github.com/toddams/RazorLight NuGet package in other environments, but unfortunately it does not work in Azure Functions.
Are there any other solutions for text templating for HTML that work in Azure Functions?
At the moment my best option is something like this (where Body.html is an embedded resource file):
StringBuilder body = new StringBuilder();
// Resource name = default namespace + file name.
using Stream template = this.GetType().Assembly
    .GetManifestResourceStream("EmailTemplating.Body.html");
using var reader = new StreamReader(template, Encoding.UTF8);
body.Append(reader.ReadToEnd());
// Per-recipient substitution.
body.Replace("{{recipient-name}}", "Jim");
In the end, we used https://github.com/rexm/Handlebars.Net, which has good templating features (standard Mustache/Handlebars functionality) and worked like a charm in Azure Functions.
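For illustration, a minimal Handlebars.Net sketch; the inline template source and property names are made up, standing in for the embedded Body.html resource:

using HandlebarsDotNet;

// Compile once (e.g. at function startup) and reuse; compilation is the
// expensive step.
var source = "<p>Hello {{RecipientName}},</p>" +
             "{{#if IsPremium}}<p>Thanks for subscribing!</p>{{/if}}";
var template = Handlebars.Compile(source);

// Render per recipient; {{#if}} provides the conditional logic asked about.
string body = template(new { RecipientName = "Jim", IsPremium = true });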
Did you consider storage for this purpose? You can store a template file, say template.html, in a blob container and use it whenever required: download the file into a MemoryStream and make the stream the content of the Function's response, or the content for the next action, depending on your requirement.

AWS lambda function proxies requests of fetching binary blob(PDF) from service layer and then returns to the client

I've created a lambda function so that I can use it for validation purposes and then proxy the request to the service layer. The service layer response contains a binary blob (PDF), which goes through the lambda function and the API gateway and finally reaches the client.
The first problem we ran into was that the PDF got transformed or corrupted, and the client just received a blank PDF. Then I found this post, which did not make any sense to me at first, until I saw this AWS doc. It turns out you are required to encode the binary data as base64 and set the indicator 'isBase64Encoded' to true; the gateway eventually converts the response back to the binary blob.
TBH, I am new to AWS and I don't really understand why it works this way. What is wrong with passing through the original binary blob, and why are those conversion steps necessary?
Here is the list of things I had to do (a sketch of the resulting handler follows the list):
Configured */* as a Binary Media Type on the gateway. (I tried to use application/pdf, but it did not work?)
Made sure the response body from the service layer is not transformed into a string (I am using request, and by default it gives me a string), so I send encoding: null along with the request.
When I get the Buffer data from the service layer, I use Buffer to convert the response body into base64 encoding.
In the lambda output, I set isBase64Encoded to true.
Finally, I get the unaltered PDF...
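Putting those steps together, here is a hypothetical sketch of the handler; the service URL is illustrative, and it uses the same request library mentioned above:

const request = require('request');

exports.handler = (event, context, callback) => {
    // encoding: null keeps the response body as a raw Buffer
    // instead of a decoded string.
    request({ url: 'https://service-layer.example.com/report.pdf', encoding: null },
        (err, res, body) => {
            if (err) return callback(err);
            callback(null, {
                statusCode: 200,
                headers: { 'Content-Type': 'application/pdf' },
                // API Gateway converts this back to the original binary blob
                // because isBase64Encoded is true and */* is registered as a
                // Binary Media Type.
                body: body.toString('base64'),
                isBase64Encoded: true,
            });
        });
};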
I am wondering if someone can confirm I am doing this in the expected way? Or maybe there is a better way?
Also, when we set the binary support media type to */*, doesn't this mean it accepts all media types? I only want PDF to be supported.
This doc (https://docs.aws.amazon.com/apigateway/latest/developerguide/api-gateway-payload-encodings.html) should be able to answer your question. There are two things you need to note:
You can pass the original binary file (blob) as well as a base64-encoded binary file through API Gateway.
Ref: https://docs.aws.amazon.com/apigateway/latest/developerguide/api-gateway-content-encodings-examples-image-lambda.html
*/* works in your case, but it means API Gateway will treat all payloads as binary data, which breaks payloads with text data, for example JSON. So, ideally, application/pdf should be used as the Binary Media Type.

GET / POST using Clarion

I have a Clarion 9 app that I want to be able to communicate with HTTP servers. I come from a PHP background and have zero idea what to do here.
What I wish to be able to do:
Parse JSON data and convert QUEUE data to JSON [Done]
Have a global variable like 'baseURL' that points to e.g. http://localhost.com [Done]
Call functions such as apiConnection.get('/users'), which would return the contents of the page. [I'm stuck here]
apiConnection.post('/users', myQueueData) would POST the contents of myQueueData.
I tried using winhttp.dll by reading it with LibMaker, but it couldn't read it. Instead, I'm now using wininet.dll, for which LibMaker successfully created a .lib file.
I'm currently using the PROTOTYPE procedures from this code on GitHub: https://gist.github.com/ddur/34033ed1392cdce1253c
What I did was include them like this:
SimpleApi.clw

  PROGRAM

  INCLUDE('winInet.equ')

ApiLog        QUEUE,PRE(log)
LogTitle        STRING(10)
LogMessage      STRING(50)
              END

  MAP
    INCLUDE('winInetMap.clw')
  END

  INCLUDE('equates.clw'),ONCE
  INCLUDE('DreamyConnection.inc'),ONCE

ApiConnection DreamyConnection

  CODE
  ! Call the method on the ApiConnection instance, not on the class name.
  IF ApiConnection.initiateConnection('http://localhost')
  ELSE
    log:LogTitle   = 'Info'
    log:LogMessage = 'Failed'
    ADD(ApiLog)
  END
But the buffer that winInet uses always returns 0.
I have created a GitHub repository https://github.com/spacemudd/clarion-api with all the code to look at.
I'm really lost on this because I can't find proper documentation for Clarion.
I do not want a paid solution.
It depends a bit on which version of Clarion you have.
Starting around v9, they added ClaRunExt, which provides this kind of functionality via .NET interop.
From the help:
Use HTTP or HTTPS to download web pages, or any other type of file. You can also post form data to web servers. Very easy way to send HTTP web requests (and receive responses) to Web Servers, REST Web Services, or standard Web Services, with the most commonly used HTTP verbs; POST, GET, PUT, and DELETE.
Otherwise, search the LibSrc\ directory for "http" and you will get an idea of what is already there. abapi.inc, for example, appears to provide a wrapper around wininet.lib.

How to create Index with custom analyzers from json file in Azure Search .NET SDK?

I had read that the Azure Search .NET SDK uses Newtonsoft.Json to convert its models to/from JSON in its underlying REST API calls, so I've been doing the same in my own app.
I have a simple app which creates a new Index using the .NET SDK. To do this, I was defining my Index in a JSON file, using the format outlined here: https://learn.microsoft.com/en-us/rest/api/searchservice/create-index, and then converting it to a Microsoft.Azure.Search.Models.Index object using Newtonsoft:
var index = JsonConvert.DeserializeObject<Microsoft.Azure.Search.Models.Index>(
    System.IO.File.ReadAllText("config.json"));
This was working fine before I configured custom analyzers, but now that I have custom analyzers in my config, the Analyzers, Tokenizers, and TokenFilters are not being resolved into the correct types; i.e., my custom analyzer is deserialized as a Microsoft.Azure.Search.Models.Analyzer instead of a Microsoft.Azure.Search.Models.CustomAnalyzer, and the same goes for the Tokenizers and TokenFilters: they are deserialized into the base types.
Is there an easy way to create an Index like this from a JSON file in the .NET SDK?
Unfortunately, this is not an officially supported scenario. While it works for simple index definitions, we're still working out what's needed to support all cases.
Please post your feature request on our User Voice page to help us prioritize: https://feedback.azure.com/forums/263029-azure-search
In the meantime, you might be able to get it working yourself by adapting the JsonSerializerSettings initialization code at the bottom of this file.
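The root issue is that Newtonsoft.Json cannot tell which concrete subclass an analyzer entry maps to; the REST payload carries that information in the @odata.type discriminator. A rough sketch of a converter keyed on that discriminator, assuming CustomAnalyzer exposes a parameterless constructor (this approximates, but is not, the SDK's internal serializer settings):

using System;
using Microsoft.Azure.Search.Models;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;

public class AnalyzerConverter : JsonConverter
{
    // Only intercept the abstract base; Populate below then binds the
    // concrete instance without re-entering this converter.
    public override bool CanConvert(Type objectType) => objectType == typeof(Analyzer);

    public override bool CanWrite => false;

    public override object ReadJson(JsonReader reader, Type objectType,
        object existingValue, JsonSerializer serializer)
    {
        var obj = JObject.Load(reader);
        var odataType = (string)obj["@odata.type"];

        // The create-index payload tags each analyzer with its concrete type.
        if (odataType != "#Microsoft.Azure.Search.CustomAnalyzer")
            throw new NotSupportedException("Unhandled analyzer type: " + odataType);

        var analyzer = new CustomAnalyzer();
        serializer.Populate(obj.CreateReader(), analyzer);
        return analyzer;
    }

    public override void WriteJson(JsonWriter writer, object value,
        JsonSerializer serializer) => throw new NotSupportedException();
}

You would register it (plus matching converters for Tokenizer and TokenFilter) via new JsonSerializerSettings { Converters = { new AnalyzerConverter() } } when calling JsonConvert.DeserializeObject.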

Upload File with brackets ([ & ]) in the name

I'm moving a ClickOnce install from a regular web server to Azure Blob storage and have a problem with some of the files. The filenames contain [ ], and CloudBlob.UploadFile fails with an exception:
Microsoft.WindowsAzure.StorageClient.StorageException:
Error accessing blob storage: Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
The code has been used for a while and only fails on files with [ ] in the name, so I don't believe it is really an "authentication failure". In this particular case, this is the seventh file being uploaded in a loop. I found this link on MSDN about valid file names, and this one on Stack Overflow, both of which show problems with square brackets in URLs and reference UrlEncode. I added a call to UrlEncode, but that did not help. The container is created with public access, since we use it to support customer downloads of our software. We have been hosting a "test" install in another container and have not had permission problems accessing that either.
I can upload the file with no name changes and then rename it to add the "path" using Neudesic's Azure Storage Explorer tool, so what is that doing that I am not doing?
I see you're using the 1.7 SDK. This is a small encoding issue with the SDK which is also present in v2.0. Let's see what happens.
No encoding
account.CreateCloudBlobClient()
.GetContainerReference("temp")
.GetBlobReference("abc[]def.txt")
.UploadFile("myfile.txt");
If you don't encode the blob name, you'll end up with a request to the following URL, which causes the authentication exception:
http://account.blob.core.windows.net/temp/abc[]def.txt
This is because the SDK uses Uri.EscapeUriString internally to encode your string, but this doesn't take square brackets into account.
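For what it's worth, a quick console check shows the gap between the two escaping methods (expected output in comments, assuming .NET 4.5 or later):

using System;

class EscapingDemo
{
    static void Main()
    {
        // EscapeUriString treats [ and ] as reserved and leaves them alone,
        // which is what breaks the request signature.
        Console.WriteLine(Uri.EscapeUriString("abc[]def.txt"));   // abc[]def.txt
        Console.WriteLine(Uri.EscapeDataString("abc[]def.txt"));  // abc%5B%5Ddef.txt
    }
}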
Encoding
Then you would expect the following to do the trick:
account.CreateCloudBlobClient()
.GetContainerReference("temp")
.GetBlobReference(HttpUtility.UrlEncode("abc[]def.txt"))
.UploadFile("myfile.txt");
The issue here is that you'll end up with this URL:
http://account.blob.core.windows.net/temp/abc%255b%255ddef.txt
So what's happening here? Calling HttpUtility.UrlEncode turns abc[]def.txt into abc%5B%5Ddef.txt, which is correct. But internally, the SDK encodes this string again, which results in abc%255b%255ddef.txt, which isn't what you want.
Workaround
The only way to apply encoding which takes square brackets into account is a small workaround: if you pass the full URL to the GetBlobReference method, the SDK assumes you did all the encoding yourself.
var container = account.CreateCloudBlobClient().GetContainerReference("temp");

// Passing an absolute URI makes the SDK skip its own escaping,
// so the UrlEncode result survives untouched.
var blob = container.GetBlobReference(String.Format("{0}/{1}",
    container.Uri, System.Web.HttpUtility.UrlEncode("abc[]def.txt")));
blob.UploadFile("myfile.txt");
This results in a correctly encoded URL:
http://account.blob.core.windows.net/temp/abc%5b%5ddef.txt
And if you use a tool like CloudXplorer, you'll see the blob with the correct filename.
There are two known breaks in the Uri class in .NET 4.5:
• '[' and ']' characters are no longer escaped
• '\' character is now escaped as %5C
This causes an authentication failure when the server attempts to validate the signature of the request, as the canonicalized string is now different on the client and the server.
There are a few workarounds clients can use while this issue is present. The correct solution will depend on your specific application and requirements.
Avoid the '[', ']', or '\' characters in resource names
By simply avoiding these characters altogether, you can avoid the issue described above.
Target .NET 4.0
Currently, the recommendation is for clients to simply continue targeting their applications at .NET 4.0 while a full solution is being investigated. Note that since .NET 4.5 is an in-place upgrade, clients can still take advantage of some performance improvements (in the GC, etc.) without specifically targeting the .NET 4.5 profile. For Windows RT developers this is not an option, and you will therefore need the workarounds detailed below.
Pre-Escape Data
If possible, a client can pre-escape the data or replace the given characters with non-affected ones (a small sketch follows).
This is why the workaround above works.
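A hypothetical sketch of the replacement option; the character mapping is illustrative, not from the answer, and you'd want a scheme that cannot collide with real names in your container:

// Map the characters affected by the .NET 4.5 Uri change to characters
// that round-trip safely in blob names. '\' is mapped to '-' to avoid
// accidentally introducing virtual directory separators.
static string ToSafeBlobName(string name)
{
    return name.Replace('[', '(')
               .Replace(']', ')')
               .Replace('\\', '-');
}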
