Azure Logic App FTP Create file fails with MaxRequestCountReached

I am creating an Azure Logic App that will create a new file (from HTTP body content of about 5KB) on an FTP server.
Here is the FTP Create File code snippet:
{
  "inputs": {
    "host": {
      "connection": {
        "name": "#parameters('$connections')['ftp']['connectionId']"
      }
    },
    "method": "post",
    "body": "#body('Provider_Post')",
    "path": "/datasets/default/files",
    "queries": {
      "folderPath": "/",
      "name": "filename_#{utcNow()}.xml",
      "queryParametersSingleEncoded": true
    },
    "authentication": "#parameters('$authentication')"
  },
  "runtimeConfiguration": {
    "contentTransfer": {
      "transferMode": "Chunked"
    }
  }
}
This step takes really long (32 minutes) and then fails with the following error:
MaxRequestCountReached. The maximum number of requests allowed '1000' was not sufficient to upload the entire content. Uploaded content length: '2378'. Total content length: '4877'.
The file appears on the FTP server, but only about 2380 bytes from the end of the file are there.
What does this error mean, and how do I fix it? 5 KB shouldn't be too much data. Is this something about the FTP server? I can send the file with FileZilla without problems.
I even tested this by adding another step (before the failing one) that writes the HTTP statusCode (so, just "200") to a new file, and it succeeds in one second.

The reason this misbehaved was that I had disabled Binary Transport in the FTP API connection settings.
When I enabled the Binary Transport checkbox, the file was written in seconds.
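The numbers in the error message line up with that explanation. A quick back-of-the-envelope check (my own arithmetic, not an official account of the connector's internals) shows the chunked upload was averaging only a couple of bytes per request, so the 1000-request cap ran out roughly halfway through the 4877-byte file:

```python
max_requests = 1000      # cap from the error message
uploaded_bytes = 2378    # from the error message
total_bytes = 4877       # from the error message

avg_bytes_per_request = uploaded_bytes / max_requests
fraction_uploaded = uploaded_bytes / total_bytes
print(avg_bytes_per_request)  # ~2.4 bytes per request
print(fraction_uploaded)      # ~0.49, i.e. roughly half the file
```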

Related

How to send thousands of JSON files as the body of HTTP POST requests?

So there is an API being tested in Postman. There are thousands of JSON files that need to be sent as the body of POST requests to the API. Is there any way, in Postman or any other tool or script, to make this process easier and automated? Each JSON file holds a resource and looks something like this:
{
  "resourceType": "Bundle",
  "type": "transaction",
  "entry": [
    {
      "fullUrl": "12234",
      "resource": {
        "resourceType": "Player",
        "name": [
          {
            "family": "James",
            "given": [
              "Smith"
            ]
          }
        ],
        "gender": "male"
      },
      "request": {
        "method": "POST",
        "url": "Player"
      }
    }
  ]
}
With that POST request, a Cognito token and an API key are also sent.
In short: how can I loop through multiple JSON files and put each one in the request body of a POST request to the API URL?
There is a Directory Listing Config plugin, which can be installed using the JMeter Plugins Manager.
You can point the plugin at the directory with your JSON files, and each JMeter user will read the next JSON file on each iteration.
Once done, you can use the __FileToString() function in the "Body Data" tab of the HTTP Request sampler to read the current JSON file from disk and set it as the request payload in JMeter:
${__FileToString(${file},,)}
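If a standalone script is acceptable instead of JMeter or Postman, the same loop is a few lines of Python. Here is a minimal sketch using only the standard library; the URL, directory name, token, and header values are placeholders you would replace with your own:

```python
import pathlib
import urllib.request

def load_bodies(directory):
    """Yield (filename, body-text) for every .json file, in name order."""
    for path in sorted(pathlib.Path(directory).glob("*.json")):
        yield path.name, path.read_text()

def post_all(directory, url, token, api_key):
    # url, token and api_key are placeholders for the real endpoint and credentials.
    for name, body in load_bodies(directory):
        req = urllib.request.Request(
            url,
            data=body.encode(),
            headers={
                "Authorization": f"Bearer {token}",
                "x-api-key": api_key,
                "Content-Type": "application/json",
            },
            method="POST",
        )
        with urllib.request.urlopen(req) as resp:
            print(name, resp.status)
```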

Is it possible for an Azure Logic App to receive a file from an incoming HTTP request?

I wanted to know if it's possible to perform a file-upload request to an Azure Logic App's HTTP listener.
I am not looking for the built-in HTTP trigger or the built-in HTTP action, both of which make an HTTP call to a specified URL.
One of the workarounds is to send the request through Postman.
Yes, it's possible for an Azure Logic App to receive files via an HTTP POST request. Here is the request body JSON schema to use in the Logic App:
{
  "properties": {
    "formdata": {
      "items": {
        "properties": {
          "key": {
            "type": "string"
          },
          "type": {
            "type": "string"
          },
          "value": {
            "type": "string"
          }
        },
        "required": [
          "key",
          "value",
          "type"
        ],
        "type": "object"
      },
      "type": "array"
    },
    "mode": {
      "type": "string"
    }
  },
  "type": "object"
}
The Python script below will send a request to the Logic App, including a dictionary of parameters and a separate dictionary associating each filename with its contents.
import requests
import pathlib

attachments = ["path/to/first_file.txt", "path/to/second_file.txt"]  # Insert file paths
logic_app_url = "paste_logic_app_url_here"  # Insert URL in quote marks

file_dict = {}
for filepath in attachments:
    file_dict[pathlib.Path(filepath).name] = open(filepath, 'rb')

payload = {"first_key": "first_val"}  # Extra fields to include in your request
response = requests.post(logic_app_url, headers=None, data=payload, files=file_dict)
I've run the request above, and it works: the request is received and processed by the Logic App. However, I haven't yet figured out how to parse the individual attachments in the Azure Logic App designer. I think this may require a For Each loop, as explained in the Microsoft docs. I hope this helps!
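As a possible starting point for parsing the parts, the workflow definition language has a triggerMultipartBody() function (listed in the Logic Apps workflow functions reference) that returns the body of one part of a multipart trigger payload. The action below is a hedged sketch, not taken from a tested workflow:

```json
"Compose_first_attachment": {
  "type": "Compose",
  "inputs": "@triggerMultipartBody(0)",
  "runAfter": {}
}
```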

Azure Functions Proxy 404 on localhost

I have an Azure Function App with a function at the URL http://localhost:7072/api/create-room, along with other functions. This particular function is an HttpTrigger with anonymous access allowed, and it accepts the GET verb:
[HttpTrigger(AuthorizationLevel.Anonymous, "get")]
Along with that, I have a separate function app that hosts only a proxies.json file and serves purely as a functions proxy. My proxy function app is running locally on port 7071.
My proxies file currently looks like this:
{
  "$schema": "http://json.schemastore.org/proxies",
  "proxies": {
    "chatNegotiate": {
      "matchCondition": {
        "route": "/api/chat/negotiate",
        "methods": [
          "POST"
        ]
      },
      "backendUri": "%chat_api%/api/BeginNegotiate"
    },
    "chatMessages": {
      "matchCondition": {
        "route": "/api/chat/messages",
        "methods": [
          "POST"
        ]
      },
      "backendUri": "%chat_api%/api/PostMessage"
    },
    "createRoom": {
      "matchCondition": {
        "route": "/api/create-room",
        "methods": [
          "GET"
        ]
      },
      "backendUri": "%session_api%/api/CreateRoom"
    }
  }
}
When both of these function apps are deployed to Azure, everything works like a dream. I can make requests, they're forwarded on, requests come back. It's all glorious.
However, when I run these functions locally, the request is never forwarded on from the proxy; the proxy returns a 404. I can hit the function on the other function app running locally on 7072 directly, and all is well there, but not when I go via the proxy.
The proxy itself returns:
[30/05/2020 18:24:30] Host lock lease acquired by instance ID '0000000000000000000000002D5B6BEA'.
[30/05/2020 18:24:34] Executing HTTP request: {
[30/05/2020 18:24:34] "requestId": "9004b8e2-f208-4a98-8b48-6f85bca41281",
[30/05/2020 18:24:34] "method": "GET",
[30/05/2020 18:24:34] "uri": "/api/create-room"
[30/05/2020 18:24:34] }
[30/05/2020 18:24:34] Executed HTTP request: {
[30/05/2020 18:24:34] "requestId": "9004b8e2-f208-4a98-8b48-6f85bca41281",
[30/05/2020 18:24:34] "method": "GET",
[30/05/2020 18:24:34] "uri": "/api/create-room",
[30/05/2020 18:24:34] "identities": [],
[30/05/2020 18:24:34] "status": 404,
[30/05/2020 18:24:34] "duration": 15
[30/05/2020 18:24:34] }
From examples I've looked at such as https://chsakell.com/2019/02/03/azure-functions-proxies-in-action/, this should be working fine.
Any suggestions? Thanks in advance for any help you can provide!
I've solved this after all.
proxies.json isn't set to copy to the output directory by default.
You need to ensure that it's set to copy always.
In Visual Studio:
Right-click proxies.json > Properties > set "Copy to output directory" to "Copy Always".
In Visual Studio Code (and other editors):
Open ProjectName.csproj and add an entry to always copy proxies.json to output directory.
<ItemGroup>
  <None Update="proxies.json">
    <CopyToOutputDirectory>Always</CopyToOutputDirectory>
  </None>
</ItemGroup>
This solved the problem with the 404 on my local instance of the function app proxy.
Additionally, in local.settings.json, add this to the Values section:
"AZURE_FUNCTION_PROXY_DISABLE_LOCAL_CALL": true,
Credit: https://chsakell.com/2019/02/03/azure-functions-proxies-in-action/

Azure Logic App List Blobs: what is the List_blobs metadata field value?

Hi, I am using an Azure Logic App to list blobs and then loop through the list, deleting blobs that are older than a specified date. I built this with the Azure Portal Logic App designer, and it is working fine. I would like to know where the metadata value in the JSON below comes from, as I haven't defined any metadata on the container properties in Azure Blob Storage. Can anyone advise where the metadata is coming from?
I have tried changing the metadata value, and it gives errors.
"List_blobs": {
"runAfter": {},
"metadata": {
"JTJmbmlhbWhwcm9hY3RpdmVpbWFnnnhhhFZXM=": "/containerName"
},
"type": "ApiConnection",
"inputs": {
"host": {
"connection": {
"name": "#parameters('$connections')['azureblob']['connectionId']"
}
},
"method": "get",
"path": "/datasets/default/foldersV2/#{encodeURIComponent(encodeURIComponent('JTJmbmlhbWhwcm9hY3RpdmVpbWFnnnhhhFZXM='))}",
"queries": {
"nextPageMarker": "",
"useFlatListing": false
}
}
}
The metadata in List_blobs is the folder name as a Base64 string; see this doc: Converter functions. It's used to preserve some information so it can be used elsewhere.
For example, the path property uses this metadata wrapped in the encodeURIComponent expression, because after the Base64 string is decoded, its / is percent-encoded as %2f.
So if my folder path is /test, the metadata is JTJmdGVzdA==, which decodes to the percent-encoded string %2ftest.
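The round trip described above can be reproduced in a few lines of Python (this only re-derives the /test example from the answer):

```python
import base64
import urllib.parse

metadata_value = "JTJmdGVzdA=="  # metadata value for the folder /test
percent_encoded = base64.b64decode(metadata_value).decode()  # '%2ftest'
folder_path = urllib.parse.unquote(percent_encoded)          # '/test'
print(percent_encoded, folder_path)
```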

What format will a binary file output from the Storage Blob Connector be in when passed to the following action?

In an Azure App Service Logic App, I have an AzureStorageBlobConnector that retrieves a file from storage. The file is retrieved as binary, without setting any ContentTransferEncoding. My connector definition (subscription details replaced with 'x') looks like this:
"azurestorageblobconnector": {
"type": "ApiApp",
"inputs": {
"apiVersion": "2015-01-14",
"host": {
"id": "/subscriptions/x/providers/Microsoft.AppService/apiapps/azurestorageblobconnector",
"gateway": "https://x.azurewebsites.net"
},
"operation": "GetBlob",
"parameters": {
"BlobPath": "#triggers().outputs.body.Properties['FilePath']",
"FileType": "Binary"
},
"authentication": {
"type": "Raw",
"scheme": "Zumo",
"parameter": "#parameters('/subscriptions/x/resourcegroups/x/providers/Microsoft.AppService/apiapps/azurestorageblobconnector/token')"
}
},
"repeat": null,
"conditions": []
},
I want to author a custom API connector that receives this file, makes some changes to it, and then returns it for the next step in the workflow.
What form will the file be in when the storage blob connector passes it to the next connector as #body('azurestorageblobconnector').Content? Will it be an HttpPostedFile, a Stream, multipart content in the body, or something else?
It depends on how you configure the connector: if you choose "Binary", the content arrives as a Base64-encoded string.
If you choose "Text", it will be the raw text.
One way to deal with that in your API app is to try Convert.FromBase64String: if it succeeds, you have a byte array with the actual bytes; if it fails, you can assume the content is the raw text of the file.
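The same decode-or-fall-back idea can be sketched in Python (a minimal illustration of the approach, not the connector's actual API; note that some plain-text strings happen to be valid Base64, so this heuristic is not foolproof):

```python
import base64
import binascii

def blob_content_to_bytes(content: str) -> bytes:
    """Decode connector output: Base64 when FileType was 'Binary', raw text otherwise."""
    try:
        # validate=True makes input with non-Base64 characters raise instead of being skipped.
        return base64.b64decode(content, validate=True)
    except (binascii.Error, ValueError):
        return content.encode()

print(blob_content_to_bytes("aGVsbG8="))    # b'hello'
print(blob_content_to_bytes("plain text"))  # b'plain text'
```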
