File Connector not picking up file in Azure Logic App

I have created an Azure Logic App, following this blog post:
http://blogs.biztalk360.com/azure-api-app-and-logic-app-in-depth-look-into-hybrid-connector-marriage-between-cloud-and-on-premise/
The difference is that both the source and destination are on the same server, so I have created only one instance of the File Connector and am using it in the app.
When the app is running, I have observed the following:
The trigger logs one successful entry in the "Trigger History" section; all the other entries are Failed.
The file is not deleted from the source folder.
The file is not moved to the destination.
Please let me know how I can troubleshoot this issue.
On further investigation, I found that I am getting the same error described in this forum thread:
https://social.msdn.microsoft.com/Forums/en-US/d06a12e6-86ce-4f1d-b94e-ea6a9c2c260d/why-file-connector-returns-internalservererror

Related

Job fails when uploading a file from the system to stream in Azure Media Services

I'm trying to set up a VOD service on Azure Media Services using Node.js, but the job fails to reach the file in the container.
This is on a Linux server running Node v10. The Azure tutorial for Node works for the sample URL, but not for a file from my system. The file gets stored in an input blob container but doesn't go past that.
Running the AMS tutorial for Node unchanged, with the sample video URL from the tutorial, I got the streaming endpoints. But when trying to upload a file from my system to Azure Media Services, the file gets uploaded to a blob container, yet the job apparently fails to find the video in the input asset container and returns an ErrorDownloadingInputAssetServiceFailure error, with a message that says:
File does not exist in the container: 2019-09-05T05:36:10.775Z-big_buck_bunny_480p_2mb.mp431
Container files: 10.775Z-big_buck_bunny_480p_2mb.mp431
I don't know where I'm getting this wrong, but from the message it seems that the job is trying to find the file in a blob container that doesn't exist.
EDIT:
Tutorial link: https://learn.microsoft.com/en-us/azure/media-services/latest/stream-files-nodejs-quickstart
Tutorial GitHub: https://github.com/Azure-Samples/media-services-v3-node-tutorials.git
My GitHub for this code: https://github.com/DiegoAntonioli/azure-test.git
I'm using multer as middleware to get the files, so I'm saving the files on the system through multer and uploading them to a blob container on Azure.
The file in the input asset blob container gets saved with the name I'm expecting, "2019-09-05T05:36:10.775Z-big_buck_bunny_480p_2mb.mp431", and not the name given in the error message, "10.775Z-big_buck_bunny_480p_2mb.mp431".
EDIT 2:
Problem solved. It seems that Azure searches for files in the blob container without the ":" character, so because I was uploading with an ISO date string at the start of the name, it looked for the file using only the part of the name after the last ":", and so it would never find the file. I don't know if this is in the documentation, but if it is, it should be made clearer, because I was lost looking for my error and it was only the file name.
The root cause is that the blob/file name contained an unsupported character (':'). See this page for the restrictions on allowed names. However, your post did reveal a bug in how we were unpacking the error message; that bug has been handed off to engineering to fix in an upcoming sprint.
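One way to avoid this is to strip the ":" characters from the timestamp before it becomes part of the blob name. A minimal sketch using multer's disk storage (the destination folder and naming scheme here are assumptions, not the poster's actual code):

    // Build a blob-safe file name: ISO timestamps contain ":", which the
    // Media Services input asset lookup mishandles, so replace them.
    const multer = require("multer");

    const storage = multer.diskStorage({
      destination: "uploads/", // assumed local staging folder
      filename: (req, file, cb) => {
        const timestamp = new Date().toISOString().replace(/:/g, "-");
        cb(null, `${timestamp}-${file.originalname}`);
      },
    });

    const upload = multer({ storage });
    // e.g. app.post("/upload", upload.single("video"), handler);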

Problem using Azure Functions and Azure Cosmos DB, returning "down for maintenance"

I am following this tutorial, which creates an Azure Function triggered by HTTP with output to Cosmos DB:
https://learn.microsoft.com/en-us/azure/azure-functions/functions-integrate-store-unstructured-data-cosmosdb
When I create just a simple Azure Function it works fine: I trigger it by HTTP and the HTTP response is OK.
But if I create a new output to a Cosmos DB using the example code from the tutorial, the function returns "THIS AZURE FUNCTIONS APP IS DOWN FOR MAINTENANCE" when triggered.
Please be patient; until last week I was just a C++ programmer, hahaha.
My steps:
Creating a Cosmos DB account and a database called "testDb".
Creating a Function App:
When triggered using this code, it works fine.
Creating a Cosmos DB output.
I changed my code to this:
Now when I trigger it by HTTP, the response is:
What am I doing wrong?
Grateful.
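For context, the output-binding code in that kind of tutorial looks roughly like the sketch below. This assumes the JavaScript programming model and a Cosmos DB output binding named taskDocument declared in function.json; the binding name and fields are assumptions, not the poster's actual code:

    // HTTP-triggered function that writes one document to Cosmos DB
    // through an output binding named "taskDocument" (assumed name).
    module.exports = async function (context, req) {
      const name = (req.query && req.query.name) || (req.body && req.body.name);

      // Assigning to the binding persists the document when the function succeeds.
      context.bindings.taskDocument = {
        name: name,
        createdAt: new Date().toISOString(),
      };

      context.res = { status: 200, body: "Stored: " + name };
    };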
Someone has reported the same issue. To summarize the solution: in the portal, go to Platform features > App Service Editor, right-click app_offline.htm, and delete it.
This file is generated to stop the function app while you install the Cosmos DB extension. It is supposed to be deleted automatically after the extension is installed, but there seems to be some problem with this feature, probably related to the slow file system in the Consumption plan.
If you get trapped by this again later, you can turn the behavior off: add an SCM_CREATE_APP_OFFLINE app setting to your app and set its value to 0; see the official announcement of this feature.
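For example, with the Azure CLI (the app and resource-group names below are placeholders):

    # Stop Kudu from generating app_offline.htm during extension installs.
    az functionapp config appsettings set \
      --name <APP_NAME> \
      --resource-group <RESOURCE_GROUP> \
      --settings SCM_CREATE_APP_OFFLINE=0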

The gateway did not receive a response from 'Microsoft.Web' within the specified time period while creating an FTP connector in Azure

I have an Azure Logic App, and inside it I am creating an FTP trigger (when a file is added or modified). But after creating the FTP connection, I am unable to see the folder name inside the folder box; instead I get the error "The gateway did not receive a response from 'Microsoft.Web' within the specified time period".
Did you follow the steps in the doc to create the FTP connector?
I think the problem is that you didn't enable passive mode; the doc lists it as a prerequisite. Due to environmental constraints I couldn't enable passive mode myself to verify, but you could try it. You also need to make sure your server is accessible from the internet.
If you still have questions, please let me know.

Copying a file from an FTP location into Azure Data Lake

I have followed all the steps shown in the MSDN documentation to copy a file from FTP.
So far, the datasets are created, the linked services are created, and the pipeline is created. The diagram for the pipeline shows the logical flow. However, when I schedule ADF to do the work for me, it fails. The input dataset passes, but when executing the output dataset I am presented with the following error:
Copy activity encountered a user error at Source side: ErrorCode=UserErrorFileNotFound,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Cannot find the file specified. Folder path: 'Test/', File filter: 'Testfile.text'.,Source=Microsoft.DataTransfer.ClientLibrary,''Type=System.Net.WebException,Message=The remote server returned an error: (500) Syntax error, command unrecognized.,Source=System,'.
I can physically navigate to the folder and see the file for myself, but when using ADF I am having issues. The firewall is set to allow the connection, yet I am still getting this error. As there is very minimal logging, I am unable to nail down the issue. Could someone help me out here?
PS: Cross Posted at MSDN
I encountered the same error and was able to solve it by adding "enableSsl": true and "enableServerCertificateValidation": true to the FTP linked service.

Settings.txt with connection string gets removed after publishing to Azure

I published my NopCommerce application to Azure. As you may know, the connection string is defined in the Settings.txt file under App_Data. It is published along with all the other files, but the strange thing is that when I open the Settings.txt file, the connection string has been removed.
It should look like this:
DataProvider: SQL Server CE
DataConnectionString: Data Source=|DataDirectory|\Nop.Db.sdf;Persist Security Info=False
but the published version looks like this:
DataProvider:
DataConnectionString:
Any idea why that is happening? Or is there another way to copy the file directly? I am new to Azure and couldn't even find the published files.
Thanks
You can see the file system and edit/upload files by using the SCM endpoint; there is good information in this blog: http://blogs.staykov.net/2013/12/windows-azure-secrets-of-web-site.html
Basically, you take the URL for your site, http://<your_site>.azurewebsites.net, and add scm to it, so it becomes http://<your_site>.scm.azurewebsites.net. Browsing to this location gives you access to a file browser as well as a set of diagnostics tools.
