Azure Blob Storage as source and FTP as destination

Is there any way I can transfer .txt files from my Azure Blob Storage to an FTP server directly, going serverless?
If possible, using SSIS or Azure Data Factory.
Thanks!

You can use an Azure Logic App:
Connectors to Blob Storage
Connectors to FTP
A simple Logic App to push a blob to an FTP server would chain a Blob Storage trigger (e.g. "When a blob is added or modified") with the "Get blob content" and FTP "Create file" actions.

SSIS has a lot of connectors that can talk directly to Azure Storage. As for FTP, you may have to use third-party software such as WinSCP to upload the file to FTP (if the built-in FTP Task doesn't already accomplish it). If you are looking to go directly from Azure to FTP, you may have to rely on custom C# code; I am not even sure that is possible.
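For what it's worth, going directly from blob to FTP in custom C# code does appear feasible. A minimal sketch, assuming the Azure.Storage.Blobs SDK and the built-in FtpWebRequest class (the connection string, container, blob name, and FTP details below are hypothetical placeholders):

```csharp
using System;
using System.IO;
using System.Net;
using Azure.Storage.Blobs;

class BlobToFtp
{
    static void Main()
    {
        // Hypothetical names: replace with your own storage account, container, and FTP details.
        var blobClient = new BlobClient(
            connectionString: Environment.GetEnvironmentVariable("STORAGE_CONNECTION_STRING"),
            blobContainerName: "exports",
            blobName: "report.txt");

        // Download the blob into memory (fine for small .txt files).
        using var buffer = new MemoryStream();
        blobClient.DownloadTo(buffer);
        byte[] bytes = buffer.ToArray();

        // Push the bytes to the FTP server with the classic FtpWebRequest API.
        var request = (FtpWebRequest)WebRequest.Create("ftp://ftp.example.com/inbox/report.txt");
        request.Method = WebRequestMethods.Ftp.UploadFile;
        request.Credentials = new NetworkCredential("ftpUser", "ftpPassword");

        using (Stream ftpStream = request.GetRequestStream())
        {
            ftpStream.Write(bytes, 0, bytes.Length);
        }

        using var response = (FtpWebResponse)request.GetResponse();
        Console.WriteLine($"FTP upload finished: {response.StatusDescription}");
    }
}
```

Hosted in an Azure Function or WebJob, this kind of code keeps the transfer serverless; for large files you would stream rather than buffer in memory.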

You could use SSIS. The Azure Data Factory Copy activity doesn't support FTP as a sink.

Related

Is it possible to copy files from Blob Storage / file shares to an FTP server within ADF?

I have a couple of pipelines in ADF which ultimately produce some files. These files are currently stored in file shares and Blob Storage; however, I'd like to move them to an FTP server.
So far, I have created a linked service to that FTP server and a dataset that points to the folder I want to upload the files to. However, when I use the Copy Data activity with this dataset, I get the error "the linked service in sink dataset does not support sink".
As far as I understand, this is only possible with SFTP, which is not an option for me; it must be FTP (technical limitations).
Can you provide me some guidance here?
Best regards!
You can call an Azure Logic App, which has SFTP and FTP connectors as well as Azure Storage connectors, from your pipeline:
http://microsoft-bitools.blogspot.com/2018/06/execute-logic-apps-in-azure-data.html
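The linked post triggers the Logic App from ADF (for example, a Web Activity posting to the Logic App's "When a HTTP request is received" trigger). For illustration, the same HTTP-triggered Logic App can be invoked from any client; a minimal sketch, where the callback URL and the payload shape are hypothetical placeholders you would replace with your own:

```csharp
using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class TriggerLogicApp
{
    static async Task Main()
    {
        using var http = new HttpClient();

        // Hypothetical: copy the real callback URL from your Logic App's HTTP trigger.
        var url = Environment.GetEnvironmentVariable("LOGIC_APP_CALLBACK_URL");

        // Hypothetical payload telling the Logic App which blob to push to FTP.
        var payload = "{ \"container\": \"exports\", \"blobName\": \"report.txt\" }";

        var response = await http.PostAsync(url,
            new StringContent(payload, Encoding.UTF8, "application/json"));

        Console.WriteLine($"Logic App responded with {(int)response.StatusCode}");
        response.EnsureSuccessStatusCode();
    }
}
```

An ADF Web Activity does essentially this POST for you, so the pipeline can wait on the Logic App's response code.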

Which Azure Storage method is best for a temporary file transfer?

I want to automate the transfer of files from a website not hosted in Azure to my client’s premises.
I am considering having an API on the website send the files to Azure Blob Storage, and then having another API running at the client site download them.
Both would use the Azure Storage API, which I like because it is easy to implement.
The files do not need to stay in Azure and can be deleted from storage once they are downloaded.
However I am wondering if there is a faster way.
Should I be using Hot Blob Storage or File Storage perhaps?
I looked at https://learn.microsoft.com/en-us/azure/storage/blobs/storage-blob-storage-tiers but am still unclear as to the fastest method for my use case.
I suggest using an Azure file share, which can be mounted locally as a mapped drive, making operations such as read and delete easy and fast.
If you go code-only, the published throughput targets for blobs and files are both up to 60 MiB/s, so I cannot say which is faster. There is also the Azure Storage Data Movement Library, which is designed for high-performance uploading, downloading, and copying of Azure Storage blobs and files; you could use it for your purpose.
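Whichever store you pick, the code-only round trip is simple. A minimal sketch using the newer Azure.Storage.Blobs SDK (rather than the Data Movement Library); the container and file names are hypothetical, and in practice the upload would run in the website API while the download and delete run in the client-site API:

```csharp
using System;
using Azure.Storage.Blobs;

class TempTransfer
{
    static void Main()
    {
        // Hypothetical container for in-flight files.
        var container = new BlobContainerClient(
            Environment.GetEnvironmentVariable("STORAGE_CONNECTION_STRING"),
            "transfers");
        container.CreateIfNotExists();

        var blob = container.GetBlobClient("invoice-0001.pdf"); // hypothetical file name

        // Website side: upload the file.
        blob.Upload("invoice-0001.pdf", overwrite: true);

        // Client side: download, then delete, since the file need not stay in Azure.
        blob.DownloadTo(@"C:\incoming\invoice-0001.pdf");
        blob.DeleteIfExists();
    }
}
```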
I would recommend Blob Storage for this application. Logic Apps can also be used to automate this pipeline based on a timer trigger or some other trigger.

How to load a file from SharePoint Online to Azure Blob Storage using Azure Data Factory

Can anyone help me load a CSV file from SharePoint Online to Azure Blob Storage using Azure Data Factory?
I tried Logic Apps and succeeded; however, the Logic App will not upload a file unless it has been changed or newly uploaded.
I need to load all the files even when there are no changes.
ADF v2 now supports loading from SharePoint Online through the OData connector with AAD service principal authentication: https://learn.microsoft.com/en-us/azure/data-factory/connector-odata
You can probably use a Logic App by changing to a Recurrence trigger.
On each interval, list the files in the library and then take whatever action on them you want.

Can Azure Data Factory write to FTP

I want to write the output of a pipeline to an FTP folder. ADF seems to support on-premises files but not FTP folders.
How can I write the output in text format to an FTP folder?
Unfortunately, FTP servers are not a supported data store for ADF as of right now, so there is no out-of-the-box way to interact with an FTP server for either reading or writing.
However, you can make it possible with a custom activity, which will require some custom development. A fellow Cloud Solution Architect within Microsoft put together a blog post on how he did it for one of his customers:
https://blogs.msdn.microsoft.com/cloud_solution_architect/2016/07/02/creating-ftp-data-movement-activity-for-azure-data-factory-pipeline/
I hope that this helps.
Upon further thought, you might be able to achieve what you want in a mildly convoluted way by writing the output to an Azure Blob Storage account and then either
1) manually downloading the file and pushing it to the FTP site from the Blob Storage account, or
2) automatically pulling the file locally with the Azure CLI and then pushing it to the FTP site with a batch or shell script, as appropriate.
This is a lighter-weight alternative to custom activities (which remain the better option for heavy work).
You may wish to consider using Azure Functions to write to FTP (note there is a timeout when using a Consumption plan, but not in other plans, so it will depend on how big the files are).
https://learn.microsoft.com/en-us/azure/azure-functions/functions-create-storage-blob-triggered-function
You could instruct Data Factory to write to an intermediary Blob Storage account, and use a blob storage trigger in Azure Functions to upload the files to FTP as soon as they appear, as in the sketch below.
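A minimal sketch of such a blob-triggered function (C# in-process Azure Functions model; the container name, FTP host, and credentials are hypothetical placeholders you would read from app settings):

```csharp
using System.IO;
using System.Net;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class BlobToFtpFunction
{
    // Fires whenever a new blob lands in the hypothetical "outbox" container.
    [FunctionName("BlobToFtp")]
    public static void Run(
        [BlobTrigger("outbox/{name}")] Stream blob,
        string name,
        ILogger log)
    {
        // Hypothetical FTP endpoint and credentials; store the real ones in app settings.
        var request = (FtpWebRequest)WebRequest.Create($"ftp://ftp.example.com/inbox/{name}");
        request.Method = WebRequestMethods.Ftp.UploadFile;
        request.Credentials = new NetworkCredential("ftpUser", "ftpPassword");

        // Stream the blob straight to the FTP server without buffering it all in memory.
        using (var ftpStream = request.GetRequestStream())
        {
            blob.CopyTo(ftpStream);
        }

        using var response = (FtpWebResponse)request.GetResponse();
        log.LogInformation("Uploaded {Name} to FTP: {Status}", name, response.StatusDescription);
    }
}
```

On a Consumption plan, keep the aforementioned timeout in mind for very large files.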
Alternatively, write to Blob Storage and then use a timer in Logic Apps to upload from Blob Storage to FTP. Logic Apps hide a tremendous amount of power behind their friendly exterior.
You can write a Logic App that picks your file up from Azure Storage and sends it to an FTP site, then call the Logic App using a Data Factory Web Activity.
Make sure you do some error handling in your Logic App so that it returns a 400 if the FTP upload fails.

SFTP support for Azure Data Factory

I have an SFTP server with my files uploaded to it. I want to use Azure Data Factory to connect to the SFTP server, read the files, and save them to Azure Blob Storage.
Is there a way to do this using an Azure pipeline/activity configuration?
ADF has recently added SFTP support. Refer to:
https://learn.microsoft.com/en-us/azure/data-factory/data-factory-sftp-connector
EDIT
Data Factory now has native support for SFTP.
It doesn't appear that Data Factory supports SFTP natively; however:
If you need to move data to/from a data store that Copy Activity doesn't support, use a custom activity in Data Factory with your own logic for copying/moving data. For details on creating and using a custom activity, see Use custom activities in an Azure Data Factory pipeline.
Also, Azure Logic Apps do support SFTP natively, which you could use to drop files into Blob Storage; however, I'm guessing (I'm soon to find out) that you'll lose the knowledge that the SFTP server failing is a root cause when monitoring the factory.
SFTP is a planned feature on the Azure feedback portal; if it is important to you, I would recommend voting it up.
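If you did end up in custom-activity territory before the native connector landed, a minimal sketch of the SFTP-to-blob copy itself, using the third-party SSH.NET (Renci.SshNet) library together with the Azure.Storage.Blobs SDK; the host, credentials, and paths below are hypothetical:

```csharp
using System;
using System.IO;
using Azure.Storage.Blobs;
using Renci.SshNet;

class SftpToBlob
{
    static void Main()
    {
        // Hypothetical SFTP endpoint and credentials.
        using var sftp = new SftpClient("sftp.example.com", 22, "user", "password");
        sftp.Connect();

        // Pull the remote file into memory (fine for modestly sized files).
        using var buffer = new MemoryStream();
        sftp.DownloadFile("/outgoing/data.csv", buffer);
        buffer.Position = 0;

        // Write it to a blob; the container and blob names are placeholders.
        var blob = new BlobClient(
            Environment.GetEnvironmentVariable("STORAGE_CONNECTION_STRING"),
            "landing",
            "data.csv");
        blob.Upload(buffer, overwrite: true);

        sftp.Disconnect();
    }
}
```

With the native SFTP connector now available, a plain Copy activity is the simpler choice; this sketch is only the fallback shape.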
