Azure Data Factory - Azure

I have two questions.
1) Is there any way to move data (CSV files on an FTP server) periodically into an Azure Storage account using ADF?
2) After switching the Azure mode using
Switch-AzureMode AzureResourceManager
I could not use Get-Help datafactory.
(I used PowerShell in admin mode and added my Azure account using "Add-AzureAccount".)
Thanks

I will try to answer your questions in order:
1) Not in a native way at the moment, but you can create a custom activity which loads your CSV files into Azure Storage. The scheduling can be done via JSON (as with most of the functionality in Data Factory). You can expect native support for this in a future release. A rough sketch of the copy logic such an activity could run is shown after this answer.
2) I haven't tried it, but you could try Get-Help azuredatafactory (caution: no whitespace), or help azuredatafactory, or have a look at the cmdlet reference.
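For reference, here is a minimal sketch of the kind of copy logic such a custom activity (or a simple scheduled script) could run. It is not an official ADF sample: the FTP host, credentials, container name and connection string are placeholders, and it assumes Python with ftplib and the azure-storage-blob SDK.

    # Sketch: copy CSV files from an FTP server into Azure Blob Storage.
    # FTP_HOST, FTP_USER, FTP_PASS, AZURE_CONN_STR and the container name
    # are placeholders - adjust them for your environment.
    import io
    from ftplib import FTP
    from azure.storage.blob import BlobServiceClient

    FTP_HOST, FTP_USER, FTP_PASS = "ftp.example.com", "user", "password"
    AZURE_CONN_STR = "<storage-account-connection-string>"

    def copy_csv_files():
        container = BlobServiceClient.from_connection_string(AZURE_CONN_STR) \
            .get_container_client("incoming-csv")
        with FTP(FTP_HOST) as ftp:
            ftp.login(FTP_USER, FTP_PASS)
            for name in ftp.nlst():
                if not name.lower().endswith(".csv"):
                    continue
                buffer = io.BytesIO()
                ftp.retrbinary(f"RETR {name}", buffer.write)  # download from FTP
                buffer.seek(0)
                container.upload_blob(name=name, data=buffer, overwrite=True)

    if __name__ == "__main__":
        copy_csv_files()

The periodic part then comes from whatever runs this code: the pipeline schedule in the custom-activity case, or a cron/WebJob schedule otherwise.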

Related

MarkLogic - Can we configure scheduled backup on Azure Blob

We want to configure a scheduled backup for a database.
We have set the storage account and access key for Azure Blob in Security -> Credentials for Azure.
In the backup directory field we enter azure://containName
This container exists in the given storage account.
In response it says:
The directory azure://backup/ does not exist on host ml01. Would you like to try and create it?
Can anybody please help me configure this?
It sounds like you want to create a job which backs up the data of your MarkLogic database to Azure Blob Storage, triggered on a time schedule. Right? I do not completely understand what you said, so here is just my suggestion.
I'm not familiar with MarkLogic, but I think you can write a Node.js script or a Java program to do the backup work; after reading the tag info for marklogic, I see it supports client APIs for Node and Java.
As far as I know, there are normally three ways to deploy it on Azure once the backup works in code (see the sketch at the end of this answer).
1) You can deploy it as a WebJob with a cron expression to trigger the backup work; please refer to the official document Run Background tasks with WebJobs in Azure App Service.
2) You can deploy it as a Web API on Azure using a service like Web App, and use Azure Scheduler to trigger it.
3) You can deploy it as an Azure Function and trigger it with a timer trigger; please refer to the official document Create a function in Azure that is triggered by a timer.
Of course, there are other services that can help meet your needs. I don't know what the best option for you is. If you have any concerns, please feel free to let me know.
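As a rough illustration of option 1) (a WebJob on a cron schedule), here is a minimal Python sketch of the upload step only. It assumes the MarkLogic backup has already been written to a local file (backup.zip, the container name and the connection string are placeholders), and it uses the azure-storage-blob SDK; producing the backup itself via the Node/Java client is not shown.

    # Sketch: push an existing local backup file to Azure Blob Storage.
    # File name, container name and connection string are placeholders.
    from datetime import datetime, timezone
    from azure.storage.blob import BlobServiceClient

    AZURE_CONN_STR = "<storage-account-connection-string>"
    LOCAL_BACKUP = "backup.zip"

    def upload_backup():
        container = BlobServiceClient.from_connection_string(AZURE_CONN_STR) \
            .get_container_client("marklogic-backups")
        # Timestamped blob name so daily backups do not overwrite each other.
        blob_name = datetime.now(timezone.utc).strftime("backup-%Y%m%d-%H%M%S.zip")
        with open(LOCAL_BACKUP, "rb") as data:
            container.upload_blob(name=blob_name, data=data)

    if __name__ == "__main__":
        upload_backup()

Run as a scheduled WebJob (or from the timer-triggered function in option 3), this covers the "triggered on a time schedule" part; the scheduler supplies the cron expression.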
I was running into a similar issue and was able to resolve it.
Create something inside your container within a folder, so your structure looks like this:
azure://containername/folder
Doing that resolved my issue.

Azure Unzip automation

I am looking to do the following in Azure; however, I should point out that on my local machine I have no Visual Studio, no admin rights, no IT support and no tools (except SSMS), but I have a VERY strong drive to complete this work if it's possible.
I have created an Azure blob container which receives a zipped file each day from a 3rd party. I am looking to do the following:
1) Unzip the data in an automated fashion
2) Get the data into an Azure SQL database (already created) in an automated fashion
What I want to know is whether this is possible using Azure alone, or am I going to need admin rights / Visual Studio? If it is possible, any directions you could point me in would be gratefully received!
Thanks
Dave
Based on your description, one approach would be to create a Blob-triggered Azure Function through the Azure Portal (Visual Studio is not required), unzip/process the file and save the desired data into Azure SQL. Moreover, considering there is only one new file per day, prefer the Consumption plan to optimize cost.
Find more details about Azure Function Blob Binding at https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-storage-blob.
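To make the unzip/process/save step more concrete, here is a minimal Python sketch of the logic that could run inside such a function (or any small script). It is an assumption-laden illustration, not code from the answer above: it assumes the zip contains CSV files, the table/column names and connection strings are placeholders, and pyodbc with the SQL Server ODBC driver is used for Azure SQL.

    # Sketch: download a zipped blob, extract the CSVs and load rows into Azure SQL.
    # Container/blob/table names, connection strings and the column list are
    # placeholders - adjust them to the real schema.
    import csv
    import io
    import zipfile

    import pyodbc
    from azure.storage.blob import BlobServiceClient

    AZURE_CONN_STR = "<storage-account-connection-string>"
    SQL_CONN_STR = ("Driver={ODBC Driver 17 for SQL Server};"
                    "Server=tcp:<server>.database.windows.net,1433;"
                    "Database=<db>;Uid=<user>;Pwd=<password>;Encrypt=yes;")

    def load_daily_zip(blob_name: str) -> None:
        blob = BlobServiceClient.from_connection_string(AZURE_CONN_STR) \
            .get_blob_client(container="daily-drop", blob=blob_name)
        payload = io.BytesIO(blob.download_blob().readall())   # zipped bytes

        conn = pyodbc.connect(SQL_CONN_STR)
        cursor = conn.cursor()
        with zipfile.ZipFile(payload) as archive:
            for member in archive.namelist():
                if not member.lower().endswith(".csv"):
                    continue
                text = io.TextIOWrapper(archive.open(member), encoding="utf-8")
                reader = csv.reader(text)
                next(reader, None)                              # skip header row
                rows = [row[:3] for row in reader]
                if rows:
                    cursor.executemany(
                        "INSERT INTO dbo.DailyData (Col1, Col2, Col3) VALUES (?, ?, ?)",
                        rows)
        conn.commit()
        conn.close()

    if __name__ == "__main__":
        load_daily_zip("latest.zip")

Wired up to the Blob binding from the link above, the blob name and content would come from the trigger rather than being passed in by hand.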
Spin up a Data Factory on Azure; an unzip (decompression) capability is available in ADF when copying data.

View Azure Blob Metadata Online

Is there a way to examine an Azure blob's metadata through a web interface or the Azure portal?
I'm running into a problem where I set metadata on a blob programmatically, without any problems, but when I go back to read the metadata in another section of the program there isn't any. So I'd like to confirm that the metadata was, in fact, written to the cloud.
One of the simplest ways to set/get an Azure Storage Blob's metadata is by using the cross-platform Microsoft Azure Storage Explorer, which is a standalone app from Microsoft that allows you to easily work with Azure Storage data on Windows, macOS and Linux.
Just right-click on the blob you want to examine and select Properties; you will see the metadata list if any metadata exists.
Note: Version tested - 0.8.7
There is no way to check this in the portal; however, you can try the Storage Explorer tool.
If you want to check the metadata in your code, please try this: Get Blob Metadata.
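To confirm programmatically that the metadata really was written, a simple round-trip like the following can help. This is a minimal sketch using the azure-storage-blob Python SDK (v12) with placeholder container/blob names and connection string; it is not taken from the answers above.

    # Sketch: write metadata to a blob, then read it back to verify it was stored.
    # Container and blob names plus the connection string are placeholders.
    from azure.storage.blob import BlobServiceClient

    AZURE_CONN_STR = "<storage-account-connection-string>"

    blob = BlobServiceClient.from_connection_string(AZURE_CONN_STR) \
        .get_blob_client(container="mycontainer", blob="example.csv")

    # Note: set_blob_metadata replaces ALL existing metadata on the blob.
    blob.set_blob_metadata({"source": "ftp", "processed": "false"})

    # Read it back; an empty dict here would mean nothing was persisted.
    print(blob.get_blob_properties().metadata)

Two gotchas worth ruling out: re-uploading the blob later without metadata replaces it (metadata included), and in some SDKs you need to explicitly fetch the blob's attributes/properties before the metadata collection is populated.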

Can Azure Data Factory write to FTP

I want to write the output of a pipeline to an FTP folder. ADF seems to support on-premises file systems but not FTP folders.
How can I write the output in text format to an FTP folder?
Unfortunately, FTP servers are not a supported data store for ADF right now, so there is no out-of-the-box (OOTB) way to interact with an FTP server for either reading or writing.
However, you can use a custom activity to make it possible, though it will require some custom development. A fellow Cloud Solution Architect within MS put together a blog post that talks about how he did it for one of his customers. Please take a look at the following:
https://blogs.msdn.microsoft.com/cloud_solution_architect/2016/07/02/creating-ftp-data-movement-activity-for-azure-data-factory-pipeline/
I hope that this helps.
Upon thinking about it, you might be able to achieve what you want in a mildly convoluted way by writing the output to an Azure Blob storage account and then either:
1) manually: downloading the file from the Blob storage account and pushing it to the "FTP" site, or
2) automatically: using the Azure CLI to pull the file locally and then push it to the "FTP" site with a batch or shell script, as appropriate.
As a lighter-weight approach than custom activities (which are certainly the better option for heavy work), you may wish to consider using Azure Functions to write to FTP (note there is a timeout when using a Consumption plan, but not in other plans, so it will depend on how big the files are).
https://learn.microsoft.com/en-us/azure/azure-functions/functions-create-storage-blob-triggered-function
You could instruct Data Factory to write to an intermediary Blob storage account and use a Blob storage trigger in Azure Functions to upload the files to FTP as soon as they appear in Blob storage; a rough sketch of that upload step is shown below.
Alternatively, write to Blob storage and then use a timer in Logic Apps to upload from Blob storage to FTP. Logic Apps hide a tremendous amount of power behind their friendly exterior.
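For illustration, here is a minimal Python sketch of that blob-to-FTP upload step (the kind of thing the function body would do). The FTP host/credentials, container name and blob name are placeholders, and it relies on ftplib plus the azure-storage-blob SDK; treat it as a sketch rather than a drop-in function.

    # Sketch: download a blob and push it to an FTP server.
    # Host, credentials, container and blob names are placeholders.
    import io
    from ftplib import FTP
    from azure.storage.blob import BlobServiceClient

    AZURE_CONN_STR = "<storage-account-connection-string>"
    FTP_HOST, FTP_USER, FTP_PASS = "ftp.example.com", "user", "password"

    def push_blob_to_ftp(container_name: str, blob_name: str) -> None:
        blob = BlobServiceClient.from_connection_string(AZURE_CONN_STR) \
            .get_blob_client(container=container_name, blob=blob_name)
        data = io.BytesIO(blob.download_blob().readall())  # pipeline output

        with FTP(FTP_HOST) as ftp:
            ftp.login(FTP_USER, FTP_PASS)
            ftp.storbinary(f"STOR {blob_name}", data)      # upload to the FTP folder

    if __name__ == "__main__":
        push_blob_to_ftp("pipeline-output", "output.txt")

In a Blob-triggered function the container/blob names would come from the trigger, and the Consumption plan timeout mentioned above is worth keeping in mind for large files.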
You can write a Logic App that picks your file up from Azure Storage and sends it to an FTP site, then call the Logic App using a Data Factory Web activity.
Make sure you do some error handling in your Logic App so that it returns a 400 if the FTP upload fails.

Unable to create an Import/Export Job on the new Azure portal

I have been trying to set up an import job as described here; the problem is that we do not have "Classic" storage, rather we are trying to set it up with "New" storage. Using the new portal I cannot find the place where one is meant to create a new job. The linked article shows how to do this for classic storage on the old portal only.
I have tried using the second approach they mention, which is to use the API, but that is turning out to be more of a pain than I thought.
Does anyone know where I can add an import/export job in the new portal? Is this possible with "new" storage? If I manage to get the API way to work, can it be applied to "new" storage or is it only for "classic"?
Unfortunately, Import/Export is not available in the Preview Portal, and does not work at this time with v2 storage accounts. Can you use a Classic storage account instead?
We may be able to provide sample code to unblock your scenario. Could you please send an email with your detailed requirements to waimportexport@microsoft.com so that we can set up a call to discuss further?
Thanks,
Aung
