Excel OneDrive to Blob with Azure Data Factory

Has anyone used an Azure Data Factory copy activity to move Excel data from OneDrive to Azure Blob storage?
I can't use a Logic App, because it would incur extra cost.
Thanks

At this time there is no connector in ADF that can meet your requirement, but there is another way to approach it: use Power Automate, either with a template or by creating your own flow. Use this template: Copy new files to AzureBlob from a OneDrive for Business folder.
Save it, then run Test Flow Manually under Test.
Reference: https://learn.microsoft.com/en-us/answers/questions/464671/copy-files-from-onedrive-and-transfer-to-azure-blo.html
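For reference, what the template does can also be scripted. Here is a minimal Python sketch of the same copy using the Microsoft Graph API and the Azure Blob SDK, assuming you already have a Graph access token with Files.Read scope (the file path, container name, and token acquisition below are placeholders, not prescribed values):

```python
import requests
from azure.storage.blob import BlobClient

GRAPH_TOKEN = "<Graph access token with Files.Read>"  # placeholder: obtain via your auth flow
ONEDRIVE_PATH = "Reports/sales.xlsx"                  # placeholder file path in OneDrive

# Download the Excel file from OneDrive via Microsoft Graph.
resp = requests.get(
    f"https://graph.microsoft.com/v1.0/me/drive/root:/{ONEDRIVE_PATH}:/content",
    headers={"Authorization": f"Bearer {GRAPH_TOKEN}"},
)
resp.raise_for_status()

# Upload the downloaded bytes to an Azure Blob container.
blob = BlobClient.from_connection_string(
    "<storage connection string>", "excel-files", "sales.xlsx"  # placeholders
)
blob.upload_blob(resp.content, overwrite=True)
```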

Related

Change file directory in SharePoint using Azure Data Factory

I am new to Azure Data Factory and have started doing projects with it. So far, I have managed to copy files from SharePoint to ADLS. After copying, I would like to use ADF to move the file in SharePoint to an Archive folder, but I have not been successful.
For example, my file is in "Shared Documents/latestfile/test.xlsx". After copying it into ADLS, I would like to move the file to "Shared Documents/Archive/test.xlsx".
I would kindly appreciate some help in doing so. Thank you.
SharePoint as a sink is not yet supported by Azure Data Factory.
Please refer to the docs for the Azure Data Factory connector overview, here.
You can try leveraging the SharePoint REST APIs to achieve the same, as mentioned here: REST call to move a file/folder to another location | Link
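As a rough illustration of that REST call, the move is a single POST against SharePoint's MoveTo endpoint. A minimal Python sketch, assuming you already have a valid bearer token for the site (the site URL, paths, and token below are placeholders):

```python
import requests

SITE = "https://mytenant.sharepoint.com/sites/mysite"  # placeholder site URL
SRC = "/sites/mysite/Shared Documents/latestfile/test.xlsx"
DST = "/sites/mysite/Shared Documents/Archive/test.xlsx"
ACCESS_TOKEN = "<bearer token for SharePoint>"  # placeholder: obtain via your auth flow

# MoveTo with flags=1 overwrites the destination if it already exists.
resp = requests.post(
    f"{SITE}/_api/web/GetFileByServerRelativeUrl('{SRC}')/MoveTo(newurl='{DST}',flags=1)",
    headers={
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        "Accept": "application/json;odata=verbose",
    },
)
resp.raise_for_status()
```

In ADF you could invoke this same endpoint from a Web activity chained after the copy succeeds.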

Power Apps: Using a Form to enter data in DataLake, Data Factory or Synapse?

Is it possible to create an app or web-form app using Power Apps to save and retrieve information from Azure Data Lake, Synapse, or Data Factory?
Could you give any suggestions about this implementation, please?
I appreciate any help you can share!
Thanks so much!
There are multiple ways to import and export data into Microsoft Dataverse. You can use dataflows, Power Query, Azure Data Factory, Azure Logic Apps, and Power Automate. See Importing and exporting data and Import by bringing your own source file.
You can configure dataflows to store their data in your organization’s Azure Data Lake Storage Gen2 account. This article describes the general steps necessary to do so, and provides guidance and best practices along the way.
Although I am just starting with Power Apps, you can check out Create, edit, or configure forms using the form designer for further details.
Use Dataverse in ADF, and add the data source to your forms (see the sketch below).
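If the form data ends up in Dataverse, a hedged sketch of what a row insert looks like against the Dataverse Web API (the environment URL and token below are placeholder assumptions; Power Apps forms and ADF's Dataverse connector ultimately write to the same tables):

```python
import requests

ORG_URL = "https://yourorg.api.crm.dynamics.com"  # placeholder environment URL
ACCESS_TOKEN = "<AAD bearer token for Dataverse>"  # placeholder: obtain via your auth flow

# Create one row in the standard 'accounts' table via the Web API.
resp = requests.post(
    f"{ORG_URL}/api/data/v9.2/accounts",
    headers={
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        "Content-Type": "application/json",
        "OData-Version": "4.0",
    },
    json={"name": "Contoso row from my form"},
)
resp.raise_for_status()
```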

Copy data transfer from sharepoint to datalake

EDIT:
I have gone through "copy data from sharepoint to blob storage" under this thread:
Azure Data Factory and SharePoint
But I am looking for a solution to copy data from SharePoint to Data Lake Gen2. Any pointers or examples would be useful. Thanks.
My raw data (CSV format) is in a SharePoint folder. I am trying to automate the data transfer whenever the raw data changes. For example, I might update a CSV file on SharePoint; that event should trigger the data transfer from SharePoint to the data lake.
I have seen a few articles about using Logic Apps and Data Factory to do the data transfer.
But my scenario is to trigger that pipeline whenever the raw data gets updated on SharePoint.
Can someone please shed some light? Thanks.
Regards

How to use Azure Data Factory to copy files between Sharepoint 365 and OneDrive

I have to build ADF pipelines that move files from SharePoint document library folders into a single OneDrive belonging to a third party. I am unable to find a good source of information on how to create SharePoint and OneDrive datasets in ADF.
Any help on how to create the datasets would be appreciated.
Thank you!
Please refer to this document: Azure Data Factory connector overview
SharePoint and OneDrive are not supported as connectors, so we cannot create those datasets.

Azure Data Factory and SharePoint

I have some Excel files stored in SharePoint Online. I want to copy the files stored in SharePoint folders to Azure Blob storage.
To achieve this, I am creating a new pipeline in Azure Data Factory using the Azure portal. What are the possible ways to copy files from SharePoint to Azure Blob storage using Azure Data Factory pipelines?
I have looked at all the linked service types in Azure Data Factory but couldn't find a suitable type for connecting to SharePoint.
Rather than directly accessing the file in SharePoint from Data Factory, you might have to use an intermediate technology and have Data Factory call that. You have a few options:
Use a Logic App to move the file
Use an Azure Function
Use a custom activity and write your own C# to copy the file.
To call a Logic App from ADF, you use a web activity.
You can directly call an Azure Function now.
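For the Azure Function route, here is a minimal sketch of an HTTP-triggered function that ADF's Azure Function activity could invoke. The request shape, token handling, and names are illustrative assumptions, not a reference implementation:

```python
import requests
import azure.functions as func
from azure.storage.blob import BlobClient

def main(req: func.HttpRequest) -> func.HttpResponse:
    # Assumed payload from ADF: the SharePoint file URL, a bearer token
    # for SharePoint, and the target blob details (all hypothetical names).
    body = req.get_json()

    # Download the file from SharePoint.
    resp = requests.get(
        body["source_url"],  # e.g. a .../$value REST URL
        headers={"Authorization": f"Bearer {body['sp_token']}"},
    )
    resp.raise_for_status()

    # Upload the downloaded bytes to Blob storage.
    blob = BlobClient.from_connection_string(
        body["blob_conn_str"], body["container"], body["blob_name"]
    )
    blob.upload_blob(resp.content, overwrite=True)

    return func.HttpResponse("copied", status_code=200)
```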
We can create a linked service of type 'File system' by providing the directory URL as the 'Host' value. To authenticate, provide a username and password or Azure Key Vault (AKV) details.
Note: use a self-hosted IR.
You can use a Logic App to fetch data from SharePoint and load it into Azure Blob storage, then use Azure Data Factory to fetch the data from Blob. You can even set an event trigger so that whenever a file arrives in the blob container, the pipeline triggers automatically.
You can use Power Automate (https://make.powerautomate.com/) to do this task automatically:
Create an automated cloud flow that triggers whenever a new file is dropped in a SharePoint folder
Use whichever trigger fits your requirement and fill in the SharePoint details
Add an action to create a blob and fill in the details as per your use case
This way you move the SharePoint files to the blob without even using ADF.
My previous answer was true at the time, but in the last few years Microsoft has published guidance on how to copy documents from a SharePoint library. You can copy a file from SharePoint Online by using a Web activity to authenticate and grab an access token from SPO, then pass it to a subsequent Copy activity to copy the data with an HTTP connector as the source.
I ran into some issues with large files and Logic Apps. It turned out there were some extremely large files to be copied from that SharePoint library. SharePoint has a default limit of 100 MB buffer size, and the Get File Content action doesn’t natively support chunking.
I successfully pulled the files with the web activity and copy activity. But I found the SharePoint permissions configuration to be a bit tricky. I blogged my process here.
You can use a binary dataset if you just want to copy the full file rather than read the data.
If my file is located at https://mytenant.sharepoint.com/sites/site1/libraryname/folder1/folder2/folder3/myfile.CSV, the URL I need to retrieve the file is https://mytenant.sharepoint.com/sites/site1/_api/web/GetFileByServerRelativeUrl('/sites/site1/libraryname/folder1/folder2/folder3/myfile.CSV')/$value.
Be careful about when you get your auth token. Your auth token is valid for 1 hour. If you copy a bunch of files sequentially, and it takes longer than that, you might get a timeout error.
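Putting the pieces together, here is a rough Python equivalent of the Web activity + Copy activity pattern, assuming an app registration with SharePoint app-only permissions. Tenant ID, client credentials, and paths are placeholders; the ACS token endpoint shown is the one Microsoft's guidance describes, but verify it for your tenant:

```python
import requests

TENANT_ID = "<tenant-guid>"      # placeholder
CLIENT_ID = "<app-client-id>"    # placeholder app registration
CLIENT_SECRET = "<app-secret>"   # placeholder
SP_HOST = "mytenant.sharepoint.com"
SPO_PRINCIPAL = "00000003-0000-0ff1-ce00-000000000000"  # SharePoint Online principal ID

# Step 1 (the Web activity in ADF): grab an access token from SPO.
# Tokens are valid for about an hour, so re-request for long-running copies.
token = requests.post(
    f"https://accounts.accesscontrol.windows.net/{TENANT_ID}/tokens/OAuth/2",
    data={
        "grant_type": "client_credentials",
        "client_id": f"{CLIENT_ID}@{TENANT_ID}",
        "client_secret": CLIENT_SECRET,
        "resource": f"{SPO_PRINCIPAL}/{SP_HOST}@{TENANT_ID}",
    },
).json()["access_token"]

# Step 2 (the Copy activity with an HTTP source): fetch the file content.
file_url = (
    f"https://{SP_HOST}/sites/site1/_api/web/GetFileByServerRelativeUrl("
    "'/sites/site1/libraryname/folder1/folder2/folder3/myfile.CSV')/$value"
)
resp = requests.get(file_url, headers={"Authorization": f"Bearer {token}"})
resp.raise_for_status()

with open("myfile.CSV", "wb") as f:
    f.write(resp.content)
```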
