EDIT:
I have gone through "copy data from SharePoint to blob storage" under this thread:
Azure Data Factory and SharePoint
But I am looking for a solution to copy data from SharePoint to Data Lake Gen2. Any pointers or examples would be useful. Thanks.
My raw data (CSV format) is in a SharePoint folder. I am trying to automate the data transfer whenever the raw data changes. For example, when I update a CSV file on SharePoint, that event should trigger the data transfer from SharePoint to the data lake.
I have seen a few articles about using Logic Apps and Data Factory to do the data transfer.
But my scenario is to trigger that pipeline whenever the raw data gets updated on SharePoint.
Can someone please shed some light on this? Thanks.
Regards
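For illustration, here is a minimal sketch of the copy step itself (not the trigger): it downloads one CSV from SharePoint and lands it in ADLS Gen2. The library choice (office365-rest-python-client and azure-storage-file-datalake), site URL, credentials, and paths are all assumptions, and in practice the trigger would come from something like a Logic App or Power Automate flow rather than from this script.

    import io
    from office365.sharepoint.client_context import ClientContext
    from office365.runtime.auth.user_credential import UserCredential
    from azure.storage.filedatalake import DataLakeServiceClient

    SITE_URL = "https://contoso.sharepoint.com/sites/mysite"        # placeholder site
    FILE_URL = "/sites/mysite/Shared Documents/raw/data.csv"        # placeholder source file

    # Download the CSV from SharePoint into memory (placeholder credentials).
    ctx = ClientContext(SITE_URL).with_credentials(
        UserCredential("user@contoso.com", "<password>"))
    buf = io.BytesIO()
    ctx.web.get_file_by_server_relative_url(FILE_URL).download(buf).execute_query()

    # Land the same bytes in an ADLS Gen2 filesystem (placeholder account/key/paths).
    dl = DataLakeServiceClient(
        account_url="https://mystorageacct.dfs.core.windows.net",
        credential="<storage-account-key>")
    file_client = dl.get_file_system_client("raw").get_file_client("data.csv")
    file_client.upload_data(buf.getvalue(), overwrite=True)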
Related
Has anyone ever done the Azure Data Factory flow activity copying Excel data on OneDrive to Azure Blob?
I can't do it with a Logic App, because that would cost extra.
Thanks
At this time there is no connector in ADF that can help achieve your requirement, but there is another way you can approach it, i.e., with the help of Power Automate, either using a template or creating your own flow. Please use this link to get there: Copy new files to AzureBlob from a OneDrive for Business folder.
Save it, and then test the flow manually under Test.
Reference: https://learn.microsoft.com/en-us/answers/questions/464671/copy-files-from-onedrive-and-transfer-to-azure-blo.html
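If a scripted alternative is acceptable, a rough sketch of the same copy could use Microsoft Graph to download the OneDrive file and the Azure Blob SDK to upload it. The token, file path, connection string, and container/blob names below are placeholders, not a verified setup:

    import requests
    from azure.storage.blob import BlobServiceClient

    GRAPH_TOKEN = "<microsoft-graph-access-token>"    # placeholder token
    FILE_PATH = "/Reports/daily.xlsx"                 # placeholder OneDrive path

    # Download the file content from OneDrive for Business via Microsoft Graph.
    resp = requests.get(
        f"https://graph.microsoft.com/v1.0/me/drive/root:{FILE_PATH}:/content",
        headers={"Authorization": f"Bearer {GRAPH_TOKEN}"})
    resp.raise_for_status()

    # Upload the same bytes to a blob container (placeholder connection string/names).
    blob_service = BlobServiceClient.from_connection_string("<storage-connection-string>")
    blob_service.get_blob_client(container="incoming", blob="daily.xlsx").upload_blob(
        resp.content, overwrite=True)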
I am new to Azure Data Factory and have started doing projects with it. Currently, I have managed to copy files from SharePoint to ADLS. After copying, I would like to move the file in SharePoint to an Archive folder using ADF, but I have not been successful.
So, for example, my file is in "Shared Documents/latestfile/test.xlsx". After copying it into ADLS, I would like to move the file to "Shared Documents/Archive/test.xlsx".
I would appreciate some help with this. Thank you.
SharePoint as a sink is not yet supported by Azure Data Factory.
Please refer to the docs for the Azure Data Factory connector overview, here.
You can try to leverage the SharePoint APIs to achieve the same, as mentioned here: REST call to move a file/folder to another location | Link
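For example, here is a rough sketch of that REST call; the same POST could equally be issued from an ADF Web activity after the copy. The site URL, server-relative paths, and bearer token are placeholders:

    import requests

    SITE = "https://contoso.sharepoint.com/sites/mysite"            # placeholder site
    SRC = "/sites/mysite/Shared Documents/latestfile/test.xlsx"     # file just copied
    DST = "/sites/mysite/Shared Documents/Archive/test.xlsx"        # archive destination
    TOKEN = "<sharepoint-access-token>"                             # placeholder token

    # MoveTo moves the file server-side; flags=1 overwrites an existing file in Archive.
    url = (f"{SITE}/_api/web/GetFileByServerRelativeUrl('{SRC}')"
           f"/moveto(newurl='{DST}',flags=1)")
    resp = requests.post(url, headers={
        "Authorization": f"Bearer {TOKEN}",
        "Accept": "application/json;odata=verbose"})
    resp.raise_for_status()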
Hopefully someone can help. I have a requirement to create something like a "bring your own data model" feature so clients can upload their own files, use them in a Power BI dataflow, link them with other sources (databases, services, etc.), and create reports from them. The only problem is that I'm having trouble creating a dataflow from an Azure blob file.
The files are in CSV format
I have the Premium feature
I first tested and created an M query in Power BI Desktop and it works; I can access the data
What I do is copy my M query into the Power BI dataflow data source as a blank query. The result is that I can preview the data, but if I try to save it, I get this error:
I tried the steps in this link but no luck.
Any ideas?
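As a sanity check outside of Power BI, a short sketch like the following (connection string, container, and blob names are placeholders) can at least confirm the CSV is readable from the blob container with the credentials in use:

    import io
    import pandas as pd
    from azure.storage.blob import BlobServiceClient

    # Placeholder connection string, container, and blob names.
    service = BlobServiceClient.from_connection_string("<storage-connection-string>")
    blob = service.get_blob_client(container="uploads", blob="client_data.csv")
    data = blob.download_blob().readall()

    # If this parses cleanly, the file and credentials are fine and the problem is on the dataflow side.
    df = pd.read_csv(io.BytesIO(data))
    print(df.head())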
I am a newbie to Azure Data Lake.
The below screenshot shows 2 folders (Storage Account and Catalog), one for Data Lake Analytics and the other for Data Lake Store.
My question is: what is the purpose of each folder, and why are we using U-SQL for transformations when this can be done in Data Factory?
Please explain the data flow process from the data store to the data lake.
[screenshot: the two folders, Storage Account and Catalog, under the Data Lake Analytics account]
Thank you,
Addy
I have addressed your query on the MSDN thread:
https://social.msdn.microsoft.com/Forums/en-US/f8405bdb-0c85-4d37-8f2e-0dab983c7f94/what-is-the-purpose-of-having-two-folders-in-azure-datalake-analytics?forum=AzureDataLake
Hope this helps.
I have a requirement to bulk upload data from an Excel file to an Azure SQL table on a daily basis. I did some research and found that we could create a VM, install full SQL Server, and use an SSIS package to do this.
Is there any other reliable way to go about this? The Excel file may contain up to 10,000 rows.
I have also read that we could upload the file to blob storage and read it from there, but found it's not a very robust approach.
Can anyone suggest whether this is a feasible approach:
Place the Excel file in an Azure Website, accessed via FTP
An Azure timer job using SqlBulkCopy code to update the SQL table
Any help would be highly appreciated!
You could use Azure Data Factory - check out the documentation here. Place your files in Azure Data Lake and ADF will process them.
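If you end up scripting the load yourself instead, the bulk-insert step for ~10,000 rows is fairly small. Here is a rough sketch, assuming the ODBC Driver 18 for SQL Server is installed, with placeholder server, credentials, table, and column names:

    import pandas as pd
    import pyodbc

    # Read the day's file (placeholder filename) into a DataFrame.
    df = pd.read_excel("daily_upload.xlsx")

    # Placeholder server, database, credentials, table, and column names.
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 18 for SQL Server};"
        "SERVER=myserver.database.windows.net;DATABASE=mydb;"
        "UID=myuser;PWD=<password>;Encrypt=yes")
    cur = conn.cursor()
    cur.fast_executemany = True    # send the inserts in batches instead of row by row
    cur.executemany(
        "INSERT INTO dbo.DailyData (Col1, Col2, Col3) VALUES (?, ?, ?)",
        df[["Col1", "Col2", "Col3"]].values.tolist())
    conn.commit()

For larger or more frequent loads, staging the file in Blob Storage or Data Lake and letting ADF copy it into Azure SQL, as suggested above, is the more scalable route.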