How to get data from Azure Table Storage to SharePoint Online

I need to get data from Azure Table Storage and populate it in a SharePoint Online (O365) site as a list. I found a way to do this when the source is SQL Azure, but I couldn't find one for Azure Tables. Kindly share your inputs.

Azure Table Storage and SharePoint both have APIs, so in the base case you can export from one and import into the other using those APIs. However, it seems you are looking for a tool rather than wanting to write code.
To export from Azure Tables you can use the AzCopy tool (http://aka.ms/azcopy) to export the table contents into a JSON-formatted local file. Then you can use whatever tool you prefer to import into SharePoint, as long as it understands JSON files.
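If writing a small amount of code is acceptable, the API route could look roughly like the sketch below, using the azure-data-tables and Office365-REST-Python-Client packages; the connection string, table, site, list, and column names are placeholders, not values from the question.

# pip install azure-data-tables Office365-REST-Python-Client
from azure.data.tables import TableServiceClient
from office365.runtime.auth.user_credential import UserCredential
from office365.sharepoint.client_context import ClientContext

# Placeholder connection details -- replace with your own.
TABLES_CONN_STR = "<azure-storage-connection-string>"
SP_SITE_URL = "https://contoso.sharepoint.com/sites/mysite"
SP_LIST_TITLE = "MyList"

# Read all entities from the Azure table.
table_service = TableServiceClient.from_connection_string(TABLES_CONN_STR)
table_client = table_service.get_table_client(table_name="mytable")
entities = table_client.list_entities()

# Queue one SharePoint list item per entity, then send the batch.
ctx = ClientContext(SP_SITE_URL).with_credentials(
    UserCredential("user@contoso.com", "<password>"))
sp_list = ctx.web.lists.get_by_title(SP_LIST_TITLE)
for entity in entities:
    # Map table properties to list columns as needed; 'Title' is just an example.
    sp_list.add_item({"Title": str(entity["RowKey"])})
ctx.execute_query()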

Related

Power Apps: Using a Form to enter data in DataLake, Data Factory or Synapse?

Is it possible to create an app or web form app using Power Apps to save and retrieve information from Azure Data Lake, Synapse, or Data Factory?
Could you give any suggestions about this implementation, please?
I appreciate any help you can share!!
Thanks so much!
There are multiple ways to import and export data into Microsoft Dataverse. You can use dataflows, Power Query, Azure Data Factory, Azure Logic Apps, and Power Automate. See Importing and exporting data and Import by bringing your own source file.
You can configure dataflows to store their data in your organization’s Azure Data Lake Storage Gen2 account. This article describes the general steps necessary to do so, and provides guidance and best practices along the way.
Although I am just starting in Power Apps, you can check out Create, edit, or configure forms using the form designer for further details.
Use Dataverse in ADF.
Add a source to your forms.
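If a code-first route is also acceptable alongside the low-code options above, a minimal sketch of saving and retrieving a file in Azure Data Lake Storage Gen2 with the azure-storage-file-datalake SDK shows the storage calls those tools wrap; the account, container, and path names here are placeholders.

# pip install azure-storage-file-datalake azure-identity
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

# Placeholder names -- replace with your own ADLS Gen2 account and container.
service = DataLakeServiceClient(
    account_url="https://mydatalake.dfs.core.windows.net",
    credential=DefaultAzureCredential())
fs = service.get_file_system_client(file_system="mycontainer")

# Save: upload a small CSV payload captured from a form.
file_client = fs.get_file_client("forms/submission1.csv")
file_client.upload_data(b"id,name\n1,Contoso\n", overwrite=True)

# Retrieve: download it back.
print(file_client.download_file().readall().decode("utf-8"))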

Can we export data from Azure Data Catalog to CSV, including tags?

I have data (tables) in Azure Data Catalog with some user-defined tags; now I want to export the data to CSV, including the tags as a column.
Can we export data from Azure Data Catalog to CSV?
It seems that this feature is currently not supported in Azure Data Catalog.
There is already a similar suggestion on the official feature suggestion forum for Microsoft Azure: Export search results to CSV. You can comment on and vote for it there.

How to access an Azure database containing data from an Azure Log Analytics query

I have a working query for my app data to be analyzed.
Currently it analyzes the last two weeks' data with an ago(14d).
Now I want to use a value containing the release date of the app's current version. Since I haven't found a way to add a new database table to the existing database containing the log data in Azure Log Analytics, I created a new database in Azure and entered my data there.
Now I just don't know if I can get access to that database at all from within the web query interface of Azure Log Analytics, or if I have to use some other tool for that.
I hope that somebody can help me with this.
As always with Azure there is a lot of material to read, but nothing concrete for my issue (or at least I haven't found it yet).
And yes, I know how to insert the data into the query with a let, but since I want to use the same data in different queries, an external location which can be accessed from all the queries would be the solution I prefer.
Thx in advance.
Maverick
You cannot access a database directly. You are better off using a CSV/JSON file in blob storage. In the following example I uploaded a text file with CSV data like this:
2a6c024f-9093-434c-b3b1-000821a15b1a,"Customer 1"
28a658a8-5466-45ea-862c-003b20507dd4,"Customer 2"
c46fb949-d807-4eea-8de4-005dd4beb39a,"Customer 3"
e05b67ee-ff83-4805-b004-0064449f196c,"Customer 4"
Then I can reference this data from Log Analytics / Application Insights in a query like this, using the externaldata operator:
let customers = externaldata(id:string, companyName:string) [
h#"https://xxx.blob.core.windows.net/myblob.txt?sv=2019-10-10&st=2020-09-29T11%3A39%3A22Z&se=2050-09-30T11%3A39%3A00Z&sr=b&sp=r&sig=xxx"
] with(format="csv");
requests
| extend CompanyId = tostring(customDimensions.CustomerId)
| join kind=leftouter
(
customers
)
on $left.CompanyId == $right.id
The URL https://xxx.blob.core.windows.net/myblob.txt?sv=2019-10-10&st=2020-09-29T11%3A39%3A22Z&se=2050-09-30T11%3A39%3A00Z&sr=b&sp=r&sig=xxx is created by generating a URL that includes a SAS token using Microsoft Azure Storage Explorer: select a blob, then right-click -> Get Shared Access Signature. In the popup, create a SAS and then copy the URI.
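If you would rather generate that SAS URL in code than in Storage Explorer, a minimal sketch with the azure-storage-blob package (the account, container, blob, and key values are placeholders) would be:

# pip install azure-storage-blob
from datetime import datetime, timedelta
from azure.storage.blob import BlobSasPermissions, generate_blob_sas

# Placeholder values -- replace with your storage account details.
account_name = "xxx"
container_name = "mycontainer"
blob_name = "myblob.txt"
account_key = "<storage-account-key>"

# Create a read-only SAS token that the externaldata operator can use.
sas = generate_blob_sas(
    account_name=account_name,
    container_name=container_name,
    blob_name=blob_name,
    account_key=account_key,
    permission=BlobSasPermissions(read=True),
    expiry=datetime.utcnow() + timedelta(days=365))
print(f"https://{account_name}.blob.core.windows.net/{container_name}/{blob_name}?{sas}")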
I know Log Analytics uses Azure Data Explorer in the back end, and Azure Data Explorer has a feature to use external tables within queries, but I am not sure whether Log Analytics supports external tables.
External Tables in Azure Data Explorer
https://learn.microsoft.com/en-us/azure/data-explorer/kusto/query/schema-entities/externaltables

Code to connect to SharePoint from PySpark

I want to extract SharePoint list data using PySpark. I am not sure about the SharePoint list data and storage. I want to read the SharePoint list data as a PySpark DataFrame.
I have tried Python libraries:
Sharepy
SharePlum
SharePoint and many others
Assuming you are using PySpark from Databricks, I am using a different approach. I am using Office 365 Power Automate flows to store the SharePoint lists in Azure storage as CSV files. These flows can be called from Databricks by calling the Power Automate HTTP triggers in Python, or you can have Power Automate update automatically when a data change occurs. The CSV files can then be mounted as tables in SQL analytics and used easily in Databricks. The benefit is that Microsoft offers an easy-to-use, no-code solution for exporting SharePoint to Azure storage, and it also handles all of the security nuances.
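As a rough sketch of that flow from a Databricks notebook (the trigger URL and storage path below are placeholders, not values from this answer): call the flow's HTTP trigger with requests, then read the CSV it produced with PySpark.

import requests

# Placeholder HTTP trigger URL copied from the Power Automate flow.
flow_url = "https://prod-00.westus.logic.azure.com/workflows/<id>/triggers/manual/paths/invoke?<query>"

# Kick off the flow that exports the SharePoint list to CSV in Azure Storage.
resp = requests.post(flow_url, json={"listName": "MyList"})
resp.raise_for_status()

# Read the exported CSV back as a DataFrame (spark is the Databricks-provided session).
df = (spark.read
      .option("header", "true")
      .csv("abfss://exports@mystorage.dfs.core.windows.net/sharepoint/MyList.csv"))
df.show()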
You can download the file/list that you want from SharePoint first using one of the following packages, then use PySpark to ingest and process it.
https://pypi.org/project/sharepoint/
https://pypi.org/project/sharepy/
Here is a tutorial using the Sharepy package: https://www.mydatahack.com/how-to-get-data-from-sharepoint-with-python/
I hope it helps.
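For example, a minimal sketch with sharepy that pulls list items over the SharePoint REST API and loads them into a PySpark DataFrame; the site, list, and credential values are placeholders.

# pip install sharepy
import sharepy

site = "https://mytenant.sharepoint.com"

# Authenticate against SharePoint Online (placeholder credentials).
s = sharepy.connect(site, username="user@mytenant.com", password="<password>")

# Call the SharePoint REST API for the list items.
r = s.get(f"{site}/sites/site1/_api/web/lists/getbytitle('MyList')/items",
          headers={"Accept": "application/json;odata=nometadata"})
items = r.json()["value"]

# Turn the JSON rows into a PySpark DataFrame (spark is the active session).
df = spark.createDataFrame(items)
df.show()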

Azure Data Factory and SharePoint

I have some Excel files stored in SharePoint Online. I want to copy the files stored in SharePoint folders to Azure Blob Storage.
To achieve this, I am creating a new pipeline in Azure Data Factory using the Azure portal. What are the possible ways to copy files from SharePoint to Azure Blob Storage using Azure Data Factory pipelines?
I have looked at all the linked service types in Azure Data Factory but couldn't find any suitable type to connect to SharePoint.
Rather than directly accessing the file in SharePoint from Data Factory, you might have to use an intermediate technology and have Data Factory call that. You have a few options:
Use a Logic App to move the file
Use an Azure Function
Use a custom activity and write your own C# to copy the file.
To call a Logic App from ADF, you use a web activity.
You can directly call an Azure Function now.
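To illustrate the Azure Function option, a minimal HTTP-triggered function (Python, with placeholder site, file, credential, and storage names; not code from this answer) that copies one SharePoint file into Blob Storage might look like:

# Azure Function (HTTP trigger); requires Office365-REST-Python-Client and azure-storage-blob.
import azure.functions as func
from azure.storage.blob import BlobServiceClient
from office365.runtime.auth.user_credential import UserCredential
from office365.sharepoint.client_context import ClientContext
from office365.sharepoint.files.file import File

def main(req: func.HttpRequest) -> func.HttpResponse:
    site_url = "https://mytenant.sharepoint.com/sites/site1"    # placeholder
    file_url = "/sites/site1/Shared Documents/report.xlsx"      # placeholder

    # Download the file content from SharePoint.
    ctx = ClientContext(site_url).with_credentials(
        UserCredential("user@mytenant.com", "<password>"))
    response = File.open_binary(ctx, file_url)

    # Upload it to Blob Storage (placeholder connection string and container).
    blob_service = BlobServiceClient.from_connection_string("<storage-connection-string>")
    blob_service.get_blob_client("sharepoint-copies", "report.xlsx").upload_blob(
        response.content, overwrite=True)

    return func.HttpResponse("Copied", status_code=200)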
We can create a linked service of type 'File system' by providing the directory URL as the 'Host' value. To authenticate the user, provide a username and password/Azure Key Vault details.
Note: use a self-hosted integration runtime (IR).
You can use a Logic App to fetch data from SharePoint and load it into Azure Blob Storage, and then use Azure Data Factory to fetch the data from the blob. You can even set an event trigger so that whenever a file lands in the blob container, the pipeline triggers automatically.
You can use Power Automate (https://make.powerautomate.com/) to do this task automatically:
Create an automated cloud flow that triggers whenever a new file is dropped in a SharePoint library
Use whichever trigger fits your requirement and fill in the SharePoint details
Add an action to create a blob and fill in the details as per your use case
By using this you will be copying all the SharePoint files to the blob without even using ADF.
My previous answer was true at the time, but in the last few years Microsoft has published guidance on how to copy documents from a SharePoint library. You can copy a file from SharePoint Online by using a Web activity to authenticate and grab an access token from SPO, then passing it to a subsequent Copy activity to copy the data with an HTTP connector as the source.
I ran into some issues with large files and Logic Apps. It turned out there were some extremely large files to be copied from that SharePoint library. SharePoint has a default limit of 100 MB buffer size, and the Get File Content action doesn’t natively support chunking.
I successfully pulled the files with the web activity and copy activity. But I found the SharePoint permissions configuration to be a bit tricky. I blogged my process here.
You can use a binary dataset if you just want to copy the full file rather than read the data.
If my file is located at https://mytenant.sharepoint.com/sites/site1/libraryname/folder1/folder2/folder3/myfile.CSV, the URL I need to retrieve the file is https://mytenant.sharepoint.com/sites/site1/_api/web/GetFileByServerRelativeUrl('/sites/site1/libraryname/folder1/folder2/folder3/myfile.CSV')/$value.
Be careful about when you get your auth token. Your auth token is valid for 1 hour. If you copy a bunch of files sequentially, and it takes longer than that, you might get a timeout error.
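To make that token flow concrete, here is a rough Python sketch of the same two steps outside ADF (register an Azure AD app with SharePoint permissions first; every ID, secret, and URL below is a placeholder):

import requests

tenant_id = "<tenant-guid>"                  # placeholder
client_id = "<app-registration-client-id>"   # placeholder
client_secret = "<client-secret>"            # placeholder
tenant = "mytenant"

# Step 1: grab an access token from SharePoint Online (what the Web activity does).
token_url = f"https://accounts.accesscontrol.windows.net/{tenant_id}/tokens/OAuth/2"
payload = {
    "grant_type": "client_credentials",
    "client_id": f"{client_id}@{tenant_id}",
    "client_secret": client_secret,
    "resource": f"00000003-0000-0ff1-ce00-000000000000/{tenant}.sharepoint.com@{tenant_id}",
}
token = requests.post(token_url, data=payload).json()["access_token"]

# Step 2: download the file via the $value endpoint (what the Copy activity's HTTP source does).
file_url = ("https://mytenant.sharepoint.com/sites/site1/_api/web/"
            "GetFileByServerRelativeUrl('/sites/site1/libraryname/folder1/folder2/folder3/myfile.CSV')/$value")
content = requests.get(file_url, headers={"Authorization": f"Bearer {token}"}).content

with open("myfile.CSV", "wb") as f:
    f.write(content)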
