Data Factory to SharePoint list

I've set up a connection from our Data Factory in Azure to a SharePoint site so I can pull some of the lists on the site into Blob storage and then process them into our warehouse. This all works fine and I can see the data I want. However, I don't want to pull all of the columns contained in the list. Looking at the connection, I can specify a query, but anything I put in there has no effect on the data that comes back. Is there a way to specify the columns from a SharePoint list through the copy activity into Blob storage?

You need to use a select query like the one below in the Query text field of the Azure Data Factory source:
$select=Title,Number,OrderDate
You can use the Preview data button to validate the results. Please refer to the documentation on using custom OData query options.
I have tried this and it works fine for me.
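If you also need to restrict or limit the rows, the same Query field accepts other standard OData query options; exactly which ones the SharePoint connector honours is worth checking against the documentation, but a combined query would look roughly like this (column names taken from the example above):
$select=Title,Number,OrderDate&$filter=Number gt 100&$top=500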
Thanks
Saurabh

Related

How to access an Azure database containing data from an Azure Log Analytics query

I have a working query for my app data to be analyzed.
Currently it analyzes the last two weeks of data with an ago(14d).
Now I want to use a value containing the release date of the app's current version. Since I haven't found a way to add a new table to the existing database containing the log data in Azure Log Analytics, I created a new database in Azure and entered my data there.
Now I just don't know whether I can access that database at all from within the web query interface of Azure Log Analytics, or whether I have to use some other tool for that.
I hope that somebody can help me with this.
As always with Azure there is a lot of material to read, but nothing concrete for my issue (or at least I haven't found it yet).
And yes, I know how to insert the data into the query with a let statement, but since I want to use the same data in different queries, an external location that can be accessed from all the queries would be the solution I prefer.
Thanks in advance.
Maverick
You cannot access a database directly. You are better off using a CSV/JSON file in Blob storage. In the following example I uploaded a text file with CSV data like this:
2a6c024f-9093-434c-b3b1-000821a15b1a,"Customer 1"
28a658a8-5466-45ea-862c-003b20507dd4,"Customer 2"
c46fb949-d807-4eea-8de4-005dd4beb39a,"Customer 3"
e05b67ee-ff83-4805-b004-0064449f196c,"Customer 4"
Then I can reference this data from Log Analytics / Application Insights in a query like this, using the externaldata operator:
let customers = externaldata(id:string, companyName:string) [
h@"https://xxx.blob.core.windows.net/myblob.txt?sv=2019-10-10&st=2020-09-29T11%3A39%3A22Z&se=2050-09-30T11%3A39%3A00Z&sr=b&sp=r&sig=xxx"
] with(format="csv");
requests
| extend CompanyId = tostring(customDimensions.CustomerId)
| join kind=leftouter
(
customers
)
on $left.CompanyId == $right.id
The URL https://xxx.blob.core.windows.net/myblob.txt?sv=2019-10-10&st=2020-09-29T11%3A39%3A22Z&se=2050-09-30T11%3A39%3A00Z&sr=b&sp=r&sig=xxx includes a SAS token and was created with Microsoft Azure Storage Explorer: select a blob, then right-click -> Get Shared Access Signature. In the popup create a SAS and then copy the URI.
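Applied to the original question, the same pattern can hold the release date of the app's current version. A minimal sketch, assuming a one-line CSV in the blob (e.g. MyApp,2021-05-01) and an Application Insights-style requests table; the file name, column names and app name are just placeholders:
let releases = externaldata(appName:string, releaseDate:datetime) [
h@"https://xxx.blob.core.windows.net/mycontainer/releases.csv?<SAS token>"
] with(format="csv");
let currentRelease = toscalar(releases | where appName == "MyApp" | project releaseDate);
requests
| where timestamp >= currentRelease
| summarize count() by bin(timestamp, 1d)
Because every query builds the let statement from the same blob, updating the CSV updates all queries that reference it.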
I know Log Analytics uses Azure Data Explorer in the back end, and Azure Data Explorer has a feature to use external tables within queries, but I am not sure whether Log Analytics supports external tables.
External Tables in Azure Data Explorer
https://learn.microsoft.com/en-us/azure/data-explorer/kusto/query/schema-entities/externaltables
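In Azure Data Explorer itself, an external table defined over external storage is read through the external_table() function; a minimal sketch, with the table and column names being only examples:
external_table("ReleaseDates")
| where appName == "MyApp"
| project releaseDate
Whether the same works from a Log Analytics workspace would need to be verified, as noted above.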

How to get all file names stored in Blob storage and save them into a table or CSV

I am trying to list all file names/URLs in my blob container and save them to a CSV file or a table in an Azure SQL database.
I was struggling in ADF with the Get Metadata activity.
But I can't get the child items into a table or CSV. Is there any advice?
I suggest you use a Logic App to achieve this, because it is very simple; the specific design is shown in the figure:
As the URL is in this format:
https://myaccount.blob.core.windows.net/mycontainer/myblob
you need to define a variable as a prefix.
For the usage of the List blobs action, you can refer to this link. For how to connect to your Azure database, you can refer to this official document.
=========== Update ===========
Regarding your question on how to create a CSV file, the answer is updated as follows:
I designed my Logic App like this:
For the steps on how to create a CSV table, you can learn from this official document.
I tested it for you and found no problems.
If you are trying to get the list in a C# program, you could use BlobContainerClient.GetBlobs().

Store and query static data in Log Analytics

While creating custom log search alerts in a Log Analytics workspace, I want to store some data and query it in the alert query. Basically, it is a mapping like ABC -> DEF, GHI -> JKL. These mappings can be changed manually.
I am looking for a solution like creating a table or function in the workspace, or reading data from a blob in the query. I do not want to create the table or function in the alert query, just read from it. If there are other solutions, please suggest them too.
Have you tried inserting custom data into Log Analytics via the REST API? This will solve your problem; it's what we do using Runbooks. Works great.
Log Analytics Data Collector API
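Once inserted that way, the records show up as a custom table in the workspace (the Data Collector API appends a _CL suffix to the table name and type suffixes such as _s to string fields), so the alert query can read the mapping directly; the names below are only placeholders:
MyMapping_CL
| project Key = Key_s, Value = Value_s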
I realize this is an old thread, but for anyone else looking to do this, see:
Implementing Lookups in Azure Sentinel
Azure Sentinel provides four methods to reference, import, and use lookup information. The methods are:
The built-in Watchlists feature, which enables uploading CSV files as lookup tables.
The externaldata KQL function, which enables referencing an Azure Storage file as a lookup table.
Custom tables, imported using a custom connector.
A KQL function utilizing the datatable operator, which can be updated interactively or using PowerShell (a minimal sketch follows below).
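For the ABC -> DEF style mapping in the question, the datatable option could look roughly like this, with the let statement saved as a function in the workspace so that the alert query only references it. The Heartbeat table and the column names are just placeholders:
let Mapping = datatable(Key:string, Value:string) [
"ABC", "DEF",
"GHI", "JKL"
];
Heartbeat
| extend Key = Computer
| join kind=leftouter (Mapping) on Key
| project TimeGenerated, Computer, Mapped = Value
Editing the saved function (or the watchlist or CSV behind one of the other options) then changes the mapping without touching the alert query itself.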

How to get sorted results in PowerShell using az storage entity query?

When you are using Azure Storage Explorer, you can click on the name of each column to sort the results by that field.
Is there any way to sort query results in PowerShell using az storage entity query?
In other words, I can get the results from the Azure CLI as an object and sort them using Sort-Object, but I want to sort the entries on the Azure Storage server and get sorted results back. It's not useful to get all of the data from the server and sort it manually.
Please see this page: https://learn.microsoft.com/en-us/rest/api/storageservices/Query-Operators-Supported-for-the-Table-Service?redirectedfrom=MSDN.
There's a complete list of the query operators you can use with Azure Table storage; OrderBy is sadly not among the supported ones.
This means you will need to retrieve the data first and then do the sorting.
"but I want to sort entries on the Azure Storage Server and get sorted-results. It's not useful to get all of the data from the server and sort it manually."
It is not possible, as Azure Table storage does not support server-side sorting. You will need to fetch the desired data and perform the sorting on the client.

Azure Data Sync - Copy Each SQL Row to Blob

I'm trying to understand the best way to migrate a large set of data - roughly 6 million text rows - from an Azure-hosted SQL Server to Blob storage.
For the most part, these records are archived and rarely accessed, so Blob storage made sense as a place to hold them.
I have had a look at Azure Data Factory and it seems to be the right option, but I am unsure whether it fulfills the requirements.
Put simply, the scenario is: for each row in the table, I want to create a blob with the contents of one column from that row.
I see the tutorial (https://learn.microsoft.com/en-us/azure/data-factory/data-factory-copy-activity-tutorial-using-azure-portal) is good at explaining a bulk-to-bulk data pipeline, but I would like to migrate a bulk dataset into many separate blobs.
Hope that makes sense and someone can help?
As of now, Azure Data Factory does not have anything built in like a For Each loop in SSIS. You could use a custom .NET activity to do this, but it would require a lot of custom code.
I would ask, if you were transferring this to another database, would you create 6 million tables all with the same structure? What is to be gained by having the separate items?
Another alternative might be converting it to JSON which would be easy using Data Factory. Here is an example I did recently moving data into DocumentDB.
Copy From OnPrem SQL server to DocumentDB using custom activity in ADF Pipeline
SSIS 2016 with the Azure Feature Pack provides Azure tasks such as the Azure Blob Upload Task and the Azure Blob Destination. You might be better off using this; maybe an OLE DB command or a For Each loop with an Azure Blob Destination could be another option.
Good luck!
Azure Data Factory has a ForEach activity which can be placed after a Lookup or Get Metadata activity to copy each row from SQL to a blob.
ForEach
