Azure Logic App: retrieve data from Blob and save to Kusto

I have created a Logic App to execute the stored procedure and save the data in a CSV file. Now I need to read the CSV file and save (ingest) the data into Kusto. I added a "When a blob is added or modified" trigger to read the data from Blob storage, but when it executes I get an empty body. Can anyone help me with what the mistake could be?

The issue was with the Kusto CSV mapping; after I ran the query in Kusto I was able to fix it.
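For reference, once the table and its CSV ingestion mapping exist, the ingestion can also be driven from code rather than the Logic App connector. A minimal sketch with the azure-kusto-ingest Python package, where the cluster, database, table, mapping name and blob SAS URL are all placeholders:

from azure.kusto.data import DataFormat, KustoConnectionStringBuilder
from azure.kusto.ingest import BlobDescriptor, IngestionProperties, QueuedIngestClient

# Placeholder ingest endpoint; authentication here reuses the Azure CLI login.
INGEST_URI = "https://ingest-<cluster>.<region>.kusto.windows.net"
kcsb = KustoConnectionStringBuilder.with_az_cli_authentication(INGEST_URI)
client = QueuedIngestClient(kcsb)

# Point the ingestion at the CSV mapping defined on the target table.
props = IngestionProperties(
    database="<database>",
    table="<table>",
    data_format=DataFormat.CSV,
    ingestion_mapping_reference="<csv-mapping-name>",
)

# SAS URL of the CSV blob written by the Logic App; the size is a rough byte estimate.
blob = BlobDescriptor("https://<storage>.blob.core.windows.net/<container>/<file>.csv?<sas-token>", 1024)
client.ingest_from_blob(blob, ingestion_properties=props)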

Related

Required blob is missing when previewing data of a copy data activity sink dataset

I am new to Azure. I created a new ADF, a pipeline, a storage blob account and a copy data activity; the source is a SQL Server table and the sink output is a Parquet file. But when I preview the data of my sink dataset, I get an error saying the required blob is missing.
I want to create a directory as well, but whether I type in the folder name and file name or use parameters, I still receive the error. If I manually upload a file via Azure Storage Explorer, the preview has no issue.
Can anyone tell me what I missed?
Thanks for the help.
cheers
Albert
I created linked services for the Azure SQL database and the Blob storage account, and created a dataset of the SQL database for the source and a dataset of Blob storage for the sink.
When I previewed the sink data by entering a file name, I got the "required blob is missing" error.
I got that error because the file does not exist in my Blob storage yet.
In Data Factory the file is created automatically when the pipeline is debugged, without entering a file name.
I just gave the file path where my Parquet file needs to be saved in the sink dataset and debugged the pipeline; it executed successfully.
My SQL table was copied to Blob storage as a Parquet file successfully.
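If you want to confirm from code whether the sink file actually exists in the container (which is what the preview is complaining about), a quick check with the azure-storage-blob package could look like the sketch below; the connection string, container and blob path are placeholders:

from azure.storage.blob import BlobClient

# Placeholder connection string, container and blob path.
blob = BlobClient.from_connection_string(
    conn_str="<storage-account-connection-string>",
    container_name="<container>",
    blob_name="output/data.parquet",
)

# Preview of the sink dataset only works once this returns True,
# i.e. once a debug run of the pipeline has actually written the file.
print("Blob exists:", blob.exists())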

Azure Logs Kusto Query Output to database table

I need to save the output from a Kusto query on monitoring logs into a database table, but I am unable to find a way to do it. I am presuming there is a way to get the output from a Kusto query, save it to storage, and then pull that data into a table using a pipeline.
Any suggestions welcome
I have reproduced this in my environment and got the expected results as below.
First, I executed the Kusto query below and exported the result to a CSV file on my local machine:
AzureActivity
| project OperationName,Level,ActivityStatus
Then I uploaded the CSV file from my local machine into my Blob storage account.
Next I created an ADF service, created a new pipeline in it, and added a Copy activity to that pipeline.
Then I created a linked service for Blob storage as the source and a linked service for the SQL database as the sink.
In the source dataset I selected the blob file.
In the sink dataset I selected the SQL Server table.
In the Copy activity sink settings I set the table option to "Auto create table".
The output in the SQL query editor confirms the rows were copied.
So what we do now is: we have created a Logic App that runs the query in real time and returns the data via HTTP, and then we save that to the table. No manual intervention.
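For a fully programmatic route along those lines (no manual CSV export), a minimal sketch with the azure-monitor-query and pyodbc packages could look like the following; the workspace ID, SQL connection string and dbo.ActivityLog table are placeholders:

from datetime import timedelta

import pyodbc
from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

# Placeholder workspace ID and SQL connection string.
WORKSPACE_ID = "<log-analytics-workspace-id>"
SQL_CONN_STR = "Driver={ODBC Driver 18 for SQL Server};Server=<server>.database.windows.net;Database=<db>;Uid=<user>;Pwd=<password>"

QUERY = """
AzureActivity
| project OperationName, Level, ActivityStatus
"""

# Run the Kusto query against the Log Analytics workspace.
logs_client = LogsQueryClient(DefaultAzureCredential())
response = logs_client.query_workspace(WORKSPACE_ID, QUERY, timespan=timedelta(days=1))

# Insert each row into the SQL table; columns follow the project order above.
with pyodbc.connect(SQL_CONN_STR) as conn:
    cursor = conn.cursor()
    for row in response.tables[0].rows:
        cursor.execute(
            "INSERT INTO dbo.ActivityLog (OperationName, Level, ActivityStatus) VALUES (?, ?, ?)",
            row[0], row[1], row[2],
        )
    conn.commit()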

Issue with Azure Blob connector in Power Automate

I have a Power Automate flow which uses the Azure Blob connector to read an Excel file from the blob using the "Get blob content" action.
The problem is that I need to process the Excel data and save it in a D365 F&O entity. For that I need the data in JSON format. I saw we can use the Cloudmersive connector to convert Excel to JSON.
Can I do it without using any third-party connector?
You can read the file and insert it into a table. After that, you can use Compose actions or arrays to assign the rows to a JSON object.
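For reference, the JSON you are aiming for is one object per worksheet row, keyed by the column headers. A minimal Python sketch of that conversion (the file name and columns are hypothetical), which could also run in a small Azure Function if you prefer to keep the flow simple:

import json

import pandas as pd  # reading .xlsx files also requires the openpyxl package

# Hypothetical file, e.g. the content retrieved with "Get blob content".
df = pd.read_excel("customers.xlsx", sheet_name=0)

# One JSON object per row, keyed by the column headers - the same structure
# the Compose/array approach in the flow needs to produce.
records = df.to_dict(orient="records")
print(json.dumps(records, indent=2, default=str))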

Azure Data Factory Lookup Source Data and Mail Notification

I am trying my best to solve the following scenario.
I am using PowerShell scripts to collect some information about my server environments and saving it as .csv files.
There is information about hardware, running services etc. in the .csv files.
I am sending these .csv files into Blob Storage and using Azure Data Factory V2 pipelines to write this information into Azure SQL. I have successfully configured a mail notification via Azure Logic Apps that informs me whether the pipeline run was successful or unsuccessful.
Now I am trying to look up a specific column in the source data. In my scenario it is a column named after a Windows service, for example Column: PrintSpooler, Row: Running.
So I need to look up that specific column and also send a mail notification saying whether the service is running or stopped.
Is there any way to do that?
Ideally I want to receive a mail only when the service in my source data is stopped.
Thank you for any ideas.
Do you update the .csv file or upload a new .csv file?
If you upload a new .csv, then you can use an Azure Function blob trigger.
This trigger picks up the newly uploaded blob and lets you process it. You can read the data in the .csv file and send an alert to your email.
This is the official document for the Azure Function timer trigger:
https://learn.microsoft.com/en-us/azure/azure-functions/functions-create-scheduled-function
In the blob trigger, you can check whether a particular value is present in the .csv file and then set an output binding.
Then you will get the alert in your email when the data in the csv file meets your requirement.
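As a rough illustration of that blob trigger, here is a minimal Python sketch; the PrintSpooler column and the "Stopped" value come from the question above, everything else is a placeholder:

import csv
import io
import logging

import azure.functions as func


def main(myblob: func.InputStream) -> None:
    # Read the newly uploaded CSV blob and parse it.
    text = myblob.read().decode("utf-8-sig")
    rows = list(csv.DictReader(io.StringIO(text)))

    # The question's example column: the service name is the header,
    # the row value is its status.
    stopped = [r for r in rows if r.get("PrintSpooler", "").strip().lower() == "stopped"]
    if stopped:
        logging.warning("PrintSpooler is stopped in %s", myblob.name)
        # Send the mail from here, e.g. via a SendGrid output binding
        # or by calling the existing Logic App's HTTP endpoint.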

Azure Stream Analytics Job has no input or output

I am using Blob Storage as an input (JSON file). I have tested the query and it seems fine. I have specified the output as an Azure SQL Server table. I can connect to this database from SSMS and query the table.
The job status is Running, but I don't see any data loaded into the SQL table. I have checked Azure Management Services; the status is Running and there are no errors. How do I diagnose what is going on?
Note: I have left the Blob storage path prefix empty because I would like it to grab any file that comes into the storage container, not just specific files.
Have you created a query? You first need to create a query and then start your Stream Analytics job.
Query Example:
SELECT
    *
INTO
    Output
FROM
    Input
You can also create an output to Power BI and run the Stream Analytics job. This will show you if the data schema and the query are right. If everything goes well, you should be able to see the JSON files as a dataset with the name values listed. You can create a mini dashboard for just the count of items received so you can see in real time whether it is loading and processing the JSONs from the blob.
If it fails, the operation logs for the Power BI output will tell you the data schema isn't supported.
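To rule out the input side, you can also drop a small test event into the input container from code and watch whether the job picks it up; a minimal sketch with the azure-storage-blob package, where the connection string, container and event fields are placeholders:

import json
from datetime import datetime, timezone

from azure.storage.blob import BlobServiceClient

# Placeholder connection string and container name.
service = BlobServiceClient.from_connection_string("<storage-account-connection-string>")
container = service.get_container_client("<input-container>")

# A tiny dummy event; the fields should match whatever schema your query expects.
event = {"deviceId": "test-device", "value": 1, "timestamp": datetime.now(timezone.utc).isoformat()}
container.upload_blob("test-event.json", json.dumps(event))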
Hope this helps!
Mert
