Azure Stream Analytics Job has no input or output

I am using Blob Storage as an input (a JSON file). I have tested the query and it seems fine. I have specified the output as an Azure SQL Server table; I can connect to this database from SSMS and query the table.
The job status is Running, but I don't see any data loaded into the SQL table. I have checked Azure Management Services: the status there is Running and there are no errors. How do I diagnose what is going on?
Note: I have left the Blob storage path prefix empty, because I would like the job to pick up any file that arrives in the storage container rather than only specific files.

Have you created a query? You first need to create a query and then start your Stream Analytics job.
Query Example:
SELECT
*
INTO
Output
FROM
Input

You can also create an output to Power BI and run the Stream Analytics job. This will show you whether the data schema and the query are right. If everything goes well, you should be able to see the JSON files as a dataset with the named values listed. You can create a mini dashboard for just the count of items received, so you can see in real time whether it is loading and processing the JSON files from the blob.
If it fails, the operation logs for the Power BI output will tell you that the data schema isn't supported.
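As a minimal sketch, a single job can route the same input to both the SQL output and a second diagnostic output, so you can compare what reaches each sink. This assumes the aliases Input and Output from above and a hypothetical second output alias DiagnosticOutput (Power BI or blob):
SELECT *
INTO Output
FROM Input

SELECT *
INTO DiagnosticOutput
FROM Input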
Hope this helps!
Mert

Related

Azure Logs Kusto Query Output to database table

I need to save the output of a Kusto query on monitoring logs into a database table, but I am unable to find a way to do it. I presume there will be a way to get the output from a Kusto query, save it to storage, and then pull that data into a table using a pipeline.
Any suggestions welcome.
I have reproduced this in my environment and got the expected results, as below:
First, I executed the Kusto query below and exported the results to a CSV file on my local machine:
AzureActivity
| project OperationName,Level,ActivityStatus
Then I uploaded the CSV file from my local machine into my Blob storage account.
Next, I created an Azure Data Factory (ADF) instance, created a new pipeline in it, and added a Copy activity to that pipeline.
Then I created a linked service for Blob storage as the source and a linked service for the SQL database as the sink.
In the source dataset I selected the blob file.
In the sink dataset I selected the SQL Server table.
In the Copy activity's sink settings, I set the table option to Auto create table.
The copied data is then visible in the SQL query editor.
What we ended up doing is creating a Logic App that runs the query in real time and returns the data via HTTP; we then save that to the table. No manual intervention.

How to access an Azure database containing data from an Azure Log Analytics query

I have a working query for my app data to be analyzed.
Currently it analyzes the last two weeks of data with ago(14d).
Now I want to use a value containing the release date of the app's current version. Since I haven't found a way to add a new table to the existing database that contains the log data in Azure Log Analytics, I created a new database in Azure and entered my data there.
Now I just don't know whether I can access that database at all from within the web query interface of Azure Log Analytics, or whether I have to use some other tool for that.
I hope that somebody can help me with this.
As always with Azure there is a lot of material to read, but nothing concrete for my issue (or at least I haven't found it yet).
And yes, I know how to insert the data into the query with a let statement, but since I want to use the same data in different queries, an external location that can be accessed from all the queries is the solution I prefer.
Thanks in advance.
Maverick
You cannot access a database directly. You are better off using a CSV/JSON file in blob storage. In the following example I uploaded a txt file with CSV data like this:
2a6c024f-9093-434c-b3b1-000821a15b1a,"Customer 1"
28a658a8-5466-45ea-862c-003b20507dd4,"Customer 2"
c46fb949-d807-4eea-8de4-005dd4beb39a,"Customer 3"
e05b67ee-ff83-4805-b004-0064449f196c,"Customer 4"
Then I can reference this data from Log Analytics / Application Insights in a query like this, using the externaldata operator:
let customers = externaldata(id:string, companyName:string) [
    h@"https://xxx.blob.core.windows.net/myblob.txt?sv=2019-10-10&st=2020-09-29T11%3A39%3A22Z&se=2050-09-30T11%3A39%3A00Z&sr=b&sp=r&sig=xxx"
] with(format="csv");
requests
| extend CompanyId = tostring(customDimensions.CustomerId)
| join kind=leftouter
(
customers
)
on $left.CompanyId == $right.id
The URL https://xxx.blob.core.windows.net/myblob.txt?sv=2019-10-10&st=2020-09-29T11%3A39%3A22Z&se=2050-09-30T11%3A39%3A00Z&sr=b&sp=r&sig=xxx includes a SAS token and is created using Microsoft Azure Storage Explorer: select the blob, right-click -> Get Shared Access Signature, create the SAS in the popup, and then copy the URI.
I know Log Analytics uses Azure Data Explorer in the back end, and Azure Data Explorer has a feature to use external tables within queries, but I am not sure whether Log Analytics supports external tables.
External Tables in Azure Data Explorer
https://learn.microsoft.com/en-us/azure/data-explorer/kusto/query/schema-entities/externaltables
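For reference, a minimal sketch of defining and querying an external table in Azure Data Explorer (the table name, columns, container URL, and key below are placeholders, and this runs on an ADX cluster rather than in a Log Analytics workspace):
// Define an external table over CSV blobs in a storage container
.create external table ExternalCustomers (id:string, companyName:string)
kind=storage
dataformat=csv
(
    h@'https://xxx.blob.core.windows.net/customers;<account-key-or-SAS>'
)
// Query it with the external_table() function
external_table("ExternalCustomers")
| take 10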

Stream Analytics doesn't output the data to SQL but does to a blob storage

In my project I receive data from Azure IoT Hub and want to send it to a SQL database using Azure Stream Analytics. I'm trying to achieve this using the following query:
SELECT
IoTDataArrayElement.ArrayValue.sProjectID AS id
INTO
[test-machine]
FROM
[iothub-input] AS e
CROSS APPLY GetArrayElements(e.iotdata) AS IoTDataArrayElement
HAVING IoTDataArrayElement.ArrayValue IS NOT NULL
When I run the query in the environment provided by Stream Analytics and press Test query, I get the expected output, which is a project ID. But when I start the Stream Analytics job, the data doesn't go into my database table. The table has one column, 'id'.
When I try to send all the data to blob storage instead, the Stream Analytics job works.
Can someone please explain to me why the query I use for sending the data to a database doesn't actually send the data to the database?
A couple of things you need to verify to successfully configure an Azure SQL DB as output:
Make sure the SQL server's firewall is set to allow access from Azure services.
Make sure you have configured the SQL Database output with the correct properties (output alias, server name, database, table, username, and password).
Make sure the table schema exactly matches the fields and their types in your job's output, for example:
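A minimal sketch of a matching table for the query above, assuming sProjectID arrives as a string (the table name here is hypothetical):
-- one column, matching the single 'id' field the query projects
CREATE TABLE dbo.TestMachine
(
    id NVARCHAR(MAX)
);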
Hope this helps.

Searching Storage Account with Azure Log Analytics

Using Log Analytics, is it possible to search through data stored in a container inside an Azure storage account? We have an Azure Function that reaches out to an API in O365 for log data and then pushes that data into a storage account. We would like to be able to query this data.
You can push the content of your container into a Log Analytics workspace using the Log Analytics HTTP Data Collector API.
You need to build your own integration that sends the container content to Log Analytics by leveraging the HTTP Data Collector API.
You may refer to the suggestions mentioned in this article:
https://learn.microsoft.com/en-us/azure/azure-monitor/platform/data-collector-api
Additional options for hosting such an integration:
- Azure Functions
- Azure Automation
- Logic App
With any of these, you will have a schedule that runs at a certain interval. When it runs, you execute a query against Log Analytics to get the data, and you then transfer the query results to Azure Storage, for example as a blob. You might have to do some transformation on the data, depending on your scenario. The most important thing to make sure of is that you do not miss data and do not upload the same data twice to storage. The Log Analytics query language allows you to specify a time frame for the results, which helps here (see the sketch below). I hope this will help you.
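A minimal sketch of such a time-bounded query; the custom log table and column names are placeholders for whatever data you push into the workspace:
// pull only the previous full hour so consecutive runs neither overlap nor leave gaps
let windowEnd = bin(now(), 1h);
let windowStart = windowEnd - 1h;
MyCustomLog_CL
| where TimeGenerated >= windowStart and TimeGenerated < windowEnd
| project TimeGenerated, OperationName_s, UserId_s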
Kindly let us know if the above helps or you need further assistance on this issue.

Logic app Azure retrieve data from Blob and save to Kusto

I have created a Logic App to execute the stored procedure and save the data to a CSV file. Now I need to read the CSV file and save (ingest) the data into Kusto. I have added the "When a blob is added or modified" trigger to read the data from blob, but when it executes I am getting empty data in the body. Can anyone help me with what the mistake could be?
The issue was with the Kusto CSV mapping; after I ran the query in Kusto I was able to fix it.
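For reference, a minimal sketch of what a CSV ingestion mapping looks like in Kusto (the table name, mapping name, and columns below are placeholders):
// map CSV columns to table columns by ordinal position
.create table MyTable ingestion csv mapping "MyCsvMapping"
'[{"column":"Id","Properties":{"Ordinal":"0"}},{"column":"Name","Properties":{"Ordinal":"1"}},{"column":"Value","Properties":{"Ordinal":"2"}}]'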
