Azure Stream Analytics not showing in Power BI - Azure

I am using Power BI to visualise Stream Analytics output; however, after adding a new output in Azure and starting the job, it still does not appear as a dataset in Power BI.
What do I need to do to ensure it shows up?

This was caused by the query producing no output: when running the query with the Test button, 0 rows were returned.
The solution was to modify the query so that it returns data.

SELECT persn_id, persn_name, Date, COUNT(persn_id) AS Countperson
INTO personBI -- the output name for Power BI
FROM personEventHubInput -- the input name from Inputs
GROUP BY persn_id, persn_name, Date, TumblingWindow(ss, 2)

Related

Get list of "new alerts" for Azure Monitor

I have KQL giving me counts of my alerts by severity. The only issue is that when the user closes them (i.e. updates the user response), no column in the alerts table is updated.
So here is the Azure triggered view, but the alerts table has nothing.
This strikes me as a fairly normal ask.
I am assuming that you have a custom KQL query for Azure Resource Graph Explorer to identify Azure Monitor alerts.
Properties such as alertState and monitorCondition are not standalone columns, but nested properties within the dynamically typed "properties" column. Because this queries Azure Resource Graph, the records are updated in place rather than a new log entry being added (as it would be in Log Analytics).
Below is a query that extracts the two relevant properties.
alertsmanagementresources
| extend alertState = tostring(parse_json(properties.essentials.alertState))
| extend monitorCondition = tostring(parse_json(properties.essentials.monitorCondition))
| project name, alertState, monitorCondition
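Building on that, the following sketch shows how the severity counts could be restricted to alerts the user has not yet closed; it assumes the standard properties.essentials.severity and properties.essentials.alertState fields and may need adjusting to your own query:
alertsmanagementresources
| extend alertState = tostring(properties.essentials.alertState)
| extend severity = tostring(properties.essentials.severity)
// count only alerts that are still in the "New" state
| where alertState == "New"
| summarize alertCount = count() by severity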
If you need help, please share your query and what information you are looking to query.
Alistair

Azure Function logs that are older than 20 days

I'm trying to look at the logs for my Azure Function.
In the Monitor view, I can just click the link under the Date column to see the logs for a certain run.
But this only covers the last 20 invocations. How can I get the older logs?
After clicking the "Run query in Application Insights" link above, I arrived at a page with a table of log entries, but I don't know how to open the actual logs!
If you want to access the logs of a function app execution prior to those top twenty, and you are not very familiar with querying Application Insights, there is an alternative using the pre-made queries available in the Monitor tab of your function.
Go to "Run query in Application Insights" to find the specific execution you are looking for.
This will auto-generate the same query that filled the grid above. Here you just need to change the timestamp where condition to match your needs.
For example, a specific date and time range:
where timestamp between (todatetime('2022-10-01T00:00:00Z') .. todatetime('2022-10-01T23:59:59Z'))
With this change, just run the query, identify the function run whose full logs you want, and note its operation_Id and invocationId values.
Go to the Monitor view again, select any execution, and then "Run query in Application Insights".
Now you only need to replace the operation_Id and invocationId values with those you noted in step 1 and run the query again to get the full log of that specific execution.
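For reference, a rough sketch of what those two edited queries could look like. The tables and columns below follow the usual Azure Functions telemetry layout in Application Insights, so the exact query the portal generates for you may differ; the date range and the operation_Id are placeholders:
// Step 1: list executions in a custom date range and note the operation_Id of the run you care about
requests
| where timestamp between (todatetime('2022-10-01T00:00:00Z') .. todatetime('2022-10-01T23:59:59Z'))
| project timestamp, name, success, operation_Id
| order by timestamp desc
// Step 2: pull the full log entries for that single execution
union traces, exceptions
| where operation_Id == '<operation_Id from step 1>'   // placeholder value
| order by timestamp asc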

Query from Kusto to PowerBI

There are two different experiences I have seen while using "Query to PowerBI" from the Tools menu in Kusto Explorer (image of both).
I am getting the first one, but want to use the second one (the query with additional details/options). How do I get it in the second format?
You can control the behavior of the Power BI query using Tools -> Options -> Tools -> "PowerBI Export To Native Connector".
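For context, with the native connector option enabled, the exported script is a Power Query (M) snippet roughly along these lines. This is only a sketch: the exact shape and option names can vary by Kusto Explorer version, and the cluster URI, database, and query below are placeholders.
let
    Source = AzureDataExplorer.Contents(
        "https://yourcluster.kusto.windows.net",   // placeholder cluster URI
        "YourDatabase",                            // placeholder database
        "YourTable | take 100",                    // the query being exported
        [MaxRows = null, MaxSize = null, NoTruncate = null, AdditionalSetStatements = null]
    )
in
    Source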

How to make the aliases uppercase in Stream Analytics?

I have a simple JSON message that I receive from a device; this is the message:
{"A":3,"B":4}
I also set up a query in the stream job to send the data to Power BI; this is the query:
SELECT * INTO [OutputBI] FROM [Input] WHERE deviceId='device1'
When I check the dataset in Power BI, the column names were in uppercase |A|B|, but when I used aliases in the query my columns changed to lowercase |a|b|. This is the new query:
SELECT v1 as A, v2 as B INTO [OutputBI] FROM [Input] WHERE deviceId='device1'
I changed the query because the field names in the message were changed from A to v1 and B to v2.
My question is: is there any way to keep the aliases uppercase in the output of the job (Power BI in this case)?
The problem is the Power BI dataset: the first dataset recognized the column names in uppercase, and when the query was changed the column names became lowercase. This is a problem because, with the dataset changed, reports in Power BI will not work and I would have to build them again.
In the Configure section of the Stream Analytics job pane, selecting Compatibility level and changing it to 1.1 should solve the problem.
In this version, case sensitivity is persisted for field names when they are processed by the Azure Stream Analytics engine. However, persisting case sensitivity isn't yet available for ASA jobs hosted in the Edge environment.
You could create a calculated column in Power BI using the UPPER function. For example, Col2 = UPPER(Column1).
You can also do this in the query editor / M Query using Text.Upper. Alternatively, I'm pretty sure there is a way to do it in the GUI.
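If the goal is to uppercase the column names themselves rather than the cell values, a Power Query (M) step along these lines could also work. This is only a sketch, not part of the original answer, and it assumes the previous step in your query is named Source:
= Table.TransformColumnNames(Source, Text.Upper)
You would add this as a custom step in the Power Query editor; Table.TransformColumnNames applies Text.Upper to every column name produced by the previous step.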

No data is appearing in SSMS even though my job is running without errors

Problem: No data is appearing in SSMS (SQL Server Management Studio)
I don't see any errors appearing and my job diagram successfully shows a process from input to output.
I'm trying to use the continuous export feature of Azure Application Insights, Stream Analytics, and SQL Database.
Here is my query:
SELECT
    A.context.data.eventTime as eventTime,
    A.context.device.type as deviceType,
    A.context.[user].anonId as userId,
    A.context.device.roleInstance as machineName
INTO DevUserlgnsOutput -- Output Name
FROM devUserlgnsStreamInput A -- Input Name
I tested the query with sample data using the output box below the query, and it returned what I expected, so I don't think the query itself is the issue.
I also know that the custom events I'm trying to display the attributes of have occurred since I began the job. My job is also still running and has not stopped since its creation.
In addition, I would like to point out that the monitoring graph on the stream analytics page detects 0 inputs, 0 outputs, and 0 runtime errors.
Thank you in advance for the help!
Below are some pictures that might help:
Stream Analytics Output Details
The Empty SSMS after I clicked "display top 1000 rows," which should be filled with data
No input events, output events, or runtime errors for the stream analytics job
I've had this issue twice, with two separate Application Insights resources, containers, jobs, etc. Both times I solved it by editing the path pattern of my job's input(s).
To navigate to the necessary blade to make the following changes:
1) Click on your stream analytics job
2) Click "inputs" under the "job topology" section of the blade
3) Click your input (if multiple inputs, do this to 1 at a time)
4) Use the blade that pops up on the right side of the screen
The 4 potential solutions I've come across are (A-D below):
A. Making sure the path pattern you enter is plain text with no hidden characters (sometimes copying it from the container on Azure made it not plain text).
Steps:
1) Cut the path pattern you already have in the input blade
2) Paste it into Notepad and re-copy it
3) Re-paste it into the path pattern slot of your input
B. Append your path pattern with /{date}/{time}
Simply type this at the end of your path pattern in the blade's textbox (a combined example appears after item D).
C. Remove the container name and the "/" that immediately follows it from the beginning of your path pattern (see picture below)
Edit path pattern
Should be self-explanatory after seeing the pic.
D. Changing the date format of your input to YYYY-MM-DD in the drop-down box.
Should also be self-explanatory (look at the above picture if not).
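Putting B, C, and D together, the final path pattern might look something like this (the folder names here are hypothetical placeholders; use whatever your continuous export actually writes, minus the leading container name):
myappinsightsexport_ab12cd34/Event/{date}/{time}
with the Date format drop-down set to YYYY-MM-DD.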
Hope this helps!!
