Stream Analytics UDF works in Test but not in Job - azure

I need to parse JSON data in Stream Analytics. Below is the sample I am using:
SELECT
UDF.parseData(GetRecordPropertyValue(GetArrayElement(A.message,0), 'raw')).intent as 'rawData'
FROM
AppInsightMessages A
I am able to parse the intent from the field when testing the query; this is required for custom logging.
However, it is not working in the Stream Analytics job. I am getting an error like:
Stream Analytics job has validation errors: Query compilation error: Expression is not supported: 'udf . parseData
I tried CASTing to string and to record as well, with no luck.
What am I doing wrong?
Thanks in advance.

Usually, this is due to trying to merge multiple stages into a single expression.
Please try splitting the processing into several steps, giving the UDF result an alias so its fields can be referenced in the next step:
WITH UDFStep AS (
SELECT
UDF.parseData(GetRecordPropertyValue(GetArrayElement(A.message,0), 'raw')) AS parsed
FROM
AppInsightMessages A
)
SELECT parsed.intent AS rawData
FROM UDFStep
BTW, you don't need to quote rawData in the alias.
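For reference, Stream Analytics UDFs are written in JavaScript. A minimal sketch of what parseData might look like, assuming the 'raw' property holds a JSON string with an 'intent' field:
// Hypothetical UDF body; the alias parseData is assigned in the job configuration,
// and the entry point is a function named main.
function main(raw) {
    try {
        // The parsed object comes back as a record whose fields
        // (e.g. intent) are addressable in the query.
        return JSON.parse(raw);
    } catch (e) {
        return null; // malformed input surfaces as NULL downstream
    }
}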

Related

Passing dynamic content inside a SQL Query in lookup stage of Azure Data Factory

I'm using a lookup stage as a source to fetch some data, and I want to pass that output as the input to the next lookup stage. I tried adding @activity('Step1').output.firstRow.Col, and it failed with 'scalar variable was not declared'. I also tried @{activity('Step1').output.firstRow.col}, which failed as well; the log shows that only default expressions are supported. Please help if this is possible.
I have accomplished this using a dataflow, but considering the performance I would like to know if it can be done in a pipeline.
Please try this:
Query: select * from dbo.test5 where col = '@{activity('Lookup1').output.firstRow.col}'
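The reason this works: ADF resolves the @{...} string interpolation before the query is sent to the database, and the surrounding single quotes turn the resolved value into a SQL literal. If Lookup1's first row returned col = 'abc' (a hypothetical value), the database would receive:
select * from dbo.test5 where col = 'abc'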

Understanding Kusto

I am trying to understand Kusto (the Log Analytics query language in Azure).
According to the documentation, to retrieve name and resultCode from the dependencies table, I need to enter the following:
dependencies
| project name, resultCode
The machines I have subscribed to do not have this table, so I am using the Heartbeat table instead, trying to retrieve Category, Computer and IsGatewayInstalled like so:
Heartbeat
| Category, Computer , IsGatewayInstalled
I however get the following error:
Query could not be parsed at 'Category' on line [2,2]
Token: Category Line: 2 Position: 2
This seems trivial; I would appreciate any pointers on this.
The error you're getting is due to the fact that there's no valid operator after the pipe (|). You should use the project operator before specifying the column names you want to retrieve.
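Applied to the query above:
Heartbeat
| project Category, Computer, IsGatewayInstalled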

Azure Data Factory V2 Copy Activity file filters

I am using Data Factory v2 and I currently have a simple copy activity which copies files from an FTP server to blob storage. The file names on this server are of the following form:
File_{Year}{Month}{Day}.zip
In order to download the most recent file, I add this filter to my input dataset JSON:
"fileName": {
    "value": "@concat('File_', formatDateTime(utcnow(), 'yyyyMMdd'), '.zip')",
    "type": "Expression"
}
I now want to be able to download yesterday's file as well, which is possible using adddays().
However, I would like to do this in the same copy activity, and it seems that Data Factory v2 does not allow the following kind of alternation logic:
@concat('File_', formatDateTime(utcnow(), 'yyyyMMdd'), '.zip') || @concat('File_', formatDateTime(adddays(utcnow(), -1), 'yyyyMMdd'), '.zip')
Is this possible, or do I need a separate activity?
It would seem strange to need a second activity, since a Copy Activity can only take a single input; but if the filename pattern is simple enough, multiple files are treated as a single input, and if not, multiple files are treated as multiple inputs.
The '||' won't work, since the whole thing would be evaluated as a single string.
But I can provide two solutions for this:
1. Use a tumbling window trigger with a one-day interval and set the start time to yesterday, so it will trigger two pipeline runs.
2. Use a ForEach activity + copy activity. The ForEach activity iterates over an array to pass yesterday's and today's dates to the copy activity (see the sketch below).
Btw, you could just use a string interpolation expression instead of concat; they are equivalent:
File_@{formatDateTime(utcnow(), 'yyyyMMdd')}.zip
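A minimal sketch of the ForEach option (property placement is illustrative, not verbatim pipeline JSON): the ForEach activity's items property builds an array with today's and yesterday's date stamps:
"items": {
    "value": "@createArray(formatDateTime(utcnow(), 'yyyyMMdd'), formatDateTime(adddays(utcnow(), -1), 'yyyyMMdd'))",
    "type": "Expression"
}
and the copy activity inside the loop picks up the current element through item() in its fileName:
"fileName": {
    "value": "@concat('File_', item(), '.zip')",
    "type": "Expression"
}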
I would suggest you read about the Get Metadata activity; I think it can be helpful in your scenario:
https://learn.microsoft.com/en-us/azure/data-factory/control-flow-get-metadata-activity
It exposes an itemName property and a lastModified property, check it out.
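For example, a minimal Get Metadata activity that surfaces those two properties (the activity and dataset names are hypothetical placeholders):
{
    "name": "GetFileMetadata",
    "type": "GetMetadata",
    "typeProperties": {
        "dataset": { "referenceName": "FtpFileDataset", "type": "DatasetReference" },
        "fieldList": [ "itemName", "lastModified" ]
    }
}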

Can we run complex multiline SQL queries using Blue Prism?

I am new to the SQL stuff in Blue Prism. I am able to configure the SQL object and execute simple queries, but I am facing trouble when trying to run multiline complex SQL queries.
When I try to execute the query below in Blue Prism, I get an error message saying "Incorrect Syntax near Database2":
"select top 10 * from [Database1].[dbo].[Table1]
join [Database1].[dbo].[Table2] on [Database1].[dbo].[Table2].Fieldname1=[Database1].[dbo].[Table1].Fieldname2
join [Database2].[dbo].[Table1] on [Database2].[dbo].[Table1].Fieldname1=[Database1].[dbo].[Table2].Fieldname2"
Can somebody please tell me what is wrong with the above query?
I found the answer myself: there should not be any additional whitespace characters in the query; the entire query should be on one continuous line. The beauty of Blue Prism is that it can execute complex queries of any level without constraints, but you need to adjust the syntax accordingly. Always reference fields in the format [databasename].[dbo].[tablename].[fieldname], as in the single-line version below.
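Collapsed onto a single line, the query above becomes:
select top 10 * from [Database1].[dbo].[Table1] join [Database1].[dbo].[Table2] on [Database1].[dbo].[Table2].Fieldname1=[Database1].[dbo].[Table1].Fieldname2 join [Database2].[dbo].[Table1] on [Database2].[dbo].[Table1].Fieldname1=[Database1].[dbo].[Table2].Fieldname2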

Stream Analytics - not seeing the output

I use Stream Analytics to save EventHub data into a SQL database.
Even though I can see that I have both input and output requests, when I query the output table I see just 200 empty rows! So data is being sent to the table, but only NULL values.
I think the problem may be the query between input and output, because my output table is empty. This is how I wrote it:
SELECT id,sensor,val FROM EventHubInput
Could there be another problem?
I should mention that my EventHub is the link between a Meshlium and Azure, so the problem could also come from the frames I send from the Meshlium.
I really don't know what to do. HELP?!
You haven't specified any output:
SELECT id,sensor,val
INTO YourSQLOutput
FROM EventHubInput
A Stream Analytics query with no INTO clause writes to the default output named 'output'. So if your SQL DB alias is SQLDbOutput, it won't receive anything; you should specify it yourself:
SELECT id,sensor,val
INTO SQLDbOutput
FROM EventHubInput
The editor in Azure should tell you the names of your inputs and outputs on the left.
Also make sure your events in Event Hub contain those properties (id, sensor, val), and that the SQL DB contains columns with the same names.
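For instance, a matching event payload and SQL table might look like this (names and types are hypothetical):
{ "id": "device-01", "sensor": "temperature", "val": 22.5 }
CREATE TABLE dbo.SensorReadings (
    id NVARCHAR(50),    -- column names must match the event property names exactly
    sensor NVARCHAR(50),
    val FLOAT
);
If an event doesn't contain a selected property, or the names don't line up, the corresponding column is written as NULL, which would explain rows full of NULL values.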
