Cast string to datetime in a Stream Analytics query - Azure

We are sending the datetime as a string in the format 2018-03-20 10:50:037.996, and we have written the Stream Analytics query below.
SELECT
powerscout.Device_Id AS PowerScout,
powerscout.[kW System],
CAST(powerscout.[TimeStamp] AS datetime) AS [TimeStamp]
INTO
[PowergridView]
FROM
[IoTHubIn]
When we send data through Stream Analytics, the job fails.
Any suggestions, please?
Thanks in advance.

ASA can only parse DATETIME fields represented in one of the formats described in ISO 8601; this format is not supported. You can try using a custom JavaScript function to parse it.
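A minimal sketch of that approach, assuming the incoming value is effectively yyyy-MM-dd HH:mm:ss.fff and using a hypothetical UDF alias parseTimestamp (both assumptions, not from the original post):
// Hypothetical ASA JavaScript UDF, registered e.g. under the alias parseTimestamp.
// It rewrites the custom string into ISO 8601 and returns a Date, which ASA maps to datetime.
function main(ts) {
    if (!ts) {
        return null;
    }
    // Replace the space separator with 'T'; appending 'Z' treats the value as UTC
    // (drop it if the source timestamps are local time).
    return new Date(ts.replace(' ', 'T') + 'Z');
}
In the query you would then select udf.parseTimestamp(powerscout.[TimeStamp]) AS [TimeStamp] in place of the CAST.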

Related

PostgreSQL - how to convert timestamp with timezone to timestamp without timezone

In my PostgreSQL database, the datetime is stored as 2022-05-10 10:44:19+08, and when I fetch it using Sequelize it comes back in the format 2022-05-10T02:44:19.000Z.
So my question is: how do I convert it to 2022-05-10 10:44:19?
Thanks in advance.
The result depends directly on the time zone of your server, so depending on what you want to get, you can use different options.
Here is a dbfiddle with examples.
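If you would rather do the conversion on the Node.js side, one option is to format the Date that Sequelize returns in the zone the value was stored with. This is only a sketch and assumes the +08 offset corresponds to, for example, Asia/Singapore:
// The UTC instant returned by Sequelize for the stored value.
const stored = new Date('2022-05-10T02:44:19.000Z');

// The 'sv-SE' locale formats dates as YYYY-MM-DD HH:mm:ss; the timeZone
// option shifts the instant into the assumed +08 zone.
const formatted = stored.toLocaleString('sv-SE', { timeZone: 'Asia/Singapore' });
console.log(formatted); // 2022-05-10 10:44:19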

SharePoint wcf-services JSON DateTime format parsing in C#

I am trying to read some data from the SharePoint API via the older _vti_bin/client.svc endpoint.
I can't seem to find what type of date format this is or how I can parse it in C#.
The timestamp being returned is:
"LastContentModifiedDate": "/Date(2022,3,18,13,12,28,990)/"
The year and month are obvious, so I could parse it myself if I knew what all the values were. Is there a formal definition for this, or a way to parse it reliably? Is this a DateTime, a DateTimeOffset, or something else?
I just get an exception when trying to deserialize to a DateTime or DateTimeOffset.
The /Date(...)/ format is Microsoft's built-in JSON date format.
You can try to parse it using the code below. You can also check out this post, which provides a lot of other methods.
using System;
using System.Web.Script.Serialization; // requires a reference to System.Web.Extensions

// Deserialize an MS-format JSON date (milliseconds since the Unix epoch, UTC)
// into a DateTime, then convert it to local time.
JavaScriptSerializer json_serializer = new JavaScriptSerializer();
DateTime ddate = json_serializer.Deserialize<DateTime>(@"""\/Date(1326038400000)\/""").ToLocalTime();

Azure stream analytics split at comma

I have an input to my Stream Analytics job as a CSV string like the following:
jon,41,111 treadmill lane,07831231123,aa,bb,123...etc.
I'd like to sort this data into columns of an SQL table with column headings:
name,age,address,phone,result1,result2,result3...etc.
I've tried using SQL split functions, but none of the ones I've tried seem to be compatible with the Azure Stream Analytics query language. Could anyone provide any assistance as to how I can split my string into the appropriate columns? Many thanks.
If your events are coming in with a CSV format, you don't have to do anything in your query to work with it. The trick is to set the correct serialisation for your input: when you create your IoT Hub input, set the serialisation to CSV.
This will work if your CSV message has the headers included in the message:
name,age,address,phone,result1,result2,result3
jon,41,111 treadmill lane,07831231123,aa,bb,123
The fields will then show up in the input preview. When the headers are present, you can use them in your queries.
SELECT
name,
age
INTO
target
FROM
[csv-input]

I have weather data in GRIB2 format and want to convert it into JSON format. Is it possible to do the conversion using Node.js?

I have weather data in GRIB2 format and I want to convert it into JSON or CSV. I know PHP and Node.js. Is it possible to do the conversion using these technologies?
Using Node.js, this package may help you:
https://www.npmjs.com/package/grib2json
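As a rough sketch of how that could be wired up from Node.js, the snippet below shells out to the grib2json command-line tool; the flag names (-d for data, -n for names, -o for the output file) and the file paths are assumptions to check against the package's README:
// Hypothetical sketch: run the grib2json CLI from Node.js and read the result.
const { execFile } = require('child_process');
const fs = require('fs');

execFile('grib2json', ['-d', '-n', '-o', 'weather.json', 'weather.grib2'], (err) => {
  if (err) {
    console.error('conversion failed:', err);
    return;
  }
  // weather.json now holds the GRIB2 records as JSON.
  const records = JSON.parse(fs.readFileSync('weather.json', 'utf8'));
  console.log(`converted ${records.length} record(s)`);
});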

Azure Data Lake Analytics - Output dates as +0000 rather than -0800

I have a datetime column in an Azure Data Lake Analytics table.
All my incoming data is UTC +0000, but when using the code below, all the CSV outputs convert the dates to -0800:
OUTPUT @data
TO @"/data.csv"
USING Outputters.Text(quoting : false, delimiter : '|');
An example datetime in the output:
2018-01-15T12:20:13.0000000-08:00
Are there any options for controlling the output format of the dates? I don't really understand why everything is suddenly in -0800 when the incoming data isn't.
Currently, ADLA does not store TimeZone information in DateTime, meaning it will always default to the local time of the cluster machine when reading (-8:00 in your case). Therefore, you can either normalize your DateTime to this local time by running
DateTime.SpecifyKind(myDate, DateTimeKind.Local)
or use
DateTime.ConvertToUtc()
to output in UTC form (but note that the next time you ingest that same data, ADLA will still default to reading it with offset -0800). Examples below:
// Read the input file, then emit both a "local" and a UTC version of the date column.
@getDates =
    EXTRACT
        id int,
        date DateTime
    FROM "/test/DateTestUtc.csv"
    USING Extractors.Csv();

@formatDates =
    SELECT
        id,
        DateTime.SpecifyKind(date, DateTimeKind.Local) AS localDate,
        date.ConvertToUtc() AS utcDate
    FROM @getDates;

OUTPUT @formatDates
TO "/test/dateTestUtcKind_AllUTC.csv"
USING Outputters.Csv();
You can file a feature request for DateTime with offset on our ADL feedback site. Let me know if you have other questions!
