What is the correct JSON format to report "DateTime" property to IoT Central? - azure-iot-central

Question 1:
As titled. I was able to report a Location property using the Device Property library in the Property tab in Azure IoT Central, but I cannot find the correct JSON format to report a DateTime property.
Here is a copy of the device twin I got back from IoT Central when I started my application:
"reported": {
  "location": {
    "lon": 120.567238,
    "lat": 36.044977
  },
  "timestamp": {
    "value": "2018-09-25T09:21:56Z"
  }
}
Question 2:
Since I mentioned the Location property: is it also possible to report a Location without it being part of the Device Property library (as shown in the sample device templates)? Since a Location library already exists, it seems redundant to create a Device Property to host the Location property. What is the difference between those two actions? FYI, I cannot find the correct JSON format to report the Location property outside of Device Property either.

Question 1: The correct format to report a DateTime is yyyy-MM-ddTHH:mm:ssZ (ISO 8601, in UTC); fractional seconds are also accepted. For example, 2018-11-26T08:08:08.808Z.
Question 2: There is a known issue for updating a Location property using the Flow connector. We are working on resolving this. Thanks for reporting.
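As a sketch of that answer: in JavaScript, Date.prototype.toISOString() always emits a UTC ISO 8601 string in exactly that shape. The twin property layout below ("timestamp": {"value": ...}) follows the twin snippet in the question; it is an illustration, not an official SDK call.

```javascript
// Build a reported-property patch with an ISO 8601 UTC DateTime value.
// toISOString() always returns yyyy-MM-ddTHH:mm:ss.sssZ in UTC.
const reported = {
  timestamp: {
    value: new Date(Date.UTC(2018, 10, 26, 8, 8, 8, 808)).toISOString()
  }
};

console.log(JSON.stringify(reported));
// {"timestamp":{"value":"2018-11-26T08:08:08.808Z"}}
```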

Related

ADF: can't build simple Json file transformation (one field flattening)

I need help transforming a simple JSON file inside Azure Data Flow. I need to flatten just one field, date_sk, in this example:
{
  "date_sk": {"string": "2021-09-03"},
  "is_influencer": 0,
  "is_premium": -1,
  "doc_id": "234"
}
Desired transformation:
"date_sk": {"string":"2021-09-03"}
to become
"dateToGroupBy" : "2021-09-03"
I created the source stream; note the strange projection Azure picks. There is no "string" field anymore, but this is how the automatic Azure transformation works for some reason:
Data preview of the same source stream node:
And here is how it suggests transforming it in a separate "Derived Column" modifier. I played with the right-hand part, but date_sk.{} is the only format I was able to pick that does not display an error:
But then the output dateToGroupBy field turns out to be empty:
Any ideas on what could have gone wrong and how I can build the expected transformation? Thank you.
Alright, it turned out to be a Microsoft bug in ADF.
ADF stumbles on a JSON field named "string" and can't handle it, even though schema and data validation pass and show no errors.
When I replace "date_sk": {"string":"2021-09-03"} with "date_sk": {"s1":"2021-09-03"} (or anything other than string), everything starts working just fine,
and dateToGroupBy is filled with date values taken from date_sk.s1.
When I put string back, the output values show NULL.
It is supposed to either show an error at the verification stage or handle this field name properly.
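One hypothetical workaround, until the bug is fixed, is to rename the offending "string" key before the file reaches ADF. The field names below follow the sample document in the question; the helper name is made up for illustration.

```javascript
// Rename the problematic "string" key inside date_sk to "s1" so the
// derived-column expression (date_sk.s1) can read it in ADF.
function renameStringKey(doc) {
  if (doc.date_sk && 'string' in doc.date_sk) {
    doc.date_sk = { s1: doc.date_sk.string };
  }
  return doc;
}

const fixed = renameStringKey({
  date_sk: { string: '2021-09-03' },
  is_influencer: 0,
  is_premium: -1,
  doc_id: '234'
});
console.log(JSON.stringify(fixed.date_sk)); // {"s1":"2021-09-03"}
```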

Cannot identify the correct COSMOS DB SQL SELECT syntax to check if coordinates (Point) are within a Polygon

I'm developing an app that uses Cosmos DB SQL. The intention is to identify whether a potential construction site is within various restricted zones, such as national parks and sites of special scientific interest. This information is very useful in obtaining all the appropriate planning permissions.
I have created a container named 'geodata' containing 15 documents that I imported using data from a GeoJSON file provided by the UK National Parks. I have confirmed that all the polygons are valid using an ST_ISVALIDDETAILED SQL statement. I have also checked that the coordinates are anti-clockwise. A few documents contain MultiPolygons. The Geospatial Configuration of the container is 'Geography'.
I am using the Azure Cosmos Data Explorer to identify the correct format of a SELECT statement to identify if given coordinates (Point) are within any of the polygons within the 15 documents.
SELECT c.properties.npark18nm
FROM c
WHERE ST_WITHIN({"type": "Point", "coordinates":[-3.139638969259495,54.595188276959284]}, c.geometry)
The embedded coordinates are within a National Park, in this case, the Lake District in the UK (it also happens to be my favourite coffee haunt).
'c.geometry' is the JSON field within the documents.
"type": "Feature",
"properties": {
"objectid": 3,
"npark18cd": "E26000004",
"npark18nm": "Northumberland National Park",
"npark18nmw": " ",
"bng_e": 385044,
"bng_n": 600169,
"long": -2.2370801,
"lat": 55.29539871,
"st_areashape": 1050982397.6985701,
"st_lengthshape": 339810.592994494
},
"geometry": {
"type": "Polygon",
"coordinates": [
[
[
-2.182235310191206,
55.586659699934806
],
[
-2.183754259805564,
55.58706479201416
], ......
Link to the full document: https://www.dropbox.com/s/yul6ft2rweod75s/lakedistrictnationlpark.json?dl=0
I have not been able to format the SELECT query to return the name of the park successfully.
Can you help me?
Is what I want to achieve possible?
Any guidance would be appreciated.
I appreciate any help you can provide.
You haven't mentioned what error you are getting. And you have misspelt "c.geometry".
This should work
SELECT c.properties.npark18nm
FROM c
WHERE ST_WITHIN({"type": "Point", "coordinates": [-3.139638969259495,54.595188276959284]}, c.geometry)
When running the query with your sample document, I was able to get the correct response (see image).
So this particular document is fine, and the query in your question works too. Can you recheck your query in the explorer? Also, are you referring to the incorrect database/collection by any chance?
Maybe a full screen shot of cosmos data explorer showing the dbs/collections, your query and response will also help.
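As a local sanity check, independent of Cosmos DB, you can test a point against a single GeoJSON polygon ring with the standard ray-casting algorithm. Note this is a planar approximation, whereas the container's 'Geography' configuration uses spherical semantics, so it is only a rough cross-check, not a reimplementation of ST_WITHIN.

```javascript
// Ray casting: count how many polygon edges a ray from the point crosses;
// an odd count means the point is inside the ring.
// ring is a GeoJSON ring: an array of [lon, lat] pairs, first == last.
function pointInRing(lon, lat, ring) {
  let inside = false;
  for (let i = 0, j = ring.length - 1; i < ring.length; j = i++) {
    const [xi, yi] = ring[i];
    const [xj, yj] = ring[j];
    if ((yi > lat) !== (yj > lat) &&
        lon < ((xj - xi) * (lat - yi)) / (yj - yi) + xi) {
      inside = !inside;
    }
  }
  return inside;
}

// Unit-square example:
const square = [[0, 0], [1, 0], [1, 1], [0, 1], [0, 0]];
console.log(pointInRing(0.5, 0.5, square)); // true
console.log(pointInRing(2, 2, square));     // false
```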
I have fixed this problem. Not by altering the SQL statement but by deleting the container the data was held in, recreating it and reloading the data.
The SQL statement now produces the expected results.

Azure Resource Manager - Convert value to 'lower'

I was recently using ARM templates to deploy multiple resources into Azure. While deploying Storage Accounts, I ran into an issue caused by constraints Azure puts on the account name:
The name of a Storage Account must not contain upper-case letters.
Its maximum length is 24 characters.
I want to take this name from the user. I can handle the second constraint with the "maxLength" property on 'parameters', but for lower case there is no such property in 'parameters', and I'm unable to find any function that converts the user-entered value to lower case.
What I expect:
Method to convert the user entered value in lower case.
Any other method to suit my use case.
Thanks in advance.
You should look at the string function reference for ARM templates.
You need to create a variable, or just add those functions to the name input, like so:
"name": "[toLower(parameters('Name'))]"
or combine with substring, something like this:
"variables": {
  "storageAccountName": "[toLower(concat('sawithsse', substring(parameters('storageAccountType'), 0, 2), uniqueString(subscription().id, resourceGroup().id)))]"
},
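If you also want to validate the user's input before the deployment runs, the naming rules above (lowercase letters and digits, 3-24 characters) can be pre-checked client-side. This hypothetical helper is a sketch of that check, not part of ARM itself:

```javascript
// Lowercase the candidate name (mirroring toLower() in the template) and
// reject anything that breaks the storage-account naming rules.
function normalizeStorageAccountName(name) {
  const lowered = name.toLowerCase();
  if (!/^[a-z0-9]{3,24}$/.test(lowered)) {
    throw new Error('Invalid storage account name: ' + name);
  }
  return lowered;
}

console.log(normalizeStorageAccountName('MyStorage01')); // mystorage01
```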

Aggregate query for IBM Cloudant which is basically couchDB

I am a contributor at http://airpollution.online/, an open environment web platform built as open source, with IBM Cloudant as its database service.
The platform's architecture is such that we need to fetch the latest data for each air-pollution measurement device from a collection. Drawing on my experience with MongoDB, I wrote an aggregate query that fetches each device's latest data, keyed on the epoch-time field present in every document of the respective collection.
A sample aggregate query:
db.collection("hourly_analysis").aggregate([
  {
    $sort: {
      "time": -1,
      "Id": -1
    }
  },
  {
    $project: {
      "Id": 1,
      "data": 1,
      "_id": 0
    }
  },
  {
    $group: {
      "_id": "$Id",
      "data": {
        "$last": "$$ROOT"
      }
    }
  }
])
If someone has ideas or suggestions about how I can write the design documents in IBM Cloudant, please help me! Thanks!
P.S. We still have to make the backend of this project open source. (It may take some time.)
In CouchDB/Cloudant this is usually better done as a view than an ad-hoc query. It's a tricky one but try this:
- a map step that emits the device ID and timestamp as two parts of a composite key, plus the device reading as the value
- a reduce step that looks for the largest timestamp and returns both the biggest (most recent) timestamp and the reading that goes with it (both values are needed because when rereducing, we need to know the timestamp so we can compare them)
- the view with group_level set to 1 will give you the newest reading for each device.
In most cases in Cloudant you can use the built-in reduce functions, but here you need a custom reduce that compares timestamps.
(The way that I solved this problem previously was to copy incoming data into a "newest readings" store as well as writing it to a database in the normal way. This makes it very quick to access if you only ever want the newest reading.)
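The steps above can be sketched as the following view functions. The document shape ({ Id: ..., time: ..., data: ... }) is assumed from the MongoDB query in the question; in a real design document these functions are stored as strings under views.latest.map and views.latest.reduce, and emit is a global rather than a parameter (it is passed in here only for testability).

```javascript
// Map: emit a composite key [deviceId, timestamp] with the reading as the
// value. The timestamp travels inside the value too, because the reduce
// (and any rereduce) needs it to compare readings.
function map(doc, emit) {
  emit([doc.Id, doc.time], { time: doc.time, data: doc.data });
}

// Reduce: keep only the value with the largest (newest) timestamp.
function reduce(keys, values, rereduce) {
  let newest = values[0];
  for (const v of values) {
    if (v.time > newest.time) newest = v;
  }
  return newest;
}

// Querying the view with ?group_level=1 then yields one row per device Id,
// carrying that device's newest reading.
```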

XPages Extlib REST update (HTTP PUT) on calendar event is returning error code (cserror : 1028)

I am trying to update a calendar event in Domino Mail Calendar by using the REST data service "calendar" from the latest xpages extlib release "ExtensionLibraryOpenNTF-901v00_13.20150611-0803".
Has anybody done this with a successful return?
Unfortunately I haven't had any success trying to update a calendar event. I was successful in getting the list of events, creating events, and deleting an event, but updating an event seems to be somehow special. The PDF documentation for the calendar service is quite short on this point. My Domino server accepts all protocols, including PUT. I am using the JSON format for my REST calls. I tried the update with iCal as well, as described in the documentation, but I get the same error.
I am using the Firefox REST plugin to check out the service before implementing it.
I'm using PUT, with content-type "text/calendar" as well as "application/json".
My URL:
http://sitlap55.xyzgmbh.de:8080/mail/padmin.nsf/api/calendar/events/4D750E2B8159D254C1257E9C0066D48D
My body looks like this; it is the simplest event type, a reminder (but I tried it with meeting and appointment as well):
{"events":[{"UID":"4D750E2B8159D254C1257E9C0066D48D","summary":"Ein Reminder update","start":{"date":"2015-08-13","time":"13:00:00","utc":true}}]}
This is how I return the event by a GET:
{
"href": "/mail/padmin.nsf/api/calendar/events/4D750E2B8159D254C1257E9C0066D48D-Lotus_Notes_Generated",
"id": "4D750E2B8159D254C1257E9C0066D48D-Lotus_Notes_Generated",
"summary": "Ein Reminder",
"start": {
"date": "2015-08-12",
"time": "07:00:00",
"utc": true
},
"class": "public",
"transparency": "transparent",
"sequence": 0,
"x-lotus-summarydataonly": {
"data": "TRUE"
},
"x-lotus-appttype": {
"data": "4"
}
}
This is the error I get:
{
"code": 400,
"text": "Bad Request",
"cserror": 1028,
"message": "Error updating event",
"type": "text",
"data": "com.ibm.domino.calendar.store.StoreException: Error updating event\r\n\tat com.ibm.domino.calendar.dbstore.NotesCalendarStore.updateEvent(NotesCalendarStore.java:229)\r\n\tat ... 65 more\r\n"
}
For the attributes in the body I tried a lot of different things: using the id, no id, a UID as in the calendar service documentation, ...
What am I doing wrong here?
The solution:
Using the PUT method, the URL which worked looks like this:
http://sitlap55.xyzgmbh.de:8080/mail/padmin.nsf/api/calendar/events/4D750E2B8159D254C1257E9C0066D48D-Lotus_Notes_Generated
the BODY looks like this:
{"events":[{"id":"4D750E2B8159D254C1257E9C0066D48D-Lotus_Notes_Generated","summary":"Some Reminder update #6","start":{"date":"2015-08-13","time":"10:00:00","utc":true}}]}
What I figured out is that the "id" attribute is required! A bit strange, because it is already in the URL.
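A sketch of that working update, using the URL and body from the solution above. The fetch call is illustrative; any HTTP client that can send a PUT with a JSON content type should behave the same.

```javascript
// The event id (including the -Lotus_Notes_Generated suffix) must appear
// both at the end of the URL and as the "id" field in the body.
const eventId = '4D750E2B8159D254C1257E9C0066D48D-Lotus_Notes_Generated';

const body = {
  events: [{
    id: eventId, // required even though the id is already in the URL
    summary: 'Some Reminder update #6',
    start: { date: '2015-08-13', time: '10:00:00', utc: true }
  }]
};

const url =
  'http://sitlap55.xyzgmbh.de:8080/mail/padmin.nsf/api/calendar/events/' + eventId;

// fetch(url, {
//   method: 'PUT',
//   headers: { 'Content-Type': 'application/json' },
//   body: JSON.stringify(body)
// });
```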
I just checked against the documentation for Domino Access Services (DAS) 9.0.1 - and the example they have there actually works.
I tried a few things before that, e.g. whether I could PATCH (change individual fields) or PUT (change ALL fields) while specifying only the non-system fields. None of these worked. But taking the response from creating (or getting) the event, putting it into a PUT request, and adjusting e.g. the start time works fine.
Looking at your example, I think the issue is similar, as you do not include the end time in the request. But even so, you seem to have to include the entire record as it is returned from the service. Please note that the URL must end with the ENTIRE id (i.e. including "...-Lotus_Notes_Generated") :-)
/John
Edit:
It seems you don't need to add all fields... But be aware of the side effects of not specifying fields... You need to test it for yourself!
