ADF: How to check if a JSON array contains a particular string - Azure

Here is some sample JSON data coming from a Cosmos DB:
[
  {
    "Name": "ABC",
    "ID": 20,
    "Category": "IT",
    "training_cycles": [
      "Jan 01, 2022 → Jun 30, 2022",
      "Jul 01, 2021 → Dec 31, 2021"
    ]
  },
  {
    "Name": "John",
    "ID": 25,
    "Category": "Comp",
    "training_cycles": [
      "Jan 01, 2022 → Jun 30, 2022"
    ]
  },
  {
    "Name": "XYZ",
    "ID": 23,
    "Category": "HR",
    "training_cycles": [
      "Jan 01, 2022 → Jun 30, 2022"
    ]
  }
]
I'd like to ask Azure Data Factory to select the items that contain "Jul 01, 2021 → Dec 31, 2021" within "training_cycles".
So far, in my data flow, I have selected all my items and filtered to only see training_cycles, so my data has one "column" called training_cycles and many items that contain these training_cycles.
I tried filtering with:
contains(training_cycles, "#training_cycles" == "Jul 01, 2021 → Dec 31, 2021")
but it selects all the data instead of only the items that contain the right value.
Thanks for the help

I tried to reproduce your issue.
My sample JSON:
[
  {
    "Name": "ABC",
    "ID": 20,
    "Category": "IT"
  },
  {
    "Name": "John",
    "ID": 25,
    "Category": "Comp"
  },
  {
    "Name": "XYZ",
    "ID": 23,
    "Category": "HR"
  }
]
(Screenshot: source settings and data preview)
Then I used the Filter row modifier. Here, in the Filter on condition, I used Name == 'ABC'. In your case training_cycles is an array, so a plain equality check won't match; use the contains() function with its #item keyword instead: contains(training_cycles, #item == 'Jul 01, 2021 → Dec 31, 2021')
(Screenshot: expected result in data preview)
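The logic the data flow filter needs to express can be sketched in plain JavaScript (this is not ADF expression syntax, just an illustration of the predicate: keep only the rows whose training_cycles array contains the target string):

```javascript
// Illustration only: the same "array contains string" filter in JavaScript.
const target = "Jul 01, 2021 → Dec 31, 2021";
const rows = [
  { Name: "ABC",  training_cycles: ["Jan 01, 2022 → Jun 30, 2022", "Jul 01, 2021 → Dec 31, 2021"] },
  { Name: "John", training_cycles: ["Jan 01, 2022 → Jun 30, 2022"] },
  { Name: "XYZ",  training_cycles: ["Jan 01, 2022 → Jun 30, 2022"] },
];
const matched = rows.filter(row => row.training_cycles.includes(target));
// matched holds only the "ABC" row
```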


LUIS inconsistent datetimeV2 parsing (US and UK formats)

As far as I'm aware, LUIS only comes in the en-US culture for English (there's no en-UK). Therefore, I'd expect the datetimeV2 entities to come back as YYYY-DD-MM. However, sometimes LUIS sends back datetimeV2 entities as YYYY-MM-DD, and it's impossible to tell programmatically when this happens.
Example:
Utterance "take time off 01/03/2019 to 04/03/2019" resolves as the US YYYY-DD-MM format:
[ { timex: '(2019-01-03,2019-04-03,P90D)',
    type: 'daterange',
    start: '2019-01-03',
    end: '2019-04-03' } ]
HOWEVER, utterance "take time off 1st march 2019 to 4th march 2019" or "take time off march 1st 2019 to march 4th 2019" resolves as the UK YYYY-MM-DD format:
[ { timex: '(2019-03-01,2019-03-04,P3D)',
    type: 'daterange',
    start: '2019-03-01',
    end: '2019-03-04' } ]
In addition, if the date is written as DD/MM/YYYY when the month > 12, the format is switched to YYYY-MM-DD once again. E.g. "take time off 01/03/2019 to 18/03/2019" resolves to the first date as YYYY-DD-MM and the second date as YYYY-MM-DD:
[ { timex: '(2019-01-03,2019-03-18,P74D)',
    type: 'daterange',
    start: '2019-01-03',
    end: '2019-03-18' } ]
This makes it very hard to parse dates if the formats keep changing. How can I ensure every date range is formatted as YYYY-DD-MM? Or even YYYY-MM-DD; I don't care, as long as it's consistent or at least tells me which format it has used.
There are a few points to address in your question.
The first concerns your first two examples: there is a mistake in your assessment here:
Utterance "take time off 01/03/2019 to 04/03/2019" resolves as the US
YYYY-DD-MM format:
[ { timex: '(2019-01-03,2019-04-03,P90D)',
    type: 'daterange',
    start: '2019-01-03',
    end: '2019-04-03' } ]
The resolution here is not in the US (YYYY-DD-MM) format; it is YYYY-MM-DD. You can tell from the duration P90D: there are 90 days (roughly three months) between the two dates, which only fits if 2019-01-03 means January 3rd and 2019-04-03 means April 3rd.
For your last example, the reason is different, and it becomes clear once you look at how the entity recognition works: as you can see here, LUIS uses Microsoft.Recognizers.Text to extract these entities from text:
Microsoft.Recognizers.Text powers pre-built entities in both LUIS:
Language Understanding Intelligent Service and Microsoft Bot
Framework; and is also available as standalone packages (for the base
classes and the different entity recognizers).
This solution is open source (https://github.com/Microsoft/Recognizers-Text), so we can analyse it.
The available cultures in .Net version are listed here: https://github.com/Microsoft/Recognizers-Text/blob/master/.NET/Microsoft.Recognizers.Text/Culture.cs
public const string English = "en-us";
public const string EnglishOthers = "en-*";
public const string Chinese = "zh-cn";
public const string Spanish = "es-es";
public const string Portuguese = "pt-br";
public const string French = "fr-fr";
public const string German = "de-de";
public const string Italian = "it-it";
public const string Japanese = "ja-jp";
public const string Dutch = "nl-nl";
public const string Korean = "ko-kr";
I made a quick demo to see the output with your data, using the Culture values provided by the Recognizers (as I don't know which English variant LUIS uses):
Recognizing 'take time off 01/03/2019 to 18/03/2019'
**English**
01/03/2019 to 18/03/2019
{
  "values": [
    {
      "timex": "(2019-01-03,2019-03-18,P74D)",
      "type": "daterange",
      "start": "2019-01-03",
      "end": "2019-03-18"
    }
  ]
}
**English Others**
01/03/2019 to 18/03/2019
{
  "values": [
    {
      "timex": "(2019-03-01,2019-03-18,P17D)",
      "type": "daterange",
      "start": "2019-03-01",
      "end": "2019-03-18"
    }
  ]
}
As you can see, my first result matches yours, so I assume LUIS is based on the English culture, i.e. en-US in the list above.
Based on this, you can see in the implementation that the en-US version tries to match the month-first (MM/DD) reading first, with day-first (DD/MM) as a fallback: the first date in your sentence uses the first match (recognized as the 3rd of January), whereas the second date uses the fallback (recognized as the 18th of March).
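That match-then-fallback behaviour can be sketched in JavaScript (illustrative only; the function name and the simplified rule are not the actual Recognizers-Text implementation, which is in C#):

```javascript
// Illustrative sketch: try the US month-first reading, fall back to
// day-first when the first number cannot be a month.
function parseUsStyle(dateStr) {
  const [a, b, year] = dateStr.split("/").map(Number);
  if (a >= 1 && a <= 12) {
    return new Date(Date.UTC(year, a - 1, b)); // MM/DD reading
  }
  return new Date(Date.UTC(year, b - 1, a)); // DD/MM fallback
}

parseUsStyle("01/03/2019").toISOString().slice(0, 10); // "2019-01-03" (Jan 3rd)
parseUsStyle("18/03/2019").toISOString().slice(0, 10); // "2019-03-18" (Mar 18th)
```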

Need to remove the array in JSON

var meds = [
  {
    "sno": 1,
    "brandName": "EPIDOSIN 8 MG INJECTION",
    "companyName": "TTK Healthcare Ltd",
    "price": "Rs. 17",
    "packagingOfProduct": "1 vial(s) (1 ML injection each)"
  }
]
As per your comment above, you want 17 from "Rs. 17" in your array. Note that meds is an array, so you need to index into it first:
var value = meds[0].price.split(" "); // split "Rs. 17" on the space
var price = value[1]; // at index 1 you will get the value "17"
Now you can do whatever you want with the price.
If you want to set it back in the array, do this:
meds[0].price = price;
Hope this will help!
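A runnable version of the above against the sample array, also converting the extracted string to a number (the Number() conversion is an addition, not part of the original answer):

```javascript
// Extract the numeric part of "Rs. 17" from the first array element.
var meds = [
  {
    sno: 1,
    brandName: "EPIDOSIN 8 MG INJECTION",
    companyName: "TTK Healthcare Ltd",
    price: "Rs. 17",
    packagingOfProduct: "1 vial(s) (1 ML injection each)"
  }
];

var price = Number(meds[0].price.split(" ")[1]); // "Rs. 17" -> 17
meds[0].price = price; // write it back if needed
// meds[0].price is now the number 17
```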

Set GraphQL date format

I have a MongoDB database in which one field is an ISO date.
When I query the collection using a GraphQL (Node) query, I receive my objects back all right, but the date format I see in GraphiQL is this weird format:
"created": "Sun Nov 26 2017 00:55:35 GMT+0100 (CET)"
If I write the field out in my resolver, it shows:
2017-11-25T23:55:35.116Z
How do I change the date format so it will show ISO dates in GraphiQL?
The field is just declared as a String in my data type.
EDIT
My simple type is defined as:
type MyString {
  _id: String
  myString: String
  created: String
}
When I insert a value into the database, created is set automatically by MongoDB.
When I run the query, it returns an array of objects. In my resolver (for checking) I do the following:
getStrings: async (_, args) => {
  let myStrings = await MyString.find({});
  for (var i = 0; i < myStrings.length; i++) {
    console.log(myStrings[i]["created"]);
  }
  return myStrings;
}
All the created dates in the returned array have the form:
2017-11-25T23:55:35.116Z
but when I look at it in GraphiQL it shows as:
"created": "Sun Nov 26 2017 00:55:35 GMT+0100 (CET)"
My question is: why does it change format?
Since my model defines this as a String, it should not be manipulated but just retain the format. But it doesn't. It puzzles me.
Kim
In your resolver, just return a formatted string using toISOString():
const date1 = new Date('2017-11-25T23:45:35.116Z').toISOString();
console.log({date1});
// => { date1: '2017-11-25T23:45:35.116Z' }
const date2 = new Date('Sun Nov 26 2017 00:55:35 GMT+0100 (CET)').toISOString();
console.log({date2})
// => { date2: '2017-11-25T23:55:35.000Z' }
UPDATED to answer the added question, "Why does [the date string] change format"?
Mongo does not store the date as a string. It stores the date as a BSON Date: the number of milliseconds that have elapsed since January 1, 1970 UTC (the Unix epoch, in ISO 8601: 1970-01-01T00:00:00Z), not counting leap seconds. Since your data model requests a String, it'll coerce the value using toString():
const date1 = new Date('2017-11-25T23:45:35.116Z').toString();
console.log({date1})
// => { date1: 'Sat Nov 25 2017 15:45:35 GMT-0800 (PST)' }
That should clarify the behavior for you, but what you probably really want to do is change your model so that created is properly typed as a Date. You can do this in a couple of ways.
Create a custom scalar
GraphQLScalarType
Creating custom scalar types
Or use an existing package that already does the above for you
graphql-date
graphql-iso-date
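For illustration, the serialize/parseValue pair such a custom scalar ends up implementing can be sketched without any package (the object name DateScalarSketch is made up; the real thing wraps these functions in GraphQLScalarType from the graphql package):

```javascript
// Hypothetical sketch — not the graphql-js API itself.
const DateScalarSketch = {
  // Outgoing values: coerce whatever the resolver returns to an ISO string.
  serialize(value) {
    return new Date(value).toISOString();
  },
  // Incoming variables: ISO string from the client back to a Date.
  parseValue(value) {
    return new Date(value);
  },
};

DateScalarSketch.serialize("Sun Nov 26 2017 00:55:35 GMT+0100 (CET)");
// => "2017-11-25T23:55:35.000Z"
```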
You just need to follow these steps:
1. Set the type of your field to Object.
2. Insert the line scalar Object into your .graphql file.
3. Add the dependency graphql-java-extended-scalars to your pom.xml file.
4. Add .scalar(ExtendedScalars.Object) in your buildRuntimeWiring function.
Give it a try. I tried it and it worked!

Query date with offset in AQL

I have this document:
{
  paymentDate: '2015-08-08T23:41:23.909Z'
}
My local time is GMT+7, hence the date above is 2015-08-09 6:41:23 in my local time.
I want to send the query below and receive the document above:
{
  date: '2015-08-09',
  offset: '+7'
}
What is the best way to achieve that in AQL?
As can be read in the documentation about dates, ArangoDB's native format, JSON, doesn't have a special date type, and thus it is suggested to store dates as strings.
Best practice is to store UTC in the database and convert it into the user's timezone in the application.
Therefore a query would use FILTER and string comparison to select ranges:
arangosh> db._create("exampleTime");
[ArangoCollection 729616498254, "exampleTime" (type document, status loaded)]
arangosh> var timestamps = ["2014-05-07T14:19:09.522","2014-05-07T21:19:09.522","2014-05-08T04:19:09.522","2014-05-08T11:19:09.522","2014-05-08T18:19:09.522"];
arangosh> for (i = 0; i < 5; i++) db.exampleTime.save({value:i, ts: timestamps[i]})
arangosh> db._query("FOR d IN exampleTime FILTER d.ts > '2014-05-07T14:19:09.522' and d.ts < '2014-05-08T18:19:09.522' RETURN d").toArray()
[
  {
    "value" : 2,
    "ts" : "2014-05-08T04:19:09.522",
    "_id" : "exampleTime/729617284686",
    "_rev" : "729617284686",
    "_key" : "729617284686"
  },
  {
    "value" : 1,
    "ts" : "2014-05-07T21:19:09.522",
    "_id" : "exampleTime/729617088078",
    "_rev" : "729617088078",
    "_key" : "729617088078"
  },
  {
    "value" : 3,
    "ts" : "2014-05-08T11:19:09.522",
    "_id" : "exampleTime/729617481294",
    "_rev" : "729617481294",
    "_key" : "729617481294"
  }
]
If your local time in timezone GMT+7 is 2015-08-09 6:41:23, the JavaScript code new Date().toISOString() would return "2015-08-08T23:41:23.000Z" at that moment. As you can see, it returns UTC time. Your computer needs to have the correct date, time and timezone configured, of course.
If you want to query for a date in the past or future, and that date is in local time, you can construct an ISO8601 string with timezone offset specified. Let's say we want to know what 2011-01-01 2:00:00 in GMT+7 is in UTC time:
// timezone offset: 07 hours, 00 minutes (+0700)
new Date("2011-01-01T02:00:00+0700").toISOString()
// result: "2010-12-31T19:00:00.000Z"
The same works in AQL:
RETURN DATE_ISO8601("2011-01-01T02:00:00+0700")
// result: "2010-12-31T19:00:00.000Z"
If you already have a datetime string without timezone offset (2011-01-01T02:00:00), but want to assume it's your local time, you can do the following in JS to append the timezone offset:
// Should return -420 for your timezone GMT+7.
// You can supply an offset in minutes manually as well, of course.
var offset = new Date().getTimezoneOffset()
var offsetHours = offset / 60 | 0
var offsetMinutes = Math.abs(offset % 60)
var offsetStr = ((offsetHours < 0) ? "+" : "-") +    // GMT + or -?
    ((Math.abs(offsetHours) < 10) ? "0" : "") +      // leading zero for single digit
    Math.abs(offsetHours) +                          // hour portion
    ((offsetMinutes < 10) ? "0" : "") +              // leading zero for single digit
    offsetMinutes                                    // minute portion
var dateStr = "2011-01-01T02:00:00" + offsetStr
console.log(dateStr)
console.log(new Date(dateStr).toISOString())
// on a GMT+7 machine, result should be:
// "2011-01-01T02:00:00+0700"
// "2010-12-31T19:00:00.000Z"
// "2010-12-31T19:00:00.000Z"
If the date string is in local time but a Zulu timezone offset was somehow added, you could correct it by 7 hours like this:
// It should have been +0700 and not +0000
var d = new Date("2015-08-09T06:41:23Z").getTime() - 7 * 60 * 60 * 1000
// result: 1439077283000, which is 2015-08-08T23:41:23.000Z
// or in a really hacky way:
new Date("2015-08-09T06:41:23Z".replace("Z", "+0700"))
Edit: this seems to work too:
var d = new Date("2015-08-09T06:41:23Z")
d.setHours(d.getHours() - 7)
This seems to work reliably even if you cross the start or end of DST, at least in Firefox. There was a bug in Chrome, however, which led to completely wrong date calculations: https://code.google.com/p/v8/issues/detail?id=3116

Cannot delete supercolumn with cassandra-cli

[default@keyspace] get fv['user:/file.txt'];
=> (super_column=1365647977415,
(column=6363, value=0000000000000001, timestamp=1368238637628082)
(column=6c6d64, value=0000013f79344eb2, timestamp=1368238637628081)
(column=7362, value=000000000000003a, timestamp=1368238637628083))
=> (super_column=1365653962252,
(column=6363, value=0000000000000001, timestamp=1368238637727277)
(column=6c6d64, value=0000013f798fbee6, timestamp=1368238637727276)
(column=7362, value=0000000000000045, timestamp=1368238637727278))
del fv['user:/file.txt'][1365647977415];
column removed.
get fv['user:/file.txt'];
=> (super_column=1365647977415,
(column=6363, value=0000000000000001, timestamp=1368238637628082)
(column=6c6d64, value=0000013f79344eb2, timestamp=1368238637628081)
(column=7362, value=000000000000003a, timestamp=1368238637628083))
=> (super_column=1365653962252,
(column=6363, value=0000000000000001, timestamp=1368238637727277)
(column=6c6d64, value=0000013f798fbee6, timestamp=1368238637727276)
(column=7362, value=0000000000000045, timestamp=1368238637727278))
How is this possible? The comparator is BytesType; I used:
assume fv comparator as LongType;
The problem was the column timestamp, which was newer than the current time. Be careful, guys.
=> (super_column=1365647977415,
(column=6363, value=0000000000000001, timestamp=1368238637628082)
1368238637628082 == Sat, 11 May 2013 02:17:17 GMT
Now is Thu, 11 Apr 2013 07:10:36 GMT
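You can verify such a timestamp yourself: cassandra-cli shows column timestamps in microseconds since the epoch, so dividing by 1000 gives milliseconds suitable for a JavaScript Date:

```javascript
// Convert a microsecond column timestamp to a human-readable UTC date.
const micros = 1368238637628082;
const when = new Date(Math.floor(micros / 1000));
console.log(when.toUTCString());
// => "Sat, 11 May 2013 02:17:17 GMT"
```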
