ArangoDB AQL: LOWER not working for value of slice?

I tried the following:
FOR d IN cresume FILTER d.isActive==true AND d.isPublic==true AND 'javascript' IN LOWER(d.resume.skills[*].name) SORT d.activatedTS DESC LIMIT 200 RETURN d
The idea is to check whether (lowercase) 'javascript' appears among the skills[*].name values. This doesn't find any results. If I do:
FOR d IN cresume FILTER d.isActive==true AND d.isPublic==true AND 'JavaScript' IN d.resume.skills[*].name SORT d.activatedTS DESC LIMIT 200 RETURN d
I get results.
Question: does LOWER not work on values coming from a [*] array expansion?

I got info from ArangoDB support. As documented, LOWER works on strings, not arrays. Instead of LOWER(d.resume.skills[*].name) you can use an inline expression: d.resume.skills[* RETURN LOWER(CURRENT.name)]
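Putting that suggestion back into the original query, the full statement would read:

```aql
FOR d IN cresume
  FILTER d.isActive == true
    AND d.isPublic == true
    AND 'javascript' IN d.resume.skills[* RETURN LOWER(CURRENT.name)]
  SORT d.activatedTS DESC
  LIMIT 200
  RETURN d
```

The `[* RETURN ...]` inline expression lowercases each name inside the array expansion, so the IN comparison sees an array of lowercase strings.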

Related

kdb/q: How to apply a string manipulation function to a vector of strings to output a vector of strings?

Thanks in advance for the help. I am new to kdb/q, coming from a Python and C++ background.
Just a simple syntax question: I have a string with fields and their corresponding values:
pp_str: "field_1:abc field_2:xyz field_3:kdb"
I wrote an atomic (scalar) function to extract the value of a given field.
get_field_value: {[field; pp_str] pp_fields: " " vs pp_str; pid_field: pp_fields[where like[pp_fields; field,":*"]]; start_i: (pid_field[0] ss ":")[0] + 1; end_i: count pid_field[0]; indices: start_i + til (end_i - start_i); pid_field[0][indices]}
show get_field_value["field_1"; pp_str]
"abc"
show get_field_value["field_3"; pp_str]
"kdb"
Now, how do I generalize this so that if I input a vector of fields, I get a vector of values? I want to input ("field_1"; "field_2"; "field_3") and get ("abc"; "xyz"; "kdb") back. I tried multiple approaches (below), but I just don't understand kdb/q's syntax well enough to vectorize my function:
/ Attempt 1 - Fail
get_field_value[enlist ("field_1"; "field_2"); pp_str]
/ Attempt 2 - Fail
get_field_value[; pp_str] /. enlist ("field_1"; "field_3")
/ Attempt 3 - Fail
fields: ("field_1"; "field_2")
get_field_value[fields; pp_str]
To run your function for each field, you can project out the pp_str argument and apply the resulting projection with each:
q)get_field_value[;pp_str]each("field_1";"field_3")
"abc"
"kdb"
Kdb actually has built-in functionality to handle this: https://code.kx.com/q/ref/file-text/#key-value-pairs
q){#[;x](!/)"S: "0:y}[`field_1;pp_str]
"abc"
q)
q){#[;x](!/)"S: "0:y}[`field_1`field_3;pp_str]
"abc"
"kdb"
I think this might be the syntax you're looking for.
q)get_field_value[; pp_str]each("field_1";"field_2")
"abc"
"xyz"

Asserting a null value via a Scenario outline

I am trying to assert a null value in a JSON response via the feature file below:
Scenario Outline: GET incident notifications
Given I made a GET request for incident notifications for the "<incident>"
Then I should be able to see "<NotificationID>", "<DateTime>", "<ActionID>", "<Subject>", "<CreatedBy>","<Notes>"
Examples:
| incident | NotificationID | DateTime               | ActionID | Subject                | CreatedBy | Notes      |
| 399      | 211            | 2017-11-28T14:30:11.01 | 0        | Logged with Openreach  |           |            |
| 400      | 2112           | 2017-11-28T14:35:11.01 | 1        | Processed at Openreach | Agent     | AgentNotes |
This is my step definition:
assertThat(webModel.getRestServices().response.getBody().path("CreatedBy[0]"),is(CreatedBy));
assertThat(webModel.getRestServices().response.getBody().path("Notes[0]"),is(Notes));
This is the assertion error I get:
java.lang.AssertionError:
Expected: is ""
but: was null
I could get this working by asserting nullValue(); however, the second run would then fail, since it has to take the parameter from the feature file.
Any help would be greatly appreciated.
When a parameter is left blank in the feature file, it is passed as null. That is why you get this error: the assertion compares null with a String.
You can solve this by doing a null-or-empty check before the assert: if the expected value is blank, skip the assertion; otherwise, assert as usual.
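A minimal, self-contained sketch of that guard in plain Java (without the question's webModel wiring; the helper names here are hypothetical, and checkField stands in for the Hamcrest assertThat(actual, is(expected)) call):

```java
import java.util.Objects;

public class NullSafeAssert {
    // Cucumber passes blank Examples cells as null (or ""), so only
    // assert when the expected value is actually present.
    static boolean shouldAssert(String expected) {
        return expected != null && !expected.isEmpty();
    }

    // Hypothetical stand-in for assertThat(actual, is(expected)):
    // throws AssertionError on mismatch, silently skips blank expectations.
    static void checkField(String actual, String expected) {
        if (shouldAssert(expected) && !Objects.equals(actual, expected)) {
            throw new AssertionError("Expected " + expected + " but was " + actual);
        }
    }

    public static void main(String[] args) {
        checkField(null, "");          // blank expected cell: assertion skipped
        checkField("Agent", "Agent");  // value present and matching: passes
        System.out.println("ok");
    }
}
```

With this guard, the blank CreatedBy/Notes cells in the first Examples row no longer trigger the null-vs-"" mismatch, while the populated second row is still asserted.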

Can't search products in Magento 2

When I try to search for a product, it shows nothing. I checked the exception log, and it shows this error:
[2018-10-06 01:42:56] main.CRITICAL: SQLSTATE[HY000]: General error: 1191 Can't find FULLTEXT index matching the column list, query was: SELECT search_synonyms.* FROM search_synonyms WHERE (MATCH (synonyms) AGAINST ('greene tweed n038406502sd653 o ring 2560 id x 0151 cx in' IN BOOLEAN MODE)) [] []
Try adding a full-text index on the column:
ALTER TABLE search_synonyms ADD FULLTEXT (synonyms);

Query with multiple filters on Pandas

I want to execute this query: filter the data to rows with the 'Gas Oil/ Diesel Oil - Production' transaction where the year is greater than 2000. First I tried the & operator with vectorized column selection, without an if statement, but it did not work. Then I found the query below; this time I got no output at all. What do you think is wrong with my query? Thanks...
if all(b['Commodity - Transaction'] == 'Gas Oil/ Diesel Oil - Production') and all(b[ b['Year'] >2000 ]):
print (b)
else:
print('did not find any values')
What's wrong with:
b.loc[(b['Commodity - Transaction'] == 'Gas Oil/ Diesel Oil - Production') & (b['Year'] >2000)]
?
You can first create a mask with str.contains and then create the subset using boolean indexing:
print(b[(b['Commodity - Transaction'].str.contains('Gas Oil/ Diesel Oil - Production')) &
        (b['Year'] > 2000)])
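Both answers rely on combining boolean masks with &. A self-contained toy example (the column names come from the question; the data is invented):

```python
import pandas as pd

# Toy frame with the question's column names; the rows are made up.
b = pd.DataFrame({
    "Commodity - Transaction": [
        "Gas Oil/ Diesel Oil - Production",
        "Gas Oil/ Diesel Oil - Production",
        "Crude Oil - Production",
    ],
    "Year": [1999, 2005, 2010],
})

# Each comparison yields a boolean Series; & combines them element-wise.
# The parentheses are required because & binds tighter than == and >.
mask = (b["Commodity - Transaction"] == "Gas Oil/ Diesel Oil - Production") \
       & (b["Year"] > 2000)
result = b.loc[mask]
print(result)  # only the 2005 production row survives both filters
```

Note that Python's `and` (as used in the original if-statement attempt) does not work on Series; element-wise filtering needs `&` with parenthesized conditions.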

Mnesia pagination with fragmented table

I have a mnesia table configured as follows:
-record(space, {id, t, s, a, l}).
mnesia:create_table(space, [ {disc_only_copies, nodes()},
{frag_properties, [ {n_fragments, 400}, {n_disc_copies, 1}]},
{attributes, record_info(fields, space)}]),
I have at least 4 million records on this table for test purposes.
I have implemented something like this: Pagination search in Erlang Mnesia
fetch_paged() ->
MatchSpec = {'_',[],['$_']},
{Record, Continuation} = mnesia:activity(async_dirty, fun mnesia:select/4, [space, [MatchSpec], 10000, read], mnesia_frag).
next_page(Cont) ->
mnesia:activity(async_dirty, fun() -> mnesia:select(Cont) end, mnesia_frag).
When I execute the pagination methods, each batch contains somewhere between 3,000 and 8,000 records, but never 10,000.
What do I have to do to get batches of a consistent size?
The problem is that you expect mnesia:select/4, which is documented as:
select(Tab, MatchSpec, NObjects, Lock) -> transaction abort | {[Object],Cont} | '$end_of_table'
to honor the NObjects limit, NObjects being 10,000 in your example.
But the same documentation also says:
For efficiency the NObjects is a recommendation only and the result may contain anything from an empty list to all available results.
and that's the reason you are not getting consistent batches of 10,000 records, because NObjects is not a limit but a recommendation batch size.
If you want exactly 10,000 records, you have no option other than writing your own function that accumulates results across calls; but select/4 is written this way for optimization purposes, so the code you write will most probably be slower than the original.
BTW, you can find the mnesia source code at https://github.com/erlang/otp/tree/master/lib/mnesia/src
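An untested sketch of such an accumulating wrapper (the function names are made up; it keeps calling select on the continuation until at least N records have been gathered, so the final batch may slightly overshoot and would need trimming if an exact size matters):

```erlang
%% Hedged sketch, not production code: accumulate select/4 batches
%% until we have at least N records or hit '$end_of_table'.
fetch_n(N) ->
    MatchSpec = [{'_', [], ['$_']}],
    First = mnesia:activity(async_dirty,
                            fun() -> mnesia:select(space, MatchSpec, N, read) end,
                            [], mnesia_frag),
    collect(First, N, []).

collect('$end_of_table', _N, Acc) ->
    {lists:append(lists:reverse(Acc)), '$end_of_table'};
collect({Recs, Cont}, N, Acc) ->
    NewAcc = [Recs | Acc],
    case lists:sum([length(L) || L <- NewAcc]) >= N of
        true ->
            %% Enough records gathered; hand back the continuation
            %% so the caller can fetch the next page.
            {lists:append(lists:reverse(NewAcc)), Cont};
        false ->
            Next = mnesia:activity(async_dirty,
                                   fun() -> mnesia:select(Cont) end,
                                   [], mnesia_frag),
            collect(Next, N, NewAcc)
    end.
```

As the answer notes, each extra round trip over the 400 fragments costs time, so this will be slower than letting select/4 pick its own batch sizes.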
