DocumentDB Stored Procedure Lumenize - Azure

I'm using the aggregate stored procedure Lumenize (https://github.com/lmaccherone/documentdb-lumenize) with the .NET client, and I'm having trouble with the filterQuery content.
How do I simply pass an alphanumeric value to the filterQuery?
string configString = @"{
cubeConfig: {
groupBy: 'Modele',
field: 'Distance',
f: 'sum'
},
filterQuery: 'SELECT * FROM Modele WHERE ModeleGUID = ''0b93def1-ccd7-fc35-0475-b47c89137c3f'' '}";
Each test gives me a parse error in the filterQuery :(
Error: One or more errors occurred.
Message: After parsing a value an unexpected character was encountered:
'. Path 'filterQuery', line 7, position 63.
End of demo, press any key to exit.
Thanks

Just to properly close this out: The issue is related to multiple single-quotes in the filter string. As long as they're escaped properly (e.g. \'), things should work as expected.
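For reference, one way to sidestep manual escaping entirely is to build the config string with a JSON serializer instead of writing it by hand: the serializer quotes with double quotes, so the single quotes inside the SQL need no escaping at all. A minimal Python sketch of the idea (the config fields are taken from the question; whether Lumenize accepts strict-JSON config, and the equivalent serializer call in C#, e.g. JsonConvert.SerializeObject, are left to verify):

import json

# Build the cube config as a plain dict; json.dumps handles all quoting.
config = {
    "cubeConfig": {"groupBy": "Modele", "field": "Distance", "f": "sum"},
    # The single quotes around the GUID need no manual escaping here.
    "filterQuery": "SELECT * FROM Modele WHERE ModeleGUID = "
                   "'0b93def1-ccd7-fc35-0475-b47c89137c3f'",
}
config_string = json.dumps(config)
print(config_string)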

Related

How can I avoid escape chars in inserted binary string with Elixir/Ecto/Postgrex?

I'm new to Elixir/Ecto and I don't understand why my error_data field (defined as :binary in the schema) gets inserted slash-escaped in my PostgreSQL column:
params = %{error_data: "eyJtZXNzYWdlIjoiSW52YWxpZCB0b2tlbiIsImNhdXNlIjpbXSwiZXJyb3IiOiJub3RfZm91bmQiLCJzdGF0dXMiOjQwMX0="}
cast(%{}, params, [:error_data])
|> change(%{error_data: Base.decode64!(params.error_data)})
|> Ecto.Repo.insert()
Following @smathy's insight, I've put an IO.puts(get_change(changeset, :error_data)) between the change and insert calls. It shows the data has been decoded and is not slash-escaped before insertion. But the next line showing the Ecto query is escaped... Check my app's output:
[info] Creating error for 1 on channel 1
{"message":"Invalid token","cause":[],"error":"not_found","status":401}
[debug] QUERY OK db=0.5ms
INSERT INTO "errors" ("code","error","error_message","http_status","id","channel_id","inserted_at","updated_at") VALUES ($1,$2,$3,$4,$5,$6,$7,$8) RETURNING "id" ["error-03", "{\"message\":\"Invalid token\",\"cause\":[],\"error\":\"not_found\",\"status\":401}", "Invalid token", 401, 1, 1, ~N[2021-02-16 12:24:58], ~N[2021-02-16 12:24:58]]
Then check these DB queries out: the first shows the error inserted by the code; the second shows a manually inserted, non-escaped error:
dev=# SELECT error FROM errors ORDER BY updated_at DESC limit 1;
error
---------------------------------------------------------------------------------------
"{\"message\":\"Invalid token\",\"cause\":[],\"error\":\"not_found\",\"status\":401}"
(1 row)
dev=# SELECT error FROM errors ORDER BY updated_at ASC limit 1;
error
---------------------
{"eita": "deu pau"}
(1 row)
How can I avoid that escaping and insert the plain decoded content ({"message":"Invalid token","cause":[],"error":"not_found","status":401})?
If I could use Ecto fragments in the insertion, I'd have told the DB to decode the base64 string, but I didn't find how to do that either... any help?
I wonder if there is any environment configuration that affects Ecto, making it log its queries with the error_data binary cast/escaped as a string...
They're not really there; they're just being displayed by whatever tool you're using to print out that value, because that tool uses " as the string delimiter and therefore escapes embedded double quotes to avoid ambiguity.
The same thing happens in an iex session: if you actually print the value, it comes out as you're expecting, because outputting a string doesn't include the delimiters:
iex(6)> Base.decode64! "eyJtZXNzYWdlIjoiSW52YWxpZCB0b2tlbiIsImNhdXNlIjpbXSwiZXJyb3IiOiJub3RfZm91bmQiLCJzdGF0dXMiOjQwMX0="
"{\"message\":\"Invalid token\",\"cause\":[],\"error\":\"not_found\",\"status\":401}"
iex(7)> IO.puts v
{"message":"Invalid token","cause":[],"error":"not_found","status":401}
:ok
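The same inspected-versus-actual distinction exists in most languages. A quick Python analogy, where json.dumps plays the role Elixir's inspect plays here, adding delimiters and escapes that are not part of the data:

import json

s = '{"message":"Invalid token","cause":[],"error":"not_found","status":401}'
print(s)              # raw content: no backslashes anywhere
print(json.dumps(s))  # encoded form: wrapped in "s, inner quotes escaped as \"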
Update
This is me running a psql query after running precisely the code you've shown above on a string (varchar) field:
testdb=# select error_data from tt;
error_data
-------------------------------------------------------------------------
{"message":"Invalid token","cause":[],"error":"not_found","status":401}
(1 row)

syntax error at or near "I" while trying to insert rows in PostgreSQL via a Python script

Trying to insert a row into a Postgres table via Python code.
Code line:
query = f"INSERT INTO Phone (data, result, reason) VALUES ('{json.dumps(data)}', {val}, '{json.dumps(reason)}') RETURNING id;"
Here, data and reason are columns of type json, and result is a bool column.
The query, when the variable is printed:
INSERT INTO Phone (data, result, reason) VALUES ('{'ICl': False, 'Poster': True, 'Lock': True, 'Model': 'ABC'}', True, '{}') RETURNING id;
Error:
syntax error at or near "I"
LINE 1: I
^
The same query, when copy-pasted directly into TablePlus, gives me no error and the row gets inserted.
Where is the error? I can't understand why the query fails this way only from the Python code. Please help.
You need to let psycopg2 handle the quoting for you.
Please try this:
query = """INSERT INTO Phone (data, result, reason)
VALUES (%s,%s,%s) RETURNING id;"""
cursor.execute(query, (json.dumps(data), val, json.dumps(reason),))
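As a side note, psycopg2 also ships a Json adapter that serializes Python objects for json/jsonb columns, so the explicit json.dumps calls can be dropped. A minimal sketch (connection details and sample values are placeholders):

import psycopg2
from psycopg2.extras import Json

conn = psycopg2.connect("dbname=test")  # placeholder connection string
data = {"ICl": False, "Poster": True, "Lock": True, "Model": "ABC"}
val = True
reason = {}

with conn, conn.cursor() as cursor:
    cursor.execute(
        "INSERT INTO Phone (data, result, reason) VALUES (%s, %s, %s) RETURNING id;",
        (Json(data), val, Json(reason)),
    )
    new_id = cursor.fetchone()[0]  # the RETURNING id comes back as a row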

How to fix 'Unclosed quotation mark after the character string \')\'.' error

I'm generating a dynamic sql query based on some user input. Here is the code that prepares the query:
var preparedParamValues = paramValues.map(paramValue => `'${paramValue}'`).join(',');
var sql = `INSERT INTO [DB] (${paramNames}) VALUES (${preparedParamValues})`;
When I send the following string to the DB it throws the below error:
'They're forced to drive stupid cars.'
I get an error:
'Unclosed quotation mark after the character string \')\'.'
I'm trying to find a way to escape all those characters, but I don't understand the error, or at least the last part of it with all the symbols.
You have to use two single quotes when a single quote appears in the string:
'They''re forced to drive stupid cars.'
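If you do stick with string building, the doubling rule is simple to apply programmatically. A rough Python sketch of the idea (parameterized queries remain the safer option, since this handles only the quote character):

def quote_tsql_literal(value: str) -> str:
    # Wrap value as a T-SQL string literal, doubling embedded single quotes.
    return "'" + value.replace("'", "''") + "'"

print(quote_tsql_literal("They're forced to drive stupid cars."))
# 'They''re forced to drive stupid cars.'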

invalid input syntax for type numeric: " "

I'm getting this message in Redshift: invalid input syntax for type numeric: " ", even after trying to implement the advice found on SO.
I am trying to convert text to a number.
In my inner join, I try to make sure that the text being processed is first converted to null when there is an empty string, like so:
nullif(trim(atl.original_price::text),'') as original_price
... I noticed from a related post on coalesce that you have to convert the value to text before you can try and nullif it.
Then in the outer join, I test to see that there's a limited set of acceptable characters and if this test is met I try to do the to_number conversion:
,case
when regexp_instr(trim(atl.original_price),'[^0-9.$,]')=0
then to_number(atl.original_price,'FM999999999D00')
else null
end as original_price2
At this point I get the above error, and unfortunately I can't see the details in DataGrip to find the offending value.
So my questions are:
I notice that there is an empty space in my error message, invalid input syntax for type numeric: " ". Does this error have the exact same meaning as invalid input syntax for type numeric: '', which is what I see in similar posts?
Of course: what am I doing wrong?
Thanks!
It's hard to know for sure without some data and the complete code to try and reproduce the example, but as some have mentioned in the comments, the most likely cause is the to_number() function you are using.
In the earlier code fragment you are converting original_price to text (a string) and then substituting NULL when the value is an empty string (''). Calling the to_number() function on an empty string will give you the error described.
Without the full SQL statement it's not clear why you're putting the nullif() function around original_price in the "inner join", or whether the CASE statement is really in an outer join clause or one of the columns returned by the query. However, you could perhaps alter the nullif() to substitute a value that can be converted to a number, e.g. '0.00' instead of ''.
Sorry I couldn't share real data. I spent the weekend testing small sets to try and trap the error. I found that the error was caused by the input string having no numbers, which is permitted by my regex filter:
when regexp_instr(trim(atl.original_price),'[^0-9.$,]')=0
I wrongly expected that a non-numeric string like "$" would evaluate to NULL, and that the to_number function would then return NULL. But from experimenting, it seems that it needs at least one number somewhere in the string. Otherwise it reduces the string argument to an empty string prior to running the to_number formatting and chokes.
For example select to_number(trim('$1'::text),'FM999999999999D00') will evaluate to 1 but select to_number(trim('$A'::text),'FM999999999999D00') will throw the empty string error.
My fix was to add an additional regex to my initial filter:
and regexp_instr(atl.original_price2,'[0-9]')>0
This ensures that at least one number will be in the string and after that the empty string error went away.
Hope my learning experience helps someone else.
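The combined filter is easy to sanity-check outside the warehouse. A rough Python equivalent of the two regex conditions (float() stands in for Redshift's to_number and is only illustrative):

import re

def parse_price(raw: str):
    s = raw.strip()
    # Original filter: reject anything besides digits, '.', '$' and ','.
    if re.search(r"[^0-9.$,]", s):
        return None
    # Added filter: require at least one digit, otherwise conversion chokes.
    if not re.search(r"[0-9]", s):
        return None
    return float(re.sub(r"[$,]", "", s))  # stand-in for to_number

print(parse_price("$1"))  # 1.0
print(parse_price("$"))   # None, instead of an "empty string" error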

Cosmos DB: Querying documents properties with special characters such as "$"

I've got a Cosmos DB collection with a document that contains properties that have a special character and what I assume is a reserved word. An example document is:
{
$type: 'Some value',
Value: 'Some other value'
}
If I execute the following query in the Azure Portal Query Explorer:
select * from c where c.Value = 'Some other value'
I receive an error of "Syntax error, incorrect syntax near 'Value'." I get a similar error querying on c.$type.
How do I escape these property names so that I can query?
In the case of special characters or reserved words, you will need to wrap the property name inside []
Example:
SELECT * FROM c WHERE c["$type"] = "Some value"
SELECT * FROM c WHERE c["value"] = "$Some other value"
