syntax error at or near "I" while trying to insert rows in postgresql via python script - python-3.x

I'm trying to insert a row into a Postgres table via Python code.
Code line:
query = f"INSERT INTO Phone (data, result, reason) VALUES ('{json.dumps(data)}', {val}, '{json.dumps(reason)}') RETURNING id;"
Here, data and reason are columns of type json, and result is a bool column.
Printing the query variable gives:
INSERT INTO Phone (data, result, reason) VALUES ('{'ICl': False, 'Poster': True, 'Lock': True, 'Model': 'ABC'}', True, '{}') RETURNING id;
Error:
syntax error at or near "I"
LINE 1: I
^
Copy-pasting the same query directly into TablePlus gives no error and the row gets inserted.
Where is the error? I can't understand why the query fails this way, and only from the Python code. Please help.

You need to let psycopg2 handle the quoting for you.
Please try this:
query = """INSERT INTO Phone (data, result, reason)
VALUES (%s,%s,%s) RETURNING id;"""
cursor.execute(query, (json.dumps(data), val, json.dumps(reason),))
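For completeness, here is a minimal end-to-end sketch (the connection string and the sample values are placeholders; adjust to your setup):
import json
import psycopg2

data = {"ICl": False, "Poster": True, "Lock": True, "Model": "ABC"}
reason = {}
val = True

conn = psycopg2.connect("dbname=mydb user=myuser")  # placeholder DSN
with conn, conn.cursor() as cursor:
    query = """INSERT INTO Phone (data, result, reason)
               VALUES (%s, %s, %s) RETURNING id;"""
    # psycopg2 quotes each parameter itself, so quotes embedded in the JSON
    # strings cannot break the statement (and you avoid SQL injection).
    cursor.execute(query, (json.dumps(data), val, json.dumps(reason)))
    new_id = cursor.fetchone()[0]
print(new_id)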

Related

ADF: Pass dynamic Where Clause as a string with quotes

I have a Lookup that retrieves a few records from a SQL Server table containing server, database, schema, table name, and a whole WHERE clause. These values are passed to a Copy Data activity (within a ForEach). In the Copy Data activity I have tried two different dynamic query statements, but I keep getting an error and can't figure out where I'm going wrong.
Values in table:
SRC_SERVERNAME: SQ01
SRC_DATABASE: NAV
SRC_SCHEMANAME: dbo
SRC_TABLENAME: Company$Sales Invoice Header
SRC_WHERE_DATE_CLAUSE: where [Posting Date] >= '2021-01-01'
Error for statement 1:
A database operation failed with the following error: 'Incorrect syntax near '.'.'
Incorrect syntax near '.'., SqlErrorNumber=102,Class=15,State=1,
Error for statement 2:
A database operation failed with the following error: 'Incorrect syntax near '.'.'
Incorrect syntax near '.'., SqlErrorNumber=102,Class=15,State=1,
Statement 1 (query):
SELECT *
FROM @{item().SRC_SERVERNAME}.@{item().SRC_DATABASENAME}.@{item().SRC_SCHEMANAME}.@{item().SRC_TABLENAME},' ',@{item().SRC_WHERE_DATE_CLAUSE}
Statement 2 (dynamic query with concat):
@concat('select * from ',item().SRC_SERVERNAME,'.',item().SRC_DATABASENAME,'.',item().SRC_SCHEMANAME,'.',item().SRC_TABLENAME,' ',item().SRC_WHERE_DATE_CLAUSE)
There is a syntax error in your query.
In SQL Server four-part naming, the server name, database name, schema name, and table name must be separated by '.'.
When a server/database/schema/table name contains spaces or other special characters, it must be wrapped in square brackets [].
@concat('select * from [',item().SRC_SERVERNAME, '].[',item().SRC_DATABASENAME,'].[',item().SRC_SCHEMANAME,'].[',item().SRC_TABLENAME, '] ',item().SRC_WHERE_DATE_CLAUSE)
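The same quoting rule applies anywhere you assemble such a query string. A minimal illustrative sketch in plain Python (not ADF expression syntax; the doubling of ']' follows T-SQL's identifier-escaping rule):
def bracket(name):
    # Quote a SQL Server identifier; a literal ']' inside the name is escaped by doubling it.
    return "[" + name.replace("]", "]]") + "]"

row = {
    "SRC_SERVERNAME": "SQ01",
    "SRC_DATABASENAME": "NAV",
    "SRC_SCHEMANAME": "dbo",
    "SRC_TABLENAME": "Company$Sales Invoice Header",
    "SRC_WHERE_DATE_CLAUSE": "where [Posting Date] >= '2021-01-01'",
}

query = "select * from {}.{}.{}.{} {}".format(
    bracket(row["SRC_SERVERNAME"]),
    bracket(row["SRC_DATABASENAME"]),
    bracket(row["SRC_SCHEMANAME"]),
    bracket(row["SRC_TABLENAME"]),
    row["SRC_WHERE_DATE_CLAUSE"],
)
print(query)
# select * from [SQ01].[NAV].[dbo].[Company$Sales Invoice Header] where [Posting Date] >= '2021-01-01'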

Can't insert 535.5357055664062 to JSON field of Google Cloud Spanner

Some float values cannot be inserted into JSON fields.
{
  "error": {
    "code": 400,
    "message": "Invalid value for bind parameter json: Expected JSON.",
    "status": "INVALID_ARGUMENT"
  }
}
Examples:
Invalid float:
535.5357055664062
Valid floats:
535.535705566406
103.3588679387016812112421411
These values are all valid for the FLOAT64 type, so they can be inserted into FLOAT64 fields.
+1 to Knut's answer. See also relevant documentation here: https://cloud.google.com/spanner/docs/working-with-json#specifications
If you do not care about the value round-tripping through its string representation, the documentation points to PARSE_JSON('<float_string>', wide_number_mode='round'); see also: https://cloud.google.com/spanner/docs/json_functions#parse_json .
Would this help in your case?
I think this is caused by Cloud Spanner trying to ensure that the string representation of the stored JSON value will be equal to the input. The decimal value 535.5357055664062 would be stored as float value 535.53570556640625. So you have:
535.5357055664062 vs
535.53570556640625
When the latter is rounded, it will return 535.5357055664063, which is different from the initial value.
Compare that to the decimal value 535.535705566406 which will translate to float value 535.53570556640625, so you have:
535.535705566406
535.53570556640625
When the latter is rounded, it will still return 535.535705566406.
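You can see the underlying double behind each literal with a couple of lines of Python (just an illustration of the round-trip idea, not Spanner's actual implementation):
value = "535.5357055664062"
d = float(value)          # nearest IEEE-754 double to the decimal literal
print(f"{d:.17g}")        # 535.53570556640625 -- the value that is actually stored
print(repr(d))            # 535.5357055664062  -- shortest string that parses back to d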
The error message you are getting is somewhat confusing, so it would be interesting to know exactly how you are inserting the value. I tried the following using DBeaver, with plain SQL statements and no parameters, and got a more fitting error message:
create table jsontest (id int64, value json) primary key (id);
insert into jsontest (id, value) values (1, json '{"value": 535.5357055664062}');
The above returns the following error message:
SQL Error [3]: INVALID_ARGUMENT: com.google.api.gax.rpc.InvalidArgumentException: io.grpc.StatusRuntimeException: INVALID_ARGUMENT: Invalid JSON literal: Input number: 535.5357055664062 cannot round-trip through string representation [at 1:45]
insert into jsontest (id, value) values (1, json '{"value": 535.5357055664062}')
You can easily try which values will be accepted and which will not by just executing a simple SELECT JSON '<any-random-float>' statement. For example:
-- This succeeds:
select json '535.5357055664063';
-- This fails:
select json '535.53570556640631';
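If you would rather script this probe than use a SQL console, here is a minimal sketch with the google-cloud-spanner Python client (the project, instance, and database IDs are placeholders):
from google.cloud import spanner

client = spanner.Client(project="my-project")  # placeholder project
database = client.instance("my-instance").database("my-database")  # placeholder IDs

# Same probe as the SELECT above; an invalid literal raises an
# InvalidArgument error when the results are consumed.
with database.snapshot() as snapshot:
    for row in snapshot.execute_sql("SELECT JSON '535.5357055664063' AS j"):
        print(row[0])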
(I understand that this does not solve your problem, but it seems that this is the intended behavior of the JSON data type in Cloud Spanner. Any questions regarding the feature itself should probably be done through the support channels of the product.)

How can I avoid escape chars in inserted binary string with Elixir/Ecto/Postgrex?

I'm new to Elixir/Ecto and I don't understand why my error_data field (defined as :binary in the schema) gets inserted slash-escaped into my PostgreSQL column:
params = %{error_data: "eyJtZXNzYWdlIjoiSW52YWxpZCB0b2tlbiIsImNhdXNlIjpbXSwiZXJyb3IiOiJub3RfZm91bmQiLCJzdGF0dXMiOjQwMX0="}
cast(%{}, params, [:error_data])
|> change(%{error_data: Base.decode64!(params.error_data)})
|> Ecto.Repo.insert()
Following @smathy's insight, I've put an IO.puts(get_change(changeset, :error_data)) between the change and insert calls. It shows the data has been decoded and is not slash-escaped before insertion. But the next line, showing the Ecto query, is escaped... Check my app's output:
[info] Creating error for 1 on channel 1
{"message":"Invalid token","cause":[],"error":"not_found","status":401}
[debug] QUERY OK db=0.5ms
INSERT INTO "errors" ("code","error","error_message","http_status","id","channel_id","inserted_at","updated_at") VALUES ($1,$2,$3,$4,$5,$6,$7,$8) RETURNING "id" ["error-03", "{\"message\":\"Invalid token\",\"cause\":[],\"error\":\"not_found\",\"status\":401}", "Invalid token", 401, 1, 1, ~N[2021-02-16 12:24:58], ~N[2021-02-16 12:24:58]]
Then check these DB queries out: the first is the error inserted by the code; the second is a manually inserted, non-escaped error:
dev=# SELECT error FROM errors ORDER BY updated_at DESC limit 1;
error
---------------------------------------------------------------------------------------
"{\"message\":\"Invalid token\",\"cause\":[],\"error\":\"not_found\",\"status\":401}"
(1 row)
dev=# SELECT error FROM errors ORDER BY updated_at ASC limit 1;
error
---------------------
{"eita": "deu pau"}
(1 row)
How can I avoid that escape and insert the plain decoded ({"message":"Invalid token","cause":[],"error":"not_found","status":401}) content?
If I could use Ecto fragments in the insert, I'd have told the DB to decode the base64 string itself... I didn't find out how to do that either... any help?
I wonder if there is any environment configuration that affects Ecto's query logging and ends up string-casting/escaping the error_data binary...
They're not really there; they're just being displayed by whatever tool you're using to print out that value, because that tool uses " as the string delimiter and therefore escapes embedded quotes to avoid ambiguity.
The same thing happens in an iex session: if you actually print out the value, it comes out as you're expecting, because outputting a string doesn't include the delimiters:
iex(6)> Base.decode64! "eyJtZXNzYWdlIjoiSW52YWxpZCB0b2tlbiIsImNhdXNlIjpbXSwiZXJyb3IiOiJub3RfZm91bmQiLCJzdGF0dXMiOjQwMX0="
"{\"message\":\"Invalid token\",\"cause\":[],\"error\":\"not_found\",\"status\":401}"
iex(7)> IO.puts v
{"message":"Invalid token","cause":[],"error":"not_found","status":401}
:ok
Update
This is me running a psql query after running precisely the code you've shown above on a string (varchar) field:
testdb=# select error_data from tt;
error_data
-------------------------------------------------------------------------
{"message":"Invalid token","cause":[],"error":"not_found","status":401}
(1 row)

dbSendQuery giving format error

I need to create a table and insert some values into it using R.
library(sqldf)
Company_Master <- dbConnect(SQLite(), dbname="Company Master.sqlite" )
dbSendQuery(conn = Company_Master,
"CREATE TABLE Company Master(
Comp_name varchar
Address varchar)
")
Up to this point no error is raised. But then, when I want to insert some values into the table:
dbSendQuery(conn = Company_Master, "INSERT INTO COMPANY MASTER
VALUES('INFOSYS','abcLALA')")
I get an error like:
Error in result_create(conn@ptr, statement) : near "Master": syntax error
I found this format at https://www.r-bloggers.com/r-and-sqlite-part-1/ and can't seem to find any other format except using dbCreateTable. What is wrong with this format? Please help.
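For reference, SQLite is choking on the unquoted space in the table name ("Company Master"), and the CREATE statement is also missing a comma between the column definitions. A minimal sketch of the corrected statements, shown here with Python's sqlite3 module for illustration (the same quoted-identifier SQL works verbatim from dbSendQuery):
import sqlite3

conn = sqlite3.connect("Company Master.sqlite")
# Quote the identifier because of the space; note the comma between columns.
conn.execute('CREATE TABLE "Company Master" (Comp_name varchar, Address varchar)')
conn.execute('INSERT INTO "Company Master" VALUES (?, ?)', ("INFOSYS", "abcLALA"))
conn.commit()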

DocumentDB Stored Procedure Lumenize

I'm using the Lumenize aggregate stored procedure (https://github.com/lmaccherone/documentdb-lumenize) with the .NET client, and I'm having trouble with the filterQuery content.
How do I simply pass an alphanumeric value to the filterQuery?
string configString = @"{
cubeConfig: {
groupBy: 'Modele',
field: 'Distance',
f: 'sum'
},
filterQuery: 'SELECT * FROM Modele WHERE ModeleGUID = ''0b93def1-ccd7-fc35-0475-b47c89137c3f'' '}";
Each test gives me a parse error in the filterQuery :(
Error: One or more errors occurred.
Message: After parsing a value an unexpected character was encountered:
'. Path 'filterQuery', line 7, position 63.
End of demo, press any key to exit.
Thanks
Just to properly close this out: The issue is related to multiple single-quotes in the filter string. As long as they're escaped properly (e.g. \'), things should work as expected.
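A way to sidestep the escaping entirely is to build the config with a JSON serializer and let it handle the quoting. A minimal sketch in Python (field and collection names taken from the question; adapt to however you invoke the sproc):
import json

config = {
    "cubeConfig": {"groupBy": "Modele", "field": "Distance", "f": "sum"},
    # Double-quoted JSON means the single quotes around the GUID need no escaping at all.
    "filterQuery": "SELECT * FROM Modele WHERE ModeleGUID = '0b93def1-ccd7-fc35-0475-b47c89137c3f'",
}
config_string = json.dumps(config)
print(config_string)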
