I am writing a simple insert query but getting an error.
CREATE TABLE revenue_classification_region (
year int,
region text,
revenue_total int,
PRIMARY KEY (year,region));
INSERT INTO revenue_classification_region (year,region,revenue_total) VALUES (2015,’Western Europe’,5709);
Error:
Invalid syntax at line 1, char 84
INSERT INTO revenue_classification_region (year,region,revenue_total) VALUES (2015,’Western Europe’,5709);
Please help
The text value ('Western Europe') must be enclosed in plain single quotes: '. Most probably this insert was copied from a web page where the single quotes were replaced by typographic (curly) quote characters.
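For reference, the same statement with plain single quotes should run without the syntax error:
INSERT INTO revenue_classification_region (year,region,revenue_total) VALUES (2015,'Western Europe',5709);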
I have a lookup that retrieves a few records from a SQL Server table containing the server, database, schema, table name, and a whole WHERE clause. These values are passed to a Copy Data activity (within a ForEach). In the Copy Data activity I have tried two different dynamic query statements, but I keep getting an error and can't figure out where I'm going wrong.
Values in the table:
SRC_SERVERNAME:        SQ01
SRC_DATABASE:          NAV
SRC_SCHEMANAME:        dbo
SRC_TABLENAME:         Company$Sales Invoice Header
SRC_WHERE_DATE_CLAUSE: where [Posting Date] >= '2021-01-01'
Source setup:
Error for statement 1:
A database operation failed with the following error: 'Incorrect syntax near '.'.'
Incorrect syntax near '.'., SqlErrorNumber=102,Class=15,State=1,
Error for statement 2:
A database operation failed with the following error: 'Incorrect syntax near '.'.'
Incorrect syntax near '.'., SqlErrorNumber=102,Class=15,State=1,
Statement 1 (query):
SELECT *
FROM #{item().SRC_SERVERNAME}.#{item().SRC_DATABASENAME}.#{item().SRC_SCHEMANAME}.#{item().SRC_TABLENAME},' ',#{item().SRC_WHERE_DATE_CLAUSE}
Statement 2 (dynamic query with concat):
#concat('select * from ',item().SRC_SERVERNAME,'.',item().SRC_DATABASENAME,'.',item().SRC_SCHEMANAME,'.',item().SRC_TABLENAME,' ',item().SRC_WHERE_DATE_CLAUSE)
There is a syntax error in your query.
In SQL Server four-part naming, the server name, database name, schema name, and table name are separated by '.'.
When the server, database, schema, or table name contains spaces or other special characters, it must be wrapped in square brackets [].
#concat('select * from [',item().SRC_SERVERNAME, '].[',item().SRC_DATABASENAME,'].[',item().SRC_SCHEMANAME,'].[',item().SRC_TABLENAME, '] ',item().SRC_WHERE_DATE_CLAUSE)
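With the sample row above (and assuming the lookup really exposes the column as SRC_DATABASENAME rather than SRC_DATABASE), the expression should render to something like:
select * from [SQ01].[NAV].[dbo].[Company$Sales Invoice Header] where [Posting Date] >= '2021-01-01'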
select from list by index    ${locator_var}    ${inp_msge_type}
-- getting an error: expected string, int found
select from list by index    ${locator_var}    7
-- not getting any error
${inp_msge_type} contains 7; the result of a DB query is stored in this variable, and to avoid hard-coding the value we need to pass it this way.
Is there any way to write this?
Do not add links to screenshots of code or error messages; format the code pieces accordingly - use the backtick (`) symbol to surround them.
With that rant behind us: your issue is that the keyword Select From List By Index expects the index argument to be a string.
When you called it as
Select From List By Index    ${locator_var}    7
that "7" is actually a string (even though it looks like a number), because that is what the framework defaults to for any value typed directly in the test data. And so it works.
When you get the value from the DB, it is of the type that the DB stores it with; and probably the table schema says it is int. So now you pass an int to the keyword - and it fails.
The fix is simple - just cast (convert) the variable to a string type:
${inp_msge_type}=    Convert To String    ${inp_msge_type}
Now you can call the keyword as you did before.
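Putting the two lines together, a minimal sketch (assuming ${locator_var} and the DB lookup that fills ${inp_msge_type} are already defined elsewhere):
${inp_msge_type}=    Convert To String    ${inp_msge_type}
Select From List By Index    ${locator_var}    ${inp_msge_type}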
I'm new to Elixir/Ecto and I don't understand why my error_data field (defined as :binary in the schema) gets inserted slash-escaped into my PostgreSQL column:
params = %{error_data: "eyJtZXNzYWdlIjoiSW52YWxpZCB0b2tlbiIsImNhdXNlIjpbXSwiZXJyb3IiOiJub3RfZm91bmQiLCJzdGF0dXMiOjQwMX0="}
cast(%{}, params, [:error_data])
|> change(%{error_data: Base.decode64!(params.error_data)})
|> Ecto.Repo.insert()
Following @smathy's insight, I've put an IO.puts(get_change(changeset, :error_data)) between the change and insert calls. It shows the data has been decoded and is not slash-escaped before insertion. But the next line, showing the Ecto query, is escaped... Check my app's output:
[info] Creating error for 1 on channel 1
{"message":"Invalid token","cause":[],"error":"not_found","status":401}
[debug] QUERY OK db=0.5ms
INSERT INTO "errors" ("code","error","error_message","http_status","id","channel_id","inserted_at","updated_at") VALUES ($1,$2,$3,$4,$5,$6,$7,$8) RETURNING "id" ["error-03", "{\"message\":\"Invalid token\",\"cause\":[],\"error\":\"not_found\",\"status\":401}", "Invalid token", 401, 1, 1, ~N[2021-02-16 12:24:58], ~N[2021-02-16 12:24:58]]
Then check these DB queries out: the first is for the error inserted by the code above; the second is for a manually inserted, non-escaped error:
dev=# SELECT error FROM errors ORDER BY updated_at DESC limit 1;
error
---------------------------------------------------------------------------------------
"{\"message\":\"Invalid token\",\"cause\":[],\"error\":\"not_found\",\"status\":401}"
(1 row)
dev=# SELECT error FROM errors ORDER BY updated_at ASC limit 1;
error
---------------------
{"eita": "deu pau"}
(1 row)
How can I avoid that escape and insert the plain decoded ({"message":"Invalid token","cause":[],"error":"not_found","status":401}) content?
If I could use ecto fragments in insertion, I'd have told the DB to decode the base64 string... I didn't find how to do that either... any help?
I wonder if there is any environment configuration that affects how Ecto logs its queries and ends up string-casting/escaping the error_data binary...
The backslashes aren't really there; they're just being displayed by whatever tool you're using to print out that value, because that tool uses " as the string delimiter and therefore escapes embedded double quotes to avoid ambiguity.
The same thing happens in an iex session: if you actually print out the value, it comes out as you're expecting, because outputting a string doesn't include the delimiters:
iex(6)> Base.decode64! "eyJtZXNzYWdlIjoiSW52YWxpZCB0b2tlbiIsImNhdXNlIjpbXSwiZXJyb3IiOiJub3RfZm91bmQiLCJzdGF0dXMiOjQwMX0="
"{\"message\":\"Invalid token\",\"cause\":[],\"error\":\"not_found\",\"status\":401}"
iex(7)> IO.puts v
{"message":"Invalid token","cause":[],"error":"not_found","status":401}
:ok
Update
This is me running a psql query after running precisely the code you've shown above on a string (varchar) field:
testdb=# select error_data from tt;
error_data
-------------------------------------------------------------------------
{"message":"Invalid token","cause":[],"error":"not_found","status":401}
(1 row)
I'm generating a dynamic sql query based on some user input. Here is the code that prepares the query:
var preparedParamValues = paramValues.map(paramValue => `'${paramValue}'`).join(',');
var sql = `INSERT INTO [DB] (${paramNames}) VALUES (${preparedParamValues})`;
When I send the following string to the DB it throws the below error:
'They're forced to drive stupid cars.'
I get an error:
'Unclosed quotation mark after the character string \')\'.'
I'm trying to find a way to escape those characters, but I don't understand the error, or at least the last part of it with all the escaped symbols.
You have to use two single quotes when a single quote appears in the string:
'They''re forced to drive stupid cars.'
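Applied to the code above, that could look something like this (a minimal sketch: doubling the quotes only works for string values, and parameterized queries would be the safer route if your driver supports them):
// Double any single quote inside each value before wrapping it in quotes
var preparedParamValues = paramValues
    .map(paramValue => `'${String(paramValue).replace(/'/g, "''")}'`)
    .join(',');
var sql = `INSERT INTO [DB] (${paramNames}) VALUES (${preparedParamValues})`;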
I have created a UDT made up of fields from three or four columns of data. One of the fields contains a letter inside parentheses, for example (c) or (d). When importing the CSV file using cqlsh's COPY FROM, I get an error message:
Syntax error in CQL query …..mismatched input '(' expecting ')' (….column 3, column 4) VALUES (10.2[(]c…).
I have tried importing a CSV file with fields where the letter has no parentheses and get:
Syntax error in CQL query …..mismatched input 'c' expecting ')' (….column 3, column 4) VALUES (10.2[c]…)
I have tried importing a CSV file without a letter in the field and get:
Syntax error in CQL query …..mismatched input ',' expecting ')' (….column 4) VALUES (10.2,…)
The UDT is made up of integers and text. It appears that importing a CSV file containing the UDT generates a data violation whether the field holds a letter inside parentheses (e.g. (c)), a letter without parentheses, or no value at all.
Have you tried escaping the characters using dollar-quoting ($$) or doubled single quotes ('')? http://docs.datastax.com/en/cql/3.3/cql/cql_reference/escape_char_r.html
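For illustration, with a hypothetical table and column, both forms accept a value containing parentheses and a quote in plain CQL:
INSERT INTO my_table (id, note) VALUES (1, 'class (c) isn''t numeric');
INSERT INTO my_table (id, note) VALUES (2, $$class (c) isn't numeric$$);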