How to get a value from InfluxDB using a time string - node.js

How can I fetch data from InfluxDB using a time string in Node.js?
Query:
"SELECT * FROM Measurement WHERE time > '2018-01-08 23:00:01.232000000' AND time < '2018-01-09'/ "
I am getting the following error when executing the above query:
Error: Error from InfluxDB: invalid operation: time and *influxql.RegexLiteral are not compatible at ResultError.Error
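The trailing "/" in the query is the likely culprit: InfluxQL parses anything between slashes as a regex literal, and a regex cannot be compared against the time column, which matches the *influxql.RegexLiteral error. One plausible fix (a sketch, not verified against the original database) is to drop the slash and keep both bounds as quoted time strings; the corrected query string is built in Python here purely for illustration:

```python
# Hypothetical fix: remove the stray trailing "/" so InfluxQL no longer
# tries to parse a regex literal after the second timestamp.
start = "2018-01-08 23:00:01.232000000"
end = "2018-01-09"

# Both bounds stay as single-quoted time strings, which InfluxQL accepts.
query = (
    f"SELECT * FROM Measurement "
    f"WHERE time > '{start}' AND time < '{end}'"
)
print(query)
```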

Related

psycopg2 SELECT query with inbuilt functions

I have the following SQL statement, where I read the database to get the records for one day. Here is what I tried in the pgAdmin console:
SELECT * FROM public.orders WHERE createdat >= now()::date AND type='t_order'
I want to convert this to psycopg2 syntax, but it throws an error:
Database connection failed due to invalid input syntax for type timestamp: "now()::date"
Here is what I am doing:
query = f"SELECT * FROM {table} WHERE (createdat>=%s AND type=%s)"
cur.execute(query, ("now()::date", "t_order"))
records = cur.fetchall()
Any help is deeply appreciated.
Do NOT use f-strings to pass values. Use proper parameter passing.
now()::date is better expressed as current_date; see Current Date/Time in the PostgreSQL docs.
You want:
query = "SELECT * FROM public.orders WHERE (createdat>=current_date AND type=%s)"
cur.execute(query, ["t_order"])
If you need dynamic identifiers (table or column names), then:
from psycopg2 import sql
query = sql.SQL("SELECT * FROM {} WHERE (createdat>=current_date AND type=%s)").format(sql.Identifier(table))
cur.execute(query, ["t_order"])
For more information, see the psycopg2 sql module documentation.
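To see why the original attempt failed: query parameters are always bound as values, never as SQL expressions, so passing the string "now()::date" sends PostgreSQL the literal text 'now()::date', which it then fails to cast to a timestamp. A rough, driver-free illustration of what the server effectively receives (the quoting below only mimics psycopg2's behavior):

```python
import datetime

# Simulate parameter binding: each value arrives as a quoted literal,
# never as an expression for the server to evaluate.
params = ("now()::date", "t_order")
bound_sql = "SELECT * FROM public.orders WHERE (createdat>='%s' AND type='%s')" % params
print(bound_sql)  # the server sees the text 'now()::date', not a function call

# If you prefer computing the cutoff client-side instead of using
# current_date, pass a real date object and let the driver adapt it:
cutoff = datetime.date.today()
```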

Databricks ODBC connection behaves differently between runtime 7.3 LTS and 9.1 LTS

A third-party application connects to a Databricks general-purpose cluster and fires some SQL queries. Everything worked fine on Databricks runtime 7.3 LTS, but after we upgraded the cluster to runtime 9.1 LTS, the WHERE clause in the incoming query suddenly no longer contained quotes around the string value.
This is the incoming query with the 9.1LTS runtime:
22/01/26 08:16:52 INFO SparkExecuteStatementOperation: Submitting query 'SELECT * FROM (SELECT `TimeColumn`,`ValueColumn` FROM `database`.`table1` WHERE `TimeColumn` >= { ts '2022-12-26 09:14:55' } AND `TimeColumn` <= { ts '2022-01-26 09:14:55' } AND **`CounterName`=28STO0004** ORDER BY `TagTimeStamp`) LIMIT_ZERO LIMIT 0'
This is the incoming query with the 7.3LTS runtime:
22/01/26 08:28:48 INFO SparkExecuteStatementOperation: Submitting query 'SELECT * FROM (SELECT C_79736572615f79736572615f74616774696d6576616c7565.`TimeColumn` AS C_0, C_79736572615f79736572615f74616774696d6576616c7565.`ValueColumn` AS C_43 FROM `database`.`table1` C_79736572615f79736572615f74616774696d6576616c7565 WHERE (**(C_79736572615f79736572615f74616774696d6576616c7565.`CounterName` = '28STO0004')** AND (C_79736572615f79736572615f74616774696d6576616c7565.`TimeColumn` >= TIMESTAMP '2022-12-26 09:14:55') AND (C_79736572615f79736572615f74616774696d6576616c7565.`TimeColumn` <= TIMESTAMP '2022-01-26 09:14:55')) ORDER BY C_0 ASC ) LIMIT_ZERO LIMIT 0'
The error we receive in the third-party application is shown in the attached image.
No changes were made to the third-party application. We also have no configuration options on the application, nor query diagnostics.
When we set up a Databricks SQL endpoint, we received the same error as with the Databricks runtime 9.1 LTS cluster.

Postgres data query using epoch timestamp as input from client?

I need to select all rows from table a that have updated_at newer than a given epoch timestamp, e.g. '1549312452'.
I am using Node.js. When the client sends that timestamp, the server converts it to a Date:
var date = new Date(timestampInt * 1000);
I then run the following query:
`select * from a where a.updated_at > ${date}`
When I hit the endpoint I get this error:
"syntax error at or near \"Feb\""
So, in general, how can I query records newer than a certain date in PostgreSQL if my incoming parameter is 1549312452?
You can pass the raw epoch value to to_timestamp():
select * from a where a.updated_at > to_timestamp(1549312452)
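For context on the original error: interpolating the JavaScript Date into the query string inserts its unquoted string form (something like Mon Feb 04 2019 ...), which is where PostgreSQL chokes on "Feb". With to_timestamp the conversion happens server-side instead. A small Python check, for illustration only, of what to_timestamp(1549312452) should evaluate to:

```python
from datetime import datetime, timezone

# to_timestamp(1549312452) yields this UTC instant on the server;
# computing it client-side here only to sanity-check the answer.
ts = datetime.fromtimestamp(1549312452, tz=timezone.utc)
print(ts.isoformat())  # 2019-02-04T20:34:12+00:00
```

In Node.js the query is best parameterized as well, e.g. passing the raw integer as a bind parameter to "select * from a where a.updated_at > to_timestamp($1)" with a driver such as node-postgres, so no string interpolation is needed at all.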

Error while connecting to DB2/IDAA using ADF V2

I am trying to connect to DB2/IDAA using ADF V2. While executing the simple query "select * from table", I get the error below:
Operation on target Copy data from IDAA failed: An error has occurred on the Source side. 'Type = Microsoft.HostIntegration.DrdaClient.DrdaException, Message = Exception or type' Microsoft.HostIntegration.Drda.Common.DrdaException 'was thrown. SQLSTATE = HY000 SQLCODE = -343, Source = Microsoft.HostIntegration.Drda.Requester, '
I have researched a lot and tried various options, but the issue persists.
I also tried the query "select * from table with ur" to run it read-only, but still get the error above.
If I use a query like select * from table; commit; then the activity succeeds but no records are fetched.
Does anyone have a solution?
My linked service is set up as shown. The additional connection properties value is: SET CURRENT QUERY ACCELERATION = ALL

I am getting TraceRetrievalException while reading data from Cassandra

I am trying to read particular data from the database and executing a query to get the result. I run the code in a while (true) loop so that the operation is performed repeatedly.
In some iterations I get a result, but in others the driver throws com.datastax.driver.core.exceptions.TraceRetrievalException and I cannot get the execution time for the query.
Cassandra version 3.11.4, DataStax Java driver version 3.7
Exception thrown :
com.datastax.driver.core.exceptions.TraceRetrievalException: Unable to retrieve complete query trace for id bf7902b0-5d26-11e9-befd-d97cd54dc732 after 5 tries
at com.datastax.driver.core.QueryTrace.doFetchTrace(QueryTrace.java:242)
at com.datastax.driver.core.QueryTrace.maybeFetchTrace(QueryTrace.java:176)
at com.datastax.driver.core.QueryTrace.getDurationMicros(QueryTrace.java:105)
at com.cassandra.datastax.DatastaxTestBootstrapper.lambda$0(DatastaxTestBootstrapper.java:227)
at java.util.Collections$SingletonList.forEach(Collections.java:4822)
at com.cassandra.datastax.DatastaxTestBootstrapper.readData(DatastaxTestBootstrapper.java:227)
at com.cassandra.datastax.MultipleThread.run(MultipleThread.java:13)
at java.lang.Thread.run(Thread.java:748)
Is there any way to overcome this exception, or to increase the number of retries used to fetch the complete query trace?
Statement statement1 = new SimpleStatement(
"SELECT * FROM keyspace.table where key='405861500500033'").enableTracing();
ResultSet resultSet = session.execute(statement1);
resultSet.getAllExecutionInfo()
.forEach(e -> System.out.println("time : " + e.getQueryTrace().getDurationMicros()));
