Error with tokio-postgres and database query in Rust

I am working on a Rust project that uses tokio-postgres.
I have a table named DATA with two columns, "id" and "opt". The query takes two inputs: one is a String and the other is an Option.
Following tokio-postgres conventions, I call the client as shown in the code below. I use COALESCE to handle the case where the second parameter is NULL, which keeps the SQL query compact.
let rows = client.query("SELECT * FROM DATA WHERE id = $1 AND opt = COALESCE(opt, $2)", &[&input1, &input2]).await?;
I am getting this error when I run cargo build.
expected `&dyn ToSql + Sync`, found enum `Result`
How can this error be resolved on the Rust side?

The result from client.query is of type Result. Before working with the actual rows, you'll need to retrieve them from the Result type and deal with any errors that might have occurred.
See the Rust Book section on error handling for details on how to properly separate successful from erroneous results.
You can find the full function signature on docs.rs, which shows that the return type is Result<Vec<Row>, Error>.
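For illustration, a minimal sketch of both ways of unwrapping that Result, assuming an async context, a connected tokio-postgres client, and the input1/input2 bindings from the question:

// Either propagate the error with `?` (the enclosing function must return a compatible Result) ...
let rows = client
    .query("SELECT * FROM DATA WHERE id = $1 AND opt = COALESCE(opt, $2)", &[&input1, &input2])
    .await?;

// ... or handle both cases explicitly with a match.
match client
    .query("SELECT * FROM DATA WHERE id = $1 AND opt = COALESCE(opt, $2)", &[&input1, &input2])
    .await
{
    Ok(rows) => {
        for row in rows {
            // the column name and type are assumptions based on the question; adjust as needed
            let id: String = row.get("id");
            println!("found id {}", id);
        }
    }
    Err(e) => eprintln!("query failed: {}", e),
}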

Related

In Azure Data Factory, how do I pass the Index of a ForEach as a parameter properly

Sorry if this is a bit vague or rambly, I'm still getting to grips with Data Factory and a lot of it seems a bit obtuse...
What I want to do is query my Cosmos Database for a list of Ids of records that need to be updated. For each of these records, I want to call a REST API using the Id (i.e. /Record/{Id}/Details)
I've created a Data Flow that took a string as a parameter and then called the REST API fine.
I then made a pipeline using a Lookup with a query (select c.RecordId from c where...) and pass that into a ForEach with items set to @activity('Lookup1').output.value
I then set up the Activity of the ForEach to my Data Flow. From research, I think I'm supposed to set the parameter value to "@item().RecordId", but that gives an error "parameter [name] does not match parameter type 'string'".
I can change the type of the parameter to any (and use toString([parameter]) to cast it), and then when I try to debug it passes the parameter in, but it gives an error of "Job failed due to reason: at (Line 2/Col 14): Datatype any not found".
I'm not sure what the solution is. Is there a way to cast the result of the lookup to an integer or string? Is there a way to narrow an any down? Is there a better way than toString() that would work? Is there a better way than ForEach?
I tried to reproduce a scenario similar to what you are trying.
My sample data in Cosmos:
To query the Cosmos database for a list of Ids and call a REST API using the Id for each of these records:
First, I used a Lookup activity in Data Factory and selected the ids where the last_name is Bluth.
Its output and settings are as below:
Then I passed the output of the Lookup activity to a ForEach activity.
Inside the ForEach activity I created a Data Flow activity, and for its data source I used the REST API. My REST API to call a specific user is https://reqres.in/api/users/2, so I gave the base URL as https://reqres.in/api/users.
Then I created a parameter called demoId with datatype string, and in the relative URL I gave the dynamic value @dataset().demoId.
After this I set the source parameter value to @item().id, since after https://reqres.in/api/users only the id needs to be provided to get the data. In your case you can try Record/@{item().id}/Details.
For each id it successfully passes the id to the REST API and fetches the data.

Why is knex.js allowing me to insert data whose type is different from its migration?

I've been working with knex.js over the last few days, and I noticed that even though I specified some columns' types, I can insert whatever type I want.
For example, I have this table:
exports.up = function (knex) {
  return knex.schema
    .createTable('users', function (table) {
      table.increments('id').primary();
      table.string('login', 255).notNullable();
      table.string('password', 255).notNullable();
      table.string('user_type', 255);
    });
};
Even though string is the type for login, password and user_type, I can insert integers and no error is thrown. Wasn't it supposed to throw an error for that?
This is database-driver-dependent functionality. Knex really doesn't care what parameters you pass to a query; it does not know anything about the schema you have created.
So when you pass dates or numbers, it is the DB driver that may or may not convert the type, for example from integer to string.
Since databases do not work with JavaScript types, the DB driver always needs to do some kind of conversion from a value's JS representation to its DB representation, so some automatic type conversions are expected.
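For illustration, a minimal sketch of what that looks like in practice (assuming the users table from the migration above and an already configured knex instance; the exact outcome depends on your database and driver):

// knex does no client-side type validation, so this insert is handed to the driver as-is,
// even though login, password and user_type were declared as strings in the migration.
knex('users')
  .insert({ login: 12345, password: 67890, user_type: 1 })
  .then(() => console.log('insert accepted'))
  .catch((err) => console.error('the database rejected it:', err));
// Whether it succeeds is up to the database/driver: PostgreSQL, for instance, will often
// coerce the numbers to text for a varchar column, while a stricter setup may reject them.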

How to do a Gremlin containing search for both number and string

Neptune 1.0.2.1 + Gremlin + nodejs.
I have a vertex and a property, e.g. vertex - Device, property - Test. The Test property can store different types of data, e.g. number and string:
Vertex 1 - Test = ['ABCD','xyz']
Vertex 2 - Test = [123,'XYZ']
I want to do a 'containing' search, e.g. Test=A or Test=123, regardless of the datatype.
I was trying
queryText = 'BC' // this throws an error
or queryText = 123 // this actually works
// I expect both cases to hit the result.
g.V().hasLabel('Device').or(__.has('Test', parseFloat(queryText)), __.has('Test', textP.containing(queryText)));
but I get an 'InternalFailureException' error.
Is it possible to write a single query regardless of the datatype?
If that's not possible, can I at least make textP.containing work with multiple queries, assuming I know the datatype? Right now the containing search throws an error if the property contains a number.
It looks like you have the closing bracket in the wrong place inside the or() step. You need to close the first has step before the comma.
In your example
g.V().hasLabel('Device').or(__.has('Test', parseFloat(queryText), __.has('Test', textP.containing(queryText))));
Which should be
g.V().hasLabel('Device').or(__.has('Test', parseFloat(queryText)), __.has('Test', textP.containing(queryText)));
EDITED and UPDATED
With the corrected query and additional clarification about the data model containing different types for the same property key, I was able to reproduce what you are seeing. However, the same behavior can be seen using TinkerGraph as well as Neptune. The error message generated is a little different but the meaning is the same. Given that TinkerGraph behaves the same way, I am of the opinion that Neptune is behaving consistently with the "reference" implementation. That said, this raises a question as to whether the TextP predicates should be smarter and check the type of the property before attempting the test.
gremlin> graph = TinkerGraph.open()
==>tinkergraph[vertices:0 edges:0]
gremlin> g = graph.traversal()
==>graphtraversalsource[tinkergraph[vertices:0 edges:0], standard]
gremlin> g.addV('test').property('x',12.5)
==>v[0]
gremlin> g.addV('test').property('x','ABCDEF')
==>v[2]
gremlin> g.V().hasLabel('test').or(has('x',12.3),has('x',TextP.containing('CDE')))
java.math.BigDecimal cannot be cast to java.lang.String
Type ':help' or ':h' for help.
Display stack trace? [yN]
ADDITIONAL UPDATE
I created a Jira issue so the Apache TinkerPop community can consider making a change to the TextP predicates.
https://issues.apache.org/jira/browse/TINKERPOP-2375

node-postgres: how to prepare a statement without executing the query?

I want to create a "prepared statement" in postgres using the node-postgres module. I want to create it without binding it to parameters because the binding will take place in a loop.
In the documentation I read:
query(object config, optional function callback) : Query
If _text_ and _name_ are provided within the config, the query will result in the creation of a prepared statement.
I tried
client.query({"name":"mystatement", "text":"select id from mytable where id=$1"});
but when I try passing only the text & name keys in the config object, I get an exception:
(translated) message is binding 0 parameters but the prepared statement expects 1
Is there something I am missing? How do you create/prepare a statement without binding it to specific values, in order to avoid re-preparing the statement in every step of a loop?
I just found an answer on this issue by the author of node-postgres.
With node-postgres the first time you issue a named query it is
parsed, bound, and executed all at once. Every subsequent query issued
on the same connection with the same name will automatically skip the
"parse" step and only rebind and execute the already planned query.
Currently node-postgres does not support a way to create a named,
prepared query and not execute the query. This feature is supported
within libpq and the client/server protocol (used by the pure
javascript bindings), but I've not directly exposed it in the API. I
thought it would add complexity to the API without any real benefit.
Since named statements are bound to the client in which they are
created, if the client is disconnected and reconnected or a different
client is returned from the client pool, the named statement will no
longer work (it requires a re-parsing).
You can use pg-prepared for that:
var prep = require('pg-prepared')
// First prepare the statement without binding parameters
var item = prep('select id from mytable where id=${id}')
// Then execute the query and bind parameters in the loop
for (const id of [1, 2, 3]) {
  client.query(item({id: id}), function(err, result) {...})
}
Update: Reading your question again, here's what I believe you need to do. You need to pass a "values" array as well.
Just to clarify: where you would normally "prepare" your query, just prepare the object you pass to it, without the values array. Then, where you would normally "execute" your query, set the values array in the object and pass it to the query. If it's the first time, the driver will do the actual prepare for you the first time around, and simply do binding and execution for the rest of the iterations.
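For illustration, a minimal sketch of that pattern (the statement name and query text come from the question; assembling the config with a spread is just one way to do it):

// Prepare the config object once, without values.
const statement = { name: 'mystatement', text: 'select id from mytable where id=$1' };

for (const id of [1, 2, 3]) {
  // Reuse the same named statement: node-postgres parses it on the first call on this
  // connection, and later calls with the same name only bind and execute.
  client.query({ ...statement, values: [id] }, function (err, result) {
    // handle err / result here
  });
}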

Subsonic - Bit operation in Where Clause

I'm trying to make something like this:
int count = new Select().From(tblSchema).Where("Type & 1").IsEqualTo("1").GetRecordCount();
And the error message is:
Incorrect syntax near '&'.
Must declare the scalar variable "@Deleted".
Is it possible to do that with SubSonic?
Must declare the scalar variable
"@Deleted"
The second error would be caused by using logical deletes on the table you are querying (the table has an isDeleted or Deleted column).
But looking through the code, I'm not sure how that parameter is getting in there. The SqlQuery.GetRecordCount method doesn't call CheckLogicalDelete(), from what I can tell. Is that error message unrelated?
This seems to be a bug in the way SubSonic is naming its parameters when it generates the SQL to be executed.
What's happening is that SubSonic is looking at "Type & 1" and then creating a parameter to compare against called @Type&10, which is not a valid SQL parameter name. So you'll end up with the following SQL from your original query. You should submit a bug to http://code.google.com/p/subsonicproject/
Meanwhile you can workaround the bug for now by using an inline query:
http://subsonicproject.com/docs/Inline_Query_Tool
It is a little fuzzy as to what you are trying to accomplish, but here is a best guess.
int count = new Select().From(tbl.Schema).Where(tbl.TypeColumn).IsEqualTo(true).GetRecordCount();
