Scala Phantom Cassandra insert method returns empty ResultSet

I want to insert data into my table in Cassandra and then return the value of the "user_id" column instead of the full ResultSet. Here is a snippet of my code:
def create(user: User): Future[UUID] = {
  insert
    .value(_.id, user.id)
    .value(_.email, user.email)
    .value(_.name, user.name)
    .consistencyLevel_=(ConsistencyLevel.ALL)
    .future()
    .map(r => fromRow(r.one()).id)
}

def fromRow(r: Row): User = {
  User(id(r), email(r), name(r))
}
So future() returns a Future[ResultSet]. From the ResultSet I then try to retrieve a Row, map it to a User and finally take its id. Although the data was saved to my table, I got
ResultSet[ exhausted: true, Columns[]]
The columns of the ResultSet are empty, so r.one() returns null.
I haven't found any examples for this use case. Can phantom-dsl do something like Quill does?
val q = quote {
  query[Product].insert(lift(Product(0L, "My Product", 1011L))).returning(_.id)
}

In more recent versions of phantom, the create method is generated automatically. The fromRow method is also generated automatically, so you don't need to write it by hand.
Long story short, this is what you could use:
def create(user: User): Future[UUID] = {
  store(user)
    .consistencyLevel_=(ConsistencyLevel.ALL)
    .future()
    .map(_ => user.id)
}
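For reference, store comes from the table definition itself. A rough sketch of what such a table could look like, assuming a phantom 2.x-style schema (the exact column syntax varies slightly between phantom releases, so treat this as a sketch rather than the definitive definition):

import com.outworkers.phantom.dsl._

case class User(id: UUID, email: String, name: String)

abstract class Users extends Table[Users, User] {
  object id extends UUIDColumn with PartitionKey
  object email extends StringColumn
  object name extends StringColumn
}

With a table shaped like this, phantom derives both fromRow and store(user), which is why the create method above only has to map the completed insert back to user.id.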

Related

DRF: how to return multiple data entries inside the same serializer

I will try to make my problem as simple as possible:
I have this serializer:
class DatesSerializer(serializers.Serializer):
    date = serializers.CharField(max_length=10)
    ... plus a bunch of other fields and other serializers
and in my view.py I have this piece of code:
dates = ["2021-05-02", "2021-06-28", "2021-07-02"]
...
for date in dates:
    faults = DatesSerializer({
        "date": date,
        ...
    })
return Response({"faults": faults.data}, status=200)
I receive a response like this:
{
    "faults": {
        "date": "2021-07-02"
        ....
    }
}
What I wanted was a response like this one:
{
    "faults": {
        "date": "2021-07-02"
        ....
    },
    {
        "date": "2021-06-28"
        ....
    },
    {
        "date": "2021-05-02"
        ....
    }
}
I understand that in my loop I'm overwriting my serializer, which is why I only get the last entry. I tried to work around this by adding the results to a dict, but got nowhere since the key will always be the same, and I'm stuck on how to fix this.
What you want is not a valid JSON object. You want a list, and this can easily be accomplished in a loop by appending the serializer data on each iteration.
res = []
for date in dates:
    serializer = DatesSerializer({
        "date": date,
        ...
    })
    res.append(serializer.data)
return Response({"faults": res}, status=200)

Gatling REST API testing - retrieve a value from a JSON response, add it to a list, and iterate through the list

I am new to Gatling and I am trying to do performance testing for a couple of REST calls. In my scenario I need to extract a value from the JSON response of the first call and add it to a list while looping a few times. After the loop has filled the list, I want to reuse each value in my next REST call by iterating over the list. Can anyone please suggest how to implement this? I tried something like the code below:
var datasetIdList = List.empty[String]
val datasetidsFeeder = datasetIdList.map(datasetId => Map("datasetId" -> datasetId)).iterator

def createData() = {
  repeat(20) {
    feed("").exec(http("create dataset").post("/create/data").header("content-type", "application/json")
      .body(StringBody("""{"name":"name"}"""))
      .asJson.check(jsonPath("$.id").saveAs("userId")))
      .exec(session => {
        var usrid = session("userId").as[String].trim
        datasetIdList :+= usrid
        session
      })
  }
}

def upload() = feed(datasetidsFeeder).exec(http("file upload").post("/compute-metaservice/datasets/${datasetId}/uploadFile")
  .formUpload("File", "./src/test/resources/data/File.csv")
  .header("content-type", "multipart/form-data")
  .check(status is 200))

val scn = scenario("create data and upload").exec(createData()).exec(upload())

setUp(scn.inject(atOnceUsers(1))).protocols(httpConf)
I am seeing an exception that the ListFeeder is empty when I try to run the above script. Can someone please help?
Updated Code:
class ParallelcallsSimulation extends Simulation {

  var idNumbers = (1 to 50).iterator
  val customFeeder = Iterator.continually(Map(
    "name" -> ("test_gatling_" + idNumbers.next())
  ))

  val httpConf = http.baseUrl("http://localhost:8080")
    .header("Authorization", "Bearer 6a4aee03-9172-4e31-a784-39dea65e9063")

  def createDatasetsAndUpload() = {
    repeat(3) {
      // create dataset
      feed(customFeeder).exec(http("create data").post("/create/data").header("content-type", "application/json")
        .body(StringBody("""{ "name": "${name}","description": "create data and upload file"}"""))
        .asJson.check(jsonPath("$.id").saveAs("userId")))
        .exec(session => {
          val name = session("name").asOption[String]
          println(name.getOrElse("COULD NOT FIND NAME"))
          val userId = session("userId").as[String].trim
          println("%%%%% User ID ====>" + userId)
          val datasetIdList = session("datasetIdList").asOption[List[_]].getOrElse(Nil)
          session.set("datasetIdList", userId :: datasetIdList)
        })
    }
  }

  // File Upload
  def fileUpload() = foreach("${datasetIdList}", "datasetId") {
    exec(http("file upload").post("/uploadFile")
      .formUpload("File", "./src/test/resources/data/File.csv")
      .header("content-type", "multipart/form-data")
      .check(status is 200))
  }

  def getDataSetId() = foreach("${datasetIdList}", "datasetId") {
    exec(http("get datasetId")
      .get("/get/data/${datasetId}")
      .header("content-type", "application/json")
      .asJson.check(jsonPath("$.dlp.dlp_job_status").optional
        .saveAs("dlpJobStatus")).check(status is 200)
    ).exec(session => {
      val datastId = session("datasetId").asOption[String]
      println("request for datasetId >>>>>>>>" + datastId.getOrElse("datasetId not found"))
      val jobStatus = session("dlpJobStatus").asOption[String]
      println("JOB STATUS:::>>>>>>>>>>" + jobStatus.getOrElse("Dlp Job Status not Found"))
      println("Time: >>>>>>" + System.currentTimeMillis())
      session
    }).pause(10)
  }

  val scn1 = scenario("create multiple datasets and upload").exec(createDatasetsAndUpload()).exec(fileUpload())
  val scn2 = scenario("get datasetId").pause(100).exec(getDataSetId())

  setUp(scn1.inject(atOnceUsers(1)), scn2.inject(atOnceUsers(1))).protocols(httpConf)
}
I see the error below when I try to execute the above script:
[ERROR] i.g.c.s.LoopBlock$ - Condition evaluation crashed with message 'No attribute named 'datasetIdList' is defined', exiting loop
var datasetIdList = List.empty[String] defines a mutable variable pointing to an immutable list.
val datasetidsFeeder = datasetIdList.map(datasetId => Map("datasetId" -> datasetId)).iterator builds the feeder from that immutable (and at that point empty) list. Later reassignments of datasetIdList are irrelevant to datasetidsFeeder.
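A plain-Scala illustration of that point (hypothetical values, but the behaviour is exactly what the simulation runs into):
var xs = List.empty[String]
val it = xs.map(x => Map("datasetId" -> x)).iterator
xs :+= "some-id"
it.hasNext // false: the iterator was built from the old, empty list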
Mutating a global variable with your virtual user is usually not a good idea.
You can save the value into the user's session instead.
In the exec block, you can write:
val userId = session("userId").as[String].trim
val datasetIdList = session("datasetIdList").asOption[List[_]].getOrElse(Nil)
session.set("datasetIdList", userId :: datasetIdList)
Then you can use foreach to iterate them all without using a feeder at all.
foreach("${datasetIdList}", "datasetId") {
exec(http("file upload")
...
}
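Spelled out with the upload call from the question (trimmed to the essentials, same hypothetical paths), that could look like:
def fileUpload() = foreach("${datasetIdList}", "datasetId") {
  exec(http("file upload")
    .post("/uploadFile")
    .formUpload("File", "./src/test/resources/data/File.csv")
    .check(status is 200))
}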
You should put more work into your question.
Your code is not syntax-highlighted and is poorly formatted.
You said "I am seeing an exception that ListFeeder is empty", but the word "ListFeeder" does not appear anywhere in the code you posted.
You should post the full error message so that it's easier to see what went wrong.
In the documentation linked, there is a Warning. Quoted below:
Session instances are immutable!
Why is that so? Because Sessions are messages that are dealt with in a multi-threaded concurrent way, so immutability is the best way to deal with state without relying on synchronization and blocking.
A very common pitfall is to forget that set and setAll actually return new instances.
This is why the code in the updated question doesn't update the list.
session => {
  ...
  session.set("datasetIdList", userId :: datasetIdList)
  println("%%%% List =====>>>" + datasetIdList.toString())
  session
}
The updated session returned by set is simply discarded, and the original session is returned from the anonymous function instead.
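A minimal corrected sketch of that block, making the result of set the value the function returns:
exec(session => {
  val userId = session("userId").as[String].trim
  val datasetIdList = session("datasetIdList").asOption[List[_]].getOrElse(Nil)
  // set returns a NEW Session instance; return it instead of the original one
  session.set("datasetIdList", userId :: datasetIdList)
})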

How to check if any of the given queries returns any result in knex

I have two queries:
a) select id from ingredients where name = my_param;
b) select word_id from synonyms where name = my_param;
Both return 0 or 1 row. I can also add limit 1 if needed (or in knex first()).
I can translate each into knex like this:
knex("ingredients").select('id').where('name', my_param) //do we need first()?
knex("synonyms").select('word_id').where('name', my_param) //do we need first()?
I need a function called "ingredientGetOrCreate(my_param)". This function would
a) check whether any of the above queries returns a result
b) if one of them does, return ingredients.id or synonyms.word_id - only one of them can be returned
c) if the record doesn't exist in either table, do a knex insert and return the newly added id from the function
d) later on, I'm also not sure I understand how to call this newly created function.
The function ingredientGetOrCreate would be used later as a separate function or in the following scenario (a kind of "loop"), which doesn't work for me either:
knex("products") // for each product
.select("id", "name")
.map(function (row) {
var descriptionSplitByCommas = row.desc.split(",");
Promise.all(descriptionSplitByCommas
.map(function (my_param) {
// here it comes - call method for each param and do insert
ingredientGetOrCreate(my_param)
.then(function (id_of_ingredient) {
knex('ingredients_products').insert({ id_of_ingredient });
});
...
I am stuck with knex and Promise queries because of the asynchronous part. Any clues, please?
I thought I could somehow use Promise.all or Promise.some to call both queries.
P.S. This is my first day with Node.js, Promises and knex.
As far as I can decode your question, it consists of two parts:
(1) You need to implement upsert (get-or-create) logic.
(2) Your get part requires querying not a single table but a pair of tables, in a specific order. The table names imply that this is some sort of aliasing engine inside your application.
Let's start with (2). This can definitely be solved with two queries, just as you sense.
function pick_name (rows)
{
    if (! rows.length) return null
    // each query selects a single column (id, word_id, or the COALESCE alias below)
    return rows[0].id || rows[0].word_id || rows[0].name || null
}

// you can sequence the queries
function ingredient_get (name)
{
    return knex('ingredients')
        .select('id').where('name', name)
        .then(pick_name)
        .then(found =>
        {
            if (found) return found
            return knex('synonyms')
                .select('word_id').where('name', name)
                .then(pick_name)
        })
}

// or run them in parallel
function ingredient_get (name)
{
    var q_ingredients = knex('ingredients')
        .select('id').where('name', name)
        .then(pick_name)
    var q_synonyms = knex('synonyms')
        .select('word_id').where('name', name)
        .then(pick_name)
    return Promise.all([ q_ingredients, q_synonyms ])
        .then(([first, second]) =>
        {
            return first || second
        })
}
Important notes here:
Both forms work well and return the first match, or JS's null.
The first form minimizes the number of queries to the DB.
The second form optimizes response time.
However, you can go deeper and use more SQL. There's a special tool for this kind of task called COALESCE; consult your SQL documentation, for example COALESCE in PostgreSQL 9. The main idea of COALESCE is to return its first non-NULL argument, or NULL otherwise. You can leverage this to optimize both the query count and the response time.
function ingredient_get (name)
{
    // preparing but not executing both queries
    var q_ingredients = knex('ingredients')
        .select('id').where('name', name)
    var q_synonyms = knex('synonyms')
        .select('word_id').where('name', name)
    // put them in COALESCE
    return knex.raw('SELECT COALESCE(?, ?) AS name', [ q_ingredients, q_synonyms ])
        // note: some drivers wrap raw results (e.g. result.rows on PostgreSQL),
        // so you may need to unwrap them before picking the value
        .then(pick_name)
}
This solution guarantees a single query, and the DB engine can optimize execution in any way it sees fit.
Now let's solve (1). We now have ingredient_get(name), which resolves to the matching id or null. We can use its output to either trigger the create logic or return the existing value.
function ingredient_get_or_create (name, data)
{
    return ingredient_get(name)
        .then(found =>
        {
            if (found) return found
            // …do your insert logic here
            return knex('ingredients').insert({ name, ...data })
                // return something on the create path as well
                // (if the caller needs the newly generated id, use the insert's
                // returning option on dialects that support it)
                .then(() => name)
        })
}
Now ingredient_get_or_create does your desired upsert logic.
UPD1:
We already have ingredient_get_or_create, which resolves successfully in either scenario (get or create).
a) If you need to do any specific logic after that, you can just use then:
ingredient_get_or_create(…)
.then(() => knex('another_table').insert(…))
.then(/* another logic after all */)
In promise language that means "do this action (then) if the previous one (ingredient_get_or_create) succeeded". In most cases that is what you need.
b) To implement a for-loop with promises you have multiple idioms:
// use some form of parallelism
var qs = [ 'name1', 'name2', 'name3' ]
    .map(name =>
    {
        return ingredient_get_or_create(name, data)
    })
var q = Promise.all(qs)
Please note that this is aggressive parallelism: you'll run as many parallel queries as your input array provides.
If that's not desired, you need to limit the parallelism or even run the tasks sequentially. Bluebird's Promise.map is analogous to the example above but with a concurrency option available; consult the docs for details.
There's also Bluebird's Promise.mapSeries, which is conceptually an analogue of a for-loop with promises: it's like map, but runs sequentially. See the docs for details.
Promise.mapSeries([ 'name1', 'name2', 'name3' ],
(name) => ingredient_get_or_create(name, data))
.then(/* logic after all mapSeries are OK */)
I believe the last is what you need.

Linq Invalid Cast Exception Same Object Type

I wrote this query and as my understanding of the business rules has improved I have modified it.
In this most recent iteration I was testing to see if I indeed had some redundancy that could be removed. Let me first give you the query, then the error.
public List<ExternalForums> GetAllExternalForums(int extforumBoardId)
{
    List<ExternalForums> xtrnlfrm = new List<ExternalForums>();

    var query = _forumExternalBoardsRepository.Table
        .Where(id => id.Id == extforumBoardId)
        .Select(ExtForum => ExtForum.ExternalForums);

    foreach (ExternalForums item in query)
    {
        xtrnlfrm.Add(new ExternalForums { Id = item.Id, ForumName = item.ForumName, ForumUrl = item.ForumUrl });
    }

    return xtrnlfrm;
}
Now, in case it isn't obvious, the query select is returning a list of ExternalForums. I then loop through said list and add the items to another List<ExternalForums> object. This is the redundancy I was expecting to remove.
The precompiler was good to go, so I ran through it once to verify everything was kosher before refactoring, and ran into a strange error as I began the loop.
Unable to cast object of type 'System.Collections.Generic.HashSet<NamSpcA.NamSpcB.ExternalForums>' to type 'NamSpcA.NamSpcB.ExternalForums'.
Huh? They are the same object types.
So am I doing something wrong in the way I am projecting my select?
TIA
var query = _forumExternalBoardsRepository.Table
    .Where(id => id.Id == extforumBoardId)
    .Select(ExtForum => ExtForum.ExternalForums);
This query returns IEnumerable<T>, where T is the type of the ExtForum.ExternalForums property, which I would expect to be another collection, this time of ExternalForums. And the error message matches that, saying you have an IEnumerable<HashSet<ExternalForums>>.
If you need that collection of collections flattened into one big collection of ExternalForums, use SelectMany instead:
var query = _forumExternalBoardsRepository.Table
    .Where(id => id.Id == extforumBoardId)
    .SelectMany(ExtForum => ExtForum.ExternalForums);

Squeryl get value of serial

I insert a new row into a database and its id is auto-incremented ("serial"). How can I get the value of the id after insertion? Currently, I am using the following workaround:
inTransaction {
  Schema.table.insert(new Entry(
    content = "..."
  ))
  def entries = from(Schema.table)(e => select(e) orderBy(e.id desc)).page(0, 1)
  val id = entries.headOption match {
    case Some(entry) => entry.id
    case None => 0
  }
}
If there is no easier way, how can I ensure this entire block will be an atomic operation?
Yes:
val inserted = Schema.table.insert(myOriginalOld)
Console println inserted.id
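A minimal sketch of the same idea applied to the question's own Entry/Schema snippet: Squeryl returns the persisted entity with its generated key populated, and wrapping the call in inTransaction keeps the block atomic.
inTransaction {
  // insert returns the persisted Entry, with the auto-incremented id filled in
  val inserted = Schema.table.insert(new Entry(content = "..."))
  val id = inserted.id
}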
