I'm trying to get entries from IndexedDB by key and apply an offset.
I have some data in IndexedDB that looks like this:
[
  {id: 0, status: 'valid', message: 'some message....'},
  {id: 1, status: 'invalid', message: 'some message....'},
  {id: 2, status: 'warning', message: 'some message....'},
  {id: 3, status: 'valid', message: 'some message....'},
  {id: 4, status: 'valid', message: 'some message....'},
  {id: 5, status: 'valid', message: 'some message....'}
]
I have to filter the data by the 'status' key and render it with pagination, so if I have 20 entries with 'valid' status, I have to render only 15 and set up pagination. Everything is fine with the first page, but I can't build the correct IndexedDB request to get entries with the key 'valid' and an offset of 15. How can I do this? I've tried the code below, but db.getAllFromIndex doesn't take an offset, or I don't understand how to pass it. I'm using the 'idb' package: https://www.npmjs.com/package/idb
const db = await openDB('DB', 1, {
  upgrade(db) {
    if (!db.objectStoreNames.contains('dbName')) {
      db.createObjectStore('dbName').createIndex('status', 'status', { unique: false });
    }
    //...code for setting data here...
  }
});

async getData(filter, offset = 0) {
  const tx = db.transaction('dbName'),
    range = offset ? IDBKeyRange.lowerBound(offset, true) : null,
    items = filter === 'all'
      ? await db.getAll('dbName', range, rules.limit) // everything is ok with plain data, without filtering by key
      : await db.getAllFromIndex('dbName', 'status', filter, rules.limit), // bug is here! there should be an offset
    pagination = ...some code for defining pagination...
  await tx.done;
  return { items, pagination };
}
So how can I do this correctly? Thanks!
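One approach that seems to fit here (a sketch only, reusing the 'dbName' store and 'status' index from above, not tested against this exact setup) is to skip getAllFromIndex for the filtered case and walk the index with a cursor, jumping over the first offset matches with advance():

// db is the database returned by openDB('DB', 1, ...) above;
// pageSize corresponds to rules.limit in the original code.
async function getPageByStatus(db, status, offset = 0, pageSize = 15) {
  const tx = db.transaction('dbName');
  const index = tx.store.index('status');
  const items = [];

  // Cursor over only the records whose status matches the filter.
  let cursor = await index.openCursor(IDBKeyRange.only(status));

  // Jump over the first `offset` matching records in one step.
  if (cursor && offset > 0) {
    cursor = await cursor.advance(offset);
  }

  // Collect one page of records.
  while (cursor && items.length < pageSize) {
    items.push(cursor.value);
    cursor = await cursor.continue();
  }

  await tx.done;
  return items;
}

getAll/getAllFromIndex have no offset parameter in IndexedDB itself, so a cursor jump like this is the usual substitute.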
I'm trying to update a field in a table that is also related to other tables, but it throws this error: "An operation failed because it depends on one or more records that were required but not found. Record to update not found."
const deleteBook = await prisma.aca_book_list.update({
  where: {
    id: productId,
  },
  data: {
    is_active: false,
  },
})
await BasicResponse(res, 1, 'Book Deleted', deleteBook)
if (!deleteBook) {
  await BasicResponse(res, 0, 'Book Not Found to Delete', [])
  return
}
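For context, prisma.aca_book_list.update throws a PrismaClientKnownRequestError with code P2025 when no matching row exists, instead of returning something falsy, so the if (!deleteBook) branch never runs. A sketch of catching that error instead, assuming the same BasicResponse helper and productId as in the snippet above:

import { Prisma } from '@prisma/client'

try {
  const deleteBook = await prisma.aca_book_list.update({
    where: { id: productId },
    data: { is_active: false },
  })
  await BasicResponse(res, 1, 'Book Deleted', deleteBook)
} catch (e) {
  // P2025 = "Record to update not found"
  if (e instanceof Prisma.PrismaClientKnownRequestError && e.code === 'P2025') {
    await BasicResponse(res, 0, 'Book Not Found to Delete', [])
    return
  }
  throw e
}

Alternatively, updateMany with the same where clause returns a { count } object rather than throwing, which can be checked directly.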
I am trying to filter events by date. I thought this was an issue with the input from the frontend, but a hardcoded date doesn't return anything either. The response is:
count: 1 // so it knows that there is something that fits the where clause
result: []
This is the service code:
async findAllByDate(
  min: string | undefined,
  max: string | undefined,
  skip: number,
) {
  const [result, total] = await this.repository.findAndCount({
    take: 10,
    skip: skip,
    where: {
      date: MoreThanOrEqual('2022-06-20'),
    },
  });
  console.log(result);
  return {
    data: result,
    count: total,
  };
}
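One thing worth noting: findAndCount applies take/skip only to the rows it returns, while the count covers every row matching the where clause, so a skip value greater than or equal to the number of matches produces exactly this shape (count: 1, result: []). A sketch of the same method with skip defaulted to 0 and the bound passed as a Date (assuming a date/timestamp column and the same injected repository):

import { MoreThanOrEqual } from 'typeorm';

async findAllByDate(
  min: string | undefined,
  max: string | undefined,
  skip = 0, // default to 0; a skip >= the number of matches returns an empty result while count stays the same
) {
  const [result, total] = await this.repository.findAndCount({
    take: 10,
    skip,
    where: {
      // comparing against a Date object is safer for date/timestamp columns than a raw string
      date: MoreThanOrEqual(new Date(min ?? '2022-06-20')),
    },
  });
  return { data: result, count: total };
}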
I have an import job that runs once a week and inserts all the records from MongoDB into Elasticsearch.
What I am doing is the following:
1) Records already exist in the 'main' index
2) I insert all the new records into the 'main-temp' index
3) I delete the 'main' index
4) I reindex 'main-temp' into 'main'
5) I delete the 'main-temp' index
I am running the operation locally on the same data set.
What I am noticing is that the number of records in the new 'main' index does not match the number of records that were imported into the 'main-temp' index.
Here is the code that I am using:
try {
  await client.indices.delete({ index: 'main' })
  Logger.info('Old Index Deleted')

  await client.indices.create({ index: 'main' })
  Logger.info('New Index Created')

  await client.reindex({
    waitForCompletion: true,
    refresh: true,
    body: {
      source: {
        index: 'main-temp'
      },
      dest: {
        index: 'main'
      }
    }
  })
  Logger.info('Temp Index Reindexed/Cloned')

  await client.indices.delete({ index: 'main-temp' })
  Logger.info('Temp Index Deleted')
} catch (e) {
  Logger.error(e)
}
I am using Elasticsearch 6.8.9, so I can't use the Clone API since it is part of 7.x.
Check the screenshot below for the results; the thing is, every time it reindexes, the number of records is different (usually a few thousand smaller).
https://i.stack.imgur.com/g1u0J.png
UPDATE: Here is what I get from reindex as the response (if I do let result = await ):
Sometimes it gets the correct number, sometimes not.
{
  took: 22357,
  timed_out: false,
  total: 673637,
  updated: 0,
  created: 673637,
  deleted: 0,
  batches: 674,
  version_conflicts: 0,
  noops: 0,
  retries: { bulk: 0, search: 0 },
  throttled_millis: 0,
  requests_per_second: -1,
  throttled_until_millis: 0,
  failures: []
}
I fixed this by introducing timeouts after deleting/recreating the old index and after reindexing.
Here is the code:
try {
  await client.indices.delete({ index: 'main' })
  Logger.info('Old Index Deleted')

  await client.indices.create({ index: 'main' })
  Logger.info('New Index Created')

  await new Promise(resolve => setTimeout(resolve, 10000))

  await client.reindex({
    waitForCompletion: true,
    refresh: true,
    body: {
      source: {
        index: 'main-temp'
      },
      dest: {
        index: 'main'
      }
    }
  })

  await new Promise(resolve => setTimeout(resolve, 15000))
  Logger.info('Temp Index Reindexed/Cloned')

  await client.indices.delete({ index: 'main-temp' })
  Logger.info('Temp Index Deleted')
} catch (e) {
  Logger.error(e)
}
It seems Elasticsearch needs some time before the new index is fully ready.
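A possibly more deterministic alternative to fixed sleeps (a sketch only, assuming the same legacy 6.x JavaScript client as above) is to wait on cluster health for the new index and force a refresh before counting its documents:

// Sketch: explicit waits instead of setTimeout delays.
await client.indices.create({ index: 'main' })

// wait until the new index has its primary shards allocated
await client.cluster.health({ index: 'main', waitForStatus: 'yellow' })

await client.reindex({
  waitForCompletion: true,
  refresh: true,
  body: {
    source: { index: 'main-temp' },
    dest: { index: 'main' }
  }
})

// force a refresh so a subsequent count/search sees every reindexed document
await client.indices.refresh({ index: 'main' })
const { count } = await client.count({ index: 'main' })
Logger.info(`'main' now holds ${count} documents`)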
My simple problem is:
I have a Mongoose object on the server side:
...
item = {
  name: "Test",
  id: 1
}
// item is a document of a Mongoose schema
// id and name are defined in the model (String and Number)
Then I add a new field, mention, into item:
item.mention = [{ id: 1, ... }]
But I can't get mention on the client side.
My response code:
res.json({ status: 1, message: 'success', data: item })
The response was data: { name: "Test", id: 1 }
I don't want to add mention to my Mongoose schema.
So, what's my problem?
How can I fix that?
Thanks!
You can convert your Mongoose document to a plain object first, then add your additional field.
Something like this:
let o = item.toObject();
o.mention = [{ id: 1, ... }];
res.json({ status: 1, message: 'success', data: o })
You could also just put this additional data in your response:
res.json({ status: 1, message: 'success', data: item, mention: [...] })
Try:
item = JSON.parse(JSON.stringify(item));
before assigning the new prop to item.
Now you can assign a value to the new prop: item.mention = some_value;
This gives you a plain copy of the object that you can work with.
You cannot assign a new prop to a Mongoose document when that prop has not been defined in the schema.
The problem is that item is a Mongoose document, not a plain JavaScript object.
There are multiple ways to achieve what you want.
1) Using .lean()
Documents returned from queries with the lean option enabled are plain
javascript objects, not MongooseDocuments
Model.findOne().lean().exec((err, item) => {
  item.mention = [{ id: 1 }];
  res.json(item);
});
This option also improves performance, since it skips the overhead of hydrating a full Mongoose document.
2) Using .set()
item.set('mention', [{ id: 1}], { strict: false });
3) Using the spread operator (spreading item.toObject() so you get the document's fields rather than its internals)
res.json({
  status: 1,
  message: 'success',
  data: { mention: [{ id: 5 }], ...item.toObject() }
})
4) Using Object.assign, as @Alexis Facques explained.
Taking advantage of Object.assign() should resolve your problem too.
res.json({
  status: 1,
  message: 'success',
  data: Object.assign({ mention: [{ id: 1, ... }] }, item.toObject())
})
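For completeness, a minimal async/await version of option 1, assuming an Express handler and a model named Item (both hypothetical names):

// Hypothetical Express route; Item stands in for your actual Mongoose model.
app.get('/items/:id', async (req, res) => {
  // .lean() returns a plain object, so extra fields can be attached freely
  const item = await Item.findById(req.params.id).lean();
  item.mention = [{ id: 1 }];
  res.json({ status: 1, message: 'success', data: item });
});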