I have a big collection with this schema:
{
stamp: 1650449324356,
value: 3434.4
}
Now I want to check whether there are values for a certain time range at a certain interval (15 minutes), so I build a query like this:
// define a 15min-divider
let divider = 15 * 60 * 1000
// get time-range in ms
let stampFrom = new Date("2022-01-01").getTime() + (new Date("2022-01-01").getTimezoneOffset() * 60 * 1000)
let stampTo = new Date("2022-02-01").getTime() + (new Date("2022-02-01").getTimezoneOffset() * 60 * 1000)
let timeScala = stampTo - stampFrom
// generate query-options
let quarterStamps = []
for(let i = 0; i < (timeScala / divider); i++) {
// generate a range from each stamp to stamp + 1 min to check whether a value exists between them
quarterStamps.push({
stamp: {
"$gte": stampFrom + (i * divider),
"$lt": stampFrom + (i * divider) + 60000
}
})
}
// build aggregation-query
let options = [
{
$match: {
$and: [
{ $or: quarterStamps },
{ my_id: "blah" }
]
}
},
{
"$sort": { "stamp": 1}
}
]
I then run a MongoDB aggregate() and it works fine, but:
A) The execution time for the Mongo query is about 4 seconds when I query a time range of 4 days in 15-minute chunks. OK, I am on my local machine and the MongoDB is on a remote server, but shouldn't it be faster anyway? What can I do?
B) I only get results where something was found in a $gte/$lt range; when nothing was found, there is no "placeholder" entry in the result, but I need one. Right now I have to iterate the whole result and rewrite it. Can I somehow avoid that, maybe in MongoDB itself, so that when no entry is found it returns null or something?
C) There can be the case where there are multiple results in a $gte/$lt range; I then need to filter (group?) them down to what I want:
only the last occurrence of a value in a range
only the first occurrence of a value in a range
the avg value of all occurrences in a range
I would like to achieve as much as possible in the Mongo query itself, so I don't have to iterate the results a million times in Node.js.
Is that possible, or maybe just a part of it? Maybe someone can give me a kick in the right direction?
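For illustration only (my sketch, not part of the original question): a single range $match plus a computed bucket key usually helps with A, and $group with $first/$last/$avg covers C in one pass, assuming a compound index on { my_id: 1, stamp: 1 } and reusing stampFrom, stampTo and divider from above. Note this groups everything inside each quarter hour, not just its first minute, and empty buckets (B) still produce no output row; filling those needs $densify (MongoDB 5.1+) or client-side padding.
let sketchOptions = [
    // one range match instead of thousands of $or clauses; a compound
    // index on { my_id: 1, stamp: 1 } lets this stage use an index scan
    { $match: { my_id: "blah", stamp: { $gte: stampFrom, $lt: stampTo } } },
    // compute which 15-minute bucket each document falls into
    { $addFields: { bucket: { $floor: { $divide: [{ $subtract: ["$stamp", stampFrom] }, divider] } } } },
    // sort by stamp so $first/$last below are well-defined
    { $sort: { stamp: 1 } },
    // per bucket: first, last and average value in one pass
    {
        $group: {
            _id: "$bucket",
            first: { $first: "$value" },
            last: { $last: "$value" },
            avg: { $avg: "$value" }
        }
    },
    { $sort: { _id: 1 } }
]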
I'm currently trying to get info from a randomly selected object. I believe the real problem is that what I wrote is being treated as a plain string rather than as the variable naming an existing object; I don't know if this is a clear message or not.
Example of what I have tried:
let PK = ["Random1", "Random2", "Random3"]
let PKS = Math.floor(Math.random() * PK.length)
let Random1 = {
name: "Random1",
number: "010"
}
let Random2 = {
name: "Random2",
number: "011"
}
if(message.content.toLowerCase() == "random"){
message.channel.send(PK[PKS].number)
}
Another thing I have tried is using ${} in a template string. These are the results:
"Random1.number" or "Random2.number", when what I was looking for is actually "010" or "011".
You should wrap your objects inside a collection such as an array, and then compare the value from your random selection to the values found in the collection (here, randoms):
let PK = ["Random1", "Random2", "Random3"];
let PKS = Math.floor(Math.random() * PK.length);
const randoms = [
{
name: "Random1",
number: "010",
},
{
name: "Random2",
number: "011",
},
];
if (message.content.toLowerCase() == "random") {
const findRandom = randoms.find((v) => v.name === PK[PKS]);
if (findRandom) {
message.channel.send(findRandom.number);
}
}
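As a follow-up sketch (my assumption, not part of the original answer): since the names are unique, a plain object keyed by name would avoid the linear find() entirely; randomsByName below is a hypothetical rework of the same data.
// hypothetical alternative: index the objects by name for direct lookup
const randomsByName = {
  Random1: { name: "Random1", number: "010" },
  Random2: { name: "Random2", number: "011" },
};

if (message.content.toLowerCase() === "random") {
  const picked = randomsByName[PK[PKS]]; // undefined for names with no entry, e.g. "Random3"
  if (picked) {
    message.channel.send(picked.number);
  }
}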
I have a DynamoDB table with the following items
{
"jobId":<job1>,
"cron" : "* 5 * * *"
},
{
"jobId":<job2>,
"cron" : "* 8 * * *"
}
I need to scan for items whose next execution time, based on the cron string, is within the next 5 minutes of the current time.
Is there a way I can convert the cron to a valid next execution time while scanning?
I am using node.js in AWS Lambda and cron-parser npm library to extract next_execution time from cron string.
Note that scanning the full table will slow down over time. You may want to consider some other data store or structure to store this data.
That said, something like this could work:
// assuming the AWS SDK v2 DocumentClient, since items are read as plain objects
const AWS = require('aws-sdk');
const cronParser = require('cron-parser');
const client = new AWS.DynamoDB.DocumentClient();

const results = await client.scan({ TableName: 'tableName' }).promise();
const cronItems = results.Items;
const intervals = cronItems.map((item) => {
return cronParser.parseExpression(item.cron);
});
const now = new Date();
const fiveMinMillis = 300 * 1000;
const within5Mins = intervals.filter((interval) => {
const timeUntil = interval.next().valueOf() - now.valueOf();
return timeUntil < fiveMinMillis;
});
Note you will actually need to call scan(...) iteratively until the response no longer includes a LastEvaluatedKey attribute; see the DynamoDB documentation for details.
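A minimal sketch of that pagination loop, assuming the same client as above:
// page through the table until DynamoDB stops returning a LastEvaluatedKey
async function scanAll(tableName) {
  const items = [];
  let lastKey;
  do {
    const params = { TableName: tableName };
    if (lastKey) params.ExclusiveStartKey = lastKey;
    const page = await client.scan(params).promise();
    items.push(...page.Items);
    lastKey = page.LastEvaluatedKey;
  } while (lastKey);
  return items;
}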
if (req.body.positionDetails && req.body.positionDetails.length > 0) {
let total = req.body.positionDetails.length;
for (let i = 0; i < total; i++) {
db.query('SELECT * FROM position WHERE position <> $1',[req.body.positionDetails[i].position],function(err,p) {
console.log(p.rows)
});
}
}
It selects all the values from the database without applying the condition. How do I solve this?
The data looks like
"positionDetails":[{"position":"manager"},{"position":"developer"}] and it comes from Postman.
Your prepared statement looks off to me. Try using ? as a placeholder for the position in your query.
db.query(
'SELECT * FROM position WHERE position <> ?', [req.body.positionDetails[i].position],
function(err, p) {
console.log(p.rows)
});
If this fixes the problem, then I suspect you were comparing the position against the literal string $1, and every comparison failed, resulting in every record appearing in the result set.
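A side note and sketch of mine (an assumption, not part of the original answer): if this is node-postgres (pg), $1 is actually the correct placeholder, and the symptom may instead come from running one <> query per position, since each query excludes only its own value. A single parameterized query that excludes all submitted positions at once would look like this:
// hypothetical: exclude every submitted position in one pg query
const positions = req.body.positionDetails.map((d) => d.position);
db.query(
  'SELECT * FROM position WHERE position <> ALL($1)',
  [positions], // pg serializes a JS array as a Postgres array parameter
  function (err, p) {
    if (err) return console.error(err);
    console.log(p.rows);
  }
);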
I have been stuck with this problem for a month.
I've been trying to use Firebase to create a FIFO inventory. I am using a Firebase Cloud Function to update the FIFO inventory. However, if I stress test the following code just 10 times with a for loop for both insert (push) and remove (pop), it breaks because of the concurrent updates.
Does anyone have another solution for this?
Insert FIFO/PUSH:
let fifoRef = admin.database().ref('fifo/' + item.itemId + '/').push();
let fifo = {
price: data.items[uniqueId].price,
in: data.items[uniqueId].quantity,
quantity: data.items[uniqueId].quantity,
}
fifoRef.set(fifo);
Get FIFO Value/POP (Here I simply update the quantity for POP):
// get fifo
let fifoReference = 'fifo/' + item.itemId;
let fifoRef = admin.database().ref(fifoReference);
fifoRef.once('value').then(currentData => {
let fifo = currentData.val();
for (let key in fifo) {
let val = fifo[key];
if (val.quantity > 0) {
// get fifo quantity
let fifoQuantityRef = admin.database().ref(fifoReference + '/' + key + '/quantity/');
// get local cache value
let fifoQuantityListener = fifoQuantityRef.on('value', () => {
// transaction start
fifoQuantityRef.transaction(function (quantity) {
if (quantity) {
if (quantity > 0 && quantitySubtotal > 0) {
if (quantity >= quantitySubtotal) {
// minus inventory amount
let amount = calculator(quantitySubtotal + "*" + val.price);
quantity = calculator(quantity + "-" + quantitySubtotal);
quantitySubtotal = 0;
// update quantity
return quantity;
} else {
let amount = calculator(quantity + "*" + val.price);
quantitySubtotal = calculator(quantitySubtotal + "-" + quantity);
return 0;
}
}
}
return quantity;
}, (error, committed, result) => {
fifoQuantityRef.off('value', fifoQuantityListener);
}, true);
});
Brainstorming:
I just need insight on how to get the VALUE using FIFO. From my understanding, Firebase is best used for inserts and removes, not transactions. But if I only use insert and remove, how do I create a FIFO? If I create FIFO entries with a quantity of 1 each, the data stored will be too large.
I did try Google Datastore; however, Datastore persistence is extremely slow (over 2 seconds per write), which makes it unusable together with Firebase, whose persistence takes less than 1 second. The problem arises when a PUSH and a POP happen within 1 second: the Datastore insert is not persisted yet.
Any other brainstorming ideas?
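For illustration only (my sketch, not from the original post): one way to avoid the concurrent per-key transactions is to run a single transaction over the whole fifo/<itemId> node, so one pop walks all layers atomically; popFifo and requested are hypothetical names.
// hypothetical: pop `requested` units atomically across all FIFO layers
function popFifo(itemId, requested) {
    const fifoRef = admin.database().ref('fifo/' + itemId);
    return fifoRef.transaction((fifo) => {
        if (!fifo) return fifo; // nothing stored yet; commit a no-op
        let remaining = requested;
        // push() keys sort chronologically, so sorting them keeps the walk FIFO
        for (const key of Object.keys(fifo).sort()) {
            if (remaining <= 0) break;
            const take = Math.min(fifo[key].quantity, remaining);
            fifo[key].quantity -= take;
            remaining -= take;
        }
        return fifo;
    });
}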
I want to get all the records of a particular record type, but I got only 1000.
This is the code I used.
function getRecords() {
return nlapiSearchRecord('contact', null, null, null);
}
I need two pieces of code:
1) Get all the records at a single time
2) Get the records page-wise by passing a pageIndex argument to getRecords [1st => 0-1000, 2nd => 1000-2000, ...]
function getRecords(pageIndex) {
.........
}
Thanks in advance
You can't get all the records at once. However, you can sort the results by internalid, remember the last internalid of the first search result, and use it as an additional filter in your next search.
var totalResults = [];
var res = nlapiSearchRecord('contact', null, null, new nlobjSearchColumn('internalid').setSort()) || [];
var lastId = res.length ? res[res.length - 1].getId() : null;
// copyAndPushToArray is assumed to append the contents of res onto totalResults
copyAndPushToArray(totalResults, res);
// a full page (1000 results) means there may be more records to fetch
while (res.length === 1000)
{
    res = nlapiSearchRecord('contact', null, [['internalidnumber', 'greaterthan', lastId]], new nlobjSearchColumn('internalid').setSort()) || [];
    copyAndPushToArray(totalResults, res);
    if (res.length) lastId = res[res.length - 1].getId();
}
Beware: if the number of records is high, you may exceed governance limits in terms of time and usage points.
If you remember the lastId, you can write logic in a RESTlet that takes the id as a param and uses it as an additional filter to return the next page.
You can write logic to get the nth page result, but you might have to run the search uselessly n-1 times.
Also, I would suggest using nlapiCreateSearch().runSearch(), as it can return up to 4000 records.
Here is another way to get more than 1000 results on a search:
function getItems() {
var columns = ['internalid', 'itemid', 'salesdescription', 'baseprice', 'lastpurchaseprice', 'upccode', 'quantityonhand', 'vendorcode'];
var searchcolumns = [];
for(var col in columns) {
searchcolumns.push(new nlobjSearchColumn(columns[col]));
}
var search = nlapiCreateSearch('item', null, searchcolumns);
var results = search.runSearch();
var items = [], slice = [], i = 0;
do {
slice = results.getResults(i, i + 1000);
for (var itm in slice) {
var item = {};
for(var col in columns) { item[columns[col]] = slice[itm].getValue(columns[col]); } // convert nlobjSearchResult into simple js object
items.push(item);
i++;
}
} while (slice.length >= 1000);
return items;
}
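To cover the second request, here is a hypothetical page-wise variant of the same runSearch() approach; getRecords maps pageIndex straight onto getResults() offsets:
// hypothetical: fetch one 1000-record page by index (0 => 0-1000, 1 => 1000-2000, ...)
function getRecords(pageIndex) {
    var start = pageIndex * 1000;
    var resultSet = nlapiCreateSearch('contact', null,
        [new nlobjSearchColumn('internalid').setSort()]).runSearch();
    // getResults() accepts a window of at most 1000 rows per call
    return resultSet.getResults(start, start + 1000);
}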