Comparing timestamp results from Postgres in Jest - node.js

I'm writing some tests in Jest for an API that returns results from Postgres via the pg library. I post some data (a set of values generated with faker, template1), and then test that what I get back is the same, allowing for the id and modified fields. The template1 data includes an approved property (SQL definition: approved timestamp with time zone), which is generated like so:
{
  approved: faker.date.past(),
  description: faker.lorem.paragraph(),
  groups: faker.lorem.paragraph(),
}
This test is something like this:
expect(response.body.rows).toStrictEqual([
  {
    ...template1,
    id: 1,
    modified: null,
  },
])
The issue is that the test is failing because the returned value of approved appears to be a string:
expect(received).toStrictEqual(expected) // deep equality
- Expected
+ Received
@@ -1,8 +1,8 @@
Array [
Object {
- "approved": 2019-12-19T03:48:20.613Z,
+ "approved": "2019-12-19T03:48:20.613Z",
"approved_by": "Percy_Hills#yahoo.com",
I've tried casting the template1.approved value to both a Date and to a string just prior to the comparison. Both approaches fail. What am I doing wrong, and how do I fix the issue?

I didn't try hard enough - the answer was to convert the timestamp to JSON:
expect(response.body.rows).toStrictEqual([
  {
    ...template1,
    id: 1,
    modified: null,
    approved: new Date(template1.approved).toJSON(), // here
  },
])
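This works because response.body has already been through JSON serialization, so the timestamptz column arrives as an ISO 8601 string, while template1.approved is still a Date; toJSON() puts the expected value into the same string form. An equivalent sketch (same data, just comparing in the other direction) is to parse the received value back into a Date, since Jest's toEqual compares Dates by value:

// Hypothetical alternative: compare the approved field on its own, as Dates
expect(new Date(response.body.rows[0].approved)).toEqual(template1.approved)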

Related

Nodejs Elasticsearch query default behaviour

On a daily basis, I'm pushing data (time series) to Elasticsearch. I created an index pattern, and my indices are named myindex_*, where * is today's date. Thus after a week, I have: myindex_2022-06-20, myindex_2022-06-21, ... myindex_2022-06-27.
Let's assume my indices hold product prices. Inside each myindex_*, I have documents like this:
myindex_2022-06-26 contains many product prices, e.g.:
{
  "reference_code": "123456789",
  "price": 10.00
},
...
myindex_2022-06-27:
{
  "reference_code": "123456789",
  "price": 12.00
},
I'm using this query to get the reference code and the corresponding price, and it works great.
const data = await elasticClient.search({
  index: 'myindex_2022-06-27',
  body: {
    query: {
      match: {
        "reference_code": "123456789"
      }
    }
  }
});
But I would like a query such that, if there is no data in the index for 2022-06-27, it checks the previous index 2022-06-26, and so on (up to e.g. 10 indices back).
It seems to do something like this when I replace myindex_2022-06-27 with myindex_* (I'm not sure whether that's the default behaviour).
The issue is that when I query this way, I get prices from other indices, but it seems to use the oldest one. I would like to get the newest one instead.
How should I proceed?
If you query with an index wildcard, it returns a list of documents, where every document includes some meta fields such as _index and _id.
You can sort by _index to make Elasticsearch return the latest document at position [0] in your list.
const data = await elasticClient.search({
  index: 'myindex_2022-*',
  body: {
    query: {
      match: {
        "reference_code": "123456789"
      }
    },
    sort: [{ "_index": "desc" }]
  }
});
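As a minimal sketch of reading the newest matching price from that result (the exact response shape depends on your client version; 7.x clients put the body under data.body, while 8.x returns the body directly):

// Assuming a 7.x-style client, where the search result body is under data.body
const hits = data.body.hits.hits;
if (hits.length > 0) {
  const newest = hits[0]; // first hit, thanks to the descending _index sort
  console.log(newest._index, newest._source.price);
} else {
  console.log('No price found in any myindex_2022-* index');
}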

Date Format in mongoose, mongodb and node application

First time question and newbie coder so treat me like I'm 5 years old.
Okay, so as I'm learning, I'm horrible at using documentation to fix my problems. I'm setting up a Node application with MongoDB as the backend. The starting point is something that will pull baseball game results (date, teams, scores) into a database via an API. The problem I've run into is that I want to take in dates. When I do, they come back as the generic 1970s date, and I can't seem to get them to just keep the formatted date. The source of the data and the date will be given programmatically, so it won't be prone to human error, and I don't need date validation. I just need it to recognize the date and keep its format so I can find or filter on it. The data will only have the year/month/day and not the time.
I'm setting this up in a mongoose file to load into my mongodb DB.
I've looked at the https://mongoosejs.com/docs/api.html#mongoose_Mongoose-Date documentation, but I'm either missing the point or not seeing what's in front of me. Any hints on how to accept date information?
const GameDayStats = mongoose.model('GameDayStats', {
  Date: {
    type: Date
  },
  Home: {
    type: String
  },
  HomeScore: {
    type: Number
  },
  Visitor: {
    type: String
  },
  VisitorScore: {
    type: Number
  },
  Final: {
    type: Boolean
  }
})

const game = new GameDayStats({
  Date: 2021-08-04,
  Home: 'Cardinals',
  HomeScore: 8,
  Visitor: 'Pirates',
  VisitorScore: 2,
  Final: true
})

game.save().then(() => {
  console.log(game)
}).catch((error) => {
  console.log('Game Stat Error!', error)
})
What the output looks like:
{
  _id: 602e774fbee0692662ea16fa,
  Date: 1970-01-01T00:00:02.009Z, <---- I want this in normal date form
  Home: 'Cardinals',
  HomeScore: 8,
  Visitor: 'Pirates',
  VisitorScore: 2,
  Final: true,
  __v: 0
}
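A likely explanation, going by that output: Date: 2021-08-04 is not a date literal in JavaScript but the arithmetic expression 2021 - 08 - 04 = 2009, which Mongoose casts as 2009 milliseconds after the Unix epoch, i.e. 1970-01-01T00:00:02.009Z. A minimal sketch of the fix, assuming the same model, is to pass the date as a string (or a Date) so the Date cast has something date-like to work with:

const game = new GameDayStats({
  Date: '2021-08-04', // a string (or new Date('2021-08-04')) casts to a proper Date
  Home: 'Cardinals',
  HomeScore: 8,
  Visitor: 'Pirates',
  VisitorScore: 2,
  Final: true
})

Mongoose stores a Date field as a full date-time, so a date-only value is kept as 2021-08-04T00:00:00.000Z; you would format away the time portion when displaying it.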

Mongo DB: How do I query by both Id and date

I am trying to do a query by two parameters on a MongoDB database using Mongoose. I need to query by who the document was created by, and also by a subdocument called events which has a date. I want to bring back all documents within a timeframe.
My query looks like this.
var earliest = new Date(2018, 0, 3);
var latest = new Date(2018, 0, 4);
Goal.find({ createdBy: userId, 'events.date': { $gte: earliest, $lte: latest } })
  .exec(function (err, doc) { /* do stuff */ });
The document below is what was returned. I get everything in my database back, and my date range query isn't taken into account. I'm new to MongoDB and I don't know what I am doing wrong.
[
  {
    _id: "5a4dac123f37dd3818950493",
    goalName: "My First Goal",
    createdBy: "5a4dab8c3f37dd3818950492",
    __v: 0,
    events: [
      {
        _id: "5a4dac123f37dd3818950494",
        eventText: "Test Goal",
        eventType: "multiDay",
        date: "2018-01-03T00:00:00.000Z",
        eventLength: 7,
        completed: false
      },
      {
        _id: "5a4dac123f37dd3818950495",
        eventText: "Test Goal",
        eventType: "multiDay",
        date: "2018-01-04T00:00:00.000Z",
        eventLength: 7,
        completed: false
      },
      {
        _id: "5a4dac123f37dd3818950496",
        eventText: "Test Goal",
        eventType: "multiDay",
        date: "2018-01-05T00:00:00.000Z",
        eventLength: 7,
        completed: false
      }
    ],
    startDate: "2018-01-04T00:00:00.000Z",
    createdOn: "2018-01-04T00:00:00.000Z"
  }
]
There is a difference between matching documents and matching "elements of an array". Your document already contains the whole array, even the values that don't match your array filter criteria. But since the document-level criteria match, the whole document is returned (with all the array entries).
If you just want the matching "elements", then use .aggregate() instead. An example of how to use aggregate for such a task is available at Mongodb find inside sub array
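As a minimal sketch of that aggregation approach, applied to the query from the question (same model and variables; note that an aggregation pipeline does not cast values for you, so createdBy may need converting to an ObjectId):

// Unwind the events array, keep only events in the date range, then regroup
Goal.aggregate([
  { $match: { createdBy: new mongoose.Types.ObjectId(userId) } },
  { $unwind: '$events' },
  { $match: { 'events.date': { $gte: earliest, $lte: latest } } },
  { $group: {
      _id: '$_id',
      goalName: { $first: '$goalName' },
      createdBy: { $first: '$createdBy' },
      events: { $push: '$events' }
  } }
]).exec(function (err, docs) { /* docs now contain only the matching events */ });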

MongoDB update/insert document and Increment the matched array element

I use Node.js and MongoDB with monk.js, and I want to do the logging in a minimal way, with one document per hour, like:
final doc:
{ time: YYYY-MM-DD-HH, log: [ {action: action1, count: 1 }, {action: action2, count: 27 }, {action: action3, count: 5 } ] }
The complete document should be created by incrementing one value.
E.g. someone visits a webpage first in this hour, and incrementing action1 should create the following document with a query:
{ time: YYYY-MM-DD-HH, log: [ {action: action1, count: 1} ] }
Another user visits another webpage in this hour, and the document should be extended to:
{ time: YYYY-MM-DD-HH, log: [ {action: action1, count: 1}, {action: action2, count: 1} ] }
The count values should be incremented as the different webpages are visited.
At the moment I create a doc for each action:
tracking.update({
  time: moment().format('YYYY-MM-DD_HH'),
  action: action,
  info: info
}, { $inc: { count: 1 } }, { upsert: true }, function (err) {});
Is this possible with monk.js / mongodb?
EDIT:
Thank you. Your solution looks clean and elegant, but it looks like my server can't handle it, or I am too much of a noob to make it work.
I wrote an extremely dirty solution with the action name as the key:
tracking.update({ time: time, ts: ts }, JSON.parse('{ "$inc": {"' + action + '": 1}}'), { upsert: true }, function (err) {});
Yes, it is very possible, and a well-considered question. The only variation I would make on the approach is to calculate the "time" value as a real Date object (quite useful in MongoDB, and easy to manipulate as well), simply "rounding" the value with basic date math. You could use "moment.js" for the same result, but I find the math simple.
The other main consideration here is that mixing array "push" actions with possible "upsert" document actions can be a real problem, so it is best to handle this with "multiple" update statements, where only the condition you want is going to change anything.
The best way to do that, is with MongoDB Bulk Operations.
Consider that your data comes in something like this:
{ "timestamp": 1439381722531, "action": "action1" }
Where the "timestamp" is an epoch timestamp value acurate to the millisecond. So the handling of this looks like:
// Just adding for the listing, assuming already defined otherwise
var payload = { "timestamp": 1439381722531, "action": "action1" };

// Round to hour
var hour = new Date(
  payload.timestamp - ( payload.timestamp % ( 1000 * 60 * 60 ) )
);

// Init transaction
var bulk = db.collection.initializeOrderedBulkOp();

// Try to increment where array element exists in document
bulk.find({
  "time": hour,
  "log.action": payload.action
}).updateOne({
  "$inc": { "log.$.count": 1 }
});

// Try to upsert where document does not exist
bulk.find({ "time": hour }).upsert().updateOne({
  "$setOnInsert": {
    "log": [{ "action": payload.action, "count": 1 }]
  }
});

// Try to "push" where array element does not exist in matched document
bulk.find({
  "time": hour,
  "log.action": { "$ne": payload.action }
}).updateOne({
  "$push": { "log": { "action": payload.action, "count": 1 } }
});

bulk.execute();
So if you look through the logic there, you will see that it is only ever possible for "one" of those statements to be true for any given state of the document, whether it exists or not. Technically speaking, the statement with the "upsert" can actually match a document when it exists; however, the $setOnInsert operation used makes sure that no changes are made unless the operation actually "inserts" a new document.
Since all operations are fired in "bulk", the only time the server is contacted is on the .execute() call, so there is only "one" request to the server and only "one" response, despite the multiple operations.
In this way the conditions are all met:
Create a new document for the current period where one does not exist and insert initial data to the array.
Add a new item to the array where the current "action" classification does not exist and add an initial count.
Increment the count property of the specified action within the array upon execution of the statement.
All in all, yes, possible, and also a great idea for storage, as long as the action classifications do not grow too large within a period (500 array elements should be used as a maximum guide), and the updating is very efficient and self-contained within a single document for each time sample.
The structure is also nice and well suited to other queries and possible additional aggregation purposes as well.

How to restrict _rev_info in cloudant result json

I am using Cloudant for my project. Every time I update a document and fetch it, the result JSON comes back with { _revs_info: [...] } (containing 500+ revisions of history). How can I restrict the response and fetch data without _revs_info in Cloudant?
{ name: "test",
"age":22,
_revs_info: [
{ rev: '510-454.....',
status: 'available' },
{....}
]
}
_revs_info is only returned if you explicitly request it by passing revs_info=true in the query string. If you don't require the revision history, just exclude that parameter.
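A minimal sketch with a nano-style Cloudant Node.js client (the account, database, and document names here are just placeholders):

const nano = require('nano')('https://myaccount.cloudant.com');
const db = nano.db.use('mydb');

// With revs_info: true, the response includes the full _revs_info array
db.get('my-doc-id', { revs_info: true }, function (err, doc) {
  // doc._revs_info is present here
});

// Without the parameter, only the document body (plus _id and _rev) is returned
db.get('my-doc-id', function (err, doc) {
  // no _revs_info here
});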
