I have a collection whose documents store update timestamps as epoch milliseconds. I have a Python config that supplies start-date and end-date parameters.
I have to form a query to filter this data.
_id:61cd51d8e788b021539c1a6e
id:null
createTimestamp:1640845784796
updateTimestamp:1640845784796
deleteTimestamp:null
python.yaml
start date - 16-Feb-2022 (something like this)
{createTimestamp: {
$gte: ISODate("2022-02-16T00:00:00.000Z"),
$lt: ISODate("2022-02-17T00:00:00.000Z")
}
}
Is there a way to convert the start date to epoch milliseconds and pass that to the Mongo query?
Something in Python like:
from datetime import datetime

startdate = int(datetime.fromisoformat("2022-01-01").timestamp()) * 1000
enddate = int(datetime.fromisoformat("2022-02-16").timestamp()) * 1000
query = {"updateTimestamp": {"$gte": startdate, "$lt": enddate}}
# myresult = collection.find().limit(5)
myresult = collection.find(query)
It still lists the whole collection, though. Is there an issue here?
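One likely pitfall is the timezone: `datetime.fromisoformat("2022-01-01")` produces a naive datetime, so `.timestamp()` interprets it in the server's local timezone rather than UTC. A minimal sketch that pins the bounds to UTC midnight before converting to epoch milliseconds (the field name comes from the sample document above; the helper name is just for illustration):

```python
from datetime import datetime, timezone

def to_epoch_ms(iso_date: str) -> int:
    """Parse a YYYY-MM-DD date and return UTC midnight in epoch milliseconds."""
    dt = datetime.fromisoformat(iso_date).replace(tzinfo=timezone.utc)
    return int(dt.timestamp() * 1000)

startdate = to_epoch_ms("2022-01-01")
enddate = to_epoch_ms("2022-02-16")
query = {"updateTimestamp": {"$gte": startdate, "$lt": enddate}}
# then, with pymongo: myresult = collection.find(query)
```

With the bounds anchored to UTC, the query compares like with like, since the stored epoch values are themselves UTC-based.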
Related
I have an MS SQL Server DateTime field, and I'm trying to search all records that fall within a date range:
mySqlString = "select * from users where signupDate >=#from and signupdate <=#to"
The two variables containing the date range come in the format MM/dd/yyyy (dataFrom and dataTo), so I'm replacing #from and #to in the string as follows:
datefrom = new Date(dataFrom);
dateto = new Date(dataTo);
req.input('from', sql.DateTime, datefrom )
req.input('to', sql.DateTime, dateto )
But I do not get any result.
What's the best approach to get this working properly?
You can always use CONVERT to adapt your SQL query to your input format. In your case it's format 101: select convert(varchar, getdate(), 101) ---> mm/dd/yyyy
So your query should look like
where (signupdate >= CONVERT(date, #from, 101)) AND (signupdate <= CONVERT(date, #to, 101))
This way you won't have to worry about the time portion of the stored date.
req.input('from', sql.Date, (dataFrom))
req.input('to', sql.Date, (dataTo))
Assuming you have checked that dataFrom and dataTo hold valid dates.
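The same idea expressed in Python terms, as a sketch assuming the inputs arrive as MM/dd/yyyy strings as in the question: parse them into real date objects first, then bind those as parameters instead of splicing text into the SQL. The helper name and the commented driver call are illustrative, not part of the original question.

```python
from datetime import datetime

def parse_mmddyyyy(value: str):
    """Parse an MM/dd/yyyy string into a date object, raising ValueError if invalid."""
    return datetime.strptime(value, "%m/%d/%Y").date()

date_from = parse_mmddyyyy("02/16/2022")
date_to = parse_mmddyyyy("02/17/2022")

# With a DB-API driver (e.g. pyodbc) the date objects can be bound directly:
# cursor.execute("select * from users where signupDate >= ? and signupDate <= ?",
#                date_from, date_to)
```

Binding typed parameters also sidesteps the SQL-injection risk of string replacement.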
I need to get data from MongoDB between two given dates. The same MongoDB query works for the ( yy-mm-dd hh:mm:ss.ms ) format, but it does not work for the ( dd-mm-yy hh:mm:ss ) format.
Sample Data in DB
{
"name":"user1",
"Value":"Success",
"Date": "02-06-2020 00:00:00",
"Status":"available",
"Updated_on":"2021-01-09 00:00:00.0000"
}
Python:
start_date = "02-06-2020 00:00:00"
end_date = "11-06-2020 10:16:41"
data = list(db.collection.find({"Date": {"$gte": start_date, "$lte": end_date}, "Value": "Success"}, {'_id': False, "Date": 1, "name": 1, "Value": 1}))
print(data)
I need to get the data based on the "Date" field.
The problem is that it returns more data than the start_date/end_date range.
For example, if my start_date is "02-06-2020 00:00:00" and end_date is "11-06-2020 10:16:41", it returns data from "02-04-2020 00:00:00" to "11-06-2020 10:16:41".
Any idea how to achieve this? Please also explain why it is not comparing the dates correctly.
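The root cause is that "Date" holds strings, so MongoDB compares them lexicographically, character by character. With dd-mm-yyyy the day field is most significant, so a record from a completely different month can land inside the range. A short Python demonstration (plain string comparison behaves the same way the server's does), plus the usual fix of parsing into real datetimes:

```python
from datetime import datetime

# dd-mm-yyyy strings compare by DAY first, so 5 April sorts *after* 2 June:
assert "05-04-2020 12:00:00" > "02-06-2020 00:00:00"
# ISO yyyy-mm-dd strings compare correctly (most-significant field first):
assert "2020-04-05 12:00:00" < "2020-06-02 00:00:00"

# The reliable fix: parse the strings into datetime objects and store/query those.
fmt = "%d-%m-%Y %H:%M:%S"
start = datetime.strptime("02-06-2020 00:00:00", fmt)
end = datetime.strptime("11-06-2020 10:16:41", fmt)
april = datetime.strptime("05-04-2020 12:00:00", fmt)
assert not (start <= april <= end)  # real datetimes exclude the April record
```

Storing BSON dates (or at least ISO-formatted strings) in the collection avoids the problem entirely, since range queries then compare chronological values rather than text.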
I have stored input data as a date in a Postgres database, but when I show the date in the browser it appears with a timezone, converted from UTC. For example, I stored the date as 2020-07-16, but when I display it, it becomes 2020-07-15T18:00:00.000Z. I tried select mydate::DATE from table to get only the date, but it still shows the date with a timezone. I am using the node-postgres module in my Node app, and I suspect it's some configuration in node-postgres. From their docs:
node-postgres converts DATE and TIMESTAMP columns into the local time
of the node process set at process.env.TZ
Is there any way I can configure it to parse only the date? If I query like SELECT TO_CHAR(mydate :: DATE, 'yyyy-mm-dd') from table I get 2020-07-16, but that's a lot of work just to get a date.
You can make your own date and time type parser:
const pg = require('pg');

// 1114 = timestamp without time zone
pg.types.setTypeParser(1114, function(stringValue) {
    return stringValue;
});

// 1082 = date
pg.types.setTypeParser(1082, function(stringValue) {
    return stringValue;
});
The type id can be found in the file: node_modules/pg-types/lib/textParsers.js
It is spelled out here:
https://node-postgres.com/features/types
date / timestamp / timestamptz
console.log(result.rows)
// {
// date_col: 2017-05-29T05:00:00.000Z,
// timestamp_col: 2017-05-29T23:18:13.263Z,
// timestamptz_col: 2017-05-29T23:18:13.263Z
// }
bmc=# select * from dates;
date_col | timestamp_col | timestamptz_col
------------+-------------------------+----------------------------
2017-05-29 | 2017-05-29 18:18:13.263 | 2017-05-29 18:18:13.263-05
(1 row)
I am trying to filter data from Mongo using Python code, but the date format in Mongo is quite different, which results in zero records. I tried converting the date format, but it still did not work.
One of the values from the date field:
2018-06-28 21:27:31.132Z
I have connected to the DB and am using the code below, which returns zero records even though there are more than 1000 records in the DB.
I tried formatting it as below:
import datetime
date_time_str_st = '2018-03-07 23:22:29'
date_time_obj_st = datetime.datetime.strptime(date_time_str_st, '%Y-%m-%d %H:%M:%S')
date_time_str_en = '2018-03-08 00:07:44'
date_time_obj_en = datetime.datetime.strptime(date_time_str_en, '%Y-%m-%d %H:%M:%S')
formatdt1 = date_time_obj_st.strftime("%Y-%m-%d %H:%M:%S.%fZ")
formatdt2 = date_time_obj_en.strftime("%Y-%m-%d %H:%M:%S.%fZ")
pipeline = [{'$match':{'$and':[{'date':{'$gte': {'$date': '2018-03-07 23:22:29.683Z' }}},{'date':{'$lt': {'$date': '2018-03-08 00:07:44.629Z' }}}]}}]
Read_data = spark.read.format("com.mongodb.spark.sql.DefaultSource").option("uri",connectionstring).option("pipeline",pipeline).load()
display(Read_data)
I also tried a direct filter:
pipeline = [{'$match':{'$and':[{'date':{'$gte': '2018-03-07 23:22:29.683Z'}},{'date':{'$lt': '2018-03-08 00:07:44.629Z' }}]}}]
Readdata = spark.read.format("com.mongodb.spark.sql.DefaultSource").option("uri",connectionstring).option("pipeline",pipeline).load()
display(Readdata)
Zero records come back. I believe I am not converting to the required timestamp format properly. Can anyone help me with this?
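If the `date` field really is a BSON date, an extended-JSON `$date` value must be a full ISO-8601 string with a `T` separator, not the space-separated form used above. A sketch of building the pipeline that way (the bounds come from the question; the helper name is an assumption, and with plain pymongo you could pass `datetime` objects directly instead):

```python
from datetime import datetime, timezone

def to_extended_json(dt: datetime) -> dict:
    """Render a datetime as a MongoDB extended-JSON $date value."""
    return {"$date": dt.strftime("%Y-%m-%dT%H:%M:%S.%f")[:-3] + "Z"}

start = datetime(2018, 3, 7, 23, 22, 29, 683000, tzinfo=timezone.utc)
end = datetime(2018, 3, 8, 0, 7, 44, 629000, tzinfo=timezone.utc)

pipeline = [{"$match": {"$and": [
    {"date": {"$gte": to_extended_json(start)}},
    {"date": {"$lt": to_extended_json(end)}},
]}}]
```

Note that the Spark connector's pipeline option expects a JSON string, so serializing with json.dumps(pipeline) before calling .option("pipeline", ...) is likely needed rather than passing the Python list itself.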
First off, I am a beginner with MongoDB, so here is my problem. I am using a model like the one below with mongoengine:
class Stats(Document):
Name = StringField(max_length=250)
timestamp = LongField(default=mktime(datetime.now().timetuple()))
count = IntField()
<some other fields>
What I want is to filter by the name (that part is clear) and use the $sum aggregation operator over the count field. But I want the sum of records grouped by hour/day/month.
As an example, if we have records with the timestamps [1532970603, 1532972103, 1532973600, 1532974500], then 1-2 form the first group and 3-4 form the second group.
And that is where I am stuck. I have some ideas about grouping every n records, or dividing the timestamp by 3600 (1 hour = 3600 seconds), but how do I do that with mongoengine? And how do I get such Python expressions into a pipeline?
I would appreciate any help.
I would recommend using the ISO date format and storing the complete date in timestamp. Here is your model:
class Stats(Document):
    Name = StringField(max_length=250)
    timestamp = DateTimeField(default=datetime.utcnow)  # native datetime (ISO) recommended
    count = IntField()
    meta = {'strict': False}
Now you can aggregate them accordingly.
Stats.objects.aggregate([
    {
        '$group': {
            '_id': {
                'year': {'$year': '$timestamp'},
                'month': {'$month': '$timestamp'},
                'day': {'$dayOfMonth': '$timestamp'},
                'hour': {'$hour': '$timestamp'},
            },
            'total': {'$sum': '$count'},
        }
    }
])
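If you would rather keep the existing epoch-seconds LongField, the divide-by-3600 idea from the question also works. A minimal sketch that buckets timestamps into hours in plain Python, assuming the third timestamp in the question was meant to share an hour with the fourth; the same arithmetic could be pushed into a $group stage with {'$floor': {'$divide': ['$timestamp', 3600]}}:

```python
from collections import defaultdict

def sum_counts_by_hour(records):
    """Group (timestamp, count) pairs into hour buckets and sum the counts."""
    totals = defaultdict(int)
    for ts, count in records:
        totals[ts // 3600] += count  # integer hour bucket since the epoch
    return dict(totals)

# Timestamps adapted from the question, paired with illustrative counts:
records = [(1532970603, 1), (1532972103, 2), (1532973600, 3), (1532974500, 4)]
buckets = sum_counts_by_hour(records)
```

The bucket keys are hours since the epoch; multiplying a key by 3600 recovers the start of that hour as an epoch timestamp.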