Query date on datetime stored within jsonfield in Postgres through Django? - python-3.x

I have a Postgres table with a jsonb column containing UTC timestamp data in ISO format like the following:
{
"time": "2021-04-13T20:14:56Z"
}
The Django model for this table looks like:
class DateModel(models.Model):
    values = models.JSONField(default=dict)
I need to query the table for all records with a timestamp on a certain date (ignoring the time).
I'm looking for a solution similar to the following:
DateModel.objects.filter(values__time__date='2021-04-13')
The other solution I have found is to query for records with a timestamp greater than the previous day and less than the next one. This works, but I am looking for a way to do it with a single lookup so the code would be more concise.
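For illustration, a minimal sketch of that two-bound workaround on the raw JSON values (this relies on the ISO-formatted strings comparing in chronological order, which holds for a fixed UTC suffix):
# range filter on the ISO strings stored under the "time" key
DateModel.objects.filter(values__time__gte='2021-04-13', values__time__lt='2021-04-14')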
Any suggestions?

There are a couple of annotations you need to apply to the queryset to extract the time field and convert it to a datetime.
First you need to extract the time string by using django.contrib.postgres.fields.jsonb.KeyTextTransform
from django.contrib.postgres.fields.jsonb import KeyTextTransform
query = DateModel.objects.annotate(time_str=KeyTextTransform('time', 'values'))
Then you need to convert that string to a datetime using Cast
from django.db.models.functions import Cast
from django.db.models import DateTimeField
query = query.annotate(time=Cast('time_str', output_field=DateTimeField()))
Then you can filter by that annotation
query = query.filter(time__date='2021-04-13')
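Putting the pieces together, the whole thing can be chained into one queryset:
from django.contrib.postgres.fields.jsonb import KeyTextTransform
from django.db.models import DateTimeField
from django.db.models.functions import Cast

query = (
    DateModel.objects
    .annotate(time_str=KeyTextTransform('time', 'values'))          # pull the raw string out of the jsonb column
    .annotate(time=Cast('time_str', output_field=DateTimeField()))  # cast the string to a datetime
    .filter(time__date='2021-04-13')                                 # now the __date lookup works
)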

Related

Python convert a str date into a datetime with timezone object

In my Django project I have to convert a str variable passed as a date ("2021-11-10") to a timezone-aware datetime object in order to execute an ORM filter on a DateTime field.
In my db, values are stored as, for example:
2021-11-11 01:18:04.200149+00
I try:
from datetime import datetime

# test date
df = "2021-11-11"
df = df + " 00:00:00+00"
start_d = datetime.strptime(df, '%Y-%m-%d %H:%M:%S%Z')
but I get an error because the str format and the datetime representation are different.
How can I convert a single date string into a timezone-aware datetime object set to midnight of that date?
Many thanks in advance.
That's not the way datetime.strptime works.
Read a little bit more here.
I believe it will help you.
You should pass the month as a str and without the "-".
Good luck.
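For what it's worth, a minimal sketch of one way to build a timezone-aware midnight datetime from the date string (not necessarily what the answer above had in mind):
from datetime import datetime, timezone

df = "2021-11-11"
# parse just the date part, then attach UTC explicitly; time defaults to midnight
start_d = datetime.strptime(df, "%Y-%m-%d").replace(tzinfo=timezone.utc)
print(start_d)  # 2021-11-11 00:00:00+00:00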

How do I correctly order a Pandas series based on datetime

So to order my pandas dataframe, I have used this line of code:
airings_df = airings_df.sort_values(by=['Station', 'DateTime'])
The ordering for the station has worked, but it seems that times within the same hour are not being ordered correctly. What can I do to fix this?
You can try converting the DateTime column to datetime format:
airings_df['DateTime']= pd.to_datetime(airings_df['DateTime'])
And then sort:
airings_df = airings_df.sort_values(by=['Station', 'DateTime'])
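If the timestamps are strings in a known layout, passing an explicit format to to_datetime avoids mis-parsing; the format string below is only an assumption about what the column might look like:
# assumed layout, e.g. "04/13/2021 08:15 PM"; adjust to match the actual data
airings_df['DateTime'] = pd.to_datetime(airings_df['DateTime'], format='%m/%d/%Y %I:%M %p')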

How to get a datetime type column into 'MM/YYYY' format using Flask-SQLAlchemy in Python

I have a column in the database which is datetime type, and I have to retrieve the date in 'MM/YYYY' format.
How do I do this in SQLAlchemy/Python?
My current query is as follows:
session.query(Model.models.DefineStructureOfTable.Detail.AsOfDate).filter(
    Model.models.DefineStructureOfTable.Detail.ID.in_(getId)).all()
Currently it gives me the result as a datetime-typed date.
Please help!
If your question is how to convert datetime into the format MM/YYYY, you can use strftime.
result.strftime("%m/%Y")
If your question is about how you make the database return a different format, you can't; you can't change the types of the underlying database. But you could change the column type to TEXT and just store the string directly - not recommended, because it will be hard to sort.
Alternatively add a method or property to the model to get the datetime in the right format.
class User(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    month_and_year = db.Column(db.DateTime)

    @property
    def formatted_month_and_year(self):
        return self.month_and_year.strftime("%m/%Y")
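Usage would then look something like this (assuming the usual Flask-SQLAlchemy setup; the query itself is just an illustrative assumption):
user = User.query.first()             # fetch any row
print(user.formatted_month_and_year)  # e.g. "11/2021"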

Dataframe with datetime64 dtype insert into PostgreSQL timestamp column

I am taking a dataframe and inserting it into a PostgreSQL table.
One column in the dataframe is a datetime64 dtype. The column type in PostgreSQL is 'timestamp without time zone.' To prepare the dataframe to insert, I am using to_records:
listdf = df.to_records(index=False).tolist()
When I run the to_records, it gives an error at psycopg2's cur.executemany() saying that I am trying to insert a bigint into a timestamp without time zone.
So I tried adding a dict of column_dtypes to to_records, but that doesn't work. The below gives the error: "ValueError: Cannot convert from specific units to generic units in NumPy datetimes or timedeltas"
DictofDTypes = dict.fromkeys(SQLdfColHAedings,'float')
DictofDTypes['Date_Time'] = 'datetime64'
listdf = df.to_records(index=False,column_dtypes=DictofDTypes).tolist()
I have also tried dtypes of str, int, and float. None worked in the above three lines.
How do I convert the column properly to be able to insert the column into a timestamp sql column?
I removed the dtype definitions from to_records.
And before to_records, I converted the datetime column to str with:
df['Date_Time'] = df['Date_Time'].apply(lambda x: x.strftime('%Y-%m-%d %H:%M:%S'))
The sql insert command then worked.
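For context, a minimal sketch of the full flow under that approach (the connection string, table, and column names here are made up for illustration):
import psycopg2

# convert the datetime column to strings, then flatten the frame to a list of tuples
df['Date_Time'] = df['Date_Time'].apply(lambda x: x.strftime('%Y-%m-%d %H:%M:%S'))
listdf = df.to_records(index=False).tolist()

conn = psycopg2.connect("dbname=mydb user=me")  # hypothetical connection string
with conn, conn.cursor() as cur:
    # the column list must match the column order of the dataframe
    cur.executemany(
        "INSERT INTO my_table (value_a, value_b, date_time) VALUES (%s, %s, %s)",
        listdf,
    )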

BigQuery convert unix timestamp struct to struct of datetime

I have a BigQuery table that contains a struct column called daySliderTimes in the following form:
daySliderTimes STRUCT<_field_1 STRUCT<_seconds INT, _nanoseconds INT>, _field_2 STRUCT<_seconds INT, _nanoseconds INT>>.
_field_1 and _field_2 represent two different timestamps. _seconds and _nanoseconds represent time since the unix epoch.
I want to convert the data into a new STRUCT with the following form:
daySlidertimes STRUCT<startTime DATETIME, endTime DATETIME>
(Screenshot of the table as seen in the BigQuery UI omitted.)
If you want to create a new table from the old one with the format daySlidertimes STRUCT<startTime DATETIME, endTime DATETIME>, you can convert the data to microseconds and then transform it to a TIMESTAMP with the function "TIMESTAMP_MICROS"; check this link to see the list of functions for parsing timestamps [1].
An example of the query should look something like this:
CREATE TABLE `project.dataset.new_table` AS
SELECT
  searchDocId,
  STRUCT(
    TIMESTAMP_MICROS(CAST(
      (daySliderTimes._field_1._seconds * 1e+6) +
      ROUND(daySliderTimes._field_1._nanoseconds * 0.001) AS INT64)) AS startTime,
    TIMESTAMP_MICROS(CAST(
      (daySliderTimes._field_2._seconds * 1e+6) +
      ROUND(daySliderTimes._field_2._nanoseconds * 0.001) AS INT64)) AS endTime
  ) AS daySliderTimes,
  enabledDaySliders
FROM `project.dataset.old_table`
[1] https://cloud.google.com/bigquery/docs/reference/standard-sql/functions-and-operators#parse_timestamp
You can use the TIMESTAMP_SECONDS() function. This function converts seconds since the Unix epoch to a TIMESTAMP.
Therefore, you are able to transform daySliderTimes._field_1._seconds with TIMESTAMP_SECONDS(), do the same for _field_2, and then aggregate them into a new struct.
During the creation of the view or table, in your select you can do the following:
WITH table_newStruct AS (
  SELECT
    # select all the desired fields
    searchDocId,
    STRUCT(
      TIMESTAMP_SECONDS(daySliderTimes._field_1._seconds) AS startTime,
      TIMESTAMP_SECONDS(daySliderTimes._field_2._seconds) AS endTime
    ) AS new_daySlidertimes
  FROM `table_source`
)
SELECT searchDocId, new_daySlidertimes
FROM table_newStruct
In addition, the returned TIMESTAMP will be in the following format: 1970-01-01 00:00:00 UTC. You can format it using the FORMAT_TIMESTAMP() function.
