Error while converting timestamp string with timezone (+0000) to Timestamp using moment - node.js

I am trying to convert the timestamp string 2021-09-29T05:32:48.000+0000 to a timestamp using moment, but I am getting NaN as output. I am using this call to convert it:
(moment("2021-09-29T05:32:48.000+0000"))
Any suggestion on how I can handle a timestamp of this format in Node?
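A minimal sketch of the usual fix, assuming you want an unambiguous parse: pass moment an explicit format string so it knows how to read the numeric +0000 offset (the ZZ token matches offsets without a colon):

const moment = require('moment');

// Parse with an explicit format; "ZZ" matches numeric offsets like +0000.
const ts = moment("2021-09-29T05:32:48.000+0000", "YYYY-MM-DDTHH:mm:ss.SSSZZ");
console.log(ts.isValid());  // true
console.log(ts.valueOf());  // 1632893568000 (epoch milliseconds)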

Related

How to load csv with datetime into postgresql using copy_from

I am trying to find a way to bulk load a CSV file into PostgreSQL. However, the data has a datetime column with the format "YYYY-MM-DD HH24:MI:SS". I couldn't find any documentation on how to bulk load a datetime column using the psycopg2 package in Python 3.x. Can I get some help on this? Thanks in advance.
I am able to load the data using the below code:
cur.copy_from(dataIterator, 'cmodm.patient_visit', sep=chr(31), size=8192, null='')
conn.commit()
However, only the date part got loaded in the table; the time part was zeroed out:
2017-04-13 00:00:00
2017-04-13 00:00:00
2017-04-12 00:00:00
After discussing with @Belayer, it was concluded that copy_from takes the timestamp input value in the format 'YYYY-MM-DD HH:MI:SS'. If the source has some other format, it needs to be converted to that format before being fed into copy_from.
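A minimal sketch of that pre-conversion step in Python (the source format shown is hypothetical; substitute whatever your CSV actually contains):

from datetime import datetime

# Hypothetical source format "MM/DD/YYYY HH:MM:SS"; reformat into what copy_from expects.
raw = "04/13/2017 14:05:32"
converted = datetime.strptime(raw, "%m/%d/%Y %H:%M:%S").strftime("%Y-%m-%d %H:%M:%S")
print(converted)  # 2017-04-13 14:05:32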

Presto epoch string to timestamp

I need your help, as I am stuck on a time conversion in Presto.
I have an epoch column named timestamp with a string datatype, and I want to convert it into a date timestamp.
I have used the below query after reading through various blogs:
SELECT date_parse(to_iso8601(from_unixtime(CAST(timestamp AS bigint)) AS date ,
'%Y-%m-%dT%H:%i:%s.%fZ'))
FROM wqmparquet;
Every time I run this query I get an error:
INVALID_FUNCTION_ARGUMENT: Invalid format: "2020-04-27T19:49:50.000Z" is malformed at "T19:49:50.000Z"
Can somebody please help me with this?
I might be oversimplifying this, but if you want to convert an epoch string to a timestamp datatype, you can just do:
from_unixtime(cast(timestamp as bigint))
You can generate a timestamp with time zone by passing a second argument to from_unixtime(), as a time zone string.
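For example, against the table from the question ('UTC' is an assumption here; pass whichever zone your data belongs to):

SELECT from_unixtime(CAST("timestamp" AS bigint), 'UTC') AS event_ts
FROM wqmparquet;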

Specifying timestamp or date format in Athena Table

I have a timestamp in ISO-8601 format and want to specify it as either a timestamp or datetime format when creating a table in Athena. Any clues on how to do this?
Thanks!
When you create a table in Athena you can set a column as date or timestamp only in the UNIX format, as follows:
DATE, in the UNIX format, such as YYYY-MM-DD.
TIMESTAMP. Instant in time and date in the UNIX format, such as yyyy-mm-dd hh:mm:ss[.f...]. For example, TIMESTAMP '2008-09-15 03:04:05.324'. This format uses the session time zone.
If the format is different, define the column as a string and, when you query the data, use a date function:
from_iso8601_date(string) → date
You can also convert the data to make it easier and cheaper to query for specific use cases by using a CTAS (CREATE TABLE AS SELECT) query, which generates a new copy of the data in a simpler and more efficient (compressed and columnar) Parquet format.
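A sketch of such a CTAS query, with hypothetical table, column, and S3 location names:

-- raw_events holds an ISO-8601 string column event_time; the new table is written as Parquet.
CREATE TABLE events_parquet
WITH (format = 'PARQUET', external_location = 's3://my-bucket/events_parquet/') AS
SELECT CAST(from_iso8601_timestamp(event_time) AS timestamp) AS event_ts,
       payload
FROM raw_events;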

Azure Data Factory Mapping Data Flow: Epoch timestamp to Datetime

I have a JSON-based source I'd like to transform using ADF Mapping Data Flow. I have a string containing an epoch timestamp value that I want to transform to a datetime value and later sink into a Parquet file.
Do you know a way? Docs of this language are here.
Source file:
{
  "timestamp": "1574127407",
  "name": "D.A."
}
Use toTimestamp() and set the format you wish as the 2nd parameter:
toTimestamp(1574127407*1000l)
From string:
toTimestamp(toInteger(toString(byName('timestamp')))*1000l,'yyyy-MM-dd HH:mm:ss')
I have come across various epoch timestamp values that are 13 digits long, i.e., they even carry millisecond-level information.
In such cases, converting to an integer using toInteger won't serve the purpose; instead, it will leave the values as NULL. So, to fix this issue, we need to convert to a long using toLong, as below:
toTimestamp(toLong(toString(created)),'yyyy-MM-dd HH:mm:ss')
In the above expression, created is a field whose value is a 13-digit epoch timestamp, something like created='1635359043307'.
Here, toTimestamp returns the timestamp in the above-mentioned date format.
FYI, you can use https://www.epochconverter.com/ to convert an epoch timestamp to a human-readable date.

Logstash convert the "yyyy-MM-dd" to "yyyy-MM-dd'T'HH:mm:ss.SSSZ"

I use the logstash-input-jdbc plugin to sync my data from MySQL to Elasticsearch. However, when I looked at the data in Elasticsearch, I found that the format of all date-type fields had changed from "yyyy-MM-dd" to "yyyy-MM-dd'T'HH:mm:ss.SSSZ". I have nearly 200 fields whose type is date, so I want to know how to configure Logstash to output the format "yyyy-MM-dd" instead of "yyyy-MM-dd'T'HH:mm:ss.SSSZ".
Elasticsearch stores dates as UTC timestamps:
Internally, dates are converted to UTC (if the time-zone is specified) and stored as a long number representing milliseconds-since-the-epoch.
Queries on dates are internally converted to range queries on this long representation, and the result of aggregations and stored fields is converted back to a string depending on the date format that is associated with the field.
So if you want to retain the yyyy-MM-dd format, you'll have to store it as a keyword (which you then won't be able to do range queries on).
You can change Kibana's display to show only the yyyy-MM-dd format, but note that it will convert the date to the viewer's timezone, which may result in a different day than the one you entered in the input field.
If you want to ingest the date as a string, you'll need to create a mapping for the index in question to prevent default date processing.
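For example, a minimal sketch of such a mapping (index and field names are hypothetical), which stores the value verbatim as a keyword:

PUT my-index
{
  "mappings": {
    "properties": {
      "created_date": { "type": "keyword" }
    }
  }
}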
