How to read a string column in the format '15AUG21:12:45:24' as a timestamp in Teradata?

I have a character column in a Teradata table with values in this format: '15AUG21:06:38:03'. I need to convert this column into a timestamp so that I can use it in an ORDER BY clause. I am using Teradata SQL Assistant to read the data.

Use TO_TIMESTAMP:
SELECT TO_TIMESTAMP ('15AUG21:06:38:03', 'DDMONYY:HH24:MI:SS');
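To sort by the converted value, the same format mask can be applied to the column itself. A minimal sketch, assuming a hypothetical table my_table with the character column event_ts:
-- parse the string column and sort on the resulting timestamp
SELECT *
FROM my_table
ORDER BY TO_TIMESTAMP(event_ts, 'DDMONYY:HH24:MI:SS');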

Related

HANA SQL: Import an Excel field formatted as a date into a dataset, or convert Excel date values (e.g. 43830 for 31.12.2019) to Date

I'd like to upload a column formatted as a date in an Excel file to a dataset in the Analytical Workspace of Tagetik on a HANA database.
As Excel stores date values as numbers (e.g. 43830 for 31.12.2019), I get an error uploading into a dataset field of type date, because 43830 is not a valid date format.
Uploading the Excel field into a dataset field of type text or number works, but then I have the numeric representation there.
So I'd like to convert this number (43830) to a valid date (31.12.2019) - I did not find an appropriate function in HANA SQL.
Thanks for some ideas.
Best,
Sabom
Why don't you convert the number to a date string in Excel before uploading and just use that date? That sounds like the easiest option. Nonetheless, you could use ADD_DAYS in HANA. The number you are seeing is a count of days in Excel's serial date system, which starts at January 1st, 1900 but incorrectly treats 1900 as a leap year, so ADD_DAYS with a base date of 1899-12-30 gives the correct result for modern dates.
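A minimal HANA SQL sketch along those lines, assuming the Excel serial number lands in an integer column excel_serial of a hypothetical staging table upload_stage:
-- 1899-12-30 compensates for Excel counting 1900-01-01 as day 1
-- and for its 1900 leap-year bug
SELECT ADD_DAYS(TO_DATE('1899-12-30'), excel_serial) AS converted_date
FROM upload_stage;
-- e.g. ADD_DAYS(TO_DATE('1899-12-30'), 43830) returns 2019-12-31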

Specify datetime2 format in Azure SQL Data Warehouse (Synapse)

What is the correct way to specify the format of a datetime2 field when creating a table in Azure SQL Data Warehouse? I can't seem to find an example in the documentation.
The data looks like this:
"2020-09-14T20:50:48.000Z"
CREATE TABLE [Foo].[Bar](
...
MyDateTime datetime2(['YYYY-MM-DDThh:mm:ss[.fractional seconds]')
)
As Panagiotis notes, the underlying representation is an int/long for the actual date value. This is how RDBMS engines can quickly compute the delta between two dates (days between Monday and Friday is a simple subtraction problem). To answer your question, you would simply define your CREATE TABLE as:
CREATE TABLE [Foo].[Bar](
...
MyDateTime datetime2
)
If you're interested in formatting the result in a query, you can look to the CONVERT or FORMAT functions. For example, if you wanted the format dd-mm-yyyy (Italian date), you could use either of the following:
SELECT
CONVERT(VARCHAR, CURRENT_TIMESTAMP, 105)
, FORMAT(CURRENT_TIMESTAMP, 'dd-MM-yyyy')
Note: CONVERT is generally faster than FORMAT and is the recommended approach if your date format is supported. This is because the FORMAT function relies on the CLR, which incurs a context/process jump.
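For the ISO-8601 strings shown in the question, parsing happens at load or query time rather than in the column definition. A sketch using CONVERT style 127 (the ISO-8601-with-Z style) against the table from the question:
-- parse the ISO-8601 string into a datetime2 value on insert
INSERT INTO [Foo].[Bar] (MyDateTime)
VALUES (CONVERT(datetime2, '2020-09-14T20:50:48.000Z', 127));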

How to convert a column from one timestamp format to another in Azure Data Factory

I have a column ABC where the timestamp is in the format dd/MM/yyyy HH:mm:SS (11/04/2020 1:17:40). I want to create another column ABC_NEW with the same data as the old column but in a different timestamp format, 'yyyy-MM-dd HH:mm:SS'. I tried doing this in an Azure Data Factory derived column using toTimestamp(column_name, 'yyyy-MM-dd HH:mm:SS'), but it did not work; the result comes out as NULL. Can anyone help?
It's a 2-step process. You first need to tell ADF what each field in your timestamp column represents, then you can use string conversions to manipulate that timestamp into the output string as you like:
toString(toTimestamp('11/04/2020 1:17:40','MM/dd/yyyy HH:mm:ss'),'yyyy-MM-dd HH:mm:ss')
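If the source is really day-first (dd/MM), the same two-step pattern should still apply with a day-first input mask; a hedged sketch, assuming ADF's Java-style format tokens, where uppercase MM is months, lowercase mm is minutes, and a single H tolerates the one-digit hour:
toString(toTimestamp('11/04/2020 1:17:40','dd/MM/yyyy H:mm:ss'),'yyyy-MM-dd HH:mm:ss')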
Data Factory doesn't support the date format 'dd/mm/yyyy', so we cannot convert it to 'YYYY-MM-DD' directly.
I used a Derived Column transformation to generate a new column ABC_NEW from the origin column DateTime, with the expression below:
toTimestamp(concat(split(substring(DateTime,1, 10), '/')[3], '-',split(substring(DateTime,1, 10), '/')[2],'-',split(substring(DateTime,1, 10), '/')[1],substring(DateTime,11, length(DateTime))))
The result shows the converted values (screenshot omitted).
This is a trick that unblocked me; try this:
Go to the sink.
Open Mapping.
Click on the output format.
Select the date or time format you prefer for storing the data in the sink.

Specifying a timestamp or date format in an Athena table

I have a timestamp in ISO-8601 format and want to specify it as either a timestamp or a datetime format when creating a table in Athena. Any clues on how to do this?
Thanks!
When you create a table in Athena, you can set a column as date or timestamp only in the UNIX format, as follows:
DATE, in the UNIX format, such as YYYY-MM-DD.
TIMESTAMP. Instant in time and date in the UNIX format, such as yyyy-mm-dd hh:mm:ss[.f...]. For example, TIMESTAMP '2008-09-15 03:04:05.324'. This format uses the session time zone.
If the format is different, define the column as a string, and when you query the data use the date function:
from_iso8601_date(string) → date
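A minimal sketch, assuming a hypothetical table events whose ISO-8601 strings sit in a varchar column event_time:
-- parse at query time; from_iso8601_timestamp is the companion function
-- for strings that carry a time component as well as a date
SELECT from_iso8601_date(event_time) AS event_date,
       from_iso8601_timestamp(event_time) AS event_ts
FROM events
ORDER BY event_ts;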
You can make the data easier and cheaper to query for specific use cases by using a CTAS (CREATE TABLE AS SELECT) query, which generates a new copy of the data in a simpler and more efficient (compressed and columnar) Parquet format.
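A hedged CTAS sketch of that conversion, writing a Parquet-backed copy with a proper timestamp column (table and column names are hypothetical):
-- Athena tables don't store timestamp with time zone, hence the CAST
CREATE TABLE events_parquet
WITH (format = 'PARQUET') AS
SELECT CAST(from_iso8601_timestamp(event_time) AS timestamp) AS event_ts
FROM events;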

Null and date format handling in Talend

I have an Excel file with a date field, but the first row in the file is blank and a few other rows have a date in the format MM/dd/yyyy HH:mm:ss.
The data is to be loaded into a PostgreSQL table whose field has the data type timestamp, yyyy-mm-dd HH:mm:ss.
The Excel file cannot be modified, as it is downloaded from the cloud and the data is loaded straight into the table.
I tried using tConvertType, but it cannot accept null or " " values in a timestamp. I am facing a null tMap error at runtime in Talend. Even if I try to convert from string to date format in order to pass null through tMap, it changes the date format and shows an error. How can this be handled?
The Talend job structure is: tFileInputExcel -> tMap (date field: MM/dd/yyyy HH:mm:ss) -> tConvertType (date field: yyyy-mm-dd HH:mm:ss) -> tMap (yyyy-mm-dd HH:mm:ss) -> PostgreSQL table
(Excel screenshot omitted.)
First, I do not quite understand why you want to use the tConvertType component. After defining a proper schema, Talend converts your data into a Java Date object, and from that moment on the format is not important; you don't have to convert it when you want to put it into the Postgres table. At the very least, it should not cause a NullPointerException.
Consider the following steps:
Sample input file
I've prepared a file with a date value, a space, and an empty string; the solution I'm describing also works with nulls.
Configure the tFileInputExcel component
You have to allow null values by checking the Nullable check box. You should also check the trim option.
Examine the output
After connecting the input component to tLogRow, the null/empty/space values are handled properly.
I hope this is helpful.
You can capture the date format and null handling in a variable within the tMap component, for example:
var: row1.columnname == null ? null : TalendDate.formatDate("yyyy-MM-dd HH:mm:ss", row1.columnname)
so the data flow would be:
tFileInputExcel -> tMap -> PostgreSQL table
