Prefix DateTime variable - cognos

I am new to Cognos 10.2 Report Studio.
I need to declare a prefix datetime column in my SQL in order to make my union queries work.
I've tested a few datetime declarations, but none of them work and I keep getting "The server returned an unrecognizable query framework response".
I've tried some code that I found on Cognos forums, as shown below.
Code that I've tried:
1. '1970-01-01T00:00:00.000000000' as invdate
2. todate(null) as invdate
/********** This below is my code ***********/
select
'fstgld' as wso,
0 as pono,
'nosh' as shpm,
'gld' as DocType,
0 as DocNo,
'gl' as item,
trim(tffst305.dim2) as ItemGroup,
tffst305.year as fy,
tffst305.perd as period,
'fst' as slsordtype,
'finbg' as finbg,
0 as Qty,
tffst305.leac as leac,
0 as Sales,
tffst305.fdah - tffst305.fcah as Cost,
current_date as invdate <-- this is the part where I keep getting the error, as I need to declare a prefix datetime
From tffst305
WHERE
tffst305.ptyp = 1 and
tffst305.budg ='ACT' and
tffst305.company_nr = 810
union all
select
cisli310.orno as wso,
cisli310.pono as pono,
cisli310.shpm as shpm,
cisli310.tran as DocType,
cisli310.idoc as DocNo,
cisli310.item as item,
tdsls411.citg as ItemGroup,
tfgld018.year as fy,
tfgld018.vprd as period,
cisli310.sotp as slsordtype,
tccom112.cfcg as finbg,
cisli310.dqua as Qty,
'inv' as leac,
cisli310.amth(1) as Sales,
0 as Cost,
cisli305.idat as invdate <--- extracted from the table field
From cisli310
RIGHT OUTER JOIN cisli305 ON cisli310.tran = cisli305.tran and
cisli310.idoc = cisli305.idoc
LEFT OUTER JOIN tdsls411 ON cisli310.orno=tdsls411.orno and
cisli310.pono=tdsls411.pono
LEFT OUTER JOIN tccom112 ON cisli305.ofbp = tccom112.itbp
inner join tfgld018 on cisli310.tran = tfgld018.ttyp and cisli310.idoc =
tfgld018.docn
WHERE
cisli310.sotp in ('SSP', 'SPL', 'SWK') and cisli310.amth(1) <>0 and
cisli305.company_nr=810 and
cisli310.company_nr=810 and
tdsls411.company_nr=810 and
tfgld018.company_nr=810 and
tccom112.company_nr=810
The field of that record is a datetime datatype, such as 2009-07-03 03:08:03pm.

Try replacing current_date with # timestampMask ( $current_timestamp , 'yyyy-mm-dd' ) #. You can add other date or time portions as needed.
timestampMask ( string_expression1 , string_expression2 )
Returns "string_expression1", representing a timestamp with time zone, trimmed to the format specified in "string_expression2".
The format in "string_expression2" must be one of the following: 'yyyy', 'mm', 'dd', 'yyyy-mm', 'yyyymm', 'yyyy-mm-dd', 'yyyymmdd', 'yyyy-mm-dd hh:mm:ss', 'yyyy-mm-dd hh:mm:ss+hh:mm', 'yyyy-mm-dd hh:mm:ss.ff3', 'yyyy-mm-dd hh:mm:ss.ff3+hh:mm', 'yyyy-mm-ddThh:mm:ss', 'yyyy-mm-ddThh:mm:ss+hh:mm', 'yyyy-mm-ddThh:mm:ss.ff3', or 'yyyy-mm-ddThh:mm:ss.ff3+hh:mm'. The macro functions that return a string representation of a timestamp with time zone show a precision of 9 digits for the fractional part of the seconds by default. The format options allow this to be trimmed down to a precision of 3 or 0.
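For example, the first branch of the union could produce a full timestamp to match the invdate column. A sketch based on the macro above (depending on your data source, you may need the macro's sq() function so the expanded value arrives single-quoted):
# sq( timestampMask ( $current_timestamp , 'yyyy-mm-dd hh:mm:ss' ) ) # as invdate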

Related

SQL Server date range search issue

I have an MS SQL Server DateTime field, and I'm trying to search all records that fall within a date range:
mySqlString = "select * from users where signupDate >= @from and signupDate <= @to"
The two variables containing the date range come in the format MM/dd/yyyy (dataFrom and dataTo), so I'm filling in @from and @to as follows:
datefrom = new Date(dataFrom);
dateto = new Date(dataTo);
req.input('from', sql.DateTime, datefrom )
req.input('to', sql.DateTime, dateto )
But I do not get any results.
What's the best approach to get this working properly?
You can always use CONVERT to adapt your SQL query to your input format. In your case it's style 101: select convert(varchar, getdate(), 101) ---> mm/dd/yyyy
So your query should look like
where (signupDate >= CONVERT(date, @from, 101)) AND (signupDate <= CONVERT(date, @to, 101))
This way you won't have to worry about the time portion of the stored date.
req.input('from', sql.Date, (dataFrom))
req.input('to', sql.Date, (dataTo))
Assuming you've checked that dataFrom and dataTo contain valid dates.
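As a quick check of what style 101 does (a sketch with hypothetical literal values):
DECLARE @from varchar(10) = '01/15/2020';
DECLARE @to varchar(10) = '02/15/2020';
-- CONVERT with style 101 parses mm/dd/yyyy strings into date-only values
SELECT CONVERT(date, @from, 101) AS from_date,
       CONVERT(date, @to, 101) AS to_date;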

Athena query results show null values despite is not null condition in query

I have the following query which I run in Athena. I would like to receive all the results that have a tag in the 'resource_tags_aws_cloudformation_stack_name' column. However, when I run the query, my results include rows where 'resource_tags_aws_cloudformation_stack_name' is empty, and I don't know what I am doing wrong.
SELECT
cm.line_item_usage_account_id,
pr.line_of_business,
cm.resource_tags_aws_cloudformation_stack_name,
SUM(CASE WHEN cm.line_item_product_code = 'AmazonEC2'
THEN line_item_unblended_cost * 0.97
ELSE cm.line_item_unblended_cost END) AS discounted_cost,
CAST(cm.line_item_usage_start_date AS DATE) AS start_day
FROM cost_management cm
JOIN prod_cur_metadata pr ON cm.line_item_usage_account_id = pr.line_item_usage_account_id
WHERE cm.line_item_usage_account_id IN ('1234504482')
AND cm.resource_tags_aws_cloudformation_stack_name IS NOT NULL
AND cm.line_item_usage_start_date
BETWEEN date '2020-01-01'
AND date '2020-01-30'
GROUP BY cm.line_item_usage_account_id, pr.line_of_business, cm.resource_tags_aws_cloudformation_stack_name, CAST(cm.line_item_usage_start_date AS DATE)
HAVING sum(cm.line_item_blended_cost) > 0
ORDER BY cm.line_item_usage_account_id
I modified my query to exclude ' ' and that seems to work:
SELECT
cm.line_item_usage_account_id,
pr.line_of_business,
cm.resource_tags_aws_cloudformation_stack_name,
SUM(CASE WHEN cm.line_item_product_code = 'AmazonEC2'
THEN line_item_unblended_cost * 0.97
ELSE cm.line_item_unblended_cost END) AS discounted_cost,
CAST(cm.line_item_usage_start_date AS DATE) AS start_day
FROM cost_management cm
JOIN prod_cur_metadata pr ON cm.line_item_usage_account_id = pr.line_item_usage_account_id
WHERE cm.line_item_usage_account_id IN ('1234504482')
AND NOT cm.resource_tags_aws_cloudformation_stack_name = ' '
AND cm.line_item_usage_start_date
BETWEEN date '2020-01-01'
AND date '2020-01-30'
GROUP BY cm.line_item_usage_account_id, pr.line_of_business, cm.resource_tags_aws_cloudformation_stack_name, CAST(cm.line_item_usage_start_date AS DATE)
HAVING sum(cm.line_item_blended_cost) > 0
ORDER BY cm.line_item_usage_account_id
You can handle the space case as below:
AND Coalesce(cm.resource_tags_aws_cloudformation_stack_name, ' ') != ' '
Or, if there may be multiple spaces, try the following. (This query is not suitable if spaces are significant in the actual data.)
AND Regexp_replace(cm.resource_tags_aws_cloudformation_stack_name, ' ') is not null
Adding to this, you may also have special characters like CR or LF in the data, although that's a rare scenario.
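A more defensive variant (a sketch combining both ideas; TRIM and NULLIF are standard in Athena's Presto engine) is to strip surrounding whitespace and treat the resulting empty string like NULL:
AND NULLIF(TRIM(cm.resource_tags_aws_cloudformation_stack_name), '') IS NOT NULL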

DATEDIFF overflow

I am using the following code in Azure SQL Data Warehouse:
SELECT cast(DATEDIFF(ms, cast(Start as datetime2), cast(EndTime as datetime2)) as float) AS [total] -- difference to be calculated in milliseconds
FROM systable
but I am coming across this error:
"The datediff function resulted in an overflow. The number of dateparts separating two date/time instances is too large. Try to use datediff with a less precise datepart."
My requirement is to have the difference in milliseconds, and if that changes it will affect other results.
Could you please provide some help?
This happens because the DATEDIFF() function returns an integer. An integer only allows values up to 2,147,483,647. In this case, you have more than ~2.1B milliseconds, causing the data type overflow. You would ideally use the DATEDIFF_BIG() function, which returns a bigint that allows values up to 9,223,372,036,854,775,807, or ~9 quintillion. However, DATEDIFF_BIG() isn't supported in SQL Data Warehouse / Azure Synapse Analytics (as of Jan 2020).
You can vote for the feature here: (https://feedback.azure.com/forums/307516/suggestions/14781627)
Testing DATEDIFF(), you can see that you can get ~25 days and 20 hours of difference between dates before you run out of integers. Some sample code is below.
DECLARE @startdate DATETIME2 = '01/01/2020 00:00:00.0000';
DECLARE @enddate DATETIME2 = '01/01/2020 00:00:02.0000';
-- Support:
-- MILLISECOND: ~25 days 20 hours
-- MICROSECOND: ~35 minutes
-- NANOSECOND: ~2 seconds
SELECT
DATEDIFF(DAY, @startdate, @enddate) [day]
, DATEDIFF(HOUR, @startdate, @enddate) [hour]
, DATEDIFF(MINUTE, @startdate, @enddate) [minute]
, DATEDIFF(SECOND, @startdate, @enddate) [second]
, DATEDIFF(MILLISECOND, @startdate, @enddate) [millisecond]
, DATEDIFF(MICROSECOND, @startdate, @enddate) [microsecond]
, DATEDIFF(NANOSECOND, @startdate, @enddate) [nanosecond]
In the interim, you could calculate the ticks since the beginning of time for each value and then take the difference. For a DATETIME2, you can calculate ticks like this:
CREATE FUNCTION dbo.DATEDIFF_TICKS(@date DATETIME2)
RETURNS BIGINT
AS
BEGIN
RETURN
(DATEDIFF(DAY, '01/01/0001', CAST(@date AS DATE)) * 864000000000.0)
+ (DATEDIFF(SECOND, '00:00', CAST(@date AS TIME(7))) * 10000000.0)
+ (DATEPART(NANOSECOND, @date) / 100.0);
END
GO
You can then just run the function and determine the ticks and the difference between ticks.
DECLARE @startdate DATETIME2 = '01/01/2020 00:00:00.0000';
DECLARE @enddate DATETIME2 = '01/30/2020 00:00:00.0000';
SELECT
dbo.DATEDIFF_TICKS(@startdate) [start_ticks],
dbo.DATEDIFF_TICKS(@enddate) [end_ticks],
dbo.DATEDIFF_TICKS(@enddate) - dbo.DATEDIFF_TICKS(@startdate) [diff];
Here is a sample running 500 years of differences:
DECLARE @startdate DATETIME2 = '01/01/2000 00:00:00.0000';
DECLARE @enddate DATETIME2 = '01/01/2500 00:00:00.0000';
SELECT
dbo.DATEDIFF_TICKS(@startdate) [start_ticks],
dbo.DATEDIFF_TICKS(@enddate) [end_ticks],
dbo.DATEDIFF_TICKS(@enddate) - dbo.DATEDIFF_TICKS(@startdate) [diff];
The results:
start_ticks          end_ticks            diff
-------------------- -------------------- --------------------
630822816000000000   788608224000000000   157785408000000000
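Since each tick is 100 nanoseconds, 10,000 ticks make one millisecond, so you can get back to the original requirement by dividing the tick difference. A sketch against the question's own table and columns:
SELECT (dbo.DATEDIFF_TICKS(CAST(EndTime AS datetime2))
      - dbo.DATEDIFF_TICKS(CAST(Start AS datetime2))) / 10000 AS [total] -- milliseconds as bigint (integer division truncates sub-millisecond remainder)
FROM systable;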

Remove decimal places from varchar(32) result

I'm using SQL and a Windows batch script to download inventory from our POS and then upload it to a 3rd party platform. The file is successfully downloading and uploading, but the 3rd party platform is quite finicky about formatting. Specifically, it won't accept decimal places in the column titled "Quantity".
I've searched and tried various approaches but can't seem to find one that works. The tricky aspect of this SQL is that I had to use a union to create a title row, and I'm casting everything to varchar(32). I've posted my SQL below; any suggestions?
set nocount ON
SELECT CAST('sku' AS VARCHAR(32)) AS sku,
       CAST('quantity' AS VARCHAR(32)) AS quantity
UNION
SELECT CAST(IM_BARCOD.BARCOD AS VARCHAR(32)) AS sku,
       CASE WHEN IM_INV.QTY_AVAIL > 0
            THEN CAST(IM_INV.QTY_AVAIL AS VARCHAR(32))
            ELSE CAST(0 AS VARCHAR(32)) END AS quantity
FROM IM_BARCOD
INNER JOIN IM_INV ON IM_INV.ITEM_NO = IM_BARCOD.ITEM_NO
INNER JOIN IM_PRC ON IM_INV.ITEM_NO = IM_PRC.ITEM_NO
INNER JOIN IM_ITEM ON IM_INV.ITEM_NO = IM_ITEM.ITEM_NO
UNION
SELECT CAST(IM_BARCOD.BARCOD AS VARCHAR(32)) AS sku,
       CASE WHEN IM_INV_CELL.QTY_AVAIL > 0
            THEN CAST(IM_INV_CELL.QTY_AVAIL AS VARCHAR(32))
            ELSE CAST(0 AS VARCHAR(32)) END AS quantity
FROM IM_BARCOD
INNER JOIN IM_PRC ON IM_BARCOD.ITEM_NO = IM_PRC.ITEM_NO
INNER JOIN IM_INV_CELL ON IM_BARCOD.ITEM_NO = IM_INV_CELL.ITEM_NO
    AND IM_INV_CELL.DIM_1_UPR = IM_BARCOD.DIM_1_UPR
    AND IM_INV_CELL.DIM_2_UPR = IM_BARCOD.DIM_2_UPR
    AND IM_INV_CELL.DIM_3_UPR = IM_BARCOD.DIM_3_UPR
INNER JOIN IM_ITEM ON IM_BARCOD.ITEM_NO = IM_ITEM.ITEM_NO
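One way to drop the decimal places (a sketch, not tested against this POS schema; it assumes QTY_AVAIL is a decimal type holding whole quantities) is to cast to an integer before converting to VARCHAR, e.g. in the first CASE branch:
CASE WHEN IM_INV.QTY_AVAIL > 0
     THEN CAST(CAST(IM_INV.QTY_AVAIL AS INT) AS VARCHAR(32)) -- truncates any fractional part; use ROUND first if needed
     ELSE '0' END AS quantity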

virtual file set column and rowset variable U-SQL

I'm having an issue with scheduling a job in Data Factory.
I'm trying to set up a scheduled job that runs every hour and executes the same script with a different condition each hour.
Consider that I have a bunch of Avro files spread across Azure Data Lake Store with the following pattern:
/Data/SomeEntity/{date:yyyy}/{date:MM}/{date:dd}/SomeEntity_{date:yyyy}{date:MM}{date:dd}__{date:H}
Each hour, new files are added to Data Lake Store.
In order to process the files only once, I decided to handle them with the help of the U-SQL virtual file set column and a SyncTable which I created in Data Lake Store.
My query looks like the following:
DECLARE @file_set_path string = @"/Data/SomeEntity/{date:yyyy}/{date:MM}/{date:dd}/SomeEntity_{date:yyyy}_{date:MM}_{date:dd}__{date:H}";
@result = EXTRACT [Id] long,
....
date DateTime
FROM @file_set_path
USING someextractor;
@rdate =
SELECT MAX(ProcessedDate) AS ProcessedDate
FROM dbo.SyncTable
WHERE EntityName == "SomeEntity";
@finalResult = SELECT [Id],... FROM @result
CROSS JOIN @rdate AS r
WHERE date >= r.ProcessedDate;
Since I can't use a rowset variable in the WHERE clause, I'm cross joining the single row with the set; however, even in this case U-SQL won't find the correct files and always returns the full file set.
Is there any workaround or another approach?
I think this approach should work unless there is something not quite right somewhere, i.e., can you confirm the datatypes of the dbo.SyncTable table? Dump out @rdate and make sure the value you get there is what you expect.
I put together a simple demo which worked as expected. My copy of SyncTable had one record with the value of 01/01/2018:
@working =
SELECT *
FROM (
VALUES
( (int)1, DateTime.Parse("2017/12/31") ),
( (int)2, DateTime.Parse("2018/01/01") ),
( (int)3, DateTime.Parse("2018/02/01") )
) AS x ( id, someDate );
@rdate =
SELECT MAX(ProcessedDate) AS maxDate
FROM dbo.SyncTable;
//@output =
// SELECT *
// FROM @rdate;
@output =
SELECT *, (w.someDate - r.maxDate).ToString() AS diff
FROM @working AS w
CROSS JOIN
@rdate AS r
WHERE w.someDate >= r.maxDate;
OUTPUT @output TO "/output/output.csv"
USING Outputters.Csv();
I did try this with a filepath (full script here). The thing to remember is that the custom date format H represents the hour as a number from 0 to 23. If your SyncTable date does not have a time component when you insert it, it will default to midnight (0), meaning the whole day will be collected. Your file structure should look something like this according to your pattern:
"D:\Data Lake\USQLDataRoot\Data\SomeEntity\2017\12\31\SomeEntity_2017_12_31__8\test.csv"
I note your filepath has underscores in the second section and a double underscore before the hour section (which will be between 0 and 23, single digit up to hour 10). I notice your file set path does not have a file type or quotes - I've used test.csv in my tests.
Basically I think the approach will work, but there is something not quite right, maybe in your file structure, the value in your SyncTable, the datatype, etc. You need to go over the details, dumping out intermediate values to check, until you find the problem.
Doesn't the gist of wBob's full script resolve your issue? Here is a very slightly edited version of wBob's full script to address some of the issues you raised:
1. Ability to filter on SyncTable.
2. The last part of the pattern is a file name and not a folder. Sample file and structure: \Data\SomeEntity\2018\01\01\SomeEntity_2018_01_01__1
DECLARE @file_set_path string = @"/Data/SomeEntity/{date:yyyy}/{date:MM}/{date:dd}/SomeEntity_{date:yyyy}_{date:MM}_{date:dd}__{date:H}";
@input =
EXTRACT [Id] long,
date DateTime
FROM @file_set_path
USING Extractors.Text();
// in lieu of creating actual table
@syncTable =
SELECT * FROM
( VALUES
( "SomeEntity", new DateTime(2018,01,01,01,00,00) ),
( "AnotherEntity", new DateTime(2018,01,01,01,00,00) ),
( "SomeEntity", new DateTime(2018,01,01,00,00,00) ),
( "AnotherEntity", new DateTime(2018,01,01,00,00,00) ),
( "SomeEntity", new DateTime(2017,12,31,23,00,00) ),
( "AnotherEntity", new DateTime(2017,12,31,23,00,00) )
) AS x ( EntityName, ProcessedDate );
@rdate =
SELECT MAX(ProcessedDate) AS maxDate
FROM @syncTable
WHERE EntityName == "SomeEntity";
@output =
SELECT *,
date.ToString() AS dateString
FROM @input AS i
CROSS JOIN
@rdate AS r
WHERE i.date >= r.maxDate;
OUTPUT @output
TO "/output/output.txt"
ORDER BY Id
USING Outputters.Text(quoting:false);
Also, please note that file sets cannot perform partition elimination on dynamic joins, since the values are not known to the optimizer during the preparation phase.
I would suggest passing the sync point as a parameter from ADF to the processing script. Then the value is known to the optimizer and file set partition elimination will kick in. In the worst case, you would have to read the value from your sync table in a previous script and use it as a parameter in the next.
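A minimal sketch of that parameter approach (assuming ADF supplies the value; DECLARE EXTERNAL lets an external caller override the default, so the predicate is a compile-time constant and partition elimination can apply — the default date and output path here are illustrative):
// ADF (or another caller) can override this default at job submission time
DECLARE EXTERNAL @lastSyncPoint DateTime = new DateTime(2018, 01, 01);
DECLARE @file_set_path string = @"/Data/SomeEntity/{date:yyyy}/{date:MM}/{date:dd}/SomeEntity_{date:yyyy}_{date:MM}_{date:dd}__{date:H}";
@input =
EXTRACT [Id] long,
date DateTime
FROM @file_set_path
USING Extractors.Text();
@output =
SELECT *
FROM @input
WHERE date >= @lastSyncPoint; // constant predicate on the virtual column
OUTPUT @output
TO "/output/filtered.txt"
USING Outputters.Text();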
