(Azure) How do I get and compare the Timestamp date from Table Storage in a Logic App?

I'm trying to make a simple Azure Logic App; I've never made one before. It's set to check a Timestamp from a table in Table Storage.
I get a result, a List of Entities, that contains the Timestamp I want to compare. My question is: how do I compare the value of "Timestamp" with, for example, utcNow(), which returns a string?
{
  "odata.etag": "W/\"datetime'2020-05-07T09%3A07%3A32.8275489Z'\"",
  "Timestamp": "2020-05-07T09:07:32.8275489Z"
},
I just want to get the Timestamp-string out so I can compare it to the utcNow-string.

According to the screenshots you provided, it seems the error happens in the "Condition" that compares the timestamp. If you want to know how to compare timestamps, you can refer to my sample below:
I initialized a string variable named "Timestamp" to simulate the timestamp from your table storage. In the "Condition" we can use the ticks() function to convert each timestamp (string) to ticks (a number) and then compare them.
The expressions before and after the "is less than" are:
ticks(variables('Timestamp'))
ticks(utcNow())
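If it helps to see the idea outside a Logic App, here is a minimal Python sketch of the same comparison. Ticks are 100-nanosecond intervals since 0001-01-01, so comparing tick counts is equivalent to comparing the parsed datetimes; the ticks() helper below is mine, not part of any SDK.

```python
import re
from datetime import datetime, timezone

# .NET-style ticks: 100 ns intervals since 0001-01-01T00:00:00 UTC,
# which is what the Logic Apps ticks() expression returns.
EPOCH = datetime(1, 1, 1, tzinfo=timezone.utc)

def ticks(iso_timestamp: str) -> int:
    """Rough Python analogue of the Logic Apps ticks() expression.
    Sub-microsecond digits are truncated (Python datetimes keep
    microseconds only), which is fine for ordering comparisons here."""
    # Normalize the trailing 'Z' and trim fractional seconds to 6 digits.
    trimmed = re.sub(r"(\.\d{6})\d+", r"\1", iso_timestamp.replace("Z", "+00:00"))
    delta = datetime.fromisoformat(trimmed) - EPOCH
    # Integer arithmetic to avoid float rounding on ~6e17 values.
    return (delta.days * 86400 + delta.seconds) * 10**7 + delta.microseconds * 10

# Compare a stored Timestamp against "now", as the Condition does:
stored = ticks("2020-05-07T09:07:32.8275489Z")
now = ticks(datetime.now(timezone.utc).isoformat())
print(stored < now)  # True once the stored time is in the past
```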
Hope it helps~

Related

Incremental load in Azure Data Factory

I am replicating my data from Azure SQL DB to Azure SQL DB. I have some tables with date columns and some tables with just ID columns, which serve as the primary key. While performing an incremental load in ADF, I can select the date as the watermark column for the tables that have a date column, and the ID as the watermark column for the tables that have an ID column. But the issue is that my ID holds GUID values, so can I take that as my watermark column? When the copy activity runs, it gives me the following error in ADF.
Please see the image above for reference.
How can I overcome this issue? Help is appreciated.
Thank you
Gp
I have tried dynamic mapping (https://martinschoombee.com/2022/03/22/dynamic-column-mapping-in-azure-data-factory/) but it does not work; it still gives me the same error.
Regarding your question about the watermark:
A watermark is a column that has the last-updated timestamp or an ever-incrementing key.
So a GUID column would not be a good fit.
Try to find a date column, or an integer identity column which is ever-incrementing, to use as the watermark.
Since your source is SQL server, you can also use change data capture.
Links:
Incremental loading in ADF
Change data capture
Regards,
Chen
The watermark logic takes advantage of the fact that only the records inserted after the last saved watermark need to be considered for copying from source A to B; basically we are using the ">=" operator to our advantage here.
In the case of a GUID you cannot use that logic: a GUID can certainly be unique, but ">=" or "<=" comparisons against it are meaningless for ordering.
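The ">=" idea above can be sketched in a few lines of Python (table and column names are made up for illustration): only rows modified after the saved watermark are copied, and the watermark then advances to the maximum value seen. A GUID column offers no such "after" ordering.

```python
from datetime import datetime

# Hypothetical source rows with a last-modified column.
rows = [
    {"id": "a", "modified": datetime(2024, 1, 1)},
    {"id": "b", "modified": datetime(2024, 1, 5)},
    {"id": "c", "modified": datetime(2024, 1, 9)},
]

def incremental_load(rows, last_watermark):
    """Copy only rows changed since the saved watermark, then
    advance the watermark -- the '>' logic a GUID cannot support."""
    delta = [r for r in rows if r["modified"] > last_watermark]
    new_watermark = max((r["modified"] for r in delta), default=last_watermark)
    return delta, new_watermark

changed, watermark = incremental_load(rows, datetime(2024, 1, 2))
print([r["id"] for r in changed])  # only 'b' and 'c' are copied
```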

Not able to change datatype of Additional Column in Copy Activity - Azure Data Factory

I am facing a very simple problem: I am not able to change the datatype of an additional column in the Copy activity of an ADF pipeline from String to Datetime.
I am trying to change the source datatype for the additional column in the mapping using JSON, but it still doesn't work with the PolyBase command.
When I run my pipeline it gives the same error.
Is it not possible to change the datatype of an additional column? By default it takes String only.
Dynamic columns return string.
Try to put the value (e.g. utcnow()) in the dynamic content of the query and cast it to the required target datatype.
Otherwise you can use a Data Flow derived column:
https://learn.microsoft.com/en-us/azure/data-factory/data-flow-derived-column
Since your source is a query, you can choose to bring current date in source SQL query itself in the desired format rather than adding it in the additional column.
Thanks
Try to use formatDateTime as shown below and define the desired date format:
Here, since the format given is 'yyyy-dd-MM', the result will look as below:
Note: the output here will be of String type only, as in the Copy activity we cannot cast the data type to a date.
We could either create the current date in the source SQL query or use the above approach so that the data loads into the sink in the expected format.
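To make the note concrete, here is a small Python sketch of what a 'yyyy-dd-MM' pattern produces; the format_date_time helper is my own minimal stand-in for the ADF formatDateTime() expression, not a library function, and it deliberately handles only the tokens used here.

```python
from datetime import datetime, timezone

def format_date_time(dt: datetime, fmt: str = "yyyy-dd-MM") -> str:
    """Minimal stand-in for ADF formatDateTime(); only the 'yyyy',
    'dd' and 'MM' tokens are supported. The result is a string --
    the type cannot be changed to a date inside the Copy activity."""
    mapping = {"yyyy": "%Y", "dd": "%d", "MM": "%m"}
    for token, strf in mapping.items():
        fmt = fmt.replace(token, strf)
    return dt.strftime(fmt)

# Day and month swap positions under 'yyyy-dd-MM':
print(format_date_time(datetime(2023, 4, 15, tzinfo=timezone.utc)))  # 2023-15-04
```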

Azure Logic App - GetEntities (Azure Table) connector filter returning wrong result

In my Logic App one step takes some filtered entities from Azure Table Storage. The filter consists of two conditions:
One field has to be equal to some constant value
Another field (datetime) has to be less than or equal to the current time minus 10 minutes
It worked OK until last month, when it started to return wrong results, as seen in the screenshot below:
And the connector in Edit Mode:
I cannot work out what is happening. If I edit the row in the Azure Table (just click Update without changing anything), it starts to work properly. I thought that maybe the field was set with the wrong type, but everything seems OK:
Maybe your error is caused by the wrong type of CreatedDate; you can refer to this post.
Simply put, you inserted a time-formatted string into the Azure Table where DateTime data was expected. It is shown in the portal as type DateTime, but it is actually stored as a String.
Solution:
1. If you want to insert data of type DateTime, you can specify the odata.type annotation; please refer to the following example:
{
  "Address":"Mountain View",
  "Age":23,
  "AmountDue":200.23,
  "CustomerCode@odata.type":"Edm.Guid",
  "CustomerCode":"c9da6455-213d-42c9-9a79-3e9149a57833",
  "CustomerSince@odata.type":"Edm.DateTime",
  "CustomerSince":"2008-07-10T00:00:00",
  "IsActive":true,
  "NumberOfOrders@odata.type":"Edm.Int64",
  "NumberOfOrders":"255",
  "PartitionKey":"mypartitionkey",
  "RowKey":"myrowkey"
}
Reference:
https://learn.microsoft.com/en-us/rest/api/storageservices/understanding-the-table-service-data-model#property-types
https://learn.microsoft.com/en-us/rest/api/storageservices/inserting-and-updating-entities
2. Define CreatedDate as a String type, but this is not a very good solution; it is better to insert correct DateTime data.
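A small Python sketch of building such an annotated payload for the Table service REST API (the entity property names are illustrative, and the annotate helper is mine): any datetime or UUID value gets an explicit Edm type annotation so it is stored as the real type rather than a string.

```python
import uuid
from datetime import datetime

def annotate(entity: dict) -> dict:
    """Return a JSON-ready payload where datetime and UUID values
    carry the '@odata.type' annotation the Table service expects."""
    payload = {}
    for key, value in entity.items():
        if isinstance(value, datetime):
            payload[f"{key}@odata.type"] = "Edm.DateTime"
            payload[key] = value.strftime("%Y-%m-%dT%H:%M:%S")
        elif isinstance(value, uuid.UUID):
            payload[f"{key}@odata.type"] = "Edm.Guid"
            payload[key] = str(value)
        else:
            payload[key] = value  # strings, numbers, booleans pass through
    return payload

# Example: CreatedDate will be stored as a real Edm.DateTime.
body = annotate({"PartitionKey": "p1", "RowKey": "r1",
                 "CreatedDate": datetime(2008, 7, 10)})
```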

Adding Extraction DateTime in Azure Data Factory

I want to write a generic Data Factory pipeline in V2 with the scenario below:
Source ---> Extracted (Salesforce or some other way), which doesn't have an extraction timestamp ---> I want to write it to Blob with an extraction timestamp.
I want it to be generic, so I don't want to give a column mapping anywhere.
Is there any way to use an expression or system variable in a custom activity to append a column to the output dataset? I'd like a very simple solution to make the implementation realistic.
To do that you should change the query to add the column you need, using the query property in the copy activity of the pipeline. https://learn.microsoft.com/en-us/azure/data-factory/connector-salesforce#copy-activity-properties
I don't know much about Salesforce, but in SQL Server you can do the following:
SELECT *, CURRENT_TIMESTAMP as AddedTimeStamp from [schema].[table]
This will give you every field in your table and add a column named AddedTimeStamp with the CURRENT_TIMESTAMP value in every row of the result.
Hope this helped!

How is the RowKey formed in the WADMetrics*** table?

I'm looking to understand the data returned by WADMetricsPT1HP10DV2S20160415 table inside Azure's storage account.
It has a PartitionKey, RowKey and entity properties. The PartitionKey I understand: it translates to the resource ID of my resource inside the storage account. However, I only partially understand the RowKey. An example RowKey is:
:005CNetworkInterface:005CPacketsReceived__2519410355999999999
I understand the first part, which is a metric name. But what I don't understand is the number/digits that follow. I am assuming it to be a timestamp, but can't say for sure.
I was attempting to use a RowKey filter, but due to this added wrinkle it's almost impossible to generate the RowKey and use it as a filter. Does anyone know how to generate the numbers/digits in order to create a RowKey filter?
If you're curious about what the number 2519410355999999999 is: it is essentially reverse ticks, derived by subtracting the event date/time ticks from the maximum date/time ticks value. To derive the date/time value from this number, here's what you can do:
// Reverse-tick suffix taken from the RowKey
long reverseTicks = 2519410355999999999;
long maxTicks = DateTime.MaxValue.Ticks;
// The actual event time is max ticks minus the reverse ticks
var dateTime = new DateTime(maxTicks - reverseTicks, DateTimeKind.Utc);
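The same decoding, plus the reverse direction needed to build a RowKey filter, can be sketched in Python (the helper names are mine; 3155378975999999999 is .NET's DateTime.MaxValue.Ticks):

```python
from datetime import datetime, timedelta

# .NET DateTime.MaxValue.Ticks: 100 ns units since 0001-01-01.
MAX_TICKS = 3155378975999999999

def reverse_ticks_to_datetime(reverse_ticks: int) -> datetime:
    """Decode a WADMetrics RowKey suffix back into a UTC timestamp
    (sub-microsecond precision is dropped)."""
    ticks = MAX_TICKS - reverse_ticks
    return datetime(1, 1, 1) + timedelta(microseconds=ticks // 10)

def datetime_to_reverse_ticks(dt: datetime) -> int:
    """Build the reverse-tick suffix for a RowKey range filter."""
    delta = dt - datetime(1, 1, 1)
    # Integer arithmetic avoids float rounding at this magnitude.
    ticks = (delta.days * 86400 + delta.seconds) * 10**7 + delta.microseconds * 10
    return MAX_TICKS - ticks

# The example suffix decodes to a 2016 timestamp; note that because
# the ticks are reversed, *later* events have *smaller* suffixes.
print(reverse_ticks_to_datetime(2519410355999999999))
```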
