Azure Logic App - GetEntities (Azure Table) connector filter returning wrong result

In my logic app, one step fetches filtered entities from Azure Table Storage. The filter consists of two conditions:
One field has to be equal to a constant value
The other field (a datetime) has to be less than or equal to the current time minus 10 minutes
It worked fine until last month, when it started to return wrong results, as seen in the screenshot below:
And the connector in edit mode:
I cannot work out what is happening. If I edit the row in the Azure Table (just click Update without changing anything), it starts to work properly. I thought that maybe the field was set with the wrong type, but everything seems OK:

Maybe your error is caused by the wrong type of CreatedDate; you can refer to this post.
Simply put, you inserted a time-formatted string into the Azure Table. It is shown in the portal as type DateTime, but it is actually stored as a String, so the datetime comparison in your filter fails.
Solution:
1. If you want to insert data of type DateTime, you can specify odata.type; please refer to the following example:
{
"Address":"Mountain View",
"Age":23,
"AmountDue":200.23,
"CustomerCode#odata.type":"Edm.Guid",
"CustomerCode":"c9da6455-213d-42c9-9a79-3e9149a57833",
"CustomerSince#odata.type":"Edm.DateTime",
"CustomerSince":"2008-07-10T00:00:00",
"IsActive":true,
"NumberOfOrders#odata.type":"Edm.Int64",
"NumberOfOrders":"255",
"PartitionKey":"mypartitionkey",
"RowKey":"myrowkey"
}
Reference:
https://learn.microsoft.com/en-us/rest/api/storageservices/understanding-the-table-service-data-model#property-types
https://learn.microsoft.com/en-us/rest/api/storageservices/inserting-and-updating-entities
2. Define CreatedDate as a String type. This is not a very good solution, though; it is better to insert correct DateTime data.
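As a minimal sketch of option 1 using the Python azure-data-tables SDK (the connection string, table name, and field name here are placeholders, not from your app):
from datetime import datetime, timezone
from azure.data.tables import TableClient

# Placeholder connection string and table name, for illustration only.
table = TableClient.from_connection_string(
    conn_str="<your-connection-string>", table_name="MyTable")

entity = {
    "PartitionKey": "mypartitionkey",
    "RowKey": "myrowkey",
    # Passing a Python datetime makes the SDK store the property as
    # Edm.DateTime; inserting an ISO string would store Edm.String instead.
    "CreatedDate": datetime.now(timezone.utc),
}
table.upsert_entity(entity)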

Related

Incremental load in Azure Data Factory

I am replicating my data from Azure SQL DB to Azure SQL DB. I have some tables with date columns, and some tables with just ID columns that serve as the primary key. While performing an incremental load in ADF, I can select the date as the watermark column for the tables that have a date column, and the ID as the watermark column for the tables that have an ID column. The issue is that my ID holds GUID values, so can I take that as my watermark column? And if yes: while the copy activity processes, it gives me the following error in ADF:
Please see the image above for reference.
How can I overcome this issue? Help is appreciated.
Thank you,
Gp
I have tried dynamic mapping (https://martinschoombee.com/2022/03/22/dynamic-column-mapping-in-azure-data-factory/) but it does not work; it still gives me the same error.
Regarding your question about the watermark:
A watermark is a column that has the last updated time stamp or an incrementing key
So a GUID column would not be a good fit.
Try to find a date column, or an ever-incrementing integer identity, to use as the watermark.
Since your source is SQL server, you can also use change data capture.
Links:
Incremental loading in ADF
Change data capture
Regards,
Chen
The watermark logic takes advantage of the fact that only the records inserted after the last saved watermark need to be considered for copying from source A to B; basically, we are using the ">=" operator to our advantage here.
In the case of a GUID you cannot use that logic: a GUID is certainly unique, but ">=" or "<=" comparisons against it are meaningless for ordering.
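A tiny Python sketch of that point (illustration only, not ADF code): with random GUIDs, a "greater than the last watermark" test is essentially a coin flip:
import uuid

older = uuid.uuid4()  # row inserted first
newer = uuid.uuid4()  # row inserted later
# Watermark logic relies on new rows comparing greater than the last
# saved value; random GUIDs give no such ordering guarantee.
print(newer > older)  # True or False, unpredictably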

Not able to change datatype of Additional Column in Copy Activity - Azure Data Factory

I am facing a very simple problem: I am not able to change the datatype of an additional column in a Copy activity in an ADF pipeline from String to Datetime.
I am trying to change the source datatype for the additional column in the mapping using JSON, but it still doesn't work with the PolyBase command.
When I run my pipeline it gives the same error.
Is it not possible to change the datatype of an additional column? By default it takes String only.
Dynamic columns return strings.
Try to put the value (e.g. utcnow()) in the dynamic content of the query and cast it to the required target datatype.
Otherwise you can use a data flow derived column:
https://learn.microsoft.com/en-us/azure/data-factory/data-flow-derived-column
Since your source is a query, you can choose to bring the current date into the source SQL query itself in the desired format, rather than adding it as an additional column.
Thanks
Try to use formatDateTime as shown below and define the desired date format:
formatDateTime(utcnow(), 'yyyy-dd-MM')
Here, since the format given is 'yyyy-dd-MM', the result will look like 2024-31-01 (year, day, month).
Note: The output here will be of String format only, as in the Copy activity we cannot cast the data type to a date.
We could either create the current date in the source SQL query or use the way above, so that the data loads into the sink in the expected format.
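As a rough Python analogue (not ADF) of what that formatDateTime pattern produces; note the result is a plain string:
from datetime import datetime, timezone

# '%Y-%d-%m' mirrors the 'yyyy-dd-MM' pattern from the answer above.
load_date = datetime.now(timezone.utc).strftime("%Y-%d-%m")
print(load_date)        # e.g. '2024-31-01'
print(type(load_date))  # <class 'str'>, still a string, not a date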

Azure-Data-Factory - If Condition returns false despite being logically true

I'm trying to do a logical test to compare two activity outputs.
The first one gives back a file name (derived from Get Metadata) and the other gives the distinct file names that are already in the database (derived from a Lookup activity).
So the first activity gives X.csv (a file in a blob) while the second one gives a list, Y.csv; Z.csv (the result of the lookup's SELECT DISTINCT from table X).
Based on this outcome I would say that the logical test is true, so ADF has to start a particular activity. I'm using the expression below, but despite the fact that there are no errors, the outcome is always false. What am I doing wrong? I guess it has something to do with the Lookup activity, because the query gives back a list of values, I think.
Please help! Thanks in advance!
@equals(activity('GetBlobName').output,activity('LookupBestandsnaam').output)
Output of the LookupBestandsnaam activity:
Output of the GetBlobName activity:
The outputs of Lookup and Get Metadata are different:
The Lookup activity reads and returns the content of a configuration file or table.
The Get Metadata activity retrieves the metadata of any data in Azure Data Factory.
We can't compare the outputs directly; you will always get false in the If Condition expression.
Please try the expression below:
@equals(activity('GetBlobName').output.value.name,activity('LookupBestandsnaam').output.value.bestandsnaam)
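For intuition, here is a rough Python illustration (the payload shapes are simplified assumptions, not the exact ADF outputs) of why comparing the whole outputs always yields false:
# Simplified stand-ins for the two activity outputs.
get_blob_output = {"itemName": "X.csv"}
lookup_output = {"value": [{"bestandsnaam": "Y.csv"}, {"bestandsnaam": "Z.csv"}]}

# The full objects differ in shape, so equality is always False:
print(get_blob_output == lookup_output)  # False

# Comparing the specific fields is what actually answers the question:
names = [row["bestandsnaam"] for row in lookup_output["value"]]
print(get_blob_output["itemName"] in names)  # True only if X.csv is listed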
Update:
Congratulations on solving it another way:
"I have now replaced the If Condition with a stored procedure that uses an IF EXISTS script running on the basis of the Lookup activity in ADF."

(Azure) How do I get and compare the Timestamp date from Table storage in a logic app?

I'm trying to make a simple Azure Logic App; I've never made one before. It's set to check a Timestamp from a table in Table Storage.
I get a result (a list of entities) that contains the Timestamp I want to compare, but my question is: how do I compare the value of "Timestamp" with, for example, utcNow(), which is a string?
{
"odata.etag": "W/\"datetime'2020-05-07T09%3A07%3A32.8275489Z'\"",
"Timestamp": "2020-05-07T09:07:32.8275489Z"
},
I just want to get the Timestamp string out so I can compare it to the utcNow string.
According to the screenshots you provided, it seems the error happens in the "Condition" that compares the timestamp. If you want to know how to compare timestamps, you can refer to my sample below:
I initialized a variable (string) named "Timestamp" to simulate the timestamp from your table storage. In the "Condition" we can use the ticks() method to convert each timestamp (string) into ticks (a number of 100-nanosecond intervals since 0001-01-01) and then compare the two numbers.
The expressions before and after the "is less than" are:
ticks(variables('Timestamp'))
ticks(utcNow())
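If it helps, here is a rough Python analogue of that ticks() comparison (the sample timestamp is trimmed to six fractional digits so that fromisoformat accepts it):
from datetime import datetime, timezone

def ticks(iso_string):
    # Mirrors the Logic Apps ticks() function: the number of
    # 100-nanosecond intervals since 0001-01-01 UTC.
    dt = datetime.fromisoformat(iso_string.replace("Z", "+00:00"))
    epoch = datetime(1, 1, 1, tzinfo=timezone.utc)
    return int((dt - epoch).total_seconds() * 10_000_000)

now = datetime.now(timezone.utc).isoformat()
print(ticks("2020-05-07T09:07:32.827548Z") < ticks(now))  # True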
Hope it helps~

Microsoft Graph API $filter=name eq 'foo' query not working on GET workbook/tables/{id}/columns. No error and no filtering

I'm looking at a table (Table1) inside an Excel book saved on my OneDrive for Business account. I then want to get the maximum value in the CREATEDDATE column from this table.
I want to avoid pulling down the whole table with the API, so I'm trying to filter the results of my query to only the CREATEDDATE column. However, the column results from the table are not being filtered to the one column and I'm not getting an error to help troubleshoot why. All I get is an HTTP 200 response and the full unfiltered table results.
Is it possible to filter the columns retrieved from the API by the column name? The documentation made me think so.
I've confirmed that /columns?$select=name works correctly and returns just the name field, so I know that it recognizes this as an entity. $filter and $orderby do nothing when referencing any of the entities from the response (name, id, index, values). I know that I can limit columns by position, but I'd rather explicitly reference the column by name in case the order changes.
I'm using this query:
/v1.0/me/drive/items/{ID}/workbook/tables/Table1/columns?$filter=name eq 'CREATEDDATE'
You don't need $filter here; just pull the column by name directly. The prototypes from the Get TableColumn documentation are:
GET /workbook/tables/{id|name}/columns/{id|name}
GET /workbook/worksheets/{id|name}/tables/{id|name}/columns/{id|name}
So in your case, you should be able to simply call:
/v1.0/me/drive/items/{ID}/workbook/tables/Table1/columns/CREATEDDATE
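A short Python sketch of calling that endpoint (token and item_id are placeholders, and the assumption that the header cell comes first in values is worth verifying against your table):
import requests

url = ("https://graph.microsoft.com/v1.0/me/drive/items/" + item_id +
       "/workbook/tables/Table1/columns/CREATEDDATE")
resp = requests.get(url, headers={"Authorization": "Bearer " + token})
resp.raise_for_status()

# 'values' is a 2-D array of the column's cells.
cells = resp.json()["values"]
created_dates = [row[0] for row in cells[1:]]  # skip the assumed header row
print(max(created_dates))  # ISO-8601 strings sort chronologically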
