I have an Excel sheet with a column containing IDs of items I want to retrieve
||ID||
|123|
|124|
|125|
The API I am calling can take an array of IDs as input (e.g. https://API.com/rest/items?ID=123&ID=124&ID=125... up to 50 IDs per request) and returns a single JSON response:
"data": [
{
"id": 123,
"fields": {
"name": "blah blah",
"description": "some description",
}
},
{
"id": 124,
"fields": {
"name": "blah bli",
"description": "some description",
}
},
{
"id": 125,
"fields": {
"name": "blah blo",
"description": "some description",
}
},...
]
}
I would like to load data from this JSON in another table or sheet.
||ID||Name||
|123|blah blah|
|124|blah bli|
|125|blah blo|
I would know how to parameterise the query by referencing single cells, but if I am retrieving 100+ items, that's a lot of work.
Can't I somehow build a query that gets each value (ID) from the table or range in one simple move?
--edit1--
Found another API endpoint where I can provide an array of IDs. (Previously I thought it was only possible to send one ID per request, retrieving one JSON at a time.)
--edit2--
Maybe I can concatenate all IDs into the request URL already in an Excel cell and parameterise just based on that one cell. (still experimenting)
If you've got the JSON for each ID, then you just need to pull out the name part.
For example, if you have a column [JSON] with the JSON for each ID, then you can create a custom column using Json.Document([JSON])[data][name] as the formula.
You can likely combine pulling the JSON and parsing it into a single step if you'd like.
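Outside of Power Query, the batching idea from the question (concatenating up to 50 IDs into one request URL) can be sketched as below. The endpoint URL and the ID parameter name are copied from the example URL above; `buildBatchUrls` is a name chosen for illustration, not part of any real API.

```javascript
// Sketch: split a list of IDs into batches of up to 50 and build one
// request URL per batch, matching the example URL format in the question.
function buildBatchUrls(ids, batchSize = 50) {
  const urls = [];
  for (let i = 0; i < ids.length; i += batchSize) {
    const batch = ids.slice(i, i + batchSize);
    // Repeat the ID parameter once per value: ID=123&ID=124&...
    const query = batch.map(id => `ID=${id}`).join("&");
    urls.push(`https://API.com/rest/items?${query}`);
  }
  return urls;
}
```

Each resulting URL could then be fed to a single parameterised query, so only the ID list (one cell or one range) needs to change.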
So, I have a logic app that looks like the one below.
[screenshot of the logic app workflow]
The main idea of the app is to get the items of a SharePoint list and copy the contents into a CSV file in blob storage.
The site name and list name are passed through the HTTP request body.
However, I would like to also define the Select operation column mapping dynamically.
The body looks like this
{
    "listName": "The list name",
    "siteAddress": "SharepointSiteAddress",
    "columns": {
        "Email": " @item()?['Employee']?['Email']",
        "Region": " @item()?['Region']?['Value']"
    }
}
In the 'Map' section of the 'Select' Operation I use the 'columns' property as shown below
[screenshot of the 'Select' operation's Map configuration]
However, in the output of the 'Select' operation, the email and region column values resolve to the literal strings that were passed, instead of the actual item values I am trying to refer to.
Can I somehow create the csv table dynamically through the HTTP request while also being able to access the items' values?
Using expressions, you can create the CSV file with dynamic data. I have reproduced the issue on my side, and below are the steps I followed.
I created a logic app as shown below.
In the HTTP trigger, I defined a sample payload as shown below:
{
    "listName": "The list name",
    "siteAddress": "SharepointSiteAddress",
    "columns": {
        "Email": " Email",
        "DisplayName": "DisplayName"
    }
}
In the Select action, 'From' is taken from the Get items value. In the Map row, the key comes from the HTTP trigger and the value from the SharePoint item, as shown below.
Map:
Key: triggerBody()?['columns']?['Email']
Value: item()?['Editor']?['Email']
The output of the Get items action in my case looks like below, so I wrote the expression accordingly.
"value": [
{
"#odata.etag": "\"1\"",
"ItemInternalId": "3",
"ID": 3,
"Modified": "2022-11-15T10:49:47Z",
"Editor": {
"#odata.type": "#Microsoft.Azure.Connectors.SharePoint.SPListExpandedUser",
"Claims": "i:0#.f|membership|xyzt.com",
"DisplayName": "Test",
"Email": "v#mail.com",
"JobTitle": ""
}
I tested the logic app. It ran successfully and the CSV file was generated as below.
CSV file:
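The Key/Value pair in the Map above can be sketched in plain JavaScript to show how the mapping resolves. Here `triggerBody` and `items` are stand-ins for the HTTP trigger payload and the Get items output from the steps above, not the real connector objects:

```javascript
// Sketch of what the Select action's Map does here: build one output row
// per item, with the column name coming from the trigger body and the
// value coming from the SharePoint item itself.
const triggerBody = { columns: { Email: "Email" } };
const items = [
  { ID: 3, Editor: { DisplayName: "Test", Email: "v@mail.com" } }
];

// Equivalent of: Key   -> triggerBody()?['columns']?['Email']
//                Value -> item()?['Editor']?['Email']
const rows = items.map(item => ({
  [triggerBody.columns.Email]: item.Editor?.Email
}));
```

The key point is that the column *name* is plain data from the request body, while the *value* must be an expression evaluated per item; passing the expression text inside the body leaves it as a literal string.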
I have these kinds of JSON documents in a CosmosDB database:
{
    "Version": 0,
    "Entity": {
        "ID": "xxxxxxx",
        "EventHistory": {
            "2020-04-28T16:30:35.6887561Z": "NEW",
            "2020-04-28T16:35:21.1811993Z": "PROCESSED"
        },
        "SourceSystem": "xxxx",
        "SourceSystemIdentifier": "xxxx",
        "PCC": "xxx",
        "StorageReference": "xxxxxxxxxxxx",
        "SupplementaryData": {
            "eTicketCount": "2"
        }
    }
}
The number of sub-properties within the EventHistory node is dynamic. In the example there are two but it can be any number.
I couldn't find a way to count how many sub-properties the node contains. At a minimum, I need to query the documents that have only one property declared.
FYI: I'm not able to change the format of the documents. I know that it would be more convenient to store them as an array.
I tried to use the ARRAY_LENGTH and COUNT functions, but since EventHistory is not an array, neither could be applied.
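One possible workaround (an assumption on my part, not something stated in the question) is a user-defined function: Cosmos DB UDFs are plain JavaScript, and a JavaScript function can count an object's keys directly. `countProps` is a name chosen for illustration:

```javascript
// Sketch of a Cosmos DB UDF that counts an object's properties.
// If registered on the container as "countProps", a query such as
//   SELECT * FROM c WHERE udf.countProps(c.Entity.EventHistory) = 1
// should return the documents with exactly one EventHistory entry.
function countProps(obj) {
  // Object.keys returns the object's own enumerable property names.
  return Object.keys(obj).length;
}
```

Note that UDFs in filters prevent index use, so this scans the affected documents; for large containers that cost is worth checking first.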
I am stuck and need help getting data from one collection based on an array element from another collection.
My collections are as follows --
programs.dbs --
{
    "_id": ObjectId("61c8f42ec63e700b415b4bed"),
    "name": "Java",
    "description": "this is a dummy information",
    "instructor": ["61c8f6d7e690fc413a075e15", "61c8f6d7e690fc413a071e15", "61c8f0d7e690fc413a071e15"]
}
instructor.dbs --
{
    "_id": ObjectId("61c8f6d7e690fc413a075e15"),
    "name": "Instrctor1",
    "description": "this is a dummy Instructor"
},
{
    "_id": ObjectId("61c8f6d7e690fc413a071e15"),
    "name": "Instrctor2",
    "description": "this is a dummy Instructor"
},
{
    "_id": ObjectId("61c8f0d7e690fc413a071e15"),
    "name": "Instrctor3",
    "description": "this is a dummy Instructor"
}
My goal is to look up an instructor's information based on the elements of the instructor array in the programs collection.
I have tried this query, but I am getting an empty result --
db.getCollection('programs.dbs').aggregate([
    { $lookup: { from: 'instructor.dbs', localField: 'instructor', foreignField: '_id', as: 'instructors' } }
])
Immediate help will be great.
If you are using Mongoose, maybe the populate method can work.
Your array of instructors should not reference the IDs as strings but as ObjectId(...)
So you would need to have:
"instructor": [ObjectId("61c8f6d7e690fc413a075e15"), ObjectId("61c8f6d7e690fc413a071e15"), ObjectId("61c8f0d7e690fc413a071e15")]
I'm trying to archive old data from CosmosDB into Azure Tables but I'm very new to Azure Data Factory and I'm not sure what would be a good approach to do this. At first, I thought that this could be done with a Copy Activity but because the properties from my documents stored in the CosmosDB source vary, I'm getting mapping issues. Any idea on what would be a good approach to tackle this archiving process?
Basically, the way I want to store the data is to copy the document root properties as they are, and store the nested JSON as a serialized string.
For example, if I wanted to archive these 2 documents :
[
    {
        "identifier": "1st Guid here",
        "Contact": {
            "Name": "John Doe",
            "Age": 99
        }
    },
    {
        "identifier": "2nd Guid here",
        "Distributor": {
            "Name": "Jane Doe",
            "Phone": {
                "Number": "12345",
                "IsVerified": true
            }
        }
    }
]
I'd like these documents to be stored in Azure Table like this:
identifier | Contact | Distributor
"Ist Guid here" | "{ \"Name\": \"John Doe\", \"Age\": 99 }" | null
"2nd Guid here" | null | "{\"Name\":\"Jane Doe\",\"Phone\":{\"Number\":\"12345\",\"IsVerified\":true}}"
Is this possible with the Copy Activity?
I tried using the mapping tab inside the CopyActivity, but when I try to run it I get an error saying that the dataType for one of the Nested JSON columns that are not present in the first row cannot be inferred.
Please follow my configuration in the Mapping tab.
Test output with your sample data:
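As a sketch of the target shape (root-level properties kept as-is, nested objects serialized to strings), assuming a small transform could run before or instead of the Copy Activity mapping; `flattenDoc` is a name chosen for illustration:

```javascript
// Sketch: keep root-level scalar properties and turn any nested object
// into a JSON string, matching the desired Azure Table layout from the
// question (missing columns simply stay absent, i.e. null in the table).
function flattenDoc(doc) {
  const row = {};
  for (const [key, value] of Object.entries(doc)) {
    row[key] =
      value !== null && typeof value === "object"
        ? JSON.stringify(value)   // nested JSON becomes a serialized string
        : value;                  // scalars are copied as they are
  }
  return row;
}
```

Because every nested property collapses into one string column, documents with differing shapes no longer need a shared schema, which is what the Copy Activity's type inference was choking on.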
I am filtering on an array of addresses. A type attribute determines the address category (POSTAL,RES,WORK etc). I need to filter only the POSTAL and RES address from this array.
I tried to use the Filter array action, but it can take only one filter condition.
Can it be edited in the code view to achieve multiple filter conditions? If yes, what is the correct syntax for it?
{
    "Name": "Douglas Adams",
    "Address": [
        {
            "Type": "POSTALS",
            "street_address": "42",
            "city": "Milky Way",
            "state": "HI"
        },
        {
            "Type": "RES",
            "street_address": "1618",
            "city": "Golden ratio",
            "state": "MA"
        },
        {
            "Type": "BILLING",
            "street_address": "1618",
            "city": "Golden ratio",
            "state": "MA"
        }
    ]
}
Can it be edited in the code view to achieve multiple filter conditions?
In short, yes. Logic Apps now supports multiple rules in a condition block. Here is an issue you could refer to.
Because I do not know which action comes above your Filter array, I will give you similar syntax (e.g. the HTTP body equals 'bbb' and the HTTP headers equal 'aaa'):
@and(equals(triggerOutputs()['headers'],'aaa'), equals(triggerBody(),'bbb'))
You could adapt this to your situation and fill it in via 'Edit in advanced mode' in the Filter array action.
For more details, you could refer to this thread.
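For the address array in the question, the equivalent expression would combine two equality checks with @or, e.g. @or(equals(item()?['Type'], 'POSTAL'), equals(item()?['Type'], 'RES')). The same idea sketched in plain JavaScript (note that the sample data spells the first type "POSTALS", which does not match "POSTAL" exactly):

```javascript
// Sketch of the multi-condition filter, mirroring what the Filter array
// action does with an @or(equals(...), equals(...)) expression.
const addresses = [
  { Type: "POSTALS", street_address: "42", city: "Milky Way", state: "HI" },
  { Type: "RES", street_address: "1618", city: "Golden ratio", state: "MA" },
  { Type: "BILLING", street_address: "1618", city: "Golden ratio", state: "MA" }
];

// Keep only the POSTAL and RES entries; "POSTALS" is excluded because the
// comparison is an exact string match, just as equals() is in Logic Apps.
const wanted = new Set(["POSTAL", "RES"]);
const filtered = addresses.filter(a => wanted.has(a.Type));
```
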