I am using an Azure Function triggered by an HTTPTrigger. In that function, I am using the CosmosClient from azure-cosmos.
client = CosmosClient(endpoint, key)
database = client.get_database_client(database_name)
container = database.get_container_client(container_name)
container.create_item(body=req_body)
With the last line, I am trying to create an item with the object that I receive in the body of the HTTP request. I thought that Cosmos DB would automatically generate the id of the item. However, I am getting the following error:
Exception: CosmosHttpResponseError: (BadRequest) Message: {"Errors":["The input content is invalid because the required properties - 'id; ' - are missing"]}
Do I really need to assign an id to the item or am I missing anything? According to the documentation, the id can either be "System-generated or user-configurable"
id is a required attribute. Some older SDKs had the option to autogenerate it as a Guid, but newer SDKs opt for transparency and giving the user the choice explicitly. You could use a Guid or any other value that is relevant for you.
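If you want the old auto-generated behavior back, you can generate the id yourself right before calling create_item. A minimal sketch, assuming a plain dict request body (the ensure_id helper name is mine, not part of the SDK):

```python
import uuid

def ensure_id(doc: dict) -> dict:
    # Cosmos DB requires a string 'id' on every item; add a GUID-style
    # value only when the caller did not supply one.
    doc.setdefault("id", str(uuid.uuid4()))
    return doc

# container.create_item(body=ensure_id(req_body))
```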
So I'm trying to do something I expected to be simple - using Logic App to insert a json object into Cosmos DB.
I've created a CosmosDB (based on Core SQL API) and created a container called Archive with the partition key /username (which is in my json object).
Now to Logic App.
First, I YOLO'ed the simple approach.
Which gave me error: "The input content is invalid because the required properties - 'id; ' - are missing"
But my json object doesn't have an id field. Maybe use the IsUpsert parameter? I don't feel like manipulating my input document to add a field called 'id'.
Which gave me error: "One of the specified inputs is invalid" Okay - feels even worse.
I now tried to use the other logic app connector (Not V2, but the original).
which gave me error: "The partition key supplied in x-ms-partitionkey header has fewer components than defined in the collection."
I saw that this connector (unlike the V2 one) has a Partition Key Value parameter in the UI, which I added to pass the value of my username
which gave me error "The input content is invalid because the required properties - 'id; ' - are missing".
At this point I thought, let me just give the bloody machine what it wants, and so I added "id" to my json object.
and yes that actually worked.
So questions are
With Logic Apps connectors, are you only able to insert json objects into Cosmos DB without that magic field "id" in the message payload?
If the partition key is required, why is it not available from the V2 connector parameter UI?
Thanks.
The v1 version of this connector doesn't have a partition key parameter because it's so old that Cosmos DB didn't have partitioned containers yet. You will want to use the v2 preview. I think that will go GA at some point soon because it's been in preview for a while now.
And yes, with Cosmos DB you can't insert anything without its id, so you will need to generate it in whatever calls your endpoint and pass it in the body of your HTTP request.
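As a sketch of that caller-side fix (field names here are illustrative, matching the /username partition key from the question), the caller can stamp an id onto the payload before posting it:

```python
import json
import uuid

def build_document(payload: dict) -> str:
    # Add the mandatory 'id' before handing the document to the
    # Logic App / Cosmos DB connector; leave it alone if already set.
    body = dict(payload)
    body.setdefault("id", str(uuid.uuid4()))
    return json.dumps(body)
```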
I am working on a proof of concept with GetStream.io using the .NET server side API to add activities and the react-js client components to render feeds. For some reason every activity is coming into my feeds with Unknown in bold at the top. I assume this is supposed to be the username or something? I read a post about passing in a reference to the user instead of the string userId but the .NET API constructor signature creating a new Activity only takes in a string userId parameter. I have verified that I am passing in a valid userId. Any suggestions on what I am doing wrong here?
Stream stores the unique reference and replaces it at read time. In some complex cases, you need to be able to generate a reference to an existing object and embed that inside of an activity.
Then, can you try this way:
// Since we know their IDs we can create a reference without reading from APIs
var userRef = Users.Ref(userId);
// And then add an activity with these references
var activity = new Activity(userRef, activityAction, message);
I am creating an extensive Data Factory workflow that will create and fill a data warehouse for multiple customers automatically; however, I'm running into an error. I am going to post the questions first, since the remaining info is a bit long. Keep in mind I'm new to Data Factory and JSON coding.
Questions & comments
How do I correctly pass the parameter through to an Execute Pipeline activity?
How do I add said parameter to an Azure Function activity?
The issue may lie with correctly passing the parameter through, or it may lie in picking it up - I can't seem to determine which one. If you spot an error with the current setup, don't hesitate to let me know - all help is appreciated.
The Error
{
"errorCode": "BadRequest",
"message": "Operation on target FetchEntries failed: Call to provided Azure function
'' failed with status-'BadRequest' and message -
'{\"Message\":\"Please pass 'customerId' on the query string or in the request body\"}'.",
"failureType": "UserError",
"target": "ExecuteFullLoad"
}
The Setup:
The whole setup starts with a function call to get new customers from an online economic platform. It then writes them to a SQL table, from which they are processed and loaded into the final table, after which a new pipeline is executed. This process works perfectly. From there the following pipeline is executed:
As you can see, it all works well until the ForEach loop tries to execute another pipeline that contains an Azure function, which calls a .NET scripted function that fills said warehouse (complex, I know). This Azure function needs a customerId to retrieve tokens and load the data into the warehouse. I'm trying to pass those customer IDs from the InternalCustomerID lookup through the ForEach into the pipeline and into the function. The ForEach itself actually works, but fails "Because an inner activity failed".
The Execute Pipeline task contains the following settings, where I'm trying to pass the parameter through from the ForEach loop. This part of the process also works, since it executes twice (as it should in this test phase):
I don't know whether it fails to pass the parameter through, or fails when adding it to the body of the Azure function call.
The child pipeline (FullLoad) contains the following parameters. I'm not sure if I should set a default value to be overwritten or how that actually works. The guides I've looked at on the internet haven't had a default value.
Finally, there are the settings for the Azure function. I'm not sure what I need to write in order to correctly capture the parameter, or what to fill in - whether it's the header or the body, given the error message. I know a POST cannot be executed without a body.
If I run this specific function by hand (using the Function App part of portal.azure.com) it works fine, using the following settings:
I viewed all of your detailed question and I think the key to the issue is the format of the Azure Function request body.
I'm afraid yours is incorrect. Please see my steps below based on your description:
Work Flow:
Inside ForEach Activity, only one Azure Function Activity:
The preview data of LookUp Activity:
Then the configuration of ForEach Activity: @activity('Lookup1').output.value
The configuration of Azure Function Activity: @json(concat('{"name":"',item().name,'"}'))
From the azure function, I only output the input data. Sample Output as below:
Tips: I saw that your step executes the Azure function in another pipeline via an Execute Pipeline Activity (I don't know why you have to follow such steps), but I think it doesn't matter, because you only need to focus on the Body format: if the acceptable format is JSON, you could use @json(...); if the acceptable format is String, you could use @concat(...). Besides, you could check the sample from the ADF UI portal, which uses @pipeline().parameters.
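On the function side, the error message suggests the function looks for customerId in the query string or the request body. A hedged sketch of that lookup logic (the helper name and shape are my own illustration, not your actual function):

```python
import json

def get_customer_id(query_params, body=""):
    # Mirror the error message: prefer the query string, then fall back
    # to a 'customerId' property in the JSON request body.
    customer_id = query_params.get("customerId")
    if not customer_id and body:
        try:
            customer_id = json.loads(body).get("customerId")
        except ValueError:
            customer_id = None
    return customer_id
```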
I am using Boomi to pass data into NetSuite. When I use the execute - initialize function on the vendor payment object and set the type to vendorBill I get an error. Here's what I'm seeing:
XML file sent to NetSuite
<InitializeRecord>
<reference type="vendorBill" internalId="125056"></reference>
</InitializeRecord>
Error Message I'm receiving:
"Failed processing original documents iOi in the connector: java.lang.Exception: Unable to execute initialize. Must define valid Initialize Reference Type. Found: vendorBill Valid values are the following: [employee,vendor,vendorReturnAuthorization]"
According to NetSuite documentation the options I should have for initialize are:
employee, vendor, vendorBill.
I need to initialize the Vendor Bill, is there any reason why this isn't working, or a known workaround? Thanks! (Note that all other processes using initialize for other objects are using the same connector and are working properly.)
There is an unresolved bug in Boomi: BOOMI-30118. It's a defect in the code where VENDOR_RETURN_AUTHORIZATION should be VENDOR_BILL. There are two alternative solutions for this:
Use the SOAP connector to make the initialize request. Your account manager should be able to get you a temporary license to use the SOAP connector until the built-in NetSuite Connector bug is resolved.
An initialize isn't technically necessary. It won't be quite as simple, but you should be able to get all required fields through queries or gets, then map the create request directly.
I am having the same error, Microsoft.Azure.Documents.DocumentClientException: Message: {"Errors":["Owner resource does not exist"]}. This is my scenario: when I deployed my web app to Azure and tried to get some document from DocDB, it threw this error. The DocDB exists in Azure and contains the document that I am looking for.
The weird thing is that from my local machine (running from VS) this works fine. I'm using the same settings in Azure and locally. Does somebody have an idea about this?
Thanks
Owner resource does not exist
occurs when you have given a wrong database name.
For example, while reading a document with client.readDocument(..), where the client is DocumentClient instance, the database name given in docLink is wrong.
This error definitely appears to be related to reading a database/collection/document that does not exist. I got the exact same error for a database that did exist, but I had input the name in lower case. This error appears to happen regardless of your partition key.
The best solution I could come up with for now is to wrap the
var response = await client.ReadDocumentAsync(UriFactory.CreateDocumentUri(database, collection, "documentid"));
call in a try/catch. Not very elegant, and I would much rather the response come back with more details, but this is Microsoft for ya.
Something like the below should do the trick.
Model myDoc = null;
try
{
    var response = await client.ReadDocumentAsync(UriFactory.CreateDocumentUri(database, collection, document));
    myDoc = (Model)(dynamic)response.Resource;
}
catch (DocumentClientException) { /* document or one of its parent resources was not found */ }
if (myDoc != null)
{
//do your work here
}
That is, catch the error to get a better idea of which resource is missing, then create the missing resource so you don't get the error anymore.
A few resources I had to go through before coming to this conclusion:
https://github.com/DamianStanger/DocumentDbDemo
Azure DocumentDB Read Document Resource Not Found
I had the same problem. I found that Visual Studio 2017 is publishing using my Release configuration instead of the Test configuration I've selected. In my case the Release configuration had a different CosmosDB database name that doesn't exist, which resulted in the "owner resource does not exist" error when I published to my Azure test server. Super frustrating and a terrible error message.
It may also be caused by a missing attachment on a document.
This is a common scenario when you move Cosmos DB contents using the Azure Cosmos DB Data Migration Tool, which moves all the documents with their complete definitions but unfortunately not the actual attachment content.
Therefore this results in a document that states it has an attachment, and also states the attachment link, but at that given link no attachment can be found because the tool has not moved it.
Now I wrap my code as follows:
try{
var attachments = client.CreateAttachmentQuery(attachmentLink, options);
[...]
}
catch (DocumentClientException ex)
{
throw new Exception("Cannot retrieve attachment of document", ex);
}
to have a meaningful hint of what is going on.
I ran into this because my dependency injection configuration had two instances of CosmosClient being created for different databases. Thus any code trying to query the first database was running against the second database.
My fix was to create a CosmosClientCollection class with named instances of CosmosClient.
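In rough Python terms, the pattern looks like this (the class and method names are my own illustration of the idea, not a real SDK type):

```python
class CosmosClientCollection:
    """Keep one named client per database so dependency injection can
    never hand the wrong database's client to a consumer."""

    def __init__(self):
        self._clients = {}

    def register(self, name, client):
        self._clients[name] = client

    def get(self, name):
        # Fail loudly instead of silently querying the wrong database.
        if name not in self._clients:
            raise KeyError(f"No CosmosClient registered for '{name}'")
        return self._clients[name]
```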