Logic App - PartitionKey extracted from document doesn't match the one specified in the header - azure

I ran into this error while building a Logic App in Azure. The fix suggested in DocumentDB REST API: PartitionKey extracted from document doesn't match did not work for me.
My Logic App receives a POST request with raw JSON data and then sends it to a Cosmos DB "Create or Update Document" step. In there I can specify my DB, and my inputs have a body and headers like so:
...
"inputs": {
    "body": "@triggerBody()",
    "headers": {
        "x-ms-documentdb-partitionkey": "@triggerBody()?['date']"
    },
    ...
}
...
My JSON data looks like this:
{
    "id": "20190106",
    "date": "20190106",
    ...
}
In the error output, it literally shows my PartitionKey as 20190106, so the @triggerBody()?['date'] expression seems to have resolved correctly.
Any ideas?

Don't forget to add Content-Type: application/json to the POST request (mentioned in this thread).

If your partition key is date, then try passing the value of date as ["20190106"] (with the square brackets) in the partition key header. This worked for me 2 days ago using the REST API for Cosmos DB.
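The header format is the point here: the x-ms-documentdb-partitionkey header expects the partition key value serialized as a JSON array, not a bare string. A minimal sketch of building that header (the key name and value are taken from the question above):

```python
import json

# Partition key value from the question's document.
partition_key_value = "20190106"

# The x-ms-documentdb-partitionkey header expects a JSON array,
# not a bare string like "20190106".
headers = {
    "Content-Type": "application/json",
    "x-ms-documentdb-partitionkey": json.dumps([partition_key_value]),
}

print(headers["x-ms-documentdb-partitionkey"])  # → ["20190106"]
```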

The answer is that Logic Apps do not support the Mongo API. There is a workaround: the Logic App can pass data from one endpoint to a serverless function that then writes to Mongo.
Problems inserting document with Mongodb and Logic Apps

Related

Is it okay if the command model(microservice) knows the resource url of the query model in 201 created response? (CQRS)

I divided my application into the command model and query model.
When the command is executed on the command model, the event is published, and then the query model creates its own data and persists. (it occurs in the same transaction.)
When the user sends data with Post method, the command model has to return created 201.
My question is that is it okay the command model knows about the query model's resource URL?
(is it okay for the command model's controller to be coupled with the query model?)
For example:
Request
POST /articles
body: { "title": "the title", "body": "the body" }
Response
201 Created
Location: /subscription/news
The UI only reads data from the query model; the query model has different URL patterns than the command model, and it only provides news as a collection.
Does the above example make sense? What do you think?
Putting a reference to the query service's (GET) resource in the HTTP response of the command service's POST does not imply that both services are coupled. Only information about where to find the freshly created resource is stored in the header; the services and their functionality remain separated.
If you want to know more about creating the URLs automatically (I assume that by coupling between services you mean hard-coded URLs) instead of hardcoding them, you could also take a look at HATEOAS...
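To make the idea concrete, here is a minimal framework-free sketch (the handler name is illustrative, not from a specific framework): the command side returns 201 with a Location header pointing at the query side's URL, treating that URL purely as data rather than calling into the query service.

```python
# Hypothetical command-side handler: the Location header carries the query
# model's resource URL as plain data; no call into the query service is made.
def create_article(body: dict) -> dict:
    # The query model exposes created articles under /subscription/news
    # (URL pattern taken from the question's example).
    return {
        "status": 201,
        "headers": {"Location": "/subscription/news"},
        "body": {"title": body["title"]},
    }

response = create_article({"title": "the title", "body": "the body"})
print(response["status"], response["headers"]["Location"])  # → 201 /subscription/news
```

The only knowledge shared between the two sides is the URL pattern itself, which is exactly what HATEOAS-style link generation would remove.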

How to perform some tasks with an incoming request before saving it or responding with django rest framework?

I'm quite new to Django; I'm trying to migrate an old API built in Express to Django Rest Framework.
Brief story:
The API is meant to receive different kinds of payloads from different devices, for example:
{
    "device": "device001",
    "deviceType": "temperature_device",
    "deviceTs": timestamp,
    "payload": {
        "airTemp": X,
        "airHum": Y
    }
}
The payload won't always be the same, so other devices (of different types) will bring different key-value pairs in the "payload" field.
I'm using Django Rest Framework, alongside model serializers and a GenericViewSet.
The problem is that before storing the data in the DB and returning the HTTP response, I need to perform data validation (minimum and maximum values). In some cases the device sends "corrupted" data (for example, negative numbers arrive with the syntax 1.-5 instead of -1.5) that I need to fix. Finally, I need to make two HTTP requests to an external API with the fixed payload and an API key (which should be stored in the device details model in my database).
So, in short: how can I perform this kind of pre-processing on a request BEFORE storing the data in the DB and returning the HTTP response?
You will receive your payload in request.data; then you will have to serialize it and validate the payload according to your requirements.
Here is the DRF serialization document, which explains how serialization works.
And here is the DRF validators document, which explains how validators work.
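As a sketch of the pre-save step: in DRF, this cleanup would typically live in the serializer's validate() (or a field-level validate_payload()), which runs automatically before .save(). The functions below show the idea framework-free; the field names, the 1.-5 repair rule, and the min/max bounds are assumptions based on the question:

```python
# Sketch of the pre-save cleanup logic described in the question. In DRF this
# would sit inside a serializer's validate() / validate_payload() method; it is
# shown as plain functions here so the idea is framework-independent.

def fix_corrupted_number(value):
    """Repair values like '1.-5' (meant to be -1.5) into proper floats."""
    s = str(value)
    if ".-" in s:
        integer, fraction = s.split(".-", 1)
        return -float(f"{integer}.{fraction}")
    return float(s)

def validate_payload(payload, minimum=-50.0, maximum=150.0):
    """Clean each reading, then enforce min/max bounds; raise on violation.

    The bounds here are illustrative defaults, not values from the question.
    """
    cleaned = {key: fix_corrupted_number(v) for key, v in payload.items()}
    for key, v in cleaned.items():
        if not (minimum <= v <= maximum):
            raise ValueError(f"{key} out of range: {v}")
    return cleaned

print(validate_payload({"airTemp": "1.-5", "airHum": 40}))
# → {'airTemp': -1.5, 'airHum': 40.0}
```

In the viewset you would then call serializer.is_valid(raise_exception=True), make the two external API calls with serializer.validated_data, and only after that call serializer.save().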

Azure Build Pipeline - Pause and Enable DefinitionQueueStatus change REST API

We have many dozens of build pipelines and we want to pause and resume (re-enable) build pipelines from a simple webapp interface as we are making config changes frequently. Here is the MS doc explaining this API:
https://learn.microsoft.com/en-us/rest/api/azure/devops/build/builds/update%20build?view=azure-devops-rest-5.0#definitionqueuestatus
From this documentation, it appears I need to hit the REST API and change/toggle the DefinitionQueueStatus -- however, this documentation only shows a sample for a build specific operation, whereas I want to pause then re-enable the entire build pipeline. What is the proper way to make this call?
I'm using fetch - and I've tried many dozens of formats for the call - the 'ourorg' and 'ourproject' values are correct (we use this call structure for many other calls), but everything fails for the call below. I grabbed the 'definitionId' from the URL visible in the Azure DevOps portal on the specific build pipeline page, and I'm using it as the {buildId} since I don't know what else to put there. Any guidance is appreciated - I don't need to use fetch, btw - any working sample will help:
fetch('https://dev.azure.com/our_org/our_projectname/_apis/build/builds/definitionId=1593?retry=true&api-version=5.0', {
    method: 'PATCH',
    credentials: 'same-origin',
    body: 'DefinitionQueueStatus: "Enabled"'
}).then(function (response) {
    console.log(response);
});
It seems that the body is incorrect in your post. Here is a sample of how to use POSTMAN to access the Azure DevOps Services REST APIs.
Generate a PAT and record the token; it is needed for authorization (please see this document).
Create a new request in POSTMAN; it is recommended to put the request in a collection for the Azure DevOps Services REST API.
Select the authorization type Basic Auth; you can enter any value as the username, and the token generated in step 1 as the password.
Set the REST API you want to use and select the request method (GET, POST, PATCH, ...); here you use https://dev.azure.com/{organization}/{project}/_apis/build/builds/{buildId}?api-version=5.0.
In the Body tab, set the request body type to raw in JSON format, and enter the following value:
{
    "buildNumber": "#20190607.2",
    "buildNumberRevision": 1,
    "definition": {
        "id": 1,
        "createdDate": null,
        "queueStatus": "paused"
    }
}
Everything is ready now; you can send the request. If it succeeds, you will get a response from the REST API.
In your post, the body content is incorrect; the request body must match the format in the REST API documentation. The DefinitionQueueStatus is a type under definitions, not a body field. In addition, if you send the request with the retry parameter, you will get the message The request body must be empty when the retry parameter is specified..
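For completeness, the same call can be sketched outside POSTMAN. This Python version only builds the request pieces (the organization, project, build id, and PAT are placeholders, and the final requests.patch call is left as a comment rather than executed here):

```python
import base64
import json

# Placeholders; substitute your own values.
organization = "our_org"
project = "our_projectname"
build_id = 1593
pat = "YOUR_PAT"  # personal access token from Azure DevOps

# Basic auth for Azure DevOps: any (or empty) username, PAT as password,
# base64-encoded as "user:pat".
token = base64.b64encode(f":{pat}".encode()).decode()

# Note: no retry parameter, since the request carries a body.
url = (f"https://dev.azure.com/{organization}/{project}"
       f"/_apis/build/builds/{build_id}?api-version=5.0")

headers = {
    "Authorization": f"Basic {token}",
    "Content-Type": "application/json",
}

# Body format from the answer above: queueStatus lives under "definition".
body = json.dumps({"definition": {"id": 1, "queueStatus": "paused"}})

print(url)
# An actual call would be e.g.: requests.patch(url, headers=headers, data=body)
```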

Access user data in AWS lambda function with custom authorizer

I got a nodeJS lambda function which returns database data and I'd like to filter that data based on the user. I created a custom authorizer lambda function which gets the user for a JWT token, but I couldn't find a way to pass data from the authorizer function to the database function, except for principalId (user.id).
What possibilities do I have here? Do I need to setup cognito? Or is there another possibility?
While reading the documentation I found something different from what the accepted answer suggests. Maybe it's new, but the output can now include not only a principalId but also a "context" object. Sample:
{
    "principalId": "xxxxxxxx", // The principal user identification associated with the token sent by the client.
    "policyDocument": {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Action": "execute-api:Invoke",
                "Effect": "Allow|Deny",
                "Resource": "arn:aws:execute-api:<regionId>:<accountId>:<appId>/<stage>/<httpVerb>/[<resource>/<httpVerb>/[...]]"
            }
        ]
    },
    "context": {
        "key": "value",
        "numKey": 1,
        "boolKey": true
    }
}
More in the official documentation here. Much better approach. :)
It seems you have a couple of options.
1) You can place all the information about the user you need into the principal id that is set in the custom authorizer function. You could serialize the user as JSON, or if you need just a couple of ids, concatenate them with a special character, e.g. principalId: "userId|organizationId". I believe API Gateway does some caching around the returned principal id, so I wouldn't put anything highly dynamic in it. You could also turn off caching for authorization, but that would slow down the endpoint.
2) Just pass the user id and look the user up again in the function that makes the database call. If you're using DynamoDB, that lookup should be fast.
And Cognito seems nice, but I don't think it will help you solve this particular problem. If it were me, I would choose option 2.
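Option 1's concatenation approach fits in a couple of lines (the separator and id values below are arbitrary examples):

```python
# Authorizer side: pack two ids into the principalId with a separator.
principal_id = "|".join(["user-42", "org-7"])

# Endpoint side: split them back apart.
user_id, organization_id = principal_id.split("|")
print(user_id, organization_id)  # → user-42 org-7
```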
One possible way is to encode the data object as a base64 string in the authorizer lambda function and decode it down the line.
// In the custom authorizer: pack the user data into the principalId
// as a base64-encoded JSON string.
var principalId = Buffer.from(JSON.stringify({
    id: 5,
    name: "John"
})).toString('base64');
var policy = require('./policy.json');
var policyConfig = {
    "principalId": principalId,
    "policyDocument": policy
};
context.succeed(policyConfig);
Decoding can be done in two places. One is the request template section, by writing a transformation in a Velocity (VTL) mapping template as shown below:
{
    "requestTemplate": {
        "application/json": {
            "principal": "$util.urlEncode($util.base64Decode($context.authorizer.principalId))"
        }
    }
}
The other option is to decode inside the endpoint lambda function with Node.js Base64 decoding. Check the following link for more information:
stack overflow answer for base64 decode
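Assuming a Python endpoint Lambda behind API Gateway's Lambda proxy integration (in Node the equivalent decode is Buffer.from(encoded, 'base64')), that decode step could look like this; the packed data mirrors the authorizer snippet above:

```python
# Sketch: decoding the packed principalId inside the endpoint Lambda.
# The event shape assumes API Gateway's Lambda proxy integration.
import base64
import json

def handler(event, context=None):
    encoded = event["requestContext"]["authorizer"]["principalId"]
    user = json.loads(base64.b64decode(encoded))
    # user now carries whatever the authorizer packed in, e.g. id and name.
    return {"statusCode": 200, "body": json.dumps({"userId": user["id"]})}

# Usage with a stubbed event (the payload matches the authorizer snippet):
packed = base64.b64encode(json.dumps({"id": 5, "name": "John"}).encode()).decode()
event = {"requestContext": {"authorizer": {"principalId": packed}}}
print(handler(event))  # → {'statusCode': 200, 'body': '{"userId": 5}'}
```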

Azure Logic App - Twitter Connector Issues

I have a Logic App with a Twitter connector and a Dropbox connector. The latter has a repeater, which loops over the Twitter body and uploads a text file in each iteration, with Tweet_ID as the file name. The Dropbox connector often returns conflict errors; the Twitter connector seems to keep returning tweets that have already been processed, which results in duplicate file names.
When I look at the output of the Dropbox connector, below is the body it returns.
"body": {
    "status": 409,
    "source": "api-content.dropbox.com",
    "message": "conflict_file"
}
You have probably seen this page https://azure.microsoft.com/sv-se/documentation/articles/app-service-logic-use-logic-app-features/ where they show how to do this.
Have you checked that you don't supply the same Tweet_ID several times? The Logic App JSON format is a bit tricky right now, without much documentation.
/dag
You are right. The Twitter connector doesn't "remember" the tweets returned from a search; it will return the same ones again. (Just to be clear, we are discussing the Twitter connector action Search Tweets.)
