Azure API App proxy generation error - azure

I was able to successfully create a test API and host in Azure. However when I try to create the proxy client, I receive the following error.
[Fatal]Error generating service model: The operation 'Get' has a body parameter, but did not have a supported MIME type ('application/json') in its Consumes property.
Exception: There was an error during code generation when trying to add a client for the Microsoft Azure API App.
Generating client code and adding to project failed.
I checked the Swagger file and the consumes node was empty. When I change it to
"consumes": [
  "application/json",
  "application/xml"
]
the proxy creation works. Why did the auto-generated Swagger JSON not have the consumes property set? I went with the default SwaggerConfig when I created the API app. Am I missing some configuration? Any help will be greatly appreciated.

As I found out from the comments, the solution is to remove the HttpRequestMessage parameter from the Action. This lets the API App client generate the code correctly.
If you need to mock the object, please follow the documented way from here or another example here.

Related

Create a bug automatically when test fails by Azure Devops Rest api calls

Currently my project uses Postman and exports a collection to run via a Jenkins pipeline. Is there any way, when test cases fail in Jenkins, to call the Azure DevOps REST API to create a bug?
Please help. Much appreciated.
I'm not sure how the terminology translates to Azure DevOps, but we do this a lot with Jira, where the Jenkins pipeline creates a Jira issue by calling the Jira endpoint using the HTTP Request plugin. Azure DevOps has a similar endpoint for creating work items, so you should be able to call that endpoint with a POST request to create the bug. This is how you would make the POST using httpRequest:
httpRequest([
    acceptType        : 'APPLICATION_JSON',
    authentication    : '<credentials>',
    contentType       : 'APPLICATION_JSON',
    httpMode          : 'POST',
    requestBody       : '''{
        "content": {"key": "value"}
    }''',
    responseHandle    : 'NONE',
    url               : "https://dev.azure.com/{organization}/{project}/_apis/wit/workitems/${type}?api-version=6.0",
    validResponseCodes: '200,201',
])
You could also simply use curl to make the POST request, or search around for a plugin that handles this for you, which would be preferable to the above methods.
You mention you have already tried this with Postman, so what is preventing you from implementing it in Jenkins?
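As a sketch of the same call outside Jenkins, here is a minimal Python example that builds (but does not send) the work-item creation request. Per the Azure DevOps work-items API, this endpoint expects a JSON Patch body with Content-Type application/json-patch+json; the organization, project, PAT, and field values below are placeholders.

```python
import base64
import json
import urllib.request

def build_create_bug_request(organization, project, pat, title, repro_steps):
    """Build (but do not send) an Azure DevOps 'create work item' request.

    The work-item creation endpoint expects a JSON Patch document and the
    Content-Type 'application/json-patch+json'. All argument values are
    placeholders for illustration.
    """
    url = (f"https://dev.azure.com/{organization}/{project}"
           f"/_apis/wit/workitems/$Bug?api-version=6.0")
    patch = [
        {"op": "add", "path": "/fields/System.Title", "value": title},
        {"op": "add", "path": "/fields/Microsoft.VSTS.TCM.ReproSteps",
         "value": repro_steps},
    ]
    # PATs are sent as HTTP Basic auth with an empty username.
    token = base64.b64encode(f":{pat}".encode()).decode()
    return urllib.request.Request(
        url,
        data=json.dumps(patch).encode(),
        headers={
            "Content-Type": "application/json-patch+json",
            "Authorization": f"Basic {token}",
        },
        method="POST",
    )

req = build_create_bug_request("myorg", "myproject", "<pat>",
                               "Test X failed in Jenkins", "See build log")
```

Sending the request is then a matter of `urllib.request.urlopen(req)` (or the equivalent in your HTTP client of choice).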

How to pass Body Parameters(format) when calling a POST request with Content-Type as form-data in Azure Data Factory

I am trying to call an API endpoint as a POST with Content-Type form-data using an Azure Data Factory Web activity. I tried different ways of passing the body parameters, but all failed.
Here is the Postman Request.
Here are the Azure Data Factory Web activity configurations. (I used the body as JSON and tried different combinations, but none worked.)
And above is the error message.
Any help would be highly appreciated.
Since your request executes successfully from Postman, try copying the entire body from there and using it in the Web activity.
The format for passing a body in a POST request from a Web activity is shown here.
Also make sure you have entered a valid URL (target endpoint and path). This error is often seen because the Web activity normally requires a public endpoint but the URL is hosted in a private VNet. Invoking URLs hosted in a private virtual network is supported as well, by leveraging a self-hosted integration runtime; the integration runtime should have a line of sight to the URL endpoint.
Note: The activity will time out with an error after 1 minute if it does not receive a response from the endpoint.
Further, going through some similar scenarios, it is learnt that:
- The header is often passed as a string in the Web activity, whereas in Postman it is an integer/long.
- If your API tries redirecting: the Web activity in Azure Data Factory does not currently support following redirects, whereas Postman and other tools and libraries usually follow redirects by default or include an option for handling them.
Check out the supported authentication types in the Web activity. If you are trying to authorize your request, try setting the following.
URL: https://login.microsoftonline.com/<<tenantid>>/oauth2/token
Headers: Content-Type - "application/x-www-form-urlencoded"
Body: grant_type=client_credentials&client_id=<<clientid>>&client_secret=<<secret>>&resource=https%3A%2F%2Fmicrosoft.onmicrosoft.com%2F<<resourceId>>
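As an untested sketch, the same urlencoded token body can be produced programmatically; Python's urlencode percent-encodes the resource URL exactly as shown above. The tenant, client, secret, and resource values are placeholders.

```python
from urllib.parse import urlencode

# Placeholder values for the client-credentials token request.
tenant_id = "<tenantid>"
token_url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/token"
headers = {"Content-Type": "application/x-www-form-urlencoded"}

form = {
    "grant_type": "client_credentials",
    "client_id": "<clientid>",
    "client_secret": "<secret>",
    "resource": "https://microsoft.onmicrosoft.com/<resourceId>",
}

# urlencode percent-encodes ':' and '/' in the resource URL,
# producing the same body format shown above.
body = urlencode(form)
```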
Error code: 2108:
Message: Error calling the endpoint '%url;'. Response status code: '%code;'
Cause: The request failed due to an underlying issue such as network connectivity, a DNS failure, a server certificate validation, or a timeout.
Workaround: Make the API call using PowerShell, and call that PowerShell script from within Data Factory.

How to pass in client Certificate parameter when creating service fabric connection azure devops REST API

I am attempting to create a Service Fabric connection via the Azure DevOps REST API, as documented here:
https://learn.microsoft.com/en-us/rest/api/azure/devops/serviceendpoint/endpoints/create?view=azure-devops-rest-6.0
When attempting to define the authorization parameters, I have the following fields in my request:
"authorization": {
"parameters":{
"certlookup":"Thumbprint",
"servercertthumbprint": "{{certificateThumbprint}}",
"certificate":"{{certname}}",
"certificatepassword":null,
},
"scheme":"certificate"
},
This creates a service connection to the cluster; however, it does not look like the 'Client Certificate' parameter (as shown in the screenshot below) is passed in anywhere. I also can't seem to find anywhere in the documentation that says how to do this.
How can I pass in the "Client Certificate" value when using the REST API?
Below is the detailed view of the parameters which you can use for creating Service Fabric Service Connection:
Please try making your request like the below (untested):
"authorization": {
"parameters":{
"certlookup":"Thumbprint",
"servercertthumbprint": "{{certificateThumbprint}}",
"clientcertificatedata":"{{certname}}",
"password":null,
},
"scheme":"certificate"
},
Please refer to EndpointAuthorizationParameters Class for details about the parameters to use.
The correct parameter to specify this value is actually just "certificate".
The original request posted is correct; I was entering the name of the certificate rather than the Base64 encoding of it.
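For reference, a minimal Python sketch that assembles the endpoint-creation body with that corrected "certificate" parameter. The connection name, cluster URL, and certificate values are placeholders, and the outer name/type/url fields follow the general shape of the documented endpoints-create call.

```python
import json

def build_service_fabric_endpoint(name, cluster_url, thumbprint, cert_base64):
    """Sketch of a service-connection body for the Azure DevOps
    'endpoints - create' REST call. All argument values are placeholders;
    the key fix is that 'certificate' carries the Base64-encoded certificate
    contents, not the certificate's name.
    """
    return {
        "name": name,
        "type": "servicefabric",
        "url": cluster_url,
        "authorization": {
            "parameters": {
                "certlookup": "Thumbprint",
                "servercertthumbprint": thumbprint,
                # Base64-encoded certificate data, not its friendly name.
                "certificate": cert_base64,
                "certificatepassword": None,
            },
            "scheme": "certificate",
        },
    }

body = json.dumps(build_service_fabric_endpoint(
    "my-sf-connection", "tcp://mycluster.example:19000",
    "<thumbprint>", "<base64-pfx>"))
```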

What is the best way to call an authenticated HTTP Cloud Function from Node JS app deployed in GCP?

We have an authenticated HTTP cloud function (CF). The endpoint for this CF is public but because it is authenticated, it requires a valid identity token (id_token) to be added to the Authorization header.
We have another Node.js application deployed in the same Google Cloud. We want to call the CF from the Node application, for which we need a valid id_token.
The GCP documentation for authentication is too generic and does not have anything for such kind of scenario.
So what is the best way to achieve this?
Note
Like every Google Kubernetes deployment, the Node application has a service account attached to it which already has Cloud Functions Invoker access.
Follow Up
Before posting the question here, I had already followed the same approach as @guillaume mentioned in his answer.
In my current code, I am hitting the metadata server from the Node.js application to get an id_token, and then I am sending the id_token in a header Authorization: 'Bearer [id_token]' to the CF HTTP request.
However, I am getting a 403 Forbidden when I do that, and I am not sure why.
I can verify the id_token fetched from the metadata server with the following endpoint.
https://www.googleapis.com/oauth2/v1/tokeninfo?id_token=[id_token]
It's a valid one.
And it has the following fields.
Decoding the id_token in https://jwt.io/ shows the same fields in the payload.
{
  "issued_to": "XXX",
  "audience": "[CLOUD_FUNTION_URL]",
  "user_id": "XXX",
  "expires_in": 3570,
  "issuer": "https://accounts.google.com",
  "issued_at": 1610010647
}
There is no service account email field!
You have what you need in the documentation but I agree, it's not clear. It's named function-to-function authentication.
In fact, because the metadata server is deployed on each compute element on Google Cloud, you can reuse this solution everywhere (or almost everywhere: you can't generate an id_token on Cloud Build; I wrote an article with a workaround for this).
This article also provides a great workaround for local testing (because you don't have a metadata server on your computer!).
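To illustrate the metadata-server call described in the follow-up, here is a sketch that only builds the request; actually sending it works only when running on Google Cloud, since the metadata server is unreachable elsewhere. The audience URL is a placeholder, and it must match the Cloud Function's URL for the token to be accepted.

```python
from urllib.parse import urlencode
import urllib.request

def build_id_token_request(audience):
    """Build (but do not send) the metadata-server request that returns
    an id_token for function-to-function authentication. The metadata
    server is only reachable from inside Google Cloud.
    """
    url = ("http://metadata.google.internal/computeMetadata/v1/"
           "instance/service-accounts/default/identity?"
           + urlencode({"audience": audience}))
    # The Metadata-Flavor header is required by the metadata server.
    return urllib.request.Request(url, headers={"Metadata-Flavor": "Google"})

# Placeholder Cloud Function URL as the audience:
req = build_id_token_request(
    "https://REGION-PROJECT.cloudfunctions.net/my-function")
# The returned token then goes into: Authorization: Bearer <id_token>
```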

how to pass and access path parameters in aws api gateway

I am new to web development and trying to learn AWS.
I have made a lambda function for listing.
What I am doing here is showing a listing: if I get counterId in the path parameters (URL), it shows only the data for that counter id, else it shows all data.
My lambda function was working fine. But I am having a problem while API integration.
This is how I am accessing the pathParameters in the event.
This is how I am configuring the event.
And this is my query and response.
Then I create an API Gateway for it.
This is what I did while creating the resource:
/{proxy+} - ANY - Setup
I want to get only the data for counterId 1, but I am getting the whole data.
response
My HTTP method is "ANY" and I chose Lambda proxy integration in the integration request.
I don't know how to send path parameters. Kindly help me.
You have to edit the 'Mapping Template' in the 'Integration Request' of your method properties in API Gateway.
You can find how to map it in the API Gateway Mapping Template Reference article, in the 'Accessing the $input Variable' section.
Your template has to look like this:
{
  "name": "$input.params('name')",
  "body": $input.json('$')
}
Check out more details in my answer to a similar question.
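Note that mapping templates apply to non-proxy integrations; since the question uses Lambda proxy integration, no template is needed there, because API Gateway passes the path parameters straight through in event['pathParameters']. A minimal Python handler sketch (the counterId name matches the question; the filter shape is a placeholder):

```python
import json

def lambda_handler(event, context):
    """Minimal sketch of reading a path parameter with Lambda proxy
    integration: API Gateway puts them in event['pathParameters'],
    or None when no parameter was supplied. The 'filter' response
    shape is a placeholder for the real database query.
    """
    params = event.get("pathParameters") or {}
    counter_id = params.get("counterId")  # e.g. route /counters/{counterId}
    if counter_id is not None:
        body = {"filter": {"counterId": counter_id}}  # fetch one counter
    else:
        body = {"filter": {}}  # fetch all counters
    return {"statusCode": 200, "body": json.dumps(body)}

# Simulated proxy events, as API Gateway would deliver them:
resp_one = lambda_handler({"pathParameters": {"counterId": "1"}}, None)
resp_all = lambda_handler({"pathParameters": None}, None)
```

With a greedy /{proxy+} resource the parameter arrives under the "proxy" key instead, so a dedicated /counters/{counterId} resource keeps the handler simpler.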
