I would like to make a monthly call to a public GraphQL API with some minor business logic.
I read about Posthook but it is unclear how I can use it to call a GraphQL API. Posthook appears to only support REST.
What is an easy and reliable way to do this? Would it make sense to use an AWS Lambda or is there a simpler way?
GraphQL services typically interact with clients over HTTP with POST requests. Clients send JSON in the request and receive JSON in the response.
Some use cases benefit from a GraphQL client library like Apollo Client, but plain old cURL works fine:
# request
curl --location --request POST 'https://swapi-graphql.netlify.app/.netlify/functions/index' \
--header 'Content-Type: application/json' \
--data-raw '{"query":"query Query($filmId: ID) {\n film(filmID: $filmId) {\n title\n }\n}","variables":{"filmId":1}}'
# response
{"data":{"film":{"title":"A New Hope"}}}
An AWS Lambda (AWS's serverless function service) would be a fine way to implement a scheduled GraphQL API call and apply business logic. The Lambda service integrates well with cron-scheduled triggering and results notification. It's easy to get started: you can set up the service by pointing and clicking in the AWS console, or use an infrastructure-as-code library like AWS's SAM (i.e. define your infra in YAML) or CDK (i.e. define your infra in JS/Python/etc), or in a squillion more ways.
The other cloud providers have similar offerings; take your pick.
I have found out about Service Hooks (link).
But I want to do this programmatically: I have a number of projects, and I want to check whether any of those project repositories has had a code push event. If so, I need to check which files were pushed in the commit,
and based on that, push a message into my Service Bus queue.
Is there any sample code for this? I'm looking for an Azure Function app for the above solution.
You can subscribe to code push events using the Azure DevOps public REST API: Subscriptions - Create API
You want your request to look like this:
curl -H "Content-Type: application/json;api-version=4.0" \
  -H "Authorization: Basic ${B64_TOKEN}" \
  --request POST \
  --data '{
    "publisherId": "tfs",
    "eventType": "git.push",
    "resourceVersion": "1.0",
    "consumerId": "webHooks",
    "consumerActionId": "httpRequest",
    "consumerInputs": {"url": "'"${WEBHOOK_URL}"'"}
  }' \
  "https://dev.azure.com/${ORGANIZATION}/_apis/hooks/subscriptions"
This will subscribe you to all code pushes across all repositories in all projects of your organization.
When you receive a code push notification (see documentation), you can extract the commit ids from the resource object (you might need to fetch the Push object using the API).
Then you can inspect which files are impacted with the Commit API.
If you want to see the file diff, there is also an undocumented API.
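To sketch the receiving side of the steps above (function and variable names here are hypothetical): the git.push notification carries the pushed commits under `resource.commits`, and each commit id can then be looked up with the Commit API to list affected files before enqueueing a Service Bus message.

```javascript
// Hypothetical webhook-receiver logic: pull commit ids out of a git.push
// service-hook payload so they can be looked up with the Commit API.
function extractCommitIds(pushEvent) {
  // For very large pushes the inline commits list may be truncated;
  // fetch the Push object via the API in that case.
  const commits = (pushEvent.resource && pushEvent.resource.commits) || [];
  return commits.map((c) => c.commitId);
}
```

Inside an Azure Function HTTP trigger, you would call `extractCommitIds(req.body)` and then send one Service Bus message per relevant commit.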
A member of my team developed an API where you send some numeric values and it gives you back a probability. He deployed it to Heroku and gave me the task of connecting it to our backend. You can make the call from cmd like this:
curl -d "{\"Values\":[[value1,value2,value3,value4,value5]]}" -H "Content-Type: application/json" -X POST https://apibc1.herokuapp.com/predict
And it will work just as intended, but to be honest I don't know how to make this call in my server file. I'm trying to use the request package in Node, but when I make the same call from my server.js file I keep getting an "invalid URI" error in my logs.
If you have a working curl command, you can import it into Postman and generate a working code sample for many languages:
Import the curl request.
Then click the Code button on the right
and select a language/framework option from the dropdown.
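For reference, the curl from the question translates to roughly the plain Node sketch below (the `Values` payload shape and URL are taken from the question; note that an "invalid URI" error from Node HTTP clients commonly means the URL string is malformed or missing its `https://` prefix):

```javascript
// Build the same JSON body the curl command sends.
function buildPredictBody(values) {
  return JSON.stringify({ Values: [values] });
}

// Node 18+ global fetch. The URL must include the protocol;
// a bare hostname is a common cause of "invalid URI" errors.
async function predict(values) {
  const res = await fetch('https://apibc1.herokuapp.com/predict', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: buildPredictBody(values),
  });
  return res.json();
}
```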
For on-prem Analysis Services (reference https://learn.microsoft.com/en-us/sql/analysis-services/instances/configure-http-access-to-analysis-services-on-iis-8-0) it is possible to configure an HTTP endpoint (which you can use for implementing custom authentication). Is there a way to expose an HTTP endpoint for the Azure version of Analysis Services as well?
I tried playing with msmdpump.ini and all I got was various errors.
UPDATE
Looking at the reflected Microsoft.AnalysisServices.AdomdClient.dll - the Azure endpoint actually IS an HTTP endpoint. The communication goes like this:
POST https://[yourregion].asazure.windows.net/webapi/clusterResolve
{"serverName":"your_as_server_name"}
Reply:
{"clusterFQDN":"[prefix]-[yourregion].asazure.windows.net",
"coreServerName":"your_as_server_name",
"tenantId":"... tenantID"}
And then
POST https://[prefix]-[yourregion].asazure.windows.net/webapi/xmla
Authorization: Bearer your_azure_ad_jwt_here
x-ms-xmlaserver: your_as_server_name
// xmla request inside the body
So in theory one should be able to leverage that to create an HTTP proxy. However, neither of those endpoints is documented or officially supported.
I tried this and made it work for the case of Execute (you can use Execute + Statement for most tasks).
With the second request you need three more headers (not sure about User-Agent):
User-Agent: XmlaClient
SOAPAction: urn:schemas-microsoft-com:xml-analysis:Execute
x-ms-xmlacaps-negotiation-flags: 1,0,0,0,0
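Put together as code, the two requests above look roughly like the following sketch. This mirrors the observed (undocumented, unsupported) traffic described above; the region and server names are placeholders, and Node 18+ global `fetch` is assumed:

```javascript
// Step 1: resolve the cluster for a server name (region is a placeholder).
async function resolveCluster(region, serverName) {
  const res = await fetch(`https://${region}.asazure.windows.net/webapi/clusterResolve`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ serverName }),
  });
  return (await res.json()).clusterFQDN;
}

// Step 2: headers for the XMLA POST to the resolved cluster,
// per the observed traffic (Execute case).
function buildXmlaHeaders(jwt, serverName) {
  return {
    Authorization: `Bearer ${jwt}`,
    'x-ms-xmlaserver': serverName,
    'User-Agent': 'XmlaClient',
    SOAPAction: 'urn:schemas-microsoft-com:xml-analysis:Execute',
    'x-ms-xmlacaps-negotiation-flags': '1,0,0,0,0',
  };
}
```

The XMLA request body itself then goes in the POST to `https://[clusterFQDN]/webapi/xmla` with those headers.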
Hi, I used the Deploy to Bluemix button at the top right of this page
https://alchemy-language-demo.mybluemix.net/?cm_mc_uid=69990450306114782046823&cm_mc_sid_50200000=1478206651
to create an Alchemy language node server. It runs ok -
https://alchemylanguage-nodejs-encekxdev-216.mybluemix.net/
but when I try to post to it using postman I get the response 'internal server error'.
Checking the server logs, it seems to respond to the request with
code: 'EBADCSRFTOKEN', error: 'invalid csrf token'
Even if I try to send the csrf from the webpage version of the site, it still doesn't work.
I feel like I have missed something in the configuration of the server but don't know what. I'm not great with servers etc so sorry if this is super basic.
EDIT - I should mention I have also tried sending an apikey in the request which I got from the service credentials section in the alchemy service bit off the dashboard but the same error occurs.
EDIT - The call to the API I am making looks like this:
POST https://alchemylanguage-nodejs-encekxdev-216.mybluemix.net/api/keywords
with headers:
text: 'this is some dummy text I have made'
I have also tried adding apikey to the headers.
Thanks.
You do not POST to your application; you have to POST to the API.
Here is the link with API URL and various samples:
http://www.ibm.com/watson/developercloud/alchemy-language/api/v1
For example, here is a curl request for keywords:
curl -X POST \
-d "outputMode=json" \
-d "url=http://www.twitter.com/ibmwatson" \
"https://gateway-a.watsonplatform.net/calls/url/URLGetRankedKeywords?apikey=$API_KEY"
Make sure you export $API_KEY to your environment before running the command above. You can find the API key in the AlchemyAPI instance you created in Bluemix.
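In code, the same keywords call can be sketched like this (Node; the endpoint and parameters come from the curl above, with the API key read from the environment):

```javascript
// Build the URLGetRankedKeywords request the same way the curl does:
// apikey as a query parameter, outputMode and url as form fields.
function buildKeywordsRequest(apiKey, targetUrl) {
  const endpoint = new URL('https://gateway-a.watsonplatform.net/calls/url/URLGetRankedKeywords');
  endpoint.searchParams.set('apikey', apiKey);
  const body = new URLSearchParams({ outputMode: 'json', url: targetUrl });
  return { url: endpoint.toString(), body: body.toString() };
}
```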
Using this Camel route to send a POST request to the Google Translate API:
from("direct:start").
setHeader(Exchange.HTTP_METHOD, constant('POST')).
setHeader('X-HTTP-Method-Override', constant('GET')).
setBody(constant('q=Hello')).
log(LoggingLevel.INFO, 'sourcingtool', '${body}').
to("https://www.googleapis.com/language/translate/v2?key=${api_key}&target=fr").
to('stream:out')
For some reason I'm getting HTTP 400.
Does anyone see a problem with the request?
UPDATE 1
When I use curl and send a similar request, everything works like a charm:
curl -XPOST -H "X-HTTP-Method-Override:GET" --data "q=Hello" "https://www.googleapis.com/language/translate/v2?key=MY_API_KEY&target=fr"
The answer was simple. I just needed to explicitly set CONTENT_TYPE:
from("direct:start").
setHeader(Exchange.HTTP_METHOD, constant('POST')).
setHeader(Exchange.CONTENT_TYPE, constant('application/x-www-form-urlencoded')). // this one did a trick
setHeader('X-HTTP-Method-Override', constant('GET')).
setBody(constant('q=Hello')).
log(LoggingLevel.INFO, 'sourcingtool', '${body}').
to("https://www.googleapis.com/language/translate/v2?key=${api_key}&target=fr").
to('stream:out')
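The fix matters outside Camel too: without an explicit form-encoded Content-Type, a body like `q=Hello` is not parsed as form data by the server. A minimal sketch of the equivalent request options (Node; the helper name is hypothetical):

```javascript
// Build the request options the fixed Camel route effectively sends:
// an explicit application/x-www-form-urlencoded Content-Type plus the
// X-HTTP-Method-Override header from the route above.
function buildTranslateOptions(text) {
  return {
    method: 'POST',
    headers: {
      'Content-Type': 'application/x-www-form-urlencoded',
      'X-HTTP-Method-Override': 'GET',
    },
    body: new URLSearchParams({ q: text }).toString(),
  };
}
```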