I implemented a Lambda function on AWS to create an Alexa Skill. I would like to use an HTTPS endpoint on another server with Node.js instead, but I have no idea how to approach such a port.
This is my example code of the aws lambda function:
const skillBuilder = Alexa.SkillBuilders.custom();

exports.handler = skillBuilder
  .addRequestHandlers(
    LaunchRequestHandler,
    MyIntentHandler,
    TestIntentHandler,
    HelpIntentHandler,
    CancelAndStopIntentHandler,
    SessionEndedRequestHandler
  )
  .addErrorHandlers(ErrorHandler)
  .lambda();
and as an example, one of the handlers:
const LaunchRequestHandler = {
  canHandle(handlerInput) {
    return handlerInput.requestEnvelope.request.type === 'LaunchRequest';
  },
  handle(handlerInput) {
    const speechText = 'Hello';
    return handlerInput.responseBuilder
      .speak(speechText)
      .reprompt(speechText)
      .withSimpleCard('Welcome', speechText)
      .getResponse();
  },
};
I need help with the Node.js app and how to translate this Lambda function. Thank you in advance.
There are a few options, considering your response in the comments section:
1 - You can create an HTTPS endpoint in API Gateway. The endpoint will get the data received in the HTTPS call and forward it to Lambda, wait for the function to finish executing, and forward the response back. You can find more information here (http://docs.aws.amazon.com/apigateway/latest/developerguide/);
2 - You can provide an HTTPS endpoint on a server that you control and use the aws-sdk (JavaScript, Java, Python, etc.) to call the Lambda function, like an HTTP service running inside EC2. You can find more information here for Node.js (https://docs.aws.amazon.com/sdk-for-javascript/v2/developer-guide/lambda-examples.html);
3 - You can send a message to an SQS queue and configure it to trigger the Lambda execution (the same solution is available with AWS Kinesis). The SQS queue receives messages as HTTPS calls, but this solution will probably need a third party to send the response back to the original caller. You can find more information about this option here (http://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/).
Ok, sorry about this extra answer, but when I remembered it I couldn't stop thinking about how foolish I had been. So...
All AWS services were designed to be used primarily as HTTPS endpoints. The AWS Console, the aws-cli, and all AWS SDKs are just proxies for HTTPS calls. Knowing this, you can make a simple HTTPS POST request to invoke the Lambda function. The API documentation for this request is here (https://docs.aws.amazon.com/lambda/latest/dg/API_Invoke.html).
But it only appears simple... the POST request must be signed with your access and secret keys. If it is not signed, it will not be accepted by AWS. It's not a simple process (though you only have to implement it once, in a utility function). The Console, aws-cli, and aws-sdk sign the requests for you automatically, so they should be your primary option.
Trying to send off a webhook to Slack whenever onWrite() is triggered on my Firebase DB. Going off a few other posts/guides I was able to deploy the code below, but I get ReferenceError: Request is not defined on execution. I can't figure out how to fix it.
const functions = require('firebase-functions');
const webhookURL = "https://hooks.slack.com/services/string/string";

exports.firstTest = functions.database.ref('first').onWrite( event => {
  return request.post(
    webhookURL,
    {json: {text: "Hello"}}
  );
});
Calling your Cloud Function via an URL and sending back a response
By doing exports.firstTest = functions.database.ref('first').onWrite() you trigger your firstTest Cloud Function when data is created, updated, or deleted in the Realtime Database. This is called a background trigger; see https://firebase.google.com/docs/functions/database-events?authuser=0
With this trigger, everything happens in the back-end and you do not have access to a Request (or a Response) object. The Cloud Function doesn't have any notion of a front-end: for example it can be triggered by another back-end process that writes to the database. If you want to detect, in your front-end, the result of the Cloud Function (for example the creation of a new node) you would have to set a listener to listen to this new node location.
If you want to call your function through an HTTP request (possibly from your front-end, or from another "API consumer") and receive a response to the HTTP Request, you need to use another type of Cloud Function, the HTTP Cloud Function, see https://firebase.google.com/docs/functions/http-events. See also the other type of Cloud Function that you can call directly: the Callable Cloud Functions.
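To illustrate the difference, an HTTP Cloud Function wraps an Express-style (req, res) handler. Here is a sketch of that handler shape, written without the firebase-functions SDK so it runs standalone; with the SDK you would export it as exports.firstTest = functions.https.onRequest(firstTestHandler). The request body shape and the mock below are illustrative:

```javascript
// Express-style handler: this is the shape functions.https.onRequest() wraps
const firstTestHandler = (req, res) => {
  const text = (req.body && req.body.text) || 'Hello';
  // ...real code would also POST to the Slack webhook here...
  res.status(200).send(`Received: ${text}`);
};

// Minimal mock of an Express response object, to exercise the handler locally
function mockRes() {
  const r = { statusCode: null, body: null };
  r.status = (code) => { r.statusCode = code; return r; };
  r.send = (payload) => { r.body = payload; return r; };
  return r;
}

const res = mockRes();
firstTestHandler({ body: { text: 'ping' } }, res);
```

Unlike a background trigger, this style gives you the Request and Response objects directly, which is what the original error message was really pointing at.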
Finally, note that:
With .onWrite( event => {}), you are using the old syntax, see https://firebase.google.com/docs/functions/beta-v1-diff?authuser=0
The Firebase video series on Cloud Function is a good point to start to learn more on all these concepts, see https://firebase.google.com/docs/functions/video-series?authuser=0
Calling, from your Cloud Function, an external URL
If you want, from a Cloud Function, to call an external URL (the Slack webhook mentioned in your question, for example) you need to use a library like request-promise (https://github.com/request/request-promise).
See How to fetch a URL with Google Cloud functions? request? or Google Cloud functions call URL hosted on Google App Engine for some examples
Important: Note that you need to be on the "Flame" or "Blaze" pricing plan.
As a matter of fact, the free "Spark" plan "allows outbound network requests only to Google-owned services". See https://firebase.google.com/pricing/ (hover your mouse on the question mark situated after the "Cloud Functions" title)
I am testing Dialogflow Fulfillment with the Inline Editor.
What I am trying to do is an HTTP request using the 'request' library.
Here is the code I am using:
const requesthttp = require('request');
requesthttp('https://api.nasa.gov/planetary/apod?api_key=DEMO_KEY', { json: true }, (err, res, body) => {
  if (err) { return console.log(err); }
  console.log(body.url);
  console.log(body.explanation);
});
But it returns a "not found" error.
I also noticed an alert on my Dialogflow with the following message:
"Billing account not configured. External network is not accessible and quotas are severely limited. Configure billing account to remove these restrictions."
So... Probably I can't test this piece of code without configuring a billing account.
My question is... Is there a url that I can use to test this code?
Or the only way for me to test this code is configuring a billing account and paying for it?
Thanks in advance
There are a number of approaches to testing your code.
If you want to continue to use Dialogflow's Inline Editor, you will need to set up Firebase with a payment plan. However, the Blaze plan is "pay as you go" after a basic level of use. This level of use should be sufficient to cover most testing (and even very light production) uses of the service without incurring a charge. Once your Action has been approved, you're able to receive credits for the Google Cloud Platform, which can be applied to this use in case you go over the minimum level.
You can also use Firebase Cloud Functions, which the Inline Editor is based on, with your own local editor. One advantage of this is that you can serve the function locally, which has many of the same features as deploying it, but doesn't have the URL restriction (it is your own machine, after all). You can use a tool such as ngrok to create a secure tunnel to your machine during testing. Once you have tested, you can deploy to Firebase on a paid plan.
You can, of course, choose to use any other hosting method you wish. Google and Dialogflow allow you to run your fulfillment webhook on any server, as long as that server can provide an HTTPS connection using a valid, non-self-signed, certificate. If you're using node.js, you can continue to use these libraries. If you wish to use another language, you will need to be able to parse and return JSON, but otherwise you have no restrictions.
There are a lot of ways to create your own server, such as a Node.js client with Express.js, which you can expose to the internet using ngrok as the webhook for fulfilment.
Develop a webhook. You can use different client libraries in Node.js (AoG Client or Dialogflow Client) or in Python (Flask-Assistant or Dialogflow Client), or create your own just using JSON request/response with Dialogflow and Actions-on-Google.
Once the webhook is ready, run it locally and expose it to the internet using ngrok.
Start with the following code for Actions-on-Google with Express.js:
'use strict';

const {dialogflow} = require('actions-on-google');
const express = require('express');
const bodyParser = require('body-parser');

const app = dialogflow();

app.intent('Default Welcome Intent', conv => {
  conv.ask('Hi, Welcome to Assistant by Express JS ');
});

express().use(bodyParser.json(), app).listen(8080);
Since Dialogflow uses Firebase Cloud Functions, you can use the built-in https module as in any Node.js app. But requesting domains outside of the Google/Firebase universe requires the paid tier of Firebase.
const https = require('https');

// `info` is assumed to hold the target, e.g. { hostname: 'example.com', pathname: '/path' }
function fetchUrl(info) {
  return new Promise((resolve, reject) => {
    const hostname = info.hostname;
    const pathname = info.pathname;
    let data = '';
    const request = https.get(`https://${hostname}${pathname}`, (res) => {
      res.on('data', (d) => {
        data += d;
      });
      res.on('end', () => resolve(data)); // resolve with the accumulated body
    });
    request.on('error', reject);
  });
}
The Alexa skill docs will eventually allow you to send webhooks to https endpoints. However the SDK only documents lambda style alexa-sdk usage. How would one go about running Alexa applications on one's own server without anything abstracting Lambda? Is it possible to wrap the event and context objects?
You can already use your own endpoint. When you create a new skill, in the configuration tab, just choose HTTPS and provide your HTTPS endpoint. ASK will call your endpoint, where you can run anything you want (tip: check ngrok.com to tunnel to your own dev machine). Regarding the event and context objects: your endpoint will receive the event object information. You don't need the context object for anything; it just lets you interact with Lambda-specific stuff (http://docs.aws.amazon.com/lambda/latest/dg/python-context-object.html). Just make sure that you comply with the (undocumented) timeouts imposed by ASK and you are good to go.
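To make the porting concrete, here is a dependency-free sketch of dispatching an incoming Alexa request envelope (for example req.body in an Express POST route) to the same canHandle/handle objects from the question. The mini response builder only mimics the small subset of the ask-sdk API those handlers use; in a real port you would use the ask-sdk itself (or its Express adapter) instead:

```javascript
// Minimal stand-in for the subset of the ask-sdk response builder the
// question's handlers use (names mirror the ask-sdk, logic is illustrative)
function makeResponseBuilder() {
  const body = { version: '1.0', response: { shouldEndSession: false } };
  const builder = {
    speak(text) { body.response.outputSpeech = { type: 'PlainText', text }; return builder; },
    reprompt(text) { body.response.reprompt = { outputSpeech: { type: 'PlainText', text } }; return builder; },
    withSimpleCard(title, content) { body.response.card = { type: 'Simple', title, content }; return builder; },
    getResponse() { return body; },
  };
  return builder;
}

// The handlers keep exactly the shape they had in the Lambda version
const LaunchRequestHandler = {
  canHandle: (input) => input.requestEnvelope.request.type === 'LaunchRequest',
  handle: (input) => {
    const speechText = 'Hello';
    return input.responseBuilder
      .speak(speechText)
      .reprompt(speechText)
      .withSimpleCard('Welcome', speechText)
      .getResponse();
  },
};

// Dispatch an incoming envelope to the first handler that can handle it
function dispatch(requestEnvelope, handlers) {
  const handlerInput = { requestEnvelope, responseBuilder: makeResponseBuilder() };
  const handler = handlers.find((h) => h.canHandle(handlerInput));
  return handler ? handler.handle(handlerInput) : null;
}

const out = dispatch({ request: { type: 'LaunchRequest' } }, [LaunchRequestHandler]);
```

The point is that the handler objects themselves need no changes; only the glue around them (the dispatcher and the HTTPS server) replaces what .lambda() did.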
Here's a way to do this that requires only a small change to your Skill code:
In your main index.js entry point, instead of:
exports.handler = function (event, context) {
use something like:
exports.myAppName = function (funcEvent, res) {
Below that, add the following workaround:
var event = funcEvent.body

// since not using Lambda, create dummy context with fail and succeed functions
const context = {
  fail: () => {
    res.sendStatus(500);
  },
  succeed: data => {
    res.send(data);
  }
};
Install and use the Google Cloud Functions Local Emulator on your laptop. When you start and deploy your function to the emulator, you will get back a resource URL, something like http://localhost:8010/my-project-id/us-central1/myAppName.
Create a tunnel with ngrok. Then take the ngrok endpoint and put it in place of localhost:8010 in the Resource URL above. Your resulting fulfillment URL will be something like: https://b0xyz04e.ngrok.io/my-project-id/us-central1/myAppName
Use the fulfillment URL (like above) under Configuration in the Alexa dev console, selecting https as the Service Endpoint Type.
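Before deploying to the emulator, the adapted handler can be exercised locally with a mocked Express-style response. Everything below besides the dummy-context workaround itself (the placeholder skill logic and the mock) is illustrative:

```javascript
// The adapted handler shape from the workaround above, with placeholder
// skill logic that ends the request through the dummy context
function myAppName(funcEvent, res) {
  const event = funcEvent.body;

  // since not using Lambda, create dummy context with fail and succeed functions
  const context = {
    fail: () => { res.sendStatus(500); },
    succeed: (data) => { res.send(data); },
  };

  // ...the original Lambda-style skill code runs here and eventually calls:
  context.succeed({ received: event });
}

// Minimal mock of the Express response object to test the adapter locally
const res = {
  sent: null,
  statusCode: null,
  send(data) { this.sent = data; },
  sendStatus(code) { this.statusCode = code; },
};

myAppName({ body: { request: { type: 'LaunchRequest' } } }, res);
```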
Ok, this is not what you think it is. I am not asking for help with the async/await pattern or asynchronous programming; I am well versed in those. I am, however, asking whether something is possible within a Node.js Express service.
The Scenario
I have a web service developed in Node.js that uses Express.js to expose some REST endpoints a client can connect to and send a POST request. For the most part these are synchronous: they create a SOAP message, send it on to an external service, and receive an immediate response which can then be returned to the client. All really simple stuff, which is already implemented. So, what's your point, I hear you say? I am coming to that.
I have a couple of POST interactions that will build a SOAP message to send to an asynchronous external endpoint, where the response will be received asynchronously through an inbound endpoint.
Option 1: What I am looking for in these cases is to be able to build the SOAP message, create a listener (so I can listen for the response to my request), and then send the request to the external service which immediately returns a 200.
Option 2: When I setup the service I want to also setup and listen for incoming requests from the external service whilst also listening for REST requests from the internal service.
The Question
Is either option possible in Node and Express? And, if so, how would one achieve this?
NOTE: I know it's possible in C# using WCF or a listener, but I would like to avoid this and use Node.js, so any help would be greatly appreciated.
First of all, check node-soap to see if it fits your needs.
Option 1: What I am looking for in these cases is to be able to build the SOAP message, create a listener (so I can listen for the response to my request), and then send the request to the external service which immediately returns a 200.
Here's a very basic non-soap service implementation.
let request = require('request-promise');
let express = require('express');
let app = express();

//Validate the parameters for the request
function validateRequest(req) { ... }

//Transform the request to match the internal API endpoint
function transformRequest(req) { ... }

app.post('/external', function(req, res) {
  if(!validateRequest(req))
    return res.status(400).json({success: false, error: 'Bad request format'});

  res.status(200).send();

  let callbackUrl = req.query.callback;
  let transformedRequest = transformRequest(req);
  let internalServiceUrl = 'https://internal.service.com/internal';

  request.post(internalServiceUrl, {body: transformedRequest}).then(function (internalResponse){
    //Return some of the internal response?
    return request.get(callbackUrl, {success: true, processed: true});
  }).catch(function (e) {
    request.get(callbackUrl, {success: false, error: e});
  });
});
Option 2: When I setup the service I want to also setup and listen for incoming requests from the external service whilst also listening for REST requests from the internal service.
There is no "listening" in http. Check socket.io if you need realtime listening. It uses websockets.
Your other option is to poll the internal service (say if you want to check for its availability).
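A sketch of that polling design: the service records job status in some store (in-memory here purely for illustration; all names are made up) and the client simply re-requests a status route until the job is done:

```javascript
// In-memory job store; in production this would be a database or cache
const jobs = new Map();

// Called when the asynchronous SOAP request is sent out
function startJob(id) { jobs.set(id, { done: false }); }

// Called by the inbound endpoint when the external response arrives
function completeJob(id, result) { jobs.set(id, { done: true, result }); }

// This is what a GET /status/:id Express handler would return; the client
// keeps polling it until done === true
function pollJob(id) { return jobs.get(id) || { done: false }; }

startJob('job-42');
const before = pollJob('job-42');    // { done: false }
completeJob('job-42', 'processed');
const after = pollJob('job-42');     // { done: true, result: 'processed' }
```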
In my scenario I'm trying to implement a serverless backend that runs pretty long, time-consuming calculations. These calculations are managed by a Lambda function that calls some external API.
To request this I'm using Amazon API Gateway, which has a 10-second execution limit. However, the Lambda function runs for about 100 seconds.
To avoid this limitation I'm using a second Lambda function to execute the time-consuming calculation and report that the calculation has started.
It looks very similar to this:
var AWS = require('aws-sdk');
var colors = require('colors');

var functionName = 'really-long';

var lambda = new AWS.Lambda({apiVersion: '2015-03-31'});
var params = {
  FunctionName: functionName,
  InvocationType: 'Event'
};

lambda.invoke(params, function(err, data) {
  if (err) console.log(err, err.stack); // an error occurred
  else console.log(functionName.green + " was successfully executed and returned:\n" + JSON.stringify(data, null, 2).gray); // successful response
});

console.log("All done!".rainbow);
This code is executed via AWS API Gateway by thousands of client browsers independently.
To inform each particular client that their Lambda function execution completed successfully, I've planned to use AWS SQS (because of long polling and some other useful functionality out of the box).
So my question is:
How can I determine on the client which message in the queue belongs to that particular client? Or should I iterate over the whole queue to find the proper messages by some request-ID parameter in every client browser? I guess this method will be inefficient when 1000 clients are simultaneously waiting for their results.
I do understand that I could write results to DynamoDB, for example, and periodically poll the DB for the result via some homemade API. But is there any elegant solution to notify a browser-based client about the completion of a time-consuming Lambda function, based on some Amazon PaaS solution?
Honestly, the DynamoDB route is probably your best bet. You can generate a uuid in the first Lambda function executed by the API Gateway. Pass that uuid to the long-running Lambda function. Before the second function completes, have it write to a DynamoDB table with two columns: uuid and result.
The API Gateway responds to the client with the uuid it generated. The client then long-polls with a getItem request against your DynamoDB table (either via the aws-sdk directly or through another API Gateway request). Once it responds successfully, remove said item from the DynamoDB table.
The context object of the Lambda function contains the AWS request ID, which is returned to the client that invoked the function.
So the client will have the request ID of Lambda 1, and Lambda 1's context object will have the same request ID (irrespective of Lambda retries, the request ID remains the same). Pass this request ID to Lambda 2, so the actual request ID is chained till the end.
Polling using the request ID from the client is fairly easy on any data store like DynamoDB.