Listen to POST requests, parse-server, node.js

I have a self-hosted parse-server on AWS EC2. I want to update my database when I receive POST notifications from Apple. For that, I created a cloud function, but since Apple asks for a URL to send notifications to, I'm not sure how to make my cloud function directly accessible via URL, or whether I need to create an endpoint somewhere (AWS) to receive the notification from Apple and then make a new httpRequest or curl call to my cloud function.
I'm looking for any direction, or AWS services, that would help me accomplish this.

I don't think you need to do anything in AWS. You just need to add the cloud function to your main.js.
As an example, here is an endpoint called add3NumbersTogether. I put this in my main.js file and can then call it from iOS (or another client). In iOS I use the Parse iOS SDK to make the calls:
Parse.Cloud.define("add3NumbersTogether", function(request, response) {
  // return the sum of the three numbers passed as parameters
  response.success(request.params.num1 + request.params.num2 + request.params.num3);
});
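That said, since Apple needs a URL to POST to, note that parse-server already exposes every cloud function over its REST API at /parse/functions/<name> (assuming the default /parse mount path). Here is a minimal sketch of that call from Node.js; the hostname and application ID below are placeholders you would replace with your own:

const https = require('https');

// sketch: invoking add3NumbersTogether through parse-server's REST API
const body = JSON.stringify({ num1: 1, num2: 2, num3: 3 });
const req = https.request({
  hostname: 'your-ec2-host.example.com', // placeholder: your EC2 host
  path: '/parse/functions/add3NumbersTogether',
  method: 'POST',
  headers: {
    'X-Parse-Application-Id': 'YOUR_APP_ID', // placeholder: your app ID
    'Content-Type': 'application/json',
    'Content-Length': Buffer.byteLength(body)
  }
}, (res) => {
  let data = '';
  res.on('data', (chunk) => (data += chunk));
  res.on('end', () => console.log(data)); // e.g. {"result":6}
});
req.end(body);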

Related

Error "ReferenceError: request is not defined" from basic Firebase Functions run

Trying to send off a webhook to Slack whenever onWrite() is triggered on my Firebase DB. Going off a few other posts/guides I was able to deploy the code below, but I get ReferenceError: request is not defined on execution and can't figure out how to fix it.
const functions = require('firebase-functions');
const webhookURL = "https://hooks.slack.com/services/string/string";

exports.firstTest = functions.database.ref('first').onWrite( event => {
  // 'request' is never required/imported, hence the ReferenceError
  return request.post(
    webhookURL,
    {json: {text: "Hello"}}
  );
});
Calling your Cloud Function via a URL and sending back a response
By doing exports.firstTest = functions.database.ref('first').onWrite() you trigger your firstTest Cloud Function when data is created, updated, or deleted in the Realtime Database. It is called a background trigger, see https://firebase.google.com/docs/functions/database-events?authuser=0
With this trigger, everything happens in the back-end and you do not have access to a Request (or Response) object. The Cloud Function doesn't have any notion of a front-end: for example, it can be triggered by another back-end process that writes to the database. If you want to detect, in your front-end, the result of the Cloud Function (for example the creation of a new node), you would have to attach a listener at this new node's location.
If you want to call your function through an HTTP request (possibly from your front-end, or from another "API consumer") and receive a response to the HTTP Request, you need to use another type of Cloud Function, the HTTP Cloud Function, see https://firebase.google.com/docs/functions/http-events. See also the other type of Cloud Function that you can call directly: the Callable Cloud Functions.
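For illustration, a minimal HTTP Cloud Function looks like this (a sketch; the function name here is arbitrary):

const functions = require('firebase-functions');

// an HTTP Cloud Function: invoked via its URL, with real Request/Response objects
exports.firstTestHttp = functions.https.onRequest((req, res) => {
  res.status(200).send('Hello from an HTTP Cloud Function');
});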
Finally, note that:
With .onWrite( event => {}) you are using the old syntax; see https://firebase.google.com/docs/functions/beta-v1-diff?authuser=0
The Firebase video series on Cloud Functions is a good place to start learning about all these concepts; see https://firebase.google.com/docs/functions/video-series?authuser=0
Calling an external URL from your Cloud Function
If you want to call an external URL from a Cloud Function (the Slack webhook mentioned in your question, for example), you need to use a library like request-promise (https://github.com/request/request-promise).
See How to fetch a URL with Google Cloud functions? request? or Google Cloud functions call URL hosted on Google App Engine for some examples
Important: Note that you need to be on the "Flame" or "Blaze" pricing plan.
As a matter of fact, the free "Spark" plan "allows outbound network requests only to Google-owned services"; see https://firebase.google.com/pricing/ (hover your mouse over the question mark after the "Cloud Functions" title).
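Putting this together, a corrected version of the asker's firstTest could look like the sketch below, using request-promise and the newer onWrite((change, context) => ...) signature; returning the promise keeps the function alive until the Slack call completes:

const functions = require('firebase-functions');
const rp = require('request-promise');

const webhookURL = "https://hooks.slack.com/services/string/string";

exports.firstTest = functions.database.ref('first').onWrite((change, context) => {
  // POST the Slack payload; the returned promise signals completion
  return rp({
    method: 'POST',
    uri: webhookURL,
    body: { text: "Hello" },
    json: true
  });
});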

NodeJS stream out of AWS Lambda function

We are trying to migrate our zip microservice from a regular Node.js Express application to AWS API Gateway integrated with AWS Lambda.
Our current application sends a request to our API, gets a list of attachments, and then visits those attachments and pipes their content back to the user in the form of a zip archive. It looks something like this:
module.exports = function requestHandler(req, res) {
  //...
  //irrelevant code
  //...
  return getFileList(params, token).then(function(fileList) {
    const filename = `attachments_${params.id}`;
    res.set('Content-Disposition', `attachment; filename=${filename}.zip`);
    streamFiles(fileList, filename).pipe(res); // <-- here the magic happens
  }, function(error) {
    errors[error](req, res);
  });
};
I have managed to do everything except the part where I have to stream content out of the Lambda function.
I think one possible solution is to use aws-serverless-express, but I'd like a more elegant solution.
Does anyone have any ideas? Is it even possible to stream out of Lambda?
Unfortunately, Lambda does not support streams as events or return values. (It's hard to find this stated explicitly in the documentation, but it follows from how invocations and contexts/callbacks are described there.)
In the case of your example, you will have to await streamFiles and then return the completed result, as in the sketch at the end of this answer.
(aws-serverless-express would not help here; if you check the code, they wait for your pipe to finish before returning: https://github.com/awslabs/aws-serverless-express/blob/master/src/index.js#L68)
N.B. there's a nuance here: many of the language SDKs support streaming for requests/responses, but that means streaming the transport, e.g. streaming the download of the complete response from the Lambda, not listening to a stream emitted from inside the Lambda.
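Concretely, awaiting the stream means buffering the archive and returning it as a base64-encoded API Gateway response. A rough sketch, reusing getFileList, streamFiles, params, and token from the question (all assumed to be in scope) and assuming binary support is enabled on the API Gateway side:

exports.handler = async (event) => {
  // getFileList, streamFiles, params and token are assumed from the question
  const fileList = await getFileList(params, token);
  const filename = `attachments_${params.id}`;

  // buffer the whole archive in memory, since Lambda cannot stream the response
  const chunks = [];
  for await (const chunk of streamFiles(fileList, filename)) {
    chunks.push(chunk);
  }
  const zipBuffer = Buffer.concat(chunks);

  // API Gateway expects binary payloads to be base64-encoded
  return {
    statusCode: 200,
    headers: {
      'Content-Type': 'application/zip',
      'Content-Disposition': `attachment; filename=${filename}.zip`
    },
    isBase64Encoded: true,
    body: zipBuffer.toString('base64')
  };
};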
Had the same issue; not sure how you can do stream/pipe via native Lambda + API Gateway directly... but it's technically possible.
We used the Serverless Framework and were able to use XX.pipe(res) with this starter kit (https://github.com/serverless/examples/tree/v3/aws-node-express-dynamodb-api)
What's interesting is that this just wraps over native Lambda + API Gateway, so technically it is possible, as they have done it.
Good luck
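For reference, the wrapping such starter kits perform boils down to something like the sketch below; this assumes the serverless-http package, which is commonly used for this kind of Express-on-Lambda setup. The Express code keeps its familiar handler style, but the wrapper still buffers the full response before Lambda returns it:

const serverless = require('serverless-http');
const express = require('express');

const app = express();

// hypothetical route; ordinary Express handlers (including stream.pipe(res)) work here
app.get('/zip', (req, res) => {
  res.send('ok');
});

// Lambda entry point wrapping the whole Express app
module.exports.handler = serverless(app);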

How to run Alexa skill with the alexa-sdk on own server with Node.js without Lambda drop-in?

The Alexa skill docs will eventually allow you to send webhooks to https endpoints. However the SDK only documents lambda style alexa-sdk usage. How would one go about running Alexa applications on one's own server without anything abstracting Lambda? Is it possible to wrap the event and context objects?
You can already use your own endpoint. When you create a new skill, in the configuration tab, just choose HTTPS and provide your HTTPS endpoint. ASK will call your endpoint, where you can run anything you want (tip: check ngrok.com to tunnel to your own dev machine). Regarding the event and context objects: your endpoint will receive the event object's information. You don't need the context object for anything; it just lets you interact with Lambda-specific functionality (http://docs.aws.amazon.com/lambda/latest/dg/python-context-object.html). Just make sure that you comply with the (undocumented) timeouts imposed by ASK and you are good to go.
Here's a way to do this that requires only a small change to your Skill code:
In your main index.js entry point, instead of:
exports.handler = function (event, context) {
use something like:
exports.myAppName = function (funcEvent, res) {
Below that, add the following workaround:
var event = funcEvent.body

// since we're not using Lambda, create a dummy context with fail and succeed functions
const context = {
  fail: () => {
    res.sendStatus(500);
  },
  succeed: data => {
    res.send(data);
  }
};
Install and use the Google Cloud Functions Local Emulator on your laptop. When you start it and deploy your function to the emulator, you will get back a Resource URL, something like http://localhost:8010/my-project-id/us-central1/myAppName.
Create a tunnel with ngrok. Then take the ngrok endpoint and put it in place of localhost:8010 in the Resource URL above. Your resulting fulfillment URL will be something like: https://b0xyz04e.ngrok.io/my-project-id/us-central1/myAppName
Use the fulfillment URL (like above) under Configuration in the Alexa dev console, selecting https as the Service Endpoint Type.
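Assembled, the modified entry point might look like the sketch below; handlers stands in for your existing alexa-sdk handler map, unchanged from the Lambda version:

const Alexa = require('alexa-sdk');

exports.myAppName = function (funcEvent, res) {
  const event = funcEvent.body;

  // dummy context standing in for Lambda's, as described above
  const context = {
    fail: () => res.sendStatus(500),
    succeed: (data) => res.send(data)
  };

  const alexa = Alexa.handler(event, context);
  alexa.registerHandlers(handlers); // 'handlers' assumed defined with your skill's intents
  alexa.execute();
};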

Architecture for microservices

I've recently started working with Node.js and I have to build an architecture that uses multiple express.js services. Some of these services will have to be located on one server, others on other machines. I want to build a base service (like an API Gateway), but I don't know the proper way to communicate between this Gateway and the microservices, or between two microservices.
Currently I'm working with a solution based on this:
# inside the Gateway server I call another service:
http.get('http://127.0.0.1:5001/users', (service_res) ->
  data = ''
  service_res.on 'data', (chunk) ->
    data += chunk
  service_res.on 'end', ->
    # some logic on data
).end()
I have a strong feeling that this approach is not right. What is the proper way to build communication logic between an API Gateway and microservices?
The logic you have is not incorrect, but what would probably be better is to build a layer of abstraction on top of making requests to another service, e.g. from the API gateway to another microservice. Let's call that microservice B for this instance (the API gateway makes a request to B).
B in this case should provide its own client defining how another service should interact with it. Whether that's through HTTP or WebSockets, the protocol is up to B, because B understands how one should communicate with it. The argument for the client and the service being implemented together is that these two components should have a higher level of cohesion, since technically they are bound by a contract: if a request needs to be made to a service, it needs to adhere to the contract that the service requires.
In simple pseudocode with Express:
// implemented elsewhere, ideally next to the service that it communicates with
function BServiceClient() {
  // ...
}

// the API gateway's calling code
app.get('...', function(request, response, next) {
  // create an instance of the service client
  var bServiceClient = new BServiceClient();
  // retrieving the users from an abstracted endpoint
  bServiceClient.GetUsers();
  // do some processing and then render a response or call next
});
To make this more testable, you might have to write your own wrapper around the app that does the proper dependency injection, injecting the client to make the routes more testable. Alternatively, you could create another function that accepts the client as a parameter, and have the handler create the client and call that newly created function; the new function could then be tested on its own. However, I prefer the former approach of using the wrapper; a rough sketch of the injection follows. Hope this helps!
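A minimal sketch of that handler-level injection, reusing BServiceClient and app from the pseudocode above and assuming GetUsers returns a promise:

// factory that takes the client as a dependency, so tests can pass in a fake
function makeGetUsersHandler(bServiceClient) {
  return function (request, response, next) {
    bServiceClient.GetUsers()
      .then((users) => response.json(users))
      .catch(next);
  };
}

// in the gateway's wiring code
app.get('/users', makeGetUsersHandler(new BServiceClient()));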
What I would do is:
Create separate modules for each microservice. Depending on which microservice you want to run, just have a route for it in Express.
Inject the modules you want into an instance of express().
Example + shameless plug - https://github.com/swarajgiri/express-bootstrap/blob/master/core/index.js
Disclaimer - The above solution is a highly opinionated way of solving your problem.

Using NodeJs with Firebase - Security

Due to the need to run some server-side code - mainly sending emails - I have decided to use Node.js & Express for the server-side element, along with Firebase to hold the data (partly as a learning experience).
My question is: what's the best approach to using the client-side Firebase library together with the Node.js library when doing authentication via the Simple Email & Password API? If I do the authentication client-side and subsequently call a different route on the Node.js side, will the authentication for that user be carried across in the request? And what would be the approach to verify that the user is authenticated within Node?
One approach, I assume, is to get the current user's username & password from Firebase, post these to Node.js, and then use the Firebase security API on the server to check them.
Essentially, the problem here is that you need to securely convey to your Node.js server who the client is authenticated as with Firebase. There are several ways you could go about this, but the easiest is probably to have all of your client <-> NodeJS communication go through Firebase itself.
So instead of having the client hit a REST endpoint served by your NodeJS server, have the client write to a Firebase location that your NodeJS server is monitoring. Then you can use Firebase Security Rules to validate the data written by the client and your server can trust it.
For example, if you wanted to make it so users could send arbitrary emails through your app (with your NodeJS server taking care of actually sending the emails), you could have a /emails_to_send location with rules something like this:
{
  "rules": {
    "emails_to_send": {
      "$id": {
        ".write": "!data.exists() && newData.child('from').val() == auth.email",
        ".validate": "newData.hasChildren(['from', 'to', 'subject', 'body'])"
      }
    }
  }
}
Then in the client you can do:
ref.child('emails_to_send').push({
  from: 'my_email@foo.com',
  to: 'joe@example.com',
  subject: 'hi',
  body: 'Hey, how\'s it going?'
});
And in your NodeJS code you could call .auth() with your Firebase Secret (so you can read and write everything) and then do:
ref.child('emails_to_send').on('child_added', function(emailSnap) {
  var email = emailSnap.val();
  sendEmailHelper(email.from, email.to, email.subject, email.body);
  // Remove it now that we've processed it.
  emailSnap.ref().remove();
});
This is going to be the easiest as well as the most correct solution. For example, if the user logs out via Firebase, they'll no longer be able to write to Firebase so they'll no longer be able to make your NodeJS server send emails, which is most likely the behavior you'd want. It also means if your server is temporarily down, when you start it back up, it'll "catch up" sending emails and everything will continue to work.
The above seems like a roundabout way of doing things. I would use something like https://www.npmjs.com/package/connect-session-firebase and keep Firebase as the model, handling all routes through Express. This is easier if your Express server is rendering templates and not just behaving as a JSON API.
If you are using Firebase Authentication, the client side can import the Firebase library (e.g. for JavaScript) and authenticate directly with the library itself:
import firebase from 'firebase/app';
const result = await firebase.auth().signInWithEmailAndPassword(_email, _password);
After that, the client can obtain the ID token; this token is then sent along with every request made to the server (e.g. as a header).
const sendingIdToken = await firebase.auth().currentUser.getIdToken();
On the Node.js server side, you can install the Firebase Admin SDK to verify that the user is authenticated, like so:
// Let's suppose the client sent the token as a header
const receivingIdToken = req.headers['auth-token'];
admin.auth().verifyIdToken(receivingIdToken, true)
  .then((decodedIdToken) => { /* proceed to send emails, etc. */ }, (error) => { /* ... */ });
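Wired into Express, that check becomes a small reusable middleware. A sketch, keeping the 'auth-token' header convention assumed above; the /send-email route is hypothetical:

const express = require('express');
const admin = require('firebase-admin');

admin.initializeApp(); // uses GOOGLE_APPLICATION_CREDENTIALS by default
const app = express();

function requireAuth(req, res, next) {
  const idToken = req.headers['auth-token'];
  if (!idToken) return res.status(401).send('Missing auth token');

  // the second argument (true) also rejects revoked tokens
  admin.auth().verifyIdToken(idToken, true)
    .then((decodedIdToken) => {
      req.user = decodedIdToken; // expose the caller's identity downstream
      next();
    })
    .catch(() => res.status(401).send('Invalid or revoked token'));
}

// only authenticated users may trigger emails
app.post('/send-email', requireAuth, (req, res) => {
  res.sendStatus(202); // req.user.uid identifies the Firebase user
});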
The Firebase Admin SDK gives full permissions to the Database, so keep the credentials safe.
You should also configure Security Rules on Firestore (or the Firebase Realtime Database) so the client side can still perform specific operations directly against the database (e.g. listening for realtime changes on a collection); you can also restrict all access if you want the client to interact only with the Node.js server.
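For example, the most restrictive Realtime Database rules deny all direct client access while the Admin SDK, which bypasses the rules, keeps working (a sketch; Firestore has an equivalent deny-all rule set):

{
  "rules": {
    ".read": false,
    ".write": false
  }
}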
For more details, I developed an example of a node.js server that uses the Firestore Database and handles security and more.
