I have a webpage running on my Raspberry Pi using Node.js. The webpage has a simple login, and after logging in I can control some hardware. The login uses bcrypt and a locally hosted MongoDB.
I'm using the code below to gate access to a specific page, from which I can send POST commands.
app.get('/profile', isLoggedIn, function(req, res){
    // do code (function call)
});
How can I set up AWS to communicate with my server and potentially access the function call?
How should I store the login information for my server in AWS?
/ Thomas
Based on your tags, it sounds like you want to use AWS Lambda.
You create endpoints in the Node app running on your Raspberry Pi. Make sure you enable CORS so that other servers can also make requests to your Raspberry Pi server.
When you have your endpoints all set up, you can use AWS Lambda to make requests to your server. Lambda functions run on Node, so all you need to do is make Node HTTP requests to the endpoints you created. What triggers your Lambdas depends on what you want to do; just set up events accordingly.
If you want to call some function on your Raspberry Pi, simply create an endpoint that calls that function; when AWS Lambda makes a request to that endpoint, AWS effectively calls your local function.
Same with logging in. I assume you use tokens to authorize requests to your server. Since you are making requests from Lambdas and not from a client, you can't use cookies or local storage to save the token. You will have to use an AWS storage service instead (RDS/S3/DynamoDB/etc.).
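For the token, a minimal sketch using DynamoDB's DocumentClient `put`/`get` API might look as follows. The table name, key shape, and device id are my assumptions; the client is passed in as a parameter (in a real Lambda it would be `new AWS.DynamoDB.DocumentClient()`), which also lets you exercise the logic without AWS credentials:

```javascript
// Sketch: persisting the Pi auth token in DynamoDB between Lambda runs.
// Table name "PiTokens" and the deviceId key are hypothetical.
function tokenStore(docClient, tableName) {
  const Key = { deviceId: 'raspberry-pi-1' };
  return {
    // Store (or overwrite) the token for this device.
    save: (token) =>
      docClient.put({ TableName: tableName, Item: { ...Key, token } }).promise(),
    // Read the token back, or undefined if none has been stored yet.
    load: () =>
      docClient
        .get({ TableName: tableName, Key })
        .promise()
        .then((res) => res.Item && res.Item.token)
  };
}
```

A Lambda would call `load()` before hitting the Pi, and `save()` after re-authenticating when the token has expired.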
If you are open to changing your current web app architecture, I suggest looking into the AWS IoT Platform. It seems like a perfect fit for setting up your Raspberry Pi's communication with AWS.
Description of IoT Platform:
AWS IoT is a managed cloud platform that lets connected devices easily and securely interact with cloud applications and other devices. AWS IoT can support billions of devices and trillions of messages, and can process and route those messages to AWS endpoints and to other devices reliably and securely.
A guide to setting it up on a Raspberry Pi with JavaScript:
http://docs.aws.amazon.com/iot/latest/developerguide/iot-device-sdk-node.html
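To give a feel for the device side, here is a rough sketch using the aws-iot-device-sdk package from that guide. The certificate paths, IoT endpoint host, and topic names are placeholders; the SDK module is injected as a parameter here purely so the wiring can be shown and tested without the package installed:

```javascript
// Sketch with aws-iot-device-sdk (per the linked guide). Cert paths, the
// endpoint host, and topic names are placeholders for your own values.
function connectPi(awsIot) {
  const device = awsIot.device({
    keyPath: './private.pem.key',
    certPath: './certificate.pem.crt',
    caPath: './root-CA.crt',
    clientId: 'raspberry-pi-1',
    host: 'YOUR_ENDPOINT.iot.eu-west-1.amazonaws.com'
  });

  // Once connected, listen for commands pushed from AWS.
  device.on('connect', () => device.subscribe('pi/commands'));
  device.on('message', (topic, payload) => {
    // React to the command here, e.g. toggle your hardware.
    console.log('received on %s: %s', topic, payload.toString());
  });
  return device;
}
// In the real app: connectPi(require('aws-iot-device-sdk'));
```

With this model, AWS publishes MQTT messages to the topic instead of making HTTP calls into your home network, so no inbound port needs to be opened on the Pi.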
Related
I am trying to create a Node.js-based service that will run in AWS EKS.
I have created a Docker image with Node.js and installed my Node.js application in it.
My application currently supports one test REST API path.
I was not able to find any useful example of this case, so I have the following questions:
How is the REST API inside the service called from outside? Do I need to create an API Gateway and link it to the service's REST API?
Is there an SDK/library from AWS that we need to implement, where we receive the incoming REST requests in the service and then dispatch to the appropriate REST API handler, just like `exports.handler` in AWS Lambda?
Or do I need to expose the REST APIs directly from the service, so the outside world consumes them based on the configuration?
Also, how do I access AWS services from the service when the Docker image is running locally?
I built a mobile application using AWS Amplify, which stores data in its DataStore model. I want to access the DataStore from a Node.js app running on a Raspberry Pi. I tried using Amplify by passing the tokens via Bluetooth, which authenticates successfully but only queries/saves locally.
The alternative is to manually build Lambdas behind API Gateway to interact with the DynamoDB tables directly, but this would require the Lambda to know about the tables generated by Amplify.
Is there a simpler way to get Amplify working directly in a non-cloud Node.js app? Any other alternatives?
I have a node.js RESTful API application. There is no web interface (at least as of now) and it is just used as an API endpoint which is called by other services.
I want to host it on Amazon's AWS cloud. I am confused between two options
Use normal EC2 hosting and just provide the hosting URL as the API endpoint
OR
Use Amazon's API Gateway and run my code on AWS Lambda
Or can I just run my code on EC2 and use API Gateway?
I am confused about how EC2 and API Gateway differ when it comes to a Node.js RESTful API application.
Think of API Gateway as an API management service. It doesn't host your application code; rather, it provides a centralized interface for all your APIs and lets you configure things like access restrictions, response caching, rate limiting, and version management for your APIs.
When you use API Gateway you still have to host your API's back-end application code somewhere like Lambda or EC2. You should compare Lambda and EC2 to determine which best suits your needs. EC2 provides a virtual Linux or Windows server that you can install anything on, but you pay for every second that the server is running. With EC2 you also have to think about scaling your application across multiple servers and load balancing the requests. AWS Lambda hosts your functions and executes them on demand, scales out the number of function containers automatically, and you only pay for the number of executions (and it includes a large number of free executions every month). Lambda is going to cost much less unless you have a very large number of API requests every month.
I'm working on a RESTful web application. I split the frontend and backend, using Angular 2 for the front and Node.js for the back.
I would like to use Notifications and push them to specific users.
Example: if a user decides to subscribe, they could get a desktop notification when I decide to send one, or when my Node.js server wants to send a message to a user group.
I have seen a lot of different modules for the frontend and backend, but I'm a little lost.
Architecturally, how should I add this service in my application?
Should I use specific node modules?
You talk about desktop notifications. I guess you want the user to receive notifications even when the browser or app is closed. In that case you need a Service Worker. A Service Worker is a script that your browser runs in the background, to which the message is pushed when the browser or app is closed. For a nice introduction to Service Workers, read this. Angular has had a Service Worker implementation in production since version 5.0.0. Click here to read more about it.
At the backend you need a special Node module to send the notification, for instance node-pushserver, but there are many others. This module connects to a messaging service that actually sends the message. You can use, for instance, Google's cross-platform messaging solution Firebase Cloud Messaging (FCM), the successor of Google Cloud Messaging (GCM). It can send to Web, iOS, and Android.
At the client side you need to register the Service Worker for push notifications. You then get back a subscription endpoint that needs to be stored on the Node server side; each push request you send to the messaging service includes this endpoint.
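A sketch of that client-side registration step is below. The service worker path and VAPID public key are placeholders, and `swContainer` stands in for the browser's `navigator.serviceWorker` so the flow can be shown (and tested) outside a browser:

```javascript
// Client-side sketch: register the Service Worker, then subscribe to push.
// In the browser you'd call subscribeToPush(navigator.serviceWorker, key);
// '/sw.js' and the VAPID public key are placeholders.
async function subscribeToPush(swContainer, vapidPublicKey) {
  const registration = await swContainer.register('/sw.js');
  const subscription = await registration.pushManager.subscribe({
    userVisibleOnly: true, // browsers require this for web push
    applicationServerKey: vapidPublicKey
  });
  // POST this subscription (endpoint + keys) to your Node server and store
  // it there; the server uses it for every push to this user.
  return subscription;
}
```

The Node side then hands that stored subscription to whatever push module you chose (node-pushserver, or a similar package) each time it wants to notify that user.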
You can also make use of a paid push notification provider to do the job. Click here for a list of them.
Setting up a WebSocket connection (like socket.io) won't work since it can't stay connected with the Service Worker.
You can use WebSockets for pushing data from the Node.js server. Add the ws package to your server's package.json. Take a look at the BidServer.ts here: https://github.com/Farata/angular2typescript/tree/master/chapter8/http_websocket_samples/server/bids
The Angular client is here: https://github.com/Farata/angular2typescript/tree/master/chapter8/http_websocket_samples/client/app/bids
I have some doubts about the most appropriate way to allow access to my company's backend services from public clouds like AWS or Azure, and vice versa. In our case, we need an AWS app to invoke some HTTP REST services exposed in our backend.
I came out with at least two options:
The first one is to setup an AWS Virtual Private Cloud between the app and our backend and route all traffic through it.
The second option is to expose the HTTP service through a reverse proxy and set up IP filtering in the proxy to allow only incoming connections from AWS. We don't want the HTTP service to be publicly accessible from the Internet, and I think this is satisfied with either option. We will also likely need to integrate more services (TCP/UDP) between AWS and our backend, like FTP transfers, monitoring, etc.
My main goal is to setup a standard way to accomplish this integration, so we don't need to use different configurations depending on the kind of service or application.
I think this is a very common need in hybrid cloud scenarios so I would just like to embrace the best practices.
I would very much appreciate any advice.
Your option #2 seems good. Since you have an AWS VPC, you can get a fixed IP for your reverse proxy to whitelist.
There is another approach: expose your backends as APIs secured with OAuth tokens. You need some sort of API management solution for this. Your Node.js app can then invoke those APIs with the token.
WSO2 API Cloud allows you to create these APIs in the cloud and run the API gateway in your own datacenter. The Node.js API calls then hit the on-prem gateway, which validates the token and lets the request through to the backend. You will not need to expose the backend service to the internet. See this blog post:
https://wso2.com/blogs/cloud/going-hybrid-on-premises-api-gateways/