AWS Lambda not opening window using Node.js

I have a simple requirement where I need to open a window with a given URL using Node.js, deployed on AWS as a Lambda function.
Following is the sample code I am trying. The Lambda execution returns Success, but no window is opened in any browser, i.e. the URL is not launched. When I execute the same code on Windows or Linux, I can see a window getting launched.
const open = require('open');

function summaryHandler(event, context, callback) {
  console.log('Will open google page');
  open('http://www.google.com');
  callback(null, 'Your window should be launched by now');
}

exports.summaryHandler = summaryHandler;
Can you please tell me where the issue is?

If you do this in Lambda, it will try to open the page on the AWS server, not on your local machine. You could instead return a redirect to the webpage:
https://aws.amazon.com/blogs/compute/redirection-in-a-serverless-api-with-aws-lambda-and-amazon-api-gateway/
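As a rough illustration, here is a minimal sketch of a handler returning a 302 redirect, assuming an API Gateway proxy integration (the handler name and target URL are carried over from the question; the response shape is the standard proxy-integration format):

exports.summaryHandler = function (event, context, callback) {
  // Instead of opening a browser on the server, tell the caller's
  // browser to navigate to the URL itself
  callback(null, {
    statusCode: 302,
    headers: { Location: 'http://www.google.com' },
    body: ''
  });
};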

Related

Node.js: Passenger puts a stop to application because it takes too much time

I installed Node.js on my website via the cPanel application "Setup Node.js App" and created the startup file "app.js".
When I execute this file in the terminal with node app.js, it works quite well.
However, if I access the page in the browser via its URL, I get the error "Website not available" (ERR_CONNECTION_CLOSED). Sometimes, though, I get a different page which says:
The Phusion Passenger application server tried to start the web application, but this took too much time, so Passenger put a stop to that. The stdout/stderr output of the subprocess so far is:
Item: apple
Item: eggs
Item: bread
Item: milk
The current code is:
const readline = require('readline');
const fs = require('fs');

const myInterface = readline.createInterface({
  input: fs.createReadStream('shoppingList.txt')
});

function printData(data) {
  console.log(`Item: ${data}`);
}

myInterface.on('line', printData);
but so far, this happens with any code in app.js, even when I only write:
console.log("test");
I don't know how to get the app working when accessing it by its URL.
Can someone please help me with this problem?
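One plausible explanation (an assumption on my part, since no answer is given here): Passenger waits for app.js to start an HTTP server on the port it assigns, and a script that only reads a file never listens, so Passenger eventually gives up. A minimal app.js sketch that serves the same shopping-list output over HTTP:

const http = require('http');
const readline = require('readline');
const fs = require('fs');

// Assumption: Passenger exposes the port via the environment;
// fall back to a fixed port for local runs
const PORT = process.env.PORT || 3000;

http.createServer((req, res) => {
  const lines = [];
  const rl = readline.createInterface({
    input: fs.createReadStream('shoppingList.txt')
  });
  rl.on('line', line => lines.push(`Item: ${line}`));
  rl.on('close', () => {
    res.writeHead(200, { 'Content-Type': 'text/plain' });
    res.end(lines.join('\n'));
  });
}).listen(PORT);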

NodeJS fetch returns ECONNREFUSED error when executed in fs.writeFile callback

I have some code where I have generated some JSON data; I would then like to write it to a file, and then hit an API endpoint (a GET request).
The issue I am running into: when I execute fetch after using fs.writeFile, I get an ECONNREFUSED error. If I do not write the file, my GET request to the endpoint is successful.
I am putting my fetch in the callback to the writeFile function; I have also tried fs.writeFileSync(url), which gives the same result. My full code requires writeFile to come first.
I noticed that if I wrap fetch in a setTimeout with 10000 ms, fetch works. It seems as if writeFile isn't waiting long enough before executing the callback function.
Am I doing something incorrectly? Or how do I correctly write a file and then fetch API data?
I boiled my code down to a minimal example that reproduces this behavior and lets Node report the error messages correctly. (I am using a fake URL in this example, as the real URL isn't publicly accessible.)
const fetch = require('node-fetch');
const fs = require('fs');

try {
  fs.writeFile('./example.json', JSON.stringify(['test', 'one', 'two']), () => {
    fetch('http://www.example.com/api_endpoint?q=test')
      .then(console.info)
      .catch(console.error);
  });
} catch (e) {
  console.info(e);
}
I'm running this on Node.js v10.15.1 on Debian 8.11 (jessie).
I found out the problem; it's really silly...
My script is in the same repo as my API server. When running the server in dev mode (adonis serve --dev), fs.writeFile triggers the file watcher to reload (of course), which temporarily disconnects the server. It is very obvious now why it wasn't working.
The solution was to have the file watcher ignore the folder I am writing the JSON file to.
In my case (working with AdonisJS) that is adonis serve --dev -i scripts.
Oddly enough, this is a project that worked a month ago, and I didn't have this issue then. I guess something changed in how I run it between then and now.

How to run an Alexa skill with the alexa-sdk on your own server with Node.js without the Lambda drop-in?

The Alexa skill docs will eventually allow you to send webhooks to HTTPS endpoints. However, the SDK only documents Lambda-style alexa-sdk usage. How would one go about running Alexa applications on one's own server, without anything abstracting Lambda? Is it possible to wrap the event and context objects?
You can already use your own endpoint. When you create a new skill, in the configuration tab, just choose HTTPS and provide your HTTPS endpoint. ASK will call your endpoint, where you can run anything you want (tip: check ngrok.com to tunnel to your own dev machine). Regarding the event and context objects: your endpoint will receive the event object information. You don't need the context object for anything; it just lets you interact with Lambda-specific features (http://docs.aws.amazon.com/lambda/latest/dg/python-context-object.html). Just make sure that you comply with the (undocumented) ASK timeouts and you are good to go.
Here's a way to do this that requires only a small change to your Skill code:
In your main index.js entry point, instead of:
exports.handler = function (event, context) {
use something like:
exports.myAppName = function (funcEvent, res) {
Below that, add the following workaround:
var event = funcEvent.body;

// since not using Lambda, create a dummy context with fail and succeed functions
const context = {
  fail: () => {
    res.sendStatus(500);
  },
  succeed: data => {
    res.send(data);
  }
};
Install and use the Google Cloud Functions Local Emulator on your laptop. When you start and deploy your function to the emulator, you will get back a resource URL, something like http://localhost:8010/my-project-id/us-central1/myAppName.
Create a tunnel with ngrok, then take the ngrok endpoint and put it in place of localhost:8010 in the resource URL above. Your resulting fulfillment URL will be something like https://b0xyz04e.ngrok.io/my-project-id/us-central1/myAppName.
Use that fulfillment URL under Configuration in the Alexa dev console, selecting HTTPS as the Service Endpoint Type.
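If you would rather skip the emulator, a minimal sketch of an Express server hosting the adapted handler could look like this (the module path, route, and port are illustrative assumptions, not from the answer):

const express = require('express');
const bodyParser = require('body-parser');
const skill = require('./index'); // the file exporting myAppName above

const app = express();
app.use(bodyParser.json());

// Alexa POSTs the request JSON; req.body becomes funcEvent.body in the handler
app.post('/myAppName', (req, res) => skill.myAppName(req, res));

app.listen(8010);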

Connect manually with sails.io.js

I'm trying to use sails.io.js in a Chrome extension.
I'm setting the URL of my Sails server manually with the line:
io.sails.url = "http://localhost:1337";
But I would like to get the URL from Chrome local storage (so the URL is not hard-coded). The problem is, as said here:
You get 1 clock tick after the import before the socket attempts to
connect.
So if I get the URL from Chrome local storage with:
storage.get('url', function (result) {
  var url = result.url;
  io.sails.url = url;
});
It's too late! The options must be set immediately after the sails.io.js code is loaded.
So I was thinking about disabling autoConnect with:
io.sails.autoConnect = false;
(as said there)
But now my question is: how can I connect manually with io.sails?
I've tried io.sails.connect(), which did not work. Any ideas?
Thank you very much.
Well, it's yours to customize, but if you want to get away with a minimal amount of changes to sails.io.js, then you could:
change the setTimeout to io.sails.doit = function() {
throw away the io.sails.autoConnect check
call io.sails.doit(..) when you're ready
Also, if you want to pass something over during the handshake, you can add {'query': 'param=' + value} as a second argument to io.connect(io.sails.url).
It is then available in the Sails app's config/sockets.js through socket.handshake.query.param.
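To make the three steps concrete, here is a rough sketch of what the patch and its use might look like (goAheadAndActuallyConnect stands in for whatever connect logic the library runs inside its setTimeout; the actual internals vary by sails.io.js version):

// Inside sails.io.js -- originally something like:
//   setTimeout(function () {
//     if (io.sails.autoConnect) { goAheadAndActuallyConnect(); }
//   }, 0);
// Patched to expose the connect step instead:
io.sails.doit = function () {
  goAheadAndActuallyConnect();
};

// In the extension: connect only once the URL is known
storage.get('url', function (result) {
  io.sails.url = result.url;
  io.sails.doit();
});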

Streaming large files causing server to hang

I have a feature in my web app that allows users to upload and download files. I serve the app with Express, but the files are stored on a different server, so I proxy the requests to that server. Here's the proxy code, using the request library:
module.exports = function (req, res) {
  req.headers['x-private-id'] = getId();
  var url = rewriteUrl(req.url);
  var newRequest = request(url, function (error) {
    if (error) console.log(error);
  });
  // Pipe the client request to the file server, strip the private
  // header from its response, and pipe the response back to the client
  req.pipe(newRequest).on('response', function (remoteRes) {
    delete remoteRes.headers['x-private-id'];
  }).pipe(res);
};
This works fine for all of my requests, including downloading the file. However, I run into issues when 'streaming' the file. By streaming, I mean I use fancyBox to display the video in a video tag. The video displays fine the first few times.
But if I close fancyBox and then reopen it enough times (five, specifically), it quits working after that; the video no longer shows up. The entire Express server seems to hang, unable to process any more requests. If I restart the server, everything is OK. To me it seems like the sockets from the proxy requests aren't being closed properly, but I can't figure out why. Is there something wrong with my proxy code?
You need to either increase the pool.maxSockets value passed in the request() config, since it defaults to Node's HTTP Agent maxSockets (which is 5), or opt out of connection pooling altogether with pool: false in the request() config.
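A sketch of both options applied to the request() call from the proxy above (pick one; the value 100 is an arbitrary example, everything else is unchanged):

// Option 1: raise the per-host socket limit above the old default of 5
var newRequest = request({ url: url, pool: { maxSockets: 100 } }, function (error) {
  if (error) console.log(error);
});

// Option 2: opt out of connection pooling entirely
var newRequest = request({ url: url, pool: false }, function (error) {
  if (error) console.log(error);
});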
