Lambda function failing with Unable to import module 'index' - node.js

Error:
Unable to import module 'index': Error
at Function.Module._load (module.js:417:25)
at Module.require (module.js:497:17)
at require (internal/module.js:20:19)
at Object.<anonymous> (/var/task/node_modules/slack-incoming-webhook/lib/index.js:3:19)
at Module._compile (module.js:570:32)
at Object.Module._extensions..js (module.js:579:10)
at Module.load (module.js:487:32)
at tryModuleLoad (module.js:446:12)
at Function.Module._load (module.js:438:3)
By the looks of this, my code isn't the problem; it's a problem with the slack-incoming-webhook node module. However, the offending line is this, which looks completely normal:
var SlackClient = require('./client');
I have tried 4 different packages now (request, http, node-webhooks and now slack-incoming-webhook) and they all fail on code inside node_modules. I am thoroughly confused, as I can get the code to work on my own computer and on an Amazon Linux AMI EC2 instance (running the same Node version).
All the code is zipped and sent to Lambda using the aws-cli, and I have deployed Node.js code on Lambda before without any problems (an Alexa skill).
I have tried npm install on the EC2 instance and several different packages, and I have come to the conclusion that there must be some sort of configuration wrong in Lambda, but I can't find what. Could someone point me in the right direction?
Here is my code if anyone is curious; the Lambda trigger is an AWS IoT button.
const slack = require('slack-incoming-webhook');
const send = slack({
  url: 'https://hooks.slack.com/....'
});
exports.handler = function () {
  send(process.env.company + ' has pushed their panic button! PANIC! PANIC! PANIC!');
};

This is a common issue I have seen in many posts. In most cases the problem is the way the files are zipped: instead of zipping the folder itself, you have to select all the files inside it and zip those, so that index.js ends up at the root of the archive.

I would simply prefer to use Apex (http://apex.run/).
It is a pretty awesome serverless framework for AWS Lambda. Once it is set up, there is no need to do any manual zipping.
Simply execute a couple of commands:
apex create (to create the Lambda function)
apex deploy (to deploy to your AWS region, no manual zipping required)
apex invoke (to invoke it from your terminal)

Related

Socket.io Syntax not Recognized on Server

Node server not able to understand socket.io syntax, even with all packages installed.
When I run my server locally with nodejs server.js it works fine. But when I try to run it on my Ubuntu server, it does not seem to understand this line:
socket.on( 'client-data', ( serverpackage ) => {
^
SyntaxError: Unexpected token >
at Module._compile (module.js:439:25)
at Object.Module._extensions..js (module.js:474:10)
at Module.load (module.js:356:32)
at Function.Module._load (module.js:312:12)
at Function.Module.runMain (module.js:497:10)
at startup (node.js:119:16)
at node.js:902:3
The directory on my Ubuntu server is user/server, which contains all my server modules. Socket.io is installed, and I even checked to make sure all dependencies for socket.io are there too.
NodeJS-Socket-server-with-DB@1.0.0 /home/<user>/server
└── socket.io@2.1.1
I am also running this version of nodejs:
<user>@host*****:~/server$ nodejs -v
v0.10.25
You can't use arrow functions on that version of Node. You need at minimum version 4 of Node; version 6 offers full compatibility with arrow functions.
To fix your issue, simply update Node or change to a regular function like:
socket.on( 'client-data', function (serverpackage) {
Your Node.js version doesn't understand arrow functions; update it if you want to use them.
If not, please change the arrow functions to normal functions.
Check the Node.js ES2015 support table.
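To illustrate the rewrite (the handler body below is invented), the two forms do the same thing for a callback like this, apart from `this` binding, but only the second parses on Node 0.10:

```javascript
// ES2015 arrow function: needs Node >= 4 (full support from Node 6).
var arrowHandler = (serverpackage) => {
  return 'got ' + serverpackage;
};

// ES5 equivalent that Node 0.10 can parse.
var es5Handler = function (serverpackage) {
  return 'got ' + serverpackage;
};

console.log(arrowHandler('client-data') === es5Handler('client-data')); // true
```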

Joining CSV files and Remove Duplicates with Google Functions

I'm new to Google's platform and I don't have any experience creating Cloud Functions with external libraries.
I want to upload a CSV file to Cloud Storage, then trigger a Google Cloud Function that JOINs it with another CSV file in Cloud Storage and exports the JOIN results, with duplicates removed, to a new CSV file.
I've seen the npm libraries 'csv-join' and 'csv-reorder', but I'm not sure how to use them with Cloud Functions, or even whether it is possible, and I'm stuck at this point.
Thanks in advance.
Regards.
This is my code:
exports.Test_BQ = (event, callback) => {
  const file = event.data;
  if (file.resourceState === 'not_exists') {
    console.log(`File ${file.name} deleted.`);
  } else if (file.metageneration === '1') {
    const reorder = require('csv-reorder');
    reorder({
      input: 'https://storage.googleapis.com/staging.XXXXX.appspot.com/test.csv',
      output: 'https://storage.googleapis.com/staging.XXXXX.appspot.com/test_2.csv',
      sort: 'policyID',
      type: 'string',
      descending: false,
      remove: true,
      metadata: false
    });
  }
  callback();
};
I'm getting this error:
TypeError: promisify is not a function
at Object.<anonymous> (/user_code/node_modules/csv-reorder/lib/read.js:6:18)
at Module._compile (module.js:570:32)
at Object.Module._extensions..js (module.js:579:10)
at Module.load (module.js:487:32)
at tryModuleLoad (module.js:446:12)
at Function.Module._load (module.js:438:3)
at Module.require (module.js:497:17)
at require (internal/module.js:20:19)
at Object.<anonymous> (/user_code/node_modules/csv-reorder/index.js:4:14)
at Module._compile (module.js:570:32)
Right now I'm only testing 'csv-reorder' to understand how these external libraries work. I get results locally but not in the cloud.
The csv-reorder library uses util.promisify, which was added in Node 8, while Google Cloud Functions currently runs Node.js v6.11.5. You can provide a polyfill on this older version via:
npm install --save util.promisify
You can then patch the util library like this:
const util = require('util');
require('util.promisify').shim();
This should solve the error you're facing. However, after having a closer look at the csv-reorder code, it appears it can only read a file from the local filesystem, not from an HTTPS URL. Cloud Functions only gives you a writable /tmp directory (backed by memory), so you would need to download the objects from Cloud Storage to /tmp first, or find another way to do it.

Trouble connecting to AWS Athena via JDBC using Node Lambda

My Goal
I am trying to use AWS's JDBC driver to allow a Lambda function running Node 6.10 to connect to AWS Athena and create a database. (I will also want to be able to create and query tables inside that database.)
What I've Tried
I have tried the following code from an answer to a similar question:
var JDBC = require('jdbc');
var jinst = require('jdbc/lib/jinst');

if (!jinst.isJvmCreated()) {
  jinst.addOption("-Xrs");
  jinst.setupClasspath(['./AthenaJDBC41-*.jar']);
}

var config = {
  // Required
  url: 'jdbc:awsathena://athena.us-east-1.amazonaws.com:443',
  // Optional
  drivername: 'com.amazonaws.athena.jdbc.AthenaDriver',
  minpoolsize: 10,
  maxpoolsize: 100,
  properties: {
    s3_staging_dir: 's3://aws-athena-query-results-*/',
    log_path: '/logs/athenajdbc.log',
    user: 'access_key',
    password: 'secret_key'
  }
};

var hsqldb = new JDBC(config);
hsqldb.initialize(function (err) {
  if (err) {
    console.log(err);
  }
});
The Errors I'm Seeing
When I run this on my own machine (Mac OS X El Capitan, 10.11.6), I see a popup asking me to install Java, with the message No Java runtime present, requesting install. printed to my console.
When I deploy my code to Lambda and run it there, it fails with the following message:
Error: /var/task/node_modules/java/build/Release/nodejavabridge_bindings.node: invalid ELF header
When run locally, I can see that things fail at the var hsqldb = new JDBC(config); line, but when running on Lambda, the error occurs immediately upon requiring JDBC (the first line of the code above).
Update
The invalid ELF header issue seems to be pointing to the idea that the node_modules/java/build/Release/nodejavabridge_bindings.node file was compiled for an architecture incompatible with the one on which AWS Lambda runs (Linux x64).
This explains the difference in behavior when running locally vs when running on Lambda.
I have tried using node-gyp to compile the resource specifically for the x64 architecture, and saw the issue change but not resolve.
The node-gyp command I ran successfully was node-gyp configure --arch=x64 (run inside the node_modules/java/ directory)
Instead of an invalid ELF header error when running on Lambda, we now see a module initialization error (See logs below)
module initialization error: Error
at Module.load (module.js:487:32)
at tryModuleLoad (module.js:446:12)
at Function.Module._load (module.js:438:3)
at Module.require (module.js:497:17)
at require (internal/module.js:20:19)
at Object.<anonymous> (/var/task/node_modules/java/lib/nodeJavaBridge.js:21:16)
at Module._compile (module.js:570:32)
at Object.Module._extensions..js (module.js:579:10)
You are describing a couple of issues here.
First, the missing JVM on macOS. This is a documented bug in node-java; this link describes a workaround for the issue:
https://github.com/joeferner/node-java/issues/90#issuecomment-45613235
After applying that and changing the setupClasspath statement, your sample should be runnable locally:
jinst.setupClasspath(['./AthenaJDBC41-1.0.1.jar']);
As for the ELF problem: you cannot build Linux native Node modules on macOS, and since npm does not distribute prebuilt versions, you can only build your deployable on a machine equivalent to the target.
This means you need to install/package your modules on a Linux AMI (preferably the Lambda AMI).
Here is an AWS blog post on how to do this:
https://aws.amazon.com/blogs/compute/nodejs-packages-in-lambda/
AMI versions used:
http://docs.aws.amazon.com/lambda/latest/dg/current-supported-versions.html
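As a quick sanity check before repackaging, you can inspect a binary's magic bytes to see what it was built for. This is only a sketch: in practice you would point it at node_modules/java/build/Release/nodejavabridge_bindings.node; here the local node (or sh) binary stands in for the addon.

```shell
# Pick a binary to inspect; node if present, otherwise the shell itself.
ADDON="$(command -v node || command -v sh)"

# Linux binaries start with the magic bytes \x7fELF; a macOS (Mach-O)
# build prints something else, which is exactly what triggers Lambda's
# "invalid ELF header" error. On a Linux build this prints: ELF
head -c 4 "$ADDON" | tail -c 3
```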

proxies not supported on this platform

I'm trying to make (I don't know what it's called, hot load? hot reload?) Meteor-like real-time loading of data, but using Node.js, not Meteor.
I'm using the ddp module for the client (the browser; I have not tried it yet) and ddp-server-reactive for the server.
server.js looks like this:
var DDPServer = require('ddp-server-reactive');
var server = new DDPServer();
var todoList = server.publish('todolist');
After that I run the server using the command node server.js --harmony_proxies (notice I'm already using the flag). This is what I get:
[aseds#localhost ~]$ node server.js --harmony_proxies
/home/aseds/Desktop/projeh/css-goodness/node_modules/harmony-reflect/reflect.js:2049
throw new Error("proxies not supported on this platform. On v8/node/iojs, make sure to pass the --harmony_proxies flag");
^
Error: proxies not supported on this platform. On v8/node/iojs, make sure to pass the --harmony_proxies flag
at global.Proxy (/home/aseds/Desktop/projeh/css-goodness/node_modules/harmony-reflect/reflect.js:2049:13)
at publish (/home/aseds/Desktop/projeh/css-goodness/node_modules/ddp-server-reactive/lib.js:211:32)
at Object.<anonymous> (/home/aseds/Desktop/projeh/css-goodness/ddpserver.js:10:23)
at Module._compile (module.js:397:26)
at Object.Module._extensions..js (module.js:404:10)
at Module.load (module.js:343:32)
at Function.Module._load (module.js:300:12)
at Function.Module.runMain (module.js:429:10)
at startup (node.js:139:18)
at node.js:999:3
My Node.js version is v5.4.1.
I'm not even sure it's actually possible to recreate Meteor's automatic reload feature this way, but I'm trying! :)
Thanks in advance for any help you are able to provide.
I came across this thread regarding --harmony_proxies:
https://github.com/tvcutsem/harmony-reflect/issues/56
The relevant bit:
I released version 1.4.0 which, when loaded twice according to the script outlined above, loads correctly.
Note that loading v1.3.1 of this library followed by v1.4.0 will still fail (the other way around works fine). So it's important that your dependencies upgrade to the latest version.
It appears that if harmony-reflect is loaded as a node dependency twice, with different versions required, and a version before 1.4.0 is loaded first, then you will see this error.

is it possible to set up amdefine in tests so that I don't have to define it in all my module files?

I have a set of objects that are used browser-side but tested server-side with mocha. I'm using RequireJS for AMD loading. The RequireJS site suggests using amdefine server-side to get the defined modules to work in Node.js, with this bit of code:
if (typeof define !== 'function') {
  var define = require('amdefine')(module);
}
OK. But I have to put that into every module that I want to use in Node. In my case, that means I have to strip it out of any code that I'm using client-side (most of it).
I'm wondering if there's any way to put that chunk of code in my test instead, so that I don't have to put it in my client-side code. It seems silly to have code in my files that is only needed for the tests; it makes more sense to put it in the test code. However, when I do that I get an error:
Error: amdefine with no module ID cannot be called more than once per file.
at runFactory (/home/vmplanet/dev/alpha/web/node_modules/amdefine/amdefine.js:159:23)
at define (/home/vmplanet/dev/alpha/web/node_modules/amdefine/amdefine.js:275:13)
at Object.<anonymous> (/home/vmplanet/dev/alpha/web/assets/src/coffee/delta/dataLayer.coffee:4:3)
at Object.<anonymous> (/home/vmplanet/dev/alpha/web/assets/src/coffee/delta/dataLayer.coffee:158:4)
at Module._compile (module.js:456:26)
at Object.loadFile (/usr/lib/node_modules/coffee-script/lib/coffee-script/coffee-script.js:179:19)
at Module.load (module.js:356:32)
at Function.Module._load (module.js:312:12)
at Module.require (module.js:364:17)
at require (module.js:380:17)
It's an odd error, since the amdefine code is in only one place: the top of the test file. Is there a way to put this amdefine code in my test, and only my test, and still get the tests to run server-side, without having to strip the amdefine code out of the client-side files?
If you use amd-loader, you can do this:
require("amd-loader");
var datatypes = require("../build/dist/lib/salve/datatypes");
var name_resolver = require("../build/dist/lib/salve/name_resolver");
That's it. You just require amd-loader first, and then you can load AMD-style modules at will. (In the example above, the two modules loaded after amd-loader are AMD-style modules.) The AMD-style modules can themselves load other AMD-style modules.
The snippet above is actual code from one of my test suites, which tests a library designed AMD-style so that it can be loaded with RequireJS but tested in Node.js.
