Google credential json file on Serverless AWS - node.js

I'm trying to use DialogFlow (API.AI, now the Google Cloud Dialogflow API) in my serverless project, but I couldn't find any solution for shipping the Google credentials JSON file to the serverless side. I followed this tutorial (on the Google Cloud website) and it works correctly locally, but not on Lambda. I even tried copying the file via webpack, but it still doesn't work. For DialogFlow, I'm using the dialogflow v2 Node.js library.
--- edit
I'm getting this error on Lambda, which I think is related to the JSON file not being found; I'm not using this module directly (DialogFlow is):
(rejection id: 2): Error: Cannot find module '/var/task/node_modules/grpc/src/node/extension_binary/node-v48-linux-x64-glibc/grpc_node.node'
--- edit end
node.js: 6.x
serverless: 1.26
====
serverless.yml
service: test-dialogflow-svc
plugins:
  - serverless-webpack
  - serverless-plugin-common-excludes
  - serverless-offline
  - serverless-offline-scheduler
package:
  individually: true
  include:
    - googleCredentials.json
custom:
  webpackIncludeModules: true
  serverless-offline:
    port: 3000
provider:
  name: aws
  runtime: nodejs6.10
  stage: dev
  region: eu-west-2
  memorySize: 128
  timeout: 5
  environment:
    GOOGLE_APPLICATION_CREDENTIALS: './googleCredentials.json'
functions:
  hello:
    handler: src/handlers/helloworld.handler
    events:
      - http:
          path: hello
          method: get
    package:
      include:
        - googleCredentials.json
webpack.config.js
const path = require('path');
const slsw = require('serverless-webpack');
const nodeExternals = require('webpack-node-externals');
const WebpackPluginCopy = require('webpack-plugin-copy');

module.exports = {
  entry: slsw.lib.entries,
  target: 'node',
  resolve: {
    extensions: ['.js', '.json', '.ts', '.tsx']
  },
  externals: [nodeExternals()],
  module: {
    rules: [
      {
        test: /\.ts(x?)$/,
        use: [
          {
            loader: 'awesome-typescript-loader'
          }
        ]
      }
    ]
  },
  plugins: [ // I tried to copy the file with webpack as well
    new WebpackPluginCopy([{
      copyPermissions: true,
      from: './googleCredentials.json'
    }])
  ],
  output: {
    libraryTarget: 'commonjs',
    path: path.join(__dirname, '.webpack'),
    filename: '[name].js'
  }
};

The answer to this question has two parts:
1) Copying the Google credentials .json file into the serverless .zip bundle
2) Compiling the gRPC C++ native node module for amazon-linux, or using the REST JSON API
1) It's possible to copy the Google credentials .json file into the .zip bundle using the serverless-webpack plugin and webpack-plugin-copy.
serverless.yml
...
plugins:
  - serverless-webpack
...
webpack.config.js
...
const WebpackPluginCopy = require('webpack-plugin-copy');
module.exports = {
  ...
  plugins: [
    new WebpackPluginCopy([{
      copyPermissions: true,
      from: `./googleCredentials.json`,
    }])
  ],
};
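With the .json file in the bundle, the client can also be pointed at it explicitly instead of relying on the GOOGLE_APPLICATION_CREDENTIALS environment variable. A minimal sketch, assuming the dialogflow v2 node library with the file copied next to the bundled handler; the project and session IDs are placeholders, and this still needs the gRPC native module covered in part 2:

const path = require('path');
const dialogflow = require('dialogflow');

// Load credentials from the bundled file rather than the environment variable.
const sessionClient = new dialogflow.SessionsClient({
  keyFilename: path.join(__dirname, 'googleCredentials.json'),
});

// 'my-gcp-project' and 'some-session-id' are placeholder IDs.
const sessionPath = sessionClient.sessionPath('my-gcp-project', 'some-session-id');

module.exports.detectIntent = (text) =>
  sessionClient.detectIntent({
    session: sessionPath,
    queryInput: { text: { text, languageCode: 'en-US' } },
  });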
2) The DialogFlow node client uses this gRPC client, which has a C++ native module dependency. This is also true of all the other node clients for Google Cloud Platform products, like Datastore.
You will need to build the native C++ modules on an amazon-linux instance, either on your computer through Docker or on an EC2 instance.
C++ Addons as AWS Lambda functions
Using Packages and Native nodejs Modules in AWS Lambda
REST JSON API instead of gRPC
Since native C++ modules are annoying to build, and the Google Cloud Platform node clients also add ~30 MB to your serverless .zip bundle, you might want to avoid the gRPC client and find/write an HTTP client that calls the REST JSON API instead. JSON over HTTP has higher latency than gRPC, but this is not significant unless you have many layers of microservices calling each other.
In the future the node gRPC clients might work without C++ modules, using pure JavaScript, and weigh far less than 30 MB, but at the time of writing there is no sign of commitment except for an alpha-stage submodule in the gRPC node client.
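For the REST route, a minimal sketch that calls the Dialogflow v2 detectIntent endpoint over plain HTTPS. It assumes google-auth-library (which has no native module dependency) is added as a dependency and the credentials file is bundled as in part 1; the project and session IDs are placeholders:

const { GoogleAuth } = require('google-auth-library');

// Authenticates with the bundled service-account key and calls the REST API
// directly, so no gRPC native module is required.
async function detectIntent(text) {
  const auth = new GoogleAuth({
    keyFilename: './googleCredentials.json',
    scopes: ['https://www.googleapis.com/auth/cloud-platform'],
  });
  const client = await auth.getClient();
  const url =
    'https://dialogflow.googleapis.com/v2/projects/my-gcp-project/agent/sessions/some-session-id:detectIntent';
  const res = await client.request({
    url,
    method: 'POST',
    data: { queryInput: { text: { text, languageCode: 'en-US' } } },
  });
  return res.data;
}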

Related

How to include static files into the lambda package with AWS SAM esbuild?

I have a NodeJS AWS Lambda function which generates an e-mail based on an HTML template file (emailTemplate.html). I started building my lambdas with esbuild via SAM. Now I wonder how I can configure SAM/esbuild to include this file in my lambda package.
This is the SAM template configuration for the lambda:
EmailNotificationFunction:
  Type: AWS::Serverless::Function
  Properties:
    CodeUri: ./lambdas-node/email-notifications/
    Handler: daily-summary.handler
    Timeout: 120
    MemorySize: 512
    Runtime: nodejs16.x
  Metadata:
    BuildMethod: esbuild
    BuildProperties:
      Sourcemap: true
      EntryPoints:
        - daily-summary.ts
In my application code, I read the file from the local file system:
fs.readFileSync("./emailTemplate.html", "utf-8")
The html file is small so I'd like to stick to this minimalistic approach. I can always fetch the file from S3 or package it in a layer but I prefer not to go there.
Ok, so basically esbuild's file loader is the way to go. esbuild replaces the import with a reference to the file and copies the file over to the resulting bundle. (That's exactly what I wanted.)
This behaviour seems rather specific to esbuild and will not work with the regular tsc compiler, so I split my build step into type checking with tsc and transpiling with esbuild (see below).
I added an import of the HTML file to my code. This triggers esbuild to do something with the file.
import emailTemplateHtmlUrl from "./emailTemplate.html";
To keep the typechecker happy, I also added a types.d.ts file (mind the .d.ts extension):
declare module '*.html' {
  const value: string;
  export default value;
}
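For reference, a minimal sketch of how the imported value can then be used at runtime. It assumes the default file-loader behaviour, where the import resolves to the emitted file's name relative to the bundle, so joining it with __dirname is my assumption:

import * as fs from "fs";
import * as path from "path";
import emailTemplateHtmlUrl from "./emailTemplate.html";

// emailTemplateHtmlUrl is the name of the HTML file that esbuild copied next
// to the bundle, so resolve it against the handler's own directory.
const emailTemplate = fs.readFileSync(
  path.join(__dirname, emailTemplateHtmlUrl),
  "utf-8"
);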
And then I added the Loader to my SAM template so that ESBuild will copy over html files and reference them in the import:
EmailNotificationFunction:
  Type: AWS::Serverless::Function
  Properties:
    CodeUri: ./lambdas-node/email-notifications/
    Handler: daily-summary.handler
    Timeout: 120
    MemorySize: 512
    Runtime: nodejs16.x
  Metadata:
    BuildMethod: esbuild
    BuildProperties:
      Sourcemap: true
      Loader:
        - .html=file
      EntryPoints:
        - daily-summary.ts
And finally, my new test command now looks like this:
tsc --noEmit
npx esbuild daily-summary.ts --outdir=. --loader:.html=file --platform=node --bundle
mocha *.spec.js

electron-builder publish error: API V3 is no longer supported

I'm trying to publish a program to a GitLab server via electron-builder. This is my electron-config.yml file:
appId: ch.janisperren.arawexdashboard
publish:
  provider: github
  token: mytoken
  host: gitlab.myserver.ch
  owner: myname
  repo: myrepo
asar: true
files:
  - "app.js"
  - "dist/myfiles/*"
linux:
  target:
    target: AppImage
    arch: x64
The app is being generated, but it is not published to the repo. I always get the following error message:
API V3 is no longer supported. Use API V4 instead
But I don't know how to force electron-builder to use API V4. Any suggestions?
Thanks!

How to add a custom folder and file via YAML in Serverless app

I am writing a serverless app using SAM. I created a config folder to keep some table information and some other info, then I load it in my app.js.
When I deploy locally using SAM deploy, I observe that the config folder is not included. Could you advise me how to add the config folder to the final build in the .aws-sam\build folder?
My YAML file:
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Description: Sample SAM Template for test
Globals:
  Function:
    Timeout: 120
Resources:
  HelloWorldFunction:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: hello-world/
      Handler: app.lambdaHandler
      Runtime: nodejs10.x
      Events:
        HelloWorld:
          Type: Api
          Properties:
            Path: /hello
            Method: get
Also when I run the project in debug mode I am getting this error:
{
  "errorType": "Runtime.ImportModuleError",
  "errorMessage": "Error: Cannot find module '../config/config.js'"
}
I load the js file as below:
"use strict";
let response;
const AWS = require('aws-sdk');
const config = require('../config/config.js');
To include custom files that you need to reuse in multiple functions, like the config file in your case, you can use Lambda layers.
In your template.yml you would include a layer as follows:
ConfigLayer:
  Type: "AWS::Serverless::LayerVersion"
  Properties:
    CompatibleRuntimes:
      - nodejs10.x
    ContentUri: ./config/
and then add it to your lambda function definition:
Type: AWS::Serverless::Function
Properties:
  Handler: cmd/lambdas/hello-world/app.lambdaHandler
  CodeUri: src/
  Runtime: nodejs10.x
  Layers:
    - Ref: ConfigLayer
  Events:
    CatchAll:
      Type: Api
      Properties:
        Path: /hello-world
        Method: GET
The contents of the config/ directory will be available under the /opt/ path.
That means the full path to your config.js will be /opt/config.js, and you can access it from any Lambda that uses that layer.
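A minimal sketch of what that looks like from the handler's side; the tableName property is a placeholder, not something from the original config:

"use strict";

// Load the shared config from the layer mount point instead of a relative path.
const config = require('/opt/config.js');

exports.lambdaHandler = async () => {
  return {
    statusCode: 200,
    // config.tableName is a placeholder property used only for illustration
    body: JSON.stringify({ table: config.tableName }),
  };
};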

How to setup serverless.yml and webpack.config for a multiple-runtime AWS Lambda service

I want to deploy AWS Lambda functions with Node8.10 and Ruby2.5 runtimes from one serverless.yml file.
I set up the following folder structure, with /node and /ruby holding my respective handlers.
- /nodeRubyLambdas
  - /node
    - handler.js
    - package.json, package-lock.json, /node_modules
  - /ruby
    - rubyRijndaelEncryption.rb
    - Gemfile, Gemfile.lock, /vendor
  - serverless.yml
  - webpack.config.js
  - package.json for serverless-webpack
Here is my serverless.yml
service: nodeRubyLambdas
plugins:
  - serverless-webpack
  - serverless-offline
custom:
  webpack:
    webpackConfig: ./webpack.config.js
    includeModules: true
provider:
  name: aws
  stage: dev
  region: us-west-2
  iamRoleStatements:
    - Effect: Allow
      Action:
        - lambda:InvokeFunction
      Resource: "*"
  package:
    individually: true
functions:
  nodeMain:
    handler: node/handler.main
    runtime: nodejs8.10
    events:
      - http:
          path: main
          method: get
    package:
      individually: true
  rubyEncryption:
    handler: ruby/rubyRijndaelEncryption.lambda_handler
    runtime: ruby2.5
    environment:
      RIJNDAEL_PASSWORD: 'a string'
    package:
      individually: true
My webpack configuration (this is the base example; I just added the bit to ignore Ruby files when I got my first error):
const slsw = require("serverless-webpack");
const nodeExternals = require("webpack-node-externals");

module.exports = {
  entry: slsw.lib.entries,
  target: "node",
  // Generate sourcemaps for proper error messages
  devtool: 'source-map',
  // Since 'aws-sdk' is not compatible with webpack,
  // we exclude all node dependencies
  externals: [nodeExternals()],
  mode: slsw.lib.webpack.isLocal ? "development" : "production",
  optimization: {
    // We do not want to minimize our code.
    minimize: false
  },
  performance: {
    // Turn off size warnings for entry points
    hints: false
  },
  // Run babel on all .js files and skip those in node_modules
  module: {
    rules: [
      {
        test: /\.js$/,
        loader: "babel-loader",
        include: __dirname,
        exclude: [/node_modules/, /\.rb$/]
      }
    ]
  }
};
Fail #0:
[Webpack Compilation error] Module parse failed
Fail #1:
Basically, webpack assumes all functions are .js and tries to package them as such. Based on this suggestion, I forced the entry point in my webpack config to be my handler.js:
module.exports = {
  entry: "./node/handler.js",
  target: "node",
  ...
This packages ONLY the Node Lambda. An empty placeholder for the Ruby Lambda is created on AWS.
Fail #2:
I commented out webpack from serverless.yml and added include and exclude statements in the functions package options.
functions:
  nodeMain:
    package:
      individually: true
      include:
        - node/**
        - handler.js
      exclude:
        - ruby/**
        - rubyLambda/**
  rubyEncryption:
    package:
      individually: true
      include:
        - vendor/**
        - rubyRijndaelEncryption.rb
      exclude:
        - Gemfile
        - Gemfile.lock
        - node/**
This gets an [ENOENT: no such file or directory] error for node/node_modules/#babel/core/node_modules/.bin/parser. That file is not there, but I don't understand why it is looking for it, since webpack is not being called.
Sort of success?:
I was able to get the Lambdas to deploy if I commented out webpack and used
serverless deploy function -f <function name here>
to deploy the Ruby Lambda and then uncommented webpack and used the same thing to deploy the Node Lambda.
I'm convinced that there's a better way to get them to deploy; Have I missed something in my setup? Is there another option I haven't tried?
P.S. I did see this pull request https://github.com/serverless-heaven/serverless-webpack/pull/256, but it seems to be abandoned since 2017.
serverless-webpack is not designed for non-JS runtimes. It hijacks serverless packaging and deploys ONLY the webpack output.
Here are your options:
Don't use serverless-webpack and simply use serverless' built-in packaging.
You can use webpack directly (not serverless-webpack) and change your build process to compile with webpack first, then let serverless deploy the output folder (see the sketch below).
P.S. The package.individually property is a root-level property in your serverless.yml. It shouldn't be in provider or in your function definitions.
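For the second option, a minimal sketch of a build script that runs webpack programmatically before serverless deploy. The script name is an assumption, and note that the entry would have to be set explicitly in webpack.config.js instead of slsw.lib.entries, since serverless-webpack would no longer drive the build:

// build.js -- run with `node build.js` before `serverless deploy`
const webpack = require('webpack');
const config = require('./webpack.config.js');

webpack(config, (err, stats) => {
  if (err || stats.hasErrors()) {
    // Surface compilation problems and fail the build
    console.error(err || stats.toString({ colors: false }));
    process.exit(1);
  }
  console.log(stats.toString({ colors: true }));
});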
For those who may be looking for options for multiple-runtimes other than serverless-webpack, I ended up switching to this plugin: https://www.npmjs.com/package/serverless-plugin-include-dependencies.
It works with my runtimes (Ruby and Node) and lets you use package.individually with package.include/exclude at the root and function level if the plugin misses something.

Microsoft Azure hosting | Nodejs | file doesn't get compiled by webpack

I'm trying to build a web app using Node.js. It compiles and runs fine on my local machine, but when I try to host it on Azure, webpack seems to cause a problem.
//webpack.config.js
var config = {
  entry: './main.js',
  output: {
    path: '/',
    filename: 'index.js',
  },
  devServer: {
    // inline: true,
    // port: 8080
  },
  module: {
    loaders: [
      {
        test: /\.jsx?$/,
        exclude: /node_modules/,
        loader: 'babel-loader',
        query: {
          presets: ['es2015', 'react']
        }
      }
    ]
  }
};
module.exports = config;
This is the file hierarchy:
This is the sources tab in the Chrome dev tools on my local machine. I notice that index.js gets compiled as specified in the config file.
Then I just place the source on the server using git. This is the response I get from the server:
This is the sources tab for the hosting server.
I suspect it could be because of a difference in how the directories are interpreted on my local machine and on the host?! I'm using macOS.
Basically, you would need to compile your webpack application before deploying it to the Azure production server. However, you can also leverage a custom deployment script to install Node.js modules and run custom scripts that build your webpack application on Azure Web Apps during the deployment task. For detailed steps, please check out this post on StackOverflow.
