I can't seem to find the solution for this in the Firebase Documentation.
I want to test my functions.https.onCall functions locally. Is it possible using the shell, or can I somehow connect my client (with the Firebase SDK enabled) to a local server?
I want to avoid having to deploy every time just to test a change to my onCall functions.
My code
Function:
exports.myFunction = functions.https.onCall((data, context) => {
  // Do something
});
Client:
const message = { message: 'Hello.' };

firebase.functions().httpsCallable('myFunction')(message)
  .then(result => {
    // Do something
  })
  .catch(error => {
    // Error handler
  });
To run against the local emulator, you must call (after firebase.initializeApp):
firebase.functions().useFunctionsEmulator('http://localhost:5000')
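Note that the URL must match the port the functions emulator actually listens on; recent CLI versions default to 5001 for functions, which you can pin in firebase.json (a minimal sketch of the standard emulator config):

{
  "emulators": {
    "functions": { "port": 5001 }
  }
}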
Although the official Firebase Cloud Function docs have not yet been updated, you can now use firebase-functions-test with onCall functions.
You can see an example in their repository.
I have managed to test my TypeScript functions using Jest; here is a brief example. There are some peculiarities here, like import order, so make sure to read the docs :-)
/* functions/src/test/index.test.js */
/* dependencies: Jest and ts-jest */
const admin = require("firebase-admin");
jest.mock("firebase-admin");
admin.initializeApp = jest.fn(); // stub the init (see docs)
const fft = require("firebase-functions-test")();
import * as funcs from "../index";

// myFunc is an https.onCall function
describe("test myFunc", () => {
  // helper function so I can easily test different context/auth scenarios
  const getContext = (uid = "test-uid", email_verified = true) => ({
    auth: {
      uid,
      token: {
        firebase: {
          email_verified
        }
      }
    }
  });

  const wrapped = fft.wrap(funcs.myFunc);

  test("returns data on success", async () => {
    const result = await wrapped(null, getContext());
    expect(result).toBeTruthy();
  });

  test("throws when no Auth context", async () => {
    await expect(wrapped(null, { auth: null })).rejects.toThrow(
      "No authentication context."
    );
  });
});
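If you initialize firebase-functions-test with real project config, its docs also recommend cleaning up after the run; a minimal sketch:

afterAll(() => {
  fft.cleanup(); // restores environment changes made by firebase-functions-test
});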
There is a simple trick that simplifies onCall function testing: declare the onCall callback as a named local function and test that instead:
export const _myFunction = (data, context) => { // <= call this in your unit tests
  // Do something
}

exports.myFunction = functions.https.onCall(_myFunction);
Now you can exercise every case with a normal function call, using whatever input you define, as in the sketch below.
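For example, a test can now invoke the plain function directly with whatever data and context it needs (a minimal sketch; the context shape here is an assumption, adjust it to whatever your function actually reads):

const { _myFunction } = require("../index");

test("myFunction handles a message", async () => {
  const data = { message: "Hello." };
  const fakeContext = { auth: { uid: "test-uid" } }; // hypothetical auth context
  const result = await _myFunction(data, fakeContext);
  expect(result).toBeDefined();
});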
Callables are just HTTPS functions with a specific request/response format. You can test one just like an HTTPS function, except that you have to write code that speaks the protocol defined in the documentation.
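As a rough sketch of that protocol (the authoritative details are in the "Protocol specification for https.onCall" section of the docs): the request is a POST whose JSON body nests the payload under a data key, and a successful response nests its payload under result (a failure comes back under error). The URL below is hypothetical; the emulator prints the real one on startup:

async function callMyFunction() {
  const res = await fetch("http://localhost:5001/my-project/us-central1/myFunction", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ data: { message: "Hello." } }),
  });
  const { result, error } = await res.json();
  return error ? Promise.reject(error) : result;
}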
You should first check for the dev environment and then point your functions at the local emulator.
For JS:
// after firebase init
if (window.location.host.includes("localhost") ||
    window.location.host.includes("127.0.0.1")) {
  firebase
    .app()
    .functions() // also pass a location here if you specify one when invoking functions()
    .useFunctionsEmulator("http://localhost:5001");
}
or, if you don't create an app instance explicitly:
// after firebase init
if (window.location.host.includes("localhost") ||
    window.location.host.includes("127.0.0.1")) {
  firebase
    .functions()
    .useFunctionsEmulator("http://localhost:5001");
}
or, when serving pages from the backend (Node.js):
// after firebase init
if (process.env.NODE_ENV === 'development') {
  firebase.functions().useFunctionsEmulator('http://localhost:5001');
}
If you are using AngularFire, add this to your app.module:
{
  provide: FirestoreSettingsToken,
  useValue: environment.production
    ? undefined
    : {
        host: "localhost:5002",
        ssl: false
      }
}
Here is what I want to achieve: I want to fetch a JSON document from a URL on a daily basis and convert it into a Cloud Firestore collection, in order to use it in my Flutter app. Ideally, the script would only add new data to the collection.
I saw that I can use the scheduler in Firebase Cloud Functions to run tasks daily. That's not the problem for now.
However, I don't know how to use Firebase Cloud Functions properly to get data from a URL and convert it into a collection. Maybe that's not the point of Cloud Functions and I misunderstood something. So, first question: Can I run classic Node.js stuff inside Cloud Functions? I suppose I can.
Next, I initialized a Cloud Functions project locally, connected it to my Google account, and started writing code in index.js.
const functions = require("firebase-functions");
const admin = require('firebase-admin');
const fetch = require('node-fetch');

admin.initializeApp(); // required before using admin.firestore()

const db = admin.firestore();
const collectionToiletRef = db.collection('mycollection');

let settings = { method: "Get" };
let url = "my-url.com";

fetch(url, settings)
  .then(res => res.json())
  .then((json) => {
    console.log(json);
    // TODO for each json object, add new document
  });
Second question: How can I run this code to see if it works? I saw that the emulator can be used, but how can I visually check my Cloud Firestore collection? In this simple example, I only want to print my JSON to see if I can get the data correctly. Where would the printing show up?
Maybe Cloud Functions is not what I need for this task. Maybe my code is bad. I don't know. Thanks for your help.
EDIT
I tried this but the call never ends. I think it's waiting for a promise that never returns or something like that.
const functions = require("firebase-functions");
const admin = require('firebase-admin');
const fetch = require('node-fetch');

admin.initializeApp();
const db = admin.firestore();

exports.tempoCF = functions
  .firestore.document('/tempo/{docId}')
  .onCreate(async (snap, context) => {
    console.log("onCreate");
    let settings = { method: "Get" };
    let url = "https://opendata.paris.fr/api/records/1.0/search/?dataset=sanisettesparis&q=&rows=-1";
    try {
      let response = await fetch(url, settings);
      let json = await response.json();
      // TODO for each json object, add new document
      await Promise.all(json["records"].map(toiletJsonObject => {
        // Only to create documents, I will deal with the content later
        return db.collection('toilets').doc(toiletJsonObject["recordid"]).set({});
      }));
    } catch (error) {
      console.log(error);
      return null;
    }
  });
This code works and creates all the documents I want, but it never returns. However, the async (snap, context) => {} passed to onCreate returns a Promise, and that promise settles when Promise.all settles. I'm missing something, but I don't know what. I struggle a lot with async programming in Dart and JS; it's not very clear in my mind.
Can I run classic Node.js stuff inside Cloud Functions?
Sure! Since the fetch method returns a Promise, you can very well use it in a background-triggered or scheduled Cloud Function.
How can I run this code to see if it works?
Your code will work perfectly in the emulator suite, but you will need to trigger the Cloud Function via one of the Firebase services that can run in the emulator. For example, you can trigger it by creating a document from the Firestore emulator UI.
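Concretely (assuming a standard project setup), you would start the suite with:

firebase emulators:start

then create a document in the tempo collection from the Emulator UI (usually at http://localhost:4000), and watch the function's console.log output in the terminal running the emulators, or in the UI's Logs tab.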
The following Cloud Function will do the trick: just create a doc in a dummy tempo collection and the CF will add a new doc in a newDocs collection. It's up to you to adapt the field values for this doc; I've just used the entire JSON object.
exports.tempoCF = functions
  .firestore.document('/tempo/{docId}')
  .onCreate((snap, context) => {
    let settings = { method: "Get" };
    let url = "https://...";
    return fetch(url, settings)
      .then(res => res.json())
      .then((json) => {
        console.log(json);
        // TODO for each json object, add new document
        return admin.firestore().collection('newDocs').add(json);
      })
      .catch(error => {
        console.log(error);
        return null;
      });
  });
You could also deploy your Cloud Function to the Firebase backend, and if you want to schedule it, just change the code as follows (change the trigger):
exports.scheduledFunction = functions.pubsub.schedule('every 5 minutes').onRun((context) => {
  let settings = { method: "Get" };
  let url = "https://...";
  return fetch(url, settings)
    .then(res => res.json())
    .then((json) => {
      console.log(json);
      // TODO for each json object, add new document
      return admin.firestore().collection('newDocs').add(json);
    })
    .catch(error => {
      console.log(error);
      return null;
    });
});
Edit following your edit:
The following code does work correctly in the emulator, creating docs in the toilets collection.
exports.tempoCF = functions.firestore
  .document('/tempo/{docId}')
  .onCreate(async (snap, context) => {
    console.log('onCreate');
    let settings = { method: 'Get' };
    let url =
      'https://opendata.paris.fr/api/records/1.0/search/?dataset=sanisettesparis&q=&rows=-1';
    try {
      let response = await fetch(url, settings);
      let json = await response.json();
      // Here we return the promise returned by Promise.all(), so the life cycle of the CF is correctly managed
      return Promise.all(
        json['records'].map((toiletJsonObject) => {
          // return each set() promise so that Promise.all() actually waits for the writes
          return admin
            .firestore()
            .collection('toilets')
            .doc(toiletJsonObject['recordid'])
            .set({ adresse: toiletJsonObject.fields.adresse });
        })
      );
    } catch (error) {
      console.log(error);
      return null;
    }
  });
Background
I am returning data from AWS Secrets Manager and using the aws-sdk to do so. Earlier I asked a question about how to correctly return the data and export it since the exported object never had the data resolved by the time the export was imported somewhere else. This caused me to get a bunch of undefined.
After solving that problem, it was determined that the way to handle this was to wrap the aws-sdk call in a promise, then consume the promise in another file with async/await. This is causing me issues.
Example
If I request and return the data from AWS like this,
let secrets = {
  jwtHash: 10,
};

const client = new AWS.SecretsManager({
  region: region
});

const promise = new Promise((resolve, reject) => {
  client.getSecretValue({ SecretId: secretName }, (err, data) => {
    if (err) {
      reject(err);
    } else {
      const res = JSON.parse(data.SecretString); // JSON.parse is synchronous; no await needed
      secrets.dbUsername = res.username;
      secrets.dbPassword = res.password;
      secrets.dbHost = res.host;
      secrets.dbPort = res.port;
      secrets.dbDatabase = res.dbname;
      resolve(secrets);
    }
  });
});

module.exports = promise;
Then I can import it in another file and use the data like this,
const promise = require('../secrets');

(async () => {
  const secrets = await promise;
  // use secrets here
})();
Now let's say in that file where I am trying to use secrets I have something like this,
const pool = new Pool({
  user: secrets.dbUsername,
  host: secrets.dbHost,
  database: secrets.dbDatabase,
  password: secrets.dbPassword,
  port: secrets.dbPort
});

pool.on('error', err => {
  console.error('Unexpected error on idle client', err);
  process.exit(-1);
});

module.exports = pool;
If I wrap the pool creation in a self-invoking async function, I have trouble exporting it so it can be used anywhere in my app when I need a database connection. Similarly, many functions throughout my application need access to the secret data. If I were to walk through the application wrapping all of my code in async functions, it would just multiply these difficulties.
Question
It seems to me the best solution here would be to return the data asynchronously and once it has resolved, export it synchronously.
How can I achieve such a task in this scenario?
A win here would be,
Make the request in /secrets/index.js
Build the secrets object in the same file
Export secrets as an object that can easily be imported anywhere else in my application without the need for asynchronous functions.
Example of How I Would Like to Use This
const secrets = require('../secrets');

const pool = new Pool({
  user: secrets.dbUsername,
  host: secrets.dbHost,
  database: secrets.dbDatabase,
  password: secrets.dbPassword,
  port: secrets.dbPort
});

pool.on('error', err => {
  console.error('Unexpected error on idle client', err);
  process.exit(-1);
});

module.exports = pool;
Because the needed data is fetched asynchronously, there's no way around making everything that depends on it (somehow) asynchronous as well. With asynchronicity involved, one option is to export functions that can be called on demand, rather than exporting objects:
an object that depends on the asynchronous data can't be meaningfully exported before the data comes back
if you export functions rather than objects, you can ensure that control flow starts from your single entry point and heads downstream, rather than every module initializing itself at once (which can be problematic when some modules depend on others to be initialized properly, as you're seeing)
On another note, if you have a single Promise that needs to resolve, it's probably easier to call .then on it than to use an async function. For example, rather than
const promise = require('../secrets');

(async () => {
  // try/catch is needed to handle rejected promises when using await:
  try {
    const secrets = await promise;
    // use secrets here
  } catch (e) {
    // handle errors
  }
})();
you might consider:
const promise = require('../secrets');

promise
  .then((secrets) => {
    // use secrets here
  })
  .catch((err) => {
    // handle errors
  });
It's less wordy and probably easier to make sense of at a glance - better than an async IIFE. IMO, the place to use await is when you have multiple Promises that need to resolve and chaining .thens and returned Promises together gets too ugly; see the sketch below.
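For instance, several dependent steps read more naturally with await than with nested .then calls (a sketch with hypothetical names):

(async () => {
  const secrets = await require('../secrets'); // first promise
  const pool = makePool(secrets);              // synchronous helper, defined below
  const client = await pool.connect();         // second promise, depends on the first
  // ... use client
})();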
A module that depends on secrets has to contain something that effectively waits for secrets to be populated. Although being able to write const secrets = require('../secrets'); as in your lower code example would be nice, it just isn't possible like that. You can instead export a function that takes secrets as a parameter rather than requiring it, and (synchronously!) returns the instantiated pool:
// note, secrets is *not* imported
function makePool(secrets) {
  const pool = new Pool({
    user: secrets.dbUsername,
    host: secrets.dbHost,
    database: secrets.dbDatabase,
    password: secrets.dbPassword,
    port: secrets.dbPort
  });
  pool.on('error', err => {
    console.error('Unexpected error on idle client', err);
    process.exit(-1);
  });
  return pool;
}

module.exports = makePool;
Then, to use it in another module, once the secrets are created, call makePool with the secrets, and then use / pass around the returned pool:
const secretsProm = require('../secrets');
const makePool = require('./makePool');

secretsProm
  .then((secrets) => {
    const pool = makePool(secrets);
    doSomethingWithPool(pool);
  })
  .catch((err) => {
    // handle errors
  });
Note that the doSomethingWithPool function can be completely synchronous, as is makePool - the asynchronous nature of secrets, once handled with .then in one module, does not have to be dealt with asynchronously anywhere else, as long as other modules export functions, rather than objects.
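If you would rather not thread the pool through parameters everywhere, one common middle ground (a sketch of my own, not required by the approach above) is to export a function that lazily creates and memoizes the pool's promise:

// pool.js - hypothetical sketch combining the two modules above
const secretsProm = require('../secrets');
const makePool = require('./makePool');

let poolPromise;
function getPool() {
  if (!poolPromise) {
    poolPromise = secretsProm.then(makePool); // first caller kicks this off; later callers reuse it
  }
  return poolPromise;
}

module.exports = getPool;

Callers then do const pool = await require('./pool')();, which stays cheap after the first call.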
I would suggest doing everything in one file, and then instead of exporting the object you create, export a function that returns the object. The function will always have access to the most up-to-date version of the object, and you can call it from any file to access the same object.
Example:
Create two files in a folder. In the first file, we will do this:
Define a value.
Set a timeout to change the value after some time
Export the value itself
Export a function that returns the value
values.js
let x = 0; // set initial value

setTimeout(() => { x = 5; }, 2000); // sometime later, value will change

const getValueOfX = () => { return x; };

module.exports = {
  x: x,
  getValueOfX: getValueOfX
};
Now in the other file, we just import the two exports from the previous file (we put them both in an object for easy exporting). We can then log them out, wait for some time to pass, and log them out again.
index.js
let values = require('./values');
console.log(`Single value test. x = ${values.x}`);
console.log(`Function return value test. x = ${values.getValueOfX()}`);
setTimeout(() => { console.log(`Single value test. x = ${values.x}`); }, 4000);
setTimeout(() => { console.log(`Function return value test. x = ${values.getValueOfX()}`); }, 4000);
To run the code, just open your Terminal or Command Prompt and, from the same directory as these two files, run node index.js
You'll see that when just the value (object, array, w/e) is exported, it is exported as-is when the export runs - almost always before the API call is finished.
BUT - If you export a function that returns the value (object, array, w/e), then that function will retrieve the up-to-date version of the value at the time it is called! Great for API calls!
So your code might look like this:
let secrets = { jwtHash: 10 };

const client = new AWS.SecretsManager({
  region: region
});

let pool = null;

client.getSecretValue({ SecretId: secretName }, (err, data) => {
  if (err) {
    console.error(err); // there is no promise here, so log/handle the error instead of calling reject()
  } else {
    const res = JSON.parse(data.SecretString);
    pool = new Pool({
      user: res.username,
      host: res.host,
      database: res.dbname,
      password: res.password,
      port: res.port
    });
    pool.on('error', err => {
      console.error('Unexpected error on idle client', err);
      process.exit(-1);
    });
  }
});

module.exports = function () { return pool; };
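One caveat with this pattern: the getter returns null until the callback has finished, so callers must tolerate that window (a minimal usage sketch; the file path is hypothetical):

const getPool = require('./secrets'); // the file sketched above

function query(sql) {
  const pool = getPool();
  if (!pool) {
    throw new Error('Pool not ready yet'); // or retry / queue until it is
  }
  return pool.query(sql);
}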
One thing I do (especially when working with a large application that imports static variables which have been moved to a database) is load that file via a function that populates an export.
// config.js
const exports = {};

export async function populate() {
  const RUNTIMEVARS = await what_you_do_to_get_vars();
  for (const config of RUNTIMEVARS) {
    exports[config.key] = config.data; // store the value itself
  }
  // for anything needing the config in the bootstrap.
  return exports;
}

export default exports;
Then in the bootstrap:
// bootstrap.js
import './database-connection.js'; // important to have no internal dependencies.

(async () => {
  const { populate } = await import('./config.js');
  await populate();
  import('./application/index.js');
})()
Now any file inside your application can import config from '../config.js' as though it were statically declared, since we populated the object in the populate function during bootstrap, as in the sketch below.
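Usage then looks like an ordinary static import (a sketch; someKey is a hypothetical key filled in by populate()):

// application/some-module.js
import config from '../config.js';

console.log(config.someKey); // already populated, because bootstrap awaited populate() first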
I am wondering how to properly test Azure Functions with Jest. I have read the online documentation provided by Microsoft, but it's very vague and brief, and the few articles I found are outdated and don't explain much. Here is where I stand: I understand how to test normal JS async functions with Jest, and I understand how to test very simple Azure Functions. However, I am not sure how to properly test more complex Azure Functions that make multiple API calls, etc.
For example, I have an HTTP function that is supposed to make a few API calls, mutate the data, and then return the output. How do I properly mock the API calls in the test? We only have one point of entry for the function (one exported function: module.exports = async function (context, req)), so all of our tests enter through there. If I have sub-functions making calls, I can't access them from the test. So is there some clever way of mocking the API calls? (Actually calling APIs during tests is bad practice/design.)
Here is a code sample to show what I mean:
module.exports = async function (context, req)
{
  let response = {}

  if (req.body && req.body.id)
  {
    try
    {
      // get order details
      response = await getOrder(context, req)
    }
    catch (err)
    {
      response = await catchError(context, err);
    }
  }
  else
  {
    response.status = 400
    response.message = 'Missing Payload'
  }

  // respond
  context.res =
  {
    headers: { 'Content-Type': 'application/json' },
    status: response.status,
    body: response
  }
};

async function getOrder(context, req)
{
  // connection to db
  let db = await getDb() // <- how to mock this
  // retrieve resource
  let item = await db.get...(id:req.body.id)... // <- and this
  // return
  return {'status':200, 'data':item}
}
Consider this (simplified) example.
src/index.js (Azure Function entry point):
const { getInstance } = require('./db')

module.exports = async function (context) {
  // assuming we want to mock getInstance and db.getOrder
  const db = await getInstance()
  const order = await db.getOrder()
  return order
}
src/db.js:
let db

async function getInstance() {
  if (db === undefined) {
    // connect ...
    db = new Database()
  }
  return db
}

class Database {
  async getOrder() {
    return 'result from real call'
  }
}

module.exports = {
  getInstance,
  Database,
}
src/__tests__/index.test.js:
const handler = require('../index')
const db = require('../db')

jest.mock('../db')

describe('azure function handler', () => {
  it('should call mocked getOrder', async () => {
    const dbInstanceMock = new db.Database() // db.Database is already auto-mocked
    dbInstanceMock.getOrder.mockResolvedValue('result from mock call')
    db.getInstance.mockResolvedValue(dbInstanceMock)

    const fakeAzureContext = {} // fake the context accordingly so that it triggers "getOrder" in the handler
    const res = await handler(fakeAzureContext)

    expect(db.getInstance).toHaveBeenCalledTimes(1)
    expect(dbInstanceMock.getOrder).toHaveBeenCalledTimes(1)
    expect(res).toEqual('result from mock call')
  })
})
> jest --runInBand --verbose

PASS src/__tests__/index.test.js
  azure function handler
    ✓ should call mocked getOrder (4 ms)
For a complete quickstart, you may want to check my blog post.
This one is starting to get under my skin...
I'm playing around with Firebase and Functions right now and made a very simple API with express as the middleware.
So I have this route:
...
app.get('/getAuthUrl', async (req, res) => {
  const s: sessionManagement.ISessionManagement =
    sessionManagement.SessionFactory.createSession(
      functions.config().aproxy.session.mode, req, db)

  // ...this works
  const getback = req.query.getback;
  await db.collection('tokens').doc('getback').set({getback});

  // this doesn't?!
  s.setReturnURL(req.query.getback);

  res.setHeader('Cache-Control', 'private');
  res.status(200).json(new Message(redirectUri));
})
At first, I used the two lines under the "...this works" comment, and they did work: writing a URL to a Firestore database. So far, so good.
Then I decided to get fancy and use an object factory to deal with the fact that Express sessions, with locally emulated functions on one port and the Angular frontend on another, were causing headaches with session management. Long story short, I set up a factory that returns an object implementing a different strategy depending on whether I run locally or on Firebase cloud hosting.
This is where Promises stopped holding their... promise!?
The line s.setReturnURL(...) calls this method on the object built by my factory:
export interface ISessionManagement {
  setReturnURL: (value: string) => void
}

export class FirestoreSession implements ISessionManagement {
  private database: FirebaseFirestore.Firestore
  private fingerprint: string

  constructor(database: FirebaseFirestore.Firestore, fingerprint: string) {
    this.database = database
    this.fingerprint = fingerprint
  }

  setReturnURL(value: string) {
    console.log('setReturnURL')
    this.writeDoc(value, 'getback')
  }

  writeDoc(value: string, document: string) {
    (async () => {
      console.log('inside async')
      const doc = this.database.collection('tokens').doc(document).set({value})
      const result = await doc
      console.log(result)
    })().catch(error => {
      console.log(error)
    })
    console.log('finish writeDoc')
  }
}

export class SessionFactory {
  public static createSession(mode: string, req: express.Request, db: FirebaseFirestore.Firestore): ISessionManagement {
    if (mode === 'fingerprint') {
      console.log('fingerprint mode');
      // Used for local testing, since in a multi-port setup a cookie will not be unique to a session
      return new FirestoreSession(db, fingerPrintMe(req));
    } else {
      // For production
      return new ExpressSession(req);
    }
  }
}
So, here is the console output of the execution flow when I use my factory object:
✔ functions[app]: http function initialized (http://localhost:5000/myplayground/us-central1/app).
i functions: Beginning execution of "app"
> fingerprint mode
> setReturnURL
> inside async
> finish writeDoc
i functions: Finished "app" in ~1s
The code above is my latest attempt to make this work. I tried A LOT of variations, moving my async/await around, but nothing budged!
What am I missing??? I feel like the answer will make me crawl under some rock, but I don't care; I need to get this one off my mind :-)
UPDATE
Turns out that I need to async/await all the way down to the call that returns a promise to make this work, AND ALSO you cannot await a function whose signature says it returns void, so I had to change the interface signature. Here is the code fragment that works:
app.get('/getAuthUrl', async (req, res) => {
  ...
  await s.setReturnURL(req.query.getback)
  ...
})

...

export interface ISessionManagement {
  setReturnURL: (value: string) => any
}

export class FirestoreSession implements ISessionManagement {
  ...
  async setReturnURL(value: string) {
    await this.writeDoc(value, 'getback')
  }
  ...
  async writeDoc(value: string, document: string) {
    try {
      const doc = await this.database.collection('tokens').doc(document).set({value})
      console.log(doc)
    } catch (error) {
      console.log(error)
    }
  }
}
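As an aside, rather than widening the return type to any, a more precise signature (a sketch) would declare the promise explicitly, which also keeps await meaningful to the type checker:

export interface ISessionManagement {
  setReturnURL: (value: string) => Promise<void>
}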
const doc = this.database.collection('tokens').doc(document).set({value})
const result = await doc
Why not just const result = await this.database.collection('tokens').doc(document).set({value})?
writeDoc(value: string, document: string) {
  (async () => {
    console.log('inside async')
    const doc = this.database.collection('tokens').doc(document).set({value})
    const result = await doc
    console.log(result)
  })().catch(error => {
    console.log(error)
  })
  console.log('finish writeDoc')
}
First, writeDoc does not return a promise and isn't awaited. The internal async function isn't awaited either, so once execution pauses at its await, the rest of writeDoc moves on: 'finish writeDoc' is logged, the remainder of the internal async function is queued for a later turn of the event loop, and then everything returns and exits.
Make writeDoc async, and await both it and the internal async function.
That's what it says in the AWS docs:
The module-name.export value in your function. For example,
"index.handler" calls exports.handler in index.js.
And it correctly calls this function:
exports.handler = (username, password) => {
  ...
}
But what if the code is like this:
module.exports = (username, password) => {
  ...
}
How do I call it? Nothing I tried works: module.exports, module.handler, etc.
AWS Lambda expects your module to export an object that contains a handler function. In your Lambda configuration you then declare the file that contains the module, and the name of the handler function.
The way modules are exported in Node.js is via the module.exports property. The return value of a require call is the contents of the module.exports property at the end of the file evaluation.
exports is just a local variable pointing to module.exports. You should avoid using exports, and instead use module.exports, since some other piece of code may overwrite module.exports, leading to unexpected behaviour.
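A quick illustration of the difference (a sketch):

exports.foo = 1;             // fine: adds foo to module.exports via the alias
exports = { bar: 2 };        // breaks the alias: module.exports does NOT gain bar
module.exports = { baz: 3 }; // replaces the export entirely; foo from above is gone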
In your first code example, the module correctly exports an object with a single function handler. In the second code example, however, your code exports a single function. Since this does not match AWS Lambda's API, this does not work.
Consider the following two files, export_object.js and export_function.js:
// export_object.js
function internal_foo () {
  return 1;
}

module.exports.foo = internal_foo;
and
// export_function.js
function internal_foo () {
  return 1;
}

module.exports = internal_foo;
When we run require('export_object.js') we get an object with a single function:
> const exp = require('./export_object.js')
undefined
> exp
{ foo: [Function: internal_foo] }
Compare that with the result we get when we run require('export_function.js'), where we just get a function:
> const exp = require('./export_function.js')
undefined
> exp
[Function: internal_foo]
When you configure AWS Lambda to run a function called handler, exported from a module defined in the file index.js, here is an approximation of what Amazon does when your function is invoked:
const handler_module = require('index.js');
return handler_module.handler(event, context, callback);
The important part there is the call to the handler function defined in the module.
You need to define or export your handler function.
exports.handler = (username, password) => {
  ...
}
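If you cannot change the original module, you can also wrap it in a small entry file and point the handler setting at that instead (a sketch; the event field names are assumptions about your payload):

// index.js - configure the Lambda handler as "index.handler"
const login = require('./original-module'); // the module doing module.exports = (username, password) => {...}

exports.handler = async (event) => {
  // adapt Lambda's (event, context) convention to the module's own signature
  return login(event.username, event.password);
};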
I have used it like this.
// index.js
const user = require('./user').user;

const handler = function (event, context, callback) {
  const { username, password } = event; // assuming the credentials arrive on the event
  user.login(username, password)
    .then((success) => {
      // success
    })
    .catch(() => {
      // error
    });
};

exports.handler = handler;
// user.js
const BPromise = require('bluebird'); // BPromise was undeclared in the original; bluebird is assumed

const user = {
  login(username, password) {
    return new BPromise((resolve, reject) => {
      // do what you want.
    });
  }
};

module.exports = { user }; // CommonJS export, matching the require() in index.js
When you write exports.handler = ..., what actually happens is that the assignment adds a property called handler to the module.exports object (exports is just an alias for it), creating the property if it doesn't already exist, with the name being whatever comes after exports. So use whatever name you want; handler is not special to AWS.
Try it yourself: make a new test.js file and copy the following code:
exports.handler = () => {
  console.log('hello from exports.handler')
}

exports.vinnyLovesJS = () => {
  console.log('Vinny loves JS!');
  (($) => { $('THIS is crazy! :)') })((a) => { console.log(a) },
    console.log('I do! But...'))
}

module.exports.handler();
module.exports.vinnyLovesJS();
Now run the code with:
`node test.js`
Output:
hello from exports.handler
Vinny loves JS!
I do! But...
THIS is crazy! :)
enjoy the extra JS craziness!
If you are trying to use TypeScript with an Alexa skill and are getting this issue, it's likely that your event handler is being compiled into a sub-folder rather than your main directory.
By default, AWS looks for your event handler in the root directory of your Lambda folder. However, if you are using TypeScript and compile your output to a build folder, you need to change where AWS looks for your handler. Modify ask-resources.json in the root directory of your skill from:
// ...
"skillInfrastructure": {
  "userConfig": {
    "runtime": "nodejs10.x",
    "handler": "index.handler",
    "awsRegion": "us-east-1"
  },
  "type": "@ask-cli/lambda-deployer"
}
// ...
to
// ...
"skillInfrastructure": {
  "userConfig": {
    "runtime": "nodejs10.x",
    "handler": "build/index.handler",
    "awsRegion": "us-east-1"
  },
  "type": "@ask-cli/lambda-deployer"
}
// ...
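This assumes your TypeScript build actually emits the compiled index.js into build/, e.g. with an outDir along these lines in tsconfig.json (sketch):

{
  "compilerOptions": {
    "outDir": "build"
  }
}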