I'm trying to write tests for a Node/PostgreSQL setup. I'm currently at the step where I create a table and then seed it with users. I have an async function called createUserTable and a second async function called seedUserTable, but when I run them I get the error 'relation "public.user" does not exist'. I'm really confused as to why my second async function is running before the first one has finished.
Here is my code:
const { Pool } = require('pg');
const { DB_HOST, DB_USERNAME, DB_PASSWORD, DB_PORT, DB_TEST_DATABASE } = require('./config');
const pool = new Pool({
  host: DB_HOST,
  port: DB_PORT,
  user: DB_USERNAME,
  password: DB_PASSWORD,
  database: DB_TEST_DATABASE,
});
const createUserTable = async () => {
  console.log('createUserTable')
  try {
    const newUserTable = await pool.query(`
      CREATE TABLE public.user
      (
        user_id SERIAL PRIMARY KEY UNIQUE NOT NULL,
        username VARCHAR(35) UNIQUE NOT NULL
      );
    `);
    console.log(newUserTable);
  } catch (error) {
    console.error(error.message);
  }
}
const seedUserTable = async () => {
  console.log('seedUserTable')
  try {
    const seedUserTable = await pool.query(`
      INSERT INTO public.user(username)
      VALUES (
        $1
      ),
      (
        $2
      )
      RETURNING *
      `,
      [
        'demo',
        'demo2',
      ]
    );
    console.log(seedUserTable);
  } catch (error) {
    console.error(error.message);
  }
}
createUserTable();
seedUserTable();
This is what I'm getting in the terminal.
$ node index2.js
createUserTable
seedUserTable
relation "public.user" does not exist
Result {
  command: 'CREATE',
  rowCount: null,
  oid: null,
  rows: [],
  fields: [],
  _parsers: undefined,
  _types: TypeOverrides {
    _types: {
      getTypeParser: [Function: getTypeParser],
      setTypeParser: [Function: setTypeParser],
      arrayParser: [Object],
      builtins: [Object]
    },
    text: {},
    binary: {}
  },
  RowCtor: null,
  rowAsArray: false
}
Edit: Removed the async from the two functions and wrapped the calls in an async function like this:
(async () => {
  await createUserTable();
  setTimeout(() => {
    console.log('5000 ms timeout');
  }, 5000);
  await seedUserTable();
})();
Currently getting this error:
createUserTable
Promise { <pending> }
seedUserTable
Promise { <pending> }
(node:19684) UnhandledPromiseRejectionWarning: error: relation "public.user" does not exist
5000 ms timeout
I suggest removing all the async changes you've added and doing just this (note that the setTimeout in your edit doesn't pause the async function; it only schedules a callback for later, which is why its log shows up after everything else):
(async () => {
  await createUserTable();
  await seedUserTable();
})();
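For completeness, a minimal sketch of how the end of the file might look with the calls sequenced and the pool closed afterwards (the error handling and the pool.end() call are additions for illustration, not part of the original answer):

(async () => {
  try {
    await createUserTable() // resolves before the next line runs
    await seedUserTable()   // the table is guaranteed to exist by now
  } catch (error) {
    console.error(error.message)
  } finally {
    await pool.end() // release the pool's connections so the script can exit
  }
})()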
I suspect the problem may come from how Postgres names tables when it creates them.
Try checking the names of the tables in the database after creation.
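A quick way to check, reusing the pool from the question, is to list what's actually in the public schema via information_schema (this helper is an illustration, not part of the original answer):

const listTables = async () => {
  const result = await pool.query(`
    SELECT table_schema, table_name
    FROM information_schema.tables
    WHERE table_schema = 'public';
  `)
  console.log(result.rows) // e.g. [{ table_schema: 'public', table_name: 'user' }]
}

Note also that user is a reserved word in Postgres, which is why the schema-qualified public.user parses while a bare user may need to be quoted as "user".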
I get a connection the following way:
async function getMongoConnection(): Promise<Connection> {
  const connection: Connection = await createConnection({
    type: "mongodb",
    host: "localhost",
    port: 27017,
    database: "test",
    synchronize: true,
    logging: true,
    entities: [
      File
    ]
  });
  return connection;
}
export { getMongoConnection };
and I have the following entity definition:
@Entity({
  name: "files"
})
export class File {
  @ObjectIdColumn()
  id: ObjectID;

  @Column()
  filePath: string;

  @Column()
  fileSize: number;
}
When I'm running a "save" or "insert" call, it seems to work fine, but running "find" on the manager throws a Cannot read property 'prototype' of undefined error. The collection doesn't appear in the database either when I'm checking it with MongoDB Compass.
However, manager.stats(File).count is incrementing just fine.
Here's the last bit of related code:
async function saveFileEntry(filePath: string) {
  const manager = (await mongoConnection).mongoManager;
  console.log(await (await (manager.stats(File))).count);
  const file = new File();
  file.filePath = filePath;
  file.fileSize = fs.statSync(filePath).size;
  manager.save(file)
    .then((value) => {
      console.log("Saved file:");
      console.log(value);
      manager.find(File)
        .then((files) => {
          files.forEach((file) => {
            console.log("Found file:");
            console.log(file);
          });
        })
        .catch((error) => {
          console.log(`Couldn't find file: ${error}`)
        });
    })
    .catch((error) => {
      console.log(`Couldn't save file: ${error}`);
    });
}
TypeORM and MongoDB just didn't work well together as of the time this answer was written.
We ended up switching to Postgres, and the same code worked right away with minimal changes.
If you've come upon this question with the same problem: too bad!
I have the following Lambda handler to unit test. It uses a library @org/aws-connection, which has a function mysql.getIamConnection that simply returns a Knex connection.
Edit: I have added the mysql.getIamConnection function to the bottom of the post.
Edit: If possible, I'd like to do the testing with only Jest, unless that becomes too complicated.
index.js
const {mysql} = require('@org/aws-connection');

exports.handler = async (event) => {
  const connection = await mysql.getIamConnection()
  let response = {
    statusCode: 200,
    body: {
      message: 'Successful'
    }
  }
  try {
    for (const currentMessage of event.Records) {
      let records = JSON.parse(currentMessage.body);
      await connection.transaction(async (trx) => {
        await trx
          .table('my_table')
          .insert(records)
          .then(() =>
            console.log(`Records inserted into table ${table}`))
          .catch((err) => {
            console.log(err)
            throw err
          })
      })
    }
  } catch (e) {
    console.error('There was an error while processing', { errorMessage: e })
    response = {
      statusCode: 400,
      body: e
    }
  } finally {
    connection.destroy()
  }
  return response
}
I have written some unit tests, and I'm able to mock the connection.transaction function, but I'm having trouble with the trx.table().insert().then().catch() chain.
Here is my testing file
index.test.js
import { handler } from '../src';
const mocks = require('./mocks');

jest.mock('@org/aws-connection', () => ({
  mysql: {
    getIamConnection: jest.fn(() => ({
      transaction: jest.fn(() => ({
        table: jest.fn().mockReturnThis(),
        insert: jest.fn().mockReturnThis()
      })),
      table: jest.fn().mockReturnThis(),
      insert: jest.fn().mockReturnThis(),
      destroy: jest.fn().mockReturnThis()
    }))
  }
}))

describe('handler', () => {
  test('test handler', async () => {
    const response = await handler(mocks.eventSqs)
    expect(response.statusCode).toEqual(200)
  });
});
This test works partially but it does not cover the trx portion at all. These lines are uncovered
await trx
  .table('my_table')
  .insert(records)
  .then(() =>
    console.log(`Records inserted into table ${table}`))
  .catch((err) => {
    console.log(err)
    throw err
  })
How can I set up my mock of @org/aws-connection so that it covers the trx functions as well?
Edit:
mysql.getIamConnection
async function getIamConnection (secretId, dbname) {
  const secret = await getSecret(secretId)
  const token = await getToken(secret)
  let knex
  console.log(`Initializing a connection to ${secret.proxyendpoint}:${secret.port}/${dbname} as ${secret.username}`)
  knex = require('knex')(
    {
      client: 'mysql2',
      connection: {
        host: secret.proxyendpoint,
        user: secret.username,
        database: dbname,
        port: secret.port,
        ssl: 'Amazon RDS',
        authPlugins: {
          mysql_clear_password: () => () => Buffer.from(token + '\0')
        },
        connectionLimit: 1
      }
    }
  )
  return knex
}
Solution
@qaismakani's answer worked for me. I wrote it slightly differently, but the callback was the key. For anyone interested, here is my end solution:
const mockTrx = {
  table: jest.fn().mockReturnThis(),
  insert: jest.fn().mockResolvedValue()
}

jest.mock('@org/aws-connection', () => ({
  mysql: {
    getIamConnection: jest.fn(() => ({
      transaction: jest.fn((callback) => callback(mockTrx)),
      destroy: jest.fn().mockReturnThis()
    }))
  }
}))
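One subtlety that makes this work: jest.mock calls are hoisted above the const mockTrx declaration, and Jest normally rejects mock factories that reference out-of-scope variables. It makes an exception for names beginning with mock, so the mockTrx naming is load-bearing here; the factory only runs lazily at require time, by which point mockTrx is initialized.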
Updating your mock to look like this might do the trick:
const { mysql } = require("@org/aws-connection");

jest.mock("@org/aws-connection", () => ({
  mysql: {
    getIamConnection: jest.fn()
  }
}));

const mockTrx = {
  table: jest.fn().mockReturnThis(),
  insert: jest.fn().mockResolvedValue() // Resolve any data here
};

mysql.getIamConnection.mockReturnValue({
  transaction: jest.fn((callback) => callback(mockTrx)),
});
You need to mock the transaction so that it executes your callback with a dummy trx. For this to work, every function inside the trx object must return either a reference back to the object or a promise, so the chain in the handler resolves as expected.
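With that in place you can also assert against the dummy trx directly. A small sketch (the event fixture and table name come from the question; the test name is illustrative):

test('inserts the parsed records inside a transaction', async () => {
  const response = await handler(mocks.eventSqs)
  expect(response.statusCode).toEqual(200)
  expect(mockTrx.table).toHaveBeenCalledWith('my_table')
  expect(mockTrx.insert).toHaveBeenCalled() // once per SQS record
})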
Instead of mocking the knex implementation, I've written knex-mock-client, which lets you mimic a real DB with an easy API.
Change your mock implementation to:
import { handler } from "../src";
import { getTracker } from "knex-mock-client";
const mocks = require("./mocks");
jest.mock("#org/aws-connection", () => {
const knex = require("knex");
const { MockClient } = require("knex-mock-client");
return {
mysql: {
getIamConnection: () => knex({ client: MockClient }),
},
};
});
describe("handler", () => {
test("test handler", async () => {
const tracker = getTracker();
tracker.on.insert("my_table").responseOnce([23]); // setup's a mock response when inserting into my_table
const response = await handler(mocks.eventSqs);
expect(response.statusCode).toEqual(200);
});
});
This project records data with AWS Timestream, and that part works well.
However, I've failed to mock AWS TimestreamWrite using Jest. I tried a few approaches, but none worked. Can someone help me?
My files are as below:
ledger-service.js
const AWS = require("aws-sdk");
const enums = require("./enums");
var https = require("https");
var agent = new https.Agent({
maxSockets: 5000,
});
const tsClient = new AWS.TimestreamWrite({
maxRetries: 10,
httpOptions: {
timeout: 20000,
agent: agent,
},
});
module.exports = {
  log: async function (audit) {
    try {
      if (Object.keys(audit).length !== 0) {
        if (!isPresent(audit, "name")) {
          throw new Error("Name shouldn't be empty");
        }
        if (!isPresent(audit, "value")) {
          throw new Error("Value shouldn't be empty");
        }
        return await writeRecords(recordParams(audit));
      } else {
        throw new Error("Audit object is empty");
      }
    } catch (e) {
      throw new Error(e);
    }
  },
};
function isPresent(obj, key) {
  return obj[key] != undefined && obj[key] != null && obj[key] != "";
}
function recordParams(audit) {
  const currentTime = Date.now().toString(); // Unix time in milliseconds
  const dimensions = [
    // { Name: "client", Value: audit["clientId"] },
    { Name: "user", Value: audit["userId"] },
    { Name: "entity", Value: audit["entity"] },
    { Name: "action", Value: audit["action"] },
    { Name: "info", Value: audit["info"] },
  ];
  return {
    Dimensions: dimensions,
    MeasureName: audit["name"],
    MeasureValue: audit["value"],
    MeasureValueType: "VARCHAR",
    Time: currentTime.toString(),
  };
}
function writeRecords(records) {
  try {
    const params = {
      DatabaseName: enums.AUDIT_DB,
      TableName: enums.AUDIT_TABLE,
      Records: [records],
    };
    return tsClient.writeRecords(params).promise();
  } catch (e) {
    throw new Error(e);
  }
}
ledger-service.spec.js
const AWS = require("aws-sdk");
const audit = require("./ledger-service");

describe("ledger-service", () => {
  beforeEach(async () => {
    jest.resetModules();
  });
  afterEach(async () => {
    jest.resetAllMocks();
  });
  it("It should write records when all success", async () => {
    const mockAudit = {
      name: 'testName',
      value: 'testValue',
      userId: 'testUserId',
      entity: 'testEntity',
      action: 'testAction',
      info: 'testInfo',
    };
    const mockWriteRecords = jest.fn(() => {
      console.log('mock success')
      return { promise: () => Promise.resolve() }
    });
    const mockTsClient = {
      writeRecords: mockWriteRecords
    }
    jest.spyOn(AWS, 'TimestreamWrite');
    AWS.TimestreamWrite.mockImplementation(() => mockTsClient);
    // a = new AWS.TimestreamWrite();
    // a.writeRecords(); // these two lines will pass the test and print "mock success"
    await audit.log(mockAudit); // this line will show "ConfigError: Missing region in config"
    expect(mockWriteRecords).toHaveBeenCalled();
  });
});
I just think the AWS object I mocked isn't being passed into ledger-service.js. Is there a way to fix that?
Thanks.
Update: taking hoangdv's suggestion.
I don't think jest.resetModules() and jest.resetAllMocks() are working. If I put "It should write records when all success" first, it passes. However, it fails if another test runs before it.
Pass
it("It should write records when all success", async () => {
const mockAudit = {
name: 'testName',
value: 'testValue',
userId: 'testUserId',
entity: 'testEntity',
action: 'testAction',
info: 'testInfo',
};
await audit.log(mockAudit);
expect(AWS.TimestreamWrite).toHaveBeenCalledWith({
maxRetries: 10,
httpOptions: {
timeout: 20000,
agent: expect.any(Object),
},
});
expect(mockWriteRecords).toHaveBeenCalled();
});
it("It should throw error when audit is empty", async () => {
const mockAudit = {};
await expect(audit.log(mockAudit)).rejects.toThrow(`Audit object is empty`);
});
Failed
it("It should throw error when audit is empty", async () => {
const mockAudit = {};
await expect(audit.log(mockAudit)).rejects.toThrow(`Audit object is empty`);
});
it("It should write records when all success", async () => {
const mockAudit = {
name: 'testName',
value: 'testValue',
userId: 'testUserId',
entity: 'testEntity',
action: 'testAction',
info: 'testInfo',
};
await audit.log(mockAudit);
expect(AWS.TimestreamWrite).toHaveBeenCalledWith({
maxRetries: 10,
httpOptions: {
timeout: 20000,
agent: expect.any(Object),
},
});
expect(mockWriteRecords).toHaveBeenCalled();
});
In ledger-service.js you call new AWS.TimestreamWrite before module.exports, which means it runs at require time with the actual implementation instead of the mock.
The solution is to mock AWS before you call require("./ledger-service"):
ledger-service.spec.js
const AWS = require("aws-sdk");
describe("ledger-service", () => {
let audit;
let mockWriteRecords;
beforeEach(() => {
mockWriteRecords = jest.fn(() => {
return { promise: () => Promise.resolve() }
});
jest.spyOn(AWS, 'TimestreamWrite');
AWS.TimestreamWrite.mockImplementation(() => ({
writeRecords: mockWriteRecords
}));
audit = require("./ledger-service"); // this line
});
afterEach(() => {
jest.resetModules(); // reset module to update change for each require call
jest.resetAllMocks();
});
it("It should write records when all success", async () => {
const mockAudit = {
name: 'testName',
value: 'testValue',
userId: 'testUserId',
entity: 'testEntity',
action: 'testAction',
info: 'testInfo',
};
await audit.log(mockAudit);
expect(AWS.TimestreamWrite).toHaveBeenCalledWith({
maxRetries: 10,
httpOptions: {
timeout: 20000,
agent: expect.any(Object),
},
});
expect(mockWriteRecords).toHaveBeenCalled();
});
});
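For completeness, an alternative that sidesteps the ordering problem entirely is to mock the whole aws-sdk module with a factory, so the constructor is already mocked whenever ledger-service.js is first required. A hedged sketch, not from the answer above (the audit payload is illustrative):

jest.mock('aws-sdk', () => {
  const mockWriteRecords = jest.fn(() => ({ promise: () => Promise.resolve() }))
  return {
    TimestreamWrite: jest.fn(() => ({ writeRecords: mockWriteRecords })),
  }
})

const AWS = require('aws-sdk')
const audit = require('./ledger-service') // safe: the constructor is already mocked

it('should write records', async () => {
  await audit.log({ name: 'n', value: 'v', userId: 'u', entity: 'e', action: 'a', info: 'i' })
  const tsClient = AWS.TimestreamWrite.mock.results[0].value // the instance built at require time
  expect(tsClient.writeRecords).toHaveBeenCalled()
})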
I need to do a server-to-server GraphQL call. Thanks to the advice received on this SO post, and also documented here, I'm approaching it like this:
async function () {
  const { data, errors } = await graphql(
    schema,
    CRON_JOB_TO_FIND_USERS_WHO_HAVE_GONE_OFFLINE_MUTATION,
    {},
    { caller: 'synced-cron' },
    { timeStarted: new Date().toISOString().slice(0, 19).replace('T', ' ') }
  )
  console.log('data', data)
  console.log('errors', errors)
  return true;
}
It's not throwing any errors, but it's returning null data:
Also, a debugger breakpoint in the resolver isn't being hit.
SCHEMA
cronJobToFindUsersWhoHaveGoneOffline(timeStarted: String): epUserData
QUERY
// note -- no gql``. This string is passed directly to the graphql() function
// where it gets gql applied to it.
const CRON_JOB_TO_FIND_USERS_WHO_HAVE_GONE_OFFLINE_MUTATION = `
  mutation ($timeStarted: String) {
    cronJobToFindUsersWhoHaveGoneOffline(timeStarted: $timeStarted) {
      id,
      user_presence,
      user_presence_time_of_last_update
    },
  }
`;
RESOLVER
cronJobToFindUsersWhoHaveGoneOffline(parent, args, context) {
  debugger; // <== NEVER GETS ACTIVATED
  return Promise.resolve()
    .then(() => {
      debugger;
      // CODE TO FIND USERS AND MARK THEM AS BEING OFFLINE GOES HERE
      return usersWhoWentOffline;
    })
    .then((usersWhoWentOffline) => {
      debugger;
      return usersWhoWentOffline;
    })
    .catch((err) => {
      debugger;
      console.log(err);
    });
},
What am I missing?
It should work. Here is a complete working example:
server.ts:
import { ApolloServer } from 'apollo-server';
import { schema } from './schema';
const server = new ApolloServer({ schema });
export { server };
schema.ts:
import { gql, makeExecutableSchema } from 'apollo-server';

const typeDefs = gql`
  type EpUserData {
    id: ID!
    user_presence: String
    user_presence_time_of_last_update: String
  }
  type Query {
    dummy: String
  }
  type Mutation {
    cronJobToFindUsersWhoHaveGoneOffline(timeStarted: String): EpUserData
  }
`;

const resolvers = {
  Mutation: {
    async cronJobToFindUsersWhoHaveGoneOffline(parent, args, context) {
      const usersWhoWentOffline = { id: 1, user_presence: 'test', user_presence_time_of_last_update: '2020' };
      return Promise.resolve()
        .then(() => {
          return usersWhoWentOffline;
        })
        .catch((err) => {
          console.log(err);
        });
    },
  },
};

const schema = makeExecutableSchema({ typeDefs, resolvers });
export { schema };
server.test.ts:
import { graphql } from 'graphql';
import { schema } from './schema';
import { server } from './server';
const CRON_JOB_TO_FIND_USERS_WHO_HAVE_GONE_OFFLINE_MUTATION = `
  mutation ($timeStarted: String) {
    cronJobToFindUsersWhoHaveGoneOffline(timeStarted: $timeStarted) {
      id,
      user_presence,
      user_presence_time_of_last_update
    },
  }
`;

describe('62122142', () => {
  beforeAll(async () => {
    const { url } = await server.listen();
    console.log(`server is listening on ${url}`);
  });
  afterAll(async () => {
    await server.stop();
  });
  it('should pass', async () => {
    const { data, errors } = await graphql(
      schema,
      CRON_JOB_TO_FIND_USERS_WHO_HAVE_GONE_OFFLINE_MUTATION,
      {},
      { caller: 'synced-cron' },
      {
        timeStarted: new Date()
          .toISOString()
          .slice(0, 19)
          .replace('T', ' '),
      },
    );
    console.log('data', data);
    console.log('errors', errors);
    return true;
  });
});
integration test result:
PASS apollo-graphql-tutorial src/stackoverflow/62122142/server.test.ts (7.143s)
  62122142
    ✓ should pass (12ms)

  console.log src/stackoverflow/62122142/server.test.ts:18
    server is listening on http://localhost:4000/

  console.log src/stackoverflow/62122142/server.test.ts:36
    data [Object: null prototype] {
      cronJobToFindUsersWhoHaveGoneOffline:
        [Object: null prototype] {
          id: '1',
          user_presence: 'test',
          user_presence_time_of_last_update: '2020' } }

  console.log src/stackoverflow/62122142/server.test.ts:37
    errors undefined

Test Suites: 1 passed, 1 total
Tests:       1 passed, 1 total
Snapshots:   0 total
Time:        7.226s
source code: https://github.com/mrdulin/apollo-graphql-tutorial/tree/master/src/stackoverflow/62122142
UPDATED
I am getting the following error when trying to invoke my Lambda function
{
  "errorType": "TypeError",
  "errorMessage": "e is not a function",
  "trace": [
    "TypeError: e is not a function",
    "    at Runtime.handler (/var/task/serverless_sdk/index.js:9:88355)",
    "    at Runtime.handleOnce (/var/runtime/Runtime.js:66:25)"
  ]
}
I have tracked this down to the reference to DB (see the last few lines of schema.js). DB should be imported at the top of schema.js:
const { DB } = require('./db.js')
Indeed, when I try the same code on my local computer, there is no issue.
Does this have to do with some subtle way Lambda functions (LFs) are frozen for re-use in AWS? Where should I be initializing the DB connection in an LF?
I tried merging db.js into schema.js (no import) and I still get the same error.
I have checked the zip file that serverless uploaded and it looks fine (node_modules and mine).
This is very hard to debug, so any tips in that direction would help.
server.js
const { ApolloServer } = require('apollo-server')
const { ApolloServer: ApolloServerLambda } = require('apollo-server-lambda')
const { typeDefs, resolvers, connect } = require('./schema.js')

// The ApolloServer constructor requires two parameters: your schema
// definition and your set of resolvers.
async function setup(where) {
  if (where == 'local') {
    const server = new ApolloServer({ typeDefs, resolvers })
    let { url } = await server.listen()
    console.log(`Server ready at ${url}`)
  } else {
    const server = new ApolloServerLambda({
      typeDefs,
      resolvers,
      playground: true,
      introspection: true,
      cors: {
        origin: '*',
        credentials: true,
      },
      context: ({ event, context }) => (
        {
          headers: event.headers,
          functionName: context.functionName,
          event,
          context
        })
    })
    exports.graphqlHandler = server.createHandler()
  }
}

let location = (process.env.USERNAME == 'ysg4206') ? 'local' : 'aws'
connect(location, setup)
schema.js
const { gql } = require('apollo-server')
const { GraphQLDateTime } = require('graphql-iso-date')
const { DB } = require('./db.js')

exports.typeDefs = gql`
  scalar DateTime
  type User {
    id: Int
    "English First Name"
    firstName: String
    lastName: String
    addressNumber: Int
    streetName: String
    city: String
    email: String
    createdAt: DateTime
    updatedAt: DateTime
  }
  type Query {
    users: [User]
    findUser(firstName: String): User
    hello(reply: String): String
  }
  type Mutation {
    addUser(user: UserType): User!
  }
  type Subscription {
    newUser: User!
  }
`

exports.resolvers = {
  Query: {
    users: () => DB.findAll(),
    findUser: async (_, { firstName }) => {
      let who = await DB.findFirst(firstName)
      return who
    },
    hello: (_, { reply }, context, info) => {
      console.log(`hello with reply ${reply}`)
      console.log(`context : ${JSON.stringify(context)}`)
      console.log(`info : ${JSON.stringify(info)}`)
      return reply
    }
  },
  Mutation: {
    addUser: async (_, args) => {
      let who = await DB.addUser(args.user)
      return who
    }
  }
}

exports.connect = async (where, setup) => {
  console.log(`DB: ${DB}`) // BUG: DB is returning null
  await DB.dbSetup(where) // BUG: these lines cause Lambda to fail
  await DB.populate() // BUG: these lines cause Lambda to fail
  let users = await DB.findAll() // BUG: these lines cause Lambda to fail
  console.log(users) // BUG: these lines cause Lambda to fail
  await setup(where)
}
db.js
const { Sequelize } = require('sequelize')
const { userData } = require('./userData')

const localHost = {
  db: 'm3_db',
  host: 'localhost',
  pass: 'xxxx'
}

const awsHost = {
  db: 'mapollodb3_db',
  host: 'apollodb.cxeokcheapqj.us-east-2.rds.amazonaws.com',
  pass: 'xxxx'
}

class DB {
  async dbSetup(where) {
    let host = (where == "local") ? localHost : awsHost
    this.db = new Sequelize(host.db, 'postgres', host.pass, {
      host: host.host,
      dialect: 'postgres',
      logging: false,
      pool: {
        max: 5,
        min: 0,
        idle: 20000,
        handleDisconnects: true
      },
      dialectOptions: {
        requestTimeout: 100000
      },
      define: {
        freezeTableName: true
      }
    })
    this.User = this.db.define('users', {
      firstName: Sequelize.STRING,
      lastName: Sequelize.STRING,
      addressNumber: Sequelize.INTEGER,
      streetName: Sequelize.STRING,
      city: Sequelize.STRING,
      email: Sequelize.STRING,
    })
    try {
      await this.db.authenticate()
      console.log('Connected to DB')
    } catch (err) {
      console.error('Unable to connect to DB', err)
    }
  }
  async select(id) {
    let who = await this.User.findAll({ where: { id: id } })
    return who.get({ plain: true })
  }
  async findFirst(name) {
    let me = await this.User.findAll({ where: { firstName: name } })
    return me[0].get({ plain: true })
  }
  async addUser(user) {
    let me = await this.User.create(user)
    return me.get({ plain: true })
  }
  async populate() {
    await this.db.sync({ force: true })
    try {
      await this.User.bulkCreate(userData, { validate: true })
      console.log('users created');
    } catch (err) {
      console.error('failed to create users')
      console.error(err)
    } finally {
    }
  }
  async findAll() {
    let users = await this.User.findAll({ raw: true })
    return users
  }
  async close() {
    this.db.close()
  }
}

exports.DB = new DB()
serverless.yml
service: apollo-lambda
provider:
  name: aws
  stage: dev
  region: us-east-2
  runtime: nodejs10.x
  # cfnRole: arn:aws:iam::237632220688:role/lambda-role
functions:
  graphql:
    # this is formatted as <FILENAME>.<HANDLER>
    handler: server.graphqlHandler
    vpc:
      securityGroupIds:
        - sg-a1e6f4c3
      subnetIds:
        - subnet-4a2a7830
        - subnet-1469d358
        - subnet-53b45038
    events:
      - http:
          path: graphql
          method: post
          cors: true
      - http:
          path: graphql
          method: get
          cors: true
folder structure of zip
When AWS Lambda imports your file, the export isn't available yet. That's why it complains that your handler is not a function: it is actually undefined at the time it is imported.
Here are a couple of suggested solutions:
1. Use only apollo-server-lambda, and use serverless-offline for local development. This way your handler code is exactly what you run in Lambda:
const { ApolloServer: ApolloServerLambda } = require("apollo-server-lambda");
const { typeDefs, resolvers, connect } = require("./schema.js");

const server = new ApolloServerLambda({
  typeDefs,
  resolvers,
  playground: true,
  introspection: true,
  cors: {
    origin: "*",
    credentials: true
  },
  context: ({ event, context }) => ({
    headers: event.headers,
    functionName: context.functionName,
    event,
    context
  })
});

exports.graphqlHandler = server.createHandler();
2. Use apollo-server-lambda in your Lambda, but use apollo-server in another file (e.g. local.js). Then you just run node local.js for local development, with no need for the process.env.USERNAME check you do at the end; a sketch follows below.
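A minimal sketch of what that local.js might look like (the file name and wiring are assumptions, not from the original answer):

// local.js -- run with `node local.js`; never deployed to Lambda
const { ApolloServer } = require('apollo-server')
const { typeDefs, resolvers } = require('./schema.js')

const server = new ApolloServer({ typeDefs, resolvers })

server.listen().then(({ url }) => {
  console.log(`Server ready at ${url}`)
})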
Found the problem. It is a bit embarrassing, but I'll post it in case others need this.
I was trying to connect to the DB as part of the initialization of the Lambda app, hoping that when a cold or warm start happened, the variable holding the DB connection would already be populated.
That is an anti-pattern.
With Apollo on Lambda, one has to reconnect to the DB on each request: in the resolver for the GraphQL operation, connect to the DB and then close the connection, so that AWS can see there are no open connections and can freeze the Lambda function.
What threw me was that this worked fine when running as ApolloServer and connecting to a local DB.
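A rough sketch of that per-request pattern, reusing the DB methods from the question's db.js (the exact placement of the connect/close calls here is an illustrative assumption, not the code that was actually used):

const { DB } = require('./db.js')

exports.resolvers = {
  Query: {
    users: async () => {
      await DB.dbSetup('aws') // connect inside the resolver, once per request
      try {
        return await DB.findAll()
      } finally {
        await DB.close() // close so Lambda sees no open connections
      }
    }
  }
}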