I've got a problem I cannot solve myself. My Lambda function works as expected when invoked locally, but it does not send the text message when called from AWS Lambda. It doesn't log any error either.
Here's my code; I've only starred out the private parts:
import request from 'request';
import AWS from "aws-sdk";

const options = {***};

const sentAlert = async msg => {
  const sns = new AWS.SNS();
  await sns.publish({
    Message: msg,
    PhoneNumber: '***',
    MessageAttributes: {
      'AWS.SNS.SMS.SenderID': {
        'DataType': 'String',
        'StringValue': '***'
      }
    }
  }, function (err, data) {
    if (err) {
      console.log(err.stack);
      return;
    }
  });
  console.log('sms sent');
};

export const getAlert = async (event, context, callback) => {
  request(options, (err, res, body) => {
    if (err) { return console.log('error: ', err); }
    if (body.length === 0) { return; }
    console.log(`***`);
    const optionsId = {*** };
    request(optionsId, (err, res, body) => {
      const msg = body.current.indexes[0].description;
      console.log('msg: ', msg);
      sentAlert(msg);
    });
  });
};
I test it locally using serverless invoke local --function getSmogAlert and it works just as expected: I get the SMS from AWS. But when I call it with serverless invoke --function getSmogAlert, it returns null and doesn't send any text message.
I've had similar problems with Nexmo and thought that maybe AWS.SNS would help me, but nope.
Any help, please?
As I wrote in my comment, I think you're confusing promises and callbacks in the execution. Try these changes:
const options = {***};

const sentAlert = (msg, callback) => {
  const sns = new AWS.SNS();
  sns.publish({
    // TopicArn: ***, // publish to a topic OR a phone number, not both
    Message: msg,
    PhoneNumber: '***',
    MessageAttributes: {
      'AWS.SNS.SMS.SenderID': {
        'DataType': 'String',
        'StringValue': '***'
      }
    }
  }, function (err, data) {
    if (err) {
      console.log(err.stack);
      return callback(err);
    }
    console.log('sms sent');
    callback(null);
  });
};
export const getAlert = (event, context, callback) => {
  request(options, (err, res, body) => {
    if (err) {
      console.log('error: ', err);
      return callback(err);
    }
    if (body.length === 0) {
      console.log('Got no body!');
      return callback(null);
    }
    console.log(`***`);
    const optionsId = {*** };
    request(optionsId, (err, res, body) => {
      if (err) {
        console.log(err.stack);
        return callback(err);
      }
      const msg = body.current.indexes[0].description;
      console.log('msg: ', msg);
      sentAlert(msg, callback);
    });
  });
};
But in general, I would prefer to use the async/await mechanism supported by AWS Lambda's nodejs8.10 runtime. That would make your code simpler and easier to reason about.
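For instance, here is a minimal sketch of the same handler rewritten with async/await. It assumes the same options/optionsId request configs as above, wraps request in a small promise helper, and uses the aws-sdk .promise() method instead of a callback:

const requestP = opts => new Promise((resolve, reject) =>
  request(opts, (err, res, body) => (err ? reject(err) : resolve(body))));

const sentAlert = async msg => {
  const sns = new AWS.SNS();
  // publish() returns an AWS.Request; .promise() converts it to a Promise
  await sns.publish({
    Message: msg,
    PhoneNumber: '***',
    MessageAttributes: {
      'AWS.SNS.SMS.SenderID': {
        'DataType': 'String',
        'StringValue': '***'
      }
    }
  }).promise();
  console.log('sms sent');
};

export const getAlert = async event => {
  const body = await requestP(options);
  if (body.length === 0) return;
  const bodyId = await requestP(optionsId);
  const msg = bodyId.current.indexes[0].description;
  console.log('msg: ', msg);
  await sentAlert(msg);
};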
Related
I am trying to write an async Lambda function which calls a function to sign up a user in Cognito.
My problem is that my Lambda function does not wait for the result and finishes execution. Would you mind checking what my issue is? I am new to RxJS; please help me.
My lambda function:
exports.handler = async (event, context) => {
  //poolData and params will fetch from event
  let source = await signup(poolData, params);
  console.log(source);
};
My signup function:
function signup(poolData, body) {
  const userPool = new AmazonCognitoIdentity.CognitoUserPool(poolData);
  const { username, password, attributes } = body;
  const attributesList = [];
  if (Array.isArray(attributes)) {
    attributesList.push(
      ...attributes.map(item => new AmazonCognitoIdentity.CognitoUserAttribute(item))
    );
  }
  let source = Observable.create(observer => {
    let output = (err, res) => {
      if (err) {
        observer.error(err);
      } else {
        const cognitoUser = res.user;
        const data = {
          username: cognitoUser.getUsername(),
        };
        observer.next(data);
      }
      observer.complete();
    }
    userPool.signUp(username, password, attributesList, null, output);
  });
  let respond;
  let subscriber = {
    next(value) {
      console.log('Subscriber - next: ', value);
      respond = {
        'statusCode': 200,
        'body': JSON.stringify({
          "username": value.username,
        })
      }
    },
    error(err) {
      console.log('Subscriber - err: ', err);
      respond = err;
    },
    complete() {
      console.log('Subscriber - complete');
      return response;
    }
  };
  source.subscribe(subscriber);
}
module.exports = signup;
This behavior is totally normal.
First things first: an observable is not a promise, which means you cannot await a response with the await keyword. Also, nothing is returned from the signup function, which will lead to undefined being logged anyway.
So how do you fix that? One way is to use toPromise(), which turns your observable into a promise that can then be awaited wherever needed.
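A minimal sketch of that approach, assuming the same AmazonCognitoIdentity/Observable imports and poolData/params from the question, with signup changed to return the observable (toPromise() resolves with the last value emitted before the observable completes):

// signup builds attributesList as before, then returns the observable
function signup(poolData, body) {
  const userPool = new AmazonCognitoIdentity.CognitoUserPool(poolData);
  const { username, password, attributes } = body;
  const attributesList = (attributes || []).map(
    item => new AmazonCognitoIdentity.CognitoUserAttribute(item)
  );
  return Observable.create(observer => {
    userPool.signUp(username, password, attributesList, null, (err, res) => {
      if (err) return observer.error(err);
      observer.next({ username: res.user.getUsername() });
      observer.complete();
    });
  });
}

exports.handler = async (event) => {
  //poolData and params will fetch from event
  const data = await signup(poolData, params).toPromise();
  return {
    statusCode: 200,
    body: JSON.stringify({ username: data.username })
  };
};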
The other way (the RxJS way) is to return the observable from the signup function and subscribe to the response inside your handler function:
let respond;

let subscriber = {
  next(value) {
    console.log('Subscriber - next: ', value);
    respond = {
      'statusCode': 200,
      'body': JSON.stringify({
        "username": value.username,
      })
    };
  },
  error(err) {
    console.log('Subscriber - err: ', err);
    respond = err;
  },
  complete() {
    console.log('Subscriber - complete');
    return respond;
  }
};

exports.handler = (event, context) => {
  //poolData and params will fetch from event
  signup(poolData, params).subscribe(subscriber);
};
I have tried both the async-handler and non-async-handler ways to implement the DynamoDB putItem method, and it returned an invalid response.
It works for the commented-out setTimeout function.
I tried promisifying the DynamoDB putItem method the non-async-handler way, as mentioned in the official documentation https://docs.aws.amazon.com/lambda/latest/dg/nodejs-handler.html, but still no luck.
Can someone point out the issue in the code?
const AWS = require('aws-sdk');

const dynamodb = new AWS.DynamoDB({
  region: 'ap-south-1',
  apiVersion: '2012-08-10'
});

exports.lambdaHandler = async function(event) {
  const promise = new Promise(function(resolve, reject) {
    // setTimeout(function(){
    //   console.log("resolvedx")
    //   resolve({statusCode: 200, body: "resolvedx"})
    // }, 200)
    dynamodb.putItem({
      TableName: 'CUSTOMER_LIST',
      Item: {
        'CUSTOMER_ID': { N: '012' },
        'CUSTOMER_NAME': { S: 'zzzza' }
      }
    }, function(err, data) {
      if (err) {
        console.log("err", err);
        reject(JSON.stringify({statusCode: 200, body: err}));
      } else {
        console.log("success", data);
        resolve(JSON.stringify({statusCode: 200, body: data}));
      }
    });
  });
  return promise;
};
As a non-async handler:
exports.lambdaHandlerx = (event, context, callback) => {
  const dbPromise = () => new Promise((resolve, reject) => {
    console.log("inside db promise");
    dynamodb.putItem({
      TableName: 'CUSTOMER_LIST',
      Item: {
        'CUSTOMER_ID': { N: '023' },
        'CUSTOMER_NAME': { S: 'asdddwa' }
      }
    }, function(err, data) {
      if (err) {
        console.log("err", err);
        reject(err);
      } else {
        console.log("success", data);
        resolve(data);
      }
    });
  });
  Promise.all([dbPromise()]).then(data => {
    console.log("then", data);
    callback(null, {
      'statusCode': 200,
      'body': JSON.stringify({
        message: "dataxz" + JSON.stringify(data)
      })
    });
  }).catch(e => {
    callback(null, {
      'statusCode': 200,
      'body': JSON.stringify({
        message: 'err'
      })
    });
  });
};
Output
Function returned an invalid response (must include one of: body, headers, multiValueHeaders or statusCode in the response object). Response received:
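The error message hints at the cause: both resolve and reject are handed JSON.stringify(...) of the entire response, so the function resolves with a string rather than an object carrying statusCode and body. A minimal sketch of the async handler resolving with a plain object, using the .promise() helper that aws-sdk v2 request objects provide:

exports.lambdaHandler = async function(event) {
  try {
    const data = await dynamodb.putItem({
      TableName: 'CUSTOMER_LIST',
      Item: {
        'CUSTOMER_ID': { N: '012' },
        'CUSTOMER_NAME': { S: 'zzzza' }
      }
    }).promise();
    // Return an object, not a JSON string of the whole response
    return { statusCode: 200, body: JSON.stringify(data) };
  } catch (err) {
    return { statusCode: 500, body: JSON.stringify(err) };
  }
};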
I am trying to mock an SQS call defined in my file.js. The SQS client is a global instance in that file, so when my test requires file.js, the real instance is already set and my mock method is not called. However, if I create the SQS instance locally in the function where it is needed, I am able to mock it; but that would be wrong, as the instance would then be created every time that method is called. How can I mock SQS in my test? I have tried all the ways given in the issues; none of them works for me.
//file.js
const AWS = require('aws-sdk');

const sqs = new AWS.SQS();
const queueURL = config.sqs_connect.queue_url;
const params = {
  MaxNumberOfMessages: 10,
  QueueUrl: queueURL
};

exports.receiveMessages = async function () {
  // let sqs = new AWS.SQS();
  return new Promise((resolve, reject) => {
    sqs.receiveMessage(params, function (err, data) {
      if (err) {
        console.log("error");
        reject(err);
      } else if (data.Messages) {
        try {
          consumeAndDeleteMessages(data.Messages, err => {
            if (err) reject(err);
            else resolve();
          });
        } catch (error) {
          reject(error);
        }
      } else {
        // logger.log("No data in queue");
        resolve();
      }
    });
  });
};
// file.test.js
const AWS = require('aws-sdk');
const AWSMock = require('aws-sdk-mock');
const consumer = require('path-to-file');

describe("foo", () => {
  it("updates all info", async () => {
    let delete_stack = [];
    AWSMock.setSDKInstance(AWS);
    AWSMock.mock('SQS', 'receiveMessage', (params, callback) => {
      callback(null, { Messages: [{ MessageId: '1234', ReceiptHandle: 'qwertyu', Body: JSON.stringify(update_payload) }] });
    });
    AWSMock.mock('SQS', 'deleteMessageBatch', (params, callback) => {
      delete_stack.push(params.Entries);
      callback(null, {});
    });
    await consumer.receiveMessages();
    AWSMock.restore('SQS');
    expect(delete_stack).toStrictEqual([
      [{ "Id": "1234", "ReceiptHandle": "qwertyu" }]
    ]);
  });
});
If I define sqs locally in receiveMessages, the test works fine. I have tried all the ways provided; none of them works. Am I doing something wrong?
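One workaround commonly suggested for aws-sdk-mock (a hedged sketch, not a guaranteed fix): register the mocks before requiring file.js, so the module-level new AWS.SQS() is constructed after the mock is in place:

// file.test.js: note the order, mocks first, module under test after
const AWS = require('aws-sdk');
const AWSMock = require('aws-sdk-mock');

describe("foo", () => {
  it("updates all info", async () => {
    AWSMock.setSDKInstance(AWS);
    AWSMock.mock('SQS', 'receiveMessage', (params, callback) => {
      callback(null, { Messages: [] });
    });
    // Require only now, so the module-scope `new AWS.SQS()` sees the mock
    const consumer = require('path-to-file');
    await consumer.receiveMessages();
    AWSMock.restore('SQS');
  });
});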
I am fairly new to Node.js, and what I am trying to achieve is to have two separate functions: one for auth and one for sending data (so that I don't run into login rate limits, as I would if I simply used a callback after conn.login finishes). I tried to set this up in Node like this:
const jsforce = require('jsforce');

var _request = {
  url: '/services/data/v45.0/actions/custom/flow/Test1',
  method: 'POST',
  body: JSON.stringify({
    "inputs": [{}]
  }),
  headers: {
    "Content-Type": "application/json"
  }
};

var conn = new jsforce.Connection({
  clientId: process.env.cliendId,
  clientSecret: process.env.clientSecret,
  version: "45.0"
});

function sfdcAuth() {
  conn.login(process.env.sfdcUser, process.env.sfdcUserPass, (err, userInfo) => {
    if (err) {
      console.log(err);
    }
    conn = conn;
    console.log("Done");
  });
}

function sfdcQuery() {
  conn.request(_request, function(err, resp) {
    console.log(resp);
    console.log(err);
  });
}

sfdcAuth();
sfdcQuery();
But because JS is asynchronous, it runs the second function without waiting for the first one to finish.
The simplest way is to pass your second function as a callback to your first function, which it can call when it’s done:
function sfdcAuth(callback) {
  conn.login(process.env.sfdcUser, process.env.sfdcUserPass, (err, userInfo) => {
    if (err) {
      return console.log(err);
    }
    // Invoke callback when done
    callback();
  });
}

function sfdcQuery() {
  conn.request(_request, function(err, resp) {
    console.log(resp);
    console.log(err);
  });
}

// Pass second function as callback to the first
sfdcAuth(sfdcQuery);
You could also make use of promises:
function sfdcAuth() {
  return new Promise((resolve, reject) => {
    conn.login(process.env.sfdcUser, process.env.sfdcUserPass, (err, userInfo) => {
      if (err) {
        return reject(err);
      }
      resolve(userInfo);
    });
  });
}

function sfdcQuery() {
  return new Promise((resolve, reject) => {
    conn.request(_request, function(err, resp) {
      if (err) {
        return reject(err);
      }
      resolve(resp);
    });
  });
}
// Wait for promise to resolve before invoking second function
sfdcAuth()
  .then(result => {
    // Do something with result
    return sfdcQuery();
  })
  .then(result => {
    // You can continue the chain with
    // the result from "sfdcQuery" if you want
  })
  .catch(err => {
    // Handle error
  });
I'm new to Lambda & SQS, and I'm trying to create a function to send emails that are queued in SQS, but I don't understand how to call the process function that contains the send and delete-from-queue methods.
Below I paste my code:
'use strict';

const AWS = require('aws-sdk');
const SQS = new AWS.SQS({ apiVersion: '2012-11-05' });
const Lambda = new AWS.Lambda({ apiVersion: '2015-03-31' });
const ses = new AWS.SES({ accessKeyId: "xxxxxxxx", secretAccesskey: "xxxxxxx/xxxxxxxxx" });
const s3 = new AWS.S3({ apiVersion: "2006-03-01", region: "us-west-2" });
const QUEUE_URL = 'https://sqs.us-west-2.amazonaws.com/xxxxxxx/queue';
const PROCESS_MESSAGE = 'process-message';

function getPieceOfMail (path, mapObj, replace) {
  return new Promise(function (resolve, reject) {
    s3.getObject({
      Bucket: "myBucket",
      Key: "myKey/" + path
    }, function (err, data) {
      if (err) {
        reject(err);
      } else {
        if (replace === true) {
          var re = new RegExp(Object.keys(mapObj).join("|"), "gi");
          data = data.Body.toString().replace(re, function (matched) {
            return mapObj[matched.toLowerCase()];
          });
          resolve(data);
        } else {
          resolve(data.Body.toString());
        }
      }
    });
  });
}

function getRegisterSource (nickname, activate_link) {
  var activate_link, pieces;
  pieces = [
    getPieceOfMail("starts/start.html", {}, false),
    getPieceOfMail("headers/a.html", {}, false),
    getPieceOfMail("footers/a.html", {}, false),
  ];
  return Promise.all(pieces)
    .then(function (data) {
      return (data[0] + data[1] + data[2]);
    })
    .catch(function (err) {
      return err;
    });
}

function sendEmail (email, data) {
  return new Promise(function (resolve, reject) {
    var params = {
      Destination: { ToAddresses: [email] },
      Message: {
        Body: {
          Html: {
            Data: data
          },
          Text: {
            Data: data
          }
        },
        Subject: {
          Data: "myData"
        }
      },
      Source: "someone <noreply#mydomain.co>",
    };
    ses.sendEmail(params, function (err, data) {
      if (err) {
        reject(err);
      } else {
        resolve(data);
      }
    });
  });
}

function process(message, callback) {
  console.log(message);
  // process message
  getRegisterSource(event['nickname'], event['user_id'])
    .then(function (data) {
      return sendEmail(event["email"], data);
    })
    .catch(function (err) {
      console.log("==ERROR==");
      callback(err, err);
    })
    .finally(function () {});
  // delete message
  const params = {
    QueueUrl: QUEUE_URL,
    ReceiptHandle: message.ReceiptHandle,
  };
  SQS.deleteMessage(params, (err) => callback(err, message));
}

function invokePoller(functionName, message) {
  const payload = {
    operation: PROCESS_MESSAGE,
    message,
  };
  const params = {
    FunctionName: functionName,
    InvocationType: 'Event',
    Payload: new Buffer(JSON.stringify(payload)),
  };
  return new Promise((resolve, reject) => {
    Lambda.invoke(params, (err) => (err ? reject(err) : resolve()));
  });
}

function poll(functionName, callback) {
  const params = {
    QueueUrl: QUEUE_URL,
    MaxNumberOfMessages: 10,
    VisibilityTimeout: 10,
  };
  // batch request messages
  SQS.receiveMessage(params, (err, data) => {
    if (err) {
      return callback(err);
    }
    // for each message, reinvoke the function
    const promises = data.Messages.map((message) => invokePoller(functionName, message));
    // complete when all invocations have been made
    Promise.all(promises).then(() => {
      const result = `Messages received: ${data.Messages.length}`;
      callback(null, result);
    });
  });
}

exports.handler = (event, context, callback) => {
  try {
    if (event.operation === PROCESS_MESSAGE) {
      console.log("Invoked by poller");
      process(event.message, callback);
    } else {
      console.log("invoked by schedule");
      poll(context.functionName, callback);
    }
  } catch (err) {
    callback(err);
  }
};
Can somebody shed some light on this?
Thanks in advance.
UPDATE
After so much confusion, I decided to look at how the polling-SQS example provided by AWS works.
There I found that I lacked some basic permissions; that is solved now by adding the right policy:
{
  "Version": "2012-10-17",
  "Statement": [{
    "Effect": "Allow",
    "Action": [
      "lambda:InvokeFunction"
    ],
    "Resource": ["*"]
  }]
}
This allows Lambda.invoke() to call process().
When process(message, callback) is called and I console.log(message), there seems to be no message, although the queue is being cleared by the line SQS.deleteMessage(params, (err) => callback(err, message));.
What I was trying to do was combine my currently working sendEmail function with an SQS service, so that I only have to push each message to the queue.
This is a common requirement, since AWS SES has its own limits on how much mail can be sent at once; if those limits are violated, the SES account is sandboxed. It seems like you have solved that part of the problem using proper access credentials.
The linked project contains Python 3 Lambda code that handles a situation like this, where a Lambda polls from SQS using threading and sends emails using SES without exceeding the given limits.
Link to Github Project.
You can also consider using the newer SQS feature that can invoke a Lambda when a new message is placed in the queue. But be careful not to exceed the maximum number of concurrent Lambda executions in the AWS account's region. (See this document)
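If you go the SQS-trigger route, the handler shape is simple; here is a minimal sketch (reusing the sendEmail helper from the question, and assuming each queued message body is JSON with email and data fields):

exports.handler = async (event) => {
  // With an SQS event source, records arrive in event.Records
  for (const record of event.Records) {
    const msg = JSON.parse(record.body);
    await sendEmail(msg.email, msg.data);
  }
  // Messages in a successfully processed batch are deleted by Lambda
  // automatically; a thrown error makes the batch visible again.
};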