I have a Lambda function in AWS that is triggered by an S3 upload event. When I upload a file whose name contains whitespace to the S3 bucket, I see:
"AccessDenied: Access Denied",
" at Request.extractError (/var/task/node_modules/aws-sdk/lib/services/s3.js:700:35)",
" at Request.callListeners (/var/task/node_modules/aws-sdk/lib/sequential_executor.js:106:20)",
" at Request.emit (/var/task/node_modules/aws-sdk/lib/sequential_executor.js:78:10)",
" at Request.emit (/var/task/node_modules/aws-sdk/lib/request.js:688:14)",
" at Request.transition (/var/task/node_modules/aws-sdk/lib/request.js:22:10)",
" at AcceptorStateMachine.runTo (/var/task/node_modules/aws-sdk/lib/state_machine.js:14:12)",
" at /var/task/node_modules/aws-sdk/lib/state_machine.js:26:10",
" at Request.<anonymous> (/var/task/node_modules/aws-sdk/lib/request.js:38:9)",
" at Request.<anonymous> (/var/task/node_modules/aws-sdk/lib/request.js:690:12)",
" at Request.callListeners (/var/task/node_modules/aws-sdk/lib/sequential_executor.js:116:18)"
I am uploading a file named hello world.csv.
console.log('File path: ' + submission.s3.object.key);
gives the below:
File path: hello+world.csv
My Lambda uses const AWS = require('aws-sdk'); to read the content of the object referenced by the event from the S3 bucket.
I have seen similar issues raised previously but they seem to have been resolved.
How can I reference the correct path to my file within my lambda?
You need to escape the spaces:
var path = '/path/with spaces/in/them/';
var escapedPath = path.replace(/(\s)/g, "\\ ");
In your case it might look like this:
var path = submission.s3.object.key;
var escapedPath = path.replace(/(\s)/g, "\\ ");
console.log('File path: ' + escapedPath);
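Another common pattern, separate from the escaping approach above, is to URL-decode the key taken from the event record, since S3 event notifications URL-encode object keys (spaces arrive as +). A minimal sketch; the handler shape and bucket wiring are illustrative, not taken from the question:
const AWS = require('aws-sdk');
const s3 = new AWS.S3();

exports.handler = async (event) => {
    const record = event.Records[0];
    const bucket = record.s3.bucket.name;
    // "hello+world.csv" -> "hello world.csv"
    const key = decodeURIComponent(record.s3.object.key.replace(/\+/g, ' '));
    console.log('File path: ' + key);
    const object = await s3.getObject({ Bucket: bucket, Key: key }).promise();
    return object.Body.toString('utf-8');
};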
I get the following error when I try to upload a file larger than 1 MB to an S3 bucket from my company's network (files smaller than 1 MB upload fine):
500: null
at Request.extractError (...\node_modules\aws-sdk\lib\services\s3.js:711:35)
at Request.callListeners (...\node_modules\aws-sdk\lib\sequential_executor.js:106:20)
at Request.emit (...\node_modules\aws-sdk\lib\sequential_executor.js:78:10)
at Request.emit (...\node_modules\aws-sdk\lib\request.js:686:14)
at Request.transition (...\node_modules\aws-sdk\lib\request.js:22:10)
at AcceptorStateMachine.runTo (...\node_modules\aws-sdk\lib\state_machine.js:14:12)
at ...\node_modules\aws-sdk\lib\state_machine.js:26:10
at Request.<anonymous> (...\node_modules\aws-sdk\lib\request.js:38:9)
at Request.<anonymous> (...\node_modules\aws-sdk\lib\request.js:688:12)
at Request.callListeners (...\node_modules\aws-sdk\lib\sequential_executor.js:116:18) {
code: 500,
region: null,
time: 2022-11-22T09:07:55.279Z,
requestId: null,
extendedRequestId: undefined,
cfId: undefined,
statusCode: 500,
retryable: true
}
I found that this error is similar to this issue (https://github.com/localstack/localstack/issues/1410), but I still get the same error after setting s3ForcePathStyle to true when creating the S3 object.
Here is my code:
const s3 = new AWS.S3({apiVersion: '2006-03-01', s3ForcePathStyle: true});
let result = await s3.upload( {Bucket: bucket_name, Key: file_path, Body: request.files.buffer}).promise();
// both upload and putObject return the same error
console.log(result);
But if I run the code at home, there is no error and the upload succeeds.
In my code, I disabled NODE_TLS_REJECT_UNAUTHORIZED to avoid an SSL certificate problem, but it still doesn't work:
process.env['NODE_TLS_REJECT_UNAUTHORIZED'] = 0;
I also tried the AWS CLI (aws s3api put-object ...) on my company's network, and uploading files larger than 1 MB works fine!
I don't understand why I can't upload files larger than 1 MB with Node.js on my company's network. Don't the AWS CLI and the npm aws-sdk library use the same protocol?
I guess this error may be related to my company's environment, but the AWS CLI works fine there, so I think the problem can probably be fixed in my code. Hoping someone can help, thank you so much.
Finally, I tried setting endpoint and removing s3ForcePathStyle when creating the S3 object.
It's working now!
const s3 = new AWS.S3({apiVersion: '2006-03-01', endpoint: 's3.amazonaws.com'});
But I still don't know why?
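One guess, not confirmed by this thread: the AWS CLI automatically honours the HTTP_PROXY/HTTPS_PROXY environment variables, while the Node aws-sdk (v2) does not, so on a corporate network the SDK may need an explicit proxy agent. A hedged sketch, assuming the https-proxy-agent package (v5-style import) and a placeholder proxy URL:
const AWS = require('aws-sdk');
const HttpsProxyAgent = require('https-proxy-agent'); // npm install https-proxy-agent@5

const s3 = new AWS.S3({
    apiVersion: '2006-03-01',
    httpOptions: {
        // the proxy URL is a placeholder; in practice it would come from HTTPS_PROXY
        agent: new HttpsProxyAgent(process.env.HTTPS_PROXY || 'http://proxy.example.com:8080')
    }
});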
I have permissions and can create Timestream databases/tables in the console just fine, but when I write code using the aws-sdk it fails with InternalServerException: Internal Server Error. Is this only supported in the v3 API?
const params = {
DatabaseName,
Tags: getDefaultTags(customer,username)
};
console.log(params);
const promise = writeClient.createDatabase(params).promise();
To repeat: this is not a permissions issue, and I have the region specified. I just get an internal error if I try to use the SDK:
InternalServerException: Internal Server Error
at Request.extractError (<<coderoot>>\node_modules\aws-sdk\lib\protocol\json.js:52:27)
at Request.callListeners (<<coderoot>>\node_modules\aws-sdk\lib\sequential_executor.js:106:20)
at Request.emit (<<coderoot>>\node_modules\aws-sdk\lib\sequential_executor.js:78:10)
at Request.emit (<<coderoot>>\node_modules\aws-sdk\lib\request.js:686:14)
at Request.transition (<<coderoot>>\node_modules\aws-sdk\lib\request.js:22:10)
at AcceptorStateMachine.runTo (<<coderoot>>\node_modules\aws-sdk\lib\state_machine.js:14:12)
at <<coderoot>>\node_modules\aws-sdk\lib\state_machine.js:26:10
at Request.<anonymous> (<<coderoot>>\node_modules\aws-sdk\lib\request.js:38:9)
at Request.<anonymous> (<<coderoot>>\node_modules\aws-sdk\lib\request.js:688:12)
at Request.callListeners (<<coderoot>>\node_modules\aws-sdk\lib\sequential_executor.js:116:18)
If I use v3, I eventually get the same error. This example is creating a table, but it's the same idea:
InternalServerException: Internal Server Error
at deserializeAws_json1_0InternalServerExceptionResponse (<<coderoot>>\node_modules\@aws-sdk\client-timestream-write\dist-cjs\protocols\Aws_json1_0.js:899:23)
at deserializeAws_json1_0CreateTableCommandError (<<coderoot>>\node_modules\@aws-sdk\client-timestream-write\dist-cjs\protocols\Aws_json1_0.js:239:25)
at processTicksAndRejections (node:internal/process/task_queues:96:5)
at async <<coderoot>>\node_modules\@aws-sdk\middleware-serde\dist-cjs\deserializerMiddleware.js:7:24
at async <<coderoot>>\node_modules\@aws-sdk\middleware-signing\dist-cjs\middleware.js:13:20
at async StandardRetryStrategy.retry (<<coderoot>>\node_modules\@aws-sdk\middleware-retry\dist-cjs\StandardRetryStrategy.js:51:46)
at async <<coderoot>>\node_modules\@aws-sdk\middleware-logger\dist-cjs\loggerMiddleware.js:6:22
at async Object.createTable (<<coderoot>>\timestream.db.js:228:22)
at async handleCustomerCreation (<<coderoot>>\router.js:60:21)
at async router (<<coderoot>>\router.js:266:13)
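For completeness, a minimal sketch of what the v3 call might look like. The region is a placeholder, and this sketch does not by itself avoid the InternalServerException; DatabaseName and getDefaultTags are the same values used in the v2 snippet above:
const { TimestreamWriteClient, CreateDatabaseCommand } = require('@aws-sdk/client-timestream-write');

// Timestream is only offered in a limited set of regions, so the client must
// point at one of them (placeholder region below).
const client = new TimestreamWriteClient({ region: 'us-east-1' });

// inside an async function
const result = await client.send(new CreateDatabaseCommand({
    DatabaseName,
    Tags: getDefaultTags(customer, username)
}));
console.log(result);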
On Bot Framework (Node.js), I was trying to replicate the demo available at https://learn.microsoft.com/en-us/bot-framework/nodejs/bot-builder-nodejs-send-receive-attachments. It actually works well.
Code, in case the MS article changes:
// Create your bot with a function to receive messages from the user
var bot = new builder.UniversalBot(connector, function (session) {
var msg = session.message;
if (msg.attachments && msg.attachments.length > 0) {
// Echo back attachment
var attachment = msg.attachments[0];
session.send({
text: "You sent:",
attachments: [
{
contentType: attachment.contentType,
contentUrl: attachment.contentUrl,
name: attachment.name
}
]
});
} else {
// Echo back users text
session.send("You said: %s", session.message.text);
}
});
However, the problem I am facing is that when I use the bot from Skype (the regular client), I receive an error message:
2017-12-07T02:16:15.815Z Error: POST to 'https://smba.trafficmanager.net/apis/v3/conversations/<My Conversation>/activities' failed: [400] Bad Request
at Request._callback (/app/node_modules/botbuilder/lib/bots/ChatConnector.js:545:46)
at Request.self.callback (/app/node_modules/request/request.js:186:22)
at emitTwo (events.js:126:13)
at Request.emit (events.js:214:7)
at Request.<anonymous> (/app/node_modules/request/request.js:1163:10)
at emitOne (events.js:116:13)
at Request.emit (events.js:211:7)
at IncomingMessage.<anonymous> (/app/node_modules/request/request.js:1085:12)
at Object.onceWrapper (events.js:313:30)
at emitNone (events.js:111:20)
Any ideas?
[Update: It happens only when I create the attachment response. So I expect that's where I have an issue]
Actually, the code on the MS website is not up to date (in a way).
If I follow the code at https://github.com/Microsoft/BotBuilder-Samples/tree/master/Node/core-ReceiveAttachment instead,
I can receive the attachment and save it somewhere in a publicly served folder. Once that is done, I can send back the "public" URL as an attachment, and then it works.
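A rough sketch of that approach: save the incoming attachment into a folder that the web server exposes publicly, then echo back the public URL. The host name and folder below are placeholders, and note that on the Skype channel downloading contentUrl may require an Authorization header obtained via connector.getAccessToken, as in the linked sample.
var fs = require('fs');
var request = require('request');

var bot = new builder.UniversalBot(connector, function (session) {
    var msg = session.message;
    if (msg.attachments && msg.attachments.length > 0) {
        var attachment = msg.attachments[0];
        var localPath = 'public/' + attachment.name;   // folder served statically by the web server
        request(attachment.contentUrl)                 // may need an auth header on Skype, see the sample
            .pipe(fs.createWriteStream(localPath))
            .on('close', function () {
                session.send({
                    text: "You sent:",
                    attachments: [{
                        contentType: attachment.contentType,
                        contentUrl: 'https://my-bot.example.com/' + attachment.name, // placeholder public URL
                        name: attachment.name
                    }]
                });
            });
    } else {
        session.send("You said: %s", session.message.text);
    }
});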
I'm trying to create a script in Cloud Functions for Firebase that reacts to a database event and removes an image whose path is stored in one of the fields ("fullPath").
This is the code I'm using:
'use strict';
const functions = require('firebase-functions');
const request = require('request-promise');
const admin = require('firebase-admin');
const gcs = require('@google-cloud/storage')({
projectId: 'XXXXXXX',
credentials: {
// removed actual credentials from here
}});
admin.initializeApp(functions.config().firebase);
// Removes the image from Storage when its node is deleted from the Realtime Database.
exports.removeImageOnNodeRemoval = functions.database
.ref("images/{imageId}")
.onWrite(function (event) {
// exit if we are creating a new record (when no previous data exists)
if (!event.data.previous.exists()) {
console.log("a new image added");
return;
}
// exit if we are just trying to update the image
if (event.data.exists()) {
console.log("image is been modified");
return;
}
let previousData = event.data.previous.val();
if(!previousData || !previousData.fullPath){
console.log("no data in the previous");
return;
}
let bucketName = 'XXXXXXX';
console.log("default bucketName", gcs.bucket(bucketName));
let file = gcs.bucket(bucketName).file(previousData.fullPath);
console.log('the file /'+previousData.fullPath, file);
file.exists().then(function(data) {
let exists = data[0];
console.info("file exists", exists);
});
file.delete().then(function() {
// File deleted successfully
console.log("image removed from project", previousData.fullPath);
}).catch(function(error) {
// Uh-oh, an error occurred!
console.error("failed removing image from project", error, previousData);
});
});
The error I'm getting:
failed removing image from project { ApiError: Not Found
at Object.parseHttpRespBody (/user_code/node_modules/@google-cloud/storage/node_modules/@google-cloud/common/src/util.js:192:30)
at Object.handleResp (/user_code/node_modules/@google-cloud/storage/node_modules/@google-cloud/common/src/util.js:132:18)
at /user_code/node_modules/@google-cloud/storage/node_modules/@google-cloud/common/src/util.js:465:12
at Request.onResponse [as _callback] (/user_code/node_modules/@google-cloud/storage/node_modules/retry-request/index.js:120:7)
at Request.self.callback (/user_code/node_modules/@google-cloud/storage/node_modules/request/request.js:188:22)
at emitTwo (events.js:106:13)
at Request.emit (events.js:191:7)
at Request.<anonymous> (/user_code/node_modules/@google-cloud/storage/node_modules/request/request.js:1171:10)
at emitOne (events.js:96:13)
at Request.emit (events.js:188:7)
at IncomingMessage.<anonymous> (/user_code/node_modules/@google-cloud/storage/node_modules/request/request.js:1091:12)
at IncomingMessage.g (events.js:291:16)
at emitNone (events.js:91:20)
at IncomingMessage.emit (events.js:185:7)
at endReadableNT (_stream_readable.js:974:12)
at _combinedTickCallback (internal/process/next_tick.js:74:11)
at process._tickDomainCallback (internal/process/next_tick.js:122:9)
code: 404,
errors: [ { domain: 'global', reason: 'notFound', message: 'Not Found' } ],
response: undefined,
message: 'Not Found' } { contentType: 'image/png',
fullPath: 'images/1491162408464hznsjdt6oaqtqmukrzfr.png',
name: '1491162408464hznsjdt6oaqtqmukrzfr.png',
size: '44.0 KB',
timeCreated: '2017-04-02T19:46:48.855Z',
updated: '2017-04-02T19:46:48.855Z' }
I have tried with and without credentials for @google-cloud/storage (thinking they might get filled in automatically while I'm inside Cloud Functions for Firebase - do I need them?). I have tried adding a slash to the file's path. I have verified that the file actually exists in the bucket (even though file.exists() returns false). The credentials I provided are for an IAM identity I created with admin privileges for the Storage service.
I have also enabled billing on the free plan.
Any ideas?
OK, so I got this solved. Here are my conclusions:
You need to append ".appspot.com" to your bucket name. It's not written anywhere in the docs, and figuring out your bucket name in Firebase is hard enough for someone who is not familiar with Google Cloud. I hope they will add this little piece of information to their docs, or make it clear what your bucket's name is in Firebase.
Use the environment variable process.env.GCLOUD_PROJECT; it holds your project ID, which is identical to your bucket ID in Firebase. Again, remember to add the .appspot.com suffix to it.
Regarding the credentials for GCS, you don't need to provide them when using Cloud Functions for Firebase; you are already authenticated there. A short sketch putting these together follows.
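A minimal sketch of the fix described above (project ID from the environment variable, .appspot.com suffix appended, no explicit credentials), meant to replace the corresponding lines inside the onWrite handler from the question:
const gcs = require('@google-cloud/storage')();   // no credentials needed inside Cloud Functions
const bucketName = process.env.GCLOUD_PROJECT + '.appspot.com';

// inside the onWrite handler, after reading previousData:
return gcs.bucket(bucketName).file(previousData.fullPath).delete()
    .then(function () {
        console.log("image removed from project", previousData.fullPath);
    })
    .catch(function (error) {
        console.error("failed removing image from project", error, previousData);
    });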
Make sure your bucket name doesn't include gs://
E.g., instead of gs://my-project-id.appspot.com use my-project-id.appspot.com
let bucket = gcs.bucket('my-project-id.appspot.com')
This may happen to you (as it did to me) if you copy your bucket name from, for instance, Android code, where you can use the full URL to connect to Storage, i.e. storage.getReferenceFromUrl('gs://my-proj...
Also, it seems the projectId variable you use to initialise gcs doesn't need the appspot.com suffix (but it shouldn't break if you have it included), i.e. projectId: 'my-project-id' should suffice.
Lastly, the GCS Node package docs state that you need to generate a separate JSON key to pass as keyFilename to test things locally. The good news is that you can use the Firebase Admin SDK key, as described in the Get Started Server/Admin docs, instead; a hypothetical example is below.
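Hypothetical local-testing setup along those lines (the key file path is a placeholder for a service-account JSON downloaded from the Firebase console):
const gcs = require('@google-cloud/storage')({
    projectId: 'my-project-id',                 // no .appspot.com suffix needed here
    keyFilename: './serviceAccountKey.json'     // placeholder path to the Admin SDK key
});
const bucket = gcs.bucket('my-project-id.appspot.com');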
I am trying to upload an image to my S3 bucket using the npm s3 module (https://www.npmjs.org/package/s3).
I used the following params:
var params = {
localFile: 'image.png',
s3Params: {
Bucket: 'newstie.com',
Key: '/newsite/image'
}
};
I am getting the following logs, and I wasn't able to understand what I did wrong. Can you please help me? Many thanks.
progress 12181 0 12181
progress 12181 12181 12181
unable to upload Error
at Request.extractError (/Users/Desktop/newsite/node_modules/s3/node_modules/aws-sdk/lib/services/s3.js:257:35)
at Request.callListeners (/Users/Desktop/newsite/node_modules/s3/node_modules/aws-sdk/lib/sequential_executor.js:114:20)
at Request.callListeners (/Users/Desktop/newsite/node_modules/s3/node_modules/aws-sdk/lib/sequential_executor.js:115:16)
at Request.emit (/Users/Desktop/newsite/node_modules/s3/node_modules/aws-sdk/lib/sequential_executor.js:81:10)
at Request.emit (/Users/Desktop/newsite/node_modules/s3/node_modules/aws-sdk/lib/request.js:578:14)
at Request.transition (/Users/Desktop/newsite/node_modules/s3/node_modules/aws-sdk/lib/request.js:12:12)
at AcceptorStateMachine.runTo (/Users/Desktop/newsite/node_modules/s3/node_modules/aws-sdk/lib/state_machine.js:14:12)
at /Users/Desktop/newsite/node_modules/s3/node_modules/aws-sdk/lib/state_machine.js:28:10
at Request.<anonymous> (/Users/Desktop/newsite/node_modules/s3/node_modules/aws-sdk/lib/request.js:28:9)
at Request.<anonymous> (/Users/Desktop/newsite/node_modules/s3/node_modules/aws-sdk/lib/request.js:580:12)
progress 12181 0 12181
progress 12181 12181 12181
unable to upload Error
at Request.extractError (/Users/Desktop/newsite/node_modules/s3/node_modules/aws-sdk/lib/services/s3.js:257:35)
at Request.callListeners (/Users/Desktop/newsite/node_modules/s3/node_modules/aws-sdk/lib/sequential_executor.js:114:20)
at Request.callListeners (/Users/Desktop/newsite/node_modules/s3/node_modules/aws-sdk/lib/sequential_executor.js:115:16)
at Request.emit (/Users/Desktop/newsite/node_modules/s3/node_modules/aws-sdk/lib/sequential_executor.js:81:10)
at Request.emit (/Users/Desktop/newsite/node_modules/s3/node_modules/aws-sdk/lib/request.js:578:14)
at Request.transition (/Users/Desktop/newsite/node_modules/s3/node_modules/aws-sdk/lib/request.js:12:12)
at AcceptorStateMachine.runTo (/Users/Desktop/newsite/node_modules/s3/node_modules/aws-sdk/lib/state_machine.js:14:12)
at /Users/Desktop/newsite/node_modules/s3/node_modules/aws-sdk/lib/state_machine.js:28:10
at Request.<anonymous> (/Users/Desktop/newsite/node_modules/s3/node_modules/aws-sdk/lib/request.js:28:9)
at Request.<anonymous> (/Users/Desktop/newsite/node_modules/s3/node_modules/aws-sdk/lib/request.js:580:12)
I guess we need more code. I'd recommend using the aws-sdk instead; it works perfectly and makes it easy to upload, sync, trigger jobs, and so on (a rough aws-sdk sketch follows at the end of this answer).
But to answer your question, and assuming the file you're trying to upload is located in the same directory as your script, your code would look something like this:
var params = {
localFile: __dirname + '/image.png',
s3Params: {
Bucket: 'newstie.com',
Key: '/newsite/image'
}
};
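And the rough aws-sdk sketch mentioned above (bucket and key are based on the question's values; the callback just logs the outcome):
var AWS = require('aws-sdk');
var fs = require('fs');

var s3 = new AWS.S3();
s3.upload({
    Bucket: 'newstie.com',
    Key: 'newsite/image.png',                           // note: no leading slash in the key
    Body: fs.createReadStream(__dirname + '/image.png')
}, function (err, data) {
    if (err) return console.error('unable to upload', err);
    console.log('uploaded to', data.Location);
});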