Dynamically created s3 folders are not showing in listObjects - node.js

I'm using a signed URL to upload a file from my React application to an S3 bucket. I specify the path as part of my Key, and the folders are getting created properly:
let params = {
  Bucket: vars.aws.bucket,
  Key: `${req.body.path}/${req.body.fileName}`,
  Expires: 5000,
  ACL: 'public-read-write',
  ContentType: req.body.fileType,
};
s3.getSignedUrl('putObject', params, (err, data) => { ...
However, when I use s3.listObjects, the folders that are created this way are not getting returned. Here is my Node API code:
const getFiles = (req, res) => {
  let params = {
    s3Params: {
      Bucket: vars.aws.bucket,
      Delimiter: '',
      Prefix: req.body.path
    }
  };
  s3.listObjects(params.s3Params, function (err, data) {
    if (err) {
      res.status(401).json(err);
    } else {
      res.status(200).json(data);
    }
  });
};
The folders that are created through the console are showing in the returned object properly. Is there any attribute I need to set when generating the signed URL to make the folder recognized as an object?

I specify the path as part of my Key and the folders are getting created properly
Actually, they aren't.
Rule 1: The console displays a folder icon for the folder foo because one or more objects exists in the bucket with the prefix foo/.
The console appears to allow you to create "folders," but that isn't what's happening when you do that. If you create a folder named foo in the console, what actually happens is that an ordinary object, zero bytes in length, with the name foo/ is created. Because this now means there is at least one object that exists in the bucket with the prefix foo/, a folder is displayed in the console (see Rule 1).
But that folder is not really a folder. It's just one feature of the console interacting with another feature of the console. You can actually delete the foo/ object using the API/SDK and nothing visibly changes, because the console still shows the folder as long as at least one object remains in the bucket with the prefix foo/. (Deleting the folder in the console sends delete requests for all objects with that prefix. Deleting the dummy object via the API does not.)
In short, the behavior you are observing is normal.
If you set the delimiter to /, then the listObjects response will include CommonPrefixes -- and this is where you should be looking if you want to see "folders." Objects ending with / are just the dummy objects the console creates. CommonPrefixes does not depend on these.
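To make the distinction concrete, here is a minimal sketch of reading "folders" from CommonPrefixes when Delimiter is set to '/'. The bucket name, prefix, the folderNames helper, and the sample response shape are illustrative, not from the question; with a real client you would pass the same params to s3.listObjects.

```javascript
// With Delimiter set to '/', listObjects groups keys under CommonPrefixes.
const params = {
  Bucket: 'my-bucket',   // placeholder bucket name
  Delimiter: '/',
  Prefix: 'uploads/'     // hypothetical prefix
};

// The "folders" are the Prefix values inside CommonPrefixes, not Contents.
function folderNames(listing) {
  return (listing.CommonPrefixes || []).map(p => p.Prefix);
}

// Example shape of a listObjects response, for illustration only:
const sampleListing = {
  Contents: [{ Key: 'uploads/readme.txt' }],
  CommonPrefixes: [{ Prefix: 'uploads/images/' }, { Prefix: 'uploads/videos/' }]
};
console.log(folderNames(sampleListing)); // [ 'uploads/images/', 'uploads/videos/' ]

// With a real client the call would look like:
// s3.listObjects(params, (err, data) => console.log(folderNames(data)));
```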

Related

How to upload only the file (mp4/mp3/pdf), given the path to the file, to a cloud storage like S3 or Spaces (DigitalOcean)

Trying to read a video file using a relative path with fs.readFileSync(path) to upload to an S3 bucket, but the whole folder tree is being uploaded to S3 (in my case, Spaces on DigitalOcean).
const file = fs.readFileSync('./downloads/this.movie_details.title/Understanding Network Hacks Attack And Defense With Python/Understanding Network Hacks Attack And Defense With Python.pdf');
const filename = video_path;
var params = {
  Body: file,
  Bucket: 'ocean-bucket21',
  Key: filename,
};
s3.putObject(params, function (err, data) { });
The result of the above code is that the whole folder structure is recreated in the cloud instead of only the one file (the PDF).
I think you are saying that you'd like to upload the file to the top-level (root) of the bucket, rather than putting it inside the folders.
If so, then you should modify the Key value in the params object. The Key is the full path of the object; any slashes it contains are interpreted as folder names.
If you want it at the top-level, edit the value of Key so it just contains your desired filename.

Unable to fetch file from S3 in node js

I have stored video files in an S3 bucket and now I want to serve the files to clients through an API. Here is my code for it:
app.get('/vid', async (req, res) => {
  AWS.config.update({
    accessKeyId: config.awsAccessKey,
    secretAccessKey: config.awsSecretKey,
    region: "ap-south-1"
  });
  let s3 = new AWS.S3();
  var p = req.query.p;
  res.attachment(p);
  var options = {
    Bucket: BUCKET_NAME,
    Key: p,
  };
  console.log(p, "name");
  try {
    await s3.getObject(options).createReadStream().pipe(res);
  } catch (e) {
    console.log(e);
  }
});
This is the output I am getting even when the file is available in the S3 bucket:
vid_kdc5stoqnrIjEkL9M.mp4 name
NoSuchKey: The specified key does not exist.
This is likely caused by invalid parameters being passed into the function.
To check for invalid parameters, double-check the strings that are being passed in. For the object, check the following:
Check the value of p; ensure it exactly matches the full object key.
Validate that the correct BUCKET_NAME is being used.
Make sure there are no trailing characters (such as /).
Perform any necessary decoding before passing parameters in.
If in doubt, use logging to output the exact values. To test the function, try hard-coded values to validate that you can actually retrieve the objects.
For more information take a look at the How can I troubleshoot the 404 "NoSuchKey" error from Amazon S3? page.
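As a sketch of the decoding step suggested above, a small helper (hypothetical, not part of the question's code) can normalize the incoming key before it reaches getObject:

```javascript
// Hypothetical helper: decode a URL-encoded key and strip any leading slashes,
// since S3 object keys should not begin with '/'.
function normalizeKey(raw) {
  const decoded = decodeURIComponent(raw);
  return decoded.replace(/^\/+/, '');
}

console.log(normalizeKey('videos%2Fvid_kdc5stoqnrIjEkL9M.mp4'));
// videos/vid_kdc5stoqnrIjEkL9M.mp4

// With a real client, a quick existence check can use headObject:
// s3.headObject({ Bucket: BUCKET_NAME, Key: normalizeKey(p) }, (err) => { /* ... */ });
```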
Having certain characters in the bucket name leads to this error. In your case, there is an underscore. Try renaming the file.
Also refer to this
S3 Bucket Naming Requirements Docs
Example from Docs:
The following example bucket names are not valid:
aws_example_bucket (contains underscores)
AwsExampleBucket (contains uppercase letters)
aws-example-bucket- (ends with a hyphen)

AWS S3 nodejs - How to get object by its prefix

I'm trying to check whether an object exists in my AWS S3 bucket in Node.js without listing all my objects (~1500) and checking each one's prefix, but I cannot find how.
The format is like that:
<prefix I want to search>.<random string>/
Ex:
tutturuuu.dhbsfd7z63hd7833u/
Because you don't know the entire object Key, you will need to perform a list and filter by prefix. The AWS Node.js SDK provides such a method. Here is an example:
s3.listObjectsV2({
  Bucket: 'youBucket',
  MaxKeys: 1,
  Prefix: 'tutturuuu.'
}, function (err, data) {
  if (err) throw err;
  const objectExists = data.Contents.length > 0;
  console.log(objectExists);
});
Note that it is important to set MaxKeys in order to reduce network usage. If more than one object shares the prefix, you will need to return everything and decide which one you need.
This API call will return metadata only. After you have the full key you can use getObject to retrieve the object contents.
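Chaining the two steps can be sketched as follows. The fetchByPrefix helper name is hypothetical; the client is passed in so it can be a real AWS.S3 instance, or a stub as below for illustration:

```javascript
// Hypothetical helper: find the full Key by prefix, then fetch its contents.
function fetchByPrefix(s3Client, bucket, prefix, cb) {
  s3Client.listObjectsV2({ Bucket: bucket, MaxKeys: 1, Prefix: prefix }, (err, data) => {
    if (err) return cb(err);
    if (!data.Contents || data.Contents.length === 0) return cb(new Error('not found'));
    // Use the full Key from the listing for the follow-up getObject call.
    s3Client.getObject({ Bucket: bucket, Key: data.Contents[0].Key }, cb);
  });
}

// Stub client standing in for new AWS.S3(), for illustration only:
const stub = {
  listObjectsV2: (p, cb) => cb(null, { Contents: [{ Key: 'tutturuuu.dhbsfd7z63hd7833u/' }] }),
  getObject: (p, cb) => cb(null, { Body: Buffer.from('contents of ' + p.Key) })
};

fetchByPrefix(stub, 'youBucket', 'tutturuuu.', (err, obj) => {
  if (err) throw err;
  console.log(obj.Body.toString()); // contents of tutturuuu.dhbsfd7z63hd7833u/
});
```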

AWS transcribe: How to give a path to output folder

I am using AWS Transcribe to get the text of a video using Node.js. I can specify a destination bucket in the params, but not a particular folder. Can anyone help me with this? This is my code:
var params = {
  LanguageCode: "en-US",
  Media: { /* required */
    MediaFileUri: "s3://bucket-name/public/events/545/videoplayback1.mp4"
  },
  TranscriptionJobName: 'STRING_VALUE', /* required */
  MediaFormat: "mp4", // mp3 | mp4 | wav | flac
  OutputBucketName: 'test-rekognition',
};
transcribeservice.startTranscriptionJob(params, function (err, data) {
  if (err) console.log(err, err.stack); // an error occurred
  else console.log(data);               // successful response
});
I have specified the destination bucket name in the OutputBucketName field. But how do I specify a particular folder?
I would recommend creating a designated S3 Bucket for Transcribe output and adding a trigger with a lambda function to respond to that trigger on 'Object create (All)'. Essentially, as soon as there is a new object added to your S3 bucket, a lambda function is invoked to move/process that output by placing it in a specific 'folder' of your choice.
This doesn't solve the API issue, but I hope it serves as a good workaround; you could look at this article (https://linuxacademy.com/hands-on-lab/0e291fc6-52a4-4ed3-ad65-8cf2fd84e0df/) as a guide.
have a good one.
Very late to the party, but I just had the same issue.
It appears that Amazon added a new parameter called OutputKey that allows you to save your output in a specific folder of your bucket:
You can use output keys to specify the Amazon S3 prefix and file name of the transcription output. For example, specifying the Amazon S3 prefix "folder1/folder2/" as an output key would lead to the output being stored as "folder1/folder2/your-transcription-job-name.json". If you specify "my-other-job-name.json" as the output key, the object key is changed to "my-other-job-name.json". You can use an output key to change both the prefix and the file name, for example "folder/my-other-job-name.json".
Just make sure to put 'folder/' as the OutputKey (with the '/' symbol at the end); otherwise it will be interpreted as a name for your file rather than the folder in which to store it.
Hope it'll be useful to someone.
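Putting it together, the question's params with OutputKey added might look like the sketch below. The job name and prefix values are illustrative; the startTranscriptionJob call is commented since it needs a configured Transcribe client:

```javascript
// The question's params, extended with OutputKey so the transcript is written
// under a "folder" prefix (job name and prefix are illustrative values).
const params = {
  LanguageCode: 'en-US',
  Media: { MediaFileUri: 's3://bucket-name/public/events/545/videoplayback1.mp4' },
  TranscriptionJobName: 'my-job',
  MediaFormat: 'mp4',
  OutputBucketName: 'test-rekognition',
  OutputKey: 'transcripts/' // trailing '/' => treated as a prefix, not a file name
};
// Expected result object: s3://test-rekognition/transcripts/my-job.json
// transcribeservice.startTranscriptionJob(params, callback);
```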

Concatenate express route parameter and file extension

In my Express app I create snapshots whose details I store in MongoDB. The actual snapshot files are stored in the snapshots folder under their _id, eg /snapshots/575fe038a84ca8e42f2372da.png.
These snapshots can currently be loaded by a user by navigating to the folder and id in their browser, i.e. /snapshots/575fe038a84ca8e42f2372da, which returns the image file. However, I think the URL path should include the file extension to be more intuitive; i.e. the user should have to request /snapshots/575fe038a84ca8e42f2372da.png to get the file.
This is what I have currently:
router.get('/:shotID', function (req, res, next) {
  // Checks if shot exists in DB
  Shots.findOne({
    _id: req.params.shotID // More conditions might get put here, e.g. user restrictions
  }, (err, result) => {
    if (err) {
      res.status(404).end();
      return;
    }
    var file = fs.createReadStream(`./snapshots/${req.params.shotID}.png`);
    file.pipe(res);
  });
});
How can I incorporate the user putting in the file extension in this path?
You can provide a custom regular expression to match a named parameter, which can also contain a file extension:
router.get('/:shotID(?:([a-fA-F0-9]{24})\.png$)', ...);
For the URL path /snapshots/575fe038a84ca8e42f2372da.png, req.params.shotID will be 575fe038a84ca8e42f2372da.
If you want to match both with and without .png, you can use this:
router.get('/:shotID(?:([a-f0-9]{24})(?:\.png)?$)', ...);
