express fileupload : no such file or directory - node.js

I am using the express-fileupload package. I use it like this:
const upload = require('express-fileupload');
app.use(upload({
    createParentPath: true
}));
This is how I store:
await req.files.main_image.mv('./public/images/movies/'+main_image);
I already have the public/images directory created, but I don't have a movies directory inside it.
When it works: if the /public/images/movies directory already exists, the upload works.
When it doesn't work: if /public/images/movies does not exist but /public/images does, it says:
ENOENT: no such file or directory, open
'C:\Users\glagh\Desktop\Work\MoviesAdviser\public\images\movies\1554741546720-8485.11521.brussels.the-hotel-brussels.amenity.restaurant-AD3WAP2L-13000-853x480.jpeg
What to do so that it automatically creates /movies directory and put the image there?

express-fileupload uses the following code to create the directory path that you pass to the .mv() function:
const checkAndMakeDir = function(fileUploadOptions, filePath){
    //Check upload options were set.
    if (!fileUploadOptions) return false;
    if (!fileUploadOptions.createParentPath) return false;
    //Check whether folder for the file exists.
    if (!filePath) return false;
    const parentPath = path.dirname(filePath);
    //Create folder if it is not exists.
    if (!fs.existsSync(parentPath)) fs.mkdirSync(parentPath);
    return true;
};
The problem is that fs.mkdirSync() does not create the full path with the parent directories you specify (note that you'd use mkdir -p on the shell to create the whole folder structure); check out How to create full path with node's fs.mkdirSync? for more information.
What you could do is use a module like fs-extra and call its ensureDir function, which creates the corresponding parent directories (if they don't exist yet), before calling the .mv() function.
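A quick sketch of that approach (assuming main_image holds the file name from your route; fs-extra has to be installed separately):

const fse = require('fs-extra');
const path = require('path');

const uploadDir = path.join(__dirname, 'public', 'images', 'movies');

// ensureDir creates the whole directory chain if it doesn't exist yet (like mkdir -p).
await fse.ensureDir(uploadDir);
await req.files.main_image.mv(path.join(uploadDir, main_image));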
Or, in case your Node version is >= 10, use the native { recursive: true } option which you can pass to fs.mkdirSync.
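With a recent Node version the same thing works without any extra dependency; a minimal sketch (assuming Node >= 10.12 and, again, that main_image holds the target file name):

const fs = require('fs');
const path = require('path');

const uploadDir = path.join(__dirname, 'public', 'images', 'movies');

// { recursive: true } creates any missing parent directories, like mkdir -p.
fs.mkdirSync(uploadDir, { recursive: true });
await req.files.main_image.mv(path.join(uploadDir, main_image));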

Related

Read all JSON files contained in a dynamically updated folder

I've got multiple JSON files contained within a directory that will be dynamically updated by users. The users can add categories, which will create new JSON files in that directory, and they can also remove categories, which will delete JSON files in that directory. I'm looking for a method to read all JSON files contained in that directory and push them into a single array. I imagine doing this asynchronously would be desirable too.
I'm very new to using fs. I've managed to read single JSON files by directory using
const fs = require('fs');
let data = fs.readFileSync('./sw_lbi/categories/category1.json');
let categories = JSON.parse(data);
console.log(categories);
But of course this will only solve the synchronous issue when using require()
As I'll have no idea what JSON files will be contained in the directory (the users also name them), I'll need a way to read all the JSON files simply by pointing at the directory which contains them.
I'm imagining something like this (which obviously is foolish)
const fs = require('fs');
let data = fs.readFileSync('./sw_lbi/categories');
let categories = JSON.parse(data);
console.log(categories);
What would be the best approach to achieve this?
Thanks in advance.
First of all you need to scan the directory for files, next you need to filter them and keep only the JSONs, and at the end read every file and do what you need to do:
const fs = require('fs');
const path = require('path');

const jsonsInDir = fs.readdirSync('./sw_lbi/categories').filter(file => path.extname(file) === '.json');

jsonsInDir.forEach(file => {
    const fileData = fs.readFileSync(path.join('./sw_lbi/categories', file));
    const json = JSON.parse(fileData.toString());
    // do whatever you need with the parsed json here, e.g. push it into an array
});
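Since the question mentions that doing this asynchronously would be desirable, here is a sketch of the same logic using the promise-based fs API (assuming the same ./sw_lbi/categories folder):

const fs = require('fs').promises;
const path = require('path');

async function readCategories(dir) {
    // List the directory, keep only .json files, then read and parse them in parallel.
    const files = (await fs.readdir(dir)).filter(file => path.extname(file) === '.json');
    const categories = await Promise.all(
        files.map(async file => JSON.parse(await fs.readFile(path.join(dir, file), 'utf8')))
    );
    return categories; // a single array containing the parsed contents of every JSON file
}

readCategories('./sw_lbi/categories').then(console.log).catch(console.error);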

Getting mkdir error while uploading image using multer

While uploading a file and creating a path, I am getting a folder-creation error:
Error: EACCES: permission denied, mkdir '/opt/bitnami/apps/NodeJS-Login/uploads'
at Object.fs.mkdirSync (fs.js:885:18)
at Function.sync (/opt/bitnami/apps/NodeJS-Login/node_modules/mkdirp/index.js:71:13)
at new DiskStorage (/opt/bitnami/apps/NodeJS-Login/node_modules/multer/storage/disk.js:21:12)
at module.exports (/opt/bitnami/apps/NodeJS-Login/node_modules/multer/storage/disk.js:65:10)
at new Multer (/opt/bitnami/apps/NodeJS-Login/node_modules/multer/index.js:15:20)
I am using bitnami on AWS to host my MEAN app.
In my main server.js file I have added this:
app.use(multer({ dest: './uploads/',
    rename: function (fieldname, filename) {
        return filename;
    },
}));
On the schema model:
companyLogo: {
    data: Buffer,
    type: String
}
And in the controller for the route:
admin.companyLogo = fs.readFileSync(req.files.comLogo.path)
admin.companyLogo.type = 'image/png';
What should I do to make the image upload work? Also, do I have to pass other key-value pairs in form-data instead of raw?
/opt is write-protected by default, so here are possible fixes:
1) Change permissions for /opt and allow the user to write in this folder (Not Recommended)
OR
2) Run server.js as the superuser; this way you have complete rights over the directory and it will allow you to do anything (Not Recommended)
OR
3) Just change the path to somewhere the user has access to write (Recommended)
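As a sketch of option 3, keeping the same multer options as in the question, you could point dest at a directory the Node process user owns; the exact path here is only an example:

const os = require('os');
const path = require('path');

// Example only: any directory the process user can write to will do.
const uploadDir = path.join(os.homedir(), 'uploads');

app.use(multer({ dest: uploadDir,
    rename: function (fieldname, filename) {
        return filename;
    },
}));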

Is fs.mkdir() required to create a sub directory within /tmp in Firebase Cloud Functions?

Let's look at the code below. If I wanted to save a file to /tmp/new_folder, should I use node's fs.mkdir() function, or can I just give it the path as a string even though the sub-directory does not exist yet?
Also, is it a requirement to use path.join() over concatenating strings to create the destination path?
// Download file from bucket.
const bucket = gcs.bucket(fileBucket);
const tempFilePath = path.join(os.tmpdir(), fileName);
const metadata = {
    contentType: contentType,
};
return bucket.file(filePath).download({
    destination: tempFilePath,
})
In the Cloud Functions runtime, /tmp already exists, so there is no need to try to create it before you write a file there. If you want to create a subdirectory under /tmp, you will have to create that on your own (and delete it when your function is done).
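For the /tmp/new_folder case, a small sketch of that (path.join is not strictly required, but it avoids separator mistakes; the cleanup at the end matters because /tmp in Cloud Functions is backed by memory):

const fs = require('fs');
const os = require('os');
const path = require('path');

// Create the subdirectory under /tmp yourself before writing into it.
const targetDir = path.join(os.tmpdir(), 'new_folder');
fs.mkdirSync(targetDir, { recursive: true });

const tempFilePath = path.join(targetDir, fileName);
return bucket.file(filePath).download({ destination: tempFilePath })
    .then(() => {
        // ... work with the downloaded file ...
        // Clean up when done, since /tmp counts against the function's memory.
        fs.unlinkSync(tempFilePath);
    });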

Extract a specific folder from a zip file using node.js

I have a zip file with the following structure:
download.zip\Temp\abc.txt
download.zip\Temp\Foo\abc2.txt
I want to extract the content under Temp in download.zip to a directory say D:\work_del.
This directory after extraction of zip should have abc.txt and Foo\abc2.txt
I am using the adm-zip module of Node, but that doesn't seem to help (code below for reference).
var zip = require('adm-zip');
var file = new zip("D:\\Work\\download.zip");
file.extractEntryTo("Temp", 'D:\\Work_delete', false, true);
Any pointers to get the above scenario working in node.js?
var zip = require('adm-zip');
var file = new zip("D:\\Work\\download.zip");
file.extractEntryTo("Temp/abc.txt", 'D:\\Work_delete', false, true);
The thing I noticed is that if you specify the path as Temp\\1.txt it doesn't work, so try to avoid backslashes; forward slashes work perfectly fine on Windows with Node.js.
var zip = require('adm-zip');
var file = new zip("C:/Users/harslo/Desktop/node/Download.zip");
file.extractEntryTo("Temp/abc.txt", 'C:/Users/harslo/Desktop/node/Work_delete', false, true);
If you want to extract all the files inside of a folder, use FolderName/ as described in the adm-zip docs.
PS: adm-zip's extractEntryTo doesn't seem to work with zips created with Windows' inbuilt "Send to ZIP".
var zip = require('adm-zip');
var file = new zip("D:/Work/download.zip");
file.extractEntryTo("Temp/", "D:/Work_delete", false, true);

Upload zip files to service for user download

I have an express (node.js) application that creates mp3 files for a user and stores them in a folder on the server.
The server's file structure looks like this:
data/
    id#/
        song1.mp3
        song2.mp3
        song3.mp3
    id#/
        song1.mp3
        song2.mp3
        song3.mp3
    ...
I want to create zip files and download links for the folder and email users their zip file. Each folder is named using an id# and corresponds to a user.
I want to use an external service (such as s3) to store and handle the downloads of the files.
How would I zip up the files and send them to the service and create download links? Which services should I look into?
You can use the child_process module to run a zip command in the background (this example assumes you're on Linux; you can modify it to suit Windows).
This just handles the zip process, then you can respond with the link to download the file:
var exec = require('child_process').exec;

var currentWorkingDirectory = '/data', // Folder that contains the sub folders of id#/
    folderToZip = 'id123123',          // Subfolder name id#, e.g. id123123
    tmpFolderForZips = '/tmp/',        // Folder to store the zip files in
    execString = "zip -r " + tmpFolderForZips + folderToZip + ".zip " + folderToZip; // Tidy string to put it all together

console.log(execString);

var child = exec(execString, { cwd: currentWorkingDirectory });

child.on('error', function(err) {
    console.log(err);
});

child.on('close', function() {
    // respond with download link to zip file
    console.log('Done Zipping');
});
Make sure you apt-get install zip
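Once the zip exists, you could push it to S3 (which the question mentions) and hand the user a time-limited link; a rough sketch with the aws-sdk v2 package, where the bucket name and key are placeholders:

var fs = require('fs');
var AWS = require('aws-sdk');
var s3 = new AWS.S3();

var zipPath = tmpFolderForZips + folderToZip + '.zip'; // e.g. /tmp/id123123.zip from the code above
var key = folderToZip + '.zip';

s3.upload({ Bucket: 'my-song-zips', Key: key, Body: fs.createReadStream(zipPath) }, function(err, data) {
    if (err) return console.log(err);
    // Pre-signed URL that expires after an hour; email this to the user.
    var url = s3.getSignedUrl('getObject', { Bucket: 'my-song-zips', Key: key, Expires: 3600 });
    console.log('Download link:', url);
});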
