I am using imagemagick to convert a file on Node.
router.post('/route', upload.array('file'), async (req, res, next) => {
  const files = req.files;
  Promise.all(files.forEach(async (file) => {
    await fileconvert(file);
  }));
  // ...some db create
  return res.send(createdId);
});
// from here, a different file
import im from 'imagemagick';

function fileconvert(file) {
  return new Promise((resolve, reject) => {
    if (file.mimetype === "specialcase") {
      im.convert(
        [file.path, '-format', 'png', 'newFile.png'],
        (err, stdout) => {
          if (err) reject(err);
          resolve(stdout);
        }
      );
    }
    // if it's not the special case; without this, it doesn't work.
    resolve();
  });
}
On the front end, I'm calling this route with axios.
I want res.send(createdId) to run only after fileconvert has finished.
Everything works fine, except that when a large file is converted, I get a broken image.
The file exists in the expected directory, but it hasn't been fully converted yet.
It seems that as soon as the file is created, the request passes. How can I get the fully converted image...
The front-end flow is: upload file => if createdId exists => route to the page.
Thanks for your help.
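For clarity, here is a sketch of the behaviour I'm after (the files.map call and the else branch around resolve() are experiments of mine, not the code above, so I'm not sure they are the right fix):

router.post('/route', upload.array('file'), async (req, res, next) => {
  const files = req.files;
  // map returns an array of promises, so Promise.all can actually wait on them
  await Promise.all(files.map((file) => fileconvert(file)));
  // ...some db create
  return res.send(createdId);
});

function fileconvert(file) {
  return new Promise((resolve, reject) => {
    if (file.mimetype === "specialcase") {
      im.convert(
        [file.path, '-format', 'png', 'newFile.png'],
        (err, stdout) => {
          // resolve only once ImageMagick has finished writing the output
          if (err) return reject(err);
          resolve(stdout);
        }
      );
    } else {
      // nothing to convert for other mime types
      resolve();
    }
  });
}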
I would like to upload a picture to my Node.js server only if the picture is a real png / jpeg / jpg / gif file.
For testing purposes, I have 2 files: a "fake" png (a pdf file whose extension has been manually modified) and a real one.
I use the formidable library for this. I have two functions. Neither function works as expected: I would like the file to be written to the server disk only if the type is correct.
router.post('/producer', (req, res, next) => {
  uploadImage1(req, res, next);
  //uploadImage2(req, res, next);
  ...
});
Function 1:
function uploadImage1(req, res, next) {
  const formidableOptions = {
    multiples: false,
    keepExtensions: true,
    uploadDir: __dirname + '/pictures',
    maxFileSize: MAX_SIZE_FILE_UPLOAD_KO * 1024
  };
  const form = formidable(formidableOptions);
  form.parse(req, async (err, fields, files) => {
    if (err) {
      return res.status(400).json("Something went wrong with your request");
    }
    const buffer = readChunk.sync(files.fileToUpload.path, 0, 4100);
    const fileType = await FileType.fromBuffer(buffer);
    if (!fileType.ext.match(/(jpg|jpeg|png|gif)$/i)) {
      return res.status(400).json("Upload file type error");
    }
    return res.status(200).json("Image correctly uploaded");
  });
}
My server's responses:
With the real png file: "Image correctly uploaded"
With the fake png file: "Upload file type error"
But in both cases the file is written to my server disk, whereas logically it shouldn't be for the wrong file type.
Function 2:
function uploadImage2(req, res, next) {
  const formidableOptions = {
    multiples: false,
    keepExtensions: true,
    uploadDir: __dirname + '/pictures',
    maxFileSize: MAX_SIZE_FILE_UPLOAD_KO * 1024
  };
  const form = formidable(formidableOptions);
  form.parse(req);
  form.onPart = (part) => {
    part.on('data', async (buffer) => {
      const fileType = await FileType.fromBuffer(buffer);
      if (fileType) {
        if (!fileType.ext.match(/(jpg|jpeg|png|gif)$/i)) {
          return res.status(400).json("Upload file type error");
        } else {
          // What to do next? The picture is not written to the server directory...
          return res.status(200).json("Image correctly uploaded");
        }
      }
    });
  };
}
My server's responses:
With the real png file: "Image correctly uploaded"
With the fake png file: "Upload file type error"
But in both cases the file is never written to my server disk, whereas logically it should be in the correct case...
I'm stuck with this. Do you know how I could fix this? Many thanks.
Inside Function 1: you are reading the file from disk with readChunk.sync, which means the file is already there. You need to remove it, for example with fs.unlink, when you return res.status(400).json("Upload file type error");.
Inside the form.parse callback the files are already written.
Another way to do this is to pass in a custom options.fileWriteStreamHandler which checks the type.
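A minimal sketch of the first suggestion, reusing the form.parse callback from Function 1 (the fs.unlink callback is fire-and-forget here, and the extra !fileType check is my addition for files whose type cannot be detected):

const fs = require('fs');

form.parse(req, async (err, fields, files) => {
  if (err) {
    return res.status(400).json("Something went wrong with your request");
  }
  const buffer = readChunk.sync(files.fileToUpload.path, 0, 4100);
  const fileType = await FileType.fromBuffer(buffer);
  // fromBuffer returns undefined when it cannot detect the type
  if (!fileType || !fileType.ext.match(/(jpg|jpeg|png|gif)$/i)) {
    // formidable has already written the temp file, so delete it again
    fs.unlink(files.fileToUpload.path, () => {});
    return res.status(400).json("Upload file type error");
  }
  return res.status(200).json("Image correctly uploaded");
});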
I have a file issuesData.json and I want to update it in a POST request. This is my code.
I try to read the file, parse it into an array, push the new issue, and then re-write the file.
app.post("/api/issues", (req, res, next) => {
const issueObj = req.body;
fs.readFile("issuesData.json", (err: Error, data: string | Buffer) => {
if (err) {
res.status(500).send(err);
} else {
const stringData = data.toString();
const issueFile = [...JSON.parse(stringData)];
const updatedIssueFile = issueFile.push(issueObj);
fs.writeFile(
"issuesData.json",
JSON.stringify(updatedIssueFile),
(err: Error) => {
if (err) {
res.status(500).send(err);
} else {
res.status(200).send("Issue has updated");
}
}
);
}
});
});
1) Is this good practice?
2) If this is TypeScript, what should be the types of req, res, and next?
3) Is this a good way to update the JSON?
If you're just writing to a file, you might not need to read the contents of the file and append your issueObj to the issueFile array. You could instead write the issueObj to a new line in your file, for example with the appendFile function (https://nodejs.org/api/fs.html#fs_fs_appendfile_path_data_options_callback).
Currently, as your file grows, the read operations will take longer and longer and will affect performance. Just appending, however, ensures you don't incur that overhead on each POST request.
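A minimal sketch of that append-only approach, assuming one JSON object per line (the file then becomes newline-delimited JSON rather than a single array; the issuesData.ndjson filename is only an illustration):

const fs = require("fs");

app.post("/api/issues", (req, res) => {
  const issueObj = req.body;
  // append one JSON line per issue; no read-modify-write of the whole file
  fs.appendFile("issuesData.ndjson", JSON.stringify(issueObj) + "\n", (err) => {
    if (err) {
      return res.status(500).send(err);
    }
    res.status(200).send("Issue has been added");
  });
});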
I have an API backend with Node and Express. I am trying to take some filtered data from the front end, create a CSV file, and download it for the user. I have been using json2csv. I am able to create the data file correctly, but when I serve that file from my Express route, the downloaded file just says "undefined". At first I thought it was an asynchronous issue, but after using a setTimeout as a test I still get the undefined data file. Console logging csvData shows the correct data.
Here is the Express route to download the file:
app.post('/api/downloads/filtered', (req, res) => {
  let fields = [];
  fields = Object.keys(req.body[0]);
  const filteredData = req.body;
  const json2csvParser = new json2csv({ fields: fields });
  const csvData = json2csvParser.parse(filteredData);
  console.log(csvData);
  fs.writeFile('./report.csv', csvData, (err) => {
    if (err) {
      console.log(err);
    } else {
      console.log('created report.csv');
      res.download('./report.csv');
    }
  });
});
I'm using Vue on the front end and trigger the download by clicking a button; I'm not sure if that is something I should include.
I ended up figuring out my issue. I found that downloading in a POST request didn't seem to be possible; I needed a GET request. Since the data for the file comes in the request body, I kept the POST request to create the file and added a separate GET request to download it. This seemed to work fine, but I didn't find it documented anywhere, so I wasn't sure if a better way exists.
app.post('/api/downloads/filtered', (req, res) => {
  console.log(req.body);
  let fields = [];
  fields = Object.keys(req.body[0]);
  const filteredData = req.body;
  const json2csvParser = new json2csv({ fields: fields });
  const csvData = json2csvParser.parse(filteredData);
  console.log(csvData);
  fs.writeFile('./report.csv', csvData, (err) => {
    if (err) {
      console.log(err);
    } else {
      console.log('created report.csv');
    }
  });
});

app.get('/api/downloads/filtered', (req, res) => {
  // give the POST handler a moment to finish writing report.csv before sending it
  setTimeout(() => { res.download('./report.csv'); }, 1000);
});
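As a possible alternative (my own sketch, not something I found documented): the POST handler could send the CSV back directly instead of writing report.csv to disk, and the front end could save the response body as a file. This reuses the same json2csv parser as above.

app.post('/api/downloads/filtered', (req, res) => {
  const fields = Object.keys(req.body[0]);
  const json2csvParser = new json2csv({ fields: fields });
  const csvData = json2csvParser.parse(req.body);
  // suggest a filename (this also sets the content type from the extension)
  res.attachment('report.csv');
  res.send(csvData);
});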
I have tried several methods to delete my photo, especially using fs.unlink, but it doesn't seem to work at all. As you can see from the picture below, I save my photos in assets -> img -> products,
and my database looks like this,
and my code looks like this:
router.get("/admin/products/:id/delete", (req, res) => {
Product.findByIdAndRemove(req.params.id, (err, photo) => {
if (err) {
req.flash("error", "deleting photo failed");
return res.render("/admin/products/");
}
fs.unlink(photo.image1, function() {
console.log(photo.image1);
return res.redirect("/admin/products");
});
});
});
What is wrong with my code that it does not delete the photo file?
It cannot delete the photo because you are passing a relative path as the first parameter:
photo.image1 = assets/img/products/image1.jpg
Try passing the absolute path (from the root directory of your machine).
fs.unlink("absolute-path-to-assetsParentFolder" + photo.image1, function () {
  console.log(photo.image1);
  return res.redirect("/admin/products");
});
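A sketch of the same idea with path.join, assuming the assets folder lives one level above the file that defines this route (the '..' segment is a guess about your project layout and may need adjusting):

const path = require('path');
const fs = require('fs');

// photo.image1 is stored as a project-relative path, e.g. "assets/img/products/image1.jpg"
const absolutePath = path.join(__dirname, '..', photo.image1);
fs.unlink(absolutePath, (err) => {
  if (err) console.log(err);
  return res.redirect('/admin/products');
});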
I am building a dummy image generator to improve my understanding of Node and Express. I get the dimensions from the URL and use the gm package to resize the image, and the resulting stream is piped into the response. But I don't get the data on the front end, and when I check the Response tab in the Network panel of Chrome's dev tools I see 'this response has no data available'. Can someone please give me some pointers, as I am not sure where to begin debugging this?
const IMAGE_DIR_PATH = path.join(__dirname, './../../data/images');

const getRandomImageName = () => {
  return new Promise((resolve, reject) => {
    fs.readdir(IMAGE_DIR_PATH, (err, images) => {
      err ? reject(err) : resolve(
        images[Math.floor(Math.random() * images.length)]
      );
    });
  });
};

const pickDimensions = (dimensionsStr) => {
  const dimensions = dimensionsStr.toLowerCase().split('x');
  return {
    width: dimensions[0],
    height: dimensions[1] ? dimensions[1] : dimensions[0]
  };
};

exports.getRandomImageByDimension = (req, res, next) => {
  const dimensions = pickDimensions(req.params.dimensions);
  // getRandomImageName returns a Promise
  // resolves with the name of the file, e.g. abc.jpg
  getRandomImageName()
    .then((img) => {
      res.set('Content-Type', 'image/jpeg');
      // I am getting the right image and the path below is valid
      gm(path.resolve(`${IMAGE_DIR_PATH}/${img}`))
        .resize(dimensions.width, dimensions.height)
        .stream(function (err, stdout, stderr) {
          if (err) { throw new Error(err); }
          stdout.pipe(res);
        });
    })
    .catch(e => {
      res.status(500);
      res.send(e);
    });
};
The response headers I receive are shown below:
It turns out that I had forgotten to add ImageMagick as a dependency for gm, and this was causing an error. Strangely, the error was not shown in the console, and I found it only after breaking the app down into simpler pieces and debugging, as skirtle mentioned in a comment above.
To fix it I did the following:
npm install --save imagemagick
Then, in the file where I was using the gm package, I set the imageMagick property:
const gm = require('gm').subClass({imageMagick: true});
Note: I was running this on Windows 7.
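For reference, a rough sketch of how the fix ties into the original handler (same gm calls as in the question; only the require line changes, and passing the error to next instead of throwing is my own tweak):

// tell gm to drive ImageMagick instead of GraphicsMagick
const gm = require('gm').subClass({ imageMagick: true });

gm(path.resolve(`${IMAGE_DIR_PATH}/${img}`))
  .resize(dimensions.width, dimensions.height)
  .stream((err, stdout, stderr) => {
    if (err) { return next(err); }
    stdout.pipe(res);
  });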