How to save a FormData image file locally with "fs" in Node

I have a React front-end that allows the user to send an image to the Node backend using FormData().
In the Express backend's controller, I am using multer as middleware to grab the image into the files variable:
private initializeRoutes() {
  this.router.post(`${this.path}/image`, multerUpload.array("filesToUpload[]"), this.saveImage);
}
In the backend's service, I am trying to save files[0] into the uploads folder:
public saveImage(files: any) {
  console.log("OtherServiceLocal saveImage - files :");
  console.log(files[0]); // see output at the bottom
  let localPath = fs.createWriteStream("./uploads/image.png");
  files[0].buffer.pipe(localPath);
}
But I am getting an error.
I tried piping files[0] and files[0].buffer to no avail, and I had difficulty understanding how to transform it into a stream even after some research.
This is the output of console.log(files[0]):
{
  fieldname: 'filesToUpload[]',
  originalname: 'Screen Shot 2021-09-08 at 3.42.48 PM.png',
  encoding: '7bit',
  mimetype: 'image/png',
  buffer: <Buffer 89 50 4e 47 0d 0a 1a 0a 00 00 00 0d 49 48 44 52 00 00 03 85 00 00 02 c5 08 06 00 00 00 74 ff 46 78 00 00 00 01 73 52 47 42 00 ae ce 1c e9 00 00 00 62 ... 249405 more bytes>,
  size: 249455
}
Please note that I am aware that you can use multer's upload method to save the image as middleware directly in the router, but I can't do this due to some requirements of my app.
Thanks,

files[0].buffer is an instance of Buffer, so you can use fs.writeFile directly:
fs.writeFile("./uploads/image.png", files[0].buffer, (err) => {
  if (err) console.error(err);
});
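If you prefer to keep the stream-based approach instead, a Buffer can be wrapped in a readable stream with Readable.from (available since Node 12) and piped into the write stream. A minimal sketch, assuming the same files variable from the question:

const fs = require("fs");
const { Readable } = require("stream");

// wrap the in-memory multer buffer in a readable stream and pipe it to disk
const localPath = fs.createWriteStream("./uploads/image.png");
Readable.from(files[0].buffer).pipe(localPath);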

Related

change multer uploaded file to ReadStream

I'm using NestJS and multer to read uploaded files.
The file is uploaded correctly via a POST REST API.
I want to convert this file to a ReadableStream.
I want to avoid writing the file to disk and reading it again using createReadStream; it would be better to convert it directly to a ReadableStream using the uploaded meta info.
export function ApiFile(fieldName: string) {
  return applyDecorators(UseInterceptors(FileInterceptor(fieldName)));
}

@Post("/file_upload")
@ApiFile('file')
create(
  @Body() createNewsDto: CreateNewsDto,
  @UploadedFile() file: Express.Multer.File,
) {
  console.log({ file });
  return this.myService.create(createNewsDto, file);
}
This is the file metadata:
{
  file: {
    fieldname: 'file',
    originalname: 'screenshot.png',
    encoding: '7bit',
    mimetype: 'image/png',
    buffer: <Buffer 59 10 4f 47 0d 0a 1a 0a 00 00 00 0d 49 48 44 52 00 00 02 cf 00 00 02 1b 08 06 00 00 00 14 dd 73 8e 00 00 01 55 61 43 43 50 49 43 43 20 50 72 6f 66 69 ... 298432 more bytes>,
    size: 298982
  }
}
How can I achieve this?
I figured out how to send my files to a remote server.
You need to use Readable.from, which converts the buffer into a ReadStream.
If your file meta info is as shown above, you can convert it into a stream:
import { Readable } from 'stream';
import * as FormData from "form-data";

const formData = new FormData();
const stream = Readable.from(file.buffer);
formData.append("anyKeyValue", stream, {
  filename: file.originalname,
  contentType: file.mimetype
});
Then send it to the remote server with content type multipart/form-data.
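For example, with axios you can take the multipart boundary headers from the form-data instance via its getHeaders() method. A short sketch; the URL is a placeholder:

import axios from "axios";

// POST the form; getHeaders() supplies the multipart/form-data
// content type with the correct boundary string
await axios.post("https://example.com/upload", formData, {
  headers: formData.getHeaders(),
});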

Gulp image-min - How to continue after an error?

I am trying to run an image optimisation task on a folder with thousands of images; unfortunately some of them are corrupted, so the task keeps failing.
I have tried to capture the error and continue, but it's not working: once it hits a corrupted image, the task aborts. Is there a way for this to continue running?
gulp.task('imgCompress', (done) => {
  gulp.src(imgMedia)
    .pipe(imagemin([
      mozjpeg({
        quality: 75
      }),
      pngquant({
        quality: [0.65, 0.80]
      })
    ]))
    .on('error', gutil.log)
    .pipe(gulp.dest(imgMediaDest))
  done();
});
Error:
{ Error: write EPIPE
    at WriteWrap.afterWrite [as oncomplete] (net.js:789:14)
  errno: 'EPIPE',
  code: 'EPIPE',
  syscall: 'write',
  stdout: <Buffer ff d8 ff e0 00 10 4a 46 49 46 00 01 01 00 00 01 00 01 00 00 ff e1 2b 18 45 78 69 66 00 00 4d 4d 00 2a 00 00 00 08 00 0b 01 0f 00 02 00 00 00 06 00 00 ... >,
  stderr: <Buffer >,
  failed: true,
  signal: null,
  cmd: '/private/var/www/website/m2/tools-media/node_modules/mozjpeg/vendor/cjpeg -quality 75',
  timedOut: false,
  killed: false,
  message: 'write EPIPE',
  name: 'Error',
  stack: 'Error: write EPIPE\n    at WriteWrap.afterWrite [as oncomplete] (net.js:789:14)',
  __safety: undefined,
  _stack: undefined,
  plugin: 'gulp-imagemin',
  showProperties: true,
  showStack: false,
  fileName: '/private/var/www/website/m2/pub/media/catalog/product/1/i/1img_5232.jpg' }
You should be fine with gulp-plumber, which lets the stream continue after an error and even has an option to log error messages automatically without handling error events.
Also note that you should not call done right after creating the stream, but only after the stream has terminated (you could, for example, listen for finish events). It's easier, though, to simply return the stream from the task function:
const plumber = require('gulp-plumber');

gulp.task('imgCompress', () => {
  return gulp.src(imgMedia)
    .pipe(plumber({ errorHandler: true }))
    .pipe(imagemin([
      mozjpeg({
        quality: 75
      }),
      pngquant({
        quality: [0.65, 0.80]
      })
    ]))
    .pipe(gulp.dest(imgMediaDest));
});
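If you want custom logging instead of plumber's automatic messages, errorHandler also accepts a function. A sketch: replace the plumber line above with

.pipe(plumber({
  errorHandler: (err) => {
    // log the failing plugin and message, then let the stream continue
    console.error(`[${err.plugin}] ${err.message}`);
  }
}))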

My React App gives error when connected to Nodejs API (MS SQL Server)

I was building a React app to display movie contents. I got my Node API using mssql working fine and tested it with Postman for all CRUD operations.
When I started creating the React app and using this API, it shows the following errors. Could you please help?
My Node API JSON format:
{
  Film_id: 1,
  film_name: 'Van Helsing',
  actor: 'Hugh Jackman',
  actress: 'Kate B',
  pub_date: '01/05/2008',
  director: 'Brad',
  producer: 'Universal',
  prod_cost: 6454988948,
  dist_cost: 2464546,
  category: 'Horror',
  cert_category: 'U/A',
  poster: <Buffer ff d8 ff e0 00 10 4a 46 49 46 00 01 01 00 00 01 00 01 00 00 ff e1 00 60 45 78 69 66 00 00 49 49 2a 00 08 00 00 00 02 00 31 01 02 00 07 00 00 00 26 00 ... 122428 more bytes>
},
The error I am getting:
Error: Objects are not valid as a React child (found: object with keys {type, data}). If you meant to render a collection of children, use an array instead.
(anonymous function)
D:/react2/andrew/src/Films.js:16
  13 | fetch(process.env.REACT_APP_API+'Films')
  14 | .then(response=>response.json())
  15 | .then(data=>{
> 16 | this.setState({fls:data});
     | ^
  17 | });
  18 | }
  19 |
My React component:
import React, { Component } from 'react';
import { Table } from 'react-bootstrap';

export class Films extends Component {
  constructor(props) {
    super(props);
    this.state = { fls: [] };
  }
  refreshList() {
    fetch(process.env.REACT_APP_API + 'Films')
      .then(response => response.json())
      .then(data => {
        this.setState({ fls: data });
      });
  }
  componentDidMount() {
    this.refreshList();
  }
  componentDidUpdate() {
    this.refreshList();
  }
  render() {
    const { fls } = this.state;
    return (
      <div>
        <Table className="mt-4" striped border hover size="sm">
          <thread>
            <tr>
              <th>Film ID</th>
              <th>Film Name</th>
              <th>Actor</th>
              <th>Actress</th>
              <th>Published Date</th>
              <th>Director</th>
              <th>Producer</th>
              <th>Production Cost</th>
              <th>Distribution Cost</th>
              <th>Category</th>
              <th>Cert Category</th>
              <th>Poster</th>
            </tr>
          </thread>
          <tbody>
            {fls.map(fl =>
              <tr key={fl.Film_id}>
                <td>{fl.Film_id}</td>
                <td>{fl.film_name}</td>
                <td>{fl.actor}</td>
                <td>{fl.actress}</td>
                <td>{fl.pub_date}</td>
                <td>{fl.director}</td>
                <td>{fl.producer}</td>
                <td>{fl.prod_cost}</td>
                <td>{fl.dist_cost}</td>
                <td>{fl.category}</td>
                <td>{fl.cert_category}</td>
                <td>{fl.poster}</td>
                <ts>Edit / Delete</ts>
              </tr>)}
          </tbody>
        </Table>
      </div>
    )
  }
}
I think you need to map through the values of your object. If you want to render the values of your "fls" object, you have to write Object.values(fls).map(fl => ...).
You can check the docs here: Object.values on MDN.
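A minimal sketch of that suggestion inside the render method, assuming fls really arrives as an object keyed by id rather than an array:

{Object.values(fls).map(fl =>
  <tr key={fl.Film_id}>
    <td>{fl.film_name}</td>
  </tr>
)}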

How to use Nginx NJS with native nodejs modules and webpack?

There is a guide, https://nginx.org/en/docs/njs/node_modules.html, that describes how to use 'native' Node.js modules with njs.
I followed the guide until I hit the part I did not understand, shown in bold:
Note that in this example generated code is not wrapped into function
and we do not need to call it explicitly. The result is in the "dist" directory:
$ cat dist/wp_out.js code.js > njs_dns_bundle.js
Let's call our code at the end of a file: <<<--- HERE
var b = set_buffer(global.dns);
console.log(b);
And execute it using node:
$ node ./njs_dns_bundle_final.js
The question is: how do I include / require / import the webpack-generated njs_dns_bundle.js in njs_dns_bundle_final.js (the file holding the "call our code at the end of a file" snippet)? Without it I get the error:
njs_dns_bundle_final.js:1
var b = set_buffer(global.dns);
ReferenceError: set_buffer is not defined
My code.js:
module.exports = {
  hello: function set_buffer(dnsPacket) {
    // create DNS packet bytes
    var buf = dnsPacket.encode({
      type: 'query',
      id: 1,
      flags: dnsPacket.RECURSION_DESIRED,
      questions: [{
        type: 'A',
        name: 'google.com'
      }]
    })
    return buf;
  }
}
My njs_dns_bundle_final.js:
var myModule = require('./njs_dns_bundle');
var b = myModule.hello(global.dns);
console.log(b);
Node runs fine, I think:
node ./njs_dns_bundle_final.js
<Buffer 00 01 01 00 00 01 00 00 00 00 00 00 06 67 6f 6f 67 6c 65 03 63 6f 6d 00 00 01 00 01>
NJS does not:
njs ./njs_dns_bundle_final.js
Thrown:
Error: Cannot find module "./njs_dns_bundle"
    at require (native)
    at main (njs_dns_bundle_final.js:1)
Thanks
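Worth noting: the quoted guide never require()s the bundle at all; it concatenates everything into one file and appends the calling code at the end. A hedged sketch of that approach, assuming (as in the guide) that code.js defines set_buffer at the top level rather than behind module.exports:

$ cat dist/wp_out.js code.js > njs_dns_bundle.js
$ echo 'var b = set_buffer(global.dns); console.log(b);' >> njs_dns_bundle.js
$ njs ./njs_dns_bundle.js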

Node.js: Download file from s3 and unzip it to a string

I am writing an AWS Lambda function which needs to download files from AWS S3, unzip them, and return the content in the form of a string.
I am trying this:
function getObject(key) {
  var params = {
    Bucket: "my-bucket",
    Key: key
  };
  return new Promise(function (resolve, reject) {
    s3.getObject(params, function (err, data) {
      if (err) {
        reject(err);
      }
      resolve(zlib.unzipSync(data.Body));
    });
  });
}
But I am getting the error:
Error: incorrect header check
    at Zlib._handle.onerror (zlib.js:363:17)
    at Unzip.Zlib._processChunk (zlib.js:524:30)
    at zlibBufferSync (zlib.js:239:17)
The data looks like this:
{ AcceptRanges: 'bytes',
  LastModified: 'Wed, 16 Mar 2016 04:47:10 GMT',
  ContentLength: '318',
  ETag: '"c3xxxxxxxxxxxxxxxxxxxxxxxxx"',
  ContentType: 'binary/octet-stream',
  Metadata: {},
  Body: <Buffer 50 4b 03 04 14 00 00 08 08 00 f0 ad 6f 48 95 05 00 50 84 00 00 00 b4 00 00 00 2c 00 00 00 30 30 33 32 35 2d 39 31 38 30 34 2d 37 34 33 30 39 2d 41 41 ... > }
The Body buffer contains zip archive data, not plain zlib/gzip data: the first bytes, 50 4b 03 04 ("PK"), are the ZIP local file header signature, which is why zlib.unzipSync fails with "incorrect header check".
You will need to use a zip module to parse the data and extract the files within. One such library is yauzl, which has a fromBuffer() method that you can pass your buffer to in order to get the file entries.
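A minimal sketch of that approach, assuming the archive holds a single text file whose contents you want as a string (unzipFirstEntryToString is a hypothetical helper name):

const yauzl = require("yauzl");

function unzipFirstEntryToString(buffer) {
  return new Promise((resolve, reject) => {
    yauzl.fromBuffer(buffer, { lazyEntries: true }, (err, zipfile) => {
      if (err) return reject(err);
      zipfile.readEntry();
      zipfile.on("entry", (entry) => {
        if (/\/$/.test(entry.fileName)) {
          // directory entry: skip to the next one
          zipfile.readEntry();
          return;
        }
        zipfile.openReadStream(entry, (err, readStream) => {
          if (err) return reject(err);
          const chunks = [];
          readStream.on("data", (chunk) => chunks.push(chunk));
          readStream.on("end", () =>
            resolve(Buffer.concat(chunks).toString("utf8"))
          );
        });
      });
    });
  });
}

You could then resolve unzipFirstEntryToString(data.Body) in place of zlib.unzipSync(data.Body) inside the existing promise.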
