How to Process an Uploaded Image with GraphQL Apollo and Sharp in Node.js?

I have a GraphQL mutation that receives an image from the frontend, which is then processed and optimized on my server.
But I can't figure out how to pass the image to sharp.
Here is my code:
const Mutation = {
  createImage: async (_, { data }) => {
    const { file } = data
    const image = await file
    console.log(image)
    const sharpImage = sharp(image)
  }
}
I know the code doesn't work and sharp throws an error saying that the input is invalid. So how can I work with createReadStream to create an instance of sharp?
When I console.log(image), here is what I see:
image {
  filename: 'image.png',
  mimetype: 'image/png',
  encoding: '7bit',
  createReadStream: [Function: createReadStream]
}
Thanks a lot in advance!

So I figured out the solution to my question.
First, I found out that I needed to add scalar Upload to typeDefs.
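For reference, the typeDefs side looks roughly like this. This is only a sketch: it assumes the apollo-server package's gql tag, and the CreateImageInput/Image types are hypothetical placeholders for the real schema.

const { gql } = require('apollo-server');

const typeDefs = gql`
  scalar Upload

  type Mutation {
    # hypothetical shape - adjust to the real input and return types
    createImage(data: CreateImageInput!): Image
  }
`;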
Then, I needed to add a resolver for Upload like this:
const { GraphQLUpload } = require('graphql-upload');

const server = new ApolloServer({
  resolvers: {
    Upload: GraphQLUpload,
  }
})
Then in my resolver, here is what I had to do:
// this is a utility function to promisify the stream and store the image in a buffer, which is then passed to sharp
const streamToString = (stream) => {
  const chunks = [];
  return new Promise((resolve, reject) => {
    stream.on('data', (chunk) => chunks.push(Buffer.from(chunk)));
    stream.on('error', (err) => reject(err));
    stream.on('end', () => resolve(Buffer.concat(chunks)));
  })
}
const Mutation = {
  createImage: async (_, { data }) => {
    const { file } = data
    const { createReadStream } = await file
    const image = await streamToString(createReadStream())
    const sharpImage = sharp(image)
  }
}
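From there the sharp instance can be used as usual. A minimal sketch, purely as an example (the resize dimensions and output path are made up):

// continuing from the resolver above: resize, re-encode and write to disk
// (.toBuffer() works too if no file is needed)
await sharpImage
  .resize(800, 600)
  .jpeg({ quality: 80 })
  .toFile('uploads/optimized.jpg')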

Related

Error when finding out the type of file retrieved with axios and transformed into a buffer

I have a problem finding out the type of a file.
I fetch the file with axios, but when I try to find out the type of the file I get undefined.
My code:
export const uploadFileToServer = async (path: string, feedLocation: string, fileName: string) => {
  // @ts-ignore
  let uploadPath = `./storage/${path}/${fileName}.csv`;
  const { fileTypeFromBuffer } = await (eval('import("file-type")') as Promise<typeof import('file-type')>);
  const response = await axios.get(feedLocation, { responseType: "arraybuffer" });
  const buffer = Buffer.from(response.data, "binary");
  const test = await fileTypeFromBuffer(buffer);
  console.log("TEST", test);
  fs.writeFile(uploadPath, buffer, (err) => {
    if (!err) console.log('Data written');
  });
};
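If the feed really is a plain-text CSV, the undefined result is expected: file-type detects formats by their magic bytes and does not recognize text-based formats such as CSV, so a fallback is needed. A minimal sketch (the fallback values are just an example):

const test = await fileTypeFromBuffer(buffer);
// file-type returns undefined for text-based formats, so fall back to a default
const { ext, mime } = test ?? { ext: 'csv', mime: 'text/csv' };
console.log("TEST", ext, mime);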

await s3.getObject(params) returns object, but adding .promise() does not

I am trying to simply pull an image from an S3 bucket inside an AWS Lambda script that I wrote in Node.
From all the examples I see, people do:
const params = {
  Bucket: event.bucket,
  Key: event.prefix,
};

console.log('Calling getObject'); // This gets hit
const data = (await (s3.getObject(params).promise())).Body.toString('utf-8')
console.log({ data }); // This NEVER gets hit 😤
However, when I do it without the .promise() like:
const res = await s3.getObject(params);
console.log(res);
I do get a response. How can I pull an image or buffered object using s3.getObject()?
You could try it with something like:
async function getObj(params) {
  const {
    Bucket,
    Key,
  } = params;

  const getObjectPromise = () => new Promise((resolve, reject) => {
    try {
      const data = s3.getObject(params);
      resolve(data);
    } catch (error) {
      reject(error);
    }
  });

  const response = await getObjectPromise();
  // Do Something with response
}
EDIT:
If I'm not mistaken, newer versions of the S3 SDK return a Readable stream for Body. If so, you'll have to do:
...
const response = await getObjectPromise();
const data: string[] = [];

for await (const chunk of response.Body) {
  data.push(chunk.toString());
}

return data.join('');
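For binary objects such as images, the same loop can collect the raw chunks into a Buffer instead of strings; a minimal sketch, assuming the same response object and a Body that is a Node Readable:

const chunks = [];
for await (const chunk of response.Body) {
  chunks.push(chunk);
}
const imageBuffer = Buffer.concat(chunks);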
It turns out, I just had to hit the "Test" button multiple times.
Such a terrible experience.

node javascript file upload doesn't work on remote server

On my local dev machine accessing localhost, the following code works beautifully, even with network settings throttled to "Slow 3G." However, when running on my VPS, it fails to process the file on the server. Here are two different code blocks I tried (again, both work without issue on my local dev machine accessing localhost):
profilePicUpload: async (parent, args) => {
  const file = await args.file;
  const fileName = `user-${nanoid(3)}.jpg`;
  const tmpFilePath = path.join(__dirname, `../../tmp/${fileName}`);

  file
    .createReadStream()
    .pipe(createWriteStream(tmpFilePath))
    .on('finish', () => {
      jimp
        .read(`tmp/${fileName}`)
        .then(image => {
          image.cover(300, 300).quality(60);
          image.writeAsync(`static/uploads/users/${fileName}`, jimp.AUTO);
        })
        .catch(error => {
          throw new Error(error);
        });
    });
}
It seems like this code block doesn't wait long enough for the file upload to finish, since when I check the storage location on the VPS, I see this:
I also tried the following with no luck:
profilePicUpload: async (parent, args) => {
  const { createReadStream } = await args.file;
  let data = '';

  const fileStream = await createReadStream();
  fileStream.setEncoding('binary');

  // UPDATE: 11-2
  let i = 0;
  fileStream.on('data', chunk => {
    console.log(i);
    i++;
    data += chunk;
  });

  fileStream.on('error', err => {
    console.log(err);
  });
  // END UPDATE

  fileStream.on('end', () => {
    const file = Buffer.from(data, 'binary');

    jimp
      .read(file)
      .then(image => {
        image.cover(300, 300).quality(60);
        image.writeAsync(`static/uploads/users/${fileName}`, jimp.AUTO);
      })
      .catch(error => {
        throw new Error(error);
      });
  });
}
With this code, I don't even get a partial file.
jimp is a JS library for image manipulation.
If anyone has any hints to get this working properly, I'd appreciate it very much. Please let me know if I'm missing some info.
I was able to figure out a solution by referring to this article: https://nodesource.com/blog/understanding-streams-in-nodejs/
Here is my final, working code:
const { createWriteStream, unlink } = require('fs');
const path = require('path');
const { once } = require('events');
const { promisify } = require('util');
const stream = require('stream');
const jimp = require('jimp');
const { nanoid } = require('nanoid');

profilePicUpload: async (parent, args) => {
  // have to wait while file is uploaded
  const { createReadStream } = await args.file;
  const fileStream = createReadStream();

  const fileName = `user-${args.uid}-${nanoid(3)}.jpg`;
  const tmpFilePath = path.join(__dirname, `../../tmp/${fileName}`);
  const tmpFileStream = createWriteStream(tmpFilePath, {
    encoding: 'binary'
  });
  const finished = promisify(stream.finished);

  fileStream.setEncoding('binary');

  // apparently async iterators are the way to go
  for await (const chunk of fileStream) {
    if (!tmpFileStream.write(chunk)) {
      await once(tmpFileStream, 'drain');
    }
  }

  tmpFileStream.end(() => {
    jimp
      .read(`tmp/${fileName}`)
      .then(image => {
        image.cover(300, 300).quality(60);
        image.writeAsync(`static/uploads/users/${fileName}`, jimp.AUTO);
      })
      .then(() => {
        unlink(tmpFilePath, error => {
          console.log(error);
        });
      })
      .catch(error => {
        console.log(error);
      });
  });

  await finished(tmpFileStream);
}
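As a side note, the manual write/drain loop can also be expressed with stream.pipeline, which handles backpressure and errors itself. A minimal sketch of just the copy-to-temp-file step, assuming a Node version that ships stream/promises (15+):

const { pipeline } = require('stream/promises');

// copy the upload stream into the temp file; resolves once the write stream finishes
await pipeline(createReadStream(), createWriteStream(tmpFilePath));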

application/octet-stream issue while using google moderate images trigger (blur image)

I'm using the moderate images solution trigger from Google.
I took this solution from here.
I asked someone to upgrade this solution for me, and here is the code:
'use strict'

const gm = require('gm').subClass({ imageMagick: true })
const functions = require('firebase-functions')
const admin = require('firebase-admin')
admin.initializeApp()

const Vision = require('@google-cloud/vision')
const vision = new Vision.ImageAnnotatorClient()
const spawn = require('child-process-promise').spawn

const path = require('path')
const fs = require('fs')

const { Storage } = require('@google-cloud/storage')
const gcs = new Storage({
  projectId: 'xxxxxxxxxxx',
})
exports.blurOffensiveImages = functions.storage
  .object()
  .onFinalize(async (object) => {
    const file = gcs.bucket(object.bucket).file(object.name)
    const filePath = `gs://${object.bucket}/${object.name}`
    console.log(`Analyzing ${file.name}.`)

    try {
      const [result] = await vision.safeSearchDetection(filePath)
      const detections = result.safeSearchAnnotation || {}

      if (
        detections.adult === 'VERY_LIKELY' ||
        detections.violence === 'VERY_LIKELY'
      ) {
        console.log(`Detected ${file.name} as inappropriate.`)
        await blurImage(file, object.bucket, object.metadata)
        console.log('Deleted local file', file)
        return null
      } else {
        console.log(`Detected ${file.name} as OK.`)
      }
    } catch (err) {
      console.error(`Failed to analyze ${file.name}.`, err)
      throw err
    }
  })
async function blurImage(file, bucketName, metadata) {
  const tempLocalPath = `/tmp/${path.parse(file.name).base}`
  const bucket = gcs.bucket(bucketName)

  await file.download({ destination: tempLocalPath })
  console.log('The file has been downloaded to', tempLocalPath)

  // Blur the image using ImageMagick.
  await new Promise((resolve, reject) => {
    gm(tempLocalPath)
      .blur(0, 20)
      .write(tempLocalPath, (err, stdout) => {
        if (err) {
          console.error('Failed to blur image.', err);
          reject(err);
        } else {
          console.log(`Blurred image: ${file.name}`);
          resolve(stdout);
        }
      });
  });
  console.log('Blurred image created at', tempLocalPath)

  await bucket.upload(tempLocalPath, {
    destination: file.name,
    metadata: { metadata: metadata },
  })
  console.log('Blurred image uploaded to Storage at', file)

  return fs.unlink(tempLocalPath, (e) => { if (e) { console.log(e) } })
}
And it worked perfectly, with one bad issue.
Sometimes when a user sends a list of photos, I get the "application/octet-stream" file type, but it should be "image/jpg"; all media files in my project should be image/jpg.
(Screenshot: one user's publication with an error in the image data type.)
It looks like this trigger gets stuck while executing.
I added a delay to image uploading in my project, but it doesn't help.
I tested it: when I delete this trigger, all uploaded photos are fine with no issues at all.
Help me fix it.
P.S. I also want to note that after uploading, the image should keep all its data like the original (destination, name, etc.).
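One thing worth checking, purely as a guess: the blurred file is re-uploaded with bucket.upload, and if the original content type is not forwarded, Cloud Storage may fall back to application/octet-stream. A minimal sketch of forwarding it, assuming the trigger's object.contentType is passed into blurImage along with the other arguments:

await bucket.upload(tempLocalPath, {
  destination: file.name,
  metadata: {
    // hypothetical: contentType forwarded from the trigger's object.contentType
    contentType: contentType,
    metadata: metadata,
  },
})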

How to control multiple upload streams with Apollo upload server? Like making a buffer?

I am trying to upload multiple images using Apollo Server.
I succeeded in using createWriteStream to write the stream to a file,
but I don't need to make a file, I just need a buffer/string.
But this never returns a buffer or string.
Please see my code:
type Mutation {
  createProduct(product_name: String!, product_info: String!, price: Int!, sale: Int, files: [Upload!]): Result!
}
This is my mutation.
createProduct(parent, { product_name, product_info, price, sale, files }) {
  db.Product.create({
    product_name, product_info, price, sale
  }).then((product) => {
    storeImages(files, product.id)
      .then((result) => {
        console.log(result)
      })
  })
}
And this is my resolver.
const outStream = new Writable({
  write(chunk, encoding, callback) {
    console.log(chunk)
  },
})

const inStream = new Readable({
  read(size) {
    console.log(size)
  }
})

const test2 = (chunk) => {
  console.log(chunk)
  outStream.destroy();
}

function test() {
  let pass = new PassThrough();
  console.log(pass)
  console.log('pass!!')
  return 'end';
}

const processUpload = async upload => {
  const { stream, filename, mimetype } = await upload;
  console.log('stream')
  console.log(upload)
  // const { id, path } = await storeFS({ stream, filename })
  // return storeDB({ id, filename, mimetype, path })
}

const lastStream = (chunk) => {
  console.log(chunk)
}

const storeImages = (files, product_id) => {
  return new Promise(async (resolve, reject) => {
    let save = []

    files.forEach(async (image, index) => {
      const { stream } = await image;

      stream
        .pipe(outStream)
        // .pipe(inStream)
    })

    resolve(save)
  })
}
Finally, that is what I've tried; to summarize:
How can I use the Apollo upload server to take the streams of multiple files into a buffer (blob) and control them?
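One approach, sketched under the assumption that the goal is an in-memory Buffer per file rather than a file on disk, is the same chunk-collecting pattern used in the first answer above (depending on the graphql-upload version, the resolved upload exposes either stream or createReadStream()):

const streamToBuffer = (stream) =>
  new Promise((resolve, reject) => {
    const chunks = [];
    stream.on('data', (chunk) => chunks.push(Buffer.from(chunk)));
    stream.on('error', reject);
    stream.on('end', () => resolve(Buffer.concat(chunks)));
  });

const storeImages = (files, product_id) =>
  Promise.all(
    files.map(async (image) => {
      const { stream } = await image;
      return streamToBuffer(stream);
    })
  );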
