How to control multiple upload streams with Apollo upload server, e.g. buffer them? - node.js

I am trying to upload multiple images using Apollo upload server.
I succeeded in using createWriteStream to write the stream to a file,
but I don't need to make a file, I just need the data in a buffer.
However, I never get a buffer back.
Please see my code.
type Mutation {
  createProduct(product_name: String!, product_info: String!, price: Int!, sale: Int, files: [Upload!]): Result!
}
This is my Mutation type.
createProduct(parent, { product_name, product_info, price, sale, files }) {
  db.Product.create({
    product_name, product_info, price, sale
  }).then((product) => {
    storeImages(files, product.id)
      .then((result) => {
        console.log(result)
      })
  })
}
And this is my resolver. Below are the stream helpers I have been experimenting with:
const { Writable, Readable, PassThrough } = require('stream')

const outStream = new Writable({
  write(chunk, encoding, callback) {
    console.log(chunk)
    callback() // without this the Writable stalls after the first chunk
  },
})

const inStream = new Readable({
  read(size) {
    console.log(size)
  }
})
const test2 = (chunk) => {
  console.log(chunk)
  outStream.destroy();
}

function test() {
  let pass = new PassThrough();
  console.log(pass)
  console.log('pass!!')
  return 'end';
}

const processUpload = async upload => {
  const { stream, filename, mimetype } = await upload;
  console.log('stream')
  console.log(upload)
  // const { id, path } = await storeFS({ stream, filename })
  // return storeDB({ id, filename, mimetype, path })
}

const lastStream = (chunk) => {
  console.log(chunk)
}
const storeImages = (files, product_id) => {
  return new Promise(async (resolve, reject) => {
    let save = []
    files.forEach(async (image, index) => {
      const { stream } = await image;
      stream
        .pipe(outStream)
        // .pipe(inStream)
    })
    resolve(save)
  })
}
That is everything I have tried so far, to summarize.
How can I use Apollo upload server to take the streams of multiple files into buffers (blobs) and control them?
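To illustrate, here is a rough sketch of what I am after, collecting each upload's stream into an in-memory Buffer instead of piping it to a file (assuming the same { stream } shape as in storeImages above):

const streamToBuffer = (stream) =>
  new Promise((resolve, reject) => {
    const chunks = []
    stream.on('data', (chunk) => chunks.push(chunk))
    stream.on('error', reject)
    stream.on('end', () => resolve(Buffer.concat(chunks)))
  })

const storeImages = (files, product_id) =>
  Promise.all(files.map(async (image) => {
    const { stream } = await image
    return streamToBuffer(stream) // one Buffer per uploaded file
  }))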

Related

Incorrect header check while trying to uncompress an S3 object body?

I compress and upload an object to S3 using the following code:
let data: string | Buffer = JSON.stringify(rules);
let contentType = "application/json";
let encoding = null;
let filename = `redirector-rules.json`;
if (format === "gz") {
  contentType = "application/gzip";
  encoding = "gzip";
  filename = `redirector-rules.gz`;
  const buf = Buffer.from(data, "utf-8");
  data = zlib.gzipSync(buf);
}
// res.end(data);
// return res.status(200).send(data);
await s3.upload(filename, data, contentType, encoding);
I am assuming this is working correctly, since when I download the resulting file using the aws s3 cp command it works just fine and I am able to uncompress it on my machine. Additionally, and possibly unrelated, if I download it via the S3 console, my system is unable to uncompress it; the file may be corrupt or truncated.
On the other end I have Lambda code that gets the object and attempts to decompress it:
const getRules = async (rulesCommand: GetObjectCommand): Promise<Config> => {
  const resp = await fetchRulesFile(rulesCommand);
  const data = await parseResponse(resp, rulesCommand);
  return data;
};

const fetchRulesFile = async (rulesCommand: GetObjectCommand): Promise<GetObjectCommandOutput> => {
  try {
    console.log(`Retrieving rules file with name ${rulesCommand.input.Key}`);
    const resp = await client.send(rulesCommand);
    return resp;
  } catch (err) {
    throw new Error(`Error retrieving rules file: ${err}`);
  }
};

const parseResponse = async (resp: GetObjectCommandOutput, rulesCommand: GetObjectCommand): Promise<Config> => {
  const { Body } = resp;
  if (!Body) {
    throw new Error("No body in response");
  }
  let data: string = await Body.transformToString();
  if (rulesCommand.input.Key?.endsWith(".gz")) {
    console.log(`Uncompressing rules file with name ${rulesCommand.input.Key}`);
    try {
      data = zlib.gunzipSync(data).toString("utf-8");
    } catch (err) {
      throw new Error(`Error decompressing rules file: ${err}`);
    }
  }
  return JSON.parse(data) as Config;
};
But I keep getting this error: Error: incorrect header check
I resolved the issue by using Readable and streams in the parseResponse function:
const parseResponse = async (
  resp: GetObjectCommandOutput,
  rulesCommand: GetObjectCommand
): Promise<Config> => {
  const { Body } = resp;
  if (!Body) {
    throw new Error("No body in response");
  }
  let data = "";
  const readableStream = new Readable();
  readableStream._read = () => {}; // noop
  // @ts-ignore
  Body.on("data", (chunk: any) => {
    readableStream.push(chunk);
    data += chunk;
  });
  // @ts-ignore
  Body.on("end", () => {
    readableStream.push(null);
  });
  const gunzip = zlib.createGunzip();
  const result = await new Promise((resolve, reject) => {
    let buffer = "";
    readableStream.pipe(gunzip);
    gunzip.on("data", (chunk) => {
      buffer += chunk;
    });
    gunzip.on("end", () => {
      resolve(buffer);
    });
    gunzip.on("error", reject);
  });
  // the decompressed payload is a JSON string, so parse it before returning
  return JSON.parse(result as string) as Config;
};
I had to add @ts-ignore at Body.on because of a type mismatch, but it still worked in the compiled JS handler, and fixing it with a proper type conversion seemed a bit complex.
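For what it's worth, the original failure is consistent with Body.transformToString() decoding the gzipped bytes as UTF-8 text, which corrupts them before gunzip ever sees them. If the SDK version in use exposes Body.transformToByteArray() (an assumption about the @aws-sdk/client-s3 version), a shorter sketch would be:

const parseResponse = async (
  resp: GetObjectCommandOutput,
  rulesCommand: GetObjectCommand
): Promise<Config> => {
  const { Body } = resp;
  if (!Body) {
    throw new Error("No body in response");
  }
  // read raw bytes instead of decoding the body as a UTF-8 string
  const bytes = await Body.transformToByteArray();
  const raw = rulesCommand.input.Key?.endsWith(".gz")
    ? zlib.gunzipSync(Buffer.from(bytes)).toString("utf-8")
    : Buffer.from(bytes).toString("utf-8");
  return JSON.parse(raw) as Config;
};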

Error when detecting the type of a file retrieved with axios and converted to a buffer

I have a problem detecting the type of a file.
I fetch the file with axios, but when I try to detect the type of the file I get undefined.
My code:
import axios from 'axios';
import fs from 'fs';

export const uploadFileToServer = async (path: string, feedLocation: string, fileName: string) => {
  // @ts-ignore
  let uploadPath = `./storage/${path}/${fileName}.csv`;
  const { fileTypeFromBuffer } = await (eval('import("file-type")') as Promise<typeof import('file-type')>);
  const response = await axios.get(feedLocation, { responseType: "arraybuffer" });
  const buffer = Buffer.from(response.data, "binary");
  const test = await fileTypeFromBuffer(buffer);
  console.log("TEST", test);
  fs.writeFile(uploadPath, buffer, (err) => {
    if (!err) console.log('Data written');
  });
};
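If the feed being fetched is a plain-text format such as CSV, undefined is actually the expected result here: file-type works by inspecting magic bytes, and text-based formats like CSV have no binary signature, so fileTypeFromBuffer resolves to undefined. A small sketch of handling that case (the text/csv fallback is just an assumption about this feed):

const detected = await fileTypeFromBuffer(buffer);
if (detected) {
  console.log(`Detected ${detected.ext} (${detected.mime})`);
} else {
  // No binary signature found; fall back for text-based feeds such as CSV.
  console.log('No magic bytes detected, assuming text/csv');
}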

How to upload any type of file to aws s3 bucket?

I am trying to upload files like docs, ppts, etc. I have a front end in React and my upload function looks like this:
const reader = new FileReader()
const toBase64 = (file) =>
  new Promise((resolve, reject) => {
    reader.readAsDataURL(file);
    reader.onload = () => resolve(reader.result);
    reader.onerror = (error) => reject(error);
  });
const UploadMinues = async (event: any) => {
  console.log(event, 'event from frontend');
  if (event.target && event?.target?.files[0]) {
    try {
      await toBase64(event.target.files[0]);
      console.log(reader.result, 'reader.result');
      const res = (await API.graphql({
        query: `mutation MyMutation {
          updateMinutes(input: { projectId:"${event.target.id}", adviserId: "${user.username}", file: "${reader.result}", fileName: "${event.target.files[0].name}", fileType: "${event.target.files[0].type}"}) {
            minutesKey
          }
        }`,
        authMode: GRAPHQL_AUTH_MODE.AMAZON_COGNITO_USER_POOLS
      })) as any;
    } catch (e) {
      console.log('UpdateImage_Error==>', e);
      setMinutesErr(e.message);
      setOpenAlert(true);
    }
  } else {
    console.log('errorrrrrrrrrrr');
    return;
  }
};
And on the back end I have a Lambda function like this:
const AWS = require('aws-sdk');
const docClient = new AWS.DynamoDB.DocumentClient();

export async function updateMinutes(data: any) {
  let { adviserId, projectId, file, fileName, fileType } = data;
  console.log(data, "received from front end")
  let s3bucket = new AWS.S3({ params: { Bucket: `${process.env.S3_BUCKET_NAME}` } });
  try {
    // const buf = Buffer.from(file.replace(/^data:application\/\w+;base64,/, ""), 'base64')
    let params_upload = {
      Key: `minutes/${adviserId}/${projectId}/${fileName}`,
      Body: Buffer.from(file, "base64"),
      ContentType: fileType,
      CacheControl: 'max-age=86400'
    };
    const minutes_save = await s3bucket.upload(params_upload).promise()
    const minutesKey = minutes_save.Key
    let params = {
      TableName: process.env.CONSULTATION_TABLE,
      Key: {
        adviserId: adviserId,
        projectId: projectId,
      },
      UpdateExpression: `set minutes = :edu`,
      ExpressionAttributeValues: { ':edu': [minutesKey] }
    }
    // renamed from `data` to avoid redeclaring the function parameter
    const updateResult = await docClient.update(params).promise()
    return {
      minutesKey: minutesKey
    }
  } catch (err) {
    console.log(err, "IMAGE_UPLOAD_ERROR")
  }
}
The file gets uploaded to the S3 bucket, but when I open it, it is in some symbols format. Could someone please explain what I am doing wrong here? The same approach works fine when I upload a PDF or an image, but not with doc or Excel files.
My input looks like this:
<Input
  id={data.projectId}
  name={data.projectId}
  onChange={UploadMinues}
  accept="application/*"
  multiple
  type="file"
/>
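A likely culprit, given the commented-out regex in the Lambda, is the data URL prefix: reader.readAsDataURL produces a string of the form data:<mime>;base64,<payload>, and a pattern like /^data:application\/\w+;base64,/ does not match Office MIME types such as application/vnd.openxmlformats-officedocument.wordprocessingml.document, so the prefix is left in place and gets decoded along with the payload. A sketch of stripping everything up to the first comma before decoding (assuming the frontend keeps sending the full data URL as above):

// Lambda side: strip any data URL prefix, regardless of MIME type, before decoding.
const base64Payload = file.includes(',') ? file.slice(file.indexOf(',') + 1) : file;
let params_upload = {
  Key: `minutes/${adviserId}/${projectId}/${fileName}`,
  Body: Buffer.from(base64Payload, 'base64'),
  ContentType: fileType,
  CacheControl: 'max-age=86400'
};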

How to Process Uploaded Image with Graphql Apollo with SharpJS in NodeJS?

I have a GraphQL mutation that gets an image from the frontend, which is then processed and optimized on my server.
But I can't figure out how to pass my image to sharp.
Here is my code:
const Mutation = {
  createImage: async (_, { data }) => {
    const { file } = data
    const image = await file
    console.log(image)
    const sharpImage = sharp(image)
  }
}
I know the code doesn't work and sharp throws an error saying that the input is invalid. So how can I work with createReadStream to create an instance of sharp?
When I console.log(image), here is what I see:
image {
  filename: 'image.png',
  mimetype: 'image/png',
  encoding: '7bit',
  createReadStream: [Function: createReadStream]
}
Thanks a lot in advance!
So I figured out the solution to my question.
First, I found out that I needed to add scalar Upload to my typeDefs.
Then, I needed to add a resolver for Upload like this:
const { GraphQLUpload } = require('graphql-upload');

const server = new ApolloServer({
  resolvers: {
    Upload: GraphQLUpload,
  }
})
Then in my resolver, here is what I had to do:
// this is a utility function to promisify the stream and store the image in a buffer, which then is passed to sharp
const streamToString = (stream) => {
  const chunks = [];
  return new Promise((resolve, reject) => {
    stream.on('data', (chunk) => chunks.push(Buffer.from(chunk)));
    stream.on('error', (err) => reject(err));
    stream.on('end', () => resolve(Buffer.concat(chunks)));
  })
}

const Mutation = {
  createImage: async (_, { data }) => {
    const { file } = data
    const { createReadStream } = await file
    const image = await streamToString(createReadStream())
    const sharpImage = sharp(image)
  }
}
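As a small follow-up, once the buffer is in hand the sharp instance can be chained as usual. For example (the 800px width and webp settings are arbitrary, just for illustration, not part of the original answer):

const Mutation = {
  createImage: async (_, { data }) => {
    const { file } = data
    const { createReadStream } = await file
    const image = await streamToString(createReadStream())
    // chain whatever processing is needed, then materialize the result
    const optimized = await sharp(image)
      .resize({ width: 800, withoutEnlargement: true })
      .webp({ quality: 80 })
      .toBuffer()
    // store or return `optimized` as your schema requires
  }
}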

Transform stream coming from S3 buckets

Here is my function to get the contents of multiple objects and merge them into one single file:
const getAllObjectsInRegions = (config, keys) => {
  keys.forEach((key) => {
    const params = {
      Bucket: `${config.metadata.s3.bucket}`,
      Key: key
    }
    const file = fs.createWriteStream('./out.txt')
    return s3.getObject(params)
      .createReadStream()
      .pipe(file)
  })
}
Basically, the object that I read is just a JSON file. Do you have any idea how to transform the content of the object before writing it to the file? I need to extract some properties instead of writing the entire file. Thank you.
EDIT:
const getAllObjectsInRegions = (config, keys) => {
  const promise = new Promise((resolve, reject) => {
    keys.forEach((key) => {
      const params = {
        Bucket: `${config.metadata.s3.bucket}`,
        Key: key
      }
      const readStream = s3.getObject(params).createReadStream()
      readStream.on('data', (data) => {
        let playlist
        try {
          playlist = JSON.parse(data)
        } catch(err) {
          reject(err)
        }
        const slicedPlaylist = _.pick(playlist, ['contentKey', 'title', 'text'])
        fs.writeFile('./out.txt', slicedPlaylist, (err) => {
          if (err) {
            reject(err)
          }
          resolve()
        })
      })
    })
  })
  return promise
}
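One caveat with the edit above: the S3 read stream can emit the JSON across several 'data' events, so JSON.parse on a single chunk may throw for larger objects, and fs.writeFile expects a string or Buffer rather than the object returned by _.pick. A sketch that buffers each object fully, extracts the properties, and appends one JSON line per key (keeping the same ./out.txt path and picked fields as above):

const getAllObjectsInRegions = (config, keys) => {
  return Promise.all(keys.map((key) => new Promise((resolve, reject) => {
    const params = {
      Bucket: `${config.metadata.s3.bucket}`,
      Key: key
    }
    const chunks = []
    s3.getObject(params).createReadStream()
      .on('data', (chunk) => chunks.push(chunk))
      .on('error', reject)
      .on('end', () => {
        try {
          const playlist = JSON.parse(Buffer.concat(chunks).toString('utf-8'))
          const slicedPlaylist = _.pick(playlist, ['contentKey', 'title', 'text'])
          // append one JSON line per object so all keys end up merged in a single file
          fs.appendFile('./out.txt', JSON.stringify(slicedPlaylist) + '\n', (err) => {
            if (err) return reject(err)
            resolve()
          })
        } catch (err) {
          reject(err)
        }
      })
  })))
}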
