I am getting this error while uploading a file from Postman:
(node:13648) [DEP0135] DeprecationWarning: ReadStream.prototype.open() is deprecated
My Node version is 15.0. I'm using apollo-server-express, and this is my code:
export const processUpload = async (file) => {
  // assumes: import fs from 'fs'; and a uuid() helper (e.g. v4 from the 'uuid' package)
  const {
    createReadStream, mimetype, encoding, filename
  } = await file;
  const path = `uploads/${uuid()}${filename}`;
  const stream = createReadStream();
  return new Promise((resolve, reject) => {
    stream
      .pipe(fs.createWriteStream(path))
      .on('finish', () => {
        resolve({
          success: true,
          message: 'Successfully Uploaded',
          mimetype,
          filename,
          encoding,
          location: path
        });
      })
      .on('error', (err) => {
        console.log('Error Event Emitted', err);
        reject(err);
      });
  });
};
Your Node version is high; please add this to your package.json file (note that the resolutions field below is honored by Yarn):
"resolutions": {
  "**/**/fs-capacitor": "^6.2.0",
  "**/graphql-upload": "^11.0.0"
}
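If you are using npm (8.3 or newer) rather than Yarn, a comparable pin can be expressed with the overrides field; this is only a sketch under that assumption, using the same version ranges:
"overrides": {
  "fs-capacitor": "^6.2.0",
  "graphql-upload": "^11.0.0"
}
Either way, reinstall your dependencies afterwards so the lockfile picks up the newer fs-capacitor, which should no longer trigger the ReadStream.prototype.open() deprecation warning.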
I'm trying to download images from a Google shared drive using the API v3. The download itself succeeds, but the image can't be viewed. Opening the image from the macOS Finder just results in a spinner.
I started using the example from the documentation (here: https://developers.google.com/drive/api/v3/manage-downloads):
const drive = google.drive({version: 'v3', auth});
// ....
var fileId = '0BwwA4oUTeiV1UVNwOHItT0xfa2M';
var dest = fs.createWriteStream('/tmp/photo.jpg');
drive.files.get({
    fileId: fileId,
    alt: 'media'
  })
  .on('end', function () {
    console.log('Done');
  })
  .on('error', function (err) {
    console.log('Error during download', err);
  })
  .pipe(dest);
However that fails because the .on() method doesn't exist. The exact error is "TypeError: drive.files.get(...).on is not a function".
The .get() method returns a promise. The response of the promise contains data that, depending on the config, is either a stream, a blob, or an arraybuffer. For all options, when I write the response data to a file, the file itself becomes unviewable and has the wrong size. The actual code (TypeScript, Node.js) for the arraybuffer example is below. Similar code for blob (with added name and modifiedDate) and for stream gives the same result.
const downloader = googleDrive.files.get({
  fileId: file.id,
  alt: 'media',
}, {
  responseType: 'arraybuffer',
});
return downloader
  .then((response) => {
    const targetFile = file.id + '.' + file.extension;
    fs.writeFileSync(targetFile, response.data);
    return response.status;
  })
  .catch((response) => {
    logger.error('Error in Google Drive service download: ' + response.message);
    return response.message;
  });
}
So the questions are:
what is the correct way to handle a download through the Google Drive API v3?
do I need to handle any formatting of the response data?
All help greatly appreciated!
Thanks
You want to download a file from Google Drive using googleapis with Node.js.
You have already been able to use the Drive API.
If my understanding is correct, how about this answer?
Pattern 1:
In this pattern, arraybuffer is used for responseType.
Sample script:
const drive = google.drive({ version: "v3", auth });
var fileId = '###'; // Please set the file ID.
drive.files.get(
  {
    fileId: fileId,
    alt: "media"
  },
  { responseType: "arraybuffer" },
  function(err, { data }) {
    fs.writeFile("sample.jpg", Buffer.from(data), err => {
      if (err) console.log(err);
    });
  }
);
In this case, Buffer.from() is used.
Pattern 2:
In this pattern, stream is used for responseType.
Sample script:
const drive = google.drive({ version: "v3", auth });
var fileId = '###'; // Please set the file ID.
var dest = fs.createWriteStream("sample.jpg");
drive.files.get(
  {
    fileId: fileId,
    alt: "media"
  },
  { responseType: "stream" },
  function(err, { data }) {
    data
      .on("end", () => {
        console.log("Done");
      })
      .on("error", err => {
        console.log("Error during download", err);
      })
      .pipe(dest);
  }
);
Note:
If an error occurs, please use the latest version of googleapis.
From your question, it seems that you have already been able to retrieve the file you want to download with your request, while the file content cannot be opened. But if an error occurs, please try adding supportsAllDrives: true and/or supportsTeamDrives: true to the request.
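As a rough sketch of where those flags go (supportsAllDrives is a standard Drive v3 request parameter; the rest of the call mirrors Pattern 2 above):
drive.files.get(
  {
    fileId: fileId,
    alt: "media",
    supportsAllDrives: true // needed when the file lives on a shared drive
  },
  { responseType: "stream" },
  function(err, res) {
    if (err) return console.log(err);
    res.data.pipe(dest);
  }
);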
References:
Download files
google-api-nodejs-client/samples/drive/download.js
If I misunderstood your question and this was not the direction you want, I apologize.
Posting a third pattern for completeness, using async/await and including shared drive (Team Drive) files.
// assumes: import fs from "fs"; and pipeline from "stream/promises" (or util.promisify(stream.pipeline));
// Drive and Schema$File come from the googleapis typings.
async function downloadFile(drive: Drive, file: Schema$File, localDir: string = "/tmp/downloads") {
  if (!fs.existsSync(localDir)) {
    fs.mkdirSync(localDir);
  }
  const outputStream = fs.createWriteStream(`${localDir}/${file.name}`);
  const { data } = await drive.files.get({
    corpora: 'drive',
    includeItemsFromAllDrives: true,
    supportsAllDrives: true,
    fileId: file.id,
    alt: "media",
  }, {
    responseType: 'stream',
  });
  await pipeline(data, outputStream);
  console.log(`Downloaded file: ${localDir}/${file.name}`);
}
If someone is looking for a solution in 2023, here you go!
const downloadFile = async (file) => {
  const dirPath = path.join(process.cwd(), '/images');
  if (!fs.existsSync(dirPath)) {
    fs.mkdirSync(dirPath, { recursive: true });
  }
  const filePath = `${dirPath}/${file.name}.jpg`;
  const destinationStream = fs.createWriteStream(filePath);
  try {
    // getService() is a helper (defined elsewhere) that returns an authorized Drive v3 client.
    const service = await getService();
    const { data } = await service.files.get(
      { fileId: file.id, alt: 'media' },
      { responseType: 'stream' }
    );
    return new Promise((resolve, reject) => {
      data
        .on('end', () => {
          console.log('Done downloading file.');
          resolve(filePath);
        })
        .on('error', (err) => {
          console.error('Error downloading file.');
          reject(err);
        })
        .pipe(destinationStream);
    });
  } catch (error) {
    throw error;
  }
};
I have an endpoint that receives files and creates a background task for uploading those files to S3.
In order to background the file uploads, I'm using Agenda (https://github.com/agenda/agenda). The only limitation is that I need to store the file in a format that is supported by MongoDB (which is what Agenda uses under the hood). To do that, I am converting the file to a buffer before sending it over to Agenda.
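For context, the "uploadProgressPic" job referenced below would be consumed by a worker registered with Agenda.define; that worker is not shown in the question, so this is only a hypothetical sketch of what it might look like:
// Hypothetical worker; the real definition is not part of the question.
Agenda.define("uploadProgressPic", async (job) => {
  const { userId, progressPicId, filename, buffer } = job.attrs.data;
  // Buffers round-trip through MongoDB as BSON Binary, so unwrap if needed.
  const body = Buffer.isBuffer(buffer) ? buffer : buffer.buffer;
  // S3.upload here is the author's helper (it appears later with { stream, folder, filename }).
  await S3.upload({ stream: body, folder: userId, filename });
  console.log(`Uploaded ${filename} for progress pic ${progressPicId}`);
});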
This is my code:
Mutation: {
  batchCreateProgressPics: combineResolvers(
    isAuthenticated,
    async (parent, { pics }, { models, currentUser }) => {
      return await Promise.all(
        pics.map(async (pic, i) => {
          const { file, bodyPart, localPath } = pic;
          const { createReadStream } = await file;
          const stream = createReadStream();
          console.log("Setting up buffer...");
          const buffer = await new Promise((resolve, reject) => {
            var buffers = [];
            stream.on("data", function(data) {
              buffers.push(data);
            });
            stream.on("end", function() {
              const everything = Buffer.concat(buffers);
              resolve(everything);
            });
            stream.on("error", function(e) {
              reject(e);
            });
          });
          const progressPic = await models.ProgressPic.create({
            bodyPart,
            user: currentUser.id,
            url: localPath,
          });
          console.log("Creating background task...");
          Agenda.now("uploadProgressPic", {
            userId: currentUser.id,
            progressPicId: progressPic.id,
            filename: `${progressPic.id}-${bodyPart}.jpg`,
            buffer,
          });
          console.log("Done.");
          return progressPic;
        })
      );
    }
  ),
},
This is fast on my local development server, but it takes a long time to run in production because of the buffer logic. The lines following console.log("Setting up buffer...") are what take a long time.
What I would like to do is:
Create and return an array of progressPics, one for each element in the pics array
Do the buffer stuff after the response has been sent so it doesn't hold up the front end.
Is this possible?
============ UPDATE ==========
So if I do not await the promise, it complains that the request disconnected before the buffer finished:
const uploadProgressPic = async ({ file, progressPicId, userId, bodyPart }) => {
  try {
    const { createReadStream } = await file;
    const stream = createReadStream();
    console.log("Setting up buffer...");
    const buffer = await new Promise((resolve, reject) => {
      var buffers = [];
      stream.on("data", function(data) {
        buffers.push(data);
      });
      stream.on("end", function() {
        const everything = Buffer.concat(buffers);
        resolve(everything);
      });
      stream.on("error", function(e) {
        reject(e);
      });
    });
    console.log("Done.");
    console.log("Creating background task...");
    Agenda.now("uploadProgressPic", {
      userId,
      progressPicId,
      filename: `${progressPicId}-${bodyPart}.jpg`,
      buffer,
    });
  } catch (error) {
    console.log("ERROR OCCURRED: ", error);
  }
};
export default {
  Mutation: {
    batchCreateProgressPics: combineResolvers(
      isAuthenticated,
      async (parent, { pics }, { models, currentUser }) => {
        return pics.map(async (pic, i) => {
          const { file, bodyPart, localPath } = pic;
          const progressPic = await models.ProgressPic.create({
            bodyPart,
            user: currentUser.id,
            url: localPath,
          });
          uploadProgressPic({
            file,
            progressPicId: progressPic.id,
            userId: currentUser.id,
            bodyPart,
          });
          return progressPic;
        });
      }
    ),
  },
};
Error:
ERROR OCCURRED: BadRequestError: Request disconnected during file upload stream parsing.
at IncomingMessage.<anonymous> (/Users/edmundmai/Documents/src/acne-tracker/server/node_modules/graphql-upload/lib/processRequest.js:300:35)
at Object.onceWrapper (events.js:291:20)
at IncomingMessage.emit (events.js:203:13)
at IncomingMessage.EventEmitter.emit (domain.js:471:20)
at resOnFinish (_http_server.js:614:7)
at ServerResponse.emit (events.js:208:15)
at ServerResponse.EventEmitter.emit (domain.js:471:20)
at onFinish (_http_outgoing.js:649:10)
at onCorkedFinish (_stream_writable.js:678:5)
at afterWrite (_stream_writable.js:483:3)
at processTicksAndRejections (internal/process/task_queues.js:77:11) {
message: 'Request disconnected during file upload stream parsing.',
expose: true,
statusCode: 499,
status: 499
}
========== UPDATE 2 =============
Even trying to 1) simplify it and 2) move createReadStream() outside of uploadProgressPic shows the same error:
const uploadProgressPic = async ({
  stream,
  progressPicId,
  userId,
  bodyPart,
  models,
}) => {
  try {
    console.log("Uploading to S3...");
    const { Location: url, Key: key, Bucket: bucket } = await S3.upload({
      stream,
      folder: userId,
      filename: `${progressPicId}-${bodyPart}.jpg`,
    });
    if (url && key && bucket) {
      await models.ProgressPic.findOneAndUpdate(
        { _id: progressPicId },
        { $set: { url, key, bucket } },
        { new: true, useFindAndModify: false }
      );
      console.log("Done!");
    }
  } catch (error) {
    console.log("ERROR OCCURRED: ", error);
  }
};

export default {
  Mutation: {
    batchCreateProgressPics: combineResolvers(
      isAuthenticated,
      async (parent, { pics }, { models, currentUser }) => {
        return pics.map(async (pic, i) => {
          const { file, bodyPart, localPath } = pic;
          const progressPic = await models.ProgressPic.create({
            bodyPart,
            user: currentUser.id,
            url: localPath,
          });
          const { createReadStream } = await file;
          const stream = createReadStream();
          uploadProgressPic({
            stream,
            progressPicId: progressPic.id,
            userId: currentUser.id,
            bodyPart,
            models,
          });
          return progressPic;
        });
      }
    ),
  },
};
Error:
Uploading to S3...
Uploading to S3...
Uploading to S3...
ERROR OCCURRED: BadRequestError: Request disconnected during file upload stream parsing.
at IncomingMessage.<anonymous> (/Users/edmundmai/Documents/src/acne-tracker/server/node_modules/graphql-upload/lib/processRequest.js:300:35)
at Object.onceWrapper (events.js:291:20)
at IncomingMessage.emit (events.js:203:13)
at IncomingMessage.EventEmitter.emit (domain.js:471:20)
at resOnFinish (_http_server.js:614:7)
at ServerResponse.emit (events.js:208:15)
at ServerResponse.EventEmitter.emit (domain.js:471:20)
at onFinish (_http_outgoing.js:649:10)
at onCorkedFinish (_stream_writable.js:678:5)
at afterWrite (_stream_writable.js:483:3)
at processTicksAndRejections (internal/process/task_queues.js:77:11) {
message: 'Request disconnected during file upload stream parsing.',
expose: true,
statusCode: 499,
status: 499
}
Done!
Funny thing is I still see a few Done!s in the logs even though it complains?
Not an expert on the subject, but I have an idea that may work, and a theory:
IDEA: If you're dealing with a big number of images, then your problem may originate from the await Promise.all(). I recommend that you use parallelLimit from async to limit how many functions run in parallel at a time; otherwise you will have a performance problem (see the sketch after this answer).
THEORY: Maybe you can free the memory allocated for each Buffer after use to avoid memory leak problems and make your server more performant.
If I am wrong in any way, please correct me. I am myself interested in the outcome of this problem.
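For illustration only, here is a minimal sketch of parallelLimit from the async package; the limit of 3 and the task bodies are placeholders, not values from the question:
const async = require("async");

// One task per pic; each task handles a single upload.
const tasks = pics.map((pic) => async () => {
  // ... buffer the stream and call Agenda.now(...) for this pic ...
});

// Run at most 3 tasks at a time instead of launching them all at once.
async.parallelLimit(tasks, 3, (err, results) => {
  if (err) console.error(err);
});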
Don't await the Promise.
new Promise((resolve, reject) => {
  var buffers = [];
  stream.on("data", function(data) {
    buffers.push(data);
  });
  stream.on("end", function() {
    const everything = Buffer.concat(buffers);
    resolve(everything);
  });
  stream.on("error", function(e) {
    reject(e);
  });
}).then((buffer) => {
  Agenda.now("uploadProgressPic", {
    userId: currentUser.id,
    progressPicId: progressPic.id,
    filename: `${progressPic.id}-${bodyPart}.jpg`,
    buffer,
  });
}).catch((error) => {
  // Clean up here
});

return models.ProgressPic.create({
  bodyPart,
  user: currentUser.id,
  url: localPath,
});
This way, you'll kick off creating the buffers, but won't actually wait for that code to execute, and will immediately create the ProgressPic instance and return it. Because the call to Agenda.now requires the resolved value of the Promise, we stick it inside the then callback. Note that it's important to append a catch as well; if you don't, you could end up with an unhandled rejection.
You may want to use the catch callback to log the error and do any additional cleanup. For example, you may want to delete the created ProgressPic (in which case, you should move the create call above the buffer Promise so you can reference the created instance), as sketched below.
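A rough sketch of that reordering, using the same objects as above (the deleteOne cleanup is illustrative, not from the question):
const progressPic = await models.ProgressPic.create({
  bodyPart,
  user: currentUser.id,
  url: localPath,
});

// Kick off buffering without awaiting it.
new Promise((resolve, reject) => {
  const buffers = [];
  stream.on("data", (data) => buffers.push(data));
  stream.on("end", () => resolve(Buffer.concat(buffers)));
  stream.on("error", reject);
})
  .then((buffer) => {
    Agenda.now("uploadProgressPic", {
      userId: currentUser.id,
      progressPicId: progressPic.id,
      filename: `${progressPic.id}-${bodyPart}.jpg`,
      buffer,
    });
  })
  .catch((error) => {
    console.error(error);
    // Clean up the record we optimistically created.
    return models.ProgressPic.deleteOne({ _id: progressPic.id });
  });

return progressPic;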
If you're like me and die a little bit on the inside each time you have to type then, you can extract all that logic into a separate function:
const uploadProgressPic = async (/* parameters omitted for brevity */) => {
  try {
    const buffer = await new Promise(...);
    Agenda.now(...);
  } catch (error) {
    // Do whatever
  }
};
and then call it inside your resolver, again, without awaiting it:
uploadProgressPic();

return models.ProgressPic.create({
  bodyPart,
  user: currentUser.id,
  url: localPath,
});
I tried a variety of things that ended up not working because creating the buffer was just too slow in production for some reason. My ultimate solution that actually works was to split up the upload into two requests:
Backend:
Request #1: Create a progress pic, using the local file path as the URL
Request #2: Upload the file and update the progress pic
import { combineResolvers } from "graphql-resolvers";
import { isAuthenticated } from "./authorization";
import S3 from "../services/s3";

export default {
  Query: {
    progressPics: combineResolvers(
      isAuthenticated,
      async (parent, args, { models, currentUser }) => {
        return await models.ProgressPic.find({ user: currentUser.id });
      }
    ),
  },
  Mutation: {
    createProgressPics: combineResolvers(
      isAuthenticated,
      async (parent, { pics }, { models, currentUser }) => {
        return pics.map(async (pic, i) => {
          const { bodyPart, localPath } = pic;
          return await models.ProgressPic.create({
            bodyPart,
            user: currentUser.id,
            url: localPath,
          });
        });
      }
    ),
    updateProgressPics: combineResolvers(
      isAuthenticated,
      async (parent, { pics }, { models, currentUser }) => {
        return pics.map(async (pic, i) => {
          const { file, filename, progressPicId } = pic;
          const { createReadStream } = await file;
          const stream = createReadStream();
          const { Location: url, Key: key, Bucket: bucket } = await S3.upload({
            stream,
            filename,
            folder: currentUser.id,
          });
          return await models.ProgressPic.findOneAndUpdate(
            { _id: progressPicId },
            { $set: { url, key, bucket } },
            { new: true, useFindAndModify: false }
          );
        });
      }
    ),
  },
};
The frontend will then wait for the response from Request #1, and send Request #2 but ignore the response so it can just return immediately.
const createAndUploadProgressPics = async photos => {
  const {
    data: { createProgressPics: progressPics },
  } = await createProgressPics({
    variables: {
      pics: photos.map((p, i) => ({
        bodyPart: BODY_PARTS[i],
        localPath: p.uri,
      })),
    },
  });

  updateProgressPics({
    variables: {
      pics: progressPics.map(({ id, bodyPart }, i) => {
        return {
          progressPicId: id,
          filename: `${id}-${bodyPart}.jpg`,
          file: photos[i],
        };
      }),
    },
  });

  onFinish(progressPics);
  navigation.goBack();
};
Important update: I found that I don't have this issue on an Apple Mac! Only on Windows 7!
The same PDF file was downloaded twice. One file I can open; the second one I can't. Maybe somebody can give me some advice or knows a solution? Thank you!
I use the node-fetch library to download binary files in Electron.js. When files are small there are no issues, but with large files I periodically (not constantly) see an issue with file decoding, and I can't open such files! I also found that some files are 5 KB in size but should be 10 MB+!
function fastFetch(url, dest, fileName, callback) {
  console.log("fastFetch-------------------------------------------------------");
  //import fetch from 'node-fetch';
  fetch(url)
    .then(res => res.buffer())
    .then(buffer => {
      const settings = {
        flags: 'w',
        encoding: null, //not applicable / no changes
        mode: 0o666
      };
      try {
        fs.writeFileSync(dest, buffer, settings);
        let msgOK = {
          filename: fileName,
          status: 'OK',
          text: `File downloaded successfully`
        };
        if (callback) callback(msgOK);
        console.log(msgOK.text);
        isLoading = false; //IMPORTANT!
      } catch (err) {
        console.error(err.stack || err.message);
        let msgErr = {
          filename: fileName,
          status: 'ERROR',
          text: `Error in file downloading ${err.message}`
        };
        ERRORS.push(err);
        if (callback) callback(msgErr);
      }
    });
}
Version with a WriteStream: the same issue, one file can be opened, another one can't be opened due to the encoding issue:
function fetchWithFIleStream(url, dest, fileName, callback) {
  console.log("fetch With FIle Stream-------------------------------------------------------");
  //import fetch from 'node-fetch';
  fetch(url)
    .then(res => {
      console.log("--------------------------------------------");
      //console.log(res);
      console.log(res.ok);
      console.log(res.status);
      console.log(res.statusText);
      console.log(res.headers.raw());
      console.log(res.headers.get('content-type'));
      console.log("--------------------------------------------");
      //ERROR: Not a function: res.setEncoding('binary');
      return res.buffer();
    })
    .then(buffer => {
      const settings = {
        flags: 'w',
        encoding: null, //default: 'utf8',
        fd: null,
        mode: 0o666,
        autoClose: true
      };
      // response.pipe(fs.createWriteStream(dest, settings));
      var wstream = fs.createWriteStream(dest, settings);
      wstream.write(buffer);
      wstream.on('finish', function () {
        //console.log('END------------------------------------------------------')
        let msgOK = {
          filename: fileName,
          status: 'OK',
          text: `File downloaded successfully`
        };
        if (callback) callback(msgOK);
        console.log(msgOK.text);
        isLoading = false; //IMPORTANT!
        wstream.end();
      });
      wstream.end();
    });
}
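For comparison, a common alternative with node-fetch v2 is to stream res.body straight to disk instead of buffering the whole response in memory; the following is only a sketch under that assumption, using Node's stream.pipeline to handle completion and errors:
const fetch = require('node-fetch');
const fs = require('fs');
const { pipeline } = require('stream');

function streamFetch(url, dest, callback) {
  fetch(url)
    .then(res => {
      if (!res.ok) throw new Error(`HTTP ${res.status} ${res.statusText}`);
      // Pipe the response body directly into the file; pipeline invokes the
      // callback once the write stream finishes or either stream errors.
      pipeline(res.body, fs.createWriteStream(dest), err => {
        if (callback) callback(err || null);
      });
    })
    .catch(err => {
      if (callback) callback(err);
    });
}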
I am using Multer for file uploading in Node.js, but I am not getting the proper file. I am trying to upload an image.
I have to upload the imported file to an S3 bucket, but the file is not found, and I don't know where I am wrong.
Here is my Node.js code:
import "#tsed/multipartfiles";
import Path = require("path");
const rootDir = Path.resolve(__dirname);
const aws = require("aws-sdk");
import { Express } from "express";
import { Controller, Post } from "#tsed/common";
type MulterFile = Express.Multer.File;
import { MultipartFile } from "#tsed/multipartfiles";
#controller("/session")
export class TeleconsultSessionController implements interfaces.Controller {
#httpPost("/uploadDocuments")
public async uploadDocument( #MultipartFile() file: MulterFile, req: Request, res: Response) {
try {
aws.config.update({ accessKeyId: "AKIAIYKXXXXXXXXX", secretAccessKey: "iMzE0wfryXXXXXXXXXXXXXXXXXXXX" });
aws.config.update({ region: "us-east-1" });
const s3 = new aws.S3();
s3.upload({
"Bucket": "teleconsult-development",
"Key": "image",
"Body": file
}, function (err: any, data: any) {
if (err) {
console.log("Error uploading data: ", err);
} else {
console.log(data);
}
});
}
catch (err) {
if (err instanceof ValidationException) {
res.status(400);
res.send({ error: err.getMessage() });
} else {
res.status(500);
res.send({ error: err.getMessage() });
}
}
}
In the response, when I hit the endpoint through Postman, I am getting the value of file as:
Object {------WebKitFormBoundaryAo1BhVpGaB8uYmsw\r\nContent-Disposition: form-data; name: ""image1"; filename="1528114981689_566_28-1-2
Uploading files with Multer will keep the files inside the server's upload folder first, depending on your Multer settings.
If a file is successfully uploaded by Multer, you will get the following data from req.file:
{
  fieldname: 'file',
  originalname: 'sample.jpg',
  encoding: '7bit',
  mimetype: 'image/jpeg',
  destination: './uploads/',
  filename: 'd1fccf5dc0125937f6fbe7a4dacfc069',
  path: 'uploads/d1fccf5dc0125937f6fbe7a4dacfc069',
  size: 23904
}
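That req.file object comes from a Multer middleware configured with a destination folder; a minimal, hypothetical setup (the 'file' field name and './uploads/' folder are assumptions, not taken from the question) looks like this:
const express = require('express');
const multer = require('multer');

const app = express();
// Store incoming files on disk in ./uploads/ under generated filenames.
const upload = multer({ dest: './uploads/' });

// 'file' must match the form-data field name used in Postman.
app.post('/session/uploadDocuments', upload.single('file'), (req, res) => {
  console.log(req.file); // the object shown above
  res.sendStatus(200);
});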
The path value, which I will be using in the example below, is not a directory; Multer renames your file and stores it without an extension.
You will need to read that uploaded file as a file stream and then upload it to S3, and it will work.
For example:
let file = req.file;
let fileStream = fs.createReadStream(file.path);
fileStream.on('error', function (err) {
  if (err) { throw err; }
});
fileStream.on('open', function () {
  s3.upload({
    "Bucket": "yourBucketName",
    "Key": file.originalname,
    "Body": fileStream
  }, function (err: any, data: any) {
    if (err) {
      console.log("Error uploading data: ", err);
    } else {
      console.log(data);
    }
  });
});
Hope this helps you.