Drive Api - large file stream in nodejs
Hello, I want to stream a large file from Google Drive to my website, but I'm running into an issue.
app.get("/blog.mkv", (req, ress) => {
  const p38290token = new google.auth.OAuth2(CLIENT_ID, CLIENT_SECRET, REDIRECT_URI);
  p38290token.setCredentials({ refresh_token: token.acc });
  const p38290Id = google.drive({
    version: "v3",
    auth: p38290token,
  });
  try {
    p38290Id.files.get(
      {
        fileId: "1OU3BXc4FmyRD0rCW9S4XFfVxIl48vy3v",
        alt: "media",
      },
      // responseType: arraybuffer, stream, or blob
      { responseType: "stream" },
      (err, res) => {
        if (err) {
          console.log(err.message);
          if (err.message === "invalid_grant") {
            // fatchToken(exportFile)
          }
        } else {
          res.data
            .on("end", () => {
              console.log("Done");
            })
            .on("error", err => {
              console.log("Error", err);
            })
            .pipe(ress);
        }
      }
    );
  } catch (error) {
  }
});
When a user visits /blog.mkv the video starts streaming, but the user can't seek (can't skip forward or backwards). What should I do?
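Seeking in a browser or player depends on HTTP Range requests: the player sends a Range header and expects a 206 Partial Content response, while the handler above always streams from byte 0. A possible approach is to parse the incoming Range header, forward it to Drive, and answer with 206. This is a sketch under assumptions: parseRange is my own helper, and the commented wiring assumes the Drive media endpoint honors a forwarded Range header passed through the request options.

```javascript
// Parse an HTTP Range header like "bytes=0-1023", "bytes=500-" or
// "bytes=-500" into an inclusive byte range, clamped to fileSize.
// Returns null when the header is absent or unsatisfiable.
function parseRange(rangeHeader, fileSize) {
  const m = /^bytes=(\d*)-(\d*)$/.exec(rangeHeader || "");
  if (!m || (m[1] === "" && m[2] === "")) return null;
  const start = m[1] === "" ? Math.max(fileSize - Number(m[2]), 0) : Number(m[1]);
  const end = m[1] !== "" && m[2] !== "" ? Math.min(Number(m[2]), fileSize - 1) : fileSize - 1;
  if (start > end || start >= fileSize) return null;
  return { start, end };
}

// Sketch of wiring it into the route above (untested against the API):
// const meta = await p38290Id.files.get({ fileId, fields: "size" });
// const size = Number(meta.data.size);
// const range = parseRange(req.headers.range, size);
// if (range) {
//   ress.status(206).set({
//     "Accept-Ranges": "bytes",
//     "Content-Range": `bytes ${range.start}-${range.end}/${size}`,
//     "Content-Length": String(range.end - range.start + 1),
//   });
//   const { data } = await p38290Id.files.get(
//     { fileId, alt: "media" },
//     { responseType: "stream", headers: { Range: `bytes=${range.start}-${range.end}` } }
//   );
//   data.pipe(ress);
// }

console.log(parseRange("bytes=0-1023", 10000)); // { start: 0, end: 1023 }
```

With Range support in place the player can issue, for example, Range: bytes=1048576- and resume mid-file, which is what makes the seek bar work.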

Check this repo for streaming and downloading files from Google Drive.
Google-drive-video-streaming-nodejs
This is a small script in Node.js that allows you to watch a video stored in your Google Drive directly in your video player.
Install
You need only to install all the dependencies by typing this command:
npm install
Usage
Just type this command to startup the script.
node ./app.js
Now that the server is started you can start watching your video or download it.
Streaming
Paste this link into your player to start streaming the video.
http://127.0.0.1:9001/
Download
To download it, type this URL in a new browser tab.
http://127.0.0.1:9001/download
If you want, you can specify the parameter p, which indicates, as a percentage, how much of the video to skip. For example, to start downloading the video from the halfway point, use this link:
http://127.0.0.1:9001/download?p=50
You can also use the parameter c, which indicates the chunk from which the download should start. To stop the download process, use this URL:
http://127.0.0.1:9001/download_stop
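The p parameter above maps a percentage to a starting byte offset. A minimal sketch of that mapping (the helper name and rounding behavior are my assumptions, not taken from the repo):

```javascript
// Map a percentage (0-100) of a file to a starting byte offset,
// e.g. for building a Range header like "bytes=<offset>-".
// Hypothetical helper; the repo's actual implementation may differ.
function percentToOffset(fileSize, percent) {
  const p = Math.min(Math.max(percent, 0), 100); // clamp to 0-100
  return Math.floor((fileSize * p) / 100);
}

console.log(percentToOffset(1000, 50)); // 500
```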

Related

my videos on google bucket can not fast forward or rewind

So I built an e-learning platform with Node.js and Vue.js, and I am using GCP buckets to store my videos privately. Everything works perfectly aside from the fact that my videos cannot fast forward or rewind: if you try moving the video to a specific position (say, towards the end of the video), it returns to the spot where you were initially. At first I thought it was a Vue problem, but I tried playing the videos from my GCP bucket dashboard directly and it does the same thing. It only works fine when I use the Firefox browser.
I am using the "Uniform: No object-level ACLs enabled" access control and the "Not public" permission settings. I am new to GCP and have no idea what the problem could be.
here is the node.js function i am using
const upload = async (req, res) => {
  try {
    if (!req.file) {
      res.status(400).send('No file uploaded.');
      return;
    }
    const gcsFileName = `${Date.now()}-${req.file.originalname}`;
    var reader = fs.createReadStream('uploads/' + req.file.originalname);
    reader.pipe(
      bucket.file(gcsFileName).createWriteStream({ resumable: false, gzip: true })
        .on('finish', () => {
          // The public URL can be used to directly access the file via HTTP.
          const publicUrl = format(
            `https://storage.googleapis.com/bucketname/` + gcsFileName
          );
          // console.log('https://storage.googleapis.com/faslearn_files/' + gcsFileName)
          fs.unlink('uploads/' + req.file.originalname, (err) => {
            if (err) {
              console.log("failed to delete local image:" + err);
            } else {
              console.log('successfully deleted local image');
            }
          });
          res.status(200).send(publicUrl);
        })
        .on('error', err => {
          console.log(err);
          return;
        })
        //.end(req.file.buffer)
    );
    // Read and display the file data on console
    reader.on('data', function (chunk) {
      console.log('seen chunk');
    });
  } catch (err) {
    console.log(" some where");
    res.status(500).send({
      message: `Could not upload the file: ${req.file.originalname}. ${err}`,
    });
  }
};
The issue was coming from the way I encoded the video: I was supposed to use the blob, but I used the pipe.
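One more thing worth checking (an assumption on my part, not confirmed in the thread): uploading with gzip: true stores the object with Content-Encoding: gzip, and Cloud Storage then serves it with decompressive transcoding, which does not honor Range requests; without Range support most players cannot seek. Below is a sketch of the same write stream without gzip, plus a small URL helper (hypothetical name, added for illustration) that also escapes the object name, since original file names may contain spaces:

```javascript
// Hypothetical helper: build the public object URL, URL-escaping the
// object name (uploaded file names may contain spaces or other unsafe chars).
function gcsPublicUrl(bucketName, objectName) {
  return `https://storage.googleapis.com/${bucketName}/${encodeURIComponent(objectName)}`;
}

// Sketch of the upload without gzip (assumes `bucket`, `fs`, and
// `gcsFileName` as in the handler above):
// fs.createReadStream('uploads/' + req.file.originalname)
//   .pipe(bucket.file(gcsFileName).createWriteStream({ resumable: false }))
//   .on('finish', () => res.status(200).send(gcsPublicUrl('bucketname', gcsFileName)));

console.log(gcsPublicUrl("bucketname", "my video.mkv"));
// https://storage.googleapis.com/bucketname/my%20video.mkv
```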

Google drive API downloading file nodejs

I'm trying to get the contents of a file using the Google Drive API v3 in Node.js.
I read in this documentation that I get a stream back from drive.files.get({fileId, alt: 'media'}), but that isn't the case: I get a promise back.
https://developers.google.com/drive/api/v3/manage-downloads
Can someone tell me how I can get a stream from that method?
I believe your goal and situation are as follows.
You want to retrieve the stream type from the method of drive.files.get.
You want to achieve this using googleapis with Node.js.
You have already done the authorization process for using Drive API.
For this, how about this answer? In this case, please use responseType. Ref
Pattern 1:
In this pattern, the file is downloaded as the stream type and it is saved as a file.
Sample script:
var dest = fs.createWriteStream("###"); // Please set the filename of the saved file.
drive.files.get(
  { fileId: id, alt: "media" },
  { responseType: "stream" },
  (err, { data }) => {
    if (err) {
      console.log(err);
      return;
    }
    data
      .on("end", () => console.log("Done."))
      .on("error", (err) => {
        console.log(err);
        return process.exit();
      })
      .pipe(dest);
  }
);
Pattern 2:
In this pattern, the file is downloaded as the stream type and it is put to the buffer.
Sample script:
drive.files.get(
  { fileId: id, alt: "media" },
  { responseType: "stream" },
  (err, { data }) => {
    if (err) {
      console.log(err);
      return;
    }
    let buf = [];
    data.on("data", (e) => buf.push(e));
    data.on("end", () => {
      const buffer = Buffer.concat(buf);
      console.log(buffer);
    });
  }
);
Reference:
Google APIs Node.js Client

Google Drive file download API in Node.js not working?

I am using the Google Drive file download API to download a file from Google Drive, using the following code:
var fileId = '1RDrrFGV2fM7jvnxGFileId';
var dest = fs.createWriteStream('./sample.xlsx');
drive.files.get(
  { fileId: fileId, alt: 'media' },
  { responseType: 'stream' },
  function (err, res) {
    res.data
      .on('end', () => {
        console.log('Done');
      })
      .on('error', err => {
        console.log('Error', err);
      })
      .pipe(dest);
  }
);
The downloaded file was empty. How do I get the file data?
drive.files.get without alt: 'media' returns only the metadata of the file. If your file is a Google Workspace document (for example, a Google Sheet you want as .xlsx), you should export it instead:
drive.files.export({fileId: fileId, mimeType: <MIME type of your file>})
You can refer the client documentation https://apis-nodejs.firebaseapp.com/drive/index.html
I found another solution for downloading the Google Drive file; use the following code snippet:
var downloadfile = "https://docs.google.com/uc?id=" + fileId;
return downloadfile;
Browse to the downloadfile URL to download the file.
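The uc link above can be wrapped in a small helper. The helper name is mine, and export=download is an extra, commonly used parameter that hints the browser to download rather than preview; treat both as assumptions:

```javascript
// Hypothetical helper around the docs.google.com "uc" link pattern.
// export=download is a widely used hint to force a download; the file ID
// is URL-escaped in case it is user supplied.
function driveDownloadUrl(fileId) {
  return "https://docs.google.com/uc?export=download&id=" + encodeURIComponent(fileId);
}

console.log(driveDownloadUrl("1RDrrFGV2fM7jvnxGFileId"));
// https://docs.google.com/uc?export=download&id=1RDrrFGV2fM7jvnxGFileId
```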

How to download a file from an SFTP server using Node

I'm looking to let a user download a file directly from an SFTP server, in the browser. For example, a user wants to find an audio file named 'noise'; they enter the parameters and a button appears to download it. I am using Express as my web application framework, along with EJS.
I've found methods to download via SFTP, like the code below, but this saves the file into my application's folder rather than to the user's disk.
sftp.connect(config).then(() => {
  sftp.get('file.wav').then((data) => {
    var outFile = fs.createWriteStream('file.wav');
    data.on('data', function (response) {
      outFile.write(response);
    });
    data.on('close', function () {
      outFile.close();
    });
  });
});
How can you download directly from sftp to the user's disk by giving them the option through a button?
You pipe the stream into res. For example:
router.get('/download', (req, res) => {
  sftp.connect(config).then(() => {
    sftp.get('file.wav').then((data) => {
      // Prompt the browser to save the file instead of playing it inline
      res.set('Content-Disposition', 'attachment; filename="file.wav"');
      data.pipe(res);
    });
  });
});

Sending a PDF from client to server isn't working with stream.pipe(res) in Ubuntu server

I have an application that creates a PDF and sends it to the client using Node.js. The application works perfectly locally, but when I host it on an Ubuntu server on DigitalOcean, the endpoint that generates the PDF doesn't work.
This is the code that sends the PDF to the client :
pdf.create(html, options).toStream((err, stream) => {
  if (err) {
    res.json({
      message: 'Sorry, we were unable to generate pdf',
    });
  }
  stream.pipe(res);
});
On the client side, this is how I communicate with the endpoint:
genratePdf({ commit }, data) {
  axios.post('http://localhost:1337/getpdf', data, { responseType: 'arraybuffer' }).then((response) => {
    let blob = new Blob([response.data], { type: 'application/pdf' });
    var link = document.createElement('a');
    link.href = URL.createObjectURL(blob);
    link.download = "Report_" + new Date() + ".pdf";
    link.click();
  }, (err) => {
    console.log(err);
  });
}
When I run it locally it works perfectly, but when I host it on an Ubuntu DigitalOcean droplet, the other endpoints work while the PDF-generating one does not, and it shows me an error. I think it's a timeout problem: the app does not wait for the stream to finish before piping it into res.
There is actually an error occurring in the generation of your PDF, but because you are not returning when handling the error, it still executes the stream.pipe statement.
Change your code to this :
pdf.create(html, options).toStream((err, stream) => {
  if (err) {
    console.error(err);
    return res.json({
      message: 'Sorry, we were unable to generate pdf',
    });
  }
  return stream.pipe(res);
});
Notice I added a console.error(err); it might help you debug further. I also think the library you are using relies on PhantomJS, and the error might come from there, since PhantomJS must be compiled for the server's architecture. Try doing:
rm -Rf ./node_modules && npm i