How to upload video or image file to database - node.js

I am working on a project in which I need to create an API using Node.js to upload a video or image file to a PostgreSQL database.
How can I do this?
Thanks a lot

You can convert an image or video to base64, and then store the base64-encoded string in your database.
const fs = require('fs');

function encode_base64(file) {
  // Read the file into a Buffer, then encode it as a base64 string
  const bitmap = fs.readFileSync(file);
  return Buffer.from(bitmap).toString('base64');
}

const image = encode_base64('image.jpg');
const video = encode_base64('video.mp4');
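Keep in mind that base64 inflates the data by roughly a third. With PostgreSQL you can also store the raw bytes in a bytea column; node-postgres maps Buffer parameters to bytea automatically. A minimal sketch using the pg package (the media table and connection string are placeholders):
const { Client } = require('pg');
const fs = require('fs');

async function uploadFile(path) {
  const client = new Client({ connectionString: 'postgres://localhost/db' });
  await client.connect();
  // fs.readFileSync returns a Buffer, which pg serializes as bytea
  const data = fs.readFileSync(path);
  await client.query(
    'INSERT INTO media (name, data) VALUES ($1, $2)',
    [path, data]
  );
  await client.end();
}

uploadFile('image.jpg').catch(console.error);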

Related

Storing and downloading pdf files from mongoDB using Node.js

I am using Node.js, Mongoose, and EJS.
I want the user to be able to upload their CV (less than 16 MB) as a PDF file, and the admin to be able to download the PDF or view the CV on the website.
I did the upload part, where I stored the CV like this in the database (which might be the wrong way):
cv: Object {
  fileName: "Test Results All in testing-homework.pdf",
  filePath: "cvs\1650985306448-.pdf",
  fileType: "application/pdf"
}
The PDF file itself is stored in the "/cvs" folder after upload.
I'm looking for a way to download/view the PDF file from the database.
I suggest you use GridFS from MongoDB.
Reference: https://www.mongodb.com/docs/drivers/node/current/fundamentals/gridfs/
To download from the DB:
bucket.openDownloadStreamByName('your_file_name')
  .pipe(fs.createWriteStream('./desired_name_of_output_file'));
Some basic upload and download operations using GridFS in MongoDB:
const mongodb = require('mongodb');
const fs = require('node:fs');

const mongoUrl = 'mongodb://127.0.0.1:27017/db';
const client = new mongodb.MongoClient(mongoUrl);
const db = client.db('db');
Create a bucket with a user-defined name:
const bucket = new mongodb.GridFSBucket(db, { bucketName: 'name_of_bucket' });
Upload to DB:
fs.createReadStream('./filename.pdf')
  .pipe(bucket.openUploadStream('filename', {
    chunkSizeBytes: 1048576, // 1 MiB chunks
    metadata: { field: 'filename', value: 'myValue' }
  }));
Find:
const cursor = bucket.find({});
cursor.forEach(doc => console.log(doc));
Delete:
// Use a fresh cursor here; the forEach above exhausts the previous one
bucket.find({}).forEach(doc => bucket.delete(doc._id));
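To let the admin view the CV in the browser, you can stream it out of GridFS inside a route handler. A minimal sketch assuming an Express app and the bucket created above (the /cv/:name route is illustrative):
const express = require('express');
const app = express();

app.get('/cv/:name', (req, res) => {
  // Stream the stored PDF straight into the HTTP response
  res.set('Content-Type', 'application/pdf');
  bucket.openDownloadStreamByName(req.params.name)
    .on('error', () => res.status(404).end())
    .pipe(res);
});

app.listen(3000);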

Photo is cut off when it is big. Node.js, Express, GraphQL

When I upload a big photo, it gets cut off. Below are the few lines where I save the photo on the backend side. It happens only when a photo is large (more than 2 MB). I use Node.js, Express, and GraphQL with apollo-upload-client. Everything runs in Docker behind nginx.
const { filename, createReadStream } = await input.photo;
if (isFunction(createReadStream)) {
  const fileStream = createReadStream();
  const trimmedFilename = trim(filename);
  fileStream.pipe(fs.createWriteStream(`${eventFolderPath}/${trimmedFilename}`));
}
The photo looks like this:
[image: corrupted photo]
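One common cause of truncated files is resolving the request before the write stream has finished flushing to disk (nginx's client_max_body_size is also worth checking for uploads over a size limit). A minimal sketch that awaits the stream, reusing isFunction, trim, eventFolderPath, and input from the question:
const { filename, createReadStream } = await input.photo;
if (isFunction(createReadStream)) {
  const trimmedFilename = trim(filename);
  // Wait until the whole stream has been flushed to disk
  await new Promise((resolve, reject) => {
    createReadStream()
      .on('error', reject)
      .pipe(fs.createWriteStream(`${eventFolderPath}/${trimmedFilename}`))
      .on('error', reject)
      .on('finish', resolve);
  });
}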

Extract WAV header on javascript frontend (ReactJS)

I'm trying to analyze a file I'll be uploading from React; I need to know whether it can be uploaded based on several factors.
I found https://github.com/TooTallNate/node-wav
It works great in Node.js, and I'm trying to use it in React. The sample creates a readable stream and pipes it to the WAV reader.
var fs = require('fs');
var wav = require('wav');

var file = fs.createReadStream('track01.wav');
var reader = new wav.Reader();

// the "format" event gets emitted at the end of the WAVE header
reader.on('format', function (format) {
  // format of the file
  console.log(format);
});

file.pipe(reader);
Using the FilePond controller I'm able to get a base64 string of the file, but I can't figure out how to pass it to the reader.
This is what I have so far in React:
const { Readable } = require('stream');

var reader = new wav.Reader();
reader.on('format', function (format) {
  // format of the file
  console.log('format', format);
});

const buffer = Buffer.from(base64String, 'base64');
const readable = new Readable();
readable._read = () => {};
readable.push(buffer);
readable.push(null);
readable.pipe(reader);
But I get Error: bad "chunk id": expected "RIFF" or "RIFX", got "u+Zj"
Since this file works in Node.js with the same lib, I'm obviously doing something wrong.
EDIT:
This was a problem with my base64 string; this method works if anyone needs to analyze a WAV file on the frontend.
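As a side note, since Node 12 the hand-rolled readable above can be replaced with Readable.from, which emits a string or Buffer as a single chunk. A minimal sketch under the same assumptions (base64String comes from FilePond):
const { Readable } = require('stream');
const wav = require('wav');

const reader = new wav.Reader();
reader.on('format', format => console.log('format', format));

// Readable.from emits the Buffer as one chunk rather than iterating it byte by byte
Readable.from(Buffer.from(base64String, 'base64')).pipe(reader);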

Save binary image to file

I make an API request which returns a binary image. How can I save it to a file like photo.png on my machine? After doing some research, I tried the following, but when I open the image, my machine says it's damaged:
const buffer = new Buffer(imageBinary);
const b64 = buffer.toString("base64");
const path = `temp/${userId}`;
const url = path + "/photo.png";
if (!fs.existsSync(path)) fs.mkdirSync(path);
if (fs.existsSync(url)) fs.unlinkSync(url)
fs.createWriteStream(url).write(b64);
return url;
Edit: Here is the binary data FYI: https://gist.github.com/AskYous/1fd26dc0eb02b4ec1672dcf5c61a34df
You do not need to re-encode the buffer as base64. Just write the binary buffer as is:
fs.createWriteStream(url).write(imageBinary);
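For a one-shot save it can be simpler to let fs open and close the file for you. A minimal sketch, assuming imageBinary is a Buffer (if it arrives as a binary string, pass 'binary' as the second argument to Buffer.from):
const fs = require('fs');

// writeFileSync opens the file, writes the whole buffer, and closes it
fs.writeFileSync(url, Buffer.from(imageBinary));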

Read files from AngularJS, convert to base64, and push to GitLab

I need to read and display multiple zip files in AngularJS, convert those files to base64 in Node.js, and push them to GitLab. Please suggest whether this is possible in Node.js. Is there any blog available for reference?
Use the fs module of Node.js to read the files from a directory:
const fs = require('fs');

const testFolder = './tests/';
fs.readdirSync(testFolder).forEach(file => {
  console.log(file);
});
Once you have the files, you can convert them to base64:
function base64_encode(file) {
  // read binary data and convert it to a base64-encoded string
  const bitmap = fs.readFileSync(file);
  return Buffer.from(bitmap).toString('base64');
}
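To push the encoded content to GitLab, the Repository Files API accepts base64 directly via its encoding field. A minimal sketch, assuming Node 18+ for the built-in fetch; the project ID, branch, and token are placeholders:
const fs = require('fs');

async function pushToGitLab(filePath) {
  const content = fs.readFileSync(filePath).toString('base64');
  const projectId = 123; // placeholder project ID
  const res = await fetch(
    `https://gitlab.com/api/v4/projects/${projectId}/repository/files/${encodeURIComponent(filePath)}`,
    {
      method: 'POST',
      headers: {
        'PRIVATE-TOKEN': process.env.GITLAB_TOKEN,
        'Content-Type': 'application/json',
      },
      body: JSON.stringify({
        branch: 'main',
        commit_message: `Add ${filePath}`,
        content,
        encoding: 'base64', // tell GitLab the content is base64-encoded
      }),
    }
  );
  if (!res.ok) throw new Error(`GitLab API error: ${res.status}`);
}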
