How to read an image file in Node.js?

I am trying to read an image file and a text file and upload them to an AWS S3 bucket using the Node.js fs module. I am not using any Express server, just plain JavaScript that calls the aws-sdk and uploads items to S3.
Here is what my project structure looks like.
Inside s3.js I am trying to read the 2.png and friends.json files:
s3.js
const fs = require('fs');
const file = fs.readFileSync('../public/2.png', (err)=>console.log(err.message));
But this throws an error:
Error: ENOENT: no such file or directory, open '../public/2.png'
What could be going wrong?

Could always try an absolute path instead of a relative one. Note that fs.readFileSync is synchronous and does not take a callback, and for binary data like a PNG you should omit the encoding so you get a Buffer back:
const fs = require('fs')
const path = require('path')
// resolve relative to this file, not the process working directory
const image = path.join(__dirname, '../public/2.png')
const file = fs.readFileSync(image) // returns a Buffer; no encoding for binary files
Could also use upng-js with a Promise:
const png = require('upng-js')
const fs = require('fs')
async function pngCode(img) {
  try {
    // fs.promises.readFile returns a Promise that resolves to a Buffer
    return png.decode(await fs.promises.readFile(img))
  } catch (err) {
    console.error(err)
  }
}
pngCode('../public/2.png')
None of this code is tested; I wrote it on my phone.
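For the upload step the question describes, here is a minimal untested sketch using the aws-sdk v2 S3 client; the bucket name and key are placeholders:
const AWS = require('aws-sdk')
const fs = require('fs')
const path = require('path')
const s3 = new AWS.S3()
// read the image as a Buffer and upload it; 'my-bucket' is a placeholder
const body = fs.readFileSync(path.join(__dirname, '../public/2.png'))
s3.upload({ Bucket: 'my-bucket', Key: '2.png', Body: body }, (err, data) => {
  if (err) return console.error(err)
  console.log('uploaded to', data.Location)
})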

Related

Node/express - cancel multer photo upload if other fields validation fails

I have multer as middleware before the edit-user function. The thing is that multer uploads the photo no matter what, so I am wondering if there is a way to cancel the upload if e.g. the email is invalid. I tried to delete the uploaded image via fs.unlink when there is a validation error inside the edit function, but I get an "EBUSY: resource busy or locked, unlink" error. I guess multer is still writing the image at the same time I try to delete it.
Any ideas how to solve this?
In your function, wrap the logic in a try/catch block and delete the file when an error is thrown:
import { unlink } from 'node:fs/promises';
import path from 'path';
// code ...
// inside your function
const img = req.file; // this needs to be outside the try block
try {
  // your code; throw on failed validation
} catch (e) {
  if (img) {
    // depends on where you store files in the multer middleware
    const img_path = path.resolve(YOUR_PATH, img.filename);
    await unlink(img_path);
    console.log(`deleted uploaded ${img_path}`);
  }
  // revert transaction or anything else
}
Nowadays, applications usually separate the file-upload API from the data-manipulation API to support features like previewing/editing images. A background job can later clean up unused data.
But if it's necessary in your case, you can use multer's built-in MemoryStorage to keep the file data in memory first, then save it to disk after validation completes.
const express = require('express');
const multer = require('multer');
const fs = require('fs');
const app = express();
const storage = multer.memoryStorage();
const upload = multer({ storage });
app.post("/create_user_with_image", upload.single('img'), (req, res) => {
  // Validation here
  fs.writeFile(`uploads/${req.file.originalname}`, req.file.buffer, () => {
    res.send('ok');
  });
});
Note: as the multer documentation says, this solution can cause your application to run out of memory when uploading very large files, or relatively small files in large numbers very quickly.
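To reduce that risk, multer's limits option can cap the accepted file size before the buffer is created; a minimal sketch (the 5 MB cap is an arbitrary example):
const upload = multer({
  storage: multer.memoryStorage(),
  limits: { fileSize: 5 * 1024 * 1024 } // uploads over 5 MB trigger a LIMIT_FILE_SIZE error
});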

How to load image files in Node.js and Express to upload to Cloudinary

My server side has a folder with multiple images in it.
These images are generated by separate code.
I need to open/load the images so that I can upload them to Cloudinary.
I also have multer installed in the project, but I'm not sure how to use it.
What I have done till now, using glob.Glob to find all the paths:
const express = require('express');
const cloudinary = require("../utils/cloudinary");
const upload = require("../utils/multer");
const Glob = require('glob').Glob;
async function uploadToCloud()
{
  var paths = new Glob('../output_temp/*.(jpg|jpeg|png)');
  for (let key in paths) {
    var path = paths[key];
    try {
      // Upload image to cloudinary
    } catch (err) {
      console.log(err);
    }
  }
}
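A minimal untested sketch of one way to complete this, assuming the classic glob API (glob.sync) and that ../utils/cloudinary exports a configured cloudinary v2 client (both are assumptions, as is the resolution of the output_temp path):
const cloudinary = require('../utils/cloudinary');
const glob = require('glob');
async function uploadToCloud() {
  // {jpg,jpeg,png} is glob's brace syntax; *.(jpg|jpeg|png) is not a valid pattern
  const paths = glob.sync('../output_temp/*.{jpg,jpeg,png}');
  for (const p of paths) {
    try {
      // upload() returns a Promise when no callback is given (v2 SDK)
      const result = await cloudinary.uploader.upload(p);
      console.log(result.secure_url);
    } catch (err) {
      console.log(err);
    }
  }
}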

Upload large file (>2GB) with multer

I'm trying to upload a large file (7GB) to my server. For this I'm using multer:
const express = require('express');
const multer = require('multer');
const {
  saveLogFile,
} = require('../controller/log');
const router = express.Router();
const upload = multer();
router.post('/', upload.single('file'), saveLogFile);
In my saveLogFile controller, which has the form saveLogFile = async (req, res) => { ... }, I want to get req.file. The multer package should give me the uploaded file as req.file. When I upload small files (<2GB) it works successfully, but when I try to upload files over 2GB, I get the following error:
buffer.js:364
throw new ERR_INVALID_OPT_VALUE.RangeError('size', size);
^
RangeError [ERR_INVALID_OPT_VALUE]: The value "7229116782" is invalid for option "size"
How can I bypass it? Actually, all I need is access to the uploaded file in my saveLogFile controller.
The reason for this is that you are using multer without passing any options, so the whole upload is buffered in memory, and Node's Buffer has a maximum size of roughly 2 GB (2^31-1 bytes on this Node version), which the 7 GB file exceeds, hence the RangeError. From the docs:
In case you omit the options object, the files will be kept in memory
and never written to disk.
Try using the dest or storage option in order to use a temporary file for the upload:
const upload = multer({ dest: './some-upload-folder' });
router.post('/', upload.single('file'), saveLogFile);
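With dest set, req.file no longer holds a buffer; it holds metadata such as path, originalname, and size. An untested sketch of how saveLogFile could then consume the upload as a stream (the byte-counting body is only an illustration):
const fs = require('fs');
const saveLogFile = async (req, res) => {
  // stream the temporary file instead of loading 7 GB into memory
  const stream = fs.createReadStream(req.file.path);
  let bytes = 0;
  stream.on('data', (chunk) => { bytes += chunk.length; });
  stream.on('end', () => res.send(`received ${bytes} bytes`));
  stream.on('error', (err) => res.status(500).send(err.message));
};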

How to rename file using express-formidable package

Can't figure out how to rename the file using the express-formidable package.
server.js contents:
const express = require('express')
const fs = require('fs')
const app = express()
const formidableMiddleware = require('express-formidable')
app.use(formidableMiddleware({
  uploadDir: __dirname + '/public/files',
  multiples: true,
  keepExtensions: true
}))
app.route('/send/:mail')
  .post((req, res) => {
    let file = null
    if (req.files) {
      file = fs.readFileSync(req.files.file.path)
    }
    // here's the code to send email with mailgun js wrapper
  })
The purpose of my code is to rename the uploaded file and then send it as an attachment via the mailgun wrapper, but I'm stuck at the step of renaming the file using fs.rename(): I just don't know where to put it. Do I need to use the node-formidable package together with express-formidable?
Solved using fs.renameSync. Note that the source path is the file object's .path property, and the target must be a full path (require('path') for the join), not just a file name:
const uploaded = req.files['files[0]']
fs.renameSync(uploaded.path, path.join(path.dirname(uploaded.path), uploaded.name))
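In context, an untested sketch of where the rename could sit inside the route from the question (the 'renamed-' prefix is illustrative; express-formidable exposes the temp file location as .path and the client file name as .name):
const path = require('path')
app.route('/send/:mail')
  .post((req, res) => {
    if (req.files) {
      const uploaded = req.files.file
      // rename inside the same uploadDir before reading/attaching the file
      const newPath = path.join(path.dirname(uploaded.path), 'renamed-' + uploaded.name)
      fs.renameSync(uploaded.path, newPath)
      const file = fs.readFileSync(newPath)
      // here's the code to send email with mailgun js wrapper
    }
  })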

How to read a JSON file data and use it in firebase cloud function

I have a Firebase Cloud Function which is invoked on an HTTP request, and that part is working fine.
Now I want to read data from a JSON file for some business logic. Below are the 2 ways I was trying to read the JSON file:
Option #1) Saved the JSON file inside the 'public' directory in my Node.js project and deployed. Got a hosting URL, which I am using like below, but it throws an error saying 'Error: getaddrinfo ENOTFOUND...'
Option #2) Uploaded the JSON file to Firebase Cloud Storage. Didn't find any example to try this out and ended up with the code below:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
const Firestore = require('@google-cloud/firestore');
const firestore = new Firestore();
const http = require('http');
const url = require('url');
// Option #2 required variables
var storage = require('@google-cloud/storage');
const gcs = storage({projectId: "<Project ID>"});
const bucket = gcs.bucket("<bucket-name>");
const file = bucket.file("<filename.json>")
// HTTP Trigger
exports.functionName = functions.https.onRequest((req, res) => {
  var request = require('request');
  var paramValue = req.body.queryParam;
  console.log(paramValue);
  // Option #1 - Using hosted URL
  var hostingURL = "https://xxxxxxxx.firebaseapp.com/filename.json";
  console.log(hostingURL);
  request({
    url: hostingURL,
    method: 'POST',
    json: { key: 'value' }
  }, function(error, response, data) {
  });
  // Option #2 - Ended up here. Want to read from cloud storage bucket.
  console.log(file);
});
Can some one help me?
You can place the .json file in the same folder as your index.js. Then you can do the following:
const config = require('./config.json');
console.log(config.foo);
Given the following config.json file:
{
"foo" : "bar"
}
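Note that require caches the module, so the file is read once per process; if the file can change at runtime, here is a small sketch of reading it fresh instead (same hypothetical config.json):
const fs = require('fs');
// re-read and parse on every call; require() would return the cached copy
const config = JSON.parse(fs.readFileSync(`${__dirname}/config.json`, 'utf8'));
console.log(config.foo);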
If your file is in Firebase Cloud Storage, you can use this approach:
const admin = require('firebase-admin');
admin.storage().bucket().file("yourDirForFile/yourFile.json")
  .download(function (err, contents) {
    if (!err) {
      var jsObject = JSON.parse(contents.toString('utf8'))
    }
  });
You can then use the variable jsObject as you wish; it is held in memory for now.
You can use require('firebase-admin') as @AlexM said, or alternatively use the client-side package @google-cloud/storage:
const {Storage} = require('@google-cloud/storage');
const storage = new Storage();
const fileBucket = 'your-bucket.appspot.com';
const filePath = 'your-json.json';
const bucket = storage.bucket(fileBucket);
const file = bucket.file(filePath);
file.download()
  .then(([contents]) => {
    // download() resolves with a one-element array containing the file's Buffer
    const obj = JSON.parse(contents);
  });
Don't forget to:
run npm i @google-cloud/storage in your functions folder before deploying
JSON.parse the result to get your JSON as an object
