How to Lint code if I need additional mongodb request - node.js

Today I have this code. I'm learning Node.js and I keep running into the same question: I have a route that needs to make a few MongoDB requests, and depending on the case I may or may not need additional data. For example, please look at my code:
const express = require('express');
const router = express.Router();
const debug = require('debug')('hackit:posts');
const boom = require('boom');
const config = require('../../config');
const db = require('monk')(config.mdbConnect);
const DBpgs = db.get('pgs');
const DBposts = db.get('posts');
const DBcategories = db.get('categories');
router.route("/edit-:id")
  .get((req, res, next) => {
    const ops = {
      h1: "Edit post",
      title: "Admin edit post",
      description: "Admin page",
      specialscript: "/javascripts/editpost.js"
    };
    DBposts.findOne({_id: req.params.id}, (err, saved) => {
      if (saved.h1.length) debug('Edit «' + saved.h1 + '»')
      else debug('Create new post')
      if (err) next(boom.badImplementation(err))
      DBcategories.find({}, (err, categories) => {
        if (err) next(boom.badImplementation(err))
        let group = {}
        if (saved.parent && saved.parent.length) {
          DBpgs.findOne({parent: saved.parent}, (err, group) => {
            if (err) next(boom.badImplementation(err))
            res.render('admin/post', {ops, saved, categories, group})
          })
        } else {
          res.render('admin/post', {ops, saved, categories, group})
        }
      })
    })
  })
module.exports = router;
There is only one route. The problem is that if I get saved.parent I need to make an additional request to the db to pick up the group options. I solved it with an if and ... it looks rude. A few lines apart I type res.render('admin/post', {ops, saved, categories, group}) twice, which is the same line of code. I want to be a good programmer. Please lint my code and tell me how I should structure it. Thanks!

The promise-based implementation would look something like this (I just modified your code directly in the answer, so it probably has bugs, but it gives you the gist). Note that you avoid duplicating both the response and the error handling.
router.route("/edit-:id")
  .get((req, res, next) => {
    const ops = {
      h1: "Edit post",
      title: "Admin edit post",
      description: "Admin page",
      specialscript: "/javascripts/editpost.js"
    };
    DBposts.findOne({_id: req.params.id})
      .then(saved => {
        if (saved.h1.length) debug('Edit «' + saved.h1 + '»');
        else debug('Create new post');
        return DBcategories.find({})
          .then(categories => {
            // Only hit DBpgs when there is a parent; otherwise resolve to an empty group.
            const groupPromise = (saved.parent && saved.parent.length)
              ? DBpgs.findOne({parent: saved.parent})
              : Promise.resolve({});
            return groupPromise.then(group => {
              res.render('admin/post', {ops, saved, categories, group});
            });
          });
      })
      .catch(error => {
        // A single catch covers every step above.
        next(boom.badImplementation(error));
      });
  });
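For comparison, the same flow reads naturally with async/await as well. This is just a sketch along the same lines (it assumes monk's promise-returning API and would still want a null check on saved before touching saved.h1):
router.route("/edit-:id")
  .get(async (req, res, next) => {
    const ops = {
      h1: "Edit post",
      title: "Admin edit post",
      description: "Admin page",
      specialscript: "/javascripts/editpost.js"
    };
    try {
      const saved = await DBposts.findOne({_id: req.params.id});
      if (saved.h1.length) debug('Edit «' + saved.h1 + '»');
      else debug('Create new post');

      const categories = await DBcategories.find({});
      // The conditional request collapses into a single expression.
      const group = (saved.parent && saved.parent.length)
        ? await DBpgs.findOne({parent: saved.parent})
        : {};

      res.render('admin/post', {ops, saved, categories, group});
    } catch (err) {
      next(boom.badImplementation(err));
    }
  });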

Related

.post call not returning expected response

I am building out the backend for a flash card app for which you can find the repo here. There is a table called categories. There is an endpoint for adding a category.
router.post("/", protect, createCategory);
The endpoint runs a createCategory function which has logic separated out into a categoryController.js file.
const Categories = require("../models/categoryModel");

const createCategory = async (req, res) => {
  const { title } = req.body;
  const userId = req.user.id;
  if (!title) {
    res.status(404).json({ errMsg: "Please provide a title" });
  } else {
    const category = await Categories.createCategory({ title, userId });
    console.log("category: ", category);
    res.status(201).json(category);
  }
};
The createCategory controller function in turn calls a createCategory function from the categoryModel.js file. The function from the model runs the database operations. Specifically, it inserts a new category into the database and references another function--getCategoryById--to return the newly created category.
const getCategoryById = (id) => {
  return db("categories").where({ id }).first();
};

const createCategory = (category) => {
  return db("categories")
    .insert(category, "id")
    .then((ids) => {
      const [id] = ids;
      return getCategoryById(id);
    });
};
The problem is when I make the .post to create a new category, nothing is returned in the response. The database gets updated just fine but nothing is returned. You can see that I put a console.log in the controller and that is coming back undefined. I am not sure why.
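For reference, a purely diagnostic sketch of the model above (not a diagnosis) would log what each step resolves to, since the shape of the insert result can differ between database drivers:
const createCategory = (category) => {
  return db("categories")
    .insert(category, "id")
    .then((ids) => {
      // Depending on the driver this may be e.g. [7] or [{ id: 7 }].
      console.log("insert resolved with:", ids);
      const [id] = ids;
      return getCategoryById(id).then((row) => {
        console.log("getCategoryById returned:", row);
        return row;
      });
    });
};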

How to make a GET Request for a unique register with AXIOS and NodeJS/Express

I'm trying to make a GET request to an external API (the Rick and Morty API). The objective is to set up a GET request for a unique character, for example the character with id=3. At the moment my endpoint is:
Routes file:
import CharacterController from '../controllers/character_controller'
const routes = app.Router()
routes.get('/:id', new CharacterController().get)
export default routes
Controller file:
async get (req, res) {
  try {
    const { id } = req.params
    const oneChar = await axios.get(`https://rickandmortyapi.com/api/character/${id}`)
    const filteredOneChar = oneChar.data.results.map((item) => {
      return {
        name: item.name,
        status: item.status,
        species: item.species,
        origin: item.origin.name
      }
    })
    console.log(filteredOneChar)
    return super.Success(res, { message: 'Successfully GET Char request response', data: filteredOneChar })
  } catch (err) {
    console.log(err)
  }
}
The purpose of the map function is to retrieve only specific character data fields.
But the code above doesn't work. Please let me know any suggestions, thanks!
First of all I don't know why your controller is a class. Revert that and export your function like so:
const axios = require('axios');
// getCharacter is more descriptive than "get" I would suggest naming
// your functions with more descriptive text
exports.getCharacter = async (req, res) => {
Then in your routes file you can easily import it and attach it to your route handler:
const { getCharacter } = require('../controllers/character_controller');
routes.get('/:id', getCharacter);
Your routes imports also seem off: why are you creating a new Router from app? You should be calling:
const express = require('express');
const routes = express.Router();
Next, go back to your controller. Your logic was off: if you check the API you'll notice that the character/:id endpoint responds with a single character, so .results doesn't exist. The following will give you what you're looking for:
exports.getCharacter = async (req, res) => {
  try {
    const { id } = req.params;
    const oneChar = await axios.get(
      `https://rickandmortyapi.com/api/character/${id}`
    );
    console.log(oneChar.data);
    // Return the name, status, species, and origin keys from oneChar
    const { name, status, species, origin } = oneChar.data;
    const filteredData = Object.assign({}, { name, status, species, origin });
    res.send(filteredData);
  } catch (err) {
    return res.status(400).json({ message: err.message });
  }
};
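Wired together, the routes file would then look something like this (a sketch; the file layout and the mount path in the comment are assumptions, not from the question):
const express = require('express');
const { getCharacter } = require('../controllers/character_controller');

const routes = express.Router();

// e.g. GET /characters/3 -> { name, status, species, origin }
routes.get('/:id', getCharacter);

module.exports = routes;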

How can I update my images with multer in Node.js (the images are stored in MongoDB)

I am creating the backend server for an ecommerce website. I handle file uploads with multer and store them in GridFS. Now, when updating a product, not all images may be submitted to be updated. I need a way to identify which field was updated, so I can delete the current file and upload the new file to MongoDB. I've reached the point where I think it's impossible, but I'm still trying. Is there any alternative approach I could take? All responses are greatly appreciated.
Here is the code for the router and the 3 middleware functions I call when a request is made to said route.
```
router
  .route("/:id")
  .put(
    productController.readPhotos,
    productController.appendPhotoOnRequestBody,
    productController.updateProduct
  )

exports.readPhotos = upload.fields([
  { name: "coverPhoto", maxCount: 1 },
  { name: "colorPhoto", maxCount: 4 },
]);

exports.appendPhotoOnRequestBody = asyncHandler(async (req, res, next) => {
  req.addedFiles = [];
  if (
    req.originalUrl === "/api/products" &&
    req.method === "POST" &&
    !req.files
  ) {
    // Error message to send to user if the above condition is true.
    const message = `To create a new product, a coverPhoto and at least one color with a colorPhoto must be specified`;
    return next(new AppError(400, message));
  } else if (!req.files) next();
  if (req.files?.coverPhoto) {
    // Extract coverPhoto
    const [coverPhoto] = req.files.coverPhoto;
    // Process images
    const coverPhotoBuffer = await processImage(coverPhoto.buffer, [300, 300]);
    // Pushing the coverPhoto to the db
    const coverPhotoStream = Readable.from(coverPhotoBuffer);
    const coverPhotoValue = await pushToDbFromStream(
      coverPhotoStream,
      req,
      coverPhoto
    );
    req.body.coverPhoto = coverPhotoValue.filename;
    req.addedFiles.push(req.body.coverPhoto);
  }
  if (req.files?.colorPhoto) {
    // Extract colorPhotos
    const colorPhotos = [...req.files.colorPhoto];
    const colorPhotoBuffers = await Promise.all(
      colorPhotos.map((photo) => processImage(photo.buffer, [50, 50]))
    );
    // Pushing the colorPhotos to the db
    const colorPhotosStreams = colorPhotoBuffers.map((buff) =>
      Readable.from(buff)
    );
    const colorPhotoValues = await Promise.all(
      colorPhotosStreams.map((stream, index) =>
        pushToDbFromStream(stream, req, colorPhotos[index])
      )
    );
    req.body.colors = JSON.parse(req.body.colors);
    colorPhotoValues.forEach((value, index) => {
      req.body.colors[index].colorPhoto = value.filename;
    });
    req.body.colors.forEach((color) => req.addedFiles.push(color.colorPhoto));
  }
  next();
});

exports.updateProduct = asyncHandler(async (req, _, next) => {
  req.product = await Product.findByIdAndUpdate(req.params.id, req.body, {
    runValidators: true,
    new: true,
  });
  next();
});
```
What I have done is upload the new images, then merge them with the images that are already in MongoDB, and update the document with the merged result.
In this case, you are not querying the product first to obtain the previously added images.
I consider that necessary, because MongoDB will replace the previously added images if you only send the new ones.
There is a reference to the images in a color property of the product schema, using the embedded document style. Instead, you could use references: create separate Color documents and update them individually.
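A minimal sketch of that merge step, reusing the updateProduct middleware from the question (the exact merge rules, e.g. matching colors by index, are an assumption about how old and new photos should be combined):
```
exports.updateProduct = asyncHandler(async (req, _, next) => {
  // Fetch the current product first so existing photo references are not lost.
  const existing = await Product.findById(req.params.id);

  // Keep the stored coverPhoto unless a new one was uploaded in this request.
  if (!req.body.coverPhoto && existing.coverPhoto) {
    req.body.coverPhoto = existing.coverPhoto;
  }

  // For colors that were not re-uploaded this time, fall back to the stored colorPhoto.
  if (Array.isArray(req.body.colors) && Array.isArray(existing.colors)) {
    req.body.colors = req.body.colors.map((color, index) => ({
      ...color,
      colorPhoto: color.colorPhoto || existing.colors[index]?.colorPhoto,
    }));
  }

  req.product = await Product.findByIdAndUpdate(req.params.id, req.body, {
    runValidators: true,
    new: true,
  });
  next();
});
```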

Mongoose - Best way to update a record

I want to update a record with Mongoose. I've seen other answers on SO, but they are very old, like 4-5 years ago.
What is the best way to update a record with Mongoose, both for PUT and PATCH requests?
Currently I'm doing it like so, but I don't think this is a good way for models with many fields.
export const updateTestimonial = asyncHandler(async (req, res) => {
  const values = await testimonialSchema.validateAsync(req.body);
  const testimonial = await Testimonial.findById(req.params.id);
  if (!testimonial) {
    return res.status(404).json({ error: 'Testimonial not found' });
  }
  testimonial.name = values.name;
  testimonial.text = values.text;
  const updatedTestimonial = await testimonial.save();
  res.status(200).json(updatedTestimonial);
});
You can use findByIdAndUpdate
export const updateTestimonial = asyncHandler(async (req, res) => {
  const values = await testimonialSchema.validateAsync(req.body);
  const testimonial = await Testimonial.findByIdAndUpdate(req.params.id, values, { new: true });
  if (!testimonial) {
    return res.status(404).json({ error: 'Testimonial not found' });
  }
  res.status(200).json(testimonial);
});
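For the PATCH side of the question, a sketch of a handler that only touches the fields that were actually sent could look like this (it assumes your validation schema accepts partial payloads):
export const patchTestimonial = asyncHandler(async (req, res) => {
  const testimonial = await Testimonial.findByIdAndUpdate(
    req.params.id,
    { $set: req.body },                 // only overwrite the keys that were sent
    { new: true, runValidators: true }
  );
  if (!testimonial) {
    return res.status(404).json({ error: 'Testimonial not found' });
  }
  res.status(200).json(testimonial);
});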

fs.writeFile is making a POST request loop infinitely within my express app

I have this current server code:
const express = require("express")
const fs = require("fs")
const router = express.Router()
const path = require("path")
const todos = JSON.parse(fs.readFileSync(path.join(__dirname, "../db", "todolist.json"), "utf8"))
router.get("/", async (req, res) => {
  res.send(todos)
})

router.post("/new", async (req, res) => {
  const { title, description } = req.body
  const todoItem = {
    id: "3",
    title,
    description
  }
  todos.todos.push(todoItem)
  const data = JSON.stringify(todos, null, 2)
  fs.writeFile(path.join(__dirname, "../db", "todolist.json"), data, () => {})
  res.status(201).json(todoItem)
})
client:
console.log("Hello world!")
const somedata = {
  title: "A new boy",
  description: "Recieved from the client"
}

const main = async () => {
  const response1 = await fetch("http://localhost:3000/todo", {
    method: "GET",
  })
  const data1 = await response1.json()
  const response2 = await fetch("http://localhost:3000/todo/new", {
    method: "POST",
    body: JSON.stringify(somedata),
    headers: {
      'Content-Type': 'application/json',
      "Accept": "application/json"
    }
  })
  const data2 = await response2.json()
  return { data1, data2 }
}

main().then(data => console.log(data))
When I make a POST request to create a new entity, the browser just loops the request over and over until I have to quit the server manually. This does not happen if I use Postman for some reason. Does anybody see any obvious error here with how the writeFile method is used, and why it continuously reloads the browser so it keeps sending POST requests?
Thanks! :)
I had the same problem! And it took me about an hour to understand what my problem was:
If you use the live server extension, the server will restart every time you write, change or delete a file in the project folder!
So if your Node app writes a file, the live server will restart and the app writes the file again! => loop
In my case, I write a PDF file. All I had to do was tell the live server extension to ignore PDF files:
So I just added this to settings.json:
"liveServer.settings.ignoreFiles":["**/*.pdf"]
fs.writeFile is an asynchronous function, so to send a response after the file has been written you must do it in the callback. And of course, don't forget about error checking, i.e.:
router.post("/new", async (req, res) => {
  const { title, description } = req.body
  const todoItem = {
    id: "3",
    title,
    description
  }
  todos.todos.push(todoItem)
  const data = JSON.stringify(todos, null, 2)
  fs.writeFile(path.join(__dirname, "../db", "todolist.json"), data, (err) => {
    if (err) {
      throw err;
    }
    res.status(201).json(todoItem)
  })
})
Or you can use fs.writeFileSync as Muhammad mentioned earlier.
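The synchronous variant mentioned there would be a sketch like this; it is fine for a small local JSON file, though it blocks the event loop while writing:
router.post("/new", (req, res) => {
  const { title, description } = req.body
  const todoItem = {
    id: "3",
    title,
    description
  }
  todos.todos.push(todoItem)
  const data = JSON.stringify(todos, null, 2)
  // writeFileSync throws on failure, so wrap it in try/catch if you want to return a 500 instead.
  fs.writeFileSync(path.join(__dirname, "../db", "todolist.json"), data)
  res.status(201).json(todoItem)
})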
I think I found the problem. It seems the live server extension was messing things up when I had the client and server on separate ports, somehow making the browser refresh for every request made. I switched back to having them share a port, which makes it work. I still have to find a good way of separating them later without this bug happening, but that is for another time.
Thanks for your help :)
I share my working sample. The body-parser dependency is needed to get the body in a POST request. Please don't change the order in server.js. Check it and let me know.
And also check whether your client code is running in a loop.
My server.js
const express = require("express")
const fs = require("fs")
const router = express.Router()
const path = require("path")
const app = express();
const bodyParser = require("body-parser")
const todos = JSON.parse(fs.readFileSync(path.join(__dirname, "../db", "todolist.json"), "utf8"))
app.use(bodyParser.json());
app.use("/",router)
router.get("/todo", async (req, res) => {
  res.send(todos)
})

router.post("/todo/new", async (req, res) => {
  const { title, description } = req.body
  const todoItem = {
    id: "3",
    title,
    description
  }
  todos.todos.push(todoItem)
  const data = JSON.stringify(todos, null, 2)
  fs.writeFile(path.join(__dirname, "../db", "todolist.json"), data, () => {})
  res.status(201).json(todoItem)
});

app.listen(3000, () => {
  console.log(`Server running in Port`);
});
todolist.json
{
  "todos": []
}
I think you should use fs.writeFileSync() or write some code in its callback
