node.js mongoose subfield in a document

I have been working with node.js and mongoose for some time and I am hitting a wall. I have a database with 20,000 documents, and when I search the database from the CLI it works fine.
db.Tickets.find({ "Customers.Customer.CustomerID" : '123123123' })
This returns 256 results
Schema
const mongoose = require('mongoose');
const Schema = mongoose.Schema;

// Define collection and schema for Ticket
var Ticket = new Schema({
    UserName: { type: String },
    Status: { type: String },
    TicketNumber: { type: Number },
    Name: { type: String },
    Description: { type: String },
    TicketTypeName: { type: String },
    DueDate: { type: Date },
    MapCollectDate: { type: Date },
    NumberofUsersAffected: { type: Number },
    DNNumber: { type: String },
    RevisionDate: { type: Date },
    CommercialImpact: { type: String },
    Customers: [{
        Customer: [{
            CustomerID: Number,
            CustomerName: String
        }]
    }]
});
However, if I run the same query in node.js using mongoose, I can't get it to return anything.
I have a generic search that works
Ticket.find(function (err, tickets){
But I can't get the specific search to work.
This is how I connect to Mongo:
const config = require('./db');
//const Course = require('./models/Course');
//const CourseRoute = require('./routes/CourseRoute');
const Ticket = require('./models/Ticket');
const TicketRoute = require('./routes/TicketRoute');
const PORT = 4000;

mongoose.connect(config.DB).then(
    () => { console.log('Connected to MongoDB') },
    err => { console.log('Error connecting to MongoDB' + err) }
);
Output of the log
Your node js server is running on PORT: 4000
Connected to MongoDB
Connected to MySQL
My route endpoint:
router.route('/').get(function (req, res) {
    Ticket.find({ "Customers.Customer.CustomerID": global.auth_username }, function (err, ticket) {
        if (err) {
            console.log(err);
        } else {
            res.json(tickets);
        }
    });
});
I also tried it without the variable:
router.route('/').get(function (req, res) {
    Ticket.find({ "Customers.Customer.CustomerID": "123123123" }, function (err, ticket) {
        if (err) {
            console.log(err);
        } else {
            res.json(tickets);
        }
    });
});

I had the same issue when I forgot to connect to Mongoose before running a query:
mongoose.connect(MONGO_URL, mongoOptions)
    .then(() => {
        // do your thing here
    })
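Applied to the question's setup, the query would run inside the connection callback, something like this (a sketch reusing the Ticket model and query from the question):
mongoose.connect(MONGO_URL, mongoOptions)
    .then(() => {
        // the connection is open; queries will now hit the database
        Ticket.find({ "Customers.Customer.CustomerID": "123123123" }, function (err, tickets) {
            if (err) return console.log(err);
            console.log(tickets.length + " tickets found");
        });
    })
    .catch(err => console.log(err));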

You had over a year to figure this out, and I am sure that you did, but either way it seems that you have a typo in your code. The callback parameter is named ticket (function(err, ticket) {), whereas you are sending tickets (res.json(tickets);). In the generic test you correctly wrote tickets (Ticket.find(function (err, tickets){), which is probably why it worked.
The takeaway lesson here is: use debugging tools instead of logging; it makes it easier to catch such problems.
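For reference, here is the route from the question with the callback parameter and the logged name aligned:
router.route('/').get(function (req, res) {
    Ticket.find({ "Customers.Customer.CustomerID": "123123123" }, function (err, tickets) {
        if (err) {
            console.log(err);
        } else {
            res.json(tickets);
        }
    });
});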
Also, it would be appropriate to answer your own question once you've figured it out. But given that this is probably completely useless, you might as well delete it. Cheers!

Related

Update document in MongoDB via NodeJS

So my knowledge of NodeJS and MongoDB is non-existent (I just need to do a small code update for a friend) and I'm stuck.
I need to update a single document inside a collection via a unique id, but I can't seem to do it.
Here's the model (I've trimmed it down and cut out all unnecessary data). I'm trying to update the field notes inside a transaction.
In short, each entry in the given table (an Agent) has a collection of multiple Transactions & Documents. I need to update a specific Transaction with the unique _id that is auto-generated.
import { Schema, model } from 'mongoose';

interface Transaction {
    first_name: string;
    last_name: string;
    type: string;
    notes: string;
}

interface Agent {
    org_id: number;
    transactions: Array<Transaction>;
    documents: Array<string>;
}

const transactionSchema = new Schema<Transaction>({
    first_name: { type: String },
    last_name: { type: String },
    type: { type: String },
    notes: String,
});

const transactionsSchema = new Schema<Agent>({
    org_id: { type: Number },
    transactions: [transactionSchema],
    documents: [documentTypesSchema], // defined elsewhere; trimmed from the question
});

const AgentTransaction = model<Agent>(
    'agent_transaction_table',
    transactionsSchema
);

export default AgentTransaction;
Here's what I tried, which didn't work (obviously); again, I've trimmed out all unnecessary data. Just to clarify: the endpoint itself works, but the DB update does not.
import AgentTransaction from '../models/transaction'; // the above model

transaction.put('/notes', async (req, res) => {
    const { org_id, transaction_id, notes } = req.body;
    try {
        const notesResult = await AgentTransaction.updateOne({
            'transactions._id': transaction_id,
        }, {
            $set: {
                'notes': notes
            },
        });
        res
            .status(200)
            .json({ message: 'Updated', success: true, notesResult });
    } catch (error) {
        res.status(400).send(error);
    }
});
So I figured it out. Maybe it'll help someone else as well.
const notesResult = await AgentTransaction.updateOne({
    'transactions._id': { $in: [trunc2] }, // trunc2: the transaction _id (name kept from the original answer)
}, {
    $set: {
        'transactions.$.notes': notes
    },
});
The main issue was that the update needed to target the array field plus the positional operator plus the subdocument field ('transactions.$.notes'), not just the field on its own.
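As a side note, the same update can also be written with arrayFilters instead of the bare positional $ operator (a minimal sketch, assuming transaction_id holds the subdocument's _id):
const notesResult = await AgentTransaction.updateOne(
    { 'transactions._id': transaction_id },
    { $set: { 'transactions.$[t].notes': notes } },
    // the filtered positional operator $[t] matches array elements by _id
    { arrayFilters: [{ 't._id': transaction_id }] }
);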

Unable to find index for $geoNear query error with mongoose

I have the following code, which tries to create a secondary index with mongoose. I have followed the official Mongoose documentation to implement it (Mongoose documentation: Indexes section). However, when I send a GET request through Postman, I get the error "unable to find index for $geoNear query". My understanding is that in my case location is equivalent to a $geoNear object, so my code should be fine (I know it's not fine; that's why I got an error). Any comments or suggestions would be greatly appreciated.
app.js (GET endpoint)
app.get('/api/stores', (req, res) => {
    const zipCode = req.query.zip_code;
    const googleMapsURL = "https://maps.googleapis.com/maps/api/geocode/json";
    axios.get(googleMapsURL, {
        params: {
            address: zipCode,
            key: "KEY"
        }
    }).then((response) => {
        const data = response.data;
        const coordinates = [
            data.results[0].geometry.location.lng,
            data.results[0].geometry.location.lat,
        ];
        Store.find({
            location: {
                $near: {
                    $maxDistance: 3218,
                    $geometry: {
                        type: "Point",
                        coordinates: coordinates
                    }
                }
            }
        }, (err, stores) => {
            if (err) {
                console.log(err);
                res.status(500).send(err);
            } else {
                res.status(200).send(stores);
            }
        });
    }).catch((error) => {
        console.log(error);
    });
});
store.js
const mongoose = require('mongoose');

const storeSchema = mongoose.Schema({
    storeName: String,
    phoneNumber: String,
    address: {},
    openStatusText: String,
    addressLines: Array,
    location: {
        type: {
            type: String,
            enum: ['Point'],
            required: true
        },
        coordinates: {
            type: [Number],
            required: true
        }
    }
});

storeSchema.index({ location: "2dsphere" }, { sparse: true });

module.exports = mongoose.model('Store', storeSchema);
When creating a new index in MongoDB, you may have to drop the collection for the index to apply properly. Try creating the index on a fresh collection. Does that work?
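If you would rather not drop anything by hand, Mongoose can also reconcile the collection's indexes with the schema definition (a sketch; Model.syncIndexes() is available since Mongoose 5.2):
// drops indexes not declared in the schema, then builds the missing ones,
// including the 2dsphere index declared on `location`
Store.syncIndexes()
    .then(() => console.log('indexes synced'))
    .catch(err => console.log(err));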
I figured out this error by first creating a new database and collection (I had been using the default database named <dbname>). When I sent a POST request with the data I wanted to store in MongoDB, MongoError: Can't extract geo keys appeared. I fixed that error by following the steps in a related thread. After these steps, my GET request worked and the indexes were successfully created.
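For context, "Can't extract geo keys" means a stored document does not match the GeoJSON shape the 2dsphere index expects. With the schema above, a valid document would look like this (the values are illustrative; note the [longitude, latitude] order):
{
    "storeName": "Example Store",
    "location": {
        "type": "Point",
        "coordinates": [-73.856077, 40.848447]
    }
}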

Mongoosastic Does not Index On Save

I have a mongoose model defined like this.
const custSchema = new mongoose.Schema({
    name: {
        type: String,
        es_indexed: true,
        es_type: 'text'
    },
    phoneNumber: {
        type: String,
        es_indexed: true,
        es_type: String
    },
    email: String
})
custSchema.plugin(mongoosastic);

const Cust = module.exports = mongoose.model('Cust', custSchema);

Cust.createMapping(function (err, mapping) {
    if (err) {
        console.log(err);
    } else {
        console.log(mapping);
    }
});

let count = 0;
const stream = Cust.synchronize();

stream.on('data', () => {
    count = count + 1;
});
stream.on('close', () => {
    console.log("Total " + count + " documents indexed");
});
stream.on('error', (err) => {
    console.log(err);
});
When I add a new document to Cust, the new document does not get added to Elasticsearch unless I restart the server.
How can I solve this issue?
Struggled with this for a while until I discovered the issue was related to the Elasticsearch upgrade from v5 to v6, where "type" became a candidate for deprecation (it was removed entirely in v7) and an index no longer allows more than one type. The library most likely forces its own "type" ("_doc", in addition to likely "customer" in your case), but we need to manually set the one type in mongoosastic to "_doc" (which is necessary for Elasticsearch v6).
First, delete your original index where the index and type may be causing the conflict.
Then, in your model, change
custSchema.plugin(mongoosastic);
to
custSchema.plugin(mongoosastic, {
    type: '_doc'
});
Then create a new index after changing this option in the plugin section:
stream.on("data", function(err, doc) {
client.indices.create({
index: 'customers',
body: {
doc
}
}, function (error, response) {
console.log(response);
});
Was a pain to resolve, hopefully this will help a few others avoid the headache as well.

MongoDB and Nodejs insert ID with auto increment

I am new to NodeJS and MongoDB. I want to insert rows with an auto-increment primary key 'id'. I have also defined a function called getNextSequence on the Mongo server.
This works perfectly in the MongoDB shell:
> db.user.insert({
    "id" : getNextSequence('user_id'),
    "username" : "test",
    "email" : "test@test.com",
    "password" : "test123"
})
Now I want to insert from NodeJS. I have tried this, but it's not working:
db.collection('user').insertOne({
    id: "getNextSequence('user_id')",
    username: query.name,
    email: query.email,
    password: query.pass
}, function (err, result) {
    assert.equal(err, null);
    console.log("row inserted");
    callback();
});
Assuming that getNextSequence is a server-side function (i.e. a method you defined and saved via db.system.js.save), it is not callable outside of the server. One way to go is to use eval, which forces the server to evaluate a string as JS code, even though it is not good practice (db.eval has been deprecated since MongoDB 3.0 and was removed in 4.2). Here is an example:
db.eval('getNextSequence(\'user_id\')', function (err, result) {
    db.collection('users').insert({
        "id": result,
        "username": "test",
        "email": "test@test.com",
        "password": "test123"
    });
});
Another way is to follow the MongoDB tutorial and implement getNextSequence directly in NodeJS. The syntax is pretty much the same:
function getNextSequence(db, name, callback) {
    // { new: true } makes findAndModify return the incremented value
    // instead of the pre-update document
    db.collection("counters").findAndModify(
        { _id: name },
        null,
        { $inc: { seq: 1 } },
        { new: true },
        function (err, result) {
            if (err) return callback(err, result);
            callback(err, result.value.seq);
        }
    );
}
You then use it in your NodeJS code like:
getNextSequence(db, "user_id", function (err, result) {
    if (!err) {
        db.collection('users').insert({
            "_id": result,
            // ...
        });
    }
});
Note: of course, you need to have set up the counters collection as explained in the docs.
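Seeding that collection is a one-liner per sequence name in the mongo shell:
db.counters.insertOne({ _id: "user_id", seq: 0 })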
You can also use "mongoose-auto-increment".
The code has just 4 lines
var mongoose = require('mongoose');
var autoIncrement = require('mongoose-auto-increment');
autoIncrement.initialize(mongoose.connection);
userSchema.plugin(autoIncrement.plugin, 'user');
example :
npm i mongoose-auto-increment
connections.js :
const mongoose = require('mongoose');
require("dotenv").config(); // note: config() must be called
const uri = process.env.MONGOURL;

mongoose.connect(uri, { useNewUrlParser: true }, (err) => {
    if (!err) { console.log('MongoDB Connection Succeeded.') }
    else { console.log('Error in DB connection : ' + err) }
});

require('../schema/userSchema');
userSchema.js :
var mongoose = require('mongoose');                     // 1. require mongoose
var autoIncrement = require('mongoose-auto-increment'); // 2. require mongoose-auto-increment

var userSchema = new mongoose.Schema({
    name: { type: String },
    password: { type: String },
    email: { type: String, unique: true, required: 'This field is required.' },
});

autoIncrement.initialize(mongoose.connection);   // 3. initialize autoIncrement
userSchema.plugin(autoIncrement.plugin, 'user'); // 4. use autoIncrement

mongoose.model('user', userSchema);
To accomplish this, we will create a function that keeps trying to save the document until it has been saved with an incremented _id:
async function retryUntilSave(db, task) {
    try {
        const index = await db.collection('tasks').find().count() + 1;
        await db.collection('tasks').insertOne(Object.assign(task, { _id: index }));
    } catch (error) {
        if (error.message.includes("_id_ dup key")) {
            console.log("ID already exists!");
            console.log("Retrying...");
            retryUntilSave(db, task);
        } else {
            console.log(error.message);
        }
    }
}
We could also set task._id = index directly instead of using Object.assign().
Finally, you can test this by making some concurrent requests:
for (let index = 0; index < 20; index++) {
    setTimeout(async () => {
        await retryUntilSave(db, { title: "Some Task" });
    }, 1000);
}
This function easily handles two or more tasks submitted at the same time: mongod throws an error when we try to insert a document with a duplicate _id, so we retry saving the document with an incremented _id, and this process runs until the document is saved successfully.
You can also use "mongodb-autoincrement" module of node js. For example:
var autoIncrement = require("mongodb-autoincrement");

exports.yourMethod = function (newData, callback) {
    autoIncrement.getNextSequence(db, 'your-collection-name', function (err, autoIndex) {
        newData.id = autoIndex;
        // save your data with this auto-generated id
    });
};
You can use the package below on a model schema to auto-increment your collection field:
mongoose-auto-increment // you can download it from npm
Here I am not focusing on how to connect to MongoDB; I just focus on how you can integrate auto-increment into your model/collection/table.
const mongoose = require("mongoose"); //
const autoIncrement = require("mongoose-auto-increment");
const post_schema = new mongoose.Schema({
title: {
type: String,
required: true,
min: 3,
max: 225,
},
slug: {
type: String,
required: true,
},
});
autoIncrement.initialize(mongoose.connection);
post_schema.plugin(autoIncrement.plugin, {
model: "post", // collection or table name in which you want to apply auto increment
field: "_id", // field of model which you want to auto increment
startAt: 1, // start your auto increment value from 1
incrementBy: 1, // incremented by 1
});
module.exports = mongoose.model("post", post_schema);

Search in mongoosastic doesn't give any result

I tried to use mongoosastic search but it doesn't work.
Job.js (Mongoose schema)
var mongoose = require('mongoose');
var mongoosastic = require('mongoosastic');
var Schema = mongoose.Schema;

var JobSchema = Schema({
    title: { type: String, es_indexed: true },
    category: { type: Schema.Types.ObjectId, ref: 'Category', es_indexed: true },
    salary: { type: String, es_indexed: true },
});

JobSchema.plugin(timestamps); // timestamps plugin required elsewhere (trimmed from the question)
JobSchema.plugin(mongoosastic, {
    hosts: [
        'localhost:9200'
    ]
});

module.exports = mongoose.model('Job', JobSchema);
Routes.js
var express = require('express');
var app = express();
var Job = require('../models/job');

Job.createMapping(function (err, mapping) {
    if (err) {
        console.log('error creating mapping (you can safely ignore this)');
        console.log(err);
    } else {
        console.log('mapping created!');
        console.log(mapping);
    }
});

app.post('/api/search/', function (req, res, next) {
    Job.search({ query_string: { query: req.body.test } }, function (err, results) {
        if (err) return next(err);
        res.send(results);
    });
});
This is the data that has already been saved in MongoDB:
{
    "id": 12313,
    "title": "Engineer"
},
{
    "id": 13123,
    "title": "Doctor"
},
{
    "id": 121243134,
    "title": "Software Engineer"
}
When I run the search with a term like "Engineer", I keep getting no results.
Update with the curl commands:
curl -XGET localhost:9200/_mapping
curl -XGET localhost:9200/_search
Since title is an analyzed String, its value is indexed using the standard analyzer, i.e. in lowercased form, so "Engineer" will be indexed as "engineer".
Try searching for "engineer" in lowercase instead.
UPDATE
Based on our discussion, it seems that the problem is simply that your Elasticsearch is empty. Since you have existing data in MongoDB, you need to make sure to call the synchronize() method on your model in order to index your whole MongoDB collection into Elasticsearch.
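A one-off synchronization mirrors the synchronize() usage shown earlier on this page (a sketch using the Job model):
var stream = Job.synchronize();
var count = 0;

stream.on('data', function () {
    count++;
});
stream.on('close', function () {
    console.log('indexed ' + count + ' documents');
});
stream.on('error', function (err) {
    console.log(err);
});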
