I'm trying to implement a password reset. I take the user's phone number, find the corresponding document in my Cloudant database by that phone number, then take the new password and try to update that document with a PUT request.
app.post('/pass_rst', function(req, response){
    var log = '';
    //log is just for me to see what's happening
    var phone = req.body.phone;
    log += phone + '\n';
    db.find({selector: {'phone': phone}}, function(err, result){
        if(err){
            throw err;
        }
        if(result.docs.length == 0){
            response.send('User doesnt exist');
        }else{
            var existing_data = result.docs[0];
            log += JSON.stringify(existing_data) + '\n';
            var upd_pswd = req.body.new_password;
            log += upd_pswd + '\n';
            var new_data = existing_data;
            new_data.password = upd_pswd;
            log += JSON.stringify(new_data) + '\n';
            var id = result.docs[0]._id;
            log += id + '\n';
            //make PUT request to db
            var options = {
                host: dbCredentials.host,
                port: dbCredentials.port,
                path: '/' + dbCredentials.dbName + '/' + id,
                //url: dbCredentials.url + '/' + dbCredentials.dbName + '/' + id,
                method: 'PUT',
                json: new_data,
                headers: {
                    'Content-Type': 'application/json',
                    'accept': '*/*'
                }
            };
            log += JSON.stringify(options) + '\n';
            var httpreq = http.request(options);
            //log += JSON.stringify(httpreq);
            httpreq.on('error', function(e){
                response.send('Error' + e.message);
            });
            response.send(log + '\n\n\nUpdated');
        }
    });
});
dbCredentials is defined above as follows:
dbCredentials.host = vcapServices.cloudantNoSQLDB[0].credentials.host;
dbCredentials.port = vcapServices.cloudantNoSQLDB[0].credentials.port;
dbCredentials.user = vcapServices.cloudantNoSQLDB[0].credentials.username;
dbCredentials.password = vcapServices.cloudantNoSQLDB[0].credentials.password;
dbCredentials.url = vcapServices.cloudantNoSQLDB[0].credentials.url;
I've tried tinkering around with it, but in the best-case scenario I don't get an error and I see "Updated", yet nothing actually changes in the database. Sometimes I get an error saying: 502 Bad Gateway: Registered endpoint failed to handle the request.
If you see what's going wrong, please let me know. Thank you.
This is the documentation on how to update documents in Cloudant:

UPDATE

Updating a document

PUT /$DATABASE/$DOCUMENT_ID HTTP/1.1
{
    "_id": "apple",
    "_rev": "1-2902191555",
    "item": "Malus domestica",
    "prices": {
        "Fresh Mart": 1.59,
        "Price Max": 5.99,
        "Apples Express": 0.79,
        "Gentlefop's Shackmart": 0.49
    }
}
To update (or create) a document, make a PUT request with the updated
JSON content and the latest _rev value (not needed for creating new
documents) to https://$USERNAME.cloudant.com/$DATABASE/$DOCUMENT_ID.
If you fail to provide the latest _rev, Cloudant responds with a 409 error. This error prevents you from overwriting data changed by other processes. If the write quorum cannot be met, a 202 response is returned.
Example response:

{ "ok": true, "id": "apple", "rev": "2-9176459034" }
The response contains the ID and the new revision of the document or
an error message in case the update failed.
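Measured against this documentation, the main problem in the snippet above is that the PUT never actually reaches Cloudant: the core http module ignores the json option (that option belongs to the request package), so no body is ever written, httpreq.end() is never called, so the request is never sent, and the options carry no credentials. Below is a minimal sketch of the PUT using Node's core https module, reusing dbCredentials and the document returned by db.find (which should already carry the _id and _rev the docs above require). Treat the details as assumptions to adapt, not a drop-in fix:

// Sketch only: assumes dbCredentials and the found doc (existing_data) are as in the question,
// and that the credentials point at the usual Cloudant HTTPS port.
var https = require('https');

function updatePassword(existing_data, upd_pswd, response) {
    existing_data.password = upd_pswd;              // keep the _id and _rev already on the doc
    var body = JSON.stringify(existing_data);

    var options = {
        host: dbCredentials.host,
        port: dbCredentials.port,
        path: '/' + dbCredentials.dbName + '/' + encodeURIComponent(existing_data._id),
        method: 'PUT',
        auth: dbCredentials.user + ':' + dbCredentials.password,
        headers: {
            'Content-Type': 'application/json',
            'Content-Length': Buffer.byteLength(body)
        }
    };

    var httpreq = https.request(options, function(res) {
        var chunks = '';
        res.on('data', function(c) { chunks += c; });
        res.on('end', function() { response.send(chunks); });   // {"ok":true,...} or an error doc
    });
    httpreq.on('error', function(e) { response.send('Error ' + e.message); });
    httpreq.write(body);   // the body has to be written explicitly...
    httpreq.end();         // ...and the request has to be ended, or nothing is sent at all
}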
I am using Bluemix, Node.js, and Cloudant. The best way that worked for me to do the update is to use the nano package for db interaction from Node.js.
You can refer to the post here: Updating and Deleting documents using NodeJS on Cloudant DB
The summary is: by making use of the nano API you can easily update the record. You need to make sure to use the _id and the right _rev number while you use nano. This in turn uses the PUT method underneath.
When you are including nano, make sure to update package.json to have the nano dependency added. Let me know if you have further questions on the update/delete.
When using the cloudant node.js module there is no separate update function.
You need to use the db.insert function also for the update with the right doc revision, so you need to read the latest revision before the insert.
https://github.com/apache/couchdb-nano#document-functions
"and also used to update an existing document, by including the _rev token in the document being saved:"
// read existing document from db
db.get(key, function(error, existing) {
    if (!error) {
        // use revision of existing doc for new doc to update
        obj._rev = existing._rev;
    }
    // call db insert
    db.insert(obj, key, cb);
});
I have a Node.js backend call with MongoDB. This particular API call was working fine for a long time, but when testing something on the frontend I realized the call was not going through. I've checked the routes, controller, and index files, and tested the call through Postman, which works fine, returns no errors, and even returns an ObjectId (which means it must be interacting with the database, right?). However, when I search in the mongo shell nothing comes back, which tells me it is not saving. I cannot find anything wrong and get no errors anywhere along the way. I checked on MongoDB Atlas: the collection only has 4 KB of data, so it is not that it is 'too full', and I have tested all the other API calls (get, patch, delete), which work fine; none of my other collections have this issue.
Even weirder, during the save call I push the new ID into two other collections' documents as a ref. The mongo shell does show that the new ID is populating to those other documents, yet the actual Visit document is not saved... which should happen prior to the push, and needs to happen in order to get the ObjectId.
Below is the controller for adding a new Visit document, the route for it, and the response from Postman. I really have no idea how else to determine what is taking place.
Controller
exports.addVisit = async (req, res) => {
    const hoursValue = getTotalHours(req.body.visitStart, req.body.visitEnd)
    try {
        const visit = new Visit({
            totalHours: hoursValue,
            user: req.body.user,
            client: req.body.client,
            visitStart: req.body.visitStart,
            visitEnd: req.body.visitEnd,
            location: req.body.location
        });
        const user = await User.findById(req.body.user);
        const client = await Client.findById(req.body.client);
        visit.user = user._id
        visit.client = client._id
        visit.save();
        user.visits.push(visit._id);
        user.save();
        client.visits.push(visit._id);
        client.save();
        console.log(visit)
        console.log(user)
        console.log(client)
        res.status(201).send(visit);
    } catch (error) {
        res.status(400)
        res.send({ error: "Error adding visit", error})
    }
}
Route
router.route("/visits").post(addVisit)
Postman call to: http://localhost:5000/visitapi/visits
{
"client": "6205a8313fe12d6b4ec354c4",
"location": "Home",
"user": "62410a1dcaac9a3d0528de7a",
"visitStart": "2022-10-12T17:00:00.000Z",
"visitEnd": "2022-10-12T19:00:11.000Z"
}
Postman response
{
"client": "6205a8313fe12d6b4ec354c4",
"user": "62410a1dcaac9a3d0528de7a",
"location": "Home",
"visitStart": "2022-10-12T17:00:00.000Z",
"visitEnd": "2022-10-12T19:00:11.000Z",
"totalHours": 2,
"goals": [],
"id": "635302bb48e85ff6ad17ee59"
}
NodeJs console logging the same new document with no errors:
{
client: new ObjectId("6205a8313fe12d6b4ec354c4"),
user: new ObjectId("62410a1dcaac9a3d0528de7a"),
location: 'Home',
visitStart: 2022-10-12T17:00:00.000Z,
visitEnd: 2022-10-12T19:00:11.000Z,
totalHours: 2,
_id: new ObjectId("635302bb48e85ff6ad17ee59"),
goals: []
}
MongoShell showing the Client collection document stored the new Visit document Id:
visits: [
ObjectId("6257158d157e807e51c7e009"),
ObjectId("62fd852a252b83f4bc8f9782"),
ObjectId("63056cee252b83f4bc8f97e9"),
ObjectId("634ee01ec582da494032c73e"),
ObjectId("634ee09cc582da494032c7aa"),
ObjectId("634ee3d6ddbe3f7e6641d69e"),
ObjectId("634efcf1ddbe3f7e6641d6f9"),
ObjectId("634efdd3ddbe3f7e6641d71b"),
ObjectId("635029937da8972360d907c1"),
ObjectId("6350a0e37da8972360d9084f"),
ObjectId("635302bb48e85ff6ad17ee59") //_id matches the same returned by Postman/ NodeJS
],
Again, everything goes through with no errors in Postman or on the front or back end, and the backend even logs and returns the new document, but no new document is saved to the DB unless I insert it manually in the mongosh shell, and this happens only for this one collection. Totally lost, I'd appreciate any guidance on how to explore/resolve this. Thanks
Edited to include the solution based on the discussion in the comments
The problem could be in the Mongoose schema. If the property names do not match, Mongoose will simply ignore the mismatched properties.
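For example, if the schema declared a field named start instead of visitStart, the value set by the controller would be silently dropped. A minimal sketch of a schema whose property names line up with the controller above - the file location, ref names, and types are assumptions, not taken from the question:

// models/Visit.js (hypothetical path) - property names must match what the controller sets
const mongoose = require('mongoose');

const visitSchema = new mongoose.Schema({
    user:       { type: mongoose.Schema.Types.ObjectId, ref: 'User' },
    client:     { type: mongoose.Schema.Types.ObjectId, ref: 'Client' },
    visitStart: Date,
    visitEnd:   Date,
    totalHours: Number,
    location:   String,
    goals:      [mongoose.Schema.Types.ObjectId]
});

module.exports = mongoose.model('Visit', visitSchema);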
Previous answer
Those MongoDB calls are asynchronous and return promises. You might want to add await to all of them.
visit.user = user._id
visit.client = client._id
await visit.save(); // <- this
user.visits.push(visit._id);
await user.save(); // <- this
client.visits.push(visit._id);
await client.save(); // <- and this
I'm having difficulty updating an existing metafield with the Shopify API. Each time I receive an error advising me that the variant already exists... so it must think I'm trying to create a new one (and not update it).
I thought this might be an issue with 'put' vs 'post', so I changed my method to PUT; however, the error persists. I've hard-wired all my variables to make it easier to test.
I'm working with Cloudinary. I'm using https://github.com/sinechris/shopify-node-api with Express.js
app.post('/upload', function(req, res){
    // upload page... assume we have uploaded our image - but have hard-wired a local file in for now
    cloudinary.uploader.upload('/Users/Rob/Pictures/testimg.jpg', function(savedImg) {
        var imageURL = savedImg.url;
        console.log(imageURL)
    },
    {
        public_id: "testimg"
    });
    // the saved image is returned - so we add it to our updateMetafieldData json object
    var updateMetafieldData = {
        "variant": {
            "id": '253818949',
            "metafields": [
                {
                    "key": "variant_image_0",
                    "value": 'testimg', // whatever the public id of our image is.
                    "value_type": "string",
                    "namespace": "variant_image"
                }
            ]
        }
    }
    // and post the result to shopify - then redirect to /getvariants
    Shopify.put('/admin/variants/253818949.json', updateMetafieldData, function(data){
        // res.redirect('/getvariants')
    });
});
I actually created Shopify Node API and only just happened upon this months later, but thought I'd answer for anyone else coming along.
Take a look at the shopify API here: https://docs.shopify.com/api/metafield#update
You can update the metafield directly by performing a PUT request against the metafield resource instead of the variant like so:
/admin/metafields/#{id}.json
You would of course need to know the ID of the metafield first so that would require a call to the variant first or you could simply store the id in your local database for reference.
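As a rough sketch using the same Shopify.put helper from the question - the metafield id 721389482 is a placeholder, not a real value; fetch the variant's metafields first (e.g. GET /admin/variants/253818949/metafields.json) or keep the id in your own database:

// Sketch: update the metafield resource directly instead of the variant.
var updateMetafieldData = {
    "metafield": {
        "id": 721389482,        // placeholder - use the real metafield id
        "value": "testimg",     // the Cloudinary public id
        "value_type": "string"
    }
};

Shopify.put('/admin/metafields/721389482.json', updateMetafieldData, function(data) {
    console.log(data);          // Shopify echoes back the updated metafield
    // res.redirect('/getvariants');
});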
Pretty straightforward, and without any special configuration:
In the project directory I entered the command:
$ meteor mongo to access the MongoDB instance.
From there (the mongo shell), I switched to the meteor db using the command use meteor, then entered some basic data to test:
j = { name: "mongo" }
k = { x: 3 }
db.testData.insert(j)
db.testData.insert(k)
I checked and got results by entering: db.testData.find()
Here's my Meteor code, given that MongoDB access is only required on the client:
if (Meteor.isClient) {
    Template.hello.greeting = function () {
        return "Welcome to test.";
    };
    Template.hello.events({
        'click input' : function () {
            // template data, if any, is available in 'this'
            if (typeof console !== 'undefined')
                console.log("You pressed the button");
        }
    });
    Documents = new Meteor.Collection('testData');
    var document = Documents.find();
    console.log(document);
    var documentCbResults = Documents.find(function(err, items) {
        console.log(err);
        console.log(items);
    });
}
Checking in the browser, the logs say undefined. I was unsuccessful in retrieving data from MongoDB and showing it in the client console.
What am I missing?
For this answer I'm going to assume this is a newly created project with autopublish still on.
As Christian pointed out, you need to define Documents on both the client and the server. You can easily accomplish this by just putting the collection definition at the top of the file or in another file which isn't in either of the server or client directories.
An example which prints the first two test documents could look like this:
Documents = new Meteor.Collection('testData');

if (Meteor.isClient) {
    Template.hello.greeting = function () {
        return "Welcome to apui.";
    };
    Template.hello.events({
        'click input' : function () {
            var documents = Documents.find().fetch();
            console.log(documents[0]);
            console.log(documents[1]);
        }
    });
}
Note the following:
The find function returns a cursor. This is often all you want when writing template code. However, in this case we need direct access to the documents to print them so I used fetch on the cursor. See the documentation for more details.
When you first start the client, the server will read the contents of the defined collections and sync all documents (if you have autopublish on) to the client's local minimongo database. I placed the find inside of the click event to hide that sync time. In your code, the find would have executed the instant the client started and the data probably would not have arrived in time.
Your method of inserting initial items into the database works (you don't need the use meteor by the way), however mongo will default to using an ObjectId instead of a string as the _id. There are subtle ways that this can be annoying in a meteor project, so my recommendation is to let meteor insert your data if at all possible. Here is some code that will ensure the testData collection has some documents:
if (Meteor.isServer) {
    Meteor.startup(function() {
        if (Documents.find().count() === 0) {
            console.log('inserting test data');
            Documents.insert({name: "mongo"});
            Documents.insert({x: 3});
        }
    });
}
Note this will only execute if the collection has no documents in it. If you ever want to clear out the collection you can do so via the mongo console. Alternatively you can drop the whole database with:
$ meteor reset
It's not enough to only define collections on the client side. Your mongo db lives on the server and your client needs to get its data from somewhere. It doesn't get it directly from mongodb (I think), but gets it via syncing with the collections on the server.
Just define the Documents collection in the joint scope of client and server. You may also need to wait for the subscription to Documents to complete before you can expect content. So safer is:
Meteor.subscribe('testData', function() {
    var document = Documents.find();
    console.log(document);
});
I am using Node.js + Express to interact with mongodb.
I am trying to write a mongodb update function wrapper.
whenever I make a post request mongo reports "Invalid modifier specified: $set undefined".
Anyone know what's wrong with my function wrapper? Or is there a better way to do this?
my update_personal_info.js looks like this:
exports.update_personal_info = function(req, res){
    var critia = req.body.critia;
    var data = req.body.data;
    db.collection('accounts', function(err, collection) {
        collection.update(critia, data, function (err, doc) {
            if (err) {
                res.send("There's problem updating the db");
            }
        });
    });
}
The post data I am using to send via Chrome's extension "Advance Rest Client":
critia={"_id": "user_1234"}
data={$set: {"password": "44444444"}}
my app.js looks like this:
app.post('/1',update_personal_info);
accounts collection looks like this:
{
    "_id": "user_1234",
    "name": "John Chu",
    "password": "111111",
    "address": [
        {"old": "123 seattle st. WA. 123456"},
        {"new": "123 new york st. DC. 123456"}
    ]
}
Using express 3.4.4 and mongodb 1.3.20 it works just fine when I run it. Are you using a really old version of the mongodb driver?
One thing to make sure you are doing is setting the Content-Type correctly in the Advanced REST Client. If you are using the Form tab in the Payload section, you need to set it to application/x-www-form-urlencoded.
Some general guidance:
You don't want to have the client send in both the selector and the update document. It opens you up to someone sending in values that could do real damage to your data.
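As a rough sketch of that guidance (the export name and fields here are assumptions, not from the question): let the client post only the user id and the new password, and build both the selector and the $set modifier on the server:

// Sketch: the client posts only { "_id": "user_1234", "password": "44444444" };
// the selector and the $set modifier are constructed server-side.
exports.update_password = function(req, res) {
    var selector = { _id: req.body._id };
    var update = { $set: { password: req.body.password } };
    db.collection('accounts', function(err, collection) {
        collection.update(selector, update, function(err, result) {
            if (err) {
                return res.send("There's a problem updating the db");
            }
            res.send({ ok: true });
        });
    });
};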
I've solved my own problem by converting the post data strings into objects using the eval() function:
var critia = eval("("+req.body.critia+")");
var data = eval("("+req.body.data+")");
Hope this answer will help others.
Cheers
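As a side note (not part of the original answer): eval() will execute any JavaScript the client sends, so a safer sketch is JSON.parse, which requires the posted strings to be strict JSON - e.g. the $set key would have to be quoted, as in data={"$set": {"password": "44444444"}}:

// Assumes the client posts strict JSON in both fields
var critia = JSON.parse(req.body.critia);
var data = JSON.parse(req.body.data);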
I'm currently trying to make a registration form using MongoDB and Node.js. I've created a new database and collection, and I want to store username, password, email, and insert_time in my database.
I've added unique indexes to username and email and checked that they work: I cannot add a duplicate entry using mongo's console or RockMongo (a PHP MongoDB manager), so the indexes are fine.
However, when the piece of code that is supposed to register a new account executes and makes an insert with data that is already in the database, it returns an object containing all the data that was supposed to be added, with a new, unique id. The point is, it should return an error saying that entries cannot be duplicated and the insert failed; instead it returns the data back and gives it a new id. Data that already resides in the database remains untouched: even the ID stays the same, and it is not rewritten with the new one returned by the script's insert.
So, the question is... what am I doing wrong? Or maybe everything is fine and the database's insert should return data even if it failed?
I even tried defining the indexes right before executing the insert.
I tried inserting the data using MongoDB's default functions and mongojs functions; the result is the same in both cases.
The code I'm trying to execute (for mongoJS):
var dbconn = require("mongojs").connect('127.0.0.1:27017/db', ['users']);

var register = function(everyone, params, callback)
{
    // TODO: validation
    dbconn.users.ensureIndex({username: 1}, {unique: true});
    dbconn.users.ensureIndex({email: 1}, {unique: true});
    dbconn.users.save(
        {
            username: params.username,
            password: params.password,
            email: params.email,
            insert_time: Date.now()
        },
        function(error, saved)
        {
            if(error || !saved)
            {
                callback(false);
            }
            else
            {
                console.log(error);
                console.log(saved);
                callback(true);
            }
        });
}
In both cases - inserting new data and inserting duplicated data that doesn't modify the database in any way - error is null and saved is just a copy of the data that was supposed to be inserted. Is there any way to check whether the insert was actually made, or do I have to check manually whether the data already exists in the database before/after making the insert?
Mongo works that way. You have to say that you want errors back, using the safe option when you issue the save command (by default it uses the "fire and forget" method).
https://docs.mongodb.com/manual/reference/command/getLastError/
This looks to be basically the same problem as MongoDB unique index does not work -- but with the JavaScript API rather than Java. That is, saving without either specifying the "safe" flag or explicitly checking the last error value is unsafe--- the ID is generated client side and the command dispatched, but it might still fail (e.g. due to a unique index violation). You need to explicitly get the last error status.
http://www.mongodb.org/display/DOCS/dbshell+Reference#dbshellReference-ErrorChecking suggests db.getLastError() using the command shell, and I assume the node API is identical where they can possibly make it so.
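Applied to the register function above, a minimal sketch of the save with acknowledged writes might look like the following. This assumes the mongojs/driver version in use accepts an options argument on save ({safe: true}, or {w: 1} on later versions); check your driver's documentation:

// Sketch: ask the driver to wait for an acknowledgement so unique-index
// violations surface as an error instead of being fired and forgotten.
dbconn.users.save(
    {
        username: params.username,
        password: params.password,
        email: params.email,
        insert_time: Date.now()
    },
    { safe: true },   // or { w: 1 } on newer drivers
    function(error, saved)
    {
        if(error || !saved)
        {
            // duplicate-key errors from the unique indexes now show up here
            callback(false);
        }
        else
        {
            callback(true);
        }
    });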
Same problem solved here: I had just forgotten that the insert is asynchronous, and calling Node's process.exit() stops the pending data from being added.
var lineReader = require('readline').createInterface({
    input: require('fs').createReadStream('parse.txt')
});

lineReader.on('line', function (line) {
    console.log(line);
    db.insert(line, {w: 1}, function (e, d) { if (e) throw e; });
});

// the inserts are still running asynchronously when 'close' fires,
// so calling process.exit() here cuts them off and data stops being added
lineReader.on('close', function() {
    // ----->>>>>> process.exit(); <<<<<---------- don't do this here
});
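If the process really does need to exit, one option (a sketch, not from the original post; db is the same handle as above) is to count the pending inserts and only exit once the stream has closed and every insert callback has fired:

var lineReader = require('readline').createInterface({
    input: require('fs').createReadStream('parse.txt')
});

var pending = 0;      // inserts issued but not yet acknowledged
var closed = false;   // set once the whole file has been read

function maybeExit() {
    if (closed && pending === 0) {
        process.exit();
    }
}

lineReader.on('line', function (line) {
    pending++;
    db.insert(line, {w: 1}, function (e) {
        if (e) throw e;
        pending--;
        maybeExit();
    });
});

lineReader.on('close', function () {
    closed = true;
    maybeExit();
});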