PouchDB bulkDocs doesn't re-add a document with {new_edits: false} if it was deleted before - couchdb

A local PouchDB is in sync with a remote Couch. The remote has validations which throw unauthorized errors based on some rules. When a denied event occurs, I run the code below to fetch the remote document and write it locally, in order to revert the local doc changes back to how they are on the server -
function processUnauthorizedError(err, self) {
  const { doc: { error, id } = {} } = err;
  if (error !== "unauthorized") {
    return;
  }
  // Replace/insert the remote doc into the local DB
  // https://stackoverflow.com/a/34570061/2790937
  return self.remoteDB.get(id, { revs: true })
    .then(doc => {
      Logger.log({ doc });
      return self.localDB.bulkDocs([doc], { new_edits: false });
    })
    .then((result) => {
      Logger.log({ result }); // Empty array, see https://github.com/pouchdb/pouchdb/issues/5775
      return notifyUnauthorized(id, self);
    });
}
This strategy is based on this suggestion - https://stackoverflow.com/a/34570061/2790937.
However, if the document was deleted locally, the localDB bulkDocs operation doesn't insert it back. It completes with no errors and returns an empty array, which is expected per this issue - https://github.com/pouchdb/pouchdb/issues/5775. To confirm that the document isn't re-inserted, I did a subsequent allDocs as well as a get on the local DB.
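For what it's worth, the local revision tree (including the deleted leaf) can be inspected with open_revs; a minimal diagnostic sketch, assuming the same localDB handle as above (untested):
// List all leaf revisions of the local doc, including the deletion
// tombstone, to see how they relate to the rev fetched from the remote.
self.localDB.get(id, { open_revs: 'all', revs: true })
  .then(leaves => {
    leaves.forEach(leaf => {
      if (leaf.ok) {
        Logger.log({ rev: leaf.ok._rev, deleted: !!leaf.ok._deleted });
      }
    });
  });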
Any suggestions?
The code is running in the latest Chrome (81), using the latest PouchDB (pouchdb-browser), 7.1.1.
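One avenue I haven't tried yet: undeleting the local tombstone by writing the remote content on top of it as a normal edit, instead of relying on new_edits: false. A sketch (untested):
// Find the deleted leaf locally, then put the remote doc on top of its
// rev; a regular edit on a tombstone should undelete the document.
self.localDB.get(id, { open_revs: 'all' })
  .then(leaves => {
    const tombstone = leaves.find(leaf => leaf.ok && leaf.ok._deleted);
    if (!tombstone) return;
    return self.remoteDB.get(id).then(remoteDoc => {
      remoteDoc._rev = tombstone.ok._rev; // new edit on top of the deleted leaf
      return self.localDB.put(remoteDoc);
    });
  });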

Related

Do I have a local Firestore database?

I want to understand what kind of Firestore database is installed on my box.
The code is running with Node.js 9.
If I disconnect the internet for X minutes and then restore it, I can see all the cached transactions going to Firestore (adds, updates, deletes).
If I add the firebase.firestore().enablePersistence() line after firebase.initializeApp(fbconfig), I get this error:
Error enabling offline persistence. Falling back to persistence
disabled: FirebaseError: [code=unimplemented]: This platform is either
missing IndexedDB or is known to have an incomplete implementation.
Offline persistence has been disabled.
Now, my question is: if I don't have persistence enabled, or can't have it, how come I still see internal transactions going on when I disconnect my device from the internet? Am I really seeing it the proper way?
To me, not seeing the console.log() inside the then() of batch.commit or transaction.update right away (only when the internet is back) suggests that I have some kind of internal database persistence, don't you think?
Thanks in advance for your help.
UPDATE
When sendUpdate is called, it looks like the batch.commit is executed, because I can see something going on in listenMyDocs(), but the console.log "Commit successfully!" is not shown until the internet comes back:
function sendUpdate(response) {
  const db = firebase.firestore();
  let batch = db.batch();
  let ref = db.collection('my-collection')
    .doc('my-doc')
    .collection('my-doc-collection')
    .doc('my-new-doc');
  batch.update(ref, { "variable": response.state });
  batch.commit().then(() => {
    console.log("Commit successfully!");
  }).catch((error) => {
    console.error("Commit error: ", error);
  });
}
function listenMyDocs() {
  const firebase = connector.getFirebase();
  const db = firebase.firestore()
    .collection('my-collection')
    .doc('my-doc')
    .collection('my-doc-collection');
  const query = db.where('var1', '==', "true")
    .where('var2', '==', false);
  query.onSnapshot(snapshot => {
    snapshot.docChanges().forEach(change => {
      if (change.type === 'added') {
        console.log('ADDED');
      }
      if (change.type === 'modified') {
        console.log('MODIFIED');
      }
      if (change.type === 'removed') {
        console.log('DELETED');
      }
    });
  });
}
the console.log "Commit successfully!" is not shown until the internet comes back
This is the expected behavior. Completion listeners fire once the data is committed on the server.
Local events may fire before completion, in an effort to allow your UI to update optimistically. If the server changes the behavior that the client raised events for (for example: if the server rejects a write), the client will fire reconciliatory events (so if an add was rejected, it will fire a change.type === 'removed' event once that is detected).
I am not entirely sure if this applies to batch updates though, and it might be tricky to test that from a Node.js script as those usually bypass the security rules.
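If you want to see which events are optimistic local ones and which are server-confirmed, the snapshot metadata exposes hasPendingWrites; a minimal sketch against the listener above:
// hasPendingWrites is true for latency-compensated local events and
// becomes false once the server has acknowledged the write.
query.onSnapshot(snapshot => {
  const source = snapshot.metadata.hasPendingWrites ? 'local' : 'server';
  snapshot.docChanges().forEach(change => {
    console.log(source, change.type, change.doc.id);
  });
});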

Why is Sequelize's save() not saving the changes that I've pushed to the JSON but saving everything else?

I'm making a function which will save the messages of my chat app to the database. It looks a bit like this:
let Server = require('../models/Server')

module.exports.sendMessage = (req, res, next) => {
  let serverRoom = req.body.serverRoom
  let serverId = req.body.server.id
  let server = Server.findByPk(serverId).then(result => {
    console.log(result.rooms[serverRoom].history)
    result.rooms[serverRoom].history.push('Hi')
    result.name = 'Test1'
    console.log(result.rooms[serverRoom].history)
    result.save();
  })
  .catch(error => {
    console.log(error)
  })
  res.status(200).json({
    message: 'Success'
  })
}
The server has a JSON column which contains an array of objects, each of which has a history property that is an empty array. In my code, I console.log() the history of the room with index 0, then I push the message 'Hi' to that room's history, and then I console.log() the history again to check that the message was pushed. The console.logs prove that the message was indeed pushed; however, when I try to save() the object to the database, the JSON changes are not saved.
However, the name of the result, which I also change to 'Test1' in the code, does change in the database, so the save() function is working, just without saving the JSON changes. Any clue why that is happening?
The issue is in this line:
result.rooms[serverRoom].history.push('Hi');
Sequelize does not detect in-place mutations of a JSON value, so you have to set JSON objects (and any array of JSON objects) explicitly, for example by building a new array with concat.
More info here: https://sequelize.org/master/manual/upgrade-to-v6.html#-code-model-changed----code-
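A minimal sketch of the fix, using the model and column names from the question; marking the field dirty with changed() is an alternative to reassigning:
// Replace the array instead of mutating it in place...
result.rooms[serverRoom].history = result.rooms[serverRoom].history.concat('Hi');
// ...or, if you do mutate in place, tell Sequelize the field changed:
result.changed('rooms', true);
return result.save();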

Deleted object is undefined in MEAN app

I am trying to build a MEAN stack SPA, and so far I have created the backend services using this tutorial: https://medium.com/netscape/mean-app-tutorial-with-angular-4-part-1-18691663ea96
Everything works fine when I am testing with Postman (POST, PUT, GET); however, when I try to delete an object, it results in this error:
{
  "status": 400,
  "message": "TypeError: Cannot read property 'n' of undefined"
}
Here is how my delete method looks in todos.service.js:
exports.deleteTodo = async function(id) {
  try {
    var deleted = await ToDo.remove({_id: id})
    if (deleted.result.n === 0) {
      throw Error("Todo Could not be deleted")
    }
    return deleted
  } catch(e) {
    throw Error(e)
  }
}
and here is my todos.controller.js:
exports.removeTodo = async function(req, res, next) {
  var id = req.params.id;
  try {
    var deleted = await TodoService.deleteTodo(id)
    return res.status(204).json({status: 204, message: "Successfully Deleted Todo"})
  } catch(e) {
    return res.status(400).json({status: 400, message: e.message})
  }
}
When I try to delete an object, it successfully removes it (I can see in Robomongo that it has been deleted); however, I get the error message I wrote above. What could be the problem here?
You must be using Mongoose v5. In v5, the model's remove method doesn't require any parameters and returns the deleted document, so deleted.result is undefined.
In v4, remove just called the collection's remove method directly; it indeed required the condition to delete and returned the number of deleted documents.
Either pin the version of mongoose in your package.json to something like ^4.0.0, or use a somewhat more modern tutorial.
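If you stay on Mongoose 5, one alternative (a sketch, assuming the same ToDo model) is to switch to deleteOne, whose result includes deletedCount:
// deleteOne resolves with a result object that has deletedCount,
// avoiding the undefined `result` property.
exports.deleteTodo = async function(id) {
  const res = await ToDo.deleteOne({ _id: id });
  if (res.deletedCount === 0) {
    throw Error("Todo could not be deleted");
  }
  return res;
};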

Failing to get a response back from a web api call in node

This particular Node issue has been driving me crazy for going on a week.
I have to create a layer of friction (a modal that asks the user if they're sure) in the process of a csv file upload. Essentially, the flow will be:
User Clicks 'UPLOAD SPREAD SHEET' > File uploads to s3 > S3 returns a reference key > Pass reference key into micro service web api to evaluate > if true => ask user 'if they're sure' > If user is sure continue uploading > pass reference key onward to another endpoint, same service, to finish the upload. false return would continue on to the upload with no modal.
It's kind of a silly product-driven feature that makes a show of alerting the user to potential duplicate entries in their spreadsheet, since we can't currently detect duplicate entries ourselves.
The problem is, I can't get a response to return from the evaluation to save my life. If I console.log the response, I can see it in Node's terminal window, but nothing comes back in the Network tab for the response. I'm not sure if it's because it's a file upload, if it's busboy, or if it's just not the right syntax for the response type, but endless googling has brought me no answers, and I'd love it if someone more experienced with Node and Express could take a look.
router.post('/import/csv', (req, res) => {
  // a bunch of aws s3 stuff to upload the file and return the key
  // (the busboy file-handler callback wrapping this upload is omitted here)
  s3.upload(uploadParams, (err, data) => {
    if (err) {
      res.status(500).send({
        error_message: 'Unable to upload csv. Please try again.',
        error_data: err
      });
    } else if (data) {
      // creating the key object to pass in
      const defaultImportCheck = {
        body: data.Key
      };
      // endpoint that will evaluate the s3 reference key
      SvcWebApiClient.guestGroup.defaultImportCheck(defaultImportCheck)
        .then((response) => {
          if (response.status === 'success') {
            // where the response should be. this works but doesn't actually send anything.
            res.send(response);
          } else {
            const errorJson = {
              message: response.message,
              category: response.category,
              trigger: response.trigger,
              errors: response.errors
            };
            res.status(500).send(errorJson);
          }
        })
        .catch((error) => {
          res.status(500).send({
            error_message: 'Unable to upload csv. Please try again.',
            error_data: error
          });
        });
    }
  });
  req.pipe(busboy);
});
Got it. For anyone who ends up having my kind of problem, it's a two-parter, so buckle up.
1) The action function that handles the response on the React side didn't convert the response into JSON. Apparently, what gets returned is a "readable stream", which should then have been converted to JSON. It wasn't.
2) The response itself needed to be sent as JSON as well.
So, from the action function:
export function csvUpload(file) {
  // do some stuff
  return fetch(/* fetch some stuff */, { /* with some parameters */ })
    .then(/* some error stuff */)
    .then(response => response.response.json())
}
Then, from the POST request:
if (response.status === "success") {
  res.json({ valid: response.data, token: data.Key });
}
This returns an object with what I need back to the client. Hope this helps someone else.
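A minimal sketch of the whole pattern, with illustrative names rather than the actual app's code:
// Client side: parse the response body (a readable stream) as JSON.
fetch('/import/csv', { method: 'POST', body: formData })
  .then(response => {
    if (!response.ok) throw new Error('Upload failed');
    return response.json();
  })
  .then(data => console.log(data.valid, data.token));
// Server side: reply with JSON so the client has something to parse, e.g.
// res.json({ valid: response.data, token: data.Key });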

Meteor cannot retrieve data from MongoDB

Pretty straightforward and without any sort of configuration ->
In the project directory I entered the command $ meteor mongo to access the mongodb.
From there (the mongo shell), I switched to the meteor db using the command use meteor, then entered some basic data to test:
j = { name: "mongo" }
k = { x: 3 }
db.testData.insert(j)
db.testData.insert(k)
I checked and got results by entering: db.testData.find()
Here's my meteor code provided that mongodb access is only required on the client:
if (Meteor.isClient) {
  Template.hello.greeting = function () {
    return "Welcome to test.";
  };

  Template.hello.events({
    'click input' : function () {
      // template data, if any, is available in 'this'
      if (typeof console !== 'undefined')
        console.log("You pressed the button");
    }
  });

  Documents = new Meteor.Collection('testData');

  var document = Documents.find();
  console.log(document);

  var documentCbResults = Documents.find(function(err, items) {
    console.log(err);
    console.log(items);
  });
}
Upon checking in the browser, the logs say undefined. I was unable to retrieve data from mongodb and show it in the client console.
What am I missing?
For this answer I'm going to assume this is a newly created project with autopublish still on.
As Christian pointed out, you need to define Documents on both the client and the server. You can easily accomplish this by just putting the collection definition at the top of the file or in another file which isn't in either of the server or client directories.
An example which prints the first two test documents could look like this:
Documents = new Meteor.Collection('testData');

if (Meteor.isClient) {
  Template.hello.greeting = function () {
    return "Welcome to apui.";
  };

  Template.hello.events({
    'click input' : function () {
      var documents = Documents.find().fetch();
      console.log(documents[0]);
      console.log(documents[1]);
    }
  });
}
Note the following:
The find function returns a cursor. This is often all you want when writing template code. However, in this case we need direct access to the documents to print them so I used fetch on the cursor. See the documentation for more details.
When you first start the client, the server will read the contents of the defined collections and sync all documents (if you have autopublish on) to the client's local minimongo database. I placed the find inside of the click event to hide that sync time. In your code, the find would have executed the instant the client started and the data probably would not have arrived in time.
Your method of inserting initial items into the database works (you don't need the use meteor, by the way); however, mongo will default to using an ObjectId instead of a string as the _id. There are subtle ways that this can be annoying in a meteor project, so my recommendation is to let meteor insert your data if at all possible. Here is some code that will ensure the testData collection has some documents:
if (Meteor.isServer) {
  Meteor.startup(function() {
    if (Documents.find().count() === 0) {
      console.log('inserting test data');
      Documents.insert({name: "mongo"});
      Documents.insert({x: 3});
    }
  });
}
Note this will only execute if the collection has no documents in it. If you ever want to clear out the collection you can do so via the mongo console. Alternatively you can drop the whole database with:
$ meteor reset
It's not enough to only define collections on the client side. Your mongo db lives on the server, and your client needs to get its data from somewhere. It doesn't get it directly from mongodb (I think), but gets it by syncing with the collections on the server.
Just define the Documents collection in the joint scope of client and server. You may also need to wait for the subscription to Documents to complete before you can expect content, so this is safer:
Meteor.subscribe('testData', function() {
  var document = Documents.find();
  console.log(document);
});
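Note that Meteor.subscribe('testData') only has something to deliver once a matching publication exists (or autopublish is still installed); a minimal sketch of the server side, assuming the Documents collection above:
// With autopublish removed, the server must publish the documents
// that the 'testData' subscription asks for.
if (Meteor.isServer) {
  Meteor.publish('testData', function() {
    return Documents.find();
  });
}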
