Securing access to a collection from the client side - security

I have a Meteor app prototype that works well but is very insecure as of now: I needed to display a list of matching users to the currently logged-in user. For starters, I decided to publish all users, limiting the fields to what I would need to filter the user list on the client side.
Meteor.publish('users', function () {
return Meteor.users.find({}, {
fields: {
'profile.picture': 1,
'profile.likes': 1,
'profile.friends': 1,
'profile.type': 1
}
});
});
Then in my router, I would do a request to only show what I wanted on the client side:
Router.map(function() {
this.route('usersList', {
path: '/users',
waitOn: function () {
return Meteor.subscribe('users');
},
data: function () {
var user = Meteor.user();
return {
users: Meteor.users.find({ $and: [
{_id: {$ne : user._id}},
{'profile.type': user.profile.interest}
]})
};
}
});
});
In the code above, I query all users who are not the current user and whose type corresponds to the current user's interest. I also display a certain border on the photos of users who have my user in their "profile.friends" array, using this client helper:
Template.userItem.helpers({
getClass: function(userId) {
var user = Meteor.user();
var lookedup = Meteor.users.findOne({_id: userId});
if ($.inArray(user._id, lookedup.profile.friends) !== -1)
return "yes";
return "no";
}
});
Now this all worked great, but with this setup every client can query every user and get their type, picture, list of friends and number of likes. If I were using an MVC framework, this info would only be accessible on the server side. So I decided my next iteration would be a security one: I would move my query from the router file to the publications file. That's where the trouble began...
Meteor.publish('users', function () {
  var user = Meteor.users.findOne({_id: this.userId});
  var interest = user.profile.interest;
  // retrieve all users, with their friends for now
  var allUsers = Meteor.users.find({ $and: [
      {'_id': {$ne: user._id}},
      {'profile.type': interest}
    ]},
    { fields: {'profile.picture': 1, 'profile.friends': 1}}
  );
  return allUsers;
});
And in the router:
Router.map(function() {
this.route('usersList', {
path: '/users',
waitOn: function () {
return Meteor.subscribe('users');
},
data: function () {
var user = Meteor.user();
return {users: Meteor.users.find({_id: {$ne : user._id}})};
}
});
});
(note that I still need to exclude the current user from the router query since the current user is always fully published)
This works, but:
the user list does not get updated when I change the user's interest and then do a Router.go('usersList'). Only when I refresh the browser is my list updated according to the user's new interest. No idea why.
this solution still publishes the users' friends in order to display my matching borders. I wish to add a temporary field in my publish query, setting it to "yes" if the current user is in that user's friends and "no" otherwise, but... no success so far. I read I could maybe use aggregate to achieve that but haven't managed to so far. Also, aggregate doesn't return a cursor, which is what is expected from a publication.
This problem makes me doubt the praise of Meteor as suitable for secure apps... This would be so easy to achieve in Rails or other frameworks!
EDIT: As requested, here is the code I have so far for the transition of my "matching" check to the server:
Meteor.publish('users', function () {
  var user = Meteor.users.findOne({_id: this.userId});
  var interest = user.profile.interest;
  // retrieve all users, with their friends for now
  var allUsers = Meteor.users.find({ $and: [
      {'_id': {$ne: user._id}},
      {'profile.type': interest}
    ]},
    { fields: {'profile.picture': 1, 'profile.friends': 1}}
  );
  // ------------- ADDED ---------------
  allUsers.forEach(function (lookedup) {
    if (_.contains(lookedup.profile.friends, user._id))
      lookedup.profile.relation = "yes";
    else
      lookedup.profile.relation = "no";
    lookedup.profile.friends = undefined;
    return lookedup;
  });
  // ------------- END ---------------
  return allUsers;
});
Obviously this code doesn't work at all, since I cannot modify cursor values in a forEach loop. But it gives an idea of what I want to achieve: give the client a way to know if a friend is matched or not, without giving it access to the friend lists of all users (and also avoid having to make one request per user during display to ask the server whether this specific user matches the current one).

You can add a transform function and modify the cursor's documents on the fly; see the transform option of Meteor's Collection.find in the docs.
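For what it's worth, here is a hedged sketch of the asker's goal using Meteor's low-level publish API (this.added/this.removed with cursor.observeChanges) rather than returning a cursor; as far as the Meteor docs note, transforms passed to find are not applied to cursors returned from publish functions, so the computed field is built manually here. The profile.relation field and its "yes"/"no" values follow the question's edit.
Meteor.publish('users', function () {
  var self = this;
  var currentUser = Meteor.users.findOne({_id: self.userId});
  if (!currentUser) {
    return self.ready();
  }
  var cursor = Meteor.users.find(
    { _id: {$ne: currentUser._id}, 'profile.type': currentUser.profile.interest },
    { fields: {'profile.picture': 1, 'profile.friends': 1} }
  );
  var handle = cursor.observeChanges({
    added: function (id, doc) {
      var isFriend = _.contains((doc.profile && doc.profile.friends) || [], currentUser._id);
      // send only the computed flag to the client, never the friends array itself
      self.added('users', id, {
        profile: { picture: doc.profile && doc.profile.picture, relation: isFriend ? "yes" : "no" }
      });
    },
    removed: function (id) {
      self.removed('users', id);
    }
    // a fuller version would also implement `changed` and recompute `relation`
    // whenever profile.friends or profile.picture changes
  });
  self.ready();
  self.onStop(function () { handle.stop(); });
});
With something like this, the client only ever sees profile.picture and profile.relation, so the userItem helper can read relation directly instead of scanning profile.friends.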

Related

Firebase Cloud Functions: Keeping things in sync

I have users and companies and want to store a company for each user and all of the users of each company in Firebase.
user = {
  "id": "tjkdEnc3skdm2Jjknd",
  "name": "Adam",
  "street": "Sideway 4",
  "company": "dHend4sdkn25"
}
companies = {
  "id": "dHend4sdkn25",
  "name": "Comp Ltd.",
  "members": [
    {
      "id": "tjkdEnc3skdm2Jjknd",
      "name": "Adam"
    },
    {
      "id": "dfjnUkJKB3sdn8n2kj",
      "name": "Berta"
    }
  ]
}
All explanations say that duplicating data is the best way to deal with this, so I want to write some cloud functions to keep things in sync when editing on one of the sides.
Basically I started with
exports.userChangedCompany = functions.firestore
.document('users/{userId}')
.onUpdate((change, context) => {
const data = change.after.data();
const previousData = change.before.data();
if (data.company == previousData.company) {
return null;
}
else {
return admin.firestore().doc('companies/'+data.company).set({ ... });
}
});
to update the companies when a user changes company. Unfortunately I haven't found any hint on how to set the new company data properly.
Can someone please help me?
It sounds like you just need to remove the user from the members array of the old company and add them to that array of the new company. You just need the IDs of both companies.
async function updateCompanies(userId, username, oldCompanyId, newCompanyId) {
  // collection() is synchronous, so no await is needed here
  const companiesRef = admin.firestore().collection("companies")
  const userObj = {id: userId, name: username}
  // Removing from old company and adding in new company
  await Promise.all([
    companiesRef.doc(oldCompanyId).update({members: admin.firestore.FieldValue.arrayRemove(userObj)}),
    companiesRef.doc(newCompanyId).update({members: admin.firestore.FieldValue.arrayUnion(userObj)})
  ])
  return true
}
You can just call this function from your cloud function; just make sure you pass the correct params. The reason you need to pass the username as well is that members is an array of objects, so you need the complete object to add/remove it with arrayUnion/arrayRemove.
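For illustration, a hedged sketch of wiring updateCompanies into the onUpdate trigger from the question; the field names (company, name) are taken from the question's user document and may need adjusting:
exports.userChangedCompany = functions.firestore
  .document('users/{userId}')
  .onUpdate((change, context) => {
    const data = change.after.data();
    const previousData = change.before.data();
    // nothing to do if the company reference did not change
    if (data.company === previousData.company) {
      return null;
    }
    // return the promise so Cloud Functions waits for both writes to finish
    return updateCompanies(
      context.params.userId,
      data.name,
      previousData.company, // old company id
      data.company          // new company id
    );
  });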

How to avoid two concurrent API requests breaking the logic behind document validation?

I have an API where, in order to insert a new item, the item needs to be validated. The validation is basically a type validator (string, number, Date, etc.) plus a database query that checks whether the "user" already has an "item" on the same date; if it does, the validation is unsuccessful.
Pseudocode goes like this:
const Item = require("./models/item");

async function post(newDoc) {
  let errors = await checkForDocErrors(newDoc)
  if (errors) {
    throw errors;
  }
  let itemCreated = await Item.create(newDoc);
  return itemCreated;
}
My problem is if I do two concurrent requests like this:
const request = require("superagent");
// Inserts a new Item
request.post('http://127.0.0.1:5000/api/item')
.send({
"id_user": "6c67ea36-5bfd-48ec-af62-cede984dff9d",
"start_date": "2019-04-02",
"name": "Water Bottle"
})
/*
Inserts a new Item, which it shouldn't, resulting in two items having the
same date.
*/
request.post('http://127.0.0.1:5000/api/item')
.send({
"id_user": "6c67ea36-5bfd-48ec-af62-cede984dff9d",
"start_date": "2019-04-02",
"name": "Toothpick"
})
Both will be successful, which they shouldn't be, since a "user" cannot have two "items" on the same date.
If I execute the second one after the first is finished, everything works as expected.
request.post('http://127.0.0.1:5000/api/item') // Inserts a new Item
.send({
"id_user": "6c67ea36-5bfd-48ec-af62-cede984dff9d",
"start_date": "2019-04-02",
"name": "Water Bottle"
})
.then((res) => {
// It is not successful since there is already an item with that date
// as expected
request.post('http://127.0.0.1:5000/api/item')
.send({
"id_user": "6c67ea36-5bfd-48ec-af62-cede984dff9d",
"start_date": "2019-04-02",
"name": "Toothpick"
})
})
To avoid this I send one request with an array of documents, but I want to prevent this issue, or at least make it less likely to happen.
SOLUTION
I created a redis server. Used the package redis-lock and wrapped around the POST route.
var client = require("redis").createClient()
var lock = require("redis-lock")(client);
var itemController = require('./controllers/item');
router.post('/', function(req, res){
  let userId = "";
  if (typeof req.body === 'object' && typeof req.body.id_user === 'string') {
    userId = req.body.id_user;
  }
  lock('POST ' + req.path + userId, async function(done){
    try {
      let result = await itemController.post(req.body)
      res.json(result);
    } catch (e) {
      res.status(500).send("Server Error");
    }
    done()
  })
});
Thank you.
Explanation:
That is a race condition:
"two or more threads can access shared data and they try to change it at the same time"
(What is a race condition?)
Solution:
There are many ways to prevent conflicting data in this case; a lock is one option.
You can lock at the application level or at the database level, but I suggest you read this thread before choosing either:
Optimistic vs. Pessimistic locking
Quick solution: a pessimistic lock, e.g. https://www.npmjs.com/package/redis-lock
You should create a composite index or a composite primary key that includes the id_user and the start_date fields. This will ensure that no two documents for the same user with the same date can be created, and the database will throw an error if you try to do it.
Composite index with mongoose
You could also use transactions. To do it, you should execute the find and the create methods inside a transaction, to ensure that no concurrent queries on the same document will be executed.
Mongoose transactions tutorial
More info
I would go with a unique composite index, which in your specific case (using the field names from the question) should be something like:
mySchema.index({id_user: 1, start_date: 1}, {unique: true});
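For illustration, a minimal sketch of what the unique-index approach could look like in the post function from the question; the duplicate-key check assumes MongoDB's error code 11000:
const mongoose = require("mongoose");

const itemSchema = new mongoose.Schema({
  id_user: { type: String, required: true },
  start_date: { type: Date, required: true },
  name: String
});

// one item per user per start_date, enforced by the database itself
itemSchema.index({ id_user: 1, start_date: 1 }, { unique: true });

const Item = mongoose.model("Item", itemSchema);

async function post(newDoc) {
  try {
    // even two concurrent inserts cannot both succeed: the second one
    // violates the unique index and is rejected by MongoDB
    return await Item.create(newDoc);
  } catch (err) {
    if (err.code === 11000) {
      // duplicate-key error: the user already has an item on that date
      throw new Error("User already has an item on that date");
    }
    throw err;
  }
}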

How do I increment a value for an existing object in Firebase?

I'm building a step counter app.
I have an iOS app that pushes each day's step count to /users/{mobile}/steps/{date}/
When a new steps child is updated or added, I want to sum the value of all the steps for that particular user and update his stepsTotal.
To achieve that I need to:
1. Find the original user and sum all the steps.
2. Save the new value to stepsTotal.
I would be most grateful if someone could give some help here. :-)
database
{
  "users": {
    "92291000": {
      "firstName": "Tore",
      "stepsTotal": "1500",
      "steps": {
        "02-09-2017": "500",
        "03-09-2017": "1000"
      }
    }
  }
}
import.js
var db = admin.database();
var dbRoot = db.ref("/");
var usersRef = dbRoot.child("users");
// This works
function saveUser(attributes) {
let mobile = attributes.mobile;
delete attributes['mobile']
let user = usersRef.child(mobile);
user.update(attributes);
}
function increaseSteps( { mobile=null, steps=null } = {}) {
// Find the User
console.log("looking for mobile", mobile); // OK
let userRef = usersRef.child(mobile);
// Here I'm not able to read the old data from the user.
userRef.transaction(function(user) {
console.log("user: ", user); // null
// ^ User is null.
});
/*
If I manage to find the user above, I expect to do something like this.
Or is it possible to only update *stepsTotal*?
*/
let attributes = {
firstName: user.firstName,
lastName: user.lastName,
stepsTotal: user.stepsTotal + steps,
}
user.update( attributes );
}
If I understand correctly, you have a problem in this snippet of the code:
let userRef = usersRef.child(mobile);
// Here I'm not able to read the old data from the user.
userRef.transaction(function(user) {
console.log("user: ", user); // null
// ^ User is null.
});
In Firebase Database transactions the initial value is often null. From the Firebase documentation on transactions:
Transaction Function is Called Multiple Times
Your transaction handler is called multiple times and must be able to handle null data. Even if there is existing data in your database it may not be locally cached when the transaction function is run.
This is due to how Firebase transactions work behind the scenes. To learn more about that, see my answers here Transcation updateFunction parameter is null and Firebase runTransaction not working.
The solution is to handle both cases: if the user node doesn't exist yet, count the initial number of steps; otherwise, update the number of steps:
let userRef = usersRef.child(mobile);
userRef.transaction(function(user) {
  if (user === null) {
    return { stepsTotal: new_steps_for_user }; // node doesn't exist (or isn't cached) yet
  }
  user.stepsTotal = (parseInt(user.stepsTotal, 10) || 0) + new_steps_for_user;
  return user;
});
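And, as a hedged sketch of the "sum all the steps" part of the question, here is a transaction on the whole user node that recomputes stepsTotal from the steps map (parseInt is used because the values in the question's database are stored as strings):
function recomputeStepsTotal(mobile) {
  let userRef = usersRef.child(mobile);
  return userRef.transaction(function(user) {
    // the handler may first run with null; returning it unchanged lets
    // Firebase retry once the real data has been fetched
    if (user === null) {
      return user;
    }
    let total = 0;
    let steps = user.steps || {};
    Object.keys(steps).forEach(function(date) {
      total += parseInt(steps[date], 10) || 0;
    });
    user.stepsTotal = total;
    return user;
  });
}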

Using an arbitrary number of query params to filter results in mongoose

I'm building an API using node express and mongodb, with mongoose.
I have a post resource that handles user posts, and would like to be able to perform various queries on the post resource.
For instance I have a function that returns all posts as follows:
// Gets a list of Posts
exports.index = function(req, res) {
console.log(req.query);
Post.findAsync()
.then(mUtil.responseWithResult(res))
.catch(mUtil.handleError(res));
};
I'm looking for a good way of processing any additional query params that might come with the request.
/posts will return all posts, but /posts?user=12 will return posts by user with id 12 and /posts?likes=12 will return posts with 12 or more likes.
How can I check for and apply these query params to filter the results, since they may or may not be present?
Thanks ;)
If user=12 means "users with id 12", how does likes=12 mean "likes greater than 12"? You need to be more descriptive with your queries. You can do that by passing an array of objects. Send your query in a way that can be interpreted like this:
var filters = [
  {
    param: "likes",
    type: "greater",
    value: 12
  },
  {
    param: "user",
    type: "equal",
    value: "12"
  }
]
var query = Post.find();
filters.forEach(function(filter) {
if (filter.type === "equal") {
query.where(filter.param).equals(filter.value);
}
else if (filter.type === "greater") {
query.where(filter.param).gt(filter.value);
}
// etc,,,
})
query.exec(callback);
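For reference, a hedged sketch of how the index handler from the question could map req.query onto such a query directly; treating likes as "greater than or equal" is an assumption based on the question's example, and mUtil is the question's own helper:
// Gets a list of Posts, optionally filtered by query params
exports.index = function(req, res) {
  var query = Post.find();

  if (req.query.user) {
    // /posts?user=12 -> posts whose user field equals 12
    query.where('user').equals(req.query.user);
  }
  if (req.query.likes) {
    // /posts?likes=12 -> posts with 12 or more likes (assumed meaning)
    query.where('likes').gte(parseInt(req.query.likes, 10));
  }

  query.exec()
    .then(mUtil.responseWithResult(res))
    .catch(mUtil.handleError(res));
};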

Fetch Backbone collection with search parameters

I'd like to implement a search page using Backbone.js. The search parameters are taken from a simple form, and the server knows to parse the query parameters and return a json array of the results. My model looks like this, more or less:
App.Models.SearchResult = Backbone.Model.extend({
urlRoot: '/search'
});
App.Collections.SearchResults = Backbone.Collection.extend({
model: App.Models.SearchResult
});
var results = new App.Collections.SearchResults();
I'd like the contents of the search form to be serialized with the GET request every time I call results.fetch(). Is there a simple way to add this, or am I doing it the wrong way and should I be hand-coding the request and creating the collection from the returned results?
$.getJSON('/search', { /* search params */ }, function(resp){
// resp is a list of JSON data [ { id: .., name: .. }, { id: .., name: .. }, .... ]
var results = new App.Collections.SearchResults(resp);
// update views, etc.
});
Thoughts?
Backbone.js fetch with parameters answers most of your questions, but I put some here as well.
Add the data parameter to your fetch call, example:
var search_params = {
'key1': 'value1',
'key2': 'value2',
'key3': 'value3',
...
'keyN': 'valueN',
};
results.fetch({data: $.param(search_params)});
Now your call url has added parameters which you can parse on the server side.
Attention: code simplified and not tested
I think you should split the functionality:
The Search Model
It is a proper resource on your server side. The only action allowed is CREATE.
var Search = Backbone.Model.extend({
  url: "/search",
  initialize: function(){
    // once the server responds to save(), wrap the returned results
    // and let listeners know the search is ready
    this.on( "sync", function(){
      this.results = new Results( this.get( "results" ) );
      this.trigger( "search:ready", this );
    }, this );
  }
});
The Results Collection
It is just in charge of collecting the list of Result models
var Results = Backbone.Collection.extend({
model: Result
});
The Search Form
You can see that this View does the intelligent work: listening to the form submit, creating a new Search object and sending it to the server to be created. "Created" doesn't mean the Search has to be stored in the database; that is the normal creation behavior, but it does not always need to be this way. In our case, creating a Search means searching the DB for the matching records.
var SearchView = Backbone.View.extend({
events: {
"submit form" : "createSearch"
},
createSearch: function(){
// You can use things like this
// http://stackoverflow.com/questions/1184624/convert-form-data-to-js-object-with-jquery
// to automate this process
var search = new Search({
field_1: this.$el.find( "input.field_1" ).val(),
field_2: this.$el.find( "input.field_2" ).val(),
});
// You can listen to the "search:ready" event
search.on( "search:ready", this.renderResults, this )
// this is when a POST request is sent to the server
// to the URL `/search` with all the search information packaged
search.save();
},
renderResults: function( search ){
// use search.results to render the results on your own way
}
});
I think this kind of solution is very clean, elegant, intuitive and very extensible.
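As a follow-up, a hedged sketch of what renderResults might look like; resultTemplate and the ".results" container are hypothetical placeholders, since the answer deliberately leaves the rendering open:
var SearchView = Backbone.View.extend({
  // ...same events and createSearch as above...
  renderResults: function( search ){
    var $list = this.$el.find( ".results" ); // hypothetical container element
    $list.empty();
    search.results.each( function( result ){
      // resultTemplate is a hypothetical template function returning an HTML string
      $list.append( resultTemplate( result.toJSON() ) );
    });
  }
});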
Found a very simple solution - override the url() function in the collection:
App.Collections.SearchResults = Backbone.Collection.extend({
urlRoot: '/search',
url: function() {
// send the url along with the serialized query params
return this.urlRoot + "?" + $("#search-form").formSerialize();
}
});
Hopefully this doesn't horrify anyone who has a bit more Backbone / Javascript skills than myself.
It seems the current version of Backbone (or maybe jQuery) automatically stringifies the data value, so there is no need to call $.param anymore.
The following lines produce the same result:
collection.fetch({data: {filter:'abc', page:1}});
collection.fetch({data: $.param({filter:'abc', page:1})});
The querystring will be filter=abc&page=1.
EDIT: This should have been a comment, rather than an answer.
