Supabase update with row level policies

I am not able to update a row in a table that has row-level security enabled.
My table has row-level policies on insert and update as follows:
create policy "Allow individual insert access" on public.quotes for insert with check ( auth.uid() = created_by );
create policy "Allow individual update access" on public.quotes for update with check ( auth.uid() = created_by );
My insert function works fine once the user is logged in:
export const addQuote = async (user_id, content) => {
  try {
    let body = await supabase
      .from("quotes")
      .insert([{ created_by: user_id, content: content }]);
    return body;
  } catch (error) {
    console.log("error", error);
  }
};
My update function works fine without the row-level policy:
export const updateQuote = async (quote_id, user_id, new_content) => {
  try {
    let body = await supabase
      .from("quotes")
      .update([{ created_by: user_id, content: new_content }])
      .eq("id", quote_id);
    return body;
  } catch (error) {
    console.log("error in updateQuote", error);
  }
};
But when I turn on RLS, the response is a 404 and the row is not updated.
What am I missing here?

I changed the row-level policy for update from "with check" to "using", and it now works as intended:
create policy "Allow individual update access" on public.quotes for update using ( auth.uid() = created_by );
In short: update ... with check ---> update ... using
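For context: on an update policy, the using clause decides which existing rows are visible to the update, while with check only validates the new row values after the change. With only with check defined, the update presumably has no rows it is allowed to target, which would explain the 404 / no-op. A sketch combining both clauses, assuming the same quotes table, in case you also want to keep users from reassigning created_by:
create policy "Allow individual update access" on public.quotes
  for update
  using ( auth.uid() = created_by )       -- which rows the user may target
  with check ( auth.uid() = created_by ); -- what the updated row must still satisfy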

You also seem to have an extra [] around your update data. Could you try it without it?
let body = await supabase
  .from("quotes")
  .update({ created_by: user_id, content: new_content })
  .eq("id", quote_id);

Related

How do I create an auto-increment value for a field in Elasticsearch?

I'm setting up an API for reading employee details. How can I set the index ID and employee ID as auto-increment values?
I'm using Express 4.17 and Elasticsearch 7.3.
const client = require('../client/esConnect')

module.exports.createAttendance = async (body) => {
  try {
    let payload = payloadValue(body);
    let response = await client.index(payload);
    return response;
  } catch (error) {
    console.log(error);
    throw error;
  }
}

const payloadValue = (body) => {
  return {
    index: "attendance",
    type: "_doc",
    id: 4, // hard-coded for now; this is the ID I want to auto-increment
    body: {
      empid: body.empid,
      date: body.date,
      present: body.present
    }
  }
}
As far as I can remember, Elasticsearch does not support this kind of auto-increment ID (anymore). If you don't supply one, the _id field is automatically generated as an opaque unique string.
However, if you need to store such a number anyway, you can add it in your code as another field, or just overwrite the ID field yourself.
Remember that Elasticsearch is NOT intended to be used as a structured / ER database (like SQL).
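A minimal sketch of that approach with the same client, letting Elasticsearch generate the _id and keeping your own counter as a normal field (the attendanceNo name is made up for illustration; your application code would have to maintain it):
const payloadValue = (body) => {
  return {
    index: "attendance",
    type: "_doc",
    // no "id" property: Elasticsearch assigns an auto-generated _id
    body: {
      empid: body.empid,
      date: body.date,
      present: body.present,
      attendanceNo: body.attendanceNo // your own sequence number, tracked by your app
    }
  }
}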

How to add two records in a row?

When I want to add two records in sequence, only one record is added; on the second it throws an error because it cannot create a field with this data:
"NOTES_ID is required","key: NOTES_ID, value: undefined, is not a number"
How do I create entries for two related tables sequentially: first for the main table, and then for the one that has the foreign key?
module.exports.create = async function (req, res) {
  const stateMatrix = await StateMatrix.select().exec()

  const noteObj = {
    DATE: req.body.DATE,
    TITLE: req.body.TITLE,
    CONTENT: req.body.CONTENT
  };

  const noteStateObj = {
    STATE_DATE: new Date().toLocaleDateString("en-US"),
    STATES_ID: stateMatrix[0]._props.STATES_ID_CURR,
    NOTES_ID: req.body.NOTE_ID,
    USERS_ID: req.decoded.user_id
  };

  try {
    await Notes.create(noteObj);
    await NoteStates.create(noteStateObj);
    res.status(201).json(noteObj, noteStateObj);
  } catch (e) {
    errorHandler(res, e);
  }
};
Probably NoteStates is related to Notes through the NOTES_ID field, which cannot be empty (I guess it's a foreign key). It means that you should set it from the newly created note before saving noteStateObj:
// Something like this
const newNote = await Notes.create(noteObj);
noteStateObj.NOTES_ID = newNote.ID;
await NoteStates.create(noteStateObj);
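Folded back into the original handler, the fix would look roughly like this (a sketch: it assumes Notes.create resolves to the created record and exposes its primary key as ID, which depends on your ORM):
module.exports.create = async function (req, res) {
  const stateMatrix = await StateMatrix.select().exec();
  try {
    // 1. create the parent record first
    const newNote = await Notes.create({
      DATE: req.body.DATE,
      TITLE: req.body.TITLE,
      CONTENT: req.body.CONTENT
    });
    // 2. reuse its generated ID as the child's foreign key
    await NoteStates.create({
      STATE_DATE: new Date().toLocaleDateString("en-US"),
      STATES_ID: stateMatrix[0]._props.STATES_ID_CURR,
      NOTES_ID: newNote.ID,
      USERS_ID: req.decoded.user_id
    });
    res.status(201).json(newNote);
  } catch (e) {
    errorHandler(res, e);
  }
};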

Problem when recreating tables and inserting data in BigQuery using the Node API

I get unexpected behaviour when loading data into BigQuery just after creating the schema.
I'm using the Node API to insert data with the BigQuery streaming API.
In order to reset the data, I delete and recreate the tables before loading anything.
My problem: the first time it works fine, but if I execute it again it fails.
The process always deletes and recreates the table schema, but does not insert the data unless I wait a while before executing it again.
This is the code which reproduces the case:
async function loadDataIntoBigquery() {
  const {BigQuery} = require('@google-cloud/bigquery')
  const tableName = "users"
  const dataset = "data_analisis"
  const schemaUsers = "name:string,date:string,type:string"
  const userData = [
    {name: "John", date: "20/08/93", type: "reader"},
    {name: "Marie", date: "20/08/90", type: "owner"}
  ]
  try {
    const bigquery = new BigQuery()
    await bigquery.createDataset(dataset)
      .then(() => console.log("dataset created successfully"))
      .catch(err => console.log("warn: maybe the dataset already exists"))
    await bigquery.dataset(dataset).table(tableName).delete()
      .then(() => console.log("table deleted successfully"))
      .catch(err => console.log("Error: maybe the table does not exist"))
    await bigquery.dataset(dataset).createTable(tableName, {schema: schemaUsers})
      .then(() => console.log("table created successfully"))
      .catch(err => console.log("Error: maybe the table already exists"))
    await bigquery.dataset(dataset).table(tableName).insert(userData)
      .then(data => console.log("Ok inserted ", data))
      .catch(err => console.log("Error: can't insert ", err))
  } catch (err) {
    console.log("err", err)
  }
}
To verify that the data was inserted, I'm using this query:
select * from `data_analisis.users`
I have the same issue. As a workaround, I insert data with a query instead:
const query = "INSERT INTO `" + dataset + "." + tableName + "` (name, date, type) VALUES ('" + name + "', '" + date + "', '" + type + "')";
await bigQuery.query({
  query: query,
  useLegacySql: false,
  location: 'EU'
}, (err) => {
  console.log("Insertion error: ", err);
})
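For what it's worth, this matches BigQuery's documented caveat that streaming inserts into a recently deleted and recreated table can fail for a while, because the streaming backend may still hold the old table's metadata. If you want to keep using insert(), a sketch that waits and retries (the delay and attempt counts are guesses, not an official recommendation, and a silent drop may not throw at all):
async function insertWithRetry(table, rows, attempts = 5, delayMs = 10000) {
  for (let i = 0; i < attempts; i++) {
    try {
      await table.insert(rows);
      console.log("Ok inserted");
      return;
    } catch (err) {
      console.log("insert attempt " + (i + 1) + " failed, retrying in " + delayMs + " ms");
      await new Promise(resolve => setTimeout(resolve, delayMs));
    }
  }
  throw new Error("insert still failing after table recreation");
}
// usage: await insertWithRetry(bigquery.dataset(dataset).table(tableName), userData)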

CosmosDB + MongoAPI, updating document workaround?

I've been trying to update a CosmosDB document via the MongoDB API in my Node application. I've been testing back and forth with no errors, but the value never updates.
I know updating array elements is not supported, which is fine, but this is a top-level key-value pair. The changes simply don't happen, with no error whatsoever.
I've been following the Mean.js project, which uses CosmosDB + Mongoose + Node + Angular, looking at its API for updating a hero and trying some of that code, but it still doesn't update.
I've been reading the documentation trying to figure out the default way of handling CRUD operations within CosmosDB and which parts of the Mongo API it supports, but so far no luck.
For testing purposes, I'm using this code:
async function updateUser(id) {
  try {
    let user = await User.findById(id);
    console.log(id);
    console.log(user);
    if (!user) return
    user.id = id
    user.firstName = 'ASDASDASASDASDASDASDASDA'
    const result = await user.save()
    console.log(result);
  } catch (err) {
    console.log("There was an error updating user", err);
  }
}
So, I've been playing around some more and managed to update a hero using this code:
updateHero('10')

async function updateHero(id) {
  const originalHero = {
    uid: id,
    name: 'Hero2',
    saying: 'nothing'
  };
  Hero.findOne({ uid: id }, (error, hero) => {
    hero.name = originalHero.name;
    hero.saying = originalHero.saying;
    hero.save(error => {
      console.log('Hero updated successfully!');
      return (hero);
    });
  });
}
Now I'm just not sure why this actually worked when the earlier version didn't. The main difference is that I'm querying by a 'uid' field instead of the actual ID assigned by CosmosDB.
I tested the sample code you provided and both snippets updated the document successfully.
Snippet One:
updateUser('5b46eb0ee1a2f12ea0af307f')

async function updateUser(id) {
  try {
    let user = await Family.findById(id);
    console.log(id);
    console.log(user);
    if (!user) return
    user.id = id
    user.name = 'ASDASDASASDASDASDASDASDA'
    const result = await user.save()
    console.log(result);
  } catch (err) {
    console.log("There was an error updating user", err);
  }
}
Snippet Two:
updateFamily('5b46eb0ee1a2f12ea0af307f')

async function updateFamily(id) {
  const updateFamily = {
    _id: id,
    name: 'ABCD',
  };
  Family.findOne({ _id: id }, (error, family) => {
    family.name = updateFamily.name;
    family.save(error => {
      console.log(JSON.stringify(family));
      console.log('Family updated successfully!');
      return (family);
    });
  });
}
In addition, you could use db.collection.update() to update the document:
db.families.update(
  { _id: '5b46eb0ee1a2f12ea0af307f' },
  { $set: { name: 'AAAA' } }
)
For more details, please refer to the doc: https://docs.mongodb.com/manual/reference/method/db.collection.update/
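If you would rather stay in Mongoose than drop into the shell, a roughly equivalent one-shot update (a sketch assuming the same Family model) is:
// updates the document in place without loading it first
Family.updateOne(
  { _id: '5b46eb0ee1a2f12ea0af307f' },
  { $set: { name: 'AAAA' } }
)
  .then(result => console.log(result))
  .catch(err => console.log('update failed:', err));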
Hope it helps you.

Interdependent Transactions with pg-promise

I am trying to build an app that involves posts and tags for the posts. For these I have posts, tags and post_tag tables. tags holds the tags I have defined beforehand, which are suggested to the user somewhere on the front-end. The post_tag table holds post and tag ids as pairs, one per row.
I use Express.js, PostgreSQL and pg-promise.
As far as I know I need a transactional query (or queries) for a create-post operation.
I also need a mechanism to detect when a tag was not yet in the tags table when the user created the post, so that I can insert it on the fly and have a tag_id for each tag, which is necessary for inserting the post_id and tag_id pairs into the post_tag table. Otherwise I get a foreign key error, since post_tag's columns post_id and tag_id reference the id columns of the posts and tags tables, respectively.
Here is the route handler I have used so far, unsuccessfully:
privateAPIRoutes.post('/ask', function (req, res) {
  console.log('/ask req.body: ', req.body);
  // write to posts
  var post_id = ''
  var post_url = ''
  db.query(
    `
    INSERT INTO
      posts (title, text, post_url, author_id, post_type)
    VALUES
      ($(title), $(text), $(post_url), $(author_id), $(post_type))
    RETURNING id
    `,
    {
      title: req.body.title,
      text: req.body.text,
      post_url: slug(req.body.title),
      author_id: req.user.id,
      post_type: 'question'
    } // remember req.user contains decoded jwt saved by mw above.
  )
    .then(post => {
      console.log('/ask post: ', post);
      post_id = post.id
      post_url = post.post_url
      // if tag does not exist create it here
      var tags = req.body.tags;
      console.log('2nd block tags1', tags);
      for (var i = 0; i < tags.length; i++) {
        if (tags[i].id == undefined) {
          console.log('req.body.tags[i].id == undefined', tags[i].id);
          var q1 = db.query("insert into tags (tag) values ($(tag)) returning id", {tag: tags[i].label})
            .then(data => {
              console.log('2nd block tags2', tags);
              tags[i].id = data[0].id
              // write to the post_tag
              db.tx(t => {
                var queries = [];
                for (var j = 0; j < tags.length; j++) {
                  var query = t.query(
                    `
                    INSERT INTO
                      post_tag (post_id, tag_id)
                    VALUES
                      ($(post_id), $(tag_id))
                    `,
                    {
                      post_id: post_id,
                      tag_id: tags[j].id
                    }
                  )
                  queries.push(query);
                }
                return t.batch(queries)
              })
                .then(data => {
                  res.json({post_id: post_id, post_url: post_url})
                })
                .catch(error => {
                  console.error(error);
                })
            })
            .catch(error => {
              console.error(error);
            });
        }
      }
    })
    .catch(error => {
      console.error(error);
    })
});
The main problem you have: you can't use the root-level db object inside a task or transaction. Trying to create a new connection while inside a transaction breaks the transaction logic; you would need to use t.tx in such cases. However, in your case I don't see that you need it at all.
Corrected code:
privateAPIRoutes.post('/ask', (req, res) => {
  console.log('/ask req.body: ', req.body);
  db.tx(t => {
    return t.one(
      `
      INSERT INTO
        posts (title, text, post_url, author_id, post_type)
      VALUES
        ($(title), $(text), $(post_url), $(author_id), $(post_type))
      RETURNING *
      `,
      {
        title: req.body.title,
        text: req.body.text,
        post_url: slug(req.body.title),
        author_id: req.user.id,
        post_type: 'question'
      } // remember req.user contains decoded jwt saved by mw above.
    )
      .then(post => {
        console.log('/ask second query: post[0]: ', post);
        console.log('/ask second query: tags: ', req.body.tags);
        console.log('/ask second query: tags[0]: ', req.body.tags[0]);
        // the key piece to the answer:
        var tagIds = req.body.tags.map(tag => {
          return tag.id || t.one("insert into tags(tag) values($1) returning id", tag.label, a => a.id);
        });
        return t.batch(tagIds)
          .then(ids => {
            var queries = ids.map(id => {
              return t.one(
                `
                INSERT INTO post_tag (post_id, tag_id)
                VALUES ($(post_id), $(tag_id))
                RETURNING post_id, tag_id
                `,
                {
                  post_id: post.id,
                  tag_id: id
                }
              )
            });
            return t.batch(queries);
          });
      });
  })
    .then(data => {
      // data = result from the last query;
      console.log('/api/ask', data);
      res.json(data);
    })
    .catch(error => {
      // error
    });
});
The key here is simply to iterate through the tag ids, and for the ones that are not set, use an insert instead. Then you settle them all by passing the array into t.batch.
Other recommendations:
Use method one when executing an insert that returns the new record's columns.
Use try/catch only once there, on the transaction; this is about how to use promises in general, not just this library.
You can place your queries into external SQL files, see Query Files (a minimal sketch follows below).
To understand conditional inserts better, see SELECT->INSERT.
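For the Query Files point, a sketch of what that looks like in pg-promise (the file path and its SQL content are made up for illustration):
const pgp = require('pg-promise')();
// ./sql/insertPost.sql would hold the INSERT ... RETURNING * statement shown above
const insertPost = new pgp.QueryFile('./sql/insertPost.sql', {minify: true});

// later, inside the transaction:
// return t.one(insertPost, {title: req.body.title, ...});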
