I am testing out the use of "@casl/ability" for RBAC in Express.
According to the CASL docs, I should be able to define conditional restrictions on attributes for actions on subjects, and in cases where classes are not used, a subject helper function can be used to wrap DTOs.
reference: https://casl.js.org/v4/en/guide/subject-type-detection
I tried the very simple example below, which I expected to work, but it does not. Am I misunderstanding something?
import { Ability, subject } from "@casl/ability";

const ability = new Ability([
  {
    action: "write",
    subject: "docs",
    conditions: {
      publisherId: 53
    }
  }
]);
const docs = {};
// Also, if the third argument ('field') is skipped, it throws an error
console.log(
  ability.can("write", subject("docs", docs), "", { publisherId: 53 })
);
I have a sandbox here https://codesandbox.io/s/casl-test-conditions-uzc8v?file=/src/index.js:0-286
You are using ability.can incorrectly; check the API docs. That is why it throws an error message saying that can is used incorrectly.
To fix your example:
import { Ability, subject } from "@casl/ability";

const ability = new Ability([
  {
    action: "write",
    subject: "docs",
    conditions: {
      publisherId: 53
    }
  }
]);

const docs = subject('docs', {
  publisherId: 53
}); // "docs" type instance

console.log(
  ability.can("write", docs)
);
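For completeness, a minimal sketch showing that the conditions are matched against the wrapped object itself, so a non-matching publisherId is denied:

import { Ability, subject } from "@casl/ability";

const ability = new Ability([
  { action: "write", subject: "docs", conditions: { publisherId: 53 } }
]);

// Conditions are checked against the wrapped object's own properties.
console.log(ability.can("write", subject("docs", { publisherId: 53 }))); // true
console.log(ability.can("write", subject("docs", { publisherId: 99 }))); // false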
Reading the docs on Customizing the GraphQL Schema, I'm trying to figure out the following. If I have frontmatter like this:
---
title: Sample Post
date: 2019-04-01
fooId:
---
is it possible to set a default value for fooId? If I leave it empty in the markdown file, I get:
Cannot query field "fooId" on type "MdxFrontmatter".
If you don't expect "fooId" to exist on the type "MdxFrontmatter"
it is most likely a typo. However, if you expect "fooId" to exist
there are a couple of solutions to common problems:
If you added a new data source and/or changed something inside gatsby-node/gatsby-config, please try a restart of your development server.
You want to optionally use your field "fooId" and right now it is not used anywhere.
It is recommended to explicitly type your GraphQL schema if you want to use optional fields.
Attempt
exports.createSchemaCustomization = ({ actions, schema }) => {
  const { createTypes } = actions
  const typeDefs = [
    'type MarkdownRemark implements Node { frontmatter: Frontmatter }',
    schema.buildObjectType({
      name: 'Frontmatter',
      fields: {
        tags: {
          type: '[String!]',
          resolve(source) {
            const { fooId } = source
            if (fooId === null) return 'foo'
            return fooId
          },
        },
      },
    }),
  ]
  createTypes(typeDefs)
}
When I implement the above code, I still get the same error in the terminal. Is there a way in gatsby-node.js that I can default fooId?
Try it like this:
exports.createSchemaCustomization = ({ actions }) => {
  const { createTypes } = actions
  const typeDefs = `
    type MdxFrontmatter implements Node {
      fooId: String
    }
  `
  createTypes(typeDefs)
}
It is not a "default" value per se, as you mention, but using type definitions you can customize the expected shape of the node when it is fetched. By default, all (or most) of the values are inferred as non-nullable (in the case above, as String!). Using the previous type definition, you are setting fooId as a nullable value, meaning it is not required: omitting the exclamation mark (!), which marks non-nullability, allows fooId to be empty.
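If you additionally want an actual default value rather than just nullability, a field resolver can supply a fallback. A minimal sketch, assuming the MdxFrontmatter type from above (the fallback string 'default-foo' is a placeholder):

exports.createSchemaCustomization = ({ actions, schema }) => {
  const { createTypes } = actions
  createTypes(
    schema.buildObjectType({
      name: 'MdxFrontmatter',
      fields: {
        fooId: {
          type: 'String',
          // Fall back to a placeholder when the frontmatter field is empty.
          resolve: source => source.fooId ?? 'default-foo',
        },
      },
    })
  )
}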
Just wanted to point out that if you use exports.sourceNodes in Gatsby 4.19.2:
exports.sourceNodes = ({ actions }) => {
  const { createTypes } = actions
  const typeDefs = `
    type MdxFrontmatter implements Node {
      fooId: String
    }
  `
  createTypes(typeDefs)
}
you'll get a deprecation warning; to prevent this issue you should use createSchemaCustomization instead:
exports.createSchemaCustomization = ({ actions }) => {
  const { createTypes } = actions
  const typeDefs = `
    type MdxFrontmatter implements Node {
      fooId: String
    }
  `
  createTypes(typeDefs)
}
I am using Indicative in my project to validate my controllers, but Indicative doesn't have a "unique" rule among its validation rules. The Adonis framework, however, has a rule called "unique" that does exactly what I need.
My project is built with Adonis, but I prefer to use Indicative rather than Adonis's Validator, because I find it easier and cleaner to write the code directly in the controller.
const rules = {
  code: 'required|string|max:255',
  description: 'required|string|max:255|unique:tabela',
  authors: 'string|max:255',
  status: 'boolean',
  user_id: 'integer',
  created_at: [
    importValidate.validations.dateFormat(['YYYY-MM-DD HH:mm:ss'])
  ],
  updated_at: [
    importValidate.validations.dateFormat(['YYYY-MM-DD HH:mm:ss'])
  ]
}
In the example above, I need "code" to be unique and to return an error message and a response status. How can I do this?
The unique method of Validator automatically searches the database. I don't think it's possible to do this with Indicative.
I propose this solution (in your controller):
const { validate } = use('Validator')
...
const rules = {
  code: 'unique:<table_name>,<field_name>'
}
const messages = {
  'code.unique': '...'
}
const validation = await validate({ code: ... }, rules, messages)
if (validation.fails()) {
  ...
}
To use this rule it is necessary to use Validator; I don't think there's an equivalent in Indicative.
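Put together in a controller, that might look like the sketch below. It assumes an Adonis 4.x project; the ProductController name, the products table, and the message text are placeholders:

const { validate } = use('Validator')

class ProductController {
  async store({ request, response }) {
    const data = request.only(['code', 'description'])

    const rules = {
      // 'unique:products,code' checks the `code` column of the `products` table.
      code: 'required|string|max:255|unique:products,code'
    }
    const messages = {
      'code.unique': 'This code is already in use.'
    }

    const validation = await validate(data, rules, messages)
    if (validation.fails()) {
      // Return the validation messages with an explicit response status.
      return response.status(422).send(validation.messages())
    }

    // ... persist the record here
  }
}

module.exports = ProductController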
I am trying to implement a custom GraphQL directive. My understanding is that if my SchemaDirectiveVisitor subclass implements static getDirectiveDeclaration(directiveName, schema) then I don't have to manually declare the directive in my SDL (Schema Definition Language).
Because AuthDirective implements getDirectiveDeclaration, it’s no longer necessary for the schema author to include the directive @auth ... declaration explicitly in the schema. The returned GraphQLDirective object will be used to enforce the argument types and default values, as well as enabling tools like GraphiQL to discover the directive using schema introspection. Additionally, if the AuthDirective class fails to implement visitObject or visitFieldDefinition, a helpful error will be thrown.
Source: https://blog.apollographql.com/reusable-graphql-schema-directives-131fb3a177d1
and
However, if you’re implementing a reusable SchemaDirectiveVisitor for public consumption, you will probably not be the person writing the SDL syntax, so you may not have control over which directives the schema author decides to declare, and how. That’s why a well-implemented, reusable SchemaDirectiveVisitor should consider overriding the getDirectiveDeclaration method
Source: https://www.apollographql.com/docs/apollo-server/features/creating-directives.html
In my code, despite having implemented static getDirectiveDeclaration(directiveName, schema) I still have to declare the directive in SDL.
Shouldn't it work without manually declaring in SDL?
Full Example Code:
const { ApolloServer, gql, SchemaDirectiveVisitor } = require('apollo-server');
const { DirectiveLocation, GraphQLDirective, defaultFieldResolver } = require("graphql");

class UpperCaseDirective extends SchemaDirectiveVisitor {
  static getDirectiveDeclaration(directiveName, schema) {
    console.log("inside getDirectiveDeclaration", directiveName)
    return new GraphQLDirective({
      name: directiveName,
      locations: [
        DirectiveLocation.FIELD_DEFINITION,
      ],
      args: {}
    });
  }

  visitFieldDefinition(field) {
    console.log("inside visitFieldDefinition")
    const { resolve = defaultFieldResolver } = field;
    field.resolve = async function (...args) {
      const result = await resolve.apply(this, args);
      if (typeof result === 'string') {
        return result.toUpperCase();
      }
      return result;
    };
  }
}

const books = [
  {
    title: 'Harry Potter and the Chamber of Secrets',
    author: 'J.K. Rowling',
  },
  {
    title: 'Jurassic Park',
    author: 'Michael Crichton',
  },
];

const typeDefs = gql`
  #########################################
  # ONLY WORKS WITH THIS LINE UNCOMMENTED #
  #########################################
  directive @upper on FIELD_DEFINITION

  type Book {
    title: String
    author: String @upper
  }

  type Query {
    books: [Book]
  }
`;

const resolvers = {
  Query: {
    books: () => books,
  },
};

const server = new ApolloServer({
  typeDefs,
  resolvers,
  schemaDirectives: {
    upper: UpperCaseDirective
  }
});

server.listen().then(({ url }) => {
  console.log(`🚀 Server ready at ${url}`);
});
I have the same problem and was able to find this comment from graphql-tools issue #957.
From the changelog:
NOTE: graphql 14 includes breaking changes. We're bumping the major version of graphql-tools to accommodate those breaking changes. If you're planning on using graphql 14 with graphql-tools 4.0.0, please make sure you've reviewed the graphql breaking changes list.
This is likely caused by the fact that graphql-js now requires you to define your directives in your schema, before you attempt to use them. For example:
directive @upper on FIELD_DEFINITION

type TestObject {
  hello: String @upper
}
You can likely work around this by pre-defining your directives in your schema, but I'd like to confirm this. If this works, we'll need to update the docs.
I'm having a little trouble with an integration test for my mongoose application. The problem is that my unique setting gets constantly ignored. The schema looks more or less like this (so no fancy stuff in there):
const RealmSchema:Schema = new mongoose.Schema({
  Title : {
    type : String,
    required : true,
    unique : true
  },
  SchemaVersion : {
    type : String,
    default : SchemaVersion,
    enum: [ SchemaVersion ]
  }
}, {
  timestamps : {
    createdAt : "Created",
    updatedAt : "Updated"
  }
});
It looks like basically all the rules set in the schema are being ignored. I can pass in a Number/Boolean where a String was required. The only thing that does work is that fields which have not been declared in the schema won't be saved to the db.
First probable cause:
I have the feeling that it might have to do with the way I test. I have multiple integration tests. After each one, my database gets dropped (so I have the same conditions for every test and can precondition the database in that test).
Is it possible that the reason is my indices being dropped with the database and not being re-created when the next test creates the database and collection again? And if this is the case, how could I make sure that after every test I get an empty database that still respects all my schema settings?
Second probable cause:
I'm using TypeScript in this project. Maybe there is something wrong in how I define the Schema and the Model. This is what I do:
1. Create the Schema (code from above)
2. Create an interface for the model (where IRealmM extends the interface for use in mongoose)
import { SpecificAttributeSelect } from "../classes/class.specificAttribute.Select";
import { SpecificAttributeText } from "../classes/class.specificAttribute.Text";
import { Document } from "mongoose";

interface IRealm {
  Title : String;
  Attributes : (SpecificAttributeSelect | SpecificAttributeText)[];
}

interface IRealmM extends IRealm, Document {
}

export { IRealm, IRealmM }
3. Create the model
import mongoose, { Model } from 'mongoose';
import { RealmSchema } from '../schemas/schema.Realm';
import { IRealmM } from '../interfaces/interface.realm';

// Apply Authentication Plugin and create Model
const RealmModel:Model<IRealmM> = mongoose.model('realm', RealmSchema);

// Export the Model
export { RealmModel }
The unique option is not a validator. Check out this link from the Mongoose docs.
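To make that concrete: unique only tells Mongoose to build a unique index, so a violation surfaces as a MongoDB duplicate-key error (error code 11000) at write time, not as a Mongoose ValidationError. A minimal sketch of handling it, reusing the RealmModel from the question:

// Somewhere inside an async function with an open mongoose connection.
try {
  await new RealmModel({ Title: "Middle-earth" }).save();
  await new RealmModel({ Title: "Middle-earth" }).save(); // violates the index
} catch (err) {
  if (err.code === 11000) {
    // Duplicate-key error from MongoDB, not a Mongoose validation error.
    console.error("Title must be unique:", err.message);
  } else {
    throw err;
  }
}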
OK, I finally figured it out. The key issue is described here:
Mongoose Unique index not working!
Solstice333 states in his answer that ensureIndex is deprecated (a warning I had been getting for some time now; I thought it was still working though).
After adding .createIndexes() to the model, leaving me with the following code, it works (at least as long as I'm not testing; more on that after the code):
// Apply Authentication Plugin and create Model
const RealmModel:Model<IRealmM> = mongoose.model('realm', RealmSchema);
RealmModel.createIndexes();
Now the problem with this is that the indexes are set when your connection is first established, but not if you drop the database during your process (which, at least for me, occurs after every integration test).
So in my tests the resetDatabase function looks like this, to make sure all the indexes are set:
const resetDatabase = done => {
  if (mongoose.connection.readyState === 1) {
    mongoose.connection.db.dropDatabase(async () => {
      await resetIndexes(mongoose.models);
      done();
    });
  } else {
    mongoose.connection.once('open', () => {
      mongoose.connection.db.dropDatabase(async () => {
        await resetIndexes(mongoose.models);
        done();
      });
    });
  }
};

const resetIndexes = async (Models: Object) => {
  let indexesReset: any[] = [];
  for (let key in Models) {
    indexesReset.push(Models[key].createIndexes());
  }
  // Await all createIndexes() calls; without this the function resolves
  // before the indexes actually exist.
  await Promise.all(indexesReset);
  return true;
}
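For reference, a usage sketch (assuming a Mocha-style runner, which the done callback suggests):

beforeEach(function (done) {
  // Drop the database and rebuild the indexes before every test.
  resetDatabase(done);
});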
I have defined several models that use a datasource "db" (MySQL) for my environment.
Is there any way to have several datasources attached to those models, so I would be able to perform REST operations against different databases?
e.g.:
GET /api/Things?ds="db"
GET /api/Things?ds="anotherdb"
GET /api/Things (will use default ds)
As @superkhau pointed out above, each LoopBack Model can be attached to a single data source only.
You can create (subclass) a new model for each datasource you want to use. Then you can either expose these per-datasource models via unique REST URLs, or you can implement a wrapper model that will dispatch methods to the correct datasource-specific model.
In my example, I'll show how to expose per-datasource models for a Car model that is attached to db and anotherdb. The Car model is defined in the usual way via common/models/car.json and common/models/car.js.
Now you need to define per-datasource models:
// common/models/car-db.json
{
  "name": "Car-db",
  "base": "Car",
  "http": {
    "path": "/cars:db"
  }
}

// common/models/car-anotherdb.json
{
  "name": "Car-anotherdb",
  "base": "Car",
  "http": {
    "path": "/cars:anotherdb"
  }
}

// server/model-config.json
{
  "Car": {
    "dataSource": "default"
  },
  "Car-db": {
    "dataSource": "db"
  },
  "Car-anotherdb": {
    "dataSource": "anotherdb"
  }
}
Now you have the following URLs available:
GET /api/Cars:db
GET /api/Cars:anotherdb
GET /api/Cars
The solution outlined above has two limitations: you have to define a new model for each datasource and the datasource cannot be selected using a query parameter.
To fix that, you need a different approach. I'll again assume there is a Car model already defined.
Now you need to create a "dispatcher".
// common/models/car-dispatcher.json
{
  "name": "CarDispatcher",
  "base": "Model", // <-- important!
  "http": {
    "path": "/cars"
  }
}
// common/models/car-dispatcher.js
var loopback = require('loopback');

module.exports = function(CarDispatcher) {
  CarDispatcher.find = function(ds, filter, cb) {
    var model = this.findModelForDataSource(ds);
    model.find(filter, cb);
  };

  // a modified copy of remoting metadata from loopback/lib/persisted-model.js
  CarDispatcher.remoteMethod('find', {
    isStatic: true,
    description: 'Find all instances of the model matched by filter from the data source',
    accessType: 'READ',
    accepts: [
      { arg: 'ds', type: 'string', description: 'Name of the datasource to use' },
      { arg: 'filter', type: 'object', description: 'Filter defining fields, where, orderBy, offset, and limit' }
    ],
    returns: { arg: 'data', type: ['Car'], root: true },
    http: { verb: 'get', path: '/' }
  });

  // TODO: repeat the above for all methods you want to expose this way

  CarDispatcher.findModelForDataSource = function(dsName) {
    var app = this.app;
    var ds = (dsName && app.dataSources[dsName]) || app.dataSources.default;
    var modelName = 'Car-' + (dsName || 'default');
    var model = loopback.findModel(modelName);
    if (!model) {
      // Create a per-datasource subclass of Car on first use
      // and attach it to the selected datasource.
      model = loopback.createModel(modelName, {}, { base: 'Car' });
      model.attachTo(ds);
    }
    return model;
  };
};
The final bit is to remove Car and use CarDispatcher in the model config:
// server/model-config.json
{
  "CarDispatcher": {
    "dataSource": null,
    "public": true
  }
}
By default, you can only attach data sources on a per-model basis, meaning you can attach each model to a different data source via datasources.json.
For your use case, you will need to add a remote hook to each endpoint you want to serve from multiple data sources. In your remote hook, you will do something like:
...
var ds1 = Model.app.dataSources.ds1;
var ds2 = Model.app.dataSources.ds2;
// some logic to pick a data source
if (context.req.params...
...
See http://docs.strongloop.com/display/LB/Remote+hooks for more info.
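That hook might look roughly like the sketch below. This is an assumption-laden sketch, not code from the docs: it assumes LoopBack 2.x/3.x, a Thing model, and datasources named db and anotherdb defined in datasources.json:

// common/models/thing.js
module.exports = function(Thing) {
  Thing.beforeRemote('find', function(ctx, unused, next) {
    var app = Thing.app;
    // Pick the datasource from a query parameter, e.g. GET /api/Things?ds=anotherdb
    var ds = app.dataSources[ctx.req.query.ds] || app.dataSources.db;
    // Re-attach the model before the built-in find() runs. Note that
    // attachTo() changes the model globally, so concurrent requests
    // hitting different datasources may race.
    Thing.attachTo(ds);
    next();
  });
};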
For anyone still looking for a working answer to this: the solution for switching databases on the fly was to write a middleware script that examines the request path and then creates a new DataSource connector, passing in a variable based on req.path. For example, if the request path is /orders, then the string "orders" is saved in a variable, and we attach a new DataSource passing in that variable. Here's the complete working code.
'use strict';

const DataSource = require('loopback-datasource-juggler').DataSource;
const app = require('../server.js');

module.exports = function() {
  return function datasourceSelector(req, res, next) {
    // Check if the API request path contains one of our models.
    // We could use app.models() here, but that would also include
    // models we don't want.
    let $models = ['offers', 'orders', 'prducts'];
    // $path expects to be 'offers', 'orders', 'prducts'.
    let $path = req.path.toLowerCase().split("/")[1];
    // Run our function if the request path is equal to one of
    // our models, but not if it also includes 'count'. We don't
    // want to run this twice unnecessarily.
    if (($models.includes($path, 0)) && !(req.path.includes('count'))) {
      // The angular customer-select form adds a true value
      // to the selected property of only one customer model.
      // So we search the customers for that 'selected' = true.
      let customers = app.models.Customer;
      // customers.find() returns a Promise, so we need to get
      // our selected customer from the results.
      customers.find({"where": {"selected": true}}).then(function(result) {
        // Called if the operation succeeds.
        let customerDb = result[0].name;
        // Log the selected customer and the timestamp
        // it was selected. Needed for debugging and optimization.
        let date = new Date();
        console.log(customerDb, $path + req.path, date);
        // Use the existing veracore datasource config
        // since we can use its environment variables.
        let settings = app.dataSources.Veracore.settings;
        // Clear out the veracore options array since that
        // prevents us from changing databases.
        settings.options = null;
        // Add the selected customer to the new database value.
        settings.database = customerDb;
        try {
          let dataSource = new DataSource(settings);
          // Attach our models to the new database selection.
          app.models.Offer.attachTo(dataSource);
          app.models.Order.attachTo(dataSource);
          app.models.Prduct.attachTo(dataSource);
        } catch (err) {
          console.error(err);
        }
      })
      // Called if the customers.find() promise fails.
      .catch(function(err) {
        console.error(err);
      });
    } else {
      // We need a better solution for paths like '/orders/count'.
      console.log(req.path + ' was passed to datasourceSelector().');
    }
    next();
  };
};
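To wire this up, the middleware factory needs to be registered so it runs before the routes. A sketch of the registration, assuming the script above lives at server/middleware/datasource-selector.js:

// server/middleware.json (excerpt)
"routes:before": {
  "./middleware/datasource-selector": {}
}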
};