Foxx model schema with nested objects - arangodb

All the examples I've found for Foxx.Model schemas are flat - i.e. they don't include nested objects.
I'm trying to add a hash to save geo info on a model like this:
var Foo = Foxx.Model.extend({
  schema: {
    name: joi.string().required(),
    location: joi.object().keys({
      lat: joi.number(),
      lng: joi.number()
    })
  }
});
This shows up in the Foxx interface Data Type as this:
foo {
  name (string),
  location (object, optional)
}
How do I get it to show the key names 'lat' and 'lng' for the location object?
Or am I thinking about this incorrectly?

You are using it correctly, and it will work and validate your object as expected. The missing key names are just a limitation of the documentation tool used in the admin interface of ArangoDB.
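For what it's worth, you can confirm that the nested keys really are enforced by running the location schema through joi directly. A minimal sketch (plain joi, outside of Foxx and the admin UI):
var joi = require('joi');

var locationSchema = joi.object().keys({
  lat: joi.number(),
  lng: joi.number()
});

// nested keys are checked: no error
console.log(locationSchema.validate({ lat: 52.5, lng: 13.4 }).error);

// 'lat' must be a number: validation error
console.log(locationSchema.validate({ lat: 'north', lng: 13.4 }).error.message);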

Related

How to find out whether a nested document is an actual subschema or just an object in Mongoose

I have the following situation:
I am creating a general-purpose update function for my project. It takes a payload and walks through it, checking whether each property in the payload exists on the model schema; if it does, it assigns the new property value to the document (updating it). It does this recursively in subdocuments as well.
I have a custom-defined type, Language, for multi-language string fields; it is an object whose properties are language codes ('en', 'de', etc.). Since it's a custom type, Mongoose doesn't know whether its contents were modified, so I have to call markModified on it. And here comes the problem: actual subschemas behave differently from plain objects here. If I call markModified on a subschema, it expects a path within that subschema, not the entire document. On the other hand, if I call markModified on an object, it expects the entire path from the parent. I don't know whether this is a bug or not, but if I want to support both, I need to differentiate between the two in my function. Is there a way to know whether something is a subschema made by the user or just an object (that was converted to a subschema by Mongoose)?
Example setup model:
// Declared first so TestSchema can reference it below
const NestedTestSchema = new Schema(
  {
    name: {
      type: Language
    }
  }
)

const TestSchema = new Schema(
  {
    object: {
      name: {
        type: Language
      }
    },
    nestedSchema: {
      type: NestedTestSchema
    }
  }
)
Example code:
const testDocument = new TestModel({
  object: {
    name: {
      en: 'NameEN',
      de: 'NameDE'
    }
  },
  nestedSchema: {
    name: {
      en: 'NameEN',
      de: 'NameDE'
    }
  }
})
// We make a payload to change these values
const payload = {
  object: { // Update object
    name: {
      en: 'Name updated',
      fr: 'Something',
    }
  },
  nestedSchema: { // Update subschema
    name: {
      en: 'Name updated',
      fr: 'Something',
    }
  }
}
And now when I receive this and update the document with these values, for the subschema I have to do
const { object, nestedSchema } = document // This, of course, is useless here; I would get nestedSchema and object as arguments in the recursive function, it's only for demonstration
nestedSchema.markModified('name.en') // Etc.
and for the object I have to do
object.markModified('object.name.en') // Etc.
Together with my co-workers I found out that object is not an actual subschema; it is what Mongoose calls a nested path. Only nested paths have the $__isNested property; subschemas don't. This explains the different handling of the two cases: for nested paths we need to specify the full path when calling markModified, while for subschemas only the path within that subschema is needed.
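For reference, a rough sketch (not the asker's actual helper) of how the $__isNested flag could drive the markModified call inside a recursive update function; parentPath is a hypothetical accumulator of the path walked so far:
function markLanguageModified(doc, container, key, parentPath) {
  if (container.$__isNested) {
    // nested path: markModified needs the full path from the parent document
    doc.markModified(parentPath + '.' + key);
  } else {
    // real subdocument with its own schema: the path is relative to the subdocument
    container.markModified(key);
  }
}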

(Apollo) GraphQL Merging schema

I am using GraphQL tools/libraries offered by Apollo.
Is it possible to merge remote GraphQL schemas into a nested structure?
Assume I have n remote schemas, and I would like to merge the Query, Mutation and Subscription from the different schemas into a single schema, except that each remote schema is placed under its own type.
Assume we have a remote schema called MenuX with:
type Item {
  id: Int!
  description: String
  cost: Int
}

type Query {
  items: [Item]
}
and a remote schema called MenuY with:
type Item {
  id: Int!
  name: String
  cost: Int
}

type Query {
  items: [Item]
}
I would like to merge the schemas into a single schema, but with each one under its own type. A contrived example:
type MenuXItem {
  id: Int!
  description: String
  cost: Int
}

type MenuYItem {
  id: Int!
  name: String
  cost: Int
}

type MenuXQuery {
  items: [MenuXItem]
}

type MenuYQuery {
  items: [MenuYItem]
}

type Query {
  MenuX: MenuXQuery
  MenuY: MenuYQuery
}
As you can see, the Query type now contains two new fields whose types wrap the query types from the remote schemas. Item from schema MenuX has been renamed using the transformers from graphql-tools, and Item from schema MenuY has been transformed in the same way.
But is it possible to transform the structure as well?
With the actual remote schemas, we are looking at hundreds of types from each schema, and ideally, I would like to not pollute the root types and the Introspection documentation in GraphiQL.
Apollo's graphql-tools includes a module to transform schemas. You should be able to rename everything in MenuX with something like
import {
  makeRemoteExecutableSchema,
  transformSchema,
  RenameTypes
} from 'graphql-tools';

const schema = makeRemoteExecutableSchema(...);

const menuXSchema = transformSchema(schema, [
  RenameTypes((name) => `MenuX${name}`)
]);
Then you can use the transformed schema as the input to mergeSchemas.
Note that the top-level Query and Mutation types are somewhat special and you may want to try to more directly merge those types without renaming them, particularly if they don't conflict.
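A sketch of that final step, assuming menuYSchema was produced the same way as menuXSchema above:
import { mergeSchemas } from 'graphql-tools';

const mergedSchema = mergeSchemas({
  schemas: [menuXSchema, menuYSchema]
});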
There is a plugin for Gatsby that contains a transformer that does what you want: https://github.com/gatsbyjs/gatsby/blob/master/packages/gatsby-source-graphql/src/transforms.js
It namespaces the types of an existing GraphQL schema, so that you end up with:
type Namespace1Item {
  ...
}

type Namespace2Item {
  ...
}

type Namespace1Query {
  items: [Namespace1Item]
}

type Namespace2Query {
  items: [Namespace2Item]
}

type Query {
  namespace1: Namespace1Query
  namespace2: Namespace2Query
}
So, if you transform your schemas and then merge them, you should be good.
It's possible to achieve a schema where the Query type has root fields as entry points into the source schemas, but only for the Query type, since the Mutation type doesn't support nesting mutations under names.
For this reason, prefixing names is the preferred solution for schema stitching.
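For example, a sketch of the prefixing approach using graphql-tools transforms (the menuX_ prefix is arbitrary, and schema is the remote executable schema from earlier):
import { transformSchema, RenameTypes, RenameRootFields } from 'graphql-tools';

const prefixedMenuX = transformSchema(schema, [
  RenameTypes((name) => `MenuX${name}`),
  RenameRootFields((operation, name) => `menuX_${name}`)
]);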

update a portion of array in graphQL

So I've decided to use GraphQL as my query engine alongside MongoDB. I created my schemas and everything looks great, BUT one of my schemas contains a list of strings, for instance:
exports.default = new gql.GraphQLInputObjectType({
  name: 'myModel',
  fields: {
    type: { type: gql.GraphQLString },
    workingDays: { type: new gql.GraphQLList(gql.GraphQLString) }
  }
});
So the workingDays list has 50 elements and I'd like to change just one of them; is there a way to do that with GraphQL?
It just so happens to be a string type inside, but it could be an object as well.
Thanks.
You can add a new mutation that encodes this functionality.
For example, updateWorkingDays(modelId: ID!, index: Int!, workDay: String), which updates the working day of model modelId at index to the new workDay.
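A sketch of what that could look like with graphql-js and Mongoose (names are illustrative; MyModel is a hypothetical Mongoose model backing the type above):
const mutationType = new gql.GraphQLObjectType({
  name: 'Mutation',
  fields: {
    updateWorkingDays: {
      type: gql.GraphQLBoolean,
      args: {
        modelId: { type: new gql.GraphQLNonNull(gql.GraphQLID) },
        index: { type: new gql.GraphQLNonNull(gql.GraphQLInt) },
        workDay: { type: gql.GraphQLString }
      },
      resolve: async (_, { modelId, index, workDay }) => {
        // $set with a computed key updates a single array element by its index
        await MyModel.updateOne(
          { _id: modelId },
          { $set: { ['workingDays.' + index]: workDay } }
        );
        return true;
      }
    }
  }
});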

Handling Mongoose Populated Fields in GraphQL

How do I represent a field that could be either a simple ObjectId string or a populated Object Entity?
I have a Mongoose Schema that represents a 'Device type' as follows
// assetSchema.js
import * as mongoose from 'mongoose'
const Schema = mongoose.Schema;

var Asset = new Schema({
  name: String,
  linked_device: {
    type: Schema.Types.ObjectId,
    ref: 'Asset'
  }
});

export const AssetSchema = mongoose.model('Asset', Asset);
I am trying to model this as a GraphQLObjectType, but I am stumped on how to allow the linked_device field to take on two types of values, one being an ObjectId and the other being a full Asset object (when it is populated):
// graphql-asset-type.js
import { GraphQLObjectType, GraphQLString } from 'graphql'

export var GQAssetType = new GraphQLObjectType({
  name: 'Asset',
  fields: () => ({
    name: GraphQLString,
    linked_device: ____________ // stumped by this
  })
});
I have looked into Union Types but the issue is that a Union Type expects fields to be stipulated as part of its definition, whereas in the case of the above, there are no fields beneath the linked_device field when linked_device corresponds to a simple ObjectId.
Any ideas?
As a matter of fact, you can use a union or interface type for the linked_device field.
Using union type, you can implement GQAssetType as follows:
// graphql-asset-type.js
import { GraphQLObjectType, GraphQLString, GraphQLUnionType } from 'graphql'

var LinkedDeviceType = new GraphQLUnionType({
  name: 'LinkedDevice', // GraphQL type names cannot contain spaces
  types: () => [ ObjectIdType, GQAssetType ], // thunk so GQAssetType (declared below) is resolved lazily
  resolveType(value) {
    if (value instanceof ObjectId) {
      return ObjectIdType;
    }
    if (value instanceof Asset) {
      return GQAssetType;
    }
  }
});

export var GQAssetType = new GraphQLObjectType({
  name: 'Asset',
  fields: () => ({
    name: { type: GraphQLString },
    linked_device: { type: LinkedDeviceType },
  })
});
Check out this excellent article on GraphQL union and interface.
I was trying to solve the general problem of pulling relational data when I came across this article. To be clear, the original question appears to be how to dynamically resolve data when the field may contain either the ObjectId or the object; however, I don't believe it's good design in the first place to have a field store either an object or an ObjectId.
Accordingly, I was interested in solving the simplified scenario where I keep the fields separated: one for the Id and the other for the object. I also thought employing unions was overly complex unless you actually have another scenario like those described in the docs referenced above. I figured the solution below may interest others as well...
Note: I'm using graphql-tools, so my types are written in schema language syntax. So, if you have a User type with fields like this:
type User {
  _id: ID
  firstName: String
  lastName: String
  companyId: ID
  company: Company
}
Then in my user resolver functions code, I add this:
User: { // <-- this refers to the User type in GraphQL
  company(u) { // <-- this refers to the company field
    return Company.findOne({ _id: u.companyId }); // <-- Mongoose Company model
  },
}
The above works alongside the User resolver functions already in place, and allows you to write GQL queries like this:
query getUserById($_id: ID!) {
  getUserById(_id: $_id) {
    _id
    firstName
    lastName
    company {
      name
    }
    companyId
  }
}
Regards,
S. Arora

How to get enum values from mongoose schema using virtual method?

I'm having difficulty getting enum values from my Mongoose schema using a virtual method on that same schema.
The property I'm trying to access in the schema is defined as follows:
roles: {
  type: [{
    type: String,
    enum: ['user', 'admin']
  }],
  default: ['user']
}
The following is my virtual method I'm using to grab the enum values:
// Returns an array of all possible role enum values
UserSchema.virtual('possibleRoles').get(function() {
  return this.schema.path('roles').caster.enumValues;
});
This works, however other examples I found online went about it in a different way. An example of this is here: Access the list of valid values for an Enum field in a Mongoose.js Schema
Is my method for accessing enums on a property dirty or incorrect? Is there a cleaner way I could write this?
This is a clean and easy way:
var possibleRoles = ['user', 'admin'];

var UserSchema = new Schema({
  roles: {
    type: [{ type: String, enum: possibleRoles }],
    default: ['user']
  }
});

UserSchema.virtual('possibleRoles').get(function () {
  return possibleRoles;
});
Remove the caster part; I don't know why that is there:
return this.schema.path('roles').enumValues;
That should work without any other issues.
