I'm using spring-data Mongo (1.3.3) as a mechanism for accessing Mongo.
My domain objects are written in Groovy and I use Jackson annotations to define properties and names:
@JsonProperty('is_author')
boolean author = false

@JsonProperty('author_info')
AuthorInfo authorInfo
When I persist one of my domain objects to Mongo, the @JsonProperty annotation is ignored and the field is persisted under the object's plain field name.
Digging into the Spring Data Mongo documentation, I found that the library expects a @Field annotation to change a field's name in Mongo.
Is there a way to use only the Jackson annotations, instead of two annotations to achieve the same result? Maybe a "customized" version of MappingMongoConverter?
Since my application is in Groovy, I used the new @AnnotationCollector AST transformation (http://blog.andresteingress.com/2013/01/25/groovy-2-1-the-annotationcollector-annotation/) to "merge" the Jackson and Spring Data Mongo annotations. Here is how it looks: simple and effective!
package com.someapp

import com.fasterxml.jackson.annotation.JsonProperty
import groovy.transform.AnnotationCollector
import org.springframework.data.mongodb.core.mapping.Field

@AnnotationCollector([Field, JsonProperty])
public @interface JsonMongoProperty {}
And here is how it is used:
@JsonMongoProperty('is_author')
boolean author = false

@JsonMongoProperty('author_info')
AuthorInfo authorInfo
I am following the blog below, which explains how to create an operator and import another CR into an existing one.
http://heidloff.net/article/accessing-third-party-custom-resources-go-operators/
Here, https://github.com/nheidloff/operator-sample-go/blob/aa9fd15605a54f712e1233423236bd152940f238/operator-application/controllers/application_controller.go#L276, the spec is created with hardcoded properties.
I want to import the spark operator types into my operator:
https://github.com/GoogleCloudPlatform/spark-on-k8s-operator/blob/master/pkg/apis/sparkoperator.k8s.io/v1beta2/types.go
This spark operator has, say, 100+ types/properties. By following the blog above, I could create the Go object, but it would be hardcoded. I want to create a dynamic object based on user-provided values in the CR YAML; e.g. a customer can provide 25 attributes, sometimes 50, for a Spark app. I need the object built dynamically from the user's YAML. Can anybody please help me out?
If you set the spec type to be a JSON object, you can have the Spec contain arbitrary JSON/YAML. You don't have to have a strongly typed Spec object; your operator can then decode it and do whatever you want with it during your reconcile operation, as long as you can serialize and deserialize it from JSON. You should be able to set it to json.RawMessage, I think?
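For illustration, a rough sketch of what that could look like; the MyAppSpec and SparkTemplate names are hypothetical, not from the blog:

package v1alpha1

import "encoding/json"

// MyAppSpec is a hypothetical CRD spec that keeps part of its schema open.
type MyAppSpec struct {
	// Strongly typed fields can coexist with the raw payload.
	Name string `json:"name,omitempty"`

	// SparkTemplate captures whatever the user provides under this key in
	// the CR YAML as raw JSON bytes, without a fixed Go struct.
	SparkTemplate json.RawMessage `json:"sparkTemplate,omitempty"`
}

// decodeTemplate shows how a reconciler could turn the raw payload into a
// generic map and inspect only the attributes the user actually set.
func decodeTemplate(spec MyAppSpec) (map[string]interface{}, error) {
	out := map[string]interface{}{}
	if err := json.Unmarshal(spec.SparkTemplate, &out); err != nil {
		return nil, err
	}
	return out, nil
}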
What do you mean by hardcoded properties?
If I understood it correctly, you want to define an API for a resource which uses both types from an external operator and your own custom ones. You can extend your API using types from the external project, such as ScheduledSparkApplicationSpec from the spark operator linked above. Here is an example API definition in Go:
import (
	v1 "k8s.io/api/core/v1"

	"github.com/GoogleCloudPlatform/spark-on-k8s-operator/pkg/apis/sparkoperator.k8s.io/v1beta2"
)

type MyKindSpec struct {
	// using the external third-party API (imported above)
	SparkAppTemplate v1beta2.ScheduledSparkApplicationSpec `json:"sparkAppTemplate,omitempty"`

	// using the Kubernetes core API (imported above)
	Container v1.Container `json:"container,omitempty"`

	// using custom types
	MyCustomType MyCustomType `json:"myCustomType,omitempty"`
}

type MyCustomType struct {
	FirstField  string `json:"firstField,omitempty"`
	SecondField []int  `json:"secondField,omitempty"`
}
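As a rough illustration, a CR built on this spec could then carry whichever subset of the spark operator's fields the user wants under sparkAppTemplate; the values below are made up:

apiVersion: example.com/v1alpha1
kind: MyKind
metadata:
  name: sample
spec:
  sparkAppTemplate:
    schedule: "@every 5m"
    template:
      type: Scala
      mainClass: org.apache.spark.examples.SparkPi
  container:
    name: sidecar
    image: busybox:1.36
  myCustomType:
    firstField: hello
    secondField: [1, 2, 3]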
I am using the nu.studer.jooq Gradle plugin to generate POJOs, tables, and records for a PostgreSQL database whose tables have fields of type ENUM.
We already have the enums in the application, so I would like the generator to use those existing enums instead of generating new ones.
In build.gradle I set udts = false for the generator, so it doesn't generate the enums, and I wrote a custom generator strategy that sets the package for the enums to that of the already existing ones.
The issue is in the generated table fields: SQLDataType.VARCHAR.asEnumDataType(mypackage.ExistingEnum.class) doesn't work, because mypackage.ExistingEnum does not implement org.jooq.EnumType.
public enum ExistingEnum {
    VAL1, VAL2
}
Generated table record:
public class EntryTable extends TableImpl<EntryRecord> {
    public final TableField<EntryRecord, ExistingEnum> MY_FIELD =
        createField(DSL.name("my_field"), SQLDataType.VARCHAR.asEnumDataType(mypackage.ExistingEnum.class), this, "");
}
Is there something I can do to fix this issue? Also we have a lot of enums, so writing a converter for each of them is not suitable.
The point of having custom enum types is that they are individual types, independent of whatever you encode with your database enum types. As such, the jOOQ code generator cannot make any automated assumptions related to how to map the generated types to the custom types. You'll have to implement Converter types of some sort.
If you're not relying on the jOOQ provided EnumType types, you could use the <enumConverter/> configuration, or write implementations based on org.jooq.impl.EnumConverter, which help reduce boilerplate code.
If you have some conventions or rules how to map things a bit more automatically (just because jOOQ doesn't know your convention doesn't mean you don't know it either), you could implement a programmatic code generation configuration, where you query your dictionary views (e.g. PG_CATALOG.PG_ENUM) to generate ForcedType objects. You can even use jOOQ-meta for that purpose.
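As a rough sketch, a converter based on org.jooq.impl.EnumConverter for the ExistingEnum above could look like this; the class name is illustrative:

import org.jooq.impl.EnumConverter;

// Maps the database's VARCHAR value onto the existing application enum by
// its literal name, without ExistingEnum having to implement org.jooq.EnumType.
public class ExistingEnumConverter extends EnumConverter<String, ExistingEnum> {

    public ExistingEnumConverter() {
        super(String.class, ExistingEnum.class);
    }
}

A converter like this can then be attached to the matching columns via <forcedTypes/> in the code generator configuration, which is also what a programmatic configuration querying PG_CATALOG.PG_ENUM would emit.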
I am trying to run a query from the results of another query, using with in the Objection ORM.
ex:
Model.query().with(alias, query).select(columns).from(alias);
According to the Knex documentation, which is linked from the Objection docs, this should work fine. However, when I run the code, Objection prepends the schema name to the alias and I get an error stating that relation schema.alias does not exist. I tried using raw, but this did not help either.
ex:
Model.query().with(alias, query).select(columns).from(raw(alias));
Is there a way for me to select the table/alias defined in the with method without Objection prepending the schema to it?
The query method of the model I was using was overridden with code that specified the schema
ex:
class MyModel extends BaseModel {
  static query() {
    return super.query().withSchema(schema);
  }
}
To get around this issue I used the query method of the parent class directly rather than the overridden query method of the model I was using.
This solves my current problem, but does not answer the question of whether one could omit the prepended schema name in the from method.
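For reference, a hedged sketch of that workaround; it assumes BaseModel itself applies no schema:

// Call the parent class's static query() with MyModel bound as `this`,
// so the builder targets MyModel but skips the withSchema(...) override.
const rows = await BaseModel.query.call(MyModel)
  .with('recent_entries', MyModel.knex()('entries').select('*'))
  .select('*')
  .from('recent_entries');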
I have some fields with a known mapping and some unknown ones, and I want to store them all.
Mapping:
class MyDoctype(DocType):
    ...
    known_field = String(index='not_analyzed')
    ...
    unknown_dict = Nested()  # How can I store this dict ???
This should be possible, as Elasticsearch 2.x can handle such a mixed mapping.
Is elasticsearch-dsl based on strict mappings behind the scenes?
I also looked at the persistence docs, but they seem to rely on strict mappings everywhere.
You can use Object.
Tested on Elasticsearch 6.x and elasticsearch-dsl 6.x:
from elasticsearch_dsl import DocType, Keyword, Object

class MyDoctype(DocType):
    ...
    # Keyword replaces String(index='not_analyzed'), which was removed in 5.x+
    known_field = Keyword()
    ...
    unknown_dict = Object()
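A quick usage sketch; the index name and field values here are made up:

# Object() accepts arbitrary nested dicts without declaring their fields.
doc = MyDoctype(
    known_field='alpha',
    unknown_dict={'color': 'red', 'dims': {'w': 10, 'h': 4}},
)
doc.save(index='my-index')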
Is there a way we can use ObjectContext with DbContext's ModelBuilder? We don't want to use POCOs, because we have custom property code that does not modify the entire object on update, but only updates the modified properties. We also have lots of serialisation and auditing code that uses EntityObject.
Since POCO creates a proxy with EntityObject, we want our classes to be derived from EntityObject. We don't want proxies. We also use CreateSourceQuery heavily. The only problem is the EDMX file and its big connection string syntax in web.config.
Is there any way I can get rid of the EDMX file? That would be useful, as we could dynamically compile new classes based on reverse engineering the database.
I would also like to use DbContext with EntityObject instead of POCOs.
Internal Logic
1. Access modified properties in SaveChanges (available via ObjectStateEntry) and save them to an audit with the old and new values.
2. Most of the time we only need to check an Any condition on a navigation property, for example:
User.EmailAddresses.CreateSourceQuery()
    .Any(x => x.EmailAddress == givenAddress);
3. Access property attributes, such as XmlIgnore etc.; we rely heavily on attributes defined on the properties.
A proxy for a POCO is a dynamically created class which derives from (inherits) the POCO. It adds functionality previously found in EntityObject, namely lazy loading and change tracking, as long as the POCO meets certain requirements. A POCO or its proxy does not contain an EntityObject as the question suggests; rather, a proxy reimplements the functionality of EntityObject. You cannot (AFAIK) use ModelBuilder with EntityObject derivatives, and you cannot get to an underlying EntityObject from a POCO or a proxy, since there isn't one as such.
I don't know which features of ObjectContext your existing serialisation and auditing code uses, but you can get to the ObjectContext from a DbContext by casting the DbContext to IObjectContextAdapter and accessing the IObjectContextAdapter.ObjectContext property.
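For example:

using System.Data.Entity.Infrastructure;

// Every DbContext implements IObjectContextAdapter explicitly, so the cast
// exposes the underlying ObjectContext.
var objectContext = ((IObjectContextAdapter)myDbContext).ObjectContext;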
EDIT:
1. Access modified properties in SaveChanges (available via ObjectStateEntry) and save them to an audit with the old and new values
You can achieve this with POCOs by using DbContext.ChangeTracker. First you call DbContext.ChangeTracker.DetectChanges() to detect the changes (if you use proxies this is not needed, but it can't hurt), and then you use DbContext.ChangeTracker.Entries().Where(e => e.State != EntityState.Unchanged && e.State != EntityState.Detached) to get the DbEntityEntry list of changed entities for auditing. Each DbEntityEntry has OriginalValues and CurrentValues, and the actual entity is in the Entity property.
You also have access to ObjectStateEntry, see below.
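A minimal sketch of such an auditing pass over the change tracker; the audit write itself is left as a comment:

using System.Data.Entity;
using System.Linq;

public static class AuditHelper
{
    public static void AuditChanges(DbContext context)
    {
        context.ChangeTracker.DetectChanges();

        var changed = context.ChangeTracker.Entries()
            .Where(e => e.State != EntityState.Unchanged
                     && e.State != EntityState.Detached);

        foreach (var entry in changed)
        {
            // OriginalValues is only meaningful for modified entities here.
            if (entry.State != EntityState.Modified)
                continue;

            foreach (var name in entry.OriginalValues.PropertyNames)
            {
                var oldValue = entry.OriginalValues[name];
                var newValue = entry.CurrentValues[name];
                if (!Equals(oldValue, newValue))
                {
                    // Write entry.Entity, name, oldValue and newValue
                    // to your audit store here.
                }
            }
        }
    }
}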
2. Most of the time we only need to check an Any condition on a navigation property, for example:
User.EmailAddresses.CreateSourceQuery().Any(x => x.EmailAddress == givenAddress);
You can use CreateSourceQuery() with DbContext by utilizing IObjectContextAdapter as described previously. When you have ObjectContext you can get to the source query for a related end like this:
// EF6 namespaces; in EF5 the equivalent types live under System.Data.* instead.
using System;
using System.Collections.Generic;
using System.Data.Entity.Core.Metadata.Edm;
using System.Data.Entity.Core.Objects;
using System.Data.Entity.Core.Objects.DataClasses;
using System.Data.Entity.Infrastructure;
using System.Linq.Expressions;

public static class DbContextUtils
{
    public static ObjectQuery<TMember> CreateSourceQuery<TEntity, TMember>(this IObjectContextAdapter adapter, TEntity entity, Expression<Func<TEntity, ICollection<TMember>>> memberSelector) where TMember : class
    {
        var objectStateManager = adapter.ObjectContext.ObjectStateManager;
        var objectStateEntry = objectStateManager.GetObjectStateEntry(entity);
        var relationshipManager = objectStateManager.GetRelationshipManager(entity);
        // Resolve the navigation property named in the member selector expression.
        var entityType = (EntityType)objectStateEntry.EntitySet.ElementType;
        var navigationProperty = entityType.NavigationProperties[(memberSelector.Body as MemberExpression).Member.Name];
        var relatedEnd = relationshipManager.GetRelatedEnd(navigationProperty.RelationshipType.FullName, navigationProperty.ToEndMember.Name);
        return ((EntityCollection<TMember>)relatedEnd).CreateSourceQuery();
    }
}
This method uses no dynamic code and is strongly typed since it uses expressions. You use it like this:
myDbContext.CreateSourceQuery(invoice, i => i.details);