Here are my mappings for Complex and Dish:
public class ComplexMapping : ClassMap<Complex>
{
    public ComplexMapping()
    {
        Table("ComplexTable");
        Id(comp => comp.Id, "ComplexId").GeneratedBy.Identity();
        Map(comp => comp.Name, "Name").Not.Nullable();
        Map(comp => comp.Subscribe, "DescriptionComplex");
        HasManyToMany(comp => comp.ScrollOfDish)
            .Table("ComplexDish")
            .ParentKeyColumn("ComplexId")
            .ChildKeyColumn("DishId")
            .Cascade.All();
    }
}
public class DishMapping : ClassMap<Dish>
{
    public DishMapping()
    {
        Table("DishTable");
        Id(dish => dish.Id, "DishId").GeneratedBy.Identity();
        Map(dish => dish.Name);
        Map(dish => dish.Description);
        Map(dish => dish.Price);
        References(x => x.Category, "CategoryId").Cascade.None();
        HasManyToMany(dish => dish.Scroll)
            .Table("ComplexDish")
            .ParentKeyColumn("DishId")
            .ChildKeyColumn("ComplexId")
            .Inverse();
    }
}
I use the DAO pattern, and when data comes in from the front-end I create the needed object.
The object is saved, but not completely: only the name and description are persisted, while the collection of dishes is not. I think I forgot some simple thing; please help me.
Usually, a many-to-many to me represents an association between two separate entities, where the entities manage their own life cycles.
For example, in your scenario:
var firstDish = new Dish();
var secondDish = new Dish();

// 01 -- both dish objects are now attached to the session
session.SaveOrUpdate(firstDish);
session.SaveOrUpdate(secondDish);

var firstComplexObject = new Complex();
firstComplexObject.ScrollOfDish.Add(firstDish);
firstComplexObject.ScrollOfDish.Add(secondDish);

// 02 -- first Complex object now attached to the session
session.SaveOrUpdate(firstComplexObject);

var secondComplexObject = new Complex();
secondComplexObject.ScrollOfDish.Add(firstDish);
secondComplexObject.ScrollOfDish.Add(secondDish);

// 03 -- second Complex object now attached to the session
session.SaveOrUpdate(secondComplexObject);
I would avoid having the Complex object manage the life cycle of the Dish objects, like this:
var firstDish = new Dish();
var secondDish = new Dish();

var firstComplexObject = new Complex();
firstComplexObject.ScrollOfDish.Add(firstDish);
firstComplexObject.ScrollOfDish.Add(secondDish);

// The dish objects are not attached to the session,
// so NHibernate has to save the entire object graph here!
// 01 -- first Complex object now attached to the session
session.SaveOrUpdate(firstComplexObject);

var secondComplexObject = new Complex();
secondComplexObject.ScrollOfDish.Add(firstDish);
secondComplexObject.ScrollOfDish.Add(secondDish);

// 02 -- second Complex object now attached to the session
session.SaveOrUpdate(secondComplexObject);
Also, since dishes will certainly be shared between two Complex objects, it makes sense not to cascade a DELETE from the complex item to the dish.
Hence I would make sure that you DO manage the life cycles separately.
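To reflect that in the mapping, a possible tweak (a sketch against the Fluent NHibernate API, using the class and column names from the mappings above) is to cascade only saves and updates from Complex to its dishes, so deleting a Complex removes the link rows but never the dishes:

```csharp
// Sketch: Complex side of the many-to-many, cascading save/update only.
HasManyToMany(comp => comp.ScrollOfDish)
    .Table("ComplexDish")
    .ParentKeyColumn("ComplexId")
    .ChildKeyColumn("DishId")
    .Cascade.SaveUpdate(); // persist new/changed dishes, never delete them
```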
Hope this pushes you in the right direction.
Let's assume I have two models in Objection, Books and Chapters. What I want to achieve is, before inserting a Book into the DB, to save one default entry in the Chapters table, take the created chapter.id, and save it on the Book.
What is the best approach?
What I do now is use the $beforeInsert hook to save the Chapter record first and then create the book (I describe it below in the code).
But I think this may raise issues if the last query fails, and in general it is bad practice to put business logic in the data model. I was thinking of creating a BookService that would include functions like createNewBook, where I would use transactions to do all the inserts and logic against the DB. Then I could use that BookService in whatever controller needs the logic.
What do you think is the best approach?
Should I couple this logic to the model layer, or move it to a service layer?
What I do right now is:
Controller -- createBook.js
const Book = require('Book')

createBook: () => {
  const newBook = new Book({ title: "My Title", chapters: [], author: "My Author" })
  newBook.save()
}
Model -- Book.js
const Chapter = require('Chapter')
const { Model } = require('objection');

class Book extends Model {
  static get tableName() {
    return 'books';
  }

  $beforeInsert() {
    super.$beforeInsert()
    const newChapter = new Chapter({ title: 'FrontMatter' })
    const savedChapter = newChapter.save()
    this.chapters.push(savedChapter.id)
  }
}

module.exports = Book;
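One way to move this into a service layer, as the question itself suggests, is to wrap both inserts in a single Objection/knex transaction so they commit or roll back together. A sketch (the module paths, the createNewBook name, and a knex instance already bound to the models are assumptions):

```javascript
// bookService.js -- hypothetical service module
const { transaction } = require('objection');
const Book = require('./Book');       // assumed model paths
const Chapter = require('./Chapter');

async function createNewBook(bookData) {
  // Both inserts run in one transaction: if either fails, nothing persists
  return transaction(Book.knex(), async (trx) => {
    const chapter = await Chapter.query(trx).insert({ title: 'FrontMatter' });
    return Book.query(trx).insert({
      ...bookData,
      chapters: [chapter.id],
    });
  });
}

module.exports = { createNewBook };
```

A controller can then call createNewBook({ title: "My Title", author: "My Author" }) without knowing about chapters, and the model stays free of business logic.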
The tables in my Entity Framework model are events, eventtypes, subevents, and subeventtypes.
Using the MVC5 scaffolding (right-click on Controllers, Add, Add Controller) I created controllers and views for the last three tables without issue. However, when I create the controller and views for the events entity, I get the following errors:
Keyword, identifier, or string expected after verbatim specifier: @
'EventType' is a type, which is not valid in the given context
The code that was generated in the events controller is:
{
    private Entities db = new Entities();

    // GET: Events
    public ActionResult Index()
    {
        var events = db.Events.Include(@ => @.EventType); // ERROR HERE
        return View(events.ToList());
    }
Any help with this issue would be greatly appreciated.
TIA
I experienced the same issue when using the "MVC Controller with views, using Entity Framework" template.
var @group = await _context.Groups
    .Include(@ => @.Company)
    .FirstOrDefaultAsync(m => m.GroupId == id);
My workaround was simply to replace the @ symbol in the lambda with another character, e.g. g:
var @group = await _context.Groups
    .Include(g => g.Company)
    .FirstOrDefaultAsync(m => m.GroupId == id);
I have two questions that can be answered separately.
Q#1
I am trying to save round trips to the database server.
Here's my algorithm:
1. Insert 2 entities (to get their IDs, generated by the database).
2. Use the IDs returned to call a stored procedure, passing it the IDs.
The stored procedure takes the IDs and populates an adjacency-list table, which I am using to store a directed acyclic graph.
Currently I have a round trip to the RDBMS for each parent-child relationship, plus one for the insert of the entities.
I am known to do stuff like this:
public override int SaveChanges()
{
    foreach (var entry in this.ChangeTracker.Entries()
        .Where(e => e.State == System.Data.EntityState.Added).ToList())
    {
        if (entry.Entity is IRobot)
        {
            entry.Reference("Owner").CurrentValue = skyNet;
        }
    }
    return base.SaveChanges();
}
So I was wondering if there was a way that I can detect an EntityState.Added for an "ADD" that was done similar to the following code:
var robot = new Robot();
skyNet.Robots.Add(robot);
db.Add(skyNet);
db.SaveChanges();
So that I can do something like this (note that this is pseudocode):
public override int SaveChanges()
{
    foreach (var entry in this.ChangeTracker.Entries()
        .Where(e => e.State == EntityState.**AddedToCollection**).ToList())
    {
        db.Relate(parent: skyNet, child: entry.Entity);
    }
    return base.SaveChanges();
}
Q#2
Is there any way to call a stored procedure as part of the same "trip" to the database after calling SaveChanges()?
Question 1
You can detect the state of an entity by
db.Entry(robot).State
After the line
skyNet.Robots.Add(robot);
the EntityState of robot will be Added. However, in your pseudocode it is not clear where the skyNet variable comes from. If you add skyNet as you do in your code snippet, you could do:
foreach (var skyNet in ChangeTracker.Entries()
    .Where(e => e.State == EntityState.Added)
    .Select(e => e.Entity)
    .OfType<SkyNet>())
{
    foreach (var robot in skyNet.Robots
        .Where(r => db.Entry(r).State == EntityState.Added))
    {
        db.Relate(parent: skyNet, child: robot);
    }
}
Question 2
You can't call a stored procedure in the same roundtrip; that would require something like NHibernate's multi-query. But you can wrap SaveChanges and a stored procedure call in one transaction (which I think is what you mean) by using TransactionScope:
using (TransactionScope scope = new TransactionScope())
{
    // stored procedure call here
    db.SaveChanges();
    scope.Complete();
}
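To make the placeholder concrete: in EF 5 a stored procedure can be invoked inside the scope through Database.ExecuteSqlCommand. A sketch for the adjacency-list scenario from Q#1, where the procedure needs the IDs generated by SaveChanges (the procedure name and parameters are made-up illustrations):

```csharp
using (TransactionScope scope = new TransactionScope())
{
    // SaveChanges first, so the database-generated IDs are available
    db.SaveChanges();

    // Hypothetical procedure that records a parent-child edge
    // in the adjacency-list table
    db.Database.ExecuteSqlCommand(
        "EXEC dbo.RelateEntities @p0, @p1", parent.Id, child.Id);

    scope.Complete();
}
```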
I am using Entity Framework 5 (DbContext) and I am trying to find the best way to deep-copy an entity (i.e. copy the entity and all related objects) and then save the new entities in the database. How can I do this? I have looked into using extension methods such as CloneHelper, but I am not sure whether it applies to DbContext.
One cheap easy way of cloning an entity is to do something like this:
var originalEntity = Context.MySet.AsNoTracking()
    .FirstOrDefault(e => e.Id == 1);
Context.MySet.Add(originalEntity);
Context.SaveChanges();
The trick here is AsNoTracking(): when you load an entity like this, your context does not know about it, and when you call SaveChanges it will be treated as a new entity.
If MySet has a reference to MyProperty and you want a copy of it too, just use an Include:
var originalEntity = Context.MySet.Include("MyProperty")
    .AsNoTracking()
    .FirstOrDefault(e => e.Id == 1);
Here's another option.
I prefer it in some cases because it does not require you to run a query specifically to get data to be cloned. You can use this method to create clones of entities you've already obtained from the database.
//Get entity to be cloned
var source = Context.ExampleRows.FirstOrDefault();
//Create and add clone object to context before setting its values
var clone = new ExampleRow();
Context.ExampleRows.Add(clone);
//Copy values from source to clone
var sourceValues = Context.Entry(source).CurrentValues;
Context.Entry(clone).CurrentValues.SetValues(sourceValues);
//Change values of the copied entity
clone.ExampleProperty = "New Value";
//Insert clone with changes into database
Context.SaveChanges();
This method copies the current values from the source to a new row that has been added.
This is a generic extension method that allows generic cloning.
You have to fetch System.Linq.Dynamic from NuGet.
public static TEntity Clone<TEntity>(this DbContext context, TEntity entity) where TEntity : class
{
    var keyName = GetKeyName<TEntity>();
    var keyValue = context.Entry(entity).Property(keyName).CurrentValue;
    var keyType = typeof(TEntity)
        .GetProperty(keyName, System.Reflection.BindingFlags.Public | System.Reflection.BindingFlags.Instance)
        .PropertyType;

    var dbSet = context.Set<TEntity>();
    var newEntity = dbSet
        .Where(keyName + " = @0", keyValue)
        .AsNoTracking()
        .Single();

    context.Entry(newEntity).Property(keyName).CurrentValue = keyType.GetDefault();
    dbSet.Add(newEntity);
    return newEntity;
}
The only thing you have to implement yourself is the GetKeyName method. This could be anything from return typeof(TEntity).Name + "Id", to returning the first GUID property, to returning the first property marked with [DatabaseGenerated(DatabaseGeneratedOption.Identity)].
In my case I had already marked my classes with [DataServiceKeyAttribute("EntityId")]:
private static string GetKeyName<TEntity>() where TEntity : class
{
    return ((DataServiceKeyAttribute)typeof(TEntity)
        .GetCustomAttributes(typeof(DataServiceKeyAttribute), true).First())
        .KeyNames.Single();
}
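Note that GetDefault() on Type, used above, is not a BCL method either; the snippet assumes a small helper along these lines:

```csharp
public static class TypeExtensions
{
    // Returns the equivalent of default(T) for a runtime Type:
    // null for reference types, a zero-initialized instance for value types.
    public static object GetDefault(this Type type)
    {
        return type.IsValueType ? Activator.CreateInstance(type) : null;
    }
}
```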
I had the same issue in Entity Framework Core, where a deep clone involves multiple steps when child entities are lazy-loaded. One way to clone the whole structure is the following:
var clonedItem = Context.Parent.AsNoTracking()
    .Include(u => u.Child1)
    .Include(u => u.Child2)
    // deep includes might go here (see ThenInclude)
    .FirstOrDefault(u => u.ParentId == parentId);

// remove old id from parent
clonedItem.ParentId = 0;

// remove old ids from children
clonedItem.Child1.ForEach(x =>
{
    x.Child1Id = 0;
    x.ParentId = 0;
});
clonedItem.Child2.ForEach(x =>
{
    x.Child2Id = 0;
    x.ParentId = 0;
});

// customize entities before inserting them
// mark everything for insert
Context.Parent.Add(clonedItem);

// save everything in one single transaction
Context.SaveChanges();
Of course, there are ways to write generic functions to eager-load everything and/or reset the values of all keys, but this should make all the steps clearer and easier to customize (e.g. allowing some children not to be cloned at all by skipping their Include).
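As a sketch of that generic key-reset idea, EF Core's change-tracker metadata can be used to zero out every primary key in an attached graph; this assumes store-generated integer keys and is an illustration, not a drop-in utility:

```csharp
// Attach the whole graph, then reset each entity's primary key so
// EF Core treats every entity as a new insert on SaveChanges.
static void MarkGraphAsNew(DbContext context, object root)
{
    context.Add(root); // attaches root and all reachable children as Added
    foreach (var entry in context.ChangeTracker.Entries())
    {
        foreach (var keyProperty in entry.Metadata.FindPrimaryKey().Properties)
        {
            entry.Property(keyProperty.Name).CurrentValue = 0;
        }
    }
}
```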
I have a simple IBackgroundTask implementation that performs a query and then either performs an insert or one or more updates depending on whether a specific item exists or not. However, the updates are not persisted, and I don't understand why. New items are created just as expected.
The content item I'm updating has a CommonPart and I've tried authenticating as a valid user. I've also tried flushing the content manager at the end of the Sweep method. What am I missing?
This is my Sweep, slightly edited for brevity:
public void Sweep()
{
    // Authenticate as the site's super user
    var superUser = _membershipService.GetUser(_orchardServices.WorkContext.CurrentSite.SuperUser);
    _authenticationService.SetAuthenticatedUserForRequest(superUser);

    // Create a dummy "Person" content item
    var item = _contentManager.New("Person");
    var person = item.As<PersonPart>();
    if (person == null)
    {
        return;
    }
    person.ExternalId = Random.Next(1, 10).ToString();
    person.FirstName = GenerateFirstName();
    person.LastName = GenerateLastName();

    // Check if the person already exists
    var matchingPersons = _contentManager
        .Query<PersonPart, PersonRecord>(VersionOptions.AllVersions)
        .Where(record => record.ExternalId == person.ExternalId)
        .List().ToArray();

    if (!matchingPersons.Any())
    {
        // Insert the new person and quit
        _contentManager.Create(item, VersionOptions.Draft);
        return;
    }

    // There is at least one matching person; update them all
    foreach (var updatedPerson in matchingPersons)
    {
        updatedPerson.FirstName = person.FirstName;
        updatedPerson.LastName = person.LastName;
    }
    _contentManager.Flush();
}
Try adding _contentManager.Publish(updatedPerson). If you do not want to publish but just to save, you don't need to do anything more, as changes in Orchard are saved automatically unless the ambient transaction is aborted. The call to Flush is not necessary at all. This is the case both during a regular request and in a background task.
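Assuming the matching items should indeed be published, the update loop in the Sweep method above would become something like this sketch (PersonPart exposing its underlying ContentItem, as Orchard content parts do, is the only assumption):

```csharp
foreach (var updatedPerson in matchingPersons)
{
    updatedPerson.FirstName = person.FirstName;
    updatedPerson.LastName = person.LastName;
    // Publish persists and publishes the updated item;
    // no explicit Flush call is needed.
    _contentManager.Publish(updatedPerson.ContentItem);
}
```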