My question is directly in relation to this one.
The accepted answer says "You also have to change state of the relation".
I use the Model-First approach, and I don't have the foreign key in my entity. I only have the navigation property.
I can change the state of the entities via DbEntityEntry, but I can't figure out how to change the state of the relation itself. How can I access it?
Example of my code:
Building building = new Building() { Id = 1, Name = "modified" }; // Building 1 exists in DB
building.Adress = new Adress() { Id = 1, Road = "Sesame street" }; //Address 1 exists in DB
building.Adress.State = new State() { Id = 1 }; //State with Id 1 exists in DB
dbContext.Entry<Building>(building).State = EntityState.Modified;
dbContext.Entry<Adress>(building.Adress).State = EntityState.Modified;
// The State entity itself is not modified, but the relation between the Adress and the State may have changed.
dbContext.Entry<State>(building.Adress.State).State = EntityState.Unchanged;
dbContext.SaveChanges();
This code runs without error, but the relation between Adress and State is never updated.
The Name of the building and the Road of the address are updated correctly.
Take a look at the ObjectStateManager (and use Ctrl + . to pull in the namespace reference for IObjectContextAdapter).
var unchangedItems = ((IObjectContextAdapter)dbContext).ObjectContext.ObjectStateManager.GetObjectStateEntries(EntityState.Unchanged);
var addedItems = ((IObjectContextAdapter)dbContext).ObjectContext.ObjectStateManager.GetObjectStateEntries(EntityState.Added);
It also has ChangeRelationshipState specifically, which sounds appropriate for your scenario.
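For example, something along these lines (a sketch only; the navigation property name "State" on Adress is an assumption, so adjust it to match your model):
var objectContext = ((IObjectContextAdapter)dbContext).ObjectContext;
// Tell EF that Adress 1 now points to State 1, without modifying the State entity itself.
objectContext.ObjectStateManager.ChangeRelationshipState(
    building.Adress,        // source entity (must be attached to the context)
    building.Adress.State,  // target entity (must be attached to the context)
    "State",                // navigation property name on Adress
    EntityState.Added);     // mark the relationship itself as added
dbContext.SaveChanges();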
In Service Fabric I am trying to call an ActorService and get a list of all actors. I'm not getting any errors, but no actors are returned. It's always zero.
This is how I add actors:
ActorProxy.Create<IUserActor>(
new ActorId(uniqueName),
"fabric:/ECommerce/UserActorService");
And this is how I try to get a list of all actors:
var proxy = ActorServiceProxy.Create(new Uri("fabric:/ECommerce/UserActorService"), 0);
ContinuationToken continuationToken = null;
CancellationToken cancellationToken = new CancellationTokenSource().Token;
List<ActorInformation> activeActors = new List<ActorInformation>();
do
{
PagedResult<ActorInformation> page = await proxy.GetActorsAsync(continuationToken, cancellationToken);
activeActors.AddRange(page.Items.Where(x => x.IsActive));
continuationToken = page.ContinuationToken;
}
while (continuationToken != null);
But no matter how many users I've added, the page object will always have zero items. What am I missing?
The second argument int in ActorServiceProxy.Create(Uri, int, string) is the partition key (you can find out more about actor partitioning here).
The issue here is that your code checks only one partition (partitionKey = 0).
So the solution is quite simple: you have to iterate over all partitions of your service. Here is an answer with a code sample that gets the partitions and iterates over them.
UPDATE 2019.07.01
I didn't spot this the first time, but the reason you aren't getting any actors returned is that you aren't creating any actors - you are only creating proxies!
The reason for the confusion is that Service Fabric actors are virtual, i.e. from the user's point of view an actor always exists, but in reality Service Fabric manages the actor object's lifetime automatically, persisting and restoring its state as needed.
Here is a quote from the documentation:
An actor is automatically activated (causing an actor object to be constructed) the first time a message is sent to its actor ID. After some period of time, the actor object is garbage collected. In the future, using the actor ID again, causes a new actor object to be constructed. An actor's state outlives the object's lifetime when stored in the state manager.
In your example you never send any messages to the actors!
Here is a code example I wrote in the Program.cs of a newly created Actor project:
// Please don't forget to replace "fabric:/Application16/Actor1ActorService" with your actor service name.
ActorRuntime.RegisterActorAsync<Actor1> (
(context, actorType) =>
new ActorService(context, actorType)).GetAwaiter().GetResult();
var actor = ActorProxy.Create<IActor1>(
ActorId.CreateRandom(),
new Uri("fabric:/Application16/Actor1ActorService"));
_ = actor.GetCountAsync(default).GetAwaiter().GetResult();
ContinuationToken continuationToken = null;
var activeActors = new List<ActorInformation>();
var serviceName = new Uri("fabric:/Application16/Actor1ActorService");
using (var client = new FabricClient())
{
var partitions = client.QueryManager.GetPartitionListAsync(serviceName).GetAwaiter().GetResult();
foreach (var partition in partitions)
{
var pi = (Int64RangePartitionInformation) partition.PartitionInformation;
var proxy = ActorServiceProxy.Create(new Uri("fabric:/Application16/Actor1ActorService"), pi.LowKey);
var page = proxy.GetActorsAsync(continuationToken, default).GetAwaiter().GetResult();
activeActors.AddRange(page.Items);
continuationToken = page.ContinuationToken;
}
}
Thread.Sleep(Timeout.Infinite);
Pay special attention to the line:
_ = actor.GetCountAsync(default).GetAwaiter().GetResult();
This is where the first message to the actor is sent.
Hope this helps.
I have tried reading the docs, but I don't get why the Update method produces a "Duplicate entry" MySQL error.
The docs say:
In its most simple form, updating any model without any filters will update every field, except the Id which is used to filter the update to this specific record:
So I try it, and pass in an object, like below. A row with id 2 already exists.
using (var _db = _dbFactory.Open())
{
Customer coreObject = new Customer(...);
coreObject.Id = 2;
coreObject.ObjectName = "a changed value";
_db.Update<Customer>(coreObject); // <-- error "duplicate entry"
}
Yes, there are options using .Save and such, but what am I missing with the .Update? As I read it, it should use its Id property to update the row in the db, not insert a new row?
The issue with this method is that you're working with a generic object T, but your Update call explicitly says to update the concrete Customer type:
public void MyTestMethod<T>(T coreObject) where T : CoreObject
{
long id = 0;
using (var _db = _dbFactory.Open())
{
id = _db.Insert<T>(coreObject, selectIdentity: true);
if (DateTime.Now.Ticks == 0)
{
coreObject.Id = (uint)id;
_db.Delete(coreObject);
}
if (DateTime.Now.Ticks == 0)
{
_db.DeleteById<Customer>(id);
}
if (DateTime.Now.Ticks == 0)
{
coreObject.Id = (uint)id;
coreObject.ObjectName = "a changed value";
_db.Update<Customer>(coreObject);
}
}
}
Because of this, OrmLite assumes you're using a different/anonymous object to update the Customer table, similar to:
db.Update<Customer>(new { Id = id, ObjectName = "a changed value", ... });
Since this form has no WHERE filter, it will attempt to update every row in the table with the same values, including the primary key, which is what produces the duplicate entry error.
What you want instead is to update the same entity, either passing in the generic type T or having it inferred by not specifying any type, e.g.:
_db.Update<T>(coreObject);
_db.Update(coreObject);
This uses OrmLite's entity-update behavior: every field is updated except the primary keys, which are instead used in the WHERE expression to limit the update to that single entity.
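To illustrate the difference (the SQL shown is only indicative of the shape of the statements, not the exact text OrmLite generates):
// Entity update: the primary key goes into the WHERE clause
_db.Update(coreObject);
// UPDATE Customer SET ObjectName = 'a changed value', ... WHERE Id = 2

// Anonymous-object update without a filter: every member goes into the SET list,
// and with no WHERE clause every row in the table is affected
_db.Update<Customer>(new { Id = 2, ObjectName = "a changed value" });
// UPDATE Customer SET Id = 2, ObjectName = 'a changed value'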
New Behavior in v5.1.1
To prevent accidental misuse like this I've added an Update API overload in this commit which will use the Primary Key as a filter when using an anonymous object to update an entity, so your previous usage:
_db.Update<Customer>(coreObject);
Will add the Primary Key to the WHERE filter instead of including it in the SET list. This change is available from v5.1.1 that's now available on MyGet.
My module creates a custom content item through the controller:
private ContentItem createContentItem()
{
// Add the field
_contentDefinitionManager.AlterPartDefinition(
"TestType",
cfg => cfg
.WithField(
"NewField",
f => f
.OfType(typeof(BooleanField).Name)
.WithDisplayName("New Field"))
);
// Not sure if this is needed
_contentDefinitionManager.AlterTypeDefinition(
"TestType",
cfg => cfg
.WithPart("TestType")
);
// Create new TestType item
var newItem = _contentManager.New("TestType");
_contentManager.Create(newItem, VersionOptions.Published);
// Set the added boolean field to true
BooleanField newField = ((dynamic)newItem).TestType.NewField as BooleanField;
newField.Value = true;
// Set title (as date created, for convenience)
var time = DateTime.Now.ToString("MM-dd-yyyy h:mm:ss tt", CultureInfo.InvariantCulture).Replace(':', '.');
newItem.As<TitlePart>().Title = time;
return newItem;
}
The end result of this is a new TestType item with a field that's set to true. Viewing the content item in the dashboard as well as examining ContentItemVersionRecord in the database confirms that the value was set correctly.
However, queries don't seem to work properly on fields that are set in this manner. I found the record IntegerFieldIndexRecord, which is what I assume projections use to fill query result pages. On this, the value of TestField remains at 0 (false), instead of 1 (true).
Going to the content item edit page and simply clicking 'save' updates IntegerFieldIndexRecord correctly, meaning that the value is now picked up by the query. How can the record be updated for field values set programmatically?
Relevant section of migration:
SchemaBuilder.CreateTable(typeof(TestTypePartRecord).Name, table => table
.ContentPartRecord()
);
ContentDefinitionManager.AlterTypeDefinition(
"TestType",
cfg => cfg
.DisplayedAs("Test Type")
.WithPart(typeof(TitlePart).Name)
.WithPart(typeof(ContainablePart).Name)
.WithPart(typeof(CommonPart).Name)
.WithPart(typeof(IdentityPart).Name)
);
Edit: The fix for this is to manually change the projection index record whenever changing a field value, using this call:
_fieldIndexService.Set(testResultItem.As<FieldIndexPart>(),
"TestType", // Resolves as TestTypePart, which holds the field
"newField",
"", // Not sure why value name should be empty, but whatever
true, // The value to be set goes here
typeof(bool));
In some cases a simple contentManager.Publish() won't do.
I've had a similar problem some time ago and actually implemented a simple helper service to tackle this problem; here's an excerpt:
public T GetStringFieldValues<T>(ContentPart contentPart, string fieldName)
{
var fieldIndexPart = contentPart.ContentItem.As<FieldIndexPart>();
var partName = contentPart.PartDefinition.Name;
return this.fieldIndexService.Get<T>(fieldIndexPart, partName, fieldName, string.Empty);
}
private void SetStringFieldValue(ContentPart contentPart, string fieldName, IEnumerable<int> ids)
{
var fieldIndexPart = contentPart.ContentItem.As<FieldIndexPart>();
var partName = contentPart.PartDefinition.Name;
var encodedValues = "{" + string.Join("},{", ids) + "}";
this.fieldIndexService.Set(fieldIndexPart, partName, fieldName, string.Empty, encodedValues, typeof(string));
}
I've actually built this for use with MediaLibrary- and ContentPicker fields (they encode their value as string internally), so it might not be suitable for the boolean field in your example.
But it can't be that hard to implement, just look at the existing drivers and handlers for those fields.
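For a boolean field, a minimal variant of the setter might look like this (a sketch only; it assumes BooleanField indexes its value under an empty value name, as the working call in your edit does):
private void SetBooleanFieldValue(ContentPart contentPart, string fieldName, bool value)
{
    var fieldIndexPart = contentPart.ContentItem.As<FieldIndexPart>();
    var partName = contentPart.PartDefinition.Name;
    // Same call as above, but storing a bool instead of an encoded string
    this.fieldIndexService.Set(fieldIndexPart, partName, fieldName, string.Empty, value, typeof(bool));
}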
There are 2 ways to fix this:
1) Ensure the newly created item is getting published by calling ContentManager.Publish() as Orchard.Projections.Handlers.FieldIndexPartHandler listens to the publish event to update the FieldIndexPartRecord
2) Use IFieldIndexService to update the FieldIndexPartRecord manually; see the implementation of Orchard.Projections.Handlers.FieldIndexPartHandler to get an idea of how to do this.
Hope this helps.
Edit:
Due to calling Create(...Published), ContentManager.Publish() won't do anything, as the item is already considered published.
You can do the following to force the publish logic to run:
bool itemPublished = newItem.VersionRecord.Published;
// unpublish item first when it is already published as ContentManager.Publish() internally first checks for published flag and when set it aborts silently
// -> this behaviour prevents calling publish listeners
if (itemPublished)
_contentManager.Unpublish(newItem);
// the following call will result in calls to IContentHandler.Publishing() / IContentHandler.Published()
_contentManager.Publish(newItem);
Or just create the item as a draft and publish it when everything is set up correctly.
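A minimal sketch of that draft-then-publish variant, reusing the names from your createContentItem method:
var newItem = _contentManager.New("TestType");
_contentManager.Create(newItem, VersionOptions.Draft);

// ... set the boolean field, the title, etc. ...

// Publishing now runs the IContentHandler.Publishing/Published listeners,
// including Orchard.Projections' FieldIndexPartHandler, so IntegerFieldIndexRecord gets updated.
_contentManager.Publish(newItem);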
I am using Entity Framework 5 (DBContext) and I am trying to find the best way to deep copy an entity (i.e. copy the entity and all related objects) and then save the new entities in the database. How can I do this? I have looked into using extension methods such as CloneHelper but I am not sure if it applies to DBContext.
One cheap easy way of cloning an entity is to do something like this:
var originalEntity = Context.MySet.AsNoTracking()
.FirstOrDefault(e => e.Id == 1);
Context.MySet.Add(originalEntity);
Context.SaveChanges();
The trick here is AsNoTracking() - when you load an entity this way, your context does not know about it, so when you call SaveChanges it will treat it as a new entity.
If MySet has a reference to MyProperty and you want a copy of it too, just use an Include:
var originalEntity = Context.MySet.Include("MyProperty")
.AsNoTracking()
.FirstOrDefault(e => e.Id == 1);
Here's another option.
I prefer it in some cases because it does not require you to run a query specifically to get data to be cloned. You can use this method to create clones of entities you've already obtained from the database.
//Get entity to be cloned
var source = Context.ExampleRows.FirstOrDefault();
//Create and add clone object to context before setting its values
var clone = new ExampleRow();
Context.ExampleRows.Add(clone);
//Copy values from source to clone
var sourceValues = Context.Entry(source).CurrentValues;
Context.Entry(clone).CurrentValues.SetValues(sourceValues);
//Change values of the copied entity
clone.ExampleProperty = "New Value";
//Insert clone with changes into database
Context.SaveChanges();
This method copies the current values from the source to a new row that has been added.
This is a generic extension method which allows cloning of any entity.
You have to fetch System.Linq.Dynamic from NuGet.
public static TEntity Clone<TEntity>(this DbContext context, TEntity entity) where TEntity : class
{
var keyName = GetKeyName<TEntity>();
var keyValue = context.Entry(entity).Property(keyName).CurrentValue;
var keyType = typeof(TEntity).GetProperty(keyName, System.Reflection.BindingFlags.Public | System.Reflection.BindingFlags.Instance).PropertyType;
var dbSet = context.Set<TEntity>();
var newEntity = dbSet
.Where(keyName + " = @0", keyValue)
.AsNoTracking()
.Single();
context.Entry(newEntity).Property(keyName).CurrentValue = keyType.GetDefault();
dbSet.Add(newEntity);
return newEntity;
}
The only thing you have to implement yourself is the GetKeyName method. This could be anything from returning typeof(TEntity).Name + "Id", to returning the first Guid property, to returning the first property marked with [DatabaseGenerated(DatabaseGeneratedOption.Identity)].
In my case I had already marked my classes with [DataServiceKeyAttribute("EntityId")]:
private string GetKeyName<TEntity>() where TEntity : class
{
return ((DataServiceKeyAttribute)typeof(TEntity)
.GetCustomAttributes(typeof(DataServiceKeyAttribute), true).First())
.KeyNames.Single();
}
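If GetDefault isn't already defined somewhere in your project, a common implementation looks like this (an assumption on my part; it simply returns default(T) for a runtime Type):
public static object GetDefault(this Type type)
{
    // Value types get their parameterless default (0, false, Guid.Empty, ...); reference types default to null
    return type.IsValueType ? Activator.CreateInstance(type) : null;
}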
I had the same issue in Entity Framework Core, where a deep clone involves multiple steps when child entities are lazy loaded. One way to clone the whole structure is the following:
var clonedItem = Context.Parent.AsNoTracking()
.Include(u => u.Child1)
.Include(u => u.Child2)
// deep includes might go here (see ThenInclude)
.FirstOrDefault(u => u.ParentId == parentId);
// remove old id from parent
clonedItem.ParentId = 0;
// remove old ids from children
clonedItem.Child1.ForEach(x =>
{
x.Child1Id = 0;
x.ParentId= 0;
});
clonedItem.Child2.ForEach(x =>
{
x.Child2Id = 0;
x.ParentId= 0;
});
// customize entities before inserting it
// mark everything for insert
Context.Parent.Add(clonedItem);
// save everything in one single transaction
Context.SaveChanges();
Of course, there are ways to write generic functions that eager load everything and/or reset the values for all keys, but this should make all the steps much clearer and customizable (e.g. allowing some children not to be cloned at all by skipping their Include).
I have a simple IBackgroundTask implementation that performs a query and then either performs an insert or one or more updates depending on whether a specific item exists or not. However, the updates are not persisted, and I don't understand why. New items are created just as expected.
The content item I'm updating has a CommonPart and I've tried authenticating as a valid user. I've also tried flushing the content manager at the end of the Sweep method. What am I missing?
This is my Sweep, slightly edited for brevity:
public void Sweep()
{
// Authenticate as the site's super user
var superUser = _membershipService.GetUser(_orchardServices.WorkContext.CurrentSite.SuperUser);
_authenticationService.SetAuthenticatedUserForRequest(superUser);
// Create a dummy "Person" content item
var item = _contentManager.New("Person");
var person = item.As<PersonPart>();
if (person == null)
{
return;
}
person.ExternalId = Random.Next(1, 10).ToString();
person.FirstName = GenerateFirstName();
person.LastName = GenerateLastName();
// Check if the person already exists
var matchingPersons = _contentManager
.Query<PersonPart, PersonRecord>(VersionOptions.AllVersions)
.Where(record => record.ExternalId == person.ExternalId)
.List().ToArray();
if (!matchingPersons.Any())
{
// Insert new person and quit
_contentManager.Create(item, VersionOptions.Draft);
return;
}
// There is at least one matching person; update them
foreach (var updatedPerson in matchingPersons)
{
updatedPerson.FirstName = person.FirstName;
updatedPerson.LastName = person.LastName;
}
_contentManager.Flush();
}
Try adding _contentManager.Publish(updatedPerson). If you do not want to publish but just to save, you don't need to do anything more, as changes in Orchard are saved automatically unless the ambient transaction is aborted. The call to Flush is not necessary at all. This is the case both during a regular request and in a background task.
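In your Sweep that would look roughly like this (a sketch; PersonPart exposes its ContentItem, which is what Publish expects):
foreach (var updatedPerson in matchingPersons)
{
    updatedPerson.FirstName = person.FirstName;
    updatedPerson.LastName = person.LastName;
    // Publishing the underlying content item runs the publish handlers and persists the change
    _contentManager.Publish(updatedPerson.ContentItem);
}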