Kentico 11 - duplicate object is created after rollback

After I added versioning to a custom class (see https://devnet.kentico.com/articles/module-development-versioning-recycle-bin), a duplicate object is created (instead of the existing one being updated) when the rollback feature is used.
Since I was asked for the exact steps that I took, here they are:
I have a custom class called StoreInfo. It contains a set of fields such as Name, Address, Country, etc.
To enable versioning, I made the following changes to the generated code:
public class StoreInfo : AbstractInfo<StoreInfo>
{
...
public static ObjectTypeInfo TYPEINFO = new ObjectTypeInfo(...)
{
...
SupportsVersioning = true
};
...
protected override bool VersioningEnabled
{
get
{
return SettingsKeyInfoProvider.GetBoolValue("CMSEnableObjectsVersioning");
}
}
}
After this change was applied, the Versions tab appeared in the UI.
Then I changed the name of the store from Test name to Test name 1, so Version 1.1 was added to the versions list.
The issue happened when I clicked the rollback button for Version 1.0.
Instead of changing the name of the existing store back from Test name 1 to Test name, it created a new store with the data of Version 1.0.
Any thoughts about why this happens would be helpful.
The Kentico version is 11 (development approach: portal engine).

If you deleted Object 1, then created a "new" Object 1, and tried to roll back the deletion of the original Object 1, rollback will create a new object because the two objects don't share the same attributes, especially the GUID.
So you may need to specify exactly what steps you took when you deleted the object, what you did after you deleted it, and what you did when you rolled back the original object.

Related

Is it possible to retrieve products without passing catalogType?

I have a requirement to get products by code without knowing the catalogType. Is it possible to retrieve products without passing the catalogType?
Below is the code snippet I've tried:
@Resource
private ProductDao productDao;
@Resource
private CatalogVersionService catalogVersionService;
List<ProductModel> getProductsByCode(String code) {
    CatalogVersionModel catalogVersionModel = new CatalogVersionModel();
    catalogVersionModel.setVersion("Online");
    catalogVersionService.addSessionCatalogVersion(catalogVersionModel);
    List<ProductModel> productModels = productDao.findProductsByCode(code);
    return productModels;
}
Below is the exception I am getting:
{
"errors": [
{
"message": "model CatalogVersionModel (<unsaved>) cannot be serialized due to being modified, new or removed",
"type": "FlexibleSearchError"
}
]
}
How can I fix the above issue?
When you create a product/variant in SAP Commerce (hybris) you must attach it to a catalog.
A catalog (CatalogModel) also has a version (usually Staged or Online), and that object is called a CatalogVersionModel.
When you want to retrieve a product/variant, you must indicate the CatalogVersionModel, because the product code on its own is not a unique key in the DB (you can check the type "Product" in the Backoffice and see in the XML pane that both code and catalogVersion have the value unique="true").
Now, in your code there are several issues:
You should not create a catalog version; you should retrieve it using a service (see DefaultCatalogVersionService).
You should use a service to retrieve your product (see DefaultProductService).
In the ProductService implementation you'll find two getProductForCode methods:
One with only the SKU code as a parameter
One with the SKU code and the catalogVersion as parameters
The first method looks like the one you want, but it actually uses the catalogVersion stored in your session, and your session will be different depending on whether you run the code from a Groovy script or in Java from your e-commerce website.
You can find the documentation comment of this method below:
Returns the Product with the specified code. As default the search uses the current session user, the current session language and the current active catalog versions (which are stored at the session in the attribute SESSION_CATALOG_VERSIONS). For modifying the search session context see FlexibleSearchQuery.
You need to specify the catalog, because it is possible to have multiple catalogs, and the same product could exist in all of those catalogs.

Is there a way to configure Azure Table updates to preserve future/unknown properties/columns?

Suppose I create a model
public class Foo :TableEntity {
public int OriginalProperty {get;set;}
}
I then deploy a service that periodically updates the values of OriginalProperty with code similar to...
//use model-based query
var query = new TableQuery<Foo>().Where(…);
//get the (one) result
var row = (await table.ExecuteQueryAsync(query)).Single();
//modify and write it back
row.OriginalProperty = some_new_value;
await table.ExecuteAsync(TableOperation.InsertOrReplace(row));
At some later time I decide I want to add a new property to Foo for use by a different service.
public class Foo :TableEntity {
public int OriginalProperty {get;set;}
public int NewProperty {get;set;}
}
I make this change locally and start updating a few records from my local machine without updating the original deployed service.
The behaviour I am seeing is that changes I make to NewProperty from my local machine are lost as soon as the deployed service updates the record. Of course this makes sense in some ways. The service is unaware that NewProperty has been added and has no reason to preserve it. However my understanding was that the TableEntity implementation was dictionary-based so I was hoping that it would 'ignore' (i.e. preserve) newly introduced columns rather than delete them.
Is there a way to configure the query/insertion to get the behaviour I want? I'm aware of DynamicTableEntity but it's unclear whether using this as a base class would result in a change of behaviour for model properties.
Just to be clear, I'm not suggesting that continually fiddling with the model or having multiple client models for the same table is a good habit to get into, but it's definitely useful to be able to occasionally add a column without worrying about redeploying every service that might touch the affected table.
You can use InsertOrMerge instead of InsertOrReplace. A merge operation only writes the properties that your entity defines, so existing columns your model doesn't know about are preserved rather than deleted.
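For example, keeping the rest of the update loop from the question unchanged, only the final write needs to change (a sketch, not verified against your exact SDK version):
row.OriginalProperty = some_new_value;
// Merge sends only the properties defined on the local Foo model, so columns the
// model doesn't declare (such as NewProperty) are left as they are in the table.
await table.ExecuteAsync(TableOperation.InsertOrMerge(row));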

Why is my app searching for a table that doesn't / shouldn't exist?

I have created an MVC app in which I renamed a model class from "Diplomata" to "Diplomas", and now I can't get the migrations to create a table with the name "Diplomas", because they still use the old name for some reason (using .NET Framework 4.6 and Entity Framework 6.1.2).
Things I have tried so far:
dropping the db tables completely (from Visual Studio's SQL Server Object Explorer and deleting the files manually)
deleting the migration folder and re-enabling migrations
deleting the model and re-creating it (after deleting migrations and dropping the tables completely)
After enabling migrations again and running the command "add-migration Initial", I get a script that generates a table with the name "dbo.Diplomata".
This is the model:
namespace DDS.Data.Models
{
using System.Collections.Generic;
using DDS.Data.Common.Models;
public class Diploma : BaseModel<int>
{
public string Title { get; set; }
public string Description { get; set; }
public virtual ICollection<Tag> Tags { get; set; }
}
}
This is the ApplicationDbContext:
public class ApplicationDbContext : IdentityDbContext<ApplicationUser>
{
public ApplicationDbContext()
: base("DefaultConnection", throwIfV1Schema: false)
{
}
...
public IDbSet<Diploma> Diplomas { get; set; }
...
}
And this is the part of the migration script that is automatically generated:
public partial class Initial : DbMigration
{
public override void Up()
{
CreateTable(
"dbo.Diplomata",
c => new
{
Id = c.Int(nullable: false, identity: true),
Title = c.String(),
Description = c.String(),
...
}
Also running a search in VS2015 for "Diplomata" in the entire solution doesn't find anything.
Adding a migration that renames the table makes the app crash after the update, because it searches for a table with the old name (Invalid object name 'dbo.Diplomata').
I have been debugging this all day with no result so any ideas or suggestions for where and what to look for are appreciated.
PS: This is my first question here, so if I missed something or something is hard to understand, please tell me. Thank you!
Did you try cleaning the migrations table? When migrations are enabled, a system table called "__MigrationHistory" is generated. You can locate that table from within SQL Server Management Studio: go to YourDatabase -> Tables -> System Tables and it will be there.
The way migrations work is as follows:
An initial migration is generated with a specified name.
The initial migration is executed.
When the initial migration and subsequent migrations are run, the changes are applied to the database and a snapshot of the database structure is saved in the __MigrationHistory table as a new row.
When the application initializes, EF compares the snapshot in the latest row of the __MigrationHistory table with the latest migration available; if they don't match, an exception is thrown. This is how EF determines whether or not the database has changed.
When you make changes to the original model, you are supposed to create subsequent migration files so you can revert or apply database changes. If you already deleted the initial migration file, your best option is probably to clean the __MigrationHistory table. EF generates unique entries in that table (I believe the default behavior is the name of the migration file plus a timestamp of when the migration was generated). You can always rename the initial migration file to match the name in the __MigrationHistory table and create a new migration to apply the rename of the table in question. This will only work if the names of the files and model match the snapshots in the DB; otherwise an exception will be thrown.
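If you do go the route of clearing the history table, a minimal one-off sketch could look like the following; it assumes the default __MigrationHistory name and dbo schema and reuses the ApplicationDbContext from the question, and you should back up the database first:
using (var context = new ApplicationDbContext())
{
    // Removes EF's record of applied migrations so the next add-migration/update-database
    // starts from a clean slate. The data tables themselves are not touched.
    context.Database.ExecuteSqlCommand("DELETE FROM [dbo].[__MigrationHistory]");
}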
Take a look at this article for more information about migrations:
http://tech.trailmax.info/2014/03/inside_of_ef_migrations/
As a side note, you can also modify the default behavior of how the migration history table is generated. This may be helpful if you have specific needs, such as renaming the table, not generating it as a system table, or adding additional columns (particularly useful for cloud-based databases). Check the following link:
How do I add an additional column to the __MigrationHistory table?
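For illustration, here is a minimal sketch of that kind of customization in EF6; the table name, schema, and class names are just examples:
using System.Data.Common;
using System.Data.Entity;
using System.Data.Entity.Migrations.History;

public class CustomHistoryContext : HistoryContext
{
    public CustomHistoryContext(DbConnection dbConnection, string defaultSchema)
        : base(dbConnection, defaultSchema) { }

    protected override void OnModelCreating(DbModelBuilder modelBuilder)
    {
        base.OnModelCreating(modelBuilder);
        // Example: move the history table to a custom name and schema.
        modelBuilder.Entity<HistoryRow>().ToTable("MigrationHistory", "admin");
    }
}

public class CustomDbConfiguration : DbConfiguration
{
    public CustomDbConfiguration()
    {
        // Register the custom history context for SQL Server.
        SetHistoryContext("System.Data.SqlClient",
            (connection, defaultSchema) => new CustomHistoryContext(connection, defaultSchema));
    }
}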
NOTE: It is important to mention that tables that are not contained in the initial model or subsequent models will be excluded from model validation. This is especially useful if you want to create tables that shouldn't be monitored by EF, e.g. the membership provider tables.
I hope this helps clarify the problems you are having.
After a few more trials and errors I gave up on fixing the problem. Instead, I forced the app to create the table with the name I wanted by adding the Table attribute (from System.ComponentModel.DataAnnotations.Schema) to the model. So now the model looks like this:
[Table("Diplomas")]
public class Diploma : BaseModel<int>
{
public string Title { get; set; }
public string Description { get; set; }
public virtual ICollection<Tag> Tags { get; set; }
}

Service Fabric - Stateful Service Persistence

I am new to Service Fabric and started by looking at the MSDN articles covering the topic. I began by implementing the Hello World sample here.
I changed their original RunAsync implementation to:
var myDictionary = await this.StateManager.GetOrAddAsync<IReliableDictionary<int, DataObject>>("myDictionary");
while (!cancellationToken.IsCancellationRequested)
{
DataObject dataObject;
using (var tx = this.StateManager.CreateTransaction())
{
var result = await myDictionary.TryGetValueAsync(tx, 1);
if (result.HasValue)
dataObject = result.Value;
else
dataObject = new DataObject();
//
dataObject.UpdateDate = DateTime.Now;
//
//ServiceEventSource.Current.ServiceMessage(
// this,
// "Current Counter Value: {0}",
// result.HasValue ? result.Value.ToString() : "Value does not exist.");
await myDictionary.AddOrUpdateAsync(tx, 1, dataObject, ((k, o) => dataObject));
await tx.CommitAsync();
}
await Task.Delay(TimeSpan.FromSeconds(1), cancellationToken);
}
I also introduced a DataObject type and have exposed an UpdateDate property on that type.
[DataContract(Namespace = "http://www.contoso.com")]
public class DataObject
{
[DataMember]
public DateTime UpdateDate { get; set; }
}
When I run the app (F5 in Visual Studio 2015), a dataObject instance (keyed as 1) is not found in the dictionary, so I create one, set UpdateDate, add it to the dictionary and commit the transaction. During the next loop it finds the dataObject (keyed as 1), sets UpdateDate, updates the object in the dictionary and commits the transaction. Perfect.
Here's my question. When I stop and restart the service project (F5 in Visual Studio 2015), I would expect that on the first iteration of RunAsync the dataObject (keyed as 1) would be found, but it's not. I would expect all state to have been flushed to its replica.
Do I have to do anything for the stateful service to flush its internal state to its primary replica?
From what I've read, it makes it sound as though all of this is handled by service fabric and that calling commit (on the transaction) is sufficient. If I locate the primary replica (in Service Fabric Explorer->Application View) I can see that the RemoteReplicator_xxx LastACKProcessedTimeUTC is updated once I commit the transaction (when stepping through).
Any help is greatly appreciated.
Thank you!
-Mark
This is a function of the default local development experience in Visual Studio. If you watch the Output window closely after hitting F5, you'll see that the deployment script detects an existing app of the same type and version already registered, so it removes it and deploys the new one. In doing that, the data associated with the old application is removed.
You have a couple of options to deal with this.
In production, you would perform an application upgrade to safely roll out the updated code while maintaining the state. But constantly updating your versions while doing quick iteration on your dev box can be tedious.
An alternative is to flip the project property "Preserve Data on Start" to "Yes". This will automatically bump all versions of the generated application package (without touching the versions in your source) and then perform an app upgrade on your behalf.
Note that because of some of the system checks inherent in the upgrade path, this deployment option is likely to be a bit slower than the default remove-and-replace. However, when you factor in the time it takes to recreate the test data, it's often a wash.
You need to think of a ReliableDictionary as holding collections of objects, as opposed to collections of references. That is, when you add an "object" to the dictionary, you must think of it as handing the object off completely; you must not alter this object's state anymore. When you ask a ReliableDictionary for an "object", it gives you back a reference to its internal object. The reference is returned for performance reasons, and you are free to READ the object's state. (It would be great if the CLR supported read-only objects, but it doesn't.) However, you MUST NOT MODIFY the object's state (or call any methods that would modify it), as you would be modifying the dictionary's internal data structures and corrupting its state.
To modify the object's state, you MUST make a copy of the object pointed to by the returned reference. You can do this by serializing/deserializing the object or by some other means (such as creating a whole new object and copying the old state to the new object). Then you write the NEW OBJECT into the dictionary. In a future version of Service Fabric, we intend to improve ReliableDictionary's APIs to make this required pattern of use more discoverable.
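Applied to the RunAsync loop above, a minimal sketch of that copy-before-write pattern (assuming the same DataObject type and myDictionary from the question) might look like this:
using (var tx = this.StateManager.CreateTransaction())
{
    var result = await myDictionary.TryGetValueAsync(tx, 1);

    // Never mutate result.Value directly; build a fresh object, copying over
    // whatever existing state you want to keep.
    var updated = result.HasValue
        ? new DataObject { UpdateDate = result.Value.UpdateDate }
        : new DataObject();

    updated.UpdateDate = DateTime.Now;

    // Write the new object back under the same key.
    await myDictionary.SetAsync(tx, 1, updated);
    await tx.CommitAsync();
}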

Is it possible to use ASP.NET Dynamic Data and SubSonic 3?

Is it possible to use ASP.NET Dynamic Data with SubSonic 3 in place of LINQ to SQL classes or the Entity Framework? MetaModel.RegisterContext() throws an exception if you use the context class that SubSonic generates. I thought I remembered coming across a SubSonic/Dynamic Data example back before SubSonic 3 was released, but I can't find it now. Has anyone been able to get this to work?
I just got Subsonic 3.0.0.4 ActiveRecord working last night in Visual Studio 2010 with my SQLite database after a little bit of work and I've tried to document the steps taken here for your benefit.
Start by adding a New Item -> WCF Data Service to the project you're using to host your webapp/webservices, then modify it similarly to my PinsDataService.svc.cs below:
public class PinsDataService : DataService<PINS.Lib.dbPINSDB>
{
// This method is called only once to initialize service-wide policies.
public static void InitializeService(DataServiceConfiguration config)
{
config.SetEntitySetAccessRule("*", EntitySetRights.All);
config.UseVerboseErrors = true;
config.DataServiceBehavior.MaxProtocolVersion = DataServiceProtocolVersion.V2;
}
}
At this point your Dynamic Data service would probably be working if you matched all the database naming conventions perfectly, but I didn't have that kind of luck. In my ActiveRecord.tt template I had to prepend the following two lines before the public partial class declarations:
[DataServiceKey("<#=tbl.PrimaryKey #>")]
[IgnoreProperties("Columns")]
public partial class <#=tbl.ClassName#>: IActiveRecord {
I then added references to System.Data and System.Data.Services.Client, followed by using statements for System.Data.Services and System.Data.Services.Common at the top of the ActiveRecord.tt template.
The next step was to use the IUpdatable partial class implementation from this blog post http://blogs.msdn.com/aconrad/archive/2008/12/05/developing-an-astoria-data-provider-for-subsonic.aspx and change the public partial class dbPINSDB : IUpdatable to match my SubSonic DatabaseName declared in Settings.ttinclude.
Then to consume the data in a separate client app/library I started by adding a 'Service Reference' named PinsDataService to the PinsDataService.svc from my client app and went to town:
PinsDataService.dbPINSDB PinsDb =
new PinsDataService.dbPINSDB(new Uri("http://localhost:1918/PinsDataService.svc/"));
PinsDataService.Alarm activeAlarm =
PinsDb.Alarms.Where(i => i.ID == myAA.Alarm_ID).Take(1).ElementAt(0);
Note how I'm doing a Where query that returns only one object, but I threw in Take(1) and then ElementAt(0) because I kept getting errors when I tried to use SingleOrDefault() or First().
Hope this helps--also, I'm already aware that dbPINSDB is a really bad name for my Subsonic Database ;)
