Deleting audit log data pertaining to a specific entity - dynamics-crm-2011

Is it possible to delete audit log data pertaining to a specific entity only? We have a huge audit log that we want to reduce by purging the log data of specific entities while keeping the logs of other entities.

There is no supported method for deleting audit log entries by entity type. The only supported method for audit deletion is by date (i.e., all records older than a given date). Note that depending on the SQL environment, the available end dates may be limited to the end date of an audit log partition.
That said, there is an unsupported method for meeting this requirement. CRITICAL: take your CRM server offline, back up the database, and test a restore before attempting this - there is no support available for what I'm about to suggest, since it goes against the supported actions on the Dynamics CRM 2011 SQL database.
The audit logs are stored in the table dbo.AuditBase. This table does not have an ExtensionBase counterpart, so there is only one record per audit entry to worry about.
You will need the ObjectTypeCode of the entity. You can get this from the database by running the following script:
-- Replace <YourOrg_MSCRM> with the name of your organization database.
SELECT [EntityId], [Name], [ObjectTypeCode]
FROM [<YourOrg_MSCRM>].[MetadataSchema].[Entity]
ORDER BY [Name]
Now that you have the ObjectTypeCode, simply replace the xxxx in the script below with that value and run it.
DELETE FROM [<YourOrg_MSCRM>].[dbo].[AuditBase] WHERE ObjectTypeCode = xxxx
The audit records for that entity type are now gone!

I know it isn't quite what you are looking for, but there is a DeleteAuditDataRequest API message that you can call to delete all audit data before a specific date.
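A minimal sketch of that call (assuming an existing IOrganizationService named orgService, as in the snippet below, and an arbitrary example cutoff date):
// Requires Microsoft.Crm.Sdk.Messages from the CRM SDK.
var deleteRequest = new Microsoft.Crm.Sdk.Messages.DeleteAuditDataRequest
{
    // Deletes all audit records created before this date; on SQL Server
    // editions with partitioning, only whole closed partitions are removed.
    EndDate = new DateTime(2016, 1, 1)
};
orgService.Execute(deleteRequest);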
As far as deleting specific records, I don't believe you can. If you try the following code, you will get the error The 'Delete' method does not support entities of type 'audit':
orgService.Delete("audit", auditId);
If it is an on-premise environment, you have direct DB access and can archive the audit records or delete them via SQL.
Hope that helps.

Related

How to log last changes in Hybris when new feed updates come through

I'm aware that Hybris has SavedValuesModel and SavedValueEntryModel to capture the last changes to a data model and whichever attribute values changed recently, and that it also maintains the history.
However, this works only when data is modified after logging into Backoffice; it doesn't seem to work for feeds that come in via HotFolder. I'd like to know: does Hybris provide anything out of the box to capture the same information for changes made to a given data model through a feed?
Based on the OOTB code, what I have observed is that the class DefaultItemModificationHistoryService is responsible for logging the changes (populating the values and saving the last changes into the saved values model table) made at the model level. It is located inside the OOTB backoffice extension, and that extension is already extended by our myprojectbackoffice extension, which in turn extends the myprojectcore extension.
To capture the last changes made via feed, we thought of handling that logic in an interceptor; however, the above class isn't accessible in our myprojectcore extension, as it's declared in backoffice.
What other possible solutions could I consider to implement this?
I found an article related to this here.
Please advise.
You can use the hybris commerce audit framework to log all of the changes happening in the system.
The documentation here says, "Generic audit tracks every persistence action, including creation, modification, and deletion for specified types. The audit is stored as a change log that allows you to see how an item changed over time."
But this comes with a DB overhead: specific tables get heavily written to with the details of the changes.
These tables follow the naming convention <item_type>_sn.
E.g., for the Order item type, the audit table would be auto-created as orders_sn.
This is why it is advisable to turn auditing off where you don't need it.
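For example, auditing is typically toggled through configuration properties along these lines (the exact property names are an assumption based on the standard commerce audit setup; verify them against your version's documentation):
# Enable/disable the generic audit framework globally.
auditing.enabled=true
# Enable/disable auditing per item type, e.g. for Order.
audit.order.enabled=false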

Content Cache dependency in Kentico V9

I want to update cached content of one custom table when another custom table item is updated.
Let's say I have two Custom Tables: Product and Order.
There are List and Edit pages for both Product and Order.
In the DB there is a trigger on Product that updates some data on Order when a Product changes.
My scenario: when I update, let's say, Product 1 (one item of type Product), I want the Orders cache (all orders) to refresh and reflect the changes made in the DB for Order. This is not happening right now.
Global settings are 10 minutes for content caching, but somehow it takes 20 minutes for changes to show up. Not sure why.
Also, the Orders CustomTableRepeater's System Settings -> Cache minutes is set to 0, which means it should not be caching content at all, but it still does, so I am at a loss here.
The answer to this scenario would seem to be setting a cache dependency dummy key as per the Kentico documentation.
My questions are:
Do I set the all-orders dependency key on the Product edit page's web part partial output cache dependency property?
e.g. orders|all
Will this refresh all order records cached by the custom table data source when any product is modified?
Or do I set the all-products dependency key on the Orders repeater's System settings -> Content Cache Dependency property?
e.g. products|all
Please note the Cache minutes property is set to 0, so ideally this content should not be cached at all.
Or do I add the above key to the Order edit page's web part partial output dependency?
Also, for a custom table, how do I get the proper dummy key? Is it
products|all
OR
nodes|corportateside|products|all
OR
customtableitem.products|all
Or do I need to add the pages' dummy keys that I can see in Debug -> Cache settings?
I have tried setting up all these things but nothing seems to work.
Any help is greatly appreciated.
Okay, so it turned out not to be a cache issue.
I was able to resolve it. Putting the answer here for future reference. First, here is what I tried:
Installed a hotfix
Added a partial cache dependency key
Added a cache dependency key for content caching. Nothing worked.
I got an idea by reading an answer to this question: https://devnet.kentico.com/questions/kentico-8-2-database-caching
When I was updating custom table A's data, a DB trigger on A would update data in table B, which I needed to refresh in the site's cache.
When I tried 'Clear cache' in the Debug application in Admin, it still did not update the data on the site. My custom table data in Admin was not getting updated either.
So, reading one answer from the question above, I realized I needed to refresh the hashtables for the data to be refreshed in Admin and subsequently on the site.
So I added code to CustomTableForm.aspx.cs in the OnAfterSave event handler. There I check whether the current custom table is my table A, and if so, refresh the hashtables of B.
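A minimal sketch of that handler logic (the class names custom.tablea and custom.tableb and the helper method are hypothetical; ProviderHelper.ClearHashtables lives in CMS.DataEngine and CacheHelper.TouchKey in CMS.Helpers, but verify both against your Kentico version):
// Called from OnAfterSave in CustomTableForm.aspx.cs after an item is saved.
private void RefreshDependentTableCaches(string savedClassName)
{
    if (savedClassName.Equals("custom.tablea", StringComparison.OrdinalIgnoreCase))
    {
        // Refresh the provider hashtables so Admin re-reads table B from the DB.
        ProviderHelper.ClearHashtables("customtableitem.custom.tableb", true);
        // Touch the dummy cache key so the site's cached content for table B is invalidated.
        CacheHelper.TouchKey("customtableitem.custom.tableb|all");
    }
}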
This worked.

Dynamics CRM: Bulk update triggered automatically without actually having done so

I found out today that there was a bulk update of the owner field on the contact and account entities. We did not trigger a workflow, nor did we perform any update operation.
The audit log shows that one sales rep changed the owner field on about 850 contacts, but the sales rep did not manually trigger any workflow or update any records.
All these 850+ records have the same update time.
I have no idea what has happened or why.
Probably your sales rep opened the user record of the previous owner of these 850+ contacts and clicked Reassign Records instead of changing only the value of the single lookup.
That way the records are reassigned and the modifiedby is SYSTEM.
Note: this happened to a customer in a CRM Online environment.

MS Dynamics CRM 2011: Unexpected error in Audit History upon deleting user team

We have an issue with an unexpected error in the audit history of contacts. The error has appeared since we deleted a user team from the system.
Tracing the error delivers the message: "Crm Exception: Message: team With Id = b14b1a72-... Does Not Exist"
The AuditBase table contains a row with the team id in the "change data" field for the contacts. The CRM server is on Update Rollup 11.
At the moment we see two possibilities:
- Restoring the teams from a backup: single data rows would have to be restored.
- "Hard deleting" the affected rows from the AuditBase table: critical?
Are there any other ideas or hints? Has anyone already deleted entries from the audit table?
Thanks in advance
Alex
We solved this issue by updating the affected rows via a SQL statement, setting the change data entry to an existing team GUID. The audit history is displayed without an error message again. All affected audit entries were share/unshare operations (action types 48 and 49) for the deleted teams on the contacts.
It remains unclear how this could even happen. Deleting an object should also delete the corresponding audit history entries, especially when they are referenced.
Regards
Alex

Catalog items in SharePoint

Once a day, we receive catalog items (product info) as a feed from an external system. We need to take this feed and store it in SharePoint. We want to achieve the following:
Need to search those items and show them as part of standard search results.
There will be inserts (new items), updates, and deletes to the items. In addition, each catalog item will have metadata associated with it.
We will not be modifying any of that data in our system; it is display-only.
I would like to know from the group the best way to store this in SharePoint and search on it.
I would agree with the suggestion of a timer job to do a nightly batch import and update of the SharePoint catalog. The catalog would be stored in a SharePoint list using a content type (a set of fields) that you specify, which will hold all the product-related data for the catalog.
The BDC may well be your answer if it's compatible with the type of data you want to display, and it would be the easier, cleaner option. However, if it doesn't meet all your requirements, the above solution would be the most flexible route.
Give BDC (Business Data Catalog) a try. MOSS is required.
If you don't have MOSS Enterprise, creating your own timer job that imports the catalog info into a list once a day is also an option.
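A minimal sketch of such a timer job (the job title, site URL, and list name are hypothetical; the feed parsing itself is elided):
using System;
using Microsoft.SharePoint;
using Microsoft.SharePoint.Administration;

// A custom timer job that imports the catalog feed into a SharePoint list.
public class CatalogImportJob : SPJobDefinition
{
    public CatalogImportJob() : base() { }

    public CatalogImportJob(string jobName, SPWebApplication webApp)
        : base(jobName, webApp, null, SPJobLockType.Job)
    {
        Title = "Nightly Catalog Import";
    }

    public override void Execute(Guid targetInstanceId)
    {
        using (SPSite site = new SPSite("http://server/sites/catalog")) // hypothetical URL
        using (SPWeb web = site.OpenWeb())
        {
            SPList list = web.Lists["Product Catalog"]; // hypothetical list name
            // TODO: parse the external feed and apply inserts, updates, and
            // deletes to the list items here.
        }
    }
}
Register the job against the web application (for example from a feature receiver) and give it a daily schedule such as SPDailySchedule so it runs once a night.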
