PrestaShop 1.6 auto-clean stats - statistics

Where can I find the auto-clean statistics setting in PrestaShop 1.6? In 1.5 it was under
"Stats" -> "Settings" -> "Auto-clean period"
There are 3 options (Week, Month, Year), but in 1.6 I can't find these options.

This option does not exist in newer versions of PrestaShop, but there is a good module for stats cleaning:
The Automatic Stats Cleaner module allows you to clear the statistics data collected by your store, which can take up a huge amount of space in the database. It may reduce the store database by up to 90%, depending on how long you have been running your store.
Compatible with PrestaShop v1.5 - v1.7.5
allows you to delete connection statistics
allows you to delete log statistics
allows you to delete statistics on not-found pages
allows you to delete guest information
allows you to delete unneeded shopping cart data
allows you to set up automatic stats cleaning
Here you can find this module:
https://addons.prestashop.com/en/website-performance/43071-automatic-stats-cleaner.html
Don't forget to make a backup of the database before the first cleaning.

Using the free add-on "PrestaShop Cleaner" in PrestaShop 1.6.x, you can clean your database of customers and orders. That will clean your statistics as well.
Don't forget to make backups before doing that.

Related

What is the best speed optimization for a Joomla-based social network (EasySocial)?

I am building (development phase) a social network site on Joomla 3 with EasySocial, EasyBlog and EasyDiscuss as the main extensions, on a shared host that has Varnish cache (static + dynamic) enabled, and I am looking to use AWS storage and a CDN.
My question is: what would the best speed optimizations be in this scenario? The main workload is users frequently creating dynamic content, and it includes some instant features like chat, likes, comments and friend requests.
Previously I activated the free version of CloudFlare to test it, leaving the default settings, and some features either did not work or only started working after 2-3 minutes. Please advise. Also, if possible, suggest cache times, .htaccess configuration, ETag options, etc., whatever is needed.
Thanks.
Install memcached - it will speed things up, at least a few times faster in some cases.

Usage statistics for a database

Is there a way to monitor statistics on usage of documents within a database?
I have a Lotus Notes database hosted on a local server. I know I can get some info from 'User Detail...' in the Info tab of the database properties (right-click the database from Domino Designer), which basically shows me which user accessed the database and which CRUD action was performed, but I was looking for something more in depth, i.e. which document in particular is read the most and by whom.
Since this is StackOverflow, not SuperUser or ServerFault, I'm going to treat this as a programming question. (On those other sites, they would tell you that tracking actions at the document level is not built into Notes and Domino's functionality, but there are some 3rd party add-on products that can do it for you.)
You can implement tracking features down to the document level in Notes and Domino using the Extension Manager API portion of the Notes C API. There is also a free package on the OpenNTF.org web site, called TriggerHappy, which provides a framework for using the Extension Manager features to call Java agents when events that you want to track occur. This can make it significantly easier to accomplish what you want, but it will not scale as well for large user bases.
You should also bear in mind that since Notes and Domino are designed for use in a distributed environment in which users can do their work in local replica databases, a tracking mechanism that is based on an Extension Manager plugin running on the server may not see changes at the moment that users make them. Instead, it might see them when those changes replicate from the user's computer to the server -- and replication does not guarantee that order is preserved, so the server might see some things happen in a different order than what the user actually did.
Have a look at Activity Trends; see the Notes help.
If you need more detail, you have to implement it yourself.

How to do SharePoint database disk usage analysis and selective replication?

I have a SharePoint 2007 database that is 16GB in size and I want to know why, and how I can reduce the size. Ideally I would like a trimmed replica to use as a developer workstation that retains a good sample data set, and has the ability to be refreshed.
Can you please tell me if there are any third party tools or other methods to accomplish this? I have found the Microsoft tool (stsadm) to be very limited in this regard.
Many thanks.
You can start with the Storage Space Allocation page, available in every site collection: http://server/_layouts/storman.aspx
That can tell you which lists are big, etc.
Trashcans are also good candidates for trimming a database.
I regularly make backups of every site collection and just inspect the ones that get too big. It's always something: large PPTs, loads of images, etc.
Ultimately, SQL Server will not automatically shrink your database, so if you delete content the file size on disk will not decrease; shrinking is a SQL Server admin task.
16 GB is not that big, really. You can just back it up and restore it in your dev environment, then delete some unneeded sites (or site collections) from it to make it smaller.
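If you'd rather get a rough picture from code instead of visiting storman.aspx, a minimal sketch with the server object model (a console app run on a SharePoint server; the URL and the 1,000-item threshold are just placeholders) could look like this:

    using System;
    using Microsoft.SharePoint;

    class ListSizeReport
    {
        static void Main()
        {
            // Placeholder URL - point this at the site collection you want to inspect.
            using (SPSite site = new SPSite("http://server/sites/somesite"))
            {
                foreach (SPWeb web in site.AllWebs)
                {
                    try
                    {
                        foreach (SPList list in web.Lists)
                        {
                            // ItemCount is only a rough proxy for size; it ignores
                            // attachment and version bloat, so treat it as a hint.
                            if (list.ItemCount > 1000)
                            {
                                Console.WriteLine("{0} / {1}: {2} items",
                                    web.Url, list.Title, list.ItemCount);
                            }
                        }
                    }
                    finally
                    {
                        web.Dispose(); // webs returned by AllWebs must be disposed explicitly
                    }
                }
            }
        }
    }

It won't tell you about recycle bin contents or old versions, so storman.aspx is still the better first stop; this is just a way to scan all the webs in a site collection without clicking through each one.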

How can I synchronize time zone info across SharePoint site collections?

We are building a large SharePoint 2007 installation with several hundred site collections over four regionally hosted Web applications. We have approximately 12,000 users, spread out more or less evenly around the globe, and each user may visit many site collections - both on their "home" regional server and on other regional servers.
My question is: how can we allow each user to set his/her time zone once, but allow the time zone to be synchronized to each site collection? Also, if the user moves from one time zone to another, how can we allow him/her to change time zones and apply the changes across all site collections?
We've considered the following:
Update time zone records via the SharePoint API using a scheduled process. Clumsy and slow - we'd prefer changes to take effect more quickly, and our maintenance windows are pretty small already.
Put a trigger on the table that holds time zone information and use a .NET stored proc to update via the SharePoint API. Definitely goes counter to SP best practices.
Create a workflow that allows a user to set his/her home time zone, and then iterate through the site collections to set the appropriate time zone info. This seems to work for existing site collections, but new site collections wouldn't get the settings.
Store the user's time zone in a cookie; have the master page get the cookie and update the current site collection's time zone setting. May work, but our users may use multiple machines, and also, we would rather not have the overhead of doing this on every page load.
So bottom line is that we're not sure what the best option is for us. Any thoughts would be appreciated.
I would suggest building on your cookie idea:
Store user's time zone in their profile and provide an interface to change it.
On page load, if a time zone cookie does not exist create one based on the user profile value.
Compare the cookie value to the time zone set in SPContext.Current.Web.CurrentUser and update accordingly.
As the SPUser object will already exist, and you can use cookies to avoid constantly looking up the profile value, the performance impact should be negligible. You can either add this logic to the master page or use a delegate control to insert a light-weight control (my preference).
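A minimal sketch of that compare-and-update step, e.g. in a delegate control's OnLoad. The cookie name, the profile lookup and the specific regional-settings calls here are assumptions to illustrate the flow, not tested code; check SPUser.RegionalSettings and SPRegionalSettings against the WSS SDK for your farm:

    using System;
    using System.Web;
    using Microsoft.SharePoint;

    public class TimeZoneSyncControl : System.Web.UI.Control
    {
        protected override void OnLoad(EventArgs e)
        {
            base.OnLoad(e);

            HttpContext ctx = HttpContext.Current;
            SPWeb web = SPContext.Current.Web;
            SPUser user = web.CurrentUser;
            if (ctx == null || user == null)
                return;

            // "UserTimeZone" is a made-up cookie name holding a SharePoint time zone ID.
            HttpCookie cookie = ctx.Request.Cookies["UserTimeZone"];
            if (cookie == null)
            {
                cookie = new HttpCookie("UserTimeZone", GetTimeZoneIdFromProfile(user));
                cookie.Expires = DateTime.Now.AddDays(30);
                ctx.Response.Cookies.Add(cookie);
            }

            ushort desiredId = ushort.Parse(cookie.Value);
            SPRegionalSettings current = user.RegionalSettings;
            if (current == null || current.TimeZone.ID != desiredId)
            {
                // Assumed pattern for writing per-user regional settings
                // on the current site collection.
                SPRegionalSettings settings = new SPRegionalSettings(web, true);
                settings.TimeZone.ID = desiredId;
                user.RegionalSettings = settings;
                user.Update();
            }
        }

        private string GetTimeZoneIdFromProfile(SPUser user)
        {
            // Placeholder: read the user's home time zone ID from the profile store.
            return "13";
        }
    }

The point is simply that the expensive work (profile lookup, regional-settings write) only happens when the cookie is missing or disagrees with the current site collection, so the per-request cost stays close to a cookie read.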

SharePoint as a high-volume information system

I'm looking at designing some core information systems at a new company I'm working at (I described one of my ideas here: Workflow system).
I've thought a bit more, and I'm strongly considering using SharePoint for a lot of the heavy lifting, seeing as it comes with so much out of the box.
However, I'm not sure how it will handle the high volume of data we'll be throwing at it. I read the MS whitepaper (http://go.microsoft.com/fwlink/?LinkId=95450&clcid=0x409), and it says that around 2000 items in a list is the limit using traditional design methods.
But first, a bit of info on my plan and data structures:
We have multiple clients. Each client has multiple applications. Each application will have multiple, ongoing jobs (or process runs).
Each application will store significant correspondence and documentation. Each job represents the processing of a data file on a single run, and stores information about the job such as the postscript file, postal manifests, etc.
Job volume will be about 50 - 100 a day. Each job will have a workflow, triggered by external programs. Then, say on a "job scheduler" page, production staff can schedule the jobs and perform custom actions on the job (written as plugins).
I was thinking the jobs would sit outside and be accessed via the BDC, but I would still like them represented in SharePoint lists, to add SharePoint functionality and reporting, and so they'd be accessible in multiple places,
e.g.
Application portal - see jobs for application
Production scheduler - see lists of upcoming jobs, assign to resources, trigger other functionality (e.g. copy print file to printer, produce mailing machine file)
Invoicing view - view completed but uninvoiced jobs, export to accounting package
Client view - client portal displays jobs, invoices, stock levels (from external warehouse system), documentation, change register / helpdesk
So the basic info about each job would sit in the BDC, but SharePoint would capture additional metadata about it. Also, down the line we might put in more advanced workflows using WF or something like K2 blackpoint / blackpearl.
Is this feasible? Any resources you'd recommend to read to get up to speed?
To use SharePoint, you should concentrate on what SharePoint is good at and what it is designed for.
SharePoint is a great collaboration portal; it is not so good as a simple high-volume database. So...
You can set up a small site for each client and subsites for each job. The goal of the "job site" is to display (using a web part, perhaps) the relevant upcoming jobs, a list of job errors/exceptions and relevant team documentation for each job.
Separate sites can be created to give a particular "view" of the jobs. E.g. an "Invoicing" site can be created to give a view, again from BDC web parts, of what requires invoicing.
https://iwsolve.partners.extranet.microsoft.com/SDPS/ may provide some help.
Don't try to store huge amounts of information in a SharePoint list just because it may be possible to "tag" it with metadata. A database table is perfectly able to include columns supplying additional information if required.
Think about it this way: if you are creating 50-100 jobs a day, putting that data into a list presupposes that your sites' users are going to want to enter metadata on those jobs manually. Presumably not, so create the systems you need to get the metadata stored correctly at the source, or store metadata about the "types" of jobs within a SharePoint list and allow SharePoint to match the job type with jobs in the BDC.
SharePoint will help you integrate all your systems' information together, but unfortunately it looks like you have a lot of work to do just planning what information should go where and how each type of user will view it.
Please take a look at this blog post I wrote on managing large SharePoint lists for better performance - it might offer a bit of an explanation of the 2,000-item issue, which is not actually a hard limit on the number of items in a list, as SharePoint will support up to 5 million items per list. One way around this is to create and maintain different views that filter by an indexed field to show different items, up to 2,000 at a time. Hope that helps.
Dina Ayoub
Program Manager
Windows SharePoint Services
SharePoint is probably quite a good fit for the UI side of things, though you'll need to think carefully about which parts are stored and modified in SharePoint lists and which parts are stored elsewhere. That's not so much a SharePoint issue as something you always have to deal with when you have multiple data sources.
I'd probably use a SharePoint list as the primary store for jobs, to avoid any sync issues and make editing easier. The volume of data shouldn't be an issue - just make sure you aren't trying to display 2000 items at once; it's the view, not the list itself, that runs into performance issues with large numbers of items.
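As a concrete illustration of staying under that per-view threshold, a query against a hypothetical "Jobs" list filtered on an assumed indexed column named "JobStatus" might look like this (the names are placeholders, not part of any existing design):

    using Microsoft.SharePoint;

    static SPListItemCollection GetScheduledJobs(SPWeb web)
    {
        // "Jobs" and "JobStatus" are placeholders; JobStatus should be an
        // indexed column so the filter does not scan the entire list.
        SPList jobs = web.Lists["Jobs"];

        SPQuery query = new SPQuery();
        query.Query =
            "<Where><Eq>" +
            "<FieldRef Name='JobStatus' />" +
            "<Value Type='Text'>Scheduled</Value>" +
            "</Eq></Where>";
        query.RowLimit = 2000; // keep each result set within the guideline

        return jobs.GetItems(query);
    }

Each view or web part would then use a different filter value (e.g. per client or per status) so that no single request tries to render the whole list.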
Tough question Dane... I would like to know a little more about your design / vision before giving an opinion.
Based on what I read in your question I would not use SharePoint 2007 as a development platform for this application.
1) Development experience in SharePoint 2007 can be painful and unproductive at times.
Hard to debug
Steep learning curve
2) Easy to get in trouble with performance
The data layer is complex and can require expert SQL / SharePoint admin skills to make the platform scale.
Content databases should not exceed 100 GB.
3) Deployment can be extremely difficult depending on what you are doing.
4) New version will be released in the next 12 months.
Just my .02.
