I have a one-level categorized view with 80,000+ documents, and the count is still growing.
Initial loading of the view with all categories collapsed, using an Expand All/Collapse All pager control, is very fast, within a second.
But when I try to expand individual categories, one by one, each takes around 10 seconds. That is painfully slow for users.
Please help; is there any fix available for this?
In 9.0.1 you can enable a new property that increases performance of categorized views.
See http://openntf.org/XSnippets.nsf/snippet.xsp?id=performant-view-navigation-for-notes-domino-9.0.1
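The snippet centers on the ViewNavigator cache-guidance API added in 9.0.1. A minimal sketch, assuming a categorized view named "ByCategory" and a 400-entry cache size (both placeholders; tune them for your design):

```java
import lotus.domino.Database;
import lotus.domino.NotesException;
import lotus.domino.View;
import lotus.domino.ViewEntry;
import lotus.domino.ViewNavigator;

public class CategoryWalker {
    public void walk(Database db) throws NotesException {
        View view = db.getView("ByCategory");
        view.setAutoUpdate(false); // don't re-sync the index on every access
        ViewNavigator nav = view.createViewNav();
        // New in 9.0.1: read entries in large chunks instead of one server
        // round trip per entry, which is what makes big views crawl.
        nav.setCacheGuidance(400, ViewNavigator.VN_CACHEGUIDANCE_READSELECTIVE);
        ViewEntry entry = nav.getFirst();
        while (entry != null) {
            if (entry.isCategory()) {
                System.out.println("Category: " + entry.getColumnValues().get(0));
            }
            ViewEntry next = nav.getNext();
            entry.recycle();
            entry = next;
        }
    }
}
```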
You might want to revisit your UI pattern; categories and pagers don't mix well. See http://www.wissel.net/blog/d6plinks/SHWL-7UDMQS and fix the parameters as Per suggested.
Sorry, this is not a coding question; I'm not sure if I should be posting it here.
I struggle with the concept of what is 'large' in Notes NSF application design elements, as opposed to the amount of data or records stored. For example, it is stated that we shouldn't have too many views, but 'too many' doesn't give any scale whatsoever: is it 10, 50, 100, or 500 before it 'slows down'? I realise it also depends on the view design, but some idea of 'too many' would be beneficial. In this instance, data and design elements are in the same NSF.
Is there a recommendation regarding the number of elements such as XPages, custom controls, managed beans, Java classes, etc.? What would be deemed excessive? In this instance I have data and logic in separate NSFs.
Any guidance would be greatly appreciated.
Thanks
There is a limit on the number of design elements, but unless you're importing a whole JavaScript framework into an NSF, you're not likely to hit it.
As has been mentioned, view performance depends on many factors. 500 decently designed views are fine; 50 badly performing views can be bad. Lots of click-to-sort columns increases the number of indexes that need to be created and maintained. Using @Today or @Now in a view selection formula or column formula will be a big problem. Lots of documents that rarely change, smaller numbers of documents that are updated every 30 seconds, many users updating regularly: these will all impact performance.
Performance of your code will also have an impact, and the XPages Toolbox or agent profiling will give you an idea of where the time goes. DocumentCollection.getCount() is slow, but sometimes it is needed; a NoteCollection may be quicker. There are various blog posts covering this.
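To illustrate the NoteCollection alternative, a minimal sketch using the plain lotus.domino Java API (the class and method names around the call are placeholders):

```java
import lotus.domino.Database;
import lotus.domino.NoteCollection;
import lotus.domino.NotesException;

public class FastCount {
    // Count documents via a NoteCollection instead of materializing a
    // DocumentCollection and calling getCount() on it.
    public int countDocuments(Database db) throws NotesException {
        NoteCollection nc = db.createNoteCollection(false); // start empty
        nc.setSelectDocuments(true); // select data documents only
        nc.buildCollection();        // build the note-ID list server-side
        int count = nc.getCount();
        nc.recycle();
        return count;
    }
}
```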
A managed bean that has a Map that grows and grows will impact Java memory.
But there are always performance enhancements being made on the server side; gRPC in Domino 10, for example, will be extremely performant. So always try to be on a recent version, and keep up to date with sessions at conferences etc., so you know what TCO improvements are being made.
The bottom line is without an intimate understanding of your architecture and code, no one will be able to give you a definitive answer.
I have to use JointJS for building a workflow diagram.
How much will performance suffer in the IE browser when there are more than 200 components (including arrows, nodes, and boxes) on the same page?
As you may know, there is a demo application named Rappid here.
You can test your performance requirements there.
I have tested the performance with ~150 components and one link between two components, and there was no performance issue at all.
I am using it with more than 200 nodes, and there are problems: if you want to add a description near every block at the same time, it will hang for a few minutes. If you do such operations in batches, though, it is fine. There are also problems if you want to hide 100 nodes at once; I handle those kinds of tasks with jQuery. I am using it for a very complex use case with more than 500 elements, including nodes, arrows, and other helper components. Tell me your exact requirements and I will be able to tell you whether it will work or not.
I am using LazyDataModel to display data table records with pagination, filtering, and sorting. There can be around 2,500 records at most. I display 10 records per page, so the customer has to visit 250 pages if they don't know the search term. Now the customer doesn't want to visit all the pages; rather, they want an implementation where they can do everything from the same page.
The other option that comes to mind is live scroll, but while working on a PoC I found that LazyDataModel and live scroll don't work together. So I created a demo page using live scroll, independent of LazyDataModel. I really like live scrolling when it comes to filtering records, as it is much faster. The only drawback is having to scroll all the way down to the end.
I have the following questions:
How does live scroll work internally?
Does live scroll load all the data up front and then fetch it from the heap or a cache (i.e. scrollRows="20")?
If live scroll performs better, why use LazyDataModel at all?
Don't you think pagination is a thing of the past?
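For context, a minimal sketch of what a LazyDataModel subclass typically looks like, using the older PrimeFaces 5.x load() signature (newer releases changed it to SortMeta/FilterMeta maps). The Record entity and the in-memory backing list are stand-ins for a real query:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import org.primefaces.model.LazyDataModel;
import org.primefaces.model.SortOrder;

// Hypothetical entity for illustration.
class Record {
    String name;
    Record(String name) { this.name = name; }
}

public class RecordLazyModel extends LazyDataModel<Record> {
    // Stand-in for a database; in practice you'd query with LIMIT/OFFSET.
    private final List<Record> backing = new ArrayList<Record>();

    @Override
    public List<Record> load(int first, int pageSize, String sortField,
                             SortOrder sortOrder, Map<String, Object> filters) {
        // Called once per visible chunk: a pager asks for the current page,
        // live scroll asks for the next scrollRows-sized slice. Either way,
        // only [first, first + pageSize) is materialized per request.
        setRowCount(backing.size()); // tells the table how far it can page/scroll
        int from = Math.min(first, backing.size());
        int to = Math.min(first + pageSize, backing.size());
        return new ArrayList<Record>(backing.subList(from, to));
    }
}
```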
We have a page in SharePoint that lists all the sites, the person who manages each site, their contact info, and the last modified date.
Currently we are using a custom webpart that crawls through the sites, reads the metadata, and then displays it all in a list.
Opening this page takes about 10+ seconds.
We're looking at ways to cut this time to less than 3 seconds.
I'm thinking about some sort of timer job that caches the page, say every half hour, and when the page is requested, it simply displays the cached version. The data on the page doesn't change that often, so staleness isn't really a big issue. Is this idea feasible? I'm fairly new to SharePoint, so what would be the steps to implement this?
Or if there are any other options/suggestions on how to reduce the load time, I'm all ears.
Here are some approaches that might work for you.
Extend your existing webpart with a cache. The first user who visits the page will wait as long as with the existing solution, but they will fill the cache, so every subsequent request will be much faster (a sketch of the pattern follows this list):
http://msdn.microsoft.com/en-us/library/microsoft.sharepoint.webpartpages.webpart.partcachewrite(v=office.15).aspx
Create a timer job that fills an extra SharePoint list with the fields you need, and render your webpart from that data. Fetching the data from a list will be much faster than iterating over SPWeb or SPSite objects.
A lot of the data can already be fetched from the search service, and you can extend the attributes the search engine crawls. Once the search attributes are extended, you can create a search-driven webpart:
http://technet.microsoft.com/de-de/library/jj679900(v=office.15).aspx
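The caching idea in approach 1 is platform-neutral; here is a minimal sketch of the pattern (shown in Java purely for illustration; in a real webpart you would use the PartCacheRead/PartCacheWrite methods linked above, and crawlSites() is a hypothetical stand-in for the expensive metadata crawl):

```java
import java.util.Arrays;
import java.util.List;
import java.util.concurrent.TimeUnit;

public class SiteListCache {
    private static final long TTL_MILLIS = TimeUnit.MINUTES.toMillis(30);

    private List<String> cached;
    private long loadedAt;

    // First request pays the full crawl cost and fills the cache; later
    // requests are served from it until the TTL expires.
    public synchronized List<String> getSites() {
        long now = System.currentTimeMillis();
        if (cached == null || now - loadedAt > TTL_MILLIS) {
            cached = crawlSites(); // slow path: only the cache-filling user waits
            loadedAt = now;
        }
        return cached;
    }

    private List<String> crawlSites() {
        // Placeholder for the real site/metadata crawl.
        return Arrays.asList("Site A", "Site B");
    }
}
```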
Each of these solutions should work on SP 2007/2010/2013.
If you need a quick win, then maybe solution 1 is the best for you.
Regards
I have a classic Lotus Notes form with 1,800 fields and 7 tabs.
Will performance improve if I split the form into 7 different forms and then bind them to the XPage,
or
will there be any performance impact if I bind the form to the XPage directly?
Thanks
That's a "try that with your RDBMS" type of situation.
Splitting the form will not help you. XPages doesn't care much about how many data sources are used when the number of bindings is high (and after splitting you would still have more than 200 fields per form). And you don't actually bind to a form, you bind to a document; the form is only a design-time convenience.
Making some wild guesses, I would presume that a lot of the fields are repeating fields (like LineItem_1, LineItem_2, LineItem_3) instead of multi-value fields.
Moving forward you need to make a basic choice:
Is your data format fixed, because all views, reports, import/export routines etc. rely on it, or can you refactor the data model? (Based on your question I would presume the former.)
If it is fixed, I would look at encapsulating the document in a managed bean that provides collections of data for the repeating sets of fields, with repeat controls showing the number of entries you actually need. (The classic way would be different hide-whens in every cell of the table hosting the fields.) This way you have far fewer bindings to look after.
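A minimal sketch of such a bean, assuming the repeating fields are named LineItem_1 .. LineItem_n (the prefix and the String item type are placeholders for your real field names):

```java
import java.util.ArrayList;
import java.util.List;
import lotus.domino.Document;
import lotus.domino.NotesException;

public class LineItemBean {
    private final List<String> lineItems = new ArrayList<String>();

    // Read LineItem_1, LineItem_2, ... into one collection, stopping at
    // the first missing or empty field.
    public void load(Document doc) throws NotesException {
        lineItems.clear();
        for (int i = 1; doc.hasItem("LineItem_" + i); i++) {
            String value = doc.getItemValueString("LineItem_" + i);
            if (value == null || value.isEmpty()) {
                break;
            }
            lineItems.add(value);
        }
    }

    // A single repeat control can bind to this, e.g. value="#{lineItemBean.lineItems}".
    public List<String> getLineItems() {
        return lineItems;
    }
}
```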
A very basic idea of how to build a dynamic table can be found in Exercise 23 of the IBM XPages tutorial.
Disclaimer: Tim Clark and I wrote that.
You might also consider loading only the parts of the document that the specific user needs at that moment.
In my opinion, a form with seven tabs and 1,800 fields is too complex. And even if we split the form into seven, each form will still have around 260 fields, so the code will remain complex.
My suggestion is to change the data bindings dynamically in the XPage. If redesigning the form feels too complex, follow the idea above; otherwise change the design and build a clean lookup in XPages.