Why does executeQuery after refresh take 100x more time?

I have a form with 4 datasources. It is in list page style and has Datasource1 (a big table with lots of relations, columns, and indexes) shown in the grid. When I open this form it takes about 200ms, but when I refresh it it takes 13s.
I used the Code Profiler tool and found out that the time is consumed in Datasource1's executeQuery() method, in the super() call.
When executeQuery() is called from the form by
Datasource1_ds.executeQuery();
it takes 200ms.
There are about 15 columns in the grid on the form, and sorting by one takes a little less than 1s.
So my question is: what is called in super() when the form is refreshed by the F5 task that is not called by opening the form and calling Datasource1_ds.executeQuery();?
I have tried using the Code Profiler with different settings and actions, debugging the code during various actions, using the Visual Studio Profiler, using the Activity Monitor in Microsoft SQL Server on the Microsoft Dynamics AX database, and changing the Datasource1 table, all with no luck.
Every time I end up at the super(). The only time the refresh is fast is when I have filters on the grid and it shows fewer rows. (I tried the VisibleRows property on the grid, but it does not help.)
I am using Microsoft Dynamics AX 2012 R2

I would suggest using SQL Server Profiler to capture the queries that are executed (1) on the initial executeQuery() when the ListPage form is opened, and then (2) on the invocation of executeQuery() on form refresh.
Comparing the execution plans of these two queries should show the bottleneck. You can capture the Showplan XML event.
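If it is easier, you can also capture the actual plans directly in SSMS once you have the two statements from the Profiler trace. A minimal sketch; the commented-out placeholder queries stand in for whatever Profiler captured:

SET STATISTICS XML ON;   -- returns the actual execution plan for each statement
SET STATISTICS TIME ON;  -- returns parse/compile and execution times

-- (1) paste here the statement captured when the form is opened
-- SELECT ...;

-- (2) paste here the statement captured on refresh (F5)
-- SELECT ...;

SET STATISTICS XML OFF;
SET STATISTICS TIME OFF;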

Related

Need help with view delegate in custom screen

We created a custom screen which displays a list of sales data based on filter conditions (today, yesterday, this week, this month, this quarter, this year). We created a SQL view for this, then created a DAC from the view and use it in the custom screen. We also have filters on our screen; for the filter conditions we are using a view delegate and returning the data. The question is: why does the screen take so long, around 70 seconds, to load 2K records? Does using a view delegate decrease the speed of loading data? We could go with a GI, but we need to display images in the grid, so we opted for a custom screen; we also have a report button in the header which prints a report. Since we can't show images in a GI, we chose this approach.
The slowness you see is most likely caused by a combination of two reasons.
When you use a BQL view, it in fact requests only the number of records you see on the screen. For instance, if you have a grid with paging and only 20 records are visible on the page, the SQL SELECT will have a TOP 20 limitation. However, once you have a select delegate, that optimization stops working, since the framework does not know what you'd like to do with the data you select. The solution here would be to use SelectWithViewContext with a PXDelegateResult return object instead of a regular select. In that case user filtering, pagination, and ordering are preserved in the select. (Use this method only if the resulting records on the screen relate 1 to 1 to the records you select. If you use any kind of aggregation or insert records from two different selects, that approach does not work.)
Example:
protected virtual IEnumerable ardocumentlist()
{
    PXSelectBase<BalancedARDocument> cmd =
        new PXSelectJoinGroupBy<BalancedARDocument,
        ...
        OrderBy<Asc<BalancedARDocument.docType,  // set necessary sorting fields: use the key fields
        Asc<BalancedARDocument.refNbr>>>>        // set necessary sorting fields: use the key fields
        (this);

    PXDelegateResult delegResult = new PXDelegateResult
    {
        // set these flags to indicate that the result does not need
        // re-filtering, re-sorting, or re-paging by the framework
        IsResultFiltered = true,
        IsResultTruncated = true,
        IsResultSorted = true
    };

    foreach (PXResult<BalancedARDocument> res_record in cmd.SelectWithViewContext())
    {
        // add the code to process res_record
        delegResult.Add(res_record);
    }
    return delegResult;
}
Probably you don't have proper indexes on your table, since even selecting all 2K records at once should not result in a 70-second load time. The recommendation here would be to use the Request Profiler to catch the exact SQL generated (https://help-2020r2.acumatica.com/Help?ScreenId=ShowWiki&pageid=e4c450bb-86bc-4fb2-b7e6-1f715abe3c8b) and execute that SQL in SQL Server Management Studio with the 'Include Actual Execution Plan' option (https://learn.microsoft.com/en-us/sql/relational-databases/performance/display-an-actual-execution-plan?view=sql-server-ver15). Usually in this mode SQL Server suggests the indexes needed to speed up the query execution.
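If the actual plan does not already carry a missing-index suggestion, the missing-index DMVs expose the same information the engine has collected since the last restart. A sketch; nothing here is specific to Acumatica, and you would filter the statement column down to your own table:

-- List missing-index suggestions, highest estimated benefit first.
SELECT TOP 20
    d.statement AS table_name,
    d.equality_columns,
    d.inequality_columns,
    d.included_columns,
    s.avg_user_impact,                   -- estimated % cost reduction
    s.user_seeks + s.user_scans AS times_wanted
FROM sys.dm_db_missing_index_details AS d
JOIN sys.dm_db_missing_index_groups AS g
    ON g.index_handle = d.index_handle
JOIN sys.dm_db_missing_index_group_stats AS s
    ON s.group_handle = g.index_group_handle
ORDER BY s.avg_user_impact * (s.user_seeks + s.user_scans) DESC;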

DataSource.Error AnalysisServices: The XML for Analysis request timed out before it was completed. Timeout value: 3600 sec

After connecting to the Analysis Services server through Excel, when I try to import the complete data into Excel I get this error after a few rows have loaded.
Try changing the ServerTimeout server property to a value high enough for this operation to complete. To do this, connect to the SSAS server through SSMS, right-click the instance name, and select Properties. Go to the General tab, check the Show Advanced (All) Properties checkbox, and find the ServerTimeout property. This is set in seconds, with a default of 3600 (one hour), as seen in the error message. Modify the Value column to be high enough to finish this task. After changing the ServerTimeout property, press OK; the change is effective immediately with no restart necessary. For more details on this and other SSAS properties, review the documentation. If what you're importing into Excel typically takes longer than an hour, you may want to look into another means of performing this, such as SSIS.

Is there a way to disable notes/files in results set for Generic Inq?

I'm working through some poor performance when exporting data (StockItems) from Acumatica using a generic inquiry. The issue I'm having is that Acumatica is going after every item in the result set and then getting the notes for each item.
Is there a way to disable this for generic inquiry screens? I don't need notes, I just want the data. My main query finishes in .02 seconds but then there is another 2-3 seconds of SQL time while it gets the notes.
If you use DACs made from SQL views, the results grid doesn't add the notes. But once you join a DAC that is note-enabled, they start to populate in the results grid.

Implementing custom SSRS security within the report logic, in addition to ReportServer

SSRS Version 2008 (Not R2)
Hi all.
I am trying to implement custom security within a report being used for call center managers and agents. Here are my requirements:
Agents can see their own stats and no one else's stats.
Managers can see anyone's stats
I have a report that shows basic call center stats like # of dropped calls, on-hold time, etc. The report has two parameters: one for @Date and one for @AgentID. We want to make sure the managers can view this report for any agent, while the agents can see their stats and only their stats. I have tried a few techniques and would like to accomplish this within the report logic (stored procedure). My latest attempt involved capturing the current user's login (SELECT CURRENT_USER) and then bumping that up against a WHERE clause in the report's main SELECT statement. This seems to work fine in SQL/SSMS but does not seem to catch on when deployed as an SSRS report.
-- Sample user table
SELECT DISTINCT
    ManagerID
    ,ManagerName
    ,LoginID
    ,'Manager' AS LEVEL
INTO #user_SOURCE
FROM dbo.AgentTable
UNION
SELECT DISTINCT
    AgentID
    ,AgentName
    ,LoginID
    ,'Agent'
FROM dbo.AgentTable
UNION
SELECT
    1
    ,'My Name'
    ,'mylogin'
    ,'Tester'

-- Then I have my simple SELECT statement that is inside a stored procedure and called by the report.
SELECT TOP 1000 *
FROM dbo.CallCenterReportTable
WHERE CURRENT_USER IN
(
    SELECT LoginID
    FROM #user_SOURCE
)
The interesting thing is that I can test this fine in SSMS, and I can even test it successfully in my local BIDS, but it only works in BIDS if I slightly adjust anything inside the report, like page size. Anything that would require a re-save of the RDL seems to make the security function as it should when viewing locally. That being said, the security will not work once deployed to the SSRS server. Even if I change the RDL slightly and redeploy, it does not work.
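One thing worth verifying is what CURRENT_USER actually resolves to when the deployed report runs: SSRS executes the dataset under the data source's credentials, so on the server it is typically the service or stored account rather than the person viewing the report, which would explain why the WHERE clause matches in SSMS but not after deployment. A quick diagnostic, using a hypothetical logging table (dbo.WhoAmILog is not part of your schema):

-- Hypothetical diagnostic table; call the INSERT at the top of the
-- report's stored procedure, run the deployed report, then inspect it.
CREATE TABLE dbo.WhoAmILog
(
    logged_at  datetime NOT NULL DEFAULT GETDATE(),
    db_user    sysname  NOT NULL,   -- CURRENT_USER
    login_name sysname  NOT NULL    -- SUSER_SNAME()
);
GO
INSERT INTO dbo.WhoAmILog (db_user, login_name)
VALUES (CURRENT_USER, SUSER_SNAME());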
My last resort would be to create two reports that are almost identical. One would restrict use to only call center managers by AD role. The other would be open to agents and would utilize the USERID internal parameter in SSRS (WHERE UserID = @AgentID). I would really like to avoid having two reports if at all possible.
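If the single-report route is preferred, one pattern that avoids CURRENT_USER entirely is to pass the viewer's login into the stored procedure as an ordinary parameter and map it to the built-in User!UserID value in the dataset. A sketch under that assumption; the procedure name and the IsManager flag are illustrative stand-ins for your own schema:

-- Illustrative single-report procedure: @ReportUser is mapped to the
-- built-in =User!UserID expression in the SSRS dataset, so the query
-- sees the viewer's login even under stored data source credentials.
CREATE PROCEDURE dbo.rpt_CallCenterStats
    @ReportUser sysname,
    @AgentID    int
AS
BEGIN
    SET NOCOUNT ON;

    -- As in the original query, this only gates access; add row-level
    -- filtering on @AgentID to the SELECT as needed.
    SELECT TOP 1000 *
    FROM dbo.CallCenterReportTable
    WHERE
        -- managers may view any agent's stats
        EXISTS (SELECT 1 FROM dbo.AgentTable
                WHERE LoginID = @ReportUser
                  AND IsManager = 1)      -- hypothetical manager flag
        -- agents may only view their own stats
        OR EXISTS (SELECT 1 FROM dbo.AgentTable
                   WHERE LoginID = @ReportUser
                     AND AgentID = @AgentID);
END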

Do lots of fields cause a partial refresh to be slow?

I have a lot of fields on a form. Not exactly sure how many but it has to be close to 100, if not over.
I have a change event of one field doing a partial refresh of a computed field with the following formula.
return document1.getItemValueString("txtCustomScore");
txtCustomScore is the field that has the event.
It takes 3-4 seconds to update this field. Are all of those other fields somehow affecting how long it takes to refresh it?
I even tried getValue instead of getItemValueString. As suggested in this thread:
Setting a document field with replaceItemValue from a rich text control?
But it still takes 3-4 seconds to update the computed field.
Is there any way to fix this other than eliminating fields from the form?
Yes, it does. Even for a partial refresh, all component values get evaluated and the server-side result tree is built. As Tommy suggested, partial execution mode might be your answer.
I strongly encourage you to watch the XPages Masterclass Video Series 1 (see: http://tonymcguckin.wordpress.com/2013/04/22/xpages-masterclass-series-1/).
From this you will be able to introspect the XPages Request Processing Lifecycle phases and profile your application. This will uncover the exact reasons behind the processing cost.
