Why does Google PageSpeed Insights show a different score each time I analyze the website? - pagespeed-insights

I tried to improve my Google PageSpeed Insights mobile score by optimizing images on the website. I analyzed the site a few times, both before I changed anything and after I replaced some unoptimized images, which was supposed to make things better. Each time I got a different score. Before any changes it was 49 or even 34, and after changing some images I got 54, 52, 49 or 34. That doesn't make much sense. Why does this happen, and what can be done to improve the score?

In PageSpeed Insights there are a couple of sections: "Score", "Field Data", "Origin Data" and "Lab Data". If you optimized the code and ran the page through PageSpeed Insights again, don't look only at the score; compare what the "Lab Data" section showed before and after the change. If your Lab Data improved, you should see the overall picture improve once the field data refreshes, after 28 days. In any case you don't have to wait 28 days: click the "See Calculator" link in the Lab Data section and it will show you how your score should look once those 28 days are up.
In my experience the tool is consistent; the variation you are seeing is due to the field data being refreshed for your domain.
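If you want to compare before/after runs without eyeballing the UI, the PageSpeed Insights v5 REST API returns the Lighthouse (lab) result and the CrUX field data as separate objects. A minimal sketch, assuming placeholder values for the API key and target URL (the exact set of field metrics returned varies by origin):

```typescript
// Sketch: fetch a PageSpeed Insights v5 report and print the lab score
// alongside the CrUX field data, so before/after comparisons use lab data only.
// API_KEY and TARGET_URL are placeholders, not real values.
const API_KEY = "YOUR_API_KEY";
const TARGET_URL = "https://example.com/";

async function comparePsiData(): Promise<void> {
  const endpoint =
    "https://www.googleapis.com/pagespeedonline/v5/runPagespeed" +
    `?url=${encodeURIComponent(TARGET_URL)}&strategy=mobile&key=${API_KEY}`;

  const response = await fetch(endpoint);
  const report = await response.json();

  // Lab data: the synthetic Lighthouse run — this is what changes on every test.
  const labScore = report.lighthouseResult?.categories?.performance?.score;
  console.log("Lab (Lighthouse) performance score:", labScore != null ? labScore * 100 : "n/a");

  // Field data: rolling CrUX data for the page/origin — lags behind recent changes.
  const fieldMetrics = report.loadingExperience?.metrics ?? {};
  for (const [name, metric] of Object.entries<any>(fieldMetrics)) {
    console.log(`Field ${name}: p75 = ${metric.percentile}, category = ${metric.category}`);
  }
}

comparePsiData().catch(console.error);
```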

Related

PageSpeed Insights - CLS on field data not improving

We implemented CLS optimizations 20 days ago, and the lab data values have been perfect since then.
CLS in the field data is a different story. It is improving, but very slowly. If it is true that it is calculated over a 28-day period, we should already be seeing significantly better values.
We started with a CLS of 1.06 and are now at 0.68. Lab data on my computer shows a CLS of 0.001.
Is there any way to validate field data calculation?
Or is there any other reason I am not seeing? Thanks.
First, a CLS drop from 1.06 to 0.68 after 20 days is good; you should level out at about 0.5, which is a big improvement.
Unfortunately the reason you have CLS issues is that you still have problems somewhere.
You see, the synthetic lab tests only measure CLS during the initial page load, at two specific screen sizes.
The field data measures until page unload and at every screen size.
So your problem is either further down the page or caused by CLS at a different screen size than those tested.
As you have "maxed out" the synthetic tests the advice in this answer I gave may help you identify CLS issues, which covers 2 ways to test using developer tools and how to track real world data (the best way in my opinion) to help narrow down the cause.
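As an illustration of the "track real-world data" idea (not the exact method from the linked answer), here is a minimal sketch that records layout shifts for the whole page lifetime, including which elements moved, and beacons them out when the page is hidden; the /cls-report endpoint is a placeholder for your own collector:

```typescript
// Sketch: collect real-user layout shifts for the whole page lifetime,
// including the DOM nodes that moved, to narrow down where CLS happens.
// (Note: the official CLS metric uses session windows; this simple sum is for diagnosis only.)
let clsTotal = 0;
const shiftLog: Array<{ value: number; nodes: string[] }> = [];

const observer = new PerformanceObserver((list) => {
  for (const entry of list.getEntries() as any[]) {
    // Shifts within 500 ms of user input are excluded from CLS.
    if (entry.hadRecentInput) continue;
    clsTotal += entry.value;
    shiftLog.push({
      value: entry.value,
      // entry.sources names the elements that actually moved.
      nodes: (entry.sources ?? []).map((s: any) => s.node?.nodeName ?? "unknown"),
    });
  }
});
observer.observe({ type: "layout-shift", buffered: true });

// Report when the page is hidden (the closest reliable proxy for "unload").
document.addEventListener("visibilitychange", () => {
  if (document.visibilityState === "hidden") {
    // "/cls-report" is a placeholder endpoint for your own analytics collector.
    navigator.sendBeacon("/cls-report", JSON.stringify({ clsTotal, shiftLog }));
  }
});
```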

Will chunk/bundle optimisations help my website if First Input Delay (FID) is already low?

According to Core Web Vitals, there are only three core vitals for measuring the user experience of any website: LCP (Largest Contentful Paint), FID (First Input Delay) and CLS (Cumulative Layout Shift). According to PageSpeed Insights and the CrUX dashboard, the FID of my website is within good limits, i.e. 90% of users have an input delay of less than 100 ms.
Will there be any benefit to the user experience of people landing on my website if I do chunk optimisations (splitting, lazy loading)?
I understand that it will affect TBT (Total Blocking Time) and TTI (Time to Interactive), but does that matter if my FID is already very low? Is my understanding correct?
I work on several large sites and we measure FID and TBT across thousands of pages. My work on this shows there is little correlation between TBT and FID. I have lots of pages reporting a TBT of 2 s or more that are nonetheless well within the "good" range for FID. So I would NOT spend money or time optimizing TBT; what I would do instead is optimize for something that you can correlate to a business metric. For instance, add some user timings to measure how fast a CTA button appears and when it becomes interactive. That is a metric that is actually useful.
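A minimal sketch of that kind of user timing, assuming a hypothetical "signup-cta" button whose click handler arrives in a lazily loaded "./signup" module:

```typescript
// Sketch: use the User Timing API to measure how long a call-to-action button
// is visible before its click handler is attached (i.e. before it is useful).
// The element id "signup-cta" and the module "./signup" are hypothetical.

// 1. In an early inline script, as soon as the button is painted:
performance.mark("cta-rendered");

// 2. Later, when the deferred bundle that wires the button up finishes loading:
import("./signup").then((module) => {
  const cta = document.getElementById("signup-cta");
  if (!cta) return;
  cta.addEventListener("click", module.onSignupClick);

  performance.mark("cta-interactive");
  performance.measure("cta-render-to-interactive", "cta-rendered", "cta-interactive");

  const [measure] = performance.getEntriesByName("cta-render-to-interactive");
  // Send this to your analytics — it is a number you can tie to a business outcome.
  console.log("CTA render → interactive:", Math.round(measure.duration), "ms");
});
```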
Being in the green on the core web vitals report (for one or all metrics) is great, but it doesn't mean that you should not try to improve performance further. In fact, if all your competitors have better FID / CLS / LCP / etc. than you, you will be at a disadvantage. Generally speaking, I think the web vitals report can be used as a guide to continuously prioritise changes to try and improve performance.
While it's impossible to predict the improvements without looking at a current report and the codebase, it seems fair to expect code-splitting to improve FID and LCP, and lazy-loading to help with LCP. Both improvements would benefit users.
Note that TBT and FID are pretty similar.
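To make the code-splitting and lazy-loading suggestions above concrete, here is a minimal sketch; the "./chat-widget" module, the "open-chat" button id and the data-below-fold attribute are all hypothetical names:

```typescript
// Sketch: defer non-critical JavaScript with a dynamic import (code splitting)
// and lazy-load below-the-fold images, keeping the main thread and network
// free for the LCP element. "./chat-widget" is a hypothetical module.

// Code splitting: the chat widget bundle is only fetched when it is needed.
document.getElementById("open-chat")?.addEventListener("click", async () => {
  const { initChatWidget } = await import("./chat-widget");
  initChatWidget();
});

// Lazy loading: below-the-fold images do not compete with the hero image.
// (In real markup you would normally put loading="lazy" directly on the <img> tags.)
for (const img of document.querySelectorAll<HTMLImageElement>("img[data-below-fold]")) {
  img.loading = "lazy";
}
```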

Is the PageSpeed Insights displayed score taken from the “Lab Data” or the “Field Data”?

I've randomly tested a web link and got 64. However, the Lab Data and Field Data seem quite different. I think that's because the page owner just modified it.
Is the score “64” reflecting the Lab Data or Field Data?
Short Answer
It is the lab data score.
Longer Answer
The score you see there is the "lab data" score; it is the score for the synthetic test you just ran. It will change every time you run PageSpeed Insights.
"Field Data" will not contribute towards your score in Page Speed Insights and is purely for diagnostics.
The "Field Data" is calculated over a rolling 30 days so is useful to see if there are issues that automated tests do not pick up, but useless if you have just done a major update to fix a site issue (for 30 days at least).
Additionally, CLS in "Field Data" is calculated for the whole time someone is on the site (until the page's "unload" event), whereas the PSI "Lab Data" CLS is only calculated on the above-the-fold content during the initial load. That is sometimes another reason for the disparity between results.

Does Google PSI "trailing thirty days" testing still occur?

I noticed in this Google PSI FAQ, written for a previous, now-deprecated version of the test, that it says changes made to the website do not affect the PSI score immediately.
"The speed data shown in PSI is not updated in real-time. The reported metrics reflect the user experience over the trailing thirty days and are updated on a daily basis."
https://developers.google.com/speed/docs/insights/faq
Does this part of the FAQ still apply today? I've noticed that if I reduce the number of DOM elements, the "Avoid an excessive DOM size" audit in Google PSI immediately shows the correct new count of DOM elements, but the scores still remain in the same range.
The part you are referring to is the "field data", which is indeed still calculated over a trailing window (now a rolling 28-day period).
However, when you run your website through PageSpeed Insights, the page is tested without any cache and the result is calculated fresh each time you run it (this is the "Lab Data").
Think of field data as "real world" information, based on visitors to your site and their experiences, it is a far more accurate representation of what is really happening when people visit your site.
Think of the "lab data" as a synthetic test and a diagnostic tool. They try to simulate a slower CPU and a 4G connection but it is still a simulation, it is designed to give you feedback on potential problems. It has the advantage of updating instantly when you make changes though.
For this reason your "field data" will always lag behind your "lab data" when you make changes to the site.
Also bear in mind that some items in the report are purely diagnostic. In your example of "excessive DOM size", this has no direct scoring implications; it is there to explain why you might be getting slow render times and/or a large Cumulative Layout Shift, as lots of DOM elements mean more rendering time and more chance of a reflow.
See this answer I gave on the new scoring model PSI uses.
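If you want to sanity-check the number that the "Avoid an excessive DOM size" audit reports, it is essentially the total element count of the document (the audit also flags depth and child count). A rough console sketch:

```typescript
// Sketch: approximate the "DOM size" figure PSI reports by counting all elements,
// plus the deepest nesting it flags. Run in the browser console on the loaded page.
const allElements = document.querySelectorAll("*");
console.log("Total DOM elements:", allElements.length);

let maxDepth = 0;
for (const el of Array.from(allElements)) {
  let depth = 0;
  for (let node: Element | null = el; node; node = node.parentElement) depth++;
  maxDepth = Math.max(maxDepth, depth);
}
console.log("Maximum DOM depth:", maxDepth);
```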

How does Google Analytics calculate metrics like "average time spent"?

How do services like Google Analytics calculate parameters like
"average time spent"
"number of users that came to the website via search vs. users that hit the URL directly"
etc.?
I would imagine that Google can easily record a hit when someone clicks on a link in a search result. But after that, how long and how deep the user browses that particular website is surely out of its reach... hmmm?
This question has some information. As mentioned there, time should be calculated using an onUnload() event: when the JS first loads, the time may be recorded (in a cookie), and then on onUnload() the time spent is calculated and sent to Google for recording.
That question covers most of what you are asking.
This thread states quite clearly that there is no unLoad() event: http://groups.google.com/group/analytics-help-troubleshoot/browse_thread/thread/d142572ddf1fa9dd/38dd640f949e9890?pli=1
Also, try going to GA and looking for sessions with only one pageview: you will see the average page time is 0s, which proves the point.
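To illustrate why single-pageview sessions report 0s, here is a small sketch of the pageview-to-pageview calculation that classic GA effectively performs; the paths and timestamps are made up:

```typescript
// Sketch: how "time on page" falls out of pageview timestamps alone.
// A session with a single pageview (or the last page of any session) yields 0s.
type Pageview = { path: string; timestamp: number }; // timestamp in ms

function timeOnPage(session: Pageview[]): Map<string, number> {
  const result = new Map<string, number>();
  for (let i = 0; i < session.length; i++) {
    const next = session[i + 1];
    // No following pageview => nothing to subtract => 0s recorded.
    const seconds = next ? (next.timestamp - session[i].timestamp) / 1000 : 0;
    result.set(session[i].path, (result.get(session[i].path) ?? 0) + seconds);
  }
  return result;
}

// Example session: /home at t=0s, /pricing at t=42s, then the visitor leaves.
console.log(
  timeOnPage([
    { path: "/home", timestamp: 0 },
    { path: "/pricing", timestamp: 42_000 },
  ])
); // Map { "/home" => 42, "/pricing" => 0 }
```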

Resources