I am trying to build a test that will fail when a web page, or the elements within it, becomes slow, sluggish, or janky.
By "slow" or "sluggish" I mostly mean the following:
Very delayed response when scrolling down the page
Delayed response when clicking an element
This sluggishness on a particular page happens when the system under test is scaled out.
One idea I had to tackle this was to set a timeout when explicitly waiting for some element to appear in the DOM, but frankly I doubt this will answer my need, because I do see all elements in the DOM; it's just that page interaction is very slow, in both Firefox and Chrome.
My stack is Python 3, Selenium WebDriver, and pytest.
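One possible sketch of such a test, assuming the check is "no animation frame takes longer than some budget": inject a small script that samples frame-to-frame gaps with requestAnimationFrame, then assert on the worst gap in pytest. The names FRAME_SAMPLER_JS and worst_frame_ms, the 60-frame sample, and the 100 ms budget are my own placeholders, not established values.

```python
# Sketch of a jank check: inject JS that samples frame times with
# requestAnimationFrame, then assert on the worst frame in pytest.
# The frame budget and sample size below are assumptions.

FRAME_SAMPLER_JS = """
    const done = arguments[arguments.length - 1];  // async-script callback
    const frames = [];
    let last = performance.now();
    function tick(now) {
        frames.push(now - last);   // ms since the previous frame
        last = now;
        if (frames.length < 60) { requestAnimationFrame(tick); }
        else { done(frames); }
    }
    requestAnimationFrame(tick);
"""

def worst_frame_ms(frame_times):
    """Pure helper: the longest gap between consecutive frames, in ms."""
    return max(frame_times)

# In the actual test (needs a live driver):
# driver.set_script_timeout(10)
# frames = driver.execute_async_script(FRAME_SAMPLER_JS)
# assert worst_frame_ms(frames) < 100, "page is janky"
```

On a smooth 60 fps page each gap is around 16-17 ms, so a gap of hundreds of milliseconds is a reasonable proxy for the sluggish scrolling described above; you could scroll via ActionChains while the sampler runs to exercise the page.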
Thanks in advance!
Related
Currently I have a Next application with SSR using getInitialProps that takes too long to deliver the HTML based on the complexity of the app (I'm getting high Waiting for Server to respond times in Chrome in the network tab).
I want to figure out what is adding so much time (sometimes I get seconds), so I'm trying to:
Find out how long the server takes from the time it receives the GET request for the page to the time it sends the HTML
Get a clear picture of what is happening, and how long each part takes, during SSR, because at the moment it is a black box for me.
I tried suggested improvements: code splitting, lazy loading components, code improvements, etc.
I tried using the Server Timing API to measure the requests performed in getInitialProps, to narrow down part of the process, but it doesn't help with the rendering process and the other Next processes that might add to the response time.
I tried using the Node.js profiler for Chrome via NODE_OPTIONS='--inspect' next dev. This is the closest I got to what I wanted, but I can't tell where the server responds, or what each Activity corresponds to. Some documentation could be helpful.
I tried middleware. Not sure if I got something wrong, but I can't measure the time from start to finish.
One observation: other, simpler pages have faster response times, but regardless, the time they take is still extremely long (1-2 orders of magnitude).
I would like to know what the functions:IDLE entry means in "firefox devtools / performance / call tree", and how to interpret a long time spent in that function.
What could be my next steps to solve delay problems in functions:IDLE?
At https://developer.mozilla.org/en-US/docs/Tools/Performance/Call_Tree it says:
One thing to be aware of here is that idle time is classified as Gecko, so parts of your profile where your JavaScript isn't running will contribute Gecko samples. These aren't relevant to the performance of your site.
Importing webbrowser takes 30+ seconds to load, which really slows down the starting up of my program. I've tried setting my default browser to IE and Chrome, but still yielded the same result. Tried it in other machines too.
I'm running Python 3.6.4 (Windows 7 x64) with a fairly fast internet connection. I'm fairly new in python programming as well.
My questions will be:
What causes this slowdown? In YouTube videos I've watched, people import webbrowser and it seems instantaneous. What can I do about it?
I've desperately tried "cheating" my way around this by putting the import inside a button's handler function, so that it would not affect the startup of the program. It didn't work (kind of stupid, now that I think about it): startup still took 30+ seconds.
Another desperate measure I'm planning is to run the import in a background thread, so it can happen while the program starts up. I haven't done multithreading yet, so I still need to learn it. Would this work, though?
I don't know what other information I could share regarding this since I'm really lost here. Any advice would be much appreciated.
Edit: I made a simple py to time the execution of the code.
import timeit
start = timeit.default_timer()
import webbrowser
stop = timeit.default_timer()
print('Time: ', stop - start)
Output:
How are you so sure the slowdown is due to the import? It could be due to slow loading in your Chrome browser; try clearing Chrome's cache. Is your Chrome browser up to date and performing well? Also, please show me the code.
I want to write a script using Selenium WebDriver so that I can run it and leave it for a few hours, with the program checking every so often to see if the element I asked for it to search for has appeared.
I'd be using WebDriverWait in order to wait for the element to appear and be clickable.
WebDriverWait(browser, 10).until(EC.element_to_be_clickable((By.TAG_NAME, "tag_to_be_chosen")))
I've looked around but haven't found a solid answer yet, but the issue is that after a certain amount of time selenium throws a TimeoutException, because the element still hadn't appeared yet after the page had been loaded (which was expected).
Is there a way to extend the amount of time the WebDriver will wait for an element to appear indefinitely, so that as long as the window is open the driver keeps waiting, and as soon as the JavaScript runs the element is detected and clicked? Alternatively, what would be the cost of increasing the wait time? If I need it to wait for possibly hours at a time, I could obviously change the timeout from 10 seconds to, for example, 3600.
The docs talk about explicit and implicit waits, but most people only use these for seconds or minutes, not potentially hours.
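The cost of a long explicit wait is small, because WebDriverWait only polls (every 0.5 s by default, tunable via poll_frequency); the process mostly sleeps. One way to get a truly open-ended wait is a generic polling loop like the sketch below; wait_forever is my own name, and the Selenium usage in the comment is an assumption about your locator.

```python
# Sketch: poll a condition until it returns a truthy value, optionally
# forever (timeout_s=None), sleeping between attempts.
import time

def wait_forever(condition, poll_s=5.0, timeout_s=None):
    """Call `condition` repeatedly until it returns something truthy.
    With timeout_s=None this waits as long as the process lives, which
    is the 'indefinite' wait asked about above."""
    start = time.monotonic()
    while True:
        result = condition()
        if result:
            return result
        if timeout_s is not None and time.monotonic() - start > timeout_s:
            raise TimeoutError(f"condition not met after {timeout_s}s")
        time.sleep(poll_s)  # cheap: the process sleeps between polls

# With Selenium (hypothetical locator), something like:
# element = wait_forever(
#     lambda: next(iter(browser.find_elements(By.TAG_NAME, "tag_to_be_chosen")), None),
#     poll_s=5.0)
# element.click()
```

The Selenium-native equivalent is to keep the 10-second (or 600-second) WebDriverWait and wrap it in a while loop that catches TimeoutException and retries, which also gives you a natural place to check whether the window is still open.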
My ebook does animations driven by setTimeout. (I am using requestAnimationFrame when available, which it's not on older iPads running iOS 5). The book is composed of about 100 separate XHTML files to ensure page breaks occur exactly where they should, which is otherwise an iffy proposition in iBooks.
Immediately after opening the book, animations are very slow (e.g. one second per step rather than 50 ms), but after keeping the book open a while (a minute or so?), the animations run at the expected speed.
The reason, I found, is that iBooks is apparently pre-loading all the pages in the book (I suppose in order to get page numbers, or to speed up page turning). The pre-loading seems to be interfering with my animations, stealing setTimeout slots, as it were.
I had thought the problem might be the time required at load time to set up the animations on each document, but timed those and found it was just a few milliseconds per page. The problem may be a semi-large script (100K) on each of the 100+ pages, which I imagine iBooks is parsing over and over again as it preloads each page.
I have considered the option of including the large script dynamically when each page is viewed, but got stuck on figuring out how to tell when that is. We have no PageVisibility API in Safari, and the focus event does not fire on initial page load, so how do I tell when the page is actually being viewed, as opposed to stealthily pre-loaded in the background by iBooks?
My next attempt is going to be to shrink the number of individual XHTML pages down to 1 or a few, and take my chances with page-break-* and its ilk to handle page breaking.
What I need is a way to (1) tell iBooks either to not pre-load other pages in the book or (2) give my setTimeout requests priority over those queued up by iBooks for preloading pages or (3) know when a page is being displayed so I can inject the script at that point in time.
See also epub 3, how to prevent pages from running in background? (iBooks / Readium) and Finding out when page is being viewed in EPUB FXL via Javascript.