Browser slow

From genomewiki
Revision as of 18:45, 8 November 2021 by Hiram (talk | contribs)

Browser seems slow

There are various factors that can make the browser appear slow. Here are suggested ways to test browser performance.

Check network connection

Verify the browser address can be found with the unix command host, substituting the browser's host name for <browser host>:

$ host <browser host>
<browser host> is an alias for <canonical host name>.
<canonical host name> has address <IP address>
<canonical host name> has address <IP address>

Verify a network path is available to the browser: ping -c 3

$ ping -c 3 <browser host>
PING <canonical host name> (<IP address>) 56(84) bytes of data.
64 bytes from <canonical host name> (<IP address>): icmp_seq=1 ttl=43 time=170 ms
64 bytes from <canonical host name> (<IP address>): icmp_seq=2 ttl=43 time=170 ms
64 bytes from <canonical host name> (<IP address>): icmp_seq=3 ttl=43 time=170 ms

--- <browser host> ping statistics ---
3 packets transmitted, 3 received, 0% packet loss, time 2000ms
rtt min/avg/max/mdev = 170.361/170.370/170.378/0.007 ms
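The avg figure in the rtt summary line is the useful number to watch over time. A minimal sketch of pulling it out with awk — the sample line is the one shown above, and the parsing assumes the standard "rtt min/avg/max/mdev = A/B/C/D ms" format:

```shell
# Extract the average round-trip time (ms) from a ping rtt summary line.
# The sample line below is the rtt summary from the test above.
rtt_line='rtt min/avg/max/mdev = 170.361/170.370/170.378/0.007 ms'

# Field 4 of the line is "min/avg/max/mdev" as numbers; the second
# "/"-separated value is the average.
avg_ms=$(printf '%s\n' "$rtt_line" | awk '{split($4, t, "/"); print t[2]}')
echo "average rtt: ${avg_ms} ms"
```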

Verify browser alive and well

 time wget --user-agent=browserAliveTest -O- 'https://<browser host>/cgi-bin/hgTracks' 2>&1 | grep Overall
<span class='timing'>Overall total time: 526 millis<br /></span>

real    0m1.245s
user    0m0.021s
sys     0m0.019s

Note that the "Overall total time: 526 millis" is the total run time of the hgTracks CGI binary.

The "real 0m1.245s" is the total run time of the wget command, including the transmission of the html text from the hgTracks operation. The same test can be run over the http connection to see if it is any different:

time wget --user-agent=browserAliveTest -O- 'http://<browser host>/cgi-bin/hgTracks' 2>&1 | grep Overall
<span class='timing'>Overall total time: 439 millis<br /></span>

real    0m0.959s
user    0m0.003s
sys     0m0.012s
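A single measurement can be misleading, so it is worth running the aliveness test several times and averaging the reported times. A minimal sketch that averages a set of "Overall total time" lines with awk — the three sample values here are hypothetical stand-ins for the grep output of repeated wget runs:

```shell
# Average several hgTracks "Overall total time" measurements.
# These sample lines stand in for the output of repeated runs of:
#   wget ... 2>&1 | grep Overall
samples="<span class='timing'>Overall total time: 526 millis<br /></span>
<span class='timing'>Overall total time: 439 millis<br /></span>
<span class='timing'>Overall total time: 463 millis<br /></span>"

# Whitespace-splitting puts the millis value in field 5 of each line.
avg=$(printf '%s\n' "$samples" | awk '{sum += $5; n += 1} END {printf "%.1f", sum/n}')
echo "average: ${avg} millis over 3 runs"
```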

Verify browser performance is similar to web browser access

The simple wget test above transmits only the single top-level html content of the page returned from hgTracks. Your web browser will use references within that page to request more files from the browser server. To simulate a complete transfer of all files in the browser page, with no cache involved, go to a work directory where the wget command can create a hierarchy of files. For example:

$ cd /dev/shm
$ time wget --user-agent=browserFullPageTest \
-o browser.wget.log \
--wait=0 \
--execute="robots=off" \
--no-cookies \
--timestamping \
--level=1 \
--convert-links \
--no-parent \
--page-requisites \
--adjust-extension \
--max-redirect=0 \
'https://<browser host>/cgi-bin/hgTracks'

real    0m9.962s
user    0m0.074s
sys     0m0.040s

$ grep Overall <browser host>/cgi-bin/hgTracks*.html
<span class='timing'>Overall total time: 463 millis<br /></span>

Again, the "Overall total time: 463 millis" is the total run time of the hgTracks CGI binary, and the "real 0m9.962s" is the total time for the wget command to transfer everything. You will find a hierarchy of files in a newly created directory ./<browser host>/
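The millis value itself can be pulled out of the saved page, which is handy for logging repeated runs. A minimal sketch with sed — the temp file here is a stand-in for the .html page that the wget run saved:

```shell
# Pull the numeric millis value out of a saved hgTracks page.
# Create a small stand-in file containing the timing span from above;
# with a real run, point page= at the .html file wget saved.
page=$(mktemp)
echo "<span class='timing'>Overall total time: 463 millis<br /></span>" > "$page"

# sed keeps only the digits between "Overall total time: " and " millis".
millis=$(sed -n "s/.*Overall total time: \([0-9]*\) millis.*/\1/p" "$page")
echo "hgTracks run time: ${millis} millis"
rm -f "$page"
```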

If this command is repeated, with the previous result still present in ./<browser host>/, it simulates (imperfectly) a web browser operation where the static content remains in cache and does not need to be transferred again. The simulation is not perfect because wget performs a HEAD operation on each file to decide whether its cached copy is still current, while a web browser has other means of caching (for example, Expires and Cache-Control headers) and performs such revalidation checks only occasionally.
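One way to see how much a repeat run actually re-transferred is to count the downloads recorded in the wget log, since wget writes a "Saving to:" line for every file it transfers. A minimal sketch — the two-line log excerpt here is a hypothetical stand-in for a real browser.wget.log:

```shell
# Count files actually downloaded, according to the wget log.
# This excerpt is a hypothetical stand-in for browser.wget.log; the
# "not retrieving" line is what wget prints for an up-to-date cached file.
log=$(mktemp)
cat > "$log" <<'EOF'
Saving to: 'hgTracks.html'
Server file no newer than local file -- not retrieving.
EOF

# Each "Saving to:" line corresponds to one transferred file.
downloaded=$(grep -c "^Saving to:" "$log")
echo "files transferred: ${downloaded}"
rm -f "$log"
```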