
Revision as of 18:45, 8 November 2021

Browser seems slow

Several factors can make the browser appear slow. Here are suggested ways to test browser performance.

Check network connection

Verify the browser address can be found with the unix command: host genome.ucsc.edu

$ host genome.ucsc.edu
genome.ucsc.edu is an alias for genome.soe.ucsc.edu.
genome.soe.ucsc.edu has address 128.114.119.131
genome.soe.ucsc.edu has address 128.114.119.132

Verify network path is available to the browser: ping -c 3 genome.ucsc.edu

$ ping -c 3 genome.ucsc.edu
PING genome.soe.ucsc.edu (128.114.119.131) 56(84) bytes of data.
64 bytes from hgw1.soe.ucsc.edu (128.114.119.131): icmp_seq=1 ttl=43 time=170 ms
64 bytes from hgw1.soe.ucsc.edu (128.114.119.131): icmp_seq=2 ttl=43 time=170 ms
64 bytes from hgw1.soe.ucsc.edu (128.114.119.131): icmp_seq=3 ttl=43 time=170 ms

--- genome.soe.ucsc.edu ping statistics ---
3 packets transmitted, 3 received, 0% packet loss, time 2000ms
rtt min/avg/max/mdev = 170.361/170.370/170.378/0.007 ms
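The rtt summary line can be parsed in shell to pull out the average round-trip time, for example with awk. A minimal sketch, using the sample line from the transcript above; the 150 ms threshold is an arbitrary illustration, not a recommended value:

```shell
# Extract the average round-trip time from the ping summary line.
# The sample line is copied from the transcript above.
rtt_line='rtt min/avg/max/mdev = 170.361/170.370/170.378/0.007 ms'
avg=$(printf '%s\n' "$rtt_line" | awk -F' = ' '{print $2}' | awk -F'/' '{print $2}')
echo "average rtt: ${avg} ms"
# awk handles the floating-point comparison portably; 150 ms is an
# arbitrary example threshold
awk -v a="$avg" 'BEGIN { exit !(a > 150) }' && echo "latency looks high"
```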

verify browser alive and well

 time wget --user-agent=browserAliveTest -O- https://genome.ucsc.edu/cgi-bin/hgTracks?measureTiming=1 2>&1 | grep Overall
<span class='timing'>Overall total time: 526 millis<br /></span>

real    0m1.245s
user    0m0.021s
sys     0m0.019s

Note that the Overall total time: 526 millis is the total run time of the hgTracks CGI binary.

The real 0m1.245s is the total run time of the wget command, including transmission of the html text from the hgTracks operation. This test can also be run against the http connection to see if it is any different:
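The gap between those two numbers approximates the network and transfer overhead. A quick sketch of the arithmetic, using the figures from the transcript above:

```shell
# Figures copied from the transcript above: the CGI reported 526 ms,
# wget's wall clock was 1.245 s. The difference approximates network
# plus transfer overhead.
overall_ms=526
real_s=1.245
awk -v o="$overall_ms" -v r="$real_s" \
    'BEGIN { printf "network/transfer overhead: %.0f ms\n", r * 1000 - o }'
```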

time wget --user-agent=browserAliveTest -O- http://genome.ucsc.edu/cgi-bin/hgTracks?measureTiming=1 2>&1 | grep Overall
<span class='timing'>Overall total time: 439 millis<br /></span>

real    0m0.959s
user    0m0.003s
sys     0m0.012s
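If curl is available, its --write-out timing variables give a finer breakdown of where the time goes (DNS lookup, TCP connect, TLS handshake, first byte). This is an alternative sketch, not part of the original test:

```shell
# Sketch of a finer-grained timing check with curl's --write-out variables.
# Requires curl; the URL is the same one used in the wget tests above.
curl -s -o /dev/null -w 'namelookup: %{time_namelookup}s
connect: %{time_connect}s
appconnect: %{time_appconnect}s
starttransfer: %{time_starttransfer}s
total: %{time_total}s
' 'https://genome.ucsc.edu/cgi-bin/hgTracks?measureTiming=1'
```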

verify browser performance similar to web browser access

The simple wget test above transfers only the single top-level html page returned by hgTracks. Your web browser uses references within that page to request additional files from the browser. To simulate a complete transfer of all files in the browser page, with no cache involved, go to a work directory where the wget command can create a hierarchy of files. For example:

$ cd /dev/shm
$ time wget --user-agent=browserFullPageTest \
-o browser.wget.log \
--wait=0 \
--execute="robots=off" \
--no-cookies \
--timestamping \
--level=1 \
--convert-links \
--no-parent \
--page-requisites \
--adjust-extension \
--max-redirect=0 \
"https://genome.ucsc.edu/cgi-bin/hgTracks?measureTiming=1"

real    0m9.962s
user    0m0.074s
sys     0m0.040s

$ grep Overall genome.ucsc.edu/cgi-bin/*.html
<span class='timing'>Overall total time: 463 millis<br /></span>

Again, the Overall total time: 463 millis is the total run time of the hgTracks CGI binary, and the real 0m9.962s is the total time for the wget command to transfer everything. You will find a hierarchy of files in a newly created directory ./genome.ucsc.edu.

If this command is repeated with the previous result available in ./genome.ucsc.edu, it will simulate (not perfectly) a web browser operation where the static content remains in cache and doesn't need to be transferred again. The simulation isn't perfect because this wget command issues a HEAD request on each file it wants to check for caching, to determine whether its local copy is still current. A web browser has other means of caching and issues such requests only occasionally.
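Comparing the wall-clock times of a first (cold) run and a repeated (warm) run quantifies the caching benefit. A sketch of the arithmetic; the 9.962 s cold-run figure is from the transcript above, while the warm-run figure is a hypothetical example, not a measured value:

```shell
# Compare a cold run against a repeated (warm) run of the full-page test.
# cold_s is from the transcript above; warm_s is hypothetical.
cold_s=9.962
warm_s=2.3
awk -v c="$cold_s" -v w="$warm_s" \
    'BEGIN { printf "repeat run saved %.1f s (%.0f%% of the cold run)\n", c - w, (c - w) / c * 100 }'
```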