I am having a little trouble digesting the numbers I see here from these tests. I am running an AMD Phenom II X6 1100T @ 3.31GHz, 16GB of DDR3-1600 RAM, two XFX Radeon 5770 1GB GPUs in CrossFire mode, and Windows 7 Ultimate 64-bit. I am using a residential Time Warner internet connection averaging 25 Mbps down and 5 Mbps up.
From the two web browsers I use daily (Chrome 15 and IE9), these are the results I consistently got after three runs of each benchmark test:
Chrome 15: Run1=251.8ms; Run2=253.2ms; Run3=251.2ms
IE9: Run1=3,824.2ms; Run2=3,865.0ms; Run3=3,850.7ms
V8 Benchmark Suite – Version 6
Chrome 15: Run1=8,796; Run2=9,660; Run3=9,325
IE9: Run1=90.3; Run2=102; Run3=75.7 (OMG THAT WAS A PAINFUL TEST!!!)
Chrome 15: Run1=3,577.7ms; Run2=3,555.5ms; Run3=3,526.2ms
IE9: Run1=Gave up; I had to keep pressing the “No” button every time it asked if I wanted it to stop running the script. Lost count after 58 presses of the “No” button. *cries*
Now granted, there are some hardware differences, and I’m sure an internet connection difference too, but I don’t quite see how there is such a HUGE difference between the scores being produced. Hopefully someone out there can educate me on this; maybe I did something wrong that gave me such different scores.
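For what it’s worth, suites like these mostly time CPU-bound JavaScript, so hardware and internet speed barely matter — the score is almost entirely the browser’s JS engine. Here’s a minimal sketch (my own illustrative workload and iteration count, not taken from any of the actual suites) of how such a benchmark measures a browser:

```javascript
// Minimal sketch of how a JS benchmark suite works: run a CPU-bound
// function many times and report the elapsed wall-clock milliseconds.
// fib() and the run count are illustrative choices, not from any real suite.
function fib(n) {
  return n < 2 ? n : fib(n - 1) + fib(n - 2);
}

function benchmark(fn, arg, runs) {
  var start = Date.now();
  for (var i = 0; i < runs; i++) {
    fn(arg);
  }
  return Date.now() - start; // elapsed time in ms
}

var elapsed = benchmark(fib, 20, 100);
console.log("100 runs of fib(20) took " + elapsed + " ms");
```

Chrome 15’s V8 engine JIT-compiles this kind of loop to native code, while IE9’s Chakra was far slower on exactly this sort of tight recursive workload, which is why the same machine can produce wildly different numbers with no hardware involved.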
Fortunately for me, I couldn’t care less either way how fast or slow the browser is; as long as I can get to my military accounts, shop online, and check email, I’m good. All my other internet use goes to gaming and VoIP.