# Woot! Over 4GHz :D



## FF_CCSa1F (Jan 24, 2008)

After a long, long exile in low-performance land (my 7900GT died a horrible death, taking the mobo it was mounted in and the RAM with it), I've finally been able to get some clocks again!

After over one and a half years, I've finally been able to afford a new graphics card, RAM and mobo to replace my aging, low-performing GeForce 6600 AGP and ASRock mobo. I've gone from DDR400 Kingston KVR (that won't clock more than 10MHz), a GeForce 6600 AGP and a _SHIT_ mobo to DDR800 XMS2 (which will clock 200+MHz), a GeForce 7600GS (yeah, I know...) and a Gigabyte P35 motherboard. From being stuck with a maximum overclock of 3.3GHz to 4.20GHz.

It's not the best system, but dammit, there's still some juice in this aging P4 630!







This'll keep me going for another while. On top of that, a mate has agreed to DONATE his old E6600 to me, and I'm getting my uncle's old laptop to replace my beaten 466MHz one! This year has certainly gotten off to a good start on the computer side!

/Joyous outburst


----------



## yak (Jan 24, 2008)

You know, I never understood you overclocking type people.

What difference does it make to you if some task X takes a whopping 1.6 seconds longer on a normal clock as opposed to when the CPU is pushed to the limits?
Let me try and guess the reason why your original hardware burned down...


----------



## Ceceil Felias (Jan 24, 2008)

Except that clockspeed is no longer an accurate measurement of performance in this day and age...


----------



## FF_CCSa1F (Jan 25, 2008)

yak said:

> You know, I never understood you overclocking type people.
> 
> What difference does it make to you if some task X takes a whopping 1.6 seconds longer on a normal clock as opposed to when the CPU is pushed to the limits?
> Let me try and guess the reason why your original hardware burned down...


Well, the fact is that this system performs about 40% faster after that 40% overclock. SuperPI there on the left would score about 50 seconds at stock clocks. So I'd say the gain is a bit more than that...

And my old hardware burned down due to a manufacturing defect on the graphics card. Never buy MSI; they wouldn't even RMA it.



			
Ceceil Felias said:

> Except that clockspeed is no longer an accurate measurement of performance in this day and age...


I know, but there's still something magical about 4GHz.

And this is about the best performance you can get out of any single-core these days.


----------



## CyberFoxx (Jan 25, 2008)

Yes, but what temperature does it idle at? Personally, the fact that my Celeron D 360 (3.46GHz) idles at 23C is quite nice, and the 45C it hits at full load is even better. (Actually, right now as it's compiling OpenOffice, it's hovering in the 41-42C range.) I'll take fast/stable/cool over fast/semi-stable/fry-an-egg-hot any day. Then again, it's not like I have a choice to overclock, stupid MSI removing the BIOS menu options on this model of motherboard. Although, it would be cool to see what I could get this sucker up to...


----------



## FF_CCSa1F (Jan 25, 2008)

CyberFoxx said:

> Yes, but what temperature does it idle at? Personally, the fact that my Celeron D 360 (3.46GHz) idles at 23C is quite nice, and the 45C it hits at full load is even better. (Actually, right now as it's compiling OpenOffice, it's hovering in the 41-42C range.) I'll take fast/stable/cool over fast/semi-stable/fry-an-egg-hot any day. Then again, it's not like I have a choice to overclock, stupid MSI removing the BIOS menu options on this model of motherboard. Although, it would be cool to see what I could get this sucker up to...



Well, with the fan at ~700RPM (About 60%), it idles at about 40-45C. If I crank the fan up to the max (2500RPM), it idles at about 28-29C. Full load gives it about 60-65C with the fan on 700RPM, and about 40 on max. It's really a lot cooler than I expected.

Doesn't really matter anyhow, since this CPU runs perfectly fine at 80-90C as long as C1E and thermal throttling are disabled. No performance loss at all until it hits 90. Then at 95 it slows down due to the on-die throttling, and at 100 it shuts down.


----------



## yak (Jan 25, 2008)

Alright, but my question still stands - 

What difference does it make to you if some task X takes a whopping 1.6 seconds longer on a normal clock as opposed to when the CPU is pushed to the limits?


----------



## FF_CCSa1F (Jan 25, 2008)

yak said:

> Alright, but my question still stands -
> 
> What difference does it make to you if some task X takes a whopping 1.6 seconds longer on a normal clock as opposed to when the CPU is pushed to the limits?



That doesn't make any difference. A 40% performance increase does. Minimal overclocks of under 10% don't usually make much of a difference on newer processors, but if you can push past 10%, you get a real boost. I've clocked my CPU exactly 40%, from 3.0GHz to 4.2GHz. The processor I have is a Prescott-core Pentium 4, a chip known to be very linear in its clock/performance ratio. Among Intel CPUs (speaking in averages here), the Prescott core is the most linear overclocker: clock it 10% and you get a 10% gain. Other CPUs behave differently. Some barely gain anything from overclocking at all; some may give only a 5-10% performance increase on a 20% clock, while others give a 20% gain on a 10% clock. It really comes down to the architecture.
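The scaling arithmetic in this post can be sketched in a few lines. The clock figures and the ~50 s stock SuperPi time come from this thread; perfectly linear scaling is the Prescott claim above, not a general rule:

```python
# Back-of-envelope overclocking math, assuming the linear
# clock/performance scaling claimed for Prescott-core P4s.

stock_ghz = 3.0   # stock P4 630 clock
oc_ghz = 4.2      # overclocked

overclock_pct = (oc_ghz - stock_ghz) / stock_ghz * 100
print(f"Overclock: {overclock_pct:.0f}%")  # Overclock: 40%

# With linear scaling, a ~50 s stock SuperPi run should drop to:
stock_superpi_s = 50.0
expected_s = stock_superpi_s / (oc_ghz / stock_ghz)
print(f"Expected SuperPi: {expected_s:.1f} s")  # Expected SuperPi: 35.7 s
```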

The majority of processors out there won't go more than 20 or 30% over stock. The Prescotts and G0-revision C2Ds, though, can easily pull off 40+%, and they're very effective in frequency/performance terms. Especially the C2Ds.

In short: I get 40% more power out of my computer now, without buying a new processor. Do you see why people overclock their stuff now? It's simply a huge performance increase, for free. If you know how to do it, there's nothing to lose. Sure, if you clock your CPU up by 20MHz and don't bother raising the Vcore, it'll seem useless, but if you do it right, the gains are immense.

Also, where did you get the 1.6 seconds thing from?


----------



## Aden (Jan 25, 2008)

yak said:

> What difference does it make to you if some task X takes a whopping 1.6 seconds longer on a normal clock as opposed to when the CPU is pushed to the limits?



Hey, when I'm rendering a movie (or even just a complex still), I'll take all the speed I can get.


----------



## feilen (Jan 25, 2008)

Wait, did you use a program to overclock it? I want one!


----------



## yak (Jan 25, 2008)

*laughs* Guess I'm oldschool like that. I do see your points, however; I've just never been in a situation where CPU speed was crucial for a certain task.


----------



## Kougar (Jan 25, 2008)

Freezer, nice overclock! Gigabyte makes some solid P35 motherboards, you should have some fun with that E6600... 



			
yak said:

> You know, I never understood you overclocking type people.
> 
> What difference does it make to you if some task X takes a whopping 1.6 seconds longer on a normal clock as opposed to when the CPU is pushed to the limits?
> Let me try and guess the reason why your original hardware burned down...



Well, there are actually quite a few reasons I overclock. To neatly summarize: it's a fun hobby, and there are real, tangible performance benefits. Some overclock for the thrill of breaking records. I prefer to take a low-end CPU and turn it into something faster than the $999 Extreme CPUs. A perceptive user will notice a 50% overclock, but anyone should be able to tell the difference with a 100% overclock. With Core 2 Duo CPUs there is just an absurd amount of headroom that is there but never gets used. It's like taking a car engine and putting a limiter on it at 55mph.

Overclocking a $170 E6550 from 2.33GHz to 3.0GHz turns it into a processor that is 100% identical to an E6850. In many cases you would not even need to raise the voltage beyond the 1.325v Intel rates their 65nm Core 2 Duos at... assuming you need to raise it at all, which is most likely the case. There is no physical difference between them except bin grade and price.

Overclocking has a misconception attached to it: that severe voltages must be used, heat must be generated, and components must be burnt out. My personal definition of overclocking is anything but... it includes undervolting at stock speeds (which greatly extends battery life in notebooks, for example), overclocking at stock voltages, and in general overclocking so the hardware lasts, is 100% stable and reliable, and has no hotspots that will lead to premature failure. I've overclocked an E6300 from 1.86GHz to 3.8GHz, better than a 100% overclock, using voltages that could be run 24/7 and that didn't push the processor beyond warm, although I have watercooling to thank for the low temps on that one. My first self-built computer is still purring along fine in the back room: it has had a 2.8GHz Northwood running at 3.2-3.4GHz for the last 4-5 years, and the entire system still works fine. It runs 24/7 at 3.2GHz on stock volts now.

Take my Q6600... stock voltage is 1.200v. At *stock* voltage it runs up to 3.3GHz without errors, almost a full free 1GHz. And for the power-conscious overclocker: I left it at the stock 2.4GHz and undervolted it to 1.15v, and again it ran just fine, without errors or instability, but power draw from the wall dropped nicely and it obviously ran cooler. I run it at 3.2GHz on stock volts. At that speed CoreTemp reports all cores below 50C... the socket temp itself is 38C. Those are not hot at all for full system load, since this is also my 24/7 folding@home box.

If you think the performance gains are a mere 1.6 seconds, you should compare benchmark results between, say, an E6600 and an E6850, or a Q6600 and a QX6850. Or an E6300 and an E6850, even though there's a cache-size difference in that comparison. :wink:


----------



## FF_CCSa1F (Jan 25, 2008)

feilen said:

> Wait, did you use a program to overclock it? I want one!



No, I went the classic and best way: through the CMOS setup. Overclocking programs are shit in most cases, really.

BTW, I'm running it at 4.3GHz now, and I managed to push it up to 4.41GHz, but that was rather unstable.


----------



## Janglur (Jan 26, 2008)

Congrats.  Enjoy your insane power bill.


----------



## FF_CCSa1F (Jan 26, 2008)

Janglur said:

> Congrats.  Enjoy your insane power bill.



Lol.

I think mom's 32" CRT TV that she leaves on 24/7 actually uses more power than my computer. :roll:


----------



## Janglur (Jan 26, 2008)

Actually, unlikely!


Average 32" CRT consumes 90 watts.


3.6 GHz P4 560 = 98 watts
GeForce 6600 = 76 watts
7.2k HDD:  8 watts
2 GB 800 MHz DDR2 = 3 watts
800 MHz FSB northbridge = ~35 watts

Total:  220 watts, give or take 20%

The CPU would use more due to the OC.  The NB might use less, depending on your MB's chipset.


220 watts, assuming it's only on 12 hours a day, at Colorado power costs, some of the lowest in the country (9.1 cents/kWh):
$7.21/month
$86.49/year
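Janglur's figures check out; the same arithmetic as a quick sketch (30-day month, numbers as given in the post):

```python
# Reproducing the power-cost estimate above.
watts = 220
hours_per_day = 12
rate_per_kwh = 0.091  # 9.1 cents/kWh, the quoted Colorado rate

kwh_per_day = watts / 1000 * hours_per_day   # 2.64 kWh/day
monthly = kwh_per_day * 30 * rate_per_kwh    # 30-day month
yearly = monthly * 12

print(f"${monthly:.2f}/month")  # $7.21/month
print(f"${yearly:.2f}/year")    # $86.49/year
```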


----------



## net-cat (Jan 26, 2008)

Interesting program. I just tried it on my laptop, which is not overclocked and runs at 1.66 GHz. It finished in 36.016 seconds. I'll have to try it on my desktop when I get home.


----------



## FF_CCSa1F (Jan 26, 2008)

Janglur said:

> Actually, unlikely!
> 
> 
> Average 32" CRT consumes 90 watts.
> ...



:O You got me there. I admit defeat =P 

Meh, it's not me paying the bills anyhow.


----------



## Janglur (Jan 26, 2008)

I realized a couple years ago how dumb I am. I get such insanely overpowered PCs for gaming, and then they spend 90% of their time turned on just playing music.

Now I use two PCs.
One is my gaming rig, and is set to really aggressive power saving:  Monitor off in 3 mins, HDDs spin down in 10, standby in 15, and hibernate in 30.
The other I can leave on 24/7 (for my music) and I designed it just for that:  It uses a maximum of 10 watts, and averages only 6 watts when playing music.


----------



## FF_CCSa1F (Jan 26, 2008)

Janglur said:

> I realized a couple years ago how dumb I am.  I get such insanely overpowered PCs for gaming, then spend 90% of their time turned on just playing music.
> 
> Now I use two PCs.
> One is my gaming rig, and is set to really aggressive power saving:  Monitor off in 3 mins, HDDs spin down in 10, standby in 15, and hibernate in 30.
> The other I can leave on 24/7 (for my music) and I designed it just for that:  It uses a maximum of 10 watts, and averages only 6 watts when playing music.



I have two main PCs too: one beside my bed and the one in the OP. Both are on 24/7, though. The bedside one is getting replaced by a lappy in a couple of weeks.


----------



## Janglur (Jan 26, 2008)

Upgrading to LCD from CRT typically saves 10-60 watts right there.

Setting your monitor to turn off rather than running a screensaver will also save power. Screensavers make the CPU work, which draws more power; with monitor-off the CPU stays idle and the screen draws nothing.

Aggressive standby also saves a lot of power, in exchange for a 3-10 second 'warm up' wait when resuming from standby.

Finally, Hibernate (not recommended for people with 2 GB or more of RAM) effectively turns the PC off. In exchange for the time it takes to power down, it will 'reboot' and reopen applications/programs/etc. right where it left off. This saves the most power.

I heavily recommend at least standby for overnight, when the PC is on but not in use.


By doing this your PC can use 50% less power or more.  Mine is about 60% less expensive to own.  =3
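As a rough sketch of that claim: the 220 W awake draw is Janglur's earlier figure, while the ~5 W sleep draw and 6 h/day of active use are assumptions made up for illustration:

```python
# Hypothetical comparison: always-on vs. aggressive power management.
rate = 0.091                # $/kWh, same Colorado rate quoted earlier
awake_w, sleep_w = 220, 5   # sleep_w is an assumed standby/hibernate draw
active_h = 6                # assumed hours of real use per day

always_on = awake_w / 1000 * 24 * 365 * rate
managed = (awake_w / 1000 * active_h
           + sleep_w / 1000 * (24 - active_h)) * 365 * rate

print(f"Always on: ${always_on:.0f}/yr, managed: ${managed:.0f}/yr")
print(f"Savings: {(1 - managed / always_on) * 100:.0f}%")
```

With these made-up duty-cycle numbers the savings land above the 50-60% range claimed, so the claim is at least directionally plausible.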


----------



## Aden (Jan 27, 2008)

*Janglur*: Why is hibernate not recommended if you have 2GB RAM or more?


----------



## Rhainor (Jan 27, 2008)

Aden said:

> *Janglur*: Why is hibernate not recommended if you have 2GB RAM or more?



When the computer goes into Hibernate mode, it saves the entire contents of RAM to a special file on the hard drive.  When it boots up, it checks for that file and, if found, loads it back into RAM and resumes execution of all the processes and programs that were running before Hibernation.

With 2 gigs of RAM, it can take a good while to save all that to disk, and to load it all back in on restart.
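A back-of-envelope estimate of that delay, assuming a 2008-era desktop disk sustaining roughly 50 MB/s of sequential writes (the disk speed is an assumption, not a figure from this thread):

```python
# Estimating how long writing a full hibernation file takes.
ram_mb = 2 * 1024       # 2 GB of RAM to dump to disk
write_mb_s = 50         # assumed sustained sequential write speed

seconds = ram_mb / write_mb_s
print(f"~{seconds:.0f} s to write the hibernation file")  # ~41 s
```

Roughly the same cost is paid again reading the file back at boot.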


----------



## net-cat (Jan 27, 2008)

Actually, without KB909095, hibernate generally won't work in XP if you have more than 1GB of RAM.

That's been my experience, anyway.


----------



## FF_CCSa1F (Jan 27, 2008)

Janglur said:

> Upgrading to LCD from CRT typically saves 10-60 watts right there.
> 
> Setting your monitor to turn off rather than screensaver will also save power.  Screensavers make the CPU work, using more power.  Monitor-off will save power, and no screensaver keeps the CPU more idle, using less power.
> 
> ...



Wow, you sure are energy efficient. I'm not  I only use LCDs, though.


----------



## Janglur (Jan 27, 2008)

Rhainor is close, but not quite.


If you have more than 2 GB of RAM, the contents of RAM are often heavily fragmented upon hibernation. It will usually hibernate fine the first time, but by the second to fourth time the RAM and the HDD dump are so fragmented the OS can't handle it, and hibernation may fail.

The trigger is >2 GB. I.e., if you have 1 GB of RAM and a 512 MB video card, you're fine. But if you exceed 2 GB (RAM + video) it will give you problems.


----------



## net-cat (Jan 27, 2008)

Actually, Microsoft posted a fix for that. I linked it above.


----------



## Janglur (Jan 27, 2008)

Actually, I downloaded that fix and it doesn't work.

In fact, no one has made it work anywhere that I know of. It's a non-fixing fix.


----------



## net-cat (Jan 28, 2008)

That's really odd. I downloaded it and it worked fine. :/


----------

