I got my new computer, with an unnecessary 660 and an i5 3570K. It runs at 32 degrees at rest and 50-60 playing Mario Kart Wii. Is it OK to overclock it to 4 GHz? How would I do it on a Gigabyte Z77X-UD3H?
I think this belongs on an OC forum, but can I get it to stay at 1.6 GHz at idle and ramp up to 4 GHz under load, if the heat is OK?
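For what it's worth, that idle/ramp behavior is exactly what Intel's SpeedStep (EIST) and Turbo Boost already do by default, and you can sanity-check that it is working from software. Here is a minimal sketch, assuming Python with the psutil package installed (pip install psutil); note that psutil.cpu_freq() can return None or report min/max as 0 on some platforms.

import time
import psutil

def report(label):
    f = psutil.cpu_freq()  # namedtuple with current/min/max clocks in MHz
    print(f"{label}: {f.current:.0f} MHz (range {f.min:.0f}-{f.max:.0f} MHz)")

report("idle")

# Busy-loop one core for ~5 seconds; even a single loaded core should be
# enough to make the clock ramp up from its idle state.
end = time.time() + 5
x = 0
while time.time() < end:
    x = (x * 31 + 7) % 1000003

report("under load")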
Generally you shouldn't OC on the stock cooler. It might be OK for Dolphin, but Dolphin only uses two cores, so it's nowhere near as stressful as something like a video encoder. It will probably be OK, but I would still advise that you get an aftermarket cooler first. (A crude way to see the difference is sketched below.)
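To see what a true all-core load does to your temperatures compared to Dolphin's two threads, here is a crude load-generator sketch; it assumes Python is installed, and it is only an illustration, since serious stability testing is normally done with dedicated tools like Prime95.

import multiprocessing as mp

def burn():
    # Tight integer loop to keep one core pinned at 100%.
    x = 0
    while True:
        x = (x * 31 + 7) % 1000003

if __name__ == "__main__":
    # One worker per logical core; daemon processes die when the script exits.
    procs = [mp.Process(target=burn, daemon=True) for _ in range(mp.cpu_count())]
    for p in procs:
        p.start()
    input("All cores loaded; watch your temps, then press Enter to stop.")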
Like NaturalViolence said, you can OC the 3570K with the stock cooler, but if you already hit the 60°C mark at 3.4GHz, you shouldn't do it.
60°C is my personal number to say "no, that's too hot." I also read that you can damage your CPU when it always runs at 60~70°C (probably the range you will get with the stock cooler), but I can't confirm that, because my old AMD was in that range too and had no problems (I wasn't really aware of the temperature back then).
So: yes, you can, but it's not healthy for your CPU in the long run.
Quote:Like NaturalViolence said, you can OC the 3570K with the stock cooler, but if you already hit the 60°C mark at 3.4GHz, you shouldn't do it.
Since it's a desktop, his CPU is turboing to 3.7GHz.
Quote:60°C is my personal number to say "no, that's too hot."
For a modern CPU that's ridiculously low. Anything below 90°C is probably OK even for long-term use. When OCing I try to keep it below 75°C or 80°C depending on the situation.
Quote:I also read that you can damage your CPU when it always runs at 60~70°C (probably the range you will get with the stock cooler), but I can't confirm that, because my old AMD was in that range too and had no problems (I wasn't really aware of the temperature back then).
You realize that's a completely different architecture with different physical properties, right?
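If you want to see where you actually sit relative to numbers like these, here is a minimal readout sketch, assuming a Linux machine with Python and psutil 5.1 or newer; sensors_temperatures() is not available on Windows, where you would use a tool like CoreTemp or RealTemp instead.

import psutil

# sensors_temperatures() maps driver names to lists of readings;
# 'coretemp' is the Intel driver on Linux.
for name, entries in psutil.sensors_temperatures().items():
    for t in entries:
        print(f"{name} {t.label or '?'}: {t.current:.0f}C "
              f"(high={t.high}, critical={t.critical})")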
(02-16-2013, 11:32 AM)NaturalViolence Wrote:1# Since it's a desktop, his CPU is turboing to 3.7GHz.
2# For a modern CPU that's ridiculously low. Anything below 90°C is probably OK even for long-term use. When OCing I try to keep it below 75°C or 80°C depending on the situation.
3# You realize that's a completely different architecture with different physical properties, right?
1# I know, I have this CPU myself, so I know what it does, but thanks for reminding me?
2# 90°C is ridiculously high if you ask me, hell, even 75~80°C would be a very high number. Maybe it depends on the human sitting in front of the computer, but it's safe to say that 60°C or lower is the best temperature to aim for. I wouldn't let my computer run at 80°C all the time (PERSONAL OPINION!!!)
3# Yep, I'm pretty much aware of this. But still, you do realize that there is a little statement which clearly says "but I can't confirm that"?

Is it so hard to post multiple quotes?
RaZZZa Wrote:1# I know, I have this CPU myself, so I know what it does, but thanks for reminding me?
Well, you said "60°C mark at 3.4GHz" when it should be "60°C mark at 3.7GHz". 3.7GHz is a lot closer to 4.0GHz, and with the superlinear power consumption curves we're dealing with here that makes a difference. Going from 3.7 to 4.0GHz is not going to increase temperature much; going from 3.4 to 4.0GHz will.
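To put rough numbers on that, here is a back-of-the-envelope sketch using the standard dynamic power relation P ≈ C·V²·f. The voltages are made-up placeholders for illustration, not Ivy Bridge specs.

def relative_power(v, f, v0, f0):
    """Dynamic power relative to a (v0, f0) baseline: P ~ C * V^2 * f."""
    return (v / v0) ** 2 * (f / f0)

# Placeholder voltage/clock pairs (assumed, not measured):
# base 1.05 V @ 3.4 GHz, turbo 1.10 V @ 3.7 GHz, OC 1.15 V @ 4.0 GHz.
print(f"3.4 -> 4.0 GHz: {relative_power(1.15, 4.0, 1.05, 3.4):.2f}x power")
print(f"3.7 -> 4.0 GHz: {relative_power(1.15, 4.0, 1.10, 3.7):.2f}x power")

With these assumed voltages, the jump from 3.4GHz costs about 41% more power at 4.0GHz, versus about 18% from the 3.7GHz turbo point, which is the point being made.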
RaZZZa Wrote:2# 90°C is ridiculously high if you ask me, hell, even 75~80°C would be a very high number. Maybe it depends on the human sitting in front of the computer, but it's safe to say that 60°C or lower is the best temperature to aim for. I wouldn't let my computer run at 80°C all the time (PERSONAL OPINION!!!)
Ugh. This is not a personal opinion. It's physics. Smaller transistors operate at lower voltages, consume less power, and can tolerate higher temperatures. By changing the transistor structure, manufacturing process, clock rate, voltage, etc., you can radically shift the optimal temperature range for an architecture in either direction. Some architectures are designed to run hot. In addition, newer architectures are generally designed to run at higher temperatures than older ones, due to the effect I mentioned above. I don't care what your personal opinion is, because it's wrong. You're arbitrarily deciding what "hot" and "very high number" mean.
RaZZZa Wrote:3# Yep, I'm pretty much aware of this. But still, you do realize that there is a little statement which clearly says "but I can't confirm that"?
I am merely pointing out that the architecture you are comparing against has completely different physical and thermal properties. This makes any conclusions that you draw from the comparison automatically invalid.
You should buy a Cooler Master Hyper 212 Evo; it's cheap and good for overclocking.
Quote:Is it so hard to post multiple quotes?
No
Quote:Well, you said "60°C mark at 3.4GHz" when it should be "60°C mark at 3.7GHz". 3.7GHz is a lot closer to 4.0GHz, and with the superlinear power consumption curves we're dealing with here that makes a difference. Going from 3.7 to 4.0GHz is not going to increase temperature much; going from 3.4 to 4.0GHz will.
Still, the range I gave is so correct that it hurts that you still have to mark me as wrong (I said a 60~70°C range). Give me a break, will you.
Quote:Ugh. This is not a personal opinion. It's physics. Smaller transistors operate at lower voltages, consume less power, and can tolerate higher temperatures. By changing the transistor structure, manufacturing process, clock rate, voltage, etc., you can radically shift the optimal temperature range for an architecture in either direction. Some architectures are designed to run hot. In addition, newer architectures are generally designed to run at higher temperatures than older ones, due to the effect I mentioned above. I don't care what your personal opinion is, because it's wrong. You're arbitrarily deciding what "hot" and "very high number" mean.
Yes, it is a personal opinion when I say I don't want to let my computer run at 80°C all the time; excuse me, but it's bullshit when you say it's not. As for all the physics stuff: I don't know and I don't care. When I don't want to, then I don't want to. Get it? I think not.
Quote:I am merely pointing out that the architecture you are comparing against has completely different physical and thermal properties. This makes any conclusions that you draw from the comparison automatically invalid.
Who cares; run an AMD and an Intel at 120°C and they will both kind of burn over time, or at least take some serious damage. 'Nuff said.
Quote:Yes, it is a personal opinion when I say I don't want to let my computer run at 80°C all the time; excuse me, but it's bullshit when you say it's not. As for all the physics stuff: I don't know and I don't care. When I don't want to, then I don't want to. Get it? I think not.
There's your problem. He knows "the physics stuff" and you don't. You are welcome to run your computer at 20°C max or 100°C max or whatever you like. But the chip can handle more than your arbitrarily chosen 60°C without being damaged, even for long-term use. It's designed that way.
MaJoR Wrote:He knows "the physics stuff" and you don't.
To be fair, I don't fully understand how they calculate it, only the variables that affect it. In fact, the only people who know how the optimal temperature range was calculated are the engineers at Intel or AMD who worked on the architecture. That being said, we can and should still use the data they produced to our advantage when OCing. Also, the optimal temperature range goes down as you increase voltage, so when OCing significantly you should always keep your temps lower than what is acceptable at stock settings (how much lower depends on the size of the voltage bump). With a small OC like he is proposing, anything below 80°C would be considered quite cool by Ivy Bridge standards.
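Putting that guidance into practice, here is a small watchdog sketch that logs the hottest core once a second and flags anything over the 80°C figure from above. Like the earlier snippet, it assumes a Linux machine with Python and psutil; run it alongside a stress test while validating an overclock.

import time
import psutil

CEILING_C = 80.0  # the "quite cool by Ivy Bridge standards" ceiling above

while True:
    # 'coretemp' is the Intel sensor driver on Linux.
    cores = psutil.sensors_temperatures().get("coretemp", [])
    if cores:
        hottest = max(t.current for t in cores)
        flag = "  <-- over ceiling!" if hottest > CEILING_C else ""
        print(f"max core temp: {hottest:.0f}C{flag}")
    time.sleep(1)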