12-07-2012, 12:13 PM
12-07-2012, 12:17 PM
(12-07-2012, 12:13 PM)admin89 Wrote: [ -> ]http://www.newegg.com/Product/Product.aspx?Item=N82E16813157293
Ok thanks, will check it out.
Excellent review, best bang for the buck mobo
12-07-2012, 06:31 PM
Radeon HD 7870 - 23A and a 500W psu minimum
+ cpu + ...
12-08-2012, 03:36 AM
That... makes little sense. Please can you rephrase it?
12-09-2012, 12:29 AM
(12-07-2012, 06:31 PM)Soopah Wrote: [ -> ]Radeon HD 7870 - 23A and a 500W psu minimum
Everything is already factored into those recommendations. I doubt there is any card available which actually uses 500W and 23A on load. Maybe a geforce 690 or 590 or something like that, certainly not a 7870.
+ cpu + ...
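To put rough numbers on it: 23A on the 12V rail only works out to about 276W, and that figure is meant to cover the whole system. A quick sketch (the 175W board power for the 7870 and the 95W CPU are assumed figures, not measurements):
Code:
# Rough sketch: what the "23A on the +12V rail" recommendation works out to,
# versus an assumed full-load breakdown for a mid-range system. The component
# wattages are assumptions (AMD's ~175W board power spec for the 7870, a 95W
# quad core), not measured values.
RAIL_VOLTAGE = 12.0
recommended_amps = 23.0

rail_watts = recommended_amps * RAIL_VOLTAGE
print(f"23A on the +12V rail = {rail_watts:.0f}W of 12V capacity")  # 276W

system = {"HD 7870": 175, "quad-core CPU": 95, "board/RAM/drives/fans": 40}
print(f"Assumed full-load system draw: {sum(system.values())}W")    # ~310W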
12-09-2012, 12:04 PM
Quote:maybe geforce 690 or 590 or something like that
Still not even close. To my knowledge no video card in history has exceeded 300 watts of power consumption under full load (excluding overclocked cards from extreme overclocking competitions, of course).
Clarification:
Like you said, the calculations are based on hypothetical systems and refer to total system power consumption. They're extremely conservative about the measurements and assume enthusiast grade CPUs, motherboards, etc., so if it says a card requires a 500 watt power supply you can probably get away with 400 or even 350 watts. They figure that overestimating is far better than underestimating, since people would complain about power issues if the figures were too low. If it says 500 watts, that means no matter what other parts you use, 500 watts will be enough.
Ultimately these measurements should just be ignored in favor of using a power calculator to estimate your total system power draw.
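A minimal sketch of what such a power calculator boils down to (the wattages below are assumed placeholder figures; swap in the TDP/board power numbers for your own parts):
Code:
# Minimal power-calculator sketch. The wattages are assumed placeholder
# figures; the 30% headroom is just a rule of thumb so the PSU never sits
# at 100% load.
def estimate_psu_watts(components, headroom=0.3):
    """Sum worst-case component draw and pad it with some headroom."""
    load = sum(components.values())
    return load, load * (1 + headroom)

parts = {
    "CPU (95W TDP)": 95,
    "GPU (~175W board power)": 175,
    "Motherboard + RAM": 50,
    "Drives, fans, USB devices": 30,
}

load, recommended = estimate_psu_watts(parts)
print(f"Estimated full load: {load}W, sensible PSU size: ~{recommended:.0f}W")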
12-09-2012, 12:52 PM
The GTX 295 and 4870X2 come to mind; both had a recommended PSU of 700W or more from Nvidia/ATI.
Each generation now has improved power usage all over the line, both for CPU and GPU.
Two gens ago you usually had a 130W CPU; now they are 77W (i7 920 vs 3770K, for example).
On my GTX285 it says 215W on the box, a new GTX680 is 190W.
Earlier you needed 650W for a good gaming rig and today you can get away with much less.
This is why most people often tell you that you need more than you actually do, because it was needed 3 years ago.
PSUs are more efficient now and can usually deliver much more on the 12V rail than before too.
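A quick illustration of that last point (the rail capacities and efficiency numbers below are made up, but typical of an older multi-rail unit versus a modern one that puts nearly all of its capacity on +12V):
Code:
# Made-up but typical numbers: an older unit splits its capacity across the
# 3.3/5/12V rails and wastes more power as heat; a modern unit puts nearly
# all of its capacity on +12V and converts more efficiently.
def wall_draw(dc_load_watts, efficiency):
    """Power pulled from the outlet for a given DC load."""
    return dc_load_watts / efficiency

load = 300  # watts of DC load the system actually asks for

old_psu = {"twelve_volt_watts": 350, "efficiency": 0.72}
new_psu = {"twelve_volt_watts": 480, "efficiency": 0.87}

for name, psu in (("older 500W unit", old_psu), ("modern 500W unit", new_psu)):
    print(f"{name}: {psu['twelve_volt_watts']}W available on +12V, "
          f"{wall_draw(load, psu['efficiency']):.0f}W drawn at the wall")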
12-10-2012, 07:02 AM
Quote:The GTX 295 and 4870X2 come to mind; both had a recommended PSU of 700W or more from Nvidia/ATI.
They both had lower TDPs than the cards that replaced them.
And both would work with a 500 watt PSU in most systems.
Quote:Each generation now has improved power usage all over the line, both for CPU and GPU.
Ahahahaha. No. Just because the latest generation's top end has lower TDPs doesn't mean that every processor from every generation improved TDP over its predecessor.
Quote:Two gens ago you usually had a 130W CPU; now they are 77W (i7 920 vs 3770K, for example).
Sandy bridge-E is still 130 watt. Not as much has changed as you think.
Top TDP for Intel cpus by year:
1993: 16 watt (pentium, P5)
1994: 10 watt (pentium, P54C)
1995: 37 watt (pentium pro)
1996: 37 watt (pentium pro)
1997: 43 watt (pentium II, klamath)
1998: 27 watt (pentium II, deschutes)
1999: 42 watt (pentium III, katmai)
2000: 55 watt (pentium 4, willamette)
2001: 75 watt (pentium 4, willamette)
2002: 68 watt (pentium 4, northwood)
2003: 92 watt (pentium 4 extreme edition, gallatin)
2004: 115 watt (pentium 4, prescott)
2005: 130 watt (pentium D extreme edition, smithfield)
2006: 130 watt (pentium D extreme edition, presler)
2007: 130 watt (core 2 extreme, kentsfield)
2008: 136 watt (core 2 extreme, yorkfield)
2009: 130 watt (core i7 extreme, bloomfield)
2010: 130 watt (core i7 extreme, gulftown)
2011: 130 watt (core i7 extreme, sandy bridge-E)
2012: 130 watt (core i7 extreme, sandy bridge-E)
You can still get high end CPUs with high TDPs, but nobody bothers anymore. The sandy bridge-E i7 3820 is actually affordable ($300 last I checked) and that's 130 watt. Ivy bridge-E isn't out yet but I would be willing to bet I know what TDP it's going to have. For some reason Intel priced bloomfield (130 watt TDP) within reach of a normal consumer at $300 even though it was intended as an enthusiast chip, most likely due to strong competition from AMD at the time. That's an outlier in an otherwise consistent pattern. Looking at the enthusiast CPUs above, we see TDP going up almost every year until it capped out at 130 watt. The only exceptions are 1994, 1998, and 2002, where a die shrink resulted in a reduced TDP. But of course these are only enthusiast chips, so this doesn't paint the entire picture.
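Side note: the year-by-year list above as data, with a quick pass that flags each year where the flagship TDP dropped from the year before (it picks up the 1994/1998/2002 die shrinks plus the small 136 to 130 watt step after yorkfield XE):
Code:
# The flagship-TDP-by-year list above, as data. Flags every year where the
# top TDP dropped from the year before.
top_tdp_by_year = {
    1993: 16, 1994: 10, 1995: 37, 1996: 37, 1997: 43, 1998: 27, 1999: 42,
    2000: 55, 2001: 75, 2002: 68, 2003: 92, 2004: 115, 2005: 130, 2006: 130,
    2007: 130, 2008: 136, 2009: 130, 2010: 130, 2011: 130, 2012: 130,
}

years = sorted(top_tdp_by_year)
for prev, cur in zip(years, years[1:]):
    if top_tdp_by_year[cur] < top_tdp_by_year[prev]:
        print(f"{cur}: flagship TDP dropped from "
              f"{top_tdp_by_year[prev]}W to {top_tdp_by_year[cur]}W")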
Ever since 2007 Intel has produced at least 3 chips with standard TDPs of 130, 95, and 65 watts.
In 2007
Core 2 extreme 130 watt, kentsfield XE
Core 2 quad 95/105 watt, kentsfield
Core 2 duo 65 watt, conroe
In 2008
Core 2 extreme 130/136 watt, yorkfield XE
Core 2 quad 95 watt, yorkfield
Core 2 duo 65 watt, wolfdale
Each cpu was replaced by a new one with the same TDP. The top two are really the same chip, just binned for different TDPs.
In 2009/2010
Core i7/extreme 130 watt, bloomfield/gulftown
Core i5/i7 95 watt, lynnfield
Core i3/i5 65 watt, clarkdale
Same as before. Only this time the chips were priced/marketed differently. 4 brands now instead of 3. Each chip overlaps two different brands instead of one. For some reason we didn't get die shrinks of bloomfield/lynnfield in 2010 like we were supposed to.
In 2011
Core i7/extreme 130 watt, sandy bridge/sandy bridge-E
Core i5/i7 95 watt, sandy bridge
Core i3 65 watt, sandy bridge
Once again not much changed. The real difference is that the 130 watt i7s are now priced much higher than they were before, most likely due to lack of competition.
In 2012
Core i5/i7 77 watt, ivy bridge
Core i3 55 watt, ivy bridge
Two differences here. The 130 watt chips (ivy bridge-E) have been delayed to 2013. Also the TDP went down slightly. Nobody really knows for sure why.
This is connected in some way to the cheap TIM they replaced the fluxless solder with.
Theory 1:
Had they used fluxless solder like the previous generations they would have been able to pump ivy bridge up to the same TDPs as previous generations and reach higher clock rates than sandy bridge. Because of the cheap TIM they are using they are forced to keep the TDPs lower. If this is correct, why they chose to use the cheap TIM would be a mystery. Perhaps to cut costs. Perhaps to extend sandy bridge's shelf life. Perhaps to make Haswell seem better. Who knows.
Theory 2:
The opposite. TDP went down so they were able to use cheap TIM instead of fluxless solder. This provides an explanation for the cheap TIM but now you have to explain why the TDP went down. Which brings us to an obvious question. Was ivy bridge capable of clocking higher and gimped? Or was there a manufacturing or architectural limitation that prevented it from clocking higher even with a die shrink? Both would explain the lower TDP.
If they had die-shrunk bloomfield/lynnfield in 2010 we would know by now, since we would have two instances to compare. But since they didn't, we have to wait until ivy bridge-E to know for sure. My money is on theory 1. If we knew more details about the architecture and manufacturing process we could try to determine whether signal propagation delays, gate switching delays, or thermal dissipation plays the dominant role in clock rate scaling. If it's number 3, like most people suspect, then they gimped it on purpose.
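For the thermal side of that question, the usual back-of-the-envelope model is P ~ C * V^2 * f for switching power (leakage ignored). A sketch with completely made-up capacitance and voltage figures, just to show how hard clocks and voltage feed into the thermal budget:
Code:
# Back-of-the-envelope switching power: P ~ C * V^2 * f. Leakage is ignored
# and the capacitance/voltage figures are invented, so treat the absolute
# numbers as illustrative only.
def dynamic_power(capacitance_nf, voltage, freq_ghz):
    """Approximate switching power in watts."""
    return capacitance_nf * 1e-9 * voltage ** 2 * freq_ghz * 1e9

stock = dynamic_power(capacitance_nf=30, voltage=1.20, freq_ghz=3.5)
pushed = dynamic_power(capacitance_nf=30, voltage=1.35, freq_ghz=4.2)
print(f"stock: ~{stock:.0f}W, pushed clocks + voltage: ~{pushed:.0f}W")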
Quote:On my GTX285 it says 215W on the box, a new GTX680 is 190W.
TDP went up from geforce 7 to 8, geforce 8 to 200, and from geforce 200 to 400/500. This is the first generation of nvidia GPUs where TDP has gone down, and that was mainly because fermi did so poorly in the market due to high TDP that they specifically designed kepler with lower TDP in mind to better accommodate what the market wants.
Quote:Earlier you needed 650W for a good gaming rig and today you can get away with much less.
When did we ever need 650 watt for a good gaming rig?
Quote:This is why most people often tell you that you need more than you actually do, because it was needed 3 years ago.
Good lord where are you getting this from?
12-10-2012, 07:20 AM
(12-10-2012, 07:02 AM)NaturalViolence Wrote: [ -> ]Two differences here. The 130 watt chips (ivy bridge-E) have been delayed to 2013. Also the TDP went down slightly. Nobody really knows for sure why.
Because 22nm? :p
12-10-2012, 08:07 AM
In the last 10 years die shrinks usually haven't lowered TDP (as you can see above). They use the extra thermal headroom to clock the chip higher, thus maintaining the same TDP but improving performance.
Did they deliberately lower desktop TDP with ivy bridge by choosing not to clock the chip higher? Maybe. But why on earth would they do that for the desktop platform?
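To put a rough number on "spend the shrink on clocks instead of TDP": the same P ~ C * V^2 * f model, turned around to solve for the clock that fits a fixed power budget. The capacitance and voltage values are invented; the point is only that a shrink which lowers both lets the clock rise noticeably at the same wattage:
Code:
# Same P ~ C * V^2 * f model, solved for frequency at a fixed power budget.
# Capacitance and voltage values are invented; a die shrink is assumed to
# lower both a bit.
def freq_for_power(power_w, capacitance_nf, voltage):
    """Clock (GHz) that fits inside a given switching-power budget."""
    return power_w / (capacitance_nf * 1e-9 * voltage ** 2) / 1e9

tdp = 95  # watts, held constant across the shrink

old_node = freq_for_power(tdp, capacitance_nf=30, voltage=1.25)
new_node = freq_for_power(tdp, capacitance_nf=24, voltage=1.15)
print(f"old node: ~{old_node:.1f}GHz, shrunk node: ~{new_node:.1f}GHz at {tdp}W")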