I have been researching the history of PC building (1990-now) and I've heard that the early Pentium 4s were quite possibly the worst CPUs ever made, because all they had to claim was a high clock speed. I've also heard that the Athlon 64 X2 demolished it, and that the late Pentium 3s were actually better than it. So that got me wondering: what are some other bad PC components, things that never had any right to exist on store shelves? Overpriced, underpowered, I don't care. Just suggest it!
The worst PC components in existence?
10-13-2015, 03:15 AM
Well, you probably want to look into things like PSUs, cases, or Bluetooth adapters, i.e. things that are superficially simple, so a load of crappy companies will make something that works for a few hours and then permanently fails.
OS: Windows 10 64 bit Professional
CPU: AMD Ryzen 5900X RAM: 16GB GPU: Radeon Vega 56
Honestly, the Pentium 4 wasn't the worst CPU ever. At the time, the thinking was: why bother with IPC (instructions per clock) when you can use long pipelines (which reduce IPC but allow higher clock speeds) and the steady march of shrinking manufacturing processes to push chips past 10 GHz? But as they shrank the architecture down to about 90nm, they ran into power leakage, a quantum phenomenon that no one even knew about then! Leakage left the P4 stuck at around 4 GHz or less, and AMD quickly and easily passed them at much lower clock speeds, thanks to IPC.
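The IPC-versus-clock trade-off above is easy to see with a back-of-envelope calculation. A minimal sketch, with made-up illustrative numbers (not measured figures for any real chip):

```python
# Back-of-envelope: useful throughput ~ IPC x clock frequency.
# All numbers below are invented for illustration, not real benchmarks.

def throughput_gips(ipc: float, clock_ghz: float) -> float:
    """Billions of instructions retired per second."""
    return ipc * clock_ghz

# Long-pipeline, high-clock design (P4-style trade-off):
deep_pipe = throughput_gips(ipc=0.9, clock_ghz=3.4)
# Short-pipeline, high-IPC design (Athlon 64-style trade-off):
short_pipe = throughput_gips(ipc=1.6, clock_ghz=2.2)

print(f"deep pipe:  {deep_pipe:.2f} GIPS")   # 3.06
print(f"short pipe: {short_pipe:.2f} GIPS")  # 3.52 -- the lower-clocked chip wins
```

The point of the sketch: once clocks stop scaling, the only lever left is IPC, which is exactly where the P4 had given ground away.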
While I don't know about the worst CPU ever, the most "what were they thinking?" CPU award definitely goes to Bulldozer. Intel was utterly punished by AMD during this period precisely because AMD focused on IPC, and Intel itself switched to an IPC focus after the Pentium 4; yet not even five years later, AMD decided to do what the Pentium 4 did! Someone at AMD looked at that failure and went "that's a good idea." Wonko! Anyway, it's all on Wikipedia. I read too much!

https://en.wikipedia.org/wiki/Pentium_4#...chitecture
https://en.wikipedia.org/wiki/Leakage_(e...conductors

AMD Threadripper Pro 5975WX PBO+200 | Asrock WRX80 Creator | NVIDIA GeForce RTX 4090 FE | 64GB DDR4-3600 Octo-Channel | Windows 11 23H1 | (details)
MacBook Pro 14in | M1 Max (32 GPU Cores) | 64GB LPDDR5 6400 | macOS 12
10-13-2015, 04:46 AM
MaJoR Wrote: power leakage, a quantum phenomenon that no one even knew about then!

Ahahaha, no. The exact magnitude of this phenomenon was debated within the electronic engineering community, but the phenomenon itself was quite well known long before the P4 was ever in development.

MaJoR Wrote: Honestly, the Pentium 4 wasn't the worst CPU ever. At the time, the thinking was: why bother with IPC (instructions per clock) when you can use long pipelines (which reduce IPC but allow higher clock speeds) and the steady march of shrinking manufacturing processes to push chips past 10 GHz?

Everyone says this, but based on what I've read from the engineers who worked on it, it seems more like they didn't think the IPC would go down much, if at all. The real story of the P4's development goes much deeper than the story the media went with, and much of it is buried beneath NDAs and other walls.

MaJoR Wrote: While I don't know about the worst CPU ever, the most "what were they thinking?" CPU award definitely goes to Bulldozer.

The worst CPU design ever award would definitely not go to either Intel or AMD. AMD may be worse than Intel, but they're still above everyone else.

MaJoR Wrote: Intel was utterly punished by AMD during this period precisely because AMD focused on IPC, and Intel itself switched to an IPC focus after the Pentium 4; yet not even five years later, AMD decided to do what the Pentium 4 did! Someone at AMD looked at that failure and went "that's a good idea." Wonko!

Despite what the media says, AMD did not do what the P4 did. They focused on increasing TLP (thread-level parallelism) at the expense of ILP (instruction-level parallelism). This has the added effect of allowing better clock rate scaling, because they gate the f**k out of everything.
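The "long pipelines reduce IPC" point both posters circle around can be sketched numerically: every mispredicted branch flushes the pipeline, and the flush penalty grows with pipeline depth. A rough sketch with a textbook-style CPI model; the numbers are assumptions for illustration, not measurements of any real CPU:

```python
# Why deep pipelines hurt IPC: a branch mispredict flushes the pipe, and
# the flush penalty (in cycles) scales roughly with pipeline depth.
# All parameter values below are illustrative assumptions.

def avg_cpi(base_cpi: float, branch_frac: float,
            mispredict_rate: float, pipe_depth: int) -> float:
    """Average cycles per instruction with a simple mispredict-penalty model."""
    return base_cpi + branch_frac * mispredict_rate * pipe_depth

shallow = avg_cpi(1.0, 0.20, 0.05, 12)  # shallow pipeline
deep = avg_cpi(1.0, 0.20, 0.05, 31)     # very deep pipeline

# IPC is the inverse of CPI: the deeper pipe retires fewer
# instructions per clock, so it needs a higher clock just to break even.
print(1 / shallow, 1 / deep)
```

The deeper design only wins if its clock advantage outgrows the IPC it gives up, which is exactly the bet that stopped paying off once clock scaling stalled.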
"Normally if given a choice between doing something and nothing, I’d choose to do nothing. But I would do something if it helps someone else do nothing. I’d work all night if it meant nothing got done."
- Ron Swanson

"I shall be a good politician, even if it kills me. Or if it kills anyone else for that matter." - Mark Antony

10-13-2015, 10:25 AM
I think it might be the Pentium D, which was my first CPU.
Two P4s slapped together just equals double stupidity. It ran hot all the time and ate power like crazy, with poor performance, not really faster than a single P4. However, I can't say the Pentium D is the worst, because AMD's FX 9000 series is no better: a 200W CPU [overclocked versions of the FX 8000 series] that can't run on a normal mobo, a stock cooler, heck not even aftermarket air cooling units, yet it's not faster than a Haswell Core i7. It was very expensive at launch, too; who would pay that much for a factory-OCed CPU?

Laptop: Mini PC ::

10-13-2015, 11:09 AM
I would say any recent AMD CPU, due to the shit IPC rates and insane TDPs, but for some specific tasks they are worthwhile, so it's neutral to me...
Avell A70 MOB: Core i7-11800H, GeForce RTX 3060, 16 GB DDR4-3200, Windows 11 (Insider Preview)
ASRock Z97M OC Formula: Pentium G3258, GeForce GT 440, 16 GB DDR3-1600, Windows 10 (22H2)
I hope AMD can create something amazing like the APU again. Thanks to APUs, Intel's shitty iGPUs have gotten faster and faster. Apparently the Iris Pro 6200 is the fastest iGPU, though it's hard to get.
Their next big thing is AMD Zen. I hope they don't shoot the CPU's TDP up again; it would be all over for AMD if they did. Lower TDP seems to be more future-proof. In recent years I've seen many desktop CPUs built into notebooks, even top-of-the-line Xeons and the 4.4 GHz i7-4790K. Now Nvidia has even managed to put their desktop GTX 980 in a notebook... it's been a huge hit on the internet since they did it without underclocking the GPU, which means no performance loss. Is AMD still working on a 100-200W CPU? Come on, it's almost 2016 already. The 4W Tegra X1 can run some lightweight games... no one wants to sit in front of a Pentium D forever.

Edit: Oops, sorry, I'm off-topic again.

Laptop: Mini PC ::

NaturalViolence Wrote: Everyone says this, but based on what I've read from the engineers who worked on it, it seems more like they didn't think the IPC would go down much, if at all.

Well, have you read anything from the period? Take this for example: http://www.anandtech.com/show/858/2 The author actually complains that the GameCube takes a high-IPC, short-pipeline approach instead of a long-pipeline one! The prevailing thought at the time, from everything I've seen, was that IPC was irrelevant when you could just go to higher and higher clocks. As for leakage, there are many types of leakage, but everything I've read says that they did not expect any leakage to happen as the manufacturing process became smaller. Do you have anything saying otherwise?

NaturalViolence Wrote: AMD may be worse than Intel, but they're still above everyone else.

That isn't really true anymore... AMD has been stuck on 28nm for some time, while TSMC and Samsung are making ARM chips at 14nm! At this point, with the same manufacturing size, Intel still has an IPC lead, but if the ARM fabs reach 7nm while Intel is still at 14nm, they could finally outpace Intel!
AMD Threadripper Pro 5975WX PBO+200 | Asrock WRX80 Creator | NVIDIA GeForce RTX 4090 FE | 64GB DDR4-3600 Octo-Channel | Windows 11 23H1 | (details)
MacBook Pro 14in | M1 Max (32 GPU Cores) | 64GB LPDDR5 6400 | macOS 12
10-13-2015, 05:59 PM
Snapdragon 810.
Thermally throttles itself before you can even load that picture of that really cute cat on the Internet. But hey, you have a really cool $500 space heater.

10-13-2015, 07:49 PM
All Itanics, the i820/i840 and their buggy MTH, most non-Intel chipsets from the last century (nForce was also garbage looking back), Willamette/Northwood Celerons (not even worth the sand they're made with), the overheating Cyrix 6x86, the GeForce FX 5800, most ATI Rage cards, PCI Sound Blasters depending on who you ask, and for shits'n'giggles the buggy Pentiums (the issue was overblown but I
Northwood P4s were good, but Athlon XPs were almost as fast for half the cost. If BD had launched in '09 as planned it would've been alright, but ATI happened.