Quote:1. It is high end when you can't afford anything better.
That's not the definition of the term "high end". High end has nothing to do with what you can afford.
Quote:AMD has never made any original processors, anything (even the first 64 bit processor) was first invented by Intel.
WTF!?!?!?
That makes no sense. You're just talking out of your ass.
Intel didn't invent the K5, K6, K7, K8, K10, K11, K12, or K15 architectures. All of AMD's microarchitectures are unique and totally different from Intel's. The only thing they share is the instruction set architecture.
Quote:Their latest processors, especially that 12core Bulldozer, has so many issues that it's not even funny.
That has nothing to do with your point about innovation. Bulldozer in particular shows extreme levels of innovation with CMT threading and decoupled branch prediction.
Quote:Face it, AMD is going downhill on the CPU side.
If they didn't buy ATI, they would have no future.
Although I agree that they are going downhill, this has nothing to do with innovation. If they had chosen not to be innovative with Bulldozer, that would actually have produced a better CPU. Instead they tried a different approach and failed.
Quote:You know that Intel was the first to develop 64bit processors right?
You know that Intel was the first to develop dual core processors right?
Nuff' said.
What does that have to do with anything?
By that logic Intel is also lacking innovation, because both of those things were done by many other major competitors long before Intel.
Let's look at a quick rundown of the history of popular or semi-popular 64 bit microprocessor architectures from all of the major CPU companies.
MIPS came out with their first 64 bit microprocessor lineup in 1991. It didn't sell that well because nobody needed a 64 bit microprocessor at the time. All MIPS processors since then have been 64 bit.
DEC Alpha came out in 1992, another semi-popular line of 64 bit microprocessors. All DEC Alpha processors since then have been 64 bit.
Sun came out with a line of 64 bit SPARC microprocessors in 1995 (the UltraSPARC). All SPARC processors since then have been 64 bit.
IBM came out with 64 bit PowerPC microprocessors in 1995 (the A10 and the A30). All high end POWER and PowerPC chips since then have been 64 bit.
HP came out with 64 bit PA-RISC microprocessors in 1996. All PA-RISC chips since then have been 64 bit.
Intel finally came out with its 64 bit Itanium chips in 2001, 10 years after MIPS. They failed miserably in sales.
AMD came out with the Opteron (server) and Athlon 64 (desktop) in 2003. These were the first 64 bit x86 chips, and they sold extremely well.
Intel then cloned AMD64 (also called x86-64 or x64) and produced Prescott, an inferior 64 bit x86 implementation. Yup, you heard right: they copied AMD's instruction set and have to pay licensing/royalty fees to AMD every year to continue using it. They even admit to it.
And don't even get me started on multicore architectures. Intel was very late to that party as well, and AMD was the first to produce a dual core x86 die.
Even if neither of these points were true and Intel had invented the idea of multicore microprocessors and 64 bit microprocessors, and had created the first x86 multicore and 64 bit chips, it still wouldn't matter. None of these things have anything to do with innovation! Implementation matters. Just because I make a car that goes 120mph doesn't mean that if you make a car that goes 120mph you copied me.
Now back to the original topic of GPUs:
Quote:Quote:I'm glad nvidia has been much better with their cards lately.
.....can't tell if serious.
What I meant by that is that Nvidia's low end GPUs have been garbage compared to AMD's over the last few years, so saying "I'm glad nvidia has been much better with their cards lately." makes no sense. If you look at their sub-$100 cards, they have far less performance per dollar, worse performance per watt, and a severe lack of useful features compared to the competition. And they haven't been selling well either.