(11-01-2013, 11:10 AM)NaturalViolence Wrote: [ -> ]What exactly does this mean? An APU is just a CPU with an IGP. And for a while both companies have had IGPs in most of their CPUs. What exactly is shifting here?
I think he means AMD APU vs FX series.
That makes a bit more sense.
To be honest I'm surprised that they still have CPUs without an IGP. I guess they just want to avoid breaking backwards compatibility with the socket. Also, they wanted to use the same chips for servers and desktops, and of course IGPs are completely useless for server CPUs. I suspect that in the near future (2-3 years) they will either add IGPs to the FX lineup or ditch the FX lineup entirely. So in that sense he's right.
It's hard to calculate how much financial emphasis is being placed on one or the other since they both share the same CPU microarchitecture and the IGP uses the same microarchitecture as their discrete GPUs. As far as sales go, CPUs without IGPs are extremely impractical for laptops and smaller systems, which limits "traditional CPUs" to a much smaller user base. So as a result APUs do sell better in terms of total sales.
Still, I don't like the wording of "shifting emphasis to APUs". All they did was move the IGP from the chipset to the CPU die like Intel.... That wording makes it sound like they moved to an entirely new type of product. They've been making chipsets, IGPs, and CPUs for many years. They just never put them on the same chip due to a number of constraints that no longer exist, size being the most obvious one. They're still making the same products for the same markets, just with more integration. The design process is still the same and the functionality of the products is still the same. They're still working on the same things they were 5 years ago. So I don't really see any shifted emphasis in terms of the development side of this. Maybe from a consumer standpoint it seems that way with most CPUs now having on-die IGPs when they didn't before. But really that was just the outcome of them continuing to do what they do. Just like how IGPs went from standalone cards to on board, then from on board to on chipset. Now we've gone from on chipset to on die. The only way they could push it further is on-core integration, which I do suspect will happen in a few years.
True change is coming though. They may shift their emphasis more to the GPU side in the near future. But we're not quite there yet and to say that would just be an educated guess at this point. If AMD does decide to do that and focus on GPGPU one of three things will likely happen.
1. It is successful and turns the tide in their favor. Intel has a brainfart and they pay the price. I doubt this will happen, but I certainly could be wrong.
2. It is successful but doesn't turn the tide in their favor. Intel catches on and competes with them on the same ground. Nothing really changes.
3. It is unsuccessful. It turns out to not be a game changer. It never gains anything more than a small amount of adoption from software developers. It becomes a niche market like Itanium and doesn't significantly benefit any company. I doubt this outcome too because of the number of companies and devs that have already expressed interest in it.
I don't mean to really bump or prolong this discussion, but I just wanted to say a few things about "APU"-ness...
Arrandale and Clarkdale: released 2010-01-07
- iGP and CPU on same package in separate dies
Bobcat: released 2011-01-04
- iGP and CPU in single die
Sandy Bridge: released 2011-01-09
- iGP and CPU in single die
Yup. Technically they beat Intel by a few months with bobcat. Of course bobcat sold poorly and was limited to ultra low power platforms only, so nobody really cared. Not that any of this really matters with regards to the future. Moving to die integration a few months earlier/later than the competition hardly affected anything. What will be really important is how long AMD can hold onto HSA as an AMD-only feature, since at this point they've pretty much bet the entire future of their company on its success. It seems kind of stupid to me, but then again I can see why they're going through with it. It's their only remaining option at this point now that they have no real chance of beating Intel in "conventional" CPU performance and energy efficiency.
All the tech analysts are talking about HSA as the big thing that's going to turn everything around for AMD. They won't shut up about it. Well, I'm not buying it this time. Not until it actually comes out and proves itself to be everything they claim it's going to be. They've been talking about "the big thing" that AMD is going to use to whip Intel back into its place for years now. Remember when Intel's Conroe didn't stand a chance against K8L? When Deneb was going to crush Nehalem? When Bulldozer was going to crush Sandy Bridge? I'm still here. Still waiting for it to happen. Every time they hype up the press it all comes crashing down when the product is actually released and it turns out that their claims were based on falsified data, gross exaggerations, and deliberate misinterpretations. This has made me extremely skeptical of anything they say. It's not that I want AMD to fail. I just have a really hard time believing their bullshit after being burned so many times in the past. Now they're heading into their most ambitious project yet while their once abundant supply of capital has all but dried up. And once again they're making some really crazy and hard to swallow claims. Worst of all, these claims about absurd performance gains across the board from some new special-sauce technology that's going to be really easy for software developers to implement sound exactly like what I've heard from them in the past. I can only hope that they're right for once. But I'm not holding my breath.
I'm semi-excited for HSA, but mainly because I'd just like to see more GPGPU software rather than any supposedly "revolutionary" stuff. Remember, whether it's software or hardware, never buy into hype.
To be honest I'm probably more "hyped" for Mantle than HSA; however, that's largely because devs are already getting results:
https://twitter.com/cavemanjim/status/400802892208037888
Regarding Bobcat, AFAIK it being low-power was one of the main reasons for integrating the GPU into the CPU die - to reduce power consumption.
EDIT:
(11-20-2013, 04:18 PM)NaturalViolence Wrote: [ -> ]Yup. Technically they beat Intel by a few months with bobcat.
Wat.
Nintendo Maniac 64 Wrote:Regarding Bobcat, AFAIK it being low-power was one of the main reasons for integrating the GPU into the CPU die - to reduce power consumption.
Actually, for low-end systems it's more about cost. Going from separate chips to a single die doesn't save that much power, but it does save a lot of space and cost.
Nintendo Maniac 64 Wrote:Wat.
Pffft. International standards, who needs em? We're American

(11-21-2013, 08:15 AM)NaturalViolence Wrote: [ -> ]Pffft. International standards, who needs em? We're American 
I've never seen an American put the day before the month, and I've never seen a European put the day next to the year. That's why year-month-day works: it makes sense from both an American and a European viewpoint.
Seriously, the only way you could have interpreted that as "a few months" is if you read the date as year-day-month, which I have never seen in my life.
Can't we just stick to dd/mm/yyyy, as it's right? By that I mean literally anything else is wrong (except the sort-of-okay thing NM64 used, which is sort of okay).
tl;dr yyyy-mm-dd seems to largely be a compromise between the US and European date-formatting orders.
In the US we don't write the date as mm-dd-yyyy because of some logical order; it's simply because we're used to it. We've been saying things like "March 8" for ages and then we tack on the year after that with a comma separating the two (March 8, 2013). Therefore when you turn the order backwards to yyyy-dd-mm you eliminate the ingrained "I'm just used to it"-ness and we can clearly see how nonsensical it is to put the day between the month and year.
Again, it's almost impossible to combat the concept of "I'm just used to it" (see: Zsnes), so you have to work around it. Because the year is already tacked on in US date formatting, it is no problem to instead tack on the year before the month. As long as those people get to use their "mm-dd" they'll be fine.
Now unlike the US, Europe already uses a sensible date-formatting order, so even if you turn it backwards it would still make sense (unlike that crazy yyyy-dd-mm). Therefore understanding yyyy-mm-dd should be no problem in Europe as well. I mean, Europe already thinks that mm-dd-yyyy makes no sense, so there shouldn't be any reason for a European to read the date as yyyy-dd-mm since that too would put the day between the month and year.
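To make the ambiguity concrete, here's a minimal sketch (Python, purely illustrative; the date strings are just examples) of how the same digit string flips meaning depending on which field order the reader assumes, while an ISO-style yyyy-mm-dd string only has one sensible reading:

```python
# Minimal illustration: the same ambiguous string parsed under two assumed orders.
from datetime import datetime

ambiguous = "04-01-2011"
as_us = datetime.strptime(ambiguous, "%m-%d-%Y")  # read as mm-dd-yyyy -> April 1st, 2011
as_eu = datetime.strptime(ambiguous, "%d-%m-%Y")  # read as dd-mm-yyyy -> January 4th, 2011
print(as_us.date(), as_eu.date())                 # 2011-04-01 2011-01-04

iso = "2011-01-04"  # the Bobcat launch date from the list above
print(datetime.strptime(iso, "%Y-%m-%d").date())  # only one reasonable way to read it
```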
"This data should be written XYZ."
"Fool, it's much better to write it as XZY."
"Technically you're both wrong, ZYX makes much more sense."
"Wow, looks no one knows what they're talking about. Y should always come first, so we use YXZ."
The debate over dates is very similar to endianness in computer science (and just as trivial). The data's the same in both cases, it's just that the order is shifted. In cases like these, the order can't be "wrong" because if the data is processed as intended, the results come out the same way. Fun fact: the term endianness is derived from Jonathan Swift's "Gulliver's Travels", where the peoples of one island argued whether it was best to crack an egg from the small end or the big end. 18th century British literature ftw. The order really only has any relevance when you're trying to communicate information to others. Personally, I just long-hand it nowadays on everything but forms (like at the doctor's office). There's no ambiguity about November 20th, 2013 (unless you factor in Time Zones).
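Since the endianness comparison came up, here's a small illustrative sketch (Python, with an arbitrary made-up value) showing the same 32-bit number written out in both byte orders; the data is identical either way as long as the reader and writer agree on the order, which is exactly the point:

```python
# Same value, two byte orders; both round-trip back to the original number.
import struct

value = 0x20131120  # arbitrary example value

big = struct.pack(">I", value)     # big-endian bytes:    20 13 11 20
little = struct.pack("<I", value)  # little-endian bytes: 20 11 13 20
print(big.hex(), little.hex())

assert struct.unpack(">I", big)[0] == value
assert struct.unpack("<I", little)[0] == value
```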
Just to explain to all the folks who don't get why Americans use MM-DD-YYYY, this format best mimics how we speak (at least over here?). When someone asks you the date, the full spoken response would be "Today's November 20th, 2013", thus we use MM-DD-YYYY. I majored in English, so my U.S. history is pretty sketchy, but I'm pretty sure we fought a whole big war with Britain in the late 18th century over stuff like that.