What has happened to Intel lately?
11-03-2013, 03:55 PM
That makes a bit more sense.
To be honest I'm surprised that they still have CPUs without an IGP. I guess they just want to avoid breaking backwards compatibility with the socket. They also wanted to use the same chips for servers and desktops, and of course IGPs are completely useless for server CPUs. I suspect that in the near future (2-3 years) they will either add IGPs to the FX lineup or ditch the FX lineup entirely. So in that sense he's right.
It's hard to calculate how much financial emphasis is being placed on one or the other, since both share the same CPU microarchitecture and the IGP uses the same microarchitecture as their discrete GPUs. As far as sales go, CPUs without IGPs are extremely impractical for laptops and smaller systems, which limits "traditional CPUs" to systems with a much smaller user base. So APUs do sell better in terms of total sales.
Still, I don't like the wording of "shifting emphasis to APUs". All they did was move the IGP from the chipset to the CPU die, like Intel. That wording makes it sound like they moved to an entirely new type of product. They've been making chipsets, IGPs, and CPUs for many years; they just never put them on the same chip, due to a number of constraints that no longer exist, die size being the most obvious one. They're still making the same products for the same markets, just with more integration. The design process is still the same and the functionality of the products is still the same. They're still working on the same things they were five years ago, so I don't really see any shifted emphasis on the development side of this.
Maybe from a consumer standpoint it seems that way, with most CPUs now having on-die IGPs when they didn't before. But really that was just the outcome of them continuing to do what they do. IGPs went from standalone cards to on-board, then from on-board to on-chipset, and now from on-chipset to on-die.
The only way they could push it further is on-core integration, which I do suspect will happen in a few years. True change is coming though: they may shift their emphasis more to the GPU side in the near future. But we're not quite there yet, and to say that would just be an educated guess at this point. If AMD does decide to do that and focus on GPGPU, one of three things will likely happen:
1. It is successful and turns the tide in their favor. Intel has a brainfart and pays the price. I doubt this will happen, but I certainly could be wrong.
2. It is successful but doesn't turn the tide in their favor. Intel catches on and competes with them on the same ground. Nothing really changes.
3. It is unsuccessful. It turns out not to be a game changer, never gains more than a small amount of adoption from software developers, and becomes a niche market like Itanium that doesn't significantly benefit any company. I doubt this outcome too, because of the number of companies and devs that have already expressed interest in it.
"Normally if given a choice between doing something and nothing, I’d choose to do nothing. But I would do something if it helps someone else do nothing. I’d work all night if it meant nothing got done."
-Ron Swanson "I shall be a good politician, even if it kills me. Or if it kills anyone else for that matter. " -Mark Antony 11-20-2013, 12:48 PM
(This post was last modified: 11-20-2013, 12:50 PM by Nintendo Maniac 64.)
I don't mean to really bump or prolong this discussion, but I just wanted to say a few things about "APU"-ness...
Arrandale and Clarkdale: released 2010-01-07
Bobcat: released 2011-01-04
Sandy Bridge: released 2011-01-09
Dolphin 5.0 CPU benchmark
CPU: Xeon E3-1246 v3 (4c/8t Haswell/Intel 4th gen) — core & cache @ 3.9GHz via multicore enhancement GPU: Intel integrated HD Graphics P4600 RAM: 4x8GB Corsair Vengeance @ DDR3-1600 OS: Linux Mint 20.3 Xfce + [VM] Win7 SP1 x64 11-20-2013, 04:18 PM
Yup. Technically they beat Intel by a few months with Bobcat. Of course Bobcat sold poorly and was limited to ultra-low-power platforms only, so nobody really cared. Not that any of this really matters with regards to the future; moving to die integration a few months earlier or later than the competition hardly affected anything. What will be really important is how long AMD can hold onto HSA as an AMD-only feature, since at this point they've pretty much bet the entire future of their company on its success. It seems kind of stupid to me, but then again I can see why they're going through with it. It's their only remaining option now that they have no remaining chance of beating Intel in "conventional" CPU performance and energy efficiency.
11-20-2013, 04:31 PM
(This post was last modified: 11-20-2013, 04:37 PM by Nintendo Maniac 64.)
I'm semi-excited for HSA, but mainly just because I'd like to see more GPGPU software rather than any supposedly "revolutionary" stuff. Remember: whether it's software or hardware, never buy into hype.
To be honest I'm probably more "hyped" for Mantle than HSA; however, that's probably largely because devs are already getting results: https://twitter.com/cavemanjim/status/40...2208037888
Regarding Bobcat, AFAIK it being low-power was one of the main reasons for integrating the GPU into the CPU die - to reduce power consumption.
EDIT:
(11-20-2013, 04:18 PM)NaturalViolence Wrote: Yup. Technically they beat Intel by a few months with bobcat.
Wat.
11-21-2013, 08:15 AM
(This post was last modified: 11-21-2013, 08:16 AM by NaturalViolence.)
Nintendo Maniac 64 Wrote: Regarding Bobcat, AFAIK it being low-power was one of the main reasons for integrating the GPU into the CPU die - to reduce power consumption.
Actually, for low-end systems it's more about cost. Integration doesn't save that much power, but it does save a lot of space and cost.
Nintendo Maniac 64 Wrote: Wat.
Pffft. International standards, who needs 'em? We're American.
11-21-2013, 08:18 AM
(This post was last modified: 11-21-2013, 08:21 AM by Nintendo Maniac 64.)
(11-21-2013, 08:15 AM)NaturalViolence Wrote: Pffft. International standards, who needs 'em? We're American
I've never seen an American put the day before the month, and I've never seen a European put the day next to the year. That's why year-month-day works: it makes sense from both an American and a European viewpoint. Seriously, the only way you could have interpreted that as "a few months" is if you read the date as year-day-month, which I have never seen in my life.
11-21-2013, 10:14 AM
Can't we just stick to dd/mm/yyyy, since it's right? By that I mean literally anything else is wrong (except the year-month-day thing NM64 used, which is sort of okay).
OS: Windows 10 64 bit Professional
CPU: AMD Ryzen 5900X RAM: 48GB GPU: Radeon 7800 XT 11-21-2013, 01:52 PM
(This post was last modified: 11-21-2013, 02:23 PM by Nintendo Maniac 64.)
tl;dr yyyy-mm-dd seems to largely be a compromise between the US and Europe date-formatting order.
In the US we don't write the date as mm-dd-yyyy because of some logical order; it's simply because we're used to it. We've been saying things like "March 8" for ages, and then we tack on the year after that with a comma separating the two (March 8, 2013). Therefore, when you reverse the order to yyyy-dd-mm, you eliminate the ingrained "I'm just used to it"-ness, and we can clearly see how nonsensical it is to put the day between the month and year.
Again, it's almost impossible to combat the concept of "I'm just used to it" (see: Zsnes), so you have to work around it. Because the year is already tacked on in US date formatting, it's no problem to instead tack the year on before the month. As long as those people get to keep their "mm-dd", they'll be fine.
Now unlike the US, Europe already uses a sensible date-formatting order, so even if you reverse it, it still makes sense (unlike that crazy yyyy-dd-mm). Therefore understanding yyyy-mm-dd should be no problem in Europe as well. I mean, Europe already thinks mm-dd-yyyy makes no sense, so there shouldn't be any reason for a European to read the date as yyyy-dd-mm, since that too would put the day between the month and year.
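Incidentally, yyyy-mm-dd (ISO 8601) has a practical bonus beyond being a readable compromise: because the most significant field comes first, the strings sort chronologically with a plain text sort. A minimal Python sketch (the sample dates are made up for illustration):

```python
from datetime import datetime

# The same three dates in both conventions, listed in chronological order.
iso = ["2011-01-04", "2011-12-25", "2013-01-04"]   # yyyy-mm-dd
us  = ["01-04-2011", "12-25-2011", "01-04-2013"]   # mm-dd-yyyy

# ISO 8601 strings sort chronologically with a plain string sort...
assert sorted(iso) == iso

# ...while mm-dd-yyyy strings do not: "01-04-2013" sorts before "12-25-2011".
assert sorted(us) != us

# Parsing with an explicit format string removes the ambiguity either way.
parsed = datetime.strptime("11-20-2013", "%m-%d-%Y")
print(parsed.date())  # 2013-11-20
```

That sortability is a big part of why yyyy-mm-dd shows up in filenames and logs, not just on forms.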
"This data should be written XYZ."
"Fool, it's much better to write it as XZY."
"Technically you're both wrong; ZYX makes much more sense."
"Wow, looks like no one knows what they're talking about. Y should always come first, so we use YXZ."
The debate over dates is very similar to endianness in computer science (and just as trivial). The data is the same in both cases; it's just that the order is shifted. In cases like these, the order can't be "wrong", because if the data is processed as intended, the results come out the same. Fun fact: the term endianness is derived from Jonathan Swift's "Gulliver's Travels", where the peoples of one island argued whether it was best to crack an egg from the small end or the big end. 18th century British literature ftw.
The order really only has any relevance when you're trying to communicate information to others. Personally, I just long-hand it nowadays on everything but forms (like at the doctor's office). There's no ambiguity about November 20th, 2013 (unless you factor in time zones).
Just to explain to all the folks who don't get why Americans use MM-DD-YYYY: this format best mimics how we speak (at least over here?). When someone asks you the date, the full spoken response would be "Today's November 20th, 2013", thus we use MM-DD-YYYY. I majored in English, so my U.S. history is pretty sketchy, but I'm pretty sure we fought a whole big war with Britain in the late 18th century over stuff like that.
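The endianness analogy above can be made concrete: the same 32-bit integer produces different byte sequences depending on the byte order, and either one round-trips back to the original value as long as it's read the way it was written. A quick sketch using Python's standard `struct` module:

```python
import struct

# The same 32-bit integer, packed with different byte orders.
# ">" = big-endian (most significant byte first),
# "<" = little-endian (least significant byte first).
value = 0x01020304

big = struct.pack(">I", value)
little = struct.pack("<I", value)

print(big.hex())     # 01020304
print(little.hex())  # 04030201

# Either byte sequence recovers the same value when unpacked
# with the matching byte order — the "order" itself isn't wrong.
assert struct.unpack(">I", big)[0] == value
assert struct.unpack("<I", little)[0] == value
```

Exactly like the date debate: the bytes (fields) are identical, and problems only appear when the writer and the reader disagree on the convention.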