VIA's Nano- and Pico-ITX systems were used to turn toasters and NESes into PCs back in the day. I guess with the rise of tablets and Android-on-a-stick that isn't as impressive as it used to be.
Hardware Discussion Thread
01-19-2014, 03:31 AM
(This post was last modified: 01-19-2014, 03:32 AM by ThorhiantheUltimate.)
(01-18-2014, 02:55 PM)DatKid20 Wrote: Mantle is able to be used by Nvidia and Intel due to it not having specific code for GCN. If they get their hands on Mantle, AMD will either have the best implementation or be left in the dust

It is true that Nvidia and Intel may be able to use it. However, if Mantle works as advertised (which we won't know for sure until some games/applications come out for it), it might make many games (if people decide to implement it) go from mostly CPU-bound to GPU-bound. Intel doesn't really have that problem, and all they could really get from it is a GPU performance bump (which would be nice for them, but I wouldn't think it would be extremely important for Intel). With Nvidia, I honestly have no idea what they will do; they already have their own solid line of cards and features (including G-Sync, which is nice for some people), and I happen to use a GTX 460 since it works properly with the Cycles render engine in Blender (I know Cycles has had problems with OpenCL, especially on AMD cards).

-Side note: Hey, the reply system works on my iPad now =D

01-19-2014, 04:51 PM
Most games are GPU bound, not CPU bound. It's better to be GPU-bound than CPU-bound, considering how much easier it is to use SLI/CrossFire than to wait for a new generation of CPUs to come out.
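A toy way to picture the distinction (this isn't from any real profiler; the frame times and the majority-vote rule are made up for illustration): per frame, compare the CPU time spent on game logic and draw-call submission against the GPU time spent rendering. Whichever side dominates is your bottleneck.

```python
# Illustrative sketch: classify a frame-time trace as CPU-bound or GPU-bound.
# Assumes each frame gives us (cpu_ms, gpu_ms); these numbers are invented.

def classify_bottleneck(frames):
    """frames: list of (cpu_ms, gpu_ms) tuples, one per frame."""
    # Count frames where the CPU took longer than the GPU.
    cpu_bound_frames = sum(1 for cpu, gpu in frames if cpu > gpu)
    # Majority vote across the trace decides the overall label.
    return "CPU-bound" if cpu_bound_frames > len(frames) / 2 else "GPU-bound"

# Typical modern game on a reasonable CPU: the GPU is the limiter every frame.
trace = [(4.1, 12.3), (3.9, 11.8), (4.4, 13.0), (4.0, 12.1)]
print(classify_bottleneck(trace))  # GPU-bound
```

The point about SLI/CrossFire follows from this: if the GPU column is the one that's too big, you can shrink it by adding a second card, whereas a CPU-bound trace leaves you waiting on a new CPU generation.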
01-20-2014, 08:36 AM
(01-19-2014, 03:31 AM)ThorhiantheUltimate Wrote:(01-18-2014, 02:55 PM)DatKid20 Wrote: Mantle is able to be used by Nvidia and Intel due to it not having specific code for GCN. If they get their hands on Mantle, AMD will either have the best implementation or be left in the dust

That's because AMD's OpenCL implementation doesn't allow global functions, so every triangle needs its own function when the renderer is compiled. This causes the renderer to overflow VRAM really easily with complex scenes. nVidia's implementation does allow global functions, so it doesn't have this issue; however, the CUDA renderer is significantly faster, so no one uses OpenCL at all. As a result it's been labelled deprecated, features and bugfixes aren't added, and it doesn't do much and often crashes or produces garbage images. The CUDA renderer is used by a huge portion of the userbase, so it gets new features almost as soon as the CPU renderer does. This is why I have a GTX 770 on pre-order (the only affordable 4GB one is very out of stock), and not an R7/R8/R9 card.

As for Mantle, there's also the issue that if AMD make it as hardware-specific as it might have to be to go as fast as predicted, all games will need both an AMD Mantle engine and an nVidia Mantle engine, because you can't access AMDTextureRegister[4] on a chip that holds its textures in nVidiaTexReg(0x000F27B).
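To picture why "no real function calls" blows up the compiled kernel: with true functions, the intersection routine is emitted once and every triangle just calls it; with everything force-inlined, the routine's body is duplicated at every call site. A toy model (the instruction counts are invented, only the scaling matters):

```python
# Illustrative sketch of compiled-kernel size with vs without real function
# calls. BODY_SIZE and CALL_SIZE are hypothetical instruction counts, not
# measurements from any actual OpenCL or CUDA compiler.

BODY_SIZE = 200   # instructions in one intersection routine (made up)
CALL_SIZE = 2     # instructions for a call/return (made up)

def kernel_size(call_sites, inline_everything):
    if inline_everything:
        # Body duplicated at every call site: size grows linearly and fast.
        return call_sites * BODY_SIZE
    # Body emitted once, plus a cheap call at each site.
    return BODY_SIZE + call_sites * CALL_SIZE

for triangles in (10, 1000, 100000):
    print(triangles, kernel_size(triangles, False), kernel_size(triangles, True))
```

Both variants grow linearly, but the inlined one grows 100x faster under these made-up constants, which is the shape of the VRAM-overflow problem described above for complex scenes.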
OS: Windows 10 64 bit Professional
CPU: AMD Ryzen 5900X | RAM: 48GB | GPU: Radeon 7800 XT

01-20-2014, 09:00 AM
ThorhiantheUltimate Wrote: I'd prefer if Intel wasn't the only x86 CPU manufacturer out there.

Pffft. Everyone already knows this is our inevitable future:

5 internet points to anyone who knows the reference without googling.

Honestly, I think that if Mantle and HSA actually pan out, it won't be as revolutionary for AMD as people are hyping it to be. Intel will start actually caring, will invest billions into IGP R&D, and will beat AMD at their own game. Right now Intel isn't investing much into IGP development because they assume, based on their market data, that their customers don't care about it. And they're right. They predict that this will change and are gradually allocating more resources to that department each year as it becomes more important. This, I believe, is a better approach than AMD's "invest it all now and then see if the market responds". AMD is trying to make the market adapt to them, while Intel is adapting to the market. The advantage of AMD's approach is that, in exchange for higher risk, they have the potential to get the needed improvements faster than Intel. The problem is that Intel has a lot of capital and can make drastic changes to their architecture very quickly if needed, which is why I don't believe it will work. But what do I know about high-level business strategy.

lamedude Wrote: VIA's Nano- and Pico-ITX systems were used to turn toasters and NESes into PCs back in the day. I guess with the rise of tablets and Android-on-a-stick that isn't as impressive as it used to be.

Still less useful than an actual toaster or NES.

DatKid20 Wrote: Most games are GPU bound not CPU bound.

This, a thousandfold. Unless your CPU is nearly a decade out of date, your framerate will be determined almost entirely by what GPU you have. It's been that way since around 2000-ish, when hardware T&L started to become common and transferred nearly all rendering work to the GPU.
"Normally if given a choice between doing something and nothing, I’d choose to do nothing. But I would do something if it helps someone else do nothing. I’d work all night if it meant nothing got done."
-Ron Swanson

"I shall be a good politician, even if it kills me. Or if it kills anyone else for that matter." -Mark Antony

01-20-2014, 11:25 AM
(01-20-2014, 08:36 AM)AnyOldName3 Wrote: That's because AMD's OpenCL implementation doesn't allow global functions, so every triangle needs its own function when the renderer is compiled. This causes the renderer to overflow VRAM really easily with complex scenes. nVidia's implementation does allow global functions, so it doesn't have this issue, however the CUDA renderer is significantly faster, so therefore no-one uses OpenCL at all. This means that it's been labelled as deprecated, and so features and bugfixes aren't added, so it doesn't do much, and often crashes or produces garbage images.

Thanks for the info; I could never find anyone who could tell me WHY it didn't work out with OpenCL. Yes, I knew it was deprecated, I just never knew the technical reasons why. Thank you for your explanation.

To NV: your picture reference seems vaguely familiar; too bad I can't tell you what it's from. A country with one company would be terrible :/ But yeah, I still stand by what I said: Intel being the only "good" x86 manufacturer around (and it will probably hop onto other platforms with its enormous amounts of cash) will probably be an ugly mess for consumers.

01-20-2014, 11:46 AM
http://www.dailytech.com/Nintendo+Slashe...e34157.htm
Maybe, just maybe, Nintendo will learn not to make underpowered consoles and will release the Nintendo 3DS's successor in a few years with an Nvidia Tegra processor /wishfulthinking

01-20-2014, 11:59 AM
DatKid20 Wrote: Maybe just maybe Nintendo will learn to not make underpowered consoles

Ahahahaha

Spoiler:
"Normally if given a choice between doing something and nothing, I’d choose to do nothing. But I would do something if it helps someone else do nothing. I’d work all night if it meant nothing got done."
-Ron Swanson

"I shall be a good politician, even if it kills me. Or if it kills anyone else for that matter." -Mark Antony

DatKid20 Wrote: and will release the Nintendo 3DS's successor in a few years with a Nvidia Tegra Processor

Why did you point to a link about the Wii U selling badly and then talk about the 3DS as if it's going to die? You do realize the 3DS is selling amazingly well, right? http://www.nintendolife.com/news/2014/01/nintendo_boasts_of_record_setting_3ds_sales_in_2013_and_confirms_yoshirs_new_island_release_date

Plus, Nintendo has a massive cash reserve of 14 billion dollars in pure currency thanks to the Wii. Of course, none of this saves the Wii U, but it means the Wii U won't tank Nintendo no matter how badly it does. Nintendo's big problem right now is not "surviving", it's making sure it doesn't become irrelevant in the console space so that the Wii U's successor has a chance. They can't pull out new hardware this early, as Sega taught everyone what happens if you do, so Nintendo needs to buckle down and ride the Wii U out as best they can so they have as much momentum as possible for the next cycle. Just like the GameCube.

I'm really getting sick of all this Nintendo doomsday crap. The situation is complicated and should be judged with all the factors in view, not histrionics and catchy headlines. Have some quality reading on the subject.
http://www.nintendolife.com/news/2014/01...e_not_doom
http://www.polygon.com/2014/1/17/5319016...s-to-adapt

Intel Xeon w7-3465X OC | Asus Pro WS W790-E Sage SE | NVIDIA GeForce RTX 4090 FE | 8x16GiB G-Skill Zeta R5 DDR5-6000 | Windows 11 23H2 | (details)
MacBook Pro 14in | M1 Max (32 GPU Cores) | 64GB LPDDR5 6400 | macOS 13