Same question as in the random thread: is there anything other than the ability to use silly-huge texture mods that justifies the extra £60 for a 4GB GTX 770 over the 2GB one, for a single-monitor, single-GPU user?
Well... it could possibly help with a 4K monitor down the road. That's all I got.
By the time 4K monitors are cheap enough for me to get one, I'll likely have changed to another card. Also, when the site that did the most exhaustive testing compared 1080p vs double widescreen vs quadruple widescreen (pixel-equivalent to 4K), there were usually only decimal framerate improvements. If no-one's got an obvious reason, then it looks like the 2GB card.
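For what it's worth, the pixel-equivalence claim checks out: four 1080p panels add up to exactly the pixel count of a 4K UHD screen. A quick sanity check, assuming the standard 1920×1080 and 3840×2160 resolutions:

Code:
# Quick check that "quadruple widescreen" is pixel-equivalent to 4K UHD.
single = 1920 * 1080   # one 1080p panel: 2,073,600 px
quad = 4 * single      # four panels:     8,294,400 px
uhd = 3840 * 2160      # 4K UHD:          8,294,400 px
print(quad == uhd)     # True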
I'd definitely suggest getting a GPU with more VRAM, especially with the next-gen games coming. The PS4, for example, has around 4.5GB of VRAM available to games, so we're going to see games with higher-resolution textures and the like next-gen.
Watch Dogs, for instance, recommends a GPU with at least 2GB of VRAM. So unless you're on a tight budget, get a GPU with more than 2GB of VRAM.
If you don't mind going AMD again, you could also take a look at the R9 280X or 7970 GHz cards. They're very competitive with the GTX 770.
Edit: Nvm, I just remembered you wanted to go NVIDIA this time for GPU rendering in Blender.
(11-03-2013, 02:06 AM)Garteal Wrote: I'd definitely suggest getting a GPU with more VRAM, especially with the next-gen games coming. The PS4, for example, has around 4.5GB of VRAM available to games, so we're going to see games with higher-resolution textures and the like next-gen.
Watch Dogs, for instance, recommends a GPU with at least 2GB of VRAM. So unless you're on a tight budget, get a GPU with more than 2GB of VRAM.
If you don't mind going AMD again, you could also take a look at the R9 280X or 7970 GHz cards. They're very competitive with the GTX 770.
Edit: Nvm, I just remembered you wanted to go NVIDIA this time for GPU rendering in Blender.
Again, given that most of the games I buy are a couple of years old, by the time this becomes an issue I'll probably be thinking about a new GPU anyway.
If someone could look at Scan in case all the cheap 2GB cards are flawed in some way (e.g. a manufacturer with a habit of skimping on thermal paste) while the cheaper 4GB ones are fine, that would help.
http://www.scan.co.uk/shop/computer-hardware/all/gpu-nvidia/geforce-gtx-770-(1536-cores)
AnyOldName3 Wrote: By the time 4K monitors are cheap enough for me to get one, I'll likely have changed to another card. Also, when the site that did the most exhaustive testing compared 1080p vs double widescreen vs quadruple widescreen (pixel-equivalent to 4K), there were usually only decimal framerate improvements. If no-one's got an obvious reason, then it looks like the 2GB card.
Do keep in mind that when you start to run slightly low on VRAM you get severe microstuttering: long stretches where there's enough VRAM to keep the game running smoothly, broken up by brief moments where the game suddenly needs a bit more VRAM than it has, causing a sudden, short crash in framerate. The end result on average framerate may only be a few fps, so if the averages differ by more than a small margin of error, that's clear evidence of microstuttering.
Since you didn't post the benchmarks in question I can't comment on whether or not that's the case here, but it sounds like it isn't. I'm assuming by "decimal framerate improvements" you mean <1 fps.
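This is also why averages hide the problem: a handful of long frames barely move the mean but show up clearly in the 1% lows. A minimal sketch with made-up frame times (the numbers are purely illustrative, not from any real benchmark):

Code:
# Average fps vs. 1% lows from per-frame times (all numbers invented).
frame_times_ms = [16.7] * 297 + [120.0, 130.0, 140.0]  # 3 VRAM-swap spikes in 300 frames

avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

worst = sorted(frame_times_ms)[-max(1, len(frame_times_ms) // 100):]
low_1pct_fps = 1000.0 * len(worst) / sum(worst)

print(f"average: {avg_fps:.1f} fps")   # ~56 fps, looks close to 60
print(f"1% lows: {low_1pct_fps:.1f} fps")  # ~7.7 fps, the stutter shows up here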
At the moment, having that much video RAM is only useful for 4K resolution + AA, craptons of SSAA, multi-monitor (3 or more), crazy texture mods, or some combination of those. But Garteal is right about the explosion of VRAM usage that's likely about to happen with next-gen games.
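To put rough numbers on the 4K + AA case, here's a back-of-envelope estimate of just the colour and depth render targets (illustrative only; real engines allocate many more buffers, and textures usually dominate VRAM anyway):

Code:
# Back-of-envelope render-target sizes in MiB; purely illustrative.
def target_mib(width, height, bytes_per_px, samples=1):
    return width * height * bytes_per_px * samples / 2**20

# 1080p, no AA: RGBA8 colour target + D24S8 depth target
print(2 * target_mib(1920, 1080, 4))       # ~15.8 MiB
# 4K with 4x MSAA: both targets store 4 samples per pixel
print(2 * target_mib(3840, 2160, 4, 4))    # ~253 MiB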
You're not going to see any build-quality difference between the 2GB and 4GB cards other than the RAM ICs used. Though I still think >2GB VRAM use will be rare.
Garteal Wrote: The PS4, for example, has around 4.5GB of VRAM available to games, so we're going to see games with higher-resolution textures and the like next-gen.
The PS4 has 4.5GB of TOTAL unified memory available for games. Only part of that will be used to store GPU assets. I suspect we'll still be able to get away with maxing out next-gen games on 2GB cards for a while.
(11-03-2013, 03:04 PM)NaturalViolence Wrote: The PS4 has 4.5GB of TOTAL unified memory available for games. Only part of that will be used to store GPU assets. I suspect we'll still be able to get away with maxing out next-gen games on 2GB cards for a while.
Yes, you're right, it's shared memory. Still, my main point was that the baseline will be much higher than in the previous generation.
You'll be fine with 2GB for now, especially if you're going to play your old games. Which GTX 770 are you looking at right now?
I'll post the Scan link again so you can compare pricing on the various models:
http://www.scan.co.uk/shop/computer-hardware/all/gpu-nvidia/geforce-gtx-770-(1536-cores)
If I'm getting a 2GB card, I'm most likely to get the MSI one (£251.72), since it should boost higher with my MSI mobo thanks to a feature called VGA Boost.
Quote:MSI VGA Boost is one of MSI's exclusive ways to support gamers to fully engage in their virtual world.
VGA Boost increases the power limitations for MSI GAMING graphics cards when inserted in an MSI GAMING motherboard. It simply upgrades your Power Tuning and current limits allowing your graphics to boost to higher clock speeds when your gaming graphics get more intense and sustain maximum performance for a longer time.
This is all supported because of the power design for the PCI Express slots and the build quality of MSI Graphics cards. Now, matching MSI gaming hardware not only gives your game PC some fiery red dragon looks, your PC gaming experience will benefit from a safe and powerful boost in graphics.
Of course, it's always possible that a cheaper card may boost higher anyway due to faster stock clocks.
If I'm going for a 4GB card, it'll probably be the Zotac one (£275.72), since it's much cheaper than any of the other 4GB cards.
(11-02-2013, 12:02 AM)NaturalViolence Wrote: You'll be fine with VGA. I doubt you'll even notice the difference.
Hey, you're right!
I had actually already tested VGA before posting here, but with my HDMI device already dead I couldn't make a valid comparison. Because HDMI is a newer, more advanced technology than VGA, I just couldn't shake the feeling that the picture looked better over an HDMI cable. But all I did today was adjust my TV's picture settings, increasing the contrast and lowering the brightness, and BAM!! It looks just like HDMI.
The R9 290 is extremely impressive. One of my friends is building his first rig and I might actually recommend he return the GTX 770 and delay the build just to get this instead.