
How much VRAM do you really need at 1080p, 1440p and 4K?

We test out our usual suite of benchmarks to see how much VRAM you really need

2015 is the year of 4K, right? But how much VRAM do you really need to play games in 1080p, 1440p, or 4K? There are so many conflicting reports: you need 6GB! No, 8GB! No, that's what the 12GB on Titan X is for! Well, with the next-gen AMD Radeon Fury X right around the corner rocking High Bandwidth Memory (HBM), and only 4GB of it, we have to start asking: just how much VRAM do you really need?




Most flagship GPUs are now coming with a minimum of 4GB, but with the release of the Titan X we saw this swell up to a huge 12GB. Things came back down to reality with the release of the GeForce GTX 980 Ti, which sports 6GB of VRAM. AMD is going to do things in reverse if the rumors are true, with their flagship Fury X card featuring 4GB of HBM, while their rebadged R9 390X will feature up to 8GB of GDDR5.


But when gaming at 4K, do you really need 8-12GB of VRAM? This is why we're here. We've used an NVIDIA GeForce GTX 980 Ti on a 4K monitor to do all of our testing to see how much VRAM is used in three resolutions: 1080p, 1440p and 4K.


Instead of just writing about how many pixels are being rendered, we've put them into a chart so you can better understand just how many pixels we're driving here today. Right now, the 'next-gen' consoles render games at around 720p - 900p, and usually at 30FPS rather than 60FPS. If they were running 720p at 60Hz (60FPS), they would be rendering 55 million pixels per second.




Jumping up to 1080p, that number climbs to 124 million while 1440p has it jump to 221 million. At 4K, the pixels rendered per second at 60Hz start to get serious, with 497 million, but 4K Surround has this catapult to 1.49 billion. 8K, which is in the not-too-distant future, sees 1.99 billion pixels being rendered per second.
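The pixel-throughput figures above are straightforward arithmetic (width x height x refresh rate), so here's a quick sketch that reproduces them. The resolution list and output formatting are our own illustration, not anything from the benchmark runs:

```python
# Pixels rendered per second at 60Hz for each resolution discussed above.
# 4K Surround is three 4K panels side by side.
resolutions = {
    "720p":        (1280, 720),
    "1080p":       (1920, 1080),
    "1440p":       (2560, 1440),
    "4K":          (3840, 2160),
    "4K Surround": (3 * 3840, 2160),
    "8K":          (7680, 4320),
}

for name, (w, h) in resolutions.items():
    pixels_per_second = w * h * 60  # 60 frames per second
    print(f"{name:12s} {pixels_per_second / 1e6:7.0f} million pixels/s")
```

Running this gives 124 million for 1080p, 221 million for 1440p, 497 million for 4K and roughly 1.49 billion for 4K Surround, matching the chart.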




  • CPU: Intel Core i7 5820K processor w/Corsair H110 cooler
  • Motherboard: GIGABYTE X99 Gaming G1 Wi-Fi
  • RAM: 16GB Corsair Vengeance 2666MHz DDR4
  • Storage: 240GB SanDisk Extreme II and 480GB SanDisk Extreme II
  • Chassis: Lian Li T60 Pit Stop
  • PSU: Corsair AX1200i digital PSU
  • Software: Windows 7 Ultimate x64




How we ran our tests


We ran all of the normal tests that we do for our video card reviews, and recorded their average VRAM usage in megabytes (so 1400MB = 1.4GB) over the entire run. Battlefield 4 is a little different, as we normally run a 5-minute real-time multiplayer match and average the FPS for our benchmarks. For this article, we ran it as usual and simply recorded VRAM consumption alongside.
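In other words, the headline number for each game is just the mean of the VRAM readings sampled over the run, converted to GB. A minimal sketch of that averaging, with hypothetical per-second readings standing in for real driver polls:

```python
def average_vram_gb(samples_mb):
    """Average a list of VRAM readings (in MB) and return the result in GB."""
    return sum(samples_mb) / len(samples_mb) / 1000  # 1000MB = 1GB, as above

# Hypothetical per-second VRAM readings from a benchmark run
samples = [1350, 1420, 1480, 1410, 1390]
print(f"Average VRAM used: {average_vram_gb(samples):.2f}GB")  # 1.41GB
```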


So how did we go? Well, let's see.







Heaven - 4K Surround





Battlefield 4





Metro: Last Light





Middle-earth: Shadow of Mordor





Grand Theft Auto V











Tomb Raider





BioShock Infinite







1080p: At 1920x1080, we're seeing an average of around 1.5GB to 2GB of VRAM being used, with GTA V blowing all the way out to 3.7GB, and Shadow of Mordor getting there with 3.3GB. Still, 1080p is quite tame on the framebuffer.


1440p: Moving up to 2560x1440, we see GTA V consume 3.9GB of VRAM while Shadow of Mordor uses 3.5GB. Thief begins to crawl up there with 2.9GB of VRAM, while Battlefield 4 on the Ultra preset (minus AA) is only consuming 2.2GB of VRAM and is one of the best looking games out right now.


4K: As for what games or tests use the most VRAM, we see that it's a tie between Thief and GTA V, both using 4.3GB of VRAM, while Shadow of Mordor isn't far behind with 3.9GB being used - all at 4K. Battlefield 4 only uses 3GB of VRAM, which is quite surprising.
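Pulling the peak figures from the three resolution summaries above into one place makes the headroom question concrete. Here's a small sketch that checks each reported figure against a hypothetical 4GB framebuffer (the capacity rumored for the Fury X); the numbers are the ones quoted in this article:

```python
# Average VRAM usage (GB) as reported above, per game and resolution
vram_used_gb = {
    "GTA V":            {"1080p": 3.7, "1440p": 3.9, "4K": 4.3},
    "Shadow of Mordor": {"1080p": 3.3, "1440p": 3.5, "4K": 3.9},
    "Thief":            {"1440p": 2.9, "4K": 4.3},
    "Battlefield 4":    {"1440p": 2.2, "4K": 3.0},
}

CARD_VRAM_GB = 4.0  # hypothetical 4GB card

for game, results in vram_used_gb.items():
    for res, used in results.items():
        if used > CARD_VRAM_GB:
            print(f"{game} at {res}: {used}GB exceeds a {CARD_VRAM_GB:.0f}GB framebuffer")
```

Only GTA V and Thief at 4K spill past the 4GB mark, which lines up with the conclusion below.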


Final Thoughts


Surprised? Probably. 4GB of VRAM is more than enough for most video cards today, even at 4K. We haven't taken any anti-aliasing into account, as we're going to follow up with another article that looks at 1080p, 1440p and 4K with 4xAA enabled to see how much AA strains the framebuffer in these titles. In all my years of using PCs, I've barely used AA; it's a personal preference. I'd rather have high framerates on my 120-144Hz screens, though anti-aliasing really does help at 1080p and below.




This test has shown that 8GB of VRAM is pretty useless right now; there's just no need for it. Sure, NVIDIA has a video card with 12GB of VRAM in the Titan X, and 6GB on its new GTX 980 Ti, but most of the time that much isn't needed, even at 4K. If you start enabling AA, which we're going to do soon, VRAM consumption is going to skyrocket, and that will be interesting to see in our future article.


For now, we've shown you that even the latest games don't push far past 4GB of VRAM, so you can feel safe buying yourself a new card with 4-6GB of VRAM.
