Poor Watch Dogs PC Performance? Here's Why And How To Fix It

Over the past few weeks, I've spent a significant amount of time with Watch Dogs. The game is infamous for not running very well on PCs -- and after some legwork, I've figured out a hefty chunk of the reason why: 

Ubisoft royally screwed up its suggested VRAM (graphics card frame buffer) settings.

Watch Dogs sucks down far more video memory than any other modern PC game I'm aware of; its VRAM demands are far outside the norm for a 1080p title. In the past, I've compared VRAM usage in games like Guild Wars, Battlefield 4, and Total War: Shogun 2. In those games, average VRAM use with all detail levels maxed out was about 1.5GB. BF4 will break 2GB if you use the supersampling option to render the game internally at 4K, and Total War: Shogun 2 wants a bit more than 2GB if you max out every graphics setting and enable 8x MSAA.

Ubisoft claims that Watch Dogs' "High" texture detail setting requires a 2GB frame buffer while its "Ultra" textures need 3GB of frame buffer memory. That might technically be true, but these figures should be treated as a minimum, not a maximum. Playing through the game with High Textures and "Ultra" details (the two settings are controlled separately) on a GeForce GTX 770, my system was dogged by repeated, jerky slowdowns. Switching to "High" Details improved the situation, but didn't resolve it.

A check of GPU-Z and swapping to a GTX 780 (with 3GB of RAM) demonstrated why -- High Textures + Ultra Details chews up an easy 2.5GB of GPU RAM even at 1080p. High Textures + High Details still consumes about 2.1GB -- enough to send the GTX 770 into periodic paroxysms and long, stuttering pauses. This is often exacerbated by an Alt-Tab issue in Watch Dogs -- if you're playing this game, you don't want to Alt-Tab. Set the game to Borderless fullscreen mode, or you'll face gradual performance degradation every time you switch between game and desktop.


This demo shows stuttering with a GeForce GTX 770 set to High Textures and Ultra Detail.
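
If you want to verify the memory numbers on your own hardware, you don't need GPU-Z specifically -- NVIDIA exposes the same counter through its NVML library. Below is a minimal Python sketch using the pynvml bindings (pip install pynvml); it's a quick illustration, not a tuning tool. Note that NVML reports memory in use across the whole GPU, not just the game, and the one-second polling interval is an arbitrary choice.

    # Minimal VRAM logger -- assumes an NVIDIA card and the pynvml bindings.
    # Run it in a console while the game is up; Ctrl-C stops it cleanly.
    import time
    import pynvml

    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

    try:
        while True:
            info = pynvml.nvmlDeviceGetMemoryInfo(handle)  # bytes, whole GPU
            print(f"VRAM: {info.used / 2**20:,.0f} MB / {info.total / 2**20:,.0f} MB")
            time.sleep(1)  # poll once per second
    except KeyboardInterrupt:
        pynvml.nvmlShutdown()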

Unfortunately, the best way to improve performance in Watch Dogs is to ignore Ubisoft's recommendations altogether and opt for lower detail levels, depending on your configuration and monitor resolution. Our recommendations:

1GB of Video Memory
1366x768 Resolution
Medium Textures
Low Detail Settings
FXAA

1-2GB of Video Memory
1920x1080 Resolution
Medium Textures
Medium Detail Settings
FXAA

2GB of Video Memory
1920x1080 Resolution
High Textures
Medium Detail Settings
Temporal SMAA

3GB of Video Memory
1920x1080 Resolution
High Textures
High Detail Settings
2x TXAA / 4x MSAA (may require stepping down to Medium Detail Settings)

This means that only users with a 3GB-or-better card -- a Radeon R9 290 or GeForce GTX 780, for example -- are going to see Watch Dogs in its full glory.
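
If you'd like those tiers in a form you can tinker with, here's a quick Python sketch that encodes the list above as a simple lookup. The function and field names are my own illustration -- nothing here comes from the game's config files -- and since the 1-2GB and 2GB tiers overlap at exactly 2GB, the sketch resolves that case upward to the more aggressive settings.

    # Encodes the recommendation tiers above as a lookup. Function and field
    # names are illustrative; a card with exactly 2GB gets the "2GB" tier.
    def recommend_settings(vram_gb: float) -> dict:
        """Map a card's video memory (in GB) to suggested Watch Dogs settings."""
        if vram_gb >= 3:
            return {"resolution": "1920x1080", "textures": "High",
                    "details": "High",  # may need Medium with TXAA/MSAA enabled
                    "aa": "2x TXAA / 4x MSAA"}
        if vram_gb >= 2:
            return {"resolution": "1920x1080", "textures": "High",
                    "details": "Medium", "aa": "Temporal SMAA"}
        if vram_gb > 1:
            return {"resolution": "1920x1080", "textures": "Medium",
                    "details": "Medium", "aa": "FXAA"}
        return {"resolution": "1366x768", "textures": "Medium",
                "details": "Low", "aa": "FXAA"}

    print(recommend_settings(2))  # a 2GB card like the GTX 770
    print(recommend_settings(3))  # a 3GB card like the GTX 780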

It's hard to know, at this point, how much of the game's wretched memory management is caused by poor optimization and how much by Ubisoft's engine design decisions. The impact, however, leaves many graphics cards -- including GPUs that are less than a year old -- staggering under the rendering load. For this reason alone, I'd say Ubisoft miscalculated its own requirements: High Textures ought to have been described as requiring at least 3GB of RAM, and Ultra Textures as requiring more like 4GB. Once you enable anti-aliasing, the game will chug even on a GTX 780.

Ubisoft has said that a patch is coming that may address some of these issues.  We'll give it a spin when it finally arrives.