Watch_Dogs Review: Deus Ex Smartphonia

by Tim Harmer | 27-05-14
Next Generation Graphics, Hardware Dependent


Test System

CPU: Intel Core i5 3570K @ Stock
Memory: 8GB (2 x 4GB) G.Skill DDR3 1333MHz @ Stock
GPU: ZOTAC GTX 660 AMP! Edition *
OS: Windows 7
OS HDD: 256GB Samsung 840 EVO SSD
Uplay Folder: As above
Monitor Resolution: 1920 x 1080 (1080p)




*For the purposes of this review we had access to the unreleased NVIDIA GeForce 337.88 WHQL drivers. These drivers should be available upon the release of Watch_Dogs and are highly recommended for all users of NVIDIA GPUs.

Near-future Chicago is based on the Chicago of the real world, so few liberties have been taken with the design, layout and art direction. Instead Ubisoft have pushed for as realistic a representation of Chicago as possible, including locally famous landmarks and transport systems, whilst giving it a slightly futuristic feel. In building this world Ubisoft created Disrupt, a game engine designed around a true open world rather than the walled corridors or fixed zone transitions that typically funnel gameplay; in that regard it is very much in the mould of the Grand Theft Auto franchise.

Unlike similar titles, Watch_Dogs eschews the more 'zany' in-world interactions, relying instead on solid in-game physics for a realistic take – at least, as realistic as possible when the protagonist wields a magic smartphone that can burst pipes at a keypress. As a consequence you're unlikely to see highlight reels as ridiculous as those of Just Cause 2, but the game also lacks the charm and humour such titles bring to bear.



For its PC adaptation Ubisoft enlisted the aid of NVIDIA, and with their help implemented two NVIDIA-specific technologies, HBAO+ and TXAA, for those with suitable hardware.



Both HBAO+ and TXAA increase graphical quality with relatively low performance overhead compared to more widely used methods, improving in-game visuals at every level of hardware power. Those without NVIDIA graphics hardware can fall back on traditional ambient occlusion and anti-aliasing techniques, but these will have a more significant impact on performance.

Note that for the highest graphics quality levels 3GB of VRAM are required.
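If you want to check whether your card clears that bar before buying, total VRAM can be queried programmatically. Below is a minimal sketch using the pynvml NVML bindings; the library choice is our assumption, and any tool reporting total GPU memory (GPU-Z, for instance) will do the same job.

```python
# Minimal sketch: report each NVIDIA GPU's total VRAM against the 3GB
# requirement for Watch_Dogs' highest texture setting.
# Assumes an NVIDIA GPU and the nvidia-ml-py package (pip install nvidia-ml-py).
from pynvml import (nvmlInit, nvmlShutdown, nvmlDeviceGetCount,
                    nvmlDeviceGetHandleByIndex, nvmlDeviceGetName,
                    nvmlDeviceGetMemoryInfo)

nvmlInit()
try:
    for i in range(nvmlDeviceGetCount()):
        handle = nvmlDeviceGetHandleByIndex(i)
        name = nvmlDeviceGetName(handle)
        if isinstance(name, bytes):  # older bindings return bytes
            name = name.decode()
        total_gb = nvmlDeviceGetMemoryInfo(handle).total / (1024 ** 3)
        verdict = "meets" if total_gb >= 3 else "falls short of"
        print(f"GPU {i}: {name}, {total_gb:.1f}GB VRAM ({verdict} the 3GB bar)")
finally:
    nvmlShutdown()
```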



The level of customisation available in-game is inadequate, continuing a trend in Ubisoft releases whereby PC ports are grudging and rarely support the full range of options gamers expect. AA and AF sliders are welcome inclusions among the half-dozen settings not in some way automated, alongside texture and shadow quality adjustments, both critically important when tuning for smooth frame rates.

We played the game at the above settings (broadly close to the High preset), which resulted in a comfortable, if not optimal, 35-45fps. At these generally High (although certainly not Ultra) settings Watch_Dogs still looked great, even if it didn't match the visual fidelity of the trailers and gameplay videos Ubisoft has released to date. Ultra settings were impossible to sustain on our test PC, resulting in unacceptable frame rates and severe stuttering, likely down to inadequacies in the GPU, CPU and/or memory subsystem.
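Readers who want to quantify rather than eyeball such figures can reduce a frame-time log (for example, the frametimes CSV a capture tool like FRAPS writes) to the two numbers that matter: average fps and worst-case frame times. Below is a minimal sketch assuming a plain text file with one frame duration in milliseconds per line; the filename is illustrative.

```python
# Minimal sketch: summarise a frame-time log (one duration in ms per line)
# into average fps and the 99th-percentile frame time, the figures that
# best separate "comfortable 35-45fps" from stutter.
import statistics

def summarise(path: str) -> None:
    with open(path) as f:
        frame_ms = [float(line) for line in f if line.strip()]
    avg_fps = 1000.0 / statistics.mean(frame_ms)
    # quantiles(n=100) yields the 1st..99th percentile cut points.
    p99_ms = statistics.quantiles(frame_ms, n=100)[98]
    print(f"{len(frame_ms)} frames, avg {avg_fps:.1f} fps, "
          f"99th percentile frame time {p99_ms:.1f} ms")

summarise("frametimes.txt")  # illustrative filename
```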

Although it looks good during the daytime, the engine really shines at night and when weather effects kick in. Disrupt does a great job with diffuse lighting and partially reflective surfaces, giving them a sheen they wouldn't otherwise have and building an almost oppressive atmosphere. In contrast to other recent PC releases, particle and fabric physics effects are much more subtle, which benefits the game's look while also retaining decent frame rates on non-optimal systems.

As the public plays the game post-release, both NVIDIA and AMD will arrive at optimum PC settings for a range of system specs thanks to data gathered by GeForce Experience and Raptr respectively, and should distribute them post-haste. If you spot the irony of a game about mass data collection being optimised through mass data collection, don't worry: it just means you're still human.

One slight niggle we discovered was frame-rate stuttering at the tail end of long gameplay sessions. After around three hours we began to notice fractional pauses, after which the game caught up; the longer we played beyond this point, the more frequent the stutters became. We can't tell whether this was down to our system, the engine or the driver, but the problem cleared up after restarting the game client.
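Stutters of this kind are easy to spot in a frame-time log after the fact, showing up as isolated spikes well above the running median. Below is a minimal sketch reusing the per-frame log format assumed above; the 3x-median threshold is an illustrative choice, not a standard.

```python
# Minimal sketch: flag stutter frames in a frame-time log as spikes
# above 3x the median frame time; threshold and filename are illustrative.
import statistics

def find_stutters(path: str, factor: float = 3.0):
    with open(path) as f:
        frame_ms = [float(line) for line in f if line.strip()]
    threshold = factor * statistics.median(frame_ms)
    return [(i, ms) for i, ms in enumerate(frame_ms) if ms > threshold]

for index, ms in find_stutters("frametimes.txt"):
    print(f"frame {index}: {ms:.1f} ms spike")
```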

--

Update

We've now read reports of a significant performance dip on systems with AMD GPUs, apparently due to Ubisoft's integration of proprietary NVIDIA GameWorks technologies for certain high-quality effects. Some testers have AMD R9 290 GPUs running substantially slower than a GTX 770; whilst the R9 290 still provides highly playable rates, a similar deficit could be extremely detrimental in the mid-range.

Launch disparities between GPU vendors are nothing new, and both AMD and NVIDIA have been the beneficiaries of close relationships with developers in the past. Typically, however, these differences even out over time as driver engineers tailor releases once exposed to launch code. Such was the case with Tomb Raider (2013), where AMD benefited, and we hope the same will be true of Watch_Dogs. Just as NVIDIA has its 337.88 WHQL drivers, AMD should soon release its first optimised drivers, likely the Catalyst 14.6 betas.

We would also say that it would be extremely disappointing if one vendor's performance were unfairly throttled, more in keeping with the bad old days of vendor-specific optimisations and anticompetitive practices.



