
New HAWX 2 Demo Accused of Benchmark Shenanigans Before It Even Launches


Top 10 Contributor
Posts 24,884
Points 1,116,905
Joined: Sep 2007
ForumsAdministrator
News Posted: Thu, Oct 21 2010 7:58 PM
It's nothing new to see AMD and NVIDIA arguing over benchmarks, but news that came across our desk today put a new twist on a very old battle. A few days ago, NVIDIA contacted us and asked if we'd be interested in testing HAWX 2's stand-alone benchmark for evaluation, noting that tessellation effects were employed for better terrain realism. We said sure—a number of DX11-enabled games and test programs currently make use of the effect. Synthetic benchmarks aren't much better off at this point; Unigine might as well be called TessellationMark. NVIDIA's Reviewer's Guide also put great emphasis on the game's use of tessellation and Fermi's superior performance in that area.

We read the guide, benchmarked the game, wrote down some numbers, and that was that—or so we thought. Imagine our surprise when we got an email from AMD claiming that HAWX 2 was a very poor test and ought not to be used. The letter is quoted below:
"It has come to our attention that you may have received an early build of a benchmark based on the upcoming Ubisoft title H.A.W.X. 2.  I'm sure you are fully aware that the timing of this benchmark is not coincidental and is an attempt by our competitor to negatively influence your reviews of the AMD Radeon™ HD 6800 series products.  We suggest you do not use this benchmark at present as it has known issues with its implementation of DirectX® 11 tessellation and does not serve as a useful indicator of performance for the AMD Radeon™ HD 6800 series.  A quick comparison of the performance data in H.A.W.X. 2, with tessellation on, and that of other games/benchmarks will demonstrate how unrepresentative H.A.W.X. 2 performance is of real world performance.
 
AMD has demonstrated to Ubisoft tessellation performance improvements that benefit all GPUs, but the developer has chosen not to implement them in the preview benchmark.  For that reason, we are working on a driver-based solution in time for the final release of the game that improves performance without sacrificing image quality.  In the meantime we recommend you hold off using the benchmark as it will not provide a useful measure of performance relative to other DirectX® 11 games using tessellation."
This argument over what is and isn't an accurate tessellation performance test underscores the fact that there are precious few titles to choose from currently. Fermi's tessellation performance is much stronger than AMD's at this point, but superior tessellation does not automatically translate into superior frame rates. Meanwhile, we're curious to see how this situation develops. Some of you might remember the ATI/NVIDIA Doom 3 vs. Half-Life 2 knock-down battles from years ago. Back then, NVIDIA was nailed to the wall by the GeForce FX's performance in a relevant tech demo. Thanks in part to Gabe Newell's evangelism of his new game engine, journalists took the terrible FX numbers as an indication that no one wanting to play HL2 should buy an NVIDIA card to do so.




Its status as a useful benchmark may be debatable, but heavy tessellation does enhance in-game realism.

By the time the game actually shipped, NVIDIA had launched an entirely new GPU that didn't suffer from the same performance limitations as the earlier FX cards, but the damage had been done. Despite how the numbers had changed, NVIDIA was seen as a less-than-ideal choice for gamers wanting to play Half-Life 2. In this case, we won't have to wait a year to find out how much tessellation really matters across a wide range of games—with a range of titles launching in the next few months, we'll soon have a comprehensive view of how much tessellation performance affects gameplay.
Top 10 Contributor
Posts 5,048
Points 60,675
Joined: May 2008
Location: U.S.
Moderator
3vi1 replied on Thu, Oct 21 2010 10:35 PM

>> Unigine might as well be called TessellationMark.

Except that, unlike FutureMark, there are actual games being developed on the Unigine platform; at least one of them (http://oilrush-game.com) will be released in about a month.

What's wrong with Unigine?

>> For that reason, we are working on a driver-based solution in time for the final release of the game that improves performance without sacrificing image quality.

I.e., "We're going to make the driver recognize this game, discard the developer's code, and substitute our own highly-optimized-for-only-our-own-platform code, so that future reviews will be comparing apples to something that looks exceedingly like an apple but won't at all be indicative of the relative power to run the same code."
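Conceptually, that kind of per-application "driver-based solution" amounts to a lookup table keyed on the running game. The sketch below is purely illustrative—the function and shader names are invented, and real driver internals are nothing this simple—but it captures the substitution mechanism being described:

```python
# Hypothetical illustration of driver-side application detection and
# shader substitution. All names here are invented for clarity; this
# is not taken from any actual driver.

# Vendor-tuned replacement shaders, keyed by the game's executable name.
SHADER_OVERRIDES = {
    "HAWX2.exe": {
        "terrain_hull_shader": "terrain_hull_vendor_optimized",
    },
}

def select_shader(exe_name, shader_name):
    """Return the vendor's substitute shader if one is registered for
    this game, otherwise the developer's original shader."""
    overrides = SHADER_OVERRIDES.get(exe_name, {})
    return overrides.get(shader_name, shader_name)
```

The point of 3vi1's objection follows directly: once the driver silently swaps in vendor-specific code for a known benchmark, the two cards are no longer running the same workload.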

I can't see Ubisoft discarding ATI/AMD's suggested 'fix' code unless it hurt performance for the Nvidia cards or had some other negative aspect.

What part of "Ph'nglui mglw'nafh Cthulhu R'lyeh wgah'nagl fhtagn" don't you understand?

++++++++++++[>++++>+++++++++>+++>+<<<<-]>+++.>++++++++++.-------------.+++.>---.>--.

Top 10 Contributor
Posts 6,181
Points 90,135
Joined: Aug 2003
Location: United States, Virginia
Moderator

Wow. I feel like we're in 2003 again with all the benchmark scandals. I think it's kinda low on both sides to suggest or ask reviewers to avoid certain benchmarks.

Top 100 Contributor
Posts 1,038
Points 11,350
Joined: Jul 2009
Joel H replied on Fri, Oct 22 2010 11:44 AM

3vi1,

"What's wrong with Unigine?"

Nothing. There's nothing wrong with 3DMark, either. In both cases, however, it's important to know how they function and what they test. Over at Tech Report, Scott W has a piece on why Unigine's use of tessellation isn't all that great: http://techreport.com/articles.x/18682/10

On modern cards, the one performance factor that makes Unigine run smoothly is tessellation capability. If you want to test tessellation performance by itself, Unigine may be right up your alley. Similarly, games that employ the Unigine engine might turn out to be fantastic benchmarks.
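A rough back-of-the-envelope calculation shows why tessellation so easily becomes the dominant factor. Assuming uniform integer partitioning, a tessellation factor of F splits each patch edge F ways and produces on the order of F² sub-triangles per patch (the exact count from a real D3D11 tessellator differs slightly, so treat this as an order-of-magnitude estimate):

```python
def approx_triangles(patches, tess_factor):
    """Rough triangle count after tessellation: each patch tessellated
    at factor F produces about F * F sub-triangles. Order-of-magnitude
    estimate assuming uniform integer partitioning, not an exact
    D3D11 tessellator count."""
    return patches * tess_factor * tess_factor

# A scene with 10,000 patches at factor 1 vs. factor 16:
base = approx_triangles(10_000, 1)    # ~10,000 triangles
heavy = approx_triangles(10_000, 16)  # ~2,560,000 triangles
```

Raising the factor from 1 to 16 multiplies the triangle load by roughly 256, not 16—which is why a benchmark that leans hard on tessellation ends up measuring little else.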

Top 500 Contributor
Posts 140
Points 1,710
Joined: Jun 2010
Location: Toronto

Not really a big deal, since HAWX 2 isn't really that graphics-heavy; even the 6850 averages 70 fps in the benchmark with tessellation on at 1080p, although the minimum frame rates are lower.

AMD is probably poo-pooing this more because 5800-series tessellation performance vs. 6800-series tessellation performance shows next to no improvement in this game, meaning their "smart tessellation" implementation isn't providing enough muscle.

Top 10 Contributor
Posts 5,048
Points 60,675
Joined: May 2008
Location: U.S.
Moderator
3vi1 replied on Sat, Oct 23 2010 3:18 PM

>> There's nothing wrong with 3DMark, either.

I actually would have sided with you even if you'd said there were: it seems like all the card manufacturers have chided them in the past for doing things in a manner that was unrealistic for a game engine, and therefore not so comparable to real games. At least that's how I remember it.

Thanks for the link! I'll have to check that out.

What part of "Ph'nglui mglw'nafh Cthulhu R'lyeh wgah'nagl fhtagn" don't you understand?

++++++++++++[>++++>+++++++++>+++>+<<<<-]>+++.>++++++++++.-------------.+++.>---.>--.
