Intel "cheating" in 3DMark?


Top 10 Contributor
Posts 5,053
Points 60,715
Joined: May 2008
Location: U.S.
Moderator
3vi1 Posted: Tue, Oct 13 2009 8:14 AM

I thought this was interesting.  Reminds me of the old days when ATi and nVidia would "optimize" for the benchmarks.

What part of "Ph'nglui mglw'nafh Cthulhu R'lyeh wgah'nagl fhtagn" don't you understand?

++++++++++++[>++++>+++++++++>+++>+<<<<-]>+++.>++++++++++.-------------.+++.>---.>--.

Top 75 Contributor
Posts 1,677
Points 24,005
Joined: Aug 2002
Location: Pittsburgh, Pennsylvania

Wow, I'm not sure what to say.  If the image quality isn't hurt, then I have no problem with optimization.  But this?  Hmm.  I haven't looked at the 3DMark pages in reviews for years, but the change also affected Crysis, so it isn't purely benchmark bait.  Again: no image quality loss = no complaints from me.  It just seems like good driver work.

Hello

Top 10 Contributor
Posts 5,053
Points 60,715
Joined: May 2008
Location: U.S.
Moderator
3vi1 replied on Tue, Oct 13 2009 12:31 PM

I agree that optimizing for games is fine as long as image quality isn't hurt.  But I tend to side with the opinion that specifically targeting a benchmark isn't exactly on the up-and-up.

The results of optimizations are going to vary greatly depending upon the specific game/app being optimized.  Optimizing a benchmark means that the results will not be useful to estimate performance for any game you might be thinking of buying (the vast majority of which will not have optimizations).
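
For anyone curious how this kind of targeting actually works: roughly speaking, the driver checks the name of the running executable and flips on a special code path when it matches a known list.  Here's a toy sketch in C (the list and function names are made up for illustration, not Intel's actual driver code):

    #include <stdio.h>
    #include <string.h>

    /* Hypothetical whitelist of executables the driver special-cases. */
    static const char *optimized_apps[] = {
        "3DMarkVantage.exe",
        "Crysis.exe",
    };

    /* Returns 1 if the running app's file name is on the list. */
    static int is_optimized_app(const char *exe_name)
    {
        for (size_t i = 0; i < sizeof(optimized_apps) / sizeof(optimized_apps[0]); i++)
            if (strcmp(exe_name, optimized_apps[i]) == 0)
                return 1;
        return 0;
    }

    int main(void)
    {
        /* A match enables the special path; renaming the file defeats it. */
        printf("3DMarkVantage.exe -> %s\n",
               is_optimized_app("3DMarkVantage.exe") ? "fast path" : "normal path");
        printf("3DMarkVintage.exe -> %s\n",
               is_optimized_app("3DMarkVintage.exe") ? "fast path" : "normal path");
        return 0;
    }

Which is reportedly how this one was caught: rename the benchmark's .exe and the extra performance disappears, because the string match fails.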

What part of "Ph'nglui mglw'nafh Cthulhu R'lyeh wgah'nagl fhtagn" don't you understand?

++++++++++++[>++++>+++++++++>+++>+<<<<-]>+++.>++++++++++.-------------.+++.>---.>--.

Top 75 Contributor
Posts 1,677
Points 24,005
Joined: Aug 2002
Location: Pittsburgh, Pennsylvania

Yeah, the fact that 3DMark isn't actually a game, and exists purely as a performance benchmark, makes it a bit shady.  Especially if the article is right that 3DMark-specific optimizations are disallowed.  ATi's monthly driver releases almost always include improvements for specific games, but I don't think they optimize for 3DMark (or at least not openly).  I remember Nvidia getting in trouble way back, in 3DMark 2001 SE or something, for stuff like hurting image quality.

The only thing I can say for Intel here is that they'd better do their best to do this for every game.  They claim it's the CPU helping the GPU, and I'm all for that (and vice versa).  Intel graphics are super weak, so a little help can go a long way.
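
To put some meat on "the CPU helping the GPU": the idea is that vertex processing can run in software, spread across CPU cores, instead of on the IGP's weak vertex hardware.  A bare-bones sketch of that pattern in C with pthreads (just the general concept, not Intel's code):

    #include <pthread.h>
    #include <stdio.h>

    #define NUM_THREADS 4
    #define NUM_VERTS   8

    typedef struct { float x, y, z; } Vec3;
    typedef struct { Vec3 *verts; int start, end; } Job;

    /* Each worker "transforms" its slice of the vertex buffer
       (a trivial scale stands in for a real vertex shader). */
    static void *transform_slice(void *arg)
    {
        Job *job = arg;
        for (int i = job->start; i < job->end; i++) {
            job->verts[i].x *= 2.0f;
            job->verts[i].y *= 2.0f;
            job->verts[i].z *= 2.0f;
        }
        return NULL;
    }

    int main(void)
    {
        Vec3 verts[NUM_VERTS];
        for (int i = 0; i < NUM_VERTS; i++)
            verts[i] = (Vec3){ (float)i, (float)i, (float)i };

        pthread_t threads[NUM_THREADS];
        Job jobs[NUM_THREADS];
        int per = NUM_VERTS / NUM_THREADS;

        /* Split the vertex buffer across CPU worker threads. */
        for (int t = 0; t < NUM_THREADS; t++) {
            jobs[t] = (Job){ verts, t * per, (t + 1) * per };
            pthread_create(&threads[t], NULL, transform_slice, &jobs[t]);
        }
        for (int t = 0; t < NUM_THREADS; t++)
            pthread_join(threads[t], NULL);

        printf("vert[7] = (%.1f, %.1f, %.1f)\n",
               verts[7].x, verts[7].y, verts[7].z);
        return 0;
    }

Nothing wrong with that technique in itself; the problem is if the driver only ever turns it on for a hand-picked list of benchmarked titles.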

Hello

Top 75 Contributor
Posts 1,964
Points 25,705
Joined: Sep 2009

I think the major point of the article is stated in the last couple of paragraphs:

All of which brings us back to the perils of using 3DMark Vantage as a substitute or proxy for testing real games. Those perils are well established by now. PC makers and others in positions of influence would do well to re-train their focus on real applications—especially for testing integrated graphics solutions, which have no need of advanced graphics workloads based on the latest version of DirectX to push their limits. 3DMark's traditionally purported role as a predictor of future game workloads makes little sense in the context of these modest IGPs.

 

While it's good driver work and does improve performance in Crysis, there is a conscious effort here to gain an advantage in 3DMark Vantage. I can see why Futuremark is concerned, since optimizations aimed at their benchmark undermine its credibility.

Thanks for bringing this up, 3vi1.

 

Top 10 Contributor
Posts 8,691
Points 104,390
Joined: Apr 2009
Location: Shenandoah Valley, Virginia
MembershipAdministrator
Moderator
realneil replied on Tue, Oct 13 2009 8:06 PM

So Intel wants to have its cake and eat it too.

It's like Sony thinking we wouldn't figure out the rootkit they hid on their CDs.

Dogs are great judges of character, and if your dog doesn't like somebody being around, you shouldn't trust them.

Top 10 Contributor
Posts 6,181
Points 90,135
Joined: Aug 2003
Location: United States, Virginia
Moderator

My big issue with this is that they optimize for, what, five games? None of which can really be played well on their chip. It's a clear move to look better than Nvidia's and ATI's integrated solutions. These are all commonly benchmarked games, but who really plays them that much anymore? And who plays them on an Intel GPU?

Top 25 Contributor
Posts 3,486
Points 47,175
Joined: Nov 2005
Location: Metropolis
ForumsAdministrator
Moderator
Super Dave replied on Wed, Oct 14 2009 12:39 AM

This is not the first time that AMD has called out Intel on its questionable practices. Remember THIS?

 SPAM-posters beware! ®
