A Secret, Powerful, and Beautiful Relationship Revealed - Intel Processors and ATI Graphics

rated by 0 users
This post has 29 Replies | 8 Followers

Not Ranked
Posts 30
Points 645
Joined: Feb 2010
wallacemanengine Posted: Wed, Mar 3 2010 12:12 AM

Hello, I am a high school student who has stumbled upon a great discovery only a few computer "geeks" know about, and that is the beautiful combination of power you get when you combine Intel's extremely powerful processors with the might of ATI's graphics. Two worldwide industries that have been butting heads for nearly 100 years; and what power they can create when they join forces...

I will explain myself in my next post, which will be somewhat long but worth every single second of reading. This forum is for people who have been wondering what it is like to join technology from the two biggest rival computer companies in the entire world: Intel and AMD. This forum should be used for just about anything related to the subject: suggestions, reports, findings, prices, places, opinions, facts, past, future, anything really.

My findings and proof that Intel-ATI setups are extremely powerful are also in my next post. Have fun posting!

Not Ranked
Posts 30
Points 645
Joined: Feb 2010

To begin this forum, I will say that in my OPINION, Intel - ATI setups are the most powerful setups the computer world has to offer. My opinion is, however, backed up by FACTS. These FACTS are BENCHMARK tests that show that when Intel processors are combined with ATI graphics, you not only get the greatest computer speed, but the greatest graphics, and therefore gaming performance as well.

As I said before, I am a high school student; and though I am not a computer engineer, it is my chosen career and therefore also my study area and chosen major for college. I have been following Intel and AMD products and technologies since I was 10 years old. Over these years it has been proven time and time again, through the countless back-and-forths between Intel and AMD, that while Intel leads in processor and overall computer speed technology, AMD speeds ahead in graphics rendering and gaming. One day I asked myself, "What if Intel and AMD teamed up their processors and/or graphics?" The result, as expected, was extreme power. Six months later, I got my answer when my teachers allowed me to run tests on their computers. To begin, I ran a series of tests on 6 older computers used in my Video Production class with the permission of my teacher, Mr. Porter. Next, I ran a second series of tests on a newer computer-lab set of 5 computers. Finally, I ran a single test on one of my Web Design teacher Mr. Karden's new AMD Phenom II X4 955 processor-powered computers combined with an ATI Radeon 5770 graphics card. My results, along with detailed specs and findings, are below:

First Series of Tests: 6 x Dell Dimension 8300s, all completely stock except that they had been upgraded from Windows ME to Windows XP Pro Version 2002, and that they had FireWire ports so that they could connect to broadcast-quality cameras.

Operating Systems: Windows XP Professional Version 2002

Processors: Stock Pentium 4 (single core) at 3.0 GHz

System memory: Stock 256MB DDR Memory

Graphics: 64MB Integrated by Intel (3); 64MB Integrated by NVIDIA (2); and 64MB Integrated by ATI (1)

Motherboards and system base: Standard Dell Dimension 8300 with standard Dell chipset families

Results: In benchmark tests, 3D graphics rendering scores on the Intel integrated graphics were 383, 409, and 412, averaging out to about 400. As expected, the NVIDIA integrated graphics performed a bit better, coming in at 434 and 447, rounding to around 440. The results of the ATI tests were so astounding I was convinced that my benchmark had erred, so I ran it three more times, but the results stayed about the same, coming in at 613, 622, 612, and 619, rounding to around 620. I personally thought it was bogus, so with Porter's permission I installed my Call of Duty: Modern Warfare on the ATI computer and ran it. Now, anyone who knows computers knows that the processor, memory, and graphics of a Dimension 8300 are nowhere near even the minimal requirements to run it. Astoundingly, it not only ran, but ran at near-native resolution (800x600, with native being 1024x768), provided all optional graphics and audio settings were set to very minimal. Still, it ran at 19 frames per second on average, proving that this time, ATI was on top of the food chain.

Second Series of Tests: 5 x MPC custom-built configurations by our school board, all IDENTICAL except, again, for the random integrated graphics, and except for the school admin software of course...

Operating Systems: Windows XP Professional Version 2004

Processors: Stock Intel Pentium D DUAL CORE at 1.8 GHz

System Memory: Stock 1024MB (1GB) DDR2 Memory

Graphics: 128MB Integrated by NVIDIA (4); 128MB Integrated by ATI (1)

Motherboards and System Base: Standard MPC Series Chipsets

Results: To keep it short this time, I will simply list the results. NVIDIA: 713, 764, 729, and 755, for an average of about 740. ATI (3 tests on one machine): 916, 1004, and 967, for an average of about 960 (again, amazing!).
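The averages quoted above are plain arithmetic means of the listed runs; for anyone who wants to check the rounding, here is a quick Python sketch (scores copied from the results above):

```python
from statistics import mean

# Benchmark runs from the second series of tests above.
nvidia_runs = [713, 764, 729, 755]
ati_runs = [916, 1004, 967]  # three runs on the single ATI machine

print(round(mean(nvidia_runs)))  # 740
print(round(mean(ati_runs)))     # 962 -- "about 960" above
```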

Finally, just for the hell of it, I tested one of my Web Design teacher's 24 custom-built AMD-based computers.

Final Test: Single-Benchmark on Karden's Lab Computer

Operating System: Windows 7 Professional 64-Bit Edition

Processor: AMD Phenom II X4 QUAD CORE at 3.2 GHz

System Memory: 4096MB (4GB) Memory

Graphics: 1024MB (1GB) GDDR5 Dedicated Graphics Memory by ATI (5770 card)

Motherboard and System Base: ASUS 790X/SB750 Motherboard

Results: As expected, it blew the hell out of any other setup my school had to offer. When playing Karden's Unreal Tournament 2004, you can play at native resolution with all graphics and audio options set to maximum, with high contrast, and still get an FPS rate of 38 on average. That's amazing considering HD television is broadcast at 40-42 FPS while Blu-ray discs play at 44-46 FPS. Call of Duty: Modern Warfare ran at 32 FPS, also with maximized settings, and Call of Duty: Modern Warfare 2 ran at 31 FPS, again with maximized settings. Please take note of this: when I performed these benchmarks in January 2010, the price of a 5770 card was over $500. Now it is just under $150. There is no reason to buy a 5800-series card for $500 when the 5770 performs so well, just so you guys know.

My Findings: Very obvious; ATI rules, simple as that. If you want a nice computer, combine the fastest processing on the planet (Intel) with the greatest graphics (AMD). I hope you find these findings helpful. Feel free to join the conversation.

  • | Post Points: 35
Top 100 Contributor
Posts 862
Points 11,010
Joined: Apr 2008
RyuGTX replied on Thu, Mar 4 2010 12:45 AM

I'm pretty sure I read a similar thread of yours in another part of the HH forum.

 

As far as I'm concerned, there is nothing absolute in the computer industry. Things change often. Sometimes ATI has the lead and sometimes Nvidia has the lead. It could be that ATI happened to have the superior integrated graphics at the time the Dell 8300 was released. You need a way bigger sample size to make a bold claim like that. The worst thing you can do is test a few cards, barely scratching two architectural generations, and say that it represents the many years that Nvidia and ATI have been fighting it out.

 

As far as your last test and the remarks that went with it: what Nvidia cards did you test against? What exactly was the resolution, and what is considered max settings; maxed AA? I don't know what currency you were quoting for the 5770, but they were never that high in USD. Also, while Modern Warfare is a pretty game, it is still a DX9 title. Finally, I don't see how the last benchmark supports your argument of pairing up Intel with AMD to achieve a nice computer, since that PC used an AMD processor.

If you think you can’t do something, you’ll never be able to do it. No matter how easy it is.
  • | Post Points: 20
Not Ranked
Posts 30
Points 645
Joined: Feb 2010

Well RyuGTX, I agree with most of what you have stated above. Although I tested 12 different computers from only 3 architectural generations (it was 3, by the way), I am confident that you could test other generations and find the results very similar. The NVIDIA cards I tested against are also included in my specifications sections (note that the levels of dedicated graphics memory were always the same, along with the system age). I also respect the fact that NVIDIA has led the market over ATI several times over the years, yes; however, they led because of reliability, price, and/or the fact that they were the first to create video cards able to play 3D media, whereas ATI took its place at the top because of its pure power, FPS, and ability to keep that FPS at high resolutions.

Sure, NVIDIA has great cards, absolutely; however, they have rarely ever been the best at pure graphics rendering in HD. I base that statement on A) actual benchmark test results over the years from GeekSquad, TeckForum, and other international companies who test and post their results, and B) the fact that NVIDIA cards are built not specifically for pure performance, as ATI's are, but more for their custom features, such as 3D rendering, the new power-saving technology (whatever it was called, just released), and overall user-friendliness. ATI cards are built more for advanced computer users or video game users who want smooth, non-laggy, simple but powerful gameplay.

The resolutions tested were custom-made by my teacher Mr. Karden, the Web Design teacher and one of the main administrators of the school's computers. I believe he told me that he had set a custom 16:9 resolution of 3200x1800 pixels, which, yes, is pretty amazing. The native resolution of the Dimension 8300s is 1024x768, and for the MPCs it was 1280x1024 (flat-screen squares).

When I say maximum settings, I mean taking the graphics options, such as "Texture", "Landscape", and other custom graphics options included with most games, and setting them to "Highest", "High", or "Max" (depending on the max level), all while playing at the highest resolution possible on that computer (MAX NATIVE RESOLUTION).

As far as the industry goes, you're completely right. It is very unpredictable, and I look forward to Intel and AMD maxing out their technology to compete with each other.

Next, I did the test on the final computer just for the hell of it (I also state that in my last post); it was simply a test to rate a high-end AMD computer on a personal level. By the way, I forgot to post its 3D rendering score, which was 2567, followed by tests concluding in a 2613 and a 2450.

As far as DX goes, I know the first Modern Warfare was DX9, but I could have sworn Modern Warfare 2 used DX10, and the only reason I used the first title was that the older Dells and school-oriented MPCs were only DX9 compatible. As for testing DX11 on Karden's computer, I do not yet own a DX11 title, but I will soon, as I am buying Quake 2, and maybe then we can run that test.

Finally, the 5770 was actually $500 for Karden because he had gotten them before the release date; he ordered them just before the release of the 5550. How is this possible? Mr. Karden is a top-notch web designer with a huge reputation at AMD; he actually suggested ideas as to how AMD.com should be updated and rebuilt to stay attractive to web surfers. Karden even physically participated in the build of NewEgg.com, a partner to both Intel and AMD. Considering this, AMD offered Karden the 5770 more than a month before its official release date, of course at an outrageous cost. Even so, Karden got permission to use the school's budget on the cards and Phenom II setups (no wonder we have no friggen Autoshop, my favorite class!).

I also have a couple of questions for you. First, do you know where to find prices on the new 5870 card? And secondly, why are you so hostile? I specifically stated at first that my thought that Intel - ATI computers blow all others away was an opinion. If you don't like ATI or Intel so much, that's fine; however, why question my findings? I mean, who in their right mind could even come up with a story like this, and then who would post it? If my findings were completely ridiculous, more than just you would have questioned me. Not that I find you a bad guy or anything, but I find it weird that only one person questions such a "bold claim" as mine.

  • | Post Points: 20
Top 100 Contributor
Posts 862
Points 11,010
Joined: Apr 2008
RyuGTX replied on Thu, Mar 4 2010 6:00 PM

Am I the only one who has an opinion about this topic on this forum? It doesn't even have to be in opposition.

When you put out such a strong statement, you will meet with opposing viewpoints. I don't have anything against Intel or AMD. I am no fanboy. I usually buy whatever is the most bang for my buck; sometimes just the higher-end stuff. For these reasons, I am questioning your findings. I don't want new readers to think that this is the golden rule when picking parts for a new PC build. I also don't see anyone supporting you.

As far as benchmarks go, are you talking about GeekSquad from Best Buy? I don't know any enthusiast who trusts their word. How about you post a few benchmarks for us from these sites that are your sources?

For a DX11 title, you could try Battlefield Bad Company 2. But the results for that will show ATI as the winner, since Nvidia doesn't have a DX11 card out yet. Also, from the rumors, the upcoming Nvidia DX11 card isn't going to be as great as Nvidia was saying it would be. You could test cards on a DX10 title like Crysis Warhead.

By the way, Nvidia doesn't take the lead just because they are reliable. The lead changes between ATI and Nvidia because of performance. They are fighting for the performance crown, which is what many people look for when buying cards. Who do you think had the performance crown before ATI released the 5xxx series? Nvidia's GTX 200 series were killer cards.


Yes, I do know where to find prices on the 5870. On average, I would say they go for about $400. At the end of 2009 and the beginning of 2010, prices were higher because there was a shortage of 40nm chip production. I guess there still is, but at least it seems to be easier to find the 5870 in stock now. Newegg has some in stock from a few different manufacturers. The cheapest one at this moment is $379.99 by HIS, because there is an instant $40 savings. The others are more or less around the $400 mark.

If you think you can’t do something, you’ll never be able to do it. No matter how easy it is.
  • | Post Points: 20
Not Ranked
Posts 30
Points 645
Joined: Feb 2010

Your findings do support the fact that the last time NVIDIA took the video card industry lead, it was because of performance. However, if you look over longer periods (as in 50+ years), NVIDIA has RARELY ever taken the lead in sales because they were the kings of performance. MOST of the time NVIDIA leads, it's either because people enjoy their power-saving technologies, their leading innovations in visual technology, or simply because their interface is very user-friendly and Intel - NVIDIA setups are the most popular setups.

As far as benchmarking goes, I will try Bad Company 2 if I ever get the chance. My favorite of all benchmarking sites can be found here: http://www.guru3d.com/

  • | Post Points: 20
Top 50 Contributor
Posts 2,851
Points 40,550
Joined: Feb 2004
Location: United States, New York
Moderator

I don't agree that ATI and Intel are the best pair.  I believe that Nvidia and Intel are almost always better.  For one, ATI has the worst drivers on the planet; they sit with Creative, although they sit higher than Creative.  I find that ATI serves the mainstream with lower prices, especially with Sapphire as their bang-for-the-buck brand, at least it used to be.  I find that Nvidia is for the true enthusiasts; of course, my word is against yours.  But there is one thing you need to understand about the beauty of the GPU market: neither what I say nor what you say is right.  I really think that the two companies are perfect for each other, and they show it by completely one-upping each other.  However, I will take my side with Nvidia.

[edit] Prime example of the pricing difference is the 4870 and the GTX260

  • | Post Points: 35
Top 500 Contributor
Posts 158
Points 1,735
Joined: Mar 2010

I've asked this question before on other forums and got some useful info, so if I may throw this out:

Everyone "knows" that ATi's drivers suck, at least for Windows. In fact, for some of my past computers, I used omegadrivers.net instead of the Catalyst suite. How is ATi's linux driver support, though? I heard it was bad, but then starting with the HD 4xxx series they started including Linux drivers on their CDs that were actually in step with Windows drivers, and someone else told me that community-based ATi drivers were excellent, good enough even for older chips like the X series.

By the same measure, how is nVidia's Linux driver support?

Sorry if this veers a bit off-topic from the original post, but the discussion quickly turned to graphics cards so I thought I'd bring this up.

  • | Post Points: 5
Not Ranked
Posts 30
Points 645
Joined: Feb 2010

OK, you're absolutely right with the 4870 and GTX 260 comparison. However, note that the GTX 260 is a much newer, and therefore more technologically advanced, card than the ATI 4870. To correctly compare NVIDIA's GPUs to ATI's GPUs, you must compare two same-generation cards. For example, your GTX 260 by NVIDIA came out around the same time as when ATI began producing the 5000 series. To be more accurate, I would say the 5700-series ATI cards were what competed with NVIDIA's GTX 250-GTX 280. Try benchmarking these two sets of cards and then come to a conclusion.

Also, I state what I stated above because if you look at the results I found in the first two posts in this forum, ATI beats NVIDIA across the line. I actually didn't even have a preference between NVIDIA and ATI before I came across those results. Now I am a die-hard Intel - ATI fan.

  • | Post Points: 20
Not Ranked
Posts 30
Points 645
Joined: Feb 2010

I personally let Windows Update, set to "Automatic Updating", take care of all my driver, software, and system updates, and my systems run beautifully, so I do not know much about the drivers themselves. However, if NVIDIA and ATI have begun including Linux drivers on their CDs, I'm sure the drivers will perform just as well on Linux as they do on Windows; otherwise, they wouldn't have included them.

Hope that answers your question.

  • | Post Points: 20
Top 10 Contributor
Posts 6,374
Points 80,315
Joined: Nov 2004
Location: United States, Arizona
Moderator

I don't see how you can say a DX10 card and a DX11 card are the same generation of cards...

I prefer Nvidia cards over ATI's simply because ATI's drivers gave me nothing but problems.  I have used both ATI and Nvidia cards in the same system, but the ATI drivers crashed the system about once a week, while the Nvidia drivers were stable on it for many months before and after I put the card back in. The cards were an 8800GTX and a 4870.

"Never trust a computer you can't throw out a window."

2700K

Z77 GIGABYTE G1.SNIPER

GIGABYTE GTX670

G.Skill Ripjaws X 16gb PC2133

Antec P280

Corsair H100

Asus Blu-ray burner

Seasonic X650 PSU

Patriot Pyro 128gb SSD

  • | Post Points: 35
Top 50 Contributor
Posts 3,017
Points 43,210
Joined: May 2005
Location: United States, Virginia
Moderator

If my memory serves me correctly, the GTX 260 has been around for quite some time now, releasing around the same timeframe as the 4000-series cards. The 285 and even the 275 cards were released later down the road. I believe the GTX 280 and GTX 260 were Nvidia's first offerings from the 200 series of cards; I am sure there is an article floating around here somewhere from when they first got released. In recent releases, the trend has been that Nvidia's cards pump out higher benchmark scores when compared to similarly priced ATI cards, but ATI's cards were priced a little better. I think ATI/AMD holds the crown for best bang for your buck, but their cards also put out a good bit more heat than Nvidia's. So really, guys, let's not get into the battle-of-the-companies deal here yet again. To each his own is what I say, and everyone's scenario is different. In a perfect world, we would all be rich and just buy the most powerful of everything on the market. Each product has its own individual advantages for each person and their needs.

  • | Post Points: 20
Not Ranked
Posts 30
Points 645
Joined: Feb 2010

I could have sworn that the GTX 250-GTX 280 cards were all around the lower Radeon 5000-series cards. And weren't the GT and GTS 200-220 (or so) the ones to compete with the higher end of the Radeon 4000 series?

  • | Post Points: 35
Top 500 Contributor
Posts 158
Points 1,735
Joined: Mar 2010

wallacemanengine:

I could have sworn that the GTX 250-GTX 280 cards were all around the lower Radeon 5000-series cards. And weren't the GT and GTS 200-220 (or so) the ones to compete with the higher end of the Radeon 4000 series?

In terms of performance, yes, they're close from what I remember (actually, the high-end GTX 2xx series is more like the midrange 5xxx series). However, you can't compare them, because the GTX 2xx series is DX10 and the 5xxx series is DX11. The 5xxx cards were released afterwards, and nVidia was very slow to counter with its own DX11 chips, most likely because a deal with Intel fell apart and they had a lot of internal cleanup to do, setting production back by at least a few months.

  • | Post Points: 5
Top 50 Contributor
Posts 3,017
Points 43,210
Joined: May 2005
Location: United States, Virginia
Moderator
nelsoncp21 replied on Sat, Mar 13 2010 10:55 AM

wallacemanengine:

I could have sworn that the GTX 250-GTX 280 cards were all around the lower Radeon 5000-series cards. And weren't the GT and GTS 200-220 (or so) the ones to compete with the higher end of the Radeon 4000 series?

I believe the GTX 260 and GTX 280 were the first cards released; then all the other variants came out, with the lower-end models and the mobile variants trying to fill in the gap against ATI's lower offerings. Then the 285s and 275s came out, followed by the dual-GPU models. Someone else chime in here if I am wrong, as my memory isn't as good as it used to be. I am getting old here, or at least I feel like it. Big Smile

  • | Post Points: 20
Top 100 Contributor
Posts 862
Points 11,010
Joined: Apr 2008
RyuGTX replied on Wed, Mar 17 2010 7:26 PM

nelsoncp21:

wallacemanengine:

I could have sworn that the GTX 250-GTX 280 cards were all around the lower Radeon 5000-series cards. And weren't the GT and GTS 200-220 (or so) the ones to compete with the higher end of the Radeon 4000 series?

I believe the GTX 260 and GTX 280 were the first cards released; then all the other variants came out, with the lower-end models and the mobile variants trying to fill in the gap against ATI's lower offerings. Then the 285s and 275s came out, followed by the dual-GPU models. Someone else chime in here if I am wrong, as my memory isn't as good as it used to be. I am getting old here, or at least I feel like it. Big Smile

Yeah, that is usually how they release their cards: high-performance cards first, like the 280/260 (Nvidia) and 4870/4850 (ATI), and then they fill in the lower-end stuff. Then somewhere along the line, they release the super high-end stuff (more recently) with dual-GPU offerings.

I think the first 200-series cards came out in 2008. The refreshes, like the 240 and 250, came afterwards and just happened to arrive around the time of ATI's 5000 series. They weren't really meant to compete.

Like everyone says, the whole GPU market is really a leapfrogging sort of deal (one trying to outdo the other). So it is a bit difficult to compare generations; I think it is best, and easier, to compare month by month. You might be able to do it by generation according to the DirectX version, but I don't think Nvidia chose to do DX10.1 like ATI did.

 

If you think you can’t do something, you’ll never be able to do it. No matter how easy it is.
  • | Post Points: 20
Not Ranked
Posts 30
Points 645
Joined: Feb 2010

That is odd; I never heard about releases of lower cards after the higher ones. Either way, because of the actual performance engines, DDR type, and memory amount, it is still best to compare NVIDIA to ATI by using two cards of the same exact type, being either 1 or 2GB; DDR3, DDR4, or DDR5 cards. Therefore, to test the GTX 250+ series, you must compare it to a Radeon 5000 series. By the way, I like the simplicity of ATI releases, going up in number nearly all the time, and constantly advancing upward.

  • | Post Points: 20
Top 100 Contributor
Posts 862
Points 11,010
Joined: Apr 2008
RyuGTX replied on Thu, Mar 18 2010 6:48 PM

wallacemanengine:

That is odd; I never heard about releases of lower cards after the higher ones. Either way, because of the actual performance engines, DDR type, and memory amount, it is still best to compare NVIDIA to ATI by using two cards of the same exact type, being either 1 or 2GB; DDR3, DDR4, or DDR5 cards. Therefore, to test the GTX 250+ series, you must compare it to a Radeon 5000 series. By the way, I like the simplicity of ATI releases, going up in number nearly all the time, and constantly advancing upward.

 

You have never heard of releases of lower cards after the higher ones? Let's take the most recent release from ATI, since I'm sure you should at least be familiar with that. They first released the higher-end stuff, like the 5850 and above. If I'm not mistaken (and I'm too lazy to check), it was the 5870 first. About a week after, it was the 5850. Then, soon after, ATI started releasing other cards to fill the lower end of their line-up, starting with the 5770 and 5750. Albeit the time frame for the 5000-series launch was pretty short, ATI didn't release all their cards on the same day. There was no word of a 5830 along with the 5870 release. On average, the time frame is longer.

As far as testing methodology, I would suggest one that is based on a timeline. The results should be more accurate. Testing by memory type won't work because ATI usually uses different memory. It would be impossible to compare the GTS 250 to the Radeon 5000 series that way, because the GTS 250 uses GDDR3 while the Radeon 5000 series uses GDDR5. Even the 4870 used GDDR5.
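The timeline idea can be sketched in a few lines of Python. The launch dates below are month-level approximations supplied purely for illustration (treat them as assumptions, not authoritative data); the point is simply to pair each card with the rival released closest in time:

```python
from datetime import date

# Month-level launch dates -- illustrative approximations, not authoritative.
nvidia = {
    "GTX 280": date(2008, 6, 1),
    "GTX 285": date(2009, 1, 1),
    "GTX 480": date(2010, 3, 1),
}
ati = {
    "HD 4870": date(2008, 6, 1),
    "HD 5870": date(2009, 9, 1),
}

def closest_rival(launch, rivals):
    """Pick the rival card whose launch date is nearest in time."""
    return min(rivals, key=lambda name: abs((rivals[name] - launch).days))

for card, launch in ati.items():
    print(card, "lines up with", closest_rival(launch, nvidia))
```

By this measure the HD 4870 lines up with the GTX 280, which matches how those two actually traded blows at launch, while the HD 5870's nearest neighbor arrives months later, illustrating the DX11 gap discussed above.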

Here are a few Wiki links that say when each product was announced:

ATI 5000 series: http://en.wikipedia.org/wiki/Evergreen_%28GPU_family%29#Products

ATI 4000 series: http://en.wikipedia.org/wiki/Radeon_R700#Variants

 

If you think you can’t do something, you’ll never be able to do it. No matter how easy it is.
  • | Post Points: 20
Not Ranked
Posts 30
Points 645
Joined: Feb 2010

Although it may be true that a few ATI cards have been released out of "technical" order, for the most part ATI seems to release its GPUs in logical chronological order. Just about any true professional will back that statement up.

  • | Post Points: 5
Top 500 Contributor
Posts 156
Points 2,495
Joined: Feb 2010
Location: Astoria, NY

Yeah, I've always preferred Nvidia over ATI due to the driver support. But despite this, I've always had a thing for ATI because they are really the underdogs. Nvidia has a lot more money than ATI and can really get away with selling GPUs at a bigger premium than ATI.

ATi, on the other hand, sells GPUs that have the best performance-for-price ratio. But they REALLY need to start perfecting their drivers. If ATI's drivers were ANYTHING like Nvidia's, their GPUs would be way ahead of Nvidia's.

So really, ATi has the better GPU, but Nvidia, because of their driver and game developer support, has the advantage.

Least that's what I think xD

  • | Post Points: 5
Not Ranked
Posts 5
Points 55
Joined: Apr 2010

Ignorance is bliss, wallacemanengine.

 

I see posts like this all over the internet and am starting to wonder if AMD/ATI has a program where, if I could prove I spammed lots of hardware site forums with this type of reasoning and misinformation, they would give me a free computer. Heck, if it was a really good computer, I could sell my soul and be an ATI fanboy for a while.

  • | Post Points: 5
Not Ranked
Posts 32
Points 490
Joined: Apr 2010
Location: Sacramento, CA
Xolso replied on Sun, Apr 11 2010 12:24 AM

Got this off 4chan.

 

  • | Post Points: 35
Top 100 Contributor
Posts 862
Points 11,010
Joined: Apr 2008
RyuGTX replied on Sun, Apr 11 2010 1:34 AM

Yes, because we should be looking to 4chan for all our computer-related information and advice. Going by that logic, we should get blue LED fans because they keep our components cooler than red LED fans.

If you think you can’t do something, you’ll never be able to do it. No matter how easy it is.
  • | Post Points: 5
Not Ranked
Posts 2
Points 40
Joined: Apr 2010
eric_free replied on Sun, Apr 11 2010 9:52 PM

But, consider support for GNU/Linux.

And by support, I mean a driver that actually works with accelerated 3D.  NVidia has always provided drivers that WORK.  ATI's Linux drivers do not work.

I've been running NVidia cards on both AMD and Intel systems since about 1998/1999.  Very recently, ATI has started to cooperate with Open Source by providing information about their hardware, and I hope that their efforts bear fruit in terms of a usable 3D accelerated driver soon, as I would install my ATI card if there was a driver that worked; I prefer to use free software wherever possible.  But I can't use what does not exist.  So I am stuck with NVidia's proprietary (but working) 3D driver.

Top 75 Contributor
Posts 1,809
Points 18,105
Joined: May 2009
Location: Waikiki
That which is today's future technology is tomorrow's useless garbage.
 
There was a time when we didn't even have video cards. There will be a time when you tell future generations how you needed two huge circuit boards just to handle the graphics!
Intel Core i7-875K Quad
Asetek 510LC 120MM
4GB Kingston Hyper-X DDR-3
ASUS P7P55D-E Pro
CyberPower 800 PSU
Kingston 64GB SSD 
2 Hitachi 1-TB HDD'S
FirePro V8800
8X Blu-Ray DVD±R/±RW
HPw2207 22" LCD
Cintiq 21UX
CoolerMaster 690II Advance
Win 7 Pro 64 bit
Special thanks to HotHardware.com!
  • | Post Points: 20
Top 100 Contributor
Posts 862
Points 11,010
Joined: Apr 2008
RyuGTX replied on Mon, Apr 12 2010 6:17 PM

It may be tomorrow's useless garbage for people on the bleeding edge, but it still works great for a lot of other people and organizations.

If you think you can’t do something, you’ll never be able to do it. No matter how easy it is.
  • | Post Points: 5
Top 500 Contributor
Posts 136
Points 1,890
Joined: Mar 2010
sp12 replied on Mon, Apr 12 2010 7:17 PM

4chan makes me smile. And I'm surprised by the amount of truth in that quote wall.

  • | Post Points: 20
Top 100 Contributor
Posts 862
Points 11,010
Joined: Apr 2008
RyuGTX replied on Tue, Apr 13 2010 12:35 AM

I see very little truth if you are referring to the 4chan-created tier. Intel and Nvidia are still a good combination.  Even if you don't go with the latest GTX 400 series, the 200 series is still pretty nice. There are some good budget and mid-range cards available. Sure, you forgo Eyefinity and DX11, but not everyone needs or even cares about those features. Also, Nvidia's SLI scaling is pretty good.

If you think you can’t do something, you’ll never be able to do it. No matter how easy it is.
  • | Post Points: 5
Not Ranked
Posts 41
Points 740
Joined: Apr 2010
Location: Hamilton, Ontario

First of all: 

My Findings: Very obvious; ATI rules, simple as that. If you want a nice computer, combine the fastest processing on the planet (Intel) with the greatest graphics (AMD). I hope you find these findings helpful. Feel free to join the conversation.

If I combined Intel with AMD, I think my computer might explode.

 

Secondly, you are comparing things on a small scale, hoping that it will apply to all things, which is WAYYYY wrong.

Basically, you argued ketchup is better on a hot dog than on a t-shirt.

But does this mean a hot dog > t-shirt? (Some might say yes.)

Let me make this simpler.

If listening to music or rendering videos were much faster on a Mac than on a PC, would a Mac be better than a PC?

Of course not!

Macs can't run as many programs OR games as a PC does (blah blah blah insert stuff here).

 

  • | Post Points: 20
Top 100 Contributor
Posts 862
Points 11,010
Joined: Apr 2008
RyuGTX replied on Wed, Apr 14 2010 12:10 AM

unknownD3VIL:

First of all: 

My Findings: Very obvious; ATI rules, simple as that. If you want a nice computer, combine the fastest processing on the planet (Intel) with the greatest graphics (AMD). I hope you find these findings helpful. Feel free to join the conversation.

If I combined Intel with AMD, I think my computer might explode.


Well, ATI is owned by AMD, so technically you are combining Intel with AMD. I hope your computer doesn't explode. Wink

 

If you think you can’t do something, you’ll never be able to do it. No matter how easy it is.
  • | Post Points: 5
Page 1 of 1 (30 items) | RSS