Analyst Claims NVIDIA Has A Power Efficiency Edge Over AMD In Ethereum Mining, But Not So Fast

As we reported yesterday, hardware OEMs are paying closer attention to the needs of consumers who partake in cryptocurrency mining. As of late, Ethereum, a cryptocurrency similar to Bitcoin, has been a hot topic. For example, ASRock is marketing its H110 Pro BTC+ motherboard, which features a huge array of 13 PCIe slots, as the foundation of what could be a seriously powerful GPU-powered mining rig.

And with 13 PCIe slots at your disposal, you’ll want to find some [relatively] affordable GPUs with enough processing grunt to make mining financially worth your while. Mitch Steves, an RBC Capital analyst, recently pitted two mainstream GPUs against each other to see how they compare when it comes to cryptocurrency mining.


Steves’ testing involved the Pascal-based NVIDIA GeForce GTX 1070 and AMD’s newly released Radeon RX 580 to see how they performed with Bitcoin and Ethereum mining. His analysis determined that while the Radeon RX 580 offered just over a 3 percent advantage in mining performance (and thus in estimated yearly mining revenue) for both Ethereum and Bitcoin, the GeForce GTX 1070 had much lower running costs thanks to a power efficiency advantage.

The analysis also cites a 33 percent power consumption savings for NVIDIA, which translates into lower daily and annual electricity (kWh) costs. When taking both performance and electricity costs into account, the NVIDIA card came out slightly behind the AMD card in annual Ethereum mining profit, but slightly ahead in Bitcoin mining.
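To see how power draw turns into dollars, here is a minimal sketch of the daily and annual electricity math. The $0.12/kWh rate and the wattage used are illustrative assumptions for this example, not figures from Steves' analysis:

```python
# Sketch: electricity cost of running a mining GPU around the clock.
# The $0.12/kWh rate is an assumed average; actual rates vary widely.

def electricity_cost(watts, rate_per_kwh=0.12, hours=24.0):
    """Cost in dollars to run a load of `watts` for `hours`."""
    return watts / 1000.0 * hours * rate_per_kwh

# Both cards carry an official 150 W TDP, so use that as a ballpark.
daily = electricity_cost(150)
annual = daily * 365
print(f"daily: ${daily:.2f}, annual: ${annual:.2f}")
```

At these assumed numbers, a single 150 W card costs roughly $0.43 per day, or about $158 per year, which is why even a modest efficiency gap compounds quickly across a multi-GPU rig.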

[Chart: NVIDIA vs. AMD mining comparison] (Image Source: Barron's)

Steves suggested that the cost benefit of choosing the NVIDIA solution would be even more apparent when scaled to the datacenter level, noting, “If we switch to building a full Data Center environment, electrical costs become increasingly more important (Bitcoin environment) and the older NVIDIA GPUs outperform AMD over the course of a year."

Seems pretty cut and dried, right? Well, not quite. The analysis doesn’t take into account the price of each card. GeForce GTX 1070 cards are running around $370 on Amazon, while the Radeon RX 580 retails for around $285 (if you can actually find one). That’s a pretty big disparity, and it only grows as you add more GPUs to your mining rig. Throw thirteen RX 580 cards into your ASRock H110 Pro BTC+, and that’s a savings of $1,105.
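The upfront-cost gap scales linearly with slot count, as this quick sketch using the street prices quoted above shows:

```python
# Sketch: upfront-cost difference for a fully populated 13-slot rig,
# using the approximate street prices quoted above.
SLOTS = 13
GTX_1070_PRICE = 370  # approximate Amazon price
RX_580_PRICE = 285    # approximate retail price

savings = SLOTS * (GTX_1070_PRICE - RX_580_PRICE)
print(f"Choosing RX 580s saves ${savings} up front")  # $1,105
```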

AMD Radeon RX 580 (top), Radeon RX 570 (bottom)

Steves’ TDP numbers also appear to be off: he cites the GTX 1070 at 90 watts (28 MH/s) and the RX 580 at 135 watts (29 MH/s). The actual TDP ratings are 150 watts for the GTX 1070 and 150 watts for the RX 580 as well, but TDP doesn't tell the entire story; only real-time power draw in a specific workload does. Mining performance is measured in hash rate, expressed in millions of hashes per second (MH/s). According to testing from 1st Mining Rig, here’s how the AMD cards stack up:

  • Radeon RX 570 (stock): 22 MH/s while drawing 115 watts
  • Radeon RX 570 (overclocked): 27 MH/s while drawing 135 watts
  • Radeon RX 580 (stock): 24 MH/s while drawing 136 watts
  • Radeon RX 580 (overclocked): 29 MH/s while drawing 153 watts

Although it’s not an apples-to-apples comparison, here are some comparable numbers from the NVIDIA camp, with the same GTX 1070 Founders Edition card measured at three different power levels:

  • GeForce GTX 1070 FE: 22 MH/s while drawing 100 watts
  • GeForce GTX 1070 FE: 26 MH/s while drawing 118 watts
  • GeForce GTX 1070 FE: 32 MH/s while drawing 145 watts

The numbers are a lot closer here, and there's certainly no 33 percent swing in either direction between NVIDIA and AMD. In reality, both GPU camps have their own sweet spot for getting the most performance at a given power level. And given that enthusiasts have been known to tweak their cards' BIOSes, undervolt their GPUs, and massage clock speeds to find the ratio that maximizes return on power and hardware investment, whittling the comparison down to a chart with only two specific GPU models isn't so cut and dried.
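One way to sanity-check the 33 percent claim is to compute hash rate per watt directly from the 1st Mining Rig figures quoted above. This sketch simply divides each card's MH/s by its measured power draw:

```python
# Sketch: efficiency (MH/s per watt) from the measurements quoted above.
cards = {
    "RX 570 stock":     (22, 115),
    "RX 570 OC":        (27, 135),
    "RX 580 stock":     (24, 136),
    "RX 580 OC":        (29, 153),
    "GTX 1070 (100 W)": (22, 100),
    "GTX 1070 (118 W)": (26, 118),
    "GTX 1070 (145 W)": (32, 145),
}

for name, (mhs, watts) in cards.items():
    print(f"{name:18s} {mhs / watts:.3f} MH/s per watt")

# Compare the top results from each camp: GTX 1070 at 145 W vs. RX 580 OC.
best_nv = 32 / 145
best_amd = 29 / 153
advantage = (best_nv / best_amd - 1) * 100
print(f"NVIDIA efficiency edge: {advantage:.0f}%")  # ~16%, not 33%
```

By this back-of-the-envelope math, the GTX 1070's efficiency edge over the overclocked RX 580 works out to roughly 16 percent, about half of the figure in the analysis.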

We hope to shed more specific and detailed light on this subject in the near future, so stick around.