It’s been a while since the cryptomining craze began, and it has caused real problems for gamers, especially those on a budget. First AMD’s RX 570/580 cards started disappearing, followed by the GeForce GTX 970 and then AMD’s Polaris-based RX 480. The Radeon RX Vega launched less than a month ago and has already been cleaned out almost everywhere, which drove its price up and made it a non-starter for most buyers. All of this is the result of the cryptomining fever that has gripped a significant number of people recently. Today we are going to investigate whether Ethereum mining is still attractive, and which card is the most efficient for it: NVIDIA’s or AMD’s.

At the time of this writing, Ethereum (ETH) ranks second only to Bitcoin (BTC) in market capitalization, with a total cap of almost $27 billion and a daily volume of more than $750 million. In comparison, Bitcoin’s market cap exceeds $66 billion, while Litecoin (LTC), the silver to Bitcoin’s gold, crests $3.2 billion.

One of the advantages of Ethereum over Bitcoin or Litecoin has to do with the algorithm chosen to validate the proof-of-work (PoW). While BTC relies on SHA-256 and Litecoin on Scrypt for their hash functions, Ethereum uses an algorithm called Ethash, created especially for this purpose. In fact, it was designed from the start to discourage the development of dedicated ASICs.

Indeed, while SHA-256 and Scrypt are extremely compute-hungry, which makes ASICs far more efficient than graphics cards (and graphics cards more efficient than CPUs), Ethash depends primarily on memory performance (frequency, timings, and bandwidth). With their fast GDDR5, GDDR5X, and HBM, graphics cards are perfectly suited to mining Ethereum.
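To make that concrete, here is a deliberately tiny, hypothetical sketch of what a memory-hard hash looks like. It is nothing like real Ethash in scale or detail (the function names, table size, and mixing constant are all made up for illustration), but it shows the key idea: the result depends on a chain of data-dependent reads from a large table, so a miner’s speed is bounded by how fast memory can serve those lookups rather than by raw arithmetic.

```python
import hashlib

# Toy stand-in for Ethash's multi-gigabyte DAG. Real Ethash uses a
# dataset that starts around 1 GiB; this demo table is tiny on purpose.
DATASET_WORDS = 1 << 16

def build_dataset(seed: bytes) -> list:
    """Deterministically expand a seed into a lookup table of 64-bit words."""
    data = []
    h = seed
    for _ in range(DATASET_WORDS):
        h = hashlib.sha256(h).digest()
        data.append(int.from_bytes(h[:8], "little"))
    return data

def toy_hash(dataset: list, header: bytes, nonce: int, rounds: int = 64) -> int:
    """Mix a header/nonce with `rounds` pseudo-random dataset reads.

    Each read address depends on the current mix value, so the access
    pattern cannot be predicted or precomputed: the hardware has to
    actually fetch from memory, which is what makes the scheme
    memory-hard rather than compute-hard.
    """
    seed = hashlib.sha256(header + nonce.to_bytes(8, "little")).digest()
    mix = int.from_bytes(seed[:8], "little")
    for _ in range(rounds):
        idx = mix % DATASET_WORDS          # data-dependent address
        mix = (mix * 0x100000001B3 ^ dataset[idx]) & 0xFFFFFFFFFFFFFFFF
    return mix
```

A miner would sweep `nonce` values looking for an output below a difficulty target; here the point is only that every candidate costs 64 unpredictable memory reads.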

However, not all boards are equal. Some GPU architectures are quicker and more effective than others, and cards don’t all carry the same type of graphics memory, which makes some models better suited to mining than others.

For AMD, we tested all of the currently available mid-range cards; for NVIDIA, we used only the GeForce GTX 1060, since the lower-end models have insufficient VRAM and didn’t fare well.

RESULTS/BENCHMARKS

The Radeon R9 390 proves fastest at stock clock rates, but the Radeon RX 480 takes the lead once overclocked and optimized, and it stays atop the podium after its voltages are fine-tuned.

The GeForce GTX 1060 cards also did rather well, delivering between 24 and 25 MH/s once optimized, a considerable improvement.

This next test sets a target power consumption of 120W for each card, and the benchmark was conducted using two DAG Epochs.
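As background on those epochs: the Ethash DAG (the large lookup table miners keep in VRAM) starts at roughly 1 GiB and grows by about 8 MiB every 30,000 blocks, i.e. every epoch. A rough size estimate, using the spec’s published constants but ignoring its rounding of the entry count down to a prime, looks like this:

```python
# Ethash spec constants (approximate sizing; the real spec rounds the
# number of dataset entries down to a prime, which this sketch ignores).
DATASET_BYTES_INIT = 1 << 30    # ~1 GiB DAG at epoch 0
DATASET_BYTES_GROWTH = 1 << 23  # ~8 MiB added each epoch
EPOCH_LENGTH = 30000            # blocks per epoch

def epoch_of_block(block_number: int) -> int:
    """Which DAG epoch a given block height falls into."""
    return block_number // EPOCH_LENGTH

def approx_dag_size(epoch: int) -> int:
    """Approximate DAG size in bytes for a given epoch."""
    return DATASET_BYTES_INIT + DATASET_BYTES_GROWTH * epoch
```

This steady growth is why cards with little VRAM fall off over time: once the DAG no longer fits in graphics memory, hash rates collapse.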

For the GeForce GTX 1060 3GB, reaching 120W required overclocking and a voltage increase, while the Radeon R9 390 8GB needed the opposite treatment: a hefty cut to its power limit, which brought its clock down to nearly 500 MHz.

If you consider pure efficiency, the tables turn. Now the GeForce GTX 1060s are most attractive, with 4.15W consumed per MH/s, at worst, after optimizations.

The best Radeon card needs no less than 4.46W per MH/s, even after tuning. Note also that the GPU frequency and voltage of the Radeon R9 390 could not be properly changed, so it loses the benefit of those optimizations.
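Those efficiency figures are just power draw divided by hash rate, and from them you can estimate a daily electricity bill. A minimal sketch, using the article’s 4.15 W per MH/s figure and an assumed example price of $0.12/kWh (the price and the 25 MH/s / ~104 W operating point are illustrative, not measured values from the test):

```python
def watts_per_mhs(power_w: float, hashrate_mhs: float) -> float:
    """Efficiency metric used above: watts consumed per MH/s produced."""
    return power_w / hashrate_mhs

def daily_energy_cost(power_w: float, usd_per_kwh: float) -> float:
    """Cost of running the card flat-out for 24 hours."""
    return power_w / 1000.0 * 24.0 * usd_per_kwh

# A card hashing at 25 MH/s while drawing ~104 W lands right on the
# GTX 1060's 4.15 W per MH/s figure:
eff = watts_per_mhs(103.75, 25.0)        # -> 4.15
cost = daily_energy_cost(103.75, 0.12)   # ~$0.30/day at an assumed $0.12/kWh
```

Profitability is then this cost set against the coins earned per day, which is why the W-per-MH/s number matters more than the raw hash rate for long-term mining.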

VERDICT

Well, it turns out mining Ethereum is still viable on both GeForces and Radeons. But the right card depends on whether you’re after maximum hash rates or optimized efficiency, a.k.a. profitability.

Out of all the mainstream graphics cards, AMD’s Radeon RX 480 is the fastest once overclocked and tuned for tighter GDDR5 timings. AMD’s ReLive Edition Beta for Blockchain Compute driver also helps, improving performance and solving the slowdown caused by more recent DAG epochs.

If you are looking for sheer efficiency, NVIDIA’s GeForce GTX 1060 6GB is the go-to card for miners. Just be sure to select a model that lets you lower the GPU’s frequency and voltage to a sufficiently low threshold. It’s a shame NVIDIA doesn’t allow modification of its GPU BIOSes; that would have permitted further optimization of the 1060s, and perhaps they might have matched the RX 480.
