This post is a brief update to the previous article on the MSI RX580 and its mining performance.
I was going to simply append this to the original article, but since I had a considerable amount of new information and test results to add, I decided a new post made more sense.
First off, though, I will provide the modified BIOS file I used in the MSI RX580 article.
I do not usually like to supply BIOS files, as it is easy to modify your own, and the previous article, MSI RX580 Armor 8G Review and BIOS Mod Guide, links to the tools I use and walks through the process. However, since a few readers have asked for the file I used, I will share the BIOS mod I made while writing that article. Please note that this file is for the MSI RX580 Armor 8GB with Samsung memory; verify this before using it, as your card might be different.
You can easily check your memory brand using the GPU-z utility; version 1.20.0 or higher will properly detect the RX 5xx series cards.
Please also back up your original BIOS file(s) before making any changes, and remember that you make any modifications to your video cards and computer system AT YOUR OWN RISK! There is always the possibility of something going wrong, and the author of this article is not responsible for any changes or damage to your system or graphics cards that may result.
I still recommend learning how to make your own modifications, but here is a link to a zip file containing the BIOS Mod I used for the specific card reviewed in the MSI Armor RX580 article: MSI_RX580_8G_1-1750_Mod_bios
Ethereum + Decred Results
Shortly after the article was published, Claymore released version 9.2 of his Ethereum+DCR dual miner. It now identifies the RX 5xx series cards and offers some small mining improvements, especially when using the ASM functionality. The cards are now detected as RX 480/580, which should appease those worried that earlier versions showed them only as an RX 480; I suspect this is little more than a cosmetic change, but life goes on. It is this version (9.2) that I performed my follow-up tests with.
UPDATE: Claymore, it seems, has been busy, as we now have version 9.3 out. I ran through the same testing with 9.3 and things have not substantially changed, so the results shown should be essentially the same with any 9.x Claymore release.
When mining Ethereum alone, the results were identical to those with 9.1: 30 Mh/s per card, or 150 Mh/s for the 5-card rig, with an at-the-wall power draw of 650 watts. This results in an overall system mining efficiency of roughly 4.33 W/Mh.
Below I am running Ethereum + Decred on Claymore version 9.2, and you can see the results several hours in.
Again I have the core clocks set to 1125 MHz and the memory clocks set to 2175 MHz. I have adjusted the core voltage down to 900 mV in the Claymore launch .bat file, which results in a reading of around 0.8875 V (887 mV) in GPU-z. This yields a respectable sub-100-watt power consumption reading in GPU-z. The whole system, measured with a Kill-a-Watt meter, draws ~760 watts.
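For reference, a launch file with those settings might look like the sketch below. This is only an illustration: the pool hosts, ports, and wallet addresses are placeholders, and your pool's details will differ. The -cclock, -mclock, and -cvddc flags tell Claymore which core clock, memory clock, and core voltage to apply at startup.

```bat
REM Hypothetical Claymore 9.x dual-miner launch file using the clocks
REM and voltage from this post. Pools and wallets are placeholders only.
EthDcrMiner64.exe -epool eth-pool.example.com:4444 -ewal YOUR_ETH_WALLET ^
  -epsw x -dpool dcr-pool.example.com:3252 -dwal YOUR_DCR_WALLET ^
  -cclock 1125 -mclock 2175 -cvddc 900
```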
The formula I use to derive the per-card at-the-wall wattage is roughly as follows:
- First I apply the PSU efficiency rating to get the wattage actually reaching the rig. I am using a Corsair RM1000x PSU, which according to the manufacturer is about 89% efficient at 120V AC input and roughly a 76% (760W out of 1000W) load. Interestingly, you gain about 2% efficiency at 240V AC, which is what I run most of my rigs at. While I do it mainly because I can run roughly twice as many rigs on a single circuit at the same amperage, that 2% does add up once you start running dozens of rigs. For my test bench I use 120V so I can use the Kill-a-Watt meter. The results are close enough that for our purposes it really won't make a difference; just keep in mind you can gain a very slight power advantage by going 240V if you can.
- This gives us 760 × 0.89 = 676 watts going to the rig itself, which shows right off that the PSU is wasting (converting to heat) roughly 84 watts. This is one reason why Gold-rated or better PSUs are recommended.
- Next I subtract the system idle draw. To derive it, I measure the rig while not mining, shortly after boot-up; in my case this was around 55 watts. To keep the comparison apples to apples, I have already subtracted the PSU's share in this figure, as the Kill-a-Watt actually read 62 watts (62 × 0.89 ≈ 55). Since we account for the PSU in step one, I use only the system draw minus the PSU loss here; otherwise we would be subtracting it twice in the steps below.
- Now it is simply a matter of straightforward subtraction: the full-load Kill-a-Watt reading of 760 watts − 84 watts (PSU loss) − 55 watts (system idle) = 621 watts. It is these 621 watts that are being consumed by the GPUs while under load.
- All that remains is dividing 621 by the 5 GPUs in this rig, for a final result of about 124 watts per GPU, which I round to 125 below.
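The steps above reduce to a few lines of arithmetic. Here is a small sketch using the measurements from this post; the constant names are my own.

```python
# Per-card wattage derived from wall (Kill-a-Watt) readings, following
# the steps above. The constants are the measurements from this post.
PSU_EFFICIENCY = 0.89    # Corsair RM1000x at ~76% load, 120 V AC
WALL_MINING = 760.0      # Kill-a-Watt reading under full mining load
WALL_IDLE = 62.0         # Kill-a-Watt reading while not mining
NUM_GPUS = 5

rig_mining = WALL_MINING * PSU_EFFICIENCY   # ~676 W actually reaches the rig
rig_idle = WALL_IDLE * PSU_EFFICIENCY       # ~55 W system-only (non-GPU) draw
gpu_total = rig_mining - rig_idle           # ~621 W consumed by the GPUs
gpu_each = gpu_total / NUM_GPUS             # ~124 W per card

print(f"{gpu_each:.1f} W per GPU")          # prints "124.2 W per GPU"
```

Plug in your own PSU efficiency and meter readings; the structure is the same for any rig.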
So as you can see, we are actually using about 25 more watts in reality when mining than what GPU-z indicates, hence the importance of making your own measurements before calculating any ROI on your rigs.
This results in a real-world ratio of 125 watts / 30 Mh, or 4.17 watts per megahash (4.17 W/Mh), for Ethereum with some DCR income thrown in.
If you want to include the whole system draw, it comes to 762 W / 150 Mh, a 5.08 W/Mh overall ratio. While higher than the 4.33 W/Mh overall ratio we got above when mining Ethereum alone, the extra DCR income offsets this increase and provides additional profit over and above the added power usage. The only drawback is increased heat, so with the summer months coming to the Northern Hemisphere, this may be a concern and should be taken into consideration.
UPDATE: Some readers emailed to say they prefer an Mh/W ratio instead; easy enough. Inverting the figures above gives 0.1968 Mh/W when mining ETH+DCR and 0.2309 Mh/W when mining ETH alone. These figures are per card, with a prorated share of the system and PSU overhead applied as outlined above.
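Since the two ratio forms are simply reciprocals, converting between them is one division. A quick check using the system-level figures from this post:

```python
# Whole-system efficiency ratios from this post, expressed both ways.
# W/Mh and Mh/W are reciprocals of each other.
eth_dcr_w_per_mh = 762 / 150    # ETH+DCR: ~5.08 W/Mh
eth_only_w_per_mh = 650 / 150   # ETH alone: ~4.33 W/Mh

print(f"ETH+DCR:  {eth_dcr_w_per_mh:.2f} W/Mh, {1 / eth_dcr_w_per_mh:.4f} Mh/W")
print(f"ETH only: {eth_only_w_per_mh:.2f} W/Mh, {1 / eth_only_w_per_mh:.4f} Mh/W")
```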
Zcash Results
With Zcash mining, simply using the same core, memory, and core-voltage settings as we did with Ethereum, we get roughly 285 H/s (sols) per GPU, or around 1430 H/s for the entire 5-GPU rig.
Since the Equihash algorithm used by Zcash generally prefers a higher core clock than the Ethash algorithm used for Ethereum mining, I also ran a second test with the core clock raised to 1300 MHz. I then needed to increase the core voltage to 1050 mV as well, so it comes down to a trade-off between hash rate and power consumption. On the plus side, Zcash is not as memory-bandwidth dependent as Ethereum, so you can lower the memory clocks a bit, or leave them at default as in this case, to save a bit of power.
My results after several minutes of mining can be seen above. I should note that while I am only showing one GPU-z window to avoid cluttering the screenshot, all 5 cards are running the exact same settings. You can see we went from around ~286 H/s using the same settings as for Ethereum mining to 320 H/s per GPU when tweaking the settings for Zcash. I know a lot of people like to use the same settings so they can easily switch between algorithms, but it is well worth customizing all your batch launch files to the requirements of each coin.
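For a sense of scale, the per-card uplift from retuning works out as follows (figures taken from the runs above):

```python
# Hashrate gain per card from retuning the clocks for Zcash.
eth_settings = 286   # H/s per GPU with the Ethereum clocks/voltage
zec_settings = 320   # H/s per GPU at 1300 MHz core / 1050 mV

gain = zec_settings / eth_settings - 1
print(f"{gain:.1%} more sol/s per card")   # roughly a 12% improvement
```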
Because Ethereum is currently more profitable to mine than Zcash on more modern cards such as these, I did not spend a lot of time testing this algorithm; I only ran enough tests to ensure stability and to get a baseline metric for comparison. I do mine Zcash exclusively on my older R7- and R9-based hardware, as it comes out ahead on those chipsets, but for any RX-series card the Ethash algorithm reigns supreme at the moment.
While Zcash does need a higher core clock, it is not as taxing on the GPU overall as Ethereum. Even so, the higher core vddc setting needed to support the higher core speed results in an overall system wattage draw very similar to that of ETH+DCR mining, around 750 watts.