This article will guide you through the process of selecting the best GPU for mining.
Given the current worldwide chip shortage, it’s probably fair to say that the best mining GPU right now is the one you can actually afford. Read on, though, if you’re debating between two or more graphics processing units (GPUs). So which GPU should you get for mining?
The scope of this question is enormous. If electricity is expensive, are we optimizing for efficiency to keep running costs down? Or for raw performance, because solar panels make us self-sufficient in power? Is budget a constraint? The most powerful GPUs can currently cost more than $3,000, and GPU pricing is rather volatile right now. Together, these factors mean the right answer will be different for each individual.
Return on investment (ROI) is the primary criterion when selecting mining GPUs: how long does the GPU take to pay for itself? We like payback times of under one year, so choose a GPU based on this metric. Keep in mind that daily profitability fluctuates, so the projected ROI time is unlikely to match reality exactly.
The payback time for a graphics card can be estimated with the following formula: ROI time = GPU cost / (daily profit − daily electricity cost).
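The formula above can be sketched as a small helper. All the figures in the example are illustrative, not quotes for any particular card:

```python
# Sketch of the payback-time calculation (all figures are illustrative).
def payback_days(gpu_cost, daily_profit, daily_electricity_cost):
    """Days until the GPU pays for itself from net mining income."""
    net_daily = daily_profit - daily_electricity_cost
    if net_daily <= 0:
        raise ValueError("Mining at a loss: the GPU never pays for itself")
    return gpu_cost / net_daily

# Example: a $1,200 GPU earning $4.00/day while burning $0.70/day in power.
days = payback_days(1200, 4.00, 0.70)
print(round(days))  # ~364 days, just inside our one-year target
```

Note the guard for negative net income: if electricity costs more than the coins earned, the payback time is infinite and the card should not be bought for mining at all.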
The cost per megahash per second (MH/s) is another useful indicator. It is simple and reliable, as long as you compare like with like: comparing hashrates across two different algorithms, such as DaggerHashimoto on GPU1 and KawPow on GPU2, yields meaningless results.
Here’s how to figure out how much you’ll spend per megahash per second: cost per MH/s = GPU cost / hashrate on a specific algorithm.
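As a quick sketch of this metric, the numbers below are made up for illustration; what matters is that both hashrates are measured on the same algorithm:

```python
# Comparing two GPUs by cost per MH/s (illustrative prices and hashrates).
def cost_per_mhs(gpu_cost, hashrate_mhs):
    """Dollars paid per MH/s of hashrate on a given algorithm."""
    return gpu_cost / hashrate_mhs

# Both hashrates measured on the same algorithm (e.g. DaggerHashimoto).
gpu1 = cost_per_mhs(1200, 60)  # $20.00 per MH/s
gpu2 = cost_per_mhs(700, 30)   # $23.33 per MH/s -> GPU1 is the better value
```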
NVIDIA or AMD?
NVIDIA GPUs are generally more flexible than AMD ones, though not in every case. AMD cards mine both KAWPOW (Ravencoin) and DaggerHashimoto (Ethereum) very well, while NVIDIA GPUs are productive across a wider variety of coins.
When compared to AMD, NVIDIA allows you to mine a wider variety of algorithms at a competitive rate. NVIDIA GPUs may be more “future-proof.”
Still, AMD graphics processing units are not to be dismissed. Potentially, in the future, there will be alternative algorithms or currencies that will be more profitable on AMD GPUs than NVIDIA ones.
Power draw matters too: a 5-kilowatt mining farm in your dorm room is not a good idea, and neither is kicking out your roommate to make space for one.
If you rely on solar power or your electricity is extremely cheap (less than $0.056 per kilowatt-hour), you are in luck: you can go for a GPU with lower efficiency and higher power consumption, such as the RTX 3090.
Which GPUs are Better?
Are you thinking about reselling the GPUs in a few years? If so, we recommend the RTX 3000 or RX 6000 series, which will keep the card relevant for gaming even after three to four years. If you plan to use the GPU only for mining and will not resell it, simply choose the card with the best return on investment, whether it’s the newest model or one from a previous generation, like the RX 5000 or RTX 2000.
Some older cards, however, are best avoided for mining. For example, the growing DAG size is degrading the performance of the GTX series: the GTX 1080 Ti, which used to hash at roughly 55 MH/s, now only manages 45 MH/s.
Used GPU or New?
It is often argued that GPUs previously used for mining will have degraded over time. This is not necessarily the case: a GPU can keep hashing for a long time if it is kept clean, its temperature stays below 70 degrees Celsius, and its fans run at roughly 60% speed.
You are less likely to buy a defective GPU if you can find one with a documented history. Perhaps, just as vehicles track mileage, GPU makers should record past temperatures and loads in the BIOS.
How to Determine Your Power Price?
Keep in mind that the per-kilowatt-hour rate you pay may vary from the one listed on your bill. Typically, the price of power is added to a network charge and a value-added tax.
This means your effective electricity price differs from the rate shown on your bill. To find it, divide your total bill by the number of kilowatt-hours you consumed during the previous billing cycle.
During November, Bob consumed 3,000 kWh. His listed energy rate is $0.06 per kWh, so the energy alone should cost $180 (3000 × 0.06). His invoice, however, totals $320, because network charges and VAT are added on top.
Bob can calculate the true cost of his electricity by dividing his total bill ($320) by the number of kilowatt-hours he consumed (3,000). The resulting effective price is approximately $0.107 per kilowatt-hour.
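Bob’s calculation from the example above, as a one-liner:

```python
# Effective electricity price from Bob's example above.
def effective_kwh_price(total_bill, kwh_consumed):
    """Effective price per kWh, including network fees and VAT."""
    return total_bill / kwh_consumed

price = effective_kwh_price(320, 3000)
print(round(price, 3))  # 0.107 -- well above the listed $0.06 energy rate
```

This is the number to plug into any mining profitability calculator, not the headline rate from the utility’s price sheet.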