
Tracking GPU Crypto mining activity via software / hardware?

Soldato
Joined
30 Jun 2019
Posts
7,875
Could Crypto mining algorithm usage be tracked by graphics cards? Ideally, this would be done through a hardware chip, so the tracking could not be disabled.

The purpose of this would be to better protect those who buy used GPUs, as some graphics cards will be on their last legs when resold (or have reduced performance in many cases) due to extensive GPU mining. Sellers could check/report these statistics and grade GPUs accordingly.

Another benefit would be lower prices for cards that were used for mining, which should bring down average used graphics card prices.
 
Soldato
OP
Joined
30 Jun 2019
Posts
7,875
What do you expect this hardware chip to track? You do realise that the temperatures and workload experienced during crypto mining are the same stresses experienced during gaming?

Sorry, but this point makes no sense. Do gamers play GPU-intensive games 24 hours a day, every day? It's the constant use of the GPU at high power consumption that will eventually degrade it. This would take much longer (many, many years) to occur with average gaming usage, but has been known to happen within months of GPU mining.

The chip could record the total power consumption while running crypto algorithms, and the number of hours spent mining. You could record other data too, but it would be a waste of time :p.

I accept your point that manufacturers may not be interested in doing this, but I would've thought such a basic chip could be integrated into the GPU itself by Nvidia. AMD do not seem interested in this particular problem though.

We record usage data (SMART) for SSDs and hard drives, and it can be very useful for diagnosing faults and developing problems; why not for graphics cards too?
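As a rough sketch of what such SMART-style counters might look like (everything here is hypothetical — no current GPU exposes anything like this — the sampling interval, power figure, and field names are invented for illustration):

```python
from dataclasses import dataclass

@dataclass
class MiningCounters:
    """Hypothetical SMART-style counters a GPU could keep on-die."""
    mining_seconds: int = 0        # time spent running recognised mining kernels
    mining_energy_j: float = 0.0   # energy drawn while those kernels ran

    def sample(self, interval_s: int, power_w: float, mining_kernel_active: bool) -> None:
        """Called once per sampling interval by the (hypothetical) firmware."""
        if mining_kernel_active:
            self.mining_seconds += interval_s
            self.mining_energy_j += power_w * interval_s

    @property
    def mining_hours(self) -> float:
        return self.mining_seconds / 3600

# Example: two hours of mining at a steady 220 W, sampled every 10 s
c = MiningCounters()
for _ in range(720):
    c.sample(10, 220.0, mining_kernel_active=True)
print(c.mining_hours)     # 7200 s of mining → 2.0 hours
print(c.mining_energy_j)  # 220 W × 7200 s → 1,584,000 J
```

A buyer or RMA technician would only ever read the two accumulated totals, much like checking an SSD's total-bytes-written attribute.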

Edit - There's a YT video discussing a study on data retention in modern graphics cards. The study found that damage (an increase in memory errors) could occur within just 4 months of running a graphics card at high VRAM temperatures (judging by the graph on the left, stability was significantly reduced at 70 degrees or higher). Link here:
https://www.youtube.com/watch?v=kdzsBDenww4&t=493s
 
Last edited:
Soldato
OP
Joined
30 Jun 2019
Posts
7,875
I'm a bit staggered that some seem to be suggesting that running a graphics card at high temperatures (particularly the VRAM) for long periods is not what harms graphics cards. I wonder what standard of evidence or data collection would be required to convince people otherwise?

What I said about high power consumption still stands: some higher-end models will still be running at high power consumption (and corresponding temps) while mining, even if clock rates and other settings are altered.

I did try a bit of mining myself for a few days in February, and it was possible to reduce power usage on an RTX 3070 significantly, but at a reduced hash rate.
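The trade-off here is hash rate per watt, not raw hash rate. The numbers below are purely illustrative (not measurements from the card above), but they show why a power-limited card can come out ahead on efficiency even while hashing slower:

```python
def efficiency_mh_per_w(hashrate_mh: float, power_w: float) -> float:
    """Mining efficiency: megahashes per second per watt drawn."""
    return hashrate_mh / power_w

# Illustrative figures only: stock vs power-limited settings
stock = efficiency_mh_per_w(60.0, 220.0)   # ~0.27 MH/s per W
tuned = efficiency_mh_per_w(55.0, 130.0)   # ~0.42 MH/s per W
print(round(stock, 3), round(tuned, 3))
```

Less total energy per hash also means less heat to dissipate, which is the quantity the degradation argument actually turns on.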

I would guess that if you use a power-efficient model and keep the VRAM temps at 60 or below, this should reduce the likelihood of degradation, but there's no evidence to confirm what a safe mining operating temperature would be for long-term use.

Obviously an ideal solution would be auto-throttling of memory clock speed/power usage to keep VRAM temps at 60 degrees or lower during mining sessions.
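A naive version of that throttling rule could look like the following. This is a sketch only: the 60 °C target comes from the post above, while the step size and clock limits are invented, and no real driver exposes a hook like this:

```python
def throttle_step(vram_temp_c: float, mem_clock_mhz: int,
                  target_c: float = 60.0,
                  step_mhz: int = 50,
                  min_clock_mhz: int = 5000,
                  max_clock_mhz: int = 9000) -> int:
    """Return the next memory clock: back off while VRAM runs hot,
    creep back up once it is at or under the target."""
    if vram_temp_c > target_c:
        return max(min_clock_mhz, mem_clock_mhz - step_mhz)
    return min(max_clock_mhz, mem_clock_mhz + step_mhz)

print(throttle_step(72.0, 8000))  # too hot: step down to 7950
print(throttle_step(58.0, 8000))  # under target: step up to 8050
```

Called once per sampling interval, this converges on whatever memory clock the cooling can sustain at the target temperature, at the cost of a lower hash rate on hot days.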

There have also been some anecdotal reports of clock rates degrading over many months/years of GPU mining, so the effects do not appear to be limited to the graphics memory (tbf, this could eventually be caused by gaming too).
 
Last edited:
Soldato
OP
Joined
30 Jun 2019
Posts
7,875
Well, no evidence will ever be enough if people only believe what they want to believe. It's not me saying it either; if you don't believe what's been said, that's entirely your choice. There are degrees of degradation: the changes will be slow and gradual under the conditions mentioned, and no one can say they will definitely occur.

There's no onus on me, because it has nothing to do with me.
 
Soldato
OP
Joined
30 Jun 2019
Posts
7,875
Actually, from Nvidia's point of view, there may be a point. If a graphics card has been worn out/degraded a lot, they could check the crypto tracking chip's counters to see whether hundreds/thousands of hours of GPU mining have been done on the card. If so, this could void the warranty when the card is sent back to Nvidia or the graphics card manufacturers.
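If counters like those existed, the RMA-side check would be trivial. A sketch, with the threshold entirely invented (nobody has published what "excessive" mining hours would mean for warranty purposes):

```python
def flag_for_mining_wear(mining_hours: float, threshold_hours: float = 500.0) -> bool:
    """Hypothetical RMA triage: flag cards mined beyond a threshold
    for closer inspection or a warranty decision."""
    return mining_hours >= threshold_hours

print(flag_for_mining_wear(1200.0))  # heavily mined card → True
print(flag_for_mining_wear(40.0))    # lightly used card → False
```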

There are other reasons too (mostly to do with the effect it has had on the graphics card market / getting cards into the hands of retailers across the world, etc.).

I suspect the lifespan of LHR card versions will, on average, be longer than that of unrestricted cards, and I think this may have been a factor in Nvidia's decision to act against GPU crypto mining.
 
Last edited:
Soldato
OP
Joined
30 Jun 2019
Posts
7,875
@Scougar - Huh? Why would it be bad for consumers if they could buy 2nd hand graphics cards at a reduced price (closer to their actual value)? And actually be aware of how much a graphics card has been mined on before purchasing one?

At the moment, even 2nd hand graphics card prices are massively inflated, and they sometimes cost the same as, or more than, brand new graphics cards.

How would the warranties of gamers be affected if they don't use their GPU extensively for crypto mining?

I get it though, you don't like the idea.
 
Soldato
OP
Joined
30 Jun 2019
Posts
7,875
Well, that's crap. The GPU would track usage of the specific mining algorithms (and nothing else). It would be pretty easy to see which cards have been mined on for hundreds/thousands of hours and which haven't. The tracking could be built into the GPU die itself, and therefore would have nothing to do with the manufacturers.

You don't like the idea; that's all you needed to say. Really, I'm not expecting much of a reply, with it apparently being such a divisive issue.
 
Soldato
OP
Joined
30 Jun 2019
Posts
7,875
I don't think the most relevant question is whether it can be done by Nvidia.

Rather, it's whether this would help to correct the overvaluing of used graphics cards, which in the past were always sold at a discount compared to new cards. The ridiculous thing is that GPU mining has substantially increased the value of cards that are years old (or at least contributed to the value of used cards).

Another factor: with the potential end of wide-scale GPU mining (of Ethereum), would this measure still be necessary if GPU mining declines massively in 2022? Perhaps Nvidia should have the design of this feature ready to go, in case other GPU mining algorithms become widely used in 2022.
 
Last edited: