Tracking GPU Crypto mining activity via software / hardware?

Soldato
Joined
30 Jun 2019
Posts
7,875
Could Crypto mining algorithm usage be tracked by graphics cards? Ideally, this would be done through a hardware chip, so the tracking could not be disabled.

The purpose of this would be to better protect those who buy used GPUs, as some graphics cards will be on their last legs when re-sold (or have reduced performance in many cases) due to extensive GPU mining. Sellers could check / report these statistics and grade GPUs accordingly.

Another added benefit would be reducing the prices of cards that were used for mining, which should reduce average used graphics card prices.
 
Soldato
Joined
6 Jan 2013
Posts
21,849
Location
Rollergirl
What do you expect this hardware chip to track? You do realise that the temperatures and workload experienced during crypto mining are the same as the stresses experienced during gaming? For example, my 3080's memory temperature will exceed 100°C during gaming sessions just as it will during mining sessions.

Considering that gamers can and will push the hardware further than miners, will you be having this hardware chip record all overclocks? Should it record when coolers are removed to be replaced by water blocks?

Finally, considering the fact that manufacturers want to sell as many new GPUs as possible, why do you think they'd invest time and money to ensure you were getting the best possible deal on a used GPU?
 
Soldato
OP
Joined
30 Jun 2019
Posts
7,875
What do you expect this hardware chip to track? You do realise that the temperatures and workload experienced during crypto mining are the same as the stresses experienced during gaming?

Sorry, but this point makes no sense. Do gamers play GPU-intensive games for 24 hours, every day? It's the consistent use of the GPU at high power consumption that will eventually degrade it. This would take much longer (many, many years) to occur with average gaming usage, but has been known to happen within months of GPU mining.

The chip could record the total power consumed while running crypto algorithms, and the number of hours spent mining. You could record other data too, but it would be a waste of time :p.
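
Purely for illustration, here's a rough Python sketch of the kind of counters I'm imagining such a chip could expose - every name and threshold below is hypothetical, not any real Nvidia/AMD interface:

```python
from dataclasses import dataclass


@dataclass
class GpuUsageCounters:
    """Hypothetical tamper-resistant counters a GPU could expose.

    None of these fields correspond to a real Nvidia/AMD interface;
    this is only a sketch of the data being discussed in this thread.
    """
    total_power_on_hours: float    # total time the card has been powered
    hours_above_90pct_load: float  # time spent at sustained near-full load
    total_energy_kwh: float        # lifetime board energy consumption
    max_vram_temp_c: int           # highest memory junction temperature seen
    thermal_cycles: int            # count of large temperature swings


def grade_used_card(c: GpuUsageCounters) -> str:
    """Toy grading rule a buyer or seller could apply to the counters."""
    if c.hours_above_90pct_load > 8000 or c.max_vram_temp_c >= 100:
        return "heavy use - price accordingly"
    if c.hours_above_90pct_load > 2000:
        return "moderate use"
    return "light use"
```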

I accept your point that manufacturers may not be interested in doing this, but I would've thought such a basic chip could be integrated into the GPU itself by Nvidia. AMD do not seem interested in this particular problem though.

We record usage data (SMART) for SSDs and hard drives, and it can be very useful for diagnosing faults and developing problems, so why not for graphics cards as well?
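
As a comparison, here's a minimal sketch of pulling a couple of wear-related SMART attributes from a drive with smartctl (assumes smartmontools is installed, the script has enough privileges, and the drive reports classic ATA-style attributes - the exact output varies between drives):

```python
import subprocess


def read_smart_attributes(device: str = "/dev/sda") -> dict:
    """Return SMART attribute name -> raw value using `smartctl -A`."""
    out = subprocess.run(
        ["smartctl", "-A", device],
        capture_output=True, text=True, check=True,
    ).stdout
    attrs = {}
    for line in out.splitlines():
        parts = line.split()
        # ATA attribute rows look like: ID# NAME FLAG VALUE WORST ... RAW_VALUE
        if parts and parts[0].isdigit() and len(parts) >= 10:
            attrs[parts[1]] = parts[-1]
    return attrs


if __name__ == "__main__":
    smart = read_smart_attributes()
    for name in ("Power_On_Hours", "Temperature_Celsius", "Total_LBAs_Written"):
        print(name, smart.get(name, "n/a"))
```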

Edit - There's a YT video which discusses a study on data retention in modern graphics cards. The study found that damage (an increase in memory errors) could occur within just 4 months of running a graphics card at high VRAM temperatures (judging by the graph on the left, stability was significantly reduced at 70 degrees or higher). Link here:
https://www.youtube.com/watch?v=kdzsBDenww4&t=493s
 
Soldato
Joined
6 Oct 2004
Posts
18,343
Location
Birmingham
Sorry, but this point makes no sense. Do gamers play GPU-intensive games for 24 hours, every day? It's the consistent use of the GPU at high power consumption that will eventually degrade it. This would take much longer (many, many years) to occur with average gaming usage, but has been known to happen within months of GPU mining.

Technically it's high temperatures (which are usually caused by high power consumption, but can be managed with adequate cooling).

I thought it was common knowledge (and sense!) that it is repeated heat cycling (and the related thermal expansion/contraction) which causes the most stress to electronics, in particular solder joints. My card gets significantly hotter whilst gaming than while mining. My fan bearings will probably go earlier due to the mechanical wear of running 24/7 while mining, but in terms of chip/PCB failure I'd be more concerned about the electronics constantly cycling between 50-85°C (gaming) than sitting at 60°C for extended periods (mining).
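
Just to put a rough number on that (purely illustrative - the exponent is an assumed textbook-style value, not measured for any particular card), solder fatigue is often modelled with a Coffin-Manson-type relation where the number of cycles to failure scales as the temperature swing raised to a negative power:

```python
def relative_cycles_to_failure(delta_t_c: float, exponent: float = 2.0) -> float:
    """Coffin-Manson-style relative fatigue life: N_f proportional to dT ** -n.

    The exponent is an assumed illustrative value; real values depend on
    the solder alloy and joint geometry.
    """
    return delta_t_c ** -exponent


# Gaming-style cycling (50C idle -> 85C load) versus a much milder swing.
gaming_swing = relative_cycles_to_failure(35.0)
mild_swing = relative_cycles_to_failure(10.0)
print(f"A 10C swing tolerates roughly {mild_swing / gaming_swing:.0f}x "
      f"more cycles than a 35C swing under this assumed model")
```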

What do you expect this hardware chip to track? You do realise that the temperatures and workload experienced during crypto mining are the same as the stresses experienced during gaming?

Considering they are able to make LHR versions without impacting gaming performance, I assume they are able to identify when mining algorithms are running?
 
Associate
Joined
8 Sep 2020
Posts
1,438
No issue with cards that have been used for mining; no worse than any other card IMO. Certainly no degradation in performance. The only issue, as mentioned above, is that the fans may be knackered, but they can be changed easily enough if needed.

 
Soldato
Joined
30 Nov 2011
Posts
11,376
Sorry, but this point makes no sense. Do gamers play GPU-intensive games for 24 hours, every day? It's the consistent use of the GPU at high power consumption that will eventually degrade it. This would take much longer (many, many years) to occur with average gaming usage, but has been known to happen within months of GPU mining.

The chip could record the total power consumed while running crypto algorithms, and the number of hours spent mining. You could record other data too, but it would be a waste of time :p.

I accept your point that manufacturers may not be interested in doing this, but I would've thought such a basic chip could be integrated into the GPU itself by Nvidia. AMD do not seem interested in this particular problem though.

We record usage data (SMART) for SSDs and hard drives, and it can be very useful for diagnosing faults and developing problems, so why not for graphics cards as well?

Edit - There's a YT video which discusses a study on data retention in modern graphics cards. The study found that damage (an increase in memory errors) could occur within just 4 months of running a graphics card at high VRAM temperatures (judging by the graph on the left, stability was significantly reduced at 70 degrees or higher). Link here:
https://www.youtube.com/watch?v=kdzsBDenww4&t=493s

My mining cards run at much, much lower power than gaming, and they also run much lower memory temps than gaming. Most serious miners would be the same, precisely because we don't want GPUs to die within a few months. There's also no way to reliably record when the GPU is mining, as it's just running CUDA operations - unless you are suggesting GPU manufacturers store a log of exactly what application is being run on the GPU?
Imagine the privacy issues.

You could log the total hours active, but again, some people don't turn off their PCs, so that wouldn't really tell you anything about usage either.

LHR only affects Ethereum, not the hundreds of other algorithms miners use. And if they are selling LHR cards, what would be the value of tracking whether a card has been used in LHR mode, when miners are not going to be using LHR cards anyway?

Someone could equally have used their GPU for video encoding 24/7 for months; if they did that at default settings, or even overclocked rather than tuned for efficiency, it would be more damaging than mining.

The answer is: if you are worried about what someone has done to their GPU, don't buy second hand.

My 8 mining cards have been running for coming up on 5 years, so if mining kills GPUs in a few months, how are mine still going after 60 months?
I've not had a single failure. A motherboard went, not a single GPU. So do we need to track who has been using motherboards for mining too? How about CPUs? Let's track everything everyone does with every item they own...
 

Deleted member 66701

Could Crypto mining algorithm usage be tracked by graphics cards? Ideally, this would be done through a hardware chip, so the tracking could not be disabled.

The purpose of this would be to better protect those who buy used GPUs, as some graphics cards will be on their last legs when re-sold (or have reduced performance in many cases) due to extensive GPU mining. Sellers could check / report these statistics and grade GPUs accordingly.

Another added benefit would be reducing the prices of cards that were used for mining, which should reduce average used graphics card prices.

Why? What's in it for the GPU manufacturers?
 
Associate
Joined
18 Oct 2002
Posts
1,938
Location
Sheffield
No issue with cards that have been used for mining; no worse than any other card IMO. Certainly no degradation in performance. The only issue, as mentioned above, is that the fans may be knackered, but they can be changed easily enough if needed.


I can't say I'm given confidence by Linus, who asked his mining friend to loan him 3 cherry-picked GPUs to test.

Why not take a look at any of the dedicated mining YouTubers who have a differing viewpoint on this? Such as this guy, who's been doing GPU mining videos since before Ethereum:
https://youtu.be/vnpb8KMe1Gk?t=3208
 
Soldato
Joined
6 Oct 2004
Posts
18,343
Location
Birmingham
I can't say I'm given confidence by Linus, who asked his mining friend to loan him 3 cherry-picked GPUs to test.

Why not take a look at any of the dedicated mining YouTubers who have a differing viewpoint on this? Such as this guy, who's been doing GPU mining videos since before Ethereum:
https://youtu.be/vnpb8KMe1Gk?t=3208

"You can't run too many things at 100c 24/7".

No **** Sherlock. How many miners actually run their cards at 100°C? As @andybird123 says, anyone with half a brain cell will be underclocking and undervolting their card, and ensuring plenty of cooling to extend the life of the card.
 
Associate
Joined
18 Oct 2002
Posts
1,938
Location
Sheffield
Founders 3080/90s* at stock settings will hit 100°C TJunction mining Ethereum. The Founders threads are full of people discussing changing their pads because of this.

* 3060 Ti/3070s do not have a memory temp reading.
 
Associate
Joined
8 Sep 2020
Posts
1,438
I can't say I'm given confidence by Linus, who asked his mining friend to loan him 3 cherry-picked GPUs to test.

Why not take a look at any of the dedicated mining YouTubers who have a differing viewpoint on this? Such as this guy, who's been doing GPU mining videos since before Ethereum:
https://youtu.be/vnpb8KMe1Gk?t=3208

95% of people mining will not be running their cards at 100°C 24/7, and even if they did, I still personally don't see it as an issue, as the memory will not throttle until it reaches 110°C. If it really was a big issue we would be seeing many, many failed 3080/3090s etc. by now, but that is not the case, and if it was, Nvidia would have a big issue on their hands considering most cards come with a decent warranty period.

My own results are in line with what Linus posted and I have seen no degradation in performance in either of my cards. My 3090 is actually putting in better benchmark scores than when I first got the thing, yet it's been mining around 21 hours each day and playing games the other 3. Core temps and memory temps are perfect and nowhere near 100°C, as a simple thermal pad change is all it takes to get the memory temperature under control, which 95% of people will do whether that be gaming or mining.
 
Soldato
Joined
28 Oct 2009
Posts
5,294
Location
Earth
Founders 3080/90s* at stock settings will hit 100°C TJunction mining Ethereum. The Founders threads are full of people discussing changing their pads because of this.

* 3060 Ti/3070s do not have a memory temp reading.

The 3060 Ti and 3070 don't have the same issues; they don't use GDDR6X.
 
Don
Joined
19 May 2012
Posts
17,183
Location
Spalding, Lincolnshire
The purpose of this would be to better protect those who buy used GPUs, as some graphics cards will be on their last legs when re-sold (or have reduced performance in many cases) due to extensive GPU mining.
Sorry, but this point makes no sense. Do gamers play GPU-intensive games for 24 hours, every day? It's the consistent use of the GPU at high power consumption that will eventually degrade it. This would take much longer (many, many years) to occur with average gaming usage, but has been known to happen within months of GPU mining.

Ah more utter rubbish - guess you're bitter as miners have bought all the graphics cards and you haven't been able to get one?

As already mentioned above thermal cycling (e.g. due to short periods of high temperature i.e. gaming, then cooling down) is generally far worse for hardware than running continuously.

GPUs aren't run at "high power consumption" when mining - typically they are run at lower power consumption than you would during gaming to reduce power usage, and maximise profit.
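
For anyone wondering what that tuning looks like in practice, here's a minimal sketch that caps the board power limit through nvidia-smi (needs admin/root; the wattage is only an example, so check your card's supported range with nvidia-smi -q -d POWER first):

```python
import subprocess


def set_power_limit(watts: int, gpu_index: int = 0) -> None:
    """Cap the board power limit with nvidia-smi (requires admin/root)."""
    subprocess.run(
        ["nvidia-smi", "-i", str(gpu_index), "-pl", str(watts)],
        check=True,
    )


if __name__ == "__main__":
    # Example cap only - pick a value inside your card's supported range.
    set_power_limit(220)
```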
 
Soldato
Joined
30 Nov 2011
Posts
11,376
Ah more utter rubbish - guess you're bitter as miners have bought all the graphics cards and you haven't been able to get one?

As already mentioned above thermal cycling (e.g. due to short periods of high temperature i.e. gaming, then cooling down) is generally far worse for hardware than running continuously.

GPUs aren't run at "high power consumption" when mining - typically they are run at lower power consumption than you would during gaming to reduce power usage, and maximise profit.

His aim seems to be to increase the cost of new cards in order for him to save money buying 2nd hand cards in the future.
 
Associate
Joined
31 Dec 2008
Posts
2,284
My 3080 FE has been mining non-stop when not gaming since I got it at the beginning of November, and the clocks it reaches at a given voltage are exactly the same as on day one, so no sign of degradation at all. The stable memory overclock is exactly the same as well.
 
Caporegime
Joined
17 Mar 2012
Posts
47,640
Location
ARC-L1, Stanton System
Ah more utter rubbish - guess you're bitter as miners have bought all the graphics cards and you haven't been able to get one?

As already mentioned above thermal cycling (e.g. due to short periods of high temperature i.e. gaming, then cooling down) is generally far worse for hardware than running continuously.

GPUs aren't run at "high power consumption" when mining - typically they are run at lower power consumption than you would during gaming to reduce power usage, and maximise profit.

True, I do a bit of mining on NiceHash when I'm not using the GPU.

GPU temp 60°C, VRAM temp 70°C, power 105 W, 44 MH/s.

This is with the NiceHash Quick Miner; it's specifically for Nvidia, and it does all the undervolting and overclocking automatically.

https://www.techpowerup.com/gpu-specs/msi-rtx-2070-super-gaming-x.b7142
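
If anyone wants to watch the same numbers themselves, here's a quick sketch that polls them through nvidia-smi's CSV query mode (note that this interface reports the core temperature, not the VRAM junction temperature, on most cards):

```python
import subprocess
import time


def poll_gpus(interval_s: int = 30) -> None:
    """Print core temperature (C), board power (W) and GPU load (%) per GPU."""
    query = "temperature.gpu,power.draw,utilization.gpu"
    while True:
        out = subprocess.run(
            ["nvidia-smi", f"--query-gpu={query}", "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        ).stdout
        for line in out.strip().splitlines():
            temp, power, util = [v.strip() for v in line.split(",")]
            print(f"core {temp}C  power {power}W  load {util}%")
        time.sleep(interval_s)


if __name__ == "__main__":
    poll_gpus()
```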
 
Associate
Joined
19 Sep 2020
Posts
212
Could Crypto mining algorithm usage be tracked by graphics cards? Ideally, this would be done through a hardware chip, so the tracking could not be disabled.

The purpose of this, would be to better protect those who buy used GPUs, as some graphics cards will be on their last legs when re-sold (or have reduced performance in many cases), due to extensive GPU mining. Sellers could check / report these statistics and grade GPUs accordingly.

Another added benefit, would be reducing prices of cards that were used for mining, which should reduce average used graphics card prices.

You can only tell that when the owner is being completely honest with the manufacturer.

You can believe me or not, but there are some people asking how to increase hash rates and other *****. :D
 
Associate
Joined
31 Dec 2008
Posts
2,284
True, I do a bit of mining on NiceHash when I'm not using the GPU.

GPU temp 60°C, VRAM temp 70°C, power 105 W, 44 MH/s.

This is with the NiceHash Quick Miner; it's specifically for Nvidia, and it does all the undervolting and overclocking automatically.

https://www.techpowerup.com/gpu-specs/msi-rtx-2070-super-gaming-x.b7142
The 2070 Super has a VRAM temperature reading? Didn't know that.
I was mining a bit on my laptop, but it doesn't have a memory temperature reading and only shows memory power draw, which was twice as high as when gaming, so I gave up on that as it is a bit worrying.
Laptops don't have great cooling in general, so it's probably not as safe to do as on a desktop GPU.
 
Soldato
Joined
6 Oct 2004
Posts
18,343
Location
Birmingham
The 2070 Super has a VRAM temperature reading? Didn't know that.
I was mining a bit on my laptop, but it doesn't have a memory temperature reading and only shows memory power draw, which was twice as high as when gaming, so I gave up on that as it is a bit worrying.
Laptops don't have great cooling in general, so it's probably not as safe to do as on a desktop GPU.

Yeah, I tried on my laptop for a few days, saw the temperatures at ~75°C and the fans at 100%, and decided it was probably not a good idea! :cry:
 