
Poll: Will you be buying a 2080Ti/2080/2070?

Which card will you be buying?


  • Total voters
    1,201
  • Poll closed.
Soldato
Joined
26 May 2009
Posts
22,101
Wow, just... Wow! Surely you're able to recognize an extremely poor article? Go watch Gamers Nexus' response video and let's get some sense into you :D
Here, I made it easier for you. :p
Already seen it; as mentioned before, I turned it off halfway through because it was too cringey. Not sure if Steve genuinely doesn't understand it (may be an age thing) or if he's playing up his character like Linus/Jay do, because he knows most of his viewers don't understand it and it's what they want to hear.
 
Associate
Joined
31 Jan 2012
Posts
2,004
Location
Droitwich, UK
With this justification from the author, how can anyone possibly think the original article was satirical? Maybe it's not just Americans who don't understand it...

https://www.notebookcheck.net/Contr...endation-to-preorder-Nvidia-RTX.324802.0.html

Avram Piltch later responded to critics, saying that the article was written that way deliberately in order to spark discussion about the new GPUs. He also said:

"My point is that, if RTX cards deliver even half of what they promise, they are worth the current prices (provided you can afford to pay them). Whether you want to buy one now (or at launch), I would not wait for some hypothetical future day when prices may drop.”
 
Associate
Joined
27 Jul 2015
Posts
1,470
Interesting comment from a different forum, from a bitcoin/virtual-currency miner who claims the market is now so bad that they are losing money month on month, and that unless it picks up soon there's going to be an awful lot of high-end graphics cards dumped on the second-hand market. I had seen an article somewhere about these people wanting to return the cards to the sellers to get their money back; it did say how they did this, but I don't remember how.
 
Associate
Joined
27 Jul 2015
Posts
1,470
Interesting; I'm already nervous about second-hand cards, and that only furthers my concerns.

My thinking is that the thermal shock that silicon suffers when it is powered up from cold is potentially far more damaging than running it 24/7 at a constant temperature. I can't find the figures at the moment, but the thermal increase on power-up is quite startling.
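
As a rough back-of-envelope sketch of that intuition, assuming the empirical Coffin-Manson model for thermal-cycling fatigue (the constants and temperature swings below are illustrative, not real GPU figures):

```python
# Coffin-Manson relation: cycles to failure falls off as a power of the
# temperature swing. c and m are illustrative constants, NOT measured values.
def cycles_to_failure(delta_t: float, c: float = 1e9, m: float = 2.5) -> float:
    """Estimated thermal cycles to failure for a temperature swing delta_t (K)."""
    return c * delta_t ** -m

# A cold power-up might swing ~50 K; a card held at a steady load barely
# cycles at all (say ~5 K of ripple).
for swing in (50, 5):
    print(f"dT = {swing:>2} K -> ~{cycles_to_failure(swing):,.0f} cycles to failure")
```

On those made-up numbers, each cold start costs roughly 300 times as much fatigue life as a small steady-state ripple cycle, which is the intuition behind the post above.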
 
Soldato
Joined
19 May 2009
Posts
3,113
Location
Cannock
Genuinely wasn't planning to get one of these cards due to how overpriced they are; I was going to stick with my 980 Ti until it dies or can't hold the FPS I like any more. However, after reading an article on it by the editor of Tom's Hardware, I've decided to splurge on a 2080 Ti on payday.

An article that has no benchmarks or substantial indicators of the product's performance enticed you into splurging on a 2080Ti? Can I ask which parts of the article swung it for you?

I'm all for new stuff, but I generally like to know what I'm buying before I start parting with money, in the same way you'd test drive a car before committing to purchase.
 
Soldato
Joined
19 Oct 2008
Posts
5,952
My thinking is that the thermal shock that silicon suffers when it is powered up from cold is potentially far more damaging than running it 24/7 at a constant temperature. I can't find the figures at the moment, but the thermal increase on power-up is quite startling.
Depends. I would have thought testing and MTBF figures included cold starts, but maybe not 24/7 operation at constant load. So while cold starts may be damaging, the components are probably designed and built with that consideration in mind. Gaming GPUs are not usually run 24/7 for gaming. Most likely, if a card can withstand the cold starts it should withstand a constant load, but you're still edging towards that MTBF. GPUs used for mining usually have the memory overclocked too, and some owners could be pushing things a bit far without necessarily realising it, but doing damage all the same.

I'm no expert, but when mining was doing well there was also the mindset some had to run a GPU as hard as necessary to generate the most coins. I definitely wouldn't want to buy a used one of those, even just 6 months old.
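
As a minimal sketch of how 24/7 running eats into an MTBF budget, assuming the standard constant-failure-rate (exponential) model behind MTBF figures (the 200,000-hour MTBF and the usage patterns below are made-up examples, not quoted specs):

```python
import math

def prob_failure_by(hours: float, mtbf_hours: float) -> float:
    """P(failure by t) = 1 - exp(-t / MTBF) under the exponential model."""
    return 1.0 - math.exp(-hours / mtbf_hours)

mtbf = 200_000.0               # hypothetical MTBF, not a quoted spec
gaming_hours = 3 * 365 * 3     # ~3 hours/day of gaming for 3 years
mining_hours = 24 * 365 * 3    # 24/7 mining for 3 years
print(f"gaming: {prob_failure_by(gaming_hours, mtbf):.1%}")
print(f"mining: {prob_failure_by(mining_hours, mtbf):.1%}")
```

Under those assumptions, three years of 24/7 mining gives roughly eight times the failure probability of three years of evening gaming, before even considering thermals or overclocked memory.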

I'm all for new stuff, but I generally like to know what I'm buying before I start parting with money, in the same way you'd test drive a car before committing to purchase.
I haven't always test driven cars; no need to. Some enjoy doing that as part of their "exciting purchase", but I know when a car is going to be good and is what I need. The car I've owned for the longest time was bought that way :).
 
Associate
Joined
27 Jul 2015
Posts
1,470
"Depends. I would have thought testing and MTBF included the cold starts but maybe not"

Unfortunately the laws of physics can't be changed, and silicon is what it is. There are numerous scientific papers about heat and the problems it causes, if you can be bothered to look.
 
Caporegime
Joined
17 Feb 2006
Posts
29,263
Location
Cornwall
"Depends. I would have thought testing and MTBF included the cold starts but maybe not"

Unfortunately the laws of physics can't be changed, and silicon is what it is. There are numerous scientific papers about heat and the problems it causes, if you can be bothered to look.
And yet many of us have had cards that were used for daily gaming for 3+ years without falling apart. Those cards would have been power cycled at least once a day, and endured many more "thermal shocks", as you call them.

The idea that using your gaming GPU for gaming is the absolute worst thing you can do is, well, laughable.

If using a gaming GPU for gaming made it fail within a couple of years, there would be countless returns under EU law for a start. Yet mining GPUs are failing within those sorts of timescales: fan bearings first, memory artefacts second.
 
Associate
Joined
22 Jul 2018
Posts
157
Interesting comment from a different forum, from a bitcoin/virtual-currency miner who claims the market is now so bad that they are losing money month on month, and that unless it picks up soon there's going to be an awful lot of high-end graphics cards dumped on the second-hand market. I had seen an article somewhere about these people wanting to return the cards to the sellers to get their money back; it did say how they did this, but I don't remember how.

I don't see how it can ever be resurrected to its former glory; the epic rise and global scale of Bitcoin mining has brought about its own downfall. As for miners finding a loophole on returns, it seems despicable, ludicrous and yet plausible. All those cheap 1080/1080 Tis being dumped on the second-hand market will surely affect the success of the 2080. Nvidia need bringing down a peg or two.
 
Soldato
Joined
26 May 2009
Posts
22,101
An article that has no benchmarks or substantial indicators of the product's performance enticed you into splurging on a 2080Ti? Can I ask which parts of the article swung it for you?
It wasn't really specific parts, it was the overall point/narrative. I'll try and explain it in simple (yet long-winded) terms.

Back in the day (TM), computers had single-colour screens, green on black (I'm going somewhere with this). Then came the era of Colour Graphics Adapter (CGA) video cards, with four different colours on the screen at once, then the era of the Enhanced Graphics Adapter (EGA), with 16 colours. The improvement from CGA to EGA could be directly compared with going from an Atari 2600 to a NES, and this is where I started, with an 8086 computer and an EGA display. Next came Video Graphics Array (VGA) cards, which could display 256 colours at once (comparable to the NES to SNES transition); however, my father wasn't going to drop a grand on a new video card just to make my games look cooler when WordPerfect and SuperCalc (the DOS programs Word and Excel are ripped off from) would look the same.

The next big jump was 16-bit colour (even more colours again), then 3dfx invented 3D accelerators, then SLI, which allowed games to be played at 800x600 and the mentally high 1024x768. Finally, the last big advancement was hardware transform, clipping and lighting (T&L), which was responsible for the rise of Nvidia and the fall of 3dfx. That was the last major development in GPU technology; there hasn't been anything big since, just improvements on what we have.

And this is the crux of it: every time one of those landmark developments hit the market, I couldn't afford it. I had to wait a year or two until the tech trickled down to the cheaper video cards, and I'd always wish I could get in on it sooner but couldn't. This time, however, I can afford it, and gimping my own experience for a year or two just to save money isn't how I want to go with my hobby, so I'm getting a 2080 Ti. Because as cheesy as it sounds, THG are right: when my life flashes before my eyes, I want as much of it to have ray tracing as possible xD
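
For anyone who wants the colour counts in that history as arithmetic, a quick illustrative table (the adapter-to-bit-depth mapping is standard; the snippet itself is just for illustration):

```python
# Simultaneous on-screen colours are 2 raised to the bit depth of the mode.
adapters = [
    ("Mono (text)", 1),      # green on black
    ("CGA (per mode)", 2),   # 4 colours at once
    ("EGA", 4),              # 16 colours
    ("VGA (mode 13h)", 8),   # 256 colours
    ("High colour", 16),     # 65,536 colours
]
for name, bits in adapters:
    print(f"{name:>15}: {bits:>2}-bit -> {2 ** bits:,} colours on screen")
```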
 
Associate
Joined
2 Nov 2003
Posts
253
No one wanted landmark developments - RT or cheaper AA at 4K (not needed, especially with a TV display) - just better FPS/watt and FPS/£.

Now everyone's a GPU expert - well, the 1,000 people worldwide buying the 2080 to get their 15 minutes of fame at the top of the 3DMark rankings.

The rest of us are the 90% who bought 1060s, or before that a 970, for 1080p, and will go for a second-hand 1080 Ti for £450, with 11GB of VRAM and 2080 performance at 4K.
 
Soldato
Joined
19 Oct 2008
Posts
5,952
To be honest, I'd like to think used 1080 Tis will be under £450 soon when the 20-series cards arrive, especially the early cards (now nearly 18 months old). I don't complain about new GPU prices, at least for reference/FE cards, but I usually find used GPU prices too high. While I appreciate that during the mining boom 1080 Tis were hard to find and over £800 at times, the demand-stripped price was really circa £680-700, so used cards should be priced based on that.
On the CPU front, 6700Ks seem to be worth a little over £100 now, down from £350 when new not long ago.
 
Caporegime
Joined
17 Feb 2006
Posts
29,263
Location
Cornwall
This time, however, I can afford it, and gimping my own experience for a year or two just to save money isn't how I want to go with my hobby, so I'm getting a 2080 Ti. Because as cheesy as it sounds, THG are right: when my life flashes before my eyes, I want as much of it to have ray tracing as possible xD
Hope you don't enjoy playing at >1080p or >60 FPS, then.

Seems you'll be "gimping" yourself either way.
 
Caporegime
OP
Joined
24 Dec 2005
Posts
40,065
Location
Autonomy
It wasn't really specific parts, it was the overall point/narrative. [...] This time, however, I can afford it, and gimping my own experience for a year or two just to save money isn't how I want to go with my hobby, so I'm getting a 2080 Ti. Because as cheesy as it sounds, THG are right: when my life flashes before my eyes, I want as much of it to have ray tracing as possible xD


But you don't know how the 2080 Ti performs? :p
 
Soldato
Joined
23 May 2006
Posts
6,949
It wasn't really specific parts, it was the overall point/narrative. [...] This time, however, I can afford it, and gimping my own experience for a year or two just to save money isn't how I want to go with my hobby, so I'm getting a 2080 Ti. Because as cheesy as it sounds, THG are right: when my life flashes before my eyes, I want as much of it to have ray tracing as possible xD
This is a good post and kind of a logical perspective... and IF the cards can do ray tracing properly without butchering the rest of the experience, I will probably join you.
My worry, however, is that from what we have seen so far this gen, this won't be the case.
We all know about the possible performance issues, but how many folk out there haven't followed the reveal closely, have ordered after seeing just a bit of marketing fluff, and are - quite justifiably - expecting a slightly better 4K experience than the 10-series, but with ray tracing? It is not looking likely right now.
 
Associate
Joined
15 Oct 2014
Posts
746
Location
Somerset England
This time, however, I can afford it, and gimping my own experience for a year or two just to save money isn't how I want to go with my hobby, so I'm getting a 2080 Ti. Because as cheesy as it sounds, THG are right: when my life flashes before my eyes, I want as much of it to have ray tracing as possible xD
You do realise it's looking like the 2080 Ti can barely hold 60fps at 1080p with ray tracing on? I'd call that gimping your experience...
 