It isn't satire, though; his additions since have pretty much confirmed that. He went mad with the money Nvidia are probably paying him.
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
Wow, just... Wow! Surely you're able to recognise an extremely poor article? Go watch Gamers Nexus' response video and let's get some sense into you.
Already seen it; as mentioned before, I turned it off halfway through because it was too cringy. Not sure if Steve genuinely doesn't understand it (may be an age thing) or if he's playing up his character like Linus/Jay do, because he knows most of his viewers don't understand it and it's what they want to hear. Here, I made it easier for you.
Avram Piltch later responded to critics, saying that the article was written that way deliberately in order to spark discussion about the new GPUs. He also said:

"My point is that, if RTX cards deliver even half of what they promise, they are worth the current prices (provided you can afford to pay them). Whether you want to buy one now (or at launch), I would not wait for some hypothetical future day when prices may drop."
Interesting. I'm already nervous about second-hand cards, and that only deepens my concerns.
Genuinely wasn't planning to get one of these cards due to how overpriced they are; I was going to stick with my 980 Ti until it dies or can't hold the FPS I like any more. However, after reading an article on it by the editor of Tom's Hardware, I've decided to splurge on a 2080 Ti on payday.
"My thinking is that the thermal shock silicon suffers when it is powered up from cold is potentially far more damaging than running it 24/7 at a constant temperature. I can't find the figures atm but the thermal increase on power up is quite startling."

Depends. I would have thought testing and MTBF included the cold starts, but maybe not 24/7 at constant load. So while cold starts may be damaging, the components are probably designed and built with that consideration in mind. Gaming GPUs are not usually run 24/7 for gaming. Most likely, if a card can withstand the cold starts it should withstand a constant load, but you're still edging towards that MTBF. GPUs used for mining usually have the memory overclocked too, and some could be pushing things a bit far without necessarily realising it, doing damage all the same.
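For anyone who wants the rough maths: the usual engineering rule of thumb for thermal cycling is the Coffin-Manson relation, where cycles-to-failure scales as a power of the temperature swing. A minimal sketch below, assuming an illustrative exponent and made-up temperature swings (not measured figures for any real GPU):

```python
# Rough sketch of why repeated cold starts can age a part faster than steady
# load. The Coffin-Manson relation models cycles-to-failure for a solder
# joint as N_f proportional to (delta_T)^-q. The exponent q and the
# temperature swings below are illustrative assumptions only.

def relative_cycle_life(delta_t_small: float, delta_t_large: float, q: float = 2.5) -> float:
    """Ratio of cycles survived at the smaller swing vs the larger one."""
    return (delta_t_large / delta_t_small) ** q

# A card power-cycled from cold might see ~50 degC swings each start; a card
# mining 24/7 at a steady temperature might only see ~10 degC load wobble.
print(relative_cycle_life(10, 50))  # ~55x more cycles survived at the small swing
```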
"I'm all for new stuff, but I generally like to know what I'm buying before I start parting with money, in the same way you'd test drive a car before committing to a purchase."

I haven't always test driven cars. No need to. Some enjoy doing that as part of their "exciting purchase". I know when a car is going to be good and is what I need, so there's no need for me to do that. The car I've owned for the longest time was bought that way.
"Depends. I would have thought testing and MTBF included the cold starts but maybe not"

And yet many of us have had cards that were used for daily gaming for 3+ years without falling apart. Those cards would have been power cycled at least once a day, and endured many more "thermal shocks", as you call them.
Unfortunately the laws of physics can't be changed, and silicon is what it is. There are numerous scientific papers about heat and the problems it causes, if you can be bothered to look them up.
Interesting comment from a different forum, from a bitcoin/virtual currency miner, who claims the market is now so bad that they are losing money month on month, and that unless it picks up soon there's going to be an awful lot of high-end graphics cards dumped on the second-hand market. I had seen an article somewhere about these miners wanting to return the cards to the sellers to get their money back; it did explain how they were doing this, but I don't remember the details.
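To see why the economics flip like that, here's a back-of-the-envelope sketch; every number in it is a placeholder assumption, not a real mining figure from 2018:

```python
# Back-of-the-envelope check on why a miner could be losing money month on
# month. All numbers below are made-up placeholders, not real figures.

watts = 180          # card power draw under mining load (assumed)
kwh_price = 0.15     # electricity price in pounds per kWh (assumed)
daily_revenue = 0.60 # coin mined per card per day, converted to pounds (assumed)

daily_cost = watts / 1000 * 24 * kwh_price   # 0.18 kW * 24 h * 0.15 = 0.648
monthly_net = (daily_revenue - daily_cost) * 30

print(f"daily cost: {daily_cost:.2f}, monthly net: {monthly_net:.2f}")
# With these placeholder numbers the card loses about 1.44 a month even
# before hardware depreciation, hence cards getting dumped second-hand.
```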
An article that has no benchmarks or substantial indicators of the product's performance enticed you into splurging on a 2080 Ti? Can I ask which parts of the article swung it for you?

It wasn't really parts, it was the overall point/narrative. I'll try to explain it in simple (yet long-winded) terms.

Back in the day (TM), computers had single-colour screens, green on black (I'm going somewhere with this). Then came the era of Colour Graphics Adapter (CGA) video cards, with four colours on screen at once, and after that the Enhanced Graphics Adapter (EGA), with 16 colours. The jump from CGA to EGA was directly comparable to going from an Atari 2600 to a NES, and this is where I started, with an 8086 computer and an EGA display. Next came Video Graphics Array (VGA) cards, which could display 256 colours at once (comparable to the NES-to-SNES transition); however, my father wasn't going to drop a grand on a new video card just to make my games look cooler when WordPerfect and SuperCalc (the DOS programs Word and Excel are ripped off from) would look the same. The next big jump was 16-bit colour (even more colours again), then 3DFX invented 3D accelerators, then SLI, which allowed games to be played at 800x600 and the mentally high 1024x768. Finally, the last big advancement was hardware Transform, Clipping, and Lighting (T&L), which was responsible for the rise of Nvidia and the fall of 3DFX. That was the last major development in GPU technology; there hasn't been anything big since, just improvements on what we have.

And this is the crux of it: every time one of those landmark developments hit the market, I couldn't afford it. I had to wait a year or two until the tech trickled down to the cheaper video cards, always wishing I could get in on it sooner. This time, however, I can afford it. Gimping my own experience for a year or two just to save money isn't how I want to go with my hobby, so I'm getting a 2080 Ti, because as cheesy as it sounds, THG are right: when my life flashes before my eyes, I want as much of it to have ray tracing as possible xD

Hope you don't enjoy playing >1080p or >60 FPS, then.
This is a good post and kind of a logical perspective... and IF the cards can do ray tracing properly without butchering the rest of the experience, I will probably join you.
You do realise it's looking like the 2080 Ti can barely keep up 60 fps at 1080p with ray tracing on? I'd call that gimping your experience...