
NVIDIA ‘Ampere’ 8nm Graphics Cards

The problem as I see it is that, by bringing RT functionality to market so early, and with such a heavy penalty to gamers in both cost and performance, Nvidia could actually turn consumers off the idea of RT altogether.

It was so bad that Nvidia also had to introduce DLSS to mitigate the impact that RT had on framerates.

Hopefully the second coming will be much improved from both Nvidia and AMD.
That truly is the worst-case scenario, isn't it? Similar to your sentiment, I believe the 2080 Ti under the RTX branding makes RTX itself look bad, due to those RT performance penalties and the inclusion of DLSS 1.0 (which initially made the RTX branding look much worse). Did you know some people try to downplay DLSS 2.0 as if there wasn't a revision, i.e. to make it look like it was just the original DLSS all along?
lol

I'm sure it will be somewhat "fixed" in the 3000 series, but let's not fool ourselves. Games will still be primarily rasterized, with elements of RT added in. Some games will have one or two RT effects, others could have three or four, or any combination.
 
Looks like Ampere is getting a Colossal spanking in the AI space.

Something's not quite right with that article.

As for computing output, the MK200 delivers 250 TFLOPs of peak FP16 (with Sparsity) and 62.5 TFLOPs (with Sparsity) of peak FP32 performance. The NVIDIA A100 GPU delivers a total of 312 TFLOPs of FP16 (624 TFLOPs with Sparsity) and 19.5 TFLOPs FP32 (156 TFLOPs with Sparsity).

The numbers given there don't tie up with the numbers given at the end of the article.
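For what it's worth, here's a rough sanity check of the A100 figures quoted above (a quick Python sketch, assuming NVIDIA's 2:4 structured sparsity, which doubles peak tensor-core throughput, and the A100's published spec sheet):

# Sanity check of the quoted A100 numbers. Assumption: NVIDIA's 2:4
# structured sparsity doubles peak tensor-core throughput, so each
# "with Sparsity" figure should be exactly 2x its dense counterpart.
a100 = {
    # format: (dense TFLOPs, "with Sparsity" TFLOPs as quoted)
    "FP16 tensor": (312, 624),
    "FP32": (19.5, 156),
}
for fmt, (dense, sparse) in a100.items():
    ratio = sparse / dense
    verdict = "consistent" if ratio == 2 else "inconsistent (should be 2x)"
    print(f"{fmt}: {sparse}/{dense} = {ratio:g}x -> {verdict}")
# Prints:
#   FP16 tensor: 624/312 = 2x -> consistent
#   FP32: 156/19.5 = 8x -> inconsistent (should be 2x)

156 TFLOPs is the A100's dense TF32 tensor-core rate, so my guess is the article mixed the TF32 tensor numbers into the FP32 row.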

Either way round, it's great to see other companies bringing compute GPUs to the market.
 
Why on earth would Nvidia expect people to use a new power connector purely for their GPU? Come on people, wake up.

Because sooner or later it's likely there will be more varieties of PCIe power connector. If they did launch it with a different power connector, they could bundle an adaptor, so it's not like it's some mad decision that's impossible to overcome.
 
If you're using a TV as a PC monitor, I'd only consider OLED if you can get some decent burn-in insurance with it. Personally, I think a QLED is a better bet for PC use.
 
If you're using a TV as a PC monitor, I'd only consider OLED if you can get some decent burn-in insurance with it. Personally, I think a QLED is a better bet for PC use.
Unless you play the same game over and over and over again, there's nothing to worry about.
 
Said many a soon-to-be-heartbroken OLED owner.

Can you clarify "many"? How many is that compared to all owners? What percentage?

Every manufacturing process has faults and tolerances, which inherently means some panels (no matter the technology) will be at higher risk. The question is: is it 10% of all panels, or 0.0001%?
 
Said many a soon-to-be-heartbroken OLED owner.
To be honest, I would rather deal with burn-in issues than the lottery there is with TN, VA, and IPS panels in regard to backlight/glow problems, because burn-in is something you can actively mitigate through clever setup, whereas backlight and glow issues are just there from the start.
 
To be honest, I would rather deal with burn-in issues than the lottery there is with TN, VA, and IPS panels in regard to backlight/glow problems, because burn-in is something you can actively mitigate through clever setup, whereas backlight and glow issues are just there from the start.
+1
 
I have burn-in on my TV because a young family member left my console on when I wasn't here, and let me tell you, it's bloody irritating to see; you always look at it no matter how hard you try not to. Thankfully it's an old TV and nothing fancy, but yeah, that's my experience with burn-in.
 
I have burn-in on my TV because a young family member left my console on when I wasn't here, and let me tell you, it's bloody irritating to see; you always look at it no matter how hard you try not to. Thankfully it's an old TV and nothing fancy, but yeah, that's my experience with burn-in.

How long was the console on to cause that? I would have thought the auto power-off on the TV would have prevented it.
 
it's an old TV


I've seen bad burn-in on my gf's old plasma TV, but you only notice it up close, not from normal viewing distance. Modern OLEDs have built-in anti-burn-in features, which means I would have no problem buying an OLED TV or monitor. You could even write some software if you were really worried: every n minutes, make the display black for half a second (rough sketch at the end of this post).

The key word is "old": old TVs have this problem, not ones made in the last few years.
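A minimal sketch of that blanking idea in Python with tkinter; the 10-minute interval and half-second blank are arbitrary values I've picked, and whether a brief black frame actually helps a given panel is an assumption on my part:

import tkinter as tk

# Assumed values; tune to taste.
INTERVAL_MIN = 10  # minutes between blanking cycles
BLANK_MS = 500     # how long the screen stays black each cycle

root = tk.Tk()
root.withdraw()  # keep the window hidden between cycles

def blank():
    # Cover the display with a borderless, always-on-top black window.
    root.configure(bg="black")
    root.attributes("-fullscreen", True)
    root.attributes("-topmost", True)
    root.deiconify()
    root.after(BLANK_MS, unblank)

def unblank():
    root.withdraw()
    root.after(INTERVAL_MIN * 60 * 1000, blank)

root.after(INTERVAL_MIN * 60 * 1000, blank)
root.mainloop()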
 
How long was the console on for to cause that, I would have thought the auto power off on the TV would have prevented that?

No idea, mate. I returned a week later to see it on the screen and was gutted. When I left it was perfect; I came back to find the triangle symbols from Assassin's Creed Syndicate burned in.

this one:
[attached image showing the burn-in]
 