
NVIDIA ‘Ampere’ 8nm Graphics Cards

I've seen bad burn-in on my gf's old plasma TV, but you only notice it up close, not from normal sitting/watching distance. Modern OLEDs have built-in anti-burn-in features, which means I would have no problem buying an OLED TV or monitor. You could even write some software if you were really worried: every n minutes, make the display black for 0.5 seconds.

The key word is 'old': old TVs have this, not ones made in the last few years.
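If anyone actually wants to try the "black screen every n minutes" idea, here's a rough sketch using Python's built-in tkinter. The 10-minute interval and 0.5-second blank are just example numbers, and it pops up a borderless fullscreen black window rather than genuinely switching the panel off:

    import tkinter as tk

    BLANK_MS = 500                  # show black for 0.5 seconds
    INTERVAL_MS = 10 * 60 * 1000    # every 10 minutes (the "n" above)

    root = tk.Tk()
    root.withdraw()                 # keep the main window hidden

    def blank_screen():
        overlay = tk.Toplevel(root)
        overlay.configure(bg="black")
        overlay.attributes("-fullscreen", True)
        overlay.attributes("-topmost", True)
        overlay.after(BLANK_MS, overlay.destroy)   # tear the overlay down after 0.5 s
        root.after(INTERVAL_MS, blank_screen)      # schedule the next blank

    root.after(INTERVAL_MS, blank_screen)
    root.mainloop()

Whether half a second of black does the panel any real good is debatable, but it's a direct implementation of the suggestion above.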

Wallpaper Engine from Steam is what I use, with the taskbar set to auto-hide.
 
Why on earth would Nvidia expect people to adopt a new power connector purely for their GPU? C'mon people, wake up.

Having a single new connector is better than breaking spec on multiple connectors, which AMD cards have done several times in the past.

350W card or not, if the performance per watt is there I'll be impressed on a tech level, but then I'll still be worried about how they cool it.

Saying that, I've water-cooled my cards with full-cover blocks or AIOs since the GTX 480.
 
I've seen bad burn-in on my gf's old plasma TV, but you only notice it up close, not from normal sitting/watching distance. Modern OLEDs have built-in anti-burn-in features, which means I would have no problem buying an OLED TV or monitor. You could even write some software if you were really worried: every n minutes, make the display black for 0.5 seconds.

The key word is 'old': old TVs have this, not ones made in the last few years.

Yeah, no doubt there have probably been improvements. As folk said they would rather deal with it, I was just giving the perspective of someone who is living with it. I think my TV is LCD; it's definitely not plasma.
 
What? What did I miss? We all need to upgrade our PSUs? :) Never gonna happen. Feel free to quote me on that.

However, I will need an SFX PSU to make room for a 2.5-slot cooler in my tiny case.
 
There's no way I'm forking out for a new PSU for Ampere; that's typical Nvidia, screwing us for more money.

Yeah, there surely would be no need when you have 2x 8-pin plus 75W from the PCIe slot; if you have a card that needs more than that, well, you must have screwed up. I intend to keep my AX860i as I paid £136 for it and it's worth even more now, and if they screwed me over it would be like Intel with their motherboard vendors and new sockets.


But how would Nvidia profit? Surely any such arrangement would be illegal. I know Nvidia and AMD (aka Lisa and Uncle Jensen) roughly collaborate, but your claim might be a bit too far lol.
 
Without knowing what the performance of the new cards will be like and how much they're going to cost, I don't think anyone can say they won't buy a new PSU. £150-ish on a new PSU might seem very worthwhile in a few months. The biggest concern I would have is that everyone can get hold of a new GPU but there will be a massive shortage of PSUs!
 
It would make some sense if it came in at the same time as ATX 12VO and the associated motherboards were common, but that's not the situation we have.

Yes, my first thought was ATX12VO too, but with only one motherboard currently supporting ATX12VO I find it highly unlikely. ATX12VO is a future standard; I have no idea how it will roll out over the next few years.
 
Without knowing what the performance of the new cards will be like and how much they're going to cost, I don't think anyone can say they won't buy a new PSU. £150-ish on a new PSU might seem very worthwhile in a few months. The biggest concern I would have is that everyone can get hold of a new GPU but there will be a massive shortage of PSUs!

Top power draw is 375W: 2x 150W from the PCIe cables and 75W through the slot. Anyone with a desktop system, a 650W PSU and no overclocking is going to cope with that handily. And I very much doubt that the power draw of a reference card will be anywhere near 375W. The official RTX 2080 Ti draw was 250W, with a measured peak of 279W.
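To make the arithmetic explicit, here's the back-of-envelope budget in Python; the 200W allowance for the rest of the system is a made-up figure, not something from the post:

    # Worst-case draw for a 2x 8-pin card (numbers from the post above)
    pcie_8pin_w = 150            # per 8-pin PCIe cable
    slot_w = 75                  # through the PCIe slot itself
    max_gpu_w = 2 * pcie_8pin_w + slot_w
    print(max_gpu_w)             # 375

    # Hypothetical non-overclocked desktop on a 650W PSU
    psu_w = 650
    rest_of_system_w = 200       # assumed allowance for CPU, drives and fans
    print(psu_w - rest_of_system_w >= max_gpu_w)   # True: still has headroom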

Now, if you have a HEDT system (Threadripper et al.) then you likely already have a beefier PSU anyway.

People who worry about power are worrying unnecessarily. Someone is just stirring up trouble.
 
People who worry about power are worrying unnecessarily.

I used to think that too, until I dusted off my old GTX 560 for some FAH work. It was pulling so much power for so little PPD... it was a real eye-opener.

If my 1080Ti had its current performance with the efficiency of that old card, I would have to hire an electrical contractor to install some sort of industrial electrical power in my computer room, as well as dedicated air conditioning.

Consumer processing power has a limited envelope in which it can operate. I have a 15A circuit for the room. Most of us never get anywhere near that because manufacturers keep increasing efficiency and performance *together*. But if they ignored the efficiency side of the equation and just kept pulling more and more power, we would hit a limit in consumer households at some point.
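To put a rough number on that ceiling (assuming the 15A circuit is standard 120V North American wiring, which the post doesn't actually state):

    # Rough usable power from a single 15 A room circuit (assumed 120 V)
    volts = 120
    amps = 15
    circuit_w = volts * amps           # 1800 W absolute limit
    continuous_w = circuit_w * 0.8     # ~1440 W under the usual 80% continuous-load rule of thumb
    print(circuit_w, continuous_w)

A whole gaming PC plus monitor sits well under that today, which is the point: efficiency gains are what keep it that way.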

For instance, I don't think current high-end performance with old Fermi efficiency would work with power from an average wall outlet today.

We might be able to ignore efficiency for a generation or two, depending on how efficient we are when we start ignoring it, but eventually we would hit a wall.

*Edit* I just looked it up on TPU, and the 1080Ti is listed as 644% of the GTX 560's performance!
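Rough perf-per-watt maths on that figure, using the commonly quoted board TDPs (150W for the GTX 560, 250W for the 1080 Ti) rather than measured draw:

    relative_perf = 6.44       # 644% from the TPU relative-performance listing
    gtx560_tdp_w = 150         # commonly quoted TDP, not measured draw
    gtx1080ti_tdp_w = 250

    perf_per_watt_gain = relative_perf * gtx560_tdp_w / gtx1080ti_tdp_w
    print(round(perf_per_watt_gain, 2))    # ~3.86x more work per watt

    # Same performance at GTX 560-era efficiency would need roughly:
    print(relative_perf * gtx560_tdp_w)    # ~966 W for the GPU alone

Nearly a kilowatt for the card by itself, before the CPU, which is the "industrial electrical power" scenario above.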
 
I've seen bad burn-in on my gf's old plasma TV, but you only notice it up close, not from normal sitting/watching distance. Modern OLEDs have built-in anti-burn-in features, which means I would have no problem buying an OLED TV or monitor. You could even write some software if you were really worried: every n minutes, make the display black for 0.5 seconds.

The key word is 'old': old TVs have this, not ones made in the last few years.
Is 'old' really the key word though? If it happened in the past it could happen again. OLED has this weakness (apparently), and to discount the possibility of it occurring seems naive at best. Unless you're super rigorous about supervising your TV, making sure you're always there to stop 'misuse', it could potentially happen. Not saying it will, but it could, and they really aren't cheap!
 
Unless you play the same game over and over and over again, there's nothing to worry about.
I've over 6,000 hours on one game; I think that qualifies as over and over and over ad nauseam, but I'd still be furious if my TV/monitor got burn-in as a result. Many people have a go-to game they spend way more time on than all the rest.
 
Is 'old' really the key word though? If it happened in the past it could happen again. OLED has this weakness (apparently), and to discount the possibility of it occurring seems naive at best. Unless you're super rigorous about supervising your TV, making sure you're always there to stop 'misuse', it could potentially happen. Not saying it will, but it could, and they really aren't cheap!
All OLEDs have the same burn-in weakness potential, but over the years firmware has improved to include better techniques to avoid getting it in normal use. You could definitely damage it if you wanted to; in normal use, nope. Where are the threads showing evidence of said damage, with makes and models? What year were they manufactured? Of all the things to currently worry about, OLED burn-in isn't one of them :)
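For context, one of the techniques being referred to is pixel orbiting (LG brands it "Screen Shift"): the panel quietly nudges the whole image by a pixel or two on a slow cycle so static elements never sit on exactly the same subpixels for hours. A conceptual sketch of the idea only; real implementations live in the panel firmware, not in user software:

    import itertools
    import time

    # Slow orbit pattern: whole-frame offsets in pixels
    ORBIT = [(0, 0), (1, 0), (1, 1), (0, 1), (-1, 1), (-1, 0), (-1, -1), (0, -1), (1, -1)]

    def run_orbit(apply_offset, period_s=180):
        # Every few minutes, shift the displayed frame by a tiny offset.
        # apply_offset is a stand-in for whatever the display pipeline
        # exposes for panning the frame.
        for dx, dy in itertools.cycle(ORBIT):
            apply_offset(dx, dy)
            time.sleep(period_s)

    # Example: just log the shifts instead of moving anything
    run_orbit(lambda dx, dy: print("shift frame by", dx, dy), period_s=1)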
 
For all those that don't know, John Lewis offer burn-in protection for £140 for 5-year cover.
Well, that's just 'great'. The manufacturer should offer that, NOT the retailer, and TVs should last way longer than 5 years unless we want to fill every landfill on the planet with old TVs. That's 15 TVs per person per lifetime in landfill across the developed world, which isn't ideal given how rapidly countries are developing. Greta wouldn't approve! ;) But thanks for the heads up :)
 