
Nvidia Fire Shots At AMD’s 7nm Graphics Technology

Man of Honour
Joined
21 May 2012
Posts
31,940
Location
Dalek flagship
I love this excuse: "but you can also do compute on it". Who the **** cares? Like 8 out of 10 people buying those cards, Titans or Vega VIIs, USE THEM FOR GAMING.
And even with the space wasted on RT cores and stuff, the 2080 on 12nm is still a better GAMING card than the VII and cheaper to produce.

I think AMD is happy that NV went with those RT cores. Imagine if they had just used that space for more gaming stuff; it's like 40-50% of the chip. If they hadn't gone with that and had a full-on, RT-free gaming card at 700 quid, AMD could close the GPU division down, as they would have nothing interesting for GAMERS.

16GB of VRAM at 2160p is better than 8GB of VRAM for gaming.

Purely for gaming at 4K, I would rather use a VII than a 2080.
 
Permabanned
Joined
15 Oct 2011
Posts
6,311
Location
Nottingham Carlton
16GB of VRAM at 2160p is better than 8GB of VRAM for gaming.

Purely for gaming at 4K, I would rather use a VII than a 2080.
At 4K yes, at 1440p hmm, nope.
For now my 12GB Titan does decently at 4K... or I should say 1800p. Not enough power for full 4K :(
The 2080 and VII can't really do 4K gaming at 60fps maxed, heh.


I've put like 110-120 hours into Division 2 and all I can say is... those are best-case numbers. I've seen drops to 28fps on 4K ultra.



I need something with 2080 Ti power at 650-700 quid and I'll buy :)
 
Soldato
Joined
22 Apr 2016
Posts
3,432
No, it's really a compute card that can game. It was designed as a compute card first and foremost. AMD have for years been designing one-size-fits-all architectures to save on R&D, or that's what it looks like from my end.

Some believe that something happened to Navi and that the Vega VII was never even meant to come into the gaming space.
I get that, but let's be honest: the context in which the VII is primarily discussed on this forum is as a gaming card.
 
Soldato
Joined
8 Jun 2018
Posts
2,827
I have to really wonder about something. You know... one of those passing thoughts.
If it turns out to be true that next-gen consoles can do 4K HDR at a solid 60fps using an 8-core/16-thread CPU and Navi, where will that leave the rest of us?
It's a non-issue if you, as a PC gamer, consider yourself in the mid-range.
But what will it become if you are an enthusiast-level PC gamer and can't do 4K HDR at 60fps without lowering IQ to compensate?
Would it sit just fine with you to find out, after the next-gen consoles are released, that your enthusiast-level PC is now relegated to just 'high end' (which really isn't a thing, btw, but it's better than saying you are a mid-range PC gamer)?

Oh well, I guess one doesn't worry about it until it's actually confirmed.
 

TNA
Caporegime
Joined
13 Mar 2008
Posts
27,585
Location
Greater London
I have to really wonder about something. You know... one of those passing thoughts.
If it turns out to be true that next-gen consoles can do 4K HDR at a solid 60fps using an 8-core/16-thread CPU and Navi, where will that leave the rest of us?
It's a non-issue if you, as a PC gamer, consider yourself in the mid-range.
But what will it become if you are an enthusiast-level PC gamer and can't do 4K HDR at 60fps without lowering IQ to compensate?

Oh well, I guess one doesn't worry about it until it's actually confirmed.
Easy, move to console for 4K gaming on a proper OLED screen and stick to 1440p on the PC for PC exclusives. That is likely what I will be doing.
 
Soldato
Joined
6 Feb 2019
Posts
17,596
3dcenter.org has a graph showing GPU performance-per-watt efficiency.

It shows that the 2080 Ti is the most power-efficient card at 4K (the Titan RTX was not tested).

This is abnormal; normally, at the extreme high end, GPUs lose efficiency within the same architecture compared to slightly lower models.
For example, the Vega 64 is vastly inferior in efficiency to the Vega 56, and the Radeon VII is not much better.

This suggests to me that Turing is still below its peak efficiency curve, and that's on 12nm.
Nvidia could have made Turing cards even faster than they are now without moving to 7nm; how much faster is hard to say, if they really wanted to push it.

7nm is going to be very beneficial to Nvidia when they do move to it.

Easy, move to console for 4K gaming on a proper OLED screen and stick to 1440p on the PC for PC exclusives. That is likely what I will be doing.

I'm using my 1440p 144Hz screen only for first-person shooters. Everything else I'm playing on a 55-inch 4K HDR TV, running from either my PC or PS4 Pro.
 
Soldato
Joined
6 Feb 2019
Posts
17,596
Easy, move to console for 4K gaming on a proper OLED screen and stick to 1440p on the PC for PC exclusives. That is likely what I will be doing.
50% at each tier? I would like some of what you have been smoking. If you said 50% price hikes at each tier you would be more believable.

Look how much the Radeon VII gained over the Vega 64 just by moving to 7nm.
Turing is even more efficient and will eat up all those lovely power savings by pushing higher clocks.

Want to know how efficient Turing is?

Power limit the 2080 Ti to only 160W and it will still perform the same as a 1080 Ti overclocked to 280W.

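To put rough numbers on that claim: if a power-limited 2080 Ti really matches an overclocked 1080 Ti's frame rate at roughly 160W vs 280W, the efficiency gap follows straight from performance per watt. A minimal sketch, using the wattage figures quoted above; the 60fps value is purely a hypothetical placeholder, since the ratio only depends on the power draws:

```python
# Rough perf-per-watt comparison, assuming equal performance as claimed above.
fps = 60.0                  # hypothetical identical frame rate for both cards (placeholder)
power_2080ti_limited = 160  # watts, power-limited 2080 Ti (figure from the post)
power_1080ti_oc = 280       # watts, overclocked 1080 Ti (figure from the post)

eff_2080ti = fps / power_2080ti_limited  # frames per second per watt
eff_1080ti = fps / power_1080ti_oc

print(f"2080 Ti: {eff_2080ti:.3f} fps/W")
print(f"1080 Ti: {eff_1080ti:.3f} fps/W")
print(f"Efficiency ratio: {eff_2080ti / eff_1080ti:.2f}x")  # 280/160 = 1.75x
```

On those assumptions the power-limited Turing card comes out roughly 1.75x more efficient, which is the point being made here.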
 
Soldato
Joined
22 Nov 2009
Posts
13,252
Location
Under the hot sun.
Honestly, I wish I had a reason to care about power efficiency. Even looking to convert to an SFF build, the gains seem mostly useless. The only other scenario where I can see some usefulness is off-grid setups where you rely on your own energy storage, but even then it's kinda meh.

I understand these efficiency gains are awesome for businesses and large-scale clients, but as an individual? A lot of hype for nothing.

The efficiency-gains argument is a complete piece of marketing crap.
When the 290X was released, many argued the 780 Ti was more efficient and would pay itself back through electricity bills in a couple of years.
When I ran the proper maths for them, they would still need to keep their 780 Ti for another 20 years, or 30 years in the case of the Titan, to get back the extra money paid over the 290X. That's assuming the card works 8 hours per day, 365 days per year, at 100% load. Anything less, like 3 hours at full load per day, takes 90+ years to pay the price difference back through efficiency.

And in the case of the RTX cards, the gap is even bigger and amounts to centuries... hahaha.
So no GPU can pay back the price difference through efficiency (power) gains in its lifetime. NEVER EVER.

Anyone who believes the opposite is ignorant of simple maths.
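For anyone who wants to sanity-check that, the payback period is just the price premium divided by the yearly electricity saving. A minimal sketch of the maths; the price premium, power gap and tariff below are illustrative placeholders rather than real card or energy prices:

```python
# Payback time for a pricier but more power-efficient GPU (illustrative numbers only).
price_premium_gbp = 150.0   # extra cost of the "efficient" card (placeholder)
power_saving_watts = 60.0   # power draw difference under load (placeholder)
hours_per_day = 3.0         # daily hours at full load
tariff_gbp_per_kwh = 0.15   # electricity price (placeholder)

kwh_saved_per_year = power_saving_watts / 1000 * hours_per_day * 365
saving_per_year_gbp = kwh_saved_per_year * tariff_gbp_per_kwh
payback_years = price_premium_gbp / saving_per_year_gbp

print(f"Yearly saving: £{saving_per_year_gbp:.2f}")   # roughly £10/year here
print(f"Payback period: {payback_years:.1f} years")   # roughly 15 years here
```

With figures in that ballpark the payback runs well past a GPU's useful life, which is the poster's point: only very large price and power gaps, or constant full load, change the conclusion much.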
 
Soldato
Joined
28 Oct 2011
Posts
8,405
The efficiency-gains argument is a complete piece of marketing crap.
When the 290X was released, many argued the 780 Ti was more efficient and would pay itself back through electricity bills in a couple of years.
When I ran the proper maths for them, they would still need to keep their 780 Ti for another 20 years, or 30 years in the case of the Titan, to get back the extra money paid over the 290X. That's assuming the card works 8 hours per day, 365 days per year, at 100% load. Anything less, like 3 hours at full load per day, takes 90+ years to pay the price difference back through efficiency.

And in the case of the RTX cards, the gap is even bigger and amounts to centuries... hahaha.
So no GPU can pay back the price difference through efficiency (power) gains in its lifetime. NEVER EVER.

Anyone who believes the opposite is ignorant of simple maths.


Hear! Hear! The nonsense talked about efficiency gains needed calling out.
 
Soldato
Joined
26 Sep 2010
Posts
7,157
Location
Stoke-on-Trent
And I say this in all honesty, Panos: hot and "inefficient" GPUs have helped tip the balance in my room temperature over the colder months, actually saving me a chunk of cash by not having to push the heaters as hard.

I wonder if I could repurpose a Radeon VII as a close-quarters space heater? The monthly extra I'd save on electricity would cover the cost of the card in 6 months...

:p
 
Caporegime
OP
Joined
24 Sep 2008
Posts
38,322
Location
Essex innit!
Hardly anyone talks about cost saving when it comes to power efficiency. Far more people complain about people talking about it.
I don't think anyone cares about the cost saving; I feel it is more to do with the heat generated and the noise level. More wattage = more heat = harder-working fans. That has been, and still is, my issue with my 290X. It doesn't stop it being a good card, but even with MSI's Twin Frozr fans, it is a noisy bugger when gaming.
 
Soldato
Joined
22 Apr 2016
Posts
3,432
It's quite regrettable that certain people can only equate inefficiency with saving a few pennies a year.

They just can't see the other disadvantages that inefficiency brings.

Heat, noise and thermal throttling. Just look at the unfortunate Radeon VII as a prime example of this.

Are the best gaming GPUs inefficient? Of course not; inefficiency is hardly a desirable attribute for maximum performance.
 