
NVIDIA ‘Ampere’ 8nm Graphics Cards

For some reason, people expect to get a lot more RT performance from one gen to the next. I wonder how people will react if Ampere gives tiny performance increases all around.

New process - double the transistor count. That's not for having a billion radios on your new GPU!

RT will certainly get a massive boost - consoles will be pretty weak in this respect, so that's a game Huang can play well. Plus, if the rumours of a 2nd FP32 unit are true, then general performance improvements should be good. The only problem is price vs AMD's offerings, but the fact that we're seeing a very hot 3090 indicates Huang is worried.
 
I like how Jensen gimped the VRAM on the 3080 so owners become disgruntled sooner and he can sell them something else sooner. Huge gap between 10GB and 24GB.
 
I like how Jensen gimped the VRAM on the 3080 so owners become disgruntled sooner and he can sell them something else sooner. Huge gap between 10GB and 24GB.
Am wondering if that was done so Nvidia can drop the price of the 3080 as low as possible in case it goes up against AMD's new top-end card

If AMD's new flagship card is the same speed as the 3080 then there'll be a battle on price between them
 
Am wondering if that was done so Nvidia can drop the price of the 3080 as low as possible in case it goes up against AMD's new top-end card
Yeah, perhaps it allows them to drop price and margin to compete better if AMD do come out with something good. 10GB of VRAM is definitely cheaper than anything more.
 
I like how Jensen gimped the VRAM on the 3080 so owners become disgruntled sooner and he can sell them something else sooner. Huge gap between 10GB and 24GB.
Most likely a 3080 Ti/Super, which would have been the card they wanted to begin with; until then, the only option for more VRAM is the hugely expensive 3090.

If AMD can match 3080 performance and prices and offer more VRAM then we would likely see an updated version sooner rather than later.
 
Not everyone is gaming at 4K though.

I'm at 1440p. 10GB of VRAM might be fine for the next 2 years.

They must know about next-gen game requirements as well; otherwise, when next-gen games arrive and consistently breach the 10GB VRAM mark, sales of their cards will drop too, as they'll no longer be desirable - especially if AMD has cards with more VRAM.
 
I just realized that Turing got a lot of people to settle for tiny performance gains...except when it comes to RT performance.

For some reason, people expect to get a lot more RT performance from one gen to the next. I wonder how people will react if Ampere gives tiny performance increases all around.

Anything first gen sucks. Seriously. Like the Oculus DK and DK2, etc. They didn't even try to pass those off as fully blown products. Early VR was like Pong: very simplistic. RT was always going to be the same. Just like when 4K launched, yet no single GPU could run it.

First gen adoption of any new tech costs money. We should all know this. Early 3D was very expensive too (the glasses, the monitor you needed and so on) and nothing much changes.

Performance gains over the Titan XP with the 2080 Ti were not what many people expected. Me? I thought they were fine. Mine flies, especially at an easy 2150MHz with no volt modding, and 2200MHz if I ever want it. I have used RT, but the conclusion is, as always, 'it is too early'. Just like I said above, everything at inception is crap.

RT has improved somewhat, but certainly not enough for me to buy a whole GPU just for it. That would be really silly, as there are hardly any games that support it (FFS, Quake 1, and everyone lost their minds LOL) and it's simply too early in the game.

My first RT card was a 2070 Super for £418. My main rig still had the Titan XP I'd bought 3.5 years before (it would be nearly 4 now!) and I was very impressed with the performance. For once I got lucky: it outbenched my Titan XP on air (the Titan was under water) in most things, so much so that I replaced the Titan with it, and under water the performance drop was so small I didn't notice. Encoding movies, however? OMG. People are just not accepting all of the new things Nvidia did and do. Like, for years if I want 4K I run DSR. I ran Adaptive Vsync after buying a Freesync monitor, and all sorts of the other goodies they do without all of the bravado.

I bought my XP in 2017 for £675 on the launch day of the 1080 Ti. I could have got £450 back for it easily (more than the 2070 Super cost) but I gave it to a chum.

Once I ditched 4K monitors and went to 1440p I have stayed there. 4K looks better, but every time you dump over a grand for 60 FPS it is short lived. My TXP? Three years (and more) after I bought it, it still ran COD MW at ultra 1440p at over 120FPS. This is why anyone who says you have to have the latest, most expensive GPU is crazy. Been there, done that, believe me. Three Titan Blacks new, two Fury Xs. Yeah, screw those potatoes. 4K does look better than 1440p (if you sit and stare at it rather than play it), but the performance penalty has always been so big that I couldn't be arsed with it.

But yeah, it was quite clear RT was going to suck at first. Nvidia had switched from big old dies (Fermi) to tiny little ones with their balls clocked off. It was clear when they went back to the big guns it wouldn't be easy.
 
I just realized that Turing got a lot of people to settle for tiny performance gains...except when it comes to RT performance.

For some reason, people expect to get a lot more RT performance from one gen to the next. I wonder how people will react if Ampere gives tiny performance increases all around.
I bought my 2080 Ti because it gives almost double the performance of my GTX 1080
Not because of the RT
 
New process - double the transistor count. That's not for having a billion radios on your new GPU!

RT will certainly get a massive boost - consoles will be pretty weak in this respect, so that's a game Huang can play well. Plus, if the rumours of a 2nd FP32 unit are true, then general performance improvements should be good. The only problem is price vs AMD's offerings, but the fact that we're seeing a very hot 3090 indicates Huang is worried.

The console RT is not particularly impressive so far. Watching the Ratchet & Clank demo yesterday: it's 30fps, and yes it has ray-traced reflections, but only certain objects are reflected, there is no ray tracing on shadows or global illumination, and the objects that do reflect are not of very high quality. The reflections on floors and other shiny surfaces have a very dirty, aliased appearance, as if the reflection resolution is much lower than native. Compare it to the reflections in Battlefield V, which are incredibly detailed and clean. Battlefield V is not choosy - every object is reflected, so when you look in a window you don't see missing geometry like you do in Ratchet & Clank.

Ratchet & Clank is really our first proper look at RDNA2 ray tracing, as we know what it looks like, we know how it works, and we know what the framerate and resolution are. And to be honest, I'm far from impressed.
 
I bought my 2080 Ti because it gives almost double the performance of my GTX 1080
Not because of the RT

But, unlike pretty much every generation prior to Turing, where we were getting meaningful performance increases at the same price point... your 2080 Ti should have come in around the same price point as the 1080 Ti, but it didn't. It wasn't even close.

I didn't buy Turing, because I expected more for my money than what it offered.

My point about RT is that people seem to expect a lot more for their money again. (Which I'm happy to see.)
 
The vast majority of people have VRAM configurations of around 6-11GB right now.

Tell me what has changed that makes us think we need 20GB of VRAM on a graphics card?
Not everybody will change their GPU every year/gen, so people are thinking ahead 2+ years.

And also not everybody is buying the xx80 card either! The lesser cards have 6 or 8GB!

Finally, nobody said we need 20. We just don't want less than 12.

Sheesh, we've had 8GB cards for 5+ years.
 
Once I ditched 4K monitors and went to 1440p I have stayed there. 4K looks better, but every time you dump over a grand for 60 FPS it is short lived. My TXP? Three years (and more) after I bought it, it still ran COD MW at ultra 1440p at over 120FPS. This is why anyone who says you have to have the latest, most expensive GPU is crazy. Been there, done that, believe me. Three Titan Blacks new, two Fury Xs. Yeah, screw those potatoes. 4K does look better than 1440p (if you sit and stare at it rather than play it), but the performance penalty has always been so big that I couldn't be arsed with it.

I'm happy at 1440p for this reason; the 1080 Ti still does fine at that res with max settings in all the games I play. A friend of mine went 1440p ultrawide and enjoys high FPS, so he felt he had to go for the 2080 Ti to maintain high frame rates, and ended up spending twice what I did for a similar experience.

Ultimately we're both happy with what we've got, and the choices reflect lifestyle. He's young, single, lives rent free and plays ALL the time; I have a mortgage, a wife and two kids, so I don't have as much to play with and don't deem the extra cost worthwhile.
 
Not everybody will change their GPU every year/gen, so people are thinking ahead 2+ years.

And also not everybody is buying the xx80 card either! The lesser cards have 6 or 8GB!

Finally, nobody said we need 20. We just don't want less than 12.

Sheesh, we've had 8GB cards for 5+ years.

That's what I gathered from reading some of the comments: people saying that early adopters of the 3080 will eventually regret the purchase with its 'only' 10GB of VRAM, and will lust after the 3080 Super/Ti with its 20GB when it releases in 6-9 months.
 
Sure, but now it's getting faster and faster. Isn't speed important as well as capacity?
I don't think so, no, not really. If the data isn't in VRAM it has to be loaded from a much, much slower source (relatively speaking), like an SSD.

So the capacity is still as important as it ever was.

The speed of the memory dictates how fast you can operate on it: read it, and move stuff around from one bit of memory to another.

But if you have to swap bits out because VRAM is full, then where does that get swapped to and from? From a much, much slower medium.
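A rough back-of-envelope sketch makes the point. This is just illustrative Python with assumed, approximate bandwidth figures (real numbers vary by card, PCIe generation and SSD), not measured specs:

```python
# Back-of-envelope: how long it takes to move a 2 GiB working set
# (e.g. textures that no longer fit in VRAM) over each link.
# All bandwidth figures below are rough assumptions for illustration.

GiB = 1024 ** 3
working_set_bytes = 2 * GiB

bandwidths_gb_per_s = {
    "GDDR6X VRAM (on-card)":          760.0,  # ~RTX 3080-class memory bandwidth
    "PCIe 4.0 x16 (RAM -> GPU)":       32.0,  # theoretical peak, one direction
    "NVMe SSD (fast consumer drive)":   7.0,  # sequential read, best case
}

for link, gb_s in bandwidths_gb_per_s.items():
    ms = working_set_bytes / (gb_s * 1e9) * 1000
    print(f"{link:34s} ~{ms:7.1f} ms to move 2 GiB")

# A 60 FPS frame budget is ~16.7 ms. One trip over PCIe (~67 ms),
# never mind the SSD (~300 ms), blows it completely - which is why
# capacity still matters no matter how fast the VRAM itself is.
```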
 