NVIDIA ‘Ampere’ 8nm Graphics Cards

I'm happy at 1440p for this reason: the 1080 Ti still does fine at that res with max settings in all the games I play. A friend of mine went 1440p ultrawide and enjoys high FPS, so he felt he had to go for the 2080 Ti to maintain high frame rates, and ended up spending twice what I did for a similar experience.

Ultimately we're both happy with what we've got, and the choices reflect lifestyle. He's young, single, rent-free and plays ALL the time; I have a mortgage, wife and two kids, so I don't have as much time to play and don't deem the extra cost worthwhile.
Best choice if you want high FPS in upcoming hardware-hungry games, IMO. And, as said, it makes the card last longer.

There's also similar or better pixel density on smaller 1440p monitors vs the big 4K TVs some seem to use for gaming now, which basically makes up the quality difference anyway.
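To put rough numbers on that, pixel density is just diagonal resolution over diagonal size. A quick sketch, with assumed example screen sizes (not anyone's actual setup):

```cpp
// Pixel density: PPI = sqrt(width^2 + height^2) / diagonal inches.
// The 27" and 48" sizes below are assumed examples for illustration.
#include <cmath>
#include <cstdio>

double ppi(int w, int h, double diag_inches) {
    return std::sqrt(double(w) * w + double(h) * h) / diag_inches;
}

int main() {
    printf("27\" 1440p monitor: %.0f PPI\n", ppi(2560, 1440, 27.0)); // ~109
    printf("48\" 4K TV:         %.0f PPI\n", ppi(3840, 2160, 48.0)); // ~92
    return 0;
}
```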
 
One-offs though, right? And at what res?

And if Nvidia have some special driver memory-compression sauce, who knows...

I don't think the memory configurations are a problem on the cards.
At 3440x1440

I just started a little flyover of New York and it's using 10.5 GB here.
[screenshot of the VRAM readout]
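(For what it's worth, overlays and monitoring tools typically derive a figure like that as total minus free device memory, as reported by the driver. A minimal CUDA sketch of the same query, assuming the CUDA toolkit is available:)

```cpp
// Minimal sketch: report "used" VRAM the way monitoring tools usually
// derive it, i.e. total minus free, as returned by the driver.
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    size_t free_b = 0, total_b = 0;
    cudaMemGetInfo(&free_b, &total_b);  // free and total bytes on device 0
    const double gib = 1024.0 * 1024.0 * 1024.0;
    printf("VRAM in use: %.1f GB of %.1f GB\n",
           (total_b - free_b) / gib, total_b / gib);
    return 0;
}
```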
 

OK, fair enough. But we don't know for sure if it needs all that VRAM. We know that some games use all available VRAM regardless, like COD if I remember correctly.

Also, this is an example of a game that uses extreme levels of RAM and VRAM, so a lot of games might not peak like this one does.
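That allocated-vs-needed distinction is easy to demonstrate. A minimal CUDA sketch, purely illustrative: a process can reserve a big chunk of VRAM up front, and monitoring tools will count it as "used" even if nothing is ever written to it.

```cpp
// Sketch: reserve 8 GB of VRAM without touching it. Tools that report
// "used" VRAM will count this reservation, even though no data is in it.
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    void* pool = nullptr;
    const size_t eight_gb = 8ull * 1024 * 1024 * 1024;
    if (cudaMalloc(&pool, eight_gb) == cudaSuccess) {
        printf("Reserved 8 GB; monitoring tools now show it as used.\n");
        getchar();            // pause so the effect can be observed
        cudaFree(pool);
    } else {
        printf("Allocation failed: not enough free VRAM.\n");
    }
    return 0;
}
```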
 
I don't think so, no, not really. If the data isn't in VRAM, it has to be loaded from a much, much slower source (relatively speaking), like an SSD.

So the capacity is still as important as it ever was.

The speed of the memory dictates how fast you can operate on it: read it, move stuff around from one bit of memory to another.

But if you have to swap bits out because VRAM is full, where do they get swapped to and from? A much, much slower medium.
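To put assumed, ballpark bandwidths on that hierarchy (on-card GDDR6 in the hundreds of GB/s, PCIe 4.0 x16 around 32 GB/s theoretical, an NVMe SSD a few GB/s), here's a back-of-the-envelope sketch of how long moving the same hypothetical 2 GB asset takes at each tier:

```cpp
// Back-of-the-envelope: time to move a 2 GB asset at assumed, rough
// bandwidths for each tier of the memory hierarchy discussed above.
#include <cstdio>

int main() {
    const double asset_gb = 2.0;  // hypothetical texture set
    struct Tier { const char* path; double gb_per_s; };
    const Tier tiers[] = {
        {"GDDR6 VRAM (on-card)", 450.0},  // assumed ~450 GB/s
        {"PCIe 4.0 x16",          32.0},  // ~32 GB/s theoretical
        {"NVMe SSD (Gen4)",        7.0},
        {"SATA SSD",               0.55},
    };
    for (const Tier& t : tiers)
        printf("%-22s %8.1f ms\n", t.path, asset_gb / t.gb_per_s * 1000.0);
    return 0;
}
```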

OK, fair enough. I don't pretend to know how these programs are written, but it would make logical sense to me to load things into system RAM and then swap into and out of VRAM from there, as that would be far faster, and to swap into and out of system RAM from the SSD. So while a game is running, it could be loading something new from SSD to system RAM, then, as you get closer to needing it, load it from system RAM into VRAM at really high bandwidth.

Now that we're moving to PCIe 4.0, shouldn't everything happening across the interface between the GPU and CPU/RAM be loads faster?

Fundamentally this needs to be a part of the testing now.
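That staging idea, sketched in CUDA terms (names and sizes are illustrative, not any engine's actual code): read from the SSD into pinned system RAM, then stream chunks into VRAM with an asynchronous copy just before they're needed.

```cpp
// Sketch of SSD -> pinned system RAM -> VRAM staging. Pinned (page-locked)
// host memory allows fast DMA transfers across PCIe.
#include <cstdio>
#include <cstring>
#include <cuda_runtime.h>

int main() {
    const size_t chunk = 64ull * 1024 * 1024;  // 64 MB staging chunk
    void *host_staging = nullptr, *device_dst = nullptr;
    cudaStream_t stream;

    cudaMallocHost(&host_staging, chunk);  // pinned system RAM
    cudaMalloc(&device_dst, chunk);        // destination in VRAM
    cudaStreamCreate(&stream);

    // 1) SSD -> system RAM (memset stands in for a real file read here)
    memset(host_staging, 0, chunk);

    // 2) system RAM -> VRAM, asynchronously, ahead of when it's needed
    cudaMemcpyAsync(device_dst, host_staging, chunk,
                    cudaMemcpyHostToDevice, stream);
    cudaStreamSynchronize(stream);
    printf("Staged %zu MB into VRAM.\n", chunk / (1024 * 1024));

    cudaStreamDestroy(stream);
    cudaFree(device_dst);
    cudaFreeHost(host_staging);
    return 0;
}
```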
 
OK, fair enough. But we don't know for sure if it needs all that VRAM. We know that some games use all available VRAM regardless, like COD if I remember correctly.

Also, this is an example of a game that uses extreme levels of RAM and VRAM, so a lot of games might not peak like this one does.
I just wouldn't want to take the gamble of dropping around £800 (guessing) on a GPU and finding out 6 to 12 months down the line that it doesn't have enough VRAM.
I got burned by that problem before with SLI 3GB 780 Tis. :(:mad:
 
Exactly this. We can all accept that our expensive GPU, in two years' time, is no longer the fastest on the market (in its price range). We can accept that the newest games won't run at the same high frame rates as two-year-old games.

What we can't accept is that, due to a completely unnecessary limitation (planned obsolescence, you might say), performance drops off a cliff because the card simply runs out of VRAM, resulting in bad stuttering and the game generally being unplayable.
 
I got a 2070 for £35 less than that, dude. Again, it is faster, and again it would have been cheaper. It was a nice model too, with a nice cooler.

I never pay launch prices. Not ever. Mostly because the drivers usually suck, there are issues with the BIOS, etc. I just wait for it all to settle down, including the prices, then buy.

It would be pretty nuts to buy any of these cards until AMD have had their say, like it was to pay launch price for the 20 series. Hence I waited, and I got a 2070 for £350 (new, Asus) and a 2070 Super for £418 (KFA2). Neither broke the bank, and both easily hit 100 FPS at 1440p ultra in anything that needs high FPS (like COD MW, for example).

I can't work you out.

How much should the new cards cost? My 1070 was bought at inflated release prices.
 
I just wouldn't want to take the gamble of dropping around £800 (guessing) on a GPU and finding out 6 to 12 months down the line that it doesn't have enough VRAM.
I got burned by that problem before with SLI 3GB 780 Tis. :(:mad:

I remember the VRAM issue well with those 780 Tis. Watch Dogs ran like total crap on those cards the moment you launched the game, not because of GPU power but because it constantly ran out of VRAM; the game dropped frames and froze for a moment constantly. The Titan Black never had that issue, as it had a massive 6 GB of VRAM for the time. I was glad to get shot of those cards when the 980s came out.

I just hope 10 GB of VRAM doesn't come back to bite Nvidia in the ass, if that's what they launch at.
 
I can't work you out.

How much should the new cards cost? My 1070 was bought at inflated release prices.

You just answered your own question, mate :)

1. Cards are not worth launch prices. Don't pay launch prices.
2. How much should they cost? Less than launch prices.

Had I adopted, say, the 2070 at launch, I would have paid nearly what I paid for the Titan XP and lost performance; performance I did not want to lose. Had I bought a 2080? Gawd, big hole.

What I say can be a bit confusing. I have three gaming PCs: two at my flat and one at my mother's. She is old, and I spend a lot of time there anyway, as I restore bikes as a hobby and have no garden. I have a 2070 Super at my mum's with a 16-core Haswell Xeon (I use VMware a lot) and a TR 1920X at home with a 2070. I initially bought the 2070 Super for home (to go with the new 1920X and board I bought back in January), but I could not get a block for it (the KFA2). So I bought a 2070 for home, and a block. Rig at my mother's:

[photo]


Yes, it's a mess; it's totally a workhorse with gaming on the side. Rig at home:

[photo]


But, as you can see, other than the cooling and mods it's nothing exceptional: 32 GB Doms, a 2070 and a TR 1920X. I won a lawsuit against my old landlady and made out well, so I built this with her money :D

[two more photos of the rig]


So yeah, I have three rigs, hence why I usually aim mid-range long after launch, when the prices are right (all rigs run 1440p). I could sell off the TR rig, but look at it: it took me three months of modding and work to build, so I could never part with it. I am going to put it in my bedroom and use it for winter gaming.

And yeah, I don't have kids, am single (twice married, never again) and so on. But even I would never waste money on PC components. Everything I buy is either used, or I wait for the price to be right for me. I could afford to go stupid on hardware, but I got done pouring money down the bog years ago (after the 4K fiasco).

Is the new rig nice? Yup. Is it fast? Yes sirree bob. Does it do anything the others can't? Nope! That is why I wouldn't dump 4 grand of my own cash on it. What I had was absolutely spot on: I never had any issues gaming, and I game a lot. More than I care about having the latest hardware, anyway.

Pay what you think it is worth, don't live on the bleeding edge, and the bottom line, fella? Enjoy the games. Stop counting FPS, turn off the OSD and just enjoy the bleedin' games!
 
The motherboards support CrossFire and SLI, so you are free to install dual GPUs in your system.
DX12 will help for the rest.
Very few mobos actually support SLI, and CrossFire is seemingly dead in the water.
Exactly this. We can all accept that our expensive GPU, in two years' time, is no longer the fastest on the market (in its price range). We can accept that the newest games won't run at the same high frame rates as two-year-old games.

What we can't accept is that, due to a completely unnecessary limitation (planned obsolescence, you might say), performance drops off a cliff because the card simply runs out of VRAM, resulting in bad stuttering and the game generally being unplayable.
+1. You said it; it's not a compromise we should have to make, and when you experience it with an expensive card it's infuriating, to say the least.
 