
NVIDIA ‘Ampere’ 8nm Graphics Cards

With the consoles launching only two months later (and probably being paper-launched earlier than that), I reckon Nvidia are going to have to be very aggressive on pricing. Remember that - true or not - the consoles are expected to be good enough for 4K gaming. Plus AMD are no longer on the ropes but now a respected - possibly even feared - competitor. So I would not be surprised to see the 3080 Ti at £599 or even £499. They've got to get people to buy rather than wait for the new consoles.
 

That's around the price point I'm interested in, regardless of the nomenclature. If it's anything more expensive than that, then the only dilemma I'm going to have is whether to buy Xbox or PS5. I'm actually really looking forward to seeing the performance figures for the consoles.
 
[Image: 4kBE.gif]

[Image: vRfcoGk.png]


Too good to be true?


Load of rubbish lol.

The thing that gives it away is the quoted amounts of memory on the cards.

There is no way a pure gaming card is going to need 20GB or 24GB of memory any time soon.

Yes, I know the existing RTX Titan has 24GB, but that is through force of circumstance, as it is very closely related to the Turing professional cards.
 
True. The reason I'm not believing it is that I want it to be true too much; gotta stay pessimistic - we're talking about Nvidia here, after all. :D
 

I have not used more than about 13GB of VRAM and that was doing crazy things like running at 8k with all the settings turned up. I don't think that much memory will be needed for another 4 or 5 years.

The Ampere Titan may have crazy amounts of memory but again the card could be a rebadged pro card like the RTX Titan and Titan V were.
 
I've had several games use over 10GB of VRAM. I don't know how high they'd go, though, since I only have 11GB.

Games also tend to cache data in VRAM that they don't strictly need, just like system RAM - there are several games that use 14GB of VRAM on a Radeon VII, but it doesn't make the game performance any better.

It's difficult to actually know: when do I need more VRAM, and how much?
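One way to at least see how much VRAM is currently allocated (whether it's genuinely needed or just cached) is to query `nvidia-smi`. The query flags below are real `nvidia-smi` options, but the helper functions and the sample output values are illustrative only - a minimal sketch, not a definitive tool:

```python
import csv
import io
import subprocess


def parse_vram_usage(csv_text: str) -> list[tuple[int, int]]:
    """Parse nvidia-smi CSV output into (used_mib, total_mib) per GPU."""
    rows = csv.reader(io.StringIO(csv_text.strip()))
    next(rows)  # skip the header line
    # Each field looks like "10240 MiB"; keep just the number.
    return [(int(used.split()[0]), int(total.split()[0])) for used, total in rows]


def query_vram_usage() -> list[tuple[int, int]]:
    """Ask the driver for current memory usage (requires an NVIDIA GPU)."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total", "--format=csv"],
        text=True,
    )
    return parse_vram_usage(out)


# Example of the CSV nvidia-smi emits (values here are made up):
sample = "memory.used [MiB], memory.total [MiB]\n10240 MiB, 11264 MiB\n"
print(parse_vram_usage(sample))  # [(10240, 11264)]
```

The caveat from the post still applies: this number includes cached allocations, so a high reading alone doesn't prove the game would suffer on a smaller card.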
 
I really don't think increasing the resolution all the time is the answer. 8k is ridiculous!!!

Surely game developers must be doing it wrong in the first place if the image needs 8k to remove jaggies?

A DVD movie has no jaggies at 480p and looks fantastic. Why does a game need 8k to look as good? I don't even think a 16k game would look as realistic as a 480p DVD, to be honest. Resolution clearly isn't the correct answer.

1440p should be plenty. We just need games developers to have a board meeting and work things out properly in my opinion.

We need better solutions to these problems. 8k isn't the answer until games look as good as a movie. Then 8k can add extra detail on top of that.
You're right, game developers are clearly missing a trick somewhere, but I'm not at all sure where that is.
 
I've had several games use over 10GB of VRAM. I don't know how high they'd go, though, since I only have 11GB.

Games also tend to cache data in VRAM that they don't strictly need, just like system RAM - there are several games that use 14GB of VRAM on a Radeon VII, but it doesn't make the game performance any better.

It's difficult to actually know: when do I need more VRAM, and how much?
Final Fantasy 15 used to use 11.5GB of my 12GB available when I had my Titan XP. There were a few others using 10-11GB at the time too, but I forget which games.
 
This is my problem with Turing. We are not at the point where NVidia can do what phone companies do and just keep targeting the higher end poser brigade as the technology is still behind where it needs to be.

If you’re not bothered about social image or absolute performance, an iPhone 6 is still a great phone.

When every card can easily do 4K, by all means fleece the top end buyers but that time isn’t here and should be no excuse to bring the price of the lower end up.
I think they're deliberately stalling on performance because once most cards can do 4K easily with decent AA etc the incentive to upgrade disappears. I have an iPhone 6 and can see no reason whatsoever to upgrade.
 
Final Fantasy 15 used to use 11.5GB of my 12GB available when I had my Titan XP. There were a few others using 10-11GB at the time too, but I forget which games.

They tend to just use what's there tbh, my Ti uses around 10GB and my XT around 7GB yet didn't seem to affect performance whatsoever.
 
They tend to just use what's there tbh, my Ti uses around 10GB and my XT around 7GB yet didn't seem to affect performance whatsoever.
Yea. Does seem that way. Still, though, I want more than 8GB on a 3070. Even the 1070 had 8GB...
 
With the consoles launching only two months later (and probably being paper-launched earlier than that), I reckon Nvidia are going to have to be very aggressive on pricing. Remember that - true or not - the consoles are expected to be good enough for 4K gaming. Plus AMD are no longer on the ropes but now a respected - possibly even feared - competitor. So I would not be surprised to see the 3080 Ti at £599 or even £499. They've got to get people to buy rather than wait for the new consoles.
Well, even I can't see anything like that panning out, although I'd dearly like it (excuse the pun). I'd be buying two at that price, in fact possibly even three!
 
Yea. Does seem that way. Still, though, I want more than 8GB on a 3070. Even the 1070 had 8GB...

Tbf, the more RAM we get, the longer it will 'last', as such. Look at system RAM: it's been 'stuck' at 16GB for quite a while, and I can't really see it moving to 32GB for another few years at least.
 
Tbf, the more RAM we get, the longer it will 'last', as such. Look at system RAM: it's been 'stuck' at 16GB for quite a while, and I can't really see it moving to 32GB for another few years at least.
Yea, the only game I can see needing it right now is Flight Simulator 2020, but I won't bother upgrading to 32GB until I hand this CPU and RAM down to my partner and grab a 4900X with 32GB of RAM on the cheap in 1-2 years' time :D
 
I have not used more than about 13GB of VRAM and that was doing crazy things like running at 8k with all the settings turned up. I don't think that much memory will be needed for another 4 or 5 years.

The Ampere Titan may have crazy amounts of memory but again the card could be a rebadged pro card like the RTX Titan and Titan V were.

Do you own a VR headset? I have a Valve Index (2880×1600 resolution) and a few VR games just chew through VRAM. Some levels in Half-Life: Alyx are notorious for this on max settings. I've seen 10-13GB in use on a regular basis.

On another note, are you aware of the specs of the current-gen consoles? They have 8GB of shared memory that can be used for both graphics and system. As consoles are so well optimised, the system/OS doesn't use that much, and the majority can be dedicated to the GPU as VRAM. The next gen of consoles have 16GB of shared memory...

Expect AAA games 'ported' from the PS5/next-gen Xbox to effectively double in VRAM requirements versus what we see now. This is planned obsolescence: current 8GB VRAM cards will not be able to run graphics settings equivalent to the new consoles because of this VRAM limitation.

Nvidia will make a killing.
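The "doubling" argument above is simple arithmetic on the shared pool. A minimal sketch, assuming illustrative OS reservations of 2GiB and 2.5GiB (not published figures):

```python
def gpu_budget(shared_gib: float, os_reserved_gib: float) -> float:
    """Memory left for the game out of a console's shared pool (GiB)."""
    return shared_gib - os_reserved_gib


# Current gen: 8 GiB shared, assume ~2 GiB kept by the OS (illustrative).
current = gpu_budget(8, 2)       # 6.0 GiB usable
# Next gen: 16 GiB shared, assume ~2.5 GiB kept by the OS (illustrative).
next_gen = gpu_budget(16, 2.5)   # 13.5 GiB usable

print(current, next_gen, next_gen / current)  # ratio is 2.25, i.e. a bit over double
```

Under those assumed reservations the usable graphics budget slightly more than doubles, which is why an 8GB card starts to look tight for straight console ports.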
 
They tend to just use what's there tbh, my Ti uses around 10GB and my XT around 7GB yet didn't seem to affect performance whatsoever.

That’s the main thing: you see a lot of people comparing VRAM usage when the cards referenced have different amounts of memory. A card with more memory tends to use more, as I’ve found out as well.
 
Yah, but that is fairly specialist, not exactly a pick-up-and-go kind of game!! Stressful AF from what I've seen/read :eek:
Been playing Flight Simulator games on and off for about 20 years. I can land most aircraft safely. Those big commercial planes are not easy, but Cessnas are super easy. I've even flown one in real life, but did not do the landing :p:D
 