1080Ti - Will it last?

I think the whole Freesync/G-Sync thing is getting over-hyped. OP's monitor is going to be great, no matter what gpu he goes for. G-Sync is just a nice feature. No point selling the kid's Christmas presents so he can buy a 2080 just to stop a bit of screen tearing here and there.
It isn't overhyped. G-Sync is a huge benefit at lower frame rates. He can extend the life of his card by going G-Sync.
 
It really doesn't matter. Look at what happened with the Titan X (Maxwell): it was the most powerful flagship GPU of 2015, with a massive 12GB of VRAM that was supposed to be future-proofing, yet it struggled just like the 980 Ti, and that 12GB buffer went to waste. Even 2018 games like Just Cause 4 still only use around 3.5GB of VRAM at 1080p, 4GB at 1440p and 5GB at 4K. We have yet to see games max out 12GB of VRAM.

So what are Nvidia meant to do to stop this happening?
The reality is that as GPUs become more powerful, the high end GPUs of the past become the midrange GPUs of today. Should Nvidia stop releasing more powerful cards?

People should buy a GPU for what they want at the time. We don't know about games or GPUs of the future, so it's a silly thing to worry about. Get the card that gives you the performance you want, today.
 
No. Nvidia purposely gimp their older GPUs by not optimising them for the latest games. For reference, the 980 Ti is being handily beaten by the 1070 in Just Cause 4, while that wasn't the case last year. If everyone goes for the 1080 Ti, you can bet Nvidia will stop optimising for Pascal to force upgrades to RTX. Remember, they are in complete control of the high-end GPU market. They can do whatever they want.
 
The benefit of G-Sync is not worth £400

The extra you pay is realised in savings later, because you don't need to upgrade as often: G-Sync makes drops to 50fps almost unnoticeable and drops to 40fps playable. Without G-Sync, lower fps would be a tearing, stuttery mess, so you have to upgrade more frequently to maintain your frame rate.
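For intuition, here's a rough back-of-envelope sketch (my own illustration, not from anyone in this thread) of why ~50fps judders on a fixed 60Hz panel but feels smooth with variable refresh. With vsync on a fixed 60Hz display, each rendered frame is held until the next 16.67ms refresh tick, so a steady 20ms frame cadence gets quantised unevenly; with G-Sync the panel refreshes the moment each frame is ready:

```python
import math

REFRESH = 1000 / 60   # fixed refresh interval on a 60Hz panel, ms
FRAME   = 1000 / 50   # render interval at a steady 50fps, ms

def displayed_durations(n_frames):
    """On-screen hold time of each frame when snapped to fixed refresh ticks."""
    shown = [math.ceil(i * FRAME / REFRESH) * REFRESH for i in range(n_frames + 1)]
    return [round(b - a, 2) for a, b in zip(shown, shown[1:])]

fixed_60hz = displayed_durations(6)
vrr        = [round(FRAME, 2)] * 6   # VRR: each frame is shown for exactly one frame time

print("fixed 60Hz:", fixed_60hz)  # a mix of 16.67ms and 33.33ms holds -> visible judder
print("VRR/G-Sync:", vrr)         # uniform 20ms per frame -> smooth
```

The uneven holds on the fixed-refresh panel (some frames shown for one refresh, some for two) are exactly the stutter being described; VRR removes them without needing a faster GPU.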
 
It really doesn't matter. Look at what happened with the Titan X (Maxwell): it was the most powerful flagship GPU of 2015, with a massive 12GB of VRAM that was supposed to be future-proofing, yet it struggled just like the 980 Ti, and that 12GB buffer went to waste. Even 2018 games like Just Cause 4 still only use around 3.5GB of VRAM at 1080p, 4GB at 1440p and 5GB at 4K. We have yet to see games max out 12GB of VRAM.

I personally own 4 games that max it out.

You keep pointing out Maxwell cards, but Maxwell cards are not as capable as a 1080 Ti.

It's hard to explain, but what I mean is this: for many, many generations, the most powerful GPU of its generation was never powerful enough, at least not without SLI.
Pascal was the first generation where a single card was powerful enough to play games at max resolution at 60fps.

Maxwell was never powerful enough to do this, even on launch day. Same with all previous generations.

I expect that if a 1080 Ti is good enough for your needs now, it will remain that way at least until we see the next-gen consoles, which are predicted for 2020.

Nvidia know this as well, so they had to create a tech for the consumer space which "slows cards down", hence RT. Otherwise people would stay on Pascal, it being "good enough"; the exceptions being those who always want the latest GPU.
 
+1 I couldn't agree with you more...
 
"in general"

@Shaz12: What "tearing, stuttery mess"? I've been happily gaming on my 1080p, 60Hz monitor for over 4 years.

Again: over-hyped.
Because you have probably gotten used to it. I was also gaming on a 60Hz monitor until a month ago and didn't notice it. I upgraded to a 144Hz G-Sync monitor and now the difference is obvious. Try 144Hz with G-Sync and you won't want to go back.
 
Or don't try it; you'll never know what you're missing and you'll save loads of money :p
 
No gfx card is future proof.

True but some do have a very long service life.

If you had purchased one of the original Kepler Titans (nearly 6 years old now), you could still use it today, as it still has reasonable GPU muscle and 6GB of VRAM.

Having said that the asking price was a bit steep.

The real question I'm sure you're dying to ask is: "Will the RTX Titan last 6 years, or 6 minutes before the Space Invaders arrive?" :eek:
 