These days everyone is afraid to say "I don't know" but still has an opinion, and it's often based on assumptions or speculation rather than fact (we all do it sometimes). How bad is the nerf in reality? People are still raving about the 900 series in other threads.
I really doubt NV purposely hamper older gen cards. Games that were developed when those cards were current are probably not going to get slower. However, it's completely expected that more resources will be directed at the current gen. I assume architectures share some similarities, so optimisations done for Pascal, for example, still work with Maxwell. If NV are able to open a larger performance gap between current and older gen cards even in old games, it's probably just that they're able to eke a little more out of the latest cards. GPUs that were current when a game was released should already be pretty well optimised.
Imagine trying to test every single card from, say, the last 5 years against every new game. It'd be a time-consuming, resource-intensive task. I expect both NV and AMD have a way of simulating different GPUs, or at least architectures, to make it a little easier.
EDIT: To answer the original question, I would have to wait for release details (specs/prices). A good rule of thumb, however, would always be to pick up the new gen cards. It doesn't make sense to me to buy new last-gen GPUs unless the pricing is extremely attractive, but that never really happens.
Even if the 2080 is just 8% faster for a similar price, I'd go with the 2080 personally, just to get onto the forward-facing technology. As well as performance, you've got another 2-3 years of extra R&D.