Discussion in 'Graphics Cards' started by Kaapstad, Jan 18, 2020.
Once you go Black..........
...you never go back?
I think devs are too lazy with current tech. They should maximise the PC, but lazy devs don't think that way anymore...
Lazy devs also don't want to contemplate the off chance that some users may want to use multi-GPU.
I think it comes down to time versus reward. Sadly, the extra time needed to do these things won't generate enough extra sales to make it worthwhile, imo.
Well, if you paid £100+ for a game, I'm sure they would spend more resources on squeezing all possible graphics candy out of the hardware.
But because the huge majority of the player base won't pay any more for a game than they will for Nvidia's pumped-up high-end prices, developers won't do that.
The three big warning signs for me this time around are:
1. Nvidia selling ray tracing performance improvements over general rendering/rasterisation
2. Price movement on the tiers relative to tangible performance gains
3. Delaying a Ti high end part unnecessarily
To be honest, I think the console ports are a joke. They make them look worse than on consoles, when the PC could do a hell of a lot better. They should be ashamed of themselves. And the prices of the games themselves are a joke...
Console games are like printer cartridges, sold to cover the licence/tax of the platform.
It's not necessarily about cost, though. You can build a game that taps into whatever resources are available - think Ashes of the Singularity, or a tool like Blender. Now that the GPU makers have shifted the responsibility onto the game makers to utilise the hardware, you could argue the reason we're not seeing good scaling and monster performance improvements is narrow thinking and an industry mindset of churning projects out as fast as possible to move on to the next one.
Just as multi-core CPUs are finally catching on (unfortunately adoption does take time), the GPU scene is going to revisit this at some point. To me, though, the whole point of multi-GPU was not just for the big spenders who got SLI Titans because they could; it was about plopping in another GPU of the same model a year or two down the line, once they were much cheaper, to potentially double your candy.
Rumour has it that Nvidia's Hopper architecture, which is supposed to come after Ampere, will be an MCM (multi-chip module) architecture. If that's the case, I think it will work completely differently from current SLI, though I'm not sure how. But Nvidia have plenty of very clever people who are probably working on it already, so I'm sure they'll sort it out.
Same, I have a £500 limit. I may go over slightly, but it would have to be for something special. I'm hoping the 5800 will be around the £500 mark with a decent increase over a 2070 Super.
Exactly, devs are lazy. I think stagnation in the GPU market is actually good for the gaming industry: it means developers have to spend more time optimising for the GPUs available. Instead of thinking "so our code requires X GPU, so be it, whatever", they'll now have to spend months and months getting it looking just as amazing on the mainstream GPUs available.
My second Zotac card and very impressed, in truth. Good cooling, quiet, and it overclocks fairly well. I'm becoming a fan of Zotac.
Hopefully, although realistically my 5700 XT is very good at 1440p and I can't see myself going to 4K anytime soon. I may hold out for the 6000 series - 7nm EUV?
I'm on a Vega 56, which doesn't quite have the legs for a solid 1440p experience. I was tempted by the 5700 XT, but I'm not sure it offers enough of a jump to justify the cost.
I agree, especially number 3. I think Nvidia will likely just release the Ti alongside all the other cards now, though, as the likelihood of a 3080 blowing the 2080 and 2080 Ti out of the water is quite low as long as AMD don't bring anything to the table.
Their pricing strategy was spot on last gen (sadly for us) and probably maximised the number of 2080 Ti owners, to the extent that it's now pretty much normal to have a 2080 Ti on enthusiast forums, whereas before it was still a bit of a rarity.
Yeah, it sounds likely. It's a shame we've had regression over the past couple of years because nobody wants the hot potato of doing the donkey work to get it working as it should.
Considering they're using TU104 chips for even the 2060 now (the same chip as the 2080, but obviously with parts disabled), I'd say the answer to the title question is a definite yes.