Caporegime
- Joined: 18 Oct 2002
- Posts: 30,764
I'm glad we agree, the 4090 is the only obvious choice.
Anything under 40% is a useless generational improvement IMHO.
But mainstream dGPU buyers like me and everyone I know are constrained by budget. I know of nobody who would spend £300 on a dGPU who will now spend £600. So ultimately there is no point upgrading if the improvement is minimal. No wonder dGPU sales have collapsed.
Shame the 6700XT refuses to hit £300, it'd be a damn good buy at that for the mid-range.
The RX6700 10GB and RX6700XT 12GB have been available for between £300 and £360. But because Nvidia prices everything so high, even AMD's price reductions are not massive.
TBH at this point in their life cycle the 3080 and 6800XT cards should be going for around £300. I'd be happy with 6800XT performance, just not at its current price.
Yeah, it's not happening, but MSRP for tech that's over two years old and on the verge of being replaced by its successors (hopefully) takes the ****
The generational uplift is there. The RTX4080 is around 75% faster than the RTX3070TI at QHD.
The RTX4070TI is around 76% faster than an RTX3060! The RTX4090 is around 65% faster than the RTX3090 at 4K.
The RX7900XT is around 50% faster than an RX6900XT. The big problem is that instead of giving users a big generational uplift for the tiers under the RTX4090, they simply used it to jack up prices.
The RTX3090 was around 56% faster than the RTX2080TI at 4K.
The RTX3080 was around 68% faster than an RTX2080 at 4K, looking at the above TPU figures.
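Just to show where those percentages come from when you read them off relative-performance charts, here's a rough sketch; the 57% and 61% relative scores below are illustrative placeholders rather than exact TPU numbers:

```python
# Rough sketch of how the uplift percentages fall out of relative-performance
# numbers (TPU-style charts). The relative scores below are placeholders, not
# values pulled from the actual charts.

def uplift(new_perf: float, old_perf: float) -> float:
    """Percentage by which new_perf exceeds old_perf."""
    return (new_perf / old_perf - 1.0) * 100.0

# If a chart normalises the newer card to 100%, the older card's relative
# score gives the uplift directly, e.g. a 3070 Ti at ~57% of a 4080:
print(f"4080 vs 3070 Ti: {uplift(100, 57):.0f}% faster")  # ~75%
print(f"4090 vs 3090:    {uplift(100, 61):.0f}% faster")  # ~64%
```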
The names are meaningless. Nvidia can scribble anything they want on the box.
If they can't provide ~30% generational improvement at a given price point, it's a fail.
Allowing for inflation, and considering that today's $599 is not worth as much as it was when Ampere launched, let's see if Nvidia can manage a 25% improvement over the 3070Ti.
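To put a rough number on the inflation point (the CPI figures here are assumed for illustration, not looked up):

```python
# Back-of-the-envelope CPI adjustment for the $599 figure. Both CPI values
# are assumed, illustrative numbers -- substitute real CPI data for an
# exact answer.

msrp_at_launch = 599.0   # launch-era MSRP in dollars
cpi_then = 260.0         # assumed CPI around the Ampere launch window
cpi_now = 300.0          # assumed CPI in 2023

adjusted = msrp_at_launch * (cpi_now / cpi_then)
print(f"${msrp_at_launch:.0f} then is roughly ${adjusted:.0f} in today's money")  # ~$691
```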
Which is now the issue for the mid-tier and lower. You're getting nowhere near enough gains for the gen shrink, at least not at these prices. Turing looks like losing its title as the worst release...
People are still holding out for an EOL deal similar to the 980Ti deal they got back in the day.
I'm just talking about advancement from a technology standpoint, not "Is it worth upgrading?"
Anything under 40% is a useless generational improvement IMHO.
Now that VR is the most difficult load I put on a GPU, I want something like >50% before I'm tempted to upgrade. (Although I don’t expect it at last-gen's same price point if the jump is that big)
I get no benefit over 90FPS, but avoiding dropped frames and keeping my minimum frame rate comfortably over 90FPS is valuable to me. I almost totally ignore average FPS these days and look at 1% and 0.1% lows.
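For anyone curious how the 1% and 0.1% lows are usually derived, here's one common approach from a frame-time log; the file name and one-value-per-line format are just assumptions, and reviewers' exact methodologies differ:

```python
# One common way to compute 1% / 0.1% lows from a frame-time capture:
# take the 99th / 99.9th percentile frame time and convert back to FPS.
# "frametimes.csv" (one frame time in ms per line) is an assumed input;
# reviewers use differing methodologies, so treat this as a sketch.
import numpy as np

frame_times_ms = np.loadtxt("frametimes.csv")

avg_fps = 1000.0 / frame_times_ms.mean()
low_1_fps = 1000.0 / np.percentile(frame_times_ms, 99)     # 1% low
low_01_fps = 1000.0 / np.percentile(frame_times_ms, 99.9)  # 0.1% low

print(f"avg: {avg_fps:.1f} FPS, 1% low: {low_1_fps:.1f} FPS, 0.1% low: {low_01_fps:.1f} FPS")
```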
The title of the video says it all...
8gb should be banned on £200+ cards in 2023. Hope reviewers can the 4060s.
They are trying to make it look like they can "get away with it." But we don't know if they actually are.
The generational uplift is there, if you normalise tier to tier. You are talking about 65% to 75%, but Nvidia just decided to jack up pricing at the same time, so it actually looks poorer than it is.
So at this point, even if you do increase your budget, for many of us mainstream dGPU buyers it looks a poor generation. AMD is also realising they can get away with pricing their new generation equivalents only slightly better than Nvidia, too!
I think hiding in bushes was intended by the developer. Using the in-game settings is fine, but when you do scummy things like that to gain a massive advantage over others, anyone doing it should've been permabanned imo.
Boo this man, "boooooooooooooooo". I've changed settings for clarity like in Battlefield games but I don't go beyond what everyone else can use as I don't want a massive advantage in my favour.
8gb should be banned on £200+ cards in 2023. Hope reviewers can the 4060s.
It's interesting that the settings menu at 3m26 suggests 11.4GB usage for the 4090 (along with 4.8GB of OS & other apps?!) whereas his VRAM page (done on a 6950XT) suggests 14.6GB usage.
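If you want to sanity-check what's actually allocated rather than what the in-game menu estimates, something like this works on the Nvidia side (assumes the pynvml / nvidia-ml-py package; allocation still isn't the same as what the game truly needs, and the 6950XT would need a different tool):

```python
# Sketch: read the VRAM actually allocated on an Nvidia card via NVML,
# to compare against a game's settings-menu estimate. Requires the
# pynvml (nvidia-ml-py) package; note "allocated" != "actively needed".
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)      # first GPU in the system
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"VRAM used: {mem.used / 1024**3:.1f} GiB of {mem.total / 1024**3:.1f} GiB")
pynvml.nvmlShutdown()
```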
I understand to a certain level; in the very first CoD games we'd use certain console commands to remove the 91 FPS lock and set it higher: 125, 250, 333, 500 & 1000. 125 was so you could jump to certain levels, 333 gave the highest jumps, but 500 & 1000 would remove every 2nd or 3rd step noise and you'd be silent climbing ladders, so most just used 125-333. We openly talked about it on the servers though, so most people knew and used them too. I'd just feel dirty having a visual advantage like that but, as you said, you were a teenage mutant ninja troll at the time. At least there was a more even playing field in the league matches.
Everybody has access to the console commands, just needed to google it and be okay with playing at barf settings.
It was not long before servers and leagues started specifying what the minimum settings were and whether changing settings via console commands was allowed, because it made a massive difference in terms of where people could hide and how much cover was available.
In the initial leagues I played in everybody was doing it, so it was at least equal on that front, but when I went into public games between matches then yes, I did have a huge advantage. I remember someone trying to hide in a bush opposite a bunker thinking they were really well concealed (which they would have been at normal settings), but with my settings it was like they were stood out in the open. They were so confused when I kept shooting them.
Also you can booo me all you want. I was a teenager back then and a bit of a troll so I just found it really funny at the time.
The title of the video says it all...
The Last of Us Part I, RIP 8GB GPUs! Nvidia's Planned Obsolescence In Effect
Finally a review site uses that term... the one most of us here have been using since day one with Nvidia's games.