So grim, I take it you're not getting a 5090 then...
Been trying to get him to admit that for a while.
You can tell he has been sour for a while now.
Hilarious really, as he was the one pushing DLSS 1.0, which was tripe. Lol
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
Of course he will get one.
Don't care what anyone says about the 5080, but checking eBay, all the people trying to sell a 4090 for 1.3-1.5k are mad… you'd be mad to spend even a grand, in my opinion, on a 4090 unless you need the 24GB of VRAM.

If you bought at launch and sold last month it would have been almost a free 4090 for two years. They held their value well.
This puts the 5080 at 10% faster than the 4090, and the 5070 Ti 10% faster than the 4080. Far Cry 6 is also heavily bottlenecked on CPU single-thread performance, so I think the Plague Tale numbers are more relevant.
The 5070 looks as fast as the 4070 Ti. Looks like a really nice generation for the price. I imagine the 5070 will become the most popular GPU on Steam within a few months.
Hard to find numbers for these old games to verify; however, I do note from the presentation that Nvidia claims 2x RT performance per core for the RTX 5000 series. And note that both games you mentioned use RT, so pure raster numbers could be very different.
Roll on end of January for the chaos...
End of January sales? Yay, a 5090 for 1k please.
Like you've said, for the games YOU play it makes no difference. Whereas for me it would make a big difference, as I own a 4K 240Hz monitor.
I can't see a 4090 for sale since the announcement, unless I'm missing something? I do wonder how many cores the GB203 has. With such a large disparity in core count, a 5080 Ti would need to use a largely crippled GB202, I would imagine, potentially disabling fully functional units? Unless they can squeeze more out of the GB203?

You haven't checked MM.

Wish I could check it; not a big enough post count yet, despite being a member for years.
Now that we have full specs, we can analyse how much silicon Nvidia is not selling you.
Is multi-frame generation a crutch to make up for the fact that Nvidia is trying to sell you less silicon to make more profit? Well, let's have a look.
I will compare the core counts of various models across three generations to see the trend. All cards are compared against the top chip for that card's architecture, as presented by Nvidia in its architecture whitepaper.
RTX 3070: you get 55% of the full GPU die
RTX 3070 Ti: 57%
RTX 3080: 83%
RTX 3090: 97%

RTX 4070: you get 38% of the full GPU die
RTX 4070 Ti: 42%
RTX 4080: 53%
RTX 4090: 90%

And now for the RTX 5000 series:

RTX 5070: you get 24% of the full GPU die
RTX 5070 Ti: 35%
RTX 5080: 45%
RTX 5090: 82%
Well, the trend is fairly obvious: Nvidia is selling you a smaller and smaller GPU every generation. A 70-series card now gets you half the GPU it used to, Nvidia makes more profit, and games get less relative performance than before.
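For anyone who wants to check the method, it is just a card's shader count divided by the full die's shader count. A minimal sketch, using the commonly quoted CUDA core counts (my assumption; the percentages above may have been derived from SM counts or another source, so expect small differences):

```python
# Sketch: fraction of the full die a card gives you, by CUDA core count.
# Counts below are the commonly quoted shader counts and are illustrative only.

FULL_DIE = {
    "GA102 (Ampere)": 10752,
    "AD102 (Ada)": 18432,
    "GB202 (Blackwell)": 24576,
}

CARDS = {
    "RTX 3070": ("GA102 (Ampere)", 5888),
    "RTX 4080": ("AD102 (Ada)", 9728),
    "RTX 5080": ("GB202 (Blackwell)", 10752),
    "RTX 5090": ("GB202 (Blackwell)", 21760),
}

for card, (family, cores) in CARDS.items():
    share = cores / FULL_DIE[family]
    print(f"{card}: {share:.0%} of the full {family} die")
```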
Does a 240Hz monitor HAVE TO run at 240Hz, or does running at something 'rubbish' like 120Hz magically make it stop working? Are you playing ultra-competitive FPS deathmatch where every 0.0001 milliseconds counts? Do you work out every day and do reaction testing and training like an F1 driver does, to take advantage of that fraction of a thousandth of a second?
Or are you just trying to justify buying one BECAUSE you can run at 240Hz... when in reality it won't make much difference? lol
I'll be getting a 4K 240Hz OLED display soon... purely because that's an actual upgrade, with enormous colour accuracy and 'infinite contrast'. So that's the reason; I'll still game at a lowly 100-144Hz though. Not sure how I'll cope, pray for me.
There is a misconception that high frame rate is about reaction time and is only for competitive players. This couldn't be further from the truth. It is about motion clarity. At 120 fps your 4K resolution drops to a far lower perceived resolution during anything over a few inches per second of motion. If you agree 4K is better graphics than 1440p, well, that also applies to motion. Unfortunately, due to current sample-and-hold tech, we need around 1000 fps for the near-perfect motion clarity that many of us experienced for years with a decent CRT, which didn't need anywhere near 1000 fps. Each noticeable jump is a doubling of the previous one, e.g. 60-120-240-480 will be noticeable each time.
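To put some rough numbers on the motion clarity point: with sample-and-hold, the smear you perceive is roughly motion speed multiplied by how long each frame is held on screen. A back-of-envelope sketch, assuming a ~140 PPI 4K panel and full persistence (no strobing); the figures are illustrative, not measurements:

```python
# Rough sample-and-hold blur estimate: blur width ~= motion speed x frame hold time.
# Assumptions: ~140 PPI panel (roughly a 32" 4K), full persistence, no BFI/strobing.

PPI = 140                              # assumed pixels per inch
SPEED_INCHES_PER_SEC = 4               # modest on-screen motion speed
speed_px = SPEED_INCHES_PER_SEC * PPI  # ~560 px/s of on-screen motion

for fps in (60, 120, 240, 480, 1000):
    blur_px = speed_px / fps           # pixels of smear while each frame is held
    print(f"{fps:>4} fps: ~{blur_px:.1f} px of blur")
```

Each doubling of the frame rate halves the smear, which is why 60-120-240-480 are each a visible step, and why it takes something like 1000 fps before the blur drops under a pixel at that motion speed.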
Is the 5070 likely to be capable of 4K 60fps with acceptable DLSS settings? Or is it going to be a blur and AI glitchfest?
So is anyone actually buying these new cards? What games do you play day in, day out that you can't run at all, or can barely run?

This year I've had to use third-party frame generation mods or Lossless Scaling to play the games I want to play at 4K at acceptable frame rates.