
5070 = 4090 for $499.

I've heard more than one rumour that it's 5-10% at best; it's basically a refresh with some "features" added.
Can't remember whose video it was, but he made a very good point that the 40xx SUPER cards are the price-point comparison for the new 50xx cards, not the Ti cards as the naming suggests. When you compare like for like on that basis, i.e. 5070 Ti and 4070 SUPER, the CUDA cores, RAM speed etc. are basically identical, so the 50xx series is purely a feature bump.
 
Whoever thinks they will get 4090 performance out of a 5070 is delusional.

5070 = 6144 cores
4090 = 16384 cores

CUDA cores do all the hard work in a GPU, period.
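For a rough sense of scale, the raw core-count gap is easy to check (a naive sketch; real performance also depends on clocks, memory bandwidth and architecture, so this is only a ballpark, not a benchmark):

```python
# Published CUDA core counts, as quoted above.
cores_5070 = 6144
cores_4090 = 16384

# Naive ratio: a ballpark upper bound on the raw gap, since real
# workloads don't scale linearly with core count alone.
ratio = cores_4090 / cores_5070
print(f"The 4090 has {ratio:.2f}x the CUDA cores of the 5070")  # -> 2.67x
```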

They might pull it off with fake AI frames and whatever illusions Nvidia is trying to pull off - fake hype with clever marketing.

It will be funny to see all the 5070 buyers trying to render and wondering why they are not getting 4090 performance on anything without DLSS 4.

Also, a 4080 literally struggles with 4K playback; in Resolve it tanks hard, while a 4090 manages just fine. So I want to see a 5070 play back 4K smoothly without any downscaling to a lower res - if it does, I'll eat my hat.

Basically, for a general gamer a 5070 will work and do its tasks with fake AI frames buffing up the numbers.

For a professional, a 5070 will be a disappointment if they think they will get 4090 performance on 4K render playback.
 

Which is why the only card in the entire lineup worth thinking about is the 5070 Ti, unless you're insane/rolling in cash.

It's a 5090 or a 5070 Ti, or don't bother, imo.
 
 
If DLSS frame gen creates 10 fake frames for every one actual frame, they could market it as 10x faster than previous-gen GPUs.
I still recall seeing computers in big stores years ago advertising an "8GHz CPU!" Then it turned out it was 4 cores at 2GHz each... Not much different, IMHO. :) I can bet a lot of normies fell for that, though.
 
4x2 is still 8 GHz though..

25% real plus 75% make-believe isn't the same thing.

What is the GPU doing in reality?

Is it rendering 1 frame, ignoring 3 draw calls and putting in its own 3 predicted frames?
In essence it's cheating, and probably refusing to render real frames to boost performance artificially.

There's obviously some frame pacing going on, right?

It's not rendering all the frames it can; they will be purposely ignoring real frames so they can insert 3 fake ones for probably the same cost.
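The arithmetic being speculated about above can be sketched like this (a hypothetical model of multi frame generation for illustration, not Nvidia's actual pipeline):

```python
# Hypothetical sketch: for every frame the GPU really renders, frame
# generation inserts (factor - 1) predicted frames before display.
def displayed_fps(rendered_fps, fg_factor):
    # fg_factor is 2, 3 or 4, matching the selectable DLSS 4 modes
    return rendered_fps * fg_factor

# 30 real fps shown as "120 fps" with 4x frame gen; the game still
# reacts to input only at the 30 rendered frames per second.
print(displayed_fps(30, 4))  # -> 120
```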
 
They claim frame pacing is actually fixed with DLSS 4 - admitting it was broken in 3.5. Interesting that they never mentioned this before the 5000 series premiere :p Also, they say it's "up to" 3 additional frames, which means it won't necessarily be 3 frames - it could be 1 instead - so it's not even consistent like that. Sounds like one big scam really, but I see more and more comments online from people who really believe that 5070 = 4090 rubbish... It's sad.
 

You can choose 2x, 3x or 4x frame gen.
You can see it in the clips they're showing of it. You don't need to select 4x if you don't want to... Maybe 2x becomes an acceptable balance for people. Time will tell.
 
At $499 it'll get rave reviews. That's the 3070's price over 4 years later.

It'll be compared to the 4070; it says starting at £539, and that's the FE, still 12GB, probably matching the 4070S.

Third-party reviews will mainly focus on pure performance. I can see the memes when they talk about the 4090 performance claim.
 

Okay, fair - the price is $549. It's still gonna get good reviews.

It's $50 cheaper than the 4070 SUPER, and will likely beat it convincingly.

The 4070 Ti SUPER is $799 and a 285W card. The 5070 is a 250W card, so if they've improved perf/watt by at least 14% then it will match it.
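That perf/watt figure falls straight out of the power ratio; a quick back-of-envelope check (board powers as quoted above, equal-performance target assumed):

```python
# Board powers from the post above.
tdp_4070_ti_super = 285  # watts
tdp_5070 = 250           # watts

# To equal the 285W card's performance at 250W, perf-per-watt
# must improve by the ratio of the two board powers.
needed = tdp_4070_ti_super / tdp_5070 - 1
print(f"perf/watt improvement needed: {needed:.0%}")  # -> 14%
```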
 
I think we'll be lucky if the 5080 comes near 4090 performance in rasterisation.
I think Nvidia will want the 5080 to fit under the US export restrictions, so 4090D-level or slower is what I expect.

I also think uncorking the 256-bit bus with faster GDDR7 will give the 5080 a decent bump over the 4080(S), even with almost no increase in cores.
 
I feel people are being out of line and judgemental just because somebody likes the idea of frame gen giving 4090 frame rates to lesser 50 series cards...

Frame gen can feel bad to some and fine to others.
I thought I would find it bad, but I honestly like it on my 4090. I don't notice any lag or glitches from it and generally find it does what it's supposed to do.

Also, screaming "fake frames, whaaa". Grow up.
What frames aren't fake? Fake pre-baked lighting, fake bump mapping, procedural landscape generation, fake upscaled textures. Every frame is full of fake elements, so claiming frame gen is fake frames and acting like it's a deal-breaker is straight-up backwards. If somebody is not negatively affected by it and they get more enjoyment from their games because of it, why is it a problem?
I think you're missing the point. Nvidia are claiming a 5070 is better than a 4090, but that simply isn't true. Most people can tell the difference between the input lag at 30fps, 60fps and 120fps. Just because a little counter in the corner says 240fps doesn't actually make the game feel more responsive, so the CUDA core count / rasterisation performance is the key metric; how high the counter reads isn't as relevant if the input lag isn't being reduced.

The input lag on the 4090 will be considerably lower than on the 5070 at the same settings, regardless of what the little counter in the corner says.

The only number we have from Nvidia at the moment is the little counter, not the relative input lag between the two cards. The numbers we do have are completely disingenuous, so it's not possible to make a value judgement.
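The latency point can be put in rough numbers (illustrative figures, not measurements from either card): input-to-screen delay is floored by how long a real frame takes, no matter how many generated frames pad the counter.

```python
# Illustrative sketch: the latency floor is set by the real render rate,
# not by the displayed (frame-generated) fps.
def real_frame_time_ms(rendered_fps):
    return 1000.0 / rendered_fps

# A card rendering 120 real fps vs one rendering 40 real fps that the
# counter reports as "160 fps" via 4x frame generation:
print(real_frame_time_ms(120))  # ~8.3 ms between real frames
print(real_frame_time_ms(40))   # 25.0 ms between real frames
```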

Having said that, I am planning on getting a 5090, simply because I currently have a 3080 with 8,700 CUDA cores and the 5090 has over 20k, so regardless of anything else it will obviously be a decent upgrade from what I have. But if I had a 4090 I wouldn't be upgrading to anything in the 5000 series, and I certainly wouldn't be worried about having to sell it for less than £700 "because a 5070 is faster", because that clearly isn't the case.
 