
Yeah, I thought that too. If that was a 2080 Ti running at 80 fps, it must have been frame capped. Mine was over 100 fps at 4K, I think. Maybe they did gimp it.

Yep, it's looking like that. From what I've researched, there's no rasterisation performance increase whatsoever over a 2080 Ti, just the other cores they've added for compute stuff that's irrelevant to gaming.

I'll wait for the numbers now, but to me it's not sounding as good as they're trying to make everyone believe.
 
Just watched this, for anyone interested in the 3080.
Must admit I am interested, but having watched it, he makes some good, valid points.

Not dissing the 3080, after all it's half the price of a 2080 Ti, but is it real progress two years down the line, other than extra RT cores and CUDA cores?

https://youtu.be/b7nztmvF2f4

Just to add some of his basic points: the CUDA cores have no relevance to gaming because of the way they are being used, so cut those numbers in half.

And the Digital Foundry video of Doom Eternal comparing the 2080 Ti vs the 3080 is sponsored by Nvidia; people reckon the 2080 Ti's numbers have been fudged in favour of the 3080.
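
For anyone wondering where the "cut them in half" argument comes from, here's a rough back-of-envelope model of how the Ampere SM issues work. The layout (a 64-wide FP32-only datapath plus a 64-wide datapath that does either FP32 or INT32 each clock) is from Nvidia's own Ampere material; the ~36 INT32 ops per 100 FP32 ops mix is Nvidia's oft-quoted figure for game shaders, so treat the numbers as illustrative only:

```python
# Rough model of an Ampere SM's two datapaths (illustrative only).
# Per SM: a 64-wide FP32-only path, plus a 64-wide path that runs either
# FP32 or INT32 on a given clock. The instruction mix is an assumption.

FP_ONLY = 64           # FP32-only lanes per SM per clock
SHARED = 64            # FP32-or-INT32 lanes per SM per clock
INT_PER_100_FP = 36    # assumed game-shader mix (varies per title)

int_frac = INT_PER_100_FP / (100 + INT_PER_100_FP)

# Max instructions per clock: either every lane is busy, or INT32 work
# saturates the shared path first, whichever limit binds.
total_per_clk = min(FP_ONLY + SHARED, SHARED / int_frac)
fp_per_clk = total_per_clk * (1 - int_frac)

print(f"Effective FP32 ops/clock/SM: ~{fp_per_clk:.0f} (peak is 128)")
print("A Turing SM sustains 64 FP32 ops/clock alongside its INT32 path.")
```

Under that mix you get roughly 1.5x a Turing SM's FP32 issue rate rather than 2x, which is why people argue over how the new cores should be counted.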

Complete BS.

If an SM has double the FP32 shader processors then it has double the shader processing power (although nothing scales linearly).
 
If you think doubling the shader processors has no relevance to gaming, you don't have a clue what you are talking about.
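
On paper it's simple arithmetic. A minimal sketch using the reference spec-sheet shader counts and boost clocks (nominal figures; sustained clocks and real game scaling will differ):

```python
# Paper FP32 throughput = shader count * 2 ops (an FMA counts as 2) * clock.
# Reference boost clocks from the spec sheets; real cards clock differently.

def tflops(shaders: int, boost_ghz: float) -> float:
    """Theoretical single-precision TFLOPS."""
    return shaders * 2 * boost_ghz / 1000.0

print(f"RTX 2080 Ti: {tflops(4352, 1.545):.1f} TFLOPS")  # ~13.4
print(f"RTX 3080:    {tflops(8704, 1.710):.1f} TFLOPS")  # ~29.8
```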

I'm not just talking about shaders; do you think doubling shaders automatically doubles the whole performance of a GPU?

And I'm still waiting for an answer on a 2080 Ti only doing 80 fps at 4K in Doom Eternal.
 
I'm not just talking about shaders; do you think doubling shaders automatically doubles the whole performance of a GPU?

Is that what I said?

You said:

Just to add some of his basic points: the CUDA cores have no relevance to gaming because of the way they are being used, so cut those numbers in half.

How about you backtrack on that.
 
Just watched this, for anyone interested in the 3080.
Must admit I am interested, but having watched it, he makes some good, valid points.

Not dissing the 3080, after all it's half the price of a 2080 Ti, but is it real progress two years down the line, other than extra RT cores and CUDA cores?

https://youtu.be/b7nztmvF2f4

Just to add some of his basic points: the CUDA cores have no relevance to gaming because of the way they are being used, so cut those numbers in half.

And the Digital Foundry video of Doom Eternal comparing the 2080 Ti vs the 3080 is sponsored by Nvidia; people reckon the 2080 Ti's numbers have been fudged in favour of the 3080.
You get it, others don't/won't. Why pay for latest-gen performance when you need next-gen performance, just to say "I finally have 2080 Ti performance!"?
Shouldn't you say, or have a desire to buy, 3000 series performance... which is better than 2000 series performance?

If you plan on staying on a potato at 1080p, then by all means buy last-gen performance using this-gen uarch at a lower price. To everyone else: wait until we see some price wars between AMD and Nvidia so we can buy next-gen performance.
 
I don't need to backtrack. Just because it's got double the shaders does not mean double the performance, and neither does double the CUDA cores. Do you know what CUDA cores are used for?

You just told everyone to halve them.

So now you're preaching the idea that more shaders don't scale linearly? That is not what your post meant.
 
You get it, others don't/won't. Why pay for latest-gen performance when you need next-gen performance, just to say "I finally have 2080 Ti performance"?
Shouldn't you say you have 3000 series performance... which is better than 2000 series performance?

If you plan on staying on a potato at 1080p, then by all means buy last-gen performance using this-gen uarch at a lower price. To everyone else: wait until we see some price wars between AMD and Nvidia so we can buy next-gen performance.

Correct. Best to wait for a price war, I think.
 
You just told everyone to halve them.

So now you're preaching the idea that more shaders don't scale linearly? That is not what your post meant.

The point of the post was to not just believe a manufacturer telling you this, that and the other until you see real numbers in all games, not just a heavily optimised game like Doom Eternal. I've had Nvidia cards for the last four generations; that doesn't mean I always will.

And the guy in the vid owns a 2080 Ti, so why would he BS? Watch his Doom Eternal video that's been requested, which he's allegedly posting up tomorrow.
 
Just watched this, for anyone interested in the 3080.
Must admit I am interested, but having watched it, he makes some good, valid points.

Not dissing the 3080, after all it's half the price of a 2080 Ti, but is it real progress two years down the line, other than extra RT cores and CUDA cores?

https://youtu.be/b7nztmvF2f4

Just to add some of his basic points: the CUDA cores have no relevance to gaming because of the way they are being used, so cut those numbers in half.

And the Digital Foundry video of Doom Eternal comparing the 2080 Ti vs the 3080 is sponsored by Nvidia; people reckon the 2080 Ti's numbers have been fudged in favour of the 3080.

Yep, it was pointed out by a few YouTubers.

Let's look at the 2080 Ti vs the 3080 in all its rasterisation specs, not just the ones Nvidia announced. Yes, you have a CUDA increase that helps compute performance, but compute performance has never helped AMD in games and it isn't helping Nvidia much either. Games care more about the other raster hardware on the die - the ROPs and TMUs, both of which are hardly changed at all - and apart from the 3090 there has been no significant change in memory bandwidth.

Nvidia can scream about its "big CUDA core count" all it wants, but the truth is that the RTX 3080 is only faster because it has a ~20% IPC increase. However, Samsung's 8nm process results in lower clock speeds than Turing, so Nvidia decided to push ridiculous amounts of power into it just to try and beat Turing's clock speeds by a little bit.
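
To put rough numbers on that, here's a quick sketch using the commonly quoted reference specs (nominal boost clocks, so treat it as illustrative):

```python
# Spec-sheet raster figures: pixel fill = ROPs * clock, texture rate =
# TMUs * clock. Reference specs; partner cards clock higher.

specs = {
    #              ROPs  TMUs  boost GHz  mem GB/s
    "RTX 2080 Ti": (88,  272,  1.545,     616),
    "RTX 3080":    (96,  272,  1.710,     760),
}

for name, (rops, tmus, ghz, bw) in specs.items():
    print(f"{name}: {rops * ghz:5.1f} Gpixel/s, "
          f"{tmus * ghz:5.1f} Gtexel/s, {bw} GB/s")
```

Same TMU count and only ~9% more ROPs; the raster-side gains are a long way from the headline doubling of the CUDA figure.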
 
You get it, others don't/won't. Why pay for latest-gen performance when you need next-gen performance, just to say "I finally have 2080 Ti performance"?
Shouldn't you say you have 3000 series performance... which is better than 2000 series performance?

If you plan on staying on a potato at 1080p, then by all means buy last-gen performance using this-gen uarch at a lower price. To everyone else: wait until we see some price wars between AMD and Nvidia so we can buy next-gen performance.

Sorry, had to PMSL at this. :)
 
Still happily running my Vega 56 Pulse, and in Horizon Zero Dawn MSI Afterburner is showing as much as 7 GB of allocated VRAM at medium settings and 1920x1200 resolution. Now I know that allocated VRAM does not directly mean actual use... but is anyone else worried that the upcoming RTX 3070 might be somewhat lacking with just 8 GB of VRAM?
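
On the allocated-vs-used point: overlays like Afterburner report what the driver has allocated, not the working set the game actually touches every frame. For what it's worth, here's a minimal sketch of reading the same allocation figure on an Nvidia card via the pynvml bindings (assumes the nvidia-ml-py package is installed; it carries exactly the same caveat as the overlay):

```python
# Read how much VRAM the driver currently has allocated on GPU 0.
# This is allocation, not the per-frame working set, so like Afterburner
# it tends to overstate what a game really needs.
import pynvml  # pip install nvidia-ml-py

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"Allocated: {mem.used / 2**30:.1f} GiB of {mem.total / 2**30:.1f} GiB")
pynvml.nvmlShutdown()
```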
 
Yep, it was pointed out by a few YouTubers.

Let's look at the 2080 Ti vs the 3080 in all its rasterisation specs, not just the ones Nvidia announced. Yes, you have a CUDA increase that helps compute performance, but compute performance has never helped AMD in games and it isn't helping Nvidia much either. Games care more about the other raster hardware on the die - the ROPs and TMUs, both of which are hardly changed at all - and apart from the 3090 there has been no significant change in memory bandwidth.

Nvidia can scream about its "big CUDA core count" all it wants, but the truth is that the RTX 3080 is only faster because it has a ~20% IPC increase. However, Samsung's 8nm process results in lower clock speeds than Turing, so Nvidia decided to push ridiculous amounts of power into it just to try and beat Turing's clock speeds by a little bit.

This is what's important to me, and should be to all gamers: the ROPs are the true indication, and also the power draw and power requirements. Never has a mainstream gaming GPU needed this sort of PSU, and I'm not talking about Titans. Like you said, they've obviously just unlocked as much power draw as it takes to look good at least matching a 2080 Ti.

Think I'll stick to 1440p with the 1080 Ti for now and see what happens in the first or second quarter of next year; that'll be interesting.
 
Still happily running my Vega 56 Pulse, and in Horizon Zero Dawn MSI Afterburner is showing as much as 7 GB of allocated VRAM at medium settings and 1920x1200 resolution. Now I know that allocated VRAM does not directly mean actual use... but is anyone else worried that the upcoming RTX 3070 might be somewhat lacking with just 8 GB of VRAM?
But... the Nvidia reveal was so awesome!!! I got free cookies on the hype train and I'm feeling pretty good about myself! Worried? What worry... :confused: /s
Chooo Choooooooo
 