NVIDIA ‘Ampere’ 8nm Graphics Cards

Saved! Doesn't look too hot, eh?

[screenshot: YrUrM7N.png]


That is awful. So RTX is still at the beta-testing stage and generally not worth it, even on their new high-end cards, which are claimed to be double the performance?

67 FPS at 1440p is just not acceptable IMO.

This video makes me think I'll turn RTX off at 4K... and if I'm still turning off these high-end options, why upgrade the GPU when the current 2080/2080 Ti probably hits over 60 FPS with RTX off too?
 
I don’t know. How do other cards perform in this title?


From that screenshot, it's clear that on the 3080 it's better to turn RTX off. If that's the case, the real question is: how do other cards perform with RTX off?

Simply put, RTX still seems fairly half-baked and not worth the FPS hit... which adds more fuel to the fire: why go for NVIDIA if RTX is better off disabled and we just grab the extra FPS?
 
That is awful. So RTX is still at the beta-testing stage and generally not worth it, even on their new high-end cards, which are claimed to be double the performance?

67 FPS at 1440p is just not acceptable IMO.

This video makes me think I'll turn RTX off at 4K... and if I'm still turning off these high-end options, why upgrade the GPU when the current 2080/2080 Ti probably hits over 60 FPS with RTX off too?

Good point!
 
That is awful. So RTX is still at the beta-testing stage and generally not worth it, even on their new high-end cards, which are claimed to be double the performance?

67 FPS at 1440p is just not acceptable IMO.
And that's with DLSS on. But I wouldn't judge it TOO harshly, because that's in UE4, which can lead to results like the one below...
Devs still have to optimise the settings for visuals vs. performance; just slapping RT on and letting it go will simply tank performance. (Though to be clear, in the screenshot below it's not the devs' fault, that's just turning everything on through the console, which luckily you can do in UE4.)

[screenshot: 9irY9wE.jpg]
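
(For anyone curious, these are the sort of UE4 console variables involved — going from memory here, so exact names and defaults may vary by engine version:)

    r.RayTracing.Reflections 1
    r.RayTracing.Shadows 1
    r.RayTracing.GlobalIllumination 1
    r.RayTracing.AmbientOcclusion 1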
 
Exactly. Anyone find they get better results than review sites? (I know Gregster said so the other day.)

That could be an early version of that game/benchmark, and it could run like poop on my 1000 series or your 2000 series.

I would go by what my games run at now, and what they'll run like if/when a 3000-series card is fitted.
 
Good point!


Honestly, I think this launch is going to be more underwhelming than people expect. The big red flag for me is NVIDIA's pricing structure. They always bend us over... but this time they've released early.

Are they scared of AMD?

If ray tracing still isn't fully possible without destroying FPS, then the RTX 3080 is nothing more than another beta-testing product for ray tracing, like my 2080. I've already been stung once by this rubbish.
 
Why are a lot of people going for the 3090? What's nice about that card? Theoretically, of course, as we don't have any valid data. I can see just more VRAM, nothing more.
 
So much speculation. I don't see why people get upset over leaked images. Just wait two days and you can make up your minds as to whether the upgrade is worth it.
 
Not sure I'm reading the above correctly, but on a 2080 Ti with all the bells and whistles it's 7 FPS, and on a 3080 it's 67 FPS?

EDIT: NVM, one was 1440p and the other was 4K.
 
Why are a lot of people going for the 3090? What's nice about that card? Theoretically, of course, as we don't have any valid data. I can see just more VRAM, nothing more.

On paper it should be 20% faster, maybe more.

It doesn't have crippled memory like the 3080.

They use the best chips, not the rejects used in the 3080. In theory they should be much more efficient and potentially boost up to 10% higher.

Cooling on the 3090 is far superior to the 3080.

epeen.
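
A quick back-of-envelope on that "20% faster on paper", using the announced figures I've seen quoted (10496 CUDA cores on the 3090 vs 8704 on the 3080, plus 24GB vs 10GB of GDDR6X on a 384-bit vs 320-bit bus — treat these as approximate):

    # rough shader-count ratio, before clocks, bandwidth or power limits come into it
    print(10496 / 8704)   # ~1.206, i.e. roughly 20% more shaders on paper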
 
Why are a lot of people going for the 3090? What's nice about that card? Theoretically, of course, as we don't have any valid data. I can see just more VRAM, nothing more.

For me: Cinema 4D/Octane Render, UE4, and a few other programs. I game too, but if that were my only focus I would stick with my 2080 Ti.
 
Not sure I'm reading the above correctly, but on a 2080 Ti with all the bells and whistles it's 7 FPS, and on a 3080 it's 67 FPS?

No, the 2080 Ti is running at true 4K. The 3080 isn't even rendering at 1440p; it's using DLSS to get to 1440p.

You would need to see a 2080 Ti running DLSS at 1440p to compare. I suspect the difference will be smaller than you think.
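
To put rough numbers on that, a quick Python sketch using the commonly quoted DLSS 2.0 scale factors (my assumption, not anything from the leak):

    # internal render resolution before DLSS upscales to the output resolution
    SCALE = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

    def internal_res(out_w, out_h, mode):
        s = SCALE[mode]
        return round(out_w * s), round(out_h * s)

    print(internal_res(2560, 1440, "Quality"))  # (1707, 960): "1440p DLSS" ~ a 960p render
    print(internal_res(3840, 2160, "Quality"))  # (2560, 1440): "4K DLSS" ~ a 1440p render

So a "1440p DLSS Quality" figure is really a ~960p render, which is why it can't be compared directly against the 2080 Ti's native 4K number.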
 
Honestly, I think this launch is going to be more underwhelming than people expect. The big red flag for me is NVIDIA's pricing structure. They always bend us over... but this time they've released early.

Are they scared of AMD?

If ray tracing still isn't fully possible without destroying FPS, then the RTX 3080 is nothing more than another beta-testing product for ray tracing, like my 2080. I've already been stung once by this rubbish.

I'm on the same page as you. Just want to see the normal rasterisation performance. DLSS 2 is fine, but the 'better than 4K native' thing just doesn't sit right with me. And this silliness of delaying reviews is almost laughable. Summat ain't working right.
 
RTX 3080 is nothing more than another beta-testing product for ray tracing, like my 2080

Yes, that's how it's going to be; the 2% minimum FPS especially will be so bad that it won't make any sense.
If you ask me, ray tracing as it has been applied in DX12 is a passing fad. The real application of ray tracing is the ability to perform mathematically accurate scene culling. In future, scene complexity will be so high that the precise culling performed by ray tracing will save processing effort higher up the rendering pipeline, and thus deliver greater performance than the rasterisation approach.
The way it has been deployed in DX12, it's just a special effect. All those special effects can be mimicked without ray tracing, and most folks playing a fast-paced FPS might not even notice the difference.
So just buy a card for traditional raster performance. I seriously doubt the level of scene complexity needed for ray tracing to make sense will be reached in the foreseeable future.
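
To illustrate the culling idea, a toy Python sketch (my own illustration with made-up geometry, nothing to do with how DXR actually works): anything a primary ray never hits can be skipped entirely further down the pipeline.

    import math

    # scene: circles as (cx, cy, radius, object_id)
    scene = [
        (5, 0, 1, "A"),   # near object
        (10, 0, 3, "B"),  # big object, partly behind A
        (8, 4, 1, "C"),   # off to the side
        (20, 0, 1, "D"),  # sits fully in A/B's shadow -> culled
    ]

    def first_hit(ox, oy, dx, dy):
        """Id of the nearest circle hit by a unit-direction ray, or None."""
        best_t, best_id = math.inf, None
        for cx, cy, r, oid in scene:
            fx, fy = ox - cx, oy - cy
            b = fx * dx + fy * dy          # ray-circle intersection terms
            c = fx * fx + fy * fy - r * r
            disc = b * b - c
            if disc >= 0:
                t = -b - math.sqrt(disc)   # nearest intersection distance
                if 0 < t < best_t:
                    best_t, best_id = t, oid
        return best_id

    visible = set()
    for i in range(200):                   # one ray per "pixel" across a 90-degree view
        ang = math.radians(-45 + 90 * i / 199)
        hit = first_hit(0.0, 0.0, math.cos(ang), math.sin(ang))
        if hit:
            visible.add(hit)

    print(visible)  # {'A', 'B', 'C'} -- D never needs shading or submission

Rasterisation would have to submit D and throw its pixels away later; the ray cast never produces it in the first place.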
 
I'm on the same page as you. Just want to see the normal rasterisation performance. DLSS 2 is fine, but the 'better than 4K native' thing just doesn't sit right with me. And this silliness of delaying reviews is almost laughable. Summat ain't working right.


"Are they hiding something?" springs to mind, lol.
 