• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

NVIDIA ‘Ampere’ 8nm Graphics Cards

Well, it must be a train wreck on Xbox then, because I found it near-on unplayable at any resolution or settings on my 3080.

I could not hold 60fps no matter what I tried, and even when it did reach 60fps it was very choppy.

The consoles target 30fps with dynamic resolution. It can drop as low as 1440p outdoors when driving fast, for example.

Also I'm guessing you are resisting using DLSS?
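For anyone curious, here's a rough sketch of how that kind of frame-time-driven dynamic resolution scaler typically works. It's purely illustrative, not Ubisoft's or the console's actual implementation; the 30fps budget, step size and 1440p floor are made-up numbers.

```python
# Minimal sketch of a dynamic resolution scaler (illustrative only).
TARGET_FRAME_MS = 1000 / 30       # 30fps budget, as on console
MIN_SCALE, MAX_SCALE = 0.66, 1.0  # roughly 1440p..2160p on a 4K output
STEP = 0.05

def update_render_scale(scale: float, last_gpu_frame_ms: float) -> float:
    """Nudge the internal render resolution up or down each frame
    so GPU frame time stays inside the 33.3ms budget."""
    if last_gpu_frame_ms > TARGET_FRAME_MS:          # over budget: drop resolution
        scale -= STEP
    elif last_gpu_frame_ms < TARGET_FRAME_MS * 0.9:  # comfortably under: raise it
        scale += STEP
    return max(MIN_SCALE, min(MAX_SCALE, scale))

# Example: a heavy frame while driving fast pushes the scale down.
scale = 1.0
scale = update_render_scale(scale, last_gpu_frame_ms=41.0)
print(f"new render scale: {scale:.2f}")  # -> 0.95
```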

This is actually slightly debatable. It seems to be working without problems for me, but I would need to test it a bit more, since I have an old CPU and I'm waiting for a new one before I play it properly.
These are max settings, with extra details at 100%, in native 4K, with RTX on max but without DLSS (which would lower VRAM consumption a bit).

[screenshot: RyjUGGD.png]

Do you get much higher frame rates with lower settings? Because that looks CPU bound to me. 30ms to 80ms CPU frame times, ouch.
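For context, frame time and frame rate are just reciprocals, so 30 to 80ms on the CPU caps you at roughly 33fps down to 12.5fps no matter what the GPU does. A quick back-of-the-envelope sketch (my own; the 45ms/20ms readings at the end are made up for illustration):

```python
def fps_from_ms(frame_ms: float) -> float:
    """Convert a frame time in milliseconds to frames per second."""
    return 1000.0 / frame_ms

# The CPU frame times quoted above bound the achievable frame rate:
for cpu_ms in (30.0, 80.0):
    print(f"{cpu_ms:.0f}ms CPU frame time -> at most {fps_from_ms(cpu_ms):.1f}fps")
# 30ms -> 33.3fps, 80ms -> 12.5fps

# Rule of thumb: if CPU frame time exceeds GPU frame time, lowering
# resolution or settings won't help much; the game is CPU bound.
cpu_ms, gpu_ms = 45.0, 20.0  # hypothetical readings
print("CPU bound" if cpu_ms > gpu_ms else "GPU bound")
```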

Thought that was confirmed a while ago in the '10GB, is it enough?' thread? :p

The card seems to run out of grunt before it runs out of VRAM, from what we can see so far.

Which is exactly what you want. 4K games keep moving on every year. Even though the 2080 Ti was supposedly a 4K card in its time, the RTX 3070 isn't unless you play those same old games. 8GB is therefore plenty.
 
WDL looks good for RT. I'm happy to give it months to mature as a game before I play it.

Not tried that. I don't buy Ubisoft games and don't have a Uplay account. I might just borrow it from a friend and play it in the future.


I thought that until today.

The Call of Duty campaign looks really good, IMO.

Yeah, I love RT. But pretty much everything out now is a bit ****. Glad to hear COD makes good use of it. Not my type of game unfortunately. Not long left now for Cyberpunk 2077. That is what I am looking forward to. Hopefully the RT is good also.

As for the future, if Atomic Heart is anything like the demo on Nvidia's website, that's a very impressive showing of what can be done. The question is, will it be like that in the final game, and if so, will our 3080s be able to run it? :p
 
@muon

I really like DLSS; I've been playing the new Call of Duty with balanced DLSS. It works really well.

Playing at max settings, 4K HDR with high ray tracing, and it's a really smooth, solid 60fps.
 

You know what Atomic Heart looks like? The Half-Life 3 we never had. It has the same art design and themes, IMO.
 

All my friends had the same issue: as soon as you start driving about, the frame rate drops hard and there's lots of stutter.

I tried tweaking settings, which helped, but I still can't get a smooth driving experience around the city even with settings dropped right down, so I've given up.

Also, DLSS on Watch Dogs adds even more stutter to an already stutter-filled game.
 

Yeah, it's obvious that an i7 4770K at 4.3GHz can't keep up with modern games any more at all; that's why I'll probably go with an i7 10700K in the coming month.
I found a benchmark on another forum with the same max settings from a member who has an i9 9900K, and I think the RTX 3080 can maintain a locked 30fps with all settings maxed in WDL, without any DLSS, in raw native 4K.
The difference with the same card but a better CPU is like night and day; I can't believe it...

[screenshot: txBtt0w.jpg]
 
Seems a competitor received 50 Asus 3090s this week and has another delivery coming next week.

Outstanding orders were 376, now 320, across all models of the 3090. Not sure on the others as I've not been monitoring.

Waiting on my Strix 3090, position 72 :D

Production is starting to ramp up, for sure.
 
Stock is also starting to trickle into Europe in greater numbers, both 3080 and 3090, and it stays in stock longer. It's definitely improving.

No way would I order a 3090 at this point, though. It only made sense (and barely even then) at release.
 
I need 16GB+ unfortunately. My current workload uses all 11GB (1080 Ti) right away and my software is asking for more. I think even 16GB may be filled straight away by my tasks.
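Worth noting that some software allocates whatever VRAM is available rather than what it strictly needs, so it can be worth polling usage while the workload runs. A rough sketch, assuming the standard nvidia-smi CLI is on the PATH (the helper function name is just my own):

```python
import subprocess

def gpu_memory_mib() -> tuple[int, int]:
    """Return (used, total) VRAM in MiB for GPU 0 via nvidia-smi."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits", "--id=0"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    used, total = (int(x) for x in out.split(","))
    return used, total

used, total = gpu_memory_mib()
print(f"{used} MiB of {total} MiB in use")  # e.g. 11264 MiB of 11264 MiB on a 1080 Ti
```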
 
Is designing man caves that VRAM intensive? :p
 