10GB VRAM enough for the 3080? Discuss..

Funny how 48 FPS average with a 33 FPS minimum is not acceptable by your standards, but 37 average with a 32 minimum is very acceptable and allows you to trash AMD for their much worse performance. :)
[benchmark chart: 1440p-ultra-1.png]


Oh, look how poor your card is, you can't even play CP at 1440p... I can't do it either, but I get better FPS, so your card is trash and mine is great. :D

It's called DLSS, bud :D ;)
 
In HZD you were going above 10GB VRAM if you played at 4K with 90% FOV. After 5-10 minutes you were going above 10GB. Maybe they fixed it with a patch, I don't know, but it was a problem. ReBAR also increases VRAM usage by a couple of hundred MB.
It's called DLSS, bud :D ;)
It doesn't matter; the fact is the card is weaker than AMD's in CoD, so by your own higher standards you shouldn't use a card that can't even do 40 FPS at 1440p. But you have different standards. :)
 
In HZD you were going above 10GB VRAM if you played at 4K with 90% FOV. After 5-10 minutes you were going above 10GB. Maybe they fixed it with a patch, I don't know, but it was a problem. ReBAR also increases VRAM usage by a couple of hundred MB.

It doesn't matter; the fact is the card is weaker than AMD's in CoD, so by your own higher standards you shouldn't use a card that can't even do 40 FPS at 1440p. But you have different standards. :)
Don't worry, anemic workarounds such as turning off every kind of hardware acceleration in desktop apps and restarting the game every 20-30 minutes when the VRAM fills up and starts causing problems (Cyberjunk, looking at you) will be the prime solutions for the premium $700+ card in two years (MSRP, of course).
 
I'm loving this more than gaming right now, tbh :p Your honour, this person committed murder... OK, show us the evidence...... Godfalllllllllll....... Anything more substantial than that?....... Godfallllllllllllllllllll!......

Case dismissed!

Hahahahaha! :D


Have you not watched any serial killer/court-based shows then? :D Generally you need quite a lot of evidence to back up said claims, and said evidence needs to have actual substance/weight and not just be a wet fart.
Godfall is definitely a wet fart.
 
At least it didn't promise anything, unlike Cyberpunk, where the devs said it was going to be a generation-defining game, whereas it lacks a lot of serious content and is just a shallow open-world game.

This thread has gone into a loop lol. I guess we should all take a rest. Let's wait for actual next-gen games.
 
At least it didn't promise anything, unlike Cyberpunk, where the devs said it was going to be a generation-defining game, whereas it lacks a lot of serious content and is just a shallow open-world game.

This thread has gone into a loop lol. I guess we should all take a rest. Let's wait for actual next-gen games.
By the time next-gen games come we will all be on next-gen cards :p

Right now everything seems to be made for both new and old consoles. It will be a while until we get games built from the ground up for the latest consoles.

Hell, I've already had a 3080 for 7 months and played all the graphically intensive games I wanted. The only other game I have any interest in that needs grunt won't be out until December. Looking forward to next-gen cards already :)
 
Don't worry, anemic workarounds such as turning off every kind of hardware acceleration in desktop apps and restarting the game every 20-30 minutes when the VRAM fills up and starts causing problems (Cyberjunk, looking at you) will be the prime solutions for the premium $700+ card in two years (MSRP, of course).
For many, including me, those cards were free and then some if left mining when not gaming. ;)
 
The difference is that a game like RE Village has 95% positive reviews even if the RT in the game is trashed by some reviewers. Days Gone has 92% positive reviews on Steam and it has no RT; it is also a last-gen game.
The great RT titles, heavily promoted, are all trash. So you can spend 100 hours inside the game looking at reflections. :D
Laughing that a card does 20 FPS in an RT-heavy game sponsored by the competition while your card does 30 is like laughing at a 3070 owner while you have 9GB of VRAM. :D
 
The difference is that a game like RE Village has 95% positive reviews even if the RT in the game is trashed by some reviewers. Days Gone has 92% positive reviews on Steam and it has no RT; it is also a last-gen game.
The great RT titles, heavily promoted, are all trash. So you can spend 100 hours inside the game looking at reflections. :D
Laughing that a card does 20 FPS in an RT-heavy game sponsored by the competition while your card does 30 is like laughing at a 3070 owner while you have 9GB of VRAM. :D
Can't argue with that. Days Gone is a great game and I completed it 100% on PS5 (it's free for PS Plus subscribers).
A game doesn't have to have superb graphics to be good, but ideally you want both, otherwise we would still be playing 8-bit games.
 
The difference is that a game like RE Village has 95% positive reviews even if the RT in the game is trashed by some reviewers. Days Gone has 92% positive reviews on Steam and it has no RT; it is also a last-gen game.
The great RT titles, heavily promoted, are all trash. So you can spend 100 hours inside the game looking at reflections. :D
Laughing that a card does 20 FPS in an RT-heavy game sponsored by the competition while your card does 30 is like laughing at a 3070 owner while you have 9GB of VRAM. :D
Indeed, Hades is probably better than all RT games ever released. There's simply not a single, not even a single, "GOTY"-contender RT game yet.

It is clear that when developers target "RT", they fall short on actual content, gameplay, and story elements. We're left with pretty lackluster games all around that amount to RT tech demos, it seems.

Check how the water puddles are interactive and more immersive in W2:
https://youtu.be/ceMTRCvxfyA?t=680

Now let's take a look at Cyberjunk (talk about a wet fart):

https://www.youtube.com/watch?v=lhqtoQ4E-dA

How the hell did this happen? I mean, really, how?

They're quite a small team. Making their game "RT" compliant probably pushed their resource budget into bottlenecks. Sad, really. It could've been something more, probably ruined by the RT.

In the end, no one cared about its RT, as with DLSS the game looks blurry and garbage, so its visuals end up no better than those of The Last of Us 2 or God of War, both in terms of composition and texture quality (Cyberpunk seriously has the worst texture quality among recent AAA games; we owe that to the 8-10GB cards). And those console-exclusive games run fantastically on 2013-2016 hardware with 1.8-4.2 TFLOPS respectively.
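For context, that 1.8-4.2 TFLOPS range is just the PS4 and PS4 Pro. A quick back-of-the-envelope sketch in Python, using the publicly quoted console specs and the usual FP32 formula (shaders × 2 FMA ops per clock × clock speed):

```python
# Rough FP32 throughput: shader ALUs x 2 ops/clock (FMA) x clock in GHz.
# Figures below are the publicly quoted PS4 / PS4 Pro GPU specs.
def tflops(shaders: int, clock_ghz: float) -> float:
    return shaders * 2 * clock_ghz / 1000.0

print(f"PS4:     {tflops(1152, 0.800):.2f} TFLOPS")  # ~1.84
print(f"PS4 Pro: {tflops(2304, 0.911):.2f} TFLOPS")  # ~4.20
```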
 
They're quite a small team. Making their game "RT" compliant probably pushed their resource budget into bottlenecks. Sad, really. It could've been something more, probably ruined by the RT.

In the end, no one cared about its RT, as with DLSS the game looks blurry and garbage, so its visuals end up no better than those of The Last of Us 2 or God of War, both in terms of composition and texture quality (Cyberpunk seriously has the worst texture quality among recent AAA games; we owe that to the 8-10GB cards). And those console-exclusive games run fantastically on 2013-2016 hardware with 1.8-4.2 TFLOPS respectively.

Wasn't RT supposed to make it easier for developers as they would not have to do all the stuff manually?
 
Wasn't RT supposed to make it easier for developers as they would not have to do all the stuff manually?
Obviously, but not for Cyberpunk, since it has all kinds of traditional lighting methods for previous-gen GPUs and consoles... The CP2077 developers had to do the non-RT lighting and shadows anyway, which defeats the purpose of not having to bother with them.
 
Wasn't RT supposed to make it easier for developers as they would not have to do all the stuff manually?

Being a relatively new feature, it still needs to mature. As it gains more traction, more efficient implementations/techniques will evolve, but for now it's a bit brute force and only Nvidia seem to have the power to handle it.

Yes though, the idea is that it will make development easier and more consistent.
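To put rough numbers on the brute-force point, here's an illustrative sketch of the ray budget a naive per-pixel approach implies (the rays-per-pixel figure is an assumption for illustration; real engines cut this down hard with lower RT resolution, fewer rays, and denoising):

```python
# Illustrative ray budget for naive per-pixel ray tracing at 4K/60.
width, height = 3840, 2160
rays_per_pixel = 2   # assumed: e.g. one reflection ray + one shadow ray
target_fps = 60

rays_per_second = width * height * rays_per_pixel * target_fps
print(f"~{rays_per_second / 1e9:.1f} billion ray tests per second")  # ~1.0
```

That's around a billion ray-scene intersection tests every second, before any denoising or shading, which is why dedicated hardware (and a lot of it) matters.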
 
Being a relatively new feature, it still needs to mature. As it gains more traction, more efficient implementations/techniques will evolve, but for now it's a bit brute force and only Nvidia seem to have the power to handle it.

Yes though, the idea is that it will make development easier and more consistent.
But it is being marketed and defended as if it is already there. AMD needs to come up with a good RT title that shows a reasonable and relevant RT implementation. Right now we have Nvidia-sponsored titles with brute-force RT effects which work sometimes and at other times are just there to tank FPS. But due to the buzz generated by these titles, developers will try to include these effects, which takes time away from something interesting they might have tried instead.

Metro EE is perhaps a good example of this. It has relevant RT effects but is mired by PhysX and Hairworks. The interesting point here is that Metro EE has some of the worst textures I have seen in a game of such pedigree, along with bad character models. If your theory is right, they could have corrected that instead of working on RT implementations, because those crappy textures took me out of the setting more than any lack of realistic lighting. To be fair, the third-rate Mad Max + Far Cry copy-paste they have used as a setting had already killed the game for me.
 
I see we still have certain people blaming games rather than their system/user error :cry: :D

In HZD you were going above 10GB VRAM if you played at 4K with 90% FOV. After 5-10 minutes you were going above 10GB. Maybe they fixed it with a patch, I don't know, but it was a problem. ReBAR also increases VRAM usage by a couple of hundred MB.

It doesn't matter; the fact is the card is weaker than AMD's in CoD, so by your own higher standards you shouldn't use a card that can't even do 40 FPS at 1440p. But you have different standards. :)

That's already been debunked many times; look at the patch notes for the game and at the people who posted videos showing their playthrough in that area. Also, in the video I posted a while back of a 6800 XT vs a 3080 (with the latest patch), it actually looked like the 6800 XT was having issues loading some textures compared to the 3080, but of course no one responded to that ;)

ReBAR does increase VRAM usage, but has it caused issues? Nope. Feel free to post evidence showing otherwise though. Also, it might be worth reading up on what exactly Resizable BAR/SAM does. Nvidia have a very good article on it:

https://www.nvidia.com/en-gb/geforce/news/geforce-rtx-30-series-resizable-bar-support/
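If anyone does want to capture that VRAM evidence properly, here's a minimal logging sketch (assumes an Nvidia card with nvidia-smi on the PATH; note that it reports *allocated* memory, which is not the same as what the game actually needs, since many engines grab whatever is available):

```python
# Minimal VRAM logger: polls nvidia-smi every 5 seconds.
# Reports allocated memory, not the game's true working set.
import subprocess
import time

while True:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    used, total = out.split(", ")
    print(f"{time.strftime('%H:%M:%S')}  {used} MiB / {total} MiB")
    time.sleep(5)
```

Run that in the background during a long session and you'd have an actual usage trace instead of a screenshot.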

We're talking about facts? Oh dear, so is DLSS not allowed then? Why not? Is that because AMD "still" don't have their version out yet?

Fact is, in Cyberpunk:

DLSS Quality + ray tracing on = superior experience because of better IQ on the "whole"
native res + ray tracing off = lesser experience because of worse IQ on the "whole"

Sure, I could go with the second option and get a constant 90+ FPS at all times, including minimums, but I'd rather have the superior IQ option. Since the game is SP, I can cope with minimums of 50 and 70-80 FPS on average on my setup (I play more often on my 3440x1440 monitor too, for FreeSync/G-Sync, so when the FPS does drop to 50 in the very intense moments you don't notice it).

Don't hate me if non 3080/3090 owners can't enjoy the game :D

Hahahahaha! :D

Godfall is definitely a wet fart.


:D

But yeah, this thread is stuck in a loop now. We can safely sum it up so far as:

- the 3080 "might" have a problem with its VRAM in 1-2 years' time, but by then we'll all be on the new cards anyway for the better ray tracing perf. and more grunt, not for more VRAM....
- RDNA 2 was DOA because of its inferior ray tracing perf. and lack of a DLSS equivalent
- people have still yet to post a game where a 3080 is struggling because of its VRAM at 1440p/4K, with substantial evidence backing up said claim

Back I go to enjoy the sunshine now! :D Will be sure to grab the popcorn for when I get back though :D

The difference is that a game like RE Village has 95% positive reviews even if the RT in the game is trashed by some reviewers. Days Gone has 92% positive reviews on Steam and it has no RT; it is also a last-gen game.
The great RT titles, heavily promoted, are all trash. So you can spend 100 hours inside the game looking at reflections. :D
Laughing that a card does 20 FPS in an RT-heavy game sponsored by the competition while your card does 30 is like laughing at a 3070 owner while you have 9GB of VRAM. :D

Oh dear, have we gone back to basing game ratings on their optimisation/ray tracing perf? :o :cry:

Issues aside, Cyberpunk is actually a solid game and did very well in reviews; heck, even Mack from Worth A Buy said it would have been GOTY if it weren't for the issues.


Days Gone is a great game though, but that's got nothing to do with it not having ray tracing....


Sorry forgot....... godfaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaalllllllllllllllllll

 
Obviously, but not for Cyberpunk, since it has all kinds of traditional lighting methods for previous-gen GPUs and consoles... The CP2077 developers had to do the non-RT lighting and shadows anyway, which defeats the purpose of not having to bother with them.
The discussion was a bit different, as explained by @Mesai. The point is that even an RT implementation alone right now has significant resource costs. Maybe in the future it will just be an on/off switch, but it isn't for now.
 
Wasn't RT supposed to make it easier for developers as they would not have to do all the stuff manually?

Yup, it improves workflows for developers tenfold:

https://www.youtube.com/watch?v=fiO38cqBlWU&t=312s

Digital Foundry got the developers of Metro to send them their workflows for the old method vs the new method; as you can see, the time and effort saved is substantial. Very impressive how it even runs better than the original too. I would love to see more ray-tracing-only games, but given that consoles/AMD lack the capability, sadly we'll be held back.

But it is being marketed and defended as if it is already there. AMD needs to come up with a good RT title that shows a reasonable and relevant RT implementation. Right now we have Nvidia-sponsored titles with brute-force RT effects which work sometimes and at other times are just there to tank FPS. But due to the buzz generated by these titles, developers will try to include these effects, which takes time away from something interesting they might have tried instead.

Metro EE is perhaps a good example of this. It has relevant RT effects but is mired by PhysX and Hairworks. The interesting point here is that Metro EE has some of the worst textures I have seen in a game of such pedigree, along with bad character models. If your theory is right, they could have corrected that instead of working on RT implementations, because those crappy textures took me out of the setting more than any lack of realistic lighting. To be fair, the third-rate Mad Max + Far Cry copy-paste they have used as a setting had already killed the game for me.

Some of the textures did let Metro Enhanced down for me too; very disappointing that they didn't do a better job here, especially as the game didn't seem heavy on VRAM (IIRC only 6.5-7.5GB of VRAM was being used on my 3080).... But every time the lighting came into play I was in awe; this area just had my jaw dropping throughout:

[screenshot]

Absolutely stunning in motion and added so much to the atmosphere.

Would love to see more old games remastered with ray tracing like mad max, batman arkham games.
 
I run the following games at 4K with Ray Tracing (RT) on, without issue, at 60+ FPS and without the need for upscaling technology.

Should have read the small print first: you still don't have the option of upscaling technology.

When RT-supported games are optimised for the hardware, they perform well with RT enabled; I think you'd be surprised if that is your impression.

Dirt 5
Godfall
World of Warcraft
Resident Evil Village
Call of Duty: Black Ops
Call of Duty: Modern Warfare
Shadow of the Tomb Raider

Oh dear. These games use a very minimal amount of RT, mainly shadows. No wonder you thought the IQ uplift was debatable. Stop faffing around and get a modern card.

What we've seen is that when RT is only optimised for one vendor, the RDNA2 cards do not perform as well.

Nope. What we have seen is RDNA2 doing okay in games that make very little use of RT (your list is a good example), while the opposite is true when you turn up the eye candy. DF did a decent breakdown of why this is true.


As with all RT seen so far in games though, the improvements to image quality are debatable and not worth the performance cost, even though the hardware has the grunt to run them - when optimised properly.

I think it's more accurate to say that for a good RT experience you need to pick the right architecture: Ampere or RDNA3 ;)

TechSpot has an easy-to-understand article on the Ampere vs RDNA2 architecture: https://www.techspot.com/article/2151-nvidia-ampere-vs-amd-rdna2/. It even uses SOTTR for its RT benchmark, thus showing RDNA2 in its best light.

But all of that is a discussion for another thread. This one was regarding 10GB and whether it is enough. No one has yet given any technical reason why it's not enough, though Skyrim with mods was a good attempt. Of course Skyrim with mods could break any amount of VRAM, even the 3090's 24GB.
 
That DF video comparing shadows: devs are still nerfing non-RT to make RT look better. The non-RT shadows are like going into GTA 5 and setting shadow quality to low; they're basically locking high shadow detail to RT only now. This is what I don't like, the game being played to sell the feature. We should be seeing non-RT at its best vs RT at its best.
 

Parts 2, 3 and 4:

https://www.youtube.com/watch?v=1IPRjzrHvak

https://www.youtube.com/watch?v=c1bKkUsrs_c

https://www.youtube.com/watch?v=al2O7XGURqM

oh golden part


This game never, ever had a chance of becoming GOTY. Never. It lacks content. These are not issues, these are DEFICITS that can be found even in Android/iOS mobile games.

Aside from TLOU2, even AC: Valhalla, Doom Eternal, MSFS 2020, and Hades are better games than Cyberpunk 2077.

For 2021, we've yet to see. Any above-average AAA game will be picked over Cyberpunk.
 