I'm sure that the amount of video memory has very little to do with total FPS in this game.
That's my point.
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
One game does not determine the amount of video memory required for 'games', past or present. That's my point.
Does anyone know what the stock of the 6900 will be like? Is it going to be better than non-existent this time, or is everyone staying silent? I barely even see any 6800s being scalped; it made the Ampere launch look good.
I've seen details from three sources today: two from Reddit regarding Micro Center stores in the US saying they won't have any stock, and one from a large Swedish PC retailer saying they won't have any stock either, though they have been allocated 35 cards with no ETA on when those cards will arrive.
That's very disappointing.
Having to turn down texture settings with a high end GPU due to a lack of video memory is not acceptable, period.

We have to turn down settings *now* if we want playable frame rates, and there will be more games that necessitate this. The VRAM alarmists make it sound like turning down settings is a deal-breaker.
Yet here we are, turning down settings, and people aren't grabbing their pitchforks and marching to Nvidia's headquarters.
For some reason, a lack of GPU horsepower is acceptable to this bunch. Heck, maybe even the fault of the game developer for not "optimizing" the game to work with current GPU limitations.
Now, if someone creates a texture pack that doesn't work within common VRAM buffer limitations, well, *that* somehow isn't the developer's fault; it's the fault of the graphics card manufacturer.
Either way, settings will need to be turned down in the future. It has always been the way things progress with games and the hardware we use to run them.
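For a rough sense of scale (my own back-of-the-envelope numbers, not from anyone in this thread): texture quality is the first lever for VRAM because texture cost grows with the square of resolution. A minimal sketch, assuming uncompressed RGBA8 textures with a full mip chain:

```python
def texture_mib(size: int, bytes_per_texel: int = 4, mipmaps: bool = True) -> float:
    """Approximate VRAM cost of one square texture, in MiB."""
    base = size * size * bytes_per_texel
    # a full mip chain adds roughly one third on top of the base level
    total = base * 4 / 3 if mipmaps else base
    return total / 2**20

print(round(texture_mib(4096), 1))  # ~85.3 MiB uncompressed
print(round(texture_mib(2048), 1))  # ~21.3 MiB: one step down quarters the cost
```

Real games use block compression (BC1/BC7 and the like), which cuts these numbers by 4-8x, but the square-law scaling is why dropping one texture notch frees so much memory.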
No one is grabbing their pitchforks because:
- It's one game
- It's using RT, which gimps performance all round regardless of video memory
- The game may not be well optimised, based on user feedback
I give up with this one.

You really can't figure out why turning down settings because the GPU is out of horsepower, rather than because the GPU is being bottlenecked, is acceptable?
Are you the type of person that pairs an i3 processor with an RTX 3080 and thinks that it is acceptable?
Slightly perplexed by the 6800 XT perf. Is ultra using RT or not?

No RT; it's not supported at launch.
Noticed they used a 9900K, which is PCIe 3, so I assume it will hinder the 6800 XT by not being on Gen 4?
Not supported; the 9900K platform is PCIe 3 only.
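For what it's worth, the raw link-bandwidth gap is easy to estimate. A quick sketch (my own numbers, assuming an x16 link and 128b/130b encoding on both generations; whether halved bandwidth actually costs FPS in a given game is a separate question):

```python
def pcie_x16_gbps(gen: int) -> float:
    """Theoretical per-direction bandwidth of a PCIe x16 link, in GB/s."""
    rate = {3: 8.0, 4: 16.0}[gen]  # transfer rate per lane, GT/s
    encoding = 128 / 130           # usable payload fraction (128b/130b)
    lanes = 16
    return rate * encoding * lanes / 8  # bits -> bytes

print(round(pcie_x16_gbps(3), 2))  # ~15.75 GB/s
print(round(pcie_x16_gbps(4), 2))  # ~31.51 GB/s
```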
How is this an answer to my question? My question was about why running out of GPU horsepower is an acceptable excuse to turn down settings. If you want to talk about the balance between VRAM amount and GPU horsepower, there is a 130-page thread with no conclusion for you to post in.

If the VRAM buffer and the GPU are grossly mismatched, *that* could be a problem.
I end up with the GPU "bottlenecking" my VRAM buffer every single time.
Paper launch number two.