The information presented does provide insight and discussion. It doesn't require you to agree with it by calling someone a fanboy. LOL

Providing information? Sure, I'll give you that (albeit handpicked bias).
Insight and discussion? Nah, lol.
It would have lost out even harder without the cache though. GDDR6X wasn't available to them, so they had to come up with something to deal with regular GDDR6's lower bandwidth.
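To put rough numbers on the bandwidth gap being discussed (the figures below are the publicly listed specs for each card, taken as assumptions here): raw memory bandwidth is just bus width divided by 8, times the per-pin data rate.

```python
# Rough bandwidth arithmetic behind the Infinity Cache decision.
# Specs assumed from public listings, not measured:
#   RX 6800 XT: 256-bit bus, 16 Gbps GDDR6
#   RTX 3080:   320-bit bus, 19 Gbps GDDR6X
def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    """Raw memory bandwidth in GB/s: (bus width / 8 bytes) * data rate."""
    return bus_bits / 8 * gbps_per_pin

rx6800xt = bandwidth_gbs(256, 16.0)  # 512.0 GB/s
rtx3080 = bandwidth_gbs(320, 19.0)   # 760.0 GB/s
print(rx6800xt, rtx3080)
```

On raw numbers alone the GDDR6 card is down almost a third, which is the gap the on-die cache is meant to paper over.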
A lot of people seem to be grabbing the wrong end of the stick though by saying that the 6800 XT scales poorly at 4K. It doesn't. If you compare it to the 5700 XT and Turing, it sees the largest gains over those cards at 4K. Using TechPowerUp's numbers as an example, a 2080 Ti offers 84% of a 6800 XT's performance at 1440p, but only 80% at 4K, despite its wider memory bus and higher memory bandwidth. It's not that the 6800 XT is scaling poorly, it's just that Ampere is plain faster. In terms of scaling, it's more accurate to say that Ampere scales poorly below 4K (something we already knew), which is why AMD have been able to take the lead at lower resolutions. Whatever bottlenecks exist in Ampere's compute-focused architecture prevent it from achieving its full potential at lower resolutions, else it'd likely be just plain faster across the board.
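The scaling argument above can be sketched in a few lines, using the TechPowerUp relative-performance figures quoted in the post (2080 Ti as a fraction of the 6800 XT):

```python
# Sketch of the scaling comparison, using the quoted TPU figures.
rel_1440p = 0.84  # 2080 Ti performance relative to 6800 XT at 1440p
rel_4k = 0.80     # same comparison at 4K

# If the 6800 XT scaled worse at 4K, the 2080 Ti's relative share
# would rise at 4K. It falls instead, so the 6800 XT gains more
# going to 4K than the 2080 Ti does.
gain_6800xt_over_2080ti = rel_1440p / rel_4k  # ~1.05
print(f"6800 XT gains {100 * (gain_6800xt_over_2080ti - 1):.1f}% on the "
      f"2080 Ti going from 1440p to 4K")
```

The direction of that ratio is the whole point: a card that "scales poorly at 4K" would see its rivals' share go up at 4K, not down.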
In the interview he said there is something else they will do with Infinity Fabric. What that is hasn't been revealed.
That is some strong denial you've got going there. Is this how you cope? What does that have to do with the 6800 XT setting a world record on air? You are all over the place, lol.
It is only a defense when you can actually refute the evidence set before you. Since you have shown no capacity to engage with why you feel it's not true, it's only you defending Nvidia, as usual.
Well, it does appear that you won't get any good discussion from some of the more rabid defenders. They will simply call you a fanboy, as they have no means to admit that games are not coded neutrally.
It is truly staggering, especially when they cannot refute the truth and resort to name-calling.
And that is the secret to the lies they post. But don't worry, all they can do is call you names, because you exposed the truth.
It is amazing what depths they will sink to, lying, hiding the truth, feigning ignorance, employing deception, and name-calling, in order to protect Nvidia.
And what makes it so profound is that some of them who have been here for years know no more about the ins and outs of PC gaming than someone who just joined the forum from console. And it does appear they are not capable of providing any legitimate feedback beyond name-calling, lying, deception, etc.
Why would it not beat the 3080 at 4K too? It has higher frequencies than the 3080. If it's not the bandwidth, then there is no reason for it not to be better than the 3080. But then again, if it is the bandwidth, why do you need 16GB? You will decrease settings long before you fill 12GB of VRAM, because the fps will look bad.
Graphics-wise, Godfall looks gorgeous. While this game does not take advantage of Ray Tracing, it can push amazing visuals. There are a lot of reflective surfaces, a lot of particle effects, and some truly amazing lighting effects. Furthermore, Counterplay Games has used a lot of high-resolution textures, making everything look crisp and sharp. Speaking of textures, the game did not require more than 8GB of VRAM at 4K/Epic settings. So yeah, despite Counterplay's claims, Godfall can run smoothly on GPUs that have 10GB of VRAM.
Godfall, 4K Epic settings (maxed out):
6GB-8GB allocation, and only 5.5GB-6.5GB actual usage.
Sooooooo, 12gb my ass lmaaao
I am definitely buying the 3080 10GB now; it's not gonna be an issue for a while (1440p/144Hz), unless AMD can come out with a DLSS of their own, the benchmarks are great, and their drivers are awesome...
This is a next-gen game... so we can be sure that for 90% of games 8GB won't cause huge issues. 10GB is enough for anything you throw at it, though.
EDIT - I would like to acknowledge that my title is a bit misleading; it turns out it was actually the developers who said the game uses 12GB, not AMD.
Also, the game does allocate (I'm not sure how much it actually uses here) 11GB-12GB of VRAM when you turn on FidelityFX LPM (which provides an open-source library to easily integrate HDR and wide-gamut tone and gamut mapping into your game); it's basically an HDR implementation.
If you value this setting, 10GB is at the edge for this game, though I think it should still run, as the usage should be right around 10GB.
Yeah, but that is AMD-biased, and anything AMD wins in is AMD-biased. Clearly obvious. And anything Nvidia wins in is GPU-agnostic and stuff. LOL

2.6GHz on air - that is totally bonkers @humbug
For reference, my liquid-cooled 5700 XT with a binned 9900KF at 5.3GHz gives a graphics score 61% lower than this 6800 XT bench run on air.
61% slower.
That score totally destroys anything posted in our own Firestrike thread; the highest posted score there is 41,787!
The 6800 XT is massively slower in Port Royal as well. Port Royal is just a DXR benchmark, and it mirrors Control for performance. The RTX 3080 has better RT performance and memory bandwidth, which helps at higher resolutions like 4K. The 6800 XT is a better 1080p/1440p card for high-fps rasterization games. Its stock DXR performance is just a little better than the 2080 Ti's, and it gains a little more from Rage Mode and overclocking, but not enough to get near the 3080.
If the game is more rasterization than RT, the 6800 XT can win, but as soon as the game fully uses RT, the 3080 will dominate. The RT cores are just far better on Ampere; there is no escaping the performance lead here, as it's too large.
Godfall requires only 6GB-8GB of VRAM at 4K maxed out, not the 12GB claimed by AMD.
Had a quick play last night with the new patch and it hit over 11GB, no problem; I'd imagine it'd hit 12GB in a longer session. Looks absolutely fantastic too with all the bells and whistles on.
(RT gave me about a 30fps hit on ultrawide.)
Turn on ray tracing in Cyberpunk and it's 1080p with low quality settings on big Navi versus 4K Ultra on big Ampere... ouch.
Inb4 comments that CD Projekt Red is an Nvidia shill
https://www.tomshardware.com/news/c...equirements-official-plus-our-recommendations
There is a difference between what the game needs and what it allocates.
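A toy sketch of that distinction (purely illustrative, no real graphics API involved): engines often reserve a large VRAM pool up front, and monitoring tools that report the pool size overstate what the game actually needs.

```python
# Toy model, not a real driver API: a game reserves a big VRAM pool
# ("allocated", what overlays report) but only writes assets into part
# of it ("used", what it actually needs).
class VramPool:
    def __init__(self, reserved_mib: int):
        self.reserved_mib = reserved_mib  # reported allocation
        self.used_mib = 0                 # memory actually touched

    def upload_texture(self, size_mib: int) -> None:
        if self.used_mib + size_mib > self.reserved_mib:
            raise MemoryError("pool exhausted")
        self.used_mib += size_mib

pool = VramPool(reserved_mib=12_000)      # ~12GB "allocated", as reported
for tex_mib in [2048, 2048, 1024, 512]:   # ~5.6GB of actual assets
    pool.upload_texture(tex_mib)

print(pool.reserved_mib, pool.used_mib)   # 12000 vs 5632
```

Which is why an overlay can show 11GB-12GB "in use" while the game would run fine on a card with less.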
Holy heatsink batman, if they are available next Wednesday that's the one to get :-O