
10GB VRAM enough for the 3080? Discuss..

What's up with Far Cry 6 requiring 16GB of VRAM? Can this be justified? The game's graphics look very old-gen... I really want to see what this "HD texture pack" brings out.

https://www.ubisoft.com/es-es/help/...for-higher-resolutions-in-far-cry-6/000098976
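For a rough sense of whether 16GB can even be justified, here's a back-of-the-envelope sketch. Every number in it (BC7 compression, texture resolutions, resident texture count) is an illustrative assumption, not Ubisoft's actual asset data:

```python
# Back-of-the-envelope VRAM estimate for an HD texture pack.
# All figures below are illustrative assumptions, not Far Cry 6's real assets.

def texture_mib(width, height, bits_per_pixel, mips=True):
    """Size of one texture in MiB; a full mip chain adds roughly one third."""
    base_bytes = width * height * bits_per_pixel / 8
    total = base_bytes * 4 / 3 if mips else base_bytes
    return total / (1024 ** 2)

# BC7-compressed colour textures cost 8 bits per pixel.
standard = texture_mib(2048, 2048, 8)  # ~5.3 MiB per 2K texture
hd = texture_mib(4096, 4096, 8)        # ~21.3 MiB per 4K texture

RESIDENT = 400  # assumed number of unique textures streamed in at once
print(f"2K set: {standard * RESIDENT / 1024:.1f} GiB")  # ~2.1 GiB
print(f"4K set: {hd * RESIDENT / 1024:.1f} GiB")        # ~8.3 GiB
```

Quadrupling texture area quadruples the footprint before framebuffers, geometry, and any RT acceleration structures are counted, so a 4K pack pushing total usage well past 10GB isn't implausible even if the art style looks last-gen.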

Strange, considering the game looks no better than Far Cry 5. Sure, some of the RT reflections are nice, but even with the HD texture pack, based on what I've seen, it's just FC5 in a different locale, and FC5, which also had an HD texture pack, did more than fine with 8GB on a Vega 64.
 
Yes, thanks to AMD and their practices, we also can't have DLSS in the majority of AAA titles now. I'm kind of disappointed in Nvidia as well. It seems like they're not punching their weight or something. Last I checked, they released a driver supposedly bringing DLSS support to 100+ games, with 26 new additions. Oh wow, 26 UE4 indie games that already run at high performance on the least performant DLSS cards... Nvidia needs to step up their game and secure more, higher-budget AAA game dealz.
 
People moaning about AMD sponsoring games and then moaning about Nvidia not sponsoring enough games... gotta love the irony around these parts sometimes. Next, people will be blaming AMD for the lack of VRAM on Nvidia GPUs.
 
Regardless of who is sponsoring a game, you shouldn't be getting 25% lower raster fps for not going with that sponsor's products, as is the case in AC Valhalla.
 

You are assuming the lack of performance of Nvidia cards in that specific game is directly related to the AMD sponsorship. Do you actually have any proof/source for that?
 
The Division 1 was pretty well known for being held back on PC, as the Ubisoft devs said (paraphrased): "We want parity with the consoles, so we didn't turn it up to 11 on PC."

You talking about the graphics or the game itself? Div 1 looked good when it came out; the game itself was just one area after another of opponents that took 300 more bullets to kill than the last area.

As single-player games go, the FC series (except the second one) has had good stories and good open worlds, I think. Maybe because they are set in a more natural environment. So have the SP Ghost Recon games. I think the graphics options in these games can be tuned to look very, very good, particularly with the natural environment backdrop and the very large maps.

I think they are good.
 
TBF, Valhalla is a bit of an odd one, as didn't the previous AC games run just as well or better on Nvidia cards?

It's not odd; people just don't think about the 'why' and jump to conclusions about sponsorships instead. There are really three reasons why:

1. The game is still based on the same engine Ubisoft has used since Black Flag (and that game and Unity in fact had heavy Nvidia involvement during development), which is better suited to graphics-heavy architectures, like Nvidia used to have and AMD didn't. Kepler and Maxwell are obvious examples compared to the forward-looking, compute-focused GCN.
2. Today that's flipped on its head: Nvidia is more heavily compute-oriented, while RDNA 2 is better at graphics. Further, Nvidia's advantages with Tensor and RT cores are not relevant here, because there's no DLSS and no RT.
3. AC games suffered immensely on PC due to DX11 (and various poor programming practices on Ubisoft's part, cf. Kaldaien). AMD also did poorly in DX11 titles for various reasons, and here Nvidia's software team had a decisive advantage; furthermore, AC games became more and more demanding on the CPU and memory. Valhalla is now DX12-only, so in a way AMD cards are unshackled and can run properly, and thus we see a big advantage for them here (a toy model of the difference is sketched below).
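To make reason 3 concrete, here's that toy model: a Python sketch, not real graphics code, of why moving work off a single driver thread matters. DX11 funnels command recording through one immediate context, while DX12 lets the application record command lists on many cores. The draw counts and busy-work below are invented stand-ins:

```python
# Toy model of DX11-style vs DX12-style command recording; not real graphics
# code. The "recording" is just CPU busy-work standing in for the
# per-draw-call cost the driver (DX11) or the app (DX12) has to pay.
import time
from concurrent.futures import ProcessPoolExecutor

def record_commands(draw_calls: int) -> int:
    """Stand-in for the CPU-side cost of recording draw calls."""
    checksum = 0
    for i in range(draw_calls):
        checksum ^= hash((i, i + 1))
    return checksum

if __name__ == "__main__":
    DRAWS, WORKERS = 2_000_000, 8  # invented numbers, tune for your machine

    t0 = time.perf_counter()
    record_commands(DRAWS)  # DX11-style: one thread records everything
    serial = time.perf_counter() - t0

    t0 = time.perf_counter()
    # DX12-style: split recording across workers, like per-thread command lists
    with ProcessPoolExecutor(WORKERS) as pool:
        list(pool.map(record_commands, [DRAWS // WORKERS] * WORKERS))
    parallel = time.perf_counter() - t0

    print(f"serial {serial:.2f}s vs parallel {parallel:.2f}s "
          f"({serial / parallel:.1f}x faster)")
```

On a multi-core CPU the parallel path wins by roughly the worker count, which is exactly the kind of headroom CPU-bound AC games gained from the DX12 switch.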

You can see a similar thing with Wildlands vs Breakpoint (they also use the same engine as AC): when they finally added Vulkan to BP, it was a huge boon for AMD users. Wildlands in particular was horrific, and you could see this best if you paid any attention to the power usage. It was always low on AMD cards, which is indicative of underutilisation. I remember that with my Vega 64 I'd see the SOC power get cut almost in half, irrespective of settings/resolution. It was... wild. Mind you, it wasn't perfect for NV GPUs either, but it was much better than for AMD. The difference between RDNA 2 and Ampere isn't as big in BP, but it's there; that game is still very much a DX11 title at heart, and it also made more use of compute.
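If you want to eyeball that underutilisation pattern on your own card, a minimal sketch like this works. It assumes nvidia-smi is on your PATH (Linux AMD users could swap in rocm-smi), and the 250 W board limit is a placeholder; check your card's actual limit:

```python
# Poll GPU power draw and utilisation while a game runs; power sitting well
# below the board limit is a hint the GPU is being starved by the CPU/driver.
import subprocess
import time

POWER_LIMIT_W = 250.0  # placeholder; use your GPU's real board power limit

def sample():
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=power.draw,utilization.gpu",
         "--format=csv,noheader,nounits"],
        text=True)
    power, util = (float(x) for x in out.strip().split(", "))
    return power, util

for _ in range(30):  # ~30 seconds of samples while the game is running
    power, util = sample()
    flag = "  <- possibly starved" if power < 0.6 * POWER_LIMIT_W else ""
    print(f"{power:6.1f} W  {util:3.0f}% util{flag}")
    time.sleep(1)
```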
 
Please stop your logical rationale, this is not the thread for it.
 
Replace the word "graphics" with "shaders" and it will sound more accurate. What you say is correct: prior to Turing, Nvidia was focused on shader performance, which then flipped over, and now it's all about compute and AI for Nvidia architectures. On AMD it went the other way: GCN was focused on compute, and then RDNA, and every version of RDNA so far, has been focused on shaders.


Personally, I feel that compute and AI is the better choice for the future, and GCN was a good choice; it was just several years too early. But I feel AMD itself also played a part in this flip: when Nvidia decided to move to a compute focus, its old shader-focused cards were still dominating AMD's, and AMD had no answer to the 1080 Ti. Nvidia could be forgiven for thinking it would be easy to do compute and still beat AMD at shading.
 
Fair point about DX12. I forgot about Valhalla being the first AC game to use it; that would explain it a bit more, given the CPU overhead of Nvidia's drivers etc.
 