
AMD Navi 23 ‘NVIDIA Killer’ GPU Rumored to Support Hardware Ray Tracing, Coming Next Year

Status
Not open for further replies.
Man of Honour
Joined
13 Oct 2006
Posts
91,000
Dunno man... below is my layman's assessment.

Let's consider GI... I believe you've got to do some kind of Monte Carlo integration at each bounce (sampling the BRDF, I suppose)... so let's say 100 rays emitted at each bounce.

This would look like 1 + 100 + 100^2 + 100^3 + ... rays; I feel this is too expensive.

Something as simple as refraction generates 2 rays per bounce too: 1 + 2 + 2^2 + 2^3 + 2^4 + ... it will catch up.
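A rough sketch of that blow-up, under the assumptions above (a fixed branching factor at every bounce, no early termination; the numbers are illustrative):

```python
# Sketch of the ray-count blow-up: if every surface hit spawns a
# fixed number of new rays, the total traced is the geometric
# series 1 + b + b^2 + ... + b^depth.

def total_rays(branch: int, depth: int) -> int:
    """Total rays with `branch` new rays per bounce, up to `depth` bounces."""
    return sum(branch ** d for d in range(depth + 1))

print(total_rays(100, 3))  # 100-sample GI, 3 bounces: 1,010,101 rays
print(total_rays(2, 4))    # reflect + refract, 4 bounces: 31 rays
```

Even the modest 2-way split catches up fast once bounce depth grows.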

Yep, I know I'm sounding pessimistic.

This is where path tracing and so on come in - there are a lot of optimisations and approaches so you can avoid generating rays for the sake of it, etc. Currently this puts a heavy dependency on the performance and quality of your denoiser and other tricks which are hard to fully mask, but if we had 4x the ray budget of Quake 2 RTX, much of that would be eliminated.
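For comparison, a minimal sketch of why path tracing avoids the exponential blow-up: it traces a single continuation ray per bounce and averages many independent paths per pixel, so cost grows linearly with depth (the counts are assumed, just to show the scaling):

```python
# Branching at every bounce vs. single-sample path tracing.
# A path tracer follows ONE chain of rays per path and averages many
# paths per pixel, so cost is linear in depth, not exponential.

def branching_rays(branch: int, depth: int) -> int:
    return sum(branch ** d for d in range(depth + 1))

def path_tracing_rays(paths_per_pixel: int, depth: int) -> int:
    # Each path is a single chain of (depth + 1) ray segments.
    return paths_per_pixel * (depth + 1)

print(branching_rays(100, 3))     # 1,010,101 rays for one pixel
print(path_tracing_rays(100, 3))  # 400 rays for the same pixel
```

The trade-off is noise per pixel, which is exactly why the denoiser ends up carrying so much of the load.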

So you're talking about wanting double the 3080's RT performance for it to be interesting? Isn't that basically saying it isn't quite there yet, which is what we were posting umpteen pages back?

My previous posts have mostly been about refuting when people have said we can't do X, Y or Z, or are an impossibly long way from it - my recent post is about where I find things interesting in terms of polishing out the solution and/or getting to the resolutions that people who are less enthusiastic about the technology want.
 
Last edited:
Man of Honour
Joined
13 Oct 2006
Posts
91,000
Surely it's better to sacrifice ray tracing for higher native resolution?

1080p RTX on is not going to look as good as 4k RTX off, I wouldn't think.

I'm talking in the context of the cost of games using path tracing or other RT techniques throughout, rather than token RT effects in an otherwise non-RT rendered game, which is another matter.
 
Soldato
Joined
14 Aug 2009
Posts
2,750
Also, @Calin Banc, that looks a bit like Yela. It's probably not, but it's not ArcCorp; it might be one of its cold moons.

Was on Area 18 as well, still remained under 7GB. Problem is that at that resolution, even when they move to Vulkan and fix their servers, an RTX 2080 would be too weak for v-synced 60fps, so settings have to be dropped, ergo vRAM usage going down. It's not like the Vega VII is happily leaving the RTX 2080 behind thanks to its 16GB of HBM2 vRAM. Quite the contrary.

Surely it's better to sacrifice ray tracing for higher native resolution?

A DLSS-style approach could help with this (even a regular internal lowering of the render resolution, if the result is good enough). Personally I couldn't care less about 4k, but I do love multi-display setups for a larger field of view - ergo still higher resolution than 1080p. However, with a proper field of view, and not an in-your-face, vomit-inducing fov, I would rather play a 1080p game with state-of-the-art ray tracing than the normal "hacks" at 4k or even 8k.
 
Caporegime
Joined
17 Mar 2012
Posts
47,534
Location
ARC-L1, Stanton System
Was on Area 18 as well, still remained under 7GB. Problem is that at that resolution, even when they move to Vulkan and fix their servers, an RTX 2080 would be too weak for v-synced 60fps, so settings have to be dropped, ergo vRAM usage going down. It's not like the Vega VII is happily leaving the RTX 2080 behind thanks to its 16GB of HBM2 vRAM. Quite the contrary.

You get my point tho?
 
Man of Honour
Joined
13 Oct 2006
Posts
91,000
Thanks, but I'm not so sure 80 is the top one

Without knowing specifics of architectural changes, etc., it is anyone's guess what form the cards will take - they might make significant changes or they might just continue with refinements of the same architecture. Nothing has been leaked yet which really nails it down, aside from some inferences from the console hardware.
 
Associate
Joined
1 Oct 2009
Posts
1,033
Location
Norwich, UK
I don't know what the sweet spot is, but I would guess around the 12GB mark would be the minimum to get away with ~4k resolutions at the current moment in time. If you plan on having a card now that will last as long as possible, then you need to have that headroom, which is probably why nvidia are now shipping the 20GB 3080 to cover that base.

Nothing at 4k right now needs 12GB of vRAM. Or, when actually measured properly (I'll come back to this), needs 10GB either.

I'd go with that: 12GB would be the minimum for gaming in 2021 and 2022, as we just don't know how they'll use more and more textures and RAM. So yeah, this is why I'm really surprised with the 3080; it had to be down to cost, to keep the "RRP" (which is meaningless anyway) down... can't think of a single other reason to only put 10GB on a state-of-the-art card that is the pinnacle of gaming lol... hey ho, I don't care, not buying one anyway.

Well, the reason not to put more vRAM on the card is that it cannot make use of it for gaming. Any game that comes anywhere close to the 10GB vRAM limit is completely unplayable, because you run into a GPU bottleneck long before you hit 10GB. Putting more memory on the card than the GPU can realistically use costs more money and provides no benefit for gaming.

Check the vRAM on this at 1440p. This is just in the middle of space; cruising above a planet surface, the vRAM overspills. I didn't have the OSD running in the second video, but it's pretty rubber-bandy on any planet surface.

Overspills what? Not 8GB of vRAM.

Problem is getting an accurate gauge on actual vRAM utilisation without identically performing cards with different vRAM amounts (or some really in-depth software analysis).

As something I've mentioned before: using a modded Skyrim, for instance, on both a GTX 780 and a 970, I had to load it up until the vRAM use as per the overlay exceeded 5GB (despite using 3GB and "4GB" (really 3.5GB) cards) before there was any performance impact at all. Other games will hit a wall the moment you go a few MB over the vRAM amount of your GPU, and others like Battlefront will happily allocate 90% of your vRAM (up to a certain size) and sit at that amount almost constantly without actually using more than about 2.8GB at 1440p.

EDIT: Oh I see you are replying to someone on my ignore list - probably why they are talking nonsense.

We've finally got a good way to measure actual vRAM usage now; it was mentioned in the "is 10Gb enough" thread and it's worth repeating here. The latest beta of MSI Afterburner can now display actual process-specific memory usage; the instructions for enabling this OSD are here: https://www.resetera.com/threads/msi-afterburner-can-now-display-per-process-vram.291986/

The bottom line is that almost all the measurement tools we've been using to inspect vRAM usage have been measuring only what the engine has demanded from the video card in terms of allocation (how much memory is reserved exclusively for that process), but this doesn't tell us how much of that allocated memory is actually used by the game engine. Prior to this you either needed to rely on more accurate in-game dev tools, which most games don't have (or you can't get easy access to), or a more recent application called Special K, which hooks the game files to inspect what's going on and has limited support and a bunch of other problems.

Anyway, bottom line is that actual memory usage is always lower than what is allocated, often by a lot; the difference can be as much as 3GB in the case of something like FS2020, where the original 12.5GB at 4k Ultra that was measured in benchmarks is in reality about 9.5GB. This would almost certainly be the reason the Skyrim modding took no performance hit when exceeding the card's vRAM. And as you've rightly pointed out, memory allocation is often done somewhat arbitrarily; a developer may just request 90% of the available vRAM in a system, so a 24GB 3090 might report something like 21GB of vRAM in "use", which is shockingly high, but in reality the game may only need 5-6GB.

So it's worth taking the examples above of high memory usage with a pinch of salt. What those people need to do is visit that link, upgrade MSI Afterburner to the new beta, and enable both vRAM usage as normally measured and the "GPU Dedicated Memory Usage \ Process" vRAM for comparison, to see what is actually in use.

Here is what my quick first test showed with my GTX 1080 on Wolfenstein II, with maxed settings at 4k (minus motion blur, and using SMAA (TX1)): 6.5GB allocated but only 5.4GB in use, a delta of 1.1GB.
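The allocated-vs-used distinction boils down to a simple delta; a throwaway sketch using the figures quoted above (the helper name is made up for illustration, not an Afterburner API):

```python
# Allocated vRAM (what the engine reserves) vs. used vRAM (what it
# actually touches). The figures are the ones quoted in this post.

def vram_slack_gb(allocated_gb: float, used_gb: float) -> float:
    """GB of reserved-but-unused vRAM."""
    return round(allocated_gb - used_gb, 1)

print(vram_slack_gb(12.5, 9.5))  # FS2020, 4k Ultra: 3.0 GB slack
print(vram_slack_gb(6.5, 5.4))   # Wolfenstein II, 4k: 1.1 GB slack
```

The point is that the slack never shows up in allocation-based overlays, which is why the older numbers overstate requirements.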

Wolfenstein-The-New-Colossus-Screenshot-2020-10-17-13-40-05-34.png
 
Last edited:
Soldato
Joined
3 Aug 2015
Posts
7,030
Which company is considered the best AIB for AMD cards? Hoping they’ve got some decent designs. A lot of the Nvidia cards are pretty garish apart from the FE.
 
Associate
Joined
14 Dec 2016
Posts
958
Cheers. Interested to see what the new cards look like and how many slots they take up.



Ok :D

I'd also say Sapphire, but Powercolor tend to be pretty damn good as well.

I avoid MSI as quite frankly they are a joke of a company, and I've had bad experiences with their coolers and backplates on my 290s back in the day.

Asus just tend to be overpriced and their coolers are often shoddy on AMD cards.

Gigabyte want to be Asus, it seems, as they think charging more than other partners makes their cards better. No, no it doesn't, and again certain models of theirs often have terrible coolers.

XFX can be hit and miss, occasionally they have a really good card, not often though.

Asrock have joined recently, but it seems the coolers on their cards are again lacking.

Occasionally you may see HIS as well; their cards are like XFX, hit and miss, but generally have marmite looks.
 
Associate
Joined
14 Dec 2016
Posts
958
The AMD reference card looks bloody lovely though; just hope the cooler is cool 'n' quiet. One thing you don't see often with AMD cards is white ones. I know it sounds odd, but I'd actually like a white card to go in my build lol.

Powercolor Red Devil looks nice, Sapphire Pulse is nice looking as well, and I was a massive fan of the Sapphire 290 Toxic and Tri-X back in the day.

Sounds odd, liking the look of a GPU, but the reality is a lot of us have cases on desks with glass doors; it's quite nice to see clean, themed builds imho.
 