
AMD Navi 23 ‘NVIDIA Killer’ GPU Rumored to Support Hardware Ray Tracing, Coming Next Year

Status: Not open for further replies.
Found this interesting article by Nerdtechgasm that compares performance per transistor between AMD and Nvidia. It's a free article at the time of writing.
https://www.patreon.com/posts/short-analysis-41489839

Interestingly, AMD has been getting more efficient with their transistor use, while Nvidia is getting less efficient (in terms of performance). He reckons that Nvidia sees themselves as so far ahead of AMD that they have decided to no longer make gaming-focused cards and are instead making them more compute-focused. I would recommend watching his latest video on this.

Edit: Forgot to mention: the 5700 XT has 10.3B transistors and the Radeon VII 13.3B, yet both have similar performance. The Vega 64 has 12.5B transistors and is much slower than either card. AMD have been making up ground slowly, but they are moving.
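To make that comparison concrete, here's a rough back-of-the-envelope performance-per-transistor calculation using the counts above. The relative performance figures are illustrative assumptions only (5700 XT and Radeon VII treated as equal, Vega 64 guessed at ~25% slower), not measured data:

```python
# Rough performance-per-transistor comparison using the transistor
# counts from the post. Relative performance values are illustrative
# assumptions, normalised to the 5700 XT = 1.0.
def perf_per_transistor(rel_perf: float, transistors_billions: float) -> float:
    """Return relative performance per billion transistors."""
    return rel_perf / transistors_billions

cards = {
    "5700 XT":    (1.00, 10.3),  # similar performance, fewest transistors
    "Radeon VII": (1.00, 13.3),  # similar performance, more transistors
    "Vega 64":    (0.75, 12.5),  # assumed ~25% slower (illustrative guess)
}

for name, (perf, count) in cards.items():
    print(f"{name}: {perf_per_transistor(perf, count):.3f} per billion transistors")
```

On these numbers the 5700 XT gets roughly 30% more performance out of each transistor than the Radeon VII, which is the kind of efficiency trend the article is pointing at.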
 
This is my worry.

Going to go for the 3080 FE as I fear stock will be limited on these bad boys.


I think you are being too negative now, humbug, whereas before you were a bit too optimistic.
Truth is, it will probably be somewhere in the middle.

80 CU Navi is probably long dead as a gamer's GPU.

I don't see anything wrong with an Xbox clone: smallish, 16GB of RAM and probably more efficient than you think. It just needs to be priced right.

He needs to be way off, because I can't see many people buying 300-watt slower-than-3070 cards for £450, even with 16GB. Why not just buy the 3070 and deal with much less heat?
 
If AMD aren't interested in competing with the 3080, I just want a 16GB card at around 5700 XT level for under £350; I'm not interested in a slower-than-2080 Ti card at £450+.


Why are you so obsessed with VRAM if it's not going to have the raw power to plough through games at 4k and reach 60fps anyway?

AMD need to match the 3080 in rasterisation. They're already behind from an RT/DLSS perspective (until proven otherwise).

If you just want VRAM, why not grab a Radeon VII?
 
Bang for buck, isn't it already a foregone conclusion that AMD will grab that crown? It's always been the case.

The only question in my mind is at the 3080 end of the spectrum. AMD can give us as much VRAM as they want, but if it's not going to be able to keep up with the FPS of the 3080, it's all for nothing.
 
If AMD aren't interested in competing with the 3080, I just want a 16GB card at around 5700 XT level for under £400; I'm not interested in a slower-than-2080 Ti card at £450+.
I've got my sights set on relatively high-resolution VR; that's really all I care about. I know even the 5700 XT is going to satisfy my requirements for "normal" games no problem, but any lag in VR is 10x more annoying than in regular gaming. You really need as much GPU power as you can get, so hopefully it's equal to a 2080 Ti in that department. Anything higher than that and I'll be super happy, especially for £450. And 16GB. I'm not too interested in ray tracing at the moment either, tbh.
 


For VR, I'd suggest NVIDIA.

From all the benchmarks I've seen, NVIDIA seems to consistently do better in VR. That's before even counting VRSS and the mention of DLSS coming to VR for some titles later down the line. I've yet to encounter a VR game that my 2080 hasn't been able to easily power through VRAM-wise (VR games just aren't that complex and don't have many complex textures), and that VRSS thing was really something fantastic in Boneworks and a glimpse into the future.

With an HP Reverb G2, you will need as much raw power as you can get. I personally wouldn't settle for a 5700XT or its successor over a 3080.
 
Why are you so obsessed with VRAM if it's not going to have the raw power to plough through games at 4k and reach 60fps anyway?

AMD need to match the 3080 in rasterisation. They're already behind from an RT/DLSS perspective (until proven otherwise).

If you just want VRAM, why not grab a Radeon VII?

I know; I'm not the one saying AMD can't match the 3080.

I'm looking for more VRAM because I'm VRAM-constrained. If Coreteks is right, the only thing AMD users have to look forward to is a step up in performance and more VRAM.
 
For VR, I'd suggest NVIDIA.

From all the benchmarks I've seen, NVIDIA seems to consistently do better in VR. That's before even counting VRSS and the mention of DLSS coming to VR for some titles later down the line. I've yet to encounter a VR game that my 2080 hasn't been able to easily power through VRAM-wise (VR games just aren't that complex and don't have many complex textures), and that VRSS thing was really something fantastic in Boneworks and a glimpse into the future.

With an HP Reverb G2, you will need as much raw power as you can get. I personally wouldn't settle for a 5700XT or its successor over a 3080.
You're right that the Nvidia cards are better at VR, but they're not in the same price brackets, so you're still paying more for that performance. The 5700 XT does a good job and is comparable to the regular 2070 in VR. Hopefully next gen they can improve VR performance; I'll just have to wait and see. If it turns out to be a mammoth performance gap then sure, I'll wait a bit longer and get the 3080 20GB, but I'll still be reluctant to spend £700+ on a GPU, tbh.
I've noticed 6GB can get choppy in places with HL on my laptop, so I'm not too confident that 10GB will last that long. And with cards having 16 or 24GB, there's a good chance developers will add higher-resolution textures as an option. That, coupled with the increased resolution, means I just want to be sure I'm future-proofed.
 


I agree, but for VR, as you said, it's important to be hitting the refresh rate or it's REALLY jarring, so I'd personally pay more for the best if that's your priority.

I don't think so. VR is very low-end re: textures and graphical optimisations; VR games are far behind 2D/pancake games. We should be more concerned about whether we even get high-end developers to make games for VR. I'd definitely take speed and FPS over the extremely small possibility that we're going to get high-res super textures in our VR games from AAA developers.

Even fully modded 4K-texture Skyrim isn't much of a problem for 8GB of VRAM before it becomes an issue for my puny 2080 in terms of raw speed and FPS.

Your 16GB or 100GB of VRAM is not going to future proof you at all if your card is too slow. I don't know where this obsession with VRAM has come from recently. I remember this same discussion about the GTX 670 2GB vs 4GB... and both became obsolete at the same time. A slow card is a slow card.


I guarantee that the 6900XT 16GB (if it comes) will become irrelevant and need upgrading at the same time as the 3080 for 99.999% of games and all VR games.
 
Right, finished watching that video.

I think what he's saying is a bit daft but i wouldn't put it past AMD to scrap big Navi and just push us off with a much smaller GPU.

What I can't get to grips with in that video: he says the 80 CU Big Navi is at best 2080 Ti +15%, so what he's saying is 5700 XT + 100% CUs + 20% clock speeds (+120% total) for +35% performance.
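For what it's worth, that scaling arithmetic can be sanity-checked. CU count and clock speed compound multiplicatively, so a hypothetical perfect-scaling ceiling would be 2.0 × 1.2 = 2.4x the 5700 XT (+140%, a little above the +120% additive figure in the post); real GPUs land well below that ceiling, but the claimed +35% is still a very long way under it. A quick sketch, with the +35% read as an uplift over the 5700 XT as the post frames it:

```python
# Theoretical upper bound for scaling a GPU by CU count and clock speed,
# assuming perfect (multiplicative) scaling -- real designs fall short
# of this because of memory bandwidth, power and scheduling limits.
def scaling_ceiling(cu_factor: float, clock_factor: float) -> float:
    return cu_factor * clock_factor

big_navi_ceiling = scaling_ceiling(2.0, 1.2)  # 2x the 5700 XT's CUs, +20% clocks
claimed_uplift = 1.35                         # the post's reading of the video's claim

print(f"perfect-scaling ceiling: {big_navi_ceiling:.1f}x")
print(f"claimed uplift:          {claimed_uplift:.2f}x")
```

The gap between 2.4x and 1.35x is exactly what the poster is objecting to: the claim implies Big Navi would waste most of its extra silicon.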

Whatever dude you're not too bright.

Anyway, let's ignore that idiotic crap and just assume his performance claims are correct.

So, would you pay £450 to £500 for a 16GB GPU that sits in the middle between a 3070 and a 3080? Oh, and let's throw 300 watts in there as well, just in case AMD go with a GPU so small they have to clock the absolute crap out of it to achieve that.

Yes or No?

It's a no from me.

Thanks, no way I could get through it. I listened to the first half-sentence and had to turn it off.

The guy has been saying 2080 Ti +15% from the start. Joke's on him though, because based on current leaks that's pretty close to the 3080 :D

Anyway, we already know AMD have a card that is 30% better than a stock 2080 Ti, so there goes his *theory*.

He likely doesn't have sources; he can't reason, he can't apply critical thought, and he probably can't research.

He's going to look pretty stupid once RDNA 2 launches, and coupled with his coprocessor claim that turned out to be rubbish, people might wake up to the fact that he just guesses.
 
I know; I'm not the one saying AMD can't match the 3080.

I'm looking for more VRAM because I'm VRAM-constrained. If Coreteks is right, the only thing AMD users have to look forward to is a step up in performance and more VRAM.


Fair enough. Which games are you playing that are leaving you VRAM-constrained? Or are you video editing?


I've yet to see VRAM holding back this puny 2080 before the 2080 itself holds itself back by simply not being fast enough.
 
Thanks, no way I could get through it. I listened to the first half-sentence and had to turn it off.

The guy has been saying 2080 Ti +15% from the start. Joke's on him though, because based on current leaks that's pretty close to the 3080 :D

Anyway, we already know AMD have a card that is 30% better than a stock 2080 Ti, so there goes his *theory*.

He likely doesn't have sources; he can't reason, he can't apply critical thought, and he probably can't research.

He's going to look pretty stupid once RDNA 2 launches, and coupled with his coprocessor claim that turned out to be rubbish, people might wake up to the fact that he just guesses.

I think he's being trolled, and yes, he doesn't have the capacity for critical thinking; believe it or not, many people are like that.
 
Fair enough. Which games are you playing that are leaving you VRAM-constrained? Or are you video editing?


I've yet to see VRAM holding back this puny 2080 before the 2080 itself holds itself back by simply not being fast enough.

Star Citizen
 
When the **** does he breathe? It's like he's talking in one long sentence. Is it some sort of synthetic voice he's using, or is he just weird as hell?
No, that's his voice. He discussed it briefly when he was a guest on MLID's Broken Silicon podcast earlier in the year. Essentially he's Portuguese and he struggles translating in his head, so his English comes out delayed and drawn out.
 
Speaking of MLID, go check out his video about Uncle Jensen's ultimate customer fleecing action plan (can't link at work). Or if you don't want to listen then hit up mooreslawisdead.com and read the article instead.

Whether it's true or not remains to be seen, but it does sound like a very Nvidia thing to do.
 
I agree, but for VR, as you said, it's important to be hitting the refresh rate or it's REALLY jarring, so I'd personally pay more for the best if that's your priority.

I don't think so. VR is very low-end re: textures and graphical optimisations; VR games are far behind 2D/pancake games. We should be more concerned about whether we even get high-end developers to make games for VR. I'd definitely take speed and FPS over the extremely small possibility that we're going to get high-res super textures in our VR games from AAA developers.

Even fully modded 4K-texture Skyrim isn't much of a problem for 8GB of VRAM before it becomes an issue for my puny 2080 in terms of raw speed and FPS.

Your 16GB or 100GB of VRAM is not going to future proof you at all if your card is too slow. I don't know where this obsession with VRAM has come from recently. I remember this same discussion about the GTX 670 2GB vs 4GB... and both became obsolete at the same time. A slow card is a slow card.


I guarantee that the 6900XT 16GB (if it comes) will become irrelevant and need upgrading at the same time as the 3080 for 99.999% of games and all VR games.
I get what you're trying to say, but I've read other forum threads where people have said Skyrim has used up to 10.5GB of VRAM if you mod the hell out of it. I haven't mentioned it, but I have other uses for high memory anyway, such as rendering very high-fidelity models, scenes, 3D animation etc. Currently I use 3ds Max, which unfortunately doesn't support AMD GPUs for rendering, but I'll be moving to Blender after uni ends, which does support them.
Like I said, it boils down to price. If the 3080 offers much better than linear price/performance for VR I'll probably go for it, but if it's 20% more money for 20% extra frames I likely won't bother shelling out another 200+ quid for one if a 6700 XT does the job.
 