AMD Navi 23 ‘NVIDIA Killer’ GPU Rumored to Support Hardware Ray Tracing, Coming Next Year

Star Citizen.
I shouldn't have to break down your own post for you, but it's your whining that made no sense.

Your 2080 is holding you back; "doesn't cut it" is what you said. The 2080 Ti is 35% or so better than your 2080, and the AMD hardware in the consoles is on par with or better than the 2080 Ti. If you don't get that then I can't help you.

How much VRAM does the 2080 have? 8 GB. How much does the 2080 Ti have? 11 GB.

So there's a shortfall of 3 GB already versus your current card. No wonder you're saying "I'm desperate for more FPS to play RDR2 at 4K". It not only has a lower clock speed but also less VRAM. The bandwidth of the 3080 may save it, but clearly the 2080 cannot cope with intensive 4K games.

VRAM was not the reason RDR2 wasn't consistently playable at 4K/60. The 1080 Ti doesn't manage it either, and it has the VRAM, yet it benches almost exactly the same as the 2080. It's down to bandwidth. VRAM is not the sole factor in 4K gaming, whatever AMD fanboys try to make out. For the AMD cards to be competitive and relevant in 4K gaming, they need to match the 3080.
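For what it's worth, a quick back-of-envelope on memory bandwidth (effective data rate × bus width ÷ 8) backs that up. The figures below are the reference specs as I understand them, so treat this as an illustrative sketch rather than gospel:

```python
# Back-of-envelope theoretical memory bandwidth: data rate (GT/s) x bus width (bits) / 8.
# Specs are the publicly listed reference figures as I understand them.
cards = {
    "GTX 1080 Ti": (11.0, 352),   # GDDR5X, 11 GT/s effective, 352-bit bus
    "RTX 2080":    (14.0, 256),   # GDDR6, 14 GT/s, 256-bit bus
    "RTX 2080 Ti": (14.0, 352),   # GDDR6, 14 GT/s, 352-bit bus
    "RTX 3080":    (19.0, 320),   # GDDR6X, 19 GT/s, 320-bit bus
}

for name, (data_rate_gtps, bus_width_bits) in cards.items():
    bandwidth_gb_s = data_rate_gtps * bus_width_bits / 8
    print(f"{name}: {bandwidth_gb_s:.0f} GB/s")

# Roughly 484, 448, 616 and 760 GB/s respectively: the 1080 Ti and 2080 sit in the
# same bandwidth ballpark despite the VRAM difference, while the 3080 has a big
# headroom advantage.
```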

The Radeon VII, with its huge amount of VRAM, posts the same FPS in benchmarks online.

I was responding to a post that said it wanted a 5700 XT with 16 GB of VRAM, which for all intents and purposes would be a waste for 99% of games.

As much as AMD might deliver on VRAM, if it doesn't deliver on speed and can't match the 3080 on benchmarks, then it's just another duff release from AMD.

Please keep my post in the context of the post it was responding to.
 
then it's just another duff release from AMD.
Unless, of course, you're not interested in 4K gaming or can't afford to move past 1080p and just want to play some damn games at a consistent 60Hz instead of swinging your green/red/blue dick around? Or does a GPU only count if it's bleeding your money dry at the very top end?

"duff release" my stinging rectum
 
So DLSS is the only nVidia feature.... that makes it "loaded"? :D

CP2077 uses DXR, which is DirectX Raytracing. Nothing to do with nVidia; it's vendor agnostic, as all graphics tech really should be. nVidia has RT cores to do the ray tracing calculations, but I dare say AMD has a solution as well.
I am planning on the 3080, but I'm at the very least gonna see what AMD has to offer first - and Cyberpunk supporting DLSS isn't enough to force me to make the choice :p


DLSS, Nvidia's image sharpening (which seems to apply to more games than AMD's solution) and, yes, the currently more mature RTX feature set seem to be the main pulls towards Nvidia.

However, if AMD can match those features, I'd happily move back to AMD simply to try and get competition going again, because Nvidia really took advantage of everyone with the 20 series.

Sadly, 4K gamers (like myself) REALLY need all the performance we can get, so if AMD can't match the 3080 I can't afford to go with a slower card that might put me at 45-50 fps while Nvidia's cards sail away at 60 without DLSS and 70-80 with it.
 
Also...

Something dodgy about Nvidia's marketing.

https://twitter.com/HardwareUnboxed/status/1301796954398101504

 
Unless, of course, you're not interested in 4K gaming or can't afford to move past 1080p and just want to play some damn games at a consistent 60Hz instead of swinging your green/red/blue dick around? Or does a GPU only count if it's bleeding your money dry at the very top end?

"duff release" my stinging rectum


AMD is still very, very relevant at the low end and mid-range, I agree. If you game at 1080p or 1440p, then I'd stay away from a new GPU, as I can't think of many games the current crop of GPUs can't play at 1440p/60.
 
Surely if the Xbox GPU can get close to a 2080 Ti at 130 W, it's not going to be that difficult to at least get near a 3080 with a 300 W desktop GPU.
I've seen it compared to the 2080.
The power consumption for the graphics (the biggest power drain) seems very low, as the PSU is supposed to be 300 W.
 
I'm saying, in this instance, it's unwise to gauge how well a graphics card might perform based on the performance of a cut-down version in a console package vs. another graphics card in a game where the LoD is not accurately defined. It barely tells you anything, and really AMD need to give us a better example of what the performance will be, if it can compete, before people start buying up nVidia. Give people more of a reason to wait, if you will.
And I'm saying that if they can get this at 130 W, it stands to reason they can do a lot more with something running at 300 W. The technology is there to do it, so it seems it's going to boil down to margins and whether it's worth them making such a card. It may well come in under the 2080 Ti, but I very much doubt it.
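As a very rough sanity check, here's the usual RDNA FP32 throughput arithmetic (CUs × 64 shaders × 2 ops per clock). The Series X figures are the publicly stated ones; the 300 W desktop part is a purely made-up illustration, not a leak:

```python
def rdna_tflops(cus: int, clock_ghz: float) -> float:
    """FP32 throughput estimate: CUs x 64 shaders x 2 ops (FMA) per clock."""
    return cus * 64 * 2 * clock_ghz / 1000

# Xbox Series X GPU: 52 CUs at 1.825 GHz (publicly stated figures).
print(f"Series X GPU: {rdna_tflops(52, 1.825):.1f} TFLOPS")          # ~12.1

# Purely illustrative desktop part with a bigger power budget;
# 80 CUs at 2.0 GHz are assumptions, not leaked specs.
print(f"Hypothetical 300 W card: {rdna_tflops(80, 2.0):.1f} TFLOPS")  # ~20.5
```

TFLOPS obviously don't translate one-to-one into frame rates, but it shows why a bigger power budget leaves a lot of headroom over the console part.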
 
Very sneaky from them, and I wouldn't put it past them... maybe you should post this in the NVIDIA thread.

There's a 2nd tweet which adds clarity:

https://twitter.com/HardwareUnboxed/status/1301797023000084480

Basically, there will be a range of improvement depending on where the bottleneck sits, just like every game, every generation.

I'm more interested to see how the GPU bottleneck shifts. 1440p/144+ could be CPU limited in some cases going forward. I plan to do some frequency scaling testing at 1440p. RAM is already tuned, so no bottleneck there.
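If anyone wants to do the same, this is roughly how I'd tabulate it: run the same benchmark pass at each CPU frequency, log the averages, and look for where the curve flattens. The file name and column names below are placeholders for illustration only:

```python
import csv
from collections import defaultdict
from statistics import mean

# Hypothetical log: one row per benchmark run, columns "cpu_ghz,avg_fps",
# collected at 1440p using the same in-game test pass each time.
runs = defaultdict(list)
with open("freq_scaling_1440p.csv") as f:          # file name is illustrative
    for row in csv.DictReader(f):
        runs[float(row["cpu_ghz"])].append(float(row["avg_fps"]))

# Where extra CPU frequency stops adding FPS, the GPU has become the bottleneck.
for ghz in sorted(runs):
    print(f"{ghz:.1f} GHz -> {mean(runs[ghz]):.1f} fps avg over {len(runs[ghz])} runs")
```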
 
I did, it'll get ignored...


I put nothing past AMD and NVIDIA when it comes to marketing tricks, sadly.

Once the GPUs are in reviewers' hands, we can get a better understanding of what's going on and what the real performance benefit is.

I might put off a GPU purchase until Cyberpunk releases. CD Projekt always seem to push the envelope tech-wise with their releases, so I think it'd be a decent benchmark to go off: it will likely be the most demanding game for the next 12-24 months, which is a fairly reasonable lifecycle over which I'd expect my GPU to keep pushing the envelope at the top tier.


After all, even today The Witcher 3 at 4K is very demanding. If the 3080 and Big Navi can both ace Cyberpunk maxed out, with every setting turned on including ray tracing, then we can rest assured both cards have good longevity. And if they match each other, then VRAM does seem to be the factor that will likely be key in deciding which to put our £££££ into. If, however, the 3080 destroys it and offers a superior experience and FPS, then it sways back to the evil green team's favour.


come on AMD!! show us what u got!
 
I put nothing past AMD and NVIDIA when it comes to marketing tricks, sadly.

Once the GPUs are in reviewers' hands, we can get a better understanding of what's going on and what the real performance benefit is.

I might put off a GPU purchase until Cyberpunk releases. CD Projekt always seem to push the envelope tech-wise with their releases, so I think it'd be a decent benchmark to go off: it will likely be the most demanding game for the next 12-24 months, which is a fairly reasonable lifecycle over which I'd expect my GPU to keep pushing the envelope at the top tier.


After all, even today The Witcher 3 at 4K is very demanding. If the 3080 and Big Navi can both ace Cyberpunk maxed out, with every setting turned on including ray tracing, then we can rest assured both cards have good longevity. And if they match each other, then VRAM does seem to be the factor that will likely be key in deciding which to put our £££££ into.

Just wait for the reviews, Hardware Unboxed are the most trustworthy, completely so...
 
Just doubling the 5700 XT's die size would put it above a 3070, and that's before the clock, IPC and power efficiency improvements which RDNA2 will bring.

I think AMD will have the performance, but can they match Nvidia's pricing?
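To put a crude number on that doubling claim: a simple scaling model (performance roughly proportional to CU ratio to some power α < 1, times clock ratio, times IPC uplift) lands around 2x a 5700 XT. Every figure below is an assumption for illustration, not an RDNA2 spec:

```python
# Crude scaling model: perf ~ (CU ratio ** alpha) * clock ratio * IPC uplift.
# alpha < 1 reflects imperfect scaling on wider GPUs; all numbers here are
# illustrative assumptions, not leaked RDNA2 specs.
base_cus, base_clock = 40, 1.9      # 5700 XT ballpark
big_cus, big_clock = 80, 2.0        # hypothetical "doubled" part
alpha, ipc_uplift = 0.85, 1.10

scaling = (big_cus / base_cus) ** alpha * (big_clock / base_clock) * ipc_uplift
print(f"Estimated uplift over a 5700 XT: ~{scaling:.2f}x")   # ~2.1x under these assumptions
```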
 
1440p using 8 GB; I have the GPU muscle to run 4K but the VRAM chokes...
I don't play it, but it's renowned for being, to put it politely, not very well optimized.

I thought the built-in granularity of most PC games meant settings could be adjusted to improve performance?
 
I don't play it, but it's renowned for being, to put it politely, not very well optimized.

I thought the built-in granularity of most PC games meant settings could be adjusted to improve performance?
It's server locked: on the rare occasions I get a fresh server it runs at 100 FPS at 1440p on the highest settings, while with full servers it's 40 to 70 FPS depending on where I am. Upping the resolution to 1800p doesn't bring the frame rates down, it just brings the GPU load up. At 1800p it's mostly playable if I'm in space; on a moon or planet surface it stutters and my system RAM also reaches full capacity. At 4K it's unplayable.
 