
AMD Navi 23 ‘NVIDIA Killer’ GPU Rumored to Support Hardware Ray Tracing, Coming Next Year

Status
Not open for further replies.
Haha, you know what, I just reread it: Wccftech are talking about existing AMD hardware, not RDNA2. All the people on this forum jumping on this article and claiming RDNA2 won't support RT in CP2077 at launch are laughable. I expect most of them to cease posting after 28th Oct, when all their garbage rumours are finally put to bed.

Yep, see what I wrote earlier: it's the boo brigade. I don't see why there is so much effort made to make AMD out to be rubbish.

...not the first time from guys who aren't interested in AMD hardware except to point out how good Nvidia are.
 
Adaptive sync? It's a thing NV called G-Sync and made a show of "making it premium" instead of bettering the whole industry.

Adaptive sync is a complicated one. Variable refresh rate existed as an embedded display standard for years with no interest from any party in developing a desktop standard. When Nvidia started showing interest in making it a desktop standard, those with a commercial interest, such as companies that produce air traffic control displays, were obstructive, and AMD was uninterested.

The whole industry is responsible for it not being bettered. The current adaptive sync implementation is still a Frankenstein that misuses panel self-refresh and other features to make it work, which is sub-optimal, and that is dragging out its replacement with a better standard that has the full feature set available via the G-Sync module, and more.
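To make the timing constraint concrete, here's a rough sketch of what a VRR panel actually has to do: hold off the next refresh until the frame is ready, but only within its supported refresh window. The 48-144 Hz window and the function name are illustrative assumptions, not from any spec or driver:

```python
def next_scanout_delay_ms(frame_time_ms: float,
                          min_hz: float = 48.0,
                          max_hz: float = 144.0) -> float:
    """When the next refresh can start, clamped to the panel's VRR window.

    A panel supporting min_hz..max_hz can hold vblank no longer than
    1/min_hz between refreshes and cannot refresh faster than 1/max_hz.
    The 48-144 Hz window here is just an assumed example range.
    """
    longest_ms = 1000.0 / min_hz    # ~20.8 ms: panel must refresh by now
    shortest_ms = 1000.0 / max_hz   # ~6.9 ms: panel can't refresh sooner
    return min(max(frame_time_ms, shortest_ms), longest_ms)
```

Anything outside that window (a 25 fps dip, say) can't be matched directly, which is where the G-Sync module's wider handling comes in.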
 
He describes in that video (at the 23:15 mark) a valid scenario in which Big Navi might actually be roughly as fast as the 2080 Ti. His reasoning is sound.

Back to the drawing board?

I can't take any more guesses from anyone; there comes a point where it all becomes too much. I'm ready to stop reading any fake news until the 28th, when Lisa gives us real information.
 
Adaptive sync? It's a thing NV called G-Sync and made a show of "making it premium" instead of bettering the whole industry.

Maybe because FreeSync is worse? They could have supported it from the start and got a bad rep from all the issues FreeSync monitors have atm. You can get an OK one or a horrible one. With G-Sync monitors using the module, you know you're getting a good or very good monitor that won't have issues with VRR, plus it's supported from 1 fps up to whatever highest fps it can achieve, not just 48-144 or worse.

So again, blocking what exactly?
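For what it's worth, the narrow-range problem on those 48-144 panels is usually papered over with low-framerate compensation (LFC): below the panel's minimum refresh, each frame is shown multiple times so the effective refresh rate lands back inside the window. A hypothetical sketch (the function name and the 48-144 Hz range are mine for illustration, not any real driver's API):

```python
def lfc_refresh_hz(fps: float, vrr_min: float = 48.0,
                   vrr_max: float = 144.0) -> tuple[int, float]:
    """Return (frames_repeated, effective_refresh_hz) for a given frame rate.

    The 48-144 Hz window matches the range complained about above;
    names and defaults are assumptions for illustration only.
    """
    if fps >= vrr_min:
        # Inside the native window: one scan-out per frame,
        # capped at the panel's maximum refresh.
        return 1, min(fps, vrr_max)
    # Below the window: repeat each frame enough times that the
    # panel's actual refresh rate lands back inside [vrr_min, vrr_max].
    repeats = 2
    while fps * repeats < vrr_min:
        repeats += 1
    return repeats, fps * repeats
```

So a 30 fps dip on a 48-144 Hz panel gets each frame scanned out twice at an effective 60 Hz, rather than falling out of VRR entirely.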
 
I can't take any more guesses from anyone; there comes a point where it all becomes too much. I'm ready to stop reading any fake news until the 28th, when Lisa gives us real information.
This thread over the last 20 or so pages has been 99% egotistical fanboy BS... if you are sick of it, just take a break from it for a few days and read WCCF and Videocardz, and you will still be up to date. :)
 
Ampere leaks were spot-on..
I would still like people to form their independent opinions.. emotions run strong in this thread
I am just checking that people aren't linking to any old crap from Twitter. So it would be good if, when first posting these links, people could add a short note as to why they are credible (e.g. this person's Ampere leaks were spot on). :)

If this is true, it would suggest that the card AMD showed during the presentation that was on a par with the 3080 was indeed the 6900 XT.
 
I am just checking that people aren't linking to any old crap from Twitter. So it would be good if, when first posting these links, people could add a short note as to why they are credible (e.g. this person's Ampere leaks were spot on). :)

If this is true, it would suggest that the card AMD showed during the presentation that was on a par with the 3080 was indeed the 6900 XT.

I think it was. They will likely just say they have a premium flavour to offer to enthusiasts that can trump it and, for all intents and purposes, beat a 3090 in some things. The only time this will excite me, though, is if it's a) available to buy and b) doesn't cost anywhere near a 3090.
 
Fresh leaks:

https://mobile.twitter.com/KittyYYuko/status/1318072755519062016

There is no XTX or LE.
It should be: XT, XTL, XL (what's in a name... 3 SKUs nonetheless)

https://mobile.twitter.com/KittyYYuko/status/1318073487060127745

22 may also be on the stage with 21.
22 has a dual-fan cooler; 21 has a triple-fan.

Oh golly gosh. XTX could be DOA.
Let's hope RT in cyberpunk is not.

I know you probably meant this as a joke, but I have seen you make obviously biased statements against AMD multiple times. This one is just ridiculous, though, after the power hogs Nvidia just released.

You need to chill bro, it was a joke.
 
Maybe because FreeSync is worse? They could have supported it from the start and got a bad rep from all the issues FreeSync monitors have atm. You can get an OK one or a horrible one. With G-Sync monitors using the module, you know you're getting a good or very good monitor that won't have issues with VRR, plus it's supported from 1 fps up to whatever highest fps it can achieve, not just 48-144 or worse.

So again, blocking what exactly?

Yet G-Sync is the same thing now :p
 