
AMD Navi 23 ‘NVIDIA Killer’ GPU Rumored to Support Hardware Ray Tracing, Coming Next Year

PMSL. I don't know why people get defensive and negative; if it causes you to do this then don't come on here and read/post, go play games or something. If you're also anti-AMD then why bother?
Lol I am far from anti-AMD, I have a 5700XT Red Devil and a 3800X, a 1700 prior to that, Vega 64, 290, 7870, 6950 etc etc.

My feet are firmly in the AMD camp, which is probably why I'm expecting them to not be as cheap as we all hope they will be :(

I'm preparing for the worst so I can be pleasantly surprised if it's actually a lot better.
 
This then distorts the thread to appear to be AMD vs Nvidia, which is what we're trying our best not to let it slip into.

In that case LOL some people would be better off not bringing gripes over nVidia into the thread. Personally, the only times I've touched on nVidia in this thread are in reply to posts like people bringing up "LOL POSCAPs".
 
Benchmarks from the Zen 3 presentation, now with more benches compared.

Averages: -2.32%, -5.76% and +2.37% vs. the RTX 3080



https://twitter.com/planet3dnow/status/1315028334355591168

I think the versus-5700XT numbers are the interesting stuff here: if you just took the specs of the 5700XT and doubled them, you generally wouldn't get anywhere near double the performance.
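As a rough illustration of why doubling shader resources doesn't double frame rates, here's an Amdahl-style sketch. The 15% "serial fraction" (bandwidth limits, fixed-function work, driver overhead) is an invented number for illustration, not a measured figure for any real card.

```python
# Illustrative sketch: why doubling GPU resources rarely doubles
# performance. serial_fraction is a made-up assumption.

def scaled_performance(resource_multiplier, serial_fraction):
    """Amdahl-style estimate: only the parallel fraction of a frame's
    work speeds up when shader resources are multiplied."""
    parallel_fraction = 1.0 - serial_fraction
    return 1.0 / (serial_fraction + parallel_fraction / resource_multiplier)

# Doubling resources with 15% of frame time unaffected gives well under 2x:
speedup = scaled_performance(2.0, 0.15)
print(f"{speedup:.2f}x")  # ~1.74x, not 2.0x
```

With a perfectly parallel workload (serial fraction 0) the model does give exactly 2x, which is why synthetic throughput tests can scale much better than games do.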
 
Think I’d happily switch to AMD if the new GPU gets within 10% of the 3080 at a lower price point... but both my monitors are G-Sync, and as far as I’m aware (please correct me if wrong) only the very latest G-Sync modules are able to use adaptive sync natively, therefore allowing a FreeSync card to enjoy the benefits of adaptive sync on a G-Sync monitor. So I'm locked into Nvidia unless I replace the monitor too!

If your monitor is G-Sync then you are going to have to stick with Nvidia for your adaptive sync needs. After Nvidia lost the adaptive sync wars with AMD they started "certifying" Freesync monitors as "G-Sync certified" but they are actually Freesync monitors that have allegedly "passed" some arbitrary Nvidia internal testing and have been given a sticker for Nvidia marketing purposes.

You are unfortunately a victim of Nvidia's excellent marketing, which got them to lock so many G-Sync buyers to the Nvidia brand. I expect Rorff or some other NDF types to come here stating G-Sync is the bestest tech and way better than Freesync. Having used both, the end result is identical to the end user, but the Freesync panel was considerably cheaper.

Sorry to go off on a semi-tangent and please don't take the above as a personal attack. I just hate to see how Nvidia keep doing this locked down black box approach to GPU tech. Open standards are always the best option for consumers and it is one of the reasons I prefer AMD over Nvidia. I know AMD are a faceless corporation but it's these little things that make them more consumer friendly than Nvidia. For example, you see Nvidia marketing benchmarks for 3080 and think, knock 15% - 20% off that and that's what will be true. Lo and behold it actually ended up exactly like this. AMD's marketing for RX6000 will be pretty much in the ballpark based on their previous history of these things.
 
Think I’d happily switch to AMD if the new GPU gets within 10% of the 3080 at a lower price point... but both my monitors are G-Sync, and as far as I’m aware (please correct me if wrong) only the very latest G-Sync modules are able to use adaptive sync natively, therefore allowing a FreeSync card to enjoy the benefits of adaptive sync on a G-Sync monitor. So I'm locked into Nvidia unless I replace the monitor too!

I'm in the same boat. Had been planning to skip this gen, but the 3080 at MSRP was more than I was expecting and piqued my interest. Now with these hints of RDNA2 performance I'm very interested. But, as with you, I'm on a 2015 G-Sync monitor, and while I could sell it and buy a similar-spec Freesync model without much money loss, it's a hassle I'm not sure I can be bothered with. And yet if the price is right I may consider the switch.

I'll have plenty of time to see how it all unfolds; I don't foresee basic 3080 models hitting MSRP for a long time, and if RDNA2 is as good as it looks, I guess they may have supply issues too if it proves popular.
 
It's a different architecture, no?

I'm saying it points to some of the changes AMD have made.

I expect Rorff or some other NDF types to come here stating G-Sync is the bestest tech and way better than Freesync. Having used both, the end result is identical to the end user, but the Freesync panel was considerably cheaper.

I imagine many people won't see much difference between them, but G-Sync technically is the better technology, as I've mentioned before: it has dynamic adaptive overdrive, better low-framerate recovery as it doesn't rely on PSR, and better support for applications and games that use borderless fullscreen and other windowed modes, as it can use the FPGA to work around the junk way MS has implemented the display manager (although they keep breaking that in Windows 10).

(Point being this is why I chose it and why I will continue to choose it)
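The low-framerate handling being discussed here can be sketched roughly. This is a generic model of low-framerate compensation (LFC) as used by adaptive sync implementations, not any vendor's actual algorithm, and the parameter names and refresh ranges are illustrative: when the frame rate drops below the panel's minimum refresh, frames are repeated so the effective refresh stays inside the supported range.

```python
# Rough sketch of low-framerate compensation: repeat frames so the
# effective refresh rate stays within the panel's supported range.
# All numbers and names are illustrative assumptions.

def lfc_refresh(frame_rate, min_hz, max_hz):
    """Return (multiplier, refresh) keeping refresh within [min_hz, max_hz]."""
    if frame_rate >= min_hz:
        return 1, frame_rate          # in range: refresh tracks frame rate
    multiplier = 1
    refresh = frame_rate
    while refresh < min_hz:
        multiplier += 1
        refresh = frame_rate * multiplier
    if refresh > max_hz:              # can't fit the range: clamp
        refresh = max_hz
    return multiplier, refresh

print(lfc_refresh(25, 48, 144))  # (2, 50): each frame shown twice at 50 Hz
```

This also shows why a wide refresh range matters: a panel whose maximum is less than roughly double its minimum can't always find a workable multiplier, which is one reason low-framerate behavior differs between displays.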
 
Lol I am far from anti-AMD, I have a 5700XT Red Devil and a 3800X, a 1700 prior to that, Vega 64, 290, 7870, 6950 etc etc.

My feet are firmly in the AMD camp, which is probably why I'm expecting them to not be as cheap as we all hope they will be :(

I'm preparing for the worst so I can be pleasantly surprised if it's actually a lot better.

Not you buddy, I was referring to your reply, as I was agreeing with your sentiments :D

In that case LOL some people would be better off not bringing gripes over nVidia into the thread. Personally, the only times I've touched on nVidia in this thread are in reply to posts like people bringing up "LOL POSCAPs".

You call people out for being childish... don't stoop to the same level then.
 
I can't work out if Richdog is sane, or whether he's the Robin to Rroff's Batman? Sometimes you guys post sense, then other times you're coming across as a murky world of bitter anti-AMD-ness, which is why I can see you are labelled as fanboys.

That doesn't even make logical sense, and you are just being extremely childish, outright making that baseless stuff up. I give Roff grief for his posts when I don't agree with them, the same as anyone else.

Also, what about my posts is 'anti-AMD', when I happily use AMD CPUs and will happily buy an AMD GPU this generation, as I have repeatedly posted? Find me ONE 'anti-AMD' post I have made... you won't be able to, unless you really do stretch the boundaries of BS even more than you are doing now.

I have also never been labelled as a fanboy of anything except maybe in your own mind, I am as I have always been... brand neutral. :)
 
Can't everyone just grow up? Not a single 'fact' has been published, and there are arguments all over the shop about red vs green, 5nm vs 7nm, etc.

Would you all behave in such an aggressive manner face to face? Unlikely!
 
I tend to take most notice of those who put their money where their mouth is. If no-one bought AMD hardware where would we be? No competition.

Whatever the performance, if the price is right, we have competition. Just because a few PC elitists don't get cheap high-end performance doesn't mean there's been a failure.

As with many hobbies I participate in my concern is the younger end just coming in. If prices are too high it discourages them. As long as AMD continue to service the low/mid market I don't see them having any issues. To be honest I think a lot of the high prices are fueled by the older gamers with good jobs, prepared to pay whatever, regardless, for the best. That's fine but they're a very small part of the market. If I was AMD I'd be aiming for huge sales of 1080p and 1440p cards.
 
Not sure how much difference it will make in 4K with the settings used in those benchmarks.

Yeah exactly. As shown in Hardware Unboxed's videos comparing the 10900K and the 3950X when deciding which CPU to test with, at 4K there is no difference, whereas once you drop down to 1440p and 1080p the 10900K opens up the lead you'd expect it to have.
 
Yeah exactly. As shown in Hardware Unboxed's videos comparing the 10900K and the 3950X when deciding which CPU to test with, at 4K there is no difference, whereas once you drop down to 1440p and 1080p the 10900K opens up the lead you'd expect it to have.

Ironically the high-end CPUs are for low-resolution gamers :p
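The 4K-vs-1080p observation above boils down to: delivered frame rate is capped by whichever of the CPU or GPU is slower, and the GPU cap drops with resolution while the CPU cap doesn't. A toy sketch, with all frame-rate numbers invented for illustration (not measurements of the 10900K or 3950X):

```python
# Toy model of why CPU choice matters less at 4K: delivered fps is the
# minimum of the CPU cap (resolution-independent) and the GPU cap
# (falls with resolution). All numbers are invented for illustration.

def delivered_fps(cpu_fps, gpu_fps_by_res, resolution):
    return min(cpu_fps, gpu_fps_by_res[resolution])

gpu_caps = {"1080p": 240, "1440p": 160, "4K": 80}

for cpu_name, cpu_cap in [("fast CPU", 200), ("slower CPU", 150)]:
    fps = {res: delivered_fps(cpu_cap, gpu_caps, res) for res in gpu_caps}
    print(cpu_name, fps)
# At 4K both CPUs deliver the GPU-limited 80 fps;
# at 1080p the faster CPU pulls ahead (200 vs 150).
```

This is also why reviewers test CPUs at low resolutions: it pushes the GPU cap high enough that the CPU becomes the limiting factor.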
 