AMD Navi 23 ‘NVIDIA Killer’ GPU Rumored to Support Hardware Ray Tracing, Coming Next Year

Reading your post, I'm starting to agree more with the notion that this isn't the very best case that AMD is showing. As you've stated, it would be marketing suicide to do that.

As noted, it's a game of chess...
AMD is about to unleash hurt, and now you know why Nvidia dropped the prices as they did.

Anyhow,
Most importantly, Herkelman stressed that AMD didn’t state which Radeon RX 6000 graphics card ran these benchmarks. We don’t know whether these results come from the biggest Big Navi GPU, or a more modest offering.

https://www.pcworld.com/article/3585090/amd-radeon-rx-6000-big-navi-performance-tease-rtx-3080.html
 
I'm happy to wait until Christmas. It's not a problem if AMD can't supply cards by Christmas.

The paper launch was pathetic and people pre-ordering are just idiots, but the shoddy launch wouldn't deter me from buying NVIDIA.

What will deter me from buying NVIDIA is if AMD release a superior, more powerful card. I personally think the RTX and DLSS features are probably worth about £50-100, because being able to play the odd AAA game like Cyberpunk at 4K/60-120Hz when it shouldn't otherwise be possible is cool.

So either AMD beats it performance-wise, prices far more competitively to account for the lack of features, or provides a healthy amount more VRAM that is proven to be taken advantage of by some games in the next six months (I'm prepared to wait that long to see how the market develops), or I'll go with the card with the more complete feature set.

I agree about RTX+DLSS on Ampere, but for me Turing was a flop, not only on price (before the Supers) but also because it was still too early, with a lack of games back then.
 
I see RT taking off with the 30 series/RDNA 2, adding a level of pleasure much like the Voodoo cards back in the day.
In that case, since it is predominantly consoles that are pushing RT, would it not be best to wait and see how they are going to affect RT implementation in games going forward? :confused:

For all we know, the way that Control has implemented RT could be different to how future games implement it, which would change the performance delta (assuming there is one).

Edit: You also wouldn't be a beta tester for new hardware and could potentially get a better deal than launch prices.
 
I assume they have access to their own GPUs... including the ones that are not yet on the market.

The point is they used the 2080 Ti because you can easily find someone with a similar setup to verify that AMD aren't intentionally gimping the results to make it look worse than it is; it's all about transparency. If they had used their own 6xxx GPUs you could not verify those results, and at the time the 2080 Ti was the best available GPU on the market. They used it on the previous Zen reveal, and the 3080 is not in sufficient supply for them to use it and for people to verify. It's really that simple: they are covering ALL bases by using hardware that the common man can pick up today and verify with.

This is a new AMD, who are not BSing people; they are being as open and honest as possible, including showing benchmark results where they lose. If you look at their past few reveals they have done this a lot, showing benchmarks where they are behind the competition. It's all about honesty...
 
Most importantly, Herkelman stressed that AMD didn’t state which Radeon RX 6000 graphics card ran these benchmarks. We don’t know whether these results come from the biggest Big Navi GPU, or a more modest offering. (Herkelman also said there’s still fine-tuning left to do before launch.) AMD’s Ryzen 9 5900X, the CPU used for the tease, also hasn’t been tested by independent reviewers.
https://www.pcworld.com/article/3585090/amd-radeon-rx-6000-big-navi-performance-tease-rtx-3080.html


Let's face it, the Ampere launch not only derailed but was the worst launch ever:
-doesn't overclock well without modifying the board itself
-has capacitor imbalance issues causing CTDs and black screens, lol
-a driver update that lowers GPU boost
-other AIBs modifying and revising board designs well after release due to those CTDs and black screens
-low availability due to low yields, not because of demand
-calling the 3080 a "flagship" card while it lacks the performance uplift over a 2080 Ti to justify it
-3090 performance that is only slightly higher than a 3080, yet well below a generational performance jump from a 2080 Ti
-an AIB caught scalping their own customers at double the price
-prices typically higher than OEM overall
-the 8nm process node clearly isn't adequate for the uarch
-only 10GB of VRAM for what they call a "flagship" card, lol
-replacement cards already rumored, creating an artificial EOL for first-gen Ampere

And to make things worse, their lack of execution forces those who would have bought a 3080 to wait for RDNA 2 to release, which will cause those still waiting to consider RDNA 2 instead. That's the icing on the cake right there. AMD set a release date less than a month from now, November 5th, something Nvidia should have done when they released Ampere. Instead of releasing at a later date to build stock, they blew the whole thing by releasing early, expecting you to wait until next year to get one, probably at an inflated price point.

Ampere's release is LOL-worthy, which gave AMD a pass on their presentations, which so far have been flawless.
 
Reading your post, I'm starting to agree more with the notion that this isn't the very best case that AMD is showing. As you've stated, it would be marketing suicide to do that.

Absolutely, and it's supported by how, in that article, Herkelman felt it important to note that they didn't tell you which GPU it was. This is only relevant if those results were produced by an unexpected card: a 6900, possibly a 6800 XT, I would guess. That way they still have the top-dog 6900 XTX to drop, the same as they did with the 5950X.
 
https://www.pcworld.com/article/3585090/amd-radeon-rx-6000-big-navi-performance-tease-rtx-3080.html


Let's face it, the Ampere launch not only derailed but was the worst launch ever:
-doesn't overclock well without modifying the board itself
-has capacitor imbalance issues causing CTDs and black screens, lol
-a driver update that lowers GPU boost
-other AIBs modifying and revising board designs well after release due to those CTDs and black screens
-low availability due to low yields, not because of demand
-calling the 3080 a "flagship" card while it lacks the performance uplift over a 2080 Ti to justify it
-3090 performance that is higher than a 3080, yet well below a 50% performance uplift from a 2080 Ti
-an AIB caught scalping their own customers at double the price
-the 8nm process node clearly isn't adequate for the uarch
-only 10GB of VRAM for what they call a "flagship" card, lol
-replacement cards already rumored, creating an artificial EOL for first-gen Ampere

And to make things worse, their lack of execution forces those who would have bought a 3080 to wait for RDNA 2 to release, which will cause those still waiting to consider RDNA 2 instead. That's the icing on the cake right there. AMD set a release date less than a month from now, November 5th, something Nvidia should have done when they released Ampere. They knew they only had a few thousand worldwide. Instead of releasing at a later date to build stock, they blew the whole thing on day one, with no massive stock replenishment in sight for several months.

Ampere's release is LOL-worthy, which gave AMD a pass on their presentations, which so far have been flawless.

Careful, @Rroff will be along shortly to accuse you of turning over stones to sling mud at Nvidia :(
 
The AMD hype thread is attracting more Nvidia fanboys, as they have nothing to do while waiting for their 2021 cards to be dispatched. It was only a few weeks ago they were shouting from the rooftops: where is AMD, no information, they must not have anything, yada yada yada.
 
Trading blows with the 3080 is what I saw, but you have to remember it was running with Zen 3, so just below is the reality, I think. That doesn't mean it's bad though; the price could blow a 3080 out of the water... We'll have to wait and see.
 
Bet AMD is jebaiting us: show the 6900 XT performance matching the 3080, and then boom, the 6900 XTX smashes the 3090 on the 28th.

I do honestly think AMD has jebaited us, but perhaps not with what's coming out on the 28th.

This is because 80 CUs being the limit doesn't make sense. If that's the biggest they could produce in time for this launch then that's fair, but I don't think that's the limit of Navi 2. AMD dumped GCN, and one of the reasons was to make a gaming card rather than a compute-centric card and to do away with GCN's hard CU limit.

Going from 64 CUs to 80 is not that big a leap after doing all of that work to remove the limit, so I think there's a bigger chip coming at some point.
 
In that case, since it is predominantly consoles that are pushing RT, would it not be best to wait and see how they are going to affect RT implementation in games going forward? :confused:

For all we know, the way that Control has implemented RT could be different to how future games implement it, which would change the performance delta (assuming there is one).

Edit: You also wouldn't be a beta tester for new hardware and could potentially get a better deal than launch prices.

We have already seen how consoles are implementing RT. They show quarter-resolution RT reflections and a reduced number of reflections, e.g. not all NPCs cast a reflection. Don't get me wrong, I think the consoles will do a great job and the PC side even better; they just won't be able to reach the 30 series in RT detail/performance. Remember, this is Nvidia's second-gen RT. I would have jumped on Turing had its RT performance been better.

I think my 3080 TUF OC order cost £700 including VAT. They are now £750 :confused: I'll ignore the 20GB version, as I feel I'll need a new GPU before I find a use for the extra VRAM.

I plan on undervolting and frame capping to keep the temps down. Nothing worse than sweaty gaming.
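For context on why frame capping helps with temps: a cap just stops the GPU rendering more frames than you asked for, so it sits idle for part of every frame instead of running flat out. Here's a minimal sketch of the idea (not any particular tool or driver setting; the 60fps target and function names are just placeholders):

```python
import time

TARGET_FPS = 60                   # hypothetical cap; pick whatever your display/temps call for
FRAME_BUDGET = 1.0 / TARGET_FPS   # seconds allotted to each frame

def render_frame():
    """Placeholder for the real rendering work a game would do."""
    pass

def run(frames=300):
    next_deadline = time.perf_counter()
    for _ in range(frames):
        render_frame()
        next_deadline += FRAME_BUDGET
        # Sleep off whatever is left of this frame's budget so the GPU idles;
        # if the frame ran over budget, skip the sleep and resync the deadline.
        remaining = next_deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)
        else:
            next_deadline = time.perf_counter()

if __name__ == "__main__":
    run()
```

In practice you'd set this in the driver or with a frame limiter rather than in code, but the effect is the same: less work per second, lower power draw, lower temps.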
 
Trading blows with the 3080 is what I saw, but you have to remember it was running with Zen 3, so just below is the reality, I think. That doesn't mean it's bad though; the price could blow a 3080 out of the water... We'll have to wait and see.

That'd suit me. I'm really not bothered about what the absolute flagship £1000+ cards do, because I'm never going to buy one. Normally I tend to buy in the kind of segment the 3070 occupies (I'm looking to trade up from a 1070), but I might've stretched to a 3080 by now if stock had permitted, as I had that level of funds available. So something coming up in that performance space between a 3070 and a 3080 is right up my street (if the price is right).
 
Reading your post, I'm starting to agree more with the notion that this isn't the very best case that AMD is showing. As you've stated, it would be marketing suicide to do that.

Would the hype/buzz not be much greater over the next few weeks running up to 28 October if they had shown something that could clearly beat a 3080? I find it odd they would hold back their best benchmarks or at least not hint at them in some way.
 
Would the hype/buzz not be much greater over the next few weeks running up to 28 October if they had shown something that could clearly beat a 3080? I find it odd they would hold back their best benchmarks or at least not hint at them in some way.

But would that get us talking as much as leaving a bit of mystery to it? Because I think their plan is for us to market it for them by discussing/trolling/fanboying (delete as appropriate). :)
 
My 5700 XT at 2150MHz is around 2070S/2080 level in FPS.
I run mine at 2050MHz, which is actually the default boost clock for the Nitro+, but undervolted to 1160mV. I never really needed the extra fps since I play at 1080p.
Also, I've never had a black screen issue since day one, so I reckon people who have had problems may have had other system components interfering with the 5700 XT.
 
You do realise it was faster for its price than anything Nvidia offer even now, right? I know that will change soon, but it has been faster than the equally priced 2060 Super for its entire lifespan.

The RX 5700 XT is a mid-range 7nm card, not a high-end 16nm card that was magically done on 7nm to save cost for AMD but never for the consumers.
So, no, I don't realise the fake facts.

Just FYI, the 282 mm^2 Radeon HD 4890 was sold for $199 :D once upon a time.
 