AMD Navi 23 ‘NVIDIA Killer’ GPU Rumored to Support Hardware Ray Tracing, Coming Next Year

Status
Not open for further replies.
Soldato
Joined
30 Jun 2019
Posts
7,876
Xbox One X targeted a peak of 4K, supported by a fill rate of 37.5 Gpixel/s.
Xbox Series X is targeting a peak of 8K, which would mean it requires 37.5 × 4 = 150 Gpixel/s. At a 1.825 GHz clock that works out to approximately 82 ROPs.
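For what it's worth, the arithmetic checks out as napkin math (the 37.5 Gpixel/s and 1.825 GHz figures are the rumoured numbers above, and one pixel per ROP per clock is a simplification):

```python
# Napkin math from the post above: 8K needs 4x the fill rate of 4K,
# and each ROP writes roughly one pixel per clock.
xbox_one_x_fill = 37.5                 # Gpixel/s peak (4K target)
series_x_fill = xbox_one_x_fill * 4    # 8K target -> 150 Gpixel/s
clock_ghz = 1.825                      # rumoured Series X GPU clock

rops_needed = series_x_fill / clock_ghz
print(f"~{rops_needed:.0f} ROPs needed")   # ~82 ROPs
```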

Nah, 8K on consoles won't be a thing. Sorry. The hardware is nowhere near powerful enough. 8K is 4x the pixels of 4K. GPUs already struggle to hit 120 FPS at 4K, and many even struggle with 60 FPS.

4K, 8K... these are often just marketing phrases seen on TVs and displays.

I mean the RTX 3070 has only 64 ROPs, so the consoles can make do with that amount...
 
Last edited:
Associate
Joined
19 Jun 2017
Posts
1,029
That's peak pixel fill... it won't be done natively. Those will be 4K upscaled frames, but those pixels still have to be filled.

Nah, 8K on consoles won't be a thing. Sorry. The hardware is nowhere near powerful enough. 8K is 4x the pixels of 4K. GPUs already struggle to hit 120 FPS at 4K, and many even struggle with 60 FPS.

4K, 8K... these are often just marketing phrases seen on TVs and displays.
 
Associate
Joined
27 Sep 2020
Posts
34
At £650 it's good.

At north of £800 it becomes somewhat meh.

I almost fell for the hype, then I realised that generational leaps between GPUs should be like this anyway. It shouldn't impress us at all, really.

edit: seems the guy above me said exactly the same thing.

Generational leaps between GPUs certainly shouldn't be like this; 30% more power to deliver 30% more performance is, at best, underwhelming and at worst, a failure in design. We expect a new architecture to be more efficient rather than flush power consumption down the toilet. Having 350-400W cards is not an encouraging result at all but the marketing has done an amazing job to get buyers to disregard this point.

Another thing is that a minuscule proportion of 3080 owners will have obtained the card for £650. In reality, most will be paying a much higher amount than that. This was a masterful marketing move by Nvidia: trick people into thinking the product can be purchased for £650, then strategically drive up the true price through forced (artificial) scarcity. You'll see almost all reviews and articles referring to the 3080 as a £650 card, which it isn't, and buyers will happily pay well over that amount without question. Cognitive dissonance will take care of the price discrepancy, as customers will think of their £750 3080s as £650 cards due to the power of marketing!

I'm disappointed at how almost everyone seems to be ignoring the clear fact that the £650 3080 FE cards were loss leaders whose purpose was to pave the way for marked up cards from AIBs.
 
Associate
Joined
29 Jul 2011
Posts
1,438
Paid £550 for my 980Ti in 2015 which is still in my rig. £650 is the most I'll consider for a 3080/AMD equivalent (maybe stretching to £700 if it shows benefits for SFF cases), otherwise may stick to consoles for next gen.
 
Permabanned
Joined
2 Sep 2017
Posts
10,490
With 2080 Ti-like performance...

Why not just have a 3 GB 2080 Ti at £200? Same logic, right?



Only if you take speculation as gospel.

Since when did they begin to differentiate SKUs by cutting memory capacity? :confused:

The RX 480, a mid-range card from 2016, had 8 GB.
 
Soldato
Joined
25 Sep 2009
Posts
9,667
Location
Billericay, UK
Paid £550 for my 980Ti in 2015 which is still in my rig. £650 is the most I'll consider for a 3080/AMD equivalent (maybe stretching to £700 if it shows benefits for SFF cases), otherwise may stick to consoles for next gen.
Yep, the story is Nvidia were handing out a $50 credit on all 3080s just to keep the initial price for batch 1 down. Combine that with FX rates going the wrong way and gouging, and future batches are likely to be £100-£200 higher.
 
Associate
Joined
12 Jul 2020
Posts
288
At £650 it's good.

At north of £800 it becomes somewhat meh.

I almost fell for the hype, then I realised that generational leaps between GPUs should be like this anyway. It shouldn't impress us at all, really.

edit: seems the guy above me said exactly the same thing.
Even £650 is pushing it. GPU prices are inflated; personally, I don't think the RTX 3080 should cost any more than £500.
 
Associate
Joined
12 Jul 2020
Posts
288
GA102 is a 628 mm² die. Its price is OK.

What is extremely overpriced is the 251 mm² (meh!) Navi 10 for £400.
Year on year the value proposition of high-end GPUs has been getting steadily worse. Tbh the launch price for the 980 Ti was superb at £550.
 
Soldato
Joined
21 Jan 2016
Posts
2,915
Year on year the value proposition of high-end GPUs has been getting steadily worse. Tbh the launch price for the 980 Ti was superb at £550.

The 980 Ti launched in June 2015, when the pound was worth 1.53 dollars; these days it is bouncing around between 1.25 and 1.30.

Had the 980 Ti launched with today's exchange rates, it would have been at least 20% more expensive on that fact alone, so around £660 or likely more.

On top of that, due to inflation, £660 in 2015 would be £740 now.

So that (by your own admission) excellent-value card would be more expensive today than the 3080, while the 3080 offers around 150% more performance. The value proposition arguably hasn't got worse at all when comparing those two cards.
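The adjustment above can be sketched like this (the exchange rates are the post's figures; the ~12% cumulative inflation factor is my rough assumption, not an official number):

```python
launch_gbp_2015 = 550      # 980 Ti UK launch price
usd_per_gbp_2015 = 1.53    # June 2015 rate (post's figure)
usd_per_gbp_2020 = 1.27    # rough current rate (post's range midpoint)

# Convert the 2015 sterling price to dollars, then back at today's rate.
implied_usd = launch_gbp_2015 * usd_per_gbp_2015
fx_adjusted_gbp = implied_usd / usd_per_gbp_2020
print(f"FX-adjusted launch price: £{fx_adjusted_gbp:.0f}")   # ~£663

cumulative_inflation = 1.12   # assumed ~12% UK inflation, 2015-2020
print(f"Plus inflation: £{fx_adjusted_gbp * cumulative_inflation:.0f}")  # ~£742
```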
 
Last edited:
Soldato
Joined
28 May 2007
Posts
10,073
No, I don't, I really don't. A card like that is a huge amount of muscle: that's an entry-level 4K card, and 6 GB is limiting even at 1080p. With 6 GB it has a ridiculous amount of wasted horsepower; it's wildly unbalanced.

Not to mention the £100 per card AMD will be flushing down the toilet by making a GPU with that much grunt a 6 GB card.

Price dictates, not spec. There would be many people sitting on 580s who would love a 2080 Ti-class card with 6 GB at the right price. I'm not disagreeing that it's a bit unbalanced, just saying that we don't know if the specs are correct and, if they are, who it's aimed at. For me and you it's all wrong, but certain buyers would easily jump at the right price.
 
Caporegime
Joined
17 Mar 2012
Posts
48,116
Location
ARC-L1, Stanton System
Price dictates, not spec. There would be many people sitting on 580s who would love a 2080 Ti-class card with 6 GB at the right price. I'm not disagreeing that it's a bit unbalanced, just saying that we don't know if the specs are correct and, if they are, who it's aimed at. For me and you it's all wrong, but certain buyers would easily jump at the right price.

There will probably be a smaller GPU with 5700 XT-like performance, a cut-down die with lower clocks perhaps, with 6 GB for around £300.
 
Caporegime
Joined
17 Mar 2012
Posts
48,116
Location
ARC-L1, Stanton System
It would cut costs and mean they could do away with the 5700 XT and 5600 XT.

Yeah, cut out some CUs to salvage defective dies, and lower clocks to reduce power so the PCB and coolers can be cheaper.

The thing is, GDDR6 at this point, even the faster 16 Gbps kind, is probably only around $5 per GB: $60 for 12 GB, $30 for 6 GB. It doesn't seem worth making a card that fast a 6 GB card when you could charge £100 more for it with 12 GB and gain around £70. Salvaging defective dies, with fewer shaders and lower clocks on cheaper PCBs and coolers, is where you get your money back on affordable, fast GPUs at around £300.
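The margin argument above as napkin math (the $5/GB GDDR6 price is the post's guess, and treating the $30 memory cost delta as roughly £30 follows the post's own rounding):

```python
price_per_gb_usd = 5                 # post's guessed GDDR6 spot price
extra_memory_gb = 12 - 6             # going from a 6 GB to a 12 GB card
extra_bom_usd = extra_memory_gb * price_per_gb_usd   # $30 more memory

price_bump_gbp = 100                 # what the 12 GB SKU could charge extra
extra_bom_gbp = 30                   # post rounds the $30 to ~£30
margin_gain_gbp = price_bump_gbp - extra_bom_gbp
print(f"Extra margin per card: ~£{margin_gain_gbp}")   # ~£70
```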
 

Associate
Joined
29 Jun 2016
Posts
173
I doubt the 6700XT is 6 GB; IMO it's 12 GB.

There will be a 6600XT with a 128-bit bus and 8 GB.

All these GPUs will have a large L4 cache to make up for the lack of memory bandwidth.

Other leaks have the 6700XT at 2.5 GHz and the 6900XT at 2.2 GHz.

Speculation:

5700XT: 2560 shaders @ 1900 MHz.

6700XT: 2560 shaders, +10% IPC, @ 2500 MHz (+31.5%) = +41.5% total = 2080 Ti after scaling.

6800XT: 3840 shaders, +10% IPC, +50% shaders, @ 2350 MHz (+23.5%) = +83.5% total = 3080 after scaling.

6900XT: 5120 shaders, +10% IPC, +100% shaders, @ 2200 MHz (+16%) = +126% total = 3090+ after scaling.

If 3840 shaders = 3080, 5120 shaders will be a lot faster than the 3090.
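One nit on the speculation above: the percentage gains are summed, but independent IPC, shader, and clock gains compound multiplicatively, which gives slightly higher totals (still idealised, since real shader scaling is sub-linear). A sketch using the post's rumoured figures:

```python
def compound(*gains):
    """Combine fractional gains multiplicatively: (1+a)(1+b)... - 1."""
    total = 1.0
    for g in gains:
        total *= 1.0 + g
    return total - 1.0

base_clock_ghz = 1.9   # 5700XT reference clock from the post

# (name, extra shaders vs 5700XT as a fraction, rumoured clock in GHz)
for name, shader_gain, clock in [("6700XT", 0.0, 2.5),
                                 ("6800XT", 0.5, 2.35),
                                 ("6900XT", 1.0, 2.2)]:
    total = compound(0.10, shader_gain, clock / base_clock_ghz - 1)
    print(f"{name}: +{total * 100:.0f}% vs 5700XT (ideal scaling)")
# -> roughly +45%, +104%, +155% instead of the summed 41.5/83.5/126
```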
 