
The thread which sometimes talks about RDNA2

If it forces Intel to innovate for once, I don't see how that's a bad thing for the consumer. At the end of the day, AMD is now ultra-competitive in the GPU arena and dominates the CPU market; this is great for the gamer. The NV/Intel duopoly is in pieces!

Because if we end up locked into AMD versus Intel/Nvidia pairings, then when one of those sides gets a big lead in the future, we get less choice.

What if Intel has the best CPU and AMD the best GPU? Or AMD the best CPU and Nvidia the best GPU?
 
Hats off to AMD. I didn't think they would compete with the 3080, let alone the 3090. Fantastic to have some proper competition again!

Definitely got some buyer's remorse with my 3080 but it is what it is. Not going to rush to swap it out but will be keeping a keen eye on benchmarks when they come out.
 
It's a shame this new AMD tech to marry the 5000 series CPUs and 6000 series GPUs (Smart Access Memory) won't work with a 3000 series CPU and a 6000 series card.

Unless it will and they just haven't mentioned it.
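For context, Smart Access Memory is AMD's branding of PCIe Resizable BAR, which lets the CPU address all of the GPU's VRAM rather than the traditional 256 MB window. A rough sketch of how an application could guess whether it's active, using a common Vulkan heuristic (the function name is mine, not any official API):

```cpp
// Heuristic sketch: with Resizable BAR off, the CPU-visible slice of VRAM is
// capped at 256 MiB, so a larger heap that is both device-local and
// host-visible suggests Resizable BAR / SAM is enabled.
#include <vulkan/vulkan.h>

bool likely_resizable_bar(VkPhysicalDevice gpu) {
    VkPhysicalDeviceMemoryProperties mem{};
    vkGetPhysicalDeviceMemoryProperties(gpu, &mem);

    for (uint32_t i = 0; i < mem.memoryTypeCount; ++i) {
        const VkMemoryType& t = mem.memoryTypes[i];
        const bool device_local = t.propertyFlags & VK_MEMORY_PROPERTY_DEVICE_LOCAL_BIT;
        const bool host_visible = t.propertyFlags & VK_MEMORY_PROPERTY_HOST_VISIBLE_BIT;
        if (device_local && host_visible &&
            mem.memoryHeaps[t.heapIndex].size > 256ull * 1024 * 1024)
            return true; // CPU can see more VRAM than the classic BAR window
    }
    return false;
}
```

Whether AMD's 5000-series-only restriction is a hard hardware limit or a validation choice is exactly the open question here.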
 
On the flip side, people are going out of their way to say the 6800 XT is only competitive with the 3080 when Smart Access Memory is enabled, which isn't what the slides showed at all.

[Images: AMD's 6800 XT vs 3080 benchmark slides]


Frames Per Second (Up To) is a bit weird; are they comparing peak FPS or average?
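"Up to" reads like a peak figure rather than the average most benchmarks quote. A trivial sketch of the difference, using made-up frame times:

```cpp
// Illustration with invented frame-time samples: average FPS is total frames
// over total time, while an "up to" figure would be the single fastest frame.
#include <algorithm>
#include <cstdio>
#include <vector>

int main() {
    std::vector<double> frame_ms = {16.2, 15.8, 17.1, 9.5, 16.4}; // hypothetical
    double total_ms = 0.0, best_ms = frame_ms[0];
    for (double ms : frame_ms) {
        total_ms += ms;
        best_ms = std::min(best_ms, ms); // fastest frame -> highest instantaneous FPS
    }
    std::printf("average: %.1f fps, up to: %.1f fps\n",
                1000.0 * frame_ms.size() / total_ms, 1000.0 / best_ms);
}
```

Same data, two very different headline numbers, which is why the wording matters.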
 
Well, the echo chamber effect is strong. Nothing much will change, however much you'd like to ignore DLSS and RT. Market share will most likely look the same in two years anyway.

If it had been as good at RT as the 3080 and had an answer to DLSS, only then would it have made sense to ask for the same price (or well, $50 lower, my bad).

Either that, or be way better in raster (+20%) like the rumors were suggesting.
 
I'm very curious about AMD's raytracing.

From how I understand it:

AMD are utilizing software-based raytracing via DirectX, and accelerating/driving it with the card, right? A bit like how Nvidia have done it with the GTX cards, even though the performance there is obviously terrible.

Whereas Nvidia's is hardware-based tracing, so it's got dedicated wotsits on the card to do it. But when not in use, that portion of the card is basically idle. However, with AMD the whole card is rasterising as per the demands of the game/settings.

Is that right?

I also wonder what's easier to implement, Nvidia's hardware-based RT or AMD's software-driven approach. Didn't AMD say something about 36 titles already? Isn't that already twice as many as Nvidia?
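One part of this can be pinned down: on PC, both vendors sit behind the same DirectX Raytracing (DXR) API, so the developer-facing work is broadly the same either way; what differs is whether the driver runs it on dedicated units or on the shader cores. A minimal sketch of the standard capability check (the helper name is mine):

```cpp
// Sketch: ask D3D12 whether the device supports DXR at all. The API reports
// a support tier; it does not reveal whether the driver uses dedicated
// ray-tracing hardware or the general shader cores underneath.
#include <d3d12.h>

bool supports_dxr(ID3D12Device* device) {
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &opts5, sizeof(opts5))))
        return false;
    return opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}
```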
 
On the flip side, people are going out of their way to say the 6800 XT is only competitive with the 3080 when Smart Access Memory is enabled, which isn't what the slides showed at all.

That's a difficult decision, especially given how 3080 prices are nowhere close to the suggested FE pricing.
But if I were to go by the on-paper information, the 3080 is still the superior buy.
$50 for DLSS and superior RT (even just for screenshot purposes) easily makes up for it.
But all the other Nvidia products announced to date look hopelessly DOA.
 
I wonder if there is any chance that the UK price of the RX 6800 will match the dollar price, equivalent to £445.90? If it's priced at £500 that would be OK IMO.
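Worth a quick sanity check on the sums: at the roughly $1.30/£ rate implied by that £445.90 figure, $579 / 1.30 ≈ £445 before tax, but US MSRPs exclude sales tax while UK prices include 20% VAT, so a like-for-like conversion is closer to £445 × 1.2 ≈ £535. On that basis, £500 would actually be on the generous side.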


Depends what mood Gerbo's bank manager is in and what the, ahem, "exchange rate" is ;-) Clue: kinda like oil and gas prices...
 
Ah I see, the "no answer to DLSS" argument is already in full swing. Nice deflection, boys and girls, nice deflection. I guess you didn't hear Scott Herkelman mention that "Super Resolution is in the works", then. And I also see the intentional misreading of charts is in play too: let's ignore the first batch of numbers and only focus on the Rage Mode and Smart Access scores, as if AMD are lying about something.

The more things change, the more they stay the same.
 
You might want to go read up on how shared system memory in a console actually works before posting.

I'm happy to wait and see. :)
NVIDIA owners are trying to convince themselves 8GB is enough, but even some developers have said it's not.

Maybe AMD just put 16GB of VRAM on their cards for fun.
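For what it's worth, whether a given amount of VRAM is "enough" comes down to the memory budget the OS grants the game at runtime, which engines can and do query. A rough sketch via DXGI (the helper name is mine):

```cpp
// Sketch: query the OS-assigned VRAM budget, the figure engines actually
// stream textures against, rather than the raw capacity printed on the box.
#include <dxgi1_4.h>
#include <cstdio>

void print_vram_budget(IDXGIAdapter3* adapter) {
    DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
    if (SUCCEEDED(adapter->QueryVideoMemoryInfo(
            0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info))) {
        std::printf("budget: %.1f GiB, in use: %.1f GiB\n",
                    info.Budget / (1024.0 * 1024.0 * 1024.0),
                    info.CurrentUsage / (1024.0 * 1024.0 * 1024.0));
    }
}
```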
 
Because if we end up locked into AMD versus Intel/Nvidia pairings, then when one of those sides gets a big lead in the future, we get less choice.

What if Intel has the best CPU and AMD the best GPU? Or AMD the best CPU and Nvidia the best GPU?

I see what you're getting at, but it's competition that drives innovation; that's what has happened here, and now Intel in particular have to answer. And so it goes on...
 