AMD Navi 23 ‘NVIDIA Killer’ GPU Rumored to Support Hardware Ray Tracing, Coming Next Year

It doesn't give me confidence in the product's capabilities that they'll do the Zen 3 event before Radeon. On the Zen 3 side there is no competitive offering from Intel they need to head off, yet you've got Nvidia unloading its arsenal. It's the GPU, where you need to be competitive, that you choose to be passive on?

Did it ever occur to you that launching Zen 3 first allows AMD to then benchmark those chips and the RDNA2 GPUs together? :D
 
Yup.

It's sad, because if AMD had released their range of GPUs BEFORE Nvidia had made their announcement, they would have been crowned the saviours of PC GPU pricing, because the 20 series was a gigantic rip-off from top to bottom.

Instead NVIDIA have managed to release a bit earlier, with a more sensible pricing regime, which will make AMD's offering look less attractive.

NVIDIA really did the dirty last year with the 20 series GPUs; I would have loved for AMD to have shown them up with a much earlier release.
Well, really, AMD are the saviours, as there is zero chance Nvidia would have given us the 3080 at $700 had they not considered AMD to be a threat this time around.

Nvidia want to give you as little performance as possible for the highest price they can charge, as that means they make more margin and you have to upgrade again sooner.

The performance uplift of the 3080 over Turing, though, is by no means amazing at only around 25%; it's just that the price is better this time around. The 2080 Ti was around 35% over the 1080 Ti at the same wattage, yet this time we're getting 25~30% at 30% more power, which says all you need to know about this Samsung 8nm node.

Had the 2080 Ti been a $700 GPU, the 3080 would look like a very poor successor in terms of performance uplift, but by price-cutting an overpriced card it now looks like a great deal.
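To put that perf-per-watt claim in rough numbers (using the post's own approximate figures, not measured data): 25~30% more performance for roughly 30% more power works out to about 1.25/1.30 ≈ 0.96 up to 1.30/1.30 = 1.00 times Turing's performance per watt, i.e. essentially no efficiency gain at the 3080's operating point.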
 
Console releases are what have forced Nvidia to price "competitively", at least as far as the 3070 goes.

If people were playing new games on a Series X at a stable 4K framerate of 30/60 for £450, then Nvidia would have looked a joke if they couldn't offer a single GPU as a viable alternative.
 
Well, really, AMD are the saviours, as there is zero chance Nvidia would have given us the 3080 at $700 had they not considered AMD to be a threat this time around.

Nvidia want to give you as little performance as possible for the highest price they can charge, as that means they make more margin and you have to upgrade again sooner.

The performance uplift of the 3080 over Turing, though, is by no means amazing at only around 25%; it's just that the price is better this time around. The 2080 Ti was around 35% over the 1080 Ti at the same wattage, yet this time we're getting 25~30% at 30% more power, which says all you need to know about this Samsung 8nm node.

Had the 2080 Ti been a $700 GPU, the 3080 would look like a very poor successor in terms of performance uplift, but by price-cutting an overpriced card it now looks like a great deal.

It's not sad; AMD have been terrible at providing Nvidia with decent competition, and Nvidia is still competing with itself in this space. TBF, Nvidia started the conversation on ray tracing and has spent money on taking the lead in better-quality graphics, much as they did with physics, smoke effects, weather effects, etc.

I can't see how you can get upset with Nvidia for being good at their job. They are not the enemy; the enemy is the company that can't be bothered to provide any bleeding-edge tech or move things on, and only wants to sell a load of mid-range cards because that is where the market is.
 
Considering that we haven't even seen any official benchmarks or reviews yet, AMD have no gauge of what 'stronger than the 3080' is, and neither do we.

It's telling that Nvidia moved their review embargo to the day before consumers can buy the 3000 series, leaving AMD with little to no time in which to counter any benchmarks of Ampere.

You crack on though; enjoy your overclocked 2080 Ti in 3000 series guise :D


I think your little dig at the end misconstrues my position.

I'm happy to buy NVIDIA or AMD. Heck, I'll buy both, as I could do with two rigs. I have no place in this fanboy war you guys play. I'm firmly on the fence and will give my cash to whoever has the better product, or both :D
 
Console releases are what have forced Nvidia to price "competitively", at least as far as the 3070 goes.

If people were playing new games on a Series X at a stable 4K framerate of 30/60 for £450, then Nvidia would have looked a joke if they couldn't offer a single GPU as a viable alternative.
Consoles have no bearing on this.
 
It's not sad; AMD have been terrible at providing Nvidia with decent competition, and Nvidia is still competing with itself in this space. TBF, Nvidia started the conversation on ray tracing and has spent money on taking the lead in better-quality graphics, much as they did with physics, smoke effects, weather effects, etc.

I can't see how you can get upset with Nvidia for being good at their job. They are not the enemy; the enemy is the company that can't be bothered to provide any bleeding-edge tech or move things on, and only wants to sell a load of mid-range cards because that is where the market is.


You sound like a high end gamer.

I think a lot of the anger at NVIDIA comes from low-end or mid-range gamers who are still sporting hardware which doesn't require 'cutting edge' tech... and by cutting edge, we're only talking about 4K/120Hz TVs, which are fairly available in the OLED form factor now (for some cash).

For us high-end gamers, AMD pretty much left us in the dust and didn't care to cater to us for years. This is where I think the divide is, and why people who are interested in the 3080 for 4K 60-120Hz gameplay are pretty much ignoring the 'hope' that AMD fans have of some magical unicorn cards.

I mean, if AMD released a card in between the 3070 and the 3080 with more VRAM, or for a lower price than the 3080, AMD fans on this thread would be commending them.

However, for people with slightly deeper wallets gaming at 4K/60+ Hz, I'd be seriously disappointed with waiting for that, as I need as much power and grunt as possible from a GPU which isn't above £1000.
 
The right timing for their event would have been before Nvidia cards arrive in consumers' hands.

Right now, AMD are not giving a tangible incentive to wait. Instead it's literally just "hey, we've got stuff too! We'll tell you in 1.5 months what it is and when you can buy it."

That's a pretty poor sway tactic, and when you're that much of an underdog already, you need to execute better in all areas to catch up.

I'm pretty sure AMD will be working towards a schedule, and if they moved the date forward they'd be upsetting a lot of manufacturers, distributors, suppliers, partners, etc., and putting a lot of pressure on them all for no real benefit. The cards will be ready when they are ready?!

If you can wait - wait, and buy whichever is the best offering you can afford from AMD or Nvidia.
If you can't wait - buy the best you can afford from Nvidia.

It really is that simple. Operations such as releasing a new batch of cards only ever get delayed from schedule; they are never brought forward. AMD have chosen X date and Nvidia have chosen Y date, and that's it?!
 
Nvidia didn't take the lead on PhysX, lol. They bought the company that did, then slowly buried it. And if you think AMD have never provided any bleeding-edge tech or had any firsts, then you've clearly been living under a rock.
 
It's not sad; AMD have been terrible at providing Nvidia with decent competition, and Nvidia is still competing with itself in this space. TBF, Nvidia started the conversation on ray tracing and has spent money on taking the lead in better-quality graphics, much as they did with physics, smoke effects, weather effects, etc.

I can't see how you can get upset with Nvidia for being good at their job. They are not the enemy; the enemy is the company that can't be bothered to provide any bleeding-edge tech or move things on, and only wants to sell a load of mid-range cards because that is where the market is.
When you look at AMD's R&D budget compared to Nvidia's, is it any surprise that they have struggled to compete, especially at the high end where the financial gains are smaller? By focusing on the mid and low end, AMD have done what they should be doing, as you have to build from the bottom up.

This time they have a great chance to make a splash at the high end, as Nvidia's new node isn't as good as they would have you believe; in fact it's barely better than Turing in performance per watt, and even the RT performance hasn't scaled very well despite a large increase in RT cores.
 
The performance uplift of the 3080 over Turing, though, is by no means amazing at only around 25%; it's just that the price is better this time around. The 2080 Ti was around 35% over the 1080 Ti at the same wattage, yet this time we're getting 25~30% at 30% more power, which says all you need to know about this Samsung 8nm node.

The 3080 and 3090 are clearly way above their peak efficiency.

The 3070 matches the 2080 Ti at 220W; that's a 30W power saving.

The 3080 beats the 2080 Ti by 25% while using 70W more power.

I think the problem here is that the 3080 and 3090 have way too many CUDA cores, and games stop scaling properly with core count after a while, which is what we're seeing here: double the cores for 30% more performance.
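For anyone who wants to sanity-check those claims, here is a rough back-of-the-envelope sketch; the performance ratios and the 250W/220W/320W board powers are taken from the post above and are approximate, not measured:

```cpp
#include <cstdio>

int main() {
    // Approximate figures from the post above (2080 Ti as the baseline).
    const double perf_2080ti = 1.00, watts_2080ti = 250.0;
    const double perf_3070   = 1.00, watts_3070   = 220.0;  // "matches the 2080 Ti"
    const double perf_3080   = 1.25, watts_3080   = 320.0;  // "+25% at +70W"

    auto perf_per_watt = [](double perf, double watts) { return perf / watts; };
    const double base = perf_per_watt(perf_2080ti, watts_2080ti);

    std::printf("3070 perf/W vs 2080 Ti: %+.0f%%\n",
                (perf_per_watt(perf_3070, watts_3070) / base - 1.0) * 100.0);  // ~ +14%
    std::printf("3080 perf/W vs 2080 Ti: %+.0f%%\n",
                (perf_per_watt(perf_3080, watts_3080) / base - 1.0) * 100.0);  // ~ -2%

    // "Double the cores for 30% more performance": per-core scaling efficiency.
    std::printf("per-core scaling efficiency: %.0f%%\n", 1.30 / 2.0 * 100.0);  // 65%
    return 0;
}
```

On those numbers the 3080 is, if anything, marginally worse than Turing per watt, while the 3070, sitting further down the voltage/frequency curve, shows the node in a slightly better light.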
 
Console releases are what have forced Nvidia to price "competitively", at least as far as the 3070 goes.

If people were playing new games on a Series X at a stable 4K framerate of 30/60 for £450, then Nvidia would have looked a joke if they couldn't offer a single GPU as a viable alternative.

If this is what consoles are producing, there's nowt to worry about :p

 
The 3080 and 3090 are clearly way above their peak efficiency.

The 3070 matches the 2080 Ti at 220W; that's a 30W power saving.

The 3080 beats the 2080 Ti by 25% while using 70W more power.

I think the problem here is that the 3080 and 3090 have way too many CUDA cores, and games stop scaling properly with core count after a while, which is what we're seeing here: double the cores for 30% more performance.

Yeah, so they are putting the pedal to the metal, either because they have to improve on Turing or because they know where RDNA2 reaches and need to dive over the line, as it's going to be tight.
 
This is the problem: you can't just enable something like that at the driver level and get good results. Explicit multi-adapter needs game developers, on a per-game basis, to tune their rendering approach based on intimate knowledge of how their game works and of its assets, and to build that approach in from the ground up, deciding how to farm out rendering as they develop the game. You can't just come in after the fact, make it work with explicit multi-adapter, and get good results.

This is exactly what happens at the driver level, and this is why AMD releases specific game-ready drivers.

For Nvidia it's the same: game developers are totally blind to what happens at Nvidia's driver level, and the job is done by Nvidia's software engineers, not by the developers, who have no access to the closed Nvidia ecosystem.
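As a rough illustration of why this lands on the developer rather than the driver, here is a minimal, untested sketch of the explicit multi-adapter setup in D3D12: the application itself enumerates the GPUs and creates a device per adapter, and everything after that (splitting the frame, copying resources between GPUs, synchronisation) has to be written into the engine per game. The structure here is illustrative only, not any particular engine's code.

```cpp
// Minimal explicit multi-adapter sketch (D3D12/DXGI, Windows).
// Link against d3d12.lib and dxgi.lib.
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>
#include <vector>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    // The *application* walks the adapter list and creates one device per GPU.
    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc{};
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE) continue;  // skip WARP/software

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device)))) {
            std::printf("Adapter %u: %ls\n", i, desc.Description);
            devices.push_back(device);
        }
    }

    // From this point the engine must decide, per game, how to split work:
    // e.g. alternate-frame rendering, or farming post-processing to a second
    // GPU, plus explicit cross-adapter copies and fences. None of that can be
    // bolted on by a driver after the fact.
    return 0;
}
```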
 
You sound like a high end gamer.

I think a lot of the anger at NVIDIA comes from low-end or mid-range gamers who are still sporting hardware which doesn't require 'cutting edge' tech... and by cutting edge, we're only talking about 4K/120Hz TVs, which are fairly available in the OLED form factor now (for some cash).

For us high-end gamers, AMD pretty much left us in the dust and didn't care to cater to us for years. This is where I think the divide is, and why people who are interested in the 3080 for 4K 60-120Hz gameplay are pretty much ignoring the 'hope' that AMD fans have of some magical unicorn cards.

I mean, if AMD released a card in between the 3070 and the 3080 with more VRAM, or for a lower price than the 3080, AMD fans on this thread would be commending them.

However, for people with slightly deeper wallets gaming at 4K/60+ Hz, I'd be seriously disappointed with waiting for that, as I need as much power and grunt as possible from a GPU which isn't above £1000.
I think the point is valid for both types of gamer. If you are not pushing the high end, then your tech isn't doing much more than what is available on a console; not that everyone wants or needs a high-end graphics card, but that is where the tech march starts, and the next generation gets pushed down to the lower tiers. Let's face it, if Nvidia weren't there, AMD would have us happy to chase 10fps over a console at console settings. Like it or not, Nvidia pushes the boundaries on graphics, and for the last few years AMD hasn't. AMD had some great tech several years ago: Eyefinity, the TriDef 3D tech and FreeSync were all very good, but they were ultimately ineffective thanks to the cards' inferior performance. Perhaps we should be grateful that FreeSync convinced Nvidia to drop dedicated G-Sync hardware, but that has meant those FreeSync users have upgraded to Nvidia cards; the AMD equivalent is underwhelming.

Will it be any different this time round? Let's hope so. AMD have a lot to prove with drivers and with software support for their cards' advanced features. I don't think I'd take the risk with one.
When you look at AMD's R&D budget compared to Nvidia's, is it any surprise that they have struggled to compete, especially at the high end where the financial gains are smaller? By focusing on the mid and low end, AMD have done what they should be doing, as you have to build from the bottom up.

This time they have a great chance to make a splash at the high end, as Nvidia's new node isn't as good as they would have you believe; in fact it's barely better than Turing in performance per watt, and even the RT performance hasn't scaled very well despite a large increase in RT cores.
Their budget isn't my problem; I'll buy the best product, not buy something because I feel sorry for the company and their lack of money.
 