NVIDIA 4000 Series

I've seen a few outlandish predictions posted online that I can't see being true, for example:

4090 to be 1.8-2.0x faster than a 3090.

4080 to be 1.5x a 3090.

4070 to match or beat a 3090.

The strange one is a 4060 getting close to a 3080. (A 3060 is near half the speed of a 3080, so is Nvidia going to nearly double the performance for this tier?)

I think a gain of around 25-40% for each tier is more reasonable.
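To put rough numbers on why the 4060 claim looks odd, here's a quick back-of-envelope check. The relative speeds below are illustrative assumptions with the 3090 as a 1.00 baseline; only the "3060 is near half a 3080" ratio comes from above.

```python
# Illustrative relative speeds (3090 = 1.00 baseline). Only the
# "3060 is near half a 3080" ratio comes from the post; the rest
# are rough assumptions for the sake of the arithmetic.
ampere = {"3090": 1.00, "3080": 0.90, "3070": 0.70, "3060": 0.45}

# The rumoured Ada claims, expressed against the 3090 baseline.
rumoured_ada = {
    "4090": 1.9 * ampere["3090"],   # "1.8-2.0x a 3090" (midpoint)
    "4080": 1.5 * ampere["3090"],   # "1.5x a 3090"
    "4070": 1.0 * ampere["3090"],   # "match or beat a 3090"
    "4060": 0.9 * ampere["3080"],   # "getting close to a 3080"
}

# Implied gain over the same tier one generation earlier.
for card, perf in rumoured_ada.items():
    previous = ampere["3" + card[1:]]
    print(f"{card}: {perf / previous:.2f}x its Ampere counterpart")
```

Under those assumptions the 4060 claim implies roughly a 1.8x jump at the bottom of the stack, which is why it stands out.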

Nvidia has basically been marketing lower-tier parts as higher tiers, so this may be a sign they will go back to their old tier system and give us something decent for the money.

I highly doubt these cards will be close to double the performance unless we are talking PURELY about specific RT-based scenarios where they have done more work.
 
With them numbers RDNA3 is gonna eat it alive...

:D

Good. We need AMD to compete with Nvidia so we can all buy cheaper Nvidia cards…

 
Really? Of all the computer bits and pieces you can buy, the monitor is the thing you spend the most time looking at. They last multiple GPU and CPU refreshes, very often with over 5 years of life in them, sometimes more than 10, occasionally over 20 (my brother's old Amiga 1024 monitor from 1986 lasted well into the late 2000s/early 2010s). Bad ones can give you eyestrain and often have dead pixels in the worst places.

A monitor is the one thing you should spend big on above all others. Writing this on my gorgeous LG CX 48" OLED right now (£1,500 new in summer 2020 I think) and loving it.
I am still on my 2008 50" Pioneer Kuro 1080p plasma screen. It still has much nicer colours than any of the computer monitors I have seen. I am just waiting until it finally wears out so I have an excuse to get one of the 60" 4K LG OLEDs with G-Sync, HDR and all that jazz.
 
I am still on my 2008 50" Pioneer Kuro 1080p plasma screen. It still has much nicer colours than any of the computer monitors I have seen. I am just waiting until it finally wears out so I have an excuse to get one of the 60" 4K LG OLEDs with G-Sync, HDR and all that jazz.
I long to have my old Kuro back; I've actually been looking at picking one up used. One of the best consumer displays ever made, a stupendously good piece of kit. I can't think of a single piece of consumer electronics that's had as good longevity, save maybe my parents' old Pioneer LaserDisc player they had through most of the VHS era.
 
I long to have my old Kuro back; I've actually been looking at picking one up used. One of the best consumer displays ever made, a stupendously good piece of kit. I can't think of a single piece of consumer electronics that's had as good longevity, save maybe my parents' old Pioneer LaserDisc player they had through most of the VHS era.
It was even better after I discovered that energy-saving mode 1 or 2 got rid of the annoying buzzing I'd put up with for the first four years.
 
I long to have my old Kuro back; I've actually been looking at picking one up used. One of the best consumer displays ever made, a stupendously good piece of kit. I can't think of a single piece of consumer electronics that's had as good longevity, save maybe my parents' old Pioneer LaserDisc player they had through most of the VHS era.
I'm typing this on my spare gaming PC downstairs, which is plugged into my Pioneer Kuro. Who needs a 4K TV when you are still getting lots of use from a 14-year-old Kuro! It cost me £1,600 at the time (40-inch model), so I've a few more years of "getting my money back" on it yet. :)

Great for retro gaming, btw.
 
I've seen a few outlandish predictions posted online that I can't see being true, for example:

4090 to be 1.8-2.0x faster than a 3090.

4080 to be 1.5x a 3090.

4070 to match or beat a 3090.

The strange one is a 4060 getting close to a 3080. (A 3060 is near half the speed of a 3080, so is Nvidia going to nearly double the performance for this tier?)

I think a gain of around 25-40% for each tier is more reasonable.
Someone got speed mixed up with price.
 
Pretty much.

I'm still expecting:

4070 to match 3090 but be better in RT by at least 30% for £550-600
4080 to be about 30% better than the 3080 for £700-750
4090 to be about 30% better than the 3090 for £1500-1600

Anything less/worse = meh
Anything better = bonus
RTX 4090 only 30% faster than an RTX 3090?

Meanwhile the RTX 4090 will have:
- 56% more cores
- 16x the L2 cache
- no more shared FP32/INT32 pipeline (high probability when you look at the H100 design)
- 30%+ higher clocks
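Compounding just the core-count and clock rumours gives a naive ceiling. This is a sketch only: it assumes perfectly linear scaling and ignores the cache and pipeline changes, which can't be folded in so simply.

```python
# Naive upper bound from the rumoured specs alone.
# Assumes perfectly linear scaling, which real GPUs never achieve,
# and ignores the L2 cache and FP32/INT32 pipeline changes.
core_ratio  = 1.56   # "56% more cores"
clock_ratio = 1.30   # "30%+ higher clocks"

print(f"naive ceiling: {core_ratio * clock_ratio:.2f}x a 3090")  # ~2.03x
```

Even discounted heavily for non-linear scaling, that lands well above +30%.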
 
RTX 4090 only 30% faster than an RTX 3090?

Meanwhile the RTX 4090 will have:
- 56% more cores
- 16x the L2 cache
- no more shared FP32/INT32 pipeline (high probability when you look at the H100 design)
- 30%+ higher clocks
The specs could be true, or like most rumours they could be a load of ****.
 
Nexus's numbers wouldn't make any sense unless the rumours are completely wrong: with these estimates the 4080 and 4090 would have nearly identical performance, despite rumours for many months, from several sources, all saying the 4080 is this time on GH/GL103 rather than 102, so there would be a very significant difference in core count between the two cards.

Nexus's estimates would also place RTX 4000 at a serious disadvantage to AMD, unless the "AMD 100% faster" rumours are also complete rubbish and it too is only 30% faster.
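For scale, a sketch with hypothetical core counts (placeholder round numbers, not real specs) shows why a 102 vs 103 split matters:

```python
# Hypothetical round numbers purely for illustration, not real specs.
big_die_cores   = 18000   # a "102"-class part
small_die_cores = 10500   # a "103"-class part

print(f"core-count gap: {big_die_cores / small_die_cores:.2f}x")  # ~1.71x
```

A gap like that makes near-identical performance between the two cards very hard to believe.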
 
Nexus's numbers wouldn't make any sense unless the rumours are completely wrong: with these estimates the 4080 and 4090 would have nearly identical performance, despite rumours for many months, from several sources, all saying the 4080 is this time on GH/GL103 rather than 102, so there would be a very significant difference in core count between the two cards.

Nexus's estimates would also place RTX 4000 at a serious disadvantage to AMD, unless the "AMD 100% faster" rumours are also complete rubbish and it too is only 30% faster.
And that is exactly what all of this is right now: guesswork.

The figures I listed are what "I" am expecting, i.e. not what these clickbait sites are telling us (different performance figures every week, it seems...), hence why I added this bit to the bottom of my post:

Anything less/worse = meh
Anything better = bonus
 
Well, the worst case then is the usual "only" +30%, but with power going up at least that much too. Given that perf/watt last gen, with Samsung's 8nm vs TSMC's 7nm, was heavily in TSMC's favour, this would imply that the move to TSMC 5nm has brought nothing at all in terms of perf/watt.

What I am worried about is this: we know process advances are slowing down; we know new nodes no longer give the perf/watt improvements they used to; and so on. Yet GPU vendors are going power-consumption mad (and AMD has basically hinted that no matter how much perf/watt improves architecturally for RDNA3, they too will have to match the top power usage).

While over-engineered, power-hungry cards may have some extra longevity if undervolted and underclocked, this does not bode well at all.
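A trivial worked example of that worry, with assumed round numbers:

```python
# If performance and power both rise ~30%, efficiency doesn't move.
perf_gain  = 1.30   # assumed +30% performance
power_gain = 1.30   # assumed +30% power draw

print(f"perf/watt change: {perf_gain / power_gain:.2f}x")  # 1.00x, i.e. flat
```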
 
As long as a 4080 or equivalent GPU can provide better performance than a 3090 while using less power than a 3090 (before any undervolting), I don't mind, as it is still more efficient. Ultimately there is no way around it if you want something more powerful than the current lot of GPUs. If AMD haven't got RDNA 3's RT sorted, they should be very worried given the current showing of Ampere vs RDNA 2 performance per watt in RT workloads.
 
28nm TSMC (Maxwell) to 16nm TSMC (Pascal) = 88% better density = huge performance gains for the GTX 1080 Ti (471mm²) over the GTX 980 Ti (601mm²).

Then we had Turing on 12nm, which is just a marketing name for 16nm+, and it isn't even a "+", because both have the same density and clocks.

Then we had Ampere on Samsung 8nm, with 80% better density than TSMC 16/12nm and no clock improvements.

Now a custom 4N node from TSMC for Nvidia (full EUV). Hopper on 4N is 118% denser than Ampere (98.3 MT/mm² vs 45 MT/mm²).
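A quick check of that last density claim, using the figures quoted above:

```python
# Density figures quoted above, in millions of transistors per mm^2.
ampere_8nm = 45.0    # Samsung 8nm (Ampere)
hopper_4n  = 98.3    # TSMC 4N (Hopper)

ratio = hopper_4n / ampere_8nm
print(f"4N vs 8nm: {ratio:.2f}x density, +{ratio - 1:.0%}")  # ~2.18x, +118%
```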
 
Nvidia has basically been marketing lower-tier parts as higher tiers, so this may be a sign they will go back to their old tier system and give us something decent for the money.

I highly doubt these cards will be close to double the performance unless we are talking PURELY about specific RT-based scenarios where they have done more work.
Amusing to point that out now... this behaviour began with the GTX 6xx series, where the top-tier product at release was only ever the mid-tier part, because they could sell mid-tier as high-end after destroying ATI/AMD (after AMD killed ATI).

This has been happening ever since.

The GTX 680 was the mid-range part; the GTX 690 was two mid-range parts on one board.

They only released the real high-end part in the GTX 7xx series.

We're already 5 generations into this, with profiteering besting progression, and I'm fed up with it... now everyone is happy paying excessive prices for lesser components.

I wouldn't mind the price hikes so much if only they'd release the real high-end parts like they used to, even if they charged a massive premium for them - giving buyers the choice and maintaining the real engineering development that we all want to happen in this market.
 