
Blackwell GPUs

All of the talk about pricing got me wondering how much worse things have gotten at the low end...

GTX 260 - $399
GTX 460 - $279
GTX 560 - $249
GTX 660 - $299
GTX 760 - $249
GTX 960 - $199
GTX 1060 - $249
RTX 2060 - $349
RTX 3060 - $329
RTX 4060 - $299

That GTX 260 is about $585 after adjusting for inflation :eek:
The GTX 260 was built on the top die of its time, so the fair comparison would be a 4060 built on AD102. It also had 90% of the top-tier card's bus width and VRAM.

A comparable card to the GTX 260 today for $585 would have to use the AD102 and have at least a 320-bit bus and 20GB of VRAM.

A more valid comparison, specs-wise, to a current-day $300 4060 would be the GT 240, which launched for $80 and would be about $120 today.

This is why it's important to look at the actual specs of a card rather than go by the naming.
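The inflation adjustment quoted above can be sketched in a few lines. The ~1.466x multiplier for 2008-to-2024 US dollars is an assumption chosen to match the "$399 becomes about $585" figure in the post; it is a rough CPI estimate, not an official number:

```python
# Rough sketch: express a historical launch price in 2024 US dollars.
# The multiplier is an assumed cumulative CPI factor, not an official figure.
CPI_2008_TO_2024 = 1.466

def to_2024_dollars(price_usd: float, factor: float = CPI_2008_TO_2024) -> int:
    """Scale a launch price by an assumed cumulative inflation factor."""
    return round(price_usd * factor)

print(to_2024_dollars(399))  # GTX 260's $399 launch price -> about $585
```

Swapping in per-year factors for the other cards in the list would let you compare the whole x60 lineup in constant dollars.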
 
I see PT in screenshots and it looks great. Then I start it in my game, I move around and... what the hell is this noise everywhere, crawling on textures, in shadows etc.? GI lagging for even 2-3 seconds at times, reflections of visibly low resolution and detail, ghosting and blur everywhere, and all of that costs so much FPS as well! Fix the noise, blur and ghosting and speed things up and I'm happy - I'm OK with it being done with AI, it doesn't have to be perfect, just good enough. We know they can't make RT itself go much faster, for the reasons I mentioned, so it has to come with some crutches to help it - and good, be smart about it, use all the other tech you can to make it better, don't just flip a switch in UE5 and expect miracles, you lazy devs! :P
Which games exactly? :)
 
Pretty much all of them. Cyberpunk with PT, IJ, etc. HUB showed all these examples very well in their video too, so I can refer you there now :)

You may be a person that’s very susceptible to some of these issues but I think, for some people, there is a certain amount of ‘retconning’ when it comes to why they dislike ray-tracing.

Before that HUB video it was:

“It tanks performance and you can barely see the difference. Not worth it. Also things can look too shiny.”

Not invalid points! Some people could legitimately argue that RT is rubbish for these reasons, from the perspective of preference.

Now after that video it’s:

“Textures become too noisy! It’s weird when you move around! The reflections are lower resolution!”

^Not invalid points either, technically, but you’d have to be a hawk to notice some of these things and pretty much nobody was talking about them before. If anything, these points show that the tech has some downsides, but they aren’t primary reasons for disliking it IMO. Even the HUB video notes at several points that the visual presentation of lighting is generally better with RT enabled.

Personally, I dislike it when things appear too shiny - that’s the aspect that bothers me. Hence when there is an option for RT shadows only (rather than reflections) I have sometimes opted for that.
 
I meant if a new 5080 is a grand, it'll push down the price of a used 4080 Super, in which case either option would be appealing.

If the 5080 costs a grand and is around the same speed as a 4090 it will really push down the price of a secondhand 4090

Which would you choose for a grand ;)
A brand new 5080 16GB with a 3+ year warranty, sold by a well-known retailer or by Nvidia,
or a secondhand 4090 24GB with no warranty, with the risk of something being wrong with it (remember the melting connectors), sold by some random stranger?


If the 5080 is priced at £999 the 4090 secondhand value could be taking a huge nose dive
That's as long as there's good stock of them and buyers don't need to wait 6 months to get hold of one ;)
 
If the 5080 costs a grand and is around the same speed as a 4090 it will really push down the price of a secondhand 4090

Which would you choose for a grand ;)
A brand new 5080 16GB with a 3+ year warranty, sold by a well-known retailer or by Nvidia,
or a secondhand 4090 24GB with no warranty, with the risk of something being wrong with it (remember the melting connectors), sold by some random stranger?


If the 5080 is priced at £999 the 4090 secondhand value could be taking a huge nose dive
That's as long as there's good stock of them and buyers don't need to wait 6 months to get hold of one ;)

When you put it like that, I'd be all over the 5080. Especially as, with an FE, the warranty is actually worth the paper it's written on.

But a 4080 Super FE for £600-650 with nearly two years warranty left...
 
If the 5080 costs a grand and is around the same speed as a 4090 it will really push down the price of a secondhand 4090

Which would you choose for a grand ;)
A brand new 5080 16GB with a 3+ year warranty, sold by a well-known retailer or by Nvidia,
or a secondhand 4090 24GB with no warranty, with the risk of something being wrong with it (remember the melting connectors), sold by some random stranger?


If the 5080 is priced at £999 the 4090 secondhand value could be taking a huge nose dive
That's as long as there's good stock of them and buyers don't need to wait 6 months to get hold of one ;)
I feel like the 4090 secondhand market won't take as much of a nosedive as you think. With the 5080 only having 16GB of VRAM, I can see the 4090 retaining some value with its 24GB for all the AI nerds that don't want to pay for a 5090.
 
The GTX 260 was built on the top die of its time, so the fair comparison would be a 4060 built on AD102. It also had 90% of the top-tier card's bus width and VRAM.

A comparable card to the GTX 260 today for $585 would have to use the AD102 and have at least a 320-bit bus and 20GB of VRAM.

A more valid comparison, specs-wise, to a current-day $300 4060 would be the GT 240, which launched for $80 and would be about $120 today.

This is why it's important to look at the actual specs of a card rather than go by the naming.

It's been explained to people several times over the last few years, on here and by GamersNexus, etc. People just seemingly want to pretend shrinkflation isn't happening.

You may be a person that’s very susceptible to some of these issues but I think, for some people, there is a certain amount of ‘retconning’ when it comes to why they dislike ray-tracing.
You must have missed people talking about the noise issue for the last several years. There are technical videos explaining it - it's an artefact caused by current hardware not being able to cast enough rays. The more rays you cast, the less noise there is. This is why we need improvements in denoising, and even Nvidia has talked about it extensively. It's why they introduced ray reconstruction as an attempt to address this.

So I'm not sure why you are trying to know better than Nvidia.
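That "more rays, less noise" relationship is just Monte Carlo statistics: averaging N random ray contributions gives an estimate whose noise shrinks roughly as 1/sqrt(N). A toy sketch (nothing here is from an actual renderer; the uniform "contributions" are a stand-in for real ray samples):

```python
import random
import statistics

def noisy_pixel(n_rays: int, seed: int) -> float:
    """Average n_rays random 'contributions' for one pixel.
    The true value is 0.5; fewer rays means a noisier estimate."""
    rng = random.Random(seed)
    return sum(rng.random() for _ in range(n_rays)) / n_rays

def spread(n_rays: int, trials: int = 2000) -> float:
    """Standard deviation of many independent pixel estimates."""
    return statistics.stdev(noisy_pixel(n_rays, s) for s in range(trials))

# Noise falls roughly as 1/sqrt(n_rays): 16x the rays, ~4x less noise.
print(spread(4), spread(64))
```

This is also why denoisers (and ray reconstruction) exist: they try to recover a clean image from the few samples per pixel that real-time hardware can afford.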
 
You may be a person that’s very susceptible to some of these issues but I think, for some people, there is a certain amount of ‘retconning’ when it comes to why they dislike ray-tracing.

Before that HUB video it was:

“It tanks performance and you can barely see the difference. Not worth it. Also things can look too shiny.”

Not invalid points! Some people could legitimately argue that RT is rubbish for these reasons, from the perspective of preference.

Now after that video it’s:

“Textures become too noisy! It’s weird when you move around! The reflections are lower resolution!”

^Not invalid points either, technically, but you’d have to be a hawk to notice some of these things and pretty much nobody was talking about them before. If anything, these points show that the tech has some downsides, but they aren’t primary reasons for disliking it IMO. Even the HUB video notes at several points that the visual presentation of lighting is generally better with RT enabled.

Personally, I dislike it when things appear too shiny - that’s the aspect that bothers me. Hence when there is an option for RT shadows only (rather than reflections) I have sometimes opted for that.

It's really simple. When you have your own little echo chamber of "experts" drowning out the threads with how good it is, you would see that none of the "experts" picked up on this even though it's their special niche topic. Now that a spotlight has been shone on the negatives, together with low-end hardware unable to really stand a chance with it, you get the broader picture: it's still in its infancy and games are not focusing on this enough to get decent results.
 
The amount of salt flowing in the other threads is hilarious. People having pre-launch meltdowns about others wanting to buy cards.
Sit back and watch it flow; it will eventually ease, and some of the people who had meltdowns will roll into a thread at some point sheepishly announcing they got a card anyway.
 
Was looking at an old Woolworths 1978 ad on YouTube yesterday.

Large tin of Quality Street: £4.99 on offer, down from £6.25, in 1978.
£4.99 in 1978 would be about £35.50 today :eek::eek:

In 2011, the 1.2kg tin was £5:

That would be £7.56 in 2024. Also, the quality of the older sweets was better - there was more cocoa and a better selection of sweets.

The shrinkflation is not only in the cost, but also in the use of lower-quality ingredients. Just look at Cadbury's chocolate - less cocoa, less milk and more sugar.

WRT graphics cards: the GTX 580 at the end of 2010 gave you an almost 600mm2 dGPU for $500. That would be $723 in 2024. The GTX 580 was also stupidly priced compared to the GTX 570, which was $330, or $477 in 2024 money.

The GTX 580 used the fully enabled GF110 dGPU, and the GTX 570 was 94% enabled with about 80% to 85% of the memory bandwidth and 85% of the VRAM.

The RTX 4090 was $1600 in 2022, or $1724 in 2024. It uses 89% of the AD102 dGPU. So by past standards, the RTX 4090 would be a GTX 570-class dGPU.

So in reality we are paying between 3 and 4 times more for a GTX 570-class dGPU. Not all of that is because the parts, etc., cost more to source.

The 80-series dGPUs we get now (and the AMD equivalents) are closer to a GTX 560 Ti-class card of that era.
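The "percentage enabled" figures above come from dividing the card's active shader units by the full die's count. The unit counts here (128 of 144 SMs for a 4090 on AD102, 480 of 512 CUDA cores for a GTX 570 on GF110) are assumed from public spec listings:

```python
# Fraction of the full die's shader units that each cut-down card enables.
# Unit counts are taken from public spec listings (assumed accurate).
def enabled_pct(active_units: int, full_die_units: int) -> float:
    return round(100 * active_units / full_die_units, 1)

print(enabled_pct(128, 144))  # RTX 4090 on AD102: ~88.9 (the "89%" above)
print(enabled_pct(480, 512))  # GTX 570 on GF110: ~93.8 (the "94%" above)
```

Note this counts shader units only; as pointed out below, fixed-function and front/back-end circuitry is not cut down, so the fraction of total die area enabled is higher.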
 
Shrinkflation is estimated against a constant base; here you are comparing a GTX 580 with a full-die RTX 4090. It's not the same as comparing the % of cocoa in 100g of chocolate.
If you estimate things in money terms (like fps/$), again the shrinkflation argument doesn't hold.

Edit: also, it's not 89% of the full die. That number only reflects the cut in streaming multiprocessors; the fixed-function and front/back-end circuits on the chip remain untouched, so in die-area terms it might be somewhere around 95% of the full die for the 4090.
 