• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

NVIDIA ‘Ampere’ 8nm Graphics Cards

2080 Ti was more expensive than 1080 Ti.
1080 Ti was more expensive than 980 Ti.
980 Ti was more expensive than 780 Ti.

We've seen this three generations in a row now.
The difference between the release price of 980Ti and 1080Ti was $50.

The only time prices went up stupidly high was with Turing. Also, the 2080 Ti is on a huge die, and I doubt the 3080 Ti will be. So no, it is not justified to expect prices to keep going up.
 
That's pretty damn good.

I can't remember how much I purchased my RTX 2080 for but I think it was in the ballpark of £560. This was near launch - snagged it from someone who had an extra GPU he got for free.

I'd be happy to get anywhere near that price when sold.

Yeah, holding on and skipping one or two generations isn't economical.

I upgraded every generation and only lost £310 in the last 5 years. My buddy who got the Titan X at the same time as I did back in 2015 but never sold/upgraded (to save money in his mind) struggles to sell it for more than £250. So he's lost about £600 while sticking to the same GPU for 5 years.
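The comparison above can be sketched in a few lines. Note the individual buy/sell prices here are hypothetical illustrations; only the two net-loss totals (£310 vs ~£600) come from the post.

```python
# Compare total depreciation of "upgrade every gen" vs "hold one card".
def net_cost(transactions):
    """Sum of (buy price - sell price) across ownership periods, in GBP."""
    return sum(buy - sell for buy, sell in transactions)

# Hypothetical: three upgrade cycles, each card sold before the next gen.
upgrader = [(900, 800), (950, 840), (1000, 900)]
# Hypothetical: one Titan X bought in 2015, worth ~GBP 250 today.
holder = [(850, 250)]

print(net_cost(upgrader))  # 310
print(net_cost(holder))    # 600
```

The point being that resale value, not purchase price, decides which strategy is cheaper over a given period.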
 
The difference between the release price of 980Ti and 1080Ti was $50.

The only time prices went up stupidly high was with Turing. Also, the 2080 Ti is on a huge die, and I doubt the 3080 Ti will be. So no, it is not justified to expect prices to keep going up.

I hope you're right, but I do expect some price hike, towards the region of £1500, with the normal 3080 approaching the £900-1000 range.

We'll have to wait and see.
 
I hope you're right, but I do expect some price hike, towards the region of £1500, with the normal 3080 approaching the £900-1000 range.

We'll have to wait and see.
I bet you a tenner that will not be the case ;)
 
They just might, you know, or you're gonna have a lot of pi**ed Turing card owners. Imagine paying £1200 for a 2080 Ti only to see it tank to £600, if the performance gains claimed for Ampere are to be believed?

I totally agree that they might tank in value, but it's all related to the core price / performance bump of the next lot (I don't think the average buyer yet cares enough about ray tracing). A £1,500 3080ti would prevent tanking, is my point. Although I don't think it'll be that much. From what I've seen, a £1k 2080ti replacement with 40% better basic performance seems likely. Hopefully it's better than that though!
 
Why are people saying things like 3080Ti costing £1400 to £1500?

Every time a new gen GPU comes out people do this. So will the 4080Ti cost £1800 to £2000? The 5080Ti £2300 to £2500 etc? Makes no sense.

I am willing to bet the 3080 Ti will be the same price as, or less than, a 2080 Ti.

More expensive to produce.

I don’t expect it to cost £1500 but I’ll bet it will be around the £1000-1100 mark again.
 
2080 Ti was more expensive than 1080 Ti.
1080 Ti was more expensive than 980 Ti.
980 Ti was more expensive than 780 Ti.

We've seen this three generations in a row now.

No, we haven't. You are looking at names rather than price points.

The 1080 Ti CRUSHED the 980Ti for ~$50 freaking dollars. 1080Ti money could get you a Turing card that was about as fast as a 1080Ti....plus ray tracing in its infancy.

Turing wasn't even in the same zip code as Pascal when it comes to generational advancement.
 
No, we haven't. You are looking at names rather than price points.

The 1080 Ti CRUSHED the 980Ti for ~$50 freaking dollars. 1080Ti money could get you a Turing card that was about as fast as a 1080Ti....plus ray tracing in its infancy.

Turing wasn't even in the same zip code as Pascal when it comes to generational advancement.

You're making my point for me. Nvidia gave people more performance, at higher prices. Or the same performance, at the same prices (plus some features that weren't useful at the time, like ray tracing).

I only hope this isn't the case again. I don't want to see a £1000-1100 product (whatever its name will be) that gives the same performance as a 2080 Ti plus a couple of new features.
 
No, we haven't. You are looking at names rather than price points.

The 1080 Ti CRUSHED the 980Ti for ~$50 freaking dollars. 1080Ti money could get you a Turing card that was about as fast as a 1080Ti....plus ray tracing in its infancy.

Turing wasn't even in the same zip code as Pascal when it comes to generational advancement.
Yea, how did they manage that? Surely the 1080Ti was more expensive to produce also? Hmm... lol.
 
Yea, how did they manage that? Surely the 1080Ti was more expensive to produce also? Hmm... lol.

775mm² on the 2080 Ti. To put that into perspective: on a perfect-yield basis, you'd get 3-4 5700 XT chips out of the same space as one 2080 Ti chip. Not so much £1100 worth, but reasoning enough as to why they are expensive to produce.
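A rough sketch of that die-size argument. The die areas are the published figures for TU102 (2080 Ti) and Navi 10 (5700 XT); the defect density and the yield model are my own illustrative assumptions, not manufacturing data.

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Classic approximation for usable dies on a circular wafer:
    wafer area / die area, minus an edge-loss correction."""
    r = wafer_diameter_mm / 2
    return math.floor(math.pi * r**2 / die_area_mm2
                      - math.pi * wafer_diameter_mm
                        / math.sqrt(2 * die_area_mm2))

def yield_fraction(die_area_mm2, defects_per_cm2=0.1):
    """Poisson yield model: larger dies are hit by defects more often."""
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100)

for name, area in [("2080 Ti (TU102)", 775), ("5700 XT (Navi 10)", 251)]:
    good = dies_per_wafer(area) * yield_fraction(area)
    print(f"{name}: ~{good:.0f} good dies per 300 mm wafer")
```

On perfect yield the ratio works out to roughly 3.6 small dies per big die, matching the "3-4" above; once defects are factored in, the gap widens further, because a single defect kills a much larger fraction of a big die's wafer.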
 
Ooft, how much power are these things gonna use :eek:

I think that's unlikely, as no current PSU can output to that socket. And 600W on a two-slot GPU? Come on! The article is incorrect in saying that it's the same as two 6-pin sockets. If it is true, there will have to be an adapter that takes two (or more) 8-pin plugs.
 
NVIDIA GeForce RTX 30 Series Ampere Gaming Graphics Cards Rumored To Utilize New 12-Pin Power Interface

https://wccftech.com/nvidia-geforce-rtx-30-ampere-gaming-graphics-cards-new-power-connector-rumor/

Ooft, how much power are these things gonna use :eek:

That's actually impressive and reassuring. A lot more power hungry + smaller processing node means we're getting a good performance upgrade!

I think that's unlikely, as no current PSU can output to that socket. And 600W on a two-slot GPU? Come on! The article is incorrect in saying that it's the same as two 6-pin sockets. If it is true, there will have to be an adapter that takes two (or more) 8-pin plugs.

There will surely be adapters regardless; it's not like the card will take a new voltage. It will still be 12V, just delivered over more pins. They could simply include an adapter in the box, like they used to do with Molex-to-PCIe adapters.
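For what it's worth, the ~600W headroom figure mentioned above follows from simple pin arithmetic, assuming a Mini-Fit-style connector where half of the 12 pins carry +12V and the rest are ground, and each pin is rated for roughly 8-9A (the per-pin current rating is my assumption, not from the article):

```python
def max_watts(power_pins, amps_per_pin, volts=12):
    """Theoretical connector headroom: live pins x current x voltage."""
    return power_pins * amps_per_pin * volts

# 12-pin connector: 6 live +12V pins
print(max_watts(6, 8))  # 576 W at a conservative 8 A/pin
print(max_watts(6, 9))  # 648 W at 9 A/pin
```

That's connector headroom, not the card's actual draw; for comparison, a standard 8-pin PCIe plug is spec-limited to 150W even though its pins could physically carry more.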
 
I think that's unlikely as no current PSU can output to that socket. And 600W on a two-slot GPU? Come on! The article is incorrect in saying that it's the same as two 6 pin sockets. If it is true there will have to be an adapter that takes two (or more) 8 pin plugs.

If it's true, I literally have no idea what planet Nvidia are on. What are they even thinking with this? And, like you say, what will it connect to on a standard ATX supply?
 
That's actually impressive and reassuring. A lot more power hungry + smaller processing node means we're getting a good performance upgrade!



There will surely be adapters regardless; it's not like the card will take a new voltage. It will still be 12V, just delivered over more pins. They could simply include an adapter in the box, like they used to do with Molex-to-PCIe adapters.

Exactly. I'm no electrical engineer, but is it possible the adaptor could connect to more than two PCIe power cables?

Also, is this a sign that Nvidia are having to blow their power budget to compete with AMD, assuming they know roughly where Navi 2x will perform?
 
Exactly. I'm no electrical engineer, but is it possible the adaptor could connect to more than two PCIe power cables?

Also, is this a sign that Nvidia are having to blow their power budget to compete with AMD, assuming they know roughly where Navi 2x will perform?

Of course it's possible. We had that in the past, like 2x Molex to one 8-pin PCIe.
 