GeForce GTX 1180/2080 Speculation thread

And with the sterling exchange rate bonfire, expect worse than a 1:1 conversion to £... I think Gibbo said they need a 1.3 rate to get a 1:1 conversion (including VAT), and we are worse than that currently.
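
For anyone puzzling over that 1.3 figure: US MSRPs are quoted before sales tax, while UK shelf prices include 20% VAT, so the break-even exchange rate for 1:1 numbers sits well above parity. Here's a minimal sketch of the arithmetic; the 8% retailer margin is an assumed illustrative figure, not anything Gibbo quoted:

    # Why a ~1.3 $/£ rate gives roughly 1:1 $-to-£ pricing.
    # US MSRPs exclude sales tax; UK prices include 20% VAT.
    # The 8% margin is an assumption for illustration only.
    VAT = 0.20

    def uk_price(usd_msrp: float, usd_per_gbp: float, margin: float = 0.08) -> float:
        """Approximate UK shelf price for a US dollar MSRP."""
        gbp_ex_vat = usd_msrp / usd_per_gbp           # convert $ to £ at the exchange rate
        return gbp_ex_vat * (1 + margin) * (1 + VAT)  # add margin, then 20% VAT

    print(round(uk_price(699, 1.30)))  # ~697: roughly 1:1 with a $699 MSRP
    print(round(uk_price(699, 1.22)))  # ~743: a weaker rate pushes £ above the $ figure

At 1.30 the pound figure lands almost exactly on the dollar figure, which matches the claim above; at a weaker rate the £ price overtakes the $ price before any deliberate milking.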

Exchange rate is awesome if yer exporting ;-)

I reckon the price will be £50-£100 more than last gen; still bad, but not the end of the world.
 
And with the sterling exchange rate bonfire, expect worse than a 1:1 conversion to £... I think Gibbo said they need a 1.3 rate to get a 1:1 conversion (including VAT), and we are worse than that currently.
Woah woah woah.

So I've seen 1:1 pricing when the exchange rate is 1.4+. So... someone's milking us.

Equally, if I see a VGA card (or any component) at more £s than $s, I am out. That is absurd.
 
Pricing is a complete mystery at the moment. Chill.
Is it tho? Somebody from OcUK (was it Gibbo? Not sure) already suggested that they thought the 1080 would be £700+ (non-Ti).

It's in one of these threads :p

It shouldn't surprise anyone if these turn out to be ludicrously priced. People will still pay, even if doing so makes no sense whatsoever. People would still pay for a 5% upgrade, just to say "I've got one; it cost me £1000 but I can afford it".
 
Nvidia is a business; their job is to make money. You or anyone else feeling warm and fuzzy about them is irrelevant to them.
Since buying nVidia products is not mandatory for anyone, I would have thought they need to be a little considerate of their customers' feelings.

On the flip side tho, it seems nV aren't afraid of making enemies either. They're quite happy to pee off board partners, MS, Sony (etc), so I imagine they may honestly not give a rat's ass about anyone :p
 
Since buying nVidia products is not mandatory for anyone, I would have thought they need to be a little considerate of their customers' feelings.

On the flip side tho, it seems nV aren't afraid of making enemies either. They're quite happy to pee off board partners, MS, Sony (etc), so I imagine they may honestly not give a rat's ass about anyone :p

When you dominate a market you can take advantage of your customers and partners. That's why we need AMD to come back strong and Intel to eventually produce a quality discrete graphics card.
 
Yeah, it feels like monitor tech is the biggest driving force at the moment. It's safe to assume that a mid-range card will max out any game on a mid-range monitor. Got a high-end monitor? Buy a high-end graphics card.

And if you game in HDR, buy AMD? :P
 
Is it tho? Somebody from OcUK (was it Gibbo? Not sure) already suggested that they thought the 1080 would be £700+ (non-Ti).

It's in one of these threads :p

It shouldn't surprise anyone if these turn out to be ludicrously priced. People will still pay, even if doing so makes no sense whatsoever. People would still pay for a 5% upgrade, just to say "I've got one; it cost me £1000 but I can afford it".

We can guess, but that's all we can do. I think the difference in CUDA cores between the two top-end models is far more interesting; it's huge.

I'm quite excited. I'd planned to buy this year, but a couple of others have put me right about future tech and how quickly Nvidia might move forward. If these cards are as powerful as I think they might be, then 7nm could be staggering, and come sooner than we might think.

I was ready to wait 9 months for the ti anyway, and the monitor I want doesn’t exist yet, so I’m quite content to sit back and watch this new era unfold.

If AMD can power through in the next few months things could get very interesting indeed. I’m hoping their success with Ryzen frees up a bit more capital for re-investment *crosses fingers*.

Price will just be... the price. Enough people will buy and continue to fill NV's pockets. If it's good I might buy; if not I won't.
 
Xxxxxxxxxxxxx is


We can guess, but that's all we can do. I think the difference in CUDA cores between the two top-end models is far more interesting; it's huge.

I'm quite excited. I'd planned to buy this year, but a couple of others have put me right about future tech and how quickly Nvidia might move forward. If these cards are as powerful as I think they might be, then 7nm could be staggering, and come sooner than we might think.

I was ready to wait 9 months for the ti anyway, and the monitor I want doesn’t exist yet, so I’m quite content to sit back and watch this new era unfold.

If AMD can power through in the next few months things could get very interesting indeed. I’m hoping their success with Ryzen frees up a bit more capital for re-investment *crosses fingers*.

Price will just be... the price. Enough people will buy and continue to fill NV's pockets. If it's good I might buy; if not I won't.

Who you sending kisses to? :p
 

ChaosGroup has another demo running on an experimental V-Ray GPU build with a pre-release Quadro RTX 6000 and driver, using RT Cores.

Production scene courtesy of Dabarti CGI. No denoising used.
 
I'm just happy that it appears we're not getting a mediocre upgrade and we'll still be well ahead of this and next gen consoles.

Don't hold your breath on that.

I laugh at how naive many above are; they aren't thinking it through. Nvidia is rushing the 2080 Ti out right now.
That should have given many of them second thoughts about buying an RTX card before the transition to 7nm next year (consoles are also at 7nm).

Also, if AMD were coming out with such a massive chip (in mm²), there would have been a lot of comments, yet I haven't seen a single comment in 157 pages about the impact of having such a big chip. Let's not forget that a big die means a lot of power consumption and a lot of heat.

If the PNY post is correct, this burns 285W at base clocks, so expect a 450W+ heater like the good old GTX 480.
In comparison, Vega 64 @ 1630MHz (turbo boost mode) on the LC, Nitro+ and Red Devil has a 276W hard-set limit, while at 2012MHz core the GTX 1080 Ti consumes 330-340W (depending on the model).
 
285 WATTS TDP

 
Some are saying it's Pascal 1.3 with some extra bits strapped on for ray tracing and a die shrink, plus a locked-in ray tracing ecosystem pushed onto certain game devs. We've seen this before with PhysX...

If the 'GTX' x60/x70 card is power efficient and can handle VR well, that one might be tempting. I don't care about the ray tracing stuff at all, and the price of the RTX cards just seems too steep to justify.

It would be nice to be surprised with decent pricing at launch, but I doubt it xD
 
If the PNY post is correct, this burns 285W at base clocks, so expect a 450W+ heater like the good old GTX 480.

I had a 480. I'm surprised I managed to father two children after having that card. I'd have thought the immense heat from it would have sterilised me.

Come to think of it, I think that was a PNY card too. With the little Nvidia pixie mascot.
 
Don't hold your breath on that.

I laugh at how naive many above are; they aren't thinking it through. Nvidia is rushing the 2080 Ti out right now.
That should have given many of them second thoughts about buying an RTX card before the transition to 7nm next year (consoles are also at 7nm).

Also, if AMD were coming out with such a massive chip (in mm²), there would have been a lot of comments, yet I haven't seen a single comment in 157 pages about the impact of having such a big chip. Let's not forget that a big die means a lot of power consumption and a lot of heat.

If the PNY post is correct, this burns 285W at base clocks, so expect a 450W+ heater like the good old GTX 480.
In comparison, Vega 64 @ 1630MHz (turbo boost mode) on the LC, Nitro+ and Red Devil has a 276W hard-set limit, while at 2012MHz core the GTX 1080 Ti consumes 330-340W (depending on the model).

Except the power consumption and heat of this are worth it because of the performance. Vega's heat, power draw and noise are all a bit meh because its competitor is much more efficient.
 