
GeForce GTX 1180/2080 Speculation thread

If it's £1K for a Ti, I don't think I'll bother. Been waiting ages for the next significant bump in performance since I want to move to exclusively 4K gaming but I don't think I can justify that much for the privilege.

And before anyone says it: no, a 1080Ti or V64 just can't cut it, I'm afraid.

Been playing FO3 at 4K on my RX480 for the past week. Even old games look better at 4K, but I'd like to be able to run the latest games maxed out too. But at £1K? Too much if that's the release price. £800-ish and I'll probably bite if the performance is good enough, which it really should be for the 2080Ti.
 
Just going to chime in. As far as I'm aware, nobody is undervolting 1080s to get them running at over 2000MHz (or at least to get the max OC); instead they're increasing the voltage to get there, which quickly pushes power consumption to well over 300W. The Vega card, however, gets much better and much more stable overclocks when it's undervolted. Trust me, I've run about a thousand tests in the last few days trying to get there on a reference card.
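
For a rough sense of why the extra voltage hurts so much, here's a back-of-the-envelope Python sketch using the usual dynamic-power approximation (board power scales roughly with clock and with the square of voltage). Every number in it is an illustrative guess, not a measurement from any particular card:

# Back-of-the-envelope scaling: board power ~ clock * voltage^2.
# All baseline figures below are illustrative assumptions, not measured values.

def scaled_power(base_power_w, base_clock_mhz, base_volt, new_clock_mhz, new_volt):
    """Estimate new board power from a baseline, assuming P ~ f * V^2."""
    return base_power_w * (new_clock_mhz / base_clock_mhz) * (new_volt / base_volt) ** 2

# Hypothetical GTX 1080 pushed to ~2050MHz with extra voltage
print(round(scaled_power(220, 1800, 1.00, 2050, 1.09)))  # ~298W

# Hypothetical Vega undervolted at roughly stock clocks
print(round(scaled_power(295, 1550, 1.20, 1600, 1.05)))  # ~233W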

Sorry, this is going way off-topic. I realise that Vega's wattage can be tamed by undervolting, but the original discussion was about Panos' comparison of his tweaked card to the soon-to-be-released Inno card.


Seems you have no idea of the settings we have in AMD Wattman.
There are four basic ones:
Power Save Mode (170-176W cap), Balanced Mode (230-234W cap), Turbo Mode (270-276W cap) and Custom, where you can run wild with the card settings.
On a card like the Nitro+ these translate to roughly 1430MHz core, 1545MHz, 1630MHz and "go wild" respectively. Average core speed depends on temps.
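
Purely to keep those figures in one place, here's a minimal Python sketch of the same Wattman profiles (the caps and clocks are the ones quoted above for a Nitro+ and are approximate):

# AMD Wattman power profiles as quoted above for a Vega Nitro+ (approximate figures).
WATTMAN_PROFILES = {
    "Power Save": {"power_cap_w": (170, 176), "approx_core_mhz": 1430},
    "Balanced":   {"power_cap_w": (230, 234), "approx_core_mhz": 1545},
    "Turbo":      {"power_cap_w": (270, 276), "approx_core_mhz": 1630},
    "Custom":     {"power_cap_w": None,       "approx_core_mhz": None},  # user-defined
}

for name, profile in WATTMAN_PROFILES.items():
    print(name, profile)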

As for the power, I said factory-overclocked, like my old 1080Ti Xtreme, yet you picked the FE.

I had the card, mate, I know how much power it burned.
That is why I love Vega: when playing EU4/CK2 it can burn only 80W, and I can play Elite, WOT and WOWarships all day at 176W, at 100-110fps average and 140fps locked respectively (FreeSync 2560x1440 monitor, all maxed out).
I picked the FE because PCPer tests card-only power and shows it pretty much matches the stated TDP. The 1080 is the same, btw.

Undervolting, like overclocking, isn't guaranteed, so everyone's mileage will vary. The reason manufacturers set the stock voltage (and max clock) where they do is for maximum stability.

Trying to compare an undervolted card to an unreleased card with unconfirmed specs that hasn't been reviewed yet, but which you know will be significantly faster, is just ridiculous.


Anyway, enough of the talk about old and obsolete kit! There are new cards incoming;)
 
There's nothing hidden about it. The only consumer-grade card (i.e. not including Titans) that uses their largest Pascal GPU is the 1080 Ti. It's rumoured that in the 20xx series they won't even use the largest GPU on the 2080 Ti - it'll probably be a full-fat TU104 (i.e. mid-range chip). If that's true, you'll need to fork out for a Titan if you want their largest GPU, the TU102.

That's right, 2080 Ti buyers might be paying £1200 for a mid-range chip.

Inno 3D are suggesting the following specs for the 2080Ti:

GPU Engine Specs:
CUDA Cores: 4352
Standard Memory Config: 11GB GDDR6
Memory Interface Width: 352-bit

We know that the Turing Quadro RTX 8000 has:
CUDA Cores: 4608
Memory Interface Width: 384-bit

So if the 2080Ti specs are correct, it is not a mid-range card; it looks like a slightly hobbled Quadro RTX 8000.
The 2080 looks like a hobbled Quadro RTX 5000.
That also leaves room for a Titan with the full 4608 core count and 16/24GB of GDDR6, unless they skip a Titan until the next Quadro line-up.
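
To put rough numbers on "slightly hobbled", here's a quick Python sketch comparing the rumoured 2080Ti figures to the Quadro RTX 8000's full chip. The 14Gbps GDDR6 data rate used for the bandwidth estimate is an assumption, not something Inno 3D have stated:

# Rumoured 2080Ti vs the full Turing chip in the Quadro RTX 8000.
# The 14Gbps GDDR6 data rate is an assumed figure for illustration.
TI_CORES, FULL_CORES = 4352, 4608
TI_BUS_BITS, FULL_BUS_BITS = 352, 384
GDDR6_GBPS_PER_PIN = 14  # assumption

print(f"CUDA cores enabled: {TI_CORES / FULL_CORES:.1%}")        # ~94.4%
print(f"Memory bus enabled: {TI_BUS_BITS / FULL_BUS_BITS:.1%}")  # ~91.7%
print(f"Est. bandwidth: {TI_BUS_BITS * GDDR6_GBPS_PER_PIN / 8:.0f} GB/s")  # 616 GB/s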
 
What are everyone's expectations for tonight?

Jensen on stage pulling totally idiotic comparisons out of his ass like he did at SIGGRAPH, babbling on about how CPUs aren't good at ray tracing compared to his new Quadro card (like that's some kind of revelation). The pricing gets unveiled and eyebrows are raised, then Jensen reveals that people who attended the event get a free gift to make purchasing the new card easier. That gift being:


[image]
 
SLI is just another way of saying you have more money than sense and are endlessly chasing the white rabbit. There are no games that need a 2xxx card; that is the sad fact, and it has been for the last few iterations or more.

In my mind the only reason for the new cards is so you can splash out £2000 on a 4k HDR monitor ;)
 
I've been using SLI, buying top-of-the-line GPUs every year since 2012. SLI scaling is not perfect, but it's sufficient.

It's also not supported as well as it used to be; the same goes for AMD with CrossFire. Going forward, multi-GPU is going to get less support as the focus shifts to single-GPU performance.
 