Anything 50% faster than 1080 ti?

Soldato
Joined
26 May 2009
Posts
22,101
Just wondering, single card... Already own a Ti... building another PC
You will have to wait for the 2080 Ti (or the 1180 Ti, depending what they call it).


SLI/Crossfire used to be much better years ago

they seem to have abandoned it in recent years
Don't worry, once the EU get bored of ruining the television and vacuum cleaner markets they will probably ban GPUs above 140 W TDP; that will get SLI/CF focused on again XD
 
Soldato
Joined
4 Nov 2006
Posts
2,944
Location
London
I looked at getting a 2nd 980 Ti recently, but the support for SLI looked so hit-and-miss it didn't seem worth the hassle.

I remember in LTT's video they said NVIDIA have dropped support, or at least dissuaded customers from going SLI for gaming, so it's not really recommended nowadays because of poor performance.

I dunno why though, they could have stayed silent and kept milking those mugs with paltry returns.
 
Soldato
Joined
26 May 2009
Posts
22,101
@ubersonic - lucky we are leaving the EU then
Andi.
If they chose to do that, and to be clear it's highly unlikely, then not being in the EU wouldn't insulate/protect us, just like it never protected the USA from the EU's crackdown on plasma televisions. If the biggest market is closed to your products then you re-evaluate the products you're producing/developing.
 
Associate
Joined
7 May 2004
Posts
1,951
It is disappointing that huge hardware leaps are few and far between nowadays; however, I feel hardware is at a point where it's lasting longer.
Way back when, a new game would come out and you would have to upgrade to play it at a decent framerate and have it look good. Nowadays, dropping a couple of settings doesn't make the game look as horrible as it used to.

I know with my two 1080s that they have enough grunt and VRAM to last a good few years with high settings, if not higher, in games.

I wouldn't have been able to say that with two 6800 Ultras back in the day.
 
Soldato
Joined
26 May 2009
Posts
22,101
Way back when, a new game would come out and you would have to upgrade to play it at a decent framerate and have it look good.

90s/early 00s mentality: Let's make our game look as awesome as possible so it will sell more!
Late 00s/10s mentality: Let's gimp the graphics on our game so consoles can run it properly so it sells more!
 
Associate
Joined
14 Aug 2017
Posts
1,195
I wish SLI wasn't such a joke!!

Just can't understand why it doesn't work at least 70% of the time!!... They would sell soooooo many more cards!

Is it that bad?

I've just bought a second 1070, and from reviews I've read the performance on a lot of games (not all) seems to be around the level of a Titan Xp or even slightly ahead of it.
Of course my motherboard doesn't have the PCIe capability to actually support SLI so I'm back on the upgrade treadmill again...
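For a rough sense of how a second 1070 could land near a Titan Xp when a game scales well, here is a minimal sketch; the 1.7x Titan Xp figure and the scaling factors are illustrative assumptions, not benchmark numbers.

```python
# Rough back-of-the-envelope SLI scaling check (illustrative numbers only).
# The relative performance figures and scaling factors below are assumptions,
# not measured benchmarks.

gtx_1070 = 1.00  # single GTX 1070 as the baseline
titan_xp = 1.70  # assumed Titan Xp performance relative to one 1070

def sli_performance(single_card: float, scaling: float) -> float:
    """Effective performance of two cards given a per-game scaling factor.

    scaling = 0.0 means the second card adds nothing;
    scaling = 1.0 means perfect 2x scaling.
    """
    return single_card * (1.0 + scaling)

for scaling in (0.5, 0.7, 0.9):
    effective = sli_performance(gtx_1070, scaling)
    print(f"SLI scaling {scaling:.0%}: ~{effective:.2f}x a single 1070 "
          f"({effective / titan_xp:.0%} of an assumed Titan Xp)")
```

On those assumptions it only pulls level with (or past) the Titan Xp in titles that scale at 70% or better, which matches the "a lot of games, not all" caveat above.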
 
Soldato
Joined
29 Aug 2010
Posts
7,845
Location
Cornwall
Well they are - sadly. Nvidia have been doing it since Fermi (2010). Remember that most GTX x80 launches since then haven't been based on their "high-end" GPUs but rather on mid-range GPUs. The GTX 1080 isn't their high-end chip but a mid-range chip. So they are holding back and not giving us the full power from the start. In 2006 we paid $400 for a high-end chip; today we pay the same for a mid-range chip.
http://digiworthy.com/2017/09/13/nvidias-performance-improvement-per-gen/

In fairness, it's not like Nvidia dropped to doing 20-30% improvements per generation while AMD kept doing 60-100% improvements. AMD did exactly the same, or their chips would've been miles ahead of Nvidia's after just a generation or two, and by now they'd be so far ahead there'd only be one choice for gaming.
It's not like Nvidia are just responding to AMD either. Sixteen months after the 1080 was released, AMD released Vega and only matched the 1080. A mid-range part, apparently, meaning AMD still haven't released a high-end GPU.
 
Associate
Joined
18 Aug 2009
Posts
213
Well they are - sadly. Nvidia have been doing it since Fermi (2010). Remember that most GTX x80 launches since then haven't been based on their "high-end" GPUs but rather on mid-range GPUs. The GTX 1080 isn't their high-end chip but a mid-range chip. So they are holding back and not giving us the full power from the start. In 2006 we paid $400 for a high-end chip; today we pay the same for a mid-range chip.
http://digiworthy.com/2017/09/13/nvidias-performance-improvement-per-gen/


So you are agreeing that they are only holding back the highest GPU on a new product launch by a few months (definitely less than a year)? That doesn't seem unreasonable from either a marketing perspective or a manufacturing-yield perspective. Either way, it definitely doesn't explain the purpose of this thread, which is why incremental upgrades are now 20% and not over 50%.

So I'll give the obvious view: we've hit the point of diminishing returns on GPU development. Once a technology has sufficiently matured, expecting 50% gains per generation is completely unreasonable.
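To put rough numbers on the 20% versus 50% point, here is a quick compounding sketch; the per-generation figures are just the ones quoted in this thread, applied uniformly for illustration.

```python
# How per-generation uplift compounds over several generations.
# The 50% and 20% figures are the illustrative numbers from this thread,
# not measured data.

def cumulative_gain(per_gen_uplift: float, generations: int) -> float:
    """Total speedup after a number of generations, each uplifting by the same factor."""
    return (1.0 + per_gen_uplift) ** generations

for uplift in (0.50, 0.20):
    line = ", ".join(f"gen {g}: {cumulative_gain(uplift, g):.2f}x" for g in range(1, 6))
    print(f"{uplift:.0%}/gen -> {line}")
```

After five generations that is roughly 7.6x versus 2.5x, which is why the drop in per-generation uplift feels so dramatic over time.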
 

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,566
Location
Greater London
we've hit the point of diminishing returns on GPU development.
It certainly is starting to feel that way. It is soon going to be really hard to get extra transistors on a chip. The main way to get gains will be through better architecture, which will likely mean much smaller improvements.

My guess is the cards we get in 2019 on 7nm will last us a very long time.
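As a rough illustration of why a 7nm node is expected to buy so much headroom: in the ideal case transistor density scales with the square of the feature-size ratio, although marketing node names no longer map cleanly to physical dimensions and real density gains come in lower. The node numbers below are just the ones mentioned here; the calculation is a deliberate simplification.

```python
# Naive upper bound on density improvement from a process shrink.
# Node names are marketing labels, so treat this as a rough illustration only.

def ideal_density_gain(old_nm: float, new_nm: float) -> float:
    """Idealised transistor-density gain if everything shrank by old_nm/new_nm."""
    return (old_nm / new_nm) ** 2

print(f"16nm -> 7nm:  ~{ideal_density_gain(16, 7):.1f}x ideal density")
print(f"16nm -> 12nm: ~{ideal_density_gain(16, 12):.1f}x ideal density")
```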
 
Associate
Joined
21 Jul 2013
Posts
357
I still think graphics will just continue to improve. Every year I hear "surely we can't get any better graphics than now!" and every year they improve. :D I was personally blown away by Titanfall 2's graphics!

However, I do feel GPUs are starting to stagnate a bit. My R9 290 has certainly earned its keep. I would probably keep it for another year or two if it didn't pump out so much heat! :p
 
Man of Honour
Joined
21 May 2012
Posts
31,940
Location
Dalek flagship
I still think graphics will just continue to improve.

I think we are still at an early stage in GPU development.

Until we see single cards that can run 8K and higher, with all the details maxed out, on games that won't be available for another 10 or 20 years, we won't even be close.

Then there is VR, which is only going to get more demanding.

I also think that TV/monitor screens are only going to get larger, with resolutions to match; imagine what will be needed to drive a setup that covers an entire wall in your home.

In 20 years' time people will hold up a Titan Xp and say how can you call this a serious graphics card: it is too big, too hot, too noisy, too power hungry and not even fit to drive very basic setups.
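To put the 8K point into perspective, here is a quick pixel-count comparison; the resolutions are the standard ones and the ratios follow directly from them.

```python
# Pixels per frame for common resolutions, relative to 1080p.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}

base = resolutions["1080p"][0] * resolutions["1080p"][1]
for name, (width, height) in resolutions.items():
    pixels = width * height
    print(f"{name}: {pixels / 1e6:.1f} MP ({pixels / base:.1f}x 1080p)")
```

An 8K frame is sixteen times the pixels of 1080p and four times 4K, before higher detail settings or refresh rates even enter the picture.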
 
Associate
Joined
20 Nov 2009
Posts
2,050
Location
Haarby, DENMARK
So you are agreeing that they are only holding back the highest GPU on a new product launch by a few months (definitely less than a year)? That doesn't seem unreasonable from either a marketing perspective or a manufacturing-yield perspective. Either way, it definitely doesn't explain the purpose of this thread, which is why incremental upgrades are now 20% and not over 50%.

So I'll give the obvious view: we've hit the point of diminishing returns on GPU development. Once a technology has sufficiently matured, expecting 50% gains per generation is completely unreasonable.

Yes, if looking from a company's perspective then it makes sense. Sell the mid-range chips at high-end prices and make a profit. However, I'm more looking at the situation from the level of a consumer. We don't get the full power from the get-go, and now we pay the same price for those 20-30% improvements, but on mid-range chips. The high-end chips are now in a new price league that started with the GTX Titan and GTX 780: super-enthusiast level and extremely high prices.
Point is that we as consumers have accepted this as the "new standard" in advancements/improvements - for better or for worse.

I admit I'm part of the consumer base making it possible to uphold those crazy prices, but I'm not fond of it.
 
Associate
Joined
7 Apr 2014
Posts
230
Location
Dudley West Mids
So if a 1080 is a mid-range chip, does that mean the 1070 is low-range, with the Titan being the high end? What does that make the 1060 and below? Are they door stop chips?
 
Associate
Joined
27 Dec 2014
Posts
1,686
Location
Southampton
So if a 1080 is a mid-range chip, does that mean the 1070 is low-range, with the Titan being the high end? What does that make the 1060 and below? Are they door stop chips?

lol, door stop chips!


The categories in my head would be like this:
Titan - enthusiast level
1080 and TI - high end

1070 and 980 Ti - high end of mid range (!)

1060 and below can fit into another 2, 3 categories :)

The ranges are getting difficult to set with so many cards to choose from.

Whatever drives your own setup at good quality is high end to you anyway. So, let's focus less on defining ranges and more on playing those badass games...
 