Geforce GTX1180/2080 Speculation thread

Soldato
Joined
13 Jun 2009
Posts
6,847
I have no doubt the 1170 will surpass the 1080 (maybe even match the 1080 Ti), I am just very sceptical of it actually being much cheaper. I bet it'll be more expensive than the 1070 was, which partially defeats the point.
 
Associate
Joined
15 Jun 2009
Posts
2,189
Location
South London
I have no doubt the 1170 will surpass the 1080 (maybe even match the 1080 Ti), I am just very sceptical of it actually being much cheaper. I bet it'll be more expensive than the 1070 was, which partially defeats the point.
It may not release much cheaper than a 1080, but if the 1170 matches the 1080 Ti, or comes within 5% of it, for a good chunk less than the 1080 Ti, it will sell a lot. Only time will tell, but the 970 and 1070 before it offered previous-gen Titan-competitive performance at a much more affordable price.

The thing is, with the 970 and 1070, on release the difference was marginal versus the 780 Ti/original Titan and 980 Ti/Titan X Maxwell, but over time NVIDIA prioritizes optimizing the newer technology, so the newer cards make up a bit more ground on the older gens. Although, in a blind test, it would still be hard for me to tell the difference between a 980 Ti and a 1070 today.
 
Soldato
Joined
23 May 2006
Posts
6,845
I have no doubt the 1170 will surpass the 1080 (maybe even match the 1080 Ti), I am just very sceptical of it actually being much cheaper. I bet it'll be more expensive than the 1070 was, which partially defeats the point.

Sadly, yeah, I think we can be certain it will be way over the price of the 1070... but if it launches at around the performance of the 1080 Ti, and is "only" £50 dearer than the 1070 for argument's sake, while using less power than the 1080 Ti, then I think there is a strong use case there.

Also, I can't remember where I read it, so it may not be true (which may or may not make it a lie, depending on the definitions above :D), but I do remember reading that the next generation would be "OK" at launch, but would really come into its own once games supported Vulkan and the other post-DX12 APIs.

By the way, I read this as a humorous segue... apparently there was never going to be a DirectX 12! An interesting statement (lie? ;)) from AMD back in 2013.

https://tech.slashdot.org/story/13/04/12/1847250/amd-says-there-will-be-no-directx-12-ever
 
Associate
Joined
20 Sep 2011
Posts
812
Given that the GTX 1080 only held the GPU crown for two months and was heavily overpriced by retailers, is everyone buying the 1180 at launch, or waiting for the Titan, if Nvidia do a repeat of last gen?

Taking that into consideration, the Titan XP in August 2016 would have been a decent purchase, given how long it's been top end.
 
Associate
Joined
15 Jun 2009
Posts
2,189
Location
South London
Given that the GTX 1080 only held the GPU crown for two months and was heavily overpriced by retailers, is everyone buying the 1180 at launch, or waiting for the Titan, if Nvidia do a repeat of last gen?

Taking that into consideration, the Titan XP in August 2016 would have been a decent purchase, given how long it's been top end.

That angle does come up fairly often when justifying the Titan premium. Ultimately, though I think the 1080 and 1180 are/will be overpriced, I'll still fork out for one, as in comparison to the Titan premium it's a bargain imo.

Getting the Titan, to me, is basically paying 80% more than a Ti instead of waiting for the Ti release, with no real-world gaming performance difference.

The above argument is null if you're getting the Titan for non-gaming features. Bottom line is, if you're happy to pay the price, you can justify it any way you want... or not. It's an individual's choice.

Titan is just not for me because, besides the price, I particularly despise the poor performance of the stock blower cooler compared to the best aftermarket coolers (and I can't bear the hassle of waterblocks).
 
Last edited:

TNA
Caporegime
Joined
13 Mar 2008
Posts
27,570
Location
Greater London
That angle does come up fairly often when justifying the Titan premium. Ultimately, though I think the 1080 and 1180 are/will be overpriced, I'll still fork out for one, as in comparison to the Titan premium it's a bargain imo.

Getting the Titan, to me, is basically paying 80% more than a Ti instead of waiting for the Ti release, with no real-world gaming performance difference.

The above argument is null if you're getting the Titan for non-gaming features. Bottom line is, if you're happy to pay the price, you can justify it any way you want... or not. It's an individual's choice.

Titan is just not for me because, besides the price, I particularly despise the poor performance of the stock blower cooler compared to the best aftermarket coolers (and I can't bear the hassle of waterblocks).

This did occur to me also, but I am not sure a Titan will be coming so soon this time, as we had the Titan V (admittedly not a traditional Titan, pricing-wise anyway) not so long ago. Also, a new Titan could end up being even more expensive than the usual £1,000. So picking up an 1180 on release might not be as bad this time around.

I am not a fan of the noise of stock coolers or their performance either, but I have been waiting ages for something new and shiny and do not fancy waiting a couple of months for aftermarket cards this time. I will just use my headphones when playing new games. With older ones, or games like FIFA, the fan will not need to work half as hard to maintain 60fps anyway.

If it ends up looking like I will be keeping the 1180 long term, I can take a small hit on it and buy an aftermarket water-cooled version a few months down the line. It all depends on what games they announce at E3 this year too. If, say, Cyberpunk 2077 is not coming out until late 2019, then an 1180 will likely be more than enough for me for two years.
 
Soldato
Joined
16 Jun 2004
Posts
3,215
If, say, Cyberpunk 2077 is not coming out until late 2019

If that's what you're looking towards, then start saving now for a 32-inch, 120Hz, 4K HDR monitor as well, because that neon-lit city will HAVE to be played on such a monitor to do it justice; anything less would be a crime. :)
 
Associate
Joined
15 Jun 2009
Posts
2,189
Location
South London
This did occur to me also, but I am not sure a Titan will be coming so soon this time, as we had the Titan V (admittedly not a traditional Titan, pricing-wise anyway) not so long ago. Also, a new Titan could end up being even more expensive than the usual £1,000. So picking up an 1180 on release might not be as bad this time around.

I am not a fan of the noise of stock coolers or their performance either, but I have been waiting ages for something new and shiny and do not fancy waiting a couple of months for aftermarket cards this time. I will just use my headphones when playing new games. With older ones, or games like FIFA, the fan will not need to work half as hard to maintain 60fps anyway.

If it ends up looking like I will be keeping the 1180 long term, I can take a small hit on it and buy an aftermarket water-cooled version a few months down the line. It all depends on what games they announce at E3 this year too. If, say, Cyberpunk 2077 is not coming out until late 2019, then an 1180 will likely be more than enough for me for two years.
Not a bad way to go about things. That way you minimize the waiting period and get more enjoyment out of the hardware.

On another note, I got an email today from NVIDIA UK trying to peddle 10-series cards to me, lol. It had links to retailers selling aftermarket 1060 3GB/6GB, 1070 Ti, 1080 and 1080 Ti GPUs.

If that isn't a sign of a sell-off of the current gen, I don't know what is, haha :p
 
Soldato
Joined
30 Aug 2014
Posts
5,963
Sadly, yeah, I think we can be certain it will be way over the price of the 1070... but if it launches at around the performance of the 1080 Ti, and is "only" £50 dearer than the 1070 for argument's sake, while using less power than the 1080 Ti, then I think there is a strong use case there.

Also, I can't remember where I read it, so it may not be true (which may or may not make it a lie, depending on the definitions above :D), but I do remember reading that the next generation would be "OK" at launch, but would really come into its own once games supported Vulkan and the other post-DX12 APIs.

By the way, I read this as a humorous segue... apparently there was never going to be a DirectX 12! An interesting statement (lie? ;)) from AMD back in 2013.

https://tech.slashdot.org/story/13/04/12/1847250/amd-says-there-will-be-no-directx-12-ever
Mantle pushed Microsoft to do DX12; even some of the documentation for DX12 is identical to Mantle's, so it wasn't a lie.
 
Associate
Joined
15 Jun 2009
Posts
2,189
Location
South London
I would probably go for the 11 series depending on when it comes out.

How much do you think I would get for my 1080 Ti now?
Quite a few posts have come up in the Members Market asking £700 for used 1080 Tis, but people are sensibly very reluctant to entertain those prices, given the current rumours of new, faster cards set to come in at that price range shortly.

Trying to sell for about 5% less than your purchase price isn't going to cut it; I usually look to sell for around 20-33% less than I paid. I must admit an exception was selling a 1080 Ti Hybrid FTW during the mining craze, which went for only £50 less than I bought it for new.
 

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,570
Location
Greater London
If that's what you're looking towards, then start saving now for a 32-inch, 120Hz, 4K HDR monitor as well, because that neon-lit city will HAVE to be played on such a monitor to do it justice; anything less would be a crime. :)

I already have a 4K G-Sync panel which I am very happy with. I have very little need for 120Hz due to the games I play. I'm not saying I would not like it if it came as standard or cost a little extra, but I would not pay the silly asking price on those new 4K 120Hz monitors for something that will make little difference to my gaming. Besides, I do not think there will be any cards out on release that will run Cyberpunk 2077 maxed out at 4K above 60fps anyway. I heard they may be doing something like Crysis, where it will be ahead of its time graphically. I will probably end up playing it at 30fps with the latest card for the eye candy :)

As for HDR, there are no proper HDR monitors right now as far as I know.

My next monitor will need to be OLED for me to want to upgrade anyway.

Not a bad way to go about things. That way you minimize the waiting period and get more enjoyment out of the hardware.

On another note, I got an email today from NVIDIA UK trying to peddle 10-series cards to me, lol. It had links to retailers selling aftermarket 1060 3GB/6GB, 1070 Ti, 1080 and 1080 Ti GPUs.

If that isn't a sign of a sell-off of the current gen, I don't know what is, haha :p

Haha. Yeah. Hopefully it will be soon. I want to play Final Fantasy 15 :)
 
Mobster
Soldato
Joined
4 Apr 2011
Posts
3,501
If that's what you're looking towards, then start saving now for a 32-inch, 120Hz, 4K HDR monitor as well, because that neon-lit city will HAVE to be played on such a monitor to do it justice; anything less would be a crime. :)

I own W3 and saw the pre-release footage of W3, so I'm under no illusion: CP2077 will be just as butchered. It won't matter which monitor we all own :)
 
Soldato
Joined
28 Sep 2014
Posts
3,437
Location
Scotland
Mantle pushed Microsoft to do DX12; even some of the documentation for DX12 is identical to Mantle's, so it wasn't a lie.

Sorry, no, Mantle did not push Microsoft to do DirectX 12. When AMD released the Mantle whitepaper and SDK back in 2014, I read both and realised AMD had taken a leaf from the Fahrenheit low-level API whitepaper and SDK, which were identical to Mantle and DirectX 12. Microsoft started developing the Fahrenheit low-level API back in 1997 to replace DirectX, after game developers including John Carmack complained that DirectX 5 games were too slow and had bottleneck issues, and asked Microsoft to develop a low-level API for games that could deliver high draw-call counts and remove CPU overhead. Microsoft abandoned it a few years later, when they invested heavily in DirectX 7 and game developers were very pleased with DirectX 7 performance. Nvidia and ATi did not write Fahrenheit drivers for Windows XP, so Microsoft released their generic Fahrenheit XSG driver on their website, then pulled the plug on the DirectX Fahrenheit website. I downloaded the driver and left it in a folder alongside the Fahrenheit whitepaper and SDK for a few months, to see if somebody would create a tech demo or game demo that took advantage of the Fahrenheit low-level API. It was a real shame that nobody, game developers included, as well as 3dfx, ATi and Nvidia, ever did anything after 1997 to show off what the new API could do at a low level compared to DirectX's high level; they were all far too happy and too busy developing DirectX games, tech demos and drivers for PC and consoles. I deleted the folder two years later while cleaning up junk files, and I wish I had never deleted that Fahrenheit whitepaper, SDK and driver, so I could upload the files to prove that Mantle and DirectX 12 were both identical to the Fahrenheit low-level API. It is very difficult to find these nearly 20-year-old files on the internet.
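To make the draw-call/CPU-overhead point concrete, here is a toy C++ sketch (my own illustration; every type and cost in it is made up, and it has nothing to do with the actual Fahrenheit, Mantle or DX12 APIs) of why moving per-draw driver validation out of the submit loop is what lets a low-level API push far more draw calls per frame:

[code]
// Toy model only -- none of this is a real graphics API. It just shows
// why a low-level API that validates state once at record time can push
// far more draw calls per frame than one that re-validates every call.
#include <chrono>
#include <cstdio>
#include <vector>

struct DrawCmd { int mesh; int material; };   // hypothetical draw call

static volatile long sink = 0;  // stops the compiler deleting the work

// Stand-in for per-draw driver overhead (state validation, hazard
// tracking, etc.) that a high-level API pays on every single call.
static void validate_state(const DrawCmd& c) {
    for (int i = 0; i < 200; ++i)
        sink = sink + (c.mesh ^ (c.material + i));
}

// "High-level" style: the driver validates inside every draw call.
static void submit_immediate(const std::vector<DrawCmd>& frame) {
    for (const DrawCmd& c : frame) { validate_state(c); sink = sink + c.mesh; }
}

// "Low-level" style: commands were validated once when the command
// buffer was recorded, so replaying it is just a tight loop.
static void submit_recorded(const std::vector<DrawCmd>& frame) {
    for (const DrawCmd& c : frame) sink = sink + c.mesh;
}

int main() {
    std::vector<DrawCmd> frame(100000, DrawCmd{1, 2});
    auto t0 = std::chrono::steady_clock::now();
    submit_immediate(frame);
    auto t1 = std::chrono::steady_clock::now();
    submit_recorded(frame);
    auto t2 = std::chrono::steady_clock::now();
    auto us = [](auto a, auto b) {
        return std::chrono::duration_cast<std::chrono::microseconds>(b - a).count();
    };
    std::printf("immediate submit: %lld us, recorded replay: %lld us\n",
                (long long)us(t0, t1), (long long)us(t1, t2));
}
[/code]

As I understand it, that record-once/replay-many pattern is the command-buffer model that Mantle, DX12 and Vulkan all expose.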

Microsoft started developing the Fahrenheit low-level API in 1997

https://web.archive.org/web/20080421215948/http://www.directx.com/graphics/fahrenheit.htm
http://www.aha.ru/~const/ftp/Fahrenheit - Frequently Asked Questions.htm

The problem is that if Microsoft had replaced DirectX with the Fahrenheit low-level API, the new API would not have been fully backward compatible with DirectX and OpenGL, so game/software developers, movie studios and graphics card companies would have had to start from a clean slate: rendering graphics, creating games and applications, and writing graphics drivers from scratch. This is exactly what happened with Mantle, which was not backward compatible with DirectX, and with Vulkan, which was not backward compatible with either Mantle or OpenGL. The industry did not want that with the Fahrenheit low-level API, so Microsoft revisited the project a few years later and employed Fahrenheit API architect David Blythe from September 2003 until February 2010 to develop DirectX 10, DirectX 11 and the DirectX 12 low-level API; Microsoft wanted to retain full backward compatibility with DirectX.

https://www.linkedin.com/in/david-blythe-4726644

Nvidia started Fermi DirectX 12 GPU development in 2006.

https://www.extremetech.com/computing/83236-nvidia-launches-next-cuda-architecture

AMD probably started first-gen GCN DirectX 12 GPU development at the same time as HBM, in 2006 or 2007.

https://www.kitguru.net/components/...o-work-on-hbm-technology-nearly-a-decade-ago/

AMD and Johan Andersson from DICE started Mantle development in 2012.

https://www.neogaf.com/threads/dice...nvidia-intel-others-only-amd-listened.715349/
 

bru

Soldato
Joined
21 Oct 2002
Posts
7,360
Location
kent
Stop that right now; we all know that AMD created everything, and anything that NVidia, Intel or Microsoft did or didn't do is just wrong and not real.

DirectX 12 couldn't possibly be the next iteration of DirectX after DirectX11, because that wouldn't make any sense at all, oh wait.......;)
 
Caporegime
Joined
18 Oct 2002
Posts
29,852
Stop that right now; we all know that AMD created everything, and anything that NVidia, Intel or Microsoft did or didn't do is just wrong and not real.

Tbf, with all the money washing around nVidia, they really just do the bare minimum for gamers imo. AMD do just as well in this regard, and they've got bugger all!!!
 
Soldato
Joined
30 Aug 2014
Posts
5,963
Sorry, no, Mantle did not push Microsoft to do DirectX 12. When AMD released the Mantle whitepaper and SDK back in 2014, I read both and realised AMD had taken a leaf from the Fahrenheit low-level API whitepaper and SDK, which were identical to Mantle and DirectX 12. [snip]
I hadn't heard of the Fahrenheit API before. If it's as you claim, then fair enough. My question is how AMD were able to be first to market with a fully functional lower-level API, concocted in a relatively short period of time, given that they were basically going bankrupt and had very few resources, while Microsoft have massive resources and DX12 came out significantly later?

Surely any sane company would realise that if Microsoft were indeed working on a lower-level API (DX12) when Mantle was conceived, then it would be a massive waste of precious resources to make what would effectively be a totally redundant API. This is why your version of events doesn't make sense to me.
 