Absolutely, it definitely might be, but my point is that it isn't all lies just because it's unverified right now. It might well be.
We have absolutely nothing concrete to go on.
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
It may not release much cheaper than a 1080, but if the 1170 matches the 1080 Ti, or comes within 5% for a good chunk less than the 1080 Ti, it will sell a lot. Only time will tell, but the previous 970 and 1070 offered previous-gen Titan-competitive performance at a much more affordable price.
I have no doubt the 1170 will surpass the 1080 (maybe even match the 1080 Ti); I am just very sceptical of it actually being much cheaper. I bet it'll be more expensive than the 1070 was, which partially defeats the point.
Given that the GTX 1080 only held the GPU crown for 2 months and was heavily overpriced by retailers, is everyone buying the 1180 at launch, or waiting for the Titan, if Nvidia do a repeat of last gen?
Taking into consideration the Titan XP, in August 2016 it would have been a decent purchase given how long it's been top end.
That angle does come up fairly often when justifying the Titan premium. Ultimately, though I think the 1080 was and the 1180 will be overpriced, I'll still fork out for it, as in comparison to the Titan premium it's a bargain imo.
Getting the Titan, to me, is basically paying 80% more than a Ti instead of waiting for the Ti release, with no real-world gaming performance difference.
The above argument is null if you're getting the Titan for its non-gaming features. Bottom line is, if you're happy to pay the price, you can justify it any way you want... or not. It's an individual's choice.
Titan is just not for me because, besides the price, I particularly despise the poor performance of the stock blower cooler (I can't bear the hassle of waterblocks) compared to the best aftermarket coolers.
Not a bad way to go about things. That way you can minimize the waiting period for more enjoyment of the hardware.
This did occur to me also, but I am not sure a Titan will be coming so soon this time, as we had the Titan V (admittedly not a traditional Titan, pricing-wise anyway) not so long ago. Also, a new Titan could end up being even more expensive than the usual £1,000. So picking up an 1180 on release might not be as bad this time around.
I am not a fan of the noise of stock coolers or their performance either, but I have been waiting ages for something new and shiny and do not fancy waiting a couple of months for aftermarket cards this time. I will just use my headphones when playing new games. With old ones or games like FIFA, the fan will not need to work half as hard anyway to maintain 60fps for me.
If it ends up looking like I will be keeping the 1180 long term, I can take a small hit on it and buy an aftermarket water-cooled version a few months down the line. It all depends on what games they will be announcing at E3 this year too. If, say, Cyberpunk 2077 is not coming out until late 2019, then an 1180 will likely be more than enough for me for 2 years.
Mantle pushed Microsoft to do DX12, even some of the documentation for DX12 is identical to Mantle's; so it wasn't a lie.
Sadly, yeah, I think we can be certain it will be way over the price of the 1070... but if it launches at around the performance of the 1080 Ti and is "only" £50 dearer than the 1070, for argument's sake, while using less power than the 1080 Ti, then I think there is a strong use case there.
Also, I can't remember where I read it, so it may not be true (which may or may not make it a lie, depending on the definitions above), but I do remember reading that the next generation at launch would be "OK", but would really come into its own when games supported Vulkan and other APIs after DX12.
Btw, I read this as a humorous segue... apparently there will never be a DirectX 12! An interesting statement (lie?) from AMD back in 2013.
https://tech.slashdot.org/story/13/04/12/1847250/amd-says-there-will-be-no-directx-12-ever
Quite a few posts have come up in the members market asking £700 for used 1080 Tis, but people are sensibly very reluctant to entertain those prices with the current rumours of new, faster cards ready to come in at this price range shortly.
I would probably go for the 11 series, depending on when it comes out.
How much do you think I would get for my 1080 Ti now?
If that's what you're looking at, then start saving now for a 32-inch, 120 Hz, 4K HDR monitor as well, because that neon-lit city will HAVE to be played on such a monitor to do it justice; anything less would be a crime.
Not a bad way to go about things. That way you can minimize the waiting period for more enjoyment of the hardware.
On another note, I got an email today from NVIDIA UK trying to peddle me 10-series cards lol. It had links to retailers where to buy aftermarket 1060 3GB/6GB, 1070 Ti, 1080 and 1080 Ti GPUs.
If that isn't a sign of a sell off of the current gen, I don't know what is haha
Mantle pushed Microsoft to do DX12, even some of the documentation for DX12 is identical to Mantle's; so it wasn't a lie.
Stop that right now, we all know that AMD created everything and anything that NVidia, Intel, or Microsoft did or didn't do is just wrong and not real.
I hadn't heard of the Fahrenheit API before. If it's as you claim, then fair enough. My question is how AMD were able to be first to market with a fully functional lower-level API, concocted in a relatively short period of time, given that they were basically going bankrupt and had very few resources, while Microsoft have massive resources and DX12 came out significantly later?
Sorry, no, Mantle did not push Microsoft to do DirectX 12. When AMD released the Mantle whitepaper and SDK back in 2014, I read both and realised AMD had taken a leaf from the Fahrenheit Low Level API whitepaper and SDK, which were identical to Mantle and DirectX 12. Microsoft started developing the Fahrenheit Low Level API back in 1997 to replace DirectX, after game developers including John Carmack complained that DirectX 5 games were too slow and had bottleneck issues, and asked Microsoft to develop a low-level API for games that could deliver high draw-call counts and remove CPU overhead. Microsoft abandoned it a few years later when they invested heavily in DirectX 7 and game developers were very pleased with DirectX 7 performance. Nvidia and ATi did not write Fahrenheit drivers for Windows XP, so Microsoft released their generic Fahrenheit XSG driver on their website and then pulled the plug on the DirectX Fahrenheit website.
I downloaded the driver and kept it in a folder alongside the Fahrenheit whitepaper and SDK for a few months, to see if somebody would create a tech demo or game demo that took advantage of the Fahrenheit Low Level API. It was a real shame that nobody, game developers included, as well as 3dfx, ATi and Nvidia, ever did anything after 1997 to show off what the new API could do at a low level compared to DirectX's high level; they were all far too happy and too busy developing DirectX games, tech demos and drivers for PC and consoles. I deleted the folder two years later while cleaning up junk files, and I wish I had never deleted that Fahrenheit whitepaper, SDK and driver, so I could upload those files to prove that Mantle and DirectX 12 were both identical to the Fahrenheit Low Level API. It is very difficult to find these nearly 20-year-old files on the internet.
Microsoft started developing the Fahrenheit Low Level API in 1997.
https://web.archive.org/web/20080421215948/http://www.directx.com/graphics/fahrenheit.htm
http://www.aha.ru/~const/ftp/Fahrenheit - Frequently Asked Questions.htm
The problem is that if Microsoft had replaced DirectX with the Fahrenheit Low Level API, the new API would not have been fully backward compatible with DirectX and OpenGL, so game/software developers, movie studios and graphics card companies would have had to start rendering graphics, creating games and applications, and writing graphics drivers from scratch on a clean slate. This is exactly what happened with Mantle, which was not backward compatible with DirectX, and Vulkan, which was not backward compatible with either Mantle or OpenGL. The industry did not want that with the Fahrenheit Low Level API, so Microsoft revisited the project a few years later and hired Fahrenheit API architect David Blythe from September 2003 until February 2010 to develop DirectX 10, DirectX 11 and the DirectX 12 low-level API; Microsoft wanted to retain full backward compatibility with DirectX.
https://www.linkedin.com/in/david-blythe-4726644
Nvidia started Fermi DirectX 12 GPU development in 2006.
https://www.extremetech.com/computing/83236-nvidia-launches-next-cuda-architecture
AMD probably started 1st-gen GCN DirectX 12 GPU development at the same time as HBM, in 2006 or 2007.
https://www.kitguru.net/components/...o-work-on-hbm-technology-nearly-a-decade-ago/
AMD and Johan Andersson from DICE started Mantle development in 2012.
https://www.neogaf.com/threads/dice...nvidia-intel-others-only-amd-listened.715349/
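The draw-call overhead that these low-level APIs were meant to address can be sketched abstractly. This is a purely illustrative toy model in Python, not any real graphics API: the class and method names are invented for the example. It shows the basic idea the posts above describe, namely that a high-level driver re-validates state on every draw call, while a low-level API lets the application record a command buffer once and replay it cheaply every frame:

```python
# Toy model only: not a real graphics API. It illustrates why low-level
# APIs (Mantle, DX12, Vulkan style) reduce CPU overhead: validation
# happens once at record time instead of on every draw call.

class HighLevelAPI:
    """Per-call validation, in the style of classic high-level drivers."""
    def __init__(self):
        self.validations = 0

    def draw(self, mesh):
        self.validations += 1          # driver re-checks state every call
        return f"draw {mesh}"


class CommandBuffer:
    """Record once, replay many times, in the style of command lists."""
    def __init__(self):
        self.commands = []
        self.validations = 0

    def record(self, mesh):
        self.validations += 1          # validated once, at record time
        self.commands.append(f"draw {mesh}")

    def replay(self):
        return list(self.commands)     # no per-call validation on replay


meshes = [f"mesh{i}" for i in range(1000)]

# High-level style: every draw of every frame pays validation cost.
hl = HighLevelAPI()
for frame in range(60):
    for m in meshes:
        hl.draw(m)

# Low-level style: record once, then replay each frame.
cb = CommandBuffer()
for m in meshes:
    cb.record(m)
for frame in range(60):
    frame_commands = cb.replay()

print(hl.validations)  # 60000 (1000 draws x 60 frames)
print(cb.validations)  # 1000  (validated only at record time)
```

The 60x difference in validation work here is only a cartoon of the effect, but it is the same shape of saving that the "high draw calls, less CPU overhead" pitch for Mantle and DX12 was about.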