4k is still *so* harsh on cards, even the Ti! 1440p (and UW) and a high refresh is definitely the sweet spot for a good while yet.
I get that but I play on my TV from the sofa.

Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
4K is rather top end still when it comes to screens, though? At more common resolutions the current Nvidia GPUs are still very good some two years down the road. Old tech, definitely, and not saying they don't need replacing, but most of us are probably still running max settings, especially with the top-end cards.
I use high-end hardware but must admit I'm still on a 1440p screen, and only 60Hz too. Dell monitors seem to last forever; I also still run a Dell monitor from 2008, albeit it's not the one I use for gaming.
I'm hopeful new cards will arrive in June. Just not sure I'll be upgrading this time around, given I'm finding 1070 Ti SLI is working well.
If SLI is working fine then there’s no point upgrading unless the next Ti is phenomenally better, which is unlikely.
Annoyingly I got my 480 for 1080p gaming, which it's no slouch at, then unexpectedly replaced my TV with a nice 4K one. I was hoping the 1080 Tis would drop in price a bit and was looking to upgrade, but then the whole crypto thing went up a notch and GPU pricing went mad.
Now I've got to wait as I can't justify to myself the cost of a current Ti, not when the next one is hopefully going to be here in the next six months or so. 1080p it is for now.
NVIDIA GeForce GTX 1180 & 1170 Projected to Land in July & Feature 16Gbps GDDR6 Memory in 8GB & 16GB Capacities
GDDR6 will reportedly power most, if not all, of NVIDIA’s upcoming graphics cards. This includes the upcoming gaming focused GeForce 11 series as well as upcoming Tesla and Quadro parts for the AI and autonomous driving markets.
SK Hynix has also confirmed to GamersNexus that GDDR6 will be entering production in three months' time, around late June / early July. This puts the GeForce 11 series launch right around July, which happens to coincide with a report published earlier this month that also put the launch in the July timeframe.
The new memory will run at 1.35V and deliver double the speed of GDDR5, but will cost around 20% more to produce. So, expect slightly higher MSRPs than NVIDIA's Pascal GeForce 10 series launch prices. SK Hynix has also confirmed that it will offer GDDR6 chips in 1GB and 2GB densities. Typically, each 32-bit GDDR memory controller segment is paired with a single GDDR chip, which in turn means that we'll be looking at 8GB and 16GB capacities on 256-bit GPUs, i.e. what would be the next GTX 1180 and GTX 1170 graphics cards.
The fact that SK Hynix will be producing both densities means that a customer or multiple customers exist for both. It’s possible that we could see NVIDIA offer GTX 1180 and GTX 1170 cards in both 8GB and 16GB capacities. NVIDIA could also choose to differentiate these two cards based on memory capacity, in addition to GPU configuration.
All in all, it looks like we finally have a solid time-frame for when to expect the green team to put its chips on the table and show its cards, both figuratively and literally.
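Quick sanity check on the quoted figures, since the 8GB/16GB capacities and the headline bandwidth fall straight out of the bus-width arithmetic. A minimal Python sketch using only numbers from the article (16Gbps pin speed, 1GB/2GB chip densities, a 256-bit bus for the rumoured 1180/1170); nothing here is insider information:

```python
# Capacity/bandwidth arithmetic from the quoted article (rumoured figures only).
BUS_WIDTH_BITS = 256      # rumoured GTX 1180/1170 memory bus
BITS_PER_CHIP = 32        # one GDDR chip per 32-bit controller segment
PIN_SPEED_GBPS = 16       # 16Gbps GDDR6, per the SK Hynix schedule

chips = BUS_WIDTH_BITS // BITS_PER_CHIP        # 8 chips on a 256-bit bus

for density_gb in (1, 2):                      # 1GB and 2GB chip densities
    print(f"{density_gb}GB chips x {chips} = {chips * density_gb}GB total VRAM")

# Peak bandwidth = pin speed x bus width / 8 bits per byte
print(f"Peak bandwidth: {PIN_SPEED_GBPS * BUS_WIDTH_BITS / 8:.0f} GB/s")  # 512 GB/s
```

That also shows why a 16GB card is plausible without any exotic bus: it only needs the 2GB density chips.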
According to that: 2070/2080 = 1080 Ti, 2060 = 1080, 2080 Ti 70-80% faster than the 1080 Ti, 2050 = 1060, 2030 = 1050.
I think this just highlights that the rumour sites know absolutely nothing. They have no "special sauce", any more than any of us do.
Some are still saying Ampere is the next GeForce; some say Ampere is a 7nm Volta shrink for professional use only.
They can't even agree on whether it will be the 11 series or the 20 series.
Useless click-bait, all of it. nV have done very well leaking nothing at all.
NVIDIA GeForce 11 Series Launching Around July, Hynix GDDR6 Production Schedule Reveals
https://wccftech.com/nvidia-geforce...july-gddr6-mass-production-timeline-confirms/
GeForce GTX 2070 and 2080 Could Launch Summer 2018
http://www.guru3d.com/news_story/geforce_gtx_2070_and_2080_could_launch_summer_2018.html
I don't see how they could get those gains in performance on 12nm. They would have to add so many CUDA cores to get 70-80%. A leap in clocks like Maxwell to Pascal seems unlikely given what we saw with the Titan V.
If they up the prices so that the 2080 simply takes the current place of the 1080 Ti, and then make the 2080 Ti an even more expensive, power-hungry beast (i.e. replacing the Titan Xp), then it is possible.
They would also be increasing the power envelope at each tier of cards. Then again they have a lot of headroom over AMD to do that.
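To illustrate why that 70-80% figure looks like a stretch on 12nm, here's a back-of-the-envelope sketch. It naively assumes performance scales linearly with CUDA cores x clock (real games scale worse than that) and uses the 1080 Ti's 3584 cores with a ~1.6GHz typical boost as the baseline; the clock figures are my own assumptions, not leaks:

```python
# Naive scaling model: performance ~ CUDA cores x clock.
# How many cores would a "2080 Ti" need to be 70-80% faster than a 1080 Ti?
BASE_CORES = 3584        # GTX 1080 Ti CUDA core count
BASE_CLOCK_GHZ = 1.60    # rough typical boost clock (assumption)

for target_gain in (1.70, 1.80):          # +70% and +80% targets
    for new_clock in (1.60, 1.80):        # flat clocks vs a generous 12nm bump
        needed = BASE_CORES * target_gain * (BASE_CLOCK_GHZ / new_clock)
        print(f"+{target_gain - 1:.0%} at {new_clock:.2f}GHz -> ~{needed:,.0f} cores")
```

Even with the generous clock bump that comes out north of 5,000 cores, i.e. Titan V territory on a much bigger die, which is exactly the point above.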
I'm done reading tech sites for rumours; now I wait for hands-on user benchmarks. That said, I fear the next generation of AMD and NVIDIA GPUs will be priced too high anyway. NVIDIA have steadily upped their prices with every release for the last few generations, then there's the increasing RAM pricing and the crypto mining craze, though it does look like some miners are selling up, if the MM and eBay are anything to go by.
So will this be the first mainstream £1k card at launch? If prices continue to rise, and the memory costs more to produce, I can see the 2080 Ti being £1k. Which is ridiculous.
I wouldn't say they increased price every generation but they certainly did with pascal, tried to keep it quiet because they knew they would get some flak for it.
If they really do take the mickey and price their 2080 at £600+