Geforce GTX1180/2080 Speculation thread

We will see in a few months' time, but I suspect even an 1170 will be closer to £500 than the 1070 ever was at launch. Nvidia have a dominant market position and they know it. They also know that there are thousands, if not tens of thousands, of people who will buy their latest card at almost whatever price they release it at. Enough people have bought Titan cards in their various iterations, for exorbitant amounts of cash and minimal improvements, for NV to know people are waiting. Nvidia might try and see how far they can push it, complete with 'we're sorry, poor yields and rising memory prices have forced us to raise our prices, it's not our fault', while reaping the cash.

I still expect even the 1170 to be around £500, most likely more. The original 1070 FE launched at £399 two years ago, prior to the major memory shortage and mining boom. https://forums.overclockers.co.uk/threads/nvidia-gtx-1070-wanna-pre-order.18734903/

I don't look at the pound value, as there are many things between the price Nvidia set and what the cards end up costing us here in the end. If you want to know how much Nvidia have actually raised the price by, you need to compare the 1070 and 1170 dollar prices. My guess is $50 extra at most. If they hike the price any more than that, they will get a load of bad press and likely do long-term damage to themselves.
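To illustrate why the pound price doesn't track the dollar MSRP directly, here's a rough sketch of how a US launch price turns into a UK shelf price. The MSRP and exchange rate below are illustrative assumptions, not actual figures:

```python
# Rough sketch: US MSRP -> UK shelf price.
# US MSRPs exclude sales tax; UK prices include 20% VAT,
# and the exchange rate moves constantly.
usd_msrp = 449.0      # hypothetical FE-style launch MSRP (ex-tax)
usd_to_gbp = 0.75     # assumed exchange rate at the time of launch
uk_vat = 1.20         # 20% VAT applied to UK retail prices

gbp_price = usd_msrp * usd_to_gbp * uk_vat
print(f"~£{gbp_price:.0f}")  # ~£404
```

So a $50 bump at the MSRP level can easily turn into a bigger jump in pounds once VAT and a weak exchange rate are stacked on top.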

With Intel now coming into the game too, Nvidia need to be careful not to **** gamers off. They have done enough of that already, and I personally would not hesitate to sell my G-Sync monitor and move back to AMD or Intel in a couple of years' time once they can offer 4K gaming performance cards.
 
Just turned my R9 390X into a dehydrator for making jerky whilst I wait for Nvidia to release the 1180...

Works real good...

[photo of the jerky]


(Yes I actually just made that reindeer jerky today... no I didn't really use my AMD card but I suspect it would serve better as a dehydrator than a gaming card and I am seriously bored waiting for the 1180...)
 
I'd be interested to know if AMD have ordered a surplus of Vega GPUs due to the recent mining craze. If so, this would hurt AMD a lot more than Nvidia.

Nvidia can reduce the price a lot more than AMD and still make a good profit. We know Vega is an expensive GPU to make, and being tied to costly HBM compounds the issue.


Good point. AMD did talk about increasing supplies, unlike Nvidia.

The only thing is, AMD will be later to release new GPUs by the sound of it, so they may have more time to sell off stock.
 
Just turned my R9 390X into a dehydrator for making jerky whilst I wait for Nvidia to release the 1180...

GTX 480, mate. That's the card to use for jerky, and it'll cook your meal too.
 
Hawaii is still a really good chip; paired with a good FreeSync monitor it would still be a decent setup! Don't know if I should praise or berate AMD lol

Yeah... the 290X, and the 390X, its overclocked incarnation, are still up there in the mid-range band even today!
Anyone gaming at 1080p with a FreeSync monitor for the last three years hasn't felt the need to buy a new GPU since these cards came out five years ago.

Imho the same applies to Vega 64. With a FreeSync monitor, even at 4K, there is no point upgrading until game graphics get better.
And on that front we have been stagnant for the last six years.
 
Yeah... the 290X, and the 390X, its overclocked incarnation, are still up there in the mid-range band even today!
Anyone gaming at 1080p with a FreeSync monitor for the last three years hasn't felt the need to buy a new GPU since these cards came out five years ago.

Imho the same applies to Vega 64. With a FreeSync monitor, even at 4K, there is no point upgrading until game graphics get better.
And on that front we have been stagnant for the last six years.

I agree with you all the way up to 4K. Being limited to 30Hz at 4K seriously sucks - if I was using 1080p with FreeSync I wouldn't be bothered - but I am using 4K without FreeSync and it is just a terrible experience. Plus the ethics behind the 390X still annoy me - they released it as a new card at a much higher price than the 290X, only for us to find out after the fact that they were essentially identical cards.
 
there is no point upgrading until game graphics get better.

I love my games sharp and without visible pixels. At the very least 4K gives me this, so I have not been able to go back to anything lower. Also, how much 4K benefits a game varies from game to game; in some games there really is a big difference. I have been on 4K since 2014, and I remember ordering a 1440p monitor around 2016 to try out FreeSync and a high refresh rate, but the reduction in IQ was so big side by side. Even in a game like FIFA 16 at the time, the difference was big. In the end I stuck with the 60Hz non-adaptive-sync monitor rather than the one with FreeSync and 144Hz.

Each to their own when it comes to monitors as it is subjective, but it does amaze me when people downplay the IQ benefits of 4K on a PC monitor. Try a good 4K 27" monitor, which works out at about 163 PPI, and tell me you cannot see the difference - if you do, I will tell you to go to Specsavers :p
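For anyone wondering where that 163 PPI figure comes from, it's just the panel's diagonal resolution divided by its diagonal size, assuming standard 4K UHD:

```python
import math

# PPI = diagonal resolution in pixels / diagonal size in inches.
width_px, height_px = 3840, 2160   # standard 4K UHD resolution
diagonal_in = 27                   # panel diagonal in inches

diagonal_px = math.sqrt(width_px**2 + height_px**2)  # ~4406 px
ppi = diagonal_px / diagonal_in
print(f"{ppi:.1f} PPI")  # 163.2 PPI
```

The same maths puts a 27" 1440p panel at about 109 PPI, which is where the side-by-side sharpness difference comes from.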
 
Imo it is a little unfair comparing an AMD card using FreeSync with an NV one that isn't. I am sure people using FreeSync or G-Sync with their respective cards are getting superior experiences to those with the same cards running on a standard monitor/TV.

Having spent almost £2000 on a high(ish) end HDR TV, I shan't be replacing it any time soon. So my next GPU will be the one which offers the best performance in the majority of games. I know there will always be the odd standout game which runs better on one card than the other, but I want the best card for the majority of cases, not cherry-picked examples.
 
Each to their own when it comes to monitors as it is subjective, but it does amaze me when people downplay the IQ benefits of 4K on a PC monitor. Try a good 4K 27" monitor, which works out at about 163 PPI, and tell me you cannot see the difference...

Yes, I have seen the Samsung 28" 4K (at an XCOM 2 event), and yes, I was impressed, so I know what you are talking about :)
Unfortunately the games I play don't support such a resolution or scaling, and everything is tiny so I cannot read the text (EUIV, CK2).

But I find the 55NU8000 in the bedroom more to my liking when it comes to 4K atm.
 
Imo it is a little unfair comparing an AMD card using FreeSync with an NV one that isn't...
Not sure if you were referring to my post, but I did not do that; at the time I had an AMD card, and now I have an Nvidia card with a G-Sync monitor. Not only that, but I was talking about IQ, which has nothing to do with adaptive sync :)

Unfortunately the games I play don't support such a resolution or scaling, and everything is tiny so I cannot read the text (EUIV, CK2).

Stellaris supports it :D

The UI scales to 200% from what I remember.
 
Just got my wife the Acer Predator 165Hz WQHD G-Sync IPS panel (27"), which she has paired up with my old 980Ti, and the difference is night and day - she will be getting an 1180 when it comes out as well, and I am expecting stonking results with it (she is on an 8700K rig with 32GB of 4133MHz Trident Z and a Samsung NVMe). Next step will be making a custom loop and getting rid of the AIOs - just spent a fortune putting in 10 Noctuas (6 on rads, 4 on case).
 
You don't know what the price will be either. It might just be that price. Nobody knows other than Nvidia.
I'm quite sure there are people within Nvidia saying 'we can charge what we want, people will still buy, we have no competitors, thousands of people are waiting and have been for a year or more.' I mean, let's face it, they've never been hugely bothered about their reputation, and AMD just aren't competing at all.
 