
GeForce GTX 1180/2080 Speculation thread

Why would they launch a successor to the current generation when there is no competition and no commercial pressure to do so? They own the market and sold more cards in the last quarter than ever before. It would not be good business for Nvidia or their stakeholders.


The commercial pressure is that they can get people with 1080s to upgrade to something faster, or push those with Maxwell/Kepler to finally upgrade. They can also show an even stronger advantage over AMD.


However, Nvidia really have to release the 2080 in the July-August time frame or not bother at all until 7 nm is consumer-ready next summer.
 
OFC Nvidia will keep quiet about the next gen, since they still want people to buy what they have now, and also to make sure the competition are left in the dark. However, unless something has caused issues (like GDDR6), I would be very surprised if new cards based on a new uarch were not launching this year. Why? Nvidia RTX: it's being incorporated into games like Metro: Exodus, which launches this year, and only the Titan V has hardware support for it; Pascal is only supported in software, IIRC. So I would expect them to launch some new cards in late Q3 or Q4 2018, which has also been a common launch window for both AMD and Nvidia in the past.
 
I think the date of the next-gen cards will be fixed by one thing only: how many Pascal chips Nvidia have left that they have got to use up.

I am guessing the foundries stopped making Pascal chips some time ago, but Nvidia still has a pile that needs to be used up.

Perhaps someone with more knowledge about the chip foundries could add to or clarify the above.
 
On another forum I visit, a fellow posted stating that the last Pascal chip rolled off the foundry line the day before his post... I think that post was somewhere between one and two months ago (I can't remember exactly). I have no idea if he was trolling, but he sounded like it was something he "knew" rather than a prediction.

It is painful waiting... I have the cash ready to go; if I could buy an 1180 Ti for £800 right now, I would.
As it is, I will make do with a standard 1180. If I knew no card was coming before winter this year, I would get the 1080 Ti... but with so many rumours of July or August, I am not buying now... certainly not at a price I could have jumped in at 12 months ago!
 
According to an article posted this morning, Nvidia's CEO fielded GeForce questions after their Computex (“Isaac”) presentation today and said that new GeForce cards will launch "a long time from now".

:confused:
 
There's no real point in releasing a top-end GPU until the monitors are available to take advantage of it. The 10 series will handle 60 Hz 4K just fine. So until we get 120 Hz 4K and 4K UW, there's just no point.
 
I was gaming at 4K at medium-high settings on a GTX 780 Ti.
"Finally, 4K gaming is feasible" seems to have been a sentiment surrounding the launch of each of the past ~3 generations, if I recall correctly from reading the reviews each time.

The industry had to establish the Ultra Settings meme to keep everyone on a spending treadmill :)
 
There's no real point in releasing a top-end GPU until the monitors are available to take advantage of it. The 10 series will handle 60 Hz 4K just fine. So until we get 120 Hz 4K and 4K UW, there's just no point.

To me a GPU is only truly 4K/60 when it can do that on ultra detail... and even the 1080 Ti can only manage this in a few of the newer games... TBH I doubt even an 1180 will manage it either. An 1180 Ti probably won't be far off the mark, assuming it is 25% faster than a vanilla 1180.

Using anything other than full detail in the majority of games is, IMO, simply too wishy-washy.

I am sure I could run Thomas Was Alone at 4K/60 on a GTX 780... that does not make it a 4K/60 card, however, IMO. ;)
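
To put that 25% assumption into rough numbers, here is a quick back-of-the-envelope sketch in Python. The baseline 1080 Ti figure and the 1180-over-1080 Ti uplift are hypothetical placeholders picked purely for illustration, not benchmark data; only the 25% 1180 Ti-over-1180 step comes from the assumption above.

[code]
# Hypothetical scaling sketch for the "1180 Ti = 1180 + 25%" assumption (Python).
# The fps baseline and the generational uplift are made-up placeholders, not benchmarks.

gtx_1080_ti_fps = 45.0          # hypothetical 4K Ultra result in a demanding game
assumed_1180_uplift = 1.20      # hypothetical gain of an 1180 over the 1080 Ti
assumed_1180_ti_uplift = 1.25   # the "25% faster than a vanilla 1180" assumption

fps_1180 = gtx_1080_ti_fps * assumed_1180_uplift     # 54.0 fps
fps_1180_ti = fps_1180 * assumed_1180_ti_uplift      # 67.5 fps

print(f"1180 (hypothetical):    {fps_1180:.1f} fps")
print(f"1180 Ti (hypothetical): {fps_1180_ti:.1f} fps")  # clears 4K/60 only if the guesses hold
[/code]

On those made-up numbers an 1180 Ti would just clear 4K/60 in a title where a 1080 Ti manages ~45 fps, which is roughly the "not far off the mark" reading; change the placeholder uplifts and the conclusion moves with them.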
 
Not sure what the obsession with Ultra quality is. If that were the indicator, then what would be considered "true 1080p cards" right now? Only the GTX 1080 and above? After all, that's what you need to play PUBG @ 1080p Ultra without dropping below 60 FPS.
 
To me a GPU is only truly 4K/60 when it can do that on ultra detail... and even the 1080 Ti can only manage this in a few of the newer games... TBH I doubt even an 1180 will manage it either. An 1180 Ti probably won't be far off the mark, assuming it is 25% faster than a vanilla 1180.

Using anything other than full detail in the majority of games is, IMO, simply too wishy-washy.

I am sure I could run Thomas Was Alone at 4K/60 on a GTX 780... that does not make it a 4K/60 card, however, IMO. ;)
I agree with you, Mike.
 
To me a GPU is only truly 4K/60 when it can do that on ultra detail...

You can always put more and more load on the GPU for incrementally smaller improvements. But try actually playing a game at 4K on a high-DPI monitor and see how much you actually gain going from medium to high to ultra.
 
Ahhhhhhhhhhhhhhhhhhahahahahahahahahahahaha!!!!!!!

/strokes 1080Ti :p

/strokes my crappy 6-month-old card. :p

Lol :D


Why do people never learn? The current GPUs still sell very well and there is no danger from the competition, so why would they release a new series right now? Might as well wait until either AMD makes a move (e.g. a 7 nm Vega shrink) or until sales start to dry up and new products are needed to freshen up the market. In the meantime they can focus on other business ventures and products. Naivety at its finest.

When Intel were in the same position, they did release on a fixed cadence for a few years, but even then it was just die shrinks and very minor improvements, nothing like the 20-30% bump people are expecting from the GTX 11 series. They were essentially product refreshes; a bit like if Nvidia were to rebrand existing 10-series GPUs as 11-series with some minor clock bumps (as both AMD and Nvidia have done in the past) but with no price drop (à la Intel).

Nah. Then why release the 10 series when they did? They could have waited longer and launched just before Vega.


The commercial pressure is that they can get people with 1080s to upgrade to something faster, or push those with Maxwell/Kepler to finally upgrade. They can also show an even stronger advantage over AMD.


However, Nvidia really have to release the 2080 in the July-August time frame or not bother at all until 7 nm is consumer-ready next summer.

Agreed.


OFC Nvidia will keep quiet about the next gen, since they still want people to buy what they have now, and also to make sure the competition are left in the dark. However, unless something has caused issues (like GDDR6), I would be very surprised if new cards based on a new uarch were not launching this year. Why? Nvidia RTX: it's being incorporated into games like Metro: Exodus, which launches this year, and only the Titan V has hardware support for it; Pascal is only supported in software, IIRC. So I would expect them to launch some new cards in late Q3 or Q4 2018, which has also been a common launch window for both AMD and Nvidia in the past.

Exactly. What did people expect Mr Leather Jacket Man to say? He has old inventory to shift at full price!


I think the date of the next-gen cards will be fixed by one thing only: how many Pascal chips Nvidia have left that they have got to use up.

I am guessing the foundries stopped making Pascal chips some time ago, but Nvidia still has a pile that needs to be used up.

Perhaps someone with more knowledge about the chip foundries could add to or clarify the above.

+1


Not sure what the obsession with Ultra quality is. If that were the indicator, then what would be considered "true 1080p cards" right now? Only the GTX 1080 and above? After all, that's what you need to play PUBG @ 1080p Ultra without dropping below 60 FPS.

Now this I agree with you on. There are many games I have played where the difference between High and Ultra is hard to see while playing. Hell, it is even hard to see in still pictures. But the difference in performance between the two is huge at 4K.

The way I see it, there is a much bigger difference in image quality between High at 4K and Ultra at 1440p: 4K still looks better, and the GPU grunt needed to get there is reduced a lot.
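
For the resolution side of that, a minimal pixel-count sketch in Python (raw pixels only; it deliberately ignores how any particular engine or setting actually scales):

[code]
# Raw pixel-count comparison of 4K vs 1440p (Python).
# This only captures resolution; real GPU cost also depends on settings and the engine.

pixels_4k = 3840 * 2160      # 8,294,400 pixels
pixels_1440p = 2560 * 1440   # 3,686,400 pixels

print(f"4K:    {pixels_4k:,} pixels")
print(f"1440p: {pixels_1440p:,} pixels")
print(f"Ratio: {pixels_4k / pixels_1440p:.2f}x")  # 2.25x more pixels at 4K
[/code]

So on raw pixels alone 4K is about 2.25x the work of 1440p, which is the gap the High-vs-Ultra settings drop is being traded against.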
 
OFC Nvidia will keep quiet about the next gen, since they still want people to buy what they have now, and also to make sure the competition are left in the dark. However, unless something has caused issues (like GDDR6), I would be very surprised if new cards based on a new uarch were not launching this year. Why? Nvidia RTX: it's being incorporated into games like Metro: Exodus, which launches this year, and only the Titan V has hardware support for it; Pascal is only supported in software, IIRC. So I would expect them to launch some new cards in late Q3 or Q4 2018, which has also been a common launch window for both AMD and Nvidia in the past.


Got pushed back to Q1 2019. :(
 
Not sure what the obsession with Ultra quality is. If that were the indicator, then what would be considered "true 1080p cards" right now? Only the GTX 1080 and above? After all, that's what you need to play PUBG @ 1080p Ultra without dropping below 60 FPS.

Fair point, and there are always going to be exceptions... but I would say a 980 Ti (possibly even a vanilla GTX 980) is a genuine 1080p/60 card, because 90% of games ran at full-bubble 1080p/60 *for the production life of the card*.

As soon as you bring in graphics settings below full detail, not only are you deliberately reducing the fidelity of your game from what the developer ideally wanted people to see, but, more importantly IMO, you are turning what is a fairly easy-to-define benchmark that we can all run into a big grey area...

As an example: to you maybe minimum detail is fine, to me perhaps only Ultra suffices; someone may be happy to run shadows on minimum, someone else will cut draw distance to minimum... This means we suddenly have a quagmire where we all state we are running perfectly well at 4K/60 while we are all making different compromises and are not comparing the same thing.
I am possibly not wording myself well... but ultimately, for me, just running on full detail keeps it nice and easy to do apples-to-apples comparisons. Of course you can never have 100%, because there will always be a Chris Roberts who just wants to bring your machine to its knees, or someone who does a damn poor job of optimising a sloppy game.

For people in your situation, it might be worth buying EVGA now, as you will have a 90-day Step-Up option.

Not a bad shout, actually. Thanks.
 