NVIDIA Volta with GDDR6 in early 2018?

Caporegime
Joined
18 Oct 2002
Posts
29,993
Thanks to the 2nd hand market, the trick is to just keep swapping them with others.

You might buy a couple of games and end up completing 10-20... just keep swapping.

If you're the kind of person who likes to build up a vast library of games then yeah, what you say is valid. But if you just want to play and swap, it really doesn't have to be any more expensive than PC gaming.

Just tot up the value of all those "Steam sale" games you bought over the years... and then never played :p

Tbh I get what you're saying, just keep one step behind the curve and it works out cheaper :)

**** Steam sales, torrents ftw :D
 
Soldato
Joined
2 Jan 2012
Posts
11,932
Location
UK.
**** off Boom, you'll still have about 10 new cards in the next 2 years, you're kidding no-one but yourself!! :p

Haha xD

No, genuinely, I can't justify the new pricing. If the next set of cards gets another hike I'd probably have to settle for a mid-range card, or wait a year or so for AMD to finally bring out a competitive card xD (see Pascal vs Vega). Even then the AMD card probably won't be able to beat Nvidia's fastest, so we could have the same situation again where Nvidia's high-end cards cost near £800 and AMD's cards offer more value but less performance. But yeah, my highest-end days are over, it seems.

Bad times man, bad times.

On the other hand, an Xbox One X might just be enough to keep me happy at 4K for a while.
 
Soldato
Joined
4 Jul 2012
Posts
16,910
It is nothing like that. Up to and including 1440p-type resolutions, the extra resolution on a PC/Windows platform is generally used to increase the amount of screen estate. Once you go above that, the extra resolution is most effective when used to increase the density of UI elements, i.e. icons represented using, say, 256x256 pixels instead of 64x64, and text looking crisper with better-defined curves, while proportionally using the same space in the UI.

This isn't a straightforward story with Windows, as a lot of legacy use is designed around working with pixels 1:1 and Windows DPI scaling is somewhat less than the greatest. For many people, taking best advantage of 4K resolution really requires a 40+ inch display, while there are many reasons people don't want, or don't have space for, anything bigger than the normal range of 24-30".

If you play a lot of older games, and even quite a few recent ones, the UI doesn't scale well at 4K, leaving you with problems like tiny, tiny UIs or elements poorly positioned on the screen. In some games (and this is never really going to be solved) it's hard to find a good compromise for mouse input that lets you retain precision when making small mouse movements while effortlessly covering a huge number of pixels at the same time.

This isn't going to be the same story for everyone, but it certainly makes 4K somewhat marmite as things stand. While it would be nice to have all those extra pixels, the reality is there might not be a solution that satisfies the requirements of every user for Windows desktop usage, compared to other OSes like Android where 4K can be useful even on relatively small devices.
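
To put rough numbers on that (a back-of-envelope sketch of my own, not anything from the post; Python, illustrative values only): the usable desktop real estate at a given UI scale is just the native resolution divided by the scale factor, which is why 4K at 150% "feels like" 1440p while drawing each element with far more pixels.

    # Effective (logical) desktop space at common Windows UI scale factors.
    # Assumes simple linear scaling; purely illustrative.
    NATIVE = (3840, 2160)  # a 4K/UHD panel

    for scale in (1.00, 1.25, 1.50, 2.00):
        w, h = round(NATIVE[0] / scale), round(NATIVE[1] / scale)
        print(f"{int(scale * 100)}% scale -> {w}x{h} of usable real estate")

    # 100% -> 3840x2160 (maximum real estate, tiny UI elements)
    # 150% -> 2560x1440 (the working space of a 1440p monitor, but every
    #                    element drawn with ~2.25x the pixels)
    # 200% -> 1920x1080 (1080p real estate; a 64x64 icon is rendered
    #                    at 128x128, i.e. 4x the pixel count)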

Dunno why I dignified it with a longer response but maybe next time try asking people what their reasoning is before whirling at them like some kind of ****.



Performance can be a reason, but ultimately a more temporary one, as GPUs slowly catch up with increases in resolution - there was a time when 1080p, for instance, was looked at from a performance standpoint the way 4K is today.

I think people miss that previous increases in resolution in Windows have been used to increase the amount of real estate you are working with on screen, but at 4K-type resolutions that largely becomes less useful. A lot of people will simply run 125-150% UI scale, using the increased pixels to make things look nice while retaining the same amount of usable screen estate. That comes with a split in what people use an OS for, especially as Windows is a bit hit and miss when it comes to DPI scaling, etc.

Well now, that isn't the resolution at fault, it's Windows. At the moment I'm using UHD displays exclusively, via my TV and a laptop with a UHD screen.

I know very well of the issues you've described. It appears that the latest iteration of Windows has changed some of that functionality.

For example, I used to not be able to use in game chat via the Origin client, due to everything being too big. It's now been fixed so that it looks and works fine.

But this isn't UHD's fault, it's Windows and software developers not being on the ball with it yet. For example, 3ds Max 2015 doesn't have any way to work with high-DPI displays while remaining usable, and UHD displays have been out for quite a while now. The 2017/2018 releases do, but there's no excuse for it taking some companies this long to implement.
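
For what it's worth, the opt-in that newer, well-behaved apps make is tiny; here's a minimal sketch (Python via ctypes, Windows 8.1+ API, illustrative rather than how any particular app does it) of a process declaring itself DPI-aware before creating any windows:

    # Declare DPI awareness so Windows stops bitmap-stretching the UI
    # (the blurry / oversized-dialog problem described above).
    import ctypes

    try:
        # 2 = PROCESS_PER_MONITOR_DPI_AWARE: the app promises to redraw
        # itself correctly when moved between monitors with different DPIs.
        ctypes.windll.shcore.SetProcessDpiAwareness(2)
    except (AttributeError, OSError):
        # Pre-8.1 fallback: system-wide DPI awareness only.
        ctypes.windll.user32.SetProcessDPIAware()

    # Apps that never declare awareness are rendered at 96 DPI and scaled
    # up by the compositor - hence the unusable dialogs in 3ds Max 2015.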
 
Associate
Joined
1 Aug 2012
Posts
1,117
Location
Stoke-On-Trent
I really hope we see Volta in 4-6 months' time. I've been telling myself till June: just wait another year and you can get the new architecture.

But from reading what's been linked on here, I guess that's not going to be the case. God damn it!
 
Soldato
Joined
28 Sep 2014
Posts
3,439
Location
Scotland
I read Igor Wallossek of Tom's Hardware's posts on the 3DCenter.org forums, confirming he has seen Nvidia's new internal roadmap (confidential, for tech partners only), which shows Ampere launching at the beginning of Q2 2018.

https://www.3dcenter.org/news/nvidi...bringt-im-q22018-gleich-die-ampere-generation

Translated:

In the last few days there has been some interesting news regarding nVidia's near-term plans for new gaming graphics chips. The starting point, of course, is that although the Volta architecture exists for HPC needs, the corresponding gaming chips have not appeared so far and, according to a recent report, have even been postponed to the second quarter of 2018 (or later). Interestingly, on the internal roadmap spotted at the time, the entry at that date was reportedly no longer "Volta" but simply "Gen. 4", which now suddenly takes on a new meaning. First, Heise recently reported that the successor to the current Pascal graphics cards runs under the code name "Ampere". A clear confirmation of this can now be found in our forum in two posts by (once again) Igor Wallossek of Tom's Hardware:

Volta died, there's nothing for consumers anymore. However, one can easily guess the name of the next generation in Q2 if you divide James Watt by Alessandro Volta.
Source: Igor Wallossek of Tom's Hardware @ 3DCenter Forum

Ampere is fixed, and so is Q2 2018. The first ones are already starting on the BoM.
Source: Igor Wallossek of Tom's Hardware @ 3DCenter Forum
Obviously Igor has once again been shown a new internal roadmap, on which the entry is no longer "Gen. 4" but simply "Ampere" - now firmly set for the second quarter of 2018. This means a significant transformation of nVidia's plans, in which the Volta-based gaming chips GV102, GV104 and GV106 (and possibly others) were surely once firmly planned, and have now been dropped in favour of the next generation. Why is unclear; one conceivable explanation is that the manufacturing technology was misjudged, and the 10nm process actually intended for Volta is simply too late to be ready for large, competitive graphics chips. For the HPC chip GV100, which definitely had to be brought to market, nVidia had to fall back on 12nm production, which produced an unusually large chip at 815mm². That extreme die area can well be taken as an indication that nVidia had actually planned the GV100 for 10nm production, where the same chip would probably be ~500-550mm². Possibly the GV100 even had to be slimmed down somewhat in its final form due to the limitations of 12nm production; the non-fulfilment of some early performance promises points in this direction.
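
(An aside of my own, not from the article: it's worth seeing what density assumption that ~500-550mm² estimate implies.)

    # Implied density gain behind the article's "815mm² at 12nm would be
    # ~500-550mm² at 10nm" estimate. My own arithmetic, for illustration.
    gv100_12nm = 815.0           # mm², actual GV100 die size
    gv100_10nm = (500.0, 550.0)  # mm², the article's hypothetical range

    for area in gv100_10nm:
        print(f"{area:.0f} mm² implies a {gv100_12nm / area:.2f}x density gain")

    # 500 mm² implies a 1.63x density gain
    # 550 mm² implies a 1.48x density gain
    # i.e. genuine full-node-shrink territory, versus the ~1.25x density
    # (20% less area) that TSMC quotes for 12nm over 16nm.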

For gaming chips on the 12nm process the pressure is lower, since they (mostly) don't push into such extreme boundary areas. But higher performance on 12nm always means a larger die, since 12nm offers only small advantages over 16nm in this respect (according to TSMC, 20% less area at 10% higher clocks, or 25% less power consumption compared to 16FF). So either nVidia could offer 12nm gaming chips with only below-average performance gains, achievable with slightly larger die area at roughly the same power consumption; or nVidia would have to make the 12nm gaming chips significantly larger, which would deliver at best a medium-sized performance boost, but at higher cost and, above all, higher power consumption. What that could look like was already explained exhaustively in a previous speculation article.
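
(Again my own back-of-envelope, plugging in TSMC's figures quoted above and assuming, simplistically, that performance scales linearly with unit count times clock.)

    # The 12nm dilemma in numbers: 12FFN vs 16FF = 20% less area (0.8x)
    # at 10% higher clock (1.1x), per TSMC's figures quoted above.
    AREA_FACTOR_12NM = 0.80   # same logic occupies 80% of its 16nm area
    CLOCK_FACTOR_12NM = 1.10  # ~10% extra clock essentially for free
    GP102_AREA = 471.0        # mm², the 16nm GP102 (1080 Ti / Titan Xp)

    def die_area_12nm(perf_target):
        """Hypothetical 12nm die area to reach perf_target x GP102 performance."""
        units_needed = perf_target / CLOCK_FACTOR_12NM  # perf not covered by clock
        return GP102_AREA * units_needed * AREA_FACTOR_12NM

    for target in (1.1, 1.4, 2.0):
        print(f"{target:.1f}x perf -> ~{die_area_12nm(target):.0f} mm² on 12nm")

    # 1.1x perf -> ~377 mm²  (clock alone gets you there, on a smaller die)
    # 1.4x perf -> ~480 mm²  (already GP102-sized, and drawing more power)
    # 2.0x perf -> ~685 mm²  (GV100 territory; cost and power blow up)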

A performance doubling, or anything close to it, is not achievable on 12nm: the graphics chips would have to be too big and too expensive, and the GV102/GA102 in particular would run into manufacturing limits (the GV100 with its 815mm² could only be made at all using expensive special procedures). Above all, extra performance on 12nm always comes with extra power draw; really fast 12nm chips would only be possible by giving up nVidia's first-class energy efficiency and (relatively) low power consumption. That was probably the sticking point, and why nVidia stepped back from this path: with 12nm gaming chips there is either only a small performance increase, which would not have gone down well, or nVidia would have to buy a large performance increase with significant additional cost and the loss of its known energy efficiency, which would have been criticised just as much.

Either way, the interim 12nm process is not well suited to new graphics chips; in the end it was probably a necessary evil only for the GV100 chip, to fulfil nVidia's own HPC plans and obligations. For new nVidia graphics chips after the Pascal generation, a genuine new full production node was always going to be needed; anything else would miss either the performance or the power-consumption requirements. In addition, the lateness of the new nVidia generation (18 months so far, and probably ~24 months in total after the release of the current generation, which is comparatively long for nVidia) suggests they are waiting for something that will be ready at a certain point in time - such as 10nm production for large graphics chips. For smartphone SoCs, 10nm has been in mass production since spring 2017, but the first months are usually blocked exclusively for large orders from Apple and Samsung, and afterwards the new process first has to mature to the point where much larger graphics chips can be produced at a meaningful yield. One year after the first corresponding SoCs is a rule of thumb that has held well in recent years, and in the case of TSMC's 10nm production that fits nicely with the second quarter of 2018.

Long story short, given these circumstances we expect the Ampere generation to arrive on TSMC's 10nm process. After this (for nVidia) long waiting phase between two graphics card generations, and given the availability of this process, which was predicted early on for spring/summer 2018, anything else would really come as a surprise. Above all, using 12nm in spring/summer 2018 would make little sense when 10nm would be production-ready soon after; as explained above, nVidia cannot achieve as much performance with 12nm as it really wants to bring. With the Ampere generation on 10nm, however, the usual performance jump of nearly a factor of two is possible. Exactly how nVidia achieves this with the (supposed) HPC chip GA100 and the (probable) gaming chips GA102, GA104, GA106, GA107 & GA108 is still unknown and remains in the field of speculation. Anything is conceivable, from larger architectural changes through simply more hardware units to (even) higher clock rates.

My theory is that Nvidia probably decided to cancel Volta for consumer desktop after testing Volta GV102 Titan Xv prototype samples to see if it was fit for production, and realised 12nm is far too expensive for consumer GPUs. It was only good for GV100 because that was the largest GPU, for servers and data centres, so Nvidia skipped straight to Volta's successor, Ampere, which will use TSMC's 10nm process, and Ampere could possibly be the first to use an MCM GPU design.

Gen 4? Hmmm, I guess Pascal was the marketing name for Maxwell Gen 3, and Ampere will be Maxwell Gen 4?
 
Man of Honour
Joined
13 Oct 2006
Posts
91,425
"in which the Volta-based gaming chips GV102, GV104 and GV106 (as well as possibly others) were certainly once firmly planned" - absolute rubbish the only people who ever claimed this had no connection to nVidia whatsoever.
 
Soldato
Joined
2 Oct 2012
Posts
3,246
I'm telling you, Ampere will be a Pascal refresh. I even said, before we knew about Ampere at all, that Nvidia won't release anything till Q2 2018, but people said they could release at the end of this year, despite me saying that won't happen. I even said I would be surprised to see anything at the beginning of next year. Q2 or H2 2018 is where I'd expect to see anything significant from Nvidia. Also, don't believe this nonsense about it being the next gen after Volta.
 
Man of Honour
Joined
13 Oct 2006
Posts
91,425
I'm telling you, Ampere will be a Pascal refresh. I even said, before we knew about Ampere at all, that Nvidia won't release anything till Q2 2018, but people said they could release at the end of this year, despite me saying that won't happen. I even said I would be surprised to see anything at the beginning of next year. Q2 or H2 2018 is where I'd expect to see anything significant from Nvidia. Also, don't believe this nonsense about it being the next gen after Volta.

We might see a new Titan type card sometime around the end of the year. Jensen was pretty emphatic in what he said about GeForce in general.
 
Permabanned
Joined
31 Aug 2013
Posts
3,364
Location
Scotland
Not sure if now is a good time to buy a card or not; you'd have thought new Nvidia stuff would be inbound sooner or later?

Decide if the current cards will improve your experience. Consider FPS, temps and noise, while keeping an eye on the Black Friday deals. I went from a 980 Ti to a 1080 Ti just to drop the temps by ~7°C; very happy with the better FPS minimums as well.

There is never a good time to buy, just a good reason.
 

Klo

Soldato
Joined
20 Nov 2005
Posts
4,109
Location
South East
Just buy a card now. With the tech market, no matter when you buy, something better will come along within 6 months.

While true, there are better/worse times to buy cards imo. For example, if Nvidia's latest card is coming out in a month, I wouldn't recommend you buy a 1080 Ti or whatever; I'd recommend you wait.
 

V F

Soldato
Joined
13 Aug 2003
Posts
21,184
Location
UK
They've released cards just before Christmas? No way. Companies start slowly winding down in the first week of December.
 
Soldato
Joined
17 Jun 2012
Posts
5,951
We might see a new Titan type card sometime around the end of the year. Jensen was pretty emphatic in what he said about GeForce in general.

They've just released yet another Mickey Mouse Titan with the Star Wars version; just how many Pascal-based Titans do we need?

I think releasing limited edition stuff is a sure-fire sign that Pascal is done. They've released something to compete with the V56 in the 1070 Ti, and now they're likely to sit on it and soak up the sales whilst planning the release of whatever is coming next - likely the next xx70 and xx80 cards, and following previous examples the xx80 will be the temporary high-end card until the Titan and Ti come along.

I think they were waiting to see how Vega went, with a card in their pocket in case its performance surprised them as the drivers matured. It's been out a while now and NV are likely happy they have something to cover it in each market, so there's no need to do anything with Pascal now apart from sit back and light cigars with £50 notes.

Time to start the Ampere hype train and lube everyone up enough that they're begging NV to take another £600 off them for the next xx80 card when it actually arrives...
 
Soldato
Joined
6 Jun 2008
Posts
11,618
Location
Finland
We might see a new Titan type card sometime around the end of the year. Jensen was pretty emphatic in what he said about GeForce in general.
Even if the competitive situation doesn't require a new product, the marketers and "quarterly economists" might demand some rehash.


I think they were waiting to see how Vega went, with a card in their pocket in case its performance surprised them as the drivers matured. It's been out a while now and NV are likely happy they have something to cover it in each market, so there's no need to do anything with Pascal now apart from sit back and light cigars with £50 notes.
I was thinking they'd likely have those cigars made from €500 notes.
 
Permabanned
Joined
12 Sep 2013
Posts
9,221
Location
Knowhere
Haha xD

No, genuinely, I can't justify the new pricing. If the next set of cards gets another hike I'd probably have to settle for a mid-range card, or wait a year or so for AMD to finally bring out a competitive card xD (see Pascal vs Vega). Even then the AMD card probably won't be able to beat Nvidia's fastest, so we could have the same situation again where Nvidia's high-end cards cost near £800 and AMD's cards offer more value but less performance. But yeah, my highest-end days are over, it seems.

Bad times man, bad times.

On the other hand, an Xbox One X might just be enough to keep me happy at 4K for a while.

Hahahaha. Wait, that was a joke, wasn't it?
 