
Nvidia's GTC 2014 Thread

At least they have a better hit rate than Joel, who seems to start from a premise, get test results that show the opposite and then publish the conclusion anyway due to something unrelated :D

Mate, I am still scratching my head on that. Batman gets owned by AMD and nVidia have done AMD users an injustice. I am missing something for sure.

But remember, driver teams can't optimise for a game without source code... So I guess Nvidia must have been really busy going through the source code of every DX11 game to be able to release a driver that offers a boost in lots of games all at once :D
 
That list, boom, is from here:

http://www.neogaf.com/forum/showthread.php?t=790013

Hoping To See...

- Preview of the new 20nm high-end Maxwell GPU architecture (GM10x/GeForce 8xx) with unified virtual memory.
- Amazing real-time graphics demos (as always).
- More DirectX 12 goodness.
- OpenGL goodness.
- Demos of the to-be-released CPU-optimized GeForce driver: http://www.pcper.com/reviews/Graphic...y-Improvements
- GameWorks demos.
- Exciting announcements.

Notice the key bit highlighted. Someone hoping to see 20nm Maxwell stuff doesn't mean Nvidia ARE showing 20nm Maxwell stuff.
 
http://www.neogaf.com/forum/showthread.php?t=790013

Hoping To See...

- Preview of the new 20nm high-end Maxwell GPU architecture (GM10x/GeForce 8xx) with unified virtual memory.
- Amazing real-time graphics demos (as always).
- More DirectX 12 goodness.
- OpenGL goodness.
- Demos of the to-be-released CPU-optimized GeForce driver: http://www.pcper.com/reviews/Graphic...y-Improvements
- GameWorks demos.
- Exciting announcements.

That list comes from a post on NeoGAF, which was then quoted on a forum by a random dude who takes it all as "fact", which was then picked up as "news" by another joke tech site.

Also, where did Charlie say no 20nm GPUs? I can't recall him posting an article saying that, though with the paywall there isn't much news coming out of there any more.

Notice the key bit highlighted. Someone hoping to see 20nm Maxwell stuff doesn't mean Nvidia ARE showing 20nm Maxwell stuff.

Also worth pointing out that Nvidia's unified memory is *virtual* unified memory, with basically none of the real benefits of true unified memory (latency, reduction in wasted copying, etc.); as far as I know it's just a way to reduce some memory-management work.
 
So by definition it's still unified memory. You seem to know a lot about it considering they've barely announced anything about the architecture itself.

There is being informative, and then there is what you're doing: after reading good things about the improvements in NV's 20nm next-gen architecture, belittling it before the press conference has even happened.
 
That list, boom, is from here:

http://www.neogaf.com/forum/showthread.php?t=790013

Notice the key bit highlighted. Someone hoping to see 20nm Maxwell stuff doesn't mean Nvidia ARE showing 20nm Maxwell stuff.

Ah great.. I will blame you now if they don't show 20nm :p

Nvidia usually unveil hardware right at the end of their conferences. Very hopeful it will be 20nm; a GTX 790 is also a possibility.

Is GTC a 3 day thing?
 
http://www.anandtech.com/show/7515/nvidia-announces-cuda-6-unified-memory-for-cuda

https://devblogs.nvidia.com/parallelforall/unified-memory-in-cuda-6/

http://www.hardwareluxx.com/index.p...ry-only-for-professional-segment-for-now.html

The title in the last one should give you a hint as to what it's all about.

Unified Virtual Memory is part of CUDA 6.0, and will also be supported by the first two "Maxwell" cards GeForce GTX 750 Ti and GTX 750

That's how much of a crap was given about the UVM support on the 750 Ti: even after its launch you still think Nvidia hasn't spoken about it at all... it's going to be massively relevant.

To sum up: it's really a feature for professional-only cards, and it's been widely discussed before (the Anandtech article is dated November last year). It's not real unified memory, so it still has most of the latency/copy problems. It's got nothing to do with 20nm, and it's going to mean effectively nothing to gamers. The very fact it's called VIRTUAL unified memory defines it as NOT REAL unified memory.
 
Ah great.. I will blame you now if they don't show 20nm :p

Nvidia usually unveil hardware right at the end of their conferences. Very hopeful it will be 20nm; a GTX 790 is also a possibility.

Is GTC a 3 day thing?

They might do, they might not; I have no idea which way they'll go with 20nm. It's pretty much confirmed by TSMC and everyone else that 20nm is very "meh" as a process, and 16nm brings with it most of the missing features that make 20nm so meh. I can fully understand both companies thinking it's right to wait for 16nm. But then you get the situation where one company thinks: if the other guys are waiting for 16nm, maybe we can sneak a 20nm generation in between and have a huge advantage for a year. And if one company thinks the other will do that, they feel they have to do it as well, etc., etc.

Going 20nm will be expensive, bring a pretty tame generational increase, and cost them a lot for a pretty short-lived product. That doesn't mean they won't.

The real issue is that, with a finite number of engineers, you'll get 16nm cards sooner if you don't make any 20nm cards. I still think that if we do see 20nm parts, they will be 7970/680 GTX-level parts, not 290/780 Ti-level parts, i.e. 300-350mm^2 max; they won't try to make the 450-550mm^2 parts. With such a small reduction in power usage, a 290/780 Ti-class part on 20nm aiming at the usual performance jump with twice the transistor density could end up as a 350-400W card, which I can't see either company wanting to do.
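The 350-400W figure above is easy to sanity-check with a back-of-envelope calculation. This is a minimal sketch with purely illustrative numbers (doubled transistor count, an assumed ~25% per-transistor power reduction on 20nm planar), not TSMC data:

```python
# Back-of-envelope check of the 20nm flagship power claim.
# All scaling factors are illustrative assumptions, not TSMC figures.

def scaled_tdp(base_tdp_w, transistor_scale=2.0, per_transistor_power_scale=0.75):
    """Estimate TDP after moving to a new node: scale the transistor
    count, then apply the per-transistor power reduction of the node."""
    return base_tdp_w * transistor_scale * per_transistor_power_scale

# A 250W-class 28nm flagship (290X/780 Ti territory) with twice the
# transistors on a 20nm process that only cuts per-transistor power ~25%:
estimate = scaled_tdp(250)
print(f"Estimated 20nm flagship TDP: {estimate:.0f} W")  # prints 375 W
```

Under those assumed numbers the estimate lands right in the 350-400W window, which is why a modest power reduction on 20nm makes a full-fat big-die part look unattractive.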
 