
** The Official Nvidia GeForce 'Pascal' Thread - for general gossip and discussions **

AMD is releasing one called the 1440p :)

Hehe, that sort of thing could actually be a good marketing strategy; we all know how much the everyday consumer loves numbers. Think back to the PlayStation 3 versus the Xbox 2... oh wait, they couldn't call it that, could they? Hey, I know, how about Xbox 360. :)
 
Hehe, that sort of thing could actually be a good marketing strategy; we all know how much the everyday consumer loves numbers. Think back to the PlayStation 3 versus the Xbox 2... oh wait, they couldn't call it that, could they? Hey, I know, how about Xbox 360. :)

hehe yeah, the 360
I am surprised none of them have put 4K in the name yet
we have all the VR Ready stuff and VR editions etc
 
That could work both ways: maybe looking like the perfect card for 1080p would in fact be a good thing, as it would possibly help gain sales from the uninformed.

Didn't think of it like that. Good point.

The informed who would be looking for new cards wouldn't base their buying decision on the name of the card anyway.

This is more the point I was trying to make, albeit, admittedly rather cryptically.
 
A bit different to my experience - my 780 at max overclock (bear in mind that with the GHz edition it is more like comparing against an OC'd 780ti than a 780) still edges it over my friend's 290X Tri-X (I think he was running 1240 or 1250 on the core or something) in most games, and I rarely have any framerate issues at 1440p with max settings - believe me, if I did I'd have had a 980ti in there months ago :S Though to be fair, there are some games I've not played, like The Witcher 3, where it might be a different story.

EDIT: Though I've not tried any comparisons with DX12 :O

EDIT2: It is quite interesting though - when I got this card it wouldn't have been unfair to say that OC'd v OC'd it stomped on that 290X - now they are running pretty much neck and neck in a lot of stuff.
I can promise you your 780 won't cut it for most modern demanding games at 1440p, and certainly not at max settings. It's not even good enough for plenty of modern games at 1080p and max settings.

You really need a 980Ti for 1440p/60fps/high-ultra unless you just don't play many modern graphics-heavy games.
 
A bit different to my experience - my 780 at max overclock (bear in mind that with the GHz edition it is more like comparing against an OC'd 780ti than a 780) still edges it over my friend's 290X Tri-X (I think he was running 1240 or 1250 on the core or something) in most games, and I rarely have any framerate issues at 1440p with max settings - believe me, if I did I'd have had a 980ti in there months ago :S Though to be fair, there are some games I've not played, like The Witcher 3, where it might be a different story.

EDIT: Though I've not tried any comparisons with DX12 :O

EDIT2: It is quite interesting though - when I got this card it wouldn't have been unfair to say that OC'd v OC'd it stomped on that 290X - now they are running pretty much neck and neck in a lot of stuff.

In theory we are talking about a 390X, as it's pretty much the same GPU. Nobody in their right mind would pick a GTX 780 of any kind when it's put like that. Your GHz edition is around the same speed as a GTX 780ti, which gets easily beaten by a stock 290 in a lot of games these days. The GTX 780, from all the evidence, is not ageing well, but the 290X/390X clearly is. There is plenty of evidence out there to support this, but here is Ubisoft's latest game (Ubisoft being an Nvidia partner).

http://www.guru3d.com/articles_pages/far_cry_primal_pc_graphics_performance_benchmark_review,10.html

The division as well.

http://www.guru3d.com/articles-pages/the-division-pc-graphics-performance-benchmark-review,6.html
 
Nvidia GPUs don't age well; in the most modern games a GTX 680 is now struggling to keep pace with a 7950 BE.
A 780TI is struggling to keep up with a 290X.
I can't see my 970 being much good going forward, while a 390 still has a good future despite its already geriatric age.
 
http://www.nvidia.co.uk/content/PDF/kepler/NVIDIA-Kepler-GK110-Architecture-Whitepaper.pdf

Notice the ROPs in the GK110 whitepaper? They aren't listed in the specs when comparing it to previous architectures - just texture units.

http://www.anandtech.com/show/6446/nvidia-launches-tesla-k20-k20x-gk110-arrives-at-last/3



First, do you see any major difference between the high-level block diagrams of Kepler and Pascal in terms of something that says 'ROPs' on it and something that doesn't? AnandTech point out that it has ROPs in the memory controller block, but Nvidia themselves don't focus on ROPs: they don't list ROPs on their Tesla cards and didn't list them in any information about previous Tesla cards.

Now compare the spec tables on these pages:

http://www.anandtech.com/show/6446/nvidia-launches-tesla-k20-k20x-gk110-arrives-at-last

http://www.anandtech.com/show/6760/nvidias-geforce-gtx-titan-part-1

Tesla launch comparison table: no ROPs. Titan, based off the exact same core: ROPs included in the comparison.

ROPs are entirely irrelevant to Tesla and Nvidia don't talk about them. I do think that at some stage Nvidia might well make a fully compute-focused card and save the transistors, and this generation might be that generation. But people are jumping to a lot of conclusions about it based on things they are entirely guessing at, which mostly fly in the face of all previous cards (which, again, is possible, just not likely).


rest of the random ranting

yeah sick.
 
No surprises, considering the general opinion on here, that I also think it looks fake - way too ugly and just not engineered enough for a new flagship Nvidia card.
 
I can promise you your 780 won't cut it for most modern demanding games at 1440p, and certainly not at max settings. It's not even good enough for plenty of modern games at 1080p and max settings.

You really need a 980Ti for 1440p/60fps/high-ultra unless you just don't play many modern graphics-heavy games.

I'm sitting here doing just that (Asus ROG Swift) - OK, there are 1-2 more recent games I don't play, but plenty of recent stuff I do play at ultra settings without any problems at all.

I think a lot of review sites are still testing Kepler 2.0 cards with Boost 1.0 methodologies as that is the only way the numbers make sense (I do have a 970 as well and I still use the 780 over it in most cases for a reason :S).

EDIT: Not that I'd suggest anyone go for a Kepler card over the alternatives these days - but I sit and look at many of the benchmarks coming out of major sites and scratch my head. The problem is that a lot of the review sites don't seem to run tests that are easily reproduced, so you can't do a direct comparison against their results.
 
I think a lot of review sites are still testing Kepler 2.0 cards with Boost 1.0 methodologies as that is the only way the numbers make sense (I do have a 970 as well and I still use the 780 over it in most cases for a reason :S).

What is the difference when a review tests a 780 (a Kepler 2.0 card) with Boost 1.0 methodologies? I would like to know.
 
What is the difference when a review tests a 780 (a Kepler 2.0 card) with Boost 1.0 methodologies? I would like to know.

Short version - with the Boost 1.0 methodology they limit the boost to the "guaranteed" level (as in the number quoted on the box), because older Kepler cards would often fail to sustain their initial boost clock indefinitely. For a short benchmark run that would often show them massively ahead of the equivalent AMD card, but in actual gaming they would fall down quite a bit - whereas Kepler 2.0 cards will usually sustain it all day. (Some places tested by looping the Kepler cards for an hour before benchmarking; others would limit the boost clock.)

i.e. my 780 GHz has a rated boost clock of 1072MHz but actually boosts to 1188MHz out of the box (and never drops), which makes a fair difference to benchmark scores.

EDIT: It is actually more complicated than that, though - if they are using an older revision 1 780 that they've had sitting around since they originally tested it at release, and they have the boost clock set to the reference boost of 900MHz, that is way down from what someone with a 2nd revision 780 would be seeing in the real world, where they are probably holding an 11xxMHz boost out of the box.
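To put some rough numbers on that (a quick sketch using the 780 GHz clock figures quoted above - the gap is the sort of margin a boost-capped review card would be giving up):

```python
# Quick sketch of the gap between rated and sustained boost clocks.
# Clock figures are the 780 GHz Edition numbers quoted in the post above.

rated_boost_mhz = 1072   # the "guaranteed" boost clock printed on the box
actual_boost_mhz = 1188  # the clock the card actually holds out of the box

headroom = (actual_boost_mhz - rated_boost_mhz) / rated_boost_mhz
print(f"Sustained boost runs {headroom:.1%} above the rated clock")
# -> roughly 10.8% above the rated clock
```

Against a reference-boost 900MHz revision 1 card, the same sum gives a gap of over 30%, which is easily enough to flip benchmark standings.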
 
No surprises, considering the general opinion on here, that I also think it looks fake - way too ugly and just not engineered enough for a new flagship Nvidia card.

The first picture I saw of this "new" cooler, it looked just like someone had filled in the 9 with filler, then engraved a 10 instead.

Also, why is a picture of a cooler so exciting when Nvidia have yet to show ANY working 14nm stuff? Correct me and link me if I'm wrong.
 
Rroff did you download my project?

I built in a primitive benchmark into the Short level.
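For anyone curious, a primitive frame-time benchmark of that sort usually boils down to something like this (a hypothetical sketch, not the actual project code; `render_frame` is a stand-in for whatever per-frame work the level does):

```python
import time

def run_benchmark(render_frame, frames=600):
    """Time each frame over a fixed run and report average/minimum FPS.

    render_frame is a stand-in callable for the per-frame update/draw work.
    """
    frame_times = []
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()
        frame_times.append(time.perf_counter() - start)
    avg_fps = frames / sum(frame_times)   # 1 / mean frame time
    min_fps = 1.0 / max(frame_times)      # worst single frame
    return avg_fps, min_fps
```

Reporting the minimum alongside the average matters, since a good average can hide the stutters that actually spoil gameplay.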

No - don't take it personally :P but I was hoping you'd put up some pics/video of it before I downloaded something of that size.

Might have a look at it in a bit - downloading Everybody's Gone to the Rapture at the moment.
 