GTX 780 Ti Specifications Leaked – Full Blown GK110 Core with 2880 SP

So confirmed at $699 (£520 incl VAT). Not sure if this is a good or bad thing. Such a shame it wasn't called the Titan Ultra with 6GB, then I could see a market (or maybe my wallet could) :D

Really kinda hoping it's £550.00+, otherwise I'm gonna wish I had held off buying the 780 Classified :)

In reality, it's simply Nvidia flexing their muscles, knowing full well that they can yet again milk a captive cash cow:D
 
The thing that makes me smirk is that nVidia have been sitting on this, waiting... biding their time.

Then BAM. The 290X launches and they have stock going out to retailers for sale and cards in the hands of reviewers in the blink of an eye.

I might be seriously tempted pending reviews!
 
You are probably correct:)
From my own experience over various card generations, my 780 Classified ASIC reading is by far the highest I have encountered.

Now this may be total coincidence, but this is the first time I have purchased a top-end, i.e. non-reference, card.

Whilst I tend to agree, based upon the little research I have carried out, that the ASIC value is meaningless, there has to be some science behind it.

Whether a "binned" chip automatically guarantees a high ASIC I have no idea.

Ironically, a lower ASIC value is supposedly better for water/LN2 which is what I would imagine a large proportion of the "real overclockers" are using to cool their cards:D

I've only ever found ASIC values very useful when comparing cards from the same product line against results for a known value - the numbers on their own seem pretty meaningless unless you know the baseline to work from.
 
I've only ever found ASIC values very useful when comparing cards from the same product line against results for a known value - the numbers on their own seem pretty meaningless unless you know the baseline to work from.

On my two previous GTX 580s, one had an ASIC of 60 with a preset 1050 mV at reference clocks. The other, an EVGA 3GB model, had an ASIC of 74, set at 1025 mV at the same reference clocks.
Interestingly, both would do roughly the same 810 MHz core overclock on their respective stock volts.

I suspect that as the ASIC value basically reflects electrical leakage within the chip (a higher value potentially meaning lower voltage/temps on air), it's really only a consideration for people not looking to water cool etc.

However, less voltage in any application = less potential heat = better efficiency.
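A back-of-the-envelope way to see that voltage/heat relationship: dynamic power scales roughly with voltage squared times frequency. The sketch below is purely illustrative - the effective capacitance figure is made up, and it ignores static/leakage power, which is exactly what the ASIC value supposedly reflects - but it shows roughly what the 1050 mV vs 1025 mV presets mentioned above are worth.

```python
# Rough dynamic-power comparison: P ~ C * V^2 * f.
# The capacitance value is an assumption for illustration only, and
# static/leakage power (what ASIC quality supposedly reflects) is ignored.

def dynamic_power_w(capacitance_f, voltage_v, freq_hz):
    return capacitance_f * voltage_v ** 2 * freq_hz

C = 2.3e-7          # assumed effective switched capacitance (farads), illustrative
freq = 810e6        # the ~810 MHz overclock both GTX 580s managed

p_low  = dynamic_power_w(C, 1.025, freq)   # the 74 ASIC card's preset voltage
p_high = dynamic_power_w(C, 1.050, freq)   # the 60 ASIC card's preset voltage

print(f"1.025 V: {p_low:.0f} W, 1.050 V: {p_high:.0f} W")
print(f"Extra dynamic power at the higher voltage: {(p_high / p_low - 1) * 100:.1f}%")
```

On those made-up numbers the 25 mV difference is worth roughly 5% of dynamic power, which fits the general less-voltage-means-less-heat point.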

There has to be some underlying scientific principle otherwise Nvidia/AMD wouldn't rate their chips and GPU-Z wouldn't have a number to read.
 
There has to be some underlying scientific principle otherwise Nvidia/AMD wouldn't rate their chips and GPU-Z wouldn't have a number to read.

Nope, there isn't. I wish I had kept some of the data I had collected on this. Nvidia and AMD rate their chips differently for a start: 70% on an Nvidia GPU doesn't mean the same as 70% on an AMD one. The ASIC value on graphics chips seems to be a combination of lots of things, and AMD and Nvidia use their own interpretation of those values to come up with the ASIC rating.

The reason I wish I had kept the data: I did a lot of forum browsing looking for ASIC scores and how well the cards overclocked, and did a few tests on cards that my friends and family have. What I found was nothing, no consistency at all.

So I concluded that ASIC numbers are useless to us. This is probably the wrong conclusion :D I sure wish someone from AMD or Nvidia would speak up on this subject.
 
MS rejected Mantle, so Nvidia have been thrown another lifeline: the future on Xbox One seems to be DirectX 11 for some time, which means most ports will again come from the XB1 rather than the PS4. DX11.2 is a white elephant on PC as few will buy a Win8-only game either :rolleyes:

http://semiaccurate.com/2013/10/16/microsoft-rejects-mantle/

You have to laugh at Nvidia though, they must have known about this two months or more ago. No way could they throw the 780 Ti together in a short space of time; they have probably been stockpiling the GPUs since the summer :eek:
 
I have just seen two benchmark reviews for BF4 from two different sites. At 1600p on ultra it's already using around 2.8GB of memory. Now that we are embracing the new console gen and its ports, I fear we will go over 3GB within a year at 1600p. If I was going to spend £600 on a card, I would like it to last 1 to 2 years.
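For a rough sense of where a figure like 2.8GB could come from, here's a minimal back-of-the-envelope sketch. The buffer counts and sizes are assumptions for illustration, not BF4's actual renderer; the point is that the resolution-dependent buffers are only a small slice, and the bulk of the usage is textures and other streamed assets.

```python
# Back-of-the-envelope VRAM estimate - assumed numbers, not measured from BF4.

def render_target_mb(width, height, bytes_per_pixel, count):
    """Memory for 'count' screen-sized buffers (G-buffer, depth, post-processing etc.)."""
    return width * height * bytes_per_pixel * count / (1024 ** 2)

# Assume 8 RGBA8-sized render targets per resolution
targets_1600p = render_target_mb(2560, 1600, 4, 8)
targets_1080p = render_target_mb(1920, 1080, 4, 8)

print(f"Screen-sized buffers at 1600p: ~{targets_1600p:.0f} MB")   # ~125 MB
print(f"Screen-sized buffers at 1080p: ~{targets_1080p:.0f} MB")   # ~63 MB

# If total observed usage is ~2.8 GB, the remainder is textures, meshes, shadow maps etc.
print(f"Implied textures/assets at 1600p: ~{2.8 * 1024 - targets_1600p:.0f} MB")
```

On those assumed numbers, stepping up from 1080p to 1600p only adds tens of MB of screen-sized buffers; it's the streamed assets that would push a card past 3GB.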
 
Has anyone found out for sure what the specs are yet?



http://videocardz.com/47508/videocardz-nvidia-geforce-gtx-780-ti-2880-cuda-cores

And here's what's been posted so far:

Fab: 28nm
Stream Processors: 2880
Texture Units: 240
ROPs: 48
Memory Clock: 1752 MHz (7008 MHz effective)
Core Clock: 876 MHz (902 MHz)
Boost Clock: 928 MHz (1033 MHz)
Memory Bus: 384-bit
Memory Bandwidth: 336 GB/s
Memory Size: 3GB (6GB)
TDP: 250W


http://wccftech.com/gtx-780-ti-specifications-confirmed-gk110-2880-sp-sheer-power/
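As a quick sanity check, the leaked numbers are at least internally consistent: the 336 GB/s figure follows from the 384-bit bus and 7008 MHz effective memory clock, and the core count and base clock imply roughly 5 TFLOPS of single-precision throughput. A small sketch using only the figures from the table above:

```python
# Sanity check of the leaked specs using the standard formulas.

bus_width_bits = 384
effective_mem_clock_mhz = 7008   # GDDR5 effective data rate
cuda_cores = 2880
base_clock_mhz = 876

# Bandwidth = bus width in bytes * effective data rate
bandwidth_gbs = (bus_width_bits / 8) * effective_mem_clock_mhz * 1e6 / 1e9
print(f"Memory bandwidth: {bandwidth_gbs:.1f} GB/s")      # ~336 GB/s, matches the table

# Peak FP32 = cores * 2 ops per clock (FMA) * clock
tflops = cuda_cores * 2 * base_clock_mhz * 1e6 / 1e12
print(f"Peak FP32 at base clock: {tflops:.2f} TFLOPS")    # ~5.05 TFLOPS
```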

 
I have just seen two benchmark reviews for BF4 from two different sites. At 1600p on ultra it's already using around 2.8GB of memory. Now that we are embracing the new console gen and its ports, I fear we will go over 3GB within a year at 1600p. If I was going to spend £600 on a card, I would like it to last 1 to 2 years.

The same thing was said about the 2GB GTX 680, and they're still rocking along, performing as well as a 3GB 7970 in non-extreme usage. VRAM is overblown: just because a game fills the memory instead of leaving it empty doesn't mean it needs that much to perform optimally.
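To illustrate that allocated-vs-needed distinction, here's a toy model (purely illustrative, not how any particular engine manages memory): an engine that treats spare VRAM as a streaming cache will report near-full usage on any card, even when the set of textures actually touched each frame is much smaller.

```python
from collections import OrderedDict

# Toy model: VRAM treated as an LRU cache for streamed textures.
# Purely illustrative - not how any specific engine actually works.

class VramCache:
    def __init__(self, capacity_mb):
        self.capacity = capacity_mb
        self.used = 0
        self.cache = OrderedDict()              # texture_id -> size in MB

    def touch(self, texture_id, size_mb):
        if texture_id in self.cache:
            self.cache.move_to_end(texture_id)  # mark as recently used
            return
        # Evict least-recently-used textures until the new one fits
        while self.used + size_mb > self.capacity and self.cache:
            _, evicted_size = self.cache.popitem(last=False)
            self.used -= evicted_size
        self.cache[texture_id] = size_mb
        self.used += size_mb

vram = VramCache(capacity_mb=3072)              # a 3GB card
for frame in range(1000):                       # camera slowly moving through a level
    first_visible = frame * 2
    for tex in range(first_visible, first_visible + 100):
        vram.touch(tex, size_mb=8)              # ~100 textures (800 MB) visible per frame

print(f"Reported VRAM usage: {vram.used} MB of 3072 MB")       # fills up and stays full
print(f"Textures actually touched per frame: ~{100 * 8} MB")
```

The cache reports essentially full usage even though each frame only touches around 800 MB, which is the gap between the VRAM a game allocates and the VRAM it actually needs.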
 
Now that we are embracing the new console gen and its ports, I fear we will go over 3GB within a year at 1600p. If I was going to spend £600 on a card, I would like it to last 1 to 2 years.
PC ports will come from the Xbox One, which has 8GB of shared RAM, 3GB of which is system reserved. I would be surprised if games used more than 3GB of the remaining 5GB as VRAM.

But I grant you with your monitor setup you are likely to hit the ceiling before others.
 
The same thing was said about the 2GB GTX 680, and they're still rocking along, performing as well as a 3GB 7970 in non-extreme usage. VRAM is overblown: just because a game fills the memory instead of leaving it empty doesn't mean it needs that much to perform optimally.

Meh, not really. The 680 is falling behind a bit nowadays, though this might be a worst-case scenario: 25% slower at 1080p than a 7970 GHz Edition, and not VRAM-limited here either.


 
Holding out for a non-reference GTX 780 Ti with 6GB+ of VRAM (purely so it beats the Titan everywhere), or I'll just wait for 20nm GPUs. Pirate Islands and Maxwell are sure to blow everything from this gen away, and will probably be available as soon as May/June next year.
 