GeForce GTX 780, 770 coming in May

http://wccftech.com/msi-geforce-gtx-770-lightning-spotted-wild-features-twin-frozr-iv-cooling/

MSI Lightning 770 :D

MSI's upcoming GeForce GTX 770 Lightning, a custom version of the card featuring Twin Frozr IV cooling and a non-reference PCB, has been spotted.

MSI GeForce GTX 770 Lightning Spotted in the Wild

The GeForce GTX 770 Lightning from MSI comes with a non-reference PCB built around the GK104-425-A2 core, which features 1536 CUDA cores, 128 TMUs and 32 ROPs, along with 2GB of GDDR5 memory operating across a 256-bit interface. From previous reports, we know that the reference models would be clocked at 1046 MHz core and 1085 MHz boost, while the memory would be effectively clocked at 7 GHz, which would give a good performance boost over the GeForce GTX 680. However, the MSI GeForce GTX 770 Lightning, being a custom model, would make use of a factory overclock out of the box. Chances are that the base clock would be configured over 1100 MHz.

Additionally, the GeForce GTX 770 Lightning would make use of MSI's flagship Twin Frozr IV cooling, which uses two large 10cm PWM fans with Propeller Blade Technology to cool the heatsink body. The cooler is made up of a high-density fin array that provides a large heat-dissipation area, fed by heat pipes featuring MSI's SuperPipe technology. It also comes with dust removal technology, which runs the fans in reverse for 30 seconds after booting to clear dust from the heatsinks. Furthermore, the card ships with MSI's GPU Reactor module, which delivers extra power to the VRM, increasing its overclocking potential and reducing electrical noise.



The card comes with 4-Way SLI support and is powered by dual 8-pin connectors, compared to dual 6-pin on the reference models. Display outputs include dual DVI, HDMI and a DisplayPort. The reference GeForce GTX 770 would cost $399-$449 at launch, so we can expect a premium price point for this model. There have been reports that NVIDIA would paper launch the GeForce GTX 770 during the GeForce GTX 780 launch on 23rd May, while the availability date would be set for 31st May 2013.
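As a quick sanity check on those memory numbers (my own arithmetic, not from the article): 7 Gbps effective on a 256-bit bus works out to roughly 224 GB/s, against 192 GB/s for the GTX 680's 6 Gbps on the same bus.

[CODE=python]
# Back-of-envelope memory bandwidth from the quoted specs (illustrative only):
# bandwidth = effective data rate per pin (Gbps) * bus width (bits) / 8 bits per byte
def bandwidth_gb_s(rate_gbps, bus_bits):
    return rate_gbps * bus_bits / 8

print(f"GTX 770 (7 Gbps, 256-bit): {bandwidth_gb_s(7.0, 256):.0f} GB/s")  # ~224 GB/s
print(f"GTX 680 (6 Gbps, 256-bit): {bandwidth_gb_s(6.0, 256):.0f} GB/s")  # 192 GB/s
[/CODE]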


Fantastic cards and hopefully with a bit of voltage play allowed.
 
http://wccftech.com/msi-geforce-gtx-770-lightning-spotted-wild-features-twin-frozr-iv-cooling/

MSI Lightning 770 :D



Fantastic cards and hopefully with a bit of voltage play allowed.

My issue with this...

Go look at a GTX680 Lightning card and box, then go look at this card and box with the blurred-out card number. Not only does it not actually say GTX770 anywhere on it, but the box is 100% identical to the GTX680 variant.

I also see absolutely no appeal in MSI making a Lightning for the 770, especially given the two significantly faster cards above it.

Edit: Also, you'd presume the GTX770 will be stamped up as DX11.1, not just DX11 as the 680 was.

Edit 2: Nvidia still hate MSI for the last Lightning; can't see them wanting to upset the Green Light tripe any further with over-spec voltage.
 
$399 - $449 at launch sounds good, that's about the cost of a decent 670 over there. So we could be looking at around the £320 mark over here.
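Rough back-of-envelope on that conversion (assuming an exchange rate of about $1.52 to the pound from around that time, and remembering US prices exclude sales tax while UK prices include 20% VAT):

[CODE=python]
# Hypothetical launch-price conversion; the exchange rate is an assumption.
usd_price = 399.0
usd_per_gbp = 1.52   # assumed mid-2013 rate
uk_vat = 0.20        # UK prices include VAT, US list prices don't include sales tax

gbp_price = usd_price / usd_per_gbp * (1 + uk_vat)
print(f"${usd_price:.0f} works out to roughly £{gbp_price:.0f} inc. VAT")  # ~£315
[/CODE]

So the £320 guess looks about right, before any early-adopter premium.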

here's hoping. :)
 
My issue with this...

Go look at a GTX680 Lightning card and box then go look at this card and box with the blurred out card number. Not only does it not actually say GTX770 anywhere on it, the box is 100% identical to the GTX680 variant.

I also see absolutely no appeal for MSI to make a lightning for the 770, especially given the two significantly faster cards above it.

Edit: Also, you'd presume the GTX770 will be stamped up as DX11.1, not just DX11 as the 680 was.

Edit 2: Nvidia still hate MSI for the last Lightning, can't see them wanting to upset the green light tripe any further with over spec voltage.

Yer, I doubt that is a genuine 770 Lightning; it's more likely a 680. It was a strange one between MSI and Nvidia, with MSI saying they had been told to stop with the voltages and Nvidia saying they could carry on but wouldn't honour any warranties.
 
Yer, I doubt that is a genuine 770 Lightning and is a 680. It was a strange one between MSI and Nvidia with MSI saying they had been told to stop with voltages and Nvidia saying they could carry on but wouldn't honour any warranties.

I think it was more down to Nvidia threatening to pull MSI's chip allocations if they carried on, which to some extent they did. You don't see many MSI Titans for sale...

Unfortunately I can only see MSI doing reference and basic Twin Frozr Nvidia cards for the foreseeable future. It's a shame, as Lightnings have broken so many records for Nvidia over the years.

Have a read of this; it's a little out of date but shows just what the Lightnings have been doing for years now: http://forum-en.msi.com/index.php?topic=150708.0
 
My issue with this...

Go look at a GTX680 Lightning card and box then go look at this card and box with the blurred out card number. Not only does it not actually say GTX770 anywhere on it, the box is 100% identical to the GTX680 variant.

I also see absolutely no appeal for MSI to make a lightning for the 770, especially given the two significantly faster cards above it.

Edit: Also, you'd presume the GTX770 will be stamped up as DX11.1, not just DX11 as the 680 was.

Edit 2: Nvidia still hate MSI for the last Lightning, can't see them wanting to upset the green light tripe any further with over spec voltage.

The blurred-out part on the side of the box suggests it would say 770 and not 680, as the first two characters are clearly the same digit, and the third looks like it would have been a 0.

There are a few ways you could read into this, and some of them really aren't good, as it could suggest that board partners are taking 680s and straight rebranding them as 770s, just with a new box and BIOS; the only difference being that nVidia have produced a new reference cooler for them to make them look different.

That would also explain why there will be a Lightning 770, if they really are literally 680s with a new BIOS.
 
I think it was more down to Nvidia threatening to pull chip allocations to MSI if they carried on, which to some extent they did. You don't see many MSI Titans for sale...

Unfortunately I can only see MSI reference and basic Twin Frozr on nvidia cards for the foreseeable. Its a shame as Lightnings have broken so many records for Nvidia over the years.

Have a read of this, its a little out of date but shows just what the lightnings have been doing for years now: http://forum-en.msi.com/index.php?topic=150708.0

It's reasons like this that have developed my dislike for nVidia over the years.

If I had to describe nVidia in one word it'd probably be myopic.
 
The blurred out part on the side of the box suggests that it would say 770 and not 680 as the first two characters are clearly the same one, and the third looks like it would have been a 0.

There's a few ways you could read in to this, and some of them really aren't good, as it could suggest that board partners are taking 680s straight rebranding them to 770s, just with a new box and BIOS, the only difference is that nVidia have produced a new reference cooler for them to make them look different.

That would make sense as for why there will be a Lightning 770 if they really are literally 680s with a new BIOS.

That could be the only way, especially as Lightnings usually follow a good 3-4 months after a card's reference release. I couldn't envision MSI putting in the effort to design a whole new PCB for what is really a lower high-end card.

Same box, same PCB, same cooler, same chip, different vBIOS, locked voltage (that I can assure you of if true).

Maybe they had a few spare 680 LTG PCBs kicking about? :p

I see what you mean with the blurred-out part, BUT it's not totally unreasonable to think it's shopped that way, blurred with a hint of 770 :D
 
It's reasons that this that have developed my dislike for nVidia over the years.

If I had to describe nVidia in one word it'd probably be myopic.

I had to look that up in a dictionary :o

1. A visual defect in which distant objects appear blurred because their images are focused in front of the retina rather than on it; nearsightedness. Also called short sight.


Sounds about right :p
 
That could be the only way, especially as Lightnings usually follow a good 3-4 months after a cards reference release. I couldn't envision MSI putting in the effort to design a whole new PCB for what is really a lower high end card.

Same box, same PCB, same cooler, same chip, different vBIOS, locked voltage (that I can assure you of if true).

Maybe they had a few spare 680 LTG PCB's kicking about? :p

It could be one of a number of reasons really; nVidia could have designed a new reference board and MSI realised it's compatible with their Lightning boards, or they really could just be repurposed 680s.

It'd cost MSI such a small amount to actually produce 770 Lightnings, even on a new production run, because all the R&D was already done the first time around for the 680 Lightnings.

The thing I find the worst, though, is that having such a radically different hardware gap between the X70 and X80 variants is going to make things look a bit confusing.

Because basically the 770 is still going to struggle in situations where VRAM and memory bandwidth requirements are high, whereas the 780 won't have that issue, so in some benchmark situations the 780 would pull ahead of the 770 by a large amount.
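To put rough numbers on that (using the specs being reported at the time, so treat them as rumoured rather than confirmed): the 770 keeps the GK104-style 2GB/256-bit memory setup, while the GK110-based 780 is reported with 3GB on a 384-bit bus.

[CODE=python]
# Rumoured (pre-launch) memory specs -- not confirmed figures.
cards = {
    "GTX 770": {"vram_gb": 2, "bus_bits": 256, "rate_gbps": 7.0},
    "GTX 780": {"vram_gb": 3, "bus_bits": 384, "rate_gbps": 6.0},
}

for name, c in cards.items():
    bandwidth = c["rate_gbps"] * c["bus_bits"] / 8
    print(f"{name}: {c['vram_gb']} GB VRAM, {bandwidth:.0f} GB/s")
# GTX 770: 2 GB, 224 GB/s
# GTX 780: 3 GB, 288 GB/s -- a lot more headroom at high res or with heavy MSAA
[/CODE]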

I see what you mean with the blurred out part, BUT its not totally unreasonable to think its shopped that way, blurred with a hint of 770 :D

Yeah, that could also be the case: photoshopped and then blurred so that it looks like a blurred 770. Easily done either way really.

I had to look that up in a dictionary :o

Haha :D




Sounds about right :p

By that, I mean that a lot of the things nVidia do look like they haven't taken into account the long-term repercussions of those actions. Like bullying tactics with their board partners, the companies they're working with and reviewers, and their obsession with all things proprietary.
 
I think it was more down to Nvidia threatening to pull chip allocations to MSI if they carried on, which to some extent they did. You don't see many MSI Titans for sale...

Unfortunately I can only see MSI reference and basic Twin Frozr on nvidia cards for the foreseeable. Its a shame as Lightnings have broken so many records for Nvidia over the years.

Have a read of this, its a little out of date but shows just what the lightnings have been doing for years now: http://forum-en.msi.com/index.php?topic=150708.0


I'm sorry, but, even at the risk of upsetting Nvidia fans... if Nvidia are behaving in this way... then Nvidia are, IMO, Rear End Holes and anti-overclockers. :mad:
 
It could be a number of reasons really, nVidia could have designed a new reference board and MSI realised it's compatible with their lightning boards, or they really could just be 680s repurposed.

It'd cost MSI such a small amount to actually produce 770 Lightnings even on a new production run because all the R&D has been done anyway the first time around for the 680 Lightnings.

The thing I find the worst though is that having such radically different hardware difference between the X70 and X80 variant is going to make things look a bit confused.

Because basically the 770 is going to still going to struggle with situations where VRAM and memory bandwidth requirements are high, whereas the 780 won't have that issue, so in some benchmark situations the 780 would pull off ahead of the 770 by a large amount.



Yeah that could also be the case, photoshopped then blurred so that it looks like 770 blurred. Easily done either way really.



Haha :D






By that, I mean that a lot of the things nVidia do look like they haven't taken in to account long term repercussions of those actions. Like bully tactics with their board partners, companies they're working with, reviewers and their obsession with all things proprietary.

I see what you mean matey, fair points :)

I'm sorry, but, and even at the risk of upsetting Nvidia fans.... if Nvidia are behaving in this way.... then Nvidia are IMO: Rear End Holes! and Anti overclockers. :mad:

I'll try to fish out the official statements; they were well worded but essentially said "follow the Green Light program or have your allocations slashed".

Green Light (or whatever its official name was) basically says you must stay within Nvidia's voltage/TDP limits. So no more Lightning, Classified, SOC etc., as they would be pointless. It's a real shame, but it gives AMD an upper hand.
 
I'm sorry, but, and even at the risk of upsetting Nvidia fans.... if Nvidia are behaving in this way.... then Nvidia are IMO: Rear End Holes! and Anti overclockers. :mad:

Well certainly, the thing is, some people get upset simply because you've said something negative about nVidia, whilst disregarding the truth of it.

But as I said before, it's for reasons like this that have built my dislike of nVidia.

They are very shortsighted when it comes to business decisions and don't seem able to see beyond ideas that strengthen their bottom line, regardless of whether it'll be detrimental in the future.

If the GTX 770 is just a rebadged GTX 680 with higher clocks, I wonder if NVidia have nobbled the GTX 770 to stop people going SLI with a GTX 680 and 770.

I bet they have.

The bottom line is that they'll have done it for cost reasons. They literally won't have had to spend R&D money on producing them if they're straight rebrands.

This is what people really mean when they say that the GTX670/680 is "midrange"; it's not about the performance on offer. nVidia have basically cut production costs in half and gimped the additional features of their chips to bring production costs right down, then doubled the price they want for them.

So production costs on a GTX680 now should be fairly similar to those of the GTX560Ti, and the GTX Titan's production costs should be fairly similar to the GTX580's, yet the newer cards are wildly more expensive.
 
I see what you mean matey, fair points :)



I'll try fish out the official statements, they were well worded which essentially said "follow the green light program or have allocations slashed"

The green light (or what ever it's official name was) basically says you must stay within nvidias voltage/tdp limits. So no more lightning, classified, SOC etc...as they would be pointless. Its a real shame but gives AMD an upper hand

Ah... yes, I do remember that now. Nvidia have got too big-headed, with the largest part of the market and every other reviewer singing their song for them.

And that's what happens as a result: the user gets screwed over.
 
I suppose it is always possible that Nvidia stopped the voltage malarkey on the 680 because they knew it would undercut the 770s that they knew they were going to be releasing. Unlikely, I know, but it is possible.

I do have to agree with Spoffle (my god, am I feeling alright? :D) that the 780 and 770 being on different chips could lead to some interesting differences in benchmark performance.

Whether these cards are any good or not will all depend on the price; bring them in just under the existing 680/670 for the 770/760 Ti and they could be good cards to pick up, but the 780, I fear, will be frightfully expensive.
 
I think the voltage cap was more about making sure the 670/680 didn't challenge what Nvidia had in the pipeline (GK110).

The 670 I had would clock to over 1350 MHz at 1.175 V; if the limit had been 1.3 V, that could easily have been closer to 1500 MHz, which is stupidly fast and probably around or even over 780 performance.
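Very crude sanity check on that guess, assuming clock headroom scales roughly linearly with voltage (a big simplification, since real chips flatten off and power rises roughly with the square of the voltage):

[CODE=python]
# Naive linear estimate -- real silicon doesn't scale this cleanly,
# so treat the result as an optimistic upper bound.
baseline_mhz = 1350.0
baseline_v = 1.175
target_v = 1.3

estimated_mhz = baseline_mhz * (target_v / baseline_v)
print(f"~{estimated_mhz:.0f} MHz at {target_v} V")  # ~1494 MHz, in line with the 'closer to 1500 MHz' guess
[/CODE]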
 
No it didn't.

Only with masses of MSAA (8x) at high res did it go past the 2GB mark... which you would expect. But with 4x MSAA at high res it's fine.

http://www.overclock.net/t/1235392/...ormation-and-screenshot-thread-not-a-port/380

So you start by saying "no", then go on to agree with me.

Lmao!

Say what you want, and clearly you do, but I remember the reviews where MP3 was shown to use well over 3GB of VRAM at certain settings. So, sorry, and I say sorry quite wrongly in your case, but yes, it can destroy 2GB of VRAM.
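For a rough sense of why 8x MSAA at high res chews through VRAM (back-of-envelope figures of my own, not from either linked thread): MSAA stores every sample of the colour and depth buffers, so the render targets alone scale with the sample count, on top of textures, shadow maps and everything else the game keeps resident.

[CODE=python]
# Illustrative MSAA render-target cost at 2560x1600 -- ignores compression,
# extra G-buffer targets, textures, shadow maps, etc.
width, height = 2560, 1600
bytes_per_sample = 4 + 4          # RGBA8 colour + 24/8 depth-stencil

for samples in (1, 4, 8):
    mb = width * height * bytes_per_sample * samples / (1024 ** 2)
    print(f"{samples}x MSAA: ~{mb:.0f} MB for one colour+depth target")
# 1x: ~31 MB, 4x: ~125 MB, 8x: ~250 MB
[/CODE]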
 