• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

First Nvidia GT300 Fermi pic.

The card doesn't look too big either... but ours won't look like that, so I read... this card looks like a quick mock-up.

But I don't think it's a pure gaming card... it's more biased towards video and AutoCAD, but I'm not too sure.

Something doesn't make sense here... I don't think this is the version we're getting... unless Nvidia are moving away from gaming. I'm a bit confused by all of this; there are just far too many conflicting rumours going around.
 
Nice pics.
That card is a beast, although it looks far from finished. It's clear they haven't even got it running anything atm, otherwise it would have been on show. Larrabee is going to give this baby a run for its money.

Can't wait to get one of these anyway! I'll be able to convert an AVI movie to MPEG in 3 minutes instead of 6. Badaboom!


Badaboom costs a lot of money for what it is, and it's only quicker at converting high-definition films. When it comes to converting AVIs to MPEGs and vice versa, it's no faster than whatever dual-core CPU the host PC is using.

The card doesn't look too big either... but ours won't look like that, so I read...

But I don't think it's a pure gaming card... it's more biased towards video and AutoCAD, but I'm not too sure.

It's a Tesla card, so it's not a gamer's card. It's handy for boffins who want to run complicated scientific algorithms and other complex number crunching but don't have the space for large server racks/farms.

Nvidia are taking a few different routes to market with this chip. Normally they promote the consumer products first, and the business products crop up 6 months later. Could this be a first sign that Nvidia are looking to reposition their brand? They have already said DirectX 11 isn't that big a deal in terms of shifting new GPUs, but is that just a smokescreen to cover the fact that they are nervous about Intel's entry into the market?
 
The card shown off today is a Tesla card, not a GeForce, so I wouldn't get too excited. Also, this card runs off 1x 8-pin and 1x 6-pin, so we can already see the G300 core is going to require a lot of juice to run.

Forgive my idiocy in mathematics, but I always thought:

1 + 0 = 1
where 1 = 1x 8-pin PCIe connector
and 0 = no 6-pin PCIe connector

Tbh I'm not surprised by Nvidia's decision to focus elsewhere. PC gaming today represents a fraction of what it was over a decade ago. With consoles so affordable now, it makes sense to buy them instead.
 
http://arstechnica.com/hardware/new...ct-aim-at-intel-supercomputing-with-fermi.ars

Interesting article. It kinda hints that the bottom is really falling out of the discrete GPU market and that there's more money to be made elsewhere (HPC and mobile), which is the way nVidia may be taking its product line.

Good article; whether or not it comes true is another matter. nVidia have a huge market share in discrete graphics solutions along with their own technologies (PhysX). If they do stop making GPUs for gaming, I think it will be a long transition. Not to mention, if there's only one major GPU manufacturer left (ATI), it will monopolise the market, which isn't good for consumers: long card life cycles, minor advancements each generation, and high prices.

Come on, green team, we need you!
 
But as said, the PC gaming market is a dying market, with the vast majority of games being console ports or released on both platforms at the same time.

There are only so many people out there who will pay £500 for a graphics card when that would buy a high-end console and a shedload of games.

It might just be me, but there used to be at least one game a month for the PC which I was excited about and desperate to buy. Now it seems I have to wait 6 to 12 months for a groundbreaking, impressive PC game to come along (console ports excluded).
 
Surely if (as I read in this thread) Shader Model 5 uses double-precision FP, then that would mean it could fly through Shader Model 5 stuff compared to ATI's. Also, does anyone know if ray tracing would benefit a lot from moving to this sort of architecture? If so, then that's a win surely, since graphics are going that way.

It won't be an ATI monopoly; Larrabee could compete once it's gone through a gen or two, and nVidia will still probably release another few gens before they pull out (i.e. while they can still compete on performance and price).

I really liked the Anandtech article, especially the piece about the FX/GT200 fubars.

It'll be a sad day (at least for me) when I can't buy an nVidia card for my PC. I just hope the last card they release is an awesome one, and not a poor performer with a high price tag so everyone remembers them badly.
 
Gonna go with Forno on this one: that card is just a mock-up and shouldn't be considered the real deal. The solder pads don't even line up with the power connectors, and there aren't enough of them.
 
Surely if (as I read in this thread) Shader Model 5 uses double-precision FP, then that would mean it could fly through Shader Model 5 stuff compared to ATI's. Also, does anyone know if ray tracing would benefit a lot from moving to this sort of architecture? If so, then that's a win surely, since graphics are going that way.

It won't be an ATI monopoly; Larrabee could compete once it's gone through a gen or two, and nVidia will still probably release another few gens before they pull out (i.e. while they can still compete on performance and price).

I really liked the Anandtech article, especially the piece about the FX/GT200 fubars.

It'll be a sad day (at least for me) when I can't buy an nVidia card for my PC. I just hope the last card they release is an awesome one, and not a poor performer with a high price tag so everyone remembers them badly.

The key word here is that Shader Model 5 can use double-precision floating-point operations. There's nothing forcing developers to use them, and the majority probably still won't in graphics shaders because of the huge performance penalty. Even on Nvidia's new architecture, double-precision floating-point performance is half of its single-precision performance.
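For anyone wondering what the single vs double precision trade-off actually buys you: a quick sketch (nothing GPU-specific, just IEEE 754 behaviour) showing that single precision carries roughly 7 significant decimal digits while double carries 15-16, so small increments that survive in double simply vanish in single.

```python
import struct

def to_float32(x: float) -> float:
    """Round a Python float (which is double precision) to single precision
    by packing it into an IEEE 754 32-bit float and unpacking it again."""
    return struct.unpack('f', struct.pack('f', x))[0]

a = 1.0 + 1e-8              # double precision: the tiny increment survives
b = to_float32(1.0 + 1e-8)  # single precision: rounds straight back to 1.0

print(a == 1.0)  # False - float64 still resolves the 1e-8
print(b == 1.0)  # True - 1e-8 is below float32's resolution near 1.0
```

That lost resolution is irrelevant for shading pixels but matters a lot for the iterative scientific number crunching the Tesla cards are aimed at, which is why the halved double-precision rate is still a selling point.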
 
I've been waiting for the new-gen nVidia GPUs to come out before I decide which way to go; I just hope we're not waiting too long before we see some.

Then timing is just as valid, because while Fermi currently exists on paper, it's not a product yet. Fermi is late. Clock speeds, configurations and price points have yet to be finalized. NVIDIA just recently got working chips back and it's going to be at least two months before I see the first samples. Widespread availability won't be until at least Q1 2010.
 
Forgive my idiocy in mathematics, but I always thought:

1 + 0 = 1
where 1 = 1x 8-pin PCIe connector
and 0 = no 6-pin PCIe connector

Tbh I'm not surprised by Nvidia's decision to focus elsewhere. PC gaming today represents a fraction of what it was over a decade ago. With consoles so affordable now, it makes sense to buy them instead.

Nope, the card has both an 8-pin and a 6-pin. There are plenty of other sources; don't waste your time and just take my word for it.
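For context on why people are counting connectors at all: the PCI Express spec caps what each power source can deliver, so the connector loadout puts a hard ceiling on board power. The slot itself supplies up to 75 W, a 6-pin plug up to 75 W, and an 8-pin plug up to 150 W. A quick back-of-envelope sketch:

```python
# Maximum sustained power permitted by the PCIe spec per source (watts).
PCIE_SLOT = 75   # delivered through the slot edge connector
SIX_PIN = 75     # per 6-pin auxiliary plug
EIGHT_PIN = 150  # per 8-pin auxiliary plug

def board_power_limit(six_pins: int, eight_pins: int) -> int:
    """Upper bound on board power for a given auxiliary connector loadout."""
    return PCIE_SLOT + six_pins * SIX_PIN + eight_pins * EIGHT_PIN

print(board_power_limit(0, 1))  # 8-pin only: 225 W
print(board_power_limit(1, 1))  # 6-pin + 8-pin: 300 W
```

So if the card really does carry both plugs, it's being budgeted for up to 300 W, versus 225 W with the 8-pin alone, which is why the "missing" 6-pin argument matters.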
 
It'll be a sad day (at least for me) when I can't buy an nVidia card for my PC. I just hope the last card they release is an awesome one, and not a poor performer with a high price tag so everyone remembers them badly.

You'll still be able to buy an nVidia card for your PC - its primary use just may not be gaming anymore :p
 
Sounds like Charlie has been writing his "articles" again...

Sorry, I still don't see the second PCIe connector in those pics. I've been out all evening, so someone is going to have to point it out for me.
 
Sorry, I still don't see the second PCIe connector in those pics. I've been out all evening, so someone is going to have to point it out for me.

Specsavers?
[image: videocard2.jpg]


See this one? Lol, hope they don't want to get any heat out. Seriously bad mock-up; that's just embarrassing.

 