
First Nvidia GT300 Fermi pic.

Associate
Joined
6 Nov 2005
Posts
157
I would love to see the GPU going into the next generation of Xbox... somehow I think that AMD will do what they did last time: stick with the highest-return parts on PC, but then produce a really hardcore GPU for the Xbox that appears two years later on the PC.

Also a great way for them to try out their newest ideas.
 
Soldato
Joined
7 May 2006
Posts
12,192
Location
London, Ealing
Nvidia's mainstream Fermi arrives in late Q1

Launching in March if all goes well


Nvidia's mainstream-market version of its GT300 / Fermi chip is scheduled for a Q1 2010 launch. This is not particularly good news for the company, as ATI is already shipping its mainstream cards based on the Juniper chip, priced at around €130 / $169 for the more expensive parts. Of course, we are talking about the HD 5770 and HD 5750, which both started shipping yesterday.


To put this in perspective, Fermi will be released as a performance-segment card, a high-end single-chip card, and a dual-GPU card that might launch a few weeks later.

When it comes to the entry-level and mainstream segments, we will most likely have to wait until around March, if not even later. Despite the huge flexibility of Nvidia's Fermi GF100 architecture, it still takes time to derive slower-performing chips from the original design.

We have a feeling that ATI might capture a much bigger piece of the DirectX 11 market than anyone expected, thanks to being first to market and shipping a month before its competition. Both of these factors are very beneficial for raking in good sales numbers. While Nvidia's mainstream cards might end up faster, they will unfortunately arrive a month behind the competition.
http://www.fudzilla.com/content/view/15954/1/
 
Associate
Joined
12 Oct 2009
Posts
279
Is it likely this is going to be much more expensive than the HD 5870 and require more power? I was going to wait until it comes out before upgrading from 8800GT SLI.
 
Associate
Joined
18 Sep 2007
Posts
524
Location
London, England
I'm waiting to see what GF300/GT300 is before upgrading from my 8800GT. Overall performance, power usage, and price will decide between the 5870 and Nvidia's offering. If the GF300/GT300 is significantly more expensive than the 5870, I'll most likely get the 5870 or wait for Nvidia's mainstream part in Q1 2010. Tempted to wait until Q1 2010, TBH.
 
Associate
Joined
18 Jan 2009
Posts
94
It's a 40nm part; its power consumption will be less than that of the GT200-era cards, which already have better power consumption than their Radeon equivalents.

Methinks Mother Earth will be spared.
 
Soldato
Joined
3 Nov 2004
Posts
9,871
Location
UK
its power consumption will be less than that of the GT200-era cards
That rather depends on the number and type of transistors. Consumption per transistor, yeah, sure.

For instance, at overall load a 5870 uses more than a 4870, no?

[Attached image: powerconsumption.png (power consumption comparison chart)]
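
A quick back-of-envelope sketch of that point, using approximate published transistor counts and load powers for the two cards (my own illustrative figures, not from this thread): per-transistor power can fall sharply on a die shrink while total board power still rises, because the transistor count grows faster than the per-transistor saving.

```python
# Back-of-envelope sketch: a die shrink can cut power per transistor
# while total board power still rises, because transistor count grows
# faster than the per-transistor saving. Figures are approximate.
cards = {
    "HD 4870 (55nm)": {"transistors": 956e6, "load_watts": 150},
    "HD 5870 (40nm)": {"transistors": 2150e6, "load_watts": 188},
}

for name, c in cards.items():
    per_transistor_nw = c["load_watts"] / c["transistors"] * 1e9  # nanowatts
    print(f"{name}: {c['load_watts']} W total, "
          f"~{per_transistor_nw:.0f} nW per transistor")

# Approximate output:
#   HD 4870 (55nm): 150 W total, ~157 nW per transistor
#   HD 5870 (40nm): 188 W total, ~87 nW per transistor
```

So per-transistor power drops by roughly 45% going from 55nm to 40nm, yet the board as a whole draws about 25% more, precisely because Cypress packs well over twice the transistors.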
 
Soldato
Joined
11 Sep 2007
Posts
5,740
Location
from the internet
It's a 40nm part; its power consumption will be less than that of the GT200-era cards, which already have better power consumption than their Radeon equivalents.

Methinks Mother Earth will be spared.

Of course, smaller manufacturing processes automatically mean that the power consumption of a new chip will be lower than that of completely different chips on a larger production process, regardless of frequency, architecture or transistor count. Also, the Pentium 4 Prescott chips clearly had much lower power consumption than any Pentium III chip.
 