Budget G80s

Permabanned · Joined 25 Oct 2004 · Posts: 9,078
I know the 8800s will be out sometime next week, but what about the budget end of the spectrum? Being unable to afford the second mortgage it would take to buy one of those, I'm looking at upgrading to what I guess will be called the 8600 series, as my poor 9800XT is starting to show its age.

Would I be right in guessing that those cards could be expected around the time Vista comes out, i.e. Jan/Feb 2007?
 
Secret_Window said:
Who knows; it would be great if an 8600 GT or something came out, making it more affordable for those who can't afford the 8800s.

Yeah, I'm kind of hoping it goes the same way the 7800s did: the GTX got released in Q4 2005 and the 7600s followed in Jan 2006. It would make sense, what with Vista out Jan 30th; lots of people will be after DX10-ready machines, and budget cards are where the money will be for Nvidia and ATI.
 
Vista may be out in Jan, but I don't think any major DX10 games will be, so you might not see any midrange DX10 cards for a while, as demand won't be that high. You'll see an 8800GT card soon, but an 8600-type card will be quite a way off.
 
The G80 chip is massive, so I wouldn't expect a midrange part based on G80 for quite some time yet. It might not even be economical until Nvidia shift production to 65nm.

Also, IIRC, it was nearly a year after the 7800 series launched (June 2005?) that the 7600s launched (April 2006?).
 
I recall reading somewhere about budget DX10 cards coming around Feb/March, in time for Vista's launch; they said they will have them from the £100+ price range as well.

8300/8600/8800 ranges and more.

Obviously you won't get much done on an 8300, but the 8600 range should be good.
 
Fx-Overlord said:
The G80 chip is massive, so I wouldn't expect a midrange part based on G80 for quite some time yet. It might not even be economical until Nvidia shift production to 65nm.

Also, IIRC, it was nearly a year after the 7800 series launched (June 2005?) that the 7600s launched (April 2006?).


I was about to say just that, FX matey ;)

Can't remember the exact 7600GT date, but it was in March I believe... the 7800GTX was June 22nd 2005. That's the day my bank balance went to just above 0, so I remember it well ;p
 
The 8800GT will probably have 512MB of memory. Nvidia are bound to bring out 8600xx and 8300xx cards as well, like the 7 series.

However, if you guys can wait a while, the 89xx will draw a lot less power; Nvidia didn't really have time to do this with the 88xx, as they wanted to beat ATI's R600 release date, be cooler and be slightly faster :)
 
qwerty07 said:
The 8800GT will probably have 512MB of memory. Nvidia are bound to bring out 8600xx and 8300xx cards as well, like the 7 series.

However, if you guys can wait a while, the 89xx will draw a lot less power; Nvidia didn't really have time to do this with the 88xx, as they wanted to beat ATI's R600 release date, be cooler and be slightly faster :)


Link?
 
No link required. They will refresh the 8800s next year with a 65nm, GDDR4-equipped variant that will use a hell of a lot less power, unless they do something very, very wrong. G80 is by all accounts HUGE, so they won't be adding much, if anything, to it in the process shift, because they will want the smaller die size above all else. I don't expect to see budget and midrange 8-series chips until they move to 65nm; the chip is just too big and uses too much power.
 
Radeon X1950 XTX (watts):
Idle: 184
Load: 308

GeForce 8800GTX (watts):
Idle: 229
Load: 321

For a card so much faster, the power requirements are not that much more than an X1950 XTX.

About 4% more power draw under load, but a massive boost in performance :D
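As an editorial aside, the percentages implied by those figures are easy to check. A minimal Python sketch, using only the system wattages quoted above:

```python
# System power figures quoted above (watts)
x1950_idle, x1950_load = 184, 308
g80_idle, g80_load = 229, 321

def pct_increase(old: float, new: float) -> float:
    """Percentage increase going from old to new."""
    return (new - old) / old * 100

print(f"Load: +{pct_increase(x1950_load, g80_load):.1f}%")  # +4.2%
print(f"Idle: +{pct_increase(x1950_idle, g80_idle):.1f}%")  # +24.5%
```

The ~4% figure holds under load, but the idle gap is closer to 25%. Note too that these are whole-system readings, so the difference at the card itself is larger in percentage terms.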
 
easyrider said:
Radeon X1950 XTX (watts):
Idle: 184
Load: 308

GeForce 8800GTX (watts):
Idle: 229
Load: 321

For a card so much faster, the power requirements are not that much more than an X1950 XTX.

About 4% more power draw under load, but a massive boost in performance :D


Also take into account that the X1900 XTX has higher power consumption than the X1950 XTX.
 
easyrider said:
Radeon X1950 XTX (watts):
Idle: 184
Load: 308

GeForce 8800GTX (watts):
Idle: 229
Load: 321

For a card so much faster, the power requirements are not that much more than an X1950 XTX.

About 4% more power draw under load, but a massive boost in performance :D

Compared to the current ATI cards, yes... compare it with the current Nvidia cards and the picture changes a little, hehe.

The load figures aren't really the thing that'd bother me, though; the idle figures are. I'm hoping that's poor driver optimisation at work, because if it's not, that's a whole chunk of extra idle power, and far more expensive than the load figures (unless you play more than you idle, of course ;p)
 
robg1701 said:
Compared to the current ATI cards, yes... compare it with the current Nvidia cards and the picture changes a little, hehe.

The G80 offers similar performance to two 7900 GTXs in SLI,

so really its power usage is not that high considering its performance-to-power ratio.

You can't compare the G80 to current NV cards either; it's a new-gen card.
 
easyrider said:
The G80 offers similar performance to two 7900 GTXs in SLI,

so really its power usage is not that high considering its performance-to-power ratio.

You can't compare the G80 to current NV cards either; it's a new-gen card.

Of course you can compare them (well, you literally just did :p); power is power. I didn't see anyone jumping to Prescott's defence because it wasn't the same as an A64 or Northwood... granted, it was in fact ever so slightly slower than Northwood and beaten silly in games by A64, but anyway... my point was that comparing power draw against current ATi cards and saying 'next-gen NV isn't that much higher' is a bit one-sided, when the current NV cards deliver similar performance to the current ATi cards on the same process technology whilst using a lot less power.

Just so we're clear, I currently think ATi's cards use too much power (though I have an X1900 XT in my gaming box, because I think it's the better card vs the 7900GTX in terms of quality and future performance, despite being a noisy power sucker). That's why I brought up the current Nvidia cards, which I think make a better power-usage comparison for future Nvidia AND ATi products. Just like I think future AMD and Intel products have Conroe as a power baseline, not Prescott :)

As for power/performance: look for example at the 7900GTX vs 7800GT SLI; again similar performance, but the 7900GTX gets away with using a hell of a lot less power. I know that's not a fair comparison, as it's G71 vs G70, which are in fact architecturally identical with only the process tech and memory changed, but it's still part of my feeling that G80 is a bit of a power sucker, and is why waiting for 'G81' is what I'll be doing. As more background, I owned a 7800GTX until I replaced it with the X1900 XT a couple of months ago, so I'm not suggesting I always go for the low-power card :p

EDIT: And in fact, I will probably end up with the probable 'R680' even if it uses more power than 'G81'.
 
robg1701 said:
...my point was that comparing power draw against current ATi cards and saying 'next-gen NV isn't that much higher' is a bit one-sided,


Half Life 2: Lost Coast loves the GeForce 8800GTX. Here the GeForce 8800GTX is able to show significant performance gains over AMD's ATI Radeon X1950 XTX: approximately 92%.

Quake 4 shows similar gains to Half Life 2: Lost Coast, an approximate 92% improvement.

The 8800 GTX does not use 92% more power but offers nearly double the performance.

Considering its power-to-performance ratio, the G80 is staggering.
 
Right, so G80 is 92% faster than the X1950 in Quake 4, where 7900GTX SLI is already about 85% faster (http://techreport.com/reviews/2006q3/radeon-x1950xtx/quake4-1600.gif), in an OpenGL game where Nvidia are generally faster anyway. Great, that shows real progress.

So, in Quake 4 at least, it's got a similar power/performance ratio to 7900GTX SLI. In HL2 it's better, as 7900GTX SLI is only around 50% faster than the X1950 in Episode One.

I'm not talking about power/performance between the Nvidia cards. I did compare power/performance of the ATi and Nvidia cards to show that I thought current Nvidia cards make a better basis of comparison than an ATi card that already uses a lot more power. Forgetting performance, compared to a 7900GTX, G80 is one hell of a power sucker, at idle especially, and at load.

5900 -> 6800 transition: power increase. 6800 -> 7800: no change. 7800 -> 7900: power decrease. 7900 -> 8800: mammoth power increase. Thus I expect that if they just do a 65nm, GDDR4-equipped refresh, it should use a lot less power. I'm just talking about power; I don't care about the relative change in performance.
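As an editorial aside, the perf-per-watt claim being debated here can be roughed out in a few lines of Python, taking the system load wattages and the ~92% Quake 4 gain quoted in the thread at face value:

```python
# Relative perf-per-watt of the 8800GTX vs the X1950 XTX baseline,
# using the system load wattages and ~92% performance gain quoted above.
x1950_perf, x1950_load = 1.00, 308   # baseline
g80_perf, g80_load = 1.92, 321       # ~92% faster in Quake 4

perf_ratio = g80_perf / x1950_perf           # 1.92x the performance
power_ratio = g80_load / x1950_load          # ~1.04x the power
print(f"perf/watt advantage: {perf_ratio / power_ratio:.2f}x")  # ~1.84x
```

Because these are whole-system wattages, the card-only power ratio would be higher and the perf-per-watt figure correspondingly less flattering, and the comparison baseline is an ATi card rather than a 7900GTX, which is the objection raised in this post.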
 
easyrider said:
Half Life 2: Lost Coast loves the GeForce 8800GTX. Here the GeForce 8800GTX is able to show significant performance gains over AMD's ATI Radeon X1950 XTX: approximately 92%.

Quake 4 shows similar gains to Half Life 2: Lost Coast, an approximate 92% improvement.

The 8800 GTX does not use 92% more power but offers nearly double the performance.

Considering its power-to-performance ratio, the G80 is staggering.

EDIT: Damn net connection. Beaten to a similar post :)
 
Lanz said:
Vista may be out in Jan, but I don't think any major DX10 games will be, so you might not see any midrange DX10 cards for a while, as demand won't be that high. You'll see an 8800GT card soon, but an 8600-type card will be quite a way off.


No major games at all, as far as I'm aware. There's only FSX right now that can use DX10, once it's banged onto Vista with a DX10 card, and now Crysis has been put back to April, which is 6+ months away. Wow, DX10 games are flooding out in droves. :D

There are bound to be budget versions, along the lines of what we have now, like the 7600 GTs, GSs, etc.
 