
G80 specs finally confirmed! NOT old news!

Confusion said:
just found out you need an 850W PSU for SLI :eek:
The whole thing is: do you really need SLI? If this thing is as fast as two GX2s, why on earth would you want to SLI them? Once the cards go to 2nd gen, the power draw should come down.
Now that Nvidia has opened up on specs, I would suggest ATI won't be too far down the road, if only to stop prospective buyers going Nvidia rather than ATI.
Everyone needs to remember this is Nvidia's first go at unified shaders... ATI has done it before with the Xbox 360.
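For what it's worth, here's a rough sketch of where an 850 W SLI recommendation could come from. This is strictly back-of-the-envelope: every wattage below is my own assumption for illustration, not a confirmed spec.

```python
# Back-of-the-envelope power budget for a rumoured G80 SLI rig.
# Every wattage here is an illustrative guess, not a confirmed figure.
components_watts = {
    "G80 card #1": 175,      # assumed ballpark for a single G80 under load
    "G80 card #2": 175,
    "CPU": 120,
    "motherboard + RAM": 50,
    "drives + fans": 60,
}

total_draw = sum(components_watts.values())
# PSUs are usually sized so the load sits well below the rated output,
# since efficiency and stability suffer near the limit.
target_load_fraction = 0.7
suggested_psu = total_draw / target_load_fraction

print(f"Estimated draw: {total_draw} W")          # 580 W
print(f"Suggested PSU:  {suggested_psu:.0f} W+")  # ~829 W+, so 850 W is plausible
```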
 
The G80 supposedly didn't have unified shaders... now look at it!
It would be very foolish to write off ATI just because it doesn't have a card spec released; for all we know, ATI may be laughing at the G80 spec.
As I said, ATI would be on its second generation of unified shaders. Just look at some of the graphics the Xbox 360 is capable of, without DX10.
Some sources have stated that the R600 is a modified Xbox 360 chip. This would make sense, since it cost a whole load to develop the 360's graphics design.
Microsoft must have seen something good to go with ATI over Nvidia for the 360.
DON'T write ATI off! I am not a fanboy; I have Nvidia at the moment, and always have had.
Nvidia has been economical with the truth about its performance in the past. Take the 6800 series: reviewers got highly optimised drivers that would shimmer on screen to bleed the last drop of performance out of the chip, and then, after all the mags etc. had reviewed it, the optimisations were scaled back.
 
Warbie said:
http://www.anandtech.com/tradeshows/showdoc.aspx?i=2841&p=2

'Markus admitted that being a PC-only developer can easily lead to laziness, and developing for the 360 has improved the efficiency of Alan Wake tremendously. With that said, Markus expects the visual and gameplay experience to be identical on the Xbox 360 and the PC when Alan Wake ships, hopefully without any in-game load screens.'

'The demo ran extremely well on the test system, which was a Core 2 Quad running at 3.73GHz with a GeForce 7900 GTX. Markus said that it would have run just as well if the Core 2 Quad was running at its default clock speed, which we assume was 2.66GHz.'

The developers themselves are saying there will be little, if any, difference in performance between Alan Wake on a 360 and on a Core 2 Quad. I wouldn't be surprised in the slightest if this turned out to be the case. After all, according to Remedy the 360 is the 'main' platform Alan Wake is being developed for:

http://www.alanwake.com/forum/showpost.php?p=18278&postcount=10

Makes you wonder where all the thousands of pounds we put towards gaming rigs actually go.
So if the Xbox can run Alan Wake identically to what is probably the fastest PC there has ever been, how come the Xbox or PS3 can't handle Crysis? And Alan Wake seems to be the better-looking game.
Also, the Xbox doesn't have to drive all the different resolutions that a PC does.
 
Warbie said:
Agreed, especially as it was one of the first games released on the system. It takes a pretty powerful and expensive PC to top the 360.

Given more time with the machine, I believe Oblivion could run as well on the 360 as any current PC hardware would allow.
Maybe, but just how powerful do you think the 360 is?
 
I am sure I can see a fly in the reference that they missed. They should send me the game and hardware for testing so I can see what else they missed! :p ;)
 
Why is everyone moaning about a single 8800 getting just over 12k in 3DMark 06?
That is a huge score for a single-GPU card!
What was everyone expecting? 20k?
That's if the figure is even correct, and remember, a whole new core means poor drivers to start with!
A single 7900 GTX will do 7k in 3DMark 06 at stock, so this card is roughly 1.7x as fast!
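A quick sanity check on that ratio, taking both scores quoted in this thread at face value (neither is confirmed):

```python
# Sanity check on the rumoured 3DMark06 scores quoted above (both unconfirmed).
g80_score = 12000     # single 8800 (G80), as reported in this thread
gtx7900_score = 7000  # single 7900 GTX at stock, as claimed in this post

ratio = g80_score / gtx7900_score
print(f"{ratio:.2f}x the 7900 GTX")                   # 1.71x
print(f"i.e. about {(ratio - 1) * 100:.0f}% faster")  # ~71% faster
```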
 
The Asgard said:
For me to buy one of these it would have to be at least 1.5x the performance of a GTX SLI system. The headroom over the current gen is just not going to be there for next-gen games, IMO.
In all honesty, I never expected a first-gen DX10 card to be quicker than two top-of-the-range cards, say X1900 XTs or X1950 XTs in CrossFire; I think that was an unrealistic hope. Of course it will be expensive, as Nvidia has to recoup some of its losses. TBH I didn't think Nvidia would release the G80 until Vista shipped. I think it is silly, especially if the difference over current cards in DX9 is minimal; they might be shooting themselves in the foot here.
Probably why ATI is waiting.
 
Flanno said:
I presold my GX2 to a mate on the strength of the G80. If it isn't pulling much more than my GX2, I wonder whether it is much faster in DX9. DX10 obviously does not concern me right now, but for 500+ quid I would hope it would be a big improvement in DX9.

Also, slightly off topic, but every bench I see shows CrossFire beating SLI. Why do so many people prefer SLI then? Is it more stable? Does it scale better at higher resolutions? Are there any issues with CrossFire?
CrossFire is relatively new compared to SLI, so more people adopted SLI early on. But you're right in that X1900 XTs or X1950 XTs in CrossFire thump SLI, even quad SLI, in most games; in fact, I think F.E.A.R. is the only game where quad SLI will beat them.
 
So it's going to be about 30% faster than an X1950 XT for £500? WTF, lol.
For that you could get two X1950 XTXs and blow the G80 away, hahaha.
It looks like Nvidia hasn't made the card anywhere near fast enough.
:eek:
Looks like the brute-force, high-power way of doing things is starting to top out and reach a plateau.
For this card to benchmark so poorly, especially given its power requirements and on-paper spec, suggests the company is caught between a rock and a hard place.
Maybe it is time ATI or Nvidia bought some IP from PowerVR?
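Putting rough numbers on that price/performance gripe: the "30% faster" figure is this thread's rumour, and the £250 street price and 80% CrossFire scaling are my own assumptions for illustration.

```python
# Rough price/performance comparison based on the speculation in this thread.
# The 30% figure is the rumour above; prices and scaling are illustrative guesses.
x1950xtx_price = 250.0      # assumed street price per X1950 XTX, in pounds
g80_price = 500.0           # "for £500", as claimed above

g80_perf = 1.30             # 1.3x a single X1950, per the rumour
crossfire_perf = 1.0 + 0.8  # two X1950 XTXs, assuming the 2nd card adds ~80%

print(f"G80:       {g80_perf:.1f}x performance for £{g80_price:.0f}")
print(f"CrossFire: {crossfire_perf:.1f}x performance for £{2 * x1950xtx_price:.0f}")
# Same £500 outlay, ~38% more performance from CrossFire under these assumptions.
```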
 
LoadsaMoney said:
Yeah, but don't forget you can get two G80s at over £1k to beat the two X1950s. :p
Swings and roundabouts!
Lol, maybe we can all hook up direct to our houses' 100 A supply, SLI ten mobos together and play F.E.A.R. at 1000 fps, lol.
This is getting crazy now, honest. There is NO way I will be adding two PSUs to my PC just to hit 16k in 3DMark 06.
 
Flanno said:
I didn't.
I presold my GX2 to a mate to get the money to buy a G80.

After hearing the not-so-encouraging benchmarks being reported, I was thinking CrossFire would be faster for not a lot more money. Of course I wouldn't go down that route, to be honest. I would rather buy one powerful card that is faster at DX9 and has DX10 features as well.

As it so happens, if things go well I could be getting two G80s on launch day ;)
If you want the fastest DX9 setup, get two X1950 XTs, because it looks like they will be faster than a G80, especially if 10.5k in 3DMark 06 is all it can muster.
 
naffa said:
LMFAO. :D :D :D

http://forums.overclockers.co.uk/showpost.php?p=8036333&postcount=63

Also, he said he wanted DX10 features, so that's a moot point.
Yes, but by the looks of it the G80 won't run DX9 uber-fast, will it?
I am not an ATI fanboy, but two X1950 XTXs simply destroy any other dual-card setup at the moment, pure and simple.
Regarding the laughing at 64-bit CPUs: we have 64-bit operating systems now, so 64-bit CPUs are not useless. Why run out to get a DX10 card when the release of DX10 is three months away at the soonest, and it will be another three months before any games take advantage? Lol.

Well said, Loads. Why do you want a DX10 card? You won't SEE ANY DIFFERENCE, hahahaha. If it's to say 'nananana, I can get 10.5k in 3DMark 06',
someone with an uber SLI or CrossFire setup will come along and say 'nananana, I get 12k', lol.
 
naffa said:
DX10 = 64bit CPUs all over again tbh.
That's true. Look how long 64-bit computing has taken to catch on. XP64 was crap, and all the promise of huge game environments etc. never came off.
Vista 64 might be the same for a while.
If DX10 takes as long, then DX10 cards will be even more of a moot point.
 
Flanno said:
Well, Vista is going to RTM the middle of next month, and only volume-licence customers will be able to order systems with it. There is an embargo on all the OEMs not to ship to the general public until the launch day in January. But I will tell you this: the only thing stopping the OEMs from shipping early is possible fines from MS, which MS didn't impose when Gateway shipped XP early. I can tell you now with 100% certainty that all the OEMs have systems ready to ship with every flavour of Vista and are beavering away doing production runs using RC2. So all it takes is one OEM to ship Vista next month (which they will almost definitely do) and the likes of Dell, Gateway, HP, Fujitsu, Acer etc. will all follow. There is just too much money at stake in November, which is the busiest month of the year for nearly every PC manufacturer. If MS delivers final RTM code, Vista will almost definitely ship next month.

Where I am going with this is the question of DX10 games support. Surely Nvidia have done deals with some games developers to have DX10 games shipping in January? Otherwise it seems the G80 is just an exercise in having a card that is a bit faster in DX9 than any single-card solution, at a really, really expensive price, with DX10 features that will be marketed but not used until the G80 refresh comes.
M8, Nvidia just wanted to be out first before ATI with a DX10 card. Crysis is the only game to use DX10 so far, apart from the DX10-only Halo 2 and Alan Wake...
I don't think any deal can be made with games producers to have DX10 games out in January. The Crysis developers said DX10 is a nightmare at the moment, as Microsoft keep releasing new API code to them, so they have to start from the beginning again.
 