7970 vs 680 thread.

Has anyone here said otherwise? I don't think I've seen anyone state it's not the case.

Stop raving on here like a madman and go to Anandtech or whatever and rant over there.

Ah so you haven't noticed the ton of reviews declaring the 680 a winner and all of the poor deluded souls on here saying it's faster?
 
lol, the GTX 680 sure has some people worked up... can we stop crying over the clock-boosting feature already? The cards are quite capable of running 1200MHz constantly out of the box, but instead nVidia decided to use a system to get the best combination of performance, power usage and heat output.

Saner voices have said something along the lines of, "think of 1200MHz as its base clock and lower clocks as a downclock". Cards have been doing something similar for a while with 2D and 3D clocks; this is an improved version to maintain better framerates.
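To make the "treat 1200MHz as the base clock and everything lower as a downclock" framing concrete, here is a minimal C++ sketch of a boost-style controller. The limits, step size and function names are illustrative assumptions, not NVIDIA's actual GPU Boost algorithm; only the idea of stepping down from a top bin when power or heat limits are hit comes from the posts above.

```cpp
#include <algorithm>
#include <cstdio>

// Hypothetical telemetry a boost controller might sample each tick.
struct Telemetry {
    double powerWatts;
    double tempC;
};

// Illustrative clock decision: hold the top bin unless power/thermal limits
// are exceeded, then step down one bin; recover when headroom returns.
int nextClockMHz(int currentMHz, const Telemetry& t) {
    constexpr int kTopMHz   = 1200; // the "base" clock in the poster's framing
    constexpr int kFloorMHz = 1006; // advertised base clock acts as the floor here
    constexpr int kStepMHz  = 13;   // one made-up boost bin per adjustment

    const bool overLimit = (t.powerWatts > 195.0) || (t.tempC > 80.0);
    if (overLimit)
        return std::max(kFloorMHz, currentMHz - kStepMHz); // the "downclock"
    return std::min(kTopMHz, currentMHz + kStepMHz);        // head back up
}

int main() {
    Telemetry cool{150.0, 65.0}; // light load: climb back toward 1200MHz
    Telemetry hot{200.0, 84.0};  // limit hit: step down toward the floor
    std::printf("cool: 1187 -> %d MHz\n", nextClockMHz(1187, cool));
    std::printf("hot:  1200 -> %d MHz\n", nextClockMHz(1200, hot));
}
```

The point of the sketch is the direction of travel: the card tries to sit at the top bin and only drops clocks when it has to, which is why the "it's a downclock, not a boost" framing works.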

And now KingP1n's bench also shows that the 680 has a max OC above the 7970's and beats it as well. But you still get a lot of fanbois whinging and making idiotic arguments about this or that. It's pathetic.
 
Dare I even suggest that if both cards are pretty much equal when fully OC'd, the cheaper one is the one to go for?

IIRC the 680 has a lower RRP.

Surely a good thing?:confused:

I don't think it's an outlandish claim; they are certainly as close framerate-wise as makes no difference to what people would experience in-game (except maybe in some specific high-res/multi-monitor or AA scenarios).

And when put to similar clocks the 7970 is just as fast.

There's no crying going on, just a huge case of denial.

The architectures are different, so you really can't compare them clock for clock. Comparing average attainable 24/7-stable overclocked versions against each other may have some merit, but it's a different story once you're including end-user overclocking, and we are yet to see what the GTX 680 is fully capable of when overclocked, although I suspect the two cards will still perform very similarly when overclocked to the max.
 
I don't think it's an outlandish claim; they are certainly as close framerate-wise as makes no difference to what people would experience in-game (except maybe in some specific high-res/multi-monitor or AA scenarios).



The architectures are different, so you really can't compare them clock for clock. Comparing average attainable 24/7-stable overclocked versions against each other may have some merit, but it's a different story once you're including end-user overclocking, and we are yet to see what the GTX 680 is fully capable of when overclocked, although I suspect the two cards will still perform very similarly when overclocked to the max.

Look back a page or two and they are compared clock for clock.

And once again, there's nothing in it.

Linus has uploaded a video and again, nothing in it.
 
Ah so you haven't noticed the ton of reviews declaring the 680 a winner and all of the poor deluded souls on here saying it's faster?

I haven't noticed many poor deluded souls on here stating that at all, no. I've noticed most people whose opinion I respect on here stating that it's mostly even.

What I have noticed is you stinking up the forum for two days and essentially spoiling what appears to be an interesting period in graphics card history.

Certainly you've spoilt it for me anyway, as every thread I've gone into, you've been in it ranting about stuff that not many others seem to give a damn about.
 
Really? You don't find them biased at all? Must be reading different reviews... like how every second word of the 7970 review was about how overpriced it was, and yet on the 680 we get them saying "While we realise that £400 is a great deal to spend on a GPU, we really feel that the performance, power consumption and features on offer with the GTX 680 2GB more than justify the outlay." Not the first time either; they did the same when comparing the 560 Ti to the 6950. Unbiased wouldn't be the word I would choose.

Can you provide any quotes to substantiate this?
 
It turns v-sync on when the card goes over 60fps and off when it is below that.

I'm not an expert but that sounds the wrong way round?
If you are going to notice tearing etc., wouldn't it be noticed at low framerates?

I thought the adaptive bit was that it adjusted the FPS in between the typical vsync ranges, so if you can't make it to 60, it would vsync at 33fps for example?
 
I'm not an expert but that sounds the wrong way round?
If you are going to notice tearing etc., wouldn't it be noticed at low framerates?

I thought the adaptive bit was that it adjusted the FPS in between the typical vsync ranges, so if you can't make it to 60, it would vsync at 33fps for example?

Anything over 60fps on a 60Hz monitor will cause tearing, I believe? Maybe I've got it the wrong way around. :)
 
I'm not an expert but that sounds the wrong way round?
If you are going to notice tearing etc., wouldn't it be noticed at low framerates?

I thought the adaptive bit was that it adjusted the FPS in between the typical vsync ranges, so if you can't make it to 60, it would vsync at 33fps for example?

It generally happens when you're rendering above the refresh rate; vsync caps it at the refresh rate.

It's perfectly fine.
 
I'm not an expert but that sounds the wrong way round?
If you are going to notice tearing etc., wouldn't it be noticed at low framerates?

I thought the adaptive bit was that it adjusted the FPS in between the typical vsync ranges, so if you can't make it to 60, it would vsync at 33fps for example?

It's not an option everyone will want to use, but that's how it works. When the framerate starts dropping below 60fps it disables vsync, because of the way vsync works: the card would otherwise render at integer fractions of the refresh rate (60, 30, 20fps and so on) rather than the actual framerate the GPU can churn out. So if you drop to, say, 42fps, instead of having 42fps rendered, traditional vsync would give you 30fps. By disabling vsync for short periods it will continue to render at the 42fps the GPU is trying to churn out instead of dropping to 30, usually resulting in minimal tearing and a smoother feel, less stutter and lower input latency. It does mean you might occasionally see screen tearing, though from my tests so far I've not actually noticed any.
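Here's a minimal sketch of that behaviour, assuming a 60Hz double-buffered display (the numbers and function names are illustrative, not the driver's actual code): traditional v-sync snaps the displayed rate to integer fractions of the refresh rate, while the adaptive mode simply stops syncing once the GPU drops below it.

```cpp
#include <cmath>
#include <cstdio>
#include <initializer_list>

constexpr double kRefreshHz = 60.0;

// Traditional double-buffered v-sync: a frame that misses one refresh interval
// waits for the next, so the displayed rate snaps to refresh/N (60, 30, 20...).
double vsyncedFps(double gpuFps) {
    return kRefreshHz / std::ceil(kRefreshHz / gpuFps);
}

// Adaptive v-sync as described above: sync only while the GPU can keep up with
// the refresh rate, otherwise present immediately and accept occasional tearing.
double adaptiveFps(double gpuFps) {
    return (gpuFps >= kRefreshHz) ? kRefreshHz : gpuFps;
}

int main() {
    for (double gpuFps : {75.0, 55.0, 42.0, 25.0}) {
        std::printf("GPU %4.0f fps -> v-sync %4.0f fps, adaptive %4.0f fps\n",
                    gpuFps, vsyncedFps(gpuFps), adaptiveFps(gpuFps));
    }
}
```

Which matches the example above: at 42fps from the GPU, plain v-sync shows 30fps, while the adaptive mode keeps the full 42fps at the cost of possible tearing.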
 
How funny is it to watch/read people arguing over the minimal differences of which card is microscopically faster.

At the end of the day, these cards are designed to play games... both of which can handle any game thrown at them with superior IQ... just because the GTX 680 might be 2fps faster in one game and the 7970 2fps faster in another... who gives a damn?

All I'm seeing is "my 1-inch schlong is larger than your 1-inch schlong!"

I'll be getting a GTX 680... not because of any reviews or anybody's biased opinions, but because I need an upgrade for my GTX 470 and the new-generation card uses a lot less power (trying to cut my electricity bills down) while giving me better performance... and I'll use it to, guess what... play games with!
 
It's not an option everyone will want to use, but that's how it works. When the framerate starts dropping below 60fps it disables vsync, because of the way vsync works: the card would otherwise render at integer fractions of the refresh rate (60, 30, 20fps and so on) rather than the actual framerate the GPU can churn out. So if you drop to, say, 42fps, instead of having 42fps rendered, traditional vsync would give you 30fps. By disabling vsync for short periods it will continue to render at the 42fps the GPU is trying to churn out instead of dropping to 30, usually resulting in minimal tearing and a smoother feel, less stutter and lower input latency. It does mean you might occasionally see screen tearing, though from my tests so far I've not actually noticed any.

Fair enough, I suppose I was nearly there; I just thought the vsync was "on" but variable, rather than off and on.

Thanks for the info
 
Tbh it's entirely underwhelming: three months later than the 7970 and it's the same. The only thing that is truly different is the proprietary stuff that each company owns. I was hoping it would be much, much cheaper and actually better than the 7970, but it's not. As much as people would like to think so, it's not the case, sadly.

Yay.
 
War never changes. :D

You can't fight in here, gentlemen! This is the War Room! :D


Tbh it's entirely underwhelming: three months later than the 7970 and it's the same. The only thing that is truly different is the proprietary stuff that each company owns. I was hoping it would be much, much cheaper and actually better than the 7970, but it's not. As much as people would like to think so, it's not the case, sadly.

Yay.

Which is completely true. Even if it were better than the 7970, it would not be enough better to justify the price tag.

I have no problems whatsoever admitting that I overpaid for the 7970. I'm not exactly rich, and for many years I lived in the mid-range sector. I think the most I ever spent on a GPU was £250, and that was for a pair of the buggers, two 5770s. I then changed them to a single 470, which with a Zalman cooler cost me £205.

Had the money not been money to burn I wouldn't have bought the 7970. Several weeks before I did I bought a 6970 fake Lightning for £285. That was the most I had spent since 1999/2000.

The sad fact is that for some reason people have not been able to be so selective and brutally honest about the 680. There are actually websites justifying the price tag.

One of those websites (Bit-tech) scored the 7970 lower complaining about the price tag. They then refused to score the 7870 and 7850 because they had no set price.

So if price is so important then why has the 680 been given the thumbs up?

I don't care what people say about me, or what they think about me either. This isn't about being a fanboy, or that I give a crap because I have an AMD card. I don't care, and I have put all of my peeves about the 7970 up for all to see. I've said plenty of bad things about the 7970.

However, if the 7970 was not worth the asking price, then why is the 680? Does it do anything the 7970 doesn't?

Even if it were faster, and by a long way, surely it still wouldn't be worth £400+. There are cards costing less than half that which can take care of most anything at mainstream resolutions. So are we now being sold things we simply don't need?

It would seem so. So, for the record, all of these new cards are overpriced and pretty stupid.
 