
Anyone else had loads of hardware issues with AMD's current gen?

Well I wouldn't deny it, no, who would?

As I've said, my main issue revolves around giving them any of my money as I'd see that as me supporting the things they'd do.

It's not really about "my money" as such anyway. I'm a big fan of price/performance, but I do like high-end stuff; hell, I've just built a Sandy Bridge-E computer with a 3930K, 32GB of RAM, two SSDs and dual graphics, watercooled in a TJ07, with 3x 2560x1440 monitors :p

If I got a 690 for free, I'd just sell it and recoup the costs of my new PC.
I'd make you sign something stating you can't and won't sell the card.
You would have to use the card in your rig.
It comes with a pretty waterblock as well. :D

Who would deny such a fine specimen ;)
 
Ok spoffle, you win. I will stick to Nvidia and carry on playing the games I enjoy, and if I get any games with PhysX, I will just enjoy it rather than get into a debate. The one thing you keep on at SkodaMart for, you are doing yourself. You seriously come across as a major AMD fanboy, and if someone shows the same passion for Nvidia, you give them a hard time.

Going back to the OP, I am glad he is happy with his new Nvidia card and he can also enjoy enhanced details in the PhysX titles ;)
 
I think it's pretty funny that AMD's high-end card is matched by nVidia's "mid-range" card, and yet people (usually AMD mud-slingers) say the 680 was a mid-range card like it's a bad thing. If anything it's a damning indictment of AMD... more on which in a moment.

I'd say to 95% of consumers it doesn't matter if the card was originally scheduled to be a "mid range" card or not...

I don't believe for a minute the 680 series was meant to be a midrange card. I think it was a reaction to NVidia's yield woes of the last two generations, and similar to AMD's 5xxx series, they decided to scale back and concentrate on efficiency. They had been making cards that were really big, and it was costing them, and this time around they concentrated on efficiency and gaming power, dispensing with a lot of compute/cuda-type stuff that didn't serve the gaming goal.

If they did have a high-end gaming card from GK110, and could actually produce it at a cost they could sell it at, they would - they wouldn't hold it back.
 
Ok spoffle, you win. I will stick to Nvidia and carry on playing the games I enjoy, and if I get any games with PhysX, I will just enjoy it rather than get into a debate. The one thing you keep on at SkodaMart for, you are doing yourself. You seriously come across as a major AMD fanboy, and if someone shows the same passion for Nvidia, you give them a hard time.

Going back to the OP, I am glad he is happy with his new Nvidia card and he can also enjoy enhanced details in the PhysX titles ;)

You're missing my point again (on purpose?). I'm not telling you it's wrong to like nVidia. You do this a lot: someone is asked something, and then you get all upset and offended when they respond, acting like what's been said is irrelevant.

That post was a response to Stanners asking why >>I<< don't like nVidia. I'm not telling anyone that they should agree and do the same, you need to stop trying to see things in posts that aren't there.

As I've asked before (and you repeatedly fail to substantiate), what exactly makes me an AMD fanboy? Again, because I dislike nVidia it must be because I love AMD? Why do you insist it has to work like that?

I don't believe for a minute the 680 series was meant to be a midrange card. I think it was a reaction to NVidia's yield woes of the last two generations, and similar to AMD's 5xxx series, they decided to scale back and concentrate on efficiency. They had been making cards that were really big, and it was costing them, and this time around they concentrated on efficiency and gaming power, dispensing with a lot of compute/cuda-type stuff that didn't serve the gaming goal.

If they did have a high-end gaming card from GK110, and could actually produce it at a cost they could sell it at, they would - they wouldn't hold it back.

Neither do I, it's pretty clear that the GK110 is the compute part used in their Quadro/Tesla line of boards, and GK104 is GK110 with the compute logic stripped out for efficiency, yields, size and heat/power consumption.

It's very simple when you look at it: nVidia couldn't use GK104 chips for Tesla/Quadro cards aimed at compute performance, because the compute performance isn't there, so they would have to have a different chip with all the compute logic present, which is what we've got in GK110.

In the HPC industry, die size, yields, heat and power draw come second to top performance, so it's much less of an issue for nVidia if they get bad yields because the customers only care about the performance on offer.
 
You're missing my point again (on purpose?). I'm not telling you it's wrong to like nVidia. You do this a lot: someone is asked something, and then you get all upset and offended when they respond, acting like what's been said is irrelevant.

That post was a response to Stanners asking why >>I<< don't like nVidia. I'm not telling anyone that they should agree and do the same, you need to stop trying to see things in posts that aren't there.

As I've asked before (and you repeatedly fail to substantiate), what exactly makes me an AMD fanboy? Again, because I dislike nVidia it must be because I love AMD? Why do you insist it has to work like that?

I will call an end to this before we both take this thread way off track and end up with an infraction.
 
I don't believe for a minute the 680 series was meant to be a midrange card. I think it was a reaction to NVidia's yield woes of the last two generations, and similar to AMD's 5xxx series, they decided to scale back and concentrate on efficiency. They had been making cards that were really big, and it was costing them, and this time around they concentrated on efficiency and gaming power, dispensing with a lot of compute/cuda-type stuff that didn't serve the gaming goal.

If they did have a high-end gaming card from GK110, and could actually produce it at a cost they could sell it at, they would - they wouldn't hold it back.

The memory bus specifically - which even hurts performance a little at 1080p - and the lack of compute power would appear to show otherwise, so I disagree. Perhaps "mid-range" is too low, but the 670 being the top-end GK104 part, with something else as the 680, is possible.

Speculation and all that.
 
The memory bus specifically - which even hurts performance a little at 1080p - and the lack of compute power would appear to show otherwise, so I disagree. Perhaps "mid-range" is too low, but the 670 being the top-end GK104 part, with something else as the 680, is possible.

Speculation and all that.

I believe the 256-bit bus was really an exercise in lowering costs due to yield issues. I think the GTX 6xx series is quite expensive for nVidia to produce, so anything with a GK104-based GPU is going to be relatively expensive for them, which would also explain why they chose a 192-bit bus for the GTX 660 Ti.
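A quick aside on why bus width maps to cost: GDDR5 chips each present a 32-bit interface, so the bus width dictates the minimum number of memory chips (and the PCB routing) the board needs. A rough sketch, assuming one chip per 32-bit channel:

```python
def gddr5_chips(bus_width_bits, chip_interface_bits=32):
    """Minimum number of GDDR5 chips needed to populate a bus of this width."""
    return bus_width_bits // chip_interface_bits

for bus in (192, 256, 384):
    print(f"{bus}-bit bus -> {gddr5_chips(bus)} chips")
# 192-bit -> 6 chips, 256-bit -> 8, 384-bit -> 12: a narrower bus means fewer
# chips and a simpler PCB, which is one way to trim per-card cost.
```

That would fit the 660 Ti's 192-bit choice: two fewer chips and less routing than the 680's 256-bit layout.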
 
Believe it or not, I am not anti-ATi; if I had been, I wouldn't have bought an HD 7950 in the first place.
You all know the reason I went back to nVidia, so I am not going to bore you with the details again.
The only reason for my previous passionate defence is that the GTX 660 Ti is actually a decent card despite its 192-bit bus, and I am perfectly happy with 2x AA in games.
I have never called it good value for money, but for its price it gets the job done and does it well.

That is my final word on the subject, but I must say this thread has turned into quite an interesting read. I sincerely hope both companies remain competitive, as a one-horse race would benefit no one.
 
Well I wouldn't deny it, no, who would?

As I've said, my main issue revolves around giving them any of my money as I'd see that as me supporting the things they'd do.

It's not really about "my money" as such anyway. I'm a big fan of price/performance, but I do like high-end stuff; hell, I've just built a Sandy Bridge-E computer with a 3930K, 32GB of RAM, two SSDs and dual graphics, watercooled in a TJ07, with 3x 2560x1440 monitors :p

If I got a 690 for free, I'd just sell it and recoup the costs of my new PC.

Have you used a GTX 690 ?

I think if one fell into your ownership for free once you have tried it and compared it to say a pair of HD 7970s you would not want to part with it.

Unless you are gaming above 1600p, the GTX 690 gives a far better user experience than a pair of HD 7970s. It's smaller (mine are dwarfed by my Platinums), quieter, uses less electricity, only needs two slots and does not suffer from microstutter. Performance-wise there is not that much in it either.

I would never recommend someone buy a GTX 690 over a pair of HD 7950/70s if it's bang for buck they are after, but I would find it strange for someone to part with one they got for free.

You don't see a lot of second-hand GTX 690s, do you? I think people like them once they start using them. :D:p:D
 
Nothing :D
It would just be funny if I gifted you a 690... you wouldn't deny it, and it would also mean you couldn't hate on it :D

You can gift me a 690 if you like, mate :D It'd be a staggering upgrade from this GeForce 4 lmao

I don't want to stir up another debate, but isn't the performance per watt of AMD cards significantly higher than the Nvidia cards they compete with? EDIT: single cards
 
Have you used a GTX 690 ?

I think if one fell into your ownership for free once you have tried it and compared it to say a pair of HD 7970s you would not want to part with it.

Unless you are gaming above 1600p, the GTX 690 gives a far better user experience than a pair of HD 7970s. It's smaller (mine are dwarfed by my Platinums), quieter, uses less electricity, only needs two slots and does not suffer from microstutter. Performance-wise there is not that much in it either.

I would never recommend someone buy a GTX 690 over a pair of HD 7950/70s if it's bang for buck they are after, but I would find it strange for someone to part with one they got for free.

You don't see a lot of second-hand GTX 690s, do you? I think people like them once they start using them. :D:p:D

I already have 2x 7950s that are good overclockers, and I also have 3x 2560x1440 monitors. A GTX 690 would be a downgrade, which is ridiculous considering the cost difference. I got my 2x 7950s for £480; how much are 690s again? :p

So if I did acquire a GTX690 for free, I would most certainly sell it.
 
I already have 2x 7950s that are good overclockers, and I also have 3x 2560x1440 monitors. A GTX 690 would be a downgrade, which is ridiculous considering the cost difference. I got my 2x 7950s for £480; how much are 690s again? :p

So if I did acquire a GTX690 for free, I would most certainly sell it.

I think you need to read my post again.:D

Why are you messing around with a pair of HD 7950s to drive 3x 2560x1440 monitors? The least I would go with is 3x HD 7970s, maybe even 4, or better still 4x GTX 680 4GB versions (more VRAM and better drivers for multi-GPU).
 
I think you need to read my post again.:D

Why are you messing around with a pair of HD 7950s to drive 3x 2560x1440 monitors? The least I would go with is 3x HD 7970s, maybe even 4, or better still 4x GTX 680 4GB versions (more VRAM and better drivers for multi-GPU).

Didn't Rusty already establish that the 7950s work better at multi-screen resolutions because of their 384-bit bus vs the 256-bit on the 680?
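For anyone who wants the raw numbers behind that: theoretical memory bandwidth is just the bus width in bytes times the effective data rate. A quick sketch, using what I believe are the stock memory speeds (treat the clock figures as approximate):

```python
def bandwidth_gb_s(bus_width_bits, data_rate_gt_s):
    """Theoretical bandwidth: bytes moved per transfer * transfers per second."""
    return (bus_width_bits / 8) * data_rate_gt_s

# Stock GDDR5 speeds (approximate): the HD 7950 runs ~5 GT/s on a 384-bit bus,
# the GTX 680 ~6 GT/s on a 256-bit bus.
print(bandwidth_gb_s(384, 5.0))  # HD 7950 -> 240.0 GB/s
print(bandwidth_gb_s(256, 6.0))  # GTX 680 -> 192.0 GB/s
```

So despite the 680's faster memory, the wider bus leaves the 7950 with roughly 25% more headroom, which matters most at multi-screen resolutions.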
 
I think you need to read my post again.:D

Why are you messing around with a pair of HD 7950s to drive 3 x 2560x1440 monitors. The least I would go with is 3 x HD 7970s maybe even 4 or better still 4 x GTX 680s 4gb versions (more vram and better drivers for multi GPUs).

The better-drivers thing isn't really true; I've had no multi-GPU driver issues with the various CrossFire setups I've used.

I'm also not interested in spending ridiculous money on 3+ graphics cards when it's not really necessary.

4GB 680s make even less sense because they're still limited by the 256-bit bus.

Yes

2x HD 7950s will work well at 3x 1080p, but they are not going to cut it with high settings at 3x 1440p.

They "cut it" fine; the requirements for 3x 2560 are often quite exaggerated. It isn't a linear increase in requirements going from, say, 3x 1920, which I have had, with a 2GB 5870 and a 6950. Triple 1920 also isn't as demanding as people make out.

For example, I noticed no performance difference in Fallout: New Vegas (which I know isn't particularly demanding) between 5760x1200 and 7680x1440 on a single 6950; the performance was more than playable.

I played most games fine on a single 5870 or 6950 on my 1920x1200 monitors; Alan Wake, for example, ran pretty well, but did crawl at 7680x1440.

2 overclocked 7950s will be more than enough.
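To put numbers on the "not linear" point, a quick pixel-count comparison (resolutions only; the actual game load depends on far more than raw pixels):

```python
def pixels(width, height, screens=1):
    """Total pixels pushed per frame across a multi-monitor surround setup."""
    return width * height * screens

triple_1200p = pixels(1920, 1200, 3)   # 5760x1200 surround
triple_1440p = pixels(2560, 1440, 3)   # 7680x1440 surround

print(triple_1200p)                            # 6,912,000 pixels
print(triple_1440p)                            # 11,059,200 pixels
print(round(triple_1440p / triple_1200p, 2))   # ~1.6x the pixels
```

So the jump from triple 1920x1200 to triple 2560x1440 is about 1.6x the pixels per frame, not a doubling; shader and CPU load don't scale one-for-one with that either.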
 
The better-drivers thing isn't really true; I've had no multi-GPU driver issues with the various CrossFire setups I've used.

I'm also not interested in spending ridiculous money on 3+ graphics cards when it's not really necessary.

4GB 680s make even less sense because they're still limited by the 256-bit bus.



They "cut it" fine; the requirements for 3x 2560 are often quite exaggerated. It isn't a linear increase in requirements going from, say, 3x 1920, which I have had, with a 2GB 5870 and a 6950. Triple 1920 also isn't as demanding as people make out.

For example, I noticed no performance difference in Fallout: New Vegas (which I know isn't particularly demanding) between 5760x1200 and 7680x1440 on a single 6950; the performance was more than playable.

I played most games fine on a single 5870 or 6950 on my 1920x1200 monitors; Alan Wake, for example, ran pretty well, but did crawl at 7680x1440.

2 overclocked 7950s will be more than enough.

It does not matter how you dress this up: you do not have enough GPU muscle to run modern games like BF3 or Far Cry 3 with quality settings at the resolution you are talking about.

The other day I ran Far Cry 3 at 2560x1600 on a single monitor with the settings maxed out and managed to use 2GB of VRAM. Try that at three times the resolution and I think you are going to need a bit more than the 3GB you have got, and a lot more than a pair of HD 7950s can give.

I know the limitations of a GTX 690 and use mine within them. You should keep in mind that the HD 7950s have limitations too. Rusty has shown they are great up to 5760x1080 (6,220,800 pixels), but going to 7680x1440 (11,059,200 pixels) is asking a bit too much with quality settings.

I also think you would find it interesting using something like a GTX 690; as a two-GPU solution it is better than a pair of HD 7970s. Unfortunately, as far as value for money goes it is c*&p compared to the HD 7*** series cards.

The point I'm making is that all GPUs have strengths and weaknesses. The GTX 690 has the same problem running at 5760x1080 as a pair of HD 7950s has at 7680x1440.

People who use Nvidia or AMD should remember that the other side has good points as well as bad.

I happen to think my HD 5970s and HD 7970s are great, and I look forward to having a lot of fun with the 7970s in the next week or so.
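For a rough feel of the VRAM side of that argument, here's a back-of-the-envelope sketch (32-bit colour and triple buffering assumed; textures and render targets ignored, so the figures are illustrative only):

```python
def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=3):
    """Rough size of the colour buffers alone (triple buffering, 32-bit colour)."""
    return width * height * bytes_per_pixel * buffers / 1024**2

print(round(framebuffer_mb(2560, 1600)))   # ~47 MB at 2560x1600
print(round(framebuffer_mb(7680, 1600)))   # ~141 MB across three such screens
# The colour buffers scale linearly with resolution, but they're a small slice
# of the 2 GB observed: most VRAM goes on textures and render targets, which
# don't all scale with resolution - so usage grows, just not by a straight 3x.
```

Which is why the 2GB-at-2560x1600 observation doesn't simply triple at surround resolutions, though it does still push past what 3GB cards can comfortably hold with quality settings.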
 