
Drop your prices Nvidia. Please?

Seems completely insane to me to place so much emphasis on 20W or 30W.

Laws of physics may well dictate it will add warmth to the room.

Laws of Physics also dictate that a speck of dust has a gravitational pull; I notice that about as much as I notice the warmth my card is adding to my room.

Come on guys, seriously?

It seems those 1kW+ electric heaters people use, or indeed central heating, are way too powerful. According to this thread, 30W is enough to heat a room. There is only one way to test this out - as winter is coming, I expect some people to ditch central heating or electric heaters.

They can use a 60W SES Clear Candle Lamp. Not only does it heat the room, but it gives you light too!! :p

If they use two, the room will probably feel like a sauna!! :p

They can then pitch their dual 60W SES bulb super-efficient room heater to Dragons' Den and make millions of quid!! :D

Edit!!

Too late!! Apple patented it!! Sorry chaps!! :(
 

lol... that's pretty funny. :)
 
@CAT-THE-FIFTH - :D you had to come and ruin it, didn't you; I was finding it pretty amusing seeing people actually believe that :D Nvidia cards are a bit hotter but a bit quieter, vs AMD a bit cooler and slightly louder. Nvidia chips are squeezed for every bit of juice, so naturally they run hotter, being overclocked to their limits (the reason they're locked).

I'm sure that, for as many links showing AMD running cooler, you could find more showing Nvidia running hotter. It also depends on the PCB/components and the cooler/fan :p

+1 Nvidia for the cold winter months coming
 
Because of course people use 1kW heaters in their small bedrooms. 30W is obviously not enough to heat a room by itself, but if you're in a small bedroom, like a lot of people will be, the PC, their own body heat, and an incandescent bulb will be enough to heat it. Another 30W doesn't help.
 

Let's say 1kW of heat is required to heat an area to an acceptable level of human comfort. In a domestic situation there is no allowance for metabolic gains or thermal gains from equipment, which can all add up. Let's say, for the sake of an example:

1000 watts from the heat emitter,
100 watts from metabolic gains (1 person),
90 watts from a TV/monitor,
300 watts from a PC.

This gives a total of 1490 watts, so this is already nearly 50% over the 1kW design figure. So every extra watt will push it even further.

(This is not related to Nvidia or AMD, just to show how easy it is to overheat an area within a domestic environment.)
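
Purely as a back-of-the-envelope sketch (Python; the wattages are just the illustrative figures from the example above, not measurements), the sum and the overage work out like this:

Code:
# Illustrative heat-gain tally using the example figures from the post above.
design_heat_w = 1000  # heat emitter sized to keep the room comfortable

gains_w = {
    "heat emitter": 1000,
    "metabolic gains (1 person)": 100,
    "TV/monitor": 90,
    "PC": 300,
}

total_w = sum(gains_w.values())
overage_pct = (total_w - design_heat_w) / design_heat_w * 100

print(f"Total heat input: {total_w} W")                     # 1490 W
print(f"Over the 1kW design figure by {overage_pct:.0f}%")  # 49%, i.e. nearly 50%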
 
The GTX 660 is about the right price; it's only really the bundled games that are making the 7870 a better buy at £180-200.

The 660 Ti is still stupidly expensive compared to a 7950. The 670 is maybe a bit too expensive too, but the 660 Ti seems the worst offender to me. It needs to come down by at least £20 to be worth bothering with.
 
Because of course people use 1kW heaters in their small bedrooms. 30W is obviously not enough to heat a room by itself, but if you're in a small bedroom, like a lot of people will be, the PC, their own body heat, and an incandescent bulb will be enough to heat it. Another 30W doesn't help.

Switch the heater on less if you feel hot?? Maybe use the thermostat?? :p

Don't wear a jumper during summer?? :p

30W makes no real-world difference, it really does not, and that is after actually living in a very hot country for a number of years.

The only place where 30W would make a difference is in a small SFF gaming system with limited cooling, as it would make it harder to cool effectively.

The 30W is just 2% of that 1490W you are talking about.

The 1kW heater is 67% of that 1490W.
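
A quick sanity check of those two percentages (Python again; same illustrative figures, with the 30W being the GPU power difference argued about in this thread):

Code:
# Share of the 1490W total contributed by the two figures quoted above.
total_w = 1490    # total heat input from the earlier example
gpu_delta_w = 30  # extra GPU power under discussion
heater_w = 1000   # the 1kW heat emitter

print(f"GPU delta: {gpu_delta_w / total_w * 100:.0f}% of the total")  # ~2%
print(f"Heater:    {heater_w / total_w * 100:.0f}% of the total")     # ~67%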

Switch the heater on less if you feel hot?? Maybe use the thermostat?? :p

Edit!!

What has heating a room got to do with the OP anyway?? :confused:

Going to stop right now with this, as it's not relevant to the OP.
 
I run 3 monitors and SLI 680s. I can seriously feel the extra heat as opposed to just one 680. I am not out to prove a point, merely giving my own perception in this nonsensical argument.
 
Every other review on the Internet.

I'm not being funny, but you don't have the cards in question and it's not exactly hard to find a review which is in favour of your own bias. (Speaking generally.)

Honestly, do you believe that an overclocked 7970 is THAT much faster than an overclocked 680?

I have no real interest in arguing which is faster as a matter of principle, but if we agree that the results there aren't correct then you have to just discount them completely.

You mean all the reviews that have now-outdated drivers on non-GHz edition cards?
 
I went from 3x GTX 580 to 3x GTX 670, and the 580s were better at heating up my room :) they were much more efficient than starting to turn up the heat on the radiator.

To topic - I would have liked to see nVidia drop the prices on the GTX 670s; I still think they're a bit too pricey compared to HD7970 cards now.
 
You mean all the reviews that have now-outdated drivers on non-GHz edition cards?

No, every other review which pitches the 680 against the 7970 at a "max" OC. Anyway, as I said in the post which you quoted, I've no real interest in which one is faster, but to post a graph that puts a 7970 20% ahead of a 680 is obviously flawed.

I think everyone else said the same thing.
 
No, every other review which pitches the 680 against the 7970 at a "max" OC. Anyway, as I said in the post which you quoted, I've no real interest in which one is faster, but to post a graph that puts a 7970 20% ahead of a 680 is obviously flawed.

I think everyone else said the same thing.

It's not that out there; AMD trounce NV in Sleeping Dogs, for example, at extreme settings.
 
It's not that out there; AMD trounce NV in Sleeping Dogs, for example, at extreme settings.

Hmmm, I wouldn't say trounce. It's a large difference, yes, but the difference between my benchmark and the highest one on here was about +7 on the minimum FPS and the same on the average. A comfortable win, but perhaps not as large as you'd think. If you take the median of the AMD 7970 results I am a fair bit closer.

Anyway, this graph which placed the 7970 20% higher was BF3 at 1080... I think you should have perhaps read the whole thread (if you could have borne it!) :p

I have no doubt the gap isn't as large, or isn't there at all, but to use that graph as an example of why somebody shouldn't consider Nvidia at all is a little misleading. The point he made about value for money was a valid one, but the evidence used to back it up was terrible.
 