
The Fury(X) Fiji Owners Thread

It's the current leakage at higher temperatures that results in an increase in the power consumed, but it is typically fractional. This has no bearing on how the heat it generates is dissipated, which is what was being implied on the previous page.

The only way water cooling can lower power consumption is if you lower the voltage.

Making something cooler does not make it use less power. It just means you can run it at a lower voltage, meaning you would use less power but only if you lower the voltage.

Putting a water block on something does not change the way it performs or the TDP, mate. Seriously, get your head around it. It just makes it cooler; that waste power and heat is still being fully generated, it is just being dissipated better.
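The "lower the voltage" point has a concrete basis: CMOS dynamic power scales roughly as P ≈ C·V²·f, so even a small undervolt pays off quadratically. A rough sketch with illustrative numbers (the capacitance, clock, and voltages below are assumptions, not real Fiji figures):

```python
def dynamic_power(c_eff, voltage, freq):
    """Switching power of CMOS logic: P = C_eff * V^2 * f."""
    return c_eff * voltage ** 2 * freq

# Illustrative values only (assumed, not measured):
c_eff = 1.3e-7   # effective switched capacitance, farads
freq = 1.05e9    # core clock, Hz

p_stock = dynamic_power(c_eff, 1.20, freq)      # stock voltage
p_undervolt = dynamic_power(c_eff, 1.10, freq)  # modest undervolt

print(f"{p_stock:.0f} W vs {p_undervolt:.0f} W "
      f"({100 * (1 - p_undervolt / p_stock):.0f}% less)")
# → 197 W vs 165 W (16% less)
```

The point of the sketch is that a ~8% voltage drop cuts dynamic power by ~16%, because voltage enters squared; cooling alone doesn't touch any term in that equation.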
 
Making something cooler does not make it use less power. It just means you can run it at a lower voltage, meaning you would use less power but only if you lower the voltage.

Putting a water block on something does not change the way it performs or the TDP, mate.

It does have an effect on power usage. As silicon's temperature increases it starts to free more electrons within its structure, becoming leaky. That is why it is called LEAKAGE current: it is power that is not being used to perform the function the transistor was made for.

It is a small increase in power draw, but the increased temperature can also reduce performance of the part due to leakage. It depends on how well the transistor can cope with that leakage: if there is a large bias between on and off for the transistor, a small amount of leakage won't blur the difference between a 1 and a 0, but a larger leakage can cause performance issues.

And TDP is just a measure of how much heat the part should release at a specific load, with a specific core temperature as the baseline.

But if you keep your core cooler it will reduce leakage across the transistors. I have had cases where people's system stability and even performance improved by reducing their temps from 80-90 °C down to 40-60 °C.
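The temperature dependence being described can be sketched with the common rule of thumb that subthreshold leakage roughly doubles every 10-20 °C. The baseline wattage and doubling interval below are illustrative assumptions, not measured Fiji values:

```python
def leakage_power(p_leak_ref, temp_c, temp_ref_c=25.0, doubling_c=20.0):
    """Leakage power scaled by the rule of thumb that it
    roughly doubles every `doubling_c` degrees above the reference."""
    return p_leak_ref * 2 ** ((temp_c - temp_ref_c) / doubling_c)

# Assumed baseline: 15 W of leakage at 25 C.
for t in (45, 60, 85):
    print(t, round(leakage_power(15.0, t), 1))
# 45 C → 30 W, 85 C → 120 W under these assumed constants
```

On that model, dropping a core from the high 80s to the 40-60 °C range cuts the leakage component of power draw several-fold, which is consistent with the stability improvements described above, even though the dynamic (switching) power is unchanged.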
 
The only way water cooling can lower power consumption is if you lower the voltage.

Making something cooler does not make it use less power. It just means you can run it at a lower voltage, meaning you would use less power but only if you lower the voltage.

Putting a water block on something does not change the way it performs or the TDP, mate. Seriously, get your head around it. It just makes it cooler, but that waste power and heat is still being fully generated it is just being dissipated better but it's still being generated in the first place.

I give up. You're confusing energy transfer with the occurrence of current leakage. Don't start getting irate.
 
So you're telling me that cooling changes the physical properties of ICs?

I'm not getting irate.

Temperature affects all materials: the conductivity of semiconductors and the resistivity of metals both increase with increasing temperature.

It is a well-known scientific fact.

So keeping your core cooler can reduce leakage and power usage, but parts are designed so that the effect on performance is small over a certain range. That is why, above a certain temperature, ICs fail due to leakage interfering with normal transistor function. Inadequate cooling that runs an IC at a higher temperature can increase overall waste heat and power usage.
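The metal-resistivity half of that claim is easy to put numbers on: copper follows the linear model R(T) ≈ R0·(1 + α·ΔT) with α ≈ 0.393 %/°C, so a hotter power-delivery path wastes measurably more energy as I²R loss. A minimal sketch (the 10 mΩ trace resistance is an assumed example value):

```python
def copper_resistance(r0_ohms, temp_c, temp_ref_c=20.0, alpha=0.00393):
    """Linear temperature model for metal resistance:
    R(T) = R0 * (1 + alpha * (T - T0)), alpha for copper ~0.393%/C."""
    return r0_ohms * (1 + alpha * (temp_c - temp_ref_c))

# An assumed 10 mOhm copper path at 20 C vs 80 C:
print(round(copper_resistance(0.010, 80), 5))
# about 24% higher resistance, hence ~24% more I^2*R loss at the same current
```

This is the metal side only; the semiconductor side (leakage) grows much faster with temperature, closer to exponentially, which is why it dominates the earlier discussion.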
 
What do you want him to run? Batman? AC Unity? Watch Dogs? pCars and The Witcher?
The only game which managed to sort itself out was The Witcher; the rest of them are a bit of a joke in terms of being bug-fests. And what do all these games have in common? The word starts with the letter 'G'.
You say cherry-picking; I say he is just wise about what he is buying. With GameWorks you are buying a pig in a poke, while it has been a while since I last heard of any AMD titles having serious issues.

Isn't your list cherry picking too?

How about a selection of decent testable games from both camps? It's not like we have a lack of them.
 
At least I have owned a pair of Fury Xs, which is more than you can say, and I bet I buy another Fury X before you get one :D

The only reason I would not have a pair in my main PC is the lack of HDMI 2.0 and poor CrossFire support. I apologise to everyone in here for having to repeat that, but shankly1985 does not have the best record when it comes to paying attention, so I find I have to repeat myself for his benefit.


That's irrelevant. People such as you owning two and getting screwed are why some of us aren't stupid enough to play the lottery, and prefer to wait until the fixed cards hit the market before buying.

Personally I'd rather wait than go through the hassle of having to send them back because they're fitted with defective pump parts.
 
That's irrelevant. People such as you owning two and getting screwed are why some of us aren't stupid enough to play the lottery, and prefer to wait until the fixed cards hit the market before buying.

Personally I'd rather wait than go through the hassle of having to send them back because they're fitted with defective pump parts.

People such as me? AMDMatt and Kapp both bought four. What kind of people are they in your eyes? :rolleyes:
 
It gets harder to justify three cards with every passing release from both camps, and I don't necessarily think it's entirely either one's fault. Some games just aren't that accommodating for it, and if there's a game-critical rendering method or post-process that isn't XDMA/SLI friendly, then it's tough titties.
 
It gets harder to justify three cards with every passing release from both camps, and I don't necessarily think it's entirely either one's fault. Some games just aren't that accommodating for it, and if there's a game-critical rendering method or post-process that isn't XDMA/SLI friendly, then it's tough titties.

The real change will come with explicit multi-GPU rendering in DX12 and the like. As long as they make a decent system, it should scale far better beyond two cards.
 
Is it possible or sensible to install an AMD and an NVIDIA graphics card in the same system and disable one or the other via software? I am having some issues with my Oculus that I think might be caused by AMD drivers, and I would like to be able to quickly switch graphics cards to investigate the problem.
 
Is it possible or sensible to install an AMD and an NVIDIA graphics card in the same system and disable one or the other via software? I am having some issues with my Oculus that I think might be caused by AMD drivers, and I would like to be able to quickly switch graphics cards to investigate the problem.

Disable one in Device Manager?
 
They weren't that hard to find on launch day dude.

Yes they were. I was ready at my desk come 1pm and there were only six showing. Either most were already allocated or people were extremely quick to get what little was available. I had to wait for a Gigabyte unit to come into stock the following week. Personally I think Kaap (not sure how Matt acquired his) pulled in a repeat-customer favour, something I hadn't bothered to try in time.
 