Official GTX750TI review thread

Impressive power consumption looking at that TPU review. Mediocre performance though, when you can get a faster R270 that uses 40W+ more for the same price.
 
Simple explanation: lolTomhardware :P

Here's a better test from TPU:

http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_750_Ti/23.html

Peak power draw was 57 W in Crysis 2 and 66 W in FurMark.

OTOH, it could simply be a case of TH having more sensitive equipment, and having bothered to look at dynamic power consumption more closely.

If you look at the measurements, you see the card quickly boosting up and then clocking down again (going by the power consumption). You are looking at 1 ms intervals, i.e. 1/1000 of a second. However, the average power consumption is in line with TPU's.

Like I said, Intel did the same with some of their chips.
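To put some rough numbers on the peak-vs-average point (completely made-up figures, not the actual TH or TPU data): a handful of very short boost spikes at 1 ms resolution barely moves the average at all, which is how TH's plots can show high peaks while the averages still line up with TPU's.

# Rough sketch with invented numbers: 1 second of 1 ms power samples,
# mostly ~60 W with a few brief ~95 W boost spikes.
samples = [60] * 995 + [95] * 5

peak = max(samples)
average = sum(samples) / len(samples)

print(f"peak: {peak} W, average: {average:.1f} W")
# peak: 95 W, average: 60.2 W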
 
OTOH, it could simply be a case of TH having more sensitive equipment, and having bothered to look at dynamic power consumption more closely.

If you look at the measurements, you see the card quickly boosting up and then clocking down again (going by the power consumption). You are looking at 1 ms intervals, i.e. 1/1000 of a second. However, the average power consumption is in line with TPU's.

TPU weren't measuring averages in the numbers I gave, they were peaks. They measured it from the card using better equipment than THG, and let's face it, they have a much better track record.
 
TPU weren't measuring averages in the numbers I gave, they were peaks. They measured it from the card using better equipment than THG, and let's face it, they have a much better track record.

But again, TH were measuring 1 ms spikes, and it depends on whether TPU were really looking at that too.

Their average power consumption is in line with TPU's, and TPU have not shown any plots.

This has been done before - Intel did it with phone SoCs, which could briefly boost past their TDP spec to gain performance, but clocked down before the chip overheated.

Expect to see more and more of this from AMD and Nvidia as the process nodes get more and more expensive and problematic.

It also means I would only get a GTX750TI with a reasonable cooler - not an E-PEEN one, but probably something better than reference - or at least run the reference cooler at a higher speed. I have read enough about the GTX660TI, GTX760, Geforce Titan, R9 290 and R9 290X reference coolers throttling over extended periods in sub-optimal thermal conditions to not trust any of them with clock boosting technology.

Reviews need to start hot box testing of graphics cards over extended periods of 15 to 30 minutes - it took reviewers months to start pre-heating cards for short periods before benchmarks to try and negate the effects of earlier generations of GPU Boost. Personally, I think testing needs to be more stringent.

BTW, I predicted that issue in 2012 too! ;)

Running 30 second to 180 second benchmarks in an air-conditioned office on an open bench only helps both AMD's and Nvidia's clock boosting mechanisms, which are getting more and more extreme with each generation. It's a best-case scenario, especially since more people are moving to smaller cases, which are more thermally challenged.

Edit!!

Anyway we are going around in circles here.

We will need to agree to disagree here.
 
It doesn't seem to throttle so drastically - quite the opposite. This amazing hummingbird, even caged and passively cooled, doesn't drop more than 20-30 MHz (tested over 40 min of intense gaming).

Clock-Rate.png


Obviously this card is an ideal solution for all those tens of millions of OEM/white box PCs sold per year (loose power and space requirements), and more than enough for the gazillions of true PC gamers: WoT/WoWP, Dota, LoL, TL, FF:RR, War Thunder, WoW, SC, D3 etc., forming the very foundation of PC gaming (while letting those snobbish enthusiasts stay busy playing console ports). :)
 
It would be cool if they released a single slot version, as all the ones currently on sale have ridiculously oversized coolers, presumably just taken from the GTX650 in order to save costs.

Or better yet, imagine if a card company brought out a single slot PCI-E 1x card just for mining (with a 6-pin connector for power, obviously). Not likely to happen, but there was a time when we thought motherboards designed for mining sounded silly.
 
It doesn't seem to throttle so drastically - quite the opposite. This amazing hummingbird, even caged and passively cooled, doesn't drop more than 20-30 MHz (tested over 40 min of intense gaming).

Clock-Rate.png


Obviously this card is an ideal solution for all those tens of millions of OEM/white box PCs sold per year (loose power and space requirements), and more than enough for the gazillions of true PC gamers: WoT/WoWP, Dota, LoL, TL, FF:RR, War Thunder, WoW, SC, D3 etc., forming the very foundation of PC gaming (while letting those snobbish enthusiasts stay busy playing console ports). :)

So that is from the German arm of TH it seems:

http://translate.google.com/transla...passiv-kuhlung-umbau,testberichte-241505.html

I am actually happy someone did extended tests for once, and I am a bit more confident about it now - the GTX760, Geforce Titan and R9 290X tests were worrying.

Now look at the clockspeed drops. As time progresses they get worse and worse.

They start at 13 MHz for the first 20 minutes, and after that it gets progressively worse, with up to 40 MHz decreases.

So after another 20 to 40 minutes, it's only going to get worse.

However, the review uses quite a large third-party cooler, which was removed from an HD6670 (which consumes around 10W more than an HD7750 and is close to a GTX750TI).

rNEo5jX.jpg


AbBrSIx.jpg


OCtuBCG.png


Also, it was tested on an open-air test bench and not inside a case, and the ambient temperature is only 22C - probably in an office/test lab with climate control.

During summer (or with the heating on), inside a case it might overheat, especially if people keep their PCs on for a few days.

So a passive cooler might not be as good as an active one.

This is why I always test my SFF builds during the summer.

This is why (as I mentioned before) hot box testing needs to be done.

It's done for PSUs, so why not for graphics cards?

I don't see why anyone should be against it. Nobody complains if PSUs are tested this way.
 
That passively cooled card seems pretty over the top considering this is all you need to cool a card of this power:

evga-gt440_zps146aeaa9.jpg


Single slot and any noise will be drowned out by the CPU fan anyway.
 
a very boring card right there :p

1400 GFLOPS for 60 watts is absolutely amazing! 23 GFLOPS/W!

Intel's energy-efficient compute effort, the Xeon Phi from 2 years ago, manages a miserable 7 GFLOPS/W. Even Nvidia's current best effort, the 780 Ti, only does 20 GFLOPS/W. It's a pretty decent improvement.
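Just to sanity-check that maths (using only the figures quoted in these posts, not official spec sheets):

# Efficiency figure implied by the 1400 GFLOPS / 60 W quoted above
gtx750ti_gflops, gtx750ti_watts = 1400, 60
print(f"GTX 750 Ti: {gtx750ti_gflops / gtx750ti_watts:.1f} GFLOPS/W")
# GTX 750 Ti: 23.3 GFLOPS/W
# versus the quoted ~7 GFLOPS/W for the Xeon Phi and ~20 GFLOPS/W for the 780 Ti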
 
I have found your perfect GTX750TI then:

7KOACcZ.jpg


http://www.hardware.fr/news/13576/gtx-750-ti-low-profile-galaxy.html

It probably is noisy though with that dinky fan.

OTOH, I had an HD5670 single slot card in another SFF build.

Never again. The problem is that compact cases have crappy cooling, i.e. very little in the way of exhaust fans, and the hot air recirculates in a dead zone around the GPU, causing the fan to ramp up. None of the reviews really test these cards on anything but test benches or large ATX cases. Plus they accumulate dust easily, blocking up the fins and impairing the cooling even further.

The Zotac GTX750TI seems to be the same price as the reference card with a better cooler - it even boosts a bit higher too and consumes slightly less power.

I was pretty impressed by the Zotac GTX650TI Boost I found for a mate a few months ago for around £100. The cooler was excellent, and the card was compact too.

1400 GFLOPS for 60 watts is absolutely amazing! 23 GFLOPS/W!

Intel's energy-efficient compute effort, the Xeon Phi from 2 years ago, manages a miserable 7 GFLOPS/W. Even Nvidia's current best effort, the 780 Ti, only does 20 GFLOPS/W. It's a pretty decent improvement.

Considering how much VRAM the GTX780TI has, and that GDDR5 is not energy efficient, I suspect that makes the GTX780TI more impressive in some ways.

BTW, Knights Landing is being made on 14NM, and remember these are cut-down X86 cores. It might be easier to get closer to its theoretical performance than with some of the AMD and Nvidia compute cards. I believe ease of programming is what Intel is trying to sell.
 
I think Maxwell will do very well on mobiles and that's where I think NVIDIA is focusing next (e.g. no GTX 880 or whatever anytime soon).

I just can't get excited about a GPU when the buzz-word is 'efficiency' :(

Give us the nvidia 880gtxtilighteningblack edition ! :D

My thoughts exactly when I was reading the reviews :D I miss the days when we got new high-end products every 6-9 months that totally blew away the past generation.
 
What this is, is a laptop GPU, which uses a smaller die than the current chips, which are GK106 based.

The GK106 is a much bigger chip than the GM107 and uses more GDDR5 chips due to its 192-bit memory controller. Nvidia can cut production costs using a GM107 chip, and cut-down versions can get into thinner computers. However, as with all Kepler-based mobile GPUs, there will probably be no GPU Boost in mobile models.

I think Nvidia are aware of Iris Pro - Intel themselves were talking about the GTX650M if you remember.

They lost Apple contracts in lower-end models to Iris Pro too:

http://en.wikipedia.org/wiki/IMac_(Intel-based)#Slim_Unibody_iMac

If you look at the iMac, the 2012 model had all Nvidia GPUs. Last year Intel started eating away at the low end.

The same is happening with the MBP.

Rumours are that if Broadwell launches this year, it will probably be for Apple first. 14NM Iris Pro graphics would be a threat to the current Nvidia models deployed in the MBP and iMac, and could end up replacing them.

Notice how OpenCL performance has improved too. Nvidia has never really bothered with it, as they concentrated on CUDA. However, OpenCL is more important in the Apple ecosystem, so it fits that they would improve that too, especially in the face of improvements in current IGPs in this regard.

20NM looks expensive and might be delayed for both AMD and Nvidia too.

The next MBP and iMac update should be in the next two months or so, and a 28NM Maxwell would be a good sell for Nvidia.

The timing is close to the next Apple refreshes.

They can get traction before Broadwell appears.
 
I have found your perfect GTX750TI then:

Sadly it's still double slot (as can be seen from another angle), which is kind of a dumb move, but I guess they needed it to make it low profile.

Having said that though, doesn't the PCI-E spec say that low-profile 16x cards are supposed to identify themselves as low-power (25W) devices, not high-power (75W) ones? Odd.
 
For this particular test Igor was using an Enermax Fulmo ST enclosure with all the included case fans switched off.
So throttling by ~30-40 MHz (from 1160 to 1120) in a completely passive usage scenario (case plus card itself) isn't quite as shocking as in other well-documented cases (i.e. the 290/290X reference cards).
Even a tiny whiff of fresh air should easily remedy that. :)
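To put that drop in perspective (clocks taken straight from the figures above):

# Throttling from the quoted 1160 MHz down to 1120 MHz
boost_mhz, throttled_mhz = 1160, 1120
drop = boost_mhz - throttled_mhz
print(f"{drop} MHz, or {drop / boost_mhz:.1%} of the boost clock")
# 40 MHz, or 3.4% of the boost clock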
 
BTW, Knights Landing is being made on 14NM, and remember these are cut-down X86 cores. It might be easier to get closer to its theoretical performance than with some of the AMD and Nvidia compute cards. I believe ease of programming is what Intel is trying to sell.

Absolutely right CAT, it's a bit of apples and oranges comparing a GPU to the Phi, especially if double precision is important as then any Nvidia GPU other than the Titan is out the window. For single precision though it looks like Maxwell will be hard to beat by any other platform.
 