
Official GTX750TI review thread

Looking at the reviews, the GTX750Ti pretty much replaces the GTX650Ti Boost 2GB at around the same price but with slightly lower performance, and the only real benefit is that it's more efficient, with lower power consumption?

Did Nvidia pull an Intel, prioritising mobile/laptop instead of PC? Desktop users want performance, not saving a few quid a year off their electricity bill.
 
Looking at the Hexus review, the Pitcairn GPU used in the R9 270X is 212mm² including the 256-bit memory controller, and the GM107 is 148mm² including a 128-bit memory controller. Yet in the C3 and BF4 benchmarks the R9 270X is around 35% to 45% faster for a 43% increase in die area. Hence, in performance/mm² Maxwell does not really look massively better IMHO.
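As a rough sanity check on that claim, here's a back-of-the-envelope calculation; the 40% performance lead is an assumed midpoint of the 35% to 45% range quoted above, not a measured figure:

```python
# Back-of-the-envelope performance/mm² comparison from the figures above.
# The 40% performance lead is an assumed midpoint of the 35-45% range.

r9_270x_area = 212.0  # mm², Pitcairn incl. 256-bit memory controller
gm107_area = 148.0    # mm², GM107 incl. 128-bit memory controller
perf_lead = 0.40      # assumed R9 270X performance advantage

area_increase = r9_270x_area / gm107_area - 1

# Normalise GM107's performance to 1.0, so the R9 270X scores 1.40.
gm107_perf_per_mm2 = 1.0 / gm107_area
pitcairn_perf_per_mm2 = (1.0 + perf_lead) / r9_270x_area

print(f"Die area increase: {area_increase:.0%}")  # ~43%
print(f"GM107 perf/mm² advantage: "
      f"{gm107_perf_per_mm2 / pitcairn_perf_per_mm2:.2f}x")  # ~1.02x
```

With those assumptions GM107 only comes out a couple of percent ahead per mm², which is the point being made.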

The Bonaire XT GPU is 160mm² and has a 128-bit memory controller - the only problem is that it includes a Tensilica audio DSP, which takes up die area and consumes extra power too.

Despite what some reviews are indicating about performance/mm², I don't see Maxwell being drastically better.

However, performance/watt has improved a decent amount, and it makes the GTX750TI a nice but expensive card for certain SFF builds IMHO.

OTOH, regarding the improvements in power consumption at a design level, I expect AMD will do the same for GCN2. Nobody expected the massive improvement in compute performance when GCN was launched, so some of the reviews might be a bit premature about the chances AMD has in the next year or so.

Their Jaguar and Kaveri replacements are both still on 28nm, and leaks indicate the Kaveri replacement, Carrizo, has a reduction in maximum TDP to 65W from Kaveri's 95W.

I can see this being down to additional performance/watt improvements for the GPU, which is the main reason why Kaveri consumes less power than Trinity.
 
a very boring card right there :p
with the exception of the 290, we could go back to Feb last year and nothing has really changed
if I was a hedgehog I'd go back into hibernation :)
 
Did Nvidia pull an Intel, prioritising mobile/laptop instead of PC? Desktop users want performance, not saving a few quid a year off their electricity bill.

Not at all, this card clearly isn't aimed at that market. The big cards will come, and when they do they will be more efficient, use less power, produce less heat and thus overclock even better. Maxwell arch looks awesome.
 
Not at all, this card clearly isn't aimed at that market. The big cards will come, and when they do they will be more efficient, use less power, produce less heat and thus overclock even better. Maxwell arch looks awesome.
I know the card itself isn't, but it does seem like the starting/reference point for them to work toward more scaled-down mobile GPUs.

The card shouldn't be positioned in the same price bracket as the GTX650Ti Boost... it should really be positioned at around £80~£100 at most, taking the bus-powered king crown from the 7750.
 
a very boring card right there :p
with the exception of the 290, we could go back to Feb last year and nothing has really changed
if I was a hedgehog I'd go back into hibernation :)

If these were priced well, I would be more excited.

However, the GTX750TI is at least £115 at many retailers.

The problem is that the R7 260X is having price cuts putting it well under £100 (from what I gather), and the R7 265 will match the price of the GTX750TI.

If this was like £80 to £90, it would have been a different story IMHO.

R9 270 and GTX660 cards can be had for around £130 (or less), and will run fine off a decent 330W to 400W PSU. If the R7 265 comes in at under £120, it will face stiff competition.

My whole computer, including a Xeon E3 1220 (Core i5-class) and a GTX660, consumes around 200W at most from the wall socket when running games.
 
If these were priced well, I would be more excited.

However, the GTX750TI is at least £115 at many retailers.

The problem is that the R7 260X is having price cuts putting it well under £100 (from what I gather), and the R7 265 will match the price of the GTX750TI.

If this was like £80 to £90, it would have been a different story IMHO.

R9 270 and GTX660 cards can be had for around £130 (or less), and will run fine off a decent 330W to 400W PSU. If the R7 265 comes in at under £120, it will face stiff competition.

My whole computer, including a Xeon E3 1220 (Core i5-class) and a GTX660, consumes around 200W at most from the wall socket when running games.
£90 plus a low-profile version would probably be appealing to people who wish to do some casual gaming on an HTPC in the living room on the TV, but £115+ is a bit too much (plus there's no low-profile version yet).
 
Today, four GM107-based cards in a mining rig should be able to outperform a Radeon R9 290X for less money, using less power.

[Chart: Litecoin mining hash rates]

[Chart: peak power consumption]


~1000 kh/s @ 270W @ £450

This might somehow ease the price gouging in the US.
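Working through just the rig figures quoted above (the R9 290X comparison would need its own hash rate and power numbers, which aren't given here):

```python
# Efficiency of the quoted four-card GM107 mining rig:
# ~1000 kh/s total at 270W wall draw for £450 of cards.
hashrate_khs = 1000.0  # total scrypt hash rate
power_w = 270.0        # wall power draw
cost_gbp = 450.0       # cost of the four cards

print(f"{hashrate_khs / power_w:.2f} kh/s per watt")    # ~3.70
print(f"{hashrate_khs / cost_gbp:.2f} kh/s per pound")  # ~2.22
```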
 
£90 plus a low-profile version would probably be appealing to people who wish to do some casual gaming on an HTPC in the living room on the TV, but £115+ is a bit too much (plus there's no low-profile version yet).

Agreed, but even though TH loved the card, I saw the following in their review:

http://media.bestofmicro.com/2/W/422600/original/01-GTX-750-Ti-Complete-Gaming-Loop-170-seconds.png

TH says the card consumes 60W on average, but the problem is how far the power consumption can deviate from the median.

It looks like the card might be boosting massively during benchmarks. However, the problem is that, for consistency, most benchmark runs on sites are 30 to 60 seconds long, not extended periods of over 10 minutes.

The non-deterministic Boost Nvidia uses now has no upper limit and is also temperature sensitive.

We saw with the reference GTX760 (and to a lesser degree the reference GTX660TI) that the cards started throttling as they got hotter over time. HT4U saw their GTX760 downclock by 137MHz on average after 15 minutes.

The boost is very aggressive.

The HD7850 and R7 265 stay at maximum clockspeed during gaming, so it does worry me that real-world GTX750TI performance may not be as high with the weedy coolers many of the cheaper cards will have.

The better-cooled models should be OK, but just like the Geforce Titan and the R9 290 series (which use a different type of boost to other AMD cards), we really need extended tests on ALL cards being released now.

These boosting mechanisms can overinflate scores if only short benchmark runs are used.
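The short-run-vs-sustained point can be illustrated with a toy model of a temperature-sensitive boost clock that decays as the cooler saturates; all the clock numbers here are invented for illustration, not measured from any card:

```python
# Toy model: boost clock starts high and decays toward a sustained clock
# as the card heats up. Clock figures are made up for illustration.

def clock_at(t):
    """Boost clock (MHz) after t seconds of load: 1150 MHz cold,
    settling toward ~1020 MHz once the cooler saturates."""
    return 1020 + 130 * (0.98 ** t)

def avg_clock(duration_s):
    """Average clock over a benchmark run of the given length."""
    return sum(clock_at(t) for t in range(duration_s)) / duration_s

print(f"60s avg:  {avg_clock(60):.0f} MHz")   # typical short benchmark run
print(f"900s avg: {avg_clock(900):.0f} MHz")  # 15-minute sustained run
```

The 60-second average sits well above the sustained clock, which is exactly how short benchmark windows can overstate real-world performance.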
 
Agreed, but even though TH loved the card, I saw the following in their review:

http://media.bestofmicro.com/2/W/422600/original/01-GTX-750-Ti-Complete-Gaming-Loop-170-seconds.png

TH says the card consumes 60W on average, but the problem is how far the power consumption can deviate from the median.

It looks like the card might be boosting massively during benchmarks. However, the problem is that, for consistency, most benchmark runs on sites are 30 to 60 seconds long, not extended periods.

The non-deterministic Boost Nvidia uses now has no upper limit and is also temperature sensitive.

We saw with the reference GTX760 (and to a degree the GTX660TI) that the cards started throttling as they got hotter over time. HT4U saw their GTX760 downclock by 137MHz on average after 15 minutes.

The HD7850 and R7 265 stay at maximum clockspeed during gaming, so it does worry me that real-world GTX750TI performance may not be as high with the weedy coolers many of the cheaper cards will have.

The better-cooled models should be OK, but just like the Geforce Titan and the R9 290 series (which use a different type of boost to other AMD cards), we really need extended tests on ALL cards being released now.

These boosting mechanisms can overinflate scores if only short benchmark runs are used.
That's kinda odd... the card is bus-powered and requires no additional PCI-E 6-pin according to reviews... I thought cards can only draw 75W max from the PCI-E slot?
 
Looking forward to the GTX 880. Have a feeling it's going to be very very very good..
So long as it doesn't follow the trend and direction of the GTX750Ti, or we might somehow end up with a £360-£400 GTX880 with a 256-bit memory bus, ~3GB VRAM, slightly slower than the GTX780 but with awesome power efficiency, consuming up to 150W instead of 268W and requiring a single PCI-E 6-pin instead of one 6-pin and one 8-pin :p
 
Agreed, but even though TH loved the card, I saw the following in their review:

http://media.bestofmicro.com/2/W/422600/original/01-GTX-750-Ti-Complete-Gaming-Loop-170-seconds.png

TH says the card consumes 60W on average, but the problem is how far the power consumption can deviate from the median.

It looks like the card might be boosting massively during benchmarks. However, the problem is that, for consistency, most benchmark runs on sites are 30 to 60 seconds long, not extended periods of over 10 minutes.

The non-deterministic Boost Nvidia uses now has no upper limit and is also temperature sensitive.

We saw with the reference GTX760 (and to a lesser degree the reference GTX660TI) that the cards started throttling as they got hotter over time. HT4U saw their GTX760 downclock by 137MHz on average after 15 minutes.

The boost is very aggressive.

The HD7850 and R7 265 stay at maximum clockspeed during gaming, so it does worry me that real-world GTX750TI performance may not be as high with the weedy coolers many of the cheaper cards will have.

The better-cooled models should be OK, but just like the Geforce Titan and the R9 290 series (which use a different type of boost to other AMD cards), we really need extended tests on ALL cards being released now.

These boosting mechanisms can overinflate scores if only short benchmark runs are used.

That's an interesting observation.

Power consumption is good though; depending on the review it's 25 to 40 watts less than the similarly performing 260X. Looks like about 30% better efficiency than Bonaire.

Performance is not so good in its price range.

It will be interesting to see how AMD respond to Maxwell.
 
So long as it doesn't follow the trend and direction of the GTX750Ti, or we might somehow end up with a £360-£400 GTX880 with a 256-bit memory bus, ~3GB VRAM, slightly slower than the GTX780 but with awesome power efficiency, consuming up to 150W instead of 268W and requiring a single PCI-E 6-pin instead of one 6-pin and one 8-pin :p

What's impressive about Maxwell so far is how much more efficient they've made it on the same process; that bodes very well for a cool-running, non-power-hungry card, meaning they can concentrate on the 20nm shrink to maximise performance.
 
What's impressive about Maxwell so far is how much more efficient they've made it on the same process; that bodes very well for a cool-running, non-power-hungry card, meaning they can concentrate on the 20nm shrink to maximise performance.

^^ This.

The GTX 880 will perform better than the older 7XX cards, but also run cooler and quieter, have more OC room, and use less power. All good things.

My ti's will be on MM as soon as GTX 880 launches lol :p
 