
GeForce 750Ti – Runs Battlefield 4 Better Than PS4 & Titanfall Better Than Xbox One

It's not better than what's in the PS4. The GPU in the PS4 is between a 7850 and a 7870. What you are seeing in BF4 on the PS4 was an afterthought ported to the console. Early games on newly released consoles are usually not great.

It's a bit different too.

TR said:
A 256-bit interface links the console's processor to its shared memory pool. According to Cerny, Sony considered a 128-bit implementation paired with on-chip eDRAM but deemed that solution too complex for developers to exploit. Sony has also taken steps to make it easier for developers to use the graphics component for general-purpose computing tasks. Cerny identifies three custom features dedicated to that mission:

An additional bus has been grafted to the GPU, providing a direct link to system memory that bypasses the GPU's caches. This dedicated bus offers "almost 20GB/s" of bandwidth, according to Cerny.
The GPU's L2 cache has been enhanced to better support simultaneous use by graphics and compute workloads. Compute-related cache lines are marked as "volatile" and can be written or invalidated selectively.
The number of "sources" for GPU compute commands has been increased dramatically. The GCN architecture supports one graphics source and two compute sources, according to Cerny, but the PS4 boosts the number of compute command sources to 64.

If developers take advantage of the PS4's apparently robust support for mixed GPU workloads, we could see more compute tasks being offloaded to the GPU in PC games. Let's hope developers don't rely too much on Sony's customizations, though.
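
Just to put that "almost 20GB/s" bypass bus in context, here is a quick back-of-envelope sketch. The 256-bit width is from the article above, but the GDDR5 data rate is the commonly quoted PS4 figure rather than anything stated there, so treat the result as a ballpark:

Code:
# Context for the "almost 20GB/s" cache-bypass bus described above.
# The GDDR5 data rate is the commonly quoted PS4 spec - an assumption
# here, not a figure from the article itself.
bus_width_bits = 256        # from the article
gddr5_gt_per_s = 5.5        # assumed: commonly quoted PS4 GDDR5 speed
bypass_bus_gb_s = 20        # "almost 20GB/s", per Cerny

main_bw_gb_s = bus_width_bits * gddr5_gt_per_s / 8   # bits per transfer -> bytes
print(f"Main shared-memory bandwidth: ~{main_bw_gb_s:.0f} GB/s")
print(f"Cache-bypass compute bus:     ~{bypass_bus_gb_s} GB/s "
      f"({100 * bypass_bus_gb_s / main_bw_gb_s:.0f}% of the main path)")

On those numbers the extra bus is a small side channel (roughly a tenth of the main path) for compute traffic that skips the GPU caches, not a second main memory pipe.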

It basically has the same shader layout as the Pitcairn XT GPU (1280 shaders, as in the R9 270 series), but with one cluster disabled, leaving 1152 shaders.

Plus, it appears compute has been significantly enhanced, and I suspect it has features which push memory bandwidth even further than the R9 270's.

On top of this, the lead consoles for BF4 were the PS3 and Xbox 360.
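
For anyone wondering where "between a 7850 and 7870" comes from, here is a rough peak-throughput comparison. The shader counts and clocks are the commonly quoted specs, and theoretical TFLOPS is only a ballpark, not a benchmark:

Code:
# Rough peak-FP32 comparison. Shader counts and clocks are the commonly
# quoted specs; theoretical FLOPS is a ballpark, not game performance.
gpus = [
    ("HD 7850 (Pitcairn Pro)", 1024, 0.86),   # shaders, clock in GHz
    ("PS4 GPU",                1152, 0.80),
    ("HD 7870 (Pitcairn XT)",  1280, 1.00),
]

for name, shaders, ghz in gpus:
    tflops = shaders * 2 * ghz / 1000.0       # 2 FP32 ops per shader per clock
    print(f"{name}: {shaders} shaders @ {ghz:.2f}GHz ~= {tflops:.2f} TFLOPS")

That puts the PS4 at roughly 1.84 TFLOPS, against about 1.76 for the 7850 and 2.56 for the 7870, which is exactly the "in between" positioning described above.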
 
Last edited:
Then surely this is the case with all GPUs except the 780Ti 6GB and 290X? Effectively, anything else is a diluted GPU (as you put it).

You are only looking at this from a performance perspective (which isn't a bad thing), but from a performance-per-watt perspective I see this as a very good GPU. I can see the appeal of both this and the 270/X, in truth.

It's interesting that not that long ago team red would bang on about power consumption, performance/watt and die size all day long. Suddenly these things don't matter.
 
It's not better than what's in the PS4. The GPU in the PS4 is between a 7850 and a 7870. What you are seeing in BF4 on the PS4 was an afterthought ported to the console. Early games on newly released consoles are usually not great.

Fair enough. I was just watching the vids showing the 750Ti performing admirably against the PS4 was all :)
 
Well, when we look at it like this,

http://www.overclockers.co.uk/showproduct.php?prodid=GX-188-GW&groupid=701&catid=1914&subcat=1854

£120 for that 750Ti

http://www.overclockers.co.uk/showproduct.php?prodid=GX-328-SP&groupid=701&catid=56&subcat=1982

£144 for that 270X

You can clearly see that the 750Ti is the price/performance king, and you will save even more money, as it is far more efficient than the 270X. I always picture people who buy these cards as price-aware, and a £24 saving is not to be sniffed at.

No it isn't. You do realise the GTX 750 Ti is still the slower card, meaning it will have to be replaced sooner? So over a few years you will be spending more on graphics cards overall.

KFA2 GTX660:

http://www.overclockers.co.uk/showproduct.php?prodid=GX-036-KF&groupid=701&catid=1914&subcat=2379

MSI R9 270:

http://www.overclockers.co.uk/showproduct.php?prodid=GX-245-MS&groupid=701&catid=56&subcat=1982

[benchmark graphs]


Both around £130. R9 270 cards have been even lower for short periods.

The R9 270 and GTX660 against a GTX750TI which has been pushed to its limits?? Really now - so what happens when you overclock those?

Moreover, have you even run a midrange card??

My GTX660, H67 mini-ITX motherboard and an 80W TDP Xeon E3 1220 (roughly a Core i5 2400), with an SSD and 2 HDDs, consume at most 200W at the wall during the combined test of 3DMark, using an XFX PRO 450W PSU, which is Bronze rated.

My whole rig can run off a decent 300W PSU.

So if you had a Gold or Platinum rated PSU and an IB or Haswell based CPU, it would mean even less.

That is peak power consumption. Usually it is around 150W to 180W at the wall.
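
For what it's worth, here is a rough sketch of why a decent 300W unit covers that sort of load. The efficiency figure is my own ballpark assumption for an 80 Plus Bronze PSU at around half load, not a measurement:

Code:
# Rough PSU sizing check, working back from the ~200W-at-the-wall peak
# quoted above. The efficiency figure is an assumption, not a measurement.
at_wall_peak_w = 200        # peak at the wall, 3DMark combined test
psu_efficiency = 0.85       # assumed for a Bronze unit at roughly half load

dc_load_w = at_wall_peak_w * psu_efficiency   # what the components actually draw
print(f"Peak DC load on the PSU:      ~{dc_load_w:.0f} W")
print(f"Headroom left on a 300W unit: ~{300 - dc_load_w:.0f} W")

Roughly 170W of real load at peak, which is why a quality 300W PSU is not as mad as it sounds.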

Power consumption and PSU e-peen are only there to make people spend silly amounts on PSUs they don't need.

PSU companies have made people gullible.

Anyone who has specced a rig on here will have at least a decent 300W+ PSU.

Moreover, I have run nothing but SFF mini-ITX and Shuttle SFF main rigs since late 2005.

You know, the kind of rigs where 250W to 500W PSUs are the most common??

I ran an unlocked 6800LE, i.e. basically a 6800GT, off a 250W Shuttle TFX PSU with an overclocked XP 2800.

I ran a massively overclocked E4300, loads of RAM, multiple drives and an overclocked 8800GTS 512MB off a 400W Shuttle SFF PSU, on a blasted 975X chipset which drank power.

I ran a massively overclocked Q6600, loads of RAM, multiple drives and an overclocked HD5850 1GB at 950MHz off a 450W Shuttle SFF PSU, again on a blasted 975X chipset which drank power.

Plenty of people are running decent cards off 350W to 450W Silverstone PSUs.

The top-end Valve Steam Box uses a Core i7 4770 and a GeForce Titan off a Silverstone 450W SFF PSU.

Those SFF PSUs are compact, group-regulated designs - decent for the size but not as good as full-sized PSUs.

That's just a few examples of what I have run in tiny boxes with limited cooling.


It's interesting that not that long ago team red would bang on about power consumption, performance/watt and die size all day long. Suddenly these things don't matter.

Another graph spam thread. Great.

It's funny that people who only use Nvidia cards ATM apparently have no right to criticise an Nvidia product, even when better Nvidia products are available for the same price.

So suddenly this faster GTX660 card for £130 doesn't count then:

http://www.overclockers.co.uk/showproduct.php?prodid=GX-036-KF&groupid=701&catid=1914&subcat=2379

Instead they write stuff like what you just said. Try harder next time.

I see a lot of GTX660 hate here. At its current price it's a great alternative to the R9 270.

But it is also quite funny. My GTX660, bought nine months ago, cost £140 with a free copy of a £30 game, i.e. Metro: Last Light, before launch.

Now we have £120 to £125 cards which are slower, consume a bit less power and come with no game.

Whoopty do!!

Newer must equal better value, right??
 
Last edited:
It's interesting that not that long ago team red would bang on about power consumption, performance/watt and die size all day long. Suddenly these things don't matter.

Layte, it is comical reading these forums sometimes. One minute we have "You can't use this review site, as they are so biased, it isn't a fair comparison", and then the same person who said that uses that very site for a comparison because it shows something they like...

You then get "Only fools spend that much on a GPU", and then when I do praise up a low-end card, I get told it is just a diluted-down top-end card...

I scratch my head sometimes :D

Anyway, my thinking still stands, and that is 'This is a good budget GPU' - ideal for a living room PC or a low-budget PC. I can see the frames to be expected from the Anandtech review, and I would be happy with those if I was on a budget (which I have been for many years) or if I wanted something for an HTPC/light gaming PC.


CAT, calm down fella please. :) It is all good debate and no need to come across as so aggressive :)
 
Last edited:
[Nvidia slide]


It's comical that you go on about the GTX 750 Ti being the price/performance monster while ignoring every benchmark of the GTX660, R9 270 and R9 270X out there, or even the fact that their power requirements are not high. Then, when presented with multiple benchmarks questioning what you said, you just reach for whatever excuse.

But guess what, Nvidia agrees with me that the GTX660 is faster.

That's one of their slides.

CAT, calm down fella please. :) It is all good debate and no need to come across as so aggressive :)

GPUs, it's serious business! ;)

But yeah, maybe I need me a cuppa.

Performance is stagnating again.

The £100 to £140 market for the last year has not really improved for most budget gamers.

Features like audio SoCs and a bit better power consumption are all OK, but in the end it's detracting from performance. Then all the magical PR in reviews starts straight from the review guides.

Looking at now versus nine months ago, it's not like price/performance has got any better.

Considering how far off the 20nm midrange is, it's going to be another year of flipping stagnation.

We don't even get many games with the cards.
 
Last edited:
It's funny that people who only use Nvidia cards ATM apparently have no right to criticise an Nvidia product, even when better Nvidia products are available for the same price.

Instead they write stuff like what you just said. Try harder next time.

I had CrossFire 7950s less than a year ago, chap.

Has anyone pointed out that this is a new architecture? Let's see how they compare after 6-8 months of driver optimisations.
 
It's interesting that not that long ago team red would bang on about power consumption, performance/watt and die size all day long. Suddenly these things don't matter.

Yes, if you change the arguments that used to be made, then they aren't the same ones being made now. Same performance for WAY more power usage: an issue. Double the performance for double the power usage: who cares?

Either way, if we go with, say, the GTX 480, power usage was the LAST in a LONG list of problems that made the card a bad choice.

It was way more expensive for very little extra performance, much louder, six months later than the 5870, and used a lot more power - 102W more than a 5870 in Crysis, and what, 43W more than a 5970.

http://www.anandtech.com/show/2977/...x-470-6-months-late-was-it-worth-the-wait-/19

http://www.anandtech.com/show/2977/...tx-470-6-months-late-was-it-worth-the-wait-/9

Yet it was massively slower than a 5970 and only marginally faster than a 5870.

Perspective, and not making up arguments that were never actually made.

I never once gave a damn about load power consumption, WITHIN REASON. I didn't want 3% more performance for 50% more power. 50% more performance for 100% more power I wouldn't care about, but if 50% more performance cost 100% more money I wouldn't want that card, as it would be bad value.

Significantly increased power consumption for the same speed: bad. Increased power consumption for more speed: not bad. Increased price for the same speed: bad. Increased price for more speed: not bad... there are limits on both. It's not a static argument; there is a crossover point on both scales where more performance versus price is or isn't worth it, and the same goes for power.

60W less for the 750 Ti than a 270 would be great if they offered the same performance; when the 270 is often anywhere between 30-50% faster (occasionally less or more), it isn't the issue. The bigger point is that if the 270 cost 30-50% more to offer that 30-50% extra performance, the 750 Ti would be a very good card. But the cheapest 750 Ti is only £20 less than the cheapest 270. In fact the 270 is around 15-16% more expensive (cheapest vs cheapest on OcUK), yet on average it is significantly more than 16% faster than the 750 Ti, making the 750 Ti poor value full stop.
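
Putting rough numbers on that: the prices below are approximate cheapest-vs-cheapest figures in the same ballpark as quoted above, and the 270's lead is assumed to be 40%, the midpoint of the 30-50% range - illustrative only, not a benchmark result:

Code:
# Illustrative price/performance comparison. Prices are approximate
# cheapest-vs-cheapest figures; the 270's 40% performance lead is an
# assumed midpoint of the 30-50% range quoted above.
cards = {
    "GTX 750 Ti": {"price": 115.0, "perf": 1.00},
    "R9 270":     {"price": 133.0, "perf": 1.40},   # assumed +40% performance
}

for name, c in cards.items():
    print(f"{name}: {c['perf'] / c['price'] * 100:.2f} performance units per £100")

price_gap = cards["R9 270"]["price"] / cards["GTX 750 Ti"]["price"] - 1
perf_gap = cards["R9 270"]["perf"] - 1
print(f"The 270 costs {price_gap:.0%} more but is assumed to be {perf_gap:.0%} faster")

On those assumptions the 270 delivers noticeably more performance per pound, which is the crossover-point argument in numbers.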

Performance wins; power comes second on the list, or for me, fourth. Performance and price are joint first, noise third, power fourth. If I can't choose between cards based on 1-3, then the fourth becomes a factor...

For instance, the £290 290 smashed the 290X, 780, 780 Ti and Titan on price/performance... it had already won for me before I even factored in noise or power. If that hadn't already made up my mind: its noise is only a bit above a 780 Ti's; it is louder than, but faster than, a 780; power usage is similar to a 780 Ti's and more than a 780's, but not by enough on either count to make me want one card over the other. If the 290X cost £600 and the 290 £400+ then I might consider alternative cards, but at £300, on my primary criterion for rating a card - price/performance - it wins hands down.
 
GPUs, it's serious business! ;)

But yeah, maybe I need me a cuppa.

Performance is stagnating again.

The £100 to £140 market for the last year has not really improved for most budget gamers.

Features like audio SoCs and a bit better power consumption are all OK, but in the end it's detracting from performance. Then all the magical PR in reviews starts straight from the review guides.

Looking at now versus nine months ago, it's not like price/performance has got any better.

Considering how far off the 20nm midrange is, it's going to be another year of flipping stagnation.

We don't even get many games with the cards.

I completely agree that performance is stagnating, but when I look at the performance of this card and its power usage, and then look to the future with Maxwell fully on 20nm, I can't help but get a little excited. I know Boom was saying that the 880GT will be a monster, and the more I think about it, the more I am inclined to agree (after seeing what this 750Ti can do with such low power usage).

And as Layte has said, this is a new arch - look what happened with AMD and the 12.11 drivers/7XXX series. There could be some hidden performance yet to be seen (I don't expect miracles, of course). Time will tell, and from a selfish POV I am only seriously interested in buying the top-end next-gen GPUs.
 
What a monumental mess-up MS and Sony have made by putting AMD cards into their machines. Struggling at 720p, 900p, 1080p etc. on next-gen consoles is a joke. Up pops Nvidia with an immense card for the money and makes a complete fool of them.

Say this again in about two years, when devs start to fully use the consoles to their max. Compare a PS3 launch game with GTA5 - this always happens with new consoles, and it takes time for the devs to get the most out of them.

Don't judge what the next-gen consoles can offer at their max on launch titles.

You'd be very wrong!
 
I had CrossFire 7950s less than a year ago, chap.

Has anyone pointed out that this is a new architecture? Let's see how they compare after 6-8 months of driver optimisations.

It isn't that new.

[GK208 diagram]


That is second-generation Kepler, i.e. the GK208. That is the GPU in the GT640 revision 2, which was at least 10% more efficient than the GK107-based version. Hardly anyone reviewed it, though.

Remember, it also had no boost clocks, so ultimately it was not fully exploited.

It used shared texture clusters too, like the GM108.

[GK208 and GM108 diagrams]


See the evolution to the GM108.

The move from Fermi to Kepler was much greater, even down to the use of software scheduling instead of the hardware-based scheduling in Fermi.

The thing to consider, especially with the GM108, is that it only has one GPC, which means no interconnect.

The GF100 had power problems due to the interconnect, indicating that it adds a decent amount to the power consumption (the GF110 fixed it somewhat, but also had better power-saving gubbins).

The problem is that, compared to some of the marginally more expensive cards, there are large deficits in performance at times.
 
Last edited:
Edit ^^^ beaten to it.

I completely agree that performance is stagnating, but when I look at the performance of this card and its power usage, and then look to the future with Maxwell fully on 20nm, I can't help but get a little excited. I know Boom was saying that the 880GT will be a monster, and the more I think about it, the more I am inclined to agree (after seeing what this 750Ti can do with such low power usage).

And as Layte has said, this is a new arch - look what happened with AMD and the 12.11 drivers/7XXX series. There could be some hidden performance yet to be seen (I don't expect miracles, of course). Time will tell, and from a selfish POV I am only seriously interested in buying the top-end next-gen GPUs.
It's not a new arch, not like Tahiti was over Cayman; it's the Kepler arch rearranged with a few tweaks.

It's more like Tahiti vs Hawaii, the same GCN architecture rearranged and tweaked (GCN 1.2). :)
 
Last edited:
I completely agree that performance is stagnating, but when I look at the performance of this card and its power usage, and then look to the future with Maxwell fully on 20nm, I can't help but get a little excited. I know Boom was saying that the 880GT will be a monster, and the more I think about it, the more I am inclined to agree (after seeing what this 750Ti can do with such low power usage).

And as Layte has said, this is a new arch - look what happened with AMD and the 12.11 drivers/7XXX series. There could be some hidden performance yet to be seen (I don't expect miracles, of course). Time will tell, and from a selfish POV I am only seriously interested in buying the top-end next-gen GPUs.

The thing is, the pricing is the problem - it's way too close to faster cards. Maybe, yeah, they can eke out some more performance here and there, but if anything the same could happen with the other cards.

But people on low budgets need to get the best performance, even if it means a few watts more at the wall. They are unlikely to upgrade very often, so I prioritise performance first, and for them £500 to £550 is a decent amount to spend on a desktop.

I have personally helped convert some console gamers over!! :)

However, most people prefer tablets and laptops, so a gaming desktop is another computer they use sometimes, on top of their other devices, at extra cost. Plus the consoles are big competition due to ease of use; console gamers know PCs are better, but the convenience of the walled-garden approach wins, and they know they don't need to upgrade the hardware for years. Most people are clueless about building or upgrading computers, and aren't that bothered either.

Consoles are the Apple of gaming, and PCs the Android.

PS:

About Maxwell on 20nm - I hope so.

Good-performing high-end parts = a decent midrange.

It will also force AMD to compete on performance and/or price/performance.

We need another decent jump.
 
Last edited:
@Cat, why do you compare it to a 660, then try and use Nvidia's own slides that say it's not meant to replace that market? Surely during the Maxwell design, when they told you the core limits and how it would be released at clocks already pushed to the limits, they mentioned that it wasn't a 660 replacement - it was replacing the lower 650 Ti??

Side note: is it possible for you to refrain from spamming the same graph three times on one page in the future?? Thank you.

Simply speaking, for the power it uses it does a nice job - since you didn't seem to realise, it was meant to go against the 260X, which it does well while using lower wattage.
 
Last edited:
@Cat, why do you compare it to a 660, then try and use Nvidia's own slides that say it's not meant to replace that market? Surely during the Maxwell design, when they told you the core limits and how it would be released at clocks already pushed to the limits, they mentioned that it wasn't a 660 replacement - it was replacing the lower 650 Ti??

Simply speaking, for the power it uses it does a nice job - since you didn't seem to realise, it was meant to go against the 260X, which it does well while using lower wattage.

http://www.overclockers.co.uk/showproduct.php?prodid=GX-036-KF&groupid=701&catid=1914&subcat=2379

http://www.overclockers.co.uk/showproduct.php?prodid=GX-248-MS&groupid=701&catid=1914&subcat=1854

They are both competing in pretty much the same price bracket, even though the GTX660 is a good deal faster. I think that's what Cat is getting at, among other good points. Would you not buy a GTX660 if you were looking for a gaming card, or would you choose a GTX 750 Ti just because it uses less power, and be forced to upgrade sooner? Without reading through the thread again, Cat worked out that with normal gaming hours per year the GTX 750 Ti would save you about £8 per year, which is not much. As a gaming card it does not make much sense versus the more powerful cards that are near the 750 Ti's price bracket.
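
For reference, that kind of figure drops straight out of a simple running-cost sum. The wattage gap, gaming hours and unit price below are my own illustrative assumptions, not Cat's exact inputs:

Code:
# Rough yearly running-cost difference between the two cards. All three
# inputs are illustrative assumptions.
power_gap_w = 60             # assumed average in-game power difference
hours_per_year = 2.5 * 365   # assumed ~2.5 hours of gaming a day
price_per_kwh = 0.14         # assumed ~14p/kWh for UK electricity at the time

kwh_saved = power_gap_w / 1000.0 * hours_per_year
print(f"~{kwh_saved:.0f} kWh a year, costing about £{kwh_saved * price_per_kwh:.2f}")

Around £8 a year on those assumptions, so it would take many years of use before the efficiency saving covered the price gap to a faster card.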
 
Last edited:
http://www.overclockers.co.uk/showproduct.php?prodid=GX-036-KF&groupid=701&catid=1914&subcat=2379

http://www.overclockers.co.uk/showproduct.php?prodid=GX-248-MS&groupid=701&catid=1914&subcat=1854

They are both competing in pretty much the same price bracket, even though the GTX660 is a good deal faster. I think that's what Cat is getting at, among other good points. Would you not buy a GTX660 if you were looking for a gaming card, or would you choose a GTX 750 Ti just because it uses less power, and be forced to upgrade sooner?

He might be comparing it to that, but that's not where Nvidia placed it.
Card prices often change as the market changes, as cards get older, and due to sales.
That's why, even in your own example, the 660 was £155.99 inc VAT and ATM it's on sale at £129.95 inc VAT.
http://www.overclockers.co.uk/showproduct.php?prodid=GX-228-MS&groupid=701&catid=56&subcat=1866

MSI Radeon R7 260X OC 2048MB GDDR5 PCI-Express Graphics Card - £109.99 inc VAT

The 750 Ti is £113.99 inc VAT. I'd hardly call that the same price bracket, tbh.
I don't see long threads with graph spam stating how the R7 260X is a terrible card and why people shouldn't buy it, despite it being around the 750 Ti's price.
Would I personally shop around and grab a deal on a faster card? Yes.
Would I expect a deal on a brand new card? No.
Early adopters often end up paying a price premium, but this time round the premium is at the lower end, not the high end.
As I said at the start, it is a nice card and gives a good gain over what it's replacing, for less power.
Sorry for the long post. I recommend both AMD and Nvidia and run both in my own and my family's machines. It's just annoying how certain people on both sides often seem to push an agenda against certain vendors whenever a card is mentioned or released.
 