
** The Official Nvidia GeForce 'Pascal' Thread - for general gossip and discussions **

That is a bonkers power setup right there. Apart from one-upping the competition it's pointless, from my understanding of the 1080's power requirements.

Just want to point out, the power limit for a card comes from the VRM/MOSFET setup. Back when GPUs were being made in the 100-120W range, the PCI-e plugs were rated at 75/150W. Later on, as cards got more powerful, the PCI-e standards body specifically stated that those ratings were guidelines and you can really draw as much as you want from any cable, as long as the card can deal with it and the PSU is capable of supplying it.

The biggest difference between reference and custom cards will, hopefully for users, be the custom cards having extra MOSFETs/VRM phases installed on the PCB. There are numerous cards that over- or under-spec the PCI-e plug guidelines, e.g. 500W dual-GPU cards that supposedly only have 375W of available power (two 8-pin plugs and the 75W slot).

You can have three PCI-e plugs and, let's say, a supposed 525W of available power, but if the VRMs can only supply 200A at 1.2V then the card isn't going to get remotely close to 525W of power usage.
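
The arithmetic behind that claim can be sketched with the figures from the post (the VRM current limit of 200A at 1.2V is the poster's hypothetical, and the 75W/150W connector figures are the usual guidelines):

```python
# Hypothetical numbers from the post above: the "available power" implied by
# slot + plugs versus what the VRM stage can actually deliver (P = I * V).
SLOT_W = 75            # PCI-e slot guideline
EIGHT_PIN_W = 150      # 8-pin plug guideline

plugs = 3
available_w = SLOT_W + plugs * EIGHT_PIN_W      # 75 + 3*150 = 525 W

vrm_current_a = 200    # assumed VRM current limit
core_voltage_v = 1.2
vrm_limit_w = vrm_current_a * core_voltage_v    # 200 * 1.2 = 240 W

# The card is capped by whichever limit is lower - here, the VRM, not the plugs.
print(min(available_w, vrm_limit_w))  # 240.0
```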

MSI are doing what you said: attempting to one-up the competition by appealing to the general user's belief that power delivery is limited by the PCI-e plugs rather than by the VRM/MOSFET power circuitry on the board.
 
Been waiting for the Anandtech review of the 1070 to go up, but noticed it was already in the Bench database. Ran the numbers against the 1080 to see how they stack up at 4K.

http://www.anandtech.com/bench/product/1731?vs=1714

Code:
Anandtech GTX 1070 vs GTX 1080 Founders Edition 4K benchmarks (% difference)
26.1 Rise of the Tomb Raider (DX11)
18.6 Dirt Rally
22.4 Ashes of the Singularity (DX12)
24.5 Battlefield 4
27.1 Crysis 3
25.1 The Witcher 3
34.1 The Division
23.4 Grand Theft Auto V
27.1 Hitman (DX11)

25.4 Average
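
As a quick sanity check, the nine per-game deltas listed above can be averaged directly (figures copied from the table as posted):

```python
# Recompute the mean of the nine per-game deltas quoted from Anandtech Bench.
deltas = [26.1, 18.6, 22.4, 24.5, 27.1, 25.1, 34.1, 23.4, 27.1]
mean = sum(deltas) / len(deltas)
print(round(mean, 1))  # 25.4
```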

Even with these cards, frame rates are so low at 4K. I do wonder why people put themselves through that torture :)
 
Since the MSI 1080 Gaming X is already £650, I'm assuming the Lightning will be close to or over £700. Or MSI needs to adjust all their prices more appropriately.

Yeah, there is a huge gap from the Armor to the Gaming X. Considering they both use the same PCB, the Gaming X should really be around £600 like the other aftermarket cards.
 
It's weird because the MSI Gaming cards have usually been quite reasonable, price-wise.

Though it could be a case of retailers knowing they're possibly the most popular choice and predicting higher demand for them (thus raising prices on them a bit)? Maybe not, just a thought.
 
As I said, improvements are becoming harder and more expensive to come by.

No, you're mistaken in thinking that because a company is making a decent profit and selling a lot of product, it will somehow decide to slash prices out of the goodness of its heart - and that overall profits = individual profit margins.

Nvidia is doing so well not only because they're making a profit on the cards they produce, but also because they've managed to secure a very big majority of the marketshare in that space of time. It was not previously as big as it is now. And with the growth of the PC gaming sector, it means they are selling a LOT more cards than they were before. Even moderate profit margins across mass quantities add up to a LOT of overall profit.

You are confusing margins with profits; they are not the same. Nvidia's margins were in the 30s around 5 years ago and are now nearly 60% or thereabouts. The margin is the average return on the products sold, and total profit also depends on the volume of units sold - people forget Nvidia managed to reduce the production costs on lots of its Maxwell cards compared to the Kepler cards they replaced.

Something like the large-volume midrange GTX960 ended up selling for more than a GTX760, with a GPU a third smaller, half the RAM chips and a cheaper PCB. Nvidia is making far greater margins on a GTX960 than on a GTX760.

This proves my point: people keep saying costs are going up, which is why cards will cost more. They are not, otherwise margins would tank or stay flat. They didn't tank during the latter parts of Kepler and Maxwell, since Nvidia not only probably dropped costs but also had decent selling prices for many SKUs.

Nvidia grew margins with Kepler over Fermi despite people saying 28nm price increases were justified by costs; margins still went up as Nvidia successfully used the Titan and Ti strategy to increase ASPs. Margins were not flat.

AMD's margins are much lower because every competing card they sell costs more to make than what Nvidia is throwing at them, and these are being sold at similar or lower prices.

People making all these excuses, that prices should go up since costs are going up, are disproved when companies' margins go up.

If price increases were down to increased costs, margins would stay flat. If you don't believe me, look at thousands of other companies whose financial reports say "despite ASPs going up, margins stayed flat due to increased costs".
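
The flat-margin argument above can be put in arithmetic terms. All prices below are made up purely for illustration:

```python
# Gross margin = (price - cost) / price. If a price rise merely passed on a
# cost rise, the margin would stay flat; a rising margin means prices rose
# faster than costs. The figures here are invented for illustration.
def margin(price, cost):
    return (price - cost) / price

before       = margin(price=250, cost=150)  # ~0.40
costs_passed = margin(price=300, cost=180)  # costs +20%, price +20% -> still ~0.40
costs_flat   = margin(price=300, cost=150)  # price up, costs flat -> 0.50
print(before, costs_passed, costs_flat)
```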

This is why Apple is such a profitable company - they have massive margins and decent volume.

Companies like Samsung sell far more phones and tablets than Apple but the margin is much lower so they do not make as much profit per unit sold,but are very profitable.
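
The margin-versus-volume trade-off described above (the Apple-style versus Samsung-style approach) can be sketched with made-up numbers:

```python
# Total gross profit = units sold * average selling price * gross margin.
# All figures below are invented purely to illustrate the trade-off.
def gross_profit(units_m, asp, gross_margin):
    return units_m * asp * gross_margin  # same currency units as asp

high_margin_play = gross_profit(units_m=200, asp=700, gross_margin=0.38)  # fewer units, fat margin
high_volume_play = gross_profit(units_m=450, asp=300, gross_margin=0.17)  # many units, thin margin
print(high_margin_play, high_volume_play)
```

With these particular numbers the high-margin seller ends up with more total profit despite selling fewer than half as many units, which is the point being made.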

People are not being strong consumers when they just make up excuses for companies to milk them more.

The worse and worse increases in GPU performance per dollar will push more people to consoles, push the remaining people to spend more and more on cards, and make people keep their cards longer and longer. Take away all the enthusiasts and E-PEENERs, and look at where most of the market is.

Too many people on tech forums think standard gamers will just change out a £250 to £400 card every 12 to 18 months. The more the average gamer spends on their card, the longer they will want to keep it, and this overpricing strategy AMD and Nvidia are trying will end up backfiring in the next few years. The thing is, short term it will work, but as more and more people cotton on, they will come to that realisation quicker.

Plus, as the cheaper cards get even crapper in the performance stakes, devs will be less and less inclined to really push the PC as a major platform, since most people will still have relatively anaemic graphics in their PC. This is why The Witcher 3, which should have been the next Crysis after 8 years, ended up looking nowhere near as good as it should have.

Enthusiasts don't care, since an extra £100 or £200 here is nothing for their main hobby.

I am not arguing with you any more about this.
 
Since when can you get a GPU with 4 free games? :eek: I thought they spaced out the free games depending on when you buy

It's a long story. I was watching a 4-letter competitor's website for the Classy to hit the shelf; it was listed at £619. I got down there, and as I went to check if they had any in stock it was listed as out of stock, so I went to the desk and explained to the lady that it had been showing in stock just an hour ago and I'd travelled 40 miles to come and pick it up. She was lovely - went round the back and got me one. She then got talking to me about the SC version, which was something like £575, and she put that price through by accident, so she ended up letting me have the Classy for £575 instead of the £619 - so there's £45 saved. And then she had some codes for the old games left, so she sent all 4 to my email. I was chuffed :D
 