Discussion in 'Graphics Cards' started by Gibbo, Apr 16, 2019.
I'm not saying it's worth it, just that the cooling is the USP.
Some seriously cheap performance at 1080p on both sides
I started PC gaming 20 years ago and my first graphics card was an Nvidia Riva TNT2 for £100 and that was high end. The only card on the market that could beat it was the Ultra version for £150 but it wasn't worth it for such a small increase in FPS.
Six months later, Nvidia released the GeForce 256 for over £250, and that was when high-end GPUs got expensive, because that's £425.59 in today's money when accounting for inflation. So as far as I'm concerned, high-end GPUs have been too expensive for 20 years.
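For what it's worth, that inflation adjustment is just a scaling of the 1999 price. A quick sketch below; note the ~1.70 cumulative factor is inferred from the two figures quoted above (£250 then, £425.59 now), not taken from actual CPI data:

```python
# Sketch of the inflation adjustment implied by the post above.
# ASSUMPTION: the cumulative inflation factor is derived from the
# quoted figures (425.59 / 250.00 ~= 1.702), not from real CPI tables.

def adjust_for_inflation(price_then: float, factor: float) -> float:
    """Scale a historical price by a cumulative inflation factor."""
    return round(price_then * factor, 2)

GEFORCE_256_LAUNCH_PRICE = 250.00          # GBP, 1999 (from the post)
CUMULATIVE_INFLATION = 425.59 / 250.00     # ~1.702, implied by the post

print(adjust_for_inflation(GEFORCE_256_LAUNCH_PRICE, CUMULATIVE_INFLATION))
# prints 425.59
```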
Yes, I absolutely agree with you 100%. People seem to complain when the absolute best is too expensive. I've never looked at buying a suit and got annoyed that high-end suits cost over £1000. I've never looked at buying a pen and complained that a high-end Parker pen costs £2,525.00. Instead I just buy a Bic biro for 40p; nobody can justify paying that for a pen. You can look at any product, whether it's a tent for camping, a bed to sleep in, or a watch to tell the time: the high end will always be outrageously expensive. I don't think there's a high-end product in the world that's "worth it". They're aimed at people with a lot of cash who want to splash out. What's wrong with that?
If you don't have the money, buy something cheaper. Where's the problem?
I've thought that high-end graphics cards have been too expensive ever since Nvidia released the GeForce 256 back in 1999. So instead I bought the cheaper MX version back then, when my Riva TNT2 started to show its age. I've never had a problem with buying something that's not the best.
I guess some people are annoyed that the 1080 Ti was more affordable, but I still think it was outrageous for a graphics card. So was the 980 Ti, and every xx80 Ti before them.
Epeen may be a factor (not for me tho). I suppose the issue is how games are being rolled out. Devs are implementing code that needs a powerful GPU to run at max settings. Is it that they're simply not optimising as much, so the same visual fidelity could run on a weaker card, or that they keep raising the bar each year with new graphical fidelity? Combine that with the fact that, when someone has a choice of turning settings down or buying a new GPU, they choose the latter.
You also have the current craze in competitive gaming of people wanting high refresh rates. E.g. 5 years ago, 60fps was considered absolutely fine for gaming and a top-end frame rate. There were no complaints of apparent input lag caused by frame rate and so forth. Then Nvidia rolled out G-Sync, and of course, to solve the problem of GPUs getting too good and to help sell G-Sync units, we started seeing displays with uber high refresh rates. Then bam, 144fps gaming was born. This has raised the requirements for GPU grunt as well.
I'd also observe that the DX9 to DX11 transition added to GPU grunt requirements. But that's an indirect result: most DX9 games are CPU bound on single-threaded rendering, so with that bottleneck released, the load in turn shifts back to the GPU.
I am deffo guilty myself of buying a new GPU to avoid turning down game settings. But my 1080 Ti was also purchased out of a concern that if I didn't buy it there and then, I wouldn't be able to buy an upgraded GPU at a future date with DVI support, and for under £1000, given the way Turing prices were going. Normally I wouldn't have replaced my 1070 so soon.
My GPU history is interesting tho. I can see I have lost financial discipline progressively.
8800 GT to GTX 460 (60 series, 3 gens gap), to GTX 760 (60 series, 3 gens gap), to GTX 970 (shifted to 70 series, 1 gen gap), to GTX 1070 (still on 70 series, 1 gen gap), to GTX 1080 Ti (to 80 Ti series, same gen).
Devs know that a lot of people aren't buying current-gen high-end cards because of the prices, so they'll optimise their games for the cards people do own. Their priority is MAXIMUM sales, so why would they design their games for GPUs that most people don't have?
They will optimise for standard game settings yes, but the bar for ultra settings seems to be going upwards.
Far too much, and limited in supply.
$1900 in the states.
It comes with an AIO cooler which is a joke as the first thing you will want to do is rip it off and use something better.
Isn't this basically what the Poseidon cards were? I remember the 1080 Ti Poseidon release; it was £900, and now this one is nearly twice the price for near enough the same thing! Unbelievable.
Poseidon was just a water block that had to be connected to a custom loop, was it not?
Ah so it was! Still, this isn't worth nearly twice the price though
There was a 980 Ti Matrix, just a fancier heatsink with (I think) an LED display on top.
I wouldn't buy this if it was cheaper than EVGA after my experience of Asus. Disgraceful
I doubt this is true, but nevertheless, even if it were the case, it's great news for owners of high-end cards because it means we'll get more lifespan from our cards. No one wants their cards to get outdated too quickly; graphics improving at a super fast pace is not fun.
With that being said, some of the newer releases are starting to show signs of struggling on the 1080 Ti at 4K, which certainly wasn't the case last year. The biggest example so far is the latest Metro game.
I don't see a problem there personally.
A 240mm rad is more than sufficient to keep it cool.
Fitting a custom block may look prettier, but it isn't going to yield better OC results imho.
Well, it happened during the 8-year period when the CPU market was stagnant. Games didn't start to require a minimum of an i5, then 2 years later require an i7, then 2 years later require an i9. Devs have common sense and know what hardware most people have, so I think graphics development in games will slow down, just like it did with CPUs. Now that CPUs are back on track, optimisation is going back in that direction.
I can't find any reviews for the Kingpin yet, but based on what I've seen from other 240mm cards, and also from users pairing the G12 with a 240mm AIO, I'd say 240mm is a decent sweet spot. I believe any smaller and you will forego performance, but conversely, you will still gain a bit more with a custom loop or a 360mm AIO. The 240mm, though, is the sweet spot between cost and performance, which is likely why it's used.
GPU Boost on Turing is very aggressive and can start downclocking cards when they are at 50°C. Climate is also a consideration; not everywhere is as cold as the UK.
Yet another 2080 Ti I won't be buying. The price is stupid for all of them. Keeping my trusty 1080 Ti for the foreseeable.
None of the AIO watercoolers are very good.
A proper high-end WC loop (probably costing less than half of this card's premium over other 2080 Tis) will definitely give better results, and it will also run completely silently.
I have fitted custom waterblocks to cards with AIO coolers and got improved performance.
For normal cards an AIO cooler does have benefits but for an extreme benching card like the Kingpin all it does is add extra expense for average cooling performance.
If EVGA were serious about producing an out-and-out extreme benching card, it would come as just the bare PCB, leaving the end user to choose their own cooling solution.
Also, if the Kingpin did come as just the bare PCB, absolutely no one would consider putting an AIO cooler on it after just spending $1900.