Nvidia Ampere might launch as GeForce GTX 2070 and 2080 on April 12th

However, the 1080 isn't even close to being a £300 card, and the (x)x60, effectively what was once the mid-range card, now has just 33% of the CUDA cores of the flagship Titan Xp. Never mind the fuss about mining driving prices up; the new 'mid-range' card costs £500 at 'coupon' pricing, up from £150-250 a few years back.
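For what it's worth, that 33% figure checks out on the commonly quoted specs; a minimal sketch (the core counts are assumptions, not from this thread: 1280 for the GTX 1060 6GB, 3840 for the Titan Xp):

```python
# Quick check of the "33% of the CUDA cores" claim.
# Core counts are assumed from commonly quoted specs:
# GTX 1060 6GB ~ 1280 CUDA cores, Titan Xp ~ 3840 CUDA cores.
gtx_1060_cores = 1280
titan_xp_cores = 3840

ratio = gtx_1060_cores / titan_xp_cores
print(f"GTX 1060 has {ratio:.0%} of the Titan Xp's CUDA cores")  # -> 33%
```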

Hardly going to get better in the next generation.

To be fair, the cards do get more complicated and better every year, using better components etc. They probably cost a lot in R&D, so I think expecting them to be priced the same as a card from 10 years ago (also accounting for things like exchange rates and inflation) is a bit ridiculous... Although yes, I agree that mining and general card prices currently are not good at all.
 
You can account for some of the price increases with inflation and newer processes costing (relatively) more than older processes.

However, that doesn't account for a mid-range card that started from £240 a couple of years back now being £500.

Inflation over four years is roughly 8%.
The exchange rate has gone from 1.70 in 2014 to approximately 1.35 today.

Taking all that into account, £250 in 2014 should now be £340. Not great, but better than bleedin' £500.
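A minimal sketch of that arithmetic, using the same assumed figures (roughly 8% cumulative inflation and the 1.70 to 1.35 exchange-rate move quoted above):

```python
# Reconstructing the ~£340 figure from the post's own assumptions.
price_2014 = 250.0            # GBP price of a mid-range card in 2014
inflation = 1.08              # ~8% cumulative inflation over four years
fx_2014, fx_now = 1.70, 1.35  # GBP/USD exchange rate, 2014 vs today

adjusted = price_2014 * inflation * (fx_2014 / fx_now)
print(f"£{price_2014:.0f} in 2014 is roughly £{adjusted:.0f} today")  # ~£340
```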

However... bear in mind that the 480/580 did fall below £200 recently. The £300 jump in price since then is not about inflation, or Brexit, or anything else. The £300 jump since a few months ago is purely supply and demand, where demand = mining for the most part.

On the nVidia side... I haven't seen the 1070 fall below £380, ever. It's too expensive to truly be a mid-range card, especially since, unlike before, there is a vast performance gulf between it and the 1080. The 1060 is a low-to-mid card, just about managing at 1080p along with the 480/580, but not suitable for 1440p unless you can live with medium settings.

All in all, the mid-range cards are either too expensive for what they deliver or under-powered for the money, take your pick ;)
 
I enjoy gaming, but lately tweaking, overclocking and benchmarking has been more enjoyable than the average PC game (imo).

An updated Nintendo switch with the option of connecting an external Nvidia GPU would be quite awesome and refreshing.
 
I was planning to make the shift to 1440p later on this year, moving on from a 1070 and BenQ XL2411Z. It probably makes sense to wait for the 2070 or 2080 depending on pricing. I can't imagine the 2070 being much more than £400 surely? Hopefully the jump will be worth it.
 
When the 1070 Ti is £500-690? It's going to be £500+, easy.
 
Given the price increases lately, which are not all because of miners, yeah I can see that easily.

I'm guessing the 2080 will be around 10-20% faster than the 1080Ti, based on the 980Ti/1080. If it drives the price of the 1080Ti down to something more reasonable (and that's a big IF), that might actually be a good option.
 
Seems pointless when not many games, or other software for that matter, require more than the current offerings.

Honestly, I don't see any game on the immediate or far horizon, apart from Cyberpunk 2077 & (maybe) Anthem, that makes me want to even think about investing in a new GPU for PC gaming.

The GPU power needed to run Cyberpunk 2077 & Anthem at 4K, 100fps+ with HDR support, which will hopefully be commonplace by the time of their release, will likely only become realistic with a 2080Ti in 2019, possibly even the generation after that.

..so no new 2070/2080 for me, I just don't need, or want, either.
 
I hope Nvidia get their act straight-ish, and at the very least slap a reasonable MSRP on their own reference/FE cards, to stay in tune with their own public requests for distributors and retailers to bring the price down. They will also have to come up with a better plan than AMD did with their gaming combos, to ensure the gaming market is supplied. Nvidia MSRP for 1070Ti FE is $450. If the 2070 FE is that money, I will pay that. If board-partners manage the same, then bonus, but I'm not holding my breath for those. If the exchange rate continues to improve, things may not be all that bad. And if Brexit finally happens, say goodbye to 20% VAT (presumably import duty will be less but time will tell).
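As a rough sketch of how a $450 FE MSRP could translate to a UK shelf price, assuming an exchange rate of about 1.35 USD/GBP and the current 20% VAT (US MSRPs exclude sales tax; import duty is ignored here):

```python
# Hypothetical USD MSRP -> UK retail price conversion.
# The exchange rate is an assumption; 20% is the current UK VAT rate.
usd_msrp = 450.0      # Nvidia FE MSRP for the 1070 Ti, per the post
usd_per_gbp = 1.35    # assumed exchange rate
vat_rate = 0.20       # UK VAT

uk_price = usd_msrp / usd_per_gbp * (1 + vat_rate)
print(f"${usd_msrp:.0f} MSRP is roughly £{uk_price:.0f} inc. VAT")  # ~£400
```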

Wouldn't be bothered about a new card if my 970 was still working. If you've had a 970 for a few years, be careful with the MOSFETs. I believe some manufacturers either used cheapo ones or cheap thermal pads, and quite a number of them are biting the dust now. If it's out of warranty, I would look into replacing the MOSFET thermal pads with a new/better quality pad, unless you have plenty of trust in how your own model is built.
 
Given the results @Kaapstad has had on the Titan V and the speculation that Ampere is just a refined Pascal, which itself was a refined Maxwell, I don't see Ampere as being a value proposition for Pascal owners. I do see it being a value proposition for Maxwell owners. A Titan XA should give more than the performance of SLI Titan XMs when SLI is actually working, without any of the hassles. Plus HDMI 2.1 and DP 1.4 support. But it still won't be worth upgrading until the monitors are there.
 
Depends - I'm playing The Division at the moment with a lot of settings turned up at 2560x1440 with a GTX1070 and it's OK - framerates mostly around 60-70 fps, but I wouldn't say no to getting that more consistently up to around 100fps without sacrificing things visually (half of the game's draw is the visual detailing of the game world, which otherwise becomes kind of bland).

nVidia's next gaming cores are going to be using new performance libraries as well as other node refinements, which should give a reasonable performance boost depending on how much nVidia trickles it out.
 
I was planning to make the shift to 1440p later on this year, moving on from a 1070 and BenQ XL2411Z. It probably makes sense to wait for the 2070 or 2080 depending on pricing. I can't imagine the 2070 being much more than £400 surely? Hopefully the jump will be worth it.

I use a GTX1080FE at qHD, so just tweak a few settings down and surely you will be fine??

If you are looking at high FPS, it's not worth moving to qHD, as it's close to double the resolution of your current monitor.

So, basically, you would ideally need close to double the performance of a GTX1070 to keep up with the experience you have on a GTX1070 at 1080p.
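The "close to double" point is just pixel counting; a quick sketch:

```python
# Pixel-count ratio behind "close to double the resolution".
pixels_1080p = 1920 * 1080   # current 1080p monitor
pixels_1440p = 2560 * 1440   # 1440p (2560x1440) target

ratio = pixels_1440p / pixels_1080p
print(f"1440p pushes {ratio:.2f}x the pixels of 1080p")  # -> 1.78x
```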
 