Given that the 1050 is roughly equal to a 950, and the 1050 is only about 20% faster than a 750 Ti, I can't see the 2050 getting close to the 570, but I could be wrong. I also can't see Nvidia going that high in the lower-range market, which is probably profitable enough for a small card anyway. I imagine the 2050 will match the 1050 Ti, and the 2050 Ti will be maybe 20% slower than a 570.
Also, the 570 and 580 have only seen their prices drop recently. For a long time the 3GB 1060 was cheaper than an RX 570, there's still only £20 between them, and they perform similarly, although with the extra VRAM and DX12 you'd expect the 570 to age better.
It was only more expensive when mining took off, and I remember what happened the last time that happened too.
But the RX470 and RX570 had more deals before mining (look on HUKD). It was more the fact he was saying the RX590 needed to be £150. £150 is GTX1050TI money. I remember getting an RX470 4GB in 2016 for £160ish, mates getting hold of the RX570 4GB for around the same before mining took off, and the 8GB RX470/RX480 cards being around £190. So the 8GB RX470/RX570 cards for under £160 with three AAA games is probably the cheapest they have been on average, and those deals have been on for a month. However, there were deals to be had over the summer too - a big retailer had the Sapphire Pulse RX580 4GB for just under £190, which my mate got hold of.
I can understand people saying £200 to £250 might be more of a realistic price point for the RX590 8GB, but if the GTX1050TI has stayed stubbornly at that price, and the GTX1060 3GB cards at £180+, even with AMD throwing extra VRAM and bundled games at their own cards, then pricing is not the issue here. Nvidia has made sure there are no real deals to be had on the GTX1050TI and GTX1060 3GB, so ultimately, when you have had the RX470/RX480 4GB/8GB at that level, and certainly the RX480/RX580 4GB, on and off during 2016, early 2017 and in the last few months, I am not sure what people expect AMD to do?? AMD has so many times offered better value for money at that level, so now it needs to be even more?? Offer better than GTX1060 6GB performance for GTX1050TI money?
So that means an RX570 8GB has to be £100, the RX580 8GB has to be £125, and the Vega64 has to be £250?? That is the kind of pricing you are looking at if you base your range around a £150 RX590.
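As a rough back-of-envelope check (the relative performance figures below are ballpark assumptions on my part, not benchmarks), here is what that sort of stack looks like in performance-per-pound terms:

```python
# Rough sketch: performance-per-pound if the whole AMD stack were priced
# around a £150 RX 590. Relative performance figures (RX 580 = 1.0) are
# ballpark assumptions for illustration, not measured results.
cards = {
    # name: (hypothetical price in £, assumed performance vs RX 580)
    "RX 570 8GB": (100, 0.85),
    "RX 580 8GB": (125, 1.00),
    "RX 590 8GB": (150, 1.10),
    "Vega 64":    (250, 1.60),
}

for name, (price, perf) in cards.items():
    # Relative performance per £100 spent - higher is better value
    print(f"{name:<12} £{price:<4} {100 * perf / price:.2f} perf per £100")
```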
I can only see that happening if Nvidia decide to launch a sub-£300 GTX2060 with GTX1070 performance and AMD is forced to drop pricing.
Regarding naming, Nvidia has pushed up pricing tiers again and might use larger than normal GPUs, so I hadn't realised it might mean more tiering happening - yeah, you might be right. I can see the GTX2060 being a £300 to £350 card with GTX1070 performance, unless they call it the GTX2060TI and then have a GTX2060 which is probably a tad faster than a GTX1060, i.e. matching an RX590, for around £230 to £250. Then the GTX2050TI will be £150 to £200 at RX570 level, and a GTX2050 at £100 to £150 which will probably be a bit faster than a GTX1050TI. All the reviews will compare tier to tier, so the jump looks massive, and I expect it will be hailed as the best thing since sliced bread. So all the people saying the RX590 should be £150 will be going wow at the performance jump, and AMD will be doomed as usual.
Now, if Nvidia do release GTX1070 level performance under £300, that might be something, but it seems only AMD needs to make these kinds of price/performance jumps (since RX590 level performance at £150 would be equivalent price/performance to a Vega64 at £250).
This is why, even if we had a Navi GPU with GTX1080 performance for £250 (which I doubt), it probably wouldn't be enough. I expect the excuse then will be that it doesn't do RTX and DLSS.
Ultimately, I can see AMD not bothering within a few years - they might try a surge now with Navi, etc., but I dunno whether it's just wasted money TBH. I think most of their GPU development is probably driven by consoles now, and any professional markets AMD might be able to carve out a niche in.
I could be wrong, but it seems investing in CPUs has paid dividends much more quickly. Imagine if AMD launched faster, more efficient CPUs 6 to 9 months before Intel could respond?? Imagine how well they would do. They have done exactly that with GPUs and it didn't pan out as well as everyone thought.