I don't personally want to just waste good power supplies like this one, and like I did with my Seasonic on graphics cards that would run off 350W, meaning spending more than I needed to on a power supply.
It's not a waste to buy a quality brand power supply. Continually buying stuff without any actual idea of what you really want (despite several people telling you it's a bad idea) - that's a waste.
The cores matter because they are part of the performance, and usually a higher number means better graphics.
Cores being higher makes no difference graphically on its own. Performance is a combination of things - core count, clock speed, architecture (i.e. "work done per clock" - think Core2Duo vs Pentium 4) and memory bandwidth. Having more of something, e.g. cores or bandwidth, isn't necessarily better if it becomes bottlenecked elsewhere (hence why a 1080Ti doesn't have a 64-bit memory bus - the number of cores it has requires a large amount of bandwidth to keep them busy).
The interface matters because it determines how the memory performance is utilised. I have experienced 64-bit, 128-bit and 256-bit and I do notice the difference.
No - there is no way you can specifically notice the bus width. You can notice the difference from a faster card, but that is a result of architecture improvements, nothing to do with the size of the memory bus. Bus widths are generally decreasing where possible due to cost and advances in technology:
- a wider bus means more traces on the PCB and more memory modules, both of which increase complexity and cost;
- each newer generation of GDDR runs faster and typically transfers more times per clock;
- memory compression has been implemented by both AMD and NVIDIA and continues to evolve, to make better use of whatever physical bandwidth is available.
If bandwidth was all that mattered, then you should look at buying a Radeon 2900XT - that had a 512-bit memory bus, so must be better!
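To put rough numbers on it - the figures below are approximate spec-sheet values from memory, so treat them as illustrative rather than gospel - bandwidth is just bus width times effective data rate, which is why a narrow bus on fast memory can keep up with a much wider bus on slower memory:

```python
# Rough memory bandwidth arithmetic (approximate figures, for illustration):
# bandwidth (GB/s) ~= bus width (bits) / 8 * effective data rate (Gbps per pin)
def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

cards = [
    ("GT 1030 (GDDR5, 64-bit)",         64,  6.0),   # ~48 GB/s
    ("Radeon HD 5850 (GDDR5, 256-bit)", 256, 4.0),   # ~128 GB/s
    ("GTX 1080 Ti (GDDR5X, 352-bit)",   352, 11.0),  # ~484 GB/s
]
for name, bus, rate in cards:
    print(f"{name}: ~{bandwidth_gb_s(bus, rate):.0f} GB/s")
```

Note the 5850 has well over double the raw bandwidth of the 1030, yet the 1030 is the faster card - which is exactly why bus width on its own tells you very little.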

But what does the 1030 having 2GB matter? Can it actually use it?
Having more VRAM means you can use higher quality textures - even "older" games have settings that can make use of it. 2GB has been mainstream since around 2012.
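As a rough picture of where 2GB goes - simple uncompressed arithmetic, ignoring mipmaps and texture compression, so the real numbers will differ:

```python
# Rough VRAM cost of uncompressed RGBA8 textures (illustrative only -
# real games use compressed formats and mipmaps, so actual usage differs).
def texture_mb(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / (1024 ** 2)

print(texture_mb(1024, 1024))   # ~4 MB
print(texture_mb(2048, 2048))   # ~16 MB
print(texture_mb(4096, 4096))   # ~64 MB
```

A few dozen high-resolution textures plus the framebuffer and geometry chews through 1GB quickly, which is why the extra headroom on a 2GB card gets used even in older games with the texture slider turned up.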
Again, I don't want a 30W card, I want something that will actually be good for gaming.
You have to get out of this mindset - power usage has nothing to do with performance. A 75 Watt 1050Ti that has no power connector absolutely crushes your 5850. Newer high-end cards draw less than older high-end cards - the market is only going one way, as manufacturers face pressure to reduce power consumption, both from environmental legislation and for financial reasons, e.g. to reuse products across various markets - desktop, laptop, mobile, games consoles.
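Put another way, it's performance per watt that has improved, not performance that drops with the wattage. Very rough illustration - the TDPs are approximate and the relative performance figure is a ballpark assumption, not a benchmark:

```python
# Rough performance-per-watt comparison (approximate TDPs; the relative
# performance numbers are ballpark assumptions, not benchmark results).
cards = {
    "Radeon HD 5850": (151, 1.0),  # (approx TDP in watts, assumed relative perf)
    "GTX 1050 Ti":    (75,  2.0),
}
for name, (tdp, perf) in cards.items():
    print(f"{name}: ~{perf / tdp * 100:.1f} relative perf per 100W")
```

Roughly twice the performance at half the power is about four times the efficiency - so a low power figure says nothing about whether a card is any good for gaming, it just means less of your money is turning into heat.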
Don't quite understand this? You say the 560 is better because it basically has double the shaders, but you implied that CUDA core count doesn't matter? If I'm not mistaken, shader count for AMD is what CUDA core count is to NVIDIA, so with that said, why would the 550 be terrible?
I didn't say that CUDA core count didn't matter at all. The fact is, though, that a 384-core GeForce 1030 performs better than an 8 year old 1440-core Radeon 5850. The reasons why, I've touched on above - faster clock speed, architecture improvements, memory compression, more VRAM.
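A crude back-of-envelope version of that, using peak shader throughput (cores x clock x 2 ops per clock for an FMA) - clock figures are approximate spec values, and this deliberately ignores the architecture/IPC and compression differences that decide the real-world result:

```python
# Peak shader throughput ~= cores * clock (MHz) * 2 ops per clock (FMA).
# Approximate spec figures; ignores IPC/architecture, which is the point.
def peak_gflops(cores, clock_mhz):
    return cores * clock_mhz * 2 / 1000

print(f"HD 5850: ~{peak_gflops(1440, 725):.0f} GFLOPS")   # ~2088
print(f"GT 1030: ~{peak_gflops(384, 1468):.0f} GFLOPS")   # ~1127
```

On paper the 5850 still has nearly double the peak FLOPS, yet the 1030 is the faster gaming card - the difference comes from how effectively a modern architecture uses those FLOPS, not from the headline core count, which is why core count across vendors and generations isn't a useful comparison on its own.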
The issue with the Radeon RX550 vs RX560 is that if you are spending £80 on a 550, it is worth spending the extra £20 to get double the performance from a 560. An RX550 isn't much quicker than a GT1030, but the Geforce range doesn't suffer so much as the pricing is better spread out - the 1030 is £60, and a 1050 that is likely double the performance is £110 or so.
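That pricing argument is easier to see as simple arithmetic. The prices are the ones quoted above; the relative performance figures are rough assumptions for illustration (RX560 roughly double an RX550, GTX1050 roughly double a GT1030):

```python
# Price vs rough relative performance - prices as quoted above, performance
# ratios are assumptions for illustration, not benchmark results.
cards = {
    "GT 1030":  (60,  1.0),
    "RX 550":   (80,  1.1),
    "RX 560":   (100, 2.2),
    "GTX 1050": (110, 2.0),
}
for name, (price, perf) in cards.items():
    print(f"{name}: ~£{price / perf:.0f} per unit of performance")
```

On those rough numbers the RX550 is the worst value of the four - barely quicker than a 1030 for £20 more, while another £20 on top of that roughly doubles the performance in an RX560.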
I get bored of the same parts though, hence why I chose this old Radeon. All this talking and browsing and I haven't even played a game yet lol
What is there to get bored of? Graphics Cards aren't Pokemon - you haven't got to catch them all?
Surely it's as simple as:
is it fast enough? no
can I afford/justify something better? yes/no
Most of us have to settle for whatever we can afford. The "best" graphics card I have in my house is a 270X - is it fast enough? Probably not. Can I justify anything better? No - the low-to-mid range new cards haven't really moved on from what I have, and I can't justify spending £100 to get what my 5 year old card already does.
As for getting hung up on those spec criteria: every time I look on this site or read reviews of graphics cards, whether it's professionals or other forums, those categories seem to be high up in the conversation, regardless of my own experience when it comes to comparisons. I mean, why would a 1080Ti need an over-300-bit interface if apparently 64-bit is enough?
Covered above - a 1080Ti has more cores, and needs to keep them busy. If a 384-bit bus made everything instantly better, then every card would come with one. It's about what is appropriate for each card within the range, in order to keep price/performance relatively similar regardless of your budget. Sometimes manufacturers get it wrong - e.g. the RX550 above: sacrifices were made to the core count to sell it into a lower segment, but it's still arguably too expensive for the target market (given that twice the performance normally costs roughly twice as much, e.g. RX560->RX570 or 1050->1060 pricing).
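One way to see the "keeping the cores busy" point is bandwidth per unit of compute - on approximate spec figures the ratio sits in roughly the same ballpark up and down the GeForce 10 range, which is why the bus width grows with the chip rather than being a feature in its own right:

```python
# Memory bandwidth per peak TFLOP - approximate spec figures, illustration only.
cards = {
    "GT 1030":     (48,  1.1),   # (bandwidth GB/s, peak TFLOPS)
    "GTX 1060":    (192, 4.4),
    "GTX 1080 Ti": (484, 11.3),
}
for name, (bw, tflops) in cards.items():
    print(f"{name}: ~{bw / tflops:.0f} GB/s per TFLOP")
```

All three land around 40-45 GB/s per TFLOP. Bolting a 384-bit bus onto a 1030 wouldn't make it faster; it would just add cost for bandwidth its 384 cores can't use.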