So … we don’t actually know what that means. Great!
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
It means @TheLurkingDev might be forum member uscool.
Had Nvidia not existed - in some alternate reality - their RT improvement would be nice, but Nvidia does exist and is still a whole generation ahead. AMD is also not using dedicated RT cores for RT. It's still being done on the shader cores, which indicates they aren't serious about the technology.
That is completely inaccurate, as if you never actually looked into this at all. AMD has used integrated compute units for a long time now, which contain stream processors (what you called shader cores, which itself is no longer accurate) but also RT accelerators and other elements. The RT accelerators are dedicated transistors that can only accelerate RT and nothing else, and they are already present in RDNA 2. Aside from sitting inside the CU rather than being a separate unit, they are already dedicated "cores" for RT acceleration. The main difference is that NVIDIA's can accelerate more functions than AMD's in RDNA 2, which is one of the reasons for the speed difference. RDNA 3 improves on that, but there is always room for improvement. In RDNA 3 the CU also contains AI acceleration transistors (like tensor cores, but again inside the CU rather than a separate unit). This is just how AMD designs their GPUs: integrating everything they can into the CUs, sharing cache between blocks, and so on. It also does NOT mean these things are part of the "shaders".
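For anyone who finds the above easier to picture as a structure, here is a tiny, purely illustrative sketch in Python. Every name and count is a placeholder of mine, not AMD or NVIDIA terminology; it only mirrors the organisation described in the post (everything folded into the CU versus acceleration exposed as separately named units).
from dataclasses import dataclass

@dataclass
class RdnaStyleComputeUnit:
    # Per the post: general shading and the fixed-function accelerators share one CU.
    stream_processors: int = 64   # placeholder count
    ray_accelerators: int = 1     # dedicated RT transistors, present since RDNA 2
    ai_accelerators: int = 2      # added inside the CU in RDNA 3
    shared_cache_kib: int = 32    # resources shared by everything above

@dataclass
class SeparateUnitStyle:
    # The contrasting layout: RT and AI acceleration marketed as separately named units.
    shader_lanes: int = 128       # placeholder count
    rt_cores: int = 1
    tensor_cores: int = 4

# Either way the RT hardware is dedicated silicon; the difference is where it
# sits and how many functions it can accelerate, not whether it exists.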
There's also a gigantic gap between the prices of these cards.
£500 in UK money. At that point you may as well get the best on the market.
You buying a 4090, mate?
Hell no.
How do you figure that? Pay 33% more for what, exactly? Someone who has only £1k to spend on a card isn't going to pony up more to "get the best on the market". They also sure as **** aren't buying a 4080 16GB with 30% less raster performance, which is £200 more.
lol, next gen is going to be 600W, at least that's what I read in a few articles, and it's also the reason why AMD and Nvidia came up with this great idea of a new ATX standard. Next gen is going to be more expensive - unless Intel pulls off something magical.
Why not? What's your budget? I personally would not want to spend more than £700, and for that on a new-gen card I would want a decent improvement over what I have in both raster and RT. That's why it seems to me I will be skipping this gen.
Pretty sure it will be safe to upgrade next gen. 4090 performance will probably drop into my price range, whether that is new or used.
Had the pound not dumped and still been at 1.4 like it was for a long while, these AMD GPUs would be looking very good to most of us. Prices may go up, but if you look at the dollar value you can see AMD have done well to improve on price for performance. The same will happen again next gen, in my opinion.
I assumed from your sig that you would be buying an 80-class graphics card, but I believe the 5070 will be able to comfortably do 4K at a more reasonable price - so maybe, let's see.
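On the exchange-rate point quoted above, a rough back-of-the-envelope sketch in Python of why the rate matters so much for UK pricing. The $999 US price and the ~1.15 rate are assumptions of mine for illustration, and UK shelf prices add 20% VAT on top.
usd_price = 999                  # assumed US launch price for the top RDNA 3 card (ex-tax)
UK_VAT = 1.20                    # UK prices include 20% VAT

def rough_uk_price(usd_per_gbp):
    """Convert the US ex-tax price to an approximate UK shelf price at a given rate."""
    return usd_price / usd_per_gbp * UK_VAT

print(round(rough_uk_price(1.40)))  # ~856: where pricing would land at the old ~1.40 rate
print(round(rough_uk_price(1.15)))  # ~1042: roughly the ~£1k figure at a weaker pound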
The best part of the RDNA3 reviews isn't seeing how close it is to the 4090, but how much it crushes the 4080 for less coin.
If you can afford a grand for a GPU, you can go the extra 500 notes for the vastly superior product. No one cares about the 4080.
Maybe you can, but I doubt most people can afford to just up their outlay by 33%. Only someone with money to spare, or an idiot, thinks an extra 500 is nothing.
So looking at the extrapolated numbers from Linus at min;
No I couldn't, that is daft. The extra £500 is an extra 50%. What a stupid comment.
And the cheapest 4090 is more like £1800, so it's actually an extra 80% (on the AMD RRP; no idea how much they will actually cost).
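Just to spell out the arithmetic behind the percentages above, a quick Python sketch using only the figures quoted in this thread (a ~£1,000 AMD card, the "extra 500 notes", and the ~£1800 cheapest 4090); exact retail prices will obviously differ.
# Sanity check of the price-premium percentages quoted in the thread.
# All figures are the thread's own round numbers, not confirmed retail prices.
amd_price = 1000               # "a grand" for the AMD card
rtx4090_rrp = amd_price + 500  # the "extra 500 notes" figure, i.e. ~£1500
rtx4090_street = 1800          # "the cheapest 4090 is more like £1800"

def premium_pct(price, baseline=amd_price):
    """Percentage premium over the ~£1000 baseline."""
    return (price - baseline) / baseline * 100

print(premium_pct(rtx4090_rrp))     # 50.0 -> the "extra 50%" post
print(premium_pct(rtx4090_street))  # 80.0 -> the "actually an extra 80%" post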
£700 and below; I don't need the fastest thing on the block. It's a shame Nvidia and AMD don't want to offer anything decent in that range and instead just nickel-and-dime us at £1k+ prices. They treat gamers with contempt, as a side job to their data business.