Quote: "Do you know what the words troll and hypocrite mean?"
More than you know what the word "your" means, by the looks of it; it's kind of why I've challenged your spurious claims. Keep digging though, it's hilarious.
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
Quote: "It's not how I see it. The amount of trashing Nvidia vs AMD for their pricing is around 100 to 1, going by the number of posts."
It's almost like people buy Nvidia GPUs 100 to 1.
Quote: "It's almost like people buy Nvidia GPUs 100 to 1."
Why would that be the case? Why are you buying Nvidia GPUs if the price is trash or the VRAM is not sufficient? That does not make sense to me; something does not add up.
It's not actually that extreme, but can you not see why people would 'trash' Nvidia pricing when they hold something like an 80% market share and the majority of people are either buying or wanting to buy Nvidia graphics cards?
Quote: "More than you know what the word 'your' means, by the looks of it; it's kind of why I've challenged your spurious claims. Keep digging though, it's hilarious."
I have tried to help you understand that GPUs use this thing called electricity. I even gave you a link to a video to help you understand. I will link it again: https://youtu.be/DNX6fSeYYT8
Quote: "Why would that be the case? Why are you buying Nvidia GPUs if the price is trash or the VRAM is not sufficient? That does not make sense to me; something does not add up."
I wouldn't.
Quote: "I have tried to help you understand that GPUs use this thing called electricity. I even gave you a link to a video to help you understand. I will link it again: https://youtu.be/DNX6fSeYYT8"
Sorry, what was it you were saying about trolling? Did you not read the post where I said that if your only concern is power usage then don't plug it in?
If you don't understand the issue of power use by a GPU after watching the video, I suggest you drop a comment on the video to Steve.
Quote: "AMD learned their lesson: selling themselves short by hundreds of pounds would probably let them shift cards at a faster pace, but with production and shipping not keeping up, retailers would end up without stock to sell, so more sales but not necessarily higher revenue. Also, the majority of people only wish AMD would price their cards lower in the hope of buying Nvidia for cheaper."
Tbh I think a lot of people have had enough of Nvidia and their antics over the last few years, and AMD should be capitalising on the situation the same way they did against Intel for CPUs when performance stagnated: they offered more cores for less, and look where they are today compared to pre-Zen.
They already offer faster performance at lower prices with more VRAM than Nvidia. As for their RT performance, while it is weaker than Nvidia's, Nvidia really only has an RT advantage at Ultra settings, which would require a 4080, and ideally a 4090, for 60+ fps anyway (while the 4070 Ti and below are pretty much dead on arrival with their stingy 12GB of VRAM). DLSS 3.0 is being overhyped to a ridiculous level, and despite the improvement over DLSS 2.0, using it will still cause higher latency/input lag than without it enabled (so OK for casual or single-player games, but not so much for competitive or fast-paced games that require precise input). DLSS is pointless if it comes with Nvidia selling cards two tiers below what their asking prices warrant.
Quote: "AMD are behind on all fronts: image upscaling, RT, app support, efficiency, frame gen. It's clear why they can't get out of single-digit market share. Always playing catch-up with inferior solutions to Nvidia's tech."
Call me old school, but I care more about performance, longevity and value than fancy add-ons and other things that are not so important. Also, I (and probably lots of people) upgrade cards every four years or two generations instead of every two years or every generation, so if I were to buy an Nvidia card today, anything less than a 4080 16GB would not even be a consideration. It would be different had Nvidia actually bothered to sell the 4070 Ti and 4070 with a 256-bit bus and 16GB of VRAM instead of 192-bit and 12GB (rather than selling what are effectively a 4060 and 4060 Ti as the 4070 and 4070 Ti).
Quote: "Tbh I think a lot of people have had enough of Nvidia and their antics over the last few years, and AMD should be capitalising on the situation the same way they did against Intel for CPUs when performance stagnated: they offered more cores for less, and look where they are today compared to pre-Zen."
Won't work.
Quote: "Won't work."
That's not true though. I for one bought the HD 4870, the HD 5850, the HD 6850 for my second PC, and the HD 7870. I was full AMD until Pascal, when Nvidia flew ahead and AMD never caught up again.
Over a decade ago, even when Nvidia was not as dominant as it is today, ATI (by then part of AMD) came out with the HD 5850 and HD 5870, first to market with DX11 support and tessellation, and those cards utterly destroyed Nvidia's fastest card at the time, the GTX 285. Yet people would still rather wait for Nvidia's DX11 cards, believing the time-buying "It's coming" BS (and those cards didn't arrive until around half a year later).
I think you may be underestimating people's resistance to change; many won't consider AMD at all because they have always used Nvidia.
I wouldn't be surprised if desktop Radeon is put on the sidelines as AMD concentrate on consoles and APUs.
There was some very unusual data in last month's Steam Hardware Survey, which caused several Nvidia GPUs to spike extraordinarily high in adoption rates. Things have apparently been corrected, with the affected GPUs returning to normal.
...Increases and decreases of several percent obviously indicated something went very wrong with Steam's data collection. With that issue fixed, we're greeted by similarly large changes in the opposite direction for May's results.
The Steam Hardware Survey for May (which looks at GPU market share for April) has reshuffled all the affected GPUs back to their normal hierarchy from before April.
Quote: "Tbh I think a lot of people have had enough of Nvidia and their antics over the last few years, and AMD should be capitalising on the situation..."
With Nvidia's GPUs being so grossly overpriced, AMD coming in a bit cheaper with a bit more VRAM just isn't enough to convince those who have already said no to Nvidia.
Quote: "That's not true though. I for one bought the HD 4870, the HD 5850, the HD 6850 for my second PC, and the HD 7870. I was full AMD until Pascal, when Nvidia flew ahead and AMD never caught up again."
You are just speaking to your own purchasing history though, which hardly represents the overall market situation.
Quote: "Won't work."
10 years ago was a long time, and mid-range GPUs were not priced at £1200, so with the sums being asked I think AMD had a good chance this gen if they had followed last gen's pricing strategy, where their 80-class competitor was priced at just over £600. Even with inflation, a 7900 XTX priced between £700-800 would have sold like crazy.
AMD don't escape any berating. Myself, I think the XTX should be priced where the XT sits, and the XT should start where last gen's 6800 XT sat, which was around £600. I didn't think they had enough inventory floating about to pull an Nvidia and control the supply chain, and let's face it, if it's only 10% of the market then who's gonna buy them? At least the prices of the 6950 and a few other cards have dropped massively, whereas the Ampere stock sits as it did in 2020 (at MSRP).
The mid-range seems to have doubled in price. Remind me again, though, who releases their cards to market first?
"If you can't beat them, join them"Then you see the whole Navi10 to Navi 23 transition. Now apparently we might be getting the same with Navi 33. Basically instead of trying to make a decent improvement in price/performance over their last generation,they are looking at what Intel and Nvidia are doing and trying their best to join in.
The problem is AMD are being opportunistic with their pricing too. When they realised they had beaten Intel in CPUs, they jacked up the price of the Ryzen 5 3600 replacement, i.e. the Ryzen 5 5600X. So instead of $200 it went up to $300. Then AMD tried to say it was only $50 more than the Ryzen 5 3600X, despite it having the same TDP and cooler as the Ryzen 5 3600. With Zen 4 they tried the same.
I remember back in the day how only a few people were talking about higher prices when it came to Ryzen. I felt that the price argument was left aside; reasons (as always) were made for jacking up prices, even a sort of "AMD deserves this!" mentality. Is it then surprising that Nvidia has its loyal fans, or that it is preferred even when it is not logical to?
The same as the RX 7900 XT, which is an RX 6800/RX 6800 XT replacement. But AMD based its price around Nvidia jacking the RTX 4070/RTX 4060 Ti up to a $900 RTX 4080 12GB, and instead of realising Nvidia could easily reduce the price, they got caught out.