
Poll: AMD Keeps Screwing Up (HUB video) - do they?

Do people think AMD keep screwing up with their dGPU launches?


Total voters: 75

This somewhat touches upon what a few of us were saying in another thread - https://forums.overclockers.co.uk/threads/nvidia-4000-series.18948098/page-1491#post-37287408

Having owned basically every gen of AMD GPU from the 3850 to the Vega 56, all incredible GPUs at the time and imo better than the Nvidia alternatives, it feels like over the past 4-5 years they just can't catch a break, let alone keep up with the competition, as they haven't focussed on the right areas. Imo the 2 biggest oversights from them have been upscaling and RT (whether people like it or not, these 2 things have become incredibly important features, as evidenced by the number of RT titles coming out now and upscaling being required if you wish to keep graphical settings dialled up).

I don't think it is necessarily all AMD's fault or that their products are poor (except for the pricing..... imo they need to be at least 20-30% cheaper than the Nvidia alternatives due to the shortcomings, which result in a worse overall package). I think it is more that Nvidia have pulled so far ahead, especially with their features recently, and given Nvidia have a good 2-year headstart here, for AMD to catch a break, Nvidia needs to **** up and AMD need to pull off a Ryzen moment.

Do people think AMD keep screwing up with their dGPU launches?

Mods, can we get a poll for yes or no? @Stanners
 
I wouldn't say not focusing on RT and upscaling was an oversight on AMD's part. That was more Nvidia releasing a tech that GPUs of the time weren't ready for, then releasing DLSS as a band-aid and using their mindshare to convince everyone that it's needed. Great work from Nvidia, in all honesty.

Somewhat an oversight, as the writing was on the wall about which way the industry was moving with RT and finding other methods of increasing performance outside of just hardware advancements. AMD have had plenty of time to get their upscaling up to scratch, but they still haven't.

Someone had to get the RT evolution started, and as we all know, the first iteration of any tech is always crap/meh; then it gradually becomes the standard to the point where no one even considers it a new/unique feature. When was the last time we heard people praise how good a game's tessellation or ambient occlusion implementation was, or how well GPUs handle tessellation (in fact, we no longer even have separate benchmarks for tessellation)? The same way Avatar and Spider-Man 2 don't really get praised for the "RT", because there is no option to turn it off, so the "overall" visuals just get praised. If Nvidia had waited for the hardware to be there for RT, we still wouldn't have a single RT title now, and we have come a long way from the BF5 and Control RT days to the likes of CP2077, AW2, Portal, Avatar and Metro EE kind of RT.

Most games coming out in the past 3 years have some form of RT and it's only becoming more common, especially with UE5 and Lumen; RDNA 3 matches 4-year-old Nvidia GPUs with RT enabled.... that is not a good look at all imo. Then we have ray reconstruction from Nvidia, which is starting to be added more now; not only does it improve IQ but also performance (except for The First Descendant, for some strange reason).

AMD had years of doing poorly in DX11 titles, only sorting it out in 2022 - https://community.amd.com/t5/gaming...nd-dx11-performance-optimizations/ba-p/523632 - they were lucky with the move to DX12 as they generally had the better perf here, where Nvidia had the disadvantage, but Nvidia have now addressed that - https://videocardz.com/newz/nvidias...performance-improvements-for-geforce-rtx-gpus

Unless AMD pull off a miracle and/or Nvidia end up like Intel, AMD are going to have a hard battle in future, especially when/if more titles stop providing options to turn off RT entirely.....

DLSS is needed? No, it's not "needed", at least not if you don't mind sacrificing graphical settings/IQ and/or getting worse performance.
 
AMD turned a blind eye to software evolution in the GPU space and stuck to what they were comfortable with: raster performance and the associated drivers.

Now they’re in a constant state of needing to catch up while the main competitor refuses to take their foot off the gas.

They don't understand that releasing poor versions of equivalent features does more reputational harm than good. They just think that if it gives them a checkbox on a feature comparison sheet, it's good enough to hoodwink the consumer.

Their lack of sales proves otherwise.

The good news is that AMD themselves have recognised their weakness and how far behind they are, hence the massive investment they are now making into the software side:


I suspect that when their solutions match or even exceed Nvidia's, we will no longer see them "championing" the open source approach.....
 
If you believe in marketing from any company then you are the problem. Far too many people "believe in" brands which is hilarious, and lap up marketing and other crap.

The problem is that with such bad marketing, people get hyped and expect some ground-breaking revolution; then come release, AMD are seen to be constantly under-delivering, falling flat on their face because the product hasn't lived up to the marketing. That does not give a good impression, especially when said brand claims to be a premium brand and charges in the same ballpark as the market leader: "poor Volta", "overclockers' dream", "our GPUs aren't a fire hazard...." (fast forward to the 7900 XT(X) having vapour chamber issues....), "DLSS is dead! FSR is here" (4 years later, DLSS is still here and still better....). "We are working on FG", then a year later it gets released in 2 awful games (oh, and it doesn't work with VRR, you have to hit your screen refresh rate, there are frame pacing and UI issues etc.) and the driver solution with Anti-Lag gets you banned in MP games...... "RDNA 3 will be the most power efficient GPU to date", fast forward to it being a power-guzzling beast...... It's things like this which stick around and tarnish the brand. Obviously Nvidia aren't perfect and have their fair share of problems, but the difference is they are pretty quick to resolve issues, and their marketing doesn't over-promise to the same levels as AMD's; of course, they don't need to promise the world given their market share.

Well WELL overdue imo, but at least they've recognised it like you say. Probably AI-focussed but gamers will benefit too.

Yeah, hopefully the AI side of things will translate down into gaming too, like what has happened with Nvidia. The key thing will be AMD ensuring they keep up good partnerships/communication with game developers so their tech gets into games quickly and widely, not lagging behind by several months/years. This is where it's mind-boggling how they can't seem to achieve this given they provide the hardware for consoles; you would think they would have it much easier, but it seems not, which further strengthens my take that they are very much the kind of company that likes the "over the fence" approach. That will need to change going forward too.

The 2 biggest successes right now with AMD are their frame gen and AFMF 2, yet AMD aren't shouting about them from the rooftops.....
 
Not sure why the wall of text was needed in response to "if you believe marketing from any company, you are the problem". Marketing = professional/paid lying.

If you don't believe that marketing plays some part in a company's success then you are delusional.

There are reasons why companies spend millions on marketing, and it's not just to lie to or mislead potential/current customers... Not all marketing is lying, although I suppose you could consider AMD's marketing to be lying in many ways, the same way certain claims by Nvidia's marketing department are also lies.

The point of this thread, and the point that HUB get at, isn't so much just "marketing" but also the execution by AMD.

As noted above by someone else, it's the outcome/results that are most important (which is what we all value the most), and this outcome is confirmed by reviewers; but poor execution/marketing can leave a bad taste in the mouth, hence why AMD are still renowned for always under-delivering.
 
Probably should have made the poll a bit clearer; I somewhat set the tone in my OP but not in the thread poll title..... It's not so much a question about AMD's marketing screwing up but about AMD and what they are delivering to their customers, on both the hardware and software fronts. Safe to say the marketing is a complete mess, but what AMD are delivering is also considered to be scoring own goals regardless of marketing.

So reasons I voted yes:

- the 7900 XT(X) launch disaster with the vapour chamber and cards catching fire, more noteworthy given the comments AMD made about the 4090 fire hazard
- each new FSR version being hailed as the DLSS killer only to take 1 step forward then 2 steps back, while Intel got their version on par with DLSS despite being new to the dGPU market
- Anti-Lag getting people banned, a major oversight by AMD
- pricing their GPUs at launch in the same bracket as Nvidia counterparts despite not offering the same complete overall package as the competition
- the claims of how efficient RDNA 3 was going to be, better than Ada, yet come launch it was power-guzzling like mad, with idle power consumption issues on dual monitors; AMD quickly removed some of their slides from the website to hide these false claims (also, this wasn't PR marketing but comments direct from Lisa too)
- the knee-jerk reaction to DLSS frame gen, making it out like it was in the works for months/years, only to release it in 2 awful titles with a lot of missing features
- the lack of RT focus with RDNA 2, which could be excused at launch, but as shown the cards are aging pretty poorly now with all the RT games coming out; even with the writing on the wall, they didn't properly address it with RDNA 3, so now we have RDNA 3 matching 4-year-old Nvidia GPUs in RT games

Those are the main things that come to mind for me.

I don't think it's quite right to say that AMD keep screwing up their launches. It's more that Nvidia made a successful long-term play that has handed them an advantage AMD have struggled to erode.

Go back to the early '10s and Nvidia were getting some very strange looks for that money they were putting into AI. When the 2000 series launched in 2018, it was branded a damp squib by a good chunk of the gaming press and didn't sell brilliantly. But what Nvidia had done was gamble that simply increasing traditional GPU performance metrics wasn't the way to go. They put serious research funding into RT and upscaling and took a "dump generation" of consumer GPUs to get the technology out there and get developers using it. When RT and DLSS actually became properly usable with the 3000-series, they had killer features and AMD were a long way behind them.

And in some ways, that's capitalism working correctly. Nvidia took a risk on long-term investment (with no short-term reward) and get a reward now in terms of increased market-share in the consumer space and absolute megabucks in the corporate AI space. The problem is that it's proving so hard for AMD to close the gap that we're now left without much competition at the top end of the consumer market.

Yup, that's exactly it, and I noted similar in my OP: Nvidia have simply excelled further and in a shorter space of time; essentially they invested in the right areas, which are paying off big time.
 
What does "getting it right" mean? In their eyes, they may be "getting it right". Or do they have to copy everything nVidia does?

Well, they are competing with Nvidia on features which are now considered a must-have by many users and the tech press, so yeah, they kind of need to match Nvidia if they want to be considered a premium brand and charge in the same bracket as Nvidia GPUs, or better yet, beat Nvidia to the punch rather than following behind by months/years.

Right now they have one big advantage with AFMF, but alas, their marketing capabilities have let them down in capitalising on it; Nvidia will no doubt have frame gen injection via drivers with the 50xx/DLSS 4 and shout about it from the rooftops, so the main reason to go AMD over Nvidia right now will be gone.

Remember the RDNA 2 launch, promising plentiful supply that turned out to be tiny.
Making such bold claims immediately following the Nvidia RTX 3000 scalping and shortages was so stupid.

Ah yes, forgot to list this one.

I wanted RDNA 2, i.e. the 6800 XT, at the time, but the lack of stock and the 0% chance of getting one for MSRP in the UK pushed me to the 3080. Looking back at how things ended up, AMD did me a favour by giving 80% of their supply to consoles though. I was signed up to part alerts and the 6800/6800 XT hardly ever came into stock compared to the 3080 (even compared to the FE).
 
Maybe supply was plentiful worldwide and just the UK got a limited allocation? Who knows. One thing that was quantifiable was the nVidia pricing, but somehow AMD are equally bad because some people couldn't get a card at launch?

I don't know why we even have these threads, because they're just used as an excuse for people who don't buy AMD GPUs and never will to have a go at them and claim they would buy them if they were 50% faster and 50% cheaper than nVidia!

It was stated ages ago that AMD were supplying 80% of their RDNA 2 stock to consoles at the time, so it wasn't just a UK thing. What the UK didn't have was access to the AMD store to secure a GPU at MSRP, so if people in the UK wanted an MSRP RDNA 2 GPU, they had to buy elsewhere and import, or pay the inflated prices from etailers and AIBs.

With this post and your fanboy comment above, it seems like you're the one wearing rose-tinted glasses and not liking some valid points being raised about AMD?

How is a company going to improve if flaws/issues aren't pointed out? If AMD fans keep sweeping things under the rug, AMD will lose all market share and then we'll be left with only Nvidia, not exactly a good thing for anyone. AMD are no longer the underdog; they are worth billions. They even admitted their flaws around the software feature set themselves, hence, again, the huge investment; if they and their customers thought they were delivering products on par with the competition, they would not be doing this....

Are you saying that everyone who has voted yes doesn't buy AMD GPUs? I can see plenty of AMD GPU owners voting yes..... And as noted, I have owned more AMD hardware than Intel and Nvidia combined, probably more than most people on this forum too.
 
Name and shame please :D

Do you mean Nexus, the guy who before his 3080 had AMD cards only for like a decade?

I still remember him being an AMD boy. All the posts are there to see if you don't take my word for it.

Yup, AMD/ATI were fantastic before RDNA 2; Nvidia didn't have enough advantages to warrant going with them (I don't care for CUDA, ShadowPlay and PhysX [nice but only in a handful of games] and didn't have a G-Sync module based display). Back in the day, so to speak, it was very much just hardware/sheer perf, efficiency and pricing; nothing else mattered, and AMD couldn't be matched on bang per buck, especially when they did so many game bundles too.... Now there are so many factors at play, more valuable/worthwhile feature sets to consider.

The one thing I loved and that kept me with AMD in those days was Mantle; it was a godsend in BF4 (my main game back then), and also the start of DX12, where AMD had the lead over Nvidia. Being in consoles and with DX12, I expected to see bigger differences as time went on, but alas that never really happened, at least nothing substantial enough, and Nvidia have now resolved their DX12 shortcomings; in certain games like Assassin's Creed, AMD's lead was massive.
 
My thoughts
:p


- the 7900 XT(X) launch disaster with the vapour chamber and cards catching fire, more noteworthy given the comments AMD made about the 4090 fire hazard.
I don't remember, but sorry if that was the case; I thought the vapour chamber issue just caused throttling. I don't think they caught fire, did they? And it only impacted the reference model, whereas the 4090 cable issue saw melting across the board. I think we can probably say the 4090 issue, from a safety pov, was worse in comparison.

- each new FSR version being hailed as the DLSS killer only to take 1 step forward then 2 steps back, while Intel got their version on par with DLSS despite being new to the dGPU market
It's taken a while, yes, but FSR 3.1 is probably where AMD want to be (finally). And let's not forget that this works on all makes of cards, so it also benefits Nvidia users with older cards. Personally I'd be saying thanks, AMD!

- Anti-Lag getting people banned, a major oversight by AMD
Tricky one. But can we fully lay this on AMD? I bet AMD, and likewise Nvidia, can't test every single driver change with every game and involve every dev team. I personally think some blame lies with the VAC system. I mean, did Nvidia not know about Windows 11 22H2 and do any internal testing in advance before release? https://www.tomshardware.com/news/windows_11_22H2_nvidia_GPU_woes - I'd think this was a bigger oversight, impacting more people and more games than just CS2.

- pricing their GPUs at launch in the same bracket as Nvidia counterparts despite not offering the same complete overall package as the competition
Nvidia stuff is overpriced, and usually Nvidia launch lesser cards first before the "proper" cards come out later, so there's no way AMD should be trying to emulate any of this. However, I don't know if general costs have gone up, hence the price rises. Could all be fixed for all we know lol.

- the claims of how efficient RDNA 3 was going to be, better than Ada, yet come launch it was power-guzzling like mad, with idle power consumption issues on dual monitors; AMD quickly removed some of their slides from the website to hide these false claims (also, this wasn't PR marketing but comments direct from Lisa too)
Again, I think the dual monitor issue was quite specific to high resolutions and high refresh rates. I'm not 100%, but I think it is now under control (or at least improved). RDNA 3 performance per watt vs previous AMD cards (not Nvidia here) is actually an improvement. Tom's Hardware did a comparison: "...the RX 7700 XT ultimately ends up delivering about a 20% improvement in performance per watt over the 6700 XT, and a 30% increase compared to the RX 6750 XT." Neat!

- the knee-jerk reaction to DLSS frame gen, making it out like it was in the works for months/years, only to release it in 2 awful titles with a lot of missing features
Not sure what you mean on this one. lol.

- the lack of RT focus with RDNA 2, which could be excused at launch, but as shown the cards are aging pretty poorly now with all the RT games coming out; even with the writing on the wall, they didn't properly address it with RDNA 3, so now we have RDNA 3 matching 4-year-old Nvidia GPUs in RT games
To me, that is probably the only downside at the moment - if RT is important to you. However, I just ran the Black Myth Wukong benchmark, which mrk informed me is using RT (Lumen), and my RDNA 2 card still managed respectable FPS at 1440p.

There were reports from people whose GPU popped because of the lack of water to cool, i.e. the vapour chamber issues. I can't recall if it impacted all models, but I believe only the 7900 XTX was affected; AMD said it was a small batch, but the point still stands: marketing/making fun of the competition only to face a similar problem yourself. Weren't the 4090 issues in the end put down to user error, mostly affecting AIB models along with people using custom cables (wasn't it Gamers Nexus who came to this conclusion, finding that the connector wasn't providing good enough feedback to show the cable had been fully inserted)? I think if using the supplied power cable with the FE, it wasn't an issue. AMD handled it well though, tbf to them.

FSR 3.1, as shown by DF etc., is still nowhere near DLSS and barely improved over the FSR 2.x versions (on the upscaling side). Frame generation is much better and very good now, although the question of consistency is still up in the air. As shown by the poll created here a while back, most people don't use FSR, including AMD's own customers, so whilst it is good and will probably eventually get there, it's still not a great outcome if most of their own user base aren't using it.

Given AMD always like to say that what they do is to work best for consumers and their game dev partners, you would imagine they would be working closely to test things like this. Yes, new features like this should be tested by everyone involved, but the injection method of AFMF is done by AMD, not by game devs, so there is no one to blame here but AMD (unless they did pass it on to the game devs to test and they also missed it.....), hence why they pulled it and have now relaunched it. Microsoft releasing an update which harms Nvidia (or any of the dGPU brands, for that matter) is not on Nvidia; of course, Microsoft should be working alongside dGPU vendors if they are going to be making changes which impact GPU drivers. Personally I can't say I had any issues with that update and my 3080, and it was a driver issue, i.e. it didn't apply to everyone, the same way AMD ran great for me throughout the decade but not so for others, and vice versa.

Prices are 100% fixed and no doubt Jensen and Lisa are both behind it together.

It is still a far cry from what Lisa/AMD were saying before launch though.... But power efficiency no longer matters since RDNA 3 came out :p

This YT covers it well:


It then came out in 2 of the worst games to date, Forspoken and Immortals of Aveum, and was completely useless: it didn't work with VRR, had frame pacing issues, and you had to be hitting your screen refresh rate for it to be smooth and be using vsync, which then meant insane latency, among other issues, i.e. a knee-jerk reaction to DLSS 3.

That's the point: if RT is important to you. The vast majority of games will work via rasterisation; the same can't be said for RT. nVidia released it when they didn't have the hardware to deliver it, so they used software trickery to make it playable, and then their marketing convinced everyone they needed it
:D

Software trickery that basically all of the tech press confirms to be of value and deems worthy of having.... Again, why are AMD investing millions into their software department?
 
You're right that Nvidia released the tech before it was really in a position to benefit users, but I disagree on the rest.

The 2000-series was not a great demonstration of either RT or DLSS (and I say that as somebody who owned a 2080Ti). RT implementations were pretty minimal, the performance cost was huge and using DLSS1 to claw back framerate came with a huge hit to image quality. What it did do was get developers excited that the tech was out there and get them to start making games that supported those new features, but from the customer's perspective, you were best off skipping the generation. Then the 3000-series arrived, DLSS got significantly better and (on the 3090 and 3080 at least) RT could actually be worth using and, for the first time, there were games that looked much better with it (Cyberpunk probably the first). That trend continued and accelerated for the 4000-series. It doesn't matter whether DLSS and frame generation are "software trickery"... the fact is that in many situations, they work.

For my money, the biggest harm to the reputation of RT came from Microsoft and Sony's decisions to make RT a box-tick feature for the Playstation 5 and Xbox Series X. Those consoles are about as good at it as the 2000-series was, which is to say, not very good at all. I think a lot of people outside of the PC enthusiast sector have become quite jaded about it as a result.

Agree except for this part:

For my money, the biggest harm to the reputation of RT came from Microsoft and Sony's decisions to make RT a box-tick feature for the Playstation 5 and Xbox Series X. Those consoles are about as good at it as the 2000-series was, which is to say, not very good at all. I think a lot of people outside of the PC enthusiast sector have become quite jaded about it as a result.

We can blame devs for not doing a better job of optimising while still supporting raster (although, as shown, that is quickly changing now, much sooner than I predicted tbh); Metro EE, Avatar and Spider-Man 2 are all examples of RT-only games which run incredibly well on consoles.

"Better launches with more accurate marketing" is said in the first few sentences of the video, why make a topic on a video then try and deviate away from the core point, the marketing.

There is no such thing as accurate marketing; it is still first-party data and cannot be trusted. I mean, if you want to start a thread without the pointless video linked and pointless poll, just say "Do AMD fail to launch products in ideal conditions more often than not in the recent past?" Answer = Yes.

So rename the thread and delete the poll, or stick to topic: MARKETING!

Don't like it? You know where the door is. For such a pointless thread, you seem to enjoy posting in it and discussing the points raised....

Marketing and execution, i.e. releasing in a good state, go hand in hand.
 