
AMD Navi 23 ‘NVIDIA Killer’ GPU Rumored to Support Hardware Ray Tracing, Coming Next Year

Status: Not open for further replies.
Have to agree, no issues with my 5700 XT at all. Hardware Unboxed didn't get the issue either. I wonder if it was just some odd hardware configurations rather than the drivers causing the issues? A fresh W10 install on X570 worked for me.

My two kids both have 5700 XTs and they've had issues with theirs since the start, despite fresh Windows 10 installs and all-new 3600s on B450 motherboards. I don't believe it was a driver problem; I think it was down to them being 50th Anniversary editions and always hitting the junction temperature limit until I undervolted them. They haven't had issues since I did that. I believe it was the early cards that had the problems, but that's just my experience.


This whole AMD drivers thing has to stop; their drivers have been brilliant for a long while. Yes, people had issues early on with some 5700 cards, but it was only a vocal minority who shouted about it; many of us have had zero issues for years.

The issue may be stock, a lot depends on that, but don't worry about drivers; they've been a non-issue for most.

I think both AMD and Nvidia have about the same number of driver issues across many, many games; I don't think either is better than the other. I've always had more issues when it's an AMD-sponsored or Nvidia-sponsored game. I always expect problems when I see the AMD logo on the load-up screen, and my kids do when they see Nvidia's. Those sponsorship deals have always caused more issues than they're worth.
 
Exactly this: people with bad RAM timings from calculators, poor overclocks, etc.

I run everything at stock and have had zero issues; it all works as intended. My 5700 XT has been flawless, my 3800X and X570 likewise, C16 3600 MHz RAM etc. all flawless.

Buy the right components and treat them correctly and you'll have zero problems, provided the hardware isn't faulty.

Agreed, and a decent power supply, properly connected ;)
 
Thanks guys, this definitely gives me some food for thought and some insight into what to do. I'm just worried that if I do cancel the 3080, these cards are going to be as hard to get at launch too. You'd like to hope AMD aren't going to make that mistake; seeing how badly Nvidia have performed in their launch, this is a huge opportunity for AMD to cash in on Nvidia's ineptitude.
 

Hedge your bets. Don't cancel your pre-order, you can always return it. See if you can get an AMD card on release. Win win :)
 
I've been using a 5600 XT after selling a 2080 Ti prior to the 3080 launch, and I have been super impressed with the card. HOWEVER, the AMD system just is not as good as Nvidia's, and that is a fact. In the two months of using the 5600 XT I have had sudden black-screen crashes and games crashing to desktop, and that Adrenalin software is pants imo.
Also, I use a Predator ultrawide 100 Hz monitor. Clearly I cannot use vsync, but getting the 100 Hz to remain stable is impossible. Activating vsync in-game often defaults to 60 Hz, and setting a custom resolution often results in a "signal out of monitor display range" message.
In fact, the card doesn't like 100 Hz at all and has much better success at 95 Hz. Not that that is perfectly stable either.
My friend's 5700 XT was returned too, as he had crashes, stuttering, etc.

So my 2p is: yes, the card is very capable and is great when used on a 60 Hz TV (the system in our house we originally purchased the card for), and it is natively supported for a dual-boot Hackintosh - but the argument about poor drivers/software in Windows is still there.

I've just purchased a 2070 Super after conceding defeat on the 3080 for now. Plug and play, no issues at all, and it really hits home how much more stable the Nvidia system is when you switch between the two like that.

That being said, I WOULD consider a Big Navi in the current Nvidia situation, were it to be comparable performance-wise and at a good price. I still feel AMD are the 'good guys' at the moment.
 
I've never had driver issues with my 5700 XT.
Nvidia drivers have given me plenty of issues, and needing to have an account? That's a bit creepy.

TGP, if I remember right, does not include memory power draw, just the chip. 300 W maybe.
 
Apparently, Cyberpunk 2077 won't be fully upgraded with the PS5 / Series X improvements until 2021 (maybe a bit earlier), but it will get some graphical enhancements.

So, at least initially, the PC version will have significant advantages (unless the PC version looks like the last-gen console versions too until it's upgraded!).

But it will be playable on all consoles at launch, at least.
 
I agree, I doubt NV originally planned for 320 W and 350 W TDPs on the RTX 3080 / 3090. It looks very much like they decided to 'max them out' at a later stage.

For one thing, the RTX 3080 spec recommends a 700 W PSU, which is enough to rule out some customers, who may opt for a lower-TDP GPU from AMD instead (rough sums below).

How many people who bought them will end up sending them back because an inferior/low-grade PSU causes problems?
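
As a back-of-the-envelope illustration of why the recommended PSU figure ends up that high, here's a minimal sketch; every wattage except the 320 W TDP quoted above is an assumed, illustrative figure rather than an official spec:

```python
# Rough PSU headroom estimate. All figures except the GPU TDP are
# illustrative assumptions for a typical gaming build, not official specs.
components_w = {
    "GPU (RTX 3080, 320 W TDP)": 320,
    "CPU under gaming load (assumed)": 150,
    "Motherboard, RAM, SSDs, fans (assumed)": 75,
}

peak_draw = sum(components_w.values())   # ~545 W flat out
recommended_psu = 700                    # the figure quoted above

headroom = recommended_psu - peak_draw
print(f"Estimated peak draw: {peak_draw} W")
print(f"Headroom on a {recommended_psu} W unit: {headroom} W "
      f"({headroom / recommended_psu:.0%} spare for transient spikes and ageing)")
```

On those assumed numbers a decent 650 W unit would technically cope, which is partly why so much comes down to the quality of the PSU rather than the number on the label.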
 
Do you have any proof of this GDDR6 claim you are making?

I'm not convinced AMD's 7 nm GPUs are going to be as expensive to manufacture as Nvidia's shoddy Samsung 8 nm ones are...
If AMD can come to market with similar raster-performance products at a lower cost than Ampere, they will hoover up sales. The lack of DLSS and RT performance will keep some die-hards on the green side, but there are a lot of fence-sitters who don't care about anything other than raster performance and price, and they will jump.

AMD's biggest problem under the above scenario will be stock; global demand will be huge.

I think we are being generous in assuming people are a) informed and b) open to other avenues.

If you speak with anyone who isn't really into PC building, you will just hit a brick wall of mindshare. The ignorance, in fact, is akin to the PC World 101 training manual for shop-floor sales: "Intel and Nvidia for gaming." So I would like to see more fence-sitters, as you say buddy, but even going by this forum, the majority still buy Nvidia.
 

Nvidia have had the better product since probably the 7970 days, so I can understand why the balance tips in their favour, but I'd love to replace my GTX 1080 with a Radeon card because I like the idea of something new.

Brand loyalty is a strange one to me. I'll just get whatever is the best bang for buck at the time.

The 3080 hype has really highlighted how insane people are, though. I was refreshing the Nvidia site myself on *release* day because it swept me up. Luckily it didn't materialise, and I have the patience to wait.
 
Upgrading the cooler from the last gen would've been an easy decision, considering the RTX 2080 Ti can exceed 80 degrees.

Not everyone will be willing to upgrade their PSU if a GPU isn't very power efficient. A TDP over 300 W for a single-GPU graphics card isn't exactly efficient when you compare the performance against the RTX 2080 Ti. And higher power usage always increases max temps...

I think many would rather keep their current PSU and save the £100 or so for an equivalent or better AMD GPU, which (in recent years) haven't had TDPs over 300 W.

RDNA 2/3 are both planned to make large improvements to power efficiency, so I think this will become more relevant to customers in the next year or so.
You can run the 3080 at 200 W and still outperform a 2080 Ti by quite a bit.
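
For anyone wondering how that sort of cap is applied in practice, below is a minimal sketch using the NVML Python bindings (pynvml, e.g. from the nvidia-ml-py package); it does the same job as `nvidia-smi -pl 200` on the command line, needs admin/root rights, and the 200 W target is simply the figure from the post above, not a recommendation:

```python
# Sketch: cap an NVIDIA card's board power limit via NVML (needs admin/root).
# The 200 W target is just the figure mentioned above, not tuning advice.
import pynvml

TARGET_WATTS = 200

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

    # NVML works in milliwatts; clamp the target to the card's allowed range.
    min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
    target_mw = max(min_mw, min(TARGET_WATTS * 1000, max_mw))

    pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)
    print(f"Power limit set to {target_mw / 1000:.0f} W "
          f"(allowed range {min_mw / 1000:.0f}-{max_mw / 1000:.0f} W)")
finally:
    pynvml.nvmlShutdown()
```

The limit typically doesn't persist across reboots, so in practice people either script it or use the vendor tuning tools instead.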
 
It's like Vega: tuning it was quite easy and it ran quiet. People burn the launch impression into their memory and never change that opinion. It's one thing you get used to as an AMD buyer, because that impression is way off the mark once you dig into the settings.
 

Depends on what you define as 'better'. Some of the AMD cards have actually been better if you look at them from another perspective.
 

Lies... It gets a few fps Moar so the NV card is better than God! Meanwhile your AMD card is red so sucks monkey balls. That is all.
 