AMD Navi 23 ‘NVIDIA Killer’ GPU Rumored to Support Hardware Ray Tracing, Coming Next Year

Some people just like very high framerates - when I was gaming more seriously I always used to buy either high-end or one-step-back-from-high-end SLI setups, even though I played 1-2 steps down resolution-wise.

These days, for single player, I'm pretty happy around 60-70 FPS with G-Sync, though for multiplayer I generally turn a lot of settings down anyway for a clearer image.
 
The worst people are those that buy 3080-type products to game at 1080p 60Hz lol, believe me this does happen.

3080 perf should fit my 3440x1440 100Hz nicely
 
The worst people are those that buy 3080-type products to game at 1080p 60Hz lol, believe me this does happen.

3080 perf should fit my 3440x1440 100Hz nicely

Why not? It means even Cyberpunk 2077 would run at that res. Basically, future games released in the next 3-5 years will run fine, meaning less upgrading. 4K users need to upgrade every gen to keep up.
 
Why not? It means even Cyberpunk 2077 would run at that res. Basically, future games released in the next 3-5 years will run fine, meaning less upgrading. 4K users need to upgrade every gen to keep up.

I suppose if you're happy paying all that money for wasted performance...

I prefer a better resolution and a bit more Hz, as the experience just feels nicer.
 
If you game on a TV, high-end CPUs have just never been that interesting. You would've started off at 1080p60, which doesn't need anything special CPU-wise. You probably had something like a GTX 1080 and got high settings with a constant 60fps. There was no real point in upgrading to any of the 20-series GPUs either. Then the next jump up might've been to a 4K120 TV, which is a huuuuuge gap from 1080p60. Again, there's very little to gain from a high-end CPU here (I suppose some games might manage 120fps on the GPU but not on the CPU - strategy games maybe). Also, the 2080 Ti just doesn't really cut it for 4K60, so again, the 20 series was probably very uninteresting.

Regarding buying a 3080 for 1080p60: there is something very desirable about always being able to whack all settings to max, get a locked 60fps, and have the ability to downsample (4K > 1080p actually looks pretty damn good). Sure, it's not the best way to spend money, but I do see the appeal.
 
The worst people are those that buy 3080-type products to game at 1080p 60Hz lol, believe me this does happen.

3080 perf should fit my 3440x1440 100Hz nicely

Indeed - there is a guy with an old CPU who only plays one game at 1440p, and with his 1080 Ti that game only utilised 80% of the GPU. He is complaining that with the 3080 his frame rates have hardly increased at all; it's just his GPU utilisation that has gone down. Why didn't he do his research before spending £750? A CPU upgrade would likely have given him the same frame rate improvements.
 
It's hilarious that I'm reading disapproval of AMD's sneak peek of their RX 6000 series, to be honest. Let's compare and contrast the two launches in this duopoly. And I will keep my eye out when AMD hard launches as well. But, so far:

- The 3080, called the flagship card, has less VRAM than the 1080 Ti and 2080 Ti
- AIBs charging higher prices than the OEM
- One AIB caught scalping its own customers
- Some AIBs using an unbalanced capacitor layout, causing black screens and crashes to desktop; some caught it before shipping while others didn't, causing more delays
- Paper launch, with only a few thousand cards worldwide (per JayzTwoCents)
- Won't be able to replenish stock fully until around 2021
- The Nvidia store crashing and being accused of exposing customer data; it also showed $899 prices for the 3080, which was said to be fixed now
- The 3090's disappointing performance uplift over the 3080, despite costing upwards of $900 more
- One AIB using power connectors where the pins bend as you plug the cable into the socket

When it comes to AMD's sneak peek, I just don't see the comparison with the competition's release. It would take a lot of mental gymnastics to contort and twist what AMD has done, so far, into a negative.

I wish people would actually get the truth of this - I find it cheapens criticism, even where there is a valid overall case, when the facts are distorted just to sling more mud. I'm surprised you didn't throw in several random mentions of POSCAPs a few times just for effect.



Same with this - it is just desperate mud-slinging - everywhere saw unprecedented demand, with several retailers' websites taken out for hours at a time and even unrelated sites thinking they were being DDoS'd.

Out of 9 points you think it's mudslinging with just 2?? OK rroff :D

Those are the events that have taken place since Ampere launched. The launch of Ampere, which was rushed, has been a complete disaster. One does not need to fling mud to point it out - just point out what has happened.

Now, compare that to a sneak peek of the RX 6000 series being called Big Navi and the lack of clarity around those benchmarks... they don't compare.



I think deep down he is worried AMD won't execute and is trying to get the excuses/deflections in now :D:p
Ah, you are projecting with that post. :p

As it only reveals how you truly feel when you defend Nvidia over the horrendous launch of Ampere.
Since you can't refute why Ampere's launch was a disaster, yet you try to defend it, I think deep down you are worried that AMD will execute and take market share - hence the excuses and deflection (mudslinging). Just as @KentMan laid you out earlier.
:D
 
A 3080 for 1080p 60Hz does make sense if you want more than 60fps in everything with all the bells and whistles on max/ultra, especially if you want to use a big 'old' TV.

There's no such thing as overkill or too much power/performance anywhere in PC gaming. 2x 3090 in SLI or 2x 6900 XT in Xfire at perfect scaling would be about where I'd want to start for 4K all-ultra at 60Hz, assuming SLI and Xfire were still properly supported. I've had 3080-level performance for 3.5 years already in the games I play at 4K, and it cost me £1300 in 2017 (1080 Ti SLI): https://www.youtube.com/watch?v=kjiIQzvGGG8&ab_channel=TheSpyHood

I would have been able to double that performance at the same price 1.5-2 years ago if AMD and Nvidia hadn't **** the bed. Now I'm looking to equal it at £650-800, with no way to significantly improve on it. SLI is dead, but not in the games I play, so there is no upgrade path I'm excited about. Really annoying tbh, as I'm used to a really good graphical improvement every couple of years, while newer AAA games (which a 3080 can't come anywhere near maxing at 4K, such as RDR2 and MSFS2020) mean I'm in limbo.
 
Out of 9 points you think it's mudslinging with just 2?? OK rroff :D

Those are the events that have taken place since Ampere launched. The launch of Ampere, which was rushed, has been a complete disaster. One does not need to fling mud to point it out - just point out what has happened.

If one does not need to sling mud, then why go to the effort of distorting and dragging up everything you can? (I only took up those 2 points as they were the most obvious examples of what you were doing.) For that matter, why are you mentioning it in this thread? Can't this thread be about discussing what AMD is doing?

Ah, you are projecting with that post. :p

As it only reveals how you truly feel when you defend Nvidia over the horrendous launch of Ampere.
Since you can't refute why Ampere's launch was a disaster, yet you try to defend it, I think deep down you are worried that AMD will execute and take market share - hence the excuses and deflection (mudslinging).
:D

I've never defended Ampere's launch* - though I know it won't appear that way to you - I've simply been a stickler for people not distorting information. Likewise, I've merely discussed AMD matters in the light of what is realistic - you don't see me talking them down, claiming they won't even beat the 3070, etc., which is the kind of talk you get from people on the opposite side of the equation to you.

Just as @KentMan laid you out earlier.

You mean the misplaced and entirely childish comment about the soldering iron?


* I haven't actually even given my own thoughts on it, positive or negative, yet. Personally I'm not the happiest about it, as I'm doing some RTX-based development as a hobby and my GTX 1070 doesn't cut it, and I'm in no hurry to spend money on Turing.
 
Struggling to justify moving on from my 4820K even at 1440p, never mind 4K - I keep hovering over newer setups, or alternatively sticking a Xeon with more cores in this rig, but nothing feels like the right move forward at the moment.

The main difference with these 4000-series CPUs really is in the minimums - I suspect that in newer games, even when the averages aren't that far behind the newer CPUs, you will increasingly notice the minimums causing some lack of smoothness.
 
Like the thing is, your build needs to be balanced. If you have a 60Hz monitor, you need a system that can provide just over that at whatever games and settings you use; any more is more or less a waste. There's no point having a GPU that can push 200fps when your monitor is only 60Hz, etc.
 
Some people just like very high framerates - when I was gaming more seriously I always used to buy either high-end or one-step-back-from-high-end SLI setups, even though I played 1-2 steps down resolution-wise.

These days, for single player, I'm pretty happy around 60-70 FPS with G-Sync, though for multiplayer I generally turn a lot of settings down anyway for a clearer image.

I don't play competitively like when I was in my early twenties - it was CS or UT back then, and good fun. CRT monitors were huge back then.

I still play FPS games; it's just that my eyes are shockingly bad now, a couple of decades on, so 60fps is great for my needs. The only multiplayer I've played lately is Warzone, and either I got rusty in a few weeks off it or the min-maxing has increased massively.
 
A lot of modern gaming has become about the meta, even in FPS games - so many players now play the thinking game, only ever attacking when they have a stack (health, armour, perks or whatever), etc.

I don't know how you find it, but I don't understand why there aren't options for casual play. Say I'm now rank 61 - I wouldn't mind being matched with similar groups of players, or with people who just want to log on for a brief couple of hours. The opponents I face have crazy items and obviously high skill - you can just watch the replay. Shouldn't the system match them with similar players, or, even better, make them start with a handicap so they have to work harder? I don't get why it's the opposite: flood them with super equipment so they are even more invincible lol.

When my mates used to make me prove my skills on the original Counter-Strike, they would say, right, you can only use a pistol and a flashbang this round. Then, when you still topped the board, you proved you didn't need the AK-47 or M4A1 - that was the handicap.
 