
AMD Navi 23 ‘NVIDIA Killer’ GPU Rumored to Support Hardware Ray Tracing, Coming Next Year

Status
Not open for further replies.
Tbf it's not like nVidia have dropped their buyers in it before, 1060 3GB/2060 6GB for a start! 10GB should be fine for a couple of years at least though at 4k, nv will make sure of it (they do pretty much control the market and games tend to get built for them).
 
I'd prefer ultra settings in a game I've paid for, so for me it's a major thing. I've been playing Cold War at 1440p and the difference between that and 4K is pronounced, but so is the performance dip, with terrible memory leakage IMHO on the 5700 XT at the minute. It's a beta, but I want the best visuals, being as a PC costs thousands of pounds.
That's fair enough, but sometimes the effects are hard to notice or just subjective. Like shadows can be softer or harder, it's partly preference.
 
In before: [images]
 
128 watts in game!? What?!

That seems crazy low to me... granted it was playing RDR2, but I expect despite being "last gen" that game should be stressing the console pretty well unless it's locked to the old framerates?

The tweet is referencing figures measured here, btw: https://tweakers.net/reviews/8252/3/xbox-series-x-onze-eerste-indrukken-en-tests-meten-is-weten.html

It's obviously going to go higher than that in other games. It has to, or the PSU choice in the machine makes no sense. At 128W, the PSU in the Series X is running under its optimal load range, so it's pulling excess wattage from the wall. A smaller PSU would be more efficient, produce less heat and be cheaper to manufacture. So why put in a PSU that's less efficient, runs hotter at low load and makes the machine more expensive? The only reason is that 128W is not representative of next-gen games that push the hardware.
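The efficiency argument above can be sketched numerically. Here's a minimal toy model converting DC draw into wall draw through a load-dependent efficiency curve; every number in it is invented for illustration (real curves come from a PSU datasheet, and none of these figures describe the actual Series X supply or its rated wattage):

```python
def psu_efficiency(load_fraction):
    """Hypothetical efficiency at a given fraction of rated load.

    Switching PSUs typically peak in efficiency around mid load and
    fall off sharply at low load; the points below are made up to
    mimic that shape, not taken from any real datasheet.
    """
    curve = [(0.05, 0.70), (0.2, 0.84), (0.5, 0.90), (1.0, 0.88)]
    # Linear interpolation between neighbouring points on the curve.
    for (x0, e0), (x1, e1) in zip(curve, curve[1:]):
        if load_fraction <= x1:
            t = (load_fraction - x0) / (x1 - x0)
            return e0 + t * (e1 - e0)
    return curve[-1][1]

def wall_draw(dc_watts, psu_rating_watts):
    """Watts pulled from the wall to deliver dc_watts to the hardware."""
    eff = psu_efficiency(dc_watts / psu_rating_watts)
    return dc_watts / eff

# 128 W DC on a hypothetical 300 W-rated supply: the wall draw is
# higher than 128 W because the conversion is never 100% efficient.
print(wall_draw(128, 300))
```

The low-load penalty in the curve is the crux of the argument: the further below a supply's sweet spot a console idles, the more of the wall draw is wasted as heat, which is why sizing the PSU well above the typical game load would be an odd design choice.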
 
I feel that if RT is the next big thing (let's buy that argument for a while), these VRAM discussions won't matter, because the hardware will be obsolete every second year at least for the next four generations... so buying a 3080 at launch is a better decision than buying a 3080 Super 6 to 12 months down the line, or spending $200 on more VRAM.
Well, RT increases VRAM usage, so even that scenario would cause VRAM to be a factor earlier on.
 
It's obviously going to go higher than that in other games. It has to, or the PSU choice in the machine makes no sense. At 128W, the PSU in the Series X is running under its optimal load range, so it's pulling excess wattage from the wall. A smaller PSU would be more efficient, produce less heat and be cheaper to manufacture. So why put in a PSU that's less efficient, runs hotter at low load and makes the machine more expensive? The only reason is that 128W is not representative of next-gen games that push the hardware.
That's a fair point, but I'd also not take a last-gen game as indicative of software that's designed to maximise the new hardware design.
 
The problem is that AMD might have the console market, but Nvidia owns the PC market.

There are a lot of what I'll call 'GameWorks' titles, some of the biggest IPs from the biggest studios.

Cyberpunk 2077 being a stellar example.
 