> Who said it can?

The Gears 5 demo Coalition knocked up in 2 weeks for the XSX reveal event, that's who.
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
> The Gears 5 demo Coalition knocked up in 2 weeks for the XSX reveal event, that's who.
Star Citizen.
I shouldn't have to break down your own post for you, but it's your whining that made no sense.
Your 2080 is holding you back; "doesn't cut it" is what you said. The 2080 Ti is 35% or so better than your 2080, and the AMD hardware in the consoles is on par with or better than the 2080 Ti. If you don't get that then I can't help you.

How much VRAM does the 2080 have? 8GB. And the 2080 Ti? 11GB.

So there's a shortfall of 3GB already against your current card. No wonder you can't play it: "I'm desperate for more FPS to play RDR2 at 4K". It not only has a lower clock speed but also less VRAM. The bandwidth of the 3080 may save it, but clearly the 2080 cannot cope with intensive 4K games.
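If it helps, here's the arithmetic above as a tiny sketch. The 35% uplift and the VRAM figures are the claims from this post, not measured benchmarks:

```cpp
#include <cstdio>

// Back-of-the-envelope using the figures claimed in the post above.
// These are forum claims, not measured numbers.
int main() {
    const double uplift_ti_over_2080 = 0.35; // "35% or so better"
    const int vram_2080_gb   = 8;            // RTX 2080
    const int vram_2080ti_gb = 11;           // RTX 2080 Ti

    // Relative performance, indexing the 2080 at 100.
    printf("2080    perf index: 100\n");
    printf("2080 Ti perf index: %.0f\n", 100.0 * (1.0 + uplift_ti_over_2080));

    // The VRAM gap the post calls a "shortfall".
    printf("VRAM gap: %d GB\n", vram_2080ti_gb - vram_2080_gb);
    return 0;
}
```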
> then it's just another duff release from AMD

Unless, of course, you're not interested in 4K gaming or can't afford to move past 1080p and just want to play some damn games at a consistent 60Hz instead of swinging your green/red/blue dick around? Or does a GPU only count if it's bleeding your money dry at the very top end?
> What res?

1440p, using 8GB. I have the GPU muscle to run 4K but the VRAM chokes.
So DLSS is the only nVidia feature.... that makes it "loaded"?
CP2077 uses DXR, which is DirectX Raytracing. Nothing to do with nVidia specifically; it's vendor agnostic, as all graphics tech should be really. nVidia has RT cores to accelerate the ray tracing calculations, but I dare say AMD has a solution as well.
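For what it's worth, the vendor-agnostic point is visible in the API itself: an engine asks Direct3D 12 whether the installed device supports DXR via a feature query, without ever naming a vendor. A minimal sketch (Windows/D3D12, error handling trimmed):

```cpp
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main() {
    // Create a device on the default adapter -- could be nVidia, AMD, or Intel.
    ComPtr<ID3D12Device5> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device)))) {
        printf("No D3D12 device available.\n");
        return 1;
    }

    // DXR support is reported through a feature-support query, not a vendor check.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &options5, sizeof(options5))) &&
        options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0) {
        printf("DXR supported (tier %d).\n", (int)options5.RaytracingTier);
    } else {
        printf("DXR not supported on this device.\n");
    }
    return 0;
}
```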
I am planning on the 3080, but I'm at the very least gonna see what AMD has to offer first - and Cyberpunk supporting DLSS isn't enough to force the choice!
> Unless, of course, you're not interested in 4K gaming or can't afford to move past 1080p and just want to play some damn games at a consistent 60Hz instead of swinging your green/red/blue dick around? Or does a GPU only count if it's bleeding your money dry at the very top end?
"duff release" my stinging rectum
> Surely if the Xbox GPU can get close to a 2080 Ti on 130W it's not going to be that difficult to at least get near a 3080 with a 300W desktop GPU.

I've seen it compared to the 2080.
Also...
Something dodgy about Nvidia's marketing.
https://twitter.com/HardwareUnboxed/status/1301796954398101504
> I'm saying, in this instance, it's unwise to gauge how well a graphics card might perform based on the performance of a cut-down version in a console package vs another graphics card, in a game where the LoD is not accurately defined. It barely gives you anything, and really AMD need to give us a better example of what the performance will be, if it can compete, before people start buying up nVidia. Give people more of a reason to wait, if you will.

And I'm saying that if they can get this on 130W, it stands to reason they can do a lot more with something running 300W. The technology is there to do it, so it seems it's going to boil down to margins and whether it's worth them making such a card. It may well be under the 2080 Ti, but I very much doubt it.
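To put rough numbers on that scaling argument: if you naively assume performance scales with board power at constant perf-per-watt, the 130W-to-300W jump looks like this. Real GPUs scale sublinearly with power (clocks get pushed past the efficiency sweet spot), so the 70% efficiency factor below is purely an illustrative guess, not data:

```cpp
#include <cstdio>

// Naive formalisation of the "130W console vs 300W desktop" argument above.
// Perf-per-watt is NOT constant across a power range; the 0.7 factor is a
// made-up illustrative discount.
int main() {
    const double console_watts = 130.0;
    const double desktop_watts = 300.0;
    const double console_perf  = 1.0;  // index the console GPU at 1.0

    // Best case: linear scaling at constant perf/W.
    double linear = console_perf * (desktop_watts / console_watts);

    // Assume only ~70% of the extra power turns into performance.
    double discounted =
        console_perf * (1.0 + 0.7 * (desktop_watts / console_watts - 1.0));

    printf("Linear scaling:   %.2fx the console GPU\n", linear);     // ~2.31x
    printf("Discounted (70%%): %.2fx the console GPU\n", discounted); // ~1.92x
    return 0;
}
```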
Very sneaky from them, and I wouldn't put it past them .. maybe you should post this on the NVIDIA thread.
> Very sneaky from them, and I wouldn't put it past them .. maybe you should post this on the NVIDIA thread.
I did, it'll get ignored...
I put nothing past AMD and NVIDIA when it comes to marketing tricks, sadly.

Once the GPUs are in reviewers' hands, we can get a better understanding of what's going on, and what the real performance benefit is.
I might put off a GPU purchase until Cyberpunk releases. CD Projekt always seem to push the envelope tech-wise with their releases, so I think it'd be a decent benchmark: it's likely to be the most demanding game for the next 12-24 months, which is a reasonable window over which I'd expect a top-tier GPU to keep pushing the envelope.

After all, even today The Witcher 3 at 4K is very demanding. If the 3080 and Big Navi can both ace Cyberpunk maxed out - ultra, every setting turned on, including ray tracing - then we can rest assured both cards have good longevity. And if they match each other, then VRAM seems likely to be the key factor in deciding which gets our £££££.
Just wait for the reviews; Hardware Unboxed are the most trustworthy, completely so...
> 1440p, using 8GB. I have the GPU muscle to run 4K but the VRAM chokes.

I don't play, but it's renowned for being, to put it politely, not very well optimized.
> I don't play, but it's renowned for being, to put it politely, not very well optimized.

It's server locked. On the rare occasions I get a fresh server it runs at 100 FPS at 1440p, highest settings; with full servers it's 40 to 70 FPS depending on where I am. Upping the resolution to 1800p doesn't bring the frame rates down, it just brings the GPU load up. At 1800p it's mostly playable if I'm in space; on a moon or planet surface it stutters and my system RAM also reaches full capacity. At 4K it's unplayable.
I thought the built-in granularity of most PC games means settings can be adjusted to improve performance?