> RX 6800, Cyberpunk at 4K, some settings turned down, with FSR: 90 fps, looks really good.

Given the DLSS improvements lately I'd be leaning towards the 3080 personally.
I think the next card in Nvidia's series is gonna be the same price as the RTX 3080. Many people were glad to pay that for the RTX 3080, so they will charge that again. There's been some inflation, so I doubt it will be less than that.
I don't think it will be an affordable card on launch, unless there is a Founders Edition version.
One problem Nvidia has is that when they launch the less expensive RTX 4000 series cards, lots of people will be looking at buying used RTX 3080s instead (because there won't be a big performance difference between the next AD104 card and the RTX 3080), and of course Nvidia won't profit from those used sales at all. So I think they will need a reference model to help keep prices closer to MSRP.
> I'm always amazed that anyone believes these random twitter accounts.

This guy just guesses:
He reckoned in August 2022 that the RTX 4070 would be a 300W card.
The RTX 4070 Ti has a TDP of 285W.
Well, considering the average IQ of those that spend more time on Twitter than in the real world... I'm always amazed that anyone believes these random twitter accounts.
> Who on earth plays cyberpunk anyhow? It's a benchmark tool for people to say "..but RT"; almost no one actually games it, and I own a copy.

RX 6800, Cyberpunk at 4K, some settings turned down, with FSR: 90 fps, looks really good.
> For Cyberpunk, FSR 2 changes the performance a lot, particularly on an RX 6800. I can do 4K FSR Performance @ 60 fps with my RX 6800; the only thing I turn down is volumetric clouds & fog to medium, because they crush performance but visually the difference is minimal. Granted, I've actually played this game with every resolution, aspect ratio and settings combination you can think of, so trust me when I say it actually does quite well in Cyberpunk.

I'd buy a RX 6800, but it can't quite handle Cyberpunk at 1440p at 60 FPS:
It looks like it can handle Warhammer III though:
I'm leaning towards either a RX 6800 XT or RTX 3080, but both are still outside what I'm willing to pay.
> There have never been good stock levels for the 6800. OcUK has a £550 placeholder, that's it. Really hard to get these.

Yup, the RX 6800 seems reasonably priced for the performance it offers atm. But I'd rather wait just a little longer.
I sort of agree about waiting, except that used prices do seem to be coming down. It's not easy to get a good deal though, as stock is limited.
If you have the cash to spend, I'd say it's not worth waiting (if it's going to be too frustrating).
> My "stab in the dark" based on rumours is that the 4070 is shaping up to be a card that requires DLSS to perform acceptably, i.e. to reduce memory usage and bandwidth requirements at 1440p and 4K resolutions in modern workloads... I have nothing against DLSS, but it needs to be available on every. single. title. current and past for me to consider buying such a product.

Let's assume the rumoured specs of the RTX 4070 are false. For one thing, 5,888 shaders would be the same number as the RTX 3070 has...
If Nvidia released a version of the top AD104 chip, but without the GDDR6X, would anyone here buy it, if it was priced £100 lower?
With GDDR6X, the total memory bandwidth for AD104 is 504.2 GB/s (assuming a 192-bit memory bus). Less than the RTX 3070 Ti!
With GDDR6 (clocked at 18 Gbps) it would be 432 GB/s, similar to the RX 6750 XT:
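Those bandwidth numbers are just bus width times effective data rate. A quick sketch of the arithmetic (the 192-bit bus and the 21 Gbps GDDR6X / 18 Gbps GDDR6 data rates are the rumoured specs assumed above; the quoted 504.2 GB/s figure implies a fractionally higher effective clock than the round 21 Gbps used here):

```python
# Peak memory bandwidth in GB/s:
# (bus width in bits / 8) bytes per transfer * effective data rate in Gbps.
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

# AD104 on a rumoured 192-bit bus:
print(bandwidth_gb_s(192, 21.0))  # GDDR6X at 21 Gbps -> 504.0 GB/s
print(bandwidth_gb_s(192, 18.0))  # GDDR6 at 18 Gbps -> 432.0 GB/s
```

For comparison, the RTX 3070 Ti's 256-bit bus at 19 Gbps works out to 608 GB/s, which is why a 192-bit AD104 lands below it.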
> Both players seem to be taking turns at shafting consumers, I wonder if someone will try to sue for collusion...

My "stab in the dark" based on rumours is that the 4070 is shaping up to be a card that requires DLSS to perform acceptably, i.e. to reduce memory usage and bandwidth requirements at 1440p and 4K resolutions in modern workloads... I have nothing against DLSS, but it needs to be available on every. single. title. current and past for me to consider buying such a product.
> I've been pretty sceptical about collusion and price fixing over the past 4 months, but the situation is getting suspicious. It will be interesting to see financial reporting on the amount of sales & stock held.

Both players seem to be taking turns at shafting consumers, I wonder if someone will try to sue for collusion...