so instead of waiting.... just make my own? what is your suggestion lol. if we have to wait we have to wait
I think he is suggesting buying the 20 series, but I'm not doing that without at least knowing the RRP and price/perf of the 30 series.
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
Do it. For people who say 4K isn't there yet, I have a simple argument:
There are people who play at 1080p and 1440p on medium/high settings to get 60fps or more, and that counts as "there", so why can't the same be said for mixed high-to-ultra settings at 4K?
There seems to be this idea that if you are gaming at 4K without everything on full then somehow it's not ready... Makes no sense.
I play all my games at 4K and never see below 60fps on my 2080ti.
Hell, I even play some games at 5K or 8K maxed, or thereabouts; it depends on the game.
if you haven’t pulled the trigger on a 2080ti by now, you’ve waited too long.
IMO 4K is not here yet (some will no doubt get defensive); either play at 1440p or lower some settings one notch.
I will get defensive!!
This whole Samsung 10nm rumor is obvious BS. The 10nm process just wasn't suitable for large GPUs; if it was, Turing would have been fabbed on 10nm. Besides that, 7nm EUV is a mature process with high yields. Moreover, the EUV variant is much cheaper and easier to design and fab than the regular 7nm DUV, and it also acts as a stepping stone to 5nm EUV, whereas 7nm DUV is a bit of a dead end (although there might be a 6nm from TSMC, which is DUV).
As for capacity, after the summer Apple will move to 5nm and TSMC will be desperate for 7nm EUV customers, and Nvidia is one of their primary ones.
He only follows Nvidia news.
I take it you missed the stuff about AMD doubling their 7nm wafers at TSMC, essentially buying up all of Apple's capacity when they make the shift to 5nm.
I use no AA in many games. Better image quality as a result and saves on grunt. Win win. I take it all those who have to have every setting maxed out are using 8x AA or higher. Or is it a case of "nah, I don't do that", even though it shoots my max-settings argument out of the window?
Except that's not what the claim was either.
The claim was no performance increase at the same price point. The 2070 came out at the same price as a 1080 but was 5-10% faster, and had RTX as well. That's a performance boost at the same price point. Similarly, the 2080 vs the 1080Ti: same launch price, a small performance increase, plus RTX.
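To put rough numbers on that price/perf point, here is a minimal sketch in Python. The "same launch price" and the 5-10% uplift come from the post above; the dollar figure and the normalised performance values are purely illustrative assumptions, not real RRPs.

    # Illustrative only: how perf-per-dollar moves when a new card launches
    # at the same price as the old one but is ~5-10% faster.
    old_price = new_price = 500.0   # assumed placeholder price, not the real RRP
    old_perf = 100.0                # old card's performance, normalised to 100

    for uplift in (0.05, 0.10):     # the 5-10% range quoted above
        new_perf = old_perf * (1 + uplift)
        ratio = (new_perf / new_price) / (old_perf / old_price)
        print(f"{uplift:.0%} faster at the same price -> {ratio:.2f}x perf per dollar")

    # With an unchanged price, perf-per-dollar improves by exactly the perf uplift,
    # i.e. 1.05x to 1.10x here, before counting RTX as an extra.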
Lol. 16xSSAA in Arma 3 really is pretty! I do think there's little point in upping the resolution if you can't at least run the same image-quality settings, but clearly some think otherwise. Having said that, there are few games I can really max at 4K with 1080ti SLI, but I always try, always want to, and I'm always a bit peeved when I have to make that compromise.
The higher the resolution you go, the less AA you need. I haven't been able to tell any difference between 8x and 16x at 4K, and not using any AA is often visually acceptable.
Upping resolution itself is an increase in visual quality, or there would be no point.
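A rough sketch of why that holds, in Python: at the same panel size, 4K pixels are much smaller than 1080p pixels, so the stair-steps AA is meant to hide shrink with them. The 27-inch panel size is just an assumed example, not something from the thread.

    import math

    # Pixel density for an assumed 27-inch 16:9 panel at two resolutions.
    def pixels_per_inch(width_px, height_px, diagonal_in):
        diagonal_px = math.hypot(width_px, height_px)
        return diagonal_px / diagonal_in

    for name, w, h in (("1080p", 1920, 1080), ("4K", 3840, 2160)):
        ppi = pixels_per_inch(w, h, 27)
        print(f"{name}: {ppi:.0f} PPI, pixel pitch {25.4 / ppi:.3f} mm")

    # 4K doubles the PPI on the same panel, so each aliasing step is half the size
    # on screen (and rendering at 4K for a 1080p display would amount to 2x2 supersampling).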
Yep, AA is the first thing I switch off at 4K. It's just not worth using, as visually I can't tell the difference with it on at that res, at least not in the games that I play.
I only use the lowest possible AA, if I use any at all, just to take the edge off. Lol, puntastic.
Edit: also, I can totally relate to those videos, but it's not the full story. I have found some games have that shimmering effect and others don't. In RDR 2, if you turn off TAA it's instantly noticeable, but in other games it's not there.
I know, re: resolution, but I'm on a 40-inch 4K screen and wear glasses, and it's still easy for me to tell in a lot of games. Sure, at 4K on a 24-inch or even a 27-inch monitor I might also struggle, but I had a Samsung 4K 27-inch and my response was meh.