Come on AMD, hurry up with your interpretation of ray tracing, just so the red team will stop spouting drivel about ray tracing and start singing its praises.
Only if the tests are at 2160p, and not 1080p30.
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
Come on AMD, hurry up with your interpretation of ray tracing, just so the red team will stop spouting drivel about ray tracing and start singing its praises.
I will sing its praises once I see something that looks night-and-day better than what we have available right now in terms of image quality. Hoping 2020 brings something.
No big deal, they don't mind being at the forefront of tech and paying for it.
I've been tempted to sell the 2080 I use. I'm hardly gaming at the moment anyway - I can't remember the last time I played for more than about 15 minutes - so the onboard GPU in the CPU will suffice until Ampere, if I do sell.
If you're not gaming, sell it for sure. That's why I sold my Titan XP when I cut down on gaming because nothing much interested me. I've made do with my Vega, which has done a great job while waiting for the 3070 to come out.
I agree that the Turing cards will probably be hit pretty hard, probably harder than the 1080 Ti was, as that still had decent raw performance when the new cards arrived - minus RT, of course. It all depends on the jump NV can make. It needs to be at least 50% better RT performance, ideally 2x.
I made the mistake of starting new games without finishing others, and now I can't get into anything at the moment. It's like the old context-switching problem - I'm rubbish at that, in this case getting back into the controls and atmosphere of a game. I'll probably keep it for the moment and maybe try starting one game from the beginning again when I get a few hours spare.
I really do think the new cards will bring a huge boost to RT. Not only will they give us more cores, but I bet you any money there will also be a huge increase in how capable each core is.
The Crysis games brought about much bigger jumps in image quality than RT has done so far. Enough said.
True that. As soon as AMD can do it, it will be amazing, but whilst they can't, it is pointless rubbish lol
As for G-Sync, it was a case of "it's not needed, who cares" - until, of course, AMD came along with FreeSync.
Freesync is OPEN source. G-Sync is CLOSED source.
THAT'S the difference.
Which also translates to G-Sync monitors all having consistent sync performance and being of a certain quality. Any manufacturer can knock up something FreeSync-compatible, but the sync performance probably varies a lot more. That's how I understand it anyway (someone correct me if I'm talking rubbish). I wouldn't want an el cheapo FreeSync monitor.
Yeah, you are talking rubbish. FreeSync and G-Sync do exactly the same job. The "sync performance" is exactly the same, i.e. it works.
As long as you're over 40 fps, FreeSync works, but not everyone's GPU can maintain over 40 fps in certain games. The one thing stopping me buying a G-Sync-compatible monitor is the 40-144 FreeSync range.
That is why I went G-Sync on my 4K monitor too - a much better sync range. My LG FreeSync monitor before that was 40-60 only. Plus Nvidia offered the GPU power I needed at the time; Vega wasn't even out back then.
A different reason stopping me from buying G-Sync is that the monitors are too expensive compared with the FreeSync ones, but I do not want such a limited FreeSync range either.
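For anyone wondering why the 40 fps floor matters, AMD's Low Framerate Compensation (LFC) can repeat frames to keep adaptive sync working below the monitor's minimum refresh, but it generally needs a wide range (max at least roughly twice the min), which is exactly what a narrow 40-60 panel lacks. A loose sketch of the idea, assuming a 40-144 Hz range - the numbers and the `effective_refresh` function are illustrative, not any vendor's actual implementation:

```python
# Illustrative sketch of Low Framerate Compensation (LFC) on an
# adaptive-sync display. VRR_MIN/VRR_MAX are assumed values for a
# hypothetical 40-144 Hz FreeSync monitor, as discussed above.

VRR_MIN, VRR_MAX = 40, 144  # Hz, the monitor's variable refresh window

def effective_refresh(fps: float) -> float:
    """Return the refresh rate the panel would actually run at."""
    if fps >= VRR_MIN:
        # Inside the window: the panel refreshes once per rendered frame,
        # capped at the panel's maximum refresh rate.
        return min(fps, VRR_MAX)
    # Below the window: show each frame n times so that fps * n lands
    # back inside the 40-144 Hz range. This only works when the range
    # is wide enough (here 144 >= 2 * 40, so doubling 20-39 fps fits).
    n = 2
    while fps * n < VRR_MIN:
        n += 1
    return fps * n

print(effective_refresh(60))   # in range: panel runs at 60 Hz
print(effective_refresh(30))   # doubled: panel runs at 60 Hz
print(effective_refresh(15))   # tripled: panel runs at 45 Hz
```

On a narrow 40-60 panel, doubling a 35 fps signal would need 70 Hz, which the panel can't do - so below 40 fps you just fall back to tearing or v-sync, which is the complaint above.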
Mate, again you are showing your complete lack of knowledge and making yourself look a bit daft in the process. FreeSync and G-Sync are seriously not the same.
Yep, and the fact that G-Sync now works on both G-Sync and FreeSync monitors means in future I will only buy FreeSync monitors. At the higher end they will likely meet a similar level of specification for it too.
I actually recall AMD being first to market with tessellation; however, their GPUs couldn't run it well while Nvidia's did much better, and so the hate began.
Come on AMD, hurry up with your interpretation of ray tracing, just so the red team will stop spouting drivel about ray tracing and start singing its praises.
What? I think everybody agrees that decent real-time raytracing on a GPU would be amazing. It just hasn't happened yet. Respect to Nvidia for attempting it.
Maybe Nvidia will do a better job next year along with AMD and Intel.