
NVIDIA ‘Ampere’ 8nm Graphics Cards

Permabanned
Joined
2 Sep 2017
Posts
10,490
Come on AMD hurry up with your interpretation of raytracing, just so the red team will stop spouting drivel about raytracing and start singing its praises.

Only if the tests are at 2160p, and not 1080p30.

 

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,570
Location
Greater London
Come on AMD hurry up with your interpretation of raytracing, just so the red team will stop spouting drivel about raytracing and start singing its praises.:D
I will sing its praises once I see something that looks night-and-day better than what we have available right now in terms of image quality. Hoping 2020 brings something :D

Crysis games brought about much bigger jumps in image quality than RT has done so far, enough said ;)
 
Soldato
Joined
19 Oct 2008
Posts
5,951
No big deal, they don’t mind being at the forefront of tech and paying for it ;):p
I've been tempted to sell the 2080 I use. I'm hardly gaming atm anyway. I can't remember when I played more than about 15 mins, so the onboard GPU in the CPU will suffice until Ampere if I do.
I agree that the Turing cards will probably be hit pretty hard, probably harder than the 1080 Ti was hit, as that still had decent raw performance when the new cards arrived - minus RT of course. It all depends on the jump NV can make. Needs to be at least 50% better RT performance, ideally 2x.
 

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,570
Location
Greater London
I've been tempted to sell the 2080 I use. I'm hardly gaming atm anyway, I can't remember when I played more than about 15 mins, so the onboard GPU in the CPU will suffice.
I agree that the Turing cards will probably be hit pretty hard, probably harder than the 1080 Ti was hit, as that still had decent raw performance when the new cards arrived - minus RT of course. It all depends on the jump NV can make. Needs to be at least 50% better RT performance, ideally 2x.
If not gaming, sell it for sure. That's why I sold my Titan XP when I cut down on gaming due to nothing much interesting me. I made do with my Vega, which has done a great job while waiting for the 3070 to come out :D

I really do think the new cards will bring a huge boost to RT. Not only will they give us more cores, but I bet you any money there will also be a huge increase in how good each core is.
 
Soldato
Joined
19 Oct 2008
Posts
5,951
If not gaming, sell it for sure. That's why I sold my Titan XP when I cut down on gaming due to nothing much interesting me. I made do with my Vega, which has done a great job while waiting for the 3070 to come out :D

I really do think the new cards will bring a huge boost to RT. Not only will they give us more cores, but I bet you any money there will also be a huge increase in how good each core is.
I made the mistake of starting new games without finishing others. Now I can't get into anything at the moment. It's like the old context switching problem - I'm rubbish at that, in this case getting back into the controls and atmosphere of the game. I'll probably keep it for the moment and maybe try starting one game from the beginning again when I get a few hours spare.
 
Soldato
Joined
21 Jan 2016
Posts
2,915
I will sing its praises once I see something that looks night-and-day better than what we have available right now in terms of image quality. Hoping 2020 brings something :D

Crysis games brought about much bigger jumps in image quality than RT has done so far, enough said ;)

There's a Digital Foundry video on the new CoD Modern Warfare where they show quite a bit of how much better the spot shadows are with RTX.

I wouldn't go so far as to call it night and day, but the RTX-enabled version does have noticeably better shadows that improve the overall look... whether that is worth the price of admission is an entirely different question.

I personally think the performance needs to get to the point where you aren't just getting raytraced X or raytraced Y, but raytraced everything (reflections, global illumination, shadows, etc.) in order to get that "night and day" difference, but the performance just isn't really there yet. Have to start somewhere though!
 
Soldato
Joined
6 Feb 2019
Posts
17,589
True that. As soon as AMD can do it, it will be amazing but whilst they can't, it is pointless rubbish lol

Yep - it was exactly the same with PhysX, tessellation and G-Sync.

I actually recall AMD being first to market with tessellation; however, their GPUs couldn't run it well while Nvidia's handled it much better, and so the hate began.

As for G-Sync, it was a case of 'it's not needed, who cares', until of course AMD came along with Freesync.

I found these comments still up from the first G-Sync reviews:

[attached screenshots of the comments]
 
Soldato
Joined
19 Oct 2008
Posts
5,951
Freesync is OPEN source. G-Sync is CLOSED source.

THAT'S the difference
Which also translates to G-Sync monitors all having consistent G-Sync performance and being of a certain quality. Any manufacturer can knock up something Freesync compatible, but the sync performance probably varies a lot more. That's how I understand it anyway (someone correct me if I'm talking rubbish :) ). I wouldn't want an el cheapo Freesync monitor.
 
Soldato
Joined
19 Nov 2015
Posts
4,867
Location
Glasgow Area
Which also translates to G-Sync monitors all having consistent G-Sync performance and being of a certain quality. Any manufacturer can knock up something Freesync compatible, but the sync performance probably varies a lot more. That's how I understand it anyway (someone correct me if I'm talking rubbish :) ). I wouldn't want an el cheapo Freesync monitor.
Yeah, you are talking rubbish. Freesync and gsync do exactly the same job. The "sync performance" is exactly the same. i.e. ....it works.
 
Soldato
Joined
4 Jan 2009
Posts
2,682
Location
Derby
Yeah, you are talking rubbish. Freesync and gsync do exactly the same job. The "sync performance" is exactly the same. i.e. ....it works.
As long as you're over 40 fps on Freesync it works, but not everyone's GPU can maintain over 40 fps in certain games. The one thing stopping me buying a G-Sync compatible monitor is the 40-144 Freesync range.
A different reason stopping me from buying G-Sync is that they are too expensive compared to the Freesync ones, but I do not want such a limited Freesync range either.
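
To put some rough numbers on the 40-144 range point: below a panel's minimum refresh, adaptive sync only keeps working if the driver can multiply frames back into the range (LFC), which roughly requires the maximum to be at least twice the minimum. A minimal Python sketch of that arithmetic (the function and figures are illustrative, not any vendor's actual driver logic):

```python
def refresh_for_fps(fps, vrr_min=40, vrr_max=144):
    """Return (effective refresh rate, frame multiplier) for a VRR panel.

    Illustrative only: real drivers use more elaborate heuristics.
    """
    if fps >= vrr_min:
        # Inside the native range: refresh simply tracks the frame rate.
        return min(fps, vrr_max), 1

    # Below the range: LFC repeats each frame to land back inside it,
    # which is only practical when vrr_max >= ~2 * vrr_min.
    if vrr_max < 2 * vrr_min:
        return None, None  # no LFC possible; falls back to v-sync or tearing

    multiplier = 2
    while fps * multiplier < vrr_min:
        multiplier += 1
    return fps * multiplier, multiplier


# 35 fps on a 40-144 Hz panel: each frame shown twice, panel runs at 70 Hz.
print(refresh_for_fps(35))                            # (70, 2)
# 18 fps: frames tripled to 54 Hz, still inside the window.
print(refresh_for_fps(18))                            # (54, 3)
# A narrow 48-60 Hz panel cannot do LFC at all.
print(refresh_for_fps(35, vrr_min=48, vrr_max=60))    # (None, None)
```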
 

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,570
Location
Greater London
As long as you're over 40 fps on Freesync it works, but not everyone's GPU can maintain over 40 fps in certain games. The one thing stopping me buying a G-Sync compatible monitor is the 40-144 Freesync range.
A different reason stopping me from buying G-Sync is that they are too expensive compared to the Freesync ones, but I do not want such a limited Freesync range either.
That is why I also went G-Sync on my 4K monitor. Much better sync range. My previous LG Freesync monitor was 40-60 only. Plus Nvidia offered the GPU power I needed at the time. Vega was not even out back then.
 
Soldato
Joined
11 Jan 2016
Posts
2,568
Location
Surrey
Freesync is OPEN source. G-Sync is CLOSED source.

THAT'S the difference
Yep, and the fact that G-Sync works on both G-Sync and Freesync monitors means that in future I will buy only Freesync monitors. At the higher end they will likely meet a similar level of specification for it too.
 
Soldato
Joined
18 May 2010
Posts
22,376
Location
London
For me the biggest difference between Freesync and G-Sync is consistency. You know that any G-Sync monitor will do exactly what G-Sync is defined to do, otherwise it wouldn't get certified.

The same can't be said for FreeSync. You have to check the fine details of each and every monitor, as no two FreeSync monitors support FreeSync in the same way. What is the sync range, does it support LFC, etc...

That's why AMD decided to go down the FreeSync 2 route, which was going to have a certification program.

This was around the time of the Vega 64. I haven't heard anything since on FreeSync 2.

If they standardised FreeSync so that you couldn't validate a monitor as supporting FreeSync unless it supported xyz features, that would clear up the problem and bring back consistency for the consumer.
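
As a rough sketch of what that kind of validation could look like in spirit, here is a short Python example that checks a monitor spec against a made-up rule set; the thresholds and required features are hypothetical, not AMD's actual FreeSync 2 criteria:

```python
# Hypothetical certification check; the rules are illustrative,
# not AMD's real FreeSync 2 / FreeSync Premium criteria.
REQUIRED_FEATURES = {"lfc", "hdr"}

def cert_failures(spec):
    """Return a list of reasons a monitor spec fails the (made-up) cert."""
    failures = []
    vrr_min, vrr_max = spec["vrr_range"]
    if vrr_max < 2 * vrr_min:
        failures.append("range too narrow for LFC (needs max >= 2 * min)")
    missing = REQUIRED_FEATURES - spec.get("features", set())
    if missing:
        failures.append("missing features: " + ", ".join(sorted(missing)))
    return failures

monitor = {"vrr_range": (48, 75), "features": {"hdr"}}
print(cert_failures(monitor))
# ['range too narrow for LFC (needs max >= 2 * min)', 'missing features: lfc']
```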
 
Soldato
Joined
10 Oct 2012
Posts
4,421
Location
Denmark
Which also translates to G-Sync monitors all having consistent G-Sync performance and being of a certain quality. Any manufacturer can knock up something Freesync compatible, but the sync performance probably varies a lot more. That's how I understand it anyway (someone correct me if I'm talking rubbish :) ). I wouldn't want an el cheapo Freesync monitor.

I will 100% agree there are some bad Freesync or adaptive sync panels out there where the added feature was clearly about getting a sticker on the box. That said, the G-Sync badge does not equal a quality monitor, because then Acer should never have been near it with their dreadful selection of VA panels, for example. Now this is of course not a direct fault of G-Sync, but a good monitor is about more than whether it supports VRR and overdrive control through the VRR range. Freesync gets mocked for having no overdrive control within the Freesync range on many (all?) models, but looking at Acer again and their ultrawide 1080p G-Sync panels, what good is the G-Sync badge when the panels themselves are so slow and smeary that the thing G-Sync always gets put on a pedestal for, overdrive control through the VRR range, makes no difference?

To me, G-Sync modules make no sense really, and what we as consumers should push for is an end to the BS marketing of 1 ms panels. IMHO we need one standard, which is adaptive sync, and then we need natively fast panels where overdrive is less needed or not needed at all (wishful thinking). The 1 ms g2g figure should be canned and something more like the response grid (4x4) that TFT Central does should be on the box. This is my opinion of it at the moment, and I'm sure there is room for improvement.

TL;DR: a G-Sync badge does not guarantee a good quality monitor.
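
To illustrate why a single quoted transition misleads, here is a tiny Python sketch that averages a TFT Central style 4x4 grey-to-grey grid; the numbers are made up for illustration, not measurements of any real panel:

```python
# Illustrative grey-to-grey response times in milliseconds, not real
# measurements: rows/columns are starting/ending grey levels (0, 85, 170, 255).
g2g_ms = [
    [0.0,  4.2,  5.1,  6.0],
    [7.5,  0.0,  3.9,  4.8],
    [9.1,  6.4,  0.0,  3.5],
    [11.2, 8.0,  5.7,  0.0],
]

transitions = [t for row in g2g_ms for t in row if t > 0.0]
best = min(transitions)
average = sum(transitions) / len(transitions)
worst = max(transitions)

# A spec sheet quoting only the best transition ("1 ms"-style marketing)
# hides an average several times higher and an even slower worst case.
print(f"best {best:.1f} ms, average {average:.1f} ms, worst {worst:.1f} ms")
```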
 
Soldato
Joined
22 Nov 2018
Posts
2,715
Come on AMD hurry up with your interpretation of raytracing, just so the red team will stop spouting drivel about raytracing and start singing its praises.:D

What? I think everybody agrees that decent real-time raytracing on a GPU would be amazing. It just hasn't happened yet. Respect to Nvidia for attempting it.
Maybe Nvidia will do a better job next year along with AMD and Intel.
 

bru

Soldato
Joined
21 Oct 2002
Posts
7,360
Location
kent
What? I think everybody agrees that decent real-time raytracing on a GPU would be amazing. It just hasn't happened yet. Respect to Nvidia for attempting it.
Maybe Nvidia will do a better job next year along with AMD and Intel.

You must be reading a different forum, because the hate for ray tracing is real, not necessarily because it is in its infancy, but because it is connected to NVIDIA.
 