*** AMD ThreadRipper ***

Why the 120 FPS limit?

There's really no difference between 144 FPS and 120 FPS. By that I mean I cannot tell the difference, and I have experienced many games and monitors over the years, including decent CRT monitors.

In limiting to 120 FPS:
  • Temperatures are lower
  • The graphics card is quieter
  • If you have a G-Sync (or FreeSync) monitor, you should (IMO) at least limit to 143 FPS so the frame rate never hits 144 and momentarily disables the sync, which causes a bit of lag and tearing.
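If it helps to make that concrete, a frame cap is really just a per-frame time budget enforced in the render loop. A minimal sketch of the idea (the render_frame callback and the numbers are mine, purely for illustration; in practice a driver- or tool-level limiter such as Radeon Chill or RTSS does this more precisely):

```python
import time

TARGET_FPS = 120                  # or 143 on a 144 Hz G-Sync/FreeSync panel
FRAME_BUDGET = 1.0 / TARGET_FPS   # seconds allowed per frame

def run_capped(render_frame, num_frames=600):
    """Render frames, sleeping off whatever is left of each frame's budget.

    The sleep is what lowers temperatures and fan noise: the GPU idles
    instead of racing ahead to 144+ FPS.
    """
    for _ in range(num_frames):
        start = time.perf_counter()
        render_frame()                          # hypothetical per-frame work
        spare = FRAME_BUDGET - (time.perf_counter() - start)
        if spare > 0:
            time.sleep(spare)                   # enforce the cap
```

Real limiters busy-wait the last fraction of a millisecond rather than trusting time.sleep, but the principle is the same.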
 
Fair enough. Have you tried Radeon Chill (or nVidia's equivalent if they have one)?

As for avoiding going over 143 FPS, FastSync, Enhanced Sync, or even VSync would solve that (although Enhanced Sync is only available in the very dodgy 17.7.2 driver right now). I am rarely at risk of going over anyway, except with really old games, so I don't need to worry.
 

Unfortunately it doesn't help; it can make it worse in some cases when it flicks between sync technologies. The idea is just to stick with G-Sync and let it do what it does, and it's all smooth.

I think this video would explain what I mean better.

 
Not realistic, because a typical user doesn't sit there encrypting files and running Blender? Really.

Passmark is more realistic because it's closer to real-world usage. Of course it's not popular, because it doesn't show a great picture for high-core-count processors, so it doesn't get used by reviewers due to the damage to marketing. :)

I'm.... are you serious?

You can't make videos with Passmark, you can't make games with Passmark, you can't develop any software with Passmark. Of course Blender, Adobe Premiere and all that stuff is more relevant to these CPUs, because that's what the world uses to do the aforementioned.

You and many others have a very misconstrued perception of the market you're talking about. Gaming is very much a focal point. Hell, if it's not, then motherboard vendors have things very wrong. Simply because AMD releases a CPU with 16 cores doesn't change the demand for these things. It's posturing to suit the argument or justification.

Believe me, gamers are interested in TR as much as anyone else. What users do you think get samples outside of media? Content creators, mostly. Guess what content? Yep, that's right, gaming ;).

You want to use it as a workstation? That's more than OK - it's excellent for that. Just don't become confused as to what sells. It's certainly not Blender.


People wanting gaming CPUs are looking at the 7700K or the Ryzen 1600/1700. People looking at X399 and X299 'may' also play games, but it's not their primary concern; they are not buying £1000 CPUs for CS:GO or PlayerUnknown's Battlegrounds...
Having said that, yes, both X299 and X399 also do gaming very well.



Right, from the video... the best way is to enable FreeSync with no V-Sync and cap the frame rate 1 or 2 FPS below the monitor's refresh rate.

That way you get the benefits of FreeSync at the same latency as having V-Sync off.

Having said all that, AMD now has their equivalent to Fast Sync. On the nVidia side I think you run that in conjunction with G-Sync, so when the frame rate goes above the monitor's range Fast Sync takes over.

AMD's Fast Sync equivalent (what's it called?) may work in the same way.
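To put numbers on that "one or two FPS below the refresh rate" rule, here's a trivial helper (my own naming, not any driver setting):

```python
def vrr_frame_cap(refresh_hz: float, margin_fps: float = 2.0) -> float:
    """FPS cap just under the panel's refresh rate, so FreeSync/G-Sync stays
    engaged and never hands off to V-Sync (or to tearing) at the top end."""
    return refresh_hz - margin_fps

for hz in (75, 120, 144):
    cap = vrr_frame_cap(hz)
    print(f"{hz} Hz panel -> cap at {cap:.0f} FPS ({1000 / cap:.2f} ms per frame)")
```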


Indeed.
 
It does. AMD's recommendation used to be to use FreeSync and V-Sync together (although capping FPS was probably a better solution for anyone who was sensitive to the input lag V-Sync introduces). Their latest slides recommend FreeSync + Enhanced Sync. I am not sure what advantages or disadvantages it has compared to FPS capping though, plus we're getting rather off-topic. ;)
 


It is getting off-topic, so this will be my last word on this.

I have a GTX 1070. I don't have a G-Sync screen (I have a FreeSync screen :O), a 75 Hz one, and I use Fast Sync globally. For whatever reason it seems to be capped at 120 FPS with Fast Sync on, but that's way higher than my screen's 75 Hz rate. Without Fast Sync, screen tearing is horrendous; Fast Sync makes that go away, yes, right across the range... well, at low FPS it can creep back in slightly, but it's hardly there, and the GTX 1070 doesn't tend to let anything fall below 60 FPS. I have to be running closer to 4K than 1440p in something like BF1 to see it fall below 60.

It's not a substitute for G-Sync / FreeSync, but it does eliminate screen tearing above the screen's refresh rate, and it even works well enough below it. AMD's Enhanced Sync is the same thing; run it with FreeSync, that way you can run above your screen's refresh rate.
 
It doesn't do anything below the screen's refresh rate at all. It's not a replacement or cheap version of FreeSync or G-Sync, it's designed as a superior version of V-Sync and works in tandem with FreeSync or G-Sync.
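If the distinction is easier to see in code, here's a toy model of the "newest completed frame wins" behaviour behind Fast Sync / Enhanced Sync (my own simplification, not vendor documentation): above the refresh rate the extra frames are simply discarded, and below it the previous frame just repeats, which is why you still want FreeSync or G-Sync to cover that end.

```python
def displayed_frames(frames_per_refresh: list[list[int]]) -> list[int]:
    """Toy model of Fast Sync / Enhanced Sync.

    frames_per_refresh[i] holds the frame numbers that finished rendering
    during refresh interval i. At each refresh the display shows the newest
    completed frame; older ones are dropped (no tearing), and if nothing new
    finished, the previous frame is shown again (no help below refresh rate).
    """
    shown, last = [], None
    for finished in frames_per_refresh:
        if finished:
            last = finished[-1]     # newest completed frame wins, the rest are discarded
        shown.append(last)          # repeat the previous frame if nothing new arrived
    return shown

# ~200 FPS on a 75 Hz panel: two or three frames finish per refresh, only the newest is shown
print(displayed_frames([[1, 2, 3], [4, 5], [6, 7, 8]]))   # -> [3, 5, 8]
# ~40 FPS on the same panel: some refreshes get nothing new, so the old frame repeats
print(displayed_frames([[1], [], [2], []]))               # -> [1, 1, 2, 2]
```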
 
I don't get why this TR discussion keeps coming back to gaming benchmarks... it's not a gaming CPU.

Of course I will be gaming on mine, but it's primarily for virtual machines, Docker containers, compiling code, and I'm increasingly getting into video editing. TR is a capable gaming CPU, but if gaming is mostly what you're doing you're better off with a 1600X or 1700.
 
I'm looking for a Threadripper board with 6/7 PCI-E slots, anyone seen one? A bit like the X99-E WS range from Asus for X99. Maybe it's a bit too close to release, but I want to go with TR rather than Epyc presently.
 
Just to let everyone know, I won't go on about mainstream usage any more in this thread; it's a valid point :) I will stick to just the normal Ryzen discussion for that.
 
Decided to say sod it and will buy a Threadripper system. I need the extra cores for video editing and stuff but it should be OK for gaming as well if I get a decent GPU. Will probably order some time this week.
 
I don't get this reluctance to talk about gaming. It's a forum!! Do people who do productivity work feel threatened or some such because people might primarily game on them? I bet there are more than you think. We know the productivity scores, so should we just stop all discussion?

AMD went to the trouble of giving TR a game mode, and a lot of the motherboards are gamey. AMD and the mobo manufacturers clearly expect a lot of gaming to go on. AMD even did a video with a gaming system manufacturer about a TR gaming system.

I'm very interested in people's gaming experiences, so bring it on!
 