AMD VEGA confirmed for 2017 H1

Status
Not open for further replies.
Yea, you're right of course. I need a new monitor. And FreeSync panels are much cheaper, so that's an ace up AMD's sleeve.

I'll wait for Vega. :cool:

But on another note, I see cards like the 1070 as 1080/1200p cards now. Yes, seriously: games released in the last few months, and moving forward, will require 1070 levels of performance to achieve 60 fps at 1080/1200p.

Just look at ME Andromeda or Wildlands for example.

1070 = 1080/1200p
1080 = 1440p
1080 Ti = 4K

Honestly though, I think you would still be better off with a mid-level card and a new monitor. Turn a few of the ultra settings down to high to get the faster refresh rate, and you will see a much more impressive, smooth, tear-free game than you will with everything on ultra at 1200p with the fps sitting at 60 or so.

Also, disregard the turd that is Wildlands as any standard of what is needed. That is the worst-optimised game to date in my experience with it, and why I have avoided it. That, and it has nothing to do with what the series once was.

The part that kills performance at ultra you will not see on the screen you are on about. Moving up to a 21:9 29" 1440p IPS panel would be my option for a mid-level card at this time (my personal preference is 21:9, but if you prefer a standard ratio then a 27" 1440p would be awesome too). Honestly, after spending ages trying to work out what I wanted (I am an eye candy ***** so I will be waiting till next year), right now that is where my money would be.

A 1070 or similar with a new monitor of similar spec. For games that need that little bit more grunt, just drop a few options from ultra to high; you'll notice zero difference but see a much nicer picture at more stable frames, which stops the tearing etc.
 
That might be true for some games, but even a GTX 1080 isn't enough for 1440p, let alone a 1080 Ti for 4K. That's where Volta rolls in :)
That depends on the individual, though, I would say. If you need a minimum of 144fps, then likely a 1080 Ti is not enough for 1440p. But if you just want 60, things change. The reason I say this is that he seems to be happy with 60Hz/fps, so what he said may be true for him, just not for others who want a minimum of 100 or 144fps.

Like, I have seen many people here say a 1080 is fine for 1440p ultrawide at 100Hz/fps. Yet when the same is applied to 4K, people laugh and say a 1080 on 4K? Yet I would take 4K 60fps or even less over 100Hz 1440p ultrawide without batting an eyelid. So generalising when saying these things does not work, I think; you need to see what the person's needs are first :)
 
Get a FreeSync / G-Sync screen; even at 35 FPS it's better than 120 FPS without it.

Ah, I wouldn't say it's that extreme! It certainly helps though, especially in games that can dip in FPS in some areas.

Although once you have it, it's hard to go back and not have Adaptive Sync anymore.
 
Yet I would take 4K 60fps or even less over 100Hz 1440p ultrawide without batting an eyelid. So generalising when saying these things does not work, I think; you need to see what the person's needs are first :)

I understand your view, but I have since changed those thoughts myself; after a few years of gaming at 4K 60fps, I would now honestly pick 100Hz 1440p 21:9. I feel it offers a better experience, and the difference in detail is fine for me as long as I'm not too close to the screen.

Although I get the appeal of 4K, I play mostly FPS games these days, so the higher refresh makes the games much better. If I was mainly playing RTS games, however, then 4K 60fps would be killer, as you don't need the extra frames.

It really comes down not just to what the GPU can handle but to the games as well. The best of both worlds would be the ultimate 4K 240Hz OLED 34" 21:9 monitor: for most games you would trade resolution for the best average FPS at the highest refresh, and for an RTS you would set it to suit the lower range of FPS but put all the quality settings up, as a lot of RTS is about the graphical look in my opinion.
 
Ah, I wouldn't say it's that extreme! It certainly helps though, especially in games that can dip in FPS in some areas.

Although once you have it, it's hard to go back and not have Adaptive Sync anymore.

Yeah, with G-Sync I can tolerate drops to around 56fps, except in hardcore multiplayer FPS, whereas without it I need a good 80+fps. 35fps would be pushing it a bit :o
 
First of all, at the minimum we will be getting 8GB of HBM2, which is perfectly fine with me.

Secondly, unless you have an inside source (do share :)), no one knows yet whether High Bandwidth Cache works or not. You seem certain and are writing it off already... It could end up being a feature that benefits businesses more than us gamers, but we do not know yet.
Kaap is massively anti-HBM and HBM2. He's made that illogically clear on numerous occasions.

Whether he likes HBM or not is beside the point; what he said is correct. If it works by moving some of what's stored in the GPU's memory over to system RAM or SSD storage, it's still going to be bottlenecked by the speed of the storage used, regardless of how fast the cache can work. I'm sure there will still be people who buy the 4GB model because of its cheaper cost, but I'm betting they'll have a lesser experience for it, whether through latency when moving data around or through choosing lower settings to avoid moving data off the card; we'll have to see. At the end of the day, the truth is that if someone's happy dropping a few settings, the visual difference is usually minimal, and in the majority of cases only really noticed when comparing screenshots, not mid-game, so it's not really something to lose sleep over. That said, if you are buying a high-end GPU, it's the sort of issue you should not have to face, and you should not have to lower settings when lesser cards don't.
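The bandwidth argument here can be put in rough numbers. This is only an illustrative Python sketch: the bandwidth figures are ballpark assumptions (not real Vega or HBCC specs), and `fetch_time_ms` is a made-up helper, but it shows why an asset that spills out of VRAM is paid for at the speed of the slower tier, no matter how clever the cache controller is.

```python
# Rough, illustrative bandwidth figures in GB/s -- assumptions, not measured specs.
TIER_BANDWIDTH = {
    "hbm2":       400.0,  # on-package VRAM
    "system_ram":  16.0,  # reached over a PCIe 3.0 x16 link
    "nvme_ssd":     3.0,  # typical NVMe sequential read
}

def fetch_time_ms(size_gb, tier):
    """Time to pull an asset to the GPU from a given storage tier."""
    return size_gb / TIER_BANDWIDTH[tier] * 1000.0

# A 1GB texture that no longer fits in a 4GB card's VRAM:
for tier in TIER_BANDWIDTH:
    print(tier, round(fetch_time_ms(1.0, tier), 1), "ms")
```

With these assumed numbers, the same 1GB texture costs about 2.5 ms from HBM2 but over 60 ms from system RAM, which is several whole frames at 60fps; that is the kind of latency penalty the 4GB model would risk whenever data has to move off the card.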
 
Yeah, with G-Sync I can tolerate drops to around 56fps, except in hardcore multiplayer FPS, whereas without it I need a good 80+fps. 35fps would be pushing it a bit :o

WoW is the worst offender for me! The latest raid has some extremely punishing areas; FPS is usually around 90-120 with a GTX 980 Ti.
Then I enter the raid fight and hello, mid-30s. Those drops are jarring even with Adaptive Sync.

It's better if it's a gradual decrease in FPS, though; a massive dip is always noticeable.
 
Yea, you're right of course. I need a new monitor. And FreeSync panels are much cheaper, so that's an ace up AMD's sleeve.

I'll wait for Vega. :cool:
If you do go for a FreeSync panel, remember to make sure it has a wide working range and that it supports LFC. A lot of models do not support LFC, which in my opinion is one of the most important additions. Many older FreeSync monitors don't support it because it was not originally a FreeSync feature; the way it works was taken from how G-Sync deals with frame rate drops, and it was a big part of why G-Sync was originally the superior syncing option.
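For anyone wondering what LFC (Low Framerate Compensation) actually does: when the frame rate falls below the panel's minimum refresh, the driver repeats each frame a whole number of times so the effective refresh lands back inside the supported range. A rough Python sketch of that idea (my own simplification, not AMD's actual driver logic):

```python
def lfc_refresh(fps, range_min, range_max):
    """Pick a panel refresh rate for a given frame rate.

    Inside the adaptive-sync range the panel simply matches the frame rate.
    Below it, LFC shows each frame an integer number of times so the
    effective refresh is back inside the range.
    Returns (refresh_hz, multiplier), or (None, None) if nothing fits.
    """
    if range_min <= fps <= range_max:
        return fps, 1  # refresh tracks the frame rate directly
    if fps < range_min:
        mult = 2
        while fps * mult < range_min:  # find the smallest usable multiple
            mult += 1
        if fps * mult <= range_max:
            return fps * mult, mult  # each frame is displayed `mult` times
    return None, None  # above the range, or the range is too narrow for LFC
```

This is also why a wide working range matters: a 35-75Hz panel can show 30fps as 60Hz (each frame twice), but a 48-75Hz panel can't double anything between roughly 38 and 47fps without overshooting 75Hz, so LFC generally needs the maximum refresh to be around twice the minimum.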

But on another note, I see cards like the 1070 as 1080/1200p cards now.

We already see games where even the 1080 struggles to stay above 60 at 1080p. I know it's often not the GPU, and some games are still being built on engines that struggle with the environments modern games and gamers demand; the first example that comes to mind is Fallout 4. But yeah, a 1070 is a top-end 1080p card in my mind too.

That might be true for some games, but even a GTX 1080 isn't enough for 1440p, let alone a 1080 Ti for 4K. That's where Volta rolls in :)
That's how I'm seeing it too. Volta may have 4K cards, but today's 4K cards aren't really capable of top-shelf gaming at 4K.
 
Ah, I wouldn't say it's that extreme! It certainly helps though, especially in games that can dip in FPS in some areas.

Although once you have it, it's hard to go back and not have Adaptive Sync anymore.
I've only had FreeSync for 2 or 3 months, and with only a Fury driving 3440x1440 it's proved invaluable for a decent experience. Thanks to its 30-75 working range and LFC support, it's been a pleasure to game on even when the fps has been lower than what I normally associate with playable numbers.
I certainly won't ever want to go without it if I can avoid it.
 
Also check the screen vendor's support page for driver updates; you may find that the range has been upgraded. This one, for example, was 48 to 75 Hz and is now 35 to 75 Hz: http://aoc-europe.com/en/products/g2460vq6#support-download

I would recommend everyone who has a FreeSync screen check for driver updates.

That's good news, as a lot of panels haven't got good enough support. LG's monitors are among the worst, which is a shame as they have some great-looking monitors.

This is a good option for checking monitors' FreeSync support, but now you mention it, I wonder how often this is updated, so yeah, it'd be best to check whether a monitor has been updated if this shows the support to be lacklustre: http://www.amd.com/en-us/innovations/software-technologies/technologies-gaming/freesync

A good FreeSync panel and a Vega GPU will provide the best gaming experience you're likely to have over the next few years.
 
8GB of HBM2 is more than enough. It will also rock pretty hard in laptops; I'm also interested to see what additional thermal/power headroom this gives the Vega GPU.

I don't recall having seen my VRAM usage on my 1080 Ti go above 6GB; if it has, then it's been sneaky :)

I have been using VR, 1440p, and 1080p with supersampling.
 
I've only had FreeSync for 2 or 3 months, and with only a Fury driving 3440x1440 it's proved invaluable for a decent experience. Thanks to its 30-75 working range and LFC support, it's been a pleasure to game on even when the fps has been lower than what I normally associate with playable numbers.
I certainly won't ever want to go without it if I can avoid it.

Oh certainly! I had to RMA my Fury X last week as some VRAM died on it. Even using a GTX 980 Ti Hybrid, the experience feels worse overall due to the lack of Adaptive Sync.

I hope the wait for Vega isn't too long.
 
I understand your view, but I have since changed those thoughts myself; after a few years of gaming at 4K 60fps, I would now honestly pick 100Hz 1440p 21:9. I feel it offers a better experience, and the difference in detail is fine for me as long as I'm not too close to the screen.

Although I get the appeal of 4K, I play mostly FPS games these days, so the higher refresh makes the games much better. If I was mainly playing RTS games, however, then 4K 60fps would be killer, as you don't need the extra frames.

It really comes down not just to what the GPU can handle but to the games as well. The best of both worlds would be the ultimate 4K 240Hz OLED 34" 21:9 monitor: for most games you would trade resolution for the best average FPS at the highest refresh, and for an RTS you would set it to suit the lower range of FPS but put all the quality settings up, as a lot of RTS is about the graphical look in my opinion.

Indeed. If I played FPS games online, I would do the same, but I do not, and I am sure not everyone does. But people speak as if everyone needs 100 or 144Hz minimum, which is not true :)


Whether he likes HBM or not is beside the point; what he said is correct. If it works by moving some of what's stored in the GPU's memory over to system RAM or SSD storage, it's still going to be bottlenecked by the speed of the storage used, regardless of how fast the cache can work. I'm sure there will still be people who buy the 4GB model because of its cheaper cost, but I'm betting they'll have a lesser experience for it, whether through latency when moving data around or through choosing lower settings to avoid moving data off the card; we'll have to see. At the end of the day, the truth is that if someone's happy dropping a few settings, the visual difference is usually minimal, and in the majority of cases only really noticed when comparing screenshots, not mid-game, so it's not really something to lose sleep over. That said, if you are buying a high-end GPU, it's the sort of issue you should not have to face, and you should not have to lower settings when lesser cards don't.

I would not write anything off until I see it in action. HBCC may end up being useless for gamers and only good for businesses, but we won't know for sure until benchmarks are out.

That's how I'm seeing it too. Volta may have 4K cards, but today's 4K cards aren't really capable of top-shelf gaming at 4K.

This is true if you need high fps. Otherwise, for someone like me and the many out there for whom PC gaming is not solely about FPS/twitch gaming online, the 1080 Ti is an awesome 4K card IMO :D
 
Oh certainly! I had to RMA my Fury X last week as some VRAM died on it. Even using a GTX 980 Ti Hybrid, the experience feels worse overall due to the lack of Adaptive Sync.

I hope the wait for Vega isn't too long.

That's a shame; my Fury Pro is still purring like a kitten, thankfully. That said, if it did die just before the warranty ended, it wouldn't be the end of the world, being so near to Vega. I was just about to ask if you'd tried Intel's adaptive sync support, but noticed you haven't got an iGPU. I think I'll check whether my 4790K supports it and have a mess around with it if it does. I tried getting Crysis 3 playable a few years back, and 720p with low settings made it barely so; it'll be interesting to see how game-changing Intel's adaptive sync is if my CPU supports it (not at 1440p though :D).
 
That's good news, as a lot of panels haven't got good enough support. LG's monitors are among the worst, which is a shame as they have some great-looking monitors.

This is a good option for checking monitors' FreeSync support, but now you mention it, I wonder how often this is updated, so yeah, it'd be best to check whether a monitor has been updated if this shows the support to be lacklustre: http://www.amd.com/en-us/innovations/software-technologies/technologies-gaming/freesync

A good FreeSync panel with a Vega GPU will give you one of the best gaming experiences you're ever likely to have (unless we get holosuite gaming in our lifetimes).

Because of eBay jokers I'm on a borrowed reference 290X, which has at least given me an opportunity to try out FreeSync, as I have had that AOC screen I linked for a while. I play Star Citizen a lot; it's very much in alpha, and the performance is all over the place and rarely above 40 FPS. It was horrible on the GTX 970, but running it with FreeSync on the 290X... OMG!!!!! It's so smooth and responsive, and the image is crisp during motion. It can't and doesn't cure bad hitching, but it does smooth that out somewhat, and it does cure the micro-stutter, ghosting, and latency you get from low performance.

If it were not for the fact that what I'm looking for in a GPU is more grunt to run a higher res, I would just get a perfectly adequate 8GB RX 480. It's a shame AMD don't have a 1070 equivalent; I'd be all over it for FreeSync.
 
This is true if you need high fps. Otherwise, for someone like me and the many out there for whom PC gaming is not solely about FPS/twitch gaming online, the 1080 Ti is an awesome 4K card IMO :D

That's true, I'm also not a twitch gamer; I'm more than happy at or around my monitor's 75Hz limit, so for me personally a 1080 Ti would be more than enough. I like to keep north of 60 if I can, though.

If it were not for the fact that what I'm looking for in a GPU is more grunt to run a higher res, I would just get a perfectly adequate 8GB RX 480. It's a shame AMD don't have a 1070 equivalent; I'd be all over it for FreeSync.

I'd have probably moved to a Polaris chip last year if it had been in the 1070's performance arena, but it's now a lot closer to Vega, so even if they have something cooking in the refresh pot, I won't bite now.
 