
AMD VEGA confirmed for 2017 H1

Little extreme on that one mate :D
Low fps still = high input lag; Freesync/Gsync can't change this.

Anything above 60fps is still the sweet spot for PC gaming.

Yeah, it's the latency I notice - FreeSync/G-Sync do a good job of making it feel a lot nicer, but they only marginally improve the feeling of input lag once framerates get low enough to notice it, which I definitely do much below ~56fps.
 
That's a shame. My Fury Pro's still purring like a kitten thankfully. That said, if it did die just before the warranty ended it wouldn't be the end of the world with it being so near to Vega. I was just about to ask if you'd tried Intel's adaptive sync support but noticed you haven't got an iGPU. I think I'll check if my 4790K supports it and have a mess around with it if it does. I tried getting Crysis 3 playable a few years back and 720p with low settings made it barely so; it'll be interesting to see how game-changing Intel's adaptive sync is if my CPU supports it (not at 1440p though :D).

Aye, shame my card died; but as you mentioned it's so close to Vega it's not too bad.

You really should test Intel's sync, it might be decent for all we know. Then again, I think it'll only really shine on Skylake or Kaby Lake as they have far better iGPUs.

Rather odd we haven't seen any publications really test it, either.
 
Little extreme on that one mate :D
Low fps still = high input lag; Freesync/Gsync can't change this.

Anything above 60fps is still the sweet spot for PC gaming.

As much as I love that Freesync/Gsync is available and want it in my next monitor purchase, people speak of it like it is some kind of magic tech. lol.

My understanding is there is zero difference between having V-sync off (apart from tearing, obviously) and having Freesync or G-sync on when it comes to smoothness or performance. 60 is still 60, 100 is still 100, just minus the tears. That is it!

That's true. I'm also not a twitch gamer; I'm more than happy at or around my monitor's 75Hz limit, so for me personally a 1080 Ti would be more than enough. I like to keep north of 60 if I can though.
I played Deus Ex: Mankind Divided at 40-45fps in 4K and it was smooth to me (I know that would not be the case for everyone), and that was on my 1070. Sure, I turned off a few useless settings to get that fps, but if you compared the IQ with those settings on and off, you would not see the difference. As a matter of fact, I found the settings that sucked up my fps were making the image look worse. The biggest one was contact hardening shadows, which at 4K cost a lot of fps to have on and made the game look worse. Yet so-called PC master race gamers do not even tinker to see the difference some settings make to a game and just want max settings regardless.
 
Going to flog my 1070 the minute the AIB Vegas are announced, can't wait to get rid of this card and get back onto Freesync. Don't get me wrong, the 1070 has been really good, but I miss Freesync.
 
I played Deus Ex: Mankind Divided at 40-45fps in 4K and it was smooth to me (I know that would not be the case for everyone), and that was on my 1070. Sure, I turned off a few useless settings to get that fps, but if you compared the IQ with those settings on and off, you would not see the difference. As a matter of fact, I found the settings that sucked up my fps were making the image look worse. The biggest one was contact hardening shadows, which at 4K cost a lot of fps to have on and made the game look worse. Yet so-called PC master race gamers do not even tinker to see the difference some settings make to a game and just want max settings regardless.

I found DX:MD pretty nasty when it was between 45-55fps. However, I turned down a small number of settings to get a fairly smooth 70fps, so it's possible one of the settings was causing it to run nastily (maybe at any framerate) - so maybe it would have run more acceptably at around 40-50fps with that problem setting(s) disabled, who knows.
 
As much as I love that Freesync/Gsync is available and want it in my next monitor purchase, people speak of it like it is some kind of magic tech. lol.

My understanding is there is zero difference between having V-sync off (apart from tearing, obviously) and having Freesync or G-sync on when it comes to smoothness or performance. 60 is still 60, 100 is still 100, just minus the tears. That is it!
I played Deus Ex: Mankind Divided at 40-45fps in 4K and it was smooth to me (I know that would not be the case for everyone), and that was on my 1070. Sure, I turned off a few useless settings to get that fps, but if you compared the IQ with those settings on and off, you would not see the difference. As a matter of fact, I found the settings that sucked up my fps were making the image look worse. The biggest one was contact hardening shadows, which at 4K cost a lot of fps to have on and made the game look worse. Yet so-called PC master race gamers do not even tinker to see the difference some settings make to a game and just want max settings regardless.

It does help smooth out the frame rate, but only to some degree. It will not fix stutters, and it definitely won't fix sudden drops in FPS - something like 60 to 30 you will still feel, for example.
It does however smooth out the little changes and makes the game seem smoother.

One of the biggest things that makes me laugh on the internet is people saying it's like 30fps is 60fps, it's THAT good guys!!!! Just shut up!
The biggest benefit of Freesync/Gsync is, like you said, no more screen tearing while avoiding the input latency and frame latency penalty of Vsync.
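
Just to put rough numbers on that, here's a back-of-the-envelope sketch in Python (nothing measured - the figures are simply 1000/fps and a 60Hz refresh interval):

# Rough illustration: frame time is set by the GPU, not by the sync tech.
# FreeSync/G-Sync remove tearing and the extra wait Vsync can add,
# but they cannot shorten the frame time itself.

def frame_time_ms(fps: float) -> float:
    """Time between frames, in milliseconds."""
    return 1000.0 / fps

for fps in (30, 45, 60, 100):
    ft = frame_time_ms(fps)
    # With Vsync on a 60Hz panel, a missed refresh can add up to one
    # extra ~16.7ms refresh interval; adaptive sync avoids that penalty.
    vsync_worst_case = ft + 1000.0 / 60
    print(f"{fps:>3} fps: frame time {ft:5.1f} ms, worst case with Vsync ~{vsync_worst_case:5.1f} ms")

# 30fps still means ~33ms between frames even with adaptive sync - smoother
# pacing and no tearing, but nowhere near the ~17ms feel of 60fps.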
 
If they feel the need for a 4GB version then it must be an expensive range of cards, I guess. I can't see much harm in trying 4GB, which will be fine most of the time for most games for a year or so. If this caching is not calling up textures for an entire area well before being drawn then it'll end up much slower. However, you can sell the card before 4GB is an issue to most people, meaning the second-hand market prices are fine, and then try Volta, maybe with a conventional approach.

I do wonder if Vega will be slightly pointless for 1080p, but I know FPS games where people want 200fps (minimum) as well. Otherwise the balanced approach of a modern monitor setup first sounds more sensible.

Compared to what? It's a pretty broad statement. What do you mean by expensive? £400 is a lot of money, but is it a lot of money for a GTX 1070?
£400 isn't as much as it used to be, sadly. Unless we see sterling recover this year (Article 50 is this week, so the news is in the price I hope), you should take all prices in dollar terms and then convert at our current exchange rate to be more realistic.
 
Yeah, it's the latency I notice - FreeSync/G-Sync do a good job of making it feel a lot nicer, but they only marginally improve the feeling of input lag once framerates get low enough to notice it, which I definitely do much below ~56fps.

Agree 100%.
While I can game at 30fps after a couple of minutes of adjusting, Freesync and Gsync can only improve this by not adding Vsync input lag on top (which would make 30fps latency even worse), so it can be better than not having this tech; it doesn't, however, make it feel like 60fps like you read people saying. lol
 
I found DX:MD pretty nasty when it was between 45-55fps. However, I turned down a small number of settings to get a fairly smooth 70fps, so it's possible one of the settings was causing it to run nastily (maybe at any framerate) - so maybe it would have run more acceptably at around 40-50fps with that problem setting(s) disabled, who knows.
Either that, or I find 45fps more acceptable in such a game. In some games I like a minimum of 60fps - like driving games; most recently the Forza Horizon 3 demo I tried, I did not enjoy it at a 30fps lock and would not want it under 60fps. All depends on the game for me.
 
Compared to what? It's a pretty broad statement. What do you mean by expensive? £400 is a lot of money, but is it a lot of money for a GTX 1070?

Dunno, just my opinion. HBM2, and reading rumours that AMD will need to charge royalties for manufacturers to use FreeSync 2 due to the extra work AMD has to do to validate panels.
 
What rumours? ^^^^ Put the link here, I bet they are nonsense.

£400 isn't as much as it used to be, sadly. Unless we see sterling recover this year (Article 50 is this week, so the news is in the price I hope), you should take all prices in dollar terms and then convert at our current exchange rate to be more realistic.

I kinda think it is a bit much for a mid-tier card; 4 years ago a mid-tier card was literally half that.

Brexit accounts for 15% or a bit more, not 100%.
 
What rumours? ^^^^ Put the link here, I bet they are nonsense.

I kinda think it is a bit much for a mid-tier card; 4 years ago a mid-tier card was literally half that.

Brexit accounts for 15% or a bit more, not 100%.

I paid so little for top-end cards back in the day, even adjusting for inflation. GPU prices have gotten well out of hand, especially with the lack of direct competition at the high end. :(


http://www.bankofengland.co.uk/education/Pages/resources/inflationtools/calculator/default.aspx

The GTX 470 with adjusted inflation + shipping included is £332.36.
Imagine paying that much for the GTX 1080 Ti, which is what the 470 was relative to the 480.


Or paying £289 for the GTX 580. That's £355.75 for the Titan X equivalent today!

Heck I paid €381 for the 7950GX2 back in 2006. That's €412.01 today, or £356.34.

What about the 8800GTX? It's £530.72 today.
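For anyone who wants to redo those sums, the adjustment is just the launch price scaled by the ratio of the price index between the two years. A minimal sketch (the index values below are placeholders, not real data - use the Bank of England calculator linked above for actual figures):

# Sketch of the inflation adjustment used above.
# NOTE: these index values are illustrative placeholders only.
cpi = {
    2006: 100.0,
    2010: 110.0,
    2017: 125.0,
}

def adjust_for_inflation(price, year_paid, year_now=2017):
    """Scale a historical price by the ratio of the price index."""
    return price * cpi[year_now] / cpi[year_paid]

# e.g. a card bought for 289 GBP in 2010, expressed in 2017 money:
print(round(adjust_for_inflation(289, 2010), 2))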
 
What rumours? ^^^^ Put the link here, I bet they are nonsense.

I kinda think it is a bit much for a mid-tier card; 4 years ago a mid-tier card was literally half that.

Brexit accounts for 15% or a bit more, not 100%.

I got that impression from reading this. I think it goes on somewhere to talk about royalties or increased costs incurred by AMD.
 
I got that impression from reading this. I think it goes on somewhere to talk about royalties or increased costs incurred by AMD.

Well, I read it and it just doesn't say that at all. In fact it says it should be relatively straightforward to add HDR with FreeSync, as there is little or no change to the scalers, and as with FreeSync 1, HDR is very much with AMD on the software side.

The good news for AMD (and developers) is that the actual implementation of FreeSync 2 should be quite simple since most games are already rendering in HDR and tone mapping to at least SDR to begin with. Game developers only need to query for the API, tone map to the specifications AMD provides, and then from there it’s AMD and the monitor’s problem. But counting on developers to do anything extra for PC games is always a risk, one that has hurt initiatives in the past. For their part, AMD will be doing what they can: focus on the upstream engines and developer relations/evangelism. By getting FreeSync 2 support added to major engines like Unreal Engine and Unity, AMD makes it much easier for downstream developers to adopt FreeSync 2. Beyond that, it’s about convincing developers that supporting FreeSync 2 will be worth their while, both in terms of sales and improving the customer experience.

On the flip side of the coin, getting monitor manufacturers on-board should be relatively easy. AMD’s original FreeSync effort was extremely successful here (to the tune of 121 FreeSync monitors), in part because AMD made it such an easy feature to add, and they are aiming for something similar with FreeSync 2. It doesn’t sound like display controllers need to be substantially altered to support FreeSync 2 – they just need to have a tone mapping bypass mode and understand requests to switch modes – which would make it easy for the monitor manufacturers to add support. And for their part, the monitor manufacturers like features like FreeSync because they can be easily implemented as value add features that allow a monitor to be sold for a higher price tag.

FreeSync already has much better vendor adoption than G-Sync, despite the market share differences, the reason being it doesn't cost vendors any royalties and the units are easier to manufacture than G-Sync. AMD have always needed to provide the software, just as they have with FreeSync 1. FreeSync 1 is software; it's their version of the adaptive sync technology. Intel would use their own, different software. There isn't anything different here.
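
As an aside, "tone mapping" in the article excerpt above just means compressing the HDR values the game renders into the range the display can actually show. Here's a generic illustration using the well-known Reinhard operator - this is not FreeSync 2 code or AMD's API, just the concept:

# Generic tone mapping illustration (Reinhard operator) - NOT AMD's FreeSync 2 API.
# An HDR renderer produces linear values that can exceed 1.0; they have to be
# mapped into the displayable range before they reach the panel.

def reinhard_tonemap(hdr_value: float) -> float:
    """Compress an HDR linear value into the [0, 1) displayable range."""
    return hdr_value / (1.0 + hdr_value)

for v in (0.25, 1.0, 4.0, 16.0):
    print(f"HDR {v:5.2f} -> SDR {reinhard_tonemap(v):.3f}")

# The FreeSync 2 idea in the article is that the game does this mapping once,
# to the monitor's own specs, and the monitor's own tone mapping is bypassed.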
 
I paid so little for top-end cards back in the day, even adjusting for inflation. GPU prices have gotten well out of hand, especially with the lack of direct competition at the high end. :(


http://www.bankofengland.co.uk/education/Pages/resources/inflationtools/calculator/default.aspx

The GTX 470 with adjusted inflation + shipping included is £332.36.
Imagine paying that much for the GTX 1080 Ti, which is what the 470 was relative to the 480.


Or paying £289 for the GTX 580. That's £355.75 for the Titan X equivalent today!

Heck I paid €381 for the 7950GX2 back in 2006. That's €412.01 today, or £356.34.

What about the 8800GTX? It's £530.72 today.

I think you need to read this to compare really.

https://forums.overclockers.co.uk/t...-price-history-of-nvidia-gpus.18772654/page-2

Edit: to give an idea, the release price of the 580 was £400. The release price of the 1080 Ti is £690.

Now, adjusting for inflation, that would put the 580 at £492. So we are seeing a high-end price increase of 40% compared to what it was previously, from this one delta you are using. Inflation in the UK does not always translate to the true cost inflation of a particular sector, as it is of course averaged across all sectors.

What would need to be seen is whether the costs of the materials the GPU uses have risen. Even so, we are talking about a total increase of 72%, although you could say the Ti & Titan are outliers, so the 1080 would give a figure of a 55% increase to the consumer compared to previous generations.

Now this gets muddied further in that the US is actually seeing more like a 25% increase to the customer, as the 1080 had the largest UK mark-up of any card sold by Nvidia over the last 17 years (at the top end), which is massive and shows just how overpriced the 1080 was in the UK.

There are so many different reasons why it's hard to really compare, but I would say people who bought the 1080 got stiffed, and actually the 980, 780 & 680 also stiffed people in the UK price-wise. The others, though, were pretty much in line with the US market.

Edit: Also, if you look at the last column, that is the one actually showing that relative prices up to 2007 were low (bar 2000, when really this all kicked off).

Then during 2007 prices increased to about the same level as now when you take inflation into account. After that, what you can see is a dip in high-end GPU costs between 2008-2010. It levels out till 2012 and then starts to rise back to what it was in 2007. This follows the recession hitting, then the recovery of the global market, and now the UK losing ground to the global market over the last 12 months.
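
Putting the quoted figures side by side - this is just the arithmetic behind the 72% and 40% numbers above, using only the prices already quoted:

gtx580_launch = 400    # GBP, 580 release price quoted above
gtx580_adjusted = 492  # GBP, the inflation-adjusted figure quoted above
gtx1080ti = 690        # GBP, 1080 Ti release price quoted above

nominal = (gtx1080ti / gtx580_launch - 1) * 100     # ~72%
adjusted = (gtx1080ti / gtx580_adjusted - 1) * 100  # ~40%

print(f"Nominal increase vs the 580: {nominal:.0f}%")
print(f"Increase after adjusting for inflation: {adjusted:.0f}%")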
 
I said 'mid-tier' cards. 4 or 5 years ago the 7970 / GTX 680 was the top-tier card; I bought the mid-tier Gigabyte WF X3 HD 7870, paid £220 for it, and it was not one of the cheaper ones..... it was actually one of the more expensive ones.

Today the 1070 is the 7870 equivalent and it's £400. I get inflation and Brexit and a little on top just because..... but to double in price IMO is out of line with all that.

 
You guys are doing it all wrong. Not saying the prices have not gone up, but they have not gone up as much as you think they have.

Graphics cards are purchased in dollars. If you want to see whether graphics cards have gone up or down in price, you need to compare them in dollars, not pounds. Had the 1080 Ti been released a decade ago, it would have been between £400-450. Why, you may ask? Back then our currency was strong; we would get around two dollars to a pound. The pound's value has steadily declined since then and hit the gutter with Brexit. Also, back then we paid 17.5% VAT.

Adjusting just for inflation is not enough ;)
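
Roughly what that looks like in numbers - a sketch assuming a $699 US MSRP for the 1080 Ti, roughly $2.00 to the pound a decade ago versus roughly $1.25 now, and 17.5% VAT then versus 20% now (my figures for illustration, not from the post):

# Sketch of the "compare in dollars" argument above (assumed figures).

def uk_price(usd_msrp, usd_per_gbp, vat):
    """Convert a US MSRP to a VAT-inclusive UK price."""
    return usd_msrp / usd_per_gbp * (1 + vat)

then = uk_price(699, usd_per_gbp=2.00, vat=0.175)  # ~411 GBP
now = uk_price(699, usd_per_gbp=1.25, vat=0.20)    # ~671 GBP

print(f"Same dollar price a decade ago: ~{then:.0f} GBP")
print(f"Same dollar price at today's exchange rate: ~{now:.0f} GBP")

Which lines up with the £400-450 then versus the ~£690 launch price mentioned earlier.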
 