
Poll: ** The AMD VEGA Thread **

On or off the hype train?

  • (off) Train has derailed

    Votes: 207 (39.2%)
  • (on) Overcrowding, standing room only

    Votes: 100 (18.9%)
  • (never ever got on) Chinese escalator

    Votes: 221 (41.9%)

  • Total voters: 528
I imagine once AMD becomes competitive in the high-end market once again, nVidia will decide to magically support Freesync...


Only if nvidia think they are losing sales because of it. And the first sign of that happening will be Gsync monitors getting priced the same as, or only slightly more than, Freesync. Gsync has a price premium simply because people are willing to pay that premium.
 
You can, but things are pointing to Vega 2.0 being HPC-only with full FP64 support, so it's probably not coming for gamers. After little Vega, the next gaming card from AMD will probably be Navi. Hope I didn't derail the hype train :D
 
I wanted to get the Hype Train back on track after its disastrous ending

AMD Working On 14nm+ Vega 2.0 To Compete With NVIDIA’s Volta In 2018

http://wccftech.com/amd-working-14nm-vega-2-0-compete-nvidias-volta-2018/

Shall I make a new thread named VEGA 2.0 HYPE TRAIN? :D:D:D


It will most probably be as fast as a 1080 Ti but need 700W of power lol
Not interested in any more Vega personally. Even such a GPU would only compete with cruddy old Pascal. We need Navi, which hopefully will be new tech that actually translates into performance for AMD.
 
I wanted to get the Hype Train back on track after its disastrous ending

AMD Working On 14nm+ Vega 2.0 To Compete With NVIDIA’s Volta In 2018

http://wccftech.com/amd-working-14nm-vega-2-0-compete-nvidias-volta-2018/

Shall I make a new thread named VEGA 2.0 HYPE TRAIN? :D:D:D


It will most probably be as fast as a 1080 Ti but need 700W of power lol
If it's using 14nm+ it will be more efficient by a noticeable margin. Look at 480 refreshes like the XFX, which dropped noticeable power, and that didn't even go onto an improved node.
I just don't think they improved Vega as much as they had initially hoped to. They touted all these improvements, but I think much of what they hoped to do never came to light. So perhaps there are still improvements to be had on Vega, as well as an improved node, to allow them to do a Vega refresh. The extra time will also help with driver improvements.
 
Only if nvidia think they are losing sales because of it. And the first sign of that happening will be Gsync monitors getting priced the same as, or only slightly more than, Freesync. Gsync has a price premium simply because people are willing to pay that premium.

The main reason Gsync monitors are more expensive than Adaptive Sync monitors is that Nvidia charges a license fee for the Gsync module. That said, even if Nvidia dropped the license fee, a Gsync monitor would still cost more, as the design of the monitor costs more. With Adaptive Sync you can design a monitor and use that same build for a range of monitors, whereas the monitor you build for the Gsync module will only be for that monitor. It can't be used for non-Gsync monitors, which pushes manufacturing costs up.
 
Only if nvidia think they are losing sales because of it. And the first sign of that happening will be Gsync monitors getting priced the same as, or only slightly more than, Freesync. Gsync has a price premium simply because people are willing to pay that premium.
You are quite right, because even though I have a Freesync monitor, nVidia haven't lost a sale from me yet. I wonder how many others are in the same boat as me.
So will Vega 2 be just like Polaris, and get a crappy 100MHz clock boost and try to charge 50 more for it?
I don't mind, it's like the 1700 vs 1800X: just overclock it and you'll be golden!
The main reason Gsync monitors are more expensive than Adaptive Sync monitors is that Nvidia charges a license fee for the Gsync module. That said, even if Nvidia dropped the license fee, a Gsync monitor would still cost more, as the design of the monitor costs more. With Adaptive Sync you can design a monitor and use that same build for a range of monitors, whereas the monitor you build for the Gsync module will only be for that monitor. It can't be used for non-Gsync monitors, which pushes manufacturing costs up.
I understand why G-Sync is slightly better, because all the scalers are the same, unlike Freesync, but charging the licensing fee is stupid and killed it.
Basically what we already knew.

Vega 64 vs 1080
Vega 56 vs 1070

Problem is their pricing. Vega 56's performance could have been had 15 months ago for the same price. Basically AMD came in way late and did nothing to improve price for performance. Hence why they are going on about it being cheaper due to Freesync etc.

Think I made up my mind at this point. Skipping Vega for Volta :(
You have an HD 7970 atm; you'd be better off just upgrading to a 1080 or Vega. At the end of the day that would be at least a 2.5x improvement for little outlay.
 
How many of you game on cards which are the wrong vendor for your monitor?

I'm imagining a fast refresh rate alone does a lot for removing tearing?

On my 75Hz FreeSync monitor there is hardly any tearing compared to when I set it to 60Hz, but if I stream on Twitch I have to set it to 60Hz, else I get crazy stuttering in OBS which creates a stuttering stream. Must be because my second and third monitors are 60Hz non-FreeSync displays.

At 60Hz/720p60 the stream looks really smooth though :)

Only way I can stop stuttering at 75Hz is putting OBS under the game on the 75Hz monitor, but that's no good is it, I need to see OBS.

Oh well, stuck at 60Hz on a 75Hz monitor for now, even though the games I play still run nice n smooth lol
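A rough way to see why the mismatch stutters: 75 isn't a multiple of 60, so a fixed 60fps capture can't sample the display evenly. A minimal Python sketch, under the simplifying (hypothetical) assumption that capture just grabs whatever frame is on screen at each 1/60s tick, not how OBS is actually implemented:

```python
# Hypothetical model: capture grabs whichever display frame is visible
# at each 1/60 s capture tick (a simplification of real display capture).
display_hz = 75
capture_fps = 60

# Display frame index visible at each capture tick over one second.
sampled = [tick * display_hz // capture_fps for tick in range(capture_fps + 1)]

# Motion step between consecutive captured frames. A steady cadence
# would be all 1s; here the pattern is 1,1,1,2 repeating, so every
# fourth captured frame skips ahead -- that uneven cadence is the
# stutter, and 15 of the 75 display frames are never captured at all.
steps = [b - a for a, b in zip(sampled, sampled[1:])]
print(steps[:12])  # [1, 1, 1, 2, 1, 1, 1, 2, 1, 1, 1, 2]
```

At 60Hz display + 60fps capture every step is 1, which would be why the stream looks smooth after dropping the monitor to 60Hz.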
 
The main reason Gsync monitors are more expensive than Adaptive Sync monitors is that Nvidia charges a license fee for the Gsync module. That said, even if Nvidia dropped the license fee, a Gsync monitor would still cost more, as the design of the monitor costs more. With Adaptive Sync you can design a monitor and use that same build for a range of monitors, whereas the monitor you build for the Gsync module will only be for that monitor. It can't be used for non-Gsync monitors, which pushes manufacturing costs up.


That is wrong. The license fee and module are likely around $30 in cost, and if Nvidia felt threatened they could give the module away for free.
You are also wrong about the monitor costs themselves. If a manufacturer wants to design and build a monitor for Gsync, there is no real additional design work required beyond a Freesync monitor. What Freesync can do is take an old monitor design that has VRR and add support via firmware (which has a cost, of course). With Gsync it depends on the design of the existing monitor: if there is already space for the module then it can simply be inserted. But for a new monitor there is no extra design cost.

The cost difference between Gsync and Freesync is almost entirely down to the price the market will bear. Gsync buyers don't mind paying more money for a Gsync monitor.
Freesync monitors also cost more than non-sync monitors; it's just that they have to sell them at a lower margin to be competitive.
 
I thought Gsync also had a larger control module in the monitor to implement it. Partly that gives a wider range, but that's part of the bigger cost too.

6). HBCC is designed for HPC and not gaming. It can only really help in gaming when the GPU runs out of VRAM, which doesn't typically happen. AND then see your point 1.
My impression was HBCC had potential, in theory at least (till we see reviews), to raise minimum frame rates even while the 8GB of VRAM is not full, or is that never the case?
So in an ideal case benchmarks might be something like (see the sketch below for the idea):
1080 Ti: 133 avg FPS & 63 min FPS
Vega: 100 avg FPS & 70 min FPS

Then that would come into the Freesync vs Gsync argument again: does Vega appear to make a game look smoother despite being a slower card? Was the Pepsi challenge vs the 1080 or the 1080 Ti? Because it was punching above its weight if it was the Ti, and I presume they had a point to make.
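To illustrate that average-vs-minimum distinction, a quick Python sketch with invented frame times (hypothetical numbers, not real benchmark data) showing how the card with the lower average FPS can still have the higher minimum:

```python
# Hypothetical frame times (ms) for one second of gameplay -- invented
# numbers purely to show the average-vs-minimum FPS relationship.
card_a = [6.0] * 90 + [16.0] * 10   # fast on average, occasional spikes
card_b = [9.5] * 100                # slower overall but consistent

def avg_fps(frame_times_ms):
    # Total frames divided by total elapsed time.
    return 1000 * len(frame_times_ms) / sum(frame_times_ms)

def min_fps(frame_times_ms):
    # FPS implied by the single worst frame.
    return 1000 / max(frame_times_ms)

for name, ft in (("card A", card_a), ("card B", card_b)):
    print(f"{name}: {avg_fps(ft):.0f} avg FPS, {min_fps(ft):.0f} min FPS")
# card A: 143 avg FPS, 62 min FPS
# card B: 105 avg FPS, 105 min FPS
```

The smoothness argument is that the worst frames set how stuttery a game feels, which is where a higher minimum could matter more than a higher average.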

It’s very important to point out that all the leaks and rumors we’ve seen surrounding Vega 20 up to this point pegged it as a 7nm chip.
So Vega 2 is not significantly more developed; maybe the memory is faster. Is nobody waiting for a Vega Nano?

You have an HD 7970 atm; you'd be better off just upgrading to a 1080 or Vega. At the end of the day that would be at least a 2.5x improvement for little outlay.
I used to have a 7970 and I never wanted to bother with less than Vega, because I wanted a proper upgrade that lasted, not a swap that barely matters and means I want to buy another card the year after.
Is the Vega 56 going to be enough for 1080p till beyond Navi? I don't want to be stuck in an 'H1'-type waiting game ever again tbh lol
 
They shouldn't even bother releasing a Vega 2.0 for consumers; unless they can get 1080 Ti OC'd perf @ 500 USD they can forget it

I agree with TNA, don't care much for Vega anymore, it's a dud.
 
Rapid Packed Math fp16 to be used in FM Serra, Wolfenstein 2 and Far Cry 5

http://www.guru3d.com/news_story/ra...d_in_fm_serrawolfenstein_2_and_far_cry_5.html

Rapid Packed Math basically runs floating-point calculations at half precision, packing two of them where one full-precision operation would go, resulting in a faster turnaround for that request/data, however with less precision and thus lower quality in some form. Considering that lowering image quality in specific segments will pretty much always create more FPS, we'll just have to wait and see how the feature evolves. Basically, half precision would be applied in segments where full precision really isn't needed. AMD has not revealed anything specific as to what and where exactly the feature can be used.
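A minimal NumPy sketch of the packing and precision trade-off behind the feature (an illustration of the general FP16 idea, not AMD's actual implementation):

```python
import numpy as np

x = np.float32(3.14159265)

# Two FP16 values fit in the same 32 bits as one FP32 value, which is
# where the doubled throughput of "packed" math comes from on hardware
# that supports it.
print(2 * np.float16(x).nbytes == x.nbytes)   # True

# The cost: FP16 keeps only about 3 decimal digits, so it suits work
# where the loss is invisible rather than everything.
print(np.float32(np.float16(x)))   # 3.140625, vs 3.1415927 in FP32
print(np.finfo(np.float16).eps)    # ~0.000977
print(np.finfo(np.float32).eps)    # ~1.19e-07
```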

 