
Poll: ** The AMD VEGA Thread **

On or off the hype train?

  • (off) Train has derailed

    Votes: 207 39.2%
  • (on) Overcrowding, standing room only

    Votes: 100 18.9%
  • (never ever got on) Chinese escalator

    Votes: 221 41.9%

  • Total voters
    528
And you are still wrong. The license is separate from the module.

D.P. Wrong again. Enough with the Damage patrol. Manufacturers build their product, apply their margins, then sell the product. You can easily see the difference between the same panels using Adaptive Sync and G-Sync. G-Sync monitors are more expensive to make entirely because of the G-Sync module and the costs associated with it.

And your thinking that the price being charged is entirely down to what the customer is willing to pay is complete rubbish. Think how many more people would buy G-Sync monitors if they were the same price as Adaptive Sync ones. Look around these forums at the number of people who refuse to buy a G-Sync monitor because of the price. If there were no added costs and G-Sync is so good, why aren't there any G-Sync monitors in the £200-£250 price range? Because the costs of making a G-Sync monitor are too high.


What complete rubbish, enough with this trolling. The license fee for G-Sync is going to be in the order of a few dollars, and it is included in the price of purchasing the module. The module cost includes the license fee.

The margins on a product are entirely dependent on the price they can sell the product at. With G-Sync they can sell at a higher profit margin, which is why the monitors are more expensive. That is a basic fact, true for any product.
Manufacturers will set a price for a product that maximises profits, not sales volume. If they sold G-Sync monitors cheaper, they might not sell enough additional volume to offset the lost margin.
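To illustrate that point with a toy example (made-up numbers, nothing to do with actual monitor costs): with a simple linear demand curve, the profit-maximising price lands well above both the unit cost and the price that would maximise sales volume.

```python
# Toy numbers only, not real monitor data: a linear demand curve
# showing the profit-maximising price is not the volume-maximising one.
def profit(price, unit_cost=300, base_demand=10_000, slope=15):
    units = max(0, base_demand - slope * price)  # fewer buyers as price rises
    return (price - unit_cost) * units

best_price = max(range(300, 660), key=profit)
print(best_price, profit(best_price))  # peak price sits well above unit cost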



There are no dirt-cheap G-Sync monitors simply because Nvidia requires a G-Sync certified monitor to offer a specified minimum VRR operating range. AMD doesn't with FreeSync, so LG can pop out any old rubbish and claim it as FreeSync. Compare monitors that actually offer a comparable working range and quality of screen and you won't see many cheap FreeSync screens about.
 
And your thinking that the price being charged is entirely down to what the customer is willing to pay is complete rubbish. Think how many more people would buy G-Sync monitors if they were the same price as Adaptive Sync ones. Look around these forums at the number of people who refuse to buy a G-Sync monitor because of the price.

That doesn't mean there aren't enough people willing to pay to hold the price high, even if a good number are holding off because of it. Look in the monitors section: there are still a ton of people buying the S2716DG despite the price.
 
That's not what he said. He said that with FreeSync you can take the same panel and house it in multiple designs without issue, whereas as monitor designs change, G-Sync panels have to be designed around the G-Sync module. FreeSync is almost one-size-fits-all, while G-Sync monitors are one design at a time, which costs money. There is plenty more in there about why G-Sync is more costly, and it's not just the design.




Anyhow, it seems that for LG at least, G-Sync is pretty risky for making your money back and doesn't leave enough room for other tech they want to include.



As I said before, if the monitor has something like a slim chassis then you may not be able to add a G-Sync module. If you design the monitor from scratch, you just include an area for the module. That same chassis can then be used for FreeSync or non-sync monitors without an issue.



EDIT: and I will end this circular debate by saying my original argument still stands regardless of how much added cost you think G-Sync has. My point is that Nvidia will only drop G-Sync and support FreeSync when they think they are losing significant profits and would be more profitable long-term by supporting FreeSync. Leading up to that point you will see G-Sync monitors disappearing and being sold off on the cheap.
 
It is looking like the miners will be after them all. Prices should be less than the current inflated 580 prices, and with the Bitcoin split, Ether is going back up!
 
I wanted to get the Hype Train back on track after its disastrous ending

AMD Working On 14nm+ Vega 2.0 To Compete With NVIDIA’s Volta In 2018

http://wccftech.com/amd-working-14nm-vega-2-0-compete-nvidias-volta-2018/

Shall I make a new thread named VEGA 2.0 HYPE TRAIN? :D:D:D


It will most probably be as fast as a 1080 Ti but need 700W of power lol

I thought Navi was 2018?
While I appreciate some people have FreeSync and others just refuse to buy Nvidia out of personal choice (which is fine, everyone is entitled to do whatever they like), I do think giving us what seems to be the same performance the competition has had for a year or so, at similar pricing, was a bad choice. I expected Vega to come in and do much better than Nvidia's offerings after all this time away from the high end.
 
The only thing that's holding me off from getting the 1080 Ti Mini is the Nano.
I don't want history to repeat: I waited for the Fiji Nano and it came late and very expensive.
This time round 4K is actually important to me, so it might be worth waiting for.

You have waited this long, so you might as well wait two more weeks, but seriously just buy the 1080 Ti; especially at 4K gaming you need all the power you can get. The Nano will not cut it unless you are planning to water-cool it and heavily overclock it.

Even the water-cooled Vega will be at most 10% faster than a 1080 FE card and might just match a heavily overclocked FE. If drivers bring another 10-15% improvement, you will still be sitting 20-25% behind a stock 1080 Ti in performance. Let's say you can overclock the water-cooled Vega by another 100MHz: you might gain 7-8% performance, call it 10%. Now you are 15% behind a reference 1080 Ti, but once you overclock your Ti the gap will just widen. And can you imagine a Vega at 1800MHz? That thing could eat close to 450W of power.

If you plan on keeping your graphics card for a LONG time then maybe Vega, with its newer features, might be the better card. Who knows.
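And to be clear on the maths, those percentages compound rather than add. A quick back-of-envelope using the estimates above (illustrative, not benchmarks):

```python
# Rough compounding of the estimates above (illustrative, not measured):
# the gains multiply, they don't add.
vega_rel = 0.78    # assumed ~22% behind a stock 1080 Ti after driver gains
vega_rel *= 1.10   # ~100 MHz overclock, taken as a 10% gain
print(f"{(1 - vega_rel) * 100:.0f}% behind")  # ~14%, i.e. the '15% behind' above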
 
If the reviews for Vega are good then I may look to upgrade around Christmas time, as my RX 480 struggles at 3440x1440 and my monitor has FreeSync.
 
You have waited this long, so you might as well wait two more weeks, but seriously just buy the 1080 Ti; especially at 4K gaming you need all the power you can get. The Nano will not cut it unless you are planning to water-cool it and heavily overclock it.

Even the water-cooled Vega will be at most 10% faster than a 1080 FE card and might just match a heavily overclocked FE. If drivers bring another 10-15% improvement, you will still be sitting 20-25% behind a stock 1080 Ti in performance. Let's say you can overclock the water-cooled Vega by another 100MHz: you might gain 7-8% performance, call it 10%. Now you are 15% behind a reference 1080 Ti, but once you overclock your Ti the gap will just widen. And can you imagine a Vega at 1825MHz? That thing could eat close to 450W of power.

If you plan on keeping your graphics card for a LONG time then maybe Vega, with its newer features, might be the better card. Who knows.

The sneaky Vega Nano is being shown, but I just don't know how long it will take to be produced after the 56/64, or at what price.
Totally agree with your opinion on 4K performance. Say the Nano is 1150MHz base/1300MHz boost and I tweak the p-states and power controller: if it's only performing around a 1070 at 4K, then I could have bought the 1070 Mini last year for £350.
The reason size matters to me is that I can only fit 8.5" into the Elite 110, which is hooked up to my 4K TV in the front room.
But if Vega responds to voltage like the v2 Polaris, then maybe it can be tweaked to hold 1400MHz at under 225W. I can wait, and would rather wait that bit longer, but at the same time I'd hate to spend almost 700 quid if the Nano could stir things up.
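As a sanity check on the 1400MHz-under-225W hope, the usual first-order relation for dynamic power is P ∝ f·V², so a modest undervolt can buy back the clock bump. A rough sketch with assumed stock figures (not measured Vega numbers):

```python
# Back-of-envelope with the standard dynamic-power relation P ~ f * V^2.
# Stock figures below are assumptions for illustration, not measured Vega data.
f0, v0, p0 = 1300, 1.10, 210   # assumed stock: MHz, volts, watts
f1, v1 = 1400, 1.05            # target clock with a modest undervolt
p1 = p0 * (f1 / f0) * (v1 / v0) ** 2
print(f"{p1:.0f} W")           # ~206 W, so under 225 W if the silicon cooperates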
 
One scenario where HBCC might be useful is when you want to load huge amounts of textures, or for those who run lots of mods and run low on VRAM.

If HBCC is something AMD can handle in the driver, rather than developers having to code for it per game, it will be a useful feature.

Open-world games will also benefit a lot from this: texture streaming and VRAM-heavy graphics features won't bottleneck the GPU.
Interested in seeing what Far Cry 5 does with it.
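Conceptually it behaves like demand paging. A toy sketch of the idea (a plain LRU pager, not AMD's actual implementation): treat VRAM as a cache of pages over a larger memory pool, faulting pages in on demand instead of failing once the working set outgrows VRAM.

```python
# Conceptual toy only, not AMD's implementation: an LRU pager that keeps
# hot pages resident in "VRAM" and evicts cold ones to system RAM.
from collections import OrderedDict

class PagedVram:
    def __init__(self, capacity_pages):
        self.capacity_pages = capacity_pages
        self.resident = OrderedDict()  # page id -> data, oldest first

    def touch(self, page):
        if page in self.resident:
            self.resident.move_to_end(page)    # hit: mark recently used
            return "hit"
        if len(self.resident) >= self.capacity_pages:
            self.resident.popitem(last=False)  # evict coldest page to system RAM
        self.resident[page] = object()         # fault the page into VRAM
        return "miss"

vram = PagedVram(capacity_pages=4)
for page in [1, 2, 3, 1, 4, 5, 1]:  # working set larger than VRAM
    print(page, vram.touch(page))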
 
If the reviews for Vega are good then I may look to upgrade around Christmas time, as my RX 480 struggles at 3440x1440 and my monitor has FreeSync.
At Christmas you might be able to pick up a faster/cheaper Volta GPU. Probably not quite by then, but only a few months after that for 2080-type GPUs to be released.
 
http://www.tweaktown.com/news/58635...-leaked-benchmarks-gtx-1070-killer/index.html
My source said that the RX Vega 56 card was running on an Intel Core i7-7700K @ 4.2GHz, had 16GB of DDR4-3000MHz RAM, and was running Windows 10. The benchmarks were run at 2560x1440 with the AMD Radeon RX Vega 56 easily beating NVIDIA's GeForce GTX 1070 in Battlefield 1, DOOM, Civilization 6, and even Call of Duty: Infinite Warfare. My source said that Battlefield 1 was run on Ultra settings, Civ 6 was on Ultra with 4x MSAA, DOOM was at Ultra with 8x TSAA enabled, and COD:IW was running on its High preset.
Radeon RX Vega 56 benchmark results:

Battlefield 1: 95.4FPS (GTX 1070: 72.2FPS)
Civilization 6: 85.1FPS (GTX 1070: 72.2FPS)
DOOM: 101.2FPS (GTX 1070: 84.6FPS)
COD:IW: 99.9FPS (GTX 1070: 92.1FPS)

Not looking bad at all if these are real.
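Translating those figures into percentage leads for the Vega 56 (assuming the leak is genuine):

```python
# Percentage leads computed from the leaked figures above, if they're real:
results = {
    "Battlefield 1":  (95.4, 72.2),
    "Civilization 6": (85.1, 72.2),
    "DOOM":           (101.2, 84.6),
    "COD:IW":         (99.9, 92.1),
}
for game, (vega56, gtx1070) in results.items():
    print(f"{game}: +{(vega56 / gtx1070 - 1) * 100:.0f}%")
# -> roughly +32%, +18%, +20%, +8%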
 
Alreet guys let me overemphasize words with t and d for no apparent reason. :p

Lol I bet if he spoke as he probably does in real life half of his audience wouldn't understand **** :D especially Mericans.

I thought he would go a bit more in depth on the technical stuff; a bit disappointing. Notice he mentioned that the DOOM test was with the Air 64; was this confirmed?

He seems to be putting a lot of faith into FP16...

And I found it a bit disingenuous of him to point out that the 1070's price is high on Newegg without pointing out that the 1080's starting price is only $50 more ($509 for the Gigabyte Windforce). It's a bit misleading; it's only fair to point out all the prices if one of them is higher than expected.
 
At Christmas you might be able to pick up a faster/cheaper Volta GPU. Probably not quite by then, but only a few months after that for 2080-type GPUs to be released.

I'll probably just see what the best option is at the time. Although you may be right about Volta being faster/cheaper, it wouldn't make use of FreeSync unless Nvidia decide to support it in their cards, which is highly unlikely.
 