Poll: ** The AMD VEGA Thread **

On or off the hype train?

  • (off) Train has derailed: 207 votes (39.2%)
  • (on) Overcrowding, standing room only: 100 votes (18.9%)
  • (never ever got on) Chinese escalator: 221 votes (41.9%)

  Total voters: 528
I like his honesty. As other sites have reported, AMD don't want the miners taking all the GPUs, which is why there has been some delay: they're making sure the supply is enough for gamers to get their hands on a GPU as well.


That's what I said earlier in the thread about the 2-week delay to help build inventory :p
 
If it's going to be equal in performance to the 1070 and consume 250W, then the 1070 is still the winner. I used to have 2x R9 290; on water they ran cool and performance was good, but ~600W of power draw was too much. I get the same or better performance now with a single 1070. I don't think Vega 56 will be a 210W card.
 
If it's going to be equal in performance to the 1070 and consume 250W, then the 1070 is still the winner. I used to have 2x R9 290; on water they ran cool and performance was good, but ~600W of power draw was too much. I get the same or better performance now with a single 1070. I don't think Vega 56 will be a 210W card.

I think it will be around 200 watts total power consumption on the reference design, maybe lower. The clocks are "lower" (GPU and VRAM), so it will be designed to stay in that power envelope out of the box; AIB cards will probably be more power hungry by design.

After a certain clock speed the wattage goes through the roof. They are talking about between 300 and 400 watts of GPU and VRAM consumption for the liquid version at a 1677MHz clock, whereas for the lower-clocked air Vega 64 they are talking about 220W of GPU and VRAM power at 1546MHz. That much added power for roughly 130MHz is crazy, but at least the performance seems to scale well with clock.
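For what it's worth, here's a quick Python back-of-envelope on the figures quoted above, treating 350W as the midpoint of the claimed 300-400W range. These numbers come from the post, not from any official spec, so take them as rough illustration only:

    # Rough arithmetic on the quoted Vega 64 figures (TGP = GPU + VRAM power).
    # 350W is an assumed midpoint of the claimed 300-400W for the liquid card.
    air_clock_mhz, air_tgp_w = 1546, 220   # air-cooled Vega 64, as claimed above
    lc_clock_mhz, lc_tgp_w = 1677, 350     # liquid-cooled Vega 64, assumed midpoint

    extra_clock = lc_clock_mhz - air_clock_mhz   # ~131 MHz more
    extra_power = lc_tgp_w - air_tgp_w           # ~130 W more at the midpoint

    print(f"+{extra_clock} MHz ({extra_clock / air_clock_mhz:.1%}) "
          f"costs +{extra_power} W ({extra_power / air_tgp_w:.1%}), "
          f"about {extra_power / extra_clock:.2f} W per extra MHz")

In other words, roughly an 8% clock bump for a ~60% jump in power, which is why the wattage "goes through the roof" past a certain clock.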
 
But I haven't had an AMD GPU since my 5870. It's not easy to tear myself away from Nvidia.

Why is it so hard? If all those blind testers couldn't tell much of a difference, and the HardOCP test had a 1080 Ti and still went in AMD's favour, then what is going to change so much in your gaming? Playing your games is the important part, and once you start doing that you won't care what's driving it.
 
Playing your games is the important part, and once you start doing that you won't care what's driving it.

Very true, and something that people seem to forget: gaming is what we buy these cards for, not pointless benchmarks, which are basically wang-measuring contests.
 
If it's going to be equal in performance to the 1070 and consume 250W, then the 1070 is still the winner. I used to have 2x R9 290; on water they ran cool and performance was good, but ~600W of power draw was too much. I get the same or better performance now with a single 1070. I don't think Vega 56 will be a 210W card.

In the HardOCP video he said the 56 will be the king GPU under $400, which suggests it will be faster. King is a strong word. Poor Pascal :D:D:D
 
Did he make a mistake by saying that Vega 56 is 165W TDP and Vega 64 is 220W TDP?

He wasn't talking about TDP, and he wasn't talking about total board power consumption, just GPU and VRAM consumption: TGP, "total graphics power" (the bulk of the power draw).
 
I think he is confusing the HDR effect (usually semi-fake) that has been used in some games with actual HDR monitor output.
HDR has always been real.

Basically, non-HDR games are designed for 6-bit displays and a 6-bit colour range.
Activating HDR in a game via a mod makes it utilise 8-bit ranges, so you could call it 8-bit HDR.

The newly marketed HDR is 10-bit HDR, so it is a newer variant, but HDR itself is not new.

You can call it emulated HDR if you want, but HDR is only software anyway, even in so-called proper HDR on the PS4 and 4K TVs. Why do you think they could add it to the PS4 via a software patch?

It's quite amusing how people have fallen for it, spending wads of money on HDR TVs and the like.
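To put some concrete numbers on those bit depths, here's a tiny Python sketch of the arithmetic only (it says nothing about whether a given display or game actually uses them):

    # Shades per colour channel at different bit depths, and total RGB colours.
    # Purely illustrative arithmetic for the 6/8/10-bit figures mentioned above.
    for bits in (6, 8, 10):
        levels = 2 ** bits        # shades per channel
        total = levels ** 3       # combinations across R, G and B
        print(f"{bits}-bit: {levels} levels per channel, {total:,} colours")

So the jump from 8-bit to 10-bit is 256 to 1,024 shades per channel, which is the range the newer HDR marketing is built around.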
 
It won't be HDR tech, as that's only just come to TVs and monitors. And it has to be 10-bit HDR, none of this half-assed fake crap.
FreeSync and G-Sync are useful, full stop. I gamed without them for years, then I finally tried it and it was amazing. It's like going from 60Hz to 144Hz while maintaining over 100FPS. You notice the difference in smoothness for sure, and you don't feel FPS variation as much, if at all; I don't notice drops like I used to.

4K does improve image quality. Heck, even 1440p does. I've got a 1080p second monitor at the side here, and at the desktop it is absolutely not as good or as sharp as my 1440p. Same in games. Maxing out settings is a false economy that just slows down GPUs.

I definitely don't see any meaningful benefit from refresh rates higher than 60Hz, but I also don't play twitch shooters, which is probably the only genre where it matters. The fact that I don't see much benefit from 4K and 1440p is probably because I don't play on a huge screen; I accept that when a screen is larger, lower resolutions will have a bigger impact due to the larger pixel size.

Bear in mind that when comparing resolutions you need to compare native screens: 1080p on a 1440p screen doesn't look great, as it's not native, so you get doubled-up pixels and scaling issues, but 1080p on a native 1080p screen is sharp.

From where I sit, somehow super high refresh rates have taken off. People claim to notice the difference even though it's questionable whether a human can perceive those latencies, and since it's harder for GPUs to keep up at, say, 144FPS, there's a knock-on effect that has created a market for G-Sync.

If I switch from 30FPS to 60FPS or vice versa, yes, it's noticeable and feels odd at first, but after 10 minutes or so of playing at a specific frame rate I get "used to it" and it's fine. The only genre I would probably struggle with at 30FPS is driving games, but they are fine at 60FPS.

Now, if I play a game that cannot hold its frame rate, then yes, an adaptive refresh rate would help. However, for me this is not a common situation; it is very infrequent. Does that rare case warrant spending lots of cash on a G-Sync screen, which also has limited output ports, further restricting me? In my opinion, no.

My opinion might change, though, if I played on a 60-inch TV and played twitch shooters. :)

Of course, everyone is right that a great thing about PC gaming is we all have this choice. :)

On my 27-inch I can see the difference at 1440p if I "look for it", but it's not something I notice whilst actually playing a game; that's more what I meant. However, things like draw distances, lighting effects and special-effect details I will notice whilst playing.
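For reference, here's a short Python sketch of the raw arithmetic behind those refresh-rate and resolution comparisons. It only shows frame times and pixel counts, not what any individual can actually perceive:

    # Frame time at each refresh rate: how long a single frame stays on screen.
    for hz in (30, 60, 100, 144):
        print(f"{hz:>3} Hz -> {1000 / hz:.1f} ms per frame")

    # Pixel counts for common resolutions, relative to 1080p.
    base = 1920 * 1080
    for name, (w, h) in {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}.items():
        pixels = w * h
        print(f"{name}: {pixels / 1e6:.1f} MP ({pixels / base:.1f}x 1080p)")

Going from 60Hz to 144Hz shaves the per-frame time from about 16.7ms to about 6.9ms, and 4K pushes four times the pixels of 1080p, which is why both put so much more load on the GPU.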
 
HDR has always been real.

Basically, non-HDR games are designed for 6-bit displays and a 6-bit colour range.
Activating HDR in a game via a mod makes it utilise 8-bit ranges, so you could call it 8-bit HDR.

The newly marketed HDR is 10-bit HDR, so it is a newer variant, but HDR itself is not new.

You can call it emulated HDR if you want, but HDR is only software anyway, even in so-called proper HDR on the PS4 and 4K TVs. Why do you think they could add it to the PS4 via a software patch?

It's quite amusing how people have fallen for it, spending wads of money on HDR TVs and the like.

HDR is not software-only; what are you gibbering on about?
 
HDR has always been real.

Basically, non-HDR games are designed for 6-bit displays and a 6-bit colour range.
Activating HDR in a game via a mod makes it utilise 8-bit ranges, so you could call it 8-bit HDR.

The newly marketed HDR is 10-bit HDR, so it is a newer variant, but HDR itself is not new.

You can call it emulated HDR if you want, but HDR is only software anyway, even in so-called proper HDR on the PS4 and 4K TVs. Why do you think they could add it to the PS4 via a software patch?

It's quite amusing how people have fallen for it, spending wads of money on HDR TVs and the like.

You're very wrong here!
Sony patched the PS4 with HDR support, correct, but you still need an HDR-enabled TV (hardware) to use it. We don't even have real HDR-enabled monitors yet, from my understanding. Software (drivers) is only one part of the story.
 