Poll: ** The AMD VEGA Thread **

On or off the hype train?

  • (off) Train has derailed: 207 votes (39.2%)
  • (on) Overcrowding, standing room only: 100 votes (18.9%)
  • (never ever got on) Chinese escalator: 221 votes (41.9%)

  Total voters: 528
You're very wrong here!
Sony patched the PS4 with HDR support, correct, but you still need an HDR-enabled TV (the hardware) to use it. We don't even have real HDR-enabled monitors yet, from my understanding. Software (drivers) is only one part of the story.
The hardware is just a 10-bit processing chip that supports the "software layer".

You could perhaps call it an HDR acceleration chip at best.

Think back to when Nintendo added lockout chips to the EU NES; I think they were called 10NES or something. Those chips simply served as a means to block unlicensed games from working. The HDR chip on TVs is much the same, in that it prevents people who haven't paid the premium from using HDR on their non-HDR TVs.

It's funny that someone has edited the Wikipedia HDR page to remove older uses of the technology. Rewriting history. :)
 
HDR has always been real.

Basically, non-HDR games are designed for 6-bit displays and a 6-bit colour range.
Activating HDR in a game via a mod makes it utilise 8-bit ranges, so you could call it 8-bit HDR.

The newly marketed HDR is 10-bit HDR, so it is a newer variant, but HDR itself is not new.

You can call it emulated HDR if you want, but HDR is only software anyway, even in so-called proper HDR on the PS4 and 4K TVs. Why do you think they could add it to the PS4 via a software patch?

It's quite amusing how people have fallen for it, spending wads of money on HDR TVs and the like.

A lot of games fake HDR using tone-mapping-style techniques, compressing and exaggerating the dynamic range. That isn't comparable to HDR with a full source range to work from; very few games process it in such a way that it would display properly on an 8-10 bit display with the actual contrast range required, without losing precision in some parts of the range.
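(For the numbers being thrown around here: the bit depth just sets how many steps each colour channel gets, and tone mapping is what squeezes a wide scene range into those steps. A minimal Python sketch, illustrative only; the Reinhard curve below is one common textbook tone-mapping operator, not what any particular game ships.)

# Steps per colour channel at the bit depths being discussed.
for bits in (6, 8, 10):
    levels = 2 ** bits
    print(f"{bits}-bit: {levels} levels per channel, {levels ** 3:,} colours")
# 6-bit: 64 levels, 262,144 colours
# 8-bit: 256 levels, 16,777,216 colours
# 10-bit: 1024 levels, 1,073,741,824 colours

def reinhard(luminance: float) -> float:
    """Global Reinhard tone-map: squashes unbounded scene luminance into [0, 1)."""
    return luminance / (1.0 + luminance)

def quantise(value: float, bits: int) -> int:
    """Round a [0, 1] value onto the nearest code of a bits-deep channel."""
    return round(value * (2 ** bits - 1))

# A bright highlight (scene luminance 4.0) tone-mapped, then quantised:
print(quantise(reinhard(4.0), 8))   # 204 (8-bit code)
print(quantise(reinhard(4.0), 10))  # 818 (10-bit code, 4x finer steps)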
 
The hardware is just a 10-bit processing chip that supports the "software layer".

You could perhaps call it an HDR acceleration chip at best.

Think back to when Nintendo added lockout chips to the EU NES; I think they were called 10NES or something. Those chips simply served as a means to block unlicensed games from working. The HDR chip on TVs is much the same, in that it prevents people who haven't paid the premium from using HDR on their non-HDR TVs.

It's funny that someone has edited the Wikipedia HDR page to remove older uses of the technology. Rewriting history. :)
I think you're missing the actual point of the HDR we're talking about. For all intents and purposes, we're talking about maximising contrast, which is where full-array local dimming comes into things to enable proper HDR. I feel like you're misunderstanding something, as that isn't something you can achieve with just software. The hardware has to be able to physically produce the extended range from dark to bright that comes with the HDR spec.
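(To put a number on "extended range": HDR10 uses the SMPTE ST 2084 "PQ" transfer function, which maps 10-bit code values to absolute luminance up to 10,000 nits; no firmware update makes a 350-nit panel emit that. A sketch of the PQ EOTF; the constants come from the published standard, the Python itself is just for illustration.)

# SMPTE ST 2084 (PQ) EOTF: normalised signal E' in [0, 1] -> luminance in nits.
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(signal: float) -> float:
    """Display luminance in cd/m^2 (nits) for a normalised PQ signal value."""
    p = signal ** (1 / M2)
    return 10000 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

print(round(pq_eotf(1.0)))  # 10000 nits at the top code value
print(round(pq_eotf(0.5)))  # ~92 nits: half the signal range is still SDR territory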
 
The hardware is just a 10-bit processing chip that supports the "software layer".

You could perhaps call it an HDR acceleration chip at best.

Think back to when Nintendo added lockout chips to the EU NES; I think they were called 10NES or something. Those chips simply served as a means to block unlicensed games from working. The HDR chip on TVs is much the same, in that it prevents people who haven't paid the premium from using HDR on their non-HDR TVs.

It's funny that someone has edited the Wikipedia HDR page to remove older uses of the technology. Rewriting history. :)

So what you're saying is, if a TV (standard 4K or non-4K) has 10-bit support, it has the capability to show HDR content?
My old 1080p Philips TV supports 4 trillion colour processing (14-bit RGB), and that can't use HDR from the PS4 Pro.
 
The hardware is just a 10-bit processing chip that supports the "software layer".

You could perhaps call it an HDR acceleration chip at best.

Think back to when Nintendo added lockout chips to the EU NES; I think they were called 10NES or something. Those chips simply served as a means to block unlicensed games from working. The HDR chip on TVs is much the same, in that it prevents people who haven't paid the premium from using HDR on their non-HDR TVs.

It's funny that someone has edited the Wikipedia HDR page to remove older uses of the technology. Rewriting history. :)

:rolleyes: :p Where are you getting your 'version' of how modern HDR works? Not off that Alex Jones chap, is it?

Now you think HDR is akin to a new form of DRM... what next?

I await the next thrilling instalment!

Love to hear your insights on multi-channel audio formats.
 
They haven't "always been" anything, tbh.

770 - $399
970 - $329
1070 - $449

Take the pound out of the equation altogether; it makes it easier.

Go back a bit more and you had the top-tier GPUs at ~£400. The GTX 580 launched at £400; if it launched today it would be called the Titan F and be priced at £1,200. Many people forget the shuffling of the range Nvidia has done since they introduced the Titan and x80 Ti cards: the x70 is no longer the second-tier card but the fourth-tier, mid-range card.
 
Wonder which will draw more power:

Intel's 18-core "165W" i9-7980XE as people try to overclock it from a 2.6GHz base, or a stock AMD "345W" Vega 64 Liquid?

Sounds like a good time to market high-wattage PSUs.
 
I can see the liquid version beating the 1080 in almost all gaming benchmarks.
No.

According to AMD's slides, the liquid is only 20% faster at 2560x1440 than the Fury X at 1050/500 (DX12 BF1, a good AMD game) on a 7700K @ 4.2GHz (so boosting to 4.5).

An overclocked 1080 is around 40% faster at the same resolution than an overclocked Fury X @ 1100/550 in DX11. Do the math.
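(Doing that math explicitly, taking the quoted slide figures at face value and normalising both claims to a Fury X = 1.0 baseline; as the next post points out, the two baselines aren't actually comparable.)

# Vendor-slide arithmetic only: different APIs (DX12 vs DX11) and
# different Fury X states (stock 1050/500 vs OC 1100/550) in each claim.
vega_liquid = 1.0 * 1.20  # AMD slide: liquid Vega ~20% over Fury X, DX12 BF1
gtx_1080_oc = 1.0 * 1.40  # OC 1080 ~40% over an OC Fury X, DX11

print(f"1080 OC vs Vega liquid: {gtx_1080_oc / vega_liquid:.2f}x")
# -> 1.17x, roughly 17% ahead, if (and only if) the two baselines matched.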
 
That comparison is as clear as mud.

Yeah, there might be something in it, but it's such a bad comparison.

No.

According to AMD's slides, the liquid is only 20% faster at 2560x1440 than the Fury X at 1050/500 (DX12 BF1, a good AMD game) on a 7700K @ 4.2GHz (so boosting to 4.5).

An overclocked 1080 is around 40% faster at the same resolution than an overclocked Fury X @ 1100/550 in DX11. Do the math.

It's not stock vs stock; nothing is kept the same, not even the same DX...
 
I can see the liquid version beating the 1080 in almost all gaming benchmarks.

Very unclear, cherry-picked AMD benchmarks have been provided, and no, they don't show that.

Like everything else in this launch, it has been smoke and mirrors to cover up performance. I realise you're still hopeful, but this isn't how you market or launch a winning product.
 
August 1st is also a pivotal day for BTC: a massive change that has taken years. It's possible for the price, and also ETH, to move greatly going forward. So far as I can tell, demand is not suffering a pullback or breakdown; it's more idling right now. The price remaining so high I would normally take as a bullish sign, but I can't say I know either way.

The only thing I really read was that ETH needs 8GB now, not 4GB, so maybe you can pick up 4GB 480 cards cheaper than before. It's probably not absolute, just less demand for that memory size now.
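(For context on the 4GB/8GB point: Ethereum's Ethash DAG starts at about 1 GiB and grows by about 8 MiB per 30,000-block epoch, so cards do eventually fall off a memory cliff. A back-of-envelope Python sketch, assuming I have the spec's growth constants right; it ignores the prime-size rounding the real algorithm applies and assumes a constant ~15 s block time.)

# Rough Ethash DAG growth: DATASET_BYTES_INIT = 2**30 (1 GiB),
# DATASET_BYTES_GROWTH = 2**23 (8 MiB) per 30,000-block epoch.
GIB = 2 ** 30
INIT = 2 ** 30
GROWTH = 2 ** 23
BLOCK_TIME_S = 15  # assumed constant, which it isn't in practice

for vram_gib in (4, 8):
    epoch = (vram_gib * GIB - INIT) // GROWTH + 1  # first epoch that no longer fits
    years = epoch * 30000 * BLOCK_TIME_S / (3600 * 24 * 365)
    print(f"{vram_gib} GiB card outgrown around epoch {epoch} (~{years:.1f} years from genesis)")
# 4 GiB -> epoch 385, ~5.5 years in; 8 GiB -> epoch 897, ~12.8 years in.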



A really nice monitor, but it's 100Hz. OK, very nice, but not perfect (for FPS players), and that price is too high not to buy your ideal monitor.
Did anyone here actually want to buy this particular monitor?

The BTC block size increase (SegWit2x), if transitioned smoothly, will see the value of other virtual currencies drop significantly, as most people will start moving their holdings in other virtual currencies to BTC... especially those who had hedged in the run-up to the transition.

There could be a big drop in non-ETH coins, and ETH itself won't be much immune to a successful BTC transition [i.e. the one without a hard fork].

I don't know how SegWit2x is going to impact AML requirements (that's too technical; I'll have to look at the source code and current methods of information gathering, which basically follow the SAS fraud analytics suite but are available for free). If the transition fundamentally violates AML, then all virtual currencies can be expected to drop.
 
HDR has always been real.

Basically, non-HDR games are designed for 6-bit displays and a 6-bit colour range.
Activating HDR in a game via a mod makes it utilise 8-bit ranges, so you could call it 8-bit HDR.

The newly marketed HDR is 10-bit HDR, so it is a newer variant, but HDR itself is not new.

You can call it emulated HDR if you want, but HDR is only software anyway, even in so-called proper HDR on the PS4 and 4K TVs. Why do you think they could add it to the PS4 via a software patch?

It's quite amusing how people have fallen for it, spending wads of money on HDR TVs and the like.

Nope, the software on the PS4 has to transmit it. The panels in the TVs have to be designed to output it. There is a difference. You cannot do a software update to a TV for HDR output like you can for the device transmitting the data, unless your panel already happened to support 10-bit, 1000-nit image levels that were simply not being used (and I am not aware of any that did).

Yes, games are designed for 6-bit; 8-bit still doesn't make them HDR. HDR is a standard created for 10-bit, and technically only 10-bit is HDR.

Now you can say that the 8-bit has a greater dynamic range, but that technically still isn't HDR.

Anyway, going back to Vega: watching the YouTube clips, games companies seem super hyped to use it, but we are not seeing performance. Is this again something where we will see DX12 games suddenly gain another 20% performance in two years? Is AMD too far ahead of the curve again?

And people were also saying HBCC was not for gaming and that it would need direct input, but all the videos from AMD suggest that it is literally a background task that sorts itself out?
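(On HBCC: AMD describes it as hardware page-based memory management that treats VRAM as a cache over system memory, which is why it can behave like a background task. The mechanics resemble ordinary OS demand paging. A loose analogy using Python's mmap follows; this is purely illustrative and has nothing to do with AMD's actual implementation.)

import mmap
import os

# Like demand paging: the mapping is "allocated" up front, but physical
# pages are only faulted in when the program actually touches them.
path = "big_asset.bin"
with open(path, "wb") as f:
    f.truncate(1 << 30)  # claim a 1 GiB "asset" without writing 1 GiB (sparse file)

with open(path, "r+b") as f:
    view = mmap.mmap(f.fileno(), 0)
    # Touching one byte faults in just that page; the rest stays on disk,
    # much as HBCC is said to keep untouched pages out of VRAM.
    view[512 * 1024 * 1024] = 0xFF
    view.close()

os.remove(path)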
 
Just when things could not get any worse for the Vega launch...



https://www.techpowerup.com/235701/...l-features-for-titan-xp-through-driver-update

The fastest gaming card available just got better!

Shame I haven't got any professional work for mine to do. :(
 