
Now's your time, AMD

A dev who is making their own ray-tracing-based game:

https://twitter.com/SebAaltonen/status/1032283494670577664

Claybook ray-traces at 4.88 Gigarays/s on AMD Vega 64. Primary RT pass. Shadow rays are slightly slower. 1 GB volumetric scene. 4K runs at 60 fps. Runs even faster on my Titan X. And with temporal upsampling even mid tier cards render 4K at almost native quality.

The RTX 2070 can do 6 Gigarays/s in comparison. It seems even Pascal is not that bad either. The chap is a former senior rendering lead at Ubisoft.
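On the "temporal upsampling" bit in the tweet: the broad idea is to render below native resolution with a different sub-pixel jitter each frame and blend the result into a persistent history buffer, so the image converges towards native quality over a few frames. A minimal sketch of the accumulation step (NumPy; the blend factor and names are my own illustrative choices, not Claybook's actual code):

```python
import numpy as np

def temporal_accumulate(history, current, alpha=0.1):
    """Exponential blend of this frame's (upscaled, jittered) render
    into the running history buffer at native resolution."""
    return (1.0 - alpha) * history + alpha * current

# Toy usage: a 4K greyscale buffer converging over a few frames.
history = np.zeros((2160, 3840), dtype=np.float32)
for frame in range(8):
    # Stand-in for a low-res render upscaled to 4K with this frame's jitter.
    current = np.random.rand(2160, 3840).astype(np.float32)
    history = temporal_accumulate(history, current)
```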

Edit!!

OTOH, is the Vega or Pascal GPU fully tapped out there, though, against just the dedicated ray-tracing hardware of the RTX 2070?
 

Yeah, that's JUST the RT hardware of the 2070, so using both the CUDA cores plus RTX it would be like double, so roughly 2.5× Vega.
 
He said 4K 60 fps on Vega with 1 ray per pixel (4.88 Grays/s). Also, at the same 1 ray per pixel, the Titan X can do more than 60 fps at 4K.
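Quick back-of-the-envelope on those figures: at 1 ray per pixel, 4K at 60 fps only amounts to about 0.5 Gigarays/s averaged over the whole frame, so the 4.88 number presumably measures the rays against the RT pass time alone; that's my reading, the tweet doesn't spell it out:

```python
# Back-of-the-envelope: how ray throughput relates to resolution, fps
# and pass time. Nothing here is a measured Claybook number except the
# quoted 4.88 Grays/s figure.
width, height, fps, rays_per_pixel = 3840, 2160, 60, 1

rays_per_frame = width * height * rays_per_pixel       # ~8.3 M rays
print(f"{rays_per_frame * fps / 1e9:.2f} Grays/s averaged over whole frames")

# Pass time implied if 4.88 Grays/s is measured over the RT pass alone:
print(f"{rays_per_frame / 4.88e9 * 1e3:.2f} ms primary RT pass")
```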

So it's similar to using RadeonRays 2.0, which is programmed for CUDA & GCN cores.

Also have a look here; it's related to the tweet (posted in the replies to your tweet).


Real time rendering & ray tracing (including Physics) for PC, PS4Pro and XboneX :eek:
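Worth noting for the console point: Claybook reportedly ray-traces a signed-distance-field volume rather than triangles, which is part of why it runs on plain compute across PC and consoles. A toy sphere-tracing loop shows the core idea (the sphere SDF and constants here are mine, purely illustrative):

```python
import numpy as np

def sphere_sdf(p, center=np.array([0.0, 0.0, 5.0]), radius=1.0):
    """Signed distance from p to a sphere (toy stand-in for a 1 GB SDF volume)."""
    return np.linalg.norm(p - center) - radius

def sphere_trace(origin, direction, sdf, max_steps=64, eps=1e-3, max_dist=100.0):
    """March along the ray, stepping by the SDF value (always a safe step).

    Returns the hit distance, or None on a miss.
    """
    t = 0.0
    for _ in range(max_steps):
        d = sdf(origin + t * direction)
        if d < eps:          # close enough: count it as a hit
            return t
        t += d               # safe step: nothing is closer than d
        if t > max_dist:
            break
    return None

# One primary ray straight down +Z hits the sphere at t ~= 4.0.
hit = sphere_trace(np.array([0.0, 0.0, 0.0]), np.array([0.0, 0.0, 1.0]), sphere_sdf)
print(hit)
```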
 
Your statement regarding the 2070's 6 Gigarays/s isn't accurate. You would need to test the 2070 in the same scene as the Vega 64 to find out how many gigarays/s it could do.
 

I didn't realise existing consoles could do it already!
 
Anyway, a moot point, as it's the full GPU against just the dedicated units on the Nvidia card, so the older Pascal and Vega cards are probably fully tapped out and Turing isn't.

Yes. BTW, that's how RadeonRays 2.0 works on GCN & CUDA. It doesn't require specialized dedicated RT cores.
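To make "no dedicated RT cores needed" concrete: BVH traversal boils down to ray-box and ray-triangle arithmetic that any GCN or CUDA shader core can run. Here is the slab test at the heart of it, sketched in Python for readability (RadeonRays itself is C++/OpenCL; this illustrates the technique, not its API):

```python
def ray_aabb_hit(origin, inv_dir, box_min, box_max):
    """Slab test: does the ray hit the axis-aligned box?

    origin, box_min, box_max: (x, y, z) tuples; inv_dir is 1/direction
    per axis (precomputed once per ray, as traversal kernels typically do).
    """
    t_near, t_far = 0.0, float("inf")
    for o, inv, lo, hi in zip(origin, inv_dir, box_min, box_max):
        t0 = (lo - o) * inv
        t1 = (hi - o) * inv
        if t0 > t1:
            t0, t1 = t1, t0
        t_near = max(t_near, t0)
        t_far = min(t_far, t1)
        if t_near > t_far:
            return False   # slab intervals don't overlap: miss
    return True

# A ray down +Z from the origin hits a unit box centred at z = 5.
# inv_dir uses inf for the zero direction components (1/0 per axis).
print(ray_aabb_hit((0, 0, 0), (float("inf"), float("inf"), 1.0),
                   (-0.5, -0.5, 4.5), (0.5, 0.5, 5.5)))
```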
 
I didn't realise existing consoles could do it already!

Surprised also, tbh. Possibly 4K 30 fps rather than 60 fps, but that is in line with the capabilities. What surprises me is that they push physics & real-time rendering as well (no rasterization), which is baffling. Where has this performance been hidden all this time?

Yet do you remember the real-time rendering video with the Unity engine running a 980 Ti and a Fury X?
 
Hasn't it been AMD's time for a year or more now? I can't be the only one enjoying the RX 580's oomph for £150 less than the 1070.

(Note: I would never spend more than £400 on a GPU; that's just one silly too far.)
 
AMD's share price hit a 12-year high today, and a major analyst (after spending two days with AMD execs) raised his share-price forecast further. AMD admitted they hadn't planned on Intel's woes with its 10 nm process, but it looks like a real window of opportunity for them for a while.

With 7 nm CPUs/GPUs projected to reach the market in 2019/2020, there could be a real change in the market yet.

Just another point of view, I was reading about it today.
 
So is this another thing AMD did first that Nvidia is painting as their own invention?

What's the info on RadeonRays? When was it introduced, and has it been used?
 
Here is what's up with AMD.

If they still have both the Sony and MS console contracts, they are supposed to make the next advancement in GPUs, which is chiplet designs.
While Nvidia goes monolithic, AMD and possibly Intel are going with a chiplet design. What we see in Threadripper and the like is what's rumoured for AMD's 7nm graphics cards.
The only barrier I know of right now is MS's willingness to allow mGPUs to be seen as one GPU, which it hasn't allowed as of yet. But the moment you read about MS allowing mGPU to be seen as one GPU, just like on the CPU side, you know chiplets are not far behind, IMO.

What's also rumoured is that this is why we aren't seeing anything from AMD until the 2020 time frame or thereabouts. Kinda hard to believe, to be honest. But once the news broke that Navi was designed for consoles, and why Raja left AMD, it's starting to add up, IMO.

A roughly 1.5-year hiatus from AMD on new graphics cards could indeed explain Raja Koduri's departure from AMD into Intel's loving arms.
 