AMD Navi 23 ‘NVIDIA Killer’ GPU Rumored to Support Hardware Ray Tracing, Coming Next Year

Jimmy, how is this a hard question?

Well, for a start, it makes little grammatical sense -

And what other application on CPU that isn't more sensitive to latency then games?

What are you asking there? Please restate the question in a way that makes sense.

Consoles are using AMD's CPU/GPU and 16GB GDDR6, and somehow you see it as a compromise?

It is a compromise. GDDR6 is great for graphics workloads, less so for general CPU workloads. So it's a compromise. There are a few "wins" from having unified memory - not having to copy data across the PCIe bus as you would in a PC - but for CPU workloads GDDR6 will be slower than DDR4, as it is a higher-latency technology.
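
To make that concrete, here's a minimal C sketch of the latency-bound pattern (my own illustrative code and sizes, nothing from an actual console): every load depends on the previous one, so the loop runs at the speed of the memory round-trip, which is exactly where DDR4's lower latency pays off.

```c
/* Illustrative pointer-chase microbenchmark - a sketch, not console code.
 * Each load depends on the previous one, so the CPU stalls for a full
 * memory round-trip per step: latency-bound, like the branchy game logic
 * that runs on the CPU side of an APU. */
#include <stdio.h>
#include <stdlib.h>
#include <stdint.h>
#include <time.h>

#define N ((size_t)1 << 24)  /* 16M entries, far larger than the caches */

static uint64_t rng = 88172645463325252ULL;
static uint64_t xorshift64(void) {      /* tiny PRNG with 64-bit range */
    rng ^= rng << 13; rng ^= rng >> 7; rng ^= rng << 17;
    return rng;
}

int main(void) {
    size_t *next = malloc(N * sizeof *next);
    if (!next) return 1;

    /* Sattolo's algorithm: one random cycle through all N slots, so the
     * hardware prefetcher can't predict the next address. */
    for (size_t i = 0; i < N; i++) next[i] = i;
    for (size_t i = N - 1; i > 0; i--) {
        size_t j = (size_t)(xorshift64() % i);
        size_t t = next[i]; next[i] = next[j]; next[j] = t;
    }

    clock_t t0 = clock();
    size_t p = 0;
    for (size_t i = 0; i < N; i++) p = next[p];  /* serial dependent loads */
    double secs = (double)(clock() - t0) / CLOCKS_PER_SEC;

    printf("~%.1f ns per dependent load (p=%zu)\n",
           secs / (double)N * 1e9, p);
    free(next);
    return 0;
}
```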

There you go again with your appeal-to-authority fallacy.

What appeal to authority?

You linked to the video; I'm telling you that the video *you linked to* explains exactly what I've been saying - different workloads benefit from memory tuned for different characteristics. That's not an appeal to authority any more than you linking it in the first place was.

Jimmy, there is no technical merit to why it cannot work when it's already working in a computer, the console. Of course there are pros and cons between DDR/GDDR/HBM. I'm not talking about AI but about PC, in this area of PC gaming. And since you cannot provide any practical use case where such a slowdown would occur in PC gaming, this debate is becoming circular and getting way off topic.

I didn't say it can't work, so please don't put words in my mouth. The "practical application" where GDDR6 will perform worse in gaming is the main game logic, which runs normal CPU code on the x86_64 part of the APU. DDR4, with its lower latency, would be better there as it returns data faster. GDDR6, as I've said many times now, is great for graphics tasks, which work in a massively parallel fashion on large vectors, where the greater bandwidth is put to full use. PC gaming benefits from both.

You weren't talking about AI, but the people who are talking about 'Near Memory' are, especially where that 'Near Memory' is GDDR or HBM. Which makes sense, because AI is largely about vector processing, like graphics workloads.
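
For contrast, here's the bandwidth-bound pattern those vector workloads follow - again just an illustrative C sketch of mine (SAXPY, with made-up sizes), not anyone's production code. Every element is independent and access is sequential, so the prefetcher keeps the bus saturated, and raw bandwidth, GDDR6's strength, sets the speed:

```c
/* Illustrative SAXPY (y = a*x + y) sketch - the streaming, vector-style
 * pattern typical of graphics and AI workloads. Independent elements and
 * sequential access keep the memory bus saturated, so throughput is set
 * by bandwidth rather than latency. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N ((size_t)1 << 26)  /* 64M floats per array, ~256 MB each */

int main(void) {
    float *x = malloc(N * sizeof *x);
    float *y = malloc(N * sizeof *y);
    if (!x || !y) return 1;
    for (size_t i = 0; i < N; i++) { x[i] = 1.0f; y[i] = 2.0f; }

    const float a = 3.0f;
    clock_t t0 = clock();
    for (size_t i = 0; i < N; i++)      /* independent, prefetch-friendly */
        y[i] = a * x[i] + y[i];
    double secs = (double)(clock() - t0) / CLOCKS_PER_SEC;

    /* Two 4-byte loads and one 4-byte store per element. */
    double gbytes = 3.0 * (double)N * sizeof(float) / 1e9;
    printf("~%.1f GB/s effective (y[0]=%.1f)\n", gbytes / secs, y[0]);
    free(x); free(y);
    return 0;
}
```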

What is it you don't get here?

You seem to believe that GDDR6 would be better as main system memory in a PC. It's not. It's tuned for a different purpose to DDR4; both have a role to play.

I'm not sure why you think that this is a 'meme' or an 'old wives' tale'. And if you do still think that, perhaps *you* can provide the latency figures that show GDDR6 has lower latency?
 
Honestly, AMD needs to make sure the launch goes without problems:
1.) Have a decent stock cooler, because we all suspect only reference cards at launch
2.) Don't push the GPU so far that it starts to throttle on the stock cooler
3.) None of this last-minute BIOS rubbish like with the 5600 XT, which makes reviewers grumpy
3.) Make sure drivers are in a good state (even delay the launch to give the software people more time), so reviewers can't go on about driver hurdles
4.) Try and price it such that it can't really give Nvidia too much of a look-in.

Try to make the first reviews go uneventfully, because first impressions count. AMD does not need to compete with Nvidia at the very top; what it needs is a launch with good first impressions.
 
Fully agree.. except you have two 3's ;)

I hope AMD's launch goes well and they have something competitive... but I think I'll have gone with a 3090 before AMD has even announced their cards :(
 
When AMD launches RDNA 2...Viewer discretion is advised. :p

In the meantime, enjoy another open, open-world game using ray tracing on console, releasing next year though.


Much like its predecessor, Lego Star Wars: The Force Awakens, the game's hub will not be a single area, such as the Mos Eisley Cantina in Lego Star Wars: The Complete Saga, but a wide range of fully explorable planets filled with many iconic Star Wars landmarks. Planets and moons confirmed by the developers so far to be explorable in the hub include Coruscant, Naboo, Tatooine, Geonosis, Kamino, Utapau, Kashyyyk, Mustafar, Yavin 4, Hoth, Dagobah, Bespin, Endor, Jakku, Takodana, D'Qar, Starkiller Base, Ahch-To, Cantonica, Crait, Ajan Kloss, Pasaana, Kijimi, Kef Bir, and Exegol.[3] Many ships will have freely explorable areas in the hub as well, such as Star Destroyers and the Death Star. Random encounters will also happen in the game's hub. For example, an Imperial Star Destroyer could suddenly jump out of hyperspace and send a fleet of TIE Fighters after the player. Players can choose to engage in dogfights with them or continue onward to progress the story.

The game will have nearly 500 characters, with many of them being playable.
OK, they threw in the kitchen sink. They opened the Star Wars world, then opened it again. I wonder how much VRAM this will need...

But this does actually look really good.
 
lol, I've already addressed this and beaten you in this debate. You are only reiterating a prior off-topic discussion. If you have something new, or can provide examples for your assertion, you can PM me.

You've addressed nothing; you're just making weird-ass, wrong assertions that even the videos and links you post don't agree with. If you want to carry on being wrong, then that's fine, I guess.
 
They need to get 4K8K on board ASAP. They could fire most staff members if they did; they would only need to keep the engineers. Hell, screw it, fire them too, 4K8K will do that too! :p:D
Were you bullied in school, or were you the bully? :P The poor guy has had enough thrown his way now, don't you think?
 
I don't know, man, he keeps coming back for more. Many have tried repeatedly to explain to him that it is not as easy as he makes it out to be. It is easy to look back when you have the information and say one should have done this or that. Hell, they even employed Scott Wasson from The Tech Report, who obviously knows more than him, and that was many years ago; things did not magically change. It takes time, but things are getting better over there now, at least since Lisa took over.
 
Honestly, the PS5 has 875GB, if I am not mistaken, for installing games on its SSD. If the current COD takes 200GB+, there is not much space left for many more games. And then you will need a dedicated fast external NVMe drive for the PS5, or even worse, a proprietary SSD for the Xbox. Hey, we are going back to consoles supporting cartridges, haha - they are just NVMe cartridges...

What is in the game that takes up 200GB? OMG. Sorry, off topic.

Wasn't one of the main benefits of the high-speed storage system to allow for a change in the game file structure that removes a lot of duplication? The first PS5 reveal with Mark Cerny described the on-disc size of next-gen games as being an order of magnitude smaller because of it. Hopefully, the same approach can be used in PC games for those with sufficient storage bandwidth.
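
If that's the change Cerny was describing, the idea is roughly this (a toy C sketch of my own, not any real console's package format): on spinning disks, games packed a copy of the same asset next to every level that used it so reads stayed sequential, while fast random-access storage lets each asset be stored once and referenced everywhere.

```c
/* Toy asset-packing sketch showing why deduplication shrinks installs -
 * hypothetical names and payloads, not a real package format. */
#include <stdio.h>
#include <string.h>
#include <stdint.h>

/* A stand-in asset: name plus payload (imagine texture/mesh bytes). */
struct asset { const char *name; const char *data; };

/* FNV-1a hash, used here to detect identical payloads. */
static uint64_t fnv1a(const char *s) {
    uint64_t h = 1469598103934665603ULL;
    while (*s) { h ^= (unsigned char)*s++; h *= 1099511628211ULL; }
    return h;
}

int main(void) {
    /* Three levels that all ship the same rock texture. */
    struct asset assets[] = {
        { "level1/rock", "ROCK_TEXTURE_BYTES" },
        { "level2/rock", "ROCK_TEXTURE_BYTES" },  /* duplicate payload */
        { "level3/rock", "ROCK_TEXTURE_BYTES" },  /* duplicate payload */
        { "level1/tree", "TREE_TEXTURE_BYTES" },
    };
    int n = sizeof assets / sizeof assets[0];

    uint64_t seen[16];
    int unique = 0;
    size_t total = 0, dup = 0;

    for (int i = 0; i < n; i++) {
        uint64_t h = fnv1a(assets[i].data);
        size_t sz = strlen(assets[i].data);
        total += sz;
        int found = 0;
        for (int j = 0; j < unique; j++)
            if (seen[j] == h) { found = 1; break; }
        if (found) dup += sz;            /* stored once, referenced again */
        else       seen[unique++] = h;   /* first copy: store the payload */
    }
    printf("naive pack: %zu bytes, deduped pack: %zu bytes\n",
           total, total - dup);
    return 0;
}
```

On an HDD the duplicated layout was the lesser evil, since seeking to a shared copy cost milliseconds; an SSD with microsecond seeks removes that trade-off, which is the claimed source of the smaller installs.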
 
People need to stop looking at AMD to be a saviour, imho; if you don't like Nvidia prices, just don't buy new cards. Let your wallets talk.

I expect AMD to have some great-value mid-to-high-end cards in the pipeline, but people will buy Nvidia anyway.

Maybe Intel can shake things up later next year; it's high time someone else threw money down the drain battling Nvidia.
 
Yeah, AMD are done fruitlessly bankrupting themselves trying to gain market share from Nvidia.
 
They seemingly can't even handle making gains in the easy 'mid-range and below' segments!!!! :D

AMD seem much smarter now, developing GPU tech around console advances as well, so it serves a dual purpose.

Battling Nvidia in dGPUs is a money pit; let someone else do that, imho.

AMD are better off focusing on CPUs and consoles. Have a few mid-range cards out; there's no point going all in, as people will just buy Nvidia anyway.

If Nvidia gets too expensive, just don't buy 'em.
 
Very true, and it's seeing them make large market share losses over time :p

I don't think AMD are focused on dGPUs; they focus on CPUs and consoles and chuck out a few dGPUs now and again.

Weirdly, it's the Nvidia guys who obsess over AMD's GPUs, lol. They want AMD to save them from Nvidia's pricing, maybe, while mocking them at the same time. Weird paradox, eh?
 