When will GPU prices go down?

Can someone explain to me what use is AI to the average gamer?
Well, for more believable game AI it would be nice, eventually.

However, currently it is all being used to upscale and generate fake frames. Apparently, AI upscaling is the best thing ever - not at all like that upscaling those cheap consoles used to do whatsoever!
 
Can someone explain to me what use is AI to the average gamer?
Potentially: better interactions with NPCs, faster/better world creation, quicker/cheaper game production, dynamic events in online games, smarter game AI, more variation and fewer on-rails games, better QC.
Lots of things it could be used for, if it is as good/dangerous as promised. If it's just a fancy chatbot, not so much.
 
Can someone explain to me what use is AI to the average gamer?
I think there are a few areas where better AI/ML can lead to a better experience:

* Things like DLSS for performance enhancements
* More realistically acting game characters/enemy AI
* Convincing dynamic conversations with NPCs rather than just canned dialogue trees (see the rough sketch after this list)
* Advanced procedural generation - e.g. remixes of existing game maps, new characters, scenery, etc.
* Easier creative tools in games where AI can do more of the heavy lifting for you
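
To make the NPC-dialogue point concrete, here's a minimal, hypothetical sketch (in Python) of the difference between a canned dialogue tree and handing the conversation to a language model. None of the names here come from a real engine or AI API; generate_reply() is just a stand-in for whatever local or hosted model a game might actually call.

```python
# Hypothetical sketch only: canned dialogue tree vs. model-driven NPC dialogue.
# Nothing here is a real engine or AI API; generate_reply() is a placeholder.

# 1) The traditional approach: a fixed, pre-written dialogue tree.
DIALOGUE_TREE = {
    "greeting": {
        "text": "Welcome to the forge, traveller.",
        "options": {"Ask about swords": "swords", "Leave": None},
    },
    "swords": {
        "text": "Steel's scarce since the war. Come back next week.",
        "options": {"Leave": None},
    },
}

def canned_reply(node_key: str) -> str:
    """Return the pre-written line for a node in the tree."""
    return DIALOGUE_TREE[node_key]["text"]

# 2) The model-driven approach: build a prompt from the NPC's persona,
#    recent conversation, and the player's free-form input, then ask a model.
def build_prompt(persona: str, memory: list[str], player_input: str) -> str:
    history = "\n".join(memory[-5:])  # keep only the last few exchanges
    return (
        f"You are {persona}.\n"
        f"Recent conversation:\n{history}\n"
        f"Player says: {player_input}\n"
        "Reply in character, in one or two sentences."
    )

def generate_reply(prompt: str) -> str:
    """Placeholder for a real model call (local or hosted)."""
    return "[model-generated line would go here]"

if __name__ == "__main__":
    print(canned_reply("greeting"))
    prompt = build_prompt(
        persona="a weary blacksmith in a war-torn village",
        memory=["Player: Hello.", "NPC: Welcome to the forge, traveller."],
        player_input="Why is steel so scarce these days?",
    )
    print(generate_reply(prompt))
```

The point of the sketch is just that the tree can only ever say what was written in advance, while the model-driven path can respond to anything the player types - which is also why it needs guardrails and far more compute.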
 
Can someone explain to me what use is AI to the average gamer?
It may make online shooters a bit more realistic.

In a simulation, an AI drone had to be told that killing the guy giving it the orders to stop attacking its objectives wasn't good.

It then went on to disable the communications system instead.

Gotta love AI. It's trolling from the get-go.
 
I don't see why they would split it; the businesses are too intertwined. The AI side is all being driven by GPU architecture, so it wouldn't make sense to split them. All of their major products are GPU-related.

Why would they sell a portion of their business that still generates high revenue and is so closely linked to other areas of their business?

Gaming-focused GPUs will become even more AI-related, if anything.
Not disagreeing, as I don't think they will split; however, I was just listening to this week's LTT WAN Show and Linus made a good point (IMO).

He made the point of whether they should: like I said, I don't think they will, but if they did spin off the gaming side of things it would probably be good not only for the consumer but also for Nvidia.
 
Explain how it would be good for Nvidia, for us uninformed. Surely in these times, with them not making as much cash as they expected and the gamer/reviewer backlash, wouldn't that hit the share price of the gaming side if it were spun off?
 
AFAIK spinning off a part of the company doesn't come with separate shares for each part of the company.

Have I used the wrong wording? What I mean is how some companies have an overarching company owning them but are separate entities within that company, a bit like how Google the search engine was reorganised into being owned by Alphabet Inc. a few years back.
 
Yeah, for that it made/makes less sense, as RTG wasn't really competing with anything within the company; the CPU side of things wasn't really pulling the GPU side of things in two directions.

Whereas, and I'm paraphrasing what Linus said, it makes more sense for Nvidia because the data centre (B2B, AI, etc.) side of things wants, and gets, its own way due to the amount of revenue it can earn versus gaming, and its demands are very different from what gaming demands. As someone asked in another thread, what good is AI for gaming? While there are uses in gaming, it's not to the same degree as businesses demand. The same goes for what someone mentioned about, I think, ULMB or whatever they're calling their new strobing tech: that's of little use to businesses, so it's unlikely to get as much investment as something that benefits the data centre (B2B, AI, etc.) side.

Basically, because there's more profit to be made in the data centre (B2B, AI, etc.), there's a very high chance that its demands will win the day, leaving the demands of gaming as the poor second cousin.
 
Surely Nvidia will not have to cut back on Ada chips from TSMC due to AI demand? Of course not; they can't sell what's out there right now. I can guarantee you the next Nvidia gen will be another Ampere moment, just without the crypto and pandemic inflation.
 
You can shoehorn some hardware in to 'help' with gaming tasks, but you have to be honest at some stage and engineer the product to have only what it needs. If Nvidia keep banging on about Moore's Law being dead (twice now) and component costs etc., then they need to streamline it back so that consumers are not having to subsidise it. Wasn't it raised that the tensor cores are not being used effectively, or that some component was kind of made redundant?

We have progression phases, but at some point there will have to be a fork between AI-specific tasks and gaming technology. Up until this point I haven't really been paying much attention to that, but it's interesting how both vendors work internally and how their segmentation works.
 
Ada Lovelace was a Pascal moment, but Nvidia wanted to keep mining prices and clear out their Ampere inventories. TSMC 4N (5nm) to TSMC 3nm will be less of a jump IMHO, unless die sizes also increase or Nvidia makes some massive uarch breakthrough.
 
No, but if you recall, AMD had very compute-heavy resources with Vega, and when it came to gaming the excuse then was that they were not focusing on it (or maybe the angle was that the resources were being wasted). The compute element was actually quite strong; then they shifted design again.
AMD's mistake was not pushing harder on its strengths. Before RDNA it needed Mantle and heavy use of Async Compute to keep its GPUs properly fed; DX11 or lower meant a lot of bottlenecks when the scene got heavy on one thread.
Like I wrote on other occasions, why not push AI on the GPU, physics, more TrueAudio? Doing other stuff besides graphics probably would have helped. Or maybe not...

Anyway, Nvidia does it better, and while the hate for DLSS and RT/PT runs free, it does make a serious step forward towards photorealism.
 
AMD's mistake was not pushing harder on its strengths. Before RDNA it needed Mantle and heavy use of Async Compute to keep its GPUs properly fed; DX11 or lower meant a lot of bottlenecks when the scene got heavy on one thread.
Like I wrote on other occasions, why not push AI on the GPU, physics, more TrueAudio? Doing other stuff besides graphics probably would have helped. Or maybe not...

Anyway, Nvidia does it better, and while the hate for DLSS and RT/PT runs free, it does make a serious step forward towards photorealism.
AMD pushes open source; sometimes it works, most of the time it doesn't.

Without AMD's Mantle, which arguably got even more pushback because it was AMD, there wouldn't be RT'ing.

IMO the 'hate' for RT'ing is because of the buy-in/performance cost, shown by plenty of NV/AMD users.

Until RT'ing is available to mainstream users who can turn it on without the MASSIVE performance cost, then and only then will the hate/pushback cease.

It also doesn't help when we are force-fed 'upscaling > native'.

Different needs for different users.
 