Any news on the 7800 XT?

Put simply, game developers are sick to the back teeth of using 1K textures in games that should be running 2.5K if not 4K. 12GB should be the minimum for lower mid-range cards, 16GB mainstream, and 20GB+ at the higher end. AMD, who have partnered with Epic Games, agree, and it's not just Epic Games pushing this, most developers are now, because they're no longer willing to work within last-gen console constraints. No amount of Nvidia cheaping out on VRAM will stop them.

Our game stutters on your GPU? That's your problem, you bought the wrong card. What possessed you to buy that junk? You wouldn't buy a Lambo with a lawnmower engine, would you?

And I agree.
It's not that simple. Some of the best texture work I've ever seen has no problem fitting on 8GB cards.

What AMD does with its partnered games regarding textures and VRAM is very dubious.
 
Will Nvidia start throwing their toys out of the pram, so to speak, soon?

AMD are separating the analogue functions from the logic functions in GPUs.
Because analogue lithography is no longer shrinking in line with logic, in fact it's no longer shrinking at all, the analogue functions are taking up ever more space in the die.
By taking the analogue functions out, AMD can manufacture them on older, cheaper nodes, because it makes no difference anyway, and just glue those analogue functions to the logic as needed.
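To see why that die-space argument matters, here's a toy back-of-the-envelope model. The starting areas and the "logic halves per node, analogue doesn't shrink" assumption are purely illustrative numbers I've picked, not real die measurements, but they show how the analogue share of a monolithic die balloons over a few node jumps:

```python
# Toy model (illustrative numbers, not real die measurements): if logic
# area halves with each node shrink while analogue area stays fixed,
# the analogue share of a monolithic die grows every generation.
logic_mm2 = 300.0     # hypothetical logic area on the starting node
analogue_mm2 = 100.0  # hypothetical analogue area (PHYs, I/O, etc.)

for node in ["28nm", "14nm", "7nm", "5nm"]:
    total = logic_mm2 + analogue_mm2
    share = analogue_mm2 / total * 100
    print(f"{node}: {total:.0f} mm^2 total, analogue = {share:.0f}%")
    logic_mm2 *= 0.5  # logic keeps shrinking; analogue does not
```

Under those made-up numbers the analogue share climbs from 25% to roughly 73% of the die in three shrinks, which is the whole motivation for splitting it off onto cheaper older nodes.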

Nvidia engineers:
Hey Jensen, analogue lithography is no longer shrinking, that's a problem, too much of the die is being taken up by analogue functions.
Leather Jacket Man: So what? Just take some out and tell people Moore's Law is dead.
Engineers: So a 128-bit bus on the ##60 class and 192-bit on the ##70 class?
Leather Jacket Man: The more you buy, the more you save. Whoosh...... I am the sound effects.

I think he's already lost his toys.
 
Sorry to be that pedantic A-hole, but it's not that they're no longer shrinking, it's that node shrinks are not seeing the same performance gains as the logic blocks / as they used to in the past. :)
 


Read that :)
 
I don't really need to, as I've already read about how analogue blocks have ceased scaling well with node shrinks; it's not that they're no longer shrinking. All the monolithic dies out there (Intel, Nvidia) are proof that you can still shrink analogue blocks. It's not a matter of can't, as what you posted implied (at least to me), it's a matter of there being little to no benefit.

I didn't imply it, I said it: a 14nm analogue block shrank about 50% from 28nm, 3nm analogue is about 10% smaller than 5nm, and beyond that it barely moves at all.

Jensen has talked about this himself.

Nvidia's special sauce to get around it is.... just don't have it. The fact that Jensen stands on stage and says Moore's Law is dead was him warning you that you're going to get a 4nm GPU that's no better than an 8nm GPU for the same money.
 
Is Nvidia going to be monolithic for the foreseeable then?
 
I'm not going to bother after this reply as, like I said, I was being pedantic, but you've got this very, very wrong. 14nm shrank about 50% from 28nm, and 3nm is about 10% smaller than 5nm, regardless of whether you're talking about logic or analogue blocks. That's because a reduction in size has nothing to do with whether a block is analogue or logic; you could manufacture analogue circuits on 10nm and 4nm, and indeed Intel and Nvidia are doing exactly that.

What's started to happen is that when analogue circuits are shrunk, they no longer see the same, or even any, performance improvement from going from something like 6nm to 4nm, improvements that you still get from shrinking a logic circuit. You no longer get the same scaling.

Shrinking logic circuits from 6nm to 4nm gets you (pulling a number out of my ass) a 20% improvement in performance; doing the same with an analogue circuit will get you 2%.
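Taking those admittedly made-up 20% and 2% per-shrink figures at face value, a quick sketch shows how fast the gap compounds over a few node jumps:

```python
# Illustrative only: compound the hypothetical per-shrink gains
# (20% for logic, 2% for analogue) over three node jumps.
logic_gain = 1.0
analogue_gain = 1.0
for _ in range(3):
    logic_gain *= 1.20     # logic: ~20% better per shrink (hypothetical)
    analogue_gain *= 1.02  # analogue: ~2% better per shrink (hypothetical)

print(f"logic:    {logic_gain:.2f}x")     # ~1.73x after three shrinks
print(f"analogue: {analogue_gain:.2f}x")  # ~1.06x after three shrinks
```

So even with toy numbers, after three shrinks the logic has gained ~73% while the analogue has gained ~6%, which is why the scaling argument cuts one way for logic and another for analogue.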
 
BTW not sure this has been covered.


Worst in over 20 years.

I would imagine that's why they don't want to release another card no one is going to buy. There are already plenty of them sitting out there gathering dust. It's also why they're slashing prices on what's already made and out there, rather than just introducing more and more.

See also: the 3090 Ti getting cancelled, and all the other recent stories on GPUs not selling.
 
They could use AI to tell them why they're not selling. ;)

Not enough multibuy deals, obv.

Are we getting the 7800 XT / 7900 GRE worldwide, or not? :o Articles seem to say both things.

"Although AMD's Radeon RX 7900 GRE was expected to be a product available exclusively in China, yesterday AMD published its recommended prices for the U.S., which means that the product could be coming to North America, too. As it turns out, the Radeon RX 7900 Golden Rabbit Edition can be obtained in Germany, but (for now?) only inside a gaming PC assembled by Memory PC. "

Maybe it's like the 7500F, full DIY in China, but only OEM everywhere else?
 