Did someone hack 4k8k? He's... he's been sensible lately :O WHAT HAPPENED?
Even a broken clock's right twice a day...
Looks like a fake, smells like a fake, it's fake.
This is from a while ago too. 30% faster than the 2080 Ti. NVIDIA late again, and hot and loud.
But you're right, I think the 30x0 series and its pricing are the biggest evidence that AMD has something.
#poorraja #leakersdream
Nah, he'd have leaked his identity by now if he was
It's not. Even the developer commented on it. The kicker? The GPU was driven by a 4800H mobile chip. You'd better strap in.
There's another rumour going about that AMD are "leaking" conflicting information in order to track down where their leaks are coming from... lol
Source? Or is this a rumour?
Strap-on received.
Based on AdoredTV's video, this stuff is decided as soon as they start designing these chips.
This was also confirmed on a recent MLID Broken Silicon episode with SemiWiki founder Daniel Nenni. He went into great length about how designs are node-specific and you can't jump between them at a technical level, plus there are many NDAs involved to keep a given foundry's fab secrets locked away. It's also prohibitively expensive to develop for two nodes simultaneously, so Samsung 8nm was decided a long time ago for the gaming line. It also pours cold water on the notion of a Super refresh later on with TSMC 7nm. Nvidia certainly have the money and capacity to start over and turn around quickly (the top A100 is on TSMC 7nm already, so there's a head start), but it's unlikely.
As an aside, Nenni also explained why the rumoured clock speeds of Ampere dropped as time went on, which also applies to why 5GHz Zen 2 CPUs never materialised. Essentially, early engineering samples for development work are always golden samples, because that's just what's coming off the wafer when dev yields are so low. As development continues and yields increase, you simply make more chips, get a broader idea of what clocks are going to be, and those average clocks (and therefore your final specs) always come down.
It was a really good episode.
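That golden-sample point is easier to see with a toy model. A minimal sketch below, with made-up thresholds and MHz figures (none of this is from the episode): each die gets a random quality score, only dies above a defect threshold work, and attainable clock tracks quality, so when yields are low the few working dies are all top-of-the-curve.

```python
# Toy model of "early engineering samples are golden samples".
# Assumptions (mine, not from the podcast): die quality is normally
# distributed, a die only works if quality clears a threshold, and
# attainable clock scales linearly with quality.
import random

random.seed(42)

BASE_CLOCK = 1700      # MHz, arbitrary baseline
MHZ_PER_SIGMA = 150    # MHz gained per unit of quality, arbitrary

def sample_working_dies(yield_threshold, n=100_000):
    """Return attainable clocks of the dies that pass the threshold."""
    clocks = []
    for _ in range(n):
        quality = random.gauss(0.0, 1.0)
        if quality > yield_threshold:          # die is functional
            clocks.append(BASE_CLOCK + MHZ_PER_SIGMA * quality)
    return clocks

for label, threshold in [("early (low yield)", 2.0),    # ~2% of dies work
                         ("mid", 1.0),                   # ~16% work
                         ("mature (high yield)", 0.0)]:  # ~50% work
    clocks = sample_working_dies(threshold)
    avg = sum(clocks) / len(clocks)
    print(f"{label:>20}: {len(clocks):6d} good dies, avg clock {avg:.0f} MHz")
```

The average clock of the working dies drops as the threshold relaxes, which is exactly the "final specs always come down" effect.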
What's the source of this rumour?
It's rumoured to be a leaker.
I will watch it.
Was literally just reading this and about to reply.
https://www.youtube.com/watch?v=O4DgXtxkZNg for the episode; I think the specific part is 57 minutes in, but there are timestamps and chapters in the description.
People looking at the 3080 just as a way to get a faster, cheaper 2080 Ti are missing the forest AND the trees.
It's 2 years later and Nvidia have to launch a new GPU. Let's say the 3080 is actually the 3080 Ti and sells for $1200 again, and the 3090 is instead the Ampere Titan and sells for $2999. How the f are they going to sell a 15% faster 2080 Ti for the same price, while bringing almost no new features to the table? At least with the 2080 Ti they could make the b.s. claim of adding RTX and sacrificing the gains of a generation in order to advance rendering techniques. Okay, cool, then what? Can't do the same trick twice. And now they have both the consoles and RDNA 2 breathing down their necks, and they'd have a launch even ******** than Turing? LOL! Nvidia COULDN'T have sold you this 3080 for the same price as the 2080 Ti because it's just 15% faster w/ OC! It's not out of the goodness in their hearts, it's because it's a pathetic increase! Even for "$699" it's not a steal, because that's just what a flagship is supposed to cost as a baseline (and this isn't the flagship). So it's only looking somewhat appetising because Turing has been so god awful.
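Quick perf-per-dollar sanity check on that point. The prices and the ~30% uplift below are just assumed round numbers, swap in your own:

```python
# Rough perf-per-dollar comparison vs the 2080 Ti at two launch prices.
# Assumed figures: 2080 Ti FE at $1199, ~30% stock uplift for the 3080.
PRICE_2080TI = 1199
UPLIFT = 1.30  # 3080 performance relative to 2080 Ti

def perf_per_dollar_gain(new_price):
    """How much better value the new card is than a 2080 Ti at $1199."""
    return UPLIFT * PRICE_2080TI / new_price

print(f"at $699:  {perf_per_dollar_gain(699):.2f}x perf/$ vs 2080 Ti")
print(f"at $1200: {perf_per_dollar_gain(1200):.2f}x perf/$ vs 2080 Ti")
# ~2.2x at $699, only ~1.3x if it had launched at 2080 Ti money.
```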
If you look at the context properly, all that Nvidia really delivered is the minimum not-completely-awful product. That's it.
Can AMD top this? Absolutely. Will they? Who knows; at the end of the day it's a choice between beating Nvidia's bottom(line) red or making more money by making more CPUs. I'm sure they have their pride, but I don't know if that will override their greed; they're still a corporation. If they go for 384-bit GDDR6 then it's just gonna be another flatline generation.
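If "flatline" here means memory bandwidth, the back-of-the-envelope maths against the 3080's setup looks like this. The per-pin speeds are my guesses, nothing confirmed:

```python
# Peak memory bandwidth for a hypothetical 384-bit GDDR6 card vs the 3080.
# Assumed per-pin rates: 16 Gbps GDDR6, 19 Gbps GDDR6X.
def peak_bandwidth_gbs(bus_width_bits, gbps_per_pin):
    """Peak bandwidth in GB/s = bus width (bits) * per-pin rate (Gb/s) / 8."""
    return bus_width_bits * gbps_per_pin / 8

print(f"384-bit GDDR6  @ 16 Gbps: {peak_bandwidth_gbs(384, 16):.0f} GB/s")
print(f"320-bit GDDR6X @ 19 Gbps: {peak_bandwidth_gbs(320, 19):.0f} GB/s (3080)")
# 768 GB/s vs 760 GB/s - basically a wash, hence the "flatline" worry.
```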
RTX 3080 is 30%, not 15%, faster than the RTX 2080 Ti and could sell for $1200 no problem; it's Nvidia after all, with its endless base of loyal boys.
25-30% faster at 4K. OC the 2080 Ti and suddenly that gap closes dramatically. A 3080 FE OC is almost useless.
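Rough numbers on how that gap closes once both cards are overclocked. The OC headroom percentages are pure assumption, plug in whatever your own cards manage:

```python
# How the stock gap shrinks when both cards are overclocked.
STOCK_GAP = 1.30        # 3080 ~30% faster than 2080 Ti at stock, 4K (assumed)
OC_GAIN_2080TI = 1.10   # assume ~10% from a good 2080 Ti overclock
OC_GAIN_3080FE = 1.04   # assume ~4% from a 3080 FE overclock

oc_gap = STOCK_GAP * OC_GAIN_3080FE / OC_GAIN_2080TI
print(f"gap after both OC: {100 * (oc_gap - 1):.0f}%")  # ~23% with these guesses
```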
Nobody knew whether it was going to be a 3070, 3080 and 3090 until it was leaked to high heaven. And even then, Nvidia didn't confirm it until the actual launch event...
Why would AMD do any different?