NVIDIA ‘Ampere’ 8nm Graphics Cards

Is there a part of the joke that I am missing? :confused:
[attached image]
 
That's going to **** off a lot of 3080 10GB buyers this week if true.

But they'll still have their money! :D

It's not, though. It's really, really not.

EDIT: I thought you were talking about the upside-down 8! :D:D
LMAO! :D

Why would they do that so soon after saying 10GB is plenty for the 3080?
Marketing always talks out of both sides of their mouth. They just wanna sell you more ****. Betcha they'll spin it like "yeah, it's enough, but if you want more and peace of mind then buy the 20 GB, it's a premium product", etc. I'd never underestimate Nvidia's ability to upsell, though.
 
£100 is a dream. GDDR6X is expensive; a 20GB 3080 will be at least £1,000.
Yes, it's not just the memory, it's also the PCB. I wouldn't be shocked if it was £999, and also because at that price ML researchers will buy every single one they can produce. The 3080 is already incredible value outside of some scenarios that simply require tons of VRAM, but with 20GB it's going to be good for all of them.
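
For anyone wondering why "tons of VRAM" matters so much on the ML side: here's a rough back-of-envelope sketch (my own assumptions, not from any article). Training with Adam has to hold the weights, the gradients and two optimiser states per parameter, plus activations, so even a mid-sized model sails past 10GB:

Code:
    # Rough back-of-envelope VRAM estimate for training a model with Adam.
    # All figures below are illustrative assumptions, not measurements.
    def training_vram_gb(n_params, bytes_per_param=4, activation_gb=2.0):
        weights = n_params * bytes_per_param    # model weights (fp32 by default)
        grads = n_params * bytes_per_param      # one gradient per parameter
        adam_states = n_params * 4 * 2          # Adam keeps two fp32 moments per parameter
        return (weights + grads + adam_states) / 1024**3 + activation_gb

    # A ~1.5B-parameter model already blows well past 10 GB:
    print(f"{training_vram_gb(1.5e9):.1f} GB")  # roughly 24 GB under these assumptions

Inference is much lighter since you only hold the weights, which is why the same card can happily run models it could never train.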

The RTX 3080 is currently by far the most cost-efficient card and thus ideal for prototyping. For prototyping, you want the largest memory that is still cheap. By prototyping I mean prototyping in any area: research, competitive Kaggle, hacking on ideas/models for a startup, experimenting with research code. For all these applications, the RTX 3080 is the best GPU.

Suppose I were leading a research lab/startup. I would put 66-80% of my budget into RTX 3080 machines and 20-33% into "rollout" RTX 3090 machines with a robust water-cooling setup. The idea is that the RTX 3080 is much more cost-effective and can be shared via a Slurm cluster setup as prototyping machines. Since prototyping should be done in an agile way, it should be done with smaller models and smaller datasets; the RTX 3080 is perfect for this. Once students/colleagues have a great prototype model, they can roll out the prototype on the RTX 3090 machines and scale up to larger models.
https://timdettmers.com/2020/09/07/which-gpu-for-deep-learning/
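
For what it's worth, the "shared via a Slurm cluster" part is pretty mundane in practice. A minimal sketch of what submitting a prototyping run could look like, assuming a partition called proto-3080 and a train.py script (both names made up for illustration):

Code:
    # Minimal sketch of submitting a prototyping job to a shared Slurm cluster.
    # The partition name "proto-3080" and train.py are hypothetical placeholders.
    import subprocess
    import textwrap

    job_script = textwrap.dedent("""\
        #!/bin/bash
        #SBATCH --partition=proto-3080   # pool of shared RTX 3080 prototyping boxes (assumed name)
        #SBATCH --gres=gpu:1             # one GPU per prototyping run
        #SBATCH --time=02:00:00          # keep prototyping runs short and agile
        python train.py --model small --dataset subset
    """)

    with open("proto_job.sh", "w") as f:
        f.write(job_script)

    # sbatch is the standard Slurm submission command
    subprocess.run(["sbatch", "proto_job.sh"], check=True)

Once something works, the same job just gets pointed at the bigger-memory machines with a larger model and dataset.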
 
Exactly, just you mark my words. The 3090 buyers will be spewing when Nvidia releases the 3090 Ti or the real Titan.
"BUT WE THOUGHT IT WOZ THE BEST 1!!"
"Nvidia: laughs in marketing."

I really don't understand this line of thinking. GPU makers always have mid-gen refreshes, Tis, Supers, etc. A new card coming out doesn't suddenly render your card completely useless, and the people who always upgrade every five minutes will still upgrade.

If I do end up buying a 3090, I'll be doing it knowing full well that a cheaper 3080 Ti or a new 3090 Ti is a very real possibility, but I won't care either way.
 