Who's TCL?
Considering that I can't even go to Game or the rainforest and order a PS5, why would they want to release a mid-gen refresh already?
Who is seriously trying to target 8K?
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
I meant in reference to gaming. Also, you didn't tell me who TCL is.
Do We Actually Need PS5 Pro / 'Xbox Series Next' Enhanced Consoles This Generation?
OK, so is it safe to say that for the same 300 watts we get 50% higher performance?
Have I got that right?
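The arithmetic behind that question is simple enough to sketch: if perf/watt improves by 50% and the power budget stays at 300 W, raw performance goes up by the same 50%; cut the TDP and the gain shrinks proportionally. A minimal sketch (the 300 W baseline and 50% figure are from the thread, everything else is illustrative):

```python
def relative_performance(new_watts, old_watts=300, perf_per_watt_gain=0.5):
    """Performance relative to the old part, assuming the claimed perf/watt uplift
    applies uniformly across the power range (a simplification)."""
    return (new_watts / old_watts) * (1 + perf_per_watt_gain)

print(round(relative_performance(300), 3))  # 1.5 -> 50% faster at the same 300 W
print(round(relative_performance(200), 3))  # 1.0 -> same performance at only 200 W
```

In practice perf/watt isn't constant across the voltage/frequency curve, so treat this as an upper-bound estimate rather than a prediction.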
Isn't this what HBM is? It's more or less dead in non-datacentre applications. Chiplets allow to put away stuff that doesn't need a cutting edge process like I/O. This increases the yield of the expensive large and performance defining part by making it smaller overall.
Yeah, I mean pre-Zen and pre-RDNA, AMD was brute-forcing their CPUs/GPUs to keep up with the competition, and failing.
It seems now that Nvidia is pushing their silicon harder, as is Intel with their CPUs, but they are NOT failing. They just want those 10 FPS that put them at the top of the bar graphs.
So AMD will have to do the same. Zen 4 will go to 170 watts, up from Zen 3's 105 watts, and RDNA 3 will go up from 300 watts, who knows to what, but there is talk of 450 to 600 watts for Nvidia's RTX 4090. So I wouldn't be surprised if, going forward, any high-end GPU is at least 400 watts, because no one wants to risk being left behind by the competition.
Isn't that great when all our energy costs are sky-rocketing? It's going to cost 50p an hour to run these things.
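Back-of-envelope, assuming a hypothetical UK tariff of about 34p/kWh (the figure isn't from this thread, it's roughly the 2022 price cap), the hourly cost works out like this:

```python
def cost_per_hour_pence(watts, pence_per_kwh=34.0):
    """Electricity cost in pence for one hour at a given sustained draw.
    Tariff is a hypothetical placeholder; plug in your own rate."""
    return watts / 1000.0 * pence_per_kwh

print(round(cost_per_hour_pence(600), 1))   # 20.4 -> a 600 W GPU alone
print(round(cost_per_hour_pence(1000), 1))  # 34.0 -> a whole ~1 kW rig
```

So 50p an hour is pessimistic for the GPU alone, but a full high-end system plus monitor under sustained load isn't far off it at current prices.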
Will you be buying AMD though Grim?
"gaming segment down" charge less then.
Rumours are that wafers are at least $20,000 now. TSMC cashing in.
Woah, woah.
So they cut corners and lowered the Infinity Cache?
You've priced up the materials but forgotten about labour and operating costs. AMD spends $2.85 billion per year on R&D. I don't know how many years it took to design RDNA, but as an example, Raja Koduri started work on Arc in 2017, so up to 5 years. That's many billions of dollars in design costs which need to be recovered in GPU sales before they even turn a profit. AMD's head office in Santa Clara, California has 18,924 employees, so that's probably close to a billion dollars per year just to keep the office running.
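For what it's worth, the "close to a billion" figure roughly checks out if you assume a hypothetical fully-loaded cost of around $50k per employee (that per-head figure is my assumption, not anything AMD publishes):

```python
employees = 18_924                     # headcount figure from the post above
avg_cost_per_employee_usd = 50_000     # hypothetical fully-loaded cost, my assumption
annual_cost = employees * avg_cost_per_employee_usd
print(annual_cost)  # 946200000 -> roughly a billion dollars per year
```

Real fully-loaded costs for Silicon Valley engineers are likely well above $50k, so if anything the estimate is conservative.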
If they get +50% performance per watt (as they told their investors), they should do well, unless they massively cut TDPs.
For VR, I prefer the frame timing of my 6800XT over my 3080Ti, but the 3080Ti has just enough grunt to run my sims at night with lowered settings and the 6800XT can't quite do it.
The 6800XT basically "feels" better in VR up to the point where it can't hold 90fps, then it's pretty bad. (I think Nvidia's reprojection works better so the 3080Ti remains playable when it starts to struggle)
Another 50% (for either card) should give me the grunt to run at night. If RDNA3 maintains the same solid frame pacing, and the price is good, AMD could get another sale from me.