Seems like a complete waste of time to release these chips.
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
If you disable the P cores and then disable the E cores, memory latency changes so much that you cannot even measure it!
So if you disable the P cores, memory latency significantly decreases? What if you disable the E cores and leave the P cores on, same result?
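Memory latency is usually measured with a pointer-chasing loop, where each load depends on the previous one so the prefetchers can't hide anything. A minimal sketch in Python; interpreter overhead badly inflates the absolute numbers, so treat it as illustrating the technique rather than a real measurement (native tools like Intel MLC or AIDA64 do the same thing properly):

```python
import random
import time

def pointer_chase_latency(n=1_000_000, hops=1_000_000):
    # Build a random cyclic permutation: next_idx[i] points to the next
    # element to visit, so every access depends on the previous one and
    # the hardware prefetcher cannot predict the pattern.
    order = list(range(n))
    random.shuffle(order)
    next_idx = [0] * n
    for a, b in zip(order, order[1:]):
        next_idx[a] = b
    next_idx[order[-1]] = order[0]

    i = 0
    start = time.perf_counter()
    for _ in range(hops):
        i = next_idx[i]
    elapsed = time.perf_counter() - start
    return elapsed / hops * 1e9  # nanoseconds per dependent access

print(f"{pointer_chase_latency():.1f} ns per hop")
```

Run it twice with different core configurations (affinity masks, or cores disabled in the BIOS) and compare the two numbers relatively rather than reading them as absolute DRAM latency.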
Steve from Gamers Nexus found the same thing. He tested the 285K on Windows 11 23H2 as it was the most stable and gave the most consistent results (although they were still consistently disappointing). On 24H2, one of the gaming results on the high-power profile was about 25% lower than on balanced.
Yeah, scheduling is completely screwed. There may be some hope at least for the worst results.
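While the scheduler is misbehaving, one workaround is to pin the game to the P-cores yourself with an affinity mask. A rough sketch, assuming the P-cores are enumerated first (on a 285K they appear as logical CPUs 0-7); `os.sched_setaffinity` is Linux-only, so Windows users would use Task Manager's "Set affinity" or `psutil.Process().cpu_affinity()` instead:

```python
import os

# Assumption: P-cores are enumerated first (logical CPUs 0-7 on a
# 285K); capped at the machine's core count so the sketch runs anywhere.
p_cores = set(range(min(8, os.cpu_count())))

# Pin the current process (pid 0 = self) to the P-cores only.
os.sched_setaffinity(0, p_cores)
print(sorted(os.sched_getaffinity(0)))
```

Pass a game's pid instead of 0 to pin it from outside; the mask is inherited by any threads the process spawns afterwards.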
Will he? Anyone with a high-spec PC is going to have a GPU which makes any CPU power savings look insignificant.
Right now my 4090 is pulling around 40+ watts sitting idle.
I guess my 2 desktop speakers with powered amps are using quite a bit just from being on.
My ultrawide monitor is probably using twice as many watts as a standard aspect ratio one.
I wonder how many extra watts a monitor uses in HDR and at the highest refresh rate.
Probably more than the difference in any CPU; I'm not about to lower them or the brightness to save a few pence an hour.
It's basically insignificant in the grand scheme of things, as long as you don't have heat issues.
Turin vs Granite Rapids
If they had kept to using Intel foundries, it probably would have been.
Let's not pretend this Arrow Lake isn't the net result of those management decisions to go through round after round of job cuts. All that experience and talent thrown out the window to improve the bottom line has come back to bite them.
Unless the server-side chips offer up some compelling performance and walk all over EPYC, I think Pat Gelsinger could be on his way out in the next 12 months. He's been CEO since Rocket Lake, and this was his second crack of the whip to get Intel back into the game; frankly, it's not good enough for regular consumers imo.
Seems like a complete waste of time to release these chips.
I find it's been the same with the past 3 or 4 gens of CPUs from both players.
They're so busy trying to keep up with each other release-wise that they're throwing anything out without a really huge performance uplift, instead of leaving 2-3 years between generations and actually giving us HUGE upgrades.
I get it, they're a business and want to keep the money coming in, but my god it's gotten very stale.
I also don't like how they keep shrinking the process node smaller and smaller. I think this is why modern CPUs run so hot now: the heat is that concentrated. I also believe that degradation will happen much, much faster with modern processors, but then again, is that part of the plan?
(Maybe on the last part I'm out of touch and don't know enough.)
I wouldn't say that. I really like the way AMD have gone with 3D cache: efficiency is crazy (admittedly because they had to limit TDP) and the performance boost is awesome in some titles, decent in others.
I hope not; you can normally trust a CPU to keep going until it's obsolete. I wouldn't be surprised if the memory controllers degrade, though.
Alternatively they'll require water cooling as standard as thermals increase, if they keep increasing that is.
1080 Ti heatsink weighs 1.2 kilograms
4090 heatsink weighs 2.0 kilograms
The heatsink required to cool a GPU has nearly doubled in just 5 years. At this rate, the RTX 7090 heatsink will weigh nearly 4.0 kilograms.
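The "at this rate" projection is just compound growth; a quick sketch of the arithmetic, where the ~2-year generation cadence is my assumption:

```python
# Figures from the post: 1080 Ti (2017) heatsink ~1.2 kg,
# RTX 4090 (2022) ~2.0 kg.
ratio = 2.0 / 1.2           # ~1.67x growth over the 5-year span
yearly = ratio ** (1 / 5)   # compound growth per year

# RTX 7090: three generations after the 4090, so roughly 6 years out
# (assuming Nvidia's ~2-year cadence holds).
projection = 2.0 * yearly ** 6
print(f"{projection:.1f} kg")  # ~3.7 kg, i.e. "nearly 4 kilograms"
```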
Intel have been doing the same. Instead of innovating, Intel just increased the power requirements each gen, it seems, clocking them to within an inch of their life (or an inch beyond their life in a lot of instances).
Do you think that's okay? Would you have been happy if the next gen needed more power than 14th gen? Then even more power for the gen after that? I think Intel's current CPUs are a step in the right direction.
Nvidia needs to do what Intel and AMD are doing right now. I would be happy if the RTX 5090 used 60% less power than the 4090 but was only 5% slower. Then for the RTX 6090 to have a HUGE 70% performance improvement with no increase in power. That would impress me.
This is just lack of experience or research tbh; you need to be smarter with your choice of components if you want a powerful yet silent system (that still runs cool).
This was just luck really, I had no idea what the noise levels were going to be like.
Why do people care about efficiency? Run GPU-Z and watch how many watts the GPU is pulling with different settings in games.
Worrying about 50 watts or whatever... the battle is literally in your mind.
Worrying about efficiency but probably running about 50 watts of RGB lighting; where's the logic?
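For scale, the 50 watts being argued over really is pennies. A quick sketch, assuming a UK tariff of roughly 25p/kWh (an assumption; check your own rate):

```python
PRICE_P_PER_KWH = 25.0  # assumed UK unit rate in pence per kWh

def cost_pence(watts, hours):
    # Energy in kWh times the unit rate, returned in pence.
    return watts / 1000 * hours * PRICE_P_PER_KWH

per_hour = cost_pence(50, 1)         # extra 50 W for one hour
per_year = cost_pence(50, 4 * 365)   # 4 hours of gaming a day for a year
print(f"{per_hour:.2f} p/hour, £{per_year / 100:.2f}/year")
```

At these assumptions the 50 W delta costs about 1.25p an hour, or around £18 a year at four hours a day, which is the "few pence an hour" mentioned above.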