NVIDIA 4000 Series

Caporegime
Joined
18 Oct 2002
Posts
29,812
Seems like AMD are heading down the higher-power route too for their 7000 series GPUs.


 
Soldato
Joined
6 Aug 2009
Posts
7,071
The problem with power usage going up is that the die shrinks are not yielding the extra performance like they used to.
Yes, starting to get to where CPUs were. Obviously they're going to have to get smarter in the next few generations. Looking forward to seeing how they tackle it.
 
Soldato
Joined
28 Oct 2009
Posts
5,291
Location
Earth
AMD says that because Nvidia is pushing higher and higher power limits, they have to follow or get left behind. In a world where it's becoming harder to get more performance, yet the demand for performance keeps growing, something has to give, and that's power.
 
Soldato
Joined
6 Aug 2009
Posts
7,071
AMD says that because Nvidia is pushing higher and higher power limits, they have to follow or get left behind. In a world where it's becoming harder to get more performance, yet the demand for performance keeps growing, something has to give, and that's power.
Yes, seems to go in cycles; we're in a pushing-power one right now.
 
Associate
Joined
27 Jun 2022
Posts
2
Location
2 Frederick Street Kings Cross London WC1X 0ND GB
Previously, we have gathered plenty of information from the Twitter user kopite7kime about the specs of the RTX 4000 series GPUs.
The RTX 4000 cards are based on Nvidia’s “Ada Lovelace” architecture.
We think that this year, with their 4000 series cards, Nvidia will face tight competition from both AMD and Intel, because AMD’s 7000 series cards and Intel’s Arc Alchemist GPUs are expected to launch at the same time.
So this time, the war between the prime graphics card manufacturers is real in 2022.
 
Last edited by a moderator:
Soldato
Joined
6 Feb 2019
Posts
17,566
Someone needs to make a case where the whole case is one giant heatsink to keep these cards cool.

There already are passive cases like this, but they are not particularly effective, because passive cooling, even if the whole case is a heatsink, is still worse than a heatsink with a fan on it, like a Noctua.

And if you tried to build a case that's a giant actively cooled heatsink, to prevent hotspots you'd need to ensure there's airflow over the entire case, and that would be 1) complicated and expensive, 2) you'd have air coming in/out 360 degrees around the case, and the user may not like air blowing on them if the case is on the desk, and 3) it would require a lot of fans, so the case would be heavy and expensive.
 
Last edited:
Caporegime
Joined
8 Sep 2005
Posts
27,421
Location
Utopia
Previously, we have gathered plenty of information from the Twitter user kopite7kime about the specs of the RTX 4000 series GPUs.
The RTX 4000 cards are based on Nvidia’s “Ada Lovelace” architecture.
We think that this year, with their 4000 series cards, Nvidia will face tight competition from both AMD and Intel, because AMD’s 7000 series cards and Intel’s Arc Alchemist GPUs are expected to launch at the same time.
So this time, the war between the prime graphics card manufacturers is real in 2022.
Umm thanks for that generic little copy pasta from wherever that is from.
 
Last edited by a moderator:
Soldato
Joined
11 Jul 2003
Posts
7,576
Location
Telford//west mids
Previously, we have gathered plenty of information from the Twitter user kopite7kime about the specs of the RTX 4000 series GPUs.
The RTX 4000 cards are based on Nvidia’s “Ada Lovelace” architecture.
We think that this year, with their 4000 series cards, Nvidia will face tight competition from both AMD and Intel, because AMD’s 7000 series cards and Intel’s Arc Alchemist GPUs are expected to launch at the same time.
So this time, the war between the prime graphics card manufacturers is real in 2022.
Remove your address, mate. Everybody can see it.
 
Last edited by a moderator:
Associate
Joined
29 Jun 2016
Posts
529
Yes, seems to go in cycles; we're in a pushing-power one right now.
Hard to hate either Nvidia or AMD for increasing the power envelope. They will wrangle as much processing power out of every single transistor to maximise profits, and beyond that, if there is demand for more processing power, they can only throw more transistors and energy at the problem.

Keep in mind the power envelope will increase only at the premium end of things. The low-end/low-power segments are advancing rapidly. Just look at the performance of the Ryzen 6000 APUs.
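The "throw more transistors and energy at the problem" point maps onto the textbook CMOS dynamic-power approximation, P ≈ C·V²·f. A minimal sketch (illustrative numbers only, not measured figures for any real GPU):

```python
def dynamic_power(capacitance, voltage, frequency):
    """Classic CMOS dynamic-power approximation: P = C * V^2 * f."""
    return capacitance * voltage ** 2 * frequency

# Roughly doubling the switched capacitance (more transistors doing work)
# doubles power at a fixed voltage and clock.
base = dynamic_power(capacitance=1.0, voltage=1.0, frequency=1.0)
wider = dynamic_power(capacitance=2.0, voltage=1.0, frequency=1.0)

# Small voltage and frequency bumps compound, since V enters squared.
pushed = dynamic_power(capacitance=1.0, voltage=1.1, frequency=1.15)

print(f"base={base:.2f}, wider chip={wider:.2f}, clocked higher={pushed:.2f}")
```

This is why chasing clocks at the top end costs disproportionate power, while the low-power parts mentioned above can keep improving perf per watt.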
 

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,524
Location
Greater London
Hard to hate either Nvidia or AMD for increasing the power envelope. They will wrangle as much processing power out of every single transistor to maximise profits, and beyond that, if there is demand for more processing power, they can only throw more transistors and energy at the problem.

Keep in mind the power envelope will increase only at the premium end of things. The low-end/low-power segments are advancing rapidly. Just look at the performance of the Ryzen 6000 APUs.

Yep. Plus one can still power limit or fps cap games. It’s what I do. Then the power is there and available for demanding games like Cyberpunk 2077.

Also, even though overall power requirements might go up, the card will still be more efficient than previous-gen cards. For example, say one gets a 4080 and power limits it to output the same fps as a 3080; the 4080 should be drawing fewer watts for the same amount of performance.
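The efficiency claim can be sanity-checked with simple perf-per-watt arithmetic. A minimal sketch with made-up numbers (the fps and wattage figures below are hypothetical, not benchmarks of any actual 3080 or 4080):

```python
def perf_per_watt(fps, watts):
    """Frames per second delivered per watt drawn."""
    return fps / watts

# Purely illustrative figures: both cards capped to the same frame rate,
# with the newer card power-limited to hit it at a lower board power.
last_gen = perf_per_watt(fps=100, watts=320)
power_limited_new = perf_per_watt(fps=100, watts=220)

print(f"last gen: {last_gen:.3f} fps/W, new (power limited): {power_limited_new:.3f} fps/W")
```

Same frame rate, fewer watts, so the fps/W figure goes up; the higher stock power limit only matters if you let the card use it.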
 
Caporegime
Joined
4 Jun 2009
Posts
31,017
Yep. Plus one can still power limit or fps cap games. It’s what I do. Then the power is there and available for demanding games like Cyberpunk 2077.

Also, even though overall power requirements might go up, the card will still be more efficient than previous-gen cards. For example, say one gets a 4080 and power limits it to output the same fps as a 3080; the 4080 should be drawing fewer watts for the same amount of performance.
Stop using logic! :cry:

Quite funny on the AMD reddit: people were giving out about the Nvidia rumours of crazy power consumption, then when the recent news confirmed RDNA 3 will use more power it was "oh that's ok, as long as the performance is there it will still be more efficient" :cry:
 