
AMD to launch 300W GPU with High-Bandwidth-Memory


http://www.tweaktown.com/reviews/68...0-4gb-strix-oc-video-card-review/index16.html

Yeah, at default settings the 980 is indeed more power efficient than its predecessors, and more so than the 290X. Once you add an overclock or really stress the GPU @ stock, that low TDP goes out the window :D

I've got a GTX 980 which sits @ 1500 core all day long for gaming, and it pulls about as much from the wall as my old Titan Black did at about 1200 core.

I just get whichever card is fastest, and atm that is the 980. I don't really consider the TDP in broad terms because I'm not gonna be running at stock, if you see what I mean.

If the 300W AMD card is faster than the 980 then I'll grab that, and vice versa if it's GM200. Once a single card can handily cope with 1440P, then I'll pick whichever uses the least power at that performance level. We're just not there yet. Probably 2016 before a power-efficient midrange card does really well @ 1440P, imho. Then I can settle for a lower-tier card that doesn't need as much juice. (Unless I move to 4K :p)



+1, I won't mention it again. Just answering his question.

Interesting you picked that review and not these (for example, just from googling reviews of a 980 Gaming edition):

http://www.techpowerup.com/reviews/MSI/GTX_980_Gaming/23.html
http://hexus.net/tech/reviews/graphics/75689-msi-geforce-gtx-980-gaming-4g/?page=9
http://www.overclockersclub.com/reviews/msi_gtx980_gaming_4g/15.htm

All showing the 980 (even a factory-overclocked one) to be vastly better in power consumption than a 290X.
 
Great cost?? Like £50 on a £400 card?? PLEASE... And from what I've seen, air itself works very well too. It's a win-win situation.
It's still better than the crappy AIO water cooling that we got on the 295X...

That's still quite a lot though, mate. I'm sure that £50 could be better spent on things that benefit the majority of users.

The other issue is that it would screw over their board partners if the reference cooler was too good. I know the 295X2 was like that, but that's an uber-extreme card; if they started doing it on the normal high-end cards, then I don't think Sapphire and the like would be happy at all.

But I definitely agree that they need to do something with their reference cooler. It needs to catch up with Nvidia's one, really.
 
It makes me wonder, with the R9 300 series launching in 2015, the GM200-based GPUs in the next month or two, the GTX 960 in a week or so, and so on, whether we will still be having 28nm GPUs well into 2016?

You would expect at least a one-year shelf life for all these launches, maybe a bit longer??

It's annoying that AMD has not even got a full new midrange series out now - the R9 285 is not even fully enabled either, and the R9 280, R9 280X, R9 270X, R9 270 and R7 265 are around three years old too.

They are literally just helping Nvidia.
 

Lol, yeah, it really is time to move on. Fingers crossed we get a die shrink by late summer this year, if not very early 2016. Hopefully transitions to other nodes won't be as difficult as moving from 28nm has been, i.e. I don't want to move from 28nm only to get stuck at 16nm/14nm for another 3 years :p

Hopefully this is just a bump in the road and things will move along faster... The race for 4K might dictate a faster pace. I hope so, anyway.
 

It's more the case that if the high-end market flounders in performance, so does the sub-£200 market.

It's just stagnation ATM - sure, power consumption is improving, but that's not going to help me with the next-generation titles which are coming out.

It seems both companies are concentrating more and more on mobile, which means more and more desktop chips, especially those under £200, are just pre-overclocked mobile chips.

I would rather they kept power consumption the same as the previous generation and used the performance/watt improvements to just boost performance overall.

The problem is that both companies would rather milk customers more and push gamers into spending £200+ on graphics cards.

Edit!!

The worst thing is that most cards under £200 can easily be run off a 450W PSU anyway, especially if you own an Intel CPU.

My system with a Xeon E3 1230 V2, a GTX 660 and multiple drives draws at most around 200W at the wall when gaming.

A mate with a Xeon E3 1230 V3 and an R9 280 sees less than 300W at the wall too.
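
For anyone who wants to sanity-check the 450W claim, here's a rough back-of-envelope in Python. The TDP and efficiency figures below are ballpark assumptions of mine, not measured values:

[code]
# Rough sanity check: can a sub-£200 card system live on a 450W PSU?
# All figures are nominal ballpark assumptions, not measurements.
components = {
    "Xeon E3 (84W-class quad core)": 84,
    "GTX 660 (140W TDP class)": 140,
    "Motherboard + RAM": 40,
    "Drives + fans": 25,
}

dc_load = sum(components.values())      # worst case the PSU must deliver
psu_efficiency = 0.88                   # assumed for an 80 Plus-ish unit
wall_draw = dc_load / psu_efficiency    # what a meter at the wall would see

print(f"Estimated worst-case DC load: {dc_load} W")
print(f"Estimated at the wall:        {wall_draw:.0f} W")
print(f"Headroom on a 450W PSU:       {450 - dc_load} W")
[/code]

Even assuming every part runs flat out at once (which gaming loads don't), that lands under 300W DC, which is why the measured ~200W wall figures above are entirely plausible on a 450W unit.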
 
4K screens are becoming more attainable, so the demand for GPUs that can game @ 4K will rise and drive the market.

We've stagnated at 1080P for too long, imho. Cards from two gens ago can cope with this res. It's time to move forward: next-gen screens and next-gen cards to power them.

I bet AMD / Nvidia are more than happy to churn out another round of 28nm parts. The savings for them must be lovely, and people are buying them in droves as well. The 970 / 980 have sold a million since Sept, so the market is def ready for something even better...

28nm is a bump in the road, a big one, and I'm not too sure if AMD / Nvidia are even in a rush to move on... Once real competition kicks off @ 4K, performance improvements will be massive again, just like when we went from 1280 x 1024 to 1920 x 1080.
 
4K requires 2 things:

- Monstrous fill rate
- Monstrous bandwidth

With the 640GB/s that's rumoured, the card should have the bandwidth for 4K, especially if AMD have implemented the new colour compression that the 285 has. Fill rate will be interesting too!
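
To put some rough numbers on that, here's a back-of-envelope sketch in Python. The bytes-per-pixel-per-frame figure is purely an illustrative assumption (real memory traffic varies hugely with overdraw, AA and compression):

[code]
# Back-of-envelope: how pixel counts, minimum fill rate and raw memory
# traffic scale with resolution. bytes_per_pixel is a crude assumption
# for illustration only; real workloads vary enormously.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}
fps = 60
bytes_per_pixel = 100  # assumed average traffic per pixel per frame

for name, (w, h) in resolutions.items():
    pixels = w * h
    fill = pixels * fps / 1e9                       # GPix/s just to hit 60fps
    traffic = pixels * fps * bytes_per_pixel / 1e9  # GB/s under the assumption
    print(f"{name}: {pixels/1e6:5.1f} MP, {fill:.2f} GPix/s, ~{traffic:.0f} GB/s")
[/code]

4K is exactly four times the pixels of 1080P, and 8K four times that again, so fill rate and bandwidth requirements quadruple at each step - which is why 640GB/s plus the 285-style colour compression would matter so much.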
 

I just wonder what will be needed for 8K down the line.
 
How many people really care about 4K, though? We know AMD will be aiming for it, but what percentage of the population is actually going to buy a 4K monitor?

I still can't understand the point of 4K TVs when there is nothing to watch on them. (Separate grievance.)

I bought a 4K TV specifically to play 4K PC games...

It has HDMI 2.0 and DisplayPort 1.2, so it does UHD @ 60Hz, and the response time is really low as well.
 
Really? See the average and peak numbers at http://www.techpowerup.com/reviews/MSI/GTX_780_Ti_Gaming/22.html - those represent typical gaming and peak loads: 229W vs 236W average, and 269W vs 271W peak. The only use where it is a lot more is video playback, where it runs a stupidly high clock which isn't needed.

Here's another link: http://www.guru3d.com/articles_pages/asus_geforce_gtx_780_ti_matrix_review,7.html - 262W vs 286W

...and another: http://www.anandtech.com/show/7492/the-geforce-gtx-780-ti-review/15 - 372W vs 375W

In reality the 780 Ti used only 15-30W less than the 290X, and was only 10C cooler, and only when comparing reference to reference (with the crappy AMD cooler). The custom cards ran at about the same temps. So the amazing and mighty 780 Ti was just a tiny, tiny tad better than the hot, power-hungry, monstrous 290X.

Custom cards:
780Ti gaming review: http://www.techpowerup.com/reviews/MSI/GTX_780_Ti_Gaming/1.html 230-278W consumption, 30-78C temp

290X gaming review: http://www.techpowerup.com/reviews/MSI/R9_290X_Gaming/1.html 231-263W consumption, 40-79C temp.

So with all respect, I don't think you know better than 95% of the review sites.
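
Quick Python sketch turning the figures quoted from those reviews into percentages, to show how small the gap actually is:

[code]
# Percentage gaps between the 780 Ti and 290X power figures quoted above.
figures = {
    "TechPowerUp average": (229, 236),
    "TechPowerUp peak":    (269, 271),
    "Guru3D":              (262, 286),
    "AnandTech":           (372, 375),
}

for site, (gtx_780ti, r9_290x) in figures.items():
    gap = r9_290x - gtx_780ti
    pct = 100 * gap / r9_290x
    print(f"{site}: 780 Ti draws {gap}W ({pct:.1f}%) less")
[/code]

That works out to roughly a 1-8% difference in system power depending on the site, which backs up the "tiny, tiny tad better" point.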

Use of references is frowned upon here, you should rely on common knowledge, advertising blurb and anecdotal evidence at all times, preferably drawn from a friend of a friend! :p
 

Not forgetting smoke screens such as:

- calling the other party a fanboy even if they state they base a decision purely on value for money

- ignoring the difficult-to-refute points and picking up on a tiny, unimportant point to focus on

- personal attacks

- totally incomprehensible responses, such as saying you'll never again buy a company's product owing to the attitude of some people on the board.

Links to established review sites to put across a discussion point, indeed! I've never heard such nonsense.
 
I must be the only one looking at this as a good upgrade for 3D Oculus Rift gaming... Who cares about 4K when you've got VR almost here? And the headset won't cost £600 like a barely decent 4K monitor costs.
Oh wait, the Rift will have an AMOLED display and not more of that **** LCD...
 