
AMD Radeon R9 290X with Hawaii GPU pictured, has 512-bit 4GB Memory

IIRC the TVs can only do 30 Hz?

Whenever I've used a TV as a PC monitor it's been a horrible experience as well. Input lag and all sorts on certain models.

With their post-processing?

Most TVs have a gaming mode to turn it off now, don't have a TV myself though.
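
For reference, the 30 Hz limit mentioned above largely comes down to HDMI 1.4 bandwidth rather than the panels themselves. A rough back-of-envelope sketch (nominal link figures, 8-bit RGB, blanking intervals ignored) shows why 4K at 60 Hz needs HDMI 2.0 or DisplayPort:

```python
# Rough 4K bandwidth check: why HDMI 1.4 tops out at ~30 Hz for 4K.
# Nominal figures, 8-bit RGB (24 bpp), blanking intervals ignored.

HDMI_1_4_DATA_GBPS = 8.16   # 10.2 Gbps TMDS minus 8b/10b encoding overhead
HDMI_2_0_DATA_GBPS = 14.4   # 18 Gbps TMDS minus 8b/10b encoding overhead

def data_rate_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Raw pixel data rate in Gbit/s for a given mode."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

for hz in (30, 60):
    rate = data_rate_gbps(3840, 2160, hz)
    fits_1_4 = rate <= HDMI_1_4_DATA_GBPS
    print(f"4K @ {hz} Hz needs ~{rate:.1f} Gbit/s "
          f"-> {'fits' if fits_1_4 else 'exceeds'} HDMI 1.4")
# 4K @ 30 Hz needs ~6.0 Gbit/s  -> fits HDMI 1.4
# 4K @ 60 Hz needs ~11.9 Gbit/s -> exceeds HDMI 1.4 (needs HDMI 2.0 / DisplayPort)
```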
 
Quick question.

How many people here:

A. Have a 4k monitor?
B. Would be prepared to turn down a currently released game's main graphical settings (excluding AA) after just spending £500+ on a graphics card?

:p

A. Probably none. I can technically afford one (don't tell the wife! :eek:) and would like one but I am not paying over 3k for a monitor. My 3x 1080p IPS screens will last me for a good while yet (unless I see a good buy on 3 x 1440p monitors ... :p)

B. Given I run Eyefinity/Surround on a single card, I am used to turning down settings even on a top-end card. I am more annoyed by certain settings requiring stupid amounts of GPU grunt for no obvious visual benefit *cough* Sleeping Dogs Extreme AA */cough*.
 
Hi I'm Bill Paxton, and I'm here with AMD to tell you about 4K. It's one of the most powerful forces on this earth. Much like the 290X. At 30 frames a second your games will be running at the same speed as my voice. Slightly reduced speed and an annoying whine from my breathing area.
 
requiring stupid amounts of GPU grunt for no obvious visual benefit *cough* Sleeping Dogs Extreme AA */cough*.

Owen wash your mouth out with soap and water!! :eek:

The difference between SSAA Extreme and High is very apparent to me at 1440p. It comes with a large performance hit, though, as every single texture is rendered at high res on Extreme. It's hard on the GPUs. High does a good job, but only a fraction of the textures get the SSAA treatment.

You'll need a dual-card setup before you can even consider Extreme above 1200p.
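
To put a rough number on why the Extreme setting is so heavy: supersampling renders the scene at a multiple of the output resolution and then downsamples. The sketch below is illustrative only; the 2x2 factor is an assumption, not Sleeping Dogs' documented internals:

```python
# Why heavy SSAA hurts: the scene is rendered at a multiple of the output
# resolution and then downsampled. Illustrative numbers only -- the exact
# factor the game uses internally is an assumption here.

def ssaa_pixels(width, height, factor):
    """Pixels shaded per frame with factor x factor supersampling."""
    return width * factor * height * factor

base = ssaa_pixels(2560, 1440, 1)        # plain 1440p: ~3.7 MP/frame
for factor in (1, 2):
    px = ssaa_pixels(2560, 1440, factor)
    print(f"1440p with {factor}x{factor} SSAA: {px/1e6:.1f} MP/frame "
          f"({px/base:.0f}x the shading work)")
# 2x2 SSAA at 1440p shades ~14.7 MP per frame -- more than native 4K (~8.3 MP),
# which is why a single card struggles with the Extreme setting.
```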
 
What really gets me is that they are testing 4K with current-gen games... whilst having to turn down settings, and the frame rates are still horrid.

There is no doubt that once the new consoles are out, next-gen games will run considerably worse than current games do now. I would wager something similar to Crysis 3, as that already looks almost next-gen, but most likely a fair bit worse even than that.

I therefore don't see how these are going to be the cards to promote 4K gaming. I think they will be incapable, even in CrossFire, of doing so.
 
I think next-gen games will run better due to overall improvements in engine design and, hopefully, proper multi-threading eventually, plus lower API overheads with DX11+.

The first ones released, probably not, as they're multi-plat.
 
With their post-processing?

Most TVs have a gaming mode to turn it off now, don't have a TV myself though.

Yeah, some are good, some are bad.
I can see a lot of console users buying them, even at 30 Hz and upscaling.

Maybe they'll get better next year as they get cheaper.
 
All guesswork on my part, but I think we can expect:

Titan level performance, slightly better in some cases at most resolutions. Stock clocks on both.
£430ish price tag
Mild overclocks on the stock cooler; 90°C seems to be the case at 40% fan speed.
"Enthusiasts" will be forced to go water for overclocking.
Titan and the 290X will be on par when overclocked.
Performance increases with mature drivers (read: Mantle-biased games, memory-optimized games).
Generally smoother fps (higher minimum fps), as AMD seems to have concentrated on this after the micro-stutter issues that plagued them previously.

Never Settle bundle within a month's time.

The only relevance of 4K should be supersampling at this point; by the time 4K monitors are common you will be guaranteed to have a new card.
Edit: or, I guess, for the Eyefinity-obsessed.

Most people who like to have the best will replace their cards next year.
 
With their post-processing?

Most TVs have a gaming mode to turn it off now, don't have a TV myself though.

Even in gaming mode, a proper PC monitor is generally better.

There are a few exceptions, like the Sony W6 series TVs, which are brilliant and also have proper 4:4:4 RGB through HDMI.
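
For anyone wondering what the 4:4:4 point means: most TVs accept chroma-subsampled signals (4:2:2 or 4:2:0), which share colour information between neighbouring pixels and blur fine coloured text from a PC. A minimal sketch of the average bits per pixel at 8-bit depth:

```python
# What 4:4:4 vs chroma subsampling means for text sharpness: with 4:2:2 or
# 4:2:0 the display only keeps colour (chroma) for groups of pixels, which
# blurs fine coloured detail from a PC. Average bits per pixel at 8-bit depth:

SUBSAMPLING_BPP = {
    "4:4:4": 8 + 8 + 8,   # full luma + full chroma for every pixel
    "4:2:2": 8 + 4 + 4,   # chroma shared horizontally between 2 pixels
    "4:2:0": 8 + 2 + 2,   # chroma shared across a 2x2 block
}

for mode, bpp in SUBSAMPLING_BPP.items():
    kept = bpp / SUBSAMPLING_BPP["4:4:4"] * 100
    print(f"{mode}: {bpp} bits/pixel on average ({kept:.0f}% of the full signal)")
```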
 
Yeah, some are good, some are bad.
I can see a lot of console users buying them, even at 30 Hz and upscaling.

Maybe they'll get better next year as they get cheaper.

I do all my gaming on my TV (Panasonic 55" VT50), including PC.

It does have a 'Game' mode, but I don't ever use it, to be honest. I've never noticed any problems with input lag, but then I use an Xbox controller and not a mouse.
 
I do all my gaming on my TV (Panasonic 55" VT50), including PC.

It does have a 'Game' mode, but I don't ever use it, to be honest. I've never noticed any problems with input lag, but then I use an Xbox controller and not a mouse.

Yeah, you got a good one :)
I messed around with a Sony 4K on display and the upscaling looked amazing. I think if people can buy something similar next year for around 1k, a lot will bite, especially with the new consoles.

PCs... 30 Hz... not so sure :)
 
PC gaming on a TV is a novelty. It makes you appreciate games with a controller more than their console counterparts, as it obviously runs so much smoother and crisper, but I miss having a proper monitor. I need to order my 1440p one.
 
Yeah, you got a good one :)
I messed around with a Sony 4K on display and the upscaling looked amazing. I think if people can buy something similar next year for around 1k, a lot will bite, especially with the new consoles.

PCs... 30 Hz... not so sure :)

I'd buy a 4K TV for the lounge if I had a console, but as a PC gamer sitting ~2 feet from my screen I'm happy with my 24" screen. I might go to a 27" 1440p one day, but I don't think they'll make 4K that small??
 
PC gaming on a TV is a novelty. It makes you appreciate games with a controller more than their console counterparts, as it obviously runs so much smoother and crisper, but I miss having a proper monitor. I need to order my 1440p one.

The only controller game I've played recently is F1 2013, and I'm rubbish at that.
LoL, Saints Row takes up most of my time.

Monitors need to step up their game :p
So many are badly made with horrible bleed, I'm scared to order any of them.
 
I play Tomb Raider on my 42-inch Panasonic TV with an Xbox controller, as I play it with the gf, and it's great to be honest. Proper 1080p and 60fps is such a nicer experience compared with the console version :p
 
What really gets me is that they are testing 4K with current-gen games... whilst having to turn down settings, and the frame rates are still horrid.

There is no doubt that once the new consoles are out, next-gen games will run considerably worse than current games do now. I would wager something similar to Crysis 3, as that already looks almost next-gen, but most likely a fair bit worse even than that.

I therefore don't see how these are going to be the cards to promote 4K gaming. I think they will be incapable, even in CrossFire, of doing so.

Some games run like crap, others don't. In the test they showed Skyrim, Borderlands 2 and Max Payne 3 all achieving over 70fps average... is that somehow unplayable at 4K resolution?

It's daft, and the reasons people are down on the 4K results seem pretty pointed.

4K isn't just about memory or the screens. It's the highest single-screen resolution available, the screens are available, and driving it takes a lot of horsepower. When showing what the top cards can do, using the highest settings possible makes sense.

Never in the history of computing has any company, Nvidia, Intel, AMD or anyone else, based benchmarks on the maximum playable settings. AnandTech routinely shows fps in the 2-20fps range for Intel IGPs; is that bad because they show what performance those chips can provide at maximum settings?

As for memory, and the daft things people are saying about it:

http://pctuning.tyden.cz/component/...pletni-vykon-r9-290x-vs-gtx-780-v-17-ti-hrach

You can see which games obviously break the 3GB memory limit on the GTX 780. The obvious one is Sleeping Dogs, and DiRT Showdown is a possibility too; realistically, if a game runs out of memory, we're talking about that level of performance loss.

If a card shows an advantage when it's pushed harder, why on earth wouldn't they show that advantage? And if a card is faster in the highest-stress situations, why wouldn't you, as the consumer, want to know?

People are also getting a bit ****y about 4K support. Firstly, there are several games that run at more-than-playable settings at that resolution; secondly, support for a product doesn't necessarily mean gaming and gaming only. While they will be few and far between, there will be people with 4K screens, people who game and people who don't, who want them; and for AMD to actively ignore the highest resolution available, while Nvidia shows 4K results, would be absurd.

4K results being somehow bad seems to be an Nvidia strawman argument, as does insisting AMD has to compare their card to something that costs twice as much.
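
To put some rough numbers on the memory point: render targets alone scale directly with pixel count, and 4K has four times the pixels of 1080p, before you even count textures (the bigger consumer). The buffer counts and formats below are illustrative assumptions, not any particular game's actual frame layout:

```python
# Rough sketch of how render-target memory grows with resolution.
# Buffer counts and formats are illustrative assumptions, not a specific
# game's real frame layout; textures (the bigger consumer) are ignored.

def render_target_mb(width, height, targets=6, bytes_per_pixel=4, msaa=1):
    """Approximate MB used by colour/G-buffer targets at a given resolution."""
    return width * height * targets * bytes_per_pixel * msaa / (1024 ** 2)

for name, (w, h) in {"1080p": (1920, 1080), "1440p": (2560, 1440),
                     "4K": (3840, 2160)}.items():
    print(f"{name}: ~{render_target_mb(w, h):.0f} MB of render targets "
          f"(~{render_target_mb(w, h, msaa=4):.0f} MB with 4x MSAA)")
# 4K roughly quadruples the per-frame buffer cost of 1080p, so a card that sat
# comfortably under 3 GB at 1080p can spill over PCIe at 4K and fall off a cliff.
```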
 
I'd buy a 4K TV for the lounge if I had a console, but as a PC gamer sitting ~2 feet from my screen I'm happy with my 24" screen. I might go to a 27" 1440p one day, but I don't think they'll make 4K that small??

The size vs. resolution thing is really starting to irk me. I didn't get a 1600p screen because, frankly, I never wanted a 30" screen on the desk that close to me. At 4K I could go that big, because there are enough pixels to justify it. Though it all becomes a bit awkward, as I like having two physical screens so I can run a full-screen game on one and keep things up on the other. A single massive screen, with the way current drivers/OS/full-screen gaming all work, is a bit of a headache. I think the industry in general, drivers and software, really isn't geared towards people wanting to do more than one thing at once, or using more than one screen. It works, but there are many drawbacks, many things you can't do, and driver issues. AFAIK both AMD and Nvidia cards have to run a higher idle clock, using more power, just because you are using two screens.

That 4" phones get 1080p while we're still pretty much stuck with 1080p on 24" screens is a joke. I'd have killed for a 1600p 24" screen years ago; they just don't make them.

Again, this is a huge reason why 4K support is absolutely vital, when the GPU makers and the monitor makers can't pick a cable and create a standard between them. Had they made 4K 24" screens, they would have had all kinds of timing/cabling/driver issues. It makes it harder for companies to commit to making higher-res/better screens, because they want full support ready and sorted; they don't want to make higher-cost screens, see them plagued by problems and a lack of standards, and then not sell. However, if they'd pushed the screens, it would likely have pushed a standard out much sooner... so often it's a case of no one agreeing on anything and being in a stalemate for years. It's ridiculous.

But the lack of high pixel density monitors is nothing short of ridiculous when you can get the same res on a 4-5" phone screen as a 24" monitor.
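
The pixel-density gap is easy to put a number on; a quick sketch, assuming 16:9 panels:

```python
# Putting numbers on the pixel-density complaint (assumes 16:9 panels).
from math import hypot

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch for a panel of the given resolution and diagonal."""
    return hypot(width_px, height_px) / diagonal_in

panels = [
    ("5\" 1080p phone",    1920, 1080, 5),
    ("24\" 1080p monitor", 1920, 1080, 24),
    ("27\" 1440p monitor", 2560, 1440, 27),
    ("24\" 4K monitor",    3840, 2160, 24),
]
for name, w, h, diag in panels:
    print(f"{name}: ~{ppi(w, h, diag):.0f} PPI")
# ~441, ~92, ~109 and ~184 PPI respectively -- the phone packs nearly five
# times the density of a standard 24" desktop screen.
```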
 

http://pctuning.tyden.cz/component/...pletni-vykon-r9-290x-vs-gtx-780-v-17-ti-hrach
 
Some games run like crap, others don't. In the test they showed Skyrim, Borderlands 2 and Max Payne 3 all achieving over 70fps average... is that somehow unplayable at 4K resolution?

It's daft, and the reasons people are down on the 4K results seem pretty pointed.

4K isn't just about memory or the screens. It's the highest single-screen resolution available, the screens are available, and driving it takes a lot of horsepower. When showing what the top cards can do, using the highest settings possible makes sense.

Never in the history of computing has any company, Nvidia, Intel, AMD or anyone else, based benchmarks on the maximum playable settings. AnandTech routinely shows fps in the 2-20fps range for Intel IGPs; is that bad because they show what performance those chips can provide at maximum settings?

As for memory, and the daft things people are saying about it:

http://pctuning.tyden.cz/component/...pletni-vykon-r9-290x-vs-gtx-780-v-17-ti-hrach

You can see which games obviously break the 3GB memory limit on the GTX 780. The obvious one is Sleeping Dogs, and DiRT Showdown is a possibility too; realistically, if a game runs out of memory, we're talking about that level of performance loss.

If a card shows an advantage when it's pushed harder, why on earth wouldn't they show that advantage? And if a card is faster in the highest-stress situations, why wouldn't you, as the consumer, want to know?

People are also getting a bit ****y about 4K support. Firstly, there are several games that run at more-than-playable settings at that resolution; secondly, support for a product doesn't necessarily mean gaming and gaming only. While they will be few and far between, there will be people with 4K screens, people who game and people who don't, who want them; and for AMD to actively ignore the highest resolution available, while Nvidia shows 4K results, would be absurd.

4K results being somehow bad seems to be an Nvidia strawman argument, as does insisting AMD has to compare their card to something that costs twice as much.

You're mixing up 'bad' with 'pointless'. The card doesn't even support HDMI 2.0. How difficult is it, this far down the line, to just release some 1440p benches too?

The answer is not very.
 
Some games run like crap, others don't. In the test they showed Skyrim, Borderlands 2 and Max Payne 3 all achieving over 70fps average... is that somehow unplayable at 4K resolution?

It's daft, and the reasons people are down on the 4K results seem pretty pointed.

4K isn't just about memory or the screens. It's the highest single-screen resolution available, the screens are available, and driving it takes a lot of horsepower. When showing what the top cards can do, using the highest settings possible makes sense.

Never in the history of computing has any company, Nvidia, Intel, AMD or anyone else, based benchmarks on the maximum playable settings. AnandTech routinely shows fps in the 2-20fps range for Intel IGPs; is that bad because they show what performance those chips can provide at maximum settings?

As for memory, and the daft things people are saying about it:

http://pctuning.tyden.cz/component/...pletni-vykon-r9-290x-vs-gtx-780-v-17-ti-hrach

You can see which games obviously break the 3GB memory limit on the GTX 780. The obvious one is Sleeping Dogs, and DiRT Showdown is a possibility too; realistically, if a game runs out of memory, we're talking about that level of performance loss.

If a card shows an advantage when it's pushed harder, why on earth wouldn't they show that advantage? And if a card is faster in the highest-stress situations, why wouldn't you, as the consumer, want to know?

People are also getting a bit ****y about 4K support. Firstly, there are several games that run at more-than-playable settings at that resolution; secondly, support for a product doesn't necessarily mean gaming and gaming only. While they will be few and far between, there will be people with 4K screens, people who game and people who don't, who want them; and for AMD to actively ignore the highest resolution available, while Nvidia shows 4K results, would be absurd.

4K results being somehow bad seems to be an Nvidia strawman argument, as does insisting AMD has to compare their card to something that costs twice as much.

*Blah blah blah*, *ignore everything I have said*.

My post still stands: these cards will simply not be powerful enough to play graphically intensive next-gen games at 4K res. I think even 3 or 4 of them might start struggling.

4K will be great, and there will be cards that can do 4K properly in a few years. These cards are not it.

In a year's time, when 4K will perhaps be closer to a reality for 99% of people (i.e. they might actually own a 4K monitor), 20nm GPUs will be out.
 