New 120Hz/"240Hz" VA monitor for gamers (Eizo Foris FG2421)

Assume the plasma uses 100 W more than an LED (an extreme example unless it's over 50"), for 10 hours every day.

Based on a price of 14.9p per kWh, that works out at about £54.40 "more" per year. In practice it will most likely be less than this.
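As a sanity check on that arithmetic, here is a minimal sketch using the same figures quoted above (100 W extra draw, 10 h/day, 14.9p/kWh):

```python
# Extra annual running cost of a set drawing 100 W more, per the figures above.
extra_watts = 100          # additional draw of the plasma vs the LED
hours_per_day = 10
pence_per_kwh = 14.9

extra_kwh_per_year = extra_watts * hours_per_day * 365 / 1000
extra_pounds_per_year = extra_kwh_per_year * pence_per_kwh / 100

print(f"{extra_kwh_per_year:.0f} kWh/year, ~£{extra_pounds_per_year:.2f}/year")
```

That lands within pennies of the £54.40 figure, confirming the per-year number is the straight product of wattage, hours, and tariff.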
 
Depends which plasma; the brand-new ones seem to have lower usage, but 1-2 years ago they were about 250 W+, whereas an LED is about 60 W. So if you use it for 10+ hours a day over 5 years, it really does add up to the amount I said.

Best-case scenario (2014 plasma with 150 W power usage):

10 hours a day for 5 years = plasma £400, LED £125

3-year-old plasma (Samsung D6900, 280 W power usage):

10 hours a day for 5 years = £750 (the same price as the TV cost new)

2014 Samsung F8500 (500 W!): 10 hours a day for 5 years = £1,300, ouch.
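For anyone who wants to check these 5-year figures, a quick sketch using the 14.9p/kWh tariff from earlier in the thread (note that at 60 W the LED actually works out nearer £165 than £125, and the 150 W plasma nearer £408; the wattages themselves are the poster's claims and are disputed later in the thread):

```python
def five_year_cost_pounds(watts, hours_per_day=10, years=5, pence_per_kwh=14.9):
    """Electricity cost in pounds over `years` at the given steady draw and tariff."""
    kwh = watts * hours_per_day * 365 * years / 1000
    return kwh * pence_per_kwh / 100

# Wattages as claimed in the post above:
for label, watts in [("2014 plasma", 150), ("LED", 60),
                     ("Samsung D6900", 280), ("Samsung F8500", 500)]:
    print(f"{label} ({watts} W): ~£{five_year_cost_pounds(watts):.0f}")
```

So the £750 and £1,300 figures do follow from 280 W and 500 W respectively; the real dispute below is whether those wattages reflect realistic viewing conditions.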
 
Depends which plasma; the brand-new ones seem to have lower usage, but 1-2 years ago they were about 250 W+, whereas an LED is about 60 W. So if you use it for 10+ hours a day over 5 years, it really does add up to the amount I said.

No it doesn't. As others have said, your calculations are way off. There is no way that would happen unless you were staring at an all-white screen at full brightness for 20 hours a day.
 
Depends which plasma; the brand-new ones seem to have lower usage, but 1-2 years ago they were about 250 W+, whereas an LED is about 60 W. So if you use it for 10+ hours a day over 5 years, it really does add up to the amount I said.

Best-case scenario (2014 plasma with 150 W power usage):

10 hours a day for 5 years = plasma £400, LED £125

3-year-old plasma (Samsung D6900, 280 W power usage):

10 hours a day for 5 years = £750 (the same price as the TV cost new)

2014 Samsung F8500 (500 W!): 10 hours a day for 5 years = £1,300, ouch.

You seem to be falling into the trap I mentioned above. Under realistic, normal viewing conditions the power draw is nowhere near that on the Samsung models. Measured using the IEC 62087 Edition 2 standard, the D6900 draws closer to 160 W; it would only draw anything like 280 W when viewing 3D content at extreme brightness. And the F8500 draws closer to 280 W under the same IEC measurement standard. Reviews that measure power draw under similarly realistic conditions show similar results. As it happens, I own a D6900 myself and like cranking the brightness up... but the power draw is nowhere near what you claim, and neither is the cost (thankfully).
 
No it doesn't. As others have said, your calculations are way off. There is no way that would happen unless you were staring at an all-white screen at full brightness for 20 hours a day.

OK, well, you are free to think what you want, but I can assure you the calculations are correct. Some people just won't believe something if they don't want to, even if it is correct. :/
 
You seem to be falling into the trap I mentioned above. Under realistic, normal viewing conditions the power draw is nowhere near that on the Samsung models. Measured using the IEC 62087 Edition 2 standard, the D6900 draws closer to 160 W; it would only draw anything like 280 W when viewing 3D content at extreme brightness. And the F8500 draws closer to 280 W under the same IEC measurement standard. Reviews that measure power draw under similarly realistic conditions show similar results. As it happens, I own a D6900 myself and like cranking the brightness up... but the power draw is nowhere near what you claim, and neither is the cost (thankfully).

My numbers were taken from the HDTVTest reviews; I averaged the figures between "calibrated SD" and "default SD"... It costs a lot more to run a plasma than an LED: whether it is 150 W or 300 W, it is still 3-6x the power usage of an LED.
 
Yep.
If power usage is very, very important, then by all means go with LCD with LED back/edge lighting; but if image quality is absolutely the most important thing, go with plasma.
 
I did not say LCD is better than plasma; it's just a lot cheaper when you add up the price you paid for it plus the electricity usage.
 
What would you like to know?

I had one for a week. It was a decent monitor, but nowhere near worth the asking price. Good blacks, good colours and pretty good motion performance under optimal conditions. However, my panel had uniformity issues in three corners (fading), very noticeable cross-hatching and even some colour banding/dithering artefacts. Plus, the Turbo 240 mode introduced a great deal of "ghosting" in many cases.

All in all it would be the monitor to get if it was £150 cheaper. As it stands, I can't recommend it. Wait to see what the BenQ with the native strobing light offers, or the new Asus with G-sync. This monitor promises much and seems like the perfect monitor by many accounts, but if you have a keen eye (as most people who notice motion blur issues do), you may wind up quite disappointed.
 
Plasma vs LCD:

How many of the figures brought forward here are actually tested, I wonder? Because the real-world tests are the ones that matter. So if anyone is interested, I would encourage you to buy a watt meter with a kWh and/or cost mode: the kind you plug between the wall socket and the device, and it calculates the necessary information (the basic models cost around 10€, I think). Then just keep the television (be it LCD or plasma) on for 10 hours, alternating between regular broadcasts, movies, maybe even some computer time.

And please don't temporarily change your current settings to "economy" or "energy saving"; we want to know the real-world values. And please state your model number, too.

For example, my Philips 42PFL6907T (42" LCD, 2012) takes about 80 W, which is pretty much the maximum achievable power draw from it. At about 10 h daily, this amounts to 290 kWh annually.
-- 80 W in both 2D and 3D, no matter the source material
-- with only one change, it would take only about 45 W (165 kWh/a) if I simply let the backlight go into a PWM mode by enabling "Dynamic backlight", but I'm a bit stubborn, so I won't
-- with a few extra changes, I could get it all the way down to 35 W (130 kWh/a), even without any noticeable image-quality difference, pretty much only affecting the PWM
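Those kWh-per-annum figures follow directly from wattage times hours; a small sketch of the conversion (10 h/day assumed, as above; the post's numbers are rounded to the nearest 5 kWh):

```python
def annual_kwh(watts, hours_per_day=10):
    """Convert a steady power draw (W) into annual energy use (kWh)."""
    return watts * hours_per_day * 365 / 1000

# The three backlight configurations described above:
for watts in (80, 45, 35):
    print(f"{watts} W -> ~{annual_kwh(watts):.0f} kWh/a")
```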

So while it is indeed possible to fine-tune a plasma to use less than 200 W, the same logic applies to LCDs, too. If we compromise on features and settings, then yes, we can get it way below the norm. But I thought plasmas were all about not compromising on image quality? ;)
-- in other words, please test with your actual settings and in real-world scenarios; don't cherry-pick to prove a (false) point
-- if you necessarily want to cherry-pick, you might as well blatantly lie about the figures; it wouldn't make a difference

And the power usage is indeed important, as you can reasonably offset the electricity-bill savings against the TV purchase itself. In essence, you might end up comparing a 600€ plasma to a 1000€ LCD. That's the difference it makes.
But also note that the offset cost hugely depends on:
a) your average daily usage
b) the planned lifetime of the device
c) your electricity tariff
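Those three factors can be folded into a total-cost-of-ownership comparison. A hypothetical sketch (the 250 W and 60 W draws, 0.15€/kWh tariff, and 5-year lifetime are illustrative assumptions, not measured figures):

```python
def total_cost_eur(price_eur, watts, hours_per_day, years, eur_per_kwh):
    """Purchase price plus projected electricity cost over the set's lifetime."""
    kwh = watts * hours_per_day * 365 * years / 1000
    return price_eur + kwh * eur_per_kwh

# A 600€ plasma at 250 W vs a 1000€ LCD at 60 W, 10 h/day for 5 years:
plasma = total_cost_eur(600, 250, 10, 5, 0.15)
lcd = total_cost_eur(1000, 60, 10, 5, 0.15)
print(f"plasma ~{plasma:.0f}€ vs LCD ~{lcd:.0f}€")
```

Under these assumptions the cheaper-sticker-price plasma ends up costing more over its lifetime than the pricier LCD, which is exactly the 600€-vs-1000€ comparison described above.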

From what I've seen, plasmas draw about 4x more power on AVERAGE compared to LCDs of the same size (both using calibrated settings).

Or, if someone would like to refute the figures, I would insist that they kindly provide us with better ones. In my opinion, a review site (hdtvtest.co.uk) that focuses solely on HDTV tests (plasmas and LCDs alike) should be quite a fitting source for the baseline data.

As for the FG2421:
I forgot to update the first post to say they're now in stock at OCUK.
Scratch that, it seems I didn't forget after all. I've updated and cleaned up the first post even further, though. Also: normally £450, but a this-week-only offer has them for £420.
 
My wattmeter is acting up; strangely, I think it has an internal battery. I'll get it running again and make some measurements on my 55" plasma and 40" LCD.
 
Not sure why plasma vs LCD has come into this discussion, but LCD is still not comparable to a good plasma panel. A low-end plasma screen, possibly, but for colour reproduction, motion handling, contrast, the lack of backlight issues, etc., plasma is still ahead.
So in your power-usage comparison you'd be trading higher power usage for better picture quality.
 
@3t3P:
Yes, mine has one too (LR44, 1.5 V). It's useful if you want to keep a memory of previous usage and tariffs; otherwise it would reset all the info once you unplug the thing from the mains, or during a blackout. Unfortunately, at least in my unit, the LCD display also seems to be powered by the battery, so the battery is required for it to be of any use: if I remove it, the display stays blank even when connected to the mains. This is probably because the circuitry is driven by DC, which would need an additional AC/DC converter just for that; it's simpler to include a small battery that takes care of the operational side (calculations, memory, display). One battery will last for years. My unit doesn't even have an on/off switch (nor an auto-switch-off timer), so it displays a static 0 watts all the time while sitting on the shelf. Actually, the battery had leaked and accumulated some of that powdery white stuff, and the display was blank, but I just cleaned the contact surfaces and now it's going strong again. :D
(even though it now gives only ~1.2 V)

@Gazb:
The plasma came into the discussion via black levels and contrast ratios. Then came the discussion of power consumption, then the comparison of cost savings and how much they affect the overall cost of the purchase.

In essence, my opinion:
Plasma indeed still has plenty going in its favour. But if you weigh in the cost savings, you can offset the cumulative electricity savings straight against the purchase, and should thus compare higher-price-point LCDs to lower-price-point plasmas, after which LCD is quite tempting. Naturally, if you're going to spend £2000+ and only use the set for 1-2 hours a day, then the offset is relatively small and plasma has better odds.

But in any case, that still doesn't change the fact that plasma is a dying breed. It's no coincidence that Panasonic is stepping out of the plasma market. Nevertheless, I'd say there will be new models from other manufacturers for a couple more years. But I wouldn't hold my breath for a 4K model (at least not a reasonably sized/priced one)...

And power consumption isn't the only trade-off with plasmas; there are also the heat, size, weight, and resolution issues. Power consumption is just the most notable one.

Note: CRT didn't vanish because of inferior image quality, and neither will plasma. No, they lost the game because LCD is more versatile and cheaper to produce. For the oblivious average Joe, the image quality is not worth the extra cost and other drawbacks. The average Joe would have been content with low-end TN panels if the more price-competitive IPS and VA panels hadn't emerged. Actually, the image quality isn't worth the extra cost and drawbacks for most of the more knowledgeable people, either.
 
Yeah, even a 1080p 27" version of this would have been a very welcome guest. And what strikes me as even odder is that the competitors haven't yet brought forward any alternatives using the same panel... Are the others so gung-ho on 4K panels or something?

Hmm, now that I think about it, maybe the gamma-shift problem is preventing them from going to a bigger size? But that still doesn't explain why the competitors aren't flocking to the 24" variants. Furthermore, there are plenty of 27" TN monitors, and I would assume those have it even worse with regard to viewing angles.

edit:
Actually, now that I've checked, 27" VA panels are indeed relatively uncommon altogether. In the 27" range, there are about 3-4 times more IPS and TN variants than VA.
 
Yeah, even a 1080p 27" version of this would have been a very welcome guest. And what strikes me as even odder is that the competitors haven't yet brought forward any alternatives using the same panel... Are the others so gung-ho on 4K panels or something?

Hmm, now that I think about it, maybe the gamma-shift problem is preventing them from going to a bigger size? But that still doesn't explain why the competitors aren't flocking to the 24" variants. Furthermore, there are plenty of 27" TN monitors, and I would assume those have it even worse with regard to viewing angles.

edit:
Actually, now that I've checked, 27" VA panels are indeed relatively uncommon altogether. In the 27" range, there are about 3-4 times more IPS and TN variants than VA.

It's not just that 27" VA panels are uncommon; VA panels are relatively uncommon for monitors, full stop. I agree it would be nice to see other manufacturers adopt this 23.5" Sharp panel, or for a 27" alternative to be developed. The gamma-shifting behaviour is actually relatively good on some of AUO's new AMVA+ panels. Not up to IPS levels of consistency, of course, but that's not the target market, nor is it an issue that is critical for many users. Monitors using these go all the way up to 27" (27" is actually the most common size for them). We just need to see if AUO can push the refresh rate up, with a strobe backlight a likely requirement there.
 
Hi guys, I just took delivery of this monitor and am very impressed with the contrast and colours, and 120Hz coming from a 60Hz monitor feels pretty good. I did read that some of these had an issue with cross-hatching, and while mine doesn't have any dead pixels or backlight bleed, there is a small amount of cross-hatching visible on plain backgrounds. I really have to look for it, but it's definitely there. Is this simply the nature of the panel or an actual defect? Wondering if it's worth sending it back.
 