elvesbane
Contributor
..:: Free Radical ::.. said:
1) In the context of printed images (Clarkvision): printers preserve the most detail when they are used to print at the highest possible DPI.
2) The "maximum" detail is perceived not with the eye at rest, but by accommodating the lens of the eye (i.e. squinting), which fatigues the eye and can cause eye strain or worse. For something like a monitor, which depending on your job you have to watch for hours, you'll want something that requires minimal effort.
3) Not everyone sits at 20 inches to view a monitor or watch HD movies.
4) Your powerful graphics card won't stay powerful for long, as is the current trend with GPU hardware. Your monitor will outlive your GPU. Why not buy something that will serve you better in the long term?
5) If it's a trade-off between screen real estate and "super high resolutions", I would prefer more screen real estate.
And my response to all of those...
1) While the author of that particular article might have used prints to come to his conclusion, the whole point of it was to find the resolution of the human eye - not "what resolution of printed material is best". Besides, you can't blame him for using printed material, seeing as most displays available for purchase come nowhere near the sort of detail (DPI, PPI, whatever) that a modern printer can achieve.
2) I admit that your point is valid, but does anyone have any clue what the resolution of the eye 'at rest' actually is when it comes to reading text / watching movies / viewing pictures / gaming? Besides, when moving from a 22" at 1680x1050 to a 21.5" at 1920x1080, the pixel density goes from about 90 to 102 PPI - an increase of roughly 12 PPI, or about 13%... but still well below the very commonly cited 300 DPI limit for human visual acuity.
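For anyone who wants to check the arithmetic, here's a quick Python sketch of where those PPI figures come from (it just assumes each panel is exactly its advertised diagonal size):

[code]
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch, from pixel dimensions and the panel's diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

old = ppi(1680, 1050, 22.0)   # the 22" 16:10 panel
new = ppi(1920, 1080, 21.5)   # the 21.5" 16:9 panel

print(f'22"   1680x1050: {old:.1f} PPI')   # ~90 PPI
print(f'21.5" 1920x1080: {new:.1f} PPI')   # ~102 PPI
print(f'increase: {new - old:.1f} PPI ({100 * (new - old) / old:.1f}%)')
[/code]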
3) Ignoring some information in favour of the rest doesn't really help your case... it's just childish. Clarkvision cites a MASSIVE 530 DPI as the upper limit at 20 inches, or 1.67 ft. I'll go out on a limb here and assume that most people sit 2.5-3.5 ft away from their monitors (I'm at the 2.9 ft mark). Common sense tells me that distance isn't so different from Clarkvision's test distance as to render the result (and the conclusion I drew in my previous post) irrelevant.
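To put a rough number on "not so different": if you treat visual acuity as a fixed angular resolution (my assumption here, using Clarkvision's ~530 DPI at 20 inches as the anchor), the resolvable pixel density just scales inversely with viewing distance. A quick back-of-the-envelope sketch:

[code]
# Assumption: acuity is a fixed *angular* resolution, so the resolvable
# pixel density scales as 1/distance. Anchor: ~530 DPI at 20 inches (Clarkvision).
ANCHOR_PPI = 530.0
ANCHOR_DISTANCE_IN = 20.0

def acuity_limit_ppi(distance_in):
    """Approximate resolvable pixel density (PPI) at a given viewing distance."""
    return ANCHOR_PPI * ANCHOR_DISTANCE_IN / distance_in

for feet in (2.5, 2.9, 3.5):
    print(f'{feet} ft -> ~{acuity_limit_ppi(feet * 12):.0f} PPI resolvable')
# 2.5 ft -> ~353 PPI, 2.9 ft -> ~305 PPI, 3.5 ft -> ~252 PPI,
# all comfortably above the ~102 PPI of a 21.5" 1080p panel.
[/code]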
4) Isn't that statement in MY favour? Yes, it is true that the capability of graphics cards (and the demands on them) are growing rapidly - and one of the things I've noticed is that everyone seems to be focusing on HD and beyond-HD resolutions for gaming. While HD gaming has been hyped for a very long time, you have to admit that even mainstream cards manage HD resolutions well enough (please don't cite Crysis as an example to the 'otherwise'. Crysis is weird.). In any case, future graphics cards will most certainly focus more and more on gaming at higher resolutions - not on staying in the same place. Why introduce a limit on your resolution if it's likely that you'll get one or more powerful graphics cards during the lifetime of the monitor?
5) According to tvcalculator.com, when watching movies at 16:9 (or wider), the 21.5" 16:9 shows a larger picture than the 22" (albeit by a very small margin - 197.33 sq. in. vs 195.56 sq. in. for a 16:9 source; see the rough calculation after this list). This means that...
a) Area is greater when watching movies.
b) Pixel density is greater on the 21.5", plus 1080p content is shown at its native resolution, not downscaled as it would be on the 1680x1050 panel.
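Here's roughly how those area figures fall out - a small Python sketch that letterboxes (or pillarboxes) the source onto the panel. The exact numbers differ from tvcalculator's by a fraction of a square inch, presumably down to rounding, but the ordering is the same:

[code]
import math

def panel_dims(diagonal_in, aspect_w, aspect_h):
    """Width and height (inches) of a panel, from its diagonal and aspect ratio."""
    units = math.hypot(aspect_w, aspect_h)
    return diagonal_in * aspect_w / units, diagonal_in * aspect_h / units

def displayed_area(diagonal_in, aspect_w, aspect_h, content_ratio=16 / 9):
    """Screen area actually filled by content of the given aspect ratio."""
    w, h = panel_dims(diagonal_in, aspect_w, aspect_h)
    if content_ratio >= w / h:
        # Content is wider than the panel: full width, letterboxed height.
        return w * (w / content_ratio)
    # Content is narrower than the panel: full height, pillarboxed width.
    return (h * content_ratio) * h

print(f'22" 16:10 panel, 16:9 source : {displayed_area(22.0, 16, 10):.1f} sq in')  # ~195.7
print(f'21.5" 16:9 panel, 16:9 source: {displayed_area(21.5, 16, 9):.1f} sq in')   # ~197.5
[/code]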
And yeah, it's true that I sound like I'm trying to justify my purchase. But the point is that gaming (+ movie watching) is quite possible (at high or highest settings) as long as the person in question is willing to spend on a graphics card that costs about as much as the monitor. For example: BenQ E2200HD = 11.8k, ATI Radeon 4870 = 12.8k (saw the listing in dealer's paradise! sounds nice, eh? ) As for me, I just finished playing Mirror's Edge with everything maxed (<sarcasm>except AA - only 4x - sigh...</sarcasm>) at full HD on a paltry 192-core GTX 260. That's an AAA title - on a card whose value is less than the monitor's. :hap2: