mrmazda at ij.net
Mon Jun 30 19:38:49 PDT 2008
On 2008/07/01 02:35 (GMT+0100) Glynn Clements apparently typed:
> Felix Miata wrote:
>> Sounds to me like you're using the same bogus math as typesetters and web
>> deeziners use. 8px is not half the size of 16px - it's 25%, length times
>> width. Size is area, not one single dimension. A 1600x1200 display has 4
>> times as many logical px (1,920,000) as an 800x600 display (480,000). Thus, a
>> 48x48 icon has 2304px, 2.25 times the 1024px of a 32x32 icon; 4 times as many
>> as a 24x24 (576).
> I think you missed my point, because using areas only makes the issue
> more pronounced.
Area doesn't make the issue any more or less pronounced. What it should make
more pronounced is the ability to recognize the disparities under discussion.
Minimizing real differences into artificially smaller ones makes the problems
_look_ smaller than they really are.
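The area arithmetic quoted above can be checked in a few lines (a sketch; the `area` helper is only for illustration):

```python
# Check that size-as-area, not size-as-one-dimension, matches the quoted figures.
def area(w, h):
    """Total logical pixels of a w x h region."""
    return w * h

# 8px is half the *height* of 16px, but only a quarter of the area.
assert area(8, 8) / area(16, 16) == 0.25

# A 1600x1200 display has 4 times the logical px of an 800x600 display.
assert area(1600, 1200) == 1_920_000
assert area(800, 600) == 480_000
assert area(1600, 1200) / area(800, 600) == 4

# Icons: a 48x48 icon has 2304px, 2.25x the 1024px of 32x32, 4x the 576px of 24x24.
assert area(48, 48) / area(32, 32) == 2.25
assert area(48, 48) / area(24, 24) == 4
```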
> My point is that you need small enough increments that you can always
> get roughly the size that you want, rather than being stuck with a
> choice between definitely too large or definitely too small.
Or use something intended to scale in the first place.
AFAICT, the technology exists for displays of double or more the resolution
the average user has now, but the systems they're expected to be used with
depend on anachronisms like 96 DPI, on a choice between two groups of tiny
bitmap icon sizes, and on apps designed as if for print media of fixed
dimensions instead of computer display screens of widely varying size and
resolution. Few would now buy those much higher resolutions, because of the
tininess of objects that would result from using them encumbered by those
legacies. If the desktops could accommodate the resolutions laser printers
started with (300 DPI or more), the increments would be too small to matter,
and scaling would bother few, or maybe no one.
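The increment argument can be put in numbers (a sketch, assuming the standard 72 points per inch): at 96 DPI one point is only about 1.33px, so snapping sizes to whole pixels forces coarse jumps; at 300 DPI a point spans over 4px and the per-step rounding error nearly vanishes.

```python
# Compare pixel granularity of point sizes at 96 DPI vs. a 300 DPI display.
def px_per_pt(dpi):
    """Pixels spanned by one typographic point (1/72 inch) at a given DPI."""
    return dpi / 72.0

for dpi in (96, 300):
    step = px_per_pt(dpi)
    # Worst-case error when snapping a point size to whole pixels,
    # expressed as a fraction of one point.
    worst_err_pt = 0.5 / step
    print(f"{dpi} DPI: 1pt = {step:.2f}px, worst rounding error ~{worst_err_pt:.3f}pt")
```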
> With fonts, you get that choice. But the interval between the
> available icon sizes is much larger.
As long as everyone is stuck with a tiny selection of bitmap images, that's
exactly right. It's well past time for everyone to stop being stuck with them, though.
>> > The 96 dpi figure was just an arbitrary value, chosen so that various
>> > common point sizes (6, 8, 12, 16) would work out to an integer number
>> > of pixels.
>> It's arbitrary all right, but not necessarily for the reason you claim. e.g.
>> at 96 DPI:
>> 6pt = 8.000px^~1.5 (not enough px per character box for all complete
>> character sets to be rendered intelligibly)
>> 8pt = 10.667px^~1.5
>> 10pt = 13.333px^~1.5
>> 12pt = 16.000px^~1.5
>> 16pt = 21.333px^~1.5
> I have no idea what you're getting at here.
You mentioned integer text sizes as a reason for 96 DPI, yet few of the sizes
above calculate to integers. By e.g. 13.333^~1.5 I meant a character box about
13.333px tall by about half that wide, for a total of about 88.89px available
per 10pt character.
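The conversion behind the quoted table is px = pt × dpi / 72; the character-box figure assumes a glyph cell roughly half as wide as it is tall (an illustrative assumption, matching the ~88.89px figure for 10pt):

```python
# Reproduce the quoted pt -> px table at 96 DPI, plus the character-box area.
def pt_to_px(pt, dpi=96):
    """Convert typographic points (1/72 inch) to pixels at a given DPI."""
    return pt * dpi / 72.0

def char_box_px(pt, dpi=96):
    """Approximate pixels per character box: height x (roughly half-height width)."""
    h = pt_to_px(pt, dpi)
    return h * (h / 2)

for pt in (6, 8, 10, 12, 16):
    print(f"{pt}pt = {pt_to_px(pt):.3f}px tall, ~{char_box_px(pt):.2f}px per character box")
```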
>> The reason anyone else uses it is because M$ uses/used it, and the reason for
>> that misfortunate legacy is explained on:
> That's interesting, but it doesn't really have any bearing on the
> notion of resolution independence.
Just tried to correct any misconception about the source of legacy.
> So long as display resolutions remain low enough that you have UI
> elements which are only a few pixels in size, the fact that you
> ultimately have to rasterise whole pixels means that you can't just
> operate entirely in physical units, in the same way that you can with
> a 300+dpi laser printer.
Right, so desktop environments need to make some big changes to make display
devices with enough resolution feasible. This thread, then, isn't so much
about whether people know the conflicts exist as it is about the posture of
those trying to make the best of what is vs. those trying to push capabilities
up to a reasonable ought-to-be. I doubt anyone would complain if the average
were 300. The many problems lie in dealing with the gap between current
reality and goodness, in eliminating that gap, and in living with and
minimizing the pain of the under-construction mess in the meantime.
> Well, you *can*, but the artifacts are going to look a lot worse.
Maybe to people with your 15/15 vision, but less likely to people corrected
to no better than 25/25. I generally find a native size image no better or
worse than that same image blown up to 4X its native size when its native
size is only 1/4 big enough to be useful anyway. I'm a user of high
resolution in order to gain quality, not interested in stuffing more things
of smaller size into a given space.
I'm all for getting everything beyond bitmaps ASAP. Puters are plenty
powerful. Let's get them using that power to make people happy users instead
of bickering finger pointers.
"Where were you when I laid the earth's
foundation?" Matthew 7:12 NIV
Team OS/2 ** Reg. Linux User #211409
Felix Miata *** http://fm.no-ip.com/