Some weeks ago, I was searching for information about graphical components to use in the application we were programming. I don't remember which library I found, but I read about a measurement unit I hadn't noticed before (or didn't remember): the twip.
This week I found a piece of paper on my desk with a single word written on it: twip.
As I do whenever I come across a measurement unit that is new to me, I made a note of it so I could look it up later and write about it here, in this blog.
Many programmers have heard of pixels, inches, or centimeters as measurement units for graphical components like frames, windows, dialogs, screens, and so on, but I think far fewer have heard of the twip.
I understand that which concepts you know depends on the programming language, libraries, or tools you use in an application, but since most of us were taught about pixels and other graphical concepts, why weren't we taught about twips? Maybe because it's a newer concept brought by the evolution of technology; maybe because it's only used in certain technologies; maybe because, since the smallest unit a screen can technically represent is a pixel, it makes no sense to think in anything smaller (or larger, depending on the screen's parameters); or maybe because the technologies that use it do so at a low level, so the user or programmer never needs to deal with it.
The fact is that nowadays there are screens of many sizes and resolutions, and that can be a headache for front-end programmers, especially if they target different operating systems and/or browsers. There are different solutions to this problem.
Twip is an abbreviation of "twentieth of a point"; it is a measure used when laying out space or defining objects on a page or other area to be printed or displayed on a computer screen.
As a typographical measurement, it is defined as 1/20 of a typographical point, a traditional measure in printing. A point is approximately 1/72 of an inch, so 1/20 × 1/72 = 1/1440: a twip is 1/1440 of an inch, or about 1/567 of a centimeter. In other words, there are 1440 twips in an inch and roughly 567 twips in a centimeter.
1 twip = 1/20 typographical point = 1/1440 inch ≈ 1/567 cm
1 cm ≈ 567 twips
1 inch = 1440 twips
1 typographical point = 20 twips
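These conversions can be checked with a few lines of Python (a minimal sketch; the constant names are my own, and the centimeter figure is rounded, since 1440 / 2.54 ≈ 566.93):

```python
TWIPS_PER_POINT = 20                                 # 1 typographical point = 20 twips
POINTS_PER_INCH = 72                                 # PostScript convention
TWIPS_PER_INCH = TWIPS_PER_POINT * POINTS_PER_INCH   # 20 * 72 = 1440
TWIPS_PER_CM = TWIPS_PER_INCH / 2.54                 # ~566.93, usually rounded to 567

print(TWIPS_PER_INCH)        # 1440
print(round(TWIPS_PER_CM))   # 567
```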
In computing, the twip is a screen-independent unit used to ensure that the proportions of screen elements are the same on all display systems. In theory, this allows a graphical component to be rendered at the same physical size regardless of the screen's resolution.
As I've read: "not all software development tools work with twips, and a programmer may sometimes need to convert between twips and pixels, and the reverse." That explains why many programmers don't know about twips.
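Such a conversion depends on the screen's DPI. A hedged sketch in Python, assuming the DPI is known (96 DPI is a common Windows default, at which one pixel happens to be exactly 15 twips, since 1440 / 96 = 15):

```python
def twips_to_pixels(twips: float, dpi: float = 96.0) -> float:
    """Convert twips to pixels, given the screen DPI (1440 twips per inch)."""
    return twips * dpi / 1440.0

def pixels_to_twips(pixels: float, dpi: float = 96.0) -> float:
    """Convert pixels to twips, given the screen DPI."""
    return pixels * 1440.0 / dpi

print(pixels_to_twips(1))      # 15.0  -> one pixel is 15 twips at 96 DPI
print(twips_to_pixels(1440))   # 96.0  -> one inch of twips is 96 pixels at 96 DPI
```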
Some software I've read about that uses twips: Microsoft Visual Basic (version 6 and earlier, prior to VB.NET), the Rich Text Format (RTF), Symbian OS bitmap images, and the SWF (Flash) format; it is also the base length unit in OpenOffice.org and its fork LibreOffice.
A variant, depending on the point definition:
One twip is 1/1440 inch or 17.639 µm when derived from the PostScript point at 72 to the inch, and 1/1445.4 inch or 17.573 µm based on the printer's point at 72.27 to the inch.
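Both figures follow directly from the two point definitions; a quick worked check in Python (using the exact value of 25.4 mm per inch):

```python
MICROMETERS_PER_INCH = 25400.0  # 25.4 mm per inch, exactly

def twip_in_micrometers(points_per_inch: float) -> float:
    """Size of one twip in micrometers, given how many points fit in an inch."""
    return MICROMETERS_PER_INCH / (20 * points_per_inch)

print(round(twip_in_micrometers(72), 3))     # 17.639 -> PostScript point
print(round(twip_in_micrometers(72.27), 3))  # 17.573 -> printer's point
```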
In my opinion, based on the documentation I've read, and as an answer to the questions above, the twip seems to be an old measurement unit in programming, introduced by some tools and programming languages as a solution for creating window components when the number of operating systems, windowed applications, and screens with different sizes and resolutions started to grow exponentially (in the '90s). I would be glad if someone could share more information about twips.