Well, in this post I'm going to write about a unit of measurement whose use has spread through society faster than almost any other since the second half of the 20th century, alongside the evolution of computing science: the bit.
A bit (binary digit) is, in computing science, the minimum unit of information of binary Boolean algebra, and it represents one of two possible values.
Mathematically there are many algebraic structures, but this one in particular has the advantage that it can be physically implemented with a two-state device, which is easier to build than a device with more states.
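As a quick illustration (a minimal sketch of my own, not taken from any source), the two values of a bit and the basic Boolean operations on them can be written in a few lines of Python:

```python
# A bit takes one of two values; here we represent them with the integers 0 and 1.
ZERO, ONE = 0, 1

def NOT(a):
    """Boolean negation of a single bit."""
    return 1 - a

def AND(a, b):
    """Boolean conjunction: 1 only when both bits are 1."""
    return a & b

def OR(a, b):
    """Boolean disjunction: 1 when at least one bit is 1."""
    return a | b

# Print the truth table of the two-valued algebra.
for a in (ZERO, ONE):
    for b in (ZERO, ONE):
        print(a, b, "->", "AND:", AND(a, b), "OR:", OR(a, b), "NOT a:", NOT(a))
```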
It was in the middle of the 19th century that the English mathematician George Boole established the mathematical basis, trying to formulate a rational foundation for human thought based on propositional logic.
But, like many other mathematical theories and analyses, it had little practical use until the American mathematician, electronic engineer and cryptographer Claude Elwood Shannon, after many years of research at Bell Labs, published in 1948 the article "A Mathematical Theory of Communication". That work captured all the advances in communications of the time (telegraph, radio, telephone, ...) and related them to mathematical concepts that already existed but that nobody had connected before: information entropy, redundancy, ... and it introduced the term bit as a unit of information. Shannon was also the first to apply Boolean algebra in a generalized way to the design of electronic bistable switching circuits.
Figure: Shannon's communication system.
The article "A Mathematical Theory of Communication" is considered one of the founding works of the field of information theory, and both, George Boole and Claude Elwood Shannon are considered now two of the fathers of the computing science.
At that time, programming was done in hardware, producing large and heavy machines that consumed a lot of electricity.
As physics and the other sciences have evolved, circuits have become increasingly smaller and more efficient, and most of the logic of a program has moved from hardware to software, by defining components that perform specific operations according to the established mathematical logic.
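As a toy illustration (my own sketch, not part of the original text) of how such components are built from Boolean logic, a half adder that sums two bits can be written directly from the XOR and AND operations:

```python
def half_adder(a, b):
    """Sum two single bits: returns (sum bit, carry bit)."""
    return a ^ b, a & b   # XOR gives the sum, AND gives the carry

for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} -> carry {c}, sum {s}")
```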
Nowadays, in quantum computing, the analogous unit is called the qubit; its two basis states can be physically realized, for example, as the vertical and horizontal polarization of a photon, and, unlike a classical bit, a qubit can also exist in superpositions of those two states.
Figure: graphical representation of a qubit as a Bloch sphere: apart from the basis states |0⟩ and |1⟩, general states of the form α|0⟩ + β|1⟩ (with |α|² + |β|² = 1) are possible.
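To make the superposition idea concrete, here is a minimal sketch of my own (the amplitudes chosen are arbitrary) that represents a qubit state as a normalized pair of complex amplitudes:

```python
import math

# A qubit state a|0> + b|1> is just a pair of complex amplitudes (a, b)
# whose squared magnitudes sum to 1.
a, b = 1 / math.sqrt(2), 1j / math.sqrt(2)   # an equal superposition (arbitrary choice)

norm = abs(a) ** 2 + abs(b) ** 2
print("P(measure 0) =", abs(a) ** 2)          # 0.5
print("P(measure 1) =", abs(b) ** 2)          # 0.5
print("Normalized:", math.isclose(norm, 1.0)) # True
```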
Some related units and conversions
1 Byte (in almost all cases) = 8 bits
1 Byte on some computers, especially older ones = 6, 7, 8, or 9 bits
1 nat (natural unit of information) or nit = log2 e (≈ 1.443) bits
1 dit, ban, or hartley = log2 10 (≈ 3.322) bits
1 bit of information = about ln 2 (≈ 0.693) nats, or log10 2 (≈ 0.301) hartleys
Some authors also define a binit as an arbitrary information unit equivalent to some fixed but unspecified number of bits.
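These conversions are just changes of logarithm base; the following sketch (my own example) checks a couple of them numerically:

```python
import math

BITS_PER_NAT = math.log2(math.e)   # ≈ 1.443 bits in one nat
BITS_PER_HARTLEY = math.log2(10)   # ≈ 3.322 bits in one hartley (dit, ban)

print("1 nat     =", BITS_PER_NAT, "bits")
print("1 hartley =", BITS_PER_HARTLEY, "bits")
print("1 bit     =", 1 / BITS_PER_NAT, "nats")          # ≈ ln 2 ≈ 0.693
print("1 bit     =", 1 / BITS_PER_HARTLEY, "hartleys")  # ≈ log10 2 ≈ 0.301
```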
The conversion to the decimal system depends on the position of each bit in the binary string, and on whether the representation is little-endian or big-endian (that is, which end of the string holds the least significant bit):
Decimal number = sum over all positions of (bit value, 0 or 1) × 2^(position in the binary string), where positions are counted from 0 starting at the least significant bit.
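A small sketch (my own example) applying this formula to a bit string, with both bit orderings:

```python
def bits_to_decimal(bits, little_endian=False):
    """Convert a string of '0'/'1' characters to its decimal value.
    If little_endian is True, the first character is the least significant bit."""
    if little_endian:
        bits = bits[::-1]   # reverse so the last character becomes the LSB
    value = 0
    for position, bit in enumerate(reversed(bits)):
        value += int(bit) * 2 ** position
    return value

print(bits_to_decimal("1011"))                      # 11 (most significant bit first)
print(bits_to_decimal("1011", little_endian=True))  # 13 (least significant bit first)
```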