Hi folks,

In our previous articles we’ve told you a few things about the concept of zero, then about numbers and then about the binary numeral system.
Today’s article will probably shed some light on the reasons behind that sequence.

“Abstract” etymologically means ‘separated from material because drawn from it’, while “concrete” etymologically means ‘solid’, ‘material’.
That is, “abstract” is what escapes our senses, while “concrete” refers to things we can ‘capture’ with our senses.
Logic, for instance, is of course abstract, but there are ways to make it concrete and handle it materially.

In the mid-1800s, British mathematician George Boole published “The Laws of Thought” (the complete title of the book was ‘An Investigation of the Laws of Thought on Which are Founded the Mathematical Theories of Logic and Probabilities’), a book which introduced Boolean algebra to the world.
This special type of algebra differs from “classic” algebra because it operates with variables whose values aren’t numbers; instead, each value can be either “true” or “false”, which can be represented by “1” and “0”.
Another difference from elementary algebra lies in the operations: Boolean algebra’s main operations are conjunction (AND), disjunction (OR) and negation (NOT), instead of the addition and multiplication that are the main operations of elementary algebra.
Boole’s algebra provided the means by which various notions of logic could be expressed in a formalized language, and it was later refined by other mathematicians until it reached the modern mathematical structure it has today.
This abstract apparatus was to become the foundation of the design of combinatorial logic circuits.
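
To make this a bit more concrete, here is a minimal sketch in Python of those three operations as truth tables, with “1” standing for true and “0” for false (the function names are just illustrative, not any standard library):

```python
# A minimal sketch of Boolean algebra's three basic operations,
# using 1 for "true" and 0 for "false".

def AND(a, b):
    return a & b   # conjunction: 1 only if both inputs are 1

def OR(a, b):
    return a | b   # disjunction: 1 if at least one input is 1

def NOT(a):
    return 1 - a   # negation: flips 0 to 1 and 1 to 0

# Print the full truth tables.
for a in (0, 1):
    print(f"NOT {a} = {NOT(a)}")
for a in (0, 1):
    for b in (0, 1):
        print(f"{a} AND {b} = {AND(a, b)}    {a} OR {b} = {OR(a, b)}")
```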

In the 1930s, American mathematician and electrical engineer Claude Shannon demonstrated that circuits of electrical switches could implement Boolean algebra and thereby construct and resolve any logical or numerical relationship.
Using electrical switches to perform logic is the basic concept that made all electronic digital computers possible.
An off-topic but interesting little detail is that Shannon, the “father of modern digital communications and information theory”, was a distant relative of his childhood hero, Thomas Edison.

Through switches and relays, the “0”s and “1”s made the digital era possible: using just these two digits, any number and any piece of logic could be represented and handled. All of a sudden, binary was everywhere, in a matter of just a few decades, a blink of an eye when it comes to history.
Basically, electricity and semiconductors made it possible to program ‘matter’ and interact with it through abstractions.
The magic happens through microprocessors.
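
To give a tiny taste of how pure logic becomes arithmetic, here is a rough sketch (again in Python, with illustrative names) of a “half adder”, the simplest circuit that adds two binary digits using nothing but the three logic operations above:

```python
# A "half adder" adds two binary digits using logic operations only.
# Real hardware wires these gates in silicon; this is just a sketch.

def AND(a, b): return a & b
def OR(a, b):  return a | b
def NOT(a):    return 1 - a

def half_adder(a, b):
    # XOR built from AND, OR and NOT: 1 when exactly one input is 1.
    sum_bit = AND(OR(a, b), NOT(AND(a, b)))
    carry   = AND(a, b)   # the carry into the next binary position
    return sum_bit, carry

for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} -> carry {c}, sum {s}")
```

Chaining such adders together gives circuits that add numbers of any size, which is essentially what happens inside a CPU.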

A Central Processing Unit (CPU) microprocessor is the “brain” of a computer, the place where arithmetic and logical operations happen. It has evolved (and keeps evolving) following an amazing trend first pointed out by Intel co-founder Gordon Moore, based on his observations.
The trend is known today as “Moore’s Law” and, after about half a century, it still proves to be correct: it states that the number of transistors on integrated circuits doubles approximately every two years.
Due to various technical developments, twice as many transistors are “squeezed” onto a chip every two years or so; therefore each new generation of microprocessors is roughly twice as powerful as the previous one, while the price stays approximately the same.
To get a better picture of this unbelievably high rate of development, try imagining how things would look if the same principle applied, for example, to airplane speeds!
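
Just to illustrate the arithmetic behind that doubling, here is a rough sketch with illustrative starting figures (and ignoring real-world limits):

```python
# Rough illustration of Moore's Law: doubling every two years.
# The starting count and time span below are illustrative assumptions.

start_transistors = 2_300    # roughly an early-1970s microprocessor
years = 40

doublings = years / 2        # one doubling every two years
end_transistors = start_transistors * 2 ** doublings

print(f"After {years} years: about {end_transistors:,.0f} transistors")
# Prints a figure around 2.4 billion -- the same order of magnitude
# as real CPUs four decades after the first microprocessors.
```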

Such a special situation deserves a closer look, so next week we are going to tell you about the basics of CPU functionality and development.

So see you next week, folks!

Bogdan