10 min read · Mar 3, 2021


Don’t you ever just sit down, take a moment, you know, a minute or two, and think about certain things? For example, how the Universe was formed. Like, seriously, the Big Bang theory, I want to believe it, but we are all kind of wondering about it, right? Come on, you know you’ve been thinking about it too. Too many gaps in the story. I mean, an explosion?? Duh, what caused the explosion?? Yeah, these are but a few of the things that sometimes keep me up at night. Take gravity, for example. Okay, science does tell us that gravity is basically a pulling force that a body of mass exerts, I think, but there’s more to gravity than that. Or, even closer to home, the human brain. This mystery called the human brain is almost unsolvable.

The mystery that still fascinates me the most, even among all this colossally big stuff in the universe, has got to be the computer. This is something I would say I do not understand one bit, even though I am typing on one and I am a Computer Science major. Funny, right? I know.

Now, let us go way back to the invention of the first computer. But before we begin, let us define a computer. The most generic definition of a computer is “something that computes”. That brings us to the word compute, which the Oxford dictionary defines as “to calculate something”.

So that would mean that, at its core, a computer is basically just a calculator. Good, at least we are getting somewhere, right? Now let us get back to the invention of the computer, or in this case, the calculator.

Now hear me out, I think the first form of the computer must have been the fingers. You see it, right? A perfect decimal number system. Ten fingers make a perfect calculator, and humans must have used them as the first calculators. I mean, think about it, things have to start from somewhere. I imagine that back then, a long time ago, humans, being less smart than they are right now, might have gone about their activities bound between these parameters: ten fingers. It has to be. If they were trading, let’s say, cows, I imagine ten was the maximum number of cows they would trade, because past that they would have nothing to reference. Remember, at that point, we were kind of really dumb.

So my imagination drives me to believe that the number ten served man well earlier, when humans lived in very small herds, probably a man, a woman and maybe two or three kids. Then the population starts to increase exponentially and man can no longer keep track. At this point, man is adapting and getting smarter. I think man has now realized that he can use his ten fingers to count more by jumping back to the start after every ten fingers counted. And I think this counts as a breakthrough. Man can now count up to 60, 100, 120, and this is actually very useful. Now man can have a bigger family, man can trade fifty cows, man can marry as many wives as man wants.
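A quick Python sketch of that counting trick (the function name here is my own, purely illustrative): every full cycle through the ten fingers is remembered as one jump, and whatever is left stays on the fingers.

```python
# Counting past ten with only ten fingers: full cycles of ten become
# remembered "jumps", the remainder stays on the fingers.
def finger_count(items):
    jumps, fingers = divmod(items, 10)
    return jumps, fingers

finger_count(57)  # (5, 7): five full cycles of ten, plus seven fingers
```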

But then, with time, the population gets larger, civilizations start to form and mix, and people are meeting other people from elsewhere. Humans are no longer dealing with simple tallying but with actual arithmetic. Addition. Man wants totals. Man realizes he needs to keep track of all he owns. Now man knows he owns 200 cows, maybe 25 wives and probably 83 children, but he wants to know the total. How should he do this? Should he line up all his cows, children and wives and count from 1 until he is done? Well, at first he does this and he is happy with himself, but with time he gets more children, more wives and more cows, and he has to do it all over again each time. Also, some wives, children and cows die.

At this point man is tired of tedious counting. He needs to do math, and so man invents the most basic operation, addition. Now you should know, all other operations are some form of an addition operation. Man is happy with addition; he then adds in subtraction for when he loses property. But time flies by, man is getting smarter, he is interacting with a lot more of this world where civilization is growing rapidly, the number of things to calculate is getting exponentially larger, and the human brain can’t keep up. Then enters probably the most defining tool of all time, drum rolls please… the abacus.
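That claim, that all other operations are some form of addition, is easy to demonstrate. Here is a toy Python sketch (the function names are mine, not from any library): multiplication as repeated addition, and exponentiation as repeated multiplication, which is itself repeated addition all the way down.

```python
def multiply(a, b):
    """Multiply two non-negative integers using nothing but addition."""
    total = 0
    for _ in range(b):
        total += a
    return total

def power(base, exp):
    """Raise base to exp by repeated multiplication (itself repeated addition)."""
    result = 1
    for _ in range(exp):
        result = multiply(result, base)
    return result
```

So `multiply(7, 6)` walks 0 → 7 → 14 → … → 42, exactly the way a patient herder would count his cows in groups.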

The abacus works on a place-value system: the rightmost rod is the ones place, followed by tens, hundreds, thousands. A horizontal bar runs across the rods. Below the bar sit five beads on every rod, and above the bar sit two beads. A bead below the bar counts as either a 1 or a 0, while a bead above the bar counts as either a 5 or a 0. A bead counts as zero if it neither touches the horizontal bar directly nor touches another bead that is in contact with the bar. A bead counts as its 1 or 5 when it touches the horizontal bar directly or touches a bead that is in contact with the bar. This is the entire principle of how the abacus works.
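That rule can be captured in a few lines of Python, under the description above (five one-beads below the bar, two five-beads above, place value per rod). The representation is my own invention: each rod is a pair counting how many beads of each kind are pushed to the bar.

```python
def abacus_value(rods):
    """Read an abacus. `rods` is ordered from the ones place upward;
    each rod is (one_beads_at_bar, five_beads_at_bar)."""
    total = 0
    for place, (ones, fives) in enumerate(rods):
        assert 0 <= ones <= 5 and 0 <= fives <= 2, "impossible bead count"
        total += (ones + 5 * fives) * 10 ** place
    return total

# 275 on three rods: ones rod has one five-bead at the bar (5),
# tens rod has two one-beads and one five-bead (7), hundreds rod has two (2).
abacus_value([(0, 1), (2, 1), (2, 0)])  # 275
```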

So the abacus was a breakthrough in math for man, because he was able to add numbers of more than four digits without having to do physical counting, which by now you know how bad that is. Man has had a breakthrough. He can now calculate with numbers of up to ten digits, at perhaps a hundred times the speed of the previous methods. Man is now happy. But by now you have discovered it doesn’t stay that way for long, right? The thing about us human beings is that we never get satisfied.

After some time, man is tired of having to physically move beads across boards every time he needs a calculation performed, and by now his brain has matured enough to come up with something better. What if he builds something complex that does not just help you do the calculations but does them for you? And while at it, why not sell it to the highest bidder and get rich? Now man is starting to innovate in terms of wealth acquisition. Enter the father, Charles Babbage.

Charles Babbage’s idea, funded by the British government, was to produce a machine capable of printing mathematical tables accurate to 20 decimal places, computed by the method of differences, which lets you tabulate polynomials using nothing but addition. Because of the expense of the metal parts needed for the full 20-decimal machine, he was only able to build a scaled-down version that produced tables up to 6 decimal places. He called this machine the Difference Engine.

The Difference Engine.

Now of course, a man who has a vision cannot be swayed. Babbage drew up blueprints for an improved Difference Engine that could calculate to around 30 decimal places while using fewer moving parts, and then went much further: he designed a general-purpose machine that was programmed with punched cards and produced output through a printer, a bell and a curve plotter. This was legendary. It meant more calculation in the hands of humans, driving science and engineering forward exponentially. He named this machine the Analytical Engine. And now man is happy.

The Analytical Engine

So man has all this enormous computing power, he should be satisfied, right? Well… not quite. The machine itself was a marvel to behold, but it needed a bit of an edge, you know, something to push it out of its comfort zone. This is where Ada Lovelace, the first computer programmer, enters the chat😎. She was thinking a tad bit more about the applications of this beautiful piece of machinery. Why only numbers? she asked. She figured that human beings do not speak in math, and she saw that the machine could in principle manipulate any symbols, not just numbers. She did translate from French to English, by the way, just not with the machine: she translated Luigi Menabrea’s French paper on the Analytical Engine into English and added her own notes, including what is regarded as the first computer program, an algorithm for computing Bernoulli numbers.

Now this changes everything. The machine is no longer viewed as a tool for math only; in principle it can work with letters as well as numbers. And no, she did not invent a language called Ada; the programming language Ada was created more than a century later and named in her honour.

Can I be sincere? This is the part where my understanding of computers stops, because from here I am completely tripping. Up to this point we were moving beads and using punch cards to program computers, but from here on, I have no idea what’s going on. Our next journey has to take us to the evolution of the processor.

Remember vacuum tubes, otherwise called thermionic valves? Those were the switching elements of the first generation of computers. Then came transistor technology in the second generation, then integrated circuits, then Very Large Scale Integrated circuits (VLSI).

Now these computers all have one thing in common: their instructions are given using code. Are you listening? They are controlled by code. Basically the most advanced tech in all of eternity is controlled by a bunch of letters A to Z, digits 0 to 9 and some other silly symbols????? That is complete madness. And now for more madness. I mean, who even seriously sat down and decided to invent code?

The very first coding language was machine language. Now, before we continue, you need to know that a modern computer is built fundamentally on transistors. Transistors are made of what is known as a semiconductor. As a science student, I remember that a semiconductor is basically a material whose properties lie between those of a conductor and an insulator. That simply means that depending on what conditions you put it under, you get an electrical conductor or an insulator. This is the fundamental work of a transistor: switching. From this, you reckon a transistor exists in only two fundamental states. It is either allowing current to flow or not.

Now, because of this binary nature of transistors, computers only understand binary. A bunch of self-proclaimed gods came together and decided that these two fundamental states would be represented by a 0 and a 1, hence binary. So at the core of the modern computer, only binary is understood, and that’s why in the early days, mathematicians programmed computers directly in binary. As you can guess, this is difficult as hell; having to move away from humanity’s default decimal number system, which feels etched directly into our DNA, is a pretty gruesome job.
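To get a feel for what those early programmers lived with, here is a small Python sketch (my own helper, not a standard routine) that turns a friendly decimal number into the binary the machine actually understands, by repeated division by two:

```python
def to_binary(n):
    """Binary digits of a non-negative integer, via repeated division by 2."""
    if n == 0:
        return "0"
    bits = []
    while n > 0:
        bits.append(str(n % 2))  # the remainder is the next bit, low to high
        n //= 2
    return "".join(reversed(bits))

to_binary(42)  # "101010"
```

Python's built-in `bin(42)` does the same job, but walking through the divisions yourself shows just how alien this base is to ten-fingered creatures.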

Of course man was going to have to make a better way to code computers. Man decides to come up with a better way; enter mnemonics. A mnemonic is described as a system, such as a pattern of letters, ideas or associations, which assists in remembering something. This is exactly what mnemonics did: this bunch of self-proclaimed nerds created short forms for commonly performed operations as well as memory locations. All someone needed to do to program a computer at this point was copy values into specific memory locations, apply an operation to them, and copy the output to another memory location, and boom, the math was done. This made it easier for humans to do their work. But again it posed a great challenge: machines don’t understand mnemonics, so of course some very dedicated mathematicians had to write an entire layer, the assembler, to translate these mnemonics into machine code.
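That translation layer can be sketched in a few lines of Python. To be clear, the mnemonics and opcodes below are made up for illustration; they belong to no real instruction set.

```python
# A toy assembler: each mnemonic maps to a 4-bit opcode (invented here),
# followed by a 4-bit operand, giving one 8-bit machine word per line.
OPCODES = {"LOAD": "0001", "ADD": "0010", "STORE": "0011"}

def assemble(program):
    """Translate lines like 'ADD 3' into binary machine words."""
    words = []
    for line in program:
        mnemonic, operand = line.split()
        words.append(OPCODES[mnemonic] + format(int(operand), "04b"))
    return words

assemble(["LOAD 2", "ADD 3", "STORE 4"])
# ["00010010", "00100011", "00110100"]
```

A human writes three readable mnemonics; the machine receives three strings of bits. That one layer of translation is the whole trick.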

And again, with time, man realizes he can do better. So he comes up with another language cluster, now called third-generation languages. These languages are a leap forward: structured languages with defined syntax, semantics and a whole lot of rules, translated into machine code by a compiler. They offer better ways to manage memory and, down the line, they paved the way for graphical user interfaces, where a computer is operated by pressing icons on a screen. Magnificent, isn’t it? These languages include C, FORTRAN, PASCAL and BASIC.

Man uses these languages to do a great many things, even land man on the moon. But like we know, man is never satisfied. Man realizes soon enough that he is being very repetitive: he keeps writing the same code to perform tasks that are more or less similar, and it feels tedious and time-wasting. What if he could write one bare-bones piece of code intended to perform a certain type of job, and then only have to pass parameters? That way, man can use one code snippet to do a million other jobs.

This is where object-oriented languages swooped in. Some really smart people decided they needed a better way, and so they built languages that organize code into classes and objects. This approach caught on because it mimicked the real world as it is. Why, you ask? If human beings were going to create programs that would really make an impact on their lives, then these programs had to be modeled after the real world. So came Python, C++, PHP, Java, etc.
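A tiny Python sketch of the idea, in the spirit of our cow-trading man (the class and its names are my own invention): one class, written once, stamps out as many objects as you like, each carrying its own parameters.

```python
class Herd:
    """One piece of code, reused for cows, goats, or anything countable."""

    def __init__(self, animal, count):
        self.animal = animal
        self.count = count

    def add(self, n):
        self.count += n

    def describe(self):
        return f"{self.count} {self.animal}s"

cows = Herd("cow", 200)   # one object built from the class
goats = Herd("goat", 50)  # a second object, same code, different parameters
cows.add(10)
cows.describe()   # "210 cows"
goats.describe()  # "50 goats"
```

The same `add` and `describe` code serves every herd man will ever own; he only changes the parameters.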

And even now, the sheer complexity of the things man has built with these languages is unfathomable. What a journey.

So far the computer has been a tool that humans have to give instructions to. But man wants an equal. I have no idea why, but he seems to want a life form with the same intelligence as his own. Now, since man has been unsuccessful in finding such life forms here on earth or beyond, he has this stupid idea that if he can’t find them, why not build them instead??? I know, right? Seriously immature of him. But no one can stop man. Fifth-generation languages enter the chat.

Fifth-generation languages aim to teach computers to be more human-like, in what is known today as artificial intelligence. They involve a lot of visual aids, which is how humans learn. These languages have built-in natural language processing, very user-friendly interfaces and a lot of parallel processing, just like a human being. Some say they will take advantage of the next level of computing, known as quantum computing.

Remember how far we have come? This may not give you the complete picture, but I hope it brought you closer to demystifying the wonderful tool that is “THE COMPUTER”.




Software & AWS Cloud Engineer