Modern computers handle everything from tasks as simple as performing a calculation to tasks as complex as controlling a rover on another planet, yet the secret is that computers only understand two digits: 0 and 1. These two digits form "binary data", which every modern computer understands and operates on. This raises an obvious question for any computer user: why can't a number as simple as 99 be fed directly into the computer? Instead, the system itself must generate a binary code, which for 99 reads as something more complex, like 1100011.
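As a quick sanity check on that example, a few lines of Python can show how 99 maps to the binary string 1100011. This is just an illustrative sketch using Python's built-in conversion plus a manual repeated-division-by-2 loop, not part of any particular system described here:

```python
def to_binary(n: int) -> str:
    """Convert a non-negative integer to its binary string
    by repeatedly dividing by 2 and collecting remainders."""
    if n == 0:
        return "0"
    bits = []
    while n > 0:
        bits.append(str(n % 2))  # remainder is the next lowest bit
        n //= 2
    return "".join(reversed(bits))  # remainders come out lowest-first

# The manual method agrees with Python's built-in formatting:
print(to_binary(99))        # manual conversion
print(format(99, "b"))      # built-in binary formatting
```

Both lines print `1100011`, confirming that 99 in decimal is exactly the binary pattern mentioned above.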