Hi, thanks for tuning into Singularity Prosperity. This video is the first in a multi-part series on computing; in it, we'll be discussing the evolution of computing, more specifically, the evolution of the technologies that have brought about the modern computing era. The purpose of this video is to appreciate how fast technology is evolving and the people who have brought us to this point! Many inventions have taken several centuries to develop into their modern forms, and modern inventions are rarely the product of a single inventor's efforts. The computer is no different: its bits and pieces, both hardware and software, have come together over many centuries, with many people and groups each adding a small contribution.
We start as early as 3000 BC with the Chinese abacus. How is this related to computing, you ask? The abacus was one of the first machines humans ever created for counting and calculating. Fast forward to 1642, and the abacus evolves into the first mechanical adding machine, built by the mathematician and scientist Blaise Pascal. This first mechanical calculator, the Pascaline, is also where we see the first signs of technophobia emerging, with mathematicians fearing the loss of their jobs to progress. Also in the 1600s, from the 1660s to the early 1700s, we meet Gottfried Leibniz, a pioneer in many fields, most notably known for his contributions to mathematics and considered by many the first computer scientist. Inspired by Pascal, he created his own calculating machine, able to perform all four arithmetic operations.
He was also the first to lay down the concepts of binary arithmetic, the way all technology nowadays communicates, and he even envisioned a machine that used binary arithmetic. From birth we are taught to do arithmetic in base 10, and for most people that's all they're concerned with: the numbers 0 to 9. However, there are an infinite number of ways to represent information, such as octal (base 8), hexadecimal (base 16), often used to represent colors, and base 256, used for character encoding; the list goes on. Binary is base 2, represented by the digits 0 and 1. We'll explore later in this video why binary is essential for modern computing.
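As a quick illustration, here's the same number written out in a few of the bases just mentioned, using nothing but Python's built-in conversions:

# The same quantity in several of the bases mentioned above.
n = 201
print(bin(n))  # 0b11001001 -> binary, base 2
print(oct(n))  # 0o311      -> octal, base 8
print(n)       # 201        -> decimal, base 10
print(hex(n))  # 0xc9       -> hexadecimal, base 16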
Back on topic, progressing to the 1800s
we are met with Charles Babbage. Babbage is
known as the father of the computer, with
the design of his mechanical calculating
engines. In 1820, Babbage noticed that many
computations consisted of operations
that were regularly repeated and
theorized that these operations could be
done automatically.
This led to his first design, the difference engine: it would have a fixed instruction set, be fully automatic through the use of steam power, and print its results into a table. In 1830, Babbage stopped work on his difference engine to pursue his second idea, the analytical engine. Elaborating on the difference engine, this machine would be able to execute operations in a non-sequential order through the addition of conditional control, store memory, and read instructions from punched cards, essentially making it a programmable mechanical computer. Unfortunately, due to lack of funding, his designs never became reality, but if they had, they would have sped up the invention of the computer by nearly 100 years. Also worth mentioning
is Ada Lovelace, who worked very closely
with Babbage. She is considered the world's first programmer and came up with an algorithm to calculate Bernoulli numbers, designed to work with Babbage's machine.
She also outlined many fundamentals of programming, such as data analysis, looping and memory addressing. Ten years prior to the turn of the century, with inspiration from Babbage, American inventor Herman Hollerith designed one of the first successful electromechanical machines, referred to as the census tabulator. This machine would read U.S. census data from punched cards, up to 65 at a time, and tally up the results. Hollerith's tabulator became so successful that he went on to found his own firm to market the device; this company eventually became IBM. To briefly explain
how punched cards work: once a card is fed into the machine, electrical contacts attempt to close through it. Where the holes in the card are punched determines which connections are completed, and therefore what input is registered. To enter data onto a punched card you could use a key punch machine, aka the first iteration of a keyboard!
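Here's a toy model of that idea in Python, a simplified sketch rather than the actual Hollerith card encoding: a column is just the set of row positions that have been punched, and a value is registered wherever the circuit can close.

# Simplified punched-card reading: a hole at a row position lets the
# electrical contact close, so that row's value is registered as input.
CARD_ROWS = range(10)  # a simple numeric card with rows 0-9

def read_column(punched_rows):
    # Only rows with a hole complete a connection and get counted.
    return [row for row in CARD_ROWS if row in punched_rows]

print(read_column({7}))     # [7]    -> the digit 7
print(read_column({2, 5}))  # [2, 5] -> two punches in one column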
The 1800s were a period when the theory of computing began to evolve and machines started to be used for calculations, but the 1900s is when we begin to see the pieces of this nearly 5,000-year puzzle coming together, especially between 1930 and 1950. In 1936,
Alan Turing proposed the concept of a
universal machine, later to be dubbed the
Turing machine, capable of computing
anything that is computable. Up to this
point, machines were only able to do
certain tasks that the hardware was
designed for. The concept of the modern computer is largely based on Turing's ideas. Also starting in 1936, German engineer Konrad Zuse invented the
world's first programmable computer. This
device read instructions from punched
tape and was the first computer to use
boolean logic and binary to make
decisions, through the use of relays. For
reference, boolean logic is simply logic that results in either a true or false output, or, in binary terms, a one or a zero. We'll be diving deeper into boolean logic later in this video.
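To make that concrete, here's a tiny Python sketch of the basic boolean operations, showing how every true/false result maps directly onto the binary digits 1 and 0:

# Truth table for AND, OR and NOT; int() maps False/True to 0/1.
for a in (False, True):
    for b in (False, True):
        print(int(a), int(b),
              "AND:", int(a and b),
              "OR:", int(a or b),
              "NOT a:", int(not a))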
Zuse would later use punched cards to
encode information in binary, essentially
making them the first data storage and
memory devices. In 1942, with the Z4, Zuse also released the world's first commercial computer. For these reasons, many consider Zuse the inventor of the modern-day computer. In 1937, Howard Aiken, with his colleagues at Harvard and in collaboration with IBM, began work on the Harvard Mark 1 Calculating Machine, a programmable calculator inspired by Babbage's analytical engine. This machine
was composed of nearly 1 million parts,
had over 500 miles of wiring and weighed
nearly 5 tons! The Mark 1 had 60 sets of
24 switches for manual data entry and
could store 72 numbers, each 23 decimal
digits. It could do 3 additions or subtractions per second; a multiplication took 6 seconds, a division took 15.3 seconds, and a logarithm or trig function took about 1 minute. As a funny side note, one of
the primary programmers of the Mark 1,
Grace Hopper, discovered the first
computer bug, a dead moth blocking one of
the reading holes of the machine.
Hopper is also credited with coining the
word debugging! The vacuum tube era
marks the beginning of modern computing.
Vacuum tubes were the first fully digital technology and, unlike the relays used in previous computers, were less power-hungry, faster and more reliable. Between 1937 and 1942, John Atanasoff and his graduate student Clifford Berry built the first digital computer, dubbed the ABC. Unlike previously built
computers like those built by Zuse, the
ABC was purely digital - it used vacuum
tubes and included binary math and
boolean logic to solve up to 29
equations at a time. In 1943, the Colossus
was built in collaboration with Alan
Turing, to assist in breaking German
crypto codes, not to be confused with
Turing's bombe that actually solved
Enigma. This computer was fully digital
as well, but unlike the ABC was fully
programmable, making it the first fully
programmable digital computer. In 1946, construction of the Electronic Numerical Integrator and Computer, aka the ENIAC, was completed. Composed of nearly 18,000 vacuum tubes and large enough to fill an entire room, the ENIAC is considered the first successful high-speed electronic digital computer. It was somewhat programmable, but like Aiken's Mark 1 it was a pain to rewire every time the instruction set had to be changed. The ENIAC essentially took the concepts from Atanasoff's ABC and elaborated on them on a much larger scale. While the ENIAC was under construction, in 1945, mathematician John von Neumann contributed a new
understanding of how computers should be
organized and built, further elaborating
on Turing's theories and bringing clarity to the ideas of computer memory and addressing. He elaborated on conditional branching and subroutines, something Babbage had envisioned for his analytical engine nearly 100 years earlier, and on the idea that instructions, that is, the program running on a computer, could be stored and modified in the same way as data and encoded in binary. Von
Neumann assisted in the design of the
ENIAC's successor, the Electronic Discrete Variable Automatic Computer, aka the EDVAC, which was completed in 1950 and was the first stored-program computer. It could execute over 1,000 instructions per second. He is also credited with being the father of computer virology with his design of a self-reproducing computer program. And it contains
essentially those things which the
modern computer has in it, although in
somewhat primitive form. This machine has
the stored program concept as its major
feature, and that in fact is the thing
which makes the modern computer
revolution possible! At this point you
can see that computing had officially
evolved into its own field: from mechanical devices, to electromechanical relays that took milliseconds, to digital vacuum tubes that took only microseconds. From
binary as a way to encode information
with punched cards, to being used with
boolean logic and represented by
physical technologies like relays and
vacuum tubes to finally being used to
store instructions and programs. From the
abacus as a way to count, Pascal's
mechanical calculator, the theories of
Leibniz, Alan Turing and John von
Neumann, the vision of Babbage and the
intellect of Lovelace, George Boole's contribution of boolean logic, the
progressing inventions of a programmable
calculator to a stored-program fully
digital computer and countless other
inventions, individuals and groups. Each step was a further accumulation of knowledge. While the title of inventor of the computer may be given to an individual or group, it was really a joint contribution over 5,000 years, and more so between 1800 and 1950.
Vacuum tubes were a huge improvement over relays, but they still didn't make economic sense at a large scale. For example, of the ENIAC's 18,000 tubes, roughly 50 would burn out per day, and a round-the-clock team of technicians was needed to replace them. Vacuum
tubes were also the reason why computers
took up the space of entire rooms,
weighed multiple tons and consumed
enough energy to power a small town! In 1947, the first transistor was invented at Bell Labs, and by 1954 the first transistorized digital computer, the TRADIC, had been built. It was composed of 800 transistors, took up only 0.085 cubic meters of space compared to the ENIAC's 28, consumed just 100 watts of power and could perform 1 million operations per second.
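To put those figures in perspective, here's a quick back-of-the-envelope calculation on the numbers just quoted (nothing beyond the values stated above):

# Rough comparison using the figures quoted above.
eniac_volume_m3, tradic_volume_m3 = 28, 0.085
eniac_tubes, tradic_transistors = 18000, 800
print(f"Volume shrank by roughly {eniac_volume_m3 / tradic_volume_m3:.0f}x")             # ~329x
print(f"Switching elements dropped by roughly {eniac_tubes / tradic_transistors:.0f}x")  # ~22x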
Also during this era, we begin to see
major introductions on both the hardware
and software aspects of computing. On the hardware side, the first random-access memory device, the magnetic-core store, was introduced in 1951 by Jay Forrester; in other words, the beginnings of what is now known as RAM. The first hard drive was introduced by IBM in 1957; it weighed one ton, could store five megabytes and cost approximately 27,000 dollars per month in today's money. On
the software side is where a lot of major innovations and breakthroughs began to arrive, because computer hardware and architecture were becoming more standardized, instead of everyone working on different variations of a computing machine. Assembly, the first programming language, was introduced in 1949 but really started taking off in this era of computing.
Assembly was a way to communicate with
the machine in pseudo-English instead of
machine language, aka binary. The first truly widely used programming language was Fortran, invented by John Backus at
IBM in 1954. Assembly is a low-level language and Fortran is a high-level language. In a low-level language, while you aren't writing instructions in raw machine code, a very deep understanding of the computer's architecture and instruction set is still required to write a working program, which means only a limited number of people have the skills, and the process is very error-prone.
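To get a feel for the gap between the two levels, here's a rough illustration, using Python bytecode rather than 1950s machine code, of how even a one-line high-level expression expands into a sequence of lower-level instructions:

import dis

# A single high-level expression...
def average(a, b):
    return (a + b) / 2

# ...is translated into a series of low-level load/add/divide/return steps.
dis.dis(average)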
Also in the early to mid-1950s, translating code into machine code was still an expensive and time-consuming process. This all changed
with Grace Hopper and her development of
the first computer compiler. Hopper, if you remember from earlier, also found the first computer 'bug'. The compiler allowed programming computers to become more affordable and nearly instantaneous, instead of the time-consuming process of writing code in assembly and then manually converting it to machine code. As a side note, Hopper also assisted with the invention of another early programming language, COBOL. This era marks the beginning of the modern computing era, and it is where the exponential trend in computing performance really began. While
transistors were a major improvement
over vacuum tubes, they still had to be
individually soldered together. As a result, the more complex computers became, the more complicated and numerous the connections between transistors, increasing the likelihood of faulty wiring. In 1958, this all changed with
Jack Kilby of Texas Instruments and his
invention of the integrated circuit. The
integrated circuit was a way to pack
many transistors onto a single chip,
instead of individually wiring
transistors. Packing the transistors together also significantly reduced the power consumption and heat output of computers once again, and made them far more economically feasible to design and buy.
Integrated circuits sparked a hardware revolution and, beyond computers, assisted in the development of various other electronic devices thanks to miniaturization, such as the mouse, invented by Douglas Engelbart in 1964 (as a side note, he also demonstrated the first graphical user interface). Computer speed, performance, memory and storage also began to increase iteratively as ICs could pack more transistors into smaller surface areas. This is demonstrated by the invention of the floppy disk by IBM in 1971 and, in the same year, DRAM by Intel, to list a few. Along with hardware, further
advances in software were made as well,
with an explosion of programming
languages and the introduction of some
of the most common languages today: BASIC
in 1964 and C in 1971. As you can see
from throughout this video, computing
since the 1900s has evolved at an
increasingly fast rate. This led Gordon Moore, one of the founders of Intel, to make, in 1965, one of the greatest predictions in human history: computing power would double every two years at low cost, and computers would eventually be so small that they could be embedded into homes, cars and what he referred to as personal portable communications equipment, aka mobile phones. We now refer to this as Moore's Law.
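As a minimal sketch of that doubling rule, idealized as one clean doubling every two years:

# Idealized Moore's Law: capability doubles once per two-year period.
def growth_factor(years, doubling_period=2):
    return 2 ** (years / doubling_period)

print(growth_factor(10))  # 32.0   -> five doublings over a decade
print(growth_factor(20))  # 1024.0 -> ten doublings over twenty years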
Here are some charts to further illustrate
how fast computing was evolving and what
Moore based his predictions on. In Moore's own words: "One of my colleagues called this Moore's Law. Rather than just being something that chronicles the progress of the industry, it kind of became something that drove the progress of the industry. A tremendous amount of engineering and commitment has been required to make that happen, but much to my surprise the industry has been able to keep up with the projection!"
At this point the video has come to a conclusion; I'd like to thank you for taking the time to watch it. If you enjoyed it, please leave a thumbs up, and if you want me to elaborate on any of the topics discussed or have any topic suggestions, please leave them in the comments below. Consider subscribing to my channel for more content, follow my Medium publication for accompanying blogs and like my Facebook page for more bite-sized chunks of content! This has been Ankur, you've been watching Singularity Prosperity, and I'll see you again soon!