Micah Levason Profile

Micah Levason

Computers work through abstraction.

Computers combine mathematics, science, and creativity to perform tasks through abstraction.

Computers operate on the principle of abstraction: complex details are concealed and only the pertinent information is exposed, so that complexity can be handled efficiently.

Abstraction Layers:

  • Hardware: At the fundamental level, computers are electronic circuits, interpreting information in binary code (0s and 1s).
  • Software: This layer comprises sets of instructions that direct the hardware's actions, presenting a more human-comprehensible form compared to binary code.
  • Operating System: This specialized software layer governs the hardware while delivering shared services essential for running application software.
  • Applications: These are specialized programs, like web browsers or word processors, that operate using the capabilities provided by the operating system.

How Abstraction Functions:

  • Simplification: Each layer reduces the complexity of the layer below it, enabling users to engage with applications without delving into binary code.
  • Focus on Relevance: Each layer emphasizes pertinent details, exemplified by not needing to consider text storage in memory while using a word processor.
  • Creativity and Problem-Solving: Abstraction aids developers in breaking down complex software challenges into manageable components.

Integration of Mathematics, Science, and Creativity:

  • Mathematics: The fundamental operations and algorithms of computers are rooted in mathematical principles, with binary code being a direct manifestation of mathematics.
  • Science: Knowledge of electricity, materials, and physics is crucial for the creation and enhancement of hardware.
  • Creativity: The design of software, user interfaces, and experiences demands creativity, focusing on transforming intricate systems into user-friendly and efficient solutions.

In summary, through the concept of abstraction, computers layer complexity, with each layer concealing the intricacies of the one below. This allows users and developers to concentrate on specific tasks without distraction, integrating mathematics, science, and creativity to deliver a broad spectrum of effective services.
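
To make the layering concrete, here is a minimal Python sketch (all function names are hypothetical, invented only for illustration) in which each function hides the details of the layer beneath it, so the top-level call never has to touch binary directly:

```python
# A toy model of abstraction layers: each function hides the details
# of the layer below it.

def to_bits(text: str) -> str:
    """'Hardware' view: the text as raw binary (0s and 1s)."""
    return " ".join(format(byte, "08b") for byte in text.encode("utf-8"))

def store(text: str) -> dict:
    """'Operating system' view: wrap the low-level encoding in a record."""
    return {"length": len(text), "data": to_bits(text)}

def save_document(text: str) -> None:
    """'Application' view: the user simply saves a document."""
    record = store(text)
    print(f"Saved {record['length']} characters; the binary stayed hidden:")
    print(record["data"])

save_document("Hi")   # the user works with text, never with 0s and 1s
```

Each layer only needs to know about the one directly below it, which mirrors the separation the hardware, operating system, and application stack provides.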

2700–2300 BC

The journey of computer science began with the abacus around 2700–2300 BC, a simple device whose positional representation of numbers foreshadows modern computer bits. This ancient tool laid the groundwork for computational thinking.

Surprisingly, the journey of computer science started thousands of years ago, around 2700–2300 BC, with a simple counting tool: the first abacus. Although the abacus was revolutionary for its time, it looks primitive compared to modern computers. What makes it interesting is that it captures the earliest idea of computation, of combining quantities in a systematic way, which remains the theme behind computing.

Let me do a breakdown:

Abacus - The Early Computer:

The abacus is not a computer in the modern sense; think of it instead as an ancient calculator. It helped people perform all sorts of arithmetic operations: addition, subtraction, multiplication, and division. The abacus consists of beads that move on rods, and depending on the positions of these beads, a particular number is represented.

Binary Positions - A Root of Modern Computing:

Funny enough, the abacus uses a positional system, much like modern computer systems use the positions of bits (binary digits) to represent data. The beads on the abacus can be compared to the bits in a computer, because in both cases values are represented by position.
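
As a small illustration of that positional idea (a Python sketch of my own, not tied to any particular abacus design), the same "position determines value" principle shows up when a number is broken into binary digits:

```python
# Positional representation: each digit contributes its place value,
# much as each rod on an abacus stands for a different magnitude.
number = 13
bits = format(number, "b")                      # '1101'
place_values = [int(digit) * 2 ** power
                for power, digit in enumerate(reversed(bits))]

print(bits)               # 1101
print(place_values)       # [1, 0, 4, 8] -- the value carried by each position
print(sum(place_values))  # 13 -- the positions add back up to the number
```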

Computational Thinking - The Legacy of the Abacus:

Computational thinking means approaching problems in a way that lets them be solved by algorithms and computation. It is about breaking complex problems down into manageable parts, looking for patterns, building step-by-step solutions (algorithms), and thinking about the data to use. The abacus, by design, encourages this sort of thinking. It is not just doing arithmetic; it is playing with numbers and finding the patterns and relationships between them.
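
As a tiny example of that step-by-step style (a Python sketch added purely for illustration, not anything specific to the abacus), here is a small problem broken into explicit, repeatable steps:

```python
# Computational thinking in miniature: break "add up a list of prices"
# into small steps and repeat them one at a time.
def total(prices):
    running = 0              # step 1: start from zero
    for price in prices:     # step 2: take each item in turn
        running += price     # step 3: update the running total
    return running           # step 4: report the result

print(total([4, 7, 2]))      # 13
```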

Thinking about the abacus, what this implies is that such a simple tool laid the foundation for computational thinking. It was a beginning: a simple apparatus that helped humans start thinking about problems and their solutions in a structured, computational way, leading us toward the complex computers we use now.

1600

By 1600, William Gilbert's experiments with amber and wool led to the discovery of electricity, a fundamental force for future technologies.

By 1600, William Gilbert had performed experiments that were crucial for the breakthrough in electricity. He noted how amber (a yellowish fossilized resin) behaved when he rubbed it with wool: the rubbed amber could attract light objects like feathers. This was significant because it was one of the first documented observations of electric forces.

Here's why that was such a big deal:

Observation of Invisible Forces: Before Gilbert, people had not really grasped that invisible forces could act over a distance. He showed that such forces genuinely exist, whether or not we can see them.

Electricity vs. Magnetism: Gilbert was also one of the first people to differentiate between magnetic forces (like those in a compass) and the electric forces he observed in his research with amber and wool, the same kind of forces behind lightning and electric sparks. This distinction would prove important, allowing later scientists to understand and then manipulate the two phenomena separately.

Basis for Future Technologies: Gilbert's discovery became a base for further research. Later work built on it to develop electric circuits, then power systems, and finally all the electronic gadgets used in the present era.

Gilbert's work was about far more than rubbing things together to produce a spark; it marked a starting point in the understanding of, and research into, the forces that would power mankind's development through the centuries to come.

1750

Benjamin Franklin furthered this understanding in 1750 by demonstrating that lightning is a form of electricity, leading to practical applications like lightning rods.

1800

In 1800, Alessandro Volta's invention of the voltaic pile, the first battery, enabled controlled use of electricity, a critical advancement for electronic devices.

1801

Shortly after, in 1801, Joseph Marie Jacquard's loom used punch cards for programming, foreshadowing computer algorithms.

1831

Michael Faraday's 1831 discovery that moving magnets near wires generates electricity paved the way for electric power generation.

1837

In the same era, Samuel Morse's telegraph in 1837 revolutionized long-distance communication with Morse code.
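
To give a feel for what that encoding looks like, here is a short Python sketch (the lookup table below covers only a few letters, chosen purely for illustration):

```python
# Morse code maps each letter to a pattern of short and long signals.
MORSE = {"S": "...", "O": "---", "E": ".", "T": "-"}

def encode(message: str) -> str:
    return " ".join(MORSE[letter] for letter in message.upper())

print(encode("SOS"))   # ... --- ...
```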

1837

Charles Babbage's Analytical Engine concept introduced the idea of a general-purpose computing machine.

1843

In 1843, Ada Lovelace envisioned computers going beyond numerical calculations, suggesting broader applications.

1847

George Boole's development of Boolean Algebra in 1847 became a cornerstone of digital circuit design.
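
A quick Python sketch of what Boolean algebra looks like in practice; the AND, OR, and NOT operations below are the same ones digital logic gates implement in hardware (the printed truth table is just for illustration):

```python
# Boole's basic operations -- AND, OR, NOT -- are the building blocks
# of the logic gates inside every digital circuit.
for a in (False, True):
    for b in (False, True):
        print(a, b, "| AND:", a and b, "| OR:", a or b, "| NOT a:", not a)
```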

1879

In the late 19th century, Thomas Edison invented the practical electric light bulb, revolutionizing the use of electricity and lighting in homes and businesses.

1880s

In the late 1880s, Nikola Tesla and George Westinghouse figured out how to send electricity over long distances using Alternating Current (AC).

This was like building super long charging cables, making it possible to power up things like computers far away from where electricity is made.

1880s-1890s

During the same period, Edison's Direct Current (DC) electricity faced off against Nikola Tesla's Alternating Current (AC).

AC ultimately won due to its efficiency in long-distance transmission, which was crucial for the future of electrical distribution and computer technology.

1904

The 20th century marked rapid advancements with John Ambrose Fleming's vacuum tubes in 1904, enabling early computers to process information.

1936

Alan Turing's 1936 Turing Machine concept laid the foundation for modern computer science.

1945

In 1945, the ENIAC, one of the first general-purpose electronic computers, was completed. This was a collaborative effort involving several key contributors, among them Jean Jennings Bartik and Betty Snyder Holberton, whose work transforming complex problems into working code set the stage for the evolution of modern programming.

1947

The invention of the transistor in 1947 by Bardeen, Brattain, and Shockley at Bell Labs revolutionized electronics with its compact size and reliability.

1957

The creation of FORTRAN in 1957 by John Backus's team at IBM made programming more accessible, especially for scientific tasks.

1958

The integrated circuit, or microchip, was invented in 1958 by Jack Kilby, with Robert Noyce independently developing his own version shortly after, leading to the miniaturization of computers.

1960s

The 1960s introduced ASCII, a standard character encoding that allowed different machines to exchange information reliably.
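
For a concrete sense of what ASCII provides, here is a short Python sketch showing how characters map to the numeric codes that machines actually exchange:

```python
# ASCII gives every character a number, so text can travel between
# machines as plain bytes.
for ch in "Hi!":
    print(ch, "->", ord(ch))    # H -> 72, i -> 105, ! -> 33

print(chr(72), chr(105))        # and back again: H i
```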

1969

In 1969, ARPANET came online, marking the birth of networked computing and laying the groundwork for the internet.

1960s-1970s

The relational database emerged in the late 1960s and early 1970s, thanks to Edgar F. Codd at IBM, enhancing data organization and retrieval.
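
As a rough sketch of what the relational model enables (using Python's built-in sqlite3 module; the table and column names are made up for illustration):

```python
import sqlite3

# In a relational database, data lives in tables of rows and columns,
# and you retrieve it by describing *what* you want, not how to find it.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE inventions (year INTEGER, name TEXT)")
conn.executemany("INSERT INTO inventions VALUES (?, ?)",
                 [(1947, "transistor"), (1958, "integrated circuit")])

for row in conn.execute("SELECT name FROM inventions WHERE year > 1950"):
    print(row)   # ('integrated circuit',)

conn.close()
```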

1970s

The 1970s saw the advent of personal computers, bringing computing power to the masses.

1980s-1990s

In the 1980s and 1990s, Philip Emeagwali's work in supercomputing demonstrated the potential of parallel processing, influencing distributed computing.
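
The key idea in parallel processing is splitting one large job across many processors at once. As a loose modern analogue (a Python sketch using the standard multiprocessing module, not Emeagwali's actual method), the same pattern looks like this:

```python
from multiprocessing import Pool

def square(n):
    return n * n

if __name__ == "__main__":
    # Split the work across several worker processes, then combine the results.
    with Pool(processes=4) as pool:
        results = pool.map(square, range(8))
    print(results)   # [0, 1, 4, 9, 16, 25, 36, 49]
```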

1990s

Finally, the 1990s marked a significant milestone with the creation and expansion of the World Wide Web, transforming communication and making information globally accessible and interactive. This period solidified the internet's role as a platform for information exchange, reshaping business, politics, and daily life.
