The Chronology of Computation
One thing that keeps recurring in my thoughts is what computation truly is. The word computation is often used to describe any phenomenon that involves some sort of process, or the act of the process itself. Yet I find the term formally defined only in the fields of computer science and mathematics. Naturally, the concept is so abstract that it can be applied to any field of study. Computer science and mathematics provide precise definitions and tools to formalize and study computation, and hence we designed and formalized the Theory of Computation: a formal system, derived from the formal sciences, that describes, models, and analyzes the inherent properties of computation. The Theory of Computation is the purest, most distilled form of computation. It isolates the essence of computation, stripping away context (e.g., hardware, software, or real-world implementation) to study its core principles.
Yet I often wonder where this idea originated. Naturally, life itself computes - the process of evolution, the process of thinking, the process of reasoning, the process of physics, and the process of the entire universe. These are all computations in their own right. Strictly speaking, though, our view of computation is traditionally deterministic - it can branch out into quantum models, so that quantum systems can themselves be regarded as computational.
We didn't necessarily discover that the world and beyond 'computes', though - we discovered that we can compute. The idea of computation is as old as human civilization itself. The ancient Greeks, for example, used mechanical devices to perform calculations. The abacus, a simple counting tool, dates back to ancient Mesopotamia and China. The Antikythera mechanism, an ancient Greek analog computer, was used to predict astronomical positions and eclipses. These early devices were the precursors to the notion of computing, demonstrating the human desire to automate and enhance our ability to process information. In this historical view, computation was calculation: rooted in physical tools, it involved deterministic processes aimed at solving specific problems, like counting or astronomical prediction.
Then symbolic logic introduced Boolean algebra, bridging reasoning and computation. Symbolic logic and Boolean algebra revolutionized computation by providing the foundation for answering Boolean, or yes/no, questions and enabling systems to reason about decisions. With symbolic logic, computation gained the ability to model binary decisions, which form the core of modern computing.
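As a minimal sketch of that bridge (every name below is purely illustrative, not drawn from the text), a yes/no decision can be expressed as a Boolean formula and evaluated mechanically:

```python
# A minimal sketch of Boolean reasoning: a yes/no decision built from
# primitive logical operations. All variable names here are illustrative.

def should_grant_access(is_authenticated: bool, is_admin: bool, is_locked: bool) -> bool:
    # The decision is a Boolean formula: (authenticated AND admin) AND NOT locked.
    return (is_authenticated and is_admin) and not is_locked

# Enumerating all inputs recovers the truth table of the formula.
for auth in (False, True):
    for admin in (False, True):
        for locked in (False, True):
            print(auth, admin, locked, "->", should_grant_access(auth, admin, locked))
```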
It was not until the Entscheidungsproblem, posed by David Hilbert, that this question was sharpened: Hilbert conjectured that there should exist an algorithm to determine the truth or falsity of any mathematical statement from a set of axioms. That is, given a mathematical statement, there should exist a procedure to decide whether it is provable from the axioms or not. Computation is naturally connected here, since it is itself a process for producing yes/no answers.
Thus, computation was born out of necessity. As mathematics advanced, it became necessary to formalize computation to answer fundamental questions about the limits of calculation and the solvability of problems. In fact, to derive computation, we first need to derive what a formal system is. A formal system provides the elements, rules, and structure (essentially, the building blocks), while computation is the act of dynamically applying those rules to manipulate the elements and achieve specific outcomes. As the concept of formal systems was refined, so too was the notion of computation. Gödel demonstrated that there are statements that cannot be proven within a system, and this idea parallels Turing's proof that there are problems (e.g., the Halting Problem) that no algorithm can solve.
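That parallel can be made concrete with Turing's classic diagonal argument: if a general halting decider existed, we could construct a program that contradicts it. The sketch below is illustrative only; `halts` is a hypothetical oracle that cannot actually be implemented.

```python
# A sketch of Turing's diagonal argument. Assume, for contradiction, that a
# total function halts(program, argument) exists and correctly answers whether
# program(argument) ever terminates. (No such function can exist.)

def halts(program, argument) -> bool:
    ...  # hypothetical oracle; cannot be implemented for all programs

def paradox(program):
    # Do the opposite of whatever the oracle predicts about running
    # `program` on its own source.
    if halts(program, program):
        while True:        # loop forever if the oracle says "halts"
            pass
    return "done"          # halt if the oracle says "loops forever"

# Feeding paradox to itself yields a contradiction: if halts(paradox, paradox)
# is True, paradox(paradox) loops forever; if it is False, paradox(paradox)
# halts. Either way the oracle is wrong, so no general halting decider exists.
```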
In its mathematical essence, computation is the process of transforming inputs into outputs through a sequence of well-defined rules. It is a formalized abstraction of problem-solving, rooted in symbolic manipulation and governed by logic, algorithms, and systems. Alan Turing, in his seminal 1936 paper "On Computable Numbers, with an Application to the Entscheidungsproblem," characterized computation as follows: "A function is said to be computable if its values can be found by a purely mechanical process." This mathematical view also traces back to Alonzo Church and his work on the lambda calculus, where he described computation in terms of recursive functions and symbolic transformations: "The notion of an effectively calculable function is precisely equivalent to the notion of a recursive function or a function defined by lambda calculus." Building on this foundation, the Church-Turing Thesis proposed that anything computable in the real world can also be computed by a Turing Machine, a theoretical model of computation. This thesis solidified the modern definition of computation, uniting all abstract notions of calculability under a universal framework.
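To make the idea of a "purely mechanical process" concrete, here is a minimal sketch of a Turing machine simulator; the states, alphabet, and transition table are invented for illustration, and the toy machine simply flips every bit of a binary string and halts at the first blank.

```python
# A minimal sketch of a Turing machine: a finite control table plus an
# unbounded tape. This toy machine flips every bit of a binary string.

BLANK = "_"

# transition table: (state, symbol) -> (new_state, symbol_to_write, head_move)
RULES = {
    ("scan", "0"): ("scan", "1", +1),
    ("scan", "1"): ("scan", "0", +1),
    ("scan", BLANK): ("halt", BLANK, 0),
}

def run(tape_input: str) -> str:
    tape = dict(enumerate(tape_input))   # sparse tape: position -> symbol
    state, head = "scan", 0
    while state != "halt":
        symbol = tape.get(head, BLANK)
        state, write, move = RULES[(state, symbol)]
        tape[head] = write
        head += move
    # read back the non-blank portion of the tape
    return "".join(tape[i] for i in sorted(tape) if tape[i] != BLANK)

print(run("10110"))  # -> "01001"
```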
The 1930s laid the theoretical foundations of a formal system of computation: a system that could mechanize logical reasoning, formalize the concept of algorithms, and establish the universal framework for understanding what can and cannot be computed. Theory became practice, and von Neumann, building on these foundations, developed the architecture of modern computers, where stored programs and binary logic transformed theoretical computation into physical, programmable machines. That is, "Any abstract scheme of computation must ultimately be embodied in some form of machinery." Almost a century later, computation has evolved into a pragmatic, practical notion: the systematic execution of well-defined rules or algorithms, implemented on physical or abstract machines to solve problems, process information, and model systems.
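The stored-program idea can be sketched as a single memory holding both instructions and data, driven by a fetch-decode-execute loop; the three-instruction machine below is a hypothetical illustration, not any historical instruction set.

```python
# A minimal sketch of the stored-program idea: instructions and data live in
# the same memory, and a fetch-decode-execute loop interprets them.
# The three-instruction set here is purely illustrative.

def run(memory):
    acc, pc = 0, 0                      # accumulator and program counter
    while True:
        op, arg = memory[pc]            # fetch and decode
        pc += 1
        if op == "LOAD":                # acc <- memory[arg]
            acc = memory[arg]
        elif op == "ADD":               # acc <- acc + memory[arg]
            acc += memory[arg]
        elif op == "HALT":
            return acc

# Program and data share one memory: cells 0-2 are code, cells 3-4 are data.
memory = [("LOAD", 3), ("ADD", 4), ("HALT", 0), 2, 40]
print(run(memory))  # -> 42
```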
This understanding extends from theoretical foundations to practical implementations across disciplines. Computation now transcends its original boundaries, becoming a universal framework employed across diverse domains: bio-computation, where cellular processes and genetic mechanisms are modeled as computational systems; quantum computation, which leverages the principles of quantum mechanics to solve problems intractable for classical systems; computational physics, where simulations and numerical methods unravel the complexities of the universe; and fields like economics, medicine, linguistics, and artificial intelligence, where computation enables modeling, optimization, and decision-making. That is, computation is intertwined with our modern view of the algorithm: the process of transforming data into a desired output through a finite sequence of well-defined operations, systematically executed on physical or abstract machines to achieve precise objectives, optimize processes, and model complex systems.
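As one concrete instance of that definition, Euclid's algorithm (a standard textbook example, not drawn from the text above) transforms a pair of integers into their greatest common divisor through a finite sequence of well-defined operations:

```python
# Euclid's algorithm: a finite sequence of well-defined operations that
# transforms an input (two integers) into a desired output (their gcd).

def gcd(a: int, b: int) -> int:
    while b != 0:
        a, b = b, a % b    # each step strictly reduces the problem
    return a

print(gcd(48, 18))  # -> 6
```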