Nervous System: The Laws of Thought
With the aggressive pace of technological change and the onslaught of news regarding data breaches, cyber-attacks, and technological threats to privacy and security, it is easy to assume these are fundamentally new threats. In fact, the pace of technological change is slower than it feels, and many seemingly new categories of threats have been with us longer than we remember.
Nervous System is a monthly series that approaches issues of data privacy and cyber security from the context of history—to look to the past for clues about how to interpret the present and prepare for the future.
For many, the term “Boolean operators” has entered the lexicon in the context of formulating search queries in Google or Relativity, but that is just one specialized application of Boolean logic. In 1847, mathematician George Boole theorized that all logic could be reduced to a series of true-or-false decision points and that, by doing so, he could express any logical proposition as an algebraic equation. Boole turned human logic into simple math by looking closely at the possible outcomes of different true/false scenarios and realizing a limited set of possibilities existed. Because the values “True” and “False” can be represented as binary numbers, Boolean math can be encoded easily as machine logic for automatic processing. This nineteenth-century mathematical innovation became a building block of digital electronics.
Mathematics has a fundamental tension at its heart. On one hand, mathematical precepts make it possible to model real-world events abstractly: for example, addition can model the act of putting eggs into a basket, and subtraction can model taking eggs back out of that basket. The problem arises when those mathematical models start to suggest counterintuitive or challenging concepts: subtracting a larger number from a smaller one would result in a negative number, whereas once the basket is empty one cannot keep extracting eggs from it.
For thousands of years, mathematicians had avoided or rejected the concept of negative numbers—a philosophical absurdity at best and an aesthetic affront at worst. Certain accountants accepted the concept (to better catalog debts), but by the nineteenth century controversy still raged about whether to accept the notion of math that did not correlate directly to real-life experiences.
Into this fray came George Peacock, whose publications in the 1830s tried to settle the controversy by dividing algebra into two parts. He defined an arithmetical algebra, which imposed guardrails on the basic rules of algebra to exclude any operation that would result in an “absurd” outcome. Alongside it, he established a symbolic algebra, which left the basic rules of algebra intact but unmoored to specific interpretation. Symbolic algebra could model the behavior of conceptually challenging ideas like negative numbers, imaginary numbers, and infinities without having to concede such things existed.
The symbolic approach to algebra opened up strange new possibilities. If the variables in symbolic algebra do not have to represent numbers per se, then the rules of algebra still hold even when the variables represent something else entirely.
Boole had been fascinated by classical logic, an ancient and venerable field that had sought to codify all of human thought in a set of fundamental logical axioms. It occurred to Boole that he could model those logical axioms using symbolic algebra and therefore use math to prove the axioms correct. This would not only unite the previously disconnected fields of logic and math, but also use the symbolic tools of addition and multiplication to model the fundamental tenets of human thought.
The smallest, irreducible atom of information is a bit, short for “binary digit”: a single instance of true or false. The simplest thing one could do with bits would be to bring two of them together in some kind of interaction. Using 1 to represent “True” and 0 to represent “False,” Boole defined a limited set of operations that, by following the mathematical models of symbolic algebra, encompassed every possible outcome of a true-or-false scenario.
For example, Boolean “AND” is based on multiplication. Consider what happens when two bits are brought together and both are True. This is comparable to multiplying 1 by 1, which yields 1, for True. Any other combination (multiplying 1 by 0, or 0 by 0) produces 0, a value of False.
The Boolean “OR” produces a value of True if either input is True, and returns False only if both inputs are False. This is modeled on addition, with any nonzero sum simply counting as True.
The Boolean “XOR” (for “Exclusive Or”) offers an alternative version of the “OR” operation: it returns True only when exactly one of the inputs is True, and returns False when the inputs match (both True or both False).
The Boolean “NOT” inverts the input, producing a True if the input is False, and vice versa. This operator can be combined with any of the above to reverse its characteristics.
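To make the arithmetic concrete, the short Python sketch below models each of these operators the way the paragraphs above describe them: AND as multiplication, OR as addition with any nonzero sum counting as True, XOR and NOT built from the same ingredients. It is an illustration of the idea rather than anything Boole himself wrote, and the function names are ours.

```python
# A small illustration of Boole's operators, using 1 for True and 0 for False.
# Function names are illustrative, not drawn from Boole's own notation.

def boole_and(x: int, y: int) -> int:
    """AND modeled as multiplication: 1 only when both inputs are 1."""
    return x * y

def boole_or(x: int, y: int) -> int:
    """OR modeled on addition: any nonzero sum counts as True (1)."""
    return 1 if x + y > 0 else 0

def boole_xor(x: int, y: int) -> int:
    """XOR: True exactly when the two inputs differ."""
    return (x + y) % 2

def boole_not(x: int) -> int:
    """NOT inverts its input: 1 - 1 = 0 and 1 - 0 = 1."""
    return 1 - x

# Print a truth table covering every possible pair of inputs.
for x in (0, 1):
    for y in (0, 1):
        print(x, y, "AND:", boole_and(x, y), "OR:", boole_or(x, y),
              "XOR:", boole_xor(x, y), "NOT x:", boole_not(x))
```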
Collectively, these simple operators account for every possible true/false decision point. Boolean operators allow problems to be expressed as a series of true/false decision trees, which can then be represented using the components of symbolic logic. Put another way, to the extent that classical logicians had codified human thought as a set of axioms, Boole made it possible to turn those axioms into algorithms that could be calculated by logical machines.
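As a rough illustration of that point, the fragment below composes the basic operators into a compound proposition, much like the search queries mentioned earlier: a document “matches” if it mentions a breach AND (privacy OR security) AND is NOT a draft. The scenario and terms are invented for illustration, not drawn from any particular system.

```python
# Illustrative only: a compound true/false decision built from the basic
# Boolean operators, in the spirit of a search query. The terms are invented.

def matches(mentions_breach: bool, mentions_privacy: bool,
            mentions_security: bool, is_draft: bool) -> bool:
    # breach AND (privacy OR security) AND NOT draft
    return mentions_breach and (mentions_privacy or mentions_security) and not is_draft

print(matches(True, False, True, False))   # True
print(matches(True, False, False, False))  # False: neither privacy nor security
print(matches(True, True, True, True))     # False: excluded because it is a draft
```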
Boole outlined these operators in his 1847 book, The Mathematical Analysis of Logic, and expanded on them in his most famous work, An Investigation of the Laws of Thought, in 1854.
In the late 1800s and early 1900s, telephone companies started building vast networks of electrical relays, switches that opened or closed based on the electrical inputs they received. These networks of electrical gates created a physical instantiation of Boolean logic, demonstrating that it was possible to build an electrical machine capable of logic. That momentous fusion of telecommunications and mathematics gave birth to computer science and ushered in the Information Age.
The views and opinions expressed in this article are those of the author and do not necessarily reflect the opinions, position, or policy of Berkeley Research Group, LLC or its other employees and affiliates.