One of the key debates in philosophy is the debate about the nature of Being (the ontological question), the study of which is called “ontology”. Traditionally, Western philosophers focused on other fields of study first before attempting to answer the ontological question: metaphysics and epistemology. Eventually, Edmund Husserl decided that our experiences of reality were so fundamental that experience itself must be taken into account first, before any subsequent inquiry. He called this accounting of experience “phenomenology” …a term that had been used before, but not in precisely the same way it would be used after Husserl. Long story short, Husserl attracted many later philosophers who would emphasize the centrality of ontology to phenomenology, articulating rich schools of thought that included Existentialism (one of my main influences).
Among the basic insights of the Existentialists is that our experiences aren’t some abstract reflection upon a world of things in-themselves, a world that is “out there” ready to be known, indexed, and put into encyclopedic volumes of knowledge. Nor is our experience the result of a divine substance that is separate from objective reality and governed by a cosmic world wholly distinct from the material world. The Existentialists believed that, first of all, our experiences are always embodied; that is, high-level logical operations performed on conceptual entities derive from (but aren’t separate from) the sensual impressions we experience when touching, tasting, smelling, hearing, feeling, and seeing stuff. Second of all, they believed that our experiences are heavily tied up with meaning, that the significance of an experience adds a dimension of value and that this value plays a role in shaping our overall experience of everything. In other words, people aren’t merely biased in their decisions because of accepted dogma, or social norms, or whatever else; rather, experience is itself determined by the complicated ways we individually and collectively interact with a world that we are situated within as “subjects”. Therefore, in attempting to comprehend things as they are, in-themselves, it is important to first recognize that whatever techniques or methods we use to do so are a reduction of our experiences …a narrowing of our subjective experience to specific points of focus that are important in some system or another. There can never be an objective point-of-view (one that can access the objects of reality as they are directly, in-themselves), only various refinements in our attempts to understand the rules of our own experiences.
This sort of thinking has more-or-less informed post-industrial societies for the past century. Whereas an ancient Greek philosopher might debate how it is possible for a universe that is just one thing to transform itself into so many distinct things (metaphysics), then move on to develop an ethics from their conclusions …we moderns and post-moderns tend to debate the relative usefulness of each other’s biases in a universe that we are at the center of, each with our own more-or-less conflicting world(s) that we’re responding to. Sometimes we may say that these are versions of a single universe or world, that we are subjects of various paradigms, or governed in our thoughts by recognizable ideologies …but nevertheless, we’re highly aware of this relationship between our own experiences, what we think is important, and the opacity of an objective universe we try to model through scientific investigations. Very few of us will believe that if we depict the objects of our desire (like a buffet of meats and fruits and vegetables), then intensely worship that depiction, the natural response of the universe is to make those objects appear for us by whatever means necessary. We experience networks of objects with cause-effect relationships that require an account of their chain reactions to produce desired results. We are systematic in this way of being-in-the-world.
Now that I’ve written a bit about what phenomenology is (and more specifically, existential phenomenology), how it prioritizes fields of inquiry, distinctions it makes, insights it gleans …now I will get to the topic of the paper: Being-in-the-World Digitally. Michel Foucault puts an estimate on when we began to think in this modern, network-of-objects way at the Enlightenment, around the turn of the nineteenth century. It’s when Western history begins to display attempts to create taxonomies in just about all fields of knowledge with distinctly modern notions of how to use those taxonomies: cause-effect relations, evolutionary transformations, representational models (rather than analogical or metaphorical models), etc. Between the hard and the soft sciences that emerged from this sort of thinking, there was an explosion of “knowledge” that didn’t merely come from the advantages of printing technologies and democratic political systems. This explosion of knowledge reinforced the methods that produced it, as well as the institutions that regulated those methods of production. However, we are now moving towards a different sort of thinking that, although it is contingent upon the former, is characteristically its own. This has been called post-modern by some philosophers, but it has also been called digital, or genetic, or neuronal. It isn’t so much focused on producing taxonomic systems of knowledge as it is interested in dealing with taxonomic systems effectively (procedurally) …seeing that the past epoch has granted us such a mess of them.
To clarify the distinctions with some at-hand examples (before moving on to the digital component), this focus on effectively dealing with taxonomies can be related to cooking and cookbooks, pharmacology and prescribing drugs, warfare and using targeted drone strikes based on intelligence analysis. The most valuable, meaningful, useful aspect that we’re paying each other top dollar for is the expertise …or rather, the form of knowledge that this particular sort of expertise produces: intellectual property, the patent, the method. In the hard sciences, teaching the systems of chemistry and biology isn’t nearly as lucrative as patenting a new drug, a gene sequence, or a therapeutic technique. It certainly isn’t admirable to live as though God will just reward and punish us as He sees fit, and it isn’t quite enough to simply be an educated student of scientific systems; we want results, inventions, products, recipes. In the digital age, what we want is the Algorithms: the best ones.
One thing about what I said ought to be obvious to anyone reading this: there isn’t anything particularly new about algorithmic thinking …about cooking, medicine, patented inventions, and the rest. Humanity has long since relied on ordered rules to do things; and who can even say that human beings are original in that? The reason why I am using the term digital and not the term algorithmic is that there is something particular about the relationship between algorithms and digital information. There is a breadth produced by post-Enlightenment taxonomic systems and a depth produced by isolating digital components in those systems: atomic particles, genetic proteins, binary code, consumer taste profiles, early adopters of culture, etc. When these systems are able to encode particular differences down to the digit (or, the quantum or primitive level), a form of computational thinking becomes not only advantageous …but sometimes necessary.
To live in a digital world is to live in a modular world that can be reduced to its fundamental components. A world where it is not only possible to graft one thing onto another (like different species of citrus tree) to produce a new thing, but to comprehend how new arrangements of the most fundamental units can more efficiently and effectively produce something similar …such as how coding a tree’s genes can create a new species of tree, producing a variety of citrus fruits, each fruit bearing seeds that can be used to reproduce that new species of tree. There are countless other examples that I could use to highlight the difference between synthesizing higher-level parts and synthesizing fundamentals, but I hope they are unnecessary. The main point I want to emphasize about this concerns the complexity of synthesizing fundamentals. And further, the extent to which that complexity makes something like a sophisticated computer program such a valuable entity.
There are many other things that ought to be said about digital worlds. Discussions about the meaning of these bodies that our experiences are embodied in, and about how technology that can manipulate organic fundamentals should raise a brow. Discussions about the level of abstraction one reaches in moving from sense perception to conception of entities to systematic taxonomies of entities to the comprehension of taxonomized entities as compositions of digital building blocks …what it means to move through that spiral of thinking and then think about oneself in the world. What I want to focus on now is the value of good recipes, of computer code, of complex data analysis algorithms, and of designed gene sequences. The reason why is that the value of such things implies that, while on the one hand most of us can access computers and the languages to program them, on the other hand this access tends to confront us as an enormity of abstract complexity. An abstract complexity that some of us make six figures a year manipulating, but that all of us are manipulated by and left to adapt to in whatever worlds are created for us. Where some can create algorithms to compute variables necessary for drone navigation, and all of us are potential victims of drone warfare. Where some of us can hack, but all of us are always potentially hacked.
To Be Cont…