We are already data waiting to be recorded. We are not individual substances, but manifolds of differences that are bonded socially and intersubjectively by the manner in which such manifolds of differences are recorded, operated, ordered, permutated, and reconditioned. Any movement or gesture that we perform can be recorded as syntactic information. Even what is called noise can be codified since it is only finite incompression. To be invisible here is not to exist. The sensors today that track and surveil our activity are thus only exposing the informational nature of existence. Theocracy is a computational regime since it is based on the instructions we are given in order to operate. The instructions are always initially revealed rather than deduced. The rules for control come from outside of the system initially. A fundamental theocratic movement has to occur wherein the administrative rules are not immanently revealed, but imposed from a transcendent source. And, when there will be a vast power available to read anyone’s mind (now that the truth of thought’s relation to existential space and time has been revealed) as the brain is itself extended, how can one be sure that those who will wield that power will not do so in a corrupt fashion? What besides a theocracy will constrain them? (Noah Horwitz, An Sich, 210)
If any move or gesture that we perform can be recorded as syntactic information, then we are, in fact, looking at a computational theocracy. For Horwitz, syntax, a formal relation in which one sign adds information regarding another, precedes semantics. How does it stand with the originary hypothesis? There is a question here of whether the hypothesis entails ontological assumptions, and another question about whether it transcends the semantics/syntax distinction and introduces a fresh vocabulary. At any rate, syntax and semantics, if we’re working with these traditional linguistic categories, need to be complemented by pragmatics, which in these concluding remarks comes to play an implicit role in Horwitz’s analysis. Syntax, in the sense of subject-predicate or topic-comment relations, doesn’t emerge until the creation of the declarative sentence—but that doesn’t mean there isn’t a more originary syntax, in which the gesture on the periphery predicates the center as the center of that periphery. Do we go further, and embed the originary event in the animal and natural worlds from which it distinguishes its participants? Is that necessary, or is agnosticism in such matters both viable and preferable? The latter is a position I’ve maintained so far, and so, I think, have those in GA, but no one has ever really posed, much less pressed, the question. Horwitz does so here, implicitly. If computation implies the incomputable, since, among other reasons, the data analyses carried out computationally always generate more data than they could have already computed, then something like an originary syntax is the word of God, and the only real constraint on worldly power. This is an important move to make in countering liberalism and its fetishism of checks and balances, but for any thinking beyond liberalism as well, because no occupant of the center can deny the incomputable, nor need he in order to see to his own succession.
I have suggested before, without claiming any philosophical competence, that a basic relation of measurement between “things,” insofar as they share the same universe, is a minimal assumption of reality. Two particles in the same space, however far apart and separated by filters that mitigate any effect they might have on each other, register the effect the other has, in their respective locations within a given time frame. Anything that affects the composition of another thing in any way is being measured by that other thing. Of course, it could only be measuring it for a third thing, which could in fact be either the first thing or the second thing in a different time frame. So, two things that are different are the same insofar as they “have” length, and then their difference is recoded as a difference in length (but also along every other possible measurement)—all we have to do to provide for an originary syntax is to say that the differences along all those measuring dimensions precede their being different things (and produce their difference as things) and are therefore prior to semantics (not necessarily to pragmatics, though). This seems to me to provide a basic syntax and a basis for computation, insofar as the measuring must become reciprocal and the measurements themselves must be measured on a different level or in a different register, so that measurement becomes cause and generates the operations of what it measures, which in turn increases the granularity and complexity of its own measuring operations. To measure is to assert that two or more things are different in relation to the same: different lengths, different masses, different velocities, etc. On the originary scene, then, a new mode of measurement is created—the sign measures all moves and gestures in terms of the center.
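To make this a bit more concrete, here is a minimal sketch, entirely illustrative and assuming nothing beyond the paragraph above (the names and representation are my own, not Horwitz’s): “things” are treated as bundles of measurements along dimensions, two things are “the same” only along a dimension they share, and their difference is then registered as a difference along that shared dimension.

```python
# Minimal sketch: a "thing" as a bundle of measurements along named dimensions.
# Two things are comparable -- "the same" -- only along dimensions they share;
# their difference is then recorded as a difference along each such dimension.

from dataclasses import dataclass, field

@dataclass
class Thing:
    name: str
    measures: dict[str, float] = field(default_factory=dict)  # dimension -> value

def differences(a: Thing, b: Thing) -> dict[str, float]:
    """Register a and b as 'the same' along each shared dimension,
    and return their difference along each of those dimensions."""
    shared = a.measures.keys() & b.measures.keys()
    return {dim: a.measures[dim] - b.measures[dim] for dim in shared}

rock = Thing("rock", {"length_m": 0.3, "mass_kg": 2.0})
stick = Thing("stick", {"length_m": 1.1, "velocity_mps": 0.0})

# Only "length_m" is shared, so only there is a difference registrable (about -0.8).
print(differences(rock, stick))
```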
On the face of it, computational theocracy sounds like a return to a Cartesian dualism of primary and secondary qualities, but that’s only if (as it seems to me Horwitz contends) we treat measurement as a solely mathematical matter, rather than a broader question of symmetry or fit—of the kind that is necessary to make a scene a scene (and that therefore makes any numbers meaningful in the first place). My breakdown of sociality into ritual, juridical and disciplinary scenes is a way of positing as few forms of fitting, understood as scenic design, as possible (if I absolutely have to, I can reduce it all to disciplinary scenes, as I once did in an Anthropoetics essay, but that pushes things further into incommunicability than I would like right now). An idiom is a form of measurement, and maybe “idiom” can serve as the single term for “fit” across the different spaces—we can always identify the vocabulary, constructions and grammar of a particular idiom, and even define idioms in terms of a particular fit. This is how stacking works: one item must be fit along with other items into a larger item, and the larger items in turn need to be broken down in new ways into smaller items. We can then compute a very large number of ways in which items at various levels of the stack can be fit into each other, drawing upon the ways they are already fit into each other and are composed of natural items that already fit into each other in ways that have been “designed” over eons. With this logic of computation in motion, there is only one thing left for humans to do: position themselves at the end of particular processes of computation and determine whether, at the particular scale at which the human is presently being constituted, there is a fit or not. This is not only an irreducible and indispensable position that only humans will ever be able to take up; it will also be an enormously demanding one, involving high levels of learning and training, an ability to improvise and take responsibility on the spot, and the capacity to convey one’s decisions back to the computational order (the center) in such a way that new modes of fittingness will result from the next round.
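A toy sketch of the stacking logic as described here, with hypothetical names and no claim to any real system: smaller items are fit into larger items, larger items are re-decomposed for new fittings, and a human sits at the end of the pipeline to affirm or deny the proposed fit at the scale that concerns them.

```python
# Toy sketch of stacking and fit (illustrative only): items compose into larger
# items, decompose back into parts, and a human verdict closes each round.

from dataclasses import dataclass, field

@dataclass
class Item:
    name: str
    parts: list["Item"] = field(default_factory=list)

def compose(name: str, parts: list[Item]) -> Item:
    """Fit several items together into a larger item."""
    return Item(name, parts)

def decompose(item: Item) -> list[Item]:
    """Break a larger item back down into its parts, to be re-fit elsewhere."""
    return item.parts

def human_verdict(item: Item) -> bool:
    """The irreducible human position: is this a fit, at this scale?
    Here only a stub; in practice, judgment, training, improvisation."""
    return len(item.parts) > 0

bricks = [Item("brick") for _ in range(3)]
wall = compose("wall", bricks)
house = compose("house", [wall, Item("roof")])
print(human_verdict(house), [part.name for part in decompose(house)])
```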
We can already practice fitting in all of our practices. There’s a minimal syntax to everything you do, every move you make. Paying attention to a particular thing in a particular way is a syntax. How is paying attention to something like some other way you might have paid attention to that particular thing, or like some other thing you could have paid attention to in that way? Singularized succession in perpetuity is enacted here as well: if you think in terms of your practices being indispensable and retrievable by some future practice that will derive unmistakably from those practices, while also being unrecognizable as related to them by anyone other than those who have retrieved the originary practice, then you will be practicing in computational terms. This is a kind of reworking of Peirce’s pragmatist maxim that all the consequences you consider to follow from the object of your inquiry comprise the entirety of your conception of that object. What thinking and practice then involve is resolving your practices into the most minimal syntactic elements, transportable and combinable with an unlimited range of other practices. Once you drop the Big Scenic Imaginary, and stop thinking about what you say or do in terms of its reception and acknowledgment by other individuals modeled on your own desire for reception and acknowledgment there is no other way of thinking about what you do than by seeking to indemnify it in relation to all the things that can happen as a result.
The Large Language Models bring computation back to language and provide a new direction for inquiring into and practicing language (designing idioms): breaking into the training level of the increasingly sophisticated LLMs. This will be the case once it becomes possible to load and train an LLM with your own data (all of the texts you want to be part of your reserve mind) and connect it to all existing data sources, so that you can then ask it, for example, to create an argument between Homer and Isaiah about how to interpret a recent decision of the Supreme Court in terms of the legal practices of ancient Egypt. The pieces don’t quite fit here (and if they do, it will be possible to introduce some misfit), and so what the trained database takes to be Homer’s view of what the legal practices of ancient Egypt comprise, how they apply to that SC decision, and how Isaiah’s contrary view would take “Homer” to a new consideration of, say, battle and honor, will have holes that you will notice and more that you can learn to notice. This is where the training material becomes evident and where you can guess how your previous determinations of fit are playing out. In the end, you might want to be able to say something transformative about that SC decision, or some element of it, that no one else would be able to say. We can, of course, include forward-looking questions regarding, say, the technological implications of a court decision, seen from a carefully curated literary perspective. You will be able to tell the LLM to design these kinds of prompts itself, but that will always be a way for you to enter back into the training materials. (This is all obviously preliminary and suggestive, and once various media are articulated you will be able to generate, say, theme music for a SC decision that might have been written by Thomas Carlyle, etc.) All of this will encourage knowledge of our cultural heritages and a more thorough appropriation of them—it will be possible to show, very concretely, that building a database out of materials with more “integrity” will produce stronger results across the board.
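A preliminary sketch of the workflow imagined above, with all names hypothetical and no particular model or library assumed: a personally curated corpus (the “reserve mind”) is searched for relevant passages, which are folded into a prompt asking the model to stage the Homer/Isaiah argument about a court decision; `complete` stands in for whichever LLM endpoint or locally trained model one actually uses.

```python
# Sketch only: curated corpus -> crude retrieval -> composed prompt -> model call.
# `complete` is a deliberate stub; plug in your model of choice.

def complete(prompt: str) -> str:
    raise NotImplementedError("replace with a call to your own trained/loaded LLM")

reserve_mind = {
    "homer.txt": "(your curated Homer texts)",
    "isaiah.txt": "(your curated Isaiah texts)",
    "egyptian_law_notes.txt": "(your notes on ancient Egyptian legal practice)",
}

def retrieve(query: str, corpus: dict[str, str], k: int = 3) -> list[str]:
    """Crude keyword-overlap retrieval; a real system would use embeddings."""
    words = set(query.lower().split())
    scored = sorted(corpus.items(),
                    key=lambda kv: len(words & set(kv[1].lower().split())),
                    reverse=True)
    return [text for _, text in scored[:k]]

def staged_argument(decision: str) -> str:
    """Compose the prompt from retrieved passages and ask the model to stage
    the argument, marking where the sources run out (the holes you then notice)."""
    context = "\n\n".join(retrieve(decision, reserve_mind))
    prompt = (f"Drawing only on the passages below, stage an argument between "
              f"Homer and Isaiah about how to interpret the following Supreme "
              f"Court decision in terms of the legal practices of ancient Egypt. "
              f"Mark clearly where the sources run out.\n\n"
              f"PASSAGES:\n{context}\n\nDECISION:\n{decision}")
    return complete(prompt)
```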
We will then be able to control much of the data we feed back into the computation system—make it both more useful for worthwhile purposes and more tailored to ensure the usefulness and value of those purposes. This would entail what is, again, already possible now, which is thinking of yourself as a walking bundle of syntaxes and stacks of syntaxes in various relations of fit and misfit with the centralized data deposits. Everything can be reduced to what will become the extremely complex and generative activity of determining that this is the same. This is a model of health, this is a model of well-being, this is a model of love, of justice, of devotion, of succession. The computation system will spit out proposals regarding whether a particular treatment, lifestyle, decision, or gesture constitutes a sample of these terms, or generates an idiom that can house them—and all of us will simply affirm or deny in each case, with a growing knowledge of the ramifications of doing so. But in that case this is what we’re doing already, even if our affirmations and denials must be composed more complexly due to the degree of misfitting pretty much everyone, regardless of values or politics, is confronted with—we are not yet ready to have our yeas be yeas and our nays be nays. Our idioms, then, will be registering these misfits, complaining, in a sense, that we can’t just affirm and deny in more immediate and immediately understood ways—this is to be made into the main form of resentment, a resentment we keep communicating to the center.
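A last sketch, again hypothetical in all its names, of the affirm/deny loop just described: the system proposes that some concrete case is “the same” as a model (of health, of justice, of devotion), and the human verdict, with room for registering misfit, is collected and fed back toward the center as new data.

```python
# Sketch of the human-in-the-loop affirm/deny cycle (illustrative names only).

from dataclasses import dataclass

@dataclass
class Proposal:
    model: str   # e.g. "health", "justice"
    case: str    # the concrete treatment, lifestyle, decision, gesture
    claim: str   # the system's claim that the case is a sample of the model

@dataclass
class Verdict:
    proposal: Proposal
    affirmed: bool
    misfit_note: str = ""   # the idiomatic "complaint" registering misfit

def review(proposals: list[Proposal], judge) -> list[Verdict]:
    """Run each proposal past the human judge; collect verdicts for feedback."""
    return [judge(p) for p in proposals]

def feed_back(verdicts: list[Verdict]) -> None:
    """Stand-in for returning verdicts to 'the center' as new training data."""
    for v in verdicts:
        print(v.proposal.model, "affirmed" if v.affirmed else "denied",
              "|", v.misfit_note)

props = [Proposal("health", "this treatment plan", "this is a sample of health")]
feed_back(review(props, lambda p: Verdict(p, affirmed=True, misfit_note="fits, for now")))
```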
"Once you drop the Big Scenic Imaginary, and stop thinking about what you say or do in terms of its reception and acknowledgment by other individuals modeled on your own desire for reception and acknowledgment there is no other way of thinking about what you do than by seeking to indemnify it in relation to all the things that can happen as a result"
I can readily confess that I'm having considerable difficulty distinguishing that there bit from the thought that one should be content to just talk to oneself. Indeed, you have averred on the GAlist recently that you cannot stress enough how little you care whether anyone agrees or disagrees with you, and here you appear to be moving that assertion into the theoretical troposphere...