Deferral, Debt and the Idiom
Deferral and debt are closely related; we might even say that debt is the tokenization of deferral. The beauty of the concept of deferral and its irreplaceability for center study as for GA is that, unlike, e.g., “prevent,” “refrain from,” “desist,” and other near synonyms, “deferral” posits no endpoint. As long as there is humanity there is deferral. We can also distinguish deferral from another kind of delay, “deter,” and its nominalized form, “deterrence,” insofar as deferral includes deterrence (we all must defer or there’s no deferral, and therefore we all must monitor one another’s deferral) while adding a critical component: allowing for an orderly distribution. When we deter, we just watch each other; when we defer, we allow one another to proceed, to “re-imagine” the shared object of desire in creative ways. “Defer” includes deferring to the other. With deterrence there is no object, and therefore no desire—only fear. We each let the other take a piece in the “faith” that they will limit themselves to only a piece and let us have our piece in turn. The systems of exchange and negotiation this involves have no place in a system of deterrence, where one simply remains on hair trigger lest the other step over some at least somewhat arbitrary line. All this is a way of saying that in deferral we remain indebted, to each other and to the center through each other: the shared object is a gift from the center and sharing it is how we pay or, rather, pay down, the debt.
In that case, every utterance, or every sample, is both marking and repaying the debt, and in learning language we are learning the “weights and measures” of a complex system of debt. So I am here continuing my “Tokenizing Deferentiality” discussion (and “All You Need is Language” and earlier discussions), aiming at treating all human activity as linguistic use and all linguistic use as a marking and paying of debt (debt paid through its marking, which prices in degrees of enforcement and forgiveness). I refuse any numerical marking or mathematical method here (even if I wouldn’t and couldn’t prevent others from trying) because such methods merely beg the question of what you determine to be a “unit” to be measured, and such measuring already presupposes an answer to the question of the relation between what you will take to be “units”—their respective weights and measures. We have many ways within language of tokenizing deferral—indeed, if we didn’t, math would be impossible—and measuring its various increments, at various scales and within various frames of reference. We have, above all, “like,” positioned somewhere in between “same” and “other.” When you ask how much one thing is “like” another consider how many other assumptions the question and answer entail—everything is like something else in some way, and you could equally say that nothing is like anything else in other ways. You could right away introduce a scale (“1 to 10”) of “likeness,” but you will quickly find any measure along this line to be arbitrary, not least because in performing the measurement you are putting your finger on the scale in a way that would require another scale to measure, and so on. So, it’s more “accurate” instead to say that one thing is more like another than some other thing, in some regard, on some scene, from somewhere on that scene, with whatever exemptions, postponements, expirations, etc. are entailed. In that case, you’re standing for, or representing, that likeness.
Everything is to be sunk into tokenizing deferentiality, as we aim to ever simplify and make transportable and transferable (and “cheaper”) our idioms, which means that the most essential concepts, like scenicity, singularized succession, the juridical, the nomos, etc., must all be tokenizable. The word (a prime) “like” is itself, it seems, a kind of loosening or mitigation of “same”—a kind of lowering of the threshold of those things we might be able to identify as the “same” (or “other,” for that matter). Things that are like are potentially the same—we are deferring saying “this is the same”; or, we are deferring saying they are “other.” But things that are like are also things that we like—the adjective and the verb apparently are ultimately the same word. And this makes perfect sense, since the things that we like are like us by virtue of the fact that we like them. So, “like” situates us in a desiring, deferring and reciprocally signifying relation to our scene—it is how we are building our scenes and fitting ourselves into them. But then we may take the next step and introduce the adverb, “likely,” which introduces, through desire, similarity and reciprocity, the domain of probability—if something is likely to happen, then the event in question is “like” some other event that has happened or that we know will happen. The more like it is, the more we feel we can say it will happen, or has happened, with past events we have not “verified.” All the substitutions/sacrifices through which we pay our debts leverage the likelihood that things are like what we like. When we say something is “likely” we are already betting on it happening—“likely” implies at least more than a 50% probability. In that case, everything below 50% fails to meet the threshold of sayability within the idiom of liking and likening—under 50% and things are no longer like each other and one no longer likes them.
One could argue that this installs a kind of blindness, a prejudice against the unlikely, into the idiom, the idiom I would like to propose as a tokenizing one, but the answer to this is that the scale or frame of reference needs to be adjusted so as to like, find to be like, and to be likely, all that falls below this threshold—which means lowering the threshold of significance, which means increasing the sensitivity and resolution of our instruments for registering new layers of data. We have to shape our desires and therefore our scenes so as to like and be like things previously off our radar, and in that way they become likely in some respect, upon some possible scene.
I’m aiming at an analysis of language that, because it is necessarily carried out within language, is not only an analysis but an immersion and participation—which, then, pays down the debt to disciplinary analytical traditions. If someone pays you money in exchange for something you own, you “analyze” their gesture and the token offered by turning over the thing. The difficulty here is really analogous to (“like”) the difficulty posed by mathematization—as soon as you assign values to tokens you abstract the token from the events in which it will have its fluctuating values and create a metalanguage that serves more to police and regulate valuations than to directly raise and lower them. So, it’s a problem to, say, make lists of words and expressions and assign a particular increment of deferral to them—the list is itself just another use of signs, another idiom. It’s precisely because language always defeats such attempts at calculation that it is privileged as a mode of tokenization. Derrida’s marking of the constitutive iterability of any utterance is maybe the most effective way of making the point—the very fact that someone might be saying something ironically or citing rather than saying it (which really includes “irony”), and that you can never be sure, absolutely frustrates more familiar ways of tokenizing transactions.
The way through this thicket is to always be liking and likening toward the end of making something in particular more likely. We are very close to Peirce here: to say what becomes more likely as a result of what you have said is to give the meaning of what you have said. And we are enclosed within the originary hypothesis as well, which presupposes that “meaning” originates in an effort to make the survival of the group more likely, in this case precisely because liking something too much made us too much like each other. The solution turned out to be making all of us more like something else, something that is not a person, i.e., a metaperson. And it is of that metaperson that we can first of all say, “this is the same,” closing all the likes in upon identity, precisely so as to provide us with the slack to merely be like each other, rather than the same. Singularized succession in perpetuity is a concept that places the entire community “all in” regarding its continuity, which depends upon sustained, continually transferred, attention to the center. The token then comes with creating a new way, place and now of saying “this is the same,” which is a way of reorganizing and thereby recreating all of reality so as to center it on some new object of attention. That is the source of value, even if it involves authenticating the result of a prompted and targeted data search issuing in a valuation of some asset or the accreditation of some occupant of some center. You want to make social continuity more likely, and this entails making a whole “suite” of things more likely insofar as they contribute to the likelihood of social continuity. The suite gets enhanced as more data comes in, as we contribute more and better data, and therefore as all the data recording, coding, curating, collating and training keeps lowering the threshold of detectability and intelligibility.
Each and all of us have been afforded means for finding and founding new ways of saying “this is the same,” and this affordance is what we have been gifted and which we are obliged to pay forward, creating new debt as we go. We’ve been taught, by David Graeber and leftist thinkers like Maurizio Lazzarato, to see debt as a horrifying prison constructed by malevolent power actors, but debt is only an extension of the gift economy, itself only an extension of the ritual and ultimately originary scene wherein the central beast gives itself over to the group, which in turn gives up a portion of future kills to the gifting metaperson. It is a world without such reciprocal obligations that should horrify us. When someone erects and “furnishes” a new scene upon which some thing can be displayed of which we can say “this is the same” (I see what you see and see that you see that I see…), we have obligated in turn all who come onto that scene to affirm that they see that thing, and the value of the token, then, is a matter of the social “curvature” carved out in the construction of that scene whereby others are drawn into it. But we can add that any new mode of indebtedness reacts back on previous modes, “doubling down” in the manner of enforcing them or forgiving them, while replacing them with a more payable or more “likely” loan—or, of course, both, in differing degrees, soliciting different actors on this scene or its successor scenes.
The concept of citationality, for which I cited Derrida above (think about the debt accrued here), is in fact (even if Derrida never turned it in this direction) a very parsimonious way of analyzing texts, including in the manner I proposed above (some of what I do is redeeming the unfulfilled promises of poststructuralism and the “linguistic turn” more broadly). Once you’ve determined who is being cited in a particular utterance (sample), and established the entire chain of citationality (really a massive, ultimately ungraspable web), you have understood the meaning of the utterance/sample—this is just running the Peircean definition of meaning in reverse. Of course, for Derrida, it’s citation all the way down, and so it is, but citation still presupposes a “citer,” who is, after all, citing rather than not citing, citing one thing rather than another, and from one “stance” rather than another—and all of these differences introduced into the cited material concern how the cited sample will travel and endure, which might make us carriers of the Burroughsian “virus” of language (a concept very much alive in contemporary innovative writing) but I’ll stick with the pharmakon, which preserves the difference between pointing as singling out the victim for a lynching and pointing as an act of deferral, a substitute for the lynching (let’s say, the paraclete, or advocate, supplementing the katechon). So, citing the citable in a citable way is liking, likening and increasing likelihood, and this by no means restricts one to or privileges the commonplace (you don’t have to cite the most often cited)—you pay down your debt by generating new likelihoods because this involves building more deferral than you undermine by marshalling more semiotic material to the cause of deferral. Thinking in terms of citation introduces a metalinguistic dimension: you are pointing to the “piece” of language you are deploying as well as whatever you are having that piece of language refer to.
And this pertains to our completely institutionalized mode of life, where everything you do cites some prior utterance, some rule, some precedent, some model, increasingly explicitly. Tokenization heads toward citation as naming, much like one is named after a grandparent or beloved figure, but it is that which makes you singular and not just a “mention.” The most valuable token, then, is that which cites not exactly most widely but by casting the net of “likes” most widely while fishing out a name, a title, an honorific, in the most singularizing way possible.
The ideal style is one which is impenetrable to those on the scene where everything is always already the same and crystal clear insofar as one is on the infra and meta scene upon which we can say “this is the same.” This is an argument I’ve made many times, but I’m returning to it here to add that this infra and meta scene is the scene of tokenization, where a call is placed on a pedagogical future—you’re saying, not so much what everyone will necessarily be saying at some future date but what will be a constitutive element of the archive enabling anything to be said at that date. You’re liking those people and things that are likened to the furnishings, props and actors on future scenes that will be likely to like those people and things that will be likened to… It’s like a conversation anyone can join by introducing something unsaid in the conversation and yet without which nothing in that conversation could be said: some historical, institutional, conceptual, or linguistic frame of reference the traces of which can be made into outlines of the present scene. Everyone has had conversations in which the interlocutor is taking so much for granted that he has no right to and yet continuing the conversation requires ignoring that fact—in a sense, every conversation is a bit like that. We can create spaces that abolish such conversations not so much by establishing tables of agreed upon “facts” but by threading the conversation on protocols for summoning the archival traces that bear most directly upon it. We’re not thinking in terms of winning arguments here but of remaking our conceptual resources (idioms) in real time, with that being the “topic” or substance of the conversation. 
In doing so we can draw upon all the vocabularies bequeathed us by the institutions of deferral: we insure ourselves; we make deposits and withdrawals, invest, let interest accrue, short and buy put and call options, indemnify, copyright, trademark and patent, witness and subpoena, algorithmize, etc. Every sentence then becomes a summoning of the archives, or a search term, as well as a rich source of data deposited in the archive. That would be a likely scene of debt and deferral, upon which the distinction between forgiving and collecting what is owed is gradually eroded because the distinction between word and token would be dissolved in the sample. Generational debt would be forgiven once a progeny arrives who can re-found the corporate order.