The Nihil of Time
Unilateral Limiting and the Decomposability of Human Agency
I. Corrosive Constraints
It is natural enough, therefore, that critique is an instrument of dissolution; a regression to conditions—to the magmic power of presupposition—upon which all order floats.
The precedent of humanism as a sign of retrogressive philosophical and cultural agendas, and its corresponding mirroring in the ideologically contaminated waters of long-termism,
press upon us the task of sustaining a critical assessment constructed from the grounds of an open yet strategic probe: how can we bootstrap the programmatic dissolution of any given substantiality (i.e. an ideological purview with no scale-sensitive positionings toward such-and-such conditions) ascribed to a determined yet monotonic human subject taken as a universal paradigm? In other words, when faced with the isomorphism between humanism and corrupt, reactionary, so-called universalist projects founded on an implicit western-centric exceptionalism, we have to reassess both the pragmatic and conceptual schemas that necessarily drive us toward a continuum of critical self-effacement. For this task, we will specifically unchain the consequences of the negative temporal chemistry threading out of Kant’s critical philosophy onto the context-specific proposal of inhumanism,
from where we can productively decant a dispute of human supervenience antagonistic to the particularities of Bostromian long-termism. Within the context of the aforementioned proposal of inhumanism, human supervenience is to be considered as a set of ordered cognitive structurings that hierarchically position human sapience with regard to inflows of outward sense data and that, in a manner of maximal unforeclosure, from structuring to decomposability, open the gates to de-individuation.
Particularly for the inhumanist proposal, the conception of the transcendental order of space and time that Kant develops can be seen as a corrosive structural constraint that helps to undermine human subjectivity via retrochronical abjection, inevitably locking human sapience back onto the aprioristic transcendental or deep temporality of the inorganic. This means that the finer grained sense data that predate a coarser grained state of representational coding and structuration (i.e. the morphogenetic field underlying posterior states of becoming) take a revengeful surfeit on the cognitive structurings of human sapience, resulting in the gaping decomposition of our bioimmunological apparatus into the utmost excess of primordial data-bleed. Thus, under this view, we can posit a decomposability of the human subject, defined here as a top-down approach that begins from a stabilized and maximally representational cognitive layering that gets retroactively underpinned by a destabilizing ground, jeopardizing what we consider to be our “locating beliefs” as rational human agents in the world. Furthermore, the aforementioned strategy of subtraction would also serve to put into view the diagonalization or non-effective computability, following both Cantor’s and Putnam’s diagonal arguments, of the One or the I when considered to be an inductively confirmed generality within the context of what a universal learning machine is capable of computationally achieving in terms of a closed set of predictive results.
To this end, Putnam contests the apparent infallibility of Rudolf Carnap’s degree of confirmation of a hypothesis, or DC, here applied to the aforementioned set of closed predictive results, as follows: “Let T be any learning machine, and consider what T predicts if the first, second … and so on balls are all red. Sooner or later (if T is not hopelessly weak as a learning device) there must come an n such that T predicts ‘the nth ball will be red’. Call this number n1. If we let n1 be black, and then the next ball after that be red, and the next after that again be red, and so on, then two things can happen. T’s confidence may have been so shaken by the failure of its prediction at n1 that T refuses to ever again predict that a future ball will be red. In this case we make all the rest of the balls red. Then the regularity ‘all the balls with finitely many exceptions are red’ is a very simple regularity that T fails to extrapolate. So T is certainly not a ‘cleverest possible’ learning device.”
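Putnam’s construction can be rendered as a small computational sketch. The predictor below is a deliberately naive stand-in of our own devising (it is not Carnap’s actual confirmation function, and its name and threshold are illustrative); what matters is that an adversary contradicting any deterministic predictor whenever it commits produces a simple regularity the predictor fails to extrapolate.

```python
# A sketch of Putnam's diagonal construction, assuming a naive stand-in
# predictor: for any deterministic predictor T, an adversary that
# contradicts T whenever it commits yields a sequence on which the simple
# regularity "all but finitely many balls are red" defeats T.

def naive_predictor(history):
    """Predict 'red' once three consecutive reds have been seen; else abstain."""
    if len(history) >= 3 and history[-3:] == ["red", "red", "red"]:
        return "red"
    return None  # no prediction yet

def diagonalize(predictor, length=20):
    """Feed the predictor a growing history, contradicting every commitment."""
    history, failures = [], 0
    for _ in range(length):
        guess = predictor(history)
        if guess == "red":
            history.append("black")  # the adversarial 'n1'-style contradiction
            failures += 1
        else:
            history.append("red")    # otherwise extend the run of reds
    return history, failures

sequence, failures = diagonalize(naive_predictor)
print(failures, sequence.count("black"))  # every commitment was contradicted
```

The same adversarial loop works against any deterministic extrapolator, which is the force of the diagonal argument: no single closed predictive scheme is a “cleverest possible” learning device.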
The diagonal argument put to work by Putnam undermines an inadequate framework of explication that extracts short-sighted conclusions from a minimal informational pool, using the example of a hypothetical universal learning machine and placing at the forefront an initially contingent drift brought on by the inclusion of further evidence. The appearance of a contingent drift can help us illustrate how a compressed picturing of human subjectivity, anchored in a coarser grained, minimal yet inadequate framework, will eventually meet a chasm when faced with a finer grained and complexity-laden framework of explication, as highlighted above. Taking this into account, we can take a detour through a brief observation Nick Land makes about the diagonal argument, with the idea in mind that a formalist inductive method such as Rudolf Carnap’s is too restrictive to consider other diverging frameworks that would be open to the influx of evidence in excess. In our case, this evidence in excess corresponds to the temporal and spatial transcendental order theorized by Kant which, when seen under the purview of the Cantorian mathematics underlying the diagonal argument, leads toward the application of non-denumerable sets of intensive magnitudes: “Cantor systematizes the Kantian intuition of a continuum into transinfinite mathematics, demonstrating that every rational (an integer or fraction) number is mapped by an infinite set of infinite sequences of irrational numbers. Since every completable digit sequence is a rational number, the chance that any spatial or temporal quantity is accurately digitizable is indiscernibly proximal to zero.”
Therefore, in order to posit a complexity-laden framework, we would need to consider that such a framework must be open to the influx of divergent and initially incompatible evidence that hazardously contrasts with the conventional and short-handed evidence taken to be a given within a restricted framework. In turn, this contrast would initially show itself as contingent but would ultimately prove to be a reconstructive contrast, obliging us to inhabit a far richer language than the formalist one proposed by Carnap’s DC, one encapsulating the inclusion of transcendental conditions of the type of the space-time order and, while not endorsed by Putnam, the implicit idea that the application of intensive magnitudes corresponding to the space-time order can be seen as a corrosive yet necessary constraint on human agency.
II. Astronomical Enclosement
When weighing in on the latter, we must also consider the seedy arguments posed by long-termism: even when they picture, to a certain capacity, the corrosive constraints of the space-time order on human agency, they carry a grandiose teleological aftertaste that looks precisely to counter the catastrophe-laden consequences of transcendental temporality under the name of the conservatorship of human supervenience. In what could be considered the founding move of long-termism, “Astronomical Waste”, Nick Bostrom commits the two-fold sin of exaggerating the role of the human subject against the backdrop of an explicit authoritarian project of spatial colonization: he begins by examining the wastage of human potential when set against the ruthless and maximally entropic asymmetry of temporal scales, ultimately facing the loss of capitalizable opportunities at every turn of the clock, while continuing the boastful and uncritical expansion of the western human subject toward the very subjection of the stars.
In regard to the latter, Bostrom writes: “We might fall victim to an existential risk, one where an adverse outcome would either annihilate Earth-originating intelligent life or permanently and drastically curtail its potential. Because the lifespan of galaxies is measured in billions of years, whereas the time-scale of any delays that we could realistically affect would rather be measured in years or decades, the consideration of risk trumps the consideration of opportunity cost.” And even more bluntly, reaching the following bioimmunological conclusion: “Clearly, avoiding existential calamities is important, not just because it would truncate the natural lifespan of six billion or so people, but also—and given the assumptions this is an even weightier consideration—because it would extinguish the chance that current people have of reaping the enormous benefits of eventual colonization.”
Accordingly, if we stop to examine the bare-bones utilitarian rhetoric of Bostromian long-termism, we will be unsurprised to find the most erratic display of the degree of confirmation framework which, as previously mentioned, has for its basis a minimal pool of information, here ascribed to a determined type of human agency (i.e. a predatory colonial western subject), with the intention of acridly grounding an inflationary conception of the potentialities and statistical growth of human intellect when projected toward the future. Nevertheless, the apparently firm prognosis of the speculative progress of the western subject in time becomes voided by its disregard of any kind of complexifying or open conclusions. Such conclusions would center paradigmatic counter-evidence: contemporary theories in physics that have put forward the caustic hypotheses of a symmetry of temporal scales and backward causation, endangering the predominant and supposedly determinate phenomenological perspective of human agency that delves exclusively within asymmetrical temporal scales; and, most importantly, the explicit muting of cosmic trauma that is brought to the fore precisely by the overwriting of transcendental temporality over our own whimsical and fairly lackluster empirical givenness within the indifferent contingent flow of the universe.
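The inflationary inductive mechanics criticized here can be caricatured with a toy model. Laplace’s rule of succession is used below as a deliberately simple proxy for a Carnapian confirmation function (it is not Carnap’s actual measure, and the function name is ours): fed nothing but an unbroken run of successes drawn from a minimal evidence pool, it drives predicted confidence toward certainty.

```python
from fractions import Fraction

# Laplace's rule of succession as a stand-in (our assumption, not Carnap's
# actual DC): after `successes` out of `trials`, the posterior probability
# of success on the next trial under a uniform prior is (s + 1) / (t + 2).

def rule_of_succession(successes, trials):
    """Posterior probability of success on the next trial, uniform prior."""
    return Fraction(successes + 1, trials + 2)

for n in (1, 10, 100, 1000):
    print(n, float(rule_of_succession(n, n)))
# Confidence climbs toward 1.0, yet, as the diagonal argument shows, a
# single engineered exception at the next trial voids the extrapolated
# regularity.
```

The point of the caricature is structural: a scheme fed only confirming instances from a restricted pool inflates without bound, which is precisely the prognosis pattern at work in the long-termist projection.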
In light of this, to avoid the outlandish assertions of long-termism and to pick apart its embeddedness in the suppurating drawl of inert enlightened humanism and its ideological vows to contemporary technocapitalism, we find it necessary, as mentioned at the start of this writing, to sketch out a critical apparatus. Considering what we have previously observed, this critical apparatus will have as its starting point the overlapping of diagonalization or non-computability with the ramifying consequences surrounding the question of time as we have found them at the edges of inhumanism. This overlapping must venture outward to its own logical terminus, where we would embrace an event horizon that relishes the annihilation of any fixed locating beliefs corresponding to a poorly substantiated “I”. To put it shortly, the imperative scalpel we intend to employ has at its mire the unfeasibility of closure made explicit within the monadic play of mirrors that is long-termism, the effect of which would be the hard limiting of human agency to the potentialities found in the virtual structuring of the transcendental space-time order. If we can identify the hectic trajectories to which we are led by the irruption of the above-named virtual structuring, while considering how this structuring gets traumatically embodied in the actual as a factor of decomposability, we can then overcome the plighted biases of a single-sided frame of a rigidly enclosed human subject.
For this, we must highlight and recontextualize one of the main tenets of inhumanism: the convergence toward the abstraction of time in-itself, a tenet that in turn must become a reconfiguring and dialectical fixture of the human subject, while also considering the perceptual and conceptual irreversibility of the ripples produced by the collapse and further immersion toward and inside the space-time order (in the lingua franca of Landianism: K-space) onto our empirical selves: “In this sense K-space plugs into a sequence of nominations for intensive or convergent real abstraction (time in itself): body without organs, plane of consistency, planomenon, a plateau.” From this passage we can extract the following thought: the convergence toward the abstraction of time in-itself can be explicated as the unraveling of a clash between intensive and extensive qualities that becomes traversable when unmaking and re-arranging a given state of things into a dynamic refiguring toward a yet-to-be-known state of things. Put differently, in the vocabulary of the rationalist continuum leading from Spinoza to Deleuze, the weaving between real and numerical distinctions can be used to implement the Kantian-critical purview of temporality and the measuring of its effects by way of intensive qualities, i.e. the unmaking and re-arranging of a given state of things, and the notation of the cascading immersion of these effects on our empirical register by way of extensive quantities, i.e. a yet-to-be-known state of things that can be made visible through bare statistical measurements.
Using these ideas as a form of grounding, we can find ways to revel in the tools offered by contemporary science and physics (an ordeal beyond the limits of this writing), tools that have at their center the intensive qualities of time, against the impoverished utilitarian perspective of long-termism, which banishes any kind of depth from the merely extensive measurements used to eternalize an undisputed western human subject.
III. Multiscalar Temporality as a Heuristic Critique
Following our line of argument, and in order to sketch out a potentially robust naturalization of the negative impact of time as a structural constraint on human subjectivity, we would need to observe that general contentions about inhumanism’s decomposability of human agency become inevitable; for this, we will first take into consideration two concrete instances of fallibility. The first is the seemingly unorthodox interpretation of Boltzmann’s theory of thermodynamics, read by Land as a transcendental law and as working proof of the fixed tendency of matter toward its own dissolution in time. We will argue that Land’s justification of the aforementioned proof, intended to undermine any teleological conception of time and therefore any telos embedded within human agency, reverses precisely into an axiomatic view that predetermines the behavior of phenomena in time. Land delights in his own fallible device when writing the following: “Any process of organization is necessarily aberrational within the general economy, a mere complexity or detour in the inexorable death-flow, a current in the informational motor, energy cascading downstream, dissipation. There are no closed systems, no stable codes, no recuperable origins. There is only the thermospasmic shock wave, tendential energy flux, degradation of energy.”
Toward this end, we will use an exemplary lesson taken from Mark Wilson’s critique of “ersatz rigorism” in his analysis of the contextual ills of reductive axiomatic formalism, which is taken to be epistemologically complex while remaining fundamentally bare. Following Wilson, we can recast Landian temporal catastrophe as a variant of what we may call a “non-linear ersatz complexity”: a supposed increase of complexity obtained by utilizing non-linear systems as a model, and in particular the dynamics of non-asymmetrical temporal flows, which does not suffice, as it never goes beyond a bare axiomatic proposal centered on the idea of primordial entropic unbecoming as a deregulative fixture haunting any kind of negentropic formation. On this point, Reza Negarestani poses a related critique of Land’s apparent complexity: non-linear models of explication meant to prove the dissolution of matter and order end up undermined by their binding to fundamentally closed systems that depend on a predictive economy of self-regulation and largely unrevised locating beliefs: “The dissipative rate is energetically conceived as an economical (and hence, restricted) correlation; its existence is dictated by the exorbitant index of exteriority but its modi operandi are conditioned by the affordability of the interiorized horizon of the organism.”
With this in mind, the second instance we will consider is the “non-successive and unsegmented zero of intensive extinction” behind “flatlining”, which further compromises the complexities suggested by Land, as it implicitly suggests a flat picture of time acting as a unilateral or unidirectional limit whose consequence is an orthodox form of eschatology rather than an intensive and multiscalar conception of time. For the latter, we will also build on the observations by which Wilson answers the depthless formalist rigor of the so-called scientific epistemologies developed by much of twentieth-century European and Anglo philosophy with recent developments in contemporary science and computation: multiscalar architectures that can tackle the problem of having various interacting submodels within a dominant structure (and accordingly, a dominant behavior), without subtracting the importance of what happens whenever we immerse downward and inward into further levels of complexity: “The basic trick is to position a variety of individual modeling tactics (called ‘submodels’) within a coordinated architecture that shifts between these components in a controlled, checks-and-balances manner. By doing so, a multiscale plotting can resolve the computational hazards that Terence Tao calls the ‘curse of dimensionality’: keeping track at one time of all the interactive variables relevant to a complex system will easily swamp the capacities of the most compendious of imaginable computers.”
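The “curse of dimensionality” the passage invokes can be made concrete with a minimal sketch (the function name and the figures chosen are our illustration, not Wilson’s or Tao’s): sampling every interacting variable of a system on a full tensor grid grows exponentially with the number of variables, which is the computational hazard that coordinated submodel architectures are designed to sidestep.

```python
# Toy illustration of the curse of dimensionality: the cell count of a
# full tensor grid over d interacting variables, each sampled at r levels,
# is r ** d, so it explodes exponentially in d.

def grid_points(resolution, dimensions):
    """Cells needed to track `dimensions` variables at `resolution` levels each."""
    return resolution ** dimensions

for d in (1, 3, 10, 30):
    print(d, grid_points(100, d))
# At 100 levels per variable, 30 interacting variables already demand
# 10^60 cells, swamping "the most compendious of imaginable computers".
```

A multiscale architecture replaces the single monolithic grid with coordinated submodels over far smaller variable subsets, exactly the checks-and-balances shifting Wilson describes.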
Thus, if we now turn to Huw Price’s contention about time as phenomenologically perceived by human agents in an asymmetrical manner, with a tendential behavior that affects things in-time and that becomes strained when set side by side with the subjectless perspective from the nowhen, a perspective that does not have the forward causation belonging to the phenomenological perspective as its basis, we can then piece together the idea that there might be a yet-to-be-deciphered dominant behavior of time, rather than a restricted nature of time, that would include both asymmetrical and symmetrical directional flows as submodels. The enrichment of perspectival and non-perspectival temporalities and behaviors permitted by a multiscalar conception of time demonstrates that Land’s inhumanist idea of time is deeply conservative when contrasted with developments that might contain and defy human agency without recurring to implicitly threadbare concepts such as the nature of time.
With the above, we would like to suggest that even if we are prone to find evident dangers leading toward orthodoxy and dogmatism in Land’s decomposability of human agency, we can also find productive ways of circumventing the substantive status surrounding human agency by using post-Landian responses as a starting point toward a robust hybridization of conceptual strategies and tools across the continental and analytical philosophical traditions, one that can ultimately re-adjust rather than abandon the open-ended intellectual persuasions left behind by the inhumanist proposal. Thus, we will stand by a critique of a substantive “I” as a perspectival given in philosophy by way of what has been termed an “inessential indexical”, a conception rooted in the conflict between relational properties and self-names that Ruth Garrett Millikan develops in The Myth of the Essential Indexical. On this note, we would highlight that the Millian observation underlying Millikan’s critique of Lewis’ and Perry’s essential indexicality, with its capability of underwriting self-names, carries the repercussion of espousing not organismic and holistic but decomposable and process-laden approaches to self-situatedness within the world, and to the inner-workings and external-networkings that come to be associated with the act of self-naming.
Put differently, and borrowing from Patricia Reed’s parallel remarks on this subject, one does not begin from homophily to circumscribe within homophily; rather, when one subjects homophily from the inside to the knifings of immanent critique, it reaches outward to the evidence in excess that challenges our perspectival givens. From this, we can observe an act of inward folding that unfolds toward alienation, which would be another name for what in Mark Fisher’s gothic materialist proposal is known as an implex (the fold outwards from within), i.e. a Leibnizian-Spinozist kernel that feeds back onto an immanent critique of indexicality, generating strategies in which human subjectivity can be undermined while situated fully within the grasp of determinate material constraints. In turn, we could haphazardly define these material constraints as those belonging to our developing historical context as they adapt to our own cognitive domain, jeopardizing any agentic givens, which are and must be dialectically revisable through the conflict between the manifestly ideological and the scientific critique of our own positioning in the world. This could also bring forward a necessary grip with Louis Althusser’s Marxist critique of humanism as an ideological construct when faced with the contingent and time-sensitive material constraints that define the relations of socioeconomic production, relations that have in turn also produced the mirage of substantive selves.
Finally, and as an ongoing project, we would like to lead these conceptual syntheses to their ultimate consequences as a painstaking subtractive labor that could eventually piece together a discerning view of the unidirectional “nihil of time” via speculative physics together with the anthropic perspective developed by Boltzmann. What would this imply? Taking into account Price’s reconstruction of Boltzmann’s discovery of statistics as a turning point in his theory of entropy, prompted by the mathematicians Zermelo and Poincaré’s dispute of the second law of thermodynamics, in which they highlight that states of lower entropy or organizational influx in the universe, even when considered to be of bare statistical importance, do inevitably happen, we are led to tentatively conjecture that what is most improbable within an asymmetric time scale can and will happen. That is, the aprioristic flux of transcendental space-time conditions eventually does make itself tangible when formalized within our bare perceptual view of phenomena in time (i.e. necessary entropy), meaning that eventually we could also consider the convergence of initially divergent senses of time revolutionizing our sense of selves. Price writes the following: “Life on Earth depends on a continual flow of energy from the sun, for example, and would not be possible without a low-entropy hot-spot of this kind. In light of considerations of this kind, Boltzmann suggests that it isn’t surprising that we find ourselves in a low-entropy region, rare as these might be in the universe as a whole. This is an early example of what has come to be called anthropic reasoning in cosmology. By way of comparison, we do not think that it is surprising that we happen to live on the surface of a planet, even though such locations are no doubt rather abnormal in the universe as a whole.”
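The Zermelo–Poincaré point, that statistically negligible ordered states nevertheless occur given enough time, can be rendered as a toy calculation. Assuming a fair-coin model of our own devising (not Price’s or Boltzmann’s formalism), we compute, via a standard linear recurrence, the probability that an “improbable” ordered run of heads, a local low-entropy fluctuation, appears somewhere in a history of n flips.

```python
# Probability that n fair coin flips contain at least one run of k heads.
# q[i] is the probability that the first i flips contain NO such run;
# conditioning on the position of the first tail gives the recurrence below.

def prob_run(n, k):
    """Probability of at least one run of k consecutive heads in n fair flips."""
    q = [1.0] * k  # fewer than k flips can never contain a run of k heads
    for i in range(k, n + 1):
        # a run-free prefix starts with j heads (0 <= j < k) then a tail,
        # followed by a run-free sequence of length i - j - 1
        q.append(sum(q[i - j - 1] * 0.5 ** (j + 1) for j in range(k)))
    return 1.0 - q[n]

print(prob_run(10, 8))      # short history: the ordered run is rare
print(prob_run(10_000, 8))  # long history: the "improbable" is near-certain
```

The rare fluctuation is not forbidden but merely deferred: extend the history and its occurrence approaches certainty, which is the statistical shape of the conjecture above.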
By following Boltzmann’s purview, we can find in the idea of implausible hypotheses becoming plausible in given time an attractive strategy that can be filtered through the lens of both a heuristic method of critique as developed by complexity theory and the epistemics of surprisal, methods that are entropy-tolerant but not overrun or ruled exclusively by entropy. For the idea of heuristics, we must turn to the observation made by William C. Wimsatt in regard to evidential excess: rather than becoming a limiting factor to be discarded in the building of scientific theories and research, this excess can become a reconfiguring factor toward the implementation of models that are error-prone or that contend with the eventuality of a surplus of evidence. Wimsatt writes: “A more realistic model of the scientist as problem solver and decision maker includes the existence of such limitations and is capable of providing real guidance and a better fit with actual practice in all of the sciences. In this model, the scientist must consider the size of computations, the cost of data collection, and must regard both processes as ‘noisy’ or error-prone.” This in turn becomes even more crucial when faced with the entropy-tolerant framework proposed by Anil Cavia, where our own situatedness and posterior representational coding of what happens in the world is necessarily bootstrapped by initial states of contingency and entropy; but rather than giving in to the informational saturation of entropy, we might as well face the corrosion of impact and build upon the ruins of our now dislocated beliefs.
Cavia writes: “All the knowledge we have is of uncertainty, there is no means of disentangling judgement from contingency. Surprisal is precisely the idea that our capacity to learn is grounded in an attempt to absorb new forms of entropy as information, and that the negation of intelligence is a reversion to pattern. Here, encoding is an in-situ theory of knowledge in formation, an ontogenesis founded in the tension between freedom and constraint, not so much a dialectics as an informatics of pattern and surprisal.” The play of informational flux, as has been highlighted circularly throughout this writing, is a key factor leading to a revision of the status of our own agency as humans, all the more so when the informational flux can potentially be utilized for the purpose of modifying determinate mechanisms and functions within a complex system. We can illustrate this by moving forward with William Bechtel and Robert Richardson’s remarks on the issue of localization and decomposition, using the French chemist Lavoisier’s “structural decomposition of water into constituent molecules”: most functions and operations belonging to a particular mechanism can, once located, be potentially decomposed so as to be structurally recomposed in an isomorphic manner, with the corresponding functions potentially mapped on a one-to-one basis. The compelling factor of decomposition and posterior structural recomposition is the failure or divergence of functions, which could likely, when taking both heuristics and the epistemics of surprisal to their limit, synthesize diverging functions that would put at stake any configuring factors taken to be essential within the mechanisms of a complex system.
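The surprisal Cavia invokes admits a compact information-theoretic gloss. The sketch below assumes Shannon’s standard definition (the negative log-probability of an outcome), which is narrower than Cavia’s epistemics but captures the core relation his passage turns on: improbable events carry more information when absorbed, while a fully expected pattern carries none.

```python
import math

# Shannon surprisal: the information content, in bits, of observing an
# event of probability p. Rare events are "entropy-laden"; certain ones
# are pure pattern, carrying zero information.

def surprisal(p):
    """Shannon surprisal, in bits, of an event with probability p."""
    return math.log2(1 / p)

print(surprisal(1.0))       # a fully expected pattern carries 0 bits
print(surprisal(0.5))       # a fair coin flip carries 1 bit
print(surprisal(1 / 1024))  # a rare event carries 10 bits
```

In these terms, “the negation of intelligence is a reversion to pattern” reads as a learner that only ever encounters zero-surprisal outcomes, absorbing no new entropy as information.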
Thus, if we take the monstrosity of decomposition seriously, on what speculative stakes could we bet? Laura Tripaldi poses a challenge when analyzing soft intelligent technologies such as spider silk, whose complexities in function and location make it a difficult subject for structural recomposition via biomimicry, and whose diverging results read as functionally efficient although botched in replicating a one-to-one mapping, lacking in imagination.
With this, we can elaborate some closing observations: surprisal and heuristics can help rebound speculative monstrosity onto the inbound conditions of human agency, not only by way of decomposition when taking in the escalating effects of the intensive order of time, but also by contemplating what comes thereafter, structurally recomposing our agency so as to shake the retrogressive beliefs condensed in humanism, leading us to stranger ways of inhabiting a world full of statistical possibilities and reversibly treading the neighborhood of escalating outcomes that at this point in time inevitably rear us toward our own extinction. We turn to facility zero in order to jump in degrees from the despotic order of representation to the inevitable intrusion of intensive futurity.