HENT DE VRIES: Good afternoon. This will be, for me this afternoon and for Mark in just a moment, a very different phenomenological experience having much to do with visuality, because I can barely see you. But for the sake of the greater good, we have the lamp on nonetheless.
It's my great pleasure and distinct honor to introduce to you Professor Mark Hansen this afternoon. Mark Hansen is Professor of Literature, Visual Studies, and Media Arts and Sciences at Duke University. Before taking up his position in 2009, he taught in the departments of English, Visual Arts, and Cinema and Media Studies at the University of Chicago, and as a tenured professor of English at Princeton University.
After having studied at NYU, the University of Paris-- [FRENCH] and [FRENCH], as one says-- and at the University of Konstanz, he received his PhD in comparative literature at the University of California at Irvine in 1994 with a project with a wonderfully evocative title, quote, "From Heidegger to Horror: The Critique of Machine Reduction of Technology from Romanticism to Contemporary Theory." Part of his work found its way into his first, still highly readable and highly recommended, book entitled Embodying Technesis: Technology Beyond Writing, published by the University of Michigan Press in 2000 with an informative foreword by Katherine Hayles.
This book lays out all the central premises and hypotheses that have guided his subsequent work at least up to this date. The book is a genuine [INAUDIBLE] that maps out the ways in which new technological media challenge theoretical assumptions concerning language and meaning while avoiding an unproductive and much-lamented clash of cultures, which is that between, say, software engineers and plain-spoken historians of science on the one hand, and the abstract-minded theoreticians, eagerly conceptualizing and criticizing technology on the other, which I think includes most of us here today.
A leitmotif of later work appears here for the first time as well. Namely, and I quote, "the confidence necessary to assert the value of bodily experience against the discursivist [? tide ?] of contemporary literary studies." End of quote. And in the words of his current colleague at Duke, Katherine Hayles, Hansen's work, like that of many of his colleagues, shares a common conviction that, quote, "human cognition does not stand apart from the world of technical objects, but remains deeply immersed within it." End of quote.
Professor Hansen's first book criticizes the ways in which the 20th-century attention to technology in Heidegger and Freud, post-structuralism and postmodernism, has gone hand in hand with the attempt to, quote, "put technology into discourse." End of quote. It does so by considering technology as a privileged [? site ?] and trope, but ultimately also as a mere stand-in and metaphor for a professed externality of and for thought. In other words, as the, quote, "representational shorthand for something else." End of quote.
In its ontic turn towards technology, not reality, this means that reality is with the same gesture reduced. For in the place of the previous emphasis on linguistic code and, say, grammatology, there now comes an identification of the technological with the machinic, as if this settled the matter. Yet this latter reduction, Hansen goes on to show, does not do justice to the material complexification of technological means and their corresponding analog, digital, and virtual experiences, which have a robustness of their own, irreducible to the mere thinking of technology as, say, cultural theory's latest transcendental signified, be it that of the machine.
What Professor Hansen distills instead, drawing on a wide array of sources in science studies and systems theory, from Friedrich Kittler via Niklas Luhmann to Hans Ulrich Gumbrecht, is, quote, "the basis for a new model of our being in the contemporary technological world." End of quote. And this means-- and I quote again-- "to eschew the habit of reading and interpreting the world as a book of signifiers." End of quote. But instead, to fathom the, quote, "extensive materialization of the spirit that is the true legacy of modern technology." End of quote.
Professor Hansen ends the book with a powerful excursus on Walter Benjamin who programmatically-- and here I quote a somewhat longer passage, dense but very telling-- who programmatically, quote, "points us beyond the impasse of so-called technesis by refusing to collapse the technological real into representation. And by linking it to embodiment," Professor Hansen continues, "Benjamin shows us that we can make sense of technology's diffuse, amorphous, corporeal impact without filtering it through language. And he urges us to focus on our own embodiment as the material site, the bearer of technology's otherwise wholly inhuman impact." End of quote.
And with this approving reference to Benjamin, the stage is set for what is to come. The second major work that I studied with great profit is Professor Hansen's epochal New Philosophy for New Media, published with the MIT Press in 2004. This book discusses some of the essential questions regarding digital media and visual art, and of art beyond the merely visual, the merely visible, insisting on the specificity and novelty of the ever-newer media apparatuses while emphasizing the ongoing relevance of the body in so-called media practices.
In the most rigorous philosophical terms, Professor Hansen explores our very embodied experience of the virtual and the disorientation-- the, quote, "bewilderment and vertigo," as he calls it-- that can easily come with it. Deeply philosophical terms such as affect and sensation, perception and the aesthetic, are crucial to this inquiry into the co-evolution of the human and the technological that breaks away from all the all-too-facile announcements of a post-human era presumably characterized by, again, machinism and the anti-subjectivist paradigm, while forgetting other forms of human embodied agency still-- and perhaps even more and more-- at work here today.
Echoing an important Bergsonian motif dating all the way back to the pages of this author's Matière et Mémoire, Matter and Memory, and taken up by luminary authors as diverse as Maurice Merleau-Ponty and Gilles Deleuze, Professor Hansen insists that the contemporary digital image comprises, quote, "the entire process by which information is made perceivable," end of quote. And hence that embodied experience is still the agent, quote, "that filters information in order to create images." End of quote.
This is to say that we do not simply receive images as preexisting technical forms. The category of the human thus remains indispensable no matter how much new technologies undeniably transcend the framings of everything we have seen, and felt, and done, or imagined before.
A third book-- which I won't attempt to summarize now for reasons of time-- is entitled Bodies in Code, subtitled Interfaces with New Media, and was published by Routledge in 2006. And we're all eagerly awaiting the release of his newest installment, entitled Feed-Forward: On the Future of Twenty-First-Century Media, forthcoming from the University of Chicago Press later this year. It will be followed by two sequels on the subject, one segment of which we will hear more about today, and if I'm not mistaken, on Thursday as well.
In concluding, I would like to draw your attention to three important co-edited volumes which give a state-of-the-art overview of the interdisciplinary fields of study on which Mark's work draws, and that he has helped revolutionize. First, I would like to mention The Cambridge Companion to Merleau-Ponty, published by Cambridge University Press in 2005 and co-edited with Taylor Carman, likewise a great student of Merleau-Ponty and of phenomenology, notably of Heidegger. Second, a volume co-edited with Bruce Clarke, entitled Emergence and Embodiment: New Essays on Second-Order Systems Theory, published by Duke University Press in 2009.
And third, the volume that he co-edited with W. J. T. Mitchell entitled Critical Terms for Media Studies and published by the University of Chicago Press in 2010. And especially this last volume is, in my view, the best starting point for anyone who would want to orient himself or herself in the burgeoning fields of new media studies. And many of you here present today have already made deep incursions into that challenging field.
Professor Hansen has held distinguished visiting fellowships at the Institute for Advanced Study in Princeton, as a Fulbright Senior Scholar in American Studies at the Leibniz University in Hanover, and at the Department of Digital Design and the Digital Media Lab at the Central Academy of Fine Arts in Beijing. And the list is far from complete. His lecture topic today will be Logics of Futurity, or on the Physicality of Media. Please join me in welcoming Professor Mark Hansen.
MARK B. N. HANSEN: Wow. Thank you, Hent, for that. That makes me feel so good about myself. I'm going to have to make sure to ask for a raise when I get home, seriously.
Yeah, and I want to thank Hent and the staff of the SCT, and especially all of you guys who I can't see at the moment, for being here, being willing to give up six weeks-- give up-- spend six weeks of your summer learning from us, and listening to us, and speaking with each other, and working on your own work, and so on. And I want to also thank deadlines, because I took seriously the call that Alice conveyed to us that we should present new work, and actually wrote a paper in the two weeks-- or not even quite two weeks that I've been here. So I also apologize if there are-- well, there are definitely going to be some issues, so I apologize for that. But I'm excited that I actually got some thoughts finished, and I can present them to you today.
So what I'm going to talk about, as Hent noted, is work from this project that I'm developing called Logics of Futurity, and I'll say a little bit more about that descriptively on Thursday. But I want to say right now that it comes out of the last part of my manuscript Feed-Forward, which is a study of what I call 21st century media, by which I mean, well, everything that we use today. Cell phones, sensors, the internet with its recording technologies, and so on, where lots of the activity that's going on takes place outside of the frame of the human and of human perception.
So media has changed its function from being something that's synchronized with human perception and human experience, that represents human experience, or re-presents human experience, to something that has a machinic level that seems to take place outside of the domain of experience. And I use Whitehead in complicated ways that I'll say a little bit more about on Thursday to present a philosophy of experience that's capable of opening up experience to that dimension of the non-perceived. And the end of the book really looks at Whitehead's notion of what he calls real potentiality, which he sees as a feature of the settled world that, in some ways that also are complicated and I won't explain right now, includes the future, or at least partially includes the future in the present.
And I, in my book, contrast Whitehead's understanding of potentiality against notions of frequentist prediction and probability in order to argue for a kind of ontology behind prediction and probability. And you'll hear that kind of language coming up in the talk that I give today.
So this is a further development of this idea that the future is implicated in the present. The book-- or maybe I think it will be a series of little books-- will have case studies that range from, or that include the two that I'm going to talk about today, quantum physics and global warming, but also include crowdsourcing, financial derivatives, microsound practice, and cognitive neuroscience, all of which I see as domains where there's an opening to the future in the present, and an implication of the present into the future. So this is really the logic that I'm interested in exploring.
OK, so my paper has-- well, it has six sections, but I'm only going to read four of them today. And I'm going to skip a section on Dennis Gabor, who's an engineer who came up with the idea of the indeterminacy principle of sound. And this is relevant-- or this is the basis of microsound production, and I'll say something about that, I hope, on Thursday. And I'm also going to skip the section on Whitehead where I outline the argument that is in a micro form, or in a small form, the paper that I asked you to read for Thursday. So I don't think that will be a problem for the argument I want to offer today.
So I'm going to put up passages which I'm not going to read for time's sake. Maybe I should stand-- should I stand over here? Or what should I do? Should I go like this? Let me stand over here. Is that OK? OK, I'll stand over here.
OK, so the first section is called "All Media Are Measuring Media." In this crucial-- so I'm going to ask you to do some cognitive work of your own. You have to read this and assimilate this as I read my paper. My paper is a comment and explication of this passage, and that will be the pattern that repeats.
So in this crucial passage from a recent essay entitled "Experimenting with Media Temporality," German media archaeologist Wolfgang Ernst weaves together all of the ingredients necessary to develop a physical approach to media process. Specifically, Ernst combines a view of media as measuring instrument with what he calls a time-critical approach to media, while in the process taking note of media's specific vocation to mediate between the cultural and the physical, which is equally to say between the macro realm of experience, including but by no means exclusively including human experience, and the micro realm of quantum processes.
In my talk today, I want to begin by affirming these commitments of Ernst's to media as measurement, to media as time-critical, and finally to media-- sorry, to time-critical media measurement as the performative hinge linking micro and macro worlds. As I shall develop them, these commitments are absolutely crucial for understanding the full extent of media's operationality in our world today, and specifically for appreciating how media-- that is, time-critical measuring media-- forms an unavoidable third term, and thus a principle of indirection in the circuit linking human experience and world.
I hope to demonstrate as much at the end of my talk by way of a brief discussion of global warming, a phenomenon-- if we can indeed call it one-- that exemplifies the way in which our relation to the world is mediated by media that operate outside the domain of phenomenality as such, and not just human phenomenality, and that materialize a physical propensity linking the present to the future in the form of probabilities, and crucially, wholly within the space of the present itself.
Let me now continue, however, by sticking closely to Ernst's claims. So first, to the claim that media is measurement. By qualifying media as measuring media, Ernst seeks to divorce media from their cultural and historical over-determination. His aim in doing so is to liberate the operationality of media from the constraints that human-focused cultural framings impose on it.
A case in point is the archeology of the acoustic, where human auditory sense-- as Ernst diplomatically puts it-- just does not suffice. To explore the operation of the sonic beyond the acoustic spectrum of humans-- or for that matter, beyond any other delimited acoustic spectrum-- we must, as it were, surrender our agency to the agency of the machine.
"The real archaeologists in media archeology," writes Ernst, and I quote, "are the media themselves. Not mass media, but measuring media that are able to decipher physically-real signals techno analogically." Media are the real archaeologists on Ernst's account precisely because they work directly on physical signals prior to their conversion into phenomenologically-accessible forms, this work being the very mediation necessary to bring the physical into the domain of experience.
Here, we grasp the specificity of Ernst's conception of media archeology in its full resonance. Rather than focusing on the long history of media devices-- as do Erkki Huhtamo and Siegfried Zielinski, certainly the two most prominent figures in media archeology today-- Ernst concentrates his critical gaze on the actual operations of concrete media independently of their cultural functioning. For Ernst, archeology thus means the excavation of the affordances of media themselves prior to any consideration of, and without reference to, their role as agents in cultural life.
Ernst's commitment to this restricted understanding of archeology-- archeology as excavation of media affordances in themselves-- makes common cause with work in science studies and the history of science focused on the role of experiment. Ernst insists that physical phenomena, like the propagation of ultra-fast electromagnetic waves, are not invented but discovered. And crucially, they are discovered not through the extra-acute vision of the scientist, but rather through the superior wisdom of media themselves.
On this account, humans set up experiments precisely to tap the superior wisdom of media. The experiment is simply a staging of the operation of the machine, a bringing to bear of its agency on some aspect of the physical world. And with this, a forging of contact between the physical and the experiential domain of cultural life. Noting that, quote, "in the dispositive of human-made experiments, media themselves have the best knowledge."
Ernst suggests Gaston Bachelard's phenomenotechnique as a concept capable of grasping the particular positivity of media operationality at issue here. Phenomenotechnique, as Bachelard puts it, quote, "intensifies that which shines through behind appearance." Phenomenotechnique, that is, addresses the noumenon behind the phenomenon. Though, as we shall see below, less as an unknowable Kantian thing in itself than as a fuller grasp, dare I say, prehension of the physical reality addressed by the experimental setup.
Let me now turn to the second of Ernst's commitments, to the time criticality of media. By time criticality, Ernst means something broadly akin to what Husserl meant by time-constituting, as in time-constituting consciousness, or time-consciousness, only with a scope that vastly exceeds the operations of any imaginable consciousness, human or otherwise. Time criticality thus refers to the time-constituting power of media, a power which, from the study of electricity and electromagnetism onwards, has been exercised predominantly at micro-temporal scales vastly exceeding the perceptual grasp of human consciousnesses.
In keeping with the specificity of his archaeological practice, Ernst distinguishes time-critical media from, quote, "time-based media." That is, media constituted out of and by time itself. Time-critical, Ernst explains, quote, "does not simply mean that media operations are time-based. Rather, time-criticality requires that media operations under the condition of digital signal processing must be processed through rigidly-determined time windows in order to work, and thereby to bring a message into being in the first place."
Time-criticality would thus designate what, from a Kantian and even from a Deleuzian perspective, we could refer to as a transcendental or transcendental-sensible structure. The operations of time-critical media produce the medium of experience-- time and space for Kant, the real conditions of sensibility for Deleuze-- and thus form the condition of possibility for experience as such.
I use the expression would designate, however, to mark a certain distance between Ernst's operational account of time-critical processes and any transcendental structure that, like both Kant's and Deleuze's, necessarily takes as its reference point human experience, or at least some other delimited-- and in this respect privileged-- perspective.
Unlike these latter, Ernst's account locates the operation of time-critical media in relation to the physical [? eventality ?] of the world in and for itself. Media operate directly on the physical, and only subsequently-- which is to say, through this very operation-- do they produce structures that facilitate experience as such. And human experience in particular.
With this point, we can appreciate how Ernst's agenda in his recent study Chronopoetik, chrono-poetics, expands upon his earlier reading of media as measuring instruments in the essay cited above. Moving beyond his polemical stance vis-a-vis history-- his career-instigating effort to contrast media givenness via counting against historical narration of all sorts. And the words in German are [GERMAN] and [GERMAN], so there's a play on words, counting and narration.
So moving beyond his polemical stance vis-a-vis history, Ernst turns his attention in Chronopoetik to the properly ontological dimension of his program: how measuring media constitute time as a medium for experience. As its subtitle-- [GERMAN], which means something like Time Periods and Time Giving of Technical Media-- makes clear, Ernst's aim is to excavate the deep affinity between technical, or measuring, media and the medium of time. Which is to say, how time's periodicities and [? givennesses ?] constitute a general medium for experience.
Thus, media's operation does not so much seek to isolate time as the target of measurement-- as does Aristotle's physical conception of time as the number of movement-- as it aims to constitute, out of the measuring of time, a medium. Ernst states this quite matter-of-factly, and I quote. "Time as the measured value or statistic--" the word is [GERMAN]-- "of a movement is here not the goal of measurement, but rather itself a medium."
Ernst's account of time-critically generated time as general medium involves two fundamental commitments that serve to differentiate his approach, both from theoretical invocations of mathematics in the wake of Badiou, and from accounts of digital materiality in terms of cultural operations. The fundamental law of Ernst's time-critical media philosophy, if we can speak in these terms, is that mathematics must always be implemented in a physical materiality. Because every operation of information switching consumes a minimal time interval, [GERMAN], Heidegger's being in the world must be understood as and through Ernst's being in time.
This is why mathematics for Ernst is always [? and necessarily ?] technomathematics. Mathematics implemented through a physical realization. Mathematics as phenomenotechnique. And it is also why technomathematics-- that is, measuring media-- always operates in the realm of, and can only generate, finite processes.
The practical correlate of this first commitment is a dynamic understanding of the operation of digitization. Contrasting it with sampling as a cultural aesthetic activity, Ernst links digitization with the sampling of the physical itself. What he calls, quote, "sampling in the technical sense."
This sampling is performative in the sense that it does not sample from a preexistent and standing time, but from the physical processes that are, as it were, underneath time. Its physical operation transforms these physical processes into time, which henceforth cannot exist outside of media operationality. That is, outside the concrete technomathematical operations that materialize time as always [? and necessarily ?] finite.
Digitization, explains Ernst, quote, "is an act of measuring that reacts flexibly to the temporality or frequency of the signal. World is not thereby submitted to the temporality of enduring, abstract, standardized clock pulses. Rather, technically, pulse time as a function of numerical measurement conforms to the frequency of the signal, and thus to the worldliness of physics itself." That such flexible measurement necessarily takes place as a compromise between two distinct physical indices-- temporal period and signal frequency-- only serves to reinforce the [? onto ?] performative power of time-critical measuring media.
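Ernst's point that sampling must "conform to the frequency of the signal" can be made concrete with a standard result from signal processing. The sketch below is my own illustration, not anything from Ernst's text: when the sampling rate fails to conform to the signal, falling below twice the signal's frequency, the measurement does not merely lose detail but produces a different signal altogether, an alias. The act of measuring, in other words, makes the phenomenon.

```python
import math

def dominant_frequency(signal_hz, sample_rate_hz, duration_s=1.0):
    """Sample a pure tone, then locate the largest peak of its discrete spectrum."""
    n = int(sample_rate_hz * duration_s)
    # Measure the tone at the given sampling rate
    samples = [math.sin(2 * math.pi * signal_hz * k / sample_rate_hz)
               for k in range(n)]
    # Naive DFT over bins 0 .. Nyquist; return the frequency with most power
    best_bin, best_power = 0, 0.0
    for b in range(n // 2 + 1):
        re = sum(s * math.cos(2 * math.pi * b * k / n) for k, s in enumerate(samples))
        im = sum(s * math.sin(2 * math.pi * b * k / n) for k, s in enumerate(samples))
        power = re * re + im * im
        if power > best_power:
            best_bin, best_power = b, power
    return best_bin / duration_s  # bin index -> frequency in Hz

# A 30 Hz tone sampled well above twice its frequency is recovered faithfully...
print(dominant_frequency(30, 200))  # 30.0
# ...but sampled at only 40 Hz it appears as a spurious 10 Hz tone.
print(dominant_frequency(30, 40))   # 10.0
```

The aliased measurement is not a degraded copy of the 30 Hz tone; it is a perfectly coherent 10 Hz tone that exists only through the mismatch between pulse time and signal frequency, which is one way of cashing out Ernst's claim that measuring media produce rather than merely register time-critical phenomena.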
Insofar as they measure, quote, "the rhythm of burst sequences of micro-temporal sampling periods," time-critical media operate in the space of indeterminacy between time and frequency. Identified in 1946 by Nobel Prize-winning engineer Dennis Gabor as a mathematically-precise extension of quantum physical indeterminacy, this indeterminacy of time and frequency ultimately explains why media cannot be understood as measuring an externally-located and preexistent time, as all cultural aesthetic accounts of time-based media would have it. Rather, in the act of deciding on a correlation of period and frequency-- which is what happens in measurement-- measuring media literally produce time-critical phenomena.
Expanding on Gabor, who focused on the domain of sound, Ernst installs quantum physical indeterminacy at the basis of all time-critical mediations. "Insofar as it is produced by time-critical measurements," Ernst explains, "the parameter of time ultimately tips over into quantum physically-demonstrated indeterminacy." This is a quote. "The smaller the unit of measurement and the higher the exactness with which it is measured, the longer the required time of measurement. No time point, no really extensionless moment can thus be measured. Digital units of measure are never parts of a standing time, but are rather much more variable because every measurement is based on a minimum interval delta t, and can in this framework only ever discretely process a finite manifold of information."
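Gabor's indeterminacy can also be checked numerically. In this sketch, which is my illustration rather than anything in Ernst's or Gabor's text, a Gaussian test pulse is made narrower or wider in time, and the RMS widths of the pulse and of its discrete spectrum are computed. Their product stays pinned near the theoretical minimum of 1/(4*pi) regardless of the pulse width: squeezing the measurement window in time necessarily spreads the measurement in frequency.

```python
import math

def gabor_product(sigma, n=256, total_time=32.0):
    """Product of RMS widths (delta-t * delta-f) for a Gaussian pulse of width sigma."""
    dt = total_time / n
    t = [(k - n // 2) * dt for k in range(n)]
    g = [math.exp(-x * x / (2 * sigma * sigma)) for x in t]

    # RMS width in time, treating |g|^2 as a probability distribution
    p = [v * v for v in g]
    delta_t = math.sqrt(sum(pi * x * x for pi, x in zip(p, t)) / sum(p))

    # Naive DFT; bin b maps to frequency b/total_time (negative above n//2)
    power, freqs = [], []
    for b in range(n):
        re = sum(v * math.cos(2 * math.pi * b * k / n) for k, v in enumerate(g))
        im = sum(v * math.sin(2 * math.pi * b * k / n) for k, v in enumerate(g))
        power.append(re * re + im * im)
        freqs.append(b / total_time if b <= n // 2 else (b - n) / total_time)
    delta_f = math.sqrt(sum(pw * f * f for pw, f in zip(power, freqs)) / sum(power))
    return delta_t * delta_f

for sigma in (0.5, 1.0, 2.0):
    print(round(gabor_product(sigma), 4))  # ~0.0796 each time, i.e. 1/(4*pi)
```

A narrower sigma means a sharper location in time and a proportionally blurrier spectrum; the product of the two spreads cannot be driven below 1/(4*pi). This is the "minimum interval delta t" that Ernst invokes, transposed onto the time-frequency plane.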
The ultimate consequence of this conclusion, namely, that there is no real state of affairs prior to the time-critical act of measurement, will prove decisive for how we understand the operational ontology of media in our world today.
Let us, however, stick with Ernst a bit longer. Long enough, that is, to explicate his third and most complex commitment. The commitment to time-critical media measurement as the performative hinge linking micro and macro worlds.
Early in his introduction to Chronopoetik, and by way of what can only be a rhetorical question, Ernst links the determination of time-criticality to the operation of quantum leaps from the micro world of quantum particles to the macro world of experience. Could it be the case, he wonders, that, quote, "the macro time of media history relates to the micro time of time-critical processes in the same way that the intact world of classical physics relates to the micro world of quantum physics?"
It is precisely through this homology that we are able to grasp the fundamental time-criticality of the quantum dissolution of physical time, as Ernst himself makes explicit. And I quote, "the question concerning the disappearance of discrete time in the quantum mechanical indeterminacy relation is in actual fact time-critical. When particles disappear, with them also disappears the time point," [GERMAN], "and hence the point-oriented concept of the event. Here, media science in the narrow sense comes into play. Media science, that is, that does not just deal with mathematical questions, but also with the actual implementation, and thus the temporalization of mathematics in physical materiality."
Insofar as I can make out, the logic of Ernst's argument here is something like the following. Quantum mechanics, or more precisely, the resolution of quantum indeterminacy, is time-critical because it coincides with the actual implementation of the quantum leap in a material substrate.
OK, section two is called Hyperobjects and Hyperobfuscation. So I'm going to assume you read this. As the final lines of this passage make clear, this statement is not authored by a quantum physicist, but by one of the more prominent adherents of a recent philosophical movement, Tim Morton. The movement obviously being object-oriented ontology.
The statement comes from a chapter on nonlocality from Morton's recent book, Hyperobjects: Philosophy and Ecology After the End of the World, and forms part of a larger argument for an ecological account of object-oriented ontology whose fundamental concept is the hyperobject. Hyperobjects in Morton's definition are, quote, "things that are massively distributed in time and space relative to humans."
Examples of hyperobjects include a black hole, the Lago Agrio oil-- these are all Morton's examples-- the Lago Agrio oil field in Ecuador. Or the Florida Everglades, the biosphere, the solar system, the sum total of all nuclear material on Earth, or just the plutonium or the uranium. Very long-lasting products of direct human manufacture, such as Styrofoam or plastic bags, or the sum of the whirring machinery of capitalism. What makes all of these examples hyperobjects is their status as hyper in relation to some other entity, regardless of whether they are directly manufactured by humans or not.
For the purposes of Morton's ecological take on object-oriented ontology, two hyperobjects stand out, quantum mechanics and global warming. Both are massively distributed in time and space relative to humans. One in the direction of the very small, the other in the direction of the very large.
Morton goes on-- sorry, Morton goes so far as to ascribe the impetus behind the concept of the hyperobject to these two exemplary cases. For sure, and I quote, "the idea of hyperobjects arose because of quantum theoretical thinking about the nuclei of atoms and electron orbits, and because of systems-theoretical approaches to emergent properties of massive amounts of weather data." What makes these two complementary hyperobjects exemplary and special is the clarity with which they illustrate what will turn out to be the crucial characteristic of hyperobjects. The fact that they, quote, "occupy a high-dimensional phase space that results in their being invisible to humans for stretches of time."
According to Morton, what makes hyperobjects like quantum mechanics and global warming difficult to conceptualize-- and indeed, for many, difficult to accept as real-- is the fact that they occur in a dimension that is oblique in relation to the everyday world of phenomenal experience. We simply lack any direct means of access to their [? eventality, ?] which occurs on a scale that exceeds our correlational grasp.
Morton deploys object-oriented ontology to address this problem. As formulated by Graham Harman, and as taken on board by Morton, OOO, object-oriented ontology, postulates that objects are infinitely withdrawn in relation to everything, including themselves. Objects do not enter into causal interrelations, and indeed do not properly speaking have relations at all. Rather, they generate sensible appearances that serve as traces of sorts of their ontological power.
In his formulation of the concept of the hyperobject, Morton draws in particular on Harman's account of causality as vicarious causality. In the wake of Humean skepticism and Kant's institution of a, quote, "rift between thing and appearance," causality no longer concerns objects themselves, but rather their appearances. For this reason, Morton dubs causality aesthetic, meaning that it is an affair of relations among phenomena. A, quote, "feature of phenomena rather than things in themselves."
Contextualized against this philosophical backdrop, hyperobjects attain a certain specificity. They are a special kind of object, a particularly visible object that, as Morton puts it, quote, "force something on us. Something that affects some core ideas of what it means to exist, what Earth is, what society is."
Put another perhaps simpler way, hyperobjects are objects that compel us to face the truth about objects. Namely, that they are, quote, "real things whose core reality is withdrawn from access even by themselves."
Let us attend more closely to Morton's claim that quantum mechanics, as one of the two privileged hyperobjects, has made the withdrawal of the object into an irrepressible reality. The plausibility of this claim, as we shall see, stands and falls with the realist interpretation of quantum mechanics that Morton, drawing extensively on David Bohm's account of the Einstein-Podolsky-Rosen theory, simply ratifies as true.
At the heart of Morton's account is a conviction that there is such a thing as a quantum entity. That actual photons exist prior to the measurement that on Niels Bohr's account is responsible for bringing them into existence. This indeed is the deeper truth behind Morton's association of quantum mechanics and OOO.
Specifically, when he asserts that quantum theory works because it's object-oriented-- the conclusion of this passage-- or even more consequentially, that quantum theory is a, quote, "valid physical theory to the extent that it is object-oriented," he stakes his claims on the validity of the realist interpretation of quantum mechanics. Which is to say, on the idea that quantum entities are real in their entangled or coherent states. That is, that they are real independently of the act of measurement that resolves quantum indeterminacy.
Morton is, or would be right-- right about quantum theory being object-oriented, and right that object orientation would provide the ontology for quantum theory-- if, and this is a big if, quantum entities were indeed real. Put another way, the reality of quantum entities would make quantum theory a valid physical theory, where valid means nothing other than realist, and it would indeed call for an ontology capable of explaining the absolute withdrawal of quantum objects. Now, obviously, I'm going to argue that quantum objects are not real.
Yet as Nathan Brown has pointed out, Morton makes no effort to justify his ratification of Bohm's extension of EPR-- Einstein-Podolsky-Rosen-- and doesn't even seem to be cognizant that it remains a highly contested theory. This fact should alert us to the sleight of hand operating at the core of Morton's argument for the hyperobject. Behind the claim that OOO is the truth of quantum mechanics is a more basic argument. Namely, that OOO simply is the truth.
That is why, standing in for any philosophical argument for the validity of Bohm's realism is nothing other than the logic of entailment imposed by OOO. The result is a circular logic that smacks of dogmatism. It is the unquestionable existence of absolutely withdrawn objects-- the first principle of object-oriented ontology-- that requires quantum theory to be a realist theory. And this unquestionable existence is at the same time the most powerful argument for why quantum theory is a realist theory. Philosophical logic, as Nathan Brown's critique implies, and as Morton's repeated belittling of scientists serves to attest, simply trumps science in a way that makes scientific argumentation moot.
Beyond its contribution to ratifying the grandiose ambition of OOO, the aim of Morton's engagement with quantum mechanics is to generalize quantum nonlocality such that it, transformed into a core property of hyperobjects, would apply to all of reality. Nonlocality in this expanded frame is nothing more than a synonym for aesthetic causality. Just as entangled particles enjoy nonlocal interrelations prior to their disentanglement or decoherence through measurement, so too do the particular appearances comprising the aesthetic domain participate in the operationality of a hyperobject whose contours are radically withdrawn from their vantage point.
The point, to say it another way, is that the particular aesthetic appearances of, say, global warming-- rain in Northern California, the Japan earthquake of 2011, the tsunami churning up La Niña in the Pacific, and so on-- are correlated with one another. And indeed, are unknowingly so correlated in virtue of their belonging to, of their happening within, the hyperobject that is global warming.
Once again, however, the logic is circular and points to an underlying dogmatic commitment on Morton's part. For the homology at issue here to work, it has to be the case that the hyperobject is real. And that it is real independently of the activity of the particular appearances that unknowingly and partially instantiate it.
This in turn means that the nonlocal situation prior to decoherence or disentanglement-- or in other words, prior to measurement-- would have to be real. Nothing more nor less is at stake in designating quantum mechanics a hyperobject. On this description, quantum nonlocality has to be a real object that is infinitely withdrawn even from itself. Once again, we are dealing with a situation in which the cart is driving the horse, since it is the logic of OOO that justifies the claim for the reality of the nonlocal.
If Morton's qualifications-- and I have a passage on the way he starts to talk about the nonlocality of the hyperobject as metaphoric, not as literal, not as a literal instantiation of quantum mechanical nonlocality. I read that as a kind of wavering on his part, so I skipped that part.
So if Morton's qualifications-- and indeed his unequivocal backing away, in his more recent book, Realist Magic, from the strong reading of quantum reality offered in his book Hyperobjects; both are from 2013-- suggest that the realist reading of quantum mechanics is unsustainable, this immediately returns us to that crucial element of quantum theory, the measurement problem, which the emphasis on realism allowed Morton largely to bypass.
More precisely, we can now clearly discern how Morton's realist reading of quantum nonlocality, his postulation that, quote, "something definitely exists before measurement," aims to access quantum reality, quote, "as it is in itself." Which is to say, prior to the collapse of its coherence, superposition, or entanglement.
Quote, "at the quantum level," Morton argues, "persisting is simply the way in which quantum events inside an object cancel out. We have arrived at a strange insight. The persistence of a crystal lattice depends upon millions of quantum phenomena that subtend the relatively stable atoms and molecules in the lattice.
What are these quantum events? Nothing but the coherence of the quanta. That is, the way they occupy more than one place at once. At this scale, physics observes objects that occupy place x and place y at the same time. These objects are dialetheic," having two truths.
Faced with this absolutely astounding claim, one can only wonder by what mechanism physics can perform such observations, given that to measure-- and this is Morton's own definition-- to measure at the quantum level means, quote, "alter momentum position by means of another quantum." At the very least, we can affirm that observing quantum particles in their dialetheic states-- an impossibility, according to both Bohr and David Bohm-- would require a suspension of Heisenberg's principle of indeterminacy.
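For reference, the indeterminacy principle invoked here can be stated in its standard textbook form (this formulation is mine, not Morton's or the lecture's):

```latex
\Delta x \, \Delta p \;\geq\; \frac{\hbar}{2}
```

Because the uncertainties in position ($\Delta x$) and momentum ($\Delta p$) of a quantum system are jointly bounded from below, no measurement can assign arbitrarily sharp values to both at once; this is why an observation of a particle "at place x and place y at the same time" is unavailable in principle.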
With this conclusion, we have indeed come full circle. Specifically, we have arrived at a point where the violence of Morton's object ontologizing turns against itself, undermining precisely that quantum theoretical principle, indeterminacy, from which it claimed to have found inspiration. What is left in its stead is a rotting skeleton made up of a dogmatic commitment to OOO. There are objects. These objects are real, and they are infinitely withdrawn from everything, even themselves.
OK. Section three, which is called Measurement as Originary Phenomenon. This passage, taken from Arkady Plotnitsky's recent reconstruction-- Plotnitsky's a literary critic and theorist who also happens to have a PhD in quantum physics. So he's written this technical book that I'm drawing on here.
OK, so this passage taken from Arkady Plotnitsky's recent reconstruction of Niels Bohr's radical non-classical interpretation of quantum mechanics assembles the three crucial features of Bohr's account. These features are, one, the absolute unknowability-- indeed, unthinkability-- of undisturbed quantum behavior. OK, that means there is no quantum object.
Two, the central role played by measuring instruments as the hinge between macro and quantum realms. And thus, as sole means of access to the latter. And three, the irreducibly probabilistic nature of predictions concerning quantum phenomena that attest to the irreducible role of measuring instruments in the formation of those quantum phenomena.
According to Plotnitsky, who bases his arguments on a careful consideration of the increasing radicalization of Bohr's thinking, especially in the aftermath of his debates with Einstein in the '30s, we must make a clear and categorical distinction between quantum objects and quantum phenomena. While the former, quantum objects, are absolutely unknowable, the latter, quantum phenomena, are the result of observation and measuring instruments, and thus are both knowable and part of the classical world.
Quantum objects, explains Plotnitsky, and I quote, "are unobservable as quantum objects." This situation compelled Bohr to theorize or idealize quantum objects, first, as irreducibly different from what is observed in measuring instruments impacted by quantum objects (which observations define quantum phenomena), and second-- more radically-- as entities placed beyond quantum theoretical description, and even beyond any possible description, knowledge, and ultimately, conception. As the product of interaction between the measuring instrument and quantum objects, quantum phenomena provide the only possible access to the quantum level. An access that is, as we shall see, at best indirect, and that remains part of the classical macro world.
Bohr's genius is to have taken this situation for what it is, a radical break with classical ontology and its causal models, and an opportunity to rethink basic ontological commitments beginning with the Kantian noumenon. Thus, Plotnitsky characterizes the rupture between quantum phenomena and quantum objects as a radicalization of the Kantian rupture between phenomena and noumena.
Where Kant's noumenon remains thinkable or conceivable-- the thing in itself is thinkable or conceivable-- and can indeed be spoken of in the language of metaphysics, quantum objects in themselves are literally unthinkable. They are, as Plotnitsky puts it, "beyond the reach of the theory itself or any possible conception." They only become thinkable-- and in a sense, only come to exist-- in the form of quantum phenomena. Which is to say, as a contribution to or component of the measurement that breaks coherence or entanglement.
French philosopher Jacques Garelli takes this logic one step further when he announces that the, quote, "nonbeing of quantum particles repudiates the idea of the noumenon as such." In a move that will have important consequences for our appreciation of the experimental interface, what takes the place of the noumenon on Garelli's account is the mathematics underpinning the probabilities of experimental outcomes. And I quote, "the a priori conditions of possibility of the objects of experience are, in quantum physics, mathematically integrated into the apparatus that elicits the experience precisely as a priori conditions of its experimental possibility." It is this situation that poses the question of a Kantianism without noumenon.
Plotnitsky explains how mathematics stands in for the noumenon when he writes that mathematics, quote, "allows us to predict the outcomes of quantum experiments in the absence of any knowledge or even conception concerning the actual independent physical behavior of quantum objects themselves." Yet for Plotnitsky, who plays down the ontological implications of Bohr's radical position, the emphasis here lands squarely on the pragmatic function of mathematics, which he suggests is a, quote, "form of technology, a technology of thought." With this claim, we are returned to Wolfgang Ernst's emphasis on implementation, and what I earlier referred to as the fundamental law of his time-critical philosophy. Namely, that mathematics must always be implemented in a physical materiality.
Bohr makes common cause with Ernst on this point in the sense that the mathematics describing the probabilistic outcomes of measurement has no existence outside this situation, and thus operates as a technology, a Bachelardian phenomenotechnique. As Plotnitsky explains, and I quote, "the mathematics of quantum theory defines these probabilistic expectations (no others appear possible in the practice of quantum physics) by enabling us to make better predictions concerning what is thus observed in measuring instruments or other macro objects under the impact of quantum objects. But this is also all that this mathematics does for us, and no other mathematics appears to be able to do more."
Although Plotnitsky insists that nonclassical epistemology precludes any form of ontology, meaning that the being of quantum objects defies ontological capture, the reference to Ernst points us to a different ontological stratum of quantum theory. Namely, the event of measurement. In Bohr's understanding, what happens in measurement is a, quote, "reaction of quantum objects upon other quantum objects. Or more precisely, a reaction of quantum objects upon the quantum aspects of the measuring instrument itself." So Bohr thinks of the measuring instrument as having a macro classical side, which is what the human sets up as the experiment, and a quantum dimension that interacts with the quantum object.
This situation of experimental contact with quantum objects inside the measuring instrument definitely absolves Bohr of any charges of humanism or anthropocentrism-- for example, those raised by Karen Barad in her book Meeting the Universe Halfway-- since it establishes unequivocally that the subject of the measuring experiment is not the human observer, but the experimental setup itself. Just as the media themselves have the best knowledge in Ernst's understanding of media as measurement, so too do the experimental measurements-- the media, as it were, of quantum physics-- produce something, quantum phenomena, that is radically outside the phenomenal grasp of human knowledge.
To Jacques Garelli and his fellow French phenomenologist Marc Richir, what results is a generalized conception of the phenomenon beyond human intentionality that is resolutely ontological. In his important essay, "Quantum Mechanics and Transcendental Philosophy," Richir writes, quote, "it is not the case that the real remains hidden or veiled," which is what Bohm thinks. "But rather, that there is a physical dimension--" a [FRENCH]-- "only through being taken up in the order of the phenomenon in which are included, not observers in the sense of contemplators, but the experimental apparatuses of observation and measure. When they produce phenomena, these apparatuses are in fact performing ontological work. As the non-causal outcome of contact with the quantum object, quantum phenomena give the being of physical reality itself."
Had I more time, I would here turn to the phenomenological project of the great Czech philosopher Jan Patocka. I remember being a student and going to Derrida's lectures, and one year he was talking about this dude Jan Patocka, and nobody knew who he was. You should all read him. He's amazing.
So if I had more time, I would turn to the phenomenological project of Jan Patocka, who sought to reconstruct Husserl's philosophy into what he calls an asubjective phenomenology of manifestation. Breaking with the Husserlian topos of the adumbration, or the [NON-ENGLISH], as a partial and unfulfilled presentation of an object, Patocka insists that worldly appearances or manifestations are the object itself in this particular dimension of its being. So there's no distinction between the object and its appearance. The appearance is the object in this particular showing of the object.
Put another way, Patocka defines appearances as first and foremost objective, part of the objective being of the world itself. Appearances are worldly and objective, that is, before they become subjective. Before they become elements in the intentional life of consciousness. Because it collapses the gap between appearance and object-- precisely the space of the rift of the OOO people-- Patocka's philosophy offers an important alternative to Graham Harman's object-centered ontology, which as we have seen via Morton's deployment of it, is predicated on an irreducible gap or rift between appearance and object.
In a way that is not dissimilar to the logic of Bohr's argument for the primacy of measure-- where there's no independent quantum object that a quantum phenomenon is referring to. The quantum phenomenon is the object in the way that it appears.
So in a way that is not dissimilar to the logic of Bohr's argument for the primacy of measure, Patocka rejects any possibility of access to the objective being of the world that would not take the form of appearances. Stated more radically, the world has no being. It has no status as an object independently of its multiplicitous modes of appearing. Just as there is no being of quantum objects independently of measurement, there is no being of the world in itself that would be separate from its manifestations.
What this means, I suggest, is that the quantum phenomena produced through the technology of measurement are ontological. They are manifestations of the structure of the world as it is, and they are the only way in which the world is.
OK, so I skip a section on correlations without correlata. And I want to turn now to Bohr's renunciation of causality that Bohr believes is necessary in the wake of the quantum revolution. Because our access to the quantum domain is restricted to the results of quantum measuring experiments, because it can only occur as it were inside the apparatus of experiment, we must forsake any lingering hope of independent access to the quantum object, and indeed, as we have seen, must simply renounce any and all invocation of the quantum domain in itself.
In the 1937 essay entitled "Complementarity and Causality," where Bohr first formulates his ultimate radically nonclassical view, he takes stock of this momentous task and its irreducible necessity. And I quote, "the renunciation of the ideal of causality in atomic physics which has been forced on us is founded logically only on our not being any longer in a position to speak of the autonomous behavior of a physical object due to the unavoidable interaction between the object and the measuring instruments, which in principle cannot be taken into account if these instruments, according to their purpose, shall allow the unambiguous use of the concepts necessary for the description of experience."
In closing this discussion of the quantum theoretical concept of phenomenon and the crucial role played by the measuring instrument in its production, let me emphasize how radically different this renunciation of causality is from that animating Morton's aesthetic theory of causality. Whereas in Morton's account, the absolute withdrawal of the hyperobject effectively guarantees that all of its phenomenal or aesthetic manifestations are interconnected and are indeed manifestations of the hyperobject under consideration even though this cannot be perceived from any phenomenal point of view, in Bohr's account, there can be no reference beyond what is given in the measuring experiment.
That is precisely why Bohr's understanding of causality after Hume differs radically from Morton's. Indeed, by dissolving the gap between phenomenon and thing that Kant institutes as a way to overcome Humean skepticism and that OOO hypothesizes as the deep ontology of the object-- so by dissolving this gap between phenomenon and thing, Bohr in effect returns us to the situation described by Hume. There simply is no basis in reality for causal analysis.
What we are left with is probabilistic analysis of the possible outcomes of quantum experimentation, where in contradistinction to classical physics, probability does not index a closed set of variables that can at least ideally be fully known. But rather, it indexes an open and radically indeterminate set of possible outcomes that not only do not correlate sequentially with one another-- as do, for example, coin flips-- but that can differ even across identically-prepared experiments.
The deep reason for the shift from causal to probabilistic analysis does not have to do with any lack of knowledge, as it still arguably does on Morton's aesthetic account. Rather, it directly concerns the radical randomness of the quantum domain as this manifests through experiment in the form of quantum phenomena. Our predictions are unavoidably probabilistic, Plotnitsky explains, quote, "because identically-prepared experiments in general lead to different outcomes even in the case of individual quantum events." Identically-prepared experiments lead to different outcomes even in the case of individual quantum events.
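Plotnitsky's point-- that identically prepared experiments yield different individual outcomes, and that only the distribution over outcomes is predictable-- can be illustrated with a toy sketch. This is my own illustration, not anything from the lecture or from Plotnitsky: it merely samples outcomes according to fixed probabilities, the way the Born rule assigns probabilities to measurement results.

```python
import random

# Toy illustration (not from the lecture): a "qubit" prepared in an equal
# superposition is "measured" by sampling fixed Born-rule probabilities.
# Every preparation is identical, yet individual outcomes differ; the theory
# predicts only the probability distribution, never the single result.

def measure_superposition(p_zero=0.5, rng=random):
    """Resolve one identically prepared state into one definite outcome."""
    return 0 if rng.random() < p_zero else 1

random.seed(42)
outcomes = [measure_superposition() for _ in range(10_000)]
frequency_of_zero = outcomes.count(0) / len(outcomes)

# Individual runs are irreducibly random: the same preparation gives
# different results even within the first handful of trials...
assert len(set(outcomes[:20])) == 2
# ...but the ensemble statistics converge on the predicted probability.
assert abs(frequency_of_zero - 0.5) < 0.02
```

The contrast with classical probability is that here the spread of outcomes does not reflect ignorance of hidden variables that could ideally be known; on Bohr's account there is simply nothing more to know.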
If what accounts for the different outcomes of individual quantum events is the interaction that occurs within the measuring apparatus, would we be wrong to conclude that the experimental operation of measurement and the probabilities associated with it possess ontological power? And if it is the case that experiment produces quantum phenomena as an ontological operation, wouldn't the data generated through experiment-- which correlate with the probabilities associated with it-- themselves possess ontological power?
OK, and so now I'm skipping a section on Dennis Gabor, whose purpose-- this is what I still haven't figured out, and I want to talk to you about it on Thursday-- whose purpose is to do the work of translating this paradox of quantum measurement and the ontological function of measuring from this really arcane domain of quantum physics into the sensory world. And so Ernst argues that sound has a privilege in the contemporary media sphere precisely for reasons connected to this, and I would agree with him. So that would be what I would try to explain here.
And then I have a section on-- whoa, too far. Section on Whitehead. No, it's not in here. OK, well, anyways, there's a section on Whitehead with a couple of quotes from Process and Reality where I talk about potentiality and probability. I'll talk about that on Thursday too.
So I want to now skip to the final section, which is called the ethics of climate simulation, where hopefully, now that you guys have gotten all the way out there and you don't know what the hell I'm talking about, I'll bring it home and you'll at least go away with something that you can say, well, Hansen talked about this. OK, so the first little passage, "climate is what you expect, weather is what you get," is a line from the science fiction writer Robert Heinlein. And the second passage is the one that I'm commenting on in my commentary here, and I'll let you read it yourselves.
Not insignificantly, Austrian media theorist Claus Pias contextualizes his exploration of climate simulation in relation to the very same disjunction between weather and climate that animates Morton's account of global warming. Citing accounts from climate researchers, Pias exposes the paradox at the heart of the problematic of global warming.
Namely, that as one pair of scientists puts it, quote, "humans are strange. Although we cannot yet predict future climate conditions, we act as if we can. Every day, political and developmental policies are defined, and business, financial, or even personal decisions are made as if we knew which future climate conditions we will face."
What we know, in other words, are weather conditions that we can loosely rely on, at least for a span of a few days. But this knowledge does not translate in any direct or compelling way into predictions about climate in the future. So there seems to be a disjunction or an impasse between our experience of weather in the short term and the prediction of climate in the future.
We are thus faced with an impasse. Either we act as if today's weather predictions are in fact indicators of future climate conditions-- which seems to involve lots of errors-- or we face the difficulty of trusting the results of climate research, which, as the data of complex computer simulations, seem very far removed indeed from our daily experience. In the one case, we act as if there were no disjunction whatsoever between weather and climate. While in the other, we experience the disjunction between them to be so vast that we simply cannot find any way of mediating between them.
Pias's plea for a shift in the object of scientific analysis is meant to resolve precisely this dilemma. What he describes as the content of this shift is an experimental deployment of computer simulation that pushes it into a position beyond the traditional categories of theory and experiment. Understood as, quote, "a temporalizing imitation of system behavior through the medium of the computer, computer simulation is able to make phenomena that are not analytically accessible-- phenomena like climate-- addressable."
To clarify this claim, Pias cites four characteristics of computer simulation that are responsible for the transformation it brings in the object of scientific knowledge. First, computer simulation separates the performance of the model from the exactness of the results and lays emphasis on the former. Second, computer simulation deals less with laws than with rules, which means that it is not bound by the limitations of contemporary scientific knowledge of nature, and is thus free to speculate about what is not or not yet known.
And I quote from Pias: "Rules have a different relation to the future than do laws, and a different relation to the cases of natural laws they produce." In climate research, this situation is expressed in particular through parameterization, in which it is a question of operational dealings with the not known. The rule proves itself as a form in which something that is not understood, in the strong sense of the term, can nonetheless be dealt with.
Third, computer simulations are not concerned with proof, but rather with the demonstration of adequateness. This characteristic is closely bound up with the requirement that simulations be run in real time, for it is only on this condition that simulations can embrace what remains beyond prediction, even in deterministic systems when these latter are implemented physically. Once again, the payoff is that climate simulations are able to simulate phenomena that are analytically very difficult to access, or that are indeed beyond analytical access as such.
And fourth, computer simulations are concerned less with truth than with correctness. Emancipated from the need to decide on a true, or since we are in the domain of probability here, a truer or more true reality, simulations are able to embrace a plurality of realities, each of which is possible, and each of which has a distinct probability. With reference to our earlier discussion of probability in quantum measurement, what Pias's analysis makes clear is the fact that it is the entire set of probabilities-- the full conjunction of possible realities-- that makes up the phenomenon produced by experimental measurement or simulation.
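The four characteristics above-- rules rather than laws, parameterization of the not-known, adequateness rather than proof, a plurality of probable realities-- can be sketched in a deliberately crude ensemble simulation. This is my own toy construction, not Pias's or any real climate model: a single uncertain parameter stands in for what is not understood, and the "phenomenon" produced is the whole distribution of runs, not any one trajectory.

```python
import random

# Toy sketch (my construction, not Pias's): a rule-based model with an
# uncertain parameter is run as an ensemble. No single run is "the" climate;
# the phenomenon is the full set of simulated trajectories, each possible,
# each with a probability.

def simulate_warming(sensitivity, years=50, forcing=0.04):
    """Crude rule, not a law: the anomaly accumulates forcing * sensitivity."""
    temp = 0.0
    history = []
    for _ in range(years):
        temp += forcing * sensitivity
        history.append(temp)
    return history

random.seed(0)
# Parameterization: sensitivity is drawn from an assumed range, operationally
# standing in for physics that is not (or not yet) understood.
ensemble = [simulate_warming(random.uniform(0.5, 1.5)) for _ in range(1000)]
final_temps = sorted(run[-1] for run in ensemble)

low, high = final_temps[50], final_temps[-50]   # ~5th and 95th percentiles
# The output is not one predicted future but a spread of incompossible ones --
# "the full conjunction of possible realities."
assert low < high
```

The design point mirrors Pias: the model is judged by whether its ensemble behavior is adequate, not by whether any single run is true.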
In relation to the larger argument I've been developing here, the payoff of Pias's account of climate simulation comes by way of his affirmation of the ontological status of data. Despite its apparent disjunction from phenomenological experiences of the weather, the data generated through climate simulation is itself a phenomenological presentation of climate in the strong ontological sense that Patocka lends the term. The data of climate simulation, in other words, simply is the phenomenon of climate. There is no true climate, no real climate, no climate qua hyperobject that stands behind these phenomena.
Pias's position ultimately takes the form of an ethical claim. The ethical injunction that Pias gives us is that we accept the data of climate researchers-- which I'm arguing is the phenomenon of climate-- not as a mere representation that will forever be inadequate to its alleged cause, which is where the problem of climate change denial lies (you can't prove it), but as climate itself.
Such a claim requires a profound ethical shift, for it involves embracing the domain of probability, or more precisely, the set of probabilities associated with any given measurement, as the phenomenon of climate. Indeed, as climate tout court. So there's no climate behind the phenomenon of climate, just as there's no real quantum object behind the quantum phenomena.
That such an ethical shift is now required more than ever can be seen by way of contrast with Morton's account of the impasse of global warming. That is, the disjunction between weather and climate. "Global warming plays a very mean trick," Morton writes. And I quote, "it reveals that what we took to be a reliable world was actually just a habitual pattern.
"A collusion between forces such as sunshine and moisture and humans expecting such things at certain regular intervals and giving them names, such as dog days. We took weather to be real. But in an age of global warming, we see it as an accident. A simulation of something darker, more withdrawn. Climate.
"Global warming is really here," he continues. And I quote, "even more spookily, it was already here, already influencing the supposedly real wet stuff falling on my head and the warm golden stuff burning my face at the beach. That wet stuff and that golden stuff which we call weather turns out to have been a false immediacy, an ontic pseudo reality that can't stand up against the looming presence of an invisible yet far more real global climate. Weather, that handy backdrop for human life worlds, has ceased to exist, and along with it, the cozy concept of life world itself."
Despite the rhetorical twists and turns of Morton's mode of argumentation here as elsewhere, the logic underlying this claim is the same one we examined above. Namely, there is a radical rift between phenomenon and thing. "Weather-- along with climate simulation and any other experimental production of data-- is a phenomenon, and thus necessarily, a false immediacy that cannot lead back to an embrace of climate in general. Climate, by contrast--" and this is Morton-- "is a thing, a hyperobject that is radically withdrawn from its appearances, including all weather events, and all computer simulations."
If Morton is able to dismiss data and simulation in this way, it is due to the same logic. In the face of a hyperobject that is infinitely withdrawn even from itself, simulation and data are confounded. The hyperobject climate can neither be modeled in real time, since it operates in nonlocal and atemporal ways, nor can it be equated with the massive data sets and computing power necessary to model climate for the same reason.
Indeed, Morton goes so far as to suggest that climate simulation, understood as a phenomenal or aesthetic appearance of the hyperobject climate, is needed to supplement human perceptions precisely because of the withdrawal of the object. By contrast, what Pias's media-critical analysis helps us to grasp is that simulations are necessary precisely because there is no real climate in itself. There is no hyperobject called global warming. Rather, global warming simply is the conjunction of models of climate. A set that encompasses the host of humanly perceived weather events along with all available climate simulations that produce data about the probability of incompossible future possibilities.
Thinking about climate in this way, which is to say, thinking about climate ethically and conceiving of experimentation, following Ernst, as implementation, is to restore the possibility of engaging with climate change, to turn our new mode of knowledge into the basis for new forms of practical intervention. Morton is thus wrong to characterize global warming as a wicked problem. A wicked problem he defines as-- this is apparently a philosophical concept; I've never heard of it, maybe you have-- a problem one can understand perfectly, but for which there is no rational solution.
So Morton is thus wrong to characterize global warming as a wicked problem. It is rather Morton's analysis of global warming as a hyperobject that makes it a wicked problem. By instituting a radical rift between the hyperobject global warming and its worldly effects, Morton destroys any chance for us to come into touch with it, and thus, to act in ways that might forestall its future advent. OK, thanks.
Mark B. N. Hansen, professor of literature at Duke University, gave a lecture titled "Logics of Futurity, or on the Physicality of Media," June 30 in Bache Auditorium, as part of the 2014 School of Criticism and Theory public lecture series.