[APPLAUSE] BRUCE LEWENSTEIN: Good evening, everyone. My name is Bruce Lewenstein. I'm professor of Science Communication here at Cornell's Ithaca campus. And I'm really honored to welcome you this evening to this symposium celebrating the inauguration of Martha Pollack as our 14th president here at Cornell.
I think it's telling that President Pollack chose intellectual inquiry as a central theme of her inauguration events today and tomorrow, beginning with the Festival of Scholarship that has just finished, this symposium, and then moving on to events tomorrow. Our topic tonight is Universities and the Search for Truth. And I think President Pollack chose this topic both because it is universal and because it's incredibly timely.
It's universal because the almost 1,000-year history of universities has been a history of finding ways to identify and defend reliable knowledge, to publish it, to teach it, to use our extension and outreach and public service activities to get it out to as many people as possible, and to live it here on our campuses. And yet, the topic is timely, too. Anyone watching debates about Brexit, or death squads in the Philippines, or the meaning of statues of Robert E. Lee in American history will know that these debates are central to the way we live in the world. They have real consequences, real impacts on people's lives.
And so President Pollack has identified a range of faculty members from across the Cornell community, both here and in New York City, to explore the role of universities in the production and defense of truth. Notice that I said "truth" and not "the truth." Each of the speakers tonight will discuss the nature of truth in his or her own field. Personally, I like to use the phrase "reliable knowledge," as I already have this evening. But others will use different phrases.
In a world of alternative facts, how do we reconcile our need for stable truths with the multiple understandings of truth that our scholarship has produced? I think that's our fundamental question tonight. How do we reconcile stable truth with multiple understandings of truth?
Each of the speakers will have five minutes to talk. That will leave us plenty of time for questions. When it comes time for questions, these microphones will be moved up into the aisles, and you can ask them there. There's also one in the balcony. Those of you who are watching online or are sitting in the far corners can also tweet your questions to the hashtag #CornellXIV, and we'll try to pull in some of those as well.
In the interest of time, we're forgoing long introductions. You have everyone's name and title in the program. We'll be going in order along the stage tonight-- Kevin Clermont from Law; Sarah Murray from Linguistics; Mor Naaman from Information Science at Cornell Tech New York City; Holly Prigerson from Weill Cornell Medicine in New York City; David Shalloway from Molecular Biology and Genetics; and Lyrae Van Clief-Stefanon from English. Kevin.
KEVIN CLERMONT: I am so happy to be here on this great day in Cornell's history. I'm the first to speak, thanks to the alphabetic power of my last name. It may seem unfortunate to start with law because most people think that law takes an idiosyncratic approach to truth. We legal academics and legal actors have had a tremendous difficulty in communicating the way we treat truth.
This evening, I'm going to talk only about law's take on historical fact, for which the law assumes a real world. There's nothing postmodern about law. And I'm treating hard fact, not interpretation. On fact, law gives no role to fake news or alternative facts.
In brief, there's nothing special about law's fact-finding. The law looks for the same truth as a historian or an intelligence analyst or a scientist would. Thus, the law might not have much to teach. But I do think it has something to teach.
The law's lessons lie in the two big differences from the search for pure knowledge in a university setting. First, front and center, truth is admittedly not the only goal of law. Fairness, efficiency, and substantive goals join truth.
Together, they're supposed to produce justice. The goal of law is justice, not truth alone. For example, we suppress evidence of guilt if the evidence was illegally seized by the police because we seek to restrain police misbehavior. A multitude of goals prevails in most settings outside the university.
Second difference-- the law cannot leave decision for another day. Other disciplines can leave a question open or call a result statistically insignificant and await greater certainty to emerge. But legal actors-- and all of us in most of our daily lives-- must come to a decision. And we do so in a world of uncertainty and imperfect evidence.
As a result of these two big differences, law, more than any other discipline, studies the standards for making a decision. Just one illustration-- law has developed standards of proof for accepting a fact as true. Law employs three standards of proof on the probable side of the scale of likelihood-- preponderance of the evidence, clear and convincing evidence, beyond a reasonable doubt. The choice among these three reflects the multiple goals of law. The operation of these three reflects the need for immediate decision, as these standards put aside second-order probabilities.
So quickly, preponderance means more likely than not. If errors in one direction cause the same harm as errors in the other direction, the preponderance standard will minimize error costs. Clear and convincing evidence is used when the cost of a false positive considerably exceeds the cost of a false negative. An example would be where we civilly commit someone to a mental institution.
Beyond a reasonable doubt is reserved for the criminal system, where it is asserted that the cost of convicting the innocent greatly exceeds the cost of acquitting the guilty. Blackstone famously said, I quote, "It is better that 10 guilty persons escape than one innocent suffer." His position is not unarguable. Otto von Bismarck, for example, thought, quote, "It is better that 10 innocent men suffer than one guilty man escape."
And indeed, the founder of the Bolshevik secret police said almost exactly the same thing. Many of us don't admire these foreign thinkers. But a congressman said recently in a Congressional hearing on campus sexual misconduct procedures, "I mean, if 10 people are accused, and under reasonable likelihood standard, maybe one or two did it, it seems better to get rid of the 10 people," unquote. Apparently, innocent people must be sacrificed to make revolutionary social improvements. Truth doesn't always matter to the pure of heart.
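The cost-minimizing logic behind these standards can be made concrete with a simple decision-theoretic sketch. This formalization is an illustration added here, not something Clermont states in these terms: accept a fact as proved when its probability exceeds the ratio of false-positive cost to total error cost.

```python
def decision_threshold(cost_false_positive, cost_false_negative):
    """Probability of the fact being true above which accepting it
    minimizes expected error cost.

    Expected cost of accepting = (1 - p) * cost_false_positive
    Expected cost of rejecting = p * cost_false_negative
    Accept when the first is smaller, i.e. when
    p > cost_false_positive / (cost_false_positive + cost_false_negative).
    """
    return cost_false_positive / (cost_false_positive + cost_false_negative)

# Equal error costs: preponderance -- more likely than not.
print(decision_threshold(1, 1))   # 0.5

# Blackstone's 10:1 ratio for criminal conviction sets a much higher bar,
# in the spirit of "beyond a reasonable doubt."
print(decision_threshold(10, 1))  # ~0.91
```

On this sketch, "clear and convincing evidence" corresponds to an intermediate cost ratio, somewhere between the two extremes.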
The point here is that setting standards of proof is a matter of minimizing cost to social policy, sometimes at the expense of truth. Maybe a specific university-based example will drive the point home. Case number one-- public urination by a student, which is denied by the student but alleged by a witness. Then the clear and convincing standard applies under our campus code. The code's idea is that in a quasi-punitive system, convicting the innocent is costly.
Case number two-- sexual misconduct by a student-- he said, she said on the issue of whether sex was consensual. Then the preponderance standard applies under our policy 6.4. Public urination, clear and convincing. Sexual misconduct, preponderance.
Why the difference? The federal government told us to apply the standard of preponderance. Cornell eagerly obeyed and in 2012 adopted preponderance for sexual misconduct against accused students, not against accused faculty.
Obviously, you can go too far with this reform. In my view, social policy cannot run off without regard for truth. A reasonable-likelihood standard sets the bar too low.
Preponderance, though, might be right. It could be. It is, after all, the way to minimize errors. I think there are good reasons for preponderance, though my time has run out.
I just want to say that we're going to have to face this question again because the Trump administration seems poised to revoke the federal dictate and punt this question back to the universities. We're going to have to decide the standard of proof for sexual misconduct. A lot more than truth is in play in that decision. And that's a reality with which the law is used to dealing and on which it has something to teach.
BRUCE LEWENSTEIN: OK, thank you, Kevin. Sarah.
SARAH MURRAY: As a linguist, I'm interested in the scientific study of language, in particular the meaning of language, both how it relates to the world and how we use it. So within linguistics, we make a very simple assumption-- that sentences have truth values. Sentences are true or false.
So for example, I'm sitting down. That's true. I'm standing up. That's false.
And so there you go. That's it. We're done.
But it turns out that not all cases are so clear cut. So for example, how about the sentence, Ithaca winters are not cold? Is that true? [CHUCKLES] Is it false?
Is there an objective fact of the matter about whether that sentence has a truth value? Does it rely on standards of coldness that might vary depending on each person? Can we capture this in a theory of meaning? So there are these issues-- lots of kinds of cases that push what seems like a clear-cut notion.
And another complication to this picture of linguistic meaning is that normally, we say things, but it's not apparent whether those are true or false. So you can see whether I'm standing up or sitting down. And so you know what the truth values of those sentences are. But language is really most useful when we rely on information from other people that other people get from their own experiences-- other people's information that they then communicate through language.
So there are over 7,000 languages in the world. And each grammatically encodes different kinds of information. So for example, in some languages like English, you have to say when the described event took place relative to when you're speaking. So is what you're describing in the past, or in the future, or currently happening? So for example, it's raining, it was raining, it will rain-- things like that.
But not all languages are like English. Not all languages require this grammatical tense. Other languages have other requirements. And some, for example, require you to say what kind of evidence you have for the information you're describing.
So for example, whether you witnessed the event that you're talking about, whether you heard about it later, whether you're inferring that the event happened later on based on some kind of other evidence. So for example, I saw it raining, or I smelled it raining. I heard it raining. I heard that it rained, or it must've rained.
And in some languages, these are actually different forms of the verb that convey this information. And that's called grammatical evidentiality. That's present in about a quarter of the world's languages.
And of course, we can communicate that information in English. I just did in those examples. But the language-- our language, English-- doesn't require that. Some languages actually do require it.
And it might make you think, oh, languages with evidentials-- these directly encode how reliable information is because you're saying what your evidence for that is. But ultimately, people can say whatever they want, right? So if I say, I saw it raining, I could lie about the fact that I saw it. Or I could lie about the fact that it's raining. Or I could lie about both pieces of information.
So [CHUCKLES] adding that extra information doesn't directly guarantee the truth of that statement. So language itself doesn't ensure the truth or reliability of information. It's how we use language in communication and who's using the language and our judgments about that.
So it's a universal that people use language deceptively. But it's been argued that using language deceptively actually undermines and deteriorates the meaningfulness of language. So when we communicate, we do so against a shared backdrop of beliefs about the world. And language provides a public resource that allows us to collaborate in seeking truth.
So we share information of different kinds. And we add to our background of beliefs. In this way, we can share our experiences and what we know about the world. And in general, we have an assumption that we make that people are speaking truthfully.
And it's actually this assumption that we assume people are being truthful that allows deception to work. So nothing you can add to language-- even though there's really rich and very interesting linguistic diversity that's very important to study-- nothing that you can add to language itself directly increases reliability. But social means and social structure can-- relationships between people, how we judge a source, the goals of the conversation.
A complete understanding of the topic requires insights from many different fields. I'm actually a member of the graduate fields of American Indian and Indigenous Studies, Cognitive Science, and Philosophy, in addition to Linguistics. And the study of language and communication is even broader than that, crucially relying on interdisciplinary collaboration. So language is a public resource, and people use it for their own purposes, sometimes exploiting our conventions of truthfulness and cooperative communication for their own purposes. And no linguistic structure in isolation guarantees truth or reliability.
BRUCE LEWENSTEIN: Thank you, Sarah. Mor.
MOR NAAMAN: Madam President-- I always wanted to say that-- [CHUCKLING] panelists, and guests, modern media technology is killing truth and knowledge. Instead, our technology emphasizes only information and emotion-- information, verified or not, raw or summarized, in any form and any format, from any source or platform; and emotion, our emotional response to that information. We are biologically, psychologically, and evolutionarily wired to respond to information and emotion. But we're not wired to respond to knowledge or truth.
Our information ecosystem-- most prominently and visibly, social media-- is a well-tuned and optimized machine that plays exactly to these tendencies, supporting information and the emotions it triggers over what we may refer to as knowledge or truth. As an information scientist at the Cornell Tech campus, my job is to understand how the social and the psychological interact with technology. That's critical to do if we-- yes, us, Cornellians-- want to have a role in addressing this issue. I'll get to our role in a minute, but let me continue to paint my dark picture first.
This abolition of knowledge is not new. There have always been cells of alternate realities, often created by a radical and charismatic leader, from cults to-- temporarily, one hopes-- entire nations. But digital technology created an accelerated dynamic where on the internet, believers in anything and everything can find thousands of fellow fantasists, as Kurt Andersen recently wrote.
Furthermore, believers and even skeptics can find online a wide array of alternative facts to support any belief or viewpoint. This has been somewhat true even with Usenet and GeoCities, for those of you who remember the '80s and '90s. But jumping to current days, social media allows these pockets of alternate realities to expand, connect, bring others in, and craft and hone their messages.
You may have heard the expression "the filter bubble," coined by Eli Pariser to express that we are mostly exposed to points of view similar to ours, mainly due to a selective media diet or to personalization. But "filter bubble" is a weak phrase, suggesting that if only that bubble were popped, people would be exposed to other points of view, world peace would ensue, and everything would be fine.
But we don't just have filter bubbles. We have persuasion chambers. Our online spaces are optimized to strengthen beliefs we already have and to ignore or reject ideas that we do not agree with.
There are many behavioral and social mechanisms in play on technology platforms that make them into persuasion chambers. Some mechanisms enhance and direct participation in those online spaces, including our desire for social approval and the desire to maintain a favorable self-concept. And of course, we also see the role of reinforcement learning, where dopamine gets us hooked on likes. Our need for social approval is wired into our brains. Other mechanisms affect how we evaluate and even retain information-- for example, social proof, confirmation bias, and conversely, the backfire effect, which causes us to reject information that does not agree with our current beliefs.
Technology and social media thrive on exploiting these behavioral and social mechanisms for profit. Maximizing engagement, they call it. Facebook, to pick one example, optimizes for engagement time and usage of the service, not for democracy or knowledge. The same mechanisms they exploit and the techniques they employ result in these unfortunate outcomes, confirming biases, misinformation, giving us only what we agree with, enhancing our convictions. In other words, undermining truth and knowledge.
By the way, Facebook, Google, and others are not purposely distorting our realities. It's clear that the people there are, in fact, aware of these challenges. I know that.
I have friends-- graduates and collaborators-- in these places. I get funding from them. We can help them change.
What can we do to address these issues? What is the role of Cornell and the university? I can say more in the discussion.
But very briefly, first, we need to provide a proper non-technical education to students training in technology and related fields, so they understand the ethical, philosophical, and social issues as well as the technical ones. Second, we need to educate all students about technology and how it enhances and creates misinformation and biases. Third, we need to create curricula that can help enhance the voices of those who participate in fact-led discussion, right or left. And finally, we need to focus research on big topics that are playing a critical role here. For example, trust and attention are two related topics that I've been exploring in my work recently.
I'm sorry for painting such a grim picture on such a festive Cornell day. I heard dark speeches are all the rage at inaugurations these days.
I do believe that Cornell is the best place to address these issues with our combination of engineering, liberal arts, humanities, computing and information sciences, and with the graduate field system that Sarah just alluded to, that allows us to flexibly bridge these areas in exciting ways.
BRUCE LEWENSTEIN: Thank you, Mor. Holly.
HOLLY PRIGERSON: OK. So I'll be talking about how half-truths and truthiness affect medical care. As much as we may wish it to be otherwise, truth is not an absolute thing. It's not binary; it's on a continuum.
Science and medicine are compromised the further they stray from pure, 100%, unadulterated truth. Yet much of the medical information that exists today is likely based more on half-truths than on whole truths. Stephen Colbert coined the term "truthiness" to describe the conviction that something is true based on the feeling that it ought to be true, rather than on actual facts. As described in Wikipedia, truthiness can range from ignorant assertions of falsehoods to deliberate duplicity or propaganda intended to sway opinions.
Truthiness, like half-truths, is ubiquitous in medicine, particularly in the area that I study, which is end-of-life cancer care. Some examples-- number one, pharmaceutical companies. Pharmaceutical companies are purveyors of half-truths and truthiness. They hype drugs, stretch the truth about what their medications can and cannot do, and promote false hope by inflating expectations.
We've all seen the direct-to-consumer advertising by several pharmaceutical companies, such as the marketing of Keytruda by Merck, which is just one of many, many examples. Merck claims that their stories are "tru"-- T-R-U, without the E-- as in the "tru" in Keytruda. In fact, on their website, they say, "Keytruda offers the chance for a longer life. It's tru"-- without the E.
But it's only partially true. Keytruda may prolong life, but likely only by a couple of weeks or months, not the years that patients or family members are hoping for, and only in highly selected subgroups of patients-- for example, lung-cancer patients who express the PD-L1 biomarker. And what they don't mention is how your quality of life will be affected by the drugs you're taking when your remaining time is limited.
Another example of half-truths in medicine is the use of misleading outcome measures, such as progression-free survival, PFS. Progression-free survival is the time it takes for a tumor to start to progress again or grow. Progression-free survival is not necessarily correlated with living longer or feeling better.
In fact, some studies have shown the exact opposite. In other words, although the word "survival" is in progression-free survival, improvements in PFS, touted as trial successes, may not translate into any notable difference in survival at all. So it's this bait and switch of terms, which is really disingenuous and misleading.
Yet another example is something I've studied specifically, called palliative chemotherapy. Our NIH-funded studies of end-stage cancer patients show that chemotherapy given to dying patients does not palliate or comfort them. In fact, it does the exact opposite. It poses risks of making patients much sicker and feel much worse. And palliative chemo, as we've shown, significantly increases the risk that a patient will spend their last few days in a hospital with a lot of adverse events.
It's disingenuous to call chemotherapy given to advanced cancer patients "palliative chemotherapy." The honest term would be non-curative chemotherapy. It's just a dishonest term that physicians use.
Other untruths that I've studied include the misconception that terminally ill cancer patients cannot handle the truth about their terminal prognosis. Physicians fear that talking about death and dying will make patients needlessly hopeless. Our NIH-funded studies show that it doesn't make patients significantly more depressed, anxious, or hopeless. But it does have a lot of good outcomes. Patients talking with their doctors about the care they would want to receive if they were dying is significantly associated with more realistic life-expectancy estimates; better end-of-life care, such as less intensive, burdensome, and futile care; better patient quality of life; lower out-of-pocket costs; and lower costs to society-- that is, Medicare-- for costly end-of-life care.
There is a need for big pharma and physicians to be more forthright in sharing realistic patient survival estimates and presenting patients with full, honest information about how overall survival and quality of life are affected by the treatments being offered. Turning half-truths into whole truths and avoiding truthiness-- that is, saying what patients need to hear, rather than only what they want to hear-- is a prescription for better end-of-life care. Thank you.
BRUCE LEWENSTEIN: Thanks, Holly. David.
DAVID SHALLOWAY: In science, truth functions to describe and explain observable phenomena. Scientists use a structured framework for discourse that distinguishes data from information from knowledge and engages the veracity of each differently. Data are the observations and measurements. Information is the interpretation of the data. And knowledge is the broad statements and theories that are supported by and explain large amounts of information.
For example, DNA sequences and medical records of thousands of people may be collected-- that's a lot of data-- and compared to give statistical correlations between DNA patterns and diseases. That is information. And these, combined with other information-- say, about the biochemistry in cells-- can lead to knowledge-- statements about the causal connection between the DNA mutation and the disease.
While these distinctions aren't absolute, they are important for our communications and arguments about scientific accuracy and truth. Respect for the difference between data, information, and knowledge underlies the structure of our scientific articles and talks. The data are presented in graphs, pictures, and tables, and there is a Methods section that describes how the data were collected in sufficient detail to allow others to analyze and test the facts for themselves.
The information-- the immediate interpretation of the data-- is located in a Results section designated for this purpose. And opinions regarding the implications of the information-- the knowledge-- are located in the Discussion section. Distinguishing data from information from opinion clarifies our debates.
And it's particularly important when mistakes are found. We can reinterpret previous information, even while questioning or changing a theory. And we can reuse previous data when new discoveries demand that the original analysis be reformed.
Beyond this, academic scientists work within a culture that demands skepticism, openness, and honesty. We usually take this culture for granted, but it surfaces if any of its mores or taboos are violated. "Show me the data" goes without question, and the ease of digital sharing leaves no room for excuses. The high-quality journals now require that even our raw, unprocessed data be made available for others to check or reinterpret.
And lying-- falsifying data-- is the biggest sin there is. Most scientists have a gut reaction to falsification. Your reputation for accuracy and honesty is critically important within a pretty strong social structure.
And if you're caught in scientific fraud, your reputation is lost. And granting agencies will debar you from future funding. Your career as a scientist is finished.
Conversely, the granting agencies require that we explicitly teach and train our students in scientific ethics and in the methods needed for scientific rigor, transparency, and reproducibility. Scientists aren't intrinsically more honest than anyone else. But the culture both promotes and enforces honesty. And so while we're skeptical, even confrontational, with scientists in our own field, the flip side is that we tend to trust the conclusions of experts in other scientific fields, even when we can't follow their presentations in detail because we know that their colleagues are holding them to similar high standards.
What does this imply for the word "truth" in scientific discourse? I think most scientists are cautious about using this word. Data can be true or false, but knowledge is usually only an approximation, sufficiently accurate and useful within a given domain but without claim to absolute accuracy.
Newton's 17th-century physics still works to send people to the moon and back. But Einstein's theory of relativity showed that it was only an approximation good for objects traveling much slower than the speed of light. And Einstein's understanding of relativity was shown to be only an approximation that works for objects much larger than molecules. Quantum mechanics is needed for anything smaller. The history of science demands precision, caution, and humility when using the word "truth."
As Professor Lewenstein said, debates about what is true have real impacts on people's lives. And there's an ever-growing need for accurate knowledge and information in decision-making. And to echo Mor's comments, there are things that we as a university can do. While some of the scholarly practices and traditions of academic science may be too rigorous for general use, I think that there are aspects that we can and should promote in the broader culture.
BRUCE LEWENSTEIN: Thank you, David. Lyrae.
LYRAE VAN CLIEF-STEFANON: Known as the impossibilia figure, adynaton is a type of hyperbole-- the rhetorical figure for magnifying a thing or an event by comparison with something impossible. There are many forms of this figure, among them reversals of nature, reversals of custom, numbers or objects impossible to count, and assertions of perpetuity.
When I want to teach my students about adynaton, I usually start with Stevie Wonder's track "As," an easy in from the 1976 album Songs in the Key of Life. He sings, "Just as Hate knows love's the cure, you can rest your mind assured that I'll be loving you always. Until the rainbow burns the stars out of the sky, always. Until the ocean covers every mountain high, always. Until we dream of life, and life becomes the dream."
The formal principle of adynaton-- stringing together impossibilities-- was a way of inverting the order of things, drawing attention to categories, turning the world upside down. The eclipse on April 6, 648 BCE, seems to have given Archilochus the idea that anything was possible now that Zeus had darkened the sun, and thus, beasts of the field could change their food for that of dolphins. The Encyclopedia of Poetry and Poetics notes the position that adynaton is sometimes a confession that words fail us.
Who gets to close read or to declare what is meant to happen next in America or in the American university? Which truths do we hold to be self-evident? If America is an imaginary place where real things happen, its citizens, it seems, live in constant anticipation of its coherent iteration, at regular intervals applying the same processes to its product as those that produced it in an attempt to approximate underpinning mythic texts.
The nation desires its fictions. It longs to be struck by its own impossible beauty. Yet in a twinkling, it devolves from "All men are created equal" to the warped inheritance of Thomas Jefferson's lesser fabrications.
In the days leading up to this week's eclipse, I thought about the approach of this occasion, how both "author" and "inauguration" stem from the root "augur," a religious official among the Romans whose duty it was to predict future events and advise upon the course of public business in accordance with omens derived from the flight, singing, and feeding of birds, celestial phenomena, and other portents. On Monday, as the light went dusty, and I found myself awestruck on the Waterfront Trail, hands raised, spinning, surrounded by 50 seagulls that had suddenly, silently taken flight, filling the air around me, I could not help but think about adynaton. As the hot midafternoon cooled itself to containing the gloaming, which would depart, only to return later at its regularly scheduled hour, the eclipse re-created something familiar-- a day, but one that made space for itself within itself. The day possessed itself.
"And at that impossible poise between absolute flux and accidental suspense," June Jordan writes in her poem "Fragments from a Parable," "the most beautiful woman in the world became my mother. But as nothing is absolute or accidental, I only exchanged equilibria. I was not particularly born."
I am particularly interested in the ways black creative artists employ adynaton as a figure with which to conceive of habitable space to think through and out from distorted atmospheres of lies and nonsense. "Nevertheless, live," Gwendolyn Brooks exhorts in "The Second Sermon on the Warpland." "Conduct your blooming in the noise and whip of the whirlwind."
In 1773, in "On Imagination," Phillis Wheatley authored black fantastic heterocosms, wherein "From star to star, the mental optics rove, Measure the skies, and range the realms above. There in one view we grasp the mighty whole, Or with new worlds, amaze th' unbounded soul." I stand amazed at the work she set me towards-- the work of inhabiting the measure. She is an impossible miracle. She is the truth I am honored to defend.
BRUCE LEWENSTEIN: Thank you, Lyrae. And thank you all for those really stimulating comments. In a moment--
Yeah, please. Thank you. In a moment, we'll begin taking questions from the floor as the microphones get moved up the aisles. And also, again, for those of you who are watching online or who prefer not to come up to the microphone, you can tweet your questions to the hashtag #CornellXIV. But while we're getting set up for that, I wonder if any of the panelists want to react to the things that you heard from other panelists.
DAVID SHALLOWAY: Well--
BRUCE LEWENSTEIN: Yeah, David.
DAVID SHALLOWAY: Well, I particularly appreciated Mor's comments. I'll just mention, the Faculty Senate passed a resolution in early spring recommending that Cornell take a leadership position in honesty-- I forget the exact name of it, but that was the gist-- both in terms of our internal education, as Mor was talking about-- our training of students-- but also, we're unique among Ivy League universities in that we're partly a state institution, and a public service role is explicit in our mission. And I think this whole idea of truth, which really seems to be under attack, however you define it-- this is part of our mission and something we can take a bigger role in. And it's a good place for Cornell to exhibit leadership, which I think is what you were getting at. We have the resources to do that.
BRUCE LEWENSTEIN: Other comments? No one at the microphones yet. Well, I have a question.
It seems to me that we have heard a difference here between the notion of truth as something very stable, in the sense of the way David was describing science, and truth as a kind of meaning that is produced through a process of discussion or presentation of evidence, more along the lines, I think, of what Kevin and Sarah were talking about. Is that a distinction that is useful or helpful in how we think about truth?
David's willing to jump in.
DAVID SHALLOWAY: Don't want to hog the show. As I tried to emphasize, one of the important things in science is a clear distinction between facts and opinions. And I think that balance is different in different fields. But I think that's a critical distinction.
And we heard that in some cases-- well, in poetry--
LYRAE VAN CLIEF-STEFANON: Mm.
DAVID SHALLOWAY: --we're not talking so much about observable facts. But I think the distinction is key, regardless of how the balance comes down.
BRUCE LEWENSTEIN: Well, Sarah, were you going to say something? Oh.
LYRAE VAN CLIEF-STEFANON: Go ahead.
BRUCE LEWENSTEIN: [INAUDIBLE] [CHUCKLES] Go ahead.
SARAH MURRAY: I just wanted to say, I completely agree with-- in linguistics as a science, you know, we assume this idea of objective facts about the world and then ask how we communicate them-- this idea that people have these backgrounds of beliefs that we build on when we communicate, and that we can exploit or reinforce those background beliefs based on selective facts that we share, or facts that are based on our own experiences. So the example I tried to give at the beginning was something like, Ithaca winters are not cold. And is there a fact of the matter about that?
I mean, in one sense, we very much want to maintain this idea of objective facts and observable facts. And in linguistics, when we think about the meaning of language, we think of whether or not the language corresponds to the facts of the world or not. But some sentences push on that a little bit, which is interesting to think about how do we account for that? What do we say about sentences like that?
BRUCE LEWENSTEIN: Mm-hm. Lyrae, were you going to say something?
LYRAE VAN CLIEF-STEFANON: Oh, I was just thinking about living in a world, or in a nation, that is founded around the idea that my identity-- that I can't exist, that according to the framers there's no such thing as a black woman poet. That cannot be possible, according to Thomas Jefferson, the same person, the same author of "We hold these truths to be self-evident." And so it changes your relationship to how you think about how truth and things like that work when you walk through worlds that are so unreal. That's why I'm pointing to the turn to exaggeration that people are using to get themselves out of these crazy worlds that we live in.
And those worlds, like, cross boundaries and things-- like medical. I'm thinking right now about a diagnosis that I received with the words "likely due to the patient's wig" on it after I got an MRI. This is the hair that grows out of my head. But the world that I live in distorts facts and truth into these other things.
HOLLY PRIGERSON: It kind of raises the question of what's the purpose of truth? What function does truth serve? And I think we've all thought-- taken for granted-- that truth serves the function of just the way the world works. And you can rely-- as you said, reliable knowledge.
And once you start questioning what-- like fake news. How do you teach students to distinguish between what's real and true and reliable? And we've used the word "reliable" a lot. In science, it seems like the word is more "valid." Valid is true. And what's accurate and valid is what truth is. And then you know that it's what you think it is.
But right now, the function of truth, it almost seems like truth is so outdated. Truth is so, like, last year, you know, that it doesn't matter as much anymore. So I think that is so unsettling to people who've spent their life trying to get things accurate and reliable and true because that's the foundation of moving forward. Like, this is true. So then therefore, that's true.
And when you start doubting that, the reliability and understanding and veracity-- the validity of knowledge-- all kind of falls apart. So I am wondering, what's the function of truth?
KEVIN CLERMONT: But that's nothing new, is it? I mean, the fact that truth-- truth exists. And you can fade off into opinion, as David said. That's one dimension.
The other dimension is that truth-- there are other things that matter. I mean, if you're a scientist, and you discover some absolutely horrible bomb, do you put that truth out in the world? Do you say, no, there are other things that matter, and the truth here doesn't?
But that's an old thing. I mean, it's true that it seems to be more in the wind currently. But that's always been true, that there are multiple goals-- multiple goals. And truth is just one of them.
MOR NAAMAN: Mm-hm.
BRUCE LEWENSTEIN: You have a question here at the microphone. No? Oh. Over here. Yeah.
AUDIENCE: Oh, so thanks to all of you. This is terrific. David, I have a question for you. You, I thought, wonderfully brought up the extent to which Newton's theories were true and then the extent to which Einstein's theories were true. And then quantum dynamics comes along.
So could you say a few words about that wonderful article, "The Unreasonable Effectiveness of Mathematics in the Natural Sciences," which I as a non-scientist read with great interest? But I'd really like to hear you say something about it. I guess the basic idea was that math is a very abstract system, which we as people invented.
And yet it seems to work so effectively to describe the natural world. But I gather there are views on both sides of that-- very strong views on both sides of that. And it seems to me it relates rather closely to this question of "the truth."
DAVID SHALLOWAY: Sure. To paraphrase Einstein-- I won't get it right. But he said, the most mysterious thing about the universe is that we're able to understand it at all. And the article Hunter referred to-- there were two sides. One view was that isn't it amazing that all our math works? The other view was maybe what we see is defined by the way in which we look at it.
And to some extent, that second view is certainly true. We do experiments that define the velocity, that define the momentum. And then we can make equations about it.
Personally, I'll go with the first view, which I think is-- it's just amazing. It is a mystery that we are capable, that the universe-- the physical universe, let's say-- is describable in equations. In fact, the equations-- you mentioned quantum mechanics. The equations of quantum mechanics are very clear, and they work.
Your cell phone wouldn't work without the understanding of electronics and solid-state physics that depends on quantum mechanics. But exactly what those equations mean, what they're talking about, mm, don't know. Not just me. I mean--
--we don't know.
So there's a mysterious interplay between mathematics and the physical world and our understanding of it. I'll go with Einstein and say it's very beautiful. As a scientist, I find great beauty in that fact.
BRUCE LEWENSTEIN: Anybody else want to--
MOR NAAMAN: Maybe I will add to that. My [INAUDIBLE] is in computer science. And I will point to information science. I call myself a reformed computer scientist.
A lot of it comes with-- in computer science, there's always a tendency, like in math, to say, OK, we can formulate things. We can prove things about them. Therefore, the math will be the truth, which is OK but very limited, right?
It turns out that the real world has a lot more complications associated with it when you're creating new technologies and creating new ideas. So I think there is beauty in math, of course. And equations-- being able to formulate things-- it's not going to help us everywhere.
BRUCE LEWENSTEIN: Question here.
AUDIENCE: Hi. Yes, my name's Timothy Ju. I'm a student studying computer science. And, well, first I want to thank you all for the wonderful ideas and wisdom that you have all shared with us tonight.
I do have a question. As many of you are welcome to answer, if you would like. But my question is, what is the biggest question that you have for which you would like to know the truth?
BRUCE LEWENSTEIN: [CHUCKLES] Oh. Who wants to take that?
LYRAE VAN CLIEF-STEFANON: Just in terms of the thing that I was talking about, I really would like to know what Phillis Wheatley's mama called her. I say that, like, not-- I mean, because this is a thing-- again, it's like tied to our history. Phillis was the name of that slave ship that she was brought here on. Wheatley is the name of the people who purchased her when she was 7.
That absence, that gap was one of the things that drove my last collection of poems because that's a missing thing that I'm trying to figure out. How do we remember? How do we remember that? And then I look at the news, and I wonder, well, what are people working on?
I think that's-- you know?
BRUCE LEWENSTEIN: Mm-hm. Holly.
HOLLY PRIGERSON: I'd want there to be a truth meter-- like how to know how true something is-- because as I started the talk, I don't think that truth is an absolute thing. So when I hear things, I have an internal sense, like, where's the truth meter on what this person is saying? And I wish that there were a way to quantify that and measure that. Like, when somebody is talking, yeah, Mor, can you build an app for that?
Like, how much of what this person's saying is true or not? Is there an app for that yet?
MOR NAAMAN: We can work on it.
I'll need a linguistic person to help.
HOLLY PRIGERSON: The truth meter.
SARAH MURRAY: Well, within linguistics, often people think the idea of communication-- the big question behind communication-- is something like, what world do we live in? And when we share information, we are actually sharing facts about the world that come from our own experience. And we're narrowing in on figuring out facts about what the world is.
But echoing a lot of what people have said on the panel, I don't know. Maybe we think we live in different worlds, right? Maybe we're having different kinds of conversations.
And sometimes truth isn't the point of conversation. You think about jokes, or if you think about different kinds of conversations, getting to the truth of the world or what the facts are about the world or what people's experiences are-- sometimes that's not the point of conversation.
BRUCE LEWENSTEIN: Mm-hm. David? Mor? Kevin?
KEVIN CLERMONT: I'm trying to think of something non-sarcastic.
The next Powerball set of numbers, for example, I would like to know. The meaning of life. One thing that law constantly struggles with is not getting a truth meter or finding out what really happened but how, instead, to work with partial truths, where you can't know what really happened. And law is struggling with that. But basically, we have to find ways to deal with uncertainty and live in an uncertain world. It would be nice if we could resolve that a little better.
BRUCE LEWENSTEIN: Mm-hm. David.
DAVID SHALLOWAY: Yeah, it's a really interesting question because it raised a thought in my head. And I realized, I'm a basic scientist. If I were an applied scientist-- that is, if I had a specific thing, maybe a better battery for electric cars or whatever-- I'd have a specific place I want to go. But as a basic scientist, although on a day-to-day basis I might have specific things I'm trying to figure out, true or false, the way we actually work is we ask, what are questions that I might be able to answer? Is there an area where I can find some more knowledge?
And then, wherever that is, as a basic scientist, within some constraints, I tend to go there. So at a strategic level, I don't think so much about what's the big question to answer. It's some kind of balance between what's the big question and what can I answer. Where is the opportunity for gaining some more truth?
MOR NAAMAN: Maybe-- right, to add on that. I think we're all taking the science approach, right? I don't think about what I want to know the truth about so much as what ideas I can pursue, what answers I can give about the state of the world.
And I mentioned working on trust. I think there's a big question of how to re-enable trust in our society-- in America and beyond-- and how technology can help with that. So I don't assume that I'll find truth at the end of that process. But I think it is a good question to pursue.
OK. OK. At the risk of making this a presidential microphone--
AUDIENCE: David Skorton, recovering Cornell president.
First, I want to congratulate everyone-- President Pollack on the conception and Bruce and your colleagues on the execution. It's a fabulous discussion. But I'm troubled by the fact that it's a fabulous discussion, and there's just a few of us here.
And so my question really isn't about the academic world. It's about the general public, who we serve. And what I'm hoping for is some ideas from the panelists, maybe especially from Bruce, about how we can either bring this kind of thinking and conversation to the public or the public to this kind of thinking or both.
BRUCE LEWENSTEIN: I'll let the panelists address before I do. Anybody want to? Yeah.
What are the-- I mean, David, you mentioned in your remarks especially. Mor had some specific suggestions for what we need to do in our teaching and in our academics. You said you have some specifics, but you didn't actually give them.
DAVID SHALLOWAY: OK. Well, I did mention there was a Faculty Senate resolution pushing for Cornell leadership in this area. It was just this spring. But it's already formed a nucleus of people to start concrete actions and actually look at what kind of actions we can take and to start moving in that direction.
There's been some baby steps taken. The university library is doing some things. A number of faculty are getting together on a Knowledge Matters initiative that's going to have workshops this coming year, actually mostly for faculty but aimed at engaging faculty and training faculty just on this point of bringing it out into the public.
I think we're just starting, frankly, because there was the spring, the summer. And I personally-- and I think many of my colleagues really think-- we need to start talking specifics. What can we do, brass tacks?
And actually, David, I could send you an email with a whole list of topics that have been discussed. It's early days. But I do hope we're going to actually take action and not just talk about it.
BRUCE LEWENSTEIN: So the thing that strikes me is that a couple of topics that people have already raised-- the question of emotion that Mor raised, the question of uncertainty that Kevin especially has brought in-- these are things that we in the university have difficulty accepting as being fundamental to how we talk with publics. And we think we can find a clear way of doing it-- not everybody in the university, but many. And I think the big challenge for us may not be how do we communicate with the public directly, but how do we help-- through the Knowledge Matters process and others-- how do we help our colleagues in the academy understand that emotion is as human as rationality, and that people will respond to emotion, and that that's not wrong, that uncertainty is a fundamental aspect of many aspects of our lives?
And even in the gaps I think that Lyrae talked about, we will never know exactly what was in that gap. So it's not just uncertainty in science. It's uncertainty in history. It's uncertainty in meanings.
And we have to learn how to deal with that ourselves. And we have to find ways to convey how to deal with those uncertainties and how to recognize that emotion is part of what people respond to. And that's not something that we can ignore and think that if we just give more data, if we just give more information, if we just give more knowledge, if we just give more truth, that somehow we'll be able to overcome those things. We have to recognize those things, I think, as fundamental.
MOR NAAMAN: There may be three different things that we can communicate related to this panel. One is the meaning of knowledge and truth and the philosophical underpinning and the considerations. That's hopeless. I mean, to philosophy students, we can do it. Others, perhaps, I don't know if we can do it broadly.
The second is to learn ourselves, maybe, how to communicate science better, how to communicate ideas and findings better. I think this is something we can definitely achieve by educating our faculty and students with the newest and bravest considerations around that. And finally, what I think we can do is help the general population evaluate information better, either by educating the population or by, as I suggested before, helping the people who build the platforms through which people consume that information do something different in how they expose information, how they expose the credibility of the source, and so forth. So I think in that we can play a role-- a broad role, for sure, because we educate so many students who end up contributing to these platform services.
DAVID SHALLOWAY: I would just echo what you said. I think we need-- you're in communications. I'm in science.
I have to say that as a scientist, I spoke about the community that I live in. In my community-- in the science community-- lying doesn't happen very much. And I don't know how to deal with it. I take it for granted that people have to be honest, or they will pay a big price.
And frankly, it's a disadvantage to me in communicating with the public. I, frankly, personally-- and I think most of my colleagues-- don't even know how to deal with the level of misinformation that we see out there, with the ability to work on emotional response. We don't work that way. And we're trained, actually, in not working that way.
It really puts me at a disadvantage and I think many scientists at a disadvantage. So we need to, within the university, it may be others who can help the scientists communicate better 'cause I don't know how to, frankly. With other scientists, OK.
HOLLY PRIGERSON: It seems like truth is something of a value. And it almost seems like we're assuming that everyone wants to seek truth. And I don't know whether a typical undergrad who's worried about getting into law school or med school cares about truth or cares about their GPA because they want to get a job.
And to the extent that truth is an absolute good or a value or a virtue, it seems that the university should play a role in having students prioritize it, along with getting a good GPA. It's almost like cultivating values in what an educated person would be.
The other thought that I was thinking is that it seems also that a critical skill for evaluating truth could possibly be taught. So it's like you don't teach someone what to think. You teach them how to think about things, how to reason, and how to look at something critically. So maybe teaching means instilling, A, the value-- why that's a valuable thing to know or to do-- and, B, the technique for trying to discern what is true and what isn't true.
BRUCE LEWENSTEIN: OK. And there's someone here in this microphone.
AUDIENCE: I'm not sure some of it wasn't answered, for which I'm very grateful. Alice [? Bergla, ?] [INAUDIBLE] very [? short ?] alum, but a linguistics minor, as well. In that context, if there are approximately 7,000 languages, and each of those speakers has one truth that they're saying that may be heard differently-- the same word heard differently-- and each has a truth, is it necessary to imagine a truth to solve world peace at a peace conference? Or is trust the outweighing value?
Is knowing your ally or your adversary and then discerning the words from that-- is that how we get to some, whether it's within our own culture of black, white, red, blue, university, differences between students, or in the real world-- North Korea and so forth-- when you try to come to some table, does trust come first before truth? Or is it part of the equation, without which there is no final signing off on anything? I'm not sure that's a question.
SARAH MURRAY: Maybe more than one question.
AUDIENCE: Well, I worry about words-- how they're heard and how they're spoken and how they're heard, even with good intent, on both sides. But maybe what you were talking about just recently the last question, about education and trying to teach how to think about a way to get to at least a personal truth. But we have very big questions in the world, with 7,000 languages. And I'm just wondering how you would rank what gets us to truth. And is trust a piece of that?
SARAH MURRAY: Yeah, trust is definitely a huge piece of communication, I think. Without that, communication doesn't function, right? And so in some sense, we exploit that notion of trust. And as people, we're saying, you just assume people are going to be honest and trustworthy. That's what allows us to exploit each other in conversation or exploit information.
I do want to say about the 7,000 languages-- it's true that languages differ in these ways. One really interesting thing with respect to the evidentials, the markers of the source of information that I mentioned, is that people whose first language has evidentials often bring that along when they speak other languages. Many of you probably speak multiple languages. And you hear that, right? You feel like you bring something from your other languages to the other languages that you speak.
But you'll hear people who are first-language speakers of languages with evidentials say things like, it must be raining, or I saw it raining-- actually adding what seems to be, from the grammatical perspective of their first or native languages, more information. But it can actually make the statement sound weaker. So from an English perspective, if you hear somebody say something like, it must be raining, versus, it's raining, the first somehow seems weaker, even though from the speaker's perspective it might be adding more information about their evidence for the claim. So trust is very important.
But also, if we have different native languages that have these different conventions, that have these different grammatical encodings, like you were saying before, how are words spoken, and how are they heard? How do we interpret what people's intentions are and what they're trying to communicate? That can really be affected by these grammatical features of language that I think are not common knowledge or not part of what we know just by saying something.
KEVIN CLERMONT: But can trust really substitute for truth? I mean, you imagine two people who can't communicate. Well, I mean, trust will maybe get them to stop hitting one another. But if you are really worried about coming to some agreement, about surrendering this territory for that territory, I mean, there is going to be-- trust won't get you there.
AUDIENCE: I agree. And--
BRUCE LEWENSTEIN: So-- I want-- I don't want to--
AUDIENCE: --also trusting someone to lie is also a piece of the puzzle, too. And maybe that's the knowing who the person on the other side of the table is. But anyway, it was answered in various ways before I asked. Thank you.
BRUCE LEWENSTEIN: Anybody else want to comment on this question?
MOR NAAMAN: Maybe to briefly say, I think trust is, when we talk about [INAUDIBLE] communication, critical. It does trump truth, in many cases.
You talked about other people's information, right? And we believe the source more than we can evaluate the information. We'll just accept it based on the source itself.
It's been interesting to look at trust. Generalized trust has been declining-- trust in others, trust in institutions, in the US in particular. But on the other hand, we see mechanisms that help us trust each other in ways that were not imaginable just 10 years ago.
So hitchhiking has been almost non-existent in the US for many years now. Yet anyone can click three times on their iPhone to summon a total stranger to pick them up via Lyft or, god forbid, Uber.
So I think there are interesting opportunities that will be created to enable this trust between strangers using technology. Obviously, it can be used in different ways-- has been used in different ways as well. So we need to find out what we can do that can connect people beyond their similarity, beyond the tendency to trust each other based on race or gender or whatever else that is-- language-- and allow them to trust each other, not just to get themselves to the airport but in other settings.
BRUCE LEWENSTEIN: Mm-hm. Oh, I'm sorry. Over there.
AUDIENCE: Oh, good. Phil Hanlon. I'm a visitor from one of those other Ivy League institutions.
And I also want to thank you. It was a delightful panel. I really enjoyed it.
I was at a presentation by Michael Dimock, the president of the Pew Research Center, fairly recently. And his job that evening was to describe to our trustees the class that would be entering 10 years from now. What are they going to be like?
And the thing he hit on the most was that we are in a period of evolving notions of authority. So we are moving from experts, which has been the history, to crowdsourcing. So for the rising generation, according to Michael, the notion of authority is how many likes something has, or what appears on Yelp rather than in The New York Times. So given that we in the higher-ed business are kind of selling experts-- that's what we provide, is experts-- if this is correct, what does this say about the way we're going to have to educate students in the future?
KEVIN CLERMONT: Well, we won't have to educate them, right?
That job is going to be fairly easy.
BRUCE LEWENSTEIN: Mm-hm.
HOLLY PRIGERSON: Well, it sounds like there are going to be consequences to crowdsourcing truth-- negative consequences to realizing that people can have lots of opinions but that the value of those opinions varies. Say, for example, you're not going to crowdsource what your cancer diagnosis is. The public is not going to crowdsource something that's really, really technical. That's really not in danger.
The thing that seems to be more in danger of being crowdsourced-- of having users or internet likes influence what you know or don't know-- is more in the realm of the psychosocial or political or spiritual domain. So I think it's field-specific. But there still will be experts, and things you can't just ask people's opinions about. It's not about opinion. It's about, I'd like to think, certain known truths. So I'll stop there.
BRUCE LEWENSTEIN: OK, David.
DAVID SHALLOWAY: I find that projection scary, especially in my area. But I'll throw out something positive that I don't understand, and that's how Wikipedia works so well, at least in the technical kinds of subjects that I look at. You look at math in there-- I don't know about personal biographies and such. But if you look at the technical subjects in Wikipedia, they're pretty good.
And you can imagine it just wouldn't work. It could be vandalized. It could be subject to a lot-- but I find it's pretty good.
This is not crowdsourcing. But it's something different from a textbook written by experts. I'm actually pretty interested in how it manages to work. Maybe there's something there to learn.
MOR NAAMAN: But I think that I would call Wikipedia crowdsourcing. I think it's a great example. And I think the students of the future, as we think about higher education, I would love for them to teach themselves and create an environment where we can support them in teaching. And I don't think our role will be exactly the same in 10 years.
But I think the role of the university as a learning environment-- not just learning, by the way. They get a lot of other things. And we should support those as well. But the learning environment will be even better if we can figure out a way to work into their new ways.
BRUCE LEWENSTEIN: We have a question that's coming from Twitter that I think connects to this one we've just been talking about very well. It's from Jennifer Hilliard. "In a world of truthiness, how can we foster trust in the truths that are able to be verified?" It's a version of this expertise question. How do we foster the trust?
DAVID SHALLOWAY: I want to throw out two little incidents that have happened. So again, I have to remind you, I'm a scientist in a world with a very strong culture that insists on honesty and truth. Yesterday, I was speaking to a seminar speaker.
He was telling me about the results of someone else. He said, she's not skeptical enough, right? That just-- because implicit in all our discussions is that skepticism is good.
I had a student talking to me. And he showed me some conclusion. And I said, well, how did you get that? And I kept hammering on him.
And he said, I get the feeling you don't trust me. And I told him, of course I don't trust you. It's my job not to trust you. And I don't expect you to trust me either. Our job is to force each other to defend what we have to say.
So in my five minutes, I said that we tend to be very confrontational within our own area. This gets me in trouble with my wife--
--because-- who's in the audience-- I have a habit of being confrontational with the people I know well. That's the style. And we feel that confrontation is the best way to get to truth.
I tend to be trusting, though, as I also said, because I do look to experts in other areas. And part of that trust for them comes from my knowing that they're being subject to confrontation by their colleagues in their expertise. These are things that I value as an academic scientist.
I don't know if there's any hope of moving somebody's ways of relating to truth into a broader sphere. I would hope so. I never thought about it before, save the last year, when the whole issue of truth became such a public issue.
BRUCE LEWENSTEIN: Yes, is there a question here?
AUDIENCE: Many years ago, as a graduate student, I sat in this auditorium and heard Martin Luther King talk to the Cornell community at a time when at other universities he was not welcome, and in other cities he was put in jail. So I'd like to pose to the panel a practical question. How do you bring into the university environment the people you despise-- the liars, the cheaters, from your point of view?
And they may think that they have truth. Learned Hand-- truth is so precious. Use it sparingly.
What's the role of the university in bringing in the people that we don't like or we don't agree with? And how do you expose it without having riots in the streets? And how do you get students, faculty, everybody to hear what people have to say that you may not believe is true or you may absolutely hate? And I think that's a broad question to anybody on the panel who would like to try to address it.
KEVIN CLERMONT: Well--
BRUCE LEWENSTEIN: Anybody? Anybody want to--
KEVIN CLERMONT: President Pollack--
HOLLY PRIGERSON: First Amendment.
KEVIN CLERMONT: --probably should address that one. But--
--it is. It's an incredibly difficult question. I mean, the university has no obligation to bring people in. But if they do bring them in, should there be any kind of censorship?
The ACLU today has been suggesting that they're backing off from their absolute free speech thing, that they're not going to be representing extremists who want to march with weapons. Mm, I mean, that is sort of-- so I mean, I don't think there is an easy answer. I mean, that's going to be one of the really tough questions for a university in the future, as to what speakers to invite.
AUDIENCE: Today it's a problem.
KEVIN CLERMONT: Yeah, yes.
BRUCE LEWENSTEIN: So--
KEVIN CLERMONT: Well--
BRUCE LEWENSTEIN: --others on the panel who want--
KEVIN CLERMONT: --not tonight.
BRUCE LEWENSTEIN: This is a question for the new president, clearly. [CHUCKLES] Tomorrow, she says, she'll address it. Question here.
AUDIENCE: I'd like to maybe draw on the comments that many people on the panel made in their original presentations but to really ask them a little bit more directly. Many of you talked about what I would call incentives to depart from the truth. And we see them in all domains.
I was interested to hear what Professor Shalloway said. But I've seen studies that indicate that many scientists do not report all their data. Many scientists exclude data that departs from their conclusions.
We know about some of the big pharma people that were talked about. The elephant in the room here is politicians and the incentives they have to depart from the truth-- incentives that didn't begin in November 2016 but were accelerated by 2016. And it seems to me that the task, really, is to deal directly with those incentives and to make them less rewarded than they are now. I'd very much like to hear the panelists comment on that.
HOLLY PRIGERSON: I agree 100% with that truth right there. I wish that I were exposed to all the noble scientists that you've been exposed to. I've had students who've taken the truth from their results and twisted it, taking out outliers. And there are ways to manipulate data to have it say things that you want it to say. And I've seen people manipulate data because they want to have a publication.
And the incentives are everywhere. There can be rewards for not telling the truth. And we see that there are very few penalties for not telling the truth. So it seems like, if there are so many incentives for lying and getting away with things so that you can achieve some end other than the pursuit of truth, there also need to be some-- it sounds so draconian-- penalties. What are the costs for not telling the truth?
And I mean, you did mention that you can get disbarred, or the NIH won't fund you if they find you've fabricated data. But aside from those things, how can you instill that in the university-- how can you have students who have other motivations? And the incentives for cheating and not being honest-- we haven't really discussed them. There really are motivations for why somebody might not be 100% honest.
So how does a university try to incentivize and penalize the instances when truth isn't the goal always? So I think it's an excellent question. I don't know the answer, though.
BRUCE LEWENSTEIN: Mm-hm.
MOR NAAMAN: And in the technology world, it's maybe a little easier because it can be formulated. So what is the incentive for a lot of the fake news? The incentive was to make money. They brought a lot of people, a lot of views onto their sites that had ads. So if you take out that incentive-- that may be getting to the tricky part-- then at least you can fight some of the fake news, or even some of the websites that have a strong agenda to manipulate public opinion but rely for their existence on ads and on money that they get from search engine ads and so forth.
So for example, Google at some point removed their ads from a series of sites that were shown just to post fake news articles. And those articles had no more reason to live. They were no longer making the dollars they were before.
So that's the good news there. It's hard to do. And it will be hard to do on a continuous basis. But at least there is some idea of the incentive in that case.
BRUCE LEWENSTEIN: Anybody else? David.
DAVID SHALLOWAY: So there's a few points. I tried in my talk to always preface the word "scientist" with academic scientist. The culture I talked about is one of academic scientists. There's a big difference between, say, company scientists and government scientists and academic scientists.
So we're all human. We all have rewards that go against going for the truth. But I can imagine the pressure is on, say, an industrial scientist-- and we know there are industrial scientists or other scientists who just come out and lie, and then we hear they had made a lot of money.
You can do that in academic science, too. In academic science, my feeling is if I'm-- look, well, what are the temptations? Fame is a big one. Money, I don't think so much.
But we are human. There's no nirvana. But what I appreciate about our culture is that your reputation does count. And so you're running a risk. And if you're caught, you're in trouble.
When I saw a debate between a congressman and a scientist-- climate scientist-- on climate change, I'm thinking, here's a guy-- the congressman-- if he lies, and he's caught in a lie, no difference to his career. They do it all the time. But if the scientist is caught, he's finished. I'm not saying he will always be caught, but there's a difference there.
And I think at least we have tried to set up a-- I don't want to be Pollyanna. But at least the issue is raised there. And on this issue of pharmaceutical companies-- actually, the National Institutes of Health just caught up on this. There was a study showing how many drug leads couldn't be reproduced and so forth. So just last year they put in a requirement that we emphasize-- we already had to teach ethics. Now we have to teach reproducibility, transparency, and I forget the other word.
It's not perfect. But at least we're putting in the effort to do this. And we have to contrast this with what goes out outside of academia. I'm pretty proud, actually, of what we do internally.
BRUCE LEWENSTEIN: [INAUDIBLE]. So I see we still have a number of people who'd like to speak. But we have unfortunately reached the end of our time. And so I'm going to try to see if I can wrap this up.
I think in the last 90 minutes we've heard a number of really fascinating comments. I have a whole list of themes that I saw here. I'm going to pull out four of them. The first one I think I heard was the function of truth-- the function of this search for the truth-- that it's part of what motivates us. But in the different situations we described, it also serves different functions. So there's both a functional level for us and a functional level in society.
The second one is the role of the social-- the social approval, the social norms by which truth is judged and by which we can perhaps enforce norms of trust. That's tied to culture and language and so forth. And I think trying to understand the social aspect of truth is important, which leads then to the third principle, which is trust. That ran through so much of what we talked about.
And I think we have to understand more about the role of trust in how we search for truth and in how truth is used and judged elsewhere, which leads to that final fourth point that I think has just come up, that a couple of these questions have really challenged us on, which is, so what do we do? What's the need for action as we move forward? And I think that may be something we can take away.
Before we close, I want to thank the staff of Bailey Hall and the other people who contributed to organizing and producing tonight's symposium. While I think many of us may disagree about some of the points that we've heard tonight, I think we can agree that rich conversations like this are the essence of universities and the search for truth. So I thank President Pollack for giving us this opportunity during her inauguration to demonstrate the value of intellectual inquiry and to help us have these thoughts about where we need to move forward. Will you please join me in thanking tonight's panelists?
Thank you, and have a good evening.
This faculty panel discussion explores the ways in which people assess and communicate information, as well as the role of universities in those activities. Moderated by Bruce Lewenstein, professor of science communication.
Part of the two-day celebration of the inauguration of Martha E. Pollack as the 14th president of Cornell University.