– Welcome everyone to Wednesday Nite @ The Lab. I'm Tom Zinnen, I work here at the UW-Madison Biotechnology Center. I also work for UW-Extension Cooperative Extension, and on behalf of those folks and their co-organizers, Wisconsin Public Television, the Wisconsin Alumni Association, and the UW-Madison Science Alliance, thanks again for coming to Wednesday Nite @ The Lab. We do this every Wednesday night, 50 times a year. Tonight it's my pleasure to welcome back to Wednesday Nite @ The Lab Robert Thorne. He's a professor in the School of Pharmacy here. He was born in Seattle, Washington, and went to high school at John F. Kennedy High School in Burien, a town near Seattle. He went to the University of Washington as an undergrad and majored in chemical engineering. Then he went to the University of Minnesota and got his PhD in pharmaceutics and neuroscience. Then he did a postdoc at the Medical School at New York University. In 2010 he came to UW-Madison. Tonight's topic is a pressing one: the whole issue of biomedical research ethics, higher purposes, and challenges going forward. It's been a pressing issue for as long as we've had medicine and pharmacy, and I think it's going to be interesting to hear how Robert has taught his students over the years and how things are changing year by year, and in some cases week by week, as new case studies develop. So would you please join me in welcoming Robert Thorne back to Wednesday Nite @ The Lab.
(audience applauding)
– Thank you, Tom. So it's a real pleasure, as always, to come here. As he said, I was here a year and a half ago talking about something totally different. So I'm going to share with you today a little bit of what it's like to be a student here and learn about research ethics: what we try to impart, and what our perspective is on the challenges we face today. I've always been interested in ethics. I've always been in science, but I've taken a number of ethics courses, at the University of Washington and at the University of Minnesota, and then I've taught ethics at two institutions, here and previously in New York. So I've hopefully taken some lessons away from that over time. Now if I were going to talk to you about my research,
which is essentially studying the physiology of brain fluids and the circulation of fluids in the brain and how that affects drug delivery, particularly of large molecules, antibodies, and viral vectors for gene therapy, if I were going to do that I would start with a disclosure. And indeed, a year and a half ago when I was before you, I did start with a disclosure just like this, where I acknowledged that I periodically receive honoraria for speaking to various entities, sometimes companies; that I'm an inventor on patents and patent applications; and that I occasionally serve as a consultant to companies (Genentech, Shire, Denali Therapeutics) and other entities, some legal consulting. I'm not going to talk about my research today, but I start with this because it bears on research ethics. Why do I disclose? Why do I talk about this? It's really to let you know what my background is and whether I have any conflicts of interest. This is a topic we get into when we teach our students research ethics, when they're starting their PhDs. And you might think, well, this is a pretty simple concept. But even very recently, and maybe some of you saw this in The New York Times a little over a week ago, there was the chief medical officer at Memorial Sloan Kettering in New York City, Dr. Jose Baselga. He's one of the world's foremost experts on cancer.
He previously served as president of the American Association for Cancer Research, and among the many things that organization does is set guidelines: how to conduct research, how to disclose your potential conflicts. And the reason he was in the news is that he hadn't done this himself, in dozens and dozens of publications in some of the best journals. About five days later, he was relieved of his position at Memorial Sloan Kettering. So it may seem a trivial thing sometimes, and certainly he must have thought so, but consider even something simple, like starting your talk or including a statement in your published paper saying, these are my conflicts, I wish to disclose this. You can see the ramifications, even for someone at the very top of the clinical cancer research field.
So what I hope to talk with you about now, and give you a little bit of a flavor of, is how we approach some of these issues at the university, not only here but at other universities. How do we talk about ethics in science? And I guess the place to start is to differentiate between research mistakes and misconduct. Lots of people, junior graduate students often, and the lay public, might think that when there are problems in the research literature, probably a lot of that is just honest mistakes. If a paper's really wrong and then it's retracted, maybe that's just error: someone made a mistake and didn't catch it. And the truth, unfortunately, is actually the opposite.
So I usually have to explain to our students what Calvin and Hobbes is; maybe with this audience I don't have to do that. It's a cartoon by Bill Watterson, and I've always liked this one. Calvin's talking to Hobbes, his imaginary friend: "I don't believe in ethics anymore. As far as I'm concerned, the ends justify the means. Get what you can while the getting's good, that's what I say. Might makes right, the winners write the history books. It's a dog eat dog world, so I'll do whatever I have to do and let others argue about whether it's right or not." And then Hobbes pushes him in the mud. "Hey! Why'd you do that?" "You were in my way, now you're not. The ends justify the means." "I didn't mean for everyone, you dolt, just me!" "Ah." Ethics, and what people want to have as their guiding principles, is sometimes dictated by which end they're on. And so "the ends justify the means" is a sort of Machiavellian concept.
So what is an ethical guiding principle? I'll start as if we were at the very beginning of our research ethics course; we've just started one for about 40 graduate students in pharmaceutical sciences and in the Institute for Clinical and Translational Research's Graduate Program in Clinical Investigation. And so we have to start at the beginning. Ethics is in the realm of philosophy, and philosophers divide ethics into two areas. There's normative ethics, where you try to establish what's morally right and wrong with respect to an action. This is really where we typically are in applied ethics, in biomedical ethics, and in research ethics in science; this is the area we tend to reside in. Then there's the more scholarly consideration, meta-ethics, which is really for philosophers to sit around and debate moral concepts: concepts of duties, rights, moral reasoning. I won't say much about that. Maybe a little more pertinent are ethical theories, and there are two broad categories of them. One is based on consequences; consequences drive it entirely. This is the teleological or consequentialist group of theories, where right or wrong is exclusively a function of what happens, of the outcomes. Probably the best example of a teleological theory is utilitarianism, where you're trying to achieve the greatest good for the greatest number of people. Pretty simple, that's the goal. And you can contrast that with deontological theories, after Immanuel Kant, the German philosopher, which are duty-based. They say, OK, forget consequences, you have a duty. It doesn't matter if it achieves the greatest good or not; you're motivated by a duty. Some classic examples of this would be the golden rule, do unto others as you would have them do unto you; and, from the Hippocratic oath in medicine, the principle of beneficence, first do good, and the principle of non-maleficence, first do no harm. These are duty-based examples.
So what we often try to do is take that and simplify it down for PhD students. The simplified approach might be to say, well, if you distill it all down, what's the point? Do the right thing, avoid misconduct. And the challenge sometimes in science and in medicine is that you can feel like you're put in situations where you're damned if you do and damned if you don't, and you're forced to choose one way or the other. So how do you go about that? This is one of the big things we try to impart to students: a way of thinking through problems.
So back to this question about misconduct and mistakes. What distinguishes them? Scientific misconduct involves deliberate deception, and there are three main parts to it. One is fabrication: you just make the data up. Another is falsification: you manipulate or change data or results so the research is not accurately represented. And the last is plagiarism, where you copy or take the intellectual property, some passage of another person, and pass it off as your own without proper attribution. So these are the big ones. What are the common motivations that drive people to do that, particularly at universities or in high-level science? One is career pressure. Lots of faculty and students and postdocs are under pressure to succeed, pressure to publish, pressure to be productive, pressure to achieve certain metrics: this many papers a year. Certain departments will actually tell their junior faculty, we need this many, five a year.
So there's pressure. Another is a belief, maybe, that you already know the answer: I don't want to do all this work to get there, so I'll take a shortcut and get there quicker. And then another, a little more nefarious, is someone trying to cover their tracks. Science has become so complex that certain experiments, even for people right in the same field, are hard to really check and reproduce, even if they wanted to and had the resources to do so. So if someone wants to cross over to the dark side in such an area, it's a little easier to escape detection. In science there are also work structures, the ways we organize laboratories, that can sometimes create conditions in which misconduct is more likely to arise. And there's some nice work on this by Sydney Brenner, the Nobel Prize winner.
So what are these work structures? Well, some labs are really big, and sometimes the PI, the principal investigator, is never in the laboratory. They're in their office; it might be down the hall, it could be on a different floor. They may be traveling a lot; they're on the lecture circuit. The more famous the scientist, the more often they travel. So the connections between the principal investigator and the bench are sometimes not direct, and that can be dangerous. Mistakes can be amplified if scientists are then afraid to correct them. I often say: so you're a beginning graduate student, you come up with an amazing result, you show it to your principal investigator, your advisor, and they're so excited. A week later the advisor's already working on a grant for five million dollars, and then a couple of days later, ah, you realize you made a mistake. You've got to knock on the door and say, you know that five million dollar grant you were writing? Yeah, that was my bad. So the student has to feel comfortable enough, empowered by the PI, to say, this was my mistake, and not feel pressured to think, oh, but I can't correct it now.
These are real examples; that's why we bring them up. So, you know, sometimes there can be a cooperative crime. If someone commits misconduct, or maybe makes a mistake as in that example and then doesn't correct it, and the PI has created a situation in the laboratory where people don't feel comfortable speaking out and saying, I made a mistake, then that's a problem. All of this really gets back to the idea that we have to be open about our mistakes in science, and we have to be transparent. It is absolutely essential. Many mistakes are unintentional, and sometimes they're even unavoidable. Why do we have to be accountable to the public and politicians? Well, that's pretty obvious. We govern ourselves. We're like a club, a privileged, elite group with our own rules. If someone gets out of line, we're expected to police our own ecosystem, our own community.
And the second point here is really important: even though that's true, other people pay the bill. The vast majority of research at a university like this is funded by federal money, taxpayer dollars. So we really have to be careful. It is a privilege to be a club and to police yourself, and if you don't do it properly, there's a nice, long history of Congress stepping in to help you do it. One of the principles in science is that you have hypotheses and you test them, and many of these hypotheses are wrong. This is a quote I've always loved from the astrophysicist Carl Sagan: "Hypotheses in science are sometimes wrong. It's perfectly all right. It's the aperture to finding out what's right. Science is a self-correcting process."
This is something that I think is lost sometimes on people who do commit misconduct: if the result is important enough, if enough people care about it, eventually it's going to come to light. People are deluding themselves if they think the fraud is never going to be discovered. Now, for the science that people don't care about, that's on the fringe: a third of all papers in the scientific literature are never cited. To me that's the most frightening statistic I know of. There's a lot of work that people do not care about, and if there's fraud and misconduct in that work, it's never going to be discovered. But the important results will be tested. And so it all gets back to public trust. Particularly in today's world, as scientists we have to be concerned with public trust. We should never, ever take it for granted. And then you go back to Nietzsche, the German philosopher. I also like this quote: "It is very advisable to examine and dissect the men and women of science for once, since they for their part are quite accustomed to laying bold hands on everything in the world, even the most venerable things, and taking them to pieces."
What we do as scientists is to examine things down to real detail, the whole idea of taking things apart. Well, as a group, if you're going to dish it out, you have to be able to take it. So trust we can't take for granted, and we have to be willing, and expect, to be put under the microscope ourselves. So, challenges. That's a little bit of the introduction you'd have in a class. We talk about ethical theory. We talk about motivations. We talk about the need to communicate your work, the role of the public, the role of funding bodies. The graduate school at the University of Wisconsin, like many good graduate schools, is very international. We have students from the United States and students from around the world. So another big aspect of these sorts of courses is that we're trying to get everybody on the same page. If you're coming from China or from India, you need a little bit of an introduction to the National Institutes of Health and how the Department of Health and Human Services sets policy, and what have you. The other thing we do is talk about some history and give examples. Ethics can be quite dry, particularly in the context of science, unless you bring up some examples, and so I'll try to give you a little bit of a flavor of how we do that. Research ethics has a sometimes dark history.
Some of the key parts are important in teaching a course. When I teach about human experimentation, where do we start? We start in World War II; we start with the Holocaust. Why do we do that? Because Nazi medicine, the Nazi biomedical program, was the classic slippery slope: it started with racial hygiene ideas and sterilization, then went to medical killing, and ultimately to mass murder and this notion of "life unworthy of life." Over six million Jews, Gypsies, Poles, and homosexuals were killed in concentration camps. How does that relate to ethics? Obviously, it's wrong. Well, it turns out that after the war these people, doctors, anthropologists, people in the pharmaceutical industry, had to be tried. And there wasn't a code, there wasn't some kind of ruling principle for medical science that everyone could agree had been in place.
And so the Allies developed a code of ethics that they used to try these individuals at the Nuremberg trials. That Nuremberg Code of Ethics had two really prominent principles. To illustrate: one of the "research" studies conducted by the Germans, and I put research in quotes, looked at drowning. How long could people survive in freezing cold water, and what could be done to help them recover? They were losing a lot of German pilots who were being shot down in cold seas. So, the two principles of the Nuremberg Code of Ethics. First, informed consent: if you're a human being and you're going to be in a research study, you have to consent. The individual in that study did not consent to be put in cold water. The other is that unacceptable morbidity and mortality, death, being hurt, cannot typically be an endpoint in an experiment. You can trace the Nuremberg Code of Ethics to some of the principles we have today. The Nuremberg Code gave way to the Declaration of Helsinki, and on to a lot of what you can see today. And those two principles are still very prominent: informed consent, and morbidity and mortality not being an endpoint.
Another example: animal experimentation. This was the Wild West at one point in science. It was unregulated, and then, much like a lot of the examples we cover in class with students, it was exposed by journalists in the press: an unregulated industry, some of these animals being treated horribly. Out of that came the Animal Welfare Act in 1966. And we've long had, and still have, a very vigorous debate over animal rights, alongside human rights. Then another really famous study, just to say a few words about it, would be the Tuskegee study; some of you may know about this. An absolutely horrid example of U.S. research. In the early 1930s, the Public Health Service began a study, again "study" in quotes, to determine the course of untreated syphilis in 400 African American men from Macon County, Alabama.
The men had syphilis, and that's what the researchers wanted to study: the course of untreated syphilis. The problem was that about 15 years later, penicillin was discovered to be an effective therapy. The people conducting this study had a fork in the road. They decided they wouldn't tell these men that there was a therapy, and they would keep following them, because they thought this was the last chance they would have to observe the course of untreated syphilis. (sighs) 25 years after that, again, a reporter finally exposed it. Syphilis patients died untreated.
A year later there were Congressional hearings, lots of investigations. The study was shut down, of course, and out of that came a whole bunch of new laws, including the National Research Act, which created the National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research. Regulations were codified to absolutely require that human subjects research of any kind be reviewed by an institutional review board. It wasn't all that long ago, the '70s, when that really came about. And there was a presidential apology. There was another principle violated here, too: if information comes to light, as an investigator you are responsible for telling your study participants. If something happens that would affect their consent, their willingness to be a part of the study, they need to know.
So jumping ahead to today, to give you a flavor of some of the issues. That was the background, some of the key cases we use to set the stage before talking about what the codes and principles actually are, which can be somewhat drier material. What about today? One of the big topics we cover is the reproducibility crisis in science. A number of years ago, a couple of scientists with the company Amgen did an interesting study for their company, where they tried to replicate some of the biggest, most important findings in the cancer field over the previous 10 years. They looked at 53 studies. Ultimately they were only able to replicate six of them, about 11%.
11%, based on what was reported in the papers about how to do the work. Why did they do that? Well, Amgen had a stake. They wanted to know where they were going to put their chips, what they were going to invest in going forward. So ironically, the reproducibility crisis is one area where our industry colleagues are way ahead of the academic community, because it's hard to get resources and money in academia to reproduce someone else's study. But in industry they have a reason to do so: they have to know if they can trust a result before they spend the next five, 10, or 20 million dollars to develop it. And lots and lots of work across many different fields has revealed this to be a big problem.
Shifting gears again, now let's go to misconduct. What kind of fraud is out there? What are some of the classic case studies that we talk about? One in particular is at the highest levels. This fellow was the dean of a school at a very respected university in the Netherlands: Diederik Stapel. He has been the subject of a lot of articles; there was a really remarkable New York Times Magazine expose on him a few years back. He was a Dutch social scientist and an academic star. For any of you familiar with academia: he got his PhD in 1997, by 2000 he had become a professor, very quickly, and by 2010 he'd moved to another university to become dean. So he had this meteoric rise. He was one of the most successful scientists in all of Holland, hands down, no question about it. So what happened?
As written in this New York Times Magazine piece, they have some quotes from him. He was convinced his hypothesis was valid. "I said, you know what, I'm going to create the data set," he told the reporter. He said he felt both terrible and relieved. The results were published. "I realized, hey, we can do this," he told him. What did he do? The way some of this social science works, you're testing some hypothesis about behavior, and you're trying to be unobtrusive. His students and the people who worked for him would come up with an idea, and he'd have to run the study; he'd have to go and observe people somewhere. So he would take the studies devised by his students and go off somewhere in the Netherlands and run them. That was hard to do, and so eventually he said, well, you know, I'm confident in the hypothesis; I'm not going to run the study, I'll just make up the data. So he pretended to run the study and then gave the data back to his graduate students. And you might think, well, OK, what kind of research is that going to be? Here's one example, and for those of you who don't know, Science is a pretty good journal. It has what you would call a high impact factor; Science, Nature, these are journals that people really compete to be in. This is one of his studies: "Coping with Chaos: How Disordered Contexts Promote Stereotyping and Discrimination." It's basically the idea that if someone is judging people and there's some trash in their line of view, they're going to judge more harshly. Explosive stuff. He made it all up.
His career took off. He published many, many studies, many with his doctoral students. They didn't appear to have questioned why their supervisor was running many of the experiments for them, nor did his colleagues inquire about this unusual practice. Ultimately, at the end of 2011, there was a whistleblower, several whistleblowers in fact, and the universities unveiled their report at a joint news conference. He'd committed fraud in at least 55 of his papers.
It affected 10 PhD dissertations written by his students. Those students' work is tarnished, with all the time they spent there. Now, there are a number of ways of following fraud and misconduct in science. One is a beautiful website called Retraction Watch, and there's such a thing as the Retraction Watch Leaderboard. This is something you never want to be on. (audience laughing) Basically, at the top of the list is the most retractions, and that honor belongs to Yoshitaka Fujii, who was an anesthesiologist in Japan. But Diederik Stapel's number three.
So, takeaways from some of this that we talk about with specific cases in the class; if you were with me over the course of a whole semester, we'd get into it. Nearly all of the top 30 are men, not a single woman. The other thing that's kind of interesting: in the top 10, half of them are Japanese. So you can get into the cultural aspects, and obviously gender considerations, in the causes of some of this. And so we talk about this. Again, this is relevant for diverse student bodies where we have students coming from all over the world. We want to try to understand; I learn from them as much as they learn from me. I'm trying to understand as well.
The second example is a lot closer to home and a lot more junior. We had a first-year PhD student in their first semester at this institution, and they were given a take-home exam. The take-home exam involved reading a paper, and the instruction was, I'll read it because it may be hard for some of you to see: "Please summarize this paper," and, in all caps, "IN YOUR OWN WORDS." And in the answer from the student, wherever you see yellow here is taken word-for-word from some source, Wikipedia or the paper itself. What you should take away is that there's not a whole lot of text that's not in yellow. And this is part of his answer; the next page was equally bad. So Wikipedia was cobbled together with another source, and the Wikipedia text was not attributed. There's a name for that cobbling together of different things: it's called pastiche, or really patch-quilt plagiarism.
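As an aside for readers curious how this kind of word-for-word overlap can be flagged automatically: the core idea is simple enough to sketch. Here is a minimal, hypothetical illustration in Python that compares short word sequences ("shingles") between a submission and candidate sources. It is not how any particular commercial checker works; the texts, threshold, and shingle size are arbitrary placeholders for the example.

```python
# Minimal sketch of flagging word-for-word overlap between a submission
# and candidate sources. Illustrative only; real plagiarism checkers are
# far more sophisticated (stemming, paraphrase detection, huge corpora).

def shingles(text: str, n: int = 5) -> set:
    """Return the set of all n-word sequences (shingles) in the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap_fraction(submission: str, source: str, n: int = 5) -> float:
    """Fraction of the submission's shingles that also appear in the source."""
    sub = shingles(submission, n)
    if not sub:
        return 0.0
    return len(sub & shingles(source, n)) / len(sub)

# Hypothetical usage: the texts below are placeholders, not real data.
submission = "..."  # the student's essay text
sources = {"assigned paper": "...", "wikipedia article": "..."}
for name, text in sources.items():
    frac = overlap_fraction(submission, text)
    if frac > 0.2:  # arbitrary threshold for this sketch
        print(f"Possible unattributed copying from {name}: {frac:.0%}")
```

A high fraction of shared five-word sequences is exactly the kind of signal that produces the yellow highlighting described above.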
And so that was the essay. This particular student was also enrolled in a high-level graduate survey course on research ethics. The instructor, I won't divulge who that was, but you may be familiar with him after tonight. And so the student also had to write a research paper. We go back and look at this research paper, and the topic that this particular student chose was particularly ironic. He wrote on plagiarism. (audience laughing) You can't make it up. (audience laughing)
So here's one source, here's another source, and here's the paper that this particular student turned in. Again you notice hardly any text that's original, and there's a deceptiveness to this. So ultimately, getting into what happened here: there are lots of things you have to do as a professor when you encounter this sort of thing. These are students who would go on to work in laboratories that use federal money to do research. Is this the kind of person you would want in a laboratory like that, doing important work, maybe even work that would affect clinical studies?
So this student got an F in the research ethics course and a failing grade in the other survey course. And then, you know, graduate students come to us sometimes with prior research productivity. This particular student had two first-author publications in the literature, so we went back and looked at those, and what do you think we found? We found the same thing. Ultimately this student voluntarily withdrew from the university. They were here one semester. A disaster for all involved: certainly a disaster for him, a disaster for us, and for all of the work and resources we put into training these students. Are we doing enough to fight this at universities, not just this place, but in general? Lots of universities subscribe to plagiarism-checking software. Some do not.
This is one that does not, yet. Why not? Well, it's probably complicated; it comes down to resources. This is our own site for detection resources: "While campus does not provide subscription access to these resources, individuals or departments may subscribe on their own." The really good software that's used by the best journals, Nature Publishing Group, Science, is subscription only, and they typically will only do subscriptions for an entire university, not for departments or individuals. So you really need universities to invest in this, and as you can tell, this is the part where I'm putting out an opinion: I think it's time we do this. There are lots of things you should check at a university to make sure content is original. We can maybe talk more about that in the question and answer. So, lots of other problems science is facing that I don't really have time to get into. We have a huge money problem in science. The NIH, the National Institutes of Health, budget doubled from 1998 to 2003, and then from 2003 to now it has declined; adjusted for inflation, it's less than 80% today of what it was in 2003. So that's a problem. That's put a lot of pressure on individuals: scientists, investigators, students, postdocs, administrators at universities.
Many studies, too many, are poorly designed, and some of the incentives are bad. Some studies reek of bias. Why? Well, they really need a result. Why do they really need a result? It gets back to funding, grants, money, that kind of thing, and so there are these perverse incentives for scientists to game their results. Replicating results is crucial, but it's rarely done; as I mentioned before, it's hard to get the resources to do it in academic science. Very expensive studies are very hard to do twice, and to do exactly the way they were done before, if there are real questions. Peer review, some would say, is broken. Many, many papers and grants are not reviewed carefully enough. Most of us in academia, if we're going to be honest, would admit to that. We've seen many examples.
I certainly have. Too much science is locked behind paywalls; you get into open versus closed access. If you're doing science in Africa, open access journals are very important, but of course a lot of journals are not open access. Science is poorly communicated to the public, a huge, huge problem. If we're going to influence public policy and have sound public policy based on science, we as scientists have to communicate it effectively. And then lastly, life as a young academic, a PhD student, is incredibly stressful today. If we have students here, I think they might attest to that.
There's a long apprenticeship if you're doing a PhD: graduate school, postdoctoral fellowship, not very good wages all the way through, so it's a real delayed-gratification scenario. So what to do? For the last 15 minutes or so, I'm going to take a bit of a left turn and offer some prescriptions: what do we think about in teaching our own students to try to change some of the underlying dynamics? What can we do? Well, one thing we could do is give up. (audience laughs) We could ignore the problems. Or we could develop some kind of philosophy, some kind of vision, some way of thinking about the issue. Another thing that I teach our graduate students is how to communicate science, presentation design, and when I teach that, one of the things I like to use is some lessons from Steve Jobs. Limited, focused lessons. Jobs' favorite quote was "Good artists copy, great artists steal," attributed to Picasso, though it really came from T.S. Eliot: "Talent imitates, but genius steals." Why did that speak to Jobs? Because Jobs wanted to take the best he could identify in design, and even in advertising, and merge it in: put his products in the context of greatness, take something from what everyone would admit is greatness, and put it into his products.
And so when he turned Apple around in the late 1990s, the ad campaign that did it was "Think different." Some of you may remember it. The idea was to take Einstein and Maria Callas, the opera singer, and other real luminaries and say: Apple, these people. So start with the philosophy: imitate greatness. I tend to take this to heart when I'm looking for ideas. I ask, well, what's the best? I go to that, I try to study it, and then I use it. So how many of you are familiar with this Michael Pollan book? Probably quite a few. OK, so In Defense of Food: An Eater's Manifesto, distilled down into some nice, punchy phrases: "Eat food, not too much, mostly plants." So I took this same philosophy and twisted it around for myself: In Defense of Biomedical Research: A Scientist's Manifesto. I presented this to some of our own students not long ago, when they asked me to give a sort of last lecture. I'm not dying. (audience laughing)
It's complicated, but I did it. And this is the content I came up with. So what is my manifesto? Study history, at least a little. Read widely. Define goals. Now, I'm not as good as Michael Pollan, so I have another list: practice collaboration, communicate effectively, strive for excellence. OK, study history, at least a little. Students today often don't know a lot of the essential history, what I just told you. Why do I start with the Holocaust and Nazi medicine when I'm talking about human experimentation? It's because 41% of Americans today don't know what Auschwitz was, a concentration camp, and 66% of millennials, our students, do not know. There's been a lot written about this. If you don't know the history, then the influence of what happened in the past is not there.
And there's that old saying: history will repeat itself if you don't have a good sense of what came before, if you don't know it. So I want to show you a very short, two-minute clip from a film from the 1980s that talks a little bit about what lessons we should draw from the past at universities.
(somber music)
– [Narrator] These are the remains of Josef Mengele. During his time at Auschwitz he carried out some of the most horrific experiments ever known. 400,000 people are said to have died at his hands. Many of the doctors who worked with him lived to continue their academic careers in peacetime Germany. So too did the professors who used his research. Professor Verschuer became professor of genetics at Münster University. He died a renowned scientist and a free man.
– I think there's certainly a lesson to be learned, but a further question is whether it will be learned. Mengele was a colleague, he was a colleague of ours. And he simply, completely forgot that science without justice and without equal rights certainly ends up in a hell like Auschwitz. But I would say many colleagues are not willing to see that.
– At that time, and still today, universities, of course, not only in the West, are educating what my colleague Franklin Littell of Temple University, Philadelphia, a Methodist minister and professor of religion, called technically competent barbarians. And we seem to be educating that to this day. In other words, our university education, and I'm speaking as a university professor, is not geared to the kind of society we are educating towards, and it does not prepare us to oppose the kind of developments that might end up in or lead to a holocaust. And I think that that's the tremendous challenge that Mengele poses for us. Because Mengele is a product of the kind of system we still have at universities all over the world, which teach techniques, which teach quote unquote science, but not the implications of it and not the use to which such things are put.
– So this is, you know, I think, still as pertinent today as it was in the '80s.
We obviously have to teach context. This is why we teach research ethics. We don't have a choice; we are mandated by the federal funding agencies to do it, and that mandate really came out of this era. Until the '80s you weren't mandated to do it. These sorts of considerations led to some real, meaningful change, and I think we've improved in certain areas, but in other areas challenges have crept in, and certainly the money is probably the big change. From the 1980s to now, the money in science is different, and that's created more challenges that need to be dealt with. So, one of my other prescriptions: read widely. I have been very proud to be associated with the pharmaceutical sciences PhD program, based in the School of Pharmacy. Forty percent of our PhD students are from the U.S. and 60% are international. We typically have about 50 to 60 students. This is a portion of them eating pizza, as graduate students are prone to do.
I was with them; my daughter was there. And what I would emphasize here is that they come from all these different places. I mentioned this before: China, Ethiopia, Hong Kong, India, Korea, Malaysia, Thailand, Vietnam, in addition to the United States. So, read widely. There's an opportunity there, you know, when you're around such diversity. My prescription, my plea for our students, is to keep up with and follow current events. Know something about the world. Choose an interesting book every so often. Where do your classmates, friends, and coworkers come from, and how has their experience been different from your own? This is one of the arguments for having diverse classes: our PhD students, any students, learn from their classmates about the world. There's probably no better education. We can't stand up and teach them about Asia as well as a classmate from Asia probably can.
There are lots of other things they could read about, particularly in graduate school: the backstory of what you're studying in neuroscience, neurodegenerative diseases. There are lots of examples like that. What famous person had the disease that you research? What are the costs? Next prescription: define goals. Think about your life's narrative. Where have you come from? What do you wish to be your motivating principles? Basic Midwestern values: be honest, be kind, be useful, be responsible, work hard, treat everybody with respect. In showing this to students, I'm not saying to them, adopt former President Obama's goals. I'm saying, think about your own. We can all think about what the narrative of our life is and how we want it to proceed, and I think it's much more important to do that when you're in your 20s.
Practice collaboration, practice collaboration. So, I trained in New York City, and there's this classic joke: a tourist says to a New Yorker, how do you get to Carnegie Hall? Carnegie Hall is the famous music venue. And the New Yorker says to the tourist: practice, practice, practice. (audience laughing) So how do you practice collaboration? Like anything, it can be practiced. If you're a beginning student, you do it with your classmates first. You join a laboratory, you do it in the laboratory. Something that needs to be stressed more often in laboratories is that there should be a real heightened expectation of collaboration, not competition, because it teaches students to collaborate with the people around them, the next bench over. They get good at that, and ultimately they're going to do it outside of the lab, and ultimately outside of the university. The lab is a kind of testing ground for them to learn how to do it, what the benefits are, and what works for them.
There are many different characteristics and traits and personalities; what works for one person doesn't necessarily work for another, so students have to figure that out. And you have to practice it to get good at it. Communicate effectively; I've mentioned that already. The tremendous importance of communicating science is maybe never greater than it is today. From a book by Peter Feibelman, A PhD Is Not Enough!: however brilliant your insights, they will be of little use if you cannot make them appear interesting to others. This is kind of 101 for junior scientists when they go to present their work at the first conference they ever attend. What do you start with? Why should you care? Someone comes up and says, ah, that looks interesting. Or maybe they come up and say, why should I be interested in this? You've got to be able to answer that second question powerfully.
Communication is not just verbal. Increasingly we have graphical communication. A picture is worth 1,000 words; that has become even more true and more important in science. You've seen countless examples of this, as this cover of the high-impact journal Cell illustrates, with a beautiful picture of the heart, likening it to a tree, and the creation of new cells as cherry blossoms. Strive for excellence, in your work and in your everyday life. We are what we repeatedly do. "Excellence, then, is not an act but a habit" is one of my favorite quotes for my own graduate students. Excellence becomes a habit if you do it enough. This is often attributed to Aristotle; it's a little more complicated than that. It actually comes from a Pulitzer Prize-winning author who was paraphrasing Aristotle in a book almost 100 years ago. The point, though, gets back to what I said about Steve Jobs and that campaign, and good artists copy, great artists steal. Begin by copying the excellent traits and habits of others. Not literally copying, but trying to identify what makes it great. If you're thinking about graphics, maybe you go to a museum and ask, what works? And then you bring it into science. The best science crosses over and can become art.
But you have to have something to say yourself when you're done. This is really the path, the arc that you take as a graduate student: by the end you need to have your own voice, and you need to be able to speak pretty powerfully. Convey your story, your major points. Make sure you get your facts straight; quote when appropriate, quote correctly and accurately, and don't just trust Wikipedia. If you trust Wikipedia, sometimes you'll attribute quotes the wrong way.
So, last slide, last few points. This is a kind of bonus; it's not on my list. Some quotes that we also pass along to our students. "In God we trust; all others must bring data." Who do you think came up with a phrase like that? Well, of course, it was a statistician. (audience laughing) Now these are sort of tests for our graduate students sometimes. "Not everything that can be counted counts, and not everything that counts can be counted." Who came up with that? Of course, it's Einstein. "If you can't explain it simply, you don't understand it well enough." That's Einstein again, who won the 1921 Nobel Prize in physics. And then, science has great beauty. One of the most famous pictures from science: the Solvay Conference on physics in the 1920s. All men, except for Marie Curie, who won two Nobel Prizes before Albert Einstein won his only one. And of course she saw the beauty in science.
So I've been really privileged over my career here to work with outstanding individuals. By way of acknowledgement: when you teach, and when you work with students in the laboratory, I think you often learn as much from them as they learn from you; that's the ideal situation. And I'm definitely in that category. I've had a number of fantastic students working with me over the years. I hope you enjoyed it tonight, and I look forward to any questions you may have. Thank you very much.
(audience applauding)