
Adrian Currie Transcript - S2 E5

Adrian Currie on Opportunistic Scientific Methods


Hi, welcome back to The HPS Podcast, where we discuss topics in History, Philosophy and Social Studies of Science in an accessible way.


I am Samara Greenwood and today my guest is Adrian Currie, a Senior Lecturer in Philosophy at the University of Exeter. Adrian’s research revolves around the question of ‘how do scientists successfully generate knowledge in tricky circumstances?’


Rather than supposing there is a general answer to this question, Adrian has found successful scientists to be what he calls ‘a motley, opportunistic bunch’, using a wide variety of methods and techniques to exploit varying lines of evidence toward a variety of different aims.


Much of Adrian’s work has focused on the ‘historical sciences’, including palaeontology, archaeology and geology. In this week’s episode I chat with Adrian about how scientists in these fields have developed a range of creative ways to successfully generate knowledge – even when evidence is thin on the ground.


[00:00:00] Samara Greenwood: Hello, Adrian. It's lovely to have you on the HPS podcast.


[00:00:03] Adrian Currie: Hi Samara, it's great to be here.


[00:00:05] Samara Greenwood: So first, how did you come to history and philosophy of science?


[00:00:08] Adrian Currie: Well, I kind of didn't, to be honest. Academic communities, particularly in philosophy and history, are often very small and tend to be very idiosyncratic.

And so where I did my PhD, at ANU, there wasn't really any history per se at the time. It was a much more traditional analytic philosophy department that had a strong philosophy of biology component. And the HPS was done in Sydney and Melbourne. And I thought they were weird, just because I hadn't had enough experience with them.


And I didn't really know how to place myself other than as a philosopher of biology, honestly, until I moved to Britain about seven years ago. Back when I was doing my PhD and thought HPS was weird was about 13 years ago - I'm old. And I sort of got to Britain, and to a lesser extent Canada, and people were like, 'Oh, there are all these different ways of doing philosophy of science. There's integrated history and philosophy of science and there's philosophy of science in practice.' They started describing these things and I was like, 'Oh, I guess I sort of do that.'


So now I think what the philosophers of biology were doing at ANU was actually very strongly aligned with integrated history and philosophy of science.


It's just that, to be blunt, Australia is kind of a small place, and that means people create their weird little silos, and there's a lot of boundary work - to use a sociology term - in Australasian philosophy and philosophy of science. And I suspect that made it difficult for me as a graduate student to realize how useful and valuable history and philosophy of science was.


Now I'm a total convert. I don't really have very much H - I'm very much on the P - but I'm a big fan of the H and integrate it whenever I can.


[00:01:50] Samara Greenwood: That is an interesting story. And so what are your current research interests? What are you working on at the moment?


[00:01:56] Adrian Currie: I'm a bit of a dabbler, so this is often a sort of difficult question to answer, but here are four concrete things that I'm doing.


So I've gotten really, really interested in thinking about the role of aesthetic sensibility and judgments in science. So here we're thinking about aesthetics, not in the way that, say, philosophers of science typically do, which is like, 'oh, this theory is elegant' or something like that. I'm more thinking about the role that scientists' tacit, affective, embodied judgments play in the kind of work that they do.


I've also been getting into the philosophy of history over the last few years, largely because people who work in the philosophy of history keep asking me what I think. Part of that project has been trying to defend a sort of vague analogue of scientific realism, but about history - that's a brief way of putting it.


And also, I think more interestingly, getting into the role of archival practice in shaping the way that historians do business - because the philosophy of history has mostly been interested in historiography, the writing of history, and hasn't paid much attention to the fact that historians are wildly constrained by the things that are archived. And so Kirsten Walsh and I are trying to think about how you might create a kind of History and Philosophy of the Archive project, which is exciting and fun.


Another thing that I'm doing at the moment is Sophie Veigel and I are co-editing a book on methods in the philosophy of science, because one thing that's fascinating about the philosophy and history of science at the moment is that it's become just wildly disparate in terms of methods and methodology. Right? People are using so many different types of tools. People are starting to incorporate machine learning stuff. People are running simulations. People are going and doing social science.


What we're building is this big edited volume. It's most likely going to be called something like Philosophy of Science: A User's Guide. And the idea is that it's just going to have a bunch of people going, here are all these different ways of doing the history and philosophy of science - partly because, honestly, I think it's really difficult to know what sort of options are out there and how to start learning them.


There's a few other things, but I'll stop there before I get carried away.


[00:04:02] Samara Greenwood: They all sound fabulous. Maybe we can do more podcast episodes on each of those? That would be great.


But for the topic of today, I've asked you to talk about the area of your research that involves how scientists generate knowledge in tricky circumstances. For this, you've tended to focus on the historical sciences, including geology, palaeontology, and archaeology. Could you tell us how you first became interested in this area of research?


[00:04:27] Adrian Currie: Like with HPS, I kind of fell into it by accident. So my PhD was originally on this notion of reductionism, and in the proper tradition of all early PhDs, it was wildly overambitious. So the idea was: what does reductionism look like in the philosophy of mind and in scientific theories and in metaphysics?


So much of that debate just turns on some empirical claim that we have no way of verifying because physics isn't good enough yet, or some kind of metaphysical, philosophical commitment you might make. And that wasn't particularly interesting.


But I found myself asking well, what would reductionism look like if you were working on a long temporal scale? Because typically when people talk about reduction, they say, 'oh, well, the units of chemistry can be thought of in terms of the units of physics' or something like that. It's very atemporal. It's sort of at a time slice. And I was like, 'well, what would it be to be a geologist and a reductionist, right?' At least sometimes when you're a geologist, you're giving these big long narratives about, you know, the formation of mountain ranges, what would it be to be a reductionist about that?


And so I went and started reading a bit of geology and I fell into - I don't know why this happened - I fell into reading about theories of sauropod gigantism, so - how come sauropods got so big? - and theories of snowball earth - why were there glaciers in the tropics in the Neoproterozoic? And honestly, there was a brief bit where there was both reductionism and dinosaurs, and then eventually the reductionism just went and it was just full of the historical sciences for their own purposes.


So for me, the thing that really grabbed me was just how surprisingly these sciences seemed to work, given my presuppositions as a philosopher of science. I kind of went in expecting it to be, I suppose, in a sense, a lot more boring. I expected it to be a lot more methodical. I expected it to be a lot more like every paper is just another little brick in the wall of knowledge. And what I found instead was that it was significantly more speculative, significantly more creative and significantly more fun than I was anticipating.

And so I found myself... just really studying that and along the way, of course, I discovered some of the philosophers and historians who've been working in these kinds of areas. So at that stage, people like Derek Turner and Carol Cleland had been doing a bunch of work and that gave me a way of sort of framing what I was doing.


But honestly, a lot of it was just these sciences are really fascinating and I want to understand how they work. That was what kind of grabbed me.


Samara Greenwood: In 2018, Adrian published his book Rock, Bone & Ruin: An Optimist’s Guide to the Historical Sciences. In the book, Adrian argues that historical sciences, like geology, palaeontology and archaeology, operate in perhaps unexpected ways. Rather than relying on anything akin to a single, linear scientific method, the success of these sciences is due to their ‘messy’ resourcefulness and being what he calls ‘methodologically omnivorous’ – in other words - using multiple strategies and techniques to make the most of what can – at times – be quite limited primary sources of evidence from the fossil record.

As Adrian notes, and I quote: “we attempt to isolate the method of the historical sciences…but…these sciences are at base opportunistic…and it is this opportunism that explains their success”.


In the interview, I wonder also if there are lessons here for other sciences – perhaps in suggesting ways to increase the reliability of our findings by engaging in more diverse, open-ended – and yet always empirically grounded – research strategies?


[00:06:54] Samara Greenwood: I would love it if you could discuss some of your key findings from your research. So how do the historical sciences generate knowledge from limited evidence?


[00:07:04] Adrian Currie: Well, one question we always want to ask ourselves is, is the evidence really limited?


So I open Rock, Bone and Ruin with the case of, you know, a single tooth that's used to infer the existence of a giant platypus, right? And so it's kind of weird that I can just go, here's a molar - 10 million years ago in Queensland there was a meter and a half long platypus, right?


On the face of it that seems strange, because the way we think of evidence is often in terms of lots of data, especially these days, right? So these days, your kind of paradigm case of good science is often big data science, or data-driven science as Sabina Leonelli calls it. So it's like, oh, what you want is large amounts of replicable data, right? Where data are these objects that I can put into big databases and then do statistics on. And that's just not how at least some of the historical sciences work. It is how quite a lot of it works, but that's not what I'm interested in.


And so it looks like this is terrible evidence. Because we're thinking of it as if it's on a particular model of science, as if more data equal good, because then I can do statistics. But actually you don't need to do much statistics if you've got rich enough background knowledge, right? And it turns out with fossil teeth, especially mammal teeth, we know so much about these and there's so many constraints and within mammals, teeth are so diverse and also tied to particular phylogenetic contexts that actually we can very well say a lot of good stuff about teeth. We can make lots of strong inferences.


And so the first thing to note is, there's a sense in which this only kind of looks like it's bad evidence, because we have a certain model of evidence in mind. And once we go, 'oh, actually there's other types of evidence, there's other ways we can support hypotheses', then it changes.


But of course, you don't want to take that too far. The evidence is impoverished, the fossil record is gappy, there's a lot of holes. You think about - dinosaurs controlled the earth as it were, they were the major fauna of the earth, for 150 million years. And you think about how much mammal diversity there is around us now, and then you think about how many dinosaur species we know.


There's a lot of weird dinosaurs that we have not discovered and possibly won't, right? This is an impoverished situation. So what do you do? What sort of scientific strategy do you take up and from what I've seen the sort of scientific strategy you take up really involves a kind of bold, speculative, messy, opportunistic approach.


One way of putting it, which is a bit cheeky, is you sort of throw everything at the wall and sort of see what sticks. But they're very, very good at thinking what to throw at the wall and they're very, very good at trying to figure out what will stick and then what to do with the things that do stick.


One thing that I often find striking about going into palaeontology labs in particular - vertebrate palaeontology, but also archaeology - is that you'll find students at master's level trying novel techniques, developing new methods, right? Which is really weird. You don't see that in, like, a chemistry lab or a wet lab or a biology lab. You know, you go in and they go, 'here's your pipette, do some stuff', right? Like, here's your rules, follow these instructions. Downstream, maybe you'll get to develop a small thing that's a different research program, you know, blah, blah. Whereas often in palaeontology, it's much more like, 'do this weird thing, let's see what happens', right? And I think there's a very, very good reason they do science like that.


It's because you're trying to get as much out of the limited evidence you have. Here's one way I've put it in the past - there are sciences where you've got really, really rich seams of evidence. Experimental sciences are like this. I've got a really good experimental system. I've got a small set of techniques that are really effective. This is a rich vein of evidence that I can just sit there and whack it and get more and more and more and more. So the strategy there is kind of fairly homogenous, right? You're a methodological obligate. You've got your one system and you're doing your stuff with it. But if you're in a situation where actually, evidence is kind of thin on the ground in a certain way, what you should do is lower your standards of evidence in a certain sense. Right? Like if you're in that obligate situation, it's like, 'Oh, well, I'm only going to take the good evidence because I have so much of it and I can make more'.


Whereas if you're in the historical case, it's like, no, no, you want to have lower evidential standards, but be better at figuring out how those things hang together.

That means you can have more sources of evidence. You can have more data, and get better at sort of knitting it together, using kind of narrative strategies to create these bold, complicated hypotheses, which give you more ways of tagging things to the earth, as it were - finding more sources of evidence.


So one thing that's great, weirdly, about these big narratives is that a big narrative has lots of points of testing. There's lots of places where you can test whether a big narrative is empirically adequate, if you want.


So that was sort of a bundle of things. Roughly, they are methodological omnivores, by which I mean they don't just use lots of evidence, right? It's that they very specifically are in the business of developing new ways of getting into different types of evidence - especially technological innovation, grabbing stuff from other disciplines and just seeing what happens - as well as getting really good at some version of sort of coherence reasoning: these narratives that they produce.


And so that's my kind of fairly long-winded, hand-wavy answer to the question of how they do well. They're really good at adopting the right scientific strategy given the kind of epistemic situation they're in.


[00:12:26] Samara Greenwood: Hmm. Yes. And that turns nicely to my next question. I have really enjoyed the way you attribute the success of the historical sciences to this messy, creative, opportunistic approach, and how they use these varied strategies to build up reliable knowledge.


What I want to know is, how do you think understanding more about this messy opportunism might be useful for practicing scientists, particularly in non-historical sciences? Is that useful for others to know about?


[00:12:54] Adrian Currie: I think it is, in several different ways. A lot of the work that a scientist does, particularly as they become a PI and they're running labs, is justifying their own existence in certain ways. People talk about things like 'physics envy', or at least they used to, and one way of thinking about physics envy is, 'Oh, I want to do science the way that physicists do'.


I think a more realistic way of looking at it is, if I want to be a scientist, I need to be able to have a kind of marketing strategy where I'm able to explain to other scientists and, more importantly, the institutions and funders that I'm engaging with, why what I'm doing is legitimate.

So you end up with scientists trying to look like scientists of a particular type, and they're doing that because they sort of have to, right? And I take it that no, actually, there are lots of different ways that you can successfully generate knowledge. There are lots of different strategies. The question is not what's the best way of doing a science. The question is, given the sort of circumstances I'm in, given the resources I have to draw on, given the kinds of questions I'm interested in, what sort of strategy should I adopt? Sometimes the strategy is going to be, in some sense, kind of conservative. Sometimes the strategy is going to be much more speculative. And the trick is trying to handle that.


Some funders and some disciplines have started making noises about this kind of thing. So the European Research Council will say things like, 'oh, we want this project to be risky' in some sense. But it's very unclear to me whether that actually pans out. It's very unclear to me whether that in fact is what they end up funding. But regardless, that's at least part of what's important here - sort of going, hey, look, there are many different ways of doing science, and being able to have a story about why this way of doing science is the right way to do business, given the sort of resources and epistemic situation you're in. That's much better than going, 'Hey, I'm wearing a white lab coat, look at my nice lab, therefore I'm a scientist.'

And I think a lot of the time scientists are really good at doing that. It's just that they often have to do it within a certain kind of, let's say, communicative and ideological framework, right? A lot of what scientists are doing is sending kind of weird signals to various different audiences. And then when you talk to them about what they're actually interested in, it's often something quite different from the sort of noise they're making.


So I suppose part of it is that's more of a message for the social structure of science than it is for practicing scientists themselves. For practicing scientists themselves, I guess I'm very... Let's put it this way, I don't see it as my job as a philosopher of science to tell scientists how to do science. Sometimes I do, and they always get angry at me when I do. That's partly because scientists are good at doing science. They're pretty bad at knowing why they're doing science. They're pretty bad at giving the sort of philosophical explanation of science. But they're pretty good at doing science itself. I think often having those kinds of philosophical explanations can be helpful, right?


So partly it's helpful to have them because it helps you map out what kinds of other work you might do that's kind of useful. I think it's also useful partly just because being reflexive about why you're doing what you're doing is important.


And even if you think the philosopher is kind of an idiot, or they don't understand the science, being able to bounce off their ideas is actually a really useful way for you to articulate, and be reflective on, your own kinds of practices, right? Why do we follow these protocols?

Thinking about history and sociology and philosophy is an incredibly rich way for practicing scientists to actually get that second-order or reflective perspective on their own kind of work.


The final thing to say is, I think the most useful thing for practicing scientists, especially when they are grappling with very speculative areas, or there are conceptual confusions or whatever is just working with philosophers, right? So I think a lot of my work ultimately ends up speaking to philosophers because that's what I'm trying to do. When I want to speak to scientists, I work with a scientist because they know the rules of that particular type of game. So I suppose there's lots of space for, and there should be more, philosophers actually engaging with and talking to scientists. It makes the philosophy infinitely better. It sometimes makes the science a bit better.


[00:17:11] Samara Greenwood: No, absolutely. And two points to that. First, some of the feedback we're getting for the podcast is definitely from practicing scientists going, 'this is great'. They just love listening to these ideas, seeing how they reflect on their own practices and how they can or cannot incorporate some of these ideas into what they're doing, which is fabulous.

My other point was that one thing that's been coming up through the conversations I've been having with a number of philosophers and historians and sociologists of science, and also particularly the metascientists, is that there are a few sciences that do feel a bit trapped at the moment. They're trapped especially in methods. So we've had Fiona Fidler talking about psychology and ecology, where statistical methods in particular have become so dominant that it's hard to jump out of those kinds of methods when they're so ingrained in a particular discipline.


And so I was wondering if perhaps you could see - by visiting these sorts of more creative, opportunistic sciences - whether that potentially opens up opportunities for other sciences that don't have that tradition embedded in them. Could you see something along those lines?


[00:18:15] Adrian Currie: Yeah, great. I think that's right. One part of the work I've done in the past has been thinking about the sort of social organization of science and the way that that leads to conservative practices.


There are some large-scale, sort of 20th-century science stories you can tell here about the institutionalization and centralization of science after the Second World War, et cetera, et cetera, et cetera.


And that leads to a kind of scientific environment where scientists are having to make safe bets, right? You're in a situation where, if the risks you take don't pay off, your career's dead, more or less, because it's such a high-competition environment. And then there are the standard things to say about the dangers of peer review and grants, which are a further way of producing this kind of homogeneity.


I think it's absolutely right that there are sciences that are stuck within certain, let's call them paradigms, that are not really fit for purpose anymore. The way I would put it is that they're not really well adapted to the kind of epistemic circumstances those scientists are in. They're kind of trapped in various ways. I mean, one of the examples of this I quite like is, you know, up until pretty recently, so much AI research was, 'can I get this thing to play a computer game well?' It took a long time for them to actually do things that weren't that, right?


So there are lots of cases where particular standards - what you need to do to be publishable - heavily restrict what sorts of things those scientists might do. And that's super bad.


I think in terms of the lessons here, I tend to be kind of a structuralist about this stuff. I don't think that there are going to be these magical scientists who are going to come in and fix stuff. I think what you need to do is create the kind of publishing, funding and research environments that encourage that kind of diversity, and then you'll get it, right?


So I know a little bit about this in ecology, where you do have fights between the very statistically based people, as you say, and some people who are a bit more experimental, right? So they're not really in the process of doing big data. And the problem is that the big data stuff is getting much cheaper to do, and it has the sort of ideological weight behind it. It's much easier to be open science if you've got big data. It's kind of harder and weirder if you're doing cases or experiments or these kinds of things.


And so there's a large amount of institutional, ideological weight behind those ways of doing science. And presumably there's nothing wrong with doing science in those ways, but it's a terrible mistake if that's all that we're doing. And so how then do you create environments where people aren't forced to jump onto methodological bandwagons or get trapped in these kinds of methodological holes, right?


I take it part of it's going to be about having diverse - very, very diverse - standards in terms of what can be published and in terms of what can be funded, right? And how do you actually go about doing that? I'm a big fan of incorporating lotteries into funding: the arguments in favour of reducing the role of peer review - not eliminating it entirely, of course, but reducing it - having shorter funding applications and adding a chance element to it.


I think there's actually a lot to be said for that. It's only us being pretentious about our capacity to properly gatekeep the true sort of discoverers of knowledge, or whatever, that stands in the way. So generally, I think having diverse structures is the answer to having more diverse science.


But it does involve a lot of... like, it makes things really tricky. As things get more diverse, it gets harder and harder and harder to be able to use science to inform what you're doing at, say, a policy level, right? But actually, you might think that that should be hard, right? It creates more noise, but we need to have better structures to be able to know what to do with that kind of noise.


[00:21:55] Samara Greenwood: Okay. And so coming back to our main topic of the day, what value do you think a more general audience might take away from a better understanding of how the historical sciences work?

[00:22:05] Adrian Currie: I think a lot of what I said earlier - about understanding that science itself is often a much messier, more human process than the way that we sort of imagine science as being - is very valuable for people to understand.


I think that, from a science communication standpoint, scientists themselves in particular often really want to present science as being as kind of 'Objective' and 'Truthy' as possible, and I think that's a really bad strategy. It means that all I need to do is go - here's how science actually works - and suddenly people in the public feel like perhaps they've been lied to or something like that, when they haven't really. I just think that that's sort of a bad strategy.


More importantly, I think to an extent it's less about the sort of methods and nature of what the historical scientists do, but I think more about what you gain from knowing what the historical scientists know. I think what you learn from really looking at history and particularly deep, deep history, is that the world is fundamentally fragile.


Some things seem to hang around for a long time and then they break. For a hundred million years, the main macrofaunal business was sauropod dinosaurs eating non-flowering plants. And then flowering plants arise - within 20 million years, 80 percent of the terrestrial flora are flowering plants, and suddenly you see massive increases in diversity in insects, particularly social insects. And, you know, for the 150 million years since then, since the sort of mid-Cretaceous, the main business has been bees and flowers, right?


These are these really long-standing, robust features of our planet at this macro level that just kind of disappear, and then they're gone, and then something else comes up. Right?

And so you realize that what we think of as stable and normal is only stable and normal within certain types of limits, and only within a certain sort of period, a certain temporal scale. I think that's the perspective - realizing that often the past just was really different. 150 million years ago, there were no flowers - that just wasn't a thing, right? Now there are flowers, and that fundamentally changed the way that the terrestrial globe looks, right? And there's of course lots and lots of examples of that kind of thing. These examples where the past is just fundamentally different in these ways to the present.


I think once you realize that, it lets you realize that actually the future can be fundamentally different from the present - that you shouldn't make assumptions about the way things will be. And moreover, realizing that we're so bad at thinking about the way things were in the past, we shouldn't think we're in a good position to actually think about the way things will be in the future.


I think that understanding how much of the past has been lost, and how fundamentally different it has been to the way things are now, helps us at least have a certain type of perspective on what we can expect from our own capacity to understand an unprecedented future.


Also history is fun and interesting.

[00:24:58] Samara Greenwood: Hey, absolutely. I think that's a fabulous point to finish the podcast on. Thank you, Adrian, so much for being on the podcast. I have absolutely loved our chat today.


[00:25:09] Adrian Currie: It's been wonderful being here.



