As scholars, how do we know what we know?
And what, exactly, is method? Does all scholarship necessarily draw upon a method? And what does it mean to say that a piece of work doesn’t have a method?
In thinking through these questions below, I want to suggest that social scientists (and especially perhaps political economists, such as myself) face a dilemma in how we interact with the issue of method: do we want to open up the black box of how we know what we know — or would we rather leave it closed?
Although seemingly arcane, this dilemma has implications for how we collectively draw the boundaries of scholarship — and therefore how we reproduce the field, train and mentor early-career researchers, and distribute value and recognition in academia, i.e. whose work gets valued and recognised, and whose doesn’t.
Susan Strange is a giant in my field of international political economy (IPE). In Benjamin Cohen’s intellectual history of the field, Strange is named as one of the ‘Magnificent Seven’ who made and shaped the discipline. Strange effectively invented IPE as a field of study in the UK, and she then went on to produce some of its most important and enduring works: often holistic and historicised accounts of power relations in a constantly evolving global economy. (To read more on the methodology and contribution of Susan Strange, have a look at these pieces by May, Palan, and Tooze).
If you haven’t heard of Susan Strange, that’s no problem. I would guess that all fields of social science have equivalent figures: roughly mid-century-and-later classical scholars who take synoptic approaches in tackling big problems, but do not have what would be considered today to be a ‘method’. I’m certain that one could find equivalent figures in, say, political science and international relations, the other two fields I know quite well.
Anyway, when I was a PhD student, I asked a prominent Professor in my field about what they thought of Strange’s work. He responded with a shrug of the shoulders and said something along the lines of: “her work was really just journalism, and not proper social science” — so what, so meh, so put it in the bin, it was implied.
That interaction has always stuck with me. In hindsight, the statement was clearly intended to discredit her work. But the claim did nevertheless contain a dilemma of sorts: her work, like much classic work, has no method of the type favoured and propagated by today’s research methods textbooks and funding bodies.
Ever since, I’ve always wondered what the implications of this are: if work like Strange’s has no method, can it be social science? Or, on the other hand, if we wish to insist that Strange’s work is social science, then must her work have a method?
We can expand on these questions: if it is the work of journalism rather than science, then are we able to credibly cite it in academic work? Should we treat Strange’s work as a relic, as important but ultimately ‘of its time’? Should we actively discourage scholars from pursuing work that is like Strange’s, so as to ensure the field is less ‘journalistic’ and more ‘scientific’? Or, should we look to challenge this, by asking whether it is possible to do social science without a method? Or, whether instead, we should look to reclaim whatever Strange did as scholarly and as method-ical?
To put it even more simply, is this work scholarship or not? That’s the dilemma, which I’ll call the Susan Strange dilemma. If it isn’t scholarship, then fair enough, we should (at best) treat it as a product of its time. If it is scholarship, then what kind of scholarship is it? Don’t we need to understand it? Although prompted by Strange’s work, this is just a working example: these questions have implications for social science as a whole.
These questions were raised once again for me recently, when attending a workshop on methods in political economy. It’s not the first workshop I’ve attended on methods (in political economy, even), and, like the other events, it was a productive and (for a geek like me) fascinating discussion.
However, as after all of these workshops, I came away with a slight frustration. In my experience, workshops about method tend to go one of two ways: either into demonstrations of different techniques for collecting and analysing discrete datasets, or into discussions about ontology and epistemology (and therefore onto methodology rather than method).
There’s no issue with this per se. But it doesn’t really help with my Susan Strange dilemma. In that context, there is a danger that these discussions of method can end up circular and repetitive, or, worse, reproduce boundaries about what counts as legitimate knowledge — in a way that mirrors the Susan Strange encounter above.
I think I’ve come to realise the source of my frustration. It is not an issue to do with how these workshops are organised or their rationale, but is instead a wider disciplinary issue. We don’t have a clear conceptualisation of what ‘method’ is. Consequently, our entire way of interacting with questions of method falls back onto a ‘positivist’ or ‘orthodox’ vision of research and scholarship — a point forcefully made to me by Ian Bruff when I was a PhD student, and one that I’m still coming to grips with.
This, in and of itself, is not a problem. But the issue is that it can end up drawing a dividing line about what counts as method-ical research, and therefore implicitly bracketing or leaving out other legitimate and rigorous ways of knowing about the world. This is especially relevant and important to scholars who do work that does not use formalised techniques for collecting and analysing discrete datasets — which is a lot of work, possibly the majority of the work I read, cite, and enjoy.
So let me restate the Susan Strange dilemma again: is work that seems to be lacking in a method unfit for purpose because it is unscientific or unscholarly? Or is it that such work has a method, just one that is otherwise not recognised? We cannot think through this dilemma without first addressing the following question: what is a method?
In preparation for the workshop, I read Johnna Montgomerie’s edited collection Critical Methods in Political and Cultural Economy, John Law’s After Method, and Bruno Latour’s Science in Action. I found plenty of inspiration for thinking about method in these books.
Rather than thinking of myself as engaging in (social) ‘science’, I prefer to think in terms of ‘scholarship’. It has less baggage (or, more accurately, a different kind of baggage, but one I feel more comfortable carrying).
Scholarship is the collective process of producing knowledge. From the standpoint of the individual, the central social purpose of scholarship is to contribute to knowledge — whether that be through conducting research, supporting the process indirectly (such as peer reviewing), teaching students, or myriad other ways. To contribute to knowledge is by definition a collective endeavour. A finding or an argument that resides only in a desk drawer cannot contribute to knowledge.
Latour and Law show us that at its most basic, scholarship involves making credible ‘statements’, or knowledge claims, about the world. Even if such statements may be rich interpretations, ethical claims, or prescriptions on how to conduct research, they are still essentially statements about the world. For Latour and Law, the process of scholarship rests on convincing other scholars that they should drop all modalities used in relation to a particular statement.
For those unacquainted with this kind of ‘Science, Technology, and Society’ (STS) work, this might seem an odd and arcane definition. But it makes much more sense in the context of their work: observing how science works in action, by conducting ethnographies of laboratory life. One observation from this body of work is that the ‘moment of truth’ for a scientist is not necessarily in suddenly comprehending reality or answering a question, but in convincing other scientists that a knowledge claim or statement is credible and useful. Much of this work is done in the proverbial laboratory, but much of it is also done in writing and in interacting with other scientists.
To make sense of ‘modalities in relation to a statement’, we can imagine an academic article that is discussing the findings of a different paper. There are different ways to state those findings: ‘the paper suggests X’, ‘the paper argues X’, and ‘the paper demonstrates X’. In one sense, these statements are synonymous. But in another sense they are subtly different, with each mode of expression (‘suggest’, ‘argue’, ‘demonstrate’) conferring weaker or stronger authority onto the claims of the hypothetical findings. Or, as Latour puts it:
You need them [other scholars] to make your paper a decisive one. If they laugh at you, if they are indifferent, if they shrug it off, that is the end of your paper. A statement is thus always in jeopardy, much like the ball in a game of rugby. If no player takes it up, it simply sits on the grass.
This is why for Latour and Law, the objective of scholarship is to convince other scholars to use one’s statements or knowledge claims — and, perhaps, for other scholars to use ‘demonstrate’ rather than ‘suggest’ in the discussion of those statements.
On this basis, we can challenge conventional understandings of knowledge and science — that truth emerges from gaining more accurate depictions of reality. The Latour and Law depiction instead suggests that (scientific) truth emerges from a collective process of consensus making, which centres on how statements about the world become credible — which in itself helps enact reality, rather than simply mirroring it. As Latour puts it: ‘The construction of facts, like a game of rugby, is thus a collective process’.
But, as Law writes, such statements ‘do not just idly freewheel in mid-air, or drop from heaven’. They are produced, as a collective process. But how? This is my way of getting to a definition of method, which I would like to formulate as: how we get from the point of not knowing (i.e. not being able to make a credible statement about the world) to knowing (i.e. being able to make a credible statement about the world). One important implication of this definition of method — how we get from Point A (no credible knowledge claim) to Point B (credible knowledge claim) — is that any claim to knowledge, or statement, must have a method, logically speaking.
This is a fundamentally different conception of ‘method’, as it is normally conceived of within the majority of methods textbooks, modules, and so on. I say ‘conceived of within’ rather than defined, because in my experience it is unusual for ‘method’ to be explicitly defined and conceptualised. It is instead implied. And one of the main ways in which it is implied is through what I will call The Menu.
When I was trained as a social scientist — as an undergraduate, but especially in preparation for doing a PhD — I got taught research methods, as many now are. That’s when I first became aware of The Menu. The Menu is the list of methods that are taught to trainee social scientists. If you look at almost any research methods module or textbook, you’ll find the same thing: usually after some preliminary thinking about the logic of inquiry, the bulk of time and space is taken up by going through different formalised techniques of data collection and analysis, which is almost always organised along a qualitative-quantitative divide. It will almost definitely include survey research and interviewing (and maybe also content analysis, discourse analysis, focus groups, experiments, and so on). It may also come complete with a secondary Menu that presents the various ontological and epistemological packages available for plug-and-play, e.g. positivism, interpretivism, critical realism, etc.
I don’t know if others feel the same, but as a PhD student I felt pressure to pick something off The Menu, and that if I didn’t then I wouldn’t be “proper” and neither would my research. Because research is meant to go like this: You have a research question, some ideas about how you want to go about answering it, and so then you consult The Menu for a method. You order whatever it is you need off The Menu, and you assemble it, being sure to follow the rules and procedures. Once applied, you will get an end product, that you will naturally compare and contrast with an image of what it was supposed to look like — ‘does it look right?’, ‘have I executed it accordingly and in line with the instructions?’, etc.
The kinds of methods that are included on The Menu all have a family resemblance. The language of this kind of method tends to be about “skills” and “tools”, about “collecting and analysing data”, and about something that is distinct from theory. This kind of method is used in a seemingly disconnected and formalised way to generate knowledge claims (“an experimental study”, “a focus group study”, etc.). And — for me most importantly — they all tend to interact with data in a certain way. As Alan Bryman’s industry standard textbook puts it: ‘to many people, data collection represents the key point of any research project, and it is probably not surprising therefore that this book probably gives more words and pages to this stage in the research process than any other’.
This definition of method is not just limited to social science orthodoxy. Patrick Thaddeus Jackson’s The Conduct of Inquiry is the best book on methodology I have read. At the start of his book, he makes clear that he is talking about methodology rather than method: whereas methodology is about ‘the logical structure and procedure of scientific inquiry’, methods are ‘techniques for gathering and analyzing bits of data’. Even Montgomerie’s brilliant edited collection, which rightly calls for a new vocabulary of method, ends up reproducing something like The Menu in Chapter 4. It is genuinely difficult to talk about method without falling back into this.
If method is about data, then what is meant by data? It is possible for data to be defined in a broad sense, but I don’t think that’s generally the case here. The majority of methods on The Menu share a certain relationship with data: the data are discrete, cataloguable, separate, and formalised. Data, which is assumed to be doing a lot of the knowledge production heavy-lifting, must be separated — disembodied — from the scholar. Jackson’s definition seems telling in this respect: ‘bits of data’, with ‘bit’ presumably meaning a thing or an object; as something that is objectifiable.
So, the common sense understanding of method is something like: (1) a method is a skill or tool for collecting and/or analysing data; and (2) that data should take form as an object.
Why does this matter? The methods on The Menu are undoubtedly valuable resources to many researchers. I have used and will continue to use many of them myself. But I also wonder what it means that many ways of making knowledge claims about the world — i.e. methods, at least in my terms — are not included on The Menu.
Earlier I said that the social purpose of scholarship is to contribute to knowledge. This is of course important to remember in contemporary university life, in which knowledge is increasingly instrumentalised or set to compete against other imperatives (such as grant revenue or placing higher in rankings).
Methods are no exception to this. In principle, I’ve always thought that we should select our methods on the basis of a kind-of wager: that this particular way for knowing is the best — most productive, efficient, illustrative, and so on — way to answer a particular research question or puzzle.
This is obviously a utopia. Social science does not work like this. Most of us are trained or socialised in one way, and it’s very difficult to retrain in another way due to time constraints, let alone other and more complex reasons. Most of us value the ways in which we were trained, and then go on to become embedded in scholarly networks that also value those ways. This process is sometimes such that method and/or methodology become the tie that holds together a network of scholars.
This same process can also happen on a more subtle, but larger, disciplinary scale. In fact, it’s probably inevitable. All disciplines value certain types of knowledge over others, and all disciplines value certain types of methods over others. After all, disciplines require discipline: standardisation, a canon, silo-ing, and so on. This is a transactions cost problem — disciplining makes producing knowledge easier, because it provides a readily accessible context, hooks on which to hang ideas.
International Relations scholar Laura Shepherd memorably blogged that if one of the leading US-based journals ‘publishes a collection of poems and photographs that stand alone as a comment on practices of global politics, then we will know we have forgotten, have transcended’ the field and its methodological disciplining. If one scholar thinks poetry is scholarship and others don’t, then this raises an important and difficult question: where do the boundaries of legitimate ways of knowing lie? And who gets to set them?
We can think a bit more about the underlying mechanisms of these boundaries. The boundaries of legitimate scholarly knowledge only exist inasmuch as they are produced and reproduced (and therefore evolve) through the meaningful actions and non-actions of individuals.
There are two noteworthy features to these boundaries. One is an in/out feature, like a circle that you are either inside or outside of. A way of knowing, and therefore the resulting knowledge claims, are excluded from what is considered legitimate scholarly knowledge. Rightly or wrongly, a volume of poetry (‘that stand[s] alone as a comment on practices of global politics’) would probably not be awarded a PhD in most Politics departments in the UK.
The second feature is a centre-periphery feature, like a set of concentric circles in which you can be placed closer to the centre or further away. In this case, the knowledge claims are ‘in’, and your way of knowing is seen as meeting the basic criteria of legitimate scholarship — but it doesn’t quite fit the ideal of what disciplinary knowledge should be. It is not valued as highly, not rewarded as highly, not cited as much, nor included within debates. It’s an all-knowledge-is-equal-but-some-knowledge-is-more-equal-than-others kind of vibe. I think of it like this: it’s one thing to be invited to the party, but what’s the purpose of going if you’re just standing in the corner by yourself because no one wants to talk to you?
Sara Ahmed puts this far more forcefully: ‘citations are academic bricks; and bricks become walls’.
Although Ahmed is referring to a process that extends far beyond methods, we can still identify what Law calls ‘methodological normativity’: some methods — ways of knowing — are seen as more legitimate methods than others, while some processes of knowing are not seen as legitimate at all.
Methodological normativity shapes how we reproduce the field, train and mentor early-career researchers, and distribute value and recognition in academia, i.e. whose work gets valued and recognised, and whose doesn’t. For example, research grant proposals, especially for the ESRC, require a systematic outline of the methods of analysis. It also matters for the REF, in which one of the three criteria for assessing published work is ‘rigour’. This intersects with important ways of knowing that are often marginalised, such as post-colonial, queer, or feminist theory.
Methodological normativity also matters for the sake of scholarship. We want to know stuff. So if we know how we know, then surely we will be able to know better. As Howard Becker writes in Tricks of the Trade, social and scientific convention is the enemy of thought: ‘we need ways of expanding the reach of our thinking, of seeing what else we could be thinking and asking, of increasing the ability of our ideas to deal with the diversity of what goes on in the world.’
The original dilemma was whether work that has no explicit method or discrete dataset can be considered scholarship; or whether we should look to reclaim the meaning of method so as to include this work within the boundaries of scholarship. On the terms that I have set out, it is untenable to choose the first of these two paths. To claim that, say, Susan Strange’s work is not scholarship — or even proper scholarship, whatever that means — is not credible. However, taking the second path — as Montgomerie’s edited collection does, as does Law’s After Method — is not straightforward, and itself prompts many questions and issues. I’ll consider two here.
The first is over the dangers of reclaiming method. Strange’s method — inasmuch as she has a way of going from not being able to make a credible claim about the world to being able to do so — is somewhat black-boxed. Her claims are credible and — to me, at least — compelling. I would go even further and say that her work offers a true depth of insight and richness. Yet, despite teaching with Strange for the last four or five years, I would struggle to explain, when push comes to shove, what her method is. Her approach is method-ical, but you can’t order it off The Menu.
Will reclaiming the methods of, say, Strange mean simply adding more options onto The Menu? This is the worry that some political economists I have spoken to have about these ideas. That there is a danger of formalising ways of knowing; of reducing complex, methodologically holist, and often critical work into a series of plug-and-play techniques and tools. Or, that by talking about method, the substance and politics of knowledge claims risk giving way to an obsession over and fetishisation of technique and process.
This need not be the case. The autobiographical vignettes in Montgomerie’s book — I found those by May, Shilliam, and Bruff especially useful in this respect — show that it is possible and productive to reflect on the process in which knowledge claims are produced (or ‘cultivated’, in Shilliam’s terms) without falling into the trap of formalisation. If Susan Strange had written an equivalent of C. Wright Mills’ brilliant ‘On Intellectual Craftsmanship’, then that would be a tremendous resource for political economy students, no doubt. In other words, to reclaim method, The Menu must be subverted.
A second issue relates to how method is defined. Here, I have defined it in very broad terms — as the way in which we go from not being able to make a credible statement about the world to being able to make one. When taken seriously, this definition vastly opens the boundaries of ‘method’.
For example, we classically think of Varieties of Capitalism (VoC) as theoretical, conceptual, or even methodological; I don’t think anyone would describe it as a method. VoC is an exemplar of comparative ideal-type analysis in the Weberian tradition. (I’ve written about ideal-typification here). In principle, the purpose of ideal-types such as ‘liberal market economy’ or ‘coordinated market economy’ is to provide a ‘data container’ for empirical research, at which point the method comes into play. But those ideal-type concepts do not just work in that way, because they too end up being claims to knowledge in their own right. For swathes of political economists, when they ‘look’ at Germany, they ‘see’ a coordinated market economy. Does this make ideal-typification a method that cannot be meaningfully disentangled from the empirical data collection and analysis strategies?
To take another example, I don’t think any social scientist would describe a literature review as a method. Literature reviews are normally considered to simply synthesise knowledge rather than produce it. Fair enough. But I wonder how many terms and concepts that have become taken-for-granted parts of social reality have emerged from academic literature reviews? If so, the ‘method’ of the literature review is a complicated process that eschews formalisation. It involves reading and learning, synthesising, reorganising, and writing. Writing is an essential part of that. Most people will be taking notes as they go, which is not simply an archive of thoughts, but is the process of making those often transient thoughts real.
Describing ideal-typification (let alone literature reviewing) as a method might be seen as a problem, because concepts are theory, and theory is not method. But is this a problem because this broad definition of method is wrong, or is this a problem because it is a challenge to the methodological conventions of social science? And, if not this definition of method — the process in which we are able to make credible statements about the world — and not the definition that equates method with discrete data, then which definition? I am not sure if this is the right definition. But I would like to explore it more.