Title: Data Theory
Author: Simon Lindgren
Publisher: John Wiley & Sons Limited
ISBN: 9781509539291
A patchwork of solutions
It is a common conviction in social research that one cannot do ‘qualitative’ and ‘quantitative’ in the same breath, as they are based on different epistemologies. But this can in fact be debated. As argued by Bryman (1984), the difference may in practice lie not so much in different philosophical views on how knowledge about social reality is achieved, but simply in the path-dependent choices that are made by individual researchers who get stuck with one paradigm or the other. While it has become an eternal truth, reiterated by researchers and methods teachers alike, that ‘the problem under investigation properly dictates the methods of investigation’ (Trow, 1957, p. 33), very few adhere to this in practice. Bryman explains that:
it is not so much a problem that determines the use of a particular technique but a prior intellectual commitment to a philosophical position. The problem is then presumably formulated within the context of these commitments. This suggestion also makes some sense in terms of the individual biographies of many social researchers, most of whom do seem to be wedded to a particular research technique or tradition. Few researchers traverse the epistemological hiatus which opens up between the research traditions.
(Bryman, 1984, p. 80)
Doing digital social research, due to the particular challenges it raises, has ‘prompted its researchers to confront, head-on, numerous questions that lurk less visibly in traditional research contexts’ (Markham and Baym, 2009, pp. vii–viii). One such issue is definitely the need to address the long-standing dispute in social science between ‘qualitative’ and ‘quantitative’ methodological approaches, which has persisted, apparently unresolvably, for more than a century – or since ancient Greece, depending on who you ask. Among researchers, there are still traces of a battle between case-oriented interpretative perspectives, on the one hand, and variable-oriented approaches focused on testing hypotheses on the other. Scholars who prefer case-oriented methods will argue that in-depth understandings of a smaller set of observations are crucial for grasping the complexities of reality, and those who prefer variable-oriented approaches will argue that only the highly systematised analysis of larger numbers of cases will allow scholars to make reliable statements about the ‘true’ order of things.
Today, however, there is an increasingly widespread consensus that the employment of combinations of ‘qualitative’ and ‘quantitative’ methods is a valid and recommended strategy, which allows researchers to benefit from their various strengths, and balance their respective weaknesses. The ‘qualitative’ tradition is seen as the more inductively oriented interpretative study of a small number of observations, while the ‘quantitative’ tradition is characterised by the deductively oriented statistical study of large numbers of cases. This has given rise to the common notion that ‘qualitative’ research produces detailed accounts through close readings of social processes, while ‘quantitative’ research renders more limited, but controlled and generalisable, information about causal relations and regularities of the social and cultural fabric.
As argued above, most researchers would agree in theory with methodological pragmatism – letting the problem to be researched, and the type of knowledge sought, decide which method should be used – but few actually practise it. This is not because researchers are liars, but because it is in fact hard to make it happen. The general direction of the work in this book, in combining the theory-drivenness of interpretive (‘qualitative’) sociology with the data-drivenness of (‘quantitative’) computational methods, most closely resembles what methodologists Norman Denzin and Yvonna Lincoln (2005, pp. 4–6) have discussed in terms of bricolage.
‘Bricolage’ is a French term, popularised by cultural anthropologist Claude Lévi-Strauss (1966), which refers to the process of improvising and putting pre-existing things together in new and adaptive ways. From that perspective, our research method is not fully chosen beforehand, but rather emerges as a patchwork of solutions, old or new, to problems faced while carrying out the research. As critical pedagogy researcher Joe Kincheloe (2005, pp. 324–5) observes: ‘We actively construct our research methods from the tools at hand rather than passively receiving the “correct”, universally applicable methodologies’, and we ‘steer clear of pre-existing guidelines and checklists developed outside the specific demands of the inquiry at hand’. Developing your method and methodology as a bricolage thus means placing your specific research task at the centre of your considerations, and allowing your particular combination and application of methods to take shape in relation to the needs that characterise the given task. This, then, is not about letting the research problem guide a choice between already existing methods. Rather, it is about re-inventing your methods in relation to each and every new challenge.
In line with this book’s ambition to establish an interface between interpretive sociology and computational methods, the idea of bricolage refers to the method of piecing these two together in the shape of an emergent construction ‘that changes and takes new forms as the bricoleur adds different tools, methods, and techniques of representation and interpretation to the puzzle’ (Denzin and Lincoln, 2005, p. 4). Method must not be dogmatic, but strategic and pragmatic. I therefore argue in this book that computational techniques, results, and visualisations can be used as elements in a new form of interpretive enterprise.
The interpretive interface
Computational social scientists have worked to bring disciplines such as sociology into closer contact with data-intensive approaches. In those cases, the translating interface between the two paradigms has commonly been that of statistical and mathematical language. It has typically been the ‘quantitatively’ oriented social scientists who have done the bridging. For example, Salganik (2018, p. 379) discusses how big data can be useful in social research by helping produce faster estimates, and by engaging large numbers of research participants in crowd-coding efforts, especially if one uses established statistical strategies to increase the validity of the messier kinds of online data. In this book, I instead advocate a more interpretive and ‘qualitative’ interface between social science and data science.
Analysing sociality in the age of deep mediatisation may appear to be something that should be done in more ‘quantitative’ terms, because of its scale and the numerical character of much social media data. But there is actually even more reason to approach such objects of study, as well as the new types of data they enable and exude, from a more interpretive standpoint. Just because sociality in the digital age happens in volume and numbers does not mean that its traces are automatically akin to survey data or other forms of statistical input. It is important to realise that the internet, and its networked social tools and platforms, in many ways serves up a research context quite different from the one that has been familiar to social science. The new context possesses an ‘essential changeability’ that calls for a conscious shift of focus and method (Jones, 1999, p. xi). It is because of this that researching digital society requires the researcher to be even more critical and reflective than scholarship in general already demands.
The data that we face do not equal ‘society’. As explained by Salganik (2018, p. 58), behaviour in big data systems is algorithmically confounded, as ‘it is driven by the engineering goals of the systems’. This means that when we analyse different forms of social interaction, social patterns, and activities in the datafied