Crisis? What Crisis?

Author: Kim A. Weeden

Affiliation: Cornell University, Department of Sociology

Abstract
Sociology has been slow to adopt the standards and practices of scientific transparency. The transparency practices that do exist stem from the voluntary efforts of individual researchers or journals rather than from the discipline’s professional association. The absence of organized, top-down initiatives to nudge the discipline toward open science reflects not only the usual problems of organizational inertia and resource constraints, but also sociology’s unusually high intradisciplinary fragmentation. This fragmentation takes many forms, not the least of which is disagreement among sociologists over both the desirability and the feasibility of data and code sharing, preregistration, and reproducibility standards. Although prospects for large-scale, open science initiatives in the field seem slim, more modest forms of transparency will continue to diffuse among networks of scholars, especially those using quantitative methods.

Background

Sociology has lagged behind economics, political science, and psychology in recognizing a replication crisis and in moving toward scientific transparency. Although individual researchers have voluntarily adopted some of the “best practices” of transparency, they remain a minority within the field. Similarly, a handful of sociology journals have tried to nudge the field toward transparency, most often by requiring authors of quantitative articles to provide replication packages with code and data or by explicitly welcoming replication studies. Very few journals have mandated preregistration of studies, and to my knowledge none has a process to verify replication packages.

Notably, these efforts to introduce transparency into sociological research have all been “bottom up.” The American Sociological Association, the discipline’s primary professional and scholarly association and the natural locus of top-down initiatives to improve disciplinary practice, has been largely silent on scientific transparency. The flagship journal in the field, which is published by the ASA, recommends but does not mandate replication packages, and the ASA’s statement of professional ethics does not mention scientific transparency.

Main Thoughts

Why has adoption of transparency standards in sociology been so slow and piecemeal? Is the absence of “top-down” leadership from the ASA merely the standard story of coordination challenges, organizational inertia, and resource constraints? Or is there something about sociology as a field that has slowed its adoption of scientific transparency? I argue that it is both: fragmentation within the discipline contributes to organizational inertia, and the combination makes top-down leadership on scientific transparency, or for that matter on any issue related to the day-to-day practice of sociology, rather unlikely.

Some elected members of the ASA have attempted to leverage the organization’s authority over its journals to implement scientific transparency standards. As Philip Cohen documents in his blog (Cohen 2021), he brought a proposal to the Publications Committee (on which he served as an elected member) to adopt the Open Science Badge system. This proposal failed in a vote. Sixteen months and two ad hoc committees later, an alternative proposal passed the committee: authors would declare in a footnote whether their data and code were available online, and if not, why not. This proposal was referred to Council, where it was rejected and, after another four months, sent back down to the Publications Committee with comments. By this time, the members of the Publications Committee who had pushed hardest for transparency had either given up or rotated off the committee.

Without insider information, an autopsy of the two proposals’ death by committee is bound to be inconclusive. Perhaps committee and Council members thought the benefits of such modest, unenforceable, honor-system policies were not worth the effort of implementing them. Perhaps they were concerned that any open science requirements, even weak ones, would burden already over-burdened editors and reviewers or slow an already slow editorial and publication process. Perhaps members of the permanent staff were concerned that each step toward open science would irrevocably alter the organization’s contract with Sage, the publisher from which the ASA receives a large share of its annual revenues and operating budget. Perhaps committee members worried that footnotes or badges would stigmatize papers whose authors declared themselves exempt from sharing data and code. Or perhaps some members of the committees objected in principle to scientific transparency.

The latter two speculations point to the second relevant feature of sociology: it is a highly fragmented discipline, with a weaker core and more internal heterogeneity than economics, political science, or psychology. (This characterization is borne out by bibliometric data, which show that sociologists are more likely than scholars in other social sciences to cite work outside their field.)

Fragmentation within the field tends to be fractal. At the most basic level, sociologists disagree over the goals and epistemology of sociology. Should it be a science that strives, however imperfectly, to identify objective truth or generalized social processes? Is it a normative project, in which the main goal is to reveal not what is but what should be? Or is it an interpretivist project more akin to the humanities than to economics, political science, or psychology?

Even among scholars who fall into the “sociology is a science” camp, fragmentation can be observed in the wide range of methods used within the field. Although the majority of research applies quantitative methods to secondary (survey, digital trace) data, research using qualitative, mixed, ethnographic, experimental, computational, and historical-archival methods is also common. Practitioners of these disparate methods have different ideas about what scientific transparency means, whether particular practices (e.g., preregistration, the dissemination of replication packages) are feasible for their type of work, and even whether transparency and reproducibility are appropriate goals. At the risk of oversimplifying, scholars who mainly use qualitative methods have been more reluctant to embrace scientific transparency than those who mainly use quantitative methods.

Among quantitative scholars, objections to open science are typically over practicalities rather than principle: for example, how to resolve the tension between scientific transparency and the legal, ethical, or normative constraints on making restricted-access or proprietary data publicly available. In most cases, these concerns are not insurmountable. For example, journals could simply exempt research that uses proprietary or restricted-access data from replication package requirements, which is precisely the approach taken by the handful of journals with replication package policies. Alternatively, journals or researchers could work with third parties to verify results, although this solution quickly runs up against resource constraints: very few editors or reviewers in sociology are paid for their labor even “in kind,” relatively few sociologists enjoy institutional support for verification of their replication packages, and even fewer journals have the resources to pay staff or students for this task. The greater challenge for open science among quantitative scholars is the general devaluation of replication studies, which are rare even at journals (e.g., Sociological Science) that have explicitly embraced them as a valid and valued form of research.

Among qualitative scholars, concerns over data sharing and other scientific transparency standards take a slightly different form than they do among quantitative researchers. Some of these concerns focus on the consequences of data sharing for qualitative research practice. Will data sharing undermine the trust that develops between researchers and subjects, trust on which qualitative research depends? Will subjects respond to data sharing by becoming more reluctant to participate in research, particularly on the sensitive topics that sociologists often study? Will research subjects change their behavior if they know that others in their community will have access to field notes or interview transcripts?

Other concerns about transparency in qualitative research focus on intellectual property and incentives. A tacit agreement in sociology is that researchers who collect their own data have sole access to them for as long as they wish to keep publishing from them, typically a matter of years rather than months. This norm seems to fly in the face of scientific practice, but in a world of extremely limited and declining federal and state funding for data collection, monopoly ownership is one way to offset the greater risks and potential career costs that individual researchers bear when collecting new data. Put bluntly, if qualitative researchers are required to release their field notes, transcripts or videos, or other raw data products, it is likely that fewer researchers will conduct qualitative studies, to the detriment of the field.

A final concern is that reliability, replicability, and reproducibility are not appropriate yardsticks against which to measure qualitative research. The process of producing and analyzing qualitative data is often iterative rather than linear; it is inherently intersubjective; and it relies on non-verbal cues from subjects and on the embedded experiences of researchers, neither of which can be captured in transcripts, field notes, or other data products (Tsai et al. 2016). Moreover, qualitative research is often designed not to generalize or to be replicable, but rather to generate new insights into social processes that can guide quantitative data collection efforts. In this context, transparency around the production of data or around the data themselves (e.g., through data sharing) may make little sense (Tsai et al. 2016).

These concerns could, of course, be circumvented by exempting qualitative research from scientific transparency standards, or at least by modifying standards to be sensitive to qualitative research (see Tsai et al. 2016 for suggestions). Notably, however, some qualitative researchers whose work would likely be exempt still object to such standards. In some cases, this objection reflects disagreement over whether sociology is, or should strive to be, a science. In other cases, it seems rooted in the fear that adopting scientific standards, even with broad exemptions for qualitative research, would marginalize and stigmatize qualitative work within the field.

In this context, it is not surprising that the ASA, which represents the interests of all sociologists, would be reluctant to engage in top-down initiatives to mandate scientific transparency. More cynically, in the wake of rapidly declining membership, the ASA needs to be a big tent under which all sociologists can find shelter. The organizational inertia generated by governance structures that allow transparency initiatives to die in committee is thus reinforced by intradisciplinary dynamics and by the desire to please (or at least not alienate) all segments of the field. The combined forces of inertia and disciplinary fragmentation forestall top-down initiatives for scientific transparency.

Conclusion

The conditions that slow sociology’s adoption of transparency standards are not likely to dissipate any time soon. The most likely impetus for change is the steady diffusion of scientific transparency in quantitative work, as more individual scholars and journals adopt transparency practices voluntarily, younger scholars are trained in these practices, and cross-disciplinary collaborations create ties over which “best practices” in other fields can infiltrate sociological research networks.

A top-down approach led by the discipline’s professional association seems less likely. Sociology is characterized by a live-and-let-live ethos that allows scholars with very different perspectives on the discipline, and on ways of doing sociology, to coexist. The ASA embraces this ethos, making it difficult for elected representatives to nudge the discipline toward scientific transparency, or indeed toward any change in the day-to-day practice of research about which various segments of the discipline hold very different views.

References

Cohen, Philip. 2021. “The American Sociological Association Is Collapsing and Its Organization Is a Perpetual Stagnation Machine.” Family Inequality (blog), March 28. https://familyinequality.wordpress.com/2021/03/28/the-american-sociological-association-is-collapsing-and-its-organization-is-a-perpetual-stagnation-machine/

Tsai, Alexander C., Brandon A. Kohrt, Lynn T. Matthews, Theresa S. Betancourt, Jooyoung K. Lee, Andrew V. Papachristos, Sheri D. Weiser, and Shari L. Dworkin. 2016. “Promises and Pitfalls of Data Sharing in Qualitative Research.” Social Science & Medicine 169: 191–198.