Social Sciences cannot and should not be Value Free







Date of Submission

Plagiarism Statement

I, (name), declare that this research report is my own, unaided work. It is being submitted in partial fulfilment of the requirements for the degree of (course) at (University and department names). It has not been submitted before, in whole or in part, for any degree or examination at any other institution.


The issue of whether the social sciences can or should be value-free has been debated for more than a century and a half. The ideal of value-free science is old, and so are the arguments against it. Long before the twentieth century, Plato defended the idea against Protagorean skepticism and relativism, and Bacon warned that human values could interfere with or divert scientific enquiry (Koskinen et al. 2006, p. 86). During the nineteenth century, scholars used the term “ideology” to denote the influential values that could compromise the objectivity of science. However, the major debate began early in the twentieth century with Max Weber’s defense of the neutrality of social science. Weber argued that value-freedom can be achieved in the social sciences by eliminating all values that might threaten objectivity. In doing so, Weber was trying to defend the autonomy of social science with respect to politics and religion (Bruun, 2007, p. 109). In response to Weber’s plea for value-freedom, many philosophers in the 1930s and 1940s tried to eliminate all values that were deemed to affect objectivity in scientific enquiry (Koskinen et al. 2006, p. 91). To enhance objectivity, these philosophers adopted a strategy that relied solely on evidence and logic. However, even after this strategy was adopted, values were not completely eliminated.

In the 1950s and 1960s, philosophers noted that scientific enquiry needed more guidance than evidence and logic alone could provide. In response, a distinction was drawn between epistemic and non-epistemic values. Philosophers concluded that epistemic values (such as empirical support, simplicity and explanatory power) were necessary to guide scientific enquiry, while non-epistemic values (such as political and moral values) ought to be eliminated because they were likely to compromise objectivity. By the 1980s, philosophers believed that value-freedom could be achieved in the social sciences through the elimination of non-epistemic values. However, after extensive study and close examination, they agreed that value freedom could not be achieved in economics (Mongin, 2006, p. 259). Since then, some scholars have defended value freedom, arguing that it is necessary and that it can be achieved in the social sciences with the exception of economics. Opponents of the idea, however, have argued that non-epistemic values are essential and cannot be eliminated from scientific enquiry. Douglas (2009, p. 44) and Bailey (2008, p. 30), for instance, have demonstrated that non-epistemic values are needed in science for determining the limitations of methodology and the choice of projects. This paper argues against the ideal of value freedom in social science. It posits that value freedom cannot be completely achieved in social science and that non-epistemic values are required to guide scientific enquiry. Lastly, the paper evaluates several arguments advanced by defenders of the value-free ideal in the social sciences.


As Douglas (2009, p. 71) argues, social scientists bear responsibility for their actions, just like all other members of society. In particular, they are responsible for the intended consequences of their actions, as well as for some of the unintended ones. Douglas (2009, p. 71) holds that social scientists are responsible for all foreseeable consequences of their actions; thus, they should not act negligently or recklessly. Because science has a substantial impact on society, meeting these responsibilities can profoundly influence the direction and practice of science. Social scientists usually engage in scientific enquiry with good intentions, yet there are always unintended consequences or side-effects, some of which are foreseeable and therefore of great concern to scientists. Douglas (2009, p. 72) identifies two categories of foreseeable side-effects in scientific enquiry. The first is the side-effect that is likely to occur even if the knowledge produced by a scientific enquiry is accurate and reliable. For over half a century, research ethics has focused mainly on the consequences of scientific enquiries for human beings, placing restrictions on which projects and practices are morally acceptable. Even where an enquiry is carried out with good intentions, the project may be rejected once moral values are taken into consideration. When one considers the impact of a scientific project, or of the knowledge it produces, on society, moral values may carry greater weight or be more relevant than epistemic values. Recently, concerns have been raised worldwide over scientific projects that follow ethical processes yet produce knowledge that may be harmful to society (Jarvie & Zamora-Bonilla, 2011, p. 414).

The second kind of unintended foreseeable consequence is the possibility of making unreliable and inaccurate empirical claims. As Lekka-Kowalik (2010, p. 35) explains, claims derived from scientific enquiries carry substantial authority in society. Thus, well-intended claims can have a substantial negative impact on society when they are inaccurate and unreliable. Without research ethics that attend to the impact of scientific projects and knowledge on society, social scientists can be negligent and reckless; they can release information to the public that is inaccurate, unreliable and harmful. Apart from these general responsibilities, there are other responsibilities special to science that social scientists must meet. As Lekka-Kowalik (2010, p. 35) explains, social scientists must adhere to role responsibilities such as the fair consideration and evaluation of the work of others, open discussion of scientific results and honest reporting of data. These responsibilities are critical in enabling social scientists to achieve the prime goals of science. There is little contention over whether role responsibilities should be adhered to.

As Barber (2006, p. 539) argues, adherence to role responsibilities does not excuse a scientist from the general or basic responsibilities. Furthermore, it is implausible for scientists to transfer the general responsibilities elsewhere completely; only partial sharing is possible. For instance, to prevent unintended harms in enquiries using human subjects, scientists often submit their methodologies to designated ethics boards for review before commencing the research. As Barber (2006, p. 539) notes, such partial sharing is the most that scientists can accomplish, mainly because scientists are the only ones aware of the presence, nature and novelty of their studies. If social scientists were allowed to relinquish the general responsibilities, scientific processes would need to be monitored closely and constantly by ethical oversight boards, a move that scientists would be unlikely to welcome. Given that no one can take over the general responsibilities completely from social scientists, the scientists themselves must meet both the general and the role responsibilities. As Davis (2013, p. 555) notes, some scholars have argued that social scientists should ignore the general responsibilities. However, it is not justifiable to shield scientists from such responsibilities simply because they cannot transfer them elsewhere. Shielding scientists from these responsibilities amounts to giving them licence to engage in scientific enquiry without considering the consequences or the harm it may cause the public. As Davis (2013, p. 555) explains, adherence to the role responsibilities of a profession does not exempt an individual from general responsibilities.

A century ago, social science had little influence on public policy. Today, bureaucracies rely heavily on scientific evidence in making decisions, and scientific findings are increasingly used in developing public policy. Because scientific knowledge is so widely used in society, it has become much more important for scientists to make the best choices from among the alternatives available (Delanty, 2005, p. 11). In practice, social scientists use subjective judgment to select methodological approaches, to decide how to characterize study events or subjects for the purpose of data collection, and to choose how to interpret their results. When reporting, however, social scientists rarely mention these choices explicitly: they describe the paths they have taken, but hardly ever discuss the alternative paths they could have taken. It is precisely this discussion of alternatives, which would expose the subjective judgments behind the chosen options, that some scholars avoid by appealing to value freedom. Both epistemic and non-epistemic values play a crucial role in selecting the most suitable options. By arguing that the social sciences can be value-free, supporters of the idea deliberately ignore this fact. Even though they regard their choices as the most suitable, they avoid discussing how they arrived at them (Kitcher, 2008, p. 221).

As Davis (2013, p. 556) explains, every choice made during the scientific enquiry process carries the possibility of error. One may select a methodological approach that is not appropriate, leading to unreliable and inaccurate results; one may select an approach that leads to an incorrect characterization of the data; or one may rely on assumptions that lead to a wrong interpretation of the results. Where the results of an enquiry are applied to public policy decisions, such errors can have substantial negative non-epistemic consequences. This implies that scientists must select the choices whose errors have the least serious consequences. And to assess the impact on the public of the error associated with a particular choice, one must assign non-epistemic values to its consequences.

Only by weighing both epistemic and non-epistemic values can a social scientist make the right and most effective choice. Non-epistemic values thus play a key role in the selection of internal scientific choices (Kincaid, Dupre & Wylie, 2007, p. 12). As McMullin (2012, p. 124) explains, there are cases where non-epistemic values play a minor role or none at all: for instance, where the uncertainty associated with the alternative choices is so small that scientists need not rely on non-epistemic values, or where the consequences of an error are so opaque that it is difficult to decide which non-epistemic values to apply. However, as Betz (2010, p. 372) contends, in most cases the consequences of the errors associated with particular choices are fairly clear, and most errors carry significant uncertainties. If a choice is likely to lead to an error with significant uncertainty, the scientists who select it are held responsible for the error's consequences for the public. In some cases, the error rates associated with different choices may be the same while the consequences differ; here, a scientist is expected to select the choice with the least serious consequences (Bauman, 2010, p. 48). That selection depends more on non-epistemic values than on epistemic ones. In short, non-epistemic values are useful and cannot be ruled out of the decisions made during scientific enquiry.

As Brister (2008, p. 736) argues, the knowledge produced by social scientists is quite valuable to society. However, the value of knowledge does not surpass ethical and social values, as some scholars suggest. Betz (2013, p. 207), for instance, argues that the knowledge gained from science is so valuable that, for it to be attained, claims of social and moral responsibility should be relinquished. Betz (2013, p. 207) and other scholars of the same view intend to support the autonomy of social science, and they aim to shield social scientists from responsibility for the consequences of their actions. In other words, Betz (2013, p. 207) and other supporters of value freedom tend to accord high status to epistemic values and to downplay the importance of non-epistemic values. It is evident, however, that society does not accord epistemic values such high status. Today, limits are placed on the use of human subjects in scientific enquiries. This implies that society is not willing to sacrifice social and ethical values for the knowledge produced through scientific enquiry.

Supporters of value-freedom in the social sciences have also argued that scientists should be shielded from the burden of considering the consequences of the errors associated with their choices. According to Betz (2013, p. 208), requiring social scientists to consider the uncertainties involved and the consequences of errors for the public places a burden on scientists that may hamper science. In other words, Betz (2013, p. 208) argues that the price of adhering to moral responsibility in science is too high. Unfortunately, scholars advancing this point have not clearly defined what that price is. In fact, in some cases the price of ignoring non-epistemic values can be higher than the cost of adhering to them. Further, most scholars advocating value freedom argue that objectivity may be lost if non-epistemic values are allowed to inform the scientific enquiry process. However, as Punch (2005, p. 48) explains, the social sciences can be objective even when non-epistemic values inform the enquiry process; this is readily achieved where there is no conflict between epistemic and non-epistemic values. Although non-epistemic values influence the choices scientists make, they do not affect the application of epistemic values, so scientists can adhere to both types of values at the same time. Some defenders of value freedom are concerned that if the ideal is relinquished, social scientists will acquire inappropriate authority in society. This need not be the case: social scientists are encouraged to make their judgments explicit, which increases the possibility of genuine public input into science.


In conclusion, value freedom is not possible in economics or the other social sciences. Social scientists are responsible for the consequences of their actions in contemporary society, just like other members of society: they are restricted to pursuing knowledge that is not harmful to society, and they are held responsible when they release unreliable and inaccurate results to the public. Non-epistemic values cannot be ruled out of scientific enquiry, since they influence the subjective judgments social scientists make when selecting the most suitable choices. As the discussion has shown, non-epistemic values are essential to scientific enquiry. They play a great role in enhancing social scientists' adherence to research ethics, the lack of which can have harmful consequences for the public. Further, these values play an essential role in regulating the conduct of social scientists, which oversight bodies do not regulate well. Although several arguments have been advanced by some scholars to defend value freedom in the social sciences, they are not persuasive and do not provide a justifiable basis for the complete elimination of non-epistemic values from scientific enquiry.


Bailey, K. (2008). Methods of Social Research, 4th Edition. Simon and Schuster, London

Barber, B. R. (2006). “The Politics of Political Science: ‘Value-free’ Theory and the Wolin–Strauss Dust-Up of 1963.” American Political Science Review, Vol. 100, No. 4, pp. 538 – 545

Bauman, Z. (2010). Hermeneutics and Social Science (Routledge Revivals): Approaches to Understanding. Taylor & Francis, London

Betz, F. (2010). Managing Science: Methodology and Organization of Research. Springer
Betz, G. (2013). “In defense of the value free ideal.” European Journal for Philosophy of Science, Vol. 3, No. 2, pp. 207 – 220

Brister, E. (2008). “Value‐Free Science? Ideals and Illusions.” Ethics, Vol. 118, Iss. 4, pp. 735 – 738

Bruun, H. H. (2007). Science, Values and Politics in Max Weber’s Methodology. Ashgate Publishing, Ltd, Hampshire

Davis, J. E. (2013). “Social Science, Objectivity, and Moral Life.” Society, Vol. 50, No. 6, pp. 554 – 559

Delanty, G. (2005). Social Science. McGraw-Hill International, New York

Douglas, H. E. (2009). Science, Policy, and the Value-Free Ideal. University of Pittsburgh Press, Pittsburgh

Jarvie, I. C. & Zamora-Bonilla, J. (2011). The SAGE Handbook of the Philosophy of Social Sciences. SAGE, California

Kincaid, H., Dupre, E. J. & Wylie, A. (2007). Value-Free Science? Ideals and Illusions. Oxford University Press, Oxford

Kitcher, P. (2008). Science, Truth, and Democracy. Oxford University Press, New York

Koskinen, H. J. et al. (eds.) (2006). Science – A Challenge to Philosophy? Peter Lang GmbH, Frankfurt am Main

Lekka-Kowalik, A. (2010). “Why science cannot be value-free: understanding the rationality and responsibility of science.” Science and Engineering Ethics, Vol. 16, No. 1, pp. 33 – 41

McMullin, E. (2012). “Values in Science.” Zygon, Vol. 47, Issue 4, pp. 686 – 709

Biddle, J. (2013). “State of the field: Transient under-determination and values in science.” Studies in History and Philosophy of Science, Vol. 44, No. 1, p. 124

Mongin, P. (2006). “Value Judgments and Value Neutrality in Economics.” Economica, Vol. 73, No. 90, pp. 257 – 286

Punch, K. S. (2005). Introduction to Social Research: Quantitative and Qualitative Approaches. SAGE, California