Ethics and Science
Oct 01, 2017

At the end of the workshop on Ethics and Science at the Journée des Jeunes Chercheurs organized by the students at Observatoire Océanologique de Villefranche, the following questions were posed to all in attendance. Below is my painfully brief response to these excellent questions from the perspective of data sharing as applied to science and scientists.

Ethics and strenght: What can we do in our daily lives to make science more ethical without losing its strenght?
Ethics in time: How did our ethical constraints adjust to new technologies of the last decades and how will they change in the next years?
Ethics and equality: How do our values and social background influence our ethical behavior in science?

Note: I substituted ‘integrity’ for ‘strenght’ (sic) in the first question below. Without being absolutely certain what the workshop organizers intended, my guess is they were shooting for integrity. In the second question, I substituted ‘over’ for ‘in’ because it makes more sense grammatically. And in the final question I substituted ‘context’ for ‘equality’ because, well, I am quite certain context fits better given its, well, context.

Ethics and ~~strenght~~ integrity: Even as increasing computing power gives us the ability to analyze data, the volume of data is growing faster still. This makes replication and verification difficult. It is incumbent upon scientists to exercise diligence and protect the integrity of science. When all is said and done, science is unique precisely because of its integrity. Even though science may be a product of its cultural context, and thus influenced by it (see more below), the fact is that integrity is science’s sole defense. The NAS report on promoting integrity in research [1] specifically recommends openness to address the ‘reproducibility problem.’

Ethics ~~in~~ over time: The biggest ethical, as well as technological, behavioral, and legal, challenges will arise in studies involving human subjects, and will center on protecting the privacy and security of those who have entrusted us with their data. The behavioral challenge will involve sensitizing the public to the danger of trading privacy for convenience. The technological challenge will involve extracting useful information from data while protecting the privacy of its subjects. Apple vs. FBI [2] is a worthwhile case study in choosing to protect data as the lesser of two evils. Legally, we may need new contractual instruments that allow the use of data by those who should have access to it, while creating disincentives for sharing it with those not authorized to do so. Shoshana Zuboff calls this dynamic surveillance capitalism, which is “constituted by unexpected and often illegible mechanisms of extraction, commodification, and control that effectively exile persons from their own behavior while producing new markets of behavioral prediction and modification.” [3]

Ethics and ~~equality~~ context: Almost all qualitative considerations (ethics, equality, equity, privacy, security) are highly contextual, dependent on both time and geography [4]. Notions of privacy and security vary widely by cultural context, and technologies invented in one part of the world but adopted in others can amplify these differences with unexpected and significant consequences. Also see the triad of ethical challenges – context sensitivity, methodology, and legitimacy – proposed by Vayena et al. [5]

Time, space (geography), and sociocultural factors are intricately tied to how science is practiced, assessed, and shared. For example, in human-subject research, privacy and openness are two sides of the same coin – one cannot exist without the other, because the connecting factor between the two is trust. The first casualty of ethical compromise will be trust: damage to trust in researchers leads to damage to trust in science. And without trust there will be no data, and without data, science is just politics.

  1. National Academies of Sciences, Engineering, and Medicine. 2017. Fostering Integrity in Research. Washington, DC: The National Academies Press.
  2. FBI–Apple encryption dispute. (2017, September 26). In Wikipedia, The Free Encyclopedia. Retrieved October 1, 2017.
  3. Zuboff, Shoshana. 2015. Big Other: Surveillance Capitalism and the Prospects of an Information Civilization. Journal of Information Technology 30(1): 75–89.
  4. Adam Barth, Anupam Datta, John C. Mitchell, and Helen Nissenbaum. 2006. Privacy and Contextual Integrity: Framework and Applications. In Proceedings of the 2006 IEEE Symposium on Security and Privacy (SP '06). IEEE Computer Society, Washington, DC, USA, 184–198.
  5. Vayena E, Salathé M, Madoff LC, Brownstein JS (2015) Ethical Challenges of Big Data in Public Health. PLoS Comput Biol 11(2): e1003904.