Metodo

International Studies in Phenomenology and Philosophy


Big data and their epistemological challenge

Luciano Floridi

pp. 435-437

Abstract

It is estimated that humanity accumulated 180 EB of data between the invention of writing and 2006. Between 2006 and 2011, the total grew roughly tenfold, reaching 1,600 EB. This figure is now expected to grow fourfold approximately every 3 years. Every day, enough new data are generated to fill all US libraries eight times over. As a result, there is much talk about “big data”. This special issue on “Evolution, Genetic Engineering and Human Enhancement”, for example, would have been inconceivable in an age of “small data”, simply because genetics is one of the data-greediest sciences around. This is why, in the USA, the National Institutes of Health (NIH) and the National Science Foundation (NSF) have identified big data as a programme focus. One of the main NSF–NIH interagency initiatives addresses the need for core techniques and technologies for advancing big data science and engineering (see NSF-12-499).
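The abstract's growth figures imply a simple compound-growth projection. As an illustrative sketch only (the function name and the assumption that fourfold growth every 3 years continues unchanged are mine, not the author's), the stated 2011 baseline of 1,600 EB extrapolates as follows:

```python
def projected_data_eb(year, base_year=2011, base_eb=1600, factor=4, period=3):
    """Project total accumulated data (in exabytes) from the 2011 baseline,
    assuming the abstract's fourfold growth every 3 years holds indefinitely."""
    return base_eb * factor ** ((year - base_year) / period)

# One growth period after the baseline: 1,600 EB x 4 = 6,400 EB.
print(round(projected_data_eb(2014)))  # 6400
# Three periods (9 years) later: 1,600 EB x 4^3 = 102,400 EB.
print(round(projected_data_eb(2020)))  # 102400
```

Such extrapolations are, of course, only as good as the assumption of constant growth; the point of the abstract's figures is the order of magnitude, not the exact totals.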

Publication details

Published in:

Powell, Russell; Kahane, Guy; Savulescu, Julian (eds.) (2012) Evolution, genetic engineering and human enhancement. Philosophy & Technology 25 (4).

Pages: 435-437

DOI: 10.1007/s13347-012-0093-4

Full citation:

Floridi, Luciano (2012) “Big data and their epistemological challenge”. Philosophy & Technology 25 (4), 435–437.