New technologies in neuroscience generate reams of data at an exponentially increasing rate, spurring the design of very-large-scale data-mining initiatives. Several supranational ventures are contemplating the possibility of achieving, within the next decade or two, a full digital simulation of the human brain. This "opinion" review questions the scientific and strategic underpinnings of the runaway enthusiasm for Big Data and industrial-scale projects at the interface between the "wet" sciences (biology) and the "hard" sciences (physics, microelectronics and computer science).
I will focus on four major issues:
(i) Is Big Data, produced by the industrialization of neuroscience and mined with new algorithms in artificial intelligence such as "deep learning", the soundest way to achieve substantial progress in understanding the brain?
(ii) Do we have a safe “roadmap” to build a Mega-Science of the Mind based on a scientific consensus?
(iii) Irrespective of technological feasibility, what could be the main conceptual bottlenecks?
(iv) Do these large-scale approaches announce a new trend in scientific conduct, fueled by an economics of promises and by the overselling of futurist mythology (the theory of the singularity and transhumanism)?