In the spirit of this year's theme, experimental psychologist and Harvard professor Steven Pinker discussed reclaiming the values of the Enlightenment in conversation with journalist Andrian Kreye (Süddeutsche Zeitung) and philosophy professor Julian Nida-Rümelin (LMU Munich).
In his upcoming book "Enlightenment Now", Steven makes a case for the values and achievements of the Enlightenment: reason, science, humanism, and progress – all of which have been challenged and threatened in recent years. "We have seen the rise of populism, religion, nationalism, reactionary ideology and the desire to return to a golden age." Progress isn't a law of nature, but a gift of science and humanism, Steven argues. These Enlightenment values have been questioned not only by the political right, but also by the academic left. "It's been dismissed as naive or as an ideology of Silicon Valley – it's quite remarkable how few people believe in the ideal of progress."
Julian Nida-Rümelin recently wrote about robo-ethics and will publish a book on digital humanism this fall. Do we have to re-calibrate the meaning of humanism in light of AI? Will human rights and values also apply to robots and new forms of intelligent agents? Julian believes this would be a category mistake, as he regards AI as a mere simulation of intelligence.
Looking at the human-like androids presented in science fiction, this has to remain an open question, Steven argues. He cites the story of Lieutenant Commander Data in Star Trek: when the android is threatened with being dismantled for reverse engineering, the audience's reaction strongly suggests this would be considered an act of murder. There is a deep uncertainty about whether being made of flesh is necessary for having consciousness, interests, and thus rights, Steven concludes. Ruling this out, isn't that just "meat chauvinism"?
However, he concedes that today's AI is far from duplicating human intelligence, or emulating it in a way that could create human-like agents. So we will probably still have some time to figure this one out.