The British Museum is a controversial edifice. In part a persistently triumphal display of looted treasure – such as the Parthenon Marbles and the Benin Bronzes – by a brutal and supremacist empire, in part a conservator of important artefacts of social history, its symbolism at a time of Brexit and resurgent nationalism is unhelpful to liberal sensibilities. It remains something of a contradiction that its erstwhile director, Neil MacGregor, combines a defence of its virtue as a world museum with criticism of the British view of their own history as, in general, ‘dangerous’ (Allen, 2016).
In Being and Time, Martin Heidegger refers to Verfallen as a characteristic of being, or Dasein. It means fallenness, or falling prey: an acknowledgement that we do things not because we want to do them, but because we must; we act in particular ways, we fall into line, we do jobs, have families, get a mortgage and a pension, obey the law and so on. We engage, largely without reflection, with the systems and societies in which we find ourselves. It is striking how frequently this concept of ‘the fall’ emerges in philosophy, theology and popular culture.
AI poses several challenges for the religions of the world, from theological interpretations of intelligence, to the ‘natural’ order, to moral authority. The Southern Baptists released a set of principles last week, after an extended period of research, which appear generally sensible: AI is a gift, it reflects our own morality, it must be designed carefully, and so forth. Privacy is important; so is work (we shouldn’t become idlers); and (predictably) robot sex is verboten. Perhaps surprisingly, lethal force in war is acceptable, so long as it is subject to review and human agents are responsible for what the machines do; who those agents specifically are is a thornier issue that is side-stepped.
The New York Times and the Guardian have been digging ever deeper into the activities of the US National Security Agency (NSA) following Edward Snowden’s leaks about how it was spying both on foreign countries and on ordinary people at home. Hot on the heels of the Chelsea Manning and Wikileaks diplomatic cables episode, there has been a constant flow of stories reporting on the nefarious activities of spooks and governments, embarrassing opinions, and the mechanisms by which international diplomacy and spying are conducted – though Wired Magazine had got there first.

There are numerous angles to all of this. There is the technology problem: an Orwellian, Kurzweilian post-humanist dystopia where technology trumps all, and big data and analytics undermine or redefine the essence of who we are, forcing a kind of re-evaluation of existence. There is the human rights problem: balancing the right to privacy – and, generally speaking, an avoidance of judgement of the individual by the state – against the obligation to secure the state.

This issue is complex. If, for example, we have the ability to know, to predict, to foretell that people are going to do bad things, but we choose not to because it would also require predicting which people were going to do not-bad things, and therefore invading their privacy, is that wrong? Many people asked after 9/11, ‘why didn’t we see this coming?’ Which leads to the question: if you could know all that was coming, would you want to know?