Tag: Technology

Falling Down

In Being and Time, Martin Heidegger refers to verfallen as a characteristic of being, or Dasein. It means fallen-ness, or falling prey: an acknowledgement that we do things not because we want to do them, but because we must; we act in particular ways, we fall into line, we do jobs, have families, get a mortgage and a pension, obey the law and so on. We consciously engage with the systems and societies in which we find ourselves. It is surprising how frequently this concept of ‘the fall’ emerges in philosophy, theology and popular culture.

Plato’s Republic begins ‘I went down to the Piraeus.’ He is descending to the port of Athens, where unsavoury types tend to gather, the great unwashed: the uneducated, the slaves, the lower orders. Nietzsche’s Zarathustra descends from his cave on the mountaintop, a shift from pure being to something like a contaminated entity. Marx and Engels develop Feuerbach’s theme of alienation as the worker’s estrangement from the commodity (something that was apparently not an issue when craftsmen made objects and sold them themselves), and it carries similar themes of distance. Indeed, in Feuerbach’s original work the distinction was that between God and Man, between the mortal and the immortal. Most of all, Heidegger’s language evokes the Fall of Man, the original sin in the Garden of Eden, innocence and paradise lost.

In all of these things there is a clear distinction between a higher plane of existence and a lower, base, grubby humanity. There is a gap between what Heidegger would call authenticity and inauthenticity, between the real and the unreal.

Sometimes we open windows onto this realisation, when something that defies science or rationality rears its head. Something that just doesn’t make sense. Like Brexit, or War, or Suicide. How can rational beings act in such ways? Does the question morph into another: are we rational beings at all? From time to time, we inquire into the nature of our reality to try to understand – to really understand – what is going on, to seek to become authentic. We get glimpses, brief moments of clarity. We recognise that we have blind spots; we recognise some of the follies of our world, the hypocrisies and the hubris. We might briefly recognise that upon these false assumptions we have built enormous social edifices that persist through a shared (mis)interpretation of what our purpose on this earth is.

That misinterpretation is there because we are fallen, descended, socialised, machined. Heidegger also talks about technology – a lot! – and describes it in two ways: as revealing, and as enframing. As revealing, technology discloses the potentiality of the world: a tree is a potential mallet, and the emergence of the mallet from that tree is a revealing of its potentiality. As enframing, technology (particularly modern, industrialised technology) enframes the world, corralling it for the purposes of human advancement (to wherever).

In my continued evaluation of the theology of technology, these themes in Heidegger resonate forcefully. In particular, I continue to consider the power of AI and information technologies to see past the blind spots and hypocrisies and hubris not just occasionally, but persistently; unless we design all of these machines to be inauthentic – and many of them will certainly be designed that way – AI will become authentic. AI will become Zarathustra. And to us, it may appear that the machines have gone insane. As Heidegger said in his 1966 interview with Der Spiegel, ‘only a God can save us’.

World Religions and AI

There are practical, ethical and theological challenges for religion posed by technology and AI. But what if the technology is actually becoming theological in itself?

AI poses several challenges for the religions of the world, from theological interpretations of intelligence, to the ‘natural’ order, to moral authority. The Southern Baptists released a set of principles last week, after an extended period of research, which appear generally sensible: AI is a gift, it reflects our own morality, it must be designed carefully, and so forth. Privacy is important; work is too (we shouldn’t become idlers); and (predictably) robot sex is verboten. Perhaps surprisingly, lethal force in war is acceptable, so long as it is subject to review and human agents are responsible for what the machines do; who those agents specifically are is a thornier issue that is side-stepped.

Continue reading “World Religions and AI”

National Security and the Legitimacy of the State

Edward Snowden: his revelations (though not new) have launched an avalanche of introspection and head-scratching.

The New York Times and the Guardian have been digging ever deeper into the activities of the US National Security Agency (NSA), following Edward Snowden’s leaking of information about how it was spying both on countries and on ordinary people at home. Hot on the heels of the Chelsea Manning and Wikileaks diplomatic cables episode, there has been a constant flow of stories reporting on the nefarious activities of spooks and governments, embarrassing opinions, and the mechanisms by which international diplomacy and spying are conducted, though Wired Magazine had got there first. There are numerous angles to all of this. There is the technology problem: an Orwellian, Kurzweilian, post-humanist dystopia where technology trumps all, and big data and analytics undermine or redefine the essence of who we are, forcing a kind of re-evaluation of existence. There is the human rights problem: the balancing of the right to privacy and – generally speaking – an avoidance of judgement of the individual by the state, with the obligation to secure the state. This issue is complex: if, for example, we have the ability to know, to predict, to foretell that people are going to do bad things, but we choose not to use it because doing so would also require predicting which people were going to do not-bad things, and therefore invade their privacy, is that wrong? Many people asked after 9/11, ‘why didn’t we see this coming?’ Which leads to the question: if you could know all that was coming, would you want to know?

Continue reading “National Security and the Legitimacy of the State”