In the preface to his 1966 work The Order of Things, Michel Foucault recounts a passage from Borges in order to establish his objective. Quoting Borges, who in turn refers to ‘a certain Chinese encyclopaedia’, the passage describes a classification of animals as being ‘divided into: (a) belonging to the Emperor, (b) embalmed, (c) tame, (d) sucking pigs, (e) sirens, (f) fabulous, (g) stray dogs, (h) included in the present classification, (i) frenzied, (j) innumerable, (k) drawn with a very fine camelhair brush, (l) et cetera, (m) having just broken the water pitcher, (n) that from a long way off look like flies’. In a later lecture recalled by Laurie Taylor, Foucault lambasted the impulse to capture and mount every butterfly in a genus and lay them out on a table, highlighting minute differences in form and colour, as if trying to solve God’s puzzle. Continue reading “Reflections on Blackwater: Technological Theologies, Autistic Robots, and Chivalric Order”
How do markets optimise the delivery of social services and social welfare? This question surfaces many of the challenges for the Austrian School, the philosophy that free markets and the price mechanism can do a remarkable job in managing people and their behaviour. While Friedrich Hayek’s early theorising argued that the role of the State should be minimal, he ultimately conceded that some State regulation was required in order to maintain markets, and to perform certain other functions. For example, ‘[t]o prohibit the use of certain poisonous substances, or to require special precautions in their use, to limit working hours or to require certain sanitary arrangements, is fully compatible with the preservation of competition. The only question here is whether in the particular instance the advantages gained are greater than the social costs they impose.’ (The Road to Serfdom, pp. 38–39) The ultimate question of Hayekian liberalism is how much the government has to interfere: what is the minimum possible function of government? Continue reading “Hayek, The Busted Flush: Economic Value, Marketisation, and Social Justice”
On the day when Apple are supposed to be launching a new iPhone with facial scanning capability, the Guardian has delightfully timed a piece warning of the dangers of the technology. Its functions potentially extend to predicting sexual orientation, political disposition, or nefarious intent. What secrets can remain in the face of this extraordinary power! Indeed, it’s two years since I heard Martin Geddes talking about people continuing to wear face masks in Hong Kong not because of the smog, but to avoid facial scanning technologies deployed by an overbearing security apparatus. There’s no hiding from the data, no forgetting.
There has been much written in recent times about post-truth politics, and much associated navel-gazing as commentators, analysts and politicians themselves have tried to understand how to deal with all this. Leave campaigners in the UK promised £350m a week for the NHS; Donald Trump still thinks he opposed the war in Iraq; and Vladimir Putin claimed no involvement with the war in Ukraine. Populism, reactionism, anti-intellectualism – call it what you will, it’s certainly got currency.
Speaking of currency, the pound has taken a pounding since the Brexit vote, and overnight trading on the 6th–7th October witnessed a flash crash that traders have struggled to explain. David Bloom at HSBC suggested that the pound had become the de facto opposition in Britain. ‘Sterling,’ he said, ‘has become a structural and political currency.’ I’d go further than that. It’s the algorithms providing the opposition. The algorithms governing the trading desks even scrape news feeds for potentially influential stories (one commentator suggested that comments from François Hollande on the Brexit negotiations may have triggered the ‘flash crash’). They sense, learn, and respond, constantly scoring and valuing political decisions and the smallest market moves.
The algorithms then form their own truth. This isn’t post-truth politics, it’s absolute truth. And that is potentially a whole lot worse.
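The sense–score–respond loop described above can be sketched in a few lines of Python. Everything here is a hypothetical toy – the word lists, thresholds, and headlines are invented for illustration, and bear no resemblance to how a real trading desk’s models actually work:

```python
# A toy sketch of an algorithm that ingests headlines, scores their
# sentiment against a tiny hand-made lexicon, and emits a trading
# signal. Purely illustrative: real systems use statistical NLP models
# and market microstructure data, not keyword lists.

NEGATIVE = {"crash", "tough", "hard", "uncertainty", "collapse"}
POSITIVE = {"deal", "growth", "stability", "agreement", "recovery"}

def score_headline(headline: str) -> int:
    """Crude sentiment score: +1 per positive word, -1 per negative word."""
    words = headline.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def signal(headlines: list[str], threshold: int = 2) -> str:
    """Aggregate headline sentiment into a buy/sell/hold decision."""
    total = sum(score_headline(h) for h in headlines)
    if total <= -threshold:
        return "SELL"
    if total >= threshold:
        return "BUY"
    return "HOLD"

headlines = [
    "Hollande demands tough Brexit negotiations",
    "Markets fear hard exit and prolonged uncertainty",
]
print(signal(headlines))  # enough negative headlines tip the signal to SELL
```

The point of the sketch is the feedback loop, not the scoring: once many such algorithms read the same feeds and trade on the same signals, their collective ‘truth’ about an event becomes self-reinforcing.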
We live in a perpetual reckoning. It’s a strange place in some respects, and not very forgiving. It’s a place where everything is counted, everything is measured, billed, quantified. I’m talking, of course, about the digital space that we inhabit today, where our politics and our societies function and grow, where our families meet, our groceries are procured, and our priests broadcast Mass.
I had the privilege to participate in a workshop on algorithmic governance this past Friday at my alma mater, the National University of Ireland, Galway, under the supervision of Dr Rónán Kennedy and Dr John Danaher of the Law Faculty, and co-funded by the Colleges of Business and Public Policy. It’s part of a wider program of research grandly titled ‘Algocracy and the Transhumanist Project’, which promises to tread some fascinating pathways. Comprehensive synopses of the event have already been published by Dr Danaher and one of the speakers, Dr Muki Haklay, so I won’t re-do their work, but will instead pick up one of the particularly interesting themes that emerged from the day.
Shoshana Zuboff’s ‘Big Other’ and ‘Surveillance Capitalism’ as Future Economic Models
Shoshana Zuboff’s recently published article on what she has termed Information Civilization is a compact and helpful analysis of the kind of internet economies that are emerging in the early twenty-first century. This blog post is a commentary on that text. She takes Google’s Chief Economist Hal Varian as her foil, referencing his two articles Computer Mediated Transactions (2010) and Beyond Big Data (2013).