I’ve been going to quite a few events recently which broadly come under the heading of futurism – indeed many of them have been through a reliably high-quality meetup group actually called London Futurists.
These meetings deal with more-or-less mind-boggling speculations and predictions of things like robots taking all the jobs, artificial intelligences surpassing human capacities, people hacking their own or their children’s biology through genetic or prosthetic modifications, and similar subjects. Sci-fi stuff, you might think …
… and some of it is, no doubt, a little far-fetched for me. However, there is a strong argument that it isn’t all fantasy: on the one hand, technical advances in these areas are coming thick and fast; on the other, we have the self-feeding wildfire of US venture capitalism, driven by insanely ambitious people who have become billionaires before their thirtieth birthday. Together these make it likely that exponential rates of change will produce significant impacts from one or more of these domains in the next couple of decades.
Processing these ideas, I kept coming round to the same thought: if our society is to get itself into a position where we can make sensible, timely choices about these issues – all of which will have strong recursive impacts on our nature as humans – we are going to need to start from a reliable, widely-shared ethical position, one designed to cope with futuristic-seeming situations.
And we don’t have any such thing. Nor do we have any strong candidates.
Of course, we have no shortage of people and groups with views on these matters (and everything else), but if we are looking for widely-shared, carefully drafted, pragmatic and applicable ethical frameworks that address the issues that are going to have major impacts on us – and I include Climate Change in the list – we won’t find much.
Eventually I began to say this out loud, both at events and in post-event discussion forums, until David Wood, who runs London Futurists, challenged me to make a clear proposal.
You can read it here.