Should tech specialists take a Hippocratic oath?

STEM academic and BBC regular Hannah Fry calls for ethics to play a bigger part in tech fields that will shape our future.

Leading researcher, mathematician, TV presenter and all-round maths bae Hannah Fry has recently called for computer engineers and scientists to take the potential power they have over humanity’s future more seriously in the form of a Hippocratic oath.

For those of you who haven’t watched nearly enough Grey’s Anatomy, the Hippocratic Oath is a pledge of ethics historically taken by doctors and physicians to uphold specific ethical standards. The pledge states that a medical practitioner will always strive to do no harm to those their practice impacts.

And it’s exactly this kind of thinking that Fry is convinced should also become a key component of the training scientists, mathematicians, and engineers receive.

Increasingly, these professionals are building systems that gather and sell personal data, and that exploit human frailties. They build and manage our infrastructure, our security, and our tools of war. Life-or-death decisions are no longer solely the realm of medical professionals but sit at the very core of the tech these people are creating. So, Fry argues, it’s no longer practical to treat mathematics and philosophy as separate entities.

Take self-driving cars, for example. Engineers working on fully autonomous vehicles are having to turn instinctual human reactions into algorithms. Potential ‘decisions’ these cars might have to make, like whether to prioritise the lives of their passengers over pedestrians in a collision, will have to be hard-wired into the artificial intelligence.
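To see why this is an ethical question and not just an engineering one, consider a toy sketch (entirely hypothetical, with invented names and numbers, not any real vehicle’s code). Whoever picks the weights below is making exactly the kind of moral judgement Fry is talking about:

```python
# Hypothetical illustration only: a crude priority rule an engineer might be
# asked to hard-code into a collision-avoidance system. The group names and
# weights are invented for this example.

def choose_action(actions):
    """Pick the action with the lowest expected harm score.

    `actions` maps an action name to expected casualties per group,
    e.g. {"swerve": {"passengers": 1, "pedestrians": 0}}.
    """
    # Invented weighting: here all lives count equally. Changing these
    # numbers is an ethical decision disguised as a parameter tweak.
    weights = {"passengers": 1.0, "pedestrians": 1.0}

    def harm(casualties):
        return sum(weights[group] * n for group, n in casualties.items())

    return min(actions, key=lambda a: harm(actions[a]))

decision = choose_action({
    "brake":  {"passengers": 0, "pedestrians": 2},
    "swerve": {"passengers": 1, "pedestrians": 0},
})
# With equal weights the lower-harm action is "swerve"; weight passengers
# more heavily and the same code would choose "brake" instead.
```

The uncomfortable part isn’t the code, which is trivial, but the weights: there is no mathematical answer to what they should be.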

It’s an abstract, conjectural version of the life of the mother vs. the life of the baby debate taught in medical schools, except, unlike doctors, engineers are typically woefully unprepared to engage in this type of debate. It’s never been part of their training.

As Fry put it to The Guardian, ‘In medicine, you learn about ethics from day one. In mathematics, it’s a bolt-on at best. It has to be there from day one and at the forefront of your mind in every step you take’.

Fry said she got a sense of the ethical blind spots scientists can have while describing, at an academic conference in Berlin, the computer modelling of the 2011 riots she had done for the Metropolitan Police. The audience began heckling her, and it was only later, when she stopped to consider what predictive policing might mean to an audience with Berlin’s history of state surveillance, that she realised why.

She realised how mathematicians, computer engineers, and physicists are so used to working on abstract problems that they rarely stop to consider the ethics of how their work might be used.

The case for a Hippocratic Oath for scientists has been made before. In 1969 the philosopher Karl Popper wrote that ‘one of the few things we can do is to try to keep alive, in all scientists, the consciousness of their responsibility.’

The fact of the matter is, when we engage with or build new technology, we’re consenting to its production not just on our own behalf, but on our children’s too. We may not yet live in a world where genetic sequencing results in genetic discrimination, but in 100 years this could very much be a reality.

Which is why, by Fry’s reasoning, we should put ethical backstops in place that force developers to consider whether their forays into future tech are beneficial long-term.

But could enforcing an oath limit the creation of new tech? In medicine, the Hippocratic Oath has been a central tenet of the debate surrounding euthanasia, with critics arguing that the very logic of ‘do no harm’ contradicts assisted death. A pledge is, in anyone’s terms, a limitation. Given that we have no idea where the future of science could and will take us, is it really wise to put such firm constraints in place?

As the creators and collaborators of future tech, Gen Z certainly has a stake in this discussion. Would a scientific oath of ethics be a good idea, or would it merely limit the trajectory of science? Let us know your thoughts in the comments below.
