David is a technology adviser to the British Government. He focuses on emerging technologies and most recently worked on the NHS COVID-19 contact tracing app. He describes his role as that of an intermediary between software engineers, ministers, and lawyers. He attributes many of the pitfalls of government technology programmes to low levels of technological literacy. Some of those lessons are relevant to young lawyers, and he offers them here.
At the start of April this year, a Silicon Valley executive picked up the phone. 5,000 miles away in London, it was answered by a weary civil servant. A heated discussion followed, spanning contact tracing, sovereignty, and brand management. Days later, governments around the world announced U-turns in their contact tracing strategies. This article is about why, and what it might mean for you and your career in law.
Digital contact tracing.
A digital contact tracing system takes a labour-intensive activity and makes it cheap and scalable by outsourcing it to mobile phones. Simply put, these systems collect lists of ‘meetings’ and ‘infected people’. An algorithm compares the lists and pushes out alerts whenever there are matches.
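As a toy illustration of that matching step (and nothing more), think of it as a set intersection. The model below is deliberately simplified and every name in it is hypothetical; real systems use rotating anonymous identifiers and considerably more machinery.

    # A toy sketch of the matching step, assuming a simplified model in
    # which phones log anonymous identifiers for each 'meeting' and health
    # authorities publish the identifiers of infected users. All names are
    # illustrative, not drawn from any real contact tracing API.

    def check_exposure(meetings: set[str], infected: set[str]) -> bool:
        """Return True if any identifier we have met is on the infected list."""
        return not meetings.isdisjoint(infected)

    meetings = {"id-4f2a", "id-9c1b"}  # identifiers this phone has seen
    infected = {"id-9c1b", "id-77d0"}  # identifiers published as infected
    if check_exposure(meetings, infected):
        print("Match found: push an exposure alert to the user")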
Broadly speaking, governments have two options: ‘centralised’ systems, where aggregated ‘meeting’ data is stored and processed on a government server; and ‘decentralised’ systems, where this activity takes place on each individual’s phone. The former prioritises epidemiological utility whilst the latter prioritises privacy.
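Sketched in the same toy model, the difference between the two designs comes down to where that matching runs and who ever sees the ‘meeting’ data. Both functions below are illustrative assumptions, not real APIs.

    # Centralised: every phone uploads its meeting log; a government
    # server holds the full contact graph and runs the matching.
    def centralised_match(all_meetings: dict[str, set[str]],
                          infected: set[str]) -> set[str]:
        return {user for user, met in all_meetings.items()
                if not met.isdisjoint(infected)}

    # Decentralised: the server only publishes the infected list; each
    # phone downloads it and matches locally, so the contact graph
    # never leaves anyone's device.
    def decentralised_match(my_meetings: set[str],
                            published_infected: set[str]) -> bool:
        return not my_meetings.isdisjoint(published_infected)

The centralised version hands scientists the whole contact graph; the decentralised version tells the server almost nothing. That is the trade-off between epidemiological utility and privacy in miniature.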
In this case, most governments sought to maximise the data available to their scientists and opted for centralised systems. They had good reasons, but failed to articulate sufficiently why holding vast quantities of personal data was necessary and proportionate in the battle against COVID-19.
This compromised privacy. In the interests of preserving his company’s reputation as the safest place on the market for user data, the Silicon Valley executive had the difficult job of telling governments, after weeks of careful design, that his operating system would no longer support their software.
Desirable, feasible, viable… What about legal?
Technology programmes are highly structured affairs. Specialised teams work in defined phases to maximise productivity, avoid bias, and ultimately deliver better services to the public. Assumptions are tested early and often, and developers strive for the trinity of the desirable, the feasible, and the viable.
The UK Government had fielded a star team of renowned engineers, scientists, and ethicists. Lawyers, however, were conspicuously absent. Every day, features were reviewed against the ‘innovation criteria’ above, but there was no voice in the room to comment on how each one increased or decreased legal risk.
This was both a supply problem and a mindset problem: because there are so few technically literate lawyers, legal advice is given at the strategic level and does little more than define the ‘playing field’ for developers to keep within. But the devil is in the detail, and scientists, developers, and lawyers often reach wildly different interpretations of what is ‘necessary’ and ‘proportionate’.
Build quickly, ethically, and legally.
In government, contracts are the result of competitive tenders. Companies are under pressure to push technical and financial boundaries. Here, there was additional pressure to get to market quickly.
Software companies have evolved to thrive in this kind of environment. Self-directed, multi-disciplinary teams work in ‘sprints’. There are few checks. Drags in the system are purged by individuals whose only job is to remove friction for developers.
Ethical and legal oversight are seen as exactly that: ‘drags’ in the system. Often, this kind of supervision sits outside the build team, convening early in the project, fortnightly, or ad hoc as issues are raised.
By the time an issue is identified, code or technical specifications may have been published and features may have been expanded. Retrospective changes can be reputationally challenging or technically impossible: like rebuilding the walls of a house without touching the roof.
What this all means for you.
In five years or less, I am confident that lawyers will become an integral part of the design process to prevent exactly these mistakes from happening.
Ideally, lawyers will be embedded in technical teams, making judgements, sprint by sprint, on how design choices increase or decrease legal risk.
It will be your ability to participate generally, not to code, that will open these doors to you. Can you understand what a data scientist, developer, or product manager says to you? Can you compare the legal risks of two software stacks and articulate the difference back to them?
If this world interests you, read about ‘design systems’ and ‘agile frameworks’. Try to spot the potential points of failure in a stressed and sleep-deprived team. If you find yourself working for a start-up, ask to sit in on a ‘sprint’ and pay attention to how quickly features are approved, coded, and shipped.
Set up a GitHub account and learn to open a pull request. You don’t need to read code, but I’ve seen GDPR breaches prevented by someone spotting an issue in a project’s documentation.
Finally, if you haven’t read the GDPR, do it… now. You would be surprised how many people in this line of work go no further than basic principles and take liberties with its interpretation.
David Bentham