“Whatever is done now will have an impact on our ‘new normal’”
The coronavirus crisis in Europe is “driving a public debate about privacy, ethics and public health, and what measures are appropriate (or not) to protect it”, says Patricia Shaw of the Homo Responsibilis Initiative.
LONDON · 08 APRIL 2020 · 11:27 CET
Many European countries are reaching the peak of the Covid-19 crisis, and governments have announced special measures to fight the pandemic in areas such as telecommunications.
Many technological solutions can be useful in helping to stop the virus, believes Patricia Shaw of the Homo Responsibilis Initiative, a newly created think/action tank of Christian experts and thought leaders in the fields of Artificial Intelligence, data analytics and digitalisation. But there is a need to carefully analyse the impact these special measures will have on the daily life of citizens.
“This is a time for critical thinking, discernment and acting wisely”, the United Kingdom-based consultant in technology ethics told Evangelical Focus. A time to “demand greater transparency and oversight about the data and algorithmic systems being used”.
Question. How could this Covid-19 crisis be wrongly used by governments or businesses to implement further technological control over the population? Do you know of specific cases happening in Europe?
Answer. Firstly, we need to be clear about which proposed technological measures could have human rights, privacy, data protection and ethical implications for people: (1) apps to self-declare symptoms of Covid-19, (2) nationally organised programmes of broad, sweeping and regular Covid-19 testing of all citizens, (3) contact tracing via Bluetooth exchanges between mobile phones, (4) geolocation data tracking based on mobile phone movements, (5) algorithmically generated QR codes to RAG (red/amber/green) rate personal autonomy of movement, such as that seen in China with Alipay. Each has its merits for different purposes and different cultural contexts.
In Europe, anonymised, aggregated geolocation data (such as that obtained from telecoms providers) has been called for by most governments. In Germany such data is being openly reviewed by academic researchers, like those at the Robert Koch Institut, to provide insight to decision makers. Technically this data is not “personal data” under GDPR because it has been anonymised, but that does not mean it has no privacy or ethical implications.
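To illustrate what “anonymised, aggregated” can mean in practice, here is a minimal Python sketch of one common approach: reducing location pings to coarse counts per area and time window, and suppressing small groups. The record format, region granularity and the suppression threshold are assumptions for the example only, not any telecoms provider's or the Robert Koch Institut's actual pipeline.

```python
def aggregate_pings(pings, min_count=10):
    """Count distinct phones per (region, hour), dropping small groups.

    `pings` is assumed to be an iterable of dicts such as
    {"device_id": "abc123", "region": "Berlin-Mitte", "hour": "2020-04-06T14"}.
    Device identifiers are discarded from the output; only coarse counts
    survive, and any (region, hour) bucket with fewer than `min_count`
    distinct devices is suppressed so individuals cannot be singled out.
    """
    devices_per_bucket = {}
    for p in pings:
        bucket = (p["region"], p["hour"])
        devices_per_bucket.setdefault(bucket, set()).add(p["device_id"])
    return {
        bucket: len(devices)
        for bucket, devices in devices_per_bucket.items()
        if len(devices) >= min_count
    }
```

Even with such safeguards, as Shaw notes, the resulting movement patterns can still carry privacy and ethical weight, which is why oversight of how the aggregates are used remains important.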
Contact tracing apps are currently being proposed by both Germany and England. The approach has a proven track record based on its roll-out in Singapore. Although this measure offers further assurance that personal data is not being used, because each user is given a randomly generated ID that remains anonymous to their contacts, it too can have ethical implications.
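To make the “randomly generated ID that remains anonymous to contacts” idea concrete, the sketch below shows how such rotating identifiers might work in principle. It is loosely modelled on the general concept behind Bluetooth contact tracing, not the actual protocol used in Singapore or proposed elsewhere; the rotation interval, class and method names are assumptions.

```python
import secrets
import time

# Assumed rotation interval for the example: a fresh anonymous ID every 15 minutes.
ROTATION_SECONDS = 15 * 60

class ContactTracer:
    def __init__(self):
        self._current_id = None
        self._issued_at = 0.0
        self.my_ids = []   # IDs this phone has broadcast (linkable to its owner only locally)
        self.seen = []     # (timestamp, anonymous ID) pairs observed over Bluetooth

    def current_broadcast_id(self, now=None):
        """Return the anonymous ID to broadcast, rotating it periodically."""
        now = time.time() if now is None else now
        if self._current_id is None or now - self._issued_at >= ROTATION_SECONDS:
            self._current_id = secrets.token_hex(16)  # random, not derived from identity
            self._issued_at = now
            self.my_ids.append(self._current_id)
        return self._current_id

    def record_contact(self, observed_id, now=None):
        """Log an anonymous ID seen nearby; by itself it reveals nothing about who it is."""
        self.seen.append((time.time() if now is None else now, observed_id))

    def matches(self, infected_ids):
        """If IDs of confirmed cases are later published, check locally for encounters."""
        infected = set(infected_ids)
        return [entry for entry in self.seen if entry[1] in infected]
```

The point of the design is that the IDs exchanged between phones carry no personal information; linking an ID back to a person requires additional data held elsewhere, which is exactly where the ethical questions Shaw raises come in.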
So, what ethical implications are we talking about here? Ones of:
- Autonomy in freedom of movement,
- Agency in who gets to decide where and when you have freedom of movement,
- Privacy in who can see where you are and when, or who you are in contact with and when you are in contact with them - obviously all of these only apply when you actually have the ability to move beyond staying at home,
- Data sharing in who (other than the app provider that collected the data) gets to see the data. There is a need for greater transparency about what data is being collected by an app or telecoms provider, and about what the app (or, more pertinently, the algorithmic systems behind it) is doing with the data collected and the purpose, use and storage of that data; and, last but not least, there is a need to understand the accuracy and potential impacts of the app or measure, including potential bias and discrimination. We need to know that these things are being considered and that organisations seeking to protect our “public health” also have due regard for them,
- Safeguards for use of data/Apps/tools beyond the present crisis.
Q. Could the crisis be an "accelerator" for governmental initiatives (in collaboration with telecoms and other private companies) that were planned for the future, but are being tested now?
A. This crisis is most definitely an accelerator for trialling and rolling out greater connectivity, more infrastructure, better security, wider accessibility, broadening interoperability standards, pushing for greater data sharing, and strengthening the case for more widespread digital identification of citizens. It is also driving a more public debate about privacy, ethics and public health, and what measures are appropriate (or not) to protect it. It really brings into question whether the connectivity and data infrastructure we use freely on a day-to-day basis should remain in the hands of private sector actors; isn’t it, after all, a public utility?
Whether these measures were all current or longer-term planned governmental initiatives is a matter of politics and public resource planning – that will vary from sector to sector and country to country. However, what is clear is that decision makers want and need more information to help them make informed decisions; they need a fuller understanding of Covid-19, how it spreads, and the impacts and consequences (intended and unintended) of their interventions.
Many of the technological changes we will now see proposed and/or enacted would be in line with Smart Data/Data Economy/Open Data initiatives, the Digital Single Market strategy, and the European strategy for a Europe Fit for the Digital Age. The urgency of the crisis has demanded that much of this be brought forward.
Q. What is a sound balance between a) being aware of the potential dangers of new uses of technologies and b) a positive and constructive engagement with these realities, from a Christian perspective?
A. It is vitally important not to be ignorant of (i) the data being collected about you (whether or not it is anonymised, pseudonymised or classified as personal data), (ii) by whom, (iii) for what purpose it is being used, (iv) how the algorithmic system is collecting, collating, analysing and using it, (v) who the data is being shared with, and (vi) where in the world it is being stored.
This is a time for critical thinking, discernment and acting wisely. This is a time to demand greater transparency and governance and oversight about the data and algorithmic systems being used.
Nothing is straightforward. The use of algorithmic systems and big data is complex. Our response should not be one born out of fear. Instead we need to actively seek to understand the proposed solutions and how they impact not only Christians but the lives, data and privacy of all humanity. We (as Christians) should be part of finding solutions to the current pandemic that work for all.
In seeking to protect public health, we are also loving our neighbour. Whether or not we engage with the apps and tech tools being presented to us is a matter of personal choice, which will require wisdom.
We need to be mindful of the “digitally excluded” in all this. Not everyone has access to the internet or is able to use a smart or Bluetooth-enabled phone. When evaluating the solutions presented to us, we need to weigh up the risks and benefits. We need to consider how the intervention will resolve the current pandemic and eradicate the virus, and whether the outcome could be achieved another way. We need to know about the safeguards that have been put in place for after the crisis is over, such as automatic deletion of the data and insights collected from you.
What is clear is that whatever is done now will have an impact on our “new normal”.
For more information, read here the insights of another expert on the issue of technology and the rights of citizens, Jonathan Ebsworth.