Q3/2020 - World Economic Forum Davos (WEF)

WEF white paper on an „Internet of Bodies“ (IoB), 3 August 2020

A new white paper, „Shaping the Future of the Internet of Bodies: New challenges of technology governance“, produced by the WEF together with McGill University in Montreal, addresses the still little-discussed topic of an „Internet of Bodies“ (IoB). More and more sensors and miniature digital devices are being implanted into human bodies and connected to global networks, without the consequences of these developments having been sufficiently examined so far. This generates an enormous quantity of biometric data on human behaviour whose future uses are as yet entirely unclear, but which could also be of considerable benefit for fighting diseases and improving quality of life. At the same time, fundamental political, ethical and philosophical questions are raised, such as the protection of private life, personal autonomy, and risks of discrimination in interpersonal as well as societal and economic spheres (work, education, health, insurance etc.)[1]. The study offers four recommendations for further action:

  1. Building a robust and consistent system of governance around the internet of bodies
  2. Addressing the outcomes of data inferences and analytics in data protection
  3. Building up a repertoire of privacy-enhancing technology, and developing a framework of decision-making
  4. Supporting data subjects and experimenting with the solidarity approach[2]

New date for WEF 2021, July 2020

The regular annual meeting of the World Economic Forum, originally planned for January 2021 in Davos, has been postponed to 18–21 May 2021 and is to take place in Lucerne[3].

More on the topic
  1. [1] Shaping the Future of the Internet of Bodies: New challenges of technology governance, McGill University & World Economic Forum, 6. August 2020: „The internet of things (IoT) is increasingly entangling with human bodies. This emergence and fast expansion of the “internet of bodies” (IoB) – the network of human bodies and data through connected sensors – while offering enormous social and health benefits, also raises new challenges of technology governance. With an unprecedented number of sensors attached to, implanted within or ingested into human bodies to monitor, analyse and even modify human bodies and behaviour, immediate actions are needed to address the ethical and legal considerations that come with the IoB. The urgency of such actions is further brought to the forefront by the global COVID-19 pandemic, with extensive IoB technologies and data being enlisted for the surveillance and tracking of coronavirus. This white paper comprises two parts. Part one provides a landscape review of IoB technologies, as well as their benefits and risks. An examination of the ecosystem shows that IoB technologies are deployed not only in medical scenarios but also across different sectors, from fitness and health management to employment settings and entertainment. The accelerating convergence of consumer devices and health/medical devices also shows that the line between medical and non-medical IoB devices is blurring. This suggests that new strategies of governance are needed for IoB devices, which are traditionally subject to different regulatory agencies and rules. It is worth noting that this white paper will not delve into gaming and virtual reality (VR) devices nor the data from them. While related, these devices raise distinct issues from the more traditional health and fitness devices.
Part two examines the governance of IoB data – focusing, in particular, on the regulatory landscape in the United States, with a comparative perspective of regulation in the European Union. This part examines current regulatory approaches to IoB data, as well as the challenges raised by the rapidly shifting ecosystem, especially the wide adoption of big data algorithms. Whereas IoB technologies also entail other issues such as the physical effects of devices on users and liability for physical harms, this paper focuses only on the governance of data generated from IoB, particularly from health and wellness IoB devices. Two main findings for policy-makers and stakeholders are highlighted. First, broad adoption of the IoB and frequent flows of IoB data across scenarios and sectors requires robust and consistent governance frameworks in both the medical and non-medical sectors. This is particularly the case for IoB data governance as, while clinically derived data is in general strictly regulated, the regulation of consumer-generated data and other non-clinical data is often, given the sensitivity of the data, uneven in terms of coverage and strength across sectors and jurisdictions; this is the case in, for example, the United States. Second, IoB data governance approaches and data protection laws need urgent updates to address the risk of privacy, unfairness and discrimination brought about by common practices of big data analytics. This risk presented by big data analytics exists with both medical data and non-medical data, as even deidentified medical data can be reidentified or misused in a way that causes harm and discrimination to individuals and groups. We therefore urge stakeholders from across sectors, industries and geographies to work together to mitigate the risks in order to fully unleash the potential of the IoB.“ In: https://www.weforum.org/reports/the-internet-of-bodies-is-here-tackling-new-challenges-of-technology-governance
  2. [2] Shaping the Future of the Internet of Bodies: New challenges of technology governance, McGill University & World Economic Forum, 6. August 2020: „To tackle these new challenges of technology governance related to the IoB, multistakeholder action is urgently needed. The following section outlines a menu of possible approaches, from regulatory to technological, to help mitigate these risks in order to fully unleash the potential of the IoB. 1. Building a robust and consistent system of governance around the internet of bodies – As the internet of things is increasingly evolving to be connected with human bodies, a robust and consistent system of governance is needed to address the risks of the expanding IoB. This means that, for example, in the US context, a new governance strategy should be formed across the conventional division of medical and non-medical fields to address the broad dynamics of IoB technologies and data. Some experts suggest combining the powers of the FDA and FTC, along with the Consumer Product Safety Commission and the Consumer Financial Protection Bureau. Senators Amy Klobuchar (D-MN) and Lisa Murkowski (R-AK) introduced the Protecting Personal Health Data Act in 2019, and proposed the establishment of a national task force entrusted with protecting health data. The governance of the IoB, as in the case of the IoT, relies on not only policy-makers and regulators but also trade groups, industrial associations, patient groups, users and citizens, civil society and other forms of multistakeholder cooperation, as IoB technologies are germane to the protection of fundamental human rights in a connected world. 2. Addressing the outcomes of data inferences and analytics in data protection – As discussed above, current data protection regulations mostly focus on ensuring that data is lawfully obtained, and that its processing meets the requirements of lawful grounds. 
But in general, extant regulations fail to address the risks of the outcomes of algorithmic deployment in all cases. To put it in a simple way, they concern mainly the input data, rather than the new data generated from the algorithm. AI and data regulation should address risks of privacy and discrimination in data inferences and algorithm analytics. Data protection experts have demonstrated that advances in big data analytics demand new protections for group privacy, addressing privacy interests of ad hoc groups formed by algorithmic classification. Different from the conventional concept of a group, algorithmically generated groups are characterized by a highly dynamic instead of a stable membership, and individuals clustered in a group may not even be aware of their membership. It therefore remains an urgent task to address the collective data rights of algorithm-generated groups, which are not equivalent to, or encompassed by, individual privacy. Sandra Wachter and Brent Mittelstadt advocate “a right to reasonable inferences” to address the accountability gap posed by “high-risk inferences” and the risk of “discrimination by association”. Wachter points out that to effectively address the risks of inferences, a robust data protection law should be supplemented with agile sectorial laws, especially in high-risk areas such as finance, employment and criminal justice. This requires a thorough re-examination of sectorial laws to make sure that they are updated to address the risk of algorithmic decision-making, as most anti-discrimination laws focus on preventing discrimination in human decisions but fall short of addressing the opacity and unpredictability of algorithms. 3. 
Building up a repertoire of privacy-enhancing technology, and developing a framework of decision-making – A broad range of technology solutions or methods have emerged to achieve specific privacy or data protection functionality, which include encryption, metadata and digital rights management, application programming, system development governance, identity management etc. Besides the well-known methods of deidentification and pseudonymization, synthetic data is another approach to depersonalizing data. Synthetic data is “fake” data that has the same statistical properties as real data, and can be used as a proxy for real data in AI and machine learning, software testing and other purposes. While deidentification, pseudonymization and synthetic data focus on the transformation of data, other technologies approach data protection through the control of data. Recently, a group of epidemiologists and data scientists in the UK carried out a study of COVID-19-related deaths among various groups of people and, instead of extracting the sensitive medical records of 17 million people from databases, developed software to run the analysis directly on the data. This approach of sharing and running analysis over sensitive data allows the control of data without moving it or giving it away. As privacy compliance should be considered as a spectrum of risks, the specific choice of privacy-enhancing technology is often considered along with other factors such as data utility and operational cost. A framework for decision-making can help optimize the solution in each case to protect the privacy of the individual’s data. 4. Supporting data subjects and experimenting with the solidarity approach – Responsible use of technology should respect human rights and ethics. In order to fully realize the social benefits of IoB technology and data, users should be empowered with the legal rights of a data subject and a supporting system to execute those rights. 
This requires a clearer definition of data ownership and better control of users’ own data. Users will be supported with the knowledge of how their data is used, and the ability to access and correct their information, including the means to address unfair inferences and analytics. In response to social sorting and stratification powered by big-data analysis, some experts advocate a solidarity approach to health data governance (one focused on societal and community good). This shifts the focus to the shared societal benefits and responsibilities, which motivates people to share data for the collective and individual good. Biobanks are a good example for sharing biological data. The solidarity approach of data governance treats data contributors as partners, and this involves explicit acknowledgement of the types of research that the database supports, and easy access for community members to research findings. Data subjects should also be informed of the potential risks associated with the participation.” In: https://www.weforum.org/reports/the-internet-of-bodies-is-here-tackling-new-challenges-of-technology-governance
  3. [3] Annual Meeting 2021 to take place in Lucerne-Bürgenstock, in: https://www.weforum.org/press/2020/10/annual-meeting-2021-to-take-place-in-lucerne-burgenstock