September 7th
14:00-14:15 | Welcome, introductions and opening remarks |
14:15-15:30 | Session 1: Transparency & Intervenability. Chair: Isabel Wagner |

- TIRA: An OpenAPI Extension and Toolbox for GDPR Transparency in RESTful Architectures, by Elias Grünewald, Paul Wille, Frank Pallas, Maria C. Borges and Max-R. Ulbricht. Transparency – the provision of information about what personal data is collected for which purposes, how long it is stored, or to which parties it is transferred – is one of the core privacy principles underlying regulations such as the GDPR. Technical approaches for implementing transparency in practice are, however, only rarely considered. In this paper, we present a novel approach for doing so in current, RESTful application architectures and in line with prevailing agile and DevOps-driven practices. For this purpose, we introduce 1) a transparency-focused extension of OpenAPI specifications that allows individual service descriptions to be enriched with transparency-related annotations in a bottom-up fashion and 2) a set of higher-order tools for aggregating respective information across multiple, interdependent services and for coherently integrating our approach into automated CI/CD pipelines. Together, these building blocks pave the way for providing transparency information that is more specific and at the same time better reflects the actual implementation givens within complex service architectures than current, overly broad privacy statements. (A sketch of what such an annotation and aggregation step could look like follows this session listing.)

- How do I opt out? Do Not Sell Mechanisms under CCPA, by Sean O’Connor, Aden Siebel and Eleanor Birrell. The California Consumer Privacy Act (CCPA), which began enforcement on July 1, 2020, grants California users the affirmative right to opt out of the sale of their personal information. In this work, we perform a series of observational studies to understand how websites implement this right. We perform two manual analyses of the top 500 U.S. websites (one conducted in July 2020 and a second conducted in January 2021) and classify how each site implements this new requirement. We also perform an automated analysis of the top 5,000 U.S. websites. We find that the vast majority of sites that implement opt-out mechanisms do so with a Do Not Sell link rather than with a privacy banner, and that many of the linked opt-out controls exhibit features such as nudging and indirect mechanisms (e.g., fillable forms). Our results demonstrate the importance of regulations that provide clear implementation requirements in order to empower users to exercise their privacy rights.

- Consent as a Service: Auditability and Transparency, industry talk by Sridhar Reddy Maddireddy and Engin Bozdag (Uber). Regulations such as the GDPR require consent to be auditable, documented, specific and granular. In this talk, we discuss the implementation of a consent service for a large company with millions of users and thousands of microservices. We will discuss challenges around onboarding (APIs and app templates), auditability, security, flexibility around regional deployment, increased transparency for end users, metrics and more.
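For readers unfamiliar with OpenAPI vendor extensions, the following is a minimal sketch of what a transparency annotation and a cross-service aggregation step could look like. The "x-transparency" key and all field names are invented for illustration; this is not the TIRA schema or toolbox.

```python
# Minimal sketch, not the TIRA implementation: an OpenAPI operation carrying
# a hypothetical "x-transparency" vendor extension, plus a toy aggregator
# that collects such annotations across several service specifications.
# All field names below are illustrative assumptions.

from typing import Any, Dict, List

order_service_spec: Dict[str, Any] = {
    "openapi": "3.0.3",
    "paths": {
        "/orders": {
            "post": {
                "operationId": "createOrder",
                "x-transparency": {                 # hypothetical extension key
                    "dataCategories": ["name", "shippingAddress"],
                    "purposes": ["order fulfilment"],
                    "storagePeriod": "P2Y",         # ISO 8601 duration
                    "recipients": ["payment-service"],
                },
            }
        }
    },
}

def aggregate_transparency(specs: List[Dict[str, Any]]) -> List[Dict[str, Any]]:
    """Collect per-operation transparency annotations across service specs."""
    collected: List[Dict[str, Any]] = []
    for spec in specs:
        for path, operations in spec.get("paths", {}).items():
            for method, operation in operations.items():
                annotation = operation.get("x-transparency")
                if annotation:
                    collected.append({"path": path, "method": method, **annotation})
    return collected

if __name__ == "__main__":
    # In a CI/CD pipeline this would run over every service's specification
    # and feed a machine-readable transparency report.
    for entry in aggregate_transparency([order_service_spec]):
        print(entry)
```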
15:30-15:45 | Coffee Break |
15:45-17:00 | Session 2: Privacy risks. Chair: Isabel Barberá |

- Privacy Considerations for Risk-Based Authentication Systems, by Stephan Wiefling, Jan Tolsdorf and Luigi Lo Iacono. Risk-based authentication (RBA) extends authentication mechanisms to make them more robust against account takeover attacks, such as those using stolen passwords. RBA is recommended by NIST and NCSC to strengthen password-based authentication, and is already used by major online services. Also, users consider RBA to be more usable than two-factor authentication and just as secure. However, users currently obtain RBA’s high security and usability benefits at the cost of exposing potentially sensitive personal data (e.g., IP address or browser information). This conflicts with user privacy and requires consideration of user rights regarding the processing of personal data. We outline potential privacy challenges regarding different attacker models and propose improvements to balance privacy in RBA systems. To estimate the properties of the privacy-preserving RBA enhancements in practical environments, we evaluated a subset of them with long-term data from 780 users of a real-world online service. Our results show the potential to increase privacy in RBA solutions. However, this potential is limited to certain parameters, which should guide RBA design to protect privacy. We outline research directions that need to be considered to achieve widespread adoption of privacy-preserving RBA with high user acceptance.

- Towards Explaining Epsilon: A Worst-Case Study of Differential Privacy Risks, by Luise Mehner, Saskia Nuñez von Voigt and Florian Tschorsch. Differential privacy is a concept to quantify the disclosure of private information that is controlled by the privacy parameter ε. However, an intuitive interpretation of ε is needed to explain the privacy loss to data engineers and data subjects. In this paper, we conduct a worst-case study of differential privacy risks. We generalize an existing model and reduce complexity to provide more understandable statements on the privacy loss. To this end, we analyze the impact of parameters and introduce the notions of a global privacy risk and a global privacy leak. (A worked example of the standard worst-case reading of ε follows this session listing.)

- In The Right Place at The Right Time: Challenges of Location Privacy in Location Data Platforms, industry talk by Aleksandra Kovacevic and Stefano Bennati (HERE). Location carries rich contextual information that enables powerful services in an increasingly autonomous world. A platform where location data is collected from and shared across various vehicles and devices expands location-based use cases and improves their accuracy and robustness. At the same time, due to its spatio-temporal, sequential nature, location data is particularly privacy-revealing. This talk aims to illuminate the challenges and bring some clarity to estimating privacy risks when processing location data. We present real-world examples of privacy risks associated with releasing location data, which reveal that the privacy risk depends on the context of the location and the purpose of its processing. The richness of location data poses a challenge but also an opportunity for protecting location privacy while maintaining its value for services. We will present some possibilities for striking this balance between privacy protection and utility preservation of location data.
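As background for this session, here is the standard worst-case reading of ε (a sketch, not necessarily the exact model generalized in the paper): for an ε-differentially private mechanism M, neighbouring databases D and D′ that differ in one individual, and an adversary holding a uniform prior over the two,

```latex
% Standard worst-case reading of epsilon; a sketch, not the paper's model.
\[
  \frac{\Pr[M(D) \in S]}{\Pr[M(D') \in S]} \;\le\; e^{\varepsilon}
  \quad\Longrightarrow\quad
  \Pr[\,D \mid \text{observed output}\,] \;\le\; \frac{e^{\varepsilon}}{1 + e^{\varepsilon}}.
\]
```

Plugging in values: ε = 0.1 caps the adversary's posterior at about 0.525 (close to a coin flip), ε = 1 at about 0.731, and ε = 5 at about 0.993, which is one way of communicating the privacy loss behind a given ε to data engineers and data subjects.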
17:00-17:15 | Coffee Break |
17:15-18:30 | Session 3: Privacy by design. Chair: Kim Wuyts |

- A Model-based Approach to Realize Privacy and Data Protection by Design, by Gabriel Pedroza, Victor Muntés-Mulero, Yod Samuel Martín and Guillaume Mockly. Telecommunications and data are pervasive in almost every aspect of our everyday life, and new concerns progressively arise as a result of the stakes related to privacy and data protection. Indeed, systems development is becoming data-centric, leading to an ecosystem where a variety of players intervene (citizens, industry, regulators) and where the policies regarding data usage and utilization are far from consensual. The General Data Protection Regulation (GDPR), which has applied across the European Union since 2018, has introduced new provisions, including principles of lawfulness, fairness and transparency, thus endowing data subjects with new rights with regard to their personal data. In this context, a growing need for approaches that conceptualize and help engineers to integrate GDPR and privacy provisions at design time becomes paramount. This paper presents a comprehensive approach to support different phases of the design process, with special attention to the integration of privacy and data protection principles. Among others, it is a generic model-based approach that can be specialized according to the specifics of different application domains.

- Quantitative Privacy Risk Analysis, by R. Jason Cronk and Stuart Shapiro. Most privacy risk assessment methodologies are homegrown and qualitative. Numerical models generally involve largely arbitrary quantifications. FAIR, a quantitative risk model for information security-related risks, can be modified for privacy, providing more meaningful measurements and supporting comparison of the risks of similar scenarios with varying controls against organizational tolerances. (A rough sketch of such a quantification follows this session listing.)

- Privacy Threat Modeling, industry talk by Engin Bozdag (Uber). Privacy threat modeling methodologies such as LINDDUN involve the systematic identification, elicitation, and analysis of privacy- and/or security-related threats in the context of a specific system. However, applying LINDDUN in a microservice environment with millions of users’ data is not an easy task. In this talk, we show how we have implemented threat modeling at Uber and some of the challenges in identifying gaps and tracking/verifying mitigations.
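As a rough illustration of the kind of quantification the second paper argues for, the sketch below applies a FAIR-style Monte Carlo estimate to a privacy scenario. The three-point estimates and the distribution choice are invented for illustration and are not the authors' model.

```python
# FAIR-style Monte Carlo sketch for a privacy risk scenario (illustrative
# numbers only): sample loss event frequency and per-event loss magnitude
# from three-point (low / most likely / high) estimates and combine them
# into an annualized loss distribution.

import random

FREQUENCY = (0.5, 2.0, 6.0)           # privacy loss events per year
MAGNITUDE = (5_000, 40_000, 250_000)  # loss per event, in EUR

def draw(rng: random.Random, low: float, mode: float, high: float) -> float:
    # random.triangular takes its arguments in the order (low, high, mode)
    return rng.triangular(low, high, mode)

def simulate_annual_loss(runs: int = 10_000, seed: int = 0) -> dict:
    rng = random.Random(seed)
    losses = sorted(draw(rng, *FREQUENCY) * draw(rng, *MAGNITUDE) for _ in range(runs))
    return {"median": losses[runs // 2], "90th percentile": losses[int(runs * 0.9)]}

if __name__ == "__main__":
    # Re-running with estimates for an added control and comparing the two
    # distributions against an organizational tolerance is the comparison
    # of scenarios the abstract refers to.
    print(simulate_annual_loss())
```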
18:30-18:45 | Coffee Break |
18:45-20:00 | Session 4: Data protection and governance. Chair: Jose M. del Alamo |

- An Overview of Runtime Data Protection Enforcement Approaches, by Laurens Sion, Dimitri Van Landuyt and Wouter Joosen. A regulatory framework such as the GDPR succeeds in (i) providing clarity about the nature and the reach of fundamental rights to data privacy and the sovereign role of the data subject, (ii) raising broader awareness of the substantial impact of large-scale, contemporary software-intensive data processing operations on these rights and freedoms, and (iii) creating urgency and imposing gravity, by forcing organizations to take these rights and fundamental principles seriously in a proactive manner. However, regulatory frameworks lack clarity on how these concerns are to be enacted. For example, guidance is lacking on how software should be constructed to consider these data protection principles by design and by default. In this paper, we argue how the direct translation of the GDPR data protection principles into design or code falls short in the context of contemporary software systems, which are both more dynamic in nature and rely on an increasing number of complex inter-organizational collaborations. This means that in such a system, data protection decisions cannot be ‘hard-coded’ but will have to be decided at run time. In addition, we provide an overview of promising existing approaches that contribute to the accomplishment of these fundamental data protection principles at run time. (A toy illustration of such a run-time decision follows this session listing.)

- How Standards Co-Shape Personal Data Protection in the European Banking Sector, by Ine van Zeeland and Jo Pierson. Attention to the protection of personal data has increased in recent years, but the preponderance of privacy-related scandals shows there is still a clear need for practical guidance. Companies and industries often argue for guidance in the form of standards, which are thought to have several benefits, such as creating a level playing field internationally, promoting interoperability, and allowing for straightforward conformity assessments. Conversely, standards may create confusion as to what “the rules” are if they are not fully aligned with laws, for instance when international standards and national laws govern the same practices. Given these pros and cons, to what extent do standards provide clarity and practical support in the protection of personal data? This paper explores that question through an evaluation of findings on standards in an ethnographic study of the European banking sector. The purpose of the study was to find factors that influence the protection of personal data “on the ground”, with standards potentially being one of those factors.

- Breaking Silos in Privacy with Incident Response, industry talk by Sri Maddipati (Google). The implementation of privacy programs is maturing, and privacy is no longer just a compliance issue. It forces organizations to focus on building user trust and weaving privacy into every business function. With the growing privacy integrations within an organization, silos emerge, limiting the ability to have a holistic view of the policy/process gaps and key risks. Incident response (IR) is one of the key pillars of a strong privacy program, allowing companies to detect, respond to, and mitigate incidents and to communicate lessons learned to business and privacy teams. This helps address privacy issues in a cross-functional manner. IR can be leveraged as an effective tool to break silos irrespective of the privacy governance model. In this talk, I will discuss the challenges posed by siloed business functions and explore ways to build a unified vision, view compliance risks holistically, and develop and implement remediation plans.
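To make the first paper's point about run-time decisions concrete, the toy sketch below (not one of the approaches surveyed) checks each processing request against the purposes the data subject currently consents to, instead of hard-coding the decision, so a consent withdrawal takes effect on the very next request.

```python
# Toy illustration, not one of the approaches surveyed in the paper: the
# decision whether a processing purpose is allowed is taken at run time,
# against the data subject's current consent state, rather than hard-coded.

from dataclasses import dataclass, field
from typing import Dict, Set

@dataclass
class ConsentRecord:
    subject_id: str
    allowed_purposes: Set[str] = field(default_factory=set)

class RuntimePolicyEnforcer:
    def __init__(self) -> None:
        self._consent: Dict[str, ConsentRecord] = {}

    def record_consent(self, subject_id: str, purpose: str) -> None:
        record = self._consent.setdefault(subject_id, ConsentRecord(subject_id))
        record.allowed_purposes.add(purpose)

    def withdraw_consent(self, subject_id: str, purpose: str) -> None:
        record = self._consent.get(subject_id)
        if record:
            record.allowed_purposes.discard(purpose)

    def may_process(self, subject_id: str, purpose: str) -> bool:
        record = self._consent.get(subject_id)
        return record is not None and purpose in record.allowed_purposes

# The same request is allowed before a withdrawal and denied after it.
enforcer = RuntimePolicyEnforcer()
enforcer.record_consent("alice", "marketing")
assert enforcer.may_process("alice", "marketing")
enforcer.withdraw_consent("alice", "marketing")
assert not enforcer.may_process("alice", "marketing")
```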
20:00 | Closing remarks |