Stanford Seminar - How Can Privacy Exist in a Data-Driven World?

Privacy in Photography

  • Blase discusses privacy in the context of photography, using the photography book "The Transparent City" as an example.
  • Students have varying opinions on whether the photos constitute privacy violations, leading to discussions about consent, context, and technology's role in privacy.

History of Privacy Rights

  • Blase traces the history of the right to privacy, referencing the work of Samuel Warren and Louis Brandeis in 1890 and subsequent refinements by scholars and legal experts.
  • The complexity of privacy and the need for concrete privacy principles when building interactive systems are emphasized.

Privacy Principles and Regulations

  • The US Federal Trade Commission's Fair Information Practice Principles (FIPPs) are mentioned as an example of privacy principles, covering notice, choice, access, integrity/security, and enforcement.
  • The European Union's General Data Protection Regulation (GDPR) introduced several privacy rights for individuals, including the right to request data erasure, object to certain data processing, be notified of data breaches, and access information collected about them.
  • In the United States, privacy laws are typically sector-specific, with different laws governing children's data, financial data, educational data, health data, and video rental records.
  • California has taken a leading role in US privacy legislation with the California Consumer Privacy Act (CCPA) and the California Privacy Rights Act (CPRA), which provide Californians with rights similar to those under GDPR.

Privacy Concerns in Data-Driven Fields

  • The increasing reliance on data in fields like machine learning and large language models raises concerns about privacy, as these models require vast amounts of personal data to function.
  • The notion that "data is the new oil" highlights data's dual nature: like oil, it can drive innovation but also lead to economic marginalization and geopolitical conflict.

Privacy in the Digital Age

  • The speaker challenges the claim that "privacy is dead" in the context of social media and argues that privacy remains a relevant concern in the digital age.
  • A user study was conducted to examine people's reactions to hyper-targeted ads based on their personal information.
  • Existing privacy transparency tools were found to be lacking in providing meaningful information to users.
  • The speaker emphasizes the need for data-driven methods to provide transparency about online tracking and privacy.

Data Access and Transparency

  • Data subject access rights, such as those under GDPR and CCPA, allow individuals to request a copy of their personal data from companies.
  • An analysis of Twitter's ad targeting revealed that users are being targeted based on specific behaviors, interests, and even tailored audiences created by advertisers.
  • A co-design study was conducted to explore how data access rights could be made more meaningful for privacy.
  • Participants in a study had difficulty accessing and understanding their personal data provided by companies.
  • The "Data Access Illuminator" tool is being developed to help consumers understand the scope and implications of their personal data.

Art and Privacy Provocations

  • Art and intentional provocations are being explored as ways to engage people with privacy issues and reframe the narrative of how they interact with their data.
  • Collaborations between art students and computer science students have resulted in thought-provoking artworks that reflect on privacy in the digital and physical worlds.
  • Privacy provocations are being investigated as a way to modulate people's reactions to their own data and create desired user experiences.

Other Research Projects

  • The speaker highlights other research projects, including Retrograde, a JupyterLab extension that provides contextual nudges about potential fairness and bias issues when working with data in Jupyter notebooks.
  • They discuss their work on password modeling and security, emphasizing the risks of reusing passwords and the prevalence of password breaches.
  • The speaker mentions their interest in end-user programming, using formal models to help people program, and exploring large language models as a form of end-user programming.

Societal Implications of Large Language Models

  • During the Q&A, an audience member who is a professor of political science at Stanford raises the societal implications of large language models and the need for data privacy.
  • The speaker suggests that individuals should consider the kind of society they want to live in and use a combination of government regulation, non-governmental organizations, and technical design to embed societal values in these systems.
  • The speaker acknowledges the time and effort required to maintain privacy and suggests that transparency alone may not be sufficient.
  • The speaker proposes the idea of "provocative transparency" to raise awareness and encourage individuals to care about their privacy.

Privacy Trade-offs and Ongoing Research

  • The speaker highlights the trade-offs between privacy and functionality, citing the example of the Brave browser and its impact on website functionality.
  • Students' ongoing research on automatically "unbreaking" websites broken by privacy settings is mentioned.
  • The speaker suggests the possibility of personalizing large language models locally to reduce the need for data to flow to the cloud, thereby enhancing privacy.
