Student’s Corner

This is a special section of our website dedicated to showcasing the analytical skills and insightful perspectives of law students with a keen interest in privacy law. Here, you’ll find a diverse collection of articles, essays, case analyses, and commentary written by students from various law schools. Our goal is to provide a platform for aspiring legal professionals to share their thoughts on privacy law, enhance their writing skills, and engage with a broader audience.

Why Contribute?

  • Share Your Insights: Whether you’re passionate about data protection, cyber security, or privacy regulations, the Student’s Corner is your space to express your views.
  • Build Your Legal Portfolio: Published work on our site can enhance your legal writing portfolio, providing valuable exposure and experience in the field of privacy law.
  • Engage with the Legal Community: Join a community of like-minded peers and receive feedback from readers and fellow legal enthusiasts.


The call for student submissions is currently open.

For more information, or to volunteer, email the Privacy Law Section Publications at privacypublications.cla@gmail.com.

STUDENT ARTICLES

Words Matter: Addressing Bias in Digital Discourse

By: Kaavya Shanmugam
3L, Santa Clara University School of Law

Language is often perceived as a neutral medium, yet it embeds historical and societal biases that significantly influence legal interpretation and outcomes. Much like society, language evolves over time, reflecting historical events, power dynamics, and cultural attitudes. We now see privacy legal terminology adopting and reinforcing these biases.

Language is not neutral because it is deeply intertwined with the history and social structure of the societies in which it is used. Moreover, dominant groups often shape the language used to describe and define other groups, which can marginalize or misrepresent those groups’ experiences and identities. For example, in criminal law, the use of the term “black market” to describe illegal trade has been questioned for its potential to reinforce negative associations with blackness. Another example is the use of “black codes” to describe the post-Civil War laws that restricted African Americans’ rights. It is important to recognize how seemingly neutral terms can carry unintended connotations and reinforce biases in the legal sphere.

Similarly, the technology industry has grappled with biased terminology, as seen in the recent shift from “dark patterns” to “deceptive patterns” by the person who coined the term, Dr. Harry Brignull. Dr. Brignull even rebranded his website from darkpatterns.org to deceptive.design. While the intent was to highlight deceptive practices, the choice of the word “dark” carries unintended implications. In many cultures, darkness is associated with negativity, danger, or evil, which can unintentionally reinforce harmful stereotypes and racial biases by equating darkness with undesirable traits. Moreover, the term lacks neutrality, implying a moral judgment that may get in the way of objective discussions about these design tactics. It also raises concerns about cultural sensitivity, as darkness holds different connotations across societies. The current shift reflects a growing consensus toward more precise terminology for describing deceptive user interface practices, while highlighting the widening gap between legal language and industry standards.

It’s crucial to note that this terminology is still prevalent in current legislation. In fact, only three state privacy laws explicitly address these deceptive practices, and all of them continue to use the term “dark patterns.” These include the California Privacy Rights Act (CPRA), the Colorado Privacy Act, and the Connecticut Data Privacy Act. This situation highlights the pressing need for legal language to evolve alongside our understanding of these issues. As we craft future legislation and amend existing laws, it’s important that we consider adopting the term “deceptive patterns.” This change would not only align our legal language with current expert consensus but also demonstrate our commitment to creating inclusive, unbiased laws that accurately describe the practices we aim to regulate.

From a policy perspective, the use of “dark patterns” is problematic for several reasons. The global nature of the digital economy creates pressure to use terminology that translates effectively across different cultures and languages. The term “dark patterns” may not convey the same meanings across different contexts and cultures, potentially complicating international collaboration on this critical issue. To effectively combat these practices, we need to educate the public with clear, understandable terms. Adopting the term “deceptive patterns” better aligns our language with the true nature of these practices. This change allows us to focus on the core issue, the deliberate deception of users, without getting sidetracked by debates over terminology or unintended cultural implications.

The use of biased terminology can also have psychological effects on users and consumers. Exposure to biased language can impact decision-making processes, self-perception, and even performance. In the context of “dark patterns,” the term itself may unintentionally prime users to expect malicious intent, potentially increasing anxiety and reducing trust in digital interfaces overall. Additionally, the repeated use of such terminology in tech and legal contexts can normalize these biases, perpetuating their effects across various domains of society. By shifting to more neutral language like “deceptive patterns,” we can mitigate these psychological impacts and foster a more inclusive digital environment.

In conclusion, the shift from “dark patterns” to “deceptive patterns” is more than a matter of semantics. Adopting more precise and inclusive terminology facilitates clearer communication across global markets and focuses directly on the manipulative nature of these practices. As we move forward in crafting and enforcing legislation to protect consumers in the digital age, let us lead with language that is as clear, inclusive, and effective as the protections we aim to provide.

About the author: Kaavya is a third-year law student at Santa Clara University School of Law, where she is pursuing the Privacy Certificate. She is deeply interested in the dynamic field of data protection and digital rights, and is eager to make significant contributions to this crucial area after graduating in May 2025.

Disclaimer: The views expressed in this student article are solely those of the author and do not represent the opinions of the California Lawyers Association Privacy Law Section. This article is for educational purposes only and should not be considered as legal advice. Readers should consult with qualified legal professionals for specific guidance.

Identity & Criminal Check Verification: Potential Solution to Bridging the Safety Gap in Dating Apps

By: Kiara J. Patiño Navarro, CIPP/US
Santa Clara University School of Law, 2024 J.D. & Privacy Certificate Candidate

A lot has changed since Grindr launched in 2009 as one of the few geolocation dating apps. Dating apps now have a plethora of filters to ensure users can find matches that align with their preferences and feel secure in their interactions. These filters include, but are not limited to: age, race, height, gender orientation, religion, hobbies, dating goals, relationship preferences, and a verified selfie check.

What filter is currently missing? A verified identity and criminal records check.

In this piece, I will discuss the importance of identity and criminal verification checks, the need for them in the current digital landscape, how they can be implemented, and how their implementation could impact privacy compliance.

Safety Concerns

There are justifiable safety concerns in the use of dating apps, especially for women.

A 2019 ProPublica report found more than a third of women participants reported being sexually assaulted by someone they met through an online dating platform. And usage is high. Pew Research Center reported in 2023 that 53% of participants under 30 had used a dating site or app, compared to 37% of participants aged 30 to 49. With the pervasiveness of dating app usage, it is unsurprising that 60% of Americans support a background check on dating apps. Unfortunately, 57% of women participants reported that online dating is “not at all/not too safe.”

Dating apps such as Hinge and Tinder are aware of these risks. Hinge’s and Tinder’s terms of use ask users to confirm their eligibility before making an account: by creating an account, users represent that they have not been convicted of, or pleaded no contest to, a felony or any crime involving violence (including sex crimes), and that they are not required to register as a sex offender.

Online Dating Social Contracts

Before the explosion of online dating and the use of dating apps, social contracts in the dating world provided some comfort via informal verification. The term “social contract” refers to an implicit agreement among the members of a society to cooperate for social benefits.

Social contracts in the dating world vary by community. For instance, many parts of the world were shaped by social contracts in close-knit communities.

Cass R. Sunstein describes in his 1996 article, “On the Expressive Function of Law,” that in close-knit communities, a defector who violates norms will probably feel shame, which is an important motivational force for compliance, especially given the high social “tax” communities can enforce through informal punishments like ostracization.

There are some places in the world where dating apps are not useful or encouraged. For example, my family’s small town of San José de Gracia, in the highlands of Jalisco, Mexico, has a population of less than 10,000. This close-knit community is a place where social capital and reputation carry over multiple generations. A phrase used in town, “pueblo chico, infierno grande” (small town, huge hell), reflects the social pressure to act exemplary or live in hellish exile.

In larger metropolitan areas, by contrast, folks typically don’t have a tía (aunt) or a busybody neighbor who can give some assurance in dating by providing informal verification.

In the digital world, the closest form of community-based social nets are regional, women-led dating advice groups on social media platforms that warn others of their experiences using dating apps, particularly regarding sexual assault and catfishing.

Not a New Concept

Identification verification is not a new concept in dating apps.

The League is a dating app aimed at career-driven professionals. An identity check is required for all users, who must link their LinkedIn profile. This check relies on people being truthful on their public LinkedIn. The waitlist is around 50,000 people in the Bay Area, and each profile on the app is personally reviewed before going live.

The League’s model is a good first step and it would be beneficial for other dating apps to invest resources into trustworthy criminal and identity verification checks rather than relying on user-provided public representations. 

Privacy Considerations

If dating apps conducted criminal and identity checks, privacy considerations could include:

  • Dating apps map users’ identities by encouraging them to build a detailed profile, including their hobbies, sexual orientation, and religious and political beliefs. Some of this information is sensitive personal information under some privacy laws, and if a dating site conducted criminal and identity checks, it would likely be collecting even more sensitive personal information, which may go against data minimization principles. Mozilla recently reported that most dating apps (80%) share or sell users’ personal information and won’t guarantee them the right to delete their data.
  • Dating apps would need to exercise special care to safeguard users’ sensitive personal information while balancing the need for the physical safety of their other users. For example, the California Consumer Privacy Act (“CCPA”), as amended by the California Privacy Rights Act (“CPRA”), provides California users the right to limit the use of their sensitive personal information to limited purposes. Per Section 1798.135 of the CCPA, companies must provide a notice of the right to limit or provide an alternative opt-out link that is accessible to users. Thus, as users provide the information needed for background checks, companies must ensure that users are given notice of the right to limit or are provided with an opt-out link.
  • Companies should consider disposing of data soon after verifying users’ identities to safeguard users’ information. As users have the right to correct under the CCPA, the data retention period could factor in the time needed to correct a user’s information if they were denied the badge unjustifiably.

In addition to these privacy concerns, wide-scale implementation of checks may be cost-prohibitive. However, users could be charged a one-time fee for a verified criminal and identity check badge and be afforded ease of mind in dating.

Conclusion

With a growing number of younger people using dating apps, these apps have a societal responsibility in shaping a safer dating experience. While identity and criminal verification checks may not entirely prevent sexual assault, they would likely result in a reduction and provide a more trustworthy and transparent space for users.

Disclaimer: The views expressed in this student article are solely those of the author and do not represent the opinions of the California Lawyers Association Privacy Law Section. This article is for educational purposes only and should not be considered as legal advice. Readers should consult with qualified legal professionals for specific guidance.
