Articles
Synthetic Data and GDPR Compliance: How Artificial Intelligence Might Resolve the Privacy-Utility Tradeoff
Data is in many ways the lifeblood of the digital economy. High-quality data often requires significant detail, which may be at odds with the privacy concerns of the human subjects from whom the data is extracted. The tension between the usefulness of a dataset and the data subject’s privacy has been referred to as the “privacy-utility tradeoff.” A novel application of artificial intelligence may make it possible to resolve this tradeoff through the creation of “synthetic data”: anonymized data generated by generative adversarial networks (GANs) trained on authentic raw data. Unlike pseudonymized data, synthetic data retain properties that are statistically equivalent to the underlying data gathered from data subjects. As the cost of compliance with privacy laws across the world increases, synthetic data may prove to be a viable solution to the tension between protecting individual privacy rights and the demands of the big data market.
This Note argues that BigTech companies should incorporate synthetic data into their business models to protect users’ private, personal data while retaining the large profits derived from their ad-driven business models. Part I provides an overview of the GDPR, the patchwork of U.S. privacy laws, and recent case law illustrating EU regulators’ strict approach to enforcement compared to that of their U.S. counterparts. Part II discusses how the Privacy-Utility Tradeoff and BigTech’s current business model render compliance with data privacy regulations difficult. Part III explains how synthetic data can be used to resolve the Privacy-Utility Tradeoff.
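To make the idea of statistical equivalence concrete, the short Python sketch below generates synthetic records whose summary statistics track a "real" dataset's. It is only an illustration under assumptions not drawn from the Note: the three numeric attributes, the dataset size, and the use of a simple multivariate-Gaussian generator are invented stand-ins for the GAN-based approach the abstract describes.

```python
# Minimal sketch (not the Note's method): the abstract describes GAN-generated
# synthetic data; a simple multivariate-Gaussian model stands in here to show
# the core idea that synthetic records can match the real data's statistics
# without reproducing any individual's actual record.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "real" dataset: 10,000 users x 3 numeric attributes
# (e.g., age, monthly spend, sessions per week) -- illustrative only.
real = rng.multivariate_normal(
    mean=[38.0, 120.0, 9.0],
    cov=[[90.0, 45.0, 6.0],
         [45.0, 400.0, 20.0],
         [6.0, 20.0, 16.0]],
    size=10_000,
)

# "Train" the generator: estimate only aggregate statistics of the real data.
mu, sigma = real.mean(axis=0), np.cov(real, rowvar=False)

# Generate synthetic records from the fitted model; no real record is copied.
synthetic = rng.multivariate_normal(mu, sigma, size=10_000)

# Utility check: means and pairwise correlations should be close.
print("mean gap:", np.abs(real.mean(axis=0) - synthetic.mean(axis=0)).round(2))
print("corr gap:", np.abs(np.corrcoef(real, rowvar=False)
                          - np.corrcoef(synthetic, rowvar=False)).max().round(3))
```

In a production setting, a trained GAN generator would replace the parametric model, and the same kind of utility check (comparing means, correlations, or downstream model accuracy) would quantify the "utility" side of the tradeoff.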
Moving the United States into the 21st Century for Children’s Online Privacy Rights
It has been more than twenty-five years since the Children’s Online Privacy Protection Act (COPPA) was first implemented in the United States. In that time, Congress has passed noteworthy modifications to the Act only once. While proposed amendments to better protect children in our current reality of ever-changing technology have increased recently, little has been done to initiate the much-needed change. The increased focus on children’s online rights has been sparked primarily by changes made in the United Kingdom. Now at the forefront of the drive for greater protection of children’s privacy rights, the United Kingdom’s transformation has left other nations considering what alterations their current systems need to keep pace with this growing demand.
Despite the mounting need for change, online service providers have stalled the process, leaving children in a world of new technologies without adequate protections in place. As market giants, online service providers influence ongoing debates to limit legislative changes and the potential economic burden of those changes. A growing body of scholarship has identified issues with the current system in the United States, but few authors have taken on the task of proposing a practical solution. To effectuate change, it is imperative to zero in on the most essential needs of children in order to protect them adequately online while balancing the concerns of those opposing large-scale modifications. This Note will begin by looking at the current law governing children’s online privacy in the United States, COPPA, exploring how the Act works, how violations are handled, and how the original version of COPPA has changed. Next, it will explore the approach recently taken by the United Kingdom, evaluate how COPPA compares, and survey the discussions currently taking place on this topic. Lastly, this Note will set out a five-point plan to implement the changes necessary to bring children’s online privacy protections into the 21st century.
Statistical Securities Compliance
This Article makes three main contributions. First, it introduces the Solana blockchain as a public good and provides a policy analysis of open innovation. Second, it presents a new dataset of SEC blockchain enforcement actions that supports empirical compliance analysis. Third, it draws on the legal informatics literature to provide a mechanism for applied analysis of digital assets on the Solana blockchain in the context of securities law. Throughout, the Article’s central aim is to introduce new methods for using natural language processing to automate compliance services on the Solana blockchain.
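As a rough, non-authoritative illustration of the kind of natural-language-processing pass the abstract gestures toward, the Python sketch below scores a digital-asset description against signal terms loosely tracking the Howey factors. The term lists, example description, and flagging threshold are all hypothetical and are not taken from the Article or its dataset.

```python
# Illustrative sketch only: the Article's dataset and models are not reproduced
# here. This toy rule-based pass scores a digital-asset description for
# securities-law signal words loosely tracking the Howey factors; the term
# lists and threshold are assumptions made for demonstration.
import re
from collections import Counter

# Hypothetical signal terms grouped by Howey factor (assumed, not from the Article).
SIGNAL_TERMS = {
    "investment_of_money":   ["purchase", "invest", "sale", "token sale", "ico"],
    "common_enterprise":     ["pooled", "pool", "shared treasury", "fund"],
    "expectation_of_profit": ["profit", "returns", "yield", "appreciate", "apy"],
    "efforts_of_others":     ["team", "foundation", "developers", "roadmap", "manage"],
}

def score_description(text: str) -> dict:
    """Count hypothetical signal-term hits per Howey factor in a description."""
    tokens = re.findall(r"[a-z]+", text.lower())
    joined = " ".join(tokens)
    counts = Counter()
    for factor, terms in SIGNAL_TERMS.items():
        counts[factor] = sum(joined.count(term) for term in terms)
    return dict(counts)

# Toy example: an invented Solana token description, for illustration only.
description = (
    "Purchase our token during the sale; proceeds are pooled by the foundation, "
    "and the core team's roadmap is expected to drive returns for holders."
)

scores = score_description(description)
flagged = sum(1 for v in scores.values() if v > 0) >= 3  # assumed threshold
print(scores, "-> review recommended" if flagged else "-> low signal")
```

A real pipeline would substitute trained classifiers and the Article’s enforcement dataset for this keyword heuristic; the sketch only shows where automated text analysis could slot into a compliance workflow.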