This report analyzes international, federal, and state-level legislation to protect child safety online. While well-meaning, this legislation is driven by moral panic rather than empirical evidence, is unlikely to help young people, and will harm privacy and free expression.
A range of approaches to child online safety legislation (COSL) is being proposed, debated, or implemented at both the federal and state level in the United States. While the specifics of these bills differ, they coalesce around concerns regarding the effects of social media on young people. This document:
Explains these concerns, why they have surfaced now, and how COSL purports to solve them.
Outlines major international, US federal and state legislative efforts, particularly the Kids Online Safety Act (KOSA).
Summarizes the primary frames by which COSL is justified—mental health, sexual exploitation and abuse, eating disorders and self-harm, and social media addiction—and evaluates the evidence for each.
Outlines concerns that researchers, activists, and technologists have with these bills: age verification, privacy and surveillance, First Amendment rights, and expansion of parental control over young people’s rights and autonomy.
While the impetus for this legislation is well-meaning, we question the assumptions behind it. Mental health and well-being are complicated and tied to many different social and contextual factors. Solutions that focus exclusively on technology address only a very small part of this picture. The granular debate over the evidence linking smartphones and social media to youth well-being distracts us from the real difficulties faced by young people.
COSL poses enormous potential risks to privacy and free expression, and will limit youth access to social connections and important community resources while doing little to improve the mental health of vulnerable teenagers. Ultimately, legislation like KOSA is an attempt to regulate the technology industry when other efforts have failed, using moral panic and for-the-children rhetoric to rapidly pass poorly-formulated legislation.
We strongly believe that reform of social platforms and regulation of technology is needed. We need comprehensive privacy legislation, limits on data collection, interoperability, more granular individual and parental guidance tools, and advertising regulation, among other changes. Offline, young people need spaces to socialize without adults, better mental health care, and funding for parks, libraries, and extracurriculars. But rather than focusing on such solutions, KOSA and similar state bills empower parents rather than young people, do little to curb the worst abuses of technology corporations, and enable an expansion of the rhetoric that is currently used to ban books, eliminate diversity efforts in education, and limit gender affirming and reproductive care. They will eliminate important sources of information for vulnerable teenagers and wipe out anonymity on the social web. While we recognize the regulatory impulse, the forms of child safety legislation currently circulating will not solve the problems they claim to remedy.
Public concerns about the harm of social media to young people have recently accelerated. While worries about youth and technology are nothing new, documents revealed by whistleblower Frances Haugen suggested that social platforms have evidence that their products negatively affect the mental health and well-being of teenage users.1 At the same time, policymakers have passed the Protecting Americans from Foreign Adversary Controlled Applications Act (PAFACA, 2024), citing national security concerns about TikTok.2 Claims that synthesize these concerns—for example, mental health and the manipulation of teenagers by foreign interests—constitute a powerful rhetoric that is being mobilized by both Democrats and Republicans, ostensibly in the pursuit of “holding tech accountable.”
This has coalesced into a will to act politically. The Kids Online Safety Act (KOSA) has been introduced in the U.S. Senate and placed on the legislative calendar. With 68 senators signed on, KOSA appears to have enough support to pass the Senate if brought to the floor for a vote. A companion bill was introduced in the House of Representatives on April 9, 2024. Congressional committees have held 42 hearings on the topic of social media and youth since 2017, and legislation has been proposed in the US, the UK, and France, as well as in 35 US states, including California, Utah, and Maryland.3 The debate over these reforms takes for granted that social media harms young people, that this is supported by empirical evidence, and that the reforms proposed by policymakers will have a material, positive impact on young people’s mental health and well-being. However, these issues are far from settled.
Although several experts repeatedly appear in public discussions about youth safety, the research mobilized in support of both the claims about social media and the techno-legal “solutions” being offered is murkier than it appears. It is, unfortunately, very true that American youth are undergoing a mental health crisis.4 What is less clear is that social media is the cause.5 The timeframe of this crisis coincides with great social uncertainty, a rise in gun violence, felt impacts of climate change, a global pandemic, the popularity of helicopter parenting, and rising income inequality, all of which negatively affect mental health.6 There is evidence that teens suffering from mental health problems often retreat into social media (as do adults), which may explain these correlations.7 But despite high-profile claims to the contrary, research does not show that social media causes those conditions.8 Moreover, the proposed reforms to social media will not significantly improve young people’s mental health and will likely have negative consequences, not only for marginalized and minoritized teens, but for internet users as a whole.
This primer outlines the current state of Child Online Safety Legislation (COSL) at both the federal and state levels, explains the reasons behind current legislative action, and critically evaluates the evidence presented in support. We suggest that other explanations are possible for the children’s mental health crisis and that the proposed reforms will have unintended consequences that will actually harm young people, especially the most vulnerable.
We urge academics, activists, and policymakers to act against this legislation, not because we support “Big Tech,” but because we are concerned about young people. While many of the players in this landscape mean well, we are concerned with the long-term implications of pushing through legislation based on moral panics and unproven causal claims.
2.0 Why Act Now?
There have been concerns over young people’s consumption of media as long as media has existed.9 Plato fretted about the effect of writing on youth, while Victorians warned that “Penny Dreadful” novels would lead young people to crime. Parents were concerned in the 1940s that radio dramas were addictive, in the 1950s that horror comics created juvenile delinquents, in the 1980s that heavy metal turned children into devil-worshiping murderers, and in the 1990s that violent video games led to school shootings.10 In retrospect, many of these worries seem silly, such as hand-wringing over the popularity of jazz, swing dancing, or rock and roll, but this has not lessened their regularity nor the heightened rhetoric that accompanies them.11
This is especially true for digital technology.12 In general, child online safety legislation past and present is fueled by concerns over the risks children may encounter online, often referred to as “content, conduct, and contact.”13 Content refers to exposure to online material such as violent imagery, extremism, sexually explicit content, or information that might spur harmful behavior like self-harm or eating disorders. Conduct involves children’s online behavior that could harm themselves or others, such as cyberbullying, sexting, piracy, or sharing personal information. Contact includes interactions with other people that might pose a risk to children, such as communicating with online predators or peers engaging in harmful interactions, or being manipulated into illegal activities or radical behavior.
Lawmakers have repeatedly proposed or passed legislation to limit young people’s online risks, using different legislative theories. Such legislation includes the Communications Decency Act (CDA, 1996), the Child Online Protection Act (COPA, 1998), the Children’s Online Privacy Protection Act (COPPA, 1998), the Children’s Internet Protection Act (CIPA, 2000), and the Deleting Online Predators Act (DOPA, 2006). All of these efforts were met with lobbying and litigation. Both the CDA and COPA had significant portions struck down on First Amendment grounds; CIPA, which requires schools and libraries to implement internet filters, was declared by the Supreme Court to be constitutional under certain conditions; and DOPA was never passed. Only COPPA was enacted in full, although with revisions that dropped the age from the originally proposed 18 to 13.
All these bills were fueled by moral panics. British sociologist Stanley Cohen defined a “period of moral panic” as one in which something, be it a person, a technology, a media object, or a group of people, “emerges to become defined as a threat to societal values and interests” disproportionate to its actual significance.14 Moral panics around youth and media, especially digital media, are extremely common. The first round of moral panics over online “content, conduct, and contact” in the late 1990s (cyberporn, piracy, and the CDA) and the second in the mid-2000s (online predators, cyberbullying, and DOPA) involved overblown threats fueled by media hysteria, misinterpreted research, and inaccurate statistics.15 Unfortunately, moral panics can also prompt hastily-written legislation, the incarceration of innocents, and increased surveillance of young people.16
The current state of concern about youth and mental health exemplifies a technological panic, which Dr. Amy Orben, who runs the Digital Mental Health Group at the University of Cambridge, defines as a time “in which the general population is gripped by intense worry and concern about a certain technology.”17 We believe the current slate of concerns around youth and social media meets the criteria for a moral panic, which is not a solid basis for legislation.
2.1 The Techlash and 2024
Since 2016, the technology industry has been public enemy number one. Companies like Meta, Google, and Amazon have been criticized—rightly—for spreading disinformation and hateful speech, invading user privacy, selling personal data, pushing sensational and harmful content, concentrating wealth, and contributing to environmental degradation.18 Tech workers, from Uber drivers to Amazon warehouse workers, have organized and demanded improvements in labor conditions, while white collar tech workers have protested the development of intrusive and unethical technologies.19 Positive perception of technology has drastically decreased, with the Pew Research Center finding a 21% drop from 2015 to 2019; by 2024, 44% of Americans believed that technology companies had a negative effect on the United States.20
These criticisms have led to repeated calls for greater regulation of technology companies, such as the need for comprehensive data privacy legislation, the suggestion that antitrust action be taken against large companies like Google and Meta, and the desire to reform Section 230 to hold platforms accountable for some types of content.21 This has resulted in some reforms. For example, California passed the California Consumer Privacy Act (2018); Congress passed the Fight Online Sex Trafficking Act (FOSTA/SESTA) which amended Section 230 to make platforms liable for assisting, facilitating, or supporting sex trafficking (2018); the Justice Department brought antitrust cases against Google (2020, 2023) and the FTC against Meta (2020); and policymakers in the European Union enacted the General Data Protection Regulation (GDPR, 2018) and the Digital Services Act (DSA, 2022). While Congress has held many hearings about “Big Tech” and proposed a vast array of bills, the United States government has been unsuccessful in passing legislation at the federal level.
There is clearly a need for greater regulation of technology companies and their products. However, this regulation should be well thought out, as internet regulation often has unforeseen consequences, especially for marginalized people.22 FOSTA and SESTA, which were rapidly pushed through in response to fears of “child trafficking,” have had minimal to no impact on human trafficking while making sex workers more vulnerable to violence, less able to screen clients, and less financially stable.23 Both bills portrayed a very complicated social issue—sex trafficking—as a technological problem that could be solved by changing online classified advertising.24 As Harvard Cyberlaw Clinic instructor Kendra Albert writes,
“The failure of technology policy advocates to understand or engage with underlying substantive debates over sex work hamstrung advocacy efforts and led to a failure to build meaningful coalitions both prior to FOSTA’s passage as well as after.”25
We are concerned that the current spate of legislative efforts to regulate internet content with the goal of improving children’s mental health and well-being is being hastily formulated without deep engagement with research and advocacy and will have similarly negative outcomes for vulnerable populations.
Protecting young people is an evergreen concern that is hard to argue against, and the rhetoric of children as innocent and deserving of protection is extremely powerful.26 Internet regulation “for the children” has had some success in the United States, fueling CIPA, the CDA, COPA, COPPA, and DOPA. For these reasons, passing legislation around young people’s internet practices is more politically palatable than more comprehensive regulation; that does not mean it should be done.
Recent efforts, like KOSA and its ilk, are partly fueled by broad desires to “hold Big Tech accountable” and curb “harmful content online.” In a very polarized America, these are rare bipartisan concerns. President Joe Biden wrote in a 2023 editorial in the Wall Street Journal:
I urge Democrats and Republicans to come together to pass strong bipartisan legislation to hold Big Tech accountable. The risks Big Tech poses for ordinary Americans are clear. Big Tech companies collect huge amounts of data on the things we buy, on the websites we visit, on the places we go and, most troubling of all, on our children. As I said last year in my State of the Union address, millions of young people are struggling with bullying, violence, trauma and mental health. We must hold social-media companies accountable for the experiment they are running on our children for profit.
Biden also mentions data privacy, the need to protect small businesses, “toxic online echo chambers,” and “cyberstalking, child sexual exploitation, nonconsensual pornography, and sales of dangerous drugs.”27 On the other side of the aisle, Republicans are concerned with tech companies’ supposed liberal bias and have labeled anti-disinformation efforts “censorship,” while both parties are concerned about China’s global influence and Chinese apps like TikTok, which, in addition to PAFACA, has been banned for government workers federally and in two dozen states.28 As a result, bills like KOSA—which provide some measure of platform regulation after other options have failed, and promise to decrease young people’s exposure to harmful content—have broad, bipartisan support.29
3.0 The Legislative Landscape
In this section, we outline the primary COSL efforts internationally, at the US federal level, and at the US state level. While this is not comprehensive, we focus here on legislation that has passed or appears likely to pass.
3.1 International Legislation
Prior to 2022, international legislation focused on protecting children’s data online. This approach was championed by the United Kingdom’s Age-Appropriate Design Code, passed in 2021 and inspired by the European Union’s preeminent data privacy law, the GDPR.30 However, multiple countries, including France, Germany, Australia, Canada, India, South Korea, and Japan, have recently enacted national-level laws regulating online child safety through direct restrictions on design and content. Features of these new laws include age verification to prevent exposure to harmful content; protections for children’s data privacy; limits on internet usage; and requirements for parental consent.31 The Digital Services Act (DSA), passed in 2022 by the European Union, is a landmark regulation that applies to a sizable number of countries. In effect as of January 1, 2024, the DSA seeks to broadly regulate illegal and harmful online content and disinformation.32 The DSA specifically protects children by requiring internet intermediaries to prevent exposure to age-inappropriate content; control and verify access to information by children; and install child-friendly grievance reporting channels.33
Following Europe’s lead, the United Kingdom passed the Online Safety Act in 2023, setting up a wide range of legal obligations for internet platform services. The UK Online Safety Act places duties of care on regulated user-to-user and search services to identify, mitigate, and manage the risks of harm from illegal content as well as “content and activity that is harmful to children.” To outline these duties, the Act designates the UK’s Office of Communications (OFCOM), the government agency that regulates broadcasting and telecommunications, as regulator. The text of the Act also requires internet platform services to be “safe by design”—designed and operated to maintain a “higher standard of protection” for children than adults. Specifically, the Act requires platforms to remove not only illegal content, but also legal content considered to be harmful to children, such as content containing themes of self-harm.34 Thus, services that host “age-inappropriate” content must implement age verification measures to ensure that only adults can access content involving pornography, self-harm, and bullying, among other categories. Finally, platforms must conduct and publish “children’s risk assessments” regularly.35 The UK’s Online Safety Act, therefore, is one of the more detailed pieces of legislation creating a duty of care to protect children on the internet.
3.2 Kids Online Safety Act (KOSA)
In the United States, the Kids Online Safety Act (KOSA / S.1409) is a major federal bill and possibly one of the most significant attempts by the government to rein in the technology industry. KOSA was initially introduced in the Senate by Sen. Richard Blumenthal (D-CT) and Sen. Marsha Blackburn (R-TN) on February 16, 2022. Following its reintroduction in May 2023, the bill gained bipartisan support in a polarized Congress. As of April 2024, KOSA has been unanimously approved by the U.S. Senate Committee on Commerce, Science, and Transportation and placed on the Senate legislative calendar. On April 9, 2024, Reps. Gus Bilirakis (R-FL), Kathy Castor (D-FL), Erin Houchin (R-IN), and Kim Schrier (D-WA) introduced a version of KOSA into the House.36 KOSA purports to set out requirements to protect minors from online harms, and is applicable to “covered platforms”—defined in the bill as “applications or services (e.g. social networks) that connect to the internet and are likely to be used by minors.”37 Since its introduction, the language of the bill has undergone several amendments.
According to the latest amended version of the Senate bill, unveiled by its authors in February 2024, its main areas of intervention are a duty of care, a set of safeguards for minors, parental tools, and mechanisms for determining which users are minors.
First, KOSA establishes a “duty of care” for internet platforms to “exercise reasonable care in the creation and implementation of any design feature to prevent and mitigate…” a variety of listed harms to minors such as mental health disorders; online sexual exploitation; patterns of internet addiction; online bullying and harassment; and predatory, unfair or deceptive marketing practices, among others.38 However, the duty of care does not require platforms to prevent a minor from deliberately or independently seeking or requesting content.
Second, KOSA requires platforms to provide any user that it “knows is a minor” with readily-accessible and easy-to-use safeguards—which should be part of the default setting for minors—to limit the ability of others to communicate with them, restrict public access to their personal data, control personalized recommendation systems, and limit “design features that result in compulsive usage of the covered platform by the minor.”39 According to the bill, “design features” are “any feature or component of a covered platform that will encourage or increase the frequency, time spent, or activity of minors on the covered platform.” The text goes on to specify examples of design features, such as infinite scrolling or autoplay; rewards for time spent online; notifications; personalized recommendation systems; in-game purchases; and appearance-altering filters.
Third, KOSA gives parents more control by mandating that platforms provide specific tools. These tools allow parents to manage their child's privacy and account settings, see how much time their child spends on a platform, and limit their usage time. These requirements are enhanced by the obligation for platforms to clearly disclose their terms of service, explain their use of personalized recommendation systems, and provide information about advertised products and services. Additionally, KOSA requires platforms to regularly monitor and annually publish transparency reports detailing the risks to minors online and the steps taken to mitigate these harms.
Finally, regarding how platforms determine if users are minors, KOSA specifies that it does not require the collection of new data, such as government-issued IDs, nor does it require the implementation of age verification systems. Instead, it directs a task force, including representatives from the National Institute of Standards and Technology (NIST), the Federal Communications Commission (FCC), the Federal Trade Commission (FTC), and the Secretary of Commerce, to explore technically feasible age verification methods through research. This task force study will look into the potential advantages of device and operating system-level age verification, the information required for verifying age, the accuracy of these systems, and ways to protect user privacy and data security, among other issues.
Furthermore, the bill’s language makes it clear that platforms must implement safeguards or notifications depending on their awareness of whether a user is a minor. KOSA defines “know” or “knows” as having “actual knowledge or knowledge fairly implied on the basis of objective circumstances.” Most platforms ask users their age when they create an account, but do not require proof of this information via government-issued identification of any kind. Civil society groups like the American Civil Liberties Union (ACLU) have argued that to “know” the age of a minor will, in practice, require age verification.40 This process could also inadvertently compel adults to provide official ID to confirm they are not minors. The ACLU and other First Amendment advocates have argued that age verification threatens free speech, privacy, online anonymity, and data security.41
The House version is slightly different in that the “duty of care” and “knowledge” obligations are tiered, similar to the DSA.42 Social media, messaging, streaming, and multiplayer gaming platforms are divided into “high impact platforms” (those with more than 150 million users and $2.5 billion in yearly revenue), second-tier “covered platforms” (with annual revenue of $200K and 200K users), and third-tier platforms with lesser thresholds for revenue and users. “High impact” platforms have the duty of care, while “covered platforms” do not but are still required to implement safeguards.43 “Knowledge” that a user is a minor is also defined differently by tier. High impact platforms “knew or should have known the individual was a child or minor,” second-tier platforms “knew or acted in willful disregard of the fact that the individual was a child or minor,” and third-tier platforms must have “actual knowledge.” The House bill also replaces references to “addiction” with “compulsive usage” (see Section 4.4), which is defined as “any response stimulated by external factors that causes an individual to engage in repetitive behavior reasonably likely to cause a mental health disorder.”44 There are also minor changes to “design features,” which now include push notifications or badges that encourage engagement, while “deceptive marketing practices, or other financial harms” have been removed.45 However, the bill is an early draft and is likely to continue to change as it works its way through the House.
In its previous versions, KOSA enabled state attorneys general to take legal action against platforms for violations of the duty of care. However, in response to criticism from the LGBTQ+ community that a broadly defined duty of care would encourage greater online censorship of vulnerable communities (including LGBTQ+ youth),46 the amended version of the bill limits the power to enforce the duty of care to the FTC. Failing to meet the duty of care is therefore no longer grounds for civil actions by state attorneys general, although breaches of other provisions still qualify, such as those related to safeguards for minors, disclosure, and transparency.
The latest Senate version of KOSA (February 2024) has been met with increased bipartisan support. Still led by Senators Blumenthal (D-CT) and Blackburn (R-TN), and joined by Sen. Cassidy (R-LA), KOSA is currently co-sponsored by 68 senators, creating a path for KOSA to successfully pass through the Senate, and a version was introduced into the House in April 2024. Beyond Capitol Hill, KOSA is backed by a wide coalition of civil society and industry supporters. This coalition includes Common Sense Media, the American Psychological Association, Parents for Safe Online Spaces, the NAACP, Christian Camp and Conference Association, Microsoft, and Snap, among others.
Yet several LGBTQ+ and civil liberties organizations still consider KOSA to be dangerous. For the Electronic Frontier Foundation (EFF), “KOSA is still a censorship bill and it will still harm a large number of minors who have First Amendment rights to access lawful speech online.”47 Similarly, the ACLU “remains concerned that this bill would silence important conversations, limit minors’ access to potentially vital resources, and violate the First Amendment by imposing a roundabout government-mandated content moderation rule.”48 Finally, the digital rights group Fight for the Future continues to ask KOSA’s sponsors to clarify that the duty of care applies only in a content-neutral manner.49
KOSA has been deeply influenced by the UK Online Safety Act (discussed in Section 3.1), which passed in October 2023. In contrast to KOSA, the UK’s OSA identifies a larger set of harms requiring platform regulation, including opioid markets, content uploaded from prisons, terrorist propaganda, disinformation, manipulation by AI, and even abuse of public figures like athletes and female journalists. Common to both bills, however, is the “duty of care” provision. While both countries have a tradition of “duty of care” in their tort law, the UK’s legal framework governing free speech differs substantially from the First Amendment.50 In the case of the OSA, the “duty of care” was suggested by Lorna Woods and William Perrin, who derive it from the UK’s 1974 Health and Safety at Work Act by asserting that digital platforms operate similarly to quasi-public places such as offices, bars, and theme parks.51 This means that platforms are responsible for preventing harm in the same sense that theme parks are responsible for preventing injuries on their premises. KOSA lacks a clear reference to any such physical corollary, which creates ambiguity in the scope of its application and its implications for free speech. The OSA’s reliance on an analogy between software design and public spaces falls flat when design choices can reflect developers’ free expression, an important distinction given the two countries’ different legal standards for free speech.
3.3 State Legislation
Though KOSA has a significant amount of bipartisan support, policy analysts Scott Brennen and Matt Perault note that we are much more likely to see tech legislation passed at the state level, rather than federally.52 As Brennen and Perault argue, states are “poised to become critical battlegrounds for tech policy in the next two years” because of structural advantages at the state level (such as one party controlling all levels of government) that make passing legislation much easier. At the state level, we can see the implications of KOSA-like laws, how they may be used to achieve the political aims of lawmakers, their viability, and the role that age verification in particular is going to play in the future of the internet. This means that regardless of whether KOSA passes and what exists in the final version of the bill, state laws are likely to have a great impact on young people’s internet use.
A majority of states have proposed or passed KOSA-like legislation aimed at children’s use of social media, or other age verification laws. According to the National Conference of State Legislatures, 35 states and Puerto Rico considered legislation introducing measures to protect children using the internet, with 12 states passing bills and resolutions.53 This number does not include other legislation that would also require age verification but aims to curtail adult access to information, specifically pornography.
Proposed and passed state laws aimed at distinguishing users by age can generally be separated into three broad categories, discussed below.
3.3.1 Controlling Minor Access to Social Media
The first set of laws are KOSA-like bills passed by states like Texas, Utah, and Arkansas that aim to control broad access to information online through age verification, going beyond the restrictions placed on accessing pornography.54 Arkansas’ Senate Bill (SB) 396 prohibits minors from creating online accounts unless they have explicit parental consent and the platform has verified their age using third-party vendors.55 Texas’ House Bill (HB) 18 will require platforms to obtain parental consent, verified by government-issued IDs, before new minor users can join.56 Utah’s SB 152 originally prohibited age-verified minors from opening an account without parental consent, until the tech industry group NetChoice sued to block the legislation on First Amendment grounds.57 SB 152 was amended by SB 194 and HB 464. SB 194 also institutes age verification, but only enables maximum privacy settings on a minor’s account if they lack parental consent.58 HB 464 allows parents and minors to sue platforms for negative effects on a minor’s mental health.59
Notably, all three states have different interpretations of how a minor can be verified and permitted to open a social platform account, but all specify that data used to verify minors cannot be retained beyond what is necessary. On the topic of parental control, Texas and Utah both require platforms to provide parents with access to their child’s online activity and other data collected on the minor. Another issue central to these bills is the harm caused to minors by deceptive online advertising. Texas’ HB 18 requires platforms to restrict advertisements for services or products that minors cannot use, and to disclose in their terms of service how personally identifiable information is integrated into recommendation algorithms.60 Utah, on the other hand, restricts minors from seeing any targeted advertisements and recommended content. Utah also prohibits direct messaging with certain accounts and bars displaying a minor’s account in search results, to prevent the discovery of minors online by other users.
3.3.2 State Child Privacy Bills
The second set of laws aims to increase privacy protections for children. This includes legislation such as the California Age-Appropriate Design Code Act (CAADCA),61 a law modeled on the UK Age Appropriate Design Code that would require special data safeguards for underage users.62 The California law was set to go into effect in July 2024, but a federal judge granted the tech industry group NetChoice a preliminary injunction in September 2023.63 Critics have argued the law violates the First Amendment by creating barriers for both minors and adults seeking access to websites and other apps.64 Nonetheless, California is not the only state passing laws oriented toward protecting young users’ privacy. Connecticut has updated a privacy law to require online platforms to conduct regular safety assessments and make design changes to limit who can contact minors, and to give minors (defined in this case as consumers under 18) the right to “unpublish” and delete accounts.65 Similar laws have been proposed in Vermont and Illinois.66
3.3.3 State Anti-Pornography Bills
The last set of child online safety laws attempts to use age-gating as a mechanism for restricting users under 18 from accessing online pornography. Claiming that exposure to sexual content at a young age can lead to disorders and harmful behavior, eight states passed laws in 2022 and 2023 requiring websites with more than 33.3% pornographic content to use age verification methods before enabling access.67 Louisiana’s Act No. 440, which has become a model for similar legislation in other states, establishes platform liability and civil remedies for distributing materials harmful to minors.68 Initially, Louisiana’s law was to be enforced by private right of action. However, supplemental updates to the law have empowered state attorneys general to proactively investigate platforms, which are subject to fines for non-compliance.69 Harmful materials are defined as those designed to appeal to “prurient interest,” including descriptions of actual, simulated, or animated sexually explicit materials.70 “Prurient interest” is, however, an extremely broad term and may signal a future filled with debates similar to those witnessed around obscenity laws. Regardless, the Louisiana law mandates that platforms verify non-minors, either by collecting digitized identification cards or by instituting commercial age verification systems. The specific methods of commercial age verification stated in the law include government-issued identification or discerning age through transactional data.71 Following Louisiana, seven other states—Texas, Virginia, Arkansas, Mississippi, Utah, Montana, and North Carolina—have enacted copycat laws, and dozens of others have introduced similar bills in state legislatures.72
While there is a significant degree of variation in the requirements and platforms covered by these state laws mandating age verification, they demonstrate an increase in state-level power over minors’ online experiences. These laws give state governments, attorneys general, and parents increasing discretion over when and how minors access information. The threats to privacy and free speech posed by government-mandated surveillance of minors are additionally alarming.
4.0 Arguments in Support of Children’s Online Safety Legislation (COSL)
Arguments in favor of COSL focus on the increasing rates of depression, loneliness, and anxiety among youth across the country and suggest that this phenomenon is due to social media and its addictive design created by “Big Tech.” These legislative efforts are often framed as regulating a tech industry that has gone rogue and no longer cares about the health of its users beyond its bottom line. In public discourse, this enables COSL to represent a broader project of regulating technology and transforming how children interact with social media, rather than focusing on the specifics of any single bill.
4.1 Mental Health
For the last several years, public health officials, mental health professionals, and educators have warned that young people in the United States are undergoing a mental health crisis.73 Rates of anxiety and depression have increased significantly.74 Suicide rates among 10-24 year olds increased 67% from 2007 to 2021 and are at their highest rate since the 1980s.75 Eating disorders among both boys and girls have increased.76 The COVID-19 pandemic had enormously negative consequences on the mental health of children and teenagers.77 Moreover, extensive racial and economic disparities in treatment exist, with poor children and children of color far less likely to have access to mental health services.78 There is a significant gender gap, with girls and young women experiencing more anxiety and depression than their male counterparts.79
Various explanations for this crisis have been proposed, including the aftereffects of the pandemic, the opioid epidemic, COVID lockdowns and school closures, family problems, domestic violence, economic uncertainty, climate change, and helicopter parenting.80 The gender gap may be explained by early onset puberty, economic factors, sexual violence and assault, and prevalence of sexist public discourse, but there is no clear and obvious answer.81
The cause of the current crisis is contested. However, in the United States, the public, the surgeon general, and politicians have ignored this complicated sociopolitical context in favor of scapegoating social media as the cause.82 To support COSL, policymakers and advocates have frequently cited psychologists Jean Twenge and Jonathan Haidt to justify public intervention.83 The two have a variety of qualms about cell phones, the internet, and social media, claiming that such technology is addictive, causes sleep deprivation, takes time away from in-person socializing, and creates a “narcissistic” and “entitled” generation of “fragile” and “anxious” young people.84 Haidt is a moral psychologist and professor of business ethics with no scholarly expertise in social media or youth. He recently published a book, The Anxious Generation: How the Great Rewiring of Childhood Is Causing an Epidemic of Mental Illness (2024). It follows a long string of alarmist essays in outlets like The Atlantic with titles like “The dangerous experiment on teen girls,” “How social media dissolved the mortar of society and made America stupid,” and “Yes, social media really is undermining democracy.”85 Twenge, a decorated professor of psychology, has written two books pathologizing millennials, The Narcissism Epidemic: Living in the Age of Entitlement (2010) and Generation Me (2014), and two pathologizing Gen Z, iGen: Why Today’s Super-Connected Kids Are Growing Up Less Rebellious, More Tolerant, Less Happy–and Completely Unprepared for Adulthood (2017) and Generations (2023).86 Twenge’s basic arguments paint young people as lazy, entitled, selfish, and narcissistic, while Haidt considers youth to be coddled, overprotected, and delicate.
Haidt and Twenge both switched from chronicling the deficient personality traits of young people to criticizing the social media platforms that, both claim, made the youth this way. While in earlier books both blamed a variety of other actors, including parents, consumer culture, therapists, celebrities, programs to boost self-esteem, front-facing cameras, moral relativism, schools, and so forth,87 they have since homed in on social platforms and smartphones as the cause of broad supposed generational changes. The two have co-authored editorials with embellished titles like “This is our chance to pull teenagers out of the smartphone trap,” as well as a series of articles in prestigious outlets like Nature Human Behaviour.88 Haidt and Twenge jointly contend that “smartphones in general and social media in particular” are to blame for a litany of negative mental health outcomes among young people, particularly girls and young women,89 and that no other phenomena can be considered equally responsible for youth mental health crises.90 Twenge and Haidt assert that widespread social media use has incited an inescapable, “planetary rewiring” of human interaction.91
Haidt’s work has been criticized by other academics for cherry-picking evidence, attributing causation to correlational data, and furthering simplistic, unsubstantiated narratives.92 Most recently, while promoting The Anxious Generation, Haidt endorsed the discredited theory of “rapid onset gender dysphoria,” which claims that social media is responsible for an increase in gender diversity in young people (discussed in 5.3).93 Twenge’s work has been repeatedly criticized for its emphasis on generational cohorts, reliance on cross-temporal meta-analysis, pathologizing of youth, and repeated conflation of correlation with causation.94 Her latest book states that technology is the “root cause” of “cultural changes” and “generational differences,” a framework she calls “The Technology Model of Generations.”95
Media scholars have spent many years trying to understand the effects of all types of media and technology on people and society more broadly. This field of study is known as media effects.96 Haidt and Twenge’s causal model, in which social media and smartphones cause young people to be depressed, narcissistic, and anxious (or, in Twenge’s case, determine generational differences), is an example of a theory of media effects called technological determinism: the idea that technology determines society. In media and technology studies, such theories are widely considered to be inaccurate and out-of-date. Nevertheless, technological determinism has tremendous staying power because it provides easy, simplistic explanations for complicated events.97
Despite its popularity in documentaries and op-eds, technological determinism is flawed because it ignores contextual factors. Rich historical accounts of the effects of technology, such as Elizabeth Eisenstein’s history of the printing press and its impacts, note that technology has particular effects due to the time and place in which it is enacted and that the same technology leveraged in a different context may have completely different effects.98 After the horrific 1999 Columbine shooting, a robust debate raged over whether playing violent video games contributed to school shootings. Twenty years later, academic evidence overwhelmingly finds no link between the two.99 Moreover, video games are immensely popular in Japan and South Korea, which have among the lowest rates of violent crime in the world.100 As Chris Ferguson and James Ivory write, “Efforts to blame mass homicides on video games appear to be due to unfamiliarity with games among older adults, prejudicial views of young offenders, and a well-identified cycle of moral panic surrounding media as a scapegoat for social ills.”101
This is not to say that technologies do not contribute to social change—of course they do.102 Social platforms have enabled the development of knowledge commons, virtual community, new art and media forms, and an enormous number of new business models, as well as facilitating disinformation, networked harassment, and hateful content. However, taking a complex situation like youth mental health and attributing a single causal element is short-sighted. Rather, we advocate for an ecological approach to improving young people’s health and well-being in which the full spectrum of young people’s lives and concerns are considered.
Identifying the cause of the mental health crisis prescribes the solutions.103 If the youth mental health crisis is caused by social platforms and smartphones, then eliminating social platforms and smartphones would, logically, ameliorate the crisis. Putting aside for a second the impossibility of this reform, let us critically examine the supposed relationship between social platforms and mental health concerns. (No one is, at present, attempting to legislate away smartphones, although Haidt suggests that parents restrict access to cellphones until age 16.)104
Fundamentally, scholarly studies show that the relationship between teenage mental health and social media is far more complex than media coverage implies. In May 2023, the American Psychological Association released a Health Advisory on Social Media Use in Adolescence, concluding that “[u]sing social media is not inherently beneficial or harmful to young people,” and that its effects “are dependent on adolescents’ own personal and psychological characteristics and social circumstances—intersecting with the specific content, features, or functions that are afforded within many social media platforms.”105 In other words, when youth are already doing poorly, interactions online and media consumption can exacerbate mental health issues. This is well-established in other media contexts. For example, when youth are already contemplating suicide, exposure to TV shows like 13 Reasons Why or learning about a celebrity’s death by suicide can have significant effects on their mental health and, in the worst cases, spur copycat suicides.106
Given these debates around causality, the National Academies of Sciences, Engineering, and Medicine convened a committee of experts in cognitive science, computational social science, economics, education, epidemiology, law, media science, mental health, network science, neuroscience, pediatrics, psychology, social media, and technology to assess the impact of social media on the health and wellbeing of adolescents and children.107 In their December 2023 report, the experts concluded, “The committee’s review of published literature did not support the conclusion that social media causes changes in adolescent health at the population level.” The report also notes that the use of social media, rather than having purely negative or positive impacts, is likely “a constantly shifting calculus of the risky, the beneficial, and the mundane that affects different people in different ways.”108
To bolster their cause, Twenge and Haidt frequently cite the sheer quantity of studies purporting to link social platform use and mental health.109 They maintain two lengthy Google docs, one showing a downturn in youth mental health and the other analyzing studies that draw correlations between social media and mental health. Yet a detailed perusal of these studies finds that very few draw causal links between the two.110 In contrast, one study of 464 teens found no correlation between social media and adverse mental health consequences, with the authors concluding that “concerns regarding social media use may be displaced.”111 A study of more than two thousand 10-15 year olds in North Carolina found little evidence of a negative association between digital technology use and well-being.112 In 2023, a very large study of Facebook adoption in 72 countries found “no evidence” that “social media is associated with widespread psychological harm.”113
To turn back to critiques of technological determinism and simple media effects: mental health is complicated, and its causes are complex; isolating one particular variable—social media—is extremely simplistic.114 Despite hundreds of studies on the interactions between social media and youth mental health, there is no clear evidence of an association, let alone a cause.115 The public discourse on this has frustrated many specialists, with one paper titled, simply, “There is no evidence that associations between adolescents’ digital technology engagement and mental health problems have increased.”116
In all the attention given to debating the binary of whether social media is good or bad, or adding up the number of studies on each side, the nuances often get lost. Many young people are doing badly, and their interactions with technology can exacerbate existing struggles. But it is important to consider the particular arrangement of social context, social platforms, and economic circumstances that makes this possible. For example, anxious and lonely people, both adults and teenagers, often turn to social media for companionship.117 While this search for connection is understandable, social platforms are not a substitute for in-person interaction. Unfortunately, young people’s ability to socialize, walk to school, and travel around their neighborhoods without adult supervision has decreased significantly in the last 30 years.118 This is partly due to the discourse of “stranger danger” that has been on the rise since the 1980s (see Section 4.2)119 and partly due to widespread adoption of mobile devices, which has allowed parents to surveil their children in a way that was impossible for earlier generations.120 Haidt himself has written about the rise of “helicopter parenting” and the huge change in social norms that puts enormous social pressure on parents to limit their children’s movements and freedom, and he advocates for “increased amounts of independent play and responsibility in the real world.”121 These factors are unlikely to change regardless of how social platforms are regulated.
In the debates about COSL, many people center technology instead of youth. When we center youth instead, they name a broad array of issues that are increasing their anxiety. In light of increasingly extreme weather events and global warming, climate anxiety is rising both domestically and around the globe.122 Young people are expressing greater concern about the future, prompting them to question whether having children is wise.123 In the US, anxieties around school shootings124 and student college debt continue to mount.125 And since young people are often affected by the well-being of their parents, the endemic mental health struggles, divorces, job precarity, and other factors affecting the mental health of adults tend to ripple down to young people. Because so many teenagers use social media, their struggles are made very visible online in ways that are often otherwise challenging to see.126 When the EFF asked youth how they felt about KOSA, more than 5,000 responded, explaining how social media has enriched their lives, connected them with others, and helped them find valuable information, and that they feared KOSA would curtail these valuable resources.127
So what do young people need? A recent report from the Crisis Text Line, which provides support to youth struggling with mental health, analyzed more than 87,000 anonymized conversations with young people. Teens asked for “opportunities for social connection; engagement in music, writing, visual, and performing arts; mental health services; exercise and sports programming; books and audiobooks; and outdoor spaces and nature.”128 Unfortunately, the same report points out that funding for such programs never recovered from the 2008 financial crisis, with funding for libraries, parks, and art and music programs in schools all lower in 2024 than it was in 2010.129 Strong relationships with parents can improve mental health outcomes.130 But young people also need access to non-custodial adults who are looking out for them. They need to feel safe from gun violence in their communities. And obviously, more young people need access to mental health resources.
Ultimately, the deluge of child online safety legislation fails to address these issues. Instead, it empowers parents to further surveil and restrict their children’s self-expression, and there is no evidence that COSL will address the problems that Haidt and Twenge write about so prolifically. In his Substack, Haidt says, “If you listen to the alarm ringers and we turn out to be wrong, the costs are minimal and reversible.”131 Not only is this a poor basis on which to build legislation, it is also wrong. As we show in this primer, there are enormous potential risks to privacy, free expression, and—ultimately—the mental health of vulnerable teenagers.
4.2 Sexual Exploitation and Abuse
The fear that young people will be abducted or assaulted by “online predators” or subjected to sex trafficking is a constant modern concern. KOSA and similar bills attempt to reduce the “sexual exploitation and abuse of minors,” although the mechanisms for doing so are unclear. One possibly relevant provision in KOSA requires platforms to “limit the ability of other individuals to communicate with the minor,” which addresses fears of strangers, especially adult strangers, communicating with young people. However, this provision would seemingly prevent young people from communicating with anyone online. Is the risk of “stranger danger” grave enough to prevent young people from using the internet to communicate?
First, it is important to distinguish the risk of “online predators” from the known availability of child sexual abuse material (CSAM), as well as milder content that appeals to pedophiles, some of which is posted, knowingly or unknowingly, by parents.132 While these are significant and troubling problems, neither KOSA nor any other child safety bill would address these concerns.133 CSAM is already illegal under federal law and is moderated by both state law enforcement and platform companies, which are legally required to report CSAM to the National Center for Missing & Exploited Children (NCMEC).134
Second, researchers use the term “online sexual abuse” to include grooming, solicitation, and image-based abuse. Most perpetrators of online sexual abuse are not strangers, just as in cases of dating violence, child sexual abuse, child predation, cyberbullying, and cyberstalking, all of which are far more likely to be perpetrated by someone known to the victim.135 A recent meta-analysis of 32 studies of online sexual abuse, conducted by the Crimes against Children Research Center, found that 44% of offenders are under 18 and 68% are acquaintances of the victim.136 In contrast to the stereotype of online sexual abuse “typically involving younger children, deception, abduction, and coercive violence at the hands of internet strangers,”137 most perpetrators knew the victim “offline” and used the internet to “build trust and forge relationships that facilitated their crimes.”138 The average incident involved a teenager engaged in a sexual relationship with an older adult, usually one connected to the teenager’s offline life.
This very rarely involves abduction; actual abductions resulting from online interactions with strangers are exceedingly rare.139 However, the public perceives online predator abduction to be a common occurrence, partially due to highly-publicized cases like that of Alicia Kozakiewicz, who appeared on the cover of People.140 NCMEC, which runs a CyberTipline where the public can report suspected child sexual exploitation, widely publicized an 82% increase in tipline reports of “online enticement” between 2021 and 2022.141
However, “online enticement” is a broad category, and NCMEC attributes the increase to a rapid rise in “sextortion,” which Justin Patchin and Sameer Hinduja define as “the threatened dissemination of explicit, intimate, or embarrassing images of a sexual nature without consent, usually for the purpose of procuring additional images, sexual acts, money, or something else.”142 One study found that 60% of minors experiencing sextortion knew the perpetrators offline, often as current or former romantic partners, and 75% had voluntarily provided the images before the sextortion took place.143 A large, nationally-representative survey of American teenagers similarly found that most perpetrators were the victim’s “real life” friend or romantic partner; only 4-6% of victims were sextorted by someone online that they didn’t know very well.144 Sextortion, therefore, is better characterized as a part of intimate partner violence or cyberbullying than something committed by “online predators.”
While all the scenarios outlined in this section are serious and there is a significant social responsibility to prevent them, focusing on “stranger danger” is misplaced, since only a small number of online sexual abuse incidents involve strangers. As Sutton & Finkelhor write, “An emphasis on stranger oriented messages can be problematic for effective prevention. It fails to orient the police and public to the multiple and varied sources of danger. These messages associate danger with the fact that someone is unknown rather than with particular problematic behaviors by any correspondent, known or unknown which do not map to the realities of the evidence base.”145 In other words, effective prevention strategies would help young people identify worrisome behavior from friends, peers, family members, and acquaintances, rather than shadowy internet strangers. KOSA would not contribute to such efforts.
Preventing all young people from communicating with adults or peers is a drastic measure, one that would most likely reduce the rate of online sexual abuse only marginally while cutting young people off from potential sources of support and friendship.
4.3 Eating Disorders and Self-Harm
Eating disorders (EDs) are often directly cited as proof that platforms need regulation. While there are important connections between social media and EDs, politicians and lobbying groups frame the relationship as clear-cut and solvable by regulatory solutions like KOSA. These assertions often rely on claims that platforms like Instagram allow pro-eating disorder content and that the comparative culture of social media leads to feelings of body dissatisfaction. While these claims merit research and policy attention, the relationship between social media and eating disorders is complex and cannot be reduced to a simple causal claim that social media produces eating disorders. There is a correlation between excessive use of social media and increased body dissatisfaction, decreased self-esteem, and ED-linked behaviors. Public health researchers Santarossa & Woodruff found that increased time on social media was “significantly related to higher ED symptoms/concerns.”146 Similarly, one psychiatric study demonstrated that “time spent online or using social media” was “related to body image” and showed “an association with abnormal eating attitudes and behaviors, binging, purging, use of laxative/weight loss or diuretics.”147
These correlations, however, paint a simpler picture than the research supports. The scholarship describes associations; it does not establish a causal direction for the effects of social media. Is it that increased use of social media results in body dissatisfaction, or that those experiencing body dissatisfaction are more likely to seek out dieting content and engage in peer comparison? American University psychologists Meier and Gray found that total time spent on Facebook (FB) was not correlated with “thin ideal internalization, self-objectification, weight dissatisfaction, and drive for thinness.”148 It was “the amount of FB time allocated to photo activity” that was associated with ED-linked behaviors.149 This finding has been replicated in the context of “selfie feedback,” “photographic activity,” and exposure to “idyllic images.”150
These distinctions matter because they inform whether targeted interventions will succeed or fail, and they highlight an expansiveness that makes the duty of care provision impossible to enforce. If the problem is not merely time spent on social media, or even active posting, then policies that merely rein in time spent won’t solve it. Research has shown that exposure to photographic activity, not just participation, is the predictive variable in ED-linked behavior. One group of eating disorder researchers found that even “unposted selfies” were linked to “greater ED symptom severity.”151 Solutions such as KOSA do not resolve these structural elements because they are social, not merely technological. Lastly, the complex web of causality makes enforcement almost impossible. If dieting culture is correlated with EDs, does KOSA imply that platforms ought to eliminate any content associated with dieting culture or photographic activity? If platforms like Instagram are harmful to some young women but helpful to others, does that warrant restricting them for all young people? These practical questions highlight the inability of KOSA to adequately solve the real problems identified by scholars and activists.
4.4 Social Media Addiction
Throughout the Senate version of KOSA, language suggests that social media is “addictive.” Among Section 3’s list of “harms to minors” is “patterns of use that indicate or encourage addiction-like behaviors.”152 KOSA also requires the FTC to commission studies on “Addiction-like use of social media and design factors that lead to unhealthy and harmful overuse of social media.”153 The scholarly evidence concurs that some social media users exhibit addiction-like behaviors, or what some scholars call “problematic social media use.”154 (The Senate bill carefully uses “addiction-like” rather than “addiction” because scholars have warned that using the language of addiction to describe social media use minimizes the physical symptoms and seriousness of drug and alcohol addiction. The House version eschews this language entirely in favor of “compulsive usage.”) This use is characterized by “mood modification, salience, tolerance, withdrawal symptoms, conflict, and relapse,”155 or, in plain language, “being overly concerned about social media, strongly motivated and having been devoting [sic] a great amount of time and energy to use social media, to the degree that an individual’s social activities, interpersonal relationships, studies/jobs, and/or health and well-being are impaired.”156 Most research finds that these behaviors are present in a minority of social media users, from 3.42% in a study of US 18-25 year olds157 and 4.5% in a study of Hungarian teenagers158 to 9.1% amongst teenagers in the Netherlands159 and 9.4% in Finnish adolescents.160
We think this is one area where COSL could do some good. For example, New York’s Stop Addictive Feeds Exploitation (SAFE) for Kids Act requires all social platforms to provide a chronological feed,161 which Instagram users have been requesting since it was removed. Whether this will have a significant effect on the amount of time users spend on social media or the content they see is unknown, but unlike many of COSL’s provisions, it does not seem to have deleterious effects. In fact, we believe that allowing users to have more control over their feeds, whether by decreasing “suggested posts” from accounts the user does not follow, blocking types of advertisements, or giving users the option to turn off algorithmic recommendations, should be extended to all users of social platforms, not just those under 18. This is because problematic social media use is just as likely to manifest in adults as in teenagers.162
However, “addiction-like behaviors,” to the extent that they exist, are currently tied to social media use overall and not simply to algorithmic feeds. As one study concludes, “very little is known about how platform affordances and user perceptions may influence social media addiction.”163 Indeed, a number of studies suggest that evidence of “addiction-like behaviors” can be found in use of the internet itself, not just social media platforms.164 This contrasts with the popular consensus that social media companies strategically design their platforms to be “addictive.”165 KOSA’s provision to fund more research in this area is a good step toward answering these questions.
Finally, the popular idea that social media produces “hits” of “dopamine” which facilitate addiction is not supported by research. Eschewing the language of “addiction” for “habitual use,” Mark D. Griffiths, Distinguished Professor of Behavioural Addiction at the International Gaming Research Unit at Nottingham Trent University, writes, “the idea that dopamine ‘hijacks the brain’ and leads to ‘compulsive loops’ are analogies used in the media rather than the phrases used by scientists.”166
5.0 Arguments against Child Online Safety Legislation (COSL)
The deluge of child online safety legislation has been met with repeated criticism from groups like the EFF, Fight for the Future, and the ACLU. While acknowledging the well-intentioned nature of COSL, critics have highlighted how these bills will require widespread age verification, negatively impact the privacy of all internet users, increase surveillance of adolescents, and run afoul of the First Amendment. Additionally, the enforcement mechanisms inherent in these bills can be used specifically against LGBTQ+ content and reproductive rights.
5.1 Age Verification
Most of this proposed or passed legislation will necessitate some form of age verification, or “age-gating,” either explicitly or implicitly. This is particularly true for laws aimed at restricting access to information based on age (such as North Carolina and Montana’s anti-pornography bills, which require pornography websites to proactively verify the ages of users trying to access pornography).167 But platforms and other internet service providers may need to engage in age verification simply to determine whether they must comply with the law. For instance, critics of California’s Age Appropriate Design Code (AB 2273), such as Mike Masnick, have argued that a website would need to use age verification to determine whether the law applied to it.168 And though KOSA states that it does not require age-gating, age verification, or the collection of additional data from users to determine age,169 it is unclear how the law would be applied without them. The law requires that platforms treat minors differently from adults, which, according to the ACLU, could “necessitate that platforms verify users’ ages.”170 Due to the breadth of these laws, and because children generally do not have government-issued identification, age verification—whether done through IDs, credit cards, AI-gating, or facial scanning—will impact all users, regardless of age.
While KOSA only implicitly requires age verification for social media, many states are introducing bills that mandate age verification and parental consent. Arkansas enacted SB 396, which requires social media platforms to enact “reasonable age verifications” that include a “digitized identification card,” “government-issued identification,” or “any commercially reasonable age verification that holds an Identity Assurance Level 2.”171 Similarly, Utah and Texas instituted requirements for commercially reasonable age verification. Other states, such as New Jersey, attempted to institute age verification but ultimately failed.172 Federal bills like S. 1291 and S. 419 would institute age verification at a larger scale, and such calls are likely to increase in the coming years. In the most politically charged context, Florida’s recently signed HB3 requires age verification for any “website or application” that “contains a substantial portion of material harmful to minors.”173 HB3 applies to any website where “more than 33.3 percent of total material” is “harmful to minors,” defined as content that: 1) “the average person applying contemporary community standards would find, taken as a whole, appeals to the prurient interests,” 2) “depicts or describes, in a patently offensive way, sexual conduct,” and 3) “when taken as a whole, lacks serious literary, artistic, political, or scientific value for minors.”174
These are remarkably unclear standards, as it is not evident what determines “literary, artistic, political, or scientific value for minors.”175 Similarly, “contemporary community standards” regarding “the prurient interest” are subject to unilateral definition by the Florida Department of Legal Affairs.176 This means that Florida can use age verification to wage political battles against platforms whose content it deems to lack value for children, a strategy the state has previously used to ban LGBTQ+ books from classrooms.
Age verification is simple in theory but difficult to enact effectively without sacrificing privacy, free expression, equity, accuracy, or all of the above. It is much easier to determine whether someone is over 18 than whether they are under 18, since adults are much more likely than children to have government-issued IDs or credit cards. The implicit assumption is that anyone unable to verify that they are over 18 is presumed to be a minor. Platforms are likely to punt this responsibility to third-party companies, such as PornHub’s parent company, MindGeek. (The Age Verification Providers Association unsurprisingly supports KOSA and even provides a guide to legislators on how to write age verification legislation that won’t be struck down by the courts.)177
Solutions fit within one of three categories: self-assertion, ID verification, and estimation. Checking a box to assert that a user is over 18 offers privacy but provides little accuracy. Verification of an ID, such as a passport or driver’s license, is relatively accurate, but it would functionally ban undocumented immigrants and other adults without identification from social media platforms, while risking private information leaking in the case of a security breach. Other solutions use proxies to estimate age. Access to a credit card is often used as a proxy for age, but doing so would exclude the 18% of Americans who lack a credit card.178 Services such as Yoti offer AI-powered facial recognition software that can supposedly estimate age from a facial scan. Independent reviews of the public-facing software have noted that it can easily be circumvented; one report found that Yoti could be spoofed by placing a dog in front of a face.179 A litany of prior studies has highlighted the racial bias inherent in many facial recognition systems, complicating the ease with which these systems can be applied to a diverse set of users.180
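To make these tradeoffs concrete, the decision flow a platform might adopt can be sketched in a few lines of code. The sketch below is purely illustrative: it assumes a statute, like Arkansas SB 396, that rejects bare self-assertion, and every type, attribute, and threshold in it is our own stand-in for whole classes of commercial products, not any actual system.

```python
from dataclasses import dataclass

@dataclass
class User:
    """Hypothetical attributes a verification flow would consult."""
    has_government_id: bool
    has_credit_card: bool
    estimated_face_age: float  # output of an AI age-estimation service

def verify_age(user: User) -> bool:
    """Cascade from least to most invasive method accepted by the statute.
    (A self-assertion checkbox would be the only privacy-preserving option,
    and it is exactly what these laws rule out.)"""
    if user.has_government_id:
        # Relatively accurate, but excludes undocumented adults and creates
        # a breachable record tying identity to site access.
        return True
    if user.has_credit_card:
        # Proxy for adulthood that excludes the ~18% of Americans
        # who lack a credit card.
        return True
    # Last resort: facial age estimation, which reviewers have found
    # spoofable and racially biased.
    return user.estimated_face_age >= 18.0

# An adult with no ID or credit card whose face age is under-estimated
# is locked out entirely: the equity failure described above.
print(verify_age(User(False, False, 16.4)))  # -> False
```

Note that every branch that succeeds does so by collecting or inferring something sensitive; the one cheap, private option is the one regulators reject.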
Young people are also likely to subvert many of these requirements relatively easily. Using a VPN during the registration process would allow users to dodge the verification requirement. Users might also shift to smaller social media websites that escape parental supervision or age verification requirements because they do not have user accounts, a prerequisite for coverage under some legislation, such as Texas HB 18. Youth might exploit loopholes and spoofing techniques to get online regardless of the intentions of legislators. And, of course, they might use services that are not under US jurisdiction, complicating the politics of which social platforms can legally operate within the United States.
5.2 Privacy and Surveillance
Questions of privacy are essential to understanding the dangers of COSL. Although many COSL bills are categorized as “privacy bills,” they present significant challenges to online privacy and enable wide-scale surveillance.
First, wide-scale implementation of mandatory age verification would have devastating consequences for internet privacy, making it more or less impossible to browse the web anonymously. Requiring users to submit government IDs to every social media platform they wish to access means no site can be visited anonymously, or even without a record. This is compounded by the fact that KOSA’s “duty of care” provision and various state bills may make vast swaths of the internet—not just social platforms—subject to age verification, such as pornographic websites and sites with “critical race theory,” sexual health, abortion information, and LGBTQ+ content.181
To be clear, this is legal information protected by the First Amendment. People should have the right to access information without government-mandated surveillance, as revealing such information could be extremely embarrassing—in some contexts, life-threatening—or even open them up to criminal proceedings. If COSL passes, adults who are unwilling to share private information with a third party will find themselves unable to access content that was previously freely available. Tying individual users to government IDs destroys the anonymity, and the privacy, that both children and adults need.
Second, there is little information about the companies that would administer age verification systems. Requiring private companies to collect highly sensitive information, without any limits on how that information may be used, opens users up to identity theft and to having their private information hacked, leaked, or breached. KOSA, for example, asserts that the federal government will produce guidance for platforms within 18 months explaining how to comply with the law. These ambiguities amount to asking legislators to vote for KOSA now and figure out how it will be implemented afterward.
Supporters of age verification programs point to clauses that prevent information from being saved or used after the verification process, but deletion poses its own risks. If information is deleted immediately following verification, those systems are substantially less auditable, because there is no concrete record of the information provided for verification. Through data aggregation, a person’s government ID could easily be linked to a list of sensitive sites they have accessed, which could then be bought and sold by data brokers. Data brokers are notorious for creating market categories of vulnerable people, such as men suffering from erectile dysfunction or senior citizens susceptible to scams.182 The data broker industry is virtually unregulated and sells personal data about millions of individual Americans.
Finally, the privacy of individual children and teenagers also matters. Under many state bills, anyone under 18 could have everything they do online surveilled by their parents. While it is tempting to place parents’ rights above the rights of the child, children also have rights—although the United States is the only nation that has not ratified the UN’s Convention on the Rights of the Child.183 In fact, the United States lacks not only a formal framework, but a “fundamental, common understanding of children as human beings and rights holders.”184 Instead, the conservative framework of parents’ rights has, as outlined below, been used primarily to limit children’s access to information, mostly information that is age-appropriate but politically controversial in some US contexts. Historically, children have been treated as the property of their parents, and, as Yale legal scholar Samantha Godwin argues, “any account of parental rights grounded in a parent’s separate interests supervening on the interests of their children has the effect of denying children equal moral consideration.”185 Privacy is a universal human right, but children living in their parents’ house often suffer from a lack of it, as their rooms, actions, and belongings are heavily surveilled.186 Regulating children’s access to inappropriate online content must be balanced against the parental surveillance such regulation enables.
5.2.1 Parental Control
Parental control aspects of COSL are more likely to survive First Amendment review than age verification or the duty of care. However, we have other concerns. An essential element of the discourse surrounding and supporting this batch of regulatory bills is the invocation of parental rights and protection. As Senator Amy Klobuchar noted in a hearing on Big Tech and the Online Child Exploitation Crisis, one parent described social media as “a faucet overflowing in a sink,” leaving her “with a mop while her kids are getting addicted to more and more different apps and being exposed to [dangerous] material.”187 In this view, children are addicts-in-waiting, and legal change is necessary to make parental control a precondition for adolescent participation rather than a retroactive backstop. This is most visible in appeals for age verification, paired with parental consent requirements for those under the age of 16 or 18, as a precursor to participation on social media.
While parental tools are important to protect the privacy and safety of children, parental monitoring is not intrinsically safe. Even if KOSA does not allow parents to view the content their children have accessed, bills pending in Arkansas, Indiana, Iowa, Tennessee, and Wisconsin would make social posts, messages, and likes available to parents. These provisions would eliminate children’s capacity to have any private interactions on social media platforms, regardless of social context. Although most parents want the best for their kids, there are abusive parents, and parents who hold very different cultural and political views than their children. In these cases, COSL that allows parental monitoring would facilitate abuse and conflict. Given that LGBTQ+ youth are overrepresented among homeless and runaway children, who often leave home due to parental abuse or rejection,188 and that children with political beliefs that differ from their parents’ can face family conflict and uncertainty,189 COSL that allows parents to see the content of sites their children visit may make vulnerable minors more vulnerable.
Moreover, the rhetoric of parental rights has recently returned as a central element of conservative lobbying against Critical Race Theory (CRT), DEI efforts, and LGBTQ+ content.190 Movements espousing the need for parental control over education and exposure to sexual content can be traced back as far as the 1920s.191 For instance, Anita Bryant’s 1977 Save Our Children campaign asserted that protecting gay men and lesbians from discrimination would result in schools being forced to hire “homosexuals,” who would recruit children to homosexuality.192 More recently, the moral panic over drag shows at libraries echoes a similar fear of queer and trans influence on children. Both panics rest on the idea that parents ought to be the gatekeepers and arbiters of what children are exposed to. Recently, “parents’ rights” have been weaponized to ban books with LGBTQ+ characters from school libraries, characterize doctors providing gender-affirming care as “groomers,” and remove mentions of slavery from textbooks.193 COSL, even when well-intended, fuels this rhetoric and will likely be used to similar ends.
5.2.2 Chilling Effects on Information Access and Free Expression
In practice, increased parental control over social media usage can have a chilling effect, particularly on young people seeking essential information and community. According to a 2014 study of 5,542 internet users aged 13-18, LGBTQ+ youth disproportionately rely upon online sexual health information, with 78 percent of LGBTQ+ youth searching for such information.194 This likely reflects the lack of offline opportunities for queer youth to learn about sexual health without outing themselves to potentially unsupportive social and familial circles.195 Considering recent moral panics surrounding queer representation in educational settings, the need for online sexual health information will likely become more acute. Research validates this concern: one study found that queer youth experiencing parental digital surveillance self-censored, suffered negative mental health effects, and worried about abuse if their sexuality were discovered.196 If parental monitoring tools give unsupportive parents access to search histories, comments, user activity, and even private messages, queer youth will see their sexual privacy eroded and may be subject to abusive responses.
In addition to the “duty of care” provision, the expanded monitoring of young people’s online activity sought by many types of COSL is worrisome given the data privacy concerns around reproductive rights.197 In Mississippi and Indiana, women have been prosecuted on the basis of text messages and Google searches indicating they were seeking abortions.198 Even if reproductive health care is not defined as “harmful,” the age-gating and other provisions of COSL will have a chilling effect on young people searching for information on abortion care.
Similarly, undocumented immigrants may be subject to arrest or deportation for providing identifying information for age verification, given that ICE regularly uses commercial databases, social media platforms, and cellphone records to identify undocumented people.199 A survey of immigrants in New York City found that 42% would use the internet only for certain things, and 26% would use the internet less or not at all, if identity verification were required.200 As a result, widespread age verification would negatively impact access to information for marginalized groups.
5.3 Weaponization of Duty of Care
One central concern is that KOSA’s duty of care provision could be weaponized against certain types of content in the name of protecting children. This would be operationalized under Section 3, which requires platforms to take “reasonable measures in the design and operation of any product, service, or feature” to mitigate harm to minors. Importantly, however, KOSA relies on the FTC to litigate whether a platform is causing harm, a judgment that can swing with the political winds of the moment. The vagueness in defining “harm” thus allows state attorneys general to pressure the FTC to define different types of content as “harmful,” and opens the door to state bills that would make it impossible for youth to access content that KOSA would allow. KOSA, as written, requires supporters to trust that the guidelines produced within 18 months would prevent the duty of care from being weaponized, without any clarification of how those guidelines might be formulated.
Right-wing organizations and individuals have already announced their intentions to use COSL to censor transgender and gender-affirming content (of course, left-wing organizations and individuals might also use COSL for their own preferred censorship). In a commentary piece for The Heritage Foundation titled “How Big Tech Turns Kids Trans,” Jared Ecker and Mary McCloskey lauded KOSA as prohibiting “the sexual exploitation of minors and the promotion of content that poses risk to minors' physical, and mental health.”201 Protecting children, for Heritage, requires guarding “against the harms of sexual and transgender content online.” In an X (formerly Twitter) thread justifying this position, the Heritage Foundation asserted that trans content, and transness itself, are a “social contagion” that conditions kids toward “permanently damaging their healthy bodies.”202 In a video published by the conservative Family Policy Alliance, Senator Marsha Blackburn lauded KOSA while asserting that “protecting minor children from the transgender in this culture” ought to be a priority for the conservative agenda.203 Most recently, in an interview with PBS, Jonathan Haidt claimed that young people become trans due to peer contagion effects, saying, “It’s very different from the kinds of gender dysphoria cases that we’ve known about for decades. I mean, it is a real thing. But what happened, especially when girls got, was YouTube and Instagram early, but then especially TikTok, girls just, you know, girls get sucked into these vortices and they take on each other’s purported mental illnesses.”204
This rhetoric of transness as a “social contagion” emerges from the proliferation of the term “rapid onset gender dysphoria” (ROGD), coined by Lisa Littman in 2018.205 As Florence Ashley describes it, ROGD posits a “new clinical subgroup of transgender youth, which would be characterized by coming out as transgender out of the blue in adolescence or early adulthood.”206 Rapid onset supposedly occurs due to peer influence networks, trans content on social media, trauma, and even sexual objectification. The study underpinning Littman’s theory of ROGD has been shown to be flawed in multiple ways. Notably, Littman solicited parents from anti-trans parental groups and did not interview any trans people. Furthermore, as Arjee Restar, an Assistant Professor of Epidemiology at the University of Washington, highlights, Littman relied upon parental respondents to diagnose their children without clinical qualifications and upon retroactive reports that spanned “on average…6 years for parents to remember between their child’s ‘childhood’ and current age.”207 Yet the perceived legitimacy of ROGD allows anti-trans organizations to subvert the intentions of KOSA and claim that ROGD research would meet Section 3’s standard of “evidence-informed medical information.”208
ROGD as a conceptual frame has provided anti-trans state legislatures with justification for the restriction of trans-affirming care, as was the case when Florida directly cited Littman’s conception of ROGD as proof of the need to restrict gender-affirming care.209 ROGD frames transgender identity as both a source of “social contagion” and a product of social contagion. Simply put, this understanding of gender dysphoria, and the expression of transgender identity, asserts that children are becoming trans due to social acceptance of trans identity and that children are identifying as trans due to social pressure, whether explicit or implicit. This relies upon a view of trans identity as pathological, and merely a fad which children adopt, rather than a very material experience.
Similarly, civil liberties groups identify KOSA as a mechanism to shut down reproductive rights content, especially access to abortion pills, abortion funds, and legal resources for abortion seekers. Since the overturning of Roe v. Wade in the Dobbs decision, 14 states have banned abortion, three have implemented gestational limits stricter than those permitted under Roe, and three have had abortion bans blocked.210
Abortion restrictions, such as allowing only physicians to prescribe abortion medication and requiring women seeking abortions to undergo mandated ultrasounds, counseling, and waiting periods, have been imposed across the country.211 Access to contraception has decreased.212 Conservative anti-abortion groups and politicians are attempting to redefine certain types of contraceptives, such as IUDs and emergency contraception, as abortion, in order to further restrict their usage.213 A few states have sought to prevent women from traveling to other states to seek abortion care.214
In this very contentious landscape, people seeking reproductive health care—especially young women—may find their access to reproductive health care information severely curtailed. Given that anti-abortion strategies often rely on dubious claims that abortions (and contraception) are harmful to women,215 it is likely that states in which abortion is illegal will similarly define reproductive rights content as harmful to young people.
Supporters of KOSA point to revisions of the original 2022 bill, made in response to sweeping critique from prominent LGBTQ+ organizations, but these revisions fail to provide meaningful protections. The changes largely rely upon defining the “covered platform” to exclude services like suicide hotlines, and upon limiting the provision so that it does not restrict minors from “deliberately and independently searching for, or specifically requesting, content.”216 While these are important limits, they fail to adequately protect trans youth. First, NGOs can lobby the FTC to treat the mere existence of trans content as a harm, which will likely lead platforms to preemptively moderate contentious content before any enforcement action takes place. Second, state laws will likely define trans content as harmful, thus supplementing KOSA. In a polarized social context where the definition of “harmful” is highly subjective and deeply influenced by politics, allowing the government to decide which content counts as “harmful” opens up a serious vector for abuse.
5.4 First Amendment Concerns
From the perspective of digital civil rights organizations, COSL presents a significant threat to the First Amendment and poses potential censorship concerns. Even after the revised version of KOSA was unveiled, the EFF, the Center for Democracy and Technology (CDT), and the ACLU all cited First Amendment concerns with the bill.217 The ACLU has identified two areas of the bill that could chill speech: (1) requiring or incentivizing age verification, which it argues could chill speech for both adults and minors; and (2) the “duty of care” requirement, which it argues could “entice platforms to censor content.” The Electronic Frontier Foundation has likewise argued that KOSA poses First Amendment problems, pointing to the degree of power that state attorneys general will have in targeting online services and speech. It notes that even with the changes made to the bill in 2024, KOSA will still permit state officials to enforce the law based on “their political whims.”218 The EFF has also warned of First Amendment problems arising from the need to restrict content based on age, which effectively mandates age verification.
At the state level, many of the bills proposing age verification have been challenged on First Amendment grounds. In response to lawsuits from the tech industry organization NetChoice and the ACLU, the California Age-Appropriate Design Code was blocked by a district court injunction on First Amendment grounds and is headed to the Ninth Circuit.219 Arkansas SB 396 was similarly blocked after NetChoice, the ACLU, and the EFF filed comparable lawsuits.220 Texas’s HB 1181, which requires age verification to view pornography, was recently upheld by the Fifth Circuit after a lawsuit from adult entertainment companies and is likely to be appealed to a higher court.221
The implementation of Utah’s SB 152 was delayed until October 1, 2024, in response to a set of First Amendment lawsuits against the bill.222 Legislators have introduced SB 194 and HB 464 in response.223 SB 194 removes SB 152’s requirement of parental consent to open an account, but places any minor account that lacks parental consent at the maximum level of privacy, removing the ability to share content with or message anyone besides already “connected accounts.”224 Platforms are then required to use age assurance criteria to determine whether “an account holder is a minor with an accuracy rate of at least 95%,” and users who are miscategorized must appeal with documentary evidence.225
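A 95% floor may sound strict, but at platform scale even the permitted 5% error rate affects enormous numbers of people. The back-of-the-envelope calculation below illustrates this; the platform size and age mix are our own assumptions chosen for illustration, not figures from the bill or any platform.

```python
# Hypothetical illustration of Utah SB 194's 95% age-assurance floor.
# All inputs are assumptions for illustration, not real data.

users = 10_000_000      # assumed platform user base
adult_share = 0.85      # assumed share of users who are adults
accuracy = 0.95         # the minimum accuracy SB 194 requires

adults = users * adult_share
minors = users - adults

# At the legal minimum, up to 5% of each group may be misclassified.
adults_flagged_as_minors = adults * (1 - accuracy)
minors_passed_as_adults = minors * (1 - accuracy)

print(f"Adults wrongly restricted: {adults_flagged_as_minors:,.0f}")        # 425,000
print(f"Minors wrongly treated as adults: {minors_passed_as_adults:,.0f}")  # 75,000
```

Under these assumptions, hundreds of thousands of adults would be pushed into the documentary appeals process, turning a nominally privacy-protective threshold into a mass ID-collection pipeline.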
There is a long history of internet legislation requiring age verification that has been struck down because of the First Amendment. In an early effort to protect minors online, the Communications Decency Act (CDA) was passed in 1996 to criminalize the knowing transmission of obscene, indecent, or patently offensive material to minors (defined as anyone below the age of 18).226 Persons or businesses could avoid being prosecuted under this law by restricting minor access to indecent material, and requiring the “use of a verified credit card, debit card, adult access code, or adult personal identification number” to view it.227
As noted by Pagidas (1998), “the act immediately sparked controversy.”228 It was quickly challenged in Reno v. ACLU, in which a group of twenty plaintiffs, led by the ACLU, contested its constitutionality by filing suit against the Attorney General and the Department of Justice.229 Their complaint alleged the law was unconstitutional because it “criminalizes expression that is protected by the First Amendment,” is “impermissibly overbroad and vague,” and is not the “least restrictive means of accomplishing any compelling governmental purpose.”230 Additionally, they asserted that the Act violated the constitutional right to privacy by effectively criminalizing private email correspondence to or among individuals under the age of 18.231
In its 1997 decision, the Supreme Court majority noted potential problems “confronting age verification for recipients of Internet communications.”232 The Court observed that in the late 1990s, “the Government offered no evidence that there was a reliable way to screen recipients and participants in such forums for age.”233 The Court continued that even if it were technologically feasible to block minors’ access to newsgroups and chat rooms that potentially elicit “indecent” or “patently offensive” contributions, it would not be possible to block that content while still allowing access to the remaining content that was not indecent. The Court also reasoned that the age verification technologies that had been proposed—namely, using a credit card as a surrogate for proof of age—would bar adults who did not have a credit card. Thus, in the Court’s opinion, age verification violated the First Amendment rights of both children and adults.
It is unclear whether and how these concerns have been addressed by KOSA, or whether the potential chilling effect on speech posed by age verification would remain. Though legislators have claimed that KOSA would not require age verification, this legislation and the similar state bills described above require that platforms and other companies operating on the internet be able to distinguish between those to whom the law applies (i.e., users under a certain age) and those to whom it does not (see Section 5.1).
6.0 Recommendations
At this writing, KOSA has only recently been introduced into the House, and various COSL efforts are being fought in the courts, primarily on First Amendment grounds. We recognize the tremendous amount of public concern over the mental health of young people and the unprecedented power of the technology industry. While we believe the current instantiations of COSL are not appropriate ways to solve these problems, we offer some suggestions. While these reforms are more complicated than simply passing legislation and require more political will, we believe they will positively impact the well-being of children, parents, and other adults alike.
Center Young People
Discussions of social media, children, and mental health invariably turn into debates about whether social media is good or bad for children. This framing emerges when we center the technology. Instead, we need to center young people. What is at the root of their struggles? What do young people need and want to feel empowered? As researchers, we find that this rarely starts with fixes to technology. Moreover, young people are not a monolith; content that empowers one teenager may make another anxious.234 What happens online and offline are interlinked; children’s behaviors are just as influenced by their family, peer group, and social context as they are by screens.235 The issues young people face are ecological in nature. It is critical that lawmakers see technology as one facet of a system, one that mirrors and magnifies a range of dynamics already at play. Rather than rendering unhealthy youth practices invisible, we should make certain that young people who seek help receive it.
Increase Access to Mental Health
Young people need improved access to free or low-cost mental health services. Regardless of what is driving the rise in depression, anxiety, and death by suicide, mental health resources are a clear way to counter it. Mental health support must be provided to children, teenagers, and parents around the country.236 In addition to addressing the shortage of social workers, therapists, and counselors, this might include community and support groups to help young people suffering from loneliness or social isolation.237 The Biden Administration’s comprehensive national strategy to address the mental health crisis, with $200 million earmarked for youth mental health support, is a step in the right direction.238
Rebuild the Social Fabric
In the wake of the COVID-19 pandemic, the social fabric has frayed.239 In September 2023, the Census Bureau released figures showing that the child poverty rate in the US more than doubled, from 5.2% in 2021 to 12.4% in 2022.240 The pandemic hindered student achievement and caused ongoing learning deficits.241 These large systemic factors must be addressed.
More specifically, young people need concerned adults in their lives who are not their parents, such as coaches, pastors, teachers, neighbors, and godparents. Locales must provide spaces where teenagers can socialize in person, such as libraries, youth centers, and skateparks. Even though there is tremendous social pressure on parents to engage in intensive helicopter parenting, parents should enable their children to have unstructured free time, walk to school, and “hang out.”
Create Digital Street Outreach Programs
In the 1990s, when concerns about crack and HIV were at an all-time high, communities began investing in “street outreach” programs. Community members went out into the streets to offer clean needles and information about supportive local services. They checked in with care on those who were struggling, giving people dignity and hope.
Today’s youth need digital street outreach programs. They are crying out in pain online. Rather than try to render those cries invisible, we need social services that can ensure that when young people are struggling, they have access to services that might help them.242
Better Individual and Parental Controls
Many of us feel that we have an unhealthy relationship with our phones, tablets, and laptops. The success of third-party software such as Freedom, the popularity of “digital detox” retreats and programs, and a small but significant number of people choosing “dumbphones” over smartphones all show the desire for widespread changes in modern technology use. The types of platform reforms included in some COSL, such as chronological feeds, blocks on “dark patterns,” and granular user controls are important and necessary for everyone, not just those under 18.
Car manufacturers are required to install seat belts, but they are not required to prevent accidents. The same logic should apply to social media regulation. It is reasonable to outlaw specific features: autoplay, for example, or recommended content forced upon people who simply want to view the content they chose. (Again, this would be good for everyone, not just minors.) However, it is not reasonable to create requirements that expect companies to adjudicate what does or does not cause emotional distress. That is techno-legal solutionism.243
Similarly, the current state of parental tools on streaming services, video games, tablets, and smartphones is dire. Parental tools are ineffective, difficult to use, and clunky. For example, parents should be able to block individual YouTube channels, games on Steam, video podcasts on Spotify, or shows on streaming services. All technology companies—not just social platforms—should invest in improving parental tools through user testing with parents and children. A growing body of literature suggests that youth should be included in the design process, as empowering young people and increasing parental engagement are more effective in reducing online risk than authoritarian approaches.244
Limit the Scope of Acceptable Advertising
In Packer Corp. v. Utah (1932), the Supreme Court ruled that Utah had every right to restrict what advertisers could post on public billboards because people “have the message of the billboard thrust upon them,” unlike other forms of advertising where people have some choice. Rather than trying to carve up what children can and cannot see, social media should be treated like a public billboard. There should be explicit restrictions on which kinds of goods and services can be advertised on social media—to anyone—given its role as a digital town square.
Pass Privacy Legislation
We need comprehensive privacy legislation, and there must be clear limits on how data collected by platforms and other tech companies can be used. But this is true for everyone, not just people under 18. The draft American Privacy Rights Act recently unveiled by Senator Maria Cantwell (D-WA) and Representative Cathy McMorris Rodgers (R-WA) would give people more control over how their data is collected and used online. Users would be able to opt out of targeted advertising, of the collection of biometric and facial recognition data, and of having their personal information sold by data brokers. While this is only a draft, it provides a baseline for how we might move forward.245
7.0 Conclusions
The public debate around the harms of social media and smartphones has devolved into a battle between two simplistic narratives: social media is causing a youth mental health crisis, or it is not. This debate distracts us from the reality that young people are in pain and need help.
Even if KOSA or any other COSL passes, it is very unlikely to substantially alleviate the crisis in the short term, and we are dubious as to the efficacy (or constitutionality) of such legislation in the long term. But the intense focus on smartphones and social media drowns out necessary and immediate reforms that can be undertaken to improve the mental health and well-being of children, teenagers, and young adults.
We strongly believe that reform of social platforms and regulation of technology is needed. We need comprehensive privacy legislation, limits on data collection, interoperability, more granular parental and individual content tools, and advertising regulation, to name but a few changes. Young people need spaces to socialize without adults, better mental health care, and funding for parks, libraries, and extracurriculars.
But rather than focusing on such non-digital solutions, COSL empowers parents rather than young people, does little to curb the worst abuses of technology corporations, and enables an expansion of the rhetoric that is currently used to ban books, eliminate diversity efforts in education, and limit gender-affirming and reproductive care. KOSA will eliminate important sources of information for vulnerable teenagers and wipe out anonymity on the social web. While we empathize with the regulatory impulse, the forms of child safety legislation currently circulating will not solve the problems they claim to remedy.
Appendix
Authors
Alice E. Marwick is Microsoft Visiting Professor at the Center for Technology Policy at Princeton University, Principal Researcher at the Center for Information, Technology, and Public Life, and Associate Professor of Communication at the University of North Carolina at Chapel Hill.
Jacob Smith is a PhD candidate in Communication and a Graduate Research Assistant at the Center for Information, Technology, and Public Life at the University of North Carolina at Chapel Hill.
Robyn Caplan is an Assistant Professor at Duke University's Sanford School of Public Policy and a Senior Lecturing Fellow in the Center for Science & Society at Duke University.
Meher Wadhawan is a Legal Associate at YouTube and a former Researcher at Duke University's Sanford School of Public Policy.
Acknowledgements
Thank you to the Center for Information Technology Policy at Princeton University for funding this work through a Microsoft Visiting Professorship and providing support throughout.
Thank you to Amanda Reid, danah boyd, Matt Perault, Maria P. Angel, Shreyas Gandlur, and Arvind Narayanan for helpful feedback and editing.
Thank you to Katherine Furl and Felicity Gancedo for layout and publishing.
Works Cited
Abrams, Zara. “2023 Trends Report: Kids’ Mental Health Is in Crisis. Here’s What Psychologists Are Doing to Help.” Monitor on Psychology 54, no. 1 (February 2023): 63.
———. “Stress of Mass Shootings Causing Cascade of Collective Traumas.” Monitor on Psychology 53, no. 6 (October 27, 2023): 20.
Akter, Mamtaj, Amy J. Godfrey, Jess Kropczynski, Heather R. Lipford, and Pamela J. Wisniewski. “From Parental Control to Joint Family Oversight: Can Parents and Teens Manage Mobile Online Safety and Privacy as Equals?” Proceedings of the ACM on Human-Computer Interaction 6, no. CSCW1 (April 7, 2022): 57:1-57:28. https://doi.org/10.1145/3512904.
Albert, Kendra. “Five Reflections from Five Years of FOSTA/SESTA.” Cardozo Arts & Entertainment Law Journal 40, no. 2 (2022): 413–40.
Alizadeh, Hadi, Ayyoob Sharifi, Safiyeh Damanbagh, Hadi Nazarnia, and Mohammad Nazarnia. “Impacts of the COVID-19 Pandemic on the Social Sphere and Lessons for Crisis Management: A Literature Review.” Natural Hazards 117, no. 3 (July 1, 2023): 2139–64. https://doi.org/10.1007/s11069-023-05959-2.
American Civil Liberties Union. ACLU v. Reno (Complaint), No. Civ. A. 96-963 (US District Court Eastern District of Pennsylvania February 8, 1996).
Angel, María P. “Despite Wide Support, the Kids Online Safety Act Will Not Fix the Youth Mental Health Crisis.” Fast Company, October 13, 2023. https://www.fastcompany.com/90966640/kids-online-safety-act-will-not-fix-youth-mental-health-crisis/.
Angel, María P., and danah boyd. “Techno-Legal Solutionism: Regulating Children’s Online Safety in the United States.” In CSLAW ’24. Boston, MA: ACM, 2024. https://doi.org/10.1145/3614407.3643705.
Aradillas, Elaine. “At 13, She Was Abducted and Tortured by an Online Predator. Now, She’s Determined to Keep Other Kids Safe.” People Magazine, August 31, 2022. https://people.com/crime/alicia-kozakiewicz-abducted-online-predator-now-keeps-kids-safe/.
Arnett, Jeffrey Jensen. “The Evidence for Generation We and Against Generation Me.” Emerging Adulthood 1, no. 1 (March 1, 2013): 5–10. https://doi.org/10.1177/2167696812466842.
Arnett, Jeffrey Jensen, Kali H. Trzesniewski, and M. Brent Donnellan. “The Dangers of Generational Myth-Making: Rejoinder to Twenge.” Emerging Adulthood 1, no. 1 (March 1, 2013): 17–20. https://doi.org/10.1177/2167696812466848.
Ashley, Florence. “A Critical Commentary on ‘Rapid-Onset Gender Dysphoria.’” The Sociological Review 68, no. 4 (July 1, 2020): 779–99. https://doi.org/10.1177/0038026120934693.
Atkinson, Robert D., Doug Brake, Daniel Castro, Colin Cunliff, Joe Kennedy, Michael McLaughlin, Alan McQuinn, and Joshua New. “A Policymaker’s Guide to the ‘Techlash’ —What It Is and Why It’s a Threat to Growth and Progress.” Washington, D.C.: Information Technology & Innovation Foundation, October 28, 2019. https://itif.org/publications/2019/10/28/policymakers-guide-techlash/.
Ayers, John W., Benjamin M. Althouse, Eric C. Leas, Mark Dredze, and Jon-Patrick Allem. “Internet Searches for Suicide Following the Release of 13 Reasons Why.” JAMA Internal Medicine 177, no. 10 (October 1, 2017): 1527–29. https://doi.org/10.1001/jamainternmed.2017.3333.
Bacchini, Fabio, and Ludovica Lorusso. “Race, Again: How Face Recognition Technology Reinforces Racial Discrimination.” Journal of Information, Communication and Ethics in Society 17, no. 3 (January 1, 2019): 321–35. https://doi.org/10.1108/JICES-05-2018-0050.
Bado, Patricia, Julia Schafer, Andre R. Simioni, Rodrigo A. Bressan, Ary Gadelha, Pedro M. Pan, Eurípedes C. Miguel, Luis A. Rohde, and Giovanni A. Salum. “Screen Time and Psychopathology: Investigating Directionality Using Cross-Lagged Panel Models.” European Child & Adolescent Psychiatry 31, no. 4 (April 2022): 689–91. https://doi.org/10.1007/s00787-020-01675-5.
Bányai, Fanni, Ágnes Zsila, Orsolya Király, Aniko Maraz, Zsuzsanna Elekes, Mark D. Griffiths, Cecilie Schou Andreassen, and Zsolt Demetrovics. “Problematic Social Media Use: Results from a Large-Scale Nationally Representative Adolescent Sample.” PLOS ONE 12, no. 1 (January 9, 2017): e0169839. https://doi.org/10.1371/journal.pone.0169839.
Barnert, Elizabeth, Joseph Wright, Charlene Choi, Jonathan Todres, Neal Halfon, and Advisory Committee Reimagining Children’s Rights Steering Committee and Project Team. “Reimagining Children’s Rights in the US.” JAMA Pediatrics 176, no. 12 (December 1, 2022): 1242–47. https://doi.org/10.1001/jamapediatrics.2022.3822.
Berryman, Chloe, Christopher J. Ferguson, and Charles Negy. “Social Media Use and Mental Health among Young Adults.” Psychiatric Quarterly 89, no. 2 (June 1, 2018): 307–14. https://doi.org/10.1007/s11126-017-9535-6.
Bhandari, Neeraj, and Shivani Gupta. “Trends in Mental Wellbeing of US Children, 2019–2022: Erosion of Mental Health Continued in 2022.” International Journal of Environmental Research and Public Health 21, no. 2 (February 2024): 132. https://doi.org/10.3390/ijerph21020132.
Bilu, Yonatan, Natalie Flaks-Manov, Maytal Bivas-Benita, Pinchas Akiva, Nir Kalkstein, Yoav Yehezkelli, Miri Mizrahi-Reuveni, et al. “Data-Driven Assessment of Adolescents’ Mental Health During the COVID-19 Pandemic.” Journal of the American Academy of Child & Adolescent Psychiatry 62, no. 8 (August 1, 2023): 920–37. https://doi.org/10.1016/j.jaac.2022.12.026.
Bleakley, Paul, Elena Martellozzo, Ruth Spence, and Jeffrey DeMarco. “Moderating Online Child Sexual Abuse Material (CSAM): Does Self-Regulation Work, or Is Greater State Regulation Needed?” European Journal of Criminology 21, no. 2 (March 1, 2024): 231–50. https://doi.org/10.1177/14773708231181361.
Blunt, Danielle, and Ariel Wolf. “Erased: The Impact of FOSTA-SESTA and the Removal of Backpage on Sex Workers.” Anti-Trafficking Review, no. 14 (April 27, 2020): 117–21. https://doi.org/10.14197/atr.201220148.
Boczkowski, Pablo, and Leah A. Lievrouw. “Bridging STS and Communication Studies: Scholarship on Media and Information Technologies.” The Handbook of Science and Technology Studies 3rd Edition (2008): 949–77.
Brewin, Mark. “Why Elizabeth Eisenstein Might Have Been a Technological Determinist: And Why, in the End, It Does Not Really Matter.” Explorations in Media Ecology 16, no. 4 (December 1, 2017): 287–302. https://doi.org/10.1386/eme.16.4.287_1.
Bridge, Jeffrey A., Joel B. Greenhouse, Donna Ruch, Jack Stevens, John Ackerman, Arielle H. Sheftall, Lisa M. Horowitz, Kelly J. Kelleher, and John V. Campo. “Association Between the Release of Netflix’s 13 Reasons Why and Suicide Rates in the United States: An Interrupted Time Series Analysis.” Journal of the American Academy of Child & Adolescent Psychiatry 59, no. 2 (February 1, 2020): 236–43. https://doi.org/10.1016/j.jaac.2019.04.020.
Bryant, Jennings, and Bruce W. Finklea. Fundamentals of Media Effects. Waveland Press, 2022.
Butkowski, Chelsea P., Travis L. Dixon, and Kristopher Weeks. “Body Surveillance on Instagram: Examining the Role of Selfie Feedback Investment in Young Adult Women’s Body Image Concerns.” Sex Roles: A Journal of Research 81, no. 5–6 (2019): 385–97. https://doi.org/10.1007/s11199-018-0993-6.
Cao, Xiongfei, Mingchuan Gong, Lingling Yu, and Bao Dai. “Exploring the Mechanism of Social Media Addiction: An Empirical Study from WeChat Users.” Internet Research 30, no. 4 (January 1, 2020): 1305–28. https://doi.org/10.1108/INTR-08-2019-0347.
Caplan, Scott E. “Theory and Measurement of Generalized Problematic Internet Use: A Two-Step Approach.” Computers in Human Behavior, Advancing Educational Research on Computer-supported Collaborative Learning (CSCL) through the use of gStudy CSCL Tools, 26, no. 5 (September 1, 2010): 1089–97. https://doi.org/10.1016/j.chb.2010.03.012.
Casale, Silvia, Laura Rugai, and Giulia Fioravanti. “Exploring the Role of Positive Metacognitions in Explaining the Association between the Fear of Missing out and Social Media Addiction.” Addictive Behaviors 85 (October 1, 2018): 83–87. https://doi.org/10.1016/j.addbeh.2018.05.020.
Cavazos, Jacqueline G., P. Jonathon Phillips, Carlos D. Castillo, and Alice J. O’Toole. “Accuracy Comparison Across Face Recognition Algorithms: Where Are We on Measuring Race Bias?” IEEE Transactions on Biometrics, Behavior, and Identity Science 3, no. 1 (January 2021): 101–11. https://doi.org/10.1109/TBIOM.2020.3027269.
Cemiloglu, Deniz, Mohamed Basel Almourad, John McAlaney, and Raian Ali. “Combatting Digital Addiction: Current Approaches and Future Directions.” Technology in Society 68 (February 1, 2022): 101832. https://doi.org/10.1016/j.techsoc.2021.101832.
Centers for Disease Control & Prevention. “Youth Risk Behavior Survey Data Summary & Trends Report: 2011-2021.” Youth Risk Behavior Survey (YRSB). Atlanta, GA: Centers for Disease Control and Prevention, National Center for HIV, Viral Hepatitis, STD, and TB Prevention, Division of Adolescent and School Health, 2023. https://www.cdc.gov/healthyyouth/data/yrbs/pdf/YRBS_Data-Summary-Trends_Report2023_508.pdf.
Charlson, Fiona, Suhailah Ali, Tarik Benmarhnia, Madeleine Pearl, Alessandro Massazza, Jura Augustinavicius, and James G. Scott. “Climate Change and Mental Health: A Scoping Review.” International Journal of Environmental Research and Public Health 18, no. 9 (January 2021): 4486. https://doi.org/10.3390/ijerph18094486.
Chatlani, Neeraj, Arianna Davis, Karla Badillo-Urquiola, Elizabeth Bonsignore, and Pamela Wisniewski. “Teen as Research-Apprentice: A Restorative Justice Approach for Centering Adolescents as the Authority of Their Own Online Safety.” International Journal of Child-Computer Interaction 35 (March 1, 2023): 100549. https://doi.org/10.1016/j.ijcci.2022.100549.
Clark, LaToya Baldwin. “The Critical Racialization of Parents’ Rights.” Yale Law Journal 132, no. 7 (2022–2023): 2139–2204.
Clayton, Ellen Wright, Peter J Embí, and Bradley A Malin. “Dobbs and the Future of Health Data Privacy for Patients and Healthcare Organizations.” Journal of the American Medical Informatics Association 30, no. 1 (January 1, 2023): 155–60. https://doi.org/10.1093/jamia/ocac155.
Cohen, Rachel, Toby Newton-John, and Amy Slater. “The Relationship between Facebook and Instagram Appearance-Focused Activities and Body Image Concerns in Young Women.” Body Image 23 (December 2017): 183–87. https://doi.org/10.1016/j.bodyim.2017.10.002.
Cohen, Stanley. Folk Devils and Moral Panics. London: MacGibbon and Kee Ltd, 1972.
Crandon, Tara J., James G. Scott, Fiona J. Charlson, and Hannah J. Thomas. “A Social–Ecological Perspective on Climate Anxiety in Children and Adolescents.” Nature Climate Change 12, no. 2 (February 2022): 123–31. https://doi.org/10.1038/s41558-021-01251-y.
Crisis Text Line and Common Good Labs. “What Do Young People in Crisis Need from Their Communities? Solutions to the Epidemic of Depression and Suicide among Adolescents in the United States.” New York, NY: Crisis Text Line, February 8, 2024. https://www.crisistextline.org/blog/2024/02/08/six-things-adolescents-need/.
Curtin, Sally C., and Matthew F. Garnett. “Suicide and Homicide Death Rates Among Youth and Young Adults Aged 10–24: United States, 2001–2021.” NCHS Data Brief. Atlanta, GA: Centers for Disease Control and Prevention, June 12, 2023. https://doi.org/10.15620/cdc:128423.
Daly, Michael. “Prevalence of Depression Among Adolescents in the U.S. From 2009 to 2019: Analysis of Trends by Sex, Race/Ethnicity, and Income.” Journal of Adolescent Health 70, no. 3 (March 1, 2022): 496–99. https://doi.org/10.1016/j.jadohealth.2021.08.026.
Davis, Adam C., Steven Arnocky, and Mirella Stroink. “The Problem of Overpopulation: Proenvironmental Concerns and Behavior Predict Reproductive Attitudes.” Ecopsychology 11, no. 2 (June 2019): 92–100. https://doi.org/10.1089/eco.2018.0068.
DeCamp, Whitney, and Christopher J. Ferguson. “The Impact of Degree of Exposure to Violent Video Games, Family Background, and Other Factors on Youth Violence.” Journal of Youth and Adolescence 46, no. 2 (February 2017): 388–400. https://doi.org/10.1007/s10964-016-0561-8.
Deckard, Faith M., Bridget J. Goosby, and Jacob E. Cheadle. “Debt Stress, College Stress: Implications for Black and Latinx Students’ Mental Health.” Race and Social Problems 14, no. 3 (September 1, 2022): 238–53. https://doi.org/10.1007/s12552-021-09346-z.
Di Pietro, Giorgio. “The Impact of Covid-19 on Student Achievement: Evidence from a Recent Meta-Analysis.” Educational Research Review 39 (May 1, 2023): 100530. https://doi.org/10.1016/j.edurev.2023.100530.
Directorate-General for Communications Networks, Content and Technology (European Commission). The Digital Services Act (DSA) Explained: Measures to Protect Children and Young People Online. Publications Office of the European Union, 2023. https://data.europa.eu/doi/10.2759/576008.
Dreßing, Harald, Josef Bailer, Anne Anders, Henriette Wagner, and Christine Gallas. “Cyberstalking in a Large Sample of Social Network Users: Prevalence, Characteristics, and Impact Upon Victims.” Cyberpsychology, Behavior, and Social Networking 17, no. 2 (February 2014): 61–67. https://doi.org/10.1089/cyber.2012.0231.
Drotner, Kirsten. “Dangerous Media? Panic Discourses and Dilemmas of Modernity.” Paedagogica Historica 35, no. 3 (January 1, 1999): 593–619. https://doi.org/10.1080/0030923990350303.
Edlund, Mark J., Valerie L. Forman-Hoffman, Cherie R. Winder, David C. Heller, Larry A. Kroutil, Rachel N. Lipari, and Lisa J. Colpe. “Opioid Abuse and Depression in Adolescents: Results from the National Survey on Drug Use and Health.” Drug and Alcohol Dependence 152 (July 1, 2015): 131–38. https://doi.org/10.1016/j.drugalcdep.2015.04.010.
Eisenstein, Elizabeth L. The Printing Press as an Agent of Change. Cambridge: Cambridge University Press, 1980.
Felfe, Christina, Judith Saurer, Patrick Schneider, Judith Vornberger, Michael Erhart, Anne Kaman, and Ulrike Ravens-Sieberer. “The Youth Mental Health Crisis: Quasi-Experimental Evidence on the Role of School Closures.” Science Advances 9, no. 33 (August 18, 2023): eadh4030. https://doi.org/10.1126/sciadv.adh4030.
Ferguson, Christopher J. “The School Shooting/Violent Video Game Link: Causal Relationship or Moral Panic?” Journal of Investigative Psychology and Offender Profiling 5, no. 1–2 (2008): 25–37. https://doi.org/10.1002/jip.76.
Ferguson, Christopher J., and James D. Ivory. “School Shootings: Mediatized Violence in a Global Age.” In Studies in Media and Communications, vol. 7, edited by Glenn W. Muschert and Johanna Sumiala, 47–67. Emerald Group Publishing Limited, 2012. https://doi.org/10.1108/S2050-2060(2012)0000007007.
Ferguson, Christopher J., Linda K. Kaye, Dawn Branley-Bell, Patrick Markey, James D. Ivory, Dana Klisanin, Malte Elson, et al. “Like This Meta-Analysis: Screen Media and Mental Health.” Professional Psychology: Research and Practice 53, no. 2 (2022): 205–14. https://doi.org/10.1037/pro0000426.
“Open Letter from Parents of Trans and Gender Expansive Kids: KOSA Would Make Our Kids Less Safe.” Accessed May 6, 2024. https://transparentsletter.com.
Finkelhor, David, Richard Ormrod, Heather Turner, and Sherry L. Hamby. “The Victimization of Children and Youth: A Comprehensive, National Survey.” Child Maltreatment 10, no. 1 (February 1, 2005): 5–25. https://doi.org/10.1177/1077559504271287.
Finkelhor, David, Kerryann Walsh, Lisa Jones, Kimberly Mitchell, and Anne Collier. “Youth Internet Safety Education: Aligning Programs With the Evidence Base.” Trauma, Violence, & Abuse 22, no. 5 (December 2021): 1233–47. https://doi.org/10.1177/1524838020916257.
Fisk, Nathan W. Framing Internet Safety: The Governance of Youth Online. John D. and Catherine T. MacArthur Foundation Series on Digital Media and Learning. Cambridge, Massachusetts: MIT Press, 2016.
Foran, Samantha R. “Parents’ Rights or Parents’ Wrongs?: The Political Weaponization of Parental Rights to Control Public Education.” Wisconsin Law Review, 2022, 1513.
Foster, Sarah, Karen Villanueva, Lisa Wood, Hayley Christian, and Billie Giles-Corti. “The Impact of Parents’ Fear of Strangers and Perceptions of Informal Social Control on Children’s Independent Mobility.” Health & Place 26 (March 1, 2014): 60–68. https://doi.org/10.1016/j.healthplace.2013.11.006.
Francis, Jacinta, Karen Martin, Lisa Wood, and Sarah Foster. “‘I’ll Be Driving You to School for the Rest of Your Life’: A Qualitative Study of Parents’ Fear of Stranger Danger.” Journal of Environmental Psychology 53 (November 1, 2017): 112–20. https://doi.org/10.1016/j.jenvp.2017.07.004.
Frank, Gillian. “‘The Civil Rights of Parents’: Race and Conservative Politics in Anita Bryant’s Campaign against Gay Rights in 1970s Florida.” Journal of the History of Sexuality 22, no. 1 (January 2013): 126–60. https://doi.org/10.7560/JHS22106.
Gambon, Thresia B., and Janna R. Gewirtz O’Brien. “Runaway Youth: Caring for the Nation’s Largest Segment of Missing Children.” Pediatrics 145, no. 2 (February 1, 2020): e20193752. https://doi.org/10.1542/peds.2019-3752.
George, Madeleine J., Michaeline R. Jensen, Michael A. Russell, Anna Gassman-Pines, William E. Copeland, Rick H. Hoyle, and Candice L. Odgers. “Young Adolescents’ Digital Technology Use, Perceived Impairments, and Well-Being in a Representative Sample.” The Journal of Pediatrics 219 (April 2020): 180–87. https://doi.org/10.1016/j.jpeds.2019.12.002.
Ghosh, Arup Kumar, Karla Badillo-Urquiola, Shion Guha, Joseph J. LaViola Jr, and Pamela J. Wisniewski. “Safety vs. Surveillance: What Children Have to Say about Mobile Apps for Parental Control.” In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, 1–14. CHI ’18. New York, NY, USA: Association for Computing Machinery, 2018. https://doi.org/10.1145/3173574.3173698.
Ghosh, Arup Kumar, Karla Badillo-Urquiola, Mary Beth Rosson, Heng Xu, John M. Carroll, and Pamela J. Wisniewski. “A Matter of Control or Safety? Examining Parental Use of Technical Monitoring Apps on Teens’ Mobile Devices.” In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, 1–14. CHI ’18. New York, NY, USA: Association for Computing Machinery, 2018. https://doi.org/10.1145/3173574.3173768.
Godwin, Samantha. “Against Parental Rights.” Columbia Human Rights Law Review 47 (March 1, 2015): 1–83.
Griffiths, Mark D. “Adolescent Social Networking: How Do Social Media Operators Facilitate Habitual Use?” Education and Health 36, no. 3 (December 3, 2018): 66–69.
Grossman, Shelby, Riana Pfefferkorn, David Thiel, Sara Shah, Renée DiResta, John Perrino, Elena Cryst, and Alex Stamos. “The Strengths and Weaknesses of the Online Child Safety Ecosystem,” Stanford Digital Repository, 2024. https://doi.org/10.25740/pr592kc5483.
Haidt, Jonathan, Jean Twenge, and Zach Rausch. “Adolescent Mood Disorders since 2010: A Collaborative Review.” Unpublished Manuscript. New York University, Ongoing. https://tinyurl.com/TeenMentalHealthReview.
Heffer, Taylor, Marie Good, Owen Daly, Elliott MacDonell, and Teena Willoughby. “The Longitudinal Association Between Social-Media Use and Depressive Symptoms Among Adolescents and Young Adults: An Empirical Reply to Twenge et al. (2018).” Clinical Psychological Science 7, no. 3 (May 1, 2019): 462–70. https://doi.org/10.1177/2167702618812727.
Heins, Marjorie. Not in Front of the Children: “Indecency,” Censorship, and the Innocence of Youth. New Brunswick, N.J: Rutgers University Press, 2007.
Helm, Sabrina, Joya A. Kemper, and Samantha K. White. “No Future, No Kids–No Kids, No Future?” Population and Environment 43, no. 1 (September 1, 2021): 108–29. https://doi.org/10.1007/s11111-021-00379-5.
Heritage Foundation [@Heritage]. “And Yes, Not Only Is It a Social Contagion, but as @SarahPPerry Explains, This Poisonous Ideology Is Worsening at Lighting Speed. Https://T.Co/1wVyUcLH3X.” Tweet. Twitter, May 21, 2023. https://twitter.com/Heritage/status/1660111878738067457.
Hickman, Caroline, Elizabeth Marks, Panu Pihkala, Susan Clayton, R. Eric Lewandowski, Elouise E. Mayall, Britt Wray, Catriona Mellor, and Lise van Susteren. “Climate Anxiety in Children and Young People and Their Beliefs about Government Responses to Climate Change: A Global Survey.” The Lancet Planetary Health 5, no. 12 (December 1, 2021): e863–73. https://doi.org/10.1016/S2542-5196(21)00278-3.
Jaffe, Sarah. “It’ll Take a Movement: Organizing at Amazon after Bessemer.” New Labor Forum 30, no. 3 (September 1, 2021): 30–37. https://doi.org/10.1177/10957960211035077.
Jang, Yujin, and Bomin Ko. “Online Safety for Children and Youth under the 4Cs Framework—A Focus on Digital Policies in Australia, Canada, and the UK.” Children 10, no. 8 (August 2023): 1415. https://doi.org/10.3390/children10081415.
Jenkins, Philip. Moral Panic: Changing Concepts of the Child Molester in Modern America. Yale University Press, 2004.
Kaewpradub, Natthakarn, Komsan Kiatrungrit, Sirichai Hongsanguansri, and Chosita Pavasuthipaisit. “Association Among Internet Usage, Body Image and Eating Behaviors of Secondary School Students.” Shanghai Archives of Psychiatry 29, no. 4 (August 8, 2017): 208. https://doi.org/10.11919/j.issn.1002-0829.216092.
Kauhanen, Laura, Wan Mohd Azam Wan Mohd Yunus, Lotta Lempinen, Kirsi Peltonen, David Gyllenberg, Kaisa Mishina, Sonja Gilbert, Kalpana Bastola, June S. L. Brown, and Andre Sourander. “A Systematic Review of the Mental Health Changes of Children and Young People before and during the COVID-19 Pandemic.” European Child & Adolescent Psychiatry 32, no. 6 (June 1, 2023): 995–1013. https://doi.org/10.1007/s00787-022-02060-0.
Kavanaugh, Megan L., and Amy Friedrich-Karnik. “Has the Fall of Roe Changed Contraceptive Access and Use? New Research from Four US States Offers Critical Insights.” Health Affairs Scholar, February 8, 2024. https://doi.org/10.1093/haschl/qxae016.
Keyes, Katherine M., Dahsan Gary, Patrick M. O’Malley, Ava Hamilton, and John Schulenberg. “Recent Increases in Depressive Symptoms among US Adolescents: Trends from 1991 to 2018.” Social Psychiatry and Psychiatric Epidemiology 54, no. 8 (August 1, 2019): 987–96. https://doi.org/10.1007/s00127-019-01697-8.
Keyes, Katherine M., and Jonathan M. Platt. “Annual Research Review: Sex, Gender, and Internalizing Conditions among Adolescents in the 21st Century – Trends, Causes, Consequences.” Journal of Child Psychology and Psychiatry 65, no. 4 (2024): 384–407. https://doi.org/10.1111/jcpp.13864.
Kim, Jinhee, and Swarn Chatterjee. “Student Loans, Health, and Life Satisfaction of US Households: Evidence from a Panel Study.” Journal of Family and Economic Issues 40, no. 1 (March 1, 2019): 36–50. https://doi.org/10.1007/s10834-018-9594-3.
Klobuchar, Amy. “Big Tech and the Online Child Sexual Exploitation Crisis.” Dirksen Senate Office Building Room G50, January 31, 2024.
Krumsvik, Rune Johan. “Screenagers, Social Media, Screen Time, and Mental (Ill) Health.” Nordic Journal of Digital Literacy 18, no. 2 (June 13, 2023): 81–84. https://doi.org/10.18261/njdl.18.2.1.
Ledbetter, Andrew M., Olivia M. Lavin, and Eryn N. Bostwick. “Applying Relational Turbulence Theory to Parent-Child Political Conversations: The Role of (Dis)Agreement About Christian Nationalism.” Journal of Family Communication 0, no. 0 (2024): 1–20. https://doi.org/10.1080/15267431.2024.2328141.
Lemish, Dafna. “The Social Media (Moral) Panic This Time: Why CAM Scholars May Need a More Complex Approach.” Journal of Children and Media 17, no. 3 (July 3, 2023): 271–77. https://doi.org/10.1080/17482798.2023.2235159.
Littman, Lisa. “Parent Reports of Adolescents and Young Adults Perceived to Show Signs of a Rapid Onset of Gender Dysphoria.” PLOS ONE 13, no. 8 (August 16, 2018): e0202330. https://doi.org/10.1371/journal.pone.0202330.
Lohmann, Sophie, and Emilio Zagheni. “Multi-Platform Social Media Use: Little Evidence of Impacts on Adult Well-Being.” OSF, June 10, 2020. https://doi.org/10.31234/osf.io/r46nd.
Looney, Martin, and Bob Duff. AN ACT CONCERNING ONLINE PRIVACY, DATA AND SAFETY PROTECTIONS, Public Act No. 23-56 (2023).
Lu, Wenhua, and Katherine M. Keyes. “Major Depression with Co-Occurring Suicidal Thoughts, Plans, and Attempts: An Increasing Mental Health Crisis in US Adolescents, 2011–2020.” Psychiatry Research 327 (September 1, 2023): 115352. https://doi.org/10.1016/j.psychres.2023.115352.
Lukianoff, Greg, and Jonathan Haidt. The Coddling of the American Mind: How Good Intentions and Bad Ideas Are Setting up a Generation for Failure. New York: Penguin, 2019.
Lundahl, Outi. “Media Framing of Social Media Addiction in the UK and the US.” International Journal of Consumer Studies 45, no. 5 (2021): 1103–16. https://doi.org/10.1111/ijcs.12636.
Mader, Rodney. “Print Culture Studies and Technological Determinism.” College Literature 36, no. 2 (2009): 131–40.
Markey, Patrick M, and Christopher J Ferguson. “The Violent Video Game Moral Panic and the Politics of Game Research.” American Journal of Play 10, no. 1 (Fall 2017): 99–115.
Markey, Patrick M., Charlotte N. Markey, and Juliana E. French. “Violent Video Games and Real-World Violence: Rhetoric versus Data.” Psychology of Popular Media Culture 4, no. 4 (2015): 277.
Marwick, Alice E., and danah boyd. “Networked Privacy: How Teenagers Negotiate Context in Social Media.” New Media & Society 16, no. 7 (2014): 1051–67.
Marzi, Isabel, and Anne Kerstin Reimers. “Children’s Independent Mobility: Current Knowledge, Future Directions, and Public Health Implications.” International Journal of Environmental Research and Public Health 15, no. 11 (November 2018): 2441. https://doi.org/10.3390/ijerph15112441.
Mavoa, Jane, Simon Coghlan, and Bjørn Nansen. “‘It’s About Safety Not Snooping’: Parental Attitudes to Child Tracking Technologies and Geolocation Data.” Surveillance & Society 21, no. 1 (March 16, 2023): 45–60. https://doi.org/10.24908/ss.v21i1.15719.
McCarroll, Estefania. “Weapons of Mass Deportation: Big Data and Automated Decision-Making Systems in Immigration Law.” Georgetown Immigration Law Journal 34 (2019): 705.
McKell, Michael, and Jordan Teuscher. SOCIAL MEDIA REGULATION AMENDMENTS, Pub. L. No. S.B. 194, 13 (2024).
McMullen, Heather, and Katharine Dow. “Ringing the Existential Alarm: Exploring BirthStrike for Climate.” Medical Anthropology 41, no. 6–7 (October 3, 2022): 659–73. https://doi.org/10.1080/01459740.2022.2083510.
Meier, Evelyn P., and James Gray. “Facebook Photo Activity Associated with Body Image Disturbance in Adolescent Girls.” Cyberpsychology, Behavior and Social Networking 17, no. 4 (April 2014): 199–206. https://doi.org/10.1089/cyber.2013.0305.
Mérelle, Saskia Y. M., Annet M. Kleiboer, Miriam Schotanus, Theresia L. M. Cluitmans, Cornelia M. Waardenburg, Danielle Kramer, Dike van de Mheen, and Tony van Rooij. “Which Health-Related Problems Are Associated with Problematic Video-Gaming or Social Media Use in Adolescents? A Large-Scale Cross-Sectional Public Health Study.” Clinical Neuropsychiatry 14, no. 1 (2017): 11–19.
Mezuk, Briana, Donovan Maust, and Kara Zivin. “A Response to the President’s Call to Support Public Mental Health.” American Journal of Preventive Medicine 63, no. 4 (October 2022): 660–63. https://doi.org/10.1016/j.amepre.2022.05.009.
Mitchell, Kimberly J., Michele L. Ybarra, Josephine D. Korchmaros, and Joseph G. Kosciw. “Accessing Sexual Health Information Online: Use, Motivations and Consequences for Youth with Different Sexual Orientations.” Health Education Research 29, no. 1 (February 2014): 147–57. https://doi.org/10.1093/her/cyt071.
Moreno, Megan, Karyn Riddle, Marina C. Jenkins, Ajay Paul Singh, Qianqian Zhao, and Jens Eickhoff. “Measuring Problematic Internet Use, Internet Gaming Disorder, and Social Media Addiction in Young Adults: Cross-Sectional Survey Study.” JMIR Public Health and Surveillance 8, no. 1 (January 27, 2022): e27719. https://doi.org/10.2196/27719.
Musto, Jennifer, Anne E. Fehrenbacher, Heidi Hoefinger, Nicola Mai, P. G. Macioti, Calum Bennachie, Calogero Giametta, and Kate D’Adamo. “Anti-Trafficking in the Time of FOSTA/SESTA: Networked Moral Gentrification and Sexual Humanitarian Creep.” Social Sciences 10, no. 2 (February 2021): 58. https://doi.org/10.3390/socsci10020058.
Musto, Jennifer Lynne, and danah boyd. “The Trafficking-Technology Nexus.” Social Politics: International Studies in Gender, State & Society 21, no. 3 (September 1, 2014): 461–83. https://doi.org/10.1093/sp/jxu018.
Nicholas, Siân, and Tom O’Malley. Moral Panics, Social Fears, and the Media: Historical Perspectives. Routledge, 2013.
Niederkrotenthaler, Thomas, Stefanie Kirchner, Benedikt Till, Mark Sinyor, Ulrich S. Tran, Jane Pirkis, and Matthew J. Spittal. “Systematic Review and Meta-Analyses of Suicidal Outcomes Following Fictional Portrayals of Suicide and Suicide Attempt in Entertainment Media.” eClinicalMedicine 36 (June 1, 2021). https://doi.org/10.1016/j.eclinm.2021.100922.
Niederkrotenthaler, Thomas, Steven Stack, Benedikt Till, Mark Sinyor, Jane Pirkis, David Garcia, Ian R. H. Rockett, and Ulrich S. Tran. “Association of Increased Youth Suicides in the United States With the Release of 13 Reasons Why.” JAMA Psychiatry 76, no. 9 (September 1, 2019): 933–40. https://doi.org/10.1001/jamapsychiatry.2019.0922.
O’Brien, Colin, and Kanako Taku. “Alpha and Beta Changes in Anxiety in Response to Mass Shooting Related Information.” Personality and Individual Differences 186 (February 1, 2022): 111326. https://doi.org/10.1016/j.paid.2021.111326.
O’Day, Emily B., and Richard G. Heimberg. “Social Media Use, Social Anxiety, and Loneliness: A Systematic Review.” Computers in Human Behavior Reports 3 (January 1, 2021): 100070. https://doi.org/10.1016/j.chbr.2021.100070.
Odgers, Candice L. “The Great Rewiring: Is Social Media Really behind an Epidemic of Teenage Mental Illness?” Nature 628, no. 8006 (March 29, 2024): 29–30. https://doi.org/10.1038/d41586-024-00902-2.
Odgers, Candice L., and Michaeline R. Jensen. “Annual Research Review: Adolescent Mental Health in the Digital Age: Facts, Fears, and Future Directions.” Journal of Child Psychology and Psychiatry 61, no. 3 (2020): 336–48. https://doi.org/10.1111/jcpp.13190.
Office of the Surgeon General. “Social Media and Youth Mental Health: The US Surgeon General’s Advisory.” Washington, D.C.: US Department of Health and Human Services, 2023. https://pubmed.ncbi.nlm.nih.gov/37721985/.
Ophir, Yaakov, Yuliya Lipshits-Braziler, and Hananel Rosenberg. “New-Media Screen Time Is Not (Necessarily) Linked to Depression: Comments on Twenge, Joiner, Rogers, and Martin (2018).” Clinical Psychological Science 8, no. 2 (March 1, 2020): 374–78. https://doi.org/10.1177/2167702619849412.
Orben, Amy, and Andrew K. Przybylski. “Reply to: Underestimating Digital Media Harm.” Nature Human Behaviour 4, no. 4 (April 2020): 349–51. https://doi.org/10.1038/s41562-020-0840-y.
———. “The Association between Adolescent Well-Being and Digital Technology Use.” Nature Human Behaviour 3, no. 2 (February 2019): 173–82. https://doi.org/10.1038/s41562-018-0506-1.
Paakkari, Leena, Jorma Tynjälä, Henri Lahti, Kristiina Ojala, and Nelli Lyyra. “Problematic Social Media Use and Health among Adolescents.” International Journal of Environmental Research and Public Health 18, no. 4 (January 2021): 1885. https://doi.org/10.3390/ijerph18041885.
Pagidas, Vasiliki. “First Amendment - Freedom of Speech - Provisions of the Communications Decency Act of 1996 Intended to Protect Minors From Exposure to Indecent and Patently Offensive Material on the Internet Violate the First Amendment - Reno v. ACLU, 117 S. Ct. 2329 (1997).” Seton Hall Constitutional Law Journal 8, no. 3 (June 1, 1998). https://scholarship.shu.edu/con_law/vol8/iss3/18.
Pan, Yuan-Chien, Yu-Chuan Chiu, and Yu-Hsuan Lin. “Systematic Review and Meta-Analysis of Epidemiology of Internet Addiction.” Neuroscience & Biobehavioral Reviews 118 (November 1, 2020): 612–22. https://doi.org/10.1016/j.neubiorev.2020.08.013.
Parodi, Katharine B., Melissa K. Holt, Jennifer Greif Green, Michelle V. Porche, Brian Koenig, and Ziming Xuan. “Time Trends and Disparities in Anxiety among Adolescents, 2012–2018.” Social Psychiatry and Psychiatric Epidemiology 57, no. 1 (January 1, 2022): 127–37. https://doi.org/10.1007/s00127-021-02122-9.
Patchin, Justin W., and Sameer Hinduja. “Sextortion Among Adolescents: Results From a National Survey of U.S. Youth.” Sexual Abuse 32, no. 1 (February 1, 2020): 30–54. https://doi.org/10.1177/1079063218800469.
Patry, William. Moral Panics and the Copyright Wars. Oxford University Press, USA, 2009.
Piao, Jianmin, Yinqiong Huang, Cheng Han, Yike Li, Yanbing Xu, Yazhuo Liu, and Xue He. “Alarming Changes in the Global Burden of Mental Disorders in Children and Adolescents from 1990 to 2019: A Systematic Analysis for the Global Burden of Disease Study.” European Child & Adolescent Psychiatry 31, no. 11 (November 1, 2022): 1827–45. https://doi.org/10.1007/s00787-022-02040-4.
Piera Pi-Sunyer, Blanca, Jack L. Andrews, Amy Orben, Lydia G. Speyer, and Sarah-Jayne Blakemore. “The Relationship between Perceived Income Inequality, Adverse Mental Health and Interpersonal Difficulties in UK Adolescents.” Journal of Child Psychology and Psychiatry 64, no. 3 (2023): 417–25. https://doi.org/10.1111/jcpp.13719.
Pisaniello, Monique Simone, Adon Toru Asahina, Stephen Bacchi, Morganne Wagner, Seth W. Perry, Ma-Li Wong, and Julio Licinio. “Effect of Medical Student Debt on Mental Health, Academic Performance and Specialty Choice: A Systematic Review.” BMJ Open 9, no. 7 (July 1, 2019): e029980. https://doi.org/10.1136/bmjopen-2019-029980.
Redbird, Beth, Laurel Harbridge-Yong, and Rachel Davis Mersey. “The Social and Political Impact of the COVID-19 Pandemic: An Introduction.” RSF: The Russell Sage Foundation Journal of the Social Sciences 8, no. 8 (December 1, 2022): 1–29. https://doi.org/10.7758/RSF.2022.8.8.01.
Restar, Arjee Javellana. “Methodological Critique of Littman’s (2018) Parental-Respondents Accounts of ‘Rapid-Onset Gender Dysphoria.’” Archives of Sexual Behavior 49, no. 1 (January 2020): 61–66. https://doi.org/10.1007/s10508-019-1453-2.
Riehm, Kira E., Ramin Mojtabai, Leslie B. Adams, Evan A. Krueger, Delvon T. Mattingly, Paul S. Nestadt, and Adam M. Leventhal. “Adolescents’ Concerns About School Violence or Shootings and Association With Depressive, Anxiety, and Panic Symptoms.” JAMA Network Open 4, no. 11 (November 1, 2021): e2132131. https://doi.org/10.1001/jamanetworkopen.2021.32131.
Roberti, Amanda. “‘Women Deserve Better:’ The Use of the Pro-Woman Frame in Anti-Abortion Policies in U.S. States.” Journal of Women, Politics & Policy 42, no. 3 (July 3, 2021): 207–24. https://doi.org/10.1080/1554477X.2021.1925478.
Romer, Daniel. “Reanalysis of the Bridge et al. Study of Suicide Following Release of 13 Reasons Why.” PLOS ONE 15, no. 1 (January 16, 2020): e0227545. https://doi.org/10.1371/journal.pone.0227545.
Rounsefell, Kim, Simone Gibson, Siân McLean, Merran Blair, Annika Molenaar, Linda Brennan, Helen Truby, and Tracy A. McCaffrey. “Social Media, Body Image and Food Choices in Healthy Young Adults: A Mixed Methods Systematic Review.” Nutrition & Dietetics 77, no. 1 (2020): 19–40. https://doi.org/10.1111/1747-0080.12581.
Rudolph, Cort W., David P. Costanza, Charlotte Wright, and Hannes Zacher. “Cross-Temporal Meta-Analysis: A Conceptual and Empirical Critique.” Journal of Business and Psychology 35, no. 6 (December 1, 2020): 733–50. https://doi.org/10.1007/s10869-019-09659-2.
Rudolph, Cort W., Rachel S. Rauvola, David P. Costanza, and Hannes Zacher. “Generations and Generational Differences: Debunking Myths in Organizational Science and Practice and Paving New Paths Forward.” Journal of Business and Psychology 36, no. 6 (December 1, 2021): 945–67. https://doi.org/10.1007/s10869-020-09715-2.
Samji, Hasina, Judy Wu, Amilya Ladak, Caralyn Vossen, Evelyn Stewart, Naomi Dove, David Long, and Gaelen Snell. “Review: Mental Health Impacts of the COVID-19 Pandemic on Children and Youth – a Systematic Review.” Child and Adolescent Mental Health 27, no. 2 (2022): 173–89. https://doi.org/10.1111/camh.12501.
Santarossa, Sara, and Sarah J. Woodruff. “#SocialMedia: Exploring the Relationship of Social Networking Sites on Body Image, Self-Esteem, and Eating Disorders.” Social Media + Society 3, no. 2 (April 1, 2017). https://doi.org/10.1177/2056305117704407.
Savage, Jon. “Demonising Those Teenage Dirtbags: The Current Moral Outcry over Drill Music Is so Last Century. Adults Have Been Scared about What the Kids Are Singing for Decades.” Index on Censorship 47, no. 2 (July 1, 2018): 66–69. https://doi.org/10.1177/0306422018784511.
Schlegel, Laurie. Provides for liability for publishers and distributors of material harmful to minors, Pub. L. No. HB 142 (2022).
Schweizer, Susanne, Rebecca P. Lawson, and Sarah-Jayne Blakemore. “Uncertainty as a Driver of the Youth Mental Health Crisis.” Current Opinion in Psychology 53 (October 1, 2023): 101657. https://doi.org/10.1016/j.copsyc.2023.101657.
Shachar, Carmel. “HIPAA, Privacy, and Reproductive Rights in a Post-Roe Era.” JAMA 328, no. 5 (August 2, 2022): 417–18. https://doi.org/10.1001/jama.2022.12510.
Shanahan, Lilly, and William E. Copeland. “Commentary: Integrative, Multi-Level Explanatory Models Are Needed to Understand Recent Trends in Sex, Gender, and Internalizing Conditions, Reflections on Keyes and Platt (2023).” Journal of Child Psychology and Psychiatry 65, no. 4 (2024): 408–12. https://doi.org/10.1111/jcpp.13957.
Shannon, Holly, Katie Bush, Paul J. Villeneuve, Kim G. C. Hellemans, and Synthia Guimond. “Problematic Social Media Use in Adolescents and Young Adults: Systematic Review and Meta-Analysis.” JMIR Mental Health 9, no. 4 (April 14, 2022): e33450. https://doi.org/10.2196/33450.
Silbermann, Jacki. “A New Voice in the Labor Movement? Organizing for Social Responsibility in the Tech Sector.” Employee Rights and Employment Policy Journal 25, no. 2 (2021): 197–256.
Silén, Yasmina, and Anna Keski-Rahkonen. “Worldwide Prevalence of DSM-5 Eating Disorders among Young People.” Current Opinion in Psychiatry 35, no. 6 (November 2022): 362. https://doi.org/10.1097/YCO.0000000000000818.
Sirois, Tyler, Fiona McFarland, Michele Rayner, Chase Tramont, and Toby Overdorf. Online Protections for Minors, Pub. L. No. HB 3, s. 501.1736, F.S. (2024).
Smith, Samantha A., and Simon A. Cole. “MyMoralPanic: Adolescents, Social Networking, and Child Sex Crime Panic.” In The Ashgate Research Companion to Moral Panics, edited by Charles Krinsky, 207–23. London: Taylor & Francis Group, 2013.
Stevens, John Paul. Janet Reno, Attorney General of the United States, et al., Appellants v. American Civil Liberties Union et al., 521 U.S. 844 (U.S. Supreme Court 1997).
Sun, Yalin, and Yan Zhang. “A Review of Theories and Models Applied in Studies of Social Media Addiction and Implications for Future Research.” Addictive Behaviors 114 (March 1, 2021): 106699. https://doi.org/10.1016/j.addbeh.2020.106699.
Sutton, Samantha, and David Finkelhor. “Perpetrators’ Identity in Online Crimes Against Children: A Meta-Analysis.” Trauma, Violence, & Abuse, August 23, 2023. https://doi.org/10.1177/15248380231194072.
Tarvin, Emily, and Mel Stanfill. “‘YouTube’s Predator Problem’: Platform Moderation as Governance-Washing, and User Resistance.” Convergence 28, no. 3 (June 1, 2022): 822–37. https://doi.org/10.1177/13548565211066490.
Trzesniewski, Kali H., M. Brent Donnellan, and Richard W. Robins. “Is ‘Generation Me’ Really More Narcissistic Than Previous Generations?” Journal of Personality 76, no. 4 (2008): 903–18. https://doi.org/10.1111/j.1467-6494.2008.00508.x.
Tsevreni, Irida, Nikolaos Proutsos, Magdalini Tsevreni, and Dimitris Tigkas. “Generation Z Worries, Suffers and Acts against Climate Crisis—The Potential of Sensing Children’s and Young People’s Eco-Anxiety: A Critical Analysis Based on an Integrative Review.” Climate 11, no. 8 (August 2023): 171. https://doi.org/10.3390/cli11080171.
Twenge, Jean M. Generation Me-Revised and Updated: Why Today’s Young Americans Are More Confident, Assertive, Entitled-and More Miserable than Ever Before. New York: Simon and Schuster, 2014.
———. Generations: The Real Differences Between Gen Z, Millennials, Gen X, Boomers, and Silents—and What They Mean for America’s Future. New York: Simon and Schuster, 2023.
———. iGen: Why Today’s Super-Connected Kids Are Growing up Less Rebellious, More Tolerant, Less Happy–and Completely Unprepared for Adulthood–and What That Means for the Rest of Us. New York: Simon and Schuster, 2017.
Twenge, Jean M., and W. Keith Campbell. The Narcissism Epidemic: Living in the Age of Entitlement. New York: Simon and Schuster, 2009.
Twenge, Jean M., Jonathan Haidt, Thomas E. Joiner, and W. Keith Campbell. “Underestimating Digital Media Harm.” Nature Human Behaviour 4, no. 4 (2020): 346–48.
Valkenburg, Patti M., Jochen Peter, and Joseph B. Walther. “Media Effects: Theory and Research.” Annual Review of Psychology 67, no. 1 (January 4, 2016): 315–38. https://doi.org/10.1146/annurev-psych-122414-033608.
Vigdal, Julia Schønning, and Kolbjørn Kallesten Brønnick. “A Systematic Review of ‘Helicopter Parenting’ and Its Relationship With Anxiety and Depression.” Frontiers in Psychology 13 (May 25, 2022): 872981. https://doi.org/10.3389/fpsyg.2022.872981.
Viljoen, Salomé. “The Promise and Limits of Lawfulness: Inequality, Law, and the Techlash.” Journal of Social Computing 2, no. 3 (September 2021): 284–96. https://doi.org/10.23919/JSC.2021.0025.
Vitale, Joseph. Requires age verification and parent or guardian consent for minor’s use of social media platform; prohibits certain messaging between adults and minors on social media platform, Senate No. 4215 (2023).
Vuorre, Matti, Amy Orben, and Andrew K. Przybylski. “There Is No Evidence That Associations Between Adolescents’ Digital Technology Engagement and Mental Health Problems Have Increased.” Clinical Psychological Science 9, no. 5 (September 1, 2021): 823–35. https://doi.org/10.1177/2167702621994549.
Vuorre, Matti, and Andrew K. Przybylski. “Estimating the Association between Facebook Adoption and Well-Being in 72 Countries.” Royal Society Open Science 10, no. 8 (August 9, 2023): 221451. https://doi.org/10.1098/rsos.221451.
Walsemann, Katrina M., Gilbert C. Gee, and Danielle Gentile. “Sick of Our Loans: Student Borrowing and the Mental Health of Young Adults in the United States.” Social Science & Medicine 124 (January 1, 2015): 85–93. https://doi.org/10.1016/j.socscimed.2014.11.027.
Warner, Benjamin R., Colleen Warner Colaner, and Jihye Park. “Political Difference and Polarization in the Family: The Role of (Non)Accommodating Communication for Navigating Identity Differences.” Journal of Social and Personal Relationships 38, no. 2 (February 1, 2021): 564–85. https://doi.org/10.1177/0265407520967438.
Wisniewski, Pamela, Arup Kumar Ghosh, Heng Xu, Mary Beth Rosson, and John M. Carroll. “Parental Control vs. Teen Self-Regulation: Is There a Middle Ground for Mobile Online Safety?” In Proceedings of the 2017 ACM Conference on Computer Supported Cooperative Work and Social Computing, 51–69. CSCW ’17. New York, NY, USA: Association for Computing Machinery, 2017. https://doi.org/10.1145/2998181.2998352.
Wolak, Janis, David Finkelhor, Wendy Walsh, and Leah Treitman. “Sextortion of Minors: Characteristics and Dynamics.” Journal of Adolescent Health 62, no. 1 (January 1, 2018): 72–79. https://doi.org/10.1016/j.jadohealth.2017.08.014.
Woods, Lorna, and William Perrin. “Online Harm Reduction – a Statutory Duty of Care and Regulator.” SSRN Scholarly Paper. Rochester, NY, April 18, 2019. https://doi.org/10.2139/ssrn.4003986.
Woodhouse, Barbara Bennett. Hidden in Plain Sight: The Tragedy of Children’s Rights from Ben Franklin to Lionel Tate. Princeton University Press, 2008. https://www.jstor.org/stable/j.ctt7rr8c.
Woodside, Jonathan, Tara Vinodrai, and Markus Moos. “Bottom-up Strategies, Platform Worker Power and Local Action: Learning from Ridehailing Drivers.” Local Economy 36, no. 4 (June 1, 2021): 325–43. https://doi.org/10.1177/02690942211040170.
Wyatt, Sally. “Technological Determinism Is Dead; Long Live Technological Determinism.” In The Handbook of Science and Technology Studies, 3rd ed., 165–80. Cambridge, MA: MIT Press, 2008.
Yellowlees, R., A. E. Dingemans, J. Veldhuis, and A. J. D. Bij de Vaate. “Face Yourself(ie): Investigating Selfie-Behavior in Females with Severe Eating Disorder Symptoms.” Computers in Human Behavior 101 (December 1, 2019): 77–83. https://doi.org/10.1016/j.chb.2019.07.018.
Zhan, Min. “Financial Stress and Hardship among Young Adults: The Role of Student Loan Debt.” Journal of Sociology & Social Welfare 49, no. 3 (2022): 84–111.