The Race To Age Verification: Protecting Young Americans Online

Concerns for Young Americans Online

For years, widely used apps such as Instagram and TikTok have stirred concern among parents, advocacy groups, and regulators. These platforms are accused of delivering harmful content to teenagers, content that promotes bullying, drug use, eating disorders, self-harm, and suicide. Emma Lembke, a youth activist now a college sophomore, testified about creating her first Instagram account at the age of 12. She described how features like endless scrolling and autoplay lured her into spending five to six hours a day mindlessly scrolling, leading to depression, anxiety, and disordered eating.

“Teens who use social media for more than three hours a day face double the risk of depression and anxiety symptoms,” cautioned the Surgeon General, Dr. Vivek Murthy. He emphasized, “the average amount of time that kids use social media is 3 1/2 hours a day,” during his conversation with Morning Edition host Steve Inskeep.

According to a recent study cited in the U.S. Surgeon General’s Advisory, up to 95% of youths aged 13-17 use social media, and over a third of them say they use it “almost constantly.” A related report on teen exposure to online pornography in the United States indicates that many teenagers aged 13-17 have encountered explicit content online, with some first encountering it as young as age 10. Significantly, much of this exposure is intentional rather than accidental. These findings underscore the need for open discussions with teenagers about pornography, much like conversations about sex, social media, and substance use.

The regulations described below reflect a growing consensus that the internet as it stands is failing to protect children, and that age verification is needed to shield them from explicit content and harmful social media influences.

Federal Legislation Aims to Protect Children Online

On July 27, 2023, the Senate Commerce Committee unanimously approved two key bills: the Children and Teens’ Online Privacy Protection Act (COPPA 2.0) and the Kids Online Safety Act (KOSA). President Biden has expressed support for both measures and encouraged Congress to pass them. These bipartisan bills aim to compel online platforms to prioritize children’s well-being in tech product design and to extend privacy protections to teenagers, marking a significant milestone.

KOSA (Kids Online Safety Act)

The Kids Online Safety Act, originally introduced by Senators Richard Blumenthal (D-CT) and Marsha Blackburn (R-TN) in 2022, was reintroduced in May 2023. The revised bill, which has now cleared the Committee, aims to set legal standards for safeguarding minors, specifically those under 17, from online harms such as content promoting self-harm, eating disorders, and substance abuse. Notable amendments emphasize online service providers’ “duty of care” to prioritize children’s protection. Senator John Thune (R-SD) introduced an important amendment requiring companies to inform users when content is filtered through algorithms and to offer an opt-out option.

KOSA is designed to establish a new legal standard for the Federal Trade Commission and state attorneys general, enabling them to regulate companies that fail to prevent children from accessing harmful content on their platforms. Its sponsors assert that the legislation will shield children from content glamorizing harmful behaviors such as eating disorders, suicidal ideation, substance abuse, and gambling. The bill also restricts children aged 13 and under from using social media and requires parental consent for users under 17.

Key Provisions:

  • The bill outlines requirements to protect minors from online harms.
  • These requirements apply to covered platforms, excluding entities such as internet service providers and educational institutions.
  • Covered platforms must prioritize minors’ best interests, mitigating potential harms like sexual exploitation and online bullying.
  • They must provide safeguards for minors and tools for parents or guardians to supervise platform usage, including privacy and account settings control.
  • Specific provisions include the disclosure of information, reporting of harms, prohibition of advertising age-restricted products to minors, and annual reporting on potential risks to minors.
  • Enforcement is entrusted to the Federal Trade Commission and state authorities.
  • The bill establishes a program for noncommercial independent research and a council for implementation guidance, and it mandates guidance on market and product research involving minors as well as an assessment of age verification options.
  • During the bill’s markup, Senator Blackburn introduced an amendment addressing concerns raised by digital rights groups, particularly language related to user age verification. Although these changes were approved, concerns persist about potential data collection requirements to comply with other rules outlined in the bill.

COPPA 2.0

COPPA 2.0, an update to the 1998 Children’s Online Privacy Protection Act (COPPA), extends the original law’s restrictions on internet companies collecting and using personal information without consent, raising the covered age from children under 13 to teens up to 16. It also prohibits platforms from targeting ads to kids. Formally titled the Children and Teens’ Online Privacy Protection Act, the bill was initially introduced in 2021.

Key Provisions:

  • Extends COPPA protections from children under 13 to children up to 16.
  • Bans targeted ads to minors.
  • Requires parental consent for collecting personal information from children.
  • Ensures transparency and security of children’s personal information.
  • Encourages self-regulation through a “safe harbor” provision for industry groups to seek compliance approval.

State-by-State Age Verification Mandates

Arkansas SB 396

Arkansas – SB 396 Passed 04/11/2023 and Effective 09/01/2023 

The Arkansas bill establishes very specific and restrictive criteria, including required age verification methods, for classifying platforms. It covers companies with subscription-based models, except for those primarily focused on social interaction, non-educational short-video creation, gaming, cloud storage, and professional development. The state’s definition also exempts platforms offering services such as email, direct messaging, streaming, news, sports, entertainment, online shopping, document collaboration, or commenting on news websites. These stringent criteria, and the short-form video exclusion in particular, may leave most major social media platforms outside the definition entirely.

California AB 2273 (AADC or Age-Appropriate Design Code Act)

California – AB 2273 Passed 09/15/2022 and Effective 07/01/2024

The California Age-Appropriate Design Code Act (AADC), or Assembly Bill No. 2273, introduces stringent regulations for online services directed at children, effective July 1, 2024. The rules apply if a service is aimed at children, used by children, features child-focused ads, resembles a child-accessed service, or includes child-friendly design elements. Businesses must conduct a Data Protection Impact Assessment, establish stringent age-appropriate data practices, and maintain highly protective default privacy settings. Clear, child-friendly policies and limitations on child data usage are obligatory. A working group will provide AADC implementation guidance, and the Attorney General can enforce compliance with penalties of up to $7,500 per affected child for intentional violations. The law prioritizes children’s online safety and privacy while emphasizing age assurance methods that align with privacy and data protection principles. It’s worth noting that in December 2022, NetChoice, an association of online services and platforms, filed a lawsuit seeking to overturn the law.
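
For engineering teams wondering what “highly protective default privacy settings” could look like in practice, here is a minimal sketch. Every name and value below is an illustrative assumption, not language from the statute:

```typescript
// Hypothetical sketch: apply the most protective defaults whenever a user is,
// or may be, a child. Field names and thresholds are illustrative only.
interface PrivacySettings {
  profileVisibility: "private" | "public";
  geolocationEnabled: boolean;
  personalizedAdsEnabled: boolean;
  directMessagesFrom: "no_one" | "contacts" | "anyone";
}

// Most-protective defaults for accounts that may belong to children.
const CHILD_DEFAULTS: PrivacySettings = {
  profileVisibility: "private",
  geolocationEnabled: false,
  personalizedAdsEnabled: false,
  directMessagesFrom: "contacts",
};

function defaultSettingsFor(estimatedAge: number | null): PrivacySettings {
  // When age is unknown, fail closed and treat the user as a potential minor.
  if (estimatedAge === null || estimatedAge < 18) {
    return { ...CHILD_DEFAULTS };
  }
  return {
    profileVisibility: "public",
    geolocationEnabled: false,
    personalizedAdsEnabled: true,
    directMessagesFrom: "anyone",
  };
}
```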

Louisiana HB 142

Louisiana – HB 142 Passed 06/15/2022 and Effective 01/01/2023

Louisiana’s House Bill No. 142, enacted as Act No. 440, establishes provisions related to material harmful to minors. It provides a civil remedy for damages against commercial entities that distribute such material online, addressing the potential harm to minors from early exposure to explicit content. The law requires reasonable age verification methods for those seeking access to such material and holds non-compliant entities liable. It also outlines exceptions, defines key terms, and makes clear that legitimate news outlets are unaffected. Internet service providers and related entities are not held responsible for content they do not create.

Mississippi SB 2346

Mississippi – SB 2346 Passed and Effective 07/01/2023

Mississippi’s law affects commercial entities that publish or distribute “material harmful to minors” on the internet, which refers to explicit content that is sexual in nature and lacks any redeeming literary, artistic, political, or scientific value for minors. It requires these entities to have reasonable age verification systems in place to confirm that individuals accessing the material are 18 years of age or older. If they fail to do so, they may be held liable for damages resulting from a minor’s access to the material. The law also specifies definitions, including what constitutes “material harmful to minors” and “reasonable age verification methods.” It exempts news-gathering organizations and certain internet service providers and search engines from liability under this act. 

Montana SB 544

Montana – SB 544 Passed 05/19/2023 and Effective 01/01/2024

Montana’s law safeguards minors from harmful explicit content online, typically sexual, graphic, or pornographic material. It mandates that commercial entities employ reasonable age verification methods to confirm that users are 18 or older. Non-compliance can lead to liability, including legal costs and attorney fees. The law exempts news organizations, specific internet providers, and search engines; requires the Department of Justice to report enforcement activities annually; and provides clear definitions for key terms.

Texas HB 1181

Texas – HB 1181 Passed 06/12/2023 and Effective 09/01/2023

This Texas law introduces measures to restrict minors’ access to pornographic material on websites. It places the responsibility on organizations to implement age verification mechanisms that block users under 18 from accessing explicit content, and individuals uploading such material can be held accountable if it is accessed by minors. The law applies to actions occurring after its effective date.

Texas HB 18

Texas – HB 18 Passed 06/13/2023 and Effective 09/01/2024

This law covers “digital service providers” that facilitate social interaction through user profiles and distribute content via messages, pages, videos, feeds, music, animation, and instructional activities tailored toward minors. It introduces various provisions, including restrictions on collecting personal identifying information from minors, requirements for parental tools and consent, transparency in advertising and algorithms, and a prohibition on limiting or discontinuing digital services based on data collection consent. Parents or guardians can take legal action against digital service providers for violations, with potential remedies including injunctive relief, damages, attorney’s fees, and court costs. Violations of this law are considered deceptive trade practices.

Utah SB 287

Utah – SB 287 Passed 03/14/2023 and Effective 05/03/2023

S.B. 287 is a Utah law addressing age requirements for online pornography viewers. It imposes responsibilities and liabilities on commercial entities providing explicit or harmful content, requiring them to verify the age of users seeking access to such material. The law introduces definitions for various terms, outlines rules for data retention, and holds publishers and distributors liable for failing to comply with age verification requirements. Importantly, the legislation exempts internet service providers and hosting entities from liability for content they do not create, aiming to regulate online access to explicit content while safeguarding certain online intermediaries.

Utah SB 152

Utah – SB 152 Passed 03/23/2023 and Effective 05/03/2023

Utah’s definition of a “social media platform” excludes platforms primarily designed for functions such as email, private messaging, licensed streaming, non-user-generated news and sports content, gaming, e-commerce, and similar features; the law targets platforms that emphasize social interaction and content sharing. Under the law, social media companies must verify the age of Utah residents who wish to create or maintain an account, and minors under 18 need parental or guardian consent to use these platforms. Beginning March 1, 2024, social media companies must confirm the age, and where applicable the parental consent, of their Utah account holders. New users must provide this information during registration, while existing account holders have a 14-day window to complete the verification process if they have not already done so. Failure to meet these requirements results in denied access to the account until compliance is achieved.
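
To make that timeline concrete, here is a minimal sketch of the access decision described above. The 14-day grace window mirrors the law’s requirement; the account fields, function names, and return values are assumptions for illustration only:

```typescript
// Sketch of the SB 152 flow: new accounts verify at registration, while
// pre-existing accounts get a 14-day window before access is denied.
const GRACE_PERIOD_DAYS = 14;
const MS_PER_DAY = 24 * 60 * 60 * 1000;

interface Account {
  createdAt: Date;      // when the account was registered
  ageVerified: boolean; // age (and parental consent, for minors) confirmed
}

type AccessDecision = "allow" | "prompt_verification" | "deny";

function checkAccess(account: Account, enforcementDate: Date, now: Date): AccessDecision {
  if (account.ageVerified) {
    return "allow";
  }
  // Accounts created on or after the enforcement date should have verified
  // during registration, so an unverified new account is denied outright.
  if (account.createdAt.getTime() >= enforcementDate.getTime()) {
    return "deny";
  }
  // Pre-existing accounts keep access while the 14-day window is open.
  const elapsedDays = (now.getTime() - enforcementDate.getTime()) / MS_PER_DAY;
  return elapsedDays <= GRACE_PERIOD_DAYS ? "prompt_verification" : "deny";
}
```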

Virginia SB 1515

Virginia – SB 1515 Passed 05/12/2023 and Effective 07/01/2023

Virginia’s law focuses on civil liability for publishing or distributing material harmful to minors on the internet. “Material harmful to minors” pertains to explicit content that, as a whole, lacks serious value for minors and may include nudity, sexual conduct, or other explicit representations. Commercial entities distributing this content must verify users’ age, with penalties for non-compliance. The law does not impose obligations on internet service providers or users of interactive computer services. 

North Carolina HB 8, Article 51 (Pornography Age Verification Enforcement Act or PAVE Act)

North Carolina – HB 8, Article 51 Passed 09/29/2023 and Effective 01/01/2024

The “Pornography Age Verification Enforcement Act” (PAVE Act) in North Carolina imposes age verification requirements on commercial entities publishing or distributing material harmful to minors on the internet. They must verify users’ ages using commercially available databases or other reasonable methods. No identifying information can be retained after access is granted. Violators are subject to civil liability, and affected parties may seek injunctive relief, compensatory and punitive damages, as well as legal fees. These requirements don’t apply to news organizations, and internet service providers are not liable solely for providing access to such content.
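
As a rough illustration of that retention rule, the sketch below checks age through a placeholder lookup and keeps only the pass/fail outcome. The `lookupAge` function stands in for whatever commercially available database a site actually uses; it is not a real provider API:

```typescript
// Hedged sketch: verify age, grant or deny access, and retain no identifying
// information afterward, as the PAVE Act requires.
interface VerificationInput {
  fullName: string;
  dateOfBirth: string; // ISO date string, e.g. "1990-04-02"
}

async function lookupAge(input: VerificationInput): Promise<number | null> {
  // Placeholder: a real implementation would query a third-party verification
  // service here and return the confirmed age, or null if inconclusive.
  return null;
}

async function ageGate(input: VerificationInput): Promise<boolean> {
  const age = await lookupAge(input);
  const granted = age !== null && age >= 18;
  // Persist only the boolean outcome; the identifying fields in `input` fall
  // out of scope here and are never logged or written to storage.
  return granted;
}
```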

Variances in Age Verification Approaches

The states discussed above employ diverse strategies for age verification, especially concerning platform and content accessibility. Four states require age verification outright, while Arkansas goes further and outlines specific acceptable methods. In contrast, Utah’s law lacks such detail, allowing greater flexibility but creating compliance challenges. Parental consent is another focus, with FTC-approved methods including video conferencing and credit card validation. Nevertheless, state-specific laws may fall short of comprehensively regulating minors’ social media usage, given their varying approaches and gaps in content oversight.
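
One practical consequence of this variance is that platforms tend to encode requirements per jurisdiction rather than applying a single global rule. The sketch below shows one way to structure that; the field names and sample values paraphrase the summaries above and are illustrative assumptions, not legal advice:

```typescript
// Hypothetical per-state rule table a platform might branch on.
type ConsentMethod = "video_conference" | "credit_card_validation";

interface StateRule {
  ageVerificationRequired: boolean;
  mandatedMethods: string[];       // empty when the statute leaves methods open
  consentMethods: ConsentMethod[]; // FTC-recognized parental consent options
}

const STATE_RULES: Record<string, StateRule> = {
  // Arkansas spells out acceptable verification methods (values illustrative).
  AR: {
    ageVerificationRequired: true,
    mandatedMethods: ["government_id", "third_party_database"],
    consentMethods: ["video_conference", "credit_card_validation"],
  },
  // Utah requires verification but leaves the method largely open.
  UT: {
    ageVerificationRequired: true,
    mandatedMethods: [],
    consentMethods: ["video_conference", "credit_card_validation"],
  },
};

function ruleFor(stateCode: string): StateRule | undefined {
  return STATE_RULES[stateCode];
}
```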

Big Tech Initiatives and State Laws: A Complex Landscape

Responding to growing concerns, major tech corporations have rolled out a range of parental oversight features. Presently, both state and federal authorities are actively engaged in overseeing social media companies and adult entertainment businesses, resulting in a diverse set of regulations.

To illustrate, several states, including Arkansas, Utah, Texas, California, and Louisiana, have enacted new laws requiring age verification and imposing restrictions on minors’ use of social media platforms.

Interested in reading more about age restrictions online?

Check out our blog post Ensuring Compliance and Mitigating Risks: A Guide for Website Administrators in the Adult Content Industry.

Token of Trust: Your Comprehensive Age Verification Solution

In a rapidly evolving digital landscape where age verification is crucial, Token of Trust stands as your reliable partner. With a diverse range of age verification methods, from government document verification to public and private database checks, and even age estimation services, we offer comprehensive solutions to meet your unique needs. Our archival tools ensure compliance with state and global privacy laws, providing a secure and accessible data repository for audit support. When it comes to safeguarding your business and ensuring regulatory compliance, Token of Trust is the trusted choice for age verification.
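
As a closing illustration of how a multi-method approach can work, the sketch below chains verification methods from strongest to lightest touch, falling back when one is unavailable or inconclusive. The generic `Method` type and the ordering are assumptions for illustration, not Token of Trust’s actual API:

```typescript
// Try each verification method in order; the first one that confirms age wins.
type Method = (userId: string) => Promise<boolean>;

async function verifyWithFallback(userId: string, methods: Method[]): Promise<boolean> {
  for (const method of methods) {
    try {
      if (await method(userId)) {
        return true; // confirmed by this method
      }
    } catch {
      // Provider outage or inconclusive result: fall through to the next method.
    }
  }
  return false; // no method could confirm the user's age
}
```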