Utah is making waves with a pair of bills moving through the state House and Senate aimed at protecting kids from social media. But there is much more in the bills than what is being noted in the press. If passed, everyone in the state would be required to verify their age in order to use social media.
These new laws wouldn’t just be for teens and their parents. They would mandate ID verification for every Utahn.
Some are wondering if Utah’s teen tech bills could be model legislation. As the following analysis explains, leaders should acknowledge that:
- The two bills aren’t just age verification mandates for teens; they also mandate ID verification for everyone over 18.
- Since expired ID cards won’t count, the poor and the elderly will be hit the hardest.
- If it all works as the bill authors hope, social media companies will soon be in possession of highly sensitive personal information.
- Approximately two out of three Americans do not wish to share their identification documents with social media platforms.
- The definition of social media addiction in the bills doesn’t match current psychiatric practice.
All regulation is difficult to comply with, but these new rules would be especially difficult to implement. While we should be giving teens and parents tools to help them navigate our complex digital world, these bills place a heavy burden on Utahns without much evidence that they will provide solutions.
The basic structure of H.B. 311, its limitations, and age verification
Utah House Bill H.B. 311 is intended to create more protections for teens under the age of 18 using social media, namely by requiring age verification, but it goes one step further. It is a broad identification mandate for everyone over 18 using social media. It also tasks the Division of Consumer Protection (DCP) with working out the details for both age verification and compliance with state and federal laws.
Giving the details to the DCP is a smart move because it could mean more flexibility in what age verification methods will be accepted. But it also means that the hardest part of making this bill work will be passed on to the division. Instead of passing legislation and then leaving it up to the division to figure out how to make it work, legislators should be doing the hard work now. Should the bill not be in compliance with state or federal law, there is a risk that key components of the law will be gutted, causing even greater confusion and harm to consumers and companies.
When it was first introduced, the bill only allowed government-issued documents to be accepted. That means anyone who doesn’t have a government-issued ID would not be able to access social media sites. And remember, these IDs cannot be expired; people whose IDs have lapsed would be treated as if they had none. Some estimates put the number of people without a valid government-issued ID as high as 11% of the population. This is a big group, and it includes the elderly and the poor, who would be caught up in this scheme.
To sidestep these problems, countries like the United Kingdom and Germany accept age verification from algorithmic facial recognition systems; Yoti’s algorithmic age verification system, for example, has been a leader in this space. For its part, Germany allows some wiggle room, since individuals must be recognized by the system as being at least 23 to gain access to adult content. If Utah policymakers are serious about age verification, they should be open to accepting facial recognition systems, even with their drawbacks.
If it all works as the bill authors hope, social media companies will soon be in the position of storing highly sensitive personal information. Make no mistake, age verification mandates are mandates for more data collection, and many people aren’t comfortable with that intrusion. As our colleague Taylor Barkley explained in a preview of new polling data from CGO, approximately two out of three Americans do not wish to share their identification documents or biometric information like face scans with social media platforms.
To help remedy this, the state could craft rules to ensure that third parties can confirm age without needing to share personally identifiable information.
The bill also prohibits Utah minors from entering into a contract online unless the minor’s parent or legal guardian consents. While some might think that this part of the bill is directed at privacy policies and cookies, as Caden Rosenbaum of the Libertas Institute noted, “A platform’s terms of service, an agreement to use a platform within set parameters in exchange for access to the platform, is not a legally binding contract. It lacks the basic element of consideration recognized in most common law states like Utah.”
More likely, this part of the bill is meant to give parents more control over their kids if they start making money via streaming services like YouTube and Twitch. But sites already require parental consent to start earning revenues, so this part of the bill is likely overkill.
H.B. 311’s definition of social media
Age verification would only apply to those sites deemed to be social media. To fit this legal definition, a website needs at least 10 million account holders, and those account holders have to be able to create a profile, upload posts, view the posts of other account holders, and interact with other account holders or users.
Because these four broad activities encompass many of the most popular sites on the Internet, the bill authors agreed to exempt a wide range of services from age verification. Sites that are dedicated to electronic mail, direct messaging, streaming, online shopping or e-commerce, digital news, scholarly research, cloud storage, tech support, business-to-business communication, data visualizations, or shared document collaboration are exempted from the regulation. While streaming services like Spotify and Netflix would be exempt from age verification, the bill is written in such a way as to include YouTube, Twitch, and other user-generated content sites.
As one would expect, this method of broadly defining and then exempting companies creates some interesting edge cases. For one, it seems that popular message boards like those behind Duolingo and Stack Overflow would need to verify the age of their users, even though they were probably meant to be exempted, since the law carves out academic or scholarly research.
Discord presents a unique case. While the service has over 150 million monthly active users, it is organized into communities called servers, which are split into channels that allow for either private or public discussion. The service also has a voice chat feature and an instant messaging component, which would seemingly put it in the direct messaging bucket. In other words, Discord straddles the line between a social media platform and a messaging service. Even the Wikipedia entry struggles to truly define what Discord is, suggesting that courts will likely have the same problem.
Also, the bill doesn’t specify that those accounts have to be in the United States, so lots of sites could be caught up. As one would expect, Snapchat, TikTok, and Instagram would need to verify ages, but so would Friendster. While Friendster is no longer popular in the United States, it does have a strong following elsewhere. If nothing else, the bill should be clarified to make sure it only applies to US users.
Social media addiction in H.B. 311
In addition to age verification, H.B. 311 prohibits a social media company from implementing designs that the company knows cause a minor to become addicted to a social media platform.
The House bill is also unique in its definition of social media addiction, defining it as use that
- indicates preoccupation or obsession with, or withdrawal or difficulty to cease or reduce use of, a social media platform despite the user’s desire to cease or reduce that use; and
- causes physical, mental, emotional, developmental, or material harms to the user.
While social media addiction has not been defined in the American Psychiatric Association’s Diagnostic and Statistical Manual of Mental Disorders (DSM), these bills echo the way the DSM defines Internet Gaming Disorder (IGD). What is notable is that the bill includes only four of the nine symptoms that define IGD. What’s left out are some key indicators of problematic use, like giving up other activities, tolerance, deceiving family members, and jeopardizing or losing a job or relationship due to social media use.
The most important shift from the DSM is that these bills require only “physical, mental, emotional, developmental, or material harms to the user” to define addiction, whereas gaming addiction is more specific and requires “significant impairment or distress.” Rather than reflecting the language of psychiatry, the bill’s language instead relies upon the language of torts, which is focused on harms. If implemented, this definition would be far more expansive than current medical practice. Under H.B. 311, everyone would probably count as addicted.
Regardless, this part of the bill has a confusing construction. If lawmakers want to hold platforms accountable for social media addiction, they should simply write that into the bill. Instead, H.B. 311 is written in such a way that it applies only to design features.
If there is any clear message to be gained from social media research, it is that everyone reacts differently to these design features. A feature that is troubling to one individual could be completely harmless or even helpful to another individual. The endless scroll feature of Instagram Reels could just as well recommend a video on finding mental health resources.
What S.B. 152 does differently
Much like H.B. 311, the companion bill in the Utah Senate, S.B. 152, institutes an age verification requirement for social media companies. It is broadly assumed that the two bills will be merged in conference, and so what will eventually become law will be a combination of the two.
If S.B. 152 passes, social media companies will also need to ensure that the accounts of minors
- cannot send direct messages to accounts that aren’t friends;
- cannot be found using the search feature if they aren’t already linked;
- aren’t shown any advertisements;
- don’t have any personal information collected from their posts, content, messages, text, or usage activities; and
- are not targeted with or shown suggestions for groups, services, products, posts, accounts, or users.
Additionally, S.B. 152 gives parents and guardians who consented to the profile the ability to see all of the posts and messages that the account made. This feature is already common on social media platforms today; messages and posts can be downloaded. In practice, this provision would probably require parents to have a profile of their own to be able to control their children’s data.
S.B. 152 also mandates that the accounts of minors be inaccessible between the hours of 10:30 PM and 6:30 AM, unless parents go into the settings and change these defaults. Again, this feature already exists on many social media platforms and cell phones; parents can already set time restrictions on devices and platforms. But the bill would force parents to have their own profile to access these settings and change them.
In its current form, S.B. 152 would make it incredibly difficult for minors to connect with their friends. Since the bill prohibits social media platforms from targeting or suggesting groups and users to minors, and minors cannot search for an account, there doesn’t seem to be any way for mutual friends to actually connect online.
There are other practical conflicts that need addressing. Because S.B. 152 doesn’t clearly define personal information (typical of privacy bills), social media sites would no longer be able to collect any data and then display it. On a plain reading of the bill, it is hard to see how any social media site would be able to comply, since their basic function is to collect, store, and then display personal information. While members in the House and Senate might cheer it, this bill would probably outlaw TikTok, Instagram Reels, and other algorithmically based services for those under 18.
The cost of implementing H.B. 311 and S.B. 152, and their legality
To support these new provisions, the bills direct Utah’s Division of Consumer Protection to investigate complaints and bring cases. But before a case is brought, the division has to send a letter to the company and give it 30 days to change its systems. This is known as a 30-day cure period. If a company doesn’t have the proper age verification system and is found to be in violation of the law, it could face a $2,500 penalty per violation.
To further ensure compliance, the bills would grant a private right of action (PRA). These kinds of provisions give trial lawyers the ability to bring cases against companies, but they may also create instances in which parties prefer private litigation when administrative measures through the DCP would be more appropriate. In particular, the administrative measures provide a cure period while the private right of action does not.
As it stands, the PRA portion of the bill opens the door to gaming by attorneys, a likely distraction from ensuring kids’ safety. For this reason, the PRA should (1) only be available when all administrative measures have been exhausted and (2) only be for injunctive, non-monetary relief.
The prohibition against addictive designs has the biggest stick. It grants the DCP the ability to bring a case that can recover “a civil penalty not to exceed $250,000 per violation.” Because the violation hinges on Utah minors becoming “addicted to the social media platform,” the state needs to ensure that its definition of addictive designs is in line with accepted practice.
Beyond all of these challenges, if the state still wants to move forward with these two bills, it should, at a minimum:
- Allow facial recognition systems to verify age so that the elderly, the poor, and others without photo identification can access social media sites;
- Craft rules to ensure that third parties can confirm age without needing to share personally identifiable information;
- Clarify that the 10 million account holders need to be located in the United States;
- Expand the definition of addiction to include all of the key indicators of problematic use like giving up other activities, tolerance, deceiving family members, and jeopardizing or losing a job or relationship due to social media use;
- Align the social media addiction definition with current psychological practice, requiring a “significant impairment or distress” instead of merely causing physical, mental, emotional, developmental, or material harms to the user;
- Direct relevant state agencies to oversee a study, five years after passage, on the mental health of teens to inform federal and state policymakers about the effectiveness of the measures in the bill;
- Limit the availability of private rights of action;
- Require that any private right of action is only able to seek injunctive relief; and
- Change the language of S.B. 152 to ensure that platforms can collect, store, and then display posts, videos, and other content.