Big Tech and Future Lawmaking Paper
- Ellie W
- Dec 9, 2024
- 6 min read

This paper was written for my Communication Law and Ethics class. The assignment was to choose a topic within the realm of big tech and write about the current and potential future lawmaking surrounding it. I chose to write about youth safety online, particularly minors using social media platforms.
Here is an excerpt of the main section of the paper:
"...One emerging issue with technology and communication law is youth privacy and safety while online. There are rising rates of children accessing the internet and social media, the us. General surgeon states that, “nearly 40% of children 8 to 12 years old and 95% of children 13 to 17 years old use social media apps” (Cleveland Clinic). Usage of social media and internet has also been shown to have negative effects on children. High usage of social media is linked to poor mental health in children and teens with high rates of depression and anxiety. High amounts of social media useare also linked to less social interaction, poor sleep, low self-esteem, and poor focus (Cleveland Clinic).
Many social media sites have addictive qualities and targeted algorithms that can harm children and teens, who are more easily influenced by content. Most social media platforms are designed to be addictive in order to generate more interaction from users. This is especially concerning for the more vulnerable children and teens who interact with them.
More internet usage also brings the danger of children being exposed to topics and content inappropriate for their age. According to a report from the BBC, most children in the UK have been exposed to, or accidentally found, pornography online by the age of 13, and children as young as eight are being affected. The report also states that “Most children first saw pornography on social media” (Wain). A 2022 report from Common Sense Media states that “15% (of surveyed teens) said they first saw online pornography at age 10 or younger. The average age reported is 12” (Robb and Mann). Children can also be exposed to cyberbullying, violent content, predators, and harmful trends through social media and internet usage. More laws need to be put in place to help protect children's safety online.
While there is still a large gap between current laws and technology, some laws are in place to help protect our youth. Many social media platforms already ask people to enter their birthdate when registering for an account in order to verify that the user is thirteen or older. The Federal Trade Commission enforces the Children's Online Privacy Protection Rule (COPPA), which states that online platforms cannot collect personal information or data from users under the age of thirteen without verifiable parental consent. It also states that any online sites designed for children under thirteen must meet certain criteria in terms of their subject matter (eCFR). This rule helps protect the privacy of children under thirteen, and it also helps moderate age-appropriate content on sites and online platforms designed for children.
States are also starting to protect youth through age verification laws. Fourteen states, including Texas, Mississippi, and Virginia, have recently proposed or passed bills that require online platforms to implement age verification on sites “where more than one third of the content meets the definition of material harmful to minors” (“Age Verification Bills – Action Center”). These stricter verification laws aim to ensure that young children and minors are unable to access those sites. The bills are currently enforced through civil fines or private lawsuits.
There are many obstacles to creating effective communication laws. One of the biggest issues with communication laws for online platforms and social media is that they are very hard to enforce effectively. For example, age verification laws are hard to enforce because they can easily be bypassed with a VPN. In addition, many social media and online platforms will not meet the one-third-of-content threshold that triggers the verification requirement, yet many children can still access harmful material on those platforms. As a result, the laws fail to cover much of the actual problem.
Another issue with forming communication laws is the lack of accountability that online platforms currently have for the content posted and shared through them. Section 230 of the Communications Decency Act grants these platforms broad immunity for how their content is moderated. This is hard to balance, because more moderating and filtering of content can infringe on a person's First Amendment rights.
There is a lot of concern about how increased regulation of social media will harm users' ability to speak freely on these platforms. Some argue that laws preventing explicit or inappropriate content will also have a chilling effect on people's ability to post or speak freely on social media platforms.
However, one could also argue that by allowing the platforms complete control over how they choose to moderate content, they are already infringing on our First Amendment rights. When President Donald Trump was banned from Twitter, his ability to speak on a major public platform was effectively silenced. That decision was upheld in U.S. courts because Twitter is a private corporation, free to choose what it moderates on its platform.
The immunity currently granted to these online platforms allows them to be much more lax about how they filter and moderate content, which still lets a great deal of explicit, illicit, and inappropriate content through. This has already been shown to be harmful to children, many of whom are exposed to pornographic content on these platforms. I think these social media companies need to be held more accountable for how they moderate inappropriate content, because what they are currently doing is harming our youth. Alternatively, they need to be held more accountable for who they allow on their platforms, which is where age verification laws come into play.
There are multiple future strategies being pursued in the United States to counter the lack of effective enforcement in laws regarding children's online safety and privacy. Many states are still proposing or passing age verification laws with stricter requirements and increased penalties. However, some argue that these will still not be effective, since they can be bypassed. There are also arguments that these laws violate First Amendment rights, and some people may be uncomfortable sharing sensitive personal data with online platforms in order to complete these verifications.
Other proposed strategies include requiring parental consent, making platforms less addictive, and limiting the times the platforms can be accessed. The issue with these laws is that they can easily violate First Amendment rights, and they would require social media sites to be willing to remake their platforms to be less addictive for their minor users, for example by not using algorithms, notifications, or targeted content.
Another strategy is to increase the accountability, or responsibility, that social media corporations have for the content found on their sites that children can access. Online platforms would likely be much more motivated to ensure that their content follows their community guidelines if they were held more accountable for it. This would apply to harmful trends and graphic or explicit content.
There are also some newly proposed federal bills that could change how children's privacy and safety are treated by the law. One of these is the Kids Online Safety Act (KOSA), which recently passed the Senate. This bill would give social media platforms likely to be accessed by children a duty of care. It would require them to filter out inappropriate content and “provide minors with options to protect their information, disable addictive product features, and opt out of personalized algorithmic recommendations. They would also be required to limit other users from communicating with children and limit features that ‘increase, sustain, or extend the use’ of the platform” (Ortutay).
To ensure that these strategies succeed, it is important that policymakers cooperate with the social media corporations. It will be harder for regulations to be passed effectively if the companies don't want to cooperate and make those changes. A positive example of this cooperation is the recently proposed KOSA bill, which some big tech companies and social media corporations, such as Microsoft, X, and Snap, have supported..."
We were also required to present the main topics from the paper. Here are the PowerPoint slides I used to help present.
To read the full paper: