
TikTok: Classified Documents Show Awareness of Harm to Youth but Lack of Effective Action

In recent years, the “TikTok” app has become one of the most popular platforms among teenagers and young people across the United States and the world. However, questions have begun to arise about its negative effects on children’s mental health and behavior. A comprehensive investigation lasting over two years by 14 attorneys general revealed internal documents reflecting TikTok’s awareness of the application’s risks, despite the company’s insistence on the safety of its users. These previously confidential documents contain information revealing how the app was deliberately designed to attract users’ attention, potentially leading to addiction and numerous psychological issues. In this article, we examine how TikTok faced legal investigations and what has been revealed about its internal strategies and their potential impact on teenagers.

Implications of TikTok Use on Young People

TikTok, one of the most popular social media platforms, is currently facing intense scrutiny over the harmful effects its use may have, especially among teenagers. Internal documents made public show that the company was aware of the potential risks of its application but did not take adequate steps to address them. These risks include problems of mental and behavioral health, with studies showing that excessive use of the platform can lead to compulsive, addiction-like behavior that distracts teenagers from daily tasks and social interaction. For instance, internal research indicates that users are likely to form a habit after watching around 260 videos, which can take less than 35 minutes of viewing, impairing their ability to concentrate and increasing anxiety and depression.

At the same time, the documents show that TikTok introduced tools to limit daily usage time, but these tools had little impact on reducing the time teenagers spend on the platform. Instead, the company appeared more focused on boosting public trust by improving its image in the media. These strategies looked more like publicity tools than effective measures to protect young users from risk.

TikTok’s Strategies to Enhance User Engagement

TikTok relies on an advanced recommendation algorithm that makes the user experience unique and engaging. The algorithm is designed not only to surface content that interests the user but also to keep users engaged for longer periods. Internal documents indicate that the company understood how the density and speed of content feed users' sense of compulsion: short videos lasting only a few seconds, for example, keep users swiping for more without stopping.

TikTok’s success is built on this dynamic, with research showing that most teenagers spend long stretches scrolling through content. The documents link this pattern to a decline in critical thinking and in the capacity for deep conversation among teenagers, leaving many young users less able to engage effectively in social life.

TikTok’s Response to Public and Legal Scrutiny

As concerns grow about the platform’s impact on children and teenagers, TikTok faces increasing pressure from government bodies. Several U.S. states have filed lawsuits against the company, claiming that it harms the mental health of young users and contributes to their addiction. Confidential internal information was recently revealed through court filings, sparking debate over the company’s transparency and ethics.

Through the lawsuits, it is alleged that the company was aware of the risks associated with using its application but chose to ignore them. Currently, TikTok is being pressed to provide more transparency regarding how it manages its content and to take public concerns seriously. The lawsuits aim to change the way TikTok interacts with its users and urge the company to take preventive measures to ensure the safety of teenagers.

Obsession with Beauty and Its Standards on Social Media Platforms

Beyond the mental health risks, many concerns center on the beauty standards the platform promotes. Documents reveal that TikTok adjusted its algorithm to favor videos from users deemed “attractive,” reflecting narrow beauty standards that can distort how users perceive beauty. The impact falls hardest on young users, who may feel pressured to conform to those standards.

TikTok has faced pressure to provide educational resources around self-image disorders. Employees have pointed to an urgent need to clarify the impact of filters on mental health and how they can lead to low self-esteem among young people. Awareness campaigns have been proposed, but they remain inadequately implemented.

The Legal Future of TikTok in the United States

Statements from state officials suggest that if TikTok fails to address safety concerns regarding its users, especially adolescents, it may face a national ban. Questions surrounding its ownership by the Chinese company ByteDance, alongside the questionable practices now coming to light, have placed TikTok in a difficult defensive position. TikTok, for its part, is trying to highlight its commitment to safety by offering features such as default screen-time limits and enhanced privacy settings. Yet these measures remain controversial, and many wonder whether they are truly sufficient.

The year 2024 may represent a turning point, as government bodies move more swiftly to address public health concerns. Current trends point to a future fraught with legal and regulatory challenges, requiring TikTok to focus not only on its business but also on its social responsibility.

The Impact of TikTok on Young Users

TikTok is regarded as one of the most influential social platforms among young users, particularly those under 17. Statistics show that around 95% of smartphone users in this age group use the app regularly, reflecting deep engagement. With that heavy usage, however, concerns are growing about the negative psychological effects of overuse: research has linked excessive use of social media apps to heightened depression, anxiety, and self-image problems among children. TikTok has responded by implementing time-management tools, such as notifications telling users how much time they have spent on the app, alongside parental control features.

However, internal TikTok documents indicate that these tools may not work as intended, since minors often lack the self-management skills needed to control their app usage. Studies confirm that children are among the most susceptible to becoming hooked on the app’s content, underscoring the need for monitoring and awareness from parents. Responding quickly to these challenges could help ease psychological pressure on children, so parents should stay aware of how much time their children spend on the app.

Information Bubbles and Their Impact on Content

In addition to the psychological impact, “information bubbles” are a major concern in the context of using TikTok. Bubbles refer to the state in which a user is exposed only to information and opinions that align with their beliefs, thereby reinforcing a limited set of views and contributing to a narrowed perspective. Through internal experiments, TikTok employees discovered that users can quickly become immersed in negative bubbles in less than half an hour of use, highlighting how youth engage with negative content faster than anticipated.

Some internal presentations at the company suggest that the app can quickly surface harmful content, such as videos related to suicide and eating disorders. These concerns pose a significant challenge for a company like TikTok, which must balance freedom of expression with protecting its users, especially young people. Regulating the app’s content is central to user safety, and the company must use all available resources to reduce the risks arising from these bubbles.

Content Moderation and Addressing Negative Phenomena

The content moderation process at TikTok runs through several layers of verification before any action is taken. An automated first pass uses artificial intelligence to detect content containing inappropriate material, and the process is then supplemented by human review in later stages. However, the high “leakage” rate of violating content has been a warning sign: some videos with harmful content, such as those related to suicide or eating disorders, reportedly reached the public before being removed.
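To make the two-stage process described above concrete, the sketch below illustrates the general pattern of an automated first pass followed by a human review queue, with a “leakage” rate measuring harmful items that slip past both stages. It is a minimal, hypothetical illustration of this common moderation architecture, not TikTok’s actual system; the classifier scores, thresholds, and queue names are assumptions for the example.

from dataclasses import dataclass, field
from typing import List, Set

@dataclass
class Video:
    video_id: str
    harm_score: float          # score from a hypothetical automated classifier, 0.0-1.0
    flagged_by_users: bool = False

@dataclass
class ModerationQueues:
    auto_removed: List[str] = field(default_factory=list)
    human_review: List[str] = field(default_factory=list)
    published: List[str] = field(default_factory=list)

def triage(videos: List[Video],
           remove_threshold: float = 0.9,
           review_threshold: float = 0.6) -> ModerationQueues:
    """Automated first pass: remove high-confidence violations, send
    borderline or user-flagged items to human review, publish the rest."""
    queues = ModerationQueues()
    for v in videos:
        if v.harm_score >= remove_threshold:
            queues.auto_removed.append(v.video_id)
        elif v.harm_score >= review_threshold or v.flagged_by_users:
            queues.human_review.append(v.video_id)
        else:
            queues.published.append(v.video_id)
    return queues

def leakage_rate(published_ids: List[str], truly_harmful_ids: Set[str]) -> float:
    """Share of published videos that were in fact harmful, i.e. content
    that slipped past both the automated pass and human review."""
    if not published_ids:
        return 0.0
    leaked = sum(1 for vid in published_ids if vid in truly_harmful_ids)
    return leaked / len(published_ids)

if __name__ == "__main__":
    batch = [
        Video("a", 0.95),                         # removed automatically
        Video("b", 0.70),                         # routed to human review
        Video("c", 0.20),                         # published (a false negative here)
        Video("d", 0.50, flagged_by_users=True),  # reviewed because users flagged it
    ]
    queues = triage(batch)
    print(queues)
    print("leakage rate:", leakage_rate(queues.published, truly_harmful_ids={"c"}))

In a setup like this, lowering the review threshold or adding more human moderators reduces leakage at the cost of more review work, which is precisely the trade-off discussed below.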

TikTok will need to invest more effort in making its moderation plans effective, including increasing the number of human moderators and adopting stricter policies on harmful content. The company also needs effective strategies for communicating with its audience about their responsibilities as users, especially where mental health is concerned. Although there is a will within the company to address these negative phenomena, it faces practical challenges given the huge volume of content published daily.

Dealing with Children and Illegal Practices

For users under the age of 13, TikTok enforces a strict policy that prevents them from creating personal accounts. However, loopholes in these policies make it easy for children to bypass the restrictions and sign up on the platform. In this context, the Department of Justice has alleged that TikTok violated the federal law protecting children’s online data, the Children’s Online Privacy Protection Act, leading to legal action against the company.

Moderators at TikTok must handle allegations about the presence of children on the platform with caution. Internal documents reportedly instruct moderators not to remove accounts believed to belong to users under 13 unless the account’s own information explicitly confirms the user’s age. This means the company needs to strengthen age verification to improve safety on the platform; the key is to limit children’s access to inappropriate content and so allow young users to use TikTok safely.

Source link: https://www.npr.org/2024/10/11/g-s1-27676/tiktok-redacted-documents-in-teen-safety-lawsuit-revealed
