A new report from the Federal Trade Commission (FTC) reveals that major social media and video streaming services are collecting and monetizing vast amounts of user data, often without adequate protections, especially for children and teens.
FTC requests companies’ data
The staff report stems from information collected under 6(b) orders issued in December 2020, in which the FTC requested data on how the companies collect, track, and use user data.
The companies covered in the report include Amazon.com, Inc., ByteDance Ltd., which operates the short-form video service TikTok, Discord Inc., Facebook, Inc., Reddit, Inc., Snap Inc., Twitter, Inc., WhatsApp Inc., and YouTube LLC.
These companies’ business models heavily incentivize mass data collection, primarily used for personalized advertising. The FTC found that this poses significant risks to users’ privacy.
“The orders asked for information about how the companies collect, track and use personal and demographic information, how they determine which ads and other content are shown to consumers, whether and how they apply algorithms or data analytics to personal and demographic information, and how their practices impact children and teens,” the FTC said in a news release.
Concerns over invasive tracking
The report also raised concerns about the use of invasive tracking technologies, such as pixels, which allow companies to track user preferences and interests for advertising.
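The mechanics behind such a pixel are simple. As a rough, hypothetical illustration (the domain, endpoint, and parameter names below are invented and not drawn from the report), a site embeds an invisible one-by-one image hosted by an ad platform; every time the page loads, the visitor's browser requests that image and hands over identifying details in the process:

```typescript
// Minimal sketch of a hypothetical tracking-pixel endpoint (illustrative only).
// A page embeds <img src="https://ads.example.com/pixel.gif?uid=abc123">;
// every page load sends the visitor's browser details back to this server.
import { createServer } from "node:http";

// 1x1 transparent GIF, so the "image" renders invisibly on the page.
const PIXEL = Buffer.from(
  "R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7",
  "base64"
);

const server = createServer((req, res) => {
  const url = new URL(req.url ?? "/", "https://ads.example.com");

  // Data an ad platform could harvest from a single pixel request:
  // which page the user was viewing, which browser and device they used,
  // and a user identifier passed via the query string or a cookie.
  console.log({
    userId: url.searchParams.get("uid"),   // identifier set by the embedding site
    page: req.headers["referer"],          // page the user was viewing
    userAgent: req.headers["user-agent"],  // browser and device details
    ip: req.socket.remoteAddress,          // coarse location signal
    time: new Date().toISOString(),
  });

  res.writeHead(200, { "Content-Type": "image/gif", "Cache-Control": "no-store" });
  res.end(PIXEL);
});

server.listen(8080, () => console.log("pixel endpoint listening on :8080"));
```

Repeated across the many sites that embed the same pixel, these individual requests can add up to a detailed profile of a person's browsing habits and interests.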
“The report lays out how social media and video streaming companies harvest an enormous amount of Americans’ personal data and monetize it to the tune of billions of dollars a year,” said FTC Chair Lina M. Khan. “While lucrative for the companies, these surveillance practices can endanger people’s privacy, threaten their freedoms, and expose them to a host of harms, from identity theft to stalking. Several firms’ failure to adequately protect kids and teens online is especially troubling. The Report’s findings are timely, particularly as state and federal policymakers consider legislation to protect people from abusive data practices.”
Automated systems, including artificial intelligence and algorithms, use collected data with little transparency or user control, leaving users with few ways to opt out, according to the report.
The findings revealed that many companies failed to treat teen accounts differently from adult ones. Additionally, companies often avoid liability under the Children’s Online Privacy Protection Act (COPPA) by claiming their platforms are not intended for child users, even though children use these sites or apps.
Companies collected and retained troves of data indefinitely, including information about both users and non-users of their platforms and information sourced from data brokers, the news release said.
Some companies also failed to delete all user data, even when users requested that it be deleted.
Staff recommendations
FTC staff made several recommendations to policymakers and companies, including:
Congress should pass comprehensive federal privacy legislation to limit surveillance, address baseline protections, and grant consumers data rights;
Companies should limit data collection, implement concrete and enforceable data minimization and retention policies, limit data sharing with third parties and affiliates, delete consumer data when it is no longer needed, and adopt consumer-friendly privacy policies that are clear, simple, and easily understood;
Companies should not collect sensitive information through privacy-invasive ad tracking technologies;
Companies should carefully examine their policies and practices regarding ad targeting based on sensitive categories;
Companies should address the lack of user control over how automated systems use their data, as well as the lack of transparency about how such systems are used, and should implement more stringent testing and monitoring standards for those systems;
Companies should not ignore the reality that there are child users on their platforms, should treat COPPA as setting minimum requirements, and should provide additional safety measures for children;
Companies should recognize that teens are not adults and provide them greater privacy protections; and
Congress should pass federal privacy legislation to fill the gap in privacy protections provided by COPPA for teens over the age of 13.
More information about the report is available on the FTC’s website.