A New F.T.C. Report Finds a Vast Surveillance Ecosystem
- September 30, 2024
- Clayton Rice, K.C.
Social Media and Video Streaming Services ("SMVSSs") have become a ubiquitous part of life and culture in the digital age. But these services do more than let their users connect with the world from the palm of their hands. A new report by the United States Federal Trade Commission has found that these companies have engaged in vast surveillance of users with lax privacy controls and inadequate safeguards for children and teens. The report highlighted that companies collected, retained and broadly shared troves of user data in ways that raise serious concerns about control and oversight.
1. Introduction
The Federal Trade Commission is an agency of the United States government with headquarters in the Federal Trade Commission Building in Washington, D.C. It was established in 1914 in response to the monopolistic trust crisis of the 19th century and operates the Bureau of Competition, the Bureau of Consumer Protection and the Bureau of Economics. It is the only federal agency that deals with consumer protection and competition issues in broad sectors of the economy. The FTC describes the agency’s everyday mission as: (a) the pursuit of strong and effective law enforcement against deceptive, unfair and anticompetitive business practices; (b) the creation and sharing of practical, plain-language educational programs for consumers and businesses; (c) the advancement of consumers’ interests by sharing its experience with federal and state legislatures and U.S. and international government agencies; and, (d) the development of policy and research tools through workshops, conferences and hearings. (here)
2. Background
On December 14, 2020, the FTC issued orders to nine social media and video streaming companies requiring them to provide data on “how they collect, use and present personal information, their advertising and user engagement practices, and how their practices affect children and teens.” (here) The orders were issued under s. 6(b) of the Federal Trade Commission Act which authorizes the Commission to conduct studies that do not have a specific law enforcement purpose. (here) The orders were sent to Amazon, ByteDance (the operator of TikTok), Discord, Facebook, Reddit, Snap, Twitter, WhatsApp, and YouTube. The orders specifically asked for information about how the companies collect, track and use personal information.
On September 19, 2024, the FTC released the report titled A Look Behind the Screens: Examining the Data Practices of Social Media and Video Streaming Services. (here) “The report lays out how social media and video streaming companies harvest an enormous amount of Americans’ personal data and monetize it to the tune of billions of dollars a year,” said FTC Chair Lina Khan in a press release announcing the report. “While lucrative for the companies, these surveillance practices can endanger people’s privacy, threaten their freedoms, and expose them to a host of harms, from identity theft to stalking.” (here) In the comments that follow, I will outline the key findings and recommendations in the 129-page report.
3. The FTC Staff Report
The report contains five sections relating to: (a) data practices such as collection, use, disclosure, minimizing, retention and deletion; (b) advertising and targeted advertising; (c) the use of automated decision-making technologies; (d) practices relating to children and teens; and, (e) concerns relating to competition. Here are the report’s key findings.
- Many companies collected and could indefinitely retain troves of data from and about users and non-users, and they did so in ways consumers might not expect.
- Many companies relied on selling advertising services to other businesses based largely on using the personal information of their users. The technology powering this ecosystem operated behind the scenes and out of view of consumers, posing significant privacy risks.
- There was widespread application of algorithms, data analytics or artificial intelligence to users’ and non-users’ personal information. These technologies powered the SMVSSs – everything from content recommendation to search, advertising and inferring personal details about users. Users lacked any meaningful control over how their personal information was used by these AI-fueled systems.
- The trend among the companies was that they failed to adequately protect children and teens – this was especially true of teens, who are not covered by the Children’s Online Privacy Protection Rule (“COPPA Rule”). (here)
In a post to the Electronic Frontier Foundation’s website, staff technologist Lena Cohen said the report confirmed what EFF has been warning about for years – tech giants are “widely harvesting and sharing your personal information to fuel their online behavioral advertising businesses.” (here) Here is a summary of the report’s recommendations.
- Companies can and should do more to protect consumers’ privacy, and Congress should enact comprehensive federal privacy legislation that limits surveillance and grants consumers data rights.
- Companies should implement more safeguards when it comes to advertising, especially surrounding the receipt or use of sensitive personal information.
- Companies should put users in control of – and be transparent about – the data that powers automated decision-making systems, and should implement more robust safeguards that protect users.
- Companies should implement policies that would ensure greater protection of children and teens.
- Firms must compete on the merits to avoid running afoul of the antitrust laws.
The Electronic Frontier Foundation has advocated loud and long for U.S. federal privacy legislation. Among the key components any privacy legislation should contain, Ms. Cohen argued, is a prohibition on companies processing a person’s data beyond what is necessary to provide what the person asked for. “Users should have the right to access their data, port it, correct it, and delete it,” she said. In my remaining comments, I will focus on some aspects of the report that address the collection, use and sharing of consumer data.
The report found that the companies collected data far more extensively than consumers might expect, both on and off the SMVSSs, and that collection “sometimes implicated consumers who are not even registered users of an SMVSS.” Consumer data was collected from a variety of sources including: data from advertisers and data brokers, advertising tracking technology, engagement with social media and video streaming services, and users’ use of other products or services provided by corporate affiliates. Several companies reported the collection of IP addresses and device data. Some companies inferred information about a user based on the content the user shared or posted on the SMVSS. Several companies allowed users to connect their account to accounts on other SMVSSs thus establishing another means of data collection.
The companies that engaged in advertising generally used consumers’ information for targeted advertisements. Some companies reported using data, including the data they received from third party advertisers that used their digital advertising services, for their SMVSS’s own business purposes such as ad optimization and research and development. Only a few companies said they would use anonymized data wherever they could still meet the business purpose without using identifiable data. Beyond advertising, the companies generally reported using data to maintain and enhance user engagement through content promotion. Some companies reported that information about a user’s friends also influenced the content promoted.
The report found that the systems and procedures designed by the SMVSSs “are often opaque, leaving consumers in the dark about the breadth of sharing, the parties with whom the companies share information, and the purposes of that disclosure.” Some companies reported sharing data broadly with affiliates and third party entities but provided “limited transparency” on the specifics in their responses. No company provided a comprehensive list of all third party entities they shared personal information with. The companies’ responses “lacked clear explanations or specificity regarding the exact use cases for sharing with each entity.” The report suggested the lack of transparency “could indicate an inability or unwillingness to account for the extent of those practices because consumers’ data was shared so broadly.”
4. Conclusion
I will conclude by leaving you with the implications of the report’s findings for competition analysis. In digital markets, including AI, “acquiring and maintaining access to significant user data can be a path to achieving market dominance and building competitive moats that lock out rivals.” The competitive value of user data “can incentivize firms to prioritize acquiring it, even at the expense of user privacy and sometimes the law.” Data abuse can “raise entry barriers and fuel market dominance” which in turn can enable harmful data practices in an “unvirtuous cycle.” These implications are heightened by the report’s key finding that “[t]he companies’ data practices posed risks to users’ and non-users’ data privacy and the companies’ data collection, minimization, and retention practices were woefully inadequate.” In an article published by The New York Times on September 19, 2024, Cecilia Kang drew attention to the two-failure conundrum that continues to haunt Congressional legislative initiatives. Self-regulation has been a failure. Yet, nearly all attempts to regulate Big Tech have also failed. (here)