China’s powerful cyberspace regulator has taken the first step in a groundbreaking — and uncertain — government effort to rein in the automated systems that shape the Internet.
Earlier this month, the Cyberspace Administration of China published summaries of 30 core algorithms belonging to two dozen of the country’s most influential internet companies, including TikTok owner ByteDance Ltd., e-commerce giant Alibaba Group Holding Ltd. and Tencent Holdings Ltd., owner of China’s ubiquitous WeChat super app.
The milestone marks the first systematic effort by a regulator to force Internet companies to disclose information about the technologies that power their platforms, which have shown the ability to radically change everything from pop culture to politics. It also sets Beijing on a path that some technology experts say few governments, if any, are equipped to handle.
The public versions of the applications explain in plain language what types of data a given algorithm uses and what it does with the data. In many cases, they provide less detail than what Facebook voluntarily discloses to users about how it ranks content in its News Feed.
The full filings, which are not public, contain more extensive descriptions of data and algorithms, some of which are considered confidential business information, people familiar with the filings said. They also include a self-assessment of potential security risks, according to public records of what the regulator asked the companies to provide.
Companies submitted the information in response to a new law, in effect since March, that directs regulators to curb the negative effects of algorithms, such as amplifying harmful information, violating user privacy and abusing gig workers. The law also requires that algorithms be used to promote “positive energy,” a Xi Jinping-era phrase for content that lifts public sentiment and treats the Communist Party favorably.
Beijing is not alone in trying to limit the power of algorithms that underpin the internet. Regulators in the US and EU are grappling with similar issues, such as how to protect teenage mental health and stamp out viral misinformation.
However, the Chinese law represents the most assertive attempt to directly monitor algorithms. Ultimately, it can be applied to any service in the country that uses algorithmic technology.
“They’re doing things that no one else has tried yet, and the rest of the world can learn from what works and doesn’t work,” said Graham Webster, who runs the DigiChina Project at Stanford University, which tracks China’s digital policy developments.
An important question the effort raises, algorithm experts say, is whether direct government regulation of algorithms is practically possible.
The majority of today’s internet platform algorithms are based on a technology called machine learning, which automates decisions such as ad targeting by learning to predict user behavior from large data stores. Unlike traditional algorithms that contain explicit rules coded by engineers, most machine learning systems are black boxes, making it difficult to decipher their logic or predict the consequences of their use.
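The contrast can be sketched in a few lines of illustrative Python. Everything here is hypothetical (the function names and data shapes are invented for the example, not drawn from any real platform): a traditional algorithm’s ranking rule is readable in its source code, while a machine-learning-style system’s behavior is a byproduct of the data it was trained on and cannot be read off the code alone.

```python
# 1) A traditional algorithm: the ranking rule is explicit and auditable.
def rank_explicit(posts):
    """Newest posts first -- the logic is visible in the code itself."""
    return sorted(posts, key=lambda p: p["timestamp"], reverse=True)

# 2) A learned system (toy version): behavior comes from past user data.
#    We "learn" a per-topic engagement score from a log of (topic, clicked)
#    pairs, so the resulting ranking depends on the data, not on any rule
#    an inspector could find in the source.
def learn_scores(click_log):
    tallies = {}  # topic -> (views, clicks)
    for topic, clicked in click_log:
        views, clicks = tallies.get(topic, (0, 0))
        tallies[topic] = (views + 1, clicks + clicked)
    return {t: clicks / views for t, (views, clicks) in tallies.items()}

def rank_learned(posts, scores):
    """Rank by learned engagement score; unknown topics score 0."""
    return sorted(posts, key=lambda p: scores.get(p["topic"], 0.0), reverse=True)
```

Feeding the same posts through each ranker can yield different orderings, and only the second ordering shifts as the click log changes, which is one reason auditing such systems requires the data, not just the code.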
Beijing’s interest in regulating algorithms started in 2020, after TikTok sought an American buyer to avoid being banned in the United States, according to people familiar with the government’s thinking. When several bidders for the short-video platform lost interest after Chinese regulators announced new export controls on recommendation technology, it underscored for Beijing the importance of algorithms, the people said.
China’s cyberspace administration moved quickly to draft a new law on algorithmic recommendation systems, seeking in particular to understand how the country’s tech companies shape online discourse and how to curb that influence, people familiar with the matter said.
By January 2022 the law was ready, and it came into force two months later, an impressive pace for a government that sometimes sits on draft laws for years, Mr. Webster said.
The Cyberspace Administration did not respond to a request for comment.
China’s law shocked people in US technology policy circles because of its scope and aggressiveness, according to Suresh Venkatasubramanian, a computer science professor at Brown University who served as assistant director of the White House Office of Science and Technology Policy until this month.
Some in the US government were intrigued when Facebook whistleblower Frances Haugen argued before Congress last fall for limits on the social media company’s algorithms, according to Mr. Venkatasubramanian. However, regulators were concerned that such limits would set a precedent for state control over the flow of information.
“Once you go down that road, it’s very hard to go back,” he said.
EU regulators, facing the same questions, have been more forceful but have still avoided direct government reviews of algorithms.
In July, the European Parliament passed legislation requiring the biggest platforms, such as Google and Facebook, to conduct regular assessments of their systemic risks, such as whether they are spreading illegal content. The companies can choose how to manage these risks, including adjusting their algorithms, but must submit to independent audits to prove that their solutions actually work.
Implementation and enforcement details of the EU law are vague, policy experts say. “It will take years and years of battles and maybe even lawsuits” to interpret the law, said Matthias Spielkamp, CEO of AlgorithmWatch, a Berlin-based nonprofit research and advocacy organization.
Beijing’s approach also remains vague. In theory, the Chinese law could give the government full control over the main mechanisms that orchestrate online spaces and, increasingly, offline life as well. Still, Beijing may well be tripping over its own ambitions, tech experts say.
Social media recommendation engines represent some of the most complicated algorithmic systems, with apps like Facebook and TikTok using hundreds or even thousands of algorithms to determine who sees what information.
Having detailed documentation, or even the code, of these systems is not enough to understand how they will affect something as broad as online discourse, according to Cathy O’Neil, an algorithmic auditor who works with US government agencies to audit company algorithms. “What’s actually important is the data that goes through the algorithm,” she said.
Even with full access to this data, which changes with every user input and interaction, a tech company’s own engineers still struggle to precisely adjust the behavior of its systems, according to Ms. O’Neil. Targeted changes like promoting more propaganda are possible, she said, “but it’s actually impossible to control what a recommendation engine does overall.”
Tech analysts and industry insiders also question whether the Cyberspace Administration, which started as a propaganda arm, has the technical expertise to enforce its own rules.
Shortly after the Chinese law took effect, government relations managers and algorithm engineers at ByteDance met with Cyberspace Administration officials to explain the documents they submitted, people familiar with the matter said. During one of those meetings, agency officials showed little understanding of the technical details, and company representatives had to rely on a mix of metaphors and simplified language to explain how the recommendation algorithm worked, one of the people said.
Companies have not been required to submit code or user data, the people said.
Chinese authorities’ guidelines issued last year called on several agencies to expand staff to monitor algorithms.
“They’re trying to build the tools, hire people and get the technical expertise to tackle this kind of thing,” said Kendra Schaefer, head of tech policy research at Beijing-based strategic advisory consultancy Trivium China. “So enforcement of this will increase slowly over the next five to 10 years.”
— Raffaele Huang contributed to this article.
Write to Karen Hao at email@example.com
Copyright ©2022 Dow Jones & Company, Inc. All rights reserved.