A Federal Agency to Regulate Social Media Is a Good Idea. Here’s What It Should Do.
In a recent New York Times essay, Senators Elizabeth Warren and Lindsey Graham proposed a new federal agency to regulate social media and tech companies.
As the CEO and founder of LiveWorld, the longest-standing social media-related company in the world (27 years), having overseen more than 3 million hours of content moderation and engagement, managed hundreds of brand social media programs, and previously served as a senior executive at Apple, I applaud this proposal as both needed and a good idea. Here’s why, and specifically what the new agency must do.
Needed and a Good Idea
Social media has become a dominant element in most of our lives, bringing great benefits and positive experiences. But with that comes a bunch of bad stuff, too. It’s a very complex and rapidly changing media form, with daily – even hourly – changes in usage patterns driven by billions of people.
As I discuss in more detail in my eBook, “Is Privacy Dead in the Digital Age – and What to Do About It,” one-off laws and simplistic control measures won’t work and can have unintended consequences that do more harm than good. This is especially the case in social media, where users’ organic usage patterns are different from those in any other media form. And while the negative elements of social media can be quite bad and must be addressed, remember that they are only a tiny percentage of what happens daily – most of which is neutral or quite positive for people. We don’t want to reduce or damage those positive experiences. Still, with over 5 billion users, even if only 1% are bad actors or generating problems, that’s 50 million people. Fifty million people can cause a lot of trouble in a day. For the betterment of all, we need rules of the road that structurally manage the media form.
The impacts on our lives, the potential for very good and very bad, and the sheer complexity are such that this industry needs oversight and regulation. But it must be done on an ongoing, sustained basis, with depth of expertise and continuity of management. The complexity and dynamics run across the assorted companies and are beyond the scope of any one company to manage by itself. Even if well intentioned, the leaders and managers of those companies are not positioned for the task. Further, some may be motivated more by self- or company interest than by the public interest. Some of these managers might even be erratic in their decisions and behavior, with damaging impacts on both the industry and the public.
Congress has demonstrated it is woefully ill-equipped to address this through individual laws and rhetoric. There are multiple reasons for this: lack of knowledge, expertise, and experience with this new and evolving medium; legislative paralysis and lack of continuity in looking at and managing the subject; and finally, partisanship. Examples include proposed laws and pressure that have caused unintended consequences, arbitrary actions that don’t accomplish anything, and suggestions to break up the companies, which in most cases won’t help and might make things worse. A dedicated regulatory agency can address these issues, but it must be focused and nimble to do so. It must go beyond partisan rhetoric and the posturing and lobbying of the industry to find solutions that actually support the positive interests of the public while mitigating and managing the negative side of social media.
What the New Agency Should Do
While there is seemingly no end to what this new agency needs to do, it can start with seven high-impact areas of focus:
1) Transparency: Require the social networks to display where various information, such as ads and content, is coming from. They should also show what other information that entity is presenting and to whom. This information should be available both currently and historically.
2) Line Item Opt-In: Require the social networks to enable users to choose what kinds of content they do and do not wish to receive. This should be done item by item and cover both the content itself and the algorithm model (chronological, engagement, interests, people one follows, and so on) used to choose and present content. (A rough sketch of what such per-item preferences could look like appears after this list.)
3) Always Optional Opt-Out: Require the social networks to enable users to, at any time, opt out of the content and algorithms they’ve chosen as well as any other aspects of the service.
4) Tools: Require the social networks to implement easy-to-use tools that enable users to accomplish all the above.
5) Monitoring & Reporting: Require the social networks to monitor and report illegal actions, disinformation, and other problematic content. Both the content and the network’s actions on it must be reported to the agency and/or law enforcement as appropriate.
6) Education: Require the social networks to develop and provide education programs on the use of social media, their social networks, and the various above tools and processes.
Additionally, through dedicated taxes or regulation, require the social networks to financially contribute to the agency’s own education program. This program will provide materials, guidance, and funding to K-12 schools across the country for educating teachers and students on the use of social networks. This must include critical thinking, managing information flow, identifying disinformation, safeguarding self-esteem and self-image, and a further range of topics that will equip our children to manage and thrive in the ever-expanding social media age.
7) Ensure Competition: First, the agency should be tasked with reviewing and approving any mergers and acquisitions in conjunction with the Justice Department to ensure healthy competition.
Second, the agency should study the media form to better understand how scale, technology architecture, and systems integration can be in the best interest of consumers as well as how to encourage and sustain competition. In social media, simply breaking up companies won’t solve the issues; in fact, it might worsen them as consumers are then deprived of wanted services, and it may be harder to drive the needed regulatory oversight.
However, there are approaches the new agency can pursue beyond regulating M&A. For example, it can establish standard protocols for integration that enable smaller companies to create services that communicate with the larger services, and the companies must be required to support such protocols. An example is open APIs for messaging. This would enable users to message across systems owned by different companies, much the way we can make phone calls today across different carriers. A rough sketch of how such a protocol might work appears below.
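To make the open-protocol idea concrete, here is a minimal sketch of what cross-network message routing could look like. It is a hypothetical illustration only; the interface names, the address format, and the InteropRouter are assumptions made for this sketch, not part of any existing standard or any company’s actual API.

```typescript
// Hypothetical sketch of an open messaging interop protocol.
// None of these names come from an existing standard; they only
// illustrate cross-network messaging, the way phone calls work
// across different carriers.

// A user address that works across networks, e.g. "alice" on "networkA".
interface InteropAddress {
  user: string;
  network: string;
}

// The minimal message envelope every participating network would accept.
interface InteropMessage {
  from: InteropAddress;
  to: InteropAddress;
  sentAt: string; // ISO 8601 timestamp
  body: string;   // plain text here; real use would need media, encryption, etc.
}

// What each social network would have to implement to be interoperable.
interface InteropGateway {
  networkId: string;
  deliver(message: InteropMessage): Promise<void>;
}

// A toy in-memory "switchboard" of gateways, standing in for real routing.
class InteropRouter {
  private gateways = new Map<string, InteropGateway>();

  register(gateway: InteropGateway): void {
    this.gateways.set(gateway.networkId, gateway);
  }

  async route(message: InteropMessage): Promise<void> {
    const gateway = this.gateways.get(message.to.network);
    if (!gateway) {
      throw new Error(`No gateway registered for network "${message.to.network}"`);
    }
    await gateway.deliver(message);
  }
}

// Usage: two hypothetical networks exchanging a message.
const router = new InteropRouter();

router.register({
  networkId: "networkB",
  async deliver(message) {
    console.log(
      `[networkB] ${message.from.user}@${message.from.network} -> ${message.to.user}: ${message.body}`
    );
  },
});

router.route({
  from: { user: "alice", network: "networkA" },
  to: { user: "bob", network: "networkB" },
  sentAt: new Date().toISOString(),
  body: "Hello across networks!",
}).catch(console.error);
```

The design point is simply that routing happens through a shared, published envelope and gateway contract, so a small network can interoperate with a large one without needing access to its internal systems.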
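Similarly, for the line-item opt-in described in item 2, a user’s choices could be modeled as a simple, inspectable preference object. Again, this is a hypothetical sketch; the category names and algorithm labels are assumptions for illustration, not any network’s real settings.

```typescript
// Hypothetical sketch of per-item content preferences, illustrating
// the "line item opt-in" (item 2) and "always optional opt-out" (item 3).
// The categories and algorithm labels are illustrative assumptions.

type FeedAlgorithm = "chronological" | "engagement" | "interests" | "following";

interface ContentPreferences {
  // Each content category is an explicit opt-in, off by default.
  optIns: {
    politicalAds: boolean;
    sponsoredPosts: boolean;
    recommendedAccounts: boolean;
    trendingTopics: boolean;
  };
  // The user picks which ranking model builds their feed.
  feedAlgorithm: FeedAlgorithm;
}

// A new account would start with everything off and a simple feed.
const defaults: ContentPreferences = {
  optIns: {
    politicalAds: false,
    sponsoredPosts: false,
    recommendedAccounts: false,
    trendingTopics: false,
  },
  feedAlgorithm: "chronological",
};

// Opting in to one category, then opting back out later, is just
// flipping the same switch the user controls.
const optedIn: ContentPreferences = {
  ...defaults,
  optIns: { ...defaults.optIns, trendingTopics: true },
};
const optedOutAgain: ContentPreferences = {
  ...optedIn,
  optIns: { ...optedIn.optIns, trendingTopics: false },
};

console.log(optedIn, optedOutAgain);
```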
With these measures, a new federal agency can both empower users and guide and constrain companies. This will help drive the positives of social media while mitigating the negatives.
The author, Peter Friedman, is the founder and CEO of LiveWorld, the longest-standing social media-related company in the world and the one with the most years of content moderation experience. With a combination of human moderators, technology platforms, and digital agency services, LiveWorld has for over 27 years provided hundreds of programs to Fortune 500 brands in the healthcare, financial services, CPG, auto, entertainment, and Internet categories. Prior to founding LiveWorld, Mr. Friedman was Vice President and General Manager of Apple’s Internet services division, including its moderated online communities.