How Big Tech Threatens the Culture of Our Communities

Facebook and its ilk bombard us with vitriolic content, and their algorithms help to divide Americans. Local-government leaders need to keep this in mind when they offer up incentives to attract their operations.

Former Facebook employee Frances Haugen testifies before a Senate panel on Capitol Hill Oct. 5, 2021, in Washington, D.C.
(Drew Angerer/Pool/ABACAPRESS.COM/TNS)
Former Facebook employee turned whistleblower Frances Haugen testified before a U.S. Senate subcommittee last week that Facebook and some other social media companies allow their platforms to be used for anti-democratic, divisive and hateful purposes. She also alleged, citing internal documents she took while working as a product manager at Facebook, that the company possessed research proving this but that its leadership ignored it because the company makes huge profits from the contentious and vitriolic content posted on its platform.

Much of this discussion has been pitched as a problem of technology and policy with global implications, but little attention has been given to the implications for local communities. I see three potential areas of concern for local-government officials: First, many fringe groups are using news feeds and content from social media to target public officials, even to the point of threatening their lives. Second, local governments are either purchasing technology or outsourcing network infrastructure to private firms that use some of the same technologies for which Facebook and others are under fire. And finally, local governments are offering Facebook, Google and other Big Tech firms lucrative tax breaks to lure their operations to their communities.

Sadly, the issues raised by Haugen before Congress and on “60 Minutes” are hard for some to understand. Facebook, Google, Twitter and other social media and search platforms collect a large amount of data on users through web cookies and other tracking technologies. What is problematic is what happens next: Facebook and other companies use the data to determine preferences and tastes and then target users with ads and content using algorithms and artificial intelligence.

That becomes a problem for public officials when content and advertisements are used for unlawful or anti-democratic purposes. For example, the noisy and sometimes violent protests against public school officials who try to enforce mask mandates no doubt blur the line between free speech and criminal conduct. Many protesters we see on cable news and in social media are pushing, shoving and threatening to show up at school board members’ homes. These mostly right-wing activists use social media to mobilize crowds by feeding them a diet of conspiracy theories and false information. Much of this is related to how tech companies use algorithms and artificial intelligence to deploy content on their platforms.

Then there is the problem of content itself. The “algorithmic amplification” process sometimes takes users to sites they are not looking for. Recently, when I was searching for pro-civil rights sites, I was redirected to sites of organizations opposed to civil rights. I immediately recognized the problem, but what if I was a young student who was not as familiar with the topic or how Internet searches work? This problem has implications for both information integrity and education.

An equally serious problem, and one that is harder to detect, is when algorithms are set up in a way that results in discriminatory outcomes against women, racial and ethnic groups, or low-income individuals. My daughter, applying recently for a position with a technology firm, filled her letter of interest with strange jargon that was unfamiliar to me. I asked her why. She told me she had read on a social media site that the initial batch of letters and résumés would be screened by applicant tracking system technology that looked for certain keywords. If those words and phrases were missing, the application would be automatically rejected.
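The screening my daughter described can be pictured as a simple keyword filter. The sketch below is hypothetical, not any real vendor's system; the keyword list and the sample application texts are invented purely for illustration:

```python
# Hypothetical sketch of applicant-tracking keyword screening.
# The keyword list and sample texts are invented for illustration;
# real systems are proprietary and more elaborate.

REQUIRED_KEYWORDS = {"agile", "stakeholder", "cross-functional"}

def passes_screen(application_text: str) -> bool:
    """Return True only if every required keyword appears in the text."""
    text = application_text.lower()
    return all(keyword in text for keyword in REQUIRED_KEYWORDS)

strong_candidate = "Led a small team building school data dashboards."
jargon_candidate = ("Agile cross-functional collaborator aligning "
                    "stakeholder priorities.")

print(passes_screen(strong_candidate))  # False: no keywords, rejected
print(passes_screen(jargon_candidate))  # True: jargon matches
```

A filter like this looks neutral, but whoever chooses the keyword list decides which candidates survive the first cut, and a qualified applicant who describes the same skills in different words is silently discarded.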

Equity, diversity and inclusion in workforce development would be seriously undermined if local governments outsourced HR to firms that made use of such technologies. The use of supposedly “neutral technology” that nevertheless discriminates by screening out ethnic-sounding names, or de-prioritizing applications from historically Black colleges and universities like Spelman and Morehouse or from women’s colleges like Wellesley and Bryn Mawr, would create a huge problem for public officials. If tech companies or local governments are not held accountable for what some have called “a new Jim Code,” the future consequences for a diverse republic could be grave. This potential problem should be addressed from the outset, when public officials draw up requests for proposals. Technology isn’t neutral; the systems are designed by humans, and they take on the biases and limited experiences of their developers and designers.

Finally, I believe one of the most important questions local governments will have to answer is whether they should continue to subsidize the move of Big Tech companies into their communities if any of Haugen’s allegations are determined to be true. Many companies come to local governments promising to bring hundreds of good-paying data center jobs and broadband for underserved communities. Facebook is building a data center in an industrial park whose campus spans three counties about 45 miles east of Atlanta, and as far as I can tell no one asked about user data or AI issues. In fact, mayors and county executives often brag about luring a big firm to their jurisdiction.

I recommend that local officials pay close attention to the congressional hearings underway and assess for themselves the larger impact these companies might be having on society. Are hate groups using the platforms to mount insurrection against our democracy? Are our most fragile residents, children and the poor, exploited by these technologies because they depend on them as their primary information source? And are the platforms deliberately separating us and making it difficult for us to form consensus and be civil to one another?

Facebook and the other titans of Big Tech, the chairman of the committee before which Haugen testified told his Senate colleagues, “are facing their Big Tobacco moment.” Are the dangers of this largely unregulated industry equivalent to those of the tobacco industry of the past? Tobacco companies knew their products caused cancer and were addictive, but they made huge profits, so they ignored the dangers to society. If the allegations leveled by Haugen and others turn out to be true, we might end up paying a bigger cost this time around: the loss of democracy and civilization. It's too high a price to pay.

Governing's opinion columns reflect the views of their authors and not necessarily those of Governing's editors or management.