LITA President’s Program: Meredith Broussard on “Artificial Unintelligence”

The Library and Information Technology Association (LITA) President’s Program at the 2019 ALA Annual Conference featured Meredith Broussard, professor of journalism at New York University’s Arthur L. Carter Journalism Institute. She was previously an editor at the Philadelphia Inquirer, Slate, and Harper’s Magazine. Her talk was based on her book, Artificial Unintelligence: How Computers Misunderstand the World.

[Image: array of ten book covers under the heading “Read the Resistance”]
Meredith Broussard encouraged the audience to “read the resistance,” recommending other books in addition to her own that examine technology through a critical lens: Weapons of Math Destruction, Black Software, Automating Inequality, Race After Technology, Brotopia, Behind the Screen, Algorithms of Oppression, Twitter and Tear Gas, and Programmed Inequality.

Broussard opened with the context that frames her talk, asking whether technology can ever be neutral and free from prejudice. Some believe that technology is an equalizer in a democratic society, but it can also amplify discriminatory practices and perpetuate wrong assumptions. She pointed out familiar dichotomies: developers vs. maintainers, creators vs. documentarians, hard skills vs. soft skills. She noted that Alexa, the smart voice assistant, has a female voice, while HAL, the computer in 2001: A Space Odyssey, has a male voice. Technology is created by a small circle of white men, one from which women and people of color are largely excluded.

She then set out to define what artificial intelligence is and what it is not. The course she teaches, data journalism, features the use of artificial intelligence (AI) in investigative reporting, a field surrounded by a great deal of misunderstanding. AI conjures up pictures of the Terminator and characters from Star Trek and Star Wars, all of them Hollywood creations. As a discipline, AI encompasses areas such as natural language processing and machine learning. In its strictest sense, AI is complex math, and machine learning is the concept that allows us to grasp AI most easily. As an example of AI applied to data journalism, she mentioned the sinking of the Titanic and how insurance companies could determine premiums for passengers based on the kind of tickets they purchased, whether steerage or first class. Passengers in the holds were more likely to die in a shipwreck because first-class travelers had first access to the lifeboats. In a disaster at sea, poorer people were thus unfairly penalized for their lower economic status.
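To make the machine-learning side of this concrete, here is a minimal sketch, not taken from the talk itself, of the kind of exercise Broussard walks through in her book: fitting a simple classifier that predicts Titanic survival from ticket class. The scikit-learn library and the tiny hand-made dataset below are assumptions for illustration only; the real exercise uses the full passenger manifest.

```python
# Illustrative sketch only: a toy classifier predicting Titanic survival
# from passenger class. The data below is hypothetical, not the real manifest.
from sklearn.linear_model import LogisticRegression

# Features: [passenger_class], where 1 = first class and 3 = steerage.
# Labels: 1 = survived, 0 = did not survive.
X = [[1], [1], [1], [2], [2], [3], [3], [3], [3], [3]]
y = [1, 1, 0, 1, 0, 0, 0, 1, 0, 0]

model = LogisticRegression()
model.fit(X, y)

# The model "learns" that cheaper tickets correlate with lower survival.
# The pattern it finds reflects social inequality in the data, not anything
# neutral or objective about the passengers themselves.
print(model.predict_proba([[1], [3]]))  # survival probabilities by class
```

The point of such a sketch is that the “intelligence” here is just pattern-fitting on historical data, so whatever inequity produced the data is reproduced in the predictions.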

Some may think that machines, once fed with data, are better than humans at making decisions. As it turns out, however, people, in this case white men such as Alan Turing, Larry Page, and Sergey Brin, embed their own biases in the technology they build. Creators of technology should come from more diverse backgrounds in order to serve people with different needs. For instance, the Apple Watch, presumably designed by men, did not include an application to help women track their menstrual cycles. “Technochauvinism” persists, as also evidenced by the declining number of female members of the American Mathematical Society between 1985 and 2016.

In order to make technology inclusive, we must keep in mind whom technology is built for. Broussard cited the example of an automatic soap dispenser that works only for users with lighter skin: the sensor technology was designed to detect lighter skin and fails to register darker skin. All too often, artificial intelligence reinforces existing power and privilege.

Broussard concluded her talk with a challenge to her audience: we should design learning models “to create a world as it should be,” not as it is. She has been writing on the World Wide Web since the late 1990s and wonders whether we will be able to read yesterday’s news on tomorrow’s computers. In order to “future proof” the news, we should preserve this first draft of history. Cutting-edge digital news is disappearing; the Internet Archive captures only static files. It is up to humans to find ways to preserve digital content.
