Meredith Whittaker, the President of Signal, Learned What Not to Do When Working at Google

Meredith Whittaker moved into the nonprofit sector last year after a career spanning academia, government work and the tech industry. She now holds a key position at the Signal Foundation.

She is now president of the organization behind one of the most widely used encrypted messaging apps in the world, relied on by tens of millions of users to keep their conversations private and out of the reach of the major internet giants.

Whittaker has good reason to be wary of for-profit businesses and how they use data: she spent 13 years at Google.

She had spent more than a decade at the search giant when, in 2017, she heard through a friend that Google’s cloud computing division was engaged in Project Maven, a contentious contract with the Department of Defense. She and other workers saw it as hypocritical for Google to work on artificial intelligence technology that could potentially be used for drone warfare, and they began discussing collective action against the company.

“People were meeting each week, talking about organizing,” Whittaker said in an interview with CNBC, with Women’s History Month as a backdrop. “There was already sort of a consciousness in the company that hadn’t existed before.”

With tensions high, Google workers then learned that the company reportedly paid former executive Andy Rubin a $90 million exit package despite credible sexual misconduct claims against the Android founder.

Whittaker helped organize a large-scale walkout, enlisting thousands of Google employees and calling for increased transparency and an end to forced arbitration for employees. The walkout was a historic event for the tech sector, which had previously seen few high-profile examples of employee action.

“Give me a break,” Whittaker said of the Rubin revelations and ensuing walkout. “Everyone knew; the whisper network was not whispering anymore.”

Google did not immediately respond to a request for comment.

Whittaker left Google in 2019 to return full time to the AI Now Institute at New York University, an organization she co-founded in 2017 that says its mission is to “help ensure that AI systems are accountable to the communities and contexts in which they’re applied.”

Whittaker never intended to pursue a career in tech. She studied rhetoric at the University of California, Berkeley. She said she was broke and needed a gig when she joined Google in 2006, after submitting a resume on Monster.com and landing a temp job in customer support.

“I remember the moment when someone kind of explained to me that a server was a different kind of computer,” Whittaker said. “We weren’t living in a world at that point where every kid learned to code; that knowledge wasn’t saturated.”

‘Why do we get free juice?’

Beyond learning about technology, Whittaker had to adjust to the culture of the industry. At companies like Google at the time, that meant lavish perks and a lot of pampering.

“Part of it was trying to figure out, why do we get free juice?” Whittaker said. “It was so foreign to me because I didn’t grow up rich.”

Whittaker said she would “osmotically learn” more about the tech sector and Google’s role in it by observing and asking questions. When she was told about Google’s mission to index the world’s information, she remembers it sounding relatively simple, even though it involved numerous complexities touching on political, economic and societal concerns.

“Why is Google so gung-ho over net neutrality?” Whittaker said, referring to the company’s fight to ensure that internet service providers treat all online traffic equally.

Several European telecommunications providers are now urging regulators to require tech companies to pay them “fair share” fees, while the tech industry says such costs represent an “internet tax” that unfairly burdens them.

“The technological sort of nuance and the political and economic stuff, I think I learned at the same time,” Whittaker said. “Now I understand the difference between what we’re saying publicly and how that might work internally.”

At Signal, Whittaker gets to focus on the mission without worrying about sales. Signal has become popular among journalists, researchers and activists for its use of end-to-end encryption, which scrambles messages so that third parties cannot read them.

Whittaker said that, as a nonprofit, Signal is “existentially important” for society and has no underlying financial incentive to deviate from its stated mission of protecting private communication.

“We go out of our way in sometimes spending a lot more money and a lot more time to ensure that we have as little data as possible,” Whittaker said. “We know nothing about who’s talking to whom, we don’t know who you are, we don’t know your profile photo or who is in the groups that you talk to.”

Tesla and Twitter CEO Elon Musk has praised Signal as a direct messaging tool, and tweeted in November that “the goal of Twitter DMs is to superset Signal.”

Musk and Whittaker share some concerns about companies profiting off AI technologies. Musk was an early backer of ChatGPT creator OpenAI, which was founded as a nonprofit. But he said in a recent tweet that it’s become a “maximum-profit company effectively controlled by Microsoft.” In January, Microsoft announced a multibillion-dollar investment in OpenAI, which calls itself a “capped-profit” company.

Beyond OpenAI’s confusing structure, Whittaker isn’t buying the ChatGPT hype. Google recently jumped into the generative AI market, debuting its own chatbot, dubbed Bard.

Whittaker said she finds little value in the technology and struggles to see any game-changing uses. Eventually the excitement will decline, though “maybe not as precipitously as like Web3 or something,” she said.

“It has no understanding of anything,” Whittaker said of ChatGPT and similar tools. “It predicts what is likely to be the next word in a sentence.”

OpenAI did not immediately respond to a request for comment.

She fears that companies could use generative AI software to “justify the degradation of people’s jobs,” resulting in writers, editors and content makers losing their careers. And she definitely wants people to know that Signal has absolutely no plans to incorporate ChatGPT into its service.

“On the record, loudly as possible, no!” Whittaker said.