Investors Demand That Facebook Do Better to Curb Sexual Abuse


The balancing act between personal privacy and public safety has bedeviled Big Tech since the advent of instant messaging in the mid-1990s. From the beginning, the thorniest issues arose from the online sexual exploitation of children. But are technology companies responsible for the criminal use of their platforms? Many big investors now say they are. And that has led to one of this year’s most memorable shareholder initiatives, in which Lisette Cooper took on Facebook.

Cooper is a well-known advisor, the vice chair of Fiduciary Trust, Franklin Resources’ (ticker: BEN) $25 billion wealth management arm. An approachable investor with a doctorate in geology from Harvard University, Cooper has long been troubled by the growth in online child exploitation, and made preventing it a part of her professional work years ago.

This year, Cooper asked fellow Facebook (FB) shareholders if the steady increase in online child exploitation posed a risk to their investment in the social-media juggernaut. Facebook was adding privacy tools such as end-to-end encryption, in which only the participants in a conversation can read its contents; neither law enforcement nor anyone else, including Facebook itself, can see the data. It’s a boon for privacy, and for predators. Cooper advised shareholders who agreed with her to back her proposal directing Facebook’s board to assess the risks.
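The core property of end-to-end encryption described above can be sketched with a toy Diffie-Hellman key exchange: each endpoint derives the same secret key from values exchanged publicly, so a platform relaying the messages never holds the key itself. This is a deliberately insecure classroom demo with a small modulus, not the Signal-style protocol Messenger or WhatsApp actually use; all parameter choices here are illustrative assumptions.

```python
import hashlib
import secrets

# Toy parameters for demonstration only. Real deployments use
# vetted groups or elliptic curves, never ad hoc values like these.
P = 2**127 - 1  # a prime modulus
G = 3           # a public generator

# Each endpoint keeps its private exponent to itself.
alice_priv = secrets.randbelow(P - 2) + 1
bob_priv = secrets.randbelow(P - 2) + 1

# Only these public values ever cross the network; they are all
# an intermediary (the platform, or a wiretap) could observe.
alice_pub = pow(G, alice_priv, P)
bob_pub = pow(G, bob_priv, P)

# Each side independently computes the same shared secret:
# (G^b)^a mod P == (G^a)^b mod P.
alice_secret = pow(bob_pub, alice_priv, P)
bob_secret = pow(alice_pub, bob_priv, P)
assert alice_secret == bob_secret

# Hash the shared secret down to a symmetric session key known
# only to the two endpoints.
session_key = hashlib.sha256(str(alice_secret).encode()).digest()
print(len(session_key))  # a 32-byte key
```

The point of the sketch is the trust boundary: because the private exponents never leave the endpoints, no "backdoor" short of weakening the math itself gives a third party the session key, which is exactly the tension between Facebook's position and law enforcement's.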

“Privacy tools are good, but they have implications for child predators and the exploitation of children online,” said Cooper in an interview with Barron’s. “Our concern is that kids be safe, that law enforcement can access the material so they can find the kids and prosecute the predators, or stop someone from harming hundreds of children.”

Facebook opposed the measure and, like the rest of Big Tech, has generally opposed creating backdoors into encryption, arguing that it weakens security. “Strong encryption is important to keeping everyone safe from hackers and criminals,” a Facebook spokesperson told Barron’s. “We disagree with those who argue privacy mostly helps bad people, which is why we’ll continue to stand up for encryption.”

There are plenty of laws to hold companies accountable for facilitating sex trafficking on their platforms. Still, incidents of abuse are growing swiftly. In 2019, there were more than 16.8 million reports of online child sexual abuse material, including graphic and violent images and videos, up from 10.2 million reports in 2017, according to the National Center for Missing & Exploited Children, or NCMEC.

One company stood out: In 2019, some 94% of the reports stemmed from Facebook and its platforms, including Messenger and Instagram. The center said the next closest, Google, accounted for 2.7%.

Cooper founded institutional investment firm Athena Capital in a Boston suburb in 1993. Some 90% of clients were family offices, many controlled by women interested in expressing values through investments. Athena helped fund the Women’s Inclusion Project, an impact-investing initiative, with shareholder-advocacy firm Proxy Impact and clients of other major advisors, such as Aperio, Veris Wealth Partners, and Tiedemann Advisors. Initially, they worked on gender-lens campaigns like equal pay. Soon they began working on child sexual exploitation.


In 2019, the group campaigned against Verizon Communications (VZ), asking Verizon’s board to evaluate the risks of potential child sexual exploitation through its products. Apple (AAPL) had already threatened to remove Verizon’s Tumblr app from its App Store after finding a significant amount of child pornography on the site. The resolution won 34% of the vote. After the vote, Verizon created a new digital safety hub on its website, beefed up its child-safety program, and appointed a digital safety lead officer.

Then came Facebook. Cooper and Proxy Impact asked for a meeting; they say Facebook never answered. In December 2019, they filed their shareholder resolution. From the start, Cooper was hands-on, sitting in on the calls and reaching out to other institutional shareholders. “I have worked on 500 shareholder resolutions,” says Michael Passoff, CEO of Proxy Impact. “Lisette was only the second person who wanted to be involved personally. That was really rare.”

Facebook advised investors to reject the proposal, pointing out that it had partnerships with NCMEC and other nongovernmental organizations, and that it used sophisticated technology to detect child-exploitation imagery and potentially inappropriate interactions between minors and adults, including artificial intelligence and photo and video technology that detected more than 99% of the users and content that it removed for violating its policy.
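The detection technology described above commonly works by fingerprinting uploads and comparing them against hash lists of previously reported material, the general approach behind industry tools such as Microsoft's PhotoDNA. Facebook's actual systems are proprietary and use perceptual hashes that survive resizing and re-encoding; the sketch below is a toy version using exact SHA-256 digests and made-up placeholder data, so it illustrates only the lookup flow, not the robustness.

```python
import hashlib

# Hypothetical database of digests of previously reported images,
# standing in for the hash lists that clearinghouses such as NCMEC
# share with platforms. The bytes here are placeholders.
known_bad_hashes = {
    hashlib.sha256(b"previously-reported-image-bytes").hexdigest(),
}

def should_flag(image_bytes: bytes) -> bool:
    """Return True if an upload matches a known reported image."""
    return hashlib.sha256(image_bytes).hexdigest() in known_bad_hashes

# A re-upload of known material matches; novel content does not.
print(should_flag(b"previously-reported-image-bytes"))  # True
print(should_flag(b"an-unrelated-photo"))               # False
```

The limitation this makes visible is the one at the heart of the encryption debate: hash matching requires the platform to see the content, which is impossible once messages are end-to-end encrypted in transit.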



This wasn’t enough for Cooper, who lobbied for more support. Institutional Shareholder Services and Glass Lewis, the big proxy advisors, agreed to back the resolution. Franklin bought Athena in early 2020, so Cooper set out to persuade Franklin’s analyst covering Facebook. Eventually, she said, Franklin decided to vote all of its shares in favor of Cooper’s resolution. Franklin said that it had nothing further to add to Cooper’s comments. Today, Franklin holds about four million Facebook shares, according to Bloomberg.

Cooper soon learned she had another reason to work the phones. A couple of weeks before the big news conference in May, she asked her 22-year-old daughter, Sarah, whether she had any stories to share about Facebook. Mother and daughter were briefly estranged in 2015 when Sarah turned 18, changed her phone number, and moved out of the house. That year, for several weeks, Cooper hadn’t heard from Sarah, except for a mysterious call in which her daughter said, sadly, “I miss my mom.” But now they were tight again, and when Cooper asked, she thought Sarah might share a story or two. “I thought, oh, she might have sent some sexy pictures or some normal teenage thing,” Cooper recalls.


Photograph by Mary Beth Koeth

A day or two later, Sarah came to Cooper in the sunroom and told her mother the following story: When she was 16, Sarah met a man on Facebook whom she calls J. He admired her, told her she looked sexy and, like Sarah, loved reading the Twilight books and listening to Nicki Minaj. She sent him nude pictures. She lived for his messages on Facebook Messenger. When she turned 18, they made plans to meet.

Sarah told her mother that when she got into his car, he brought her to a nearby house where he forced her to drink shots and take cocaine. There he forced her to have sex with him and another woman as somebody filmed them. Then he brought her to a motel in New York state, where he locked her into a room, raped her, and forced her to have sex with customers. One day, when the guards that her rapist had posted weren’t looking, she called a family friend on the hotel phone. A day later, he arrived. As he circled the parking lot, Sarah ran out and leaped into his car. J and his guards gave chase. The family friend gunned the engine back to Boston, where they arrived safely.

Cooper was floored. It was such a terrible story that she told Sarah that staying away from the news conference might be better. “We went back and forth for a week. It was a terrible situation,” Lisette recalls. But Sarah pressed; she wanted to do it. “It was a huge, huge leap of faith to come forward,” Sarah told Barron’s. “I was going through my own journey of wanting to help others.”

Both Sarah and Cooper spoke tearfully at the news conference. The next week, Cooper’s resolution received 12.6% of the vote. Facebook founder Mark Zuckerberg and management control 88% of the vote through supervoting shares. Take those out, and Cooper’s resolution was backed by 43% of the remaining, nonmanagement-owned shares. That is a remarkable level of support compared with what even popular shareholder resolutions typically receive.

When Sarah finally decided to tell Lisette her story this past spring, she had been studying psychology and, as part of her senior project, needed to pull together all that she’d learned.

Now 23, Sarah will graduate in a few weeks. She and her mother are on good terms. “Now, we have the ability to collaborate, which is fantastic,” Sarah says. It has been painful to share her story, but Sarah has spoken publicly to a variety of organizations on the topic of child sexual abuse, determined that her experience won’t be repeated.

Sarah and Lisette declined to discuss any interactions they’ve had with law enforcement.



Facebook pledged to encrypt its messaging services in 2019. WhatsApp, used by more than two billion people in 180 countries, already has end-to-end encryption. That’s not yet the case for Messenger; in an email to Barron’s, a Facebook representative said the company “is committed to making Messenger end-to-end encrypted.” The spokesperson added, “Facebook leads the industry in combating child abuse online, and we’ll continue to do so on our private messaging services.”

It isn’t an either/or, says Cooper. She’d like to see Facebook hire more live monitors to sift through the vast amounts of data to find abuses that aren’t caught by the company’s artificial intelligence, and to strengthen age-verification protocols to keep predators and children apart.

Meanwhile, Facebook has faced a variety of other challenges. Congress has started looking at the alleged monopolistic power of Big Tech. This year, the Senate introduced the Lawful Access to Encrypted Data Act, or LAEDA, which would require tech companies to help law enforcement access their encrypted devices and services when authorities obtain a search warrant.

The European Union has made fighting child sexual abuse a priority, saying end-to-end encryption “makes identifying perpetrators more difficult, if not impossible.” Says Cooper: “If Facebook doesn’t find a solution voluntarily, it faces challenges from customers, advertisers, and regulators. A legislative solution will end up mandating lawful access. There’s already regulatory scrutiny and pressure on the antitrust side.”

“Lisette does a remarkable job of combining her tremendous professional skills and intelligence with a mother’s pain and anguish,” says Lori Cohen, executive director of ECPAT-USA, a leading anti-child-trafficking organization. “If law enforcement can’t get access to data, then all of our children become vulnerable to criminal exploitation.”

Cooper intends to bring the resolution again, before Facebook’s Dec. 11 deadline for filing shareholder proposals for its next proxy ballot.

Write to Leslie P. Norton at leslie.norton@barrons.com
