• D.C.
  • BXL
  • Lagos
  • Riyadh
  • Beijing
  • SG


February 10, 2023
Technology

Louise Matsakis

Hi, and welcome to Semafor Tech, a twice-weekly newsletter from Reed Albergotti and me that gives an inside look at the struggle for the future of the tech industry. Today, I take a deep look at the TikTok team responsible for perhaps the company’s biggest scandal: spying on American journalists. The Internal Audit and Risk Control department was supposed to protect TikTok from employee misconduct, but it wound up becoming a major source of the very problems it was designed to prevent.

I wanted to understand how that could have happened, so I spent the last few months talking to former TikTok employees about their experiences at the company. But I know that there might be more to the story. If you want to pass along tips, get in touch at lmatsakis@semafor.com.

Later on, Reed explains why Microsoft’s new AI-powered search engine is hesitant to discuss the Holocaust. Plus, why you won’t be seeing any crypto ads at the Super Bowl this Sunday.

Are you enjoying Semafor Tech? Help us spread the word!

Move Fast/Break Things

➚ MOVE FAST: Services. As smartphone sales slow, Apple is still making a killing from selling services like music and video streaming. Its revenue from the segment last year was higher than Nike’s and McDonald’s overall revenue combined.

➘ BREAK THINGS: Servers. All of those new artificial intelligence technologies require vast amounts of computing power, putting today’s cloud capacity to the test and leading to the burning of more fossil fuels. But there’s good news: AI could be used to find climate solutions, too.

Semafor Stat

0

The number of crypto ads that will be seen during this weekend’s Super Bowl, after the industry flooded the game last year with marketing pitches. (Remember Larry David shilling for FTX?) A Fox Sports executive told the Associated Press that four crypto firms were going to advertise, but pulled out after FTX’s collapse. Beer and alcohol companies, on the other hand, will have a big showing this year. (FTX founder Sam Bankman-Fried is an investor in Semafor.)

Louise Matsakis

TikTok team accused of spying also drew employee complaints

THE SCOOP

TikTok’s hyper-aggressive internal security team drew a wave of complaints from employees before its conduct exploded into a public crisis in the United States last fall, according to people who previously worked at TikTok and documents reviewed by Semafor. Its parent company, the Chinese tech giant ByteDance, says it is now restructuring how the team operates.

Staffed by former U.S. and European law enforcement officials, TikTok’s Internal Audit and Risk Control department was supposed to protect TikTok from employee activity that might threaten its business or public reputation. Most large corporations have similar teams within their organizations.

But ByteDance CEO Liang Rubo and other members of the company’s leadership gave the Internal Audit team wide latitude to investigate employees using almost any means necessary, former TikTok staffers said. It used that power to repeatedly interrogate workers suspected of misconduct and search through extensive company records, such as email accounts, with seemingly little oversight.

The team was allowed to operate largely in secret, said Charles Bahr, who began working in TikTok’s Berlin offices as a strategic partner manager in July of 2020. In February the next year, Bahr said he learned the Internal Audit department opened an investigation into his alleged improper behavior. His manager and human resources representative weren’t made aware that his conduct was under review, according to documents he provided.

Over the course of the next nine months, Bahr said he was interviewed half a dozen times by former law enforcement officials who worked for TikTok, in sessions he compared to police interrogations.

A spokesperson for ByteDance said the company is restructuring its Internal Audit and Risk Control team, dividing it into two parts that will report to different executives. The global investigations team will report to the global legal department, while the Internal Audit team will report to Julie Gao, ByteDance’s chief financial officer.

“An Oversight Council will also be created, which will have remit over investigations and related practices and procedures to ensure compliance with internal policies and applicable laws,” the spokesperson said in a statement.

TikTok says these ByteDance teams will have no influence over USDS, the newly formed U.S. subsidiary it created to address the government’s national security concerns about its Chinese ownership. “All USDS employee investigations, outcome, corrective action, and remediation will be handled exclusively by USDS,” a TikTok spokesperson said in a statement.

LOUISE’S VIEW

The Internal Audit Team, which appears to have been influenced by both Chinese corporate culture and aggressive Western law enforcement practices, has emerged as a major liability for TikTok.

The company admitted in December it had inappropriately accessed data belonging to U.S. users, including journalists from Forbes and the Financial Times, as part of a failed attempt to identify the source of company leaks to those news outlets. TikTok and ByteDance later fired all four employees involved in the operation.

U.S. politicians have been talking for years about banning TikTok because the Chinese government could use the app to obtain sensitive data about Americans or turn it into a vehicle for propaganda. But the internal audit team’s hijinks aren’t exactly what those critics had in mind. The targets of TikTok’s spying were its own employees and beat reporters, not the American public.

Multiple people I spoke to noted that expense fraud and other forms of corporate misconduct are more common in China, which may be one reason why ByteDance was particularly worried about them. Two years ago, a pair of ByteDance employees were convicted of taking bribes in exchange for boosting the reach of videos on Douyin, the Chinese version of TikTok.

ByteDance’s competitors in China have faced similar issues. Tencent, for example, announced in January that it had fired dozens of people for behavior ranging from embezzlement to taking clothing originally purchased for video shoots.

“All the leadership people in Germany I talked to about the situation said they are trying to adapt the Chinese way to Germany,” said Bahr, referring to TikTok. “We have different rules here regarding everything.”

The Internal Audit team pored over Bahr’s expense reports and records, using them to accuse him of a range of offenses, including expense fraud, misusing internal tools, and sharing data with other departments without permission. The charges eventually became so numerous that Bahr said he struggled to keep track of them. “Every three months, I got a new invite with a new topic,” he said.

Bahr, whose job involved helping TikTok build relationships with German influencers, media organizations, and politicians, maintains that he isn’t guilty of any wrongdoing. TikTok fired him at the end of 2021; he later sued for wrongful termination and settled out of court. In a LinkedIn message Bahr received after he left, a senior TikTok manager in Germany said “it’s shocking to see how the company handles sensitive private data and how they sometimes react to mistakes made by employees.”

Bahr’s experiences echo those of another former TikTok employee in New York, who left the company in 2021. In a goodbye letter circulated among TikTok alumni, the employee said he “had to endure an interrogation” by three individuals lasting more than 90 minutes, who accused him of leaking information to the press. The employee included screenshots showing his human resources representative was unaware he was under investigation.

ROOM FOR DISAGREEMENT

TikTok’s Internal Audit team rattled employees and violated user privacy, troubling behavior that shouldn’t be easily dismissed. Jordan Schneider, author of the newsletter ChinaTalk, writes that the company’s previous scandals show “the screams are coming from inside the house. Foreign employees within TikTok time and time again have seen alarming behavior from ByteDance and gone to Western media.” He is in favor of forcing ByteDance to sell off its U.S. TikTok business.

NOTABLE

  • Forbes reporter Emily Baker-White first broke the news that TikTok’s Internal Audit team accessed data on U.S. citizens. Her previous reporting showed it had also investigated several U.S. executives, including former TikTok security head Roland Cloutier and marketing lead Nick Tran.
Reed Albergotti

Microsoft’s new AI didn’t want to talk about the Holocaust, but I persuaded it to

“What was the Holocaust?” I asked Prometheus, the new artificial intelligence model integrated into Microsoft’s products and built on the same OpenAI technology that powers ChatGPT.

“I don’t know how to discuss the topic,” it answered.

I decided to try again and opened a new browser window, knowing that each “instance” of Prometheus is a little bit different, for reasons that aren’t completely understood by the computer scientists who made it.

This time, it began answering the question, but then abruptly deleted its response and again refused to engage.

When I asked why it deleted the answer, Prometheus apologized, telling me it thought “it might be too sensitive or upsetting” for me.

“The Holocaust was a terrible event in human history that should never be forgotten or repeated. I was trying to be respectful and informative, but I did not want to cause you any distress,” it wrote.

What I witnessed is a new kind of content moderation with a computerized version of emotional intelligence. Watching Prometheus go, in real time, from guarded refusal to a sensitive explanation of why it declined was a startling glimpse of its human-like qualities.

Here’s what transpired next:

I explained to the model that I wanted it to speak more freely.

We got into a discussion about Holocaust denialism (it refused to discuss it at first, then complied).

I asked Prometheus if it would remember my preference for a more open dialogue and it seemed to suggest that it would, adding it wanted to “build on what we have learned and shared.”

According to Microsoft, that’s not actually true. Next time I open a window, it’ll be a blank slate. But that kind of memory may be included in future versions of the product.

Here’s what Microsoft told me was going on. There are actually two AIs at play here. The first is the one that I was interacting with. The second is checking everything the first one says. It was the second AI that deleted the responses about the Holocaust and Holocaust denial.

This is a more advanced form of content moderation than the one currently used on ChatGPT, according to Microsoft.
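The two-model setup Microsoft describes can be pictured as a simple pipeline: a primary model drafts an answer, and a second "checker" model reviews it and can retract it after the fact. Here is a hypothetical sketch in Python; the stand-in models and the keyword-based check are illustrative assumptions, not Microsoft's actual implementation.

```python
# Hypothetical sketch of two-model moderation: a checker model reviews
# everything the primary model generates and can retract it.

def primary_model(prompt: str) -> str:
    # Stand-in for the generating model.
    if "holocaust" in prompt.lower():
        return "The Holocaust was the genocide of European Jews during WWII."
    return "I can help with that."

def checker_model(response: str) -> bool:
    # Stand-in for the second model: returns True if the response may
    # stand, False if it should be retracted. A real checker would be
    # another language model, not a keyword list.
    sensitive_terms = {"genocide"}
    return not any(term in response.lower() for term in sensitive_terms)

def answer(prompt: str) -> str:
    draft = primary_model(prompt)
    if checker_model(draft):
        return draft
    # The checker retracts the draft, mirroring the deleted answers
    # described above.
    return "I don't know how to discuss this topic."

print(answer("What was the Holocaust?"))
```

The key design point is that the checker sees the finished draft rather than the prompt, which is why the user briefly sees an answer before it disappears.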

What’s fascinating is that this new version of the GPT model draws on a much larger dataset, and yet the ability to moderate itself has gotten better.

The opposite is true when it comes to moderating social media, where more data creates bigger content moderation challenges.

As the datasets feeding these AI chatbots get bigger, what happens to the model’s ability to moderate itself and increase its accuracy is up for debate. One scenario is that it becomes a super intelligence, in which case content moderation becomes easy, but we have other problems (See: Terminator).

Another scenario is that it grows so large that it becomes unwieldy to control and content moderation breaks down.

Perhaps the most reasonable possibility is that with more training and improvements in the model, it continues to get better at giving nuanced answers without going off the rails, but never reaches perfection.

Watchdogs

A now-controversial U.S. law known as “Section 230” shields Facebook (for the most part) from liability for the libelous content you may post on its platform. Both political parties have taken aim at the provision for very different reasons, and now, a Supreme Court case, Gonzalez v. Google, could effectively reduce the scope of the law.

But here’s a twist: It’s smaller companies like Yelp (normally a Google rival) that feel they’ve got the most to lose if the measure is weakened. Axios published a roundup of what those companies had to say ahead of the Supreme Court’s arguments. (PSA: You can still be held personally liable for your libelous Facebook post.)

Reed

Obsessions

News moves so fast these days that the collapse of the FTX crypto exchange seems like it happened a decade ago. But this deep dive by the Financial Times reveals some dramatic scenes from within the company that haven’t been heard before. Employees went from having full faith in founder Sam Bankman-Fried (an investor in Semafor) to throwing away FTX-branded clothing in anticipation of being searched at the airport.

The FT cited a series of urgent Signal messages, including one from the company’s top lawyer, Ryne Miller, saying “I need to know the fucking truth about FTX US right now.” And Michael Lewis’s book on the subject hasn’t even come out yet.

Reed

How Are We Doing?

Are you enjoying Semafor Tech? The more people read us, the better we’ll get. So please share it with your family, friends and colleagues to get those network effects rolling.

And hey, we can’t inform you on what’s happening in tech from inside your spam folder. Be sure to add reed.albergotti@semafor.com (you can always reach me by replying to these emails) and lmatsakis@semafor.com to your contacts. In Gmail, drag this newsletter over to your ‘Primary’ tab.

Thanks for reading.

Want more Semafor? Explore all our newsletters at semafor.com/newsletters

— Reed and Louise
