More researchers needed to rid the internet of harmful material, UK communications boss says at Northeastern conference

Ofcom chief executive Melanie Dawes calls for “potential groundbreaking change” during the Internet and Society event in London.

Ofcom chief Melanie Dawes and Tom Wheeler, the former FCC chairman, take part in a fireside chat at the Internet and Society event. Photo by Carmen Valino for Northeastern University

LONDON — The head of the U.K.’s Office of Communications wants “as many researchers as possible” to “shine a light” on what is happening “behind closed doors” in order to make the internet a safer place.

Ofcom chief executive Melanie Dawes made the intervention at the “Internet and Society: The Trans-Atlantic Research Future” conference arranged by Northeastern University’s Internet Democracy Initiative and held at the school’s London campus on May 10.

She took part in a fireside chat alongside Tom Wheeler, a former chairman of the Federal Communications Commission (FCC), the independent communications regulator in the United States. The conversation was moderated by David Lazer, Northeastern's distinguished professor of political science and computer and information science.

Dawes said access rights to the data that lies behind social media operations and other big online platforms needed to be “strengthened” in Britain to allow experts to use the information.

The former economist said Ofcom was "trying to build a more data-based way of measuring prevalence of types of harm for different types of users." The regulator has been charged by the U.K. government with implementing the Online Safety Act, which passed into law in October after years of delay and aims to better protect people online, with an emphasis on safeguarding children.

When announcing Ofcom’s plan last week for enforcing the new law, she said social media platforms like Facebook, Instagram and TikTok would have to “tame” their algorithms to filter out or downgrade harmful material to help protect children.

Dawes, speaking at the Northeastern conference, said it was “very important” that the watchdog could team up with researchers to understand the mechanics behind the platforms.

“I would like as many researchers as possible to be able to join us in the job of working out what is happening and shining a light on things that have been under the bonnet, behind closed doors,” she said.

The Ofcom boss said the U.K.’s Online Safety Act contained “some provisions for research access to data” but that they were not as strong as those enjoyed in Europe.

“There is a legal question here about researcher access to data, which I think is the real potential groundbreaking change here that would allow the broader research community to essentially play on the data and be able to shine a light on all of this,” Dawes said.

“In the U.S. there has been a real challenge from some of the tech platforms against researchers who have been trying to do this. So I think, at some level, having provisions in law for it to be done are going to be needed.”


Lazer asked Dawes and Wheeler about the anxieties that keep them “awake at night.”

Dawes said the impact of misinformation and disinformation in an election year, with a U.K. general election expected before the end of 2024, was a concern, adding that it is "becoming harder and harder for the public to navigate information."

The former senior government official said social media algorithms and the growth of news aggregators meant people were “more likely to get stuck in a rabbit hole of a narrow range of opinions and less likely to be able to spot fake news.”

Wheeler, who held a broadly equivalent position to Dawes as FCC chairman from 2013 to 2017, though his remit did not extend to online communications, said he feared Washington had failed to protect the American people against the dangers posed by advancing technologies.

“What worries me is, for the first time in American history, we have failed to step up to the implications of a new technology and develop expectations for the rights of individuals and the activities of the marketplace relative to that new technology,” he said.

He suggested regulators in the U.S. were now looking to Britain and the European Union to "answer those questions" around internet safety, with Brussels' Digital Services Act bearing similarities to the U.K.'s Online Safety Act.

“As a person who has spent his life in technology in the U.S., I think that is a disappointing development and a failure to lead on our part,” he added.

During a Q&A, the pair were asked by John Wihbey, an associate professor of media and technology at Northeastern and co-leader of the Internet Democracy Initiative, what they made of the move to attempt to block TikTok from operating under Chinese ownership in the U.S.

In April, President Joe Biden signed legislation to ban the social media platform in the U.S. if Beijing-based owner ByteDance does not sell its stake within a year. 

Wheeler said there was a “legitimate worry” around national security in TikTok’s case.

Dawes remained neutral with her response to the question, saying only that Ofcom viewed the so-called ban “with interest.”

The day-long Internet and Society conference was held at Northeastern University in London. Photo by Carmen Valino for Northeastern University

At a session following the discussion with the regulators, TikTok was represented by Ali Law, the platform’s director of public policy and government affairs in the U.K. and Ireland.

He stressed the importance TikTok places on keeping users safe online, telling the audience that three-quarters of content that violates the platform's policies does not get a single view and "only 2% gets more than 1,000 views," with moderation carried out by both humans and artificial intelligence.

In a closing panel session chaired by Wihbey, Law appeared alongside Beth Goldberg, head of research at Jigsaw, Google's unit exploring threats to open societies, and Peter Stern, director of content policy and stakeholder engagement at Meta, the owner of Facebook. The discussion touched on the impact of deepfakes, protecting under-18 users and creating guides to help social media users spot fake information during election campaigns.