On the British hearings in Washington about Social Media's role in spreading political misinformation
It didn’t get
much mainstream press in North America, but last week 11 British MPs held
hearings in Washington investigating US technology companies and their
responsibility for monitoring and/or preventing malicious "news" content on
their platforms. This hearing was the first ever select committee session to be live-streamed
from abroad. Why the committee chose to hold
its hearings in the US is unclear, but it may be a sign both of how seriously
the UK is taking this issue and of how important it is that the inquiry be
highly visible on both sides of the Atlantic.
Regardless of the unique format, on 8 February 2018, as part of an inquiry into “fake news” by the Select Committee on Digital,
Culture, Media and Sport, the British MPs questioned representatives from Google, YouTube, Facebook
and Twitter for about four hours. The British MPs’ approach, and their refusal to accept the technology companies’ shrugging off of any responsibility for the paid content on their platforms, stand in stark contrast to the largely hands-off approach taken by American authorities.
The UK committee’s fake news inquiry actually
commenced in January 2017, only to be suspended during the general election
last year. The inquiry was a response to reports that propaganda and fake news had played an unprecedented role in both the 2016 Brexit campaign and the US presidential election, and to the resulting concern about their impact on democratic processes in the future. The Committee
Chair, MP Damian Collins, argued in 2017 that “[t]he growing phenomenon of fake news is a threat to democracy” and, moreover, that the major tech companies bore some responsibility to address the issue. Reading the transcript of the hearing
indicates both the technical sophistication of the British panel and their
unwillingness to allow the tech companies to get away with facile answers.
Generally, the issues raised by British
lawmakers are the same as those raised in the US Congressional committee hearings with the same tech companies
last year. The American hearings focused on the lack of transparency in political advertising on social media sites. The British committee was particularly concerned not only about the lack of transparency for consumers of news, but also about the tech companies’ unwillingness to share with governments the information they have about the origin of problematic content, and indeed, their unwillingness to acknowledge there is a problem at all.
Whose Responsibility?
Indeed, throughout the hearings the MPs
made clear that they did not think the tech companies had been taking these
issues seriously enough. When Facebook’s
representatives said they had not seen anything to suggest there had been
foreign interference in the Brexit referendum, Chairman Collins exclaimed, “But
you haven’t looked, have you? … You haven’t looked!” In response, Simon Milner, Facebook’s Policy
Director, UK, Middle East and Africa, said that unlike in the US election,
Facebook had not been given any intelligence reports to suggest that there was
any such interference. Collins
expressed cool disbelief:
You
were insinuating that there was a lack of intelligence in the U.K. that had
existed in America, and that the absence of intelligence support from the U.K.
meant that that work had not already been done. But in America, it was not
intelligence reports from the Government that led to that work being done; it
was pressure from Congress, which led to what they would see, I think, as the
company doing the bare minimum.
Milner’s suggestion that Facebook
required some external cause for concern before taking on the responsibility of
investigating was at odds with the position the companies had taken before the
US congressional committees, where they had emphasized that malicious content
comprised a very small proportion of overall content on their services, and that
they were doing everything they could to stamp it out.
The companies’ own lack of transparency was raised as an obstacle to determining from the outside whether there had been foreign interference. As
Ian Lucas, MP, put it, “You have everything. You have all the information. We
have none of it, because you will not show it to us.” The MPs were skeptical of the response from
the companies that their business incentives aligned with lawmakers’ public
policy concerns. In response to YouTube’s assertion that it dedicated
significant resources to managing content on its platform, Collins called the
company’s allocation of $10 million a year “a small sticking plaster over a
gaping wound.”
We
have heard the expression “top priority” a lot. If we judge the company based
on what it does rather than what it says, the top priority is maximising
advertising revenue from the platform, and a very small proportion of that is
reinvested into dealing with some of the more harmful content. That is one of
the reasons why we are here and why social concern about this is growing.
More specifically, during the British hearing the MPs repeatedly expressed frustration that the tech companies seemed to have had no scruples about allowing foreign-paid advertising on their platforms during recent elections. Milner suggested that responsibility for this issue lay not with the social media platform but with the British Electoral Commission or the persons who had purchased the illegal advertisements. Collins expressed
incredulity at this answer:
Chair: It is extraordinary:
if Facebook were a bank, and somebody was laundering money through it, the
response to that would not be, “Well, that is a matter for the person who is
laundering the money and for the authorities to stop them doing it. It is
nothing to do with us. We are just a mere platform through which the laundering
took place.” That bank would be closed down and people would face prosecution.
What you are describing here is the same
attitude—it is up to the Electoral Commission to identify the person. Even
though you know when money is being paid or linked to accounts outside a
country, you do not detect it. We hear a lot about the systems, but they are
not picking that up at all. Many people would find that astonishing.
In contrast to the British suggestion that prosecution of the social media giants might be the answer to the problem of their turning a blind eye to practices that contravene electoral laws, American Senators
Mark Warner and Amy Klobuchar have merely introduced a proposal for the Honest Ads Act, which would mandate much greater
transparency and disclosure for online political advertising.
To Regulate or Not to Regulate?
The UK is clearly moving in the direction of tighter regulation rather than merely rules for transparency. During the hearing, Conservative MP Julian Knight referred to the recent German law, which creates liability for large fines if companies fail to take down “manifestly unlawful content” (which includes hate speech, pornography and potentially fake news) sufficiently quickly. Suggesting that this law had caused a decline
in the prominence of such content within Germany, Knight said to the companies’
representatives: “Surely this is strong evidence that the way in which Western
democracies protect themselves is to regulate you.”
Perhaps the issue that indicates the starkest
difference between the way the British and the Americans are tackling the
problem is the definition of “fake news”.
American lawmakers worried during their hearings about stifling political debate and about the difficulty of finding any objective way of determining content’s accuracy. The Congressional hearings indicated that American lawmakers were more accepting of the tech companies’ claim that they should not be asked to become “arbiters of truth.” The British Committee, in contrast, sees these companies as simply abdicating any responsibility, particularly as social media platforms are not neutral transmitters of content. As the MPs rightly shot back at the Facebook
representatives:
Jo Stevens: Ms Bickert, my
colleague just said that you are in fact the largest disseminator—some would
argue publisher—of news in the world. You make de facto publishing decisions
every day. We have heard you describe that. You design the algorithms and the
algorithms then decide what we read and see on Facebook every day. Those
algorithms are systematic. They are not objective and they inherit the biases
of the people who are developing them.
…. Your algorithms enable hyper-personalised
content that is really finely grained to be directed and targeted towards
specific individuals. Your advertising helps to do that, but most individuals
who use Facebook do not even realise that you are doing that. They do not
realise that what comes up on their Facebook—
Monika Bickert: The newsfeed.
Jo Stevens: Yes. What comes up there is what you are targeting towards them. So there is a
huge power imbalance there, because you are controlling it and the person who
is receiving it has no control over it. That kind of reminds me—if you will
forgive the analogy—of an abusive relationship where coercive control is going
on. Somebody is deciding what you see, hear and read, what you have access to.
Can you see the parallels with that? Can you see why I would be concerned about
that?
Similarly, Rebecca Pow, MP, said she
was “staggered” by the companies saying they did not have rules on truth:
To
me, that gets to the nub of what we are all talking about today. As a platform,
you are openly able to spread disinformation … What worries me is what this is
doing to our children. Shouldn’t you take some responsibility for it? If you
cannot and are not able to and your policing system is not up to it, surely
some sort of regulation or body will have to be put in place to ensure that the
next generation is safe.
Giles Watling, MP, followed up simply:
“You have enormous power, and with enormous power comes great responsibility.
You seem to want to duck that.”
Politics and Truth
Despite the robust attack on the irresponsibility
of the social media giants, there is an element of hypocrisy in the British Committee’s
approach. During the Brexit campaign there was considerable controversy over the clearly false claims made by the major campaigns. Indeed, the Tories who led the Brexit campaign and who now lead the British Committee were the worst offenders in promoting fraudulent and misleading claims. The tech companies during the hearing also
seized on the fact that under British electoral law, political advertisements
are exempt from the rules that govern truth in advertising. As Nick Pickles of Twitter noted:
During
an election campaign in the UK, political advertisements are exempt from the
advertising rules, so that would be taking regulation of UK political
advertising and giving it to American technology companies. In terms of the
democratic process, that seems to me quite a robust step to take.
The other tech giants all suggested that a cooperative rather than a regulatory approach was more likely to be productive and to avoid unintended consequences. Determining who
and what is legitimate in the political sphere is inherently fraught. Nonetheless, other statements by the British
government suggest that regulation is the direction in which the British are heading. Prime Minister Theresa May devoted a large portion of her speech at the World Economic Forum in Davos at the end of January to issues surrounding social media. She forecast “new rules and
legislation” to deal with the loss of trust in social media companies, and
reiterated her goal of making the U.K. “the safest place to be online.” Alongside
May’s statement is the recent Internet Safety Strategy Green Paper.
Both pronouncements are long on rhetoric but short on detail. It does seem apparent, however, that the British, unlike the Americans, are moving towards regulating social media content in an effort to diminish the reach of “fake” and “malicious” news. There will undoubtedly be issues regarding how any such regulation deals with the legitimate fears of civil rights advocates about political censorship. We will have to wait to see just what is proposed and how successful it will be.