Prince Harry, the Duke of Sussex, revealed at the RE:WIRED conference today that he had emailed Twitter CEO Jack Dorsey prior to the Capitol riots on January 6th to warn him that “his platform was allowing a coup to be staged. That email was sent the day before and then it happened and I haven’t heard from him since.”
Twitter declined to comment. But the incident speaks to how seriously the Duke of Sussex takes misinformation and media manipulation. For him, it’s personal. “I learned from a very early age that the incentives of publishing are not necessarily aligned with the incentives of truth,” he said, especially because the United Kingdom press is prone to conflating profit with purpose. “They successfully turned fact-based news into opinion-based gossip with devastating consequences,” he added. “I know this story all too well. I lost my mother to this self manufactured rabidness, and obviously I’m determined not to lose the mother of my children to the same thing.”
Harry spoke as part of a panel on misinformation, moderated by WIRED’s Editor-at-Large Steven Levy and also featuring Renée DiResta, the Technical Research Manager at the Stanford Internet Observatory, and Rashad Robinson, Co-Chair of the Aspen Commission on Information Disorder and President of Color Of Change.
How did the internet’s early ideals of truth and democracy become so warped? And how do we straighten the whole thing out?
“Misinformation has always existed,” explained DiResta. “What’s different now is the way in which it spreads, the speed at which it spreads, and the way in which each individual person participates in moving information from their community into other communities.” This individualized spread of information has led to the creation of what DiResta calls “bespoke realities, places where people tend to congregate with those who are very like-minded.”
Such bespoke realities are especially vulnerable to “ampliganda,” a term DiResta coined to capture how social media has turned users into not only content creators, but content disseminators. In practice, this often results in amplifying the content that outrages us “because that’s the content being pushed in our feed.”
Speaking to the social justice implications of these actions, Robinson added, “The fact of the matter is that inequality, injustice, all these things are not unfortunate like a car accident. It is part of design.” According to Robinson, these platforms’ profiteering from hate and fear helped drive the false narratives around the Black Lives Matter protests of 2020 and the advancement of voter suppression techniques in the lead-up to last year’s election. He explained, “We have a set of self-regulated companies, and self-regulated companies are unregulated companies.”
Watch the RE:WIRED conference on WIRED.com.