August 6, 2020

Live from Black Hat: Hacking Public Opinion with Renée DiResta 

By Chris Kirsch

Psychological operations, or PsyOps, is a topic I’ve been interested in for a while. It’s a blend of social engineering and marketing, both passions of mine. That's why I found the keynote by Renée DiResta, Research Manager at the Stanford Internet Observatory, particularly interesting. 

The Internet Makes Spreading Information Cheap & Easy 

Disinformation and propaganda are old phenomena that can be traced back to the invention of the printing press – and arguably earlier. With the advent of the Internet, the cost of publishing dropped to zero. Certain platforms host content for free, but especially in the beginning, the blogosphere was very decentralized, and it was hard to get people to read your content. With the rise of social media, you can share your content and it can go viral. At the same time, content creation has become easier. All of this eliminates cost barriers and gatekeepers.

State Actors “Hack” Our Opinions 

As social media platforms matured, the algorithms that curate content became more and more sophisticated. They group people and deliver personalized content, which allows adversaries to analyze and game the algorithms.


State actors don’t just influence; they hack public opinion, using fake content producers and fake accounts. They can do this effectively because they understand the ecosystem extremely well, typically applying one of four tactics, sometimes in combination:

  • Distract: Taking attention away from news stories that are detrimental to the state actor
  • Persuade: Providing convincing content to sway a target’s opinion
  • Entrench: Getting individuals to identify with their peer groups and dig their heels in
  • Divide: Pitting groups against each other to sow discord 

Architecture of a Modern Information Operation  


Information operations often set up fake public personas, such as journalists, to create content. They then seed it to social media and amplify it through bot accounts to get organic shares among the population. The ultimate goal is to have mass media pick up the stories and amplify them even further.

Many of these campaigns use algorithmic manipulation. The Russian disinformation campaign around the 2016 election spent only $100,000 on advertising; the real lift came from creating compelling content that people shared organically.

From a defensive perspective, you can look at these operations as a kill chain. You should ask yourself: Which part of the chain can I disrupt to slow or stop the campaign? The last hop to mass media is particularly important.  

Telling a Positive Story About China  

China is a powerful player in information operations, but we’ll see in a moment that their operations have less impact than Russia’s. However, their network infiltration operations, which can be related to information operations, are already very advanced.

In a nutshell, the goal of China’s information operations is to “Tell China’s Story Well”. They are primarily concerned with persuasion, sometimes distraction. For example, during the COVID-19 crisis, China first controlled domestic perception, then put out English-language posts about the WHO praising the Chinese response. They pushed this out on Facebook to ensure they reached large global audiences. They flip back and forth between funny things that people retweet and more aggressive messages.

A Look Into Chinese Information Operations  

China has decades of experience in both covert and overt domestic information management. They're now taking these inward-facing capabilities and employing them outside of their borders.  

We can classify their content sources into three categories:  

  • Light: Official state news outlets
  • Grey: Content farms that are not easily attributable to the state and push out fake political stories 
  • Dark: Purely online properties that spread disinformation 

Even though Facebook is banned in China, Chinese state content platforms have more than 220 million followers on it. They have also expanded to troll accounts and covert strategies, which have been taken down from Facebook and Twitter on some occasions.

As Western media began to cover the Hong Kong protests, Chinese troll accounts surfaced, pretending to be Hong Kong citizens and telling journalists that they had gotten the story completely wrong. However, China lost its Hong Kong bots early in the protests because the accounts were shut down. Research showed that most accounts were not created pre-emptively but as a reaction to the crisis.


China is Struggling to Have Real Impact, But They’ll Get Better  

The surprising thing was that 92 percent of the accounts had fewer than 10 followers. Most tweets didn’t even get a single “like,” and the maximum engagement on any tweet was 3,700. In other words, China did a very poor job of getting real people to pick up their content.

While China is good at creating content, they are sloppy at their social media game. But China is well resourced, and they’re committed to improving. At the same time, we shouldn’t overemphasize the impact of their efforts so far.

Russia’s Game: Entrench and Divide  

By comparison, Russia is best in class when it comes to information operations. They excel at creating agents of influence and manipulating media. They use network infiltration as one of their tactics, both to hack public influencers and to leak data to the media.

Russia has the same set of overt and covert media, ranging from light to dark, but it spends a fraction of China’s budget. One example of a covert content source is BlackMattersUS, which is officially operated by an American activist but is actually run by a Russian contractor in St. Petersburg. 

Russia’s media outlets have fewer Facebook followers, in the range of 39 million, but they get a lot more engagement. Russia is much better at segmenting their audience and creating custom content that plays into their narratives, entrenching and dividing their audiences. They are also better at picking the right types of media for the audience and social network, e.g., videos for young millennials.

Russian Memes vs. Chinese Narratives  

While China focuses mainly on creating a certain narrative, Russia focuses much more on memes that convey feelings or a point of view. Much of this content is generated by the Internet Research Agency (IRA), a Russian content farm that is not officially associated with the government, to create plausible deniability. They focus on social content first, which lends itself to certain types of media.


Memes play on how people feel. They are identity-focused and entrench people in their groups. Content is created to reinforce their beliefs. By sharing the content, individuals signal membership in their group. Interestingly, the IRA does this on both the political left and the right, splitting the country in two.

Creating Agents of Influence  

Russia doesn’t stop with online engagement and shaping opinions. It wants to create agents of influence who go out on the street and conduct activism. When you follow an Internet Research Agency page or like a piece of content, you give the IRA a signal that you’re sympathetic to a particular point of view.

What DiResta has observed are attempts to recruit these people through constant outreach, more than you’d typically see from a media outlet. They offer financial resources and logistical support to turn people into agents of influence, mobilizing them and getting them out into the streets as activists. This happens behind the scenes, in direct messages, and is not visible if you’re simply looking at the memes on social media.

Throwing Hacked Data into the Mix 

Russia goes one step further, engaging GRU hacking operations in its information campaigns. APT28, also known as Fancy Bear, began creating fake Facebook pages years ago when the GRU was experimenting with these tactics.  

[Diagram: how hacked data feeds into GRU information operations]

The green circles represent fake public personas, often journalists, that put out geopolitical content on their own fake media sites. They share the content with Western and regional blogs to gain wider distribution. However, the GRU did not have a lot of success with this tactic.  

They have since modified their tactics. Public officials or agencies are hacked, then the material is offered to journalists through fake personas, such as Guccifer 2.0. The Internet Research Agency then creates memes based on the content to amplify it on social media. Finally, RT and Sputnik, Russia’s state news outlets, talk about the substance of the hack while denying the state’s involvement.

While China is focused on telling a positive story about their country, Russia is more interested in exploiting divisions in our society and vulnerabilities in our information ecosystem.

Russia Will Use the Same Methods in the 2020 Elections 

We should expect Russia to employ similar tactics in the 2020 U.S. presidential elections:  

  • Hack-and-leak operations 
  • Hacking voting machines 
  • Infiltrating groups 
  • Amplifying narratives  

Even if Russia doesn't hack the voting machines, just claiming they've been successful will cause mistrust in the elections. And that is their goal: undermining confidence in our political system.  

The Effects Will Outlast Active Operations  

You can’t hack a social system if the system is resistant to the attack, but our country is divided and very vulnerable. DiResta found an activist’s page on Facebook that contained 40 percent IRA content. However, the person behind the page was real, not a bot. They were sharing the content because the IRA had created messages that resonated extremely well.   

People internalize opinions based on repetition. False stories are memorized by real people and spread long after active operations have ceased. We’re all more instrumented than ever before.  

The challenge for scientific research: we can easily quantify likes and retweets and see how people are reacting, but it’s hard to tell whether a campaign changed hearts and minds.

What Does This Mean for Corporate Information Security?  

If you’re a CISO at a company with international competitors, you’re just as much at risk. Companies with geopolitical exposure, such as oil-fracking operations and agricultural firms like Monsanto, have already been targets. Companies taking a stance on social issues have seen content against them amplified on social media.

However, most companies don’t have a position on the org chart responsible for dealing with adversarial information operations. As a CISO, you probably need to start thinking about how you would respond. But the question isn’t purely technical, nor is it just a social media analysis problem. You need to conduct red-teaming exercises that involve people from both technical teams and corporate communications.

 

If you found this post interesting and would like an overview of additional Black Hat sessions, visit the Veracode blog.

 


By Chris Kirsch

Chris Kirsch works on the products team at Veracode and has 20 years of experience in security, particularly in the areas of application security testing, security assessments, incident response, and cryptography. Previously, he managed Metasploit and incident response solutions at Rapid7 and held similar positions at Thales e-Security and PGP Corporation. He is the winner of the Social Engineering CTF Black Badge competition at DEF CON 25.