I have a big backlog of readings to eventually write about on this topic, but I’m hastening to jump the line with this excellent new article by Clint Watts, “So What Did We Learn? Looking Back on Four Years of Russia’s Cyber-Enabled ‘Active Measures’” (Jan 2018), since it is one of the few based on recent empirical research.
Watts has been monitoring Russian “active measures” online for years. Here he identifies “four phases to their operations.” Over time, the four “layer one on top of each other” and blend together “as the Kremlin’s military, diplomacy, intelligence, and information arms pursue complementary tasks mutually supporting each other.”
In “Phase 0: Capability Development (Jan 2014 – Fall 2014)”, the Russians began experimenting with cyber weapons and techniques in Syria, Iraq, and Ukraine.
In “Phase 1: Infiltrate Audiences (Fall 2014 – Summer 2015)”, recognizing their successes in Phase 0, they began using troll armies to campaign in America for the “amplification of social issues of race, immigration, and anti-government conspiracies.”
In “Phase 2: Influence Audiences (Fall 2015 – Election Day 2016)”, they added hacking for the purpose of “seeking compromising information that could be used not strictly for its intelligence value but also as nuclear fuel for information warfare.” They also deployed a set of political narratives to undermine support for Clinton, enhance support for Trump, and drive wedges between pro-Hillary and pro-Bernie supporters. In addition, the Russians deployed another set of narratives that “shifted from strictly political narratives attacking or promoting candidates to attacking the integrity of elections and democracy itself.” Thus, according to Watts’s analysis, the Kremlin “sought not to win the election, but undermine American faith in institutions and processes”.
In “Phase 3: Leak Kompromat and Power Narratives (Fall 2015 – Election Day 2016)”, Russian operations focused on driving divisive narratives into the U.S., using outfits like Wikileaks and DC Leaks, along with the Sputnik News and Russia Today channels. As a result,
“Stolen information provided the nuclear fuel for the Kremlin’s information warfare arming click-bait websites, conspiracy theorists, political opportunists, and mainstream media discussions with corrosive, divisive narratives or timely distractions from more relevant political discussions. … Putin’s plan did not seek to change the vote, but to undermine it, and strike a lasting blow against democracy post-election regardless of the victor.”

Watts reports that U.S. intelligence and other agencies were very slow to catch on to all this. Partly because of an excess of hubris: they didn’t think Russia would attack our system this way. Partly because of a lack of imagination: they hadn’t grasped how influential hacked information and other cyber measures could be. And partly because our system is so disorganized that no one has a clear mandate to lead on cyber matters, and there is no unified plan. Watts says that what’s really required is a “task force approach”. But here we are, years after the Kremlin began its information attack, and still we have no strategy. “It is impossible to know who in the U.S. government is in charge of counter influence, when no one is certain what the plan is.”
Watts goes on to identify some telling signs that Russians are plotting an information attack, but I’m going to jump over those to end by quoting his concluding paragraphs:
“I estimate the decision point, from a strategic perspective, for Putin’s plan to mess with the 2016 presidential election came in the summer of 2015. Russia’s hacking decision demonstrated execution of a planned campaign, the pursuit of defined goals, and the level of Kremlin commitment to achieving its foreign policy goals vis-à-vis other actions. For instance, Russian hackers aggressively targeted the United States, France, and Germany suggesting their commitment to breaking up the European Union and weakening NATO by employing the same technique in sequence. In the future when Russia’s hackers launch a widespread campaign, analysts should ask not just what information the hackers were seeking to acquire but why the information of targets might be of use for influence. Secondly, hacking for influence must be done well in advance to allow sufficient time for triage, release, and subsequent influence of a targeted population. Looking back, hacking to influence on behalf of a candidate likely requires a year’s lead time; hacking to advance a general conspiracy, possibly only two to three months.
“The Kremlin’s playbook is in the wild, and authoritarians around the world have begun adopting their techniques in pursuit of domestic and foreign audience manipulation. The world, and particularly the West, must move past the presidential election of 2016, learn from its mistakes, and begin anticipating where the Kremlin will move next. No one has successfully countered Russia’s approach yet, and Putin has no reason to stop. In the absence of resistance, Russia will exploit success, not demonstrate self-restraint.”

I think this is quite worrisome. It leads me to reiterate that my two series of readings — the one on tribes and tribalism, and this one on cognitive warfare — are linked by a point straight from TIMN theory: The grand purpose of cognitive warfare at the societal level is to tribalize (even atomize) a society. And to do so by way of undermining people’s trust in and ties to their institutional and market systems, as well as emerging network systems, so they have nowhere to go but back to the tribal form.
To read Watts’s article and see his striking graphic, go here: