NATO’s Allied Command Transformation Innovation Hub has enlisted a team of Johns Hopkins engineering students in the fight against misinformation.
The Whiting School team members—all students in Johns Hopkins University’s Clark Scholars Program—are tasked with defining what has come to be known as cognitive warfare as well as designing a system to identify, measure, and track attacks that confuse, misinform, and manipulate the behavior of targeted audiences.
“Today’s battles are being fought with ideas,” said team member William Rong, a third-year materials science and engineering student. “I think we all want an informed and less polarized society, so this project is really important.”
Clark Scholars work on innovative projects to gain practical engineering experience and are challenged to develop solutions to real-world needs. Established in 2016 with a $15 million investment in the Whiting School of Engineering from the A. James & Alice B. Clark Foundation, the program is designed not only to attract talented engineering students to Johns Hopkins, but also to prepare them for leadership roles. The team members are working with NATO through the Whiting School of Engineering’s Center for Leadership Education, with advisers Alexander Cocron, a CLE lecturer, and Lawrence Aronhime, associate teaching professor and director of CLE’s international programs.
Since last year, Rong and his team have been working remotely on the project with peers from Imperial College London and graduate students from the Czech Technical University in Prague. Their first goal was to delineate cognitive warfare's parameters and describe what it involves. In the end, the group defined it as a combined-arms approach that integrates the capabilities of several existing warfare techniques, including cyber and information operations, psychological operations, and social engineering, to achieve a desired outcome without any physical fighting.
“The term ‘cognitive warfare’ barely appeared on the internet when we began our study,” Cocron says. “The students were the first to sit down and really think about how to define it and what it looks like.” Cocron adds that an article the student team wrote for NATO Review on the subject has been “quite influential.”
Next, the students began to tackle the challenge of creating a system, or tool, to identify and track cognitive warfare campaigns. According to Cocron, such a system would need to recognize patterns (also called “signatures”) in the actions of those waging cognitive warfare, and then have the ability to trace the origins of that disruptive online content. With these two tasks in mind, the group is designing a live map inspired by the very successful COVID-19 dashboard, created by a team at the Whiting School of Engineering’s Center for Systems Science and Engineering and used by scientists, government, and public health officials around the world to track the novel coronavirus pandemic.
“The COVID map was the main inspiration for our idea to create our Cognitive Warfare Dashboard to show users what information is being spread, where it’s coming from, and why,” Rong said.
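As a rough illustration only (not the team's actual design, which has not been published in detail), one conceivable building block for the dashboard's "signature" recognition is flagging identical content shared by many distinct accounts within a short time window, a common marker of coordinated campaigns. The `Post` record, field names, and thresholds below are all hypothetical:

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Post:
    account: str         # hypothetical account identifier
    content_hash: str    # hash of the shared content
    timestamp: float     # seconds since epoch

def find_coordinated_signatures(posts, window=3600.0, min_accounts=3):
    """Group posts by identical content and flag clusters where at least
    `min_accounts` distinct accounts shared the same content within
    `window` seconds -- one simple signature of a coordinated campaign."""
    by_content = defaultdict(list)
    for p in posts:
        by_content[p.content_hash].append(p)

    flagged = {}
    for content_hash, group in by_content.items():
        group.sort(key=lambda p: p.timestamp)
        accounts = {p.account for p in group}
        time_span = group[-1].timestamp - group[0].timestamp
        if len(accounts) >= min_accounts and time_span <= window:
            flagged[content_hash] = sorted(accounts)
    return flagged
```

A production system would of course need far richer signals (near-duplicate text matching, account-creation metadata, network analysis of who amplifies whom), but the sketch shows the basic pattern-over-time idea the article describes.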
The group says it’s important to note that cognitive warfare does not rely upon “fake news,” which is relatively easy to disprove with a few clicks of a mouse. Acts of cognitive warfare are far more challenging to detect, as they typically weaponize factual information from photos, news articles, and real emails to influence their chosen audiences’ opinions and thoughts.
Rong and Cocron cite the hacking of the Democratic National Committee computer networks in 2016 as a good example of how those waging cognitive warfare combine nefarious strategy with factual information to achieve their goals. According to the 2018 United States District Court’s timeline of events, hackers started trying to access DNC accounts and files as early as the spring of 2016.
“They collected a ton of information that they then sat on and waited to leak,” says Cocron.
Cocron says the hackers’ timing, and the fact that the slowly released information consisted of real emails and statements by members of the DNC, is what qualifies this as an apparent act of cognitive warfare. The hackers’ goal was apparently to sow discord in the 2016 Democratic presidential primary and possibly to harm the Democratic presidential candidate Hillary Clinton.
“It sowed distrust among people all over the political spectrum,” said Rong. “It was engineered in a way that indirectly caused chaos and destabilized the political process.”
Rong and his team believe that, when complete, their dashboard will provide authorities with information that can help them respond quickly and effectively before the misinformation takes root in social media and in people’s minds.
“The dashboard should contribute to threat prevention and response,” said Serge Da Deppo, founder and manager of the NATO Innovation Hub. Da Deppo credits the student team members with bringing creativity, energy, and original insight to the challenge of developing this tool.
“Working with students is especially beneficial,” he said. “Out-of-the-box insights and effective solutions are guaranteed because they are not only non-NATO in affiliation but from a younger generation than our personnel.”
Rong credits Alexander Rovalino, a fellow Clark Scholar and third-year biomedical engineering student, with the original idea for the dashboard, and graduate students from the Czech Technical University for the programming-heavy parts of the design process.
“We all have a similar way of thinking despite being specialists in different fields, and it’s made us all open to new ideas. We all brought our skills to bear to tackle a new and very real problem for the world,” said Rong.