The third 360/OS, organized by the Atlantic Council's DFRLab, brought together journalists, activists, innovators, and leaders from six continents, united in the fight for objective truth as a foundation of democracy.
The Digital Forensic Center's team joined the Digital Sherlocks in London at the end of June for two days of open-source research and of combating fake news and the other phenomena that, unfortunately, shape the world we live in today.
If you thought that the recently concluded European elections would be a perfect opportunity for other countries to interfere through disinformation campaigns, you would not be alone. Many experts and observers around the world saw it coming, given numerous statements by officials. That belief stemmed from the fact that many recent elections around the world (in the US, Brazil, India, Colombia, Sweden...) were accompanied by some form of information campaign originating in the East, which to a greater or lesser extent influenced the final vote.
"The European elections in May were a very tempting target for somebody who wanted to interfere in our democratic processes," said Sir Julian King, European Commissioner for the Security Union, on the first panel of the first working day at 360/OS. But thanks to increased measures to protect its citizens from disinformation, he added, the EU did not see any kind of spectacular attack.
The EU brought all the member states together to work on election security and set up a rapid alert system, which allowed experts from both civil society and government administrations to watch for attempts at organized disinformation and share that information with others.
Social networks must take action against the spread of disinformation
We also learned from his speech that the EU sat down with representatives of the big social media platforms and together they agreed on a new code of practice on tackling disinformation on social media. The newly created code incorporates the identification and deletion of false and misleading information, as well as the empowerment of consumers and the research community to identify instances of disinformation.
Time will tell whether the new instrument delivers results, but one thing is certain: social networks, especially Facebook, must act, as they are under growing criticism for facilitating the spread of disinformation.
People at Facebook are well aware of that. Nathaniel Gleicher, Head of Cybersecurity Policy at Facebook, gave us a better understanding of how the platform perceives and addresses information operations (disinformation is not the term in use at Facebook). He also presented the audience with plans to develop software that will take down fake accounts and deceptive pages even faster.
The panel entitled Open Source: Witness to a Crime featured Eliot Higgins, founder of Bellingcat, an investigative journalism network that specializes in fact-checking. Through its investigations (MH17, the Skripal poisonings, bombings in Syria, Yemen, and Iraq), the network laid the foundations of open-source research. Together with GLAN (Global Legal Action Network), Bellingcat launched a project that aims to increase trust in open-source evidence.
The panel Deepfakes: It's Not What it Looks Like! tackled the phenomenon of fabricated videos. Sam Gregory, Program Director at WITNESS, spoke about the evolution of deepfake technology and showcased, step by step, the process of creating a deepfake video. Given the scope of the disinformation and fake-news threats that actively seek to destabilize democratic societies, all participants agreed that a constant fight against this modern calamity is necessary.
Key points & recommendations
- All countries threatened by disinformation campaigns and digital manipulation need to work together to counter the issue and to strengthen their democratic institutions, demonstrating unity, prosperity, and stability.
- Information warfare requires communication among countries in order to raise critical awareness.
- Working with platforms such as Google, Facebook, Twitter, and Mozilla is necessary to increase transparency, to identify and delete fake and deceptive accounts and posts, and to empower users and the research community to identify instances of disinformation.
- Propaganda is on the rise in authoritarian regimes: it is not about blocking people's access to digital services but about nudging their ideals and beliefs through automation, data filtering, and data surveillance.
- The power rivalries that we are seeing playing out in cyberspace have implications for our democratic systems.
- Technology is a tool and authoritarian leaders have learned how to use that tool.
- Information operations are defined as any coordinated effort to manipulate or corrupt public debate to achieve a strategic goal.
- Bad actors don’t need to use super sophisticated techniques to target and manipulate the public.
- Big news organizations should look at the bigger picture, understanding who the actors are and how disinformation spreads, rather than merely stating what is true or false. They also need to educate their audiences in healthy skepticism.
- Online authoritarians follow three trends: intimidation, quelling dissent, and online harassment to create fear.
- Strength lies in numbers, digital literacy, healthy skepticism, knowing how to stay safe on the Internet, civic engagement, and digital resilience.
- Fighting disinformation should be a civic duty, just like voting.