USE CASE #2
Use of AI tools to support faster discovery of potential foreign influence operations

Use case manager -> Debunk
Current problem and needs
Over the last three years, most threat actors have been shifting to audio-visual content, which is far more complicated to analyse. Video content has a much higher impact on society than text alone. In 2022, the Debunk team conducted an analysis for NATO StratCom, examining 350 hours of the most prominent Kremlin-produced TV shows. The number of influence campaigns on TikTok, YouTube Shorts, and Instagram has increased dramatically.
Use case objectives and expected benefits
Debunk analysts monitor and analyse potential information influence cases from Kremlin actors on a daily basis. Since the invasion of Ukraine, the number of information attacks, cyber-attacks, and instances of coordinated inauthentic behaviour (CIB) has increased sharply. Covert influence operations have adopted a brute-force, “smash-and-grab” approach of high-volume but very low-quality campaigns across the internet.
Therefore, consistent assessment of the information environment, mapping out hostile actors, and exposing attempts of algorithmic manipulation is crucial – and not only at the time of war or significant events.
To detect such CIB cases, analysts aim to apply AI-CODE content-driven data processing tools to spot anomalies.
The expected benefit is that these automated source-analysis tools would help Debunk analysts detect new harmful sources much faster. In addition, by automatically analysing thousands of videos from social media, the tools would greatly improve analysis quality and productivity.
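As an illustration only (not a description of any AI-CODE component), one simple heuristic for spotting coordinated inauthentic behaviour is to flag near-identical messages posted by several distinct accounts within a short time window. The function name, thresholds, and post structure below are assumptions chosen for the sketch:

```python
from collections import defaultdict

def flag_coordinated_posts(posts, min_accounts=3, window_s=300):
    """Illustrative CIB heuristic: flag a message text when at least
    `min_accounts` distinct accounts post it within `window_s` seconds.
    `posts` is a list of dicts with keys: account, ts (seconds), text."""
    by_text = defaultdict(list)
    for p in posts:
        # Normalise case and whitespace so trivial variations cluster together
        key = " ".join(p["text"].lower().split())
        by_text[key].append(p)
    flagged = []
    for key, group in by_text.items():
        group.sort(key=lambda p: p["ts"])
        accounts = {p["account"] for p in group}
        span = group[-1]["ts"] - group[0]["ts"]
        if len(accounts) >= min_accounts and span <= window_s:
            flagged.append(key)
    return flagged
```

Real tooling would of course go far beyond exact-text matching (e.g. semantic similarity, network features, audio-visual analysis), but even this toy version shows the anomaly-spotting idea: coordination leaves statistical traces that automated tools can surface at a scale no analyst can match manually.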
AI-CODE Services that will be tested and validated:
Other use cases

Use case #1
AI tooling for trusted content
