Distributed Defense Against Disinformation
Disinformation Risk Management
and Cognitive Security Operations Centers
SJ Terp to CITRIS, Feb 10 2021
DISINFORMATION. DIS. NOT MIS.
“Deliberate promotion… of false,
misleading, or mis-attributed information.
Focus on the online creation, propagation,
and consumption of disinformation.
We are especially interested in
disinformation designed to change beliefs
or emotions in a large number of people.”
A THOUSAND-POINT PROBLEM NEEDS A THOUSAND-POINT SOLUTION
Collaborative, Heterogeneous, Connected
Response: Actors
• Platform
• Law Enforcement
• Government
• Elves
• Public
• Influencer
• Media
• Nonprofit
• Educator
• Corporation
COMMUNITY: COGSECCOLLAB
10 years ago: Disaster Data. Today: Disinformation.
• MisinfoSecWG
• MisinfoSec
• Covid19Activation
• Covid19Disinformation
• CTI League Disinformation
• CogSecCollab
• Threet
OUR PRACTICAL EXPERIENCE INCLUDES
COGSECCOLLAB DEPLOYMENTS
CTI LEAGUE’S DISINFORMATION TEAM: the first global volunteer
emergency-response community, defending against and neutralizing
cybersecurity threats and vulnerabilities to the life-saving sectors
during the COVID-19 pandemic.
• Active Cyber-attack Neutralization
• Cyber-attack Prevention
• Cyber-supporting the Life-saving and Health-related Sectors
• Monitoring Cyberspace for Potential Danger to Public Health and
Safety.
SHARING TECH DEVELOPMENT: with NATO, EU, Canada, UN,
Cognitive Intelligence Center, DROG (BadNewsGame/
HarmonySquare creators), etc.
OUR STARTING POINT: THREE LAYERS OF SECURITY
• Physical security
• Cyber security
• Cognitive security
ORGANISING TEAMS
Disinformation Security
Operations Centers
DISINFORMATION CYBER THREAT INTELLIGENCE
• Inform: Summarise and share information about ongoing incidents
• Neutralise: Disinformation incident response: triage, takedown, escalation.
• Prevent: Collate disinformation indicators of compromise (IoCs) and vulnerabilities; supply to
organisations.
• Support: Assess the possibility of direct attack, and ways to be ready for that.
• Clearinghouse: Collate and share incident data, including with organizations focusing on
response and countercampaigns.
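The functions above (especially Inform and Clearinghouse) imply a shareable incident record. A minimal sketch in Python; the class and field names are illustrative assumptions, not a standard schema:

```python
from dataclasses import dataclass, field

# Hypothetical incident record; field names are illustrative, not a standard schema.
@dataclass
class DisinfoIncident:
    name: str
    narratives: list = field(default_factory=list)   # narratives observed
    iocs: list = field(default_factory=list)         # indicators of compromise (urls, accounts...)
    status: str = "triage"                           # e.g. triage -> takedown -> escalation

    def summary(self) -> str:
        """'Inform' function: a shareable one-line summary of the incident."""
        return (f"{self.name} [{self.status}]: "
                f"{len(self.narratives)} narratives, {len(self.iocs)} IoCs")

# Example (made-up) incident for illustration.
incident = DisinfoIncident("covid-5g", ["5G causes covid"], ["hxxp://example[.]com"])
print(incident.summary())
```

A record like this can be summarised for the Inform function, or collated with others for the Prevent and Clearinghouse functions.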
DISINFORMATION SECURITY OPERATIONS CENTER
• Risk Mitigation: secure system, check compliance
• simulations, red teaming, penetration testing,
• compliance analysis,
• team exercises
• Enablement: foundation work
• data engineering,
• Information frameworks,
• training
• Operations: day-to-day response
• incident response: discovering, investigating, responding to security threats
• Threat Intelligence (TI), research, investigations
DISINFORMATION SOC: ORGANISATION BOUNDARIES
[Diagram: which parts of the landscape the COG SOC and the Infosec SOC are responsible for, monitor, and act on: internet domains, social media platforms, the organization’s platforms, business units, and communities, plus lawmakers and media.]
Disinformation SOC: Configurations
[Diagram: three configurations for cognitive security teams: a standalone Cognitive ISAO linked through an ISAC/ISAO to many organizations; a COG SOC inside an organization alongside the Infosec SOC, Comms, Legal, Trust & Safety, and Platform teams; and a smaller COG Desk embedded within those same functions.]
SOC COMPONENTS
• People
• Enough people to make a difference, in time
• Enough connections / levers to make a difference
• Culture
• Safety processes: mental health and opsec
• Process
• Understand disinformation, understand threat response
• Fast, lightweight processes
• Technology
• Speed: supporting analysis, storage, etc.
• Sharing: get data to responders in ways they understand (whatever works)
SOC INTERNAL ORGANISATION: TIERS
Tier1 Triage
• Scanning systems
• Triaging alerts
• Gathering data
• Starting tickets
Tier2 Incident
Response
• Analysis
• Remediation
• Tactical response
Tier3 SMEs
• Threat hunting
• Deep analysis
• Strategic response
Tier4 Management
• Business connections
• Plans, audits, organization
[Diagram: inputs to the tiers include platform alerts, social media, external alerts, and business units; outputs flow as tickets, responses, reports, and a crisis plan, shared with partners & responders.]
Disinformation Knowledge
• Artifacts, narratives, actors,
segments etc
Specialist Knowledge
• Politics, industry, marketing etc
COGNITIVE SECURITY
ENABLEMENT:
FRAMEWORKS
DISINFORMATION LAYER MODEL: PYRAMID
Campaigns
Incidents
Narratives
Artifacts
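The four pyramid layers nest: campaigns contain incidents, incidents carry narratives, and narratives are expressed in artifacts. A minimal sketch of that containment, with class and field names as illustrative assumptions:

```python
from dataclasses import dataclass, field

# Illustrative nesting of the four pyramid layers; names are assumptions.
@dataclass
class Narrative:
    text: str
    artifacts: list = field(default_factory=list)  # messages, images, urls, accounts...

@dataclass
class Incident:
    name: str
    narratives: list = field(default_factory=list)

@dataclass
class Campaign:
    name: str
    incidents: list = field(default_factory=list)

# Made-up example data.
campaign = Campaign("example-campaign", [
    Incident("example-incident", [
        Narrative("example narrative", ["https://example.com/post/1"]),
    ]),
])

# Walk down the pyramid: campaign -> incident -> narrative -> artifact.
artifact = campaign.incidents[0].narratives[0].artifacts[0]
print(artifact)
```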
DISINFORMATION OBJECT MODELS: AMITT STIX
• Campaign
• Incident
• Narrative
• Artifact
DISINFORMATION OBJECT MODELS: ACTOR, BEHAVIOUR, CONTENT AND NARRATIVES IN AMITT STIX
• Actor
• Behaviour
• Content
• Narrative
Disinformation TTPs:
AMITT Framework
Response: Mitigations and Countermeasures
● Detect: find them
● Deny: stop them getting in
● Disrupt: interrupt them
● Degrade: slow them down
● Deceive: divert them
● Destroy: damage them
● Deter: discourage them
https://github.com/cogsec-collaborative/amitt_counters
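The seven response classes above can act as tags for planned counter-actions. A small sketch; the example action is invented, and the real catalogue of counters lives in the amitt_counters repository:

```python
# The seven response classes from the slide; descriptions copied from it.
COUNTER_CLASSES = {
    "Detect":  "find them",
    "Deny":    "stop them getting in",
    "Disrupt": "interrupt them",
    "Degrade": "slow them down",
    "Deceive": "divert them",
    "Destroy": "damage them",
    "Deter":   "discourage them",
}

def classify_counter(action: str, counter_class: str) -> str:
    """Tag a planned counter-action with its response class."""
    if counter_class not in COUNTER_CLASSES:
        raise ValueError(f"unknown class: {counter_class}")
    return f"[{counter_class}] {action}"

# Hypothetical counter-action for illustration.
print(classify_counter("rate-limit amplification accounts", "Degrade"))
```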
Defence:
AMITT
Counter
TTPs
EXAMPLE: COUNTER-NARRATIVES INTO DISINFO SPACES
DISINFORMATION
INCIDENT RESPONSE
Threat Intelligence with brains and feels
The Disinformation “Three Vs”
• Variety: platforms, structure
• Volume: petabytes
• Velocity: real-time streaming, near-real-time
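Velocity in particular pushes tooling toward streaming, windowed measures. A minimal sketch of a near-real-time check: count sightings of an artefact in a sliding time window (the window size and data are assumptions):

```python
from collections import deque

# Minimal near-real-time velocity check: how many sightings of an artefact
# arrived in the last `window` seconds. Window size and data are assumptions.
class VelocityMonitor:
    def __init__(self, window: float = 60.0):
        self.window = window
        self.timestamps = deque()

    def record(self, t: float) -> int:
        """Record a sighting at time t (seconds); return the windowed count."""
        self.timestamps.append(t)
        # Drop sightings that have fallen out of the window.
        while self.timestamps and self.timestamps[0] <= t - self.window:
            self.timestamps.popleft()
        return len(self.timestamps)

mon = VelocityMonitor(window=60.0)
counts = [mon.record(t) for t in [0, 10, 20, 65, 70]]
print(counts)  # [1, 2, 3, 3, 3]
```

A spike in the windowed count for one artefact is a cheap first signal that it may be getting amplified.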
AMITT AUTOMATION
https://cogsec-collab.org/
[Diagram: attack and defense each move through the same four phases: Planning, Preparation, Execution, Evaluation.]
(Some) Tactical Tasks
• Activity analysis
• Track artefacts (messages, images, urls, accounts, groups etc), e.g.
• find artefact origins,
• track how artefact moves across channels/groups etc
• find related artefacts
• Detect AMITT Techniques, e.g.
• detect computational amplification
• detect, track and analyse narratives
• Network detection
• Find inauthentic website networks (pinkslime)
• Find inauthentic account and group networks (including botnets)
• Credibility / Verification
• Fact-checking: verify that an article, image, video, etc. doesn’t contain disinformation.
• Source-checking: verify that a source (publisher, domain, etc.) doesn’t distribute disinformation.
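One of the tactical tasks above, finding an artefact's origin and tracking its movement across channels, can be sketched from a list of sightings. The data and channel names here are made up for illustration:

```python
# Sketch: given sightings of the same artefact across channels, treat the
# earliest sighting as the (apparent) origin and the sorted order as its path.
# Channel names, timestamps, and the URL are invented for illustration.
sightings = [
    {"channel": "platform_b/group1", "timestamp": 1612950000, "url": "https://example.com/claim"},
    {"channel": "platform_a/acct42", "timestamp": 1612940000, "url": "https://example.com/claim"},
    {"channel": "platform_c/page9",  "timestamp": 1612960000, "url": "https://example.com/claim"},
]

def origin_and_path(sightings):
    """Earliest sighting is the apparent origin; sorted order shows movement."""
    ordered = sorted(sightings, key=lambda s: s["timestamp"])
    return ordered[0]["channel"], [s["channel"] for s in ordered]

origin, path = origin_and_path(sightings)
print(origin, path)
```

"Apparent" matters: the true origin may be off-platform or deleted, so the earliest visible sighting is only a lower bound.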
Machine Learning / AI
• Graph analysis
• Find super-spreaders
• Find rumor origins
• Uncover new artefacts
• Track movement over time
• Text Analysis
• Find themes
• Classify to narratives
• Cluster text to narratives
• Search for similar text/narratives
• Image, video, audio analysis
• Cluster images
• Search for similar images
• Detect shallowfakes
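As a toy version of "find super-spreaders", the graph can be reduced to an edge list of reshares and ranked by reach. Accounts and counts below are invented; a real pipeline would use proper graph measures (degree, PageRank) over platform data:

```python
from collections import Counter

# Toy super-spreader detection: each edge is (resharer, original_author).
# Accounts are invented for illustration.
reshares = [
    ("u1", "spreader"), ("u2", "spreader"), ("u3", "spreader"),
    ("u1", "other"), ("u4", "spreader"), ("u2", "other"),
]

def top_spreaders(edges, n=2):
    """Rank accounts by how often their content is reshared."""
    counts = Counter(author for _, author in edges)
    return counts.most_common(n)

print(top_spreaders(reshares))  # [('spreader', 4), ('other', 2)]
```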
Shared Awareness: MISP Threat Intelligence Platform
• Threat sharing standard with large
community
• EU funded (ENISA, CIRCL)
• ISAC, ISAO, CERTs, CSIRTs
• NATO, Military, Intelligence
• Fortune 500s
• Open data standards
• MISP Core, STIX
• Connections
• API push/pull
• Email
• Anomali ThreatStream, ThreatConnect,
OSQuery
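To make the sharing idea concrete, here is a simplified event payload. The key names are loosely modelled on the MISP core format but are not the exact schema; real interchange would go through a MISP instance's API (push/pull) or email:

```python
import json

# Simplified, MISP-like event for sharing disinformation indicators.
# Keys are loosely modelled on the MISP core format, not the exact schema.
event = {
    "info": "Example disinformation incident",   # human-readable summary
    "Attribute": [                               # indicators attached to the event
        {"type": "url",  "value": "https://example.com/fake-story"},
        {"type": "text", "value": "narrative: miracle cure"},
    ],
}

payload = json.dumps(event, indent=2)
print(payload)
```

The point of the open format is the last bullet above: the same event can be pushed or pulled by other platforms (ThreatStream, ThreatConnect, etc.) without bespoke integration.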
Disinformation TTPs: AMITT Framework
(TTP Framework adopted by MITRE, upstream development by CogSecCollab)
DISINFORMATION
RISK MITIGATION
Reducing risks before incidents happen
DISINFORMATION RISK
Risk prediction
● which Events:
• to Whom
• what Effects
• how Severe
• how Likely
• which Environments (PESTLE
etc)
• which Factors could change these
• how to Measure this
Risk mitigation / remediation
● which Actions to balance:
• risk
• values
• other environmental factors
• how to Measure this
PREDICT, PRIORITISE,
PROTECT
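The "predict, prioritise" step can be sketched with the simplest possible model: score each predicted event as severity times likelihood and sort. The events, scales, and scores below are assumptions for illustration:

```python
# Toy risk prioritisation: risk = severity * likelihood, highest first.
# Events, scales (1-5), and scores are invented for illustration.
events = [
    {"event": "health rumour targeting staff", "severity": 4, "likelihood": 3},
    {"event": "brand impersonation site",      "severity": 3, "likelihood": 5},
    {"event": "fringe conspiracy mention",     "severity": 2, "likelihood": 2},
]

def prioritise(events):
    """Sort predicted events by risk score, highest risk first."""
    return sorted(events, key=lambda e: e["severity"] * e["likelihood"], reverse=True)

ranked = prioritise(events)
print([e["event"] for e in ranked])
```

A real assessment would also weigh the environmental (PESTLE) factors and values listed above, which a scalar score cannot capture on its own.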
DISINFORMATION BLUE TEAMS, RED TEAMS
Blue team: respond to disinformation
• response plan
• playbook
• tracking: campaigns, incidents, narratives, artifacts
Red team: learn from examples
• run a "disinformation as a service"/alternative marketing company,
• run a hostile social media platform,
• mix disinformation with coordinated sensor spoofs
• extend an existing narrative (e.g. anti-medical safety)
DISINFORMATION SIMULATIONS AND GAMES
ENABLEMENT: Process, Tools, Training
Training: e.g. Data Science for Disinformation Response
• Introduction to the disinformation team
• What are we chasing? Digital harm training
• Getting set up for disinformation data science
• Data sources (process, sources, stores)
• Disinfo Data Science examples
• Social text analysis
• Image data analysis
• Relationships as data
• Extending your analysis with machine learning skillz
• Communicating results (style, narratives, visualisations)
Tools:
• HIVE/D3PO - case tracking
• Data tools - analysis support (scrapers etc)
• MISP - intelligence sharing (AMITT TTPs etc)
Process:
THANK YOU
@bodaceacat
@ngree_h0bit
https://cogsec-collab.org/
Threet.consulting
