Mitigation Framework

Step 1. Threat Map. Identify potential threat methods for analysis.

Threat Type: Direct
  • Individual: Bullying; coordinated targeting; hateful, inflammatory, or embarrassing comments; threats of violence; upsetting content; gendered threats; sustained harassment; mob harassment; sexual harassment; stalking; doxxing; SWATing; and account takeovers/lockouts.
  • Group Identity: Tactics leveraging social cleavages (for example, hate speech or dog whistles) such as race, ethnicity, socioeconomic status or class, gender, sexual orientation, religion, regional or national origin, citizenship status, occupation, employment status, age or generation, education, or political affiliation.
  • Organization: Coordinated targeting of organizational accounts; denial of service or access to an organization's content.

Threat Type: Indirect
  • Individual: Spreading of false or misleading information about an individual; defamatory information; disclosure of non-consensual intimate images; impersonation; hateful, inflammatory, or embarrassing comments.
  • Group Identity: Spreading of false or misleading information about a social group; hate speech directed toward a social group; divisive speech that may be either opposed to or supportive of various social groups.
  • Organization: Mass internet shutdowns; establishing seemingly allied organizations to share disingenuous content; establishing opposition organizations to spread opposing viewpoints; imitation of the organization's online presence (e.g., typosquatting).

Threat Type: Ingestion
  • Individual: Persuasion of the individual to believe, or be biased toward, inaccurate information.
  • Group Identity: Persuasion of groups to believe inaccurate information about other groups, sowing division or apathy or bolstering alliances.
  • Organization: Persuasion of the organization to use inaccurate information in decision making.

Threat Type: Generation
  • Individual: Creation, publishing, or sharing of misinformation; harassment against co-workers and others outside of the organization.
  • Group Identity: Creation and spreading of misinformation; harassment against co-workers and others outside of the organization.
  • Organization: Creation and spreading of misinformation; harassment against co-workers and others outside of the organization.
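For teams that want to work with the threat map programmatically (for example, to enumerate every cell during a risk workshop), it can be sketched as a small lookup table. The short labels below are illustrative summaries of the cells above, not the framework's official wording:

```python
from itertools import product

# Hypothetical encoding of the Step 1 threat map: keys are
# (threat type, subject type); values are abbreviated cell summaries.
SUBJECTS = ("individual", "group identity", "organization")
THREATS = ("direct", "indirect", "ingestion", "generation")

THREAT_MAP = {
    ("direct", "individual"): "bullying, threats, stalking, doxxing, SWATing, account takeover",
    ("direct", "group identity"): "hate speech / dog whistles leveraging social cleavages",
    ("direct", "organization"): "coordinated targeting of accounts, denial of service",
    ("indirect", "individual"): "false or defamatory information, NCII, impersonation",
    ("indirect", "group identity"): "false information about a group, divisive speech",
    ("indirect", "organization"): "internet shutdowns, fake allied/opposition orgs, typosquatting",
    ("ingestion", "individual"): "persuasion to believe inaccurate information",
    ("ingestion", "group identity"): "persuasion sowing division, apathy, or false alliances",
    ("ingestion", "organization"): "inaccurate information used in decision making",
    ("generation", "individual"): "creating or sharing misinformation, harassing others",
    ("generation", "group identity"): "creating or spreading misinformation, harassment",
    ("generation", "organization"): "creating or spreading misinformation, harassment",
}

# Sanity check: every (threat, subject) cell of the matrix is covered.
assert set(THREAT_MAP) == set(product(THREATS, SUBJECTS))
```

Iterating over `product(THREATS, SUBJECTS)` guarantees no cell of the matrix is skipped during analysis.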

Step 2. Harm Map. Connect scenarios to potential harms for the organization or its individuals or groups of individuals.

Individual Harms

Harms to Self-Determination
  • Loss of autonomy: Needless changes in behavior, including self-imposed restrictions on freedom of expression or assembly.
  • Loss of liberty: Improper exposure to arrest or detainment. Even in democratic societies, false or negative information can lead to increased scrutiny, arrest, or abuse of governmental power.
  • Power imbalance: Information, or the threat of its disclosure, can create an inappropriate power imbalance or take unfair advantage of a power imbalance between the acquirer and the individual.
  • Physical harm: Actual physical harm to a person, including the potential to cause death.
  • Psychological harm: Information can cause psychological distress to the target, such as increased anxiety, fear, and depression, possibly triggering reactions to previous trauma. This distress can also contribute to physical self-harm.

Reputational Harms
  • Loss of trust: The breach of implicit or explicit expectations about character and behavior between individuals or organizations. Loss of trust can leave entities reluctant to engage in further cooperation.
  • Stigmatization: Information can create a stigma that can cause embarrassment, emotional distress, or discrimination.

Economic Harms
  • Financial losses: Harms resulting from loss of employment, business relationships, increased government scrutiny, or imprisonment.

Group Harms

Reputational Harms
  • Discrimination: Groups within an organization or individuals may be unfairly judged, scrutinized, or excluded based on their actual or perceived group affiliation.
  • Stigmatization: Information can create a stigma that can cause embarrassment, emotional distress, or discrimination against a certain group.

Organizational Harms

Operational Harms
  • Loss of productivity: Inefficiencies due to decision-making based on inaccurate or misleading information, leading to increased delays, false starts on program activities, or time spent sorting and verifying information for accuracy.
  • Loss of mission impact: Decreased impact due to organizational decision-making or activities that incorporate or promote inaccurate information, or from the influence of competing narratives on the organization's supported beneficiaries.

Step 3. Threat Scenarios. Develop practical description of the threat and challenge assumptions.

Probing Questions
Adversary
  • What is the identity of the adversary responsible for the harmful information?
  • What are the goals (if any) of an adversary sharing the harmful information?
  • What resources might an adversary have at their disposal?
Content
  • Does the content contain personal information?
  • Does the content threaten or create fear for one's safety?
  • What elements of "truth" are contained in the message?
Context
  • How is the harmful information delivered?
  • When and how often are interactions taking place?
  • How might the harmful information affect current events or campaigns?
Audience
  • Who is the intended recipient of the information?
  • How could various stakeholders of the organization perceive the harmful information? What social norms might be violated?
  • How might the audience react to the harmful information?
  • How might law enforcement or government regulators react to the harmful information, if known?
Legitimacy
  • What might give this threat legitimacy with an influential audience?
  • Why might the threat's message or methods be perceived as normatively acceptable?
  • How might those information sources already deemed legitimate by certain audiences spread or give additional credibility to the threat?
  • Who in power may spread or give credibility to the threat?
Impersonation
  • How might an adversary take over or share information from an account belonging to the target?
  • How might an adversary convince an audience that their information is being shared with the target's approval?
  • How might an adversary bypass any vetting processes intended to ensure representations are made by authentic sources of information?
Linking
  • How have associates of the target been subject to harmful information threats in the past?
  • How might publicly disclosed information about the target's associations be tied to additional harmful information threats?
  • How might historical information about the target's associations and activities be used in combination with the threat?
Amplification
  • How might an adversary disseminate information to a large audience?
  • What is the current number of followers or subscribers of the adversary?
  • How might a harmful message move, intentionally or unintentionally, from less active online forums to more popular platforms?
  • How has an adversaryโ€™s message or similar threats been amplified in the past?
Collection
  • How might sensitive information about the target be gathered by an adversary?
  • How might a threat have been able to access, store, or share private information about the target?
  • How might publicly available information about the target give credibility to a threat?
Suppressing
  • How might an adversary prevent opposing perspectives from being shared and heard?
  • Why might the target be unable to use their existing information channels (website, social media accounts, newsletter) to counter the threat?
  • How might an audience be blocked from accessing the targetโ€™s information or counter-messaging?

Step 4. Mitigation Map. Select suitable controls to mitigate potential harms.

Identify

Identify Harmful Information Risks
Identify Potential Threats
  • Consider threats to individuals, groups, or the organization
  • Consider direct targeting, indirect attacks, ingestion, and generation
Connect Threats to Potential Harms
  • Identify the impact of potential threats to individuals, groups, and the organization
  • Consider physical, reputational, financial harms
Create and Prioritize Threat Scenarios
  • Describe threat scenarios in detail
  • Evaluate and prioritize scenarios based on likelihood and impact
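Prioritization is often done with the conventional likelihood-times-impact risk matrix. A minimal sketch under that convention; the 1-to-5 scoring scale and example scenario names are assumptions, not part of the framework:

```python
from dataclasses import dataclass

@dataclass
class Scenario:
    """A threat scenario scored on 1 (low) to 5 (high) scales."""
    name: str
    likelihood: int
    impact: int

    @property
    def risk(self) -> int:
        # Conventional risk-matrix score: likelihood x impact.
        return self.likelihood * self.impact

def prioritize(scenarios: list[Scenario]) -> list[Scenario]:
    """Return scenarios ordered highest-risk first."""
    return sorted(scenarios, key=lambda s: s.risk, reverse=True)
```

For example, a doxxing scenario scored likelihood 4, impact 5 (risk 20) would outrank a typosquatting scenario scored likelihood 2, impact 3 (risk 6).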
Identify informal practices or formal policies
Security (Physical or Digital) or Incident Response
Identify and evaluate the following:
  • Evaluate security risk management abilities and training.
  • Consider how psychosocial risks are addressed in the risk assessment / management program.
  • Improve account security of organizational and personal social media accounts.
  • Decrease the online availability of personal information about staff members.
  • Other:
Social Media Use
Identify and evaluate the following:
  • Acceptable social media use for organizational accounts, including response policy for comments and private messages.
  • Monitoring protocols for mentions of your organization and staff members in social media, comments, and forums.
  • How policies consider the subjective experience of online abuse.
  • Other:
Communications and Public Relations strategy
Identify and evaluate the following:
  • Media literacy and verification processes to avoid sharing and consuming misinformation.
  • Plans to address potential information threats in advance.
  • Existing messaging that addresses misinformation directly or offers constructive alternative narratives in outreach to funders and stakeholders
  • Contacts at social media platforms, media outlets, academia, government, and intermediaries that can support the organization during a crisis
  • โ€œFirst pageโ€ search results for the organization and its members
  • Other:
Human Resources or Employee Health & Wellness
Identify and evaluate the following:
  • The ability and experience of members of historically disadvantaged or marginalized groups to report, respond, and recover from harmful information
  • Reporting and confidential disclosure mechanisms for online and offline abuse
  • Partnerships with programs offering mental health counseling, trainers, and other resources for victims and subjects of harmful information
  • Other:
Workplace Ethics / Code of Conduct
Identify policies and practices regarding:
  • Financial accounting
  • Managing conflict of interests
  • Political endorsements and advocacy
  • Whistleblower protections
  • Other:
Evaluate Organizational Culture

Evaluate the organization's capacity to address harmful information
Leadership
Identify and evaluate the following:
  • Buy-in to address concerns of misinformation and online abuse
  • Openness and transparency on areas for improvement
  • Other:
Values
Identify and evaluate the following:
  • Explicit values
  • Implicit values
  • Other:
Performance
Identify and evaluate the following:
  • How leadership and staff uphold organizational values
  • How staff and leadership perform and manage the identified policies or practices
  • Other:
Protect

Improve Organization-wide Digital Security

Protect the confidentiality, integrity, and availability of the organization's and individuals' information systems
Maintain confidentiality
  • Secure accounts (personal & organizational)
  • Secure devices
  • Implement network monitoring
  • Other:
Maintain availability of information
  • Implement DoS Protection
  • Enable Censorship Circumvention
  • Other:
Maintain integrity of information
  • Enable domain-spoofing protection (e.g., DMARC)
  • Enable DNS-hijacking protection (e.g., DNSSEC)
  • Register similar URLs
  • Other:
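As a concrete illustration of the DMARC item above, publishing a DNS TXT record like the following tells receiving mail servers to reject mail that fails SPF/DKIM alignment and to send aggregate reports; the domain and reporting address are placeholders:

```
_dmarc.example.org.  IN  TXT  "v=DMARC1; p=reject; rua=mailto:dmarc-reports@example.org"
```

Organizations typically start with a monitoring policy (`p=none`) and review the aggregate reports before tightening to `quarantine` or `reject`.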
Minimize the Availability of Potentially Harmful Information

Reduce or obfuscate available open-source information about the organization or its members.
Organizational Data Management
  • Implement data minimization strategy
  • Conduct open source audit
  • Other:
Personal Data Management
  • Review Old Social Media Posts
  • Review Social Media Privacy Settings
  • โ€œDox Yourselfโ€
  • Other:
Maintain Social Media Management best practices
  • Create policies for how to engage with legitimate commentators versus โ€œtrollsโ€ in public and via private messages.
  • Maintain social media manager anonymity.
  • Other:
Strengthen Communication Plan and Social Media Policies

Develop a communication plan and social media policies
Create a strategy for when to let harmful information "die out", when to counter with direct refutations, and when to promote new narratives.
  • Create messages in advance.
  • Connect with a network of journalists and fact-checkers.
  • Create advertising and automation strategies for messaging amplification.
  • Other:
Maintain organizational presence and accurate information on authoritative sources of information
  • Improve web presence and search engine optimization including strengthened networks of supporting sites.
  • Correct the record on authoritative sources such as Wikipedia
  • Other:
Detect

Implement Individual Detection

Develop individual skills to identify known strategies for creating harmful information
Identify and learn how to react when in potentially compromising situations
  • Verify the identity of new contacts, online and offline
  • Familiarize yourself with counterintelligence tradecraft
  • Avoid discussing politically or culturally sensitive topics with strangers
  • Other:
Improve media literacy to reduce an organization's susceptibility to its own ingestion and spread of misinformation.
  • Teach source checking
  • Implement content verification procedures
  • Other:
Implement Organizational Detection

Develop organizational policies and practices for detecting harmful content
Implement manual content monitoring
  • Implement and train staff on reporting harmful (or suspected) online information, including seemingly innocuous behavior
  • Create a plan to relieve subjects of abuse from self-monitoring
  • Create an emergency plan for manual monitoring of abuse campaigns by staff.
  • Other:
Implement automatic content monitoring
  • Set up free keyword-notification tools such as Google Alerts
  • Preset filtered feeds in tools such as TweetDeck
  • Employ social sensing or brand monitoring services
  • Other:
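Google Alerts can deliver results as an Atom feed, which makes lightweight automation possible. A minimal sketch for filtering such a feed for watched keywords, assuming alerts are delivered as an Atom feed; the feed URL and keyword list are placeholders you would supply:

```python
import urllib.request
import xml.etree.ElementTree as ET

ATOM = "{http://www.w3.org/2005/Atom}"  # Atom XML namespace

def fetch_feed(url: str) -> str:
    """Download the raw feed XML (network call)."""
    with urllib.request.urlopen(url, timeout=30) as resp:
        return resp.read().decode("utf-8")

def matching_entries(feed_xml: str, keywords: list[str]) -> list[str]:
    """Return titles of feed entries that mention any watched keyword."""
    root = ET.fromstring(feed_xml)
    hits = []
    for entry in root.iter(ATOM + "entry"):
        title = entry.findtext(ATOM + "title", default="")
        if any(k.lower() in title.lower() for k in keywords):
            hits.append(title)
    return hits
```

A scheduled job could call `fetch_feed` on each alert feed and route `matching_entries` results to the staff responsible for monitoring, so no one person has to watch feeds manually.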
Implement external content monitoring
  • Collaborate with other organizations to monitor and research developments in misinformation in oneโ€™s domain
  • Create an intake plan for colleagues from other organizations that request help
  • Other:
Respond

Immediate Response

"Top 3 Things", planned in advance.
Physical Safety and Wellbeing
  • Train staff for initial shock: "breathe and connect with support, don't handle this alone"
  • Plan to move to safety if threats are credible
  • "Better to be safe than sorry" policies
  • Other:
Digital Security
  • Conduct Incident Response procedures
  • Other:
Gather Evidence and Stay Aware of Threats
  • Monitor and archive (TweetDeck, Dox Yourself, Hunch.ly, Archive.org, Google Alerts)
  • Manage manual monitoring of abuse campaigns by co-workers, accounting for burnout.
  • Other:
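Archived evidence is most useful when each capture is also logged locally with a timestamp. A minimal sketch using the Wayback Machine's public "Save Page Now" endpoint; the evidence-log format and field names are assumptions for illustration:

```python
import json
import time
import urllib.request

# Wayback Machine "Save Page Now" endpoint: GET /save/<url> requests a snapshot.
SAVE_ENDPOINT = "https://web.archive.org/save/"

def evidence_record(url, note, now=None):
    """One JSON line for a local evidence log: what was captured, when, and where."""
    ts = time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime(now))
    return json.dumps({"url": url, "note": note, "captured": ts,
                       "archived_at": SAVE_ENDPOINT + url})

def archive(url):
    """Ask the Wayback Machine to snapshot the page (network call)."""
    urllib.request.urlopen(SAVE_ENDPOINT + url, timeout=60)
```

Appending each `evidence_record` line to a file yields a tamper-evident timeline that can later be handed to platforms, counsel, or law enforcement.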
Next Stage Response
Prevent Escalation of Harms
Respond to content on Platforms
  • Engage with platforms or intermediaries for removal of harmful content or automated accounts
  • Use tools to identify, ignore, and/or block bots/trolls
  • Other:
Execute Crisis Communication Plan
  • Engage with supporters and funders to keep them informed
  • Inform public via media or other outlets (as needed)
  • Other:
Engage legal protections from harassment or threats.
  • Notify law enforcement authorities if appropriate (SWATing prevention)
  • Contact legal counsel for jurisdiction-based guidance
  • Other:
Recover

Improving Safety

Holistic Recovery
Rebuild Psychological Resilience
  • Offer multiple avenues for coping
  • Provide counseling services for employees
  • Other:
Improve Physical Protections
  • Reassess physical vulnerabilities at work locations and increase protections as appropriate
  • Revisit personal security plans for employees
  • Other:
Recover Digital Safety
  • Reassess digital vulnerabilities and increase protections as appropriate
  • Other:
Repair Information Harms
Refine Communications Plan
  • Adjust messaging based on counternarratives and situation
  • Engage with supporters and funders to keep them informed.
  • Inform public via media or other outlets
  • Other:
Continue to use Platform-Specific Methods
  • Search engine optimization
  • Search result downranking
  • Content removal processes such as Right to be Forgotten / DMCA.
  • Other:
Seek Legal Remedies
  • Contact legal counsel for jurisdiction-based guidance
  • Other:
Reassessment
Conduct a Formal After-Event Assessment
  • Learn how the organization could improve
  • Learn and validate what people did well
  • Describe resources that you wish were available.
  • Other: