Project Domino: Open intelligence to scale infodemic community response
Leo Meyerovich, PhD
CEO of Graphistry and co-founder of Project Domino
- Respond (decrease transmission & spread), such as:
  - Optimal preventive interventions & uptake maximization
  - Cutting through the “infodemic” & enabling better response
  - Data-driven learnings for increased efficacy of interventions
Early in COVID, the Director-General of the WHO warned, “We’re not just fighting an epidemic; we’re fighting an infodemic.” He knew responders would struggle with sense-making and mass behavior change.
A big reason is that social media changed everything. With 4 billion people using it, it is one of our fastest and farthest-reaching data sources, and interventionists can directly engage with half the planet. At the same time, misinformation outnumbers clinical reports. Looking back a year later, COVID has killed more people in the US than WWII did, and 30% of adults still oppose vaccination. The ecosystem is struggling.
Project Domino gives intervention partners leverage over social media. For example, at-risk COVID groups have massive preexisting digital communities, such as the 19 million cancer patients with worries like whether compromised immune systems make vaccination unsafe. These groups are big and effective because when people look for answers to make decisions, they respond better to peers and leaders than to FAQs and bots. However, community organizers struggle with core tasks like combating misinformation, sense-making, and running campaigns. They are not alone: journalists, researchers, digital platforms, and policy makers want to use similar methods and also struggle. They are overdue for a technology partner.
The pilot begins with two partners and expands to successive cohorts of similar ones. We will serve:
Patient Groups on Social Media: We are empowering groups actively fighting the infodemic, both by promoting positive information and by combating misinformation. We identified community groups, journalists/investigators/researchers, and digital platforms as top groups. Each serves a greater population.
Community Organizers: Our solution takes a “community health worker” approach to flip the disinformation narrative. We are empowering a focused audience of patient leaders and influencers who built social media sharing networks for up to a decade before the COVID-19 pandemic. Their networks have unique strengths: community-driven collaborations with clinical and scientific communities, experience with data literacy, and broader reach as trusted peers on social media. After initial buildout, we will iteratively listen, measure, and react, and then expand the cohort.
Data Journalists: The investigation intervention with Social Forensics will run similarly. The investigations and affiliated media partners will vary over the course of the project. We will take the same approach of initial buildout, iterative agile support, and then growth. We are including both groups in our funding request because they will have direct impact and gain more flexibility in engaging with us.
- Pilot: A project, initiative, venture, or organisation deploying its research, product, service, or business/policy model in at least one context or community
- Artificial Intelligence / Machine Learning
- Big Data
- Crowd Sourced Service / Social Networks
- GIS and Geospatial Technology
- Software and Mobile Applications
A strength of our approach is embracing open source data science and communities of people who are directly affected by the problem.
Some of our contributions already include:
Open source for our data infrastructure
Fixes to partner team open source code
Open source notebooks sharing some of our techniques
Public talks such as the DefCon AIV keynote on what we do
Assistance to breaking news reports in media like ABC News
Looking ahead, we will also be open sourcing:
Labeled training data
Trained models
Finally, our direct interventions are explicitly done in public: our results get published, the offending account gets taken down, etc.
Through an initial feasibility study in 2019, The Light Collective conducted a survey (N=299) and 21 interviews with leaders, organizers, and participants of peer-support groups in the cancer community on social media. The most common reason people chose for participating in their group on social media was to exchange and receive health information (79%). The second most common was to connect with others who share the same condition (76%). While we gather further evidence, we hypothesize that health information-seeking behavior on social networks is largely influenced by peers within established online patient communities.
We posit that individuals’ behavior is influenced most by peers. Authentic engagement starts with community relationships, not with institutions or chatbots. Established patient networks of peers sharing health information on social media need data-driven strategies to understand how to respond to misinformation. Our proposal takes a “community health worker” approach that puts analytics tools into the hands of patient community leaders. We aim to measure the impact of our interventions on the knowledge-seeking behaviors of patients on social media. At scale, our method bridges data literacy gaps between at-risk patients and the social media platforms where they reside.

Initially, we will focus on similar English-speaking groups; we will then expand to multilingual support for global reach.
Our premise is simple: let’s employ the same data-driven approach to visualize and respond to the infodemic on social media as we have for the pandemic. Showing information outbreaks of fake accounts, bot networks, and health content spreading through digital communities may have a similarly global impact on communities’ decision-making in the pandemic. What if journalists, public health officials, and patients seeking knowledge online had a way to see the bigger picture of social media manipulation? What if we made OSINT tools that show the infodemic is real and reveal the vectors spreading misinformation through digital communities? These are the questions we aim to explore.
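To make the “spread vectors” idea concrete, here is a toy sketch using only a plain adjacency list (the account names and edges are invented for illustration, not real data): starting from seed accounts that posted a piece of misinformation, a breadth-first traversal over retweet edges recovers which accounts the content reached and how many hops each sits from the seeds.

```python
from collections import deque

# Hypothetical retweet graph: edge a -> b means account b retweeted account a.
# All names are invented for illustration.
retweeted_by = {
    "seed_bot": ["amp1", "amp2"],
    "amp1": ["patient_group"],
    "amp2": ["patient_group", "amp3"],
    "amp3": [],
    "patient_group": [],
}

def spread_depths(graph, seeds):
    """BFS from seed accounts: maps each reached account to its hop distance."""
    depth = {s: 0 for s in seeds}
    queue = deque(seeds)
    while queue:
        node = queue.popleft()
        for nxt in graph.get(node, []):
            if nxt not in depth:
                depth[nxt] = depth[node] + 1
                queue.append(nxt)
    return depth

print(spread_depths(retweeted_by, ["seed_bot"]))
# {'seed_bot': 0, 'amp1': 1, 'amp2': 1, 'patient_group': 2, 'amp3': 2}
```

The same traversal output, plotted over time or rendered as a graph visualization, is the kind of “outbreak map” of misinformation spread described above.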
We fundamentally believe that digital communities on social networks must be equipped and empowered to know if they are being targeted, misinformed, and manipulated. If funded, we will show in real time how bot networks and fake accounts target vulnerable patient communities with anti-vaccination content while those people are making decisions. In doing so, we aim to transform knowledge-sharing about health on social media on a global scale.
To evaluate impact, we noted different types of metrics earlier. First, our intervention partners (i.e., The Light Collective and patient communities) will conduct interviews and a survey to measure a baseline of data literacy at the start of the project, before we introduce analytics tools. Once the tools are introduced, we will measure how effectively they enable participants in patient communities to respond to health misinformation within their networks. Metrics may include the following examples:
Effectiveness of identifying and responding to disinformation
Bot or disinformation networks identified by participants
Overall engagement with the tools
# of participants responding to counter the effects of disinformation
How tweets are reported within networks
Time to takedown
# of identified disinformation tweets per hashtag
Post-Intervention data literacy
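Several of these metrics reduce to simple aggregations over labeled tweet data. A minimal sketch, assuming a hypothetical list of labeled tweet records (the field names and labels are illustrative, not a real schema):

```python
from collections import Counter

# Hypothetical labeled tweet records; fields are illustrative only.
tweets = [
    {"id": 1, "hashtags": ["vaccine"], "label": "disinfo", "responded": True},
    {"id": 2, "hashtags": ["vaccine", "covid19"], "label": "disinfo", "responded": False},
    {"id": 3, "hashtags": ["covid19"], "label": "ok", "responded": False},
]

def disinfo_per_hashtag(tweets):
    """# of identified disinformation tweets per hashtag."""
    counts = Counter()
    for t in tweets:
        if t["label"] == "disinfo":
            counts.update(t["hashtags"])
    return counts

def response_rate(tweets):
    """Share of identified disinformation tweets that drew a counter-response."""
    disinfo = [t for t in tweets if t["label"] == "disinfo"]
    if not disinfo:
        return 0.0
    return sum(t["responded"] for t in disinfo) / len(disinfo)

print(disinfo_per_hashtag(tweets))  # Counter({'vaccine': 2, 'covid19': 1})
print(response_rate(tweets))        # 0.5
```

In practice the labels would come from participants and analysts rather than being hard-coded, but the aggregation step stays this simple.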
- United States
- United Kingdom
International expansion - navigating privacy laws internationally
GDPR
How to serve users under data privacy and data retention laws
Financial - how do we transition from volunteer-only to employees?
Volunteers can only work on their own time on nights and weekends
Without proper employees, our impact is constrained and time-limited
We have to pick and choose battles based on staffing and scheduling
Patient populations seeking knowledge on social media do not necessarily have the data and statistical literacy to understand how analytics can lead to strategies for action. Another barrier is trust. To ensure our interventions are adopted in trustworthy ways that improve data literacy, we will work with key stakeholders to provide:
Demo analyst notebooks
Video trainings
Easy-to-use, self-service analytics
Through The Light Collective, we will integrate with their work on patient-community-driven governance practices.
- Collaboration of multiple organisations
Proposal partners:
Graphistry
The Light Collective
Social Forensics
Compute sponsors:
Neo4j
Project Advisory:
Disaster Inc
Stanford
UIUC
General advisory:
Individuals with past leadership roles at HHS, FDA, NSF, Berkeley, the White House, etc.
General:
Nvidia
Bloomberg
Volunteers:
Often from top data orgs, but we do not make formal affiliations
Our prototype has momentum and is transitioning to committed pilots. To support our pilots, we are gathering grants aligned with our interventions. The pilot phase will unlock scale for more effective and self-serve solutions, and in turn, financial independence.
The successful prototype phase showed us what is already easy, what is possible with manual work, and what is ready for scale. Our unpaid volunteers successfully broke national misinformation news stories, with one of our members playing a role in the recent reveal of QAnon. Likewise, we succeeded beyond expectations on AI tasks like vaccine side-effect detection.
To provide an accessible core for intervention partners and contributing volunteers, we need a classic dedicated data product team. A small group can go a long way, especially since our open source model lets us solicit help on smaller pieces. Likewise, to ensure we are building for intervention teams, we want to support two representative groups working with us on impactful use cases. At the end, those use cases will have had impact, and we will be ready to scale to additional, similar groups at a fraction of the work for each. Excitingly, that is the beginning of scale-out!
We like to collaborate with data science technology partners, and for interventions, with investigators & journalists, digital platforms of all sizes, and community organizers.
The Light Collective is a good example of a community group, and upon success, we are interested in other groups, including government and commercial.
For digital platforms, we will be looking at the top 50 social media sites and newspapers that likely have misinformation problems but lack satisfactory anti-misinformation tooling.
Technology partners like Google, Microsoft, Facebook, ...: Our GPU and data pipeline hardware resources were contributed by sponsors, and we would love continued support of this kind! Likewise, we are eager for collaborations on their non-proprietary models.
Legal: Open governance and data usage agreements matter, yet lawyers are expensive. We would rather spend any funding on our makers and interveners, so legal support would be amazing.

Founder & CEO

Co-Founder, BRCA Advocate, Security Researcher