Transforming Public Health Surveillance

AI-powered surveillance versus traditional in-house systems

Building a world-class disease surveillance engine is hard. We would know – ours was the first to catch a cluster of unknown respiratory illness cases in Wuhan, China, which was the earliest signal of the resulting COVID-19 pandemic. Public health departments strive to do the same – why can’t they?

The problem is that lack of funding, understaffing, and constantly shifting priorities have left public health departments stuck with manual disease surveillance as the “path of least resistance”, or “good enough”. We cannot accept good enough anymore. It is time to redirect our bright analysts to more impactful, action-oriented tasks and leave the administrative work to automation… and here is why.

  1. Global Coverage: Diseases do not respect borders, and neither should disease surveillance efforts. Recent notable outbreaks in the USA, including Zika (2016), COVID-19 (2019), and Mpox (2022), all originated in other countries. Without travel, there is no spread – and the Federal Aviation Administration estimates that 2.9M passengers fly to or from the USA every day. That is 2.9M daily opportunities for disease spread. We constantly hear of departments struggling to “scan the world”. That is because there isn’t – and likely never will be – a single source of global truth for disease activity. It is scattered, more and more every day, across different articles, networks, and reports. That makes it impossible for public health departments – which on average have two analysts dedicating three hours of their day to disease surveillance – to capture, sort, and vet everything they need to produce timely and reliable risk assessments. Hard trade-offs have to be made, perhaps prioritizing local outbreaks or only watching reactive indicator-based sources… and before long, the next pandemic is discovered only when it knocks at the door.
  2. Volume: The more articles and sources processed, the more likely you are to capture early signals of disease events. This is intuitive, given the case for global coverage above. However, departments shy away from increasing volume because it brings increased complexity: more resources are needed to find new sources, remove duplicate reporting, and process more articles – and to do so in more languages. It is a losing battle. Staffing up to manually process more articles can move the needle by a couple hundred a day; what is needed is hundreds of thousands.
  3. Verification: The quality of the analysis is only as good as the information it is based on. While news feeds like ProMED, EIOS, or HealthMap can help meet the need to ingest a high volume of articles, they still require dedicated resources to manually review every single article, extract relevant information, and turn it into intelligence. Sifting through the noise to find the signal is a mundane, lower-value task, but one that cannot be overlooked. Furthermore, the quality of the intelligence produced – and the ability to find supporting contextual information – relies on the tenure of the analyst. And what happens when your best analyst leaves? You spend more time training a replacement and hope an outbreak doesn’t occur while you do it.
  4. Speed: All of this needs to happen fast – as close to real-time as possible. It took just 12 days for Mpox to spread from Africa’s tropical rainforests to Europe, and then to North America. Acting in real time to control an outbreak, or as close to real time as humanly possible, is the best defense against disruption. Waiting for official reports, which are updated weekly at best, is not fast enough. Once a new outbreak makes headlines on the nightly news, it is already too late. Vigilance is needed around the clock, every day of the week, throughout the entire year. Without a substantial budget, a manual surveillance program simply cannot deliver that.

When mapped to a standard horizon scanning process, the difference becomes clear.

Manual process:
  • Spending hours collecting data from multiple decentralized sources
  • Personal Excel sheets… everywhere
  • Relying solely on delayed indicator-based sources
  • Limited language coverage
  • Coverage during working hours only
What is possible with AI:
  • Single feed that captures the world (300,000+ articles)
  • No Excel
  • Global coverage
  • 130+ languages
  • 24/7 collection

Manual process:
  • Manual source vetting for every article
  • Manual extraction and re-keying of intelligence
What is possible with AI:
  • Pre-vetted articles
  • Automatically extracted intelligence from articles, directly integrated into workflows or programs of choice

Manual process:
  • Relying on the tenure of an analyst to find supporting context
  • Intelligence lost on individual workspaces
What is possible with AI:
  • Ready-to-use contextual insights across a broad range of sources
  • Equal access for everyone

Manual process:
  • Delayed or incomplete insights
What is possible with AI:
  • Up-to-date insights that incorporate all available intel

Achieving the future together

Coming out of COVID-19, many governments are assessing their response capabilities and adopting public-private partnership (“P3”) strategies to keep up with innovations in disease surveillance. The marriage of public sector reach with private sector speed can boost the performance of critical disease surveillance and threat assessment workflows in a credible and efficient way. Already adopted in other infrastructure projects, using the P3 model for proactive infectious disease intelligence is the next natural step.

AI-powered disease surveillance is the future. It is necessary to keep up with global disease activity and to protect citizens at home and abroad. To learn more about how BlueDot is partnering with organizations to reduce the disruption of infectious diseases, contact us today.
