Pandemic threat reporting platforms
A dataset of highly relevant science articles about sick animals is needed, one that misses no relevant articles and includes as few irrelevant results as possible. We can achieve this through advanced keywords.
By searching already published science articles, and future articles yet to be written, describing infected animals whose viruses have also spilled over to nearby villagers who developed immunity, we can identify the likely potential pandemic sites. We just need to make sure all, or nearly all, such articles are found and none are missed; AI alone will miss some. A common-sense review of the articles found through keywords and/or author searches makes the spillover risk straightforward to assess.
A platform where scientists can submit science articles on sick animals, and news writers can submit articles on new outbreaks, will ensure that all or most articles are available to us. Advanced keyword strings will also let us find these authors so that none are missed.
Many systems will overlook sick-animal articles entirely. Feeding a large, unfiltered dataset into AI for classification will result in irrelevant articles being chosen or relevant ones being missed, because current AI is not advanced enough to find every correct article and will usually exclude relevant ones.
1. There are numerous science articles describing animals that test positive for a worrisome virus; in some of these articles, nearby villagers test positive as well. I have advanced keyword strings that find these specific articles with high specificity and minimal exclusion, producing a highly relevant dataset (a minimal search sketch follows this list). We can contact each author to inform them of PPE protocol in the field and to ask that they report future sick-animal findings, so that no article is ever missed.
2. Using another set of custom keywords, we can search for news articles about outbreaks of the same virus in the animal's region to gauge the spillover threat from that animal.
3. I have keywords for finding news articles related to outbreaks. These keywords were part of the winning solution for the "early signs of pending pandemic" challenge on innocentive.com. We can use them to find news articles describing outbreaks, identify the authors' names, and email them requesting that they submit similar articles to a reporting platform in the future. All news companies will eventually be notified. This will ensure we have all or most news articles on all outbreaks through one platform, never missing an article.
4. I will recruit talent to routinely search for outbreak news articles using these keywords.
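To make steps 1 and 2 concrete, here is a minimal sketch of sending a boolean keyword string to PubMed's public E-utilities search endpoint. The query shown is an illustrative placeholder rather than the actual advanced keyword strings, and the parameters are assumptions based on the public NCBI API.

```python
"""Minimal sketch of steps 1 and 2: query PubMed with a boolean keyword string.

The actual advanced keyword strings are not reproduced here; the query below
is an illustrative placeholder, and the endpoint/parameters are assumptions
based on the public NCBI E-utilities API.
"""
import requests

ESEARCH_URL = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

# Placeholder boolean query; the real advanced keyword strings would go here.
QUERY = '(bats[Title/Abstract] OR camels[Title/Abstract]) AND villagers AND seropositive'

def search_pubmed(query: str, max_results: int = 100) -> list[str]:
    """Return PubMed IDs (PMIDs) matching a boolean keyword query."""
    params = {
        "db": "pubmed",
        "term": query,
        "retmax": max_results,
        "retmode": "json",
    }
    resp = requests.get(ESEARCH_URL, params=params, timeout=30)
    resp.raise_for_status()
    return resp.json()["esearchresult"]["idlist"]

if __name__ == "__main__":
    pmids = search_pubmed(QUERY)
    print(f"Found {len(pmids)} candidate articles")
```

The same pattern would apply to a news-article source in step 2, with the news keywords substituted for the placeholder query.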

The solution will serve populations near high-risk animals, such as villagers, and the populations those villagers may come into contact with. It will also protect researchers who collect samples by informing them of proper PPE protocols. A new global PPE standard does need to be set for sample collection. All researchers participating in such activities need to be made aware of proper and additional PPE protocols, from before entering the cave through the hours after exiting it. Quarantine measures may need to occur as well. This will protect the region where the cave is located and the regions where the researchers live and work. Biggest of all, it will protect the entire world's population by predicting the next pandemic early enough to prevent it.
AI is great at classifying images, but not so great at understanding the context of text-heavy science articles, which results in faulty classification that can miss the highest-threat animals in highly relevant articles. AI will always be chasing that highly relevant article, and it will not even be apparent that a relevant future article has been missed, because relevant articles are never classified as threats in the first place when the dataset is broad and unfiltered. It is a problem created from a problem, and it keeps compounding.
We can use my keywords to find news articles and present them to internet users through CAPTCHA technology, asking "Which of these articles foreshadows a pandemic?" to help identify news articles that describe pandemic potential.
We can also use my keywords to collect science articles about sick animals and present them via CAPTCHA technology to scientists, asking "Which of these articles could foreshadow a pandemic?"
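As a rough illustration of how those CAPTCHA responses could be tallied, here is a minimal sketch of aggregating votes per article and promoting an article for expert review once enough independent users flag it. All names and thresholds are hypothetical, not part of any existing CAPTCHA service.

```python
"""Minimal sketch of tallying CAPTCHA-style votes on candidate articles.

All names and thresholds here are hypothetical; the point is only that an
article is promoted for expert review once enough independent users mark it
as foreshadowing a pandemic.
"""
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class ArticleVotes:
    yes: int = 0  # user answered that the article foreshadows a pandemic
    no: int = 0

votes: dict[str, ArticleVotes] = defaultdict(ArticleVotes)

def record_vote(article_id: str, foreshadows: bool) -> None:
    """Record one CAPTCHA response for an article."""
    if foreshadows:
        votes[article_id].yes += 1
    else:
        votes[article_id].no += 1

def flagged_for_review(min_votes: int = 20, threshold: float = 0.7) -> list[str]:
    """Return article IDs where a clear majority of users saw pandemic potential."""
    flagged = []
    for article_id, v in votes.items():
        total = v.yes + v.no
        if total >= min_votes and v.yes / total >= threshold:
            flagged.append(article_id)
    return flagged
```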
Eventually, I want to build an AI that searches for highly relevant articles and returns a smaller result set in which the majority of results are highly relevant, important articles are not missed, and irrelevant articles are minimized. This will be a hard task, because I am fully aware of how inefficient AI currently is for query-based uses. However, I hope to change that in the future; it may become a new challenge once this system is in place, applying advanced search-strategy knowledge to the AI to tease out the highly relevant science articles without erroneous false positives. Thereafter it is just a matter of discussing which sick animals pose the highest threat based on the data presented.
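As one possible starting point for that future AI, here is a minimal sketch, assuming a small hand-labelled set of abstracts exists, of a simple text classifier whose decision threshold is deliberately set low so that relevant articles are rarely missed, at the cost of extra false positives for human review. The examples, labels, and threshold are placeholders, not a working model.

```python
"""Minimal sketch of a recall-first text classifier for article abstracts.

Assumes a small hand-labelled set of abstracts already exists; the examples,
labels, and threshold below are placeholders, not real training data.
"""
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labelled examples (1 = relevant spillover article, 0 = irrelevant).
texts = [
    "Bats in the cave tested positive for a novel coronavirus; nearby villagers were seropositive.",
    "Camels near the village carry antibodies against the virus.",
    "A review of crop irrigation methods in arid regions.",
    "New battery chemistry improves electric vehicle range.",
]
labels = [1, 1, 0, 0]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)

def is_candidate(abstract: str, threshold: float = 0.3) -> bool:
    """Flag an abstract for human review when its relevance probability
    clears a deliberately low threshold, trading false positives for recall."""
    prob_relevant = model.predict_proba([abstract])[0][1]
    return prob_relevant >= threshold

print(is_candidate("Villagers near infected bats show antibodies to the virus."))
```

Lowering the threshold is the simplest way to encode the priority stated above: never miss a relevant article, and let human reviewers discard the extra candidates.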
All of the above will serve the entire world.
- Strengthen disease surveillance, early warning predictive systems, and other data systems to detect, slow, or halt future disease outbreaks.
This solution will predict pandemics using already published, publicly available articles.
- Prototype: A venture or organization building and testing its product, service, or business model.
The right eyes need to see what the system finds (e.g., science articles about infected bats or camels) and assess the risk. This is where Solve can help.
The website would serve a purpose similar to ProMED-mail, but for news article writers and scientific article authors to submit their outbreak findings. The website needs to be built and backed by a credible organization's name to make sure authors and news companies are on board to submit.
Additionally, we can build a separate website for researchers to notify and report zoonotic viruses found in animals. We can use my keywords to find the authors of such science articles and advise them to submit to the platform before publication, which takes months.
- Big Data
- Human Capital (e.g. sourcing talent, board development, etc.)
- Product / Service Distribution (e.g. expanding client base)
- Technology (e.g. software or hardware, web development/design, data analysis, etc.)
- No, I do not wish to be considered for this prize, even if the prize funder is specifically interested in my solution
- No, I do not wish to be considered for this prize, even if the prize funder is specifically interested in my solution
- No, I do not wish to be considered for this prize, even if the prize funder is specifically interested in my solution
- Yes, I wish to apply for this prize
Science articles and news articles are the most abundant and longest-running data publicly available, and they are scattered across hundreds of thousands of database sources. Using advanced keywords, we can filter out the relevant articles and use them to our advantage. This is strong data science.
Once the system is in place, I will look into transferring these search strategies, as algorithms, into the AI realm, with the goal of a fully automated process that is just as efficient as the advanced keyword searches described above.
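As a first step toward that transfer, a keyword search strategy can be written down as a plain function and reused by any later automated pipeline. The sketch below is illustrative only; the term groups are placeholders, not the actual advanced keyword strings.

```python
"""Minimal sketch of encoding a manual keyword search strategy as a reusable
function. The term groups are placeholders, not the actual advanced keyword
strings; the point is that the same boolean logic (any animal term AND any
spillover term) can later be handed to an automated AI pipeline unchanged.
"""

ANIMAL_TERMS = {"bat", "bats", "camel", "camels", "pangolin"}
SPILLOVER_TERMS = {"seropositive", "antibodies", "villagers", "spillover"}

def matches_strategy(text: str) -> bool:
    """Return True if the text satisfies the boolean keyword strategy."""
    words = set(text.lower().split())
    return bool(words & ANIMAL_TERMS) and bool(words & SPILLOVER_TERMS)

articles = [
    "Bats near the village carried the virus and several villagers were seropositive",
    "Report on municipal water quality improvements",
]
print([a for a in articles if matches_strategy(a)])
```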
- No