Theia
Automated Postoperative Wound Assessment and Surgical Site Surveillance through Machine Learning
The World Health Organization estimates that 266.2 to 359.5 million surgical operations were performed in 2012, an increase of 38% over the preceding eight years. Surgery exposes patients to an array of possible complications at the surgical site, some of which can be fatal. Postoperative wound complications cost hospitals, doctors, insurers, and patients significant time and money. Hence, an effective method to diagnose the early onset of wound complications is strongly desired. Algorithmically classifying wound images is a difficult task due to the variability in the appearance of wound sites.
We built an artificial intelligence algorithm, called Deepwound, to identify characteristics of a wound from a smartphone image. Our final computational model can accurately identify the presence of nine labels: drainage, wound, fibrinous exudate, granulation tissue, surgical site infection, open wound, staples, Steri-Strips, and sutures. Smartphones provide a means to deliver accessible wound care due to their increasing ubiquity. Paired with artificial intelligence, they can provide clinical insight to assist surgeons during postoperative care. We also built a mobile application frontend to Deepwound, called Theia, that helps patients track their wound and surgical recovery from the comfort of their home.
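For illustration, the snippet below is a minimal sketch of how a multi-label wound classifier along the lines of Deepwound could be set up. The ResNet-18 backbone, label ordering, and batch shapes are assumptions made for the example rather than the exact Deepwound configuration; the key idea is one independent sigmoid output per label, trained with a binary cross-entropy loss.

```python
import torch
import torch.nn as nn
from torchvision import models

# The nine wound-related labels described above (ordering is illustrative).
LABELS = [
    "drainage", "wound", "fibrinous_exudate", "granulation_tissue",
    "surgical_site_infection", "open_wound", "staples", "steri_strips", "sutures",
]

class WoundClassifier(nn.Module):
    """Multi-label classifier: one independent sigmoid output per label."""

    def __init__(self, num_labels: int = len(LABELS)):
        super().__init__()
        # Assumed backbone: a pretrained ResNet-18; the actual Deepwound
        # architecture may differ.
        self.backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
        self.backbone.fc = nn.Linear(self.backbone.fc.in_features, num_labels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Raw logits; apply a sigmoid at inference time for per-label probabilities.
        return self.backbone(x)

model = WoundClassifier()
criterion = nn.BCEWithLogitsLoss()  # standard loss for multi-label problems

# Dummy batch: 4 RGB smartphone photos resized to 224x224, with 9 binary labels each.
images = torch.randn(4, 3, 224, 224)
targets = torch.randint(0, 2, (4, len(LABELS))).float()
loss = criterion(model(images), targets)
loss.backward()

probs = torch.sigmoid(model(images))  # per-label probabilities in [0, 1]
```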
According to our literature search, our results surpass the state of the art for the prediction of surgical site infection. To our knowledge, this is the first extensive research project able to identify such a variety of afflictions and surgical dressings from a single image. Moreover, the mobile application provides a practical way to utilize our algorithm.
Our work has already received acclaim: I was invited to present it at the 2018 Consumer Electronics Show, and our team received a top-10 award at the 2018 Surgical Infection Society Annual Meeting. We are currently preparing a comprehensive paper on our research so that others can build upon it.
We are currently working on embedding our deep learning technology directly on mobile devices and on testing various domain-specific architectures for our algorithms. Our solution will provide rapid wound assessment to surgical patients, improving patient outcomes and reducing hospital costs.
- Effective and affordable healthcare services
- Other (Please Explain Below)
Our solution integrates artificial intelligence into the surgery pipeline, providing on-demand wound assessment and alerting patients and doctors to complications as soon as they set in. We consider our solution innovative because it is the first all-in-one software solution to predict and manage wound complications and, to our knowledge, one of the first practical uses of AI and machine learning in the medical industry. Moreover, our mobile application provides a patient-facing interface that enables patients to take charge of their own wound care.
The premise of our project is to synthesize mobile health technologies with postoperative wound assessment. Telemedicine enables patients and healthcare providers to communicate through digital assets, such as text messages and images, without face-to-face meetings. Novel deep learning technology resides at the core of our platform: we provide a mobile application that uses machine learning to identify surgical characteristics and track complications, such as surgical site infection. Moreover, the app can be used to communicate directly with the surgeon.
We are currently working on integrating our research with WinguMD's BodyMapSnap mobile application. BodyMapSnap is an iOS app used by surgeons around the world to document and track various surgeries. Equipping BodyMapSnap with our software will enable automatic wound image annotation. Components from Theia will be added to BodyMapSnap, along with a mechanism to collect more annotated wound images verified by professionals. This will further improve our models in a self-reinforcing cycle of increasing accuracy.
After incorporating our health technologies into BodyMapSnap, we aim to begin rolling out our software to hospitals in the United States.
Once we have enough annotated data, we can train mobile-embedded models using architectures such as SqueezeNet, so that our solution generates predictions faster and is available to those without internet access. This will further expand the reach of our application. A sketch of such an export pipeline is shown below.
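The snippet below is a rough sketch of what that mobile export could look like, assuming torchvision's SqueezeNet 1.1 and PyTorch's TorchScript mobile tooling; the classifier head, input size, and output file name are illustrative assumptions, not our finalized pipeline.

```python
import torch
import torch.nn as nn
from torchvision import models
from torch.utils.mobile_optimizer import optimize_for_mobile

NUM_LABELS = 9  # the same nine wound labels as the server-side model

# Assumed setup: SqueezeNet 1.1 pretrained on ImageNet, with its final
# 1x1 convolution swapped for a 9-way multi-label head.
net = models.squeezenet1_1(weights=models.SqueezeNet1_1_Weights.DEFAULT)
net.classifier[1] = nn.Conv2d(512, NUM_LABELS, kernel_size=1)
net.num_classes = NUM_LABELS
net.eval()

# Trace with a representative input and optimize for on-device execution.
example = torch.randn(1, 3, 224, 224)
scripted = torch.jit.trace(net, example)
mobile_ready = optimize_for_mobile(scripted)
mobile_ready._save_for_lite_interpreter("deepwound_mobile.ptl")
```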
Finally, in the near future, we would like to implement wound healing prediction using the time series of a patient's wound images. Recurrent neural networks (RNNs) have shown great promise in this area, and once we have collected enough time-series data, we will be able to investigate this idea.
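As a sketch of the direction we have in mind, the example below wires a per-image CNN feature extractor into an LSTM over a patient's sequence of wound photos. The backbone, feature size, and single "healing" score are assumptions for illustration only, not a finalized design.

```python
import torch
import torch.nn as nn
from torchvision import models

class HealingTrendModel(nn.Module):
    """Sketch: per-image CNN features fed to an LSTM over the visit sequence."""

    def __init__(self, hidden_size: int = 128):
        super().__init__()
        # Assumed per-image encoder: ResNet-18 with its classifier removed,
        # yielding a 512-dimensional feature vector per photo.
        backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
        backbone.fc = nn.Identity()
        self.encoder = backbone
        self.lstm = nn.LSTM(512, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)  # e.g. probability the wound is healing

    def forward(self, image_seq: torch.Tensor) -> torch.Tensor:
        # image_seq: (batch, time, 3, H, W) -- one photo per follow-up visit.
        b, t, c, h, w = image_seq.shape
        feats = self.encoder(image_seq.reshape(b * t, c, h, w)).reshape(b, t, -1)
        _, (h_n, _) = self.lstm(feats)
        return torch.sigmoid(self.head(h_n[-1]))  # one score per patient sequence

# Example: 2 patients, 5 follow-up photos each, 224x224 RGB.
scores = HealingTrendModel()(torch.randn(2, 5, 3, 224, 224))
```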
- Adult
- Old age
- US and Canada
We believe the value of our software will motivate healthcare providers to use it and give us the feedback we need to improve it and maximize its potential.
We will deploy our solution through the existing network of surgeons and hospitals that already use BodyMapSnap.
We are not yet serving any patients with Deepwound. However, as we add our system to BodyMapSnap, we will serve significantly more people.
Once our algorithms are fully integrated with BodyMapSnap, we will serve the same population that BodyMapSnap serves.
- Not Registered as Any Organization
- 7
- 1-2 years
We are affiliated with Stanford University and the Palo Alto Veterans Affairs Hospital, and therefore have a network of professors and researchers who could assist us in our work.
will add later
will add later
- Peer-to-Peer Networking
- Organizational Mentorship
- Connections to the MIT campus
- Impact Measurement Validation and Support
- Media Visibility and Exposure
- Other (Please Explain Below)
