Project

Research Area | Application, Interpretability and Clinical Translation of Deep Learning Algorithms for Medical Images

Pratik Shah

Research in our laboratory reduces dependence on specialized medical imaging devices and on biological and chemical processes, and creates new paradigms for point-of-care clinical diagnosis using low-cost images captured with simple optical principles. We have demonstrated a series of generative, prediction, and classification algorithms that obtain medical diagnostic information about organs and tissues from simple images captured by low-cost devices. For example:

  • In collaboration with Brigham and Women’s Hospital in Boston, MA, we devised and published a novel “Computational staining” system to digitally stain photographs of unstained tissue biopsies with Haematoxylin and Eosin (H&E) dyes to diagnose cancer. This research also described an automated “Computational destaining” algorithm that can remove dyes and stains from photographs of previously stained tissues, allowing reuse of patient samples. Our method uses neural networks to give physicians timely information about the anatomy and structure of the organ, while saving time and preserving precious biopsy samples. (Project link)
  • In collaboration with Stanford University School of Medicine and Harvard Medical School, we also reported several novel mechanistic insights and methods to facilitate benchmarking and clinical and regulatory evaluation of generative neural networks and computationally H&E-stained images. Specifically, we trained high-fidelity, explainable, and automated computational staining and destaining algorithms to learn mappings between pixels of unstained cellular organelles and their stained counterparts. A novel and robust loss function was devised for the deep learning algorithms to preserve tissue structure. Our virtual staining neural network models generalized to accurately stain previously unseen images acquired from patients and tumor grades not represented in the training data. Neural activation maps generated in response to various tumors and tissue types provided the first instance of explainability into the mechanisms used by deep learning models for virtual H&E staining and destaining. Image-processing analytics and statistical testing were used to benchmark the quality of the generated images. Finally, we evaluated computationally stained images with multiple pathologists for clinical evaluation of prostate tumor diagnoses. (Project link)
  • In collaboration with Beth Israel Deaconess Medical Center in Boston, MA, we investigated the use of dark-field imaging of the capillary bed under the tongue of consenting patients in emergency rooms for diagnosing sepsis (a bloodborne bacterial infection). We reported, for the first time, a neural network capable of distinguishing between images from non-septic and septic patients with more than 90% accuracy. This approach can rapidly stratify patients, support rational use of antibiotics, reduce disease burden in hospital emergency rooms, and help combat antimicrobial resistance. (Project link)
  • We successfully predicted signatures associated with fluorescent porphyrin biomarkers (linked with tumors and periodontal diseases) from standard white-light photographs of the mouth, reducing the need for fluorescent imaging. (Project link)
  • We have also communicated research studies reporting automated segmentation of oral diseases from standard photographs by neural networks, and correlations with systemic health conditions, such as optic nerve abnormalities, for personalized patient risk scores. (Project link, Project link)
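
The exact form of the structure-preserving loss mentioned above is not specified in this summary. As an illustration only, one common approach for image-to-image translation tasks such as virtual staining combines a per-pixel L1 term with a structural-similarity (SSIM) term; the function names and the weighting below are hypothetical, a minimal sketch rather than the published method:

```python
import numpy as np

def ssim(a, b, c1=0.01**2, c2=0.03**2):
    """Global SSIM between two images with values scaled to [0, 1]."""
    mu_a, mu_b = a.mean(), b.mean()
    var_a, var_b = a.var(), b.var()
    cov = ((a - mu_a) * (b - mu_b)).mean()
    return ((2 * mu_a * mu_b + c1) * (2 * cov + c2)) / (
        (mu_a**2 + mu_b**2 + c1) * (var_a + var_b + c2)
    )

def staining_loss(generated, target, alpha=0.85):
    """Composite loss: alpha * (1 - SSIM) penalizes structural distortion,
    (1 - alpha) * L1 penalizes per-pixel color error."""
    l1 = np.abs(generated - target).mean()
    return alpha * (1.0 - ssim(generated, target)) + (1.0 - alpha) * l1
```

In this sketch the SSIM term rewards preserving local tissue structure (means, variances, and covariance of intensities) while the L1 term keeps the generated stain colors close to the target; a practical implementation would compute SSIM over sliding windows and per color channel.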