AI & Health

UBC is home to many researchers who contribute to the health of British Columbia and to global health. Artificial intelligence is a major part of the evolving health field, and its importance continues to grow. Below are other AI & health groups, both at UBC and at external organizations, that feature UBC faculty, many of which include CAIDA members.

Group | Abbreviation | Internal/External
Biomedical Imaging and Artificial Intelligence | BMI-AI | UBC
Bionics Network | — | UBC
Centre for Brain Health | — | UBC
Canada's Digital Technology Supercluster | Digital Supercluster | Canada; BC-based
Vancouver Coastal Health Research Institute | VCHRI | VCH and UBC
Data Science and Health | DASH | UBC

Representative Projects: 

CAIDA members have contributed to a multitude of health-related projects.  Below you can read about some of our featured projects.  You can also check out our news page for more CAIDA highlights, or you can follow us on Twitter.

Intelligent Network
Point of Care Ultrasound

Dr. Robert Rohling, 
Dr. Purang Abolmaesumi,
Dr. Teresa Tsang,
Dr. Oron Frenkel

UBC researchers and CAIDA members Dr. Robert Rohling, Dr. Purang Abolmaesumi, and Dr. Teresa Tsang, alongside Dr. Oron Frenkel, have integrated portable, handheld ultrasound scanners with artificial intelligence (AI) to accelerate COVID-19 diagnosis.

Their project, Intelligent Network Point of Care Ultrasound (IN-PoCUS), is funded by B.C.'s Digital Technology Supercluster and is currently focused on rural communities in B.C., with 50 units being sent to rural acute care sites and 30 units to urban acute care sites.

The system uses a network of lung images and an AI algorithm to detect changes in the lungs associated with COVID-19. This allows diagnosis to occur almost instantly and vulnerable cases to be identified early, and the approach also has the potential to be applied to areas beyond COVID-19 in the future.


Non-Invasive Classification
of Alzheimer’s Disease
Using Eye Tracking and Language 

Oswald Barral, 
Hyeju Jang,
Sally Newton-Mason, 
Sheetal Shajan, 
Thomas Soroski, 
Giuseppe Carenini, 
Cristina Conati, 
Thalia Field

Alzheimer’s disease (AD) is an insidious, progressive neurodegenerative disease resulting in impaired cognition, dementia, and eventual death. Early detection of AD is important: the sooner the disease is diagnosed, the more can be done to alleviate and delay its impact on the patient's life. There has therefore been increasing interest in diagnostic tools that are non-invasive, efficient, and cost-effective, leveraging machine learning to detect patterns resulting from the decline that occurs at the earliest stages of the disease across multiple cognitive domains, including speech and eye movements.

While related work has investigated AD classification using speech collected during spontaneous speech tasks, in this project we study the utility of eye movements, and their combination with speech, for this classification task. Our results thus far are based on speech and eye-movement data collected from 68 memory clinic patients (with a diagnosis of AD, mixed dementia, mild cognitive impairment, or subjective memory complaints) and 73 healthy volunteers completing a standard, short picture description task known as the Cookie Theft task.

These results show that eye-tracking data is predictive of AD in a patient-versus-control classification task and that it is complementary to speech alone: combining both modalities yields the best classification performance. Our results suggest that eye tracking is a useful modality for classification of AD, most promising when considered as an additional non-invasive modality alongside speech-based classification. We are currently investigating how to further improve our classifiers by collecting additional data, so as to have corpora large enough to explore deep learning methods in addition to the standard machine learning algorithms investigated thus far.
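To illustrate the idea of combining modalities at the feature level, the sketch below trains a classifier on simulated eye-tracking and speech summary features and compares each modality alone against their concatenation. This is a hedged illustration only: the feature definitions, the synthetic data, and the choice of logistic regression are assumptions for the sketch, not the study's actual features, data, or models.

```python
# Illustrative sketch of feature-level (early) fusion for a patient-vs-control
# classifier. All features here are simulated; they are NOT the study's data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 141  # mirrors the cohort size in the text: 68 patients + 73 controls
y = np.array([1] * 68 + [0] * 73)  # 1 = patient, 0 = healthy control

# Hypothetical per-participant summary features (e.g., fixation statistics for
# eye tracking, pause/rate statistics for speech), simulated with a small
# class separation so the classifier has signal to find.
eye = rng.normal(loc=y[:, None] * 0.8, scale=1.0, size=(n, 5))
speech = rng.normal(loc=y[:, None] * 0.8, scale=1.0, size=(n, 5))

def cv_auc(X, y):
    """Cross-validated ROC AUC for a standardized logistic regression."""
    clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    return cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean()

auc_speech = cv_auc(speech, y)
auc_eye = cv_auc(eye, y)
# Simple fusion: concatenate the two feature blocks before classification.
auc_combined = cv_auc(np.hstack([eye, speech]), y)
print(f"speech: {auc_speech:.2f}  eye: {auc_eye:.2f}  combined: {auc_combined:.2f}")
```

With informative features in both blocks, the concatenated representation typically matches or exceeds either modality alone, which is the intuition behind reporting that combining speech and eye tracking yields the best performance.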