Exploring contextual multimodal cues as memory aids
We are exploring whether proximity-triggered contextual audio and visual cues can help people with early-stage Alzheimer's recall familiar people and places. In particular, we are using proximity beacons to determine when the user is physically close to another person, such as a loved one. The beacons then trigger cues in the form of:
- audio conveying contextual information such as name, relationship, time/place/details of last interaction;
- images and video (using AR) showing previous interactions along with text displaying contextual information; and
- music in the form of specific songs associated with specific individuals.
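To make the triggering mechanism concrete, the following is a minimal sketch of the beacon-to-cue logic. The profile fields, beacon IDs, and the RSSI proximity threshold are all illustrative assumptions, not part of our actual system.

```python
from dataclasses import dataclass

# Hypothetical person profile; the fields mirror the cue types listed above.
@dataclass
class PersonProfile:
    name: str
    relationship: str
    last_interaction: str  # time/place/details of the last interaction
    song: str              # song associated with this person

# Hypothetical registry mapping beacon IDs to person profiles.
PROFILES = {
    "beacon-001": PersonProfile(
        name="Maria",
        relationship="daughter",
        last_interaction="Sunday lunch at home",
        song="Clair de Lune",
    ),
}

# Assumed signal-strength cutoff for "physically close"; real deployments
# would calibrate this per environment.
RSSI_NEAR_THRESHOLD = -60  # dBm; less negative means closer

def select_cues(beacon_id: str, rssi: int):
    """Return the cue texts to present when a known beacon is in range,
    or None if the beacon is unknown or too far away."""
    if rssi < RSSI_NEAR_THRESHOLD:
        return None  # beacon detected but not close enough to trigger
    profile = PROFILES.get(beacon_id)
    if profile is None:
        return None  # unregistered beacon
    return [
        f"This is {profile.name}, your {profile.relationship}.",
        f"You last saw {profile.name}: {profile.last_interaction}.",
        f"Play song: {profile.song}",
    ]
```

In a real deployment the returned cue texts would feed a text-to-speech engine, an AR overlay, or a music player, depending on the modality under study.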
We’re interested in tackling the following questions:
- Which cue modalities are most effective at improving recognition in people with early-stage Alzheimer's?
- What advantages and challenges does each modality present?