private webpage: 
Autism is associated with impairments of attention, memory, and information processing. Children with autism often find it difficult to remember and identify real objects. Teachers prompt children with autism to redirect their attention to object discrimination training and to reduce the time they spend "off task". Technological interventions that provide interactive visual supports help teachers keep students "on task". However, these tools still lack the realism needed to help students generalize from the classroom to other environments. Thus, a new type of interactive visual support is needed, one capable of augmenting the physical form of a traditional object with digital information. One possible solution to this problem is Augmented Reality (AR), due to its capability to seamlessly superimpose digital information on real objects.
 
In this project, we developed MOBIS, a mobile augmented reality application that enables multi-modal interaction to guide students with autism during object discrimination training. The system uses a vision-based object recognition algorithm to associate visual and verbal prompts with the object being discriminated (i.e., the "object of interest"). MOBIS consists of three interfaces: (1) one running on a tablet that teachers use to set up the therapy and monitor each trial; (2) a second running on a smartphone that a student uses as a "visor" to uncover visual and verbal prompts added on top of physical objects; and (3) a Tangible User Interface (TUI) housing accelerometers that can be attached to the objects being discriminated, detecting students' interaction gestures to facilitate record-keeping for the teacher.
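The TUI's movement detection can be sketched roughly as follows. This is an illustrative sketch only: the class name, threshold, and window size are hypothetical assumptions, not the actual MOBIS parameters or code.

```python
from collections import deque


class GestureDetector:
    """Flags when a training object fitted with an accelerometer is moved.

    Illustrative sketch: a resting object reads roughly 1 g, so a large
    spread in recent acceleration magnitudes suggests the student picked
    the object up. Threshold and window size are hypothetical values.
    """

    def __init__(self, threshold=1.5, window=5):
        self.threshold = threshold          # spread (in g) that counts as movement
        self.window = deque(maxlen=window)  # most recent acceleration magnitudes

    def add_sample(self, ax, ay, az):
        # Store the magnitude of the 3-axis accelerometer reading.
        magnitude = (ax ** 2 + ay ** 2 + az ** 2) ** 0.5
        self.window.append(magnitude)

    def object_moved(self):
        # Not enough samples yet to decide.
        if len(self.window) < self.window.maxlen:
            return False
        return max(self.window) - min(self.window) > self.threshold
```

A detector like this would run per object, letting the system log which object a student touched without the teacher taking notes by hand.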
 Teachers use the tablet to upload captured photographs of relevant objects, associate tags with photos, and associate tagged photos with therapies. To create a new tag, teachers select the object and associate a visual support (e.g., a circle) with an audio or text message. This message is displayed to the student with autism as a prompt superimposed over the detected object. Teachers also use the tablet to configure each activity. First, because the system supports multiple users, the teacher selects the number of students that will participate in a trial. Then the teacher selects the object the student will learn to discriminate; to improve performance, the ANS Tag Search Engine (see 4.3.2) only considers the tagged objects that match the object selected by the teacher. Next, the teacher selects the prompts to be provided to students: visual and audio prompts, vibration, or a combination of the three. The level of prompting depends on the functioning level of the student and should be faded out as the student executes the skill being taught without needing object-initiated prompts. These different forms of visualization support multiple modes of interaction. The teacher then selects the rewards associated with each trial and with the complete activity. Finally, the teacher selects the number of trials per activity and initiates the activity, which activates the ANS Client running on the student's smartphone.
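The setup flow above can be sketched as a small data model. The names (`TaggedPhoto`, `Therapy`, `matching_tags`) and defaults are hypothetical, chosen for illustration; only the filtering rule, that the search engine considers just the tags matching the teacher-selected object, comes from the description above.

```python
from dataclasses import dataclass, field


@dataclass
class TaggedPhoto:
    object_name: str     # the object the photo shows, e.g. "cup"
    visual_support: str  # e.g. "circle"
    message: str         # audio or text prompt shown over the detected object


@dataclass
class Therapy:
    target_object: str   # object the student will learn to discriminate
    num_students: int = 1
    num_trials: int = 1
    # Prompt modes the teacher enabled: visual, audio, vibration, or a mix.
    prompt_modes: list = field(default_factory=lambda: ["visual"])
    rewards_per_trial: int = 1
    photos: list = field(default_factory=list)  # all tagged photos uploaded

    def matching_tags(self):
        # Mirrors the described optimization: only tagged photos matching
        # the teacher-selected object are passed to the tag search engine.
        return [p for p in self.photos if p.object_name == self.target_object]
```

In this sketch, starting an activity would serialize a `Therapy` and send it to the client on the student's smartphone, which then looks up prompts via `matching_tags()`.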
 
 We conducted a 7-week deployment study of MOBIS at Pasitos with 10 low-functioning students with autism and 2 teachers. We evaluated the usability, usefulness, and efficacy of the system in helping students with autism practice their language and cognitive skills. We are currently analyzing the collected data, including interviews, videos, logs, and observation reports. We plan to have the results and a paper on this project by the end of the year.
 

Project participants

 
Lizbeth Escobedo, Ph.D., Post-doctoral scholar (see more about lizbeth …)
e: lescobedo[at]u …

Monica Tentori, Ph.D., Assistant Professor (see more about monica …)
e: mten …
 
In collaboration with Jesus Favela (CICESE) and Eduardo Quintana (CICESE)

Related publications to the project

  • Escobedo, L., Tentori, M., Quintana, E., Favela, J. and Garcia-Rosas, D. (in press) "Integrating the physical and the digital world to increase the attention of children with autism". To appear in IEEE Pervasive Computing

  • Quintana, E., Ibarra, C., Escobedo, L., Tentori, M. and Favela, J. (2012) “Object and gesture recognition to assist children with autism during the discrimination training”, CIARP ’12, Buenos Aires, Argentina

Media coverage

  • "Desarrolla CICESE tecnología para niños autistas" (CICESE develops technology for children with autism: specialized software, video games, special systems), Gaceta CICESE / Ensenada.net