
Children with autism often have sensory processing disorders that impair their awareness of their own body and movements. As a result, they frequently exhibit “atypical” body interactions, used as “compensatory movements” that help them gain some sense of agency when interacting with the world. In this project, we explore how tangible computing could offer children with autism the body awareness they need during sensory therapies. We hypothesize that an “interactive multisensory environment” that uses its users’ body movements as implicit input to adapt itself could provide the engagement children with autism need during multisensory therapies and give them feedback about their body and movements. Such an environment could also automatically record the information clinicians need.

Building on our initial literature review, we conducted a qualitative study to inform the design of SensoryPaint and uncover the types of interactions children with autism experience during sensory therapies.

We found that sensory therapies involve a combination of: the MSE (Figure 1, left), used to expose children with autism to different visual and auditory stimuli; balls and toys, used to teach children with autism to tolerate a variety of textures and fabrics (Figure 1, middle); and a mirror, used to give children with autism “biofeedback”, a process of gaining body awareness through the sensory channels (i.e., sight, hearing, taste, touch, and smell; Figure 1, right).

Fig. 1. The sensory therapy ecosystem. A controlled multisensory environment (MSE), or “snoezelen”, uses fiber ropes, multimedia projections, and music as sensory stimuli (left). Toys and objects with different characteristics and textures enable children with autism to work on their hyper- and hypo-sensitivities (middle). A mirror provides children with autism with feedback about their body and movements (right).

Following an iterative user-centered design methodology, we used the results of the qualitative study to iteratively design several low-fidelity prototypes that exploit the interaction experience of children with autism with the multisensory environment. The low-fidelity prototypes were discussed during several participatory design sessions that helped us select the most appropriate prototype, balancing interactivity and “biofeedback”.

We envisioned an interactive multisensory environment resembling a “virtual interactive cave” that enables children to paint on the walls using different kinds of brushes, and to use their body to control the instrumental music playing within the cave and the intensity of the color used to paint on the wall (Figure 2). Brushes are balls wearing “skins” of different textures (e.g., rice, plastic). To interact with the interactive multisensory environment, the child can: (a) move the ball to draw its trajectory on the wall while an instrumental sound plays (Figure 2, left); (b) throw or kick the ball to create “painting effects” in the form of “splashes” drawn at the contact point (Figure 2, middle); and (c) move different parts of his or her body to adjust the instrumental music being played (Figure 2, right).

Fig. 2. An interactive multisensory environment. A child with autism using balls to draw shapes on the wall (left) and throwing balls towards the wall (middle). A child with autism adjusting the instrumental music being played within the cave (right).
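To make the three interactions above concrete, the sketch below shows one way a tracked ball sample could be mapped to a painted trajectory, a “splash”, or a music adjustment. This is a minimal illustration, not the project’s actual code; the class names, the speed threshold, and the arm-height mapping are all illustrative assumptions.

```python
# Minimal sketch (not SensoryPaint's actual code) of how a tracked ball
# position could drive the three interactions described above.
# All names and thresholds here are illustrative assumptions.

from dataclasses import dataclass, field

SPLASH_SPEED = 1.5  # assumed speed threshold for treating a contact as a throw/kick "splash"

@dataclass
class PaintCanvas:
    stroke: list = field(default_factory=list)    # (x, y) points of the drawn trajectory
    splashes: list = field(default_factory=list)  # (x, y) contact points of "splashes"

    def update(self, x: float, y: float, speed: float, touching_wall: bool) -> None:
        """Map one tracked ball sample to a painting action."""
        if touching_wall and speed > SPLASH_SPEED:
            # (b) a fast contact is rendered as a "splash" at the contact point
            self.splashes.append((x, y))
        else:
            # (a) slower movement extends the painted trajectory of the ball
            self.stroke.append((x, y))

def music_volume(arm_height: float) -> float:
    """(c) Illustrative body-to-music mapping: raising the arm raises the volume."""
    return max(0.0, min(1.0, arm_height))  # arm_height assumed normalized to [0, 1]

# Example: a slow drag followed by a fast hit against the wall.
canvas = PaintCanvas()
canvas.update(0.2, 0.5, speed=0.3, touching_wall=False)  # draws
canvas.update(0.6, 0.5, speed=2.0, touching_wall=True)   # splashes
print(canvas.stroke, canvas.splashes, music_volume(0.8))
```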

We implemented the envisioned SensoryPaint system using a Kinect camera (see video). We used OpenCV to recognize objects, their colors, and their shapes; OpenGL for the 3D rendering displayed with a standard multimedia projector; and OpenAL to manage multichannel audio. We are currently finishing the implementation and conducting the deployment study.
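As a rough illustration of the kind of color-based object recognition OpenCV enables here, the sketch below detects a colored ball in a video stream and marks its position. It assumes an ordinary webcam and OpenCV’s Python bindings rather than the project’s Kinect/C++ pipeline, and the HSV color range is an invented calibration for a red ball.

```python
# Minimal sketch, assuming a webcam and OpenCV 4.x Python bindings, of
# color-based ball detection; not the project's actual implementation.
import cv2
import numpy as np

LOWER_HSV = np.array([0, 120, 70])    # assumed lower HSV bound for a red "brush" ball
UPPER_HSV = np.array([10, 255, 255])  # assumed upper HSV bound

cap = cv2.VideoCapture(0)  # the real system uses a Kinect camera instead
while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER_HSV, UPPER_HSV)  # keep only ball-colored pixels
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        c = max(contours, key=cv2.contourArea)      # largest blob is taken as the ball
        (x, y), radius = cv2.minEnclosingCircle(c)
        if radius > 10:                             # ignore tiny noise blobs
            cv2.circle(frame, (int(x), int(y)), int(radius), (0, 255, 0), 2)
    cv2.imshow("ball tracking sketch", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```

In the deployed system, the detected ball position would feed the painting logic (trajectory or splash) and the skeleton data from the Kinect would drive the audio mapping, with OpenGL handling the projected rendering and OpenAL the multichannel output.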

In collaboration with Gillian R. Hayes (UC Irvine)

Project participants

Publications related to the project

  • Ringland, K., Zalapa, R., Neal, M., Escobedo, L., Tentori, M., Hayes, G.R. (2014) SensoryPaint: A Multimodal Sensory Intervention for Children with Neurodevelopmental Disorders. Ubicomp 2014, Seattle, WA, USA, September 13-17 (Honourable Mention)

  • Ringland, K., Zalapa, R., Hayes, G. R., Tentori, M. (2014) SensoryPaint: An Interactive Surface Supporting Sensory Integration in Children with Neurodevelopmental Disorders, IMFAR 2014, Atlanta, Georgia, May 14-17, 2014

Media coverage

“CICESE develops technology for autistic children: specialized software, video games, special systems” — Gaceta CICESE, Ensenada.net