By Tiffany Fox
San Diego, Calif., July 28, 2014 — Scholars from around the world convened at the University of California, San Diego’s Qualcomm Institute earlier this month for the sixth International Symposium on Gesture Studies – a misnomer of sorts, said UC San Diego Dean of Social Sciences Jeffrey L. Elman in his remarks, because “many of the talks address issues that go far beyond gesture alone.”
Human gesture, which is used in place of or in conjunction with speech, is a rich field of study among disciplines in the social sciences such as education, communications and psychology. Yet as the world becomes increasingly automated – and controlled through human gestures like pointing, tapping and beckoning – cognitive science, computer engineering and even healthcare are becoming inextricably linked as researchers try to understand and improve the ways humans and machines communicate.
One of the plenary speakers for the event was Microsoft’s Andy Wilson, who presented keynote remarks on “Interacting in Spatial Augmented Reality.” Wilson is known for his work developing the Kinect gaming system and other technologies that rely heavily on gesture to facilitate human-computer interaction.
Wilson emphasized the importance he and his team place on “taking inspiration from the real world” as they develop new technologies (which, in his case, includes one of the first interactive tabletops).
“We want people to interact with this technology in a way you would with real world objects. In the real world, if you have a photo, how do you stretch that photo and not tear it? Today these gestures are part of the common vocabulary of interaction.”
“It’s really stunning,” added Elman, who is also a professor of Cognitive Science at UC San Diego. “The talks are on subjects ranging from animals to robots, co-speech gesture to pantomime, on-the-spot innovation of gestural strings to community sign languages, and talks on the timing and spatial extent of gesture, from psychophysics to higher-order cognition.
“Many of the talks,” he added, “really deal with phenomena that are deeply related but often studied in isolation. Whether it’s connecting language acquisition with language evolution, body and action with thought or icons with symbols, gesture affords a very rich avenue for insights and understanding that might not otherwise be possible.”
Several presenters at ISGS described the role of gesture studies in education, and how a better understanding of gesture can improve communication and learning. In her plenary talk, “Space and time on our hands,” Susan Wagner Cook, an assistant professor of Psychology at the University of Iowa, tackled one of the basic questions at the heart of gesture studies: “Why is it that when people talk, they gesture?” Cook noted that gestures display an “exquisite temporal synchrony with the accompanying speech” and acknowledged the prevailing belief that “the primary function of gesture is to communicate information.”
“I believe that gesture does more than simply communicate information,” she remarked. “There’s a direct effect of gesture on learning that is separate from communicating in the moment.”
Cook’s conclusion has implications for the field of online and computer-generated education as well. She shared data from a study that used a computer-generated avatar to teach math skills. When the avatar’s eye gaze, body position and prosody were held constant and gesture was either used or not used in tandem with speech, the findings showed that gestures can influence conceptual learning.
“Clearly, gesture helps students understand mathematics,” Cook concluded. “Students who learn (from an avatar who gestures) transfer (those skills) and generalize more than students who don’t. Gesture makes students solve problems more quickly; it influences how learning unfolds over time; and gesture supports learning and memory.”
UC San Diego Interim Vice Chancellor for Equity, Diversity and Inclusion Carol Padden noted that such applications are among the most important for gesture studies.
“Other researchers studying gesture,” added Padden, “showed that some gestures are tied to communicating about specific features, while other gestures refer to abstract concepts. The latter type of gestures seem to play a role in expansive learning, or learning across domains.”
Padden, who is also a professor of Communication, noted that there are many prominent gesture and sign language researchers at UC San Diego, the Salk Institute and San Diego State University. The event drew more than 200 people to the Qualcomm Institute, which is home to a robotics lab and several research teams actively involved in computer-generated education and human-computer interaction (some using the Kinect gaming system).
“This was a major attraction for conference attendees, to interact with UC San Diego’s excellent language, cognitive science and communication faculty in this area,” said Padden, “and to also meet with faculty and researchers from neighboring research institutes.”
Tiffany Fox, (858) 246-0353, email@example.com