Keynote Speakers

New ways to design haptic user interfaces
Professor Stephen Brewster, University of Glasgow, UK

Abstract. I have been working in the area of haptics for a long time and am constantly frustrated by the limitations of the technology when trying to design interesting user interfaces. Force-feedback devices always feel spongy, and vibration motors just feel like … vibration motors. One reason for this is that there is a mismatch between these devices and human capabilities. In this talk I will think about haptics in a new way, look at different types of devices that are a much better match to us as humans, and show how we might design better user interfaces using them.
Bio. Stephen Brewster is a Professor of Human-Computer Interaction in the School of Computing Science at the University of Glasgow, UK. He runs the internationally leading Multimodal Interaction Group. His research focuses on multimodal HCI: using multiple sensory modalities and control mechanisms (particularly hearing, touch and gesture) to create a rich, natural interaction between human and computer. His work has a strong experimental focus, applying perceptual research to practical situations. A long-term theme has been haptics, starting with force feedback and more recently tactile displays. He has authored over 350 papers and is a member of the CHI Academy.

Brain circuitry representations of haptic information
Dr Henrik Jörntell, Lund University, Sweden

Abstract. The aim of this presentation is to explore novel views of how brain circuitry can use the biomechanical properties of the skin and the properties of its sensors to abstract and represent haptic information arising in skin-object interactions. Using the interpretational framework of the haptic input features as defined by the laws of contact mechanics, the brain’s first-order processing of tactile skin sensor information, which is performed in the cuneate nucleus of the brainstem, reveals major surprises that contrast with the established view of tactile processing in the brain. Rather than being focused on preserving the response properties of individual peripheral sensors, the neurons of the cuneate nucleus seem to respond to specific combinations of sensors that form selective projections within the sensory space defined by the haptic input features. By integrating information from different combinations of skin sensors, individual cuneate neurons, even when activated from the same part of the skin, form distinct projections in haptic input space. The findings have potentially major implications for current views on the organization of the brain’s processing and representation of haptic information, which are also discussed.
Bio. Dr. Henrik Jörntell has been working at the Neurophysiology Section of the Department of Medical Science of Lund University since 1991. He received his PhD degree in Systems Neurophysiology from Lund University in 1997. He spent two postdoc years at the University of Copenhagen (1998-1999) learning the patch-clamp technique, which he subsequently introduced into the in vivo experimental setup in Lund from 2000. He was appointed associate professor in Neurophysiology at Lund University in 2007. His expertise is in systems neuroscience, in vivo cellular and synaptic physiology, the cerebellum, motor cortex, spinal circuitry, motor control and sensory processing.

The Essential Interface Between Science and Technology
Professor Blake Hannaford, University of Washington, USA

Abstract. Haptic technology and the community of researchers engaged in haptics have grown tremendously in the last 20-30 years. The challenge of developing technology that can exploit the full range of human haptic capabilities is huge, and the “Holodeck” with full haptics is still a far-off vision. This talk will present a non-comprehensive overview of haptics science and technology, illustrated with examples, that aims to convey the breadth and challenge of this vision. In some sense, haptic devices started out as experimental apparatus for psychophysics experimentation, yet significant insights into haptic perception can be gained through the manipulation of simple objects. Admittance- and impedance-type haptic devices introduce a simulated link between the physical and virtual worlds, along with significant computational challenges. Finally, the expansion of teleoperation into surgery is driving renewed interest in the still-challenging task of practical force-reflecting teleoperation.
Bio. Professor Blake Hannaford received the B.S. degree in Engineering and Applied Science from Yale University in 1977, and the M.S. and Ph.D. degrees in Electrical Engineering from the University of California, Berkeley, in 1982 and 1985 respectively. Before graduate study, he held engineering positions in digital hardware and software design, office automation, and medical image processing. At Berkeley he pursued thesis research on multiple-target tracking in medical images and the control of time-optimal voluntary human movement. From 1986 to 1989 he worked on the remote control of robot manipulators in the Man-Machine Systems Group in the Automated Systems Section of the NASA Jet Propulsion Laboratory, Caltech, and supervised that group from 1988 to 1989. Since September 1989, he has been at the University of Washington in Seattle. He was awarded the National Science Foundation’s Presidential Young Investigator Award and the Early Career Achievement Award from the IEEE Engineering in Medicine and Biology Society. His current research interests include haptic displays on the internet, surgical biomechanics, and biologically based design of robot manipulators.