[dbaust] Sven Topp President of ADBC presentation about Haptics

  • From: "Trudy Ryall" <trudy.ryall@xxxxxxxxxxx>
  • To: <"Undisclosed-Recipient:;"@freelists.org>
  • Date: Tue, 10 Jun 2014 21:40:34 +1000

Hello all,
Please find below (and attached) the presentation on Haptics given by Sven Topp, 
President of ADBC, at the National Deafblind Conference in Sydney last weekend.

Monday 9th June 
HAPTICS : Technology and Touch
Overview
- What is Haptics?
- Haptic Communication
- Harnessing Haptics
- Technology and Design
- Where's the Button?
- Haptic Revolution
- Touching the Future


What is Haptics?
While Haptics has been studied for some time, with some of the earliest recorded 
research occurring in the early 1900s, it remains a relatively new topic area 
that is highly multidisciplinary. Wikipedia describes it as "any form of 
non-verbal communication involving the sense of touch", and Dictionary.com 
simply as "to touch". Haptics forms part of our everyday life and is part of the 
current multi-modal concept of communication, which also includes aural and 
visual cues. It is particularly common in environments where two or more people 
share the same physical space.

The way in which we communicate via touch often depends on culture. For 
example, a traditional Western-style greeting is the handshake (particularly 
between males). Job interviewers still rely heavily on this basic contact to 
determine overall factors about a job candidate, such as confidence.


Haptic Communication
The Deafblind Community has relied heavily on Haptics as part of its formal 
communication strategy. While the severity of Deafblindness varies greatly, and 
communication methods vary with it, there are many formal methods that are 
almost entirely based on interpersonal touch (touch occurring between two 
individuals). Touch is also used by Deafblind individuals to explore their 
environment and to receive either direct or indirect transfer of other 
communication modalities (eg speech -> Tactile).

This heavy, daily reliance on Haptics makes the community an expert in an 
emerging field that still has a long way to go towards fully understanding the 
extent of Haptic capabilities.

Some examples of formal communication methods used within the Deafblind 
Community include Deafblind Tactile Fingerspelling (Australia and UK), Tactile 
Auslan or Hand Over Hand (Australia), Braille (internationally), LORM (Germany) 
and Social Haptics (Europe). While these formal methodologies exist, it is also 
important to keep in mind that Haptic communication is part of everyday life 
and occurs regardless of formal methods, age, culture and other influencing 
factors.


Harnessing Haptics
The human body is hard-wired for Haptic reception, but this wiring is not 
evenly distributed: Haptic resolution varies greatly between surface areas of 
the body. While the back is relatively low resolution, the hands, and 
particularly the fingertips, are extremely high resolution. Ever wonder why 
babies constantly put things in their mouths? The human mouth is one of the 
most sensitive locations, but not to worry, we're not about to start putting 
electrodes in there.....

There are two important things to remember about the type of touch our body is 
primarily geared to receive: rate of change (ie movement) and differentiation. 
This is much the same way our eyesight works (we are particularly sensitive to 
motion and to large differences/high contrast).

So what can we feel? There are lots of different things we can sense through 
our touch receptors:
Temperature - Is it hot or cold?
Moisture - Is it dry or wet?
Sound/Vibration - What is the amplitude and frequency?
Shape - Is it rounded, flat or pointy?
Size - How large is the object?
Orientation - Which way is the object facing?
Weight - Is it light or heavy?
Tension - Is it loose or highly strung?

An interesting factor to note is that if we place two temperature-based inputs 
close enough together, our skin averages out the temperature rather than 
treating them as discrete inputs (eg hot + cold equals warm). Over the years 
there have been devices and techniques that attempted to directly transfer 
speech into touch (Tadoma, Vocoder, Tactaid), but these have seen little 
mainstream uptake. One possible reason is that the important frequencies at 
which speech occurs are often outside the range of frequencies the human skin 
is attuned to (generally thought to be 30 - 300 Hz).
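As a rough illustration of this mismatch, the sketch below (Python) checks which components of a speech-like signal fall inside the commonly cited 30-300 Hz vibrotactile band. The frequency values are illustrative round numbers, not data from the presentation.

```python
# Sketch: which speech-like frequency components fall inside the band
# the skin is generally thought to detect (roughly 30-300 Hz)?
# The component frequencies below are illustrative, hypothetical values.

SKIN_BAND = (30, 300)  # Hz, approximate vibrotactile sensitivity range

def tactile_perceptible(freq_hz, band=SKIN_BAND):
    """Return True if a frequency falls inside the skin's sensitive band."""
    low, high = band
    return low <= freq_hz <= high

# Rough, typical values (Hz): voice pitch sits low, vowel formants sit high.
speech_components = {
    "male pitch (F0)": 120,
    "female pitch (F0)": 210,
    "first formant (F1)": 700,
    "second formant (F2)": 1700,
}

for name, f in speech_components.items():
    status = "inside" if tactile_perceptible(f) else "outside"
    print(f"{name}: {f} Hz is {status} the 30-300 Hz skin band")
```

Under these assumed values, the pitch of a voice would be feelable but the formants that distinguish vowels would not, which is one way to read the poor uptake of direct speech-to-touch devices.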

But we are also capable of detecting and determining compound differences. 
Texture is a very good example of this (as is Emotional intent). Texture 
comprises many variables including Temperature, pattern (uniform or irregular), 
friction (rough or smooth), hardness and many other factors.


Technology and Design
Over the years, technology has changed greatly, and the focus of the design 
phase has changed with it. Haptics has become an important part of the design 
process, but it's less about Haptics being a trade-off against technological 
requirements and more about actively responding to the user in a Haptic 
modality. Three factors shaped older technological design: form equals 
function, technological constraints and material constraints.

As an example, let's take the old style Rotary Dial Phone.
In this case, the form-equals-function and technology constraints are fairly 
similar and intertwined. For example, we need a handset to ensure that the 
microphone and speaker are close to the mouth and ear. The rotary dial 
mechanism is required to generate the correct dialling pulses. Those familiar 
with older-style rotary phones might remember the inbuilt Haptic components of 
the rotary dial (it would stop when you reached the end). Materials in use at 
the time, such as timber and brass, were commonly available and easily worked 
with. They had an impact on how the phone felt to use as well.

What about today's technology, though? These days we rely heavily on 
multi-functional devices, so we've almost completely lost the rule of form 
equals function. Technological constraints now work somewhat in reverse, being 
determined more by the form factor (the predetermined size and weight 
constraints) of the device. Materials constraints are still somewhat present 
but are less restrictive, given access to a greater range of easily 
manipulated materials.

Let's take a look at a tablet as an example of a modern technological device.
Except for processor constraints, neither form equals function nor technology 
constraints are really at play. Design has almost entirely become a question 
of "What size do we want it to be?", followed by cramming in as much 
technology as we possibly can (eg high-resolution touch screen, speakers, 
microphone, accelerometer, gyroscope, GPS etc). Here, materials are the more 
interesting aspect of design for a Haptics enthusiast. This is especially true 
of current ranges, where it's all about how the user perceives the device when 
they touch it. Good examples of materials use are the Galaxy NotePro devices, 
which come with an artificial leather back designed to provide the sensation 
of high-quality materials while providing high levels of friction for non-slip 
use. Apple also sticks with its aluminium Unibody design, which is known to 
produce a sensation of good quality.

But humans are still slaves to preconceptions of quality in terms of Haptics. 
This is evident in DVD/Blu-ray player manufacturers placing weights in their 
machines to generate the sensation of quality components (good AV equipment 
was often heavier). Our perception is also linked to other inputs such as 
vision, and to our preconceptions of how an object should behave (eg a washing 
machine dial that does not "click" on each setting may cause a sensory 
mismatch).


Where's the button?
In 2007 we saw a fundamental shift in the popularity of touch screen devices 
with the release of Apple's original iPhone. While not technologically 
spectacular (it lacked MMS and was often criticised for very poor hardware 
components), it did bring the concept of the touch screen to the masses. Other 
touch screen devices were already available (HTC, Palm etc) but hadn't enjoyed 
such levels of popularity.

With the popularity of such a flexible device, we also saw a change in how the 
disability technology sector responded. No longer did you need a special 
device for video magnification, document scanning and OCR, book reading and 
many other tasks. You simply needed the right App, all on one device that fit 
in your pocket. But smartphones still tend to remain a bit of a "Jack of all 
trades" device, with problems such as having no stand for long use of a 
magnifying App, a camera that is not ideal for close focal points, and a 
microphone not well suited to speech recognition in noisy environments 
(although they are improving).

The good part about the touch screen device was that it offered much greater 
portability: a (hopefully) simple way to access a lot of input and output 
functionality, because only the relevant controls need to be rendered on the 
screen at any given time (eg a play button to start music). Responsiveness has 
also improved with computing capability, which has allowed us to start using 
much more complicated functionality (eg speech -> text). All things 
considered, the touch screen has offered the disability community a level of 
independence it has not previously enjoyed.

But as always, the good comes with some bad. Particularly relevant to people 
with a vision impairment is that touch screens are still relatively vision 
intensive (braille displays can be linked, but this detracts from 
portability). Sometimes there are issues with glare (in outdoor environments), 
and App developers vary wildly in how they render common functions and design 
strategies (there is no standard for software design and interface layout). 
Furthermore, current operating system trends rely heavily on text and are 
still not particularly finger-friendly.

So are touch screen devices better or worse?
It depends entirely on the application of the touch screen and how its 
interactivity is implemented. A good example is turning the page of your book 
by swiping the screen. But touch screens are not always appropriate for the 
use and environment at which they're aimed. Do you really want to be squinting 
at your car's climate control system on a glare-prone touch screen while 
trying to drive? Not likely...


Haptic Revolution
So your phone vibrates, as do most tablets now. That's only the very tip of 
the iceberg, though, and is accomplished with a very simple DC motor placed in 
direct contact with your device's outer edge (often able to change only the 
duration and frequency of the vibration).

Active Haptic feedback is becoming more and more of a possibility with 
investment from large companies (it's the next frontier). Size and expense 
constraints are shrinking, and there are plenty of developments on the 
horizon. Some that relate to touch screen devices include applying active 
force to a naked finger, actively varying the screen height (ie raising and 
lowering parts of the screen), and phone bumpers with multi-actuator feedback 
to provide haptic GPS directions, among many others.

But let's get back to Haptic communication.....
Several Deafblind-related gloves are starting to reach the introduction stage. 
One of these is the LORM Data glove, so let's have a look at what it can do. 
The LORM glove converts text messages on a mobile phone into the LORM alphabet 
(and vice versa), using a series of vibrotactile actuators to generate 
vibrations and sensors to detect active touch on the glove. It's important to 
note here that the LORM alphabet is generated by a series of taps and swipes 
on the fingers and palm of the hand.
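The text-to-tactile side of such a glove can be sketched as a simple lookup from letters to (gesture, location) pairs. The letter map below is a small hypothetical subset for illustration only; it is not the real LORM alphabet layout.

```python
# Sketch of a LORM-style text -> tactile-gesture pipeline.
# The letter map is a hypothetical subset, NOT the real LORM alphabet.

LORM_MAP = {
    "a": ("tap", "thumb tip"),
    "e": ("tap", "index tip"),
    "i": ("tap", "middle tip"),
    "o": ("tap", "ring tip"),
    "u": ("tap", "little tip"),
    "d": ("swipe", "down the middle finger"),
}

def text_to_gestures(message):
    """Convert a text message into an ordered sequence of
    (gesture, location) pairs for the glove's actuators to play.
    Characters not in the map are skipped in this sketch."""
    return [LORM_MAP[ch] for ch in message.lower() if ch in LORM_MAP]

print(text_to_gestures("audio"))
```

A real glove driver would then schedule each pair onto the vibrotactile actuators; the reverse direction (sensing taps on the glove and producing text) would invert the same mapping.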

This is a great step in the right direction, but there are limitations and 
possible problems. One key limitation is that the actuators used, while small, 
are only either on or off, which produces a very "flat" communication method 
without emotional content. It's also very directed and single purpose, and 
because it's in contact with the hand constantly it may suffer from 
desensitisation over time (not to mention wearing out of materials through 
continual removal and contact).


Touching the Future
This is where technological advancements such as the Haptuator Mark II 
(TactileLabs) come into play. Relatively cheap, small and easy to control, the 
Haptuator Mark II has far greater flexibility. It's not just a matter of being 
on or off, but of controlling the actuator's amplitude, frequency and duration 
with a quick response time.

With actuators along these lines we can generate different types of "contact", 
not just the general tap/swipe that we are used to. Some sensations we can 
achieve are the tap (high frequency, high amplitude, short duration), rotation 
(a propeller-like noise) and directional blunt motion (white noise; a quick 
swipe can be generated simply by activating two actuators that are fairly far 
apart in quick succession).
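The tap and two-actuator swipe described above can be sketched as a pulse schedule. No real driver API is assumed here: the code only computes the list of (actuator, timing, frequency, amplitude) pulses that a controller would play back, with the parameter values chosen as plausible placeholders.

```python
# Sketch: composing a haptic "swipe" from two multi-parameter actuators.
# Parameter values are illustrative placeholders, not measured settings.

from dataclasses import dataclass

@dataclass
class Pulse:
    actuator: int      # which actuator to drive
    start_ms: float    # when the pulse begins
    freq_hz: float     # vibration frequency
    amplitude: float   # 0.0 - 1.0 drive level
    duration_ms: float

def tap(actuator, start_ms=0.0):
    """Tap sensation: high frequency, high amplitude, short duration."""
    return [Pulse(actuator, start_ms,
                  freq_hz=250.0, amplitude=1.0, duration_ms=40.0)]

def swipe(from_actuator, to_actuator, gap_ms=60.0):
    """Directional blunt motion: fire two spatially separated actuators
    in quick succession so the skin perceives motion between them."""
    return tap(from_actuator, 0.0) + tap(to_actuator, gap_ms)

for pulse in swipe(from_actuator=0, to_actuator=3):
    print(pulse)
```

Because each Pulse carries amplitude and frequency as well as timing, the same schedule format could in principle also encode the "softer" emotional shading that simple on/off actuators cannot.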

This allows us to think about much more complicated and intricate forms of 
communication. We can even look at transferring emotions, music and visual 
content.

Soon, the Deafblind community could be making mobile calls using Haptics 
technology.



Trudy Ryall
E: trudy.ryall@xxxxxxxxxxx
M: 0401324381

Attachment: Monday 9th June.dot
Description: MS-Word document
