Robotics and Autonomous Systems 74 (2015) 195–220


Tactile sensing in dexterous robot hands — Review


Zhanat Kappassov a,∗, Juan-Antonio Corrales b, Véronique Perdereau a

a Institute of Intelligent Systems and Robotics, University of Pierre and Marie Curie, CC 173 - 4 Place Jussieu, 75005 Paris, France
b Institut Français de Mécanique Avancée, Campus de Clermont-Ferrand les Cézeaux, BP265, 63175 Aubière Cedex, France

highlights
• We present a review of tactile sensing applications in dexterous robot hand manipulation.
• This problem is key to dexterous manipulation, and no up-to-date reviews are available.
• The main types of tactile sensors and their integration with robot hands are discussed.
• An overview of tactile data processing techniques and their applications is presented.

article info

Article history:
Received 9 January 2015
Received in revised form 15 July 2015
Accepted 20 July 2015
Available online 28 July 2015

Keywords:
Tactile sensing
Tactile sensors
Robot hands
Dexterous manipulation
Tactile sensing application
Review

abstract

Tactile sensing is an essential element of autonomous dexterous robot hand manipulation. It provides information about the forces of interaction and surface properties at points of contact between the robot fingers and the objects. Recent advancements in robot tactile sensing have led to the development of many computational techniques that exploit this important sensory channel. This paper reviews the current state of the art of manipulation and grasping applications that involve the artificial sense of touch and discusses the pros and cons of each technique. The main issues of artificial tactile sensing are addressed. General requirements of a tactile sensor are briefly discussed and the main transduction technologies are analyzed. Twenty-eight tactile sensors, each integrated into a robot hand, are classified in accordance with their transduction types and applications. Previously issued reviews focus on the hardware part of tactile sensors, whereas we present an overview of algorithms and tactile feedback-based control systems that exploit signals from the sensors. The applications of these algorithms include grasp stability estimation, tactile object recognition, tactile servoing and force control. Drawing from advancements in tactile sensing technology and taking into consideration its drawbacks, this paper outlines possible new directions of research in dexterous manipulation.

© 2015 Elsevier B.V. All rights reserved.

1. Introduction

Autonomous dexterous manipulation, also known as in-hand object manipulation, is one of the much-desired key skills of industrial and social robots [1]. The development of autonomous dexterous robotic systems is a complex process of an interdisciplinary nature involving such diverse research fields as computer vision, force control, motion planning, grasping, sensor fusion, digital signal processing, human–robot interaction, learning and tactile sensing [2]. In this paper we address the issue of tactile sensing by reviewing the current state-of-the-art tactile sensors and their applications in dexterous robot hands.

During the last decades, industrial robots have replaced humans in heavy, repetitive or/and unsafe manufacturing tasks [3]. The car, consumer electronics, and aerospace industries, to name only a few, have used pre-programmed robotic manipulators equipped with simple two-finger grippers in large-scale production lines. Nevertheless, current manufacturing demands dictate a need for lower-volume assembly of more customizable and variable products, requiring robots with higher adaptability, easy reconfigurability in software and hardware, more flexibility and more manipulation capabilities [4]. This need can be met by replacing grippers with multi-fingered dexterous robot hands that are able to grasp very different objects and even manipulate them with the use of the fingers [1]. Dexterous robot hands are also essential in the new generation of social and service robots which can replace humans in daily routines [5], and provide assistance to the elderly and the disabled.

The incursion of robotics into domestic life presents new challenges to robotic design. Unlike industrial environments, domestic spaces are typically unstructured, which means that perception needs to be added to the robots' control strategies.

Among perception modalities, tactile sensing plays an important role in physical interactions, especially with human beings. Neuroscience has long demonstrated the importance of tactile feedback in human manipulation. Different studies have shown that people with anesthetized fingertips are unable to maintain a stable grasp [6], and children with deficient tactile sensing have difficulties in performing manipulation tasks [7]. Tactile sensors provide robots with information about physical contact, whereby autonomous robot hands can operate in unstructured environments and manipulate unknown objects [8]. At the same time, the availability of sensory information to the robot ensures its safe operation in direct human–robot interaction applications.

In traditional industrial approaches, control of robot end-effectors is achieved by embedding prior knowledge about the articulated object and environment into the control algorithm. Robot hands are thus able to manipulate only known objects and work in a structured environment, which means they are less adaptive to unexpected events. To overcome these limitations, an approach based on active exploration, which relies on data from tactile sensors, can be implemented to let robot hands explore objects and run control actions when unexpected events occur. Only a few approaches use tactile feedback inside autonomous control schemes [8].

Artificial tactile sensors in robotic applications are represented by pressure profile sensing arrays, force-torque sensors, and dynamic tactile sensors [9]. Information acquired from artificial sensing systems can be used for finding contact locations, reconstructing and recognizing object shape, and measuring contact forces and temperature.

Even though tactile sensory information is an essential element in the process of manipulation, technology and research in artificial tactile sensing are not as well developed as other perception modalities [10]. Promising new technological advances in tactile sensors based on micro-electromechanical systems [11] and organic transistors [12] have not yet been applied to robotic devices.

Currently, research is focused on developing new tactile skins, covering robot hands with tactile sensors and investigating new algorithms and approaches for using tactile information in autonomous manipulation. New techniques that use tactile sensing information include object recognition and exploration, grasp stability estimation, force control, tactile servoing and slip detection.

This paper presents a thorough review of the most recent advances in robotic tactile sensing. Previous review articles have mostly concentrated on tactile hardware, dealing with tactile sensing technologies for robot hands [13], for minimally invasive surgery [14], for biomedical applications [15], slip detection in hand prostheses [16], robotic tactile skins [17] and large-area tactile skins [18]. This paper will review the techniques for handling tactile data in robotic manipulation applications, covering approaches and applications of tactile sensors in the control of multi-fingered robotic hands. The paper is organized as follows: tactile sensing technologies are given in Section 2; integration of the sensors with robot hands and tactile data acquisition are reviewed in Section 3; this is followed by a survey of computational techniques that use tactile information to control the robot hands, including grasp stability estimation (Section 4.1), object recognition (Section 4.2), force control (Section 4.4) and tactile servoing (Section 4.3); a summary of the conclusions appears in Section 5.

2. Tactile sensing technologies

Information about interaction properties can be acquired from proprioceptive (intrinsic) sensors, such as joint angle sensors and actuator torque sensors, and from cutaneous (extrinsic) tactile sensors [17]. Even though intrinsic sensors can give approximate information about the interaction force, as shown elsewhere [2], extrinsic tactile sensors give much more precise and multi-modal information about interaction properties [19]. Thus, a tactile sensor can be defined as a tool that can evaluate a given property of an object through physical contact between the hand and the object [20]. When a tactile sensor is represented by an array, each sensing element of the sensor is referred to differently in the robotics literature, e.g. as a sensing cell, taxel or tactel.

Tactile sensors meet the following task-related requirements of in-hand manipulation [10]:

(1) Response. In collision avoidance [21] and human–robot interaction tasks, tactile sensors must provide information about the presence of contact and measure the strength of the contact force, respectively.
(2) Exploration. During exploration, tactile sensors should provide information about: surface properties from measurements of texture, hardness, and temperature [22]; structural properties from shape [23]; and functional properties from detection of contacts and vibrations [24].
(3) Manipulation. In autonomous manipulation tasks, tactile data is used as a control parameter in: slip detection; estimation of grasp stability [25]; contact point estimation and surface normal and curvature measurement [26]; tangential and normal force measurements for achieving stable grasps [27]; and contact force measurements for fingertip force control [28].

Depending on the task, the sensor has different design specifications, which were first determined by Harmon [29]. The basic design criteria for tactile sensors have been previously reported in [17] for humanoid robots, in [15] for biomedical engineering, in [16] for prosthetic hands, and in [18] for manufacturing and large tactile system implementation. In autonomous manipulation applications, tactile sensors meet requirements for object characterization and identification (e.g. they estimate compliance, thermal and textural properties) and for manipulation (e.g. they control the force applied to the object) [19].

The most important design criteria for tactile sensors with application in manipulation tasks are summarized in Table 1 and discussed in the following:

Table 1
Design criteria: pros and cons.

Criteria | Pros | Cons | Application
High spatial resolution | Smaller objects can be recognized and features can be extracted with a higher precision. | Smaller sensitivity and a longer processing time. | Contact pattern recognition, fine manipulation.
High sensitivity | Detection of a rather small change of a contact force. | Dynamic range of the sensor shrinks, spatial resolution decreases. | Light touch detection and fragile object manipulation.
High frequency response | A rather fast response to changes in the level of the contact force. | Spatial resolution and dynamic range decrease. | Detection of slip and texture recognition.
Low hysteresis | High frequency response. | Decrease of the sensor's surface friction and dynamic range. | Detection of slip and texture recognition.
Low number of wire connections | The workspace of robot hands does not change. | Decrease of the frequency response (in case of using serial data communication). | Dexterous manipulation.
High surface friction | Ensuring a stable grasp without applying high forces. | Impedes tactile exploration procedures. Reduces the frequency response of the sensor (in case of using soft paddings). | Grasping.

(1) Requirements on the spatial resolution of a tactile sensing array depend on both the size of the objects to be recognized and the location of the sensor on the robot hand. A rather high spatial resolution is desirable in in-hand object manipulation [30] or tactile servoing [31] tasks, whereas in cases when high sensitivity or high frequency response are desirable, e.g. reactive force control [32], the spatial resolution is limited for the following reasons. A higher spatial resolution unavoidably leads to a longer acquisition time [33], a larger number of wire connections and a stronger sensitivity to external electromagnetic noise. The first two consequences are straightforward: high resolution requires a large number of sensing cells, which in turn causes a longer processing time, and these sensing cells also require more wire connections. The limit of sensitivity is given by the minimum detectable variation of the measured signal: as sensing cells become smaller, the sensitivity to external electro-magnetic noise and crosstalk increases, so the sensitivity degrades because the level of noise can become comparable with the signal. By considering these pros and cons, the requirements on spatial resolution can vary for different parts of a robot hand. It was previously suggested that the resolution on the fingertips should be as high as 1 mm, since the fingertips are mostly involved in fine manipulation [29]. In the current state of the art, fingertip tactile sensors integrated with robot hands have a spatial resolution of around 5 mm [34,35]. On less sensitive parts of a robot hand, like the palm, the spatial resolution decreases up to 5 mm as stated in [17]. Requirements for spatial resolution can be omitted when only slippage is of importance, e.g. automatic grasping using vibrations to achieve a stable grasp [32] and slip detection with a center-of-pressure tactile sensor [36].

(2) Sensitivity of a tactile sensor is given by the smallest detectable variation in pressure/force; a small detectable variation means a high sensitivity. High sensitivity is very important in manipulation tasks with fragile and deformable objects, as in [37] or [38]. However, the range from the minimum to the maximum detectable pressure/force, i.e. the dynamic range, shrinks with the increase of the sensitivity of a tactile sensor, which is caused by the technology used in the structure of current sensors. The area of the sensing cells also causes a contradiction between sensitivity and spatial resolution, as was discussed above. Dahiya et al. [17] impose the following requirements: the sensitivity on the fingertips should be not less than 1 mN, while a dynamic range of 1000:1 is desirable.

(3) Requirements for frequency response highly depend on the application. In general, tactile sensors can be dynamic or static [9]. If the hand is required to detect vibrations during slippage, the frequency response should be as high as the vibration frequencies occurring during a slippage [16,32,38,39]. In human hands, the detectable vibration frequencies vary from 5 to 50 Hz and from 40 to 400 Hz for different afferents [6]. Thus the frequency response of a dynamic tactile sensor should be at least 400 Hz, i.e. the sampling rate must be at least 800 Hz according to the Nyquist–Shannon sampling theorem (a numeric check of this requirement is sketched after this list). When only spatial resolution is of importance (e.g. tactile object recognition [40]), the frequency response is not restricted by the response time. On the contrary, when measurements of vibrations are used to prevent a slippage [41], to detect contact of a grasped object with the environment [38] or to recognize the texture of a surface [22], the response time of a sensor becomes crucial. The frequency response (bandwidth) is limited by the softness (elasticity) of a tactile sensor. The use of soft materials, which are employed to increase surface friction, causes a phase delay in the propagation of the waves of mechanical vibrations that occur at the point of contact.

(4) Hysteresis and memory effect should ideally be as low as possible. Tactile sensing arrays incorporating flexible foam in their structure unavoidably show elastic behavior: once the sensor is pressed and released, the flexible foam first compresses and then regains its form, but not immediately (hysteresis effect) and sometimes not to the previous shape (memory effect). Moreover, the sensor could be covered by a soft material, e.g. silicone rubber as in [42]. The advantage of using flexible materials is the increase of contact friction. However, the sensitivity and frequency response of a sensor may degrade with the increase of flexibility. Though reading devices can have a high sampling rate, a sensor may have significant hysteresis, which reduces the dynamic response [9]. The memory effect could be avoided by the use of a thinner foam, which in turn decreases the dynamic range, since the maximum charge (in capacitive sensors) that can be stored is proportional to the thickness of the foam. This maximum charge represents the largest detectable force.

(5) Wiring of tactile sensors should not affect the workspace of robot hands [18]. Integration of a high number of tactile sensors in the robot hand is challenging due to wiring constraints. As an example, in [43] a multimodal tactile sensor is installed as a complete fingertip with a bulky backside instead of the distal and middle phalanges. Shielding and smart wiring should guarantee minimum sensitivity to noise and minimum tactile cross-talk. The use of a serial communication protocol decreases the number of connection wires, as in the iCub skin [44], but it reduces the achievable sampling rate.

(6) A sensor itself should be flexible so that it can be attached to any type of robot hand [44], unless the sensor is designed as a complete part of a robot hand, as for example the 3D-shaped tactile sensing fingertip in [34].

(7) Surface properties of tactile sensors, such as mechanical compliance and the surface friction coefficient, should fit the various manipulation tasks. Elastic material with a given friction coefficient and compliance can cover tactile sensors. If the contact sensing surface has very low friction, then the hand must apply high normal forces to keep the object stable, which can lead to breaking the object [34]. However, low friction of the sensor surface is needed in tactile exploration procedures [31].

(8) A robust sensor design should guarantee that the sensor can withstand highly repetitive usage without its performance being affected. The sensor should endure normal as well as lateral forces.
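As a worked illustration of the frequency-response requirement in item (3), the following minimal Python sketch checks whether a given sampling rate satisfies the Nyquist–Shannon criterion for slip-related vibrations of up to 400 Hz. The two example acquisition rates are the 10 kHz and 300 Hz figures mentioned later in Section 2.2; the channel labels are only illustrative.

```python
# Minimal numeric check of the Nyquist-Shannon requirement discussed in item (3).
# 400 Hz is the upper bound of slip-related vibrations cited in the text; the two
# acquisition rates below are taken from Section 2.2, the labels are illustrative.

SLIP_VIBRATION_MAX_HZ = 400.0  # highest vibration frequency to be detected

def min_sampling_rate(max_signal_hz: float) -> float:
    """Return the minimum sampling rate (Hz) required by the Nyquist criterion."""
    return 2.0 * max_signal_hz

def can_detect_slip(sampling_rate_hz: float) -> bool:
    """True if the reading device samples fast enough for 400 Hz slip vibrations."""
    return sampling_rate_hz >= min_sampling_rate(SLIP_VIBRATION_MAX_HZ)

if __name__ == "__main__":
    for name, rate in [("dynamic acquisition board", 10_000.0),
                       ("capacitance-to-digital converter", 300.0)]:
        verdict = "sufficient" if can_detect_slip(rate) else "too slow"
        print(f"{name}: {rate:.0f} Hz -> {verdict} "
              f"(need >= {min_sampling_rate(SLIP_VIBRATION_MAX_HZ):.0f} Hz)")
```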
2.1. Tactile sensor types

Changes in capacitance, resistance, optical distribution or electrical charge can be used in sensing systems [45,46]. In the robotics literature, these different ways of constructing sensing systems are referred to as transduction of contact information [10], and the types of tactile sensors vary depending on the transduction.

In the following we describe the basic types of tactile sensors and their transduction methods. The advantages and disadvantages of each sensor type are given in Table 2.

Fig. 1. Piezoresistive Tactile Sensor Arrays: (a) illustration of resistance changes in conductive rubber [47], (b) nano-scale image of conductive rubber [48], (c) structure of a piezoresistive tactile array [49], (d) piezoresistive fabric tactile sensor [50], (e) schematic of the electrode layer of the 3D-shaped tactile sensor [34], (f) tactile image of a piezoresistive pressure sensor array [35].

Table 2
Tactile sensing types: advantages and disadvantages of major sensor types. Abbreviations: PRes.—piezoresistive sensors, Cap.—capacitive sensors, PEl.—piezoelectric sensors, QTC—quantum tunnel composite sensors, Opt.—optical sensors, BarS.—sensors based on barometric measurements, MultiM.—multimodal sensors, SoundS.—structure-borne sound sensors.

Type | Advantages | Disadvantages
PRes. | Many commercial solutions exist, simpler to manufacture, can be flexible. | Non-linear response, temperature and moistness dependence, fatigue, permanent deformation, hysteresis.
Cap. | A number of commercial solutions, can be flexible, may have higher bandwidth than PRes. | Susceptibility to electro-magnetic noise, sensitivity to temperature, non-linear response, hysteresis.
PEl. | Very high bandwidth. | Temperature dependence, dynamic sensing only.
QTC | Linear response, higher dynamic range (w.r.t. Cap. and PRes.). | More complex to manufacture (w.r.t. Cap. and PRes.).
Opt. | High spatial resolution, high sensitivity, repeatability, immunity to EM noise. | Bulky, high power consumption, high computational costs.
BarS. | High bandwidth, high sensitivity, temperature and moistness (fluid) independence. | Low spatial resolution.
SoundS. | High bandwidth. | Dynamic sensing only.

2.1.1. Piezoresistive sensors

The piezoresistive effect is a physical process during which electrical resistance changes when the material is mechanically deformed (Fig. 1(a)) [45]. Materials possessing this effect are called piezoresistors [51].

There are several technologies for artificial tactile sensing based on piezoresistive materials: force sensing resistors (FSRs), pressure-sensitive conductive rubber, piezoresistive foam, and piezoresistive fabric. The simplest way to incorporate tactile sensing via discrete components is by using FSRs [9], and they are widely used in positioning devices such as joysticks [52]. Piezoresistive rubber is a composite material made by mixing a non-conductive elastomer with homogeneously distributed, electrically conductive carbon particles [53,47]. Fig. 1(b) shows the structure of conductive rubber at the nano-scale level [48]. Sensors based on conductive rubber with multilayer structures as in [49,39,35,54] (Fig. 1(c), (d)) may suffer from delamination of the top layers. This can be avoided by using a single layer of conductive rubber with a stitched array of wires in orthogonal orientations, as in [55]. Another method of designing tactile sensing arrays using conductive rubbers incorporates a non-flexible pattern of electrodes on one layer and piezoresistive rubber on a second layer (Fig. 1(e)) [34]. Some of the sensors and components are commercially available from Interlink [56] and Tekscan [57] (FSRs), Weiss Robotics [58] (rigid tactile sensors based on carbon-enriched silicone rubber), Inaraba [59] (pressure conductive rubber), Eeonyx [60] (piezoresistive fabric), and ATi Industrial Automation [61] (force/torque sensors).

It is worth mentioning that currently developed tactile sensors based on pressure-sensitive rubber and organic transistors, such as the ones used in the bionic skin [62], are exceptionally thin and highly flexible. Conductive rubbers used in piezoresistive sensors have a nonlinear force–resistance characteristic (please refer to the sensor calibration plot in [31]). As a consequence of using elastic materials, the sensors have severe hysteresis. The sensitivity of piezoresistive sensors may decrease due to wear and tear, since the resistance of the conductive rubber does not depend on deformation only but also on thickness. Moreover, materials used in piezoresistive sensors can change their properties due to variations in temperature and moistness [45].

Piezoresistors also suffer from lower repeatability: after multiple deformations, an elastic material may never regain its initial form. Some of the piezoresistive sensing arrays are also fragile to shear forces, e.g. the Weiss tactile sensors [58]. In spite of these drawbacks, a number of robot hands incorporate piezoresistive tactile sensing arrays, since the sensors are relatively simple to manufacture, can be flexible, and many commercial solutions exist. Compared to capacitive sensors, which will be discussed in the next section, piezoresistive sensors are more robust (though not completely immune) to electro-magnetic noise.
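The nonlinear force–resistance behavior mentioned above is usually handled by calibrating each taxel against a reference force sensor. The sketch below is a minimal illustration of such a calibration using a hypothetical power-law fit; the sample values are invented for illustration and it is not the procedure used in [31].

```python
import numpy as np

# Hypothetical calibration of one piezoresistive taxel: map measured resistance (kOhm)
# to contact force (N) with a power-law model F = a * R**b fitted in log-log space.
# The sample points below are made up; a real calibration would use readings taken
# against a reference force/torque sensor.
resistance_kohm = np.array([50.0, 20.0, 10.0, 5.0, 2.0])   # assumed measurements
reference_force_n = np.array([0.1, 0.5, 1.2, 2.8, 7.0])     # assumed ground truth

# Fit log(F) = log(a) + b * log(R); polyfit returns [slope, intercept].
b, log_a = np.polyfit(np.log(resistance_kohm), np.log(reference_force_n), 1)
a = np.exp(log_a)

def resistance_to_force(r_kohm: float) -> float:
    """Estimate contact force from a raw resistance reading using the fitted model."""
    return a * r_kohm ** b

if __name__ == "__main__":
    for r in (40.0, 8.0, 3.0):
        print(f"R = {r:5.1f} kOhm -> F ~ {resistance_to_force(r):.2f} N")
```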

Fig. 2. Capacitive Tactile Sensing Technology: (a) capacitance of a parallel-plate capacitor depends on the distance between the plates d and the area of the plates A (q is the stored charge) [45]; (b) two conductive plates are separated by an elastic dielectric—as force is applied, the distance between the plates reduces, changing the capacitance [9]; (c) mesh of triangle-shaped capacitive sensors for the palm of the iCub humanoid robot [44].

Fig. 3. Piezoelectric Tactile Sensing: (a) the piezoelectric effect—an applied force causes rearrangement of positive Si and negative O2 particles, leading to an increase of potential [45]; (b) a tactile sensing array based on the piezoelectric effect with electrodes on the bottom layer, piezoelectric material in the middle and rubber on the top [24]; (c) schematic model of a piezoelectric sensing tactel [71].

2.1.2. Capacitive sensors

Capacitive sensors consist of two conductive plates (Fig. 2(a)) separated by a compressible dielectric material (Fig. 2(b)). When the gap between the plates changes under the applied forces, the capacitance also changes. Besides normal forces, shear forces can be calculated by the sensor with the use of multiple embedded capacitors [63]. Pressure sensing arrays can be constructed by overlapping row and column electrodes isolated from each other by an elastic dielectric [33]. Sensitivity to small forces can be achieved by using more compressible elastic materials or thin sensors. As the flexible foam between the two plates gets thinner, a smaller change in stored charge can be measured, which in turn means a higher sensitivity.

Capacitive technology is very popular among sensing transducers and it has been widely used in robotic applications [17]: for example, in the tactile skin (Fig. 2(c)) of the iCub humanoid robot [44], in the PR2 robot grippers [38], with the multifingered ''Allegro'' robot hand [64], and with the Robotiq robot gripper [65]. There are commercial capacitive pressure sensing arrays such as ''DigiTacts'' from Pressure Profile Systems (PPS) [66] and capacitance-to-digital-converter (CDC) chips such as the ''AD7147'' from Analog Devices [67].

The major disadvantages of capacitive sensors are susceptibility to electro-magnetic noise, sensitivity to temperature, non-linear response (please refer to the plot with the response of an excited taxel in [68]), and hysteresis. Their advantages include a higher frequency response relative to piezoresistive sensors. Since capacitive technologies are used in everyday-life applications, as for example touch screens, this type of tactile sensing has been well investigated and is used in robotics and especially in robot hands.
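To make the transduction principle of Fig. 2(a)–(b) concrete, the following sketch evaluates the parallel-plate relation C = eps0 * eps_r * A / d and the capacitance change produced by compressing the elastic dielectric. The plate area, gap, permittivity and deflection values are assumptions chosen only for illustration and do not describe any of the commercial sensors listed above.

```python
EPS0 = 8.854e-12  # vacuum permittivity, F/m

def plate_capacitance(area_m2: float, gap_m: float, eps_r: float) -> float:
    """Parallel-plate capacitance C = eps0 * eps_r * A / d (cf. Fig. 2(a))."""
    return EPS0 * eps_r * area_m2 / gap_m

if __name__ == "__main__":
    # Assumed taxel geometry: 2 mm x 2 mm plates, 100 um elastic dielectric, eps_r = 4.
    area, gap0, eps_r = 2e-3 * 2e-3, 100e-6, 4.0

    c_rest = plate_capacitance(area, gap0, eps_r)
    # An applied normal force compresses the dielectric; assume a 10 um deflection.
    c_pressed = plate_capacitance(area, gap0 - 10e-6, eps_r)

    print(f"rest:    {c_rest * 1e15:.1f} fF")
    print(f"pressed: {c_pressed * 1e15:.1f} fF "
          f"(delta = {(c_pressed - c_rest) * 1e15:.1f} fF)")
```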
2.1.3. Piezoelectric sensors

The piezoelectric effect (Fig. 3(a)) is described as electrical charge generation in a crystalline material due to deformation caused by an applied force/pressure [45]. The piezoelectric effect is produced in quartz crystals, as well as in human-made ceramics and polymers, such as polyvinylidene fluoride (PVDF) [69]. A piezoelectric tactile sensor can be created with PVDF film strips embedded into a rubber material. Piezoelectric materials, being restricted to dynamic measurements and used in ultrasonic-based sensors, are suitable for dynamic tactile sensing [17,33]. Among other piezoelectric materials, the PVDF polymer has features such as flexibility and chemical stability, which make it preferable for use in touch sensors. Seminara et al. [69] conducted research on the PVDF electro-mechanical design of tactile sensors with a frequency range of 1 Hz to 1 kHz. Goger et al. [24] developed a combined dynamic/static tactile sensor (Fig. 3(b)) based on PVDF polymer and piezoresistive foam from Weiss Robotics for a fluidic robot hand [70]. Chuang et al. [71] developed a flexible tactile sensor based on piezoelectric film with structural electrodes for grasping an object of unknown weight (Fig. 3(c)).

Piezoelectric materials have a high bandwidth, up to 7 kHz as reported in [24], and a faster dynamic response than capacitive sensors. Their disadvantages include fragility of the electrical junctions, temperature sensitivity [52], and suitability for dynamic measurements only.

2.1.4. Quantum tunnel effect sensors

Quantum Tunnel Composite (QTC) sensors can change their properties from insulators to conductors under compression [17]. QTC sensors are more technologically advanced compared to piezoresistive and capacitive sensors. The metal particles in QTC get so close to each other that quantum tunneling (of electrons) takes place between the particles. Using QTC material, Zhang et al. [72] (Fig. 4) developed a flexible tactile sensor for an anthropomorphic artificial hand with the capability of measuring shear and normal forces. The sensor has sensitivities of 0.45 mV/mN in the x- and y-directions and of 0.16 mV/mN in the z-direction, and dynamic ranges up to 8 N in the z- and y-directions and 20 N in the x-direction. QTC-based tactile sensors [73] were integrated with previous versions of the Shadow robot hand [74] and used in the tactile glove for the Robonaut hand [75]. The sensors have a linear response (please refer to the sensor outputs w.r.t. normal force in [72]) and a dynamic range from 0 to 22 N, which outperforms the piezoresistive sensor with a maximum force of 5 N [31] in terms of dynamic range. These sensors suffer from wear and tear and, therefore, their sensitivity decreases as in the case of the piezoresistive sensors. To the best of our knowledge, for the tactile sensing materials within this category, there are no commercial products designed for use with robot hands.

2.1.5. Optical sensors

Optical sensing is based on optical reflection between media with different refractive indices. Conventional optical tactile sensors consist of an array of infrared light-emitting diodes (LEDs) and photo detectors (Fig. 5(a)). The intensity of the light is proportional to the magnitude of the pressure [45]. Optical sensors can also be made sensitive to shear forces; e.g. Yussof et al. [37] developed an optical three-axis tactile sensor for the fingertips of a two-fingered hand (Fig. 5(b)). The sensor consists of 41 sensing elements made from silicone rubber, a light source, an optical fiber-scope, and a charge-coupled device (CCD) camera. With the optical tactile sensor, the hand is capable of manipulating a light paper box (Fig. 5(c)). Kampmann et al. [76] embedded fiber optic sensors in the multi-modal tactile measuring system of a three-fingered robot gripper (Fig. 7(d)). Xie et al. developed a flat 3 × 3 optical tactile sensor array (Fig. 5(d)) whose sensing elements are magnetic resonance compatible for use in Magnetic Resonance Imaging [77].

Fig. 4. Quantum Effect Tactile Sensing: (a) structure of a tactel of the QTC-based tactile sensing array with the capability of measuring shear and normal forces [72]; (b) the flexible tactile sensing array for a finger of an anthropomorphic robot hand with tactels that can measure shear forces [72].

Fig. 5. Optical Tactile Sensors: (a) an optical tactile transducer based on the principle of frustrated total internal reflection [45], (b) structure of an optical three-axis tactile sensor: a displacement of a sensing element fixed on the flexible finger surface causes changes in light propagation in opto-fibers [37], (c) fingers with the sensitive optical sensors manipulating a light paper box [37], (d) photo of an optical 3 × 3 tactile array with magnetic field compatibility [77], (e) ''GelSight'' optical sensor consisting of a piece of clear elastomer coated with a reflective membrane sensing the shape of a cookie surface [79], (f) finger configurations of the ''GelSight'' sensor [79].

Fig. 6. Sensors based on barometric measurements: (a) the structure of a tactile sensing cell with a barometer and silicone rubber, (b) the TakkStrip tactile array of these cells [87], (c) custom-shaped array of the pressure sensing barometers of the iHY hand [42], (d) micro-vibration sensing system based on a fluid pressure sensor of the BioTac tactile sensor [83].

Johnson et al. [78] proposed a novel ''GelSight'' tactile sensor to capture surface textures using an elastomer coated with a reflective membrane and a camera with a resolution of up to 2 microns (Fig. 5(e)). A fingertip with a ''GelSight'' tactile sensor (Fig. 5(f)) can measure the surface roughness and texture, the pressure distribution, and even a slip [79]. Another example of an optical tactile sensor with transparent elastomer material is presented in [80], where an LED and a photo-diode distant from each other are placed against a reflecting (contact) planar surface. When the surface deforms, it causes changes in the reflected beams. A similar concept is used in the OptoForce sensors [81]. These sensors are based on the use of infrared light to detect deformation of the contact surface, which in turn is transformed to force. The forces in three dimensions are estimated from the measurements of four photo-diodes that surround one infrared source. The reflecting surface has a semi-spherical shape.

Sensors within this category have good spatial resolution, sensitivity, high repeatability and immunity to electro-magnetic interference [15]. The disadvantages of these tactile sensors are their relatively big size, high power consumption and high computational costs [10].

2.1.6. Sensors based on barometric measurements

Tactile sensors within this group use pressure transducers that have long been used for measuring pressure in liquids and air [45]. The use of a liquid inside a tactile sensor allows a high frequency response and deformability of the sensor at the same time. The liquid is used as a propagation medium for vibrations, which are represented by changes in the pressure value. This approach takes advantage of conventional pressure sensors, as for example the digital barometer [82]. Wettels et al. [19] introduced a sensing system that incorporates an electro-conductive fluid to produce both constant and dynamic signals (Fig. 6(d)). Micro-vibrations, caused either by motion over a textured surface or by slippage at any contact point, propagate as sound waves through the liquid medium to a pressure transducer [83]. The bandwidth of the sensor is 1 kHz, which makes the sensing system well suited for slip detection applications. The sensor is embedded in the multi-modal biomimetic BioTac® fingertip sensor from SynTouch LLC [84].

In [85], no liquid is used as a propagation medium; a barometer is instead molded within silicone rubber in each tactel. The rubber acts as a membrane (Fig. 6(a)). Once the rubber is deformed due to contact with the environment, it causes changes in the pressure values of the barometer. Using the same digital barometer, Odhner et al. [42] developed a tactile sensor array (Fig. 6(c)) with a spatial resolution of around 3–5 mm, a sensitivity of 1 mN, and a dynamic range up to 4.9 N for a three-fingered robot hand [42].

Sensors involving liquid and barometers have a high frequency response [86]. Sensors with silicone rubber and a barometer are low-cost, but have a low frequency response [85] as a result of the elasticity of the silicone rubber. Hence, the use of a liquid as a propagation medium is more suitable when frequency response is of importance.
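The ~1 kHz bandwidth of the fluid-coupled channel is what makes these sensors attractive for slip detection. The sketch below is a generic illustration of how such a dynamic pressure signal could be turned into a slip-event flag by high-pass filtering and thresholding its short-time energy; the sampling rate, cutoff and threshold are assumptions, not parameters of the BioTac or TakkStrip devices.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

FS_HZ = 2000.0        # assumed sampling rate of the dynamic pressure channel
CUTOFF_HZ = 50.0      # assumed high-pass cutoff: keep slip vibrations, drop slow loading
RMS_THRESHOLD = 0.05  # assumed detection threshold in sensor units

def detect_slip(pressure: np.ndarray, window: int = 64) -> np.ndarray:
    """Mark windows whose high-frequency energy exceeds a threshold (possible slip)."""
    sos = butter(4, CUTOFF_HZ, btype="highpass", fs=FS_HZ, output="sos")
    vibration = sosfiltfilt(sos, pressure)          # remove the quasi-static grip force
    n = len(vibration) // window
    rms = np.sqrt((vibration[: n * window].reshape(n, window) ** 2).mean(axis=1))
    return rms > RMS_THRESHOLD

if __name__ == "__main__":
    t = np.arange(0, 1.0, 1.0 / FS_HZ)
    steady = 0.5 * np.ones_like(t)                        # constant grip pressure
    slip = 0.2 * np.sin(2 * np.pi * 300 * t) * (t > 0.6)  # 300 Hz burst mimicking a slip
    events = detect_slip(steady + slip)
    print(f"{events.sum()} of {events.size} windows flagged as slip")
```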

2.1.7. Multi-modal tactile sensors

To match the human hand's different types of tactile sensing modalities (thermal, fast-adapting and slow-adapting afferents) [6] as closely as possible, a robot hand should be equipped with multi-modal tactile sensors. Current multi-modal tactile sensing systems incorporate static pressure distribution arrays, dynamic tactile sensors, thermal sensors, and proximity sensors. The BioTac finger-shaped sensor array (Fig. 7(a), (b)) provides information about the contact forces, micro-vibrations, and temperature produced during contact with external objects [19]. Some tactile sensors have the ability to sense both dynamic and static contact forces since they have been constructed using a combination of piezoresistive and piezoelectric materials. Examples of such materials include piezoresistive rubber with PVDF (Fig. 3(b)) [24], which is integrated with an anthropomorphic fluidic hand [70], and pressure-variable resistor ink with PVDF, which is integrated with a four-fingered robot hand [88]. Another hybrid sensing system with a similar combination of dynamic and static transducers combines a carbon micro-coil touch sensor and a force tactile sensor [89]. Hasegawa et al. integrated proximity and pressure sensors on the fingertip (Fig. 7(c)) to enhance autonomous grasping [90]. Optical sensors have also found their application in the multi-modal approach. A three-fingered robot gripper described in [76] incorporates optical sensors and combines measurements of absolute forces by strain gauge sensors, dynamic forces by piezoelectric sensors, and force distribution by fiber optic sensors, as shown in Fig. 7(d). Unlike the above multimodal sensors, in which the locations of the sensing units are known, a sensing system of a robot fingertip proposed by Hosoda et al. [91] has a random distribution of the sensing units. Similar to [24], the sensing system consists of piezoresistive and piezoelectric sensors to measure static forces and vibrations. The piezoelectric sensors are placed at a skin layer and inside the fingertip, thus giving the possibility to measure internal vibrations. The only drawback of multimodal tactile sensors is their size.

Fig. 7. Multimodal Tactile Sensors: (a) schematic of the biomimetic BioTac tactile sensor with 19 electrodes, fluid pressure sensor and thermometer [84], (b) photo of the multimodal BioTac tactile sensor, (c) combined tactile-proximity sensor that can measure both the distance to an object and the contact pressure [90], (d) drawing of a multi-modal tactile sensing module consisting of optical and piezoresistive sensors [76].

2.1.8. Structure-borne sound tactile sensors

Vibrations and waves in solid structures are summarized by the term ''structure-borne sound'' [92]. In manipulation tasks, structure-borne sounds occur at the initial contact of a manipulated object with the environment or during slippage. Accelerometers and microphones can be used as detecting devices. In pick-and-place manipulation tasks, these structure-borne sounds can serve as indicators to trigger the placement of the object by the manipulator. Romano et al. [38] use a highly sensitive 3-axis accelerometer in the base of the PR2 robot gripper in order to detect the contact of the object with the table and to release the object. Earlier, Kyberd et al. [32] integrated a microphone with an anthropomorphic prosthetic hand for automated grasping.

Sensors within this group have a wide bandwidth, but are suitable for dynamic measurements only. However, in close proximity to an object, it is possible to estimate the distance to the object by comparing the level of environmental acoustic noise and the level of noise within the sensor, as has been shown by Jiang et al. [93]. The presented sensor concept is based on the Seashell Effect—the increase of the noise level in cavities due to the resonance of sound waves at the intrinsic resonance frequency of the cavity. The sensor incorporates a cavity and a microphone located inside the cavity. The cavity has its own resonance frequency that depends on both the structure of the cavity and the distance from the object to be grasped.

The data stream coming from tactile sensors has different physical meanings for different transduction technologies. In general it can be dynamic or static according to the time response, and it may represent an array of data, a vector or a scalar value. Hence, data acquisition from different sensors follows its own approaches, as discussed in the following section.

2.2. Tactile data types and acquisition

Force-torque sensors installed on the fingertips of a robot hand provide force and torque values in each direction in R3. A contact point location can be estimated from these forces and torques as long as the shape of the fingertip is known [26]. The measured forces and torques can then be used for force control (Section 4.4) and in haptic object recognition (Section 4.2).

Tactile sensors with a fast response (such as accelerometers, microphones, piezoelectric and capacitive technology based sensors, and barometers with fluid media) provide information about vibrations at the contact point (see Fig. 8(b)). Information about vibrations can be further used for slip detection and haptic object exploration (Sections 4.2.2 and 4.1.2). The dynamic response of tactile sensing arrays is limited by the sampling rate of the reading devices. In [39], the sampling rate of the data acquisition board is 10 kHz, while in [9] the signal bandwidth is limited by the sampling rate (300 Hz) of a commercial capacitance-to-digital converter [67]. Fig. 9(a) and (b) show schematic diagrams of reading devices for dynamic capacitive and piezoelectric PVDF sensors.

Information from pressure sensing tactile arrays can be treated as a gray-scale image in computer vision [40] (see Fig. 8(a)). Although some tactile arrays may have tactels (Fig. 4(a)) that can measure pressure in three-dimensional space as in [72], the value of each tactel in most current tactile sensors is proportional to the applied normal pressure only. Tactile sensor arrays provide information about contact shape and pressure distributions [41].

In capacitive and piezoresistive sensors, data from each tactel can be acquired either directly, which means that a high number of wires is required, or by using a multiplexing circuit (Fig. 10(c)), which decreases the number of wire connections twofold.

Piezoresistive tactile arrays consist of a common electrode, of sensing electrodes arranged as a matrix, and of conductive rubber in between. Pressing on the sensor's surface provides an image of the applied pressure profile [53]. Fig. 1(f) illustrates the image of the sensing array which is produced when a spherical object is pressing the tactile surface. Tactile images can be used for contact pattern recognition [94], grasp stability estimation [95], object classification [49], and tactile servoing [31].
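Because a pressure-sensing array can be treated as a gray-scale image, standard image measures apply directly. The sketch below computes two features commonly derived from such tactile images — contact area and pressure-weighted centroid — on a synthetic array; the array size and threshold are arbitrary assumptions, and the routine is a generic illustration rather than the feature set used in any of the cited works.

```python
import numpy as np

def contact_features(pressure_image: np.ndarray, threshold: float = 0.1):
    """Return (contact_area_in_taxels, centroid_row, centroid_col) of a tactile image."""
    contact = pressure_image > threshold            # binary contact mask
    area = int(contact.sum())
    if area == 0:
        return 0, None, None
    weights = pressure_image[contact]
    rows, cols = np.nonzero(contact)
    cy = float((rows * weights).sum() / weights.sum())  # pressure-weighted centroid
    cx = float((cols * weights).sum() / weights.sum())
    return area, cy, cx

if __name__ == "__main__":
    # Synthetic 8 x 8 tactile image with a pressure blob, mimicking Fig. 1(f).
    img = np.zeros((8, 8))
    img[2:5, 3:6] = np.array([[0.2, 0.6, 0.2],
                              [0.6, 1.0, 0.6],
                              [0.2, 0.6, 0.2]])
    area, cy, cx = contact_features(img)
    print(f"contact area: {area} taxels, centroid at row {cy:.2f}, col {cx:.2f}")
```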

Fig. 8. Tactile sensing signal types: (a) a two-dimensional pressure distribution of a tactile sensing array, where the sensing tactels are located on the xy plane and force/pressure is measured along the z-axis [77]; (b) dynamic tactile signal from a single tactel or from an ensemble of tactels, which can be acquired during a slippage [24]; (c) 6-DoF force/torque sensor measurements in the ellipsoid-shaped fingertip [26], including normal forces in each direction Fi of the Cartesian space, torques Mi, the contact point P, and the forces and torques at the contact point F and q.
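Fig. 8(c) and the preceding paragraph note that the contact location can be recovered from fingertip force/torque measurements when the fingertip geometry is known. The sketch below illustrates the idea for the simplest case — a spherical fingertip centered at the sensor frame and a pure contact force with no local spin torque. It is a generic geometric construction under those assumptions, not the method of [26]; the radius and force values are invented for the example.

```python
import numpy as np

def contact_point_on_sphere(force: np.ndarray, torque: np.ndarray, radius: float) -> np.ndarray:
    """Estimate the contact point c on a spherical fingertip of the given radius.

    Assumes a pure contact force applied at c, so the measured torque is m = c x f.
    The point is found by intersecting the force's line of action with the sphere and
    keeping the solution where the force pushes into the surface (f . c < 0).
    """
    f, m = np.asarray(force, float), np.asarray(torque, float)
    f2 = f @ f
    c_perp = np.cross(f, m) / f2                       # component of c perpendicular to f
    lam = -np.sqrt(max(radius**2 - c_perp @ c_perp, 0.0) / f2)
    return c_perp + lam * f

if __name__ == "__main__":
    R = 0.01                                   # 10 mm fingertip radius (assumed)
    c_true = R * np.array([0.0, 0.6, 0.8])     # a point on the sphere surface
    f = np.array([0.3, -1.0, -1.5])            # force pushing roughly toward the center
    m = np.cross(c_true, f)                    # torque a 6-axis F/T sensor would report
    print("estimated contact point:", contact_point_on_sphere(f, m, R))
    print("true contact point:     ", c_true)
```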

For tactile sensing arrays, data acquisition involves the use of analog-to-digital converters (Fig. 9(d)) as well as of microprocessing units for polling each tactel [18]. The capacitance of capacitive tactile sensors can be measured by commercial CDC chips, which can include an I2C serial interface. Digital barometers such as the absolute digital pressure sensor ''MPL115A'' [96], used in the iHY robot hand [42], also have an I2C serial interface. Communication with processing units can be realized via different transmission protocols (e.g. controller area network (CAN), universal serial bus (USB), RS232). In the iCub skin [44], local measurements are sent by an on-board processing unit over a CAN bus (Fig. 9(e)). Multimodal tactile sensing data in the BioTac sensor [19] is acquired by a PIC microprocessor and sent to the host processing unit over a serial peripheral interface (SPI). In order to minimize the memory use of micro-processing units, data coming from sensors can be preprocessed by signal conditioning circuits, which can be implemented as a system on chip (SOC) or system in package (SIP) [18].

Fig. 9. Tactile sensing reading circuits: (a) the condenser microphone circuit for capacitive sensors [9]; (b) a circuit for utilizing piezoelectric PVDF film as a stress rate sensor [9]; (c) signal conditioning and voltage multiplexing for a 3 × 3 tactile sensing array [49]; (d) the voltage-divider circuit for a pressure conductive rubber [39]; (e) network structure of the iCub tactile sensing skin using a CAN bus for connecting tactile sensing patches, 12 tactels in each patch, with a main processing unit [44].

In some specific applications, for example fast reaction to slip [24], signals from tactile sensors can be analyzed and processed within a controller without sending information to the host computer. In most applications, middleware and high-level software installed on the main processing unit are used to process the acquired data and control the system. For these purposes, versatile open-source and commercial robot control platforms are available: in [30], the robot operating system (ROS) [97] is used to control the Shadow robot hand [98]; the YARP robot platform [99] is used to control the iCub humanoid robot [100]; the dSPACE real-time operating system from dSPACE Co. is used in [39] to control a high-speed robot hand [101] in real time; and C++ libraries are provided by the Open Robot Control Software (OROCOS) [102]. Among open-source robot control platforms, ROS is the most widely used and supports both simulation (Gazebo simulator) and control of the Shadow hand, the Barrett hand and many other manipulators and robots.
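As an illustration of this middleware layer, the following minimal sketch publishes one frame of a tactile array as a flat message using ROS (rospy). The topic name, rate, array size, use of Float32MultiArray and the stubbed driver call are illustrative assumptions only; they do not correspond to the actual Shadow or iCub drivers.

```python
#!/usr/bin/env python
# Minimal ROS node that reads a (simulated) 4 x 4 tactile array and publishes it.
# Topic name, rate and message layout are illustrative assumptions only.
import random
import rospy
from std_msgs.msg import Float32MultiArray

def read_tactile_array(rows=4, cols=4):
    """Stand-in for a real driver call; returns one frame of taxel pressures."""
    return [random.random() for _ in range(rows * cols)]

def main():
    rospy.init_node("tactile_array_publisher")
    pub = rospy.Publisher("tactile/pressure", Float32MultiArray, queue_size=10)
    rate = rospy.Rate(100)                    # 100 Hz polling, as an example
    while not rospy.is_shutdown():
        pub.publish(Float32MultiArray(data=read_tactile_array()))
        rate.sleep()

if __name__ == "__main__":
    main()
```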

Fig. 10. Simple integration of tactile sensing arrays: (a) the Tekscan tactile sensing system consisting of 349 taxels with the Shadow robot hand [94], (b) the Allegro robot hand with PPS RoboTouch capacitive arrays [64], (c) the Robotiq adaptive gripper with a sensor suite installed on the contact surface [65].

Fig. 11. Advanced integration of tactile sensors on the robot fingertips: (a) a flexible PCB for a capacitive tactile sensing array with 12 taxels designed for the iCub humanoid robot [103], (b) the iCub flexible PCB wrapped around the inner support of the fingertip [103], (c) a 3D-shaped rigid tactile sensing array with 12 sensing elements attached to the fingertip of the Shadow robot hand [34], (d) the BioTac multimodal tactile sensor installed on the Shadow robot hand by replacing the two last links of the finger [43], (e) ATi Nano17 force/torque sensor on the fingertip of the Shadow robot hand [30].

Fig. 12. Difference in contact surfaces between a human finger and a robot finger [105].

3. State of the art tactile sensor integration with robot hands

In this section we review the existing robot hands equipped with tactile sensors and discuss several issues related to the integration process.

3.1. Issues related to the shape of the attachment surface

Mounting tactile sensors on the palm, a jaw gripper or on fingers with flat surfaces is relatively straightforward; one of the simplest ways involves using double-sided tape. Fig. 10(a) shows an experimental setup containing the Shadow Hand and the Tekscan tactile sensing system (Model 4256E), which was used for contact shape recognition [94]. In another manipulation setup, off-the-shelf capacitive arrays have been installed on the fingertips of the four-fingered ''Allegro'' robot hand (Fig. 10(b)). Fig. 10(c) illustrates the Robotiq adaptive gripper covered by capacitive pressure sensing arrays used for recognizing the type of slip [65]. Attaching tactile sensors on fingers and fingertips is a complex process, as curved surfaces with a small radius of curvature have to be taken into account. Tactile sensors should be either: (a) flexible and appropriately shaped to envelop a given surface, as in the iCub tactile fingertip sensors (Fig. 11(a), (b)) [103]; or (b) rigid and shaped as an attachment part, e.g. [34] or [104], where a 3D-shaped tactile sensing array and an ellipsoid F/T sensor (Fig. 11(c) and (e)) replace the fingertips of the Shadow robot hand [98]. In another version of the Shadow robot hand, with the integrated BioTac multimodal tactile sensor, each finger loses one DoF (Fig. 11(d)): the sensor is as big as the two last links, the distal and middle phalanges, of the human index finger.

The shape of the finger links in robot hands is different from the shape of human finger phalanxes. The proximal and middle links of fingers in artificial robot hands have a smaller contact surface than those of humans, a fact that significantly decreases the sensing area and causes difficulties with attachment. Fig. 12 shows the difference between the sensing areas on the middle and proximal links of a human finger and a robot finger. Current artificial tactile sensors are not as flexible as human skin and cannot cover the empty space between the links needed for closing the finger of robot hands.

3.2. Wiring issues

A key issue in tactile sensing array integration is the number of wires required to read and transmit the data from the sensing arrays. Any increase in the number of tactels in a tactile sensing array causes an increase either in the number of wires or/and in the time needed for data acquisition from the sensors. Serial data communication can be used to reduce the number of connections. For example, in the iCub skin, communication was implemented through an I2C serial bus, where only four wires were connected to the PCB of the sensing array [44]. However, serial access of data is slower than parallel access. In the iCub, the skin sampling rate for each tactel decreases from 100 to 25 Hz as the number of tactels increases. If the real-time pressure distribution is of interest, as for example in tactile servoing [31], serial data access may fail to produce time-series images of the contact. Parallel access of data provides a higher acquisition rate, but requires a higher number of wires than the serial one. Employing advanced addressing schemes is a way of reducing the number of wires needed in the parallel access schemes. For example, in the row–column scheme [49], n + m wires are needed for an n × m array of sensors instead of the n · m + 1 wires required in schemes with one common ground [34]. Other approaches dedicated to reducing wiring issues include wireless data and power transmission and the implementation of decentralized pre-processing of tactile signals [24,76].
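The wiring trade-off just described can be made concrete with a short sketch: scanning an n × m resistive matrix row by row needs only n + m lines, at the cost of reading one row at a time. The routine below is a generic illustration of such a scan with stubbed hardware access; it is not the circuit of [49], and the stub function names are hypothetical.

```python
# Row-column scanning of an n x m resistive tactile matrix: n + m wires instead of
# n*m + 1 with individually wired taxels. The hardware access below is stubbed out.

def read_column_adc(col: int) -> float:
    """Stub for reading the ADC attached to one column line (hypothetical hardware)."""
    return 0.0

def energize_row(row: int) -> None:
    """Stub for driving one row line while all other rows are grounded."""
    pass

def scan_matrix(n_rows: int, n_cols: int):
    """Scan the whole array once; returns a list of rows of raw taxel readings."""
    frame = []
    for r in range(n_rows):
        energize_row(r)
        frame.append([read_column_adc(c) for c in range(n_cols)])
    return frame

if __name__ == "__main__":
    n, m = 8, 8
    print(f"row-column wiring: {n + m} lines, direct wiring: {n * m + 1} lines")
    frame = scan_matrix(n, m)
    print(f"scanned {len(frame) * len(frame[0])} taxels in one frame")
```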

Fig. 13. Tactile data glove based on conductive rubber (a) and the tactile information from the data glove during a grasp (b) [54].

Fig. 14. Three-fingered robot hands with tactile sensors: (a) a finger with tactile sensor of the 3-fingered high-speed robot hand [101], (b) assembly of tactile sensing arrays with a robot finger of the Universal robot hand with 3 movable and 2 immovable fingers [35], (c) schematic illustration of a finger of the iHY robot hand with an embedded array of pressure sensors based on digital barometers placed inside the soft paddings of the fingers [42]; (d) schematic illustration of the integration of a multimodal sensing system with a three-fingered robot hand [76].

3.3. Integration steps

One way to integrate tactile sensors in robot hands is to use tactile gloves. A number of tactile data gloves have been designed for use in human grasping applications rather than in autonomous manipulation tasks, e.g. [54]. However, tactile data gloves can be worn on robot hands, as in the Robonaut robot hand [106], capable of sensing 19 points of contact. Commercial tactile data gloves are available from Tekscan [57] and CyberGlove [107]. Fig. 13 shows a tactile data glove based on piezoresistive and conductive fabrics.

A more effective way of integrating tactile sensors is to embed them into the robot hand. The embedding procedure of the tactile sensing skin within the robot hand involves the following steps [44]:

• definition of the surface to be covered, using an available 3D computer-aided-drawing (CAD) model or by means of a 3D scanner;
• manufacturing of the supporting part that holds the tactile sensing PCBs. This part is to be attached to the robot hand. The use of a 3D printer can facilitate the manufacturing procedure. This step is not applicable if integration of fingertip-shaped tactile sensors is required, which involves changing the structure of the finger;
• identification and wiring of the sensing elements;
• gluing the sensing elements down onto the supporting part;
• covering the sensing elements with a flexible material, e.g. silicone rubber. For a specific surface shape, custom molds should be designed.

3.4. Robot hands equipped with tactile sensors

This section presents an overview of manipulation platforms with sensorized artificial hands, developed in the framework of research projects in autonomous manipulation and tactile sensing applications. A list of these platforms is presented in Table 3 and a summary with comments about the different hand/sensor combinations is given in Table 4.

In [49] an 8 × 8 tactile array based on piezoresistive rubber has been attached onto the grippers of the 3-fingered Schunk SDH hand for classifying deformable objects. Outstanding in speed performance, the Lightweight High-Speed Multifingered Hand System [101] integrates a Center-of-Pressure (CoP) sensor for force measurements and a PVDF-based highly sensitive tactile sensor for slip detection, as shown in Fig. 14(a) [39]. The commercial 3-finger Schunk SDH hand [109] with integrated Weiss Robotics piezoresistive tactile sensors [58] incorporates a 14 × 6 array on each distal link and a 14 × 7 array on each middle link. The Universal robot hand [35] has 102 tactels on the fingertips and 70 tactels on the rest of the links. In contrast to the serial connection of sensors present in the iCub skin [44], each tactile array has its own connection with the acquisition board (Fig. 14(b)).

Capacitive arrays from Pressure Profile Systems [66] have been integrated with the PR2 robot grippers [95]. The sensor array on the PR2 robot has tactels in the back and front, on the left and right sides, and finally on the tip. Very sensitive tactile sensors with fibers connected to a capacitive sensor, akin to animal whiskers, have been integrated with the parallel jaw gripper of a humanoid robot platform to explore object surfaces and for human–robot interaction purposes [113]. The Barrett hand [114] has capacitive tactile sensors on the tips, distal links and palm.

The ''TakkTile'' arrays [87] based on barometric measurements have been integrated with the iRobot–Harvard–Yale (iHY) Hand [42]. The hand is covered by an array of 48 tactels on the palm, a 2 × 6 array on the proximal links, and a 2 × 5 array on the distal links with two of the tactels on the tip (Fig. 14(c)).

An optical tactile array of 41 tactels with the ability to measure normal and tangential forces has been placed on the tips of a two-fingered robot system [13]. A multimodal tactile sensing system may require a larger space, especially if optical tactile sensors are incorporated within it. Fig. 14(d) illustrates the robot hand with the multimodal sensing system [76]. Force-torque sensors are placed at the base of each finger, not on the fingertips as in the Shadow hand [104]. The three-axis OptoForce sensors [81] can be installed on the tips of the Barrett Hand [114].

In [117], photo-reflectors have been attached to the three-fingered robot to provide proximity information for preshaping the fingers during grasping. The Seashell effect sensors [93], which also provide proximity information, have been installed on the PR2 robot grippers.

The tactile sensing system for the DLR robot hand–arm system [116] is designed as a large-scale tactile skin using the column–row net structure [115]. The Robonaut hand has tactile feedback through the tactile data glove incorporating piezoresistive technology [106] and QTC technology [75]. Fig. 15(a) shows the Fluidic hand [70] with a modified version of the Weiss [58] sensors. The dexterous Gifu III robot hand (Fig. 15(d)) has a sensing array of 859 taxels (Fig. 15(c)) based on piezoresistive conductive ink [110]. An array of 24 conductive-ink sensing elements in combination with piezoelectric PVDF material has been used in the SKKU II robot hand [88]. The Shadow Hand [98] has different integrated tactile sensors: force/torque sensors (Fig. 11(e)) [30], multimodal BioTac tactile sensors (Fig. 11(d)) [43], 3D-shaped fingertip tactile sensors (Fig. 11(c)) [34], and QTC sensors [74]. The robot hand of the iCub humanoid robot [100] has sensors on the fingertips and palm, but not on the middle and proximal phalanges (Fig. 15(b)).

Table 3
The list of tactile sensors that have been integrated with robot hands. Number of tactels (No.), spatial resolution (Res.), sensitivity (Sens.), dynamic range (Range) and data
acquisition rate (Rate) are provided where possible.
Tactile sensor Robot hand No. of tactels Res./Sens./Range Rate

Piezoresistive sensors
FSR [56] Robonaut data glove [106] 19 5 mm/0.1 N/20 N 1 kHz
Fabric sensor [60] Sensor glove [54] 56 34 mm2 /(0.1–30 N) –
Rubber-based [49] Schunk gripper [108] 8×8 6.25 mm2 /–/250 kPa 100 fps
Rubber-based [39] High-speed 3-fingered hand [101] 17 × 19 3 mm/–/– 10 kHz
Weiss Robotics [53] Schunk sDH [109] (14 × 6) and (14 × 7) 3.5 mm/–/250 kPa 800 fps
3D-shaped sensor [34] Shadow hand 12 5.5 mm/0.03 N/cm2/10 N ∼1 kHz
Rubber-based [35] Universal robot hand [35] 102 on tip 3.6 mm/1 N/– 50 Hz
Gifu hand sensor Gifu hand III [110] 624 ∼4 mm/–/22 N/cm2 10 Hz
Tekscan [57] Shadow hand [94] 349 4 mm/–/345 kPa 200 Hz
FSR [41] Southampton hand [41] 15 – –
ATi Nano17 sensors [61] Shadow hand [98] 5 per finger –/ 3.26 mN/12 N 833 Hz
Weiss Robotics [58] Fluidic FRH-4 hand [111] 14 × 6 3.5 mm/–/250 kPa 230 fps

Capacitive sensors
Icub sensor [103,44] iCub Humanoid robot 12 per tip, 48-palm 7 mm/2.5 fF/kPa/150 kPa 25–250 Hz
PPS sensors [66] PR2 robot grippers [38] 22 4 mm/6.25 mN/7 kPa 24.4 Hz
PPS RoboTouch [66] Allegro robotic hand [64] 24 25 mm2 /7 kPa 30–100 Hz
Dynamic sensor [9] Robotiq gripper [112] 132 –/–12 N 300 Hz
Combined sensor [113] Parallel jaw gripper [113] 16 10 mN Up to 35 kHz
PPS RoboTouch Barrett hand [114] 120 per finger 5 mm/6.25 mN/7 kPa 30–100 Hz

Piezoelectric sensors
PRes. [58] + PVDF [24] 8 DoF fluid hand [70] 4×7 3.5 mm/–/250 kPa ≥1 kHz
PRes. ink + PVDF [88] SKKU hand II [88] 24 on fingertip 0.5 mm/–/– –
Tactile skin [115] DLR hand [116] In Process of Development

Barometric measurements based sensor


Takktile (silicon) [87] iHY robot hand [42] 24 + 48 5 mm/10 mN/4.9 N 50 Hz
BioTac (liquid) [83] Shadow hand [98] 1 per finger –/0.1 N/3 N 1 kHz

QTC tactile sensors


Robonaut sensors Robonaut hand [75] 33 –/0.1 N/10 N –
Piratech [73] Shadow hand [74] 36 3 kN/m2/400 kN/m2 –

Optical tactile sensors


Sensor for MRT [77] Robot manipulator 9 –/0.5 N/5 N 25 fps
3DoF sensor [37] Robot gripper 41 3 mm/0.08 N/1.8 N 10 Hz
Optoforce [81] Barret hand 1 per finger 10 mm/–/10 N –

Multi-modal tactile sensors


Proximity sensor [90] A three-fingered hand [90] Palm: 5 × 6 10 and 2 cm 1 kHz
BioTac sensor [84] Shadow hand [43] 19 + fluid barometer + thermistor –/∼0.01 N/1:1000 50 Hz; 2 kHz; 50 Hz
Optical + PVDF + Force 3-fingered gripper [76] 324 fibers, 120 PVDF, 3 F/T –/–/4 N 30 fps; 10 kHz; 100 Hz

‘‘Structure-borne sound’’ tactile sensors


Microphone Oxford prosthesis [32] 1 – –
Accelerometer PR2 robot grippers [38] 1 0.15 m/s2 3 kHz
SeaShell effect sensor [93] PR2 robot grippers 1 –/Non/Non 44 kHz

Table 4
Sensors integrated with robot hands: advantages and disadvantages of major approaches.
Hand/Sensor combination | Advantages | Disadvantages
3D-shaped array [34] & Shadow hand; iCub robot fingertip sensor [68] | Multiple points of contact, covers spherical shapes, wires within fingers | Normal force measurements only
Ellipsoid f/t sensor [104] & Shadow hand; OptoForce [81] & Barret hand | Covers spherical shapes, high sensitivity, shear forces | Single point of contact only, wires outside of fingers
BioTac [84] & Shadow hand | Multiple points of contact, high bandwidth, wires inside | Last joint static (20°)
Robonaut glove and hand [106] | Ease of replacement, low cost | Not reliable compared to rigidly attached sensors
Fabric sensor [54] | Ease of replacement, stretchable | Wear and tear off
Tactile sensing array (PPS [66], Tekscan [57], etc.) & any robot hand | Can be easily attached to any flat and cylindrical surfaces | Cannot cover spherical shapes, wiring issues
Weiss Robotics [58] & any robot hand; Takktile [87] & iHY hand | Robust | Flat surfaces only
SeaShell effect sensor (cavity with microphone) & PR2 [93] | Pre-touch sense | Direct contact of the cavity with an object limits forces
Proximity sensor [90] | Pre-grasp sense | Cannot measure very close proximities
Accelerometer at the base of robot grippers [38] | Vibration detection | Interference with electric motor noise
Microphone at the tips of the Oxford hand prosthesis [32] | Vibration detection | No interference with motor noises

Fig. 15. Five-fingered robot hands with tactile sensors: (a) the fluidic robot hand with combined piezoelectric and piezoresistive tactile sensors that can sense high-frequency vibrations due to the absence of electric motors [24], (b) the robot hand of the iCub humanoid robot with tactile sensors on the fingertips and the palm [44], (c) flexible tactile sensing arrays of the SKKU robot hand [88], (d) the SKKU robot hand [88].

Besides the five-fingered robot hands, a number of anthropomorphic robot hands with three fingers and a thumb exist, including the ‘‘Twendy one’’ robot hand covered by capacitive tactile sensing arrays [118] and the ‘‘Allegro’’ robot hand [119] developed by SimLab Co.

3.5. Large area tactile skin

There is a high demand for manipulators and humanoid robots whose whole surface is covered with tactile sensors [18]. Large sensing areas embedded in robotic systems enhance human–robot interaction and are important for safety reasons. However, a large-area tactile skin and the concomitant increase in the amount of tactels present challenges with regards to optimal data acquisition and wiring.

The number of sensing tactels should be easily changeable for arbitrary surfaces to enhance the performance of the system. The iCub skin uses flexible triangle patches consisting of 12 sensing tactels each and the off-the-shelf CDC AD7147 [67] (Fig. 2(c)). Up to 16 triangle patches can be connected with each other in series, but only one of them must be connected with the micro-processing unit, which significantly reduces the amount of wires required. However, polling time increases proportionally to the number of serial sensing elements. The iCub skin has been integrated in the child-sized humanoid robot KASPAR [120] and the autonomous humanoid robot NAO. The iCub skin, based on capacitive technology, can sense applied pressure only.

Unlike the capacitive-technology-based iCub skin, which can only sense applied pressure, HEX-O-SKIN measures temperature, vibrations and light touch [121]. Each patch of the HEX-O-SKIN is a hexagonal printed circuit board equipped with proximity sensors, accelerometers, thermistors, and a local controller. Each patch is less than 2 g in weight, 5.1 cm2 in area, and 3.6 mm thick.

A limited number of tactile sensing skins has been integrated in robotic manipulators for applications that require tactile feedback, as in safe human–robot interaction. An example of an industrial manipulator covered with an array of capacitive proximity sensors is described in [21]. A commercial industrial manipulator that incorporates 118 proximity sensors is shown in [122]. Research in the design of multi-fingered dexterous robot hands, previously focused on prosthetic hands only, has surged in recent years. Various dexterous robot hands were developed in research laboratories and became commercially available [74,118,119].

Fig. 16. Tactile sensing techniques. Tactile sensing in robot hands is used for object recognition, tactile servoing, force control and for assessing grasp stability.

4. Computational techniques in tactile sensing applications

In the robotics literature, tactile feedback has been widely used for telemanipulation, haptic devices, and legged robots [123]. In event-driven manipulation, tactile signals have been used for detection of the current manipulation phase (contact/no contact, rolling, sliding) [124]. The use of tactile information for object exploration and recognition, material classification, and slip prediction has recently become rather popular, as reflected in [40,95,23,19].

In robot hand applications, tactile signals are used to recognize objects, control forces, grasp objects, and to servo surfaces (Fig. 16). Each of these applications will be discussed in the following sections. The major computational techniques used in these applications are illustrated in Fig. 17. As discussed in Section 2.2, different tactile sensor types have different sensing quantities, including force vectors, vibrations, and contact patterns. These quantities are then subjected to various computational techniques. The same computational technique may be used in a number of applications, as illustrated in the latter figure.

4.1. Grasp stability and slip detection

Grasping is one of the basic skills service robots and industrial manipulators are expected to have. Before performing a grasping procedure, a robot must plan the grasp. Grasping is a complex process for robot hands even if object parameters such as shape, position and physical properties are known. When the properties are known, analytical approaches involving force and form closures can be employed to perform grasping [125]. In unstructured environments, object parameters are uncertain, which makes the grasping task even more difficult and presents a big challenge for grasp stability approaches. A detailed review of all grasping techniques is out of the scope of this paper and can be found in previous papers [126,127].

In some approaches, the robot grasping procedure can be simplified by using proximity sensors on fingertips [128]. There are two main approaches of robot grasping that involve tactile feedback. One approach treats grasping as a control problem and does not consider hand kinematics or assumes simple hands like grippers [39]. Another approach makes use of both model-based grasp

Fig. 17. Overview of computational techniques applied to tactile sensing signals in the reviewed robot hand applications. Each tactile data type is shown on the left. The
computational techniques applied to the tactile signal are shown in the middle. Different applications of sensorized robot hands exploiting these techniques are shown on
oval blocks. Arrows indicate only the major techniques of deriving information.

planning and force feedback to address the problem of grasping 4.1.2. Vibrations as the slip-signals
with dexterous robot hands that have more dof than grippers [25]. Except exploratory procedures such as texture recognition, the
Regarding tactile sensor types and the way of processing the key feature of a stable grasp is the absence of slippage [43]. During
data, there are three different techniques for assessing grasp stabil- slippage or at the moment of contact with the environment, a
ity at the current state-of-the-art: friction cone based techniques, robot hand experiences mechanical vibrations. This phenomenon
vibrations based techniques, and tactile images based techniques. is known as structure-borne sound [130]. The absence of vibration
Each of the technique is discussed in following. frequencies indicates the absence of slippage. Achieving stable
grasp by detecting vibrations has been long implemented in
4.1.1. Friction cone estimation for the slip event hand prosthetic devices [16,131,132]. In order to detect vibrations
The friction coefficient of surfaces and the load conditions are during a slip event, the tactile sensor should have appropriate
very important in grasping. When humans pick up an object, they bandwidth to detect the vibration frequencies (Section 2).
take into account these parameters and adjust grasping forces Piezoelectric materials (Fig. 3) and capacitive sensors (Fig. 2)
based on tactile feedback during manipulation. The stability of
have been widely used for detecting vibrations induced by a slip.
a grasp is evaluated by the ratio of normal, Fnorm , to tangential,
These sensors are usually embedded into pressure sensitive tactile
Ftang , reaction forces and static coefficient of friction µf (Fig. 18(a)).
arrays. Signals coming from each sensor represent high-frequency
Maintaining objects within the friction cone, to preclude slippage,
oscillations (Fig. 8(b)) and are sampled at a high sampling rate.
is ensured by the following condition [125]: 1 < µf × Fnorm/Ftang. The
Dynamic tactile signals can be processed directly in time do-
tangential force can be obtained by force/torque (F/T) sensors, for main and in frequency domain. One of the simplest ways of de-
example ATi Nano 17 [61], whereas most of the current pressure
tecting the slippage is to use a high-pass filter (Fig. 18(b)). A given
sensing arrays can measure normal pressure only (Fig. 8(c)). In [86]
level of filtered disturbances indicates a slip-event. In [38] forces
tangential forces are computed by applying a Kalman filter to the
of each cell in capacitive tactile sensing array are subjected to a
data of the pressure sensing arrays of a bio-mimetic tactile sensor.
discrete-time first-order Butterworth high-pass filter with cut-off
The sensor consists of conductive fluid and electrodes placed in
different places of the fingertip. Hence, the sensor does not provide frequency of 5 Hz to mimic fast adaptive (FA-II) human afferents. A
absolute force values. The Kalman filter integrates signals from high-bandwidth accelerometer is used to detect contact between
the electrodes to produce a force output. Other approaches can the object and the environment. The detection of slippage by eval-
rely on dynamic friction models that allow the prediction of an uating the level of high-passed filtered data can be processed at a
incipient slip. For example, using F/T sensors installed on the Barret high rate.
hand [114], Song et al. [27] estimate the coefficients of the dynamic Another computational technique using vibrations is based on
LuGre friction model of a contact with an unknown object through the transformation to the frequency domain and the calculation
two exploratory motions. Break-away friction ratio (BF-ratio) is of the spectrum power, as shown in Fig. 18(c). In [39] pressure
then computed to predict a slippage. Besides the transduction disturbance signals are subjected to discrete wavelet transform
methods mentioned in Section 2.1, heat microflux detectors, which (DWT) [134]. When DWT power exceeds the experimentally
are mainly used for measuring objects’ thermal properties, can be determined threshold, initial slip is detected and the grasping force
used for detecting a slip [129]. The temperature at the contact is increased accordingly. Cutkosky et al. [9] developed a technique
point increases during the slip due to the energy dissipation at the to distinguish between two types of slippage: robot hand/object
presence of friction forces. and object/environment. Acquired data from these two types of
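To make the friction-cone condition of Section 4.1.1 concrete, the following is a minimal illustrative sketch (not taken from any of the reviewed papers) of how a grip controller could adjust the commanded normal force so that the measured tangential load stays inside the friction cone; the safety margin and the function name are assumptions introduced here for illustration.

```python
import numpy as np

def grip_force_for_friction_cone(f_normal, f_tangential, mu, margin=1.2):
    """Return a target normal force that keeps the contact inside the friction cone.

    The contact does not slip while mu * F_norm / F_tang > 1; margin > 1 adds a
    safety factor on top of this condition (Section 4.1.1).
    """
    f_tang = np.linalg.norm(f_tangential)     # magnitude of the tangential load
    if mu <= 0.0:
        raise ValueError("friction coefficient must be positive")
    required = margin * f_tang / mu           # smallest normal force satisfying the cone
    return max(required, f_normal)            # never command less than the current force

# Example: with mu = 0.5 and a 1.0 N tangential load, at least 2.4 N of normal
# force is requested (20% safety margin).
print(grip_force_for_friction_cone(1.0, [0.8, 0.6, 0.0], mu=0.5))
```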

Fig. 18. Data processing steps for the techniques applied in tactile-based stable grasping: (a) slip detection based on static (e.g. friction cone) and dynamic (e.g. LuGre)
contact force models [27]; (b) slip detection based on vibrations that can be recognized in time domain by existence of high-pass filtered tactile data [38]; (c) slip detection
based on vibrations by calculating a spectral power in the frequency (i.e. Fourier transformations) and time–frequency (i.e. wavelet transformations) domains [39,9]; (d)
slip detection based on vibrations that can be recognized in time–frequency domain by extracting and classifying features of transformed signals [24]; (e) grasp stability
estimation based on features from tactile images and hand kinematics [95,25].

slippage were identified by the parameter noted as power-ratio grasp. Table 5 lists the robot hands and tactile sensors that have
classifier, which is calculated by applying Fourier transformation been tested with the above techniques.
and phase shifting in frequency domain. The power-ratio classifier
is the ratio of the spectrum power of the individual tactel to the 4.1.3. Tactile image features for stable grasp estimation
power spectrum of all tactels. Tactile signals are processed in a Data from tactile sensing arrays can be treated as a gray scale
way that mimics the effects of stimuli on human tactile receptors, image (Fig. 8(a)). When an object comes to contact with the tactile
both individually and as an ensemble. Slip is classified by values array, tactile image features of the contact pattern can be extracted
of relative power between individual tactels and the array as an for the further estimation of a stability of a grasp.
ensemble. The first technique introduced in [133] detects the slippage
A further computational technique uses transformation to fre- of an object by analyzing changes of feature points of the tactile
quency domain and then applies principal component analysis image. Data is collected at a sampling rate of 60 Hz from a
(PCA) and machine learning methods (Fig. 18(d)). In [24], in- 44 × 44 array of piezoelectric sensors installed on an industrial
put signal (x[n]) is processed by the Short-Time Fourier Trans- manipulator. Before the actual motion of the grasped object in a
formation (STFT) with window function in a short period of slip-event, there are some feature points that remain on previous
time (w[n]), which provides a two dimensional representa- positions and points that have moved. Ratio of immobile points
tion
in the time–frequency domain: STFT{x[n]} ≡ X(m, ω) = Σ_{n=−∞}^{∞} x[n] w[n − m] e^{−iωn}. The transformed signal is then sub-
jected to PCA and the slip is detected by k-NN (k nearest neighbor) should be fully represented in the tactile image.
classifier. The slip detection techniques demonstrated in the previous
Depending on the transduction type of the sensor, a stable grasp sections can be used in grasping approaches that address the
can be qualitatively assessed from: (1) contact forces [27], (2) vi- grasp as a control problem and do not take into account the hand
brations [9], and (3) tactile contact patterns and hand kinemat- kinematics. For the dexterous robot hands with tactile sensing
ics [25]. Fig. 18 outlines different algorithms and computational arrays, a grasp stability can be estimated by computing tactile
techniques that have been used for achieving and assessing the information together with hand kinematics (Fig. 18(e)).
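The two vibration-based slip-detection routes described above (time-domain high-pass filtering as in [38] and frequency-domain spectral power thresholding as in [39,9]) can be sketched as follows. This is an illustrative implementation under assumed sampling rates, window lengths and thresholds, not the code of the cited works.

```python
import numpy as np
from scipy.signal import butter, lfilter

def highpass_slip_detector(tactel_forces, fs=250.0, cutoff=5.0, threshold=0.05):
    """Time-domain slip detection with a first-order Butterworth high-pass filter.

    tactel_forces: (n_samples, n_tactels) array of forces from a tactile array.
    Returns a boolean array flagging samples whose filtered disturbance exceeds
    the threshold (the 5 Hz cut-off mimics fast-adapting FA-II afferents).
    """
    b, a = butter(N=1, Wn=cutoff / (fs / 2.0), btype="highpass")
    filtered = lfilter(b, a, tactel_forces, axis=0)
    return np.abs(filtered).max(axis=1) > threshold

def spectral_power_slip_detector(signal, fs=10000.0, window=256, threshold=1e-4):
    """Frequency-domain slip detection: threshold the windowed spectral power of vibrations."""
    flags = []
    for start in range(0, len(signal) - window + 1, window):
        segment = signal[start:start + window] * np.hanning(window)
        power = np.mean(np.abs(np.fft.rfft(segment)) ** 2)
        flags.append(power > threshold)
    return np.array(flags)
```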

Table 5
Approaches for the grasp stability estimation based on tactile information. A number of computational techniques for assessing the grasp based on vibration, friction force
model and tactile images are listed in accordance with used tactile sensors and robot hands.
Sensors Robot hands Techniques Ref.

Vibrations
Weiss Robotics + PVDF Fluidic hand [70] Filtering, STFT, PCA, kNN [24]
PPS sensors PR2 gripper Filtering, Grip force control [38]
CoP + PVDF [39] High-speed hand [101] DWT power, force control [39]
Capacitive sensors Robotiq gripper [112] FFT, spectral power, phase shift, slip type detection [9]
Microphone Prosthetic hand Filtering [32]
Accelerometer PR2 robot grippers Filtering, object—world contact detection [38]

Friction force model


BioTac [86] Otto Bock M2 hand Force control, Kalman filter [86]
ATi nano 17 [61] Barret hand LuGre dynamic friction model, break-away ratio [27]

Tactile images
Piezoresistive [58] Schunk 3-finger gripping hand sDH [109] Temporal and static image features + joint angles, HMM and SVM [95]
Capacitive sensor arrays Barret hand Image features + joint angles, SVM [25]
Piezoelectric (16 × 16) Manipulator Image moments, Localized displacement phenomenon, incipient slip [133]

Bekiroglu et al. [95] consider grasp stability as a probability stability from contact patterns. And high sensitivity is essential for
distribution that depends on tactile images acquired from pressure the techniques that rely on estimation of surface friction.
distribution sensing arrays; joint configuration of the hand;
object information (e.g. object shape class) and grasp information 4.2. Tactile object recognition
(e.g. hand pre-shape). Grasp stability is evaluated by analyzing
tactile images and hand configurations based on supervised Object recognition is an important element in human–robot in-
machine learning algorithms. While AdaBoost [135] and Support teraction and autonomous manipulation [150]. In manipulation
Vector Machine (SVM) [136] classifiers are used for one-shot tasks, robotic systems detect, explore and recognize objects. For
recognition at the final step of the grasping procedure, the hidden the detection and recognition tasks robots use their perception
Markov model (HMM) [137] classifier is used for the time-series system. The perception system includes audio, vision and tactile
case. It should be noted that, besides the SVM, Adaboost and kNN subsystems. Information from audio devices – high sensitive mi-
classifying algorithms, other classification, clustering, statistical crophones that can detect micro-vibrations – serves to detect a
learning and data mining algorithms described in [138] can be slip [32] and to recognize textures [151]. Visual information is pro-
used for the grasp stability estimation. Dang et al. [25] developed vided by RGB cameras, stereo cameras, RGB-Depth cameras, laser
a grasping framework that generates grasps; executes and then scanners, and etc. Image information can be sufficient to control
estimates the quality of the grasp and performs hand adjustment a robot in some applications as in visual servoing. However, re-
and local geometry exploration if the grasp is not successful. cent trends show that, even though robotic vision gives a lot of
Grasp stability is estimated from tactile images. Unlike to the information, tactile information about the contact is still neces-
algorithm of Bekiroglu et al. [95], the position of each tactile array is sary as it improves performance of the recognition and manipu-
calculated to determine the configuration of the contacts involved lation tasks [17]. Data from vision may be noisy or even not avail-
in a grasp. Then grasp feature vectors are computed using bag-
able when the robot itself obstructs visibility during manipulation.
of-words model and classified by a supervised SVM classifier. If
Tactile information from end-effector can complement the infor-
a grasp is not successful, the robot adjusts the hand according to
mation acquired from vision for object detection and recognition.
tactile experience database of stable grasps or explore the local
Tactile sensors can provide information about local surface texture,
geometry.
as for example in [79].
Other rather old approach proposed by Kyberd et al. [41] detects
Depending on the sensor type, there are three different
slippage by calculating changes in tactile pattern represented by a
approaches of tactile object recognition (see Table 6 and Fig. 19).
matrix in which the increase of force corresponds to 1, decrease to
The first approach of object identification, a robot hand uses
(−1), and no changes to (0). Slippage and twist are derived then by
multimodal tactile information [43]. Different tactile signals are
summing and subtracting the neighbor elements in the matrix.
combined to identify an object in contact with the sensors. The
The advantages and disadvantages of the above approaches
second approach is based on spectral analysis. The texture of a
are given in Fig. 18. In the case of estimation of grasp stability
surface is identified via vibrations which occur when a tactile
by measuring normal forces, the friction surface must be given
sensor slides over the surface. Oscillations are transformed to
in advance or estimated by tangential force measurements
frequency (time–frequency) domain to detect a different texture
(Fig. 18(a)). Meanwhile, the rest approaches do not require
according to the spectrum of the acquired signal [24]. In the last
this preliminary information about surface. However, the second
approach of the contact pattern recognition, image processing
approach of detecting vibrations by applying a high-pass filter
techniques are applied in order to recognize the shape of the object
(Fig. 18(b)) may suffer from an interference noise coming
from electric motors. This interference can be eliminated by that is in contact with a sensing array [40]. Tactile images can be
transforming temporal signals to the frequency domain and also used to classify deformable and rigid objects [49] and in some
filtering out motor noise harmonics (Fig. 18(b)). These approaches specific cases for texture recognition [79] from a contact print with
are well suited for reactive controllers. But in grasp planning high resolution.
algorithms, information about contact patterns play an essential
role. 4.2.1. Tactile object identification
Regarding the sensor parameters, high temporal resolution is Robot fingers with as many sensing modalities as human fin-
very important for the vibration based techniques and less impor- gertips, for example the multimodal BioTac sensor [84], can iden-
tant for the one based on friction cone estimation. High spatial res- tify an object through its physical properties. In [19], multimodal
olution increases performance of the approach of assessing grasp information is sensed by barometer, thermistor, pressure sensitive
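Before turning to object recognition, the tactile-image-based grasp stability estimation of Section 4.1.3 can be summarized in code. The sketch below, which is an assumption-laden illustration rather than the implementation of [25] or [95], concatenates flattened tactile images with the hand joint angles and trains a supervised SVM classifier on labeled stable/unstable grasps.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def grasp_feature_vector(tactile_images, joint_angles):
    """Concatenate flattened per-finger tactile images with the hand joint angles."""
    flat = [np.asarray(img, dtype=float).ravel() for img in tactile_images]
    return np.concatenate(flat + [np.asarray(joint_angles, dtype=float)])

def train_stability_classifier(feature_vectors, labels):
    """One-shot stable/unstable grasp classification with an RBF-kernel SVM."""
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    clf.fit(np.vstack(feature_vectors), np.asarray(labels))
    return clf
```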

Table 6
Tactile Object Recognition. A number of computational techniques for object identification, texture classification and contact pattern recognition is listed in accordance with
tactile sensor types and robot hands.
Tactile sensors Hands Methods Ref.

Object identification
8 × 8 array and 6 × 14 Weiss sensor Schunk gripper and SDH hand Image moments, k-NN, DTW [49]
BioTac [84] Shadow hand ANN, GMMR, PCA [43]
Capacitive array + microphone Barret hand Multimodal categorization using statistical model [139]

Texture recognition
BioTac sensor [84] Shadow hand FFT, SVM, Bayesian approach [19]
Ati Nano 17 Barret hand friction ratio, FFT, k-NN [140]
Digital accelerometer – STFT, k-NN, SVM [141]
PVDF based Artificial robotic finger from Robotis motors FFT, majority voting, naive Bayes tree (NBTree), naive Bayes, decision trees (J48) [142]
Accelerometer – Feature extraction, SVM, Pitman–Yor process mixture models [143]
GelSight – MSLBP [79]

Contact pattern and shape recognition


PVDF + conductive foam Fluid hand [70] PCA, Moments, k-NN [24]
PPS [66] – RRT, k-means, GMMs, Bag-of-features, PCA, SIFT, MR-8, Polar Fourier [40]
Joystick sensor 6 DoF Manipulator Curvature estimation, surface normal, model-based recognition [144]
Piezoresistive rubber Universal robot hand [35] Multicontact recognition [35]
Tekscan [57] Shadow hand Edge detection, Segmentation, Neural networks [94]
Weiss Robotics sensor DSA 9205 1 DoF Gripper Bag-of-features [145]
Weiss Robotics Schunk [109], ARMAR-IIIb ANN classifier, PCA, SOM [146]
Weiss Robotics robotic manipulator (Phantom Omni) SIFT, k-means, kNN, bag-of-features [147]
Tekscan TM(4256 E) Barret hand PCA, convexity, naive Bayes classifier [148]
GelSight [79] Gripper Localization: Binary Robust invariant scalable keypoints, [149]

Table 7
Types of actions for object identification [43] based on multimodal tactile perception.
Exploratory movements Control variables Feedback signals Sensory information

Pressure Fingertip position Fingertip force Fingertip deformation


Lateral sliding Fingertip velocity & force Fingertip velocity & force Vibrations
Static contact Fingertip position Local deformation Heat flow
Enclosure Hand joint torques Hand joint torques Hand joint positions
Lifting Arm joint position Arm joint position Arm joint forces
Contour following Fingertip position Local contact Fingertip position

for object identification. The pressure movement is used to esti-


mate the flexibility of an object. During the lateral sliding motion,
a tactile sensor can detect the texture of a surface. Temperature is
measured in static contact. The shape of an object can be recog-
nized by calculating the joint angles of the fingers during the en-
closure. By lifting an object, the mass of an object can be estimated.
Finally, the borders of a surface can be recognized by following the
contour. High-passed pressure value of the orthonormal to contact
surface electrode is used to explore an object’s compliance; around
1.47 N of force is applied using torque controllers until reaching
Fig. 19. Tactile object recognition. Tactile data can be used for classification of the steady state. Texture is recognized from vibrations by applying
textures based on spectrum of frequencies that appear during the sliding over similar computational techniques as in vibration sensing for slip
a surface [43]. Object that is in contact with the tactile sensing array can be
detection (see Section 4). Before the sliding motion, the robot end-
recognized from the contact patterns using image processing techniques [40].
When multimodal tactile information is available, objects can be identified through effector is controlled by torque controller. When the desired con-
statistical and probabilistic analysis of multimodal data [19]. tact force is achieved, the torque controller is switched to mixed
position velocity controller to perform the sliding motion. During
liquid, and pattern of electrodes distributed over the entire sur- the sliding motion, the robot gets information about the surface
face of the fingertip. Artificial neural networks (ANN) and Gaus- roughness. The traction of a surface can be measured by compar-
sian mixture model regression (GMMR) are used to extract force ing tangential forces and normal forces. Temperature heat flux is
vectors from an array of electrodes; those vectors are then used to measured by maintaining static contact. After selecting the most
extract traction information. The barometer gives texture informa- informative exploratory movements, objects are classified accord-
tion by analyzing oscillations in the frequency domain. Then the ing to training data.
temperature sensor information is combined with these modali-
In other multi-modal approach [139], a 3D visual sensor,
ties to select exploratory movements to achieve an effective object
recognition procedure. Exploratory movements (Table 7) proposed auditory information acquired by shaking the object, and tactile
in [43] use Bayesian theory to identify the most informative action. images acquired from grasp have been used to identify an object.
Although the procedure of object exploration may involve up to six Statistical model Latent Dirichlet allocation (LDA) is implemented
actions, only the first three exploratory movements have been used for on-line object categorization.

Fig. 20. Computational techniques applied in tactile texture recognition: (a) major flow chart of a texture recognition, including filtering, Fourier Transform and feature
extraction, and classification; (b) Fishel et al. [22] estimate surface roughness by calculating average spectral power Power from N harmonics with amplitude Pac (n) and
surface fineness λ by comparing finger velocity v and frequency f ; (c) Hongbin et al. [140] estimate the dynamic friction model ft /ˆ fn and detect the variation from the
estimated model, where ft and fn are the tangential and normal forces, respectively; (d) Jamali et al. [142] use directly the Fourier components as the feature space for
classification algorithm; (e) Li et al. [79] use a Multi-scale local binary pattern, which is operator for texture classification, for contact pattern recognition exploiting the
GelSight sensor with a high spatial resolution that allow recognition of even very smooth textures.

4.2.2. Texture recognition

Texture recognition, like the vibration-based slip detection techniques of Section 4.1.2, is based on dynamic tactile data and draws on signal processing methods. Most commonly, variations of the sensing value, whether they come from micro-vibration sensors or tactile arrays, are subjected to the Fast Fourier Transformation (FFT). The spectral components, and possibly computed features, are then used in classification algorithms (Fig. 20(a)).

Fishel et al. [22] registered vibrations through the change in pressure of a barometer located within a liquid. In the initial step, signals from the barometer are filtered by a band-pass filter with a bandwidth of 20 to 700 Hz. Then the Fourier Transformation is applied to the signals. The derived spectral components could already be used in the classification algorithm, but would not result in a good estimation of the properties of a surface. To estimate the roughness of a surface, the authors proposed to calculate the spectral power of the pressure variations, Pac(n):

Power = (1/N) Σ_{n=1}^{N} Pac(n)²,   (1)

where N is the total number of harmonics. Meanwhile, spectral centroids, SC, are used to estimate the fineness of the surface:

SC = (Σ_{n=1}^{N} fft(Pac(n))² · f) / (Σ_{n=1}^{N} fft(Pac(n))²),   (2)

where f is a frequency and fft(·) is the FFT. The authors state that these spectral centroids give a better estimation of the fineness than the conventional relationship f = v/λ, in which v and λ are the velocity and the spatial wavelength of the texture/fingerprints, at higher velocities or on finer surfaces respectively. These features, together with the motor current demand used to estimate the traction of a surface, are then implemented in Bayesian classification/exploration (Fig. 20(b)).

In [140], since the authors were limited by the acquisition rate of the sensors, the maximum detectable frequency was 100 Hz. The friction ratio was estimated by use of a six-axis force and torque sensor. Variations of the mean squared error (MSE) between the estimated ratio and the sensor output were then subjected to FFT.
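The spectral features of Eqs. (1) and (2) can be computed directly from an AC-coupled pressure signal and fed to a simple classifier. The snippet below is a minimal sketch under assumed sampling rates and feature choices, in the spirit of the approaches above; it is not the implementation used by the cited authors.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def texture_features(pressure_ac, fs=2200.0):
    """Average spectral power (Eq. (1)) and spectral centroid (Eq. (2)) of an AC pressure signal."""
    amplitudes = np.abs(np.fft.rfft(pressure_ac))          # harmonic amplitudes Pac(n)
    freqs = np.fft.rfftfreq(len(pressure_ac), d=1.0 / fs)
    power = np.mean(amplitudes ** 2)                        # Eq. (1): roughness estimate
    centroid = np.sum(amplitudes ** 2 * freqs) / np.sum(amplitudes ** 2)  # Eq. (2): fineness
    return np.array([power, centroid])

def train_texture_knn(signals, labels, fs=2200.0, k=3):
    """k-NN texture classifier over the two spectral features."""
    X = np.vstack([texture_features(s, fs) for s in signals])
    return KNeighborsClassifier(n_neighbors=k).fit(X, labels)
```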

The FFT resulted in spectral components that can be applied in compared by computing the Euclidean distance, d(I1 , I2 ) pixel by
classifications step. The authors applied k-NN classifier. (Fig. 20(c)). pixel:
Jamali et al. [142] applied high pass filter with cut-off frequency 
of 500 Hz and removed DC (constant) component by use of d(I1, I2) = Σ_x Σ_y |I1(x, y) − I2(x, y)|,   (3)
Zero-Mean Normalization. As in the above approaches the input
signal was transformed to frequency domain. The harmonics that and the distance between two observations, z1 and z2, is calculated
occur during sliding were classified by means of Majority voting by taking into account the distance between fingers, ω, and the
algorithms (Fig. 20(d)). weighting factor, α , that represents the contribution of changes in
Unlike the above approaches, Li et al. [79] recognized the contact patterns and finger distance:
texture as an image through a contact pattern sensing the GelSight d(z1, z2) = α · (d(I1^left, I2^left) + d(I1^right, I2^right)) + (1 − α) · |ω1 − ω2|,   (4)
sensor with a resolution of around 2 microns. The authors proposed
Multi-scale local binary pattern (MLBP) to classify high resolution where I1^left and I1^right stand for tactile images from left and
tactile images (Fig. 20(e)).
Regardless the source of vibrations the applied computational
right fingers of a gripper. The k-means unsupervised clustering
techniques can share common methods. In the approaches
algorithm has been applied to get centers (centroids) of each
proposed by Jamali et al. [142] and Fishel et al. [22], the measured
cluster (c1 . . . ck ). The centroids serve to build a vocabulary for
signal, which are acquired from a piezoelectric and liquid pressure
the bag-of-features approach (Fig. 21(b)). To verify the proposed
sensor, are filtered first and transformed to the frequency domain.
techniques, the authors carried out 830 tactile observations with a
Metrics used for the classification are different in these two
6 × 14 piezoresistive array for 21 different objects.
approaches. While, in the first approach, the authors used the Hongbin et al. [94] applied a three-layer Neural Network to
Fourier components as the metrics for a classifier, in the second classify contact patterns. As in the above approach, an image is
approach, one more step is taken to extract features that represent normalized to the highest value in range zero to one. Then two
surface properties as the metrics for their classifier. The estimation more preprocessing steps of resizing and thresholding operations
of surface properties from the Fourier components rather than were carried out. The operation of resizing from a 5 × 9 to a
using them directly as the metrics give an advantage to the 12 × 20 image has been implemented by linear interpolation. The
exploration procedures, because the extracted features can be used thresholding operation provided at the output a binary image. Both
to choose the next exploratory action (Table 7 which results in a operations have been implemented to enhance the tactile image
higher recognition rate). since the sensor used in the paper was with low spatial resolution.
The vibrations can be also represented by a combination of In order to get features for the classifier, the authors calculated
several variables, as for example, ratio between the normal and the number of repetitions of the same image sub-patterns created
tangential forces [140]. The variation of the proposed metric by sweeping a 3 × 3 pixel-window (Fig. 21(c)). The efficacy of
represents the change of the traction properties. Therefore, the the proposed approach has been verified on the recognition of 4
used metric are not explicitly related to the surface texture, which different shapes, including edge, sphere, ring, and rectangle, with
may result in not perfect recognition process. 40 tests for each shape.
In addition to thresholding, resizing, and normalization of
sensor values during the preprocessing steps, a contact pattern
4.2.3. Contact pattern recognition
could be also normalized spatially (normalization of a contact
Object recognition from tactile arrays uses image processing pose) as was implemented by Göger et al. [24]. The normalization
techniques [40]. Fig. 21(a) outlines the most common steps of a contact pose is performed by means of applying two-
in tactile contact pattern recognition: preprocessing, feature dimensional (p + q)th order image moments, mp,q :
extraction, and classification. As preprocessing steps we consider 
the following operations: spatial filtering, thresholding, and mp,q = xp yq I (x, y), (5)
normalization of sensor output values to the highest one. Image x y
features can be computed from tactile images by applying
where x, y, I (x, y), p, q are the two-dimensional coordinates of
PCA, which results in image moments (i.e. eigenvectors and
each tactel in the image, pressure value, x-order, and y-order,
eigenvalues) that provide information about contact area, center
respectively. The authors carried out PCA to get a reduced matrix
of pressure, and orientation of line in the case of the edge contact
formed by eigenvectors and applied k-NN classifier in a recognition
type. An alternative way of extracting features from a tactile image step. For benchmarking, 7 different contacts, including small point,
is to use Hough transformations [152]. This method is less reliable large point, two-point, full, edge, surface with hole, and waved
in extracting a straight line as stated in [31] (Fig. 22(a)), but can surface, have been acquired with a 4 × 7 array 10 times for training
be effectively applied for the detection of circles in the image. and 10 times for testing. The use of PCA resulted in the matrix
Besides geometrical elements, tactile image processing draws on containing eigenvectors of the size of 112 21(d).
other image processing tools, such as contour detection in order to Pezzementi et al. [40] introduced Moment-Normalized Trans-
achieve identification of more complex shapes (Fig. 22(b)). Rather lation-Invariant descriptor (feature extractor), in which the two-
than extracting features in spatial units, one could extract image dimensional spatial Fourier Transform has been applied to image
features represented in the frequency domain by applying Fourier moments to add invariance to transformations. The authors
Transform. Finally, these features serve as core for classification applied two different clustering algorithms: k-means and GMMs;
algorithms. Scale-invariant feature transformation (SIFT), which is GMMs has shown a higher recognition performance to the
mostly used in computer vision, can be also implemented in tactile detriment of computational time. An image was normalized as
contact processing to extract features. well as in approaches described above. An algorithm similar to
Schneider et al. [145] used the bag-of-features approach rapidly-exploring random trees (RRT) has been implemented in
for tactile pattern recognition. In a preprocessing step, all the exploratory stage. The recognition and exploration techniques
measurements are normalized to the sensor’s maximum response have been tested in simulation of 10 different three-dimensional
to allow the recognition to be invariant to the pressure level: objects with 100 tactile images per object. As in the above
Z ∈ [0; 1]x∗y . Two tactile images noted as I1 (x, y) and I2 (x, y) are preprocessing steps, the images are first normalized, resized by

Fig. 21. Computational techniques applied in tactile contact pattern recognition: (a) major flow chart of contact pattern recognition, including preprocessing, feature
extraction, and classification; (b) Schneider et al. [145] normalize tactile image and calculate Euclidean distance pixel by pixel (c) Hongbin et al. [94], (d) Göger et al. [24], (e)
Pezzementi et al. [40], (f) Drimus et al. [49], (g) Hongbin et al. [148].

factor 2 to enhance the quality of the image due to the low as the features to recognize deformable objects. Then Dynamic
resolution of 4 × 7, and thresholded for calculation of image Time Wrapping (DTW) applied to these features in order to find
moments 21(e). the shortest path between two tactile images from the same tactile
As tactile contact patterns change with the time when a robot sensor in two consequent moments of time. As in the approach
squeezes a deformable object, one could extract a set of features proposed by Schneider et al. [145] and described above, the authors
from a series of images from one tactile array. Drimus et al. [49] calculate the Euclidean distance between two observations z1 , z2 .
proposed to use an explicit estimate of an average pressure: However, the gripping distance is not taken into account and the
distance between two observations is calculated by means of DTW,
1 
applied on the features, when in the former approach the distance
Pavg = I (x, y), (6)
Nx ∗ Ny x y is directly calculated in image space:

in which Nx and Ny are the number of sensing cells in row and d(z1 , z2 ) = DTW (Pavg
1
, Pavg
2
). (8)
column of an array, and an implicit estimate of contact area:
 Similar to Göger et al. [24], the k-NN classifier has been carried
1 
out in a recognition step. It was shown that a robot exploiting the
area = (I (x, y) − Pavg )
2 (7)
Nx ∗ Ny x y above algorithm could distinguish a spoiled fruit from a fresh fruit
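The tactile-image features discussed above (raw image moments of Eq. (5), average pressure of Eq. (6), the implicit contact-area estimate of Eq. (7), and the DTW distance of Eq. (8)) can be sketched as follows. This is an illustrative, self-contained implementation under assumptions about array shapes; it is not the code of the cited works.

```python
import numpy as np

def raw_moment(I, p, q):
    """(p+q)-th order raw image moment of a tactile image I (cf. Eq. (5))."""
    y, x = np.mgrid[0:I.shape[0], 0:I.shape[1]]
    return float(np.sum((x ** p) * (y ** q) * I))

def pressure_features(I):
    """Average pressure (cf. Eq. (6)) and implicit contact-area estimate (cf. Eq. (7))."""
    p_avg = I.sum() / I.size
    area = np.sum((I - p_avg) ** 2) / I.size
    return p_avg, area

def dtw_distance(seq_a, seq_b):
    """Plain dynamic time warping distance between two 1-D feature sequences (cf. Eq. (8))."""
    n, m = len(seq_a), len(seq_b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(seq_a[i - 1] - seq_b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return cost[n, m]
```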

Fig. 22. Contact pattern recognition and feature extraction from a tactile image: (a) Fig. 23. General control framework for dexterous manipulation [157]. The path of
extracted feature of contact edge based on image moments (blue line) and hough information from command through control laws to application on the dexterous
line transform (red line) [31]; (b) geometrical shape derived from the tactile image: hand using object impedance control for rolling manipulation is shown. The
original tactile image on the left and detected contour on the right [133]. (For highlighted blocks can be replaced with tactile servoing control laws.
interpretation of the references to color in this figure legend, the reader is referred
to the web version of this article.)

by applying palpations with a two-fingered gripper and an 8 × 8


sensing array (Fig. 21(f)).
In [148], instead of converting a tactile image to a binary image,
authors apply PCA to a pressure profile and extract orthonormal
eigenvectors in the three-dimensional space. In addition to these
vectors (3 principal axis of a profile), convexity and concavity of
the pressure profile are estimated by comparing a pressure value
at a centroid (center of pressure) and its surrounding area. Then
Fig. 24. Architecture of a tactile servoing controller [158]. The tactile feature
contact patterns are classified by applying naive Bayes approach extraction processes the contact information and provides features to control.
to this set of features. Pressure values are normalized and scaled to Desired contact state, Sd , is compared with the actual contact state, Sa . Sa is derived
the range that is equivalent to that of the number of sensing cells from actual tactile feature, Fa . The errors in the contact state space are transformed
in x and y. Benchmarking has been performed on recognition of 6 to the joint space of a robot.

contact patterns with a 5 × 9 sensing array (Fig. 21(g)).


A rather high precision, 0.14 mm, in localization of an object in In dexterous manipulation tasks, tactile servoing can be
a hand has been achieved by use of GelSight sensor [149] and by assigned to the middle level of the control architecture as proposed
means of Binary Robust invariant scalable keypoints (BRISK). by Okamura et al. [157] for dexterous end-effectors. The authors
Several approaches use the same preprocessing and feature determined three levels of control for dexterous manipulation:
extraction methods as it can be noticed from Fig. 21. When the low-Level is for impedance and active compliance control,
spatial resolution of sensing arrays is not high enough, the contact kinematics, and forces; mid-Level is responsible for manipulation
image is resized and augmented by various interpolation methods phases, transitions between the manipulation phases, and event
(Fig. 21(c), (d)). Then the features can be extracted with a higher (i.e. contact) detection; high-Level is dedicated to planning a task
accuracy. and choosing a grasp (Fig. 23).
In the most of the cases, only the shape of the contact pattern is In the current state-of-the-art, tactile servo schemes have been
of importance and, therefore, pressure values can be normalized. proposed for only planar end-effectors with single tactile array
However, it is not applicable in recognition of deformable objects [31,158]. Objects in the real world can have many different shapes,
(Fig. 21(f)), because pressured values used to estimate an average but the types of contact that have been detected so far are limited
force. to a few particular cases: plane on plane, line on plane, and point
Thresholding operation that result in a binary image improves on plane.
performance of feature extractors and used in the approaches, in An example of a basic tactile servoing control architecture
which two-dimensional PCA is applied to get metrics for classifiers is shown in Fig. 24. The motion planner specifies the desired
(Fig. 21(d), (e)). In the cases when the Euclidean distance between contact state, Sd . Extracted features from the tactile sensor, Fa ,
tactels is used as the metric (Fig. 21(b)) or the three-dimensional are transformed to actual contact state, Sa , when inverse sensor
contact profile is of importance (Fig. 21(g)), the thresholding model is available; then Sa is compared with Sd to generate the
operation cannot be used. error. If an inverse sensor model is not available, then the Tactile
Jacobian is needed to relate the variation in the tactile feature
4.3. Tactile servoing vector to that in the contact state. The tactile servo solver generates
an error, dS, and informs the planner about the adjustment of
Humans often perform tactile servoing actions almost subcon- the desired contact state. The contact model block is dedicated
sciously, for example, when they search in a pocket a key. In cases to transform the changes from the contact feature state to the
when visual information is not available, the motor responses are position of the robot’s end-effector dX in the task space. Finally,
coupled with tactile feedback only. In autonomous robot control the Robot Inverse Jacobian is applied to calculate the robot’s joint
theory, tactile feedback can be used to servo objects. The con- values dθ from the error of the end-effector position in Cartesian
cept of tactile servoing is analogous to image-based visual servoing space. The robot joint angles calculated via tactile feedback are
[153,154]. Robot motion driven by tactile feedback was imple- expressed as follows [159]:
mented by Berger et al. [155] and then extended to tactile servo
concept by Sikka et al. [156] in the early 1990s. However, tactile θ (t + 1) = θ (t ) + dθ ,
servoing is not well investigated, which is partially due to the fact dθ = Jθ−1 dX ,
that tactile perception technology is not as well developed as vi-
Xf − Xa ( t ) (9)
sion technology. A good resolution and a high number of tactels dX = (t < Tseg ),
in tactile array are needed to improve the performance of tactile Tseg − t
servoing algorithms. Xa (t ) = fs−1 (Fa (t )),
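The joint-update rule of Eq. (9), whose symbols are defined in the text that follows, can be written as a single servo step. The sketch below is a minimal illustration assuming that the inverse tactile model producing the actual Cartesian pose Xa from the measured features and the inverse robot Jacobian are already available; it is not the controller of the cited works.

```python
import numpy as np

def tactile_servo_step(theta, x_actual, x_final, t, t_seg, jacobian_inv):
    """One iteration of the position-based tactile servo law of Eq. (9).

    theta: current joint angles; x_actual: Cartesian pose from the inverse tactile
    model fs^-1(Fa); x_final: desired final pose; t, t_seg: current time and
    segment duration; jacobian_inv: inverse robot Jacobian.
    """
    if t >= t_seg:
        return theta                            # segment finished, hold position
    dx = (x_final - x_actual) / (t_seg - t)     # Cartesian correction dX
    dtheta = jacobian_inv @ dx                  # map the error to joint space
    return theta + dtheta
```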

where θ (t ) and θ (t + 1) are the actual and calculated joint angles,


dθ is the error in joint angles, fs−1 (·) is the inverse tactile model,
Jθ−1 (·) is the robot inverse kinematics, Xa is the actual Cartesian
position of the robot end-effector, Xf is the desired final position,
Tseg and t are the period of time within which the robot reaches its
final position and time.
In [31], this tactile servo controller is extended by implement-
ing a selection matrix for combining various servoing tasks, i.e.
following an edge with a tactile array attached to the planar end-
effector or aligning the orientation with a detected edge. The in-
verted task Jacobian was introduced to transform errors in the
space of tactile image features into errors in a motion twist in the
Cartesian space. Unlike the feature extraction through inverse tac-
tile sensor model in the former control scheme, tactile features
in the latter one are extracted using image processing techniques
Fig. 25. Block diagram of position-based force control using tactile sensors [39].
such as PCA and Hough transformations. The extracted tactile im-
The position of a hand is modified according to the measured force and detected
age features are then transformed to motions: slip signal. The slip signal increases the desired force. qd , qˆd , and q are the desired
joint angle, modified desired joint angle, and actual joint angle, respectively. fd and
– positional deviations are mapped into tangential motions in x- f are the desired and actual forces. The desired force increases when the dynamic
and y-directions w.r.t. the sensor frame; sensor detects vibrations.
– normal forces are mapped to motion in z-orientation w.r.t. the
sensor frame;
– rotational error is mapped into rotational velocity around z-axis
w.r.t. the sensor frame.
In [23], tactile information from a fingertip of the iCUB humanoid
robot is used to follow an edge of an object by performing
palpations. It was shown that the robot could follow sophisticated
contours when the recognition of the contact pattern was
performed after each palpation.
As the touch driven control algorithms are important in ex-
ploratory actions, tactile servoing plays an essential role the ex-
ploration of unknown objects. However, due to imperfections of Fig. 26. Surface contour following using the contact sensing finger (control
diagram). [160]. Rangedes (|fd |r ) is a range of the desired force, (|fn |m ) is the measured
tactile sensing arrays, difficulties in integration of these arrays, normal force, (|fn |r ) is the friction cone, (fn ) and (ft ) are the normal and tangential
and computational costs, which impede to process the data to use forces, P is the contact point location, and (J −1 ) is the robot’s inverse kinematics.
within the control loops. Tactile servoing [155] has been imple-
mented on planar end-effectors only [31]. In [38], tactile information derived from one type of tactile
sensors only is used to control both grasping force and slippage.
4.4. Force control with tactile sensors The contact force is calculated according to Eq. (10). During
grasping, the desired force is compared to actual forces of tactile
In the robotics literature, tactile feedback, being mainly used for sensors. In the lift and hold phase, the applied force increases
event-driven manipulation [124], has been rarely employed inside proportionally to the actual force at each instance of the slip signal.
a control loop because of the noisy signals coming from tactile The above force controllers are used to ensure stable grasping.
sensors [8]. Most of the research in force control with dexterous Another application of tactile sensors in force control include the
hands is aimed at controlling the grasping force in order to achieve surface following motion. Following unknown surface is an essen-
a stable grasp (Section 4.1). Li et al. [31] control the normal force to tial task to explore objects. An advanced fingertip sensor with capa-
keep the robot in contact with an object during the edge servoing bility to measure contact point locations, e.g. [104], could be used
task. Force and a Center of Pressure of the contact pattern are to accomplish such type of tasks. When the contact surface of a fin-
estimated by following equations: gertip is known, the contact location with the environment can be
 estimated by measuring forces and torques in SE(3) [160]. A de-
c = f^{-1} \sum_{i,j \in R} f_{ij} c_{ij}, \qquad f = \sum_{i,j \in R} f_{ij},          (10)

where c_{ij} are the discrete coordinates of the tactels, f and f_{ij} are the total force and the force of each tactel, respectively, and R is the set of tactel indices in the x- and y-directions.
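Eq. (10) can be evaluated directly on a tactile image with a few array operations. The following Python sketch is only an illustration, not the implementation of [31]; the tactel pitch and the function interface are assumed values.

    import numpy as np

    # Direct evaluation of Eq. (10) on a tactile image: total force and centre
    # of pressure of a 2D array of per-tactel forces. The tactel pitch (m) is
    # an assumed calibration value.
    def force_and_cop(pressure, pitch=0.005):
        f = pressure.sum()                         # f = sum_ij f_ij
        if f <= 0.0:
            return 0.0, None                       # no contact detected
        rows, cols = np.indices(pressure.shape)
        cx = (pressure * cols * pitch).sum() / f   # c = f^-1 sum_ij f_ij c_ij
        cy = (pressure * rows * pitch).sum() / f
        return f, (cx, cy)

The returned centre of pressure is expressed in the sensor frame and can feed either the servoing law of Section 4.3 or the grasp-force controllers discussed next.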
Teshigawara et al. [39] obtain signals from both a dynamic tactile array and a Center of Pressure (CoP) tactile sensor to control the grasping force (Fig. 25). The desired force fd increases proportionally to the spectral power of the vibrations that occur during the slip. Vibrations are measured by the dynamic tactile sensors. The actual force, measured by the CoP sensor, is then subtracted from the modified desired force. The desired joint angle qd is then calculated according to the force and vibrations.

Fig. 25 (caption fragment). [...] the desired joint angle, modified desired joint angle, and actual joint angle, respectively. fd and f are the desired and actual forces. The desired force increases when the dynamic sensor detects vibrations.

In [38], tactile information derived from only one type of tactile sensor is used to control both grasping force and slippage. The contact force is calculated according to Eq. (10). During grasping, the desired force is compared to the actual forces measured by the tactile sensors. In the lift and hold phase, the applied force increases proportionally to the actual force at each instance of the slip signal.
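A minimal sketch of such a slip-driven adjustment of the grip-force set-point is given below. The sampling rate, frequency band, gains and thresholds are illustrative assumptions and do not reproduce the parameters of [38] or [39].

    import numpy as np

    # Illustrative slip-driven adjustment of the grip-force set-point: the
    # set-point grows in proportion to the current force whenever the spectral
    # power of the dynamic tactile signal in a chosen band exceeds a threshold.
    def update_desired_force(f_desired, f_actual, vibration_window,
                             fs=1000.0, band=(30.0, 200.0),
                             k_slip=0.2, power_threshold=1e-4, f_max=15.0):
        window = vibration_window * np.hanning(len(vibration_window))
        spectrum = np.fft.rfft(window)
        freqs = np.fft.rfftfreq(len(vibration_window), d=1.0 / fs)
        in_band = (freqs >= band[0]) & (freqs <= band[1])
        slip_power = np.sum(np.abs(spectrum[in_band]) ** 2) / len(window)
        if slip_power > power_threshold:            # slip event detected
            f_desired += k_slip * f_actual          # react by gripping harder
        return min(f_desired, f_max)                # saturate for safety

The saturation term only mimics the common practice of bounding the commanded grip force to protect the object and the hand.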
The above force controllers are used to ensure stable grasping. Another application of tactile sensors in force control is the surface-following motion. Following an unknown surface is an essential task for exploring objects. An advanced fingertip sensor capable of measuring contact point locations, e.g. [104], can be used to accomplish this type of task. When the contact surface of a fingertip is known, the contact location with the environment can be estimated by measuring forces and torques in SE(3) [160]. A desired friction cone is given, and if the normal force Fn is smaller than the smallest value of a desired range, the finger moves towards the contact, and vice versa. If Fn is within the range, the finger moves in the sliding direction only. Errors in contact locations and forces are transformed to joint angles q through the Jacobian matrix J (Fig. 26). A result of using this controller to follow a surface with a fingertip of a robot hand is shown in Fig. 27 (red dashed line).

Fig. 26. Surface contour following using the contact sensing finger (control diagram) [160]. (|fd|r) is the range of the desired force, (|fn|m) is the measured normal force, (|fn|r) is the friction cone, (fn) and (ft) are the normal and tangential forces, P is the contact point location, and (J^{-1}) is the robot's inverse kinematics.

Fig. 27. Hand with ellipsoid f/t sensors following a surface (the red dashed line) [104]. (For interpretation of the references to color in this figure legend, the reader is referred to the web version of this article.)
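The normal-force regulation described above can be sketched as a simple Cartesian velocity law. The structure follows the textual description of [160], but the gains, the force band, the sliding velocity and the assumption that contact_normal points away from the surface are choices made only for this example; the mapping to joint space through the hand Jacobian is indicated in a comment.

    import numpy as np

    # One step of surface following with a force-sensing fingertip: keep the
    # measured normal force inside a desired band while sliding along the
    # surface. slide_dir and contact_normal are unit vectors in the base frame,
    # with contact_normal assumed to point from the surface towards the finger.
    def surface_following_step(f_n, f_range, slide_dir, contact_normal,
                               k_f=0.003, v_slide=0.01):
        v = v_slide * slide_dir                      # always slide tangentially
        if f_n < f_range[0]:
            v -= k_f * (f_range[0] - f_n) * contact_normal   # push towards contact
        elif f_n > f_range[1]:
            v += k_f * (f_n - f_range[1]) * contact_normal   # back away
        return v  # Cartesian fingertip velocity; joints: q_dot = pinv(J) @ v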
5. Conclusions and future research

Applications of tactile sensing in autonomous manipulation and research trends in this field over the last two decades have been reviewed. The major tactile sensor types and computational techniques have been discussed. Table 8 summarizes and compares the usefulness of the different tactile data types in the reviewed robot hand applications.
Table 8
Advantages and disadvantages of the major tactile sensor types in the robot hand applications. Advantages and disadvantages are noted as "(+)" and "(−)", respectively.

Two-dimensional pressure distribution (contact pattern):
  Grasp stability: (+) comprehensive information about contact point locations; incipient slip can be detected at a high sampling rate. (−) computationally costly.
  Object recognition: (+) best suited; objects' shapes and deformability can be recognized from contact patterns. (−) no information about surface friction; lower temporal resolution.
  Tactile servoing: (+) best suited; objects' borders, edges and corners can be detected in one action. (−) not as precise and reliable as force/torque sensors.
  Force control: (+) possibility to control the applied pressure rather than the force w.r.t. the contact area. (−) as discussed in Table 2, not reliable.

Dynamic tactile signal (vibrations):
  Grasp stability: (+) best suited; fast response to slippage to ensure grasping stability. (−) no information about static forces and the contact pattern.
  Object recognition: (+) very precise estimation of surface texture during sliding motion. (−) no information about the contact pattern and surface friction.
  Tactile servoing: (+) none. (−) none.
  Force control: (+) reactive force control: as vibrations are detected, a feed-forward force/position term is added. (−) no information about forces.

Force/torque sensor measurements (force vector):
  Grasp stability: (+) traditionally used for achieving stable grasps; perform well in static and quasi-static actions; incipient slip detection. (−) surface properties must be known or well estimated.
  Object recognition: (+) estimation of surface friction. (−) no information about the contact pattern.
  Tactile servoing: (+) precise force control. (−) single contact point; edges cannot be detected in one action.
  Force control: (+) best suited; most precise; the number of controlled DoF can be equal to SE(3). (−) in the case of F/T sensors based on strain gauges, the data suffer from drift.
Despite all the advances in sensor technologies and their integration in robotic hands (described in Sections 2 and 3.1) and the development of new techniques to process and interpret the data provided by them (described in Section 4), there is still a wide scope of investigation in the field of autonomous dexterous manipulation based on tactile sensing. Research to be undertaken in the future includes the following:

(1) Design of dexterous robot hands with integrated tactile sensors. Wires can be routed within the structure of the robot hands to ensure efficient connection of the embedded sensors. The non-linear friction in tendon-driven robot hands and the backlash of the actuators should be reduced to enhance manipulation performance. In the case of electric robotic hands, tactile sensors are usually affected by the noise of the electric motors actuating the fingers [22]. In addition, mechanical vibrations transmitted by the motors can interfere with vibration-based sensors [38] for slippage detection. Therefore, new materials (such as the fluidic hand [24]) and new isolation solutions should be considered in order to remove these problems.

(2) Multimodal object recognition exploiting the geometric model of the robot hands. The presence of multimodal information requires algorithms that combine different perception modalities. Fusion of visual and tactile data can be used for a more precise estimation of the robot's pose in the world [161]. By considering tactile sensing arrays and the structure of the hands, the shape recognition process can be significantly accelerated compared to single-contact tactile shape recognition, where only one tactile sensor explores the object surface. However, most tactile object recognition approaches do not take complex hand geometry into account so far. Moreover, finding common features between the visual and tactile sensing modalities is a key point for a smooth transition from visual-based to tactile-based control during the exploration of unknown objects.

(3) Reactive behavior with dynamic data. Although force control algorithms (for example, hybrid position/force control by Raibert and Craig [162]) have been investigated since the early 1990s, there are a lot of unsolved problems in grasping and autonomous manipulation tasks. In experimental results from the DARPA Autonomous Robotic Manipulation program, Righetti et al. [2] have shown that the robot could not react fast enough to unexpected disturbances, an issue related to reactive behaviors. This problem can be addressed by dynamic tactile sensing, as reviewed in Section 4.1.2. Grasping based on the analysis of vibrations can be applied to address the problem of reactive behavior.

(4) Measurement of tangential forces. Slip detection techniques based on friction force models do not suffer from frequency interference as vibration-based techniques do. However, friction force models require measurements of both the normal and tangential reaction forces, which cannot be retrieved from most pressure-sensing tactile sensors.

(5) In-hand manipulation algorithms for both rigid and deformable objects. In-hand manipulation has become increasingly important in recent years. This task requires a lot of
computational effort, since the number of DoF of the dexterous robot hand is comparable to that of the whole body of a humanoid robot. In-hand object manipulation is not sufficiently addressed by current research and is one of the areas that warrant further investigation.

Future research work in dexterous manipulation should be focused on the investigation of autonomous control algorithms that comprehensively use tactile feedback by applying tactile servoing and force control, as well as on multimodal object recognition and tactile-based stable grasp estimation, in order to enhance the performance of dexterous manipulation and allow robots to operate in the real world with constantly changing environments.

References

[21] S. Navarro, M. Marufo, Y. Ding, S. Puls, D. Goger, B. Hein, H. Worn, Methods for safe human–robot-interaction using capacitive tactile proximity sensors, in: 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS, 2013, pp. 1149–1154. http://dx.doi.org/10.1109/IROS.2013.6696495.
[22] J.A. Fishel, G.E. Loeb, Bayesian exploration for intelligent identification of textures, Front. Neurorobotics 6 (4) (2012). http://dx.doi.org/10.3389/fnbot.2012.00004.
[23] U. Martinez-Hernandez, T. Dodd, L. Natale, G. Metta, T. Prescott, N. Lepora, Active contour following to explore object shape with robot touch, in: World Haptics Conference, WHC, 2013, pp. 341–346. http://dx.doi.org/10.1109/WHC.2013.6548432.
[24] D. Göger, N. Gorges, H. Worn, Tactile sensing for an anthropomorphic robotic hand: Hardware and signal processing, in: IEEE International Conference on Robotics and Automation, ICRA'09, 2009, pp. 895–901. http://dx.doi.org/10.1109/ROBOT.2009.5152650.
[25] H. Dang, P. Allen, Stable grasping under pose uncertainty using tactile feedback, Auton. Robots 36 (4) (2014) 309–330. http://dx.doi.org/10.1007/s10514-013-9355-y.
References [26] H. Liu, X. Song, J. Bimbo, K. Althoefer, L. Senerivatne, Intelligent fingertip
sensing for contact information identification, in: J.S. Dai, M. Zoppi, X. Kong
(Eds.), Advances in Reconfigurable Mechanisms and Robots I, Springer, Lon-
[1] A. Bicchi, Hands for dexterous manipulation and robust grasping: a difficult
don, 2012, pp. 599–608. http://dx.doi.org/10.1007/978-1-4471-4141-9_54.
road toward simplicity, IEEE Trans. Robot. Autom. 16 (6) (2000) 652–662.
[27] X. Song, H. Liu, K. Althoefer, T. Nanayakkara, L. Seneviratne, Efficient break-
http://dx.doi.org/10.1109/70.897777.
away friction ratio and slip prediction based on haptic surface exploration,
[2] L. Righetti, M. Kalakrishnan, P. Pastor, J. Binney, J. Kelly, R. Voorhies, G.
IEEE Trans. Robot. 30 (1) (2014) 203–219. http://dx.doi.org/10.1109/TRO.
Sukhatme, S. Schaal, An autonomous manipulation system based on force
2013.2279630.
control and optimization, Auton. Robots 36 (1–2) (2014) 11–30. http://dx.
[28] K.-C. Nguyen, V. Perdereau, Fingertip force control based on max torque
doi.org/10.1007/s10514-013-9365-9.
adjustment for dexterous manipulation of an anthropomorphic hand, in:
[3] B. Siciliano, O. Khatib (Eds.), Springer Handbook of Robotics, Springer, 2008, 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems,
http://dx.doi.org/10.1007/978-3-540-30301-5. IROS, 2013, pp. 3557–3563. http://dx.doi.org/10.1109/IROS.2013.6696863.
[4] euRobotics aisbl, Robotics 2020. strategic research agenda for robotics in [29] L.D. Harmon, Automated tactile sensing, Int. J. Robot. Res. 1 (2) (1982) 3–32.
Europe. http://www.eu-robotics.net/cms/upload/PDF/SRA2020_0v42b_ http://dx.doi.org/10.1177/027836498200100201.
Printable_.pdf (accessed 26.05.14). [30] J.A.C. Ramon, V. Perdereau, F.T. Medina, Multi-fingered robotic hand planner
[5] B. Gates, A robot in every home, Sci. Am. 296 (1) (2007) 58–65. for object reconfiguration through a rolling contact evolution model, in:
[6] R.S. Johansson, J.R. Flanagan, Coding and use of tactile signals from the 2013 IEEE International Conference on Robotics and Automation, Karlsruhe,
fingertips in object manipulation tasks, Nat. Rev. Neurosci. 10 (5) (2009) Germany, May 6–10, 2013, 2013, pp. 625–630. http://dx.doi.org/10.1109/
345–359. http://dx.doi.org/10.1038/nrn2621. ICRA.2013.6630638.
[7] J.R. Wingert, H. Burton, R.J. Sinclair, J.E. Brunstrom, D.L. Damiano, Tactile [31] Q. Li, C. Schürmann, R. Haschke, H. Ritter, A control framework for tactile
sensory abilities in cerebral palsy: deficits in roughness and object discrim- servoing, in: Robotics: Science and Systems, 2013.
ination, Dev. Med. Child Neurol. 50 (11) (2008) 832–838. http://dx.doi.org/ [32] P.J. Kyberd, M. Evans, S. te Winkel, An intelligent anthropomorphic hand, with
10.1111/j.1469-8749.2008.03105.x. automatic grasp, Robotica 16 (1998) 531–536.
[8] M. Prats, A.P. del Pobil, P.J. Sanz, Robot physical interaction through the com- [33] M. Cutkosky, R. Howe, W. Provancher, Force and tactile sensors, in: B. Sicil-
bination of vision, tactile and force feedback, in: B. Siciliano, O. Khatib (Eds.), iano, O. Khatib (Eds.), Springer Handbook of Robotics, Springer, Berlin, Hei-
Tracts in Advanced Robotics, in: Springer Tracts in Advanced Robotics, vol. delberg, 2008, pp. 455–476. http://dx.doi.org/10.1007/978-3-540-30301-5_
84, Springer, 2013, p. 177. http://dx.doi.org/10.1007/978-3-319-03017-3_3. 20.
[9] M.R. Cutkosky, J. Ulmen, Dynamic tactile sensing, in: R. Balasubramanian, [34] R. Koiva, M. Zenker, C. Schurmann, R. Haschke, H. Ritter, A highly sensitive
V.J. Santos (Eds.), The Human Hand as an Inspiration for Robot Hand Devel- 3D-shaped tactile sensor, in: 2013 IEEE/ASME International Conference on
opment, in: Springer Tracts in Advanced Robotics, vol. 95, Springer-Verlag, Advanced Intelligent Mechatronics, AIM, 2013, pp. 1084–1089. http://dx.
2014, pp. 219–246. http://dx.doi.org/10.1007/978-3-319-03017-3_3. doi.org/10.1109/AIM.2013.6584238.
[10] R. Dahiya, M. Valle, Tactile sensing technologies, in: Robotic Tactile Sens- [35] W. Fukui, F. Kobayashi, F. Kojima, H. Nakamoto, N. Imamura, T. Maeda, H.
ing, Springer, Netherlands, 2013, pp. 79–136. http://dx.doi.org/10.1007/ Shirasawa, High-speed tactile sensing for array-type tactile sensor and object
978-94-007-0579-1_5. manipulation based on tactile information, J. Robot. (2011).
[11] T. Sekitani, U. Zschieschang, H. Klauk, T. Someya, Flexible organic transistors [36] D. Gunji, Y. Mizoguchi, S. Teshigawara, A. Ming, A. Namiki, M. Ishikawaand,
and circuits with extreme bending stability, Nat. Mater. 9 (12) (2010) M. Shimojo, Grasping force control of multi-fingered robot hand based on
1015–1022. slip detection using tactile sensor, in: IEEE International Conference on
[12] M. Kaltenbrunner, T. Sekitani, J. Reeder, T. Yokota, K. Kuribara, T. Tokuhara, Robotics and Automation, 2008. ICRA 2008. 2008, pp. 2605–2610. http://dx.
M. Drack, R. Schwödiauer, I. Graz, S. Bauer-Gogonea, et al., An ultra- doi.org/10.1109/ROBOT.2008.4543605.
lightweight design for imperceptible plastic electronics, Nature 499 (7459) [37] H. Yussof, M. Ohka, H. Suzuki, N. Morisawa, Tactile sensing-based control
(2013) 458–463. system for dexterous robot manipulation, in: S.-I. Ao, B. Rieger, S.-S. Chen
[13] H. Yousef, M. Boukallel, K. Althoefer, Tactile sensing for dexterous in-hand (Eds.), Advances in Computational Algorithms and Data Analysis, in: Lec-
manipulation in robotics—a review, Sensors Actuators A 167 (2) (2011) ture Notes in Electrical Engineering, vol. 14, Springer, Netherlands, 2009,
171–187. http://dx.doi.org/10.1016/j.sna.2011.02.038. Solid-state Sensors, pp. 199–213. http://dx.doi.org/10.1007/978-1-4020-8919-0_15.
Actuators and Microsystems Workshop. [38] J. Romano, K. Hsiao, G. Niemeyer, S. Chitta, K. Kuchenbecker, Human-inspired
[14] P. Puangmali, K. Althoefer, L. Seneviratne, D. Murphy, P. Dasgupta, State-of- robotic grasp control with tactile sensing, IEEE Trans. Robot. 27 (6) (2011)
the-art in force and tactile sensing for minimally invasive surgery, IEEE Sens. 1067–1079. http://dx.doi.org/10.1109/TRO.2011.2162271.
J. 8 (4) (2008) 371–381. http://dx.doi.org/10.1109/JSEN.2008.917481. [39] S. Teshigawara, T. Tsutsumi, S. Shimizu, Y. Suzuki, A. Ming, M. Ishikawa, M.
[15] M.I. Tiwana, S.J. Redmond, N.H. Lovell, A review of tactile sensing Shimojo, Highly sensitive sensor for detection of initial slip and its applica-
technologies with applications in biomedical engineering, Sensors Actuators tion in a multi-fingered robot hand, in: 2011 IEEE International Conference
A 179 (0) (2012) 17–31. on Robotics and Automation, ICRA, 2011, pp. 1097–1102. http://dx.doi.org/
[16] M. Francomano, D. Accoto, E. Guglielmelli, Artificial sense of slip—a review, 10.1109/ICRA.2011.5979750.
IEEE Sens. J. 13 (7) (2013) 2489–2498. http://dx.doi.org/10.1109/JSEN.2013. [40] Z. Pezzementi, E. Plaku, C. Reyda, G. Hager, Tactile-object recognition from
2252890. appearance information, IEEE Trans. Robot. 27 (3) (2011) 473–487. http://
[17] R. Dahiya, G. Metta, M. Valle, G. Sandini, Tactile sensing—from humans to dx.doi.org/10.1109/TRO.2011.2125350.
humanoids, IEEE Trans. Robot. 26 (1) (2010) 1–20. http://dx.doi.org/10.1109/ [41] P.J. Kyberd, P.H. Chappell, Object-slip detection during manipulation using a
TRO.2009.2033627. derived force vector, Mechatronics 2 (1) (1992) 1–13.
[18] R. Dahiya, P. Mittendorfer, M. Valle, G. Cheng, V. Lumelsky, Directions toward [42] L.U. Odhner, L.P. Jentoft, M.R. Claffee, N. Corson, Y. Tenzer, R.R. Ma, M.
effective utilization of tactile skin: A review, IEEE Sens. J. 13 (11) (2013) Buehler, R. Kohout, R.D. Howe, A.M. Dollar, A compliant, underactuated hand
4121–4138. http://dx.doi.org/10.1109/JSEN.2013.2279056. for robust manipulation, Int. J. Robot. Res. (2014). http://dx.doi.org/10.1177/
[19] N. Wettels, J. Fishel, G. Loeb, Multimodal tactile sensor, in: R. Balasubrama- 0278364913514466.
nian, V.J. Santos (Eds.), The Human Hand as an Inspiration for Robot Hand [43] D. Xu, G. Loeb, J. Fishel, Tactile identification of objects using Bayesian explo-
Development, in: Springer Tracts in Advanced Robotics, vol. 95, Springer ration, in: 2013 IEEE International Conference on Robotics and Automation,
International Publishing, 2014, pp. 405–429. http://dx.doi.org/10.1007/ ICRA, 2013, pp. 3056–3061. http://dx.doi.org/10.1109/ICRA.2013.6631001.
978-3-319-03017-3_19. [44] A. Schmitz, P. Maiolino, M. Maggiali, L. Natale, G. Cannata, G. Metta, Methods
[20] M. Lee, H. Nicholls, Review article tactile sensing for mechatronics—a state and technologies for the implementation of large-scale robot tactile sensors,
of the art survey, Mechatronics 9 (1) (1999) 1–31. http://dx.doi.org/10.1016/ IEEE Trans. Robot. 27 (3) (2011) 389–400. http://dx.doi.org/10.1109/TRO.
S0957-4158(98)00045-2. 2011.2132930.
[45] J. Fraden, Handbook of Modern Sensors: Physics, Designs, and Applications, [75] T.B. Martin, R. Ambrose, M. Diftler, R. Platt Jr., M.J. Butzer, Tactile gloves
Springer, 2004. for autonomous grasping with the NASA/DARPA robonaut, in: 2004 IEEE
[46] R. Russell, Robot Tactile Sensing, Prentice Hall, 1990. International Conference on Robotics and Automation, 2004. Proceedings.
[47] S. Teshigawara, S. Shimizu, K. Tadakuma, M. Aiguo, M. Shimojo, M. Ishikawa, ICRA’04, vol. 2, 2004, pp. 1713–1718. http://dx.doi.org/10.1109/ROBOT.
High sensitivity slip sensor using pressure conductive rubber, in: Sensors, 2004.1308071.
2009 IEEE, 2009, pp. 988–991. http://dx.doi.org/10.1109/ICSENS.2009. [76] P. Kampmann, F. Kirchner, Integration of fiber-optic sensor arrays into a
5398213. multi-modal tactile sensor processing system for robotic end-effectors,
[48] J.A. Rogers, T. Someya, Y. Huang, Materials and mechanics for stretchable Sensors 14 (4) (2014) 6854–6876. http://dx.doi.org/10.3390/s140406854.
electronics, Science 327 (5973) (2010) 1603–1607. http://dx.doi.org/10. [77] H. Xie, A. Jiang, H. Wurdemann, H. Liu, L. Seneviratne, K. Althoefer, Magnetic
1126/science.1182383. resonance-compatible tactile force sensor using fiber optics and vision
[49] A. Drimus, G. Kootstra, A. Bilberg, D. Kragic, Design of a flexible tactile sensor, IEEE Sens. J. 14 (3) (2014) 829–838. http://dx.doi.org/10.1109/JSEN.
sensor for classification of rigid and deformable objects, Robot. Auton. Syst. 2013.2281591.
62 (1) (2014) 3–15. http://dx.doi.org/10.1016/j.robot.2012.07.021. New [78] M.K. Johnson, F. Cole, A. Raj, E.H. Adelson, Microgeometry capture using an
Boundaries of Robotics. elastomeric sensor, ACM Trans. Graph. 30 (4) (2011) 46:1–46:8. http://dx.
[50] G. Büscher, R. Koiva, C. Schürmann, R. Haschke, H.J. Ritter, Flexible and doi.org/10.1145/2010324.1964941.
stretchable fabric-based tactile sensor, in: IEEE/RSJ International Conference [79] R. Li, E. Adelson, Sensing and recognizing surface textures using a gelsight
on Intelligent Robots and Systems, IROS 2012, Workshop on Advances in sensor, in: 2013 IEEE Conference on Computer Vision and Pattern Recogni-
Tactile Sensing and Touch based Human–Robot Interaction, 2012. tion, CVPR, 2013, pp. 1241–1247. http://dx.doi.org/10.1109/CVPR.2013.164.
[51] S. Stassi, V. Cauda, G. Canavese, C.F. Pirri, Flexible tactile sensing based on [80] M. Koike, S. Saga, T. Okatani, K. Deguchi, Sensing method of total-internal-
piezoresistive composites: A review, Sensors 14 (3) (2014) 5296–5332. reflection-based tactile sensor, in: World Haptics Conference, WHC, 2011
http://dx.doi.org/10.3390/s140305296. IEEE, 2011, pp. 615–619. http://dx.doi.org/10.1109/WHC.2011.5945556.
[52] R.S. Dahiya, M. Valle, Tactile sensing for robotic applications, in: Sensors, [81] O. LTD., Opto-force sensor. http://www.optoforce.com/3dsensor/ (accessed
Focus on Tactile, Force and Stress Sensors, 2008, pp. 298–304. 10.06.15).
[53] K. Weiss, H. Worn, The working principle of resistive tactile sensor cells, in: [82] Honeywell, Bridge pressure sensor. http://sccatalog.honeywell.com/
IEEE International Conference Mechatronics and Automation, vol. 1, 2005, pdbdownload/images/26pc.smt.series.chart.1.pdf (accessed 12.05.14).
pp. 471–476. http://dx.doi.org/10.1109/ICMA.2005.1626593. [83] J. Fishel, V. Santos, G. Loeb, A robust micro-vibration sensor for biomimetic
[54] G.H. Büscher, R. Kõiva, C. Schürmann, R. Haschke, H.J. Ritter, Flexible fingertips, in: 2nd IEEE RAS EMBS International Conference on Biomedical
and stretchable fabric-based tactile sensor, Robot. Auton. Syst. 63 (2015) Robotics and Biomechatronics, 2008. BioRob 2008. 2008, pp. 659–663.
244–252. http://dx.doi.org/10.1016/j.robot.2014.09.007. http://dx.doi.org/10.1109/BIOROB.2008.4762917.
[55] M. Shimojo, A. Namiki, M. Ishikawa, R. Makino, K. Mabuchi, A tactile sen- [84] SynTouch, The biotac. http://www.syntouchllc.com/Products/BioTac/ (ac-
sor sheet using pressure conductive rubber with electrical-wires stitched cessed 12.05.14).
method, IEEE Sens. J. 4 (5) (2004) 589–596. http://dx.doi.org/10.1109/JSEN. [85] Y. Tenzer, L.P. Jentoft, R.D. Howe, The feel of MEMS barometers: Inexpensive
2004.833152. and easily customized tactile array sensors, Robot. Autom. Mag., IEEE 21 (3)
[56] I. Electronics, Fsr. http://www.interlinkelectronics.com/fsrtech.php (ac- (2014) 89–95. http://dx.doi.org/10.1109/MRA.2014.2310152.
cessed 29.04.14). [86] N. Wettels, A. Parnandi, J.-H. Moon, G. Loeb, G. Sukhatme, Grip control using
[57] Tekscan, Flexiforce. http://www.tekscan.com/flexiforce.html (accessed biomimetic tactile sensing systems, IEEE/ASME Trans. Mechatronics 14 (6)
29.04.14). (2009) 718–723. http://dx.doi.org/10.1109/TMECH.2009.2032686.
[58] W. Robotics Tactile sensors. http://weiss-robotics.de/en/tactile-sensors. [87] TakkTile, Takktile kit. http://www.takktile.com (accessed 10.02.14).
html (accessed 4.04.14). [88] B. Choi, S. Lee, H.R. Choi, S. Kang, Development of anthropomorphic robot
[59] L. Inaba Rubber Company, Conductive rubber. http://www.inaba-rubber.co. hand with tactile sensor: SKKU hand II, in: 2006 IEEE/RSJ International
jp/en/b_products/inastomer/index.html (accessed 29.04.14). Conference on Intelligent Robots and Systems, 2006, pp. 3779–3784. http://
[60] Eeonyx, Piezoresistive fabric sensors. http://www.eeonyx.com/eeontex. dx.doi.org/10.1109/IROS.2006.281763.
php (accessed 29.04.14). [89] T. Kawamura, N. Inaguma, K. Nejigane, K. Tani, H. Yamada, Measurement of
[61] ATi, F/t sensor: Nano17. http://www.ati-ia.com/products/ft/ft_models.aspx? slip, force and deformation using hybrid tactile sensor system for robot hand
id=Nano17 (accessed 9.05.14). gripping an object, Int. J. Adv. Robot. Syst. 10 (2013).
[62] T. Someya, T. Sekitani, Bionic skins using flexible organic devices, in: 2014 [90] H. Hasegawa, Y. Mizoguchi, K. Tadakuma, A. Ming, M. Ishikawa, M. Shimojo,
IEEE 27th International Conference on Micro Electro Mechanical Systems, Development of intelligent robot hand using proximity, contact and slip
MEMS, 2014, pp. 68–71. http://dx.doi.org/10.1109/MEMSYS.2014.6765575. sensing, in: 2010 IEEE International Conference on Robotics and Automation,
[63] H.Kew Lee, J. Chung, S.-I. Chang, E. Yoon, Normal and shear force mea- ICRA, 2010, pp. 777–784. http://dx.doi.org/10.1109/ROBOT.2010.5509243.
surement using a flexible polymer tactile sensor with embedded multiple [91] K. Hosoda, Y. Tada, M. Asada, Anthropomorphic robotic soft fingertip with
capacitors, J. Microelectromech. Syst. 17 (4) (2008) 934–942. http://dx.doi. randomly distributed receptors 54 (2) 104–109. http://dx.doi.org/10.1016/j.
org/10.1109/JMEMS.2008.921727. robot.2005.09.019.
[64] C.A. Jara, J. Pomares, F.A. Candelas, F. Torres, Control framework for dexterous [92] M. Möser, Structure-borne sound, in: Engineering Acoustics, Springer,
manipulation using dynamic visual servoing and tactile sensors’ feedback, Berlin, Heidelberg, 2009, pp. 117–142. http://dx.doi.org/10.1007/
Sensors 14 (1) (2014) 1787–1804. http://dx.doi.org/10.3390/s140101787. 978-3-540-92723-5_4.
[65] B. Heyneman, M. Cutkosky, Biologically inspired tactile classification of [93] L.-T. Jiang, J.R. Smith, Seashell effect pretouch sensing for robotic grasping,
object-hand and object-world interactions, in: 2012 IEEE International in: 2012 IEEE International Conference on Robotics and Automation, ICRA,
Conference on Robotics and Biomimetics, ROBIO, 2012, pp. 167–173. http:// IEEE, 2012, pp. 2851–2858, http://dx.doi.org/10.1109/ICRA.2012.6224985.
dx.doi.org/10.1109/ROBIO.2012.6490961. [94] H. Liu, J. Greco, X. Song, J. Bimbo, L. Seneviratne, K. Althoefer, Tactile image
[66] PPS, Tactile sensors. http://www.pressureprofile.com/products.php (ac- based contact shape recognition using neural network, in: 2012 IEEE Con-
cessed 9.05.14). ference on Multisensor Fusion and Integration for Intelligent Systems, MFI,
[67] AD, Ad7147 technical datasheet. http://www.analog.com/static/ 2012, pp. 138–143. http://dx.doi.org/10.1109/MFI.2012.6343036.
imported-files/Data_Sheets/AD7147.pdf (accessed 9.05.14). [95] Y. Bekiroglu, J. Laaksonen, J.A. Jorgensen, V. Kyrki, D. Kragic, Assessing grasp
[68] P. Maiolino, M. Maggiali, G. Cannata, G. Metta, L. Natale, A flexible and robust stability based on learning and haptic data, IEEE Trans. Robot. 27 (3) (2011)
large scale capacitive tactile system for robots, IEEE Sens. J. 13 (10) (2013) 616–629. http://dx.doi.org/10.1109/TRO.2011.2132870.
3910–3917. http://dx.doi.org/10.1109/JSEN.2013.2258149. [96] Freescale, Miniature i2c digital barometer. http://cache.freescale.com/files/
[69] L. Seminara, M. Capurro, P. Cirillo, G. Cannata, M. Valle, Electromechanical sensors/doc/data_sheet/MPL115A2.pdf (accessed 12.05.14).
characterization of piezoelectric {PVDF} polymer films for tactile sensors in [97] M. Quigley, K. Conley, B. Gerkey, J. Faust, T. Foote, J. Leibs, R. Wheeler, A.Y.
robotics applications, Sensors Actuators A 169 (1) (2011) 49–58. http://dx. Ng, ROS: an open-source robot operating system, in: ICRA Workshop on
doi.org/10.1016/j.sna.2011.05.004. Open Source Software, vol. 3, 2009, p. 5.
[70] S. Schulz, C. Pylatiuk, A. Kargov, R. Oberle, G. Bretthauer, Progress in the [98] Shadowrobot, Shadow dexterous hand. http://www.shadowrobot.com/
development of anthropomorphic fluidic hands for a humanoid robot, in: products/dexterous-hand/ (accessed 12.05.14).
2004 4th IEEE/RAS International Conference on Humanoid Robots, vol. 2, [99] G. Metta, P. Fitzpatrick, L. Natale, YARP: yet another robot platform, Int. J. Adv.
2004, pp. 566–575. http://dx.doi.org/10.1109/ICHR.2004.1442671. Robot. Syst. 3 (1) (2006) 43–48.
[71] C.-H. Chuang, M.-S. Wang, Y.-C. Yu, C.-L. Mu, K.-F. Lu, C.-T. Lin, Flexible [100] G. Metta, G. Sandini, D. Vernon, L. Natale, F. Nori, The iCub humanoid robot:
tactile sensor for the grasping control of robot fingers, in: 2013 International An open platform for research in embodied cognition, in: Proceedings of the
Conference on Advanced Robotics and Intelligent Systems, ARIS, 2013, pp. 8th Workshop on Performance Metrics for Intelligent Systems, PerMIS’08,
141–146. http://dx.doi.org/10.1109/ARIS.2013.6573549. ACM, New York, NY, USA, 2008, pp. 50–56. http://dx.doi.org/10.1145/
[72] T. Zhang, H. Liu, L. Jiang, S. Fan, J. Yang, Development of a flexible 3-d tactile 1774674.1774683.
sensor system for anthropomorphic artificial hand, IEEE Sens. J. 13 (2) (2013) [101] A. Namiki, Y. Imai, M. Ishikawa, M. Kaneko, Development of a high-speed
510–518. http://dx.doi.org/10.1109/JSEN.2012.2220345. multifingered hand system and its application to catching, in: IEEE/RSJ
[73] Peratech, Quantum tunneling composite. http://www.peratech.com (ac- International Conference on Intelligent Robots and Systems, vol. 3, 2003, pp.
cessed 9.05.14). 2666–2671. http://dx.doi.org/10.1109/IROS.2003.1249273.
[74] S.R. Company, Developments in dextrous hands for advanced robotic appli- [102] H. Bruyninckx, P. Soetens, B. Koninckx, The real-time motion control core
cations, in: Automation Congress, 2004. Proceedings. World, vol. 15, 2004, of the Orocos project, in: IEEE International Conference on Robotics and
pp. 123–128. Automation, 2003, pp. 2766–2771.
[103] A. Schmitz, M. Maggiali, L. Natale, B. Bonino, G. Metta, A tactile sensor for [131] A. Cranny, D. Cotton, P. Chappell, S. Beeby, N. White, Thick-film force and
the fingertips of the humanoid robot iCub, in: 2010 IEEE/RSJ International slip sensors for a prosthetic hand, Sensors Actuators A 123–124 (0) (2005)
Conference on Intelligent Robots and Systems, IROS, 2010, pp. 2212–2217. 162–171. eurosensors {XVIII} 2004 The 18th European conference on Solid-
http://dx.doi.org/10.1109/IROS.2010.5648838. State Transducers.
[104] H. Liu, K. Nguyen, V. Perdereau, J. Bimbo, J. Back, M. Godden, L. Seneviratne, [132] J.T. Belter, J.L. Segil, A.M. Dollar, R.F. Weir, Mechanical design and
K. Althoefer, Finger contact sensing and the application in dexterous hand performance specifications of anthropomorphic prosthetic hands: A review,
manipulation 1–17 http://dx.doi.org/10.1007/s10514-015-9425-4. J. Rehabil. Res. Dev. 50 (5) (2013).
[105] Z. Kappassov, Y. Khassanov, A. Saudabayev, A. Shintemirov, H. Varol, Semi- [133] V.A. Ho, T. Nagatani, A. Noda, S. Hirai, What can be inferred from a tactile
anthropomorphic 3D printed multigrasp hand for industrial and service arrayed sensor in autonomous in-hand manipulation? in: 2012 IEEE Inter-
robots, in: 2013 IEEE International Conference on Mechatronics and Au- national Conference on Automation Science and Engineering, CASE, 2012,
tomation, ICMA, 2013, pp. 1697–1702. http://dx.doi.org/10.1109/ICMA. pp. 461–468. http://dx.doi.org/10.1109/CoASE.2012.6386384.
2013.6618171. [134] J.J. Benedetto, Wavelets: Mathematics and Applications, vol. 13, CRC press,
[106] M. Diftler, C. Culbert, R. Ambrose, R. Platt Jr., W. Bluethmann, Evolution of 1993.
[135] A. Vezhnevets, Gml adaboost matlab toolbox. http://graphics.cs.msu.ru/en/
the NASA/DARPA robonaut control system, in: IEEE International Conference
science/research/machinelearning/adaboosttoolbox (accessed 26.05.14).
on Robotics and Automation, vol. 2, 2003, pp. 2543–2548. http://dx.doi.org/ [136] C.-C. Chang, C.-J. Lin, LIBSVM: A library for support vector machines, 2001.
10.1109/ROBOT.2003.1241975. http://www.csie.ntu.edu.tw/∼cjlin/libsvm/ (accessed 26.05.14).
[107] CyberGlove, Cyberglove. http://www.cyberglovesystems.com/ (accessed [137] K. Murphy, Hidden Markov model toolbox for matlab. http://www.cs.ubc.
12.05.14). ca/∼murphyk/Software/HMM/hmm.html (accessed 26.05.14).
[108] Schunk, 2-finger-parallel gripper. http://www.schunk.com/schunk_files/ [138] X. Wu, V. Kumar, J.R. Quinlan, J. Ghosh, Q. Yang, H. Motoda, G.J. McLachlan,
attachments/OM_AU_PG__EN.pdf (accessed 29.04.14). A. Ng, B. Liu, S.Y. Philip, et al., Top 10 algorithms in data mining, Knowl. Inf.
[109] Schunk, 3-finger gripping hand sdh. http://www.schunk.com/schunk_files/ Syst. 14 (1) (2008) 1–37.
attachments/SDH_DE_EN.pdf (accessed 29.04.14). [139] T. Araki, T. Nakamura, T. Nagai, K. Funakoshi, M. Nakano, N. Iwahashi, Online
[110] T. Mouri, H. Kawasaki, K. Yoshikawa, J. Takai, S. Ito, Anthropomorphic robot object categorization using multimodal information autonomously acquired
hand: Gifu hand III, in: Proc. Int. Conf. ICCAS, 2002, pp. 1288–1293. by a mobile robot, Adv. Robot. 26 (17) (2012) 1995–2020. http://dx.doi.org/
[111] I. Gaiser, S. Schulz, A. Kargov, H. Klosek, A. Bierbaum, C. Pylatiuk, R. Oberle, 10.1080/01691864.2012.728693.
T. Werner, T. Asfour, G. Bretthauer, R. Dillmann, A new anthropomorphic [140] H. Liu, X. Song, J. Bimbo, L. Seneviratne, K. Althoefer, Surface material recog-
robotic hand, in: 8th IEEE-RAS International Conference on Humanoid nition through haptic exploration using an intelligent contact sensing finger,
Robots, 2008. Humanoids 2008. 2008, pp. 418–422. http://dx.doi.org/10. in: 2012 IEEE/RSJ International Conference on Intelligent Robots and Sys-
1109/ICHR.2008.4755987. tems, IROS, 2012, pp. 52–57. http://dx.doi.org/10.1109/IROS.2012.6385815.
[112] Robotiq, 3-finger adaptive robot gripper. http://robotiq.com/en/products/ [141] J. Sinapov, V. Sukhoy, R. Sahai, A. Stoytchev, Vibrotactile recognition and
industrial-robot-hand (accessed 12.05.14). categorization of surfaces by a humanoid robot, IEEE Trans. Robot. 27 (3)
[113] P.A. Schmidt, E. Maël, R.P. Würtz, A sensor for dynamic tactile information (2011) 488–497. http://dx.doi.org/10.1109/TRO.2011.2127130.
with applications in human–robot interaction and object exploration, Robot. [142] N. Jamali, C. Sammut, Majority voting: Material classification by tactile
Auton. Syst. 54 (12) (2006) 1005–1014. http://dx.doi.org/10.1016/j.robot. sensing using surface texture, IEEE Trans. Robot. 27 (3) (2011) 508–521.
2006.05.013. http://dx.doi.org/10.1109/TRO.2011.2127110.
[114] Barret, Barret hand. http://www.barrett.com/robot/products-hand.htm (ac- [143] P. Dallaire, P. Giguère, D. Émond, B. Chaib-draa, Autonomous tactile percep-
cessed 12.05.14). tion: A combined improved sensing and Bayesian nonparametric approach,
[115] M. Strohmayr, D. Schneider, The DLR artificial skin step II: Scalability as a Robot. Auton. Syst. 62 (4) (2014) 422–435. http://dx.doi.org/10.1016/j.robot.
prerequisite for whole-body covers, in: 2013 IEEE/RSJ International Confer- 2013.11.011.
ence on Intelligent Robots and Systems, IROS, 2013, pp. 4721–4728. http:// [144] R. Ibrayev, Y.-B. Jia, Recognition of curved surfaces from one-dimensional
dx.doi.org/10.1109/IROS.2013.6697036. tactile data, IEEE Trans. Autom. Sci. Eng. 9 (3) (2012) 613–621. http://dx.doi.
[116] M. Grebenstein, M. Chalon, W. Friedl, S. Haddadin, T. Wimböck, G. Hirzinger, org/10.1109/TASE.2012.2194143.
R. Siegwart, The hand of the DLR hand arm system: Designed for interaction, [145] A. Schneider, J. Sturm, C. Stachniss, M. Reisert, H. Burkhardt, W. Burgard,
Int. J. Robot. Res. 31 (13) (2012) 1531–1555. http://dx.doi.org/10.1177/ Object identification with tactile sensors using bag-of-features, in: IEEE/RSJ
0278364912459209. International Conference on Intelligent Robots and Systems, 2009. IROS
[117] K. Koyama, H. Hasegawa, Y. Suzuki, A. Ming, M. Shimojo, Pre-shaping for 2009. 2009, pp. 243–248. http://dx.doi.org/10.1109/IROS.2009.5354648.
various objects by the robot hand equipped with resistor network structure [146] S. Navarro, N. Gorges, H. Worn, J. Schill, T. Asfour, R. Dillmann, Haptic
proximity sensors, in: 2013 IEEE/RSJ International Conference on Intelligent object recognition for multi-fingered robot hands, in: Haptics Symposium,
Robots and Systems, IROS, 2013, pp. 4027–4033. http://dx.doi.org/10.1109/ HAPTICS, 2012 IEEE, 2012, pp. 497–502. http://dx.doi.org/10.1109/HAPTIC.
IROS.2013.6696932. 2012.6183837.
[118] Twendy-one, Twendy-one robot hand. http://twendyone.com (accessed [147] S. Luo, W. Mou, M. Li, K. Althoefer, H. Liu, Rotation and translation invariant
4.08.14). object recognition with a tactile sensor, in: SENSORS, 2014 IEEE, 2014, pp.
[119] Simlab, Allegro-hand. http://www.simlab.co.kr/Allegro-Hand.htm (accessed 1030–1033. http://dx.doi.org/10.1109/ICSENS.2014.6985179.
4.08.14). [148] H. Liu, X. Song, T. Nanayakkara, L.D. Seneviratne, K. Althoefer, A computa-
[120] K. Dautenhahn, C.L. Nehaniv, M.L. Walters, B. Robins, H. Kose-Bagci, tionally fast algorithm for local contact shape and pose classification using a
N.A. Mirza, M. Blow, Kaspar—a minimally expressive humanoid robot for tactile array sensor, in: 2012 IEEE International Conference on Robotics and
human–robot interaction research, Appl. Bionics Biomech. 6 (3–4) (2009) Automation, ICRA, 2012, pp. 1410–1415. http://dx.doi.org/10.1109/ICRA.
369–397. 2012.6224872.
[121] P. Mittendorfer, G. Cheng, Humanoid multimodal tactile-sensing modules, [149] R. Li, R. Platt, W. Yuan, A. ten Pas, N. Roscup, M. Srinivasan, E. Adelson,
IEEE Trans. Robot. 27 (3) (2011) 401–410. http://dx.doi.org/10.1109/TRO. Localization and manipulation of small parts using gelsight tactile sens-
2011.2106330. ing, in: 2014 IEEE/RSJ International Conference on Intelligent Robots and
[122] Bosch, Apas assistant. http://www.bosch-apas.com/media/en/apas/ Systems, IROS 2014, 2014, pp. 3988–3993. http://dx.doi.org/10.1109/IROS.
microsite_apas/2014_apasassistant.pdf (accessed 26.05.14). 2014.6943123.
[123] S. Nicosia, RAMSETE: Articulated and Mobile Robotics for Services and [150] H. Yan, M.H. Ang, A.N. Poo, A survey on perception methods for human–robot
Technology, vol. 270, Springer, 2001. interaction in social robots, Int. J. Soc. Robot. 6 (1) (2014) 85–119. http://dx.
[124] R. Howe, Tactile sensing and control of robotic manipualtion, J. Adv. Robot. 8 doi.org/10.1007/s12369-013-0199-6.
(3) (1994) 245–261. [151] N. Wettels, G. Loeb, Haptic feature extraction from a biomimetic tactile
[125] D. Prattichizzo, J.C. Trinkle, Grasping, in: B. Siciliano, O. Khatib (Eds.), Springer sensor: Force, contact location and curvature, in: 2011 IEEE International
Handbook of Robotics, Springer, Berlin, Heidelberg, 2008, pp. 671–700. Conference on Robotics and Biomimetics, ROBIO, 2011, pp. 2471–2478.
[126] F. Cordella, L. Zollo, A. Salerno, D. Accoto, E. Guglielmelli, B. Siciliano, Human http://dx.doi.org/10.1109/ROBIO.2011.6181676.
hand motion analysis and synthesis of optimal power grasps for a robotic [152] J. Matas, C. Galambos, J. Kittler, Robust detection of lines using the pro-
hand, Int. J. Adv. Robot. Syst. 11 (2014). gressive probabilistic hough transform, Comput. Vis. Image Underst. 78 (1)
[127] C. Goldfeder, P. Allen, Data-driven grasping, Auton. Robots 31 (1) (2011) (2000) 119–137. http://dx.doi.org/10.1006/cviu.1999.0831.
1–20. http://dx.doi.org/10.1007/s10514-011-9228-1. [153] S. Hutchinson, G. Hager, P. Corke, A tutorial on visual servo control, IEEE
[128] S. Ye, K. Suzuki, Y. Suzuki, M. Ishikawa, M. Shimojo, Robust robotic grasping Trans. Robot. Autom. 12 (5) (1996) 651–670. http://dx.doi.org/10.1109/70.
using ir net-structure proximity sensor to handle objects with unknown 538972.
position and attitude, in: 2013 IEEE International Conference on Robotics [154] S.H. François Chaumette, Visual servoing and visual tracking, in: B. Siciliano,
and Automation, ICRA, 2013, pp. 3271–3278. http://dx.doi.org/10.1109/ O. Khatib (Eds.), Springer Handbook of Robotics, Springer, 2008.
ICRA.2013.6631033. [155] A.D. Berger, P.K. Khosla, Using tactile data for real-time feedback, Int. J. Robot.
[129] D. Accoto, R. Sahai, F. Damiani, D. Campolo, E. Guglielmelli, P. Dario, A slip Res. 10 (2) (1991) 88–102. http://dx.doi.org/10.1177/027836499101000202.
sensor for biorobotic applications using a hot wire anemometry approach, [156] P. Sikka, H. Zhang, S. Sutphen, Tactile servo: Control of touch-driven robot
Sensors Actuators A 187 (0) (2012) 201–208. http://dx.doi.org/10.1016/j. motion, in: Experimental Robotics III, Springer, 1994, pp. 219–233.
sna.2008.07.030. [157] A. Okamura, N. Smaby, M. Cutkosky, An overview of dexterous manipulation,
[130] R. Howe, M. Cutkosky, Sensing skin acceleration for slip and texture percep- in: IEEE International Conference on Robotics and Automation, vol. 1, 2000,
tion, in: 1989 IEEE International Conference on Robotics and Automation, pp. 255–262. http://dx.doi.org/10.1109/ROBOT.2000.844067.
1989. Proceedings, vol. 1, 1989, pp. 145–150. http://dx.doi.org/10.1109/ [158] H. Zhang, N. Chen, Control of contact via tactile sensing, IEEE Trans. Robot.
ROBOT.1989.99981. Autom. 16 (5) (2000) 482–495. http://dx.doi.org/10.1109/70.880799.
Juan-Antonio Corrales received the Doctorate degree in


[159] N. Chen, H. Zhang, R. Rink, Edge tracking using tactile servo, in: 1995 IEEE/RSJ
Automatic Control from the Escuela Politécnica Superior,
International Conference on Intelligent Robots and Systems 95. ‘Human
Universidad de Alicante, Alicante in 2011. He is currently
Robot Interaction and Cooperative Robots’. Proceedings, vol. 2, IEEE, 1995,
an Associate Professor at Institut Francais de Mecanique
pp. 84–89.
[160] J. Back, J. Bimbo, Y. Noh, L. Seneviratne, K. Althoefer, H. Liu, Control a contact Avancee, Clermont-Ferrand, France where he has been
sensing finger for surface haptic exploration, in: 2014 IEEE International giving courses since September 2014 in automatic control,
Conference on Robotics and Automation, ICRA, 2014, pp. 2736–2741. http:// programming and robot control. His research interests
dx.doi.org/10.1109/ICRA.2014.6907251. include human–robot interaction, sensor fusion, robotic
[161] J. Bimbo, L. Seneviratne, K. Althoefer, H. Liu, Combining touch and vision for manipulation.
the estimation of an object’s pose during manipulation, in: 2013 IEEE/RSJ
International Conference on Intelligent Robots and Systems, IROS, 2013, pp.
4021–4026. http://dx.doi.org/10.1109/IROS.2013.6696931.
[162] M.H. Raibert, J.J. Craig, Hybrid position/force control of manipulators, J. Dyn.
Syst. Meas. Control 103 (2) (1981) 126–133. Véronique Perdereau is full professor at UPMC since 2003
and an IEEE senior member. She obtained her Electrical
and Information engineering M.S. in 1987 and Robotics
Zhanat Kappassov is a Ph.D. student at University of Pierre and Automation Ph.D. degree in 1991. She is author
and Marie Curie, Paris, France. He has received the Engi- of more than 100 scientific published papers. She has
neering degree in Radioengineering from the University organized several international conference sessions and
of Control Systems and Radioelectronics, Russian Feder- workshops, reviewed for many international publications
ation in 2011. He has worked at Nazarbayev University, and has been a member of several scientific program
Astana, Kazakhstan in the position of a research assistant. committees. She has been invited to several international
He is currently with Institute of Intelligent Systems and conferences. She is principal investigator in several
Robotics. His research interests include robot hands de- collaborative EU projects. She received in 2013 from the
sign, robotic manipulation and robot control. French Ministry of Research and Education the price ‘‘Etoiles de l’Europe’’ for the
coordination of the HANDLE EU project and was appointed Knight of Legion of
Honor in 2014.
