Abstract

Among the interfaces used in virtual reality systems, haptic interfaces allow users to touch the virtual world with their hands. Traditionally, the user's hand moves the end-effector of a robotic arm. When there is no contact in the virtual world, the robotic arm is passive; when there is contact, the arm restricts the mobility of the user's hand in certain directions. Unfortunately, the passive mode is never completely transparent to the user. Intermittent-contact haptic interfaces instead use industrial robots that move toward the user only when contact needs to be made. Because the user is immersed in the virtual world through a virtual reality head-mounted display (HMD), they cannot perceive the danger of a collision when they change their area of interest in the virtual environment. The objective of this article is to describe four motion strategies that bring the robot to the contact zone as quickly as possible while guaranteeing the user's safety. This work relies on predicting the user's intention from their gaze direction and the position of their dominant hand (the one touching the object), combined with safe points located outside the human workspace. Experiments are performed with a UR5 robot and an HTC Vive tracker system and analyzed with a Pareto front, for an industrial application involving the analysis of materials in a car interior.
