Abstract

In assistive robotics, brain–computer interface (BCI) research aims to understand human intent in order to enhance human–robot interaction and augment human performance. This paper introduces a framework that enables a person with an upper-limb disability to operate an assistive system toward maintaining self-reliance, and discusses its implementation and evaluation. The framework interlinks functional components and establishes a behavioral sequence that operates the assistive system in three stages: action classification, verification, and execution. An action is classified from identified human intent and verified through haptic and/or visual feedback before execution. Human intent is conveyed through facial expressions and verified through head movements. The interlinked functional components are an electroencephalogram (EEG) sensing device, a head-movement recorder, a dual-purpose glove, a visual-feedback environment, and a robotic arm. Five volunteers evaluated the system's ability to recognize facial expressions, the time required to respond using head movements, the effectiveness of vibrotactile feedback in conveying information, and the ability to follow the established behavioral sequence. The evaluation indicates that a personalized training dataset should be used to calibrate facial-expression recognition and to define the response time allowed during verification. Custom vibrotactile effects were effective in conveying system information to the user. The volunteers were able to follow the behavioral sequence and control the system with a success rate of 80.00%, providing confidence to recruit more volunteers to identify and address improvements and to expand the operational capability of the framework.
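The three-stage behavioral sequence described above (classify an action from facial-expression intent, verify it through head movement with haptic/visual feedback, then execute it on the robot) can be pictured as a simple control loop. The sketch below is illustrative only; the callback names (`classify_expression`, `confirm_with_head`, and so on) are hypothetical placeholders, not the paper's actual API:

```python
def run_sequence(classify_expression, give_feedback, confirm_with_head, execute_action):
    """One pass of a classify -> verify -> execute behavioral sequence.

    Each argument is a caller-supplied callback (illustrative, not the
    paper's interface):
      classify_expression() -> action label inferred from EEG facial
                               expression data, or None if nothing detected
      give_feedback(action) -> present the proposed action to the user via
                               vibrotactile and/or visual feedback
      confirm_with_head()   -> True if the user's head movement confirms
                               the action within the allowed response time
      execute_action(action)-> command the robotic arm to perform the action
    """
    # Stage 1: action classification from identified human intent.
    action = classify_expression()
    if action is None:
        return None

    # Stage 2: verification — announce the action, then wait for the
    # user's head-movement confirmation before doing anything.
    give_feedback(action)
    if not confirm_with_head():
        return None  # user rejected or timed out; nothing is executed

    # Stage 3: execution on the assistive robotic arm.
    execute_action(action)
    return action
```

The key design point the sequence enforces is that no robot motion occurs until the classified intent has been explicitly confirmed, which is what makes misclassified facial expressions recoverable rather than hazardous.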
