This paper explores a novel human–machine interaction (HMI) paradigm that leverages the sensing, storage, computation, and communication (SSCC) capabilities of mobile devices to provide intuitive interactions with dynamic systems. The paradigm addresses fundamental challenges of human–machine interaction by integrating computer vision, 3D virtual graphics, and touchscreen sensing to develop mobile apps that deliver interactive augmented reality (AR) visualizations. While prior approaches relied on laboratory-grade hardware, e.g., a personal computer (PC) and vision system, to stream video to remote users, our approach exploits the inherent mobility of mobile devices to immerse users in mixed-reality (MR) environments in which the laboratory test-bed and augmented visualizations coexist and interact in real time, promoting immersive learning experiences that do not yet exist in engineering laboratories. By pointing the rear-facing camera of a mobile device at the system from an arbitrary perspective, computer vision techniques retrieve physical measurements that are used to render interactive AR content or to perform feedback control. Future work is expected to examine the potential of this approach for teaching the fundamentals of dynamic systems, automatic control, and robotics through inquiry-based activities with students.
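The abstract does not give implementation details, but the core loop it describes (a camera measurement converted to a physical quantity and fed into a controller) can be sketched minimally. The pixel-to-world mapping, gains, frame rate, and all names below are illustrative assumptions, not taken from the article; a real system would use calibrated camera intrinsics and marker tracking rather than a fixed scale factor.

```python
# Hypothetical sketch of a vision-in-the-loop control step. All parameters
# (scale, origin, gains, frame rate) are assumptions for illustration only.

def pixel_to_world(u, v, scale_m_per_px=0.001, origin=(320, 240)):
    """Map a pixel coordinate to planar world coordinates in meters,
    assuming a calibrated, fronto-parallel camera view of the test-bed."""
    return ((u - origin[0]) * scale_m_per_px,
            (v - origin[1]) * scale_m_per_px)

class PDController:
    """Discrete proportional-derivative controller driven by the
    vision-derived measurement once per camera frame."""
    def __init__(self, kp, kd, dt):
        self.kp, self.kd, self.dt = kp, kd, dt
        self.prev_error = 0.0

    def update(self, measured, reference):
        error = reference - measured
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.kd * derivative

# Example: an object detected at pixel (400, 240), regulated toward x = 0 m,
# with the control loop running at the camera's 30 fps.
x, y = pixel_to_world(400, 240)               # x ≈ 0.08 m right of center
ctrl = PDController(kp=5.0, kd=0.5, dt=1 / 30)
u_cmd = ctrl.update(measured=x, reference=0.0)  # actuation command
```

In practice the measurement stage would be replaced by marker or feature tracking (e.g., with a vision library on the device), but the structure of the loop is the same: detect, convert to physical units, compute control, actuate.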
Using Mobile Devices for Mixed-Reality Interactions with Educational Laboratory Test-Beds
Jared Alan Frank
Department of Mechanical and Aerospace Engineering, New York University Tandon School of Engineering

Vikram Kapila
Department of Mechanical and Aerospace Engineering, New York University Tandon School of Engineering
Jared Alan Frank, a mechanical engineering doctoral student at NYU, is conducting research on the development and evaluation of immersive apps for intuitive interactions with physical systems.
Vikram Kapila (vkapila@nyu.edu), a professor of mechanical engineering at NYU, has research interests in mechatronics, robotics, and STEM education.
Mechanical Engineering. Jun 2016, 138(06): S2-S6 (5 pages)
Published Online: June 1, 2016
Citation
Frank, J. A., and Kapila, V. (June 1, 2016). "Using Mobile Devices for Mixed-Reality Interactions with Educational Laboratory Test-Beds." ASME. Mechanical Engineering. June 2016; 138(06): S2–S6. https://doi.org/10.1115/1.2016-Jun-4