GESTURE CONTROL OF DRONE AND MODULATE ANY DEVICE USING LEAP MOTION

Published 31 May 2020 • vol 13 • no 5


Authors:

Lucky Verma, SRM Institute of Science and Technology, EEE, India

Abstract:

In this project, we present an implementation that uses a motion controller to control a drone with simple human gestures. The Leap Motion controller serves as the motion sensor, paired with a drone for this implementation. A drone is an unmanned aerial vehicle that can be remotely controlled or fly autonomously through software-controlled flight plans in its embedded systems, working in conjunction with onboard sensors and GPS. The drone is connected to the ground station via a 2.4 GHz transmitter/receiver system, and the Leap is connected to the ground station via a serial/USB port. The Leap Motion controller recognizes hand gestures and relays them to the ground station, which runs JavaScript code in the Node.js runtime environment, the platform for this implementation. JavaScript is the programming language used to interact with the drone, convey the simple hand gestures, and map and parse files on the system. In our implementation, we have written JavaScript code that interprets the hand gestures captured by the Leap and transmits them in order to control the motion of the drone.
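The gesture-to-command pipeline described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's actual code: in the real system, hand data would arrive as frames from the Leap Motion JavaScript API (e.g. the `leapjs` Node package's `Leap.loop`), and the resulting commands would be sent to the drone over the 2.4 GHz link. The `handToCommand` helper, its scaling constants, and the frame object shape are all hypothetical, chosen only to show the mapping step in isolation.

```javascript
// Hypothetical sketch of the gesture-to-command mapping layer.
// In the real system, frames come from the leapjs library
// (Leap.loop(frame => ...)); here a hand's pose is modeled as a
// plain object so the mapping logic can run on its own.

// Clamp a value into [-1, 1] so extreme hand motions cannot
// produce out-of-range control inputs.
function clamp(v) {
  return Math.max(-1, Math.min(1, v));
}

// Map one hand's pose to normalized drone controls:
// palm height drives throttle; pitch/roll tilt the drone; yaw rotates it.
// The neutral height and divisor are illustrative, not from the paper.
function handToCommand(hand) {
  const HOVER_HEIGHT_MM = 200; // assumed neutral palm height above the sensor
  return {
    throttle: clamp((hand.palmY - HOVER_HEIGHT_MM) / 100),
    pitch: clamp(hand.pitch),
    roll: clamp(hand.roll),
    yaw: clamp(hand.yaw),
  };
}

// Example: palm raised to 250 mm with a slight forward tilt.
const cmd = handToCommand({ palmY: 250, pitch: -0.3, roll: 0.1, yaw: 2.0 });
console.log(cmd); // yaw is clamped to 1
```

In a real control loop, this mapping would run once per Leap frame, and the ground station would translate the normalized values into the transmitter's channel range before relaying them to the drone.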

Keywords:

Leap Motion, Drone, Tricopter, Johnny Five, Node, Arduino, Servo

References:

[1] Alsheakhali, Mohamed, Ahmed Skaik, Mohammed Aldahdouh, and Mahmoud Alhelou. "Hand Gesture Recognition System." Information & Communication Systems 132 (2011).
[2] Bhuiyan, Moniruzzaman, and Rich Picking. “Gesture-controlled user interfaces, what have we done and what’s next.” Proceedings of the Fifth Collaborative Research Symposium on Security, E-Learning, Internet and Networking (SEIN 2009), Darmstadt, Germany. 2009.
[3] Singha, Joyeeta, and Karen Das. "Hand gesture recognition based on Karhunen-Loeve transform." arXiv preprint arXiv:1306.2599 (2013).
[4] Silva, Eduardo S., et al. "A preliminary evaluation of the leap motion sensor as controller of new digital musical instruments." (2013).
[5] Kainz, Ondrej, and František Jakab. "Approach to Hand Tracking and Gesture Recognition Based on Depth-Sensing Cameras and EMG Monitoring." Acta Informatica Pragensia 3.1 (2014): 104-112.
[6] Han, Jihyun, and Nicolas Gold. "Lessons Learned in Exploring the Leap Motion TM Sensor for Gesture-based Instrument Design." Proceedings of the International Conference on New Interfaces for Musical Expression. 2014.
[7] Potter, L. E., Araullo, J., & Carter, L. (2013, November). The Leap Motion controller: a view on sign language. In Proceedings of the 25th Australian Computer-Human Interaction Conference: Augmentation, Application, Innovation, Collaboration (pp. 175-178). ACM.
[8] Leap Motion: https://www.leapmotion.com, Accessed 9.3.2015.
[9] ROS Leap Motion: https://github.com/warp1337/rosleapmotion, Accessed 5.5.2015.
[10] leap_motion: http://wiki.ros.org/leap_motion, Accessed 5.5.2015.
[11] AR Drone Commands: https://robohub.org/up-and-flying-with-the-ardrone-and-ros-getting-started/, Accessed 5.5.2015.
[12] Autonomy Lab SFU. AR Drone Autonomy: https://github.com/autonomylab/ardrone_autonomy, Accessed 17.3.2015.

Citations:

APA:
Verma, L. (2020). Gesture Control of Drone and Modulate any Device Using Leap Motion. International Journal of Control and Automation (IJCA), ISSN: 2005-4297 (Print); 2207-6387 (Online), NADIA, 13(5), 1-10. doi: 10.33832/ijca.2020.13.5.01.

MLA:
Verma, Lucky. "Gesture Control of Drone and Modulate any Device Using Leap Motion." International Journal of Control and Automation, ISSN: 2005-4297 (Print); 2207-6387 (Online), NADIA, vol. 13, no. 5, 2020, pp. 1-10. IJCA, http://article.nadiapub.com/IJCA/vol13_no5/1.html.

IEEE:
[1] L. Verma, "Gesture Control of Drone and Modulate any Device Using Leap Motion." International Journal of Control and Automation (IJCA), ISSN: 2005-4297 (Print); 2207-6387 (Online), NADIA, vol. 13, no. 5, pp. 1-10, May 2020.