Abstract-
Drones are now widely used around the world for a variety of purposes,
including aerial videography, photography, and surveillance. A simple gesture
controller can make the task of piloting much easier. In our implementation,
the Leap Motion Controller is used to recognize hand gestures, and as a result
we can control the motion of the drone with simple movements of the human hand.
The aim of this project is to capture all the points of the hand, recognize the
gesture they form, and control the drone with that gesture. From a hardware
perspective, the Leap Motion Controller is an eight-by-three-centimeter unit
comprising two stereo cameras and three infrared LEDs. This paper proposes a
hand gesture recognition scheme explicitly targeted at the Leap Motion camera.
The two stereo cameras and the three infrared LEDs track infrared light with a
wavelength of about 850 nanometers, which is outside the visible spectrum. The
main advantage of this system is that capturing the gestures lets the user
control the drone without any remote.

Keywords: Leap Motion
Controller, LEDs, Gesture recognition, Leap camera.


 

I.                  
INTRODUCTION

Drones
nowadays are widely used around the world for a variety of purposes, including
aerial videography, photography, and surveillance. A simple gesture controller
can make the task of piloting much easier. In this study, we present our
implementation of using a motion controller to control the motion of a drone
via simple human gestures. In our implementation, the Leap Motion Controller is
used to recognize hand gestures, and as a result we can control the motion of
the drone with simple movements of the human hand. In recent years, hand
gesture recognition has attracted growing interest due to its applications in
many different fields, such as human-computer interaction, robotics, computer
gaming, and automatic sign-language interpretation. The problem was originally
tackled by the computer vision community by means of images and video.

Dynamic hand
gesture recognition is considered a problem of sequential modeling and
classification. The recent introduction of the Leap Motion device has opened
new opportunities for gesture recognition. Unlike the Kinect, this device is
explicitly targeted at hand gesture recognition and directly computes the
positions of the fingertips and the orientation of the hand.

 

The Leap Motion controller is a small USB
peripheral device designed to be placed on a physical desktop, facing upward;
it can also be mounted onto a virtual-reality headset. Using two monochromatic
IR cameras and three infrared LEDs, the device observes a roughly hemispherical
area, to a distance of about 1 meter. In our implementation, the Leap Motion
Controller is used to recognize hand gestures, and as a result we can control
the motion of the drone with simple movements of the human hand. There are
previous implementations of gesture-based drone control, but here we show how
to capture all the points of the hand and control the drone with a given
gesture.

 

II.               
PAST WORK

Ayanava
Sarkar, Ketul Arvindbhai Patel, Geet Krishna Capoor, and Ganesh Ram R.K., in
"Gesture Control of Drone Using a Motion Controller", observe that drones are
widely used around the world for a variety of purposes, including aerial
videography, photography, and surveillance. In many cases a skilled pilot is
required to perform these tasks with the drone, which proves to be exorbitant.
A simple gesture controller can make the task of piloting much easier. They
present an implementation of using a motion controller to control the motion of
a drone via simple human gestures, using the Leap as the motion controller and
the Parrot AR.Drone 2.0, an off-the-shelf quadrotor with an on-board Wi-Fi
system [1].

Giulio
Marin, Fabio Dominio, and Pietro Zanuttigh, in "Hand Gesture Recognition with
the Leap Motion and Kinect Devices", propose a novel hand gesture recognition
scheme explicitly targeted at Leap Motion data. An ad-hoc feature set based on
the positions and orientations of the fingertips is computed and fed into a
multi-class SVM classifier in order to recognize the performed gestures. A set
of features is also extracted from the depth map computed from the Kinect and
combined with the Leap Motion features to improve recognition performance. The
recent introduction of novel acquisition devices like the Leap Motion and the
Kinect allows a very informative description of the hand pose to be obtained
and exploited for accurate gesture recognition [2].

Wei
Lu, Zheng Tong, and Jinghui Chu, in "Dynamic Hand Gesture Recognition with
Leap Motion Controller", propose a novel feature vector suitable for
representing dynamic hand gestures and present a solution for recognizing
dynamic hand gestures with a Leap Motion controller (LMC) only. The feature
vector, with depth information, is computed and fed into a Hidden Conditional
Neural Field (HCNF) classifier to recognize dynamic hand gestures. The proposed
feature vector, consisting of single-finger features and double-finger
features, has two main benefits [3].

Bing-Yuh
Lu, Chin-Yuan Lin, Shu-Kuang Chang, Yi-Yen Lin, Chun-Hsiang Huang, Hai-Wu Lee,
and Ying-Pyng Lin, in "Bulbs Control in Virtual Reality by Using Leap Motion
Somatosensory Controlled Switches", present Leap Motion somatosensory-controlled
switches implemented with relays. The opening or closing of the switching
circuit is controlled by the sensing of the Leap Motion somatosensory module,
and the virtual switches on the screen are designed as five circular buttons.
The switches were implemented to help people whose hands are damaged and who
cannot operate physical switches well [4].

Kemal
ERDOĞAN, Akif DURDU, and Nihat YILMAZ, in "Intention Recognition Using Leap
Motion Controller and Artificial Neural Networks", note that intention
recognition is an important topic in the field of human-robot interaction. If
a robot is to make counter-movements just in time in response to a human's
actions, the robotic system must recognize the human's intention. They present
a method for a robotic system to estimate the human's intention, in which the
information is provided by the sensor called the Leap Motion controller and the
decision about the tendency of the human's intention is made by an artificial
neural network. To obtain a satisfying result from the ANN classifier, all data
sets are clustered, trained, and tested with the k-fold cross-validation
method, with varied transfer functions, training algorithms, hidden-layer
counts, and iteration counts [5].

III.            
SYSTEM  IMPLEMENTATION

 

A.     
Hardware:

 

·        
Microcontroller:

We
are using the AVR microcontroller ATmega32, a 40-pin IC with four ports and 32
programmable input/output lines. It operates on an 8 MHz crystal. The
microcontroller has an 8-channel, 10-bit ADC and three on-chip timers. It also
contains 1024 bytes of EEPROM and 2 KB of internal SRAM.

The
PC is attached to the microcontroller through a serial connection. The DAC
converts the digital values from the sensors to analog, and the values are then
given to the remote unit.
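As an illustration of the serial link described above, the recognized gesture
could be packed into a single command byte before being sent to the remote
unit. The command names and byte codes below are hypothetical; the paper does
not specify the wire format.

```python
# Hypothetical single-byte command codes for the PC -> microcontroller
# serial link; these values are illustrative, not taken from the paper.
COMMANDS = {"takeoff": 0x01, "land": 0x02, "left": 0x03,
            "right": 0x04, "forward": 0x05, "back": 0x06}

def encode_command(name):
    """Pack a recognized gesture name into the byte sent over the UART."""
    try:
        return bytes([COMMANDS[name]])
    except KeyError:
        raise ValueError("unknown command: " + name)
```

The resulting byte would then be written to the serial port with whatever UART
library the PC side uses.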

·        
PCF8591P:

The PCF8591P is an 8-bit
A/D and D/A converter in a 16-pin DIP package. It is a single-chip,
single-supply, low-power 8-bit CMOS data-acquisition device with four analogue
inputs, one analogue output, and a serial I2C-bus interface. The functions of
the PCF8591P include analogue input multiplexing, an on-chip track-and-hold
function, 8-bit analogue-to-digital conversion, and 8-bit digital-to-analogue
conversion. The maximum conversion rate is given by the maximum speed of the
I2C bus.
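Because the PCF8591 reports each analogue input as an 8-bit code, converting
between raw codes and voltages is a simple linear mapping. The sketch below
assumes a 3.3 V reference span purely for illustration; the actual span depends
on how VREF and AGND are wired on the board.

```python
def adc_code_to_voltage(code, vref=3.3):
    """Convert an 8-bit PCF8591 ADC code (0..255) to volts.

    vref is an assumed reference span of 3.3 V; use the board's
    actual VREF-to-AGND span in practice.
    """
    if not 0 <= code <= 255:
        raise ValueError("PCF8591 produces 8-bit codes (0..255)")
    return code * vref / 255

def voltage_to_dac_code(volts, vref=3.3):
    """Inverse mapping for the 8-bit DAC output, clipped to range."""
    return max(0, min(255, round(volts * 255 / vref)))
```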

 

Software:

 

·        
mikroC
PRO for AVR: used for coding the microcontroller in
embedded C.

·        
AVRFLASH:
used
to burn the program onto the microcontroller.

·        
NetBeans
IDE 7.1: to create the GUI, registration and user login forms
for the server.

·        
Serialization: to
create the user database.

·        
Express
PCB: to design the PCB layout.

 

B.     
Operation:

The
Leap Motion Controller as shown in Fig.1 is a gesture recognition device which
uses advanced algorithms for such operations. From a hardware perspective, the
Leap Motion Controller is an eight by three centimeter unit, which comprises
two stereo cameras and three infrared LEDs. The two stereo cameras as well as
the three infrared LEDs perform the function of tracking the infrared light
having a wavelength of about 850 nanometers, which is outside the visible light
spectrum.

 

When the palm
is completely parallel to the Leap Motion controller, the device is able to
detect it; because the palm is parallel to the controller, it is recognized as
a single finger, after which the drone can be controlled in the given
direction. The Leap Motion Controller uses its two monochromatic infrared (IR)
stereo cameras and three infrared LEDs to track any hand motion up to a
distance of about 1 meter (about 3 feet) directly above it. Hence, it forms a
hemispherical area above itself with a radius of about 1 meter and recognizes
any hand gesture occurring within that volume.
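The hemispherical interaction zone described above can be expressed as a simple
geometric test. The sketch assumes the controller sits at the origin with the
y-axis pointing up and coordinates in meters; these conventions are ours for
illustration, not the Leap SDK's.

```python
import math

def in_tracking_zone(point, radius=1.0):
    """True when a tracked point (x, y, z), in meters, lies inside the
    roughly hemispherical region of ~1 m radius above the controller."""
    x, y, z = point
    return y >= 0 and math.sqrt(x * x + y * y + z * z) <= radius
```

A point below the device (negative y) or farther than the quoted ~1 m range
would simply be ignored by such a check.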

 

In
this approach, we first read all the points from the Leap sensor. The points
taken from the Leap camera are denoted P1, P2, ..., PN. Scaling the points
beforehand helps with the feature extraction used later in the application. We
then calculate the features of all points, followed by the distance factor:
the distance-vector formula produces the values D1, D2, ..., D16, which are
used for feature detection. The gestures stored in the database are compared
with the distance-vector points, and if a match is found, the resulting gesture
is passed to the hardware to control the drone. Using the cosine-similarity
algorithm, the angle values for the respective points are calculated; the
values are then sorted, the maximum is found, and the gesture is formed from
that maximum value. Finally, the command is given to the hardware, i.e. the
drone, to control it.
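The matching step above can be sketched as follows. This is a minimal
illustration under our own assumptions: the palm-center reference and the
max-distance scaling are hypothetical choices, and the real feature set (the 16
distance values D1..D16) would come from the Leap frame data.

```python
import math

def distance_features(points, palm):
    """Distance features D1..DN: Euclidean distance of each tracked
    point from an assumed palm-center point, scaled by the largest
    distance so the features are invariant to hand size."""
    dists = [math.dist(p, palm) for p in points]
    scale = max(dists) or 1.0
    return [d / scale for d in dists]

def cosine_similarity(a, b):
    """Cosine of the angle between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def match_gesture(features, database, threshold=0.95):
    """Return the stored gesture with the highest cosine similarity,
    or None when no database entry is similar enough."""
    best_name, best_sim = None, threshold
    for name, stored in database.items():
        sim = cosine_similarity(features, stored)
        if sim > best_sim:
            best_name, best_sim = name, sim
    return best_name
```

The matched gesture name would then be translated into the command sent to the
drone; the 0.95 threshold is an illustrative value, not one from the paper.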

The main
objective of this project is to develop an application that uses a 3D camera,
i.e. the Leap sensor, to control a drone. In this paper we write code to
capture the hand gestures seen by the Leap. This approach detects and
calculates a total of 16 points of the hand, which is enough to detect any of
the gestures. The drone responds to each hand gesture and moves accordingly.
The points are matched using the cosine-similarity algorithm: if the gesture
matches a stored gesture, the drone is controlled accordingly.

 

 

 

 

IV.             
FUTURE SCOPE

The system illustrates finding all the points of the hand that make up a
gesture in order to control the drone. Creating gestures and storing them in a
database makes it possible to look up each recorded gesture; if it matches the
stored database, the result is given as output to the hardware, which moves or
controls the drone accordingly. The Leap Motion Controller uses its two
monochromatic infrared (IR) cameras and three infrared LEDs to track any hand
motion up to a distance of about 1 meter (about 3 feet) directly above it. The
relayed hand gestures are converted to linear and angular displacements and
stored in an array. The proposed feature vector, consisting of single-finger
features and double-finger features, has two main benefits.

 

V.                
CONCLUSION AND DISCUSSIONS

With the help
of the Leap Motion Controller, we have been able to move the drone using hand
motion. The drone responds to any hand gesture and moves accordingly. The
controller forms a hemispherical area above itself with a radius of about 1
meter and recognizes any hand motion occurring within that volume. Creating
gestures and storing them in a database makes it possible to look up each
recorded gesture; if it matches the stored database, the result is given as
output to the hardware, which moves or controls the drone accordingly. The
relayed hand gestures are converted to linear and angular displacements and
stored in an array. Hence, it can be concluded that with the help of the Leap
Motion Controller, we can use the drone to perform various tasks such as
aerial videography and acrobatic maneuvers, to name a few. This project
concludes with detecting all 16 points and controlling the drone accordingly
for further applications.

 

References:

[1] Ayanava Sarkar, Ketul Arvindbhai Patel, Geet Krishna Capoor, Ganesh Ram
R.K., "Gesture Control of Drone Using a Motion Controller", ©2016 IEEE.

[2] Giulio Marin, Fabio Dominio, Pietro Zanuttigh, "Hand Gesture Recognition
with the Leap Motion and Kinect Devices", Department of Information
Engineering, University of Padova, ICIP 2014, ©2014 IEEE.

[3] Wei Lu, Zheng Tong, Jinghui Chu, "Dynamic Hand Gesture Recognition with
Leap Motion Controller", IEEE Signal Processing Letters, Vol. 23, No. 9,
September 2016.

[4] Bing-Yuh Lu, Chin-Yuan Lin, Shu-Kuang Chang, Yi-Yen Lin, Chun-Hsiang
Huang, Hai-Wu Lee, Ying-Pyng Lin, "Bulbs Control in Virtual Reality by Using
Leap Motion Somatosensory Controlled Switches", ICACT 2017, February 19-22,
2017.

[5] Kemal ERDOĞAN, Akif DURDU, Nihat YILMAZ, "Intention Recognition Using
Leap Motion Controller and Artificial Neural Networks", ©2016 IEEE.

[6] Guanglong Du, Ping Zhang, Xin Liu, "Markerless Human–Manipulator Interface
Using Leap Motion With Interval Kalman Filter and Improved Particle Filter",
IEEE Transactions on Industrial Informatics, Vol. 12, No. 2, April 2016.

 
