
Gesture Control Drone For Crowd Analysis - Software Requirements Specification (SRS)


Software Requirements Specification (SRS)

A software requirements specification (SRS) is a description of a software system to be developed. It lays out functional and non-functional requirements and may include a set of use cases that describe user interactions that the software must provide.

Introduction

In this project we will control a drone with hand gestures using a Leap Motion controller, and the output images of the drone will be used for crowd analysis, i.e., estimating the number of people in an image.

It is not easy to control a drone with a remote control; only highly trained individuals can do so. When an untrained person tries to control the drone, it often ends in a crash or damage to the drone or to others.

The second problem is estimating the crowd in a specific area. We usually see many estimates of a gathering, all of which vary widely, and nobody knows which one is close to the real figure. Often in accidents, we cannot see the people trapped or gathered at the site.

Background

It is not very easy to control a drone with its provided remote control; the drone sometimes becomes uncontrollable and crashes into walls. Secondly, in a large gathering it is not easy to estimate the number of people; manual counting is slow and unreliable.

Nowadays, at events with large gatherings, news channels often provide crowd estimates that vary widely from channel to channel, and the only method behind these figures is simply counting people or guessing, which is often inaccurate and time-consuming. The crowd estimation application will provide an estimate close to the real crowd size from an image, which will be highly useful for news channels and rescue squads to determine how many people are present in a specific area without getting close to it. The user will be able to bypass the counting process, since it will be done by the application, and give a near-real estimate in no time.

Motivation

The motivation behind this idea is to build an integration that provides ease of use when controlling a drone and estimates the number of people in a given image as fast as possible.

The user of the product should know how to use it, how to interact with it, and how to take full advantage of the application. Users must also know where not to use the product: the environment must be open, i.e., the drone must be flown outdoors, because indoors it can be damaged or an individual can get hurt. The images should be taken from a specific height so that the algorithm works correctly. The user must know the limitations of the crowd estimation algorithm so that neither the user's effort nor the product is wasted.

Purpose

The purpose of this document is to give the reader a more detailed view of the project with the help of diagrams such as the sequence diagram, activity diagram, and class diagram, so the reader can understand the project more clearly.

Intended Audience and Reading Suggestions

The intended audience for this document is as follows:

  • User
  • Stakeholders
  • Developer
  • Designer

The document contains information that could be helpful during development, so for a detailed view the developer is advised to read the document thoroughly.

Product scope

The main scope of the project is to produce an integration for drones that makes them controllable by hand gestures and uses the output images for crowd analysis. The main steps are:

  • Take hand gesture input
  • Map the input onto drone commands
  • Send the intended command to the drone
  • Receive the images from the drone
  • Perform crowd analysis on the images
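The five steps above can be sketched as a simple pipeline. This is a minimal illustration, not the project's implementation: the gesture names, the `GESTURE_TO_COMMAND` table, and the `send_command`/`receive_images`/`estimate_crowd` callables are hypothetical stand-ins for the Leap Motion SDK, the drone SDK, and the crowd analysis algorithm.

```python
# Minimal sketch of the five-step pipeline. The gesture source, drone link,
# and crowd estimator are hypothetical stubs standing in for the real SDKs.

# Step 2: map a recognized gesture name onto a drone command.
GESTURE_TO_COMMAND = {
    "palm_up": "takeoff",
    "palm_down": "land",
    "swipe_left": "move_left",
    "swipe_right": "move_right",
}

def map_gesture(gesture_name):
    """Map the input onto a drone command (None if the gesture is unknown)."""
    return GESTURE_TO_COMMAND.get(gesture_name)

def run_pipeline(gesture_name, send_command, receive_images, estimate_crowd):
    """Steps 1-5: gesture in, command out, images back, crowd counts out."""
    command = map_gesture(gesture_name)                 # step 2
    if command is None:
        return None
    send_command(command)                               # step 3
    images = receive_images()                           # step 4
    return [estimate_crowd(img) for img in images]      # step 5

# Example run with stubbed I/O:
sent = []
counts = run_pipeline(
    "palm_up",
    send_command=sent.append,
    receive_images=lambda: ["frame1", "frame2"],
    estimate_crowd=lambda img: 0,  # placeholder estimator
)
print(sent, counts)  # ['takeoff'] [0, 0]
```

In a real build, each stub would be replaced by a call into the corresponding SDK, but the control flow would stay the same.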

Overall Description

Product Perspective

This project is about controlling a drone with hand gestures and performing crowd analysis on the output images.

Product Functions

  • Login
  • Controlling a drone with gestures
    • Getting gestures from the user
    • Mapping the gestures onto drone signals
    • Passing the signals to the drone
    • Getting output images from the drone
    • Performing the crowd estimation algorithm on the images

User Classes and characteristics

The purpose of the project is that everyone could fly the drone, but the main users would be people who use drones regularly; they would be provided with ease of use.

The second module would be used only by newscasters so that they can cover an event and give a close-to-real estimate of the crowd.

Operating Environment

The system would be compatible with all Windows versions above Windows 7.

The drone would only work if it has a developer SDK available.

Design and implementation constraints

The constraints are as follows:

Working with the hardware

  • Working with hardware is always difficult because it is sensitive and can get damaged, so care is needed

Sending signals to drone

  • As per plan, our drone would already be capable of connecting with the system, but this complicates things: the computer has only one Wi-Fi adapter, which would be connected to the drone, so the PC cannot be connected to any other Wi-Fi network at the same time

Leap motion connectivity

  • To run the Leap Motion device, specific software and drivers are required, and its SDK has to be used

Programming the drone

  • For programming the drone we need a drone that can be programmed, i.e., one that comes with an SDK; otherwise it cannot be integrated with the system

Making the gesture objects

  • Making gesture objects is difficult because a gesture may work on one PC but not on another due to FPS or other issues, so many objects have to be created for the same gesture to minimize errors

Taking the right images

  • The crowd analysis algorithm would work only on a suitable class of images, so the images must be taken correctly

End user constraints

  • The user should be able to understand the software
  • The user should understand how to give gestures to the drone and how it reacts
  • The user should not have any motivation to harm the product
  • The drone should be flown at a suitable height to take images

User documentation

The user would be provided with a manual explaining how the whole system works. Moreover, basic instructions would be given on the packaging, so that an attentive user may not even need the manual.

Assumptions and dependencies

  • The user knows the system and is able to understand it
  • The required software is installed
  • All connections are made
  • All hardware is working correctly
  • The drone is flown with care
  • Images are taken properly

External Interface Requirements

User Interfaces

  • There would be a dashboard where the user can see the whole system and can fly the drone or run the image processing algorithms
  • The image processing module would show its results on an interface
  • When the user selects "fly drone", the interface changes: the Leap Motion detects the hands and shows them to the user, the mapping is shown for interactive feedback, and the signals are sent to the drone
  • A login interface
  • A signup interface

Hardware Interfaces

Hardware of the project

  • PC
    • Requirements: 2 GB RAM, a hard disk large enough to store the images and software, 1 GB GPU recommended
    • Leap Motion device
  • Drone
    • A drone with a provided SDK and a GPS system would be best

Software Interfaces

The software interfaces are the software we will be using:

  • Visual studio 2013 or above
  • Matlab (if used)
  • Linux (if leap motion windows programmability not available)

System Features

This section contains a detailed view of every system feature

Getting the user's hand gestures

The system would get the user's hand gestures through the Leap Motion device and store them in a DBMS table or a text file, from where they could be loaded for further use.

When running, the system would not save these gestures but match the incoming gestures against the stored ones.
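One way the gesture templates could be persisted and reloaded is sketched below with a JSON text file. The file name and the list-of-palm-positions representation are assumptions for illustration, not the project's actual storage format.

```python
import json
import os
import tempfile

def save_gestures(gestures, path):
    """Persist named gesture templates (name -> list of palm positions) to a JSON file."""
    with open(path, "w") as f:
        json.dump(gestures, f)

def load_gestures(path):
    """Load previously stored gesture templates for matching against new input."""
    with open(path) as f:
        return json.load(f)

# Example: store two hypothetical gesture templates, then reload them.
templates = {
    "palm_up":   [[0.0, 1.0, 0.0], [0.0, 2.0, 0.0]],
    "palm_down": [[0.0, -1.0, 0.0], [0.0, -2.0, 0.0]],
}
path = os.path.join(tempfile.gettempdir(), "gesture_templates.json")
save_gestures(templates, path)
assert load_gestures(path) == templates  # round-trips intact
```

A DBMS table would work the same way conceptually: one row per template, loaded into memory at login for matching.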

User System Interaction

The user makes gestures over the Leap Motion device.

The system shows the gestures made by the user on screen.

Functional Requirements

  • Leap motion getting gestures
  • Leap motion passing the gestures to the system
  • System receiving the gestures
  • System understanding the gestures

Mapping the gestures onto command signals

The gestures coming from the Leap Motion would be interpreted by the system and matched against the stored gestures. The system would then know which command to send to the drone, and the command would be sent.
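The matching step could be sketched as a nearest-template lookup: compare the incoming gesture vector against each stored template and pick the closest one within a distance threshold. The template vectors, gesture names, and threshold below are illustrative assumptions, not the project's actual mapping.

```python
import math

# Hypothetical stored gesture templates: gesture name -> reference palm vector.
TEMPLATES = {
    "takeoff": (0.0, 1.0, 0.0),
    "land":    (0.0, -1.0, 0.0),
    "left":    (-1.0, 0.0, 0.0),
    "right":   (1.0, 0.0, 0.0),
}

def match_gesture(vector, threshold=0.5):
    """Return the stored gesture nearest to the incoming vector,
    or None if nothing is within the threshold."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    name, d = min(((n, dist(vector, t)) for n, t in TEMPLATES.items()),
                  key=lambda pair: pair[1])
    return name if d <= threshold else None

print(match_gesture((0.1, 0.9, 0.0)))   # close to the "takeoff" template
print(match_gesture((5.0, 5.0, 5.0)))   # no template nearby -> None
```

The threshold is what lets the system reject noise instead of forcing every frame onto some command.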

User System interaction

Lightweight mapping feedback would be shown to the user so that the user knows whether the given gesture was processed and sent to the drone.

Functional Requirements

  • System would contain all the gestures to match
  • System would contain all the commands to be called

Crowd analysis

The user would select the images on which to apply the algorithms. If more than one image is selected, the crowd estimation algorithm would be performed on each of them; if only one image is selected, crowd analysis would be performed on it directly.

The user would be given the results.
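One common way to estimate a crowd from an image, not necessarily the method this project uses, is to run a people detector and count the surviving boxes after non-maximum suppression, so overlapping detections of the same person are merged. The detection boxes below are synthetic; in practice a trained detector (e.g. a cascade or HOG people detector) would supply them.

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / float(area_a + area_b - inter)

def estimate_crowd(detections, iou_threshold=0.5):
    """Count people from (box, score) detections, merging heavily
    overlapping boxes so each person is counted once."""
    kept = []
    for box, score in sorted(detections, key=lambda d: -d[1]):
        if all(iou(box, k) < iou_threshold for k in kept):
            kept.append(box)
    return len(kept)

# Two overlapping detections of the same person plus one separate person:
dets = [((0, 0, 10, 20), 0.9), ((1, 1, 11, 21), 0.8), ((50, 0, 60, 20), 0.7)]
print(estimate_crowd(dets))  # 2
```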

User interaction

The user would select images from a window.

Then, as the algorithms run, feedback would be provided to the user.

After the algorithms have run, the results would be shown.

Functional requirements

  • Take selected images
  • Perform crowd analysis
  • Show results to user

Login

The user would be shown a locked screen on which to enter the username and password. If the credentials match, the user would be logged in; otherwise, the user would not be allowed to run the software.
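Credential matching could be sketched as follows. The in-memory dictionary stands in for the credentials table in the database, and storing a salted hash rather than the plaintext password is a suggested practice, not something the SRS specifies.

```python
import hashlib
import os

# An in-memory dict stands in for the credentials table in the database.
_users = {}

def register(username, password):
    """Store a salted hash of the password, never the password itself."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    _users[username] = (salt, digest)

def login(username, password):
    """Return True if the credentials match a stored account."""
    record = _users.get(username)
    if record is None:
        return False
    salt, digest = record
    attempt = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return attempt == digest  # logged in only on an exact match

register("operator", "fly-safe")
print(login("operator", "fly-safe"))   # True
print(login("operator", "wrong"))      # False
```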

User interaction

The user would enter credentials. If correct, the system would run; if not, the user would stay on the same screen.

Functional Requirements

  • The system would take the user credentials
  • Match them against the database
  • Return the result

Non Functional Requirements

  • Speed: the drone will advance at a speed of at least 0.25 m/s.
  • Usability: a tutorial will be written so that the average user can operate the system easily.
  • Environment and documentation: the drone can be set up in any open space, and the code documentation will be written as clearly and in as much detail as possible.

Positioning of hardware

  • The Leap Motion device should be positioned correctly so that it can capture gestures easily
  • The drone should not be flown in a closed environment

Connections

  • All devices should be connected and all connections made
  • The Leap service should be running

Time

  • The system should map the data fast enough for the drone to receive the signals in real time
  • Crowd analysis should be fast enough to give results on the spot, within minutes

Authentication

  • There should be an authentication login to preserve user data and maintain session information
  • There should be entries in the database so that credentials can be matched

Image Quality

  • The images should not be blurry so that the algorithm can run on them
  • The images should be of high resolution so that the algorithms can run on them
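The "not blurry" requirement could be checked automatically with a simple sharpness score, such as the variance of a Laplacian-style filter response: blurred images have few strong edges and so score low. This is a pure-Python sketch; the nested-list grayscale format is an assumption for illustration, and a real pipeline would use an image library.

```python
def sharpness_score(image):
    """Variance of a 4-neighbour Laplacian over a grayscale image
    (nested lists of pixel values): low variance suggests a blurry image."""
    h, w = len(image), len(image[0])
    responses = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            lap = (image[y - 1][x] + image[y + 1][x] +
                   image[y][x - 1] + image[y][x + 1] - 4 * image[y][x])
            responses.append(lap)
    mean = sum(responses) / len(responses)
    return sum((r - mean) ** 2 for r in responses) / len(responses)

# A sharp edge scores higher than a flat (featureless/blurred) patch:
sharp = [[0, 0, 255, 255]] * 4      # hard vertical edge
flat  = [[128] * 4] * 4             # uniform patch
assert sharpness_score(sharp) > sharpness_score(flat) == 0.0
```

Images scoring below some empirically chosen cut-off could be rejected before running the crowd analysis on them.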

Weather

  • The weather should be suitable for the drone to fly and take images

Business rules

  • Only one user can interact with the system at a given time; multiple simultaneous users are not allowed

Quality

  • All interfaces should be simple and easy to use
  • The Leap Motion should capture all gestures
  • The drone should receive all signals

Use cases

A use case diagram shows the set of actions a user can take and the required or optional elements needed to complete each action.

Use Case Diagram - Gesture Control Drone For Crowd Analysis

High level use cases

High-level use cases are detailed descriptions of the main use cases and their requirements and options.

Use Cases

UC01: Login

UC02: Signup

UC03: Show images

UC04: Select Images

UC05: Perform crowd analysis

UC06: Show results

UC07: Fly Drone

UC08: Get gestures

UC09: Show gesture data

UC10: Map gesture data

UC11: Send command to drone

UC01: Login

Use Case ID: UC01

Use Case

Login

Actor

User

Type

Primary

Description

The system asks the user to enter his username and password. The system will maintain a session for the user and store the login credentials in the system.

Pre-Condition

The user has not provided his/her information.

Post-Condition

The user's login credentials are stored in the system and the user has clicked the sign-in button on the form.

 

 

UC02: Sign Up

Use Case ID: UC02

Use Case

Sign Up

Actor

User

Type

Primary

Description

The system asks the user to provide basic information for registration. The user will provide name, contact, address, and CNIC details.

Pre-Condition

User must be on Sign Up form.

Post-Condition

All mandatory fields must be validated and user clicked the sign up button on the form

 

 

UC03: Show Images

Use Case ID: UC03

Use Case

Show Images

Actor

User

Type

Primary

Description

The system will process the captured images and detect redundant (duplicate) frames among them.

Pre-Condition

Multiple Images must be captured by drone camera during survey of crowded area

 

Captured Images must be stored in the system

Post-Condition

Duplicate frames must be removed from the captured images

UC04: Select Image

Use Case ID: UC04

Use Case

Select Image

Actor

User

Type

Primary

Description

The user will select images from the system.

Pre-Condition

Captured images must be stored in the system.

 

Captured images must be in a predefined format with timestamp.

Post-Condition

The captured images will be selected.

UC06: Show Results

Use Case ID: UC06

Use Case

Show Results

Actor

User

Type

Primary

Description

The system will show the results of the crowd analysis and generate reports of the crowd estimation. Charts will be generated by the system showing the crowd estimation results for multiple images.

Pre-Condition

Crowd analysis must be performed on the selected images.

 

Crowd estimation results must be stored in the system.

Post-Condition

Reports must be generated by the system in a printable form.

 

Crowd estimation result charts must be shown by the system.

UC07: Fly Drone

Use Case ID: UC07

Use Case

Fly Drone

Actor

User

Type

Primary

Description

The user will select the "fly drone" option. The system will show an interface displaying the gestures performed by the user. The system is now ready to detect hand gestures.

Pre-Condition

Leap device must be connected.

 

User must be logged in to the system.

Post-Condition

None

UC08: Get Gesture

Use Case ID: UC08

Use Case

Get Gesture

Actor

User, leap

Type

Primary

Description

The system will get gesture input from the Leap Motion device, requesting the Leap services to process and provide the gesture input.

Pre-Condition

Advanced algorithms must be applied to the raw sensor data by the Leap Motion device.

 

Necessary resolution adjustment must be performed on raw gesture data.

Post-Condition

Leap motion services are running in the background and system is ready to communicate with the device and services

UC09: Show Gesture Data

Use Case ID: UC09

Use Case

Show Gesture Data

Actor

System

Type

Secondary

Description

The system will show hand gesture tracking data such as hand position and finger count on the x-y-z plane, presenting the captured frame data in a human-readable form.
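Rendering a tracking frame in human-readable form might look like the sketch below. The frame dictionary is loosely modeled on Leap Motion frame data (palm position on the x-y-z plane, finger count); its exact structure here is an assumption, not the SDK's actual API.

```python
def describe_frame(frame):
    """Render one hand-tracking frame (a dict loosely modeled on Leap Motion
    frame data; the exact structure is an assumption) as a readable line."""
    x, y, z = frame["palm_position"]
    return "hand at x={:.1f} y={:.1f} z={:.1f}, {} finger(s) extended".format(
        x, y, z, frame["finger_count"])

frame = {"palm_position": (12.5, 103.0, -4.2), "finger_count": 5}
print(describe_frame(frame))
# hand at x=12.5 y=103.0 z=-4.2, 5 finger(s) extended
```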

Pre-Condition

Leap services are running in the background.

 

Useful data must be extracted from hand gesture frames.

 

 

Post-Condition

Hand gesture frames must be shown on system screen in human readable format.

 

Hand position values on x-y-z plane must be shown on the system screen.

UC10: Map Gesture Data

Use Case ID: UC10

Use Case

Map Gesture Data

Actor

System

Type

Secondary

Description

The system will map each pre-defined hand gesture given by the user into a drone command. The system will process and validate each hand gesture frame; if the gesture is valid, the system will map it into a drone command.

Pre-Condition

Frames data must be extracted from hand gesture input.

 

Hand and fingers position values must be extracted from each frame and stored in the system.

Post-Condition

Each gesture is validated and mapped into respective drone command


UC11: Send Command to Drone

Use Case ID: UC11

Use Case

Send Command to Drone

Actor

Drone

Type

Secondary

Description

The system will activate the respective drone mode as per the user's gesture command. The drone will fly in a direction or perform an action as the system requests.

Pre-Condition

The user's hand gesture must be recognized.

The user's hand gesture must be mapped into a drone command.

Post-Condition

Command must be sent to the drone successfully

 

The drone must operate and fly in the mode sent by the system as a command.

Activity Diagram


Domain Model


Class Diagram


System Sequence Diagrams

Login


Crowd Analysis


Map Gestures


Sequence Diagram


Login()

Name: login

Responsibilities

Take the user ID and password from the user

Cross References

login

Exceptions

invalid user ID or password; no user account

Preconditions

the user has an account and is familiar with the system

Post Conditions

the user is logged in to the system

Check Connection()

Name: check connection

Responsibilities

It will check the connection of the Leap Motion device with the system

Cross References

checking connectivity

Exceptions

Leap Motion device not connected

Preconditions

the user is logged in and the Leap Motion device is plugged into the system

Post Conditions

the Leap Motion sensor is connected to the system

Load Gesture()

Name: load gesture

Responsibilities

It will load the gestures from the database or file when the user logs in

Cross References

load gestures into the system for comparing with gestures coming from the Leap device

Exceptions

gestures not loaded; invalid gesture

Preconditions

the user is logged in; the database is created or the file is present in the system

Post Conditions

gestures are loaded into the system

Load Command()

Name: load command

Responsibilities

It will load the commands from the database or file when the user logs in

Cross References

load commands into the system for sending to the drone

Exceptions

commands not loaded into the system; invalid command

Preconditions

the user is logged in; the database is created or the file is present in the system

Post Conditions

commands are loaded into the system

Get Gesture()

Name: get gesture

Responsibilities

The user moves his/her hand over the Leap Motion device and the device sends the gesture data to the system

Cross References

get gestures

Exceptions

Leap Motion device not connected; user not logged in

Preconditions

the user knows how to use the device; the Leap Motion device is connected

Post Conditions

gesture signals are sent from the Leap Motion device to the system

Map Gesture()

Name: map gestures

Responsibilities

This operation maps the gesture signals into commands that will be sent to the drone

Cross References

map gesture data

Exceptions

signals not received; gesture not detected

Preconditions

signals received from the Leap device; the gesture is defined

Post Conditions

gestures are mapped into commands

Send Command()

Name: send command

Responsibilities

This will send the mapped signals to the drone as commands

Cross References

send command to drone

Exceptions

signals not mapped; invalid signals

Preconditions

signals mapped correctly

Post Conditions

signals are sent to the drone as commands

Select Images()

Name: select images

Responsibilities

This will select images taken by the drone for analysis

Cross References

select images

Exceptions

image not available

Preconditions

images have been taken by the drone

Post Conditions

an image is selected from the system for analysis

Crowd Analysis()

Name: crowd analysis

Responsibilities

This will count the people in the selected image

Cross References

perform crowd analysis

Exceptions

image unreadable

Preconditions

an image is selected for analysis

Post Conditions

crowd analysis is performed and the people in the image are counted

Class Diagram


Architecture Diagram



© 2020 Sumair Sajid