Abstract: In today's world everyone needs to operate gadgets such as smartphones, laptops or computers. However, such devices are out of reach for disabled people due to special hardware requirements and high cost. The disclosed invention provides an interface for people with amputated limbs to interact with the computer. This is a human nose-and-eye based interface which translates the movement of the human nose tip into cursor movement on the screen. The system also makes use of other activities of the human face, such as blinking and squinting of the eyes. The mouse clicks are controlled by left- and right-eye blinks respectively, and squinting of the eyes toggles the scroll mode. The interface remains accessible to a person wearing glasses. The system in discussion makes use of a simple webcam and open-source software to determine facial coordinates, providing a completely hands-free experience to the users. The proposed idea requires no external hardware support or sensors for its implementation. This cost-effective system can be operated from a desktop or laptop with a webcam, making it operable by a wide spectrum of people suffering from disabilities.
Claims:
1) A system for operating a computer through mouse-cursor actuation using facial gestures, for people suffering from disabilities, the system comprising:
A webcam (201) that captures the facial gestures of the user;
The facial-gesture-controlled computer mouse interface for the use of the disabled, which is then activated;
A left-eye blink (301) acts as a normal (left) mouse click and a right-eye blink (302) acts as a right mouse click;
The disclosed invention comes with a first of its kind ‘plug and play’ feature;
It can be integrated with any operating system available in the market and has zero hardware requirements;
Once downloaded, there is no need for an internet connection for its usage;
The system provides a most cost-effective solution to address the challenges faced by disabled people in operating computer-like devices;
2) The system as claimed in claim 1, wherein any disabled person (102) can operate a computer (101) or similar device with nose and eye movements
3) The system as claimed in claim 1, wherein only the interface needs to be activated with external help, and all remaining activities can be performed independently by the user (102)
4) The system as claimed in claim 1, wherein no extra hardware is required apart from the webcam (201)
5) The system as claimed in claim 1, wherein the software can be downloaded on any desktop, laptop or similar device (101)
6) This invention aims at utilizing the face coordinates of a disabled person to control the movement of the mouse cursor and perform nearly all operations of the mouse or keypad.
Description:
Field Of Invention:
This invention belongs to the field of enabling physically disabled persons (especially those with disabilities of the hands) to operate computers and similar gadgets.
Background Of Invention:
The past two decades witnessed the advent of various portable input devices such as the mouse, joystick and touch pad, which came into existence with the invention of the graphical user interface in operating systems such as Windows, Macintosh and Unix-based systems. With all these inventions, the difficulty of using computers narrowed down to almost zero.
These existing systems involve considerable use of the hands for controlling the activities of the mouse. Therefore, people who are unable to use their hands properly (e.g. amputees and people with quadriplegia or other disabilities) are totally deprived of access to the computer. Our approach of 'Controlling the Mouse using Facial Gestures' enables people with hand disabilities to freely access the computer, with the cursor movements controlled by the various movements of the human face.
Prior Art:
According to the prior art, it is observed that the developments made so far involve specially designed hardware components. Inventions of the past consisted of a hardware setup comprising a copper frame, worn around the head and face, with embedded motion sensors. As the user moves their head, the relative motion between the surface of the head/face and the motion sensor is reflected in the movement of the cursor on the screen.
Prior Art Drawbacks:
The use of too many hardware components increases the cost of the system and makes it complex to install and operate; it also impacts the speed of execution of such systems. Developing such systems at minimum expense, using only the basic components available with laptops or desktops, increases their scope and makes them reachable to the targeted people.
Summary of the Invention:
The disclosed invention provides a way for easy operation of computer-like devices. The actual implementation comprises the following parts.
Webcam:-
The webcam (201) is used to capture the human face, and the captured image is converted to grayscale. Facial coordinates are then detected from the grayscale frame. This is the first step, as it provides the input that activates the interface.
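The grayscale step can be sketched as follows. This is a minimal illustration assuming the ITU-R BT.601 luma weights that libraries such as OpenCV apply for RGB-to-grayscale conversion; in practice frame capture and conversion would be done with `cv2.VideoCapture` and `cv2.cvtColor`, which are omitted here.

```python
def to_grayscale(frame):
    """Convert a frame given as rows of (R, G, B) tuples into rows of
    integer gray levels in [0, 255], using BT.601 luma weights."""
    return [[round(0.299 * r + 0.587 * g + 0.114 * b)
             for (r, g, b) in row]
            for row in frame]

# A tiny 1x2 "frame": one pure-red pixel and one pure-white pixel.
frame = [[(255, 0, 0), (255, 255, 255)]]
print(to_grayscale(frame))  # [[76, 255]]
```

A landmark detector (e.g. dlib's 68-point predictor) would then run on the grayscale frame to yield the eye, nose and mouth coordinates used below.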
Mouth Tracking/Switching:-
The system tracks the user's mouth movement (305) to trigger the analysing mode of the interface. When the user opens his or her mouth (305) wide and the mouth opening crosses a certain threshold value, the reading mode of the system is toggled on/off.
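The mouth-open switch can be sketched with a mouth aspect ratio (MAR, mouth opening height divided by mouth width) computed from four landmark points. The specific points and the threshold value here are illustrative assumptions, not values from the specification.

```python
import math

def dist(p, q):
    """Euclidean distance between two (x, y) points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def mouth_aspect_ratio(top, bottom, left, right):
    """Vertical mouth opening divided by horizontal mouth width."""
    return dist(top, bottom) / dist(left, right)

def update_input_mode(active, top, bottom, left, right, threshold=0.6):
    """Toggle the interface's reading mode when the MAR crosses the
    threshold, i.e. when the user opens the mouth wide."""
    if mouth_aspect_ratio(top, bottom, left, right) > threshold:
        return not active
    return active

# Wide-open mouth: opening (45) comparable to width (60) -> MAR 0.75.
mar = mouth_aspect_ratio((0, 0), (0, 45), (-30, 22), (30, 22))
print(round(mar, 2))  # 0.75, above the threshold, so the mode toggles
```

A real implementation would debounce this toggle over several frames so a yawn does not flip the mode repeatedly.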
Controlling the Cursor:-
Movement of the nose tip (303) is tracked to replicate the movement of the cursor on the screen; in a GUI-based interface this performs almost all tasks. The human nose tip is best suited to emulate a point object such as a cursor. The interface tracks the movement of the nose tip and guides the cursor (308) to move accordingly.
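One way to realise this mapping, assuming the circular "neutral area" (304) described later around the nose tip's rest position: inside the neutral area the cursor stays still, and outside it the cursor moves in the direction of the displacement. The neutral radius and speed factor are illustrative values, not taken from the specification.

```python
import math

def cursor_step(nose, neutral_center, neutral_radius=15, speed=0.5):
    """Return the (dx, dy) cursor step for one frame, given the nose-tip
    position and the calibrated neutral (rest) position in pixels."""
    dx = nose[0] - neutral_center[0]
    dy = nose[1] - neutral_center[1]
    if math.hypot(dx, dy) <= neutral_radius:
        return (0.0, 0.0)          # nose at rest: cursor does not move
    return (speed * dx, speed * dy)

# Nose tip 40 px right of the rest position -> cursor steps right.
print(cursor_step((140, 100), (100, 100)))  # (20.0, 0.0)
# Nose tip inside the neutral area -> cursor stays put.
print(cursor_step((105, 100), (100, 100)))  # (0.0, 0.0)
```

The returned step would then be applied to the on-screen cursor each frame, for example via an OS automation library.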
Clicking:-
Eye (301 and 302) blinks control the click events of the mouse as well as the scroll functionality. A mouse click is a very precise action, hence a distinctive facial movement should result in the clicking of the mouse. Therefore, the system identifies blinks of the human eye and initiates the click: a right-eye (302) blink for a right click and a left-eye (301) blink for a left click.
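Blink detection is commonly implemented with the eye aspect ratio (EAR) of Soukupová and Čech, EAR = (|p2−p6| + |p3−p5|) / (2·|p1−p4|), over six landmarks per eye; a sketch under that assumption follows, with an illustrative blink threshold. A low EAR on only the left eye (301) fires a left click; on only the right eye (302), a right click.

```python
import math

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def eye_aspect_ratio(p1, p2, p3, p4, p5, p6):
    """EAR over six eye landmarks: p1/p4 are the horizontal corners,
    p2-p6 and p3-p5 are the two vertical lid pairs."""
    return (dist(p2, p6) + dist(p3, p5)) / (2.0 * dist(p1, p4))

def classify_click(left_ear, right_ear, threshold=0.2):
    """Map one frame's EAR values to a click event, if any."""
    left_closed = left_ear < threshold
    right_closed = right_ear < threshold
    if left_closed and not right_closed:
        return "left_click"
    if right_closed and not left_closed:
        return "right_click"
    return None  # both open, or both closed (squint is handled elsewhere)

# An open eye is tall relative to its width, so its EAR is high.
open_ear = eye_aspect_ratio((0, 0), (3, 4), (7, 4), (10, 0), (7, -4), (3, -4))
print(round(open_ear, 2))                     # 0.8
print(classify_click(left_ear=0.1, right_ear=0.3))  # left_click
```

In practice the low-EAR condition would be required to persist for a few consecutive frames so involuntary blinks do not trigger spurious clicks.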
Scrolling:-
To scroll through multiline documents, the system includes a scroll mode (307). Scroll mode (307) activates when the user squints both eyes (301 and 302) for a stipulated duration; the user can then scroll up or down by moving their head in the corresponding direction.
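The squint switch can be sketched by treating a squint as both eyes' EAR sitting in a narrow band (lower than fully open, higher than a full blink) for a required number of consecutive frames. The band limits and frame count here are illustrative assumptions.

```python
class ScrollToggle:
    """Toggle scroll mode after a sustained squint of both eyes."""

    def __init__(self, low=0.15, high=0.25, frames_required=10):
        self.low, self.high = low, high
        self.frames_required = frames_required
        self.count = 0
        self.scroll_mode = False

    def update(self, left_ear, right_ear):
        """Feed one frame's EAR values; flip scroll mode once the squint
        has persisted long enough, then reset the frame counter."""
        squinting = (self.low < left_ear < self.high and
                     self.low < right_ear < self.high)
        self.count = self.count + 1 if squinting else 0
        if self.count >= self.frames_required:
            self.scroll_mode = not self.scroll_mode
            self.count = 0
        return self.scroll_mode

toggle = ScrollToggle(frames_required=3)
for _ in range(3):                 # three consecutive squint frames...
    mode = toggle.update(0.2, 0.2)
print(mode)                        # True: scroll mode is now active
```

While scroll mode is active, the vertical nose-tip displacement from the neutral area would drive scroll-up/scroll-down events instead of cursor motion.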
Objectives:
The objectives of the present invention are as follows:
To develop a software interface which performs most of the functions of a commercially available computer mouse.
The interface should not become too cumbersome for the end users. It should come with ‘plug and play’ mode which will enable the user to switch on the interface whenever needed.
Usability of the interface by people of all age groups: disabled people of any age group should be able to use it easily.
To minimize the human effort required for controlling the mouse.
The entire system should come with affordable pricing.
It should not involve use of extra hardware equipment and components.
Brief description of the accompanying drawings:
Fig 1: shows the overall system side view.
Fig 2: shows the overall system view.
Fig 3: shows the main frame where the human face is captured.
Detailed description of the accompanying drawings:
The disclosed invention acts as an interface which tracks the facial gestures of the user (a disabled person) and assists the user in operating his/her personal computer.
In Fig 1:
(102) denotes the user (especially a disabled person) sitting in front of the system (101) which he/she intends to use. The system (101) has the interface installed in it to aid the disabled user (102) in performing computational tasks and activities with the help of his/her facial movements.
In Fig 2:
Fig 2 represents the overall system design. The webcam (201) of the system captures the input from the user for the interface to switch on. (202) denotes the laptop/computer screen where the user (102) can see the output and perform the task. The minimization option (203) is given to hide the panel window while the user (102) performs other tasks. The main frame (204) pops up as soon as the interface is activated. This frame (204) displays all the inputs read by the webcam (201) and the corresponding cursor movement.
In Fig 3:
Fig 3 shows the main frame in detail. The left eye (301) of the user is tracked by the webcam (201) to calculate its coordinates, as is the right eye (302); both sets of eye coordinates are used for calculating the eye aspect ratio. The anchor point/nose tip (303) of the user (102) is the point whose movement controls the movement of the cursor on the screen. (304) is the neutral area. The mouth coordinates (305) are used for calculating the Mouth Aspect Ratio (MAR), which helps in determining the input mode of the system (101). (306) is a text message displayed to indicate input-mode activation, (307) displays another text message indicating scroll-mode activation, and (308) displays a text message indicating the direction in which the cursor moves.
| # | Name | Date |
|---|---|---|
| 1 | 202041050694-STATEMENT OF UNDERTAKING (FORM 3) [21-11-2020(online)].pdf | 2020-11-21 |
| 2 | 202041050694-FORM 1 [21-11-2020(online)].pdf | 2020-11-21 |
| 3 | 202041050694-DRAWINGS [21-11-2020(online)].pdf | 2020-11-21 |
| 4 | 202041050694-DECLARATION OF INVENTORSHIP (FORM 5) [21-11-2020(online)].pdf | 2020-11-21 |
| 5 | 202041050694-COMPLETE SPECIFICATION [21-11-2020(online)].pdf | 2020-11-21 |