Abstract: An improved gesture based input control system for computer assisted design applications. The system comprises a processor unit, a communication device, at least one sensor unit and a storage unit. The processor unit decodes the gesture of a user based on the sequence of the sensor parameters and positional information of the digits and responds with a pre-decided action to the gesture of the user, in order to effectively sense 6-Degree-of-Freedom inertial motion and enable smaller and weaker hardware architectures to perform gesture processing with a capable degree of accuracy in a wide range of computer assisted design applications.
DESCRIPTION

GESTURE BASED INPUT CONTROL SYSTEM FOR COMPUTER ASSISTED DESIGN APPLICATIONS

TECHNICAL FIELD
[001] The present invention generally relates to data processing systems and methods. The present invention also relates to gesture recognition techniques and applications. The present invention further relates to gesture based input and control systems and methods. The present invention more particularly relates to an improved gesture based input control system for computer assisted design applications.
BACKGROUND OF THE INVENTION
[002] With recent advancements in modern computing applications, several computer assisted design software applications have been introduced in the market for generating and manipulating various two-dimensional (2D) and three-dimensional (3D) objects. Such computer assisted design software plays a major role and has revolutionized modelling and research in a wide range of industrial applications including automotive, aerospace, medical equipment and medical devices. A user can interact with such computer assisted design programs using various peripheral input devices, such as a keyboard, a computer mouse, a trackball, a touchpad, a touch-sensitive pad, and/or a touch-sensitive display. The computer assisted design programs may also provide various software tools for generating and manipulating 2D and 3D objects.
[003] The computer assisted design programs may provide a drafting area showing 2D or 3D objects being processed by the user, and menus outside the drafting area for allowing the user to choose from various tools in generating or modifying 2D or 3D objects. For example, there may be menus for 2D object templates, 3D object templates, paint brush options, eraser options, line options, colour options, texture options, options for rotating or resizing the objects, and so forth. The user may select a tool from one of the menus and use the selected tool to manipulate the 2D or 3D object.
[004] The recent growing popularity of gesture based devices in the computing space has introduced a variety of human actuated gesture based devices for controlling input/output functions in computing devices. Various gesture based devices exist for inputting gestures, for example, but not limited to, mice, touch-sensitive surfaces, motion detectors using camera signals, and pressure-sensitive input surfaces.
[005] Various processing techniques and tools have been introduced to recognise gestures and invoke specific actions for recognised gestures received from input devices. Such gesture recognition techniques include collating input points sensed at a high sample rate, determining which input points are associated with each other, and analysing traits or features of a set of associated input points to recognize a gesture. While any type of software or program can implement gestures, gesturing is often used in conjunction with graphical desktops, graphical user shells, window managers, and the like.
[006] In computer assisted design applications in particular, gesture based input devices are increasingly used due to their adaptability to 3-dimensional design tasks, where a gesture based device effectively allows for positioning of cursors or objects relative to conventional X, Y, Z coordinates. More specifically, gesture based input devices can be advantageous in many 6-Degree-of-Freedom applications which require position information as in a 3D device and further provide rotational control about each of three axes, commonly referred to as roll, pitch and yaw. However, the majority of prior art gesture based input devices do not exhibit the required refinement, accuracy or ease of use.
[007] In fact, existing gesture based input devices are typically cumbersome, inaccurate, non-intuitive, tiring to use, and limited in their ability to manipulate objects. Also, image processing, the most popular method of carrying out gesture recognition in such prior art systems, is plagued by problems of segmentation, lack of environmental robustness, expensive hardware and high processing load.
[008] Based on the foregoing, a need therefore exists for an improved gesture based input control device with a 6-Degree-of-Freedom (at least 6-axis/6-DoF) Inertial Motion Unit that is computationally efficient, to enable smaller and weaker hardware architectures to perform gesture processing with a capable degree of accuracy. A need also exists for an improved gesture based input control system for computer assisted design applications, as described in greater detail herein.
SUMMARY OF THE INVENTION
[009] The following summary is provided to facilitate an understanding of some of the innovative features unique to the disclosed embodiment and is not intended to be a full description. A full appreciation of the various aspects of the embodiments disclosed herein can be gained by taking the entire specification, claims, drawings, and abstract as a whole.
[0010] Therefore, one aspect of the disclosed embodiment is to provide for an improved gesture based input control system for computing applications.
[0011] It is another aspect of the disclosed embodiment to provide for an improved 6-Degree-of-Freedom (at least 6-axis/6-DoF) Inertial Motion Unit that is computationally efficient, enabling smaller and weaker hardware architectures to perform gesture processing with a capable degree of accuracy.
[0012] It is a further aspect of the disclosed embodiment to provide for an improved miniaturized and effective gesture based input control system for computer assisted design applications.
[0013] The aforementioned aspects and other objectives and advantages can now be achieved as described herein. An improved gesture based input control system for computer assisted design applications is disclosed herein. The system comprises a processor unit, a communication device, at least one sensor unit and a storage unit. The sensor values can be sampled and fetched from a digit tracking unit and the sensor unit at regular intervals, and the sensor values are stored into the storage unit. The processor unit further fetches the sensor values and passes them through a low pass infinite impulse response filter to reduce the noise present in the sensor values. The filtered sensor values can be processed to obtain at least one sensor parameter and positional information of the digits. The processor unit decodes the gesture of a user based on the sequence of the sensor parameters and positional information of the digits and responds with a pre-decided action to the gesture of the user, in order to effectively sense 6-Degree-of-Freedom (at least 6-axis/6-DoF) inertial motion and enable smaller and weaker hardware architectures to perform gesture processing with a capable degree of accuracy in a wide range of computer assisted design applications.
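By way of non-limiting illustration, the following Python sketch shows one way the first-order low pass IIR filtering step described above could be realised in software; the smoothing factor ALPHA, the function name and the sample data are assumptions introduced for illustration, not values taken from the specification.

```python
# Illustrative sketch only: a first-order low pass IIR filter applied to
# raw sensor samples. ALPHA is a hypothetical tuning parameter.
ALPHA = 0.2  # closer to 0 -> heavier smoothing, more lag

def iir_low_pass(samples, alpha=ALPHA):
    """Filter a sequence of raw sensor values, returning smoothed values."""
    filtered = []
    state = samples[0]  # initialise the filter state with the first sample
    for x in samples:
        state = alpha * x + (1.0 - alpha) * state
        filtered.append(state)
    return filtered

# Example: noisy accelerometer axis readings; the 5.0 spike is damped.
print(iir_low_pass([0.0, 1.0, 0.9, 1.1, 5.0, 1.0]))
```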
[0014] The system proposed herein is a miniaturized and effective gesture based input control system for handling gesture based processing of input controls for computer assisted design and gaming applications, including gaming devices, remote vehicle control, robotics control, home automation, and VR/AR. The sensor unit can be a sensing unit including, but not limited to, an accelerometer, a gyroscope, and a magnetometer. Alternatively, the sensor unit can be a combination of one or more sensing units including, but not limited to, an accelerometer, a gyroscope, and a magnetometer. Note that the sensing units proposed herein are exemplary combinations of sensors used in the invention. Alternatively, a person skilled in the art can choose an appropriate sensor or a combination of sensors for the intended purpose within the scope of the proposed invention. Also, the sensor parameters herein can be, for example, but not limited to, pitch, yaw, roll or angular velocity, or a combination of two or more of them, as well as positional information of the digits.
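As a non-limiting illustration of how pitch and roll parameters might be derived from an accelerometer/gyroscope combination, the sketch below uses a complementary filter, a common fusion technique; the specification does not mandate this particular method, and the sampling interval DT and blend factor GAIN are assumed values.

```python
import math

# Hypothetical sketch: complementary filter fusing accelerometer and
# gyroscope readings into pitch/roll estimates.
DT = 0.01    # sampling interval in seconds (an assumed 100 Hz rate)
GAIN = 0.98  # weight given to the integrated gyroscope angle

def complementary_filter(pitch, roll, accel, gyro):
    """accel = (ax, ay, az) in g; gyro = (gx, gy, gz) in deg/s."""
    ax, ay, az = accel
    gx, gy, _ = gyro
    # Angles implied by the gravity direction alone (noisy but drift-free)
    accel_pitch = math.degrees(math.atan2(ax, math.hypot(ay, az)))
    accel_roll = math.degrees(math.atan2(ay, az))
    # Blend the integrated gyro rate (smooth but drifting) with the
    # accelerometer estimate to get a stable angle.
    pitch = GAIN * (pitch + gx * DT) + (1 - GAIN) * accel_pitch
    roll = GAIN * (roll + gy * DT) + (1 - GAIN) * accel_roll
    return pitch, roll

# Usage: device at rest, gravity on z axis, small pitch rate on the gyro.
p, r = complementary_filter(0.0, 0.0, (0.0, 0.0, 1.0), (1.0, 0.0, 0.0))
```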
[0015] The system decodes a gesture of the user based on the sequence of the above sensor parameters and positional information of the digits. The sequence of values mapped to each of the user gestures is pre-decided. Each time a gesture is decoded, the values are used to train a machine learning application of the processor unit in order to increase the accuracy of decoding the gestures of users. The decoded gesture is then sent via a communication device to any other system/device, such as a computer assisted design application, that is paired to the communication device. The paired system/device then responds with pre-decided actions depending on the gesture of the user, and the pre-decided actions with respect to the gestures are modifiable on the paired system/device.
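The following sketch illustrates, purely hypothetically, how a sequence of quantised sensor parameters could be matched against pre-decided gesture sequences; the token names and the PREDECIDED_GESTURES mapping are invented for illustration and are not taken from the specification.

```python
# Hypothetical mapping from pre-decided parameter sequences to gestures.
PREDECIDED_GESTURES = {
    ("roll_left", "roll_right"): "rotate_object",
    ("pitch_up", "pitch_up"): "zoom_in",
}

def decode_gesture(recent_tokens):
    """Match the most recent quantised sensor tokens against known sequences."""
    for pattern, gesture in PREDECIDED_GESTURES.items():
        if tuple(recent_tokens[-len(pattern):]) == pattern:
            return gesture
    return None  # no pre-decided sequence matched

print(decode_gesture(["idle", "pitch_up", "pitch_up"]))  # -> "zoom_in"
```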
[0016] The gesture based input control system proposed herein can be worn on the back of the palm of the user, where the device is positioned inside the palm covering the palm region. The system can be mechanically extensible along its length, allowing it to fit across a range of palm sizes of the users. The system can also be implemented with a flexible/inflexible strap design based on the requirements of the user. The system can alternatively track digits of the palm by using at least one IR Emitter/Receiver pair to track the digits. The IR Emitter is modulated at a specific frequency and angled toward a digit so as to maximize the range of movement tracking of the digit.
[0017] The IR Receiver signal is passed through a low pass filter with a critical frequency in the range of the modulation frequency. The filtered signal is then passed through an amplifier with an appropriate gain. From the same array, or from another IR Receiver, a signal is obtained that corresponds to the ambient IR signals that do not arise from the emitter.
[0018] An ambient level is derived from the minima of the signal output by the previously described low pass filter. Based on this level, either the duty cycle of the IR emitter may be modified, or scaling may be carried out in software/firmware. This allows the IR Receiver to perform ranging accurately even while the sensor has a DC bias caused by ambient IR not produced by the emitter. The emitter/receiver pair can be calibrated for a distance larger than the furthest distance of the digit from the pair, as the reduction in the duty cycle can lead to a loss in range.
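A minimal sketch of this ambient compensation idea follows, assuming the ambient level is estimated as the minimum of the filtered signal; the threshold AMBIENT_LIMIT and step DUTY_STEP are hypothetical values introduced for illustration.

```python
# Hypothetical sketch of ambient-IR bias compensation: estimate the ambient
# level from the signal minima, then either back off the emitter duty cycle
# or correct the samples in software.
AMBIENT_LIMIT = 0.6  # normalised level above which the receiver saturates
DUTY_STEP = 0.05

def compensate(filtered_samples, duty_cycle):
    """Estimate ambient IR as the minima of the filtered signal and react."""
    ambient = min(filtered_samples)  # minima ~ no reflected emitter light
    if ambient > AMBIENT_LIMIT:
        duty_cycle = max(0.1, duty_cycle - DUTY_STEP)  # back off the emitter
    corrected = [s - ambient for s in filtered_samples]  # software scaling
    return corrected, duty_cycle

samples, duty = compensate([0.7, 0.9, 0.8, 0.75], duty_cycle=0.5)
```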
[0019] The gesture based input control system can track gestures/fingers with very high accuracy (millimetre-level), along with palm gestures, without the additional use of cameras, EMG, or sensors on the fingers, making it a user friendly system for precision-led computer assisted design applications. The gesture based input control system, using sensors and a processor for accurate detection and mapping of gestures, can be a cost effective and simpler solution for a wide range of applications including computer assisted design applications, gaming and console applications, and other advanced computing applications.
BRIEF DESCRIPTION OF THE DRAWINGS
[0020] The drawings shown here are for illustration purposes, and the actual system will not be limited by the size, shape, and arrangement of components or the number of components represented in the drawings.
[0021] FIG. 1 illustrates a graphical representation of an improved gesture based input control system for computer assisted design applications, in accordance with the disclosed embodiments; and
[0022] FIG. 2 illustrates a high level circuit diagram of an improved gesture based input control system with ambient IR sensor, LPF and HPF, in accordance with the disclosed embodiments.
DETAILED DESCRIPTION
[0023] The particular values and configurations discussed in these non-limiting examples can be varied and are cited merely to illustrate at least one embodiment and are not intended to limit the scope thereof.
[0024] The embodiments now will be described more fully hereinafter with reference to the accompanying drawings, in which illustrative embodiments of the invention are shown. The embodiments disclosed herein can be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art. Like numbers refer to like elements throughout. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
[0025] The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
[0026] Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
[0027] FIG. 1 illustrates a graphical representation of an improved gesture based input control system 100 for computer assisted design applications, in accordance with the disclosed embodiments. The system 100 can be made of plastic and/or any similar material that does not block electromagnetic radiation. The system 100 can be 3D printed, injection molded, CNC milled or cast. The system 100 comprises a printed circuit board (PCB) 110 which comprises a processor unit, a communication device, at least one sensor unit and a storage unit. Note that the sensors and electronic components can be placed appropriately on the PCB 110 depending on the degree of motion that needs to be captured. The PCB can be placed above the knuckles or in the palm of the hand at the base of the fingers. The PCB 110 and sensors are placed within the enclosure 120, which has a mechanically extensible design to allow multiple palm sizes to be accounted for.
[0028] The sensor values can be sampled and fetched from a digit tracking unit and the sensor unit at regular intervals, and the sensor values are stored into the storage unit. The processor unit further fetches the sensor values and passes them through a low pass infinite impulse response filter to reduce the noise present in the sensor values. The filtered sensor values can be processed to obtain at least one sensor parameter and positional information of the digits. The processor unit decodes the gesture of a user based on the sequence of the sensor parameters and positional information of the digits and responds with a pre-decided action to the gesture of the user, in order to effectively sense 6-Degree-of-Freedom (at least 6-axis/6-DoF) inertial motion and enable smaller and weaker hardware architectures to perform gesture processing with a capable degree of accuracy in a wide range of computer assisted design applications.
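As a non-limiting sketch of the regular-interval sampling described above: read_imu, read_digit_tracker and the storage list below stand in for the sensor unit, digit tracking unit and storage unit respectively; all names and the 100 Hz rate are assumptions introduced for illustration.

```python
import time

# Hypothetical sketch of the regular-interval sampling loop: fetch values
# from the sensor unit and digit tracking unit and store them for the
# processor unit to filter and decode later.
SAMPLE_INTERVAL = 0.01  # seconds; an assumed 100 Hz sampling rate

storage = []  # stand-in for the storage unit

def sampling_loop(read_imu, read_digit_tracker, n_samples):
    for _ in range(n_samples):
        sample = {
            "t": time.monotonic(),
            "imu": read_imu(),               # e.g. (accel, gyro, mag) tuple
            "digits": read_digit_tracker(),  # positional information of digits
        }
        storage.append(sample)  # persist for later filtering and decoding
        time.sleep(SAMPLE_INTERVAL)
```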
[0029] The system 100 proposed herein is a miniaturized and effective gesture based input control system for handling gesture based processing of input controls for computer assisted design applications. The sensor unit can be a sensing unit including, but not limited to, an accelerometer, a gyroscope, and a magnetometer. Alternatively, the sensor unit can be a combination of one or more sensing units including, but not limited to, an accelerometer, a gyroscope, and a magnetometer. Note that the sensing units proposed herein are exemplary combinations of sensors used in the invention. Alternatively, a person skilled in the art can choose an appropriate sensor or a combination of sensors for the intended purpose within the scope of the proposed invention. Also, the sensor parameters herein can be, for example, but not limited to, pitch, yaw, roll or angular velocity, or a combination of two or more of them, as well as positional information of the digits.
[0030] The system decodes a gesture of the user based on the sequence of the above sensor parameters and positional information of the digits. The sequence of values mapped to each of the user gestures is pre-decided. Each time a gesture is decoded, the values are used to train a machine learning application of the processor unit in order to increase the accuracy of decoding the gestures of users. The decoded gesture is then sent via a communication device to any other system/device, such as a computer assisted design application, that is paired to the communication device. The paired system/device then responds with pre-decided actions depending on the gesture of the user, and the pre-decided actions with respect to the gestures are modifiable on the paired system/device. The gesture based input control system proposed herein can be worn on the back of the palm of the user. The system can be mechanically extensible along its length, allowing it to fit across a range of palm sizes of the users. The system can also be implemented with a flexible/inflexible strap design based on the requirements of the user.
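One minimal, hypothetical way to realise the per-gesture training described above is an incremental nearest-centroid update, sketched below; the specification does not name a particular machine learning method, and all names here are illustrative.

```python
# Hypothetical sketch: fold each newly decoded gesture's feature vector
# into a running per-gesture centroid (a simple online learner).
centroids = {}  # gesture name -> (mean feature vector, sample count)

def update_model(gesture, features):
    """Incrementally update the centroid for a decoded gesture."""
    mean, n = centroids.get(gesture, ([0.0] * len(features), 0))
    n += 1
    # Running-mean update: mean += (x - mean) / n, element-wise
    mean = [m + (f - m) / n for m, f in zip(mean, features)]
    centroids[gesture] = (mean, n)

update_model("zoom_in", [0.2, 1.3, -0.1])
update_model("zoom_in", [0.3, 1.1, 0.0])
print(centroids["zoom_in"])  # centroid drifts toward the running mean
```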
[0031] FIG. 2 illustrates a high level circuit diagram 200 of an improved gesture based input control system 100 with ambient IR sensor and LPF, in accordance with the disclosed embodiments. The system 100 can alternatively track digits of the palm by using at least one IR Emitter/Receiver pair to track the digits. The IR Emitter is modulated at a specific frequency and angled toward a digit so as to maximize the range of movement tracking of the digit.
[0032] The IR Receiver signal is passed through a low pass filter with a critical frequency in the range of the modulation frequency. The filtered signal is then passed through an amplifier with an appropriate gain. From the same array, or from another IR Receiver, a signal is obtained that corresponds to the ambient IR signals that do not arise from the emitter.
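For illustration only, the critical frequency of a first-order RC low pass filter is f_c = 1/(2πRC); the sketch below sizes the capacitor for a cutoff near an assumed 38 kHz modulation frequency (a common IR modulation frequency, used here purely as an assumption, not a value from the specification).

```python
import math

# Back-of-the-envelope sketch: choose C so the RC low pass critical
# frequency lands near the emitter's modulation frequency.
F_MOD = 38_000.0  # Hz, assumed modulation frequency
R = 1_000.0       # ohms, hypothetical resistor value

# f_c = 1 / (2*pi*R*C)  ->  C = 1 / (2*pi*R*f_c)
C = 1.0 / (2 * math.pi * R * F_MOD)
print(f"C = {C * 1e9:.1f} nF for f_c near {F_MOD / 1000:.0f} kHz")  # ~4.2 nF
```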
[0033] An ambient level is derived from the minima of the signal output by the previously described low pass filter. Based on this level, either the duty cycle of the IR emitter may be modified, or scaling may be carried out in software/firmware. This allows the IR Receiver to perform ranging accurately even while the sensor has a DC bias caused by ambient IR not produced by the emitter. The emitter/receiver pair can be calibrated for a distance larger than the furthest distance of the digit from the pair, as the reduction in the duty cycle can lead to a loss in range.
[0034] The gesture based input control system can track gestures/fingers with very high accuracy (millimetre-level), along with palm gestures, without the additional use of cameras, EMG, or sensors on the fingers, making it a user friendly system for precision-led computer assisted design and gaming applications including gaming devices, remote vehicle control, robotics control, home automation, and VR/AR. The gesture based input control system, using sensors and a processor for accurate detection and mapping of gestures, can be a cost effective and simpler solution for a wide range of applications including computer assisted design applications, gaming and console applications, and other advanced computing applications.
[0035] The proposed invention can be effectively used in a wide range of computer assisted design applications such as, for example, but not limited to, SolidWorks, AutoCAD, SketchUp, Fusion360, etc., to pan, zoom and rotate using hands instead of conventional 2D input devices. The system further allows intuitive and more efficient maneuvering within computer assisted design software. The device can also be configured to map various gestures performed to shortcuts within the application to bring up additional menus/tools. The system can also be adapted for computer programming within Integrated Development Environments such as DevCpp, Visual Studio, Eclipse, etc., to scroll, select, cut, copy, paste code or to build/compile code with available gestures, to work faster and more efficiently with the text based environment. The system can also be used in gaming/virtual reality/augmented reality instead of a 2D input device, wherein hand gestures can be used for aiming, reloading, shooting, driving, etc., to create a more immersive and healthy experience for gamers and home automation users.
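Purely as a hypothetical illustration of mapping gestures to in-application shortcuts, the sketch below uses an editable dictionary of gesture-to-action bindings; all gesture and action names are invented for illustration and are not taken from the specification.

```python
# Hypothetical, user-modifiable gesture-to-action map for a paired CAD
# application (cf. the modifiable pre-decided actions described above).
gesture_actions = {
    "palm_swipe_left": "pan_left",
    "pinch": "zoom_out",
    "spread": "zoom_in",
    "wrist_roll": "rotate_view",
    "two_finger_tap": "open_tool_menu",  # mapped to an in-app shortcut
}

def handle_gesture(gesture, send_to_host):
    """Forward the pre-decided action for a decoded gesture, if any."""
    action = gesture_actions.get(gesture)
    if action is not None:
        send_to_host(action)  # e.g. over the paired communication device

handle_gesture("pinch", print)  # -> "zoom_out"
```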
[0036] It will be appreciated that variations of the above-disclosed and other features and functions, or alternatives thereof, may be desirably combined into many other different systems or applications. Also, various presently unforeseen or unanticipated alternatives, modifications, variations or improvements therein may be subsequently made by those skilled in the art, which are also intended to be encompassed by the following claims.

CLAIMS

I/We CLAIM:
1. An improved gesture based input control system for computer assisted design applications, said system comprising:
a sensor unit comprising at least one sensor configured with a processor unit and a communication unit, wherein values at the sensor unit can be sampled and fetched from a digit tracking unit at regular intervals and stored into a storage unit;
a low pass infinite impulse response filter to receive the sensor values fetched at the processor unit in order to reduce the noise present in the sensor values, wherein the filtered sensor values can be processed to obtain at least one sensor parameter and positional information of the digits;
a gesture decoding module at the processor unit that decodes the gesture of a user based on the sequence of the sensor parameters and positional information of the digits, wherein the processor unit responds with a pre-decided action to the gesture of the user in order to effectively sense 6-Degree-of-Freedom (at least 6-axis/6-DoF) inertial motion and enable smaller and weaker hardware architectures to perform gesture processing with a capable degree of accuracy in a wide range of computer assisted design applications.
2. The system of claim 1, wherein the sensor unit comprises at least one of the following sensors: an accelerometer, a gyroscope, and a magnetometer.
3. The system of claim 1, wherein the gesture decoding module of the processor unit decodes a gesture of the user based on the sequence of sensor parameters and positional information of the digits.
4. The system of claim 1, wherein the sequence of values mapped to each of the user gestures is pre-decided, wherein the values are used to train a machine learning application of the processor unit in order to increase the accuracy of decoding the gestures of users each time a gesture is decoded and sent via the communication device to any other system/device.
5. The system of claim 4, wherein the system/device paired via the communication device responds with pre-decided actions depending on the gesture of the user, wherein the pre-decided actions with respect to the gestures are modifiable on the paired system/device.
6. The system of claim 1, wherein the gesture based input control system can be worn on the back of the palm of the user, where the device is positioned inside the palm covering the palm region.
7. The system of claim 1, wherein the gesture based input control system can be mechanically extensible along its length, allowing it to fit across a range of palm sizes of the users.
8. The system of claim 1, wherein the gesture based input control system can track digits of the palm by using at least one IR Emitter/Receiver pair to track the digits, wherein the IR Emitter is modulated at a specific frequency and angled toward a digit so as to maximize the range of movement tracking of the digit.
9. The system of claim 8, wherein the IR Receiver signal is passed through a low pass filter with a critical frequency in the range of the modulation frequency, wherein the filtered signal is then passed through an amplifier with an appropriate gain to obtain a signal that corresponds to the ambient IR signals that do not arise from the emitter.
10. The system of claim 9, wherein an ambient level is derived from the minima of the signal output by the previously described low pass filter, and based on this level either the duty cycle of the IR emitter may be modified or scaling may be carried out via software/firmware.
6. DATE AND SIGNATURE: 20.06.2018