Gesture Based Vehicle Control System And Method For Omni Directional Vehicles

Abstract: A gesture-based vehicle control system (100) for controlling omni-directional vehicles is disclosed. The system (100) comprises: a vehicle (102) equipped with a motor (108) and wheels (110a-110n); a controller (112) installed in the vehicle (102) and configured to control the motor (108) and the wheels (110a-110n); and a wearable device (104) in communication with sensor nodes (114) and a control unit (116), wherein the sensor nodes (114) are configured to detect gestures of a user and the control unit (116) is configured to interpret the detected gestures. A processor (118) is configured to: receive data packets comprising the interpreted gestures from the wearable device (104) over the Internet; process the received data packets into navigational commands; transmit the navigational commands to the controller (112); and actuate the motor (108) of the vehicle (102). The system (100) provides a quick and latency-free method for controlling the vehicle (102). Claims: 6, Figures: 4

Patent Information

Application #
Filing Date
29 May 2024
Publication Number
22/2024
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Email
Parent Application

Applicants

SR University
SR University, Ananthasagar, Warangal Telangana India 506371 patent@sru.edu.in 08702818333

Inventors

1. Mr. Y. Srikanth
Department of ECE, SR University, Warangal, Ananthasagar, Telangana- 506371, India (IN)
2. Ch. Rajendra Prasad
Department of ECE, SR University, Warangal, Ananthasagar, Telangana- 506371, India (IN)
3. V. Vinod Kumar
Department of ME, SR University, Warangal, Ananthasagar, Telangana- 506371, India (IN)
4. B. Anil
SR University, Warangal, Ananthasagar, Telangana- 506371, India (IN)
5. CH. Hasini
SR University, Warangal, Ananthasagar, Telangana- 506371, India (IN)
6. P. Neha
SR University, Warangal, Ananthasagar, Telangana- 506371, India (IN)
7. G. Sanjay
SR University, Warangal, Ananthasagar, Telangana- 506371, India (IN)

Specification

Description:BACKGROUND
Field of Invention
[001] Embodiments of the present invention generally relate to robotics and remote-controlled vehicles and particularly to a gesture-based vehicle control system for controlling omnidirectional vehicles.
Description of Related Art
[002] Remote-controlled vehicles have seen significant advancements in recent years, driven by a pursuit of more intuitive and user-friendly control mechanisms. Traditional methods of controlling such devices often involve complex interfaces or manual input devices that can be challenging for users to master and may limit the range of possible actions. As technology continues to evolve, there is a growing demand for control systems that are more natural, intuitive, and immersive.
[003] One promising avenue of exploration in this regard is gesture control technology. Gesture control enables users to interact with devices using hand movements and gestures, mimicking natural human behavior. By leveraging sensors such as accelerometers, gyroscopes, and cameras, gesture control systems can interpret these movements and translate them into commands that drive the motion of a vehicle or robot.
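The translation from sensor readings to motion commands described above can be sketched as follows. This is a minimal illustration, not the patented method: the axis conventions, the 0.5 g threshold, and the command names are all assumptions made for the example.

```python
# Minimal sketch: mapping raw accelerometer tilt (in g units) to
# directional commands. Thresholds, axes, and command names are
# illustrative assumptions, not taken from the specification.

def tilt_to_command(ax: float, ay: float, threshold: float = 0.5) -> str:
    """Translate hand tilt into a navigational command."""
    if ay > threshold:
        return "FORWARD"
    if ay < -threshold:
        return "BACKWARD"
    if ax > threshold:
        return "RIGHT"
    if ax < -threshold:
        return "LEFT"
    return "STOP"

# Example: hand tilted forward past the threshold.
print(tilt_to_command(0.1, 0.8))  # FORWARD
```

A real system would low-pass filter the readings and debounce transitions before issuing a command, but the core mapping is this simple.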
[004] However, this approach suffers from several shortcomings, such as delay in operation, latency, cross-communication and miscommunication, and so forth.
[005] There is thus a need for an improved and advanced gesture-based vehicle control system for controlling an omnidirectional vehicle that can address the aforementioned limitations in a more efficient manner.
SUMMARY
[006] Embodiments in accordance with the present invention provide a gesture-based vehicle control system for controlling an omnidirectional vehicle. The system comprises: a vehicle equipped with a motor and wheels; a controller installed in the vehicle and configured to control the motor and the wheels; and a wearable device in communication with sensor nodes and a control unit. The sensor nodes are configured to detect gestures of a user and the control unit is configured to interpret the detected gestures. The system further comprises a processor located on a cloud server in communication with the controller and the control unit over the Internet. The processor is configured to: receive data packets comprising the interpreted gestures from the wearable device over the Internet; process the received data packets into navigational commands; transmit the navigational commands to the controller using a web-based service; and actuate the motor of the vehicle based on the navigational commands.
[007] Embodiments in accordance with the present invention further provide a method for controlling a vehicle. The method comprises the steps of: enabling a user to take control of a wearable device; receiving navigational commands in the form of gestures from the wearable device; inferring the received gestures to obtain a motion and a direction of the vehicle; and actuating a motor to enable the vehicle to carry out the obtained motion in the obtained direction.
[008] Embodiments of the present invention may provide a number of advantages depending on their particular configuration. First, embodiments of the present application may provide a gesture-based vehicle control system for controlling an omnidirectional vehicle.
[009] Next, embodiments of the present application may provide a gesture-based vehicle control system for controlling an omnidirectional vehicle that is quick, active, and latency-free.
[0010] These and other advantages will be apparent from the present application of the embodiments described herein.
[0011] The preceding is a simplified summary to provide an understanding of some embodiments of the present invention. This summary is neither an extensive nor exhaustive overview of the present invention and its various embodiments. The summary presents selected concepts of the embodiments of the present invention in a simplified form as an introduction to the more detailed description presented below. As will be appreciated, other embodiments of the present invention are possible utilizing, alone or in combination, one or more of the features set forth above or described in detail below.
BRIEF DESCRIPTION OF THE DRAWINGS
[0012] The above and still further features and advantages of embodiments of the present invention will become apparent upon consideration of the following detailed description of embodiments thereof, especially when taken in conjunction with the accompanying drawings, and wherein:
[0013] FIG. 1A illustrates a block diagram of a gesture-based vehicle control system for controlling an omnidirectional vehicle, according to an embodiment of the present invention;
[0014] FIG. 1B illustrates a vehicle, according to an embodiment of the present invention;
[0015] FIG. 2 illustrates a block diagram of a processor of the gesture-based vehicle control system for controlling an omnidirectional vehicle, according to an embodiment of the present invention; and
[0016] FIG. 3 depicts a flowchart of a method for controlling a vehicle, according to an embodiment of the present invention.
[0017] The headings used herein are for organizational purposes only and are not meant to be used to limit the scope of the description or the claims. As used throughout this application, the word "may" is used in a permissive sense (i.e., meaning having the potential to), rather than the mandatory sense (i.e., meaning must). Similarly, the words “include”, “including”, and “includes” mean including but not limited to. To facilitate understanding, like reference numerals have been used, where possible, to designate like elements common to the figures. Optional portions of the figures may be illustrated using dashed or dotted lines, unless the context of usage indicates otherwise.
DETAILED DESCRIPTION
[0018] The following description includes the preferred best mode of one embodiment of the present invention. It will be clear from this description of the invention that the invention is not limited to these illustrated embodiments but that the invention also includes a variety of modifications and embodiments thereto. Therefore, the present description should be seen as illustrative and not limiting. While the invention is susceptible to various modifications and alternative constructions, it should be understood, that there is no intention to limit the invention to the specific form disclosed, but, on the contrary, the invention is to cover all modifications, alternative constructions, and equivalents falling within the scope of the invention as defined in the claims.
[0019] In any embodiment described herein, the open-ended terms "comprising", "comprises", and the like (which are synonymous with "including", "having", and "characterized by") may be replaced by the respective partially closed phrases "consisting essentially of", "consists essentially of", and the like, or the respective closed phrases "consisting of", "consists of", and the like.
[0020] As used herein, the singular forms “a”, “an”, and “the” designate both the singular and the plural, unless expressly stated to designate the singular only.
[0021] FIG. 1A illustrates a block diagram of a gesture-based vehicle control system 100 (hereinafter referred to as the system 100) for controlling an omnidirectional vehicle 102 (hereinafter referred to as the vehicle 102), according to an embodiment of the present invention. In an embodiment of the present invention, the system 100 may enable a user to control, navigate, and maneuver a vehicle 102 in a premise using hand gestures. In an embodiment of the present invention, the vehicle 102 controlled by the system 100 may be an omnidirectional vehicle such that the vehicle 102 may move in any direction. In another embodiment of the present invention, the vehicle 102 controlled by the system 100 may be bi-directional, allowing for movement both forward and backward. This capability enables the vehicle 102 to navigate in two opposing directions, providing versatility in maneuvering within constrained spaces or while performing specific tasks.
[0022] According to embodiments of the present invention, the system 100 may comprise the vehicle 102, a wearable device 104, and a cloud server 106. The cloud server 106 and the vehicle 102 may be connected using a web-based service 122. The wearable device 104 and the cloud server 106 may be connected over the Internet. Further, the vehicle 102 may comprise a motor 108, wheels 110a-110n, and a controller 112. The wearable device 104 may further comprise sensor nodes 114 and a control unit 116. The cloud server 106 may comprise a processor 118 and a storage unit 120.
[0023] In an embodiment of the present invention, the vehicle 102 may be equipped with the motor 108 and the wheels 110a-110n. The motor 108 may provide a rotational force to the wheels 110a-110n for navigation of the vehicle 102 in the premise. Further, the controller 112 may be installed in the vehicle 102 and may be configured to control the motor 108 and the wheels 110a-110n based on navigational commands transmitted by the wearable device 104.
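To make the omnidirectional navigation above concrete, one common way such a vehicle achieves movement in any direction is a four-wheel mecanum (or omni-wheel) layout, where the controller mixes a desired body velocity into individual wheel speeds. The specification does not fix a wheel type, so the mixing below is an illustrative assumption using the standard mecanum equations.

```python
# Illustrative sketch of omnidirectional drive mixing, assuming a
# four-wheel mecanum layout (an assumption; the specification leaves
# the wheel type open). Body velocities: vx = strafe, vy = forward,
# omega = rotation rate. Outputs are relative wheel speeds.

def mix_wheel_speeds(vx: float, vy: float, omega: float) -> dict:
    """Map body velocities to individual wheel speeds (wheels 110a-110d)."""
    return {
        "front_left":  vy + vx + omega,
        "front_right": vy - vx - omega,
        "rear_left":   vy - vx + omega,
        "rear_right":  vy + vx - omega,
    }

# Pure sideways strafe: left and right wheels counter-rotate in pairs.
print(mix_wheel_speeds(vx=1.0, vy=0.0, omega=0.0))
```

The controller 112 would scale these relative speeds to the motor's PWM range; the signs are what let the vehicle translate in any direction without first rotating.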
[0024] In an embodiment of the present invention, the wearable device 104 may be connected to the sensor nodes 114 and the control unit 116. According to embodiments of the present invention, the wearable device 104 may be, but not limited to, a ring, a wristband, a handheld controller, and so forth. In a preferred embodiment of the present invention, the wearable device 104 may be a glove. Embodiments of the present invention are intended to include or otherwise cover any type of the wearable device 104, including known, related art, and/or later developed technologies.
[0025] The sensor nodes 114 may be configured to detect gestures of the user. In an embodiment of the present invention, the sensor nodes 114 may be arranged inside the wearable device 104. In another embodiment of the present invention, the sensor nodes 114 may be arranged in an environment to detect the gestures of the user wearing the wearable device 104. In such an embodiment of the present invention, the wearable device 104 may communicate with the sensor nodes 114 using a short-range wireless protocol such as Bluetooth or ZigBee. The wearable device 104 may act as a transmitter for relaying gesture data to the sensor nodes 114. The sensor nodes 114 may then process and interpret the gestures based on predefined algorithms or patterns. This arrangement may allow for the detection of the gestures within a specified area or range to provide flexibility in usage scenarios.
[0026] The gestures may be, but not limited to, a flick gesture, a pinch in gesture, a pinch out gesture, a push gesture, a pull gesture, a steer gesture, and so forth. Embodiments of the present invention are intended to include or otherwise cover any gestures, including known, related art, and/or later developed technologies.
[0027] In an embodiment of the present invention, the control unit 116 may be configured to interpret the detected gestures. In a preferred embodiment of the present invention, the control unit 116 may be an Espressif 32 (ESP32). The control unit 116 may interpret the detected gestures based on predefined gesture recognition algorithms or patterns programmed into its firmware. These algorithms analyze the spatial and temporal characteristics of the detected gestures to identify specific gestures and their corresponding commands or actions.
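The analysis of "spatial and temporal characteristics" mentioned above can be sketched as a simple threshold classifier of the kind a microcontroller such as the ESP32 could run in firmware. The window length, thresholds, and gesture labels here are assumptions for illustration, not values from the specification.

```python
# Hedged sketch of threshold-based gesture recognition: the temporal
# characteristic is the window duration, the spatial characteristic is
# the peak acceleration magnitude. All thresholds are illustrative
# assumptions, not taken from the specification.

def classify_gesture(samples, dt: float = 0.02) -> str:
    """samples: acceleration magnitudes (g) sampled every dt seconds."""
    duration = len(samples) * dt       # temporal characteristic
    peak = max(samples)                # spatial characteristic
    if peak > 2.0 and duration < 0.3:
        return "flick"                 # brief, sharp spike
    if peak > 1.2:
        return "push"                  # sustained moderate acceleration
    return "idle"

print(classify_gesture([0.2, 2.5, 0.3]))  # flick
print(classify_gesture([1.3] * 30))       # push
```

Production firmware would typically use a trained model or dynamic time warping over full sensor traces, but the two-feature structure (how strong, how long) is the essence of rule-based recognition.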
[0028] In an embodiment of the present invention, the cloud server 106 may comprise the processor 118 and the storage unit 120. The processor 118 may be configured to communicate with the controller 112 using the web-based service 122 and with the control unit 116 over the Internet. The storage unit 120 may further be configured to store the navigational commands transmitted by the user in the form of gestures.
[0029] FIG. 1B illustrates the vehicle 102, according to an embodiment of the present invention. The wheels 110a-110n may navigate the vehicle 102. Further, the wheels 110a-110n may be controlled using the controller 112 based on navigational commands transmitted by the wearable device 104, in an embodiment of the present invention.
[0030] FIG. 2 illustrates a block diagram of the processor 118 of the system 100 for controlling the vehicle 102, according to an embodiment of the present invention. The processor 118 may comprise computer-executable instructions in the form of programming modules, such as a data receiving module 200, a processing module 202, a data transmission module 204, and an actuation module 206.
[0031] In an embodiment of the present invention, the data receiving module 200 may be configured to receive data packets comprising the interpreted gestures from the wearable device 104 over the internet. The data receiving module 200 may further transmit interpreted gestures to the processing module 202, in an embodiment of the present invention.
[0032] In an embodiment of the present invention, the processing module 202 may be activated upon receipt of the interpreted gestures from the data receiving module 200. The processing module 202 may be configured to process the received data packets into the navigational commands, in an embodiment of the present invention. The processing module 202 may further be configured to transmit the navigational commands to the data transmission module 204.
[0033] In an embodiment of the present invention, the data transmission module 204 may be activated upon receipt of the navigational commands from the processing module 202. The data transmission module 204 may be configured to transmit the navigational commands to the controller 112 using the web-based service 122, in an embodiment of the present invention. After transmission of the navigational commands to the controller 112, the data transmission module 204 may transmit an activation signal to the actuation module 206.
[0034] In an embodiment of the present invention, the actuation module 206 may be activated upon receipt of the activation signal from the data transmission module 204. The actuation module 206 may be configured to actuate the motor 108 of the vehicle 102 based on the navigational commands, in an embodiment of the present invention.
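The four-module chain of paragraphs [0031]-[0034] can be condensed into a short sketch. The JSON packet layout and the gesture-to-command table are illustrative assumptions; in the system itself, `transmit` would carry the command to the controller 112 over the web-based service 122 rather than return it locally.

```python
import json

# Sketch of the four-module pipeline (modules 200-206) as plain
# functions. Packet format and command table are assumptions for
# illustration only.

GESTURE_TO_COMMAND = {"push": "FORWARD", "pull": "BACKWARD", "steer": "TURN"}

def receive(packet: str) -> str:          # data receiving module 200
    return json.loads(packet)["gesture"]

def process(gesture: str) -> str:         # processing module 202
    return GESTURE_TO_COMMAND.get(gesture, "STOP")

def transmit(command: str) -> str:        # data transmission module 204
    # Stand-in for sending to controller 112 via web-based service 122.
    return command

def actuate(command: str) -> str:         # actuation module 206
    return f"motor: {command}"

packet = json.dumps({"gesture": "push"})
print(actuate(transmit(process(receive(packet)))))  # motor: FORWARD
```

Chaining the functions mirrors the activation order in the text: each module runs only once the previous one hands its output onward.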
[0035] FIG. 3 depicts a flowchart of a method 300 for controlling the vehicle 102 using the system 100, according to an embodiment of the present invention.
[0036] At step 302, the system 100 may enable the sensor nodes 114 to detect the gestures of the user wearing the wearable device 104.
[0037] At step 304, the system 100 may enable the control unit 116 to be in communication with the sensor nodes 114 for interpreting the gestures detected by the sensor nodes 114.
[0038] At step 306, the system 100 may enable the processor 118 to receive the data packets comprising the interpreted gestures from the wearable device 104 over the Internet.
[0039] At step 308, the system 100 may process the received data packets into the navigational commands.
[0040] At step 310, the system 100 may enable the web-based service 122 to transmit the navigational commands to the controller 112 of the vehicle 102.
[0041] At step 312, the system 100 may actuate the motor 108 of the vehicle 102 based on the navigational commands.
[0042] While the invention has been described in connection with what is presently considered to be the most practical and various embodiments, it is to be understood that the invention is not to be limited to the disclosed embodiments, but on the contrary, is intended to cover various modifications and equivalent arrangements included within the scope of the appended claims.
[0043] This written description uses examples to disclose the invention, including the best mode, and also to enable any person skilled in the art to practice the invention, including making and using any devices or systems and performing any incorporated methods. The patentable scope of the invention is defined in the claims and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if they have structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal language of the claims.
CLAIMS
We Claim:
1. A gesture-based vehicle control system (100), the system (100) comprising:
a vehicle (102) equipped with a motor (108) and wheels (110a-110n);
a controller (112) installed in the vehicle (102), and configured to control the motor (108) and the wheels (110a-110n);
a wearable device (104) in communication with sensor nodes (114) and a control unit (116), wherein the sensor nodes (114) are configured to detect gestures of a user and the control unit (116) is configured to interpret the detected gestures;
a processor (118) located on a cloud server (106) in communication with the controller (112) and the control unit (116) over the Internet, characterized in that the processor (118) is configured to:
receive data packets comprising the interpreted gestures from the wearable device (104) over the internet;
process the received data packets into navigational commands;
transmit the navigational commands to the controller (112) using a web-based service (122); and
actuate the motor (108) of the vehicle (102) based on the navigational commands.
2. The system (100) as claimed in claim 1, wherein the processor (118) is configured to receive instructions from the wearable device (104), the wearable device (104) being selected from a glove, a ring, a wristband, a handheld controller, or a combination thereof.
3. The system (100) as claimed in claim 1, wherein the gestures are selected from a flick gesture, a pinch-in gesture, a pinch-out gesture, a push gesture, a pull gesture, a steer gesture, or a combination thereof.
4. The system (100) as claimed in claim 1, wherein the control unit (116) is an Espressif 32 (ESP32).
5. The system (100) as claimed in claim 1, wherein the cloud server (106) comprises a storage unit (120) for storing the navigational commands transmitted by the user in the form of gestures.
6. A method (300) for controlling a vehicle (102) using a gesture-based vehicle control system (100), the method (300) is characterized by steps of:
detecting gestures of a user, wearing a wearable device (104), by sensor nodes (114);
interpreting the detected gestures using a control unit (116) in communication with the sensor nodes (114);
receiving, by a processor (118), data packets comprising the interpreted gestures from the wearable device (104) over the Internet;
processing the received data packets into navigational commands;
transmitting the navigational commands to a controller (112) of the vehicle (102) using a web-based service (122); and
actuating a motor (108) of the vehicle (102) based on the navigational commands.
Date: May 28, 2024
Place: Noida

Dr. Keerti Gupta
Agent for the Applicant
(IN/PA-1529)

Documents

Application Documents

# Name Date
1 202441041774-STATEMENT OF UNDERTAKING (FORM 3) [29-05-2024(online)].pdf 2024-05-29
2 202441041774-REQUEST FOR EARLY PUBLICATION(FORM-9) [29-05-2024(online)].pdf 2024-05-29
3 202441041774-POWER OF AUTHORITY [29-05-2024(online)].pdf 2024-05-29
4 202441041774-OTHERS [29-05-2024(online)].pdf 2024-05-29
5 202441041774-FORM-9 [29-05-2024(online)].pdf 2024-05-29
6 202441041774-FORM FOR SMALL ENTITY(FORM-28) [29-05-2024(online)].pdf 2024-05-29
7 202441041774-FORM 1 [29-05-2024(online)].pdf 2024-05-29
8 202441041774-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [29-05-2024(online)].pdf 2024-05-29
9 202441041774-EDUCATIONAL INSTITUTION(S) [29-05-2024(online)].pdf 2024-05-29
10 202441041774-DRAWINGS [29-05-2024(online)].pdf 2024-05-29
11 202441041774-DECLARATION OF INVENTORSHIP (FORM 5) [29-05-2024(online)].pdf 2024-05-29
12 202441041774-COMPLETE SPECIFICATION [29-05-2024(online)].pdf 2024-05-29
13 202441041774-FORM-26 [11-07-2024(online)].pdf 2024-07-11