Abstract: Disclosed herein is a gesture-based smart home system (100) that comprises a hand patch (102) affixed to the dorsal surface of the hand, which further comprises a plurality of sensors (118) configured to sense signals depicting hand gestures, a first communication unit (120) configured to enable data transfer, and a microcontroller (124) configured to process data, the microcontroller further comprising an input module (126) configured to receive the sensed signals, a pre-processing module (128) configured to remove irrelevant data, a feature extraction module (134) configured to extract features, a gesture recognition module (138) configured to recognize hand gestures, an appliance allocator module (140) configured to identify the appropriate appliance (104), a connection establishment module (142) configured to connect the hand patch (102) with the appliance (104), a feedback module (144) configured to provide real-time feedback, and an output module (146) configured to display the output on a display screen (122); a communication network (106) configured to facilitate data transmission; and a user device (108) configured to track performance of the appliance (104).
Description:
FIELD OF DISCLOSURE
[0001] The present disclosure generally relates to a home appliance control system, and more specifically, to a gesture-based smart home system using machine learning models.
BACKGROUND OF THE DISCLOSURE
[0002] A home appliance control system integrates modern technology with household devices to enhance convenience, energy efficiency, and automation. By connecting appliances such as refrigerators, washing machines, thermostats, and lighting systems, users are able to control and monitor their home remotely. These systems typically rely on wireless technologies to enable seamless communication between devices. Equipped with sensors and smart algorithms, these appliances can adjust their settings based on real-time data, such as occupancy or environmental conditions, offering optimized performance and energy savings. This innovative approach offers a more intuitive and hands-free way to control appliances. The ability to remotely manage and automate household tasks not only enhances user comfort but also contributes to greater efficiency, reducing energy consumption and providing more control over daily activities.
[0003] Further, this approach offers significant benefits for people of all ages, enhancing convenience and accessibility in everyday life. For deaf and mute individuals, it provides a particularly valuable solution by enabling communication through hand gestures rather than spoken commands. This system allows them to control appliances, adjust settings, and interact with their environment in a way that is intuitive and independent, fostering greater autonomy and inclusivity. The use of non-verbal gestures ensures that individuals with hearing or speech impairments can easily navigate and manage their home appliances without relying on traditional voice-based interfaces.
[0004] The conventional systems face several limitations, as many systems primarily rely on voice commands or traditional remote controls, which are not accessible to deaf and mute individuals. These systems create a significant barrier for users who cannot interact through voice or find remotes challenging to use. As a result, individuals with hearing and speech impairments face difficulties in independently managing essential home electrical devices. Further, traditional systems struggle to accurately interpret gestures, often leading to errors or misinterpretations, especially in environments with cluttered backgrounds or multiple people present. Additionally, these systems can be costly and require regular calibration, which limits their accessibility and affordability. Moreover, the sensitivity of these systems to external factors like lighting conditions and ambient noise can further hinder their reliability. Lastly, the complexity of integrating gesture controls with existing home automation infrastructure remains a significant challenge, limiting the widespread adoption of these systems in everyday homes.
[0005] The present invention overcomes these limitations of the prior art by introducing a gesture-based smart home system. Instead of relying on voice commands, the present invention enables seamless control of home devices using advanced technologies, allowing for better interpretation of natural hand movements in diverse environments, reducing the likelihood of misinterpretations. Additionally, the present invention features enhanced adaptability, enabling users to customize gestures according to personal preferences, promoting a more intuitive and user-friendly experience. The system's robustness against external factors, such as varying lighting conditions and background noise, further enhances its reliability and performance. Moreover, the invention seamlessly integrates with current home automation systems, simplifying the process of controlling multiple appliances without the need for extensive infrastructure changes. This innovative approach makes smart home systems more accessible, empowering deaf and mute individuals to manage their household electronics independently, efficiently, and inclusively.
[0006] Thus, in light of the above-stated discussion, there exists a need for a gesture-based smart home system.
SUMMARY OF THE DISCLOSURE
[0007] The following is a summary description of illustrative embodiments of the invention. It is provided as a preface to assist those skilled in the art to more rapidly assimilate the detailed design discussion which ensues and is not intended in any way to limit the scope of the claims which are appended hereto in order to particularly point out the invention.
[0008] According to illustrative embodiments, the present disclosure focuses on a gesture-based smart home system which overcomes the above-mentioned disadvantages or at least provides the users with a useful or commercial choice.
[0009] An objective of the present disclosure is to develop a gesture-based smart home system using advanced technologies for better interpretation of gestures for controlling home appliances.
[0010] Another objective of the present disclosure is to develop a system that allows users to customize and personalize gestures for a more intuitive and seamless interaction with appliances.
[0011] Another objective of the present disclosure is to develop a system that is accessible to people of all ages, including the elderly and those with disabilities, such as the deaf and mute.
[0012] Another objective of the present disclosure is to bridge the accessibility gap in smart home technology by providing a practical and user-friendly system.
[0013] Yet another objective of the present disclosure is to create a system with large scalability, capable of handling multiple home appliances.
[0014] In light of the above, in one aspect of the present disclosure, a gesture-based smart home system is disclosed herein. The system comprises a hand patch of a pre-defined shape affixed to the dorsal surface of the hand to operate appliances, wherein the hand patch further comprises a plurality of sensors configured to sense signals depicting hand gestures of a user, a first communication unit configured to establish a communication link with the appliance and enable seamless data transfer, and a microcontroller connected to the plurality of sensors and the first communication unit and configured to interpret the hand gestures and generate appropriate commands for operating the appliances, wherein the microcontroller further comprises an input module configured to receive the sensed signals from the plurality of sensors, a pre-processing module configured to remove irrelevant data and noise from the received signals, a feature extraction module configured to extract features from the processed signals for further analysis, a gesture recognition module configured to analyse the extracted features and recognize hand gestures using machine learning models, an appliance allocator module configured to identify and direct the recognised hand gestures to the appropriate appliance, a connection establishment module configured to establish a connection between the first communication unit of the hand patch and a second communication unit of the appliance allocated to transmit commands, a feedback module configured to provide real-time feedback in response to performing the corresponding operations, and an output module configured to display the recognised hand gesture and the status of the appliance on a display screen. The system includes a communication network configured to facilitate data transmission within the system. The system also includes a user device connected to the hand patch and the appliances via the communication network and configured to track usage patterns and performance of the appliance in connection over time to provide insights to the user for optimization through a user interface.
[0015] In one embodiment, the system further comprises a cloud database configured to store data related to recognised hand gestures, user preferences, and all connected appliance settings.
[0016] In one embodiment, the hand patch further comprises a power supply unit configured to convert energy captured from the ambient light and the user hand into electrical energy for the power supply.
[0017] In one embodiment, the plurality of sensors comprises a flex sensor, and an inertial measurement unit sensor.
[0018] In one embodiment, the microcontroller further comprises a training and testing module configured to split the pre-processed data into training and testing datasets and train the machine learning models using the training dataset.
[0019] In one embodiment, the microcontroller further comprises an authentication module configured to authenticate the user to ensure secure and personalized access to the system.
[0020] In one embodiment, the microcontroller further comprises an evaluation module configured to analyse the performance of the pre-trained models.
[0021] In one embodiment, the gesture recognition module compares the extracted features to the pre-stored data associated with gesture patterns.
[0022] In one embodiment, the appliance allocator module maps the recognised hand gestures to the pre-stored gestures associated with pre-defined home appliance operations.
[0023] In light of the above, in one aspect of the present disclosure, a method for controlling home appliances is disclosed herein. The method includes receiving the sensed signals from the plurality of sensors via an input module. The method also includes removing irrelevant data and noise from the received signals via a pre-processing module. The method further includes extracting features from the processed signals for further analysis via a feature extraction module. Furthermore, the method includes analysing the extracted features and recognizing hand gestures using machine learning models via a gesture recognition module. In addition, the method includes identifying and directing the recognised hand gestures to the appropriate appliance via an appliance allocator module. Moreover, the method includes establishing a connection between the first communication unit of the hand patch and a second communication unit of the appliance allocated to transmit commands via a connection establishment module. Also, the method includes providing real-time feedback in response to performing the corresponding operations via a feedback module. The method includes displaying the recognised hand gesture and the status of the appliance on a display screen via an output module. Additionally, the method includes facilitating data transmission within the system via a communication network. Finally, the method includes tracking usage patterns and performance of the appliance in connection over time to provide insights to the user for optimization via a user device.
[0024] These and other advantages will be apparent from the present application of the embodiments described herein.
[0025] The preceding is a simplified summary to provide an understanding of some embodiments of the present invention. This summary is neither an extensive nor exhaustive overview of the present invention and its various embodiments. The summary presents selected concepts of the embodiments of the present invention in a simplified form as an introduction to the more detailed description presented below. As will be appreciated, other embodiments of the present invention are possible utilizing, alone or in combination, one or more of the features set forth above or described in detail below.
[0026] These elements, together with the other aspects of the present disclosure and various features are pointed out with particularity in the claims annexed hereto and form a part of the present disclosure. For a better understanding of the present disclosure, its operating advantages, and the specified object attained by its uses, reference should be made to the accompanying drawings and descriptive matter in which there are illustrated exemplary embodiments of the present disclosure.
BRIEF DESCRIPTION OF THE DRAWINGS
[0027] To describe the technical solutions in the embodiments of the present disclosure or in the prior art more clearly, the following briefly describes the accompanying drawings required for describing the embodiments or the prior art. Apparently, the accompanying drawings in the following description merely show some embodiments of the present disclosure, and a person of ordinary skill in the art can derive other implementations from these accompanying drawings without creative efforts. All of the embodiments or the implementations shall fall within the protection scope of the present disclosure.
[0028] The advantages and features of the present disclosure will become better understood with reference to the following detailed description taken in conjunction with the accompanying drawing, in which:
[0029] FIG. 1 illustrates a block diagram of a gesture-based smart home system, in accordance with an embodiment of the present disclosure; and
[0030] FIG. 2 illustrates a flow chart of a method, outlining the sequential steps for controlling home appliances, in accordance with an embodiment of the present disclosure.
[0031] Like reference numerals refer to like parts throughout the description of the several views of the drawings.
[0032] The gesture-based smart home system is illustrated in the accompanying drawings, in which like reference letters indicate corresponding parts in the various figures. It should be noted that the accompanying figures are intended to present illustrations of exemplary embodiments of the present disclosure. These figures are not intended to limit the scope of the present disclosure. It should also be noted that the accompanying figures are not necessarily drawn to scale.
DETAILED DESCRIPTION OF THE DISCLOSURE
[0033] The following is a detailed description of embodiments of the disclosure depicted in the accompanying drawings. The embodiments are in such detail as to communicate the disclosure. However, the amount of detail offered is not intended to limit the anticipated variations of embodiments; on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the scope of the present disclosure.
[0034] In the following description, numerous specific details are set forth in order to provide a thorough understanding of the embodiments of the present disclosure. It may be apparent to one skilled in the art that embodiments of the present disclosure may be practiced without some of these specific details.
[0035] Various terms as used herein are shown below. To the extent a term is used, it should be given the broadest definition persons in the pertinent art have given that term as reflected in printed publications and issued patents at the time of filing.
[0036] The terms “a” and “an” herein do not denote a limitation of quantity, but rather denote the presence of at least one of the referenced items.
[0037] The terms “having”, “comprising”, “including”, and variations thereof signify the presence of a component.
[0038] Reference is now made to FIG. 1 and FIG. 2 to describe various exemplary embodiments of the present disclosure. FIG. 1 illustrates a block diagram of a gesture-based smart home system 100, in accordance with an embodiment of the present disclosure.
[0039] The system 100 comprises a hand patch 102, which further comprises a plurality of sensors 118, a first communication unit 120, and a microcontroller 124, which further comprises an input module 126, a pre-processing module 128, a training and testing module 130, an authentication module 132, a feature extraction module 134, an evaluation module 136, a gesture recognition module 138, an appliance allocator module 140, a connection establishment module 142, a feedback module 144, and an output module 146. The system 100 may include a communication network 106. The system 100 may include a user device 108.
[0040] The system 100 facilitates hands-free control of the appliances 104 connected with the hand patch 102 by translating hand gestures into commands, thereby enhancing user interaction with smart home technologies.
[0041] The hand patch 102 is of a pre-defined shape and is affixed to the dorsal surface of the hand to operate the appliances 104. The hand patch 102 is a wearable device, designed to be placed on the back of the hand to enable remote operation of the appliances 104 using hand gestures.
[0042] In one embodiment of the present invention, the pre-defined shape of the hand patch 102 is a long rectangle. This shape covers the central area of the back of the hand, running from the wrist towards the knuckles.
[0043] In one embodiment of the present invention, the appliances 104 may include televisions, lights, air conditioners, and other appliances.
[0044] The plurality of sensors 118 is configured to sense signals depicting hand gestures of a user. The plurality of sensors 118 senses various user gestures, movements, and positions of the hand and fingers. The hand gestures may include swipe, point, zoom, and wave gestures.
[0045] In an exemplary embodiment of the present invention, swiping the hand up and down increases and decreases the volume on the television and the temperature on the air conditioner, respectively. Swiping the fingers left and right changes channels and switches between applications on smart devices. Pointing a finger at a device, such as a light or an air conditioner, turns the appliance 104 on or off or changes its settings, like switching the AC mode. A pinch gesture is used to adjust screen brightness on the television and dim the lights, creating the perfect ambiance. Lastly, a wave of the hand triggers functions like powering on the appliances 104 and activating preset modes, making for a truly hands-free, user-friendly experience.
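By way of illustration only, and not forming part of the claimed subject matter, the gesture-to-operation associations described above may be represented as a simple lookup table. The gesture, appliance, and operation names in the following Python sketch are hypothetical placeholders:

```python
# Hypothetical gesture-to-operation lookup table; all gesture, appliance,
# and operation names are illustrative placeholders, not from the disclosure.
GESTURE_MAP = {
    ("swipe_up", "television"): "volume_up",
    ("swipe_down", "television"): "volume_down",
    ("swipe_up", "air_conditioner"): "temperature_up",
    ("swipe_down", "air_conditioner"): "temperature_down",
    ("swipe_left", "television"): "previous_channel",
    ("swipe_right", "television"): "next_channel",
    ("point", "light"): "toggle_power",
    ("pinch", "television"): "adjust_brightness",
    ("wave", "any"): "activate_preset_mode",
}

def resolve_operation(gesture: str, appliance: str):
    """Return the operation for a (gesture, appliance) pair, if one is defined."""
    # Fall back to appliance-independent gestures such as the wave.
    return GESTURE_MAP.get((gesture, appliance)) or GESTURE_MAP.get((gesture, "any"))
```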
[0046] In one embodiment of the present invention, the plurality of sensors 118 comprises a flex sensor 148, and an inertial measurement unit sensor 150.
[0047] In one embodiment of the present invention, the flex sensor 148 detects bending and flexing of the fingers, which is crucial for tracking gestures that involve finger movements, such as pinching, and swiping. The inertial measurement unit sensor 150 detects hand movements and orientation, especially for gestures that involve the entire hand, like waving, swiping, and pointing.
[0048] The first communication unit 120 is configured to establish a communication link with the appliance 104 and enable seamless data transfer. It facilitates the seamless data transfer by establishing the communication link with a second communication unit 112 of the appliance 104.
[0049] In one embodiment of the present invention, the first communication unit 120 and the second communication unit 112 enable communication utilising near field communication (NFC) technology to facilitate secure, low-power, and short-range wireless interactions between the hand patch 102 and the appliances 104.
[0050] The microcontroller 124 is connected to the plurality of sensors 118, the first communication unit 120, and is configured to interpret the hand gestures and generate appropriate commands for operating the appliances 104, wherein the microcontroller 124 further comprises several modules.
[0051] The input module 126 is configured to receive the sensed signals from the plurality of sensors 118. The input module 126 is responsible for interfacing with the plurality of sensors 118 and ensuring that the data is accurately received for interpretation.
[0052] The pre-processing module 128 is configured to remove irrelevant data and noise from the received signals. The pre-processing module 128 removes minor fluctuations caused by accidental movements and environmental interference. It performs specific tasks such as noise reduction and data smoothing to ensure that only the most relevant hand gesture data is passed further.
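By way of illustration only, one possible form of such pre-processing is a moving-average smoothing step followed by suppression of fluctuations below a noise floor; the window size and threshold below are assumed values, not specified in this disclosure:

```python
import numpy as np

def preprocess(signal: np.ndarray, window: int = 5, noise_floor: float = 0.02) -> np.ndarray:
    """Smooth a 1-D sensor signal and suppress minor fluctuations."""
    kernel = np.ones(window) / window
    smoothed = np.convolve(signal, kernel, mode="same")  # data smoothing
    smoothed[np.abs(smoothed) < noise_floor] = 0.0       # drop accidental jitter
    return smoothed
```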
[0053] In one embodiment of the present invention, the microcontroller 124 further comprises the training and testing module 130 configured to split the pre-processed data into training and testing datasets and train the machine learning models using the training dataset.
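By way of illustration only, the split-and-train step of the training and testing module 130 may be sketched as follows; the 80/20 split ratio is an assumption, and the support vector machine is chosen because SVMs are among the models named later in this disclosure:

```python
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

def train_gesture_model(features, labels):
    """Split pre-processed data into training/testing sets and fit a model."""
    X_train, X_test, y_train, y_test = train_test_split(
        features, labels, test_size=0.2, random_state=42, stratify=labels)
    model = SVC(kernel="rbf")  # one of the model families named in the disclosure
    model.fit(X_train, y_train)
    return model, X_test, y_test
```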
[0054] In one embodiment of the present invention, the microcontroller 124 further comprises the authentication module 132 configured to authenticate the user to ensure secure and personalized access to the system 100. The authentication module 132 verifies the identity of the user and the appliance 104, ensuring that only authorized entities are granted access to a particular service. For user authentication, a basic form is provided in which the user enters a unique combination of a username and a password; for authenticating the appliance 104, the NFC and Bluetooth links exchange cryptographic tokens.
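By way of illustration only, a minimal sketch of both authentication paths is shown below. The credential store, the salt, and the HMAC-based token exchange are hypothetical details; the disclosure states only that a username/password pair and cryptographic tokens are used:

```python
import hashlib
import hmac

# Hypothetical credential store keyed by username.
_USERS = {"user1": hashlib.sha256(b"salt" + b"password1").hexdigest()}

def authenticate_user(username: str, password: str) -> bool:
    """Constant-time comparison of a salted password hash."""
    digest = hashlib.sha256(b"salt" + password.encode()).hexdigest()
    return hmac.compare_digest(_USERS.get(username, ""), digest)

def appliance_token(shared_key: bytes, challenge: bytes) -> bytes:
    """One possible cryptographic token for appliance authentication over NFC/Bluetooth."""
    return hmac.new(shared_key, challenge, hashlib.sha256).digest()
```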
[0055] The feature extraction module 134 is configured to extract features from the processed signals for further analysis. This module 134 extracts key features from the data, which are essential for hand gesture recognition and allocation of the appliances 104.
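By way of illustration only, simple statistical features over a window of sensor samples might be extracted as follows; the specific features chosen (mean, variability, range) are assumptions, as the disclosure does not enumerate them:

```python
import numpy as np

def extract_features(window: np.ndarray) -> np.ndarray:
    """Statistical features over a window (rows = samples, columns = sensor channels)."""
    return np.concatenate([
        window.mean(axis=0),                      # average posture per channel
        window.std(axis=0),                       # movement variability per channel
        window.max(axis=0) - window.min(axis=0),  # range of motion per channel
    ])
```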
[0056] In one embodiment of the present invention, the microcontroller 124 further comprises the evaluation module 136 configured to analyse the performance of the pre-trained models. The evaluation module 136 utilises F1-score, precision, accuracy, and recall to evaluate the performance of the pre-trained models.
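By way of illustration only, the four metrics named above may be computed with scikit-learn as follows, assuming a fitted classifier and a held-out test set such as those returned by the training sketch earlier:

```python
from sklearn.metrics import accuracy_score, f1_score, precision_score, recall_score

def evaluate(model, X_test, y_test) -> dict:
    """Compute the four metrics the evaluation module relies on."""
    y_pred = model.predict(X_test)
    return {
        "accuracy": accuracy_score(y_test, y_pred),
        "precision": precision_score(y_test, y_pred, average="macro"),
        "recall": recall_score(y_test, y_pred, average="macro"),
        "f1_score": f1_score(y_test, y_pred, average="macro"),
    }
```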
[0057] The gesture recognition module 138 is configured to analyse the extracted features and recognize hand gestures using machine learning models. The gesture recognition module 138 utilises machine learning models such as neural networks, support vector machines (SVMs), and so forth. These models are trained to recognize specific hand gestures based on their temporal sequence and spatial characteristics, allowing for high-accuracy recognition even under varying user conditions.
[0058] In one embodiment of the present invention, the gesture recognition module 138 compares the extracted features to the pre-stored data associated with gesture patterns. The gesture recognition module 138 processes incoming gesture data in real-time and identifies the closest matching pre-defined gesture, providing a high level of responsiveness and adaptability.
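By way of illustration only, the closest-match comparison against pre-stored gesture patterns may be sketched as a nearest-template search; the Euclidean distance measure and the rejection threshold are assumptions:

```python
import numpy as np

def recognise_gesture(features, templates, threshold=1.0):
    """Return the pre-stored gesture whose template is closest to the
    extracted features, or None if no template is close enough."""
    best_name, best_dist = None, float("inf")
    for name, template in templates.items():
        dist = float(np.linalg.norm(features - template))
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist <= threshold else None
```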
[0059] The appliance allocator module 140 is configured to identify and direct the recognised hand gestures to the appropriate appliance 104. The appliance allocator module 140 ensures that a specific gesture is accurately mapped to the corresponding appliance operation, thereby enabling the system 100 to identify and direct the gesture to the appropriate appliance.
[0060] In one embodiment of the present invention, the appliance allocator module 140 maps the recognised hand gestures to the pre-stored gestures associated with pre-defined home appliance operations. The appliance allocator module 140 identifies the closest matching pre-defined gesture, providing a high level of responsiveness and adaptability.
[0061] The connection establishment module 142 is configured to establish a connection between the first communication unit 120 of the hand patch 102 and the second communication unit 112 of the appliance 104 allocated to transmit commands. The connection between the hand patch 102 and the appliance 104 is established via near field communication (NFC) and Bluetooth.
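By way of illustration only, a command frame sent over the established link might be encoded as below. The wire format (a 2-byte length prefix plus a JSON payload) is entirely an assumption; the disclosure does not define one:

```python
import json
import struct

def encode_command(appliance_id: str, operation: str) -> bytes:
    """Encode a command as a 2-byte big-endian length prefix plus a JSON payload."""
    payload = json.dumps({"appliance": appliance_id, "op": operation}).encode()
    return struct.pack(">H", len(payload)) + payload

def decode_command(frame: bytes) -> dict:
    """Inverse of encode_command, run on the appliance side."""
    (length,) = struct.unpack(">H", frame[:2])
    return json.loads(frame[2:2 + length].decode())
```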
[0062] The feedback module 144 is configured to provide real-time feedback in response to performing the corresponding operations. The feedback module 144 provides real-time feedback to the user after the operation of the appliance 104 is triggered. It generates auditory feedback, visual feedback, and haptic feedback. This feedback is essential for confirming the successful execution of the gesture, enhancing user interaction, and ensuring that the user is informed about the state of the appliance 104.
[0063] The output module 146 is configured to display the recognised hand gesture and the status of the appliance 104 on a display screen 122.
[0064] In one embodiment of the present invention, the hand patch 102 further comprises a power supply unit 116 configured to convert energy captured from the ambient light and the user hand into electrical energy for the power supply. This power supply unit 116 harnesses energy from photovoltaic cells, allowing it to capture ambient light and convert it into electrical energy. Additionally, the power supply unit 116 is designed to charge by capturing energy from the human body (specifically from the hand) when the user's hand is in motion and in contact with the hand patch 102, effectively recharging the hand patch 102 without requiring external charging sources.
[0065] The communication network 106 is configured to facilitate data transmission within the system 100. The exchange of data is facilitated between the hand patch 102, the appliances 104, and the user device 108 via the communication network 106.
[0066] In one embodiment of the present invention, the communication network 106 may include wired and wireless networks.
[0067] In a preferred embodiment of the present invention, the communication network 106 may include wireless networks.
[0068] The user device 108 is connected to the hand patch 102 and the appliances 104 via the communication network 106 and is configured to track usage patterns and performance of the appliance 104 in connection over time to provide insights to the user for optimization through a user interface 114. The user device 108 provides users with an intuitive way to view, control, and manage appliances 104, making the system 100 more interactive, user-friendly, and efficient in managing home automation tasks. Further, it is configured to provide recommendations related to the connected appliance parameters based on real-time environmental conditions.
[0069] In an exemplary embodiment of the present invention, the user device 108 may recommend adjusting the temperature setting based on current room conditions and suggest modifying the brightness of lights depending on ambient light levels. These recommendations help users make informed decisions to improve convenience and energy savings.
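By way of illustration only, such environment-driven recommendations might be generated with simple rules as follows; all thresholds are assumed values, not part of the disclosure:

```python
def recommendations(room_temp_c: float, ambient_lux: float) -> list:
    """Rule-based suggestions derived from real-time environmental readings."""
    tips = []
    if room_temp_c > 26.0:
        tips.append("Lower the air-conditioner set point for comfort.")
    elif room_temp_c < 18.0:
        tips.append("Raise the air-conditioner set point to save energy.")
    if ambient_lux > 500.0:
        tips.append("Dim the lights; ambient light is already sufficient.")
    return tips
```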
[0070] In one embodiment of the present invention, the system 100 further comprises a cloud database 110 configured to store data related to recognised hand gestures, user preferences, and all connected appliance settings.
[0071] FIG. 2 illustrates a flow chart of a method 200, outlining the sequential steps for controlling home appliances, in accordance with an embodiment of the present disclosure.
[0072] At step 202, the sensed signals are received from the plurality of sensors 118 via the input module 126.
[0073] At step 204, irrelevant data and noise are removed from the received signals via the pre-processing module 128.
[0074] At step 206, features are extracted from the processed signals via the feature extraction module 134 for further analysis.
[0075] At step 208, the extracted features are analysed, and hand gestures are recognised via the gesture recognition module 138 using machine learning models.
[0076] At step 210, the recognised hand gestures are identified and directed to the appropriate appliance 104 via the appliance allocator module 140.
[0077] At step 212, a connection between the first communication unit of the hand patch 102 and a second communication unit 112 of the appliance 104 allocated is established to transmit commands via the connection establishment module 142.
[0078] At step 214, real-time feedback is provided in response to performing the corresponding operations via the feedback module 144.
[0079] At step 216, the recognised hand gesture and the status of the appliance 104 are displayed on a display screen 122 via the output module 146.
[0080] At step 218, data transmission within the system 100 is facilitated via the communication network 106.
[0081] At step 220, usage patterns and performance of the appliance 104 in connection over time are tracked to provide insights to the user for optimization via the user device 108.
[0082] In the best mode of operation of the present invention, the plurality of sensors 118 integrated into the hand patch 102 detect signals corresponding to hand gestures. These signals are transmitted to the microcontroller 124 and received by the input module 126. Irrelevant data and noise are filtered out through the pre-processing module 128, and relevant features are extracted via the feature extraction module 134 for subsequent analysis. The extracted features are then analysed and hand gestures are recognised through the gesture recognition module 138, utilizing machine learning models. The recognised gestures are then identified and directed to the appropriate appliance 104 via the appliance allocator module 140. A communication link is established between the first communication unit 120 of the hand patch 102 and the second communication unit 112 of the designated appliance 104 through the connection establishment module 142 to transmit control commands. Real-time feedback is provided to the user based on the performed actions through the feedback module 144. The recognised gesture, along with the operational status of the appliance 104, is displayed on the display screen 122 via the output module 146. The user device 108 tracks usage patterns and appliance performance over time, providing optimization insights to the user via the communication network 106.
[0083] While the invention has been described in connection with what is presently considered to be the most practical and various embodiments, it will be understood that the invention is not to be limited to the disclosed embodiments, but on the contrary, is intended to cover various modifications and equivalent arrangements included within the scope of the appended claims.
[0084] A person of ordinary skill in the art may be aware that, in combination with the examples described in the embodiments disclosed in this specification, units and algorithm steps may be implemented by electronic hardware, computer software, or a combination thereof.
[0085] The foregoing descriptions of specific embodiments of the present disclosure have been presented for purposes of illustration and description. They are not intended to be exhaustive or to limit the present disclosure to the precise forms disclosed, and many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described to best explain the principles of the present disclosure and its practical application, and to thereby enable others skilled in the art to best utilize the present disclosure and various embodiments with various modifications as are suited to the particular use contemplated. It is understood that various omissions and substitutions of equivalents are contemplated as circumstances may suggest or render expedient, but such omissions and substitutions are intended to cover the application or implementation without departing from the scope of the present disclosure.
[0086] Disjunctive language such as the phrase “at least one of X, Y, Z,” unless specifically stated otherwise, is otherwise understood with the context as used in general to present that an item, term, etc., may be either X, Y, or Z, or any combination thereof (e.g., X, Y, and/or Z). Thus, such disjunctive language is not generally intended to, and should not, imply that certain embodiments require at least one of X, at least one of Y, or at least one of Z to each be present.
[0087] In a case that no conflict occurs, the embodiments in the present disclosure and the features in the embodiments may be mutually combined. The foregoing descriptions are merely specific implementations of the present disclosure, but are not intended to limit the protection scope of the present disclosure. Any variation or replacement readily figured out by a person skilled in the art within the technical scope disclosed in the present disclosure shall fall within the protection scope of the present disclosure. Therefore, the protection scope of the present disclosure shall be subject to the protection scope of the claims.
Claims:
I/We Claim:
1. A gesture-based smart home system (100) for controlling home appliances, the system (100) comprising:
a hand patch (102) of a pre-defined shape, affixed to the dorsal surface of the hand to operate appliances (104), wherein the hand patch (102) further comprises:
a plurality of sensors (118) configured to sense signals depicting hand gestures of a user;
a first communication unit (120) configured to establish a communication link with the appliance (104) and enable seamless data transfer;
a microcontroller (124) connected to the plurality of sensors (118), the first communication unit (120) and configured to interpret the hand gestures and generate appropriate commands for operating the appliances (104), wherein the microcontroller (124) further comprises:
an input module (126) configured to receive the sensed signals from the plurality of sensors (118);
a pre-processing module (128) configured to remove irrelevant data and noise from the received signals;
a feature extraction module (134) configured to extract features from the processed signals for further analysis;
a gesture recognition module (138) configured to analyse the extracted features and recognize hand gestures using machine learning models;
an appliance allocator module (140) configured to identify and direct the recognised hand gestures to the appropriate appliance (104);
a connection establishment module (142) configured to establish a connection between the first communication unit of the hand patch (102) and a second communication unit (112) of the appliance (104) allocated to transmit commands;
a feedback module (144) configured to provide real-time feedback in response to performing the corresponding operations;
an output module (146) configured to display the recognised hand gesture and the status of the appliance (104) on a display screen (122);
a communication network (106) configured to facilitate data transmission within the system (100); and
a user device (108) connected to the hand patch (102) and the appliances (104) via the communication network (106) and configured to track usage patterns and performance of the appliance (104) in connection over time to provide insights to the user for optimization through a user interface (114).
2. The system (100) as claimed in claim 1, wherein the system (100) further comprises a cloud database (110) configured to store data related to recognised hand gestures, user preferences, and all connected appliance settings.
3. The system (100) as claimed in claim 1, wherein the hand patch (102) further comprises a power supply unit (116) configured to convert energy captured from the ambient light and the user hand into electrical energy for the power supply.
4. The system (100) as claimed in claim 1, wherein the plurality of sensors (118) comprises a flex sensor (148), and an inertial measurement unit sensor (150).
5. The system (100) as claimed in claim 1, wherein the microcontroller (124) further comprises a training and testing module (130) configured to split the pre-processed data into training and testing datasets and train the machine learning models using the training dataset.
6. The system (100) as claimed in claim 1, wherein the microcontroller (124) further comprises an authentication module (132) configured to authenticate the user to ensure secure and personalized access to the system (100).
7. The system (100) as claimed in claim 1, wherein the microcontroller (124) further comprises an evaluation module (136) configured to analyse the performance of the pre-trained models.
8. The system (100) as claimed in claim 1, wherein the gesture recognition module (138) compares the extracted features to the pre-stored data associated with gesture patterns.
9. The system (100) as claimed in claim 1, wherein the appliance allocator module (140) maps the recognised hand gestures to the pre-stored gestures associated with pre-defined home appliance operations.
10. A method (200) for controlling home appliances, the method (200) comprising:
receiving the sensed signals from the plurality of sensors (118) via an input module (126);
removing irrelevant data and noise from the received signals via a pre-processing module (128);
extracting features from the processed signals for further analysis via a feature extraction module (134);
analysing the extracted features and recognizing hand gestures using machine learning models via a gesture recognition module (138);
identifying and directing the recognised hand gestures to the appropriate appliance (104) via an appliance allocator module (140);
establishing a connection between the first communication unit of the hand patch (102) and a second communication unit (112) of the appliance (104) allocated to transmit commands via a connection establishment module (142);
providing real-time feedback in response to performing the corresponding operations via a feedback module (144);
displaying the recognised hand gesture and the status of the appliance (104) on a display screen (122) via an output module (146);
facilitating data transmission within the system (100) via a communication network (106); and
tracking usage patterns and performance of the appliance (104) in connection over time to provide insights to the user for optimization via a user device (108).