
Washing Machine With Gesture Based Control System And Method Thereof

Abstract: Disclosed herein is a washing machine with gesture-based control system and method thereof (100) comprising a processing unit (102), configured to execute gesture recognition, a gesture recognition unit (104), housed within the processing unit (102), an image capturing unit (106), operably connected to the processing unit (102), the image capturing unit configured to capture real-time images or videos of user gestures, a communication network (108), configured to enable data exchange between a plurality of system components, a control unit (110), operably connected to the processing unit (102) via the communication network (108), the control unit configured to map the identified gestures to corresponding washing machine functions, a washing module (112), operably connected to the control unit (110), a display unit (114), operably connected to the control unit (110), the display unit being configured to provide real-time visual feedback, and an audio feedback unit (116), operably connected to the control unit (110).


Patent Information

Application #:
Filing Date: 07 April 2025
Publication Number: 18/2025
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Status:
Parent Application:

Applicants

SR UNIVERSITY
ANANTHSAGAR, HASANPARTHY (M), WARANGAL URBAN, TELANGANA - 506371, INDIA

Inventors

1. NARESH KUMAR SRIPADA
SR UNIVERSITY, ANANTHSAGAR, HASANPARTHY (M), WARANGAL URBAN, TELANGANA - 506371, INDIA
2. SIRIKONDA SHWETHA
SUMATHI REDDY INSTITUTE OF TECHNOLOGY FOR WOMEN, ANANTHASAGAR, HASANPARTHY, WARANGAL, INDIA
3. N. SANJANA
SR UNIVERSITY, ANANTHSAGAR, HASANPARTHY (M), WARANGAL URBAN, TELANGANA - 506371, INDIA
4. D. DEEKSHITHA
SR UNIVERSITY, ANANTHSAGAR, HASANPARTHY (M), WARANGAL URBAN, TELANGANA - 506371, INDIA
5. G. SOUJANYA
SR UNIVERSITY, ANANTHSAGAR, HASANPARTHY (M), WARANGAL URBAN, TELANGANA - 506371, INDIA
6. S. NEELIMA
SR UNIVERSITY, ANANTHSAGAR, HASANPARTHY (M), WARANGAL URBAN, TELANGANA - 506371, INDIA
7. T. SINDHUJA
SR UNIVERSITY, ANANTHSAGAR, HASANPARTHY (M), WARANGAL URBAN, TELANGANA - 506371, INDIA

Specification

Description:
FIELD OF DISCLOSURE
[0001] The present disclosure relates generally to gesture-based appliance control and, more specifically, to a washing machine with gesture-based control system and method thereof.
BACKGROUND OF THE DISCLOSURE
[0002] Traditional washing machines require physical contact with buttons or touch panels, which can accumulate dirt and germs over time. This invention allows users to operate the washing machine using simple hand gestures, eliminating the need for direct contact. This touch-free control enhances hygiene, making it particularly beneficial for users concerned about cleanliness, such as in hospitals, kitchens, or homes with young children.
[0003] Many people, including those with mobility impairments, elderly individuals, or those with wet or dirty hands, find it difficult to operate traditional washing machines. With gesture-based controls, users can easily start, stop, and customize their washing cycles without struggling with small buttons or complex interfaces. This makes laundry more accessible and convenient for a wider range of users.
[0004] Gesture-controlled technology adds a futuristic and effortless experience to home appliances. Users can control the washing machine with intuitive hand movements without searching for buttons or navigating complicated settings. This modern approach enhances convenience, allowing for quicker and more enjoyable operation, making laundry a hassle-free task.
[0005] Many existing washing machines rely on physical buttons or touchscreens, which can be difficult to operate for people with mobility impairments or those whose hands are wet or dirty. This lack of adaptability makes these machines less user-friendly, especially in situations where quick and effortless control is needed. Additionally, users may struggle to operate the machine in low-light conditions or when they are multitasking.
[0006] Traditional washing machines with physical buttons or touch panels experience wear and tear over time, leading to malfunctioning controls. Users may face issues like stuck buttons, unresponsive panels, or broken dials, resulting in frequent repairs or replacements, increasing maintenance costs and inconvenience. Over time, the repeated pressing of buttons or the use of touchscreens can also lead to fading labels, making it harder to understand or select the correct settings.
[0007] Many existing washing machines have limited connectivity and may not seamlessly integrate with modern smart home ecosystems. Users often need to manually adjust settings or operate the machine in person, rather than enjoying remote control or automation features that could make laundry management easier and more efficient. This lack of smart integration prevents users from monitoring their laundry cycles remotely or receiving helpful notifications about the washing process.
[0008] Thus, in light of the above-stated discussion, there exists a need for a washing machine with gesture-based control system and method thereof.
SUMMARY OF THE DISCLOSURE
[0009] The following is a summary description of illustrative embodiments of the invention. It is provided as a preface to assist those skilled in the art to more rapidly assimilate the detailed design discussion which ensues and is not intended in any way to limit the scope of the claims which are appended hereto in order to particularly point out the invention.
[0010] According to illustrative embodiments, the present disclosure focuses on a washing machine with gesture-based control system and method thereof which overcomes the above-mentioned disadvantages or provides the users with a useful or commercial choice.
[0011] An objective of the present disclosure is to provide a washing machine system that allows users to operate it without physical contact, ensuring a seamless and intuitive experience. This enhances ease of use while reducing dependency on manual controls, making laundry more convenient for all users. The system is designed to be user-friendly and accessible for people of all age groups.
[0012] An objective of the present disclosure is to improve hygiene by eliminating the need to touch physical buttons or dials, reducing the risk of germ transmission. This ensures a cleaner and safer user experience, particularly in shared spaces or public laundry facilities. It further promotes a more sanitary washing environment.
[0013] Another objective of the present disclosure is to enhance accessibility for individuals with mobility impairments or disabilities by enabling hands-free operation. This allows users who may struggle with conventional controls to operate the machine independently. The system is designed to cater to a wide range of user needs, making it more inclusive.
[0014] Another objective of the present disclosure is to integrate advanced gesture recognition technology to enable smooth and accurate user interactions. This helps prevent errors and ensures the correct commands are executed based on user intentions. The intuitive operation minimizes confusion and makes it easier to learn and use.
[0015] Another objective of the present disclosure is to provide a washing machine system that adapts to modern smart home environments, allowing seamless integration with other automated appliances. The system is designed to enhance convenience by enabling remote operation and monitoring. It improves the overall efficiency of household management.
[0016] Another objective of the present disclosure is to provide a system that simplifies the washing process by reducing the number of manual steps required to operate the machine. This helps save time and effort, making laundry a more effortless task. The system ensures that users can focus on other activities without constantly monitoring the machine.
[0017] Another objective of the present disclosure is to minimize user errors by providing a simple and effective control mechanism that does not require complicated instructions. The system ensures that users can easily understand and operate the machine without extensive learning or technical knowledge. This enhances the overall usability of the appliance.
[0018] Another objective of the present disclosure is to reduce wear and tear caused by physical buttons and touch panels, thereby increasing the longevity and durability of the washing machine. With fewer mechanical components subject to damage, the system enhances the reliability of the appliance. This ultimately reduces maintenance and repair costs for users.
[0019] Another objective of the present disclosure is to ensure that the system can function effectively under various environmental conditions, including low-light settings. The washing machine should remain responsive and operational regardless of external factors. This improves the overall usability and ensures consistent performance at all times.
[0020] Yet another objective of the present disclosure is to provide a washing machine system that is aesthetically modern and innovative, offering a sleek and futuristic user experience. The gesture-based control system enhances the design appeal of the appliance, making it a stylish addition to any home. This ensures that the machine not only performs efficiently but also complements modern home interiors.
[0021] In light of the above, in one aspect of the present disclosure, a washing machine with gesture-based controlled system is disclosed herein. The system comprises a processing unit configured to execute gesture recognition and control washing machine operations. The system includes a gesture recognition unit, housed within the processing unit, the gesture recognition unit including an artificial intelligence-based processor configured to analyse captured images and identify predefined user gestures. The system also includes an image capturing unit operably connected to the processing unit, the image capturing unit configured to capture real-time images or videos of user gesture. The system also includes a communication network configured to enable bidirectional data exchange between a plurality of system components. The system also includes a control unit operably connected to the processing unit via the communication network, the control unit configured to map the identified gestures to corresponding washing machine functions. The system also includes a washing module operably connected to the control unit, the washing module utilizing a drum, a motor, a water inlet, a detergent dispenser, and actuators, each operably connected to the control unit, to perform washing operations. The system also includes a display unit operably connected to the control unit, the display unit being configured to provide real-time visual feedback regarding selected gestures, washing cycle status, and user instructions. The system also includes an audio feedback unit operably connected to the control unit, the audio feedback unit being configured to provide auditory confirmation of detected gestures and washing machine status updates.
[0022] In one embodiment, the processing unit comprises a microprocessor, a memory unit, and a neural processing core, the neural processing core dedicated to executing artificial intelligence-based gesture recognition algorithms.
[0023] In one embodiment, the gesture recognition unit comprises a feature extraction module, a gesture classification module, and a gesture mapping module, each operably connected to the neural processing core of the processing unit.
[0024] In one embodiment, the image capturing unit comprises an optical sensor array, an infrared illumination module, and a depth-sensing module, each configured to enhance the accuracy of gesture recognition in varying lighting conditions.
[0025] In one embodiment, the control unit comprises a command processing module, a washing cycle execution module, and a safety monitoring module, each configured to process user commands, execute washing operations, and ensure safe operation of the system.
[0026] In one embodiment, the display unit comprises a touch-enabled liquid-crystal display (LCD) panel and a light-emitting diode (LED) indicator array, each operably connected to the control unit to provide visual feedback on system status and user inputs.
[0027] In one embodiment, the audio feedback unit comprises a speaker array and a voice synthesis module, each configured to generate sound alerts and spoken notifications in response to washing machine status updates.
[0028] In one embodiment, the communication network comprises a wireless transceiver, a signal processing module, and a data encryption module, each configured to enable secure and reliable data exchange between system components.
[0029] In one embodiment, the washing module utilizes a motor control circuit, a water flow regulation valve, a detergent dispensing actuator, and a drum rotation sensor, each operably connected to the control unit to execute precise washing functions based on received control signals.
[0030] In light of the above, in another aspect of the present disclosure, a method for controlling a washing machine with a gesture-based control system is disclosed herein. The method comprises capturing real-time images or videos of user gestures using an image capturing unit. The method includes transmitting the captured image data to a processing unit via a communication network. The method also includes analysing the captured image data within a gesture recognition unit, housed within the processing unit, the gesture recognition unit including an artificial intelligence-based processor configured to identify predefined user gestures based on feature extraction and pattern recognition. The method also includes mapping the identified user gestures to corresponding washing machine functions using a control unit operably connected to the processing unit. The method also includes executing the mapped washing machine functions by a washing module utilizing a motor, a drum, a water inlet, a detergent dispenser, and actuators, each operably connected to the control unit. The method also includes providing real-time visual feedback through a display unit operably connected to the control unit, the display unit configured to display the selected washing function, cycle status, and user prompts. The method also includes delivering auditory confirmation through an audio feedback unit, the audio feedback unit generating voice prompts or sound notifications to indicate successful gesture recognition and washing machine operation status. The method also includes monitoring user gestures throughout the washing cycle, allowing dynamic adjustments to washing parameters based on user input.
[0031] These and other advantages will be apparent from the present application of the embodiments described herein.
[0032] The preceding is a simplified summary to provide an understanding of some embodiments of the present invention. This summary is neither an extensive nor exhaustive overview of the present invention and its various embodiments. The summary presents selected concepts of the embodiments of the present invention in a simplified form as an introduction to the more detailed description presented below. As will be appreciated, other embodiments of the present invention are possible utilizing, alone or in combination, one or more of the features set forth above or described in detail below.
[0033] These elements, together with the other aspects of the present disclosure and various features are pointed out with particularity in the claims annexed hereto and form a part of the present disclosure. For a better understanding of the present disclosure, its operating advantages, and the specified object attained by its uses, reference should be made to the accompanying drawings and descriptive matter in which there are illustrated exemplary embodiments of the present disclosure.
BRIEF DESCRIPTION OF THE DRAWINGS
[0034] To describe the technical solutions in the embodiments of the present disclosure or in the prior art more clearly, the following briefly describes the accompanying drawings required for describing the embodiments or the prior art. Apparently, the accompanying drawings in the following description merely show some embodiments of the present disclosure, and a person of ordinary skill in the art can derive other implementations from these accompanying drawings without creative efforts. All of the embodiments or the implementations shall fall within the protection scope of the present disclosure.
[0035] The advantages and features of the present disclosure will become better understood with reference to the following detailed description taken in conjunction with the accompanying drawing, in which:
[0036] FIG. 1 illustrates a block diagram of a washing machine with gesture-based control system and method thereof, in accordance with an exemplary embodiment of the present disclosure;
[0037] FIG. 2 illustrates a flowchart of a washing machine with gesture-based control system, in accordance with an exemplary embodiment of the present disclosure;
[0038] FIG. 3 illustrates a flowchart of a washing machine with gesture-based control method, in accordance with an exemplary embodiment of the present disclosure;
[0039] FIGS. 4A-4D illustrate screenshots of the different gestures of a washing machine with gesture-based control system and method thereof, in accordance with an exemplary embodiment of the present disclosure;
[0040] FIG. 5 illustrates a data flow diagram of a washing machine with gesture-based control system and method thereof, in accordance with an exemplary embodiment of the present disclosure;
[0041] FIG. 6 illustrates a workflow of a washing machine with gesture-based control system and method thereof, in accordance with an exemplary embodiment of the present disclosure;
[0042] FIG. 7 illustrates a screenshot of the architecture of a washing machine with gesture-based control system and method thereof, in accordance with an exemplary embodiment of the present disclosure;
[0043] FIGS. 8A-8C illustrate screenshots of the web interfaces of a washing machine with gesture-based control system and method thereof, in accordance with an exemplary embodiment of the present disclosure.
[0044] Like reference numerals refer to like parts throughout the description of several views of the drawing.
[0045] The washing machine with gesture-based control system and method thereof is illustrated in the accompanying drawings, in which like reference numerals indicate corresponding parts in the various figures. It should be noted that the accompanying figures are intended to present illustrations of exemplary embodiments of the present disclosure. These figures are not intended to limit the scope of the present disclosure. It should also be noted that the accompanying figures are not necessarily drawn to scale.
DETAILED DESCRIPTION OF THE DISCLOSURE
[0046] The following is a detailed description of embodiments of the disclosure depicted in the accompanying drawings. The embodiments are in such detail as to communicate the disclosure. However, the amount of detail offered is not intended to limit the anticipated variations of embodiments; on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the present disclosure.
[0047] In the following description, numerous specific details are set forth in order to provide a thorough understanding of the embodiments of the present disclosure. It may be apparent to one skilled in the art that embodiments of the present disclosure may be practiced without some of these specific details.
[0048] Various terms as used herein are shown below. To the extent a term is used, it should be given the broadest definition persons in the pertinent art have given that term as reflected in printed publications and issued patents at the time of filing.
[0049] The terms “a” and “an” herein do not denote a limitation of quantity, but rather denote the presence of at least one of the referenced items.
[0050] The terms “having”, “comprising”, “including”, and variations thereof signify the presence of a component.
[0051] Reference is now made to FIG. 1 through FIG. 8 to describe various exemplary embodiments of the present disclosure. FIG. 1 illustrates a block diagram of a washing machine with gesture-based control system, in accordance with an exemplary embodiment of the present disclosure.
[0052] The system 100 may include a processing unit 102 configured to execute gesture recognition and control washing machine operations. The system 100 may also include a gesture recognition unit 104, housed within the processing unit 102, the gesture recognition unit including an artificial intelligence-based processor configured to analyse captured images and identify predefined user gestures. The system 100 may also include an image capturing unit 106, operably connected to the processing unit 102, the image capturing unit configured to capture real-time images or videos of user gestures. The system 100 may also include a communication network 108 configured to enable bidirectional data exchange between a plurality of system components. The system 100 may also include a control unit 110, operably connected to the processing unit 102 via the communication network 108, the control unit configured to map the identified gestures to corresponding washing machine functions. The system 100 may also include a washing module 112, operably connected to the control unit 110, the washing module utilizing a drum, a motor, a water inlet, a detergent dispenser, and actuators, each operably connected to the control unit 110, to perform washing operations. The system 100 may also include a display unit 114, operably connected to the control unit 110, the display unit being configured to provide real-time visual feedback regarding selected gestures, washing cycle status, and user instructions. The system 100 may also include an audio feedback unit 116, operably connected to the control unit 110, the audio feedback unit being configured to provide auditory confirmation of detected gestures and washing machine status updates.
[0053] The processing unit 102 comprises a microprocessor, a memory unit, and a neural processing core, the neural processing core dedicated to executing artificial intelligence-based gesture recognition algorithms.
[0054] The gesture recognition unit 104 comprises a feature extraction module, a gesture classification module, and a gesture mapping module, each operably connected to the neural processing core of the processing unit 102.
[0055] The image capturing unit 106 comprises an optical sensor array, an infrared illumination module, and a depth-sensing module, each configured to enhance the accuracy of gesture recognition in varying lighting conditions.
[0056] The control unit 110 comprises a command processing module, a washing cycle execution module, and a safety monitoring module, each configured to process user commands, execute washing operations, and ensure safe operation of the system.
[0057] The display unit 114 comprises a touch-enabled liquid-crystal display (LCD) panel and a light-emitting diode (LED) indicator array, each operably connected to the control unit (110) to provide visual feedback on system status and user inputs.
[0058] The audio feedback unit 116 comprises a speaker array and a voice synthesis module, each configured to generate sound alerts and spoken notifications in response to washing machine status updates.
[0059] The communication network 108 comprises a wireless transceiver, a signal processing module, and a data encryption module, each configured to enable secure and reliable data exchange between system components.
[0060] The washing module 112 utilizes a motor control circuit, a water flow regulation valve, a detergent dispensing actuator, and a drum rotation sensor, each operably connected to the control unit 110 to execute precise washing functions based on received control signals.
[0061] The method 100 may include capturing real-time images or videos of user gestures using an image capturing unit 106. The method 100 may also include transmitting the captured image data to a processing unit 102 via a communication network 108. The method 100 may also include analysing the captured image data within a gesture recognition unit 104, housed within the processing unit 102, the gesture recognition unit including an artificial intelligence-based processor configured to identify predefined user gestures based on feature extraction and pattern recognition. The method 100 may also include mapping the identified user gestures to corresponding washing machine functions using a control unit 110 operably connected to the processing unit 102. The method 100 may also include executing the mapped washing machine functions by a washing module 112 utilizing a motor, a drum, a water inlet, a detergent dispenser, and actuators, each operably connected to the control unit 110. The method 100 may also include providing real-time visual feedback through a display unit 114, operably connected to the control unit 110, the display unit configured to display the selected washing function, cycle status, and user prompts. The method 100 may also include delivering auditory confirmation through an audio feedback unit 116, the audio feedback unit generating voice prompts or sound notifications to indicate successful gesture recognition and washing machine operation status. The method 100 may also include monitoring user gestures throughout the washing cycle, allowing dynamic adjustments to washing parameters based on user input.
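The end-to-end flow of the method above — recognized gesture in, machine function plus visual and auditory feedback out — can be sketched as follows. This is an illustrative sketch only: the disclosure does not fix a specific gesture vocabulary, so the gesture names and function names below are invented for the example.

```python
from dataclasses import dataclass

# Hypothetical gesture vocabulary; the disclosure leaves the actual
# gesture-to-function mapping to the gesture mapping module.
GESTURE_TO_FUNCTION = {
    "open_palm": "start_cycle",
    "closed_fist": "stop_cycle",
    "swipe_left": "previous_mode",
    "swipe_right": "next_mode",
}

@dataclass
class Feedback:
    visual: str  # shown on the display unit 114
    audio: str   # spoken/beeped by the audio feedback unit 116

def handle_gesture(recognized_gesture: str) -> Feedback:
    """Map a recognized gesture to a washing-machine function and
    produce the visual and auditory feedback the method describes."""
    function = GESTURE_TO_FUNCTION.get(recognized_gesture)
    if function is None:
        return Feedback(visual="Gesture not recognized", audio="error beep")
    return Feedback(visual=f"Executing: {function}",
                    audio=f"Voice prompt: {function} selected")
```

For example, an open palm would start a cycle, while an unknown gesture would trigger the error feedback rather than any machine action.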
[0062] The processing unit 102 executes gesture recognition and manages washing machine operations with high efficiency. The processing unit 102 includes a microprocessor, a memory unit, and a neural processing core. The microprocessor handles command execution, data management, and communication between system components. The memory unit stores gesture recognition algorithms, user preferences, and system logs. The neural processing core processes complex gesture recognition tasks, leveraging artificial intelligence-based models to identify user gestures accurately. The processing unit 102 receives real-time image data from the image capturing unit 106, processes gesture inputs, and transmits control commands to the control unit 110. The processing unit 102 ensures smooth integration of gesture-based operations with traditional washing machine functionalities, providing a seamless user experience. The processing unit 102 continuously updates stored data to enhance gesture recognition accuracy over time. The processing unit 102 optimizes processing efficiency by dynamically allocating computational resources based on system requirements. The processing unit 102 actively monitors the status of connected components to ensure system stability and responsiveness. The processing unit 102 enables intelligent decision-making by analysing user interaction patterns and adapting to different gesture inputs.
[0063] The gesture recognition unit 104 is housed within the processing unit 102 and is responsible for analysing captured images to detect predefined user gestures. The gesture recognition unit 104 includes a feature extraction module, a gesture classification module, and a gesture mapping module. The feature extraction module processes image data received from the image capturing unit 106 to isolate relevant gesture features. The gesture classification module utilizes artificial intelligence-based algorithms to compare detected features with predefined gesture patterns. The gesture mapping module associates recognized gestures with specific washing machine functions, allowing users to control operations without physical contact. The gesture recognition unit 104 continuously refines recognition accuracy by learning from repeated user interactions. The gesture recognition unit 104 ensures real-time processing of gesture inputs, minimizing response delays. The gesture recognition unit 104 dynamically adjusts recognition sensitivity based on environmental factors such as lighting conditions and user positioning. The gesture recognition unit 104 enhances user convenience by eliminating the need for manual controls, providing an intuitive and accessible interface.
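The feature extraction and gesture classification steps above can be sketched with a minimal nearest-template classifier. The disclosure specifies an artificial intelligence-based model; the toy feature encoding (whether each finger is extended), the gesture templates, and the rejection threshold below are all invented for illustration.

```python
import numpy as np

# Illustrative templates: each predefined gesture is a 5-element vector of
# per-finger "extended" flags. A real system would learn richer features.
TEMPLATES = {
    "open_palm":   np.array([1.0, 1.0, 1.0, 1.0, 1.0]),
    "closed_fist": np.array([0.0, 0.0, 0.0, 0.0, 0.0]),
    "point":       np.array([0.0, 1.0, 0.0, 0.0, 0.0]),
}

def extract_features(finger_openness: np.ndarray) -> np.ndarray:
    """Toy feature extraction: threshold each finger's openness score."""
    return (finger_openness > 0.5).astype(float)

def classify(features: np.ndarray, threshold: float = 1.0):
    """Nearest-template classification; return None (reject) when no
    predefined gesture is close enough, mirroring the unit's filtering
    of unrecognized input."""
    best, best_dist = None, float("inf")
    for name, template in TEMPLATES.items():
        dist = float(np.linalg.norm(features - template))
        if dist < best_dist:
            best, best_dist = name, dist
    return best if best_dist <= threshold else None
```

The rejection threshold plays the role the paragraph assigns to sensitivity adjustment: raising it makes recognition more permissive, lowering it makes it stricter.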
[0064] The image capturing unit 106 captures real-time images or videos of user gestures and transmits the data to the processing unit 102 for analysis. The image capturing unit 106 comprises an optical sensor array, an infrared illumination module, and a depth-sensing module. The optical sensor array captures high-resolution images under normal lighting conditions, ensuring clear gesture detection. The infrared illumination module enhances visibility in low-light environments, enabling consistent gesture recognition. The depth-sensing module measures the spatial position of user gestures, improving detection accuracy. The image capturing unit 106 operates continuously to detect user input without requiring direct contact with physical controls. The image capturing unit 106 ensures minimal interference from background noise by filtering out irrelevant movement. The image capturing unit 106 supports adaptive calibration to align with different user preferences and operating conditions. The image capturing unit 106 optimizes image data transmission to the processing unit 102 by compressing and pre-processing captured images. The image capturing unit 106 enhances system responsiveness by minimizing latency in image analysis and transmission.
[0065] The communication network 108 enables bidirectional data exchange between all system components, ensuring seamless interaction between hardware and software elements. The communication network 108 comprises a wireless transceiver, a signal processing module, and a data encryption module. The wireless transceiver facilitates real-time communication between the processing unit 102, control unit 110, and other connected components. The signal processing module ensures stable data transmission by filtering out noise and interference. The data encryption module secures transmitted information, preventing unauthorized access. The communication network 108 maintains a consistent connection between system components to ensure uninterrupted operation. The communication network 108 dynamically adapts to network conditions, optimizing data transfer speeds and reliability. The communication network 108 enhances security by implementing encryption protocols to protect sensitive user data. The communication network 108 continuously monitors data flow to detect and resolve connectivity issues in real time.
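A simplified sketch of securing the component-to-component messages described above is given below using a keyed message authentication code from the Python standard library. Note this provides integrity and authenticity checking, not the confidentiality of a full encryption module as the disclosure describes; the shared key and message framing are placeholders for illustration.

```python
import hashlib
import hmac
import json

SHARED_KEY = b"demo-key"  # placeholder; real deployments provision keys securely

def pack(payload: dict) -> bytes:
    """Serialize a command and append an HMAC-SHA256 tag so the receiving
    component can verify the message was not altered in transit."""
    body = json.dumps(payload, sort_keys=True).encode()
    tag = hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest().encode()
    return body + b"|" + tag

def unpack(message: bytes):
    """Verify the tag and return the payload, or None for tampered data."""
    body, _, tag = message.rpartition(b"|")
    expected = hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest().encode()
    if not hmac.compare_digest(tag, expected):
        return None  # reject: fails the integrity check
    return json.loads(body)
```

A tampered message fails verification and is dropped, which is the behavior the paragraph attributes to the network's monitoring of data flow.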
[0066] The control unit 110 maps identified gestures to corresponding washing machine functions and executes the necessary commands. The control unit 110 includes a command processing module, a washing cycle execution module, and a safety monitoring module. The command processing module interprets gesture inputs and translates them into actionable machine operations. The washing cycle execution module controls washing parameters such as water level, spin speed, and detergent dispensing based on received gestures. The safety monitoring module ensures operational safety by detecting anomalies and preventing system malfunctions. The control unit 110 continuously synchronizes with the processing unit 102 to execute commands accurately. The control unit 110 optimizes washing efficiency by dynamically adjusting machine settings according to user inputs. The control unit 110 enhances user control by supporting multiple gesture-based commands for different washing functions. The control unit 110 integrates with external connectivity solutions to enable remote operation and monitoring.
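The interplay of the command processing and safety monitoring modules can be sketched as a small state machine. The states, the door-interlock rule, and the returned status strings are illustrative assumptions; the disclosure does not enumerate the safety checks the monitoring module performs.

```python
class ControlUnit:
    """Minimal sketch of command processing with a safety interlock."""

    def __init__(self):
        self.state = "idle"
        self.door_closed = True  # monitored by the safety module

    def execute(self, function: str) -> str:
        # Safety monitoring: refuse to start a cycle with the door open.
        if function == "start_cycle":
            if not self.door_closed:
                return "blocked: door open"
            self.state = "washing"
            return "washing started"
        if function == "stop_cycle":
            self.state = "idle"
            return "washing stopped"
        return f"unknown function: {function}"
```

A command that would violate a safety condition is rejected before it reaches the washing module, which is the anomaly-prevention behavior the paragraph describes.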
[0067] The washing module 112 performs washing operations by utilizing various mechanical and electronic components. The washing module 112 includes a drum, a motor, a water inlet, a detergent dispenser, and actuators, each operably connected to the control unit 110. The drum accommodates laundry and rotates based on control inputs. The motor drives the drum, ensuring effective washing and spinning. The water inlet regulates water flow into the drum according to predefined washing settings. The detergent dispenser releases detergent at appropriate stages of the washing cycle. The actuators control different mechanical movements, including drum rotation speed and water drainage. The washing module 112 operates in synchronization with gesture-based inputs, ensuring hands-free machine control. The washing module 112 optimizes detergent and water usage by adjusting settings based on user-defined washing modes. The washing module 112 ensures efficient stain removal and fabric care through precise control of washing parameters. The washing module 112 continuously monitors drum load and adjusts operational parameters accordingly.
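The load-dependent adjustment of water and detergent described above can be sketched as follows. All numeric values (per-cycle parameters, the 5 kg nominal load, the 0.5–1.5 scaling bounds) are hypothetical and chosen only to illustrate the idea; the cycle names come from the web interface of FIG. 8B:

```python
# Hypothetical per-cycle baselines (water in litres, spin in RPM, detergent in mL).
CYCLE_PARAMS = {
    "cotton":     {"water_l": 60, "spin_rpm": 1200, "detergent_ml": 80},
    "wool":       {"water_l": 45, "spin_rpm": 600,  "detergent_ml": 50},
    "delicate":   {"water_l": 40, "spin_rpm": 400,  "detergent_ml": 40},
    "quick wash": {"water_l": 35, "spin_rpm": 1000, "detergent_ml": 40},
}

def adjust_for_load(cycle: str, load_kg: float) -> dict:
    """Scale water and detergent with drum load (5 kg nominal is an assumption);
    spin speed stays fixed per cycle."""
    base = CYCLE_PARAMS[cycle]
    factor = max(0.5, min(load_kg / 5.0, 1.5))  # clamp to a sane range
    return {
        "water_l": round(base["water_l"] * factor),
        "spin_rpm": base["spin_rpm"],
        "detergent_ml": round(base["detergent_ml"] * factor),
    }
```

Clamping the scale factor keeps a sensor glitch in the load measurement from flooding the drum or starving a full load of detergent.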
[0068] The display unit 114 provides real-time visual feedback regarding gesture recognition, washing cycle status, and user instructions. The display unit 114 comprises a touch-enabled liquid-crystal display panel and a light-emitting diode indicator array. The liquid-crystal display panel presents detailed washing settings, cycle progress, and system alerts. The light-emitting diode indicator array visually confirms recognized gestures and machine responses. The display unit 114 enhances user interaction by offering an intuitive and informative interface. The display unit 114 dynamically updates displayed content based on user inputs and washing machine status. The display unit 114 supports multiple languages and customization options to improve user accessibility.
[0069] The audio feedback unit 116 provides auditory confirmation of detected gestures and washing machine status updates. The audio feedback unit 116 comprises a speaker array and a voice synthesis module. The speaker array generates sound alerts to indicate system responses and washing cycle progress. The voice synthesis module converts text-based notifications into spoken messages, enhancing user interaction. The audio feedback unit 116 ensures real-time feedback by synchronizing with the control unit 110. The audio feedback unit 116 improves accessibility for visually impaired users by providing audible instructions. The audio feedback unit 116 enhances user experience by delivering clear and concise auditory notifications. The audio feedback unit 116 dynamically adjusts volume levels based on ambient noise conditions.
[0070] FIG. 2 illustrates a flowchart of a washing machine with gesture-based control system in accordance with an exemplary embodiment of the present disclosure.
[0071] At 202, detect user gestures in real-time using the image capturing unit.
[0072] At 204, send the captured gesture data to the processing unit for processing.
[0073] At 206, process and interpret the gesture data using AI-based recognition in the gesture recognition unit.
[0074] At 208, associate the recognized gesture with a predefined washing machine function through the control unit.
[0075] At 210, initiate the washing module to perform the corresponding washing operation.
[0076] At 212, display and announce system responses via the display unit and audio feedback unit.
[0077] At 214, continuously track and adjust washing operations based on additional user inputs or environmental factors.
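Steps 202–212 above can be sketched as one control loop, with the capture, recognition, washing, and feedback units stubbed out as callables. The function is an illustrative skeleton, not the claimed implementation; the gesture labels and command names are assumptions:

```python
def control_loop(frames, recognize, execute, feedback):
    """One pass of steps 202-212 per captured frame. `recognize`, `execute`,
    and `feedback` stand in for the gesture recognition unit, washing
    module, and display/audio feedback units respectively."""
    mapping = {"thumb_up": "start_wash", "thumb_down": "stop_wash"}  # step 208
    for frame in frames:                        # step 202: capture gesture
        gesture = recognize(frame)              # steps 204-206: send and interpret
        if gesture is None:
            continue                            # no confident gesture in this frame
        command = mapping.get(gesture, "no_op")
        execute(command)                        # step 210: run washing operation
        feedback(f"{gesture} -> {command}")     # step 212: display and announce
```

Step 214's continuous tracking corresponds to running this loop for the duration of the wash cycle rather than once per command.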
[0078] FIG. 3 illustrates a flowchart of a washing machine with gesture-based control method in accordance with an exemplary embodiment of the present disclosure.
[0079] At 302, capturing real-time images or videos of user gestures using an image capturing unit.
[0080] At 304, transmitting the captured image data to a processing unit via a communication network.
[0081] At 306, analysing the captured image data within a gesture recognition unit, housed within the processing unit, the gesture recognition unit including an artificial intelligence-based processor configured to identify predefined user gestures based on feature extraction and pattern recognition.
[0082] At 308, mapping the identified user gestures to corresponding washing machine functions using a control unit operably connected to the processing unit.
[0083] At 310, executing the mapped washing machine functions by a washing module utilizing a motor, a drum, a water inlet, a detergent dispenser, and actuators, each operably connected to the control unit.
[0084] At 312, providing real-time visual feedback through a display unit operably connected to the control unit, the display unit configured to display the selected washing function, cycle status, and user prompts.
[0085] At 314, delivering auditory confirmation through an audio feedback unit, the audio feedback unit generating voice prompts or sound notifications to indicate successful gesture recognition and washing machine operation status.
[0086] At 316, monitoring user gestures throughout the washing cycle, allowing dynamic adjustments to washing parameters based on user input.
[0087] FIGS. 4A-4D illustrate screenshots of the different gestures of a washing machine with gesture-based control system and method thereof, in accordance with an exemplary embodiment of the present disclosure.
[0088] FIG. 4A illustrates a screenshot of the gestures to swipe the wash cycles of a washing machine with gesture-based control system and method thereof, in accordance with an exemplary embodiment of the present disclosure.
[0089] A fully extended open hand with fingers straight and close together is positioned against a neutral background, representing a predefined input for selecting or switching between washing cycles through artificial intelligence-based gesture recognition processing.
[0090] FIG. 4B illustrates a screenshot of the gestures to start machine of a washing machine with gesture-based control system and method thereof, in accordance with an exemplary embodiment of the present disclosure.
[0091] A closed fist with an extended thumb pointing upward is positioned against a neutral background, representing a predefined input for initiating washing operations through artificial intelligence-based gesture recognition processing within the control unit.
[0092] FIG. 4C illustrates a screenshot of the gestures to stop the washer of a washing machine with gesture-based control system and method thereof, in accordance with an exemplary embodiment of the present disclosure.
[0093] A closed fist with an extended thumb pointing downward is positioned against a neutral background, representing a predefined input for terminating washing operations through artificial intelligence-based gesture recognition processing within the control unit.
[0094] FIG. 4D illustrates a screenshot of the gestures to activate spin dry mode of a washing machine with gesture-based control system and method thereof, in accordance with an exemplary embodiment of the present disclosure.
[0095] A hand is forming a circular motion with fingers extended, representing a predefined input for initiating high-speed spinning. Artificial intelligence-based gesture recognition processing within the control unit is detecting the motion to execute the drying operation.
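The specification leaves the recognition algorithm to the artificial intelligence-based processor. As a deliberately simplified illustration of how the thumb-up/thumb-down poses of FIGS. 4B and 4C could be separated, the toy heuristic below compares landmark heights in normalized image coordinates; real systems use learned hand-landmark models, and the margin value is an assumption:

```python
def classify_thumb(wrist_y: float, thumb_tip_y: float, margin: float = 0.1) -> str:
    """Toy heuristic in normalized image coordinates, where y grows downward:
    a thumb tip well above the wrist reads as thumb-up, well below as thumb-down."""
    if thumb_tip_y < wrist_y - margin:
        return "thumb_up"    # FIG. 4B: start the machine
    if thumb_tip_y > wrist_y + margin:
        return "thumb_down"  # FIG. 4C: stop the washer
    return "unknown"         # ambiguous pose; ignore rather than guess
```

Returning "unknown" in the ambiguous band is the same conservative choice the gesture recognition camera module makes when distinguishing intentional commands from background motion.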
[0096] FIG. 5 illustrates a data flow diagram of a washing machine with gesture-based control system and method thereof, in accordance with an exemplary embodiment of the present disclosure.
[0097] At 502, input phase.
[0098] At 504, user gesture input: hand movements in front of the camera.
[0099] At 506, camera capture: real-time video processing.
[0100] At 508, processing phase.
[0101] At 510, gesture recognition: analysing frames with AI processor.
[0102] At 512, command mapping: gestures mapped to commands (e.g., wave to start).
[0103] At 514, implementation phase.
[0104] At 516, commands execution: activating actuators.
[0105] At 518, drum rotation: for selected wash cycle.
[0106] At 520, detergent dispenser activation.
[0107] At 522, water inlet & temperature regulation.
[0108] At 524, evaluation phase.
[0109] At 526, user reviews: feedback via LED or voice (e.g., "wash cycle selected: cotton").
[0110] FIG. 6 illustrates a workflow of a washing machine with gesture-based control system and method thereof, in accordance with an exemplary embodiment of the present disclosure.
[0111] At 602, user performs gestures.
[0112] At 604, camera captures the gesture.
[0113] At 606, command recognized.
[0114] At 608, provide feedback.
[0115] At 610, command processing.
[0116] At 612, control unit interprets command.
[0117] At 614, actuators execute washing.
[0118] At 616, provide feedback to user.
[0119] FIG. 7 illustrates a screenshot of the architecture of a washing machine with gesture-based control system and method thereof, in accordance with an exemplary embodiment of the present disclosure.
[0120] User 702 is interacting with the gesture-based control system by performing predefined hand movements. User 702 is positioned within the field of view of the gesture recognition camera module 708, which is capturing real-time movements. User 702 is relying on artificial intelligence-based recognition processing to ensure accurate interpretation of commands for seamless operation of the washing machine.
[0121] Wi-Fi module 704 is establishing wireless communication between the gesture-based control system and the washing machine. Wi-Fi module 704 is transmitting command signals generated from the interpreted gestures to the washing machine control unit. Wi-Fi module 704 is ensuring a stable connection for real-time execution of user 702 inputs, facilitating a touchless and efficient washing process.
[0122] Camera 706 is capturing high-resolution images of user 702 hand movements within the defined operational range. Camera 706 is continuously streaming visual data to the gesture recognition module 710 for processing. Camera 706 is playing a crucial role in detecting precise gestures by maintaining an optimal frame rate and exposure settings under different lighting conditions.
[0123] Gesture recognition camera module 708 is integrating camera 706 with processing capabilities for analysing user 702 hand movements. Gesture recognition camera module 708 is ensuring that the captured images are pre-processed for noise reduction before forwarding them to gesture recognition module 710. Gesture recognition camera module 708 is maintaining accuracy by distinguishing between intentional commands and unintended background motion.
[0124] Gesture recognition module 710 is processing visual data received from gesture recognition camera module 708 to interpret user 702 gestures. Gesture recognition module 710 is employing artificial intelligence algorithms to classify detected movements into predefined command categories. Gesture recognition module 710 is generating corresponding control signals that are transmitted through Wi-Fi module 704 to execute washing machine functions based on user 702 input.
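One way gesture recognition module 710 could distinguish intentional commands from unintended background motion, as paragraph [0123] requires, is temporal smoothing: a gesture is only reported once it dominates a short window of recent frames. The window size and threshold below are assumptions for illustration:

```python
from collections import Counter, deque

class GestureSmoother:
    """Reject transient misdetections by requiring a gesture label to
    dominate a sliding window of recent frames before it is reported."""

    def __init__(self, window: int = 5, threshold: int = 4):
        self.frames = deque(maxlen=window)  # most recent per-frame labels
        self.threshold = threshold

    def update(self, label: str):
        """Record one frame's label; return a confirmed gesture or None."""
        self.frames.append(label)
        gesture, count = Counter(self.frames).most_common(1)[0]
        if count >= self.threshold and gesture != "none":
            return gesture
        return None
```

Only after a confirmed gesture emerges would module 710 generate the control signal transmitted through Wi-Fi module 704.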
[0125] FIGS. 8A-8C illustrate screenshots of the web interfaces of a washing machine with gesture-based control system and method thereof, in accordance with an exemplary embodiment of the present disclosure.
[0126] FIG. 8A illustrates a screenshot of the idle-state web interface of a washing machine with gesture-based control system and method thereof, in accordance with an exemplary embodiment of the present disclosure.
[0127] The web interface is displaying the washing machine in an idle state while awaiting user interaction. The selectable wash cycle is set to cotton, with gesture instructions indicating swipe left and swipe right for cycle selection and a wave gesture for starting or stopping the wash cycle. The system is enabling gesture-based control without physical contact.
[0128] FIG. 8B illustrates a screenshot of the wash cycle selection state web interface of a washing machine with gesture-based control system and method thereof, in accordance with an exemplary embodiment of the present disclosure.
[0129] The web interface is displaying the washing machine in the wash cycle selection state, allowing the user to choose from cotton, wool, delicate, and quick wash options. The system is recognizing swipe gestures for changing the cycle and a wave gesture for starting or stopping the washing process, ensuring gesture-based control without direct physical interaction.
[0130] FIG. 8C illustrates a screenshot of the washing-in-progress state web interface of a washing machine with gesture-based control system and method thereof, in accordance with an exemplary embodiment of the present disclosure.
[0131] The web interface is displaying the washing in progress state, showing the selected wash cycle as delicate. The system is allowing the user to stop the wash cycle using a button or a wave gesture. The instructions are indicating swipe gestures for changing the wash cycle and a wave gesture for starting or stopping the washing process.
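The three interface states of FIGS. 8A-8C can be modeled as a small state machine driven by the swipe and wave gestures the screens describe. The event and state names below are assumptions; the cycle list is the one shown in FIG. 8B:

```python
CYCLES = ["cotton", "wool", "delicate", "quick wash"]  # from FIG. 8B

class WasherUI:
    """Minimal state machine for the FIG. 8A-8C web interface:
    swipes change the cycle while idle/selecting; a wave toggles washing."""

    def __init__(self):
        self.state = "idle"       # FIG. 8A
        self.cycle_index = 0      # default cycle: cotton

    @property
    def cycle(self) -> str:
        return CYCLES[self.cycle_index]

    def on_event(self, event: str) -> str:
        if event == "swipe_right" and self.state != "washing":
            self.cycle_index = (self.cycle_index + 1) % len(CYCLES)
            self.state = "cycle_selection"        # FIG. 8B
        elif event == "swipe_left" and self.state != "washing":
            self.cycle_index = (self.cycle_index - 1) % len(CYCLES)
            self.state = "cycle_selection"        # FIG. 8B
        elif event == "wave":
            # Wave starts or stops the wash (FIGS. 8A and 8C).
            self.state = "idle" if self.state == "washing" else "washing"
        return self.state
```

Ignoring swipes while washing mirrors the FIG. 8C screen, where only the stop control remains active during a cycle.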
[0132] While the invention has been described in connection with what is presently considered to be the most practical and various embodiments, it will be understood that the invention is not to be limited to the disclosed embodiments, but on the contrary, is intended to cover various modifications and equivalent arrangements included within the scope of the appended claims.
[0133] A person of ordinary skill in the art may be aware that, in combination with the examples described in the embodiments disclosed in this specification, units and algorithm steps may be implemented by electronic hardware, computer software, or a combination thereof.
[0134] The foregoing descriptions of specific embodiments of the present disclosure have been presented for purposes of illustration and description. They are not intended to be exhaustive or to limit the present disclosure to the precise forms disclosed, and many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described to best explain the principles of the present disclosure and its practical application, and to thereby enable others skilled in the art to best utilize the present disclosure and various embodiments with various modifications as are suited to the particular use contemplated. It is understood that various omissions and substitutions of equivalents are contemplated as circumstances may suggest or render expedient, but such omissions and substitutions are intended to cover the application or implementation without departing from the scope of the present disclosure.
[0135] Disjunctive language such as the phrase “at least one of X, Y, Z,” unless specifically stated otherwise, is otherwise understood with the context as used in general to present that an item, term, etc., may be either X, Y, or Z, or any combination thereof (e.g., X, Y, and/or Z). Thus, such disjunctive language is not generally intended to, and should not, imply that certain embodiments require at least one of X, at least one of Y, or at least one of Z to each be present.
[0136] In a case that no conflict occurs, the embodiments in the present disclosure and the features in the embodiments may be mutually combined. The foregoing descriptions are merely specific implementations of the present disclosure, but are not intended to limit the protection scope of the present disclosure. Any variation or replacement readily figured out by a person skilled in the art within the technical scope disclosed in the present disclosure shall fall within the protection scope of the present disclosure. Therefore, the protection scope of the present disclosure shall be subject to the protection scope of the claims.
Claims:
I/We Claim:
1. A washing machine with gesture-based control system (100) comprising:
a processing unit (102), configured to execute gesture recognition and control washing machine operations;
a gesture recognition unit (104), housed within the processing unit (102), the gesture recognition unit including an artificial intelligence-based processor configured to analyse captured images and identify predefined user gestures;
an image capturing unit (106), operably connected to the processing unit (102), the image capturing unit configured to capture real-time images or videos of user gestures;
a communication network (108), configured to enable bidirectional data exchange between a plurality of system components;
a control unit (110), operably connected to the processing unit (102), via the communication network (108), the control unit configured to map the identified gestures to corresponding washing machine functions;
a washing module (112), operably connected to the control unit (110), the washing module utilizing a drum, a motor, a water inlet, a detergent dispenser, and actuators, each operably connected to the control unit (110), to perform washing operations;
a display unit (114), operably connected to the control unit (110), the display unit being configured to provide real-time visual feedback regarding selected gestures, washing cycle status, and user instructions;
an audio feedback unit (116), operably connected to the control unit (110), the audio feedback unit being configured to provide auditory confirmation of detected gestures and washing machine status updates.
2. The system (100) as claimed in claim 1, wherein the processing unit (102) comprises a microprocessor, a memory unit, and a neural processing core, the neural processing core dedicated to executing artificial intelligence-based gesture recognition algorithms.
3. The system (100) as claimed in claim 1, wherein the gesture recognition unit (104) comprises a feature extraction module, a gesture classification module, and a gesture mapping module, each operably connected to the neural processing core of the processing unit (102).
4. The system (100) as claimed in claim 1, wherein the image capturing unit (106) comprises an optical sensor array, an infrared illumination module, and a depth-sensing module, each configured to enhance the accuracy of gesture recognition in varying lighting conditions.
5. The system (100) as claimed in claim 1, wherein the control unit (110) comprises a command processing module, a washing cycle execution module, and a safety monitoring module, each configured to process user commands, execute washing operations, and ensure safe operation of the system.
6. The system (100) as claimed in claim 1, wherein the display unit (114) comprises a touch-enabled liquid-crystal display (LCD) panel and a light-emitting diode (LED) indicator array, each operably connected to the control unit (110) to provide visual feedback on system status and user inputs.
7. The system (100) as claimed in claim 1, wherein the audio feedback unit (116) comprises a speaker array and a voice synthesis module, each configured to generate sound alerts and spoken notifications in response to washing machine status updates.
8. The system (100) as claimed in claim 1, wherein the communication network (108) comprises a wireless transceiver, a signal processing module, and a data encryption module, each configured to enable secure and reliable data exchange between system components.
9. The system (100) as claimed in claim 1, wherein the washing module (112) utilizes a motor control circuit, a water flow regulation valve, a detergent dispensing actuator, and a drum rotation sensor, each operably connected to the control unit (110) to execute precise washing functions based on received control signals.
10. A gesture-based control method (100) for a washing machine, comprising:
capturing real-time images or videos of user gestures using an image capturing unit (106);
transmitting the captured image data to a processing unit (102) via a communication network (108);
analysing the captured image data within a gesture recognition unit (104), housed within the processing unit (102), the gesture recognition unit including an artificial intelligence-based processor configured to identify predefined user gestures based on feature extraction and pattern recognition;
mapping the identified user gestures to corresponding washing machine functions using a control unit (110) operably connected to the processing unit (102);
executing the mapped washing machine functions by a washing module (112) utilizing a motor, a drum, a water inlet, a detergent dispenser, and actuators, each operably connected to the control unit (110);
providing real-time visual feedback through a display unit (114) operably connected to the control unit (110), the display unit configured to display the selected washing function, cycle status, and user prompts;
delivering auditory confirmation through an audio feedback unit (116), the audio feedback unit generating voice prompts or sound notifications to indicate successful gesture recognition and washing machine operation status;
monitoring user gestures throughout the washing cycle, allowing dynamic adjustments to washing parameters based on user input.

Documents

Application Documents

# Name Date
1 202541033775-STATEMENT OF UNDERTAKING (FORM 3) [07-04-2025(online)].pdf 2025-04-07
2 202541033775-REQUEST FOR EARLY PUBLICATION(FORM-9) [07-04-2025(online)].pdf 2025-04-07
3 202541033775-POWER OF AUTHORITY [07-04-2025(online)].pdf 2025-04-07
4 202541033775-FORM-9 [07-04-2025(online)].pdf 2025-04-07
5 202541033775-FORM FOR SMALL ENTITY(FORM-28) [07-04-2025(online)].pdf 2025-04-07
6 202541033775-FORM 1 [07-04-2025(online)].pdf 2025-04-07
7 202541033775-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [07-04-2025(online)].pdf 2025-04-07
8 202541033775-DRAWINGS [07-04-2025(online)].pdf 2025-04-07
9 202541033775-DECLARATION OF INVENTORSHIP (FORM 5) [07-04-2025(online)].pdf 2025-04-07
10 202541033775-COMPLETE SPECIFICATION [07-04-2025(online)].pdf 2025-04-07
11 202541033775-Proof of Right [16-04-2025(online)].pdf 2025-04-16