
System And Method For Generating Dynamic User Interfaces For Household Appliances

Abstract: The present disclosure relates to a system (102) for generating a scene based on the working of an appliance (104). The system (102) includes at least one sensing unit (202) configured to sense data of a plurality of operational parameters associated with the appliance. In addition, the system includes a control unit (204) configured to receive the data on the operational parameters of the appliance. In addition, the control unit (204) is configured to analyze the data of the operational parameters with a predefined marker value and generate the scene based on real-time equating of the data with the marker value.


Patent Information

Application #
202321089515
Filing Date
28 December 2023
Publication Number
27/2025
Publication Type
INA
Invention Field
COMPUTER SCIENCE

Applicants

Atomberg Technologies Private Limited
Office No. 1205, 12th Floor, Rupa Solitaire, Millennium Business Park, Thane-Belapur Road, Mahape, Navi Mumbai 400710, Maharashtra, India

Inventors

1. SANGHVI, Yash Chetan
ATOMBERG TECHNOLOGIES PRIVATE LIMITED, Office No. 1205, 12th Floor, Rupa Solitaire, Millennium Business Park, Thane-Belapur Road, Mahape, Navi Mumbai - 400710, India
2. GAUTAM, Utkarsh
ATOMBERG TECHNOLOGIES PRIVATE LIMITED, Office No. 1205, 12th Floor, Rupa Solitaire, Millennium Business Park, Thane-Belapur Road, Mahape, Navi Mumbai - 400710, India
3. VIJAYDEEP, Sachie
ATOMBERG TECHNOLOGIES PRIVATE LIMITED, Office No. 1205, 12th Floor, Rupa Solitaire, Millennium Business Park, Thane-Belapur Road, Mahape, Navi Mumbai - 400710, India

Specification

FIELD OF THE INVENTION

The present disclosure relates to household appliances and more particularly, to a system and a method for generating dynamic user interfaces for household appliances based on various operational parameters.

BACKGROUND

With the advancement of technology, various household appliances such as ceiling fans, refrigerators, air-conditioners, cooktops, and mixer grinders are connected to Internet of Things (IoT) devices to enable smart applications and services of such household appliances. Further, the IoT devices are interconnected objects embedded with sensors and software to collect and exchange data. With the implementation of the IoT devices, the household appliances can be controlled remotely via a smartphone by a user. For instance, operations of an IoT-based ceiling fan, such as varying the speed and switch-ON/OFF, can be performed remotely by the user via the smartphone.

Currently, there is no provision that gives the user a better understanding of parameters such as the electricity consumption of a household appliance. Further, no visual or graphical representation of such parameters is provided to the user, so the user is unable to learn how much electricity is consumed by using the household appliance. In such scenarios, the user may use the household appliance extensively without any information about an increase or a decrease in electricity consumption. This hampers the user experience and sometimes increases the overall expenses associated with the usage of the household appliance. Furthermore, the user is unable to see the current electricity consumption while using an energy-efficient household appliance. Thus, the user cannot learn the savings from using the energy-efficient household appliance. Moreover, there is no provision that enables the user to access information about the carbon impact generated by the usage of the household appliance.

Therefore, in view of the above-mentioned problems, it is desirable to provide a system or a method that can eliminate one or more of the above-mentioned problems associated with the existing systems.

SUMMARY

This summary is provided to introduce a selection of concepts, in a simplified format, that are further described in the detailed description of the invention. This summary is neither intended to identify key or essential inventive concepts of the invention, nor is it intended to determine the scope of the invention.

The present disclosure relates to a system for generating a scene based on the working of an appliance. The system includes at least one sensing unit configured to sense data of a plurality of operational parameters associated with the appliance. In addition, the system includes a control unit configured to receive the data on the operational parameters of the appliance. In addition, the control unit is configured to analyze the data of the operational parameters with a predefined marker value, and generate the scene based on real-time equating of the data with the marker value.

In an embodiment, the present disclosure relates to a method for generating a scene based on the working of an appliance. The method includes sensing data of a plurality of operational parameters associated with the appliance, through at least one sensing unit. In addition, the method includes receiving the data on the operational parameters of the appliance by a control unit in communication with the one or more sensors. Further, the method includes equating the data of the operational parameters with a predefined marker value. Furthermore, the method includes generating the scene based on real-time equating of the data with the marker value through an artificial intelligence-based technique.

Furthermore, the control unit may be configured to generate the scene having the dynamic user interface based on the measured operational parameters and the determined computing parameters of the household appliance. The scene may include the dynamic user interface to represent one or more images of a plurality of trees planted, a plurality of coins gained while saving electricity, an equated amount of carbon emissions reduced, and an equated amount of money saved on a real-time basis. In an embodiment, the scene may include the dynamic user interface to visually or graphically represent the reduction in electricity bills and carbon emissions in terms of the money saved and trees planted, respectively.

Herein, the information related to the benefits of using the household appliance may be effectively conveyed to the users via the user interface, such that the users may prefer to adopt such household appliances having cleaner and energy-efficient technologies.

To further clarify the advantages and features of the present invention, a more particular description of the invention will be rendered by reference to specific embodiments thereof, which are illustrated in the appended drawings. It is appreciated that these drawings depict only typical embodiments of the invention and are therefore not to be considered limiting of its scope. The invention will be described and explained with additional specificity and detail with the accompanying drawings.

BRIEF DESCRIPTION OF THE DRAWINGS

These and other features, aspects, and advantages of the present invention will become better understood when the following detailed description is read with reference to the accompanying drawings in which like characters represent like parts throughout the drawings, wherein:

Figure 1(a) illustrates an exemplary environment of a system for generating a scene based on the working of an appliance based on various operational parameters, according to an embodiment of the present disclosure;

Figure 1(b) illustrates an exemplary environment of the system for generating the dynamic user interfaces for household appliances based on various operational parameters, according to another embodiment of the present disclosure;

Figure 2 illustrates a block diagram of the system, the household appliance and a user device connected to the system, according to an embodiment of the present disclosure;

Figures 3(a) and 3(b) illustrate various exemplary usage scenarios of the dynamic user interfaces of the user device, according to an embodiment of the present disclosure;

Figure 4 illustrates an exemplary environment of the system for generating the dynamic user interfaces for a ceiling fan based on various operational parameters, according to an embodiment of the present disclosure; and

Figure 5 illustrates a flow chart depicting a method for generating a scene based on the working of an appliance, according to an embodiment of the present disclosure.

Further, skilled artisans will appreciate that elements in the drawings are illustrated for simplicity and may not have necessarily been drawn to scale. Furthermore, in terms of the construction of the device, a plurality of components of the device may have been represented in the drawings by conventional symbols, and the drawings may show only those specific details that are pertinent to understanding the embodiments of the present invention so as not to obscure the drawings with details that will be readily apparent to those of ordinary skill in the art having the benefit of the description herein.

DETAILED DESCRIPTION OF FIGURES

For the purpose of promoting an understanding of the principles of the invention, reference will now be made to the embodiment illustrated in the drawings, and specific language will be used to describe the same. It will nevertheless be understood that no limitation of the scope of the invention is thereby intended, such alterations and further modifications in the illustrated system, and such further applications of the principles of the invention as illustrated therein, being contemplated as would normally occur to one skilled in the art to which the invention relates. Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which the invention belongs. The system and examples provided herein are illustrative only and not intended to be limiting.

For example, the term “some” as used herein may be understood as “none” or “one” or “more than one” or “all.” Therefore, the terms “none,” “one,” “more than one,” “more than one, but not all” or “all” would fall under the definition of “some.” It should be appreciated by a person skilled in the art that the terminology and structure employed herein are for describing, teaching, and illuminating some embodiments and their specific features and elements and therefore, should not be construed to limit, restrict, or reduce the spirit and scope of the present disclosure in any way.

For example, any terms used herein such as, “includes,” “comprises,” “has,” “consists,” and similar grammatical variants do not specify an exact limitation or restriction, and certainly do not exclude the possible addition of a plurality of features or elements, unless otherwise stated. Further, such terms must not be taken to exclude the possible removal of the plurality of the listed features and elements, unless otherwise stated, for example, by using the limiting language including, but not limited to, “must comprise” or “needs to include.”

Whether or not a certain feature or element was limited to being used only once, it may still be referred to as “plurality of features” or “plurality of elements” or “at least one feature” or “at least one element.” Furthermore, the use of the terms “plurality of” or “at least one” feature or element does not preclude there being none of that feature or element, unless otherwise specified by limiting language including, but not limited to, “there needs to be a plurality of…” or “plurality of elements is required.”

Unless otherwise defined, all terms and especially any technical and/or scientific terms, used herein may be taken to have the same meaning as commonly understood by a person ordinarily skilled in the art.

Reference is made herein to some “embodiments.” It should be understood that an embodiment is an example of a possible implementation of any features and/or elements of the present disclosure. Some embodiments have been described for the purpose of explaining the plurality of the potential ways in which the specific features and/or elements of the proposed disclosure fulfil the requirements of uniqueness, utility, and non-obviousness.

Use of the phrases and/or terms including, but not limited to, “a first embodiment,” “a further embodiment,” “an alternate embodiment,” “one embodiment,” “an embodiment,” “multiple embodiments,” “some embodiments,” “other embodiments,” “further embodiment”, “furthermore embodiment”, “additional embodiment” or other variants thereof do not necessarily refer to the same embodiments. Unless otherwise specified, plurality of particular features and/or elements described in connection with plurality of embodiments may be found in one embodiment, or may be found in more than one embodiment, or may be found in all embodiments, or may be found in no embodiments. Although plurality of features and/or elements may be described herein in the context of only a single embodiment, or in the context of more than one embodiment, or in the context of all embodiments, the features and/or elements may instead be provided separately or in any appropriate combination or not at all. Conversely, any features and/or elements described in the context of separate embodiments may alternatively be realized as existing together in the context of a single embodiment.

Any particular and all details set forth herein are used in the context of some embodiments and therefore should not necessarily be taken as limiting factors to the proposed disclosure.

Embodiments of the present invention will be described below in detail with reference to the accompanying drawings.

Figure 1(a) illustrates an exemplary environment 100-1 of a system 102 (shown in Figure 2) for generating a scene based on the working of an appliance 104, in accordance with an embodiment of the present disclosure. In an embodiment, the appliance 104 may be an electrical appliance or a household appliance, without departing from the scope of the present disclosure. In addition, the terms appliance 104 and household appliance 104 are used interchangeably throughout the disclosure, without departing from the scope of the present disclosure.

In an embodiment, the scene may include a dynamic user interface 302 (shown in Figures 3(a) and 3(b)) for household appliances 104-1, 104-n based on various operational parameters of the appliance. In an embodiment, the scene is at least one of a multimedia, an image, a Graphics Interchange Format (GIF), a sound, an animation, visual information, a haptic-based scene, and a multimodal scene, without departing from the scope of the present disclosure.

Referring to Figure 1(b) an exemplary environment 100-2 of the system 102 for generating the dynamic user interfaces 302 for household appliances 104-1, 104-n based on various operational parameters is shown, according to another embodiment of the present disclosure.

Referring to Figure 2, a block diagram of the system 102, the household appliance 104, and a user device 106 connected to the system 102 is shown, according to an embodiment of the present disclosure. Figures 3(a) and 3(b) illustrate various exemplary usage scenarios of the dynamic user interface 302 of the user device 106, according to an embodiment of the present disclosure.

Referring to Figures 1(a), 1(b), and 2, the environment 100-1 may include the system 102, multiple household appliances 104-1, 104-n, and the user device 106. In an embodiment, the environment 100-1 may include multiple user devices 106 in communication with the system 102 and multiple household appliances 104-1, 104-n.

In an embodiment, as illustrated in Figure 1(a), the user device 106 may be in direct communication with one or more household appliances 104-1, 104-n via a first network 108-1. Herein, data related to the operational parameters of the household appliances 104-1, 104-n may be directly transmitted to the user device 106. In an embodiment, the first network 108-1 may be a wired network or a wireless network, such as Bluetooth, a Wi-Fi network, or the like.

In addition, the network may include, but is not limited to, a mobile network, a broadband network, a Wide Area Network (WAN), a Local Area Network (LAN), and a Personal Area Network. Herein, even if the household appliances 104-1, 104-n are not connected to the internet but only to the user device 106 via Bluetooth, the information may still be displayed on the user interface of the user device 106. Thus, the user may still get the latest data related to the savings due to the usage of the appliance 104.

In an embodiment, as illustrated in Figure 1(b), the user device 106 may be in communication with the household appliance 104 via a second network 108-2 and a cloud-based server 110. The cloud-based server 110 may include a cloud-based storage unit. Herein, the data related to the operational parameters of the household appliances 104-1, 104-n may be transmitted to the user device 106 or stored in the cloud-based storage unit of the cloud-based server 110.

In an example, the second network 108-2 may be a wireless network, a wired network, or a combination thereof. In an embodiment, the second network 108-2 is in communication with the control unit 204, the one or more user devices 106, and the cloud-based server 110. In an embodiment, the second network 108-2 is configured to facilitate the receiving of the data by the control unit 204.

The network can also be an individual network or a collection of many such individual networks, interconnected with each other and functioning as a single large network, e.g., the Internet or an intranet. The network can be one of the different types of networks, such as intranet, local area network (LAN), wide area network (WAN), and the internet.

In an embodiment, the network may either be a dedicated network, a virtual network, or a shared network, which represents an association of the different types of networks that use a variety of protocols, for example, Hypertext Transfer Protocol (HTTP), and Transmission Control Protocol/Internet Protocol (TCP/IP), to communicate with each other. An example of a network may include Fiber Channel Protocol (FCP) on Fiber Channel media. In an example, the second network 108-2 may include a Global System for Mobile Communication (GSM) network, a Universal Mobile Telecommunications System (UMTS) network, or any other communication network that uses any of the commonly used protocols, for example, Hypertext Transfer Protocol (HTTP) and Transmission Control Protocol/Internet Protocol (TCP/IP).

The system 102 may be adapted to collect the data related to the operational parameters of the household appliances 104-1, 104-n. Further, the system 102 may generate the dynamic user interface 302 on the user device 106, based on collected data related to the operational parameters of the household appliances 104-1, 104-n. The generated dynamic user interface 302 conveys information to a user, such that the user may be aware of the operational parameters of the household appliances 104-1, 104-n and their impacts. Constructional and operational details of the system 102 are explained in the subsequent paragraphs with reference to Figure 2.

For the sake of readability, multiple household appliances 104-1, 104-n may be interchangeably referred to as household appliances 104 or appliances 104, without departing from the scope of the present disclosure.

Referring to Figures 1(a), 1(b), and 2, the household appliance 104 may be embodied as an appliance 104 including, but not limited to, a ceiling fan, an induction stove, a refrigerator, a cooktop, and a mixer grinder. In another embodiment, the household appliance 104 may be embodied as a non-electrical appliance 104 or an electrical appliance 104 including, but not limited to, a gas stove, without departing from the scope of the present disclosure. Herein, the operational parameter of the household appliance 104 may be, but is not limited to, revolutions per minute (RPM), current, voltage, speed, or fuel supplied. In an embodiment, the household appliance 104 consumes less electricity and may be referred to as a 5-star appliance 104.

The household appliance 104 may include at least one sensing unit 202 and a controller 203 in communication with the sensing unit 202. The sensing unit 202 may include a plurality of sensors positioned on the household appliance 104 or in proximity to the household appliance 104.

In an embodiment, the plurality of sensors may be adapted to measure the operational parameters. In an embodiment, the sensing unit 202 may be adapted to transmit the measured operational parameters to the controller 203. The sensing unit 202 is configured to sense data of a plurality of operational parameters associated with the appliance. In an embodiment, the sensing unit 202 is at least one of a standalone sensing unit and an inbuilt sensing unit with a control unit 204.

The user device 106 may be in communication with the household appliance 104 and the system 102. In an embodiment, the user device 106 may be in communication with the controller 203 of the household appliance 104. The user device 106 associated with the user may include, but is not limited to, a smartphone, a tablet, a laptop, a personal computer, a smartwatch, a smart television, and an IoT device in communication with the system 102, directly or indirectly. The user device 106 may include a display unit 214 and the control unit 204 in communication with the display unit 214.

In an embodiment, the control unit 204 may be in communication with the controller 203 and adapted to generate the scene based on the working of an appliance 104. In an embodiment, the scene may include a dynamic user interface 302 on the display unit 214, based on the data received from the controller 203.

The control unit 204 is configured to receive the data on the operational parameters of the appliance. In addition, the control unit 204 is configured to analyze the data of the operational parameters with a predefined marker value. Further, the control unit 204 is configured to generate the scene based on real-time equating of the data with the marker value. In an embodiment, the real-time equating of the data with the marker value is performed through an artificial-intelligence (AI) based technique.

In an embodiment, the marker value is at least one of revolutions per minute (RPM) of the appliance, a current drawn by the appliance, an operating time of the appliance, ambient conditions of the appliance, operating parameters of the appliance, temperature data related to the appliance, an operating voltage of the appliance, a speed of the appliance, a fuel consumed by the appliance, an idle time of the appliance, a run time of the appliance, and an amount of carbon emissions saved by the appliance.
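
The specification gives no implementation, but the receive-analyze-generate flow described above can be illustrated with a minimal Python sketch. Everything below is an assumption for illustration only: the marker values and units, the names (MARKER_VALUES, generate_scene), and the simple below-marker ratio rule standing in for the AI-based technique.

    # Illustrative sketch only: equate sensed operational data against
    # predefined marker values and derive a scene. All names, units,
    # and thresholds are assumed; the AI-based technique is stubbed
    # with a simple rule.
    from dataclasses import dataclass

    MARKER_VALUES = {"rpm": 300.0, "current_a": 0.35, "run_time_h": 8.0}

    @dataclass
    class Scene:
        kind: str      # e.g. "image", "gif", "animation"
        payload: dict

    def equate(sensed: dict) -> dict:
        # Ratio of each sensed parameter to its marker value.
        return {k: sensed[k] / MARKER_VALUES[k]
                for k in MARKER_VALUES if k in sensed}

    def generate_scene(sensed: dict) -> Scene:
        ratios = equate(sensed)
        if ratios.get("current_a", 1.0) < 1.0:   # drawing less than the marker
            return Scene("animation", {"theme": "savings", "ratios": ratios})
        return Scene("image", {"theme": "usage", "ratios": ratios})

    scene = generate_scene({"rpm": 280.0, "current_a": 0.22, "run_time_h": 6.5})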

The system 102 may be in communication with the user device 106. The system 102 may be adapted to collect the data related to the operational parameters of the household appliance 104 and generate the scene having the dynamic user interface 302 on the display unit 214 of the user device 106.

The system 102 may include, but is not limited to, the controller 203 and the control unit 204 in communication with the controller 203. In an embodiment, the controller 203 and the control unit 204 are in direct communication via Bluetooth or any other first network 108-1. In another embodiment, the controller 203 and the control unit 204 are in communication via the second network 108-2 and the cloud-based server 110. In an embodiment, the received data of the plurality of operational parameters of the appliance may be directly transferred to the user device 106 through Bluetooth or any other first network 108-1, without departing from the scope of the present disclosure.

In an embodiment, the controller 203 may be embodied as an onboard controller 203. In an embodiment, the cloud-based server 110 is configured to store the data of the operational parameters of the appliance. In another embodiment, the control unit 204 is configured to receive the data from the cloud-based server 110 for equating the operational parameters with the predefined marker value.

The controller 203 may be configured to receive the measured operational parameters, such as RPM, current, voltage, speed, or fuel supplied, of the household appliance 104 from the sensing unit 202. Furthermore, the controller 203 may be configured to determine the computing parameters of the household appliance 104, based on the received operational parameters of the household appliance 104. Herein, the computing parameters may include, but are not limited to, electricity consumption, idle time, run time, or fuel consumption. The data related to the computing parameters may be transmitted to the control unit 204 via one of the first network 108-1 or the second network 108-2.
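
How a controller might derive such computing parameters from sampled operational parameters can be sketched as follows; the periodic sampling model, names, and units are assumptions, not the disclosed implementation.

    # Illustrative sketch only: derive computing parameters (electricity
    # consumption, run time, idle time) from periodic samples of the
    # operational parameters. Sampling model and units are assumed.
    def compute_parameters(samples, dt_s=1.0):
        # samples: iterable of (voltage_v, current_a, rpm) tuples,
        # taken every dt_s seconds.
        energy_wh = 0.0
        run_s = idle_s = 0.0
        for v, i, rpm in samples:
            energy_wh += v * i * dt_s / 3600.0   # E = V * I * t
            if rpm > 0:
                run_s += dt_s
            else:
                idle_s += dt_s
        return {"electricity_kwh": energy_wh / 1000.0,
                "run_time_h": run_s / 3600.0,
                "idle_time_h": idle_s / 3600.0}

    # One hour of samples at 230 V, 0.20 A -> about 0.046 kWh, 1.0 h run time.
    params = compute_parameters([(230.0, 0.20, 280)] * 3600)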

The control unit 204 may include, but is not limited to, a processor 206, memory 208, module(s) 210, a database 212, and the display unit 214. The module(s) 210 and the memory 208 may be coupled to the processor 206. The processor 206 may be a single processing unit or a number of units, all of which could include multiple computing units.

The processor 206 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the processor 206 is configured to fetch and execute computer-readable instructions and data stored in the memory 208.

The memory 208 may include any non-transitory computer-readable medium known in the art including, for example, volatile memory, such as static random-access memory (SRAM) and dynamic random-access memory (DRAM), and/or non-volatile memory, such as read-only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes.

The module(s) 210, amongst other things, include routines, programs, objects, components, data structures, etc., which perform particular tasks or implement data types. The module(s) 210 may also be implemented as signal processor(s), state machine(s), logic circuitries, and/or any other device or component that manipulates signals based on operational instructions. Further, the module(s) 210 may be implemented in hardware, as instructions executed by at least one processing unit, e.g., the processor 206, or by a combination thereof.

The processing unit may comprise a computer, a processor, a state machine, a logic array, and/or any other suitable devices capable of processing instructions. The processing unit may be a general-purpose processor which executes instructions to cause the general-purpose processor to perform operations, or the processing unit may be dedicated to performing the required functions. In some example embodiments, the module(s) 210 may be machine-readable instructions (software, such as a web application, mobile application, program, etc.) which, when executed by a processor/processing unit, perform any of the described functionalities.

The control unit 204 may be configured to receive the signal indicative of the computing parameters from the controller 203 via one of the first network 108-1 or the second network 108-2. The control unit 204 may be configured to calculate the electricity bills saved, the money saved, and carbon emission reduction, based on the received computing parameters such as the electricity consumption, the idle time, the run time, or the fuel consumption.

Further, the control unit 204 may be configured to generate the dynamic user interface 302 on the display unit 214 to visually or graphically represent the electricity bills saved, and equate the money saved to the money put in a virtual bank. Herein, the savings due to the reduction in electricity consumption may be represented on the dynamic user interface 302 in terms of coins or notes in the virtual bank, as shown in Figure 3(a).

As depicted in Figure 3(a), the control unit 204 generates the user interface 302-1 to represent the saving in terms of two coins. Further, the control unit 204 generates another user interface 302-2 to represent the saving in terms of multiple coins. Herein, the number of coins is increased in the user interface 302-2 as compared to the user interface 302-1. Thus, the dynamic user interface 302 may be continuously changed based on the variation in the operational parameters and the computing parameters of the household appliance 104.
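
A straightforward way to drive such a coin-based interface is to quantize the money saved into coins; the sketch below does this with an assumed tariff and an assumed coin denomination, neither of which comes from the disclosure.

    # Illustrative sketch only: map energy savings to coins in the
    # "virtual bank". Tariff and coin denomination are assumed values.
    TARIFF_PER_KWH = 8.0   # assumed tariff (currency units per kWh)
    COIN_VALUE = 5.0       # assumed money represented by one coin

    def coins_for_savings(kwh_saved: float) -> int:
        money_saved = kwh_saved * TARIFF_PER_KWH
        return int(money_saved // COIN_VALUE)

    # e.g. 2.1 kWh saved -> 16.8 currency units -> 3 coins displayed
    assert coins_for_savings(2.1) == 3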

In an embodiment, the scene may include the dynamic user interface to represent one or more images of a plurality of trees planted, a plurality of coins gained while saving electricity, an equated amount of carbon emissions reduced, and an equated amount of money saved on a real-time basis. In an embodiment, the scene may include the dynamic user interface to visually or graphically represent the reduction in electricity bills and carbon emissions in terms of the money saved and trees planted, respectively.

The control unit 204 may be configured to generate the dynamic user interface 302 on the display unit 214 to visually or graphically represent the carbon emission reduction on the user interface. Herein, the control unit 204 determines how many kilograms (kg) of CO2 are emitted for 1 unit of electricity produced. Accordingly, the control unit 204 further determines how many kg of CO2 are saved by using the household appliance 104.

Further, the control unit 204 evaluates how many kg of CO2 are absorbed by one tree in a year. Thus, the control unit 204 estimates the number of trees planted, based on the reduction in the carbon emissions and the electricity consumption. This information may be further represented on the dynamic user interface 302 in terms of the number of trees planted, such that the savings may be conveyed to the user in terms of the number of trees planted, as shown in Figure 3(b).

As depicted in Figure 3(b), the control unit 204 generates the user interface 302-3 to represent the savings in terms of trees planted. Further, the control unit 204 generates another user interface 302-4 to represent the saving in terms of multiple trees planted. Herein, the number of trees planted is increased in the user interface 302-4 as compared to the user interface 302-3. Thus, the dynamic user interface 302 may be continuously changed based on the variation in the operational parameters and the computing parameters of the household appliance 104. Thus, the number of coins or notes in the virtual bank, and the number of equivalent trees may be determined based on the real-time data of the appliance 104.
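
The tree equivalence described above reduces to two conversion factors, sketched below. The grid emission factor and the annual CO2 uptake of a tree are assumed figures (real values vary by grid and species), and one "unit" of electricity is taken as 1 kWh.

    # Illustrative sketch only: equate electricity saved to trees planted.
    # Both constants are assumptions, not values from the disclosure.
    KG_CO2_PER_KWH = 0.8          # assumed grid emission factor
    KG_CO2_PER_TREE_YEAR = 20.0   # assumed annual uptake of one tree

    def trees_equivalent(kwh_saved: float) -> float:
        kg_co2_saved = kwh_saved * KG_CO2_PER_KWH
        return kg_co2_saved / KG_CO2_PER_TREE_YEAR

    # e.g. 100 kWh saved -> 80 kg CO2 -> the yearly uptake of 4 trees
    assert trees_equivalent(100.0) == 4.0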

In an embodiment, the information displayed on the dynamic user interface 302 may be used to determine a service time and a cleaning time of the appliance 104. Herein, multiple factors such as time, on-off cycles, placement, and usage statistics may be used to determine service times. Further, the savings-related data displayed on the user interface 302, such as the number of trees planted, may be used by a manufacturer to devise a discounting strategy.

In an embodiment, the system 102 determines the savings in terms of money and carbon emissions based on the current electricity consumption and generates the dynamic user interfaces 302 to represent information related to the savings, while the user uses an energy-efficient household appliance. Herein, such savings may be determined by comparing the energy consumption of the energy-efficient household appliance with that of a non-efficient household appliance. Thus, the user may be aware of the money saved and the carbon emissions reduced due to the usage of the energy-efficient household appliance.

Figure 4 illustrates an exemplary environment of the system 102 for generating the dynamic user interfaces 302 for a ceiling fan based on various operational parameters, according to an embodiment of the present disclosure. Herein, the household appliance 104 may be embodied as a ceiling fan. The ceiling fan may include a brushless direct current (BLDC) electric motor and a motor driver 402 coupled to the BLDC electric motor.

The ceiling fan provided with the BLDC electric motor consumes less electricity and thereby causes lower carbon emissions. In an embodiment, such ceiling fans consume about 40% less electricity than other ceiling fans. Therefore, the implementation of such ceiling fans saves electricity, which may be translated into electricity bill savings for a user. The lower consumption of electricity also lowers the carbon emissions of the ceiling fan.
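
Taking the roughly 40% figure above at face value, the bill savings reduce to simple arithmetic, sketched below; the baseline wattage, usage pattern, and tariff are assumed figures for illustration.

    # Illustrative sketch only: monthly savings of a fan consuming about
    # 40% less electricity than a baseline fan. Baseline wattage, usage,
    # and tariff are assumed figures.
    BASELINE_W = 75.0                 # assumed conventional ceiling fan
    EFFICIENT_W = BASELINE_W * 0.6    # about 40% less, per the figure above
    TARIFF_PER_KWH = 8.0              # assumed tariff (currency units per kWh)

    def monthly_savings(hours_per_day: float = 10.0, days: int = 30):
        kwh = hours_per_day * days * (BASELINE_W - EFFICIENT_W) / 1000.0
        return kwh, kwh * TARIFF_PER_KWH

    # 300 h * 30 W = 9.0 kWh -> 72.0 currency units saved per month
    kwh_saved, money_saved = monthly_savings()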

The system 102 may include the controller 203 and the control unit 204 in communication with the controller 203. Herein, the sensing unit 202 may be in communication with the motor driver 402 and positioned in proximity to the motor driver 402. The sensing unit 202 may be adapted to measure the operational parameters, such as the RPM, the voltage, the current, or the speed of the motor driver 402. In an embodiment, the motor driver 402 may send the plurality of operating parameters to the controller 203 without departing from the scope of the present disclosure.

Further, the motor driver 402 having the sensing unit 202 may be in communication with the controller 203, such as an onboard controller. Herein, the sensing unit 202 may be adapted to transmit the signal indicative of the measured operational parameters to the controller 203. Further, the controller 203 may be in communication with a command channel 404 to receive a trigger command. The controller 203 may determine the computing parameters, such as run time, idle time, and electricity consumption of the BLDC motor, based on the received operational parameters.

In an embodiment, the controller 203 may be in communication with the user device 106 such as the smartphone having the control unit 204, via the first network 108-1. In another embodiment, the controller 203 may be in communication with the user device 106 having the control unit 204, via the second network 108-2 and the cloud-based server 110. In an embodiment, the controller 203 may send the operating parameters of the appliance to the cloud-based server 110 without departing from the scope of the present disclosure. In an embodiment, the latest state of the operating parameters and the generated scene based on real-time equating of the data with the marker value may be fetched by the user device 106 without departing from the scope of the present disclosure.

The data related to the computing parameters of the ceiling fan may be stored in the database 212 of the system 102. In an embodiment, the database 212 may be in direct communication with the user device 106 such as the smartphone. In another embodiment, the database 212 may be in communication with the user device 106 such as the smartphone via a cloud network.

The control unit 204 receives the data related to the computing parameters of the ceiling fan from the controller 203, and analyzes the received data related to the computing parameters. Further, the control unit 204 may be configured to generate the dynamic user interface 302 on the display unit 214 of the user device 106, based on the received data related to the computing parameters of the ceiling fan.

In an example, due to the reduction in electricity consumption of the ceiling fan, the dynamic user interface 302 (shown in Figures 3(a)-3(b)) may be generated to visually or graphically represent the reduction in electricity bills and carbon emissions in terms of the money saved and trees planted, respectively. Herein, the information related to the benefits of using the ceiling fan with the BLDC motor may be effectively conveyed to the users via the dynamic user interface 302.

Referring to Figure 5, a flow chart depicting the method 500 for generating a scene based on the working of an appliance 104 is shown, in accordance with the embodiment of the present disclosure. The present disclosure also relates to the method 500 for generating a scene having the dynamic user interfaces 302 for household appliances 104. The method steps are not intended to be construed as a limitation, and any number of the described method steps can be combined in any appropriate order to execute the method or an alternative method.

Additionally, individual steps of the method 500 may be deleted from the method 500 without departing from the spirit and scope of the subject matter described herein. The method for generating a scene having the dynamic user interfaces 302 for household appliances 104 may be performed by using the system 102 as shown at least in Figure 2.

The method includes a step 502 of sensing data of a plurality of operational parameters associated with the appliance 104. To do so, at least one sensing unit 202 is used. The sensing unit 202 is configured to sense data of a plurality of operational parameters associated with the appliance. In an embodiment, the sensing unit 202 is at least one of a standalone sensing unit and an inbuilt sensing unit with the control unit 204.

In an embodiment, the operational parameters of the appliance may include, but are not limited to, revolutions per minute (RPM) of the appliance, a current drawn by the appliance, an operating time of the appliance, ambient conditions of the appliance, operating parameters of the appliance, temperature data related to the appliance, an operating voltage of the appliance, a speed of the appliance, a fuel consumed by the appliance, an idle time of the appliance, a run time of the appliance, and an amount of carbon emissions saved by the appliance.

Further, the method 500 moves to step 504. In step 504, the data on the operational parameters of the appliance is received by the control unit 204. In an embodiment, the control unit 204 is configured to receive the operational parameters of the appliance from the one or more sensors.

The method 500 moves to step 506. In step 506, the data received by the control unit 204 via the one or more sensors of the sensing unit 202 is equated with the predefined marker value. The equating of the data of the operational parameters is performed with the marker value pre-stored within the system 102.

In an embodiment, the marker value may include values of at least one of, but not limited to, the revolutions per minute (RPM) of the appliance, the current drawn by the appliance, the operating time of the appliance, the ambient conditions of the appliance, the operating parameters of the appliance, the temperature data related to the appliance, the operating voltage of the appliance, the speed of the appliance, the fuel consumed by the appliance, the idle time of the appliance, the run time of the appliance, and an amount of carbon emissions saved by the appliance.

Furthermore, the method 500 moves to step 508. In step 508, the scene is generated. The scene is generated based on the real-time equating of the data with the marker value through an artificial intelligence-based technique. In an embodiment, the scene is at least one of a multimedia, an image, a Graphics Interchange Format (GIF), a sound, an animation, visual information, a haptic-based scene, and a multimodal scene, without departing from the scope of the present disclosure.
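
Putting the four steps together, a minimal end-to-end sketch of the method (sense, receive, equate, generate) might read as follows; every name is hypothetical, and the AI-based technique is again stubbed with a simple rule.

    # Illustrative sketch only: steps 502-508 chained end to end.
    MARKERS = {"current_a": 0.35, "rpm": 300.0}

    def sense():                                    # step 502: sense data
        return {"current_a": 0.22, "rpm": 280.0}

    def receive(sensed):                            # step 504: receive data
        return {k: float(v) for k, v in sensed.items()}

    def equate(data):                               # step 506: equate with markers
        return {k: data[k] / MARKERS[k] for k in MARKERS}

    def generate(ratios):                           # step 508: generate scene
        theme = "savings" if ratios["current_a"] < 1.0 else "usage"
        return {"scene": "animation", "theme": theme, "ratios": ratios}

    print(generate(equate(receive(sense()))))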

The system 102 and the method 500 of the present disclosure generate the scene having the dynamic user interface 302 based on the measured operational parameters and the determined computing parameters of the household appliance 104. Further, the system 102 and the method 500 may generate the dynamic user interface 302 to visually or graphically represent the reduction in electricity bills and the carbon emissions in terms of the money saved and trees planted, respectively. In an embodiment, the system 102 and the method 500 may generate the scene showing the dynamic user interface 302 to visually or graphically represent a plurality of control parameters of the appliance 104. In an embodiment, the system 102 and the method 500 may generate the scene showing the dynamic user interface 302 to visually or graphically represent a plurality of parameters related to the safety and security of the appliance 104.

The scene having the dynamic user interface 302 presents the operational and computing parameters of the appliance 104 in a visually compelling way. By visually representing benefits like savings on electricity bills and reductions in carbon emissions, users can quickly grasp the positive impact of the usage of the appliance 104. Further, the inclusion of real-world metrics, such as money saved and trees planted, makes the information more tangible and relatable, encouraging users to take a more active interest in energy conservation.

Herein, since the information related to the benefits of using the household appliance 104 may be effectively conveyed to the users via the dynamic user interface 302, the users may prefer to adopt such household appliances 104 having cleaner and energy-efficient technologies. Also, by continuously monitoring and displaying the operational parameters of the appliance, the system 102 helps users optimize their appliance usage to reduce energy consumption. This directly leads to lower electricity bills, a major financial benefit for the user.

Further, by leveraging the measured operational parameters and the determined computing parameters, the system 102 provides real-time, data-driven insights that help users understand exactly how the appliance is performing in terms of energy consumption and carbon footprint. In this manner, the user can personalize the display according to their preferences, including adjustments to how the savings or carbon reductions are represented, such as more detailed graphs or specific types of environmental metrics.

While specific language has been used to describe the present subject matter, any limitations arising on account thereof are not intended. As would be apparent to a person skilled in the art, various working modifications may be made to the method in order to implement the inventive concept as taught herein. The drawings and the foregoing description give examples of embodiments. Those skilled in the art will appreciate that one or more of the described elements may well be combined into a single functional element. Alternatively, certain elements may be split into multiple functional elements. Elements from one embodiment may be added to another embodiment.

CLAIMS

1. A system (102) for generating a scene based on the working of an appliance (104), the system (102) comprising:
at least one sensing unit (202) configured to sense data of a plurality of operational parameters associated with the appliance;
a control unit (204) configured to:
receive the data on the operational parameters of the appliance;
analyze the data of the operational parameters with a predefined marker value; and
generate the scene based on real-time equating of the data with the marker value.

2. The system (102) as claimed in claim 1, wherein the real-time equating of the data with the marker value is performed through an artificial-intelligence (AI) based technique.

3. The system (102) as claimed in claim 1, wherein the marker value is at least one of revolutions per minute (RPM) of the appliance, a current drawn by the appliance, an operating time of the appliance, ambient conditions of the appliance, operating parameters of the appliance, temperature data related to the appliance, an operating voltage of the appliance, a speed of the appliance, a fuel consumed by the appliance, an idle time of the appliance, a run time of the appliance, and an amount of carbon emissions saved by the appliance.

4. The system (102) as claimed in claim 1, wherein the at least one sensing unit (202) is coupled to the appliance and the control unit (204), wherein the sensing unit (202) is at least one of a standalone sensing unit and an inbuilt sensing unit of the control unit (204).

5. The system (102) as claimed in claim 1, wherein the scene is at least one of a multimedia, an image, a Graphics Interchange Format (GIF), a sound, an animation, visual information, a haptic-based scene, and a multimodal scene.

6. The system (102) as claimed in claim 4, comprising a first network (108-1) to facilitate the receiving of the data by the control unit (204) from at least one of the user devices (106), wherein the first network (108-1) is at least one of a mobile network, a broadband network, a Wide Area Network (WAN), a Local Area Network (LAN), a Personal Area Network, Bluetooth, a wireless network, a wired network, and a combination thereof.

7. The system (102) as claimed in claim 1, wherein the system (102) includes a cloud-based server (110) configured to store the data of the operational parameters of the appliance, wherein the control unit (204) is configured to receive the data from the cloud-based server (110) for equating the operational parameters with the predefined marker value.

8. The system (102) as claimed in claim 3, wherein the system (102) includes a second network (108-2) in communication with the control unit (204), one or more user devices (106), and the cloud-based server (110) to facilitate the receiving of the data by the control unit (204).

9. The system (102) as claimed in claim 8, wherein the scene generated in real time represents one or more images of a plurality of trees planted, a plurality of coins gained while saving electricity, an equated amount of carbon emissions reduced, an equated amount of money saved, a plurality of control parameters, and a plurality of parameters related to the safety and security of the appliance (104).

10. A method (500) for generating a scene based on the working of an appliance (104), the method (500) comprising:
sensing data of a plurality of operational parameters associated with the appliance (104), through at least one sensing unit (202);
receiving the data on the operational parameters of the appliance by a control unit (204) in communication with the one or more sensors;
analyzing the data of the operational parameters with a predefined marker value; and
generating the scene based on real-time equating of the data with the marker value.

11. The method (500) as claimed in claim 10, wherein the marker value is at least one of revolutions per minute (RPM) of the appliance, a current drawn by the appliance, an operating time of the appliance, ambient conditions of the appliance, operating parameters of the appliance, temperature data related to the appliance, an operating voltage of the appliance, a speed of the appliance, a fuel consumed by the appliance, an idle time of the appliance, a run time of the appliance, and an amount of carbon emissions saved by the appliance.

Documents

Application Documents

# Name Date
1 202321089515-TRANSLATIOIN OF PRIOIRTY DOCUMENTS ETC. [28-12-2023(online)].pdf 2023-12-28
2 202321089515-STATEMENT OF UNDERTAKING (FORM 3) [28-12-2023(online)].pdf 2023-12-28
3 202321089515-PROVISIONAL SPECIFICATION [28-12-2023(online)].pdf 2023-12-28
4 202321089515-FORM 1 [28-12-2023(online)].pdf 2023-12-28
5 202321089515-DRAWINGS [28-12-2023(online)].pdf 2023-12-28
6 202321089515-DECLARATION OF INVENTORSHIP (FORM 5) [28-12-2023(online)].pdf 2023-12-28
7 202321089515-FORM-26 [27-03-2024(online)].pdf 2024-03-27
8 202321089515-FORM 18 [26-12-2024(online)].pdf 2024-12-26
9 202321089515-DRAWING [26-12-2024(online)].pdf 2024-12-26
10 202321089515-CORRESPONDENCE-OTHERS [26-12-2024(online)].pdf 2024-12-26
11 202321089515-COMPLETE SPECIFICATION [26-12-2024(online)].pdf 2024-12-26
12 Abstract-1.jpg 2025-02-10