Abstract: A personalized cooking assistive device, comprising a platform 101 placed over a fixed surface of an enclosure, a biometric sensor to enable a user to provide their biometrics, a touch interactive display panel 103 to allow the user to provide input details regarding the specific food item they desire to cook, an internet module linked with the microcontroller to access data for recipes or ingredient information, a holographic projection unit 104 to project virtual images or videos for providing cooking assistance, an artificial intelligence-based imaging unit 105 to monitor spills of food items while cooking, a curved-shaped extendable flap 106 that extends to prevent spillage of essential cooking items, a microphone 109 to allow the user to simply speak regarding misalignment of the platform 101, multiple pneumatic arms 107 to balance the platform 101, and an LED (Light Emitting Diode) light 108 to increase visibility.
Description: FIELD OF THE INVENTION
[0001] The present invention relates to a personalized cooking assistive device that is capable of simplifying and enhancing home cooking by providing personalized recipe suggestions, real-time step-by-step guidance, and automatic spill detection and prevention.
BACKGROUND OF THE INVENTION
[0002] Cooking, a fundamental life skill, has remained largely unchanged despite the rapid pace of technological advancements in various aspects of daily life. For centuries, cooking has relied heavily on traditional methods, passed down through generations, and often characterized by trial-and-error approaches. Home cooks continue to spend countless hours searching for recipes, deciphering complex instructions, and navigating the nuances of culinary techniques. This process is overwhelming, especially for individuals with limited cooking experience, disabilities, or medical conditions that require specialized dietary considerations.
[0003] Traditional cooking assistance tools, such as cooking apps and smart displays, have attempted to bridge this gap but fall short in providing comprehensive solutions. These tools lack personalized guidance, real-time interaction, automatic safety features, and integration of user biometrics and medical history. As a result, home cooks face numerous drawbacks, including time-consuming meal preparation, increased risk of accidents and spills, limited accessibility, and inadequate consideration of medical conditions and dietary restrictions.
[0004] CN104223933A discloses a cooking assist system and a method thereof. The cooking auxiliary system comprises a processor, a display device, a storage device and a communication module, wherein a menu data packet is prestored in the storage device, and comprises dish names, required raw materials and seasoning names; restaurant information is prestored in the menu data packet, and comprises restaurant names, raw materials used by the restaurants, and seasoning lists; the menu data and the business information can be updated through client end networking. The cooking assist system and the method thereof can effectively assist inconvenient users in cooking and buying food, and the users can share the information.
[0005] US4375586A discloses a cooking assistance device for use with a microwave oven comprising a card reader for reading a menu data recorded in a card, a memory unit in which data representing the quantity of foodstuff materials to be heated by said oven and heating processes are stored, and a control unit for reading data from said memory unit in response to the menu data. A character code generator provides conversion of the data read from said memory unit into a character code which is fed into a mini-printer mounted in said oven to provide a print out of characters on a recording medium. A data input key is also provided for entry or alteration of data representing the quantity of servings. A power controller determines the power level and heating interval of the foodstuff based on the stored and keyboard input data and accordingly controls the heating operation.
[0006] Conventionally, many devices exist that are capable of assisting a user in cooking; however, these existing devices are incapable of providing instruction at each step of a recipe. In addition, these existing devices are incapable of protecting the food from being spilled, which causes a mess in the cooking place.
[0007] In order to overcome the aforementioned drawbacks, there exists a need in the art to develop a device that is capable of solving the cooking problems of the user by recommending recipes, guiding the user through each step of cooking, and protecting the food from getting spilled while cooking.
OBJECTS OF THE INVENTION
[0008] The principal object of the present invention is to overcome the disadvantages of the prior art.
[0009] An object of the present invention is to develop a device that is capable of providing personalized cooking assistance and recommendations to a user by authenticating the user, so as to provide the user relevant and effective support.
[0010] Another object of the present invention is to develop a device that is capable of automatically protecting food items from getting spilled by adaptive adjustment, thereby reducing risks and hassle and ensuring optimal conditions.
[0011] Yet another object of the present invention is to develop a device that is capable of providing immersive and real-time guidance by projecting step-by-step instructions to engage users, build confidence and foster learning and exploration.
[0012] The foregoing and other objects, features, and advantages of the present invention will become readily apparent upon further review of the following detailed description of the preferred embodiment as illustrated in the accompanying drawings.
SUMMARY OF THE INVENTION
[0013] The present invention relates to a personalized cooking assistive device that is capable of transforming home cooking with tailored recipe suggestions, interactive guidance, and intelligent safety features by adjusting to individual skill levels and lighting conditions, ensuring a smooth, secure, and enjoyable culinary experience that encourages healthy habits and culinary discovery, accessible to cooks of all levels.
[0014] According to an embodiment of the present invention, a personalized cooking assistive device comprises a platform developed to be placed over a fixed surface of an enclosure, multiple suction cups attached underneath to affix the platform with the surface, a biometric sensor embedded in the platform to enable a user to provide their biometrics, a touch interactive display panel installed on the platform to allow the user to provide input details regarding the specific food item they desire to cook, an internet module linked with an inbuilt microcontroller to access data for recipes or ingredient information, a holographic projection unit installed on the platform to project virtual images or videos for providing cooking assistance, and an artificial intelligence-based imaging unit arranged over the platform to capture multiple high-resolution images of the user from various angles to monitor spills of food items while cooking.
[0015] According to another embodiment of the present invention, the proposed device further comprises a curved-shaped extendable flap installed at lateral sides of the platform that extends to prevent spillage of essential cooking items in real time, a microphone installed on the platform to allow the user to simply speak regarding misalignment of the platform, multiple pneumatic arms installed at the sides of the platform to balance the platform and improve the angle of the platform and screen, an LED (Light Emitting Diode) light installed on the platform to increase visibility, especially for users suffering from low vision, and a battery associated with the device to supply power to the electrically powered components employed herein.
[0016] While the invention has been described and shown with particular reference to the preferred embodiment, it will be apparent that variations might be possible that would fall within the scope of the present invention.
BRIEF DESCRIPTION OF THE DRAWINGS
[0017] These and other features, aspects, and advantages of the present invention will become better understood with regard to the following description, appended claims, and accompanying drawings where:
Figure 1 illustrates an isometric view of a personalized cooking assistive device.
DETAILED DESCRIPTION OF THE INVENTION
[0018] The following description includes the preferred best mode of one embodiment of the present invention. It will be clear from this description of the invention that the invention is not limited to these illustrated embodiments but that the invention also includes a variety of modifications and embodiments thereto. Therefore, the present description should be seen as illustrative and not limiting. While the invention is susceptible to various modifications and alternative constructions, it should be understood, that there is no intention to limit the invention to the specific form disclosed, but, on the contrary, the invention is to cover all modifications, alternative constructions, and equivalents falling within the spirit and scope of the invention as defined in the claims.
[0019] In any embodiment described herein, the open-ended terms "comprising," "comprises," and the like (which are synonymous with "including," "having," and "characterized by") may be replaced by the respective partially closed phrases "consisting essentially of," "consists essentially of," and the like, or the respective closed phrases "consisting of," "consists of," and the like.
[0020] As used herein, the singular forms “a,” “an,” and “the” designate both the singular and the plural, unless expressly stated to designate the singular only.
[0021] The present invention relates to a personalized cooking assistive device that is capable of elevating home cooking by providing personalized recipes, real-time guidance, and automated safety measures along with adapting to users' skills and surroundings for a hassle-free, healthy, and delightful cooking experience.
[0022] Referring to Figure 1, an isometric view of a personalized cooking assistive device is illustrated, comprising a platform 101 installed with multiple suction cups 102, a touch interactive display panel 103 mounted on the platform 101, a holographic projection unit 104 provided on the platform 101, an artificial intelligence-based imaging unit 105 installed on the platform 101, a curved-shaped extendable flap 106 attached to lateral sides of the platform 101, a plurality of pneumatic arms 107 mounted on the sides of the platform 101, an LED (Light Emitting Diode) light 108 mounted on the platform 101, and a microphone 109 embedded on the platform 101.
[0023] The device disclosed herein comprises a platform 101, which serves as the main structure of the device and is developed to be placed over a fixed surface of an enclosure such as a kitchen. The platform 101 is made of a sturdy material such as steel or aluminum, ensuring strength and stability, and multiple suction cups 102 are attached underneath to affix the platform 101 to the surface. The suction cups 102 create a vacuum seal between the surface and the platform 101. When the suction cups 102 are pressed against the surface, the initial contact creates a seal between the cups 102 and the surface, sealing off the area within the suction cups 102. The suction cups 102 are designed to maintain a relatively airtight seal.
[0024] After adhering the platform 101 to the surface, the process begins when a user provides biometrics over a biometric sensor embedded in the platform 101. The biometric sensor herein is a fingerprint sensor that captures and analyzes biometric data. The fingerprint sensor captures the user's unique physiological or behavioral characteristics and converts them into a digital template, which is a mathematical representation of the fingerprint. The digital template is compared to the pre-saved fingerprints stored in a database linked with an inbuilt microcontroller for verification of the user. This is done by calculating the similarity between the new template and the stored templates.
[0025] A matching threshold is set to determine whether the captured fingerprint sufficiently matches any of the pre-saved fingerprints. Based on the comparison results and the matching threshold, authentication is decided. If the user is successfully authenticated, the microcontroller fetches the medical history of the user from the database for monitoring the medical condition of the user.
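By way of illustration only, the template-matching and thresholding steps described above can be sketched as follows. The feature-vector representation of a template, the function names, the sample database, and the 0.85 acceptance threshold are all illustrative assumptions and not part of the disclosed device:

```python
import math

MATCH_THRESHOLD = 0.85  # assumed acceptance threshold, not specified in the disclosure

def similarity(a, b):
    """Cosine similarity between two template feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def authenticate(captured, stored_templates):
    """Return the id of the best-matching registered user if the best
    similarity meets the matching threshold, otherwise None."""
    best_id, best_score = None, 0.0
    for user_id, template in stored_templates.items():
        score = similarity(captured, template)
        if score > best_score:
            best_id, best_score = user_id, score
    return best_id if best_score >= MATCH_THRESHOLD else None

# Hypothetical pre-saved template database linked with the microcontroller.
db = {"alice": [0.9, 0.1, 0.4], "bob": [0.1, 0.8, 0.2]}
```

On a successful match, the device would then look up the matched user's medical history; on failure (`None`), authentication is refused.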
[0026] Simultaneously, the microcontroller actuates a touch interactive display panel 103 installed on the platform 101 to allow the user to provide input details regarding the specific food item that is to be cooked. The touch interactive display panel 103 as mentioned herein is typically an LCD (Liquid Crystal Display) screen that presents output in a visible form.
[0027] The screen is equipped with touch-sensitive technology, allowing the user to interact directly with the display using their fingers. A touch controller IC (Integrated Circuit) is responsible for processing the analog signals generated when the user inputs details regarding the food they desire to cook. The touch controller is typically connected to the microcontroller through various interfaces, which may include but are not limited to SPI (Serial Peripheral Interface) or I2C (Inter-Integrated Circuit).
[0028] After receiving the input details from the user, the microcontroller fetches recipe data from an internet module linked with the microcontroller to access data of recipes, ingredients, and culinary knowledge, including instructional videos, cultural cooking stories, and techniques from around the world. The microcontroller fetches data from the internet module through a process involving multiple stages. Initially, the microcontroller sends a request signal to the internet module, specifying the type of data required, such as recipes or ingredient information. This request signal is generated based on user input or predefined settings.
[0029] Upon receiving the request, the internet module activates its communication protocols, such as Wi-Fi or cellular connectivity, to establish a connection with the internet. The module then sends a Hypertext Transfer Protocol (HTTP) request to the specified server, which hosts the desired data. The server processes the request and transmits the relevant data back to the internet module. This data is typically formatted in JavaScript Object Notation (JSON) or Extensible Markup Language (XML) for easy parsing.
[0030] The internet module receives the data and performs preliminary processing, including data decompression and error checking. The processed data is then transmitted to the microcontroller through a serial communication interface, such as Universal Asynchronous Receiver-Transmitter (UART) or Inter-Integrated Circuit (I2C). The microcontroller receives the data and stores it in its internal memory. The data is then parsed and analyzed using predefined protocols, allowing the microcontroller to extract relevant information and update its internal databases. Finally, the microcontroller uses the fetched data to generate personalized cooking guidance.
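The parsing stage described above — where the microcontroller extracts recipe information from a JSON-formatted server response — can be sketched as follows. The field names (`name`, `ingredients`, `steps`) and the sample payload are illustrative assumptions; the network exchange itself is omitted and a canned payload stands in for the server's response:

```python
import json

def parse_recipe_payload(raw):
    """Parse a JSON recipe payload (as described for the server response)
    into the fields the microcontroller needs. Field names are assumed."""
    data = json.loads(raw)
    return {
        "name": data["name"],
        "ingredients": data.get("ingredients", []),
        "steps": data.get("steps", []),
    }

# Canned payload standing in for an HTTP response body.
payload = '{"name": "dal tadka", "ingredients": ["lentils", "ghee"], "steps": ["rinse", "boil", "temper"]}'
recipe = parse_recipe_payload(payload)
```

The parsed dictionary would then be stored in internal memory and drive the step-by-step projection described next.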
[0031] After fetching the recipes from the internet module, the microcontroller actuates a holographic projection unit 104 installed on the platform 101 to project virtual images or videos for providing cooking assistance. On actuation of the holographic projection unit 104 by the microcontroller, the light source emits various combinations of light towards the lens, which are then portrayed in front of the user to project the virtual images or videos, providing appropriate assistance to aid the user in cooking. The holographic projection unit 104 slows down or accelerates as per the user's proficiency or requirement, and also suggests recipes suited to the skill level of the user, reducing the complexity of recipes as per the historical user data.
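The pacing behaviour described above — slowing down for novices and accelerating for experienced users — can be sketched minimally as follows. The 0–10 skill scale, the cutoffs, and the speed multipliers are illustrative assumptions, not values from the disclosure:

```python
def playback_speed(skill_level, base_speed=1.0):
    """Scale instruction playback from historical user data:
    novices get slower playback, experienced cooks a faster pace.
    Skill levels (0-10) and cutoffs are assumptions."""
    if skill_level < 3:        # assumed novice band
        return base_speed * 0.75
    if skill_level > 7:        # assumed experienced band
        return base_speed * 1.25
    return base_speed
```

A richer implementation might interpolate continuously rather than using bands, but the thresholded form shows the adaptive idea.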
[0032] An artificial intelligence-based imaging unit 105 is arranged over the platform 101 to capture multiple high-resolution images of the user from various angles to monitor spills of food items while cooking. The artificial intelligence-based imaging unit 105 is constructed with a camera lens and a processor, wherein the camera lens is adapted to capture a series of images of the user. The processor carries out a sequence of image processing operations including pre-processing, feature extraction, and classification.
[0033] The images captured by the imaging unit 105 are real-time images of the user. The artificial intelligence-based imaging unit 105 transmits the captured image signal in the form of digital bits to the microcontroller. Upon receiving the image signals, the microcontroller compares the received image signal with the pre-fed data stored in a database and constantly determines spills of food items while cooking.
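As a simplified stand-in for the comparison against pre-fed data described above, a frame-differencing check can illustrate the detection logic: a spill is flagged when a sufficiently large region of the current frame deviates from a reference frame. The flat grayscale representation and both thresholds are illustrative assumptions; the disclosed device uses classification against a database rather than plain differencing:

```python
def detect_spill(reference, current, pixel_delta=30, area_threshold=0.05):
    """Flag a spill when the fraction of pixels differing from the
    reference frame by more than pixel_delta exceeds area_threshold.
    Frames are flat lists of grayscale values; thresholds are assumed."""
    changed = sum(1 for r, c in zip(reference, current) if abs(r - c) > pixel_delta)
    return changed / len(reference) > area_threshold
```

On a positive result, the microcontroller would actuate the extendable flap described in the next paragraph.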
[0034] During cooking, in case a food item is spilled and the spill is detected by the imaging unit 105, the microcontroller synchronously actuates a curved-shaped extendable flap 106 installed at the lateral sides of the platform 101 to extend for preventing spillage of essential cooking items in real time. The extension of the flap 106 is powered by a drawer arrangement, which is integrated in the flap 106 for providing extension or retraction to the flap 106.
[0035] The drawer arrangement consists of a drawer that typically slides on rails inside the flap 106. These rails provide a smooth and stable path for the extension or retraction of the flap 106. When the microcontroller actuates the drawer arrangement, the motor starts rotating and the rotational motion is converted into linear motion through the use of gears. As the motor rotates, the drawer moves either outward or inward along the sliding rails. This extension of the flap 106 increases the size of the flap 106 to form a boundary that prevents the spillage of the food items.
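The rotation-to-linear conversion described above can be modelled as a simple rack-and-pinion relation: linear travel equals motor turns times the pinion circumference, clamped at the rail's hard stop. The pinion radius and maximum travel are illustrative assumptions:

```python
import math

def flap_extension_mm(motor_turns, pinion_radius_mm=6.0, max_travel_mm=80.0):
    """Convert motor rotation into linear flap travel along the rails
    (rack-and-pinion model; dimensions are illustrative assumptions)."""
    travel = motor_turns * 2 * math.pi * pinion_radius_mm
    return min(travel, max_travel_mm)  # clamp at full extension
```

The microcontroller would command enough turns to reach full extension when a spill is detected, and reverse the motor to retract.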
[0036] If the platform 101 becomes unbalanced while the user is cooking, the user is required to provide voice commands over a microphone 109 installed on the platform 101 regarding the misalignment of the platform 101. The microphone 109 plays a crucial role by converting spoken words or commands into electrical signals, which are then processed and analyzed to trigger specific actions. When the user speaks or issues a command about the misalignment of the platform 101, their vocal cords vibrate, creating sound waves.
[0037] These sound waves travel through the air as variations in air pressure. The microphone 109 mentioned herein is a transducer that converts these variations in air pressure into electrical signals. The analog electrical signal is converted into digital form by an analog-to-digital converter (ADC). The digital signal is then subjected to various signal processing techniques to enhance voice quality and eliminate noise.
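After the ADC and noise-reduction stages described above, the recognized utterance must be mapped to an action. A minimal keyword-spotting sketch stands in here for full speech recognition; the keyword list and function name are illustrative assumptions:

```python
# Assumed vocabulary of misalignment complaints; a real system would use
# a trained speech-recognition model rather than keyword matching.
MISALIGNMENT_KEYWORDS = {"tilt", "tilted", "misaligned", "uneven", "wobbly"}

def is_misalignment_command(transcript):
    """Return True when the recognized utterance mentions platform
    misalignment, triggering the pneumatic balancing routine."""
    words = transcript.lower().split()
    return any(w.strip(".,!?") in MISALIGNMENT_KEYWORDS for w in words)
```

A True result would cause the microcontroller to actuate the pneumatic arms described next.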
[0038] Based on the detected misalignment, the microcontroller actuates multiple pneumatic arms 107 installed at the sides of the platform 101 to balance the platform 101, improving the angle of the platform 101 and screen. The pneumatic arms 107 as mentioned herein are powered by a pneumatic unit that utilizes compressed air to extend and retract the arms 107. The process begins with an air compressor, which compresses atmospheric air to a higher pressure. The air cylinder of the pneumatic unit contains a piston that moves back and forth within the cylinder.
[0039] The cylinder is connected to one end of the pneumatic arms 107. The piston is attached to the pneumatic arms 107, and its movement is controlled by the flow of compressed air. To extend the pneumatic arms 107, the air valve is activated to allow compressed air to flow into the chamber behind the piston. As the pressure increases in the chamber, the piston pushes the pneumatic arms 107 to the desired length for balancing the platform 101 over the surface.
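The balancing computation can be sketched as a per-arm correction derived from the measured tilt, assuming four arms at the platform corners and a linear millimetres-per-degree gain; the corner layout, sign convention, and gain value are all illustrative assumptions:

```python
def arm_corrections(roll_deg, pitch_deg, gain_mm_per_deg=1.5):
    """Per-arm extension offsets (mm) that counteract the measured tilt,
    for four arms at the platform corners. Gain and signs are assumed."""
    return {
        "front_left":  gain_mm_per_deg * ( roll_deg + pitch_deg),
        "front_right": gain_mm_per_deg * (-roll_deg + pitch_deg),
        "rear_left":   gain_mm_per_deg * ( roll_deg - pitch_deg),
        "rear_right":  gain_mm_per_deg * (-roll_deg - pitch_deg),
    }
```

Each offset would be translated into valve-open time for the corresponding cylinder; a level platform yields zero correction for every arm.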
[0040] If the ambient light detected by the imaging unit 105 falls below a threshold, the microcontroller actuates an LED (Light Emitting Diode) light 108 installed on the platform 101 to increase visibility, especially for users suffering from low vision. The LED 108 works by utilizing a phenomenon called electroluminescence. When an electric current flows through the LED 108, the electrons in the semiconductor material release energy in the form of light, and the energy released corresponds to the wavelength of the light.
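The threshold check described above can be sketched as follows; the hysteresis band (separate on/off thresholds, so the LED does not flicker when ambient light hovers near the cutoff) is an addition of this sketch, and the lux values are illustrative assumptions:

```python
def led_state(ambient_lux, currently_on, on_below=150, off_above=180):
    """Decide whether the LED should be on, given the ambient light level
    and the current LED state. Hysteresis band and lux values are assumed."""
    if ambient_lux < on_below:
        return True
    if ambient_lux > off_above:
        return False
    return currently_on  # inside the band: keep the current state
```

The microcontroller would call this each time the imaging unit reports a new ambient-light estimate.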
[0041] A battery is associated with the device to supply power to the electrically powered components employed herein. The battery comprises a pair of electrodes, namely a cathode and an anode. The battery uses an oxidation/reduction chemical reaction to do work on charge and produce a voltage between its anode and cathode, thus producing the electrical energy used to do work in the device.
[0042] The present invention works best in the following manner: the platform 101 as disclosed in the invention is placed over the fixed surface of the enclosure, such as a kitchen, and multiple suction cups 102 affix the platform 101 to the surface. The user then provides their biometrics over the biometric sensor, and the microcontroller accordingly actuates the touch interactive display panel 103 to allow the user to provide details regarding the specific food item they desire to cook. Based on the provided input, the internet module accesses data for recipes or ingredient information, and the holographic projection unit 104 projects virtual images or videos for providing step-by-step cooking assistance. The artificial intelligence-based imaging unit 105 monitors spills of food items while cooking, and accordingly the curved-shaped extendable flap 106 extends to prevent spillage of essential cooking items in real time. The microphone 109 allows the user to simply speak regarding misalignment of the platform 101, the multiple pneumatic arms 107 balance the platform 101 to improve the angle of the platform 101 and screen, the LED (Light Emitting Diode) light 108 increases visibility in the surroundings, and the battery supplies power to the electrically powered components employed herein.
[0043] Although the field of the invention has been described herein with limited reference to specific embodiments, this description is not meant to be construed in a limiting sense. Various modifications of the disclosed embodiments, as well as alternate embodiments of the invention, will become apparent to persons skilled in the art upon reference to the description of the invention.
Claims:
1) A personalized cooking assistive device, comprising:
i) a platform 101 developed to be positioned on a fixed surface inside an enclosure, wherein said platform 101 is installed with multiple suction cups 102, arranged beneath said platform 101 for affixing said platform 101 with said surface;
ii) a biometric sensor installed over said platform 101 to receive biometrics of a user, based on which an inbuilt microcontroller compares said received biometrics with pre-saved biometrics of a registered user, wherein in case of successful authentication of said user, said microcontroller fetches a medical history of said user to determine a medical condition of said user;
iii) a touch interactive display panel 103 mounted on said platform 101 that is accessed by said user for providing input details regarding a specific food item that said user desires to cook, wherein said microcontroller accesses an internet module integrated with said microcontroller to access data of recipes, ingredients, and culinary knowledge, including instructional videos, cultural cooking stories, and techniques from around the world;
iv) a holographic projection unit 104 provided on said platform 101 that is activated by said microcontroller for offering step-by-step cooking assistance, wherein said holographic projection slows down or accelerates based on user's cooking proficiency, and recommends recipes tailored to user’s skill level, adjusting complexity of recipes based on historical user data; and
v) an artificial intelligence-based imaging unit 105 installed on said platform 101 and paired with a processor for capturing and processing multiple images of said user to detect spills of food items during cooking, wherein upon successful detection, said microcontroller actuates a curved-shaped extendable flap 106 attached to lateral sides of said platform 101 to extend in a synchronous manner, protecting internal components from accidental spills.
2) The device as claimed in claim 1, wherein a plurality of pneumatic arms 107 is mounted on sides of said platform 101, configured to adjust the viewing angle of the device's platform 101 and screen based on user voice commands received via a microphone 109 embedded on said platform 101.
3) The device as claimed in claim 1, wherein an LED (Light Emitting Diode) light 108 mounted on said platform 101 activates automatically when ambient light falls below a threshold to enhance visibility.
4) The device as claimed in claim 1, wherein a battery is associated with said device for powering up electrical and electronically operated components associated with said device.
| # | Name | Date |
|---|---|---|
| 1 | 202421094344-STATEMENT OF UNDERTAKING (FORM 3) [30-11-2024(online)].pdf | 2024-11-30 |
| 2 | 202421094344-REQUEST FOR EXAMINATION (FORM-18) [30-11-2024(online)].pdf | 2024-11-30 |
| 3 | 202421094344-REQUEST FOR EARLY PUBLICATION(FORM-9) [30-11-2024(online)].pdf | 2024-11-30 |
| 4 | 202421094344-POWER OF AUTHORITY [30-11-2024(online)].pdf | 2024-11-30 |
| 5 | 202421094344-FORM-9 [30-11-2024(online)].pdf | 2024-11-30 |
| 6 | 202421094344-FORM FOR SMALL ENTITY(FORM-28) [30-11-2024(online)].pdf | 2024-11-30 |
| 7 | 202421094344-FORM 18 [30-11-2024(online)].pdf | 2024-11-30 |
| 8 | 202421094344-FORM 1 [30-11-2024(online)].pdf | 2024-11-30 |
| 9 | 202421094344-FIGURE OF ABSTRACT [30-11-2024(online)].pdf | 2024-11-30 |
| 10 | 202421094344-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [30-11-2024(online)].pdf | 2024-11-30 |
| 11 | 202421094344-EVIDENCE FOR REGISTRATION UNDER SSI [30-11-2024(online)].pdf | 2024-11-30 |
| 12 | 202421094344-EDUCATIONAL INSTITUTION(S) [30-11-2024(online)].pdf | 2024-11-30 |
| 13 | 202421094344-DRAWINGS [30-11-2024(online)].pdf | 2024-11-30 |
| 14 | 202421094344-DECLARATION OF INVENTORSHIP (FORM 5) [30-11-2024(online)].pdf | 2024-11-30 |
| 15 | 202421094344-COMPLETE SPECIFICATION [30-11-2024(online)].pdf | 2024-11-30 |
| 16 | Abstract.jpg | 2024-12-24 |
| 17 | 202421094344-FORM-26 [03-06-2025(online)].pdf | 2025-06-03 |