
Autonomous Dining Assistive Device

Abstract: An autonomous dining assistive device comprises a mat-like body 101 designed to be placed on a dining surface; a plurality of suction units 102 for gripping utensils containing food; a motorized slider 103 mounted along the body's edge to enable linear horizontal movement of a robotic arm 104, which is operable for reaching different areas and delivering food to a nearby user's mouth; an imaging unit 105 synchronized with an embedded color sensor that identifies food type; a voice command interface 106 with a microphone array and voice recognition module that allows natural language control; a covering unit 107 with a curved expandable rod 107a and motorized ball-and-socket joint 107c that covers food when no user is detected; a gyroscope sensor that detects involuntary hand tremors; a thermal camera 108 that monitors food temperature; a rotatable air blower 109 that cools the food; and a multi-section compartment 110 storing various cutlery.


Patent Information

Application #
202521060390
Filing Date
24 June 2025
Publication Number
28/2025
Publication Type
INA
Invention Field
ELECTRONICS

Applicants

Marwadi University
Rajkot - Morbi Road, Rajkot 360003 Gujarat, India.

Inventors

1. K. Bala Sankar
Department of Computer Engineering - Artificial Intelligence, Marwadi University, Rajkot - Morbi Road, Rajkot 360003 Gujarat, India.
2. J.Santhosh Prem
Department of Computer Engineering - Artificial Intelligence, Marwadi University, Rajkot - Morbi Road, Rajkot 360003 Gujarat, India.
3. P.Satish
Department of Computer Engineering - Artificial Intelligence, Marwadi University, Rajkot - Morbi Road, Rajkot 360003 Gujarat, India.
4. Dr. Madhu Shukla
Department of Computer Science and Engineering - Artificial Intelligence, Machine Learning, Data Science, Marwadi University, Rajkot - Morbi Road, Rajkot 360003 Gujarat, India.
5. Simrin Fathima Syed
Department of Computer Science and Engineering - Artificial Intelligence, Machine Learning, Data Science, Marwadi University, Rajkot - Morbi Road, Rajkot 360003 Gujarat, India.
6. Vipul Ladva
Department of Computer Science and Engineering - Artificial Intelligence, Machine Learning, Data Science, Marwadi University, Rajkot - Morbi Road, Rajkot 360003 Gujarat, India.
7. Akshay Ranpariya
Department of Computer Science and Engineering - Artificial Intelligence, Machine Learning, Data Science, Marwadi University, Rajkot - Morbi Road, Rajkot 360003 Gujarat, India.
8. Neel Dholakia
Department of Computer Science and Engineering - Artificial Intelligence, Machine Learning, Data Science, Marwadi University, Rajkot - Morbi Road, Rajkot 360003 Gujarat, India.

Specification

Description:

FIELD OF THE INVENTION

[0001] The present invention relates to an autonomous dining assistive device that is capable of assisting individuals with limited motor abilities by facilitating food identification, delivery, temperature regulation, and hygienic protection during mealtime.

BACKGROUND OF THE INVENTION

[0002] Many individuals with physical disabilities, motor impairments, or age-related limitations face significant challenges during mealtimes, including difficulty in holding utensils, reaching food, or maintaining hand stability. These limitations often lead to a loss of independence, reliance on caregivers, and increased risk of spillage, choking, or inadequate nutrition. Conventional dining aids offer limited functionality and require constant supervision, which is not feasible in all care settings. There is a growing need for an autonomous dining assistance solution that adapts to user needs, enhances comfort, and ensures safety during eating. Such a device must overcome challenges related to motion control, user interaction, food handling, and hygiene while supporting both independent and assisted dining experiences effectively.

[0003] Traditionally, several similar devices are available, such as the Obi feeding robot, Neater Eater, and Liftware utensils. While these devices support individuals with limited mobility, they have notable drawbacks. For example, the Obi robot operates on predefined feeding paths and lacks adaptive features like food type recognition or dynamic portion control. Neater Eater requires user effort and is not fully automated, limiting its effectiveness for users with severe impairments. Liftware utensils offer hand tremor stabilization but do not assist with food transfer or independent eating. Most of these devices lack integrated voice control, environmental sensing, or personalized interaction, reducing their overall autonomy, versatility, and suitability for users with complex needs.

[0004] CN103027772A discloses a dining assistive device in the field of life auxiliary instruments, which comprises a base, a support post, a desktop, a turntable, a delivery device, a plectrum, and pedals. The support post is arranged in the center of the base; an indexing mechanism is arranged in the support post; the support post carries the desktop; the turntable is arranged in the center of the desktop; the indexing mechanism in the support post is connected with the turntable; the delivery device is arranged above the desktop; the plectrum is fixedly arranged on the edge of the turntable in a protruding manner; and the pedals are respectively arranged on the two sides of the base and respectively connected with the indexing mechanism in the support post and the delivery device through steel wires. The indexing mechanism consists of an inner rod, an outer cylinder, and a return spring; the outer cylinder is sleeved on the inner rod and is reset by the return spring; and the pedals transfer power through the steel wires to convert the up-and-down motion of the inner rod into the rotary indexing motion of the outer cylinder. The dining assistive device has the advantages of a simple and compact structure and simple, convenient operation.

[0005] CN107928922B discloses an intelligent meal auxiliary device comprising an arm auxiliary mechanism and a tray mechanism. The arm auxiliary mechanism comprises a first fixing seat, a first mechanical arm arranged on the first fixing seat, a pressure-sensing hand rest arranged at the upper end of the first mechanical arm, and a first controller. An arc-shaped supporting groove matched with the arm of a patient is arranged at the upper part of the pressure-sensing hand rest. The hand rest senses the pressure applied by the patient's arm and transmits the sensed current pressure to the first controller, which compares and analyzes it against a preset pressure to calculate the action trend of the patient's arm, and in turn controls the first mechanical arm to assist the patient's arm to move in the direction of the action trend. The device assists patients suffering from hand impairments to eat independently; it not only solves the patient's dining problem but can also provide rehabilitation training.

[0006] Conventionally, many devices are available in the market for assisting users in dining. However, these devices lack adaptability and automation. Most rely on fixed operations without real-time responsiveness to user needs, food type, or environmental conditions. They typically require manual input, offer limited functionality, and do not fully support independent dining.

[0007] In order to overcome the aforementioned drawbacks, there exists a need in the art for a device capable of offering autonomous, real-time dining assistance tailored to user needs. The device should support safe, hygienic, and independent eating through autonomous motion control, food recognition, temperature regulation, and voice interaction, reducing caregiver dependence.

OBJECTS OF THE INVENTION

[0008] The principal object of the present invention is to overcome the disadvantages of the prior art.

[0009] An object of the present invention is to develop a device that is capable of supporting individuals with limited motor abilities by providing autonomous assistance during eating, enabling them to consume food more safely, independently, and with minimal external help.

[0010] Another object of the present invention is to develop a device that is capable of detecting a user's needs and conditions in real time and adjusting the food delivery process accordingly to ensure comfort, safety, and an efficient eating experience.

[0011] Another object of the present invention is to develop a device that is capable of regulating food temperature and protecting food from insects or contaminants by monitoring environmental conditions and responding with appropriate actions to maintain hygiene and food quality.

[0012] Yet, another object of the present invention is to develop a device that is capable of enabling users to control the device using simple voice commands, reducing physical effort and allowing convenient, hands-free interaction during meals.

[0013] The foregoing and other objects, features, and advantages of the present invention will become readily apparent upon further review of the following detailed description of the preferred embodiment as illustrated in the accompanying drawings.

SUMMARY OF THE INVENTION

[0014] The present invention relates to an autonomous dining assistive device that is capable of handling and delivering food precisely to a user while offering support for independent eating. The present device identifies food types and maintains hygiene, all while adapting to user needs and ensuring safety during mealtimes.

[0015] According to an embodiment of the present invention, an autonomous dining assistive device comprises a mat-like body developed to be positioned on a dining surface; a plurality of suction units embedded within the body for gripping utensils containing food; a motorized slider mounted along the edge of the body, configured to provide linear horizontal movement to a robotic arm, the robotic arm being operable for accessing different areas and delivering food to the mouth of a user present in proximity; a microcontroller configured to operate the robotic arm in two modes: an autonomous mode for picking up food and delivering it to the user, and a supportive mode for stabilizing the user's hand during independent eating; an imaging unit installed on the body and synced with an embedded color sensor to detect the type of food present in the utensil; and a voice command interface comprising a microphone array and a voice recognition module to facilitate direct control of the robotic arm via natural language instructions.

[0016] According to another embodiment of the present invention, the device further includes a covering unit comprising a curved expandable rod and a plate connected via a motorized ball-and-socket joint, configured to automatically cover the food when no user presence is detected nearby; a proximity sensor integrated with the body to detect the presence and exact positioning of a user in proximity; a gyroscope sensor integrated with the robotic arm to detect involuntary hand tremors; a thermal camera mounted on the body to monitor food temperature; a rotatable air blower mounted on the body to cool the food when the temperature exceeds a preset threshold; a Passive Infrared (PIR) sensor to detect approaching insects; a compartment having multiple sections provided with the body for storing various cutlery, with automatic cutlery selection based on user needs or food type; and a battery associated with the device for supplying power to the electrically and electronically operated components of the device.

[0017] While the invention has been described and shown with particular reference to the preferred embodiment, it will be apparent that variations might be possible that would fall within the scope of the present invention.

BRIEF DESCRIPTION OF THE DRAWINGS

[0018] These and other features, aspects, and advantages of the present invention will become better understood with regard to the following description, appended claims, and accompanying drawings where:
Figure 1 illustrates an isometric view of an autonomous dining assistive device.

DETAILED DESCRIPTION OF THE INVENTION

[0019] The following description includes the preferred best mode of one embodiment of the present invention. It will be clear from this description of the invention that the invention is not limited to these illustrated embodiments but that the invention also includes a variety of modifications and embodiments thereto. Therefore, the present description should be seen as illustrative and not limiting. While the invention is susceptible to various modifications and alternative constructions, it should be understood that there is no intention to limit the invention to the specific form disclosed; on the contrary, the invention is to cover all modifications, alternative constructions, and equivalents falling within the spirit and scope of the invention as defined in the claims.

[0020] In any embodiment described herein, the open-ended terms "comprising," "comprises," and the like (which are synonymous with "including," "having," and "characterized by") may be replaced by the respective partially closed phrases "consisting essentially of," "consists essentially of," and the like, or the respective closed phrases "consisting of," "consists of," and the like.

[0021] As used herein, the singular forms “a,” “an,” and “the” designate both the singular and the plural, unless expressly stated to designate the singular only.

[0022] The present invention relates to an autonomous dining assistive device that is capable of supporting individuals during mealtime by enabling independent food consumption, ensuring user safety, maintaining hygiene, adjusting to user needs, and providing responsive interaction without the need for continuous physical assistance or supervision.

[0023] Referring to Figure 1, an isometric view of an autonomous dining assistive device is illustrated, comprising a mat-like body 101 developed to be positioned on a dining surface, a plurality of suction units 102 embedded within the body 101, a motorized slider 103 mounted along the edge of the body 101, a robotic arm 104 installed on the slider 103, an imaging unit 105 installed on the body 101, a voice command interface 106 installed on the body 101, a covering unit 107 comprising a curved expandable rod 107a and a plate 107b connected via a motorized ball-and-socket joint 107c, a thermal camera 108 mounted on the body 101, a rotatable air blower 109 mounted on the body 101, and a compartment 110 having multiple sections provided with the body 101.

[0024] The device disclosed herein includes a mat-like body 101 developed to be positioned on a dining surface. The body 101 houses all necessary components of the device for assisting users in dining. The device is equipped with a push button for activating and deactivating it. The push button is accessed by the user to activate the device. When the user presses the push button, the electrical circuit is completed, which in turn switches the device on. The push button is integrated with an actuator and a spring, which are automatically engaged when the button is pressed. They work together to move the internal contact, completing the circuit and allowing electrical current to flow, thereby activating the device.

[0025] When the push button is pressed, the button sends a signal (usually a change in voltage or current) to an inbuilt microcontroller associated with the device to either power up or shut down the microcontroller. Conversely, releasing the button allows the spring to return to its original position, breaking the circuit and sending the signal to deactivate the device. The microcontroller is pre-programmed to detect this signal and respond accordingly. The microcontroller used herein is pre-fed with artificial intelligence and machine learning protocols to coordinate the working of the device.

[0026] Multiple suction units 102 are integrated within the body 101, adapted to grip and stabilize food-containing utensils during operation. Each suction unit 102 consists of a small, flexible suction cup connected to a miniaturized vacuum pump, controlled by the microcontroller. When a utensil is placed on the body 101, embedded weight sensors detect its presence and initiate the suction process. The vacuum pump then activates, drawing air out from beneath the suction cup and causing the cup to adhere tightly to the base of the utensil. This vacuum seal effectively anchors the utensil in place, preventing unwanted movement during food retrieval. The suction force is dynamically adjusted based on the detected weight and shape of the utensil to ensure stable gripping without causing damage or applying excessive force.

[0027] The weight sensor functions based on the principle of a load cell, which is used for accurate measurement of the utensil’s weight. The load cell acts as a transducer that converts the mechanical force exerted by the utensil into an electrical signal. When the utensil is placed on the surface of the body 101, it applies a downward force onto the load cell embedded beneath. This force causes a slight deformation in the load cell’s internal metal structure. Strain gauges attached to the load cell detect this deformation by sensing changes in electrical resistance. These changes are directly proportional to the applied force and are converted into an electrical signal. The signal is then transmitted to the microcontroller, which interprets the data to determine the precise weight of the utensil. Thus, the microcontroller dynamically adjusts the suction intensity for secure and stable gripping.
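
By way of illustration only (the specification discloses no code), the weight-to-suction logic described above might be sketched as follows in Python; the calibration constants, duty-cycle bounds, and function names are hypothetical:

```python
# Hypothetical sketch: convert a load-cell ADC reading to weight and
# map it to a suction-pump duty cycle. Constants are illustrative only.

ADC_OFFSET = 512              # ADC counts at zero load (assumed calibration)
COUNTS_PER_GRAM = 2.4         # ADC counts per gram (assumed calibration)
MIN_DUTY, MAX_DUTY = 20, 95   # pump PWM duty-cycle bounds (percent)

def weight_from_adc(adc_counts: int) -> float:
    """Strain-gauge deformation -> electrical signal -> weight (grams)."""
    return max(0.0, (adc_counts - ADC_OFFSET) / COUNTS_PER_GRAM)

def suction_duty(weight_g: float, heaviest_utensil_g: float = 800.0) -> float:
    """Scale suction with detected weight: heavier utensils get a stronger
    grip, capped so the suction cup is not overdriven."""
    fraction = min(weight_g / heaviest_utensil_g, 1.0)
    return MIN_DUTY + fraction * (MAX_DUTY - MIN_DUTY)

if __name__ == "__main__":
    adc = 1232                # example raw reading
    w = weight_from_adc(adc)
    print(f"utensil weight ~{w:.0f} g -> pump duty {suction_duty(w):.0f}%")
```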

[0028] An imaging unit 105 is mounted on the body 101 of the device and operates in conjunction with a color sensor to detect and identify the type of food present in the utensil. The imaging unit 105 comprises an image-capturing module equipped with a set of lenses capable of capturing multiple images from various angles within the surrounding field of view. The captured images are stored in the internal memory of the imaging unit 105 in the form of optical data.

[0029] The imaging unit 105 also includes an onboard processor fed with artificial intelligence (AI) protocols, which processes the optical data to extract relevant features from the captured images—such as shape, texture, and color patterns indicative of specific food types. The extracted information is then converted into digital signals and transmitted to the microcontroller.

[0030] Upon receiving this processed data, the microcontroller analyzes the input to determine the specific type of food present. This classification assists in decision-making processes, such as selecting the appropriate cutlery from a compartment 110 having multiple sections and installed with the body 101, adjusting portion sizes of the food upon detection of signs of swallowing difficulties, and customizing the feeding or assistance approach based on the food’s characteristics.

[0031] The color sensor operates by detecting and analyzing the wavelengths of light reflected from the surface of the food placed in the utensil, in order to determine the type or category of food. The sensor typically utilizes photodiodes or phototransistors that are sensitive to different bands within the visible light spectrum. When light illuminates the food, the sensor measures the intensity of the reflected light across primary color channels (e.g., red, green, and blue). This reflected color information is indicative of the food’s surface properties such as color and texture.

[0032] The collected data is converted into electrical signals and transmitted to the microcontroller. The microcontroller processes these signals to interpret the color profile. Based on this analysis, the microcontroller determines the type of food—for example, identifying whether it is a vegetable, fruit, meat, or dessert—enabling the device to automatically adjust parameters such as cutlery selection, portion control, or feeding sequence.
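
A minimal sketch of the color-profile interpretation the microcontroller might perform, assuming nearest-neighbor matching against stored reference colors; the reference values and category names are illustrative, not taken from the specification:

```python
# Hypothetical sketch: classify a food category from the color sensor's
# red/green/blue channel intensities by nearest reference profile.

import math

REFERENCE_PROFILES = {
    "vegetable": (60, 180, 70),
    "fruit":     (200, 90, 60),
    "meat":      (150, 80, 70),
    "dessert":   (220, 200, 180),
}

def classify_food(rgb: tuple[int, int, int]) -> str:
    """Return the category whose reference color is nearest (Euclidean)."""
    return min(REFERENCE_PROFILES,
               key=lambda name: math.dist(rgb, REFERENCE_PROFILES[name]))

if __name__ == "__main__":
    print(classify_food((205, 95, 65)))   # -> "fruit" for this sample
```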

[0033] A motorized slider 103 is positioned along the periphery of the body 101 to facilitate the linear horizontal traversal of a robotic arm 104, which is operatively mounted on the slider 103 for delivering food to a user located in close proximity, as detected by a proximity sensor embedded in the body 101. The motorized slider 103 typically comprises a motorized carriage engaged with a linear rail, enabling controlled and precise movement of the robotic arm 104 along a horizontal axis. Upon receiving actuation signals from the microcontroller, the motor drives the carriage along the rail, allowing smooth and accurate sliding motion of the robotic arm 104 across different sections of the body 101. This configuration ensures that the robotic arm 104 can access various utensil positions and efficiently reach the user's mouth for food delivery.

[0034] The robotic arm 104 comprises a robotic link and a clamp mounted at the terminal end of the link. The robotic link is constructed from multiple interconnected segments, each joined by articulated joints, also referred to as axes. Each joint is equipped with a stepper motor that enables rotational movement, allowing the robotic link to perform precise and coordinated motions. Upon receiving actuation signals from the microcontroller, the stepper motors drive the movement of the individual segments, enabling the clamp to be accurately positioned for gripping, retrieving, or delivering food items. This configuration ensures flexible and controlled operation of the robotic arm 104 across various positions on the body 101 and towards the user’s mouth.
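
The slider and joint motion can be pictured as step-counted moves driven by the microcontroller; the following sketch assumes an illustrative lead-screw resolution and stands in for the real stepper-driver interface:

```python
# Hypothetical sketch: move the slider carriage to a utensil position by
# issuing stepper pulses. Resolution and the GPIO layer are assumed.

STEPS_PER_MM = 40             # illustrative lead-screw resolution

class StepperAxis:
    def __init__(self):
        self.position_mm = 0.0

    def step(self, direction: int):
        # In firmware this would pulse a driver pin; here we just
        # track position to illustrate the control flow.
        self.position_mm += direction / STEPS_PER_MM

    def move_to(self, target_mm: float):
        direction = 1 if target_mm > self.position_mm else -1
        steps = round(abs(target_mm - self.position_mm) * STEPS_PER_MM)
        for _ in range(steps):
            self.step(direction)

if __name__ == "__main__":
    slider = StepperAxis()
    slider.move_to(120.0)     # traverse to a utensil 120 mm along the rail
    print(f"carriage at {slider.position_mm:.1f} mm")
```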

[0035] The proximity sensor integrated with the body 101 is preferably an ultrasonic proximity sensor, which operates using ultrasonic waves to detect the presence and precise positioning of a user nearby. The sensor emits high-frequency ultrasonic pulses directed toward the surrounding environment. When these pulses encounter an object—such as a user—they are reflected back toward the sensor. The receiver component of the ultrasonic sensor, which is tuned to detect the returning echoes, captures the reflected waves.

[0036] The time interval between the emission and reception of the ultrasonic waves is measured and used to calculate the distance between the sensor and the detected object. This distance data is transmitted to the microcontroller, which processes and analyzes it to determine not only the presence of the user but also their exact location relative to the device.
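
The distance computation follows directly from the time-of-flight relation (one-way distance equals half the round-trip time multiplied by the speed of sound); a small sketch, with the speed of sound assumed for air at room temperature:

```python
# Hypothetical sketch: ultrasonic time-of-flight to distance. The echo
# travels to the user and back, so the one-way distance is half the
# round-trip time multiplied by the speed of sound.

SPEED_OF_SOUND_M_S = 343.0    # in air at ~20 degrees C

def distance_m(echo_round_trip_s: float) -> float:
    return echo_round_trip_s * SPEED_OF_SOUND_M_S / 2.0

if __name__ == "__main__":
    # a 2.9 ms round trip corresponds to roughly 0.5 m to the user
    print(f"{distance_m(0.0029):.2f} m")
```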

[0037] A voice command interface 106 is provided on the body 101, comprising a microphone array and a voice recognition module designed to interpret natural language instructions issued by the user, enabling hands-free control of the robotic arm 104. Each microphone within the array includes a small diaphragm attached to a moving coil. When sound waves from the user's voice strike the diaphragm, it causes the coil to vibrate within a magnetic field, thereby generating an electrical current corresponding to the sound signal.

[0038] This analog signal is then digitized and transmitted to the voice recognition module, which extracts and processes linguistic features from the audio input. The voice recognition module is cloud-based to enhance the precision of command interpretation, especially under conditions with high ambient noise. The interpreted command is forwarded to the microcontroller, which analyzes the instruction and executes the corresponding action, such as initiating food pickup, adjusting the arm's position, modifying speed, or switching between two operational modes: in the autonomous mode, the clamp is maneuvered to pick up food from designated areas on the body 101 and deliver it directly to the user; in the supportive mode, the arm 104 assists in stabilizing the user's hand during self-feeding, compensating for tremors or unsteady motion.
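
A hedged sketch of how interpreted text might be dispatched to actions and mode switches; the command vocabulary and class interface are invented for illustration:

```python
# Hypothetical sketch: route interpreted voice commands to device actions,
# including switching between autonomous and supportive modes.

class DiningDevice:
    def __init__(self):
        self.mode = "autonomous"

    def feed(self):
        print("picking up food and delivering to user")

    def set_mode(self, mode: str):
        self.mode = mode
        print(f"switched to {mode} mode")

def dispatch(device: DiningDevice, interpreted_text: str):
    text = interpreted_text.lower()
    if "feed" in text:
        device.feed()
    elif "support" in text or "help me eat" in text:
        device.set_mode("supportive")
    elif "auto" in text:
        device.set_mode("autonomous")
    else:
        print("command not recognized")

if __name__ == "__main__":
    d = DiningDevice()
    dispatch(d, "please help me eat")   # -> supportive mode
    dispatch(d, "feed me some rice")    # -> food delivery
```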

[0039] A gyroscope sensor is embedded within the robotic arm 104 to monitor and detect involuntary hand tremors. The gyroscope sensor integrated within the robotic arm 104 functions by measuring angular velocity and rotational movements of the arm 104 or the user’s hand. The sensor typically contains a vibrating element that detects changes in orientation or motion. When involuntary hand tremors occur, they cause rapid, unintended angular movements that the gyroscope senses as variations in rotational speed or direction. These motion signals are converted into electrical data and transmitted to the microcontroller, which analyzes the frequency and amplitude of the detected tremors. By identifying these tremor patterns in real-time, the microcontroller regulates the actuation of the robotic arm 104 for stabilizing the user’s arm by providing subtle, opposing movements through the arm’s motors—effectively dampening the tremors by moving the arm 104 in the opposite direction of the detected shakes. As a result, the robotic arm 104 compensates for the user’s tremors in real-time, maintaining a steady and precise motion that prevents spillage and ensures safe delivery of food to the user’s mouth.
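
One plausible (not specified) realization of this compensation step is to high-pass filter the gyroscope's angular-velocity signal, since tremor is the fast component relative to voluntary motion, and command the opposing movement; the filter coefficient and gain below are assumed tunings:

```python
# Hypothetical sketch: dampen tremor by high-pass filtering the gyro
# signal and commanding an opposing correction per sample.

ALPHA = 0.9   # high-pass filter coefficient (assumed tuning)
GAIN = 0.8    # fraction of tremor velocity to cancel (assumed tuning)

def tremor_corrections(gyro_samples):
    """Yield an opposing angular-velocity command for each gyro sample."""
    filtered, prev_raw = 0.0, 0.0
    for raw in gyro_samples:
        # discrete high-pass: keeps fast tremor, rejects slow voluntary motion
        filtered = ALPHA * (filtered + raw - prev_raw)
        prev_raw = raw
        yield -GAIN * filtered    # move opposite the detected shake

if __name__ == "__main__":
    samples = [0.0, 2.0, -2.1, 1.9, -2.0, 0.1]   # deg/s, tremor-like
    print([round(c, 2) for c in tremor_corrections(samples)])
```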

[0040] To ensure food remains at a safe temperature, a thermal camera 108 is mounted on the body 101 that continuously monitors the food temperature. The thermal camera 108 operates by detecting infrared (IR) radiation naturally emitted by all objects based on their temperature. Food, like any warm object, emits infrared energy in the form of heat. The thermal camera 108 captures this IR radiation through an infrared sensor array and converts it into temperature data using onboard signal processing. Each pixel in the camera's sensor measures the intensity of infrared energy, creating a thermal image where different temperatures appear as varying colors or shades. This data is continuously transmitted to the microcontroller, which analyzes the temperature readings in real time. If the detected temperature exceeds a preset safety threshold stored in a linked database, the microcontroller activates a rotatable air blower 109 installed on the body 101 to bring the food temperature to a safe and comfortable level before delivery to the user.
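
The threshold check itself is simple; a sketch, assuming a 2-D array of per-pixel temperatures and an illustrative 45 degree C threshold (the actual preset is stored in the linked database):

```python
# Hypothetical sketch: find the hottest region in a thermal frame and
# trigger the blower if it exceeds the stored safety threshold.

SAFE_TEMP_C = 45.0   # assumed preset threshold

def check_food_temperature(frame, blower_on):
    """frame: 2-D list of per-pixel temperatures (degrees C)."""
    hottest = max(max(row) for row in frame)
    if hottest > SAFE_TEMP_C:
        blower_on(hottest)

if __name__ == "__main__":
    thermal_frame = [[38.0, 41.5], [52.3, 44.0]]   # example 2x2 readout
    check_food_temperature(
        thermal_frame,
        lambda t: print(f"blower on: food at {t:.1f} C exceeds threshold"),
    )
```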

[0041] The air blower 109 is activated by the microcontroller when the thermal camera 108 detects that the food temperature exceeds the predefined threshold. The blower 109 comprises a vortex chamber, a heating unit (optional or bypassed for cooling use), an impeller, and an outlet duct. The primary function during cooling is to generate a high-velocity airflow directed toward the food surface. Upon actuation, the impeller begins rotating, drawing in ambient air from the surroundings. As the impeller spins, it creates a centrifugal force that induces vortex motion within the blower 109 chamber. This motion propels the air forward through the internal channels.

[0042] Although the blower 109 is also coupled with a heating unit for multi-functionality (e.g., warming), in the cooling scenario the airflow bypasses the heating element. The rotating impeller compresses the air along curved channels, increasing its velocity and generating a helical airflow pattern. The resulting pressurized air is expelled through the outlet duct and directed toward the surface of the food. This airflow accelerates the dissipation of heat from the food, effectively cooling it to a safer and more comfortable temperature for consumption.

[0043] In an embodiment of the present invention, a primary motorized ball-and-socket joint is positioned between the rotatable air blower 109 and the body 101 for enabling multi-directional movement of the blower 109. This allows the blower 109 to be precisely oriented by the microcontroller to direct airflow toward specific areas of the food surface, thereby effectively regulating its temperature. The motorized ball-and-socket joint comprises a motor powered by the microcontroller, a spherical (ball-shaped) element, and a corresponding socket housing. The ball is seated within the socket in a manner that allows it to pivot freely in multiple directions. The motor, under control of the microcontroller, drives rotational movement of the ball within the socket. This motion enables precise multi-axis orientation of the connected air blower 109.

[0044] The microcontroller continuously monitors temperature data received from the thermal camera 108 and, when required, actuates the motor to reposition the blower 109 via the ball-and-socket joint. This directional adjustment allows the blower 109 to target specific food items or areas on the body 101 with a focused stream of air for improving the efficiency of heat dissipation and ensuring uniform cooling. By dynamically adjusting the blower’s orientation, the device maintains the food at a safe and desirable temperature for consumption.

[0045] A Passive Infrared (PIR) sensor is employed to detect the approach of insects near the food surface. PIR sensors operate by measuring infrared (IR) radiation—specifically, heat energy—emitted by objects in their environment using a pair of pyroelectric sensor elements. These elements are typically arranged in adjacent segments within the sensor housing and generate electrical signals based on the intensity of IR radiation they receive.

[0046] All objects with a temperature above absolute zero emit electromagnetic radiation in the infrared spectrum, which is invisible to the human eye but detectable by IR-sensitive electronics. The PIR sensor continuously compares the IR levels received by its two segments. When an insect or other object moves across the sensor’s field of view, it causes a differential change in the infrared radiation between the two halves. Though insects are cold-blooded and emit relatively low levels of IR radiation, their motion produces enough variation to trigger the sensor.

[0047] The sensor's onboard electronics analyze the differential signals, and when the detected change exceeds a predefined threshold stored in the database, an output signal is sent to the microcontroller. The microcontroller processes this input and subsequently activates the air blower 109 to emit a targeted stream of air to create an air barrier that deters insects from approaching or settling on the food, thereby maintaining hygiene and safety.
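
A minimal sketch of the differential comparison, with an assumed trigger threshold; the real sensor performs this comparison in its onboard analog electronics before signaling the microcontroller:

```python
# Hypothetical sketch: the PIR's two pyroelectric elements are compared;
# motion across the field of view produces a differential signal that,
# above a threshold, triggers the insect-deterring air barrier.

PIR_TRIGGER_THRESHOLD = 0.15   # assumed differential level

def pir_motion_detected(element_a: float, element_b: float) -> bool:
    return abs(element_a - element_b) > PIR_TRIGGER_THRESHOLD

if __name__ == "__main__":
    if pir_motion_detected(0.62, 0.41):
        print("insect motion detected: activating air barrier")
```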

[0048] A covering unit 107 is provided, consisting of a curved expandable rod 107a and a plate 107b coupled via a motorized ball-and-socket joint 107c. This unit 107 is configured to automatically position the plate 107b over the food area when the absence of the user is detected in the vicinity, as determined by the proximity sensor. The extension and retraction of the curved rod 107a are pneumatically powered, controlled by the microcontroller through an integrated pneumatic unit. This pneumatic unit comprises an air compressor, air cylinders, air valves, and a piston, all working collaboratively to enable the smooth extension and retraction of the rod 107a.

[0049] The microcontroller controls the pneumatic operation by actuating the valves to permit the flow of compressed air from the compressor into the air cylinder. The incoming air generates pressure against the piston, causing it to extend. As the piston extends, it pushes the connected rod 107a outward, thereby extending the curved arm and moving the plate 107b into position over the food. To retract the rod 107a, the microcontroller signals the valve to close, halting airflow and allowing the piston to retract, which in turn pulls back the rod 107a and uncovers the food area.
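
The cover logic reduces to a small state machine around the valve; a sketch with a hypothetical valve interface:

```python
# Hypothetical sketch: cover/uncover the food by toggling the pneumatic
# valve that extends or retracts the curved rod. Valve interface assumed.

class CoveringUnit:
    def __init__(self):
        self.covered = False

    def _set_valve(self, open_: bool):
        # Firmware would drive the valve here; opening it pressurizes
        # the cylinder and extends the rod.
        print("valve open" if open_ else "valve closed")

    def update(self, user_present: bool):
        if not user_present and not self.covered:
            self._set_valve(True)      # extend rod, plate covers food
            self.covered = True
        elif user_present and self.covered:
            self._set_valve(False)     # retract rod, uncover food
            self.covered = False

if __name__ == "__main__":
    unit = CoveringUnit()
    unit.update(user_present=False)    # user leaves -> cover the food
    unit.update(user_present=True)     # user returns -> uncover
```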

[0050] The motorized ball-and-socket joint 107c works in the same manner as the primary motorized ball-and-socket joint disclosed above, providing multi-directional articulation of the plate 107b. The microcontroller precisely regulates the actuation of the motorized ball-and-socket joint 107c to orient and position the plate 107b as needed, either to fully cover the food or to move it aside, based on real-time user presence detection. Herein, the covering unit 107 is also actuated based on the user's commands received through the voice command interface 106.

[0051] Lastly, a battery (not shown in the figure) is associated with the device to supply power to the electrically powered components employed herein. The battery comprises a pair of electrodes, namely a cathode and an anode. The battery uses an oxidation/reduction chemical reaction to do work on charge and produce a voltage between its anode and cathode, thereby producing the electrical energy that is used to do work in the device.

[0052] The present invention works best in the following manner, where the mat-like body 101 is positioned on the dining surface, with the plurality of suction units 102 embedded within the body 101 to grip and stabilize utensils containing food. The motorized slider 103 mounted along the edge of the body 101 provides linear horizontal movement to the robotic arm 104, enabling it to access various food areas and deliver food to the user's mouth. The microcontroller governs the robotic arm 104 in two modes: the autonomous mode for picking up food and delivering it to the user, and the supportive mode for stabilizing the user's hand during independent eating. The imaging unit 105, working in coordination with the embedded color sensor, identifies the type of food present in the utensil and detects signs of swallowing difficulty, enabling the microcontroller to adjust portion size, select appropriate cutlery, and pace delivery. The voice command interface 106, including the microphone array and cloud-based voice recognition module, allows for natural language control of the robotic arm 104, even in noisy environments. The thermal camera 108 continuously monitors food temperature and triggers the rotatable air blower 109 to cool food when necessary. The PIR sensor detects approaching insects and activates the blower 109 to form the air barrier. The proximity sensor determines user presence and position, and the motorized ball-and-socket joint 107c with the curved expandable rod 107a supports the covering unit 107, which automatically covers or uncovers food based on user presence. The integrated cutlery compartment 110 enables automatic utensil selection based on user needs or food type.
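
Purely as an illustration of this operating sequence, a supervisory loop might look like the following; the Sensors and Actuators classes are stubs standing in for the hardware interfaces, and the temperature threshold is assumed:

```python
# Hypothetical sketch of the supervisory loop tying the subsystems
# together, in the order described above. All reads/calls are stubs.

import time

class Sensors:
    def utensil_weight(self): return 350.0       # grams (stub)
    def identify_food(self): return "vegetable"  # imaging + color sensor (stub)
    def user_present(self): return True          # proximity sensor (stub)
    def food_temperature(self): return 41.0      # thermal camera, deg C (stub)
    def insect_motion(self): return False        # PIR sensor (stub)

class Actuators:
    def grip_utensil(self, w): print(f"gripping utensil ({w:.0f} g)")
    def deliver_food(self, food): print(f"delivering {food}")
    def cover_food(self): print("covering food")
    def cool_food(self): print("cooling food")
    def air_barrier(self): print("air barrier on")

def supervise(sensors, actuators, cycles=1, safe_temp_c=45.0):
    for _ in range(cycles):
        actuators.grip_utensil(sensors.utensil_weight())
        if sensors.user_present():
            actuators.deliver_food(sensors.identify_food())
        else:
            actuators.cover_food()
        if sensors.food_temperature() > safe_temp_c:
            actuators.cool_food()
        if sensors.insect_motion():
            actuators.air_barrier()
        time.sleep(0.1)                          # ~10 Hz supervision

if __name__ == "__main__":
    supervise(Sensors(), Actuators())
```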

[0053] Although the field of the invention has been described herein with limited reference to specific embodiments, this description is not meant to be construed in a limiting sense. Various modifications of the disclosed embodiments, as well as alternate embodiments of the invention, will become apparent to persons skilled in the art upon reference to the description of the invention.

Claims:

1) An autonomous dining assistive device, comprising:

i) a mat-like body 101 developed to be positioned on a dining surface;

ii) a plurality of suction units 102 embedded within the body 101 for gripping utensils containing food;

iii) a motorized slider 103 mounted along the edge of the body 101, configured to provide linear horizontal movement to a robotic arm 104, the robotic arm 104 operable for accessing different areas and delivering food to a user’s mouth present in proximity;

iv) a microcontroller configured to operate the robotic arm 104 in two modes: an autonomous mode for picking up food and delivering to the user, and a supportive mode for stabilizing the user’s hand during independent eating;

v) an imaging unit 105 installed on the body 101 and synced with an embedded color sensor to detect type of food present over the utensil;

vi) a voice command interface 106 comprising a microphone array and voice recognition module to facilitate direct control of the robotic arm 104 via natural language instructions; and

vii) a covering unit 107 comprising a curved expandable rod 107a and a plate 107b connected via a motorized ball and socket joint 107c, configured to automatically cover the food when no user presence is detected nearby.

2) The device as claimed in claim 1, wherein a proximity sensor is integrated with the body 101 to detect the presence and exact positioning of a user present in proximity.

3) The device as claimed in claim 1, wherein a gyroscope sensor is integrated with the robotic arm 104 to detect involuntary hand tremors and upon detection the microcontroller actuates counteracting motions to stabilize the movement.

4) The device as claimed in claim 1, wherein a thermal camera 108 is mounted on the body 101 to monitor food temperature and the microcontroller triggers a rotatable air blower 109 mounted on the body 101 to cool the food when temperature exceeds a preset threshold.

5) The device as claimed in claim 1, wherein a Passive Infrared (PIR) sensor is provided to detect approaching insects and activate the air blower 109 to create an air barrier deterring insects from reaching the food.

6) The device as claimed in claim 1, wherein a compartment 110 having multiple sections is provided with the body 101 for storing various cutlery, with automatic cutlery selection based on user needs or food type.

7) The device as claimed in claim 1, wherein the imaging unit 105 is configured to detect signs of swallowing difficulties in the user, based on which the microcontroller actuates the robotic arm 104 to deliver food towards the user with optimum cutlery, adjusting the portion size of food presented to the user and pacing food delivery to ensure user safety.

8) The device as claimed in claim 1, wherein the suction units 102 dynamically adjust suction intensity based on the weight of utensils as detected via a plurality of weight sensors spatially arranged over the body 101.

9) The device as claimed in claim 1, wherein the voice recognition module is cloud-based to enhance accuracy of voice command processing in noisy environments.

10) The device as claimed in claim 1, wherein the covering unit 107 automatically moves to cover or uncover the food based on the presence or absence of the user detected via imaging unit 105 or user commands.

Documents

Application Documents

# Name Date
1 202521060390-STATEMENT OF UNDERTAKING (FORM 3) [24-06-2025(online)].pdf 2025-06-24
2 202521060390-REQUEST FOR EXAMINATION (FORM-18) [24-06-2025(online)].pdf 2025-06-24
3 202521060390-REQUEST FOR EARLY PUBLICATION(FORM-9) [24-06-2025(online)].pdf 2025-06-24
4 202521060390-PROOF OF RIGHT [24-06-2025(online)].pdf 2025-06-24
5 202521060390-POWER OF AUTHORITY [24-06-2025(online)].pdf 2025-06-24
6 202521060390-FORM-9 [24-06-2025(online)].pdf 2025-06-24
7 202521060390-FORM FOR SMALL ENTITY(FORM-28) [24-06-2025(online)].pdf 2025-06-24
8 202521060390-FORM 18 [24-06-2025(online)].pdf 2025-06-24
9 202521060390-FORM 1 [24-06-2025(online)].pdf 2025-06-24
10 202521060390-FIGURE OF ABSTRACT [24-06-2025(online)].pdf 2025-06-24
11 202521060390-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [24-06-2025(online)].pdf 2025-06-24
12 202521060390-EVIDENCE FOR REGISTRATION UNDER SSI [24-06-2025(online)].pdf 2025-06-24
13 202521060390-EDUCATIONAL INSTITUTION(S) [24-06-2025(online)].pdf 2025-06-24
14 202521060390-DRAWINGS [24-06-2025(online)].pdf 2025-06-24
15 202521060390-DECLARATION OF INVENTORSHIP (FORM 5) [24-06-2025(online)].pdf 2025-06-24
16 202521060390-COMPLETE SPECIFICATION [24-06-2025(online)].pdf 2025-06-24
17 202521060390-FORM-26 [25-06-2025(online)].pdf 2025-06-25
18 Abstract.jpg 2025-07-08