Abstract: A portable smart mat (100) for tracking nutrition intake is disclosed. The portable smart mat includes a top surface (122) having one or more sections to accommodate a utensil holding at least one of a food and a beverage. The portable smart mat is powered by an in-built rechargeable battery. A receiving module (124) to receive input from a user (120). A plurality of sensors (126) to detect the weight of at least one of the food and the beverage. An integration module (128) to integrate the input and the weight. A prediction module (130) to detect at least one of the food and the beverage and predict a corresponding nutritional value. A visual feedback module (132) to trigger an alert to the user if the nutritional value exceeds a pre-determined value. A plurality of light emitting diodes (134) glow in response to the alert, thereby tracking and monitoring consumption of nutrition intake of the user. FIG. 1
Description: FIELD OF INVENTION
[0001] Embodiments of the present disclosure relate to the field of health and wellness, and more particularly, to a portable smart mat for tracking nutrition intake and a method thereof.
BACKGROUND
[0002] Self-monitoring food intake plays a significant role in weight management and promoting a healthy lifestyle. Keeping track of the amount and nutritional value of the food consumed allows individuals to maintain a balanced diet, make informed dietary choices, and work towards achieving their health goals. However, there are several challenges that may be encountered while tracking food intake.
[0003] One of the primary challenges is accurately measuring the quantity of food consumed, which makes it difficult for individuals to manage their weight or nutritional intake. This can lead to overeating or to underestimating calorie intake, thereby hindering efforts to achieve health and fitness goals.
[0004] Further, tracking food intake can be a time-consuming and laborious process, which can lead to people giving up on the task eventually. The manual recording of each meal and its nutritional content is cumbersome and requires significant effort and attention to detail. This time commitment may lead to frustration and discouragement, causing individuals to abandon their tracking efforts altogether.
[0005] Furthermore, many people face challenges in identifying the nutritional content of the foods they consume, especially when it comes to specific health conditions or dietary restrictions. This lack of information can impact their ability to make appropriate food choices.
[0006] Hence, there is a need for an improved portable smart mat for nutrition intake tracking and a method thereof which addresses the aforementioned issue(s).
OBJECTIVE OF THE INVENTION
[0007] An objective of the invention is to measure the weight of the food placed on each section of a portable smart mat in real-time.
[0008] Another objective of the invention is to detect the food placed on each section of the portable smart mat and predict corresponding nutritional values based on the weight and other characteristics of the food using Artificial Intelligence in real-time.
[0009] Yet another objective of the invention is to notify the user when the nutritional value of consumed food exceeds a pre-determined value, thereby providing immediate feedback to the user in real time.
BRIEF DESCRIPTION
[0010] In accordance with an embodiment of the present disclosure, a portable smart mat for nutrition intake tracking is provided. The portable smart mat includes a top surface. The top surface of the portable smart mat includes one or more sections. Each of the one or more sections is adapted to accommodate a utensil holding at least one of a food and a beverage. The portable smart mat is powered by an in-built rechargeable battery. The portable smart mat includes a processing subsystem hosted on a server. The processing subsystem is configured to execute on a network to control bidirectional communications among a plurality of modules. The processing subsystem includes a receiving module. The receiving module is configured to receive input from a user regarding at least one of the food and the beverage. The input includes at least one of an image, video, text description, and user selection from a pre-defined list of at least one of the food and the beverage. The portable smart mat includes a plurality of sensors positioned beneath the one or more sections. The plurality of sensors is configured to detect the weight of each empty utensil placed on the one or more sections. The plurality of sensors is configured to detect the weight of at least one of the food and the beverage in each utensil placed on the one or more sections by eliminating the weight of the corresponding empty utensils. The processing subsystem includes an integration module operatively coupled with the plurality of sensors. The integration module is configured to integrate the input and weight of at least one of the food and the beverage with the state-of-the-art machine learning model. The processing subsystem includes a prediction module operatively coupled with the integration module. The prediction module is configured to detect at least one of the food and the beverage. 
The prediction module is configured to predict a corresponding nutritional value based on the weight and one or more characteristics of at least one of the food and the beverage in real time. The processing subsystem includes a visual feedback module operatively coupled with the prediction module. The visual feedback module is configured to trigger an alert for the user if the nutritional value of at least one of the food and the beverage exceeds a pre-determined value. The portable smart mat includes a plurality of light emitting diodes positioned around each of the one or more sections and operatively coupled to the visual feedback module. The plurality of light emitting diodes is adapted to glow in response to the alert triggered, thereby tracking and monitoring consumption of nutrition intake of the user.
[0011] In accordance with another embodiment of the present disclosure, a method to operate a portable smart mat for nutrition intake tracking is provided. The method includes placing a utensil holding at least one of a food and a beverage on each of one or more sections on a top surface of a portable smart mat. The method includes receiving, by a receiving module, input from a user regarding at least one of the food and the beverage. The input includes at least one of an image, video, text description, and user selection from a pre-defined list of at least one of the food and the beverage. The method includes detecting, by a plurality of sensors, the weight of each empty utensil placed on the one or more sections. The method includes detecting, by the plurality of sensors, the weight of at least one of the food and the beverage in each utensil placed on the one or more sections by eliminating the weight of the corresponding empty utensils. The method includes integrating, by an integration module, the input and weight of at least one of the food and the beverage with a state-of-the-art machine learning model. The method includes detecting, by a prediction module, at least one of the food and the beverage. The method includes predicting, by the prediction module, a corresponding nutritional value based on the weight and one or more characteristics of at least one of the food and the beverage in real time. The method includes triggering, by a visual feedback module, an alert for the user if the nutritional value of at least one of the food and the beverage exceeds a pre-determined value. The method includes glowing, by a plurality of light emitting diodes, in response to the alert triggered, thereby tracking and monitoring consumption of nutrition intake of the user.
[0012] To further clarify the advantages and features of the present disclosure, a more particular description of the disclosure will follow by reference to specific embodiments thereof, which are illustrated in the appended figures. It is to be appreciated that these figures depict only typical embodiments of the disclosure and are therefore not to be considered limiting in scope. The disclosure will be described and explained with additional specificity and detail with the appended figures.
BRIEF DESCRIPTION OF THE DRAWINGS
[0013] The disclosure will be described and explained with additional specificity and detail with the accompanying figures in which:
[0014] FIG. 1 is a block diagram representation of a portable smart mat for tracking nutrition intake in accordance with an embodiment of the present disclosure;
[0015] FIG. 2 is a schematic representation of a top surface of a portable smart mat of FIG. 1 in accordance with an embodiment of the present disclosure;
[0016] FIG. 3 is a schematic representation of a portable smart mat in use of FIG. 1 in accordance with an embodiment of the present disclosure;
[0017] FIG. 4 is a schematic representation of a portable smart mat with embedded components of FIG. 1 in accordance with an embodiment of the present disclosure;
[0018] FIG. 5 is a block diagram of a computer or a server in accordance with an embodiment of the present disclosure;
[0019] FIG. 6 illustrates an exemplary screenshot of connecting a portable smart mat with a user device of FIG. 1 in accordance with an embodiment of the present disclosure;
[0020] FIG. 7(a) illustrates an exemplary screenshot of weight detection and nutritional tracking of food of FIG. 1 in accordance with an embodiment of the present disclosure;
[0021] FIG. 7(b) illustrates an exemplary screenshot to demonstrate the ability of a user to upload images taken using a user device and stored images of FIG. 1, in accordance with an embodiment of the present disclosure;
[0022] FIG. 8 illustrates an exemplary screenshot of displaying the nutritional value of the food placed on a portable smart mat in accordance with an embodiment of the present disclosure;
[0023] FIG. 9(a) illustrates a flow chart representing the steps involved in a method to operate a portable smart mat for nutrition intake tracking in accordance with an embodiment of the present disclosure; and
[0024] FIG. 9(b) illustrates continued steps of the method of FIG. 9(a) in accordance with an embodiment of the present disclosure.
[0025] Further, those skilled in the art will appreciate that elements in the figures are illustrated for simplicity and may not have necessarily been drawn to scale. Furthermore, in terms of the construction of the device, one or more components of the device may have been represented in the figures by conventional symbols, and the figures may show only those specific details that are pertinent to understanding the embodiments of the present disclosure so as not to obscure the figures with details that will be readily apparent to those skilled in the art having the benefit of the description herein.
DETAILED DESCRIPTION
[0026] For the purpose of promoting an understanding of the principles of the disclosure, reference will now be made to the embodiment illustrated in the figures and specific language will be used to describe them. It will nevertheless be understood that no limitation of the scope of the disclosure is thereby intended. Such alterations and further modifications in the illustrated system, and such further applications of the principles of the disclosure as would normally occur to those skilled in the art are to be construed as being within the scope of the present disclosure.
[0027] The terms “comprises”, “comprising”, or any other variations thereof, are intended to cover a non-exclusive inclusion, such that a process or method that comprises a list of steps does not include only those steps but may include other steps not expressly listed or inherent to such a process or method. Similarly, one or more devices or subsystems or elements or structures or components preceded by "comprises... a" does not, without more constraints, preclude the existence of other devices, sub-systems, elements, structures, components, additional devices, additional sub-systems, additional elements, additional structures or additional components. Appearances of the phrase "in an embodiment", "in another embodiment" and similar language throughout this specification may, but not necessarily do, all refer to the same embodiment.
[0028] Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by those skilled in the art to which this disclosure belongs. The system, methods, and examples provided herein are only illustrative and not intended to be limiting.
[0029] In the following specification and the claims, reference will be made to a number of terms, which shall be defined to have the following meanings. The singular forms “a”, “an”, and “the” include plural references unless the context clearly dictates otherwise.
[0030] Embodiments of the present disclosure relate to a portable smart mat for nutrition intake tracking. The portable smart mat includes a top surface. A top surface of the portable smart mat includes one or more sections. Each of the one or more sections is adapted to accommodate a utensil holding at least one of a food and a beverage. The portable smart mat is powered by an in-built rechargeable battery. The portable smart mat includes a processing subsystem hosted on a server. The processing subsystem is configured to execute on a network to control bidirectional communications among a plurality of modules. The processing subsystem includes a receiving module. The receiving module is configured to receive input from a user regarding at least one of the food and the beverage. The input includes at least one of an image, video, text description, and user selection from a pre-defined list of at least one of the food and the beverage. The portable smart mat includes a plurality of sensors positioned beneath the one or more sections. The plurality of sensors is configured to detect the weight of each empty utensil placed on the one or more sections. The plurality of sensors is configured to detect the weight of at least one of the food and the beverage in each utensil placed on the one or more sections by eliminating the weight of the corresponding empty utensils. The processing subsystem includes an integration module operatively coupled with the plurality of sensors. The integration module is configured to integrate the input and weight of at least one of the food and the beverage with the state-of-the-art machine learning model. The processing subsystem includes a prediction module operatively coupled with the integration module. The prediction module is configured to detect at least one of the food and the beverage. 
The prediction module is configured to predict a corresponding nutritional value based on the weight and one or more characteristics of at least one of the food and the beverage in real time. The processing subsystem includes a visual feedback module operatively coupled with the prediction module. The visual feedback module is configured to trigger an alert for the user if the nutritional value of at least one of the food and the beverage exceeds a pre-determined value. The portable smart mat includes a plurality of light emitting diodes positioned around each of the one or more sections and operatively coupled to the visual feedback module. The plurality of light emitting diodes is adapted to glow in response to the alert triggered, thereby tracking and monitoring consumption of nutrition intake of the user.
[0031] FIG. 1 is a block diagram of a portable smart mat (100) for tracking nutrition intake in accordance with an embodiment of the present disclosure. In one embodiment, the portable smart mat (100) is rectangular in shape. It will be appreciated to those skilled in the art that the portable smart mat (100) may also be adapted to any other suitable dimension and shape that aids a user (120) to use the portable smart mat (100) with ease. In another embodiment, the portable smart mat (100) may be fabricated from a suitable material including plastic, rubber, glass and the like. The choice of material depends on several factors such as durability, weight, cost, and functionality.
[0032] The portable smart mat (100) includes a top surface (122). The top surface (122) of the portable smart mat (100) is divided into one or more sections. Each of the one or more sections is adapted to accommodate a utensil holding at least one of a food and a beverage. Examples of utensils include plates, bowls, cups, and the like. In a preferred embodiment, the portable smart mat (100) includes five sections. Each section has a specific size, such as W 8 cm × H 8 cm, W 8 cm × H 8 cm, W 8 cm × H 8 cm, W 16 cm × H 8 cm, and W 16 cm × H 17.5 cm. These dimensions represent the width (W) and height (H) measurements for each section, indicating the space allocated for placing utensils. In one embodiment, the one or more sections are adjustable in size to accommodate different utensil sizes and shapes.
[0033] Further, the portable smart mat (100) is powered by an in-built rechargeable battery (not shown in FIG. 1). The in-built rechargeable battery may be charged multiple times using an appropriate charging method or cable. Examples of the in-built rechargeable battery include, but are not limited to, Lithium-ion (Li-ion) batteries, Nickel-Metal Hydride (NiMH) batteries, Nickel-Cadmium (NiCd) batteries and the like.
[0034] The portable smart mat (100) includes a plurality of sensors (126) positioned beneath the one or more sections. Examples of the plurality of sensors (126) include, but are not limited to, load cells, strain gauges, pressure sensors, capacitive sensors, optical sensors, and the like. The plurality of sensors (126) is configured to detect the weight of each empty utensil placed on the one or more sections. By accurately determining the weight of empty utensils, the portable smart mat (100) ensures that the nutritional calculations are based on the actual weight of the food and beverage being consumed by the user (120). Further, the plurality of sensors (126) is configured to detect the weight of at least one of the food and the beverage in each utensil placed on the one or more sections by eliminating the weight of the corresponding empty utensils.
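The taring step described above can be sketched as follows. This is a minimal illustration, not part of the disclosure: the function name and the noise-clamping behaviour are assumptions.

```python
def net_food_weight(section_reading_g: float, empty_utensil_g: float) -> float:
    """Return the weight of the food or beverage alone, in grams.

    The mat first records the empty utensil's weight for a section, then
    eliminates (subtracts) it from the loaded reading for that section.
    """
    net = section_reading_g - empty_utensil_g
    # Clamp small negative values caused by sensor noise to zero.
    return max(net, 0.0)
```

For example, a bowl of paneer read as 350 g on a section whose empty bowl weighed 120 g would yield a net food weight of 230 g.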
[0035] Additionally, the portable smart mat (100) includes a wireless communication module (not shown in FIG.1). The wireless communication module is configured to transmit and receive data between the portable smart mat (100) and a user device (118).
[0036] The portable smart mat (100) includes a processing subsystem (105) hosted on a server (108). In one embodiment, the server (108) may include a cloud-based server. In another embodiment, parts of the server (108) may be a local server coupled to a user device (118). The processing subsystem (105) is configured to execute on a network (115) to control bidirectional communications among a plurality of modules. In one example, the network (115) may be a private or public local area network (LAN) or Wide Area Network (WAN), such as the Internet. In another embodiment, the network (115) may include both wired and wireless communications according to one or more standards and/or via one or more transport mediums. In one example, the network (115) may include wireless communications according to one of the 802.11 or Bluetooth specification sets, or another standard or proprietary wireless communication protocol. In yet another embodiment, the network (115) may also include communications over a terrestrial cellular network, including, a global system for mobile communications (GSM), code division multiple access (CDMA), and/or enhanced data for global evolution (EDGE) network.
[0037] The processing subsystem (105) includes a receiving module (124). The receiving module (124) is configured to receive input from the user (120) operating the user device (118) regarding at least one of the food and the beverage. Examples of the input include, but are not limited to, at least one of an image, video, text description, and user selection from a pre-defined list of at least one of the food and the beverage. The pre-defined list is displayed to the user as an available option for selecting the food and beverage. The pre-defined list includes names of food and beverages that cover a variety of cuisines, reflecting the diversity of dishes from various countries, regions, or cuisines. Examples of the pre-defined list include Indian cuisine (butter chicken, biryani, naan bread, masala chai and the like), Chinese cuisine (fried rice, spring rolls, Chinese dumplings, noodles and the like), Italian cuisine (spaghetti Bolognese, pizza, pasta, gelato and the like) and the like.
[0038] Further, the user (120) may capture one or more images of their food or beverage using a camera configured in the user device (118). Alternatively, the user (120) can upload one or more images from the stored images. It is to be noted that the user device (118) includes, but is not limited to, a mobile phone, desktop computer, portable digital assistant (PDA), smart phone, tablet, ultra-book, netbook, laptop, multi-processor system, microprocessor-based or programmable consumer electronic system, or any other communication device that the user (120) may use. In some embodiments, the user device (118) may comprise a display module (not shown) to display information (for example, in the form of user interfaces). In further embodiments, the user device (118) may comprise one or more of touch screens, accelerometers, gyroscopes, cameras, microphones, global positioning system (GPS) devices, and so forth.
[0039] In an embodiment, the user (120) may also record videos showcasing their food and beverages to provide additional visual information. Further, the user (120) has the option to provide a text description of the food and beverage he/she is consuming. In another embodiment, the user (120) can use voice input as a convenient method for tracking information about the food and beverage with advanced voice recognition technology.
[0040] The processing subsystem (105) includes an integration module (128) operatively coupled with the plurality of sensors (126). The integration module (128) is configured to integrate the input and weight of at least one of the food and the beverage with a state-of-the-art machine learning model. Machine learning models are artificial intelligence algorithms that may learn patterns and make predictions based on input data. The patterns may be applied on various time scales, for instance, instant, daily, weekly and the like. Examples of the artificial intelligence algorithm include, but are not limited to, deep neural networks (specifically, vision transformers combined with heuristics derived from past food logs of the same user and of similar users), Convolutional Neural Networks (CNN), Restricted Boltzmann Machines (RBM), Deep Belief Networks (DBN) and Deep Q-Networks. The integration module (128) utilizes the machine learning model to leverage its capabilities in analyzing and interpreting the combined input and weight data.
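One way the integration module might package the user's input and the sensor weight into a single record for the machine learning model is sketched below. The field names and the `predict` callback are hypothetical assumptions for illustration only; the disclosure does not specify a data schema.

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class MealObservation:
    """Combined record handed to the ML model (hypothetical schema)."""
    section_id: int
    net_weight_g: float            # sensor reading with the empty utensil tared out
    image_path: Optional[str]      # user-supplied photo, if any
    text_description: Optional[str]
    list_selection: Optional[str]  # choice from the pre-defined cuisine list

def integrate(obs: MealObservation,
              predict: Callable[[MealObservation], str]) -> str:
    """Pass the integrated observation to the model and return its food label."""
    return predict(obs)
```

In practice `predict` would wrap the trained vision/heuristic model; here any callable accepting a `MealObservation` will do.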
[0041] Further, the processing subsystem (105) is operatively coupled to a database (110) configured to store food information including a plurality of food items and beverages pertaining to various countries, regions or cuisines. In one embodiment, the database (110) is configured to store Indian cuisine. Typically, the food information stored in the database (110) is used for analyzing the at least one of the food and the beverage placed on the portable smart mat (100).
[0042] The processing subsystem (105) includes a prediction module (130) operatively coupled with the integration module (128). The prediction module (130) is configured to detect at least one of the food and the beverage. Further, by accessing the information stored in the database (110), the prediction module (130) can identify the specific food and beverage items with a high level of accuracy. The prediction module (130) is also configured to predict a corresponding nutritional value based on the weight and one or more characteristics of at least one of the food and the beverage in real time. Typically, the nutritional value indicates a measure of the nutrients (such as carbohydrates, fats, proteins, minerals and vitamins) in the food or beverage items. In other words, the nutritional value predicted by the prediction module (130) includes, but is not limited to calories, macronutrients (such as carbohydrates, proteins, and fats), micronutrients (such as vitamins and minerals), fiber content and the like. These predictions are delivered in real time, ensuring that the user receives immediate feedback.
[0043] In an embodiment, the prediction module (130) is configured to predict nutritional value of each individual bowl or utensil placed on the portable smart mat (100). When the bowl or utensil is placed on the portable smart mat (100), the nutritional values, including macronutrients such as carbohydrates, proteins, and fats, of that particular bowl are predicted individually as well as collectively for the entire meal.
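The per-bowl and whole-meal prediction described above reduces to simple scaling once a food is identified: a per-100 g reference entry is multiplied by the measured net weight. The reference values below are illustrative examples, not figures from the disclosure or its database.

```python
PER_100G = {  # hypothetical per-100 g database entries
    "paneer": {"calories": 296.0, "protein_g": 20.0, "fat_g": 22.0},
    "salad":  {"calories": 35.0,  "protein_g": 1.5,  "fat_g": 0.3},
}

def bowl_nutrition(food: str, net_weight_g: float) -> dict:
    """Scale the per-100 g reference entry by the bowl's measured weight."""
    ref = PER_100G[food]
    factor = net_weight_g / 100.0
    return {k: round(v * factor, 1) for k, v in ref.items()}

def meal_totals(bowls: list) -> dict:
    """Sum the per-bowl values to give the collective value for the meal."""
    totals: dict = {}
    for food, weight in bowls:
        for k, v in bowl_nutrition(food, weight).items():
            totals[k] = round(totals.get(k, 0.0) + v, 1)
    return totals
```

A 200 g bowl of salad would thus be reported as roughly 70 calories individually, and the meal total accumulates every bowl on the mat.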
[0044] When the prediction module (130) predicts the nutritional value of the detected food and beverage, it compares the obtained values with a pre-determined value. The pre-determined values are customizable based on personal dietary goals, food allergy information, eating behaviour, weight and health conditions of the user (120). It is important that the nutritional value does not exceed the pre-determined value to ensure a balanced diet. Likewise, it is also important to check whether the obtained values have met, or fallen short of, the pre-determined value. These and other various scenarios would be apparent to those having ordinary skill in the art.
[0045] Further, the processing subsystem (105) includes a visual feedback module (132). The visual feedback module (132) triggers an alert for the user (120) if the nutritional value of at least one of the food and the beverage exceeds the pre-determined value. The alert may be in any form, for instance, a visual alarm, an audible alarm, an alert message presented on the user device (118) and the like. Specifically, the visual alarm is rendered by a plurality of Light Emitting Diodes (LED) (134). The portable smart mat (100) includes the plurality of light emitting diodes (134) positioned around each of the one or more sections and operatively coupled to the visual feedback module (132). The plurality of light emitting diodes (134) is adapted to glow in response to the alert triggered. The plurality of light emitting diodes (134) are programmed to illuminate or change their color to provide a clear and noticeable visual signal to the user (120), thereby tracking and monitoring consumption of nutrition intake of the user (120).
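The threshold comparison and LED response can be sketched as follows, assuming one LED ring per section, a per-nutrient threshold table supplied by the user, and a red/green colour scheme (the green positive-feedback colour is taken from the embodiment described below; the `set_led` callback and colour names are otherwise hypothetical).

```python
from typing import Callable

def check_and_alert(nutrition: dict, thresholds: dict,
                    set_led: Callable[[str], None]) -> bool:
    """Trigger the visual alert for one section.

    Lights the section's LEDs red if any predicted nutrient exceeds its
    pre-determined value; otherwise glows green as positive feedback.
    Returns True when an alert was triggered.
    """
    exceeded = any(nutrition.get(k, 0.0) > limit
                   for k, limit in thresholds.items())
    set_led("red" if exceeded else "green")
    return exceeded
```

On real hardware, `set_led` would drive the LEDs surrounding the relevant section; in a test it can simply record the colour chosen.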
[0046] In an embodiment, the plurality of light emitting diodes (134) may emit a green glow as a positive feedback when the food or beverage is determined to be healthy and nutritious.
[0047] It must be noted that the positioning of the plurality of light emitting diodes (134) may be modified according to the preferences and choices of skilled makers or designers of the portable smart mat (100).
[0048] In one embodiment, the various functional components of the system may reside on a single computer, or they may be distributed across several computers in various arrangements. The various components of the system may, furthermore, access one or more databases, and each of the various components of the system may be in communication with one another. Further, while the components of FIG. 1 are discussed in the singular sense, it will be appreciated that in other embodiments multiple instances of the components may be employed.
[0049] FIG. 2 is a schematic representation of a top surface of a portable smart mat (100) of FIG. 1 in accordance with an embodiment of the present disclosure. The top surface (122) is depicted as a rectangular area, representing the main working area of the portable smart mat (100). Within this area, one or more sections are designated to accommodate utensils and food or beverage items. Each section has clear boundaries indicating a specific area for placing the utensils. The number of sections may vary depending on the design and configuration of the portable smart mat (100). In a preferred embodiment, there are five sections. Each section has a specific size, such as W 8cm × H 8 cm, W 8cm × H 8 cm, W 8cm × H 8 cm, W 16cm × H 8 cm, and W 16cm × H 17.5 cm. Further, a plurality of light emitting diodes (134) are positioned in proximity to the boundaries of one or more sections. The plurality of light emitting diodes (134) illuminate or change color to provide clear and noticeable signals regarding the nutritional values or alerts related to the placed food or beverage items.
[0050] FIG. 3 is a schematic representation of a portable smart mat (100) of FIG. 1 in use in accordance with an embodiment of the present disclosure. Consider an example in which a user ‘X’ plans to have a meal including multiple items: a bowl of salad, one piece of sandwich, a bowl of pea sprouts, a bowl of paneer, and a glass of orange juice. The user ‘X’ wants to track and analyze the nutritional intake using the portable smart mat (100). User ‘X’ places each item on the corresponding section of the portable smart mat's top surface (122). A plurality of sensors (126) captures the weight of each item and sends it to a user device (118). A receiving module (124) receives input about the food and beverage placed on the portable smart mat (100) through various means, such as selecting items from a pre-defined list, capturing images of the items using a camera, or describing the items using text. An integration module (128) and a prediction module (130) utilize this weight data, along with other characteristics of the food and beverage, to determine their nutritional values in real time. These values may include calories, macronutrients, micronutrients, and other relevant information. A visual feedback module (132) provides information about the nutritional values of the food and beverage corresponding to each bowl. For instance, if the salad contains a high calorie content, a plurality of light emitting diodes (134) surrounding the corresponding section of the portable smart mat (100) may illuminate or change color to alert the user about the nutritional composition.
[0051] FIG. 4 is a schematic representation of a portable smart mat (100) with embedded components in accordance with an embodiment of the present disclosure. A plurality of sensors (126) is configured to detect the weight of the utensils and the food or beverage items placed on the mat. The portable smart mat (100) has an in-built rechargeable battery to power the hardware components. Further, the portable smart mat (100) includes a wireless communication module (not shown). The wireless communication module is configured to transmit and receive data between the portable smart mat (100) and a user device (118).
[0052] FIG. 5 is a block diagram of a computer or a server in accordance with an embodiment of the present disclosure. The server (200) includes processor(s) (230), and memory (210) operatively coupled to the bus (220). The processor(s) (230), as used herein, means any type of computational circuit, such as, but not limited to, a microprocessor, a microcontroller, a complex instruction set computing microprocessor, a reduced instruction set computing microprocessor, a very long instruction word microprocessor, an explicitly parallel instruction computing microprocessor, a digital signal processor, or any other type of processing circuit, or a combination thereof.
[0053] The memory (210) includes several subsystems stored in the form of an executable program which instructs the processor (230) to perform the method steps illustrated in FIG. 9(a) and FIG. 9(b). The memory (210) includes the processing subsystem (105) of FIG. 1. The processing subsystem (105) further has the following modules: receiving module (124), integration module (128), prediction module (130), and visual feedback module (132).
[0054] In accordance with an embodiment of the present disclosure, a portable smart mat (100) for nutrition intake tracking is provided. The portable smart mat (100) includes a top surface (122). The top surface (122) of the portable smart mat (100) includes one or more sections. Each of the one or more sections is adapted to accommodate a utensil holding at least one of a food and a beverage. The portable smart mat (100) is powered by an in-built rechargeable battery. The portable smart mat (100) includes a processing subsystem (105) hosted on a server (108). The processing subsystem (105) is configured to execute on a network to control bidirectional communications among a plurality of modules. The processing subsystem (105) includes a receiving module (124). The receiving module (124) is configured to receive input from a user (120) regarding at least one of the food and the beverage. The input includes at least one of an image, video, text description, and user selection from a pre-defined list of at least one of the food and the beverage. The portable smart mat (100) includes a plurality of sensors (126) positioned beneath the one or more sections. The plurality of sensors (126) is configured to detect the weight of each empty utensil placed on the one or more sections. The plurality of sensors (126) is configured to detect the weight of at least one of the food and the beverage in each utensil placed on the one or more sections by eliminating the weight of the corresponding empty utensils. The processing subsystem (105) includes an integration module (128) operatively coupled with the plurality of sensors (126). The integration module (128) is configured to integrate the input and weight of at least one of the food and the beverage with the state-of-the-art machine learning model. The processing subsystem (105) includes a prediction module (130) operatively coupled with the integration module (128). 
The prediction module (130) is configured to detect at least one of the food and the beverage. The prediction module (130) is configured to predict a corresponding nutritional value based on the weight and one or more characteristics of at least one of the food and the beverage in real time. The processing subsystem (105) includes a visual feedback module (132) operatively coupled with the prediction module (130). The visual feedback module (132) is configured to trigger an alert for the user (120) if the nutritional value of at least one of the food and the beverage exceeds a pre-determined value. The portable smart mat (100) includes a plurality of light emitting diodes (134) positioned around each of the one or more sections and operatively coupled to the visual feedback module (132). The plurality of light emitting diodes (134) is adapted to glow in response to the alert triggered, thereby tracking and monitoring consumption of nutrition intake of the user (120).
[0055] The bus (220) as used herein refers to internal memory channels or computer network that is used to connect computer components and transfer data between them. The bus (220) includes a serial bus or a parallel bus, wherein the serial bus transmits data in bit-serial format and the parallel bus transmits data across multiple wires. The bus (220) as used herein may include but not limited to, a system bus, an internal bus, an external bus, an expansion bus, a frontside bus, a backside bus and the like.
[0056] FIG. 6 illustrates an exemplary screenshot of connecting a portable smart mat (100) with a user device of FIG. 1 in accordance with an embodiment of the present disclosure. A user (120) is presented with a user interface or application on the user device (118) that facilitates a synchronization between the portable smart mat (100) and the user device (118). The user interface may display relevant information such as the device name, connection status, and available options. Within the interface, an option labeled "Sync Mat" is displayed to the user. This option is typically placed visibly, allowing the user (120) to easily identify and click on it to initiate the syncing process. Upon synchronization, the portable smart mat (100) transfers information such as food and beverage data, weight measurements, and other relevant data captured by a plurality of sensors (126) to the user device (118).
[0057] FIG. 7(a) illustrates an exemplary screenshot of weight detection and nutritional tracking of food of FIG. 1 in accordance with an embodiment of the present disclosure. The screenshot displays a visual representation of various food items that have been placed on the portable smart mat (100). Each food item is accompanied by its corresponding weight, which is detected by the plurality of sensors (126) beneath the sections of the portable smart mat (100) with the total weight. This information allows the user (120) to track the weight of their food accurately.
[0058] FIG. 7(b) illustrates an exemplary screenshot demonstrating the ability of a user to upload images captured using a user device (118) and stored images of FIG 1, in accordance with an embodiment of the present disclosure. The screenshot displays options for uploading images of food. It may include an icon or button to capture an image using the device's camera in real-time and another option to browse and select images from the user device's stored photo gallery.
[0059] FIG. 8 illustrates an exemplary screenshot displaying the nutritional value of the food placed on a portable smart mat (100) in accordance with an embodiment of the present disclosure. This screenshot showcases a user interface that provides a detailed breakdown of the nutritional value of the food items detected on the portable smart mat (100). The screenshot displays a list of the food and beverage that have been placed on the portable smart mat (100). Each food item is listed individually, allowing the user (120) to identify and track the nutritional value of each item separately. For each food item, the user interface presents detailed nutritional information. This may include but is not limited to calories, carbohydrates, proteins, fats, vitamins, minerals and other relevant nutritional values.
[0060] FIG. 9(a) illustrates a flow chart representing the steps involved in a method to operate a portable smart mat for nutrition intake tracking in accordance with an embodiment of the present disclosure. FIG. 9(b) illustrates continued steps of the method of FIG. 9(a) in accordance with an embodiment of the present disclosure. The method (300) includes placing, on a utensil on each of the one or more sections on a top surface of a portable smart mat, at least one of a food and a beverage in step 310. The method (300) involves arranging utensils such as bowls, plates, cups, and the like on the designated sections of the portable smart mat. The user places the utensils along with the food and/or beverage items they intend to consume.
[0061] In a preferred embodiment, there are five sections. Each section has a specific size, such as W 8 cm × H 8 cm, W 8 cm × H 8 cm, W 8 cm × H 8 cm, W 16 cm × H 8 cm, and W 16 cm × H 17.5 cm. These dimensions represent the width (W) and height (H) measurements for each section, indicating the space allocated for placing utensils. In a specific embodiment, the one or more sections is adjustable in size to accommodate different utensil sizes and shapes.
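For illustration only (the disclosure does not prescribe any software representation), the five-section layout of the preferred embodiment can be captured as a simple configuration; the section identifiers are hypothetical names, not reference numerals from the drawings:

```python
# Illustrative layout of the five sections from the preferred embodiment,
# recorded as (width_cm, height_cm) pairs per section.
SECTIONS_CM = {
    "section_1": (8, 8),
    "section_2": (8, 8),
    "section_3": (8, 8),
    "section_4": (16, 8),
    "section_5": (16, 17.5),
}

# Total top-surface area allocated to utensils: 3*64 + 128 + 280 = 600 cm^2
total_area = sum(w * h for w, h in SECTIONS_CM.values())
print(total_area)  # 600.0
```

Such a table would also be the natural place to express the adjustable-section embodiment, by updating the stored dimensions at runtime.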
[0062] The method (300) includes receiving, by a receiving module, input from a user regarding at least one of the food and the beverage. The input includes at least one of an image, video, text description, and user selection from a pre-defined list of at least one of the food and the beverage in step 320. In one embodiment, the user may capture images of their food or beverage using a camera on a user device, or upload an image from the stored images. The user may also record videos showcasing their food and beverages to provide additional visual information. Further, the user has the option to provide a text description of the food and beverage that he/she is consuming.
[0063] The method (300) includes detecting, by a plurality of sensors, the weight of each empty utensil placed on the one or more sections in step 330. By accurately determining the weight of empty utensils, the portable smart mat ensures that the nutritional calculations are based on the actual weight of the food and beverage being consumed by the user.
[0064] The method (300) includes detecting, by the plurality of sensors, the weight of at least one of the food and the beverage in each utensil placed on the one or more sections by eliminating the weight of the corresponding empty utensils in step 340.
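By way of illustration only, the tare-and-net weighing of steps 330 and 340 reduces to a per-section subtraction. The following sketch assumes hypothetical section identifiers and sensor readings in grams; the actual sensor interface is not specified in the disclosure:

```python
def net_weights(empty_readings, loaded_readings):
    """Subtract each section's empty-utensil (tare) weight from its loaded
    reading to obtain the weight of the food or beverage alone, in grams."""
    net = {}
    for section, tare in empty_readings.items():
        loaded = loaded_readings.get(section)
        if loaded is None:
            continue  # nothing placed on this section yet
        net[section] = max(loaded - tare, 0.0)  # clamp sensor noise below zero
    return net

# Hypothetical readings from the plurality of sensors (126), in grams
tare = {"section_1": 150.0, "section_2": 210.0}
loaded = {"section_1": 402.5, "section_2": 335.0}
print(net_weights(tare, loaded))  # {'section_1': 252.5, 'section_2': 125.0}
```

Clamping to zero guards against small negative residues from sensor drift between the tare reading and the loaded reading.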
[0065] The method (300) includes integrating, by an integration module, the input and weight of at least one of the food and the beverage with the state-of-the-art machine learning model in step 350.
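The integration of step 350 pairs the user's input from step 320 with the net weight from step 340 into a single record that a downstream model can consume. A minimal sketch, with a hypothetical record type and field names not drawn from the disclosure:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class MealObservation:
    """Illustrative record pairing user input (step 320) with the
    net sensor weight (step 340) for the prediction module."""
    section: str
    weight_g: float
    image_path: Optional[str] = None       # photo captured or uploaded by the user
    text_description: Optional[str] = None # free-text description of the item
    selected_item: Optional[str] = None    # choice from the pre-defined list

def integrate(section, weight_g, **user_input):
    """Merge the sensor weight with whichever input modality the user supplied."""
    return MealObservation(section=section, weight_g=weight_g, **user_input)

obs = integrate("section_1", 252.5, selected_item="paneer")
print(obs.selected_item, obs.weight_g)  # paneer 252.5
```

In practice the record would be serialized and passed to the machine learning model; the dataclass simply makes explicit that weight and input travel together per section.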
[0066] The method (300) includes detecting, by a prediction module, at least one of the food and the beverage in step 360.
[0067] The method (300) includes predicting, by the prediction module, a corresponding nutritional value based on the weight and one or more characteristics of at least one of the food and the beverage in real time in step 370. The nutritional values predicted by the prediction module include, but are not limited to, calories, macronutrients (such as carbohydrates, proteins, and fats), micronutrients (such as vitamins and minerals), fiber content, and the like. These predictions are delivered in real time, ensuring that users receive immediate feedback.
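Once an item is identified, the real-time prediction of step 370 amounts to scaling per-item nutrient densities by the measured weight. A minimal sketch, using a hypothetical per-100 g nutrient table in place of the trained model's output and the food database:

```python
# Hypothetical per-100 g nutrient densities; in practice these values would
# come from the prediction module's model and the stored food information.
NUTRIENTS_PER_100G = {
    "paneer":       {"calories": 296, "protein_g": 20.0, "fat_g": 22.0, "carbs_g": 3.6},
    "orange juice": {"calories": 45,  "protein_g": 0.7,  "fat_g": 0.2,  "carbs_g": 10.4},
}

def predict_nutrition(item, weight_g):
    """Scale the per-100 g values of a detected item to its measured weight."""
    per_100g = NUTRIENTS_PER_100G[item]
    factor = weight_g / 100.0
    return {k: round(v * factor, 1) for k, v in per_100g.items()}

print(predict_nutrition("paneer", 50.0))  # half of the per-100 g values
```

The same scaling applies to any nutrient the model reports, which is why accurate tare-corrected weights from the sensors matter for the final figures.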
[0068] The method (300) includes triggering, by a visual feedback module, an alert to the user if the nutritional value of at least one of the food and the beverage exceeds a pre-determined value in step 380.
[0069] The method (300) includes glowing, by a plurality of light emitting diodes, in response to the alert triggered, thereby tracking and monitoring consumption of nutrition intake of the user in step 390.
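Steps 380 and 390 together reduce to a per-section threshold check that drives the corresponding light emitting diodes. An illustrative sketch, with the LED driver stubbed out since the hardware interface is not specified in the disclosure:

```python
def check_and_alert(nutrition, limits, set_led):
    """Trigger an alert (and light the section's LEDs) for every nutrient
    that exceeds the user's pre-determined value."""
    exceeded = {k: v for k, v in nutrition.items()
                if k in limits and v > limits[k]}
    set_led(on=bool(exceeded))  # LEDs glow only when a limit is exceeded
    return exceeded

led_state = {}
def fake_led(on):
    """Stand-in for the LED driver surrounding one section of the mat."""
    led_state["on"] = on

over = check_and_alert({"calories": 520.0, "fat_g": 12.0},
                       {"calories": 400.0, "fat_g": 20.0},
                       fake_led)
print(over)       # {'calories': 520.0}
print(led_state)  # {'on': True}
```

Because the limits dictionary is supplied per user, this structure also accommodates the customizable pre-determined values based on dietary goals, weight, and health conditions.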
[0070] Various embodiments of the portable smart mat for tracking nutrition intake as described above allow the users to accurately track the nutritional content of their food and beverage intake in real time. Subsequently, by weighing the food and beverages placed on the smart mat and analyzing their characteristics, the portable smart mat provides detailed information about the nutritional value, including calories, macronutrients, micronutrients and the like. Further, the portable smart mat provides personalized feedback based on the user's nutritional goals and preferences. By setting predetermined values, the portable smart mat can alert the user if the nutritional composition of a particular food or beverage item exceeds his/her required intake. This helps users maintain a healthy and balanced diet. The visual feedback module along with the plurality of light emitting diodes facilitates easy monitoring of the nutrition intake.
[0071] The techniques described in this disclosure may be implemented, at least in part, in hardware, software, firmware, or any combination thereof. For example, various aspects of the described techniques may be implemented within one or more processors, including one or more microprocessors, digital signal processors (DSPs), application-specific integrated circuits (ASICs), field-programmable gate arrays (FPGAs), or any other equivalent integrated or discrete logic circuitry, as well as any combinations of such components. The term “processor” or “processing subsystem” may generally refer to any of the foregoing logic circuitry, alone or in combination with other logic circuitry, or any other equivalent circuitry. A control unit including hardware may also perform one or more of the techniques of this disclosure.
[0072] Such hardware, software, and firmware may be implemented within the same device or within separate devices to support the various techniques described in this disclosure. In addition, any of the described units, modules, or components may be implemented together or separately as discrete but interoperable logic devices. Depiction of different features as modules or units is intended to highlight different functional aspects and does not necessarily imply that such modules or units must be realized by separate hardware, firmware, or software components. Rather, functionality associated with one or more modules or units may be performed by separate hardware, firmware, or software components, or integrated within common or separate hardware, firmware, or software components.
[0073] It will be understood by those skilled in the art that the foregoing general description and the following detailed description are exemplary and explanatory of the disclosure and are not intended to be restrictive thereof.
[0074] While specific language has been used to describe the disclosure, any limitations arising on account of the same are not intended. As would be apparent to a person skilled in the art, various working modifications may be made to the method in order to implement the inventive concept as taught herein.
[0075] The figures and the foregoing description give examples of embodiments. Those skilled in the art will appreciate that one or more of the described elements may well be combined into a single functional element. Alternatively, certain elements may be split into multiple functional elements. Elements from one embodiment may be added to another embodiment. For example, the order of processes described herein may be changed and are not limited to the manner described herein. Moreover, the actions of any flow diagram need not be implemented in the order shown; nor do all of the acts need to be necessarily performed. Also, those acts that are not dependent on other acts may be performed in parallel with the other acts. The scope of embodiments is by no means limited by these specific examples.
Claims:
1. A portable smart mat (100) for tracking nutrition intake comprising:
a top surface (122), wherein the top surface (122) of the portable smart mat (100) comprises one or more sections, wherein each of the one or more sections is adapted to accommodate a utensil holding at least one of a food and a beverage;
wherein the portable smart mat (100) is powered by an in-built rechargeable battery;
a processing subsystem (105) hosted on a server (108), wherein the processing subsystem (105) is configured to execute on a network to control bidirectional communications among a plurality of modules comprising:
a receiving module (124) configured to receive input from a user (120) regarding at least one of the food and the beverage wherein the input comprises at least one of an image, video, text description, and user selection from a pre-defined list of at least one of the food and the beverage;
a plurality of sensors (126) positioned beneath the one or more sections, wherein the plurality of sensors (126) is configured to:
detect the weight of each empty utensil placed on the one or more sections; and
detect the weight of at least one of the food and the beverage in each utensil placed on the one or more sections by eliminating the weight of the corresponding empty utensils;
characterized in that,
wherein the processing subsystem (105) comprises:
an integration module (128) operatively coupled with the plurality of sensors (126), wherein the integration module (128) is configured to integrate the input and weight of at least one of the food and the beverage with the state-of-the-art machine learning model;
a prediction module (130) operatively coupled with the integration module (128), wherein the prediction module (130) is configured to:
detect at least one of the food and the beverage; and
predict a corresponding nutritional value based on the weight and one or more characteristics of at least one of the food and the beverage in real time;
a visual feedback module (132) operatively coupled with the prediction module (130), wherein the visual feedback module (132) is configured to trigger an alert to the user (120) if the nutritional value of at least one of the food and the beverage exceeds a pre-determined value; and
a plurality of light emitting diodes (134) positioned around each of the one or more sections and operatively coupled to the visual feedback module (132), wherein the plurality of light emitting diodes (134) is adapted to glow in response to the alert triggered, thereby tracking and monitoring consumption of nutrition intake of the user (120).
2. The portable smart mat (100) as claimed in claim 1, wherein the portable smart mat (100) comprises a wireless communication module, wherein the wireless communication module is configured to transmit and receive data between the portable smart mat (100) and a user device (118).
3. The portable smart mat (100) as claimed in claim 1, comprising a database (110), wherein the database (110) is configured to store food information comprising a plurality of Indian and international food items and beverages.
4. The portable smart mat (100) as claimed in claim 1, wherein the one or more sections is adjustable in size to accommodate different utensil sizes and shapes.
5. The portable smart mat (100) as claimed in claim 1, wherein the receiving module (124) is configured to enable the user (120) to capture photos of at least one of the food and the beverage using a camera on the user device (118).
6. The portable smart mat (100) as claimed in claim 1, wherein the nutritional value comprises calories, macronutrients, micronutrients, and fiber content.
7. The portable smart mat (100) as claimed in claim 1, wherein the pre-determined value is customizable based on personal dietary goals, weight and health conditions of the user (120).
8. A method (300) to operate a portable smart mat for tracking nutrition intake comprising:
placing, on a utensil on each of the one or more sections on a top surface of the portable smart mat, at least one of a food and a beverage; (310)
receiving, by a receiving module, input from a user regarding at least one of the food and the beverage wherein the input comprises at least one of an image, video, text description, and user selection from a pre-defined list of at least one of the food and the beverage; (320)
detecting, by a plurality of sensors, the weight of each empty utensil placed on the one or more sections; (330)
detecting, by the plurality of sensors, the weight of at least one of the food and the beverage in each utensil placed on the one or more sections by eliminating the weight of the corresponding empty utensils; (340)
characterized in that,
integrating, by an integration module, the input and weight of at least one of the food and the beverage with the state-of-the-art machine learning model; (350)
detecting, by a prediction module, at least one of the food and the beverage; (360)
predicting, by the prediction module, a corresponding nutritional value based on the weight and one or more characteristics of at least one of the food and the beverage in real time; (370)
triggering, by a visual feedback module, an alert to the user if the nutritional value of at least one of the food and the beverage exceeds a pre-determined value; and (380)
glowing, by a plurality of light emitting diodes, in response to the alert triggered, thereby tracking and monitoring consumption of nutrition intake of the user. (390)
Dated this 29th day of September 2023
Signature
Jinsu Abraham
Patent Agent (IN/PA-3267)
Agent for the Applicant
| # | Name | Date |
|---|---|---|
| 1 | 202341065713-STATEMENT OF UNDERTAKING (FORM 3) [29-09-2023(online)].pdf | 2023-09-29 |
| 2 | 202341065713-REQUEST FOR EARLY PUBLICATION(FORM-9) [29-09-2023(online)].pdf | 2023-09-29 |
| 3 | 202341065713-PROOF OF RIGHT [29-09-2023(online)].pdf | 2023-09-29 |
| 4 | 202341065713-POWER OF AUTHORITY [29-09-2023(online)].pdf | 2023-09-29 |
| 5 | 202341065713-FORM-9 [29-09-2023(online)].pdf | 2023-09-29 |
| 6 | 202341065713-FORM FOR SMALL ENTITY(FORM-28) [29-09-2023(online)].pdf | 2023-09-29 |
| 7 | 202341065713-FORM FOR SMALL ENTITY [29-09-2023(online)].pdf | 2023-09-29 |
| 8 | 202341065713-FORM 1 [29-09-2023(online)].pdf | 2023-09-29 |
| 9 | 202341065713-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [29-09-2023(online)].pdf | 2023-09-29 |
| 10 | 202341065713-EVIDENCE FOR REGISTRATION UNDER SSI [29-09-2023(online)].pdf | 2023-09-29 |
| 11 | 202341065713-DRAWINGS [29-09-2023(online)].pdf | 2023-09-29 |
| 12 | 202341065713-DECLARATION OF INVENTORSHIP (FORM 5) [29-09-2023(online)].pdf | 2023-09-29 |
| 13 | 202341065713-COMPLETE SPECIFICATION [29-09-2023(online)].pdf | 2023-09-29 |
| 14 | 202341065713-MSME CERTIFICATE [03-10-2023(online)].pdf | 2023-10-03 |
| 15 | 202341065713-FORM28 [03-10-2023(online)].pdf | 2023-10-03 |
| 16 | 202341065713-FORM 18A [03-10-2023(online)].pdf | 2023-10-03 |
| 17 | 202341065713-FORM-26 [13-10-2023(online)].pdf | 2023-10-13 |
| 18 | 202341065713-FER.pdf | 2023-11-06 |
| 19 | 202341065713-OTHERS [27-12-2023(online)].pdf | 2023-12-27 |
| 20 | 202341065713-FORM 3 [27-12-2023(online)].pdf | 2023-12-27 |
| 21 | 202341065713-FER_SER_REPLY [27-12-2023(online)].pdf | 2023-12-27 |
| 22 | 202341065713-US(14)-HearingNotice-(HearingDate-06-08-2024).pdf | 2024-07-09 |
| 23 | 202341065713-FORM-26 [26-07-2024(online)].pdf | 2024-07-26 |
| 24 | 202341065713-Correspondence to notify the Controller [26-07-2024(online)].pdf | 2024-07-26 |
| 25 | 202341065713-Written submissions and relevant documents [16-08-2024(online)].pdf | 2024-08-16 |
| 26 | 202341065713-PatentCertificate18-09-2024.pdf | 2024-09-18 |
| 27 | 202341065713-IntimationOfGrant18-09-2024.pdf | 2024-09-18 |
| 28 | 202341065713- Certificate of Inventorship-044000036( 13-01-2025 ).pdf | 2025-01-13 |
| 29 | 202341065713- Certificate of Inventorship-044000032( 13-01-2025 ).pdf | 2025-01-13 |
| 30 | 202341065713- Certificate of Inventorship-044000042( 14-01-2025 ).pdf | 2025-01-14 |
| 31 | 202341065713- Certificate of Inventorship-044000123( 05-03-2025 ).pdf | 2025-03-05 |