Abstract: A meal delivery system is disclosed. The meal delivery system comprises a collapsible enclosure providing a space to allow for placement of a meal. The meal delivery system further comprises a set of first sensors to detect at least one qualitative and/or quantitative parameter related to the meal. The meal delivery system further comprises a controller configured to process the detected parameter to determine qualitative and/or quantitative properties for the meal; determine desired qualitative and/or quantitative properties for the meal; compare the determined qualitative and/or quantitative properties with the desired qualitative and/or quantitative properties; and generate a first signal indicative of approval for the meal if the determined qualitative and/or quantitative properties are within a corresponding threshold of the desired qualitative and/or quantitative properties for the meal, and a second signal indicative of rejection for the meal otherwise. FIG. 1A
Description: MEAL DELIVERY SYSTEM
FIELD OF THE PRESENT DISCLOSURE
[0001] The present disclosure generally relates to a meal delivery mechanism, and more particularly to a meal delivery system to ensure qualitative and/or quantitative properties for the one or more food items in the meal being served.
BACKGROUND
[0002] Good health is a prerequisite for national development. With the world's largest youth population, India represents an inspiring demographic dividend that can have a lasting impact on the social and economic development of the country. Therefore, investing in the health and wellbeing of children is a critical priority in nation-building efforts. Many recent research studies have shown clear correlations between poor diet and health conditions. Children are vulnerable to a wide spectrum of communicable and chronic disease conditions due to nutritional deficiencies. The total economic impact of poor nutrition has proven difficult to quantify as a whole, with estimates ranging from hundreds to thousands of crores annually. However, there is little doubt that the consequences of poor nutrition are severe. Poor diets, such as undereating, overeating, and/or consuming poor-quality food, have direct correlations with certain medical conditions, such as diabetes, cancer, heart disease, hypertension, chronic kidney disease, obesity, and the like.
[0003] For institutions, such as schools providing meals to children under the mid-day meal scheme, it is often difficult to ascertain the qualitative and/or quantitative standards of a served meal. Also, it is often not possible for such institutions to ensure that the served meal meets the prescribed nutritional requirements or the necessary dietary regimen for the recipient of the served meal each time. Similarly, restaurants and cloud kitchens also need to ensure that each serving of the cooked and packaged meal maintains certain standards. The traditional way of achieving this is manual inspection of the meal. However, such manual inspection is time consuming and error prone. Moreover, existing devices which may provide some functionality to check the quality or quantity of meals are generally fixed, bulky, often designed to cater to checking of specific food items, and are not readily available and/or easily operated and interpreted by inexperienced individuals.
[0004] There is a need for a simple, quick, and effective system for determining appropriate quality and quantity of a meal, which may further be augmented to take into consideration characteristics of the individual receiving the meal.
SUMMARY
[0005] In an aspect, a meal delivery system is disclosed. The meal delivery system comprises an enclosure providing a space to allow for placement of a meal comprising one or more food items therein. The meal delivery system further comprises a set of first sensors incorporated in the enclosure, wherein each first sensor in the set of first sensors is configured to detect at least one qualitative and/or quantitative parameter related to the one or more food items in the meal currently placed in the space of the enclosure. The meal delivery system further comprises a controller. The controller is configured to process the detected at least one parameter, from the set of first sensors, related to the one or more food items in the meal currently placed in the space of the enclosure to determine qualitative and/or quantitative properties for the one or more food items in the meal. The controller is further configured to determine desired qualitative and/or quantitative properties for the one or more food items in the meal. The controller is further configured to compare the determined qualitative and/or quantitative properties for the one or more food items in the meal with the desired qualitative and/or quantitative properties for the one or more food items in the meal. The controller is further configured to generate a first signal indicative of approval for the meal if the determined qualitative and/or quantitative properties for the one or more food items in the meal are within a corresponding threshold of the desired qualitative and/or quantitative properties for the one or more food items in the meal, and a second signal indicative of rejection for the meal if the determined qualitative and/or quantitative properties for the one or more food items in the meal are not within the corresponding threshold of the desired qualitative and/or quantitative properties for the one or more food items in the meal.
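The compare-and-signal logic described in this aspect can be sketched as follows. This is a minimal illustration only; the property names, the dictionary-based threshold structure, and the string signal values are assumptions for the example, not part of the disclosure:

```python
def evaluate_meal(determined: dict, desired: dict, thresholds: dict) -> str:
    """Compare determined properties of the meal against desired properties.

    Returns "APPROVE" (the first signal) if every determined property lies
    within its corresponding threshold of the desired value, otherwise
    "REJECT" (the second signal).
    """
    for prop, desired_value in desired.items():
        tolerance = thresholds.get(prop, 0)
        if abs(determined.get(prop, float("inf")) - desired_value) > tolerance:
            return "REJECT"   # second signal: outside the corresponding threshold
    return "APPROVE"          # first signal: all properties within threshold

# Example with illustrative values: weight in grams, temperature in degrees Celsius
determined = {"weight": 245.0, "temperature": 62.0}
desired = {"weight": 250.0, "temperature": 65.0}
thresholds = {"weight": 10.0, "temperature": 5.0}
print(evaluate_meal(determined, desired, thresholds))  # APPROVE
```

A per-property tolerance is used here so that each determined property is checked against its own corresponding threshold, mirroring the "corresponding threshold" language of the aspect.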
[0006] In one or more embodiments, the enclosure is a collapsible enclosure configured to be transformed into an operation state and a collapsed state, and wherein the enclosure is adapted to provide the space to allow for the placement of the meal therein in the operation state thereof.
[0007] In one or more embodiments, the set of first sensors comprises an image sensor arranged in the enclosure to capture an optical image of the meal. The controller is configured to: analyse the captured optical image of the meal to determine one or more of size, shape, colour, texture, and volume of at least one of the one or more food items in the meal, and generate the first signal if the determined one or more of size, shape, colour, texture, and volume is within a corresponding respective qualitative and/or quantitative range threshold and the second signal if the determined one or more of size, shape, colour, texture, and volume is not within the corresponding respective qualitative and/or quantitative range threshold.
[0008] In one or more embodiments, the set of first sensors comprises a spectroscopic sensor arranged in the enclosure to capture a spectroscopic image of the meal, and wherein the controller is configured to analyse the captured spectroscopic image of the meal to determine the presence of adulterants in the one or more food items in the meal, and generate the first signal if no adulterants are present in the meal and the second signal if adulterants are present in the meal.
[0009] In one or more embodiments, the set of first sensors comprises a thermal sensor arranged in the enclosure to measure a temperature of at least one of the one or more food items in the meal, and wherein the controller is configured to analyse the measured temperature of the at least one of the one or more food items in the meal, and generate the first signal if the measured temperature is within a corresponding temperature range threshold and the second signal if the measured temperature is not within the corresponding temperature range threshold.
[0010] In one or more embodiments, the set of first sensors comprises one or more load cell sensors arranged in a base of the enclosure to measure weight of at least one of the one or more food items in the meal placed thereon, and wherein the controller is configured to analyse the measured weight of the at least one of the one or more food items in the meal, and generate the first signal if the measured weight is within a corresponding weight range threshold and the second signal if the measured weight is not within the corresponding weight range threshold.
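The thermal-sensor and load-cell embodiments above both reduce to the same pattern: a measured value is tested against a prescribed range threshold. A minimal sketch of that shared pattern follows; the particular range values are illustrative assumptions, not values from the disclosure:

```python
def within_range(measured: float, low: float, high: float) -> bool:
    """Return True (first signal) if the measured value lies within the
    prescribed range [low, high], and False (second signal) otherwise."""
    return low <= measured <= high

# Illustrative range thresholds, assumed for this example only
TEMP_RANGE_C = (55.0, 75.0)      # acceptable serving temperature, degrees Celsius
WEIGHT_RANGE_G = (230.0, 270.0)  # acceptable portion weight, grams

print(within_range(62.0, *TEMP_RANGE_C))     # True: temperature acceptable
print(within_range(210.0, *WEIGHT_RANGE_G))  # False: portion underweight
```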
[0011] In one or more embodiments, the meal delivery system further comprises a machine learning model trained on the desired qualitative and/or quantitative properties for the one or more food items in relation to multiple characteristics of different individuals. The controller is configured to: determine one or more characteristics of an individual receiving the meal currently placed in the space of the enclosure, and implement the machine learning model to determine the desired qualitative and/or quantitative properties for the one or more food items in the meal currently placed in the space of the enclosure based on the determined one or more characteristics of the individual receiving the meal.
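The mapping from an individual's characteristics to desired meal properties can be illustrated with a toy stand-in for the trained machine learning model: a nearest-neighbour lookup from a single characteristic (here, BMI) to a desired portion weight. All data values, and the choice of nearest-neighbour lookup itself, are assumptions for illustration, not the disclosed model:

```python
# Toy training data: (BMI, desired portion weight in grams).
# Values are illustrative assumptions only.
TRAINING_DATA = [
    (14.0, 300.0),   # underweight individual: larger portion
    (18.0, 260.0),
    (22.0, 230.0),
    (27.0, 200.0),   # overweight individual: smaller portion
]

def desired_portion(bmi: float) -> float:
    """Return the desired quantitative property (portion weight) for an
    individual's BMI using a nearest-neighbour lookup over the toy data."""
    nearest = min(TRAINING_DATA, key=lambda row: abs(row[0] - bmi))
    return nearest[1]

print(desired_portion(17.0))  # 260.0 (closest training BMI is 18.0)
```

In practice a trained regression or classification model would replace the lookup table, but the interface is the same: characteristics in, desired qualitative and/or quantitative properties out.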
[0012] In one or more embodiments, the meal delivery system further comprises a vending apparatus disposed in the enclosure, wherein the vending apparatus is configured to dispense the one or more food items of the meal based on the one or more characteristics of the individual.
[0013] In one or more embodiments, the meal delivery system further comprises at least one second sensor configured to capture information about the one or more characteristics of the individual receiving the meal currently placed in the space of the enclosure. The at least one second sensor comprises one or more of: an infrared sensor configured to capture physical traits of the individual, wherein the controller is configured to determine a Body Mass Index (BMI) of the individual based on the captured physical traits of the individual as the one or more characteristics of the individual, and an image sensor configured to capture a facial image of the individual, wherein the controller is configured to determine an identity of the individual based on the captured facial image of the individual and further determine a meal preference of the individual based on the determined identity of the individual as the one or more characteristics of the individual.
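The BMI determination referred to above follows the standard formula, BMI = weight (kg) divided by the square of height (m). A minimal sketch, assuming the infrared sensor's captured physical traits have already been converted to a height and weight estimate:

```python
def body_mass_index(weight_kg: float, height_m: float) -> float:
    """Compute BMI from estimated weight (kg) and height (m):
    BMI = weight / height**2."""
    return weight_kg / (height_m ** 2)

# Illustrative estimates for a child recipient (assumed values)
bmi = body_mass_index(45.0, 1.50)
print(round(bmi, 1))  # 20.0
```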
[0014] In one or more embodiments, the meal delivery system further comprises a third sensor configured to capture at least one trait of an individual receiving the meal currently placed in the space of the enclosure. The controller is configured to one or more of: confirm duplicity in case of the individual having previously received meal in an ongoing meal session based on the captured at least one trait of the individual, and mark attendance for the individual based on the captured at least one trait of the individual.
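The duplicity check and attendance marking described above can be sketched as a session-scoped set of trait identifiers: the first time a trait is seen, attendance is marked; a repeat within the ongoing meal session is flagged as a duplicate. The trait identifiers below are hypothetical placeholders for, e.g., a facial-recognition match:

```python
# Trait identifiers of individuals already served in the ongoing meal session
served_this_session = set()

def register_recipient(trait_id: str) -> bool:
    """Mark attendance for the individual; return False if the individual
    has already received a meal in the ongoing session (duplicity)."""
    if trait_id in served_this_session:
        return False                   # duplicate: meal already delivered
    served_this_session.add(trait_id)  # mark attendance for this session
    return True

print(register_recipient("face-001"))  # True: first meal, attendance marked
print(register_recipient("face-001"))  # False: duplicate attempt rejected
```

Clearing the set at the start of each meal session would reset the duplicity check while attendance records could be persisted separately.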
[0015] The foregoing summary is illustrative only and is not intended to be in any way limiting. In addition to the illustrative aspects, embodiments, and features described above, further aspects, embodiments, and features will become apparent by reference to the drawings and the following detailed description.
BRIEF DESCRIPTION OF THE FIGURES
[0016] For a more complete understanding of example embodiments of the present disclosure, reference is now made to the following descriptions taken in connection with the accompanying drawings in which:
[0017] FIG. 1A illustrates a diagrammatic perspective view representation of a meal delivery system, in accordance with one or more embodiments of the present disclosure;
[0018] FIG. 1B illustrates a diagrammatic front planar view representation of the meal delivery system, in accordance with one or more embodiments of the present disclosure;
[0019] FIG. 1C illustrates a diagrammatic side planar view representation of the meal delivery system, in accordance with one or more embodiments of the present disclosure;
[0020] FIGS. 2A-2E illustrate various diagrammatic view representations of an enclosure for the meal delivery system, depicting its transformation from an operation state (as depicted in FIG. 1C) to a collapsed state thereof, in accordance with one or more embodiments of the present disclosure;
[0021] FIG. 3 illustrates a schematic block diagram representation of the meal delivery system, in accordance with one or more embodiments of the present disclosure;
[0022] FIG. 4 illustrates a schematic block diagram representation of hardware for a controller of the meal delivery system, in accordance with one or more embodiments of the present disclosure;
[0023] FIG. 5 illustrates an exemplary representation of a meal comprising one or more food items placed in a space of the enclosure of the meal delivery system, in accordance with one or more embodiments of the present disclosure;
[0024] FIG. 6 illustrates an exemplary representation of implementation of the meal delivery system to determine qualitative and/or quantitative properties for the one or more food items in the meal, in accordance with one or more embodiments of the present disclosure;
[0025] FIG. 7 illustrates an exemplary representation of implementation of the meal delivery system to capture information about the one or more characteristics of the individual receiving the meal currently placed in the enclosure, in accordance with one or more embodiments of the present disclosure;
[0026] FIG. 8 illustrates an exemplary representation of a first user interface showing qualitative and/or quantitative properties for the one or more food items in the meal and characteristics of an individual receiving the meal using the meal delivery system, in accordance with one or more embodiments of the present disclosure; and
[0027] FIG. 9 illustrates an exemplary representation of a second user interface showing data analysis for the meals delivered using the meal delivery system, in accordance with one or more embodiments of the present disclosure.
DETAILED DESCRIPTION
[0028] In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the present disclosure. It will be apparent, however, to one skilled in the art that the present disclosure is not limited to these specific details.
[0029] Reference in this specification to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present disclosure. The appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Further, the terms “a” and “an” herein do not denote a limitation of quantity, but rather denote the presence of at least one of the referenced items. Moreover, various features are described which may be exhibited by some embodiments and not by others. Similarly, various requirements are described which may be requirements for some embodiments but not for other embodiments.
[0030] Furthermore, in the following detailed description of the present disclosure, numerous specific details are set forth in order to provide a thorough understanding of the present disclosure. However, it will be understood that the present disclosure may be practiced without these specific details. In other instances, well-known methods, procedures, components, and circuits have not been described in detail so as not to unnecessarily obscure aspects of the present disclosure.
[0031] Embodiments described herein may be discussed in the general context of computer-executable instructions residing on some form of computer-readable storage medium, such as program modules, executed by one or more computers or other devices. By way of example, and not limitation, computer-readable storage media may comprise non-transitory computer-readable storage media and communication media; non-transitory computer-readable media include all computer-readable media except for a transitory, propagating signal. Generally, program modules include routines, programs, objects, components, data structures, etc., that perform particular tasks or implement particular abstract data types. The functionality of the program modules may be combined or distributed as desired in various embodiments.
[0032] Some portions of the detailed description that follows are presented and discussed in terms of a process or method. Although steps and sequencing thereof are disclosed in figures herein describing the operations of such process or method, such steps and sequencing are exemplary. Embodiments are well suited to performing various other steps or variations of the steps recited in the flowchart of the figure herein, and in a sequence other than that depicted and described herein.
[0033] Some portions of the detailed descriptions that follow are presented in terms of procedures, logic blocks, processing, and other symbolic representations of operations on data bits within a computer memory. These descriptions and representations are the means used by those skilled in the data processing arts to most effectively convey the substance of their work to others skilled in the art. In the present application, a procedure, logic block, process, or the like, is conceived to be a self-consistent sequence of steps or instructions leading to a desired result. The steps are those utilizing physical manipulations of physical quantities. Usually, although not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated in a computer system. It has proven convenient at times, principally for reasons of common usage, to refer to these signals as transactions, bits, values, elements, symbols, characters, samples, pixels, or the like.
[0034] Referring now to FIGS. 1A-1C, illustrated are diagrammatic representations of a meal delivery system (as represented by reference numeral 100), in accordance with one or more embodiments of the present disclosure. In the present illustrations, the meal delivery system 100 has been embodied in the form of a “self-contained portable/collapsible box” which may generally be used as a standalone device comprising all of the required sensors and processing systems. In alternative examples, the meal delivery system 100 may be embodied as a distributed arrangement with, for example, the processing systems being remote to the device so as to allow for performing remote analytics (as may be contemplated) without departing from the spirit and the scope of the present disclosure. Generally, the present meal delivery system 100 concerns an apparatus for monitoring a qualitative aspect (such as, ingredients quality, cooking quality, etc.) and a quantitative aspect (such as, weight, volume, etc.) of food items in a meal, so as to generally determine nutritional value of the meal. Further, the present meal delivery system 100 helps prevent multiple deliveries of meals to the same individual, and may also be implemented for marking attendance for individuals based on the meals received by them. Furthermore, the present meal delivery system 100 enables maintaining records of meal disbursements and may also allow analytics regarding meal disbursements to be performed without any limitations.
[0035] As illustrated, the meal delivery system 100 includes an enclosure 110. In the present illustrations, the enclosure 110 has been shown to be in the form of a generally cuboidal box; however other shapes may be contemplated without any limitations. The enclosure 110 supports the various components of the meal delivery system 100, including various sensors and processing systems thereof. Such design with the enclosure 110 provides that the meal delivery system 100 is a standalone apparatus, which is also portable (as discussed later in more detail) and thus may be carried from one site to another as required for its operation. The enclosure 110 provides a space 112 to allow for placement of a meal comprising one or more food items therein. It may be contemplated that the one or more food items of the meal may be arranged (put together) in a container, such as a tray, a plate, or a box, which in turn may be placed at the base 114 of the enclosure 110. Further, it may be contemplated that the multiple food items as part of the meal may be separated via one or more partitions in the container for segregation thereof, or may be placed adjacent to each other without segregation in a same compartment of the container without any limitations.
[0036] As may be seen, the said space 112 may be made accessible from one open side in the enclosure 110. In particular, the enclosure 110 has the base 114 accessible via the space 112 onto which the meal may be placed. Herein, a user (i.e., server of the meal) may place the meal at the base 114 via the open side in the enclosure 110, and then an individual (i.e., recipient of the meal) may pick-up the meal via the same open side in the enclosure 110. In some other examples, the space 112 may be in the form of a window or a cut-out in the enclosure 110 to allow for placement of the meal by the user in the space 112 from one open side and for pick-up of the meal by the individual from another (like opposite) open side in the enclosure 110, without departing from the spirit and the scope of the present disclosure.
[0037] Further, as illustrated in combination of FIGS. 1A-1C, the enclosure 110 has a front side 110a, a back side 110b, a bottom side 110c and a top side 110d. Herein, the front side 110a may provide support to mount some of the components of the meal delivery system 100. The bottom side 110c may provide support for the meal delivery system 100 to be placed onto a supporting surface (such as, the ground). Also, the enclosure 110 may include a set of legs (represented by reference numeral 116) provided at the bottom side 110c thereof. The set of legs 116 may allow the enclosure 110 to be placed above (i.e., with a gap from) the supporting surface. In some examples, the enclosure 110 may also be provided with an extended member 118, which may be in the form of an L-shaped member (such as, a hook) or the like, and be used for supporting the enclosure 110 over a surface, such as for temporarily affixing the enclosure 110 over the said surface. It may be appreciated that the enclosure 110 may be provided with other supporting members for its operation without any limitations.
[0038] In the present embodiments, the enclosure 110 is a collapsible enclosure (with the two terms being interchangeably used hereinafter). The collapsible enclosure 110 is configured to be transformed into an operation state and a collapsed state. Herein, the collapsible enclosure 110 is adapted to provide the space 112 in the operation state thereof, to allow for the placement of the meal therein. Further, the collapsible enclosure 110 is adapted to be in a portable form in the collapsed state thereof, in order to be easily carried from one site to another, and to be installed at the site with relative ease. For this purpose, as shown in FIGS. 1A-1C, the enclosure 110 may be made of hinged panels (as generally represented by reference numeral 120) which may be pivotably coupled to each other. These hinged panels 120 may be configured to fold onto each other in order to dispose the enclosure 110 in the collapsed state thereof. Further, the enclosure 110 may include side panels 122 which form sides of the enclosure 110 and a back panel 124 which forms the back (i.e., the back side 110b) of the enclosure 110. These side panels 122 and the back panel 124 support the hinged panels 120 in the unfolded state thereof, so as to dispose and maintain the enclosure 110 in the operation state thereof.
[0039] For purposes of the present disclosure, in example embodiments, the enclosure 110 may be made of selected materials, such as aluminium, plastics, etc., so that the enclosure 110 is relatively lightweight for it to be portable enough and be easily carried from one site to another (in the collapsed state thereof) as per operational requirements. In some examples, the enclosure 110 may also be provided with a handle (not shown), such as affixed to the top side 110d of the enclosure 110 to allow for carrying of the meal delivery system 100 by the user thereof, say from one site to another. Other supporting and reinforcing elements may be added to the enclosure 110 to enhance the durability of the meal delivery system 100, as may be contemplated by a person skilled in the art.
[0040] FIGS. 2A-2E illustrate various diagrammatic view representations of the enclosure 110 for the meal delivery system 100, depicting its transformation from the operation state to the collapsed state thereof, in accordance with one or more embodiments of the present disclosure. Referring back to FIG. 1C, the enclosure 110 is depicted in the operation state thereof. Herein, all of the panels 120 of the enclosure 110 are extended out to dispose the enclosure 110 in the operation state thereof. Now, as shown in FIG. 2A, firstly the side panels 122 may be folded against the back panel 124 (i.e., the back side 110b) of the enclosure 110. Further, referring to FIGS. 2B-2D, at each instance, one or more of the hinged panels 120 may be folded in, to carry out disposing of the enclosure 110 to the collapsed state thereof. Referring to FIG. 2E, the enclosure 110 is depicted in the collapsed state thereof. As shown in FIG. 2E, all of the hinged panels 120 and the side panels 122 of the enclosure 110 are folded, generally, against the back panel 124 to dispose the enclosure 110 in the collapsed state thereof. Such configuration details of the enclosure 110 allowing for easy and quick transformation between the operation state and the collapsed state may be understood, and thus not explained herein in any more detail for the brevity of the present disclosure.
[0041] In an example implementation, the present meal delivery system 100 may utilize a food container to receive and retain the food item. One or more load sensors may measure the weight of the food item being placed in the food container. Based on the measured weight and the identification of the food item, a nutritional value of the food item can be identified and notified to an individual. The meal delivery system 100 may further include sensors to identify a type of the food item. In one embodiment, such a sensor may be a camera which captures an image of the food item; in turn, one or more processors may identify the type of the food item based on the image. These functions may be carried out by the meal delivery system 100, which may provide various features based on the data gathered about meal delivery. An individual's food consumption behaviour, such as calorie intake and eating habits, may be monitored and evaluated by the meal delivery system 100 in order to improve the individual's nutritional intake. The meal delivery system 100 may also provide a meal plan based either on the food items being consumed by the individual over time or on an input from the individual. Such functions of the meal delivery system 100 may be understood based on the description in the following paragraphs.
[0042] Now referring to FIG. 3, illustrated is a schematic block diagram representation of the meal delivery system 100, in accordance with one or more embodiments of the present disclosure. As shown, the meal delivery system 100 includes a set of first sensors 310. In the present embodiments, the set of first sensors 310 may be incorporated in the enclosure 110. Each first sensor in the set of first sensors 310 is configured to detect at least one qualitative and/or quantitative parameter related to the one or more food items in the meal currently placed in the space 112 of the enclosure 110, in the meal delivery system 100. Herein, the qualitative parameter may be representative of a quality (such as, nutritional value) of the one or more food items in the meal. In the present examples, the qualitative parameter may also include indicators for determining a type of each of the one or more food items in the meal. Further, the quantitative parameter may be representative of a quantity (such as, weight, volume, or the like) of the one or more food items in the meal. In an example, the set of first sensors 310 may only include a single first sensor without any limitations.
[0043] In an embodiment, the set of first sensors 310 includes an image sensor 312. In the present meal delivery system 100, the image sensor 312 is arranged in the enclosure 110. In an example, as shown in FIGS. 1A and 1B, the image sensor 312 may be arranged on the top side 110d of the enclosure 110 so as to have the meal placed at the base 114 in a field-of-view thereof. For purposes of the present disclosure, the term “image sensor” is used broadly herein to refer to a device that converts an optical or visual image into an electrical signal. The image sensor 312 can include, by way of example, a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) active pixel sensor, or other semiconductor image chip. Herein, the image sensor 312 is configured to capture an optical image of the meal.
[0044] In an embodiment, the set of first sensors 310 includes a spectroscopic sensor 314. In the present meal delivery system 100, the spectroscopic sensor 314 is arranged in the enclosure 110. In an example, as shown in FIGS. 1A and 1B, the spectroscopic sensor 314 may be arranged on the top side 110d of the enclosure 110 so as to have the meal placed at the base 114 in a field-of-view thereof. For purposes of the present disclosure, the term “spectroscopic sensor” is used broadly herein to refer to a device that is based on the principle of interaction between molecules and light. The spectroscopic sensor 314 enables simultaneous real-time monitoring of various critical process parameters including biological, chemical, and physical variables. The spectroscopic sensor 314 may be configured to perform non-destructive and non-invasive measurement in real-time operation. Herein, the spectroscopic sensor 314 is configured to capture a spectroscopic image of the meal.
[0045] In an embodiment, the set of first sensors 310 includes a thermal sensor 316. In the present meal delivery system 100, the thermal sensor 316 is arranged in the enclosure 110. In an example, as shown in FIGS. 1A and 1B, the thermal sensor 316 may be arranged on the top side 110d of the enclosure 110 so as to have the meal placed at the base 114 in a field-of-view thereof. For purposes of the present disclosure, the term “thermal sensor” is used broadly herein to refer to temperature sensing devices, such as thermistors, thermocouples, resistor thermometers, any thermally resistive material, and other temperature sensing devices. Herein, the thermal sensor 316 is configured to measure a temperature of at least one of the one or more food items in the meal.
[0046] In an embodiment, the set of first sensors 310 includes one or more load cell sensors 318. In the present meal delivery system 100, the one or more load cell sensors 318 are arranged in the enclosure 110. In an example, as shown in FIG. 1B, the one or more load cell sensors 318 may be arranged on the bottom side 110c of the enclosure 110 so as to have the meal placed at the base 114 be located on top thereof. This way the one or more load cell sensors 318 may be able to measure weight of the at least one of the one or more food items in the meal. In example embodiments, the load cell sensors 318 may be distributed at the base 114 so as to measure weights of different sections of a container containing the meal. By mapping with the image of the meal (such as, the optical image captured by the image sensor 312), it may be possible to determine individual weights of different food items in the meal using the load cell sensors 318, as may be contemplated by a person skilled in the art.
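The per-item weight determination described above can be sketched as a mapping step: each load cell under a container section reports a reading, and the optical image identifies which food item occupies each section. The section and food item names below are illustrative assumptions:

```python
# Load cell readings per container section, in grams (illustrative values)
cell_readings_g = {"section_a": 180.0, "section_b": 95.0, "section_c": 60.0}

# Mapping from container section to food item, as derived from the
# optical image captured by the image sensor (assumed result)
image_mapping = {"section_a": "rice", "section_b": "lentils", "section_c": "vegetables"}

# Combine the two to obtain individual food item weights
item_weights = {image_mapping[s]: w for s, w in cell_readings_g.items()}
print(item_weights)  # {'rice': 180.0, 'lentils': 95.0, 'vegetables': 60.0}

# Total meal weight is the sum over all sections
total = sum(item_weights.values())
print(total)  # 335.0
```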
[0047] Also, as shown in FIG. 3, the meal delivery system 100 includes at least one second sensor 320 arranged in the enclosure 110. Herein, the second sensor 320 is configured to capture information about one or more characteristics of the individual receiving the meal currently placed in the space 112 of the enclosure 110. For this purpose, as shown in FIGS. 1A and 1B, the second sensor 320 may be arranged at the front side 110a of the enclosure 110 so as to have at least a face of the individual receiving the meal in a field-of-view thereof. In an embodiment, the second sensor 320 may include an infrared sensor 322 configured to capture physical traits of the individual, which may be used to determine a Body Mass Index (BMI) of the individual (as discussed later in more detail). In an alternative or additional embodiment, the second sensor 320 may include an image sensor 324 configured to capture a facial image of the individual, which may be used to identify the individual receiving the meal (as discussed later in more detail).
[0048] Also, as shown in FIG. 3, the meal delivery system 100 includes a third sensor 330 arranged in the enclosure 110. Herein, the third sensor 330 is configured to capture at least one trait of an individual receiving the meal currently placed in the space 112 of the enclosure 110. For this purpose, as shown in FIGS. 1A and 1B, the third sensor 330 may be arranged at the front side 110a of the enclosure 110 so as to capture the at least one trait of the individual receiving the meal. In an example, the third sensor 330 may be an image sensor and is configured to capture a facial image of the individual receiving the meal, as the trait for that individual. As may be appreciated, in such case, the functioning of the second sensor 320 and the third sensor 330 may be integrated in a single sensor arranged in the enclosure 110 of the meal delivery system 100.
[0049] Further, as illustrated in FIG. 3, the meal delivery system 100 includes a machine learning model 340. For purposes of the present disclosure, the machine learning model 340 is trained on the desired qualitative and/or quantitative properties for the one or more food items in relation to multiple characteristics of different individuals. For instance, the machine learning model 340 may be trained on data regarding nutritional requirements (such as, daily nutritional requirements) of an individual based on the BMI of that particular individual. Further, the machine learning model 340 may be trained on data regarding meal preference (such as, vegetarian, non-vegetarian, vegan, kosher, etc.) of an individual based on an identity (such as, facial identity) of that particular individual. Such implementation of the machine learning models may be contemplated by a person skilled in the art and thus has not been explained further for the brevity of the present disclosure.
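As a hedged illustration of the kind of mapping such a trained model may provide, the following sketch performs a nearest-neighbour lookup from a BMI value to a daily calorie target. The training pairs are invented solely for the example and do not represent actual nutritional guidance or the training data of the model 340.

```python
# Hypothetical sketch of the trained mapping in model 340: given an
# individual's BMI, return the nearest known daily calorie target.
# (BMI, calories) pairs below are illustrative assumptions only.
TRAINING = [(16.0, 2400), (18.5, 2200), (22.0, 2000), (27.0, 1800), (32.0, 1600)]

def desired_calories(bmi):
    # pick the training example whose BMI is closest to the query BMI
    return min(TRAINING, key=lambda pair: abs(pair[0] - bmi))[1]
```

A production system would likely use a regression or classification model trained on dietary datasets; the lookup above only illustrates the input/output contract.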
[0050] Further, as illustrated in FIG. 3, the meal delivery system 100 includes a controller 350. The controller 350 works in conjunction with (i.e., executes) the machine learning model 340 to process the sensed parameters from the sensors 310, 320, 330. As used herein, the term “controller” generically includes the known types of analog and digital logic control implementations that can be used to implement a control circuit for the meal delivery system 100, and may refer to circuit implementations utilizing such circuits for transforming an electrical signal in accordance with a mathematical operation or algorithm. A person skilled in the control system art may recognize that the controller 350 may be implemented with analog circuits, digital circuits, or combinations thereof. The mathematical operations of the controller 350 may be implemented with any of a variety of commercially available microprocessors, microcontrollers, or other computing circuits. As known in the current state of the art, analog circuit and mathematical operations may be economically performed by software-programmed digital circuits having software algorithms that simulate analog circuit operations and perform mathematical operations. Many of these operations can be performed by discrete logic, programmable logic array (PLA), programmable gate array (PGA) or digital signal processor (DSP) implementations, as well as by microprocessors or microcontrollers, as known in the art.
[0051] In one or more embodiments, as illustrated in FIG. 4, the controller 350 includes a processing unit 405 for running software applications and optionally an operating system. A memory 410 stores applications and data for use by the processing unit 405. A storage 415 provides non-volatile storage for applications and data and may include fixed disk drives, removable disk drives, flash memory devices, and CD-ROM, DVD-ROM, or other optical storage devices. An optional user input device 420 includes devices that communicate user inputs from one or more users to the controller 350 and may include keyboards, mice, joysticks, touch screens, etc. A communication or network interface 425 is provided which allows the controller 350 to communicate with other computer systems via an electronic communications network, including wired and/or wireless communication and including an Intranet or the Internet. In one embodiment, the controller 350 receives instructions and user inputs from a remote computer through the communication interface 425. The communication interface 425 can comprise a transmitter and receiver for communicating with remote devices. An optional display device 450 may be provided which can be any device capable of displaying visual information in response to a signal from the controller 350. The components of the controller 350, including the processing unit 405, the memory 410, the data storage 415, the user input devices 420, the communication interface 425, and the display device 450, may be coupled via one or more data buses 460.
[0052] As illustrated in FIG. 4, a graphics system 430 may be coupled with the data bus 460 and the components of the controller 350. The graphics system 430 may include a physical graphics processing unit (GPU) 435 and graphics memory. The GPU 435 generates pixel data for output images from rendering commands. The physical GPU 435 can be configured as multiple virtual GPUs that may be used in parallel (concurrently) by a number of applications or processes executing in parallel. For example, mass scaling processes for rigid bodies or a variety of constraint solving processes may be run in parallel on the multiple virtual GPUs. Graphics memory may include a display memory 440 (e.g., a framebuffer) used for storing pixel data for each pixel of an output image. In another embodiment, the display memory 440 and/or additional memory 445 may be part of the memory 410 and may be shared with the processing unit 405. Alternatively, the display memory 440 and/or additional memory 445 can be one or more separate memories provided for the exclusive use of the graphics system 430. In another embodiment, graphics system 430 includes one or more additional physical GPUs 455, similar to the GPU 435. Each additional GPU 455 may be adapted to operate in parallel with the GPU 435. Each additional GPU 455 generates pixel data for output images from rendering commands. Each additional physical GPU 455 can be configured as multiple virtual GPUs that may be used in parallel (concurrently) by a number of applications or processes executing in parallel, e.g., processes that solve constraints. Each additional GPU 455 can operate in conjunction with the GPU 435, for example, to simultaneously generate pixel data for different portions of an output image, or to simultaneously generate pixel data for different output images. 
Each additional GPU 455 can be located on the same circuit board as the GPU 435, sharing a connection with the GPU 435 to the data bus 460, or each additional GPU 455 can be located on another circuit board separately coupled with the data bus 460. Each additional GPU 455 can also be integrated into the same module or chip package as the GPU 435. Each additional GPU 455 can have additional memory, similar to the display memory 440 and additional memory 445, or can share the memories 440 and 445 with the GPU 435. It is to be understood that the circuits and/or functionality of GPU as described herein could also be implemented in other types of processors, such as general-purpose or other special-purpose coprocessors, or within a CPU.
[0053] In particular, the controller 350 is configured to process the detected at least one parameter, from the set of first sensors 310, related to the one or more food items in the meal currently placed in the space 112 of the enclosure 110 to determine qualitative and/or quantitative properties for the one or more food items in the meal. For this purpose, in an embodiment, the controller 350 is configured to analyse the captured optical image of the meal, via the image sensor 312, to determine one or more of size, shape, colour, texture, volume of the one or more food items in the meal. For example, for breads like rotis, it may be determined whether the bread is overcooked based on the presence of “burnt” (or “black”) regions thereon. It may be appreciated that calibration provides colour measurement accurate to better than 0.5%. Image processing algorithms identify the breads within the optical image and provide data for the machine learning model 340 to assess the bake quality. The combination of these technologies provides a machine vision system for bake quality assessment. For determining volume, which is an indicator of the quantity, the optical image may be analysed for the area covered and the depth of the corresponding food item in the container in which the meal is being served.
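A minimal sketch of the burnt-region check described above might threshold greyscale pixel values within the bread region and flag the item as overcooked when the burnt fraction is too high. The threshold values (60 on a 0-255 scale, 5% burnt fraction) are illustrative assumptions, not parameters from the disclosure.

```python
# Illustrative sketch: flag a roti as overcooked when the fraction of
# near-black ("burnt") pixels in its image region exceeds a threshold.
# black_level and max_burnt_fraction are assumed example values.
def is_overcooked(grey_pixels, black_level=60, max_burnt_fraction=0.05):
    """grey_pixels: flat list of 0-255 greyscale values for the bread region."""
    burnt = sum(1 for p in grey_pixels if p < black_level)
    return burnt / len(grey_pixels) > max_burnt_fraction
```

A deployed system would first segment the bread from the optical image (e.g., the identification step mentioned above) before applying such a per-region test.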
[0054] Alternatively, or additionally, the controller 350 is configured to analyse the captured spectroscopic image of the meal, via the spectroscopic sensor 314, to determine the presence of adulterants in the one or more food items in the meal. For example, in food items like dal, metanil yellow or other artificial colours may be added to improve its colour. Traditionally, its presence in dal can be tested by adding a few drops of HCl to a test sample; if the solution turns pink in colour, this indicates the presence of metanil yellow. In present embodiments, the FTIR spectroscopy (Fourier-transform infrared spectroscopy) technique may be used on the captured spectroscopic image of the meal, via the spectroscopic sensor 314, to detect metanil yellow [See: Evaluation of Turmeric Powder Adulterated with Metanil Yellow Using FT-Raman and FT-IR Spectroscopy, Dhakal et al., incorporated herein by reference].
[0055] Alternatively, or additionally, the controller 350 is configured to analyse the measured temperature, via the thermal sensor 316, of the at least one of the one or more food items in the meal to check if the respective temperatures of the one or more food items are within corresponding acceptable temperature ranges. Alternatively, or additionally, the controller 350 is configured to analyse the measured weight of the at least one of the one or more food items in the meal, via the one or more load cell sensors 318, to check if the respective weights (quantity) of the one or more food items are within corresponding acceptable weight ranges.
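The temperature and weight range checks above can be sketched as simple per-item interval tests. The items, temperature ranges, and weight ranges below are assumptions chosen for illustration, not values specified in the disclosure.

```python
# Illustrative sketch: each food item passes if its measured temperature
# (thermal sensor 316) and weight (load cells 318) fall inside per-item
# acceptable ranges. All range values are hypothetical examples.
ACCEPTABLE = {
    "dal":  {"temp_c": (55, 75), "weight_g": (140, 220)},
    "rice": {"temp_c": (50, 70), "weight_g": (180, 260)},
}

def item_ok(item, temp_c, weight_g):
    r = ACCEPTABLE[item]
    return (r["temp_c"][0] <= temp_c <= r["temp_c"][1]
            and r["weight_g"][0] <= weight_g <= r["weight_g"][1])
```

In the system described, such per-item results would feed the approval/rejection signalling discussed in the following paragraphs.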
[0056] Subsequently, the controller 350 is configured to determine desired qualitative and/or quantitative properties for the one or more food items in the meal. For this purpose, the controller 350 is configured to first determine one or more characteristics of the individual receiving the meal currently placed in the space 112 of the enclosure 110. This is achieved by using the second sensor 320 as discussed in the preceding paragraphs. Herein, the one or more characteristics may be physical characteristics of the individual, meal preference characteristics of the individual, and the like. In an embodiment, the controller 350 is configured to determine the Body Mass Index (BMI) of the individual based on the captured physical traits of the individual as the one or more characteristics of the individual. Alternatively, or additionally, the controller 350 is configured to determine an identity of the individual based on the captured facial image of the individual and further determine a meal preference of the individual based on the determined identity of the individual as the one or more characteristics of the individual. Thereafter, the controller 350 is configured to implement the machine learning model 340 to determine the desired qualitative and/or quantitative properties for the one or more food items in the meal currently placed in the space of the enclosure based on the determined one or more characteristics of the individual receiving the meal. For instance, as discussed, the machine learning model 340 trained on data regarding nutritional requirements (such as, daily nutritional requirements) of an individual for his/her BMI may be implemented to provide the desired nutritional requirement for the one or more food items in the meal based on the determined BMI of the particular individual. Similarly, the machine learning model 340 trained on data regarding meal preference (such as, vegetarian, non-vegetarian, vegan, kosher, etc.) 
of an individual in consideration of his/her identity (such as, facial identity) may be implemented to provide the desired meal preference for the one or more food items in the meal based on the determined identity of the particular individual.
[0057] The controller 350 is further configured to compare the determined qualitative and/or quantitative properties for the one or more food items in the meal with the desired qualitative and/or quantitative properties for the one or more food items in the meal. That is, the controller 350 may check if the determined qualitative and/or quantitative properties for the one or more food items in the meal match (or are within an acceptable tolerance of) the desired qualitative and/or quantitative properties for the one or more food items in the meal. In an example, the said acceptable tolerance may be defined to be within ±10% of the corresponding desired qualitative and/or quantitative properties for the one or more food items in the meal.
[0058] The controller 350 is further configured to generate a first signal indicative of approval for the meal if the determined qualitative and/or quantitative properties for the one or more food items in the meal is within a corresponding threshold of the desired qualitative and/or quantitative properties for the one or more food items in the meal and a second signal indicative of rejection for the meal if the determined qualitative and/or quantitative properties for the one or more food items in the meal is not within the corresponding threshold of the desired qualitative and/or quantitative properties for the one or more food items in the meal. In an embodiment, the controller 350 is configured to generate the first signal if the determined one or more of size, shape, colour, texture, volume is within a corresponding respective qualitative and/or quantitative range threshold and the second signal if the determined quality and/or quantity is not within the corresponding respective qualitative and/or quantitative range threshold. Alternatively, or additionally, the controller 350 is configured to generate the first signal if there is no presence of adulterants in the meal and the second signal if there is presence of adulterants in the meal. Alternatively, or additionally, the controller 350 is configured to generate the first signal if the measured temperature is within a corresponding temperature range threshold and the second signal if the measured temperature is not within the corresponding temperature range threshold. Alternatively, or additionally, the controller 350 is configured to generate the first signal if the measured weight is within a corresponding weight range threshold and the second signal if the measured weight is not within the corresponding weight range threshold.
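The comparison and signalling logic of the two preceding paragraphs can be sketched as follows: approve (first signal) when every determined property lies within ±10% of its desired value, otherwise reject (second signal). The property names and values are illustrative assumptions.

```python
# Illustrative sketch of the first/second signal generation: every determined
# property must fall within a tolerance (here the example ±10%) of the desired
# value for the meal to be approved. Property names are hypothetical.
def meal_signal(determined, desired, tolerance=0.10):
    for prop, want in desired.items():
        got = determined.get(prop)
        if got is None or abs(got - want) > tolerance * want:
            return "REJECT"   # second signal: rejection of the meal
    return "APPROVE"          # first signal: approval of the meal
```

Binary checks such as adulterant presence could be folded in as pass/fail properties alongside the numeric ones; the sketch shows only the numeric tolerance case.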
[0059] In some embodiments, the controller 350 is further configured to confirm duplicity in case of the individual having previously received meal in an ongoing meal session. This may be achieved by the controller 350 using the captured at least one trait of the individual, via the third sensor 330. As discussed, in some examples, the third sensor 330 may be an image sensor and the captured trait may be a facial data of the individual. Using the facial data, the controller 350 can check if the same individual may be taking unfair advantage to receive meal multiple times. In such case, the controller 350 may be configured to generate an alarm or the like. In another embodiment, the controller 350 is configured to mark attendance for the individual based on the captured at least one trait of the individual. That is, based on the captured trait (like facial data) and using a database of individuals with such trait information, the controller 350 can confirm if a listed individual (i.e., an individual with information available in the database) may have received the meal or not, and mark attendance for the corresponding individual as an indicator or a record for receipt of the meal by that individual. For example, in school serving meals (like mid-day meal programme), the present meal delivery system 100 may be implemented to mark attendance of students who may have received the meal on a particular day.
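The duplicity check and attendance marking described above amount to session-scoped bookkeeping over individual identities. A minimal sketch, assuming identities have already been resolved from the captured trait (e.g., facial data via the third sensor 330):

```python
# Illustrative sketch: track individuals served in the ongoing meal session.
# A first serving marks attendance; a repeat triggers the alarm path.
# Identifier strings are hypothetical.
class MealSession:
    def __init__(self):
        self.served = set()   # attendance record for this session

    def register(self, individual_id):
        if individual_id in self.served:
            return "DUPLICATE"          # alarm: meal already received
        self.served.add(individual_id)
        return "ATTENDANCE_MARKED"      # record receipt of the meal
```

The real system would key this on identities matched against the database of individuals mentioned above, and persist the record beyond a single process.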
[0060] In some embodiments, as shown in FIG. 3, the meal delivery system 100 may further include a display 360 to display information about the meal currently placed in the space 112 of the enclosure 110. Herein, the display 360 may be disposed in communication with the controller 350 to receive the said information. For example, the display 360 may show information about nutritional properties of the various food items in the meal and/or the temperature of the various food items in the meal, as determined using the controller 350. In present embodiments, as shown in FIGS. 1A and 1B, the display 360 may be mounted at the front side 110a of the enclosure 110 so that the displayed information may be viewed/verified by the individual located in front of the meal delivery system 100, such as when loading the meal into the enclosure 110 or receiving/picking the meal from the enclosure 110.
[0061] In one or more embodiments, the meal delivery system 100 may further include a vending apparatus (not shown). Such vending apparatus may be disposed in the enclosure 110 to dispense the one or more food items of the meal. For this purpose, the vending apparatus may be arranged on the top side 110d of the enclosure 110 to allow for dispensing of the one or more food items of the meal into the container placed at the base 114 of the enclosure 110. In the present embodiments, the vending apparatus is configured to dispense the one or more food items of the meal based on the one or more characteristics of the individual. In an example, the vending apparatus may dispense the required quantity of food items to provide the desired nutritional requirement for the one or more food items in the meal based on the determined BMI of the particular individual. In another example, the vending apparatus may dispense the required type of food items to provide the desired meal preference for the one or more food items in the meal based on the determined identity of the particular individual. A person skilled in the art may understand the functioning mechanism of such vending apparatus and thus the same has not been described herein for the brevity of the present disclosure.
[0062] FIG. 5 illustrates an exemplary representation of a meal 10 comprising one or more food items 12, 14, 16 as may be placed at the base 114 (of the enclosure 110) of the meal delivery system 100, in accordance with one or more embodiments of the present disclosure. Herein, as may be seen, the meal 10 including the food items 12, 14, 16 is placed in a container 20. In the present example, the meal 10 has an egg as a first food item 12, dal as a second food item 14, and rice as a third food item 16. It may be appreciated that the shown food items and the number of shown food items are exemplary only and shall not be construed as limiting to the present disclosure in any manner. Further, FIG. 6 illustrates an exemplary representation of implementation of the meal delivery system 100 to determine qualitative and/or quantitative properties for the one or more food items 12, 14, 16 in the meal 10, in accordance with one or more embodiments of the present disclosure. In general, the meal delivery system 100 may map out different food items 12, 14, 16 in the meal 10, for example, as virtual sections A, B, C as shown to be able to independently determine qualitative and/or quantitative properties for each of the food items 12, 14, 16 in the meal 10. Such technique involving image processing may be contemplated by a person skilled in the art.
[0063] FIG. 7 illustrates an exemplary representation of implementation of the meal delivery system 100 to capture information about the one or more characteristics of the individual receiving the meal 10 currently placed in the enclosure 110, in accordance with one or more embodiments of the present disclosure. This way the present meal delivery system 100 may be able to determine one or more characteristics of the individual (such as, individual represented by the numeral 30 in FIG. 7) receiving the meal 10 currently placed in the space 112 of the enclosure 110, such as by using the second sensor 320 (as discussed in the preceding paragraphs). This, in turn, may be used to determine desired qualitative and/or quantitative properties for the one or more food items in the meal for that individual 30, and also be used to ensure that qualitative and/or quantitative properties of the one or more food items in the meal being received by the individual is within corresponding threshold of the desired qualitative and/or quantitative properties for the one or more food items in the meal.
[0064] FIG. 8 illustrates an exemplary representation of a first user interface 800 showing qualitative and/or quantitative properties for the one or more food items in the meal and characteristics of the individual (such as, the individual 30) receiving the meal using the meal delivery system 100, in accordance with one or more embodiments of the present disclosure. As shown, the first user interface 800 is a dashboard providing information about qualitative and/or quantitative properties for the food items, such as in a table including type of food item, its temperature, its portion size, its nutrition value, and so on. The first user interface 800 may also provide an annotated image of the meal with its qualitative and/or quantitative properties information mapped thereon. The first user interface 800 may further provide information about the individual (such as, the individual 30) receiving the meal utilizing pre-entered information in a database of individuals, such as, for verification/confirmation by the user who may be serving the meal using the meal delivery system 100. In the present embodiments, the first user interface 800 may be implemented (displayed) locally on the display 360 of the meal delivery system 100 for the reference of the user operating the meal delivery system 100 for delivering the meal and/or the individual using the meal delivery system 100 for receiving the meal. In example embodiments, the first user interface 800 may also be used to ensure that the meal is properly packed, such as in case of implementation of the meal delivery system 100 in a quick-service restaurant (QSR) or the like.
[0065] In some embodiments, the meal delivery system 100 may further be implemented for meal delivery data analytics. As discussed in the preceding paragraphs, the meal delivery system 100 can maintain records of identity of individuals having received the meals. The meal delivery system 100, or specifically the controller 350 therein, may be provided with a communication interface (not shown), such as equipped with Wi-Fi, GSM, CDMA, 2G, 3G, 4G, 5G antenna or the like, to allow for uploading of collected data into a cloud server (not shown). The cloud server may be configured to process the received data to generate insights related to delivery of meals using one or more of such meal delivery systems 100 across one or multiple sites. FIG. 9 illustrates an exemplary representation of a second user interface 900 showing data analysis for the meals delivered using the meal delivery system 100, in accordance with one or more embodiments of the present disclosure. As shown, the second user interface 900 may provide information about collective insights about meal delivery for one session, one day or multiple days. The information may include the number of meals delivered, the number of individuals served, and the like. Further, as shown, such information may be displayed in tabular and/or graphical form in the second user interface 900 for reference of, say, an organization responsible for delivery of meals using one or more meal delivery systems 100. In example embodiments, the meal delivery system 100 may further be implemented to determine utilization of raw-materials for preparation of meals at various sites and further be implemented to plan for procurement of raw-material for each of the sites based on the determined utilization of raw-materials and available information about inventory of raw-materials at each of the sites.
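The cloud-side aggregation described above — rolling up per-delivery records into per-site counts of meals and distinct individuals served — may be sketched as follows. The record fields are assumptions for the example.

```python
# Illustrative sketch: aggregate uploaded delivery records into the kind of
# per-site insights shown on the second user interface 900 (meals delivered,
# distinct individuals served). Field names are hypothetical.
from collections import defaultdict

def site_insights(records):
    """records: iterable of dicts with 'site' and 'individual' keys."""
    meals = defaultdict(int)
    people = defaultdict(set)
    for r in records:
        meals[r["site"]] += 1
        people[r["site"]].add(r["individual"])
    return {site: {"meals": meals[site], "individuals": len(people[site])}
            for site in meals}

records = [
    {"site": "A", "individual": "x"},
    {"site": "A", "individual": "x"},
    {"site": "A", "individual": "y"},
    {"site": "B", "individual": "z"},
]
insights = site_insights(records)
```

Extensions toward the raw-material planning mentioned above would add per-meal ingredient quantities to each record and sum those per site.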
[0066] The foregoing descriptions of specific embodiments of the present disclosure have been presented for purposes of illustration and description. They are not intended to be exhaustive or to limit the present disclosure to the precise forms disclosed, and obviously many modifications and variations are possible in light of the above teaching. The exemplary embodiment was chosen and described in order to best explain the principles of the present disclosure and its practical application, to thereby enable others skilled in the art to best utilize the present disclosure and various embodiments with various modifications as are suited to the particular use contemplated.

Claims:

WE CLAIM:
1. A meal delivery system comprising:
an enclosure providing a space to allow for placement of a meal comprising one or more food items therein;
a set of first sensors incorporated in the enclosure, wherein each first sensor in the set of first sensors is configured to detect at least one qualitative and/or quantitative parameter related to the one or more food items in the meal currently placed in the space of the enclosure; and
a controller configured to:
process the detected at least one parameter, from the set of first sensors, related to the one or more food items in the meal currently placed in the space of the enclosure to determine qualitative and/or quantitative properties for the one or more food items in the meal,
determine desired qualitative and/or quantitative properties for the one or more food items in the meal,
compare the determined qualitative and/or quantitative properties for the one or more food items in the meal with the desired qualitative and/or quantitative properties for the one or more food items in the meal, and
generate a first signal indicative of approval for the meal if the determined qualitative and/or quantitative properties for the one or more food items in the meal is within a corresponding threshold of the desired qualitative and/or quantitative properties for the one or more food items in the meal and a second signal indicative of rejection for the meal if the determined qualitative and/or quantitative properties for the one or more food items in the meal is not within the corresponding threshold of the desired qualitative and/or quantitative properties for the one or more food items in the meal.
2. The meal delivery system as claimed in claim 1, wherein the enclosure is a collapsible enclosure configured to be transformed into an operation state and a collapsed state, and wherein the enclosure is adapted to provide the space to allow for the placement of the meal therein in the operation state thereof.
3. The meal delivery system as claimed in claim 1, wherein the set of first sensors comprises an image sensor arranged in the enclosure to capture an optical image of the meal, and wherein the controller is configured to:
analyse the captured optical image of the meal to determine one or more of size, shape, colour, texture, volume of at least one of the one or more food items in the meal, and
generate the first signal if the determined one or more of size, shape, colour, texture, volume is within a corresponding respective qualitative and/or quantitative range threshold and the second signal if the determined quality and/or quantity is not within the corresponding respective qualitative and/or quantitative range threshold.
4. The meal delivery system as claimed in claim 1, wherein the set of first sensors comprises a spectroscopic sensor arranged in the enclosure to capture a spectroscopic image of the meal, and wherein the controller is configured to:
analyse the captured spectroscopic image of the meal to determine presence of adulterants in the one or more food items in the meal, and
generate the first signal if there is no presence of adulterants in the meal and the second signal if there is presence of adulterants in the meal.
5. The meal delivery system as claimed in claim 1, wherein the set of first sensors comprises a thermal sensor arranged in the enclosure to measure a temperature of at least one of the one or more food items in the meal, and wherein the controller is configured to:
analyse the measured temperature of the at least one of the one or more food items in the meal, and
generate the first signal if the measured temperature is within a corresponding temperature range threshold and the second signal if the measured temperature is not within the corresponding temperature range threshold.
6. The meal delivery system as claimed in claim 1, wherein the set of first sensors comprises one or more load cell sensors arranged in a base of the enclosure to measure weight of at least one of the one or more food items in the meal placed thereon, and wherein the controller is configured to:
analyse the measured weight of the at least one of the one or more food items in the meal, and
generate the first signal if the measured weight is within a corresponding weight range threshold and the second signal if the measured weight is not within the corresponding weight range threshold.
7. The meal delivery system as claimed in claim 1 further comprising:
a machine learning model trained on the desired qualitative and/or quantitative properties for the one or more food items in relation to multiple characteristics of different individuals,
wherein the controller is configured to:
determine one or more characteristics of an individual receiving the meal currently placed in the space of the enclosure, and
implement the machine learning model to determine the desired qualitative and/or quantitative properties for the one or more food items in the meal currently placed in the space of the enclosure based on the determined one or more characteristics of the individual receiving the meal.
8. The meal delivery system as claimed in claim 7 further comprising a vending apparatus disposed in the enclosure, wherein the vending apparatus is configured to dispense the one or more food items of the meal based on the one or more characteristics of the individual.
9. The meal delivery system as claimed in claim 7 further comprising at least one second sensor configured to capture information about the one or more characteristics of the individual receiving the meal currently placed in the space of the enclosure, wherein the at least one second sensor comprises one or more of:
an infrared sensor configured to capture physical traits of the individual, wherein the controller is configured to determine a Body Mass Index (BMI) of the individual based on the captured physical traits of the individual as the one or more characteristics of the individual, and
an image sensor configured to capture a facial image of the individual, wherein the controller is configured to determine an identity of the individual based on the captured facial image of the individual and further determine a meal preference of the individual based on the determined identity of the individual as the one or more characteristics of the individual.
10. The meal delivery system as claimed in claim 1 further comprising a third sensor configured to capture at least one trait of an individual receiving the meal currently placed in the space of the enclosure, wherein the controller is configured to one or more of:
confirm duplicity in case of the individual having previously received meal in an ongoing meal session based on the captured at least one trait of the individual, and
mark attendance for the individual based on the captured at least one trait of the individual.
| # | Name | Date |
|---|---|---|
| 1 | 202311005471-FORM 18 [27-01-2023(online)].pdf | 2023-01-27 |
| 2 | 202311005471-FORM 1 [27-01-2023(online)].pdf | 2023-01-27 |
| 3 | 202311005471-FIGURE OF ABSTRACT [27-01-2023(online)].pdf | 2023-01-27 |
| 4 | 202311005471-DRAWINGS [27-01-2023(online)].pdf | 2023-01-27 |
| 5 | 202311005471-DECLARATION OF INVENTORSHIP (FORM 5) [27-01-2023(online)].pdf | 2023-01-27 |
| 6 | 202311005471-COMPLETE SPECIFICATION [27-01-2023(online)].pdf | 2023-01-27 |
| 7 | 202311005471-Proof of Right [06-02-2023(online)].pdf | 2023-02-06 |
| 8 | 202311005471-FORM-26 [06-02-2023(online)].pdf | 2023-02-06 |
| 9 | 202311005471-Others-090223.pdf | 2023-02-10 |
| 10 | 202311005471-GPA-090223.pdf | 2023-02-10 |
| 11 | 202311005471-Correspondence-090223.pdf | 2023-02-10 |
| 12 | 202311005471-FER.pdf | 2025-08-08 |
| 13 | 202311005471-FORM 3 [25-09-2025(online)].pdf | 2025-09-25 |
| 1 | 202311005471_SearchStrategyNew_E_mealdeliveryE_22-04-2025.pdf | |