
An Augmented Reality Based Intelligent Recommendation System And A Method Thereof

Abstract: Disclosed is an augmented reality based recommendation system (101) and method (1000). The system comprises a processor (201) and memory (205) comprising a database (211). The processor (201) receives medical records from a user and processes the records using a machine learning technique for deriving a medical profile. A wearable device (103-4) tracks physical activities of the user for generating a behavioural profile. The medical profile and the behavioural profile of the user are maintained in the database (211), further comprising pre-stored product information. The product information comprises nutritional contents of the product. An image capturing means of the user device (103-1) scans the product. The processor (201) retrieves nutritional content of the product and analyses the nutritional content of the product based upon the medical profile and the behavioural profile of the user. A recommendation information of the product is displayed. [To be published with Figure 3]


Patent Information

Application #
Filing Date
17 December 2019
Publication Number
25/2021
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Email
ip@stratjuris.com
Parent Application

Applicants

Zensar Technologies Limited
Zensar Knowledge Park, Plot # 4, Midc, Kharadi, Off Nagar Road, Pune-411014, Maharashtra, India

Inventors

1. Ritika Deepak Chawla
Zensar Technologies, Plot #4, MIDC, Kharadi, Off Nagar Road, Pune, Maharashtra 411014
2. Vikram Shrimantrao Samdare
Zensar Technologies, Plot #4, MIDC, Kharadi, Off Nagar Road, Pune, Maharashtra 411014
3. Nilesh Prakash Parakh
Zensar Technologies, Plot #4, MIDC, Kharadi, Off Nagar Road, Pune, Maharashtra 411014
4. Juhi Ajmera
Zensar Technologies, Plot #4, MIDC, Kharadi, Off Nagar Road, Pune, Maharashtra 411014

Specification

Claims:

WE CLAIM:

1. An augmented reality based intelligent recommendation system (101), comprising:
a processor (201); and
a memory (205) coupled with the processor (201), wherein the processor (201) is configured to execute programmed instructions stored in the memory (205), the programmed instructions comprising instructions for:
receiving a plurality of medical records from a user;
processing the plurality of medical records using a machine learning technique in order to derive a medical profile of the user;
tracking one or more physical activities of the user via a wearable device (103-4) in communication with the processor (201) in order to generate a behavioural profile of the user;
maintaining the medical profile and the behavioural profile of the user in a database (211) present in the memory (205), wherein the database (211) further comprises pre-stored product information associated to a plurality of products, and wherein the pre-stored product information comprises nutritional contents associated with each product;
scanning, in a real time, an image associated to a product using an image capturing means of a user device (103-1);
retrieving nutritional content based upon the scanned image of the product;
analyzing the nutritional content of the product based upon the medical profile and the behavioural profile of the user; and
displaying recommended information, in an augmented reality environment, on the user device (103-1), wherein the recommended information comprises the nutritional content visualized in a manner to indicate whether or not the product is recommended to the user based upon the analysis of the nutritional content of the product.

2. The system (101) as claimed in claim 1, wherein the medical records received are in assorted formats and comprise one or more of past and current medical reports, medication list, prescriptions, allergy details, operation details, test-reports, and medical scans.

3. The system (101) as claimed in claim 2, wherein the processing of the plurality of medical records using the machine learning technique at least comprises:
scanning a plurality of text sections in one or more of the medical records;
splitting the plurality of text sections into separate words;
discarding delimiters, wherein delimiters comprise at least antecedents;
comparing the separate words with content keywords stored in memory, wherein the content keywords are scanned from the medical records to form a list of content keywords;
extracting matching words, from the separate words, matching with the list of content keywords;
creating a matrix of co-occurrence of the matching words; and
estimating a score of the matching words, wherein the score of the matching words is estimated by dividing a sum of a number of co-occurrences of the matching words with the list of content keywords and multiplying by a number of times the matching words appear in the medical records.

4. The system (101) as claimed in claim 1, wherein the analysis of the nutritional content of the product comprises:
estimating a nutritional level of the user based upon the medical profile and the behavioural profile of the user in a real time; and
determining a recommendation score by comparing the nutritional content of the product and the nutritional level of the user in a real-time.

5. The system (101) as claimed in claim 4, wherein the nutritional content in the recommendation information is visualized in the form of at least one of a tick sign or a cross sign, or highlighted in different colour codes.

6. The system (101) as claimed in claim 1, wherein the nutritional content is retrieved by performing at least one of:
fetching a corresponding product identifier of the product and matching the product identifier with the product identifier stored in a product image database (509) containing product nutritional content details; and
scanning the image of the product to extract one or more contents of the product and thereby derive the nutritional content of the product, in real time.

7. A method for an augmented reality based intelligent recommendation system comprising:

receiving via a processor (201), a plurality of medical records from a user;
processing via the processor (201), the plurality of medical records using a machine learning technique in order to derive a medical profile of the user;
tracking one or more physical activities of the user via a wearable device (103-4) in communication with the processor (201) in order to generate a behavioural profile of the user;
maintaining the medical profile and the behavioural profile of the user in a database (211) present in the memory (205), wherein the database (211) further comprises pre-stored product information associated to a plurality of products, wherein the pre-stored product information comprises nutritional contents associated with each product;
scanning in a real time, an image associated to a product using an image capturing means of a user device (103-1);
retrieving via the processor (201), nutritional content, based upon the scanned image of the product;
analyzing via the processor (201), the nutritional content of the product based upon the medical profile and the behavioural profile of the user; and
displaying a recommendation information, in an augmented reality environment, on the user device (103-1), wherein the recommendation information comprises the nutritional content visualized in a manner to indicate whether or not the product is recommended to the user based upon the analysis of the nutritional content of the product.

8. The method as claimed in claim 7, wherein the processing of the plurality of medical records using the machine learning technique at least comprises:
scanning, via the processor (201), a plurality of text sections in one or more of the medical records;
splitting, via the processor (201), the plurality of text sections into separate words;
discarding via the processor (201), delimiters, wherein the delimiters comprise at least antecedents;
comparing, via the processor (201), the separate words with content keywords stored in memory, wherein the content keywords are scanned from the medical records to form a list of content keywords;
extracting, via the processor (201), matching words, from the separate words, matching with the list of content keywords;
creating via the processor (201), a matrix of co-occurrence of the matching words; and
estimating, via the processor (201), a score of the matching words, wherein the score of the matching words is estimated by dividing a sum of a number of co-occurrences of the matching words with the list of content keywords and multiplying by a number of times the matching words appear in the medical records.

9. The method as claimed in claim 7, wherein the analysis of the nutritional content of the product comprises:
estimating, via the processor (201), a nutritional level of the user based upon the medical profile and the behavioural profile of the user in a real time; and
determining, via the processor (201), a recommendation score by comparing the nutritional content of the product and the nutritional level of the user in a real-time.

10. The method as claimed in claim 7, wherein the nutritional content is retrieved by performing at least one of:
fetching a corresponding product identifier of the product and matching the product identifier with the product identifier stored in a product image database (509) containing product nutritional content details; and
scanning the image of the product to extract one or more contents of the product and thereby derive the nutritional content of the product, in real time.
Dated this 17th Day of December 2019


Priyank Gupta
Agent for the Applicant
IN/PA- 1454
Description:

FORM 2

THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENT RULES, 2003

COMPLETE SPECIFICATION

(See Section 10 and Rule 13)

Title of invention:
AN AUGMENTED REALITY BASED INTELLIGENT RECOMMENDATION SYSTEM AND A METHOD THEREOF

APPLICANT

Zensar Technologies Limited,
an Indian Entity,
having address as:
Zensar Knowledge Park, Plot # 4, Midc, Kharadi, Off Nagar Road, Pune-411014,
Maharashtra, India

The following specification describes the invention and the manner in which it is to be performed.

CROSS-REFERENCE TO RELATED APPLICATIONS AND PRIORITY
The present application does not claim priority from any other patent application.

TECHNICAL FIELD
The present subject matter described herein, in general, relates to a recommendation system. More particularly, the present subject matter is related to an augmented reality based intelligent recommendation system and a method thereof.
BACKGROUND
In the recent past, people have become proactive about health care. The availability of a wide variety of food products in the market has increased, and people are now conscious about the effect of consuming these food products on their health. Moreover, people are often confused about whether a certain food product is beneficial or harmful for their health.
Each day, a large number of people around the world are diagnosed with various allergies and illnesses. In such situations, it is necessary for patients to consciously choose food products for themselves from those available in the market. It is not always possible for patients to consult their doctor before consuming a food product.
Though solutions exist that may assist people in knowing, in an intuitive manner, the nutritional content of a packaged food to be consumed, none of the existing solutions considers the health condition of the consumer or displays the probable adverse effects of the packaged food before its consumption.
SUMMARY
This summary is provided to introduce the concepts related to an augmented reality based intelligent recommendation system and a method thereof, and the concepts are further described in the detailed description. This summary is not intended to identify essential features of the claimed subject matter, nor is it intended for use in determining or limiting the scope of the claimed subject matter.

In one implementation, the present subject matter discloses an augmented reality based intelligent recommendation system. The system comprises a processor and a memory coupled with the processor, wherein the processor is configured to execute programmed instructions stored in the memory. The processor may be configured to execute a programmed instruction for receiving a plurality of medical records from a user. The processor may be configured to execute a programmed instruction for processing the plurality of medical records using a machine learning technique in order to derive a medical profile of the user. The processor may be configured to execute a programmed instruction for tracking one or more physical activities of the user via a wearable device in communication with the processor in order to generate a behavioural profile of the user. The processor may be configured to execute a programmed instruction for maintaining the medical profile and the behavioural profile of the user in a database present in the memory. In one aspect, the database may further comprise pre-stored product information associated to a plurality of products, wherein the product information comprises nutritional contents associated with each product. The processor may be configured to execute a programmed instruction for scanning, in a real time, an image associated to a product using an image capturing means of a user device. The processor may be configured to execute a programmed instruction for retrieving nutritional content based upon the scanned image of the product. The processor may be configured to execute a programmed instruction for analyzing the nutritional content of the product based upon the medical profile and the behavioural profile of the user. The processor may be configured to execute a programmed instruction for displaying recommendation information, in an augmented reality environment, on the user device.
In one aspect, the recommendation information may comprise the nutritional content visualized in a manner to indicate whether or not the product is recommended to the user based upon the analysis of the nutritional content of the product.
In another implementation, the present subject matter discloses a method for an augmented reality based intelligent recommendation system. The method may comprise receiving, via a processor, a plurality of medical records from a user. The method may comprise processing, via the processor, the plurality of medical records using a machine learning technique in order to derive a medical profile of the user. The method may further comprise tracking one or more physical activities of the user via a wearable device in communication with the processor in order to generate a behavioural profile of the user. Further, the method may comprise maintaining, via the processor, the medical profile and the behavioural profile of the user in a database present in the memory, wherein the database further comprises pre-stored product information associated to a plurality of products, wherein the product information comprises nutritional contents associated with each product. The method may further comprise scanning, in a real time, an image associated to a product using an image capturing means of a user device. The method may further comprise retrieving, via the processor, nutritional content based upon the scanned image of the product. Further, the method may comprise analysing, via the processor, the nutritional content of the product based upon the medical profile and the behavioural profile of the user. Furthermore, the method may comprise displaying, via the processor, recommendation information, in an augmented reality environment, on the user device. In one aspect, the recommendation information may comprise the nutritional content visualized in a manner to indicate whether or not the product is recommended to the user based upon the analysis of the nutritional content of the product.
BRIEF DESCRIPTION OF DRAWINGS
The detailed description is described with reference to the accompanying figures.
Figure 1 illustrates an implementation 100 of an augmented reality based intelligent recommendation system 101, in accordance with an embodiment of the present subject matter.
Figure 2a and figure 2b illustrate machine learning models MediAid AI Engine and ActiTracker AI Engine belonging to the system 101, in accordance with an exemplary embodiment of the present subject matter.
Figure 3 illustrates a functional architecture 400 of the system 101, in accordance with an exemplary embodiment of the present subject matter.
Figure 4 illustrates an architecture 500 of the database belonging to the system 101, in accordance with an exemplary embodiment of the present subject matter.
Figure 5 illustrates a method 600 depicting a scenario wherein a product is recommended to the user by scanning the product and fetching diet details of the user from the database, in accordance with an exemplary embodiment of the present subject matter.
Figure 6 illustrates a method 700 depicting a scenario wherein a product is being recommended to a diabetic patient, in accordance with an exemplary embodiment of the present subject matter.
Figure 7 illustrates a method 800 implemented by the augmented reality based intelligent recommendation system 101, in accordance with the embodiment of the present subject matter.

DETAILED DESCRIPTION
Reference throughout the specification to “various embodiments,” “some embodiments,” “one embodiment,” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. Thus, appearances of the phrases “in various embodiments,” “in some embodiments,” “in one embodiment,” or “in an embodiment” in places throughout the specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures or characteristics may be combined in any suitable manner in one or more embodiments.
Figure 1 illustrates an implementation 100 of an augmented reality based intelligent recommendation system 101, in accordance with an embodiment of the present subject matter. In accordance with aspects of the present subject matter, the system (101) is enabled to improve societal health and wellness using the augmented reality environment. Previous research has shown a strong correlation between dietary choices and health conditions. Poor dietary choices can lead to an increased risk of poor health conditions such as obesity as well as chronic diseases such as cardiovascular disease and diabetes. In order to prevent such risks, the system (101) is configured to provide nutritional information to users at the point-of-purchase and improve customer decision-making about healthy food, thereby facilitating in improving general wellness of the public. An augmented reality based computing application provides real-time customized product recommendations based on lifestyle and medical history, which helps identify products that could have hazardous effects on the user’s health.
In one embodiment, the system (101) may be connected to a user device (103) over a network (102). It may be understood that the system (101) may be accessed by multiple users through one or more user devices (103-1), (103-2), (103-3) … (103-n), hereinafter collectively referred to as the user device (103), or through applications residing on the user device (103). In one embodiment, a user device 103 may be a wearable device 103-4. In alternative embodiments, the wearable device 103-4 may be a standalone device (as shown in Figure 1) separate from the user device 103 or may be incorporated within the user device 103. The user may be any person, machine, software, an automated computer program, a robot, or a combination thereof. In one embodiment, the user device (103-1) and the wearable device 103-4 may be used by a user of the system (101).
In an embodiment, the present subject matter is explained considering that the system (101) may be implemented in a variety of user devices, including but not limited to, server, a portable computer, a personal digital assistant, a handheld device, a mobile, a laptop computer, a desktop computer, a notebook, a workstation, a mainframe computer, and the like. In one embodiment, the system (101) may be implemented in a cloud-computing environment. In an embodiment, the network (102) may be a wireless network such as Bluetooth, Wi-Fi, LTE and such like, a wired network or a combination thereof. The network (102) can be accessed by the user device (103) using wired or wireless network connectivity means including updated communications technology.
In one embodiment, the network (102) can be implemented as one of the different types of networks, cellular communication network, Local Area Network (LAN), Wide Area Network (WAN), the internet, and the like. The network (102) may either be a dedicated network or a shared network. The shared network represents an association of the different types of networks that use a variety of protocols, for example, Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), Wireless Application Protocol (WAP), and the like, to communicate with one another. Further, the network (102) may include a variety of network devices, including routers, bridges, servers, computing devices, storage devices, and the like.
In an embodiment, the system (101) may be configured to recommend a food product to the user. Once the user registers with the system (101), the system (101) may be configured to receive medical records from the user and processes the records using a machine learning technique for deriving a medical profile of the user. Further, the system (101) may be configured to track one or more physical activities of the user for generating a behavioural profile of the user. The physical activities are tracked using the wearable device (103-4). In one embodiment, the wearable device (103-4) may include at least one of a smart wrist band, pendant, ring, waist band, ankle band, eyewear, pedometer, necklace, and the like. In one embodiment, the wearable device (103-4) may further include a plurality of sensors selected from, but are not limited to, a heart-beat sensor, an emotion sensor, an accelerometer, a gyro meter, a magnetometer, a pulse rate sensor, an EEG sensor, a blood pressure sensor, and the like.
In one embodiment, the medical profile and the behavioural profile of the user may be maintained in a database (211) of the memory (205). The database (211) may further comprise pre-stored information of various products available in the market. The product information may comprise nutritional contents of the product. The user may scan an image of a product via an image capturing means of the user device (103-1). The system (101), in real-time, retrieves the nutritional content of the identified product from the database (211). The system (101) immediately analyses the nutritional content of the product based upon the medical profile and the behavioural profile of the user and displays recommendation information of the product on the display screen of the user device (103-1). The product recommendation information may comprise information regarding whether the product is safe for consumption by the user. In one embodiment, the products may comprise, but may not be limited to, packed or unpacked food products such as cookies, juices, powders, grains, vegetables, fruits, and the like.
In one embodiment, the system (101) may be further configured to alert the user when the identified food product, if consumed, may cause an allergic reaction in the user. The food product is determined to be allergenic to the user based on the medical history, the health parameters of the user, and the nutritional information of the product. In one embodiment, the system (101) further guides the user in identifying the overall content and quantity of the food to be consumed by the user.
Now, referring to the components of the augmented reality based intelligent recommendation system 101, in accordance with an embodiment of a present subject matter. The system (101) may include at least one processor (201), an input/output (I/O) interface (203), a memory (205), programmed instructions (207) and data (209). In one embodiment, the at least one processor (201) may be configured to fetch and execute computer-readable/programmed instructions (207) stored in the memory (205).
In one embodiment, the I/O interface (203) may be implemented as a mobile application or a web-based application and may further include a variety of software and hardware interfaces, for example, a web interface, a graphical user interface, image capturing means of the user device and the like. The I/O interface (203) may allow the system (101) to interact with the user devices (103). Further, the I/O interface (203) may enable the user device (103) to communicate with other computing devices, such as web servers and external data servers (not shown). The I/O interface (203) can facilitate multiple communications within a wide variety of networks and protocol types, including wired networks, for example, LAN, cable, etc., and wireless networks, such as WLAN, cellular, or satellite. The I/O interface (203) may include one or more ports for connecting to another server.
In an implementation, the memory (205) may include any computer-readable medium known in the art including, for example, volatile memory, such as static random-access memory (SRAM) and dynamic random-access memory (DRAM), and/or non-volatile memory, such as read only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and memory cards. The memory (205) may include programmed instructions (207) and data (209).
In one embodiment, the data (209) may comprise a database (211), and other data (213). The other data (213), amongst other things, serves as a repository for storing data processed, received, and generated by the one or more of the programmed instructions (207).
The aforementioned computing devices may support communication over one or more types of networks in accordance with the described embodiments. For example, some computing devices and networks may support communications over a Wide Area Network (WAN), the Internet, a telephone network (e.g., analog, digital, POTS, PSTN, ISDN, xDSL), a mobile telephone network (e.g., CDMA, GSM, NDAC, TDMA, E-TDMA, NAMPS, WCDMA, CDMA-2000, UMTS, 3G, 4G), a radio network, a television network, a cable network, an optical network (e.g., PON), a satellite network (e.g., VSAT), a packet-switched network, a circuit-switched network, a public network, a private network, and/or other wired or wireless communications network configured to carry data. Computing devices and networks also may support wireless wide area network (WWAN) communications services including Internet access such as EV-DO, EV-DV, CDMA/1×RTT, GSM/GPRS, EDGE, HSDPA, HSUPA, and others.
The aforementioned computing devices and networks may support wireless local area network (WLAN) and/or wireless metropolitan area network (WMAN) data communications functionality in accordance with Institute of Electrical and Electronics Engineers (IEEE) standards, protocols, and variants such as IEEE 802.11 (“WiFi”), IEEE 802.16 (“WiMAX”), IEEE 802.20x (“Mobile-Fi”), and others. Computing devices and networks also may support short range communication such as a wireless personal area network (WPAN) communication, Bluetooth® data communication, infrared (IR) communication, near-field communication, electromagnetic induction (EMI) communication, passive or active RFID communication, micro-impulse radar (MIR), ultra-wide band (UWB) communication, automatic identification and data capture (AIDC) communication, and others.
In one embodiment, the processor (201) may be configured to execute instructions for receiving a plurality of medical records from the user. The medical records received are in assorted formats and comprise one or more of past and current medical reports, medication list, prescriptions, allergy details, operation details, test-reports, and medical scans.
In one embodiment, the processor (201) may be configured to execute instructions for processing the plurality of medical records using a machine learning technique in order to derive a medical profile of the user. In one embodiment, the processing of the plurality of medical records using the machine learning technique may further include multiple steps. In one embodiment, the processing may include a step of scanning a plurality of text sections in one or more of the medical records. Further, the processing may include a step of splitting the plurality of text sections into separate words. Further, the processing may include a step of discarding delimiters. In one exemplary embodiment, the delimiters may include at least the antecedents. Further, the processing may include a step of comparing the separate words with content keywords stored in the memory. The content keywords are scanned from the medical records to form a list of content keywords. Further, the processing may include a step of extracting matching words, from the separate words, matching with the list of content keywords. The processing may further include a step of creating a matrix of co-occurrence of the matching words. The processing may further include a step of estimating a score of the matching words. In one exemplary embodiment, the score of the matching words may be estimated by dividing a sum of a number of co-occurrences of the matching words with the list of content keywords and multiplying by a number of times the matching words appear in the medical records. In an embodiment, the medical profile of the user derived based upon the processing of the medical records as explained above may be stored and maintained in the database (211).
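The keyword-extraction and scoring steps described above can be sketched as follows. This is only an illustrative reading of the claimed pipeline, not the applicant's implementation: the stop-word list standing in for the "delimiters", the sentence-level co-occurrence window, and the interpretation of the scoring formula (here, the sum of co-occurrences divided by the size of the content-keyword list, multiplied by the word's frequency) are all assumptions.

```python
import re
from collections import Counter
from itertools import combinations

# Hypothetical stand-in for the discarded "delimiters"/stop words.
STOP_WORDS = {"the", "a", "an", "is", "of", "and", "to", "in", "with"}

def extract_keyword_scores(record_text, content_keywords):
    """Sketch of the claim-3 pipeline: scan text sections, split into
    words, drop stop words, keep words matching the content-keyword list,
    build a co-occurrence matrix per sentence, then score each match."""
    keywords = {k.lower() for k in content_keywords}
    sentences = re.split(r"[.;\n]+", record_text.lower())
    tokenised = [
        [w for w in re.findall(r"[a-z]+", s) if w not in STOP_WORDS]
        for s in sentences
    ]
    # Frequency of each matching word across the record.
    freq = Counter(w for sent in tokenised for w in sent if w in keywords)
    # Co-occurrence counts between matching words within a sentence.
    cooc = Counter()
    for sent in tokenised:
        matches = [w for w in sent if w in keywords]
        for a, b in combinations(set(matches), 2):
            cooc[(a, b)] += 1
            cooc[(b, a)] += 1
    # Score: (co-occurrence sum / keyword-list size) * frequency (one
    # plausible reading of the formula in claim 3).
    scores = {}
    for w in freq:
        cooc_sum = sum(c for (a, _), c in cooc.items() if a == w)
        scores[w] = (cooc_sum / max(len(keywords), 1)) * freq[w]
    return scores
```

A record mentioning "diabetes" in several sentences alongside other keywords would thus score "diabetes" higher than keywords appearing once in isolation.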
In one embodiment, the processor (201) may be configured to execute instructions for tracking one or more physical activities of the user via the wearable device (103-4) in order to generate a behavioural profile of the user. In one embodiment, the behavioural profile of the user may be indicative of lifestyle followed by the user in his/her daily routine activities. It is to be noted herein that the tracking of the physical activities of the user via the wearable device (103-4) is known in the art and the details of which are not described in this application for the sake of brevity. In an embodiment, the behavioural profile of the user generated based upon the tracking of the medical records may be stored and maintained in the database (211). In one embodiment, the database (211) may further comprises pre-stored product information associated to a plurality of products, wherein the product information comprises at least nutritional contents associated with each product.
In one embodiment, the processor (201) may be configured to execute instructions for scanning an image associated to a product using an image capturing means (not shown) of the user device (103) to identify the product, in a real time. The user 103 may scan the product which the user is interested in purchasing either through online medium (e.g. ecommerce platform) or through retail shops. In one embodiment, the processor (201) may be configured to execute instructions to retrieve the nutritional content, based upon the scanned image of the product.
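The two retrieval paths named in claim 6 (matching a product identifier against the product image database (509), or extracting contents from the scanned label) might be sketched as below. The `PRODUCT_DB` dictionary is a hypothetical stand-in for the database (509), and the label-parsing regular expression is illustrative only.

```python
import re

# Hypothetical stand-in for the product image database (509):
# product identifier -> nutritional content details.
PRODUCT_DB = {
    "8901030123456": {"sugar_g": 24.0, "sodium_mg": 180.0, "fat_g": 9.0},
}

def retrieve_nutrition(product_id=None, label_text=None):
    """Claim-6 style retrieval: first try a database match on the product
    identifier; otherwise parse nutrient lines out of scanned label text."""
    if product_id and product_id in PRODUCT_DB:
        return dict(PRODUCT_DB[product_id])
    if label_text:
        nutrition = {}
        # Matches lines such as "Sugar: 12 g" or "Sodium - 300 mg".
        for name, value, unit in re.findall(
                r"(\w+)\s*[:\-]?\s*([\d.]+)\s*(mg|g)", label_text.lower()):
            nutrition[f"{name}_{unit}"] = float(value)
        return nutrition
    return {}
```

In practice the identifier would come from a barcode decoder and the label text from an OCR step on the captured image; both are outside this sketch.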
In one embodiment, the processor (201) may be configured to execute instructions to analyse the nutritional content of the product based upon the medical profile and the behavioural profile of the user. In one exemplary embodiment, the processor (201) may be configured to execute instructions to analyse the nutritional content of the product by estimating a nutritional level of the user based upon the medical profile and the behavioural profile of the user in a real time. In this exemplary embodiment, the processor (201) may be configured to execute instructions for determining a recommendation score by comparing the nutritional content of the product and the nutritional level of the user in a real-time.
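The two-step analysis described above (estimating a nutritional level from the profiles, then deriving a recommendation score by comparison with the product) could be sketched as follows. Every threshold and profile field here is a hypothetical illustration; the specification does not fix concrete values.

```python
def estimate_nutritional_limits(medical_profile, behavioural_profile):
    """Illustrative only: derive per-serving nutrient limits from profile
    flags. The condition names and thresholds are assumptions."""
    limits = {"sugar_g": 30.0, "sodium_mg": 500.0}
    if "diabetes" in medical_profile.get("conditions", []):
        limits["sugar_g"] = 5.0          # tight sugar budget for diabetics
    if "hypertension" in medical_profile.get("conditions", []):
        limits["sodium_mg"] = 140.0      # low-sodium threshold
    if behavioural_profile.get("daily_steps", 0) > 10000:
        limits["sugar_g"] *= 1.2         # looser budget for very active users
    return limits

def recommendation_score(nutrition, limits):
    """Claim-4 style score: fraction of checked nutrients that fall
    within the user's estimated limits (1.0 = fully recommended)."""
    checked = [n for n in limits if n in nutrition]
    if not checked:
        return 0.0
    within = sum(1 for n in checked if nutrition[n] <= limits[n])
    return within / len(checked)
```

For a diabetic, sedentary user, a product with 24 g of sugar and 180 mg of sodium per serving would fail the sugar check but pass the sodium check, yielding a middling score.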
In one embodiment, the processor (201) may be configured to execute instructions for displaying recommendation information, in an augmented reality environment, on the user device (103-1). In one embodiment, the recommendation information may comprise the nutritional content visualized in a manner to indicate whether or not the product is recommended to the user, based upon the analysis of the nutritional content of the product. In one exemplary embodiment, the nutritional content in the recommendation information is visualized in the form of a tick or cross sign, or is highlighted in different colour codes.
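The tick/cross and colour-code overlay described above could be produced by a mapping such as the following sketch; the particular marks and colour names are illustrative assumptions.

```python
# Hypothetical sketch: map each nutrient's suitability to the tick/cross
# sign and colour code used in the augmented reality overlay.
def visualize(product_nutrition, user_limits):
    overlay = {}
    for nutrient, value in product_nutrition.items():
        ok = value <= user_limits.get(nutrient, float("inf"))
        overlay[nutrient] = {"mark": "tick" if ok else "cross",
                             "colour": "green" if ok else "red"}
    return overlay

overlay = visualize({"sugar": 40.0, "fibre": 5.0},
                    {"sugar": 30.0, "fibre": 25.0})
```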
Various exemplary embodiments/implementations of the components belonging to the system (101) are hereinafter explained below:
Referring to figures 2a and 2b, exemplary implementations of machine learning models, namely Artificial Intelligence Engines (the MediAid AI Engine and the ActiTracker AI Engine, respectively), are illustrated. The machine learning models MediAid AI Engine and ActiTracker AI Engine support user profile creation, which provides recommendations based on the medical history/medical profile and the behavioural profile/lifestyle followed by the user, respectively. As shown in figure 2a, the MediAid AI Engine may be configured to evaluate the daily nutritional intake of the user based on the medical history of the user. The MediAid AI Engine may be trained using a Natural Language Processing (NLP) technique which performs keyword extraction to capture diagnoses, allergies, test results, and the like from one or more medical prescriptions or EMRs (Electronic Medical Records) (301). The ActiTracker AI Engine (shown in figure 2b) may be a software or computing application platform that utilizes data captured from the one or more wearable devices (103-4) associated with the user to analyse the daily physical activity data of the user and predict the lifestyle followed by the user, in order to make personalized recommendations. The parameters related to the physical activities of the user that may be obtained from the wearable device (103-4) include, but are not limited to, step count, heart rate, workout, sleep pattern, and the like.
Now referring to figure 3, a functional architecture (400) of the system (101) is illustrated, in accordance with an exemplary embodiment of the present subject matter. In this exemplary embodiment, the wearable device (103-4) is a smart wrist band. In this exemplary embodiment, when the user wears the wearable device (103-4), the wearable device (103-4) communicates with the user device (103-1) via Bluetooth communication. Thus, the data captured by the smart wrist band is transmitted to the smartphone (i.e. the user device (103-1)) of the user via Bluetooth. This is facilitated by turning ON the Bluetooth connection of the smartphone, which displays a list of available devices. The MAC address of the smart wrist band is copied and input into the mobile application. Once the connection is successful, daily health records of the user are captured, which enables prediction of the lifestyle of the user. In this exemplary embodiment, one or more medical reports (401) of the user are stored in the database, e.g. a cloud storage (405) as shown. Further, the cloud storage (405) pre-stores one or more product nutritional contents (402). Additionally, lifestyle details (403) of the user obtained from the smart wrist band are further collected at the cloud storage (405). In this exemplary embodiment, the lifestyle details are collected on the cloud storage (405) via the smartphone associated with the user. Further, in this exemplary embodiment, one or more APIs are used to analyse the data stored within the cloud storage (405) and display recommendation information to the user, based upon the analysis of the data, on the user device (103-1). In this exemplary embodiment, the system (101) is implemented as an Application Framework utilizing an AWS cloud platform. The nutritional information of each product and the user profile are stored on the AWS cloud platform. Based on the user profile, the daily nutritional intake of the user is calculated.
In this exemplary embodiment, a color format scheme is utilized by the AWS cloud platform which provides easy interpretation of information.
Now referring to figure 4, an architecture (500) of the database belonging to the system (101) is illustrated, in accordance with an exemplary embodiment of the present subject matter. In this exemplary embodiment, the MediAid AI Engine (501) and the ActiTracker AI Engine (502) are configured to transmit the corresponding information to the cloud storage (405). In this exemplary embodiment, the cloud storage (405) comprises a medical history database (505), a lifestyle database (506), a nutrition intake database (507) and a product nutrition content (508). The MediAid AI Engine (501) is configured to transmit the corresponding information of the user’s medical history to the medical history database (505). The ActiTracker AI Engine (502) is configured to transmit the corresponding information of the user’s lifestyle to the lifestyle database (506). The nutrition intake database (507) is configured to store the information received from both the medical history database (505) and the lifestyle database (506). In one embodiment, a cloud (504) may be used which comprises a product image database (509) to store information about the food products available in the market. In one embodiment, the food product images are maintained in the cloud (504) along with the nutritional information of each product stored as metadata. When the user scans the product using a camera (510), in real time, the nutritional contents of the product are displayed on the display screen of the user device (103-1) with a color formatted scheme. In one embodiment, a Recommendation AI Engine (511) is configured to recommend a product based on the information obtained from the nutrition intake database (507), the product nutrition content (508) and the product image database (509).
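The join that the Recommendation AI Engine (511) performs across these databases can be sketched as below. The schema, field names, and "Item Unavailable" string are assumptions for illustration only; the actual databases are hosted on the cloud as described above.

```python
# Hypothetical sketch of the lookup flow: scanned identifier -> product
# nutritional content (product image database) -> check against the user's
# remaining allowance (nutrition intake database).
product_image_db = {          # identifier -> nutritional content metadata
    "BISCUIT-001": {"name": "Digestive biscuits", "sugar": 16.6, "sodium": 0.6},
}
nutrition_intake_db = {       # user -> remaining daily allowance (assumed units)
    "user-1": {"sugar": 12.0, "sodium": 1.5},
}

def recommend(user_id, product_id):
    product = product_image_db.get(product_id)
    if product is None:
        return "Item Unavailable"     # product not in the catalog
    allowance = nutrition_intake_db[user_id]
    exceeded = [n for n in allowance if product.get(n, 0.0) > allowance[n]]
    return "not recommended" if exceeded else "recommended"

result = recommend("user-1", "BISCUIT-001")
```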
In one embodiment, the product image database (509) stores the product images with an identifier. When the user scans the product’s image using the camera on a smartphone or a tablet and the system successfully identifies the product, the corresponding identifier is fetched and matched with the product image database (509) containing product nutritional content details. This data is compared with the nutritional intake of the person evaluated by the AI Engine. The nutritional information is then displayed on the display screen of the user device (103-1) with a color-font scheme which makes the interpretation easy for the user. Thus, the scanned image of the product is retrieved via the processor (201). In one embodiment, the nutritional content of the product is retrieved by fetching a corresponding product identifier of the product and matching the product identifier with the product identifier stored in a product image database (509) containing product nutritional content details. In another embodiment, the nutritional contents of the product are retrieved by scanning the image of the product to extract one or more contents of the product and deriving the nutritional contents from the one or more contents extracted. The product image database (509) may store the one or more contents of the product. The processor (201) may be configured to derive the nutritional contents of the scanned product based upon the one or more contents of the product. In yet another embodiment, the nutritional content of the product may be derived by performing a Natural Language Processing (NLP) technique on the one or more contents extracted based upon the scanning of the image of the product. An Optical Character Recognition (OCR) engine is designed using computer vision. The image of the product captured in real time is preprocessed by carrying out image segmentation, which converts every pixel in the image from color to black and white.
The noise level of the image is reduced and the areas outside the text are removed. A Convolutional Neural Network (CNN) is trained with labelled character data so that it learns to recognize each character. The pre-trained CNN model, once loaded, predicts each character and extracts the text. The extracted data is then fed into a database using a web service. In one example, different color fonts may be used: content in red font indicates that intake of the product would cause the daily consumption level to be exceeded, whereas content in green font indicates that the food product is suitable for consumption, as evaluated based on the user profile.
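The OCR pre-processing steps above (binarizing to black and white, then discarding areas outside the text) can be sketched on a toy greyscale image as follows. The threshold value is an assumption, and the trained CNN character classifier is omitted, since it lies beyond the scope of a sketch.

```python
# Hypothetical sketch of the image segmentation pre-processing: binarize a
# greyscale image, then crop away rows that contain no text pixels.
def binarize(image, threshold=128):
    """Convert greyscale pixels (0-255) to black (0) / white (1)."""
    return [[0 if px < threshold else 1 for px in row] for row in image]

def crop_to_text(binary):
    """Keep only rows that contain at least one dark (text) pixel."""
    return [row for row in binary if 0 in row]

image = [
    [255, 255, 255],   # blank margin row
    [255,  10, 255],   # row containing a dark text pixel
    [ 20,  15, 240],   # another text row
]
binary = binarize(image)
text_area = crop_to_text(binary)
```

In a full pipeline, the cropped binary region would then be fed character by character to the trained CNN for recognition.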
Now referring to figure 5, a method (600) depicting a scenario wherein a product is recommended to the user by scanning the product and fetching diet details of the user from the cloud is illustrated, in accordance with an exemplary embodiment of the present subject matter. In this exemplary embodiment, the system (101) is configured to register a new user by allowing the user to input details including email ID, password, height, weight, age and gender. Further, the system (101) stores the corresponding data of the user. If the user is already registered with the system (101), the user may log in to the system (101) by providing only the email ID and password.
At step 601, the user is successfully registered and logged in to the system (101). At step 602, the user may upload the one or more medical prescriptions or EMRs (Electronic Medical Records) (301) by scanning them via the image capturing means of the user device. At step 603, a natural language processing algorithm, such as the MediAid AI Engine, is used for keyword-based extraction of the medical history of the user. The MediAid AI Engine is configured to scan a section of the text, split the words, and remove delimiters and words like it, is, a, are, can, all, etc. The MediAid AI Engine is further configured to form a list of content keywords. Further, the MediAid AI Engine is configured to create a matrix of co-occurrence of each word with every other content keyword. The MediAid AI Engine is further configured to assign a score to each word, calculated by dividing the sum of the number of co-occurrences the word has with other content keywords by the number of times the word appears in the text. At step 604, the smart wrist band is connected with the smartphone. The smart wrist band is configured to track daily activities performed by the user. At step 605, the health data, i.e. the electronic medical record (301) and the information about daily activities, is stored in the AWS cloud. At step 606, the ActiTracker AI Engine is configured to predict the lifestyle of the user based on step count, sleep pattern, workout and heart rate. At step 607, the product may be scanned via the camera of the smartphone. At step 608, the product images are stored in the cloud. At step 609, the nutritional contents of the products are stored on the cloud. At step 610, an API is called to fetch the product image. At step 611, an API is called to fetch the product details. At step 612, an API is called to fetch the diet details of the user from the cloud. At step 613, the daily nutritional intake of the user may be compared with the product nutritional contents.
At step 614, the product may be recommended to the user if the product is determined to be healthy for the user.
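The keyword scoring rule of step 603 (total co-occurrences with other content keywords, divided by the word's frequency in the text) can be sketched as follows; the stopword list and tokenization are simplified assumptions, not the patented implementation.

```python
# Hypothetical sketch of the step-603 scoring rule: degree (co-occurrence
# count with other content keywords) divided by frequency in the text.
STOPWORDS = {"it", "is", "a", "are", "can", "all", "the", "of", "from"}

def score_keywords(text):
    words = [w.strip(".,").lower() for w in text.split()]
    content = [w for w in words if w and w not in STOPWORDS]
    freq = {w: content.count(w) for w in content}
    # simplification: every pair of content words in the text co-occurs
    degree = {w: sum(freq[o] for o in freq if o != w) for w in freq}
    return {w: degree[w] / freq[w] for w in freq}

scores = score_keywords("The patient is suffering from hyperthyroid.")
```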
In a preferred embodiment, the system (101) is implemented as an Augmented Reality and Artificial Intelligence based application. In another embodiment, the system may be provided by retailers to their customers to promote customer health by furnishing real-time personalized/customized recommendations at the point of sale, so as to attain sustainability.
In one exemplary embodiment, consider a user uploads doctor’s medical prescription/ EMR according to table 1 as below:
Name: Mr. XYZ
Date: 30/08/2019

Lab Test Result Unit Reference Range
TSH 0.02 uIU/mL 0.40-4.50
Free T4 4.45 ng/dL 0.72-1.72

Diagnosis: The patient is suffering from hyperthyroid.

Medication: A course of Methimazole Tablets for next 3 months is to be carried out.

Table 1
The MediAid AI Engine performs NLP based keyword extraction and stores the data into a medical_history table hosted on the cloud. When the EMR is uploaded, the NLP engine splits the words and removes delimiters and words like is, a, are, can, etc. Further, the engine maps the words with the medical jargon pre-fed in the system and creates a list of content keywords. Here, a list of content keywords is formed containing:
patient suffering hyperthyroid
course methimazole tablets next 3 months
These keywords are then fed into the database, which further recommends products that have nutrition content levels appropriate for the user’s consumption.
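The two content-keyword lists above can be reproduced by removing stopwords and delimiters from each sentence of the prescription; the stopword set below is an illustrative assumption chosen to match this worked example.

```python
# Hypothetical sketch of the content-keyword extraction shown above:
# strip delimiters, drop stopwords, and keep the remaining words in order.
STOPWORDS = {"the", "is", "a", "are", "can", "all", "of", "for", "to", "be",
             "carried", "out", "from"}

def content_keywords(sentence):
    words = [w.strip(".,:").lower() for w in sentence.split()]
    return " ".join(w for w in words if w and w not in STOPWORDS)

k1 = content_keywords("The patient is suffering from hyperthyroid.")
k2 = content_keywords(
    "A course of Methimazole Tablets for next 3 months is to be carried out.")
```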
The MediAid AI Engine provides the following information, according to table 2, after completion of the analysis.
ID DateTime Allergy Treatment Medication Diagnoses
1 2019-08-31T10:30:20Z Nuts HyperThyroid Methimazole Tablets Dosage for next 3 months
Table 2
It is identified that the user is undergoing treatment for hyperthyroid; thus, the sodium consumption in the person’s diet must be kept low.
In one embodiment, the user is then prompted to interface the wearable device with the smartphone by connecting it over Bluetooth. The data captured from the device is stored in activity_tracker table hosted on cloud. The ActiTracker AI Engine predicts that the user follows a sedentary lifestyle according to table 3.
ID DateTime Heartrate Workout StepCount SleepPattern
1 2019-08-31T17:35:45Z 80 0 3500 7:30
Table 3
A pre-fed database containing nutritional consumption levels based on height, weight and lifestyle is maintained on the cloud. Based on the parameters gathered from the AI engines, the intake of nutritional content by the person is evaluated.
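The source does not specify how the pre-fed consumption levels are computed, so the sketch below estimates a daily calorie budget from height, weight, age, gender and the predicted lifestyle using the Mifflin-St Jeor equation with an activity multiplier; both the equation choice and the multiplier values are assumptions for illustration.

```python
# Hypothetical sketch: daily calorie budget from the registration details
# (height, weight, age, gender) and the ActiTracker-predicted lifestyle.
def daily_calories(height_cm, weight_kg, age, gender, lifestyle):
    # Mifflin-St Jeor basal metabolic rate (assumed choice of equation)
    bmr = 10 * weight_kg + 6.25 * height_cm - 5 * age
    bmr += 5 if gender == "male" else -161
    # assumed activity multipliers keyed to the predicted lifestyle
    multiplier = {"sedentary": 1.2, "moderate": 1.55, "active": 1.9}[lifestyle]
    return bmr * multiplier

budget = daily_calories(height_cm=175, weight_kg=70, age=30,
                        gender="male", lifestyle="sedentary")
```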
In one exemplary embodiment, when the user scans a packet of “McVitie’s” biscuits, the recommendation engine takes into consideration the medical history and the lifestyle followed by the user to recommend whether the user should purchase the product at the point of sale, in real time. The nutritional information is displayed in a color-formatted scheme, making the interpretation easy for the user. In one embodiment, if the food product is not available in the catalog, a notification such as “Item Unavailable” is displayed on the user interface.
Figure 6 illustrates a method (700) depicting a scenario wherein a product is being recommended to a diabetic patient, in accordance with an exemplary embodiment of the present subject matter. At step (701), the smart wrist band is connected with the smartphone. The connectivity is obtained by turning ON the Bluetooth setting of the smartphone and copying the MAC address of the smart wrist band into the user interface. At step (702), the medical reports are uploaded. In one embodiment, the medical reports may be uploaded to the system (101). The NLP based keyword extraction is implemented and the necessary medical records are stored in the database over the cloud storage. At step (703), the user may scan the product for recommendation. At step (704), the data collection may be performed. In one embodiment, the data collected from the smart wrist band is used to predict the lifestyle followed by the user. The health history of the user is generated from the medical records. This data is used for prescribing the daily nutrition content to be consumed by the user. At step (705), diabetes detection and highlighting of high sugar level content may be performed. In one embodiment, if the user is found to be a diabetic patient, then the products with a higher level of sugar are highlighted in red font on the user interface.
In one embodiment, the system (101) is an augmented reality and artificial intelligence based recommendation engine provided to customers at the point of sale. The system (101) may be configured to provide customizable recommendations. The system (101) also provides real-time alerts if the user is purchasing products that would exceed the overall nutritional content required by the user for the day/week. The wearable device (103-4) is quick-fit. The system (101) is highly customizable, robust and provides low-latency support time. The system (101) also provides just-in-time alerts if the person has not worn everything needed for that work.
Now referring to figure 7, a method (800) for an augmented reality based intelligent recommendation system (101) is depicted, in accordance with an embodiment of the present subject matter.
At step (801), a plurality of medical records is received, via the processor (201), from a user.
At step (802), the plurality of medical records is processed, via the processor (201), using a machine learning technique in order to derive a medical profile of the user.
At step (803), physical activities of the user are tracked. In one embodiment, one or more physical activities of the user are tracked via the wearable device (103-4) in order to generate a behavioural profile of the user.
At step (804), the medical profile and the behavioural profile of the user are maintained in a database (211) present in the memory (205), wherein the database (211) further comprises pre-stored product information associated with a plurality of products. The product information comprises the nutritional contents associated with each product.
At step (805), a product may be scanned. In one embodiment, an image associated with the product is scanned in real time, using an image capturing means of a user device (103-1), to identify the product.
At step (806), the nutritional content corresponding to the scanned image of the product is retrieved via the processor (201). In one embodiment, the nutritional content of the product is retrieved by fetching a corresponding product identifier of the product and matching the product identifier with the product identifier stored in a product image database (509) containing product nutritional content details. In another embodiment, the nutritional content of the product is retrieved by scanning the image of the product to extract one or more contents of the product and thereby deriving the nutritional content of the product from the one or more contents, in real time.
At step (807), the processor is configured for analysing the nutritional content of the product based upon the medical profile and the behavioural profile of the user.
At step (808), recommendation information is displayed, in an augmented reality environment, on the user device (103-1). The recommendation information comprises the nutritional content visualized in a manner to indicate whether or not the product is recommended to the user, based upon the analysis of the nutritional content of the product.
The embodiments, examples and alternatives of the preceding paragraphs or the description and drawings, including any of their various aspects or respective individual features, may be taken independently or in any combination. Features described in connection with one embodiment are applicable to all embodiments, unless such features are incompatible.
Although implementations for the augmented reality based intelligent recommendation system and a method thereof have been described in language specific to structural features and/or methods, it is to be understood that the appended claims are not necessarily limited to the specific features or methods described. Rather, the specific features and methods are disclosed as examples of implementations for the augmented reality based intelligent recommendation system and a method thereof.
