Abstract: The present disclosure describes a method for training a system for measuring or estimating, or both, a user's response to a stimulus and for classifying the response. In one embodiment, the system is trained using a test stimulus: the test stimulus is presented to one or more users, one or more images of the users' faces are captured, and EEG signals of the users are captured simultaneously. One or more emotional features are then derived from the users' facial data. Further, one or more cognitive features and emotional features are derived from the EEG signals of each of the users, and a training dataset is created by correlating the emotional features from the facial data and the cognitive and emotional features from the EEG signals with one or more features associated with the test stimulus. The training dataset thus created is used for measuring or estimating, or both, a user's response to the stimulus.
We Claim:
1. A method for training a system for measuring or estimating, or both, a user's response to a stimulus and for classifying the response, the method comprising:
presenting a test stimulus to one or more users;
extracting one or more features associated with the test stimulus and storing them in a memory;
capturing one or more images of the one or more users' faces and simultaneously capturing EEG signals of the one or more users;
measuring facial data from the one or more images of each of the one or more users;
deriving one or more emotional features from the facial data, and one or more cognitive features and one or more emotional features from the EEG signals of each of the one or more users;
creating a training dataset by correlating the one or more emotional features from the facial data, and the one or more cognitive features and the one or more emotional features from the EEG signals, with the one or more features associated with the test stimulus; and
creating a platform by storing the training dataset, for measuring or estimating, or both, a user's response to the stimulus.
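By way of illustration, the following is a minimal Python sketch of the training step of claim 1. All function and field names are hypothetical, and alignment of per-user facial and EEG rows is assumed to be handled upstream; this is one plausible reading of the claim, not the patented implementation.

```python
# A minimal sketch of claim 1's training step, assuming feature extraction has
# already produced one facial-feature dict and one EEG-feature dict per user.
# All names here are hypothetical; alignment by user/timestamp is assumed.
import pandas as pd

def build_training_dataset(stimulus_features: dict,
                           facial_emotion_rows: list,
                           eeg_feature_rows: list) -> pd.DataFrame:
    """Correlate facial emotional features and EEG cognitive/emotional
    features with the features of the test stimulus, one row per user."""
    rows = [{**stimulus_features, **face, **eeg}
            for face, eeg in zip(facial_emotion_rows, eeg_feature_rows)]
    return pd.DataFrame(rows)
```

Persisting such a table is one plausible realization of the "platform" recited in the claim.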
2. A method of measuring or estimating, or both, the user's response to the stimulus using the platform of claim 1, the method comprising:
extracting one or more features associated with the stimulus;
extracting the one or more features of the test stimulus, stored in the memory associated with the system, that match the one or more features of the stimulus; and
extracting one or more of the cognitive features, or one or more of the emotional features, or both, from the training dataset matching the extracted one or more features associated with the stimulus, for estimating the user's response to the stimulus and for classifying the response to the stimulus.
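One plausible reading of the feature matching in claim 2 is a nearest-neighbour lookup over numeric stimulus-feature vectors, sketched below; the vector representation and the row alignment with the training dataset are assumptions, since the claim only says the features "match".

```python
# A sketch of claim 2's lookup: match a new stimulus against stored
# test-stimulus feature vectors, then return the correlated cognitive and
# emotional features. Nearest-neighbour matching is an assumption.
import numpy as np
from sklearn.neighbors import NearestNeighbors

def estimate_response(new_stimulus_vec, stored_stimulus_vecs, training_dataset):
    nn = NearestNeighbors(n_neighbors=1).fit(stored_stimulus_vecs)
    _, idx = nn.kneighbors(np.atleast_2d(new_stimulus_vec))
    # Rows of training_dataset are assumed aligned with stored_stimulus_vecs.
    return training_dataset.iloc[idx[0][0]]
```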
3. The method as claimed in claim 1, wherein each of the stimulus and the test stimulus is one of a video, an audio, an image, an advertisement, promotional content, a web page, a user interface, a chat bot, a mobile app, a video game, and content in any other form or format.
4. The method as claimed in claim 1, wherein the test stimulus is presented on user devices associated with the one or more users.
5. The method as claimed in claim 1, wherein the one or more images of the one or more users' faces are captured, using a camera associated with the user devices, while the one or more users are experiencing the test stimulus.
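A minimal capture loop for claim 5 might look as follows, using OpenCV; the device index is an assumption, and the per-frame timestamps stand in for whatever synchronization with the EEG stream the system actually uses.

```python
# A sketch of the image capture in claim 5. Device index 0 and the
# timestamp-based EEG alignment are assumptions, not details from the claim.
import time
import cv2

def capture_frames(duration_s: float = 10.0, device: int = 0):
    cap = cv2.VideoCapture(device)
    frames = []
    t_end = time.time() + duration_s
    while time.time() < t_end:
        ok, frame = cap.read()
        if ok:
            frames.append((time.time(), frame))  # timestamp for EEG alignment
    cap.release()
    return frames
```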
6. The method as claimed in claim 1, wherein the EEG signals of the one or more users are captured, using an EEG headset worn by the one or more users, while the one or more users are experiencing the test stimulus.
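The claims do not specify how the cognitive and emotional features are derived from the EEG signals; one common proxy is band power computed from a Welch power spectral density, sketched below. The band definitions, sampling rate, and their cognitive/emotional interpretation are illustrative assumptions.

```python
# A sketch of one common EEG feature: band power from Welch's PSD.
# The bands and sampling rate are assumed, not taken from the claims.
import numpy as np
from scipy.signal import welch
from scipy.integrate import trapezoid

BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}  # Hz, assumed

def eeg_band_powers(signal: np.ndarray, fs: float = 256.0) -> dict:
    freqs, psd = welch(signal, fs=fs, nperseg=int(fs * 2))
    powers = {}
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs < hi)
        powers[name] = float(trapezoid(psd[mask], freqs[mask]))
    return powers
```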
7. The method as claimed in claim 1, wherein the facial data comprises facial action units and one or more landmarks.
8. The method as claimed in claim 1, wherein deriving the one or more emotional features from the facial data comprises:
determining eccentricity of a plurality of contours of the face using the one or more landmarks;
calculating distances between two or more landmarks and normalizing the distances by the face radius;
calculating one or more angles between the two or more landmarks; and
deriving the one or more emotional features based on the normalized distances and the angles between the two or more landmarks.
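The geometric quantities recited in claim 8 can be computed directly from 2-D landmark coordinates; a sketch follows. The PCA-based ellipse fit for eccentricity and the treatment of landmarks as arbitrary point sets are assumptions, since the claim names the quantities but not the formulas.

```python
# A sketch of the geometry in claim 8, assuming 2-D landmark coordinates
# (e.g., from a 68-point detector). The PCA ellipse fit is an assumption.
import numpy as np

def contour_eccentricity(points: np.ndarray) -> float:
    """Eccentricity of the ellipse fitted (via PCA) to a facial contour."""
    cov = np.cov((points - points.mean(axis=0)).T)
    major, minor = np.sort(np.linalg.eigvalsh(cov))[::-1]
    return float(np.sqrt(1.0 - minor / major)) if major > 0 else 0.0

def normalized_distance(p: np.ndarray, q: np.ndarray, face_radius: float) -> float:
    """Distance between two landmarks, normalized by the face radius."""
    return float(np.linalg.norm(p - q) / face_radius)

def angle_at(p: np.ndarray, vertex: np.ndarray, q: np.ndarray) -> float:
    """Angle (radians) formed at `vertex` by landmarks p and q."""
    u, v = p - vertex, q - vertex
    cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))
```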
9. The method of claim 2, wherein classifying the user's response includes predicting a general population's response to the stimulus from a group of responses including, but not limited to, popular, unpopular, one or more measures of popularity or unpopularity, a probability of going viral on social media, a probability of being ignored, one or more measures of comfort, one or more measures of discomfort, one or more measures of anger, and one or more measures of revulsion.
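Claim 9 names the output classes but not the classifier; the sketch below uses a random forest purely as an illustrative choice, with a label set loosely drawn from the claim.

```python
# A sketch of the classification in claim 9. The random forest and the exact
# label set are illustrative assumptions; the claim specifies neither.
from sklearn.ensemble import RandomForestClassifier

CLASSES = ["popular", "unpopular", "likely_viral", "likely_ignored"]

def train_response_classifier(X, y):
    """X: rows of correlated cognitive/emotional features; y: labels in CLASSES."""
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X, y)
    return clf  # clf.predict_proba then yields, e.g., a probability of going viral
```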
10. A method for training a system for measuring or estimating, or both, a user's response to a stimulus and for classifying the response, the method comprising:
presenting a test stimulus to one or more users;
extracting one or more features associated with the test stimulus and storing them in a memory;
capturing one or more images of the one or more users' faces and simultaneously capturing EEG signals of the one or more users;
measuring facial data from the one or more images of each of the one or more users;
deriving one or more emotional features from the facial data, and one or more cognitive features and one or more emotional features from the EEG signals of each of the one or more users;
creating a first training dataset by correlating the facial data with the one or more cognitive features and one or more emotional features from the EEG signals;
creating a second training dataset by correlating the first training dataset with the one or more features associated with the test stimulus; and
creating a platform by storing the first and the second training datasets, for measuring or estimating, or both, a user's response to the stimulus.
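The two-stage structure of claim 10 (a facial-to-EEG dataset, then a join with stimulus features) might be realized as below; the tabular representation and the field names are assumptions.

```python
# A sketch of claim 10's two-stage training data. Field names are hypothetical
# and the tabular join is one plausible realization of "correlating".
import pandas as pd

def build_two_stage_datasets(stimulus_features: dict,
                             facial_rows: list,
                             eeg_rows: list):
    # First dataset: facial data paired with EEG-derived features per user.
    first = pd.DataFrame([{**f, **e} for f, e in zip(facial_rows, eeg_rows)])
    # Second dataset: the first dataset joined with the test-stimulus features.
    second = first.assign(**stimulus_features)
    return first, second
```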
11. A method of measuring or estimating, or both, the user's response to the stimulus using the platform of claim 10, the method comprising:
extracting one or more features associated with the stimulus;
extracting the one or more features of the test stimulus, stored in the memory associated with the system, that match the one or more features of the stimulus; and
extracting one or more of the cognitive features, or one or more of the emotional features, or both, from the second training dataset matching the extracted one or more features associated with the stimulus, for estimating the user's response to the stimulus and for classifying the response to the stimulus.
12. A method for predicting one or more emotional features and one or more cognitive
features of the user using the platform of claim 10, the method comprising:
receiving one or more images of the user's face or a video of the user's face;
measuring facial data from the one or more images or from the video; and
deriving the one or more emotional features and the one or more cognitive features of the user by correlating the facial data with the first training dataset.
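Claim 12 predicts EEG-derived features from facial data alone, which reads naturally as a regression learned on the first training dataset; the regressor and column names below are assumptions.

```python
# A sketch of claim 12: learn a facial-data -> EEG-features mapping from the
# first training dataset, then apply it to images captured without an EEG
# headset. The ridge regressor and column names are assumptions.
from sklearn.linear_model import Ridge
from sklearn.multioutput import MultiOutputRegressor

def fit_face_to_eeg_model(first_dataset, face_cols, eeg_cols):
    X = first_dataset[face_cols].to_numpy()
    Y = first_dataset[eeg_cols].to_numpy()
    model = MultiOutputRegressor(Ridge()).fit(X, Y)
    return model  # model.predict(face_features) -> estimated EEG-derived features
```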
13. A system for measuring or estimating, or both, a user's response to a stimulus and for classifying the response, the system comprising:
a processor, the processor configured for:
presenting a test stimulus to one or more users;
extracting one or more features associated with the test stimulus;
capturing one or more images of the one or more users' faces and simultaneously capturing EEG signals of the one or more users;
measuring facial data from the one or more images of each of the one or more users;
deriving one or more emotional features from the facial data, and one or more cognitive features and one or more emotional features from the EEG signals of each of the one or more users;
creating a training dataset by correlating the one or more emotional features from the facial data, and the one or more cognitive features and the one or more emotional features from the EEG signals, with the one or more features associated with the test stimulus; and
creating a platform by storing the training dataset, for measuring or estimating, or both, a user's response to the stimulus;
a memory, the memory configured for:
storing one or more features associated with the test stimulus;
storing the training dataset;
wherein the processor is further configured for:
extracting one or more features associated with the stimulus;
extracting the one or more features of the test stimulus, stored in the memory associated with the system, that match the one or more features of the stimulus; and
extracting one or more of the cognitive features, or one or more of the emotional features, or both, from the training dataset matching the extracted one or more features associated with the stimulus, for estimating the user's response to the stimulus and for classifying the response to the stimulus.
| # | Name | Date |
|---|---|---|
| 1 | 201941026450-Response to office action [17-10-2024(online)].pdf | 2024-10-17 |
| 2 | 201941026450-Response to office action [27-08-2024(online)].pdf | 2024-08-27 |
| 3 | 201941026450-IntimationOfGrant24-01-2023.pdf | 2023-01-24 |
| 4 | 201941026450-PatentCertificate24-01-2023.pdf | 2023-01-24 |
| 5 | 201941026450-Written submissions and relevant documents [11-11-2021(online)].pdf | 2021-11-11 |
| 6 | 201941026450-Correspondence to notify the Controller [26-10-2021(online)].pdf | 2021-10-26 |
| 7 | 201941026450-FORM-26 [26-10-2021(online)].pdf | 2021-10-26 |
| 8 | 201941026450-FER.pdf | 2021-10-17 |
| 9 | 201941026450-US(14)-HearingNotice-(HearingDate-27-10-2021).pdf | 2021-10-17 |
| 10 | 201941026450-ABSTRACT [04-09-2021(online)].pdf | 2021-09-04 |
| 11 | 201941026450-CLAIMS [04-09-2021(online)].pdf | 2021-09-04 |
| 12 | 201941026450-COMPLETE SPECIFICATION [04-09-2021(online)].pdf | 2021-09-04 |
| 13 | 201941026450-DRAWING [04-09-2021(online)].pdf | 2021-09-04 |
| 14 | 201941026450-FER_SER_REPLY [04-09-2021(online)].pdf | 2021-09-04 |
| 15 | 201941026450-OTHERS [04-09-2021(online)].pdf | 2021-09-04 |
| 16 | 201941026450-Response to office action [13-07-2021(online)].pdf | 2021-07-13 |
| 17 | 201941026450-FORM 3 [18-05-2021(online)].pdf | 2021-05-18 |
| 18 | 201941026450-EVIDENCE FOR REGISTRATION UNDER SSI [31-12-2020(online)].pdf | 2020-12-31 |
| 19 | 201941026450-FORM 18A [31-12-2020(online)].pdf | 2020-12-31 |
| 20 | 201941026450-MSME CERTIFICATE [31-12-2020(online)].pdf | 2020-12-31 |
| 21 | 201941026450-FORM FOR SMALL ENTITY [31-12-2020(online)].pdf | 2020-12-31 |
| 22 | 201941026450-FORM28 [31-12-2020(online)].pdf | 2020-12-31 |
| 23 | 201941026450-FORM-9 [31-12-2020(online)].pdf | 2020-12-31 |
| 24 | 201941026450-FORM 3 [07-12-2020(online)].pdf | 2020-12-07 |
| 25 | 201941026450-Request Letter-Correspondence [27-07-2020(online)].pdf | 2020-07-27 |
| 26 | 201941026450-CERTIFIED COPIES TRANSMISSION TO IB [22-07-2020(online)].pdf | 2020-07-22 |
| 27 | 201941026450-Form 1 (Submitted on date of filing) [22-07-2020(online)].pdf | 2020-07-22 |
| 28 | 201941026450-FORM28 [22-07-2020(online)].pdf | 2020-07-22 |
| 29 | 201941026450-Request Letter-Correspondence [22-07-2020(online)].pdf | 2020-07-22 |
| 30 | Correspondence by Agent_Form1,Form26_17-09-2019.pdf | 2019-09-17 |
| 31 | 201941026450-FORM-26 [12-09-2019(online)].pdf | 2019-09-12 |
| 32 | 201941026450-Proof of Right (MANDATORY) [12-09-2019(online)].pdf | 2019-09-12 |
| 33 | 201941026450-COMPLETE SPECIFICATION [02-07-2019(online)].pdf | 2019-07-02 |
| 34 | 201941026450-DECLARATION OF INVENTORSHIP (FORM 5) [02-07-2019(online)].pdf | 2019-07-02 |
| 35 | 201941026450-DRAWINGS [02-07-2019(online)].pdf | 2019-07-02 |
| 36 | 201941026450-EVIDENCE FOR REGISTRATION UNDER SSI [02-07-2019(online)].pdf | 2019-07-02 |
| 37 | 201941026450-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [02-07-2019(online)].pdf | 2019-07-02 |
| 38 | 201941026450-FORM 1 [02-07-2019(online)].pdf | 2019-07-02 |
| 39 | 201941026450-FORM FOR SMALL ENTITY [02-07-2019(online)].pdf | 2019-07-02 |
| 40 | 201941026450-FORM FOR SMALL ENTITY(FORM-28) [02-07-2019(online)].pdf | 2019-07-02 |
| 41 | 201941026450-STATEMENT OF UNDERTAKING (FORM 3) [02-07-2019(online)].pdf | 2019-07-02 |
| 42 | searchE_13-01-2021.pdf | 2021-01-13 |