Abstract: The present disclosure discloses a method of predicting an interest of a user and a system thereof. The method comprises receiving, by the system, a first set of inputs related to the user from a first set of sensors, wherein one or more features of the user are extracted from the first set of inputs. The method further comprises receiving a second set of inputs related to an environment around the user from a second set of sensors, assigning weights to each input among the first set of inputs based on the second set of inputs, identifying a user behavior associated with each of the one or more features of the user by comparing the one or more features with one or more predefined features, and predicting the interest of the user based on the user behavior associated with each of the one or more features and the weights assigned to each input among the first set of inputs, thus increasing accuracy of prediction. Figure 2
Claims: We claim:
1. A method of predicting an interest of a user, the method comprising:
receiving, by an interest prediction system (100), a first set of inputs related to the user from a first set of sensors, wherein one or more features of the user are extracted from the first set of inputs;
receiving, by the interest prediction system (100), a second set of inputs related to an environment around the user from a second set of sensors;
assigning, by the interest prediction system (100), weights to each input among the first set of inputs based on the second set of inputs;
identifying, by the interest prediction system (100), a user behavior associated with each of the one or more features of the user by comparing the one or more features with one or more predefined features; and
predicting, by the interest prediction system (100), the interest of the user based on the user behavior associated with each of the one or more features of the first set of inputs and the weights assigned to each input among the first set of inputs.
2. The method as claimed in claim 1, wherein the first set of inputs comprises at least one of image data, video data, audio data and text data.
3. The method as claimed in claim 1, wherein the second set of inputs comprises at least one of temperature data, luminosity data, humidity data, air composition data, air quality data and audio noise data.
4. The method as claimed in claim 1, wherein the one or more features comprises at least one of body postures of the user, facial parameters of the user and audio parameters of the user.
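The claimed method steps (claims 1 to 4) can be illustrated with a minimal sketch. All names, feature labels, weighting heuristics, and score values below are hypothetical assumptions for illustration only; they are not part of the claims or the disclosed embodiments.

```python
# Hypothetical sketch of the claimed flow. The feature-to-behavior
# mapping, weighting thresholds, and scores are illustrative only.

# Predefined features and their associated user behaviors
# (the comparison step of claim 1).
PREDEFINED_BEHAVIORS = {
    "leaning_forward": "engaged",
    "smiling": "positive",
    "raised_voice": "excited",
}

# Assumed numeric score per identified behavior.
BEHAVIOR_SCORES = {"engaged": 1.0, "positive": 0.8, "excited": 0.6}


def assign_weights(first_inputs, second_inputs):
    """Weight each user-side input based on the environment inputs.

    Illustrative heuristic: discount audio inputs when ambient noise
    is high, and image/video inputs when luminosity is low.
    """
    weights = {}
    for name in first_inputs:
        w = 1.0
        if name == "audio" and second_inputs.get("noise_db", 0) > 70:
            w = 0.3
        if name in ("image", "video") and second_inputs.get("lux", 100) < 10:
            w = 0.4
        weights[name] = w
    return weights


def predict_interest(first_inputs, second_inputs):
    """Return a weighted interest score in [0, 1].

    first_inputs: mapping of input type (e.g. "video", "audio") to a
    list of extracted user features; second_inputs: environment readings.
    """
    weights = assign_weights(first_inputs, second_inputs)
    num = den = 0.0
    for name, features in first_inputs.items():
        for feature in features:
            behavior = PREDEFINED_BEHAVIORS.get(feature)  # comparison step
            if behavior is None:
                continue  # feature does not match a predefined feature
            num += weights[name] * BEHAVIOR_SCORES[behavior]
            den += weights[name]
    return num / den if den else 0.0
```

In this sketch, a noisy environment reduces the contribution of audio-derived behaviors to the final prediction, which mirrors the claimed weighting of the first set of inputs based on the second set of inputs.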
5. An interest prediction system (100), comprising:
a processor (203); and
a memory (202), communicatively coupled to the processor (203), which stores processor-executable instructions which, on execution, cause the processor to:
receive a first set of inputs related to a user from a first set of sensors, wherein one or more features of the user are extracted from the first set of inputs;
receive a second set of inputs related to an environment around the user from a second set of sensors;
assign weights to each input among the first set of inputs based on the second set of inputs;
identify a user behavior associated with each of the one or more features of the user by comparing the one or more features with one or more predefined features; and
predict the interest of the user based on the user behavior associated with each of the one or more features of the first set of inputs and the weights assigned to each input among the first set of inputs.
6. The interest prediction system (100) as claimed in claim 5, wherein the first set of inputs comprises at least one of image data, video data, audio data and text data.
7. The interest prediction system (100) as claimed in claim 5, wherein the first set of sensors comprises at least one of image capturing devices, video capturing devices, audio recording devices and a user interface.
8. The interest prediction system (100) as claimed in claim 5, wherein the second set of inputs comprises at least one of temperature data, luminosity data, humidity data, air composition data, air quality data and audio noise data.
9. The interest prediction system (100) as claimed in claim 5, wherein the second set of sensors comprises at least one of temperature sensors, luminosity sensors, humidity sensors, air monitor sensors and sound sensors.
10. The interest prediction system (100) as claimed in claim 5, wherein the one or more features comprises at least one of body postures of the user, facial parameters of the user and audio parameters of the user.
Dated this 16th day of February 2017
R Ramya Rao
Of K&S Partners
Agent for the Applicant
Description: TECHNICAL FIELD
The present disclosure relates to the field of predictive analytics. More particularly, but not exclusively, the present disclosure relates to a method of predicting an interest of a user and a system thereof.
| # | Name | Date |
|---|---|---|
| 1 | 201741005568-IntimationOfGrant04-07-2023.pdf | 2023-07-04 |
| 2 | 201741005568-PatentCertificate04-07-2023.pdf | 2023-07-04 |
| 3 | 201741005568-FORM-26 [16-05-2023(online)].pdf | 2023-05-16 |
| 4 | 201741005568-Written submissions and relevant documents [10-05-2023(online)].pdf | 2023-05-10 |
| 5 | 201741005568-AMENDED DOCUMENTS [13-04-2023(online)].pdf | 2023-04-13 |
| 6 | 201741005568-Correspondence to notify the Controller [13-04-2023(online)].pdf | 2023-04-13 |
| 7 | 201741005568-FORM 13 [13-04-2023(online)].pdf | 2023-04-13 |
| 8 | 201741005568-POA [13-04-2023(online)].pdf | 2023-04-13 |
| 9 | 201741005568-US(14)-HearingNotice-(HearingDate-27-04-2023).pdf | 2023-04-05 |
| 10 | 201741005568-FER_SER_REPLY [08-12-2020(online)].pdf | 2020-12-08 |
| 11 | 201741005568-FORM 3 [08-12-2020(online)].pdf | 2020-12-08 |
| 12 | 201741005568-PETITION UNDER RULE 137 [08-12-2020(online)].pdf | 2020-12-08 |
| 13 | 201741005568-FER.pdf | 2020-06-09 |
| 14 | abstract201741005568.jpg | 2017-06-07 |
| 15 | Correspondence by Agent_Form 30_Form 1_02-06-2017.pdf | 2017-06-02 |
| 16 | PROOF OF RIGHT [31-05-2017(online)].pdf | 2017-05-31 |
| 17 | REQUEST FOR CERTIFIED COPY [22-02-2017(online)].pdf | 2017-02-22 |
| 18 | Power of Attorney [16-02-2017(online)].pdf | 2017-02-16 |
| 19 | Form 5 [16-02-2017(online)].pdf | 2017-02-16 |
| 20 | Form 3 [16-02-2017(online)].pdf | 2017-02-16 |
| 21 | Form 18 [16-02-2017(online)].pdf_277.pdf | 2017-02-16 |
| 22 | Form 18 [16-02-2017(online)].pdf | 2017-02-16 |
| 23 | Drawing [16-02-2017(online)].pdf | 2017-02-16 |
| 24 | Description(Complete) [16-02-2017(online)].pdf_276.pdf | 2017-02-16 |
| 25 | Description(Complete) [16-02-2017(online)].pdf | 2017-02-16 |
| 1 | searchedstrategyE_09-06-2020.pdf |
| 2 | searchstrategyAE_02-07-2021.pdf |