Abstract: The present disclosure relates to a method and system for obtaining interactive user feedback in real-time by a feedback obtaining system. The feedback obtaining system establishes a connection between a user device of a user and a server of a service provider based on a user location received from the user device, receives static data of the user from the server and dynamic data of the user from a capturing device located at a site of the service provider, identifies contextual information associated with the user based on the static data and the dynamic data, provides one or more feedback queries for the user from a database based on the contextual information, provides one or more sub-feedback queries for the user based on a response of the user to the one or more feedback queries, and obtains user feedback based on the responses of the user to the one or more feedback queries and the one or more sub-feedback queries, together with implicit feedback. Using implicit feedback alongside explicit responses yields more effective user feedback. Fig. 1
Claims:
WE CLAIM:
1. A method for obtaining interactive user feedback in real-time, the method comprising:
establishing, by a feedback obtaining system (101), a connection between a user device of a user and a server of a service provider (105) based on a user location received from the user device;
receiving, by the feedback obtaining system (101), static data of the user from the server and dynamic data of the user from a capturing device located at a site of the service provider;
identifying, by the feedback obtaining system (101), contextual information (205) associated with the user based on the static data and the dynamic data, wherein the contextual information (205) comprises implicit feedback of the user;
providing, by the feedback obtaining system (101), one or more feedback queries for the user from a database based on the contextual information (205);
providing, by the feedback obtaining system (101), one or more sub-feedback queries for the user based on a response of the user for the one or more feedback queries; and
obtaining, by the feedback obtaining system (101), user feedback based on a response of the user for the one or more sub-feedback queries and the one or more feedback queries and the implicit feedback.
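The steps of claim 1 can be sketched as a small pipeline; a minimal sketch, assuming dict-based data and canned user answers (every name, data format, and helper below is hypothetical — the claims do not prescribe an implementation):

```python
# Illustrative sketch of the claimed real-time feedback flow (claim 1).
# All names and data shapes here are assumptions for illustration only.

def identify_context(static_data, dynamic_data):
    """Identify contextual information from static data (server-side, e.g.
    historic feedback topics) and dynamic data (on-site capture, e.g. actions
    observed in video), including implicit feedback of the user."""
    return {
        "topics": static_data["history_topics"] + dynamic_data["observed_actions"],
        "implicit_feedback": dynamic_data["sentiment"],
    }

def obtain_feedback(static_data, dynamic_data, query_db, answer_fn):
    context = identify_context(static_data, dynamic_data)
    # Select one or more feedback queries from the database based on context.
    queries = [query_db[t] for t in context["topics"] if t in query_db]
    responses = {q: answer_fn(q) for q in queries}
    # Pose one sub-feedback query per response, driven by the answer given.
    sub_responses = {
        f"[follow-up to: {q}]": answer_fn(f"Why do you say '{r}'?")
        for q, r in responses.items()
    }
    # Combine explicit responses with the implicit feedback.
    return {"explicit": {**responses, **sub_responses},
            "implicit": context["implicit_feedback"]}

# Hypothetical usage, with a lambda standing in for a live user:
query_db = {"billing": "How was checkout?", "waiting": "Was the wait acceptable?"}
static = {"history_topics": ["billing"]}
dynamic = {"observed_actions": ["waiting"], "sentiment": "neutral"}
report = obtain_feedback(static, dynamic, query_db, lambda q: "ok")
```

Here the context simply selects query topics; in the claimed system the contextual information is derived from static and dynamic attributes of user actions (claims 5 and 12).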
2. The method as claimed in claim 1, wherein the user location comprises coordinate details of the user.
3. The method as claimed in claim 1, wherein the static data of the user comprises contact details, login credentials, user images and historic feedback data associated with the user.
4. The method as claimed in claim 1, wherein the dynamic data of the user comprises video of the user captured at the site of the service provider.
5. The method as claimed in claim 1, wherein identifying contextual information (205) comprises generating one or more static attributes and one or more dynamic attributes associated with user actions.
6. The method as claimed in claim 1 further comprising generating a feedback report (211) based on the user feedback.
7. The method as claimed in claim 1 further comprising recreating a virtual environment of the site of the service provider for the user for providing feedback, when the connection established between the user device and the server of the service provider (105) disconnects.
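The fallback of claim 7 (recreating a virtual environment of the site when the connection drops) could be sketched as follows; the snapshot structure and function names are assumptions, not part of the claims:

```python
# Hypothetical sketch of the claim-7 fallback: when the connection between the
# user device and the service provider's server disconnects, feedback
# collection continues against a recreated virtual environment of the site.

def collect_feedback(connection_alive, site_snapshot, ask_fn):
    if connection_alive:
        # Normal path: feedback is collected over the live connection.
        return {"mode": "live", "answer": ask_fn("How was your visit?")}
    # Fallback path: rebuild the site context from a stored snapshot so the
    # user can still relate their feedback to what they experienced on site.
    virtual_env = {"site": site_snapshot["site_name"],
                   "scenes": list(site_snapshot["scenes"])}
    prompt = f"In this recreated view of {virtual_env['site']}, how was your visit?"
    return {"mode": "virtual", "environment": virtual_env, "answer": ask_fn(prompt)}

# Hypothetical usage after a dropped connection:
snapshot = {"site_name": "the service site", "scenes": ["entrance", "counter"]}
result = collect_feedback(False, snapshot, lambda prompt: "good")
```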
8. A feedback obtaining system (101) for obtaining interactive user feedback in real-time, comprising:
a processor; and
a memory communicatively coupled to the processor, wherein the memory stores processor instructions, which, on execution, cause the processor to:
establish a connection between a user device of a user and a server of a service provider (105) based on a user location received from the user device;
receive static data of the user from the server and dynamic data of the user from a capturing device located at a site of the service provider;
identify contextual information (205) associated with the user based on the static data and the dynamic data, wherein the contextual information (205) comprises implicit feedback of the user;
provide one or more feedback queries for the user from a database based on the contextual information (205);
provide one or more sub-feedback queries for the user based on a response of the user for the one or more feedback queries; and
obtain user feedback based on a response of the user for the one or more feedback queries and the one or more sub-feedback queries and the implicit feedback.
9. The feedback obtaining system (101) as claimed in claim 8, wherein the user location comprises coordinate details of the user.
10. The feedback obtaining system (101) as claimed in claim 8, wherein the static data of the user comprises contact details, login credentials, user images and historic feedback data associated with the user.
11. The feedback obtaining system (101) as claimed in claim 8, wherein the dynamic data of the user comprises video of the user captured at the site of the service provider.
12. The feedback obtaining system (101) as claimed in claim 8, wherein the processor (117) identifies contextual information (205) by generating one or more static attributes and one or more dynamic attributes associated with user actions.
13. The feedback obtaining system (101) as claimed in claim 8, wherein the processor (117) generates a feedback report (211) based on the user feedback.
14. The feedback obtaining system (101) as claimed in claim 8, wherein the processor (117) recreates a virtual environment of the site of the service provider for the user for providing feedback, when the connection established between the user device and the server of the service provider (105) disconnects.
Dated this 16th day of February 2017
R Ramya Rao
Of K&S Partners
Agent for the Applicant
Description:
TECHNICAL FIELD
The present subject matter is related in general to collecting feedback, more particularly, but not exclusively, to a method and system for obtaining interactive user feedback in real-time.
| # | Name | Date |
|---|---|---|
| 1 | 201741005562-IntimationOfGrant20-12-2022.pdf | 2022-12-20 |
| 2 | Power of Attorney [16-02-2017(online)].pdf | 2017-02-16 |
| 3 | 201741005562-PatentCertificate20-12-2022.pdf | 2022-12-20 |
| 4 | Form 5 [16-02-2017(online)].pdf | 2017-02-16 |
| 5 | Form 3 [16-02-2017(online)].pdf | 2017-02-16 |
| 6 | 201741005562-Written submissions and relevant documents [01-11-2022(online)].pdf | 2022-11-01 |
| 7 | Form 18 [16-02-2017(online)].pdf_239.pdf | 2017-02-16 |
| 8 | 201741005562-AMENDED DOCUMENTS [06-10-2022(online)].pdf | 2022-10-06 |
| 9 | Form 18 [16-02-2017(online)].pdf | 2017-02-16 |
| 10 | 201741005562-Correspondence to notify the Controller [06-10-2022(online)].pdf | 2022-10-06 |
| 11 | Drawing [16-02-2017(online)].pdf | 2017-02-16 |
| 12 | 201741005562-FORM 13 [06-10-2022(online)].pdf | 2022-10-06 |
| 13 | Description(Complete) [16-02-2017(online)].pdf_238.pdf | 2017-02-16 |
| 14 | 201741005562-POA [06-10-2022(online)].pdf | 2022-10-06 |
| 15 | Description(Complete) [16-02-2017(online)].pdf | 2017-02-16 |
| 16 | 201741005562-US(14)-HearingNotice-(HearingDate-21-10-2022).pdf | 2022-09-21 |
| 17 | 201741005562-CLAIMS [19-01-2021(online)].pdf | 2021-01-19 |
| 18 | REQUEST FOR CERTIFIED COPY [22-02-2017(online)].pdf | 2017-02-22 |
| 19 | 201741005562-CORRESPONDENCE [19-01-2021(online)].pdf | 2021-01-19 |
| 20 | PROOF OF RIGHT [31-05-2017(online)].pdf | 2017-05-31 |
| 21 | 201741005562-DRAWING [19-01-2021(online)].pdf | 2021-01-19 |
| 22 | Correspondence by Agent_Form 1_02-06-2017.pdf | 2017-06-02 |
| 23 | 201741005562-FER.pdf | 2020-07-24 |
| 24 | 201741005562-FER_SER_REPLY [19-01-2021(online)].pdf | 2021-01-19 |
| 25 | 201741005562-FORM 3 [19-01-2021(online)].pdf | 2021-01-19 |
| 26 | 201741005562-RELEVANT DOCUMENTS [19-01-2021(online)].pdf | 2021-01-19 |
| 27 | 201741005562-Information under section 8(2) [19-01-2021(online)].pdf | 2021-01-19 |
| 28 | 201741005562-PETITION UNDER RULE 137 [19-01-2021(online)].pdf | 2021-01-19 |
| 29 | 201741005562-OTHERS [19-01-2021(online)].pdf | 2021-01-19 |
| 30 | TPOSEARCHREPORTE_14-07-2020.pdf | 2020-07-14 |