Abstract: The present disclosure discloses a method and device for diagnosing problems in appliances. The device may receive a user input describing a problem related to an appliance. Further, the device extracts one or more objects from the user input for determining at least one effect of the problem and determines a problem domain based on the one or more objects. Further, the device retrieves a plurality of causes from the problem domain leading to the at least one effect. Furthermore, the device instructs the user to perform at least one action and analyses user observations to determine an actual cause of the problem from the plurality of causes, thereby diagnosing the problem in appliances. The method and device of the present disclosure diagnose problems in appliances by interacting with the user in real time. Figure 2
Claims:
We claim:
1. A method for assisted diagnosis of problems in appliances, comprising:
receiving, by an assistance device, a user input describing a problem related to an appliance;
extracting, by the assistance device, one or more objects from the user input, wherein at least one effect of the problem is determined based on the one or more objects;
determining, by the assistance device, a problem domain from a plurality of problem domains based on the one or more objects;
retrieving, by the assistance device, a plurality of causes from the problem domain leading to the at least one effect;
instructing, by the assistance device, the user to perform at least one action related to the appliance corresponding to at least one of the plurality of causes, wherein user observations are received upon completion of the at least one action; and
analysing, by the assistance device, the user observations for determining a cause from the plurality of causes corresponding to the at least one effect, for diagnosing the problem in appliances.
2. The method as claimed in claim 1, wherein the user input comprises at least one of text, speech, gestures, images and videos.
3. The method as claimed in claim 1, wherein extraction further comprises generating queries to the user based on the user input for extracting one or more objects from the user input.
4. The method as claimed in claim 1, wherein the one or more objects comprises at least one of a keyword and an image frame.
5. The method as claimed in claim 1, wherein each of the plurality of problem domains comprises information on problems related to each of one or more appliances, wherein the information comprises the plurality of causes corresponding to the at least one effect and the at least one action to be performed corresponding to each of the plurality of causes.
6. The method as claimed in claim 1, wherein the user observations are at least one of inputs received from the user and inputs received from one or more sensors associated with the assistance device.
7. An assistance device for diagnosing problems in appliances, said assistance device comprising:
a processor; and
a memory, communicatively coupled with the processor, storing processor-executable instructions which, on execution, cause the processor to:
receive a user input describing a problem related to an appliance;
extract one or more objects from the user input, wherein at least one effect of the problem is determined based on the one or more objects;
determine a problem domain from a plurality of problem domains based on the one or more objects;
retrieve a plurality of causes from the problem domain leading to the at least one effect;
instruct the user to perform at least one action related to the appliance corresponding to at least one of the plurality of causes, wherein user observations are received upon completion of the at least one action; and
analyse the user observations for determining a cause from the plurality of causes corresponding to the at least one effect, for diagnosing the problem in appliances.
8. The device as claimed in claim 7, wherein the user input comprises at least one of text, speech, gestures, images and videos.
9. The device as claimed in claim 7, wherein extraction further comprises generating queries to the user based on the user input for extracting one or more objects from the user input.
10. The device as claimed in claim 7, wherein the one or more objects comprises at least one of a keyword and an image frame.
11. The device as claimed in claim 7, wherein each of the plurality of problem domains comprises information on problems related to each of one or more appliances, wherein the information comprises the plurality of causes corresponding to the at least one effect and the at least one action to be performed corresponding to each of the plurality of causes.
12. The device as claimed in claim 7, wherein the user observations are at least one of inputs received from the user and inputs received from one or more sensors associated with the assistance device.
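The steps of claim 1 can be read as a simple cause-elimination loop: extract objects, determine a domain and effect, retrieve candidate causes, instruct the user, and analyse the observations. The sketch below is a hypothetical illustration only; the knowledge base, function names, and keyword matching are all assumptions, since the claims do not prescribe any particular data structure or matching technique.

```python
# Hypothetical knowledge base: problem domain -> effect -> {cause: action}.
PROBLEM_DOMAINS = {
    "washing machine": {
        "drum not spinning": {
            "unbalanced load": "redistribute the laundry and restart the cycle",
            "faulty lid switch": "open the lid and close it firmly",
        }
    }
}

def diagnose(user_input, observe):
    """Walk the claimed steps: extract objects, pick a problem domain,
    retrieve causes, instruct the user, and analyse the observations."""
    # Steps 1-3: extract keyword objects, then determine the domain and effect.
    domain = next((d for d in PROBLEM_DOMAINS if d in user_input), None)
    if domain is None:
        return None
    effect = next((e for e in PROBLEM_DOMAINS[domain] if e in user_input), None)
    if effect is None:
        return None
    # Step 4: retrieve the plurality of causes leading to the effect.
    causes = PROBLEM_DOMAINS[domain][effect]
    # Steps 5-6: instruct the user to act, then analyse each observation.
    for cause, action in causes.items():
        observation = observe(f"Please {action}. Is the problem resolved?")
        if observation == "resolved":
            return cause  # the determined actual cause
    return None
```

Under these assumptions, `diagnose("my washing machine drum not spinning", lambda q: "resolved")` would identify "unbalanced load" as the cause after the first instructed action; the `observe` callback stands in for either user-reported or sensor-derived observations, as in claims 6 and 12.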
Dated this 18th day of March, 2018
R Ramya Rao
Of K&S Partners
Agent for the Applicant
IN/PA/1607
Description:
TECHNICAL FIELD
The present disclosure relates to virtual assistance. More particularly, but not exclusively, the present disclosure relates to a method and an assistance device for providing real-time assistance to diagnose problems in appliances.
| Section | Controller | Decision Date |
|---|---|---|
| 43,47 | PRAVEEN KUMAR | 2024-04-15 |
| # | Name | Date |
|---|---|---|
| 1 | 201841009874-STATEMENT OF UNDERTAKING (FORM 3) [18-03-2018(online)].pdf | 2018-03-18 |
| 2 | 201841009874-REQUEST FOR EXAMINATION (FORM-18) [18-03-2018(online)].pdf | 2018-03-18 |
| 3 | 201841009874-POWER OF AUTHORITY [18-03-2018(online)].pdf | 2018-03-18 |
| 4 | 201841009874-FORM 18 [18-03-2018(online)].pdf | 2018-03-18 |
| 5 | 201841009874-FORM 1 [18-03-2018(online)].pdf | 2018-03-18 |
| 6 | 201841009874-DRAWINGS [18-03-2018(online)].pdf | 2018-03-18 |
| 7 | 201841009874-DECLARATION OF INVENTORSHIP (FORM 5) [18-03-2018(online)].pdf | 2018-03-18 |
| 8 | 201841009874-COMPLETE SPECIFICATION [18-03-2018(online)].pdf | 2018-03-18 |
| 9 | 201841009874-REQUEST FOR CERTIFIED COPY [04-05-2018(online)].pdf | 2018-05-04 |
| 10 | 201841009874-Proof of Right (MANDATORY) [01-08-2018(online)].pdf | 2018-08-01 |
| 11 | Correspondence by Agent_Form 1_07-08-2018.pdf | 2018-08-07 |
| 12 | Abstract 201841009874.jpg | 2018-08-29 |
| 13 | 201841009874-RELEVANT DOCUMENTS [26-02-2021(online)].pdf | 2021-02-26 |
| 14 | 201841009874-PETITION UNDER RULE 137 [26-02-2021(online)].pdf | 2021-02-26 |
| 15 | 201841009874-Information under section 8(2) [26-02-2021(online)].pdf | 2021-02-26 |
| 16 | 201841009874-FORM 3 [26-02-2021(online)].pdf | 2021-02-26 |
| 17 | 201841009874-FER_SER_REPLY [05-03-2021(online)].pdf | 2021-03-05 |
| 18 | 201841009874-FER.pdf | 2021-10-17 |
| 19 | 201841009874-US(14)-HearingNotice-(HearingDate-01-03-2024).pdf | 2024-02-07 |
| 20 | 201841009874-POA [12-02-2024(online)].pdf | 2024-02-12 |
| 21 | 201841009874-FORM 13 [12-02-2024(online)].pdf | 2024-02-12 |
| 22 | 201841009874-Correspondence to notify the Controller [12-02-2024(online)].pdf | 2024-02-12 |
| 23 | 201841009874-AMENDED DOCUMENTS [12-02-2024(online)].pdf | 2024-02-12 |
| 24 | 201841009874-Written submissions and relevant documents [15-03-2024(online)].pdf | 2024-03-15 |
| 25 | 201841009874-FORM-26 [15-03-2024(online)].pdf | 2024-03-15 |
| 26 | 201841009874-FORM 3 [15-03-2024(online)].pdf | 2024-03-15 |
| 27 | 201841009874-PatentCertificate15-04-2024.pdf | 2024-04-15 |
| 28 | 201841009874-IntimationOfGrant15-04-2024.pdf | 2024-04-15 |
| # | Name |
|---|---|
| 1 | 2021-04-2915-04-04AE_29-04-2021.pdf |
| 2 | 2020-09-0115-08-56E_01-09-2020.pdf |