Abstract: In one embodiment, a system for alerting a user, comprising a processor and a memory, is disclosed. The processor receives one or more sensor data. The processor further determines one or more environmental factors, a current activity of the user, one or more motion data of an electronic device and one or more objects in proximity to the user based on the one or more sensor data. The processor further generates an initial degree of safety for each of the one or more environmental factors, the current activity of the user, the one or more motion data of the electronic device and the one or more objects in proximity to the user based on dynamically adaptive thresholds. The processor further generates an aggregate degree of safety based on the initial degree of safety and one or more pre-defined rules. The processor further alerts the user based on the aggregate degree of safety. FIG. 2
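As a purely illustrative sketch (not part of the specification), the pipeline summarized above can be expressed in Python as follows. The SAFE/CARE/UNSAFE labels follow claims 9 and 19; the threshold values, the "most severe degree dominates" aggregation rule, and all function and variable names are assumptions made for the example.

```python
# Illustrative sketch only; names and values are hypothetical.
from enum import Enum


class Safety(Enum):
    SAFE = 0
    CARE = 1
    UNSAFE = 2


def initial_degree(value: float, thresholds: dict) -> Safety:
    """Map a single factor (environment, activity, motion, proximity)
    to an initial degree of safety using adaptive thresholds."""
    if value >= thresholds["unsafe"]:
        return Safety.UNSAFE
    if value >= thresholds["care"]:
        return Safety.CARE
    return Safety.SAFE


def aggregate_degree(degrees: list) -> Safety:
    """Combine initial degrees with one example pre-defined rule:
    the most severe per-factor degree dominates."""
    return max(degrees, key=lambda d: d.value)


def alert_user(aggregate: Safety) -> None:
    """Alert the user whenever the aggregate degree is not SAFE."""
    if aggregate is not Safety.SAFE:
        print(f"Alert: current situation is {aggregate.name}")


# Example run: thresholds tightened for speed because an object (e.g. a
# vehicle) has been detected in proximity to the user.
factors = {"tilt": 0.4, "speed": 0.9, "illumination": 0.2}
adaptive_thresholds = {
    "tilt": {"care": 0.5, "unsafe": 0.8},
    "speed": {"care": 0.3, "unsafe": 0.7},
    "illumination": {"care": 0.6, "unsafe": 0.9},
}
degrees = [initial_degree(v, adaptive_thresholds[k]) for k, v in factors.items()]
alert_user(aggregate_degree(degrees))
```

In this sketch the pre-defined rule is simply that the worst per-factor degree determines the aggregate; the specification leaves room for richer rules and for adapting the thresholds to the detected objects.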
Claims:
WE CLAIM
1. A method of alerting a user, the method comprising:
receiving, by an electronic device, one or more sensor data;
determining, by the electronic device, one or more environmental factors, a current activity of the user, one or more motion data of the electronic device and one or more objects in proximity to the user based on the one or more sensor data;
generating, by the electronic device, an initial degree of safety for each of the one or more environmental factors, the current activity of the user, the one or more motion data of the electronic device and the one or more objects in proximity to the user based on dynamically adaptive thresholds;
generating, by the electronic device, an aggregate degree of safety based on the initial degree of safety and one or more pre-defined rules; and
alerting, by the electronic device, the user based on the aggregate degree of safety.
2. The method as claimed in claim 1, wherein the user is alerted to one or more situations in which operating the electronic device is potentially dangerous.
3. The method as claimed in claim 1, wherein the dynamically adaptive thresholds are based on the one or more objects in proximity to the user.
4. The method as claimed in claim 1, wherein the one or more environmental factors comprises at least one of height from ground level, posture of the user, location, temperature, humidity or illumination.
5. The method as claimed in claim 1, wherein the current activity of the user comprises at least one of running, walking, driving, biking or remaining still.
6. The method as claimed in claim 1, wherein the one or more motion data of the electronic device comprises at least one of tilt, shake, rotation or swing.
7. The method as claimed in claim 1, wherein the one or more objects in proximity to the user comprises at least one of hills, valleys, vehicles, fire, water bodies, slippery surfaces, wild animals, safety notices or number of people.
8. The method as claimed in claim 1, further comprising identifying the one or more objects in proximity to the user using a multimodal recurrent neural network.
9. The method as claimed in claim 1, wherein the initial degree of safety and the aggregate degree of safety comprises at least one of safe, unsafe or care.
10. The method as claimed in claim 1, further comprising disabling features of the electronic device after a predefined time interval from the time of alerting the user.
11. A system for alerting a user, the system comprising:
a processor;
a memory communicatively coupled to the processor, wherein the memory stores processor-executable instructions which, on execution, cause the processor to:
receive one or more sensor data;
determine one or more environmental factors, a current activity of the user, one or more motion data of the electronic device and one or more objects in proximity to the user based on the one or more sensor data;
generate an initial degree of safety for each of the one or more environmental factors, the current activity of the user, the one or more motion data of the electronic device and the one or more objects in proximity to the user based on dynamically adaptive thresholds;
generate an aggregate degree of safety based on the initial degree of safety and one or more pre-defined rules; and
alert the user based on the aggregate degree of safety.
12. The system as claimed in claim 11, wherein the user is alerted to one or more situations in which operating the electronic device is potentially dangerous.
13. The system as claimed in claim 11, wherein the dynamically adaptive thresholds are based on the one or more objects in proximity to the user.
14. The system as claimed in claim 11, wherein the one or more environmental factors comprises at least one of height from ground level, posture of the user, location, temperature, humidity or illumination.
15. The system as claimed in claim 11, wherein the current activity of the user comprises at least one of running, walking, driving, biking or remaining still.
16. The system as claimed in claim 11, wherein the one or more motion data of the electronic device comprises at least one of tilt, shake, rotation or swing.
17. The system as claimed in claim 11, wherein the one or more objects in proximity to the user comprises at least one of hills, valleys, vehicles, fire, water bodies, slippery surfaces, wild animals, safety notices or number of people.
18. The system as claimed in claim 11, wherein the processor is further configured to identify the one or more objects in proximity to the user using a multimodal recurrent neural network.
19. The system as claimed in claim 11, wherein the initial degree of safety and the aggregate degree of safety comprises at least one of safe, unsafe or care.
20. The system as claimed in claim 11, wherein the processor is further configured to disable the features of the electronic device after a predefined time interval from the time of alerting the user.
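By way of a further hypothetical sketch, claims 3 and 13 (thresholds adapted dynamically to objects in proximity) and claims 10 and 20 (disabling device features a predefined interval after the alert) could be illustrated as below. The hazard set, the 0.5 scaling factor, and all function names are assumptions for the example, not the claimed implementation.

```python
# Hypothetical sketch of claims 3/13 and 10/20; names and values are illustrative.
import threading


def adapt_thresholds(base: dict, objects_in_proximity: set) -> dict:
    """Tighten per-factor thresholds when hazardous objects are detected nearby."""
    hazardous = {"vehicle", "fire", "water body", "slippery surface", "wild animal"}
    scale = 0.5 if objects_in_proximity & hazardous else 1.0
    return {
        factor: {level: limit * scale for level, limit in limits.items()}
        for factor, limits in base.items()
    }


def disable_features_after(delay_seconds: float, disable_fn) -> threading.Timer:
    """Disable selected device features a predefined interval after alerting the user."""
    timer = threading.Timer(delay_seconds, disable_fn)
    timer.start()
    return timer


# Usage: halve the tilt thresholds near a detected vehicle, then schedule a
# feature lock 10 seconds after the alert.
base = {"tilt": {"care": 0.5, "unsafe": 0.8}}
tightened = adapt_thresholds(base, {"vehicle"})
disable_features_after(10.0, lambda: print("Camera and messaging disabled"))
```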
Dated this 14th day of June, 2017
R Ramya Rao
Of K&S Partners
Agent for the Applicant
Description:
TECHNICAL FIELD
This disclosure relates generally to end user devices and more particularly to a system and method for alerting a user on end user devices.
| # | Name | Date |
|---|---|---|
| 1 | Power of Attorney [14-06-2017(online)].pdf | 2017-06-14 |
| 2 | Form 5 [14-06-2017(online)].pdf | 2017-06-14 |
| 3 | Form 3 [14-06-2017(online)].pdf | 2017-06-14 |
| 4 | Form 18 [14-06-2017(online)].pdf_154.pdf | 2017-06-14 |
| 5 | Form 18 [14-06-2017(online)].pdf | 2017-06-14 |
| 6 | Form 1 [14-06-2017(online)].pdf | 2017-06-14 |
| 7 | Drawing [14-06-2017(online)].pdf | 2017-06-14 |
| 8 | Description(Complete) [14-06-2017(online)].pdf_153.pdf | 2017-06-14 |
| 9 | Description(Complete) [14-06-2017(online)].pdf | 2017-06-14 |
| 10 | Request For Certified Copy-Online.pdf | 2017-06-15 |
| 11 | REQUEST FOR CERTIFIED COPY [15-06-2017(online)].pdf | 2017-06-15 |
| 12 | 201741020816-Abstract.jpg | 2017-06-16 |
| 13 | Request For Certified Copy-Online.pdf_1.pdf | 2017-07-05 |
| 14 | 201741020816-Proof of Right (MANDATORY) [01-09-2017(online)].pdf | 2017-09-01 |
| 15 | Correspondence by Agent_Form1_05-09-2017.pdf | 2017-09-05 |
| 16 | 201741020816-REQUEST FOR CERTIFIED COPY [20-12-2017(online)].pdf | 2017-12-20 |
| 17 | 201741020816-FER.pdf | 2020-07-24 |
| 18 | 201741020816-RELEVANT DOCUMENTS [19-01-2021(online)].pdf | 2021-01-19 |
| 19 | 201741020816-PETITION UNDER RULE 137 [19-01-2021(online)].pdf | 2021-01-19 |
| 20 | 201741020816-OTHERS [19-01-2021(online)].pdf | 2021-01-19 |
| 21 | 201741020816-Information under section 8(2) [19-01-2021(online)].pdf | 2021-01-19 |
| 22 | 201741020816-FORM 3 [19-01-2021(online)].pdf | 2021-01-19 |
| 23 | 201741020816-FER_SER_REPLY [19-01-2021(online)].pdf | 2021-01-19 |
| 24 | 201741020816-DRAWING [19-01-2021(online)].pdf | 2021-01-19 |
| 25 | 201741020816-CORRESPONDENCE [19-01-2021(online)].pdf | 2021-01-19 |
| 26 | 201741020816-CLAIMS [19-01-2021(online)].pdf | 2021-01-19 |
| 27 | 201741020816-ABSTRACT [19-01-2021(online)].pdf | 2021-01-19 |
| 28 | 201741020816-PatentCertificate13-12-2023.pdf | 2023-12-13 |
| 29 | 201741020816-IntimationOfGrant13-12-2023.pdf | 2023-12-13 |
| 30 | 201741020816-PROOF OF ALTERATION [10-04-2024(online)].pdf | 2024-04-10 |
| 1 | searchE_21-07-2020.pdf | |