Abstract: The embodiments herein provide a system and method for optimal imaging of a pupil for tracking across all gaze positions and lighting conditions. The system comprises a goggle module with an unobtrusive frame that provides a wide field of view and enables an inferomedial eccentric camera placement. An infra-red band pass filter is provided for cutting off direct light and reflections from the surroundings. An array of linear infrared LEDs is mounted on the frame to provide optimum illumination to the pupil. The goggle module is connected to a computing module for analysis of the captured visual data. The system neither interferes with the natural line of sight nor restricts the field of vision. Applications of the system include medical, defense, entertainment, exploration and scientific research applications.
We Claim:
1. A system for optimal imaging of a pupil for balance assessment of a patient, the system comprising:
a goggle module comprising an unobtrusive frame, and wherein the unobtrusive frame is configured for providing a wide field of view, and wherein the unobtrusive frame is further configured for fitting snugly on the face of a user without any adjustments, and wherein the goggle module is configured for providing complete light sealing in order to avoid any external interference;
one or more cameras coupled with the unobtrusive frame in an inferomedial eccentric manner, and wherein the one or more cameras are configured for capturing a plurality of high quality eye images;
a computing module running on a hardware processor and communicatively coupled to the goggle module, and wherein the computing module is configured for enabling the user to analyze the movement of the pupil for balance assessment of the patient based on the plurality of high quality eye images captured using the goggle module; and
an array of a plurality of linear infrared LEDs coupled to the unobtrusive frame, and wherein the array of a plurality of linear infrared LEDs is configured for providing optimum illumination to the pupil of the eye when ambient light is low;
wherein the one or more cameras are arranged in a lower frame of the goggle module and the plurality of linear infrared LEDs are arranged at preset distances to emit light at preset angles.
2. The system according to claim 1, wherein the goggle module further comprises an infra-red band pass filter coupled to the unobtrusive frame, and wherein the infra-red band pass filter is configured for cutting-off direct light and reflections from surroundings when the ambient light is high.
3. The system according to claim 1, wherein the computing module further comprises an image analysis engine, and wherein the image analysis engine is configured for analyzing the plurality of high quality eye images captured using the one or more cameras, and wherein the plurality of high quality eye images are analyzed by the image analysis engine using one or more machine learning algorithms.
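For illustration only (this sketch is not part of the claims and does not reproduce the claimed machine-learning analysis): under infrared illumination the pupil is typically the darkest region of the eye image, so a minimal first estimate of pupil position can be obtained by thresholding and taking a centroid. The function name, threshold value, and image representation below are assumptions chosen for the example.

```python
# Minimal dark-pupil locator: threshold, then centroid of dark pixels.
# `image` is a 2-D list of grayscale values in 0..255 (an assumption made
# for this sketch; real systems work on camera frames and are far more robust).

def pupil_centroid(image, threshold=40):
    """Return the (row, col) centroid of pixels darker than `threshold`,
    or None if no pixel is dark enough."""
    row_sum = col_sum = count = 0
    for r, line in enumerate(image):
        for c, value in enumerate(line):
            if value < threshold:
                row_sum += r
                col_sum += c
                count += 1
    if count == 0:
        return None
    return row_sum / count, col_sum / count

# Synthetic 5x5 "eye image": bright sclera with a dark 2x2 pupil patch.
img = [[200] * 5 for _ in range(5)]
for r in (2, 3):
    for c in (2, 3):
        img[r][c] = 10

print(pupil_centroid(img))  # (2.5, 2.5)
```

In practice this centroid would only seed a more robust stage (ellipse fitting or the learned models the claim refers to), since eyelashes, shadows and corneal reflections also produce dark or bright outliers.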
4. The system according to claim 1, wherein the computing module further comprises an analytics engine, and wherein the analytics engine is configured for generating one or more reports for the analysis performed using the image analysis engine, and wherein the one or more reports generated are visually presented to the user using a display device, and wherein the generated reports are also downloadable in a PDF or video format on external storage media.
5. The system according to claim 1, wherein the computing module further comprises an artificial intelligence engine, and wherein the artificial intelligence engine is configured for interpreting any neuro-vestibular disorders in the patient based on the analysis of the plurality of captured high quality eye images.
6. The system according to claim 1, wherein the computing module further comprises an augmented reality engine, and wherein the augmented reality engine is configured for combining virtual and real time test scenes for an accurate balance disorder diagnosis.
7. The system according to claim 1, wherein the system is further configured for subjecting a patient to a plurality of tests for diagnosing balance disorders, and wherein the plurality of tests comprise a saccades test, a smooth pursuit test, a caloric reflex test, an optokinetic stimulation test, a nystagmus test, a gaze fixation test, a subjective midpoint test, a pupillometry test, a head impulse test, an open test, a positional test and a subjective visual vertical test.
8. The system according to claim 1, wherein the goggle module is further configured for recording video and graph of any eye movement for diagnosis of balance disorders in the patient.
9. The system according to claim 1, wherein the one or more cameras comprise two eye tracking cameras and one scene camera, and wherein the scene camera is configured for providing additional means for head tracking and auto-calibration.
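As an illustrative sketch only (not the patented method): the auto-calibration that the scene camera enables can be framed as fitting a mapping from pupil-center coordinates in the eye-camera frame to gaze points in the scene-camera frame. The example below fits an independent linear map per axis by ordinary least squares; the function name and sample values are assumptions for the example.

```python
# Per-axis linear gaze calibration: fit gaze ≈ a * pupil + b by least squares
# from fixation samples, then map a new pupil position to a gaze estimate.

def fit_axis(pupil, gaze):
    """Least-squares fit gaze ≈ a * pupil + b; returns (a, b)."""
    n = len(pupil)
    mean_p = sum(pupil) / n
    mean_g = sum(gaze) / n
    var = sum((p - mean_p) ** 2 for p in pupil)
    cov = sum((p - mean_p) * (g - mean_g) for p, g in zip(pupil, gaze))
    a = cov / var
    b = mean_g - a * mean_p
    return a, b

# Calibration samples: pupil x-positions recorded while the user fixates
# known targets in the scene-camera frame (toy numbers).
pupil_x = [10.0, 20.0, 30.0]
scene_x = [100.0, 200.0, 300.0]
a, b = fit_axis(pupil_x, scene_x)
print(a * 25.0 + b)  # 250.0 — gaze estimate for a new pupil position
```

Real trackers typically use higher-order polynomial or homography models and continuously refine them against scene-camera features, which is what makes head tracking and auto-calibration possible without manual adjustment.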
10. The system according to claim 1, wherein the system is further configured for enabling accurate tracking of the pupil movement even when the patient is wearing prescription glasses.
11. A computer implemented method comprising instructions stored on a non-transitory computer readable storage medium and run on a computing device provided with a hardware processor and memory for an optimal imaging of a pupil in an eye, the method comprising the steps of:
fitting a goggle module to a user, and wherein the goggle module is communicatively connected to a computing module;
setting preferences of image capture in the computing device and initiating image capture process through the goggle module;
storing image capture information in a storage database and enabling real-time analysis through the computing module;
ending the image capture process as per user preference and analyzing the captured data; and
performing image capture analysis as per user preferences and rendering the analysis to the user on a plurality of devices;
wherein a plurality of cameras of the goggle module are arranged in a lower frame and a plurality of LEDs are arranged at preset distances to emit light at preset angles.
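The method steps above (fit the goggle module, set capture preferences, capture, store, analyze, render) can be sketched as a simple capture-session pipeline. This is an illustration only: the class, its interface, and the toy 1-D "pupil position" data are assumptions, not the claimed implementation.

```python
# Toy capture-session pipeline mirroring the claimed method steps.

class CaptureSession:
    def __init__(self, preferences):
        self.preferences = preferences   # e.g. frame rate, test duration
        self.frames = []                 # stands in for the storage database

    def capture(self, camera_frames):
        """Store frames received from the goggle module's cameras."""
        self.frames.extend(camera_frames)

    def analyze(self):
        """Placeholder analysis: mean of per-frame pupil positions."""
        if not self.frames:
            return None
        return sum(self.frames) / len(self.frames)

    def render(self):
        """Render a minimal textual report for the user's display device."""
        return f"frames={len(self.frames)} mean_position={self.analyze()}"

session = CaptureSession({"fps": 120})
session.capture([1.0, 2.0, 3.0])   # toy 1-D pupil positions
print(session.render())  # frames=3 mean_position=2.0
```

In the claimed system the analysis step would instead invoke the image analysis and analytics engines, and rendering would target the plurality of user devices.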
12. The method according to claim 11, wherein the step of analyzing the captured data further comprises analyzing the plurality of captured high quality eye images using one or more machine learning algorithms through an image analysis engine.
13. The method according to claim 11, wherein the method further comprises generating one or more reports for the analysis performed using an image analysis engine, and wherein the one or more reports generated are visually presented to the user using a display device, and wherein the generated reports are also downloadable in a PDF or video format on external storage media.
14. The method according to claim 11, wherein the method further comprises interpreting any neuro-vestibular disorders in the patient based on the captured plurality of high quality eye images using an artificial intelligence engine.
15. The method according to claim 11, wherein the method further comprises combining virtual and real time test scenes for accurate balance disorder diagnosis using an augmented reality engine.
16. The method according to claim 11, wherein the method further comprises subjecting a patient to a plurality of tests for diagnosing balance disorders, and wherein the plurality of tests comprise a saccades test, a smooth pursuit test, a caloric reflex test, an optokinetic stimulation test, a nystagmus test, a gaze fixation test, a subjective midpoint test, a
pupillometry test, a head impulse test, an open test, a positional test and a subjective visual vertical test.
17. The method according to claim 11, wherein the method further comprises providing optimum illumination to the pupil of the eye when ambient light is low using an array of a plurality of linear infrared LEDs coupled to the unobtrusive frame of the goggle module.
18. The method according to claim 11, wherein the method further comprises cutting-off direct light and reflections from surroundings when the ambient light is high using an infra-red band pass filter coupled to the unobtrusive frame of the goggle module.
19. The method according to claim 11, wherein the method further comprises enabling accurate tracking of the pupil movement even when the patient is wearing prescription glasses.
| # | Name | Date |
|---|---|---|
| 1 | 201947035121-FORM 4 [11-06-2024(online)].pdf | 2024-06-11 |
| 2 | 201947035121-IntimationOfGrant08-03-2024.pdf | 2024-03-08 |
| 3 | 201947035121-PatentCertificate08-03-2024.pdf | 2024-03-08 |
| 4 | 201947035121-AMMENDED DOCUMENTS [28-02-2024(online)].pdf | 2024-02-28 |
| 5 | 201947035121-FORM 13 [28-02-2024(online)].pdf | 2024-02-28 |
| 6 | 201947035121-FORM 3 [28-02-2024(online)].pdf | 2024-02-28 |
| 7 | 201947035121-MARKED COPIES OF AMENDEMENTS [28-02-2024(online)].pdf | 2024-02-28 |
| 8 | 201947035121-PETITION UNDER RULE 137 [28-02-2024(online)].pdf | 2024-02-28 |
| 9 | 201947035121-RELEVANT DOCUMENTS [28-02-2024(online)].pdf | 2024-02-28 |
| 10 | 201947035121-Written submissions and relevant documents [28-02-2024(online)].pdf | 2024-02-28 |
| 11 | 201947035121-Correspondence to notify the Controller [09-02-2024(online)].pdf | 2024-02-09 |
| 12 | 201947035121-US(14)-HearingNotice-(HearingDate-13-02-2024).pdf | 2024-01-16 |
| 13 | 201947035121-FORM 3 [08-11-2023(online)].pdf | 2023-11-08 |
| 14 | 201947035121-FORM 3 [24-05-2023(online)].pdf | 2023-05-24 |
| 15 | 201947035121-FORM 3 [30-06-2022(online)].pdf | 2022-06-30 |
| 16 | 201947035121-FER_SER_REPLY [28-02-2022(online)].pdf | 2022-02-28 |
| 17 | 201947035121-FORM 13 [28-02-2022(online)].pdf | 2022-02-28 |
| 18 | 201947035121-FORM 4(ii) [28-12-2021(online)].pdf | 2021-12-28 |
| 19 | 201947035121-FER.pdf | 2021-10-18 |
| 20 | 201947035121SearchstrategyE_29-06-2021.pdf | |
| 21 | 201947035121-FORM 18 [06-06-2020(online)].pdf | 2020-06-06 |
| 22 | Correspondence by Agent_POA_06-09-2019.pdf | 2019-09-06 |
| 23 | 201947035121.pdf | 2019-08-31 |
| 24 | 201947035121-STATEMENT OF UNDERTAKING (FORM 3) [30-08-2019(online)].pdf | 2019-08-30 |
| 25 | 201947035121-PROOF OF RIGHT [30-08-2019(online)].pdf | 2019-08-30 |
| 26 | 201947035121-POWER OF AUTHORITY [30-08-2019(online)].pdf | 2019-08-30 |
| 27 | 201947035121-NOTIFICATION OF INT. APPLN. NO. & FILING DATE (PCT-RO-105) [30-08-2019(online)].pdf | 2019-08-30 |
| 28 | 201947035121-FORM FOR SMALL ENTITY(FORM-28) [30-08-2019(online)].pdf | 2019-08-30 |
| 29 | 201947035121-FORM 1 [30-08-2019(online)].pdf | 2019-08-30 |
| 30 | 201947035121-FIGURE OF ABSTRACT [30-08-2019(online)].jpg | 2019-08-30 |
| 31 | 201947035121-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [30-08-2019(online)].pdf | 2019-08-30 |
| 32 | 201947035121-DRAWINGS [30-08-2019(online)].pdf | 2019-08-30 |
| 33 | 201947035121-DECLARATION OF INVENTORSHIP (FORM 5) [30-08-2019(online)].pdf | 2019-08-30 |
| 34 | 201947035121-COMPLETE SPECIFICATION [30-08-2019(online)].pdf | 2019-08-30 |