Abstract: Disclosed is a system for detecting eye fatigue and strain, comprising an image capture module within a mobile application, configured to periodically capture live images of a user; an eye recognition module programmed to detect the eye area and movement from the captured images; an analysis module for determining the state of eye fatigue or strain based on recognized eye movements; and an alert module designed to generate an auditory alarm via the mobile application to alert the user upon detection of eye fatigue or strain. Fig. 1
Description: Field of the Invention
[0001] The present disclosure generally relates to eye fatigue and strain detection systems. Particularly, the present disclosure relates to a system for detecting eye fatigue and strain through an image capture module within a mobile application.
Background
[0002] The background description includes information that may be useful in understanding the present invention. It is not an admission that any of the information provided herein is prior art or relevant to the presently claimed invention, or that any publication specifically or implicitly referenced is prior art.
[0003] The exploration of digital content through mobile devices has become an integral part of daily life, with reading digital texts being a predominant activity. Users engage in prolonged periods of reading on mobile platforms, which has introduced concerns regarding visual health. Specifically, the strain and fatigue experienced by the eyes during extensive reading sessions on such devices pose significant challenges. The illumination, text size, and the act of focusing on small, pixelated fonts for extended periods contribute to these issues, leading to discomfort and potential long-term visual impairment.
[0004] Techniques for monitoring and mitigating eye strain and fatigue have been developed, focusing on the identification of symptoms associated with prolonged exposure to screens. Traditional methods include the use of software that adjusts screen brightness and color temperature based on ambient conditions or time of day. While beneficial, these solutions do not address real-time detection and mitigation of eye fatigue specifically during reading sessions on mobile devices. They are static solutions, lacking the ability to dynamically assess the user's visual state and provide immediate, personalized interventions.
[0005] Further developments have introduced hardware-based solutions, such as external devices that monitor eye movement and blink rates to estimate fatigue levels. These devices, while effective in certain environments, are not practical for mobile use due to their requirement for additional hardware and lack of integration with mobile reading applications. The inconvenience of carrying extra devices and the need for continuous synchronization with mobile platforms limit their applicability for monitoring eye health in real-time during mobile reading activities.
[0006] Moreover, software applications have been designed to prompt users to take breaks or perform eye exercises based on fixed time intervals. However, these applications do not account for individual differences in susceptibility to eye fatigue and strain. The generalized approach fails to accurately detect the onset of eye strain, often interrupting users unnecessarily or failing to alert users in a timely
manner. The lack of personalized monitoring and intervention mechanisms highlights the need for an integrated solution capable of accurately detecting and responding to eye fatigue in real time during the reading process on mobile devices.
[0007] In light of the above discussion, there exists an urgent need for solutions that overcome the problems associated with conventional systems and techniques for detecting and mitigating eye fatigue and strain during reading sessions on mobile devices. Such solutions should offer real-time, personalized monitoring and intervention, seamlessly integrated within mobile reading applications to enhance user comfort and prevent long-term visual health issues without the need for additional hardware or generalized, static intervention strategies.
Summary
[0008] The present disclosure generally relates to eye fatigue and strain detection systems. Particularly, the present disclosure relates to a system for detecting eye fatigue and strain through an image capture module within a mobile application.
[0009] The following presents a simplified summary of various aspects of this disclosure in order to provide a basic understanding of such aspects. This summary is not an extensive overview of all contemplated aspects, and is intended to neither identify key or critical
elements nor delineate the scope of such aspects. Its purpose is to present some concepts of this disclosure in a simplified form as a prelude to the more detailed description that is presented later.
[00010] The following paragraphs provide additional support for the claims of the subject application.
[00011] The disclosure describes a comprehensive system for detecting eye fatigue and strain, incorporating a suite of technologies within a mobile application to monitor and analyze the user's eye condition in real time. The core components of this system include an image capture module, an eye recognition module, an analysis module, and an alert module. The image capture module is embedded within the mobile application and is programmed to periodically capture live images of the user. This functionality enables continuous monitoring of the user's eyes, especially during activities that may induce eye strain, such as reading.
[00012] In an embodiment, the image capture module is specifically activated during reading activities. This feature ensures that the system is particularly vigilant during periods when the user is likely to experience eye fatigue, thereby enhancing the system's effectiveness in real-time eye condition monitoring. By focusing on periods of intensive eye use, the system offers targeted intervention to mitigate the adverse effects of prolonged reading sessions on the user's eyes.
[00013] In another embodiment, the image capture module operates on a schedule, capturing images at predetermined intervals. This systematic approach ensures consistent monitoring of the user's eye condition, enabling the system to track changes in eye health over time. The regular capture of images allows for the detection of subtle variations in eye movement and condition that may indicate the onset of fatigue or strain.
[00014] In a further embodiment, the eye recognition module employs a sophisticated algorithm to analyze the captured images, focusing on detecting the position and openness of the eyes. This technology enables the system to assess the user's eye condition accurately by examining specific markers that indicate fatigue, such as decreased blink rate or increased eye closure.
[00015] In yet another embodiment, the analysis module incorporates machine learning algorithms to evaluate variations in eye movement patterns. These algorithms are trained to identify signs of eye fatigue or strain, making the system capable of learning from a vast dataset of eye movements to improve its accuracy and reliability over time.
[00016] In an additional embodiment, the alert module is designed to generate an auditory alarm to notify the user when eye fatigue or strain is detected. This alarm is customizable, allowing users to adjust the volume and tone according to their preferences. The customization ensures that
the alarm is effective in capturing the user's attention without being intrusive or disruptive.
[00017] In a further embodiment, the mobile application is engineered to function in the background, permitting the system to operate unobtrusively while the user engages in other activities on their mobile device. This background functionality ensures that the system's monitoring and analysis processes do not interfere with the user's normal use of their device, enhancing user experience and compliance.
[00018] In another embodiment, the system includes a user interface that presents real-time analysis of the user's eye condition. This interface provides immediate feedback to the user, allowing them to understand their eye health status and take necessary precautions to prevent further strain or fatigue.
[00019] In an additional embodiment, if the user does not acknowledge the auditory alarm, the alert module is configured to trigger further interactive prompts. These prompts are designed to ensure that the user is aware of their eye condition and encourage them to take breaks or adjust their reading habits to alleviate strain.
[00020] Lastly, the method for detecting eye fatigue and strain employs a sequence of steps that begins with activating the image capture module during a user's reading activity, followed by the periodic capturing of live images. These images are then analyzed using an eye
recognition algorithm to identify signs of fatigue or strain. If such
conditions are detected, the system generates an auditory alarm to alert the user. This methodological approach ensures that the system is both proactive and responsive in managing and mitigating eye fatigue and strain, offering a practical solution for preserving eye health in the digital age.
Brief Description of the Drawings
[00021] The features and advantages of the present disclosure would be more clearly understood from the following description taken in conjunction with the accompanying drawings in which:
[00022] FIG. 1 illustrates a system for detecting eye fatigue and strain, in accordance with the embodiments of the present disclosure;
[00023] FIG. 2 illustrates a method for detecting eye fatigue and strain, in accordance with the embodiments of the present disclosure; and
[00024] FIG. 3 illustrates a generalized architecture of a drowsiness detection process, in accordance with the embodiments of the present disclosure.
Detailed Description
[00025] In the following detailed description of the invention,
reference is made to the accompanying drawings that form a part hereof, and in which is shown, by way of illustration, specific embodiments in
which the invention may be practiced. In the drawings, like numerals describe substantially similar components throughout the several views. These embodiments are described in sufficient detail to enable those skilled in the art to practice the invention. Other embodiments may be utilized and structural, logical, and electrical changes may be made without departing from the scope of the present invention. The following detailed description is, therefore, not to be taken in a limiting sense, and the scope of the present invention is defined only by the appended claims and equivalents thereof.
[00026] The use of the terms “a” and “an” and “the” and “at least one” and similar referents in the context of describing the invention (especially in the context of the following claims) are to be construed to cover both the singular and the plural, unless otherwise indicated herein or clearly contradicted by context. The use of the term “at least one” followed by a list of one or more items (for example, “at least one of A and B”) is to be construed to mean one item selected from the listed items (A or B) or any combination of two or more of the listed items (A and B), unless otherwise indicated herein or clearly contradicted by context. The terms “comprising,” “having,” “including,” and “containing” are to be construed as open-ended terms (i.e., meaning “including, but not limited to,”) unless otherwise noted. Recitation of ranges of values herein is merely intended to serve as a shorthand method of referring individually to each separate value falling within the range, unless
otherwise indicated herein, and each separate value is incorporated into the specification as if it were individually recited herein. All methods described herein can be performed in any suitable order unless otherwise indicated herein or otherwise clearly contradicted by context. The use of any and all examples, or exemplary language (e.g., “such as”) provided herein, is intended merely to better illuminate the invention and does not pose a limitation on the scope of the invention unless otherwise claimed. No language in the specification should be construed as indicating any non-claimed element as essential to the practice of the invention.
[00027] The present disclosure generally relates to eye fatigue and strain detection systems. Particularly, the present disclosure relates to a system for detecting eye fatigue and strain through an image capture module within a mobile application.
[00028] Pursuant to the "Detailed Description" section herein, whenever an element is explicitly associated with a specific numeral for the first time, such association shall be deemed consistent and applicable throughout the entirety of the "Detailed Description" section, unless otherwise expressly stated or contradicted by the context.
[00029] The following detailed description illustrates embodiments of the present disclosure and ways in which they can be implemented. Although some modes of carrying out the present disclosure have been disclosed, those skilled in the art would recognize that other embodiments for carrying out or practicing the present disclosure are also possible.
[00030] FIG. 1 illustrates a system (100) for detecting eye fatigue and strain, in accordance with the embodiments of the present disclosure. Said system (100) comprises an image capture module (102) within a mobile application, configured to periodically capture live images of a user. An eye recognition module (104) is programmed to detect the eye area and movement from the captured images. An analysis module (106) is included for determining the state of eye fatigue or strain based on recognized eye movements. Additionally, an alert module (108) is incorporated to generate an auditory alarm via the mobile application to alert the user upon detection of eye fatigue or strain.
[00031] In an embodiment, the image capture module (102) within the mobile application is configured to capture live images of the user at predetermined intervals. Said intervals can be adjusted based on user preferences or automatically based on the analysis of the user's eye condition over time. The flexibility in setting the capture intervals enables the system (100) to efficiently monitor the user's eye condition without causing unnecessary interruptions.
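The adaptive adjustment of capture intervals described above can be sketched as follows. This is a minimal illustration only; the 0.5 pivot, the halving/doubling factors, and the bounds are assumed values, not parameters recited in the disclosure.

```python
def adjust_interval(current_s, fatigue_score, min_s=5.0, max_s=120.0):
    """Shorten the capture interval when fatigue indicators rise and
    lengthen it when the eyes appear rested.

    `fatigue_score` is a 0..1 estimate from the analysis module; the
    0.5 pivot and the halving/doubling factors are illustrative
    assumptions, as are the clamping bounds."""
    if fatigue_score > 0.5:
        new_s = current_s / 2.0   # fatigued: sample more often
    else:
        new_s = current_s * 2.0   # rested: sample less often
    return max(min_s, min(max_s, new_s))
```

For example, a 60-second interval would tighten to 30 seconds under a high fatigue score and relax toward the 120-second ceiling under a low one, balancing monitoring fidelity against battery and processing cost.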
[00032] In an embodiment, the eye recognition module (104) utilizes advanced image processing algorithms to accurately detect the eye area and movements from the captured images. This module is capable of distinguishing between different types of eye movements and blinking patterns, which are critical indicators of eye fatigue or strain. The precision of the eye recognition module (104) ensures that the analysis is
based on reliable data, leading to accurate determination of the user's eye condition.
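One widely used openness measure that such an eye recognition module could employ is the eye aspect ratio (EAR), computed from six landmarks per eye. The sketch below assumes landmark extraction has already been performed on the captured image; the coordinates shown are synthetic examples, not output from any particular detector.

```python
import math

def eye_aspect_ratio(landmarks):
    """Compute the eye aspect ratio (EAR) from six (x, y) eye landmarks.

    Landmarks follow the common ordering [p1..p6]: p1 and p4 are the
    horizontal eye corners; (p2, p6) and (p3, p5) are vertical pairs.
    EAR falls toward 0 as the eye closes, so it serves as a simple
    per-frame openness signal."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    p1, p2, p3, p4, p5, p6 = landmarks
    vertical = dist(p2, p6) + dist(p3, p5)
    horizontal = 2.0 * dist(p1, p4)
    return vertical / horizontal

# Synthetic landmark sets: an open eye (tall vertical gaps) yields a
# higher EAR than a nearly closed one.
open_eye   = [(0, 5), (3, 9), (7, 9), (10, 5), (7, 1), (3, 1)]
closed_eye = [(0, 5), (3, 6), (7, 6), (10, 5), (7, 4), (3, 4)]
```

A per-frame EAR stream of this kind is the kind of "reliable data" on which downstream blink-rate and closure-duration analysis can be based.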
[00033] In an embodiment, the analysis module (106) employs machine learning techniques to analyze the detected eye movements and determine the state of eye fatigue or strain. Said module is programmed to recognize patterns that indicate fatigue or strain, such as increased blink rate, eye closure duration, and the rate of eye movements. By analyzing these patterns, the analysis module (106) can accurately assess the user's eye condition and trigger the alert module (108) when necessary.
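The blink-rate and closure-duration indicators mentioned above can be derived from a time series of openness samples. The following sketch assumes per-frame eye-aspect-ratio values are already available; the threshold and sampling rate are illustrative assumptions.

```python
def blink_stats(ear_samples, closed_threshold=0.25, fps=10):
    """Derive (blink count, longest closure duration in seconds) from a
    series of eye-aspect-ratio samples taken at `fps` samples/second.

    A "blink" is counted each time the eye transitions from closed back
    to open; a long maximum closure duration is a fatigue indicator."""
    blinks = 0
    longest_run = run = 0
    prev_closed = False
    for ear in ear_samples:
        closed = ear < closed_threshold
        if closed:
            run += 1
            longest_run = max(longest_run, run)
        else:
            if prev_closed:
                blinks += 1  # eye reopened: one completed blink
            run = 0
        prev_closed = closed
    if prev_closed:
        blinks += 1  # closure still in progress at end of window
    return blinks, longest_run / fps

# Two closures (2 samples and 1 sample long) in an 8-sample window:
samples = [0.3, 0.3, 0.1, 0.1, 0.3, 0.3, 0.1, 0.3]
```

An analysis module could then compare these statistics against the user's baseline to decide whether to trigger the alert module.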
[00034] In an embodiment, the alert module (108) is designed to generate an auditory alarm via the mobile application when eye fatigue or strain is detected. The alarm is intended to prompt the user to take a break from the screen, thereby mitigating the effects of eye fatigue or strain. The alert module (108) can be customized to offer different types of alarms, such as varying sounds or verbal reminders, providing flexibility in how the user is alerted to the detected condition.
[00035] In an embodiment, the system (100) further comprises a user interface within the mobile application, allowing the user to configure the settings of the image capture module (102), eye recognition module (104), analysis module (106), and alert module (108). This interface enables the user to personalize the system (100) according to individual preferences and needs, such as adjusting the frequency of image
captures, selecting the type of auditory alarm, and viewing reports on the detected eye condition over time.
[00036] In an embodiment, the system (100) includes a feedback mechanism for the user to provide input on the accuracy of the eye fatigue or strain detection and the effectiveness of the auditory alarms. This feedback is used to continuously improve the algorithms used by the eye recognition (104) and analysis modules (106), enhancing the system's overall performance and reliability.
[00037] In an embodiment, the system (100) is integrated within a range of mobile devices, making it widely accessible to users across different platforms. The compatibility with various mobile devices ensures that a larger audience can benefit from the features provided by the system (100) for detecting eye fatigue and strain.
[00038] In an embodiment, the system (100) utilizes secure data processing and storage techniques to protect the privacy of the user's data. The images captured by the image capture module (102) and the analysis results generated by the analysis module (106) are encrypted and stored in a secure manner, ensuring that the user's personal information is safeguarded against unauthorized access.
[00039] In an embodiment, the image capture module (102) of the system (100) is activated during a reading activity by the user to monitor the user's eye condition. This feature is particularly designed to address
the challenges associated with prolonged reading sessions, which can
significantly contribute to eye fatigue and strain. By focusing the activation of the image capture module (102) during reading activities, the system (100) ensures targeted monitoring when the risk of developing eye fatigue or strain is highest. The module leverages advanced image recognition technologies to detect when the user is engaged in reading, either through the analysis of on-screen content or the user's eye movement patterns, which tend to be more focused and static during reading activities. This targeted approach allows for efficient use of the device's resources, as continuous monitoring throughout all activities may not be necessary or efficient. Furthermore, by activating during reading activities, the system (100) provides timely interventions, prompting users to take breaks or adjust their reading habits before significant eye fatigue or strain sets in. This proactive measure contributes to better eye health management and reduces the likelihood of discomfort or more serious eye conditions developing over time.
[00040] In an embodiment, the image capture module (102) within the system (100) is configured to capture images at predetermined intervals. This functionality ensures that the monitoring of the user's eye condition is carried out systematically, providing consistent data over time. The predetermined intervals can be customized based on the user's preferences or based on recommendations derived from preliminary assessments of the user's eye health. This approach balances the need for regular monitoring with the desire to minimize intrusiveness, ensuring
that the user's experience with the mobile device is not adversely affected. The use of predetermined intervals allows the system (100) to efficiently allocate resources, optimizing battery life and processing power by avoiding continuous image capture. Moreover, this method facilitates the collection of data across different times and conditions, enabling a comprehensive analysis of the user's eye health. This data can reveal patterns in eye fatigue and strain, such as certain times of day when the user is more susceptible. The ability to capture images at predetermined intervals thus plays a crucial role in the overall effectiveness of the system (100) in detecting and managing eye fatigue and strain.
[00041] In an embodiment, the eye recognition module (104) within the system (100) utilizes an algorithm to detect the position and openness of the eyes. This sophisticated algorithm is the cornerstone of the module's ability to accurately monitor and assess the user's eye condition. By analyzing the position of the eyes, the system (100) can determine the direction of gaze and infer the user's focus or engagement level with a task. Similarly, assessing the openness of the eyes allows the system (100) to detect signs of fatigue or strain, such as frequent blinking or squinting. The algorithm is designed to work under various lighting conditions and with different user physiognomies, ensuring broad applicability and reliability. The eye recognition module's (104) ability to detect these specific aspects of the eye's appearance is critical for accurate analysis. It enables the system (100) to differentiate between
normal eye movements associated with reading or screen use and those movements or conditions indicative of eye fatigue or strain. This precision is vital for providing timely and appropriate alerts to the user, ensuring interventions are made when most needed to mitigate the effects of eye strain and enhance overall eye health.
[00042] In an embodiment, the analysis module (106) within the system (100) includes machine learning algorithms for assessing variations in eye movement patterns indicative of fatigue or strain. These algorithms are trained on vast datasets of eye movement patterns, allowing them to recognize subtle changes that may signal the onset of eye fatigue or strain. By continuously learning from new data, the algorithms improve their accuracy over time, adapting to the unique eye movement patterns of individual users. This personalized approach enhances the system's effectiveness in detecting eye fatigue or strain at their earliest stages. The machine learning algorithms analyze various parameters, including blink rate, blink duration, eye movement speed, and saccadic movements, to assess the user's eye condition. By identifying deviations from the user's normal eye movement patterns, the system (100) can detect signs of fatigue or strain even before the user becomes consciously aware of them. This proactive detection enables timely interventions, such as suggesting breaks or eye exercises, which can prevent the progression of eye fatigue or strain and contribute to the user's overall eye health.
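As a minimal stand-in for the learned model described above, a logistic-style scorer over a few eye-movement features might look like the following. The feature set, weights, and bias are purely illustrative assumptions, not trained values from the disclosure.

```python
import math

def fatigue_probability(features, weights, bias):
    """Logistic-regression-style scorer over eye-movement features,
    e.g. (blink_rate, mean_closure_s, normalized_saccade_speed).

    Returns a probability-like value in (0, 1); in a deployed system the
    weights would come from training on labeled eye-movement data."""
    z = bias + sum(w * f for w, f in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-z))

# Illustrative (untrained) parameters: higher blink rate and closure
# duration push the score up, faster saccades push it down.
WEIGHTS = (4.0, 5.0, -2.0)
BIAS = -2.0

fatigued = fatigue_probability((0.6, 0.4, 0.2), WEIGHTS, BIAS)
rested   = fatigue_probability((0.2, 0.1, 0.8), WEIGHTS, BIAS)
```

Personalization, as described above, would amount to adapting these parameters to deviations from an individual user's baseline rather than a population average.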
[00043] In an embodiment, the alert module (108) within the system
(100) is configured to produce an alarm sound that is adjustable by the user in terms of volume and tone. This customization feature is crucial for ensuring that the alerts are effective in capturing the user's attention without causing annoyance or disturbance. Users can select a sound that is pleasant and noticeable to them, increasing the likelihood that they will heed the alert and take the recommended action, such as taking a break or adjusting their screen usage. The adjustability of the alarm sound also accommodates different environments and situations; for example, a user may prefer a softer tone in a quiet workspace and a louder, more distinct sound in a noisy environment. This flexibility enhances the user's control over their interaction with the system, fostering a positive user experience and encouraging adherence to the system's recommendations for managing eye fatigue and strain.
[00044] In an embodiment, the mobile application within the system
(100) is configured to operate in a background mode while the user is engaged in other activities on the mobile device. This capability ensures that the system's monitoring and alert functions are continuously active, providing uninterrupted protection against eye fatigue and strain. Operating in the background allows the system (100) to perform its tasks without interfering with the user's interaction with other applications or the mobile device's functionality. This seamless integration into the user's daily device usage is critical for ensuring that the system's benefits are
realized without requiring significant changes to the user's habits or device settings. The background operation mode also enables the system
(100) to gather data and provide alerts in real time, ensuring that interventions can be made promptly to mitigate the effects of eye fatigue or strain. This unobtrusive approach to monitoring and alerting is essential for user compliance and satisfaction, making the system (100) an effective tool for managing eye health in the digital age.
[00045] In an embodiment, the system (100) includes a user interface for displaying real-time analysis of the user's eye condition. This user interface is designed to provide users with immediate feedback on their eye health, offering insights into their eye fatigue and strain levels. Through visual representations such as graphs, charts, or indicators, users can easily understand their current eye condition and track changes over time. This real-time analysis empowers users to make informed decisions about their screen usage and adopt healthier habits to protect their eyes. The user interface is intuitive and user-friendly, ensuring that users of all technical abilities can navigate the features and interpret the data presented. The availability of real-time analysis enhances the system's value as a comprehensive eye health management tool, encouraging users to take proactive steps towards minimizing eye fatigue and strain. This feature not only raises awareness about the importance of eye health but also facilitates a more engaged and proactive approach to eye care.
[00046] In an embodiment, the alert module (108) within the system (100) is further configured to trigger additional user-interactive prompts if the auditory alarm is not acknowledged. This feature ensures that critical alerts regarding eye fatigue or strain are not overlooked, increasing the effectiveness of the system (100) in promoting eye health. The additional prompts may include visual alerts on the screen, vibration alerts, or a series of escalating alarm sounds, requiring the user to interact with the device to confirm receipt of the warning. This interaction can also involve providing the user with suggestions for mitigating eye fatigue or strain, such as performing specific eye exercises or adjusting screen brightness. The aim of these interactive prompts is to encourage immediate action by the user, addressing the detected eye condition before it worsens. The implementation of multiple alert mechanisms caters to different user preferences and situations, ensuring that the system (100) remains effective across a wide range of contexts and user behaviors. This approach reflects a comprehensive strategy to manage eye health, emphasizing the importance of user engagement and responsiveness to the alerts of the system (100).
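The escalation behavior described above can be sketched as a simple ladder of alert channels. The specific ladder ordering is an illustrative assumption; the disclosure leaves the exact sequence of prompts open.

```python
def next_alert(level):
    """Escalation ladder used when an alert goes unacknowledged:
    auditory alarm first, then on-screen visual prompt, then vibration,
    then a louder repeating alarm. The ordering is illustrative."""
    ladder = ["auditory", "visual", "vibration", "escalating_auditory"]
    return ladder[min(level, len(ladder) - 1)]

def alert_until_acknowledged(is_acknowledged, max_levels=4):
    """Step through the ladder until the user acknowledges an alert,
    returning the list of channels that were tried.

    `is_acknowledged` stands in for the mobile application's
    acknowledgement check (a hypothetical callback)."""
    tried = []
    for level in range(max_levels):
        channel = next_alert(level)
        tried.append(channel)
        if is_acknowledged(channel):
            break
    return tried
```

Under this sketch, a user who only responds to vibration would see the auditory and visual prompts first, with the ladder stopping as soon as acknowledgement is received.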
[00047] FIG. 2 illustrates a method 200 for detecting eye fatigue and strain, in accordance with the embodiments of the present disclosure. At step 202, activate an image capture module within a mobile application specifically during a user's reading activity to monitor eye condition. At step 204, capture live images of the user periodically through the
activated image capture module to gather data on eye behavior. At step 206, utilize an eye recognition algorithm to accurately identify the eye area and movement from the captured images. At step 208, analyze the identified eye movement using the eye recognition algorithm to assess patterns indicative of eye fatigue or strain. At step 210, generate an auditory alarm through the mobile application to alert the user when eye fatigue or strain is determined from the analysis.
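The sequence of steps 202 through 210 can be sketched as a single monitoring cycle. The four callables below stand in for the modules (102) through (108); the 0.7 threshold is an illustrative assumption, not a value recited in the method.

```python
def detect_and_alert(capture, recognize, analyze, alarm, threshold=0.7):
    """One monitoring cycle mirroring steps 202-210: capture an image,
    recognize the eyes, analyze for fatigue, and alarm if needed.

    `capture`, `recognize`, `analyze`, and `alarm` are placeholders for
    the image capture, eye recognition, analysis, and alert modules.
    Returns True if the alarm was triggered."""
    image = capture()            # step 204: periodic live image
    eyes = recognize(image)      # step 206: detect eye area and movement
    fatigue = analyze(eyes)      # step 208: assess fatigue patterns
    if fatigue >= threshold:
        alarm()                  # step 210: auditory alarm to the user
        return True
    return False
```

In practice, step 202 would correspond to scheduling this cycle only while a reading activity is in progress, with the return value feeding the escalation logic of the alert module.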
[00048] FIG. 3 illustrates a generalized architecture of a drowsiness detection process, in accordance with the embodiments of the present disclosure. The process initiates with the user engaged in reading, during which the mobile application activates its image capture module. This module periodically captures live images of the user's face with particular emphasis on the eyes. Following image capture, the system employs an eye recognition algorithm that detects and analyzes the eye condition, identifying specific areas and movements that are critical in determining eye fatigue or strain. The algorithm scrutinizes these movements and positions to accurately assess signs of drowsiness. Once analysis confirms the presence of fatigue or strain, the system triggers the mobile application to generate an auditory alarm. This alert aims to notify the user of their condition, prompting them to take necessary action, such as taking a break from reading to rest the eyes, thereby mitigating the risks associated with continued strain and potential drowsiness.
[00050] Example embodiments herein have been described above with reference to block diagrams and flowchart illustrations of methods and apparatuses. It will be understood that each block of the block diagrams and flowchart illustrations, and combinations of blocks in the block diagrams and flowchart illustrations, respectively, can be implemented by various means including hardware, software, firmware, and a combination thereof. For example, in one embodiment, each block of the block diagrams and flowchart illustrations, and combinations of blocks in the block diagrams and flowchart illustrations can be implemented by computer program instructions. These computer program instructions may be loaded onto a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions which execute on the computer or other programmable data processing apparatus create means for implementing the functions specified in the flowchart block or blocks.
[00051] Throughout the present disclosure, the term ‘processing means’ or ‘microprocessor’ or ‘processor’ or ‘processors’ includes, but is not limited to, a general purpose processor (such as, for example, a complex instruction set computing (CISC) microprocessor, a reduced instruction set computing (RISC) microprocessor, a very long instruction word (VLIW) microprocessor, a microprocessor implementing other types of instruction sets, or a microprocessor implementing a combination of
types of instruction sets) or a specialized processor (such as, for example, an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a digital signal processor (DSP), or a network processor).
[00052] The term “non-transitory storage device” or “storage” or “memory,” as used herein relates to a random access memory, read only memory and variants thereof, in which a computer can store data or software for any duration.
[00053] Operations in accordance with a variety of aspects of the disclosure, as described above, need not be performed in the precise order described. Rather, various steps can be handled in reverse order, simultaneously, or not at all.
[00054] While several implementations have been described and illustrated herein, a variety of other means and/or structures for performing the function and/or obtaining the results and/or one or more of the advantages described herein may be utilized, and each of such variations and/or modifications is deemed to be within the scope of the implementations described herein. More generally, all parameters, dimensions, materials, and configurations described herein are meant to be exemplary, and the actual parameters, dimensions, materials, and/or configurations will depend upon the specific application or applications for which the teachings are used. Those skilled in the art will recognize, or be able to ascertain using no more than routine
experimentation, many equivalents to the specific implementations described herein. It is, therefore, to be understood that the foregoing implementations are presented by way of example only and that, within the scope of the appended claims and equivalents thereto, implementations may be practiced otherwise than as specifically described and claimed. Implementations of the present disclosure are directed to each individual feature, system, article, material, kit, and/or method described herein. In addition, any combination of two or more such features, systems, articles, materials, kits, and/or methods, if such features, systems, articles, materials, kits, and/or methods are not mutually inconsistent, is included within the scope of the present disclosure.
Claims
I/We claim:
1. A system (100) for detecting eye fatigue and strain, comprising: an image capture module (102) within a mobile application, configured to periodically capture live images of a user; an eye recognition module (104) programmed to detect the eye area and movement from the captured images; an analysis module (106) for determining the state of eye fatigue or strain based on recognized eye movements; and an alert module (108) designed to generate an auditory alarm via the mobile application to alert the user upon detection of eye fatigue or strain.
2. The system of claim 1, wherein the image capture module (102) is activated during a reading activity by the user to monitor the user's eye condition.
3. The system of claim 1, wherein the image capture module (102) is configured to capture images at predetermined intervals.
4. The system of claim 1, wherein the eye recognition module (104) utilizes an algorithm to detect the position and openness of the eyes.
5. The system of claim 1, wherein the analysis module (106) includes machine learning algorithms for assessing variations in eye movement patterns indicative of fatigue or strain.
6. The system of claim 1, wherein the alert module (108) is configured to produce an alarm sound that is adjustable by the user in terms of volume and tone.
7. The system of claim 1, wherein the mobile application is configured to operate in a background mode while the user is engaged in other activities on the mobile device.
8. The system of claim 1, wherein the system includes a user interface for displaying real-time analysis of the user's eye condition.
9. The system of claim 1, wherein the alert module (108) is further configured to trigger additional user-interactive prompts if the auditory alarm is not acknowledged.
10. A method (200) for detecting eye fatigue and strain, comprising: activating an image capture module (102) within a mobile application during a user's reading activity; periodically capturing live images of the user; utilizing an eye recognition algorithm to identify eye area and movement from the captured images; analyzing eye movement to determine the presence of eye fatigue or strain; and generating an auditory alarm through the mobile application to alert the user upon determination of eye fatigue or strain.
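Editorial note (not part of the claimed subject matter): one conventional way the recited eye-openness and fatigue-analysis steps might be realized is with the eye aspect ratio (EAR), a standard openness measure computed from six eye landmarks, combined with a PERCLOS-style closed-eye fraction over a sampling window. The threshold values and function names below are illustrative assumptions, not taken from the specification.

```python
# Illustrative sketch of the analysis step: eye aspect ratio (EAR) as an
# openness measure, and a PERCLOS-style fatigue criterion over a window of
# samples. Thresholds (0.2, 0.4) are hypothetical tuning values.
from math import dist

def eye_aspect_ratio(eye):
    """eye: six (x, y) landmarks p1..p6 around one eye, ordered so that
    p1-p4 span the eye horizontally and p2-p6, p3-p5 span it vertically."""
    # EAR = (|p2-p6| + |p3-p5|) / (2 * |p1-p4|); small values mean a closed eye
    return (dist(eye[1], eye[5]) + dist(eye[2], eye[4])) / (2.0 * dist(eye[0], eye[3]))

def fatigue_detected(ear_series, closed_thresh=0.2, closed_frac=0.4):
    """Flag fatigue when the eye is below the openness threshold for a large
    fraction of the sampled frames (a rough PERCLOS-style criterion)."""
    closed = sum(1 for ear in ear_series if ear < closed_thresh)
    return closed / len(ear_series) >= closed_frac

# Example: a wide-open eye yields a high EAR; mostly-closed samples trip the flag.
open_eye = [(0, 0), (1, 1), (2, 1), (3, 0), (2, -1), (1, -1)]
print(eye_aspect_ratio(open_eye))            # ~0.67, well above the threshold
print(fatigue_detected([0.6] * 10))          # False: eyes open throughout
print(fatigue_detected([0.05] * 6 + [0.6] * 4))  # True: closed 60% of the window
```

In a real mobile implementation the landmarks would come from an on-device face-landmark detector and the alarm would be raised by the alert module when `fatigue_detected` returns true.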
SYSTEM FOR DETECTING EYE FATIGUE AND STRAIN
Disclosed is a system for detecting eye fatigue and strain, comprising an image capture module within a mobile application, configured to periodically capture live images of a user; an eye recognition module programmed to detect the eye area and movement from the captured images; an analysis module for determining the state of eye fatigue or strain based on recognized eye movements; and an alert module designed to generate an auditory alarm via the mobile application to alert the user upon detection of eye fatigue or strain.
Fig. 1
Drawings
FIG. 1
FIG. 2
FIG. 3
| # | Name | Date |
|---|---|---|
| 1 | 202421033149-OTHERS [26-04-2024(online)].pdf | 2024-04-26 |
| 2 | 202421033149-FORM FOR SMALL ENTITY(FORM-28) [26-04-2024(online)].pdf | 2024-04-26 |
| 3 | 202421033149-FORM 1 [26-04-2024(online)].pdf | 2024-04-26 |
| 4 | 202421033149-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [26-04-2024(online)].pdf | 2024-04-26 |
| 5 | 202421033149-EDUCATIONAL INSTITUTION(S) [26-04-2024(online)].pdf | 2024-04-26 |
| 6 | 202421033149-DRAWINGS [26-04-2024(online)].pdf | 2024-04-26 |
| 7 | 202421033149-DECLARATION OF INVENTORSHIP (FORM 5) [26-04-2024(online)].pdf | 2024-04-26 |
| 8 | 202421033149-COMPLETE SPECIFICATION [26-04-2024(online)].pdf | 2024-04-26 |
| 9 | 202421033149-FORM-9 [07-05-2024(online)].pdf | 2024-05-07 |
| 10 | 202421033149-FORM 18 [08-05-2024(online)].pdf | 2024-05-08 |
| 11 | 202421033149-FORM-26 [15-05-2024(online)].pdf | 2024-05-15 |
| 12 | 202421033149-FORM 3 [13-06-2024(online)].pdf | 2024-06-13 |
| 13 | 202421033149-RELEVANT DOCUMENTS [17-04-2025(online)].pdf | 2025-04-17 |
| 14 | 202421033149-POA [17-04-2025(online)].pdf | 2025-04-17 |
| 15 | 202421033149-FORM 13 [17-04-2025(online)].pdf | 2025-04-17 |
| 16 | 202421033149-FER.pdf | 2025-11-21 |
| 1 | 202421033149_SearchStrategyNew_E_202421033149E_19-11-2025.pdf | 2025-11-19 |