Abstract: A user-centric focus management system comprises a body 101 installed with a biometric scanner 102 that scans and matches biometric signs; an arm 103, embodied with a projection surface 105 as an end effector and connected via a ball and socket joint 104; an imaging unit 106 for gaze detection; a projector 107, attached via a rotatable joint 108, to project a preferred clock type over the projection surface 105; a printing unit 109 for preparing a to-do list including time constraints; a receptacle 110 for collection of the printed to-do list, installed with a pneumatically extendable rod 111 that extends the receptacle 110 to provide the to-do list; one or more suction cups 112 fabricated to stably hold the body 101; a speaker unit 115 to notify the user to take breaks and accordingly update the to-do list; and a plate 113 for placing one or more electronic gadgets, including one or more clamps 114 to clasp the gadgets.
Description: FIELD OF THE INVENTION
[0001] The present invention relates to a user-centric focus management system that enables users to manage their study or work sessions effectively through real-time monitoring, adaptive scheduling, and distraction control techniques by assessing previous performance and upcoming tasks to ensure structured planning and timely execution of work.
BACKGROUND OF THE INVENTION
[0002] Focus management plays a critical role in enhancing productivity and improving learning outcomes, especially for students and professionals who need to manage their time effectively. Focus management poses several challenges, especially in environments filled with distractions. One of the primary issues is the constant influx of notifications from electronic systems, which interrupts concentration and reduces productivity. Users often struggle with time management, finding it difficult to prioritize tasks effectively or allocate appropriate time to each activity. Mental fatigue and stress further hinder focus, leading to decreased motivation and inconsistent performance. Emotional states, such as anxiety or frustration, also impact the ability to maintain attention on tasks. Additionally, the lack of personalized tools that adapt to individual needs makes it harder for users to stay engaged. Traditional methods rely heavily on self-discipline, which may not be sufficient for everyone, especially in high-pressure situations. Without real-time feedback or adaptive support, users find it challenging to stay on track, leading to procrastination, missed deadlines, and reduced overall efficiency.
[0003] Traditional methods of focus management primarily rely on manual tools and self-discipline to help individuals stay organized and productive. Common approaches include using physical planners, notebooks, and calendars to schedule tasks and set deadlines. Many people also use simple digital tools like basic to-do list apps, calendar reminders, and alarms to manage their time and workload. Techniques such as the Pomodoro method, where work is broken into timed intervals with short breaks, are also widely practiced. In academic or professional settings, mentors or supervisors provide guidance on time management strategies. Additionally, some individuals use timers or focus-enhancing applications that block distracting websites or limit screen time. However, these traditional methods often require consistent self-monitoring and lack the ability to adapt to an individual's changing needs or emotional state, which limits their effectiveness for users who struggle with maintaining focus and managing distractions.
[0004] CN112929717A discloses a focus management method and display equipment. The focus management method provided by the invention comprises the steps that a display system obtains a key value sent by a remote controller, and the key value is used for controlling the moving direction of a focus; analyzing an object pointed by the key value into a part object, specifically, analyzing the part object into a reference number of the part object; the display system sends the reference number to a current page; the display system determines an updated second position of the focus according to the current focus value of the first position of the focus of the current page and the reference number of the part object; the invention further provides display equipment. The display equipment comprises a display, a focus recording module, a focus calculation module and a focus setting module. The problem of focus loss caused by slow focus searching and positioning timeliness, focus repetition and focus non-mounting is solved to a certain extent.
[0005] Conventionally, many systems have been developed to address focus management; however, the systems mentioned in the prior art have limitations pertaining to the ability to adapt to individual user behaviors and needs, and to offering real-time feedback or personalized guidance based on a user's performance or emotional state. Moreover, the existing systems fail to integrate external data sources or features, such as biometric monitoring or task adjustments. As a result, users face challenges in maintaining consistent focus, managing distractions, and optimizing their workflow, ultimately reducing productivity and learning efficiency.
[0006] In order to overcome the aforementioned drawbacks, there exists a need in the art to develop a system that is capable of providing an optimized work or study environment, ensuring a customized focus management approach, thereby providing personalized experience to a user. Additionally, the system helps the user to manage time effectively by generating structured task lists, and minimizing interruptions by restricting access to gadgets.
OBJECTS OF THE INVENTION
[0007] The principal object of the present invention is to overcome the disadvantages of the prior art.
[0008] An object of the present invention is to develop a system that is capable of providing an optimized work or study environment based on individual preferences and past performance, ensuring a customized focus management approach, thereby providing personalized experience to a user.
[0009] Another object of the present invention is to develop a system that is capable of helping users manage time effectively by generating structured task lists based on upcoming priorities and past performance.
[0010] Another object of the present invention is to develop a system that is capable of presenting important information within the user’s view, reducing distractions and enhancing focus.
[0011] Another object of the present invention is to develop a system that is capable of minimizing interruptions by restricting access to gadgets and monitoring notifications to filter out non-essential interactions.
[0012] Yet another object of the present invention is to develop a system that is capable of detecting signs of fatigue or stress and suggests breaks, ensuring a balanced workflow.
[0013] The foregoing and other objects, features, and advantages of the present invention will become readily apparent upon further review of the following detailed description of the preferred embodiment as illustrated in the accompanying drawings.
SUMMARY OF THE INVENTION
[0014] The present invention relates to a user-centric focus management system that assists users in maintaining focus and productivity by analysing upcoming tasks, correlating them with past performance, and generating an optimized schedule to ensure efficient time management. Additionally, the proposed system continuously monitors the user's behaviour and engagement, providing timely prompts for task transitions, break reminders, and adaptive notifications to maintain an optimal balance between productivity and well-being.
[0015] According to an embodiment of the present invention, a user-centric focus management system comprises a user interface, configured with a set of questionnaires, to be furnished by a user for creating a user-profile; a cuboidal body installed with a biometric scanner that scans and matches biometric signs of the user to identify the user's profile; a robotic arm, attached over the cuboidal body and embodied with a projection surface as an end effector, the robotic arm connected to the body via a motorized ball and socket joint; an imaging unit for gaze detection; a motor driver connected with the robotic arm and the motorized ball and socket joint to position the projection surface in front of the user; a projector attached via a rotatable joint to project a preferred clock type over the projection surface; a central server stored with previous performance data corresponding to each user-profile; a scanner to scan one or more pages provided by the user, for extracting information related to an upcoming examination of the user; a printing unit; and a microcontroller configured with a machine learning protocol for correlating the previous performance with the upcoming examination of the user, preparing a to-do list including time constraints, and communicating the list to the printing unit.
[0016] According to another embodiment of the present invention, the proposed system further includes a receptacle installed adjacent to the printing unit for collection of the printed to-do list, the receptacle being installed with a pneumatically extendable rod that extends the receptacle out of the cuboidal body to provide the to-do list to the user; one or more suction cups fabricated over a bottom surface of the body to stably hold the body against a mounting surface; an IoT (internet of things) module interconnected with the central server to fetch real-time information for each user profile; an integrated speaker unit to notify the user to take breaks and accordingly update the to-do list in case the facial expressions correspond to negative emotions; a plate for placing one or more electronic gadgets of the user, the plate including one or more clamps to clasp the gadgets, preventing the user from accessing the gadgets according to the time constraints; and a Wi-Fi module configured to connect to the gadgets to process incoming messages and calls on the gadgets.
[0017] While the invention has been described and shown with particular reference to the preferred embodiment, it will be apparent that variations might be possible that would fall within the scope of the present invention.
BRIEF DESCRIPTION OF THE DRAWINGS
[0018] These and other features, aspects, and advantages of the present invention will become better understood with regard to the following description, appended claims, and accompanying drawings where:
Figure 1 illustrates an isometric view of a cuboidal body associated with a user-centric focus management system.
DETAILED DESCRIPTION OF THE INVENTION
[0019] The following description includes the preferred best mode of one embodiment of the present invention. It will be clear from this description of the invention that the invention is not limited to these illustrated embodiments but that the invention also includes a variety of modifications and embodiments thereto. Therefore, the present description should be seen as illustrative and not limiting. While the invention is susceptible to various modifications and alternative constructions, it should be understood, that there is no intention to limit the invention to the specific form disclosed, but, on the contrary, the invention is to cover all modifications, alternative constructions, and equivalents falling within the spirit and scope of the invention as defined in the claims.
[0020] In any embodiment described herein, the open-ended terms "comprising," "comprises," and the like (which are synonymous with "including," "having," and "characterized by") may be replaced by the respective partially closed phrases "consisting essentially of," "consists essentially of," and the like, or the respective closed phrases "consisting of," "consists of," and the like.
[0021] As used herein, the singular forms “a,” “an,” and “the” designate both the singular and the plural, unless expressly stated to designate the singular only.
[0022] The present invention relates to a user-centric focus management system that is accessed by a user to enhance productivity and concentration by dynamically adapting to the user's study requirements. Additionally, the system monitors the user's real-time focus levels, analyzes past performance, and generates a structured to-do list with time constraints to optimize task management.
[0023] Referring to Figure 1, an isometric view of a cuboidal body associated with a user-centric focus management system is illustrated, comprising a cuboidal body 101, installed with a biometric scanner 102, a robotic arm 103, attached over the cuboidal body 101 via a motorized ball and socket joint 104, embodied with a projection surface 105 as an end effector, an imaging unit 106 installed with the body 101, a projector 107 via a rotatable joint 108 installed with robotic arm 103, a printing unit 109 installed with the body 101, a receptacle 110 installed adjacent to the printing unit 109 via a pneumatically extendable rod 111, one or more suction cups 112 are fabricated over a bottom surface of the body 101, a plate 113 installed with the body 101 and having multiple clamps 114 and a speaker unit 115 installed with the body 101.
[0024] The system disclosed herein comprises a cuboidal body 101, which serves as a main structure of the system and is developed to provide assistance to a user to provide an optimized work or study environment. The body 101 is developed to be placed over a surface. To ensure the stability of the body 101 over the surface, the body 101 is equipped with multiple suction cups 112. The suction cups 112 are used to create a vacuum seal between the surface and the body 101. When the suction cups 112 are pressed against the surface, the initial contact creates a seal between the cups 112 and the surface, this seals off the area within the suction cups 112. The suction cups 112 are designed to maintain a relatively airtight seal for securing the body 101 over the surface.
[0025] To initiate operation of the system, the user needs to activate the system first by pressing a push button installed on the body 101. The push button typically consists of a button cap, which is the visible rounded part of the button that the user presses. When the user pushes the push button, it pushes down a plunger, which is a small rod or cylinder. Inside the push button, there are electrical contacts made of conductive materials such as metal. When the user presses the push button, it completes the electrical circuit, allowing current to flow and triggering operation of an inbuilt microcontroller associated with the system. After activating the system, the user needs to create a profile over a user interface installed in a computing unit of the user (e.g., laptop, tablet, or smartphone).
Before creating the user's profile, the user interface displays a series of questionnaires that are to be furnished by the user to create the profile. After creation of the profile, the user utilizes a biometric scanner 102 installed with the body 101 to get verified and access the system.
[0026] In a preferred embodiment of the present invention, the biometric scanner 102 mentioned herein is typically a fingerprint sensor that authenticates the user before use. The fingerprint sensor is embedded in the body 101 and activates automatically when the user places a finger on it. The sensor uses optical elements that use light to create a 2D image of the ridges and valleys of the fingerprint. The captured fingerprint is processed and compared with a stored template within the microcontroller. If authentication fails, the microcontroller halts the operation of the system. However, if the fingerprint matches an authorized user, the microcontroller re-configures the system accordingly.
[0027] In an alternate embodiment of the present invention, the biometric scanner 102 may include one or more image sensors capable of authenticating the user using facial recognition. The sensor, which is integrated within the body 101, is positioned to capture the user's face when they approach. It continuously scans for facial features and maps key points such as eye position, nose, mouth, and overall face structure. The captured facial image is converted into a unique numerical representation based on facial landmarks. The microcontroller compares this representation with profiles stored in the microcontroller.
[0028] After successfully authenticating the user, the microcontroller actuates a robotic arm 103 mounted on the cuboidal body 101 to position a projection surface 105, arranged with the robotic arm 103 as an end effector, in front of the user. The robotic arm 103 is a type of mechanical arm with a function similar to that of a human arm. The segments of such a manipulator are connected by joints allowing either rotational motion or translational displacement. The robotic arm 103 contains several segments that are attached together by motorized joints, also referred to as axes. Each joint contains a stepper motor that rotates and allows the robotic arm 103 to complete a specific motion in translating the projection surface 105 in front of the user.
[0029] To enhance the functionality of the robotic arm 103, a motorized ball and socket joint 104 is installed between the body 101 and the arm 103. The motorized ball and socket joint 104 consists of a ball-shaped element that fits into a socket, which provides rotational freedom in various directions. The ball is connected to a motor driver, which provides the controlled movement. The arm is attached to the socket of the motorized ball and socket joint 104. The motor driver responds by adjusting the ball and socket joint 104 and rotating the ball in the desired direction, and this motion is transferred to the socket that holds the arm to aid in translating the projection surface 105 in front of the user.
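The segment-by-segment positioning described above can be illustrated with a minimal planar forward-kinematics sketch. This is an assumption for illustration only: the disclosure does not specify segment lengths, joint count, or a kinematic model, so the two-link formulation and all values below are hypothetical.

```python
import math

def end_effector_position(l1, l2, theta1, theta2):
    """Planar forward kinematics for a hypothetical two-segment arm.

    l1, l2 -- segment lengths; theta1, theta2 -- joint angles in radians.
    Returns the (x, y) position of the end effector (here, the
    projection surface), measured from the arm's base joint.
    """
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

# Both joints at zero: the arm lies fully extended along the x-axis.
print(end_effector_position(1.0, 1.0, 0.0, 0.0))
```

In a real arm, the microcontroller would solve the inverse problem (choosing angles that place the surface at the user's position), but the forward map above is the building block.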
[0030] Concurrently, the microcontroller actuates a projector 107, which is operatively linked with the microcontroller and installed with the body 101 to project a preferred clock type over the projection surface 105.
[0031] In a preferred embodiment of the present invention, the projector 107 is typically a holographic projector, which emits various combinations of light towards the lens, which is further portrayed over the projection surface 105 to project a preferred clock type. The clock type includes, but is not limited to, analogue, digital, hybrid, and world clock.
[0032] To enhance the functionality of the projector 107, a rotatable joint 108 is integrated with the projector 107 to ensure the clock projection remains optimally aligned on the projection surface 105. The rotatable joint 108 typically consists of a motorized hinge, allowing the projector 107 to rotate around a fixed axis. The motorized hinge joint typically involves the use of the motor driver to control the movement of the hinge and the connected component. The hinge joint provides the pivot point around which the movement occurs. The motor is the core component responsible for generating the rotational motion; it converts electrical energy into mechanical energy, producing the necessary torque that drives the hinge joint. As the motor rotates the motorized hinge joint, the projector 107 rotates around the fixed axis.
[0033] The microcontroller controls the movement of the rotatable joint 108 to ensure that the clock projection remains over the projection surface 105, which is ensured by an imaging unit 106, operatively coupled with said microcontroller for gaze detection. The imaging unit is constructed with a camera lens and a processor, wherein the camera lens is adapted to capture a series of images of the user. The processor carries out a sequence of image processing operations including pre-processing, feature extraction, and classification by utilizing machine learning and artificial intelligence protocols. The images captured by the imaging unit are real-time images of the user. The artificial-intelligence-based imaging unit transmits the captured image signal in the form of digital bits to the microcontroller. The microcontroller, upon receiving the image signals, constantly performs the gaze detection operation. Gaze detection determines where the user is looking by analyzing their eye movements, essentially identifying the direction of their gaze.
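The classification step of gaze detection can be sketched very simply once the feature-extraction stage has produced eye landmarks. The sketch below is a hedged illustration, not the disclosed implementation: the landmark names, the 0.40/0.60 thresholds, and the three-way classification are all hypothetical assumptions.

```python
def classify_gaze(pupil, eye_left_corner, eye_right_corner):
    """Classify horizontal gaze direction from 2D eye landmarks.

    pupil, eye_left_corner, eye_right_corner -- (x, y) pixel tuples,
    assumed to come from an upstream feature-extraction stage.
    Returns 'left', 'centre', or 'right' (thresholds are illustrative).
    """
    span = eye_right_corner[0] - eye_left_corner[0]
    if span <= 0:
        raise ValueError("eye corners must be ordered left-to-right")
    # Normalised pupil position across the eye: 0.0 (left) .. 1.0 (right)
    ratio = (pupil[0] - eye_left_corner[0]) / span
    if ratio < 0.40:
        return "left"
    if ratio > 0.60:
        return "right"
    return "centre"

print(classify_gaze((50, 20), (30, 20), (70, 20)))  # pupil midway -> centre
```

A production pipeline would obtain the landmarks from a face/eye detector and would typically also smooth the ratio over several frames before acting on it.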
[0034] The microcontroller, linked with a scanner, plays a crucial role in extracting and processing information from one or more pages provided by the user to gather details about upcoming examinations. Internally, this process involves multiple steps, including scanning, data extraction, and intelligent processing using machine learning protocols. When a user inserts a page into the scanner, an optical scanning mechanism captures a high-resolution digital image of the document. In a preferred embodiment of the present invention, the scanner is typically equipped with an optical character recognition module, which converts printed or handwritten text into machine-readable data. The scanned document is then transmitted to the microcontroller, which acts as the central processing unit, responsible for analyzing and interpreting the extracted information. In an embodiment, the page includes details regarding the tasks to be performed by the user. In another embodiment, the page includes an examination datesheet.
[0035] Once the microcontroller receives the scanned text, it applies OCR protocols to identify relevant information, such as exam names, dates, subjects, and specific topics. To ensure accuracy, the microcontroller cross-references the extracted data with predefined templates stored in a central server. An IoT (internet of things) module is interconnected with the central server and serves as a bridge for fetching real-time information specific to each user profile. Internally, the IoT module operates through a combination of cloud connectivity, data synchronization, and processing, ensuring that the user receives the most relevant and updated information seamlessly.
[0036] If the document contains structured information, such as a printed schedule, the microcontroller uses pattern recognition techniques to extract relevant details efficiently.
[0037] For unstructured or handwritten documents, the microcontroller employs machine learning-based text recognition protocols that improve accuracy by identifying handwriting patterns and contextualizing the extracted data.
[0038] In an alternate embodiment of the present invention, if necessary, the microcontroller is also capable of using natural language processing (NLP) to interpret free-form text and detect key information related to the examination schedule.
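For structured documents such as a printed datesheet, the pattern-recognition step described above reduces to matching subject/date pairs in the OCR output. The following sketch assumes a hypothetical "Subject - DD/MM/YYYY" datesheet format; the disclosure does not fix any particular layout, so both the regex and the sample text are illustrative.

```python
import re

def extract_exams(ocr_text):
    """Pull (subject, date) pairs out of OCR'd datesheet text.

    Assumes a hypothetical line format like 'Subject - 12/05/2025'
    or 'Subject : 12/05/2025'; real deployments would need templates
    per datesheet layout, as the description suggests.
    """
    pattern = re.compile(
        r"(?P<subject>[A-Za-z ]+?)\s*[-:]\s*(?P<date>\d{2}/\d{2}/\d{4})"
    )
    return [(m.group("subject").strip(), m.group("date"))
            for m in pattern.finditer(ocr_text)]

sample = "Mathematics - 12/05/2025\nPhysics : 15/05/2025"
print(extract_exams(sample))
```

Handwritten or free-form pages would instead go through the ML/NLP path the preceding paragraphs describe; the regex pass is only the structured-document fast path.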
[0039] Once the microcontroller successfully extracts the details regarding the tasks, in one embodiment exam details, it processes this data to correlate it with the user’s existing study schedule and performance history. If the user has a linked profile stored in the central server, the microcontroller integrates the newly scanned information with the previous performance records, which allows the microcontroller to suggest a personalized study plan, prioritizing topics based on the user's strengths and weaknesses.
[0040] After processing, the microcontroller generates a to-do list with structured tasks and time constraints, ensuring that the user efficiently prepares for the forthcoming examination. The finalized schedule is then sent to a printing unit 109 linked with the microcontroller, where a hard copy is generated for user reference. Additionally, the microcontroller updates the projected task reminders, adjusting focus management elements accordingly.
[0041] In case the user wants any changes in the to-do list, they are allowed to make changes as per their preference by accessing the user interface with ease.
[0042] Once the microcontroller processes the extracted examination details and correlates them with the user's past performance data, it compiles a personalized to-do list with specific study tasks, deadlines, and recommended time allocations. This structured task list is formatted into a printable document using text formatting protocols stored within the system. The formatting process ensures clarity, using appropriate fonts, spacing, and alignment to make the printed document easy to read.
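One simple way to realise the correlation described above is to weight each subject's time allocation by the user's weakness in it (lower past score, more minutes). This is a minimal sketch under that assumption; the disclosure's machine learning protocol is not specified, and the subjects, scores, and budget below are hypothetical examples.

```python
def build_todo(past_scores, total_minutes):
    """Allocate a study-time budget across subjects, weakest first.

    past_scores -- {subject: score out of 100} from the performance history.
    total_minutes -- total study budget to distribute.
    Returns [(subject, minutes)] sorted with the weakest subject first,
    with minutes proportional to (100 - score).
    """
    weights = {s: 100 - score for s, score in past_scores.items()}
    total_weight = sum(weights.values())
    return [(s, round(total_minutes * w / total_weight))
            for s, w in sorted(weights.items(),
                               key=lambda kv: kv[1], reverse=True)]

print(build_todo({"Maths": 60, "Physics": 80, "Chemistry": 40}, 120))
```

The real system would additionally fold in deadlines from the scanned datesheet and the break adjustments described later; the proportional weighting is only the prioritisation core.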
[0043] Before sending the document to the printing unit 109, the microcontroller converts the digital task list into a printer-compatible format, such as PCL (Printer Control Language). This formatted data is transmitted to the printer driver, which translates it into a language the printer understands, ensuring that the text and any graphical elements (such as study progress bars or time indicators) are correctly rendered.
[0044] As the document prints, the microcontroller ensures error detection and quality control. The microcontroller tracks the paper's position and ink distribution. If any low-ink issue is detected, the microcontroller pauses printing. Once printing is complete, the to-do list is directed to the receptacle 110 with a pneumatically extendable rod 111, allowing easy access for the user. The rod 111 mentioned herein is powered by a pneumatic unit that utilizes compressed air to extend and retract the rod. The process begins with an air compressor, which compresses atmospheric air to a higher pressure.
[0045] The air cylinder of the pneumatic unit contains a piston that moves back and forth within the cylinder. The cylinder is connected to one end of the rod. The piston is attached to the rod 111, and its movement is controlled by the flow of compressed air. To extend the rod 111, an air valve is activated to allow compressed air to flow into the chamber behind the piston. As the pressure increases in the chamber, the piston pushes the rod 111 to the desired length, ensuring that the printed document is conveniently presented without requiring the user to manually retrieve it from inside the cuboidal body 101.
[0046] When the to-do list is generated, it includes a set of predefined tasks, each assigned a specific time constraint. The microcontroller continuously monitors these time constraints using its internal clock and compares the current time with the allotted time for each task. As the deadline for a task approaches or expires, the microcontroller automatically adjusts the projection parameters to visually notify the user. One of the primary parameters controlled by the microcontroller is color modulation. The projector 107, linked to the microcontroller, adjusts the color of the projected display to reflect the urgency of tasks.
[0047] Another parameter is blinking effects, which the microcontroller triggers as an additional visual cue. When a task nears its time limit, the projector's display starts blinking the clock projection at an increasing frequency, ensuring that the user's attention is drawn toward the notification. The microcontroller achieves this effect by modulating the light intensity of the projector 107 at predefined intervals.
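The colour-modulation and blinking behaviour described in the last two paragraphs amounts to a mapping from remaining task time to projection parameters. The sketch below is illustrative only: the specific thresholds, colour names, and blink frequencies are assumptions, since the disclosure leaves them unspecified.

```python
def projection_params(minutes_remaining):
    """Map time left on the current task to (colour, blink_hz).

    A blink frequency of 0.0 means a steady (non-blinking) display.
    Thresholds and values here are hypothetical examples of the
    urgency cues the microcontroller drives on the projector.
    """
    if minutes_remaining <= 0:
        return ("red", 4.0)      # deadline passed: fastest blink
    if minutes_remaining <= 5:
        return ("red", 2.0)      # imminent deadline
    if minutes_remaining <= 15:
        return ("amber", 0.5)    # deadline approaching
    return ("green", 0.0)        # comfortable margin, steady display
```

The microcontroller would call such a function on each tick of its internal clock and push the resulting colour and blink rate to the projector driver.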
[0048] In case the microcontroller, via the imaging unit 106, detects any negative emotions on the user's face, the microcontroller actuates a speaker unit 115 to inform the user to take a break, and according to the breaks taken by the user, the microcontroller automatically adjusts the to-do list. The speaker unit 115 is capable of producing clear and natural sound and of adjusting its volume based on ambient noise levels. The speaker unit 115 plays audio information, which is in the form of recorded voice, synthesized voice, or other sounds, generated or stored as digital data.
[0049] This data is often in the form of an audio file. The digital audio data is sent to a digital-to-analog converter (DAC). The DAC converts the digital data into analog electrical signals. The analog signal is often weak and needs to be amplified; an amplifier boosts the signal to a level that drives the speaker effectively. The amplified audio signal is then sent to the speaker. The core of the speaker is an electromagnet attached to a flexible cone. The resulting sound waves travel through the air as pressure waves and are picked up by the user's ear.
[0050] For example, if the user is tired, as detected by the imaging unit 106, the microcontroller directs the speaker to alert the user to take a break, and in case the user takes a break of 30 minutes, the microcontroller adjusts this 30-minute break into the list and extends the time period of the to-do list.
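The 30-minute adjustment in the example above is essentially a shift of every not-yet-started task by the break's duration. Here is a minimal sketch under that assumption; the task names, times, and the (name, start) task representation are hypothetical.

```python
from datetime import datetime, timedelta

def shift_schedule(tasks, break_start, break_minutes):
    """Fold a detected break back into the to-do list.

    tasks -- list of (name, start_datetime) pairs.
    Every task scheduled at or after the break's start is pushed
    back by the break's duration; earlier tasks are untouched.
    """
    delta = timedelta(minutes=break_minutes)
    return [(name, start + delta if start >= break_start else start)
            for name, start in tasks]

tasks = [("Revise algebra", datetime(2025, 5, 1, 9, 0)),
         ("Physics problems", datetime(2025, 5, 1, 10, 0))]
# A 30-minute break at 09:30 leaves the 09:00 task alone and
# moves the 10:00 task to 10:30.
print(shift_schedule(tasks, datetime(2025, 5, 1, 9, 30), 30))
```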
[0051] The body 101 includes a plate 113, which is dedicated to accommodating one or more electronic gadgets of the user with the help of multiple clamps 114 installed on the plate 113, to prevent the user from accessing the gadgets according to the time constraints. Each clamp includes a pair of flaps pivoted with each other, allowing the axial motion of the flaps required for clasping the gadgets. A DC motor is paired with the pivot joint and is activated by the microcontroller to provide a rotational motion to the joint, automating the movement of the flaps for gripping the gadgets and preventing the user from accessing them.
[0052] A Wi-Fi (wireless fidelity) module integrated into the cuboidal body 101 serves as a critical communication interface, enabling the microcontroller to interact with the user's connected gadgets. Internally, this module functions through a series of wireless data exchanges, message filtering, and emergency detection mechanisms, ensuring that users remain undisturbed during focused tasks while staying informed about urgent communications.
[0053] Once the Wi-Fi module establishes a connection with the user's gadgets, it continuously monitors incoming calls and messages. The microcontroller retrieves notifications from the connected gadgets through established protocols including but not limited to Bluetooth tethering, push notifications, and API integration (depending on the operating system of the connected gadgets). This allows the microcontroller to process real-time communication data without requiring the user to directly check their gadgets.
[0054] To differentiate between regular notifications and emergency situations, the microcontroller is equipped with natural language processing (NLP) protocols. For example, when a message or call is received, the microcontroller scans for predefined emergency phrases, such as "urgent," "accident," "emergency," or custom keywords. If an emergency phrase is detected, the microcontroller triggers an immediate alert through the integrated speaker unit 115, ensuring that the user is promptly informed. For example, a text from an emergency contact (such as a family member or workplace) may be prioritized over promotional or social notifications.
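The keyword-and-contact filtering described above can be sketched as a simple predicate over each incoming notification. This is an illustrative assumption, not the disclosed NLP protocol: the phrase list, the priority-contact set, and the decision rule below are all hypothetical.

```python
EMERGENCY_PHRASES = {"urgent", "accident", "emergency"}  # plus custom keywords
PRIORITY_CONTACTS = {"Mum", "Supervisor"}                # hypothetical examples

def should_alert(sender, message, custom_keywords=()):
    """Return True when an incoming message warrants an immediate
    spoken alert through the speaker unit, per the keyword/contact
    rules sketched above; everything else stays muted."""
    text = message.lower()
    keywords = EMERGENCY_PHRASES | {k.lower() for k in custom_keywords}
    if any(k in text for k in keywords):
        return True
    return sender in PRIORITY_CONTACTS

print(should_alert("Unknown", "URGENT: call me back"))  # True (keyword hit)
print(should_alert("Shop", "50% off today only"))       # False (muted)
```

A fuller NLP approach would handle paraphrases and context rather than literal substrings, but substring matching captures the predefined-phrase behaviour the description gives.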
[0055] The present invention works best in the following manner, where the process begins with the user interacting with the user interface, which presents the set of questionnaires to collect essential information for profile creation. This data is stored in the central server, forming the basis for personalized recommendations. Upon system activation, the biometric scanner 102, integrated into the cuboidal body 101, authenticates the user by scanning and matching biometric data. Once authenticated, the scanner sends the signal to the microcontroller, which retrieves the user's past performance records, study schedules, and preferences from the central server. This ensures that the system adapts to the individual’s learning habits and requirements. Following authentication, the system activates the robotic arm 103, which is attached to the cuboidal body 101 through the motorized ball and socket joint 104. This arm holds the projection surface 105 as its end effector and is dynamically positioned based on gaze detection data from the imaging unit 106. The microcontroller processes gaze patterns and aligns the projection surface 105 directly in front of the user, reducing distractions and enhancing engagement. The projector 107, mounted on the rotatable joint 108, projects the preferred clock type (analog, digital, hybrid, or world clock) onto the surface. The microcontroller continuously adjusts projection parameters, such as color or blinking effects, based on the urgency of tasks in the to-do list. To assist with exam preparation and focus management, the system includes the scanner that allows users to input physical pages containing exam schedules or study material. The scanner extracts relevant information and forwards it to the microcontroller, which applies machine learning algorithms to correlate past performance data with the upcoming examination. 
The finalized to-do list is then transmitted to the printing unit 109, which prints a hard copy for user reference. The printed list is placed in the receptacle 110, which is equipped with the pneumatically extendable rod 111 that extends outward from the cuboidal body 101, allowing the user to easily retrieve their study schedule. To ensure optimal concentration, the system employs the imaging unit 106, which continuously monitors facial expressions and gaze patterns. If the microcontroller detects stress, fatigue, or negative emotions, it triggers the integrated speaker unit 115, prompting the user to take a break. Additionally, the system modifies the to-do list dynamically, either adjusting break times or re-prioritizing tasks based on the user's emotional state. To minimize distractions, the plate 113 with clamps 114 holds electronic gadgets, such as mobile phones or tablets. This plate 113 restricts the user’s physical access to these gadgets, ensuring they stay focused. Additionally, the Wi-Fi module connects the user’s gadgets to the microcontroller, enabling it to monitor and process incoming calls and messages. Routine notifications are muted to maintain concentration, but when an emergency phrase (e.g., "urgent," "critical," "accident") is detected in a call or message, the microcontroller generates an audio notification through the speaker unit 115, ensuring that the user is alerted only when necessary.
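The dynamic schedule update described above can be sketched as follows. The emotion labels and the fixed break length are assumptions for illustration; the specification leaves the emotion classifier and break policy unspecified:

```python
# Minimal sketch (assumed emotion labels): when the imaging unit reports a
# negative emotion, a break task is prepended and the to-do list is updated.
NEGATIVE_EMOTIONS = {"stress", "fatigue", "frustration", "anxiety"}

def update_schedule(todo: list, emotion: str, break_minutes: int = 10) -> list:
    """Return the to-do list, with a break inserted first on negative emotion."""
    if emotion.lower() in NEGATIVE_EMOTIONS:
        # The speaker unit 115 would announce the break at this point.
        return [("break", break_minutes)] + todo
    return todo  # neutral/positive state: schedule unchanged
```

In the described system this function would run each time the imaging unit 106 reports a new emotional state, and the revised list could be re-sent to the printing unit 109.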
[0056] Although the invention has been described herein with limited reference to specific embodiments, this description is not meant to be construed in a limiting sense. Various modifications of the disclosed embodiments, as well as alternate embodiments of the invention, will become apparent to persons skilled in the art upon reference to the description of the invention.

Claims:
1) A user-centric focus management system, comprising
i) a user interface, configured with a set of questionnaires, to be furnished by a user for creating a user-profile;
ii) a cuboidal body 101, installed with a biometric scanner 102 that scans and matches biometric signs of said user, to identify the user’s profile and relay a signal to a microcontroller for re-configuring the system accordingly;
iii) a robotic arm 103, attached over said cuboidal body 101, embodied with a projection surface 105 as an end effector, said robotic arm 103 connected to said body 101 via a motorized ball and socket joint 104;
iv) an imaging unit 106, operatively coupled with said microcontroller for gaze detection, wherein said microcontroller triggers a motor driver connected with said robotic arm 103 and motorized ball and socket joint 104 to position said projection surface 105 in front of the user;
v) a projector 107, operatively linked with said microcontroller, via a rotatable joint 108, to project a preferred clock type over said projection surface 105;
vi) a central server, operatively coupled with said microcontroller, stored with previous performance data corresponding to each user-profile;
vii) a scanner, coupled with said microcontroller, to scan one or more pages provided by said user, for extracting information related to one or more tasks to be performed by said user;
viii) a printing unit 109, coupled with said microcontroller, wherein said microcontroller is configured with a machine learning protocol for correlating the previous performance with tasks to be performed by said user, for preparing a to-do list including time constraints and communicating the list to said printing unit 109; and
ix) a receptacle 110 installed adjacent to said printing unit 109 for collection of the printed to-do list, wherein said receptacle 110 is installed with a pneumatically extendable rod 111, that extends said receptacle 110, out of said cuboidal body 101, to provide the to-do list to the user.
2) The system as claimed in claim 1, wherein one or more suction cups 112 are fabricated over a bottom surface of said body 101, to stably hold said body 101 against a mounting surface.
3) The system as claimed in claim 1, wherein said microcontroller alters one or more projection parameters based on the time constraints of the to-do list, said projection parameters including but not limited to color and blinking.
4) The system as claimed in claim 3, wherein said to-do list includes a pre-defined number of tasks along with a time constraint for each task, and upon completion of the time constraint for a particular task, the microcontroller alters said parameters to notify said user of the same.
5) The system as claimed in claim 1, wherein said clock type includes but is not limited to analogue, digital, hybrid, and world clock.
6) The system as claimed in claim 1, wherein said user-interface allows the user to change the to-do list as per the user’s preference.
7) The system as claimed in claim 1, wherein an IoT (Internet of Things) module is interconnected with said central server, to fetch real-time information for each user profile.
8) The system as claimed in claim 1, wherein said imaging unit 106 monitors the user’s facial expressions and relays the information to said microcontroller, which in turn triggers an integrated speaker unit 115 to notify the user to take breaks and accordingly updates the to-do list, in case the facial expressions correspond to negative emotions.
9) The system as claimed in claim 1, wherein said cuboidal body 101, includes a plate 113 for placing one or more electronic gadgets of said user, said plate 113 including one or more clamps 114 to clasp the gadgets, preventing the user from accessing said gadgets according to the time constraints.
10) The system as claimed in claim 9, wherein said cuboidal body 101 includes a Wi-Fi module, configured to connect said gadgets with said microcontroller to process incoming messages and calls over said gadgets, and in case any emergency phrase is identified in said calls or messages, said microcontroller generates a notification through said speaker unit 115.
| # | Name | Date |
|---|---|---|
| 1 | 202521024286-STATEMENT OF UNDERTAKING (FORM 3) [18-03-2025(online)].pdf | 2025-03-18 |
| 2 | 202521024286-REQUEST FOR EXAMINATION (FORM-18) [18-03-2025(online)].pdf | 2025-03-18 |
| 3 | 202521024286-REQUEST FOR EARLY PUBLICATION(FORM-9) [18-03-2025(online)].pdf | 2025-03-18 |
| 4 | 202521024286-PROOF OF RIGHT [18-03-2025(online)].pdf | 2025-03-18 |
| 5 | 202521024286-POWER OF AUTHORITY [18-03-2025(online)].pdf | 2025-03-18 |
| 6 | 202521024286-FORM-9 [18-03-2025(online)].pdf | 2025-03-18 |
| 7 | 202521024286-FORM FOR SMALL ENTITY(FORM-28) [18-03-2025(online)].pdf | 2025-03-18 |
| 8 | 202521024286-FORM 18 [18-03-2025(online)].pdf | 2025-03-18 |
| 9 | 202521024286-FORM 1 [18-03-2025(online)].pdf | 2025-03-18 |
| 10 | 202521024286-FIGURE OF ABSTRACT [18-03-2025(online)].pdf | 2025-03-18 |
| 11 | 202521024286-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [18-03-2025(online)].pdf | 2025-03-18 |
| 12 | 202521024286-EVIDENCE FOR REGISTRATION UNDER SSI [18-03-2025(online)].pdf | 2025-03-18 |
| 13 | 202521024286-EDUCATIONAL INSTITUTION(S) [18-03-2025(online)].pdf | 2025-03-18 |
| 14 | 202521024286-DRAWINGS [18-03-2025(online)].pdf | 2025-03-18 |
| 15 | 202521024286-DECLARATION OF INVENTORSHIP (FORM 5) [18-03-2025(online)].pdf | 2025-03-18 |
| 16 | 202521024286-COMPLETE SPECIFICATION [18-03-2025(online)].pdf | 2025-03-18 |
| 17 | Abstract.jpg | 2025-03-25 |
| 18 | 202521024286-FORM-26 [03-06-2025(online)].pdf | 2025-06-03 |