Abstract: A high-jump training and assistance system comprising a platform 101 with adjustable columns 102 via vertical sliding rail 103, a crossbar 104 mounted along the transverse axis from columns 102, a handlebar 105 with a biometric scanner 106 for user authentication, a sensor suite 110 with pressure, force, and gyroscopic sensors for detecting grip strength, asymmetry, and hand alignment, a speaker 111 for real-time voice alerts, a holographic projection unit 112 for spatial jump guidance, an artificial intelligence-based imaging camera 113 with imaging processor for movement monitoring and posture analysis, an inflatable landing unit with motorized rollers 115 and inflatable sheets secured via motorized clippers 117 on a parallel slider track 118 for safety, an ultrasonic sensor for motion tracking and collision detection, a head gear 119 with EEG sensors for neuro-feedback on mental focus and fatigue, LEDs 120 on the crossbar 104 for visual fatigue and readiness alerts.
Description:
FIELD OF THE INVENTION
[0001] The present invention relates to a high-jump training and assistance system that is capable of improving user performance, ensuring training safety, and enhancing overall training efficiency. It is designed to offer a personalized and guided experience, helping users develop proper techniques, reduce the risk of injury, and achieve consistent progress in high-jump practice.
BACKGROUND OF THE INVENTION
[0002] High-jump is a physically demanding athletic discipline that requires a combination of strength, technique, coordination, and precise body control. Effective training in high-jump not only involves repeated physical attempts but also demands continuous feedback, posture correction, and safety assurance. Athletes often face challenges such as improper technique development, lack of personalized training metrics, and increased risk of injuries during practice sessions. As performance expectations rise in competitive sports, there is a growing need for systems that support athletes with structured, adaptive, and safer training environments.
[0003] Traditionally, high-jump training has relied on manual observation, static equipment, and generalized training routines, which lack the capacity to provide real-time feedback or personalized progression. Coaches typically evaluate jump techniques visually and provide subjective input, which does not always capture the intricacies of movement or early signs of fatigue. Moreover, safety precautions are limited to basic landing mats, offering minimal dynamic response to unexpected falls. This conventional approach often leads to slower skill development, increased physical strain, and inconsistent performance outcomes, thereby highlighting the need for an advanced, intelligent training system.
[0004] US20180207464A1 discloses a training system for high jump performance that includes a landing platform with adjustable legs supporting the landing platform, where the adjustable legs are adapted to vary the height of the landing platform. A trainer places the landing platform in a high jump training environment and engages an athlete in a series of jump attempts, where the height of the landing platform is adjusted to provide for increased performance by the athlete. Preferably, the adjustable legs include a support brace and each leg rests on a footing.
[0005] US5842954A discloses a jump training device having a substantially ladder-like element including two longitudinal bar members spaced from one another in a transverse direction, and a plurality of transverse members connected to the longitudinal members and movable in a longitudinal direction, each of the transverse members being yieldable so that when an athlete jumping over the transverse members touches a transverse member, the transverse member yields so as to prevent injury to the athlete.
[0006] Conventionally, many systems are disclosed for assisting athletic training, including those for jump improvement, performance monitoring, and injury prevention. While some include feedback features or safety elements, they generally operate on standard pre-set parameters and are not tailored to individual user profiles. Moreover, such conventional systems often fail to integrate biometric authentication, adaptive posture correction, and dynamic safety in a unified manner.
[0007] In order to overcome the aforementioned drawbacks, there exists a need in the art to develop a system that offers a personalized, intelligent, and safe high-jump training environment. Such a system should be capable of recognizing individual users, adapting to their specific performance levels, monitoring real-time physical and cognitive conditions, and providing corrective feedback. Additionally, it should include dynamic safety features to prevent injuries, ensure consistent training progression, and enable remote performance tracking.
OBJECTS OF THE INVENTION
[0008] The principal object of the present invention is to overcome the disadvantages of the prior art.
[0009] An object of the present invention is to develop a system that offers a personalized and adaptive training experience by utilizing user-specific data for performance optimization and progression tracking.
[0010] Another object of the present invention is to ensure user safety during training by incorporating real-time monitoring, posture detection, and cushioned landing surfaces in response to incorrect or unsafe movements.
[0011] Another object of the present invention is to enhance the overall efficiency, consistency, and safety of high-jump training by providing real-time insights, adaptive adjustments, and guided support based on individual performance and physical condition.
[0012] The foregoing and other objects, features, and advantages of the present invention will become readily apparent upon further review of the following detailed description of the preferred embodiment as illustrated in the accompanying drawings.
SUMMARY OF THE INVENTION
[0013] The present invention relates to a high-jump training and assistance system that is capable of enhancing athlete safety, improving jump precision, supporting personalized training, and offering real-time posture correction, biometric authentication, and adaptive feedback for both physical and cognitive conditions of the user.
[0014] According to an embodiment of the present invention, a high-jump training and assistance system is disclosed, comprising a platform structured to support user activity during jump training, a pair of columns mounted on the platform via vertical sliding rails for adjustable elevation of a crossbar positioned along the transverse axis, a handlebar installed on at least one of the columns, a motorized clamp mounted on an extendable pole with a ball and socket joint for stabilizing the handlebar during operation, the handlebar comprising a biometric scanner for fingerprint-based user authentication, a microcontroller linked to the biometric scanner for processing fingerprint ridges and patterns and for comparing the same with a pre-stored training profile in a database, wherein upon successful authentication, the microcontroller retrieves and loads the user-specific profile and triggers the sliding rails to adjust column height and position the crossbar at a personalized optimal height for the user, a sensor suite installed on the handlebar including a pressure sensor to measure grip strength, a force sensor to detect asymmetric pressure, and a gyroscopic sensor to monitor alignment, wherein the microcontroller evaluates sensor data to detect unsafe posture or grip and activates a speaker installed on the platform to generate real-time voice alerts for corrective guidance, a holographic projection unit mounted on the platform to display jump trajectory and optimal body alignment visuals, and an artificial intelligence-based imaging camera paired with a processing unit for capturing and analysing real-time body movement and posture in comparison to the projected visual guidance.
[0015] According to another embodiment of the present invention, the system further comprises an inflatable member mounted on the distal end of the platform, including at least two inflatable sheets rolled around a pair of motorized rollers, the sheets connected to inflating units via controllable valves for inflation, the ends of the sheets secured by motorized clippers mounted on a parallel slider track to deploy and maintain a cushioned safety zone upon fall detection, the artificial intelligence-based imaging camera detects incorrect movement leading to potential falls and prompts the microcontroller to trigger inflation and sheet deployment, an ultrasonic sensor positioned on the handlebar to detect motion parameters, jump velocity, and potential collision threats, a computing unit wirelessly connected via a communication module to the microcontroller to receive and display performance metrics and user data remotely, a headgear comprising EEG (Electroencephalography) sensors to capture real-time neural indicators such as focus, fatigue, and stress levels, enabling neuro-feedback for adaptive training, a set of LEDs (Light Emitting Diodes) installed on the crossbar to visually indicate fatigue via red blinking lights and optimal performance states via green indicators.
[0016] While the invention has been described and shown with particular reference to the preferred embodiment, it will be apparent that variations might be possible that would fall within the scope of the present invention.
BRIEF DESCRIPTION OF THE DRAWINGS
[0017] These and other features, aspects, and advantages of the present invention will become better understood with regard to the following description, appended claims, and accompanying drawings where:
Figure 1 illustrates an isometric view of a high-jump training and assistance system.
DETAILED DESCRIPTION OF THE INVENTION
[0018] The following description includes the preferred best mode of one embodiment of the present invention. It will be clear from this description of the invention that the invention is not limited to these illustrated embodiments but that the invention also includes a variety of modifications and embodiments thereto. Therefore, the present description should be seen as illustrative and not limiting. While the invention is susceptible to various modifications and alternative constructions, it should be understood that there is no intention to limit the invention to the specific form disclosed, but, on the contrary, the invention is to cover all modifications, alternative constructions, and equivalents falling within the spirit and scope of the invention as defined in the claims.
[0019] In any embodiment described herein, the open-ended terms "comprising," "comprises," and the like (which are synonymous with "including," "having," and "characterized by") may be replaced by the respective partially closed phrases "consisting essentially of," "consists essentially of," and the like, or the respective closed phrases "consisting of," "consists of," and the like.
[0020] As used herein, the singular forms “a,” “an,” and “the” designate both the singular and the plural, unless expressly stated to designate the singular only.
[0021] The present invention relates to a high-jump training and assistance system that is capable of providing a personalized and adaptive training experience. Additionally, the system facilitates customized adjustments based on individual needs, supports dynamic feedback to prevent injury, and enhances overall training efficiency by combining user-specific data with intelligent control strategies.
[0022] Referring to Figure 1, an isometric view of a high-jump training and assistance system is illustrated, comprising a platform 101 assembled with a pair of columns 102, each via a vertical sliding rail 103, a crossbar 104 attached along the transverse axis of the columns 102, a handlebar 105 installed on one of the columns 102, a biometric scanner 106 installed on the handlebar 105, a motorized clamp 107 mounted on an extendable pole 108 with a motorized ball and socket joint 109 to secure the handlebar 105, a sensor suite 110 installed on the handlebar 105, a speaker 111 mounted on the platform 101, a holographic projection unit 112 mounted on the platform 101, an artificial intelligence-based imaging camera 113 paired with a processor mounted on the platform 101, an inflatable member 114 comprising at least two inflatable sheets mounted on motorized rollers 115 positioned on the distal end of the platform 101, an inflating unit 116 connected at the end of each sheet of the inflatable member 114, a pair of motorized clippers 117 mounted on a parallel slider track 118 along the platform 101, a head gear 119 associated with the system, and a plurality of LEDs (Light Emitting Diodes) 120 integrated on the crossbar 104.
[0023] The system disclosed herein includes a platform 101 serving as the base for all structural and functional components. The platform 101 is assembled with a pair of columns 102, each via a vertical sliding rail 103, for height adjustment of a crossbar 104 attached along the transverse axis of the columns 102. The vertical sliding rail 103 functions through a motorized actuator comprising a stepper motor linked to a lead screw. This drives a movable carriage holding the crossbar 104, enabling smooth vertical adjustment. The system includes position sensors or encoders for accurate height tracking, all controlled by a microcontroller based on the user's authenticated training profile. Internal wiring channels route power and data signals, ensuring responsive, real-time adjustments for an optimal high-jump training setup.
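The lead-screw drive described above converts a requested crossbar height change into a number of stepper pulses. A minimal illustrative sketch follows; the 8 mm screw pitch and 200 steps-per-revolution values are assumptions for illustration only and are not specified in this disclosure.

```python
# Assumed drive parameters (illustrative only, not from this disclosure)
LEAD_PITCH_MM = 8.0   # carriage travel per full screw revolution
STEPS_PER_REV = 200   # full steps per revolution of the stepper motor

def steps_for_height_change(delta_mm):
    """Convert a requested crossbar height change (mm) into stepper pulses."""
    return round(delta_mm / LEAD_PITCH_MM * STEPS_PER_REV)

print(steps_for_height_change(120.0))  # 12 cm raise -> 3000 pulses
```

In practice the encoder mentioned above would close the loop, correcting any missed steps before the microcontroller reports the crossbar as positioned.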
[0024] A handlebar 105 is installed on at least one of the columns 102, gripped by a motorized clamp 107 mounted on an extendable pole 108 with a motorized ball and socket joint 109 to secure the handlebar 105 to the platform 101 for enhanced stability and dynamic adjustment. The motorized clamp 107 consists of an electric motor linked to a gear that drives the opening and closing motion of the clamp 107 arms. When the microcontroller sends an activation signal, the motor rotates to tighten or release the clamp 107 around the handlebar 105. This ensures firm attachment and stability during use. The clamp 107 is further mounted on an extendable pole 108 with a ball-and-socket joint 109 for flexible positioning and dynamic adjustment during training sessions.
[0025] The extendable pole 108 operates through a pneumatic unit that uses compressed air to control extension and retraction. Internally, the pole 108 consists of telescopic segments housed within a cylinder connected to a pneumatic actuator. When the microcontroller signals the pneumatic unit, valves open to direct pressurized air into the cylinder, pushing the inner segments outward and extending the pole 108. To retract, air is released or redirected, allowing the segments to collapse back. This functioning provides smooth operation and stability for the handlebar 105, enhancing user alignment and ensuring adaptable positioning during training activities.
[0026] Meanwhile, the motorized ball and socket joint 109 enables multi-directional movement and positioning of the extendable pole 108 via the clamp 107. Internally, the joint 109 consists of a spherical ball housed within a socket that allows rotation along multiple axes. A set of miniaturized electric motors, often stepper motors, are integrated around the socket. When activated by the microcontroller, these motors apply controlled torque to rotate or tilt the ball in specific directions. This enables precise angular adjustment of the pole 108 and attached handlebar 105, ensuring optimal alignment and stability based on user position or training needs, enhancing safety and performance accuracy.
[0027] The handlebar 105 is accessed by a user for initiating authentication via a biometric scanner 106 installed on the handlebar 105 by scanning a fingerprint on the biometric scanner 106. The handlebar 105 is installed to facilitate user interaction with the system, enable biometric authentication, and monitor grip-related parameters. The biometric scanner 106 in the handlebar 105 operates by capturing the unique ridge patterns of the user's fingerprint when they place their finger on the scanner 106 surface. Internally, the scanner 106 uses optical sensing technology to generate a high-resolution image of the fingerprint. This data is then processed to extract key features such as ridge endings, bifurcations, and pattern flow. The extracted features are compared against stored templates in a linked database. The microcontroller handles this comparison process to authenticate the user's identity. Upon successful verification, the system loads the corresponding user profile to customize training parameters.
[0028] The microcontroller is operatively connected to the biometric scanner 106 and configured to process the ridge patterns captured by the scanner 106 and authenticate the user's identity by comparing them with pre-stored data in a linked database. Upon successful authentication, the microcontroller retrieves the corresponding personal training profile and activates the sliding rail 103 to adjust the height of the columns 102, thereby elevating the crossbar 104 to an optimum training level. The microcontroller is programmed with machine learning protocols that enable adaptive monitoring and control of the system's operating parameters.
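The authentication-to-adjustment sequence described above can be sketched in simplified form. The profile fields, template representation, and response strings below are assumptions for illustration; a real matcher would score similarity between extracted ridge features and stored templates rather than perform an exact lookup.

```python
# Hypothetical profile database keyed by a fingerprint template identifier
# (a real system would store minutiae templates and score similarity).
PROFILE_DB = {
    "template-01": {"user": "athlete_a", "optimal_bar_cm": 165},
}

def match_fingerprint(scan_template, db):
    """Simplified template lookup standing in for minutiae matching."""
    return db.get(scan_template)

def on_scan(scan_template):
    """Authenticate, then command the sliding rails to the profile height."""
    profile = match_fingerprint(scan_template, PROFILE_DB)
    if profile is None:
        return "access denied"
    # On success the microcontroller would drive the rails (103) here.
    return f"raise crossbar to {profile['optimal_bar_cm']} cm"

print(on_scan("template-01"))
```

The same flow would then hand the loaded profile to the machine-learning monitoring routines mentioned above.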
[0029] A sensor suite 110 is installed on the handlebar 105, which includes a pressure sensor, a force sensor, and a gyroscopic sensor to measure grip strength, detect asymmetric force patterns, and detect incorrect hand alignment, respectively. The sensor suite 110 operates by continuously capturing and transmitting real-time data to the microcontroller for analysis. When the user grips the handlebar 105, the integrated sensors detect pressure distribution, applied force, hand orientation, and movement patterns. These signals are converted into digital data, allowing the system to evaluate grip consistency, detect misalignments, and identify signs of fatigue or unsafe posture. Based on this evaluation, the microcontroller initiates voice alerts or adjusts training parameters to ensure safety and performance optimization.
[0030] The pressure sensor embedded in the handlebar 105 detects the intensity and distribution of pressure applied by the user's hands during grip. It typically utilizes piezoresistive sensing elements that respond to variations in pressure by changing resistance. These analog signals are converted into digital data and sent to the microcontroller for real-time analysis. This allows the system to monitor whether the user is maintaining a consistent grip or showing signs of imbalance, fatigue, or improper hand placement during the training session.
[0031] The force sensor measures the total amount of mechanical force exerted by the user's hands on the handlebar 105. It generally incorporates strain gauges that deform slightly under pressure, generating corresponding electrical signals. These signals are then amplified, digitized, and processed by the microcontroller to evaluate the strength of the grip. By monitoring these values, the system detects excessive or insufficient grip force, which may indicate physical fatigue, lack of focus, or potential risk of improper posture during high-jump preparation.
[0032] The gyroscopic sensor monitors angular orientation and movement of the user's hand on the handlebar 105 by detecting rotational velocity across multiple axes. It uses MEMS (Micro-Electro-Mechanical Systems) technology to sense shifts in direction and angular rate. These measurements are transmitted to the microcontroller, where they are analysed to assess hand stability and wrist alignment. The sensor helps identify asymmetric or erratic hand movements, signalling potential balance issues or incorrect form, which the system uses to provide corrective feedback and enhance jump posture training.
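The three sensor readings described in the preceding paragraphs can be fused into a single safety decision. The sketch below illustrates one plausible evaluation; every threshold value is an assumption chosen for the example, not a figure from this disclosure.

```python
# Assumed safety thresholds (illustrative only)
PRESSURE_MIN_KPA = 20.0   # weakest acceptable grip pressure
ASYMMETRY_MAX = 0.25      # tolerated fraction of left/right force imbalance
TILT_MAX_DEG = 15.0       # tolerated wrist misalignment

def evaluate_grip(pressure_kpa, force_left_n, force_right_n, tilt_deg):
    """Return a list of alert messages; empty means the grip is safe."""
    alerts = []
    if pressure_kpa < PRESSURE_MIN_KPA:
        alerts.append("grip too weak")
    total = force_left_n + force_right_n
    if total > 0 and abs(force_left_n - force_right_n) / total > ASYMMETRY_MAX:
        alerts.append("asymmetric grip")
    if abs(tilt_deg) > TILT_MAX_DEG:
        alerts.append("hand misaligned")
    return alerts  # a non-empty list would trigger a voice alert via 111
```

A safe grip such as `evaluate_grip(30.0, 50.0, 50.0, 5.0)` yields no alerts, while a lopsided one such as `evaluate_grip(30.0, 80.0, 20.0, 5.0)` flags the asymmetry.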
[0033] In case the measured data depicts an unsafe movement, the microcontroller triggers voice alerts via a speaker 111 mounted on the platform 101 to prompt corrective action. The speaker 111 internally receives electrical signals from the microcontroller, which are generated in response to detected unsafe movements, incorrect posture, or system alerts. These signals cause the speaker's diaphragm to vibrate, producing sound waves that deliver voice alerts or guidance instructions. This real-time auditory feedback ensures that the user receives immediate cues to correct their form or respond to any potentially hazardous situations during training.
[0034] A holographic projection unit 112 is mounted on a side of the platform 101, which operates by projecting three-dimensional spatial visuals onto a designated area around the platform 101 to guide the user on jump approach, trajectory, and optimal body posture. Internally, the projection unit 112 uses laser interference to create volumetric or stereoscopic projections that appear to float in space. The projection unit 112 receives data inputs from the microcontroller, which customizes the visuals based on the user's training profile. These projections serve as real-time, interactive visual aids, helping users align their body movements precisely with optimal jump techniques for improved performance and reduced risk of injury.
[0035] An artificial intelligence-based imaging camera 113 is paired with a processor mounted on the platform 101 for capturing and processing multiple images of the platform 101 to monitor and compare the user's movements with respect to the spatial visuals. The imaging camera 113 captures continuous high-resolution images and video of the user during training. These visual inputs are fed into the processor, a dedicated artificial intelligence (AI) processor that analyses body posture, movement patterns, and jump trajectory in real time. Using pre-trained machine learning models, the system identifies deviations from optimal form and detects unsafe or incorrect movements. The processed information is transmitted to the microcontroller, which coordinates feedback and safety responses. This integration enables precise, adaptive monitoring to enhance performance and reduce injury risk during high-jump training. The microcontroller then processes the captured images and videos from the imaging camera 113 to evaluate and generate detailed performance metrics.
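The deviation check described above, comparing detected body landmarks against the projected reference posture, can be illustrated with a simple keypoint-distance sketch. The keypoint names, coordinates, and tolerance below are assumptions; a real pipeline would obtain keypoints from a pose-estimation model.

```python
import math

def posture_deviation(detected, reference):
    """Mean Euclidean distance between matching keypoints (normalised units)."""
    dists = [math.dist(detected[k], reference[k]) for k in reference]
    return sum(dists) / len(dists)

# Hypothetical normalised (x, y) keypoints for the projected reference pose
reference = {"hip": (0.5, 0.6), "shoulder": (0.5, 0.3), "knee": (0.5, 0.8)}
# Hypothetical keypoints detected by the camera 113 during an attempt
detected = {"hip": (0.55, 0.6), "shoulder": (0.5, 0.35), "knee": (0.5, 0.8)}

TOLERANCE = 0.1  # assumed acceptable mean deviation
if posture_deviation(detected, reference) > TOLERANCE:
    print("deviation detected: issue corrective feedback")
else:
    print("posture within tolerance")
```

The microcontroller would feed such a deviation score, aggregated over the jump, into the performance metrics mentioned above.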
[0036] A computing unit operates as a remote interface that receives performance data from the microcontroller via a communication module for establishing a wireless connection to enable real-time transfer of metrics and obtained data for remote monitoring. Internally, the communication module establishes the wireless connection through protocols such as, but not limited to, Bluetooth, Wi-Fi, or other standard protocols, allowing real-time transmission of metrics, movement data, and health indicators. The computing unit processes, stores, and displays this information through the user interface, enabling trainers or users to monitor progress, analyse patterns, and make informed adjustments to training routines based on live or recorded feedback.
[0037] Additionally, an ultrasonic sensor is installed on the handlebar 105 and synced with the imaging camera 113 for capturing motion data, measuring velocity and trajectory, and performing collision detection, based on which the microcontroller identifies potential falls and activates real-time voice alerts for unsafe movements. The ultrasonic sensor emits high-frequency sound waves and measures the time taken for the echoes to return after reflecting off the user. This time delay is used to calculate distance and movement velocity. The sensor continuously monitors the jump path, and if sudden or unsafe motion is detected, it sends data to the microcontroller, which then triggers alerts and deploys safety measures.
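The time-of-flight calculation described above follows directly from the physics of the echo: the pulse travels to the user and back, so distance is half the round-trip time multiplied by the speed of sound, and velocity is the change in distance between successive pings. A minimal sketch, assuming sound at 343 m/s (roughly 20 °C air):

```python
SPEED_OF_SOUND = 343.0  # m/s, assumed ambient conditions

def echo_distance_m(round_trip_s):
    """Distance to the user from an ultrasonic echo's round-trip time."""
    return round_trip_s * SPEED_OF_SOUND / 2  # halve: out and back

def velocity_mps(dist_prev_m, dist_curr_m, dt_s):
    """Approach/retreat velocity from two successive distance readings."""
    return (dist_curr_m - dist_prev_m) / dt_s  # positive = moving away

print(echo_distance_m(0.01))       # a 10 ms echo -> 1.715 m
print(velocity_mps(1.0, 1.5, 0.5))  # 0.5 m gained in 0.5 s -> 1.0 m/s
```

A sudden large negative velocity toward the sensor would be the collision indicator the microcontroller acts on.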
[0038] In case the imaging camera 113 detects incorrect movement corresponding to a potential fall, the microcontroller activates motorized rollers 115 positioned on the distal ends of the platform 101 for rolling inflatable sheets comprised in an inflatable member 114 integrated on the motorized rollers 115. The motorized rollers 115 function to deploy or retract the inflatable sheets in response to potential falls. Internally, each roller consists of an electric motor connected to a rotating axle, around which an inflatable sheet is wound. When activated by the microcontroller upon detecting a hazardous movement, the motor rotates to unroll the sheet rapidly, enabling quick inflation-area setup and forming a cushioned landing zone to minimize injury during incorrect jump landings. The inflatable sheets are layered with a rubberized member to ensure grip is maintained during landing.
[0039] Each of the sheets is connected with an inflating unit 116 via a valve for inflation. The inflating unit 116 comprises a compact air compressor connected to the inflatable sheets through electronically controlled valves. When the AI-based imaging camera 113 detects a hazardous movement or failed jump, the microcontroller triggers the valves to open, allowing pressurized air from the compressor to rapidly fill the inflatable sheets. Pressure sensors inside the inflating unit 116 are used to monitor inflation levels. This ensures that the cushioning surface is fully deployed in time to absorb impact, thereby preventing injury and enhancing user safety during training.
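The valve-and-pressure-sensor loop described above amounts to opening the valve until the sheet reaches a target pressure, then closing it. The sketch below simulates that loop; the target pressure, the per-step pressure rise, and the simulated hardware functions are all assumptions used only to make the example runnable.

```python
TARGET_KPA = 12.0  # assumed full-deployment pressure for one sheet

def inflate(read_pressure, open_valve, close_valve, max_steps=100):
    """Keep the valve open until the pressure sensor reports the target."""
    open_valve()
    for _ in range(max_steps):  # bounded loop instead of while-True for safety
        if read_pressure() >= TARGET_KPA:
            break
    close_valve()

# Simulated hardware standing in for the compressor, valve, and sensor
state = {"kpa": 0.0, "valve": False}

def read_pressure():
    if state["valve"]:
        state["kpa"] += 1.5  # compressor raises pressure while valve is open
    return state["kpa"]

def open_valve():
    state["valve"] = True

def close_valve():
    state["valve"] = False

inflate(read_pressure, open_valve, close_valve)
print(round(state["kpa"], 1))  # sheet settles at the 12.0 kPa target
```

A real controller would also watch for over-pressure and sensor faults; the bounded loop hints at that fail-safe style.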
[0040] Further, the ends of the sheets are secured by a pair of motorized clippers 117 mounted on a parallel slider track 118 along the platform 101 for creating the cushioned landing area to reduce any risk of injuries. The parallel slider track 118 serves as a guiding structure that allows the motorized clippers 117 to move in a straight, aligned path along the platform 101's surface. Positioned parallel to the inflatable sheet's deployment area, the track 118 ensures the clippers 117 travel uniformly and precisely when activated by the microcontroller. This coordinated movement is essential for stretching and securing the inflatable sheet into a stable, cushioned configuration.
[0041] Each motorized clipper 117 is designed to grip the edges of the inflatable sheet and is mounted on a motor-driven carriage aligned with the slider track 118. When the microcontroller detects a potential fall, it sends signals to activate motors, causing the clippers 117 to move along the track 118 and stretch the sheet into a flat, secure position. This action ensures the inflated area is evenly tensioned, providing a stable and cushioned surface to absorb impact.
[0042] A head gear 119 is wirelessly associated with the microcontroller, with a plurality of EEG (Electroencephalography) sensors to monitor brain activity of the user, including mental focus, stress levels, and fatigue, thus enabling neuro-feedback for adaptive training adjustments. These sensors detect and record electrical signals generated by neurons in the brain. The collected signals are transmitted to the microcontroller, which interprets patterns related to focus, stress, and fatigue. Based on this data, the system adjusts training parameters or triggers alerts. This neuro-feedback enables personalized training by adapting intensity and guidance according to the user's mental and physical readiness.
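One common way to interpret EEG signals of the kind described above is through relative band power (theta, alpha, beta). The heuristics and thresholds below are assumptions for illustration only; this disclosure does not specify how the microcontroller maps signals to focus or fatigue, and real EEG processing would first compute band powers from the raw signal via spectral analysis.

```python
def mental_state(alpha_power, beta_power, theta_power):
    """Crude neuro-feedback heuristic on pre-computed EEG band powers."""
    focus = beta_power / (alpha_power + theta_power)  # higher -> more focused
    fatigue = theta_power / beta_power                # higher -> more fatigued
    if fatigue > 1.2:       # assumed fatigue cutoff
        return "reduce intensity"
    if focus > 1.0:         # assumed readiness cutoff
        return "ready"
    return "hold steady"

print(mental_state(1.0, 3.0, 0.5))  # strong beta, low theta -> ready
print(mental_state(1.0, 1.0, 2.0))  # theta-dominant -> reduce intensity
```

The returned state is the kind of signal that would drive the adaptive adjustments and alerts mentioned above.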
[0043] A plurality of LEDs (Light Emitting Diodes) 120 are installed on the crossbar 104, configured to adjust brightness and colour based on ambient light and to display alerts relating to fatigue with a red coloured blinking light or optimal condition signals with green coloured lights. The LEDs 120 operate by allowing an electric current to pass through a diode. When voltage is applied, electrons move across a junction and recombine with holes, releasing energy as light. The emitted light's colour is determined by the semiconductor's composition. In the invention, the microcontroller regulates the operation of the LEDs 120 by controlling voltage and current, which enables dynamic adjustments in brightness and colour, providing clear visual indicators of fatigue, readiness, or ambient lighting conditions during high-jump training.
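The LED behaviour described above reduces to a small mapping from the fatigue estimate and ambient light to a colour, blink flag, and brightness. In the sketch below, the 0-to-1 fatigue scale, the ambient-light scaling, and the brightness bounds are assumptions chosen for illustration.

```python
def led_state(fatigue_level, ambient_lux):
    """Return (colour, blinking, brightness 0-255) for the crossbar LEDs.

    fatigue_level: assumed 0.0 (fresh) to 1.0 (exhausted) scale.
    ambient_lux: ambient light reading; brighter rooms need brighter LEDs.
    """
    brightness = min(255, max(40, int(ambient_lux / 4)))  # clamp to 40..255
    if fatigue_level > 0.7:               # assumed fatigue threshold
        return ("red", True, brightness)  # red blinking fatigue alert
    return ("green", False, brightness)   # steady green: optimal condition

print(led_state(0.9, 400))   # fatigued in a dim room
print(led_state(0.2, 1200))  # fresh under bright light
```

On a microcontroller, the brightness value would typically become a PWM duty cycle and the blink flag a timer toggle.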
[0044] The present invention works best in the following manner, where the platform 101 is assembled with the pair of columns 102 secured in upright alignment through the integrated vertical sliding rails 103. The user initiates operation by gripping the handlebar 105, where the biometric scanner 106 verifies identity through fingerprint recognition. Upon successful authentication, the microcontroller retrieves the user's personalized training profile and adjusts the crossbar 104 height accordingly. The user then receives holographic guidance via the projection unit 112, which displays the optimal jump trajectory and posture. As the user prepares, the handlebar's sensor suite 110, including pressure, force, and gyroscopic sensors, monitors grip consistency, asymmetric force, and hand alignment. If unsafe patterns are detected, the microcontroller issues real-time voice alerts via the speaker 111. During the jump, the AI-based imaging camera 113 and ultrasonic sensor jointly track user motion, velocity, and posture, comparing them with ideal patterns for corrective feedback. In case a fall is predicted, the imaging camera 113 activates the motorized rollers 115 to unroll the inflatable sheets, while the inflating unit 116 and motorized clippers 117 secure the cushioned landing area. Simultaneously, the EEG sensors in the head gear 119 monitor cognitive states like stress and focus, enabling adaptive responses. Performance metrics and biometric data are processed by the microcontroller and wirelessly transmitted to the computing unit for real-time monitoring and data logging. Visual cues are continuously provided through the LEDs 120 mounted on the crossbar 104, signaling fatigue or readiness, ensuring a safe, data-driven, and fully personalized high-jump training experience.
[0045] Although the field of the invention has been described herein with limited reference to specific embodiments, this description is not meant to be construed in a limiting sense. Various modifications of the disclosed embodiments, as well as alternate embodiments of the invention, will become apparent to persons skilled in the art upon reference to the description of the invention.
Claims:
1) A high-jump training and assistance system, comprising:
a) a platform 101 assembled with a pair of columns 102, each via a vertical sliding rail 103 for height adjustment of a crossbar 104 attached along the transverse axis of the columns 102, wherein at least one of the columns 102 is installed with a handlebar 105 that is accessed by a user for initiating authentication by scanning a fingerprint on a biometric scanner 106 installed on the handlebar 105;
b) a microcontroller linked with the biometric scanner 106 for processing ridges and patterns analyzed by the biometric scanner 106, to compare and authenticate the identity of the user with pre-fed data stored in a linked database, wherein upon successful authentication, the microcontroller processes the stored pre-fed data to load a personal training profile, based on which the sliding rails 103 are triggered to adjust the height of the columns 102 for elevating the crossbar 104 to an optimum height;
c) at least one sensor suite 110 installed on the handlebar 105, which includes a pressure sensor, a force sensor, and a gyroscopic sensor to measure grip strength consistency, detect asymmetric force patterns, and detect incorrect hand alignment, respectively, wherein in case the measured data depicts an unsafe movement, the microcontroller triggers voice alerts via a speaker 111 mounted on the platform 101 to prompt corrective action;
d) a holographic projection unit 112 mounted on the platform 101 for projecting spatial visual guidance for jump approach, trajectory and optimal body posture, wherein an artificial intelligence-based imaging camera 113 paired with a processor, is mounted on the platform 101 for capturing and processing multiple images of the platform 101, respectively to monitor and compare user’s movements with respect to the spatial visuals;
e) an inflatable member 114 comprising at least two inflatable sheets mounted on at least one motorized roller 115 positioned at a distal end of the platform 101, wherein, in case the imaging camera 113 detects an incorrect movement corresponding to a potential fall, the microcontroller activates the rollers 115 for rolling/unrolling the sheets, each sheet being connected with an inflating unit 116 via a valve for inflation, while the ends of the sheets are secured by a pair of motorized clippers 117 mounted on a parallel slider track 118 alongside the platform 101, for creating a cushioned landing area to reduce the risk of injury; and
f) said microcontroller processes the captured images and videos from the imaging camera 113 to evaluate and generate detailed performance metrics, wherein the microcontroller is linked with a computing unit via a communication module establishing a wireless connection to enable real-time transfer of the metrics and obtained data for remote monitoring.
2) The system as claimed in claim 1, wherein an ultrasonic sensor is installed on the handlebar 105 and synced with the imaging camera 113 for capturing motion data, measuring velocity and trajectory, and performing collision detection, with real-time voice alerts for unsafe movements.
3) The system as claimed in claim 1, wherein a head gear 119 associated with the system is provided with a plurality of EEG (electroencephalography) sensors to monitor brain activity of the user, including mental focus, stress levels and fatigue, thereby enabling neuro-feedback for adaptive training adjustments.
4) The system as claimed in claim 1, wherein a plurality of LEDs (Light Emitting Diodes) 120 are installed on the crossbar 104, configured to adjust brightness and color based on ambient light, and to display fatigue alerts with a red blinking light or optimal-condition signals with green lights.
5) The system as claimed in claim 1, wherein the microcontroller accesses the database storing real-time obtained data, performance metrics and health indicators, and analyzes the data for fatigue detection and personalized training adjustments for the user.
6) The system as claimed in claim 1, wherein the handlebar 105 is gripped by a motorized clamp 107 mounted on an extendable pole 108 with a motorized ball-and-socket joint 109, to secure the handlebar 105 to the platform 101 for enhanced stability and dynamic adjustment.
7) The system as claimed in claim 1, wherein the microcontroller is configured with machine learning protocols that enable adaptive monitoring and control of the system's operating parameters.
8) The system as claimed in claim 1, wherein the inflatable sheets are layered with a rubberized member for ensuring grip is maintained during landing.
9) The system as claimed in claim 1, wherein the head gear 119 is wirelessly associated with the microcontroller and is operable based on conditions processed by the microcontroller.
10) The system as claimed in claim 1, wherein the ultrasonic sensor is configured to perform collision detection, based on which the microcontroller identifies a potential fall.
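The grip-monitoring logic of claim 1(c) (pressure, force and gyroscopic readings checked against safety limits, with voice prompts on violation) can be illustrated by a minimal sketch. The numeric thresholds, the `HandlebarReading` structure and the alert phrasings below are hypothetical illustrations; the specification does not fix any concrete values or interfaces.

```python
from dataclasses import dataclass

# Hypothetical safety thresholds -- not part of the claimed specification.
GRIP_MIN_N = 150.0     # minimum safe combined grip force (newtons)
ASYMMETRY_MAX = 0.25   # maximum left/right force imbalance ratio
TILT_MAX_DEG = 20.0    # maximum hand misalignment from the gyroscope (degrees)

@dataclass
class HandlebarReading:
    """One sample from the sensor suite 110 on the handlebar 105."""
    grip_left_n: float   # force sensor, left hand
    grip_right_n: float  # force sensor, right hand
    tilt_deg: float      # gyroscopic hand-alignment angle

def check_handlebar(r: HandlebarReading) -> list[str]:
    """Return corrective voice prompts (played via speaker 111) for
    unsafe measurements; an empty list means the movement is safe."""
    alerts = []
    total = r.grip_left_n + r.grip_right_n
    if total < GRIP_MIN_N:
        alerts.append("Tighten your grip")
    if total > 0 and abs(r.grip_left_n - r.grip_right_n) / total > ASYMMETRY_MAX:
        alerts.append("Balance force between both hands")
    if abs(r.tilt_deg) > TILT_MAX_DEG:
        alerts.append("Realign your hands")
    return alerts
```

In this sketch the microcontroller would sample the suite periodically and forward any returned prompts to the speaker; a safe reading yields no prompts.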
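The cushioned-landing deployment of claim 1(e) is a fixed actuator sequence triggered by a fall prediction: the rollers 115 unroll the sheets, the inflating unit 116 opens its valve, and the clippers 117 secure the sheet ends on the slider track 118. A minimal sketch of that sequence follows; the actuator drivers are stubbed as log entries, since no hardware interface is defined in the specification.

```python
def deploy_cushion(fall_predicted: bool, actuator_log: list[str]) -> bool:
    """Run the landing-unit deployment sequence of claim 1(e).

    `actuator_log` stands in for real hardware drivers (an assumption);
    each entry represents one actuator command issued in order.
    Returns True if the cushion was deployed, False otherwise.
    """
    if not fall_predicted:
        return False  # no incorrect movement detected by camera 113
    actuator_log.append("rollers_115: unroll inflatable sheets")
    actuator_log.append("inflating_unit_116: open valve, inflate sheets")
    actuator_log.append("clippers_117: secure sheet ends on slider track 118")
    return True
```

The ordering matters in this sketch: the sheets must be unrolled before inflation, and clipped only once positioned, mirroring the sequence recited in the claim.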
| # | Name | Date |
|---|---|---|
| 1 | 202521056015-STATEMENT OF UNDERTAKING (FORM 3) [10-06-2025(online)].pdf | 2025-06-10 |
| 2 | 202521056015-REQUEST FOR EXAMINATION (FORM-18) [10-06-2025(online)].pdf | 2025-06-10 |
| 3 | 202521056015-REQUEST FOR EARLY PUBLICATION(FORM-9) [10-06-2025(online)].pdf | 2025-06-10 |
| 4 | 202521056015-PROOF OF RIGHT [10-06-2025(online)].pdf | 2025-06-10 |
| 5 | 202521056015-POWER OF AUTHORITY [10-06-2025(online)].pdf | 2025-06-10 |
| 6 | 202521056015-FORM-9 [10-06-2025(online)].pdf | 2025-06-10 |
| 7 | 202521056015-FORM FOR SMALL ENTITY(FORM-28) [10-06-2025(online)].pdf | 2025-06-10 |
| 8 | 202521056015-FORM 18 [10-06-2025(online)].pdf | 2025-06-10 |
| 9 | 202521056015-FORM 1 [10-06-2025(online)].pdf | 2025-06-10 |
| 10 | 202521056015-FIGURE OF ABSTRACT [10-06-2025(online)].pdf | 2025-06-10 |
| 11 | 202521056015-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [10-06-2025(online)].pdf | 2025-06-10 |
| 12 | 202521056015-EVIDENCE FOR REGISTRATION UNDER SSI [10-06-2025(online)].pdf | 2025-06-10 |
| 13 | 202521056015-EDUCATIONAL INSTITUTION(S) [10-06-2025(online)].pdf | 2025-06-10 |
| 14 | 202521056015-DRAWINGS [10-06-2025(online)].pdf | 2025-06-10 |
| 15 | 202521056015-DECLARATION OF INVENTORSHIP (FORM 5) [10-06-2025(online)].pdf | 2025-06-10 |
| 16 | 202521056015-COMPLETE SPECIFICATION [10-06-2025(online)].pdf | 2025-06-10 |
| 17 | 202521056015-FORM-26 [18-06-2025(online)].pdf | 2025-06-18 |
| 18 | Abstract.jpg | 2025-06-25 |