Abstract: A worker assistance and performance monitoring system, comprising a frame 101 equipped with a plurality of wheels 102 connected with frame 101 via telescopic bars 103, a vision module 104 disposed over frame 101 via a rotatable joint 104b to monitor construction area along with profile of a worker, an inspection module 105 configured over frame 101 to process construction data and evaluate faulty areas, inspection module 105 comprising a wall inspection module 105, an electrical fitting inspection module 105, and a concrete inspection module 106, an audio-visual unit 107 coupled with inspection module 105 to provide real-time audio-visual alerts to workers regarding faulty areas, a processor coupled with vision module 104 and interlinked with a memory module to compare a set of work parameters of worker with a pre-defined worklist, and a computing unit wirelessly connected with inspection module 105 to provide real-time updates to an authorized person regarding workers.
Description: FIELD OF THE INVENTION
[0001] The present invention relates to a worker assistance and performance monitoring system that provides support to workers and assists in overseeing their working environments to ensure quality, safety, and efficiency by providing real-time feedback and alerts to both workers and supervisors, thus improving overall performance and reducing errors.
BACKGROUND OF THE INVENTION
[0002] Construction and infrastructure development are critical sectors that demand accurate, timely, and safe execution of tasks by on-site workers. These activities require continuous inspection, monitoring, and coordination to ensure that structural components are built to specification and safety standards are maintained. In addition to ensuring work quality, there is a constant need to track worker productivity, compliance, and safety behavior. Manual supervision of these tasks is often labor-intensive and may not provide real-time feedback or consistent performance tracking.
[0003] Traditionally, quality inspections on construction sites are carried out manually by engineers or supervisors who physically check completed work against design specifications. This includes examining wall finishes, electrical fittings, concrete quality, and other structural features. These manual methods depend heavily on human judgment, which can be inconsistent due to fatigue, time pressure, or lack of proper tools. Moreover, identifying hidden faults such as internal cracks, moisture levels, or electrical irregularities is challenging without advanced equipment. Worker monitoring in traditional systems is also done through manual supervision, where team leads or site managers observe worker activities and compare them with daily targets. This process is often irregular and may not capture real-time non-compliance, safety violations, or poor performance.
[0004] US20240104714A1 discloses a construction inspection method that includes establishing a first construction model based on detection data at a construction site, comparing the first construction model with a previously established second construction model based on construction design data to acquire a comparison result, confirming an inspection result of a construction in accordance with the comparison result, and sharing the inspection result with a user. A construction inspection apparatus includes a modeling unit that establishes a first construction model based on detection data at a construction site, a comparison unit that compares the first construction model with a previously established second construction model based on construction design data to acquire a comparison result, a confirmation unit that confirms an inspection result of a construction in accordance with the comparison result, and a sharing unit that shares the inspection result with a user. A construction inspection system includes the construction inspection apparatus.
[0005] US11106208B2 discloses a building quality inspection system that includes a controller, a drone, and a robot that are communicably connected to one another. The controller includes circuitry configured to, when the exterior of a building is inspected, send an inspection objective to the drone to instruct the drone to carry out a visual inspection of the exterior of the building, receive inspection data collected by the drone during the visual inspection of the exterior of the building, extract a location where damage is suspected from the inspection data collected by the drone, send the location where damage is suspected to the robot to carry out an exterior inspection at the location where damage is suspected, receive inspection data collected by the robot during the exterior inspection, and determine the current quality of the exterior of the building based on the inspection data collected by the drone and the robot.
[0006] Conventionally, many systems have been developed to assist with construction quality inspection and performance tracking; however, these existing systems and devices mentioned in the prior arts have limitations pertaining to comprehensive site monitoring, real-time communication, and multi-parameter evaluation of construction quality and worker activity. Existing systems are also often either limited to specific inspection tasks such as visual scanning or rely heavily on semi-automated data collection that requires manual interpretation.
[0007] In order to overcome the aforementioned drawbacks, there exists a need in the art to develop a system capable of performing construction inspection and worker monitoring that provides real-time alerts, ensures timely rectification of faults, and enhances safety compliance across the site.
OBJECTS OF THE INVENTION
[0008] The principal object of the present invention is to overcome the disadvantages of the prior art.
[0009] An object of the present invention is to develop a system that is capable of monitoring quality of construction work in real time to ensure that it complies with predefined standards and guidelines.
[0010] Another object of the present invention is to develop a system that is capable of identifying construction faults or irregularities at an early stage to support timely corrective measures and reduce overall project delays.
[0011] Another object of the present invention is to develop a system that is capable of continuously tracking and comparing worker performance with assigned tasks to ensure proper compliance and accountability at the worksite.
[0012] Yet another object of the present invention is to develop a system that is capable of delivering timely alerts and updates to both on-site workers and remote supervisors regarding safety issues or unattended construction defects.
[0013] The foregoing and other objects, features, and advantages of the present invention will become readily apparent upon further review of the following detailed description of the preferred embodiment as illustrated in the accompanying drawings.
SUMMARY OF THE INVENTION
[0014] The present invention relates to a worker assistance and performance monitoring system that is capable of assisting workers in completing their duties along with continuously observing and evaluating their actions to maintain safety standards, ensure work accuracy, and communicate important information promptly to prevent mistakes and enhance productivity.
[0015] According to an embodiment of the present invention, a worker assistance and performance monitoring system is disclosed, comprising a frame equipped with a plurality of motorized wheels connected via telescopic bars, allowing mobile operation across the construction site, a vision module mounted over the frame via a rotatable joint, enabling constant monitoring of the site and worker profiles, an inspection module arranged on the frame to analyze construction data against predefined standards, the inspection module including a wall inspection module, an electrical fitting inspection module, and a concrete inspection module, an audio-visual unit operatively connected with the inspection module to provide real-time alerts to workers about faulty areas, a processor coupled with the vision module and linked to memory to compare worker performance parameters with a pre-assigned worklist, and a computing unit wirelessly connected to the inspection module to send live updates to authorized personnel when alerts are ignored or irregularities occur.
[0016] According to another embodiment of the present invention, the system further comprises the wall inspection module, which includes an articulated link with an AI imaging unit, a LIDAR sensor, a GPR sensor, and a UPV sensor; the electrical fitting inspection module uses an infrared sensor and a non-contact voltage tester, while the concrete inspection module deploys robotic arms with a sensing plate containing moisture, temperature, and viscosity sensors. The system processes deviations in construction quality and sends notifications; the vision module further integrates a camera and microprocessor with mapping and profiles to regulate the telescopic rod and wheel actuators; an LDR (Light Dependent Resistor) and lighting module activate site lighting under low light; and the computing unit enables authorized updates to worklists and safety compliance monitoring, with the audio-visual unit projecting strict warnings and remote alerts when safety or performance thresholds are violated, thereby delivering a comprehensive, automated site assistance and monitoring solution.
[0017] While the invention has been described and shown with particular reference to the preferred embodiment, it will be apparent that variations might be possible that would fall within the scope of the present invention.
BRIEF DESCRIPTION OF THE DRAWINGS
[0018] These and other features, aspects, and advantages of the present invention will become better understood with regard to the following description, appended claims, and accompanying drawings where:
Figure 1 illustrates an isometric view of a worker assistance and performance monitoring system.
DETAILED DESCRIPTION OF THE INVENTION
[0019] The following description includes the preferred best mode of one embodiment of the present invention. It will be clear from this description of the invention that the invention is not limited to these illustrated embodiments but that the invention also includes a variety of modifications and embodiments thereto. Therefore, the present description should be seen as illustrative and not limiting. While the invention is susceptible to various modifications and alternative constructions, it should be understood, that there is no intention to limit the invention to the specific form disclosed, but, on the contrary, the invention is to cover all modifications, alternative constructions, and equivalents falling within the spirit and scope of the invention as defined in the claims.
[0020] In any embodiment described herein, the open-ended terms "comprising," "comprises," and the like (which are synonymous with "including," "having," and "characterized by") may be replaced by the respective partially closed phrases "consisting essentially of," "consists essentially of," and the like, or the respective closed phrases "consisting of," "consists of," and the like.
[0021] As used herein, the singular forms “a,” “an,” and “the” designate both the singular and the plural, unless expressly stated to designate the singular only.
[0022] The present invention relates to a worker assistance and performance monitoring system that is developed for helping workers to perform tasks more effectively by monitoring their progress, detecting problems, and sending timely notifications to both workers and managers to ensure compliance with safety and quality requirements, thereby minimizing risks and improving work outcomes.
[0023] Referring to Figure 1, an isometric view of a worker assistance and performance monitoring system is illustrated, comprising a frame 101 equipped with a plurality of motorized wheels 102 connected to the frame 101 via telescopic bars 103, a vision module 104 mounted on the frame 101 through a rotatable joint 104b and including a camera 104a, a wall inspection module 105 that includes an articulated link 105a integrated with an artificial intelligence (AI) imaging unit 105b, a concrete inspection module 106 that includes a set of robotic arms 106a installed with a sensing plate 106b, and an audio-visual unit 107 operatively coupled with the inspection modules 105, 106 and comprising a display panel 107a, a speaker 107b, and LEDs 107c (light emitting diodes).
[0024] The present invention includes a frame 101 positioned over a ground surface in a construction site. The frame 101 provides a stable and mobile platform for various integrated modules. The system is linked with a user interface which is installed in a computing unit and wirelessly connected to a microcontroller by means of a communication module. The user accesses the user interface to provide input regarding activation of the system. The user is also provided with an option of updating the worklist of the workers and a set of parameters including, but not limited to, worker safety compliance, worker daily targets, worker breaks, and worker faults.
[0025] The communication module includes, but is not limited to, a Wi-Fi (Wireless Fidelity) module, a Bluetooth module, and a GSM (Global System for Mobile Communication) module. The Wi-Fi module contains transmitters and receivers that use radio frequency signals to transmit data wirelessly to the microcontroller. The wireless module typically includes components such as antennas, amplifiers, and processors to facilitate communication, and further connects to networks such as Wi-Fi, Bluetooth, or cellular networks, allowing the system to exchange information over short or long distances for communication of wireless commands that facilitate operations of the system.
[0026] A vision module 104 is affixed on top of the frame 101 comprising a camera 104a linked with the microprocessor for observation of both surroundings and individual worker identity and position. The microprocessor is preloaded with digital maps of the construction site, along with profiles of individual workers and their assigned worklists.
[0027] When the system is activated, the camera 104a continuously captures real-time visual data of the surroundings, including workers' physical presence and actions. This video feed is processed by the microcontroller, which uses the stored mapping and profile data to identify the specific location of each worker and compare their activities with the expected tasks in their worklists. The microcontroller applies image recognition and tracking protocols to match visual markers such as uniforms, posture, movement, or facial features to the stored profiles.
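The worklist-comparison step described above can be sketched as follows. This is a minimal, hypothetical illustration: the worker identifiers, profile structure, and function names are assumptions for the sketch, not part of the disclosed system.

```python
# Hypothetical sketch: an observed worker activity (derived from the
# camera feed) is matched against the stored profile and assigned
# worklist held in memory. All names and data are illustrative.

WORKER_PROFILES = {
    "W-101": {"name": "A. Mason", "worklist": ["plastering", "curing"]},
    "W-102": {"name": "B. Iyer", "worklist": ["wiring"]},
}

def check_compliance(worker_id, observed_task):
    """Return (compliant, message) by comparing an observed task
    with the worker's pre-assigned worklist."""
    profile = WORKER_PROFILES.get(worker_id)
    if profile is None:
        return False, f"Unknown worker {worker_id}"
    if observed_task in profile["worklist"]:
        return True, f"{profile['name']} on assigned task '{observed_task}'"
    return False, f"{profile['name']} deviating: '{observed_task}' not assigned"

print(check_compliance("W-101", "plastering"))
print(check_compliance("W-102", "plastering"))
```

In practice the `observed_task` label would come from the image recognition and tracking protocols mentioned above; here it is supplied directly to keep the sketch self-contained.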
[0028] The vision module 104 is mounted through a rotatable joint 104b to allow overall monitoring in the surroundings.
[0029] In an embodiment of the present invention, the rotary joint used herein is a ball-and-socket joint. The ball-and-socket joint consists of a spherical ball mounted on the vision module’s 104 stem, which fits inside a complementary socket fixed on the frame 101. This design allows multi-directional rotation along three axes: pitch, yaw, and roll. The joint uses low-friction materials or lubricants to enable smooth movement. Control could be manual or motorized, where small servo motors apply torque to adjust the ball’s orientation within the socket. The joint’s structure supports angular displacement in all directions, providing nearly unrestricted viewing flexibility.
[0030] In another embodiment of the present invention, the rotary joint used herein is a rotary bearing joint. The rotary bearing joint utilizes rolling-element bearings (ball or roller bearings) housed between inner and outer races fixed respectively to the vision module 104 and frame 101. These bearings reduce friction during rotation, allowing smooth continuous 360-degree rotation about a single vertical axis. A motor, such as a brushless DC motor, applies torque to the inner race to spin the vision module 104. The bearing assembly supports axial and radial loads, stabilizing the module while enabling free rotation. Position sensors or encoders track angular displacement, providing feedback to maintain or adjust orientation as commanded by the control system.
[0031] The user is prompted to input site-specific information into the system's computing unit, including the intended application area of the mixture and environmental factors such as whether the target wall or surface is directly exposed to sunlight.
[0032] The frame 101 is equipped with a plurality of motorized wheels 102, each connected to the frame 101 through telescopic bars 103 that allow adjustable extension and retraction to adapt to uneven construction site surfaces and maintain stability during movement. Based on the detected type of construction area, the microcontroller sends a signal to a pneumatic unit to power the telescopic bars 103 for positioning the motorized wheels 102.
[0033] The pneumatic unit includes an air compressor, air cylinder, air valves, and a piston, which work in collaboration to extend and retract the bars 103. The microcontroller sends a signal to the pneumatic unit that actuates a valve to allow passage of compressed air from the compressor into the cylinder from one end; the compressed air develops pressure against the piston, pushing and extending it. The piston is connected with the bars 103, so the applied pressure extends the bars 103. Similarly, the microcontroller retracts the bars 103 by routing compressed air through the other end of the cylinder, opening the corresponding valve so that the piston, and with it the bars 103, retracts.
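The extend/retract decision made by the microcontroller can be illustrated with a small control sketch. The valve names, tolerance, and units are assumptions for illustration only; the disclosure does not specify them.

```python
# Illustrative control logic for the telescopic bars 103: the
# microcontroller opens one of two valves to route compressed air to
# either end of the cylinder. Valve identifiers are assumed names.

def set_bar(target_mm, current_mm, tolerance_mm=2):
    """Decide which valve to open to move the piston toward target_mm."""
    error = target_mm - current_mm
    if abs(error) <= tolerance_mm:
        return "both_valves_closed"    # bar at target: hold position
    if error > 0:
        return "open_extend_valve"     # air to one end -> piston extends
    return "open_retract_valve"        # air to other end -> piston retracts

print(set_bar(120, 80))    # bar too short: extend
print(set_bar(120, 119))   # within tolerance: hold
print(set_bar(80, 120))    # bar too long: retract
```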
[0034] The microcontroller actuates the plurality of wheels 102 (ranging between four and six) upon adjustment of the bars 103 to allow the frame 101 to adapt to uneven construction site surfaces and maintain stability during movement. The omnidirectional wheels 102 have small discs or rollers around the circumference of the wheel that are powered by a direct current (DC) motor, enabling the wheels 102 to move in the required direction.
[0035] An inspection module 105 is installed over the frame 101 to collect and analyse construction data against predefined standards for identifying faulty areas. This module comprises a wall inspection module 105, an electrical fitting inspection module 105, and a concrete inspection module 106.
[0036] The wall inspection module 105 includes an articulated link 105a integrated with an AI (artificial intelligence) imaging unit 105b, a LIDAR sensor, a ground penetrating radar (GPR) sensor, and an ultrasonic pulse velocity (UPV) sensor for structural assessment. The AI imaging unit 105b uses one or more cameras 104a combined with advanced image processing protocols powered by artificial intelligence. The imaging unit 105b captures high-resolution images or video of the wall surface and analyzes visual data to detect surface anomalies such as cracks, unevenness, discoloration, or material degradation. The AI protocols are trained on large datasets to recognize patterns that indicate structural faults or defects. The imaging unit 105b classifies the severity of detected issues and provides real-time feedback or alerts to workers, facilitating quick identification and decision-making.
[0037] The LIDAR (Light Detection and Ranging) sensor emits laser pulses towards the wall surface and measures the time it takes for the reflected light to return. By calculating the distance based on the speed of light and the return time, the sensor generates a precise 3D map of the wall’s exterior geometry. This detailed topographical data helps detect surface deformations such as bumps, dips, or warping that might not be easily visible. LIDAR also enables accurate measurement of wall dimensions, aiding in structural integrity assessments and monitoring changes over time.
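The range calculation described above follows directly from the speed of light and the measured round-trip time. A minimal numeric sketch (timing value chosen for illustration):

```python
# Minimal sketch of the LIDAR range equation described above:
# distance = (speed of light x round-trip time) / 2.

C = 299_792_458.0  # speed of light, m/s

def lidar_range(round_trip_s):
    """Distance to the wall surface from a pulse's round-trip time."""
    return C * round_trip_s / 2.0

# A pulse returning after ~33.4 ns corresponds to roughly 5 m.
d = lidar_range(33.4e-9)
print(f"{d:.3f} m")
```

Repeating this measurement while sweeping the beam across the wall yields the point cloud from which the 3D surface map is built.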
[0038] The ground penetrating radar (GPR) sensor transmits high-frequency electromagnetic waves into the wall material and detects the reflected signals from subsurface structures. It reveals hidden features beneath the surface such as voids, cracks, rebar placement, and moisture intrusion. By analyzing the time delay and intensity of returned signals, the system constructs an image of internal wall conditions. This non-destructive testing technique is crucial for evaluating structural soundness without damaging the wall and helps identify internal defects that could compromise safety.
[0039] The ultrasonic pulse velocity (UPV) sensor operates by sending ultrasonic pulses through the wall material and measuring the velocity at which the waves propagate. The speed of the ultrasonic pulses depends on the material’s density and elasticity; thus, variations in pulse velocity indicate flaws such as cracks, voids, or degradation in concrete or masonry. By comparing measured velocities against standard values, the system assesses the uniformity and quality of the wall material.
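The velocity-versus-standard comparison can be sketched as below. The grading bands follow commonly cited concrete UPV quality ranges (e.g. above ~4.5 km/s graded as excellent); they are illustrative here, not values specified by the disclosure.

```python
# Sketch of the UPV comparison step: measured pulse velocity is graded
# against indicative quality bands. Thresholds are commonly cited
# concrete-grading values, used here for illustration only.

def upv_velocity(path_length_m, transit_time_s):
    """Pulse velocity from known path length and measured transit time."""
    return path_length_m / transit_time_s

def grade_concrete(velocity_m_s):
    if velocity_m_s > 4500:
        return "excellent"
    if velocity_m_s > 3500:
        return "good"
    if velocity_m_s > 3000:
        return "medium"
    return "doubtful"

v = upv_velocity(0.30, 75e-6)   # 300 mm path, 75 microsecond transit
print(v, grade_concrete(v))     # 4000 m/s falls in the "good" band
```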
[0040] The electrical fitting inspection module 105 includes an infrared sensor and a non-contact voltage tester for electrical fault detection.
[0041] The infrared (IR) sensor detects heat emitted by electrical components and wiring without physical contact. Electrical faults such as overloaded circuits, loose connections, or faulty equipment often generate abnormal heat signatures. The IR sensor captures infrared radiation and converts it into thermal images or temperature data, allowing identification of hotspots that indicate potential problems. By continuously monitoring temperature variations, the sensor helps detect early signs of electrical failures, preventing hazards such as short circuits or fires.
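The hotspot-identification logic described above can be sketched over a toy thermal grid. The grid values, margin, and function name are assumptions for the sketch; a real implementation would operate on the IR sensor's thermal image.

```python
# Illustrative hotspot detection over a small thermal "image": cells whose
# temperature exceeds the grid mean by a margin are flagged, mirroring
# the abnormal-heat-signature logic described above. Data are assumed.

def find_hotspots(thermal_grid, margin_c=15.0):
    """Return (row, col) of cells hotter than the grid mean + margin."""
    cells = [t for row in thermal_grid for t in row]
    mean = sum(cells) / len(cells)
    return [(r, c)
            for r, row in enumerate(thermal_grid)
            for c, t in enumerate(row)
            if t > mean + margin_c]

grid = [
    [30.0, 31.0, 30.5],
    [30.2, 78.0, 31.1],   # e.g. a loose connection heating to ~78 C
    [29.8, 30.4, 30.9],
]
print(find_hotspots(grid))   # flags the single hot cell
```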
[0042] The non-contact voltage tester detects the presence of electric voltage in wires, outlets, or electrical fittings without direct electrical contact. It works by sensing the electric field generated by live conductors using capacitive coupling or electromagnetic field detection. When brought near a live wire, the tester alerts the user through visual signals (LED indicators), audible beeps, or vibrations, confirming voltage presence and helping identify energized circuits.
[0043] The concrete inspection module 106 comprises robotic arms 106a equipped with a sensing plate 106b that includes a group of sensors, including a moisture sensor, a temperature sensor, a viscosity sensor, and a GPR sensor.
[0044] The robotic arms 106a consist of multiple motorized joints controlled by servo motors, which receive precise commands from the microcontroller. The robotic arms 106a comprise motor controllers, an arm, an end effector, and sensors, all configured with the microcontroller. The elbow, at the middle section of the arms 106a, allows the upper part of the arms 106a to move the lower section independently. Lastly, the wrist, at the tip of the upper arms 106a, is attached to the end effector, which works as a hand to position the sensing plate 106b near the concrete.
[0045] The moisture sensor typically operates on the principle of electrical impedance or capacitance. Upon activation, it sends a low-frequency electrical signal into the concrete, and measures the response which varies with water content. The microcontroller processes the impedance values to estimate the moisture level within the concrete matrix. Higher water content lowers impedance, indicating wet or uncured concrete, while lower moisture increases impedance. The microcontroller compares these measurements with threshold values stored in memory and flags anomalies by triggering notifications.
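The impedance-to-moisture comparison can be sketched as a simple thresholding step. The ohm values are assumed for illustration; actual thresholds would come from the values stored in memory for the specific mix.

```python
# Minimal sketch of the impedance check: lower impedance indicates wetter
# concrete, higher impedance drier concrete. Threshold values are assumed.

DRY_THRESHOLD_OHM = 50_000    # above this: concrete considered cured/dry
WET_THRESHOLD_OHM = 10_000    # below this: flag as wet/uncured

def classify_moisture(impedance_ohm):
    if impedance_ohm < WET_THRESHOLD_OHM:
        return "wet"       # anomaly: trigger notification
    if impedance_ohm > DRY_THRESHOLD_OHM:
        return "dry"
    return "curing"

print(classify_moisture(8_000))    # low impedance -> wet
print(classify_moisture(60_000))   # high impedance -> dry
```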
[0046] The temperature sensor used is generally a thermistor or RTD (Resistance Temperature Detector). Upon actuation, the sensor converts temperature changes into a corresponding electrical resistance variation. The microcontroller reads this resistance via an ADC (Analog to Digital Converter) and converts it into temperature data using calibration curves. This continuous temperature monitoring allows the microcontroller to detect deviations from optimal curing temperatures, which are critical for concrete strength development, and issue alerts if the temperature is too high or too low.
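The ADC-to-temperature conversion described above can be sketched with the Beta-parameter equation, a common thermistor calibration curve. The ADC width, divider topology, and component values are assumptions for the sketch, not values given in the disclosure.

```python
import math

# Sketch of the thermistor read-out chain: ADC count -> resistance via a
# voltage divider, resistance -> temperature via the Beta equation (one
# common calibration curve). Component values are illustrative.

ADC_MAX = 1023            # 10-bit ADC, assumed
R_FIXED = 10_000.0        # series resistor, ohms, assumed
R0, T0, BETA = 10_000.0, 298.15, 3950.0   # assumed datasheet values

def adc_to_resistance(adc):
    """Thermistor on the low side of a divider fed from the reference."""
    return R_FIXED * adc / (ADC_MAX - adc)

def resistance_to_celsius(r):
    inv_t = 1.0 / T0 + math.log(r / R0) / BETA
    return 1.0 / inv_t - 273.15

r = adc_to_resistance(512)            # mid-scale: R ~ R_FIXED -> ~25 C
print(round(resistance_to_celsius(r), 1))
```

The microcontroller would then compare the resulting temperature with the curing band and raise an alert outside it, as described above.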
[0047] The viscosity sensor typically functions by measuring the resistance to flow or shear stress within the concrete slurry. It may use a rotational viscometer principle, where a rotor embedded in the sensing plate 106b spins at a constant speed and the torque required to maintain this speed is measured. The microcontroller receives torque data proportional to the fluid’s viscosity and processes it against predefined limits. Significant deviations indicate issues such as mix inconsistency or segregation, prompting the system to alert operators.
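The torque-to-viscosity step can be illustrated as below. For a rotational viscometer at constant speed, apparent viscosity scales with measured torque; the instrument constant, rotor speed, and acceptance band here are assumed values.

```python
# Illustrative rotational-viscometer step: torque at constant rotor speed
# maps to apparent viscosity via an instrument constant, then is checked
# against mix limits. Constant and limits are assumed for the sketch.

K_INSTRUMENT = 4.0e4              # instrument constant, assumed calibration
OMEGA = 10.0                      # rotor speed, rad/s (held constant)
VISCOSITY_LIMITS = (20.0, 60.0)   # acceptable apparent-viscosity band, Pa.s

def apparent_viscosity(torque_nm):
    return K_INSTRUMENT * torque_nm / OMEGA

def mix_ok(torque_nm):
    lo, hi = VISCOSITY_LIMITS
    return lo <= apparent_viscosity(torque_nm) <= hi

print(apparent_viscosity(0.01), mix_ok(0.01))    # 40 Pa.s: in band
print(apparent_viscosity(0.002), mix_ok(0.002))  # too thin: alert operators
```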
[0048] The GPR sensor transmits high-frequency electromagnetic pulses into the concrete using an antenna mounted on the sensing plate 106b. The pulses reflect off subsurface structures or anomalies and are received back by the antenna. The microcontroller processes these reflected signals using protocols such as Fast Fourier Transform (FFT) and time-domain analysis to generate subsurface images or profiles. Variations in signal reflection indicate presence of voids, cracks, or rebar positioning, enabling non-destructive internal evaluation of the concrete.
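The time-domain interpretation of a GPR reflection can be sketched with the standard two-way travel relation: a reflector at delay t lies at depth v·t/2, where the wave speed in the medium is c divided by the square root of its relative permittivity. The permittivity value below is a typical assumption for concrete, not a value from the disclosure.

```python
import math

# Sketch of GPR time-delay interpretation: reflection delay -> depth,
# using wave speed v = c / sqrt(relative permittivity). Permittivity
# of ~9 for concrete is an illustrative assumption.

C = 299_792_458.0
EPS_R_CONCRETE = 9.0

def reflector_depth(delay_s):
    v = C / math.sqrt(EPS_R_CONCRETE)   # ~1e8 m/s in concrete
    return v * delay_s / 2.0

# A rebar echo arriving 2 ns after the pulse sits at about 10 cm depth.
print(round(reflector_depth(2e-9) * 100, 1), "cm")
```

The FFT and full time-domain analysis mentioned above would operate on the whole received trace; this sketch covers only the depth conversion for a single detected echo.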
[0049] Upon receiving processed data from these sensors, the microcontroller compares the readings with user-defined standards or thresholds stored in memory. Any detected deviations trigger an alerting means which communicates alerts in real-time through the computing unit, enabling timely intervention to maintain concrete quality and safety.
[0050] An audio-visual unit 107, operatively coupled with the inspection module 105, delivers real-time alerts to workers present near identified faulty areas. The audio-visual unit 107 typically comprises a speaker 107b and a display panel 107a positioned on the frame 101. When the alert signal is received, the microcontroller in the audio-visual unit 107 triggers the speaker 107b to emit audible warnings, such as alarms, voice messages, or predefined safety instructions, which are clearly heard by workers in the vicinity.
[0051] The speaker 107b takes the input signal from the microcontroller, processes and amplifies it through a series of components in a specific order within the speaker 107b, and then outputs the signal as an audio notification to alert the workers present near identified faulty areas.
[0052] Simultaneously, the display panel 107a is activated to provide visual cues, such as flashing lights, text messages, or symbols, that indicate the exact nature and location of the fault. The display panel 107a is used for displaying images. The panel is made of insulating material, but the surface of the panel is coated with a thin layer of electrically conducting material that helps in creating a low-intensity electric discharge conducted towards the internal circuitry of the panel. Thus, the panel displays pre-fed visual notifications to indicate the exact nature and location of the fault.
[0053] The audio-visual unit 107 is equipped with holographic projection capabilities, designed to enable seamless and effective communication with workers on construction sites. A sound level sensor and a directional microphone are operatively associated with the system to continuously monitor ambient noise levels in the environment and detect situations where speech-based instructions may be compromised due to excessive noise.
[0054] The sound level sensor continuously samples the environmental sound pressure and converts the analog acoustic signal into a proportional electrical signal. This electrical signal is then digitized and processed by the microcontroller to determine the real-time sound level. The sound level sensor is programmed with a threshold value that represents the maximum acceptable noise level for clear verbal communication. When ambient noise exceeds this threshold, the sensor triggers a signal to the processor indicating that the surrounding environment is too noisy for reliable speech-based instruction delivery.
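The threshold-based mode switch described above can be sketched as follows. The decibel threshold and sample values are assumptions for illustration; the disclosure specifies only that a programmed threshold exists.

```python
# Minimal sketch of the audio-vs-visual channel decision: sampled sound
# levels are averaged and compared with a noise threshold; above it the
# system falls back to the display/holographic channel. Values assumed.

NOISE_THRESHOLD_DB = 85.0

def select_channel(samples_db):
    """Pick the alert channel from recent sound-level samples."""
    level = sum(samples_db) / len(samples_db)
    return "visual" if level > NOISE_THRESHOLD_DB else "audio"

print(select_channel([70.0, 72.5, 68.0]))    # quiet site: speech is fine
print(select_channel([95.0, 101.0, 93.0]))   # heavy machinery: go visual
```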
[0055] The directional microphone typically works based on phase cancellation principles or multi-microphone arrays with digital beamforming protocols. The directional microphone is aligned to focus on the expected location of the person speaking. It isolates the intended speech signal from surrounding ambient noise by enhancing signals coming from the target direction and suppressing off-axis noise. This improves voice recognition accuracy and ensures better audio input quality. In response to such conditions, the system is programmed to automatically transition from audio to visual communication mode, and the alerts or instructions are displayed on the display panel 107a or projected holographically for enhanced visibility.
[0056] If a defect remains unresolved after worker intervention, the inspection module 105 triggers a second level inspection, and if the issue persists, the computing unit sends further alerts to supervisors.
[0057] To further ensure message clarity and user-specific comprehension, the system employs a facial recognition module and worker identification sensors. The facial recognition module is composed of a high-resolution camera 104a and a dedicated image processing unit or microcontroller with embedded machine learning (ML) protocols. The camera 104a captures the live facial image of a nearby worker and sends it to the processing unit, where the image is analysed using pattern-matching protocols against a pre-stored facial database maintained in local memory or a cloud-based storage system.
[0058] The system uses worker ID (Identification) sensors such as RFID (Radio Frequency Identification) readers to detect RFID tags embedded in helmets or ID cards, biometric sensors for identity validation, or wearable signal receivers for BLE (Bluetooth Low Energy) tags worn by workers. These sensors either confirm the identity already detected via facial recognition or identify a worker in situations where the face is obstructed.
[0059] These components enable the system to personalize the language of communication according to the identified worker’s preferred language. This adaptive, sensor-driven communication means ensures that critical instructions are effectively delivered, even in high-noise conditions typical of construction environments.
[0060] The microcontroller linked to the vision module 104 and linked with a memory module, compares work parameters for each worker such as safety compliance, daily targets, breaks, faults against a predefined worklist. The system is equipped with a laser displacement sensor operatively combined with the imaging unit 105b to continuously monitor the vertical positioning of workers engaged in tasks at elevated heights.
[0061] The laser displacement sensor emits a focused laser beam toward the worker, which reflects back to the sensor’s optical receiver. Based on the time-of-flight or triangulation principle, the sensor calculates the real-time distance between itself and the worker’s position, thus determining the worker’s elevation from the ground. This distance data is continuously relayed to the microcontroller, which simultaneously receives input from the imaging unit 105b for verifying the worker’s identity and analyzing their posture and location. The microcontroller compares this real-time height data with predefined safety thresholds such as the six-foot limit commonly observed in construction safety protocols.
[0062] The inspection module 105 accurately measures the real-time height of each worker above ground level. When the system detects that a worker is operating at a height exceeding six feet without the use of a safety harness belt, it automatically triggers an immediate safety alert using audio-visual unit 107. This alert is simultaneously transmitted to both the on-site supervisor and the corresponding worker through an integrated IoT-based communication module. The real-time notification ensures that appropriate corrective measures are taken without delay. By providing timely warnings for non-compliance with height safety protocols, the system enforces adherence to occupational safety standards and significantly mitigates the risk of serious injuries resulting from falls or slips during construction activities.
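The height-safety check described in the two preceding paragraphs can be sketched as below. The time-of-flight conversion follows the stated principle; the harness-status input, sensor geometry (ground-mounted, aimed at the worker), and timing value are assumptions for the sketch.

```python
# Sketch of the height-safety check: a time-of-flight reading gives the
# worker's elevation, which is compared with the ~six-foot threshold
# noted above. Harness status is an assumed input (e.g. from the
# vision module); sensor geometry is simplified for illustration.

C = 299_792_458.0
HEIGHT_LIMIT_M = 1.83          # approximately six feet

def elevation_from_tof(round_trip_s):
    """Distance from a ground-level sensor to the worker's position."""
    return C * round_trip_s / 2.0

def height_alert(round_trip_s, harness_on):
    h = elevation_from_tof(round_trip_s)
    if h > HEIGHT_LIMIT_M and not harness_on:
        return f"ALERT: worker at {h:.2f} m without harness"
    return "ok"

print(height_alert(20e-9, harness_on=False))   # ~3 m up, no harness
print(height_alert(20e-9, harness_on=True))    # same height, harness on
```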
[0063] An LDR (Light Dependent Resistor) and lighting module activate illumination when ambient light drops below a threshold, enhancing visibility during inspections. The LDR monitors light intensity by varying its resistance according to the amount of light it receives: as the incident light increases, its resistance decreases. This change in resistance alters the voltage across the LDR, which is measured, sent to the microcontroller as a voltage signal, and processed to determine the light intensity in the surrounding environment.
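The voltage-divider reading described in the paragraph above can be sketched as below. The wiring order, the 5 V supply, the 10 kΩ fixed resistor, and the dark-resistance threshold are illustrative assumptions; only the principle (dark conditions raise LDR resistance, shifting the measured voltage) comes from the description.

```python
# Hypothetical sketch of recovering LDR resistance from a divider voltage;
# VCC, R_FIXED, and the dark threshold are assumed values, not from the spec.

VCC = 5.0                              # assumed supply voltage across the divider
R_FIXED = 10_000.0                     # assumed fixed series resistor (ohms)
DARK_RESISTANCE_THRESHOLD = 50_000.0   # assumed: above this, ambient light is "low"

def ldr_resistance(v_out: float) -> float:
    """Recover LDR resistance from the tap voltage read by the ADC.

    Assumed wiring: VCC -- LDR -- (v_out tap) -- R_FIXED -- GND, so
    v_out = VCC * R_FIXED / (R_FIXED + R_ldr).
    """
    return R_FIXED * (VCC - v_out) / v_out

def lighting_needed(v_out: float) -> bool:
    """Darkness raises LDR resistance, pulling v_out low; trigger the lighting module."""
    return ldr_resistance(v_out) > DARK_RESISTANCE_THRESHOLD
```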
[0064] The audio-visual unit 107 includes LEDs 107c (light-emitting diodes), which are made from semiconductor materials whose properties allow them to emit light. Each LED 107c contains a p-n junction, where the p-type region is positively charged and the n-type region is negatively charged. When a voltage is applied, electrons from the n-region move towards the p-region, and holes from the p-region move towards the n-region. As the electrons cross the p-n junction, they recombine with the holes; during this process the electrons lose energy, which is released in the form of photons (light). In case of safety-related non-compliance, the audio-visual unit 107 issues strict warnings to the worker and simultaneously sends notifications via the computing unit to management.
[0065] In an embodiment of the present invention, the system includes a sunlight exposure detection arrangement, typically comprising a combination of infrared and ultraviolet (UV) sensors, configured to detect the intensity of solar radiation in the area of interest. The infrared sensor is designed to detect the thermal component of sunlight, specifically the infrared radiation emitted by the sun. It measures the heat energy present in the environment, which correlates with the intensity and angle of sunlight exposure.
[0066] Simultaneously, the ultraviolet sensor detects radiation in the UV spectrum, particularly UVA and UVB wavelengths, which are associated with sun intensity and potential material degradation or health risks. The sensor measures the amount of UV light present in the environment and converts it into an electrical signal that indicates UV intensity levels. Together, the IR and UV sensors provide comprehensive sunlight exposure data. This sensor data is continuously fed into a microcontroller or processing unit, which evaluates whether the sunlight intensity crosses a predefined threshold.
[0067] Upon detecting high levels of sunlight exposure at the specified location, the system automatically generates real-time guidance for the workers. This instruction is communicated through a display panel 107a, advising the workers to modify the material composition accordingly. Recommended adjustments may include the addition of thermal insulating agents or hydration-retarding compounds to the mixture. These modifications are suggested to enhance the thermal stability of the material, ensuring it remains suitable for use under high-temperature conditions and minimizing the risk of structural compromise due to premature setting or thermal stress.
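The threshold evaluation and worker guidance described in the three paragraphs above can be sketched as follows. The threshold values and the advisory wording are illustrative assumptions; only the logic (either sensor crossing its limit triggers material-composition guidance on the display panel 107a) comes from the description.

```python
# Hedged sketch of the IR/UV sunlight-exposure evaluation; both limit values
# and the advisory text are assumptions, not figures from the specification.

UV_INDEX_LIMIT = 6.0        # assumed high-UV threshold (UV index)
IR_INTENSITY_LIMIT = 700.0  # assumed thermal-radiation threshold (W/m^2)

def sunlight_guidance(uv_index: float, ir_intensity: float):
    """Return display-panel advice when either sensor crosses its threshold."""
    if uv_index > UV_INDEX_LIMIT or ir_intensity > IR_INTENSITY_LIMIT:
        return ("High sunlight exposure: consider adding thermal insulating "
                "agents or hydration-retarding compounds to the mixture")
    return None
```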
[0068] A battery (not shown in figure) is associated with the system to supply power to the electrically powered components employed herein. The battery comprises a pair of electrodes, a cathode and an anode, and uses oxidation-reduction chemical reactions to do work on charge and produce a voltage between the anode and cathode, thereby generating the electrical energy used to power the system.
[0069] The present invention works best in the following manner, where the frame 101 as disclosed in the invention is supported on the plurality of motorized wheels 102 connected via the telescopic bars 103, enabling controlled movement across the construction site terrain. The vision module 104 is connected through the rotatable joint 104b, which actively scans the surrounding environment and detects the real-time profile and movement of the workers. The system further comprises the inspection module 105 installed over the frame 101, which processes site-specific construction data through its integrated wall inspection module 105, electrical fitting inspection module 105, and concrete inspection module 106. Each of these modules utilizes dedicated sensors to capture physical parameters and assess them against the predefined dataset stored in the memory module. The processor, in communication with these modules, analyses the data to identify structural faults, anomalies, or deviations in workmanship. When a defect is detected, the audio-visual unit 107 mounted on the frame 101 provides immediate on-site alerts to the workers. If the issue remains unaddressed for the threshold time, or if worker performance does not match assigned targets, the computing unit wirelessly transmits the irregularities to the authorized person for further action.
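The alert-escalation flow summarised above (on-site alert first, then escalation to the authorized person after a threshold time) can be sketched as below. The 5-minute window and the class structure are illustrative assumptions; the specification leaves the threshold time unspecified.

```python
# Minimal sketch of the escalation flow; the 300 s threshold and the class
# shape are assumptions for illustration, not part of the specification.
import time

ESCALATION_THRESHOLD_S = 300.0  # assumed window for the worker to address a defect

class DefectAlert:
    """One detected defect: raised on-site, escalated if unresolved too long."""

    def __init__(self, description: str, raised_at: float = None):
        self.description = description
        self.raised_at = time.time() if raised_at is None else raised_at
        self.resolved = False

    def should_escalate(self, now: float = None) -> bool:
        """Escalate to the authorized person if still unresolved past the threshold."""
        now = time.time() if now is None else now
        return (not self.resolved
                and now - self.raised_at > ESCALATION_THRESHOLD_S)
```

In use, the processor would create a `DefectAlert` when the inspection module 105 flags a fault, let the audio-visual unit 107 warn the worker, and poll `should_escalate` so the computing unit notifies the authorized person only when the defect persists.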
[0070] Although the field of the invention has been described herein with limited reference to specific embodiments, this description is not meant to be construed in a limiting sense. Various modifications of the disclosed embodiments, as well as alternate embodiments of the invention, will become apparent to persons skilled in the art upon reference to the description of the invention.
Claims:
1) A worker assistance and performance monitoring system, comprising:
i) a frame 101 equipped with a plurality of motorized wheels 102 connected with said frame 101 via telescopic bars 103;
ii) a vision module 104 disposed over the frame 101 via a rotatable joint 104b to monitor construction area along with profile of a worker;
iii) an inspection module 105 configured over the frame 101 to process construction data with respect to a pre-defined data and evaluate faulty areas, the inspection module 105 comprising:
a wall inspection module 105, an electrical fitting inspection module 105, a concrete inspection module 106;
iv) an audio-visual unit 107, operatively coupled with the inspection module 105, to provide real time audio-visual alert to the workers regarding the faulty areas;
v) a processor coupled with the vision module 104 and interlinked with a memory module, the processor configured to compare a set of work parameters of the worker with a pre-defined worklist for each worker; and
vi) a computing unit wirelessly connected with the inspection module 105, configured to provide real time update to an authorized person in case the workers ignore the audio-visual alert for a threshold time interval or any irregularity is identified with respect to the work parameters.
2) The system as claimed in claim 1, wherein the wall inspection module 105 includes an articulated link 105a integrated with an artificial intelligence (AI) imaging unit 105b and a LIDAR sensor along with a ground penetrating radar (GPR) sensor and an ultrasonic pulse velocity (UPV) sensor, the electrical fitting inspection module 105 includes an infrared sensor and a non-contact voltage tester, and the concrete inspection module 106 includes a set of robotic arms 106a installed with a sensing plate 106b.
3) The system as claimed in claim 2, wherein the sensing plate 106b includes a group of sensors including a moisture sensor, a temperature sensor, and a viscosity sensor, the output of which is processed and transmitted to the processor for computation with respect to a set of user-defined values, and in case any deviation is identified, a notification is sent over the computing unit.
4) The system as claimed in claim 2, wherein the electrical fitting inspection module 105 is configured to determine defects including wire breakage and voltage defects and alert the worker, and the wall inspection module 105 is configured to determine defects including surface cracks, wall strength, unevenness, bumps, and material condition and generate a notification to the worker.
5) The system as claimed in claim 4, wherein the inspection module 105 processes a second level inspection after the worker rectifies the defect and in case the defect is not resolved in second level inspection, an alert is sent over the computing unit.
6) The system as claimed in claim 1, wherein the vision module 104 comprises a camera 104a and a microprocessor, programmed with a mapping of the construction area and profiles of different workers along with the worklist for each worker to regulate a set of actuators connected with the wheels 102 and telescopic rods.
7) The system as claimed in claim 1, further comprising an LDR (Light Dependent Resistor) and a lighting module to provide lighting on detecting ambient light below a threshold value.
8) The system as claimed in claim 1, wherein the computing unit is configured to provide an option of updating the worklist of the workers.
9) The system as claimed in claim 1, wherein the set of work parameters includes, but is not limited to, worker safety compliance, worker daily targets, worker breaks, and worker faults.
10) The system as claimed in claim 9, wherein in case of irregularity in regard to safety compliance, the audio-visual unit is configured to project a strict warning to the worker along with a notification over the computing unit.
| # | Name | Date |
|---|---|---|
| 1 | 202521062430-STATEMENT OF UNDERTAKING (FORM 3) [30-06-2025(online)].pdf | 2025-06-30 |
| 2 | 202521062430-REQUEST FOR EXAMINATION (FORM-18) [30-06-2025(online)].pdf | 2025-06-30 |
| 3 | 202521062430-REQUEST FOR EARLY PUBLICATION(FORM-9) [30-06-2025(online)].pdf | 2025-06-30 |
| 4 | 202521062430-PROOF OF RIGHT [30-06-2025(online)].pdf | 2025-06-30 |
| 5 | 202521062430-POWER OF AUTHORITY [30-06-2025(online)].pdf | 2025-06-30 |
| 6 | 202521062430-FORM-9 [30-06-2025(online)].pdf | 2025-06-30 |
| 7 | 202521062430-FORM FOR SMALL ENTITY(FORM-28) [30-06-2025(online)].pdf | 2025-06-30 |
| 8 | 202521062430-FORM 18 [30-06-2025(online)].pdf | 2025-06-30 |
| 9 | 202521062430-FORM 1 [30-06-2025(online)].pdf | 2025-06-30 |
| 10 | 202521062430-FIGURE OF ABSTRACT [30-06-2025(online)].pdf | 2025-06-30 |
| 11 | 202521062430-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [30-06-2025(online)].pdf | 2025-06-30 |
| 12 | 202521062430-EVIDENCE FOR REGISTRATION UNDER SSI [30-06-2025(online)].pdf | 2025-06-30 |
| 13 | 202521062430-EDUCATIONAL INSTITUTION(S) [30-06-2025(online)].pdf | 2025-06-30 |
| 14 | 202521062430-DRAWINGS [30-06-2025(online)].pdf | 2025-06-30 |
| 15 | 202521062430-DECLARATION OF INVENTORSHIP (FORM 5) [30-06-2025(online)].pdf | 2025-06-30 |
| 16 | 202521062430-COMPLETE SPECIFICATION [30-06-2025(online)].pdf | 2025-06-30 |
| 17 | Abstract.jpg | 2025-07-14 |