Abstract: Disclosed herein are a method and a worker monitoring system for determining the working condition of a worker performing qualitative evaluation of products. In some embodiments, a head pose and a position of the worker are detected from a plurality of image frames of a predetermined work location of the worker. Thereafter, the head pose is classified into one of a distraction pose and a non-distraction pose upon verifying that the position of the worker is within a specified region of interest in the predetermined work location. Finally, the working condition of the worker is determined based on the classification of the head pose and predetermined operating parameters. In an embodiment, the present disclosure automatically detects when the worker is in a distracted work condition and recommends reverification of the products that were evaluated during the distracted work condition of the worker. Thus, the present disclosure enhances the accuracy and reliability of the qualitative evaluation of the products. FIG. 1
Claims:
WE CLAIM:
1. A method for determining a working condition of a worker (103) performing qualitative evaluation of products (105), the method comprising:
capturing, by a worker monitoring system (111), video of a predetermined work location (101), wherein the video is converted into a plurality of image frames (211);
detecting, by the worker monitoring system (111), a head pose and a position of the worker (103) by analysing the plurality of image frames (211) using one or more predetermined image processing techniques;
classifying, by the worker monitoring system (111), the head pose into one of a distraction pose and a non-distraction pose using pretrained deep learning models, upon verifying that the position of the worker (103) is within a specified region of interest in the predetermined work location (101); and
determining, by the worker monitoring system (111), the working condition of the worker (103) based on the classification of the head pose and one or more predetermined operating parameters (215).
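The flow of claim 1 can be sketched in code. This is a minimal illustrative sketch, not the claimed implementation: `Frame`, `in_roi`, `classify_pose`, and the frame-count threshold standing in for the predetermined operating parameters (215) are all assumed names, and the yaw-angle rule stands in for the pretrained deep learning models.

```python
from dataclasses import dataclass

@dataclass
class Frame:
    """One image frame with the detected worker position and head-pose angle."""
    worker_position: tuple  # (x, y) of the worker in the frame
    head_yaw_deg: float     # detected head-pose angle in degrees

def in_roi(position, roi):
    """Verify the detected position lies within the specified region of interest."""
    (x, y), (x0, y0, x1, y1) = position, roi
    return x0 <= x <= x1 and y0 <= y <= y1

def classify_pose(head_yaw_deg, distraction_threshold_deg=30.0):
    """Stand-in for the pretrained classifier: label the head pose by angle."""
    return "distraction" if abs(head_yaw_deg) > distraction_threshold_deg else "non-distraction"

def determine_working_condition(frames, roi, threshold_distracted_frames=3):
    """Classify only frames where the worker is inside the ROI, then decide the
    working condition from a predetermined threshold (here a frame count)."""
    distracted = sum(
        1 for f in frames
        if in_roi(f.worker_position, roi)
        and classify_pose(f.head_yaw_deg) == "distraction"
    )
    return "distracted" if distracted >= threshold_distracted_frames else "non-distracted"
```

Note that frames where the worker is outside the region of interest are not classified at all, mirroring the "upon verifying" gate in the claim.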
2. The method as claimed in claim 1, further comprising training the worker monitoring system (111) with the one or more predetermined image processing techniques for detecting the head pose, wherein the training comprises:
receiving a plurality of training images with one or more distinct head poses of the worker (103);
segregating the one or more distinct head poses into one or more classes of head poses based on an angle of the one or more distinct head poses; and
annotating the plurality of training images, corresponding to the one or more distinct head poses, to the one or more classes of head poses.
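The segregation and annotation steps of claim 2 can be sketched as follows. The class names and the ±15° bin boundary are illustrative assumptions; the claim only requires that distinct head poses be segregated into classes by angle and the training images annotated accordingly.

```python
def angle_to_class(yaw_deg):
    """Segregate a distinct head pose into a class based on its angle.
    The bin boundaries here are an assumed design choice."""
    if abs(yaw_deg) <= 15:
        return "front"                              # typically a non-distraction pose
    return "left" if yaw_deg < 0 else "right"       # candidate distraction poses

def annotate(training_images):
    """Annotate each (image_id, yaw_deg) training image with its head-pose class."""
    return {img_id: angle_to_class(yaw) for img_id, yaw in training_images}
```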
3. The method as claimed in claim 1, wherein classifying the head pose comprises comparing the head pose with one or more classes of head poses and wherein the one or more classes of head poses comprises one of one or more distraction poses and one or more non-distraction poses.
4. The method as claimed in claim 3, wherein the one or more distraction poses are obtained by:
extracting one or more distinct head poses of the worker (103) from a plurality of historical image frames (213) of the predetermined work location (101), using the one or more predetermined image processing techniques;
generating a histogram of each of the one or more distinct head poses;
identifying a mean frequency value of the one or more distinct head poses from the histogram; and
classifying the one or more distinct head poses as the one or more distraction poses based on the mean frequency value.
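The histogram procedure of claim 4 can be sketched as below. One interpretation, assumed here and not stated in the claim, is that poses occurring less often than the mean frequency are off-task: a worker evaluating products holds the working pose most of the time, so rarely held poses are candidates for distraction poses.

```python
from collections import Counter

def distraction_poses(historical_poses):
    """Build a frequency histogram of the distinct head poses extracted from
    historical image frames, compute the mean frequency, and classify poses
    occurring less often than the mean as distraction poses."""
    hist = Counter(historical_poses)                 # histogram of distinct head poses
    mean_freq = sum(hist.values()) / len(hist)       # mean frequency value
    return {pose for pose, freq in hist.items() if freq < mean_freq}
```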
5. The method as claimed in claim 1, wherein the one or more predetermined operating parameters (215) comprise at least one of a threshold time of distraction, a threshold time of absence of the worker (103) from the predetermined work location (101) and a threshold time period for detecting sleep condition of the worker (103).
6. The method as claimed in claim 1, wherein the working condition of the worker (103) is at least one of a non-distracted work condition and a distracted work condition, wherein the distracted work condition includes a distraction condition, a sleep condition and a worker (103) absence condition.
7. The method as claimed in claim 6, wherein the sleep condition of the worker (103) is determined by:
identifying a plurality of key points, corresponding to the worker (103), on each of the plurality of image frames (211), wherein the plurality of key points represent at least one of head of the worker (103), chest of the worker (103), shoulder of the worker (103) and arms of the worker (103);
comparing angles between the plurality of key points with corresponding predetermined reference angles for a predetermined time period for determining deviation in the angles; and
determining the sleep condition of the worker (103) based on the deviation in the angles.
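The key-point comparison of claim 7 can be sketched as below. The key-point pairs, the 20° deviation tolerance, and counting the predetermined time period in frames are all illustrative assumptions; the claim specifies only that angles between key points are compared with predetermined reference angles over a time period.

```python
import math

def angle_deg(p, q):
    """Angle of the segment from key point p to key point q, in degrees."""
    return math.degrees(math.atan2(q[1] - p[1], q[0] - p[0]))

def is_sleeping(frames_keypoints, reference_angles, max_deviation_deg=20.0,
                threshold_frames=5):
    """Flag the sleep condition when the angles between key points deviate from
    the predetermined reference angles for a sustained run of frames."""
    persistent = 0
    for kps in frames_keypoints:  # kps: dict mapping key-point name -> (x, y)
        deviations = [
            abs(angle_deg(kps[a], kps[b]) - ref)
            for (a, b), ref in reference_angles.items()
        ]
        persistent = persistent + 1 if max(deviations) > max_deviation_deg else 0
        if persistent >= threshold_frames:
            return True
    return False
```

Resetting the counter whenever the posture returns within tolerance means only a continuous deviation over the threshold period is treated as sleep, not scattered momentary slumps.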
8. The method as claimed in claim 6, wherein the worker (103) absence condition is determined when the position of the worker (103) is not detected within the specified region of interest in the plurality of image frames (211).
9. The method as claimed in claim 6, further comprising:
generating an alarm event corresponding to the distracted work condition of the worker (103);
combining a plurality of image frames (211) corresponding to the distracted work condition of the worker (103) into a video; and
transmitting the alarm event and the video to predetermined worker management personnel for notifying them of the distracted work condition of the worker (103).
10. The method as claimed in claim 9, wherein the alarm event comprises information related to at least one of time of occurrence of the distracted work condition, duration of the distracted work condition, predetermined work location (101) of the worker (103) and product identifiers corresponding to one or more products (105) evaluated by the worker (103) during the distracted work condition.
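The alarm event of claims 9 and 10 can be sketched as a small record carrying the listed fields. The field names, the per-frame `"t"` timestamps, and the stubbed-out video/transmission steps are assumptions for illustration only.

```python
from dataclasses import dataclass

@dataclass
class AlarmEvent:
    """Claim-10 information fields (names are illustrative)."""
    occurred_at: float    # time of occurrence of the distracted work condition
    duration_s: float     # duration of the distracted work condition
    work_location: str    # predetermined work location of the worker
    product_ids: list     # products evaluated during the distracted work condition

def build_alarm(condition_frames, location, product_ids):
    """Assemble the alarm event from the frames spanning the distracted work
    condition; combining frames into a video and transmitting are omitted here."""
    first, last = condition_frames[0], condition_frames[-1]
    return AlarmEvent(
        occurred_at=first["t"],
        duration_s=last["t"] - first["t"],
        work_location=location,
        product_ids=list(product_ids),
    )
```

Carrying the product identifiers in the event is what allows the products evaluated during the distracted work condition to be flagged for reverification.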
11. A worker monitoring system (111) for determining a working condition of a worker (103) performing qualitative evaluation of products (105), the worker monitoring system (111) comprising:
a processor (203); and
a memory (205), communicatively coupled to the processor (203), wherein the memory (205) stores processor-executable instructions, which on execution, cause the processor (203) to:
capture video of a predetermined work location (101), wherein the video is converted into a plurality of image frames (211);
detect a head pose and a position of the worker (103) by analysing the plurality of image frames (211) using one or more predetermined image processing techniques;
classify the head pose into one of a distraction pose and a non-distraction pose using pretrained deep learning models, upon verifying that the position of the worker (103) is within a specified region of interest in the predetermined work location (101); and
determine the working condition of the worker (103) based on the classification of the head pose and one or more predetermined operating parameters (215).
12. The worker monitoring system (111) as claimed in claim 11, wherein, to train the worker monitoring system (111) with the one or more predetermined image processing techniques for detecting the head pose, the processor (203) is configured to:
receive a plurality of training images with one or more distinct head poses of the worker (103);
segregate the one or more distinct head poses into one or more classes of head poses based on an angle of the one or more distinct head poses; and
annotate the plurality of training images, corresponding to the one or more distinct head poses, to the one or more classes of head poses.
13. The worker monitoring system (111) as claimed in claim 11, wherein the processor (203) classifies the head pose by comparing the head pose with one or more classes of head poses, and wherein the one or more classes of head poses comprise one of one or more distraction poses and one or more non-distraction poses.
14. The worker monitoring system (111) as claimed in claim 13, wherein to obtain the one or more distraction poses, the processor (203) is configured to:
extract one or more distinct head poses of the worker (103) from a plurality of historical image frames (213) of the predetermined work location (101), using the one or more predetermined image processing techniques;
generate a histogram of each of the one or more distinct head poses;
identify a mean frequency value of the one or more distinct head poses from the histogram; and
classify the one or more distinct head poses as the one or more distraction poses based on the mean frequency value.
15. The worker monitoring system (111) as claimed in claim 11, wherein the one or more predetermined operating parameters (215) comprise at least one of a threshold time of distraction, a threshold time of absence of the worker (103) from the predetermined work location (101) and a threshold time period to detect sleep condition of the worker (103).
16. The worker monitoring system (111) as claimed in claim 11, wherein the working condition of the worker (103) is at least one of a non-distracted work condition and a distracted work condition, wherein the distracted work condition includes a distraction condition, a sleep condition and a worker (103) absence condition.
17. The worker monitoring system (111) as claimed in claim 16, wherein to determine the sleep condition of the worker (103), the processor (203) is configured to:
identify a plurality of key points, corresponding to the worker (103), on each of the plurality of image frames (211), wherein the plurality of key points represent at least one of head of the worker (103), chest of the worker (103), shoulder of the worker (103) and arms of the worker (103);
compare angles between the plurality of key points with corresponding predetermined reference angles for a predetermined time period to determine deviation in the angles; and
determine the sleep condition of the worker (103) based on the deviation in the angles.
18. The worker monitoring system (111) as claimed in claim 16, wherein the processor (203) determines the worker (103) absence condition when the position of the worker (103) is not detected within the specified region of interest in the plurality of image frames (211).
19. The worker monitoring system (111) as claimed in claim 16, wherein the processor (203) is further configured to:
generate an alarm event corresponding to the distracted work condition of the worker (103);
combine a plurality of image frames (211) corresponding to the distracted work condition of the worker (103) into a video; and
transmit the alarm event and the video to predetermined worker management personnel to notify them of the distracted work condition of the worker (103).
20. The worker monitoring system (111) as claimed in claim 19, wherein the alarm event comprises information related to at least one of time of occurrence of the distracted work condition, duration of the distracted work condition, predetermined work location (101) of the worker (103) and product identifiers corresponding to one or more products (105) evaluated by the worker (103) during the distracted work condition.
Dated this 15th day of February 2019
R. RAMYA RAO
OF K&S PARTNERS
ATTORNEY FOR THE APPLICANT
IN/PA - 1607
Description:
TECHNICAL FIELD
The present subject matter is, in general, related to the production industry and, more particularly but not exclusively, to a method and system for determining the working condition of a worker performing qualitative evaluation of products.
| # | Name | Date |
|---|---|---|
| 1 | 201941006140-FER.pdf | 2021-10-17 |
| 2 | 201941006140-STATEMENT OF UNDERTAKING (FORM 3) [15-02-2019(online)].pdf | 2019-02-15 |
| 3 | 201941006140-CLAIMS [15-07-2021(online)].pdf | 2021-07-15 |
| 4 | 201941006140-REQUEST FOR EXAMINATION (FORM-18) [15-02-2019(online)].pdf | 2019-02-15 |
| 5 | 201941006140-POWER OF AUTHORITY [15-02-2019(online)].pdf | 2019-02-15 |
| 6 | 201941006140-COMPLETE SPECIFICATION [15-07-2021(online)].pdf | 2021-07-15 |
| 7 | 201941006140-FORM 18 [15-02-2019(online)].pdf | 2019-02-15 |
| 8 | 201941006140-DRAWING [15-07-2021(online)].pdf | 2021-07-15 |
| 9 | 201941006140-FORM 1 [15-02-2019(online)].pdf | 2019-02-15 |
| 10 | 201941006140-FER_SER_REPLY [15-07-2021(online)].pdf | 2021-07-15 |
| 11 | 201941006140-OTHERS [15-07-2021(online)].pdf | 2021-07-15 |
| 12 | 201941006140-DRAWINGS [15-02-2019(online)].pdf | 2019-02-15 |
| 13 | 201941006140-Proof of Right [14-07-2021(online)].pdf | 2021-07-14 |
| 14 | 201941006140-DECLARATION OF INVENTORSHIP (FORM 5) [15-02-2019(online)].pdf | 2019-02-15 |
| 15 | 201941006140-PETITION UNDER RULE 137 [13-07-2021(online)].pdf | 2021-07-13 |
| 16 | 201941006140-COMPLETE SPECIFICATION [15-02-2019(online)].pdf | 2019-02-15 |
| 17 | 201941006140-PETITION UNDER RULE 138 [13-07-2021(online)].pdf | 2021-07-13 |
| 18 | Abstract.jpg | 2019-02-19 |
| 19 | 201941006140-Proof of Right [13-07-2021(online)].pdf | 2021-07-13 |
| 20 | 201941006140-Request Letter-Correspondence [19-02-2019(online)].pdf | 2019-02-19 |
| 21 | 201941006140-FORM 3 [12-07-2021(online)].pdf | 2021-07-12 |
| 22 | 201941006140-Power of Attorney [19-02-2019(online)].pdf | 2019-02-19 |
| 23 | 201941006140-Form 1 (Submitted on date of filing) [19-02-2019(online)].pdf | 2019-02-19 |
| 24 | 201941006140-US(14)-HearingNotice-(HearingDate-15-10-2025).pdf | 2025-09-29 |
| 25 | 201941006140-POA [09-10-2025(online)].pdf | 2025-10-09 |
| 26 | 201941006140-FORM 13 [09-10-2025(online)].pdf | 2025-10-09 |
| 27 | 201941006140-Correspondence to notify the Controller [09-10-2025(online)].pdf | 2025-10-09 |
| 28 | 201941006140-AMENDED DOCUMENTS [09-10-2025(online)].pdf | 2025-10-09 |
| 29 | 201941006140-Written submissions and relevant documents [30-10-2025(online)].pdf | 2025-10-30 |
| 30 | 201941006140-FORM-26 [30-10-2025(online)].pdf | 2025-10-30 |
| 31 | 2021-01-1116-37-16E_11-01-2021.pdf | |