
Methods For Determining Manufacturing Waste To Optimize Productivity And Devices Thereof

Abstract: A method, non-transitory computer readable medium, and productivity assessment computing device that identifies entities present in frames of a video. Entity movement across the frames is plotted to obtain a trajectory of the entities. Interactions of one or more of the entities in each of the frames are identified. A unique sequence encoding is generated for subtasks performed by each of the entities. One of the subtasks is classified as a waste subtask based on one or more of the interactions corresponding to the one of the subtasks and the trajectory and a type of each of the entities associated with the interactions. The sequence encodings of the one of the subtasks are correlated with a number of frames per second of the video to determine waste duration value(s) for a task and the waste duration value(s) are output.
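The duration arithmetic the abstract describes — correlating a subtask's sequence encodings with the video's frames per second — reduces to dividing a frame count by the frame rate. A minimal sketch, assuming the detection and encoding pipeline is provided upstream (the function name is hypothetical):

```python
def waste_duration_seconds(num_frames: int, fps: float) -> float:
    """Duration, in seconds, covered by a run of video frames at a given
    frame rate. Hypothetical helper illustrating the abstract's correlation
    of a subtask's frame span with the video's frames per second."""
    if fps <= 0:
        raise ValueError("fps must be positive")
    return num_frames / fps

# e.g. a waste subtask spanning 450 frames of 30 fps video lasts 15 seconds
print(waste_duration_seconds(450, 30.0))  # 15.0
```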


Patent Information

Application #
Filing Date
12 February 2015
Publication Number
10/2015
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Email
ipr@akshipassociates.com
Parent Application
Patent Number
Legal Status
Grant Date
2022-02-16
Renewal Date

Applicants

WIPRO LIMITED
Doddakannelli, Sarjapur Road, Bangalore 560035, Karnataka, India.

Inventors

1. KRUPAL CHANDRESH MODI
6-B, Vishwanagar Society, Jivrajpark, Ahmedabad 380051, Gujarat, India
2. RINKU VATNANI
171, Jairampur Colony, Near Collectorate, Indore 452007, Madhya Pradesh
3. AKASH GUPTA
71, Shanti Niketan Colony, Barkat Nagar, Kisan Marg, Jaipur 302015, Rajasthan, India
4. AKBAR ABDULMALIK LADAK
703/1E, Divyasree Elan, Sarjapur Road [Opp. Total Mall], Bangalore 560035, Karnataka, India.

Specification

WE CLAIM:
1. A method for analyzing video data to determine manufacturing waste to optimize productivity, the method comprising:
identifying, by a productivity assessment computing device, a plurality of entities present in each of a plurality of obtained frames of a video and a type of each of the plurality of entities;
plotting, by the productivity assessment computing device, movement of each of the plurality of entities across at least a subset of the frames to obtain a trajectory of each of the plurality of entities;
identifying, by the productivity assessment computing device, a plurality of interactions of one or more of the plurality of entities in each of the plurality of frames;
generating, by the productivity assessment computing device, a unique sequence encoding for each of a plurality of subtasks performed by each of the plurality of entities, wherein the subtasks are identified based on the interactions and a sequence of at least a subset of the subtasks is associated with at least one task;
classifying, by the productivity assessment computing device, at least one of the subset of subtasks as a waste subtask of the task based on one or more of the interactions corresponding to the one of the subset of subtasks and the trajectory and the type of each of the plurality of entities associated with the one or more of the interactions;
correlating, by the productivity assessment computing device, the sequence encodings of the one of the subset of subtasks with a number of frames per second of the video to determine at least one waste duration value for the task; and outputting, by the productivity assessment computing device, the waste duration value.
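Claim 1's final steps — mapping each waste subtask's frames to a duration via the video's frame rate — can be read as a simple aggregation. The sketch below is illustrative only: the class and function names are hypothetical, and the upstream identification, tracking, and classification steps recited in the claim are assumed to have already populated the subtask records.

```python
from dataclasses import dataclass


@dataclass
class Subtask:
    encoding: str        # unique sequence encoding generated for the subtask
    frame_indices: list  # frames of the video in which the subtask occurs
    is_waste: bool = False  # set by the upstream classification step


def waste_durations(subtasks, fps):
    """Correlate each waste subtask's frame span with the video's frames
    per second to obtain its waste duration value, in seconds."""
    return {s.encoding: len(s.frame_indices) / fps
            for s in subtasks if s.is_waste}


# illustrative data: one waste subtask spanning 60 frames of a 30 fps video
tasks = [Subtask("A1", list(range(60)), True),
         Subtask("B2", list(range(90)))]
print(waste_durations(tasks, 30.0))  # {'A1': 2.0}
```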

2. The method as set forth in claim 1, wherein the identifying further comprises identifying one or more features of each of the plurality of entities in each of the plurality of frames and the determining further comprises correlating a mapping of each of the plurality of entities to a position on a floor plan with the features of each of the plurality of entities in adjacent ones of the plurality of frames.

3. The method as set forth in claim 1, wherein the plurality of entities are identified based on a plurality of classifiers, the subset of the subtasks is associated with the task based on a correlation with the classifiers, and the classifiers comprise training images of the plurality of entities and training videos of the tasks.

4. The method as set forth in claim 1, wherein the waste duration value is for transportation waste and comprises a duration that a material, one of the plurality of entities is interacting with a machine, one of the plurality of entities or a human, one of the plurality of entities and is moving from one location to another location.

5. The method as set forth in claim 1, wherein the waste duration value is for a waiting waste and comprises a duration that a human, one of the entities is idle and not interacting with any other of the plurality of entities.

6. The method as set forth in claim 1, wherein the waste duration value is for a motion waste and comprises a duration that a human, one of the entities is not interacting with a material, one of the entities or a machine, one of the entities and is moving from one location to another location.

7. The method as set forth in claim 6, further comprising determining, by the productivity assessment computing device, one or more frequently followed paths based on the trajectories and outputting an indication of the frequently followed paths, wherein the frequently followed paths correspond with the one of the subset of subtasks classified as a waste subtask.
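Claims 4 through 7 distinguish the three waste categories by combining two observable attributes of an entity: whether it is interacting with another entity, and whether it is moving between locations. Under that reading the categorisation reduces to a small decision table; the sketch below is a hypothetical illustration (names are not from the specification):

```python
def classify_waste(entity_type: str, interacting: bool, moving: bool):
    """Map an observed entity state to a waste category per claims 4-7.

    - transportation waste: a material interacting with a machine or human
      while moving between locations (claim 4)
    - waiting waste: a human idle and not interacting (claim 5)
    - motion waste: a human moving without interacting (claim 6)
    Returns None when the state matches none of the claimed categories.
    """
    if entity_type == "material" and interacting and moving:
        return "transportation"
    if entity_type == "human" and not interacting:
        return "motion" if moving else "waiting"
    return None


# e.g. an idle, non-interacting human is counted as waiting waste
print(classify_waste("human", False, False))  # waiting
```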

8. A productivity assessment computing device comprising at least one processor and a memory coupled to the processor, the processor being configured to be capable of executing programmed instructions stored in the memory to:
identify a plurality of entities present in each of a plurality of obtained frames of a video and a type of each of the plurality of entities;
plot movement of each of the plurality of entities across at least a subset of the plurality of frames to obtain a trajectory of each of the plurality of entities;
identify a plurality of interactions of one or more of the plurality of entities in each of the plurality of frames;
generate a unique sequence encoding for each of a plurality of subtasks performed by each of the plurality of entities, wherein the subtasks are identified based on the interactions and a sequence of at least a subset of the subtasks is associated with at least one task;
classify at least one of the subset of subtasks as a waste subtask of the task based on one or more of the interactions corresponding to the one of the subset of subtasks and the trajectory and the type of each of the entities associated with the one or more of the interactions;
correlate the sequence encodings of the one of the subset of subtasks with a number of frames per second of the video to determine at least one waste duration value for the task; and
output the waste duration value.

9. The productivity assessment computing device as set forth in claim 8, wherein the processor coupled to the memory is further configured to be capable of executing at least one additional programmed instruction to:
identify one or more features of each of the plurality of entities in each of the plurality of frames; and
correlate a mapping of each of the plurality of entities to a position on a floor plan with the features of each of the plurality of entities in adjacent ones of the plurality of frames.

10. The productivity assessment computing device as set forth in claim 8, wherein the plurality of entities are identified based on a plurality of classifiers, the subset of the subtasks is associated with the task based on a correlation with the classifiers, and the classifiers comprise training images of the plurality of entities and training videos of the tasks.

11. The productivity assessment computing device as set forth in claim 8, wherein the waste duration value is for a transportation waste and comprises a duration that a material, one of the plurality of entities is interacting with a machine, one of the entities or a human, one of the entities and is moving from one location to another location.

12. The productivity assessment computing device as set forth in claim 8, wherein the waste duration value is for a waiting waste and comprises a duration that a human, one of the entities is idle and not interacting with any other of the plurality of entities.

13. The productivity assessment computing device as set forth in claim 8, wherein the waste duration value is for a motion waste and comprises a duration that a human, one of the entities is not interacting with a material, one of the entities or a machine, one of the entities and is moving from one location to another location.

14. The productivity assessment computing device as set forth in claim 13, wherein the processor coupled to the memory is further configured to be capable of executing at least one additional programmed instruction to determine one or more frequently followed paths based on the trajectories and outputting an indication of the frequently followed paths, wherein the frequently followed paths correspond with the one of the subset of subtasks classified as a waste subtask.

15. A non-transitory computer readable medium having stored thereon instructions for determining and analyzing manufacturing waste to optimize productivity comprising executable code which, when executed by a processor, causes the processor to perform steps comprising:
identifying a plurality of entities present in each of a plurality of obtained frames of a video and a type of each of the plurality of entities;
plotting movement of each of the plurality of entities across at least a subset of the plurality of frames to obtain a trajectory of each of the plurality of entities;
identifying a plurality of interactions of one or more of the plurality of entities in each of the plurality of frames;
generating a unique sequence encoding for each of a plurality of subtasks performed by each of the plurality of entities, wherein the subtasks are identified based on the interactions and a sequence of at least a subset of the subtasks is associated with at least one task;
classifying at least one of the subset of subtasks as a waste subtask of the task based on one or more of the interactions corresponding to the one of the subset of subtasks and the trajectory and the type of each of the plurality of entities associated with the one or more of the interactions;
correlating the sequence encodings of the one of the subset of subtasks with a number of frames per second of the video to determine at least one waste duration value for the task; and
outputting the waste duration value.

16. The non-transitory computer readable medium as set forth in claim 15, wherein the identifying further comprises identifying one or more features of each of the entities in each of the frames and the determining further comprises correlating a mapping of each of the plurality of entities to a position on a floor plan with the features of each of the plurality of entities in adjacent ones of the plurality of frames.

17. The non-transitory computer readable medium as set forth in claim 15, wherein the plurality of entities are identified based on a plurality of classifiers, the subset of the subtasks is associated with the task based on a correlation with the classifiers, and the classifiers comprise training images of the plurality of entities and training videos of the tasks.

18. The non-transitory computer readable medium as set forth in claim 15, wherein the waste duration value is for a transportation waste and comprises a duration that a material, one of the entities is interacting with a machine, one of the entities or a human, one of the entities and is moving from one location to another location.

19. The non-transitory computer readable medium as set forth in claim 15, wherein the waste duration value is for a waiting waste and comprises a duration that a human, one of the entities is idle and not interacting with any other of the plurality of entities.

20. The non-transitory computer readable medium as set forth in claim 15, wherein the waste duration value is for a motion waste and comprises a duration that a human, one of the plurality of entities is not interacting with a material, one of the plurality of entities or a machine, one of the entities and is moving from one location to another location.

21. The non-transitory computer readable medium as set forth in claim 20, further having stored thereon instructions which, when executed by the processor, further cause the processor to perform at least one additional step comprising determining one or more frequently followed paths based on the trajectories and outputting an indication of the frequently followed paths, wherein the frequently followed paths correspond with the one of the subset of subtasks classified as a waste subtask.

Dated this 12th day of February, 2015
Shwetha A Chimalgi
Of K&S Partners
Agent for the Applicant
FIELD OF THE INVENTION
This technology relates to assessing manufacturing productivity and more particularly to methods and devices for analyzing video data to identify waste in a manufacturing environment.

Documents

Orders

Section Controller Decision Date

Application Documents

# Name Date
1 690-CHE-2015 FORM-9 12-02-2015.pdf 2015-02-12
2 690-CHE-2015 FORM-18 12-02-2015.pdf 2015-02-12
3 IP30108-spec.pdf ONLINE 2015-02-13
4 IP30108-fig.pdf ONLINE 2015-02-13
5 FORM 5-IP30108.pdf ONLINE 2015-02-13
6 FORM 3-IP30108.pdf ONLINE 2015-02-13
7 690-CHE-2015-Request For Certified Copy-Online(18-02-2015).pdf 2015-02-18
8 690CHE2015_Certifiedcopyrequest.pdf ONLINE 2015-02-19
9 IP30108-spec.pdf 2015-03-13
10 IP30108-fig.pdf 2015-03-13
11 FORM 5-IP30108.pdf 2015-03-13
12 FORM 3-IP30108.pdf 2015-03-13
13 690CHE2015_Certifiedcopyrequest.pdf 2015-03-13
14 690-CHE-2015 POWER OF ATTORNEY 16-06-2015.pdf 2015-06-16
15 690-CHE-2015 FORM-1 16-06-2015.pdf 2015-06-16
16 690-CHE-2015 CORRESPONDENCE OTHERS 16-06-2015.pdf 2015-06-16
17 690-CHE-2015-FER.pdf 2019-09-26
18 690-CHE-2015-ABSTRACT [26-03-2020(online)].pdf 2020-03-26
19 690-CHE-2015-CLAIMS [26-03-2020(online)].pdf 2020-03-26
20 690-CHE-2015-COMPLETE SPECIFICATION [26-03-2020(online)].pdf 2020-03-26
21 690-CHE-2015-CORRESPONDENCE [26-03-2020(online)].pdf 2020-03-26
22 690-CHE-2015-DRAWING [26-03-2020(online)].pdf 2020-03-26
23 690-CHE-2015-FER_SER_REPLY [26-03-2020(online)].pdf 2020-03-26
24 690-CHE-2015-FORM 3 [26-03-2020(online)].pdf 2020-03-26
25 690-CHE-2015-Information under section 8(2) [26-03-2020(online)].pdf 2020-03-26
26 690-CHE-2015-OTHERS [26-03-2020(online)].pdf 2020-03-26
27 690-CHE-2015-US(14)-HearingNotice-(HearingDate-17-01-2022).pdf 2021-12-13
28 690-CHE-2015-AMENDED DOCUMENTS [20-12-2021(online)].pdf 2021-12-20
29 690-CHE-2015-FORM 13 [20-12-2021(online)].pdf 2021-12-20
30 690-CHE-2015-POA [20-12-2021(online)].pdf 2021-12-20
31 690-CHE-2015-Correspondence to notify the Controller [17-01-2022(online)].pdf 2022-01-17
32 690-CHE-2015-Written submissions and relevant documents [01-02-2022(online)].pdf 2022-02-01
33 690-CHE-2015-IntimationOfGrant16-02-2022.pdf 2022-02-16
34 690-CHE-2015-PatentCertificate16-02-2022.pdf 2022-02-16
35 690-CHE-2015-RELEVANT DOCUMENTS [20-09-2023(online)].pdf 2023-09-20

Search Strategy

1 SearchHistory_26-09-2019.pdf

ERegister / Renewals

3rd: 04 May 2022 (from 12/02/2017 to 12/02/2018)
4th: 04 May 2022 (from 12/02/2018 to 12/02/2019)
5th: 04 May 2022 (from 12/02/2019 to 12/02/2020)
6th: 04 May 2022 (from 12/02/2020 to 12/02/2021)
7th: 04 May 2022 (from 12/02/2021 to 12/02/2022)
8th: 04 May 2022 (from 12/02/2022 to 12/02/2023)
9th: 10 Feb 2023 (from 12/02/2023 to 12/02/2024)
10th: 31 Jan 2024 (from 12/02/2024 to 12/02/2025)
11th: 05 Feb 2025 (from 12/02/2025 to 12/02/2026)