
A Device For Tracking At Least One Tool In A Tool Room

Abstract: A device for tracking at least one tool in a tool room and a method thereof. The device 10 comprises an authentication module 16 adapted to identify and authenticate a person 15 entering/exiting the tool room 14. The device 10 further comprises at least one camera 18 adapted to capture multiple images of the person 15 entering the tool room 14 and an image processing unit 20 for processing the captured multiple images during entry/exit and movement of the person 15 in the tool room 14. The device 10 comprises an action recognition module 22 adapted to identify an action performed by the person 15 on at least one tool 12 based on captured movement in the processed multiple images. (Figures 1 & 2)


Patent Information

Application #
Filing Date
29 March 2024
Publication Number
40/2025
Publication Type
INA
Invention Field
ELECTRONICS
Status
Email
Parent Application

Applicants

Bosch Global Software Technologies Private Limited
123, Industrial Layout, Hosur Road, Koramangala, Bengaluru – 560095, Karnataka, India
Robert Bosch GmbH
Postfach 300220, D-70442, Stuttgart, Germany

Inventors

1. Harshavardhan Amirthalingam
12/117B,Win Gate Garden, Kannankurichi, Salem – 636008. Tamilnadu, India
2. Vinoth Kumar Muthuraj
6/441 A, Krishanan Pudhur, Ammapet, Salem – 646003. Tamilnadu, India
3. Badrinarayanan Gorontala Ranganathan
Purva Highlands K901, Holiday Village Road, Mallasandra, Bangalore- 560062, Karnataka, India

Specification

Description: Complete Specification

The following specification describes and ascertains the nature of this invention and the manner in which it is to be performed.
Field of the invention
[0001] The invention is related to a device for tracking at least one tool in a tool room.

Background of the invention

[0002] Tool monitoring covers, among other things, a tool's place and usage history. Alerts on tool misplacement in the tool room are notified via an integrated dashboard to the concerned users; in addition, analytics on tool usage trends, availability time, and other KPIs are updated on the dashboard in real time. In workshops, managing and distributing tools can be difficult. The current system of recordkeeping is manual, labor-intensive, prone to mistakes, and makes it challenging to keep track of where tools are located and who has checked them out. This may result in tools being mistreated, misplaced, or ruined. Other prior arts that concentrate on camera-based item tracking and action recognition are only applicable to commercial stores, are computationally expensive, and rely on additional hardware to track items, such as weight sensors and smart shelves, which adds significantly to the cost of the solution.

[0003] A patent application 20210124940 discloses a system that includes a sensor, a weight sensor, and a tracking subsystem. The tracking subsystem receives an image feed of top-view images generated by the sensor and weight measurements from the weight sensor. The tracking subsystem detects an event associated with an item being removed from a rack in which the weight sensor is installed. The tracking subsystem determines that a first person or a second person may be associated with the event. In response to determining that the first or second person may be associated with the event, buffer frames are stored of top-view images generated by the sensor during a time period associated with the event. The tracking subsystem then determines, using at least one of the stored buffer frames and a first action-detection algorithm, whether an action associated with the event was performed by the first person or the second person.


Brief description of the accompanying drawings
[0004] Figure 1 illustrates a device for tracking at least one tool in a tool room according to one embodiment of the invention; and

[0005] Figure 2 illustrates a flowchart of a method of tracking at least one tool in the tool room according to the present invention.

Detailed description of the embodiments
[0006] Figure 1 illustrates a device for tracking at least one tool in a tool room according to one embodiment of the invention. The device 10 comprises an authentication module 16 adapted to identify and authenticate a person 15 entering/exiting the tool room 14. The device 10 further comprises at least one camera 18 adapted to capture multiple images of the person 15 entering the tool room 14 and an image processing unit 20 for processing the captured multiple images during entry/exit and movement of the person 15 in the tool room 14. The device 10 comprises an action recognition module 22 adapted to identify an action performed by the person 15 on at least one tool 12 based on captured movement in the processed multiple images.

[0007] Further, the construction of the device and the working of the device are explained in detail. The device 10 comprises a control unit 24 adapted to perform at least one function related to a recognition of the action performed by the person 15 in the tool room 14. The control unit 24 is chosen from a group of control units comprising a microcontroller, a microprocessor, a digital circuit, an integrated chip, and the like. The control unit 24 comprises a memory 26 to store data related to the tracking of the tool 12 in the tool room 14. The memory 26 stores data related to the tool room 14 that is divided into multiple shelf boundaries 28, with a tool 12 positioned in each shelf boundary 28. It is to be noted that not all the shelf boundaries 28 have a tool 12 positioned inside them. Further to this, the memory 26 comprises a centralized repository that holds the history of tool usage and the respective person who used each tool 12.

[0008] The control unit 24 performs multiple operations during the operating mode of the device 10. One such action is detecting the picking up/dropping down of a tool 12 in the tool room 14 by a located person 15. The present invention discloses one such concept of how the absence/presence and the further tracking of the tool 12 in the tool room 14 are detected with the help of multiple cameras 18 operated/controlled by the control unit 24.

[0009] The tool tracking is automated with the aid of cameras 18, which removes the need for manual entry into an application. Additionally, unlike other alternatives, the proposed invention does not rely on complicated equipment, such as tool tags and weight sensor shelves. The area around the tool 12 where the tool 12 is positioned is referred to as the shelf boundary 28, i.e., the total area of the tool room 14 is divided into multiple shelf boundaries 28 and a shelf boundary 28 is provided with a tool 12 in the tool room 14. The proposed solution takes a novel track by employing computer vision technology to keep track of tool movement inside the tool room 14. The real-time video is captured by the multiple cameras 18 placed strategically throughout the tool room 14, and the retrieved spatio-temporal data is processed by the control unit 24 having an integrated intelligence module (not shown) that uses any one of the intelligence techniques such as artificial intelligence, deep learning, machine learning, and the like.

[0010] The tool room area 14, as well as the tool shelf boundaries 28, are both covered by cameras 18 hanging from the ceiling in a top-down position. Multiple cameras 18 and overlapping cameras 18 can both be used to view the same region. People 15 inside the tool room 14 are detected and followed using an intelligence-powered camera (for instance, an artificial intelligence (AI) camera). The control unit 24 comprises a person detection module 30 that receives the captured multiple images for locating the person 15 in the captured images, and a new camera tracking is initiated for detecting the movement of the person 15 in the tool room 14 after mapping the located person 15 with predefined person data in the memory 26.

[0011] The person detection module 30 uses any one of the following techniques, comprising a facial recognition system or a unique QR code that is scanned at the tool room's 14 entrance via an app, to identify a person 15 that is entering the space 14. The control unit 24 then determines who took which tool 12 and records the tool's location 28 and usage history once the person 15 has been spotted and recognized. After this, the pertinent data is automatically collected in a centralized database, processed, and displayed in a mobile/web app, giving workshop managers and concerned staff up-to-date visibility into tool availability and usage.

[0012] Figure 2 illustrates a flowchart of a method of tracking at least one tool in the tool room according to the present invention. In step S1, a person 15 entering/exiting the tool room 14 is identified and authenticated by an authentication module 16. In step S2, multiple images of the person 15 entering the tool room 14 are captured by multiple cameras 18 positioned in the tool room 14, and the captured multiple images are processed by an image processing unit 20 during entry/exit and movement of the person 15 in the tool room 14. In step S3, an action performed by the person 15 on at least one tool 12 is identified based on captured movement in the processed multiple images by an action recognition module 22.

[0013] A detailed explanation of the above methodology is given below. The tools 12 are usually positioned in various places in the tool room 14. The tool room 14 is divided into multiple shelf boundaries 28, wherein some of the shelf boundaries 28 comprise at least one tool 12 or more depending on the requirement. One of the designs of the shelf boundary 28 in the tool room 14 is that each shelf boundary 28 is provided with a pixel matrix (i.e., (0,0), (0,1)). The multiple cameras 18 positioned in various places of the tool room 14 help in detecting and identifying the people/persons 15 entering and exiting the room 14 and the movement of the person 15 inside the tool room 14. The person 15 entering the tool room 14 is visible under the entry/exit cameras 18 once they have entered the tool room 14, and the authentication module 16 verifies the person 15 using an authentication means, one such being a QR code at the entrance. From there, the image stream is passed to the person detection module 30, which locates the person 15 in the image.
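
The division of the tool room image into pixel-indexed shelf boundaries described above can be sketched in code. The following is a hypothetical illustration, not part of the specification: the tool names, coordinates, and the axis-aligned rectangle representation of a shelf boundary 28 are all assumptions.

```python
# Hypothetical sketch: shelf boundaries 28 modeled as axis-aligned pixel
# rectangles, with a point-in-boundary test for a tracked keypoint.
from dataclasses import dataclass

@dataclass(frozen=True)
class ShelfBoundary:
    """One shelf boundary 28: pixel region (x0, y0)..(x1, y1) holding a tool."""
    tool_id: str
    x0: int
    y0: int
    x1: int
    y1: int

    def contains(self, x: float, y: float) -> bool:
        # True when the keypoint lies inside this shelf boundary.
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

def boundary_at(boundaries, x, y):
    """Return the shelf boundary a keypoint lies in, or None if outside all."""
    for b in boundaries:
        if b.contains(x, y):
            return b
    return None

# Example layout (hypothetical coordinates and tool names).
shelves = [ShelfBoundary("wrench", 0, 0, 100, 80),
           ShelfBoundary("hammer", 101, 0, 200, 80)]
print(boundary_at(shelves, 150, 40).tool_id)  # -> hammer
```

A real deployment would define one such region per shelf/rack for each camera view, as paragraph [0017] describes.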

[0014] The control unit 24 initiates a new camera tracker for the person 15 with a unique person ID, in which the person's extracted features from at least one image are stored. The staff details are then mapped to these trackers using one of the aforementioned identifying techniques. The staff details are pre-loaded into the memory 26 of the control unit 24. In the next frames, the person's features are retrieved and stored in the same tracker. When new detections appear, or when additional or multiple people 15 enter, their features are extracted and compared to those in the existing trackers. If the mapping fails in the trackers, then the control unit 24 classifies the new detections as new people, and new trackers are initiated for them.
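
The tracker-assignment logic above (match a new detection's features against existing trackers, else start a new tracker) could be sketched as follows. This is a simplified assumption-laden sketch: the feature vectors, the use of cosine similarity as the matching measure, and the 0.8 threshold are all hypothetical, and a real system would use appearance embeddings from a re-identification model.

```python
# Hypothetical sketch of mapping detections to camera trackers by
# comparing feature vectors, as in paragraph [0014].
import math

def cosine_sim(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def assign_tracker(trackers, feature, threshold=0.8):
    """Map a detection's feature vector to the best-matching existing
    tracker; if no tracker exceeds the threshold, the mapping has
    failed and a new tracker is initiated for a new person."""
    best_id, best_sim = None, threshold
    for tracker_id, stored_feature in trackers.items():
        sim = cosine_sim(feature, stored_feature)
        if sim > best_sim:
            best_id, best_sim = tracker_id, sim
    if best_id is None:
        best_id = f"person-{len(trackers) + 1}"  # hypothetical ID scheme
        trackers[best_id] = feature
    return best_id
```

The same comparison, applied between per-camera trackers and the global entry/exit trackers, would support the cross-camera re-identification described in the next paragraph.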

[0015] The trackers created by the entry/exit cameras are designated as global trackers, and they are used to re-identify people across several cameras. As the located person 15 moves through the tool room 14, he is captured by multiple cameras 18, and as he does so, individual trackers are started in each one. These trackers, which may contain different person IDs at first, are then compared with the global trackers, and features that match are assigned the same person ID. This allows for the re-identification and tracking of the same person 15 across multiple cameras 18 in the tool room 14.

[0016] Once the person 15 is identified by the control unit 24, the action recognition module 22 performs its operations in two steps. One is that a 2D pose estimation model is used to extract and recognize important body parts from the detected human body of the located person 15, and the second is that a custom module 34 (of the control unit 24) uses established object detection and computer vision methods to identify pick-up and drop actions of tools 12 on the rack or from the shelf boundary 28. This technique effectively monitors both the handled object/tool 12 and the specific person's body part movements. In addition to the identified shelf boundary 28/person attributes, the tracker object created for each individual additionally extracts 2D key points of the individual using a pose estimation technique.

[0017] A virtual boundary space is defined around each shelf/rack across multiple cameras 18, which is referred to as the shelf boundary 28. A person's body part entering this shelf boundary 28 to take up or put back a tool 12 triggers the custom module 34, which operates in multiple ways. In one such way, when a person's body part key point crosses a shelf boundary 28, the body part is examined to see whether it is empty or not, to determine whether the person 15 is about to pick something up or put something down. This is done by looking for tool 12 detections outside of the shelf boundary 28 in the image. The presence or absence of a tool 12 is then confirmed by computing the Euclidean distance between the person's body part and the detected tool 12.
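
The empty-hand check above can be sketched as a nearest-tool search under a distance cutoff. This is a hypothetical illustration: the 40-pixel threshold, the keypoint and centroid coordinates, and the tool names are assumptions, not values from the specification.

```python
# Hypothetical sketch of the Euclidean-distance check from paragraph
# [0017]: is any detected tool close enough to the body-part keypoint?
import math

def euclidean(p, q):
    """Euclidean distance between two 2D pixel points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def hand_holds_tool(hand_keypoint, tool_detections, max_dist=40.0):
    """Return the ID of the nearest detected tool centroid within
    max_dist pixels of the keypoint, or None if the hand is empty."""
    nearest_id, nearest_d = None, max_dist
    for tool_id, centroid in tool_detections:
        d = euclidean(hand_keypoint, centroid)
        if d < nearest_d:
            nearest_id, nearest_d = tool_id, d
    return nearest_id
```

Whether `hand_holds_tool` returns a tool or `None` as the keypoint crosses the shelf boundary 28 would decide whether a drop or a pick-up sequence is being observed.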

[0018] In another instance, when the person 15 is carrying a tool 12 and it crosses the defined shelf boundary 28, the cameras 18 capture the preceding fifteen frames as a sequence. Subsequently, two Euclidean distance sequences, one for the body part keypoint and one for the tool's centroid relative to the body keypoint, are computed over this sequence. A further step involves calculating the cosine similarity between these two Euclidean sequences to determine whether the tool trajectory and the hand trajectory are aligned. Upon satisfying this condition, the drop action is confirmed by the control unit 24, and the person's 15 entry with the corresponding tool 12 in the repository is reset.
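
The trajectory-alignment test above can be sketched as follows, under the assumption that the two "Euclidean sequences" are the frame-to-frame displacement magnitudes of the hand keypoint and of the tool centroid over the buffered frames; the 0.9 similarity threshold is also a hypothetical value.

```python
# Hypothetical sketch of the drop confirmation in paragraph [0018]:
# cosine similarity between the hand's and the tool's per-frame
# Euclidean displacement sequences.
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def step_distances(points):
    """Frame-to-frame Euclidean displacement magnitudes of a trajectory."""
    return [math.hypot(q[0] - p[0], q[1] - p[1])
            for p, q in zip(points, points[1:])]

def drop_confirmed(hand_trajectory, tool_trajectory, threshold=0.9):
    """Confirm a drop when hand and tool move in step over the buffered
    frames, i.e. their displacement sequences are highly aligned."""
    return cosine_similarity(step_distances(hand_trajectory),
                             step_distances(tool_trajectory)) >= threshold
```

For example, a hand and a tool that accelerate together toward the shelf produce near-identical step sequences and a similarity close to 1, while a stationary tool near a moving hand does not.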

[0019] Yet in another instance, when the located person 15 does not have a tool 12 in his hand while crossing the shelf boundary 28, his key points are tracked for the consecutive frames and are checked to determine whether the body part keypoint has exited the shelf boundary 28 or not.

[0020] In another example, when the located person 15 lifts the tool 12 and subsequently withdraws their body part from the shelf boundary 28, the keypoint representing their body part's position traverses the shelf boundary 28 once again. This traversal is identified, prompting an examination of the following fifteen frames. During this analysis, it is determined whether any tool detection is in proximity to the person's body part keypoint using the Euclidean distance methodology. This approach resembles the cosine similarity trajectory method disclosed above. Through this verification process, it is ascertained whether a tool 12 has indeed been lifted. Subsequently, an object detection model in the action recognition module 22 is employed to recognize the specific tool 12 that has been acquired. Once the tool 12 has been confirmed as picked up by the individual, both the person 15 particulars and the tool 12 specifics are logged as an entry in a centralized repository using any one of the interfaces, wherein one such interface is an Application Programming Interface (API).
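
The repository bookkeeping implied by paragraphs [0018] and [0020] (log a confirmed pick-up, reset the entry on a confirmed drop) could be sketched as below. This is a hypothetical in-memory stand-in; the specification only says entries are logged via an interface such as an API, so the class, field names, and methods here are illustrative assumptions.

```python
# Hypothetical in-memory stand-in for the centralized repository of
# paragraphs [0018]/[0020]; a real device would call a REST API instead.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ToolRepository:
    entries: list = field(default_factory=list)

    def log_pickup(self, person_id: str, tool_id: str) -> None:
        """Record a confirmed pick-up with a UTC timestamp."""
        self.entries.append({"person": person_id, "tool": tool_id,
                             "action": "pickup",
                             "time": datetime.now(timezone.utc).isoformat()})

    def log_drop(self, person_id: str, tool_id: str) -> None:
        """A confirmed drop resets the person's entry for that tool."""
        self.entries = [e for e in self.entries
                        if not (e["person"] == person_id
                                and e["tool"] == tool_id
                                and e["action"] == "pickup")]

    def holder_of(self, tool_id: str):
        """Return who currently holds the tool, or None if it is shelved."""
        for e in reversed(self.entries):
            if e["tool"] == tool_id and e["action"] == "pickup":
                return e["person"]
        return None
```

A dashboard query such as `holder_of("wrench")` would then back the real-time availability view described in paragraph [0011].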

[0021] In addition to the above-mentioned methodology, a digital communication channel is developed as part of the device 10 to increase efficiency even more. Users can request tool borrowing and return through this channel using a mobile application or a web portal, eliminating the necessity for laborious paperwork or face-to-face encounters. Through the channel, notifications and reminders are issued, assuring prompt returns and lowering the possibility of misunderstandings. In a workshop, it is simple to combine the suggested system with other systems and inventory management software. Additionally, tool recommendation for a process and tool condition monitoring are offered. The above-disclosed methodology shows significant gains in computational efficiency and accuracy through empirical comparisons with current state-of-the-art action recognition systems and algorithms.

[0022] It should be understood that embodiments explained in the description above are only illustrative and do not limit the scope of this invention. Many such embodiments and other modifications and changes in the embodiment explained in the description are envisaged. The scope of the invention is only limited by the scope of the claims.
Claims:

We Claim:
1. A device (10) for tracking at least one tool (12) in a tool room (14), said device (10) comprising:
- an authentication module (16) adapted to identify and authenticate a person (15) entering/exiting said tool room (14);
characterized in that:
- at least one camera (18) adapted to capture multiple images of said person (15) entering said tool room (14) and an image processing unit (20) for processing said captured multiple images during entry/exit and movement of said person (15) in said tool room (14);
- an action recognition module (22) adapted to identify an action performed by said person (15) on at least one said tool (12) based on captured movement in said processed multiple images.

2. The device (10) as claimed in claim 1, wherein said device (10) comprises a control unit (24) adapted to perform at least one function related to a recognition of said action performed by said person (15) in said tool room (14).

3. The device (10) as claimed in claim 2, wherein said control unit (24) comprises said image processing unit (20), said action recognition module (22), and a memory (26), wherein said memory (26) stores data related to said tool room (14) divided into multiple shelf boundaries (28) having the tool (12) positioned in each of said shelf boundaries (28).

4. The device (10) as claimed in claim 1, wherein said device (10) comprises a person detection module (30) that receives the captured multiple images for locating the person (15) in said captured images, and a new camera tracking is initiated for detecting the movement of the person (15) in said tool room (14) after mapping said located person (15) with predefined person data in said memory (26).

5. The device (10) as claimed in claim 1, wherein said action recognition module (22) is adapted to extract a human body part of the located person (15), and the action (picking up/dropping down) performed by the located person (15) on the at least one tool (12) in the tool room (14) is detected.

6. The device (10) as claimed in claim 1, wherein the control unit (24) is adapted to detect the pickup/drop down of the at least one tool (12) in its corresponding shelf boundary (28) based on a distance computed between the located person (15) and the detected tool (12).

7. The device (10) as claimed in claim 1, wherein the picking up /dropping down of the at least one tool (12) from the corresponding shelf boundary (28) is detected by the images captured by multiple cameras positioned in the tool room (14).

8. The device (10) as claimed in claim 1, wherein the action recognition module (22) is adapted to track the located person's key points for the consecutive frames, when the located person (15) does not have the tool (12) in his body part while crossing the shelf boundary (28), and to detect whether the body part key point has exited the shelf boundary (28).

9. A method for tracking at least one tool (12) in a tool room (14) by a device (10) having a control unit (24), said method comprising:
- identifying and authenticating a person (15) entering/exiting said tool room (14) by an authentication module (16);
characterized in that:
- capturing multiple images of the person entering the tool room (14) by multiple cameras (18) positioned in the tool room (14) and processing said captured multiple images by an image processing unit (20) during entry/exit and movement of said person (15) in said tool room (14);
- identifying an action performed by the person on at least one said tool (12) based on captured movement in said processed multiple images by an action recognition module (22).

Documents

Application Documents

# Name Date
1 202441025793-POWER OF AUTHORITY [29-03-2024(online)].pdf 2024-03-29
2 202441025793-FORM 1 [29-03-2024(online)].pdf 2024-03-29
3 202441025793-DRAWINGS [29-03-2024(online)].pdf 2024-03-29
4 202441025793-DECLARATION OF INVENTORSHIP (FORM 5) [29-03-2024(online)].pdf 2024-03-29
5 202441025793-COMPLETE SPECIFICATION [29-03-2024(online)].pdf 2024-03-29