Abstract: OBJECT TRACKING ROBOT ABSTRACT An object tracking robot (100) is disclosed. The robot (100) comprises a chassis (102). The chassis (102) comprises an input unit (104) to capture real-time images. The chassis (102) comprises a set of wheels (106) adapted to induce a motion in the robot (100). A processing unit (112) is configured to receive the captured real-time images; preprocess the captured images; lock in a tracker in the foreground of the captured images; extract a color of the tracker locked in the foreground; track the extracted color for adhering and locking the tracker on the extracted color; map a route from a present location of the robot (100) to the locked tracker; and actuate a motor driver (110) for driving the set of wheels (106). The robot (100) tracks and follows moving objects in real time, enhancing responsiveness and making it suitable for dynamic environments such as surveillance or autonomous patrol. Claims: 10, Figures: 3
Description: BACKGROUND
Field of Invention
[001] Embodiments of the present invention generally relate to a robot and particularly to an object tracking robot.
Description of Related Art
[002] Object tracking plays a crucial role in surveillance, automation, and robotics. Traditional surveillance systems rely on manual oversight or expensive, centralized hardware setups, which limit scalability and practical deployment in diverse environments. As the demand for autonomous and adaptive systems increases, the capability to detect and follow objects in real-time without the need for constant human intervention becomes critical in sectors such as security, transportation, and smart infrastructure.
[003] However, existing object tracking systems often rely on complex hardware setups or require substantial computational resources to function effectively. These systems tend to involve high costs and demand skilled personnel for installation and maintenance. As a result, they become impractical for applications that require low-cost, portable, or rapidly deployable solutions. Furthermore, such systems usually lack the flexibility to operate across varying environments or conditions without significant customization.
[004] Moreover, many current implementations face delays in detection and response, especially when required to operate in real-time scenarios. The limited processing capabilities of certain embedded platforms lead to latency and reduced accuracy in object recognition and movement prediction. These constraints make it difficult for systems to perform reliably in situations that demand immediate action based on visual input.
[005] Another major shortcoming lies in the integration between sensory inputs and motion control. Coordination between cameras, controllers, and actuators often suffers due to inefficient communication protocols or poor synchronization mechanisms. This results in sluggish or inaccurate tracking behavior, which can hinder the effectiveness of the system in dynamic or unpredictable settings.
[006] There is thus a need for an improved and advanced object tracking robot that can administer the aforementioned limitations in a more efficient manner.
SUMMARY
[007] Embodiments in accordance with the present invention provide an object tracking robot. The object tracking robot comprising a chassis adapted to encapsulate components of the robot. The chassis comprising an input unit adapted to capture real-time images of surroundings of the robot. The chassis further comprising a set of wheels adapted to induce a motion in the robot. The set of wheels is driven using a motor driver. The chassis further comprising a processing unit communicatively connected to the input unit and to the motor driver. The processing unit is configured to receive the captured real-time images of the surroundings of the robot; preprocess the captured images by conducting a process of foreground and background isolation. The isolated background is removed from the captured images; lock in a tracker in the foreground of the captured images; extract a color of the tracker locked in the foreground of the captured images. The color of the locked tracker is extracted using an Open Computer Vision algorithm (OpenCV); track the extracted color for adhering and locking the tracker on the extracted color. The tracking of the extracted color is carried out using a pre-coded Python language based program; map a dynamic route from a present location of the robot to the locked tracker; and actuate the motor driver for driving the set of wheels in a direction towards the locked tracker.
[008] Embodiments in accordance with the present invention further provide a method for tracking an object using an object tracking robot. The method comprising steps of receiving captured real-time images of surroundings of the robot from an input unit; preprocessing the captured images by conducting a process of foreground and background isolation. The isolated background is removed from the captured images; locking in a tracker in the foreground of the captured images; extracting a color of the tracker locked in the foreground of the captured images. The color of the locked tracker is extracted using an Open Computer Vision algorithm (OpenCV); tracking the extracted color for adhering and locking the tracker on the extracted color. The tracking of the extracted color is carried out using a pre-coded Python language based program; mapping a dynamic route from a present location of the robot to the locked tracker; and actuating a motor driver for driving a set of wheels in a direction towards the locked tracker.
[009] Embodiments of the present invention may provide a number of advantages depending on their particular configuration. First, embodiments of the present application may provide an object tracking robot.
[0010] Next, embodiments of the present application may provide an object tracking robot that allows broader access for educational, research, and small-scale industrial applications.
[0011] Next, embodiments of the present application may provide an object tracking robot that tracks and follows moving objects in real-time for enhancing responsiveness and making it suitable for dynamic environments such as surveillance or autonomous patrol.
[0012] Next, embodiments of the present application may provide an object tracking robot that remains compact due to the use of integrated hardware components, allowing easy transportation and deployment in varied physical settings without the need for bulky infrastructure.
[0013] Next, embodiments of the present application may provide an object tracking robot that operates without the need for manual intervention by combining object detection with motion control. It identifies the object’s position and directs the robot’s movement accordingly, ensuring hands-free functionality.
[0014] Next, embodiments of the present application may provide an object tracking robot that allows easy upgrades, including improved sensors, night vision capability, and AI-based object recognition, offering flexibility for future enhancements or specific application needs.
[0015] These and other advantages will be apparent from the present application of the embodiments described herein.
[0016] The preceding is a simplified summary to provide an understanding of some embodiments of the present invention. This summary is neither an extensive nor exhaustive overview of the present invention and its various embodiments. The summary presents selected concepts of the embodiments of the present invention in a simplified form as an introduction to the more detailed description presented below. As will be appreciated, other embodiments of the present invention are possible utilizing, alone or in combination, one or more of the features set forth above or described in detail below.
BRIEF DESCRIPTION OF THE DRAWINGS
[0017] The above and still further features and advantages of embodiments of the present invention will become apparent upon consideration of the following detailed description of embodiments thereof, especially when taken in conjunction with the accompanying drawings, and wherein:
[0018] FIG. 1A illustrates a schematic block diagram of an object tracking robot, according to an embodiment of the present invention;
[0019] FIG. 1B illustrates the object tracking robot, according to an embodiment of the present invention; and
[0020] FIG. 2 depicts a flowchart of a method for tracking an object using the object tracking robot, according to an embodiment of the present invention.
[0021] The headings used herein are for organizational purposes only and are not meant to be used to limit the scope of the description or the claims. As used throughout this application, the word "may" is used in a permissive sense (i.e., meaning having the potential to), rather than the mandatory sense (i.e., meaning must). Similarly, the words “include”, “including”, and “includes” mean including but not limited to. To facilitate understanding, like reference numerals have been used, where possible, to designate like elements common to the figures. Optional portions of the figures may be illustrated using dashed or dotted lines, unless the context of usage indicates otherwise.
DETAILED DESCRIPTION
[0022] The following description includes the preferred best mode of one embodiment of the present invention. It will be clear from this description of the invention that the invention is not limited to these illustrated embodiments but that the invention also includes a variety of modifications and embodiments thereto. Therefore, the present description should be seen as illustrative and not limiting. While the invention is susceptible to various modifications and alternative constructions, it should be understood, that there is no intention to limit the invention to the specific form disclosed, but, on the contrary, the invention is to cover all modifications, alternative constructions, and equivalents falling within the scope of the invention as defined in the claims.
[0023] In any embodiment described herein, the open-ended terms "comprising", "comprises", and the like (which are synonymous with "including", "having", and "characterized by") may be replaced by the respective partially closed phrases "consisting essentially of", "consists essentially of", and the like, or the respective closed phrases "consisting of", "consists of", and the like.
[0024] As used herein, the singular forms “a”, “an”, and “the” designate both the singular and the plural, unless expressly stated to designate the singular only.
[0025] FIG. 1A illustrates a schematic block diagram of an object tracking robot 100 (hereinafter referred to as the robot 100), according to an embodiment of the present invention. The robot 100 may be adapted to lock a tracker on a color of an object. Further, the robot 100 may track the locked tracker, thus traversing towards the object. Moreover, if the object is dynamic, the robot 100 may track the color of the object in a three-dimensional spatial environment for locking the tracker in real time, and hence follow the object.
[0026] The robot 100 may be used in fields such as, but not limited to, a manufacturing line, surveillance, a militarized zone, and so forth. Embodiments of the present invention are intended to include or otherwise cover any field of utilization of the robot 100, including known, related art, and/or later developed technologies.
[0027] According to the embodiments of the present invention, the robot 100 may incorporate non-limiting hardware components to enhance processing speed and efficiency. For example, the robot 100 may comprise a chassis 102, an input unit 104, a set of wheels 106, a servo motor 108, a motor driver 110, a processing unit 112, a Universal Serial Bus (USB) 114, a storage unit 116, and a power supply unit 118. In an embodiment of the present invention, the hardware components of the robot 100 may be integrated with computer-executable instructions for overcoming the challenges and limitations of the existing robots.
[0028] In an embodiment of the present invention, the chassis 102 may be adapted to provide a structural strength and integrity to the robot 100. The chassis 102 may comprise the input unit 104, the set of wheels 106, the servo motor 108, the motor driver 110, the processing unit 112, the Universal Serial Bus (USB) 114, the storage unit 116, and the power supply unit 118.
[0029] In an embodiment of the present invention, the input unit 104 may be adapted to capture real-time images of surroundings of the robot 100. The input unit 104 may be, but not limited to, a flood illuminator, a dot projector, a flywheel, a gyroscope, an ultrasonic sensor, and so forth. In a preferred embodiment of the present invention, the input unit 104 may be a 5 megapixel Pi camera. Embodiments of the present invention are intended to include or otherwise cover any type of the input unit 104, including known, related art, and/or later developed technologies.
[0030] In an embodiment of the present invention, the set of wheels 106 may be adapted to induce a motion in the robot 100. The set of wheels 106 may enable a linear forward and backward motion in the robot 100, in an embodiment of the present invention. In another embodiment of the present invention, the set of wheels 106 may enable a curved forward and backward motion in the robot 100. In yet another embodiment of the present invention, the set of wheels 106 may enable a right turn, a left turn, and an axis-about turn in the robot 100.
[0031] In an embodiment of the present invention, the set of wheels 106 may be connected to the servo motor 108. The servo motor 108 may comprise a shaft (not shown) that may be fixated on a center of the set of wheels 106. The fixation may be carried out by means such as, but not limited to, ball-bearings, timing belts, and so forth. Embodiments of the present invention are intended to include or otherwise cover any means for fixation of the shaft of the servo motor 108 with the set of wheels 106, including known, related art, and/or later developed technologies.
[0032] In an embodiment of the present invention, the servo motor 108 may be electronically driven by the motor driver 110. The motor driver 110 may be adapted to receive electronic instructions comprising a direction, a distance, and a speed to be attained by the robot 100. The motor driver 110 may interpret the electronic instructions and may drive the servo motor 108 in such a fashion that the connected set of wheels 106 attain the instructed direction, the instructed distance, and the instructed speed. The motor driver 110 may be, but not limited to, a sine wave driver, a cosine wave driver, and so forth. In a preferred embodiment of the present invention, the motor driver 110 may be an L293D servo driver. Embodiments of the present invention are intended to include or otherwise cover any type of the motor driver 110, including known, related art, and/or later developed technologies.
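The specification does not disclose how the motor driver 110 interprets the electronic instructions; a minimal sketch, assuming a differential-drive base where each wheel receives a normalized duty cycle, could look as follows. The function name and the command vocabulary are illustrative assumptions, not taken from the specification.

```python
# Hypothetical sketch: translating a high-level drive command into
# per-wheel duty cycles for a differential-drive base, as the motor
# driver (110) might do. Names and conventions are illustrative.

def wheel_commands(direction: str, speed: float):
    """Return (left_duty, right_duty) in [-1.0, 1.0] for a command.

    Positive duty drives a wheel forward, negative drives it backward.
    """
    if not 0.0 <= speed <= 1.0:
        raise ValueError("speed must be normalized to [0, 1]")
    commands = {
        "forward":  (speed,  speed),
        "backward": (-speed, -speed),
        "left":     (-speed, speed),   # spin in place toward the left
        "right":    (speed,  -speed),  # spin in place toward the right
        "stop":     (0.0, 0.0),
    }
    try:
        return commands[direction]
    except KeyError:
        raise ValueError(f"unknown direction: {direction!r}") from None
```

In a real deployment these duty cycles would be converted to PWM signals on the driver's enable pins, together with direction pins for each half of the H-bridge.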
[0033] In an embodiment of the present invention, the processing unit 112 may be connected to the input unit 104 and to the motor driver 110. The connectivity of the processing unit 112 with the input unit 104 may be enabled by the Universal Serial Bus (USB) 114. The Universal Serial Bus (USB) 114 may operate on a Plug and Play protocol. The Plug and Play protocol may allow replacement of the input unit 104 with other kinds of input unit 104 that may specifically be designed for a specialized task. The Universal Serial Bus (USB) 114 may further enable installation of auxiliary accessories to the robot 100. The auxiliary accessories may be, but not limited to, a flashlight, a buzzer, a speaker, a launcher, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the auxiliary accessories, including known, related art, and/or later developed technologies.
[0034] The processing unit 112 may be configured to receive the captured real-time images of the surroundings of the robot 100. The processing unit 112 may be configured to preprocess the captured images by conducting a process of foreground and background isolation. The process of foreground and background isolation may further be enhanced by an onboard Joint Photographic Experts Group (JPEG) encoder and decoder (not shown) integrated into the processing unit 112. The isolated background may be removed from the captured images. Upon removal of the background, the captured real-time images may be left in the foreground.
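The specification does not state which isolation algorithm is used; in practice this step is often realized with OpenCV background subtractors (e.g. `cv2.createBackgroundSubtractorMOG2`). The idea can be sketched in plain Python with simple frame differencing, where a pixel counts as foreground if it differs sufficiently from a reference background frame. Function names and the threshold are illustrative assumptions.

```python
# Minimal, hypothetical sketch of foreground/background isolation by
# frame differencing. Frames are grayscale images represented as nested
# lists of 0-255 intensities.

def foreground_mask(background, frame, threshold=30):
    """Mark pixels that differ from the background by more than threshold.

    Returns a mask of the same shape: 1 = foreground, 0 = background.
    """
    return [
        [1 if abs(f - b) > threshold else 0 for f, b in zip(frow, brow)]
        for frow, brow in zip(frame, background)
    ]

def remove_background(frame, mask):
    """Zero out background pixels, leaving only the foreground."""
    return [
        [f if m else 0 for f, m in zip(frow, mrow)]
        for frow, mrow in zip(frame, mask)
    ]
```

The masked output corresponds to the "captured real-time images left in the foreground" described above.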
[0035] The processing unit 112 may be configured to lock in a tracker in the foreground of the captured images. The processing unit 112 may be configured to extract a color of the tracker locked in the foreground of the captured images. The color of the locked tracker may be extracted using an Open Computer Vision algorithm (OpenCV). The Open Computer Vision algorithm (OpenCV) may be executed on an open VideoCore Graphics Processing Unit (GPU) (not shown). Furthermore, the process of color extraction may be enhanced by an onboard graphics accelerator (not shown) integrated onto the processing unit 112. The processing unit 112 may be configured to track the extracted color for adhering and locking the tracker on the extracted color. The tracking of the extracted color may be carried out using a pre-coded Python language based program.
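With OpenCV, color locking is typically done with `cv2.inRange` on an HSV image followed by `cv2.moments` to locate the matched region; the same idea is shown below in plain Python for clarity. Pixels are (hue, saturation, value) tuples, and the range bounds are illustrative assumptions rather than values from the specification.

```python
# Hypothetical sketch of the color-locking step: threshold pixels to an
# HSV range, then compute the centroid of the matching region, which
# serves as the tracker's locked position in the image.

def color_mask(hsv_image, lower, upper):
    """1 where a pixel falls inside the [lower, upper] HSV box, else 0."""
    return [
        [1 if all(lo <= c <= hi for c, lo, hi in zip(px, lower, upper)) else 0
         for px in row]
        for row in hsv_image
    ]

def centroid(mask):
    """Average (row, col) of the masked pixels; None if nothing matched."""
    points = [(r, c) for r, row in enumerate(mask)
              for c, m in enumerate(row) if m]
    if not points:
        return None
    return (sum(r for r, _ in points) / len(points),
            sum(c for _, c in points) / len(points))
```

Re-running these two steps on each new frame is what keeps the tracker "adhered" to the extracted color as the object moves.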
[0036] The processing unit 112 may be configured to map a dynamic route from a present location of the robot 100 to the locked tracker. The processing unit 112 may be configured to actuate the motor driver 110 for driving the set of wheels 106 in a direction towards the locked tracker.
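The specification does not detail how the dynamic route is mapped; one common minimal realization is proportional steering, where the robot turns until the locked tracker is horizontally centered in the frame and then drives forward. The function below is a sketch under that assumption; its name, the dead band, and the direction labels are illustrative.

```python
# Hypothetical steering sketch: map the locked tracker's horizontal
# position in the frame to a drive direction for the motor driver.

def steer(tracker_col, frame_width, dead_band=0.1):
    """Choose 'left', 'right', or 'forward' from the tracker position.

    dead_band is the fraction of the frame width around the center in
    which the tracker counts as already centered.
    """
    center = frame_width / 2
    offset = (tracker_col - center) / frame_width  # roughly -0.5 .. 0.5
    if offset < -dead_band:
        return "left"
    if offset > dead_band:
        return "right"
    return "forward"
```

The returned direction would then be handed to the motor driver 110 to drive the set of wheels 106 towards the locked tracker.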
[0037] The processing unit 112 may be, but not limited to, a Programmable Logic Control (PLC) unit, a microprocessor, a development board, and so forth. In a preferred embodiment of the present invention, the processing unit 112 may be a Raspberry Pi model 3. Embodiments of the present invention are intended to include or otherwise cover any type of the processing unit 112, including known, related art, and/or later developed technologies.
[0038] For an initial use of the robot 100, the processing unit 112 may be configured by connecting with a computer device (not shown). The connectivity may be established using a Vulcanized Jacket and Tape (VJT) cable (not shown). The input unit 104 may capture and transmit live footage for processing. Further, once power is supplied to the processing unit 112, the robot 100 may automatically center the object and adjust its position accordingly, moving left, right, forward, or backward as needed.
[0039] In an embodiment of the present invention, the storage unit 116 may be adapted to store an operating system of the processing unit 112. The processing unit 112 may operate on a Raspbian operating system. The Raspbian operating system may be based on Debian Linux. The storage unit 116 may be, but not limited to, a Random-Access Memory (RAM), a Static Random-Access Memory (SRAM), a Dynamic Random-Access Memory (DRAM), a Read-Only Memory (ROM), an Erasable Programmable Read-only Memory (EPROM), an Electrically Erasable Programmable Read-only Memory (EEPROM), a NAND Flash, a cache memory, a Hard Disk Drive (HDD), a Solid-State Drive (SSD), and so forth. In a preferred embodiment of the present invention, the storage unit 116 may be a 16 Gigabyte (GB) Secure Digital (SD) memory. Embodiments of the present invention are intended to include or otherwise cover any type of the storage unit 116, including known, related art, and/or later developed technologies.
[0040] In an embodiment of the present invention, the power supply unit 118 may be adapted to supply operational power to the processing unit 112. The power supply unit 118 may supply the operational power from a battery. In another exemplary embodiment of the present invention, the power supply unit 118 may supply the operational power from a wall-outlet power supply.
[0041] In an embodiment of the present invention, the battery power supply may be from a rechargeable battery. In another embodiment of the present invention, the battery power supply may be from a non-rechargeable battery. According to embodiments of the present invention, the battery for power supply may be of any composition such as, but not limited to, a Nickel-Cadmium battery, a Nickel-Metal Hydride battery, a Zinc-Carbon battery, a Lithium-Ion battery, and so forth. Embodiments of the present invention are intended to include or otherwise cover any composition of the battery, including known, related art, and/or later developed technologies.
[0042] In an embodiment of the present invention, the wall-outlet power supply may be from a grid power line supply. In another embodiment of the present invention, the wall-outlet power supply may be from a generator line power supply. According to embodiments of the present invention, the wall-outlet power supply may be of any rating such as, but not limited to, a 110-volt supply, a 220-volt supply, and so forth. Embodiments of the present invention are intended to include or otherwise cover any rating of the wall-outlet power supply, including known, related art, and/or later developed technologies.
[0043] According to an embodiment of the present invention, the power supply unit 118 may supply an Alternating Current (AC) power supply. According to another embodiment of the present invention, the power supply unit 118 may supply a Direct Current (DC) power supply. According to yet another embodiment of the present invention, the power supply unit 118 may supply any type of power supply.
[0044] FIG. 1B illustrates the robot 100, according to an embodiment of the present invention. In an exemplary embodiment, the robot 100 may be configured to track an exemplary bottle 120. The chassis 102 of the robot 100 may encompass the processing unit 112. The processing unit 112 may be powered by the power supply unit 118. The processing unit 112 may estimate a location of the exemplary bottle 120 by extracting a color of the exemplary bottle 120 and placing and locking a tracker on the extracted color. By doing so, the processing unit 112 may be aware of the extracted color and the location of the extracted color. The processing unit 112 may drive the set of wheels 106 to approach the locked tracker, which has already been bonded to the color of the exemplary bottle 120. Wherever the exemplary bottle 120 ends up, the processing unit 112 may identify the extracted color, lock the tracker on the color, and start traversing towards the exemplary bottle 120. Viewed as a whole, a human observer may perceive that the robot 100 is directed towards the exemplary bottle 120 and is following the exemplary bottle 120 in the three-dimensional spatial environment.
[0045] FIG. 2 depicts a flowchart of a method 200 for tracking the object using the robot 100, according to an embodiment of the present invention.
[0046] At step 202, the robot 100 may receive the captured real-time images of the surroundings.
[0047] At step 204, the robot 100 may preprocess the captured images by conducting the process of foreground and background isolation.
[0048] At step 206, the robot 100 may lock in the tracker in the foreground of the captured images.
[0049] At step 208, the robot 100 may extract the color of the tracker locked in the foreground of the captured images.
[0050] At step 210, the robot 100 may track the extracted color for adhering and locking the tracker on the extracted color.
[0051] At step 212, the robot 100 may map the dynamic route from the present location of the robot 100 to the locked tracker.
[0052] At step 214, the robot 100 may actuate the motor driver 110 for driving the set of wheels 106 in the direction towards the locked tracker.
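The steps of the method 200 can be tied together as a single control-loop iteration. The sketch below is a hypothetical, simplified stand-in for the preprocessing, color-locking, and actuation stages; all names, thresholds, and the HSV-tuple pixel format are illustrative assumptions, not disclosed implementation details.

```python
# Hypothetical end-to-end sketch of one iteration of the method 200.
# Frames are rows of (hue, saturation, value) pixel tuples.

def track_step(background, frame, lower, upper, frame_width):
    """One iteration: isolate foreground, lock on color, choose a move."""
    # Step 204: foreground/background isolation by value differencing.
    fg = [[px if abs(px[2] - bg[2]) > 30 else (0, 0, 0)
           for px, bg in zip(frow, brow)]
          for frow, brow in zip(frame, background)]
    # Steps 206-210: lock the tracker on pixels inside the target range.
    cols = [c for row in fg for c, px in enumerate(row)
            if all(lo <= ch <= hi for ch, lo, hi in zip(px, lower, upper))]
    if not cols:
        return "stop"            # tracker lost: halt the wheels
    # Steps 212-214: steer toward the tracker's horizontal centroid.
    center = sum(cols) / len(cols)
    if center < 0.4 * frame_width:
        return "left"
    if center > 0.6 * frame_width:
        return "right"
    return "forward"
```

Running `track_step` on every captured frame, and feeding its result to the motor driver, yields the continuous follow behavior described for the robot 100.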
[0053] While the invention has been described in connection with what is presently considered to be the most practical and various embodiments, it is to be understood that the invention is not to be limited to the disclosed embodiments, but on the contrary, is intended to cover various modifications and equivalent arrangements included within the scope of the appended claims.
[0054] This written description uses examples to disclose the invention, including the best mode, and also to enable any person skilled in the art to practice the invention, including making and using any devices or systems and performing any incorporated methods. The patentable scope of the invention is defined in the claims, and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if they have structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal language of the claims.
Claims: CLAIMS
I/We Claim:
1. An object tracking robot (100), the robot (100) comprising:
a chassis (102) adapted to encapsulate components of the robot (100), the chassis (102) comprises:
an input unit (104) adapted to capture real-time images of surroundings of the robot (100);
a set of wheels (106) adapted to induce a motion in the robot (100), wherein the set of wheels (106) is driven using a motor driver (110); and
a processing unit (112) communicatively connected to the input unit (104) and to the motor driver (110), characterized in that the processing unit (112) is configured to:
receive the captured real-time images of the surroundings of the robot (100);
preprocess the captured images by conducting a process of foreground and background isolation, wherein the isolated background is removed from the captured images;
lock in a tracker in the foreground of the captured images;
extract a color of the tracker locked in the foreground of the captured images, wherein the color of the locked tracker is extracted using an Open Computer Vision algorithm (OpenCV);
track the extracted color for adhering and locking the tracker on the extracted color, wherein the tracking of the extracted color is carried out using a pre-coded Python language based program;
map a dynamic route from a present location of the robot (100) to the locked tracker; and
actuate the motor driver (110) for driving the set of wheels (106) in a direction towards the locked tracker.
2. The robot (100) as claimed in claim 1, wherein the input unit (104) is a 5 megapixel Pi camera.
3. The robot (100) as claimed in claim 1, wherein the motor driver (110) is an L293D servo driver.
4. The robot (100) as claimed in claim 1, wherein the input unit (104) is connected to the processing unit (112) using a Universal Serial Bus (USB) (114).
5. The robot (100) as claimed in claim 1, wherein the processing unit (112) is a Raspberry Pi.
6. The robot (100) as claimed in claim 1, comprising a storage unit (116) adapted to store an operating system of the processing unit (112).
7. The robot (100) as claimed in claim 1, comprising a power supply unit (118) adapted to supply operational power to the processing unit (112).
8. A method (200) for tracking an object using an object tracking robot (100), the method (200) is characterized by steps of:
receiving captured real-time images of surroundings of the robot (100) from an input unit (104);
preprocessing the captured images by conducting a process of foreground and background isolation, wherein the isolated background is removed from the captured images;
locking in a tracker in the foreground of the captured images;
extracting a color of the tracker locked in the foreground of the captured images, wherein the color of the locked tracker is extracted using an Open Computer Vision algorithm (OpenCV);
tracking the extracted color for adhering and locking the tracker on the extracted color, wherein the tracking of the extracted color is carried out using a pre-coded Python language based program;
mapping a dynamic route from a present location of the robot (100) to the locked tracker; and
actuating a motor driver (110) for driving a set of wheels (106) in a direction towards the locked tracker.
9. The method (200) as claimed in claim 8, wherein the input unit (104) is a 5 megapixel Pi camera.
10. The method (200) as claimed in claim 8, wherein the motor driver (110) is an L293D servo driver.
Date: April 22, 2025
Place: Noida
Nainsi Rastogi
Patent Agent (IN/PA-2372)
Agent for the Applicant
| # | Name | Date |
|---|---|---|
| 1 | 202541039355-STATEMENT OF UNDERTAKING (FORM 3) [24-04-2025(online)].pdf | 2025-04-24 |
| 2 | 202541039355-REQUEST FOR EARLY PUBLICATION(FORM-9) [24-04-2025(online)].pdf | 2025-04-24 |
| 3 | 202541039355-POWER OF AUTHORITY [24-04-2025(online)].pdf | 2025-04-24 |
| 4 | 202541039355-OTHERS [24-04-2025(online)].pdf | 2025-04-24 |
| 5 | 202541039355-FORM-9 [24-04-2025(online)].pdf | 2025-04-24 |
| 6 | 202541039355-FORM FOR SMALL ENTITY(FORM-28) [24-04-2025(online)].pdf | 2025-04-24 |
| 7 | 202541039355-FORM 1 [24-04-2025(online)].pdf | 2025-04-24 |
| 8 | 202541039355-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [24-04-2025(online)].pdf | 2025-04-24 |
| 9 | 202541039355-EDUCATIONAL INSTITUTION(S) [24-04-2025(online)].pdf | 2025-04-24 |
| 10 | 202541039355-DRAWINGS [24-04-2025(online)].pdf | 2025-04-24 |
| 11 | 202541039355-DECLARATION OF INVENTORSHIP (FORM 5) [24-04-2025(online)].pdf | 2025-04-24 |
| 12 | 202541039355-COMPLETE SPECIFICATION [24-04-2025(online)].pdf | 2025-04-24 |