7. ABSTRACT
The present invention is directed to a retrofit apparatus (100) adapted to fit to an object displacing system (102). The electronic retrofit apparatus (100) comprises a high-performance embedded system with a Graphics Processing Unit (GPU), a live image capturing device (104), a control unit to establish communication between the processing unit, a plurality of motors (106) and the image capturing device (104), and a user interface to show a live image feed, a detected image feed and a tracked image feed of the object being selected. The high-performance embedded system with GPU receives the images captured by the live image capturing device (104) and passes them to a Deep Neural Network, wherein the system outputs multiple detected boxes for an object to be detected. The high-performance embedded system enables the user to select a bounding box to be tracked, so that a tracker module of the embedded system tracks the selected object and sends commands to the controller driving the motors (106) to center the launcher on the selected object to induct. Figure associated with Abstract is Fig. 1
4. DESCRIPTION
Technical Field of the Invention
The present invention relates to an electronic retrofit for a multipurpose displacing system.
Background of the Invention
During a study of displacing systems in use, the inventors found a very low success rate of hitting the targets, because of too much manual intervention in targeting and inducting the displaced objects. The inventors found a requirement for a retrofit that will help the user in automatically tracking the target after its launch.
The inventors also found a requirement to record the launching activity and to develop a feedback system that will visualize the use of the displacing system.
Brief Summary of the Invention
According to an aspect of the present invention, the said electronic retrofit apparatus comprises a high-performance embedded system with a Graphics Processing Unit (GPU). The apparatus also comprises a live image capturing device with optical zoom to capture a feed for image processing.
In accordance with the aspect of the present invention, the said apparatus also comprises a controller for driving a plurality of motors adaptable to fit to the actuators of the displacing system that drive the targeting means in both azimuth and elevation angles.
In accordance with the aspect of the present invention, the said apparatus also comprises a control unit to establish communication between the processing unit, motors and the image capturing device. The said apparatus further comprises a user interface to show a live image feed, a detected image feed and a tracked image feed of the object being selected.
In accordance with the aspect of the present invention, the high-performance embedded system with GPU receives the images captured by the live image capturing device and passes them to a Deep Neural Network, wherein the system outputs multiple detected boxes for an object to be detected.
In accordance with the aspect of the present invention, the high-performance embedded system enables the user to select a bounding box to be tracked, so that a tracker module of the embedded system tracks the selected object and sends commands to the controller driving the motors to center the launcher on the selected object to launch an object.
In accordance with the aspect of the present invention, the live image capturing device with optical zoom of the said apparatus functions: to get the live feed of the scenario; to show the detected objects and tracked object on the screen to the user; and to process the captured images for detection and tracking.
In accordance with the aspect of the present invention, the control unit of the said apparatus is used to: establish a communication protocol between processing unit and image capturing device unit, for giving the commands to zoom in and out; and establish a communication protocol between processing unit and motors, for giving a set of commands to move the launcher.
In accordance with the aspect of the present invention, the user interface of the said apparatus enables the user to: enable the detection module and get the detections for a current scene; select the object to be tracked from all the detected objects; use the keyboard to move the launcher (up, down, left, right); and send commands to zoom in and out of the image capturing device.
The invention is capable of other embodiments or of being practiced or carried out in several ways. Also, it is to be understood that the phraseology and terminology employed herein is for description and should not be regarded as limiting.
Brief Description of the Drawings
FIG. 1 shows the block diagram of the retrofit apparatus in accordance with the present invention.
Detailed Description of the Invention
It is to be understood that the present disclosure is not limited in its application to the details of construction and the arrangement of components set forth in the following description. The present disclosure is capable of other embodiments and of being practiced or of being carried out in various ways. Also, it is to be understood that the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting the invention.
According to an exemplary embodiment of the present invention, an electronic retrofit apparatus adapted for an object displacing system is disclosed. The apparatus comprises a high-performance embedded system with a Graphics Processing Unit (GPU). The apparatus also comprises a live image capturing device with optical zoom to capture a feed for image processing.
In accordance with the exemplary embodiment of the present invention, the said apparatus also comprises a controller for driving a plurality of motors adaptable to fit to the actuators of the displacing system that drive the targeting means in both azimuth and elevation angles.
In accordance with the exemplary embodiment of the present invention, the said apparatus also comprises a control unit to establish communication between the processing unit, the motors and the image capturing device. The said apparatus further comprises a user interface to show a live image feed, a detected image feed and a tracked image feed of the object being selected.
In accordance with the exemplary embodiment of the present invention, the high-performance embedded system with GPU receives the images captured by the live image capturing device and passes them to a Deep Neural Network, wherein the system outputs multiple detected boxes for an object to be detected.
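By way of a non-limiting illustration only, the detection step described above may be sketched as follows. The specification does not name a particular network or framework; the choice of torchvision's Faster R-CNN, the 0.5 confidence threshold and the `detect` helper are assumptions made purely for illustration.

```python
# Illustrative sketch only: the network, threshold and helper name are assumed,
# as the specification does not prescribe a specific Deep Neural Network.
import torch
import torchvision

model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval().to("cuda")  # run inference on the embedded system's GPU

def detect(frame_rgb):
    """Pass one captured frame through the network; return boxes and scores."""
    tensor = torch.from_numpy(frame_rgb).permute(2, 0, 1).float().div(255).to("cuda")
    with torch.no_grad():
        out = model([tensor])[0]      # dict with "boxes", "labels", "scores"
    keep = out["scores"] > 0.5        # keep confident detections only
    return out["boxes"][keep].cpu(), out["scores"][keep].cpu()
```

Each returned box is a rectangle (x1, y1, x2, y2) that the user interface can draw over the live feed as a candidate for selection.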
In accordance with the exemplary embodiment of the present invention, the high-performance embedded system enables the user to select a bounding box to be tracked, so that a tracker module of the embedded system tracks the selected object and sends commands to the controller driving the motors to center the launcher on the selected object to launch an object.
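The control law by which the tracker centers the launcher is likewise not prescribed; a minimal proportional sketch, in which the gain `k` and the `send_motor_command` helper are hypothetical, is given below.

```python
# Hypothetical sketch: the gain k and send_motor_command are assumptions,
# not the prescribed control law of the apparatus.
def center_launcher(box, frame_w, frame_h, send_motor_command):
    """Nudge azimuth/elevation so the selected box drifts to the frame centre."""
    x1, y1, x2, y2 = box
    err_x = (x1 + x2) / 2.0 - frame_w / 2.0   # +ve: target right of centre
    err_y = (y1 + y2) / 2.0 - frame_h / 2.0   # +ve: target below centre
    k = 0.05                                  # assumed proportional gain (deg/px)
    send_motor_command("azimuth", k * err_x)
    send_motor_command("elevation", -k * err_y)  # image y axis grows downward
```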
In accordance with the exemplary embodiment of the present invention, the live image capturing device with optical zoom of the said apparatus functions: to get the live feed of the scenario; to show the detected objects and tracked object on the screen to the user; and to process the captured images for detection and tracking.
In accordance with the exemplary embodiment of the present invention, the control unit of the said apparatus is used to establish a communication protocol between processing unit and image capturing device unit, for giving the commands to zoom in and out; and establish a communication protocol between processing unit and motors, for giving a set of commands to move the launcher.
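The communication protocols themselves are left open by the specification; the following sketch assumes simple ASCII commands over two serial links, and the port names, baud rates and command strings are all illustrative assumptions.

```python
# Hypothetical sketch: ports, baud rates and command strings are assumptions;
# the specification only requires two communication links to be established.
import serial  # pyserial

camera = serial.Serial("/dev/ttyUSB0", 9600, timeout=1)    # image capturing device
motors = serial.Serial("/dev/ttyUSB1", 115200, timeout=1)  # motor controller

def zoom(direction):
    """Command the image capturing device to zoom in or out."""
    camera.write(b"ZOOM IN\n" if direction == "in" else b"ZOOM OUT\n")

def move(axis, steps):
    """Send a signed step command for the 'AZ' or 'EL' axis to move the launcher."""
    motors.write("MOVE {} {}\n".format(axis, steps).encode())
```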
In accordance with the exemplary embodiment of the present invention, the user interface of the said apparatus enables the user to: enable the detection module and get the detections for a current scene; select the object to be tracked from all the detected objects; use the keyboard to move the launcher (up, down, left, right); and send commands to zoom in and out of the image capturing device.
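A possible keyboard mapping for this interface is sketched below; the specific key bindings and step size are assumptions, since the specification states only that the keyboard moves the launcher and controls the zoom.

```python
# Hypothetical sketch: key bindings and step size are assumptions.
def handle_key(key, move, zoom):
    """Map one key press (e.g. from cv2.waitKey) onto launcher/zoom commands."""
    bindings = {
        ord("w"): lambda: move("EL", +10),   # up
        ord("s"): lambda: move("EL", -10),   # down
        ord("a"): lambda: move("AZ", -10),   # left
        ord("d"): lambda: move("AZ", +10),   # right
        ord("+"): lambda: zoom("in"),
        ord("-"): lambda: zoom("out"),
    }
    action = bindings.get(key & 0xFF)
    if action:
        action()
```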
In accordance with the exemplary embodiment of the present invention, the said apparatus aids in capturing a target, tracking the target, displacing it, and following it after an object is launched using the said displacing system.
In accordance with the exemplary embodiment of the present invention, the said apparatus enables: the user to handle the existing displacement systems remotely; the existing displacement systems to provide automated tracking of the object, without the user performing it manually; and the existing displacement systems to achieve displacement of the object by the operator and tracking by the said retrofit apparatus.
Referring now to the drawing wherein like numbers represent like parts in each of the several figures, wherein:
FIG. 1 shows the block diagram of the said retrofit apparatus (100) in accordance with the present invention. The said retrofit apparatus comprises a high-performance embedded system with a Graphics Processing Unit (GPU) (110), a displacing unit to launch the object, a plurality of motors (106), one for azimuth movement and one for elevation movement of the launcher, an image capturing device (104) with optical zoom to capture a feed for image processing, and a control unit (108) to establish communication (118) between the processing unit, the motors (106) and the image capturing device (104).
The said retrofit apparatus (100) provides a perception module (112) for perceiving the environment in the form of a live image frame and localizing the objects. The said perception module (112) utilizes its two subsystems: a detector (114), for detecting the objects within a frame by drawing rectangles around the objects; and a tracker (116), for tracking the selected objects preferred by the user.
The said perception module (112) acts to switch into a tracking mode and enable the discharge system to launch onto the selected target, and tracks the object using the tracker (116) post launch of the object.
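A non-limiting sketch of the perception module (112) and its two subsystems follows; OpenCV's CSRT tracker (available via opencv-contrib-python) stands in for the tracker (116) purely as an assumption, and the detector (114) is injected as a callable.

```python
# Hypothetical sketch of the perception module (112); the CSRT tracker is an
# assumed stand-in for the tracker (116), not the prescribed implementation.
import cv2

class PerceptionModule:
    def __init__(self, detect_fn):
        self.detect = detect_fn      # detector (114): frame -> candidate boxes
        self.tracker = None          # tracker (116): created on user selection

    def select(self, frame, box):
        """Switch into tracking mode on the user's chosen bounding box."""
        x1, y1, x2, y2 = (int(v) for v in box)
        self.tracker = cv2.TrackerCSRT_create()
        self.tracker.init(frame, (x1, y1, x2 - x1, y2 - y1))  # (x, y, w, h)

    def track(self, frame):
        """One post-launch tracking step; returns the box, or None if lost."""
        ok, (x, y, w, h) = self.tracker.update(frame)
        return (x, y, x + w, y + h) if ok else None
```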
The said retrofit apparatus (100) is provided with a Robotics Module (not shown) that moves the launcher (120) in accordance with the decisions made by the perception module (112), specifically by the tracker (116). It is only engaged in two situations: a tracking mode (autonomous mode) driven by the tracker (116), and a user engagement mode (manual mode) through the User Terminal. The Perception Module (112) communicates with the Robotic API (not shown) to relay motor movement decisions. These decisions, as discussed before, are either made by the user through the User Terminal or made by the tracker (116).
The Robotics Module consists of two motors (106) fitted to the elevation and azimuth wheels of the launcher (120). The launcher (120) is localized onto the target by appropriately moving these motors (106). Internally, the Robotics Module uses a controller to move the motors (106). The controller exposes motor (106) commands through its own Robotic API to receive motor (106) movement commands.
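The Robotic API is described only functionally; the sketch below assumes a thin wrapper over the controller's low-level move command, relaying decisions from either the tracker (116) or the User Terminal. The class and method names are illustrative assumptions.

```python
# Hypothetical sketch of the Robotic API: a thin relay over the controller;
# the names RoboticAPI and slew are assumptions made for illustration.
class RoboticAPI:
    def __init__(self, move_fn):
        self._move = move_fn               # low-level controller call (axis, steps)

    def slew(self, d_azimuth, d_elevation):
        """Relative move of the launcher's azimuth and elevation wheels."""
        if d_azimuth:
            self._move("AZ", d_azimuth)
        if d_elevation:
            self._move("EL", d_elevation)
```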
The retrofit apparatus (100) is provided with a user interface that enables the user to use the keyboard to move the launcher (120) (up, down, left, right); the object is then launched into the environment (122) using the said displacing system (102).
The foregoing description is not intended to limit the present invention to the precise form disclosed; on the contrary, it is intended to cover alternatives, modifications and equivalents. Various modifications to the present invention will be readily apparent to a person skilled in the art and can be made to the present invention within the spirit and scope of the invention.
5. CLAIMS
I/We Claim
1. An electronic retrofit apparatus (100) for a displacing system (102), wherein the apparatus (100) comprises:
i. a high-performance embedded system with a Graphics Processing Unit (GPU);
ii. a live image capturing device (104) with optical zoom to capture a feed for image processing;
iii. a controller for driving a plurality of motors (106) adaptable to fit to the actuators of the displacing system (102) that drive a targeting means in one or both of an azimuth and an elevation angle;
iv. a control unit to establish communication between the processing unit, the motors (106) and the image capturing device (104);
v. a user interface to show a live image feed, detected image feed and a tracked image feed of the object being selected;
vi. the high-performance embedded system with GPU receives the images captured by the live image capturing device (104), passes them to a Deep Neural Network, wherein the said system outputs multiple detected boxes for an object to be detected by a user;
and
vii. the high-performance embedded system enables the user to select a bounding box to be tracked, so that a tracker module of the embedded system tracks the selected object and sends commands to the controller driving the motors (106), and to center the launcher on the selected object to induct.
2. The apparatus (100) according to claim 1, wherein the live image capturing device (104) functions:
i. to get a live feed of the scenario;
ii. to show a plurality of detected objects and tracked objects on the screen;
and
iii. to process the captured images for detection and tracking.
3. The apparatus (100) according to claim 1, wherein the control unit is used to:
i. establish a communication protocol between processing unit and image capturing device (104) unit, for giving the commands to zoom in and out;
and
ii. establish a communication protocol between processing unit and motors (106), for giving a set of commands to move the inductor.
4. The apparatus (100) according to claim 1, wherein the user interface enables the user to:
i. enable the detection module and get the detections for a current scene;
ii. select the object to be tracked from all the detected objects;
iii. use the keyboard to move the launcher (up, down, left, right);
and
iv. send commands to zoom in and out of the image capturing device (104).
5. The apparatus (100) according to claim 1, wherein the retrofit apparatus aids in capturing a target, tracking the target, displacing it, and following the displaced object.
6. The apparatus (100) according to claim 1, wherein the retrofit apparatus enables:
i. the user to handle the existing displacement systems (102) remotely;
ii. the existing displacement systems (102) to provide automated tracking of the object, without the user performing it manually;
and
iii. the existing displacement systems (102) to achieve intended displacement from the inductor and tracking by the retrofit apparatus.
7. The apparatus (100) according to claim 1, wherein the said retrofit apparatus enables a perception module (112):
i. for perceiving an environment in the form of a live image frame and localizing the objects;
ii. to use its two subsystems:
i. a detector (114), for detecting the objects within a frame by drawing rectangles around the objects;
and
ii. a tracker (116), for tracking the selected objects preferred by the user;
iii. to switch into tracking mode and enable the discharge system to induct the selected target;
and
iv. to track the object by using the tracker (116), after induction of the object.
8. The apparatus (100) according to claim 1, wherein a Robotics Module (not shown in figures) moves a launcher (120) in accordance with the decisions made by the perception (112) module, specifically by the tracker (116).
6. DATE AND SIGNATURE
Dated this 10th day of January, 2020
Applicant Signature
(Mr. Srinivas Maddipati)
Authorized Signatory
For Zen Technologies Ltd
| # | Name | Date |
|---|---|---|
| 1 | 201841034740-PROVISIONAL SPECIFICATION [14-09-2018(online)].pdf | 2018-09-14 |
| 2 | 201841034740-FORM FOR SMALL ENTITY(FORM-28) [14-09-2018(online)].pdf | 2018-09-14 |
| 3 | 201841034740-FORM FOR SMALL ENTITY [14-09-2018(online)].pdf | 2018-09-14 |
| 4 | 201841034740-FORM 1 [14-09-2018(online)].pdf | 2018-09-14 |
| 5 | 201841034740-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [14-09-2018(online)].pdf | 2018-09-14 |
| 6 | 201841034740-EVIDENCE FOR REGISTRATION UNDER SSI [14-09-2018(online)].pdf | 2018-09-14 |
| 7 | 201841034740-DRAWINGS [14-09-2018(online)].pdf | 2018-09-14 |
| 8 | abstract 201841034740.jpg | 2018-09-17 |
| 9 | 201841034740-APPLICATIONFORPOSTDATING [13-09-2019(online)].pdf | 2019-09-13 |
| 10 | 201841034740-PostDating-(13-09-2019)-(E-6-250-2019-CHE).pdf | 2019-09-13 |
| 11 | 201841034740-APPLICATIONFORPOSTDATING [14-11-2019(online)].pdf | 2019-11-14 |
| 12 | 201841034740-PostDating-(14-11-2019)-(E-6-309-2019-CHE).pdf | 2019-11-14 |
| 13 | 201841034740-COMPLETE SPECIFICATION [10-01-2020(online)].pdf | 2020-01-10 |
| 14 | 201841034740-DRAWING [10-01-2020(online)].pdf | 2020-01-10 |
| 15 | 201841034740-Proof of Right [03-02-2020(online)].pdf | 2020-02-03 |
| 16 | 201841034740-OTHERS [03-02-2020(online)].pdf | 2020-02-03 |
| 17 | 201841034740-FORM FOR SMALL ENTITY [03-02-2020(online)].pdf | 2020-02-03 |
| 18 | 201841034740-FORM 3 [03-02-2020(online)].pdf | 2020-02-03 |
| 19 | 201841034740-EVIDENCE FOR REGISTRATION UNDER SSI [03-02-2020(online)].pdf | 2020-02-03 |
| 20 | 201841034740-ENDORSEMENT BY INVENTORS [03-02-2020(online)].pdf | 2020-02-03 |
| 21 | 201841034740-Deed of Assignment_(As Filed)_06-02-2020.pdf | 2020-02-06 |
| 22 | 201841034740-Description Complete_(After Filing)_06-02-2020.pdf | 2020-02-06 |
| 23 | 201841034740-Form1_(After Filing)_06-02-2020.pdf | 2020-02-06 |
| 24 | 201841034740-Form3_(After Filing)_06-02-2020.pdf | 2020-02-06 |
| 25 | 201841034740-Form5_(After Filing)_06-02-2020.pdf | 2020-02-06 |
| 26 | 201841034740-Form28_Small Entity_06-02-2020.pdf | 2020-02-06 |
| 27 | 201841034740-FORM 18 [07-02-2020(online)].pdf | 2020-02-07 |
| 28 | SearchHistory(6)E_19-08-2021.pdf | 2021-08-19 |
| 29 | 201841034740-FER.pdf | 2021-10-17 |
| 30 | 201841034740-ABSTRACT [01-12-2021(online)].pdf | 2021-12-01 |
| 31 | 201841034740-CLAIMS [01-12-2021(online)].pdf | 2021-12-01 |
| 32 | 201841034740-COMPLETE SPECIFICATION [01-12-2021(online)].pdf | 2021-12-01 |
| 33 | 201841034740-DRAWING [01-12-2021(online)].pdf | 2021-12-01 |
| 34 | 201841034740-FER_SER_REPLY [01-12-2021(online)].pdf | 2021-12-01 |
| 35 | 201841034740-FORM 13 [01-12-2021(online)].pdf | 2021-12-01 |
| 36 | 201841034740-MARKED COPIES OF AMENDEMENTS [01-12-2021(online)].pdf | 2021-12-01 |
| 37 | 201841034740-OTHERS [01-12-2021(online)].pdf | 2021-12-01 |
| 38 | 201841034740-PETITION UNDER RULE 137 [01-12-2021(online)].pdf | 2021-12-01 |
| 39 | 201841034740-PETITION UNDER RULE 137 [01-12-2021(online)]-1.pdf | 2021-12-01 |
| 40 | 201841034740-RELEVANT DOCUMENTS [01-12-2021(online)].pdf | 2021-12-01 |
| 41 | 201841034740-RELEVANT DOCUMENTS [01-12-2021(online)]-1.pdf | 2021-12-01 |
| 42 | 201841034740-IntimationOfGrant02-05-2024.pdf | 2024-05-02 |
| 43 | 201841034740-PatentCertificate02-05-2024.pdf | 2024-05-02 |
| 44 | 201841034740-EVIDENCE FOR REGISTRATION UNDER SSI [17-05-2024(online)].pdf | 2024-05-17 |
| 45 | 201841034740-FORM FOR SMALL ENTITY [17-05-2024(online)].pdf | 2024-05-17 |