
Method For Calculating Launch Parameters Of An Object Of Interest And System Thereof

Abstract: The present invention relates to a system and method for calculating launch parameters of an object of interest. The method comprises capturing super slow-motion video of said object, said object being initially placed within a Region of Interest (ROI), wherein the video comprises a plurality of video frames. The method further comprises masking at least a portion in a plurality of consecutive video frames and detecting said object in the plurality of consecutive masked video frames. Thereafter, centre points of the object in said masked video frames are determined to estimate outliers for the object. Further, said centre points are processed based on the estimated outliers to determine filtered centre points of the object, and the filtered centre points are converted into real-world coordinates. Lastly, launch parameters of the object are calculated based on the real-world coordinates.


Patent Information

Application #
202141028454
Filing Date
24 June 2021
Publication Number
52/2022
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Email
mohammed.faisal@ltts.com
Parent Application

Applicants

L&T TECHNOLOGY SERVICES LIMITED
DLF IT SEZ Park, 2nd Floor – Block 3 1/124, Mount Poonamallee Road Ramapuram, Chennai

Inventors

1. ARANGARAJAN PALANIAPPAN
50A, Sindhu Nagar, Aindhu panai, Kadachanallur, Tiruchengode - 638008
2. AKSHAYA BABU
RRRRA-98, (Revathy House) Poothanappilly, East Ponnurunni Road, Vyttila.P.O, Kochi - 682019
3. DESAI KAVYA
13-1-525-32, Lecturers colony, Anantapur - 515001

Specification

TECHNICAL FIELD
[0001] The present subject matter described herein, in general, discloses a method and system for calculating launch parameters of an object of interest using an imaging algorithm.

BACKGROUND
[0002] There are various techniques for calculating launch parameters of an object such as a golf ball after it is hit, namely, its speed, rotational amount, angle of elevation, and deflection angle with respect to a predetermined flight direction. However, such launch measurement techniques use a controlling/computing device connected to CCD camera(s) and involve the use of various sensors. Further, these techniques require complex systems with high frame rates (FPS) and are cost-inefficient.

[0003] Another conventional technique used to detect the launch parameters of an object involves the use of radars. A radar works by transmitting radio waves and receiving the same waves bounced off objects in its field of detection. Using this, the distance, velocity, and size of objects may be detected. However, the use of radars may require complex algorithms and multiple antennas for calculating the launch parameters of an object.

[0004] Yet another conventional technique describes the use of light patterns from a plurality of contrasting areas on an object. A computer may receive the signals generated by the light patterns as captured by a camera unit and may discriminate between the signals to determine the object's movement. However, this technique utilises various light sources and cannot detect the angle of launch of the object.

[0005] Thus, there exists a need for a system and method for calculating launch parameters of an object that is fast, less complex, and does not require complex and costly equipment.


SUMMARY
[0006] The present disclosure overcomes one or more shortcomings of the prior art and provides additional advantages discussed throughout the present disclosure. Additional features and advantages are realized through the techniques of the present disclosure. Other embodiments and aspects of the disclosure are described in detail herein and are considered a part of the claimed disclosure.

[0007] In one non-limiting embodiment of the present disclosure a method for calculating launch parameters of an object of interest is disclosed. The method comprises capturing super slow-motion video of said object, said object being initially placed within a Region of Interest (ROI), wherein the video comprises a plurality of video frames. The method further comprises masking at least a portion in a plurality of consecutive video frames and detecting said object in the plurality of consecutive masked video frames, wherein the object is detected in a non-masked portion of said video frames. Thereafter, the method comprises determining centre points of the object, in said masked video frames, to estimate outliers for the object. Further, the method comprises processing said centre points based on the estimated outliers to determine filtered centre points of the object and converting the filtered centre points into real world coordinates. Lastly, the method comprises calculating launch parameters of the object based on the real-world coordinates.

[0008] In another non-limiting embodiment of the present disclosure the launch parameters comprise at least speed and an angle of launch of the object. Further, calculating the launch parameters comprises calculating the speed and the angle of launch of the object, using at least two real-world coordinates corresponding to the filtered centre points associated with at least two video frames in which the object is detected.

[0009] In yet another non-limiting embodiment, the present disclosure recites that the object centre points are determined based on statistical parameters of the object in each video frame.

[0010] In yet another non-limiting embodiment for processing the centre points, said method further comprises generating a point vector of the centre points based on a sequence of the video frames and calculating a threshold value based on values of the point vector. Said threshold value is used to process said centre points to determine the filtered centre points.

[0011] In yet another non-limiting embodiment of the present disclosure the threshold value is calculated by calculating difference of an initial value and a final value of the point vector, after discarding edge values of the point vector, and multiplying the said calculated difference by an outlier factor.

[0012] In yet another non-limiting embodiment of the present disclosure, a system for calculating launch parameters of an object of interest is disclosed. The system comprises a capturing unit configured to capture super slow-motion video of said object, said object being initially placed within a Region of Interest (ROI), wherein the video comprises a plurality of video frames. The system further comprises a memory communicatively coupled with the capturing unit, wherein said memory stores the captured super slow-motion video. The system also comprises at least one processor communicatively coupled with the capturing unit and the memory. The at least one processor is configured to: mask at least a portion in a plurality of consecutive video frames; detect, using an object detection unit, said object in the plurality of consecutive masked video frames, wherein the object is detected in a non-masked portion of said video frames; determine, using a centre detection unit, centre points of the object in said masked video frames, to estimate outliers for the object; process, using an outliers unit, said centre points based on the estimated outliers to determine filtered centre points of the object; convert, using a conversion unit, the filtered centre points into real world coordinates; and calculate, using a calculating unit, launch parameters of the object based on the real-world coordinates.

[0013] In yet another non-limiting embodiment, the present disclosure describes that the launch parameters comprise at least speed and an angle of launch of the object. The at least one processor is configured to calculate the speed and the angle of launch of the object, using at least two real-world coordinates corresponding to the filtered centre points associated with at least two video frames in which the object is detected.

[0014] In yet another non-limiting embodiment of the present disclosure, the object centre points are determined based on statistical parameters of the object in each video frame.

[0015] In yet another non-limiting embodiment of the present disclosure, in order to process the centre points, the at least one processor is configured to: generate a point vector of the centre points based on a sequence of the video frames, and calculate a threshold value based on values of the point vector. Said threshold value is used to process said centre points to determine the filtered centre points.

[0016] In yet another non-limiting embodiment of the present disclosure, the at least one processor is configured to calculate the threshold value by calculating difference of an initial value and a final value of the point vector, after discarding edge values of the point vector, and multiply the said calculated difference by an outlier factor.

OBJECTIVES OF THE INVENTION
[0017] An objective of the present invention is to provide a system and a method for calculating launch parameters of an object of interest, using an imaging algorithm.

[0018] Another objective of the present invention is to provide an accurate system with reduced cost for calculating launch parameters of an object of interest.

[0019] Yet another objective of the present invention is to provide a less complex system for calculating launch parameters of an object of interest, using a monocular camera.

[0020] Yet another objective of the present invention is to remove dependency on identification of frame or time of impact on the object.

BRIEF DESCRIPTION OF THE DRAWINGS
[0021] The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate exemplary embodiments and, together with the description, serve to explain the disclosed embodiments. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the figures to reference like features and components. Some embodiments of system and/or methods in accordance with embodiments of the present subject matter are now described, by way of example only, and with reference to the accompanying figures, in which:

[0022] Fig. 1a and 1b illustrate a system for calculating launch parameters of an object of interest, in accordance with an embodiment of the present subject matter.

[0023] Fig. 1c illustrates a block diagram of an estimation unit of a system for calculating launch parameters of an object of interest, in accordance with an embodiment of the present subject matter.

[0024] Fig. 2 and 3 illustrate exemplary user interfaces of the system for calculating launch parameters of an object of interest, in accordance with an embodiment of the present subject matter.

[0025] Fig. 4 is a flow diagram illustrating a method for calculating launch parameters of an object of interest, in accordance with an embodiment of the present subject matter.

[0026] It should be appreciated by those skilled in the art that any block diagrams herein represent conceptual views of illustrative systems embodying the principles of the present subject matter. Similarly, it will be appreciated that any flow charts, flow diagrams, state transition diagrams, pseudo code, and the like represent various processes which may be substantially represented in computer readable medium and executed by a computer or processor, whether or not such computer or processor is explicitly shown.

DETAILED DESCRIPTION
[0027] In the present document, the word “exemplary” is used herein to mean “serving as an example, instance, or illustration.” Any embodiment or implementation of the present subject-matter described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other embodiments.

[0028] While the disclosure is susceptible to various modifications and alternative forms, specific embodiment thereof has been shown by way of example in the drawings and will be described in detail below. It should be understood, however that it is not intended to limit the disclosure to the particular forms disclosed, but on the contrary, the disclosure is to cover all modifications, equivalents, and alternatives falling within the scope of the disclosure.

[0029] The terms “comprises”, “comprising”, “include(s)”, or any other variations thereof, are intended to cover a non-exclusive inclusion, such that a setup, system or method that comprises a list of components or steps does not include only those components or steps but may include other components or steps not expressly listed or inherent to such setup or system or method. In other words, one or more elements in a system or apparatus preceded by “comprises… a” does not, without more constraints, preclude the existence of other elements or additional elements in the system or apparatus.

[0030] In the following detailed description of the embodiments of the disclosure, reference is made to the accompanying drawings that form a part hereof, and are shown by way of illustration of specific embodiments in which the disclosure may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the disclosure, and it is to be understood that other embodiments may be utilized and that changes may be made without departing from the scope of the present disclosure. The following description is, therefore, not to be taken in a limiting sense.

[0031] The present invention will be described herein below with reference to the accompanying drawings. In the following description, well known functions or constructions are not described in detail since they would obscure the description with unnecessary detail.

[0032] The present disclosure recites a system for calculating launch parameters, such as speed, angle, and spin, of an object of interest. The system can capture a super slow-motion video of the object during launch using a monocular camera. The video comprises a plurality of video frames. The system may mask at least a portion in a plurality of consecutive video frames and may detect said object in the plurality of consecutive masked video frames. Thereafter, the system may determine centre points of the object and may generate a vector of the centre points. Based on the determined centre points, the system may estimate the outliers for removing noise from the frames. According to an embodiment, noise represents any object which is not of interest and whose launch parameters the system is not interested in calculating. Furthermore, the system may convert the centre points into real-world coordinates after removal of the noise. The system may then determine the launch parameters of the object based on the real-world coordinates of the object of interest. In this manner, the system may calculate the launch parameters of the object accurately using a monocular camera, which is less time-consuming and also reduces the complexity of the system.

[0033] Referring to figure 1a, a system 100 for calculating the launch parameters of the object of interest is shown. It may be understood that the system 100 may be implemented in a variety of user devices, such as a mobile phone, a laptop computer, a desktop computer, a notebook, a mainframe computer, and a handheld device, but not limited thereto. The system 100 may comprise at least one processor 102, a memory 104, and a camera 106. The at least one processor 102 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the at least one processor is configured to fetch and execute computer-readable instructions stored in the memory.

[0034] The camera 106 may capture a super slow-motion video of the object of interest 108. The object 108 may be initially placed in a Region of Interest (ROI). For example, the monocular camera 106 may record the video at a high frame rate, such as 960 FPS. The recorded video may be stored in the memory 104. The monocular camera 106, the memory 104, and the processor 102 may communicate with each other via a wide variety of networks and protocol types, including wired networks, for example, LAN, cable, etc., and wireless networks, such as WLAN, cellular, or satellite.

[0035] In the paragraphs below, figures 1a and 1b are explained in conjunction with each other. Fig. 1b discloses the system 100 according to another embodiment of the present disclosure. The system 100 of Fig. 1b may comprise a capturing unit 202, an object detection unit 204, a centre detection unit 206, an estimation unit 208, a conversion unit 210, and a calculating unit 212, which may be communicatively coupled with a processing unit 214 and with each other. The processing unit 214 may comprise at least one processor.

[0036] According to an embodiment, the camera 106 of the system 100 may be focused parallel to the launch of the object 108 so that the launch of the object 108 may be captured more accurately. The recorded video of the launch of the object 108 may be stored in the memory 104. The video may be stored in the form of video frames. Further, the processor 102 may fetch the frames of the video. In an exemplary embodiment, the functionality of the monocular camera 106 may be performed by the capturing unit 202.

[0037] The at least one processor 102 may process the plurality of frames to detect the object of interest in the frames. According to an embodiment, the processor 102 may work in conjunction with the object detection unit 204 to detect the object of interest in the frames. The processor 102 may mask at least a portion in a plurality of consecutive video frames and may detect the object in the plurality of consecutive masked video frames. In accordance with an embodiment, the processor 102 may utilize object identification algorithms such as optical flow techniques along with a k-means segmentation technique, but not limited thereto. In one implementation, at least a portion of the plurality of video frames being masked may include the left portion of the video frames so that any undesired object, such as the movement of a golf club or a player, is not captured while calculating launch parameters of a golf ball. In another exemplary embodiment, the masking of video frames is not limited to the left portion of the video frames, and may include masking of any other portion of the video frames based on the position of the monocular camera 106. The masking of the frames is used to increase the accuracy and efficiency of the system 100, as it removes other undesired objects from the frames.
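The masking step can be pictured with a short sketch. The following is a minimal illustration in Python with NumPy, assuming the frames are arrays already in memory and that the left half is the portion to be masked; the masked fraction, like the helper name `mask_frames`, is a hypothetical choice for illustration and not prescribed by the specification:

```python
import numpy as np

def mask_frames(frames, mask_fraction=0.5):
    """Black out the left portion of each frame so that undesired objects
    (e.g. a golf club or player) are excluded before object detection."""
    masked = []
    for frame in frames:
        out = frame.copy()
        cut = int(out.shape[1] * mask_fraction)  # column where the mask ends
        out[:, :cut] = 0                         # zero out the left portion
        masked.append(out)
    return masked
```

Zeroing pixels rather than cropping keeps the frame dimensions unchanged, so centre points detected later remain in the original image coordinate system.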

[0038] The processor 102 further detects the centre point of the object in the plurality of masked frames. In accordance with an exemplary embodiment, the processor 102 may work in conjunction with the centre detection unit 206 to detect the centre points of the object in the plurality of frames. According to an embodiment, the processor 102 may use statistical parameters of the object, such as area, circularity, and other similar parameters, to determine the centre points in the frames. The processor 102 may use said centre points to estimate outliers for the object. Further, the processor 102 may process said centre points based on the estimated outliers to determine filtered centre points of the object. Particularly, said estimated outliers are used to remove any further noise in the frames, i.e., to remove any other undesired object present in the masked frames which is not of interest. In one implementation, the processor 102 may work in conjunction with the estimation unit 208 to estimate the outliers. In an exemplary embodiment, the estimation unit 208 may comprise various modules as illustrated in Fig. 1c of the present disclosure.
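One plausible realisation of centre detection from area and circularity uses image contours. The sketch below, in Python with OpenCV, is an assumption about how such a detector could look rather than the patented method itself; the thresholds (`min_area`, `max_area`, `min_circularity`) and the function name are illustrative, not values from the specification:

```python
import cv2
import numpy as np

def detect_centre(masked_frame, min_area=50.0, max_area=5000.0, min_circularity=0.7):
    """Locate the centre of a roughly circular object (e.g. a golf ball) in a
    masked frame, using contour area and circularity as statistical filters."""
    gray = cv2.cvtColor(masked_frame, cv2.COLOR_BGR2GRAY)
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    for contour in contours:
        area = cv2.contourArea(contour)
        perimeter = cv2.arcLength(contour, True)
        if perimeter == 0 or not (min_area <= area <= max_area):
            continue
        # Circularity is 1.0 for a perfect circle and drops for elongated shapes.
        circularity = 4.0 * np.pi * area / (perimeter ** 2)
        if circularity >= min_circularity:
            m = cv2.moments(contour)
            return (m["m10"] / m["m00"], m["m01"] / m["m00"])  # centroid (x, y)
    return None  # no suitable object found in this frame
```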

[0039] As shown in Fig. 1c, the estimation unit 208 may comprise a sorting module 302, a dividing module 304, a difference calculation module 306, a threshold determination module 308, and a filtering module 310. Functioning of each of said modules is described in below paragraphs in conjunction with fig.1a.

[0040] Coming back to figures 1a and 1b, according to an embodiment, in order to estimate the outliers, the processor 102 may generate a vector of the centre points based on the sequence of the video frames. In accordance with another embodiment, the processor 102 may sort the generated vector in accordance with the sequence of the video frames. According to an embodiment, the processor 102 may work in conjunction with the sorting module 302 of the estimation unit 208 to sort the centre point vector. Further, the processor 102 may discard the edge values of the vector of centre points. In an embodiment, the processor 102 may discard one-fourth of the total values at each end of the centre point vector. In an exemplary embodiment, the dividing module 304 may work in conjunction with the processor 102 and may divide the centre point vector into at least three parts to discard the edge values (the first and last quarter of the values) of the centre point vector.

[0041] The processor 102 may further calculate a difference between the initial and final values of the centre point vector, after discarding the edge values. In one implementation, the processor 102 may work in conjunction with the difference calculation module 306 to calculate the difference. The calculated difference between the initial and final values may be used to determine a threshold value. Particularly, the processor 102 may determine the threshold by multiplying the calculated difference by an outlier factor. The outlier factor may be defined as 1.5, or other values on a case-to-case basis, i.e.,
Threshold = Difference Value * Outlier Factor
According to an exemplary embodiment, the processor 102 may work in conjunction with the threshold determination module 308 to determine the threshold value.
[0042] The calculated threshold value may be used to determine the filtered centre points. In accordance with an embodiment, the processor 102 may evaluate the centre points based on said threshold to determine the filtered centre points. The processor 102 may also determine a median of the centre points. Further, the processor 102 determines centre point values which are less than “Median – Threshold” or more than “Median + Threshold” and removes these values to generate the filtered centre points. The processor 102 may also associate the frame numbers with the filtered centre points, to ensure that launch parameters are determined based on the centre points of at least two frames, which improves the accuracy. In one implementation, the processor 102 may work in conjunction with the filtering module 310 of the estimation unit 208 to determine the filtered centre points.
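Paragraphs [0040]–[0042] together describe the outlier-removal pipeline: order the centre points by frame, drop the edge quarters, derive a threshold from the remaining spread, and keep only points within the median ± threshold band. A minimal sketch of that pipeline in Python with NumPy follows, shown for a single coordinate of the centre points; how the x and y coordinates are combined is an assumption left open by the specification, and the helper name is hypothetical:

```python
import numpy as np

def filter_centre_points(points, outlier_factor=1.5):
    """Remove outlier centre point values: discard the first and last quarter
    of the frame-ordered vector, take the difference between the remaining
    initial and final values, scale it by the outlier factor to get a
    threshold, and keep values within [median - threshold, median + threshold].
    Returns (frame_index, value) pairs so frame numbers stay associated."""
    v = np.asarray(points, dtype=float)   # one value per frame, in frame order
    q = len(v) // 4
    middle = v[q:len(v) - q]              # edge values (first/last quarter) dropped
    threshold = abs(middle[-1] - middle[0]) * outlier_factor
    median = np.median(v)
    keep = np.abs(v - median) <= threshold
    return [(i, float(p)) for i, (p, k) in enumerate(zip(v, keep)) if k]
```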

[0043] Further, the processor 102 may convert the filtered centre points into real-world coordinates to determine the launch parameters of the object. The processor 102 may process the real-world coordinates to calculate the launch parameters, such as the speed and angle of launch of the object. According to an embodiment, the processor 102 may work in conjunction with the conversion unit 210 to convert the centre points into the real-world coordinates. The processor 102 may calculate the speed and the angle of launch of the object using at least two real-world coordinates corresponding to the filtered centre points associated with at least two video frames in which the object is detected. Further, the processor 102 may use any known techniques for processing the real-world coordinates to determine the launch parameters. In one implementation, the processor 102 may work in conjunction with the calculating unit 212 to calculate the launch parameters.
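The specification leaves the exact computation to known techniques; a minimal sketch using the standard distance-over-time and arctangent relations is shown below, assuming the real-world coordinates are in metres, the video was recorded at the 960 FPS mentioned earlier, and each point carries the frame number at which it was detected. The function name and the sample values are illustrative only:

```python
import math

def launch_parameters(p1, frame1, p2, frame2, fps=960.0):
    """Speed (m/s) and launch angle (degrees) from two real-world points
    (x, y) observed at the given frame numbers."""
    dx = p2[0] - p1[0]
    dy = p2[1] - p1[1]
    dt = (frame2 - frame1) / fps              # elapsed time between the frames
    speed = math.hypot(dx, dy) / dt
    angle = math.degrees(math.atan2(dy, dx))  # angle relative to the horizontal
    return speed, angle

# Illustrative use: the ball moves 5 cm across and 2 cm up over 4 frames.
speed, angle = launch_parameters((0.00, 0.00), 10, (0.05, 0.02), 14)
# speed is roughly 12.9 m/s and angle roughly 21.8 degrees
```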

[0044] In this manner, the launch parameters may be determined accurately while reducing complexity and cost of the system. This also reduces the time taken in determination of the launch parameters.

[0045] According to an embodiment of the present disclosure, the object 108 may be a golf ball, and the system 100 may determine the speed and angle of launch of the golf ball when a player hits the ball. The system 100 may calculate important launch features of the golf ball through video captured by the camera 106 using the super slow-motion technique. The effectiveness of each golf shot is dictated by the interrelationship between distance and accuracy. Although external factors including wind, air density, and friction of the landing surface play a role in this outcome, the components controlled by the player are the initial ball velocity and direction, as well as the spin/angle imparted on the ball. Thus, the system 100 may be used for training purposes in golf. In this manner, the system 100 may be useful in any sport where such training is valuable. Although this embodiment describes golf, the approach can be extended to any other sport by calculating the relevant launch parameters.

[0046] According to an embodiment, the system 100 may be a mobile device of a user. The mobile device may be used to calculate the golf launch parameters using the super slow-motion videos. The system 100 may be any device, provided it has the capabilities to perform the operations discussed in the above paragraphs. Compared to existing camera-based devices for calculating launch parameters, this is a cost-effective and simple-to-set-up solution, as it requires only a device with super slow-motion capability.

[0047] Fig. 2 and 3 illustrate a display interface of such a mobile device, which may be used to calculate the launch parameters of an object, i.e., a golf ball in this instance. The ball may be placed in a Region of Interest (ROI) and a super slow-motion video may be captured. After the video is recorded, the frames of the video are processed, in accordance with the process discussed in the above paragraphs, and parameters like speed (velocity) and angle are displayed, in real time, as shown in Fig. 3.

[0048] Fig. 4 discloses a method 400 for calculating launch parameters of an object of interest. At step 402, the method 400 comprises capturing super slow-motion video of said object, said object being initially placed within a Region of Interest (ROI). According to an embodiment, the present disclosure describes that the video comprises a plurality of video frames. In one implementation, the video may be captured by a monocular camera at 960 frames per second, which has super slow-motion video capturing capability. The camera may be placed parallel to the object such that it focuses on the launch of the object.
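As a small companion to step 402, the sketch below shows one way the recorded video could be decomposed into the plurality of frames using OpenCV; the function name and the file-based input are assumptions for illustration, since the specification does not fix how the frames are fetched from memory:

```python
import cv2

def read_frames(video_path):
    """Decompose a recorded super slow-motion video into a list of frames."""
    cap = cv2.VideoCapture(video_path)
    frames = []
    while True:
        ok, frame = cap.read()   # ok becomes False once the video is exhausted
        if not ok:
            break
        frames.append(frame)
    cap.release()
    return frames
```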

[0049] At step 404, the method 400 recites masking at least a portion in a plurality of consecutive video frames. The masking of the frames is used to increase the accuracy and efficiency of the system 100, as it removes other undesired objects from the frames. Further, masking at least a portion in the plurality of consecutive video frames is performed to keep the focus on the object of interest and to remove the dependency on identifying the frame or time of impact on said object for the launch parameter calculation. In one implementation, while calculating launch parameters of a golf ball, a left half of the frames may be masked to remove the portion where the golf club hits the golf ball, but not limited thereto. At step 406, the method 400 recites detecting said object in the plurality of consecutive masked video frames, wherein the object is detected in a non-masked portion of said video frames. The object may be detected using object detection techniques such as optical flow along with a k-means segmentation technique, but not limited thereto.
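The specification names optical flow with k-means segmentation but does not spell out the combination. One plausible reading is to compute dense optical flow between consecutive masked frames and then cluster the per-pixel motion magnitudes so that the fast-moving ball separates from the static background; the sketch below follows that reading in Python with OpenCV, with all numeric parameters chosen only for illustration:

```python
import cv2
import numpy as np

def moving_object_mask(prev_gray, curr_gray, k=2):
    """Segment the moving object between two consecutive masked frames:
    dense optical flow gives per-pixel motion, and k-means on the motion
    magnitudes separates moving pixels from the static background."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    magnitudes = np.linalg.norm(flow, axis=2).reshape(-1, 1).astype(np.float32)
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 10, 1.0)
    _, labels, centres = cv2.kmeans(magnitudes, k, None, criteria, 3,
                                    cv2.KMEANS_RANDOM_CENTERS)
    moving = int(np.argmax(centres))          # cluster with the larger motion
    return (labels.reshape(prev_gray.shape) == moving).astype(np.uint8) * 255
```

The returned binary mask can then be fed to the contour-based centre detection sketched earlier.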

[0050] At step 408, the method 400 recites determining centre points of the object, in said masked video frames, to estimate outliers for the object. The object centre points are determined based on statistical parameters of the object in each video frame, such as area, circularity, and other similar parameters. Further, said estimated outliers are used to remove any further noise in the frames, i.e., to remove any other undesired object present in the masked frames.

[0051] The method 400 at step 410 describes processing said centre points based on the estimated outliers to determine filtered centre points of the object. Said processing comprises generating a point vector of the centre points based on a sequence of the video frames, and calculating a threshold value based on values of the point vector. The threshold value is used to process said centre points to determine the filtered centre points.

[0052] According to an embodiment, in order to estimate the outliers, a vector of the centre points may be generated based on the sequence of the video frames. According to an embodiment, the threshold value is calculated by calculating the difference between an initial value and a final value of the point vector, after discarding the edge values of the point vector. The threshold may be determined by multiplying the calculated difference by an outlier factor, which may have a value of 1.5, i.e.,
Threshold = Difference Value * Outlier Factor
[0053] At step 412, the method 400 recites converting the filtered centre points into real world coordinates. Further, at step 414, the method recites calculating launch parameters of the object based on the real-world coordinates. The launch parameters comprise at least the speed and an angle of launch of the object. According to an embodiment, calculating the launch parameters comprises calculating the speed and the angle of launch of the object, using at least two real-world coordinates corresponding to the filtered centre points associated with at least two video frames in which the object is detected.

[0054] In this manner, the launch parameters may be determined accurately while reducing processing time, complexity and cost of the system.

[0055] The illustrated steps are set out to explain the exemplary embodiments shown, and it should be anticipated that ongoing technological development will change the manner in which particular functions are performed. These examples are presented herein for purposes of illustration, and not limitation. Further, the boundaries of the functional building blocks have been arbitrarily defined herein for the convenience of the description. Alternative boundaries can be defined so long as the specified functions and relationships thereof are appropriately performed.

[0056] Alternatives (including equivalents, extensions, variations, deviations, etc., of those described herein) will be apparent to persons skilled in the relevant art(s) based on the teachings contained herein. Such alternatives fall within the scope and spirit of the disclosed embodiments.

[0057] Furthermore, one or more computer-readable storage media may be utilized in implementing embodiments consistent with the present disclosure. A computer-readable storage medium refers to any type of physical memory on which information or data readable by a processor may be stored. Thus, a computer-readable storage medium may store instructions for execution by one or more processors, including instructions for causing the processor(s) to perform steps or stages consistent with the embodiments described herein. The term “computer-readable medium” should be understood to include tangible items and exclude carrier waves and transient signals, i.e., are non-transitory. Examples include random access memory (RAM), read-only memory (ROM), volatile memory, non-volatile memory, hard drives, CD ROMs, DVDs, flash drives, disks, and any other known physical storage media.

[0058] Suitable processors/controllers include, by way of example, a general purpose processor, a special purpose processor, a conventional processor, a digital signal processor (DSP), a plurality of microprocessors, one or more microprocessors in association with a DSP core, a controller, a microcontroller, Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs) circuits, any other type of integrated circuit (IC), and/or a state machine.


CLAIMS:
1. A method for calculating launch parameters of an object of interest, the method comprising:
capturing super slow-motion video of said object, said object being initially placed within a Region of Interest (ROI), wherein the video comprises a plurality of video frames;
masking at least a portion in a plurality of consecutive video frames;
detecting said object in the plurality of consecutive masked video frames, wherein the object is detected in a non-masked portion of said video frames;
determining centre points of the object, in said masked video frames, to estimate outliers for the object;
processing said centre points based on the estimated outliers to determine filtered centre points of the object;
converting the filtered centre points into real world coordinates; and
calculating launch parameters of the object based on the real-world coordinates.

2. The method as claimed in claim 1, wherein the launch parameters comprise at least speed and an angle of launch of the object,
wherein calculating the launch parameters comprises calculating the speed and the angle of launch of the object, using at least two real-world coordinates corresponding to the filtered centre points associated with at least two video frames in which the object is detected.

3. The method as claimed in claim 1, wherein the object centre points are determined based on statistical parameters of the object in each video frame.

4. The method as claimed in claim 1, wherein processing the centre points further comprises:
generating a point vector of the centre points based on a sequence of the video frames; and
calculating a threshold value based on values of the point vector; and
wherein said threshold value is used to process said centre points to determine the filtered centre points.

5. The method as claimed in claim 4, wherein the threshold value is calculated by:
calculating difference of an initial value and a final value of the point vector, after discarding edge values of the point vector, and multiplying the said calculated difference by an outlier factor.

6. A system for calculating launch parameters of an object of interest, the system comprising:
a capturing unit configured to capture super slow-motion video of said object, said object being initially placed within a Region of Interest (ROI), wherein the video comprises a plurality of video frames;
a memory communicatively coupled with the capturing unit, wherein said memory stores the captured super slow motion video; and
at least one processor communicatively coupled with the capturing unit and the memory, wherein the at least one processor is configured to:
mask at least a portion in a plurality of consecutive video frames;
detect, using an object detection unit, said object in the plurality of consecutive masked video frames;
determine, using a centre detection unit, centre points of the object in said masked video frames, to estimate outliers for the object;
process, using an outliers unit, said centre points based on the estimated outliers to determine filtered centre points of the object;
convert, using a conversion unit, the filtered centre points into real world coordinates; and
calculate, using a calculating unit, launch parameters of the object based on the real-world coordinates.

7. The system as claimed in claim 6, wherein the launch parameters comprise at least speed and an angle of launch of the object,
wherein the at least one processor is configured to calculate the speed and the angle of launch of the object, using at least two real-world coordinates corresponding to the filtered centre points associated with at least two video frames in which the object is detected.

8. The system as claimed in claim 6, wherein the object centre points are determined based on statistical parameters of the object in each video frame.

9. The system as claimed in claim 6, wherein in order to process the centre points, the at least one processor is configured to:
generate a point vector of the centre points based on a sequence of the video frames; and
calculate a threshold value based on values of the point vector; and
wherein said threshold value is used to process said centre points to determine the filtered centre points.

10. The system as claimed in claim 9, wherein the at least one processor is configured to calculate the threshold value by calculating difference of an initial value and a final value of the point vector, after discarding edge values of the point vector, and multiply the said calculated difference by an outlier factor.

Documents

Application Documents

# Name Date
1 202141028454-STATEMENT OF UNDERTAKING (FORM 3) [24-06-2021(online)].pdf 2021-06-24
2 202141028454-PROVISIONAL SPECIFICATION [24-06-2021(online)].pdf 2021-06-24
3 202141028454-FORM 1 [24-06-2021(online)].pdf 2021-06-24
4 202141028454-DRAWINGS [24-06-2021(online)].pdf 2021-06-24
5 202141028454-DECLARATION OF INVENTORSHIP (FORM 5) [24-06-2021(online)].pdf 2021-06-24
6 202141028454-Proof of Right [06-12-2021(online)].pdf 2021-12-06
7 202141028454-Correspondence_Amend the email addresses_14-12-2021.pdf 2021-12-14
8 202141028454-DRAWING [18-03-2022(online)].pdf 2022-03-18
9 202141028454-CORRESPONDENCE-OTHERS [18-03-2022(online)].pdf 2022-03-18
10 202141028454-COMPLETE SPECIFICATION [18-03-2022(online)].pdf 2022-03-18
11 202141028454-Proof of Right [22-03-2022(online)].pdf 2022-03-22
12 202141028454-FORM-26 [13-10-2022(online)].pdf 2022-10-13
13 202141028454-Form-18_Examination Request_13-10-2022.pdf 2022-10-13
14 202141028454-Correspondence_Form-18_13-10-2022.pdf 2022-10-13
15 202141028454-FER.pdf 2023-01-30
16 202141028454-OTHERS [25-07-2023(online)].pdf 2023-07-25
17 202141028454-FER_SER_REPLY [25-07-2023(online)].pdf 2023-07-25
18 202141028454-DRAWING [25-07-2023(online)].pdf 2023-07-25
19 202141028454-COMPLETE SPECIFICATION [25-07-2023(online)].pdf 2023-07-25
20 202141028454-CLAIMS [25-07-2023(online)].pdf 2023-07-25
21 202141028454-ABSTRACT [25-07-2023(online)].pdf 2023-07-25
22 202141028454-RELEVANT DOCUMENTS [13-02-2025(online)].pdf 2025-02-13
23 202141028454-MARKED COPIES OF AMENDEMENTS [13-02-2025(online)].pdf 2025-02-13
24 202141028454-FORM 13 [13-02-2025(online)].pdf 2025-02-13
25 202141028454-AMENDED DOCUMENTS [13-02-2025(online)].pdf 2025-02-13

Search Strategy

1 202141028454E_27-01-2023.pdf
2 202141028454AE_06-12-2023.pdf