
Apparatus And Method For Monitoring And Controlling Low Amplitude Intensity Variation Of Moving Objects In Imaging System

Abstract: The present disclosure relates to an apparatus and method for monitoring and controlling a low light imaging system. The apparatus (100) comprises an Image Intensifier Tube (IIT) (102), an image sensor (104) and a Digital Image Processing (DSP) unit (106). The IIT (102) has gain adjusting capability and is configured to receive and filter optical light associated with an optical scene. The received optical light comprises optical parameters of a first predefined value selected from pixel amplitude and pixel density of image frames, and wavelength of the optical light. The IIT (102) amplifies the first predefined value to a second predefined value while preserving the spatial dimension. The image sensor (104) senses the second predefined value and correspondingly generates a set of electrical signals. The DSP unit (106) extracts the optical parameters from the received set of electrical signals, periodically monitors and processes the extracted optical parameters, and correspondingly adjusts the gain of the IIT, and the frame rate and resolution of the image sensor, to detect one or more object parameters in one or more image frames.


Patent Information

Application #:
Filing Date: 16 March 2021
Publication Number: 38/2022
Publication Type: INA
Invention Field: ELECTRONICS
Status:
Parent Application:

Applicants

Bharat Electronics Limited
Corporate Office, Outer Ring Road, Nagavara, Bangalore - 560045, Karnataka, India.

Inventors

1. SREEVALSEN N
EO&L/PDIC, Bharat Electronics Limited, Jalahalli Post, Bangalore - 560013, Karnataka, India.
2. DARAM SHYAM SUNDER
EO&L/PDIC, Bharat Electronics Limited, Jalahalli Post, Bangalore - 560013, Karnataka, India.

Specification

Claims:

1. An apparatus (100) for monitoring and controlling a low light imaging system, the apparatus (100) comprising:
an Image Intensifier Tube (IIT) (102) having gain adjusting capability, the IIT (102) configured to:
receive and filter optical light associated with an optical scene comprising one or more image frames having one or more objects, the received optical light having optical parameters of a first predefined value, which is selected from pixel amplitude, and pixel density of the one or more image frames, and wavelength of the optical light; and
amplify the first predefined value of the received optical light to a second predefined value, while preserving spatial dimension of the optical scene;
an image sensor (104) operatively coupled to the IIT (102), the image sensor (104) configured to sense the optical light having the second predefined value and correspondingly generate a set of electrical signals; and
a Digital Image Processing (DSP) unit (106) operatively coupled to the IIT (102) and the image sensor (104), the DSP unit (106) comprising one or more processors, being any or a combination of Field Programmable Gate Arrays (FPGAs), a Digital Signal Processor (DSP), and a controller, operatively coupled to a memory (108) storing one or more instructions executable by the one or more processors, and configured to:
extract the optical parameters from the received set of electrical signals;
periodically monitor and process the extracted optical parameters for each of the one or more image frames associated with the optical scene; and correspondingly adjust the gain of the IIT (102), and a frame rate and resolution of the image sensor to detect one or more object parameters in the one or more image frames of the optical scene.
2. The apparatus (100) as claimed in claim 1, wherein when the image sensor (104) is adapted to generate the set of electrical signals in analog form, the apparatus (100) comprises an Analog to Digital Converter (ADC) (110) operatively configured between the image sensor (104) and the DSP unit (106), and wherein the ADC (110) is configured to convert the set of electrical signals from analog form into digital form and transmit the digital form to the DSP unit (106).
3. The apparatus (100) as claimed in claim 2, wherein the DSP unit (106) operates the image sensor (104) at a maximum frame rate in a binning mode, and operates the ADC (110) with full ADC data width, and correspondingly monitors a pattern in the rate of change of pixel amplitude and the direction vector of pixel movement, and a pattern of change of size of an object of interest in the one or more image frames, to identify whether the one or more objects are moving using the changing frame rate, and also to identify noise and bad pixels from the pattern.
4. The apparatus (100) as claimed in claim 1, wherein the DSP unit (106) is configured to detect the one or more objects in the one or more image frames, and identify a type of the detected one or more objects in the one or more image frames upon comparing the detected one or more object parameters with a dataset comprising pre-defined object parameters of known objects.
5. The apparatus (100) as claimed in claim 4, wherein the DSP unit (106) is configured to identify a region of interest in the one or more image frames based on the detected one or more object parameters, and correspondingly zoom the focus of the apparatus (100) over the region of interest.
6. The apparatus (100) as claimed in claim 1, wherein the one or more objects in the optical scene comprise any or a combination of humans, animals, electrical discharge, plumes from rockets and missiles, and emissions from aircraft and helicopters.
7. A method for monitoring and controlling a low light imaging system, the method comprising the steps of:
receiving and filtering, by an Image Intensifier Tube (IIT) (102), optical light associated with an optical scene comprising one or more image frames having one or more objects, the received optical light having optical parameters of a first predefined value, which is selected from pixel amplitude, and pixel density of the one or more image frames, and wavelength of the optical light, and
amplifying, by IIT (102), the first predefined value of the received optical light to a second predefined value, while preserving spatial dimension of the optical scene;
sensing, by an image sensor (104) having binning capability, the optical light having the second predefined value, and correspondingly generating a set of electrical signals;
extracting, by a Digital Signal Processing (DSP) unit (106) comprising one or more processors, being any or a combination of Field Programmable Gate Arrays (FPGAs) and a Digital Signal Processor (DSP), the optical parameters from the received set of electrical signals;
periodically monitoring and processing, by the DSP unit (106), the extracted optical parameters for each of the one or more image frames associated with the optical scene; and correspondingly adjusting the gain of the IIT (102), and a frame rate and resolution of the image sensor, to detect one or more object parameters in the one or more image frames of the optical scene.
8. The method as claimed in claim 7, wherein when the image sensor (104) is adapted to generate the set of electrical signals in analog form, the method comprises the step of converting, by an Analog to Digital Converter (ADC) (110) operatively configured between the image sensor (104) and the DSP unit (106), the analog form of the set of electrical signals into digital form and transmitting the digital form to the DSP unit (106).
9. The method as claimed in claim 8, wherein the method comprises the step of operating, by the DSP unit (106), the image sensor (104) at a maximum frame rate in the binning mode, and the ADC (110) with full ADC data width, and correspondingly monitoring a pattern in the rate of change of pixel amplitude and the direction vector of pixel movement, and a pattern of change of size of the one or more objects of interest in the one or more image frames, to identify whether the one or more objects of interest are moving using the changing frame rate, and also to identify noise and bad pixels from the pattern.
10. The method as claimed in claim 7, wherein the method comprises the steps of:
detecting, by the DSP unit (106), the one or more objects in the one or more image frames upon identification of the one or more object parameters;
identifying, by the DSP unit (106), a type of the detected one or more objects in the one or more image frames upon comparing the detected one or more object parameters with a dataset comprising pre-defined object parameters of known objects; and
identifying, by the DSP unit (106), a region of interest in the one or more image frames based on the detected one or more object parameters, and correspondingly focusing over the region of interest.
Description:

TECHNICAL FIELD
[0001] The present invention generally relates to video processing systems. Particularly, the present invention relates to an apparatus and method for monitoring and controlling a low light imaging system for efficiently identifying and capturing high-quality images of moving objects.

BACKGROUND
[0002] Background description includes information that may be useful in understanding the present invention. It is not an admission that any of the information provided herein is prior art or relevant to the presently claimed invention, or that any publication specifically or implicitly referenced is prior art.
[0003] Generally, in a moving scene capturing mode, an imaging device captures an image of a subject using a group of lens elements, sensor, or pixel array, where each pixel converts the light intensity pattern into an electrical signal according to a pre-determined driving clock signal. The obtained image frames are then temporarily stored in a buffer memory. After capturing the image from the pixel array, the next step is to process the image to get desired quality images.
[0004] The digital camera may also include a Digital Signal Processor (DSP), which determines a frame rate with which the image frames are filed, where the DSP may variably determine the frame rate according to a motion data speed of a subject in the frames. The digital camera may further include a recording medium in which the filed image frames are stored.
[0005] Image intensifiers are employed to amplify the light from an optical scene by first converting photons to electrons, then performing electron multiplication, and finally converting the electrons back to photons. Image intensifiers are useful for amplifying optical scenes in very low light environments. For video recording purposes and the protection of the video camera sensors, the gain of the image intensifier must be controlled to preserve the desired optical data. This control necessitates an Automatic Gain Control (AGC) system.
[0006] The United States Patent Document Number US 8,319,859 B2 (date of patent: Nov. 27, 2012), titled “Digital camera having a variable frame rate and method of controlling the digital camera”, by Won-jung Kim and Kang-min Lee, relates to a digital camera capable of variably setting a frame rate. The digital camera includes an imaging device, which outputs a plurality of temporally continuous image frames when photographing a moving image. The digital camera also includes a digital signal processor that determines a frame rate with which the image frames are filed, wherein the digital signal processor variably determines the frame rate according to a motion data speed of a subject in the frames. The digital camera further includes a recording medium in which the filed image frames are stored. Accordingly, as the frame rate is varied according to the motion of a subject, the motion of the subject can be smoothly realized in a high motion speed section from a reproduction perspective, and substantially repeated image frames can be removed in a low speed motion section. Thus, the size of moving image files can be reduced, which results in saving memory.
[0007] Another United States Patent Document Number US 2003/0107648 A1 (publication date: Jun. 12, 2003), titled “Surveillance system and method with adaptive frame rate”, by Richard Stewart, Keith Trahan, David Chesavage, Sean Casey, Michael Rome and Chris Kokinakes, discloses a surveillance system, which includes video surveillance cameras that are in various locations sought to be monitored. Each camera is associated with a variable frame rate that is faster when motion is detected in the location and slower when little or no motion is detected, to improve resolution when needed. A system hub receives video feeds from the cameras and sends them on to wireless clients upon client request.
[0008] A non-patent literature published by Kim R. Fowler, Senior Member, IEEE, titled as “Automatic Gain Control for Image-Intensified Camera” IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, VOL. 53, NO. 4, AUGUST 2004, relates to the stability of a nonlinear, sampled-data Automatic Gain Control (AGC) for image-intensified cameras. The AGC increases the intensifier gain if the video scene is too dim and decreases the gain if the video scene is too bright. Otherwise, the AGC does not alter the gain. The gain of the system is stable, i.e., the gain does not oscillate between two values, for all possible static video scenes, if the thresholds for defining both dim and bright scenes and the gain factor are chosen properly. This paper derives the regions for gain stability and provides the criteria for choosing the thresholds and the gain factor.
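By way of illustration only, the following minimal Python sketch shows the kind of threshold-based gain update such an AGC describes: raise the intensifier gain when the scene is too dim, lower it when too bright, and hold it otherwise. The threshold values, gain step and function name are assumptions for illustration, not taken from the cited paper.

    # Illustrative threshold-based AGC update with a dead band; all constants are
    # hypothetical and must be chosen so that the gain does not oscillate.
    DIM_THRESHOLD = 60       # mean pixel level below which the scene is "too dim"
    BRIGHT_THRESHOLD = 180   # mean pixel level above which the scene is "too bright"
    GAIN_STEP = 1.1          # multiplicative gain adjustment factor

    def agc_update(mean_pixel_level, gain, gain_min=1.0, gain_max=1.0e4):
        """Return the intensifier gain to use for the next frame."""
        if mean_pixel_level < DIM_THRESHOLD:
            gain *= GAIN_STEP        # scene too dim: raise the gain
        elif mean_pixel_level > BRIGHT_THRESHOLD:
            gain /= GAIN_STEP        # scene too bright: lower the gain
        # otherwise the gain is left unchanged (dead band prevents oscillation)
        return min(max(gain, gain_min), gain_max)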
[0009] The above-cited prior arts work on fixed-gain intensifiers and fixed frame width and resolution of image sensors; thus, the cited prior arts are incapable of capturing good quality images in low light conditions and are also inefficient in capturing images of moving objects such as moving animals, humans, electric discharge, the exhaust of rockets and missiles, and emissions from aircraft and helicopters.
[0010] Therefore, there is a need in the art for providing an apparatus and method for controlling the gain of the Image Intensifier Tube (IIT) and adjusting the frame rate and resolution of the image sensor for achieving optimum performance in low light imaging systems, especially for capturing images of moving objects.

OBJECTS OF THE PRESENT DISCLOSURE
[0011] Some of the objects of the present disclosure, which at least one embodiment herein satisfies are as listed herein below.
[0012] It is an object of the present disclosure to control the gain of the IIT and adjust frame rate and resolution of the image sensor for achieving optimum performance in low light imaging system.
[0013] It is an object of the present disclosure to provide an apparatus that captures only the area of interest and provides its coordinates, but is not limited to a particular area, which is recorded at a high frame rate with improved gain and bit resolution.
[0014] It is an object of the present disclosure to provide a better system for zooming multiple objects of interest independently; thus, by providing selective zooming, the user can have a comprehensive understanding of the event.
[0015] It is an object of the present disclosure to reduce memory usage and increase the processing capability of the imaging system.
[0016] It is an object of the present disclosure to utilize the binning mode of the image sensor to achieve maximum frame rate while maintaining the high pixel resolution for object location identification.
[0017] It is an object of the present disclosure to utilize IIT to amplify the light from an optical scene while preserving the spatial dimensions of the scene.
[0018] It is an object of the present disclosure to capture images of moving objects such as moving animals, humans, electric discharge, the exhaust of rockets and missiles, and emissions from aircraft and helicopters.

SUMMARY
[0019] The present invention generally relates to video processing systems. Particularly, the present invention relates to an apparatus and method for monitoring and controlling a low light imaging system for efficiently identifying and capturing high quality images of moving objects.
[0020] An aspect of the present disclosure pertains to an apparatus for monitoring and controlling a low light imaging system. The apparatus may comprise an Image Intensifier Tube (IIT), an image sensor, and a Digital Image Processing (DSP) unit. The IIT may have gain adjusting capability, and the IIT may be configured to receive and filter optical light associated with an optical scene which may comprise one or more image frames having one or more objects. Further, the received optical light may have optical parameters of a first predefined value, which is selected from pixel amplitude, and pixel density of the one or more image frames, and wavelength of the optical light. Furthermore, the first predefined value of the received optical light is amplified to a second predefined value by the IIT, while preserving the spatial dimension of the optical scene. The image sensor may be operatively coupled to the IIT, and may be configured to sense the optical light having the second predefined value and correspondingly generate a set of electrical signals. The DSP unit may be operatively coupled to the IIT and the image sensor. Also, the DSP unit may comprise one or more processors which may be operatively coupled to a memory for storing one or more instructions executable by the one or more processors. The DSP unit may be implemented as any or a combination of Field Programmable Gate Arrays (FPGAs), a Digital Signal Processor (DSP), or a controller. Further, the DSP unit may be configured to extract the optical parameters from the received set of electrical signals. The extracted optical parameters for each of the one or more image frames associated with the optical scene may be periodically monitored and processed. Correspondingly, the DSP unit may adjust the gain of the IIT, and a frame rate and resolution of the image sensor, to detect one or more object parameters in the one or more image frames of the optical scene.
[0021] In an aspect, when the image sensor generates the set of electrical signals in analog form, the apparatus may comprise an Analog to Digital Converter (ADC), which may be operatively configured between the image sensor and the DSP unit. The ADC may be configured to convert the set of electrical signals from analog form into digital form and transmit the digital form to the DSP unit.
[0022] In an aspect, the DSP unit may operate the image sensor at a maximum frame rate in a binning mode. In addition, the DSP unit may operate the ADC with full ADC data width, and correspondingly, the DSP unit may monitor a pattern in the rate of change of pixel amplitude and the direction vector of pixel movement, and a pattern of change of size of an object of interest in the one or more image frames. Thus, the DSP unit is capable of identifying whether the one or more objects are moving using the changing frame rate, and also identifying noise and bad pixels from the pattern.
[0023] In an aspect, the DSP unit may be configured to detect the one or more objects in the one or more image frames, and identify a type of the detected one or more objects in the one or more image frames upon comparing the detected one or more object parameters with a dataset comprising pre-defined object parameters of known objects.
[0024] In an aspect, the DSP unit may be configured to identify a region of interest in the one or more image frames based on the detected one or more object parameters, and correspondingly focus the apparatus over the region of interest.
[0025] In an aspect, the one or more objects in the optical scene may comprise any or a combination of humans, animals, electrical discharge, plumes from rockets and missiles, and emissions from aircraft and helicopters.
[0026] In an aspect, a method for monitoring and controlling a low light imaging system may comprise receiving and filtering, by the IIT, optical light associated with the optical scene, which may comprise one or more image frames having one or more objects. The received optical light has optical parameters of a first predefined value, which is selected from pixel amplitude, and pixel density of the one or more image frames, and wavelength of the optical light. The first predefined value of the received optical light may be amplified to a second predefined value by using the IIT, while preserving the spatial dimension of the optical scene. Further, the image sensor may have binning capability, and the method may comprise a step of sensing the optical light having the second predefined value, and correspondingly generating a set of electrical signals. Further, the optical parameters may be extracted from the received set of electrical signals by using the DSP unit. The method may then involve periodic monitoring and processing, by the DSP unit, of the extracted optical parameters for each of the one or more image frames associated with the optical scene. Further, the method may comprise the step of adjusting the gain of the IIT, and a frame rate and resolution of the image sensor, to detect one or more object parameters in the one or more image frames of the optical scene.
[0027] In an aspect, when the image sensor is adapted to generate the set of electrical signals in analog form, the ADC may be operatively configured between the image sensor and the DSP unit. The method may comprise the step of converting the analog form of the set of electrical signals into digital form and sending the digital form to the DSP unit.
[0028] In an aspect, the method may comprise the step of operating, by the DSP unit, the image sensor at a maximum frame rate in the binning mode, and the ADC with full ADC data width, and correspondingly monitoring a pattern in the rate of change of pixel amplitude and the direction vector of pixel movement, and a pattern of change of size of the one or more objects of interest in the one or more image frames, to identify whether the one or more objects of interest are moving using the changing frame rate, and also to identify noise and bad pixels from the pattern.
[0029] In an aspect, the method may comprise the step of detecting, by the DSP unit, the one or more objects in the one or more image frames upon identification of the one or more object parameters. Further, the method may comprise the step of identifying, by the DSP unit, a type of the detected one or more objects in the one or more image frames upon comparing the detected one or more object parameters with a dataset comprising pre-defined object parameters of known objects. Finally, the method may comprise the step of identifying, by the DSP unit, a region of interest in the one or more image frames based on the detected one or more object parameters, and correspondingly focusing the IIT over the region of interest.

BRIEF DESCRIPTION OF THE DRAWINGS
[0030] The accompanying drawings are included to provide a further understanding of the present disclosure, and are incorporated in and constitute a part of this specification. The drawings illustrate exemplary embodiments of the present disclosure and, together with the description, serve to explain the principles of the present disclosure.
[0031] The diagrams are for illustration only and are thus not a limitation of the present disclosure, wherein:
[0032] FIG. 1 illustrates an exemplary block diagram of the proposed apparatus, in accordance with an exemplary embodiment of the present disclosure.
[0033] FIG. 2 illustrates representation of the proposed apparatus, in accordance with an exemplary embodiment of the present disclosure.
[0034] FIG. 3 illustrates steps involved in the proposed method, in accordance with an exemplary embodiment of the present disclosure.
[0035] FIGS. 4A-4B illustrate steps involved in Initialization, Monitoring, Detection, Processing, Identification and Classification of the proposed method of FIG. 3, in accordance with an exemplary embodiment of the present disclosure.

DETAILED DESCRIPTION
[0036] The following is a detailed description of embodiments of the disclosure depicted in the accompanying drawings. The embodiments are in such detail as to clearly communicate the disclosure. However, the amount of detail offered is not intended to limit the anticipated variations of embodiments; on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the present disclosure as defined by the appended claims.
[0037] Various terms as used herein are shown below. To the extent a term used in a claim is not defined below, it should be given the broadest definition persons in the pertinent art have given that term as reflected in printed publications and issued patents at the time of filing.
[0038] In some embodiments, the numerical parameters set forth in the written description and attached claims are approximations that can vary depending upon the desired properties sought to be obtained by a particular embodiment. In some embodiments, the numerical parameters should be construed in light of the number of reported significant digits and by applying ordinary rounding techniques. Notwithstanding that the numerical ranges and parameters setting forth the broad scope of some embodiments of the invention are approximations, the numerical values set forth in the specific examples are reported as precisely as practicable. The numerical values presented in some embodiments of the invention may contain certain errors necessarily resulting from the standard deviation found in their respective testing measurements.
[0039] As used in the description herein and throughout the claims that follow, the meaning of “a,” “an,” and “the” includes plural reference unless the context clearly dictates otherwise. Also, as used in the description herein, the meaning of “in” includes “in” and “on” unless the context clearly dictates otherwise.
[0040] The recitation of ranges of values herein is merely intended to serve as a shorthand method of referring individually to each separate value falling within the range. Unless otherwise indicated herein, each individual value is incorporated into the specification as if it were individually recited herein. All methods described herein can be performed in any suitable order unless otherwise indicated herein or otherwise clearly contradicted by context. The use of any and all examples, or exemplary language (e.g. “such as”) provided with respect to certain embodiments herein is intended merely to better illuminate the invention and does not pose a limitation on the scope of the invention otherwise claimed. No language in the specification should be construed as indicating any non-claimed element essential to the practice of the invention.
[0041] Groupings of alternative elements or embodiments of the invention disclosed herein are not to be construed as limitations. Each group member can be referred to and claimed individually or in any combination with other members of the group or other elements found herein. One or more members of a group can be included in, or deleted from, a group for reasons of convenience and/or patentability. When any such inclusion or deletion occurs, the specification is herein deemed to contain the group as modified thus fulfilling the written description of all groups used in the appended claims.
[0042] The present invention generally relates to video processing systems. Particularly, the present invention relates to an apparatus and method for monitoring and controlling a low light imaging system for efficiently identifying and capturing high quality images of moving objects.
[0043] An aspect of the present disclosure pertains to an apparatus for monitoring and controlling a low light imaging system. The apparatus can include an Image Intensifier Tube (IIT), an image sensor, and a Digital Image Processing (DSP) unit. The IIT can have gain adjusting capability, and the IIT can be configured to receive and filter optical light associated with an optical scene which can include one or more image frames having one or more objects. Further, the received optical light can have optical parameters of a first predefined value, which is selected from pixel amplitude, and pixel density of the one or more image frames, and wavelength of the optical light. Furthermore, the first predefined value of the received optical light is amplified to a second predefined value by the IIT, while preserving the spatial dimension of the optical scene. The image sensor can be operatively coupled to the IIT, and can be configured to sense the optical light having the second predefined value and correspondingly generate a set of electrical signals. The DSP unit can be operatively coupled to the IIT and the image sensor. Also, the DSP unit can include one or more processors which can be operatively coupled to a memory for storing one or more instructions executable by the one or more processors. Further, the DSP unit can be configured to extract the optical parameters from the received set of electrical signals. The extracted optical parameters for each of the one or more image frames associated with the optical scene can be periodically monitored and processed. Correspondingly, the DSP unit can adjust the gain of the IIT, and a frame rate and resolution of the image sensor, to detect one or more object parameters in the one or more image frames of the optical scene.
[0044] In an embodiment, when the image sensor generates the set of electrical signals in analog form, the apparatus can include an Analog to Digital Converter (ADC), which can be operatively configured between the image sensor and the DSP unit. The ADC can be configured to convert the set of electrical signals from analog form into digital form and transmit the digital form to the DSP unit.
[0045] In an embodiment, the DSP unit can operate the image sensor at a maximum frame rate in a binning mode. In addition, the DSP unit can operate the ADC with full ADC data width, and correspondingly the DSP unit can monitor a pattern in the rate of change of pixel amplitude and the direction vector of pixel movement, and a pattern of change of size of an object of interest in the one or more image frames. Thus, the DSP unit is capable of identifying whether the one or more objects are moving using the changing frame rate, and also identifying noise and bad pixels from the pattern.
[0046] In an embodiment, the DSP unit can be configured to detect the one or more objects in the one or more image frames, and identify a type of the detected one or more objects in the one or more image frames upon comparing the detected one or more object parameters with a dataset that can include pre-defined object parameters of known objects.
[0047] In an embodiment, the DSP unit can be configured to identify a region of interest in the one or more image frames based on the detected one or more object parameters, and correspondingly focus the apparatus over the region of interest.
[0048] In an embodiment, the one or more objects in the optical scene can include any or a combination of humans, animals, electrical discharge, plumes from rockets and missiles, and emissions from aircraft and helicopters.
[0049] In an embodiment, a method for monitoring and controlling a low light imaging system can include receiving and filtering, by the IIT, optical light associated with the optical scene, which can include one or more image frames having one or more objects. The received optical light has optical parameters of a first predefined value, which is selected from pixel amplitude, and pixel density of the one or more image frames, and wavelength of the optical light. The first predefined value of the received optical light can be amplified to a second predefined value by using the IIT, while preserving the spatial dimension of the optical scene. Further, the image sensor can have binning capability, and the method can include a step of sensing the optical light having the second predefined value, and correspondingly generating a set of electrical signals. Further, the optical parameters can be extracted from the received set of electrical signals by using the DSP unit. The method can then involve periodic monitoring and processing, by the DSP unit, of the extracted optical parameters for each of the one or more image frames associated with the optical scene. Further, the method can include the step of adjusting the gain of the IIT, and a frame rate and resolution of the image sensor, to detect one or more object parameters in the one or more image frames of the optical scene.
[0050] In an embodiment, when the image sensor is adapted to generate the set of electrical signals in analog form, the ADC can be operatively configured between the image sensor and the DSP unit. The method can include the step of converting the analog form of the set of electrical signals into digital form and sending the digital form to the DSP unit.
[0051] In an embodiment, the method can include the step of operating, by the DSP unit, the image sensor at a maximum frame rate in the binning mode, and the ADC with full ADC data width, and correspondingly monitoring a pattern in the rate of change of pixel amplitude and the direction vector of pixel movement, and a pattern of change of size of the one or more objects of interest in the one or more image frames, to identify whether the one or more objects of interest are moving using the changing frame rate, and also to identify noise and bad pixels from the pattern.
[0052] In an embodiment, the method can include the step of detecting, by the DSP unit, the one or more objects in the one or more image frames upon identification of the one or more object parameters. Further, the method can include the step of identifying, by the DSP unit, a type of the detected one or more objects in the one or more image frames upon comparing the detected one or more object parameters with a dataset including pre-defined object parameters of known objects. Finally, the method can include the step of identifying, by the DSP unit, a region of interest in the one or more image frames based on the detected one or more object parameters, and correspondingly focusing the IIT over the region of interest.
[0053] FIG. 1 illustrates an exemplary block diagram of the proposed apparatus, in accordance with an exemplary embodiment of the present disclosure.
[0054] As illustrated in FIG. 1, the apparatus 100 for monitoring and controlling the low light imaging system can include an Image Intensifier Tube (IIT) 102, an image sensor 104, and a Digital Image Processing (DSP) unit 106. The IIT 102 can have gain adjusting capability, and the IIT 102 can be configured to receive and filter optical light associated with an optical scene which can include one or more image frames having one or more objects. Further, the received optical light can have optical parameters of a first predefined value, which can be selected from pixel amplitude, and pixel density of the one or more image frames, and wavelength of the optical light. Furthermore, the first predefined value of the received optical light can be amplified to a second predefined value, while preserving the spatial dimension of the optical scene. The image sensor 104 can be operatively coupled to the IIT 102 and configured to sense the optical light having the second predefined value and correspondingly generate a set of electrical signals. The DSP unit 106 can be operatively coupled to the IIT 102 and the image sensor 104. Also, the DSP unit 106 can include one or more processors which can be operatively coupled to a memory 108 for storing one or more instructions executable by the one or more processors. The DSP unit 106 can be implemented as any or a combination of Field Programmable Gate Arrays (FPGAs), a Digital Signal Processor (DSP), or a controller. Further, the DSP unit 106 can be configured to extract the optical parameters from the received set of electrical signals. The extracted optical parameters for each of the one or more image frames associated with the optical scene can be periodically monitored and processed. Correspondingly, the DSP unit 106 can adjust the gain of the IIT 102, and a frame rate and resolution of the image sensor 104, to detect one or more object parameters in the one or more image frames of the optical scene.
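As an aid to understanding the data flow of FIG. 1 only, the following Python sketch outlines the periodic monitor-and-adjust loop described above. The interfaces (read_frame, extract_parameters, compute_gain, compute_sensor_config, set_gain, configure) are hypothetical placeholders, not the actual hardware or firmware interfaces of the apparatus 100.

    # A minimal control-loop sketch of the apparatus 100 data flow, under assumed
    # interfaces; OpticalParameters names the quantities the DSP unit 106 extracts.
    from dataclasses import dataclass

    @dataclass
    class OpticalParameters:
        pixel_amplitude: float   # peak pixel amplitude in the frame
        pixel_density: float     # fraction of pixels above the detection threshold

    def control_loop(iit, sensor, dsp, n_frames=100):
        """Periodically monitor extracted parameters and adjust gain, frame rate and resolution."""
        for _ in range(n_frames):
            frame = sensor.read_frame()                      # digitized electrical signals
            params: OpticalParameters = dsp.extract_parameters(frame)
            gain = dsp.compute_gain(params)                  # e.g. keep amplitude above the noise floor
            rate, resolution = dsp.compute_sensor_config(params)
            iit.set_gain(gain)                               # adjust the IIT 102 gain
            sensor.configure(frame_rate=rate, resolution=resolution)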
[0055] FIG. 2 illustrates representation of the proposed apparatus 100, in accordance with an exemplary embodiment of the present disclosure.
[0056] As illustrated in FIG. 2, the apparatus 100 can include optics 202, the Image Intensifier Tube (IIT) 102, the image sensor 104, and the DSP unit 106. The optics 202, along with a wavelength band filter (not shown in the figure), is configured to select the field of view of interest, allowing only the wavelength of interest to pass and absorbing or reflecting the remaining wavelengths. Further, the IIT 102 can amplify the light from an optical scene by first converting photons to electrons, then performing electron multiplication, and finally converting the electrons back to photons. The IIT 102 is useful for amplifying optical scenes in very low light environments. The IIT 102 can amplify the light from an optical scene while preserving the spatial dimensions of the scene. Thus, the proposed IIT 204 can enable the designer to use any image sensor in order to efficiently identify and capture an image of the moving object.
[0057] In an embodiment, the light from the IIT 204 can be focused onto a Complementary Metal Oxide Semiconductor (CMOS) sensor or any other image sensor by bonding the IIT 204 with the image sensor 104. The bonding can either be free space bonding or fiber taper bonding. An image can be collected from the image sensor 104 either by configuring internal registers or by an external control signal. The image sensor 104 converts the light intensity information into voltage signals. This voltage signal can be digital or analog depending on the image sensor 104. If the sensor is of the analog type, then an ADC 110 can be connected to convert the signal to the digital domain. This digital signal is received by the DSP unit 106.
[0058] In an embodiment, when the image sensor 104 is adapted to generate the set of electrical signals in analog form, the apparatus 100 can include an Analog to Digital Converter (ADC) 110, which can be operatively configured between the image sensor 104 and the DSP unit 106. The ADC 110 can be configured to convert the set of electrical signals from analog form into digital form and transmit the digital form to the DSP unit 106.
[0059] In an embodiment, the DSP unit 106 can operate the image sensor 104 at a maximum frame rate in a binning mode. In addition, the DSP unit 106 can operate the ADC 110 with full ADC data width, and correspondingly the DSP unit 106 can monitor a pattern in the rate of change of pixel amplitude and the direction vector of pixel movement, and a pattern of change of size of an object of interest in the one or more image frames. Thus, the DSP unit 106 can identify whether the one or more objects are moving using the changing frame rate, and can also identify noise and bad pixels from the pattern.
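For illustration, assuming frames are available as NumPy arrays of binned pixel amplitudes, the sketch below estimates the per-pixel rate of change of amplitude and a coarse direction vector of the above-threshold region between two consecutive frames. The centroid-based direction estimate and the function names are assumptions, not the specific computation used by the DSP unit 106.

    import numpy as np

    def amplitude_rate_of_change(prev, curr, dt):
        """Per-pixel rate of change of amplitude between two consecutive frames."""
        return (curr.astype(float) - prev.astype(float)) / dt

    def direction_vector(prev, curr, threshold):
        """Direction of movement of the above-threshold region (centroid shift in pixels)."""
        def centroid(frame):
            rows, cols = np.nonzero(frame > threshold)
            if cols.size == 0:
                return None
            return np.array([cols.mean(), rows.mean()])      # (x, y) centroid
        c0, c1 = centroid(prev), centroid(curr)
        if c0 is None or c1 is None:
            return np.zeros(2)                               # nothing above threshold: no movement
        return c1 - c0                                       # pixels moved along (x, y) between frames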
[0060] In an embodiment, the DSP unit 106 can be configured to detect the one or more objects in the one or more image frames, and identify a type of the detected one or more objects in the one or more image frames upon comparing the detected one or more object parameters with a dataset including pre-defined object parameters of known objects.
[0061] In an embodiment, the DSP unit 106 can be configured to identify a region of interest in the one or more image frames based on the detected one or more objects parameters, and correspondingly focus the apparatus 100 over the region of interest.
[0063] In an embodiment, the one or more objects in the optical scene can include any or a combination of humans, animals, electrical discharge, plumes from rockets and missiles, and emissions from aircraft and helicopters.
[0064] FIG. 3 illustrates steps involved in the proposed method, in accordance with an exemplary embodiment of the present disclosure.
[0065] FIGS. 4A-4B illustrate steps involved in Initialization, Monitoring, Detection, Processing, Identification and Classification of the proposed method of FIG. 3, in accordance with an exemplary embodiment of the present disclosure.
[0066] In an embodiment, the apparatus 100 can initially be operated in initialisation mode at block 302, and the DSP unit 106 checks whether the initialisation is completed or not at block 304. If the initialisation is completed, then the following steps are executed:
At step 401, the sensor can be configured to generate an optimum fixed frame rate, which can be decided based on the response time and the velocity of the object.
At step 402, the IIT gain can be configured just above the noise floor, that is, the IIT 102 is operated at an optimum gain above the noise floor. This can be decided by the noise floor of the apparatus 100 and the minimum detectable power density. The initial gain can be set in such a way that, if an object of interest appears in the field of the apparatus 100, it gives a pixel amplitude greater than a set threshold, which is adaptive depending on the noise floor. This is done to keep the threshold just above the noise floor. If the image sensor 104 is of the digital type, then the pixel bit width is fixed or offers little flexibility.
At step 403, the image data is stored in an optimum bits-per-pixel format; the bits per pixel can be kept at the maximum value and, for storage and processing, approximated to a lower optimum bit width. Thereby, the memory overhead is reduced and the processing speed per frame is increased; a change in pixel data width from 8 to 14 bits increases the dynamic range by 36 dB. For a high-resolution digital sensor without a facility for binning mode, the binning can be done in the DSP unit 106. In case an analog sensor is used, flexibility is available, as the ADC 110 can be chosen by the designer. The ADC 110 can be operated at an optimum bit width, and when required, the bit width of the ADC can be increased accordingly.
[0067] At step 404, finally, an optical band-pass filter is placed for detection of a particular wavelength of emission. An illustrative initialisation sketch covering steps 401 to 404 is given below.
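The following Python sketch summarizes the initialization settings of steps 401 to 404 under assumed interfaces (configure, measure_noise_floor, set_gain and set_data_width are hypothetical, as are the margin factors). Each additional bit of data width adds roughly 6 dB of dynamic range, which is why moving from 8 to 14 bits gains about 36 dB.

    def initialize(iit, sensor, adc, object_velocity_px_s, response_time_s):
        # Step 401: fixed frame rate chosen from the response time and object velocity
        # (assumed rule of thumb: at least one frame per response interval, and fast
        # enough that the object moves about one pixel per frame).
        frame_rate = max(1.0 / response_time_s, object_velocity_px_s)
        sensor.configure(frame_rate=frame_rate)

        # Step 402: IIT gain just above the noise floor, with an adaptive threshold.
        noise_floor = sensor.measure_noise_floor()
        iit.set_gain(noise_floor * 1.2)          # small, assumed margin above the noise floor
        threshold = noise_floor * 1.5            # adaptive threshold tracks the noise floor

        # Step 403: store at the maximum bit depth; 14 bits instead of 8 gives ~36 dB more range.
        adc.set_data_width(14)

        # Step 404: the optical band-pass filter is a hardware element selecting the
        # emission wavelength of interest; nothing to configure in this sketch.
        return threshold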
[0068] In an embodiment, the apparatus 100 can be operated in monitoring mode at block 306, and the DSP unit 106 checks whether the fixed number of frames are detected or not at block 308. If the monitoring is completed, then the following steps are executed:
At step 405, the apparatus 100 looks for low intensity object detection; the apparatus 100 can continue to operate at the fixed frame rate, gain and data width that were set during the initialisation mode.
At step 406, the apparatus 100 checks over the fixed number of frames for confirmation of detection by observing whether the signal crosses a threshold in any of the pixels. If the fixed number of frames are detected, the apparatus registers the detection and then looks for one more frame for confirmation, as in the sketch below. Once confirmed, the detection mode is executed.
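A minimal sketch of this monitoring logic, assuming frames arrive as NumPy arrays, is given below; a detection is registered only after the threshold is crossed in a fixed number of consecutive frames plus one confirmation frame. The value of N_CONFIRM is a hypothetical choice.

    import numpy as np

    N_CONFIRM = 3   # assumed number of consecutive above-threshold frames before confirmation

    def monitor(frames, threshold):
        """Return True once a detection is confirmed, False if the frame stream ends first."""
        consecutive = 0
        for frame in frames:
            if np.any(frame > threshold):
                consecutive += 1
                if consecutive >= N_CONFIRM + 1:   # fixed number of frames plus one confirmation frame
                    return True                    # switch to detection mode
            else:
                consecutive = 0                    # reset on a frame with no threshold crossing
        return False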
[0069] In an embodiment, the apparatus 100 can operate in detection mode at block 310, and the DSP unit 106 checks whether the detection is persisting or not at block 312. If the detection is persisting, then the following steps are executed:
At step 407, in detection mode, the processor registers the row and column location of the detected object. This can be done in full resolution mode for maximum resolution of the object location with respect to the sensor.
At step 408, the sum of pixel amplitudes is stored, including the nearby suspected pixels. This is done because the object may not be able to bring the amplitude of the nearby pixels above the threshold value, but it can still increase their amplitude. Further, the number of pixels above a certain threshold, together with a direction vector, is stored by monitoring the change in amplitude of the neighbouring pixels; using this, a direction vector of the object can be created. The apparatus 100 continues to register the sum of the nearby pixels along with the direction vector (an illustrative sketch follows these steps), and if the detection persists, the processing mode is executed.
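For illustration, the sketch below registers, for a full-resolution NumPy frame, the row and column of the strongest pixel, the amplitude sum over a small neighbourhood of suspected pixels, and the count of pixels above the threshold. The 3x3 neighbourhood is an assumed size, not specified by the disclosure.

    import numpy as np

    def register_detection(frame, threshold):
        """Record the detection quantities described at steps 407 and 408."""
        row, col = np.unravel_index(np.argmax(frame), frame.shape)    # object location
        r0, r1 = max(row - 1, 0), min(row + 2, frame.shape[0])
        c0, c1 = max(col - 1, 0), min(col + 2, frame.shape[1])
        neighbourhood_sum = float(frame[r0:r1, c0:c1].sum())          # includes nearby suspected pixels
        pixels_above = int(np.count_nonzero(frame > threshold))       # size of the lit region
        return {"row": int(row), "col": int(col),
                "sum_amplitude": neighbourhood_sum,
                "pixels_above_threshold": pixels_above}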
[0070] In an embodiment, the apparatus 100 can be operated in processing mode at block 314, and the DSP unit 106 checks whether the adaptive configuration is completed or not at block 316. If the adaptive configuration is completed, then the following steps are executed (an illustrative sketch of this adaptive reconfiguration follows the steps):
At step 409, the apparatus 100 modifies the gain of the IIT periodically.
At step 410, the sum of pixel amplitudes, including the suspected pixels, is received and stored in the memory 108.
At step 411, the number of pixels above a certain threshold, together with the direction vector, is received and stored in the memory 108.
At step 412, the nearby pixel amplitudes are read at the maximum bit resolution.
At step 413, the frame rate is periodically modified to the maximum possible value.
At step 414, the mode of the sensor is changed to binning mode for achieving the maximum frame rate possible.
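The sketch below groups steps 409 to 414 into one routine under the same assumed interfaces as the earlier sketches; all method and key names (set_gain, read, configure, max_frame_rate) are illustrative placeholders.

    def process_mode(iit, sensor, memory, gain_schedule):
        iit.set_gain(next(gain_schedule))                        # step 409: periodic gain update
        amplitude_sum = memory.read("sum_amplitude")             # step 410
        pixels_above = memory.read("pixels_above_threshold")     # step 411
        neighbourhood = memory.read("nearby_pixels_full_res")    # step 412: maximum bit resolution
        sensor.configure(frame_rate=sensor.max_frame_rate,       # step 413: maximum frame rate
                         binning=True)                           # step 414: binning mode
        return amplitude_sum, pixels_above, neighbourhood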
[0071] In an embodiment, the apparatus 100 can be operated in identification mode at block 318, and the DSP unit 106 checks whether the detection is real or not at block 320. If the detection is real, then the following steps are executed (an illustrative sketch of this pattern analysis follows the steps):
At step 415, the system looks for the pattern in the rate of change of pixel amplitude and the direction vector of pixel movement.
At step 416, if the pattern is recognized as an object of interest, the mode is changed to classification mode.
At step 417, the system looks for the pattern of change of size of the object of interest.
At step 418, the system identifies whether the object is moving using the changing frame rate.
At step 419, the system identifies the noise and bad pixels from the pattern.
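As one possible reading of steps 415 to 419, the sketch below classifies a candidate as a moving object, a bad pixel or noise from the recorded histories of amplitudes, direction vectors and region sizes. The decision rules and numeric thresholds are assumptions for illustration only.

    import numpy as np

    def identify(amplitudes, direction_vectors, region_sizes):
        """Return 'moving object', 'bad pixel' or 'noise' from the per-frame histories."""
        displacement = np.linalg.norm(np.sum(direction_vectors, axis=0))   # net movement in pixels
        amp_variation = np.std(amplitudes) / (np.mean(amplitudes) + 1e-9)  # relative amplitude spread
        size_trend = region_sizes[-1] - region_sizes[0]                    # change of region size

        if displacement > 1.0 or size_trend > 0:
            return "moving object"        # consistent movement or a growing region
        if amp_variation < 0.05 and displacement < 0.1:
            return "bad pixel"            # stuck at a near-constant amplitude with no movement
        return "noise"                    # uncorrelated fluctuations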
[0072] In an embodiment, for the imaging application of the apparatus 100, very wide field of view optics are used and the sensor has high pixel resolution. The image can be zoomed automatically using the location of the suspected object, thereby showing only the area of interest, which is recorded at a high frame rate with improved gain and bit resolution, as presented in the proposed invention. An IR band filter corresponding to the body temperature of animals can be used to detect humans or animals far away on a road and to give an approximate distance from the amplitude and the number of pixels occupied. With a UV band filter corresponding to the emission wavelength of a missile plume, missiles can be detected and identified. By selecting an IR band filter corresponding to the outer body or engine temperature of an aircraft, missile or helicopter, the apparatus 100 can also be used to detect and identify aircraft, missiles or helicopters.
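By way of example only, the sketch below crops the frame around the detected location to show the area of interest, and gives a rough range estimate from the measured amplitude and the number of pixels occupied, relative to a calibrated reference. The margin and the inverse-square fall-off model are assumptions, not calibrated values from the disclosure.

    import numpy as np

    def zoom_region(frame, row, col, margin=16):
        """Crop the frame around the detected location to show only the area of interest."""
        r0, r1 = max(row - margin, 0), min(row + margin, frame.shape[0])
        c0, c1 = max(col - margin, 0), min(col + margin, frame.shape[1])
        return frame[r0:r1, c0:c1]

    def approximate_distance(amplitude, pixels_occupied,
                             ref_amplitude, ref_pixels, ref_distance):
        """Rough range estimate: both received intensity and apparent area fall off
        roughly as 1/distance^2, so each gives an independent estimate; return the mean."""
        d_from_amplitude = ref_distance * (ref_amplitude / amplitude) ** 0.5
        d_from_area = ref_distance * (ref_pixels / pixels_occupied) ** 0.5
        return 0.5 * (d_from_amplitude + d_from_area)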
[0073] In an embodiment, the apparatus 100 can operate in classification mode at block 322, and the DSP unit 106 checks whether the detection exists or not. If the detection exists, then the following steps are executed (an illustrative sketch follows the steps):
At step 420, the object parameters are compared with a database and the object is recognized.
At step 421, the apparatus 100 sends the location, type and characteristics of the object to the user.
At step 422, after reporting, the apparatus 100 moves back to monitoring mode.
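A minimal classification sketch is given below: the measured object parameters are compared with a small database of pre-defined parameters of known objects and the closest match is reported. The database contents, parameter names and nearest-neighbour metric are illustrative assumptions only.

    import math

    KNOWN_OBJECTS = {                      # hypothetical pre-defined object parameters
        "human":    {"amplitude": 40.0,  "pixels": 25, "speed": 2.0},
        "missile":  {"amplitude": 200.0, "pixels": 60, "speed": 50.0},
        "aircraft": {"amplitude": 120.0, "pixels": 90, "speed": 30.0},
    }

    def classify(measured):
        """Return the known object whose pre-defined parameters are closest to the measurement."""
        def distance(ref):
            return math.sqrt(sum((measured[key] - ref[key]) ** 2 for key in ref))
        return min(KNOWN_OBJECTS, key=lambda name: distance(KNOWN_OBJECTS[name]))

    # Example of the report sent to the user at step 421 (row, col and measured
    # would come from the earlier detection sketches):
    # report = {"location": (row, col), "type": classify(measured), "characteristics": measured}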
[0074] While the foregoing describes various embodiments of the invention, other and further embodiments of the invention may be devised without departing from the basic scope thereof. The scope of the invention is determined by the claims that follow. The invention is not limited to the described embodiments, versions or examples, which are included to enable a person having ordinary skill in the art to make and use the invention when combined with information and knowledge available to the person having ordinary skill in the art.

ADVANTAGES OF THE INVENTION
[0075] The proposed invention controls the gain of the IIT and adjusts frame rate and resolution of the image sensor for achieving optimum performance in low light imaging system.
[0076] The proposed invention provides an apparatus that can capture only the area of interest and provide its coordinates, but is not limited to a particular area, which is recorded at a high frame rate with improved gain and bit resolution.
[0078] The proposed invention provides a better system for zooming multiple objects of interest independently, thus by providing selective zooming the user can have a comprehensive understanding of the event.
[0079] The proposed invention reduces memory usage and increases the processing capability of the imaging system.
[0080] The proposed invention utilizes the binning mode of the image sensor to achieve maximum frame rate while maintaining the high pixel resolution for object location identification.
[0081] The proposed invention utilizes IIT to amplify the light from an optical scene while preserving the spatial dimensions of the scene.
[0082] The proposed invention captures images of moving objects such as moving animals, humans, electric discharge, the exhaust of rockets and missiles, and emissions from aircraft and helicopters.

Documents

Application Documents

# Name Date
1 202141011186-STATEMENT OF UNDERTAKING (FORM 3) [16-03-2021(online)].pdf 2021-03-16
2 202141011186-POWER OF AUTHORITY [16-03-2021(online)].pdf 2021-03-16
3 202141011186-FORM 1 [16-03-2021(online)].pdf 2021-03-16
4 202141011186-DRAWINGS [16-03-2021(online)].pdf 2021-03-16
5 202141011186-DECLARATION OF INVENTORSHIP (FORM 5) [16-03-2021(online)].pdf 2021-03-16
6 202141011186-COMPLETE SPECIFICATION [16-03-2021(online)].pdf 2021-03-16
7 202141011186-Proof of Right [12-04-2021(online)].pdf 2021-04-12
8 202141011186-POA [17-10-2024(online)].pdf 2024-10-17
9 202141011186-FORM 13 [17-10-2024(online)].pdf 2024-10-17
10 202141011186-AMENDED DOCUMENTS [17-10-2024(online)].pdf 2024-10-17
11 202141011186-FORM 18 [03-03-2025(online)].pdf 2025-03-03