Abstract: The present invention relates to a vehicle undercarriage inspection system and method. Non-uniform motion of a vehicle over an undercarriage scanner is a major challenge causing performance degradation for area scan-based imaging systems. The system and method ensure high quality images even when the vehicle stops over the scanner temporarily and then moves forward. The system employs an auxiliary imaging unit (102) for providing motion information and a detection system (104) which senses the presence of the vehicle over the scanner. The undercarriage image is formed by capturing and mosaicing undercarriage frames from a high frame rate undercarriage camera (105), using input signals from the detection system and the auxiliary imaging unit. A nested buffer of optimal size is used for storing captured image frames, which are stitched together in real time and then post-processed to remove duplication or stretching of vehicle regions due to halting of the vehicle, ensuring undistorted undercarriage image formation. Representative figure: Figure 1
DESC:TECHNICAL FIELD
[0001] The present invention relates generally to vehicle undercarriage inspection systems. The invention, more particularly, relates to security systems to carry out inspection of vehicle undercarriage, where undistorted image of the undercarriage of the vehicle is to be obtained.
BACKGROUND
[0002] Vehicle inspection is critical in the current global security scenario. Traditional manual methods, such as the mirror-on-a-stick, are no longer reliable or accurate enough to detect potentially dangerous objects hidden on the underside of a vehicle.
[0003] Undervehicle scanning systems are installed at facility access points and security check points to detect planted explosive devices, contraband, suspicious packages, or any non-standard modifications in the undercarriage of the vehicle. Non-uniform motion of vehicle over undercarriage scanner is a major challenge causing performance degradation for undercarriage imaging systems.
[0004] United States Publication No. US20030185340A1 discloses a vehicle undercarriage imaging system and method, which comprises a plurality of electronic cameras, illumination devices, and range finders. Range finders measure the position of the vehicle wheels in order to properly mark the position at which various images are captured.
[0005] Indian patent 2459/DEL/2005 discloses a vehicle inspection method and system. It relates to capturing multiple images of the vehicle and calibrating the images to form a composite image. Average velocity of vehicle is obtained at different instants by calibrating the image using knowledge of the time instants at which the vehicle crosses certain pre-determined fixed locations.
[0006] United States Patent No. US7349007B2 discloses an entry point control device, system and method. It claims a method for obtaining at least two simultaneous undercarriage images via the scanner camera, with one image being taken at an angle in the direction of vehicle travel and one image being taken at an angle against the direction of vehicle travel, and processing the images taken by the front and scanner cameras.
[0007] However, none of the above noted patents discusses handling of a situation where the vehicle temporarily stops over the scanner and then moves forward. Such a situation may prove detrimental to the performance of the systems of the prior art documents above. None of the prior art documents addresses the problem of non-uniform motion of the vehicle, particularly stopping of the vehicle over the scanner for some time before moving forward, which may be deliberately employed to overload the system with multiple junk frames and thus evade the security system.
[0008] There is still a need for an invention which solves the above defined problems and provides a vehicle undercarriage inspection system and method that enables the system to scan and record the images of the undercarriage of a vehicle when the vehicle temporarily stops over the scanner, among other scenarios.
SUMMARY OF THE INVENTION
[0009] This summary is provided to introduce concepts of the present invention. This summary is neither intended to identify essential features of the present invention nor is it intended for use in determining or limiting the scope of the present invention.
[0010] In accordance with the present invention, a method for undistorted undercarriage image formation of a vehicle is described. The method comprises the steps of: detecting, by a detection unit, a presence of the vehicle over an undercarriage imaging unit; capturing, by the undercarriage imaging unit, frames of an undercarriage of the vehicle; determining, by an auxiliary imaging unit, a movement of the vehicle; identifying, by a frame buffering unit, one or more relevant frames of the undercarriage of the vehicle; buffering, by the frame buffering unit, relevant frames from the captured images; creating, by an image processing unit, a stitched image from the frame buffered images; and performing, by the image processing unit, post-processing of the stitched image to form an undistorted image of the undercarriage of the vehicle.
[0011] In one aspect, the step of identifying relevant frames by the frame buffering unit comprises: discarding frames captured by the undercarriage imaging unit when the vehicle is determined to be neither moving nor detected to be present over the undercarriage imaging unit, and storing the remaining frames. In one aspect, the step of buffering the captured undercarriage frames comprises: creating an outer buffer and an inner buffer of predetermined sizes, wherein the inner buffer is nested inside the outer buffer and the inner buffer is dynamically created to store the frames when the vehicle is detected to be stopped over the undercarriage imaging unit; determining whether the vehicle has stopped over the undercarriage imaging unit; if yes, capturing and storing frames in the inner buffer and overwriting contents of the inner buffer once it is filled; else, storing frames in the outer buffer.
[0012] In one aspect, the step of creating the stitched image comprises: identifying overlapping regions between successive frames, by correlating one or more consecutive frames; and blending of successive frames, without duplication of overlapping regions to create the stitched image.
[0013] In one aspect, the step of performing post-processing of stitched image comprises dividing the stitched image for: identifying stationary regions from the stitched image, identifying transition regions from the stitched image; identifying motion regions from the stitched image; and removing distortions in undercarriage image due to stationary regions and transition regions.
[0014] In one aspect, identifying stationary regions comprises the steps of: identifying consecutive non-overlapping blocks of appropriate size from the stitched image; determining similarity for the consecutive non-overlapping blocks; classifying the similar non-overlapping blocks as stationary region; and storing starting and ending position of the stationary region in the stitched image.
[0015] In one aspect, identifying transition regions comprises the steps of: identifying a block of appropriate width before the starting position of stationary region; identifying a block of appropriate width after the ending position of stationary region; and classifying the blocks as transition region.
[0016] In one aspect, identifying motion regions comprises of: classifying the region outside the transition region and the stationary region as a motion region.
[0017] In one aspect, removing distortions in undercarriage image due to stationary regions and transition regions comprises of: discarding the region between starting and ending positions of stationary region; down-sampling transition regions by an appropriate factor along the direction of length of vehicle; and merging motion regions and down-sampled transition regions to form the undistorted stitched undercarriage image.
[0018] In another aspect of the invention, a system for undistorted image formation in vehicle inspection applications is provided. The system comprises: a detection unit configured to identify the presence of a vehicle; an auxiliary imaging unit configured to determine the movement of the vehicle; an undercarriage imaging unit configured to capture a plurality of images of the undercarriage of the vehicle; a frame buffering unit configured to identify relevant frames from the captured frames and buffer frames of the captured images; and an image processing unit configured to create a stitched image from frame buffered images and perform post-processing of the stitched image to form an undistorted image of the undercarriage of the vehicle.
[0019] In another aspect, the frame buffering unit is configured to discard frames captured by the undercarriage imaging unit when the vehicle is determined to be neither moving nor the vehicle is detected to be present over the undercarriage imaging unit and store the remaining frames.
[0020] In another aspect, the undercarriage imaging unit comprises: an undercarriage camera configured to capture plurality of images of vehicle undercarriage without any loss of information; and a lighting unit configured to illuminate the vehicle undercarriage.
[0021] In another aspect, the frame buffering unit, is configured to: create an outer buffer and inner buffer of predetermined sizes, wherein the inner buffer is nested inside the outer buffer; if the vehicle is in movement, store frames in the outer buffer; or if the vehicle is above undercarriage imaging unit, check whether the vehicle is stopped over undercarriage imaging device and store frames in the inner buffer.
[0022] In another aspect, the image processing unit is configured to: input the stitched image to the post-processing unit; divide the stitched image into a plurality of regions selected from one or more motion regions, one or more transition regions and a stationary region; identify a starting point and an ending point of the stationary region and discard the area between the starting point and ending point of the stationary region; down-sample each transition region along the direction of the length of the vehicle; and merge the motion regions with the down-sampled transition regions for forming the undistorted image.
BRIEF DESCRIPTION OF ACCOMPANYING DRAWINGS
[0023] The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the drawings to reference like features and modules.
[0024] Figure 1 illustrates the arrangement of a vehicle undercarriage inspection system, according to an exemplary implementation of the present invention.
[0025] Figure 2 illustrates a flowchart depicting a method for undistorted image formation on capture of an image by the vehicle undercarriage inspection system of Figure 1, according to an exemplary implementation of the present invention.
[0026] Figure 3 illustrates a flowchart for storing relevant captured undercarriage frames as illustrated in Figure 2, according to an implementation of the present invention.
[0027] Figure 4 is a depiction of different regions which can be present in a stitched undercarriage image.
[0028] Figure 5 illustrates a flowchart for identification of a stationary region in the stitched image of Figure 4, according to an embodiment of the present invention.
[0029] Figure 6 illustrates a flowchart for post-processing of stitched image of Figure 5, according to an exemplary embodiment of the present invention.
[0030] It should be appreciated by those skilled in the art that any block diagrams herein represent conceptual views of illustrative methods embodying the principles of the present invention. Similarly, it will be appreciated that any flow charts, flow diagrams, and the like represent various processes which may be substantially represented in computer readable medium and so executed by a computer or processor, whether or not such computer or processor is explicitly shown.
DETAILED DESCRIPTION
[0031] The various embodiments of the present invention describe vehicle undercarriage inspection systems and methods thereof. The description further provides an improved undercarriage inspection system and method for use.
[0032] In the following description, for purpose of explanation, specific details are set forth in order to provide an understanding of the present invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced without these details. One skilled in the art will recognize that embodiments of the present invention, some of which are described below, may be incorporated into a number of systems.
[0033] However, the systems and methods are not limited to the specific embodiments described herein. Further, structures and devices shown in the figures are illustrative of exemplary embodiments of the present invention and are meant to avoid obscuring of the present invention.
[0034] It should be noted that the description merely illustrates the principles of the present invention. It will thus be appreciated that those skilled in the art will be able to devise various arrangements that, although not explicitly described herein, embody the principles of the present invention. Furthermore, all examples recited herein are principally intended expressly to be only for explanatory purposes to help the reader in understanding the principles of the invention and the concepts contributed by the inventor to furthering the art and are to be construed as being without limitation to such specifically recited examples and conditions. Moreover, all statements herein reciting principles, aspects, and embodiments of the invention, as well as specific examples thereof, are intended to encompass equivalents thereof.
[0035] Undercarriage inspection systems employ single or multiple cameras to capture continuous images of different sections of a vehicle undercarriage as vehicle moves over scanner. A composite undercarriage image is formed from these captured frames. Generation of a proper undercarriage image, when vehicle moves over scanner in a non-uniform manner, is a challenging task. If vehicle temporarily stops over scanner and then resumes motion, a continuous capture of frames in such a scenario may lead to memory issues and/or image distortion. The current invention describes a system and method for undistorted undercarriage image formation during non-uniform motion of vehicle over scanner, particularly when vehicle stops over scanner temporarily and then moves forward. The system comprises one or more high frame rate cameras for undercarriage imaging of a vehicle. The system further comprises an auxiliary camera, configured to provide motion detection information. The method comprises the steps of vehicle detection, undercarriage frame capturing, motion detection, identifying relevant frames, frame buffering, image stitching and image post-processing.
[0036] An overview of the vehicle inspection system is shown in Fig. 1. The system comprises a vehicle detection unit 104, an undercarriage imaging unit, an auxiliary vehicle imaging unit 102, a frame buffering unit 106 and an image processing unit 107. The undercarriage imaging unit is a combination of an undercarriage camera 105 and a lighting unit 103. It should be understood that there are communication links between multiple modules for transmission of data and control signals. Standard communication links include wired and wireless communication links. Some of the wired and wireless communication links operate using standard protocols such as, but not limited to, serial line (RS232, RS422), Ethernet, USB, PCI Bus, Bluetooth, IRDA, and 802.11.x protocols.
[0037] Data processing (for example, in image processing unit 107 and frame buffering unit 106) can be done on a combination of different computing platforms such as, but not limited to, embedded systems, microcontroller units and general-purpose processors in single board computers or standard Personal Computers with one or more processors, memory systems.
[0038] The presence of a vehicle 101 over the undercarriage imaging unit is detected by one or more sensors. Different types of sensors such as, but not limited to, induction loops, pressure sensors, infra-red sensors and proximity sensors can be used for vehicle detection. A specific sensor or a combination of the sensors can be used for detection of the presence of a vehicle. For detection of entry/exit of vehicles, a specific combination of sensors could be used. The entry sensor may be placed a little ahead of the undercarriage camera and the exit sensor may be placed a small distance after the camera along the direction of motion of the vehicle, to ensure imaging without any loss of information.
[0039] Fig. 1 also depicts a detection unit 104, which is configured to give different output signals based on whether the undercarriage camera is seeing the vehicle or not, placed alongside the undercarriage camera 105. Detection unit can contain an extra sensor unit along with entry and exit sensors, which gives a positive detection when the vehicle is directly above undercarriage camera. Different arrangements for sensors are possible in the detection unit depending on sensor latency and operating requirements. The detection unit 104 gives continuous outputs for the entire duration of operation of the undercarriage inspection system. Other modules like undercarriage camera 105, lighting unit 103 and auxiliary camera 102 are triggered based on output of the detection unit 104.
[0040] In one embodiment, the undercarriage camera 105 may be used for capturing images of the undercarriage of the vehicles. One or more high frame rate cameras (line-scan or area-scan cameras) may be used for undercarriage imaging. The camera lens and positioning are selected to ensure imaging of the complete width of vehicles belonging to different categories. Once the presence of a vehicle is detected, the camera captures thin strips of the vehicle undercarriage in quick succession, without any motion blur or loss of information. In the case of area-scan cameras, the image capture timing is adjusted to produce some overlap between successive frames from the same camera. It should be understood that the image capturing may start a little before the vehicle is directly over the camera and continue for a small time even after the vehicle passes over the camera. This is to ensure that the beginning and end portions of the vehicle are not missed, regardless of vehicle speed or sensor latency.
[0041] A lighting unit 103 configured to illuminate the vehicle undercarriage during scanning is placed alongside the undercarriage camera 105. The lighting unit 103 is turned on once vehicle entry is detected and turned off after vehicle exit. The camera and lighting assembly may be either fixed or portable.
[0042] In an embodiment of the invention, the system comprises an auxiliary camera 102, configured to provide motion detection information. Information on vehicle motion is given out by auxiliary camera 102, as long as vehicle detector gives a positive detection. Reliable inference regarding presence of vehicle motion is obtained by analyzing multiple wide area images of the vehicle. This processing may be done onboard or on a separate processing element which receives the camera feed. The auxiliary camera 102 can be configured to capture any one of vehicle front view, rear view, undercarriage view or any other convenient viewing angle. In general, auxiliary camera 102 will have a much lower frame rate than undercarriage camera 105. Independent objects moving in the scene can also give a positive motion detection result, even when the vehicle is stationary, depending on the motion sensitivity of the camera. Image processing methods are employed for handling such scenarios.
[0043] In another embodiment of the invention, the output signals from undercarriage imaging unit and auxiliary camera 102 are received by a frame buffering unit 106. Frame buffering unit 106 identifies the frames relevant for stitching based on these inputs and stores the relevant frames. Relevant frames are then given to image processing unit 107, which stitches frames together, identifies and rectifies distortions and provides a composite undercarriage image at its output. The manner in which the undercarriage system captures the images of the vehicles is described below.
[0044] It is required to have an undistorted image of the vehicle undercarriage for identifying the potentially dangerous objects. In an embodiment of the invention, a method of undistorted undercarriage image formation is disclosed in Fig. 2. In step 201, the vehicle is detected. In step 202, the photographic capture of the undercarriage of a vehicle is initiated. In step 203, the vehicle's movement is detected. Motion detection and undercarriage image capturing start once the presence of the vehicle is detected. Motion detection and vehicle detection outputs, along with captured undercarriage frames, are used to identify relevant undercarriage frames in step 204. The collected data from step 204 is transferred for the frame buffering step 205. The captured images are then stitched in step 206 to produce one or more mosaics of images, and the generated image is post-processed in step 207.
[0045] The undercarriage camera operates at a high frame rate for providing high quality images, whereas vehicle detection and motion detection can be performed reliably only at a much lower rate. If a vehicle temporarily stops over scanner, undercarriage camera would have captured several frames from the instant of actual stopping of the vehicle till an indication is provided by sensors that vehicle is stopped over the scanner. Similarly, when vehicle resumes motion, there will be a time delay for the sensor to indicate motion. But it should be ensured that undercarriage information is captured during this time period as well. The disclosed method enables accurately synchronizing sensor signals and undercarriage frames, without loss of information.
[0046] Whenever vehicle is present over scanner, undercarriage camera captures image frames continuously. Information regarding whether the vehicle is directly above the undercarriage camera as well as presence of vehicle motion are collected. Undercarriage frames captured while both ‘vehicle-present-over-scanner’ and ‘vehicle-motion-present’ signals are absent, are discarded. Remaining frames are identified as relevant input frames at step 204 for frame buffering at step 205.
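The frame-relevance rule of step 204 can be sketched as a small filter; this is an illustrative sketch, not the actual implementation, and the function names are chosen here for explanation only. A frame is kept unless both the 'vehicle-present-over-scanner' and 'vehicle-motion-present' signals are absent at the instant of capture.

```python
def is_relevant(vehicle_over_scanner: bool, motion_present: bool) -> bool:
    # A frame is discarded only when BOTH signals are absent;
    # otherwise it may carry undercarriage information and is kept.
    return vehicle_over_scanner or motion_present

def filter_frames(frames, signals):
    # 'signals' pairs each frame with (vehicle_over_scanner, motion_present)
    # sampled at the moment that frame was captured.
    return [f for f, (over, moving) in zip(frames, signals)
            if is_relevant(over, moving)]
```

For example, a frame captured while the vehicle has left the scanner and no motion is seen is dropped, while a frame captured with either signal active is passed on to frame buffering.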
[0047] Storing relevant undercarriage frames in memory is done through frame buffering at step 205, which involves creating a buffer arranged as a nested circular buffer, called the inner buffer, within a linear/ circular buffer, called the outer buffer. This method is described in detail here with reference to Fig. 3, where the outer buffer is assumed to be a linear buffer of sufficient size to store all the undercarriage frames captured, when vehicle moves over scanner without stopping. This assumption is used only for ease of explanation, and different sizes and configurations of the buffers are covered within the scope of this invention.
[0048] Appropriate sizes are defined at step 301 for outer and inner buffers. The size of outer buffer is selected in such a way that it can store all the undercarriage frames captured, when vehicle moves over scanner without stopping. The size of inner buffer is selected in such a way that it can store frames captured when vehicle resumes motion after stopping, before sensors indicate motion detection, so that any sensor delay is handled appropriately. Outer buffer of fixed size is created at step 302 at a first instance, whereas inner buffer is created dynamically, whenever vehicle stops over scanner. Undercarriage camera captures frames continuously at step 303 once it receives the trigger signal. In one exemplary embodiment, the trigger signal emitted when vehicle is present directly above the undercarriage camera after being detected by the entry detector may be termed as ‘vehicle-present-over-scanner’. Each captured frame is checked at steps 304, 305 for its relevance, based on vehicle detection and motion detection outputs. Irrelevant frames are discarded at step 306. If the current frame is relevant, it is stored at step 307 at the current outer buffer position and outer buffer position is incremented at step 308. When vehicle is stopped over scanner, it is identified based on vehicle detection and motion detection outputs and ‘vehicle-stopped-over-scanner’ signal is emitted. If ‘vehicle-stopped-over-scanner’ signal is present, inner buffer size is also incremented at step 310. When the inner buffer size exceeds the maximum permissible limit at step 311, inner buffer size is reset to 0. The contents of inner buffer are overwritten, once it is filled, from starting position of inner buffer, as long as ‘vehicle-stopped-over-scanner’ signal is present at step 312.
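The nested buffering of steps 301-312 can be illustrated with the following simplified sketch. It assumes, for ease of explanation, a linear outer buffer (as in Fig. 3) and a single stop event; the class and attribute names are illustrative, not part of the disclosed system. While the vehicle is stopped, frames circulate through a small inner window so that only the most recent frames before restart are retained.

```python
class NestedFrameBuffer:
    """Linear outer buffer with a nested circular inner buffer,
    created dynamically whenever the vehicle stops over the scanner."""

    def __init__(self, outer_size, inner_size):
        self.outer = []                 # linear outer buffer
        self.outer_size = outer_size    # sized for a non-stop pass
        self.inner_size = inner_size    # sized to cover sensor delay on restart
        self.inner_pos = None           # start index of the active inner buffer
        self.count = 0                  # frames written since the inner buffer opened

    def store(self, frame, vehicle_stopped):
        if not vehicle_stopped:
            self.inner_pos = None       # close any active inner buffer
            if len(self.outer) < self.outer_size:
                self.outer.append(frame)
            return
        # Vehicle stopped: open the inner buffer at the current outer position.
        if self.inner_pos is None:
            self.inner_pos = len(self.outer)
            self.count = 0
        slot = self.inner_pos + (self.count % self.inner_size)
        if slot < len(self.outer):
            self.outer[slot] = frame    # overwrite once the inner buffer is filled
        else:
            self.outer.append(frame)
        self.count += 1
```

The effect is that duplicate frames captured during the stop do not grow memory unboundedly, yet the last few frames before the motion sensor reacts are preserved, matching the intent of steps 310-312.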
[0049] This method ensures that information is retained during the vehicle stopping and vehicle restarting transition phases, while duplicate frames captured during the vehicle-stopped scenario do not fill up the available memory space. Multiple inner buffers can be created if the vehicle stops multiple times.
[0050] In another embodiment of the invention, undercarriage frames are stitched at step 206, after identifying overlapping regions between successive frames. Two consecutive frames are correlated, overlap is identified, and they are blended seamlessly at the point of overlap. At least one column from each relevant undercarriage frame is present in stitched image to ensure preservation of all undercarriage details. Two thin strips of consecutive undercarriage locations may look similar, due to the nature of vehicle undercarriage. Image correlation will indicate complete overlap of frames in such cases, even when they are physically distinct locations. Proper timing of image capture combined with retaining at least one column from each frame makes sure that no information is lost, but it also causes some duplication of data when vehicle is stopped over scanner. Similarly, non-uniform motion of vehicle during stopping and restarting, results in stretching of undercarriage image regions corresponding to these transition phases. Image post-processing at step 207 identifies such distortions and rectifies them.
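A minimal sketch of the stitching of step 206 is given below, under simplifying assumptions: frames are grayscale arrays of equal height, the overlap is estimated by comparing the trailing columns of the mosaic with the leading columns of the next frame (mean absolute difference is used here as one of the similarity measures the description mentions), and all function names are illustrative. Capping the search below the frame width guarantees that at least one column of each frame survives into the mosaic.

```python
import numpy as np

def find_overlap(prev, cur, max_overlap):
    """Estimate the overlap width (in columns) between the trailing edge
    of 'prev' and the leading edge of 'cur' by minimizing the mean
    absolute difference over candidate widths."""
    best_w, best_err = 0, float("inf")
    # w stays below cur's width, so at least one new column is always kept.
    for w in range(1, min(max_overlap, cur.shape[1])):
        err = np.abs(prev[:, -w:].astype(float) - cur[:, :w].astype(float)).mean()
        if err < best_err:
            best_err, best_w = err, w
    return best_w

def stitch(frames, max_overlap=8):
    """Mosaic successive frames, appending only non-overlapping columns."""
    mosaic = frames[0]
    for cur in frames[1:]:
        w = find_overlap(mosaic, cur, max_overlap)
        mosaic = np.hstack([mosaic, cur[:, w:]])
    return mosaic
```

Note that when the vehicle is stopped, successive frames are near-identical, so the estimated overlap is large but one column per frame is still appended; the resulting duplication is exactly what the post-processing of step 207 removes.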
[0051] Different regions in a stitched image formed when a vehicle stops over the scanner once and moves forward after some time, are illustrated in fig. 4. R1 and R5 are regions where normal vehicle motion is present (motion regions). Region R2 corresponds to the transition phase when vehicle changes from moving state to stopped state. Similarly, region R4 corresponds to the transition phase during restarting of vehicle (transition regions). R3 is the stationary region, formed when vehicle is completely stopped over the scanner. B1 and B2 are two consecutive non-overlapping rectangular blocks in stitched image.
[0052] In an embodiment of the invention, stationary regions in a stitched image are identified as illustrated in fig. 5. From the stitched undercarriage image at step 501, a rectangular block B1 of fixed size, starting from first column of stitched image, is selected at step 502. It is compared with the consecutive non-overlapping block B2 of same size at step 503. Similarity between successive blocks is determined at step 504, based on an appropriate measure of similarity like correlation coefficient, sum of absolute differences etc. This process is repeated throughout the length of stitched image at step 508. If two blocks are sufficiently similar at step 505, it is considered as an indication of vehicle stopping and the count of stationary blocks is incremented at step 506. The first block to have sufficient similarity score at step 507 denotes the start of stationary region. The index of first column of this block is identified as the starting position S1 of stationary region at step 511. There will be a number of successive blocks with high similarity score. The last column of last block with high similarity is identified as the ending position S2 of stationary region at steps 509, 510. Width of an individual block should be selected in such a manner that it is neither too narrow nor too wide, to arrive at a conclusive decision regarding stationary state of vehicle.
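The block-wise scan of Fig. 5 may be sketched as follows. This is an illustrative sketch only: it uses mean absolute difference as the similarity measure (one of the measures named above), assumes a single stationary run, and the names and threshold are chosen here for explanation.

```python
import numpy as np

def find_stationary_region(stitched, block_w, sim_thresh):
    """Scan consecutive non-overlapping blocks of width 'block_w' (B1, B2 of
    Fig. 4) and mark the span where successive blocks are nearly identical,
    indicating the vehicle was stopped. Returns (S1, S2) column indices,
    or None if no stationary region is found. 'sim_thresh' is the maximum
    mean absolute difference still treated as 'similar'."""
    cols = stitched.shape[1]
    s1 = s2 = None
    for start in range(0, cols - 2 * block_w + 1, block_w):
        b1 = stitched[:, start:start + block_w].astype(float)
        b2 = stitched[:, start + block_w:start + 2 * block_w].astype(float)
        if np.abs(b1 - b2).mean() <= sim_thresh:
            if s1 is None:
                s1 = start                  # first column of first similar block
            s2 = start + 2 * block_w - 1    # last column of last similar block
        elif s1 is not None:
            break                           # the stationary run has ended
    return None if s1 is None else (s1, s2)
```

As the paragraph notes, `block_w` must be chosen with care: too narrow a block makes unrelated undercarriage strips look similar, while too wide a block blurs the boundary of the stationary region.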
[0053] In another exemplary embodiment of the invention, distortions in undercarriage image are removed after identifying stationary regions. A stitched image at step 601 may be considered as a combination of different regions: motion regions R1 and R5, transition regions R2 and R4 and stationary region R3, step 602 (Fig.4). Once the starting and ending points of stationary region R3 are identified at step 603, a block of fixed width W before S1 is assumed as the transition region R2 at step 605, prior to vehicle stopping, and a similar block after S2 is assumed as the transition region R4 at step 607, after vehicle restarting.
[0054] Non-uniform motion of vehicle during stopping and restarting results in horizontal stretching of undercarriage image around the beginning and end positions of stationary region. Regions R2 and R4 are down-sampled along horizontal direction at steps 606, 608, to remove the effect of stretching. Frames captured and stitched while the vehicle is completely stopped form the stationary region R3. They result in duplication of data and hence are discarded at step 604. Finally, regions R1, down-sampled versions of R2 and R4, and R5 are merged to get the undistorted undercarriage image at step 609. The same principle can be used to handle multiple stationary regions as well.
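The rectification of steps 602-609 can be sketched as below for a single stop, with reference to the regions R1-R5 of Fig. 4. This is an explanatory sketch under stated assumptions: the transition width `trans_w`, the down-sampling `factor`, and simple column decimation (standing in for a proper down-sampling filter) are all illustrative choices.

```python
import numpy as np

def remove_stationary_distortion(stitched, s1, s2, trans_w=4, factor=2):
    """Rebuild the undistorted image: keep motion regions R1/R5 unchanged,
    down-sample transition regions R2/R4 horizontally by 'factor', and
    discard the stationary region R3 (columns s1..s2) entirely."""
    r1 = stitched[:, :max(s1 - trans_w, 0)]          # motion before stopping
    r2 = stitched[:, max(s1 - trans_w, 0):s1]        # transition into the stop
    r4 = stitched[:, s2 + 1:s2 + 1 + trans_w]        # transition out of the stop
    r5 = stitched[:, s2 + 1 + trans_w:]              # motion after restarting
    # Decimate every 'factor'-th column along the direction of vehicle length.
    r2_ds = r2[:, ::factor]
    r4_ds = r4[:, ::factor]
    return np.hstack([r1, r2_ds, r4_ds, r5])
```

For multiple stops, the same routine would be applied once per identified stationary region, consistent with the closing remark of this paragraph.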
[0055] In brief, in an exemplary embodiment of the invention, a system for undistorted image formation in a vehicle undercarriage inspection application is disclosed. The system comprises a detection unit 104 configured to identify the presence of a vehicle directly over the undercarriage camera.
[0056] In an exemplary embodiment of the invention, the system comprises an undercarriage imaging unit configured to capture thin image slices of vehicle undercarriage at high frame rate and an auxiliary vehicle imaging unit 102 configured to provide motion detection information.
[0057] In an exemplary embodiment of the invention, the system comprises a frame buffering unit 106 configured to store the relevant frames captured by undercarriage camera.
[0058] In an exemplary embodiment of the invention, the system comprises an image processing unit 107 configured to stitch together undercarriage frames and compensate the distortions introduced by non-uniform motion of the vehicle. The system also incorporates one or more sensors which give different outputs based on whether the undercarriage camera is seeing the vehicle or not.
[0059] In an exemplary embodiment of the invention, the undercarriage imaging unit comprises a high frame rate camera 105 configured to capture images of thin strips of vehicle undercarriage, ensuring there is no loss of information between successive frames.
[0060] In an exemplary embodiment of the invention, the system comprises a lighting unit 103 configured to illuminate the vehicle undercarriage during scanning, ensuring high clarity image is obtained irrespective of ambient lighting conditions.
[0061] In an exemplary embodiment of the invention, the auxiliary camera 102/ vehicle imaging unit 102 is configured to capture wide area images of the vehicle and these images are further analyzed to provide reliable information on presence of vehicle motion. The frame buffering unit 106 receives undercarriage image frames, motion detection and ‘vehicle-present-over-scanner’ signals at its input and provides the relevant frames to be stitched at its output.
[0062] In an exemplary embodiment of the invention, the image processing unit 107 takes output of frame buffering unit as its input, stitches them, identifies any distortions present, rectifies distortions and provides a composite undistorted undercarriage image as its output.
[0063] In an exemplary embodiment of the invention, a method for undistorted image formation in vehicle undercarriage inspection systems is disclosed. The method comprises the steps of: detecting (201) presence of vehicle over undercarriage imaging device; capturing (202) thin image slices of vehicle undercarriage at high frame rate; identifying (203) movement or stoppage of vehicle; identifying (204) relevant frames captured by undercarriage imaging device; storing (205) relevant undercarriage frames in memory; mosaicing (206) of individual undercarriage frames to create a composite image; and post-processing (207) the stitched image to remove duplication and stretching.
[0064] In an exemplary embodiment of the invention, the method step of identifying relevant frames captured by the undercarriage imaging device comprises: analyzing the outputs of motion detection and presence of vehicle over scanner at the instant of capturing an undercarriage image frame, and discarding frames when both the vehicle motion and ‘vehicle-present-over-scanner’ signals are absent.
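The frame-relevance rule described above can be sketched as a simple predicate over the two input signals sampled at capture time. This is a minimal illustration only; the function name, tuple layout and signal names below are hypothetical and not part of the specification:

```python
def filter_frames(captured):
    """captured: iterable of (frame, motion_detected, vehicle_over_scanner)
    tuples, with both signals sampled at the instant the frame was captured.
    A frame is discarded only when both signals are absent; otherwise it is
    kept as a relevant frame."""
    return [frame for frame, motion, present in captured if motion or present]
```

In this sketch a frame captured while the vehicle is stationary over the scanner is still retained, because the ‘vehicle-present-over-scanner’ signal is present; only frames with no vehicle and no motion are dropped.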
[0065] In an exemplary embodiment of the invention, the method step of storing relevant undercarriage frames in memory comprises: creating a buffer arranged as a nested circular buffer (inner buffer) within a linear/circular buffer (outer buffer); creating an inner buffer dynamically, whenever vehicle stops over scanner; selecting an appropriate size for outer buffer so as to store all the undercarriage frames captured, when vehicle moves over scanner without stopping; selecting an appropriate size for inner buffer to store undercarriage frames captured when vehicle starts to move after stopping, before sensors indicate motion detection, so that any sensor delay is handled properly; storing relevant undercarriage frames in successive locations of outer buffer when ‘vehicle-stopped-over-scanner’ signal is absent; storing relevant undercarriage frames in successive locations of inner buffer when ‘vehicle-stopped-over-scanner’ signal is present and overwriting contents of inner buffer, once it is filled, from starting position of inner buffer, as long as ‘vehicle-stopped-over-scanner’ signal is present.
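The nested-buffer arrangement above may be sketched roughly as follows. This is a minimal plain-Python illustration, assuming a list-backed outer buffer and a fixed-size inner circular buffer created on demand; the class and attribute names are hypothetical:

```python
class NestedFrameBuffer:
    """Outer buffer for frames captured while the vehicle moves; inner
    circular buffer, created when the vehicle stops over the scanner, whose
    contents are overwritten from the start once it is filled."""

    def __init__(self, inner_size):
        self.outer = []            # outer buffer (linear here, for simplicity)
        self.inner = None          # inner circular buffer, created on demand
        self.inner_size = inner_size
        self.inner_pos = 0         # total writes into the inner buffer

    def store(self, frame, vehicle_stopped):
        if vehicle_stopped:
            if self.inner is None:                  # vehicle just stopped
                self.inner = [None] * self.inner_size
                self.inner_pos = 0
            # circular write: overwrite oldest entries once the buffer is full
            self.inner[self.inner_pos % self.inner_size] = frame
            self.inner_pos += 1
        else:
            if self.inner is not None:              # vehicle has restarted
                # flush the inner buffer, oldest first, into the outer buffer
                n = min(self.inner_pos, self.inner_size)
                full = self.inner_pos > self.inner_size
                start = self.inner_pos % self.inner_size if full else 0
                for i in range(n):
                    self.outer.append(self.inner[(start + i) % self.inner_size])
                self.inner = None
            self.outer.append(frame)
```

Because the inner buffer wraps, only the last `inner_size` frames captured during the stop survive into the outer buffer, which is what absorbs the sensor delay between the vehicle restarting and motion being reported.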
[0066] In an exemplary embodiment of the invention, the method step of mosaicing undercarriage frames comprises identification of overlapping regions between successive frames, by correlating two consecutive frames and seamless blending of successive frames, without duplication of overlapping regions.
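A one-dimensional sketch of this mosaicing step, assuming each frame is represented as a list of pixel columns and that an exact-match search stands in for the correlation between consecutive frames (the function names are illustrative):

```python
def find_overlap(prev_cols, next_cols):
    """Return the largest k such that the last k columns of prev_cols equal
    the first k columns of next_cols (a stand-in for the correlation search
    over candidate overlaps between consecutive frames)."""
    max_k = min(len(prev_cols), len(next_cols))
    for k in range(max_k, 0, -1):
        if prev_cols[-k:] == next_cols[:k]:
            return k
    return 0

def mosaic(frames):
    """Stitch frames left to right, appending only the non-overlapping part
    of each successive frame so overlapping regions are never duplicated."""
    stitched = list(frames[0])
    for frame in frames[1:]:
        k = find_overlap(stitched, frame)
        stitched.extend(frame[k:])
    return stitched
```

A real implementation would correlate two-dimensional image strips (for example by normalized cross-correlation) and blend across the seam rather than concatenating, but the bookkeeping is the same.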
[0067] In an exemplary embodiment of the invention, the method step of post-processing the stitched image comprises: taking consecutive blocks of appropriate size from the stitched image and finding similarity between them; identifying consecutive blocks with sufficient similarity as locations of vehicle stopping (stationary region); identifying stretching in the undercarriage image arising out of non-uniform motion during stopping and restarting of the vehicle, present around the beginning and end positions of the stationary region; removing stretching in the undercarriage image, caused while stopping and restarting the vehicle, by down-sampling the stretched portions; identifying duplication in the undercarriage image arising out of frames captured and stitched while the vehicle is completely stopped, present at the middle of the stationary region; and removing duplication in the undercarriage image, caused by frames captured during the stopped state of the vehicle, by discarding duplicate blocks.
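A simplified one-dimensional sketch of this post-processing, treating the stitched image as a list of columns: exact block equality stands in for the similarity measure, keeping one block stands in for discarding duplicates, and taking every other column stands in for down-sampling the transitions. All names, parameters and factors below are illustrative assumptions:

```python
def find_stationary(cols, block):
    """Return (start, end) column indices of the maximal run of identical
    consecutive non-overlapping blocks, or None if no such run exists."""
    blocks = [cols[i:i + block] for i in range(0, len(cols) - block + 1, block)]
    start = end = None
    for i in range(len(blocks) - 1):
        if blocks[i] == blocks[i + 1]:
            if start is None:
                start = i
            end = i + 1
        elif start is not None:
            break                                   # only the first run is handled here
    return None if start is None else (start * block, (end + 1) * block)

def remove_distortions(cols, block=2, transition=2):
    """Keep one copy of the duplicated stationary region and down-sample
    `transition` columns on each side of it by a factor of 2."""
    region = find_stationary(cols, block)
    if region is None:
        return list(cols)
    s, e = region
    left, right = cols[:s], cols[e:]
    keep = cols[s:s + block]                        # one copy of the repeated content
    lt = left[len(left) - transition:][::2]         # down-sampled leading transition
    rt = right[:transition][::2]                    # down-sampled trailing transition
    return left[:len(left) - transition] + lt + keep + rt + right[transition:]
```

In the real system the blocks are two-dimensional image strips, similarity would be a correlation or difference score against a threshold, and the down-sampling factor would be chosen from the observed stretch, but the region classification (motion, transition, stationary) follows the same pattern.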
[0068] Thus, a system and method to provide undercarriage images of same quality, whether the vehicle moves with a uniform velocity, non-uniform velocity or temporarily stops over the scanner, has been described. This can be used in portable or fixed undercarriage inspection systems. The same principle may be extended to other imaging systems as well.
[0069] The foregoing description of the invention has been set forth merely to illustrate the invention and is not intended to be limiting. Since modifications of the disclosed embodiments incorporating the substance of the invention may occur to persons skilled in the art, the invention should be construed to include everything within the scope of the invention.
CLAIMS:
1. A method for undistorted undercarriage image formation of a vehicle, the method comprising:
detecting (201), by a detection unit (104), a presence of the vehicle over an undercarriage imaging device (105);
capturing (202), by the undercarriage imaging unit (105), frames of an undercarriage of the vehicle;
determining (203), by an auxiliary imaging unit (102), a movement of the vehicle;
identifying (204), by a frame buffering unit (106), one or more relevant frames of the undercarriage of the vehicle;
buffering (205), by the frame buffering unit (106), relevant frames from the identified relevant frames;
creating (206), by an image processing unit (107), a stitched image from the frame buffered images; and
performing (207), by the image processing unit (107), post-processing of the stitched image to form an undistorted image of undercarriage of the vehicle.
2. The method as claimed in claim 1, wherein the step of identifying (204) relevant frames by the frame buffering unit, comprises of:
discarding frames captured by the undercarriage imaging unit when the vehicle is determined to be neither moving nor the vehicle is detected to be present directly above the undercarriage imaging unit and storing the remaining frames.
3. The method as claimed in claim 1, wherein the step of buffering (205) of captured undercarriage frames, comprises the steps of:
creating (302, 301) an outer buffer and inner buffer of predetermined sizes, wherein the inner buffer is nested inside the outer buffer and the inner buffer is dynamically created to store the frames, when the vehicle is detected to be over the undercarriage imaging device;
determining whether the vehicle has stopped over the undercarriage imaging unit;
if yes, capturing and storing frames in the inner buffer and overwriting contents of inner buffer once it is filled;
else
capturing and storing frames in the outer buffer.
4. The method as claimed in claim 1, wherein the step of creating (206) the stitched image, comprises:
identifying overlapping regions between successive frames, by correlating one or more consecutive frames; and
blending of successive frames, without duplication of overlapping regions to create the stitched image.
5. The method as claimed in claim 1, wherein performing (207) post-processing of stitched image comprises:
dividing the stitched image for:
identifying stationary regions from the stitched image;
identifying transition regions from the stitched image;
identifying motion regions from the stitched image; and
removing distortions in undercarriage image due to stationary regions and transition regions.
6. The method as claimed in claim 5, wherein identifying stationary regions comprises the steps of:
identifying consecutive non-overlapping blocks of appropriate size from the stitched image;
determining similarity for the consecutive non-overlapping blocks;
classifying the similar non-overlapping blocks as stationary region; and
storing starting and ending position of the stationary region in the stitched image.
7. The method as claimed in claim 5, wherein identifying transition regions comprises the steps of:
identifying a block of appropriate width before the starting position of stationary region;
identifying a block of appropriate width after the ending position of stationary region; and
classifying the blocks as transition region.
9. The method as claimed in claim 5, wherein identifying motion regions comprises of:
classifying the region outside the transition region and the stationary region as a motion region.
10. The method as claimed in claim 5, wherein removing distortions in undercarriage image due to stationary regions and transition regions comprises:
discarding (604) the region between starting and ending positions of stationary region;
down-sampling (606, 608) transition regions by an appropriate factor along the direction of length of vehicle; and
merging motion regions (609) and down-sampled transition regions to form the undistorted stitched undercarriage image.
11. A system for undistorted image formation in vehicle undercarriage inspection application, the system comprising of:
a detection unit (104) configured to identify presence of vehicle,
an auxiliary imaging unit (102) configured to determine the movement of the vehicle;
an undercarriage imaging unit (105) configured to capture plurality of images of undercarriage of the vehicle;
a frame buffering unit (106) configured to identify relevant frames from the captured frames and buffer frames of the captured images; and
an image processing unit (107) configured to create a stitched image from frame buffered images and perform post processing of the stitched image to form an undistorted image of the undercarriage of the vehicle.
12. The system as claimed in claim 11, wherein the frame buffering unit (106) is configured to discard frames captured by the undercarriage imaging unit when the vehicle is determined to be neither moving nor the vehicle is detected to be present directly above the undercarriage imaging unit and store the remaining frames.
13. The system as claimed in claim 11, wherein the undercarriage imaging unit comprises:
an undercarriage camera (105) configured to capture plurality of images of vehicle undercarriage without any loss of information; and
a lighting unit (103) configured to illuminate the vehicle undercarriage.
14. The system as claimed in claim 11, wherein the frame buffering unit (106) is configured to:
create an outer buffer and inner buffer of predetermined sizes, wherein the inner buffer is nested inside the outer buffer;
if the vehicle is in movement, store frames in the outer buffer; or
if the vehicle is above undercarriage imaging unit, check whether the vehicle is stopped over undercarriage imaging device and store frames in the inner buffer and overwrite contents of inner buffer once it is filled.
15. The system as claimed in claim 11, wherein the image processing unit (107) is configured to:
input stitched image to the post processing unit;
divide the stitched image into plurality of regions selected from one or more motion regions, one or more transition regions and a stationary region;
identify a starting point and an ending point of the stationary region and discarding the area between the starting point and ending point of the stationary region; and
down-sample each transition region along the direction of the length of the vehicle; and merge motion regions with the down-sampled transition regions for forming the undistorted image.
| # | Name | Date |
|---|---|---|
| 1 | 202041012884-PROVISIONAL SPECIFICATION [24-03-2020(online)].pdf | 2020-03-24 |
| 2 | 202041012884-FORM 1 [24-03-2020(online)].pdf | 2020-03-24 |
| 3 | 202041012884-FIGURE OF ABSTRACT [24-03-2020(online)].jpg | 2020-03-24 |
| 4 | 202041012884-DRAWINGS [24-03-2020(online)].pdf | 2020-03-24 |
| 5 | 202041012884-FORM-26 [21-06-2020(online)].pdf | 2020-06-21 |
| 6 | 202041012884-FORM-26 [24-06-2020(online)].pdf | 2020-06-24 |
| 7 | 202041012884-Proof of Right [24-09-2020(online)].pdf | 2020-09-24 |
| 8 | 202041012884-FORM 3 [30-09-2020(online)].pdf | 2020-09-30 |
| 9 | 202041012884-ENDORSEMENT BY INVENTORS [30-09-2020(online)].pdf | 2020-09-30 |
| 10 | 202041012884-DRAWING [30-09-2020(online)].pdf | 2020-09-30 |
| 11 | 202041012884-CORRESPONDENCE-OTHERS [30-09-2020(online)].pdf | 2020-09-30 |
| 12 | 202041012884-COMPLETE SPECIFICATION [30-09-2020(online)].pdf | 2020-09-30 |
| 13 | 202041012884_Correspondence_05-10-2020.pdf | 2020-10-05 |
| 14 | 202041012884-FORM 18 [27-06-2022(online)].pdf | 2022-06-27 |
| 15 | 202041012884-FER.pdf | 2023-01-06 |
| 16 | 202041012884-ABSTRACT [03-07-2023(online)].pdf | 2023-07-03 |
| 17 | 202041012884-CLAIMS [03-07-2023(online)].pdf | 2023-07-03 |
| 18 | 202041012884-COMPLETE SPECIFICATION [03-07-2023(online)].pdf | 2023-07-03 |
| 19 | 202041012884-DRAWING [03-07-2023(online)].pdf | 2023-07-03 |
| 20 | 202041012884-FER_SER_REPLY [03-07-2023(online)].pdf | 2023-07-03 |
| 21 | 202041012884-OTHERS [03-07-2023(online)].pdf | 2023-07-03 |
| 22 | 202041012884-US(14)-HearingNotice-(HearingDate-04-06-2024).pdf | 2024-05-14 |
| 23 | 202041012884-Correspondence to notify the Controller [27-05-2024(online)].pdf | 2024-05-27 |
| 24 | 202041012884-FORM-26 [27-05-2024(online)].pdf | 2024-05-27 |
| 25 | 202041012884-US(14)-ExtendedHearingNotice-(HearingDate-12-06-2024).pdf | 2024-05-31 |
| 26 | 202041012884-Correspondence to notify the Controller [04-06-2024(online)].pdf | 2024-06-04 |
| 27 | 202041012884-Written submissions and relevant documents [21-06-2024(online)].pdf | 2024-06-21 |
| 28 | 202041012884-IntimationOfGrant27-06-2024.pdf | 2024-06-27 |
| 29 | 202041012884-PatentCertificate27-06-2024.pdf | 2024-06-27 |
| 30 | 202041012884-PROOF OF ALTERATION [04-10-2024(online)].pdf | 2024-10-04 |
| 31 | 202041012884-Response to office action [01-11-2024(online)].pdf | 2024-11-01 |
| 32 | searchE_05-01-2023.pdf | |