
A System And A Method For Detecting Over Speeding Objects With Visual Evidence.

Abstract: The present invention discloses a system and an operative method for generating a time stamp integrated with visual evidence of displacement of a moving object, comprising image capturing means for generating time-stamped images of instants of moving objects covering a sequence of displacement images of said moving object; an object detection device for identifying at least one moving object in said sequence of images for its time stamp integrated with visual evidence of displacement; and a grid generator enabling mapping of said time stamp of the instant included in the sequence of images of said identified moving object in a grid structure and determining the distance of select pixel points in the said images of the identified moving object from at least one fixed reference point, transforming the said distance of pixels to the actual displacement of the said moving object on earth and superimposing the said transformed distance on the image sequence itself, thereby generating spatiotemporal data in the image sequence itself as visual evidence of the displacement of the moving object and the speed of movement, including over-speeding information based thereon.


Patent Information

Application #: 201831001323
Filing Date: 11 January 2018
Publication Number: 28/2019
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Status:
Email: anjanonline@vsnl.net
Parent Application:

Applicants

VIDEONETICS TECHNOLOGY PRIVATE LIMITED
PLOT-5, BLOCK-BP, SALT LAKE, KOLKATA, WEST BENGAL INDIA-700091.

Inventors

1. DAS, Sudeb;
AS-1/114/1, Kalyanpur Housing Asansol, West Bengal, India-713305
2. BHATTACHARYYA, Kaustubh;
Post & Village – Fingapara, Dist. North 24 Parganas, West Bengal, India-743129.
3. GORAI, Apurba;
Vill & PO – Hat Ashuria, Dist. Bankura, West Bengal, India-722204.
4. BOSE, Tuhin;
BE-1/14/1, Peyara Bagan, Deshbandhu Nagar, Kolkata West Bengal, India-700 059.
5. ACHARYA, Tinku;
E375, Baishnabghata-Patuli Township, Kolkata, West Bengal, India- 700094

Specification

Claims:
1. A system for generating evidence images covering the field of view consisting of one or more moving objects/vehicles integrated with spatiotemporal information of the moving objects of interest in the image itself for evaluating related spatiotemporal information, comprising:
image capturing means for generating time stamp imprinted images of instants of moving objects including sequence of spatiotemporal information of said moving object;
object detection device for identifying at least one moving object in said sequence of images for its time stamp integrated including visual evidence of its spatiotemporal information;
grid generator enabling mapping of said time stamp of instants including its spatiotemporal information in sequence of images of said identified moving object in a relative grid structure and generating the related spatiotemporal information of said moving object related to distance of select pixel points in the said images of identified moving object from at least one fixed reference point.

2. A system as claimed in claim 1 for generating spatio-temporal information including time stamp integrated with visual evidence of displacement of moving object comprising:
image capturing means for generating time stamp imprinted images of instants of moving objects covering sequence of displacement images of said moving object;
object detection device for identifying at least one moving object in said sequence of images for its time stamp integrated with visual evidence of displacement;
grid generator enabling mapping of said time stamp of the instant included in the sequence of images of said identified moving object in a grid structure and determining the distance of select pixel points in the said images of the identified moving object from at least one fixed reference point and generating time-stamp-integrated visual evidence of displacement of the moving object based thereon.

3. The system as claimed in claim 1 or 2, for determining speed of said moving object/vehicles comprising:
said grid generator generating images showing the position of the moving object/vehicles within the said grid-structured graphics as distances of various points from a fixed reference point, thereby generating self-sufficient evidence of the speed of the vehicle when a sequence of at least two images received from said image capture means is involved.

4. The system as claimed in any one of claims 1 to 3, comprising
said image capturing means comprising at least one video camera to capture video of objects having sequence of images of said objects with time of capturing of the images imprinted thereon;
an imaging processor operatively connected to said camera for receiving the captured video from the camera having
memory device to store the images constituting the captured video having the time of capturing such images imprinted thereon;
said object detection device enabled for accessing stored images from the memory device one after another according to their time of capturing to detect moving objects in the images and thereby store pixel position information of thus detected moving objects or any pre-defined segment thereof in the memory device;
computation device successively asserting distances between said detected pixel positions and a fixed reference point in the stored images one after another according to their time of capturing;
speed calculator embodying a transformation function for facilitating determination of the displacement of the moving objects under the field of view (FOV) of the camera between the imprinted times of capturing of two consecutive stored images involved for asserting the distances between the detected pixel positions and the fixed reference point, based on the change in said distances between the detected pixel positions and the fixed reference point in said two consecutive stored images, and thereby determining the actual speed of movement of the moving objects in the FOV plane.

5. The system as claimed in any one of claims 1 to 4, for image based vehicle speed detection including over-speeding incidence of the vehicle, comprising
said video camera disposed on an installation post ensuring the vehicle moving on street or on desired zone under surveillance comes within the FOV of the camera;
said imaging processor disposed in operative communication with the video camera to receive the captured video of the vehicle on the street or on the desired zone under surveillance and thereby analyze the captured video to detect moving vehicle within the FOV of the camera and determine the speeding information of thus detected moving vehicle.

6. The system as claimed in any one of claims 1 to 5, wherein the video camera is preferably an IP camera adapted to continuously capture images of the zone under its FOV in sufficient resolution and generate a timestamp of the instant when the image is captured.

7. The system as claimed in any one of claims 1 to 6, wherein the imaging processor connects to the camera over a network or through any other communication channel and receives streams of images along with timestamps against each image, which are sorted in the memory device of the imaging processor according to their time of capturing.

8. The system as claimed in any one of claims 1 to 7, wherein the object detection device is configured to execute a computer program to analyze the images stored in the memory device one after another according to their time of capturing and detect moving vehicles in the images.

9. The system as claimed in any one of claims 1 to 8, wherein the object detection device notes the pixel position of the detected moving vehicle or any part thereof, including the license plate of the vehicle, and stores the pixel position in the memory device.

10. The system as claimed in any one of claims 1 to 9, wherein the computation device includes
inputting means to receive fixed number of measurements related to the video camera installation and the FOV of the video camera as input;
said grid generator to virtually divide the image plane into MxN grid graphics having equal dimensions with M strips vertically and N strips horizontally relatable to a corresponding equal number of non-uniformly spaced virtual vertical and horizontal strips in the FOV panel;
wherein two or more images stored in the memory device along with their time stamps and the grid structure are superimposed to provide a resulting image showing the distance of the detected moving vehicle from the fixed reference points in each image in scale of the grid structure strips.

11. The system as claimed in any one of claims 1 to 10, wherein the inputting means is adapted to receive measurements related to the video camera installation and the FOV plane including the video camera height, specifically camera lens to ground height (AB), length of the blind zone, which is the distance from the installation post base to the surveillance zone not covered under the FOV (BC), length of the surveillance zone which is under the FOV in the longitudinal direction (CD), width of the starting surveillance zone (EF) and width of the end surveillance zone (GH).

12. The system as claimed in any one of claims 1 to 11, wherein the inputting means computes the FOV angle based on the received measurements related to the video camera installation and the FOV plane by calculating α - β, where
α is the angular displacement of the starting surveillance zone with respect to the camera installation post and β is the angular displacement of the ending surveillance zone with respect to the camera installation post.

13. The system as claimed in any one of claims 1 to 12, wherein the speed calculator determines actual speed of movement of the detected moving object including the moving vehicles in the FOV plane by
receiving distance of the moving object from the fixed reference point in nth image captured at time tn from the computing device in scale of the image plane grid structure strips;
calculating corresponding distance dn in the FOV plane by relating the grid structure strips of the received distance with corresponding virtual division strips in the FOV panel by involving transformation function;
receiving distance of the object from reference point in n+1th image captured at time tn+1 from the computing device in scale of the image plane grid structure strips;
calculating corresponding distance dn+1 in the FOV plane by relating the grid structure strips of the received distance with corresponding virtual division strips in the FOV panel by involving transformation function;
computing the speed by involving (dn+1 – dn)/(tn+1- tn).

14. The system as claimed in any one of claims 1 to 13, wherein the speed calculator comprises comparator means for comparing the computed speed with a predefined speed limit and providing feedback to a remote recipient, upon detecting that the computed speed exceeds the predefined speed limit, with the nth image and the (n+1)th image with their respective imprinted time stamps and the grid graphics visually evidencing the over-speeding incident.

15. The system as claimed in any one of claims 1 to 14, wherein the transformation function of the speed calculator computes the virtual division strip S0S1 of the FOV plane corresponding to the image plane division strip S0S1.

16. A method of generating visual evidence for speed of moving objects comprising:
generating time stamp imprinted images of instants of moving objects covering sequence of displacement images of said moving object;
identifying at least one moving object in said sequence of images for its time stamp integrated with visual evidence of displacement;
mapping of said time stamp of the instant included in the sequence of images of said identified moving object in a grid structure involving select pixel points in the said images of the identified moving object from at least one fixed reference point and generating time-stamp-integrated visual evidence of displacement of the moving object based thereon, thus producing complete spatiotemporal information of the moving object in the image sequence itself.

17. A method as claimed in claim 16 comprising
generating images with a grid-structured graphics including distance of various pixel-points in the image from a fixed reference point in standard units;
calibrating images involving transformation by converting each pixel in the image to corresponding distance on earth measured from the said reference point.

18. A method as claimed in claim 16 or 17 comprising generating images showing the position of the moving object/vehicles within the said grid-structured graphics as distances of various points from a fixed reference point, and superimposing the said transformed distance on the image sequence itself, thus generating spatiotemporal data in the image sequence itself and thereby generating self-sufficient evidence of the speed of the vehicle when a sequence of at least two images received from said image capture means is involved.
Description:
FIELD OF THE INVENTION:
The present invention relates to capturing video of moving object and analysis of the captured video for retrieving information relating to the moving objects with supportive images from the video. More specifically, the present invention is directed to develop a system and method for determining speed of an object moving under a camera field of view including the incidence of over-speeding with respect to a speed limit and generating corresponding image sequence that is calibrated for visually evidencing for such over-speeding incidence.

BACKGROUND OF THE INVENTION:
Automatic monitoring of various activities in a scene by analyzing video is a current trend in the visual computing domain. Finding the speed of a moving object under a camera field of view, including the incidence of over-speeding with respect to a speed limit, is one such application. For example, traffic rule enforcement agencies often need to find out whether a vehicle is running too fast on the street.
There are a number of technologies in use to record the speed of vehicles, including laser- or radar-based technologies which apply the Doppler effect to find the speed of vehicles.
Alternatively, camera based vehicle speed detecting devices that detect and track the objects of interest and determine the speed of the said objects are also available in the state of the art. Along with the speed data, a sequence of images showing the vehicle in the scene is produced as evidence. However, there is no information within the images themselves to find out the distance covered by the vehicle during a certain interval of time represented by two different images. A separate piece of information giving the speed of the vehicle has to be tagged with the images. This can be challenged for authenticity or correctness because the tagging of the speed data with the image is done separately by such existing camera based vehicle speed detecting devices, and there is no evidence or proof to ensure that the correct speed data has been tagged with the images. More specifically, the image sequence carries no information within itself to ascertain the speed of the moving object. Therefore, the images do not qualify as self-sufficient evidential proof of the speed of the vehicle.
Thus, there has been a need for an image based vehicle speed detecting technique which will accurately find out the incidence of over-speeding of any object or vehicle under surveillance, tagged with visual evidence qualifying as self-sufficient evidential proof of the speed of the object or vehicle. Although we have emphasized the speed of a vehicle here, the disclosed art is not limited to detecting the speed of vehicles only. The object can be any moving object in the scene: a human, an animal, a vehicle, or any inanimate object in motion.

OBJECT OF THE INVENTION:
It is thus the basic object of the present invention to develop an image based vehicle/object speed detecting system which would be adapted to analyze continuous images of vehicles/objects covering a sequence of displacement images of the moving vehicle/object and selectively track the moving vehicle/object for accurately determining the speed of the moving vehicle/object along with visual evidence of recording the speed of the object/vehicle.

Another object of the present invention is to develop an image analyzing method to analyze continuous images of vehicles/objects covering sequence of displacement images of the vehicle/object being moved and selectively track the moving vehicle/object for accurately determining speed of the moving vehicle/object.

Another object of the present invention is to develop a system for determining speed of a moving object such as vehicle under field of view of a camera including the incidence of over-speeding by the moving object or the vehicle with respect to a speed limit and generating corresponding images for visually evidencing for such over-speeding incidence.

Yet another object of the present invention is to calibrate the scene and superimpose the said calibration data on the images themselves for visually evidencing the speed of the object or vehicle in general, and over-speeding of a vehicle in particular.

SUMMARY OF THE INVENTION:
Thus, according to the basic aspect of the present invention, there is provided a system for generating images of moving objects integrating spatiotemporal information on the image itself for evaluating related spatiotemporal information, comprising:
image capturing means for generating time stamp imprinted images of instants of moving objects including sequence of spatio-temporal information of said moving object;
object detection device for identifying at least one moving object in said sequence of images for its time stamp integrated including visual evidence of its spatiotemporal information;
grid generator enabling mapping of said time stamp of instants including its spatio-temporal information in sequence of images of said identified moving object in a relative grid structure and generating the related spatiotemporal information of said moving object related to distance of select pixel points in the said images of identified moving object from at least one fixed reference point.

According to another aspect of the present invention a system is provided for generating time stamp integrated image sequence with visual evidence of displacement of moving object comprising:
image capturing means for generating time stamped images of instants of moving objects covering sequence of displacement images of said moving object;
object detection device for identifying at least one moving object in said sequence of images for its time stamp integrated with visual evidence of displacement;
grid generator enabling mapping of said time stamp of the instant included in the sequence of images of said identified moving object in a grid structure and determining the distance of select pixel points in the said images of the identified moving object from at least one fixed reference point, thus providing spatiotemporal data related to the moving object, and generating time-stamp-integrated visual evidence of displacement of the moving object based thereon.

According to another aspect, the present system for generating time-stamp-integrated visual evidence of displacement of a moving object is adapted for determining the speed of said moving object/vehicles, comprising
said grid generator generating images showing the position of the moving object/vehicles within the said grid-structured graphics as distances of various points from a fixed reference point, thereby generating self-sufficient evidence of the speed of the vehicle when a sequence of at least two images received from said image capture means is involved.

According to another aspect in the present invention, a system is provided comprising
said image capturing means comprising at least one video camera to capture video of objects;
an imaging processor operatively connected to said camera for receiving the captured video from the camera having
memory device to store continuous images constituting the captured video along with time of capturing such images;
said object detection device enabled for accessing stored images from the memory device one after another according to their time of capturing to detect moving objects in the images and thereby store pixel position information of thus detected moving objects or any pre-defined segment thereof in the memory device;
computation device successively asserting distances between said detected pixel positions and a fixed reference point in the stored images one after another according to their time of capturing;
speed calculator embodying a transformation function for facilitating determination of the displacement of the moving objects under the field of view (FOV) of the camera between the times of capturing of two consecutive stored images involved for asserting the distances between the detected pixel positions and the fixed reference point, based on the change in said distances between the detected pixel positions and the fixed reference point in said two consecutive stored images, and transforming the said change in the detected pixel positions to the actual distance travelled by the moving object on earth, thereby determining the actual speed of movement of the moving objects in the FOV plane.
In a preferred embodiment, the present system for image based vehicle speed detection including over-speeding incidence of the vehicle comprises
said video camera disposed on an installation post ensuring the vehicle moving on street or on desired zone under surveillance comes within the FOV of the camera;
said imaging processor disposed in operative communication with the video camera to receive the captured video of the vehicle on the street or on the desired zone under surveillance and thereby analyze the captured video to detect moving vehicle within the FOV of the camera and determine the speeding information of thus detected moving vehicle.
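Purely as an illustrative aid (not part of the specification), the cooperating components described above (camera feed, memory device, object detection device, grid/transformation stage and speed calculator) could be wired together along the following lines in Python; every class, field and method name here is hypothetical.

from dataclasses import dataclass
from typing import List, Tuple

# Hypothetical data carriers; names are illustrative, not from the specification.
@dataclass
class TimestampedFrame:
    timestamp_s: float                 # capture instant imprinted on the image
    image: object                      # e.g. a numpy array from the IP camera stream

@dataclass
class Detection:
    timestamp_s: float
    pixel_point: Tuple[int, int]       # tracked pixel (e.g. licence-plate centroid)

class SpeedEvidencePipeline:
    """Sketch of the described chain: store frames, detect object pixels,
    map pixels to on-ground distances, compute speed between frames."""

    def __init__(self, detector, grid_transform, speed_limit_kmph: float):
        self.memory: List[TimestampedFrame] = []   # the 'memory device'
        self.detector = detector                   # the 'object detection device'
        self.grid_transform = grid_transform       # pixel -> metres from reference
        self.speed_limit_kmph = speed_limit_kmph

    def ingest(self, frame: TimestampedFrame) -> None:
        self.memory.append(frame)                  # frames kept in capture order

    def evaluate(self) -> List[float]:
        detections = [Detection(f.timestamp_s, self.detector(f.image))
                      for f in self.memory]
        speeds = []
        for a, b in zip(detections, detections[1:]):
            da = self.grid_transform(a.pixel_point)    # metres from reference point
            db = self.grid_transform(b.pixel_point)
            v_kmph = abs(db - da) / (b.timestamp_s - a.timestamp_s) * 3.6
            speeds.append(v_kmph)
        return speeds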

In an embodiment of the present system, the video camera is preferably an IP camera adapted to continuously capture images of the zone under its FOV in sufficient resolution and generate a timestamp of the instant when the image is captured.

In an embodiment of the present system, the imaging processor connects to the camera over a network or through any other communication channel and receives streams of images along with timestamps against each image, which are sorted in the memory device of the imaging processor according to their time of capturing.

In an embodiment of the present system, the object detection device is configured to execute a computer program to analyze the images stored in the memory device one after another according to their time of capturing and detect moving vehicles in the images.

In an embodiment of the present system, the object detection device notes the pixel position of the detected moving vehicle or any part thereof including license plate of the vehicle and stores the pixel position in the memory device.

In an embodiment of the present system, the computation device includes
inputting means to receive fixed number of measurements related to the video camera installation and the FOV of the video camera as input;
said grid generator to virtually divide the image plane into MxN grid graphics having equal dimensions with M strips vertically and N strips horizontally relatable to a corresponding equal number of non-uniformly spaced virtual vertical and horizontal strips in the FOV panel;
wherein two or more images stored in the memory device along with their time stamps and the grid structure are superimposed to provide a resulting image showing the distance of the detected moving vehicle from the fixed reference points in each image in scale of the grid structure strips.

In an embodiment of the present system, the inputting means is adapted to receive measurements related to the video camera installation and the FOV plane including the video camera height specifically camera lens to ground height (AB), length of blind zone which is distance from the installation post base to the surveillance zone which is not covered under the FOV (BC), length of the surveillance zone which is under the FOV in longitudinal direction (CD), width of starting surveillance zone (EF) and width of end surveillance zone (GH).

In an embodiment of the present system, the inputting means computes the FOV angle based on the received measurements related to the video camera installation and the FOV plane by calculating α - β, where
α is the angular displacement of the starting surveillance zone with respect to the camera installation post and β is the angular displacement of the ending surveillance zone with respect to the camera installation post.

In an embodiment of the present system, the speed calculator determines actual speed of movement of the detected moving object including the moving vehicles in the FOV plane by
receiving distance of the moving object from the fixed reference point in nth image captured at time tn from the computing device in scale of the image plane grid structure strips;
calculating corresponding distance dn in the FOV plane by relating the grid structure strips of the received distance with corresponding virtual division strips in the FOV panel by involving transformation function;
receiving distance of the object from reference point in n+1th image captured at time tn+1 from the computing device in scale of the image plane grid structure strips;
calculating corresponding distance dn+1 in the FOV plane by relating the grid structure strips of the received distance with corresponding virtual division strips in the FOV panel by involving transformation function;
computing the speed by involving (dn+1 – dn)/(tn+1- tn).

In an embodiment of the present system, the speed calculator comprises comparator means for comparing the computed speed with a predefined speed limit and providing feedback to a remote recipient, upon detecting that the computed speed exceeds the predefined speed limit, with the nth image and the (n+1)th image with their respective time stamps visually evidencing the over-speeding incident.

In an embodiment of the present system, the transformation function of the speed calculator computes the virtual division strip S0S1 of the FOV plane corresponding to the image plane division strip S0S1.

According to another aspect in the present invention there is also provided a method of generating visual evidence for speed of moving objects comprising:
generating time-stamped images of instants of moving objects covering a sequence of displacement images of said moving object;
identifying at least one moving object in said sequence of images for its time stamp integrated with visual evidence of displacement;
mapping of said time stamp of the instant included in the sequence of images of said identified moving object in a grid structure involving select pixel points in the said images of the identified moving object from at least one fixed reference point and generating time-stamp-integrated visual evidence of displacement of the moving object based thereon.

The method preferably comprises the step of
generating images with a grid-structured graphics including distance of various pixel-points in the image from a fixed reference point in standard units;
calibrating images involving transformation by converting each pixel in the image to corresponding distance measured from the said reference point.

The method also comprises the step of generating images showing the position of the moving object/vehicles within the said grid-structured graphics as distances of various points from a fixed reference point, thereby generating self-sufficient evidence of the speed of the vehicle when a sequence of at least two images received from said image capture means is involved.

BRIEF DESCRIPTION OF THE ACCOMPANYING DRAWINGS:
Figure 1 shows (a) detection of speed of a moving object under a camera field of view including the incidence of over-speeding with respect to a speed limit and (b) corresponding image coordinates for visually evidencing the over-speeding incidence in accordance with the present invention.
Figure 2 shows operative sequences associated in detection of speed of a moving vehicle under camera field of view including the incidence of over-speeding and generating corresponding visual evidence for such over-speeding incidence in accordance with the present invention.

DESCRIPTION OF THE INVENTION WITH REFERENCE TO THE ACCOMPANYING DRAWINGS:
As stated hereinbefore, the present invention discloses a system for determining the speed of a moving object, such as a vehicle, under the field of view of a camera, including the incidence of over-speeding by the moving object or the vehicle with respect to a speed limit, and generating corresponding images for visually evidencing such an over-speeding incidence. According to a preferred embodiment, the system of the present invention is adapted for generating images of moving objects integrating spatiotemporal information on the image itself for evaluating related spatiotemporal information, and comprises an image capturing means for generating time stamp imprinted images of instants of moving objects including a sequence of spatiotemporal information of said moving object, an object detection device for identifying at least one moving object in said sequence of images for its time stamp integrated with visual evidence of its spatiotemporal information, and a grid generator enabling mapping of said time stamp of instants including its spatiotemporal information in the sequence of images of said identified moving object in a relative grid structure and generating the related spatiotemporal information of said moving object related to the distance of select pixel points in the said images of the identified moving object from at least one fixed reference point.
The speed detection technique of the present invention includes calibration of scene/images captured by the camera and marking the images with a grid-structured graphics showing distance of various pixel-points in the image from a fixed reference point in standard units (say, meters or inches). The distance is calculated with respect to the fixed reference point or line.
In essence, the system of the present invention calibrates the captured images by using a transformation function that converts each pixel in the image to a corresponding distance measured from the reference point. Whenever any object (say, the license plate of a vehicle) is found at any position in the scene, the distance of the object from that reference point is ascertained. This information is used to determine the speed of the vehicle, or of any object moving in the scene, just by looking at any two or more images where the license plate of the same vehicle is captured. The images show the position of the vehicles within the said grid-structured graphics that shows the distance of various points from a fixed reference point, thus creating self-sufficient evidence of the speed of the vehicle when a sequence of images, the sequence consisting of at least two images, is available. No additional equipment is required to calibrate the scene; the mounting position of the camera that captures the images of the vehicles and a few measurements of the field of view of the camera are used to calibrate the scene accurately.
Reference is first invited from the accompanying figure 1 which shows (a) detection of speed of a moving object under a camera field of view including the incidence of over-speeding with respect to a speed limit and (b) corresponding image coordinates for visually evidencing the over-speeding incidence in accordance with the system of present invention.
The present system includes the imaging device, which is preferably a video camera, and a cooperative imaging processor. The video camera is disposed on an installation post at a suitable position ensuring that the vehicles on the street or on the desired zone under surveillance come within the field of view of the camera for a certain duration of time. The imaging processor is disposed in operative communication with the video camera to receive the captured video of the vehicles on the street or on the desired zone under surveillance and thereby analyze the captured video to detect moving vehicles and retrieve the speeding information of such moving vehicles along with selected supporting image frames of said captured video for visually endorsing or cross-checking such retrieved speeding information.
The imaging processor is pre-calibrated with the following parameters: (i) AB = Camera height in meters (specifically camera lens to ground) = CAMERA_LENSE_HEIGHT, (ii) BC = Length of the blind zone in meters, i.e. the distance from the installation post base to the surveillance zone which is not covered under the field of view of the camera = CAMER_INV_DISTANCE, (iii) CD = Length of the surveillance zone which is under the field of view of the camera (longitudinal direction) in meters = CAMERA_VISIBLE_DIS, (iv) EF = Width of the starting surveillance zone in meters = CAMERA_VISIBLE_START_SIDE_DIS, (v) GH = Width of the end surveillance zone in meters = CAMERA_VISIBLE_END_SIDE_DIS.
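For illustration, the five installation-time measurements can be held in a small configuration record; the field names simply mirror the mnemonics above (including the specification's own spelling CAMER_INV_DISTANCE), while the container itself and the example values are hypothetical.

from dataclasses import dataclass

@dataclass
class CameraCalibration:
    """Fixed measurements taken at installation time (all in metres)."""
    CAMERA_LENSE_HEIGHT: float             # AB: camera lens to ground height
    CAMER_INV_DISTANCE: float              # BC: blind zone, post base to start of FOV
    CAMERA_VISIBLE_DIS: float              # CD: length of the surveillance zone (longitudinal)
    CAMERA_VISIBLE_START_SIDE_DIS: float   # EF: width of the starting surveillance zone
    CAMERA_VISIBLE_END_SIDE_DIS: float     # GH: width of the ending surveillance zone

# Example values, purely illustrative:
calib = CameraCalibration(
    CAMERA_LENSE_HEIGHT=6.0,
    CAMER_INV_DISTANCE=10.0,
    CAMERA_VISIBLE_DIS=30.0,
    CAMERA_VISIBLE_START_SIDE_DIS=8.0,
    CAMERA_VISIBLE_END_SIDE_DIS=14.0,
)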
Based on the above parameters, the imaging processor calculates the angular displacement of the starting surveillance zone with respect to the camera installation post (α) and the angular displacement of the ending surveillance zone with respect to the camera installation post (β). Based on these angular displacements, the angular expansion of the surveillance zone, i.e. the field of view (FOV) region of the camera, is determined as
FOV angle = α - β
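The closed-form expressions for α and β are not reproduced in the text above. A plausible reading, offered only as an assumption, is that they are the angles of depression from the camera lens towards the start (C) and end (D) of the surveillance zone, which would make the FOV angle α - β positive:

import math

def fov_angles(AB: float, BC: float, CD: float):
    """Plausible reconstruction (an assumption, not the specification's formula):
    alpha and beta are taken as the angles of depression from the camera lens to
    the start (C) and end (D) of the surveillance zone, so alpha > beta and the
    FOV angle is alpha - beta."""
    alpha = math.atan2(AB, BC)        # towards the start of the surveillance zone
    beta = math.atan2(AB, BC + CD)    # towards the end of the surveillance zone
    return alpha, beta, alpha - beta  # FOV angle in radians

alpha, beta, fov = fov_angles(AB=6.0, BC=10.0, CD=30.0)
print(math.degrees(fov))              # angular expansion of the surveillance zone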
The FOV region of the camera as captured by the camera disposed on the installation post in image plane is received by the imaging processor.
The imaging processor virtually divides the captured image plane into (MxN), preferably (100 x 100), grids having equal dimensions, with 100 strips vertically and 100 strips horizontally. For illustration purposes, a 4 x 4 grid representation of the image plane is shown in the accompanying figure 1(b), wherein the image plane is virtually divided into four vertical and four horizontal strips. The FOV plane is also virtually divided into an equal number of relatable divisions, the same as the divisions of the image plane; however, the separation between the vertical and horizontal strips of the FOV plane is different depending on the distance from the installation post. The accompanying figure 1(a) shows the FOV plane also virtually divided into four relatable vertical and horizontal strips. Also, as shown in figure 1(a), the FOV angle (α - β) is divided by the imaging processor into four parts to get the angle of each division.
The image plane vertical and horizontal strips representing a distance in the image plane are relatable to the vertical and horizontal strips of the corresponding division of the FOV plane to represent the equivalent distance in the FOV plane. The imaging processor of the present system is particularly configured to map a distance in the image plane to its corresponding distance in the FOV plane by involving the relation of the vertical and horizontal strips of the image plane and the FOV plane, wherein the image plane division strips representing a distance in the image plane are transformed into the FOV plane division strips representing the corresponding distance in the FOV plane. Following this transformation, for an image plane division strip S0S1 (for the division shown in figure 1(b)), the corresponding FOV plane division strip S0S1 is obtained in the FOV plane (figure 1(a)).
Thus, based on the above transformation, all horizontal as well as vertical strips of the image plane are mapped to their corresponding strips in the FOV plane for determination of a distance in the FOV plane based on a distance in the image plane. The imaging processor, after determination of the distance in the FOV plane, then tracks the movement of the object under the FOV of the camera in the image plane and thereby computes the speed of the object in the FOV plane.
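A minimal sketch of this mapping, under the same assumption about α and β as above: the image plane is cut into M uniform horizontal strips, the FOV angle is cut into M equal angular divisions, and each grid line is projected onto the ground to obtain the non-uniform strip spacing in the FOV plane. The function names and the interpolation details are illustrative, not taken from the specification.

import math

def ground_distances_of_rows(AB, BC, CD, M):
    """Map the M uniformly spaced horizontal grid lines of the image plane to
    non-uniformly spaced distances on the ground (metres from reference line C),
    assuming the FOV angle is split into M equal angular divisions."""
    alpha = math.atan2(AB, BC)            # depression angle to start of zone (assumption)
    beta = math.atan2(AB, BC + CD)        # depression angle to end of zone (assumption)
    step = (alpha - beta) / M             # equal angular division of the FOV angle
    distances = []
    for i in range(M + 1):
        depression = alpha - i * step     # i-th grid line, from near edge to far edge
        ground_from_post = AB / math.tan(depression)
        distances.append(ground_from_post - BC)
    return distances

def pixel_row_to_ground(row, image_height, AB, BC, CD, M=100):
    """Interpolate a pixel row (0 = far edge, image_height = near edge) to a
    ground distance from the reference line, in the scale of the grid strips."""
    d = ground_distances_of_rows(AB, BC, CD, M)
    g = (image_height - row) / image_height * M   # fractional grid index from near edge
    i = min(int(g), M - 1)
    frac = g - i
    return d[i] + frac * (d[i + 1] - d[i])

print(round(pixel_row_to_ground(row=300, image_height=1080,
                                AB=6.0, BC=10.0, CD=30.0), 2))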

The tracking of the moving object, such as a vehicle, in the image plane and therefrom calculating the vehicle speed in the FOV plane, including generating the images evidencing/supporting the calculated speed, is further explained hereunder with the supporting illustration of the accompanying figure 2.

As referred in figure 2, the video camera 101, which is preferably an IP camera, continuously captures images of the zone under its FOV in sufficient resolution (say, in milliseconds) and generates a timestamp of the instant when the image is captured. The video camera 101 is also enabled to stream such a sequence of images to the imaging processor 102 along with the timestamps. The timestamp may also be printed on the image.
The imaging processor 102 is enabled to receive images from the camera 101 as described above and stores the continuous images along with time of capturing such images.
The imaging processor 102 embodies an object detection device having access to the stored images in the memory device. The object detection device is configured to execute a computer program implementing an algorithm for analyzing the accessed stored images from the memory device one after another according to their time of capturing to detect moving objects, such as moving vehicles, in the images and thereby store pixel position information of the thus detected moving objects, or any pre-defined segment thereof such as the license plate of the vehicle, in the memory device.
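The specification leaves the detection algorithm itself open. Purely as a stand-in, a background-subtraction detector (here OpenCV's MOG2, which the specification does not mention) could supply the pixel position that is stored against each frame's time of capturing:

import cv2

# Illustrative stand-in only: the specification does not prescribe a detection
# algorithm. A MOG2 background subtractor marks moving regions and the
# bottom-centre of the largest region is stored as the tracked pixel position.
subtractor = cv2.createBackgroundSubtractorMOG2(history=200, detectShadows=False)

def detect_moving_object(frame):
    """Return the (x, y) pixel position of the largest moving blob, or None."""
    mask = subtractor.apply(frame)
    mask = cv2.threshold(mask, 200, 255, cv2.THRESH_BINARY)[1]
    # OpenCV >= 4 returns (contours, hierarchy)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    if cv2.contourArea(largest) < 500:           # ignore small noise blobs
        return None
    x, y, w, h = cv2.boundingRect(largest)
    return (x + w // 2, y + h)                   # bottom-centre, roughly at road level

# Hypothetical usage against a timestamped stream:
# cap = cv2.VideoCapture("rtsp://camera-url")
# ok, frame = cap.read()
# position = detect_moving_object(frame)   # stored with the frame's timestamp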
The imaging processor 102 also embodies a computation device having inputting means 105 to receive a fixed number of measurements related to the camera installation point and the FOV as input, such as
AB = Camera Height in meter (lens to ground) = CAMERA_LENSE_HEIGHT
BC = Length of blind zone in meter = CAMER_INV_DISTANCE
CD = Length of FOV (longitudinal direction) in meter = CAMERA_VISIBLE_DIS
EF = Width of starting FOV zone in meter = CAMERA_VISIBLE_START_SIDE_DIS
GH = Width of End FOV zone in meter = CAMERA_VISIBLE_END_SIDE_DIS
The inputting means is adapted to compute the FOV angle based on the above received measurements related to the video camera installation and the FOV plane by calculating α - β, where α is the angular displacement of the starting surveillance zone with respect to the camera installation post and β is the angular displacement of the ending surveillance zone with respect to the camera installation post.
The computation device further includes a grid generator 103 which virtually divides the captured image plane into said MxN grids having equal dimensions with M strips vertically and N strips horizontally which are relatable to corresponding equal number of non-uniformly spaced virtual vertical and horizontal strips in the FOV panel.
The computation device then involves two or more images along with their time stamps and also the grid data and superimposes the grid structure on each of the images. The resulting images thus show the distance of the detected moving object/vehicle from a fixed reference point in each image.
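A sketch of how the grid graphics and the imprinted timestamp might be superimposed on a stored frame so that the evidence image carries its own scale and capture time; the drawing parameters (grid density, colours, font) are arbitrary choices for illustration:

import cv2
import numpy as np

def superimpose_grid(image, M=10, N=10, timestamp="2018-01-11 12:34:30:600"):
    """Draw an M x N image-plane grid and the capture timestamp directly onto
    the frame (grid density reduced from 100 x 100 for legibility)."""
    h, w = image.shape[:2]
    out = image.copy()
    for i in range(1, M):                       # horizontal strip boundaries
        y = i * h // M
        cv2.line(out, (0, y), (w, y), (0, 255, 0), 1)
    for j in range(1, N):                       # vertical strip boundaries
        x = j * w // N
        cv2.line(out, (x, 0), (x, h), (0, 255, 0), 1)
    cv2.putText(out, timestamp, (10, 30),
                cv2.FONT_HERSHEY_SIMPLEX, 0.8, (255, 255, 255), 2)
    return out

demo = np.zeros((480, 640, 3), dtype=np.uint8)   # stand-in for a stored frame
evidence_frame = superimpose_grid(demo)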
The imaging processor includes a speed calculator 104 for transforming each pixel position of the moving objects to a distance measure in real coordinates from a fixed reference point, say the top of the image. The speed calculator 104 embodies the transformation function for mapping the image plane division strips to their corresponding FOV plane division strips to facilitate determination of the displacement of the moving objects in the FOV plane between the times of capturing of the two consecutive stored images involved for asserting the distances between the detected pixel positions and the fixed reference point, based on the change in said distances between the detected pixel positions and the fixed reference point in said two consecutive stored images. The speed calculator further calculates the actual speed of movement of the moving objects in the FOV plane from the determined displacement of the moving objects in the FOV plane.
The speed calculator determines the speed of the detected moving object/vehicle as follows:
receiving the distance of the moving object from the fixed reference point in the nth image captured at time tn from the computing device in the scale of the grid structure strips;
calculating the corresponding distance dn in the FOV plane by relating the grid structure strips of the received distance with the corresponding virtual division in the FOV panel by involving the transformation function;
receiving the distance of the object from the reference point in the (n+1)th image captured at time tn+1 from the computing device in the scale of the grid structure strips;
calculating the corresponding distance dn+1 in the FOV plane by relating the grid structure strips of the received distance with the corresponding virtual division in the FOV panel by involving the transformation function;
computing the speed as (dn+1 - dn)/(tn+1 - tn).
In a preferred embodiment of the present system, the speed calculator comprises comparator means for comparing the computed speed with a predefined speed limit and providing feedback to a remote recipient, upon detecting that the computed speed exceeds the predefined speed limit, with the nth image and the (n+1)th image with their respective time stamps visually evidencing the over-speeding incident.
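A minimal sketch of the speed and comparator steps, assuming the two distances have already been transformed into metres in the FOV plane and the timestamps into seconds; the factor 3.6 converts metres per second into km/h, and the function names are illustrative:

def speed_kmph(d_n_m: float, d_n1_m: float, t_n_s: float, t_n1_s: float) -> float:
    """Speed from two calibrated observations: (d(n+1) - d(n)) / (t(n+1) - t(n)),
    converted from metres per second to km/h (x 3.6)."""
    return (d_n1_m - d_n_m) / (t_n1_s - t_n_s) * 3.6

def over_speed_report(d_n_m, d_n1_m, t_n_s, t_n1_s, limit_kmph: float):
    """Comparator step: flag the pair of evidence images when the computed
    speed exceeds the predefined limit."""
    v = speed_kmph(d_n_m, d_n1_m, t_n_s, t_n1_s)
    return {"speed_kmph": round(v, 1), "over_speeding": v > limit_kmph}

# Distances here are already in the FOV plane (metres from the reference point),
# i.e. after the grid-strip transformation described above.
print(over_speed_report(2.0, 10.0, 0.0, 0.25, limit_kmph=60.0))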
For example, if the distances of the object from the reference point are 1.310 m and 6.350 m and the related timestamps are 12:34:30:440 and 12:34:30:600 respectively,
then the speed of the vehicle is calculated as:
(6.350 - 1.310) m / (12:34:30:600 - 12:34:30:440) = 5.040 m / 0.160 s = 31.5 m/s ≈ 113 Kmph
As all four of these data (6.350, 1.310, 12:34:30:600 and 12:34:30:440) are available, with proof, in the images themselves, one can use these images as self-sufficient evidence of the speed of the object in the scene.
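As a quick arithmetic check of the figures quoted above (treating the last timestamp field as milliseconds):

# Worked example from the text: 1.310 m at 12:34:30:440 and 6.350 m at 12:34:30:600.
distance_m = 6.350 - 1.310                  # 5.040 m travelled between the two frames
time_s = (600 - 440) / 1000.0               # 0.160 s between the imprinted timestamps
speed_ms = distance_m / time_s              # 31.5 m/s
speed_kmph = speed_ms * 3600 / 1000         # 113.4 km/h, matching the ~113 Kmph above
print(round(speed_kmph, 1))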

Documents

Application Documents

# Name Date
1 201831001323-FORM 1 [11-01-2018(online)].pdf 2018-01-11
2 201831001323-COMPLETE SPECIFICATION [11-01-2018(online)].pdf 2018-01-11
3 201831001323-DRAWINGS [11-01-2018(online)].pdf 2018-01-11
4 201831001323-STATEMENT OF UNDERTAKING (FORM 3) [11-01-2018(online)].pdf 2018-01-11
5 201831001323-FORM-26 [13-03-2018(online)].pdf 2018-03-13
6 201831001323-Proof of Right (MANDATORY) [13-03-2018(online)].pdf 2018-03-13
7 201831001323-FORM 18 [28-12-2021(online)].pdf 2021-12-28
8 201831001323-FER.pdf 2022-04-18
9 201831001323-FER_SER_REPLY [27-09-2022(online)].pdf 2022-09-27
10 201831001323-COMPLETE SPECIFICATION [27-09-2022(online)].pdf 2022-09-27
11 201831001323-CLAIMS [27-09-2022(online)].pdf 2022-09-27
12 201831001323-ABSTRACT [27-09-2022(online)].pdf 2022-09-27
13 201831001323-OTHERS [27-09-2022(online)].pdf 2022-09-27

Search Strategy

1 SearchStrategyE_18-04-2022.pdf