ABSTRACT

A SYSTEM AND A METHOD OF IDENTIFYING AND TRACKING OBJECTS, AND ESTIMATING THE SPEED OF THE OBJECTS

A system (12) and a method (500) for identifying and tracking objects, and estimating the speed of objects (104) such as vehicles, are disclosed. The system (12) comprises a thermal imaging capturing unit (16) that utilises the heat emitted from the objects (104) and creates visible images of the objects (104). The thermal imaging capturing unit (16) identifies the objects (104) and sends a signal to the system (12). The system (12) creates a virtual speed capturing zone, i.e., a starting point (106a) and an end point (106b). The system (12) obtains the heat signature information from the thermal imaging capturing unit (16) and tracks the movement of the objects (104) from the starting point (106a) to the end point (106b). Based on the time duration taken by the objects (104) to move from the starting point (106a) to the end point (106b), the system (12) estimates the speed of the objects (104). [To be published with FIG. 2]
A SYSTEM AND A METHOD OF IDENTIFYING AND TRACKING OBJECTS, AND ESTIMATING THE SPEED OF THE OBJECTS
FIELD OF INVENTION
[01] The present invention generally relates to the field of measuring the speed of objects such as vehicles. More specifically, the present invention relates to a system and a method of identifying objects such as vehicles and tracking their speed.
BACKGROUND OF THE INVENTION
[02] There is a growing need to estimate the speed of vehicles accurately in order to take a variety of decisions. For example, estimation of vehicle speed helps in the enforcement of appropriate speed limits to increase road safety. Further, estimation of vehicle speed helps in traffic monitoring and forecasting for road networks. Proper estimation of vehicle speed can be used to enhance the safety of people and animals, reduce emissions and energy consumption, and so on.
[03] Several techniques are employed to accurately estimate the speed of vehicles. Existing techniques utilise sensors such as radar, laser, Light Detection and Ranging (Lidar) and cameras to estimate the speed of the vehicles. One such example is disclosed in United States Publication no. 20160232410, entitled “Vehicle speed detection” (“the ‘410 Publication”). The ‘410 Publication discloses improved systems and methods for determining vehicle speed along a roadway using a precisely timed sequence of images. In various embodiments, vehicle position and speed may be estimated based on geometric knowledge of certain vehicle features unlikely to vary among vehicles, e.g., boundary points along license plate characters and the knowledge that these points are coplanar on intact license plates. Embodiments of that invention facilitate determination of position and speed without requiring any manual calibration or measurement of the camera's position in space with respect to the road surface.
[04] Another example is disclosed in United States Granted Patent no. 9,083,856, entitled “Vehicle speed measurement method and system utilizing a single image capturing unit” (“the ‘856 Patent”). The ‘856 Patent discloses a vehicle speed measurement method and system for identifying a violating vehicle utilizing a single image capturing unit. A vehicle image can be captured by the image capturing unit in order to estimate the speed of the vehicle utilizing an Automatic License Plate Recognition (ALPR) unit. The license plate characters can be located and extracted from the captured image. A distance of the license plate from the image capturing unit can be calculated utilizing a physical character height from a reference point of the image capturing unit and a pixel character height generated by the ALPR unit. A position of the license plate in the field of view of the image capturing unit, along with the distance information, can be utilized to determine a height of the license plate from the road surface. The height of the license plate can be employed to accurately estimate the vehicle speed.
[05] Yet another example is disclosed in United States Granted Patent no. 11,435,474, entitled “Vehicle system for detection of oncoming vehicles” (“the ‘474 Patent”). The ‘474 Patent discloses a vehicle environment detection system (40) in an ego vehicle (1), including a sensor arrangement (4) and a main control unit (8), arranged to detect and track at least one oncoming vehicle (9) and to determine whether the ego vehicle (1) has entered a curve (17). When this is the case, the main control unit (8) is arranged to determine an ego direction (21) along which the ego vehicle (1) travels, with a corresponding ego direction angle with respect to a predetermined axis; determine a measured oncoming direction (18) of the tracked oncoming vehicle (9), with a corresponding oncoming angle with respect to the predetermined axis, during a plurality of radar cycles; determine a difference angle between the measured oncoming direction (18) and the ego direction (21); compare the difference angle with a threshold angle; and determine that the oncoming vehicle (9) is crossing if the difference angle exceeds the threshold angle.
[06] Although radar, laser, Lidar and cameras are useful in estimating the speed of vehicles, they suffer from a few problems. For example, radar, Lidar and laser sensors are very expensive to deploy. Further, cameras depend on lighting and weather conditions, which limits their usage during darkness and/or extreme weather conditions.
[07] Therefore, there is a need for a system that can estimate the speed of objects such as vehicles without costly sensors and can function under any lighting and weather conditions.
SUMMARY OF THE INVENTION
[08] The problems in the existing art are addressed by a system and a method of identifying and tracking objects, and estimating the speed of the objects, using thermal imaging.
[09] Accordingly, it is an object of the present invention to provide a system that utilises the heat emitted from the objects and creates an image of the objects for calculating the speed of the objects.
[010] In order to achieve one or more of the objects stated above, the present invention provides a system for identifying and tracking objects, and estimating the speed of the objects. The system comprises a thermal imaging capturing unit capable of recording minute differences in the heat emitted by an object in its field of view and translating the information into visible images. The object includes, but is not limited to, a vehicle, a human being, an animal, etc. The system receives the visible images and creates a virtual first point and a virtual second point. The first point indicates a starting point and the second point indicates an end point within the field of view of the thermal imaging capturing unit. The system tracks the speed of the object by estimating the time duration taken by the object to move or travel from the first point to the second point.
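Purely as an illustrative, non-limiting sketch of this principle (the function name and the 100 m zone length below are assumptions, not values taken from the specification), the estimate reduces to distance over travel time between the two virtual points:

```python
def estimate_speed_kmph(zone_length_m: float, t_first: float, t_second: float) -> float:
    """Estimate the speed of an object from the time it takes to travel from
    the virtual first (starting) point to the second (end) point.

    zone_length_m -- known road distance between the two virtual points, in metres
    t_first, t_second -- timestamps (in seconds) at which the object crosses each point
    """
    travel_time_s = t_second - t_first
    if travel_time_s <= 0:
        raise ValueError("the end point must be crossed after the starting point")
    return (zone_length_m / travel_time_s) * 3.6  # m/s converted to km/h


# Example: an object covering a 100 m zone in 4.5 s is travelling at 80 km/h.
print(round(estimate_speed_kmph(100.0, 0.0, 4.5)))  # 80
```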
[011] In one aspect of the present invention, the system checks whether the speed of the object exceeds a predetermined speed limit. Further, the system triggers an image capturing unit to capture an image identifying a unique identification of the object, such as its number plate. Subsequently, the system generates an alert notifying the object of the breach of the predetermined speed limit, along with the image captured. Optionally, if the system detects the presence of the object for more than a predetermined time period, say two minutes (for example, a vehicle parked at the side of the road or broken down), the system generates an alert to the object or to the concerned authorities. Additionally, if the system detects the object travelling from the second point to the first point, the system notifies the object and/or raises a challan or penalty to the owner of the vehicle.
[012] In one advantageous feature of the present invention, the system utilises the heat emitted from the objects, such as vehicles, and creates an image of the objects. The image is used for estimating the speed of the vehicles. When compared to existing radar, Lidar and camera-only systems, the presently disclosed system presents a relatively inexpensive and unique way of estimating the speed of the vehicles.
[013] In another advantageous feature of the present invention, the system can be deployed in a variety of applications. For example, the system can be deployed on a gantry or on a pole adjacent to the road for estimating the speed of the vehicles. Since the system utilises the thermal imaging capturing unit, the system can be used during the daytime, at night and even during extreme weather conditions (e.g., rain, fog, snow and smoke).
[014] In another advantageous feature of the present invention, the system helps to identify the object and the speed at which the object is travelling even if the object is hidden behind some obstacle or another object. Additionally, the profile i.e., shape and size of the object can be derived based on the heat signature captured by the system.
[015] In yet another advantageous feature of the present invention, the system helps in the enforcement of appropriate speed limits to increase road safety. Further, the system helps in traffic monitoring and forecasting for road networks.
[016] Further advantages and examples of the invention will be brought out in the following part of the specification, wherein detailed description is for the purpose of fully disclosing the invention without placing limitations thereon.
BRIEF DESCRIPTION OF THE DRAWINGS
[017] In the following drawings like reference numbers are used to refer to like elements. Although the following figures depict various examples of the invention, the invention is not limited to the examples depicted in the figures.
[018] FIG. 1 illustrates an environment of a system for identifying and tracking objects, and estimating the speed of the objects, in accordance with one embodiment of the present invention;
[019] FIGs. 2 to 5 show exemplary environments of identifying and tracking objects, and estimating the speed of the objects, in accordance with several embodiments of the present invention; and
[020] FIG. 6 illustrates a method of identifying and tracking objects, and estimating the speed of the objects, in accordance with one embodiment of the present invention.
DETAILED DESCRIPTION OF THE INVENTION
[021] While the invention described in the following detailed description is susceptible to various modifications and alternative forms, specific embodiments thereof will be described in detail and shown by way of example. It should be understood, however, that there is no intent to limit example embodiments of the present invention to the particular forms disclosed. Conversely, example embodiments are to cover all modifications, equivalents, and alternatives falling within the scope of the invention.
[022] It should be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements are not limited by these terms. These terms are only used to distinguish one element from another. For example, a first element could be termed a second element, and, similarly, a second element could be termed a first element, without departing from the scope of the present invention.
[023] The terminology used herein is for the purpose of describing particular embodiments only and is not intended to limit the invention. As used herein, the singular forms “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should be further understood that the terms “comprises,” “comprising,” “includes,” “including,” and/or “having” specify the presence of stated features, integers, steps, operations, elements, and/or components when used herein, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
[024] Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which the present invention belongs. It should be further understood that terms, e.g., those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and are not to be interpreted in an idealized or overly formal sense unless expressly so defined herein.
[025] Various features and embodiments of a system for identifying and tracking objects, and estimating the speed of the objects are explained in conjunction with the description of FIGs. 1-6.
[026] FIG. 1 shows an environment 10 in which a system 12 for identifying and tracking objects, and estimating the speed of the objects, is implemented, in accordance with one exemplary embodiment of the present invention. The system 12 includes a server or a database comprising an application that executes functions for identifying objects and tracking the speed of the objects. In one embodiment, the system 12 operates as a standalone device or connects to other (e.g., networked) systems. A person skilled in the art appreciates that the system 12 can be implemented in various computing systems, environments, and/or configurations, such as a workstation, an electronic device, a mainframe computer, a laptop, and so on. In a networked deployment, the system 12 operates in the capacity of a server or a client machine in a server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment. A toll operator, law enforcement agency, traffic management authority or any other specific entity manages the system 12 for identifying objects and tracking the speed of the objects, depending on the need.
[027] The system 12 includes a processor (not shown) such as a central processing unit (CPU), a graphics processing unit (GPU) or both. The system 12 also includes a memory (not shown). The memory and/or the processor stores program instructions (i.e., software). The program instructions are transmitted or received over a network 14 utilizing any one of a number of well-known transfer protocols. The network 14 may be a wireless network, a wired network or a combination thereof. The network 14 can be implemented as one of the different types of networks, such as an intranet, a local area network (LAN), a wide area network (WAN), the internet, and the like. The network 14 may either be a dedicated network or a shared network. A shared network represents an association of the different types of networks that use a variety of protocols, for example, Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), Wireless Application Protocol (WAP), and the like, to communicate with one another. Further, the network 14 may include a variety of network devices, including routers, bridges, servers, computing devices, storage devices, and the like.
[028] The system 12 communicatively connects to a thermal imaging capturing unit 16 via the network 14. In one exemplary implementation, the thermal imaging capturing unit 16 is installed at a toll booth or any other traffic junction where the movement of objects/vehicles is high. For instance, the thermal imaging capturing unit 16 is mounted on a pole at the side of the road or at the centre of the road. The thermal imaging capturing unit 16 is enclosed in a housing (not shown) made of a compact, lightweight and shock-resistant material. Further, the housing is water resistant and allows for use in all weather conditions. The thermal imaging capturing unit 16 utilises the heat emitted from the objects and creates image(s) of the objects. The present invention is explained considering that the objects are land-based vehicles such as cars, buses, two-wheelers, three-wheelers, bicycles, trucks and other moving objects. However, it is possible to install the thermal imaging capturing unit 16 at a sea port to identify objects such as boats, ships or other water-based transport vehicles and track their speed without departing from the scope of the present invention.
[029] In the present invention, the thermal imaging capturing unit 16 is a thermal camera capable of recording minute differences in the heat emitted by the objects, i.e., vehicles, and translating the information into visible images of the vehicles. The thermal imaging capturing unit 16 utilises the thermal contrast of the objects and provides vision of the objects, thereby allowing the objects to be identified and tracked in darkness and/or extreme weather conditions.
[030] Still referring to FIG. 1, the system 12 communicatively connects to a plurality of user devices, i.e., a first user device 18a, a second user device 18b, etc., collectively referred to as user devices 18, or simply user device 18 when a single user device is referred to. The system 12 and the plurality of user devices 18 communicatively connect via the network 14. The user device 18 includes, but is not limited to, a mobile phone, a laptop, a desktop computer, a tablet, a wrist watch and other electronic devices. In one example, a user such as a law enforcement officer or a toll operator operates the user device 18 to access the data stored and processed by the system 12.
[031] Now referring to FIG. 2, a perspective view of the setup and the field of view of the thermal imaging capturing unit 16 is shown, in accordance with one exemplary embodiment of the present invention. As specified above, the thermal imaging capturing unit 16 is installed at a toll booth or any other traffic junction where the movement of objects/vehicles is high. Consider that the thermal imaging capturing unit 16 is installed at the side of a highway on a gantry, pole or any other structure, depending on the need. FIG. 2 shows an environment 100 showing a field of view 102 captured by the thermal imaging capturing unit 16. In the present example, the thermal imaging capturing unit 16 is installed such that it is able to capture the heat emitted by objects/vehicles 104 approximately 300 meters from its position. As such, the thermal imaging capturing unit 16 is configured to capture objects/vehicles 104 moving towards or away from its location/position, depending on the need.
[032] FIG. 2 shows the thermal imaging capturing unit 16 capturing the objects/vehicles 104 moving towards its position and passing by. Here, when the objects/vehicles 104 come into its field of view, the thermal imaging capturing unit 16 records minute differences in the heat emitted by the objects/vehicles 104. Further, the thermal imaging capturing unit 16 translates the information into visible images of the objects/vehicles 104, thereby identifying the presence of the objects/vehicles 104 in its field of view 102.
[033] Upon identifying the objects/vehicles 104, the thermal imaging capturing unit 16 sends a signal, which acts as a trigger for the system 12 to track and estimate the speed of the objects/vehicles 104. Here, the system 12 employs the thermal imaging capturing unit 16 to track the movement of the objects/vehicles 104 in its field of view 102. Concurrently or consecutively, the system 12 creates a virtual speed capturing zone 106, i.e., a starting point 106a and an end point 106b, in the field of view 102. The starting point 106a indicates a line or zone at which the system 12 begins tracking and estimating the speed of the objects/vehicles 104. The end point 106b indicates a line or zone at which the system 12 stops tracking and estimating the speed of the objects/vehicles 104. Here, the system 12 obtains the heat signature information from the thermal imaging capturing unit 16 and tracks the movement of the objects/vehicles 104 from the starting point 106a to the end point 106b. Based on the time duration taken by the objects/vehicles 104 to move from the starting point 106a to the end point 106b, the system 12 estimates the speed of the objects/vehicles 104 (i.e., speed = distance/time).
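By way of a hedged sketch only (the class, the image-row convention and the field names below are assumptions introduced for illustration, not features recited in the specification), the crossing of the starting point 106a and the end point 106b can be timestamped from successive thermal frames as follows:

```python
import time
from typing import Dict, Optional


class ZoneTimer:
    """Illustrative timer for the virtual speed capturing zone 106: it records
    when a tracked object crosses the starting point 106a and the end point 106b,
    and returns speed = distance / time once both crossings have been seen.

    start_y and end_y are assumed image-row positions of the two virtual lines;
    zone_length_m is the real-world distance between them in metres.
    """

    def __init__(self, start_y: int, end_y: int, zone_length_m: float) -> None:
        self.start_y = start_y
        self.end_y = end_y
        self.zone_length_m = zone_length_m
        self.start_times: Dict[int, float] = {}  # object id -> time of crossing 106a

    def update(self, object_id: int, centroid_y: int) -> Optional[float]:
        """Feed the latest tracked centroid row; return speed in km/h after 106b."""
        now = time.monotonic()
        if object_id not in self.start_times:
            if centroid_y >= self.start_y:       # object has crossed the starting point
                self.start_times[object_id] = now
            return None
        if centroid_y >= self.end_y:             # object has crossed the end point
            elapsed = now - self.start_times.pop(object_id)
            if elapsed > 0:
                return (self.zone_length_m / elapsed) * 3.6
        return None
```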
[034] Additionally, the system 12 creates lane zones 108. The lane zones 108 indicate zones created by the system 12 corresponding to the number of lanes available on the road or seaway. In the example illustrated in FIG. 2, the system 12 creates two lane zones 108, i.e., a first zone 108a and a second zone 108b. The system 12 detects the objects/vehicles 104 moving in the first zone 108a, the second zone 108b, or both lane zones. As specified above, the system 12 estimates the speed of the objects/vehicles 104 based on the time duration taken by the objects/vehicles 104 to move from the starting point 106a to the end point 106b. Here, the system 12 performs this estimation separately in each of the first zone 108a and the second zone 108b. This way, the system 12 tracks and estimates the speed of the objects/vehicles 104 in the first zone 108a and the second zone 108b.
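As a minimal sketch of the per-lane idea (the helper name and the single-boundary assumption are illustrative only; a real deployment could use arbitrary lane polygons), each tracked object can first be assigned to a lane zone before its crossing times are recorded:

```python
def assign_lane_zone(centroid_x: int, lane_boundary_x: int) -> str:
    """Map a tracked object's horizontal image position to one of the lane zones 108.

    lane_boundary_x is an assumed image column separating the first zone 108a
    from the second zone 108b within the field of view 102.
    """
    return "first_zone_108a" if centroid_x < lane_boundary_x else "second_zone_108b"


# Each lane zone can then keep its own ZoneTimer (see the sketch above), so the
# speed of objects in 108a and 108b is estimated independently.
```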
[035] The presently disclosed system 12 can be deployed to detect the speed of the objects/vehicles 104 in one or both directions. Here, the thermal imaging capturing unit 16 is installed at an appropriate position and identifies the objects/vehicles 104 moving from the starting point 106a to the end point 106b in a given direction. Further, the system 12 tracks the objects/vehicles 104 moving in the first zone 108a or the second zone 108b, or changing lanes between the first zone 108a and the second zone 108b, and estimates their speed.
[036] In one exemplary implementation, the system 12 tracks the speed of the objects/vehicles 104 and alerts/warns them of their speed. Here, the system 12 employs a loudspeaker or a display board, such as a signage/sign board, to display the speed at which the objects/vehicles 104 are travelling and advises them to drive below the allowed speed. This way, the objects/vehicles 104 can slow down and proceed. In another exemplary implementation, the system 12 enforces lane-specific speed limits and alerts the objects/vehicles 104. For example, consider that the first zone 108a is designated as a slow-speed lane, say 50 kilometres per hour (kmph), and the second zone 108b is designated as a high-speed lane, say 70 kmph. If the system 12 determines that the speed of the vehicles 104 moving in the first zone 108a is more than 50 kmph, then the system 12 alerts the vehicles 104 employing the loudspeaker or the display board, as explained above. Further, if the vehicles 104 are shifting lanes, i.e., moving between the first zone 108a and the second zone 108b, then the system 12 alerts the vehicles 104 employing the loudspeaker or the display board, as explained above.
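A hedged sketch of this lane-specific check is given below; the 50 kmph and 70 kmph figures come from the example above, while the function name and the message text are assumptions:

```python
from typing import Optional

# Lane-specific limits taken from the example above: 50 kmph for the slow lane
# (first zone 108a) and 70 kmph for the fast lane (second zone 108b).
LANE_LIMITS_KMPH = {"first_zone_108a": 50, "second_zone_108b": 70}


def lane_speed_alert(lane_zone: str, speed_kmph: float) -> Optional[str]:
    """Return a warning message for the loudspeaker/display board if the estimated
    speed exceeds the limit configured for that lane zone, otherwise None."""
    limit = LANE_LIMITS_KMPH[lane_zone]
    if speed_kmph > limit:
        return f"You are travelling at {speed_kmph:.0f} kmph; please slow down below {limit} kmph"
    return None


print(lane_speed_alert("first_zone_108a", 63.0))   # warning for the slow lane
print(lane_speed_alert("second_zone_108b", 63.0))  # None, within the 70 kmph limit
```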
[037] Additionally, when the system 12 estimates the speed of a vehicle 104 moving in the second zone 108b to be above the permissible limit, say 90 kmph, the system 12 employs an additional sensor such as a camera (not shown) to capture an image of the license plate number. The system 12 uses the image of the license plate number to raise a violation challan to the vehicle 104 as proof of the over-speeding.
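A short sketch of this enforcement step follows; `trigger_license_plate_camera` and `raise_challan` are hypothetical placeholders for whichever camera trigger and back-office interfaces are actually used, and only the 90 kmph figure comes from the paragraph above:

```python
PERMISSIBLE_LIMIT_KMPH = 90.0  # example limit for the second zone 108b, as above


def enforce_over_speeding(speed_kmph, trigger_license_plate_camera, raise_challan):
    """If the estimated speed is above the permissible limit, capture the number
    plate image with the additional (RGB) camera and raise a violation challan
    with the captured image attached as proof."""
    if speed_kmph > PERMISSIBLE_LIMIT_KMPH:
        plate_image = trigger_license_plate_camera()
        raise_challan(speed_kmph=speed_kmph, evidence=plate_image)
```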
[038] FIG. 3 shows an environment 200 showing a field of view 202 captured by the thermal imaging capturing unit 16. Here, the field of view 202 refers to an area captured by the thermal imaging capturing unit 16 for identifying and tracking objects 212 such as vehicles, human beings or even animals. Here, the system 12 creates a virtual starting point 206 and an end point 208 in the field of view 202. As specified above, the starting point 206 indicates a line or zone at which the system 12 begins tracking for estimating the speed of the objects/vehicles 212. The end point 208 indicates a line or zone at which the system 12 stops tracking for estimating the speed of the objects/vehicles 212. Here, the system 12 obtains the heat signature information from the thermal imaging capturing unit 16 and tracks the movement of the objects/vehicles 212 from the starting point 206 to the end point 208. When the objects 212 come within the field of view 202, the system 12 creates a bounding box 210 around each object 212 to track each object 212 moving in a first direction 214. As the system 12 obtains heat information of the objects 212, the system 12 is capable of identifying an object 212 even if that object 212 is behind another object 212 or is obstructed by another structure (not shown). Here, the first direction 214 indicates the objects 212 travelling towards the thermal imaging capturing unit 16. Optionally, the system 12 creates the bounding box 210 around each object 212 to track each object 212 moving in a second direction 218. The second direction 218 indicates a direction in which the objects 212 move away from the thermal imaging capturing unit 16. In other words, the second direction 218 indicates a direction of movement of the objects 212 which is in breach of the defined travel direction for the objects on the road 204. If the system 12 detects the objects 212 moving in the second direction 218, then the system 12 generates an alarm to notify the authorities. Optionally, the system 12 employs an additional sensor such as a camera (not shown) to capture an image of the license plate number of the vehicle/object 212. The system 12 uses the image of the license plate number to raise a violation challan to the vehicle 212 as proof of travelling in the wrong direction.
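A non-authoritative sketch of the wrong-direction check described above is shown below; the image-coordinate convention and the displacement threshold are assumptions made only for illustration:

```python
def is_wrong_direction(prev_centroid_y: int, curr_centroid_y: int,
                       min_displacement_px: int = 2) -> bool:
    """Return True if a tracked bounding box 210 appears to move in the second
    direction 218, i.e. against the defined travel direction on the road 204.

    The sign convention is an assumption: objects travelling in the permitted
    first direction 214 (towards the camera) are taken to move down the thermal
    image (increasing row coordinate), so a sustained upward movement is flagged.
    """
    return (curr_centroid_y - prev_centroid_y) < -min_displacement_px
```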
[039] As specified above, the system 12 obtains the heat signature information from the thermal imaging capturing unit 16 and tracks the movement of the objects/vehicles 212 from the starting point 206 to the end point 208. Based on the time duration taken by the objects/vehicles 212 to move/travel from the starting point 206 to the end point 208, the system 12 estimates the speed of the objects/vehicles 212 (i.e., speed = distance/time).
[040] In one exemplary embodiment, the system 12 defines a safety zone or violation zone 216. In one example, the safety zone or violation zone 216 indicates a zone on the road 204 in which the objects 212 are required to travel at a reduced speed, say 40 kmph. In the present scenario, the system 12 calculates the speed of the object 212 from the starting point 206 to the end point 208. If the speed of the object 212 is more than the allowed speed limit, then the system 12 raises an alarm or captures an image of the number plate of the object 212. Optionally, the safety zone 216 indicates a zone on the road 204 that allows the objects to stop by the side of the road without incurring a penalty or warning. In the present scenario, when the system 12 detects an object 212 that has stopped or parked, the system 12 may not take any action on the object 212. In yet another example, the safety zone 216 indicates a zone on the road 204 that strictly prohibits the objects 212 from stopping (i.e., a no-parking zone). In such a scenario, when the system 12 detects an object 212 that has stopped or parked in the safety zone 216, the system 12 may raise a violation challan to the vehicle 212 as proof of stopping in the safety zone 216.
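For the no-stopping variant of the safety zone 216, a minimal dwell-time sketch is given below; the class and its field names are illustrative assumptions, and the two-minute default merely echoes the summary above:

```python
import time
from typing import Dict


class DwellMonitor:
    """Illustrative monitor that flags objects remaining stationary inside a
    no-stopping safety/violation zone 216 for longer than a configured period."""

    def __init__(self, max_dwell_s: float = 120.0) -> None:
        self.max_dwell_s = max_dwell_s
        self.stopped_since: Dict[int, float] = {}  # object id -> time it stopped

    def update(self, object_id: int, is_stationary: bool) -> bool:
        """Return True when the object has been stationary beyond the threshold."""
        now = time.monotonic()
        if not is_stationary:
            self.stopped_since.pop(object_id, None)  # object moved; reset its timer
            return False
        started = self.stopped_since.setdefault(object_id, now)
        return (now - started) > self.max_dwell_s
```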
[041] FIG. 4 shows an environment 300 showing a field of view 302 captured by the thermal imaging capturing unit 16. Here, the field of view 302 refers to an area captured by the thermal imaging capturing unit 16 for identifying and tracking objects 312. Here, the objects 312 indicate humans or animals walking, running or passing by a road 304. As specified above, the field of view 302 includes a starting point 306 and an end point 308. The starting point 306 indicates a line or zone at which the system 12 begins tracking for estimating the speed of the objects 312. The end point 308 indicates a line or zone at which the system 12 stops tracking for estimating the speed of the objects 312. Here, the system 12 obtains the heat signature information from the thermal imaging capturing unit 16 and tracks the movement of the objects 312 in the field of view. Further, the system 12 creates a bounding box 310 around each object 312 to track each object 312 moving in the field of view 302.
[042] As specified above, when the system 12 estimates the speed of a vehicle 104 moving in the second zone 108b to be above the permissible limit, say 90 kmph, the system 12 employs an additional sensor such as a camera (not shown) to capture an image of the license plate number. FIG. 5 shows an environment 400 showing a field of view 402 captured by an image capturing unit (not shown). When the thermal imaging capturing unit 16 determines that a vehicle 406 in a field of view 404 is travelling at more than the permissible speed limit, the system 12 triggers the image capturing unit to capture an image 408 of the number plate or licence plate of the vehicle 406. The system 12 uses the image 408 of the license plate to raise a violation challan to the vehicle 406 as proof of travelling at more than the permissible speed limit.
[043] Now referring to FIG. 6, a method 500 of identifying objects such as vehicles and tracking their speed is explained, in accordance with one exemplary embodiment of the present invention. The method 500 may be described in the general context of computer executable instructions. Generally, computer executable instructions may include routines, programs, objects, components, data structures, procedures, modules, functions, etc., that perform particular functions or implement particular abstract data types. The method 500 may also be practiced in a distributed computing environment where functions are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, computer executable instructions may be located in both local and remote computer storage media, including memory storage devices.
[044] The order in which the method 500 is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method 500 or alternate methods. Additionally, blocks may be deleted from the method 500 without departing from the scope of the invention described herein. Furthermore, the method may be implemented in any suitable hardware, software, firmware, or combination thereof. However, for ease of explanation, in the embodiments described below, the method 500 may be implemented in the above-described system 12.
[045] At step 502, the system 12 communicatively connects to the thermal imaging capturing unit 16. The thermal imaging capturing unit 16 is mounted on a gantry, pole or any other structure at the road, depending on the need. The thermal imaging capturing unit 16 is positioned to capture objects in its field of view. In accordance with the present invention, the thermal imaging capturing unit 16 records minute differences in the heat emitted by the objects in its field of view and translates the information into visible images. The system 12 creates a virtual first point and a virtual second point. The first point indicates a starting point and the second point indicates an end point. At step 504, the system 12 tracks all the objects within the field of view. Further, the system 12 estimates the speed of the objects/vehicles based on the time duration taken by the objects/vehicles to move from the first point to the second point or vice versa. This way, the system 12 tracks and estimates the speed of the objects/vehicles. Further, the system 12 triggers an image capturing unit (a visible imaging sensor, i.e., an RGB colour camera) to capture an image of the license plate of the object/vehicle, as shown at step 510.
[046] At step 506, the system 12 checks whether the speed of the vehicle is more than the predetermined limit. For instance, consider that the predetermined limit is set as 80 kilometres per hour (kmph). If the speed of the vehicle is within 80 kmph, then the method 500 moves to step 508 and ends. If the system 12 determines that the speed of the vehicle is more than 80 kmph, then the method moves to step 512. At step 512, the system 12 generates an alert to notify the vehicle/object of the breach of the speed limit or violation.
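Purely as a hedged summary of this decision flow (the 80 kmph figure comes from the paragraph above; the function and parameter names are placeholders only), steps 506 to 512 can be sketched as:

```python
def method_500_decision(speed_kmph: float, limit_kmph: float = 80.0,
                        capture_plate=None, send_alert=None) -> str:
    """Illustrative condensation of steps 506-512: if the estimated speed exceeds
    the predetermined limit, trigger the visible imaging (RGB) camera and generate
    an alert; otherwise the method ends at step 508."""
    if speed_kmph <= limit_kmph:
        return "end (step 508)"
    plate_image = capture_plate() if capture_plate else None  # step 510
    if send_alert:
        send_alert(speed_kmph=speed_kmph, evidence=plate_image)  # step 512
    return "alert generated (step 512)"


print(method_500_decision(72.0))   # end (step 508)
print(method_500_decision(95.0))   # alert generated (step 512)
```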
[047] A person skilled in the art understands that the system described herein provides several advantages. In one example, the system helps in identifying the objects/vehicles without depending on the lighting and weather conditions. The system helps in estimating the speed of the vehicles in one or both directions. This reduces the need for deploying multiple cameras to estimate the speed of the vehicles. The system utilises a thermal camera which is relatively cheap compared to Lidar or radar based systems.
[048] The presently disclosed system can be deployed at highway toll stations, traffic junctions, apartments, construction sites, the mining industry, etc. The system can be used for counting the density of vehicles at a given point of time. Further, the system can be used for estimating the speed at which the vehicles are travelling. Based on the number of vehicles on the road, the speed at which the vehicles should travel can be determined. This way, traffic congestion can be avoided at particular hours of the day. In one example, the system can be used for separating commercial and passenger vehicles into the first lane and the second lane, depending on the need.
[049] In the above description, numerous specific details are set forth such as examples of some embodiments, specific components, devices, methods, in order to provide a thorough understanding of embodiments of the present invention. It will be apparent to a person of ordinary skill in the art that these specific details need not be employed, and should not be construed to limit the scope of the invention.
[050] In the development of any actual implementation, numerous implementation-specific decisions must be made to achieve the developer's specific goals, such as compliance with system-related and business-related constraints. Such a development effort might be complex and time consuming, but may nevertheless be a routine undertaking of design, fabrication, and manufacture for those of ordinary skill. Hence as various changes could be made in the above constructions without departing from the scope of the invention, it is intended that all matter contained in the above description or shown in the accompanying drawings shall be interpreted as illustrative and not in a limiting sense.
[051] The foregoing description of embodiments is provided to enable any person skilled in the art to make and use the invention of the present disclosure. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the novel principles and invention disclosed herein may be applied to other embodiments without the use of the innovative faculty. It is contemplated that additional embodiments are within the true scope of the disclosed invention.
CLAIMS

WE CLAIM:
1. A method (500) of identifying and tracking objects, and estimating the speed of the objects, the method (500) comprising the steps of:
providing (502) a thermal imaging capturing unit (16) capable of recording minute differences in the heat emitted by an object (104) in its field of view (102) and translating the information into visible images;
creating (502) a first point (106a) and a second point (106b) over the visible images, the first point (106a) indicating a starting point and the second point (106b) indicating an end point within the field of view (102) of the thermal imaging capturing unit (16); and
tracking (504) the object in the field of view (102), the tracking (504) comprising tracking the speed of the object (104) by estimating the time duration taken by the object (104) to travel from the first point (106a) to the second point (106b).
2. The method (500) as claimed in claim 1, comprising checking whether the speed of the object (104) exceeds a predetermined speed limit.
3. The method (500) as claimed in claim 2, comprising triggering an image capturing unit to capture an image identifying the object (104) using a unique identification of the object (104).
4. The method (500) as claimed in claim 3, comprising generating an alert notifying the object (104) of a breach of the predetermined speed limit along with the image captured.
5. The method (500) as claimed in claim 1, comprising identifying the object (104) travelling from the second point (106b) to the first point (106a) and notifying the object (104).
6. A system (12) for identifying and tracking objects, and estimating the speed of the objects, the system (12) comprising:
a thermal imaging capturing unit (16) capable of recording minute differences in the heat emitted by an object (104) in its field of view (102) and translating the information into visible images,
wherein the system (12) receives the visible images and creates a first point (106a) and a second point (106b) over the visible images, wherein the first point (106a) indicates a starting point and the second point (106b) indicates an end point within the field of view (102) of the thermal imaging capturing unit (16), and
wherein the system (12) tracks the objects in the field of view (102) and tracks the speed of the object (104) by estimating the time duration taken by the object (104) to travel from the first point (106a) to the second point (106b).
7. The system (12) as claimed in claim 6, wherein the system (12) checks whether the speed of the object (104) exceeds a predetermined speed limit.
8. The system (12) as claimed in claim 7, wherein the system (12) triggers an image capturing unit to capture an image identifying the object (104) using a unique identification of the object (104).
9. The system (12) as claimed in claim 8, wherein the system (12) generates an alert notifying the object of a breach of the predetermined speed limit and the image captured.
10. The system (12) as claimed in claim 6, wherein the system (12) detects a presence of the object (104) for a predetermined time period and generates an alert to the object (104).