
Traffic Management System For Emergency Vehicles

Abstract: A system and method are disclosed for managing traffic at intersections so that emergency vehicles can move without waiting in traffic. A live feed of road intersections is collected through image acquisition units 102 installed at the intersections, and the received feed is analysed using a recurrent neural network (RNN) architecture such as long short-term memory (LSTM). The analysis facilitates determining the traffic density on the various lanes at each intersection and also detects the location of emergency vehicles in the traffic. Upon detection of an emergency vehicle, the associated traffic signal lights 108 can be switched from red to green, thus enabling the emergency vehicle to move easily.


Patent Information

Application #
Filing Date
05 January 2022
Publication Number
44/2022
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Parent Application

Applicants

Chitkara Innovation Incubator Foundation
SCO: 160-161, Sector - 9c, Madhya Marg, Chandigarh- 160009, India.

Inventors

1. LILHORE, Umesh Kumar
Associate Professor, Chitkara University Institute of Engineering and Technology, Chitkara University, Chandigarh-Patiala National Highway, Village Jansla, Rajpura, Punjab - 140401, India.
2. SIMAIYA, Sarita
Associate Professor, Chitkara University Institute of Engineering and Technology, Chitkara University, Chandigarh-Patiala National Highway, Village Jansla, Rajpura, Punjab - 140401, India.
3. SAURABH, Praneet
Associate Professor, School of Engineering and Technology, Mody University, Lakshmangarh, Rajasthan - 332311, India.
4. SANDHU, Jasminder
Assistant Professor, Chitkara University Institute of Engineering and Technology, Chitkara University, Chandigarh-Patiala National Highway, Village Jansla, Rajpura, Punjab - 140401, India.

Specification

TECHNICAL FIELD
[0001] The present disclosure relates in general to systems for controlling traffic signals, and more particularly to identifying an emergency vehicle in traffic and correspondingly controlling traffic signals.

BACKGROUND
[0002] Background description includes information that may be useful in understanding the present invention. It is not an admission that any of the information provided herein is prior art or relevant to the presently claimed invention, or that any publication specifically or implicitly referenced is prior art.
[0003] Emergency services vehicles of police, fire, ambulances, etc. are designated to respond to emergencies and are bound to transport services, material and people. These vehicles are authorized by law to safely pass through intersections disregarding traffic rules. Emergency vehicles are equipped with visual and audible warning devices to alert nearby persons of their approach. However, rapid population growth in cities has resulted in high traffic densities and unavoidable congestion for these vehicles; in such cases the ambulance arrives late and, unfortunately, its purpose is sometimes not served.
[0004] Conventionally, commuters give way to emergency vehicles on hearing the vehicle’s siren. However, due to heavy traffic and automated traffic signals, an emergency vehicle that needs to pass through an intersection must wait for the signal. This imposes considerable delay in response time and can lead to the loss of valuable human life and property.
[0005] Existing systems disclose the use of infrared frequencies to alert traffic of the arrival of an emergency vehicle. However, infrared communication has its shortcomings. As infrared frequencies are blocked by hard objects such as walls, doors, heavy vehicles, smoke, dust, fog, etc., infrared-based communication systems have line-of-sight limitations. For example, an infrared-based transmitter will not be able to transmit a signal from an ambulance to a traffic light console when a truck is blocking the signal’s path.
[0006] Therefore, to overcome the above-mentioned drawbacks, there is a need to develop a system through which an emergency vehicle can move safely without stopping in traffic.

OBJECTS OF THE PRESENT DISCLOSURE
[0007] Some of the objects of the present disclosure, which at least one embodiment herein satisfies are as listed herein below.
[0008] An object of the present disclosure is to provide faster movement of emergency vehicles in traffic.
[0009] It is another object of the present disclosure to control traffic signals in a smart way.
[0010] It is another object of the present disclosure to provide real time location tracking of the emergency vehicle on the road.
[0011] It is another object of the present disclosure to provide an economical solution for faster movement of emergency vehicles.
[0012] Other objects, features, and advantages will become apparent to those skilled in the art from the detailed description and appended claims.

SUMMARY
[0013] Various aspects of the present disclosure relate to systems for controlling traffic signals. In particular, the present disclosure relates to a system for identifying an emergency vehicle in traffic and correspondingly controlling traffic signals.
[0014] According to an aspect of the present disclosure, a traffic management system for emergency vehicles is disclosed. The system may include one or more image acquisition units positioned at a plurality of intersections of a road and configured to acquire one or more videos of each of the plurality of intersections in real time, and a processing unit operatively coupled to each of the one or more image acquisition units.
[0015] In an aspect, the processing unit comprises a learning engine coupled with a memory, the memory storing instructions executable by the learning engine and configured to receive the acquired videos of the plurality of intersections, analyse the received videos to determine traffic intensity at each of the plurality of intersections of the road, check for an emergency vehicle in the traffic on at least one of the plurality of intersections of the road, extract the location of any emergency vehicle found in the traffic, and correspondingly generate one or more control signals, wherein the one or more control signals are transmitted to one or more traffic signal lights to illuminate a specific light, thereby minimizing traffic congestion at the associated intersection and enabling the emergency vehicle to move.
[0016] In an aspect, upon detection of the emergency vehicle in the traffic on at least one of the plurality of intersections, associated traffic signal light illuminates a green color light for a time span, where the instructions are transmitted by the processing unit in the form of one or more control signals.
[0017] In an aspect, upon illuminating the green color light on at least one of the traffic signal lights 108, the other traffic signal lights positioned at the intersection of the road illuminate a red color light to halt movement of vehicles from the other sides of the road for the time span.
[0018] In an aspect, based on the location of the emergency vehicle in the traffic, distance of the emergency vehicle from the associated traffic signal light is computed, and correspondingly the time span of illuminating the specific color at the one or more traffic signal lights is estimated.
[0019] In an aspect, one or more display units may be positioned at the road, and configured to display traffic information, wherein the one or more display units are selected from a group consisting of but not limited to light emitting diode (LED), liquid crystal display (LCD), organic light emitting diode (OLED), and LED matrix.
[0020] In an aspect, the learning engine includes an artificial recurrent neural network (RNN) architecture.
[0021] In an aspect, the artificial recurrent neural network (RNN) architecture is a long short-term memory (LSTM) architecture.
[0022] In an aspect, the emergency vehicle may be selected from the group comprising an ambulance, fire-fighting vehicles, a police patrol van, a SWAT vehicle, civil emergency service vehicles, including crew vehicles dealing with gas leakage, electricity short circuits and water issues, and other priority vehicles, including vehicles of important functionaries, dignitaries and high-rank government officials.
[0023] Another aspect of the present disclosure pertains to a method for controlling traffic for emergency vehicles. The method may include receiving, at a processing unit, one or more videos from one or more image acquisition units positioned at a plurality of intersections of a road; displaying the one or more videos at a display unit; analysing the received videos by a learning engine of the processing unit to determine traffic intensity at each of the plurality of intersections of the road and checking for an emergency vehicle in the traffic; and extracting the location of the emergency vehicle upon its detection in the traffic and correspondingly generating one or more control signals, where the one or more control signals carry information to illuminate a specific light at one or more traffic signal lights.
[0024] Various objects, features, aspects and advantages of the inventive subject matter will become more apparent from the following detailed description of preferred embodiments, along with the accompanying drawing figures in which like numerals represent like components.

BRIEF DESCRIPTION OF DRAWINGS
[0025] The following is a detailed description of embodiments of the disclosure depicted in the accompanying drawings. The embodiments are in such detail as to clearly communicate the disclosure. However, the amount of detail offered is not intended to limit the anticipated variations of embodiments; on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the present disclosure as defined by the appended claims.
[0026] In the following description, numerous specific details are set forth in order to provide a thorough understanding of embodiments of the present invention. It will be apparent to one skilled in the art that embodiments of the present invention may be practiced without some of these specific details.
[0027] FIG. 1 illustrates an exemplary block diagram of a traffic management system, in accordance with an embodiment of the present disclosure.
[0028] FIG. 2 illustrates an exemplary functional components of a processing unit of the proposed system, in accordance with an embodiment of the present disclosure.
[0029] FIG. 3 illustrates an exemplary method for controlling traffic for emergency vehicle, in accordance with an embodiment of the present disclosure.

DETAILED DESCRIPTION
[0030] The following is a detailed description of embodiments of the disclosure depicted in the accompanying drawings. The embodiments are in such detail as to clearly communicate the disclosure. However, the amount of detail offered is not intended to limit the anticipated variations of embodiments; on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the scope of the present disclosure as defined by the appended claims.
[0031] In the following description, numerous specific details are set forth in order to provide a thorough understanding of embodiments of the present invention. It will be apparent to one skilled in the art that embodiments of the present invention may be practiced without some of these specific details. Embodiments explained herein relate to system for controlling traffic signals. In particular the present disclosure relates to a system for identifying an emergency vehicle in traffic and correspondingly controlling traffic signals that enables the emergency vehicle to reach its destination on time without any delay.
[0032] According to an embodiment of the present disclosure, a traffic management system 100 (also referred to as system 100 hereinafter) is provided for detecting an emergency vehicle in traffic and controlling the traffic to give the emergency vehicle a way to move fast. The emergency vehicle can include, but is not limited to, an ambulance, fire-fighting vehicles, a police patrol van, a SWAT vehicle, civil emergency service vehicles, including crew vehicles dealing with gas leakage, electricity short circuits and water issues, and other priority vehicles, including vehicles of important functionaries, dignitaries and high-rank government officials.
[0033] In an embodiment, the system 100 can include one or more image acquisition units 102 (collectively referred as image acquisition units 102, and individually referred as image acquisition unit 102), and a processing unit 104. The processing unit 104 can be operatively coupled with each of the image acquisition units 102 and one or more traffic signal lights 108 positioned on various intersections of roads of an area such as a city. For example, the intersection can be equipped with a traffic control pre-emption system such as the Opticom® Priority Control System, the OPTICOM GPS priority control system, or a networked system.
[0034] Each of the image acquisition units 102 can include, but is not limited to, a webcam or surveillance camera that can be installed at various positions on the road, in proximity to the traffic signal lights 108 at each intersection. The image acquisition units 102 can be configured for live streaming (i.e., videos) of the traffic at the intersections, and videos of an intersection (having approximately 8 lanes) can be transmitted to the processing unit 104 for analysis. In an embodiment, the processing unit 104 can include a learning engine 106 coupled with a memory, the memory storing instructions executable by the learning engine and configured to analyse the videos of the intersections acquired by the image acquisition units. The learning engine 106 can be an artificial recurrent neural network (RNN) architecture, including a long short-term memory (LSTM) architecture, that facilitates analysing the received videos of the intersections.
[0035] In an embodiment, the processing unit 104 can be configured to determine the traffic intensity at each intersection of the road by counting the number of vehicles or by any other method. For example, the videos can be broken into frames or images, and the traffic intensity can then be determined using the learning engine. Further, the processing unit can be configured to detect an emergency vehicle in the traffic on any lane at an intersection of the road.
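The frame-splitting and counting step described in paragraph [0035] can be sketched as follows. This is a minimal illustration only: `detect_vehicles` is a hypothetical placeholder that reads pre-labelled frame data, standing in for the disclosed learning engine's detector.

```python
# Sketch of paragraph [0035]: split a video into frames and estimate
# traffic intensity by counting detected vehicles per frame.
# `detect_vehicles` is a hypothetical stand-in for the learning engine.

def detect_vehicles(frame):
    """Placeholder detector: a frame is modelled as a list of object
    labels; a real system would run a trained network here."""
    return [obj for obj in frame if obj in ("car", "truck", "bus", "ambulance")]

def traffic_intensity(video):
    """Average vehicle count across all frames of one intersection feed."""
    counts = [len(detect_vehicles(frame)) for frame in video]
    return sum(counts) / len(counts) if counts else 0.0

def find_emergency_vehicle(video):
    """Index of the first frame containing an ambulance, or None."""
    for i, frame in enumerate(video):
        if "ambulance" in detect_vehicles(frame):
            return i
    return None
```

In practice the frames would come from the live camera feed rather than labelled lists, and detection confidence would be thresholded before counting.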
[0036] In an embodiment, the processing unit 104 can be configured to extract the location of an emergency vehicle found in the traffic and correspondingly generate one or more control signals (also referred to as control signals). The one or more control signals can be transmitted to one or more traffic signal lights 108 to illuminate a specific light, thereby minimizing traffic congestion at the associated intersection and enabling the emergency vehicle to move. For example, if an ambulance is found in the second lane at the fifteenth position and the traffic signal is red, the processing unit 104 can analyse this and instruct the associated traffic signal light 108 to illuminate green for the time span (e.g., 15 seconds) required for the ambulance to cross the intersection; after fifteen seconds, the traffic signal light 108 can operate normally. At the same time, the other traffic signal lights 108 of the intersection can be controlled so as to minimize traffic congestion at the intersection. The colors of the lights can be switched such that the ambulance can pass the road crossing immediately, whereas other vehicles approaching the same crossing have to wait until the ambulance has passed.
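The time-span estimate in this example can be sketched as a simple distance-over-speed calculation. The average crossing speed and the clamping bounds below are illustrative assumptions, not values from the disclosure:

```python
# Sketch of the time-span logic of paragraph [0036]: given the
# emergency vehicle's distance from the signal, estimate how long the
# green light must stay on. Speed and clamp bounds are assumed values.

def green_time_span(distance_m, avg_speed_mps=5.0, min_s=5, max_s=60):
    """Seconds of green needed for the vehicle to reach and cross the
    intersection, clamped to a safe operating range."""
    needed = distance_m / avg_speed_mps
    return max(min_s, min(max_s, round(needed)))
```

With these assumed values, an ambulance 75 m from the stop line would get a 15-second green window, matching the order of magnitude in the example above.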
[0037] In an embodiment, the traffic information and the ambulance location can be displayed on one or more display units 110 (collectively referred to as display units 110, and individually referred to as display unit 110), where the display units 110 can be positioned on the road or near the intersections. For example, a display unit 110 can be configured to display information about the ambulance and can display a message asking drivers waiting in other lanes to have patience, as their traffic signal will remain red for longer.
[0038] In an embodiment, the display unit 110 can be selected from a group consisting of, but not limited to, a light emitting diode (LED) display, liquid crystal display (LCD), organic light emitting diode (OLED) display, and LED matrix. The display units 110 can be large, so that people sitting in vehicles can read the message easily. For example, inside a display unit 110, a series of LED arrays can be placed adjacent to one another to form the screen of the display unit 110, which shows both static and animated warnings, advisories and other information. Although a series of LED arrays is shown in the illustrated embodiment, one large LED array can be used. Control circuit boards can be provided above the LED arrays. The display unit 110 can display an image of an ambulance icon, a photo of the intersection, or traffic data indicating the direction of the ambulance.
[0039] In an exemplary embodiment, the system 100 can include a communication unit (not shown) to establish communication between various components of the system 100. The system 100 can also include a Wireless Fidelity (Wi-Fi), Bluetooth, Li-Fi, Wireless Local Area Network (WLAN), ZigBee, or GSM module.
[0040] As illustrated in FIG. 2, a processing unit 104 can include one or more processor(s) 202. The one or more processor(s) 202 can be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, logic circuitries, and/or any devices that manipulate data based on operational instructions. Among other capabilities, the one or more processor(s) 202 can be configured to fetch and execute computer readable instructions stored in a memory 204 of the processing unit 104. The memory 204 can store one or more computer-readable instructions or routines, which may be fetched and executed to create or share the data units over a network service. The memory 204 can include any non-transitory storage device including, for example, volatile memory such as RAM, or non-volatile memory such as EPROM, flash memory, and the likes.
[0041] In an embodiment, the processing unit 104 can also include an interface(s) 206. The interface(s) 206 may comprise a variety of interfaces, for example, interfaces for data input and output devices, referred to as I/O devices, storage devices, and the like. The interface(s) 206 may facilitate communication of system 100. The interface(s) 206 may also provide a communication pathway for one or more components of the system 100. Examples of such components include, but are not limited to, learning engine(s) 106 and database 208.
[0042] In an embodiment, the learning engine(s) 106 can be implemented as a combination of hardware and programming (for example, programmable instructions) to implement one or more functionalities of the learning engine(s) 106. In examples described herein, such combinations of hardware and programming may be implemented in several different ways. For example, the programming for the learning engine(s) 106 may be processor-executable instructions stored on a non-transitory machine-readable storage medium, and the hardware for the learning engine(s) 106 may include a processing resource (for example, one or more processors) to execute such instructions. In the present examples, the machine-readable storage medium may store instructions that, when executed by the processing resource, implement the learning engine(s) 106. In such examples, the processing unit 104 can include the machine-readable storage medium storing the instructions and the processing resource to execute the instructions, or the machine-readable storage medium may be separate but accessible to the processing unit 104 and the processing resource. In other examples, the learning engine(s) 106 may be implemented by electronic circuitry.
[0043] In an embodiment, the learning engine(s) 106 can include an extraction unit 210, a comparison unit 212, a classification and training unit 214, a signal generation unit 216, and other unit(s) 218. The other unit(s) 218 can implement functionalities that supplement applications or functions performed by the system 100 or the learning engine(s) 106.
[0044] In an embodiment, the database 208 can include data that is either stored or generated as a result of functionalities implemented by any of the components of the learning engine(s) 106. The database 208 can be a server including several local and/or remote servers.
[0045] It would be appreciated that units being described are only exemplary units and any other unit or sub-unit may be included as part of the system 100. These units too may be merged or divided into super- units or sub-units as may be configured.
[0046] In an embodiment, the processing unit 104 can be configured to receive, in electronic form, one or more videos acquired by the image acquisition units 102, where the videos include information about the traffic at intersections on the road. The extraction unit 210 can be configured to extract frames from the videos and images of the traffic on the road. The extracted information can be transmitted to the comparison unit 212.
[0047] In an embodiment, the comparison unit 212 can be configured to compare the extracted information with a set of pre-defined images and frames of videos. The ambulance can thereby be detected, its distance from the associated traffic signal light 108 can be computed, and the information can be further transmitted to the classification and training unit 214 in machine-readable or binary form, where the classification and training unit 214 classifies the information and the signal generation unit 216 correspondingly generates and transmits control signals.
[0048] In an embodiment, the classification and training unit 214 can be configured to receive the information extracted from the images, such as the traffic density, the color and other features of the emergency vehicle, and the time span required for the emergency vehicle to cover a distance, in machine-readable or binary form, and to update and train itself based on the extracted information. The learning model can be trained based on the received and analysed information, and the learning model can be stored in the database 208.
[0049] In an embodiment, the learning engine 106 can include, but is not limited to, an artificial recurrent neural network (RNN) architecture such as a long short-term memory (LSTM) architecture, machine learning algorithms, and deep learning algorithms.
[0050] In an embodiment, the learning engine 106 can include a convolutional neural network (CNN) and a long short-term memory recurrent neural network (LSTM-RNN). In general, the CNN identifies and extracts spatial features through an iterative process of convolving the images, pooling the results of the convolving, and then repeating the process of convolving and pooling using the pooled results from the previous iteration. The CNN can be implemented in this manner until a final fully connected layer outputs a feature map or other general indication of the spatial features of the images; after, for example, several iterations (e.g., 5 iterations of the CNN), the spatial information from the CNN is used as an electronic input into the LSTM. In general, the LSTM is a type of recurrent neural network that determines temporal relationships or other temporal information about the spatial features identified by the CNN. That is, the LSTM-RNN includes aspects that account for changes between the images in order to identify the color of the light illuminated in the traffic signal light 108 and the number of vehicles ahead of the emergency vehicle. In either case, the learning engine 106 can implement the LSTM-RNN to produce a prediction of the traffic intensity and the color of the light, and uses the prediction to generate an output that identifies the particular states as statistical likelihoods or probabilities. Based on the distance of the emergency vehicle from the associated traffic signal light 108, the time span (i.e., the time required to cross the intersection from the current location of the emergency vehicle) is estimated, and the signal generation unit 216 can generate signals.
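The temporal step of the CNN-then-LSTM flow above can be illustrated with a single LSTM cell written in plain Python. This is a toy sketch over a one-dimensional feature with arbitrarily chosen weights, purely to show the gating arithmetic; it is not the disclosed network:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, w):
    """One LSTM time step over a scalar feature x (e.g. one CNN
    spatial-feature value per video frame). `w` maps a gate name to
    (input weight, recurrent weight, bias) -- toy values here."""
    def gate(name, squash):
        wi, wh, b = w[name]
        return squash(wi * x + wh * h_prev + b)
    f = gate("forget", sigmoid)   # how much of the old cell state to keep
    i = gate("input", sigmoid)    # how much new candidate content to write
    g = gate("cand", math.tanh)   # candidate cell content
    o = gate("output", sigmoid)   # how much cell state to expose as output
    c = f * c_prev + i * g
    h = o * math.tanh(c)
    return h, c

# Run a short sequence of per-frame feature values through the cell.
w = {k: (1.0, 0.5, 0.0) for k in ("forget", "input", "cand", "output")}
h, c = 0.0, 0.0
for x in [0.2, 0.5, 0.9]:
    h, c = lstm_step(x, h, c, w)
```

A trained LSTM would use learned weight matrices over feature vectors rather than scalars, but the update above is the same per-gate computation applied element-wise.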
[0051] In an exemplary embodiment, a VGG-19 architecture (a convolutional neural network that is 19 layers deep) can be used for training on the dataset. The VGG-19 CNN is used as a pre-processing model. It uses an alternating structure of multiple convolutional layers and non-linear activation layers, which extracts image features better than a single convolutional layer. The structure uses max pooling for down-sampling and the rectified linear unit as the activation function; that is, the largest value in an image area is selected as the pooled value of that area. The down-sampling layer is mainly used to improve the network's robustness to image distortion while retaining the main features of the sample and reducing the number of parameters; thus, it can facilitate reducing error rates and improving efficiency.
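The max-pooling operation described here ("select the largest value in the image area as the pooled value of the area") can be sketched directly, with plain Python lists standing in for one image channel:

```python
def max_pool_2x2(image):
    """2x2 max pooling with stride 2: each output value is the largest
    value in the corresponding 2x2 region of the input, halving both
    dimensions while keeping the dominant features."""
    rows, cols = len(image), len(image[0])
    return [
        [max(image[r][c], image[r][c + 1],
             image[r + 1][c], image[r + 1][c + 1])
         for c in range(0, cols - 1, 2)]
        for r in range(0, rows - 1, 2)
    ]
```

In VGG-style networks this operation is applied per channel after blocks of convolution and ReLU layers, which is where the parameter reduction described above comes from.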
[0052] In an embodiment, the signal generation unit 216 can be further configured to generate the control signals, which can be transmitted to the associated traffic signal light 108 to switch the color of the light. Upon detection of the emergency vehicle in the traffic at at least one of the intersections, the associated traffic signal light 108 can illuminate a green color light for a time span, with the instructions transmitted by the processing unit 104 in the form of control signals. Further, upon illuminating the green color light on at least one of the traffic signal lights 108, the other traffic signal lights 108 positioned at the same intersection illuminate a red color light to halt movement of vehicles from the other sides of the road for the time span.
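The coordination in this paragraph (green for the emergency vehicle's approach, red for every other approach) can be sketched as a small assignment function. The approach names below are illustrative assumptions, not part of the disclosure:

```python
# Sketch of the intersection phase logic of paragraph [0052]: the
# approach carrying the emergency vehicle gets green; all other
# approaches are held on red for the same time span.

def set_emergency_phase(approaches, emergency_approach):
    """Return a light assignment (approach -> color) giving the
    emergency approach a green light and halting the rest on red."""
    if emergency_approach not in approaches:
        raise ValueError("unknown approach: " + emergency_approach)
    return {a: ("green" if a == emergency_approach else "red")
            for a in approaches}

# Illustrative four-way intersection.
approaches = ["north", "south", "east", "west"]
phase = set_emergency_phase(approaches, "east")
```

After the time span expires, the controller would restore the normal signal cycle, as the paragraph above describes for the associated traffic signal light 108.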
[0053] As illustrated in FIG. 3, a method (300) for controlling traffic for emergency vehicles is disclosed. At step (302) the method (300) can include receiving at a processing unit 104 one or more videos acquired by one or more image acquisition units 102 positioned at a plurality of intersections of a road.
[0054] At step (304), the method (300) can include displaying, at a display unit 110, the one or more videos acquired by associated image acquisition unit 102. The display unit can display traffic on a road, or in a particular lane, thus, the driver of the vehicle can choose another lane, based on traffic densities as shown on the display unit 110.
[0055] At step (306), the method (300) can include analysing the received videos, by a learning engine 106 of the processing unit 104, to determine the traffic intensity at each of the plurality of intersections of the road, and checking for an emergency vehicle in the traffic. The learning engine 106 can include an artificial recurrent neural network (RNN) architecture, where the RNN architecture is a long short-term memory (LSTM) architecture that facilitates accurately analysing the traffic on the road.
[0056] At step (308), the method (300) can include extracting the location of the emergency vehicle upon its detection in the traffic, and correspondingly generating one or more control signals. The generated one or more control signals can carry information to illuminate a specific light at one or more traffic signal lights 108. For example, if an ambulance is found in the first lane at the tenth position and the red light of the lane is ON, upon detection of the ambulance, the system 100 can switch the red light to green until the ambulance crosses the intersection. When the ambulance crosses the intersection, the red light can be turned on again automatically. At the same time, the lights of the other lanes can be controlled correspondingly to avoid any conflicts.
[0057] Moreover, in interpreting the specification, all terms should be interpreted in the broadest possible manner consistent with the context. In particular, the terms “comprises” and “comprising” should be interpreted as referring to elements, components, or steps in a non-exclusive manner, indicating that the referenced elements, components, or steps may be present, or utilized, or combined with other elements, components, or steps that are not expressly referenced.
[0058] While the foregoing describes various embodiments of the invention, other and further embodiments of the invention may be devised without departing from the basic scope thereof. The scope of the invention is determined by the claims that follow. The invention is not limited to the described embodiments, versions or examples, which are included to enable a person having ordinary skill in the art to make and use the invention when combined with information and knowledge available to the person having ordinary skill in the art.

ADVANTAGES OF THE INVENTION
[0059] The proposed invention provides a system for faster movement of emergency vehicles in traffic.
[0060] The proposed invention provides a system to control traffic signals in a smart way.

[0061] The proposed invention provides real-time location tracking of the emergency vehicle on the road.
[0062] The proposed invention provides an economical solution for faster movement of emergency vehicles.

We Claim:

1. A traffic management system 100 for emergency vehicles, the system comprising:
one or more image acquisition units 102 positioned at a plurality of intersections of a road and configured to acquire one or more videos of each of the plurality of intersections in real time; and
a processing unit 104 operatively coupled to each of the one or more image acquisition units 102, the processing unit comprising a learning engine 106 coupled with a memory, the memory storing instructions executable by the learning engine and configured to:
receive the acquired videos of the plurality of intersections;
analyse the received videos to determine traffic intensity at each of the plurality of intersections of the road, and check for emergency vehicle in the traffic on at least one of the plurality of intersections of the road; and
extract the location of the emergency vehicle found in traffic, and correspondingly generate one or more control signals, wherein the one or more control signals are transmitted to one or more traffic signal lights 108 to illuminate a specific light, thereby minimizing traffic congestion at the associated intersection and enabling the emergency vehicle to move.
2. The traffic management system as claimed in claim 1, wherein upon detection of the emergency vehicle in the traffic on at least one of the plurality of intersections, associated traffic signal light 108 illuminates a green color light for a time span wherein the instructions are transmitted by the processing unit in the form of one or more control signals.
3. The traffic management system as claimed in claim 1, wherein upon illuminating the green color light on at least one of the traffic signal lights 108, the one or more traffic signal lights 108 positioned at the plurality of intersections of the road illuminate a red color light to halt movement of vehicles from other sides of the road for the time span.
4. The traffic management system as claimed in claim 1, wherein based on the location of the emergency vehicle in the traffic, distance of the emergency vehicle from the associated traffic signal light 108 is computed, and correspondingly the time span of illuminating the specific color at the one or more traffic signal lights is estimated.
5. The traffic management system as claimed in claim 1, wherein one or more display units 110 are positioned at the road, and configured to display traffic information, wherein the one or more display units 110 are selected from a group consisting of but not limited to light emitting diode (LED), liquid crystal display (LCD), organic light emitting diode (OLED), and LED matrix.
6. The traffic management system as claimed in claim 1, wherein the learning engine 106 includes an artificial recurrent neural network (RNN) architecture.
7. The traffic management system as claimed in claim 6, wherein the artificial recurrent neural network (RNN) architecture is a long short-term memory (LSTM) architecture.
8. The traffic management system as claimed in claim 1, wherein the emergency vehicle is selected from the group comprising an ambulance, fire-fighting vehicles, a police patrol van, a SWAT vehicle, civil emergency service vehicles, including crew vehicles dealing with gas leakage, electricity short circuits and water issues, and other priority vehicles, including vehicles of important functionaries, dignitaries and high-rank government officials.
9. A method for controlling traffic for emergency vehicles, the method comprising:
receiving at a processing unit, one or more videos from one or more image acquisition units 102 positioned at a plurality of intersections of a road;
displaying, at a display unit 110, the one or more videos;
analysing the received videos, by a learning engine 106 of the processing unit, to determine traffic intensity at each of the plurality of intersections of the road, and checking for an emergency vehicle in the traffic; and
extracting the location of the emergency vehicle, upon detection of the emergency vehicle in the traffic, and correspondingly generating one or more control signals, wherein the one or more control signals carry information to illuminate a specific light at one or more traffic signal lights.

Documents

Application Documents

# Name Date
1 202211000483-STATEMENT OF UNDERTAKING (FORM 3) [05-01-2022(online)].pdf 2022-01-05
2 202211000483-POWER OF AUTHORITY [05-01-2022(online)].pdf 2022-01-05
3 202211000483-FORM FOR STARTUP [05-01-2022(online)].pdf 2022-01-05
4 202211000483-FORM FOR SMALL ENTITY(FORM-28) [05-01-2022(online)].pdf 2022-01-05
5 202211000483-FORM 1 [05-01-2022(online)].pdf 2022-01-05
6 202211000483-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [05-01-2022(online)].pdf 2022-01-05
7 202211000483-EVIDENCE FOR REGISTRATION UNDER SSI [05-01-2022(online)].pdf 2022-01-05
8 202211000483-DRAWINGS [05-01-2022(online)].pdf 2022-01-05
9 202211000483-DECLARATION OF INVENTORSHIP (FORM 5) [05-01-2022(online)].pdf 2022-01-05
10 202211000483-COMPLETE SPECIFICATION [05-01-2022(online)].pdf 2022-01-05
11 202211000483-Proof of Right [04-07-2022(online)].pdf 2022-07-04
12 202211000483-FORM-9 [31-10-2022(online)].pdf 2022-10-31
13 202211000483-FORM 18 [10-10-2023(online)].pdf 2023-10-10
14 202211000483-FER.pdf 2025-03-22
15 202211000483-FORM 3 [20-06-2025(online)].pdf 2025-06-20
16 202211000483-FORM-5 [18-07-2025(online)].pdf 2025-07-18
17 202211000483-FORM-26 [18-07-2025(online)].pdf 2025-07-18
18 202211000483-FER_SER_REPLY [18-07-2025(online)].pdf 2025-07-18
19 202211000483-CORRESPONDENCE [18-07-2025(online)].pdf 2025-07-18
20 202211000483-CLAIMS [18-07-2025(online)].pdf 2025-07-18

Search Strategy

1 0483E_04-07-2024.pdf