
Traffic Monitoring System To Facilitate Passage Of Emergency Vehicles

Abstract: The present disclosure relates to a traffic monitoring system 100, which ensures rapid and safe flow of traffic when an emergency vehicle arrives at an intersection. The system 100 comprises microphones 102, cameras 106, and a processing unit 104. The microphones 102 detect sirens of the emergency vehicles, and consequently the cameras 106 can be actuated. The cameras 106 capture real-time images of the road, and the processing unit 104 analyzes the captured real-time images to determine the presence of at least one emergency vehicle. Subsequently, the processing unit 104 can generate and transmit a set of control signals. The system 100 comprises beacons 108 configured to illuminate based on the received set of control signals, to prioritize the movement of the emergency vehicle.


Patent Information

Application #:
Filing Date: 28 May 2020
Publication Number: 49/2021
Publication Type: INA
Invention Field: ELECTRONICS
Status:
Email: info@khuranaandkhurana.com
Parent Application:
Patent Number:
Legal Status:
Grant Date: 2024-12-26
Renewal Date:

Applicants

Chitkara Innovation Incubator Foundation
SCO: 160-161, Sector - 9c, Madhya Marg, Chandigarh- 160009, India.

Inventors

1. KUMAR, Ashok
Chitkara University, Chandigarh Patiala National Highway, NH 64, Village Jahnsla, Rajpura, Punjab - 140401, India.
2. MANGLA, Monika
Lokmanya Tilak College of Engineering, Koparkhairane, Navi Mumbai, Maharashtra, India.
3. SHINDE, Subhash K
Lokmanya Tilak College of Engineering, Koparkhairane, Navi Mumbai, Maharashtra, India.
4. MEHTA, Vaishali
Karnal Institute of Technology & Management, Karnal, Haryana, India.
5. BHUSHAN, Megha
Department of Computer Science and Engineering, Koneru Lakshmaiah Education Foundation, Hyderabad, Telangana - 500075, India.

Specification

[0001] The present disclosure relates to systems and methods associated with traffic
monitoring. In particular, the present disclosure provides a traffic monitoring system to
facilitate passage of emergency vehicles.
BACKGROUND
[0002] The background description includes information that may be useful in
understanding the present invention. It is not an admission that any of the information
provided herein is prior art or relevant to the presently claimed invention, or that any
publication specifically or implicitly referenced is prior art.
[0003] The day-by-day increase in traffic congestion, especially in urban areas, is one of
the most concerning issues for society. Such continuous increase in traffic congestion may
result in unavailability of timely services and delayed emergency responses, so that in
various accidents and mis-happenings, loss of human lives, which are the greatest asset in
themselves, and of other important assets, such as machinery, industry, farms, buildings,
etc., may be caused. Hence, there is an urgent requirement to handle any accident-related
case, mis-happening, or emergency in an efficient manner, realizing the emergencies and the
need for emergency vehicles to pass through.
[0004] In most emergency situations, timely intervention may mitigate
resulting damage and prevent worsening of the situation. Further, the chances of survival of a
patient or a victim requiring urgent medical attention due to an accident, or other
mis-happening, increase with timely availability of emergency vehicles and other such
emergency aids. In order to handle such cases safely and efficiently, the emergency vehicles
are required to arrive at the site of an emergency, and from there at the hospital, in a minimal
amount of time.
[0005] Typically, emergency vehicles comprise visual and audible alarms, such as
sirens, horns, bells, flashlights, beacons, and the like, to alert other vehicles and pedestrians in
the area. However, in traffic congestion, traffic jams, and other such situations, vehicles
cannot move swiftly, resulting in chaos, due to which the visual and audible alarms cannot
provide sufficient aid for the emergency vehicles to travel swiftly and reach the emergency
site, or the hospital, in time.
[0006] At various intersections and cross-roads, the emergency vehicles may come
across numerous traffic lights. If the emergency vehicles are to wait at the traffic lights, it
may cause irreversible and escalated damage to property, assets, and/or human life. On the
contrary, if emergency vehicles violate traffic signals, it may cause additional road accidents,
as evidenced by the statistics released by the National Highway Traffic Safety Administration
(NHTSA). According to these statistics, there are approximately 6500 accidents involving
ambulances each year in the United States, and the number of accidents involving fire
trucks per year is estimated in the same report to be 31600. These accidents are due to
violation of traffic lights by emergency vehicles owing to the requirement of reaching their
destination at the earliest possible time. These statistics are eye-opening and demand an
efficient mechanism for the smooth and fastest passage of emergency vehicles.
[0007] There is, therefore, a need in the art for an efficient and cost-effective
system and method that addresses the above-mentioned problems, provides a means to
reduce occurrences of emergency vehicles being stuck in traffic congestion, and
allows easy passage of emergency vehicles.
OBJECTS OF THE PRESENT DISCLOSURE
[0008] Some of the objects of the present disclosure, which at least one embodiment
herein satisfies are as listed herein below.
[0009] It is an object of the present disclosure to provide a system for efficient traffic
monitoring.
[0010] It is another object of the present disclosure to provide a system for
recognising emergency vehicles.
[0011] It is another object of the present disclosure to provide a system to facilitate
passage of emergency vehicles.
[0012] It is another object of the present disclosure to provide a system that consumes
minimal electrical power.
[0013] It is another object of the present disclosure to provide an improved, reliable,
efficient, cost-effective, and accurate system.
[0014] These and other objects of the present invention will become readily apparent
from the following detailed description taken in conjunction with the accompanying
drawings.
SUMMARY
[0015] The present disclosure relates to systems and methods associated with traffic
monitoring. In particular, the present disclosure provides a traffic monitoring system to
facilitate passage of emergency vehicles.
[0016] An aspect of the present disclosure pertains to a traffic monitoring system to
facilitate passage of emergency vehicles, the system comprises: one or more first detectors
configured at a region of interest (ROI) to detect a set of acoustic signals; one or more second
detectors configured at the ROI to detect attributes associated with one or more emergency
vehicles; and a processing unit operatively coupled to the one or more first detectors and the
one or more second detectors, the processing unit comprising one or more processors coupled
with a memory, the memory storing instructions executable by the one or more processors
configured to: receive the set of acoustic signals from at least one of the one or more first
detectors; responsive to positive comparison between the received set of acoustic signals and
a first dataset comprising one or more pre-defined acoustic signals, generate a set of actuation
signals to facilitate actuation of at least one of the one or more second detectors, wherein the
actuation of the at least one of the one or more second detectors enable detection of attributes
associated with at least one of the one or more emergency vehicles; and generate a set of
control signals based on the detected attributes to facilitate passage of one or more emergency
vehicles.
[0017] In an aspect, the one or more first detectors comprise any or a combination of
microphone, mic, and acoustic sensor.
[0018] In an aspect, the one or more second detectors comprise any or a combination
of camera, proximity sensor, velocity detector, and acceleration detector.
[0019] In an aspect, the attributes associated with the one or more emergency vehicles
comprise any or a combination of size, shape, length, height, speed, and distance, from the
one or more second detectors, of the one or more emergency vehicles.
[0020] In an aspect, the system comprises one or more beacons configured at a pre-determined position at the ROI, and operatively coupled to the processing unit, and wherein,
based on the set of control signals, at least one of the one or more beacons may be
illuminated.
[0021] In an embodiment, the processing unit may be configured to determine a time
period based on the detected attributes associated with the at least one of the one or more
emergency vehicles.
[0022] In an embodiment, the at least one of the one or more beacons can be
illuminated for the determined time period.
[0023] In an embodiment, in case of a negative comparison between the received set of
acoustic signals and the first dataset, the one or more second detectors may remain in an
energy-saving mode.
[0024] In an embodiment, the attributes associated with the one or more emergency
vehicles may be stored in a second dataset, and correspondingly the second dataset may be
updated.
[0025] In an embodiment, the processing unit may be configured to determine the
time period based on the updation of the second dataset.
BRIEF DESCRIPTION OF THE DRAWINGS
[0026] The accompanying drawings are included to provide a further understanding
of the present disclosure, and are incorporated in and constitute a part of this specification.
The drawings illustrate exemplary embodiments of the present disclosure and, together with
the description, serve to explain the principles of the present disclosure.
[0027] The diagrams are for illustration only, which thus is not a limitation of the
present disclosure, and wherein:
[0028] FIG. 1 illustrates exemplary block diagram of the proposed traffic monitoring
system to illustrate its overall working in accordance with an embodiment of the present
disclosure.
[0029] FIG. 2 illustrates exemplary functional units of a processing unit, in
accordance with an exemplary embodiment of the present disclosure.
[0030] FIG. 3 illustrates exemplary structural diagram of the proposed traffic
monitoring system, in accordance with an exemplary embodiment of the present disclosure.
DETAILED DESCRIPTION
[0031] In the following description, numerous specific details are set forth in order to
provide a thorough understanding of embodiments of the present invention. It will be
apparent to one skilled in the art that embodiments of the present invention may be practiced
without some of these specific details.
[0032] Embodiments of the present invention may be provided as a computer
program product, which may include a machine-readable storage medium tangibly
embodying thereon instructions, which may be used to program a computer (or other
electronic devices) to perform a process. The machine-readable medium may include, but is
not limited to, fixed (hard) drives, magnetic tape, floppy diskettes, optical disks, compact disc
read-only memories (CD-ROMs), and magneto-optical disks, semiconductor memories, such
as ROMs, PROMs, random access memories (RAMs), programmable read-only memories
(PROMs), erasable PROMs (EPROMs), electrically erasable PROMs (EEPROMs), flash
memory, magnetic or optical cards, or other type of media/machine-readable medium suitable
for storing electronic instructions (e.g., computer programming code, such as software or
firmware).
[0033] Various methods described herein may be practiced by combining one or more
machine-readable storage media containing the code according to the present invention with
appropriate standard computer hardware to execute the code contained therein. An apparatus
for practicing various embodiments of the present invention may involve one or more
computers (or one or more processors within a single computer) and storage systems
containing or having network access to computer program(s) coded in accordance with
various methods described herein, and the method steps of the invention could be
accomplished by engines, routines, subroutines, or subparts of a computer program product.
[0034] If the specification states a component or feature “may”, “can”, “could”, or
“might” be included or have a characteristic, that particular component or feature is not
required to be included or have the characteristic.
[0035] As used in the description herein and throughout the claims that follow, the
meaning of “a,” “an,” and “the” includes plural reference unless the context clearly dictates
otherwise. Also, as used in the description herein, the meaning of “in” includes “in” and “on”
unless the context clearly dictates otherwise.
[0036] The recitation of ranges of values herein is merely intended to serve as a
shorthand method of referring individually to each separate value falling within the range.
Unless otherwise indicated herein, each individual value is incorporated into the specification
as if it were individually recited herein. All methods described herein can be performed in
any suitable order unless otherwise indicated herein or otherwise clearly contradicted by
context. The use of any and all examples, or exemplary language (e.g. “such as”) provided
with respect to certain embodiments herein is intended merely to better illuminate the
invention and does not pose a limitation on the scope of the invention otherwise claimed. No
language in the specification should be construed as indicating any non-claimed element
essential to the practice of the invention.
[0037] Groupings of alternative elements or embodiments of the invention disclosed
herein are not to be construed as limitations. Each group member can be referred to and
claimed individually or in any combination with other members of the group or other
elements found herein. One or more members of a group can be included in, or deleted from,
a group for reasons of convenience and/or patentability. When any such inclusion or deletion
occurs, the specification is herein deemed to contain the group as modified thus fulfilling the
written description of all groups used in the appended claims.
[0038] Exemplary embodiments will now be described more fully hereinafter with
reference to the accompanying drawings, in which exemplary embodiments are shown. This
invention may, however, be embodied in many different forms and should not be construed
as limited to the embodiments set forth herein. These embodiments are provided so that this
disclosure will be thorough and complete and will fully convey the scope of the invention to
those of ordinary skill in the art. Moreover, all statements herein reciting embodiments of the
invention, as well as specific examples thereof, are intended to encompass both structural and
functional equivalents thereof. Additionally, it is intended that such equivalents include both
currently known equivalents as well as equivalents developed in the future (i.e., any elements
developed that perform the same function, regardless of structure).
[0039] The present disclosure relates to systems and methods associated with traffic
monitoring. In particular, the present disclosure provides a system and method to facilitate
passage of emergency vehicles.
[0040] According to an aspect the present disclosure pertains to a traffic monitoring
system to facilitate passage of emergency vehicles, the system includes: one or more first
detectors configured at a region of interest (ROI) to detect a set of acoustic signals; one or
more second detectors configured at the ROI to detect attributes associated with one or more
emergency vehicles; and a processing unit operatively coupled to the one or more first
detectors and the one or more second detectors, the processing unit comprising one or more
processors coupled with a memory, the memory storing instructions executable by the one or
more processors configured to: receive the set of acoustic signals from at least one of the one
or more first detectors; responsive to positive comparison between the received set of
acoustic signals and a first dataset comprising one or more pre-defined acoustic signals,
generate a set of actuation signals to facilitate actuation of at least one of the one or more
second detectors, wherein the actuation of the at least one of the one or more second detectors
enable detection of attributes associated with at least one of the one or more emergency
vehicles; and generate a set of control signals based on the detected attributes to facilitate
passage of one or more emergency vehicles.
[0041] In an embodiment, the one or more first detectors include any or a
combination of microphone, mic, and acoustic sensor.
[0042] In an embodiment, the one or more second detectors include any or a
combination of camera, proximity sensor, velocity detector, and acceleration detector.
[0043] In an embodiment, the attributes associated with the one or more emergency
vehicles include any or a combination of size, shape, length, height, speed, and distance, from
the one or more second detectors, of the one or more emergency vehicles.
[0044] In an embodiment, the system includes one or more beacons configured at a
pre-determined position at the ROI, and operatively coupled to the processing unit, and
wherein, based on the set of control signals, at least one of the one or more beacons can be
illuminated.
[0045] In an embodiment, the processing unit can be configured to determine a time
period based on the detected attributes associated with the at least one of the one or more
emergency vehicles.
[0046] In an embodiment, the at least one of the one or more beacons can be
illuminated for the determined time period.
[0047] In an embodiment, in case of a negative comparison between the received set of
acoustic signals and the first dataset, the one or more second detectors can remain in an
energy-saving mode.
[0048] In an embodiment, the attributes associated with the one or more emergency
vehicles can be stored in a second dataset, and correspondingly the second dataset can be
updated.
[0049] In an embodiment, the processing unit can be configured to determine the time
period based on the updation of the second dataset.
[0050] FIG. 1 illustrates exemplary block diagram of the proposed traffic monitoring
system to illustrate its overall working in accordance with an embodiment of the present
disclosure.
[0051] As illustrated in FIG. 1, in an embodiment, the block diagram of the proposed
traffic monitoring system 100 (also, referred to as proposed system 100, or system 100,
hereinafter) includes one or more first detectors 102-1, 102-2… 102-N (collectively referred
to as first detectors 102, and, individually referred to as first detector 102, hereinafter). The
first detectors 102 can include any or a combination of microphone, mic, acoustic sensor, and
the likes. The first detectors 102 can be systematically placed at pre-defined locations at a
region of interest (ROI), and can be configured to detect a set of acoustic signals around the
ROI, where the set of acoustic signals can be associated with horns of vehicles, entities, and
the likes.
[0052] In an embodiment, the block diagram of the proposed system 100 includes a
processing unit 104. The processing unit 104 can be operatively coupled to the first detectors
102, and can receive the set of acoustic signals from at least one of the first detectors 102. In
an embodiment, the processing unit 104 can perform a comparison between the received set of
acoustic signals and a first dataset, where the first dataset can include one or more pre-defined
acoustic signals, which can be associated with emergency vehicles. In another
embodiment, responsive to a positive comparison between the received set of acoustic signals
and the first dataset, i.e., when the received set of acoustic signals matches at least
one of the one or more pre-defined acoustic signals of the first dataset, the processing unit
104 can generate a set of actuation signals.
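The positive/negative comparison of paragraph [0052] can be sketched as follows. This is a minimal illustration only: the disclosure does not specify how the acoustic signals are represented, so the sketch assumes each signal is a small per-band energy vector, and the signature names, vector values, and similarity threshold are all hypothetical.

```python
import math

# Hypothetical first dataset: pre-defined siren signatures represented as
# per-band energy vectors (band layout and values are illustrative only).
FIRST_DATASET = {
    "ambulance_wail": [0.1, 0.6, 0.8, 0.3],
    "fire_two_tone":  [0.7, 0.2, 0.1, 0.9],
}

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length energy vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def compare_acoustic(received, dataset=FIRST_DATASET, threshold=0.95):
    """Positive comparison: the received signal matches at least one
    pre-defined siren signature above the similarity threshold."""
    return any(cosine_similarity(received, sig) >= threshold
               for sig in dataset.values())

def generate_actuation_signals(received):
    """On a positive comparison, emit actuation signals for the second
    detectors; on a negative comparison, emit nothing, so the second
    detectors stay in their energy-saving mode."""
    if compare_acoustic(received):
        return ["ACTUATE_CAMERA"]
    return []
```

A production system would derive the band energies from microphone samples (e.g., via an FFT); here the vectors stand in for that front end.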
[0053] In an embodiment, the block diagram of the proposed system 100 includes one
or more second detectors 106-1, 106-2… 106-N (collectively referred to as second detectors
106, and, individually referred to as second detector 106, hereinafter), where the second
detectors 106 can include any or a combination of camera, proximity sensor, velocity
detector, acceleration detector, and the likes. The second detectors 106 can be operatively
coupled to the processing unit 104, and at least one of the second detectors 106 can get
actuated based on the generated set of actuation signals. In an illustrative embodiment, the
second detectors 106 can include a rotatable camera, which can be positioned at a pre-determined
height on a traffic signal post to cover a wide field of view, where the traffic signal
post can be positioned at an intersection. The rotatable camera can be actuated based on the
set of actuation signals, which are generated by the processing unit 104 based on the positive
comparison between the received set of acoustic signals and the first dataset. The rotatable
camera can rotate towards the direction from which the set of acoustic signals is being
transmitted, and can capture one or more images in that direction. In an illustrative
embodiment, the second detectors 106 can be configured to automatically switch to an
energy-saving mode when not in use, that is, when the set of actuation signals is not
transmitted to the second detectors 106 through the processing unit 104.
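The actuation and energy-saving behaviour described for the rotatable camera in paragraph [0053] can be modelled as a small state machine. The class, mode names, and capture payload below are hypothetical; the disclosure does not define a camera API.

```python
class RotatableCamera:
    """Minimal sketch of a second detector 106: wakes on an actuation
    signal, rotates toward the estimated siren bearing, and otherwise
    stays in an energy-saving mode. All names are illustrative."""

    def __init__(self):
        self.mode = "energy_saving"
        self.bearing_deg = 0.0

    def on_actuation_signal(self, siren_bearing_deg):
        """Actuation from the processing unit 104: become active and
        rotate toward the direction the acoustic signals came from."""
        self.mode = "active"
        self.bearing_deg = siren_bearing_deg % 360

    def capture(self):
        """Capture images only while actuated; return None otherwise."""
        if self.mode != "active":
            return None
        return {"bearing_deg": self.bearing_deg, "frames": ["img_0", "img_1"]}

    def on_idle(self):
        """No pending actuation signals: revert to energy-saving mode."""
        self.mode = "energy_saving"
```

The bearing itself would in practice come from the first detectors, e.g., by comparing arrival times across several microphones; that estimation step is outside this sketch.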
[0054] In an embodiment, the processing unit 104 can identify one or more
emergency vehicles (also, individually referred to as emergency vehicle, and collectively
referred to as emergency vehicles, hereinafter) from the captured one or more images. In
another embodiment, the processing unit 104 can enable detection of attributes associated
with at least one of the one or more emergency vehicles, where the attributes associated with
the one or more emergency vehicles can include any or a combination of size, shape, length,
height, speed, and distance, from the one or more second detectors, of the emergency
vehicles, and the likes. In an illustrative embodiment, the attributes associated with the
emergency vehicles can be stored in a second dataset. In an embodiment, the processing unit
104 can be configured to generate a set of control signals based on the detected attributes to
facilitate passage of at least one of the emergency vehicles. In another embodiment, the
processing unit 104 can be configured to determine a time period based on the detected
attributes associated with the at least one of the emergency vehicles.
[0055] In an embodiment, the block diagram of the proposed system 100 includes one
or more beacons 108-1, 108-2…108-N (also, collectively referred to as beacons 108, and
individually referred to as beacon 108, hereinafter). The beacons 108 can be operatively
coupled to the processing unit 104, and can be positioned at a pre-determined position at the
ROI, such that the beacons 108 can be easily visible from a distant position. In an
embodiment, the set of control signals generated by the processing unit 104 can be
transmitted to the beacons 108 to illuminate at least one of the beacons 108. In an
embodiment, the at least one of the beacons 108 can be illuminated for the determined time
period, through the set of control signals, to facilitate passage of at least one of the emergency
vehicles.
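The timed beacon illumination of paragraph [0055] can be sketched as follows; the `Beacon` class, colour grouping, and timestamp-based API are hypothetical conveniences, not the disclosure's hardware interface.

```python
class Beacon:
    """Sketch of a beacon 108 driven by control signals from the
    processing unit 104; colour names and API are assumptions."""

    def __init__(self, colour):
        self.colour = colour
        self.lit_until = 0.0  # timestamp until which the beacon stays lit

    def illuminate(self, duration_s, now=0.0):
        """Illuminate the beacon for the determined time period."""
        self.lit_until = now + duration_s

    def is_lit(self, now=0.0):
        return now < self.lit_until

def apply_control_signals(beacons, priority_lane, duration_s, now=0.0):
    """Illuminate green for the priority lane and red for every other
    lane, each held for the determined time period."""
    for lane, group in beacons.items():
        colour = "green" if lane == priority_lane else "red"
        group[colour].illuminate(duration_s, now=now)
```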
[0056] FIG. 2 illustrates exemplary functional units of a processing unit, in
accordance with an exemplary embodiment of the present disclosure.
[0057] As illustrated, the processing unit 104 can include one or more processor(s)
202. The one or more processor(s) 202 can be implemented as one or more microprocessors,
microcomputers, microcontrollers, digital signal processors, central processing units, logic
circuitries, and/or any devices that manipulate data based on operational instructions. Among
other capabilities, the one or more processor(s) 202 are configured to fetch and execute
computer-readable instructions stored in a memory 204 of the processing unit 104. The
memory 204 can store one or more computer-readable instructions or routines, which may be
fetched and executed to create or share the data units over a network service. The memory
204 can include any non-transitory storage device including, for example, volatile memory
such as RAM, or non-volatile memory such as EPROM, flash memory, and the like.
[0058] In an embodiment, the processing unit 104 can also include interface(s)
206. The interface(s) 206 may include a variety of interfaces, for example, interfaces for data
input and output devices, referred to as I/O devices, storage devices, and the like. The
interface(s) 206 may facilitate communication of the processing unit 104 with various
devices coupled to the processing unit 104. The interface(s) 206 may also provide a
communication pathway for one or more components of the processing unit 104. Examples
of such components include, but are not limited to, processing engine(s) 208 and database
210.
[0059] In an embodiment, the processing engine(s) 208 can be implemented as a
combination of hardware and programming (for example, programmable instructions) to
implement one or more functionalities of the processing engine(s) 208. In examples described
herein, such combinations of hardware and programming may be implemented in several
different ways. For example, the programming for the processing engine(s) 208 may be
processor executable instructions stored on a non-transitory machine-readable storage
medium and the hardware for the processing engine(s) 208 may include a processing resource
(for example, one or more processors), to execute such instructions. In the present examples,
the machine-readable storage medium may store instructions that, when executed by the
processing resource, implement the processing engine(s) 208. In such examples, the
processing unit 104 can include the machine-readable storage medium storing the
instructions and the processing resource to execute the instructions, or the machine-readable
storage medium may be separate but accessible to the processing unit 104 and the processing
resource. In other examples, the processing engine(s) 208 may be implemented by electronic
circuitry. The database 210 can include data that is either stored or generated as a result of
functionalities implemented by any of the components of the processing engine(s) 208.
[0060] In an embodiment, the processing engine(s) 208 can include an actuating unit
212, a determining unit 214, and other unit(s) 220. The other unit(s) 220 can implement
functionalities that supplement applications or functions performed by the processing unit
104 or the processing engine(s) 208.
[0061] In an embodiment, the actuating unit 212 associated with the processing unit
104 can facilitate actuation of second detectors 106. In an embodiment, first detectors 102
can be systematically placed at pre-defined locations at a region of interest (ROI), and can be
configured to detect a set of acoustic signals around the ROI, where the set of acoustic signals
can be associated with horns of vehicles, entities, and the likes. The first detectors 102 can
include any or a combination of microphone, mic, acoustic sensor, and the likes. In an
embodiment, a comparison between the received set of acoustic signals and a first dataset can
be performed, where the first dataset can include one or more pre-defined acoustic
signals, which can be associated with emergency vehicles. In another embodiment,
responsive to a positive comparison between the received set of acoustic signals and the first
dataset, i.e., when the received set of acoustic signals matches at least one of the
one or more pre-defined acoustic signals of the first dataset, the actuating unit 212 can
facilitate generation of a set of actuation signals.
coupled to the processing unit 104, and at least one of the second detectors 106 can get
actuated based on the generated set of actuation signals, where the second detectors 106 can
include any or a combination of camera, proximity sensor, velocity detector, acceleration
detector, and the likes. In an illustrative embodiment, the second detectors 106 can include a
rotatable camera, which can be positioned at a pre-determined height at a traffic signal post to
cover a wide field of view. The rotatable camera can be actuated based on the set of actuation
signals, which are generated by the processing unit 104, based on the positive comparison
between the received set of acoustic signals and the first dataset. The rotatable camera can
rotate towards the direction from which the set of acoustic signals is being transmitted, and
can capture one or more images associated with the direction. In an illustrative embodiment,
the second detectors 106 can be configured to automatically switch to an energy-saving mode
when not in use, that is, when the set of actuation signals is not transmitted to the second
detectors 106.
[0062] In an embodiment, the determining unit 214 associated with the processing
unit 104 can enable identification of emergency vehicles from the captured one or more
images through various image processing techniques, such as, but not limited to, extraction
technique, exclusion technique, segregation technique and recognition technique. In an
illustrative embodiment, through any or a combination of extraction technique and exclusion
technique, the determining unit 214 can facilitate identification of emergency vehicles from
the one or more images, which are captured by the rotatable camera.
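The staged identification of paragraph [0062] can be illustrated as a chained pipeline. The disclosure only names the techniques (extraction, exclusion, segregation, recognition), so every stage below is a placeholder: an "image" is a dict with pre-annotated regions, the area threshold is arbitrary, and a real system would run trained detection/recognition models in place of these stubs.

```python
def extraction_stage(image):
    """Hypothetical extraction: pull candidate regions out of the frame.
    Here an 'image' is simply a dict with pre-annotated regions."""
    return image.get("regions", [])

def exclusion_stage(regions):
    """Hypothetical exclusion: discard regions too small to be a vehicle
    (the 1000-pixel threshold is an arbitrary stand-in)."""
    return [r for r in regions if r["area"] >= 1000]

def recognition_stage(regions):
    """Hypothetical recognition: keep regions whose label belongs to an
    emergency-vehicle class; a real system would run a classifier here."""
    emergency = {"ambulance", "fire_brigade", "police"}
    return [r for r in regions if r["label"] in emergency]

def identify_emergency_vehicles(image):
    """Chain the stages named in the disclosure (extraction, exclusion,
    recognition); segregation is folded into exclusion in this sketch."""
    return recognition_stage(exclusion_stage(extraction_stage(image)))
```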
[0063] In another embodiment, the determining unit 214 can enable detection of
attributes associated with at least one of the one or more emergency vehicles, where the
attributes associated with the one or more emergency vehicles can include any or a
combination of size, shape, length, height, speed, distance, from the one or more second
detectors, of the emergency vehicles, and the likes. In an illustrative embodiment, the
attributes associated with the emergency vehicles can be stored in a second dataset. In an
embodiment, based on the detected attributes, the actuating unit 212 can facilitate generation
of a set of control signals to facilitate passage of at least one of the emergency vehicles.
In an illustrative embodiment, if the emergency vehicle in the one or more images, captured
through the rotatable camera, is identified to be a fire brigade, the determining unit 214 can
enable detection of attributes associated with the identified fire brigade.
[0064] In an embodiment, the determining unit 214 can enable determination of a
time period based on the detected attributes associated with the at least one of the emergency
vehicles. In an embodiment, the proposed system 100 can include beacons 108 positioned at a
pre-determined position at the ROI, such that the beacons 108 can be easily visible from a
distant position. In an embodiment, the set of control signals can be transmitted to the
beacons 108 to illuminate at least one of the beacons 108. In an embodiment, the at least one
of the beacons 108 can be illuminated for the time period determined by the determining unit
214, where the time period can be computed and determined in such a way that the
emergency vehicles, identified by the second detectors 106, can pass swiftly through the ROI.
[0065] In an illustrative embodiment, the identified fire brigade can be found to be
positioned 750 metres away from the traffic signal post on which the rotatable camera is
configured. Accordingly, the time period can be determined through the determining unit 214
so that the fire brigade can pass through the intersection in minimal time. While
determining the time period, the traffic around the fire brigade, the lane associated with the fire
brigade, the size of the fire brigade, and the like, can also be taken into account. In another
illustrative embodiment, the beacons 108 associated with the system 100 can be of three
colours, i.e., red beacon 108-1, yellow beacon 108-2, and green beacon 108-3. For the
determined time period, the green beacon 108-3 can be illuminated for the lane in which the
fire brigade is positioned, and simultaneously, for all other lanes the red beacon 108-1 can be
illuminated, so that the fire brigade can pass through the intersection easily.
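The lane-wise beacon assignment described above can be sketched as follows. This is a minimal illustrative sketch, not part of the disclosure: the function name and the lane labels are hypothetical.

```python
def beacon_states(lanes, emergency_lane):
    """Assign a beacon colour per lane: green (beacon 108-3) for the lane
    carrying the emergency vehicle, red (beacon 108-1) for every other lane.
    Illustrative sketch; lane labels are assumptions."""
    return {lane: ("green" if lane == emergency_lane else "red") for lane in lanes}

# Fire brigade detected in the east-bound lane of a four-way intersection:
print(beacon_states(["north", "south", "east", "west"], "east"))
# → {'north': 'red', 'south': 'red', 'east': 'green', 'west': 'red'}
```

In practice the returned mapping would be encoded into the set of control signals transmitted to the beacons 108.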
[0066] In an embodiment, the attributes associated with the one or more emergency
vehicles are stored in a second dataset, where the second dataset is updated through machine
learning and other similar techniques based on the stored attributes. In an embodiment, the
determining unit 214 can facilitate determination of the time period based on the updation of
the second dataset, thereby mitigating the requirement of repeated detection, processing, and
calculations, which can aid in saving memory and increasing the processing speed of the
processing unit 104, hence making the system 100 smart and fast. In an illustrative
embodiment, this determination of the time period based on the updated second dataset
makes the process fast and efficient, allowing the fire brigade to cross the intersection
easily and in minimal time.
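One way to read the role of the second dataset is as a cache of previously computed time periods, so that similar detections skip recomputation. Below is a minimal sketch under that reading; every name, and the rounding used to group "similar" attributes, is a hypothetical choice, not taken from the disclosure.

```python
second_dataset = {}  # cache: rounded attribute key -> previously computed time period

def cached_time_period(attrs, compute):
    """Return a cached time period for similar attributes, computing and
    storing it only on a cache miss (hypothetical sketch)."""
    key = (attrs["type"], round(attrs["distance_m"], -1), round(attrs["speed_mps"]))
    if key not in second_dataset:
        second_dataset[key] = compute(attrs)
    return second_dataset[key]

calls = []
def slow_compute(attrs):
    calls.append(1)  # track how often the expensive computation actually runs
    return attrs["distance_m"] / attrs["speed_mps"]

a = {"type": "fire_brigade", "distance_m": 750, "speed_mps": 15}
b = {"type": "fire_brigade", "distance_m": 752, "speed_mps": 15}  # rounds to same key
cached_time_period(a, slow_compute)
cached_time_period(b, slow_compute)
print(len(calls))  # → 1: computed only once for the two similar detections
```

This mirrors the stated benefit: repeated detections of similar vehicles avoid repeated processing, saving memory and processing time.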
[0067] FIG. 3 illustrates exemplary structural diagram of the proposed system, in
accordance with an exemplary embodiment of the present disclosure.
[0068] In an illustrative embodiment, the proposed system 100 can include
microphones 102, such that a microphone 102 is installed at each intersection. The
microphone 102 can be configured to continuously capture the sounds at the intersection,
convert the captured sounds into electrical signals, and feed the corresponding electrical
signals to the processing unit 104, where the processing unit 104 can be a Raspberry Pi.
Further, each traffic light post 310 on the intersection can include an overhead camera 106.
[0069] In an embodiment, the microphones 102 always remain in active mode,
whereas all cameras 106 remain in passive mode, so as to conserve a significant amount of
energy on the camera side. The processing unit 104 can apply pre-processing, and
subsequently machine learning, on the electrical signals to detect the presence of emergency
vehicles, if any. Detection of any such sound related to emergency vehicles indicates
that an emergency vehicle is in the vicinity. Upon detection of an emergency
siren, the processing unit 104 further examines whether the sound of the siren is escalating. If the
sound of the siren is escalating, the processing unit 104 can determine that the emergency vehicle
is approaching the intersection. In such a situation, all cameras 106 installed on the traffic
light posts 310 on the intersection transition to active mode. Once activated, the cameras 106 can
capture images of the traffic at regular intervals. The captured images can be transmitted to
the processing unit 104, which can further analyse the received images using image
processing and machine learning to detect the lane (also referred to as channel, hereinafter)
on which the emergency vehicle is.
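The escalation check described above can be approximated by comparing the loudness (for example, the RMS amplitude) of successive microphone frames. The sketch below is a hypothetical heuristic; the frame count and threshold ratio are assumed values, not taken from the disclosure.

```python
def siren_approaching(rms_history, min_frames=3, ratio=1.05):
    """Return True when loudness rises across the last few frames,
    suggesting the siren source is approaching (illustrative heuristic;
    `ratio` and `min_frames` are assumed tuning values)."""
    if len(rms_history) < min_frames:
        return False
    recent = rms_history[-min_frames:]
    # each frame must be at least `ratio` times louder than the previous one
    return all(b >= a * ratio for a, b in zip(recent, recent[1:]))

print(siren_approaching([0.20, 0.26, 0.35, 0.47]))  # steadily louder -> True
print(siren_approaching([0.40, 0.38, 0.41]))        # not escalating  -> False
```

A True result would correspond to the situation where the processing unit 104 activates the cameras 106.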
[0070] In another illustrative embodiment, the channel can also be ascertained by
detecting blinking light on top of the emergency vehicles. After ascertaining the channel, the
processing unit 104 can dynamically enhance the duration of illumination of the green light 108
for unrestricted passage of the emergency vehicle. The quantum of increase in the duration of
illumination of the green light 108 can be determined based on the volume of traffic on the
corresponding channel and the distance of the emergency vehicle. The proposed system 100 can
minimize the travel time of emergency vehicles. Additionally, the proposed system 100 can
also ensure that emergency vehicles do not violate any traffic rules, hence eliminating
chaos on the channels/roads.
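The duration increase depends on the traffic volume on the channel and the vehicle's distance. A hypothetical model of that dependence, with all coefficients assumed rather than taken from the disclosure, might look like:

```python
def green_extension(distance_m, speed_mps, traffic_factor, base_s=5.0, cap_s=120.0):
    """Extra green-light time: estimated travel time to the intersection,
    inflated by a congestion factor, plus a safety margin, capped so other
    lanes are not starved (hypothetical model; all coefficients assumed)."""
    travel_s = distance_m / max(speed_mps, 1.0)
    return min(base_s + travel_s * (1.0 + traffic_factor), cap_s)

# Emergency vehicle 750 m away at 15 m/s on a moderately congested channel:
print(green_extension(750, 15.0, 0.3))  # 5 + 50 * 1.3 = 70.0 seconds
```

The cap reflects the system's stated goal of prioritizing the emergency vehicle without causing chaos on the remaining channels.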
[0071] Thus, it will be appreciated by those of ordinary skill in the art that the
diagrams, schematics, illustrations, and the like represent conceptual views or processes
illustrating systems and methods embodying this invention. The functions of the various
elements shown in the figures may be provided through the use of dedicated hardware as well
as hardware capable of executing associated software. Similarly, any switches shown in the
figures are conceptual only. Their function may be carried out through the operation of
program logic, through dedicated logic, through the interaction of program control and
dedicated logic, or even manually, the particular technique being selectable by the entity
implementing this invention. Those of ordinary skill in the art further understand that the
exemplary hardware, software, processes, methods, and/or operating systems described
herein are for illustrative purposes and, thus, are not intended to be limited to any particular
named elements.
[0072] While embodiments of the present invention have been illustrated and
described, it will be clear that the invention is not limited to these embodiments only.
Numerous modifications, changes, variations, substitutions, and equivalents will be apparent
to those skilled in the art, without departing from the spirit and scope of the invention, as
described in the claims.
[0073] In the foregoing description, numerous details are set forth. It will be apparent,
however, to one of ordinary skill in the art having the benefit of this disclosure, that the
present invention may be practiced without these specific details. In some instances, well-known structures and devices are shown in block diagram form, rather than in detail, to avoid
obscuring the present invention.
[0074] As used herein, and unless the context dictates otherwise, the term "coupled
to" is intended to include both direct coupling (in which two elements that are coupled to
each other contact each other) and indirect coupling (in which at least one additional element
is located between the two elements). Therefore, the terms "coupled to" and "coupled with"
are used synonymously. Within the context of this document terms "coupled to" and "coupled
with" are also used euphemistically to mean “communicatively coupled with” over a
network, where two or more devices are able to exchange data with each other over the
network, possibly via one or more intermediary devices.
[0075] It should be apparent to those skilled in the art that many more modifications
besides those already described are possible without departing from the inventive concepts
herein. The inventive subject matter, therefore, is not to be restricted except in the spirit of
the appended claims. Moreover, in interpreting both the specification and the claims, all
terms should be interpreted in the broadest possible manner consistent with the context. In
particular, the terms “comprises” and “comprising” should be interpreted as referring to
elements, components, or steps in a non-exclusive manner, indicating that the referenced
elements, components, or steps may be present, or utilized, or combined with other elements,
components, or steps that are not expressly referenced. Where the specification or claims refer
to at least one of something selected from the group consisting of A, B, C … N, the text
should be interpreted as requiring only one element from the group, not A plus N, or B plus
N, etc.
[0076] While the foregoing describes various embodiments of the invention, other
and further embodiments of the invention may be devised without departing from the basic
scope thereof. The scope of the invention is determined by the claims that follow. The
invention is not limited to the described embodiments, versions or examples, which are
included to enable a person having ordinary skill in the art to make and use the invention
when combined with information and knowledge available to the person having ordinary skill
in the art.
ADVANTAGES OF THE PRESENT DISCLOSURE
[0077] The present disclosure provides a system for efficient traffic monitoring.
[0078] The present disclosure provides a system for recognising emergency vehicles.
[0079] The present disclosure provides a system to facilitate passage of emergency
vehicles.
[0080] The present disclosure provides a system that consumes minimal electrical
power.
[0081] The present disclosure provides an improved, reliable, efficient, cost-effective, and accurate system.

We Claim:

1. A smart traffic monitoring system to facilitate passage of emergency vehicles, the
system comprises:
one or more first detectors configured at a region of interest (ROI) to detect a
set of acoustic signals;
one or more second detectors configured at the ROI to detect attributes
associated with one or more emergency vehicles;
a processing unit operatively coupled to the one or more first detectors and the
one or more second detectors, the processing unit comprising one or more processors
coupled with a memory, the memory storing instructions executable by the one or
more processors configured to:
receive the set of acoustic signals from at least one of the one or more
first detectors;
responsive to positive comparison between the received set of acoustic
signals and a first dataset comprising one or more pre-defined acoustic signals,
generate a set of actuation signals to facilitate actuation of at least one of the
one or more second detectors, wherein the actuation of the at least one of the
one or more second detectors enables detection of attributes associated with at
least one of the one or more emergency vehicles; and
generate a set of control signals based on the detected attributes to
facilitate passage of one or more emergency vehicles.
2. The system as claimed in claim 1, wherein the one or more first detectors comprise
any or a combination of microphone, mic, and acoustic sensor.
3. The system as claimed in claim 1, wherein the one or more second detectors comprise
any or a combination of camera, proximity sensor, velocity detector, and acceleration
detector.
4. The system as claimed in claim 1, wherein the attributes associated with the one or
more emergency vehicles comprise any or a combination of size, shape, length,
height, speed, and distance, from the one or more second detectors, of the one or more
emergency vehicles.
5. The system as claimed in claim 1, wherein the system comprises one or more beacons
configured at a pre-determined position at the ROI, and operatively coupled to the
processing unit, and wherein, based on the set of control signals, at least one of the
one or more beacons is illuminated.
6. The system as claimed in claim 5, wherein the processing unit is configured to
determine a time period based on the detected attributes associated with the at least
one of the one or more emergency vehicles.
7. The system as claimed in claim 5, wherein the at least one of the one or more beacons
is illuminated for the determined time period.
8. The system as claimed in claim 1, wherein in case of negative comparison between
the received set of acoustic signals and the first dataset, the one or more second
detectors remain in energy saving mode.
9. The system as claimed in claim 1, wherein the attributes associated with the one or
more emergency vehicles are stored in a second dataset, and correspondingly the
second dataset is updated.
10. The system as claimed in claim 9, wherein the processing unit is configured to
determine the time period based on the updation of the second dataset.
