
Priority Determining System For Elevator

Abstract: The present disclosure relates to a priority determining system for an elevator. The system 100 includes an image acquisition unit 102 to capture one or more images; a processing unit 106 operatively coupled with the image acquisition unit and configured to detect one or more entities from the captured one or more images, extract a set of features from each of the detected one or more entities, compare the extracted set of features with a data set, where the data set includes a predefined priority of each detected entity, and identify at least one entity among the one or more entities based on the predefined priority; and a control unit 104 operatively coupled with the processing unit 106, where the control unit 104 facilitates movement of the elevator in response to the priority-based selection of the at least one entity.


Patent Information

Application #
Filing Date
19 June 2020
Publication Number
52/2021
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Email
info@khuranaandkhurana.com
Parent Application

Applicants

Chitkara Innovation Incubator Foundation
SCO: 160-161, Sector - 9c, Madhya Marg, Chandigarh- 160009, India.

Inventors

1. KUMAR, Ashok
Chitkara University, Chandigarh-Patiala National Highway (NH-64), Village Jansla, Rajpura, Punjab - 140401, India.
2. AHUJA, Sachin
Chitkara University, Chandigarh-Patiala National Highway (NH-64), Village Jansla, Rajpura, Punjab - 140401, India.
3. GOYAL, Nitin
Chitkara University, Chandigarh-Patiala National Highway (NH-64), Village Jansla, Rajpura, Punjab - 140401, India.

Specification

[0001] The present disclosure relates generally to the field of mechatronics. More particularly,
the present disclosure provides a priority determining system for entities in an elevator.
BACKGROUND
[0002] Background description includes information that may be useful in understanding the
present invention. It is not an admission that any of the information provided herein is prior art or
relevant to the presently claimed invention, or that any publication specifically or implicitly
referenced is prior art.
[0003] Elevators installed in malls, hospitals, educational institutes and other organizations reduce a lot of human effort. However, elevators have certain working protocols and function accordingly. Generally, they have operational panels with numbered switches and other switches for voluntarily opening and closing the gate of the elevator as requested by humans. Sometimes a situation can arise when a human on a stretcher is waiting outside the elevator on a floor and has to be shifted urgently to some other floor for immediate treatment. However, the elevator arrives at that floor as per its working protocol. That is, if the elevator is currently going up to a level requested by a human and an emergency arises at another level, the elevator will first go to the requested level and only then come to the level with the emergency.
[0004] In such cases, there is a risk to human life. Also, in some situations, two humans standing at different levels and holding different designations request the elevator; according to the working protocols of a conventional elevator, the elevator goes to the level that requested it first. This wastes the time of the human with the higher designation and delays their work. In certain conditions, a human whose facial expression shows urgency for elevator dispatch is seen; however, the conventional elevator again follows its working protocols and is dispatched to the requested level irrespective of the needs and urgency of the humans. The difficulties and inconvenience caused by conventional elevators need to be resolved by providing elevators that handle the situations described above intelligently.
[0005] There is a need to overcome the above-mentioned problems by providing a solution that can identify an emergency situation smartly and automatically during elevator dispatching, accordingly operates for the humans in the emergency situation, and is easily managed and installed.
OBJECTS OF THE PRESENT DISCLOSURE
[0006] Some of the objects of the present disclosure, which at least one embodiment herein
satisfies are as listed herein below.
[0007] It is an object of the present disclosure to provide a system that helps in identifying an emergency and accordingly dispatching the elevator.
[0008] It is an object of the present disclosure to provide a system that aids in prioritizing elevator dispatch.
[0009] It is an object of the present disclosure to provide a system that is easily installed and maintained in educational institutes, hospitals, offices, and other similar buildings.
[0010] It is an object of the present disclosure to provide a system that facilitates detecting criminal activities and helps in keeping a safe environment.
[0011] It is an object of the present disclosure to provide a system that aids in saving people's time and lives during an emergency.
[0012] It is an object of the present disclosure to provide an efficient, innovative, user-friendly, economical, easily maintained system for emergency identification in elevators.
SUMMARY
[0013] The present disclosure relates generally to the field of mechatronics. More particularly, the present disclosure provides a priority determining system for entities in an elevator.
[0014] An aspect of the present disclosure pertains to a priority determining system for an elevator. The system may include an image acquisition unit to capture one or more images; a processing unit operatively coupled with the image acquisition unit, where the processing unit may include one or more processors coupled with a memory, the memory storing instructions executable by the one or more processors to detect one or more entities from the captured one or more images, extract a set of features from each of the detected one or more entities, compare the extracted set of features with a data set, where the data set may include a predefined priority of each detected entity, and identify at least one entity among the one or more entities based on the predefined priority; and a control unit operatively coupled with the processing unit, where the control unit facilitates movement of the elevator in response to the priority-based selection of the at least one entity.
[0015] In an aspect, the image acquisition unit may include any or a combination of camera,
Closed Circuit Television (CCTV) camera, and image sensor.
[0016] In an aspect, the image acquisition unit may be configured at a predetermined
position in the elevator, and where the predetermined position may include top middle position
of the elevator’s entry.
[0017] In an aspect, the processing unit may be configured to generate a set of alarm signals,
when the control unit enables the movement of the elevator at a level other than the requested
level.
[0018] In an aspect, the set of alarm signals may be provided using a siren, a speaker, and an illuminated light emitting diode (LED).
[0019] In an aspect, the set of features may pertain to any or a combination of emergency characteristics, facial expressions, facial recognition, body language, dress code, colour code, urgent movement, and designation associated with each of the detected one or more entities.
[0020] In an aspect, the predefined priority may be associated with the each of the detected
one or more entities.
[0021] In an aspect, the predefined priority may be configured to be updated based on the
priority of the each of the detected one or more entities.
[0022] In an aspect, the system may include a power source configured to supply electric power to the system.
[0023] In an aspect, the power source may include any or a combination of battery, inverters,
generators, power lines, and electric lines.
BRIEF DESCRIPTION OF THE DRAWINGS
[0024] The accompanying drawings are included to provide a further understanding of the
present disclosure, and are incorporated in and constitute a part of this specification. The
drawings illustrate exemplary embodiments of the present disclosure and, together with the
description, serve to explain the principles of the present disclosure.
[0025] The diagrams are for illustration only, which thus is not a limitation of the present
disclosure, and wherein:
[0026] FIG. 1 illustrates a block diagram of the proposed priority determining system for an elevator, in accordance with an embodiment of the present disclosure.
[0027] FIG. 2 illustrates exemplary functional components of the processing unit of the proposed priority determining system for an elevator, in accordance with an embodiment of the present disclosure.
[0028] FIG. 3 illustrates the overall components and working of the proposed priority determining system for an elevator, in accordance with an embodiment of the present disclosure.
DETAILED DESCRIPTION
[0029] In the following description, numerous specific details are set forth in order to
provide a thorough understanding of embodiments of the present invention. It will be apparent to
one skilled in the art that embodiments of the present invention may be practiced without some
of these specific details.
[0030] Embodiments of the present invention include various steps, which will be described
below. The steps may be performed by hardware components or may be embodied in machine-executable instructions, which may be used to cause a general-purpose or special-purpose
processor programmed with the instructions to perform the steps. Alternatively, steps may be
performed by a combination of hardware, software, firmware and/or by human operators.
[0031] Embodiments of the present invention may be provided as a computer program
product, which may include a machine-readable storage medium tangibly embodying thereon
instructions, which may be used to program a computer (or other electronic devices) to perform a
process. The machine-readable medium may include, but is not limited to, fixed (hard) drives,
magnetic tape, floppy diskettes, optical disks, compact disc read-only memories (CD-ROMs),
and magneto-optical disks, semiconductor memories such as read-only memories (ROMs), random access memories (RAMs), programmable read-only memories (PROMs), erasable PROMs (EPROMs), electrically erasable PROMs (EEPROMs), flash memory, magnetic or optical cards, or other
type of media/machine-readable medium suitable for storing electronic instructions (e.g.,
computer programming code, such as software or firmware).
[0032] Various methods described herein may be practiced by combining one or more
machine-readable storage media containing the code according to the present invention with
appropriate standard computer hardware to execute the code contained therein. An apparatus for
practicing various embodiments of the present invention may involve one or more computers (or
one or more processors within a single computer) and storage systems containing or having
network access to computer program(s) coded in accordance with various methods described
herein, and the method steps of the invention could be accomplished by modules, routines,
subroutines, or subparts of a computer program product.
[0033] If the specification states a component or feature “may”, “can”, “could”, or “might”
be included or have a characteristic, that particular component or feature is not required to be
included or have the characteristic.
[0034] As used in the description herein and throughout the claims that follow, the meaning
of “a,” “an,” and “the” includes plural reference unless the context clearly dictates otherwise.
Also, as used in the description herein, the meaning of “in” includes “in” and “on” unless the
context clearly dictates otherwise.
[0035] Exemplary embodiments will now be described more fully hereinafter with reference
to the accompanying drawings, in which exemplary embodiments are shown. This invention
may, however, be embodied in many different forms and should not be construed as limited to
the embodiments set forth herein. These embodiments are provided so that this invention will be
thorough and complete and will fully convey the scope of the invention to those of ordinary skill
in the art. Moreover, all statements herein reciting embodiments of the invention, as well as
specific examples thereof, are intended to encompass both structural and functional equivalents
thereof. Additionally, it is intended that such equivalents include both currently known
equivalents as well as equivalents developed in the future (i.e., any elements developed that
perform the same function, regardless of structure).
[0036] While embodiments of the present invention have been illustrated and described, it
will be clear that the invention is not limited to these embodiments only. Numerous
modifications, changes, variations, substitutions, and equivalents will be apparent to those skilled
in the art, without departing from the spirit and scope of the invention, as described in the claim.
[0037] The present disclosure relates generally to the field of mechatronics. More particularly, the present disclosure provides a priority determining system for entities in an elevator.
[0038] According to an aspect, the present disclosure pertains to a priority determining system for an elevator. The system can include an image acquisition unit to capture one or more images; a processing unit operatively coupled with the image acquisition unit, where the processing unit can include one or more processors coupled with a memory, the memory storing instructions executable by the one or more processors to detect one or more entities from the captured one or more images, extract a set of features from each of the detected one or more entities, compare the extracted set of features with a data set, where the data set can include a predefined priority of each detected entity, and identify at least one entity among the one or more entities based on the predefined priority; and a control unit operatively coupled with the processing unit, where the control unit facilitates movement of the elevator in response to the priority-based selection of the at least one entity.
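The detect, extract, compare and identify steps recited above can be pictured in a few lines of code. This is a minimal illustrative sketch only: the feature labels, priority ranks, and the stubbed detection step are assumptions, since the disclosure does not fix a concrete data set.

```python
from dataclasses import dataclass
from typing import List, Optional

# Hypothetical predefined-priority data set; labels and ranks are
# illustrative assumptions, not values from the disclosure.
PRIORITY_DATA_SET = {
    "medical_emergency": 3,
    "high_designation": 2,
    "regular_passenger": 1,
}

@dataclass
class Entity:
    level: int           # floor where the entity was detected
    features: List[str]  # feature labels extracted from the image

def entity_priority(entity: Entity) -> int:
    """Comparison step: look each extracted feature up in the data set
    and keep the highest predefined priority found."""
    return max((PRIORITY_DATA_SET.get(f, 0) for f in entity.features), default=0)

def identify_entity(entities: List[Entity]) -> Optional[Entity]:
    """Identification step: select the entity with the highest priority."""
    return max(entities, key=entity_priority, default=None)

# Detection is stubbed: assume the captured images already yielded these.
detected = [
    Entity(level=2, features=["regular_passenger"]),
    Entity(level=5, features=["medical_emergency"]),
    Entity(level=4, features=["high_designation"]),
]
selected = identify_entity(detected)
print(selected.level)  # 5: the stretcher case outranks the other requests
```

The control unit would then move the elevator to the selected entity's level first.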
[0039] In an embodiment, the image acquisition unit may include any or a combination of
camera, Closed Circuit Television (CCTV) camera, and image sensor.
[0040] In an embodiment, the image acquisition unit can be configured at a predetermined
position in the elevator, and where the predetermined position can include top middle position of
the elevator’s entry.
[0041] In an embodiment, the processing unit can be configured to generate a set of alarm
signals, when the control unit enables the movement of the elevator at a level other than the
requested level.
[0042] In an embodiment, the set of alarm signals can be provided using a siren, a speaker, and an illuminated light emitting diode (LED).
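The alarm behaviour of paragraphs [0041] and [0042] can be sketched as follows. The signal names are placeholder labels standing in for real siren, speaker, and LED drivers, which the disclosure does not specify.

```python
def alarm_signals(requested_level: int, dispatched_level: int) -> list:
    """When the control unit diverts the elevator to a level other than
    the requested one, raise the set of alarm signals; otherwise none."""
    if dispatched_level != requested_level:
        return ["siren", "speaker", "led"]
    return []

print(alarm_signals(3, 5))  # ['siren', 'speaker', 'led']
print(alarm_signals(3, 3))  # []
```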
[0043] In an embodiment, the set of features can pertain to any or a combination of emergency characteristics, facial expressions, facial recognition, body language, dress code, colour code, urgent movement, and designation associated with each of the detected one or more entities.
[0044] In an embodiment, the predefined priority can be associated with the each of the
detected one or more entities.
[0045] In an embodiment, the predefined priority can be configured to be updated based on
the priority of the each of the detected one or more entities.
[0046] In an embodiment, the system can include a power source configured to supply electric power to the system.
[0047] In an embodiment, the power source can include any or a combination of battery,
inverters, generators, power lines, and electric lines.
[0048] FIG. 1 illustrates a block diagram of the proposed priority determining system for an elevator, in accordance with an embodiment of the present disclosure.
[0049] As illustrated in FIG. 1, the proposed system 100 (also referred to as system 100,
herein) can include an image acquisition unit 102, a processing unit 106, and a control unit 104.
The image acquisition unit 102 can be operatively coupled with the processing unit 106. The
control unit 104 can be operatively coupled with the processing unit 106. The system can be
configured to determine priority of one or more entities (also referred individually as an entity,
and collectively as entities) for an elevator. The system can include machine algorithms to
determine the priority of the entity for elevator dispatch.
[0050] In an embodiment, the image acquisition unit 102 can be configured to capture one or
more images (also referred collectively as images, and individually as an image) of the entities.
In an illustrative embodiment, the image acquisition unit 102 can include any or a combination
of camera, Closed Circuit Television (CCTV) camera, and image sensor. The image acquisition
unit 102 such as camera can capture the images of the entities as the entities are found in the
camera vicinity. In another illustrative embodiment, the image acquisition unit 102 can be
configured at a predetermined position. The predetermined position can include middle top of the
elevator’s gate at each level. The image acquisition unit 102 can capture the images of the
entities at each level of the elevator.
[0051] In an illustrative embodiment, the image acquisition unit 102 such as camera can be a
digital camera. The digital camera can include a lens to focus the light falling on the image and transmit the light to an image sensor, wherein the image sensor can be configured inside the camera. The image sensor can receive the light and convert it into a set of electrical charges. The image sensor can be a microchip with arrays of sensors, where the sensors can be configured to convert the light into a set of electrical signals. The set of electrical signals can be transmitted to the processing unit 106. The image quality of the digital camera can depend on the resolution of the
camera. The image captured by the digital camera can be in binary form, and the processing unit
106 can be configured to receive the image in binary form.
[0052] In an illustrative embodiment, the image acquisition unit 102 can be a CCTV or
surveillance camera. The CCTV camera can include a lens, a camera, a video recorder, a display unit (optional, such as a monitor), cables, and a storage unit. The lens can be configured inside the camera and configured to focus the light falling on the image. The CCTV camera can be an analogue camera or a digital camera. The analogue CCTV camera can be configured to capture the images and correspondingly generate a set of continuous video signals. The video recorder, such as a Digital Video Recorder (DVR), can facilitate digitizing the set of continuous video signals at the camera. The video recorder can also be a Network Video Recorder (NVR). The CCTV
camera can also facilitate in transmitting the images to the display unit such as monitor through
wireless transmission or through the cables. The storage unit can be hard disk configured to store
video recordings of the images. The digitized set of continuous video signals can be transmitted
to the processing unit 106.
[0053] In an embodiment, the processing unit 106 can be configured to receive the image
from the image acquisition unit 102 in binary form. The binary form can include 0’s and 1’s. The
processing unit 106 can decode the binary form and transmit the image in binary form to sub
units. The sub units can include detection unit 212, extraction unit 214, comparison unit 216,
identification unit 218, and other unit(s) 220. The processing unit 106 can facilitate detecting one or more entities from the images with the help of the detection unit 212. The processing unit 106 can be configured with machine learning algorithms to facilitate determining priority for elevator dispatch.
[0054] In an illustrative embodiment, the processing unit 106 can be a microprocessor, a microcontroller, an Arduino Uno, an ATmega328, a Raspberry Pi, or another similar processing unit. The processing unit 106 can receive the set of electrical signals from the image sensor. In another illustrative embodiment, the processing unit 106 can be configured to receive the digitized set of continuous video signals from the CCTV camera. Further, the processing unit 106 can be configured to extract a set of features from each of the detected one or more entities. The processing unit 106 can facilitate comparing the extracted set of features with a data set, where the data set can include a predefined priority of each detected entity, and can then identify at least one entity among the one or more entities based on the predefined priority with the help of the sub-units. The set of features can pertain to emergency characteristics, facial expressions, facial recognition, body language, dress code, colour code, urgent movement, and designation associated with each of the detected one or more entities.
[0055] In an embodiment, the control unit 104 can be configured to facilitate movement of the elevator in response to the priority-based selection of the at least one entity. The entities can be humans, whose images are captured by the image acquisition unit 102. The predefined priority can be associated with each of the detected one or more entities, and the predefined priority can be configured to be updated according to the priority of each of the detected one or more entities. In an illustrative embodiment, the control unit can be configured to control the electric motor, pulley system, cables, cars, and counterweights with the help of an elevator algorithm. The control unit 104 can be an electronic control system which can facilitate directing the cars to the correct floors or levels using the elevator algorithm. The cars can be balanced by the counterweights: when the elevator goes up, the counterweight goes down, and vice versa. The electric motor can facilitate moving the elevator according to the priority of the entities.
[0056] In an illustrative embodiment, the control unit can receive the machine algorithms from the processing unit 106 and accordingly facilitate moving the elevator according to the priority. The machine algorithms can be fed into the processing unit 106, such as a Raspberry Pi. The control unit 104 can enable movement of the elevator in accordance with the machine algorithms stored in the processing unit 106. The priority can be stored in a database and can be identified by the processing unit 106 to facilitate moving the elevator with the help of the control unit 104.
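A minimal sketch of the priority-based dispatch just described, assuming the priority database can be modelled as a plain floor-to-priority mapping (the mapping values below are illustrative, not from the disclosure):

```python
def dispatch_order(requests, priority_db):
    """Serve pending floor requests highest predefined priority first.
    Python's sort is stable, so equal-priority requests keep their
    arrival order (first come, first served)."""
    return sorted(requests, key=lambda floor: -priority_db.get(floor, 0))

priority_db = {5: 3, 4: 2, 2: 1}  # floor -> predefined priority (assumed)
order = dispatch_order([2, 5, 4], priority_db)
print(order)  # [5, 4, 2]: the emergency floor is served first
```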
[0057] FIG. 2 illustrates exemplary functional components of the processing unit of the proposed priority determining system for an elevator, in accordance with an embodiment of the present disclosure.
[0058] As illustrated in an embodiment, the processing unit 106 can include one or more
processor(s) 202. The one or more processor(s) 202 can be implemented as one or more
microprocessors, microcomputers, microcontrollers, digital signal processors, central processing
units, logic circuitries, and/or any devices that manipulate data based on operational instructions.
Among other capabilities, the one or more processor(s) 202 are configured to fetch and execute
computer-readable instructions stored in a memory 204 of the processing unit 106. The memory
204 can store one or more computer-readable instructions or routines, which may be fetched and
executed to create or share the data units over a network service. The memory 204 can include
any non-transitory storage device including, for example, volatile memory such as RAM, or non-volatile memory such as EPROM, flash memory, and the like.
[0059] In an embodiment, the processing unit 106 can also include an interface(s) 206. The
interface(s) 206 may include a variety of interfaces, for example, interfaces for data input and
output devices, referred to as I/O devices, storage devices, and the like. The interface(s) 206 may
facilitate communication of the processing unit 106 with various devices coupled to the
processing unit 106. The interface(s) 206 may also provide a communication pathway for one or
more components of processing unit 106. Examples of such components include, but are not
limited to, processing engine(s) 208 and data 210.
[0060] In an embodiment, the processing engine(s) 208 can be implemented as a
combination of hardware and programming (for example, programmable instructions) to
implement one or more functionalities of the processing engine(s) 208. In examples described
herein, such combinations of hardware and programming may be implemented in several
different ways. For example, the programming for the processing engine(s) 208 may be
processor executable instructions stored on a non-transitory machine-readable storage medium
and the hardware for the processing engine(s) 208 may include a processing resource (for
example, one or more processors), to execute such instructions. In the present examples, the
machine-readable storage medium may store instructions that, when executed by the processing
resource, implement the processing engine(s) 208. In such examples, the processing unit 106 can
include the machine-readable storage medium storing the instructions and the processing
resource to execute the instructions, or the machine-readable storage medium may be separate
but accessible to processing unit 106 and the processing resource. In other examples, the
processing engine(s) 208 may be implemented by electronic circuitry. A database 210 can
include data that is either stored or generated as a result of functionalities implemented by any of
the components of the processing engine(s) 208.
[0061] In an embodiment, the processing engine(s) 208 can include a detection unit 212,an
extraction unit 214, a comparison unit 216, an identification unit 218, and other unit(s) 220. The other unit(s) 220 can implement functionalities that supplement applications or functions
performed by the system 100 or the processing engine(s) 208.
[0062] The database 210 can include data that is either stored or generated as a result of
functionalities implemented by any of the components of the processing engine(s) 208.
[0063] It would be appreciated that units being described are only exemplary units and any
other unit or sub-unit may be included as part of the system 100. These units too may be merged
or divided into super-units or sub-units as may be configured.
[0064] As illustrated in FIG. 2, the processing unit can facilitate detecting entities from an array of images, extracting a set of features from each of the detected entities, comparing the extracted set of features with a data set, where the data set can include a predefined priority of each detected entity, and identifying at least one entity among the one or more entities based on the predefined priority. The processing unit 106 can store machine algorithms, and according to the stored machine algorithms, the processing unit 106 can facilitate transmitting the priority-based selection in machine readable form to the control unit 104 of the elevator.
[0065] In an embodiment, the detection unit 212 can facilitate in detecting the entities from
the array of images. The array of images can be captured by the image acquisition unit 102. In an
illustrative embodiment, the image acquisition unit 102 can include lens to focus the light on the
image to be captured and an image sensor can facilitate in converting the light into electrical
signals. The set of electrical signals can be transmitted to the processing unit 106 with the help of the detection unit 212. The detection unit 212 can receive the set of electrical signals and detect the
entities from the array of images. In another illustrative embodiment, the detection unit 212 can
receive digitized set of continuous signals from the image acquisition unit 102, and can detect the
entities from the array of images.
[0066] In an embodiment, the detection unit 212 can enable detection of the entities, where
the entities can be humans. In an illustrative embodiment, the image acquisition unit 102 can be
configured at a predetermined position in the elevator. The predetermined position can include
top middle portion of the elevator's entry. In another illustrative embodiment, the image acquisition unit 102 can be configured to capture the images and can transmit the images to the detection unit 212 in machine readable form with the help of the processing unit 106, such as a Raspberry Pi. The detection unit 212 can be configured to detect the entities from the array of images as the entities are found in the vicinity of the image acquisition unit 102.
[0067] In an embodiment, the extraction unit 214 can facilitate extracting a set of features from each of the detected entities. The set of features can pertain to any or a combination of emergency characteristics, facial expressions, facial recognition, body language, dress code, colour code, urgent movement, and designation associated with each of the detected one or more entities. In an illustrative embodiment, the extraction unit 214 can enable extracting the set of features from each of the detected entities. The extraction unit 214 can extract the set of features from the detected entities in machine readable form. The set of features can be fed into the processing unit 106 or the extraction unit 214 in the form of machine algorithms. The machine algorithms for the set of features can be transmitted to the comparison unit 216.
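One way to picture the extraction step: raw attributes observed on a detected entity are mapped to the feature labels listed above. The attribute names and the mapping itself are illustrative assumptions, not part of the disclosure.

```python
# Hypothetical attribute-to-feature mapping (illustrative only).
FEATURE_MAP = {
    "on_stretcher": "emergency characteristics",
    "crying": "facial expressions",
    "uniform_colour": "colour code",
    "id_badge": "designation",
}

def extract_features(attributes):
    """Keep only the attributes that map to a known feature label;
    unrecognized attributes are ignored."""
    return [FEATURE_MAP[a] for a in attributes if a in FEATURE_MAP]

print(extract_features(["on_stretcher", "backpack"]))
# ['emergency characteristics']
```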
[0068] In an illustrative embodiment, the extraction unit 214 can facilitate extracting the set of features of each of the entities detected by the detection unit 212. In another illustrative embodiment, as the image acquisition unit 102 captures the array of images, the detection unit 212 can detect the entities, and then the extraction unit 214 can extract the set of features, such as an emergency situation related to health or a crying face of the entity, from the machine algorithms in binary form. The extraction unit 214 can extract the set of features of the detected entities and can transmit the set of features to the comparison unit 216 for determining the priority of at least one of the detected entities. In yet another illustrative embodiment, according to the priority predefined in the data set, the extracted set of features from the extraction unit 214 can be compared by the comparison unit 216.
[0069] In an illustrative embodiment, the comparison unit 216 can facilitate comparing the extracted set of features with a data set, where the data set can pertain to a predefined priority of each detected entity. The comparison unit 216 can receive the extracted set of features from the extraction unit 214 and can compare them with the data set stored in the database 210. The predefined priority of each detected entity can include threshold characteristics pertaining to the set of features of the detected entities. The threshold characteristics for the predefined priority can include a medical emergency, an entity with a high designation, a sobbing facial expression, and the like. The comparison unit 216 can compare the extracted set of features and can facilitate finding whether the extracted set of features has reached the predefined priority of each detected entity.
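The threshold comparison just described can be sketched as a lookup against the threshold characteristics. The labels and ranks below are assumptions, not values fixed by the disclosure.

```python
# Hypothetical threshold characteristics and their priority ranks.
THRESHOLD_CHARACTERISTICS = {
    "medical emergency": 3,
    "high designation": 2,
    "sobbing facial expression": 1,
}

def compare(extracted_features):
    """Return (matched_characteristic, priority) for the strongest match,
    or (None, 0) when no threshold characteristic is reached."""
    best = (None, 0)
    for feature in extracted_features:
        rank = THRESHOLD_CHARACTERISTICS.get(feature, 0)
        if rank > best[1]:
            best = (feature, rank)
    return best

print(compare(["sobbing facial expression", "medical emergency"]))
# ('medical emergency', 3)
```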
[0070] In an illustrative embodiment, the comparison unit 216 can receive the extracted set
of features in machine-readable form. The comparison unit 216 can facilitate comparing the
received set of features in machine-readable form with the help of a comparator. The comparator
can enable comparing the extracted set of features with the predefined priority of each of the
detected entities. The comparator can include an analogue comparator or a digital comparator.
The digital comparator can compare the extracted set of features with the predefined priority of
each of the detected entities, and can facilitate the comparison with the help of logic gates such
as AND, NOT, or NOR gates. The digital comparator can be configured to accept the extracted
set of features in machine-readable form, and can transmit the compared set of features to the
identification unit 218.
[0071] In an embodiment, the identification unit 218 can facilitate identifying at least one
entity among the entities based on the predefined priority. The identification unit 218 can be
configured to receive the compared set of features from the comparison unit 216. In another
illustrative embodiment, the identification unit 218 can identify the at least one entity among the
entities based on the predefined priority according to the machine algorithms. The machine
algorithms can be set in the processing unit 106 or the identification unit 218 to facilitate
identifying the at least one entity among the entities. In yet another illustrative embodiment, after
the identification unit 218 identifies the at least one entity from among the entities on the basis
of the predefined priority, the identification unit can facilitate transmitting the identified entity in
machine-readable form to a control unit 104. Further, the control unit can enable movement of
the elevator according to the priority-based selection of the at least one entity.
[0072] In an illustrative embodiment, after receiving the compared set of features from the
comparison unit 216, the identification unit 218 can facilitate identifying the at least one entity
among the entities on the basis of the predefined priority, and can help the control unit 104
enable movement of the elevator. In another illustrative embodiment, the set of features, such as
a medical emergency or facial expressions associated with the detected entities, can be extracted
by the extraction unit 214, compared with the predefined priority, and then identified by the
identification unit 218. In yet another illustrative embodiment, when an image pertaining to a
medical emergency associated with a detected entity is detected, the identification unit 218 can
facilitate identifying the entity with the medical emergency among the entities according to the
predefined priority. The identification unit 218 can then facilitate transmitting the identified
entity with the medical emergency to the control unit 104 in machine-readable form.
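The identification step in paragraphs [0071] and [0072] amounts to selecting the detected entity whose compared features carry the highest predefined priority. The sketch below is illustrative only; the `identify_priority_entity` name and the pair-based representation of entities are assumptions, not part of the application.

```python
def identify_priority_entity(entities):
    """Select the highest-priority entity from a list of
    (entity_id, priority_score) pairs, or None if the list is empty."""
    if not entities:
        return None
    # max() plays the role of the identification unit 218: it picks
    # the entity whose compared score is greatest.
    return max(entities, key=lambda pair: pair[1])[0]

detected = [("entity_a", 1), ("entity_b", 3), ("entity_c", 2)]
chosen = identify_priority_entity(detected)
# chosen == "entity_b" (the medical-emergency score of 3 outranks the rest)
```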
[0073] In an illustrative embodiment, the identification unit 218 can transmit the identified at
least one entity from among the entities to the control unit in machine-readable form, where the
control unit 104 can be configured to receive the identified at least one entity and convert the
selection into motion to facilitate movement of the elevator. The control unit 104 can be
configured to facilitate the movement of the elevator with the help of an electric motor, a pulley,
and an elevator algorithm.
[0074] FIG. 3 illustrates the overall components and working of the proposed priority
determining system for elevator, in accordance with an embodiment of the present disclosure.
[0075] As illustrated in FIG. 3, the system 100 can include an image acquisition unit 102,
such as 102-1, which can be positioned at a first predetermined position. The first predetermined
position can include the top middle position of the elevator's entry at a first level 302-1 of the
elevator. The system can include an image acquisition unit 102-2, which can be positioned at a
second predetermined position. The second predetermined position can include the top middle
position of the elevator's entry at a second level 302-2 of the elevator. The system can include an
image acquisition unit 102-3, which can be positioned at a third predetermined position. The
third predetermined position can include the top middle position of the elevator's entry at a third
level 302-3 of the elevator.
[0076] In an illustrative embodiment, the system 100 can include a left side and a right side
associated with the first level 302-1, the second level 302-2, and the third level 302-3 of the
elevator. The left side can include the first level 302-1 left side 304-1, the second level 302-2 left
side 304-2, and the third level 302-3 left side 304-3. The right side can include the first level
302-1 right side 306-1, the second level 302-2 right side 306-2, and the third level 302-3 right
side 306-3. In another illustrative embodiment, the system 100 can include a processing unit 106,
such as a Raspberry Pi positioned inside the elevator, which can be configured as a central
controller. The image acquisition units 102-1, 102-2, and 102-3 can be configured to capture the
array of images, where the array of images can be transferred to the central controller. The
central controller can include machine learning algorithms that facilitate identifying a situation
of emergency.
[0077] In an illustrative embodiment, the image acquisition units 102-1, 102-2, and 102-3,
configured at the first level 302-1, the second level 302-2, and the third level 302-3, can facilitate
capturing the array of images from the respective levels. In another illustrative embodiment, an
entity with a medical emergency can wait at the first level 302-1 for the elevator dispatch, and
another entity without a medical emergency can wait at the second level 302-2. The image
acquisition unit 102-1 positioned at the first level 302-1 can be configured to capture the image
pertaining to the medical emergency, and the image acquisition unit 102-2 positioned at the
second level 302-2 can be configured to capture the image pertaining to no medical emergency.
The image acquisition units 102-1 and 102-2 can transfer, to the Raspberry Pi, the image
pertaining to the medical emergency associated with the entity, where the image pertaining to the
medical emergency can correspond to a predefined priority stored in a database 210 of the
system 100, along with the image pertaining to no medical emergency. The Raspberry Pi can
facilitate identifying the image pertaining to the medical emergency and can enable the control
unit 104 to dispatch the elevator to the first level 302-1 with the medical emergency.
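The dispatch scenario in paragraph [0077] can be condensed to a small decision rule on the central controller: each level reports one classified situation, and the elevator is dispatched to the level whose situation carries the highest priority. This is a hedged sketch under assumed names; the level labels, the `LEVEL_PRIORITY` table, and `choose_dispatch_level` are illustrative, not taken from the application.

```python
# Hypothetical priority ranking of per-level classifications.
LEVEL_PRIORITY = {"medical_emergency": 3, "no_emergency": 0}

def choose_dispatch_level(level_reports):
    """level_reports: dict mapping level label -> classified situation.
    Return the level with the highest-priority situation."""
    return max(level_reports,
               key=lambda lvl: LEVEL_PRIORITY.get(level_reports[lvl], 0))

# The scenario from [0077]: emergency on level 302-1, none on 302-2.
reports = {"302-1": "medical_emergency", "302-2": "no_emergency"}
target = choose_dispatch_level(reports)
# target == "302-1": the elevator is dispatched to the emergency level first.
```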
[0078] In an illustrative embodiment, the predefined priority can be configured to be updated
based on the priority of each of the detected one or more entities. In another illustrative
embodiment, the Raspberry Pi can be configured to generate a set of alarm signals when the
control unit 104 enables the movement of the elevator to a level other than the requested level.
The set of alarm signals can be provided using a siren, a speaker, and an illuminated light
emitting diode (LED).
[0079] In an illustrative embodiment, the device can include a power source configured to
supply electric power to the device. In another illustrative embodiment, the power source can
include any or a combination of a battery, inverters, generators, power lines, and electric lines.
[0080] Thus, it will be appreciated by those of ordinary skill in the art that the diagrams,
schematics, illustrations, and the like represent conceptual views or processes illustrating
systems and methods embodying this invention. The functions of the various elements shown in
the figures may be provided through the use of dedicated hardware as well as hardware capable
of executing associated software. Similarly, any switches shown in the figures are conceptual
only. Their function may be carried out through the operation of program logic, through
dedicated logic, through the interaction of program control and dedicated logic, or even
manually, the particular technique being selectable by the entity implementing this invention.
Those of ordinary skill in the art further understand that the exemplary hardware, software,
processes, methods, and/or operating systems described herein are for illustrative purposes and,
thus, are not intended to be limited to any particular named.
[0081] While embodiments of the present invention have been illustrated and described, it
will be clear that the invention is not limited to these embodiments only. Numerous
modifications, changes, variations, substitutions, and equivalents will be apparent to those skilled
in the art, without departing from the spirit and scope of the invention, as described in the claims.
[0082] In the foregoing description, numerous details are set forth. It will be apparent,
however, to one of ordinary skill in the art having the benefit of this disclosure, that the present
invention may be practiced without these specific details. In some instances, well-known
structures and devices are shown in block diagram form, rather than in detail, to avoid obscuring
the present invention.
[0083] As used herein, and unless the context dictates otherwise, the term "coupled to" is
intended to include both direct coupling (in which two elements that are coupled to each other
contact each other) and indirect coupling (in which at least one additional element is located
between the two elements). Therefore, the terms "coupled to" and "coupled with" are used
synonymously. Within the context of this document terms "coupled to" and "coupled with" are
also used euphemistically to mean “communicatively coupled with” over a network, where two
or more devices are able to exchange data with each other over the network, possibly via one or
more intermediary device.
[0084] It should be apparent to those skilled in the art that many more modifications besides
those already described are possible without departing from the inventive concepts herein. The
inventive subject matter, therefore, is not to be restricted except in the spirit of the appended
claims. Moreover, in interpreting both the specification and the claims, all terms should be
interpreted in the broadest possible manner consistent with the context. In particular, the terms
“comprises” and “comprising” should be interpreted as referring to elements, components, or
steps in a non-exclusive manner, indicating that the referenced elements, components, or steps
may be present, or utilized, or combined with other elements, components, or steps that are not
expressly referenced.
[0085] While the foregoing describes various embodiments of the invention, other and
further embodiments of the invention may be devised without departing from the basic scope
thereof. The scope of the invention is determined by the claims that follow. The invention is not
limited to the described embodiments, versions or examples, which are included to enable a
person having ordinary skill in the art to make and use the invention when combined with
information and knowledge available to the person having ordinary skill in the art.
ADVANTAGES OF THE PRESENT DISCLOSURE
[0086] The present disclosure provides a system that helps in identifying an emergency and
dispatching the elevator accordingly.
[0087] The present disclosure provides a system that aids in prioritizing the elevator
dispatch.
[0088] The present disclosure provides a system that is easily installed and maintained in
educational institutes, hospitals, offices, and other similar buildings.
[0089] The present disclosure provides a system that facilitates detecting criminal activities
and helps in keeping a safe environment.
[0090] The present disclosure provides a system that aids in saving time and the lives of
people during an emergency.
[0091] The present disclosure provides an efficient, innovative, user-friendly, economical,
and easily maintained system for emergency identification in an elevator.

We Claim:

1. A priority determining system for elevator, said system comprises of:
an image acquisition unit to capture one or more images;
a processing unit operatively coupled with the image acquisition unit, and wherein
the processing unit comprises of one or more processors coupled with a memory, the
memory storing instructions executable by the one or more processors and configured to:
detect one or more entities from the captured one or more images;
extract a set of features from each of the detected one or more entities;
compare the extracted set of features with a data set, wherein the data set
comprises predefined priority of the each detected one or more entities;
identify at least one entity among the one or more entities based on
the predefined priority; and
a control unit operatively coupled with the processing unit, wherein the control
unit facilitates in movement of the elevator in response to the priority based selection of
the at least one or more entities.
2. The system as claimed in claim 1, wherein the image acquisition unit comprises any or a
combination of camera, Closed Circuit Television (CCTV) camera, and image sensor.
3. The system as claimed in claim 2, wherein the image acquisition unit is configured at a
predetermined position in the elevator, and wherein the predetermined position comprises
top middle position of the elevator’s entry.
4. The system as claimed in claim 1, wherein the processing unit is configured to generate a
set of alarm signals, when the control unit enables the movement of the elevator at a level
other than the requested level.
5. The system as claimed in claim 4, wherein the set of alarm signals is provided using a
siren, speaker, and illuminated light emitting diode (led).
6. The system as claimed in claim 1, wherein the set of features pertains to any or a
combination of emergency characteristics, facial expressions, facial recognition, body
language, dress code, color code, urgent movement, and designation associated with the
each of the detected one or more entities.
7. The system as claimed in claim 1, wherein the predefined priority is associated with the
each of the detected one or more entities.
8. The system as claimed in claim 4, wherein the predefined priority is configured to be
updated based on the priority of the each of the detected one or more entities.
9. The system as claimed in claim 1, wherein the system comprises a power source
configured to supply electric power to the system.
10. The system as claimed in claim 9, wherein the power source comprises any or a
combination of battery, inverters, generators, power lines, and electric lines.

Documents

Application Documents

# Name Date
1 202011025947-COMPLETE SPECIFICATION [19-06-2020(online)].pdf 2020-06-19
2 202011025947-DRAWINGS [19-06-2020(online)].pdf 2020-06-19
3 202011025947-DECLARATION OF INVENTORSHIP (FORM 5) [19-06-2020(online)].pdf 2020-06-19
4 202011025947-FORM 1 [19-06-2020(online)].pdf 2020-06-19
5 202011025947-STATEMENT OF UNDERTAKING (FORM 3) [19-06-2020(online)].pdf 2020-06-19
6 202011025947-FORM FOR STARTUP [19-06-2020(online)].pdf 2020-06-19
7 202011025947-FORM FOR SMALL ENTITY(FORM-28) [19-06-2020(online)].pdf 2020-06-19
8 202011025947-EVIDENCE FOR REGISTRATION UNDER SSI [19-06-2020(online)].pdf 2020-06-19
9 202011025947-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [19-06-2020(online)].pdf 2020-06-19
10 202011025947-FORM-26 [21-07-2020(online)].pdf 2020-07-21
11 202011025947-Proof of Right [21-07-2020(online)].pdf 2020-07-21
12 202011025947-FORM 18 [07-02-2022(online)].pdf 2022-02-07
13 202011025947-FER.pdf 2022-04-29
14 202011025947-CLAIMS [04-10-2022(online)].pdf 2022-10-04
15 202011025947-CORRESPONDENCE [04-10-2022(online)].pdf 2022-10-04
16 202011025947-FER_SER_REPLY [04-10-2022(online)].pdf 2022-10-04
17 202011025947-FORM-26 [04-10-2022(online)].pdf 2022-10-04
18 202011025947-US(14)-HearingNotice-(HearingDate-24-12-2024).pdf 2024-11-26
19 202011025947-Correspondence to notify the Controller [20-12-2024(online)].pdf 2024-12-20
20 202011025947-FORM-26 [20-12-2024(online)].pdf 2024-12-20
21 202011025947-Correspondence to notify the Controller [06-01-2025(online)].pdf 2025-01-06
22 202011025947-US(14)-ExtendedHearingNotice-(HearingDate-09-01-2025)-1100.pdf 2025-01-06
23 202011025947-Annexure [24-01-2025(online)].pdf 2025-01-24
24 202011025947-Written submissions and relevant documents [24-01-2025(online)].pdf 2025-01-24

Search Strategy

1 SearchStrategyMatrixE_29-04-2022.pdf