
System And Method To Assess And Accredit Response To Augmented Reality (AR)

Abstract: The present disclosure provides a system 100 and a method for assessing and accrediting a response to Augmented Reality (AR). The proposed system 100 comprises markers 120, a scanning unit 110, an AR unit 102, an eye-ware 106 and an input unit 108. At least one of the markers 120 is scanned through the scanning unit 110. The AR unit 102 emulates an AR view associated with the scanned markers 120. The eye-ware 106 is configured to display the emulated AR view. A response, corresponding to the emulated AR view, can be generated through the input unit 108. A weighted combination of one or more response parameters associated with the generated response is calculated. An accreditation report is generated when the weighted combination exceeds a pre-determined threshold, and a warning report is generated when the weighted combination is less than the pre-determined threshold.


Patent Information

Application #
Filing Date
11 March 2020
Publication Number
17/2022
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Email
info@khuranaandkhurana.com
Parent Application

Applicants

Chitkara Innovation Incubator Foundation
SCO: 160-161, Sector -9c, Madhya Marg, Chandigarh- 160009, India.

Inventors

1. CHANDER PARTAP SINGH
Chitkara University, Chandigarh Patiala National Highway (NH-64), Village, Jansla, Rajpura, Punjab- 140401, India.
2. MANISHA
Chitkara University, Chandigarh Patiala National Highway (NH-64), Village, Jansla, Rajpura, Punjab- 140401, India.

Specification

[0001] The present disclosure relates to the field of Augmented Reality (AR). In
particular, the present disclosure provides a system and method for assessing and accrediting
response to augmented reality (AR).
BACKGROUND
[0002] The background description includes information that may be useful in
understanding the present invention. It is not an admission that any of the information provided
herein is prior art or relevant to the presently claimed invention, or that any publication
specifically or implicitly referenced is prior art.
[0003] Nowadays, with advancements in technology and the drive towards renewable, green, and eco-friendly practices, laying power lines as well as gas lines, such as Piped Natural Gas (PNG), in buildings ranging from large industrial facilities to multi-storeyed flats and houses has become common practice. Though these practices are commendable, any mishap, negligence, or accident can result in a hazardous fire.
[0004] Moreover, environmental conditions are degrading with each passing day, and with them grows the likelihood of adverse events such as a fire breakout. Many incidents also occur due to negligence or human error. Whatever the reason, fire breakouts have become an unavoidable hazard of modern life. Many researchers have made efforts to address the problem at the source level itself, but owing to the variety and vastness of the sources, it appears nearly impossible to prevent even half of the cases.
[0005] Various mock drills are conducted periodically to mitigate losses in case of a real-time disaster or hazardous situation. In a mock drill, an environment mimicking a real-time disaster or hazardous situation is created, the response of the people present is observed, and the people are thereby trained for such real-time disasters or hazardous situations. Mock drills are conducted in schools, hospitals, residential societies, and other strategic areas to train people to react to and cope with disasters or hazardous situations such as earthquakes, landslides, cyclones, etc. However, these mock drills generally skip training for a real-time fire hazard, and even when fire-mitigation mock drills are included, they are not effective enough, as it is very difficult to create an environment that imitates and provides a real-time effect.
[0006] Also, most fire brigade departments run drills in which the employee (firefighter) has to go through a series of real-life obstacles, exposing themselves to adverse conditions. For ordinary people and school children trying to learn firefighting tactics, applying the same scenario could be fatal.
[0007] There is, therefore, a need in the art for an efficient, safe, and reliable system to overcome the above-mentioned problems and provide a means to create an environment favorable for learning the basic principles of firefighting without endangering one’s life.
OBJECTS OF THE PRESENT DISCLOSURE
[0008] Some of the objects of the present disclosure, which at least one embodiment
herein satisfies are as listed herein below.
[0009] It is an object of the present disclosure to provide a system and method for
facilitating AR-based emulation of one or more markers.
[0010] It is another object of the present disclosure to provide a system and method for
assessing a response for AR-based emulation, and generate a corresponding report.
[0011] It is another object of the present disclosure to provide a system and method for emulating an AR view for providing training for firefighting.
[0012] It is another object of the present disclosure to provide a system and method for
providing an interesting, interactive, accurate, fast, efficient, cost effective and simple AR-based
training platform.
[0013] These and other objects of the present invention will become readily apparent
from the following detailed description taken in conjunction with the accompanying drawings.
SUMMARY
[0014] The present disclosure relates to the field of Augmented Reality (AR). In
particular, the present disclosure provides a system and method for assessing and accrediting
response to augmented reality (AR).
[0015] An aspect of the present disclosure pertains to a system to assess and accredit a
response to Augmented Reality (AR), the system comprising: one or more markers positioned at
one or more pre-determined positions at an Area of Interest (AOI); a scanning unit to scan the
one or more markers; and an augmented reality (AR) unit operatively coupled to the scanning
unit, the AR unit comprising one or more processors coupled with a memory, the memory
storing instructions executable by the one or more processors configured to: extract marker
attributes associated with at least one of the scanned one or more markers; emulate an AR view
corresponding to the extracted marker attributes of the at least one of the scanned one or more
markers; extract one or more response parameters from a set of first data packets received from
an input unit, wherein the set of first data packets may be transmitted by the input unit in
response to the emulated AR view; generate a weighted combination of the extracted one or
more response parameters; and wherein the AR unit may be configured to, when the generated
weighted combination exceeds a predetermined threshold, generate an accreditation report.
[0016] In an aspect, the AR unit may be configured to generate a warning report when
the generated weighted combination is below the predetermined threshold.
[0017] In an aspect, the AR unit may assign pre-defined weights to each of the one or
more response parameters, and correspondingly generate the weighted combination based on the
extracted one or more response parameters.
[0018] In an aspect, the at least one of the one or more markers may be configured with
one or more IR emitting sources, and the scanning unit may be configured with an IR emission
sensor.
[0019] In an aspect, the system may comprise an eye-ware operatively coupled to the AR
unit to facilitate display of the emulated AR view.
[0020] In an aspect, the marker attributes may comprise any or a combination of identity
(ID), internet protocol (IP) address, size, shape, location, and alignment of a marker.
[0021] In an aspect, the one or more response parameters comprise any or a combination of: the number, amount, and frequency of the set of first data packets; the angle, location, and alignment of the input unit; the pressure and force exerted at the input unit; and the position and distance of the input unit with respect to the at least one of the emulated one or more markers.
[0022] In an aspect, the AR unit may be configured to receive the set of first data packets from the input unit within a pre-determined time-period, wherein the number of received first data packets may be below a pre-defined number.
[0023] In an aspect, the input unit may comprise one or more response buttons, wherein
the one or more response buttons may be operated in response to the emulated AR view.
[0024] Another aspect of the present disclosure pertains to a method for assessing and accrediting a response to Augmented Reality (AR), the method comprising the steps of: scanning, at a scanning unit, one or more markers positioned at one or more pre-determined positions at an Area of Interest (AOI); extracting, at one or more processors of a processing engine, marker attributes associated with at least one of the scanned one or more markers; emulating, at the one or more processors, an AR view corresponding to the extracted marker attributes of the at least one of the scanned one or more markers; extracting, at the one or more processors, one or more response parameters from a set of first data packets received from an input unit, wherein the set of first data packets may be transmitted by the input unit in response to the emulated AR view; generating, at the one or more processors, a weighted combination of the extracted one or more response parameters; and wherein, in the event of the generated weighted combination exceeding a predetermined threshold, an accreditation report may be generated.
BRIEF DESCRIPTION OF THE DRAWINGS
[0025] The accompanying drawings are included to provide a further understanding of
the present disclosure, and are incorporated in and constitute a part of this specification. The
drawings illustrate exemplary embodiments of the present disclosure and, together with the
description, serve to explain the principles of the present disclosure.
[0026] The diagrams are for illustration only and thus do not limit the present disclosure, wherein:
[0027] FIG. 1 illustrates an exemplary block diagram of the proposed system illustrating its overall working in accordance with an embodiment of the present disclosure.
[0028] FIG. 2 illustrates exemplary engines of an AR unit in accordance with an
exemplary embodiment of the present disclosure.
[0029] FIG. 3 illustrates a method elaborating the working of the proposed system in accordance with an exemplary embodiment of the present disclosure.
[0030] FIG. 4 illustrates an exemplary computer system in which or with which
embodiments of the present invention can be utilized in accordance with embodiments of the
present disclosure.
DETAILED DESCRIPTION
[0031] In the following description, numerous specific details are set forth in order to
provide a thorough understanding of embodiments of the present invention. It will be apparent to
one skilled in the art that embodiments of the present invention may be practiced without some
of these specific details.
[0032] Embodiments of the present invention may be provided as a computer program
product, which may include a machine-readable storage medium tangibly embodying thereon
instructions, which may be used to program a computer (or other electronic devices) to perform a
process. The machine-readable medium may include, but is not limited to, fixed (hard) drives,
magnetic tape, floppy diskettes, optical disks, compact disc read-only memories (CD-ROMs),
and magneto-optical disks, semiconductor memories, such as read-only memories (ROMs), programmable read-only memories (PROMs), random access memories (RAMs), erasable PROMs (EPROMs),
electrically erasable PROMs (EEPROMs), flash memory, magnetic or optical cards, or other
type of media/machine-readable medium suitable for storing electronic instructions (e.g.,
computer programming code, such as software or firmware).
[0033] Various methods described herein may be practiced by combining one or more
machine-readable storage media containing the code according to the present invention with
appropriate standard computer hardware to execute the code contained therein. An apparatus for
practicing various embodiments of the present invention may involve one or more computers (or
one or more processors within a single computer) and storage systems containing or having
network access to computer program(s) coded in accordance with various methods described
herein, and the method steps of the invention could be accomplished by engines, routines,
subroutines, or subparts of a computer program product.
[0034] If the specification states a component or feature “may”, “can”, “could”, or
“might” be included or have a characteristic, that particular component or feature is not required
to be included or have the characteristic.
[0035] As used in the description herein and throughout the claims that follow, the
meaning of “a,” “an,” and “the” includes plural reference unless the context clearly dictates
otherwise. Also, as used in the description herein, the meaning of “in” includes “in” and “on”
unless the context clearly dictates otherwise.
[0036] The recitation of ranges of values herein is merely intended to serve as a
shorthand method of referring individually to each separate value falling within the range. Unless
otherwise indicated herein, each individual value is incorporated into the specification as if it
were individually recited herein. All methods described herein can be performed in any suitable
order unless otherwise indicated herein or otherwise clearly contradicted by context. The use of
any and all examples, or exemplary language (e.g. “such as”) provided with respect to certain
embodiments herein is intended merely to better illuminate the invention and does not pose a
limitation on the scope of the invention otherwise claimed. No language in the specification
should be construed as indicating any non-claimed element essential to the practice of the
invention.
[0037] Groupings of alternative elements or embodiments of the invention disclosed
herein are not to be construed as limitations. Each group member can be referred to and claimed
individually or in any combination with other members of the group or other elements found
herein. One or more members of a group can be included in, or deleted from, a group for reasons
of convenience and/or patentability. When any such inclusion or deletion occurs, the
specification is herein deemed to contain the group as modified thus fulfilling the written
description of all groups used in the appended claims.
[0038] Exemplary embodiments will now be described more fully hereinafter with
reference to the accompanying drawings, in which exemplary embodiments are shown. This
invention may, however, be embodied in many different forms and should not be construed as
limited to the embodiments set forth herein. These embodiments are provided so that this
disclosure will be thorough and complete and will fully convey the scope of the invention to
those of ordinary skill in the art. Moreover, all statements herein reciting embodiments of the
invention, as well as specific examples thereof, are intended to encompass both structural and
functional equivalents thereof. Additionally, it is intended that such equivalents include both
currently known equivalents as well as equivalents developed in the future (i.e., any elements
developed that perform the same function, regardless of structure).
[0039] The present disclosure relates to the field of Augmented Reality (AR). In
particular, the present disclosure provides a system and method for assessing and accrediting
response to augmented reality (AR).
[0040] According to an aspect, the present disclosure pertains to a system to assess and accredit a response to Augmented Reality (AR), the system including: one or more markers positioned at one or more pre-determined positions at an Area of Interest (AOI); a scanning unit to scan the one or more markers; and an augmented reality (AR) unit operatively coupled to the scanning unit, the AR unit including one or more processors coupled with a memory, the memory storing instructions executable by the one or more processors configured to: extract marker attributes associated with at least one of the scanned one or more markers; emulate an AR view corresponding to the extracted marker attributes of the at least one of the scanned one or more markers; extract one or more response parameters from a set of first data packets received from an input unit, wherein the set of first data packets can be transmitted by the input unit in response to the emulated AR view; generate a weighted combination of the extracted one or more response parameters; and wherein the AR unit can be configured to, when the generated weighted combination exceeds a predetermined threshold, generate an accreditation report.
[0041] In an embodiment, the AR unit can be configured to generate a warning report
when the generated weighted combination is below the predetermined threshold.
[0042] In an embodiment, the AR unit can assign pre-defined weights to each of the one
or more response parameters, and correspondingly generate the weighted combination based on
the extracted one or more response parameters.
[0043] In an embodiment, the at least one of the one or more markers can be configured
with one or more IR emitting sources, and the scanning unit can be configured with an IR
emission sensor.
[0044] In an embodiment, the system can include an eye-ware operatively coupled to the
AR unit to facilitate display of the emulated AR view.
[0045] In an embodiment, the marker attributes can include any or a combination of identity (ID), internet protocol (IP) address, size, shape, location, and alignment of a marker.
[0046] In an embodiment, the one or more response parameters can include any or a combination of: the number, amount, and frequency of the set of first data packets; the angle, location, and alignment of the input unit; the pressure and force exerted at the input unit; and the position and distance of the input unit with respect to the at least one of the emulated one or more markers.
[0047] In an embodiment, the AR unit can receive the set of first data packets from the input unit within a pre-determined time-period, wherein the number of received first data packets can be below a pre-defined number.
[0048] In an embodiment, the input unit can include one or more response buttons,
wherein the one or more response buttons can be operated in response to the emulated AR view.
[0049] According to another aspect, the present disclosure pertains to a method for assessing and accrediting a response to Augmented Reality (AR), the method including the steps of: scanning, at a scanning unit, one or more markers positioned at one or more pre-determined positions at an Area of Interest (AOI); extracting, at one or more processors of a processing engine, marker attributes associated with at least one of the scanned one or more markers; emulating, at the one or more processors, an AR view corresponding to the extracted marker attributes of the at least one of the scanned one or more markers; extracting, at the one or more processors, one or more response parameters from a set of first data packets received from an input unit, wherein the set of first data packets can be transmitted by the input unit in response to the emulated AR view; generating, at the one or more processors, a weighted combination of the extracted one or more response parameters; and wherein, in the event of the generated weighted combination exceeding a predetermined threshold, an accreditation report can be generated.
[0050] FIG. 1 illustrates an exemplary block diagram of the proposed system illustrating its overall working in accordance with an embodiment of the present disclosure.
[0051] As illustrated in FIG. 1, in an embodiment, the proposed system 100 can include one or more markers 120-1, 120-2… 120-N (collectively referred to as a plurality of markers 120, or markers 120, and individually referred to as a marker 120). The
plurality of markers 120 can be positioned at one or more pre-determined positions at an Area of
Interest (AOI). In an embodiment, at least one of the plurality of markers 120 can be configured
with one or more IR emitting sources.
[0052] In an embodiment, the proposed system 100 can include a scanning unit 110. The scanning unit 110 can be configured to scan at least one of the plurality of markers 120. In an illustrative implementation, the scanning unit 110 can capture one or more images of at least one of the plurality of markers 120. In another illustrative implementation, the scanning unit 110 can be configured to extract marker attributes associated with at least one of the plurality of markers 120, through scanning of the at least one of the plurality of markers 120, such as bar code reading, optical mark recognition, and the like. The marker attributes can include any or a combination of identity (ID), internet protocol (IP) address, size, position, shape, location, alignment of a marker 120, and the like. In an illustrative embodiment, the scanning unit 110 can be configured with an IR emission sensor, which can facilitate more accurate detection of the at least one of the plurality of markers 120 configured with the one or more IR emitting sources.
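The marker attributes enumerated above can be pictured as a simple record decoded from a scan. The following is a minimal illustrative sketch only, not part of the disclosure; the field names, the `decode_marker` helper, and the semicolon-separated payload format are all assumptions made for illustration:

```python
# Illustrative sketch of the marker attributes described above.
# Field names and the payload format are hypothetical, not part of
# the disclosure.
from dataclasses import dataclass

@dataclass
class MarkerAttributes:
    marker_id: str            # identity (ID) of the marker
    ip_address: str           # internet protocol (IP) address
    size: float               # size of the marker
    shape: str                # e.g. "square", "circle"
    location: tuple           # position within the Area of Interest (AOI)
    alignment: float          # orientation angle, in degrees

def decode_marker(payload: str) -> MarkerAttributes:
    """Decode a scanned payload such as 'M1;10.0.0.5;2.5;square;3,4;90'
    (an assumed format) into a MarkerAttributes record."""
    mid, ip, size, shape, loc, align = payload.split(";")
    x, y = (float(v) for v in loc.split(","))
    return MarkerAttributes(mid, ip, float(size), shape, (x, y), float(align))

attrs = decode_marker("M1;10.0.0.5;2.5;square;3,4;90")
print(attrs.marker_id, attrs.size)   # M1 2.5
```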
[0053] In an embodiment, the proposed system 100 can include an augmented reality (AR) unit 102. The AR unit 102 can be operatively coupled to the scanning unit 110. In an embodiment, the AR unit 102 can be configured to facilitate extraction of the
marker attributes associated with the at least one of the scanned markers 120. In another
embodiment, the AR unit 102 can be configured to emulate an AR view corresponding to the
extracted marker attributes of the at least one of the scanned markers 120.
[0054] In an embodiment, the proposed system 100 can include an eye-ware 106. The eye-ware 106 can be operatively coupled to the AR unit 102 to facilitate display of the emulated AR view. In an illustrative embodiment, the eye-ware 106 can be
configured to display a two-dimensional (2D) representation, as well as, a three-dimensional
(3D) representation of the emulated AR view. A user can switch between the 2D representation,
and, the 3D representation, as and when required.
[0055] In an embodiment, the proposed system 100 can include an input unit 108. The input unit 108 can be operatively coupled to the AR unit 102. The input unit
108 can be a hand-held device, and can be configured to include one or more response buttons,
which can be operated in one or more ways, such as pressed or rolled, to provide a response to
the emulated AR view that is being displayed at the eye-ware 106. In an illustrative embodiment,
the input unit 108 can be operated to generate a set of first data packets in response to the
emulated AR view. In an embodiment, the AR unit 102 can receive the set of first data packets from the input unit 108 within a pre-determined time-period, and the number of received first data packets cannot exceed a pre-defined number.
[0056] In an embodiment, the set of first data packets can include one or more response parameters. The AR unit 102 can be configured to extract the one or more response parameters from the set of first data packets received from the input unit 108, where the one or more response parameters can include any or a combination of: the number, amount, and frequency of the set of first data packets; the angle, location, and alignment of the input unit 108; the pressure and force exerted at the input unit 108; and the position and distance of the input unit 108 with respect to the at least one of the emulated markers 120, and the like.
[0057] In an embodiment, the AR unit 102 can be configured to generate a weighted
combination of the extracted one or more response parameters. In an illustrative implementation,
the AR unit 102 can assign pre-defined weights to each of the one or more response parameters,
and can correspondingly generate the weighted combination based on the extracted one or more
response parameters.
[0058] In an embodiment, the AR unit 102 can be configured to assess the generated
weighted combination, and compare the generated weighted combination with a predetermined
threshold. In an illustrative implementation, the AR unit 102 can be configured to generate an
accreditation report when the generated weighted combination exceeds the predetermined
threshold. In another illustrative implementation, the AR unit 102 can be configured to generate
a warning report when the generated weighted combination is below the predetermined
threshold.
[0059] In an illustrative implementation, the proposed system 100 can be configured as a training means for providing/facilitating firefighting training. In an embodiment, the markers 120 can be positioned at the pre-determined positions at the AOI, where the AOI can be a multi-storeyed building, a training ground, a strategic complex, and the like. The AR unit 102 can be configured to emulate an AR view, representing the at least one of the scanned markers 120 as a fire emitting source, corresponding to the extracted marker attributes of the at least one of the scanned markers 120. For example, if the size of a first marker 120-1 is greater than that of a second marker 120-2, then, in the emulated AR view, the first marker 120-1 can be emulated as a large fire emitting source, whereas the second marker 120-2 can be emulated as a small fire emitting source. In an illustrative embodiment, the at least one of the markers 120 can be configured with the one or more IR LEDs, and the scanning unit 110 can be configured with an IR emission sensor, which can facilitate more accurate detection of the at least one of the markers 120.
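The marker-size-to-fire-size mapping in the example above (larger marker, larger emulated fire) can be sketched as follows. This is an illustration only; the concrete size bands and the `fire_scale` name are assumptions, not part of the disclosure:

```python
# Illustrative sketch: mapping a marker's size attribute to the scale of
# the emulated fire emitting source, as in the example above.
# The size bands (>= 2.0 large, >= 1.0 medium) are assumed for illustration.

def fire_scale(marker_size: float) -> str:
    """Map a marker's size attribute to an emulated fire-source scale."""
    if marker_size >= 2.0:
        return "large"
    if marker_size >= 1.0:
        return "medium"
    return "small"

# A first marker larger than a second is emulated as the larger fire source:
print(fire_scale(2.5))   # large
print(fire_scale(0.5))   # small
```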
[0060] In another illustrative implementation, a user can operate the input unit 108 to provide a response to the emulated AR view. The user can wear the eye-ware 106, which can be configured to display the emulated AR view of the at least one of the markers 120. In an illustrative embodiment, the input unit 108 can be operated to generate a set of first data packets in response to the emulated AR view. The AR unit 102 can be configured to extract one or more response parameters from the set of first data packets, and correspondingly generate a weighted combination of the extracted one or more response parameters. The AR unit 102 can assign pre-defined weights to each of the one or more response parameters, and can correspondingly generate the weighted combination based on the extracted one or more response parameters. For example, let the weights assigned to the number of the received set of first data packets (n) and the pressure exerted (p) at the input unit 108 be 0.4 units and 0.5 units, respectively. Then, the formula for the weighted combination (Tw) can be given as:
Tw = 0.4*n + 0.5*p
and let the value of the pre-defined threshold be 5.5 units.
In one case, when the number of the received set of first data packets (n) is 5 and the pressure exerted (p) at the input unit 108 is 8 units, the resulting weighted combination (Tw) is 6 units, which exceeds the pre-defined threshold; hence, the AR unit 102 can generate an accreditation report indicating that the task is completed successfully.
In another case, when n is 10 and p is 4 units, the resulting weighted combination (Tw) is again 6 units, which exceeds the pre-defined threshold; hence, the AR unit 102 can generate an accreditation report indicating that the task is completed successfully.
In yet another case, when n is 6 and p is 5 units, the resulting weighted combination (Tw) is 4.9 units, which is below the pre-defined threshold; hence, the AR unit 102 can generate a warning report indicating that the task has failed and needs to be performed again.
In another illustrative implementation, the number of the received set of first data packets cannot exceed a pre-defined number, say 15. In case the number of the received set of first data packets reaches 15 and the generated weighted combination (Tw) is still below the pre-defined threshold, the AR unit 102 can generate a warning report indicating that the task has failed and needs to be performed again. In an embodiment, the user has to wait for some time in order to generate the set of first data packets again from the input unit 108, once the set of first data packets is exhausted without completion of the task.
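The scoring logic of the worked example above can be sketched as follows. The weights (0.4 and 0.5), the threshold (5.5 units), and the packet cap (15) are the example values from this paragraph; the `assess_response` function name is a hypothetical label for illustration only:

```python
# Illustrative sketch of the weighted-combination scoring described above.
# Weights, threshold, and packet cap are the example values from this
# paragraph; the function name is hypothetical.

WEIGHT_N = 0.4      # weight for the number of received first data packets (n)
WEIGHT_P = 0.5      # weight for the pressure exerted at the input unit (p)
THRESHOLD = 5.5     # pre-defined threshold (units)
MAX_PACKETS = 15    # pre-defined cap on the number of received data packets

def assess_response(n: int, p: float) -> str:
    """Return 'accreditation' or 'warning' for a response with
    n received data packets and pressure p at the input unit."""
    if n > MAX_PACKETS:
        # Packets exhausted without completing the task.
        return "warning"
    tw = WEIGHT_N * n + WEIGHT_P * p   # weighted combination Tw
    return "accreditation" if tw > THRESHOLD else "warning"

# The three cases worked out above:
print(assess_response(5, 8))    # Tw = 6.0 -> accreditation
print(assess_response(10, 4))   # Tw = 6.0 -> accreditation
print(assess_response(6, 5))    # Tw = 4.9 -> warning
```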
[0061] According to various embodiments of the present disclosure, the AR unit 102 can
provide for an Artificial Intelligence (AI) based automatic emulation of the one or more scanned
markers 120.
[0062] In an embodiment, a response device can be configured which can include both the input unit 108 as well as the scanning unit 110. In another embodiment, the response device can also include the eye-ware 106. Hence, the response device can allow multi-tasking, such as providing display of the emulated AR view, and allowing the user to scan as well as respond to the emulated AR view, simultaneously.
[0063] FIG. 2 illustrates exemplary engines of an AR unit in accordance with an
exemplary embodiment of the present disclosure.
[0064] As illustrated, the AR unit 102 can include one or more processor(s) 202. The one
or more processor(s) 202 can be implemented as one or more microprocessors, microcomputers,
microcontrollers, digital signal processors, central processing units, logic circuitries, and/or any
devices that manipulate data based on operational instructions. Among other capabilities, the one
or more processor(s) 202 are configured to fetch and execute computer-readable instructions
stored in a memory 204 of the AR unit 102. The memory 204 can store one or more computer-readable instructions or routines, which may be fetched and executed to create or share the data
units over a network service. The memory 204 can include any non-transitory storage device
including, for example, volatile memory such as RAM, or non-volatile memory such as EPROM,
flash memory, and the like.
[0065] In an embodiment, the AR unit 102 can also include an interface(s) 206. The
interface(s) 206 may include a variety of interfaces, for example, interfaces for data input and
output devices, referred to as I/O devices, storage devices, and the like. The interface(s) 206 may
facilitate communication of the AR unit 102 with various devices coupled to the AR unit 102.
The interface(s) 206 may also provide a communication pathway for one or more components of
the AR unit 102. Examples of such components include, but are not limited to, processing
engine(s) 208 and data 210.
[0066] In an embodiment, the processing engine(s) 208 can be implemented as a
combination of hardware and programming (for example, programmable instructions) to
implement one or more functionalities of the processing engine(s) 208. In examples described
herein, such combinations of hardware and programming may be implemented in several
different ways. For example, the programming for the processing engine(s) 208 may be
processor executable instructions stored on a non-transitory machine-readable storage medium
and the hardware for the processing engine(s) 208 may include a processing resource (for
example, one or more processors), to execute such instructions. In the present examples, the
machine-readable storage medium may store instructions that, when executed by the processing
resource, implement the processing engine(s) 208. In such examples, the AR unit 102 can
include the machine-readable storage medium storing the instructions and the processing
resource to execute the instructions, or the machine-readable storage medium may be separate
but accessible to AR unit 102 and the processing resource. In other examples, the processing
engine(s) 208 may be implemented by electronic circuitry. The data 210 can include data that is
either stored or generated as a result of functionalities implemented by any of the components of
the processing engine(s) 208.
[0067] In an embodiment, the processing engine(s) 208 can include an extraction engine
212, an AR emulation engine 214, a report generation engine 216, and other engine(s) 218. The
other engine(s) 218 can implement functionalities that supplement applications or functions
performed by the AR unit 102 or the processing engine(s) 208.
[0068] In an embodiment, the extraction engine 212 of the AR unit 102 can facilitate
extraction of marker attributes associated with at least one of the plurality of markers 120,
through the scanning unit 110. The marker attributes can include any or a combination of identity
(ID), internet protocol (IP) address, size, position, shape, location, alignment of a marker 120,
and the like. The scanning unit 110 can scan the at least one of the plurality of markers 120
within a pre-configured time-period. In an illustrative embodiment, the at least one of the
markers 120 can be configured with the one or more IR LEDs, and the scanning unit 110 can be
configured with an IR emission sensor, which can facilitate more accurate detection of the at
least one of the markers 120. In an embodiment, the scanning unit 110 can be configured to
capture one or more images of at least one of the plurality of markers 120. The extraction engine
212 can extract the marker attributes of the at least one of the plurality of markers 120 by
processing the captured one or more images, associated with the at least one of the plurality of
markers 120, at the processing engine(s) 208. In another embodiment, the extraction engine 212
can facilitate extraction of the marker attributes of the at least one of the plurality of markers 120
directly through the scanning unit 110.
[0069] In an embodiment, the extraction engine 212 can enable extraction of one or more
response parameters from a set of first data packets, which can be transmitted to the AR unit 102
through an input unit 108. The input unit 108 can be operatively coupled to the AR unit 102. The input unit 108 can be a hand-held device, and can be configured to include one or more response buttons, which can be operated in one or more ways, such as pressed or rolled, to generate the set of first data packets in response to an emulated AR view that is being displayed at an eye-ware 106 adapted to be worn by a user, which can be operatively coupled with the AR unit 102. In an embodiment, the AR unit 102 can receive the set of first data packets from the input unit 108 within a pre-determined time-period, and the number of received sets of first data packets cannot exceed a pre-defined number. In an embodiment, any or a combination of the one or more response parameters from the set of first data packets, and the extracted marker attributes of each scanned marker 120, can be stacked separately in any or a combination of a first-in first-out (also referred to as FIFO, herein) stack, a last-in first-out (also referred to as LIFO, herein) stack, and the like. In an embodiment, the extracted marker attributes of each scanned marker 120 can be collated under a separate identity (collectively referred to as IDs, and individually referred to as ID, herein).
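As a hedged illustration of the stacking described above, the following Python sketch stacks response parameters in FIFO order and marker attributes in LIFO order; the field names, marker IDs, and data layout are assumptions for illustration only, not part of the disclosure:

```python
from collections import deque

# Illustrative FIFO/LIFO stacking of response parameters and marker attributes.
fifo_responses = deque()   # response parameters, consumed first-in first-out
lifo_attributes = []       # marker attributes, consumed last-in first-out

def stack_response(params):
    """Enqueue a set of response parameters at the tail (FIFO)."""
    fifo_responses.append(params)

def stack_marker(marker_id, attributes):
    """Push a scanned marker's attributes, collated under its ID (LIFO)."""
    lifo_attributes.append((marker_id, attributes))

stack_response({"frequency": 10, "force": 5})
stack_response({"frequency": 5, "force": 7})
stack_marker("M1", {"size": 4, "shape": "square"})
stack_marker("M2", {"size": 2, "shape": "circle"})

first_response = fifo_responses.popleft()  # oldest response comes out first
latest_marker = lifo_attributes.pop()      # most recently scanned marker comes out first
```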
[0070] In an embodiment, the AR emulation engine 214 of the AR unit 102 can enable
emulation of the AR view of the markers 120 selected from the plurality of markers 120 through
the scanning of the at least one of the plurality of markers 120. In an embodiment, the AR
emulated view can be displayed at the eye-ware 106. In an embodiment, for emulating the AR view,
the extracted marker attributes can be compared with an emulation dataset. The AR emulation of
the at least one of the scanned markers 120 can be induced based on the comparison of the
extracted marker attributes with the emulation dataset. The emulation dataset can include marker
attributes associated with multiple markers. The AR emulation of the at least one of the scanned
markers 120 can be done based on matching of marker attributes of the at least one of the
scanned markers 120 with marker attributes of at least one marker of the multiple markers
associated with the emulation dataset. In an embodiment, the emulation dataset can be associated
with the data 210 of the proposed system 100. In another embodiment, the emulation dataset can
be acquired from a third-party source. In an embodiment, in case the extracted marker attributes of the at least one of the scanned markers 120 do not match with the emulation dataset, the marker 120 is required to be scanned again. Further, a negative comparison of the extracted marker attributes of a scanned marker 120 with the emulation dataset can indicate that the emulation dataset does not include marker attributes of the marker being scanned, and that the emulation dataset needs to be updated.
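The comparison against the emulation dataset can be sketched as follows; the dataset contents, the attribute keys, and the `match_marker` helper are hypothetical, and a `None` result corresponds to the negative comparison above:

```python
# Hypothetical emulation dataset: marker ID -> reference marker attributes.
emulation_dataset = {
    "M1": {"size": 4, "shape": "square"},
    "M2": {"size": 2, "shape": "circle"},
}

def match_marker(extracted):
    """Return the ID of the dataset marker whose attributes match the
    extracted ones, or None when the marker must be rescanned (or the
    emulation dataset updated)."""
    for marker_id, reference in emulation_dataset.items():
        if all(extracted.get(key) == value for key, value in reference.items()):
            return marker_id
    return None

assert match_marker({"size": 4, "shape": "square"}) == "M1"
assert match_marker({"size": 9, "shape": "star"}) is None  # negative comparison
```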
[0071] In an illustrative embodiment, the proposed system 100 can be configured as a training means for providing/facilitating firefighting training. In an illustrative implementation, in case the size of a first marker 120-1 is more than that of a second marker 120-2, then, in the emulated AR view, the first marker 120-1 can be emulated as a large fire-emitting source, whereas the second marker 120-2 can be emulated as a small fire-emitting source. However, in case both the first marker 120-1 and the second marker 120-2 are positioned in close proximity with respect to each other, it may result in an emulation of a huge fire-emitting source, which can be associated with an aggregation of the sizes of both the first marker 120-1 and the second marker 120-2.
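A minimal sketch of this firefighting illustration follows; the distance threshold for "close proximity" and the size cutoff between small and large sources are assumptions, not values taken from the disclosure:

```python
PROXIMITY_THRESHOLD = 1.0  # assumed distance below which two markers merge

def emulate_fire(markers, proximity_threshold=PROXIMITY_THRESHOLD):
    """markers: list of dicts with 'size' and a one-dimensional 'position'.
    Returns the emulated fire-emitting sources."""
    if (len(markers) == 2 and
            abs(markers[0]["position"] - markers[1]["position"]) < proximity_threshold):
        # Close proximity: aggregate both sizes into one huge fire-emitting source.
        return [{"size": markers[0]["size"] + markers[1]["size"], "kind": "huge"}]
    # Otherwise each marker is emulated on its own, sized by its attributes.
    return [{"size": m["size"], "kind": "large" if m["size"] > 3 else "small"}
            for m in markers]

far = emulate_fire([{"size": 4, "position": 0.0}, {"size": 2, "position": 5.0}])
near = emulate_fire([{"size": 4, "position": 0.0}, {"size": 2, "position": 0.5}])
```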
[0072] In an embodiment, the report generation engine 216 of the AR unit 102 can
facilitate generation of a weighted combination of the extracted one or more response
parameters, and further, generate a corresponding report. In an illustrative implementation, pre-defined weights can be assigned to each of the one or more response parameters, and
correspondingly the weighted combination can be generated based on the extracted one or more
response parameters. In an embodiment, the report generation engine 216 can enable assessment
of the generated weighted combination, and compare the generated weighted combination with a
predetermined threshold. In an illustrative implementation, the report generation engine 216 can
facilitate generation of an accreditation report when the generated weighted combination exceeds
the predetermined threshold. In another illustrative implementation, the report generation engine
216 can facilitate generation of a warning report when the generated weighted combination is
below the predetermined threshold.
[0073] In an illustrative implementation, let weights assigned to the frequency of the
received set of first data packets (f) and the force exerted (F) at the input unit 108, be 0.3 units
and 0.6 units, respectively. Here, the weighted combination (Tw) can be calculated through the
formula –
Tw = 0.3*f + 0.6*F
and, let the value of pre-defined threshold be 5.5 units.
In a case, when the frequency (f) of receiving the set of first data packets is 10 units, and the force exerted (F) at the input unit 108 is 5 units, the resulting weighted combination (Tw) is 6 units, which exceeds the pre-defined threshold; hence, the report generation engine 216 can facilitate generation of the accreditation report indicating that the task is completed successfully.
In another case, when the frequency (f) of receiving the set of first data packets is 5 units, and the force exerted (F) at the input unit 108 is 7 units, the resulting weighted combination (Tw) is 5.7 units, which exceeds the pre-defined threshold; hence, the report generation engine 216 can facilitate generation of the accreditation report indicating that the task is completed successfully.
In yet another case, when the frequency (f) of receiving the set of first data packets is 5 units, and the force exerted (F) at the input unit 108 is 5 units, the resulting weighted combination (Tw) is 4.5 units, which is below the pre-defined threshold; hence, the report generation engine 216 can facilitate generation of a warning report indicating that the task has failed and needs to be performed again.
In another illustrative implementation, the number of received sets of first data packets cannot exceed a pre-defined number, let it be 15. In case the number of received sets of first data packets becomes equal to 15, and the generated weighted combination (Tw) is still below the pre-defined threshold, the report generation engine 216 can facilitate generation of a warning report indicating that the task has failed and needs to be performed again. In an embodiment, the user has to wait for some time in order to generate the set of first data packets again from the input unit 108, once the set of first data packets is exhausted without completion of the task.
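The worked example above can be reproduced in a short Python sketch; the function names and report labels are illustrative assumptions, while the weights (0.3 and 0.6) and the threshold (5.5) are the values of this illustration:

```python
WEIGHT_FREQUENCY = 0.3  # weight assigned to the frequency (f) of first data packets
WEIGHT_FORCE = 0.6      # weight assigned to the force (F) exerted at the input unit
THRESHOLD = 5.5         # pre-defined threshold

def weighted_combination(f, F):
    """Compute Tw = 0.3*f + 0.6*F as in the illustrative implementation."""
    return WEIGHT_FREQUENCY * f + WEIGHT_FORCE * F

def generate_report(f, F, threshold=THRESHOLD):
    """Accreditation report when Tw exceeds the threshold, warning otherwise."""
    tw = weighted_combination(f, F)
    return ("accreditation" if tw > threshold else "warning"), tw

# The three cases of paragraph [0073]:
print(generate_report(10, 5))  # Tw = 6 units, accreditation report
print(generate_report(5, 7))   # Tw = 5.7 units, accreditation report
print(generate_report(5, 5))   # Tw = 4.5 units, warning report
```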
[0074] FIG. 3 illustrates a method to elaborate working of the proposed system in
accordance with an exemplary embodiment of the present disclosure.
[0075] As illustrated in an embodiment, the method can include a step 302 of scanning,
at a scanning unit 110, at least one of the plurality of markers 120, where the plurality of markers
120 are positioned at one or more pre-determined positions at an Area of Interest (AOI). The
scanning unit 110 can scan the at least one of the plurality of markers 120 within a pre-
configured time-period. In an embodiment, the scanning unit 110 can capture one or more
images of the at least one of the plurality of markers 120. In an illustrative embodiment, the
scanning unit 110 can be configured with an IR emission sensor, which can facilitate more
accurate detection of the at least one of the plurality of markers 120, which is configured with the
one or more IR emitting sources.
[0076] In an embodiment, the method can include a step 304 of extracting, at one or
more processors of a processing engine, marker attributes associated with at least one of the
markers 120 that are being scanned in the step 302. The marker attributes can include any or a
combination of identity (ID), internet protocol (IP) address, size, position, shape, location,
alignment of a marker 120, and the like.
[0077] In an embodiment, the method can include a step 306 of emulating, at the one or more processors, an AR view corresponding to the marker attributes, being extracted in the step 304, of the at least one of the scanned markers 120. In an illustrative embodiment, an eye-ware 106, which can be operatively coupled to the AR unit 102, can facilitate display of the emulated AR view. The eye-ware 106 can be configured to display a two-dimensional (2D) representation as well as a three-dimensional (3D) representation of the emulated AR view. A user can switch between the 2D representation and the 3D representation, as and when required.
[0078] In an embodiment, the method can include a step 308 of extracting, at the one or more processors, one or more response parameters from a set of first data packets that are being received from an input unit 108. The set of first data packets can be transmitted by the input unit 108 in response to the AR view being emulated in the step 306. The input unit 108 can be a hand-held device, and can be configured to include one or more response buttons, which can be operated in one or more ways, such as pressed or rolled, to provide a response, which can include generation of the set of first data packets, to the emulated AR view that is being displayed at the eye-ware 106. In an embodiment, the AR unit 102 can receive the set of first data packets from the input unit 108 within a pre-determined time-period, and the number of received sets of first data packets cannot exceed a pre-defined number. In an embodiment, the set of first data packets can include the one or more response parameters, which can include any or a combination of the number, amount, and frequency of the set of first data packets, the angle, location, and alignment of the input unit 108, the pressure and force exerted at the input unit 108, and the position and distance of the input unit 108 with respect to the at least one of the emulated markers 120, and the like.
[0079] In an embodiment, the method can include a step 310 of generating, at the one or more processors, a weighted combination of the one or more response parameters, which are being extracted in the step 308. In an illustrative implementation, pre-defined weights can be assigned to each of the one or more response parameters, and correspondingly the weighted combination of the one or more response parameters, based on the extracted one or more response parameters, can be generated.
[0080] In an embodiment, the method can include a step 312 of generating, at the one or
more processors, an accreditation report, when the weighted combination of the one or more
response parameters generated in the step 310 exceeds a predetermined threshold. In an
illustrative embodiment, the generated weighted combination can be assessed, compared with the
predetermined threshold, and a corresponding report can be generated.
[0081] In an embodiment, the method can include a step of generating, at the one or more
processors, a warning report when the generated weighted combination is below the
predetermined threshold.
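Steps 308 through 312 can be sketched end to end as follows; the packet layout and the way response parameters are pulled from it are hypothetical stand-ins, while the weights and threshold are the illustrative values from paragraph [0073]:

```python
def assess_response(packets, weights, threshold):
    """Extract response parameters (step 308), weight them (step 310), and
    choose between accreditation and warning reports (step 312)."""
    # Step 308: response parameters derived from the received first data packets
    frequency = len(packets)                  # packets received in the time-period
    force = max(p["force"] for p in packets)  # peak force exerted at the input unit
    # Step 310: weighted combination of the extracted response parameters
    tw = weights["frequency"] * frequency + weights["force"] * force
    # Step 312 (and the warning step): compare against the threshold
    return tw, ("accreditation" if tw > threshold else "warning")

packets = [{"force": 5}] * 10  # ten received first data packets, force 5 each
tw, report = assess_response(packets, {"frequency": 0.3, "force": 0.6}, 5.5)
```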
[0082] FIG. 4 illustrates an exemplary computer system in which or with which
embodiments of the present invention can be utilized in accordance with embodiments of the
present disclosure.
[0083] As shown in FIG. 4, computer system includes an external storage device 410, a
bus 420, a main memory 430, a read only memory 440, a mass storage device 450,
communication port 460, and a processor 470. A person skilled in the art will appreciate that
computer system may include more than one processor and communication ports. Examples of
processor 470 include, but are not limited to, an Intel® Itanium® or Itanium 2 processor(s), or
AMD® Opteron® or Athlon MP® processor(s), Motorola® lines of processors, FortiSOC™
system on a chip processors or other future processors. Processor 470 may include various
engines associated with embodiments of the present invention. Communication port 460 can be
any of an RS-232 port for use with a modem based dialup connection, a 10/100 Ethernet port, a
Gigabit or 10 Gigabit port using copper or fiber, a serial port, a parallel port, or other existing or
future ports. Communication port 460 may be chosen depending on the network, such as a Local
Area Network (LAN), Wide Area Network (WAN), or any network to which computer system
connects.
[0084] In an embodiment, the memory 430 can be Random Access Memory (RAM), or
any other dynamic storage device commonly known in the art. Read only memory 440 can be
any static storage device(s), e.g., but not limited to, Programmable Read Only Memory (PROM) chips for storing static information, e.g., start-up or BIOS instructions for processor 470.
Mass storage 450 may be any current or future mass storage solution, which can be used to store
information and/or instructions. Exemplary mass storage solutions include, but are not limited to,
Parallel Advanced Technology Attachment (PATA) or Serial Advanced Technology Attachment
(SATA) hard disk drives or solid-state drives (internal or external, e.g., having Universal Serial
Bus (USB) and/or FireWire interfaces), e.g., those available from Seagate (e.g., the Seagate
Barracuda 7102 family) or Hitachi (e.g., the Hitachi Deskstar 7K1000), one or more optical
discs, Redundant Array of Independent Disks (RAID) storage, e.g. an array of disks (e.g., SATA
arrays), available from various vendors including Dot Hill Systems Corp., LaCie, Nexsan
Technologies, Inc. and Enhance Technology, Inc.
[0085] In an embodiment, the bus 420 communicatively couples processor(s) 470 with
the other memory, storage and communication blocks. Bus 420 can be, e.g. a Peripheral
Component Interconnect (PCI) / PCI Extended (PCI-X) bus, Small Computer System Interface
(SCSI), USB or the like, for connecting expansion cards, drives and other subsystems as well as
other buses, such as a front side bus (FSB), which connects processor 470 to the software system.
[0086] In another embodiment, operator and administrative interfaces, e.g. a display,
keyboard, and a cursor control device, may also be coupled to bus 420 to support direct operator
interaction with computer system. Other operator and administrative interfaces can be provided
through network connections connected through communication port 460. External storage
device 410 can be any kind of external hard-drives, floppy drives, IOMEGA® Zip Drives,
Compact Disc - Read Only Memory (CD-ROM), Compact Disc - Re-Writable (CD-RW), Digital
Video Disk - Read Only Memory (DVD-ROM). Components described above are meant only to
exemplify various possibilities. In no way should the aforementioned exemplary computer
system limit the scope of the present disclosure.
[0087] Thus, it will be appreciated by those of ordinary skill in the art that the diagrams,
schematics, illustrations, and the like represent conceptual views or processes illustrating
systems and methods embodying this invention. The functions of the various elements shown in
the figures may be provided through the use of dedicated hardware as well as hardware capable
of executing associated software. Similarly, any switches shown in the figures are conceptual
only. Their function may be carried out through the operation of program logic, through
dedicated logic, through the interaction of program control and dedicated logic, or even
manually, the particular technique being selectable by the entity implementing this invention.
Those of ordinary skill in the art further understand that the exemplary hardware, software,
processes, methods, and/or operating systems described herein are for illustrative purposes and, thus, are not intended to be limited to any particular named elements or implementations.
[0088] While embodiments of the present invention have been illustrated and described,
it will be clear that the invention is not limited to these embodiments only. Numerous
modifications, changes, variations, substitutions, and equivalents will be apparent to those skilled
in the art, without departing from the spirit and scope of the invention, as described in the claims.
[0089] In the foregoing description, numerous details are set forth. It will be apparent,
however, to one of ordinary skill in the art having the benefit of this disclosure, that the present
invention may be practiced without these specific details. In some instances, well-known
structures and devices are shown in block diagram form, rather than in detail, to avoid obscuring
the present invention.
[0090] As used herein, and unless the context dictates otherwise, the term "coupled to" is
intended to include both direct coupling (in which two elements that are coupled to each other
contact each other) and indirect coupling (in which at least one additional element is located
between the two elements). Therefore, the terms "coupled to" and "coupled with" are used
synonymously. Within the context of this document terms "coupled to" and "coupled with" are
also used euphemistically to mean “communicatively coupled with” over a network, where two
or more devices are able to exchange data with each other over the network, possibly via one or
more intermediary devices.
[0091] It should be apparent to those skilled in the art that many more modifications
besides those already described are possible without departing from the inventive concepts
herein. The inventive subject matter, therefore, is not to be restricted except in the spirit of the
appended claims. Moreover, in interpreting both the specification and the claims, all terms
should be interpreted in the broadest possible manner consistent with the context. In particular,
the terms “comprises” and “comprising” should be interpreted as referring to elements,
components, or steps in a non-exclusive manner, indicating that the referenced elements,
components, or steps may be present, or utilized, or combined with other elements, components,
or steps that are not expressly referenced. Where the specification claims refers to at least one of
something selected from the group consisting of A, B, C …. N, the text should be interpreted as
requiring only one element from the group, not A plus N, or B plus N, etc.
[0092] While the foregoing describes various embodiments of the invention, other and
further embodiments of the invention may be devised without departing from the basic scope
thereof. The scope of the invention is determined by the claims that follow. The invention is not
limited to the described embodiments, versions or examples, which are included to enable a
person having ordinary skill in the art to make and use the invention when combined with
information and knowledge available to the person having ordinary skill in the art.
ADVANTAGES OF THE PRESENT DISCLOSURE
[0093] The present disclosure provides a system and method for facilitating AR-based emulation of one or more markers.
[0094] The present disclosure provides a system and method for assessing a response to an AR-based emulation, and generating a corresponding report.
[0095] The present disclosure provides a system and method for emulating an AR view for providing firefighting training.
[0096] The present disclosure provides a system and method for providing an interesting, interactive, accurate, fast, efficient, cost-effective and simple AR-based training platform.

We Claim

1. A system to assess and accredit a response to Augmented Reality (AR), the system
comprising:
one or more markers positioned at one or more pre-determined positions at an
Area of Interest (AOI);
a scanning unit to scan the one or more markers; and
an augmented reality (AR) unit operatively coupled to the scanning unit, the AR
unit comprising one or more processors coupled with a memory, the memory storing
instructions executable by the one or more processors configured to:
extract marker attributes associated with at least one of the scanned one or
more markers;
emulate an AR view corresponding to the extracted marker attributes of
the at least one of the scanned one or more markers;
extract one or more response parameters from a set of first data packets
received from an input unit, wherein the set of first data packets are being
transmitted by the input unit in response to the emulated AR view;
generate a weighted combination of the extracted one or more response
parameters; and
wherein the AR unit is configured to, when the generated weighted combination
exceeds a predetermined threshold, generate an accreditation report.
2. The system as claimed in claim 1, wherein the AR unit is configured to generate a
warning report when the generated weighted combination is below the predetermined
threshold.
3. The system as claimed in claim 1, wherein the AR unit assigns pre-defined weights to
each of the one or more response parameters, and correspondingly generates the weighted combination based on the extracted one or more response parameters.
4. The system as claimed in claim 1, wherein the at least one of the one or more markers is
configured with one or more IR emitting sources, and the scanning unit is configured
with an IR emission sensor.
5. The system as claimed in claim 1, wherein the system comprises an eye-ware operatively
coupled to the AR unit to facilitate display of the emulated AR view.
6. The system as claimed in claim 1, wherein the marker attributes comprise any or a
combination of identity (ID), internet protocol (IP) address, size, shape, location, and
alignment of a marker.
7. The system as claimed in claim 1, wherein the one or more response parameters comprise
any or a combination of number, amount, and frequency of the set of first data packets,
angle, location, and alignment of the input unit, pressure, and force exerted at the input
unit, and position and distance of the input unit with respect to the at least one of the
emulated one or more markers.
8. The system as claimed in claim 1, wherein the AR unit receives the set of first data
packets from the input unit within a pre-determined time-period, and wherein the number of received sets of first data packets is below a pre-defined number.
9. The system as claimed in claim 1, wherein the input unit comprises one or more response
buttons, wherein the one or more response buttons are operated in response to the
emulated AR view.
10. A method for assessing and accrediting a response to Augmented Reality (AR), the method comprising the steps of:
scanning, at a scanning unit, one or more markers positioned at one or more pre-determined positions at an Area of Interest (AOI);
extracting, at one or more processors of a processing engine, marker attributes
associated with at least one of the scanned one or more markers;
emulating, at the one or more processors, an AR view corresponding to the
extracted marker attributes of the at least one of the scanned one or more markers;
extracting, at the one or more processors, one or more response parameters from a
set of first data packets received from an input unit, wherein the set of first data packets
are being transmitted by the input unit in response to the emulated AR view;
generating, at the one or more processors, a weighted combination of the extracted
one or more response parameters; and
wherein, in the event of the generated weighted combination exceeding a predetermined
threshold, an accreditation report is generated.
