Abstract: A METHOD AND A SYSTEM FOR PROVIDING AUGMENTED REALITY IN AUTOMOBILES USING UNMANNED AERIAL VEHICLES (UAVS) The present invention provides a method and a system for providing augmented reality in automobiles using Unmanned Aerial Vehicles (UAVs). The system comprises an automobile comprising an infotainment and navigation device; and an unmanned aerial vehicle, operatively coupled to said automobile, to provide real-time data exterior to the automobile; said infotainment and navigation device (102) being operable to create and display real-time composite images or videos by combining at least part of said real-time data obtained from the unmanned aerial vehicle with data available internally. Thus, the invention provides an enhanced driver experience utilizing images/videos captured of the surrounding natural environment, terrain, weather, traffic or the like.
FORM-2
THE PATENTS ACT, 1970
(39 OF 1970)
AND
THE PATENTS RULES, 2003
(As Amended)
COMPLETE SPECIFICATION
(See section 10; rule 13)
" A METHOD AND A SYSTEM FOR PROVIDING AUGMENTED REALITY IN AUTOMOBILES USING UNMANNED AERIAL
VEHICLES (UAVS)"
Tata Sons Limited, a corporation organized and existing under the laws of India, of Bombay House, 24 Homi Mody
Street, Mumbai 400 001, Maharashtra, India.
The following specification particularly describes the invention and the manner in which it is to be performed:
A METHOD AND A SYSTEM FOR PROVIDING AUGMENTED REALITY IN
AUTOMOBILES USING UNMANNED AERIAL VEHICLES (UAVS)
TECHNICAL FIELD
The present invention relates to in-vehicle infotainment and navigation systems and, more
particularly, to a system and method for providing augmented reality in automobiles using
unmanned aerial vehicles (UAVs) in real time.
BACKGROUND
In-vehicle information and entertainment systems (referred to herein as infotainment
systems) are becoming increasingly prevalent in the automobile industry. Beyond radios and
navigation devices, today's vehicles are equipped with video players, game consoles, multiple
displays, external networking connections and more.
Presently, vehicle manufacturers attempt to entice travellers to use a specific conveyance
based on any number of features. Most of these features focus on on-board vehicle devices,
such as vehicle-mounted cameras that can provide a view of the road ahead. However, such
devices fall short of providing information related to the surrounding areas and other
safety-related constraints in real time. Thus, the automobile industry has yet to satisfy
travellers' needs for more advanced infotainment systems in vehicles.
Alternatively, there are several advanced driver assistance systems (ADAS) equipped with
cameras, radar/lidar or other such devices. ADAS provide real-time information related to
the surrounding environment. However, these ADAS are also constrained by line-of-sight
and other installation limitations.
Hence, there is a need for a system and method that overcomes the above-stated problems
and provides augmented reality in automobiles.
SUMMARY
The following presents a simplified summary of the subject matter in order to provide a basic
understanding of some aspects of subject matter embodiments. This summary is not an
extensive overview of the subject matter. It is not intended to identify key/critical elements of
the embodiments or to delineate the scope of the subject matter.
Its sole purpose is to present some concepts of the subject matter in a simplified form as a
prelude to the more detailed description that is presented later.
It is therefore a primary objective of this invention to provide a system and method for
augmented reality in automobiles.
According to the preferred embodiment, an integrated in-vehicle system comprises an
infotainment and navigation device operatively connected with at least one flying unmanned
aerial vehicle (UAV). Said flying UAV is capable of providing augmented-reality
information related to the surrounding areas as well as the road ahead. Further, the
movement of said flying UAV can be controlled from the infotainment and navigation device.
In another embodiment, the present invention provides a system (100) for providing
augmented reality, said system comprising an automobile (101) comprising an infotainment
and navigation device (102); and an unmanned aerial vehicle (103), operatively coupled to
said automobile, to provide real-time data exterior to the automobile; said infotainment and
navigation device (102), operable to create and display real-time composite images or videos,
by combining at least part of said real-time data obtained from the unmanned aerial vehicle
with data available internally.
In another embodiment, the composite images or videos are created, in the infotainment and
navigation device, by superimposing or overlaying video or images as obtained in real time
from the unmanned aerial vehicle with GPS data available from an internal navigation device.
In another embodiment, the unmanned aerial vehicle comprises a plurality of sensors to
obtain said real-time data.
In another embodiment, the automobile comprises a plurality of control elements, operable to
control the unmanned aerial vehicle and the infotainment and navigation device. The control
elements are placed on a steering wheel of the automobile.
In another embodiment, the infotainment and navigation device comprises a display, a
processor and a memory. The display is coupled to the processor and the memory.
In another embodiment, the present invention provides a method for providing augmented
reality, said method comprising obtaining external real-time data using an Unmanned Aerial
Vehicle (UAV); providing said data to an infotainment and navigation device of the
automobile; and creating and displaying composite images or videos by combining said
external real-time data with internal data.
These and other objects, embodiments and advantages of the present invention will become
readily apparent to those skilled in the art from the following detailed description of the
embodiments having reference to the attached figures, the invention not being limited to any
particular embodiments disclosed.
Brief Description of the Drawings
For a better understanding of the embodiments of the systems and methods described herein,
and to show more clearly how they may be carried into effect, reference will now be made,
by way of example, to the accompanying drawings, wherein:
FIGURE 1 illustrates a graphic representation of a system for providing augmented
reality in accordance with the present invention.
FIGURE 2 illustrates a block diagram of an internal configuration of the infotainment and
navigation device and the UAV in accordance with the present invention.
FIGURE 3 illustrates a method for providing augmented reality in accordance with the
present invention.
DESCRIPTION
Exemplary embodiments now will be described with reference to the accompanying
drawings. The invention may, however, be embodied in many different forms and should not
be construed as limited to the embodiments set forth herein; rather, these embodiments are
provided so that this invention will be thorough and complete, and will fully convey its scope
to those skilled in the art. The terminology used in the detailed description of the particular
exemplary embodiments illustrated in the accompanying drawings is not intended to be
limiting. In the drawings, like numbers refer to like elements.
The specification may refer to “an”, “one” or “some” embodiment(s) in several locations.
This does not necessarily imply that each such reference is to the same embodiment(s), or
that the feature only applies to a single embodiment. Single features of different
embodiments may also be combined to provide other embodiments.
As used herein, the singular forms “a”, “an” and “the” are intended to include the plural
forms as well, unless expressly stated otherwise. It will be further understood that the terms
“includes”, “comprises”, “including” and/or “comprising” when used in this specification,
specify the presence of stated features, integers, steps, operations, elements, and/or
components, but do not preclude the presence or addition of one or more other features,
integers, steps, operations, elements, components, and/or groups thereof. It will be
understood that when an element is referred to as being “connected” or “coupled” to another
element, it can be directly connected or coupled to the other element or intervening elements
may be present. Furthermore, “connected” or “coupled” as used herein may include
operatively connected or coupled. As used herein, the term “and/or” includes any and all
combinations and arrangements of one or more of the associated listed items.
Unless otherwise defined, all terms (including technical and scientific terms) used herein
have the same meaning as commonly understood by one of ordinary skill in the art to which
this invention pertains. It will be further understood that terms, such as those defined in
commonly used dictionaries, should be interpreted as having a meaning that is consistent
with their meaning in the context of the relevant art and will not be interpreted in an idealized
or overly formal sense unless expressly so defined herein.
The figures depict a simplified structure only showing some elements and functional entities,
all being logical units whose implementation may differ from what is shown. The
connections shown are logical connections; the actual physical connections may be different.
It is apparent to a person skilled in the art that the structure may also comprise other
functions and structures.
Also, all logical units described and depicted in the figures include the software and/or
hardware components required for the unit to function. Further, each unit may comprise
within itself one or more components which are implicitly understood. These components
may be operatively coupled to each other and be configured to communicate with each other
to perform the function of the said unit.
The detailed description that follows is presented in part in terms of processes and symbolic
representations of operations performed by conventional computers, including computer components. For the
purpose of this invention, a computer may be any microprocessor or processor (hereinafter
referred to as processor) controlled device such as, by way of example, personal computers,
workstations, servers, clients, minicomputers, main-frame computers, laptop computers, a
network of one or more computers, mobile computers, portable computers, handheld
computers, palm top computers, set-top boxes for a TV, interactive televisions, interactive
kiosks, personal digital assistants, interactive wireless devices, mobile browsers, or any
combination thereof.
For the most part, the operations described herein are operations performed by a computer or
a machine in conjunction with a human operator or user that interacts with the computer or the
machine. The programs, modules, processes, methods, and the like, described herein are but
an exemplary implementation and are not related, or limited, to any particular computer,
apparatus, or computer language. Rather, various types of general purpose computing
machines or devices may be used with programs constructed in accordance with the teachings
described herein.
It would be well appreciated by persons skilled in the art that the terms “module” and
“unit” are used interchangeably in the present invention.
It would be well appreciated by persons skilled in the art that the terms “unmanned aerial
vehicle (UAV)” and “drone” are used interchangeably in the present invention.
The present invention describes a unique way in which hardware and software components of
In-Vehicle Infotainment/Human Machine Interface (IVI/HMI) and flying Drones or Unmanned
Aerial Vehicles are integrated in a unified way for enhancing a driver’s experience for different
use cases involving motoring. Further, the integrated capabilities of drone systems and
vehicular IVI/HMI systems offer a synergistic effect, not possible with the individual
features alone, thereby providing unique experiences to the driver.
Figure 1 illustrates a graphic representation of a system for providing augmented reality in
accordance with the present invention. The system (100) comprises an automobile (101)
with an infotainment and navigation device (102). It further comprises a flying unmanned
aerial vehicle (UAV) (103), which is wirelessly connected with the automobile and the
infotainment and navigation device. Further, the flying UAV can be operated and controlled
from the dashboard of the infotainment and navigation device.
Additionally, the UAV is mounted with a plurality of sensors to obtain various types of data
about the surrounding environment.
Figure 2 illustrates a block diagram of an internal configuration of the infotainment and
navigation device and the UAV in accordance with the present invention. The unmanned
aerial vehicle comprises an on-board control unit (201). The on-board control unit (201) of
the flying UAV comprises a processor (201a) coupled with a memory (201b) to perform all the
control operations, and a communication unit or transceiver (201c) to wirelessly communicate
with the transceiver (202a) of the infotainment and navigation device. Similarly, the
infotainment and navigation device comprises an integrated processor (202b) coupled with a
memory (202c), a display (203) and a dashboard (204). It would be well appreciated by
persons skilled in the art that the display/dashboard can be a touchscreen. Further, the control
of the UAV can be managed via steering-wheel-mounted controls connected with the
infotainment system via a communication bus line. The driver or a co-passenger can control
the UAV either through the touchscreen or the steering-wheel-mounted elements.
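By way of illustration only, the following Python sketch models the control/data exchange between the infotainment device (102) and the UAV's on-board control unit (201) described above; all class and field names are hypothetical assumptions, not drawn from the specification.

    # Illustrative sketch only: a minimal model of the control/data exchange
    # between the infotainment transceiver (202a) and the UAV control unit (201).
    from dataclasses import dataclass

    @dataclass
    class VehicleState:          # data sent from the automobile (101)
        lat: float               # GPS latitude of the vehicle
        lon: float               # GPS longitude of the vehicle
        speed_kmh: float         # current vehicle speed
        heading_deg: float       # direction of travel

    @dataclass
    class UavCommand:            # control signal sent to the UAV (103)
        mode: str                # e.g. "follow_me" or "look_ahead"
        camera_pitch_deg: float  # requested camera angle
        altitude_m: float        # requested flight height

    class UavControlUnit:
        """Stands in for the on-board control unit (201): receives commands
        over the wireless link and updates the UAV's set-points."""
        def receive(self, state: VehicleState, cmd: UavCommand) -> None:
            print(f"UAV tracking vehicle at ({state.lat:.5f}, {state.lon:.5f}) "
                  f"in {cmd.mode} mode at {cmd.altitude_m} m")

    # The infotainment device (102) relays vehicle state and driver inputs:
    uav = UavControlUnit()
    uav.receive(VehicleState(18.93, 72.83, 45.0, 90.0),
                UavCommand("follow_me", camera_pitch_deg=-30.0, altitude_m=25.0))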
Further, the communication of control and data signals from the moving automobile on the
ground to the flying unmanned aerial vehicle (UAV) involves sending the accurate vehicle
position to a more powerful GPS device fitted to the UAV, for better alignment of GPS
coordinates in “follow me” or “look ahead” mode.
It will be well understood by a person skilled in the art that integrated vehicle and UAV
systems would allow an aerial object (a UAV) to be controlled autonomously by a moving
object on the ground (an automobile).
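As a sketch of how a “look ahead” set-point might be derived from the vehicle's GPS fix and heading, consider the following; the offset model and names are illustrative assumptions, not the specification's method.

    # Illustrative sketch only: projecting a "look ahead" GPS target for the UAV
    # from the vehicle's position and heading.
    import math

    EARTH_RADIUS_M = 6_371_000.0

    def look_ahead_target(lat: float, lon: float, heading_deg: float,
                          ahead_m: float = 100.0) -> tuple[float, float]:
        """Project a point `ahead_m` metres in front of the vehicle along its
        heading, giving the UAV a GPS coordinate to fly toward."""
        heading = math.radians(heading_deg)
        dlat = (ahead_m * math.cos(heading)) / EARTH_RADIUS_M
        dlon = (ahead_m * math.sin(heading)) / (
            EARTH_RADIUS_M * math.cos(math.radians(lat)))
        return lat + math.degrees(dlat), lon + math.degrees(dlon)

    # 100 m ahead of a vehicle heading due east:
    print(look_ahead_target(18.93000, 72.83000, 90.0))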
In another embodiment, the UAV control unit may also communicate with one or more
sensors, which are either associated with the UAV or communicate with the UAV. UAV
sensors may include one or more sensors that determine or provide information about the
environment in which the vehicle is operating. Examples of information provided by the
sensors, and that may be used by the UAV control unit, include natural environment data,
terrain data, weather tracking data, traffic data, user health tracking data, or other types of
data, which may provide environmental or other data to the UAV control unit. The UAV
control unit may also perform signal processing of signals received from one or more
sensors. Such signal processing may include estimation of a measured parameter from a
single sensor, such as multiple measurements of a range state parameter. Signal processing
of such sensor measurements may comprise stochastic signal processing, adaptive signal
processing, and/or other signal processing techniques known to those skilled in the art.
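As one concrete, if simplified, instance of the stochastic signal processing mentioned above, the sketch below applies a one-dimensional Kalman-style filter to noisy range measurements from a single sensor; the noise parameters are assumed values for illustration.

    # Illustrative sketch only: smoothing repeated range measurements (metres)
    # with a one-dimensional Kalman-style filter.
    def filter_range(measurements, q=0.01, r=4.0):
        """Return smoothed estimates of a slowly varying range."""
        x, p = measurements[0], 1.0   # initial state estimate and variance
        estimates = []
        for z in measurements:
            p += q                    # predict: state drifts with variance q
            k = p / (p + r)           # Kalman gain from measurement noise r
            x += k * (z - x)          # update estimate toward the measurement
            p *= (1 - k)              # shrink the estimate variance
            estimates.append(x)
        return estimates

    print(filter_range([52.1, 49.7, 51.3, 50.2, 48.9]))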
Further, the sensors operate over an area that can be detected by those sensors associated
with the UAV. Although the sensor range is shown as fixed and continuous, it may be
dynamic and/or discontinuous. For example, a camera may provide real-time images of the
surrounding area, or an environmental sensor may report environmental conditions (e.g.,
rain, fog, clear, etc.). Thus, the environment may have an area that includes all areas within
the sensor range. The area may include locations of travel that the vehicle may proceed to in
the future.
The integrated vehicle and UAV systems would allow vehicular dynamic data to be shared
with the on-board control unit of the UAV through the infotainment and navigation device. In
another embodiment, as the vehicle moves off from a stop, the on-board control unit of the
UAV is automatically signalled to take off. As the vehicle's speed increases, the on-board
control unit can adjust the speed of the flying UAV, and when the vehicle stops and the
engine is turned off, the on-board control unit can signal the flying UAV to land on the
stationary vehicle. This would also allow for the UAV taking off from, and landing on, a
moving automobile.
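A minimal sketch of the take-off/landing logic described in this embodiment follows; the function name and branch conditions are assumptions for illustration.

    # Illustrative sketch only: mapping vehicle dynamics to UAV commands,
    # per the embodiment above.
    def uav_action(vehicle_speed_kmh: float, engine_on: bool,
                   uav_airborne: bool) -> str:
        if not engine_on and uav_airborne:
            return "land_on_vehicle"          # vehicle stopped, engine off
        if engine_on and vehicle_speed_kmh > 0 and not uav_airborne:
            return "take_off"                 # vehicle starts moving
        if uav_airborne:
            # match the UAV's ground speed to the vehicle's speed
            return f"match_speed:{vehicle_speed_kmh:.0f}kmh"
        return "remain_docked"

    print(uav_action(0.0, True, False))   # remain_docked
    print(uav_action(20.0, True, False))  # take_off
    print(uav_action(60.0, True, True))   # match_speed:60kmh
    print(uav_action(0.0, False, True))   # land_on_vehicle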
In another embodiment, the infotainment and navigation device is capable of capturing other
navigation information by other like means and can transfer that data to the UAV via an RF
communication link.
Figure 3 illustrates a method for providing augmented reality in accordance with the present
invention. In step (301), external real-time data is obtained using an Unmanned Aerial
Vehicle (UAV). In step (302), said data is provided to an infotainment and navigation device
of the automobile. In step (303), composite images or videos are created and displayed by
combining said external real-time data with internal data.
Further, said creating of composite images or videos, in the infotainment and navigation
device, is by superimposing or overlaying video or images, as obtained in real time from the
unmanned aerial vehicle with GPS data available from an internal navigation device.
In an exemplary embodiment, the augmentation would be to superimpose or overlay live
video or composite images transmitted from the UAV onto a 3D navigation system.
According to the driver's requirement, the view angle and height of the 3D view of the
streets/buildings shown in the navigation are determined; these are then translated into
adjustments of the viewing angle and height of the UAV, so that real-time video is captured
and transmitted at the same rate as the displacement of the map on the screen and the video
overlay on the 3D-view map remains properly aligned.
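The following sketch shows one plausible translation from the selected 3D map view to a UAV altitude and camera tilt; the geometry and names are assumptions rather than the specification's algorithm.

    # Illustrative sketch only: deriving UAV set-points from the 3D map view the
    # driver selects, so the aerial video lines up with the rendered 3D view.
    import math

    def view_to_uav(view_height_m: float, view_tilt_deg: float) -> dict:
        """view_height_m : virtual camera height above ground in the 3D map
        view_tilt_deg : virtual camera tilt below the horizon (0 = level,
        90 = straight down)."""
        tilt = max(view_tilt_deg, 1.0)  # guard against a level (0 deg) view
        # Fly the UAV at the same height as the virtual camera and tilt its
        # gimbal to match, so the live video matches the map perspective.
        ground_distance = view_height_m / math.tan(math.radians(tilt))
        return {
            "altitude_m": view_height_m,
            "gimbal_tilt_deg": -tilt,               # negative = pointing down
            "standoff_ahead_m": ground_distance,    # how far ahead to look
        }

    # A bird's-eye 3D view from 80 m, tilted 60 degrees toward the ground:
    print(view_to_uav(80.0, 60.0))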
The term “bus” and variations thereof, as used herein, can refer to a subsystem that transfers
information and/or data between various components. A bus generally refers to the collection
of communication hardware interfaces, interconnects, bus architectures, standards, and/or
protocols defining the communication scheme for a communication system and/or
communication network. A bus may also refer to a part of communication hardware that
interfaces the communication hardware with the interconnects that connect to other
components of the corresponding communication network. The bus may be for a wired
network, such as a physical bus, or a wireless network, such as part of an antenna or
hardware that couples the communication hardware with the antenna. A bus architecture
supports a defined format in which information and/or data is arranged when sent and
received through a communication network. A protocol may define the format and rules of
communication of a bus architecture.
The term “communication unit” or “transceiver” and variations thereof, as used herein, can
refer to a collection of communication components capable of one or more of transmitting,
relaying, interconnecting, controlling, or otherwise manipulating information or data from at
least one transmitter to at least one receiver. As such, the communication may include a
range of systems supporting point-to-point transmission or broadcasting of the information
or data. A communication system may refer to the collection of individual communication
hardware as well as the interconnects associated with and connecting the individual
communication hardware. Communication hardware may refer to dedicated communication
hardware or may refer to a processor coupled with a communication means (i.e., an antenna)
and running software capable of using the communication means to send and/or receive a
signal within the communication system. An interconnect refers to some type of wired or
wireless communication link that connects various components, such as communication
hardware, within a communication system. A communication network may refer to a specific
setup of a communication system with the collection of individual communication hardware
and interconnects having some definable network topology. A communication network may
include a wired and/or wireless network having a pre-set or an ad hoc network structure.
The terms “dash” and “dashboard” and variations thereof, as used herein, may be used
interchangeably and can refer to any panel and/or area of a vehicle disposed adjacent to an
operator, user, and/or passenger. Dashboards may include, but are not limited to, one or more
control panel(s), instrument housing(s), head unit(s), indicator(s), gauge(s), meter(s), light(s),
audio equipment, computer(s), screen(s), display(s), HUD unit(s), and graphical user
interface(s).
The term “display” refers to a portion of a physical screen used to display output to a user,
such as a console display, an instrument cluster, or a heads-up display.
The term “GPS” refers to the Global Positioning System, in which satellite-based radio
systems provide three-dimensional position and time information from which suitably
equipped receivers anywhere on or near the surface of the Earth compute their global
position, velocity and orientation (heading).
The terms “infotainment” and “infotainment device” may be used interchangeably and can
refer to the hardware/software products, data, content, information, and/or systems, which
can be built into or added to vehicles to enhance driver and/or passenger experience.
Infotainment may provide media and/or multimedia content. An example is information-based
media content or programming that also includes entertainment content.
The term “screen,” “touch screen,” “touchscreen,” or “touch-sensitive display” refers to a
physical structure that enables the user to interact with the computer by touching areas on the
screen and provides information to a user through a display. The touch screen may sense user
contact in a number of different ways, such as by a change in an electrical parameter (e.g.,
resistance or capacitance), acoustic wave variations, infrared radiation proximity detection,
light variation detection, and the like. In a resistive touch screen, for example, normally
separated conductive and resistive metallic layers in the screen pass an electrical current.
When a user touches the screen, the two layers make contact in the contacted location,
whereby a change in electrical field is noted and the coordinates of the contacted location
calculated. In a capacitive touch screen, a capacitive layer stores electrical charge, which is
discharged to the user upon contact with the touch screen, causing a decrease in the charge of
the capacitive layer. The decrease is measured, and the contacted location coordinates
determined. In a surface acoustic wave touch screen, an acoustic wave is transmitted through
the screen, and the acoustic wave is disturbed by user contact. A receiving transducer detects
the user contact instance and determines the contacted location coordinates.
The terms “vehicle,” “car,” “automobile,” and variations thereof may be used
interchangeably herein and can refer to a device or structure for transporting animate and/or
inanimate or tangible objects (e.g., persons and/or things), such as a self-propelled
conveyance. A vehicle as used herein can include any conveyance or model of a conveyance,
where the conveyance was originally designed for the purpose of moving one or more
tangible objects, such as people, animals, cargo, and the like. The term “vehicle” does not
require that a conveyance moves or is capable of movement. Typical vehicles may include
but are in no way limited to cars, trucks, motorcycles, busses, automobiles, trains, railed
conveyances, boats, ships, marine conveyances, submarine conveyances, airplanes, space
craft, flying machines, human-powered conveyances, and the like.
According to an embodiment of the present invention, the user is given a choice to
experience “hyper-reality” by engaging functionalities to view images/videos transmitted from
a camera mounted on the flying UAV above the vehicle. The driver also has the option of
controlling the manoeuvres of the UAV as well as the movements of the camera and other
sensors mounted on the drone. The video/image will be shown on any of the displays. Other
applications include enhancing the driver's experience utilizing images/videos captured for
surveillance, traffic management or other like purposes.
In another embodiment, on-board cameras also provide a 360° or surround-view system,
which gives a perspective of nearby spaces around the vehicle. This view can be augmented
with the view from the UAV hovering above the vehicle, thereby broadening the driver's
view/perspective of the vehicle.
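A minimal sketch of such view augmentation follows, assuming both feeds are already scaled to the same frame size; the blend weight and use of NumPy are illustrative choices, and a real system would also georegister the two views.

    # Illustrative sketch only: alpha-blending an aerial frame from the UAV
    # over a surround-view frame to broaden the displayed perspective.
    import numpy as np

    def augment_surround_view(surround: np.ndarray, aerial: np.ndarray,
                              alpha: float = 0.4) -> np.ndarray:
        """Blend the UAV's aerial frame into the surround view (same HxWx3 shape)."""
        return ((1 - alpha) * surround + alpha * aerial).astype(np.uint8)

    surround = np.zeros((480, 640, 3), dtype=np.uint8)    # placeholder frames
    aerial = np.full((480, 640, 3), 128, dtype=np.uint8)
    print(augment_surround_view(surround, aerial).shape)  # (480, 640, 3)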
In an advantageous embodiment, the present invention provides a new driving experience of
“hyper-reality” by feeding an aerial view or extended vision of the scene ahead with the help
of a drone/UAV.
The present invention is applicable to all types of on-chip and off-chip memories used in
digital electronic circuitry, or in computer hardware, firmware, software, or in combinations
thereof. Apparatus of the invention can be implemented in a computer program product
tangibly embodied in a machine-readable storage device for execution by a programmable
processor; and method actions can be performed by a programmable processor executing a
program of instructions to perform functions of the invention by operating on input data and
generating output. The invention can be implemented advantageously on a programmable
system including at least one input device and at least one output device. Each computer
program can be implemented in a high-level procedural or object-oriented programming
language, or in assembly or machine language if desired; and in any case, the language can
be a compiled or interpreted language.
Suitable processors include, by way of example, both general and special-purpose
microprocessors. Generally, a processor will receive instructions and data from a read-only
memory and/or a random access memory. Generally, a computer will include one or more
mass storage devices for storing data files; such devices include magnetic disks and cards,
such as internal hard disks and removable disks and cards; magneto-optical disks; and optical
disks. Storage devices suitable for tangibly embodying computer program instructions and
data include all forms of volatile and non-volatile memory, including by way of example
semiconductor memory devices, such as EPROM, EEPROM, and flash memory devices;
magnetic disks such as internal hard disks and removable disks; magneto-optical disks;
CD-ROM and DVD-ROM disks; and buffer circuits such as latches and/or flip-flops. Any of
the foregoing can be supplemented by, or incorporated in, ASICs (application-specific
integrated circuits), FPGAs (field-programmable gate arrays) and/or DSPs (digital signal
processors).
It will be apparent to those having ordinary skill in this art that various modifications and
variations may be made to the embodiments disclosed herein, consistent with the present
invention, without departing from the spirit and scope of the present invention. Other
embodiments consistent with the present invention will become apparent from consideration
of the specification and practice of the invention disclosed herein.
We claim:
1. A system (100) for providing augmented reality, said system comprising:
an automobile (101) comprising an infotainment and navigation device (102); and
an unmanned aerial vehicle (103), operatively coupled to said automobile, to provide
real-time data exterior to the automobile;
said infotainment and navigation device (102), operable to create and display real-time
composite images or videos, by combining at least part of said real-time data
obtained from the unmanned aerial vehicle with data available internally.
2. The system as claimed in claim 1, wherein said unmanned aerial vehicle comprises a
plurality of sensors to obtain said real-time data.
3. The system as claimed in claim 1, wherein said automobile comprises a plurality of
control elements, operable to control the unmanned aerial vehicle and the
infotainment and navigation device.
4. The system as claimed in claim 3, wherein said control elements are located on a
steering wheel of the automobile.
5. The system as claimed in claim 1, wherein said composite images or videos are
created, in the infotainment and navigation device, by superimposing or overlaying
video or images as obtained in real time from the unmanned aerial vehicle with GPS
data available from an internal navigation device.
6. The system as claimed in claim 1, wherein said data is an image or video data of
surrounding natural environment, terrain, weather or traffic.
7. A method for providing augmented reality in an automobile, said method comprising:
obtaining, external real-time data using an Unmanned Aerial Vehicle (UAV);
providing said data to an infotainment and navigation device of the automobile; and
creating and displaying, composite images or videos by combining said external
real-time data with internal data.
8. The method as claimed in claim 7, wherein the method comprises obtaining data from
a plurality of sensors disposed on the unmanned aerial vehicle.
9. The method as claimed in claim 7, wherein the method comprises controlling the
unmanned aerial vehicle and the infotainment and navigation device using a plurality
of control elements.
10. The method as claimed in claim 7, wherein said creating of composite images or
videos, in the infotainment and navigation device, is by superimposing or overlaying
video or images, as obtained in real time from the unmanned aerial vehicle with GPS
data available from an internal navigation device.