Abstract: A wearable device for providing real-time navigation assistance to a rider is disclosed. The wearable device comprises a Global Positioning System (GPS) sensor, a projector device, a 2D camera, and a data processing platform. The data processing platform is configured to determine a path from a set of paths that is followed by the rider. Further, the data processing platform is configured to fetch a set of adverse road conditions, associated with the path followed by the rider, from a remote database. Furthermore, the data processing platform is configured to execute programmed instructions stored in the memory to generate at least one of an audio alert or a visual alert corresponding to an adverse road condition.
CROSS-REFERENCE TO RELATED APPLICATIONS AND PRIORITY
[001] The present application does not claim priority from any patent application.
TECHNICAL FIELD
[002] The present disclosure relates in general to the field of vehicle navigation. More particularly, the present disclosure relates to a device and method for providing real-time navigation assistance.
BACKGROUND
[003] In most developing countries, road fatalities are increasing at an alarming rate. One of the major reasons for road fatalities is that motorists are not aware of upcoming adverse road conditions while driving. As per current statistics, many fatalities could be reduced or eliminated if motorists were aware of upcoming adverse road conditions while riding a two-wheeler.
[004] Many road accidents, especially those involving two-wheelers, can be prevented if the motorist is made aware of adverse road conditions such as potholes, sharp turns, high bumps, or speed breakers, either visually or aurally. In most road accident cases the motorist has very little time to safely circumnavigate an adverse road condition once such a road condition is actually reached. In other words, the motorist can take appropriate action by reducing speed or changing the vehicle direction if he/she is made aware of the impending adverse condition well in advance. In many cases, especially at night or on a foggy day, the motorist is not able to clearly see an approaching adverse road condition and hence is unable to take appropriate action, which may result in a major accident.
SUMMARY
[005] This summary is provided to introduce aspects related to device and method for providing real-time navigation assistance to a rider, further described below in the detailed description. This summary is not intended to identify essential features of the claimed subject matter nor is it intended for use in determining or limiting the scope of the claimed subject matter.
[006] In one implementation, a wearable device for providing real-time navigation assistance to a rider is disclosed. The wearable device comprises a Global Positioning System (GPS) sensor, a projector device, a 2D camera, and a data processing platform. The data processing platform comprises a memory and a processor, wherein the processor is configured to execute programmed instructions stored in the memory. In one embodiment, the processor is configured to execute programmed instructions stored in the memory to determine a path from a set of paths, in a geographical location, that is followed by the rider. The path may be determined based on a current location of the rider, wherein the current location is determined by a GPS sensor in the wearable device. Further, the processor is configured to execute programmed instructions stored in the memory to fetch a set of adverse road conditions, associated with the path followed by the rider, from a remote database, wherein each adverse road condition from the set of adverse road conditions is associated with a checkpoint from a set of checkpoints associated with the path followed by the rider. Furthermore, the processor is configured to execute programmed instructions stored in the memory to generate at least one of an audio alert or a visual alert corresponding to an adverse road condition based on an upcoming checkpoint in the path followed by the rider and the current location of the rider. In one embodiment, the visual alert may be projected by a projection device on a windshield to superimpose the visual alerts over the upcoming checkpoint on the path followed by the rider. Further, the processor may be configured to determine a projection angle of the visual alert based on a height of the rider and a distance between the current location of the rider and the upcoming checkpoint.
[007] In one implementation, a method for providing real-time navigation assistance to a rider is disclosed. In the first step, a processor is configured to determine a path from a set of paths, in a geographical location, followed by the rider. The path may be determined based on a current location of the rider, wherein the current location is determined by a GPS sensor. In the next step, the processor is configured to fetch a set of adverse road conditions, associated with the path followed by the rider, from a remote database. In one embodiment, each adverse road condition from the set of adverse road conditions is associated with a checkpoint from a set of checkpoints associated with the path followed by the rider. Further, the processor is configured to generate at least one of an audio alert or a visual alert corresponding to an adverse road condition based on an upcoming checkpoint in the path followed by the rider and the current location of the rider. In one embodiment, the visual alert may be projected by a projection device on a windshield to superimpose the visual alerts over
the upcoming checkpoint on the path followed by the rider, and wherein a projection angle of the visual alert is determined based on height of the rider and a distance between the current location of the rider and the upcoming checkpoint.
[008] A non-transitory computer readable medium embodying a program executable in a computing device for providing real-time navigation assistance to a rider is disclosed. The computer program product comprises a program code for determining a path from a set of paths, in a geographical location, followed by the rider. The path may be determined based on a current location of the rider, wherein the current location is determined by a GPS sensor. The computer program product comprises a program code for fetching a set of adverse road conditions, associated with the path followed by the rider, from a remote database, wherein each adverse road condition from the set of adverse road conditions is associated with a checkpoint from a set of checkpoints associated with the path followed by the rider. Further, the computer program product comprises a program code for generating at least one of an audio alert or a visual alert corresponding to an adverse road condition based on an upcoming checkpoint in the path followed by the rider and the current location of the rider. The visual alert may be projected by a projection device on a windshield to superimpose the visual alerts over the upcoming checkpoint on the path followed by the rider, and wherein a projection angle of the visual alert may be determined based on height of the rider and a distance between the current location of the rider and the upcoming checkpoint.
BRIEF DESCRIPTION OF DRAWINGS
[009] The detailed description is described with reference to the accompanying Figures. In the Figures, the left-most digit(s) of a reference number identifies the Figure in which the reference number first appears. The same numbers are used throughout the drawings to refer to like or similar features and components.
[010] Figure 1 illustrates a network implementation of a wearable device for providing real-time navigation assistance to a rider, in accordance with an embodiment of the present disclosure.
[011] Figure 2 illustrates the data processing platform present in the wearable device, in accordance with an embodiment of the present disclosure.
[012] Figure 3 illustrates a method for providing real-time navigation assistance to a rider, in accordance with an embodiment of the present disclosure.
DETAILED DESCRIPTION
[013] The present disclosure relates to a wearable device for providing real-time navigation assistance to a rider. The wearable device may be a helmet. The wearable device may comprise a Global Positioning System (GPS) sensor, a projector device, a camera, and a data processing platform. The data processing platform comprises a memory and a processor. The processor is configured to execute programmed instructions stored in the memory. In one embodiment, the processor may execute programmed instructions stored in the memory to determine a path from a set of paths, in a geographical location, that is followed by the rider. The path may be determined based on a current location of the rider, wherein the current location is determined by a GPS sensor in the wearable device.
[014] Further, the processor is configured to execute programmed instructions stored in the memory to fetch a set of adverse road conditions, associated with the path followed by the rider, from a remote database, wherein each adverse road condition from the set of adverse road conditions corresponds to a checkpoint from a set of checkpoints associated with the path followed by the rider. Furthermore, the processor is configured to execute programmed instructions stored in the memory to generate at least one of an audio alert or a visual alert corresponding to an adverse road condition based on an upcoming checkpoint in the path followed by the rider, the current location of the rider, as well as the direction in which the rider is traveling. In one embodiment, the visual alert may be projected by a projection device on a windshield to superimpose the visual alerts over the upcoming checkpoint on the path followed by the rider. Further, the processor may be configured to determine a projection angle of the visual alert based on a height of the rider and a distance between the current location of the rider and the upcoming checkpoint, such that the visual alert is superimposed on the adverse road condition of the upcoming checkpoint. The process of providing real-time navigation assistance is further elaborated with reference to figure 1.
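By way of a non-limiting illustration, the projection-angle determination described above may be sketched as follows. The disclosure does not specify the exact geometry, so a flat-road assumption is made here: the alert is projected at the angle below the horizontal given by the arctangent of the device height over the distance to the upcoming checkpoint. The function name and units are illustrative only.

```python
import math

def projection_angle(device_height_m: float, distance_m: float) -> float:
    """Angle (degrees below the horizontal) at which to project the visual
    alert so that it appears superimposed on the adverse road condition at
    the upcoming checkpoint, assuming a flat road surface."""
    if distance_m <= 0:
        raise ValueError("distance to the upcoming checkpoint must be positive")
    # Line of sight from the device (device_height_m above the ground) to a
    # ground point distance_m ahead makes an angle atan(height / distance).
    return math.degrees(math.atan2(device_height_m, distance_m))
```

For example, under this assumption a device 1.5 m above the road and 15 m from a checkpoint would project at roughly 5.7 degrees below the horizontal.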
[015] Referring now to Figure 1, a network implementation 100 of a wearable device 102 for providing real-time navigation assistance to a rider is illustrated, in accordance with an embodiment of the present disclosure. The wearable device 102 comprises a set of peripheral
devices comprising a Global Positioning System (GPS) sensor 108, a tilt sensor 110, an infrared sensor 112, a microphone 114, a projection device 116, a 2D camera 118, a 3D camera 120, and an earphone 122. The peripheral devices are connected to a data processing platform through wired or wireless connection. The peripheral devices and the data processing platform are powered by a battery fitted inside the wearable device 102. Further, the data processing platform 124 is configured to communicate with a backend server 126 through a network 106, wherein the backend server 126 is enabled with a remote database 128. Although the present disclosure is explained by considering that the data processing platform 124 is implemented as a software program over an embedded system present inside the wearable device 102 such as a helmet, it may be understood that the embedded system may be installed in a vehicle of the rider and can communicate with the peripheral devices through communication channels such as Bluetooth, Wi-Fi, NFC, Infrared communication and the like. It will be understood that the data processing platform 124 may be accessed by multiple users such as traffic authority, traffic police, or highway police through one or more user devices 104-1, 104-2…104-N, collectively referred to as user devices 104 hereinafter, or applications residing on the user devices 104. Examples of the user devices 104 may include, but are not limited to, a portable computer, a personal digital assistant, a hand-held device, and a workstation. The user devices 104 are communicatively coupled to the data processing platform 124 through a network 106.
[016] In one implementation, the network 106 may be a wireless network, a wired network or a combination thereof. The network 106 can be implemented as one of the different types of networks, such as an intranet, local area network (LAN), wide area network (WAN), the internet, and the like. The network 106 may either be a dedicated network or a shared network. The shared network represents an association of the different types of networks that use a variety of protocols, for example, Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), Wireless Application Protocol (WAP), and the like, to communicate with one another. Further, the network 106 may include a variety of network devices, including routers, bridges, servers, computing devices, storage devices, and the like. The recording mode and the navigation mode enabled by the data processing platform 124 of the wearable device 102 are further elaborated with reference to figure 2.
[017] Referring now to Figure 2, the data processing platform 124 is illustrated in accordance with an embodiment of the present disclosure. In one embodiment, the data
processing platform 124 may include at least one processor 202, an input/output (I/O) interface 204, and a memory 206. The at least one processor 202 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the at least one processor 202 is configured to fetch and execute computer-readable instructions stored in the memory 206.
[018] The I/O interface 204 may include a variety of software and hardware interfaces, for example, a web interface, a graphical user interface, and the like. The I/O interface 204 may allow the data processing platform 124 to interact with a user directly or through the user devices 104. Further, the I/O interface 204 may enable the data processing platform 124 to communicate with other computing devices, such as the backend server 126, web servers and external data servers (not shown). The I/O interface 204 may facilitate multiple communications within a wide variety of networks and protocol types, including wired networks, for example, LAN, cable, etc., and wireless networks, such as WLAN, cellular, or satellite. The I/O interface 204 may include one or more ports for connecting a number of devices to one another or to another server.
[019] The memory 206 may include any computer-readable medium known in the art including, for example, volatile memory, such as static random access memory (SRAM) and dynamic random access memory (DRAM), and/or non-volatile memory, such as read only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes. The memory 206 may include modules 208 and system data 222.
[020] The modules 208 include routines, programs, objects, components, data structures, etc., which perform particular tasks or implement particular abstract data types. In one implementation, the modules 208 may include a mode selection module 210, a peripheral device communication module 212, a backend database communication module 214, an adverse road condition detection module 216, an alert generation module 218, and other modules 220. The other modules 220 may include programs or coded instructions that supplement applications and functions of the data processing platform 124.
[021] The system data 222, amongst other things, serves as a repository for storing data processed, received, and generated by one or more of the modules 208. The system data 222 may also include a database 224 and other data 226. The other data 226 may include data
generated as a result of the execution of one or more modules in the other modules 220. Further, the database 224 is configured to store navigation information such as maps, information of roads and the like.
[022] In one embodiment, the mode selection module 210 in the data processing platform 124 is configured to enable selection between the recording mode and the navigation mode. In the recording mode, all the adverse road conditions observed while traversing the geographical location/area are recorded. The process of real-time navigation assistance starts by recording the adverse road conditions using the 2D camera 118 mounted over the wearable device 102. Once the recording mode is activated, the peripheral device communication module 212 is configured to communicate with the 2D camera 118 and receive images of the adverse road conditions on a path followed by the authorized rider from the 2D camera 118.
[023] In one embodiment, the peripheral device communication module 212 is configured to automatically detect the adverse road conditions by performing image processing on the images captured by the 2D camera 118 in order to identify adverse road conditions on the path. As soon as an adverse road condition such as a pothole, sharp turn, high bump, oil spill or speed breaker is identified by the data processing platform 124, a record of the adverse road condition is generated against the path followed by the authorized rider. The record stores a checkpoint and a type of the adverse road condition associated with the checkpoint. The checkpoint corresponds to the exact GPS location of the adverse road condition given by the GPS sensor 108 at the location of the adverse road condition. The record also stores a visual alert as well as an audio alert corresponding to the type of the adverse road condition. In a similar manner, all the adverse road conditions are recorded on the path being traversed by the authorized rider, as well as on other paths in the geographical area, by the peripheral device communication module 212. Once all the adverse road conditions are recorded, the peripheral device communication module 212 is also configured to frequently update the remote database 128 with the adverse road conditions by adding newly discovered adverse road conditions and deleting the adverse road conditions which have already been repaired.
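The record described in this paragraph may, purely for illustration, be represented by a structure of the following kind; the field names and condition-type labels are assumptions for the sketch, not part of the disclosure:

```python
from dataclasses import dataclass

# Condition types named in the disclosure (the labels are illustrative).
CONDITION_TYPES = {"pothole", "sharp_turn", "high_bump", "oil_spill", "speed_breaker"}

@dataclass
class AdverseConditionRecord:
    """One record in the remote database: the checkpoint (the exact GPS
    location of the adverse road condition), the condition type, and the
    visual and audio alerts associated with that type."""
    path_id: str          # path on which the condition was observed
    latitude: float       # checkpoint GPS coordinates from the GPS sensor
    longitude: float
    condition_type: str   # one of CONDITION_TYPES
    visual_alert: str     # identifier of the annotation/icon to project
    audio_alert: str      # identifier of the audio clip to play

def make_record(path_id: str, lat: float, lon: float, condition_type: str,
                visual_alert: str, audio_alert: str) -> AdverseConditionRecord:
    """Validate the condition type and build a database record."""
    if condition_type not in CONDITION_TYPES:
        raise ValueError(f"unknown condition type: {condition_type}")
    return AdverseConditionRecord(path_id, lat, lon, condition_type,
                                  visual_alert, audio_alert)
```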
[024] In one embodiment, the adverse road conditions may be manually recorded by the authorized rider using the 2D camera 118 and the peripheral device communication module 212. For this purpose, the authorized rider may stop at each of the adverse road conditions and record an audio message using the microphone 114 as well as make an appropriate
gesture in front of the 3D camera 120 in order to record the adverse road condition as well as the type of the adverse road condition. The gesture and audio message may be different for different adverse road conditions. Both the audio message and the gesture may be recorded by the peripheral device communication module 212. In one embodiment, the gestures and audio messages are converted into control commands by the peripheral device communication module 212 before being sent to the remote database 128 for generation of the record. In one embodiment, metadata corresponding to the wearable device 102 at the time of recording the adverse road condition is also recorded by the peripheral device communication module 212. The metadata comprises a wearable device orientation captured by the tilt sensor 110, GPS location coordinates captured by the GPS sensor 108, and a height from the ground captured by the infrared sensor 112, and is recorded in the remote database 128 by the backend database communication module 214. The function of each of the peripheral devices for recording the metadata and adverse road conditions is as follows:
- GPS Sensor 108: The GPS sensor 108 is configured to indicate the GPS location coordinates of the adverse road condition both during the recording mode as well as in the navigation mode. The GPS location coordinates are saved along with other information about the adverse road condition in the remote database 128.
- Microphone 114: The microphone 114 is configured to record an audio alert once an adverse road condition is detected by the designated rider while manually recording the adverse road conditions. The audio file generated by the microphone 114 is processed by the peripheral device communication module 212 for generating a control command, which is then sent to the remote database 128 for generating a record corresponding to the adverse road condition. The conversion from audio format to control command saves the bandwidth used while communicating with the remote database 128.
- Tilt Sensor 110: The tilt sensor 110 is configured to measure the orientation of the wearable device 102 at the time of recording the adverse road conditions. The tilt sensor 110 enables recording metadata associated with the orientation of the wearable device 102.
- Infrared sensor 112: The Infrared sensor 112 is configured to measure the height of the wearable device from the ground while recording the adverse road conditions as well as at the time of navigation. The purpose of the tilt sensor 110 and the infrared sensor 112 is to generate metadata that can be used for projecting the visual alerts on the windshield such that the visual alerts/ annotations superimpose the actual adverse road conditions in the path.
- 3D camera 120: The purpose of the 3D camera 120 is to record adverse road conditions based on the hand gestures performed by the authorized rider during the recording mode, once an adverse road condition is encountered. The gestures indicate the type of the adverse road condition and also its relative location within the current view frame.
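By way of a non-limiting example, the conversion of gestures and audio messages into compact control commands (described in paragraph [024]) may be sketched as follows; the gesture names, keywords, and command strings are purely hypothetical, as the disclosure does not define them:

```python
from typing import Optional

# Hypothetical gesture and keyword vocabularies; the disclosure does not
# specify the actual gestures, so these mappings are illustrative only.
GESTURE_TO_COMMAND = {
    "point_down": "ADD pothole",
    "swipe_left": "ADD sharp_turn",
    "raised_fist": "ADD high_bump",
}

KEYWORD_TO_COMMAND = {
    "pothole": "ADD pothole",
    "oil spill": "ADD oil_spill",
    "speed breaker": "ADD speed_breaker",
}

def to_control_command(gesture: Optional[str],
                       transcript: Optional[str]) -> Optional[str]:
    """Convert a recognized hand gesture (3D camera 120) or an audio
    transcript (microphone 114) into a compact control command suitable
    for sending to the remote database; the gesture takes precedence
    when both inputs are available."""
    if gesture in GESTURE_TO_COMMAND:
        return GESTURE_TO_COMMAND[gesture]
    if transcript:
        lowered = transcript.lower()
        for keyword, command in KEYWORD_TO_COMMAND.items():
            if keyword in lowered:
                return command
    return None
```

Sending short command strings rather than raw audio or video is consistent with the bandwidth-saving rationale stated for the microphone 114 above.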
[025] In one embodiment, in the navigation mode, the adverse road condition detection module 216 is configured to monitor the path followed by the rider and fetch a set of adverse road conditions, associated with the path followed by the rider, from the remote database 128. In one embodiment, each adverse road condition from the set of adverse road conditions is associated with a checkpoint from a set of checkpoints associated with the path followed by the rider. Once the set of adverse road conditions is fetched from the remote database 128, in the next step, the alert generation module 218 is configured to generate at least one of an audio alert or a visual alert corresponding to an adverse road condition, based on an upcoming checkpoint in the path followed by the rider and the current location of the rider. The visual alert may be projected by a projection device on a windshield to superimpose the visual alerts over the upcoming checkpoint on the path followed by the rider. In one embodiment, a projection angle of the visual alert is determined by the alert generation module 218 based on the height of the wearable device 102 sensed by the infrared sensor 112 and a distance, determined by the GPS sensor 108, between the current location of the rider and the upcoming checkpoint. In one embodiment, the rider is assisted with real-time audio as well as visual alerts as soon as any checkpoint corresponding to an adverse road condition is approached by the rider. In another embodiment, the alert generation module 218 may be configured to display the visual alert on a small LCD display in the wearable device 102. Further, the alert generation module 218 may also be configured to generate audio alerts in the form of audio notifications as and when the adverse condition at the upcoming checkpoint is encountered.
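As a non-limiting sketch of the navigation-mode check, the distance between the rider's current GPS fix and each checkpoint can be computed with the standard haversine formula, and alerts raised for checkpoints within an assumed alert radius (the 100-metre default here is illustrative, not taken from the disclosure):

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in metres

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes (degrees)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def upcoming_alerts(rider_fix, checkpoints, alert_radius_m=100.0):
    """Return the checkpoints (dicts with 'lat' and 'lon' keys) within
    alert_radius_m of the rider's current (lat, lon) fix, i.e. the adverse
    road conditions for which alerts should now be generated."""
    lat, lon = rider_fix
    return [cp for cp in checkpoints
            if haversine_m(lat, lon, cp["lat"], cp["lon"]) <= alert_radius_m]
```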
The peripheral devices used at the time of navigation mode include the projection device 116, the 2D camera 118, the earphone 122, and a small LCD display mounted inside the wearable device. The function of these peripheral devices is as follows:
- Projection device 116: The projection device 116 is configured to project a visual alert corresponding to an upcoming checkpoint on the path being followed by the rider. The projection may be made over the windshield of the wearable device 102.
- 2D camera 118 and small LCD display: In case the wearable device 102 is not enabled with the projection device 116, the alert generation module 218 is configured to capture
the real-time video of the path followed by the rider and display the path as well as the visual alerts on the LCD display in the wearable device 102. The visual alerts/annotations are merged with/superimposed over the feed from the 2D camera 118 using the metadata associated with the upcoming checkpoint.
- Earphone 122: The earphone 122 receives an audio signal from the data processing platform 124 and is configured to play the audio alert as and when the rider approaches an adverse road condition.
[026] Further, the method for providing real-time navigation assistance to a rider is elaborated with reference to the flowchart of figure 3.
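For the LCD-display fallback described above, the row at which a visual alert is overlaid on the camera feed may be sketched, purely for illustration, with a simplified pinhole-camera model; the disclosure does not specify this computation, and the field-of-view and tilt inputs below stand in for the metadata provided by the tilt sensor 110 and infrared sensor 112:

```python
import math

def alert_row_px(frame_height_px: int, vertical_fov_deg: float,
                 camera_pitch_down_deg: float,
                 target_angle_below_horizon_deg: float) -> int:
    """Pixel row (0 = top of frame) at which to draw the visual alert so
    that it overlays the adverse road condition in the 2D camera feed.

    Simplified pinhole model: the optical axis (tilted camera_pitch_down_deg
    below the horizon) maps to the image centre, and vertical angles map to
    rows via the focal length implied by the vertical field of view.
    """
    focal_px = (frame_height_px / 2) / math.tan(math.radians(vertical_fov_deg / 2))
    rel = math.radians(target_angle_below_horizon_deg - camera_pitch_down_deg)
    row = frame_height_px / 2 + focal_px * math.tan(rel)
    # Clamp to the visible frame.
    return max(0, min(frame_height_px - 1, round(row)))
```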
[027] Referring now to figure 3, a method 300 for providing real-time navigation assistance to a rider is disclosed, in accordance with an embodiment of the present subject matter. The method 300 may be described in the general context of computer executable instructions. Generally, computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, functions, and the like, that perform particular functions or implement particular abstract data types. The method 300 may also be practiced in a distributed computing environment where functions are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, computer executable instructions may be located in both local and remote computer storage media, including memory storage devices.
[028] The order in which the method 300 is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method 300 or alternate methods. Additionally, individual blocks may be deleted from the method 300 without departing from the spirit and scope of the subject matter described herein. Furthermore, the method 300 can be implemented in any suitable hardware, software, firmware, or combination thereof. However, for ease of explanation, in the embodiments described below, the method 300 may be considered to be implemented in the above described wearable device 102.
[029] At step 302, the peripheral device communication module 212 is configured to determine a path from a set of paths followed by the rider. In one embodiment, the peripheral device communication module 212 may determine the path based on a current location of the rider, wherein the current location is determined by the GPS sensor 108.
[030] At step 304, the backend database communication module 214 is configured to fetch a set of adverse road conditions, associated with the path followed by the rider, from the remote database 128. Each adverse road condition from the set of adverse road conditions is associated with a checkpoint from a set of checkpoints associated with the path followed by the rider.
[031] At step 306, the alert generation module 218 is configured to generate at least one of an audio alert or a visual alert corresponding to an adverse road condition based on an upcoming checkpoint in the path followed by the rider and the current location of the rider. In one embodiment, the upcoming checkpoint in the path followed by the rider is determined by the adverse road condition detection module 216 based on the inputs received from the GPS sensor 108. The visual alert may be projected by a projection device 116 on a windshield to superimpose the visual alerts over the upcoming checkpoint on the path followed by the rider. In one embodiment, the projection angle of the visual alert is determined by the alert generation module 218 based on height of the rider and a distance between the current location of the rider and the upcoming checkpoint.
[032] Although the present disclosure relates to providing real-time navigation assistance to a rider, it is to be understood that the appended claims are not necessarily limited to the specific features or methods described herein. However, the specific features and methods are disclosed as examples of implementations for providing real-time navigation assistance to a rider.
WE CLAIM:
1. A wearable device for providing real-time navigation assistance to a rider, the wearable device comprising:
a memory; and
a processor coupled to the memory, wherein the processor is configured to execute programmed instructions stored in the memory to:
determine a path from a set of paths, in a geographical location, followed by the rider, wherein the path is determined based on a current location of the rider, wherein the current location is determined by a GPS sensor;
fetch a set of adverse road conditions, associated with the path followed by the rider, from a remote database, wherein each adverse road condition from the set of adverse road conditions is associated with a checkpoint from a set of checkpoints associated with the path followed by the rider; and
generate at least one of an audio alert or a visual alert corresponding to an adverse road condition based on an upcoming checkpoint in the path followed by the rider and the current location of the rider, wherein the visual alert is projected by a projection device on a windshield to superimpose the visual alerts over the upcoming checkpoint on the path followed by the rider, and wherein a projection angle of the visual alert is determined based on height of the rider and a distance between the current location of the rider and the upcoming checkpoint.
2. The wearable device of claim 1, wherein the remote database is configured to maintain a set of adverse road conditions and a set of checkpoints associated with each path from the set of paths in the geographical location, and wherein the adverse road conditions are recorded by traversing the set of paths and manually recording the adverse road condition at each interest point or automated by using a camera and
image processing for identifying the adverse road conditions at each interest point on each path.
3. The wearable device of claim 1, wherein the set of adverse road conditions comprises potholes, sharp turns, high bumps, oil spills and speed breakers.
4. The wearable device of claim 1, wherein the checkpoint corresponds to a physical location of an adverse road condition on the path.
5. A method for providing real-time navigation assistance to a rider, the method comprising steps of:
determining, by a processor, a path from a set of paths, in a geographical location, followed by the rider, wherein the path is determined based on a current location of the rider, wherein the current location is determined by a GPS sensor;
fetching, by the processor, a set of adverse road conditions, associated with the path followed by the rider, from a remote database, wherein each adverse road condition from the set of adverse road conditions is associated with a checkpoint from a set of checkpoints associated with the path followed by the rider; and
generating, by the processor, at least one of an audio alert or a visual alert corresponding to an adverse road condition based on an upcoming checkpoint in the path followed by the rider and the current location of the rider, wherein the visual alert is projected by a projection device on a windshield to superimpose the visual alerts over the upcoming checkpoint on the path followed by the rider, and wherein a projection angle of the visual alert is determined based on height of the rider and a distance between the current location of the rider and the upcoming checkpoint.
6. The method of claim 5, wherein the remote database is configured to maintain a set of adverse road conditions and a set of checkpoints associated with each path from the set of paths in the geographical location, and wherein the adverse road conditions are recorded by traversing the set of paths and manually recording the adverse road condition at each interest point or automated by using a camera and image processing for identifying the adverse road conditions at each interest point on each path.
7. The method of claim 5, wherein the set of adverse road conditions comprises potholes, sharp turns, high bumps, oil spills and speed breakers.
8. The method of claim 5, wherein the checkpoint corresponds to a physical location of an adverse road condition on the path.
9. A non-transitory computer readable medium embodying a program executable in a computing device for providing real-time navigation assistance to a rider, the computer program product comprising:
a program code for determining a path from a set of paths, in a geographical location, followed by the rider, wherein the path is determined based on a current location of the rider, wherein the current location is determined by a GPS sensor;
a program code for fetching a set of adverse road conditions, associated with the path followed by the rider, from a remote database, wherein each adverse road condition from the set of adverse road conditions is associated with a checkpoint from a set of checkpoints associated with the path followed by the rider; and
a program code for generating at least one of an audio alert or a visual alert corresponding to an adverse road condition based on an upcoming checkpoint in the path followed by the rider and the current location of the rider, wherein the visual alert is projected by a projection device on a windshield to superimpose the visual alerts over the upcoming checkpoint on the path followed by the rider, and wherein a projection angle of the visual alert is determined based on height of the rider and a distance between the current location of the rider and the upcoming checkpoint.