Abstract: The present disclosure relates to a vehicle collision avoidance system, and more specifically, to a fogminator, system and method for automatic reduction or retardation of a vehicle’s speed during fog. An aspect of the present disclosure relates to a system (200) for controlling a vehicle. The system (200) includes a sensor (202) and an engine control unit (ECU) (208). The sensor (202) is configured on the vehicle to obtain a distance between the sensor and an object. The engine control unit (ECU) (208) of the vehicle is communicably coupled with the sensor to retrieve the obtained distance between the sensor and the object and to compare a first value of the retrieved distance with a second value pre-stored in the ECU. The ECU (208) controls the vehicle based on an output of the comparison of the first value and the second value.
TECHNICAL FIELD
[0001] The present disclosure relates to a vehicle collision avoidance system, and more specifically, to a fogminator, system and method for automatic reduction or retardation of a vehicle’s speed upon obstacle detection during fog.
BACKGROUND
[0002] Background description includes information that may be useful in understanding the present invention. It is not an admission that any of the information provided herein is prior art or relevant to the presently claimed invention, or that any publication specifically or implicitly referenced is prior art.
[0003] Consider a driver driving a car on a foggy day who cannot see further than about 3 meters (about 10 feet). Fog and smog can reduce visibility to zero, making driving extremely dangerous. In India, over 10,000 lives were lost in fog-related road crashes in the year 2018. The global road safety body, the International Road Federation, urged the Indian government to make fog lights mandatory for all vehicles. However, use of fog lights can reduce crash risk by only about 30%, which is still insufficient. Whenever an obstacle appears suddenly in front of the vehicle during fog, the driver does not have sufficient time to apply the brakes, i.e., the available reaction time is minimal, which causes fog-related road crashes.
[0004] Existing technology diverts the path of a vehicle using a radar sensor and a stereo camera, which is very costly. In such systems, if any danger is detected, the driver is given a visual and audible warning; the vehicles are self-driven and can control the braking on their own. However, the problems to be solved by the present invention are that no device, system, or method available in the market reduces the speed of a vehicle when an object appears in front of it during foggy weather. Furthermore, the available systems, devices, and methods are expensive and require complex operating systems. Additionally, existing systems work only under normal conditions, i.e., when the driver applies the brakes in foggy conditions, on-coming traffic may collide with the standing vehicle.
[0005] Therefore, there is a need for an efficient, effective, and improved system and method for automatic reduction or retardation of a vehicle’s speed upon detection of an obstacle in any direction of the vehicle. Further, there is a need for an artificial-intelligence-powered system to reduce the speed of the vehicle when any object comes in front of the vehicle during foggy weather. Furthermore, there is a need for a vehicle collision avoidance system that assists the driver in controlling the automobile easily when another vehicle unintentionally comes in front during fog.
[0006] As used in the description herein and throughout the claims that follow, the meaning of “a,” “an,” and “the” includes plural reference unless the context clearly dictates otherwise. Also, as used in the description herein, the meaning of “in” includes “in” and “on” unless the context clearly dictates otherwise.
[0007] In some embodiments, the numerical parameters set forth in the written description and attached claims are approximations that can vary depending upon the desired properties sought to be obtained by a particular embodiment. In some embodiments, the numerical parameters should be construed in light of the number of reported significant digits and by applying ordinary rounding techniques. Notwithstanding that the numerical ranges and parameters setting forth the broad scope of some embodiments of the invention are approximations, the numerical values set forth in the specific examples are reported as precisely as practicable. The numerical values presented in some embodiments of the invention may contain certain errors necessarily resulting from the standard deviation found in their respective testing measurements.
[0008] The recitation of ranges of values herein is merely intended to serve as a shorthand method of referring individually to each separate value falling within the range. Unless otherwise indicated herein, each individual value is incorporated into the specification as if it were individually recited herein. All methods described herein can be performed in any suitable order unless otherwise indicated herein or otherwise clearly contradicted by context. The use of any and all examples, or exemplary language (e.g., “such as”) provided with respect to certain embodiments herein is intended merely to better illuminate the invention and does not pose a limitation on the scope of the invention otherwise claimed. No language in the specification should be construed as indicating any non-claimed element essential to the practice of the invention.
[0009] Groupings of alternative elements or embodiments of the invention disclosed herein are not to be construed as limitations. Each group member can be referred to and claimed individually or in any combination with other members of the group or other elements found herein. One or more members of a group can be included in, or deleted from, a group for reasons of convenience and/or patentability. When any such inclusion or deletion occurs, the specification is herein deemed to contain the group as modified thus fulfilling the written description of all groups used in the appended claims.
SUMMARY
[0010] This summary is provided to introduce a selection of concepts in a simplified form to be further described below in the Detailed Description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
[0011] An aspect of the present disclosure relates to a system for controlling a vehicle during fog. The system includes a sensor and an engine control unit (ECU). The sensor is configured on the vehicle to obtain a distance between the sensor and an object. The engine control unit (ECU) of the vehicle is communicably coupled with the sensor to retrieve the obtained distance between the sensor and the object and to compare a first value of the retrieved distance with a second value pre-stored in the ECU. The ECU controls the vehicle based on an output of the comparison of the first value and the second value.
[0012] In an aspect, the sensor is a Time-of-Flight (ToF) sensor.
[0013] In an aspect, the distance is obtained based on a time difference between emission of a signal from the sensor and a return, after being reflected by the object, of the signal to the sensor.
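The time-difference relation described above can be sketched as follows. This is a minimal illustration, not the disclosed firmware; the function name and parameters are hypothetical, and it assumes an optical ToF sensor whose signal propagates at the speed of light:

```python
# Illustrative sketch of the time-of-flight distance relation:
# distance = (propagation speed x round-trip time) / 2.
# The factor of 2 accounts for the signal travelling to the object
# and back to the sensor.

SPEED_OF_LIGHT_M_PER_S = 299_792_458  # assumed optical/laser ToF signal

def tof_distance_m(round_trip_time_s: float,
                   propagation_speed_m_per_s: float = SPEED_OF_LIGHT_M_PER_S) -> float:
    """Distance from the sensor to the reflecting object."""
    return propagation_speed_m_per_s * round_trip_time_s / 2.0
```

For example, a round-trip time of about 133 nanoseconds corresponds to an object roughly 20 meters away, the threshold distance used later in this disclosure.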
[0014] In an aspect, the vehicle includes a brake which brakes one or more wheels of the vehicle to control the vehicle.
[0015] In an aspect, the ECU includes a microcontroller to compare a first value of the retrieved distance with a second value pre-stored in the ECU.
[0016] In an aspect, the microcontroller determines a type of object based at least on an input received from the sensor, the type of object is selected from any or a combination of a human, an animal, a speed breaker, an electrical pole, a stone, another vehicle, and a pothole.
[0017] In an aspect, the ECU controls the vehicle by applying brakes on one or more wheels of the vehicle when the output of the comparison indicates that the first value of the retrieved distance is less than the second value pre-stored in the ECU.
[0018] In an aspect, the sensor is configured on any or a combination of a front side, a back side, and lateral sides of the vehicle.
[0019] An aspect of the present disclosure relates to a device configured on a vehicle for controlling the vehicle during fog. The device includes a sensor and an engine control unit (ECU). The sensor obtains a distance between the sensor and an object. The engine control unit (ECU) of the vehicle is communicably coupled with the sensor to retrieve the obtained distance between the sensor and the object and to compare a first value of the retrieved distance with a second value pre-stored in the ECU, wherein the ECU, based on an output of the comparison of the first value and the second value, is configured to control the vehicle.
[0020] An aspect of the present disclosure relates to a method of controlling a vehicle during fog. The method includes the following steps: a sensor of the vehicle obtains a distance between the sensor and an object; an engine control unit (ECU) of the vehicle communicably coupled with the sensor retrieves the obtained distance between the sensor and the object; and the ECU compares a first value of the retrieved distance with a second value pre-stored in the ECU. The ECU controls the vehicle based on an output of the comparison of the first value and the second value.
[0021] Various objects, features, aspects, and advantages of the present disclosure will become more apparent from the following detailed description of preferred embodiments, along with the accompanying drawing figures in which like numerals represent like features.
BRIEF DESCRIPTION OF THE DRAWINGS
[0022] The accompanying drawings are included to provide a further understanding of the present disclosure and are incorporated in and constitute a part of this specification. The drawings illustrate exemplary embodiments of the present disclosure and, together with the description, serve to explain the principles of the present disclosure.
[0023] In the figures, similar components and/or features may have the same reference label. Further, various components of the same type may be distinguished by following the reference label with a second label that distinguishes among the similar components. If only the first reference label is used in the specification, the description is applicable to any one of the similar components having the same first reference label irrespective of the second reference label.
[0024] FIG. 1 illustrates an exemplary flow diagram of a proposed system, in accordance with an exemplary embodiment of the present disclosure.
[0025] FIG. 2 illustrates a block diagram of a proposed system, in accordance with an exemplary embodiment of the present disclosure.
[0026] FIG. 3 illustrates a module diagram of a proposed device, in accordance with an exemplary embodiment of the present disclosure.
[0027] FIG. 4 illustrates a flow diagram of a proposed system, in accordance with an exemplary embodiment of the present disclosure.
DETAILED DESCRIPTION
[0028] Embodiments of the present disclosure include various steps, which will be described below. The steps may be performed by hardware components or may be embodied in machine-executable instructions, which may be used to cause a general-purpose or special-purpose processor programmed with the instructions to perform the steps. Alternatively, steps may be performed by a combination of hardware, software, and firmware or by human operators.
[0029] Embodiments of the present disclosure may be provided as a computer program product, which may include a machine-readable storage medium tangibly embodying thereon instructions, which may be used to program a computer (or other electronic devices) to perform a process. The machine-readable medium may include, but is not limited to, fixed (hard) drives, magnetic tape, floppy diskettes, optical disks, compact disc read-only memories (CD-ROMs), magneto-optical disks, and semiconductor memories, such as read-only memories (ROMs), programmable read-only memories (PROMs), erasable PROMs (EPROMs), electrically erasable PROMs (EEPROMs), random access memories (RAMs), flash memory, magnetic or optical cards, or other types of media/machine-readable media suitable for storing electronic instructions (e.g., computer programming code, such as software or firmware).
[0030] Various methods described herein may be practiced by combining one or more machine-readable storage media containing the code according to the present disclosure with appropriate standard computer hardware to execute the code contained therein. An apparatus for practicing various embodiments of the present disclosure may involve one or more computers (or one or more processors within a single computer) and storage systems containing or having network access to computer program(s) coded in accordance with various methods described herein, and the method steps of the disclosure could be accomplished by modules, routines, subroutines, or subparts of a computer program product.
[0031] If the specification states a component or feature “may”, “can”, “could”, or “might” be included or have a characteristic, that particular component or feature is not required to be included or have the characteristic.
[0032] Exemplary embodiments will now be described more fully hereinafter with reference to the accompanying drawings, in which exemplary embodiments are shown. This disclosure may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. These embodiments are provided so that this disclosure will be thorough and complete and will fully convey the scope of the disclosure to those of ordinary skill in the art. Moreover, all statements herein reciting embodiments of the disclosure, as well as specific examples thereof, are intended to encompass both structural and functional equivalents thereof. Additionally, it is intended that such equivalents include both currently known equivalents as well as equivalents developed in the future (i.e., any elements developed that perform the same function, regardless of structure).
[0033] Thus, for example, it will be appreciated by those of ordinary skill in the art that the diagrams, schematics, illustrations, and the like represent conceptual views or processes illustrating systems and methods embodying this disclosure. The functions of the various elements shown in the figures may be provided through the use of dedicated hardware as well as hardware capable of executing associated software. Similarly, any electronic code generator shown in the figures is conceptual only; its function may be carried out through the operation of program logic, through dedicated logic, through the interaction of program control and dedicated logic, or even manually, the particular technique being selectable by the entity implementing this disclosure. Those of ordinary skill in the art further understand that the exemplary hardware, software, processes, methods, and/or operating systems described herein are for illustrative purposes and, thus, are not intended to be limited to any particular implementation named.
[0034] Various terms as used herein are shown below. To the extent a term used in a claim is not defined below, it should be given the broadest definition persons in the pertinent art have given that term as reflected in printed publications and issued patents at the time of filing.
[0035] The problems to be solved by the present invention are as follows. Existing technology diverts the path of a vehicle using a radar sensor and a stereo camera, which is very costly; if any danger is detected, the system gives the driver a visual and audible warning, and the vehicles are self-driven and can control the braking on their own. However, no device, system, or method available in the market reduces the speed of a vehicle when an object appears in front of it during foggy weather. Furthermore, the available systems, devices, and methods are expensive and require complex operating systems. Additionally, existing systems work only under normal conditions, i.e., when the driver applies the brakes in foggy conditions, on-coming traffic may collide with the standing vehicle.
[0036] Therefore, there is a need for an efficient, effective, and improved system and method for automatic reduction or retardation of a vehicle’s speed upon detection of an obstacle in any direction of the vehicle. Further, there is a need for an artificial-intelligence-powered system to reduce the speed of a vehicle when any object comes in front of the vehicle during foggy weather. Furthermore, there is a need for a vehicle collision avoidance system that assists the driver in controlling the automobile easily when another vehicle unintentionally comes in front during fog.
[0037] Aspects of the present disclosure relate to a system and method for automatic reduction or retardation of a vehicle’s speed during fog. The system incorporates one or more time-of-flight (TOF) sensors for accurate detection of distance, at least one electronic controller, and one or more induction motors.
[0038] In an aspect, TOF sensors are used to accurately detect the obstacle distance from the vehicle and send the distance data to the electronic controller. If the distance data shows that the obstacle distance is more than 20 meters, a microcontroller takes no action. However, if the measured distance is less than 20 meters and the speed is more than a threshold limit, the controller automatically actuates anti-speeding, i.e., retards the speed of the vehicle, for example by applying induction-motor-assisted braking during fog.
[0039] In an aspect, one or more TOF sensors detect an accurate distance.
[0040] In an aspect, when the sensor-measured obstacle distance is less than 20 meters and the speed is more than a threshold limit, the controller automatically actuates anti-speeding, i.e., retards the speed of the vehicle, for example by applying the induction-motor-assisted braking during fog.
[0041] In an aspect, the front-mounted TOF sensor can determine the distance between the vehicles using the speed-distance relation. The distance found can thus be displayed and used to take necessary actions. Also, a fog sensor can be used to obtain the real-time fog intensity and display it on an augmented-reality-based head-up display. Furthermore, the two sensors, i.e., the TOF sensor and the fog sensor, can be interfaced to collaborate and compute the data.
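One way the speed-distance relation mentioned above can be applied is to estimate the time until the gap to the leading vehicle closes. This is a hypothetical illustration (the disclosure does not specify this computation); the function name and parameters are assumptions:

```python
def time_to_collision_s(distance_m: float, closing_speed_m_per_s: float) -> float:
    """Seconds until the gap to the object closes at the current closing speed.

    A non-positive closing speed means the gap is not shrinking, so no
    collision is expected (returns infinity).
    """
    if closing_speed_m_per_s <= 0:
        return float("inf")
    return distance_m / closing_speed_m_per_s
```

A value such as this could be shown on the head-up display alongside the fog intensity, giving the driver an interpretable warning rather than a raw distance.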
[0042] In an aspect, rear-mounted TOF sensors at the rear end of the vehicle monitor the real-time distance of an approaching rear vehicle. The proposed system can warn the driver with details of the approaching vehicle and display the information on the screen.
[0043] FIG. 1 illustrates an exemplary flow diagram of a proposed system, in accordance with an exemplary embodiment of the present disclosure.
[0044] As shown in FIG. 1, the TOF sensor can detect an obstacle. If the TOF sensor detects an obstacle, its distance is measured by a microcontroller. If the obstacle distance is more than 20 meters, the microcontroller takes no action. However, if the measured distance is less than 20 meters and the speed is more than a threshold limit, the controller automatically actuates anti-speeding, i.e., retards the speed of the vehicle by applying induction-motor- or engine-control-unit-assisted braking.
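The decision flow of FIG. 1 can be sketched as follows. The 20-meter distance threshold comes from the disclosure; the 30 km/h speed threshold and the function name are assumed example values, not part of the disclosed system:

```python
DISTANCE_THRESHOLD_M = 20.0  # from the disclosure: act only below 20 m

def control_action(obstacle_distance_m: float,
                   vehicle_speed_kmph: float,
                   speed_threshold_kmph: float = 30.0) -> str:
    """Return the action the controller takes for a detected obstacle.

    Mirrors FIG. 1: no action beyond 20 m; below 20 m and above the
    speed threshold, actuate anti-speeding (motor-assisted braking).
    The 30 km/h default speed threshold is an assumed example value.
    """
    if obstacle_distance_m >= DISTANCE_THRESHOLD_M:
        return "no_action"
    if vehicle_speed_kmph > speed_threshold_kmph:
        return "retard_speed"  # trigger induction-motor-assisted braking
    return "no_action"
```

Note that the speed gate means a vehicle already crawling through fog is not braked further, matching the description that anti-speeding actuates only when speed exceeds the threshold limit.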
[0045] FIG. 2 illustrates a block diagram of a proposed system, in accordance with an exemplary embodiment of the present disclosure.
[0046] In an embodiment, the proposed system 200 can include a TOF sensor 202, a photoelectric speed sensor 204, a microcontroller unit 206, an engine control unit 208 and an induction motor 210.
[0047] In another embodiment, the TOF sensor 202 can be used to detect the obstacle distance from the vehicle. The photoelectric speed sensor 204 can be used to discover the distance, absence, or presence of an object by using a light transmitter, and to detect the speed of a wheel of the vehicle.
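A common way such a photoelectric sensor yields wheel speed is by counting light-beam interruptions per wheel revolution over a sampling interval. The following sketch is illustrative only; the pulse count, pulses-per-revolution, and wheel geometry are assumed parameters, not values from the disclosure:

```python
import math

def wheel_speed_kmph(pulse_count: int,
                     interval_s: float,
                     pulses_per_revolution: int,
                     wheel_diameter_m: float) -> float:
    """Vehicle speed from photoelectric pulse counts over a sampling interval.

    revolutions per second = pulses / (pulses_per_revolution * interval);
    linear speed = revolutions per second * wheel circumference (pi * d),
    converted from m/s to km/h with the factor 3.6.
    """
    revs_per_s = pulse_count / (pulses_per_revolution * interval_s)
    speed_m_per_s = revs_per_s * math.pi * wheel_diameter_m
    return speed_m_per_s * 3.6
```

For instance, 20 pulses in one second from a 20-slot encoder on a 0.6 m wheel corresponds to one revolution per second, i.e., roughly 6.8 km/h.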
[0048] In another embodiment, the microcontroller unit 206 can be connected wired or wirelessly with the TOF sensor 202 and the photoelectric speed sensor 204. The microcontroller unit 206 can receive the distance data from the TOF sensor 202 and the photoelectric speed sensor 204.
[0049] In another embodiment, if the distance measured by the TOF sensor 202 is less than 20 meters, then the microcontroller unit 206 automatically actuates anti-speeding i.e. retards the speed of the vehicle. The microcontroller unit 206 can be connected with the engine control unit 208. The microcontroller unit can activate the engine control unit 208 to trigger the induction motor 210.
[0050] In another embodiment, the induction motor (or asynchronous motor) 210 is an AC electric motor in which the electric current in the rotor needed to produce torque is obtained by electromagnetic induction from the magnetic field of the stator winding.
[0051] In another embodiment, the microcontroller unit 206 can receive signals sent from the various sensors 202 and 204, perform the various switching and processing, and then output a command to control the operation of the engine control unit 208 during the entire operation of the present system.
[0052] In another embodiment, the engine control unit 208 can act as the command and control center of the system.
[0053] In another embodiment, the ECU 208 can include a microcontroller 206 to compare a first value of the retrieved distance with a second value pre-stored in the ECU. The microcontroller 206 can determine a type of object based at least on an input received from the sensor, wherein the type of object is selected from any or a combination of a human, an animal, a speed breaker, an electrical pole, a stone, another vehicle, and a pothole.
[0054] In another embodiment, the ECU 208 can control the vehicle by applying brakes on one or more wheels of the vehicle when the output of the comparison indicates that the first value of the retrieved distance is less than the second value pre-stored in the ECU.
[0055] FIG. 3 illustrates a module diagram of a proposed device, in accordance with an exemplary embodiment of the present disclosure.
[0056] In one embodiment, the proposed device 300 may include at least one processor 302, an input/output (I/O) interface 304, and a memory 306. The at least one processor 302 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the at least one processor 302 is configured to fetch and execute computer-readable instructions stored in the memory. The I/O interface may include a variety of software and hardware interfaces, for example, a web interface, a graphical user interface, and the like. The I/O interface may allow the proposed system to interact with a user directly or through the client devices. Further, the I/O interface may enable the proposed device 300 to communicate with other computing devices, such as web servers and external data servers (not shown). The I/O interface can facilitate multiple communications within a wide variety of networks and protocol types, including wired networks, for example, LAN, cable, etc., and wireless networks, such as WLAN, cellular, or satellite. The I/O interface may include one or more ports for connecting a number of devices to one another or to another server.
[0057] The memory 306 may include any computer-readable medium known in the art including, for example, volatile memory, such as static random access memory (SRAM) and dynamic random access memory (DRAM), and/or non-volatile memory, such as read-only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes. The memory may include modules, routines, programs, objects, components, data structures, etc., which perform particular tasks or implement particular abstract data types.
[0058] In another embodiment, the proposed device can include a sensor 202 and an engine control unit (ECU) 208. The sensor 202 can obtain a distance between the sensor 202 and an object. The engine control unit (ECU) 208 of the vehicle is communicably coupled with the sensor 202 to retrieve the obtained distance between the sensor and the object and to compare a first value of the retrieved distance with a second value pre-stored in the ECU. The ECU 208 can control the vehicle based on an output of the comparison of the first value and the second value.
[0059] In another embodiment, the sensor 202 is a Time-of-Flight (ToF) sensor.
[0060] FIG. 4 illustrates a flow diagram of a proposed system, in accordance with an exemplary embodiment of the present disclosure.
[0061] At step 402, a sensor of the vehicle obtains a distance between the sensor and an object.
[0062] At step 404, an engine control unit (ECU) of the vehicle communicably coupled with the sensor retrieves the obtained distance between the sensor and the object.
[0063] At step 406, the ECU compares a first value of the retrieved distance with a second value pre-stored in the ECU. The ECU controls the vehicle based on an output of the comparison of the first value and the second value.
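Steps 402-406 can be sketched end-to-end as follows. This is a minimal illustration, not the disclosed implementation; the class and method names are hypothetical, and the ECU's pre-stored second value is assumed to be the 20-meter threshold mentioned earlier in the disclosure:

```python
class SensorStub:
    """Stands in for the vehicle's ToF sensor (step 402)."""
    def __init__(self, distance_m: float):
        self._distance_m = distance_m

    def obtain_distance(self) -> float:
        return self._distance_m


class ECU:
    """Retrieves the obtained distance (step 404) and compares it with
    a pre-stored value to decide whether to brake (step 406)."""
    def __init__(self, pre_stored_value_m: float = 20.0):
        self.pre_stored_value_m = pre_stored_value_m  # assumed "second value"

    def control(self, sensor: SensorStub) -> bool:
        first_value = sensor.obtain_distance()        # retrieved distance
        return first_value < self.pre_stored_value_m  # True => apply brakes
```

For example, `ECU().control(SensorStub(10.0))` yields a braking decision, while a 25-meter reading does not.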
[0064] The foregoing objects, features, and advantages of the present invention are described below in detail with reference to the accompanying drawings, so that one of ordinary skill in the art can easily carry out the technical features of the invention. A detailed description of known art related to the invention is omitted where it is determined that it would unnecessarily obscure the subject matter of the present invention. Preferred embodiments according to the present invention are described in detail below with reference to the accompanying drawings, in which like reference numerals are used to refer to the same or similar elements.
[0065] It should be apparent to those skilled in the art that many more modifications besides those already described are possible without departing from the inventive concepts herein. The inventive subject matter, therefore, is not to be restricted except in the spirit of the appended claims. Moreover, in interpreting both the specification and the claims, all terms should be interpreted in the broadest possible manner consistent with the context. In particular, the terms “comprises” and “comprising” should be interpreted as referring to elements, components, or steps in a non-exclusive manner, indicating that the referenced elements, components, or steps may be present, or utilized, or combined with other elements, components, or steps that are not expressly referenced. Where the specification or claims refer to at least one of something selected from the group consisting of A, B, C, … and N, the text should be interpreted as requiring only one element from the group, not A plus N, or B plus N, etc. The foregoing description of the specific embodiments will so fully reveal the general nature of the embodiments herein that others can, by applying current knowledge, readily modify and/or adapt such specific embodiments for various applications without departing from the generic concept, and, therefore, such adaptations and modifications should be, and are intended to be, comprehended within the meaning and range of equivalents of the disclosed embodiments. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation. Therefore, while the embodiments herein have been described in terms of preferred embodiments, those skilled in the art will recognize that the embodiments herein can be practiced with modification within the spirit and scope of the appended claims.
[0066] While embodiments of the present disclosure have been illustrated and described, it will be clear that the disclosure is not limited to these embodiments only. Numerous modifications, changes, variations, substitutions, and equivalents will be apparent to those skilled in the art, without departing from the spirit and scope of the disclosure, as described in the claims.
We Claim:
1. A system (200) for controlling a vehicle, the system comprising:
a sensor (202) configured on the vehicle to obtain a distance between the sensor and an object;
an engine control unit (ECU) (208) of the vehicle communicably coupled with the sensor (202) to retrieve the obtained distance between the sensor (202) and the object and to compare a first value of the retrieved distance with a second value pre-stored in the ECU, wherein the ECU (208), based on an output of the comparison of the first value and the second value, is configured to control the vehicle.
2. The system (200) as claimed in claim 1, wherein the sensor (202) is a Time-of-Flight (ToF) sensor.
3. The system (200) as claimed in claim 1, wherein the distance is obtained based on a time difference between emission of a signal from the sensor and a return, after being reflected by the object, of the signal to the sensor.
4. The system (200) as claimed in claim 1, wherein the vehicle comprises a brake which brakes one or more wheels of the vehicle to control the vehicle during fog.
5. The system (200) as claimed in claim 1, wherein the ECU (208) comprises a microcontroller (206) to compare a first value of the retrieved distance with a second value pre-stored in the ECU (208).
6. The system (200) as claimed in claim 1, wherein the microcontroller (206) is configured to determine a type of object based at least on an input received from the sensor, wherein the type of object is selected from any or a combination of a human, an animal, a speed breaker, an electrical pole, a stone, another vehicle, and a pothole.
7. The system (200) as claimed in claim 1, wherein the ECU (208) is configured to control the vehicle by applying brakes on one or more wheels of the vehicle when the output of the comparison indicates that the first value of the retrieved distance is less than the second value pre-stored in the ECU.
8. The system (200) as claimed in claim 1, wherein the sensor is configured on any or a combination of a front side, a back side, and lateral sides of the vehicle.
9. A device (300) configured on a vehicle for controlling the vehicle, the device comprising:
a sensor (202) to obtain a distance between the sensor and an object;
an engine control unit (ECU) (208) of the vehicle communicably coupled with the sensor to retrieve the obtained distance between the sensor and the object to compare a first value of the retrieved distance with a second value pre-stored in the ECU, wherein the ECU, based on an output of the comparison of the first value and the second value, is configured to control the vehicle during fog.
10. A method of controlling a vehicle, the method comprising:
obtaining (402), at a sensor of the vehicle, a distance between the sensor and an object;
retrieving (404), at an engine control unit (ECU) of the vehicle communicably coupled with the sensor, the obtained distance between the sensor and the object;
comparing (406), at the ECU, a first value of the retrieved distance with a second value pre-stored in the ECU, wherein the ECU, based on an output of the comparison of the first value and the second value, is configured to control the vehicle.
| # | Name | Date |
|---|---|---|
| 1 | 201911022157-IntimationOfGrant11-03-2024.pdf | 2024-03-11 |
| 2 | 201911022157-STATEMENT OF UNDERTAKING (FORM 3) [04-06-2019(online)].pdf | 2019-06-04 |
| 3 | 201911022157-FORM FOR STARTUP [04-06-2019(online)].pdf | 2019-06-04 |
| 4 | 201911022157-PatentCertificate11-03-2024.pdf | 2024-03-11 |
| 5 | 201911022157-FORM FOR SMALL ENTITY(FORM-28) [04-06-2019(online)].pdf | 2019-06-04 |
| 6 | 201911022157-CLAIMS [12-11-2022(online)].pdf | 2022-11-12 |
| 7 | 201911022157-FORM 1 [04-06-2019(online)].pdf | 2019-06-04 |
| 8 | 201911022157-COMPLETE SPECIFICATION [12-11-2022(online)].pdf | 2022-11-12 |
| 9 | 201911022157-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [04-06-2019(online)].pdf | 2019-06-04 |
| 10 | 201911022157-CORRESPONDENCE [12-11-2022(online)].pdf | 2022-11-12 |
| 11 | 201911022157-EVIDENCE FOR REGISTRATION UNDER SSI [04-06-2019(online)].pdf | 2019-06-04 |
| 12 | 201911022157-DRAWING [12-11-2022(online)].pdf | 2022-11-12 |
| 13 | 201911022157-FER_SER_REPLY [12-11-2022(online)].pdf | 2022-11-12 |
| 14 | 201911022157-DRAWINGS [04-06-2019(online)].pdf | 2019-06-04 |
| 15 | 201911022157-FER.pdf | 2022-05-13 |
| 16 | 201911022157-DECLARATION OF INVENTORSHIP (FORM 5) [04-06-2019(online)].pdf | 2019-06-04 |
| 17 | 201911022157-COMPLETE SPECIFICATION [04-06-2019(online)].pdf | 2019-06-04 |
| 18 | 201911022157-FORM 18 [22-05-2021(online)].pdf | 2021-05-22 |
| 19 | 201911022157-Correspondence-180719.pdf | 2019-07-26 |
| 20 | abstract.jpg | 2019-07-12 |
| 21 | 201911022157-OTHERS-180719.pdf | 2019-07-26 |
| 22 | 201911022157-Proof of Right (MANDATORY) [16-07-2019(online)].pdf | 2019-07-16 |
| 23 | 201911022157-FORM-26 [16-07-2019(online)].pdf | 2019-07-16 |
| 24 | 201911022157-Power of Attorney-180719.pdf | 2019-07-26 |
| 25 | 201911022157E_10-05-2022.pdf | |