Abstract: The present disclosure pertains to an autonomous stretcher assembly (100) including a set of sensors (104), an image capturing unit (106), a set of motors (108), a processing unit (110), and an alert unit. The processing unit (110) facilitates autonomous movement of the assembly (100) from a source to a destination by detecting obstructions with the help of the set of sensors (104) and the image capturing unit (106), and facilitates changing the direction of one or more wheels associated with the stretcher (102). The processing unit (110) enables tracking and observing useful medical analytics, which help doctors and medical practitioners to control a medical situation efficiently. The processing unit (110) also facilitates following the route to the doctor's cabin, the operation theatre, and the like.
TECHNICAL FIELD
[0001] The present disclosure relates generally to the field of supporting structures. More particularly, the present disclosure provides an autonomous stretcher assembly for taking an entity from one place to another without human effort.
BACKGROUND
[0002] The background description includes information that may be useful in understanding the present invention. It is not an admission that any of the information provided herein is prior art or relevant to the presently claimed invention, or that any publication specifically or implicitly referenced is prior art.
[0003] Stretchers are among the most vital assets at hospitals, yet they require a human workforce to move them around hospital rooms or emergency areas, which is a problem that needs to be solved. Patients need an attendant to take them on a stretcher, which involves more people. Existing solutions include remote-control stretchers that can be controlled remotely within a given range by a human, but these again require a human workforce. There is a need to overcome the above-mentioned problems of the prior art with a solution that facilitates autonomous movement of a stretcher assembly without human effort.
[0004] The solution can drive around all observable obstacles and take the patient lying on the stretcher to the directed place/destination (room) safely. The solution can help in tracking useful medical analytics, and can observe medical analytics that help doctors and other medical practitioners to control a medical situation efficiently. Further, the solution can help in tracking or following the route to the doctor's cabin or the operation theatre.
OBJECTS OF THE PRESENT DISCLOSURE
[0005] Some of the objects of the present disclosure, which at least one embodiment herein satisfies are as listed herein below.
[0006] It is an object of the present disclosure to provide an autonomous
stretcher assembly that facilitates automatic transportation of a patient from a
place to another without any human workforce.
[0007] It is an object of the present disclosure to provide an autonomous stretcher assembly that can drive around all observable obstacles and take the patient lying on the stretcher to the directed place safely.
[0008] It is an object of the present disclosure to provide an autonomous stretcher assembly that is a time-efficient and work-efficient asset.
[0009] It is an object of the present disclosure to provide an autonomous stretcher assembly that enables auto-balancing of the patient to provide a smooth ride during an emergency.
[0010] It is an object of the present disclosure to provide an autonomous
stretcher assembly that facilitates tracking useful medical analytics.
[0011] It is an object of the present disclosure to provide an autonomous stretcher assembly that helps in observing medical analytics to help doctors, medical practitioners, and the like to control a medical situation in a better way.
[0012] It is an object of the present disclosure to provide an autonomous stretcher assembly that enables tracking or following the route to the doctor's cabin and the operation theatre.
SUMMARY
[0013] The present disclosure relates generally to the field of supporting structures. More particularly, the present disclosure provides an autonomous stretcher assembly for taking an entity from one place to another without human effort.
[0014] An aspect of the present disclosure pertains to an autonomous stretcher assembly. The assembly may include a set of sensors, a set of motors, and a processor. The set of sensors may be configured with one or more wheels of the stretcher to capture a pre-defined region and correspondingly generate a first set of signals. The autonomous stretcher assembly may be configured to move on a pre-defined track. The set of motors may be coupled to the one or more wheels of the stretcher. The processor may be operatively coupled to the set of sensors and the set of motors, where the processor may include a memory storing a set of instructions, where, upon execution of the set of instructions, the processor may be configured to extract images pertaining to obstructions associated with the pre-defined region from the first set of signals. The processor may be configured to match the images with pre-stored obstructions stored in a database. The processor may be configured to generate a set of actuation signals in response to the images matched to the pre-stored obstructions, and may transmit the set of actuation signals to the set of motors, where the set of actuation signals may facilitate actuation of the set of motors and enable changing the direction of the one or more wheels on the pre-defined track for autonomous movement of the assembly.
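The extract, match, and actuate steps described in this summary can be sketched as a simple decision loop. The class, method, and signal names below (StretcherProcessor, ActuationSignal, the 30-degree steering angle, the four-wheel count) are illustrative assumptions, not part of the disclosure.

```python
from dataclasses import dataclass

@dataclass
class ActuationSignal:
    wheel_id: int
    new_direction_deg: float

class StretcherProcessor:
    """Hypothetical sketch of the processor's obstruction-handling loop."""

    def __init__(self, known_obstructions):
        # pre-stored obstructions, keyed by an image label
        self.known_obstructions = set(known_obstructions)

    def match(self, detected_label):
        """Return True if the detected obstruction is pre-stored."""
        return detected_label in self.known_obstructions

    def actuate(self, detected_label):
        """Generate actuation signals for a matched obstruction,
        or None (triggering the alert path) for an unknown one."""
        if not self.match(detected_label):
            return None  # caller raises an alert signal instead
        # steer every wheel away from the obstruction (fixed angle here)
        return [ActuationSignal(wheel_id=i, new_direction_deg=30.0)
                for i in range(4)]

proc = StretcherProcessor({"trolley", "person", "door"})
signals = proc.actuate("trolley")
print(len(signals))           # all 4 wheels receive a signal
print(proc.actuate("spill"))  # unknown obstruction -> None -> alert path
```

Keeping the match step separate from the actuation step mirrors the disclosure's split between matching pre-stored obstructions and generating actuation signals.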
[0015] In an aspect, the set of sensors may include any or a combination of an image sensor, a balancing sensor, and the like.
[0016] In an aspect, the processor may be configured to generate a set of alert signals when the images pertaining to the obstructions associated with the pre-defined region are not matched with the pre-stored obstructions, where the set of alert signals may be transmitted to an alert unit operatively coupled to the processor.
[0017] In an aspect, the alert unit may be configured with the assembly, and where the alert unit may include any or a combination of light emitting diode (LED), buzzer, alarm, and vibrator.
[0018] In an aspect, the assembly may include a frame and a mattress positioned on a top surface of the frame, and the one or more wheels may be movably coupled with one or more legs of the frame.
[0019] In an aspect, the processor may be communicatively coupled to one or more mobile computing devices through a communication unit, where the communication unit may facilitate transmitting the set of alert signals to the one or more mobile computing devices.
[0020] In an aspect, the direction of the one or more wheels may be controlled by the set of motors in response to the received set of actuation signals.
[0021] In an aspect, the direction of the one or more wheels may be changed based on the pre-defined track, where the changed direction of the one or more wheels may facilitate autonomous movement of the stretcher on an obstruction-free track.
[0022] In an aspect, the assembly may include one or more retractable straps configured with the frame of the assembly, where the one or more retractable straps may facilitate providing support and grip to an entity on the mattress of the stretcher.
[0023] In an aspect, the assembly may include a power source coupled to the processor, where the power source may facilitate supplying electric power to the set of motors and enable electric charging of the assembly.
BRIEF DESCRIPTION OF THE DRAWINGS
[0024] The accompanying drawings are included to provide a further
understanding of the present disclosure, and are incorporated in and constitute a
part of this specification. The drawings illustrate exemplary embodiments of the
present disclosure and, together with the description, serve to explain the
principles of the present disclosure.
[0025] The diagrams are for illustration only, which thus is not a limitation of
the present disclosure, and wherein:
[0026] FIG. 1 illustrates a block diagram of proposed autonomous stretcher
assembly, in accordance with an embodiment of the present disclosure.
[0027] FIG. 2 illustrates exemplary functional components of processing unit
of the proposed autonomous stretcher assembly, in accordance with an
embodiment of the present disclosure.
[0028] FIGs. 3A-3B illustrate implementation of the proposed autonomous
stretcher assembly, in accordance with an embodiment of the present disclosure.
DETAILED DESCRIPTION
[0029] In the following description, numerous specific details are set forth in order to provide a thorough understanding of embodiments of the present invention. It will be apparent to one skilled in the art that embodiments of the present invention may be practiced without some of these specific details.
[0030] The present disclosure relates generally to the field of supporting structures. More particularly, the present disclosure provides an autonomous stretcher assembly for taking an entity from one place to another without human effort.
[0031] FIG. 1 illustrates a block diagram of the proposed autonomous stretcher assembly, in accordance with an embodiment of the present disclosure.
[0032] As illustrated in FIG. 1, the proposed autonomous stretcher assembly (100) (also referred to as the assembly (100) herein) can include a set of sensors (104), an image capturing unit (106), a processor (110) (interchangeably referred to as the processing unit (110) herein), a set of motors (108), an alert unit (not shown), and one or more retractable straps (not shown). In an embodiment, the assembly (100) can be configured to move autonomously, where the direction of the assembly (100) during movement can be adjusted according to obstructions in a pre-defined track.
[0033] In an illustrative embodiment, the assembly (100) can include a frame, a mattress, and one or more wheels, where the mattress can be positioned on a top surface of the frame, and where the one or more wheels can be movably coupled with one or more legs of the frame. In another illustrative embodiment, the assembly (100) can be configured to move on a pre-defined track.
[0034] In an illustrative embodiment, the set of sensors (104) can be configured with the one or more wheels of the stretcher to capture a pre-defined region and correspondingly generate a first set of signals. In another illustrative embodiment, the set of sensors (104) can include any or a combination of an image sensor, a balance sensor, and the like. In yet another illustrative embodiment, the first set of signals generated by the set of sensors (104) can be in electrical form, where the first set of signals can be transmitted to the processing unit (110). The pre-defined region can pertain to certain areas of the pre-defined track on which the assembly (100) moves.
[0035] In an illustrative embodiment, the assembly (100) can include an image capturing unit (106) configured to capture the pre-defined region, where the image capturing unit (106) can be configured with the one or more wheels of the assembly (100). In another illustrative embodiment, the set of motors (108) can be coupled to the one or more wheels of the stretcher, where the set of motors (108) can facilitate movement of the assembly (100) from a first pre-defined place to a second pre-defined place and enable changing the direction of the one or more wheels.
[0036] In an illustrative embodiment, the processing unit (110) can be operatively coupled to the set of sensors (104), the set of motors (108), and the alert unit, where the processing unit (110) can facilitate identifying obstructions associated with the pre-defined track, and enables actuating the set of motors (108) and changing the direction of the one or more wheels based on obstructions present on the pre-defined track.
[0037] In an illustrative embodiment, the processing unit (110) can be configured to generate a set of alert signals when the obstructions detected by the processing unit (110) do not match the pre-stored obstructions, where the set of alert signals are transmitted to the alert unit. In another illustrative embodiment, the alert unit includes any or a combination of a light emitting diode (LED), a buzzer, an alarm, and a vibrator.
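The unmatched-obstruction alert path above can be sketched as a small routing function. The device names and the on/off command format are assumptions for illustration; the disclosure only names the output devices (LED, buzzer, alarm, vibrator).

```python
# Illustrative sketch: route an alert to the alert unit's output devices
# when a detected obstruction does not match any pre-stored obstruction.
ALERT_DEVICES = ("led", "buzzer", "alarm", "vibrator")

def generate_alert(matched: bool):
    """Return the alert commands to fire.

    A matched (known) obstruction is handled by steering, so no alert
    is raised; an unmatched one activates every alert device.
    """
    if matched:
        return []
    return [(device, "on") for device in ALERT_DEVICES]

print(generate_alert(False))  # all four devices switched on
print(generate_alert(True))   # known obstruction: empty command list
```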
[0038] In an illustrative embodiment, the processing unit (110) can be communicatively coupled to one or more mobile computing devices through a communication unit, where the communication unit can facilitate transmitting the set of alert signals to the one or more mobile computing devices. In another illustrative embodiment, the communication unit can include any or a combination of Wireless Fidelity (Wi-Fi) module, Bluetooth module, Li-Fi module, optical fiber, Wireless Local Area Network (WLAN), ZigBee module and the like.
[0039] In an illustrative embodiment, the direction of the one or more wheels can be controlled by the set of motors (108) in response to the received set of actuation signals generated by the processing unit (110). In another illustrative embodiment, the direction of the one or more wheels can be changed based on the pre-defined track, where the changed direction of the one or more wheels can facilitate autonomous movement of the stretcher on an obstruction-free track.
[0040] In an illustrative embodiment, the assembly (100) can include one or more retractable straps configured with the frame of the assembly (100), where the one or more retractable straps can facilitate providing support and grip to an entity on the mattress of the stretcher. In another illustrative embodiment, the assembly (100) can include a power source coupled to the processing unit (110), where the power source can facilitate supplying electric power to the set of motors (108) and enable electric charging of the assembly (100).
[0041] In an illustrative embodiment, the camera (106) or the set of sensors (104) can be equipped on the stretcher, where the processing unit (110) can be configured with deep machine learning and can help in controlling the one or more wheels of the stretcher, enabling autonomous movement of the assembly (100). In another illustrative embodiment, the assembly (100) can be configured with balancing sensors to facilitate self-balancing and provide a smooth experience. In yet another illustrative embodiment, the assembly (100) can be a chargeable electric autopilot stretcher.
[0042] In an illustrative embodiment, the stretcher can be self-driven and can be operated without human assistance. In another illustrative embodiment, the processing unit (110) can be configured to direct or guide the assembly (100) to follow the route of the doctor to the operation theatre, the doctor's cabin, and the like. In another illustrative embodiment, the stretcher can be capable of autonomous movement using a machine learning model and the set of sensors (104) configured to detect obstructions or obstacles in the pre-defined track. In yet another illustrative embodiment, the processing unit (110) can facilitate tracking and observing useful medical analytics, which can help doctors and other medical practitioners to control a medical situation efficiently. The processing unit (110) can also facilitate tracking or following the route to the doctor's cabin.
[0043] FIG. 2 illustrates exemplary functional components of processing unit of the proposed autonomous stretcher assembly, in accordance with an embodiment of the present disclosure.
[0044] As illustrated in FIG. 2, the processing unit (110) can include one or more processor(s) (202). The one or more processor(s) (202) can be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, logic circuitries, and/or any devices that manipulate data based on operational instructions. Among other capabilities, the one or more processor(s) (202) are configured to fetch and execute computer-readable instructions stored in a memory (204) of the processing unit (110). The memory (204) can store one or more computer-readable instructions or routines, which may be fetched and executed to create or share the data units over a network service. The memory (204) can include any non-transitory storage device including, for example, volatile memory such as RAM, or non-volatile memory such as EPROM, flash memory, and the like.
[0045] In an embodiment, the processing unit (110) can also include an interface(s) (206). The interface(s) (206) may include a variety of interfaces, for example, interfaces for data input and output devices, referred to as I/O devices, storage devices, and the like. The interface(s) (206) may facilitate communication of the processing unit (110) with various devices coupled to the processing unit (110). The interface(s) (206) may also provide a communication pathway for one or more components of the processing unit (110). Examples of such components include, but are not limited to, machine learning engine(s) (208) and database (210).
[0046] In an embodiment, the machine learning engine(s) (208) can be implemented as a combination of hardware and programming (for example, programmable instructions) to implement one or more functionalities of the machine learning engine(s) (208). In examples described herein, such combinations of hardware and programming may be implemented in several different ways. For example, the programming for the machine learning engine(s) (208) may be processor executable instructions stored on a non-transitory
machine-readable storage medium and the hardware for the machine learning engine(s) (208) may include a processing resource (for example, one or more processors), to execute such instructions. In the present examples, the machine-readable storage medium may store instructions that, when executed by the processing resource, implement the machine learning engine(s) (208). In such examples, the processing unit (110) can include the machine-readable storage medium storing the instructions and the processing resource to execute the instructions, or the machine-readable storage medium may be separate but accessible to the processing unit (110) and the processing resource. In other examples, the machine learning engine(s) (208) may be implemented by electronic circuitry. A database (210) can include data that is either stored or generated as a result of functionalities implemented by any of the components of the machine learning engine(s) (208).
[0047] In an embodiment, the machine learning engine(s) (208) can include an assisting unit (212), an actuation unit (214), and other unit(s) (218). The other unit(s) (218) can implement functionalities that supplement applications or functions performed by the assembly (100) or the machine learning engine(s) (208).
[0049] As illustrated in FIG. 2, the processing unit (110) can be configured to receive a first set of signals from the set of sensors (104), where the set of sensors (104) can include any or a combination of an image sensor, a balancing sensor, a camera, and the like, and where the first set of signals can be in electrical form. In an illustrative embodiment, the processing unit (110) can be operatively coupled to the memory and a deep machine learning engine, where the memory can store a set of instructions executable by the processing unit (110), and where, upon execution of the set of instructions, the processing unit (110) can be configured to extract images pertaining to obstructions associated with the pre-defined region from the first set of signals with the help of an assisting unit (212), where the extraction can be done with the help of an extraction unit associated with the assisting unit (212).
[0050] In an illustrative embodiment, the assisting unit (212) can be configured to match the images with the pre-stored obstructions stored in a database (210), where the database (210) can include pre-defined track or route for an assembly (100) and obstructions associated with the pre-defined track. In another illustrative embodiment, the assisting unit (212) can be configured to assist and guide the assembly (100) based on the matched obstructions with the pre-stored obstructions. In yet another illustrative embodiment, the assisting unit (212) can be configured to detect the obstructions or obstacles in the pre-defined track or pre-defined path and accordingly can transmit a set of guiding signals to the actuation unit (214).
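The assisting unit's matching step, comparing an extracted obstruction image against pre-stored obstructions in the database, can be sketched minimally as below. A real system would use the learned model the disclosure mentions; here a coarse binary-signature comparison stands in, and all names, pixel values, and the 75% threshold are illustrative assumptions.

```python
def image_signature(pixels):
    """Reduce an image (flat list of intensities 0-255) to a coarse
    binary signature: 1 where a pixel is brighter than the mean."""
    mean = sum(pixels) / len(pixels)
    return tuple(1 if p > mean else 0 for p in pixels)

def match_obstruction(pixels, database):
    """Return the label of the closest pre-stored obstruction, or None
    if no stored signature agrees on at least 75% of its bits
    (the None case corresponds to the alert path)."""
    sig = image_signature(pixels)
    best_label, best_score = None, 0.0
    for label, stored in database.items():
        agree = sum(a == b for a, b in zip(sig, stored))
        score = agree / len(sig)
        if score > best_score:
            best_label, best_score = label, score
    return best_label if best_score >= 0.75 else None

# Hypothetical database (210) of pre-stored obstruction signatures.
db = {"trolley": image_signature([200, 210, 40, 30]),
      "door":    image_signature([10, 240, 240, 10])}
print(match_obstruction([190, 205, 35, 25], db))  # matches "trolley"
```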
[0051] In an illustrative embodiment, the actuation unit (214) can be configured to generate a set of actuation signals in response to the images matched to the pre-stored obstructions, and transmit the set of actuation signals to the set of motors, where the set of actuation signals can facilitate actuation of the set of motors and enable changing the direction of the one or more wheels on the pre-defined track for autonomous movement of the assembly (100). In another illustrative embodiment, the actuation unit (214) can facilitate autonomous movement of the assembly (100) with the help of the set of motors (108), where the set of motors (108) can enable changing the direction of the one or more wheels, and accordingly the one or more wheels can move the assembly (100).
[0052] In an illustrative embodiment, the assisting unit (212) can facilitate tracking useful medical analytics associated with the assembly (100) on the pre-defined route or track of the assembly (100). In another illustrative embodiment, the assisting unit (212) can be configured to observe medical analytics and can help doctors and other medical practitioners to control a medical situation efficiently. In yet another illustrative embodiment, the assisting unit (212) can facilitate tracking or following the route to the doctor's cabin, the operation theatre, and the like.
[0053] In an illustrative embodiment, the stretcher (100) can be self-driven with the help of the deep machine learning model configured with the processing unit (110), where the assembly (100) does not require any human assistance for driving. In another illustrative embodiment, the actuation unit (214) can facilitate driving the assembly (100) in an obstruction-free direction with the help of the set of sensors (104) and enable following the route of the doctor to the operation theatre.
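The actuation unit's job, turning a guiding signal into per-wheel direction commands for the set of motors, can be sketched as follows. The steering geometry (steer all wheels by a fixed clearance angle away from the obstruction's bearing) and the four-wheel assumption are simplifications for illustration, not the disclosure's method.

```python
def actuation_signals(obstruction_bearing_deg, n_wheels=4, clearance_deg=35.0):
    """Map an obstruction bearing (degrees relative to heading, positive =
    to the right) to a direction command for each wheel, steering toward
    the side with more room."""
    # steer left (negative) if the obstruction is to the right, else right
    steer = -clearance_deg if obstruction_bearing_deg >= 0 else clearance_deg
    return {f"wheel_{i}": steer for i in range(n_wheels)}

print(actuation_signals(+20.0))  # obstruction slightly right -> steer left
print(actuation_signals(-5.0))   # obstruction slightly left  -> steer right
```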
[0054] In an illustrative embodiment, the other unit(s) (218) can include a signal generation unit, where the signal generation unit can be configured to generate a set of alert signals when the images pertaining to the obstructions associated with the pre-defined region are not matched with the pre-stored obstructions, where the set of alert signals can be transmitted to an alert unit operatively coupled to the processing unit (110). In another illustrative embodiment, the alert unit can be configured with the assembly (100), and where the alert unit can include any or a combination of light emitting diode (LED), buzzer, alarm, vibrator, and the like.
[0055] In an illustrative embodiment, the processing unit (110) can be communicatively coupled to one or more mobile computing devices through a communication unit, where the communication unit can facilitate transmitting the set of alert signals to the one or more mobile computing devices. In another illustrative embodiment, the one or more mobile computing devices can include any or a combination of a cell phone, a laptop, and the like associated with a first entity such as a patient's attendant, a medical practitioner, a doctor, and the like, where the set of alert signals can help the first entity to determine and identify the location of the entity, where the entity can include a patient, a person on the stretcher, and the like.
[0056] In an illustrative embodiment, the processing unit (110) can be configured to transmit the status of the entity after the direction of the one or more wheels is changed according to the obstruction, and can facilitate sending the status to the first entity with the help of the communication unit. In another illustrative embodiment, the communication unit can include any or a combination of a Wireless Fidelity (Wi-Fi) module, a Bluetooth module, a Li-Fi module, optical fiber, a Wireless Local Area Network (WLAN) module, a ZigBee module, and the like.
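Delivering the same alert to every registered mobile device, as described above, can be sketched with a small broadcast helper. The payload fields, device names, and the injected `send` callable are assumptions; the actual transport (Wi-Fi, Bluetooth, ZigBee, and so on) is abstracted away.

```python
import json

def build_alert_payload(stretcher_id, location, status):
    """Package an alert as JSON (fields are illustrative, not specified
    by the disclosure)."""
    return json.dumps({"stretcher": stretcher_id,
                       "location": location,
                       "status": status})

def broadcast(payload, devices, send):
    """Deliver the same alert payload to every registered mobile device;
    `send` stands in for the communication unit's transport layer."""
    return [send(device, payload) for device in devices]

sent = broadcast(
    build_alert_payload("S-100", "corridor B", "unknown obstruction"),
    ["attendant_phone", "doctor_phone"],
    send=lambda device, payload: (device, len(payload)),  # stub transport
)
print(len(sent))  # delivered to 2 devices
```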
[0057] It would be appreciated that the units described are only exemplary units, and any other unit or sub-unit may be included as part of the assembly (100). These units may also be merged or divided into super-units or sub-units as may be configured.
[0058] FIGs. 3A-3B illustrate an implementation of the proposed autonomous stretcher assembly, in accordance with an embodiment of the present disclosure.
[0059] As illustrated in FIG. 3A and FIG. 3B, the proposed assembly (100) can include a set of sensors (104), an image capturing unit (106), a set of motors (108), a processing unit (110), an alert unit, a camera wire (302), and one or more wheels (304). In an illustrative embodiment, the assembly (100) can be a stretcher (102), where the stretcher (102) can facilitate supporting a patient. The stretcher (102) can enable automatically transporting the patient from a first pre-defined place to a second pre-defined place without any human workforce.
[0060] In an illustrative embodiment, the stretcher (100) can include a frame, a mattress, and the one or more wheels (304), where the mattress can be positioned on a top surface of the frame and the one or more wheels can be movably coupled with legs of the frame. In another illustrative embodiment, the stretcher (102) can include the image acquisition unit (106), or the set of sensors (104), along with the set of motors (108) to facilitate automatic movement of the stretcher (102) with the help of the one or more wheels (304). In yet another illustrative embodiment, the processing unit (110) can include a memory including a pre-defined set of routes of the associated hospital, and a set of obstacles to avoid collisions.
[0061] In an illustrative embodiment, the image capturing unit (106) and the set of sensors (104) can capture one or more images and obstacles in the vicinity, respectively, and transmit the captured images to the processing unit (110); the processing unit (110) can compare the images and the obstacles with the pre-defined set of obstacles and transmit a set of actuation signals to the set of motors (108), which facilitates changing the direction of the one or more wheels accordingly.
In yet another illustrative embodiment, one or more machine learning techniques can be implemented with the processing unit (110) associated with the stretcher, where the machine learning module can be configured to assist the
stretcher to move automatically from a source to a required destination within a pre-defined area.
[0062] In an illustrative embodiment, the processing unit (110) can be communicatively coupled to one or more mobile computing devices through a communication unit, where the communication unit can facilitate transmitting the set of alert signals to the one or more mobile computing devices. [0063] While the foregoing describes various embodiments of the invention, other and further embodiments of the invention may be devised without departing from the basic scope thereof. The scope of the invention is determined by the claims that follow. The invention is not limited to the described embodiments, versions or examples, which are included to enable a person having ordinary skill in the art to make and use the invention when combined with information and knowledge available to the person having ordinary skill in the art.
ADVANTAGES OF THE PRESENT DISCLOSURE
[0064] The present disclosure provides an autonomous stretcher assembly that
facilitates automatic transportation of a patient from a place to another without
any human workforce.
[0065] The present disclosure provides an autonomous stretcher assembly that can drive around all observable obstacles and take the patient lying on the stretcher to the directed place safely.
[0066] The present disclosure provides an autonomous stretcher assembly that is a time-efficient and work-efficient asset.
[0067] The present disclosure provides an autonomous stretcher assembly that enables auto-balancing of the patient to provide a smooth ride during an emergency.
[0068] The present disclosure provides an autonomous stretcher assembly that
facilitates tracking useful medical analytics.
[0069] The present disclosure provides an autonomous stretcher assembly that helps in observing medical analytics to help doctors, medical practitioners, and the like to control a medical situation in a better way.
[0070] The present disclosure provides an autonomous stretcher assembly that enables tracking or following the route to the doctor's cabin and the operation theatre.
We Claim:
1. An autonomous stretcher assembly (100) comprising:
a set of sensors (104) configured with one or more wheels (304)
of the stretcher (102) to capture a pre-defined region and correspondingly
generate a first set of signals;
wherein the autonomous stretcher assembly is configured to move
on a pre-defined track,
a set of motors (108) coupled to the one or more wheels (304) of
the stretcher (102);
a processor (110) operatively coupled to the set of sensors (104) and the set of motors (108), wherein the processor (110) includes a memory storing a set of instructions, wherein upon execution of the set of instructions, the processor (110) is configured to:
extract images pertaining to obstructions associated with the pre-defined region from the first set of signals;
match the images with the pre-stored obstructions stored in a database (210), and
generate a set of actuation signals in response to the images matched to the pre-stored obstructions, and transmit the set of actuation signals to the set of motors (108), wherein the set of actuation signals facilitate actuation of the set of motors (108) and enable changing the direction of the one or more wheels (304) on the pre-defined track for autonomous movement of the assembly (100).
2. The assembly (100) as claimed in claim 1, wherein the set of sensors (104) include any or a combination of an image sensor and a balancing sensor.
3. The assembly (100) as claimed in claim 1, wherein the processor (110) is configured to generate a set of alert signals when the images pertaining to
the obstructions associated with the pre-defined region are not matched with the pre-stored obstructions, wherein the set of alert signals are transmitted to an alert unit operatively coupled to the processor (110).
4. The assembly (100) as claimed in claim 3, wherein the alert unit is configured with the assembly (100), and wherein the alert unit includes any or a combination of light emitting diode (LED), buzzer, alarm, and vibrator.
5. The assembly (100) as claimed in claim 1, wherein the assembly (100) includes a frame and a mattress, wherein the mattress is positioned on a top surface of the frame and the one or more wheels (304) are movably coupled with one or more legs of the frame.
6. The assembly (100) as claimed in claim 1, wherein the processor (110) is communicatively coupled to one or more mobile computing devices through a communication unit, wherein the communication unit facilitates transmitting the set of alert signals to the one or more mobile computing devices.
7. The assembly (100) as claimed in claim 1, wherein the direction of the one or more wheels (304) is controlled by the set of motors (108) in response to the received set of actuation signals.
8. The assembly (100) as claimed in claim 1, wherein the direction of the one or more wheels (304) is changed based on the pre-defined track, wherein the changed direction of the one or more wheels (304) facilitates autonomous movement of the stretcher on an obstruction-free track.
9. The assembly (100) as claimed in claim 1, wherein the assembly (100) includes one or more retractable straps configured with the frame of the assembly (100), wherein the one or more retractable straps facilitate providing support and grip to an entity on the mattress of the stretcher.
10. The assembly (100) as claimed in claim 1, wherein the assembly (100) includes a power source coupled to the processor (110), wherein the power source facilitates supplying electric power to the set of motors (108) and enables electric charging of the assembly (100).