Abstract: A digital assisting device to train activities of a user is provided. The device includes a body; an image capturing device configured to capture at least one of a plurality of images, a video, or a combination thereof, of the user, while the user performs activities; a camera adjustable means configured to adjust a field of view of the image capturing device upon receiving an instruction from a view adjustable module on verifying one of the plurality of images, the video, or a combination thereof; a scanning device configured to measure a distance between the user and the digital assisting device using light; a microphone configured to receive one or more voice commands from the user; a wireless communication module configured to enable a centralised platform to connect a plurality of devices wirelessly; and a multimedia unit configured to stream one or more content opted by the user to train the activities of the user. FIG. 1
Claims:
1. A digital assisting device (10) to monitor and train activities of a user comprising:
a body (20) of a pre-defined shape and a pre-defined dimension;
an image capturing device (30) operatively coupled to the body (20), and configured to capture at least one of a plurality of images, a video or a combination thereof, of the user, while the user performs activities at a pre-defined distance from the digital assisting device (10), wherein a digital mat is communicatively coupled to the digital assisting device (10);
one or more processors (40) housed within the body (20);
a camera adjustable means (50) operatively coupled to the body (20), and configured to adjust a field of view of the image capturing device (30) upon receiving an instruction by a view adjustable module (60) on verifying one of the plurality of images, the video, or a combination thereof, wherein the view adjustable module is operable by the one or more processors (40);
a scanning device (65) operatively coupled to the body (20), and configured to measure a distance between the user and the digital assisting device (10) using light;
a microphone (70) operatively coupled to the one or more processors (40), and configured to receive one or more voice commands from the user;
a wireless communication module (80) operable by the one or more processors (40), and configured to enable a centralised platform to connect a plurality of devices wirelessly; and
a multimedia unit (90) operatively coupled to at least one side of the body (20), and configured to stream one or more content opted by the user to train the activities of the user.
2. The digital assisting device (10) as claimed in claim 1, wherein the user performs the activities on a digital mat (100).
3. The digital assisting device (10) as claimed in claim 1, wherein the camera adjustable means (50) comprises one of a manual thumbwheel or an automated stepper motor.
4. The digital assisting device (10) as claimed in claim 1, wherein the scanning device (65) comprises a light detection and ranging (LiDAR) scanner.
5. The digital assisting device (10) as claimed in claim 1, wherein the multimedia unit (90) comprises at least one of a speaker, a liquid crystal display (LCD), a light emitting diode (LED) display, one or more multimedia control units, or a combination thereof.
6. The digital assisting device (10) as claimed in claim 5, comprising:
a projector (110) operatively coupled at a pre-defined location on a back surface of the body (20), and configured to project the content being streamed on one of the liquid crystal display (LCD), the light emitting diode (LED) display, or a combination thereof; and
at least one port (120) placed on a corresponding surface of the body (20), and configured to enable connectivity of at least one of the plurality of devices.
7. A system (130) to monitor and train activities of a user (140) comprising:
a digital mat (100) placed on a flat surface, and configured to enable the user (140) to perform one or more activities on the digital mat (100);
a digital assisting device (10) operatively coupled to the digital mat (100), and configured to assist the user (140) in performing the one or more activities on the digital mat (100), wherein the digital assisting device (10) comprises:
a body (20) of a pre-defined shape and a pre-defined dimension;
an image capturing device (30) operatively coupled to the body (20), and configured to capture at least one of a plurality of images, a video or a combination thereof, of the user (140), while the user (140) performs activities at a pre-defined distance from the digital assisting device (10), wherein the digital mat (100) is communicatively coupled to the digital assisting device (10);
one or more processors (40) housed within the body (20);
a camera adjustable means (50) operatively coupled to the body (20), and configured to adjust a field of view of the image capturing device (30) upon receiving an instruction by a view adjustable module (60) on verifying one of the plurality of images, the video, or a combination thereof, wherein the view adjustable module is operable by the one or more processors (40);
a scanning device (65) operatively coupled to the body (20), and configured to measure a distance between the user (140) and the digital assisting device (10) using light;
a microphone (70) operatively coupled to the one or more processors (40), and configured to receive one or more voice commands from the user (140);
a wireless communication module (80) operable by the one or more processors (40), and configured to enable a centralised platform to connect a plurality of devices wirelessly; and
a multimedia unit (90) operatively coupled to at least one side of the body (20), and configured to stream one or more content opted by the user (140) to train the activities of the user (140).
8. The system (130) as claimed in claim 7, wherein the camera adjustable means (50) comprises one of a manual thumbwheel or an automated stepper motor.
9. The system (130) as claimed in claim 7, wherein the scanning device (65) comprises a light detection and ranging (LiDAR) scanner.
10. The system (130) as claimed in claim 7, wherein the multimedia unit (90) comprises at least one of a speaker, a liquid crystal display (LCD), a light emitting diode (LED) display, one or more multimedia control units, or a combination thereof.
11. The system (130) as claimed in claim 10, comprising:
a projector (110) operatively coupled at a pre-defined location on a back surface of the body (20), and configured to project the content being streamed on one of the liquid crystal display (LCD), the light emitting diode (LED) display, or a combination thereof; and
at least one port (120) placed on a corresponding surface of the body (20), and configured to enable connectivity of at least one of the plurality of devices.
Dated this 11th day of December 2020
Signature
Harish Naidu
Patent Agent (IN/PA-2896)
Agent for applicant
Description: FIELD OF INVENTION
[0001] Embodiments of the present disclosure relate to training activities of a user, and more particularly to a digital assisting device to monitor and train the activities of the user.
BACKGROUND
[0002] Human activities are defined as the various actions performed by people for recreation, living, or necessity. The activities may include leisure, entertainment, industry, recreation, war, and exercise. One such exercise may include one of yoga, gym, gymnastics, Pilates, and the like. In a conventional approach, the activities performed by the user are monitored physically by a trainer, and further, on keen observation, the trainer guides and trains the user to perform the activities in a better manner. However, the physical observation of the activities being performed by the user may not be accurate and is prone to many human errors. Hence, the conventional approach is less reliable and less accurate.
[0003] In comparison with the conventional approach, a newer approach uses a system which provides graphical illustrations on an exercise mat for the user to follow while performing the exercise. However, in such a system the exercise done by the user is not monitored, and hence the user is neither guided nor tracked. Also, the graphical illustrations may end up confusing the user performing the exercise. In addition, such a system does not track the activities being performed by the user in real time, and thereby monitoring of the user does not take place in such an approach. Furthermore, syncing data from one or more external devices in order to monitor the user is difficult, as it may involve a plurality of external devices.
[0004] Hence, there is a need for an improved system to monitor and train the activities of the user to address the aforementioned issues.
BRIEF DESCRIPTION
[0005] In accordance with one embodiment of the disclosure, a digital assisting device to monitor and train activities of a user is provided. The device includes a body of a pre-defined shape and a pre-defined dimension. The device also includes an image capturing device operatively coupled to the body. The image capturing device is configured to capture at least one of a plurality of images, a video, or a combination thereof, of the user, while the user performs activities at a pre-defined distance from the digital assisting device. The device also includes one or more processors housed within the body. The device also includes a camera adjustable means operatively coupled to the body. The camera adjustable means is configured to adjust a field of view of the image capturing device upon receiving an instruction from a view adjustable module on verifying one of the plurality of images, the video, or a combination thereof. The device further includes a scanning device operatively coupled to the body. The scanning device is configured to measure a distance between the user and the digital assisting device using light. The device also includes a microphone operatively coupled to the one or more processors. The microphone is configured to receive one or more voice commands from the user. The device also includes a wireless communication module configured to enable a centralised platform to connect a plurality of devices wirelessly. The device also includes a multimedia unit operatively coupled to at least one side of the body. The multimedia unit is configured to stream one or more content opted by the user to train the activities of the user.
[0006] To further clarify the advantages and features of the present disclosure, a more particular description of the disclosure will follow by reference to specific embodiments thereof, which are illustrated in the appended figures. It is to be appreciated that these figures depict only typical embodiments of the disclosure and are therefore not to be considered limiting in scope. The disclosure will be described and explained with additional specificity and detail with the appended figures.
BRIEF DESCRIPTION OF THE DRAWINGS
[0007] The disclosure will be described and explained with additional specificity and detail with the accompanying figures in which:
[0008] FIG. 1 is a schematic representation of a front view of a digital assisting device in accordance with an embodiment of the present disclosure;
[0009] FIG. 2 is a schematic representation of a back view of the digital assisting device of FIG. 1 in accordance with an embodiment of the present disclosure;
[00010] FIG. 3 is a schematic representation of a front view of a system to train activities of a user in accordance with an embodiment of the present disclosure;
[00011] FIG. 4 is a schematic representation of a back view of the system of FIG. 3 to train activities of a user in accordance with an embodiment of the present disclosure; and
[00012] FIG. 5 is a block diagram representation of a processing subsystem located on a local server or on a remote server in accordance with an embodiment of the present disclosure.
[00013] Further, those skilled in the art will appreciate that elements in the figures are illustrated for simplicity and may not have necessarily been drawn to scale. Furthermore, in terms of the construction of the device, one or more components of the device may have been represented in the figures by conventional symbols, and the figures may show only those specific details that are pertinent to understanding the embodiments of the present disclosure so as not to obscure the figures with details that will be readily apparent to those skilled in the art having the benefit of the description herein.
DETAILED DESCRIPTION
[00014] For the purpose of promoting an understanding of the principles of the disclosure, reference will now be made to the embodiment illustrated in the figures and specific language will be used to describe them. It will nevertheless be understood that no limitation of the scope of the disclosure is thereby intended. Such alterations and further modifications in the illustrated system, and such further applications of the principles of the disclosure as would normally occur to those skilled in the art are to be construed as being within the scope of the present disclosure.
[00015] The terms "comprises", "comprising", or any other variations thereof, are intended to cover a non-exclusive inclusion, such that a process or method that comprises a list of steps does not include only those steps but may include other steps not expressly listed or inherent to such a process or method. Similarly, one or more devices or sub-systems or elements or structures or components preceded by "comprises... a" does not, without more constraints, preclude the existence of other devices, sub-systems, elements, structures, components, additional devices, additional sub-systems, additional elements, additional structures or additional components. Appearances of the phrase "in an embodiment", "in another embodiment" and similar language throughout this specification may, but not necessarily do, all refer to the same embodiment.
[00016] Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by those skilled in the art to which this disclosure belongs. The system, methods, and examples provided herein are only illustrative and not intended to be limiting.
[00017] In the following specification and the claims, reference will be made to a number of terms, which shall be defined to have the following meanings. The singular forms “a”, “an”, and “the” include plural references unless the context clearly dictates otherwise.
[00018] Embodiments of the present disclosure relate to a digital assisting device to train activities of a user. As used herein, the term “digital assisting device” is defined as a type of electronic device which is designed to function as an information manager for the user. In this invention, the digital assisting device is designed to assist the user in one or more activities being performed by the user. In one embodiment, the one or more activities may be exercises. In such embodiment, the exercise may be one of yoga, gym, gymnastics, Pilates, and the like. More specifically, the digital assisting device may assist in tracking, monitoring, and guiding the user while performing the exercise.
[00019] Turning to FIG. 1 and FIG. 2, FIG. 1 is a schematic representation of a front view of a digital assisting device (10) in accordance with an embodiment of the present disclosure. FIG. 2 is a schematic representation of a back view of the digital assisting device of FIG. 1 in accordance with an embodiment of the present disclosure. The device (10) includes a body (20) of a pre-defined shape and a pre-defined dimension. As used herein, the term ‘body’ may be defined as the structure of the whole tower device. In one embodiment, the body (20) may be considered as a tower device to track the activities of the user. In such embodiment, the pre-defined shape may be one of cylindrical, cubical, cuboidal, pentagonal, hexagonal, or the like. In such embodiment, the device (10) may be designed like a hexagonal cylinder, with honeycomb exterior patterns, or the like. In one exemplary embodiment, the height of the device may be about 1-2 feet from ground level.
[00020] In one exemplary embodiment, the user may perform the activities on a digital mat. In such embodiment, the digital mat may be a digital exercise mat (100). As used herein, the term ‘digital mat’ may be defined as a mat which may include a plurality of sensors, actuators, or the like, configured to determine certain parameters of the user while performing the activities. In one specific embodiment, a heatmap from the digital mat (100), associated with the one or more activities, may be stored in a storage medium and further extracted to analyse and track the one or more activities of the user on the digital mat (100). More specifically, the heatmap may include positions, pressure points, pressure levels, or the like, associated with the one or more activities being performed by the user. In such embodiment, the heatmap may be generated upon integrating data from one or more internal sources, one or more external sources, or a combination thereof. In one embodiment, the one or more internal sources may include the plurality of sensors, actuators, or the like. In another embodiment, the one or more external sources may include a wearable device, which may include, but is not limited to, a smart wrist band, a watch, or the like.
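The heatmap generation described above could be sketched as follows. This is a hypothetical illustration, not the claimed implementation: the grid layout, the normalized reading scale, and the threshold value are assumptions introduced for the example.

```python
# Illustrative sketch: folding raw mat-sensor readings into a pressure
# heatmap and a list of active pressure points. The 0.0-1.0 reading scale
# and the threshold are assumptions, not taken from the specification.

PRESSURE_THRESHOLD = 0.2  # normalized reading below which a cell is idle

def build_heatmap(readings, rows, cols):
    """Fold a flat list of normalized sensor readings into a rows x cols
    grid and collect the (row, col) positions of active pressure points."""
    grid = [readings[r * cols:(r + 1) * cols] for r in range(rows)]
    pressure_points = [
        (r, c)
        for r in range(rows)
        for c in range(cols)
        if grid[r][c] >= PRESSURE_THRESHOLD
    ]
    return grid, pressure_points

# Example: a 2 x 3 mat segment with two loaded cells (e.g. heel and palm)
grid, points = build_heatmap([0.0, 0.9, 0.0, 0.0, 0.0, 0.6], rows=2, cols=3)
```

A real mat would stream such grids continuously, and the stored sequence of heatmaps is what the device could later extract to track an activity over time.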
[00021] The device (10) also includes an image capturing device (30) operatively coupled to the body (20). In one embodiment, the image capturing device (30) may be a camera of any kind. In one specific embodiment, the image capturing device (30) may be placed on a front surface of the body (20), wherein the front surface may be facing the user while the user may be performing the activities. Further the image capturing device (30) is configured to capture at least one of a plurality of images, a video or a combination thereof, of the user, while the user performs activities at a pre-defined distance from the digital assisting device (10).
[00022] Furthermore, the device (10) includes one or more processors (40) housed within the body (20). In one exemplary embodiment, the one or more processors (40) may include compute cores such as a CPU, a GPU, a DSP, or the like, and an FPGA. In another embodiment, the one or more processors (40) may be built around one of a single-board computer (SBC), a system on module (SoM), a land grid array (LGA) package, a system on chip (SoC), or the like, which may be configured to maintain the one or more features of the device (10).
[00023] The device (10) also includes a camera adjustable means (50) operatively coupled to the body (20). The camera adjustable means (50) is configured to adjust a field of view of the image capturing device (30) upon receiving an instruction by a view adjustable module (60) on verifying one of the plurality of images, a video or a combination thereof.
[00024] In operation, the image or the video of the user is captured by the image capturing device (30) and is transmitted to the view adjustable module (60) to analyse whether or not the user is within the field of view of the image capturing device (30). Upon analysing, the view adjustable module (60) generates one of a positive result or a negative result. In one embodiment, the positive result may represent that the user is well within the field of view of the image capturing device (30). In another embodiment, the negative result may represent that the user is not within the field of view of the image capturing device (30). In such embodiment, the view adjustable module (60) generates an electrical signal and transmits the same to the camera adjustable means (50) to adjust the position of the image capturing device (30) automatically in real time.
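The verify-then-adjust loop above could be sketched as follows. This is an illustrative assumption: a real view adjustable module would detect the user's bounding box with a person detector, and the margin and pan/tilt convention here are invented for the example.

```python
# Hypothetical sketch of the view adjustable module's logic: a positive
# result means the user's bounding box lies inside the frame; a negative
# result yields a pan/tilt signal for the camera adjustable means.

def verify_in_view(bbox, frame_w, frame_h, margin=0.05):
    """Positive result if the bounding box sits inside the frame with a
    small safety margin; negative result otherwise."""
    x0, y0, x1, y1 = bbox
    mx, my = frame_w * margin, frame_h * margin
    return mx <= x0 and my <= y0 and x1 <= frame_w - mx and y1 <= frame_h - my

def adjustment_signal(bbox, frame_w, frame_h):
    """On a negative result, emit the pan/tilt direction toward the user;
    on a positive result, emit nothing (no electrical signal needed)."""
    if verify_in_view(bbox, frame_w, frame_h):
        return None
    x0, y0, x1, y1 = bbox
    cx, cy = (x0 + x1) / 2, (y0 + y1) / 2
    pan = "left" if cx < frame_w / 2 else "right"
    tilt = "up" if cy < frame_h / 2 else "down"
    return (pan, tilt)

# User drifted toward the lower-right edge of a 1920x1080 frame:
signal = adjustment_signal((1700, 300, 1950, 900), 1920, 1080)
```

Run once per captured frame, this loop keeps the user centred without any manual intervention, which is the real-time behaviour the paragraph describes.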
[00025] The device (10) also includes a scanning device (65) operatively coupled to the body (20). The scanning device (65) is configured to measure a distance between the user and the digital assisting device using light. In one embodiment, the scanning device (65) may be a light detection and ranging (LiDAR) scanner. As used herein, the term “LiDAR” is defined as a device which measures distances by illuminating the target with light and measuring the reflection with a sensor. Differences in laser return times and wavelengths can then be used to make digital 3-D representations of the target. In one specific embodiment, the light used by the LiDAR sensor is laser light. The LiDAR sensor works on a principle similar to that of radar. In operation, the scanning device (65) transmits light from the device (10), and upon receiving the reflection of the light from the user (140) (as shown in FIGs. 3 and 4), the distance between the user and the device (10) is monitored in real time. In one exemplary embodiment, the scanning device (65) may be one of a white-light-with-optics device, a thermal device, an IR device, a PIR device, an ultrasound device, or the like, or a combination thereof.
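The time-of-flight calculation underlying the LiDAR distance measurement is simple enough to state exactly: the round-trip time of the reflected pulse is halved (the light travels to the user and back) and multiplied by the speed of light. A minimal sketch:

```python
# Time-of-flight distance: d = c * t / 2, where t is the round-trip
# time of the reflected laser pulse. The example timing is illustrative.

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def distance_from_round_trip(round_trip_seconds):
    """Distance to the user = speed of light * round-trip time / 2."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse returning after 20 nanoseconds puts the user roughly 3 m away.
d = distance_from_round_trip(20e-9)
```

The nanosecond scale of the timings is why LiDAR scanners need dedicated timing hardware rather than a general-purpose processor loop.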
[00026] Furthermore, the device (10) includes a microphone (70) operatively coupled to the one or more processors (40). The microphone (70) is configured to receive one or more voice commands from the user. In one embodiment, the one or more commands from the user may be used to enable one or more features of the device (10). In such embodiment, the one or more features may include, but are not limited to, opting for a specific instruction or program associated with the one or more activities the user wishes to perform, enabling one or more multimedia devices housed within the body of the device, enabling a live conversation with an instructor or a trainer training the user to perform the one or more activities, or the like. In one specific embodiment, the one or more processors (40) may be instructed to enable the operation of the device (10) upon identifying the voice of an authorized user only.
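The command path above (recognize the speaker, then dispatch the command to a feature) could be sketched as follows. The feature names and the toy speaker check are illustrative assumptions; real speaker verification would compare voice embeddings, not identifiers.

```python
# Hypothetical sketch of voice-command dispatch with an authorized-user
# gate. Command phrases and feature names are invented for illustration.

FEATURES = {
    "start yoga": "load yoga training program",
    "play music": "enable multimedia unit",
    "call trainer": "open live conversation with trainer",
}

def handle_command(speaker_id, command, authorized_ids):
    """Act on a recognized voice command only for an authorized speaker."""
    if speaker_id not in authorized_ids:
        return "ignored: unauthorized voice"
    return FEATURES.get(command, "unknown command")

result = handle_command("user-1", "start yoga", authorized_ids={"user-1"})
```

Gating on speaker identity before dispatch, as in the specific embodiment above, keeps an unauthorized voice from activating any feature of the device.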
[00027] The device (10) further includes a wireless communication module (80) operable by the one or more processors (40). The wireless communication module (80) is configured to enable a centralised platform to connect a plurality of devices wirelessly. In one embodiment, the plurality of devices may be communicatively coupled to the centralised platform via a communication medium. In such embodiment, the wireless communication medium may include a WiFi medium, a Bluetooth medium, an NFC medium, a BLE medium, or the like.
[00028] The device (10) also includes a multimedia unit (90) operatively coupled to at least one side of the body (20). The multimedia unit (90) is configured to stream one or more content opted by the user to train the activities of the user. In one embodiment, the multimedia unit (90) may include at least one of a speaker, a liquid crystal display (LCD), a light emitting diode (LED) display, one or more multimedia control units, or a combination thereof. In one exemplary embodiment, the multimedia unit (90) may be used to display at least one of an image or a video of a trainer (142) (as shown in FIGs. 3 and 4) performing the corresponding one or more activities while training the user. In such embodiment, the display of the image or the video of the trainer (142) may be synced with a virtual background (145) (as shown in FIG. 4) based on the user’s choice.
[00029] In another exemplary embodiment, the device (10) may further include a projector (110) operatively coupled at a pre-defined location on a back surface of the body (20). The projector (110) may be configured to project the content being streamed on one of the liquid crystal display (LCD), the light emitting diode (LED) display, or a combination thereof. In one embodiment, the projector (110) may project at least one of an image or a video of the trainer performing the corresponding one or more activities while training the user, the plurality of images or the video of the user performing the one or more activities, or a combination thereof. In one specific embodiment, the display unit of the multimedia unit (90) may be touch enabled.
[00030] The device (10) may also include at least one port (120) placed on a corresponding surface of the body (20). The at least one port (120) may be configured to enable connectivity of at least one of the plurality of devices. In one embodiment, the at least one port may include a USB port, a UART port, an audio jack, an HDMI port, or the like.
[00031] In one specific embodiment, the device (10) may include at least one battery which may be configured to provide electrical supply to all the components housed within the body (20) of the device (10). In such embodiment, the at least one battery may be a rechargeable battery.
[00032] In another specific embodiment, the device (10) may include input switches such as a power ON/OFF switch, media controls, and status indication, as well as decorative LEDs such as a ring along a rim of the device (10).
[00033] Turning to FIG. 3 and FIG. 4, FIG. 3 is a schematic representation of a front view of a system (130) to train activities of a user (140) in accordance with an embodiment of the present disclosure. FIG. 4 is a schematic representation of a back view of the system of FIG. 3 in accordance with an embodiment of the present disclosure. The system (130) includes a digital mat (100) placed on a flat surface. The digital mat (100) is configured to enable the user to perform one or more activities on the digital mat (100).
[00034] The system (130) also includes a digital assisting device (10) operatively coupled to the digital mat (100). The device (10) is configured to assist the user (140) in performing the one or more activities on the digital mat (100). The device (10) includes a body (20) of a pre-defined shape and a pre-defined dimension. The device (10) also includes an image capturing device (30) operatively coupled to the body (20). The image capturing device (30) is configured to capture at least one of a plurality of images, a video, or a combination thereof, of the user, while the user performs activities at a pre-defined distance from the digital assisting device (10).
[00035] Furthermore, the device (10) includes one or more processors (40) housed within the body (20). The device (10) also includes a camera adjustable means (50) operatively coupled to the body (20). The camera adjustable means (50) is configured to adjust a field of view of the image capturing device (30) upon receiving an instruction from a view adjustable module (60) on verifying one of the plurality of images, the video, or a combination thereof. The device (10) further includes a scanning device (65) operatively coupled to the body (20). The scanning device (65) is configured to measure a distance between the user (140) and the digital assisting device (10) using light. The device (10) also includes a microphone (70) operatively coupled to the one or more processors (40). The microphone (70) is configured to receive one or more voice commands from the user (140). The device (10) also includes a wireless communication module (80) configured to enable a centralised platform to connect a plurality of devices wirelessly. The device (10) also includes a multimedia unit (90) operatively coupled to at least one side of the body (20). The multimedia unit (90) is configured to stream one or more content opted by the user to train the activities of the user (140).
[00036] In one embodiment, the device (10) may include a projector (110) operatively coupled at a pre-defined location on a back surface of the body (20). The projector (110) is configured to project the content being streamed on one of the liquid crystal display (LCD), the light emitting diode (LED) display, or a combination thereof. The device (10) may also include at least one port (120) placed on a corresponding surface of the body (20). The at least one port (120) may be configured to enable connectivity of at least one of the plurality of devices.
[00037] Furthermore, it should be noted that the digital mat (100), the digital assisting device (10), the body (20), the image capturing device (30), the one or more processors (40), the camera adjustable means (50), the view adjustable module (60), the scanning device (65), the microphone (70), the wireless communication module (80), the multimedia unit (90), the projector (110), and the at least one port (120) of FIG. 3 are substantially similar to the digital mat (100), the digital assisting device (10), the body (20), the image capturing device (30), the one or more processors (40), the camera adjustable means (50), the view adjustable module (60), the scanning device (65), the microphone (70), the wireless communication module (80), the multimedia unit (90), the projector (110), and the at least one port (120) of FIG. 1. Hence, all the embodiments disclosed for the elements of FIG. 1 hold good for all the elements disclosed in FIG. 3 and FIG. 4.
[00038] FIG. 5 is a block diagram representation of a processing subsystem located on a local server or on a remote server in accordance with an embodiment of the present disclosure. The server (155) includes processor(s) (150) and memory (160) operatively coupled to a bus (170).
[00039] The processor(s) (150), as used herein, means any type of computational circuit, such as, but not limited to, a microprocessor, a microcontroller, a complex instruction set computing microprocessor, a reduced instruction set computing microprocessor, a very long instruction word microprocessor, an explicitly parallel instruction computing microprocessor, a digital signal processor, or any other type of processing circuit, or a combination thereof.
[00040] The memory (160) includes a plurality of modules stored in the form of an executable program which instructs the processor(s) (150). The modules of the memory (160) are substantially similar to those of the device (10) of FIG. 1. The memory (160) has the following modules: a view adjustable module (60) and a wireless communication module (80).
[00041] The view adjustable module (60) is configured to verify one of the plurality of images, a video, or a combination thereof, associated with the one or more activities of a user performed on a digital mat. The wireless communication module (80) is configured to enable a centralised platform to connect a plurality of devices wirelessly.
[00042] Various embodiments of the present disclosure enable the device to act as a standalone and independent device for the overall monitoring, tracking, and guiding of the activities of the user. The small dimensions of the device make it not only compact but also portable. The device provides an altogether new and holistic immersive experience for the user, with up to 270 degrees of visual and stereophonic audio experience, with streaming to TV/LCD-like displays or projection on the surrounding walls. The content of the visuals may include, but is not limited to, videos of teaching or coaching or the like, the creation of the experience of being virtually present at an environment or place of one's choice, the streaming of visuals from the cameras, and also playback of music or audio of choice as well as instructions in the language and voice of the user's choice.
[00043] Also, the device can create visuals including, but not limited to, those of a virtual class or of chosen friends or fellow practitioners in a live coordinated or supervised activity session. Furthermore, the device can be used in an indoor or an outdoor environment, thereby making the device more reliable and efficient. In addition, the portable digital device may be viewed as a self-contained system encompassing processing elements, sensing elements, communication interfaces, and the like, along with integrated multimedia elements, for use in human activity monitoring and training. Also, since the device is portable with a rechargeable battery, the device can be used either indoors on socket power or outdoors on battery power. Furthermore, the device, with its accompaniment of processors, sensors, and communication interfaces, can alone be used for human activity monitoring and training.
[00044] While specific language has been used to describe the disclosure, any limitations arising on account of the same are not intended. As would be apparent to a person skilled in the art, various working modifications may be made to the method in order to implement the inventive concept as taught herein.
[00045] The figures and the foregoing description give examples of embodiments. Those skilled in the art will appreciate that one or more of the described elements may well be combined into a single functional element. Alternatively, certain elements may be split into multiple functional elements. Elements from one embodiment may be added to another embodiment. For example, order of processes described herein may be changed and are not limited to the manner described herein. Moreover, the actions of any flow diagram need not be implemented in the order shown; nor do all of the acts need to be necessarily performed. Also, those acts that are not dependent on other acts may be performed in parallel with the other acts. The scope of embodiments is by no means limited by these specific examples.
| # | Name | Date |
|---|---|---|
| 1 | 202041053954-STATEMENT OF UNDERTAKING (FORM 3) [11-12-2020(online)].pdf | 2020-12-11 |
| 2 | 202041053954-PROOF OF RIGHT [11-12-2020(online)].pdf | 2020-12-11 |
| 3 | 202041053954-POWER OF AUTHORITY [11-12-2020(online)].pdf | 2020-12-11 |
| 4 | 202041053954-FORM FOR STARTUP [11-12-2020(online)].pdf | 2020-12-11 |
| 5 | 202041053954-FORM FOR SMALL ENTITY(FORM-28) [11-12-2020(online)].pdf | 2020-12-11 |
| 6 | 202041053954-FORM 1 [11-12-2020(online)].pdf | 2020-12-11 |
| 7 | 202041053954-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [11-12-2020(online)].pdf | 2020-12-11 |
| 8 | 202041053954-EVIDENCE FOR REGISTRATION UNDER SSI [11-12-2020(online)].pdf | 2020-12-11 |
| 9 | 202041053954-DRAWINGS [11-12-2020(online)].pdf | 2020-12-11 |
| 10 | 202041053954-DECLARATION OF INVENTORSHIP (FORM 5) [11-12-2020(online)].pdf | 2020-12-11 |
| 11 | 202041053954-COMPLETE SPECIFICATION [11-12-2020(online)].pdf | 2020-12-11 |
| 12 | 202041053954-RELEVANT DOCUMENTS [12-01-2021(online)].pdf | 2021-01-12 |
| 13 | 202041053954-FORM 13 [12-01-2021(online)].pdf | 2021-01-12 |
| 14 | 202041053954-AMMENDED DOCUMENTS [12-01-2021(online)].pdf | 2021-01-12 |
| 15 | 202041053954-STARTUP [11-12-2024(online)].pdf | 2024-12-11 |
| 16 | 202041053954-FORM28 [11-12-2024(online)].pdf | 2024-12-11 |
| 17 | 202041053954-FORM 18A [11-12-2024(online)].pdf | 2024-12-11 |
| 18 | 202041053954-FER.pdf | 2025-01-09 |
| 19 | 202041053954-FORM-8 [16-04-2025(online)].pdf | 2025-04-16 |
| 20 | 202041053954-Form-4 u-r 12(5) [23-04-2025(online)].pdf | 2025-04-23 |
| 21 | 202041053954-FORM-26 [23-04-2025(online)].pdf | 2025-04-23 |
| 22 | 202041053954-FORM 3 [24-04-2025(online)].pdf | 2025-04-24 |
| 23 | 202041053954-FER_SER_REPLY [08-07-2025(online)].pdf | 2025-07-08 |
| 24 | 202041053954-CLAIMS [08-07-2025(online)].pdf | 2025-07-08 |
| 25 | 202041053954-PatentCertificate01-10-2025.pdf | 2025-10-01 |
| 26 | 202041053954-IntimationOfGrant01-10-2025.pdf | 2025-10-01 |
| 27 | 202041053954SearchE_06-01-2025.pdf | 2025-01-06 |