
An Artificial Intelligence Based Life Library Data Collection And Organization System

Abstract: The present invention relates to an artificial intelligence-based life library data collection and organization system(100). The system(100) includes a data collection device(102) and a computing device(110). The data collection device(102) includes a camera sensor(104), a microphone(106), and a mounting clip(118). The computing device(110) is wirelessly connected to the data collection device(102). The computing device(110) receives the video feed, brain activity data, heartbeat rate, and hormonal data from the data collection device(102), and executes artificial intelligence-based computer-readable instructions to store external events as well as the feelings of the person during those events, and to organize the events in the proper library.


Patent Information

Application #
Filing Date
07 September 2020
Publication Number
10/2022
Publication Type
INA
Invention Field
BIO-MEDICAL ENGINEERING
Status
Email
ishasharmasharma1987@gmail.com
Parent Application

Applicants

Amit Sharma
C-5/B, 36-C, Janak Puri, New Delhi - 110058

Inventors

1. Amit Sharma
C-5/B, 36-C, Janak Puri, New Delhi - 110058
2. Arjit Sachdeva
C-5/C, 44B, First Floor, Janakpuri, New Delhi - 110058

Specification

The present invention relates to an artificial intelligence-based life library data collection and organization system. More specifically, the present invention relates to a life library system consisting of a recording device and a companion artificial intelligence-based analysis system.
BACKGROUND OF THE INVENTION
Human beings live through many different types of experiences during their lifetime. These experiences involve various people, places, and environments. Experiences encountered during a person's lifetime can contribute hugely to personal development. They can also help maintain a healthy mental state, as positive memories from the past can inspire and motivate. People may also want to recollect important events so they can draw useful insights and devise solutions to problems from the past. Writing diary entries has long been a popular method of documenting one's life. Although it is a good way to record information, it lacks an effective way to log visual information. Later, as technology improved, cameras and video equipment were developed that could store visual memories and experiences in the form of pictures and videos. These are more effective and information-rich than text-only methods such as diary entries, but they still lack organization and may not always be available to store information. They have to be organized manually into albums and collections. They also cannot record other details associated with a person's experiences, such as his state of mind and feelings. Apart from external experiences and interactions, there are additional aspects of a person's life such as thoughts, ideas, and feelings. No traditional, widely available recording technology can record such information.

US2016127641A1 discloses a media capture device (MCD) that provides a multi-sensor, free flight camera platform with advanced learning technology to replicate the desires and skills of the purchaser/owner is provided. Advanced algorithms may uniquely enable many functions for autonomous and revolutionary photography. The device may learn about the user, the environment, and/or how to optimize a photographic experience so that compelling events may be captured and composed into efficient and emotional sharing. The device may capture better photos and videos as perceived by one's social circle of friends, and/or may greatly simplify the process of using a camera to the ultimate convenience of fully autonomous operation.
The existing inventions are not able to overcome the problems associated with recording a person's experiences, such as his state of mind and feelings. The existing inventions are complex and are not cost-effective. Thus, there is a need for the present invention to overcome the above-mentioned problems.
OBJECTIVE OF THE INVENTION
The main objective of the present invention is to construct an artificial intelligence-based life library data collection and organization system.
Another objective of the present invention is to monitor a person’s experiences such as his state of mind and feelings.
Yet another objective of the present invention is to record types of data associated with the person such as brain activity.
Yet another objective of the present invention is to collect visual as well as other types of data associated with personal experiences through a data collection device, and to provide a companion artificial intelligence-based software application which analyses and organizes the information collected by the data collection device.
Yet another objective of the present invention is to effectively help the user.
Yet another objective of the present invention is to develop an easily navigable library of a person’s experiences, interactions, and thoughts.
Further objectives, advantages, and features of the present invention will become apparent from the detailed description provided herein below, in which various embodiments of the disclosed invention are illustrated by way of example.
SUMMARY OF THE INVENTION
The present invention relates to an artificial intelligence-based life library data collection and organization system. The present invention includes a data collection device and a computing device. The data collection device includes a camera sensor, a microphone, and a mounting clip. The camera sensor is mounted on the front of the data collection device to capture live video of the area in front of the data collection device. The microphone is mounted on the data collection device; with the help of the microphone, the data collection device records the audio of the live video that is captured by the camera sensor. The mounting clip is used to attach the data collection device to the user's body. The computing device is wirelessly connected to the data collection device. The computing device receives the video feed, brain activity data, heartbeat rate, and hormonal data from the data collection device, and executes artificial intelligence-based computer-readable instructions to store external events as well as the feelings of the person during those events, and to organize the events in the proper library. Herein, the user is able to search for a video on the computing device based on the event, person, time, and feelings, and watch the same on the computing device. In an embodiment, the computing device executes artificial intelligence-based computer-readable instructions that estimate the feelings of the user from brain activity mapping, heart rate, and hormonal level capturing. In an embodiment, the data collection device further includes a brain activity sensor, a heart activity sensor, and a hormonal sensor. The brain activity sensor collects brain activity data during live video events. The heart activity sensor collects the heartbeat rate during live video events. The hormonal sensor collects hormonal data during live video events.
In an embodiment, the data collection device collects the brain activity data, heartbeat rate, and hormonal data from the brain activity sensor, the heart activity sensor, and the hormonal sensor respectively, and sends them to the computing device, which executes artificial intelligence-based computer-readable instructions to store external events as well as the feelings of the person during those events and to organize the events in the library.
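The data flow summarized above, synchronized capture on the data collection device followed by wireless hand-off to the computing device, can be sketched as follows. This is a minimal illustration only: the `SensorRecord` layout, field names, and sample values are assumptions, as the specification does not define a concrete data format or transfer protocol.

```python
import time
from dataclasses import dataclass, field

@dataclass
class SensorRecord:
    """One synchronized sample from the data collection device (102).

    Field names are illustrative; the specification only states that the
    device forwards video, audio, brain activity, heartbeat rate, and
    hormonal data to the computing device (110).
    """
    timestamp: float
    video_frame: bytes           # frame from the camera sensor (104)
    audio_chunk: bytes           # audio from the microphone (106)
    brain_activity: list         # readings from the brain activity sensor (112)
    heart_rate_bpm: float        # heart activity sensor (114)
    hormone_levels: dict = field(default_factory=dict)  # hormonal sensor (116)

def send_to_computing_device(record: SensorRecord, library: list) -> None:
    """Stand-in for the wireless link: the computing device appends each
    received record to the growing life library."""
    library.append(record)

# Usage: simulate one capture-and-transfer cycle.
library = []
sample = SensorRecord(
    timestamp=time.time(),
    video_frame=b"<frame>",
    audio_chunk=b"<audio>",
    brain_activity=[0.12, 0.34],
    heart_rate_bpm=72.0,
    hormone_levels={"cortisol": 0.4},
)
send_to_computing_device(sample, library)
```

In a real device the wireless transfer would be a Bluetooth or Wi-Fi channel rather than an in-memory list; the list merely makes the receive-and-store step concrete.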
The main advantage of the present invention is that the present invention helps to construct artificial intelligence-based life library data collection and organization systems.
Another advantage of the present invention is that the present invention helps to monitor a person’s experiences such as his state of mind and feelings.
Yet another advantage of the present invention is that the present invention is an easy and cost-effective device.
Yet another advantage of the present invention is that the present invention records types of data associated with the person such as brain activity.
Yet another advantage of the present invention is that the present invention collects visual as well as other types of data associated with personal experiences and provides a companion artificial intelligence-based software application which analyses and organizes the information collected by the data collection device.
Yet another advantage of the present invention is that it helps to develop an easily navigable library of a person's experiences, interactions, and thoughts.
Further objectives, advantages, and features of the present invention will become apparent from the detailed description provided herein below, in which various embodiments of the disclosed invention are illustrated by way of example.
BRIEF DESCRIPTION OF THE DRAWINGS
The accompanying drawings are incorporated in and constitute a part of this specification to provide a further understanding of the invention. The drawings illustrate one embodiment of the invention and together with the description, serve to explain the principles of the invention.
Fig. 1 illustrates the front view of a data collection device.
Fig. 2 illustrates the backside view of a data collection device.
Fig. 3 illustrates a side view of a data collection device.
Fig. 4 illustrates a method of the present invention.
DETAILED DESCRIPTION OF THE INVENTION
Definition
The terms “a” or “an”, as used herein, are defined as one or as more than one. The term “plurality”, as used herein, is defined as two or more than two. The term “another”, as used herein, is defined as at least a second or more. The terms “including” and/or “having”, as used herein, are defined as comprising (i.e., open language). The term “coupled”, as used herein, is defined as connected, although not necessarily directly, and not necessarily mechanically.
The term “comprising” is not intended to limit inventions to only claiming the present invention with such comprising language. Any invention using the term comprising could be separated into one or more claims using “consisting” or “consisting of” claim language and is so intended. The term “comprising” is used interchangeably with the terms “having” or “containing”.
Reference throughout this document to “one embodiment”, “certain embodiments”, “an embodiment”, “another embodiment”, and “yet another embodiment” or similar terms means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, the appearances of such phrases in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments without limitation.
The term “or” as used herein is to be interpreted as an inclusive or meaning any one or any combination. Therefore, “A, B or C” means any of the following: “A; B; C; A and B; A and C; B and C; A, B and C”. An exception to this definition will occur only when a combination of elements, functions, steps, or acts are in some way inherently mutually exclusive.
As used herein, the term "one or more" generally refers to, but not limited to, singular as well as the plural form of the term.
The drawings featured in the figures are to illustrate certain convenient embodiments of the present invention and are not to be considered as a limitation to that. Term "means" preceding a present participle of operation indicates the desired function for which there is one or more embodiments, i.e., one or more methods, devices, or apparatuses for achieving the desired function and that one skilled in the art could select from these or their equivalent in view of the disclosure herein and use of the term "means" is not intended to be limiting.
Fig. 1 illustrates the front view of a data collection device(102). The front view of the data collection device(102) includes a camera sensor(104), and a microphone(106). The camera sensor(104) is mounted on the front of the data collection device(102). The microphone(106) is mounted on the data collection device(102).
Fig. 2 illustrates the backside view of a data collection device(102). In an embodiment, the backside view of the data collection device(102) includes a mounting clip(118), a brain activity sensor(112), a heart activity sensor(114), and a hormonal sensor(116).
Fig. 3 illustrates a side view of a data collection device(102). The side view of the data collection device(102) includes a camera sensor(104) and a mounting clip(118). The camera sensor(104) is mounted on the front of the data collection device(102). In an embodiment, the data collection device(102) further includes a brain activity sensor(112), a heart activity sensor(114), and a hormonal sensor(116).
Fig. 4 illustrates a method for artificial intelligence-based life library data collection and organization. The present invention includes a data collection device(102) and a computing device(110). The computing device(110) is wirelessly connected to the data collection device(102). The computing device(110) receives the video feed, brain activity data, heartbeat rate, and hormonal data from the data collection device(102), and executes artificial intelligence-based computer-readable instructions to store external events as well as the feelings of the person during those events and to organize the events in the proper library.
The present invention relates to an artificial intelligence-based life library data collection and organization system. The present invention includes a data collection device and a computing device. The data collection device includes a camera sensor, a microphone, and a mounting clip. The camera sensor is mounted on the front of the data collection device to capture live video of the area in front of the data collection device. In an embodiment, the camera sensor includes, but is not limited to, an RGB camera, a thermal camera, and an infrared camera. The microphone is mounted on the data collection device; with the help of the microphone, the data collection device records the audio of the live video that is captured by the camera sensor. The mounting clip is used to attach the data collection device to the user's body. The computing device is wirelessly connected to the data collection device. The computing device receives the video feed, brain activity data, heartbeat rate, and hormonal data from the data collection device, and executes artificial intelligence-based computer-readable instructions to store external events as well as the feelings of the person during those events and to organize the events in the proper library. In an embodiment, the computing device includes, but is not limited to, a tablet, a smartphone, a mobile phone, a desktop, and a laptop. In an embodiment, the computing device executes artificial intelligence-based computer-readable instructions that estimate the feelings of the user from brain activity mapping, heart rate, and hormonal level capturing. Herein, the user is able to search for a video on the computing device based on the event, person, time, and feelings, and watch the same on the computing device. In an embodiment, the data collection device further includes a brain activity sensor, a heart activity sensor, and a hormonal sensor.
The brain activity sensor collects brain activity data during a live video event. The heart activity sensor collects the heartbeat rate during a live video event. The hormonal sensor collects hormonal data during a live video event. In an embodiment, the data collection device collects the brain activity data, heartbeat rate, and hormonal data from the brain activity sensor, the heart activity sensor, and the hormonal sensor respectively, and sends them to the computing device, which executes artificial intelligence-based computer-readable instructions to store external events as well as the feelings of the person during those events and to organize the events in the library.
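The feeling-estimation step described above can be illustrated with a toy rule-based stand-in. The specification does not disclose an actual model, so the function name, the inputs (heart rate and a cortisol level standing in for hormonal data), and the thresholds below are all placeholder assumptions; a real implementation would also incorporate the brain activity mapping the specification mentions.

```python
def estimate_feeling(heart_rate_bpm: float, cortisol_level: float) -> str:
    """Toy rule-based stand-in for the AI instructions that estimate the
    user's feelings from biosignals.

    Thresholds are illustrative only; the specification does not
    disclose a concrete model or training procedure.
    """
    if heart_rate_bpm > 100 and cortisol_level > 0.7:
        return "stressed"   # elevated heart rate plus elevated cortisol
    if heart_rate_bpm > 100:
        return "excited"    # elevated heart rate alone
    if heart_rate_bpm < 60:
        return "calm"       # resting heart rate
    return "neutral"

# Usage: label a moment with a feeling so it can be stored alongside
# the corresponding video event in the life library.
# estimate_feeling(110, 0.8) → "stressed"
```

A production system would likely replace these rules with a learned classifier over multi-channel biosignal features, but the interface (biosignals in, feeling label out) matches the behaviour the specification describes.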
In an embodiment, the present invention relates to an artificial intelligence-based life library data collection and organization method, the method comprising:
A method of data collection, the method having:
a data collection device is attached to the user's clothing through a mounting clip;
further, the data collection device is connected to a computing device;
a camera sensor captures live video of the event in which the user is present;
a microphone records the audio of the live video;
the data collection device receives the video feed and audio from the camera sensor and the microphone;
the data collection device further receives the brain activity data, heartbeat rate, and hormonal data from a brain activity sensor, a heart activity sensor, and a hormonal sensor respectively;
the data collection device transfers the video feed and audio to the computing device; and
the data collection device further transfers the brain activity data, heartbeat rate, and hormonal data to the computing device.
A method of storing and analyzing data, the method having:
the computing device executes computer-readable instructions to organize all data in a chapter-wise format based on different categorization and filtration parameters;
the data is logically separated into chronological sections that are then collated to form an event library of the user, based on which the user is able to rewind and relive the experiences;
the data is also logically separated using sophisticated categorization parameters;
the computing device executes artificial intelligence-based computer-readable instructions to sort through the data and intelligently discard non-useful patches where the recordings are of an idle state or have obstructed views;
lastly, artificial intelligence-based computer-readable instructions are also used to keep the sensitive data gathered in a secure and encrypted format;
the computing device executes artificial intelligence-based computer-readable instructions that estimate the feelings of the user from brain activity mapping, heart rate, and hormonal level capturing; and
the data is broadcast to interested and authorized people with whom the user wants to share the data.
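The chronological sectioning step above can be sketched as a simple time-gap segmentation: a new chapter starts whenever too much time passes between consecutive records. The `build_chapters` helper and its 30-minute gap threshold are illustrative assumptions, not disclosed in the specification.

```python
def build_chapters(records, gap_seconds=1800):
    """Split (timestamp, label) records into chronological chapters.

    A new chapter starts whenever the gap between consecutive records
    exceeds `gap_seconds`. The 30-minute default is an illustrative
    assumption; the specification does not define the sectioning rule.
    """
    chapters = []
    for ts, label in sorted(records):
        # Open a new chapter at the start, or after a long silence.
        if not chapters or ts - chapters[-1][-1][0] > gap_seconds:
            chapters.append([])
        chapters[-1].append((ts, label))
    return chapters

# Usage: three records, with a roughly two-hour gap before the last one.
events = [(0, "breakfast"), (600, "commute"), (8000, "meeting")]
# build_chapters(events) → [[(0, 'breakfast'), (600, 'commute')], [(8000, 'meeting')]]
```

The resulting list of chapters is the "event library" the method describes: each chapter is a contiguous slice of the user's day that can be rewound and replayed as one unit.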
Herein, the sophisticated categorization parameters are selected from interactions, places, and experiences associated with a particular person, wherein the people that the user spends time with are identified by applying facial and voice recognition algorithms to the video and audio data collected, and the data is then sorted based on the persons associated.
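The person-based sorting described here can be sketched as a grouping pass over records that a facial/voice recognition stage has already tagged with person labels. The recognition itself is assumed and not implemented; only the subsequent sort-by-person step is shown, and the record shape (`(record_id, persons)` pairs) is a hypothetical layout.

```python
from collections import defaultdict

def categorize_by_person(tagged_records):
    """Group record IDs by the persons they were tagged with.

    `tagged_records` is an iterable of (record_id, persons) pairs, where
    `persons` is the (assumed) output of facial and voice recognition
    run over the video and audio data. Records tagged with several
    people appear under each of them; untagged records are skipped.
    """
    by_person = defaultdict(list)
    for record_id, persons in tagged_records:
        for person in persons:
            by_person[person].append(record_id)
    return dict(by_person)

# Usage: two clips with Alice, one of them also with Bob, one untagged.
tagged = [("clip1", ["Alice"]), ("clip2", ["Alice", "Bob"]), ("clip3", [])]
# categorize_by_person(tagged) → {'Alice': ['clip1', 'clip2'], 'Bob': ['clip2']}
```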
Herein, the user is able to search for a video on the computing device based on the event, person, time, and feelings, and watch the same on the computing device.
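The search capability described here can be sketched as a simple filter over library entries. The record layout (dicts with `timestamp`, `persons`, and `feeling` keys) is an assumption made for illustration; the specification names the search criteria (event, person, time, feelings) but not a storage schema.

```python
def search_library(library, person=None, feeling=None,
                   start_ts=None, end_ts=None):
    """Filter life-library entries by any combination of person,
    feeling, and time window, mirroring the search criteria named in
    the specification. Each entry is assumed to be a dict with
    'timestamp', 'persons', and 'feeling' keys."""
    results = []
    for entry in library:
        if person is not None and person not in entry["persons"]:
            continue
        if feeling is not None and entry["feeling"] != feeling:
            continue
        if start_ts is not None and entry["timestamp"] < start_ts:
            continue
        if end_ts is not None and entry["timestamp"] > end_ts:
            continue
        results.append(entry)
    return results

# Usage: filter by person, or by feeling within a time window.
life_library = [
    {"timestamp": 100, "persons": ["Alice"], "feeling": "calm"},
    {"timestamp": 200, "persons": ["Bob"], "feeling": "excited"},
]
# search_library(life_library, person="Alice") returns only the first entry.
```

An event-keyword criterion would slot in the same way as the other filters, given whatever event labels the organization step attaches to each entry.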
In an embodiment, the present invention relates to an artificial intelligence-based life library data collection and organization system. The present invention includes one or more data collection devices and one or more computing devices. The one or more data collection devices include one or more camera sensors, one or more microphones, and one or more mounting clips. The one or more camera sensors are mounted on the front of the one or more data collection devices to capture live video of the area in front of the one or more data collection devices. In an embodiment, the one or more camera sensors include, but are not limited to, an RGB camera, a thermal camera, and an infrared camera. The one or more microphones are mounted on the one or more data collection devices; with the help of the one or more microphones, the one or more data collection devices record the audio of the live video that is captured by the one or more camera sensors. The one or more mounting clips are used to attach the one or more data collection devices to the user's body. The one or more computing devices are wirelessly connected to the one or more data collection devices. The one or more computing devices receive the video feed, brain activity data, heartbeat rate, and hormonal data from the one or more data collection devices, and execute artificial intelligence-based computer-readable instructions to store external events as well as the feelings of the person during those events and to organize the events in the proper library. Herein, the user is able to search for a video on the one or more computing devices based on the event, person, time, and feelings, and watch the same on the one or more computing devices. In an embodiment, the one or more computing devices include, but are not limited to, a tablet, a smartphone, a mobile phone, a desktop, and a laptop.
In an embodiment, the one or more computing devices execute artificial intelligence-based computer-readable instructions that estimate the feelings of the user from brain activity mapping, heart rate, and hormonal level capturing. The one or more data collection devices further include a brain activity sensor, a heart activity sensor, and a hormonal sensor. The brain activity sensor collects the brain activity data during a live video event. The heart activity sensor collects the heartbeat rate during a live video event. The hormonal sensor collects hormonal data during live video events. Herein, the one or more data collection devices collect the brain activity data, heartbeat rate, and hormonal data from the brain activity sensor, the heart activity sensor, and the hormonal sensor respectively, and send them to the one or more computing devices, which execute artificial intelligence-based computer-readable instructions to store external events as well as the feelings of the person during those events and to organize the events in the library.
In an embodiment, the present invention relates to an artificial intelligence-based life library data collection and organization method,
a method of data collection, the method having:
one or more data collection devices are attached to the user's clothing through a mounting clip;
further, the one or more data collection devices are connected to one or more computing devices;
one or more camera sensors capture live video of the event in which the user is present;
one or more microphones record the audio of the live video;
the one or more data collection devices receive the video feed and audio from the one or more camera sensors and the one or more microphones;
the one or more data collection devices further receive the brain activity data, heartbeat rate, and hormonal data from a brain activity sensor, a heart activity sensor, and a hormonal sensor respectively;
the one or more data collection devices transfer the video feed and audio to the one or more computing devices; and
the one or more data collection devices further transfer the brain activity data, heartbeat rate, and hormonal data to the one or more computing devices.
A method of storing and analyzing data, the method having:
the one or more computing devices execute computer-readable instructions to organize all data in a chapter-wise format based on different categorization and filtration parameters;
the data is logically separated into chronological sections that are then collated to form an event library of the user, based on which the user is able to rewind and relive the experiences;
the data is also logically separated using sophisticated categorization parameters;
the one or more computing devices execute artificial intelligence-based computer-readable instructions to sort through the data and intelligently discard non-useful patches where the recordings are of an idle state or have obstructed views;
lastly, artificial intelligence-based computer-readable instructions are also used to keep the sensitive data gathered in a secure and encrypted format;
the one or more computing devices execute artificial intelligence-based computer-readable instructions that estimate the feelings of the user from brain activity mapping, heart rate, and hormonal level capturing; and
the data is broadcast to interested and authorized people with whom the user wants to share the data.
Herein, the sophisticated categorization parameters are selected from interactions, places, and experiences associated with a particular person, wherein the people that the user spends time with are identified by applying facial and voice recognition algorithms to the video and audio data collected, and the data is then sorted based on the persons associated.
Herein, the user is able to search for a video on the one or more computing devices based on the event, person, time, and feelings, and watch the same on the one or more computing devices.
Further objectives, advantages, and features of the present invention will become apparent from the detailed description provided herein below, in which various embodiments of the disclosed invention are illustrated by way of example with appropriate reference to the accompanying drawings. Those skilled in the art to which the present invention pertains may make modifications resulting in other embodiments employing principles of the present invention without departing from its spirit or characteristics, particularly upon considering the foregoing teachings. Accordingly, the described embodiments are to be considered in all respects only as illustrative and not restrictive, and the scope of the present invention is, therefore, indicated by the appended claims rather than by the foregoing description or drawings. Consequently, while the present invention has been described with regard to particular embodiments, modifications of structure, sequence, materials, and the like apparent to those skilled in the art still fall within the scope of the invention as claimed by the applicant.

Claims: I/WE CLAIM
1. An artificial intelligence-based life library data collection and organization system(100), the system(100) comprising:
an at least one data collection device(102), the at least one data collection device(102) having
an at least one camera sensor(104), the at least one camera sensor(104) is mounted on the front of the at least one data collection device(102) to capture live video of the area in front of the at least one data collection device(102),
an at least one microphone(106), the at least one microphone(106) is mounted on the at least one data collection device(102); with the help of the at least one microphone(106), the at least one data collection device(102) records the voice of the live video that is captured by the at least one camera sensor(104),
a mounting clip(118), the mounting clip(118) is used to attach the at least one data collection device(102) to the user’s body;
an at least one computing device(110), the at least one computing device(110) is wirelessly connected to the at least one data collection device(102), the at least one computing device(110) receives the video feed, brain activity data, heartbeat rate, and hormonal data from the at least one data collection device(102), and further the at least one computing device(110) executes artificial intelligence-based computer-readable instructions to store external events as well as the feelings of the person during those events and to organize the events in the proper library,
wherein the user is able to search for the video on the at least one computing device(110) based on the event, person, time, and feelings, and further watch the same on the at least one computing device(110).
2. The system(100) as claimed in claim 1, wherein the at least one data collection device(102) further comprises:
a brain activity sensor(112), the brain activity sensor(112) collects the brain activity data during a live video event;
a heart activity sensor(114), the heart activity sensor(114) collects the heartbeat rate during a live video event; and
a hormonal sensor(116), the hormonal sensor(116) collects hormonal data during a live video event;
wherein the at least one data collection device(102) collects the brain activity data, heartbeat rate, and hormonal data from the brain activity sensor(112), the heart activity sensor(114), and the hormonal sensor(116) respectively, and sends them to the at least one computing device(110), which executes artificial intelligence-based computer-readable instructions to store external events as well as the feelings of the person during those events and to organize the events in the library.
3. The system(100) as claimed in claim 1, wherein the at least one computing device(110) is of different types selected from a tablet, a smartphone, a mobile phone, a desktop, and a laptop.
4. The system(100) as claimed in claim 1, wherein the at least one camera sensor(104) is of different types selected from an RGB camera, a thermal camera, and an infrared camera.
5. The system(100) as claimed in claim 1, wherein, the at least one computing device(110) executes an artificial intelligence-based computer-readable instruction that estimates the feelings of a user in the form of brain activity mapping, heart rate and hormonal level capturing.
6. The system(100) as claimed in claim 1, wherein, the data that is being collected by the at least one computing device(110) is organized in a chapter-wise format based on different categorization and filtration parameters.
7. An artificial intelligence-based life library data collection and organization method, the method comprising:
a method of data collection, the method having

an at least one data collection device(102) is attached to the user’s clothing through a mounting clip(118),
further, the at least one data collection device(102) is connected to an at least one computing device(110),
an at least one camera sensor(104) captures live video of the event in which the user is present,
an at least one microphone(106) records the voice of the live video,
the at least one data collection device(102) receives the video feed and voice from the at least one camera sensor(104) and the at least one microphone(106),
the at least one data collection device(102) further receives the brain activity data, heartbeat rate and hormonal data from a brain activity sensor(112), a heart activity sensor(114), a hormonal sensor(116) respectively,
the at least one data collection device(102) transfers the video feed and voice to an at least one computing device(110), and
the at least one data collection device(102) further transfers the brain activity data, heartbeat rate, and hormonal data to the at least one computing device(110);
a method of storing and analyzing data, the method having
the at least one computing device(110) executes computer-readable instructions to organize all data in a chapter-wise format based on different categorization and filtration parameters,
the data is logically separated into chronological sections that are then collated to form an event library of the user, based on which the user is able to rewind and relive the experiences,
the data is also logically separated using sophisticated categorization parameters,
the at least one computing device(110) executes artificial intelligence-based computer-readable instructions to sort through the data and intelligently discard non-useful patches where the recordings are of an idle state or have obstructed views,
lastly, artificial intelligence-based computer-readable instructions are also used to keep the sensitive data gathered in a secure and encrypted format,
the at least one computing device(110) executes artificial intelligence-based computer-readable instructions that estimate the feelings of a user in the form of brain activity mapping, heart rate, and hormonal level capturing, and
the data is broadcast to interested and authorized people with whom the user wants to share the data.
8. The method as claimed in claim 7, wherein the sophisticated categorization parameters are selected from interactions, places, and experiences associated with a particular person, wherein the people that the user spends time with are identified by applying facial and voice recognition algorithms to the video and audio data collected, and the data is then sorted based on the persons associated.
9. The method as claimed in claim 7, wherein the user is able to search for the video on the at least one computing device(110) based on the event, person, time, and feelings, and further watch the same on the at least one computing device(110).

Documents

Application Documents

# Name Date
1 202011038631-FORM 1 [07-09-2020(online)].pdf 2020-09-07
2 202011038631-COMPLETE SPECIFICATION [07-09-2020(online)].pdf 2020-09-07
3 202011038631-DRAWINGS [07-09-2020(online)].pdf 2020-09-07
4 202011038631-DECLARATION OF INVENTORSHIP (FORM 5) [07-09-2020(online)].pdf 2020-09-07
5 202011038631-STATEMENT OF UNDERTAKING (FORM 3) [07-09-2020(online)].pdf 2020-09-07
6 202011038631-REQUEST FOR EXAMINATION (FORM-18) [07-09-2020(online)].pdf 2020-09-07
7 202011038631-FORM 18 [07-09-2020(online)].pdf 2020-09-07
8 202011038631-PROOF OF RIGHT [07-09-2020(online)].pdf 2020-09-07
9 202011038631-POWER OF AUTHORITY [07-09-2020(online)].pdf 2020-09-07
10 202011038631-FER.pdf 2022-05-30
11 202011038631-FER_SER_REPLY [28-11-2022(online)].pdf 2022-11-28
12 202011038631-COMPLETE SPECIFICATION [28-11-2022(online)].pdf 2022-11-28
13 202011038631-CLAIMS [28-11-2022(online)].pdf 2022-11-28
14 202011038631-OTHERS [28-11-2022(online)].pdf 2022-11-28
15 202011038631-US(14)-HearingNotice-(HearingDate-18-07-2025).pdf 2025-06-18
16 202011038631-Correspondence to notify the Controller [17-07-2025(online)].pdf 2025-07-17
17 202011038631-FORM-26 [18-07-2025(online)].pdf 2025-07-18
18 202011038631-Written submissions and relevant documents [31-07-2025(online)].pdf 2025-07-31
19 202011038631-Annexure [31-07-2025(online)].pdf 2025-07-31

Search Strategy

1 SearchStrategy-202011038631E_27-05-2022.pdf