Abstract: The invention envisages a device that can be worn on the fingertip and is adapted to recognize a human's hand gestures. In accordance with one embodiment of the invention, the device helps a user select different menus by moving a hand in the air. A user can also send alphanumeric characters to the set-top box just by writing the character in the air. Volume and TV channels can likewise be changed by specific gestures with the help of the device.
FORM-2
THE PATENT ACT, 1970
(39 of 1970)
&
THE PATENT RULES, 2003
PROVISIONAL SPECIFICATION
(See section 10 and Rule 13)
REMOTE CONTROLLING USING GESTURES
TATA CONSULTANCY SERVICES LIMITED
an Indian Company
of Bombay House, 24, Sir Homi Modi Street, Mumbai-400 001,
Maharashtra, India
THE FOLLOWING SPECIFICATION DESCRIBES THE INVENTION
Field of the Invention:
This invention relates to the field of remote controlling.
In particular, this invention relates to remote controlling using gestures.
Background of the Invention:
Nowadays, digital set-top boxes are becoming more popular. Besides television, these boxes provide Internet browsing, video chat and movie search. To operate these applications, a user needs a keyboard and/or a mouse. However, it is not comfortable to sit with a keyboard and mouse while watching TV. A better device is required to handle the complex interfaces typical of set-top boxes.
Electronic products are coming into the market with improved human-machine interface capabilities. The mechanical mouse was introduced first, employing a rolling ball to detect the direction of mouse movement. Later, the optical mouse replaced the mechanical mouse, using a microscopic image-processing system to detect the direction of movement. The wireless mouse then replaced the wire connection with wireless technology. However, all of the aforesaid mouse devices require a horizontal two-dimensional plane over which to move. There was therefore a need for a better, unobtrusive device that can perform the function of communicating with a set-top box remotely.
This invention envisages a device that can be worn on the fingertip and is adapted to recognize a human's hand gestures. In accordance with one embodiment of the invention, the device helps a user select different menus by moving a hand in the air. A user can also send alphanumeric characters to the set-top box just by writing the character in the air. Volume and TV channels can likewise be changed by specific gestures with the help of the device.
In the prior art, visual gesture recognition has been disclosed, and some of the known techniques are listed in Table 1, wherein the type of gesture taken as input is matched with the approach used for recognition. Different types of gesture include:
• Head gestures: head and/or eye positions over time are recognized and carry some specific meaning. With this approach it is not possible to insert numerical characters as user input.
• Arm gestures;
• Sign-language gestures;
• Eye-gaze behaviour: the main motivation here is synthesizing a realistic Embodied Conversational Agent (ECA).
Usually the recognition method uses different approaches, such as:
• HMM-based approaches;
• Statistical information;
• The Semantic Network Model.
Table 1: Different approaches for gesture recognition

| Approach | Recognition method |
|---|---|
| Head nods and head shakes | Two Hidden Markov Models (HMMs) trained and tested on 2D coordinate results from an eye-gaze tracker |
|  | "Between eyes" templates |
| Head gesture detection with prosodic recognition of Japanese spoken utterances, to determine strongly positive, weak positive and negative responses to yes/no type utterances | HMM |
| Modelling eye-gaze patterns in the context of real-time verbal communication | Hierarchical state machines |
| Eye tracking | Two-state Markov model |
| Gaze patterns | Statistical information |
| Eye movement model | Empirical studies of saccades and statistical models of eye-tracking data |
| Arm and head gesture recognition | Conditional Random Field (CRF) |
| Arm and head gesture recognition | Hidden-state conditional random fields (HCRFs) |
| Tracking and recognizing hand gestures | Statistical shape models |
| Real-time gesture recognition | The Semantic Network Model |
From the prior art it can be seen that some methods recognize the characters of American Sign Language or a modified version of it. The problem with this approach is that a user needs to memorize a large number of codes/symbols, one for each character.
Again, Lee and Yangsheng Xu et al. used a gesture-based approach to build a system which can not only interact with a user by accurately recognizing gestures, but which can also learn new gestures and update its understanding of gestures it already knows in an online, interactive manner. Commonly used approaches for gesture recognition include HMM-based approaches.
Michael J. Lyons et al. describe the use of an online gesture-based approach to machine-mediated human interaction in web-based tutoring. Extensive studies in various branches of the social and communication sciences show that skill in understanding and participating in these modes of interaction forms a significant component of human social expertise.
However, the above-mentioned approaches are applicable only to recognizing gestures. The prior art systems cannot be used to give character input, which is required for inputting to an Internet browser.
The invention envisages a hand held device which can be used as a mouse and which applies a wireless transmission function by utilizing Micro-Electro-Mechanical System (MEMS) technology. In addition, the invention can be employed as a 3D mouse, where a user moves a hand in the air instead of on a table, and which can therefore also replace a joystick, cursor pen or 3D game pad.
This invention envisages a device that a user operates by moving his/her hand in the air to send instructions to a PC or other device. Gesture-detection processing is used to detect and identify the user's gesture.
In particular, this invention envisages a novel way of obtaining user inputs from a Micro-Electro-Mechanical Systems (MEMS) accelerometer, which is used to detect the movement of the device in relation to the hand of a user.
In another aspect, the invention provides a method and apparatus for giving user inputs in the form of alphanumeric data to an Internet browser using the hand of the user.
In one particular embodiment of the invention there is provided a device that can be placed on a fingertip, worn as a wrist band, held in the hand, or placed on any body part that can be moved freely to provide a user input.
In its broadest form the invention envisages sensing the coordinates of a sensor in a hand held device, using a wireless transmission system including a microcontroller, a wireless transmitter and a Micro-Electro-Mechanical System (MEMS) sensor. The hand held device utilizes the MEMS sensor to sense the amount of applied force, detect the direction and acceleration of the applied force, and send this data to a receiver apparatus using the wireless transmitter.
The receiver apparatus receives the data over the wireless link and sends the signals to a host device for further processing of the data generated by displacement of the hand held device.
More particularly the hand held device in accordance with the invention includes a MEMS sensor which is an accelerometer fabricated in a chip.
Typically, the MEMS sensor is a 3 axis sensor chip.
The hand held device in accordance with this invention detects left-right and up-down movement using the movement of the MEMS sensor in its Y and Z directions.
The hand held device in accordance with this invention also has a microcontroller adapted to read the status of the MEMS sensor and send the readings to a wireless transmitter.
The hand held device in accordance with this invention has an RF transmitter adapted to transmit data by RF wireless transmission within the 900 MHz or 2.4 GHz or Industrial, Scientific and Medical (ISM) frequency bands. The receiver device receives data in the same frequency band.
In accordance with a preferred embodiment of the invention there is provided a system having a hand held device which can interact with a host device, said host device being one of the following:
• Personal Computers (PC),
• Digital Set Top Boxes(DSTB)
• Notebook (NB),
• Personal digital assistant (PDA),
• Mobile phone.
The hand held device in accordance with this invention is adapted to communicate with the host device using one of the following protocols:
• USB
• RS232
• FireWire
• Or any other custom protocol and can work as a standard mouse.
The hand held device in accordance with this invention is powered by a battery.
In particular, this invention envisages the automatic recognition of gesture-based character input based on multifactorial analysis, or a like approach that makes a decision from multiple features or parameters.
In accordance with another embodiment of the invention, this invention envisages the use of a graph-theoretical approach and/or a chain-code based approach for recognition of alphanumeric characters.
This invention aims at providing a hand held wireless device using an accelerometer-type MEMS chip to sense movement and displacement of the device. The measured acceleration is used to compute total displacement.
In a practical embodiment the invention consists of the following modules:
i. Hand held device
ii. Receiver apparatus
The hand held device comprises a MEMS sensor, a transmitter RF-SoC(Radio Frequency-System-on-Chip) and a microcontroller.
The receiver apparatus comprises a microcontroller and receiver RF-SoC.
The method of the invention comprises the following steps:
The hand held device detects its own movement by reading the MEMS sensor's values. The read-out data is transferred to the receiver apparatus through an RF-SoC module. The receiver apparatus then determines the direction and magnitude of displacement experienced by the hand held device and sends these signals to the host system.
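As a sketch only (the specification does not give firmware details), the displacement computation performed on the acceleration stream can be illustrated by double numerical integration; the sampling interval `DT` and the sample values below are assumptions, not values from the specification.

```python
DT = 0.01  # assumed sampling interval in seconds (hypothetical)

def displacement(accels, dt=DT):
    """Integrate a stream of one-axis acceleration samples twice
    (simple Euler integration) to obtain total displacement."""
    velocity = 0.0
    position = 0.0
    for a in accels:
        velocity += a * dt       # first integration: acceleration -> velocity
        position += velocity * dt  # second integration: velocity -> position
    return position

# Constant 1 m/s^2 held for 1 s should give roughly s = a*t^2/2 = 0.5 m.
d = displacement([1.0] * 100, dt=0.01)
```

In a real device the integration would also need drift compensation and gravity removal, which this sketch omits.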
Data is obtained from a sensor attached to the hand of a user. A user can write the English alphabet in capital letters just by using the sensor attached to the hand. This data then undergoes filtering and is finally sent to a recognition module. The recognition module consists of two different agents working on different principles. Each of them recognizes the input character with some confidence factor lying in the closed interval [0, 1]. The final decision about the recognized character is made by taking the output of the agent with the highest confidence. The main advantage of the scheme is that one agent works excellently for curved characters and the other for characters with linear segments. A recognition accuracy of more than 92% is obtained.
The device of this invention was tested with inputs given by 20 users (15 male and 5 female); each user was asked to write each letter 5 times.
Even after this there may be some wrong recognition because of similarity in shape. Curve and linear-stroke intersection based features are used to resolve some clusters of similarly shaped characters. For example, P and D have almost the same shape, but the difference lies in the intersection between the vertical line and the curve: in the case of P this intersection occurs at the middle portion of the character, whereas in the case of D it occurs at the bottom part.
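The two-agent decision rule described above (each agent reports a character with a confidence in [0, 1], and the opinion with the highest confidence wins) can be sketched as follows; the agent opinions shown are hypothetical illustrations, not outputs of the actual recognizers:

```python
def fuse(opinions):
    """Pick the (character, confidence) pair with the highest confidence.

    opinions: list of (character, confidence) pairs, one per agent,
    with confidence lying in the closed interval [0, 1].
    """
    return max(opinions, key=lambda oc: oc[1])

# Hypothetical example: the chain-code agent reads the stroke as 'P' with
# confidence 0.71, the graph agent reads it as 'D' with confidence 0.84.
char, conf = fuse([('P', 0.71), ('D', 0.84)])
# char == 'D'
```

This is why the scheme tolerates one agent being weak on a character class: the other agent's higher confidence dominates the final decision.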
Typical method steps are as follows:
First, the data is captured using a MEMS-based sensor; the sensed acceleration values are measured and sent to a set-top box using wireless technology (ZigBee, Bluetooth, and the like).
Secondly, the received data is filtered, processed and sent to the recognition module, which recognizes the character.
Finally, the recognized character is sent to the set-top box for its convenient usage.
Brief Description of the Accompanying Drawings:
The invention will now be described with reference to the accompanying
drawings, in which
Figure 1 is a simplified schematic of the hand held device containing RF
transmitter;
Figure 2 is a structural diagram of a receiver apparatus containing RF receiver
to be used in conjunction with the transmitter of figure 1;
Figure 3 is a structural diagram of the applied hand held device in accordance
with this invention with a wireless transmission system;
Figure 4 is a flow chart for reading and transmitting sensor data;
Figure 5 is a flow chart for receiving wireless data, processing the data and
sending the data to a host device.
Detailed Description of the Invention:
Data Acquisition:
A Micro-Electro-Mechanical Systems (MEMS) accelerometer is used to detect the movement of the device, which is typically placed on the fingertip. Every time some movement is detected, the direction of the displacement and the value of acceleration are measured and sent to the set-top box using wireless technology (ZigBee, Bluetooth, and the like). The receiver fitted in the set-top box receives the signals from the transmitter, stores all the data as a series of coordinates, and sends the signals to a gesture recognition engine.
Preprocessing:
Identify the strokes:
The differences between consecutive X or Y coordinates throughout the whole input data are examined to see whether they differ by a large number, which in turn tells the receiver whether the "mouse" was lifted while drawing. If so, the position at which the stroke break occurs is stored. The data captured using the MEMS sensor is first made to undergo some filtering. From observation it is found that, when the x and y coordinates are stored, there is a huge change between two consecutive points whenever a new stroke begins. So initially the input data is split into several segments.
Data Filtering: To get a smoother transition between two consecutive mouse data points, a median filter is applied on the input data. This filtering is applied to each stroke separately, since at the start of a new stroke there is an abrupt transition between two consecutive data points.
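The stroke-splitting and per-stroke median-filtering steps described above can be sketched as below. The jump threshold `JUMP` is an assumed value, since the specification does not state the "large number" used.

```python
from statistics import median

JUMP = 50.0  # assumed threshold for the "large number" between consecutive points

def split_strokes(points, jump=JUMP):
    """Split (x, y) samples into strokes wherever two consecutive points
    differ by more than `jump` in x or y (pen lifted while drawing)."""
    strokes, current = [], [points[0]]
    for prev, cur in zip(points, points[1:]):
        if abs(cur[0] - prev[0]) > jump or abs(cur[1] - prev[1]) > jump:
            strokes.append(current)
            current = []
        current.append(cur)
    strokes.append(current)
    return strokes

def median_filter(values, width=3):
    """Sliding-window median over one stroke's coordinate sequence."""
    half = width // 2
    return [median(values[max(0, i - half):i + half + 1])
            for i in range(len(values))]
```

Splitting before filtering matters: running the median across a stroke boundary would smear the abrupt jump into both strokes.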
Compute Area: a module is defined to get the total boundary covered by the input pattern, which later helps to make the system independent of the size of the input data.
Construct Chain Code: modules are responsible for calculating the gradient value of each stroke segment of the structure and storing these values. In this case, a generalized concept of line direction is followed, typically as under [only by way of example]:
| Direction | Gradient value |
|---|---|
| '\|', downward | 0 |
| '\', downward | 1 |
| '-', towards right | 2 |
| '/', upward | 3 |
| '\|', upward | 4 |
| '\', upward | 5 |
| '-', towards left | 6 |
| '/', downward | 7 |
Remove Noise: another filter is used to remove any noise present within a continuous run of any gradient value. This gives a clean gradient sequence which excludes most of the unwanted parts of the drawing, including unintentional jerking of the hand while drawing.
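A minimal sketch of chain-code assignment and run-length noise removal. The axis convention (x grows rightwards, y upwards), the snapping of each segment to the nearest 45-degree direction, and the `min_run` value are assumptions made for illustration.

```python
from math import atan2, degrees

def chain_code(p, q):
    """Assign the 8-direction gradient value (per the table above) to the
    segment p -> q, assuming x grows rightwards and y grows upwards.
    Code 0 is straight down; codes increase counter-clockwise."""
    angle = degrees(atan2(q[1] - p[1], q[0] - p[0]))  # in (-180, 180]
    return round((angle + 90) / 45) % 8

def remove_noise(codes, min_run=2):
    """Drop runs of a gradient value shorter than min_run, treating them
    as unintentional jerks of the hand while drawing."""
    runs = []
    for c in codes:                      # run-length encode the sequence
        if runs and runs[-1][0] == c:
            runs[-1][1] += 1
        else:
            runs.append([c, 1])
    out = []
    for c, n in runs:                    # keep only sufficiently long runs
        if n >= min_run:
            out.extend([c] * n)
    return out
```

With this convention, a downward vertical segment maps to 0 and a rightward horizontal segment to 2, matching the table's numbering.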
Recognition:
The most critical module of this system is the recognition module. In accordance with a preferred embodiment of the invention, a "multifactorial approach" is used to find the candidate recognized character. In this method two agents are used to give their opinions about the input character. Each agent gives a score indicating its confidence about the recognition. The character with the highest confidence is taken as the recognized character. The method is described here.
Agent 1 is based on a chain-code approach. In this method a chain code is assigned to every segment. The flow of assigning chain codes to the input data is described in Figure 6 and Figure 7.
Determine presence of loop: If the stroke count is less than or equal to 1, and the starting and ending coordinates of the drawing do not differ by more than a threshold, the input pattern is considered to have only one loop, which is justified for the character 'O'.
Determine presence of curve: The presence of a curve in the input data helps to distinguish between the characters 'B', 'C', 'D', 'G', 'J', 'O', 'P', 'Q', 'R', 'S', 'U' and 'A', 'E', 'F', 'H', 'I', 'K', 'L', 'M', 'N', 'T', 'V', 'W', 'X', 'Y', 'Z'. Here, the distance between every pair of consecutive coordinates is computed and checked to see whether it falls within a small range; in that case the gesture contains one or more curves.
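The loop and curve tests can be sketched as below; the closeness threshold and the distance band are illustrative assumptions, as the specification does not give numeric values.

```python
from math import hypot

CLOSE = 10.0  # assumed threshold for "start and end nearly coincide"

def has_loop(strokes, close=CLOSE):
    """True if there is a single stroke whose start and end points nearly
    coincide, as for the character 'O'."""
    if len(strokes) > 1:
        return False
    pts = strokes[0]
    return hypot(pts[0][0] - pts[-1][0], pts[0][1] - pts[-1][1]) <= close

def has_curve(points, lo=1.0, hi=3.0):
    """Per the test described above: compute the distance between every
    pair of consecutive coordinates and check whether all of them fall
    within a small band [lo, hi] (band values are assumptions)."""
    dists = [hypot(b[0] - a[0], b[1] - a[1])
             for a, b in zip(points, points[1:])]
    return all(lo <= d <= hi for d in dists)
```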
Agent 2 is based on an undirected, unweighted graph approach for recognition. The overall process is described in the flowcharts shown in Figure 8 and Figure 9. The pseudo-code for this approach is given here. For every code:
• Find the minimum and maximum x and y coordinates of the input data.
• Mark them as (max_x, max_y) and (min_x, min_y).
• Split the entire region into 3x3 blocks.
• Mark each sub-block as 0, 1, 2, …, 8; a schematic diagram of the numbering is shown below.
• Find the starting and ending position of each code; mark them as start and end.
• Every code can thus be represented in the form of an adjacency matrix of size 9x9.
• Let i = min(start, end) and j = max(start, end).
• If a code starts at position i and ends at position j, mark adjacency(i,j) = 1.
• From observation, a template graph can be constructed for each character; call it template_graph.
• For some characters more than one template is possible.
• Construct the dissimilarity_matrix as graph(i,j) − template_graph(i,j).
• If dissimilarity_matrix(i,j) = 1, there is an edge in the input data that is absent in the template (an insertion).
• If dissimilarity_matrix(i,j) = −1, there is an edge in the template that is absent in the input data (a deletion).
• If there is a (1, −1) pair at two adjacent (horizontally or vertically) positions, the input is written in a different manner than the template (a modification).
• Count the number of occurrences of insertions, deletions and modifications.
• The sum of matches, insertions and modifications gives the number of codes present in the input.
• The sum of matches and deletions gives the number of codes in the template.
• The weighted matching score between input and target template is obtained as (1 − ((insertion×WEIGHT_INSERT + modification×WEIGHT_MODIFY + deletion×WEIGHT_DELETE) / div_factor)).
• div_factor is defined as (WEIGHT_INSERT + WEIGHT_MODIFY + WEIGHT_DELETE) × code_cnt, where code_cnt is the number of codes in the input data.
• Find the character that matches best and obtain the matching score, lying between 0 and 1.
Schematic of the 3x3 sub-block numbering:

| 0 | 1 | 2 |
|---|---|---|
| 3 | 4 | 5 |
| 6 | 7 | 8 |
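A sketch of Agent 2's scoring under simplifying assumptions: equal weights, modifications omitted, and the zone orientation (row 0 at the minimum y) chosen arbitrarily; none of these specifics come from the specification.

```python
W_INS, W_MOD, W_DEL = 1.0, 1.0, 1.0  # assumed equal weights

def zone(x, y, min_x, min_y, max_x, max_y):
    """Map a point into one of the 3x3 sub-blocks, numbered 0..8.
    Orientation (which corner is block 0) is an assumption here."""
    col = min(int(3 * (x - min_x) / (max_x - min_x + 1e-9)), 2)
    row = min(int(3 * (y - min_y) / (max_y - min_y + 1e-9)), 2)
    return 3 * row + col

def score(graph, template, code_cnt):
    """Weighted matching score in [0, 1] from the dissimilarity of two
    9x9 upper-triangular adjacency matrices (i = min, j = max end zone)."""
    ins = sum(1 for i in range(9) for j in range(i, 9)
              if graph[i][j] - template[i][j] == 1)    # edge only in input
    dele = sum(1 for i in range(9) for j in range(i, 9)
               if graph[i][j] - template[i][j] == -1)  # edge only in template
    mod = 0  # adjacent (1, -1) pairs (modifications) omitted in this sketch
    div = (W_INS + W_MOD + W_DEL) * code_cnt
    return 1 - (ins * W_INS + mod * W_MOD + dele * W_DEL) / div
```

In use, the input graph would be scored against every character's template(s) and the best-scoring character returned, exactly as the last bullet above describes.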
While considerable emphasis has been placed herein on the particular features of the preferred embodiment and improvisations with regard to it, it will be appreciated that various modifications can be made in the preferred embodiments without departing from the principles of the invention. These and other modifications in the nature of the invention will be apparent to those skilled in the art from the disclosure herein, whereby it is to be distinctly understood that the foregoing descriptive matter is to be interpreted merely as illustrative of the invention and not as a limitation.
| # | Name | Date |
|---|---|---|
| 1 | 400-MUM-2008-FORM 1(30-12-2008).pdf | 2008-12-30 |
| 2 | 400-MUM-2008-RELEVANT DOCUMENTS [28-09-2023(online)].pdf | 2023-09-28 |
| 3 | 400-MUM-2008-CORRESPONDENCE(30-12-2008).pdf | 2008-12-30 |
| 4 | 400-MUM-2008-RELEVANT DOCUMENTS [26-09-2022(online)].pdf | 2022-09-26 |
| 5 | 400-MUM-2008-REPLY TO EXAMINATION REPORT-08-04-2015.pdf | 2015-04-08 |
| 6 | 400-MUM-2008-RELEVANT DOCUMENTS [29-09-2021(online)].pdf | 2021-09-29 |
| 7 | 400-MUM-2008-RELEVANT DOCUMENTS [29-03-2020(online)].pdf | 2020-03-29 |
| 8 | 400-MUM-2008-FORM PCT-ISA-237-08-04-2015.pdf | 2015-04-08 |
| 9 | 400-MUM-2008-RELEVANT DOCUMENTS [23-03-2019(online)].pdf | 2019-03-23 |
| 10 | 400-MUM-2008-FORM PCT-IB-373-08-04-2015.pdf | 2015-04-08 |
| 11 | 400-MUM-2008-FORM 3-08-04-2015.pdf | 2015-04-08 |
| 12 | 400-MUM-2008-ABSTRACT(19-2-2009).pdf | 2018-08-10 |
| 13 | Other Patent Document [08-03-2017(online)].pdf | 2017-03-08 |
| 14 | 400-MUM-2008-Abstract-131015.pdf | 2018-08-10 |
| 15 | 400-MUM-2008-ORIGINAL UNDER RULE 6 (1A)-03-04-2017.pdf | 2017-04-03 |
| 16 | 400-MUM-2008-CLAIMS(19-2-2009).pdf | 2018-08-10 |
| 17 | 400-MUM-2008-Claims-131015.pdf | 2018-08-10 |
| 18 | 400-MUM-2008-PatentCertificate28-09-2017.pdf | 2017-09-28 |
| 19 | 400-MUM-2008-CORRESPONDENCE(12-2-2010).pdf | 2018-08-10 |
| 20 | 400-MUM-2008-IntimationOfGrant28-09-2017.pdf | 2017-09-28 |
| 21 | 400-MUM-2008-CORRESPONDENCE(17-4-2014).pdf | 2018-08-10 |
| 22 | 400-MUM-2008-RELEVANT DOCUMENTS [28-03-2018(online)].pdf | 2018-03-28 |
| 23 | 400-MUM-2008-CORRESPONDENCE(19-2-2009).pdf | 2018-08-10 |
| 24 | 400-MUM-2008_EXAMREPORT.pdf | 2018-08-10 |
| 25 | 400-mum-2008-correspondence-received.pdf | 2018-08-10 |
| 26 | 400-MUM-2008-SPECIFICATION(AMENDED)-131015.pdf | 2018-08-10 |
| 27 | 400-MUM-2008-DESCRIPTION(COMPLETE)-(19-2-2009).pdf | 2018-08-10 |
| 28 | 400-MUM-2008-Power of Attorney-131015.pdf | 2018-08-10 |
| 29 | 400-mum-2008-desription (provisional).pdf | 2018-08-10 |
| 30 | 400-MUM-2008-PETITION UNDER RULE-137(17-4-2014).pdf | 2018-08-10 |
| 31 | 400-MUM-2008-DRAWING(19-2-2009).pdf | 2018-08-10 |
| 32 | 400-MUM-2008-PETITION UNDER RULE 137-131015.pdf | 2018-08-10 |
| 33 | 400-MUM-2008-MARKED COPY-131015.pdf | 2018-08-10 |
| 34 | 400-mum-2008-drawings.pdf | 2018-08-10 |
| 35 | 400-MUM-2008-Examination Report Reply Recieved-131015.pdf | 2018-08-10 |
| 36 | 400-mum-2008-form-3.pdf | 2018-08-10 |
| 37 | 400-MUM-2008-Form 1-131015.pdf | 2018-08-10 |
| 38 | 400-mum-2008-form-26.pdf | 2018-08-10 |
| 39 | 400-mum-2008-form 13(19-2-2009).pdf | 2018-08-10 |
| 40 | 400-mum-2008-form-2.pdf | 2018-08-10 |
| 41 | 400-MUM-2008-FORM 18(12-2-2010).pdf | 2018-08-10 |
| 42 | 400-mum-2008-form 2(19-2-2009).pdf | 2018-08-10 |
| 43 | 400-mum-2008-form-1.pdf | 2018-08-10 |
| 44 | 400-MUM-2008-FORM 2(TITLE PAGE)-(19-2-2009).pdf | 2018-08-10 |
| 45 | 400-MUM-2008-FORM 5(19-2-2009).pdf | 2018-08-10 |
| 46 | 400-MUM-2008-FORM 3(17-4-2014).pdf | 2018-08-10 |
| 47 | 400-MUM-2008-FORM 2(TITLE PAGE)-(PROVISIONAL)-(27-2-2008).pdf | 2018-08-10 |
| 48 | 400-MUM-2008-Form 2(Title Page)-131015.pdf | 2018-08-10 |