Abstract: A method and system for capturing a photographic image in real time with reduced capture delay is provided. The method includes determining motion of a human finger towards a capture button of a camera device, capturing a plurality of frames associated with the photographic image in response to the motion, determining a time delay between a first instant and a second instant associated with the motion of the human finger, retrieving a frame from the plurality of frames, and displaying the frame based on the time delay between the instant the motion of the finger is sensed and the instant the camera lens actually captures the view. The system includes a communication interface for establishing communication, a memory that stores instructions, and a processor responsive to the instructions to determine motion of a human finger, capture a plurality of frames, determine a time delay, retrieve a frame, and display the frame.
A METHOD AND SYSTEM OF CAPTURING A PHOTOGRAPHIC IMAGE IN REAL TIME WITH REDUCED CAPTURE DELAY
FIELD OF THE INVENTION
[0001] The present invention relates to the field of capturing pictures in real time using a camera device.
BACKGROUND
[0002] Camera devices are becoming increasingly popular for capturing photographic images. The photographic images can include still images or moving images. In a conventional camera device, a photographic image is captured upon a user clicking a capture button of the camera device. However, there is a delay between the instant at which the user decides to capture the photographic image and the instant at which the photographic image is actually captured by the camera device. As a result, the accurate moment of the photographic image that is desired by the user is often not captured. Hence, an effective method to capture the accurate moment of the photographic image is required.
[0003] In the light of the foregoing discussion, there is a need for an efficient method and system for capturing a photographic image in real time with reduced capture delay.
SUMMARY
[0004] Embodiments of the present disclosure described herein provide a method and system for capturing a photographic image in real time with reduced capture delay between the view frame the user wants to capture and the camera captured frame.
[0005] An example of a method of capturing a photographic image in real time with reduced capture delay includes determining motion of a human finger towards a capture button of a camera device. The method also includes capturing a plurality of frames associated with the photographic image in response to the motion. Further, the method includes determining a time delay between a first instant and a second instant, the first instant and the second instant being associated with the motion of the human finger towards the capture button. Furthermore, the method includes retrieving a frame, from the plurality of frames, based on at least one of the time delay and a camera device delay. Moreover, the method includes displaying the frame on a display of the camera device based on the time delay.
[0006] An example of a system for capturing a photographic image in real time with reduced capture delay includes a communication interface for establishing communication. The system also includes a memory that stores instructions. The system further includes a processor responsive to the instructions to determine motion of a human finger towards a capture button of a camera device; to capture a plurality of frames associated with the photographic image in response to the motion; to determine a time delay between a first instant and a second instant, the first instant and the second instant being associated with the motion of the human finger towards the capture button; to retrieve a frame, from the plurality of frames, based on the time delay and a camera device delay; and to display the frame on a display of the camera device.
BRIEF DESCRIPTION OF FIGURES
[0007] In the accompanying figures, similar reference numerals may refer to identical or functionally similar elements. These reference numerals are used in the detailed description to illustrate various embodiments and to explain various aspects and advantages of the present disclosure.
[0008] FIG. 1 is a flowchart illustrating a method of capturing a photographic image in real time with reduced capture delay, in accordance with one embodiment;
[0009] FIG. 2 is a block diagram of a camera device for capturing a photographic image in real time with reduced capture delay, in accordance with one embodiment; and
[0010] FIG. 3 is an exemplary illustration of capturing a photographic image in real time with reduced capture delay, in accordance with one embodiment.
DETAILED DESCRIPTION
[0011] It should be observed that the method steps and system components have been represented by conventional symbols in the figures, showing only those specific details which are relevant for an understanding of the present disclosure. Further, details that may be readily apparent to a person ordinarily skilled in the art may not have been disclosed. In the present disclosure, relational terms such as first and second, and the like, may be used to distinguish one entity from another entity, without necessarily implying any actual relationship or order between such entities.
[0012] Embodiments of the present disclosure described herein provide a method and system for capturing a photographic image in real time with reduced capture delay.
[0013] FIG. 1 is a flowchart illustrating a method of capturing a photographic image in real time with reduced capture delay, in accordance with one embodiment. The method starts at step 105.
[0014] At step 110, motion of a human finger, of a user, towards a capture button of a camera device is determined. Examples of the camera device include, but are not limited to, a mobile phone, a personal digital assistant (PDA), a smart phone and a digital camera.
[0015] The motion includes speed and direction of the human finger towards the capture button for capturing the photographic image.
[0016] At step 115, a plurality of frames associated with the photographic image is captured by the camera device. The frames are captured upon determining the motion of the human finger towards the capture button of the camera device. The frames represent a plurality of moments associated with the photographic image.
[0017] Further, the frames captured upon determining the motion of the human finger are stored in a memory of the camera device.
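As an illustrative sketch only (not part of the original disclosure), the storing of frames captured after the motion is sensed can be modeled as a fixed-size ring buffer of timestamped frames; the class name `FrameBuffer`, its methods, and the buffer capacity are hypothetical assumptions:

```python
from collections import deque

class FrameBuffer:
    """Fixed-size ring buffer that records (timestamp, frame) pairs
    once finger motion toward the capture button is detected."""

    def __init__(self, capacity=32):
        # deque with maxlen evicts the oldest frame automatically,
        # so memory use stays bounded while motion is in progress.
        self._frames = deque(maxlen=capacity)

    def add(self, timestamp, frame):
        self._frames.append((timestamp, frame))

    def closest_to(self, target_time):
        # Return the stored frame whose timestamp is nearest to target_time.
        return min(self._frames, key=lambda tf: abs(tf[0] - target_time))[1]
```

Under this model, the device keeps only the most recent frames captured since the motion commenced, and a later step can look up the frame nearest a desired instant.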
[0018] At step 120, a time delay between a first instant and a second instant associated with the motion of the human finger towards the capture button is determined.
[0019] The first instant is the instant at which the motion of the human finger towards the capture button commences. The second instant is the instant at which the human finger clicks the capture button.
[0020] The time delay between the first instant and the second instant is determined to calculate an accurate moment of the photographic image.
[0021] At step 125, a frame, from the plurality of frames, is retrieved based on the time delay determined at step 120 and a camera device delay. The camera device delay includes a shutter lag and a lens adjustment delay.
[0022] The frame that is retrieved from the plurality of frames corresponds to the accurate moment of the photographic image that is desired by the user. One or more image processing algorithms can be used for retrieving the frame, from the plurality of frames, based on the time delay and the camera device delay.
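A minimal sketch of one way such a retrieval could be computed, assuming the desired moment is estimated by subtracting the finger-motion time delay and the camera device delay (shutter lag plus lens adjustment delay) from the click instant; the function name and this specific offset model are illustrative assumptions, not taken from the disclosure:

```python
def retrieve_frame(frames, click_time, time_delay, shutter_lag, lens_delay):
    """frames: list of (timestamp, frame) pairs captured since motion began.

    Estimate the moment the user actually desired: the click instant
    (second instant) offset backward by the finger-motion time delay and
    by the camera device delay (shutter lag + lens adjustment delay).
    """
    camera_device_delay = shutter_lag + lens_delay
    desired_time = click_time - time_delay - camera_device_delay
    # Pick the stored frame whose timestamp is closest to the desired moment.
    return min(frames, key=lambda tf: abs(tf[0] - desired_time))[1]
```

For example, with frames timestamped at 0.05 s intervals, a click at 0.20 s, a 0.06 s finger-motion delay, and a 0.04 s device delay, the frame nearest 0.10 s would be selected.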
[0023] At step 130, the frame retrieved at step 125 is displayed on a display of the camera device.
[0024] The method stops at step 135.
[0025] By capturing the frames associated with the photographic image upon determining the motion of the human finger, the accurate moment of the photographic image that is desired by the user is stored. Further, upon determining the time delay associated with the motion and the camera device delay, the accurate moment of the photographic image is retrieved and provided to the user. Hence, the user succeeds in capturing the accurate moment of the photographic image as desired.
[0026] FIG. 2 is a block diagram of a camera device for capturing a photographic image in real time with reduced capture delay, in accordance with one embodiment.
[0027] The camera device 200 includes a bus 205 or other communication mechanism for communicating information, and a processor 210 coupled with the bus 205 for processing information. The camera device 200 also includes a memory 215, for example a random access memory (RAM) or other dynamic storage device, coupled to the bus 205 for storing information and instructions to be executed by the processor 210. The memory 215 can be used for storing temporary variables or other intermediate information during execution of instructions by the processor 210. The camera device 200 further includes a read only memory (ROM) 220 or other static storage device coupled to the bus 205 for storing static information and instructions for the processor 210. A storage unit 225, for example a magnetic disk or optical disk, is provided and coupled to the bus 205 for storing information, for example a plurality of frames associated with a photographic image.
[0028] The camera device 200 can be coupled via the bus 205 to a display 230, for example a cathode ray tube (CRT), for displaying the photographic image. An input device 235, including alphanumeric and other keys, is coupled to the bus 205 for communicating information and command selections to the processor 210. Another type of user input device is a cursor control 240, for example a mouse, a trackball, or cursor direction keys, for communicating direction information and command selections to the processor 210 and for controlling cursor movement on the display 230.
[0029] Various embodiments are related to the use of the camera device 200 for implementing the techniques described herein. In some embodiments, the techniques are performed by the camera device 200 in response to the processor 210 executing instructions included in the memory 215. Such instructions can be read into the memory 215 from another machine-readable medium, for example the storage unit 225.
Execution of the instructions included in the memory 215 causes the processor 210 to perform the process steps described herein.
[0030] In some embodiments, the processor 210 can include one or more processing units for performing one or more functions of the processor 210. The processing units are hardware circuitry used in place of or in combination with software instructions to perform specified functions.
[0031] The term "machine-readable medium" as used herein refers to any medium that participates in providing data that causes a machine to perform a specific function. In an embodiment implemented using the camera device 200, various machine-readable media are involved, for example, in providing instructions to the processor 210 for execution. The machine-readable medium can be a storage medium, either volatile or non-volatile. A volatile medium includes, for example, dynamic memory, such as the memory 215. A non-volatile medium includes, for example, optical or magnetic disks, for example the storage unit 225. All such media must be tangible to enable the instructions carried by the media to be detected by a physical mechanism that reads the instructions into a machine.
[0032] Common forms of machine-readable media include, for example, a floppy disk, a flexible disk, a hard disk, magnetic tape, or any other magnetic media, a CD-ROM, any other optical media, punch cards, paper tape, any other physical media with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, or any other memory chip or cartridge.
[0033] In another embodiment, the machine-readable media can be transmission media including coaxial cables, copper wire and fiber optics, including the wires that include the bus 205. Transmission media can also take the form of acoustic or light waves, such as those generated during radio-wave and infra-red data communications. Examples of machine-readable media may include, but are not limited to, a carrier wave as described hereinafter or any other media from which the camera device 200 can read. For example, the instructions can initially be carried on a magnetic disk of a remote computer. The remote computer can load the instructions into its dynamic memory and send the instructions over a telephone line using a modem. A modem local to the camera device 200 can receive the data on the telephone line and use an infra-red transmitter to convert the data to an infra-red signal. An infra-red detector can receive the data carried in the infra-red signal and appropriate circuitry can place the data on the bus 205. The bus 205 carries the data to the memory 215, from which the processor 210 retrieves and executes the instructions. The instructions received by the memory 215 can optionally be stored on the storage unit 225 either before or after execution by the processor 210.
[0034] The camera device 200 also includes a communication interface 245 coupled to the bus 205. The communication interface 245 provides a two-way data communication coupling to the processor 210. For example, the communication interface 245 can be an integrated services digital network (ISDN) card or a modem to provide a data communication connection to a corresponding type of telephone line. As another example, the communication interface 245 can be a local area network (LAN) card to provide a data communication connection to a compatible LAN. In any such implementation, the communication interface 245 sends and receives electrical, electromagnetic or optical signals that carry digital data streams representing various types of information.
[0035] The processor 210 in the camera device 200 is configured to determine motion of a human finger, of a user, towards a capture button of a camera device. Examples of the camera device 200 include, but are not limited to, a mobile phone, a personal digital assistant (PDA), a smart phone and a digital camera. The motion, determined by the processor 210, includes speed and direction of the human finger towards the capture button for capturing the photographic image.
[0036] The motion of the human finger is determined using a motion sensor 255. In one example, the motion sensor 255 can be embedded in the capture button of the camera device 200. In another example, the motion sensor 255 can be referred to as the capture button.
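One way the motion-sensor trigger described above could be sketched is a polling loop that records the first instant when the finger's speed toward the capture button crosses a threshold; the sensor interface `read_sensor`, the threshold, and the polling interval are entirely hypothetical:

```python
import time

def wait_for_finger_motion(read_sensor, speed_threshold=0.5, poll_interval=0.01):
    """Poll a motion sensor until the finger's speed toward the capture
    button exceeds a threshold; return the instant motion commences.

    read_sensor() is a hypothetical callable returning the current speed
    of the finger toward the button (0.0 when no finger is approaching).
    """
    while True:
        if read_sensor() >= speed_threshold:
            # First instant: commencement of the motion toward the button.
            return time.monotonic()
        time.sleep(poll_interval)
```

The value returned here would serve as the first instant; the second instant would be recorded when the button is actually clicked, and their difference gives the time delay of step 120.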
[0037] The processor 210 is also operable to capture a plurality of frames associated with the photographic image. The processor 210 captures the frames upon determining the motion of the human finger towards the capture button of the camera device. Further, the processor 210 is configured to store the frames captured in the storage unit 225.
[0038] The processor 210 is also configured to determine a time delay between a first instant and a second instant associated with the motion of the human finger.
[0039] Further, the processor 210 is operable to retrieve a frame from the plurality of frames. The frame retrieved from the plurality of frames corresponds to the accurate moment of the photographic image that is desired by the user.
[0040] Furthermore, the processor 210 is operable to display the retrieved frame on the display 230 of the camera device 200.
[0041] FIG. 3 is an exemplary illustration of capturing a photographic image in real time with reduced capture delay, in accordance with one embodiment.
[0042] A user wishes to capture a photographic image 300. Hence, the user moves a human finger 305 towards a capture button of a camera device 310.
[0043] A processor, for example the processor 210, included in the camera device 310, upon determining the motion of the human finger 305, begins to capture a plurality of frames, for example frame-1 315, frame-2 320, frame-3 325 and frame-4 330, associated with the photographic image 300. Further, the frames associated with the photographic image 300 are stored in a storage unit, for example the storage unit 225.
[0044] Also, the processor determines a time delay between a first instant and a second instant associated with the motion of the human finger 305 towards the capture button of the camera device 310.
[0045] The first instant includes an instant at which the motion of the human finger 305 towards the capture button of the camera device 310 commences. The second instant includes an instant at which the human finger 305 clicks the capture button of the camera device 310.
[0046] Further, based on the time delay between the first instant and the second instant and a camera device delay associated with the camera device 310, the frame-3 325 of the photographic image 300 is retrieved. The frame-3 325 is the accurate moment of the photographic image 300 that is desired by the user.
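With illustrative numbers (none of these timestamps or delays are given in the disclosure), the arithmetic behind selecting frame-3 325 might look like:

```python
# Hypothetical timestamps (seconds) for the four captured frames.
frames = {"frame-1": 0.00, "frame-2": 0.05, "frame-3": 0.10, "frame-4": 0.15}

click_time = 0.20           # second instant: finger clicks the capture button
time_delay = 0.06           # delay between the first and second instants
camera_device_delay = 0.04  # shutter lag + lens adjustment delay

# Estimate the desired moment and pick the nearest stored frame.
desired_time = click_time - time_delay - camera_device_delay  # 0.10 s
selected = min(frames, key=lambda name: abs(frames[name] - desired_time))
print(selected)  # frame-3
```

Under these assumed values, frame-3 325 is the stored frame nearest the estimated desired moment, matching the outcome illustrated in FIG. 3.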
[0047] Furthermore, the processor displays the frame-3 325 of the photographic image 300 that is desired by the user on a display of the camera device 310.
[0048] Advantageously, the embodiments specified in the present disclosure provide an efficient method of capturing real time photographic images using a motion sensor. By using the motion sensor, a desire to capture a photographic image is determined, and a plurality of frames associated with the photographic image is captured prior to the user clicking the capture button. Further, the method enables retrieval of the one frame, of the plurality of frames, that includes the accurate moment of the photographic image. Hence, the need for expensive camera devices with high-end hardware elements for capturing the real time photographic images is eliminated. As a result, the method provides a camera device with the motion sensor that is economical for capturing the real time photographic images. Further, tedious restructuring of the camera device is avoided, as the motion sensor, which is simple, is embedded into the camera device.
[0049] In the preceding specification, the present disclosure and its advantages have been described with reference to specific embodiments. However, it will be apparent to a person of ordinary skill in the art that various modifications and changes can be made without departing from the scope of the present disclosure, as set forth in the claims below. Accordingly, the specification and figures are to be regarded as illustrative examples of the present disclosure, rather than in a restrictive sense. All such possible modifications are intended to be included within the scope of the present disclosure.
I/We claim:
1. A method of capturing a photographic image in real time with reduced capture delay, the method comprising:
determining motion of a human finger towards a capture button of a camera device;
capturing a plurality of frames associated with the photographic image in response to the motion;
determining a time delay between a first instant and a second instant, wherein the first instant and the second instant are associated with the motion of the human finger towards the capture button;
retrieving a frame, from the plurality of frames, based on at least one of the time delay and a camera device delay; and
displaying the frame on a display of the camera device.
2. The method as claimed in claim 1, and further comprising:
storing the plurality of frames.
3. The method as claimed in claim 1, wherein the first instant comprises commencement of the motion of the human finger towards the capture button and the second instant comprises a click of the capture button by the human finger.
4. The method as claimed in claim 1, wherein the camera device delay comprises at least one of a shutter lag and a lens adjustment delay.
5. A system for capturing a photographic image in real time with reduced capture delay, the system comprising:
a communication interface for establishing communication;
a memory that stores instructions; and
a processor responsive to the instructions to determine motion of a human finger towards a capture button of a camera device;
capture a plurality of frames associated with the photographic image in response to the motion;
determine a time delay between a first instant and a second instant, wherein the first instant and the second instant are associated with the motion of the human finger towards the capture button;
retrieve a frame, from the plurality of frames, based on at least one of the time delay and a camera device delay; and
display the frame on a display of the camera device.
6. The system as claimed in claim 5, wherein the processor further comprises a motion sensor for determining the motion of the human finger towards the capture button of the camera device.
7. The system as claimed in claim 5, wherein the processor is further configured to store the plurality of frames.