Abstract: A method and system for generating a stereoscopic image using two mobile devices is provided. The method includes classifying the two mobile devices into a master device and a slave device, performing calibration to capture a first image, transmitting the first image to the slave device, capturing a second image by the master device and the slave device based on the lens separation distance, and processing a first view image and a second view image, of the second image, to generate the stereoscopic image. The system includes a communication interface for establishing communication, a memory that stores instructions and a processor responsive to the instructions for generating a stereoscopic image using two mobile devices.
A SYSTEM AND METHOD OF GENERATING STEREOSCOPIC IMAGES USING
MOBILE DEVICES
FIELD OF THE INVENTION
[0001] The present invention relates to the field of generating stereoscopic images using mobile devices.
BACKGROUND
[0002] In recent times, rendering two-dimensional (2D) images to generate three-dimensional (3D) images has gained popularity, since 3D images enhance depth perception for human eyes. The 3D images are also referred to as stereoscopic images. Various computers exist for generating 3D images. However, owing to advances in computer technology, miniaturized devices, for example mobile devices, are also being configured to generate 3D images.
[0003] A conventional technique for obtaining a stereoscopic image includes a mobile apparatus that includes a first camera and a second camera. By rotating the second camera by 180 degrees, the first camera and the second camera are aligned horizontally to obtain a similar photographic range for the first camera and the second camera. Further, the second camera is rotated such that a system controller determines a stereoscopic vision photographing mode. Further, when a user opens a camera shutter, the first camera and the second camera are allowed to perform a photographing process. Images obtained from the first camera and the second camera are tagged with information indicating whether each image is for the right eye or the left eye, to generate a stereoscopic image. Further, the images obtained from the first camera and the second camera are transmitted to the user.
[0004] However, the conventional method employs two cameras on a single mobile apparatus, thereby making the apparatus expensive.
[0005] In light of the foregoing discussion, there is a need for an efficient method and system for generating stereoscopic images using mobile devices.
SUMMARY
[0006] Embodiments of the present disclosure described herein provide a system and a method for generating stereoscopic images using two mobile devices.
[0007] An example of a method of generating a stereoscopic image using two mobile devices includes classifying the two mobile devices into one of a master device and a slave device. The method also includes performing calibration to capture a first image, the first image being located at a distance from the master device. The method further includes transmitting the first image to the slave device, wherein the first image captured by the master device is overlapped with the first image captured by the slave device. The overlapping is performed to determine a lens separation distance between the master device and the slave device. Further, the method includes capturing a second image by at least one of the master device and the slave device based on the lens separation distance. Moreover, the method includes processing at least one of a first view image and a second view image, of the second image, wherein the first view image and the second view image, of the second image, are captured, by the master device and the slave device, based on the lens separation distance, to generate the stereoscopic image.
[0008] An example of a system for generating a stereoscopic image using two mobile devices includes a communication interface for establishing communication. The system also includes a memory that stores instructions. The system further includes a processor responsive to the instructions to classify the two mobile devices into one of a master device and a slave device. The processor is also responsive to the instructions to perform calibration to capture a first image, the first image being located at a distance from the master device. The processor is further responsive to the instructions to transmit the first image to the slave device, wherein the first image captured by the master device is overlapped with the first image captured by the slave device. The overlapping is performed to determine a lens separation distance between the master device and the slave device. Further, the processor is responsive to the instructions to capture a second image by at least one of the master device and the slave device based on the lens separation distance. Moreover, the processor is responsive to the instructions to process at least one of a first view image and a second view image, of the second image, wherein the first view image and the second view image, of the second image, are captured, by the master device and the slave device, based on the lens separation distance, to generate the stereoscopic image.
BRIEF DESCRIPTION OF FIGURES
[0009] In the accompanying figures, similar reference numerals may refer to identical or functionally similar elements. These reference numerals are used in the detailed description to illustrate various embodiments and to explain various aspects and advantages of the present disclosure.
[0010] FIG. 1 is a flowchart illustrating a method of generating a stereoscopic image using two mobile devices, in accordance with one embodiment;
[0011] FIG. 2 is a block diagram of a master device for generating a stereoscopic image using two mobile devices, in accordance with one embodiment; and
[0012] FIGS. 3A to 3I are an exemplary illustration of generating a stereoscopic image using two mobile devices, in accordance with one embodiment.
DETAILED DESCRIPTION
[0013] It should be observed that the method steps and system components have been represented by conventional symbols in the figures, showing only those specific details that are relevant for an understanding of the present disclosure. Further, details that may be readily apparent to a person ordinarily skilled in the art may not have been disclosed. In the present disclosure, relational terms such as first and second, and the like, may be used to distinguish one entity from another entity, without necessarily implying any actual relationship or order between such entities.
[0014] Embodiments of the present disclosure described herein provide a method and system to generate a stereoscopic image or a stereoscopic video using two mobile devices.
[0015] FIG. 1 is a flowchart illustrating a method of generating a stereoscopic image using two mobile devices, in accordance with one embodiment.
[0016] The method starts at step 105.
[0017] At step 110, the two mobile devices are classified into a master device and a slave device. Examples of the mobile devices include, but are not limited to, mobile phones, smart phones, personal digital assistants, and the like. Configuration data is exchanged between the mobile devices prior to classification. The configuration data is exchanged to determine the master device and the slave device. Examples of the configuration data include, but are not limited to, resolution of camera, distance sensor information, processor information, display information, resource utilization information, and network connectivity information, associated with the master device and the slave device.
[0018] The classification of the two mobile devices into the master device and the slave device occurs based on the configuration data. Selection of the master device is based on the resolution of the camera, the processor information, the display information, the resource utilization information, and the network connectivity information. If the two mobile devices have similar capabilities, the master device is chosen based on the resource utilization information, as illustrated in the sketch below.
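By way of a non-limiting illustration, the selection logic can be sketched as follows. The dictionary field names and the tie-breaking rule are assumptions made for exposition; the disclosure does not prescribe a data format.

```python
# Hypothetical sketch of the master/slave classification step. The field names
# ('camera_resolution_mp', 'cpu_mhz', ...) are illustrative, not part of the
# disclosure.
def choose_master(config_a: dict, config_b: dict) -> str:
    def capability(cfg):
        return (cfg["camera_resolution_mp"], cfg["cpu_mhz"],
                cfg["display_width_px"], cfg["network_bandwidth_kbps"])

    cap_a, cap_b = capability(config_a), capability(config_b)
    if cap_a != cap_b:
        # The more capable device becomes the master.
        return "A" if cap_a > cap_b else "B"
    # Similar capabilities: fall back to resource utilization (lower load wins).
    return "A" if config_a["cpu_load_pct"] <= config_b["cpu_load_pct"] else "B"
```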
[0019] At step 115, calibration is performed by the master device to capture a first image within a display of the master device. The first image can include a random 2D image that is located at a distance from the master device. Further, the slave device also captures the first image.
[0020] At step 120, the first image is extrapolated based on the distance of the master device from the first image and the display information obtained from the slave device. The first image that is extrapolated can also be referred to as an extrapolated image. Further, the extrapolated image generated by the master device is transmitted to the slave device. Furthermore, the extrapolated image that is sent to the slave device is overlapped with the first image captured by the slave device.
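The exact extrapolation rule is not specified in the disclosure; a minimal sketch, assuming a pinhole-camera scaling model and a hypothetical reference distance, could look like the following.

```python
from PIL import Image

# Illustrative extrapolation step: rescale the master's first image using the
# measured distance and the slave's display size, so the slave can overlay it
# at a comparable scale. The inverse-distance scaling rule and the reference
# distance are assumptions.
def extrapolate(first_image: Image.Image, d_meters: float,
                slave_display: tuple[int, int]) -> Image.Image:
    slave_w, slave_h = slave_display
    reference_distance_m = 1.0  # hypothetical calibration reference
    scale = reference_distance_m / d_meters
    w = min(slave_w, max(1, int(first_image.width * scale)))
    h = min(slave_h, max(1, int(first_image.height * scale)))
    return first_image.resize((w, h))
```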
[0021] The overlapping is performed to determine a lens separation distance between the master device and the slave device. Hence, the first image that is captured by the master device and the slave device is used to determine the lens separation distance. The position of the first image captured by the slave device is altered such that the extrapolated image overlaps accurately with the first image.
[0022] Further, the overlapping enables obtaining the lens separation distance between the master device and the slave device. The point at which the first image captured by the slave device overlaps exactly with the extrapolated image determines the lens separation distance. The lens separation distance corresponds to the separation distance between human eyes that is required for perceiving a stereoscopic image.
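One way to realize this overlap, sketched below under stated assumptions (equal-sized grayscale captures, a hypothetical millimeters-per-pixel calibration constant, and border wrap-around ignored), is to search for the horizontal shift that best aligns the two images and convert that shift to a physical offset.

```python
import numpy as np

# Minimal overlap sketch: the horizontal shift minimizing the mean squared
# error between the extrapolated image and the slave's capture is converted to
# a physical separation. 'mm_per_pixel' is a hypothetical calibration constant.
def lens_separation_mm(extrapolated: np.ndarray, slave_capture: np.ndarray,
                       mm_per_pixel: float, max_shift: int = 200) -> float:
    best_shift, best_err = 0, float("inf")
    for shift in range(-max_shift, max_shift + 1):
        shifted = np.roll(slave_capture, shift, axis=1)  # wrap-around ignored
        err = float(np.mean((extrapolated.astype(float) - shifted) ** 2))
        if err < best_err:
            best_err, best_shift = err, shift
    # A typical target is roughly 65 mm, the average human inter-pupillary
    # distance.
    return abs(best_shift) * mm_per_pixel
```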
[0023] Further, upon obtaining the lens separation distance, the slave device transmits a calibration complete command to the master device.
[0024] At step 125, a second image is captured, by the master device and the slave device, maintaining the lens separation distance. A first view of the second image is captured by the master device to generate a first view image. The first view image can be the second image captured from a perspective of the master device at one angle. In one example, the first view can include a front view of the second image. Further, a second view of the second image is captured by the slave device to generate a second view image. The second view image can be the second image captured from another perspective, of the slave device, at another angle. In one example, the second view can include a side view of the second image.
[0025] Further, at step 125, the second view image captured by the slave device is transmitted, by the slave device, to the master device. In one example, wireless network protocols can be used for transmitting the second view image to the master device. Similarly, various other transmission protocols can be used for transmission.
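The disclosure leaves the transport unspecified; a minimal sketch of the slave-to-master transfer over a plain TCP socket, with an assumed port number and a simple length-prefix framing, is shown below.

```python
import socket

# Hypothetical transfer of the encoded second view image to the master device.
# The port number and the 4-byte length-prefix framing are assumptions.
def send_view_image(master_ip: str, image_bytes: bytes, port: int = 5005) -> None:
    with socket.create_connection((master_ip, port)) as sock:
        sock.sendall(len(image_bytes).to_bytes(4, "big"))  # frame length first
        sock.sendall(image_bytes)
```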
[0026] At step 130, the first view image and the second view image are processed by the master device to generate the stereoscopic image for the second image. 3D image processing algorithms are employed to process the first view image and the second view image, of the second image.
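The disclosure does not name a specific algorithm; one common possibility, shown as a sketch only, is composing a red-cyan anaglyph from the two views (assuming equal-sized RGB arrays).

```python
import numpy as np

# One possible stand-in for the "3D image processing" step: a red-cyan
# anaglyph. This is not necessarily the algorithm the master device employs.
def make_anaglyph(left_rgb: np.ndarray, right_rgb: np.ndarray) -> np.ndarray:
    anaglyph = np.empty_like(left_rgb)
    anaglyph[..., 0] = left_rgb[..., 0]     # red channel from the left view
    anaglyph[..., 1:] = right_rgb[..., 1:]  # green/blue from the right view
    return anaglyph
```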
[0027] Further, the stereoscopic image is transmitted to the slave device such that a user of the slave device can view the stereoscopic image. Transmission protocols, for example wireless network protocols, can be used for transmission of the stereoscopic image.
[0028] Similarly, a stereoscopic video can also be generated using the method mentioned in the above paragraphs.
[0029] The method stops at step 135.
[0030] FIG. 2 is a block diagram of a master device for generating a stereoscopic image using two mobile devices, in accordance with one embodiment.
[0031] The master device 200 includes a bus 205 or other communication mechanism for communicating information, and a processor 210 coupled with the bus 205 for processing information. The master device 200 also includes a memory 215, for example a random access memory (RAM) or other dynamic storage device, coupled to the bus 205 for storing information and instructions to be executed by the processor 210. The memory 215 can be used for storing temporary variables or other intermediate information during execution of instructions by the processor 210. The master device 200 further includes a read only memory (ROM) 220 or other static storage device coupled to the bus 205 for storing static information and instructions for the processor 210. A storage unit 225, for example a magnetic disk or optical disk, is provided and coupled to the bus 205 for storing information, for example information of a distance associated with the location of a first image, information associated with a lens separation distance, and the like.
[0032] The master device 200 can be coupled via the bus 205 to a display 230, for example a cathode ray tube (CRT), for displaying one or more images, for example the first image, a second image, and a stereoscopic image for the second image. An input device 235, including alphanumeric and other keys, is coupled to the bus 205 for communicating information and command selections to the processor 210. Another type of user input device is a cursor control 240, for example a mouse, a trackball, or cursor direction keys, for communicating direction information and command selections to the processor 210 and for controlling cursor movement on the display 230.
[0033] Various embodiments are related to the use of the master device 200 for implementing the techniques described herein. In some embodiments, the techniques are performed by the master device 200 in response to the processor 210 executing instructions included in the memory 215. Such instructions can be read into the memory 215 from another machine-readable medium, for example the storage unit 225. Execution of the instructions included in the memory 215 causes the processor 210 to perform the process steps described herein.
[0034] In some embodiments, the processor 210 can include one or more processing units for performing one or more functions of the processor 210. The processing units are hardware circuitry used in place of or in combination with software instructions to perform specified functions.
[0035] The term "machine-readable medium" as used herein refers to any medium that participates in providing data that causes a machine to perform a specific function. In an embodiment implemented using the master device 200, various machine-readable media are involved, for example, in providing instructions to the processor 210 for execution. The machine-readable medium can be a storage medium, either volatile or non-volatile. A volatile medium includes, for example, dynamic memory, such as the memory 215. A non-volatile medium includes, for example, optical or magnetic disks, for example the storage unit 225. All such media must be tangible to enable the instructions carried by the media to be detected by a physical mechanism that reads the instructions into a machine.
[0036] Common forms of machine-readable media include, for example, a floppy disk, a flexible disk, a hard disk, magnetic tape or any other magnetic media, a CD-ROM or any other optical media, punch cards, paper tape or any other physical media with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, or any other memory chip or cartridge.
[0037] In another embodiment, the machine-readable media can be transmission media including coaxial cables, copper wire, and fiber optics, including the wires that comprise the bus 205. Transmission media can also take the form of acoustic or light waves, such as those generated during radio-wave and infrared data communications. Examples of machine-readable media may include, but are not limited to, a carrier wave as described hereinafter or any other media from which the master device 200 can read. For example, the instructions can initially be carried on a magnetic disk of a remote computer. The remote computer can load the instructions into its dynamic memory and send the instructions over a telephone line using a modem. A modem local to the master device 200 can receive the data on the telephone line and use an infrared transmitter to convert the data to an infrared signal. An infrared detector can receive the data carried in the infrared signal, and appropriate circuitry can place the data on the bus 205. The bus 205 carries the data to the memory 215, from which the processor 210 retrieves and executes the instructions. The instructions received by the memory 215 can optionally be stored on the storage unit 225 either before or after execution by the processor 210.
[0038] The master device 200 also includes a communication interface 245 coupled to the bus 205. The communication interface 245 provides a two-way data communication coupling to the processor 210. For example, the communication interface 245 can be an integrated services digital network (ISDN) card or a modem to provide a data communication connection to a corresponding type of telephone line. As another example, the communication interface 245 can be a local area network (LAN) card to provide a data communication connection to a compatible LAN. In any such implementation, the communication interface 245 sends and receives electrical, electromagnetic or optical signals that carry digital data streams representing various types of information.
[0039] The processor 210 in the master device 200 is configured to classify the two mobile devices into the master device 200 and a slave device. The processor 210 classifies the mobile devices into the master device 200 and the slave device based on configuration data.
[0040] The processor 210 is configured to exchange the configuration data among the mobile devices to determine the master device 200 and the slave device. Examples of the configuration data include, but are not limited to, resolution of camera, distance sensor information, processor information, display information, resource utilization information, RAM usage information, battery usage information and network connectivity information, associated with the master device 200 and the slave device.
[0041] A data exchange unit 250 included in the processor 210 is operable to identify the two mobile devices located in close proximity to each other and further performs exchange of the configuration data among the mobile devices.
[0042] The processor 210 further includes a master/slave identifier unit 265 for determining the master device 200 and the slave device based on the configuration data. The master/slave identifier unit 265 determines the master device 200 as the mobile device having a higher camera resolution, greater processing power, and better network connectivity. The master/slave identifier unit 265 further maintains information associated with the master device 200 and the slave device upon classification.
[0043] The processor 210 is configured to perform calibration to capture a first image. The first image is located at a distance from the master device 200. A distance measurement sensor 275 included in the processor 210 is used for determining the distance. In one example, the distance measurement sensor 275 can be a sound-based sensor or a light-based sensor.
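For a sound-based sensor, the distance follows from the time of flight of a reflected pulse; the formula below is standard physics, while the function wrapper is only an illustrative assumption.

```python
# Time-of-flight distance estimate for a sound-based sensor:
# distance = speed_of_sound * round_trip_time / 2.
SPEED_OF_SOUND_M_S = 343.0  # in air at roughly 20 degrees Celsius

def distance_meters(round_trip_seconds: float) -> float:
    return SPEED_OF_SOUND_M_S * round_trip_seconds / 2.0
```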
[0044] A calibration unit 255 included in the processor 210 is operable for performing the calibration. The calibration includes capturing the first image, using a camera 270, within the display 230 of the master device 200. The calibration unit 255 performs calibration of the first image based on the distance.
[0045] Further, the processor 210 extrapolates the first image, referred to as an extrapolated image, prior to transmission. Extrapolation is performed based on the distance between the master device 200 and the first image, and the display information obtained from the slave device.
[0046] The extrapolated image is overlapped with the first image that is captured by the slave device. A camera, for example the camera 270, included in the slave device, may be used for capturing the first image by the slave device.
[0047] A calibration unit, for example the calibration unit 255, can also be included in the slave device for overlapping the extrapolated image with the first image that is captured by the slave device.
[0048] The calibration unit performs the overlapping of the extrapolated image with the first image that is captured by the slave device in order to determine a lens separation distance between the master device and the slave device, such that the lens separation distance is similar to the separation distance existing between human eyes for perceiving the stereoscopic image.
[0049] Further, upon obtaining the lens separation distance, the calibration unit included in the slave device is operable to transmit a calibration complete command to the master device. Hence, the lens separation distance is acknowledged by the master device 200 and the slave device.
[0050] Further, when a change in the lens separation distance is detected, the processor 210 of the master device 200 is configured to send an alert message to a user of the master device 200 so that the position of the master device 200 can be altered to keep the lens separation distance constant.
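A minimal sketch of this alert behavior, assuming a tolerance threshold and a notify_user callback that the disclosure does not specify:

```python
# Hypothetical drift check: alert the user when the measured separation strays
# beyond a tolerance from the calibrated value. The threshold and the callback
# are assumptions.
def check_separation(current_mm: float, calibrated_mm: float,
                     notify_user, tolerance_mm: float = 2.0) -> None:
    if abs(current_mm - calibrated_mm) > tolerance_mm:
        notify_user(f"Reposition device: separation is {current_mm:.1f} mm "
                    f"(target {calibrated_mm:.1f} mm)")
```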
[0051] Furthermore, the processor 210 of the master device 200 is configured to capture a first view of a second image to generate a first view image. The second image is also captured using the camera 270.
[0052] Further, a processor, for example the processor 210, can also be included in the slave device and is configured for capturing a second view of the second image to generate a second view image. The camera included in the slave device can be used for capturing the second view of the second image.
[0053] Further, the processor included in the slave device is operable to transmit the second view image that is captured by the slave device to the master device 200.
[0054] The processor 210 is further operable to process the first view image and the second view image to generate a stereoscopic image for the second image. A 3D image generator 260 included in the processor 210 is operable for processing the first view and the second view of the second image for generating the stereoscopic image. The 3D image generator 260 employs one or more 3D image processing algorithms for processing the first view and the second view of the second image for generating the stereoscopic image.
[0055] Further, the processor 210 is operable to transmit the stereoscopic image to the slave device so that a user of the slave device can view the stereoscopic image. One or more network protocols, for example wireless network protocols can be used for transmission of the stereoscopic image to the slave device.
[0056] FIGS. 3A to 3I are an exemplary illustration of generating a stereoscopic image using two mobile devices, in accordance with one embodiment.
[0057] FIG. 3A includes a mobile device 305 and a mobile device 310. The mobile device 305 and the mobile device 310 are required to be classified into a master device and a slave device. Hence, configuration data 315 is exchanged between the mobile device 305 and the mobile device 310 prior to classification.
[0058] Upon exchange of the configuration data 315, it is determined that the mobile device 305 has a higher camera resolution, greater processing power, and better network connectivity. Hence, the mobile device 305 is considered the master device. Further, the mobile device 310 is considered the slave device.
[0059] The master device performs calibration to capture a first image 320 using a camera 325 embedded within the mobile device 305. The first image 320 captured by the master device is displayed on a display of the master device as shown in FIG. 3B. The first image 320 can include a random 2D image that is located at a distance 'd', as shown in FIG. 3B, from the master device.
[0060] The calibration includes capturing the first image 320, at a distance 'd', within the display of the master device. A distance measurement sensor 330, included in the mobile device 305, is used to determine the distance 'd' for performing the calibration by the master device.
[0061] Further, the slave device also captures the first image 320, using a camera 340 embedded within the mobile device 310, as shown in FIG. 3C. The first image 320 captured by the slave device is displayed on a display of the slave device as shown in FIG. 3C.
[0062] Further, the master device extrapolates the first image 320, based on the distance 'd' determined by the distance measurement sensor 330, to generate an extrapolated image 335. The extrapolated image 335 is further transmitted, by the master device, to the slave device as shown in FIG. 3C.
[0063] The slave device further overlaps the extrapolated image 335 with the first image 320 that is captured by the slave device, as shown in FIG. 3D. The position of the first image 320 captured by the slave device is altered, as shown at 380, such that the extrapolated image 335 overlaps accurately with the first image 320, as shown in FIG. 3D.
[0064] Upon overlapping the extrapolated image 335 with the first image 320 that is captured by the slave device, an overlapped image 345 is generated within the slave device as shown in FIG. 3E. The overlapping enables obtaining a lens separation distance 370 between the master device and the slave device such that the lens separation distance 370 is similar to the separation distance existing between human eyes for perceiving a stereoscopic image.
[0065] Further, upon obtaining the lens separation distance 370, the slave device transmits a calibration complete command 350 to the master device as shown in FIG. 3E.
[0066] Further, a second image 375 is captured, by the master device and the slave device, maintaining the lens separation distance 370. The second image 375 can include an arbitrary 3D object. In one example, a front view of the second image 375 is captured from one perspective, of the master device, to generate a front view image 355 as shown in FIG. 3F. Further, in one example, a side view of the second image 375 is captured from another perspective, of the slave device, to generate a side view image 360 as shown in FIG. 3F.
[0067] Also, the side view image 360 is transmitted to the master device as shown in FIG. 3G. In one example, wireless network protocols can be used for transmitting the side view image 360 to the master device. Similarly, various other transmission protocols can be used for transmission.
[0068] Further, the front view image 355 and the side view image 360 are processed by the master device to generate a stereoscopic image 365 as shown in FIG. 3H. 3D image processing algorithms are employed to process the front view image 355 and the side view image 360, of the second image 375 shown in FIG. 3F, to generate the stereoscopic image 365.
[0069] Further, the stereoscopic image 365 is transmitted to the slave device, as shown in FIG. 3I, such that a user of the slave device can view the stereoscopic image 365. Transmission protocols, for example wireless network protocols, can be used for transmission of the stereoscopic image 365.
[0070] Advantageously, the embodiments specified in the present disclosure provide an efficient method to generate stereoscopic images using simple camera-enabled mobile devices. Further, the invention enables participation of any mobile devices located in close proximity to each other for generation and sharing of the stereoscopic images. Further, by using simple mobile devices, implementation costs for generating the stereoscopic images are greatly reduced.
[0071] In the preceding specification, the present disclosure and its advantages have been described with reference to specific embodiments. However, it will be apparent to a person of ordinary skill in the art that various modifications and changes can be made, without departing from the scope of the present disclosure, as set forth in the claims below. Accordingly, the specification and figures are to be regarded as illustrative examples of the present disclosure, rather than in a restrictive sense. All such possible modifications are intended to be included within the scope of the present disclosure.
I/We claim:
1. A method of generating a stereoscopic image using two mobile devices, the method comprising:
classifying the two mobile devices into one of a master device and a slave device;
performing calibration to capture a first image, the first image being located at a distance from the master device;
transmitting the first image to the slave device, wherein the first image captured by the master device is overlapped with the first image captured by the slave device, overlapping being performed to determine a lens separation distance between the master device and the slave device;
capturing a second image by at least one of the master device and the slave device based on the lens separation distance; and
processing at least one of a first view image and a second view image, of the second image, wherein the first view image and the second view image, of the second image, are captured, by the master device and the slave device, maintaining the lens separation distance to generate the stereoscopic image.
2. The method as claimed in claim 1 and further comprising:
transmitting the stereoscopic image, by the master device, to the slave device.
3. The method as claimed in claim 1 and further comprising:
exchanging configuration data among the two mobile devices to determine the master device and the slave device.
4. The method as claimed in claim 1 and further comprising: capturing the first image by the slave device.
5. The method as claimed in claim 1 and further comprising:
transmitting a calibration complete command, by the slave device, to the master device prior to capturing the second image by at least one of the master device and the slave device.
6. The method as claimed in claim 1, wherein the first view image, of the second image, is captured by the master device.
7. The method as claimed in claim 1, wherein the second view image, of the second image, is captured by the slave device.
8. The method as claimed in claim 1 and further comprising:
transmitting the second view image, of the second image, by the slave device, to the master device.
9. The method as claimed in claim 1, wherein 3D image processing algorithms are
employed to process at least one of the first view image, of the second image, and the
second view image of the second image.
10. A system for generating a stereoscopic image using two mobile devices, the system comprising:
a communication interface for establishing communication;
a memory that stores instructions; and
a processor responsive to the instructions to classify the two mobile devices into one of a master device and a slave device;
perform calibration to capture a first image, the first image being located at a distance from the master device;
transmit the first image to the slave device, wherein the first image captured by the master device is overlapped with the first image captured by the slave device,
overlapping being performed to determine a lens separation distance between the master device and the slave device;
capture a second image by at least one of the master device and the slave device based on the lens separation distance; and
process at least one of a first view image and a second view image, of the second image, wherein the first view image and the second view image, of the second image, are captured, by the master device and the slave device, maintaining the lens separation distance to generate the stereoscopic image.
11. The system as claimed in claim 10, wherein the processor further comprises a data exchange unit to exchange configuration data among the two mobile devices to determine the master device and the slave device.
12. The system as claimed in claim 10, wherein the processor further comprises a distance measurement sensor for determining the distance.
13. The system as claimed in claim 10, wherein the processor further comprises a calibration unit to perform the calibration.
14. The system as claimed in claim 10, wherein the slave device is configured to:
capture the first image;
overlap the first image captured by the master device with the first image captured by the slave device; and
transmit a calibration complete command to the master device.
15. The system as claimed in claim 10, wherein the first view image, of the second image, is captured by the master device.
16. The system as claimed in claim 10, wherein the second view image, of the second image, is captured by the slave device.
17. The system as claimed in claim 10, wherein the processor further comprises a 3D image generator to process at least one of the first view image of the second image and the second view image of the second image to generate the stereoscopic image.