
"A System And Method For Controlling A Display Device Using A Camera Based Device"

Abstract: A method and system for controlling a display device using a camera based device are provided. The method includes configuring at least one of a camera based device and a display device to enable communication, enabling the camera based device to perform calibration to capture an initial calibrated boundary of the display device, calculating a gesture boundary corresponding to a gesture associated with the camera based device, determining the gesture based on a position of the gesture boundary relative to the initial calibrated boundary, transmitting a command from the camera based device to the display device based on the gesture, and enabling the display device to process the command. The system includes a display device configured to receive multiple commands, a camera based device configured to transmit the commands based on multiple gestures, and a network for establishing communication.


Patent Information

Application #:
Filing Date: 07 May 2012
Publication Number: 36/2016
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Status:
Parent Application:
Patent Number:
Legal Status:
Grant Date: 2022-06-02
Renewal Date:

Applicants

SAMSUNG ELECTRONICS COMPANY
416 MAETAN-DONG, YEONGTONG-GU, SUWON-SI, GYEONGGI-DO 442-742

Inventors

1. PRASANTH JAYACHANDRAN
#45 SAMUNDY NAGAR, THINDAL, ERODE - 638 012
2. Subramanian Muthukumar
3/14, Kavalkaran Street, Periyandankovil-West, Karur - 639002

Specification

A SYSTEM AND METHOD FOR CONTROLLING A DISPLAY DEVICE USING A CAMERA BASED DEVICE

FIELD OF THE INVENTION

[0001] The present invention relates to the field of controlling a display device.

BACKGROUND

[0002] In recent times, mobile devices, for example mobile phones, personal digital assistants (PDAs) and other handheld devices, have become increasingly popular for entertainment purposes, for example playing games, watching television programs on mobile phones and the like. Input devices, for example, but not limited to, input buttons of the mobile device, a pointing device and a joystick can be used for providing inputs and for controlling the mobile devices. However, the input devices are miniaturized in nature and hence a user cannot conveniently provide inputs. Further, the display associated with the mobile device is also miniaturized in nature and hence the user cannot view images displayed on the mobile device distinctly. Further, there exist circumstances where a user wishes to view the content of a display device through a mobile device and further control the display device using the mobile device.

[0003] One conventional technique aims at selecting an input value based on a motion sensed by, for example, a sensor embedded in a handheld device. The sensed motion is used to vary the position of a graphical element displayed on the handheld device. The position of the graphical element is then used to identify the input value. The input value may be used to perform a function on the handheld device or an external device. In one example, the input value may be used for opening a lock. In another example, the input value may be used for retaining an image displayed on the handheld device. However, in this technique the input is provided in the form of motion, and an additional device, for example a sensor, is used for sensing the motion.

[0004] In another conventional technique, a motion sensor is coupled to a device. The motion sensor provides a motion signal corresponding to movement of the device. The technique further includes a processor to process the motion signal, which serves as an input to the processor. Processing includes identifying either a tap command or a position command associated with the motion signal. If the processor identifies the tap command, then one or more actions responsive to the tap command are performed by the processor. Further, if the processor identifies the position command, then one or more actions responsive to the position command, for example controlling and operating the device, are performed by the processor. However, in this technique an additional device, namely the motion sensor, is used for sensing the input, and a processor is used for processing the input.
[0005] In the light of the foregoing discussion, there is a need for a system and a method for distinctly viewing and controlling a display device using a single input device.

SUMMARY

[0006] Embodiments of the present disclosure described herein provide a system for controlling a display device using a camera based device.

[0007] An example of a method of controlling a display device using a camera based device includes configuring at least one of a camera based device and a display device to enable communication between the camera based device and the display device. The method also includes enabling the camera based device to perform calibration to capture an initial calibrated boundary of the display device. The method further includes calculating a gesture boundary based on a gesture associated with the camera based device. Further, the method includes determining the gesture associated with the camera based device based on a position of the gesture boundary relative to the initial calibrated boundary. Furthermore, the method includes transmitting a command from the camera based device to the display device using a network, the command being transmitted based on the gesture. Moreover, the method includes enabling the display device to process the command, processing being performed to display a plurality of entities corresponding to the command.

[0008] An example of a system for controlling a display device using a camera based device includes a display device configured to receive multiple commands. The system also includes a camera based device configured to transmit the multiple commands, the multiple commands being transmitted based on multiple gestures. The system further includes a network for establishing communication between the display device and the camera based device.

BRIEF DESCRIPTION OF FIGURES

[0009] In the accompanying figures, similar reference numerals may refer to identical or functionally similar elements. These reference numerals are used in the detailed description to illustrate various embodiments and to explain various aspects and advantages of the present disclosure.

[0010] FIG. 1 is a block diagram of a system for controlling a display device using a camera based device, in accordance with one embodiment;

[0011] FIG. 2 is a block diagram of a display device, in accordance with one embodiment;

[0012] FIG. 3 is a flowchart illustrating a method of controlling a display device using a camera based device, in accordance with one embodiment; and

[0013] FIGS. 4A-4I are exemplary illustrations of controlling a display device using a mobile phone, in accordance with one embodiment.

DETAILED DESCRIPTION

[0014] It should be observed that the method steps and system components have been represented by conventional symbols in the figures, showing only those specific details that are relevant for an understanding of the present disclosure. Further, details that may be readily apparent to a person ordinarily skilled in the art may not have been disclosed. In the present disclosure, relational terms such as first and second, and the like, may be used to distinguish one entity from another entity, without necessarily implying any actual relationship or order between such entities.

[0015] Embodiments of the present disclosure described herein provide a system and a method of controlling a display device using a camera based device.

[0016] FIG. 1 is a block diagram of a system 100 for controlling a display device using a camera based device, in accordance with one embodiment. The system 100 includes a display device 105, a network 110 and a camera based device 115. The display device 105 can also be referred to as an electronic device configured with a display feature. Examples of the display device 105 include, but are not limited to, televisions, mobile phones, computers, laptops, handheld devices, personal digital assistants (PDAs) and telecommunication devices. Examples of the network 110 include, but are not limited to, local area networks (LANs), wide area networks (WANs) and wireless networks. The camera based device 115 includes a camera and is operable to capture entities using the camera. Examples of the camera based device 115 include, but are not limited to, a mobile phone including a camera, digital cameras, webcams and other electronic devices embedded with a camera.

[0017] The display device 105 is operable to display one or more entities included in a video stream or contents included in an application running on the display device 105. The camera based device 115 is configured to capture the entities displayed. Further, the camera based device 115 moves in one or more directions with respect to a position of the display device 105. The camera based device 115, upon moving, generates a command corresponding to the direction moved. Further, the camera based device 115 can be rotated in a clockwise or an anticlockwise direction with respect to the position of the display device 105, and a command corresponding to the rotation of the camera based device 115 is generated. The command generated is then transmitted to the display device 105. Transmission of the command is performed using various networks, for example, but not limited to, wireless networks such as Bluetooth, wireless USB, NFC, Wi-Fi and the like.
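
For illustration, the command path can be sketched as a single send over whichever transport links the two devices. A minimal sketch follows, assuming a plain TCP socket stands in for the wireless transport, since the specification names the transports but not a wire format; the host, port and command string are illustrative assumptions.

import socket

def send_command(command: str, display_host: str, port: int = 5000) -> None:
    """Transmit one gesture-derived command from the camera based device
    to the display device over the network (TCP assumed for this sketch)."""
    with socket.create_connection((display_host, port)) as conn:
        conn.sendall(command.encode("utf-8"))

# e.g. after a clockwise rotation of the camera based device:
# send_command("rotate-clockwise", "192.168.1.20")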

[0018] The display device 105 is configured to receive the command transmitted by the camera based device 115. Upon receiving the command, the entities, captured by the camera based device 115 and displayed on the display device 105, are moved responsive to the command. The entities may be moved towards, for example, but not limited to, the right, left, top or bottom of the display device 105. Further, the entities may be rotated in a clockwise or anticlockwise direction. The entities so moved are displayed on the display device 105 for viewing by a user. Hence, the camera based device 115 can be used for controlling the movement of entities displayed on the display device 105.

[0019] A block diagram of the display device 105, configured to receive the command from the camera based device 115 and to display the entities responsive to the command, is explained in detail in conjunction with FIG. 2.

[0020] FIG. 2 is a block diagram of a display device 105, in accordance with one embodiment. The display device 105 includes a bus 205 for communicating information, and a processor 210 coupled with the bus 205 for processing one or more commands transmitted by a camera based device, for example the camera based device 115. The display device 105 also includes a memory 215, for example a random access memory (RAM), coupled to the bus 205 for storing the commands required by the processor 210. The memory 215 can be used for storing temporary information required by the processor 210. The display device 105 further includes a read only memory (ROM) 220 coupled to the bus 205 for storing static information required by the processor 210. A storage unit 225, for example a magnetic disk, hard disk or optical disk, can be provided and coupled to the bus 205 for storing information.

[0021] The display device 105 can be coupled via the bus 205 to a display 230, for example a cathode ray tube (CRT), liquid crystal display (LCD), plasma display and the like, for displaying entities or images. An input device 235, including various keys, is coupled to the bus 205 for communicating the commands to the processor 210. In some embodiments, a cursor control 240, for example a mouse, a trackball, a joystick, or cursor direction keys, for communicating the commands to the processor 210 and for controlling cursor movement on the display 230 can also be present.

[0022] In one embodiment, the steps of the present disclosure are performed by the display device 105 using the processor 210. The commands can be read into the memory 215 from a machine-readable medium, for example the storage unit 225. In alternative embodiments, hard-wired circuitry can be used in place of or in combination with software instructions to implement various embodiments.
[0023] The term machine-readable medium can be defined as a medium providing data to a machine to enable the machine to perform a specific function. The machine-readable medium can be a storage media. Storage media can include non-volatile media and volatile media. The storage unit 225 can be a non-volatile media. The memory 215 can be a volatile media. All such media must be tangible to enable the instructions carried by the media to be detected by a physical mechanism that reads the instructions into the machine.

[0024] Examples of the machine-readable medium include, but are not limited to, a floppy disk, a flexible disk, a hard disk, magnetic tape, a CD-ROM, an optical disk, punch cards, paper tape, a RAM, a PROM, an EPROM, and a FLASH-EPROM.

[0025] The machine-readable medium can also include online links, download links, and installation links providing the information to the processor 210.

[0026] The display device 105 also includes a communication interface 245 coupled to the bus 205 for enabling data communication. Examples of the communication interface 245 include, but are not limited to, an integrated services digital network (ISDN) card, a modem, a local area network (LAN) card, an infrared port, a Bluetooth port, a zigbee port, and a wireless port.

[0027] The display device 105 receives the commands transmitted by the camera based device. The commands are received using the communication interface 245. The processor 210, in response to the commands, is operable to perform one or more actions on the entities or the images displayed corresponding to the commands. Examples of the one or more actions include, but are not limited to, moving the entities or the images displayed towards the right, left, top or bottom of the display 230. The one or more actions also include rotating the entities or the images in a clockwise or an anticlockwise direction.

[0028] In one embodiment, the processor 210 can be included in a camera based device. The processor 210 included in the camera based device is operable to perform calibration to capture an initial calibrated boundary of the display device. The processor 210 is also operable to calculate a gesture boundary based on a gesture associated with the camera based device. The processor 210 is further operable to determine the gesture associated with the camera based device based on a position of the gesture boundary relative to the initial calibrated boundary. The gesture includes movement of the camera based device towards or away from the display device 105. The gesture can also include movement of the camera based device towards the right, left, top or bottom with respect to the position of the display device 105. The gesture can further include rotating the camera based device in a clockwise or an anticlockwise direction. Upon determining the gesture, the processor 210 is configured to transmit a command to the display device using a network, the command being transmitted based on the gesture.
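
As a minimal sketch of this processor-side flow, under the assumption that some detection routine (which the specification does not fix) can locate the display's boundary in each camera frame, the calibrate / track / determine / transmit steps can be arranged as below; classify and transmit are sketched later, after paragraphs [0038] and [0044].

from dataclasses import dataclass
from typing import Callable, Optional

@dataclass(frozen=True)
class Boundary:
    """Rectangle of the display device's image on the camera preview."""
    cx: float           # centre x, pixels
    cy: float           # centre y, pixels (image coordinates: y grows down)
    width: float
    height: float
    angle: float = 0.0  # degrees, anticlockwise positive

def control_loop(detect_boundary: Callable[[], Boundary],
                 classify: Callable[[Boundary, Boundary], Optional[str]],
                 transmit: Callable[[str], None]) -> None:
    """Calibrate once, then track the gesture boundary and send commands."""
    calibrated = detect_boundary()            # initial calibrated (solid) boundary
    while True:
        gesture_boundary = detect_boundary()  # moves as the device moves
        gesture = classify(calibrated, gesture_boundary)
        if gesture is not None:
            transmit(gesture)                 # mapped to a command as in [0044]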

[0029] In some embodiments, the processor 210 can include one or more processing units for performing one or more functions of the processor 210. The processing units are hardware circuitry performing specified functions.

[0030] A method of controlling a display device using a camera based device is explained in detail in conjunction with FIG. 3.

[0031] FIG. 3 is a flowchart illustrating a method of controlling a display device using a camera based device, in accordance with one embodiment. The method starts at step 305. At step 310, at least one of a camera based device and a display device is configured using communication interfaces. Examples of the display device 105 include, but are not limited to, televisions, mobile phones, computers, laptops, handheld devices, personal digital assistants (PDAs) and telecommunication devices. Examples of the camera based device 115 include, but are not limited to, a mobile phone including a camera, digital cameras, webcams and other electronic devices embedded with a camera. Configuration is performed to enable communication between the camera based device 115 and the display device 105, and one or more communication protocols may be used for it. Further, the configuration can also include detection of a particular camera based device 115 by the display device 105, since communication between the camera based device 115 and the display device 105 may have been established previously. The display device 105 can store the camera based device 115 in a database, for example a first database, for detection of the camera based device 115 in the future. Similarly, the camera based device 115 can store the display device 105 when communication has been established previously.
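
A minimal sketch of this pairing record, assuming device identifiers are plain strings and a JSON file stands in for the first database (the specification fixes neither):

import json
from pathlib import Path

PAIRED_DEVICES = Path("paired_camera_devices.json")   # hypothetical store

def remember_device(device_id: str) -> None:
    """Store a camera based device so the display device can re-detect it."""
    known = set(json.loads(PAIRED_DEVICES.read_text())) if PAIRED_DEVICES.exists() else set()
    known.add(device_id)
    PAIRED_DEVICES.write_text(json.dumps(sorted(known)))

def is_known_device(device_id: str) -> bool:
    """True if communication with this device was established previously."""
    return PAIRED_DEVICES.exists() and device_id in json.loads(PAIRED_DEVICES.read_text())

The second database, kept on the camera based device for known display devices, can mirror the same structure.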

[0032] At step 315, calibration of the camera based device is enabled. Calibration includes capturing an initial boundary of the display device, with respect to a position of the display device, to form an initial calibrated boundary on the camera based device. The initial calibrated boundary of the display device that is captured on the camera based device can also be referred to as a solid boundary or a fixed boundary. Scaling is used to perform the calibration: the initial calibrated boundary of the display device is captured and scaled such that the boundary of the display device is accommodated within the display area of the camera based device. One or more scaling techniques can be used for performing the calibration. Further, the solid boundary can be displayed and viewed on the camera based device. Furthermore, one or more entities or images displayed on the display device are captured by the camera based device.
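
One way to read the scaling step: the captured boundary is reduced by a single uniform factor so that it fits within the camera based device's display area. A minimal sketch, assuming the boundary and preview dimensions are available in pixels:

def calibration_scale(boundary_w: float, boundary_h: float,
                      preview_w: float, preview_h: float,
                      margin: float = 0.9) -> float:
    """Uniform scale factor that accommodates the captured display boundary
    within the camera based device's display area, with a small margin."""
    return margin * min(preview_w / boundary_w, preview_h / boundary_h)

# e.g. a 1600x900 display boundary on a 720x1280 preview:
# s = calibration_scale(1600, 900, 720, 1280)   # ~0.405; the solid boundary
# is then drawn at about 648x365 pixels and its position stays fixed.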

[0033] Further, at step 320, a gesture boundary is calculated based on a gesture associated with the camera based device. The gesture boundary is used to capture images corresponding to the gesture associated with the camera based device. The initial calibrated boundary is stationary, whereas the gesture boundary moves with the motion of the camera based device. The movement of the gesture boundary corresponding to the motion of the camera based device is relative to the initial calibrated boundary. The motion can be performed along any axis, for example the X-axis, Y-axis or Z-axis.
[0034] At step 325, a gesture associated with the camera based device is determined. The gesture associated with the camera based device is determined based on a position of the gesture boundary relative to the initial calibrated boundary. Examples of the gesture include, but are not limited to, the camera based device moving towards the right, left, top or bottom with respect to the position of the display device. Further, the gesture can also include moving the camera based device towards the display device or receding the camera based device away from the display device. Furthermore, the gesture can also include rotating the camera based device in a clockwise or an anticlockwise direction with respect to the position of the display device.

[0035] In one example, the gesture includes movement of the camera based device, along an axis, for example the Z-axis, towards the display device. In such a case, the gesture boundary is scaled up relative to the initial calibrated boundary. In another example, the gesture includes retreating the camera based device away from the display device along the same axis. In such a case, the gesture boundary is scaled down relative to the initial calibrated boundary.

[0036] In yet another example, the gesture includes the movement of the camera based device towards the right, for example along the X-axis, with respect to the position of the display device. In such a case, the gesture boundary is moved towards the left relative to the initial calibrated boundary. Further, in another example, the gesture includes the movement of the camera based device towards the left with respect to the position of the display device. In such a case, the gesture boundary is moved towards the right relative to the initial calibrated boundary.

[0037] Furthermore, in another example, the gesture includes the movement of the camera based device towards the top, for example along the Y-axis, with respect to the position of the display device. In such a case, the gesture boundary is moved downwards relative to the initial calibrated boundary. Moreover, in another example, the gesture includes the movement of the camera based device downwards with respect to the position of the display device. In such a case, the gesture boundary is moved upwards relative to the initial calibrated boundary.

[0038] Further, in one example, the gesture includes rotating the camera based device in the clockwise direction along an axis, for example the X-axis, Y-axis or Z-axis, with respect to the position of the display device. In such a case, the gesture boundary is rotated in the anticlockwise direction relative to the initial calibrated boundary. Furthermore, in another example, the gesture includes rotating the camera based device in the anticlockwise direction with respect to the position of the display device. In such a case, the gesture boundary is rotated in the clockwise direction relative to the initial calibrated boundary. Similarly, many other gestures can be associated with the camera based device. In each case, the movement of the gesture boundary and the movement of the camera based device are contrary to each other.
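
Paragraphs [0035] to [0038] can be collected into one determination rule: compare the gesture boundary with the initial calibrated boundary and report the contrary device gesture. A sketch follows, reusing the Boundary shape from the loop sketch above; the thresholds are illustrative assumptions, and one gesture is reported per frame in a fixed priority order.

from typing import NamedTuple, Optional

class Boundary(NamedTuple):     # same shape as in the earlier loop sketch
    cx: float
    cy: float
    width: float
    height: float
    angle: float = 0.0          # degrees, anticlockwise positive

def classify(calibrated: Boundary, current: Boundary,
             move_eps: float = 10.0, scale_eps: float = 0.05,
             angle_eps: float = 3.0) -> Optional[str]:
    """Device gesture implied by the gesture boundary's position relative
    to the initial calibrated boundary (the two are contrary)."""
    scale = current.width / calibrated.width
    dx = current.cx - calibrated.cx
    dy = current.cy - calibrated.cy          # image coordinates: y grows down
    dangle = current.angle - calibrated.angle
    if scale > 1 + scale_eps:
        return "device-towards"              # boundary scaled up   [0035]
    if scale < 1 - scale_eps:
        return "device-away"                 # boundary scaled down [0035]
    if dx < -move_eps:
        return "device-right"                # boundary moved left  [0036]
    if dx > move_eps:
        return "device-left"                 # boundary moved right [0036]
    if dy > move_eps:
        return "device-top"                  # boundary moved down  [0037]
    if dy < -move_eps:
        return "device-down"                 # boundary moved up    [0037]
    if dangle > angle_eps:
        return "device-clockwise"            # boundary rotated anticlockwise [0038]
    if dangle < -angle_eps:
        return "device-anticlockwise"        # boundary rotated clockwise     [0038]
    return None                              # no gesture this frame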

[0039] In some embodiments, if the user wishes to recalibrate the solid boundary, the camera based device is configured to provide an option, in the form of a button or a menu, to initiate the re-detection process.

[0040] At step 330, a command is transmitted from the camera based device to the display device using a network. The command is generated by the camera based device based on the gesture associated with the camera based device, and is transmitted using a wireless network, for example Bluetooth, Near Field Communication (NFC), Wi-Fi and the like.

[0041] In one example, a zoom-in command is generated and further transmitted when the camera based device is moved along the axis, towards the display device. In another example, a zoom-out command is generated and further transmitted when the camera based device is retreated along the axis away from the display device.

[0042] In yet another example, a translate-right command is generated and further transmitted when the gesture includes the movement of the camera based device towards the right with respect to the position of the display device. In some embodiments, a gesture-boundary-left command is generated when the camera based device is moved towards the right. Further, in another example, a translate-left command is generated and further transmitted when the gesture includes the movement of the camera based device towards the left with respect to the position of the display device. In some embodiments, a gesture-boundary-right command is generated when the camera based device is moved towards the left.

[0043] Furthermore, in another example, a translate-top command is generated and further transmitted when the gesture includes the movement of the camera based device towards the top with respect to the position of the display device. In some embodiments a gesture-boundary-down command is generated when the camera based device is moved towards the top. Moreover, in another example, a translate-down command is generated and further transmitted when the gesture includes the movement of the camera based device towards the bottom with respect to the position of the display device. In some embodiments a gesture-boundary-top command is generated when the camera based device is moved towards the bottom.

[0044] Further, in one example, a rotate-clockwise command is generated and further transmitted when the gesture includes rotating the camera based device in the clockwise direction with respect to the position of the display device. In some embodiments, a gesture-boundary-anticlockwise command is generated when the camera based device is rotated in the clockwise direction. Furthermore, in another example, a rotate-anticlockwise command is generated and further transmitted when the gesture includes rotating the camera based device in the anticlockwise direction with respect to the position of the display device. In some embodiments, a gesture-boundary-clockwise command is generated when the camera based device is rotated in the anticlockwise direction. Hence, gesture based commands, generated contrary to the movement of the gesture boundary determined by the camera based device, are transmitted to the display device based on an application running on the display device. Similarly, various other commands corresponding to the gesture associated with the camera based device can be generated and further transmitted.
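
The mapping of paragraphs [0041] to [0044] can be tabulated directly. A sketch using the gesture labels from the classifier above, where each entry carries the primary command and, where the text names one, the alternative gesture-boundary-* form (none is given for the zoom commands):

COMMANDS = {
    "device-towards":       ("zoom-in",              None),
    "device-away":          ("zoom-out",             None),
    "device-right":         ("translate-right",      "gesture-boundary-left"),
    "device-left":          ("translate-left",       "gesture-boundary-right"),
    "device-top":           ("translate-top",        "gesture-boundary-down"),
    "device-down":          ("translate-down",       "gesture-boundary-top"),
    "device-clockwise":     ("rotate-clockwise",     "gesture-boundary-anticlockwise"),
    "device-anticlockwise": ("rotate-anticlockwise", "gesture-boundary-clockwise"),
}

def command_for(gesture: str, boundary_form: bool = False) -> str:
    """Command to transmit for a determined gesture; boundary_form selects
    the gesture-boundary-* variant where one exists."""
    primary, alternate = COMMANDS[gesture]
    return alternate if boundary_form and alternate else primary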

[0045] At step 335, the display device is enabled to process the command. Processing is performed to display the one or more entities or images corresponding to the command received. In one example, if the zoom-in command is received, then the entities or images displayed on the display device are zoomed such that they appear larger than their original size, and are displayed on the display device. Further, if the zoom-out command is received, then the entities or images displayed on the display device are zoomed out such that they appear smaller than their original size, and are displayed on the display device.

[0046] In another example, if the translate-right command is received, then the entities or images displayed on the display device are moved towards the right and thus displayed on the display device. In some embodiments, if the gesture-boundary-left command is received, then the gesture-boundary-left command is processed by the display device and the entities or images are moved towards the right upon processing it. In another example, if the translate-left command is received, then the entities or images displayed on the display device are moved towards the left and thus displayed on the display device. In some embodiments, if the gesture-boundary-right command is received, then the gesture-boundary-right command is processed by the display device and the entities or images are moved towards the left upon processing it.

[0047] In yet another example, if the translate-top command is received, then the entities or images displayed on the display device are moved towards the top and thus displayed on the display device. In some embodiments, if the gesture-boundary-down command is received, then the gesture-boundary-down command is processed by the display device and the entities or images are moved towards the top upon processing it. In yet another example, if the translate-down command is received, then the entities or images displayed on the display device are moved downwards and thus displayed on the display device. In some embodiments, if the gesture-boundary-top command is received, then the gesture-boundary-top command is processed by the display device and the entities or images are moved downwards upon processing it.

[0048] Further, in one example, if the rotate-clockwise command is received, then the entities or images displayed on the display device are rotated in the clockwise direction and thus displayed on the display device. In some embodiments, if the gesture-boundary-anticlockwise command is received, then the gesture-boundary-anticlockwise command is processed by the display device and the entities or images are rotated in the clockwise direction upon processing it. Further, in another example, if the rotate-anticlockwise command is received, then the entities or images displayed on the display device are rotated in the anticlockwise direction and thus displayed on the display device. In some embodiments, if the gesture-boundary-clockwise command is received, then the gesture-boundary-clockwise command is processed by the display device and the entities or images are rotated in the anticlockwise direction upon processing it. Similarly, the display device moves the entities and images corresponding to the commands received. The method stops at step 335.
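
On the receiving side, the processing of paragraphs [0045] to [0048] can be read as one update per command to a display transform, which the display device then applies when redrawing the entities. A minimal sketch; the step sizes and zoom factor are arbitrary assumptions:

class DisplayState:
    """Transform applied to the displayed entities; one update per command."""

    # The gesture-boundary-* form triggers the same action as its primary
    # command, per paragraphs [0046] to [0048].
    ALIASES = {
        "gesture-boundary-left": "translate-right",
        "gesture-boundary-right": "translate-left",
        "gesture-boundary-down": "translate-top",
        "gesture-boundary-top": "translate-down",
        "gesture-boundary-anticlockwise": "rotate-clockwise",
        "gesture-boundary-clockwise": "rotate-anticlockwise",
    }

    def __init__(self) -> None:
        self.dx = 0.0       # horizontal translation, pixels
        self.dy = 0.0       # vertical translation, pixels (screen y grows down)
        self.angle = 0.0    # rotation, degrees, clockwise positive
        self.zoom = 1.0     # magnification relative to the original size

    def process(self, command: str, step: float = 20.0) -> None:
        command = self.ALIASES.get(command, command)
        if command == "zoom-in":
            self.zoom *= 1.1
        elif command == "zoom-out":
            self.zoom /= 1.1
        elif command == "translate-right":
            self.dx += step
        elif command == "translate-left":
            self.dx -= step
        elif command == "translate-top":
            self.dy -= step
        elif command == "translate-down":
            self.dy += step
        elif command == "rotate-clockwise":
            self.angle += 5.0
        elif command == "rotate-anticlockwise":
            self.angle -= 5.0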

[0049] In one example, an application running on the display device can be a game. The camera based device captures a character of the game. If the camera based device moves towards the right, then the camera based device generates the translate-right command. The display device, upon receiving the translate-right command, moves the character of the game to the right.

[0050] In another example, an application running on the display device can be a web browser. If the camera based device moves towards the top, then the camera based device generates the gesture-boundary-down command, which scrolls a web page downwards; this is also referred to as natural scrolling.
[0051] FIGS. 4A-4I are exemplary illustrations of controlling a display device using a mobile phone, in accordance with one embodiment. FIGS. 4A-4H include a display device 405 and a mobile device 410. In one example, the display device 405 can include a television. The mobile device 410 includes a camera and is configured to capture images using the camera. Further, the display device 405 and the mobile device 410 are configured to enable communication between each other.

[0052] The mobile device 410 performs calibration to capture an initial boundary of the display device 405, with respect to a position of the display device 405, to form an initial calibrated boundary. The initial calibrated boundary can also be referred to as a solid boundary 415, as shown in FIG. 4A, or as a fixed boundary. Upon calibration, the position of the initial calibrated boundary on the display of the mobile device 410 is fixed. The mobile device 410 also determines a gesture boundary based on one or more gestures associated with the mobile device 410. The gesture boundary is illustrated in the form of a dotted rectangle 420. The position of the dotted rectangle 420 is altered, relative to the initial calibrated boundary, based on the movement of the mobile device 410. The position of the dotted rectangle 420 relative to the initial calibrated boundary is used to determine the one or more gestures associated with the mobile device 410. Further, the mobile device 410 also captures an image displayed on the display device 405, as shown in FIG. 4A.

[0053] In FIG. 4B, the mobile device 410 is moved towards the display device along an axis, for example the Z-axis, as shown at 505. When the mobile device 410 is moved towards the display device, the captured image becomes enlarged. Hence the mobile device 410 expands the dotted rectangle 420, relative to the initial calibrated boundary, as shown in FIG. 4B, to capture the enlarged image. The mobile device 410 is further configured to detect the expansion of the dotted rectangle 420 and generates a zoom-in command responsive to the movement. The zoom-in command is further transmitted to the display device 405. One or more wireless networks, for example, but not limited to, Bluetooth, NFC, Wi-Fi and the like may be used for transmission of the zoom-in command. The display device 405 is configured to receive the zoom-in command, process it, and magnify the image in response. The magnified image is displayed on the display device 405 as shown in FIG. 4B.

[0054] In FIG. 4C, the mobile device 410 is moved away from the display device along an axis, for example the Z-axis, as shown at 510. When the mobile device 410 is moved away from the display device, the captured image becomes miniaturized. Hence the mobile device 410 contracts the dotted rectangle 420 relative to the initial calibrated boundary, as shown in FIG. 4C, to capture the miniaturized image. The mobile device 410 is further configured to detect the contraction of the dotted rectangle 420 and generates a zoom-out command responsive to the movement. The zoom-out command is further transmitted to the display device 405. One or more wireless networks, for example, but not limited to, Bluetooth, NFC, Wi-Fi and the like may be used for transmission of the zoom-out command. The display device 405 is configured to receive the zoom-out command, process it, and miniaturize the image in response. The miniaturized image is displayed on the display device 405 as shown in FIG. 4C.

[0055] In FIG. 4D, the mobile device 410 is moved towards the left along an axis, for example the X-axis, as shown at 515, with respect to the position of the display device 405. When the mobile device 410 is moved towards the left, the image captured in the mobile device 410 also moves towards the left. Hence the mobile device 410 moves the dotted rectangle 420 towards the right, relative to the initial calibrated boundary, for capturing the image that is moved towards the left corresponding to the movement of the mobile device 410, as shown in FIG. 4D. The mobile device 410 is further configured to detect the movement of the dotted rectangle 420 towards the right and generates a translate-left command in response to the movement. The translate-left command is further transmitted to the display device 405. The display device 405 is configured to receive the translate-left command, process it, and move the image displayed on the display device 405 towards the left in response. The image that is moved towards the left is thus displayed on the display device 405 as shown in FIG. 4D. In some embodiments, a gesture-boundary-right command is generated when the camera based device is moved towards the left; the gesture-boundary-right command is then processed by the display device 405 for moving the image towards the left.

[0056] In FIG. 4E, the mobile device 410 is moved towards the right along an axis, for example the X-axis, as shown at 520, with respect to the position of the display device 405. When the mobile device 410 is moved towards the right, the image captured in the mobile device 410 also moves towards the right. Hence the mobile device 410 moves the dotted rectangle 420 towards the left, relative to the initial calibrated boundary, for capturing the image that is moved towards the right corresponding to the movement of the mobile device 410, as shown in FIG. 4E. The mobile device 410 is further configured to detect the movement of the dotted rectangle 420 towards the left and generates a translate-right command in response to the movement. The translate-right command is further transmitted to the display device 405. The display device 405 is configured to receive the translate-right command, process it, and move the image displayed on the display device 405 towards the right in response. The image that is moved towards the right is thus displayed on the display device 405 as shown in FIG. 4E. In some embodiments, a gesture-boundary-left command is generated when the camera based device is moved towards the right; the gesture-boundary-left command is then processed by the display device 405 for moving the image towards the right.

[0057] In FIG. 4F, the mobile device 410 is moved downwards along an axis, for example the Y-axis, as shown at 525, with respect to the position of the display device 405. When the mobile device 410 is moved downwards, the image captured in the mobile device 410 also moves downwards. Hence the mobile device 410 moves the dotted rectangle 420 towards the top, relative to the initial calibrated boundary, for capturing the image that is moved downwards corresponding to the movement of the mobile device 410, as shown in FIG. 4F. The mobile device 410 is further configured to detect the movement of the dotted rectangle 420 towards the top and generates a translate-down command in response to the movement. The translate-down command is further transmitted to the display device 405. The display device 405 is configured to receive the translate-down command, process it, and move the image displayed on the display device 405 downwards in response. The image that is moved downwards is thus displayed on the display device 405 as shown in FIG. 4F. In some embodiments, a gesture-boundary-top command is generated when the camera based device is moved downwards; the gesture-boundary-top command is then processed by the display device 405 for moving the image downwards.

[0058] In FIG. 4G, the mobile device 410 is moved towards the top along an axis, for example the Y-axis, as shown at 530, with respect to the position of the display device 405. When the mobile device 410 is moved towards the top, the image captured in the mobile device 410 also moves towards the top corresponding to the movement of the mobile device 410. Hence the mobile device 410 moves the dotted rectangle 420 downwards, relative to the initial calibrated boundary, for capturing the image that is moved towards the top, as shown in FIG. 4G. The mobile device 410 is further configured to detect the downward movement of the dotted rectangle 420 and generates a translate-top command in response to the movement. The translate-top command is further transmitted to the display device 405. The display device 405 is configured to receive the translate-top command, process it, and move the image displayed on the display device 405 towards the top in response. The image that is moved towards the top is thus displayed on the display device 405 as shown in FIG. 4G. In some embodiments, a gesture-boundary-down command is generated when the camera based device is moved towards the top; the gesture-boundary-down command is then processed by the display device 405 for moving the image towards the top.

[0059] In FIG. 4H, the mobile device 410 is rotated in a clockwise direction along an axis, for example the Z-axis, as shown at 535, with respect to the position of the display device 405. When the mobile device 410 is rotated in the clockwise direction, the image captured in the mobile device 410 is also rotated in the clockwise direction corresponding to the rotation of the mobile device 410. Hence the mobile device 410 rotates the dotted rectangle 420 in the anticlockwise direction, relative to the initial calibrated boundary, for capturing the image that is rotated in the clockwise direction, as shown in FIG. 4H. The mobile device 410 is further configured to detect the rotation of the dotted rectangle 420 in the anticlockwise direction and generates a rotate-clockwise command in response to the rotation. The rotate-clockwise command is further transmitted to the display device 405. The display device 405 is configured to receive the rotate-clockwise command, process it, and rotate the image displayed on the display device 405 in the clockwise direction in response. The image that is rotated in the clockwise direction is thus displayed on the display device 405 as shown in FIG. 4H. In some embodiments, a gesture-boundary-anticlockwise command is generated when the camera based device is rotated in the clockwise direction; the gesture-boundary-anticlockwise command is then processed by the display device 405 for rotating the image in the clockwise direction.

[0060] In FIG. 4I, the mobile device 410 is rotated in an anticlockwise direction along an axis, for example the Z-axis, as shown at 540, with respect to the position of the display device 405. When the mobile device 410 is rotated in the anticlockwise direction, the image captured in the mobile device 410 is also rotated in the anticlockwise direction corresponding to the rotation of the mobile device 410. Hence the mobile device 410 rotates the dotted rectangle 420 in the clockwise direction, relative to the initial calibrated boundary, for capturing the image that is rotated in the anticlockwise direction, as shown in FIG. 4I. The mobile device 410 is further configured to detect the rotation of the dotted rectangle 420 in the clockwise direction and generates a rotate-anticlockwise command in response to the rotation. The rotate-anticlockwise command is further transmitted to the display device 405. The display device 405 is configured to receive the rotate-anticlockwise command, process it, and rotate the image displayed on the display device 405 in the anticlockwise direction in response. The image that is rotated in the anticlockwise direction is thus displayed on the display device 405 as shown in FIG. 4I. In some embodiments, a gesture-boundary-clockwise command is generated when the camera based device is rotated in the anticlockwise direction; the gesture-boundary-clockwise command is then processed by the display device 405 for rotating the image in the anticlockwise direction.

[0061] Advantageously, the present disclosure enables a method and a system for controlling a display device using electronic devices enabled with a camera feature. By moving the electronic device enabled with the camera feature, the images displayed on the display device are moved correspondingly. Since mobile devices enabled with the camera feature are economical, users can utilize the system for controlling various applications running on the display device. Further, applications, for example games, that are not distinctly viewable on a mobile phone, since the display area of the mobile phone is miniaturized in nature, can be viewed distinctly. Furthermore, the present disclosure refrains from using additional devices, for example a motion sensor and the like, for detecting gestures and is hence economical.

[0062] In the preceding specification, the present disclosure and its advantages have been described with reference to specific embodiments. However, it will be apparent to a person of ordinary skill in the art that various modifications and changes can be made without departing from the scope of the present disclosure, as set forth in the claims below. Accordingly, the specification and figures are to be regarded as illustrative examples of the present disclosure, rather than in a restrictive sense. All such possible modifications are intended to be included within the scope of the present disclosure.

I/We claim:

1. A method of controlling a display device using a camera based device, the method comprising:

configuring at least one of a camera based device and a display device to enable communication between the camera based device and the display device;

enabling the camera based device to perform calibration to capture an initial calibrated boundary of the display device;

calculating a gesture boundary corresponding to a gesture associated with the camera based device;
determining a gesture associated with the camera based device based on a position of the gesture boundary relative to the initial calibrated boundary;

transmitting a command from the camera based device to the display device using a network, the command being transmitted based on the gesture; and

enabling the display device to process the command, processing being performed to display a plurality of entities corresponding to the command.

2. The method as claimed in claim 1 and further comprising:
capturing, by the camera based device, the plurality of entities displayed by the display device.

3. The method as claimed in claim 1 and further comprising:

receiving the command, transmitted from the camera based device, by the display device.

4. The method as claimed in claim 1 wherein the command is generated based on at least one of the gesture associated with the camera based device and the position of the gesture boundary relative to the initial calibrated boundary.
5. The method as claimed in claim 1 and further comprising:

storing one or more camera based devices by the display device in a first database, the one or more camera based devices being configured to enable communication between the display device and the one or more camera based devices.

6. The method as claimed in claim 1 and further comprising:

storing one or more display devices by the camera based device in a second database, the one or more display devices being configured to enable communication between the camera based device and the one or more display devices.

7. A system for controlling a display device using a camera based device, the system comprising:

a display device configured to receive a plurality of commands;

a camera based device configured to transmit the plurality of commands, the plurality of commands being transmitted based on a plurality of gestures; and

a network for establishing communication between the display device and the camera based device.

8. The system as claimed in claim 7, wherein the camera based device is further configured to capture a plurality of entities displayed by the display device.

9. The system as claimed in claim 7, wherein the camera based device is further configured to:
detect the display device for transmitting the plurality of commands;

perform calibration to capture an initial calibrated boundary of the display device;

calculate a gesture boundary relative to the initial calibrated boundary, based on the plurality of gestures; and

determine a gesture associated with the camera based device based on a position of the gesture boundary relative to the initial calibrated boundary.

10. The system as claimed in claim 7, wherein the camera based device is further operable to alter a position of the gesture boundary that is relative to the initial calibrated boundary with respect to the plurality of gestures associated with the camera based device.

11. The system as claimed in claim 7 wherein the plurality of commands is generated based on at least one of the plurality of gestures associated with the camera based device and a position of the gesture boundary that is relative to the initial calibrated boundary.

12. The system as claimed in claim 7, wherein the display device is further configured to:
detect the plurality of commands transmitted by the camera based device;

process the plurality of commands; and

perform one or more actions responsive to the plurality of commands.

Documents

Orders

Section Controller Decision Date
Sec 15 Grant LAKSHMI NARAYANA CHINTA 2022-06-02

Application Documents

# Name Date
1 1756-CHE-2012 POWER OF ATTORNEY 07-05-2012.pdf 2012-05-07
2 1756-CHE-2012 FORM-5 07-05-2012.pdf 2012-05-07
3 1756-CHE-2012 FORM-3 07-05-2012.pdf 2012-05-07
4 1756-CHE-2012 FORM-2 07-05-2012.pdf 2012-05-07
5 1756-CHE-2012 FORM-1 07-05-2012.pdf 2012-05-07
6 1756-CHE-2012 DRAWINGS 07-05-2012.pdf 2012-05-07
7 1756-CHE-2012 DESCRIPTION (COMPLETE) 07-05-2012.pdf 2012-05-07
8 1756-CHE-2012 CORRESPONDENCE OTHERS 07-05-2012.pdf 2012-05-07
9 1756-CHE-2012 CLAIMS 07-05-2012.pdf 2012-05-07
10 1756-CHE-2012 ABSTRACT 07-05-2012.pdf 2012-05-07
11 1756-CHE-2012 FORM-13 01-04-2013.pdf 2013-04-01
12 1756-CHE-2012 CORRESPONDENCE OTHERS 01-04-2013.pdf 2013-04-01
13 1756-CHE-2012 FORM-18 25-04-2013.pdf 2013-04-25
14 1756-CHE-2012 FORM-13 18-07-2015.pdf 2015-07-18
15 Form 13_Address for service.pdf 2015-07-20
16 Amended Form 1.pdf 2015-07-20
17 1756-CHE-2012-FORM-26 [27-11-2017(online)].pdf 2017-11-27
18 1756-CHE-2012-Changing Name-Nationality-Address For Service [19-02-2018(online)].pdf 2018-02-19
19 1756-CHE-2012-RELEVANT DOCUMENTS [19-02-2018(online)].pdf 2018-02-19
20 1756-CHE-2012-FORM 3 [06-07-2018(online)].pdf 2018-07-06
21 1756-CHE-2012-FORM 3 [24-12-2018(online)].pdf 2018-12-24
22 1756-CHE-2012-FORM 3 [25-06-2019(online)].pdf 2019-06-25
23 1756-CHE-2012-FER.pdf 2019-06-28
24 1756-CHE-2012-RELEVANT DOCUMENTS [10-12-2019(online)].pdf 2019-12-10
25 1756-CHE-2012-FORM 13 [10-12-2019(online)].pdf 2019-12-10
26 1756-CHE-2012-FORM-26 [10-12-2019(online)].pdf 2019-12-10
27 1756-CHE-2012-FORM 3 [11-12-2019(online)].pdf 2019-12-11
28 1756-CHE-2012-PETITION UNDER RULE 137 [11-12-2019(online)].pdf 2019-12-11
29 1756-CHE-2012-ABSTRACT [12-12-2019(online)].pdf 2019-12-12
30 1756-CHE-2012-CLAIMS [12-12-2019(online)].pdf 2019-12-12
31 1756-CHE-2012-COMPLETE SPECIFICATION [12-12-2019(online)].pdf 2019-12-12
32 1756-CHE-2012-FER_SER_REPLY [12-12-2019(online)].pdf 2019-12-12
33 1756-CHE-2012-Response to office action [24-07-2020(online)].pdf 2020-07-24
34 1756-CHE-2012-FORM 3 [05-07-2021(online)].pdf 2021-07-05
35 1756-CHE-2012-Correspondence to notify the Controller [23-09-2021(online)].pdf 2021-09-23
36 1756-CHE-2012-FORM-26 [28-09-2021(online)].pdf 2021-09-28
37 1756-CHE-2012-US(14)-HearingNotice-(HearingDate-28-09-2021).pdf 2021-10-03
38 1756-CHE-2012-FORM 13 [08-10-2021(online)].pdf 2021-10-08
39 1756-CHE-2012-AMENDED DOCUMENTS [08-10-2021(online)].pdf 2021-10-08
40 1756-CHE-2012-MARKED COPIES OF AMENDEMENTS [08-10-2021(online)].pdf 2021-10-08
41 1756-CHE-2012-RELEVANT DOCUMENTS [08-10-2021(online)].pdf 2021-10-08
42 1756-CHE-2012-Written submissions and relevant documents [08-10-2021(online)].pdf 2021-10-08
43 1756-CHE-2012-Correspondene And POA_12-11-2021.pdf 2021-11-12
44 1756-CHE-2012-Response to office action [13-05-2022(online)].pdf 2022-05-13
45 1756-CHE-2012-Response to office action [30-05-2022(online)].pdf 2022-05-30
46 1756-CHE-2012-Annexure [30-05-2022(online)].pdf 2022-05-30
47 1756-CHE-2012-IntimationOfGrant02-06-2022.pdf 2022-06-02
48 1756-CHE-2012-PatentCertificate02-06-2022.pdf 2022-06-02
49 1756-CHE-2012-PROOF OF ALTERATION [16-01-2023(online)].pdf 2023-01-16

Search Strategy

1 Searchstrategy_1756che2012_20-06-2019.pdf

ERegister / Renewals

3rd: 29 Aug 2022

From 07/05/2014 - To 07/05/2015

4th: 29 Aug 2022

From 07/05/2015 - To 07/05/2016

5th: 29 Aug 2022

From 07/05/2016 - To 07/05/2017

6th: 29 Aug 2022

From 07/05/2017 - To 07/05/2018

7th: 29 Aug 2022

From 07/05/2018 - To 07/05/2019

8th: 29 Aug 2022

From 07/05/2019 - To 07/05/2020

9th: 29 Aug 2022

From 07/05/2020 - To 07/05/2021

10th: 29 Aug 2022

From 07/05/2021 - To 07/05/2022

11th: 29 Aug 2022

From 07/05/2022 - To 07/05/2023

12th: 25 Apr 2023

From 07/05/2023 - To 07/05/2024

13th: 01 May 2024

From 07/05/2024 - To 07/05/2025

14th: 29 Apr 2025

From 07/05/2025 - To 07/05/2026