Abstract: CAMERA CONFIGURATION IN A HANDHELD DEVICE A camera configuration in a handheld device is disclosed. The device includes a camera module, a rotatable lens assembly, a processor, and a user interface. The camera module is configured for capturing images or videos. The rotatable lens assembly is positioned within the camera module. The processor controls the rotation of the lens assembly. The user interface allows a user to manually adjust the orientation of the lens assembly.
Description: TECHNICAL FIELD
[001] The present invention relates generally to a camera configuration in a handheld device.
BACKGROUND
[002] Terminals may be generally classified as mobile/portable terminals or stationary terminals. Mobile terminals may also be classified as handheld terminals or vehicle mounted terminals. Mobile terminals have become increasingly functional. Examples of such functions include data and voice communications, capturing images and video via a camera, recording audio, playing music files via a speaker system, and displaying images and video on a display. Users' demands for capturing images with camera-equipped mobile terminals are growing. In line with this, a variety of camera-related structural parts and/or software parts are being developed. As part of this development, mobile terminals equipped with two or more cameras with different angles of view are under development.
[003] More recently, special cameras have been developed that remove some of the manual procedures previously necessary, but such cameras typically cost thousands of dollars and thus are not readily available to most people. For example, some cameras perform three-dimensional (3D) scanning using structured light or time-of-flight (TOF) sensors to recreate a 3D structure as a point cloud. Another technique is to use a stereo camera rig that includes multiple cameras, or at least multiple camera sensors, that simultaneously obtain a pair of images from two different perspectives by spacing the multiple cameras or camera sensors apart from one another. Such specialized cameras are very complex and expensive and are not owned by or accessible to most people.
[004] Therefore, there is a need for a system that overcomes the aforementioned problems.
SUMMARY
[005] Embodiments of the present disclosure present technological improvements as solutions to one or more of the above-mentioned technical problems.
[006] Before describing the present subject matter relating to a camera configuration in a handheld device, it is to be understood that this application is not limited to the particular system described, as there can be multiple possible embodiments which are not expressly illustrated in the present disclosure. It is also to be understood that the terminology used in the description is for the purpose of describing the implementations or versions or embodiments only and is not intended to limit the scope of the present subject matter.
[007] This summary is provided to introduce aspects related to a camera configuration in a handheld device. This summary is not intended to identify essential features of the claimed subject matter nor is it intended for use in determining or limiting the scope of the present subject matter.
[008] In an embodiment, a camera configuration in a handheld device is disclosed. The device includes a camera module, a rotatable lens assembly, a processor, and a user interface. The camera module is configured for capturing images or videos. The rotatable lens assembly is positioned within the camera module. The processor controls the rotation of the lens assembly. The user interface allows a user to manually adjust the orientation of the lens assembly.
[009] In an embodiment, a method for capturing images or videos using a handheld device is disclosed. The method includes the step of adjusting the orientation of a rotatable lens assembly within a camera module. The method further includes the step of capturing images or videos based on the adjusted orientation.
BRIEF DESCRIPTION OF THE ACCOMPANYING DRAWINGS
[0010] The following detailed description of embodiments is better understood when read in conjunction with the appended drawings. For the purpose of illustrating the disclosure, there is shown in the present document example constructions of the disclosure; however, the disclosure is not limited to the specific system or method disclosed in the document and the drawings.
[0011] The present disclosure is described in detail with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the drawings to refer to like features of the present subject matter.
[0012] Figure 1 illustrates an example of the mobile terminal related to the present invention when viewed from different directions.
[0013] Figure 2 illustrates a rear view of the user interface module of the camera system.
[0014] Figure 3 illustrates a mobile computing device that can use a communication network to upload data to, and download data from, a remote system that includes one or more servers.
[0015] Figure 4 illustrates a conceptual diagram showing the control method.
[0016] In the above accompanying drawings, a non-underlined number relates to an item identified by a line linking the non-underlined number to the item. When a number is non-underlined and accompanied by an associated arrow, the non-underlined number is used to identify a general item at which the arrow is pointing.
[0017] Further, the figures depict various embodiments of the present subject matter for purposes of illustration only. One skilled in the art will readily recognize from the following discussion that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles of the present subject matter described herein.
DETAILED DESCRIPTION
[0018] Some embodiments of this disclosure, illustrating all its features, will now be discussed in detail. The words "comprising," "having," "containing," and "including," and other forms thereof, are intended to be equivalent in meaning and be open ended in that an item or items following any one of these words is not meant to be an exhaustive listing of such item or items, or meant to be limited to only the listed item or items. It must also be noted that as used herein and in the appended claims, the singular forms "a," "an," and "the" include plural references unless the context clearly dictates otherwise. Although any systems and methods similar or equivalent to those described herein can be used in the practice or testing of embodiments of the present disclosure, an exemplary camera configuration in a handheld device is now described.
[0019] Various modifications to the embodiment will be readily apparent to those skilled in the art, and the generic principles herein may be applied to other embodiments. For example, although the present disclosure will be described in the context of a camera configuration in a handheld device, one of ordinary skill in the art will readily recognize that a camera configuration in a handheld device can be utilized in any situation. Thus, the present disclosure is not intended to be limited to the embodiments illustrated but is to be accorded the widest scope consistent with the principles and features described herein.
[0020] In an embodiment, a camera configuration in a handheld device is disclosed. The device includes a camera module, a rotatable lens assembly, a processor, and a user interface. The camera module is configured for capturing images or videos. The rotatable lens assembly is positioned within the camera module. The processor controls the rotation of the lens assembly. The user interface allows a user to manually adjust the orientation of the lens assembly.
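As an illustration only, the following minimal Kotlin sketch shows one way the recited components could be composed in software. The class and method names (RotatableLensAssembly, CameraModule, CameraController) and the assumed 0 to 180 degree travel are hypothetical and are not part of the disclosure.

```kotlin
// Illustrative sketch only; names and the 0..180 degree range are assumptions.

/** Lens assembly that can be rotated to a requested angle, in degrees. */
class RotatableLensAssembly(private val maxAngleDegrees: Double = 180.0) {
    var currentAngleDegrees: Double = 0.0
        private set

    fun rotateTo(angleDegrees: Double) {
        // Clamp the request to the assumed mechanically supported range.
        currentAngleDegrees = angleDegrees.coerceIn(0.0, maxAngleDegrees)
    }
}

/** Camera module that captures a frame at the lens assembly's current orientation. */
class CameraModule(val lens: RotatableLensAssembly) {
    fun capture(): String = "frame captured at ${lens.currentAngleDegrees} degrees"
}

/** Processor-side controller that mediates user-interface requests to the lens. */
class CameraController(private val module: CameraModule) {
    fun onUserAdjustsOrientation(angleDegrees: Double) = module.lens.rotateTo(angleDegrees)
    fun onUserCaptures(): String = module.capture()
}

fun main() {
    val controller = CameraController(CameraModule(RotatableLensAssembly()))
    controller.onUserAdjustsOrientation(90.0)  // manual adjustment via the user interface
    println(controller.onUserCaptures())       // frame captured at 90.0 degrees
}
```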
[0021] In another implementation, the rotatable lens assembly is configured to rotate at least 180 degrees to enable both front-facing and rear-facing capture.
[0022] In another implementation, the user interface comprises a touch-sensitive display allowing gesture-based control of the lens assembly orientation.
[0023] In another implementation, the device further comprises a gyroscope or accelerometer providing orientation data to the processor, wherein the processor adjusts the lens assembly orientation based on the orientation data.
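One possible reading of this implementation, sketched below under assumed names: the processor counter-rotates the lens by the device pitch reported by the gyroscope or accelerometer so that the lens keeps its world-facing orientation as the handset tilts. The function name, sign convention, and 0 to 180 degree clamp are assumptions, not taken from the disclosure.

```kotlin
// Hypothetical sketch: sign convention and the 0..180 degree clamp are assumptions.

/** Returns the lens angle to command so the lens stays level while the device pitches. */
fun compensatedLensAngle(targetAngleDegrees: Double, devicePitchDegrees: Double): Double =
    (targetAngleDegrees - devicePitchDegrees).coerceIn(0.0, 180.0)

fun main() {
    // Device tilted forward by 15 degrees while the user wants a level (90 degree) lens.
    println(compensatedLensAngle(targetAngleDegrees = 90.0, devicePitchDegrees = 15.0))  // 75.0
}
```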
[0024] In another implementation, the rotatable lens assembly comprises a wide-angle lens for panoramic capture.
[0025] In another implementation, the processor is programmed to automatically adjust the lens assembly orientation based on detected scene conditions.
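As a purely hypothetical example of a scene-driven rule (the disclosure does not specify one), the sketch below nudges the lens toward a detected subject's vertical position in the frame; the detection step itself is assumed to exist elsewhere, and all names, the gain value, and the sign convention are invented for illustration.

```kotlin
// Illustrative only: the subject-detection step is assumed; gain and sign are arbitrary choices.

/**
 * subjectY is the detected subject's normalized vertical position (0.0 = top, 1.0 = bottom).
 * Returns a lens angle nudged toward centering the subject, clamped to an assumed 0..180 range.
 */
fun autoAdjustedAngle(currentAngleDegrees: Double, subjectY: Double, gainDegrees: Double = 20.0): Double {
    val offsetFromCenter = subjectY - 0.5  // positive when the subject sits below frame center
    return (currentAngleDegrees + offsetFromCenter * gainDegrees).coerceIn(0.0, 180.0)
}

fun main() {
    println(autoAdjustedAngle(currentAngleDegrees = 90.0, subjectY = 0.75))  // 95.0
}
```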
[0026] In another implementation, the device further comprises an image stabilization mechanism to compensate for hand movements during image or video capture.
[0027] In another implementation, the user interface includes voice commands for controlling the rotation of the lens assembly.
[0028] In an embodiment, a method for capturing images or videos using a handheld device is disclosed. The method includes the step of adjusting the orientation of a rotatable lens assembly within a camera module. The method further includes the step of capturing images or videos based on the adjusted orientation.
[0029] In another implementation, the method includes the step of receiving user input through a touch-sensitive display to control the rotation of the lens assembly.
[0030] Figure 1 illustrates an example of the mobile terminal related to the present invention when viewed from different directions.
[0031] In an embodiment, the mobile terminal 100 is shown having a wireless communication unit configured with several commonly implemented components. For instance, the wireless communication unit typically includes one or more components which permit wireless communication between the mobile terminal 100 and a wireless communication system or network within which the mobile terminal is located. The input unit 120 includes a camera 121 for obtaining images or video, a microphone 122, which is one type of audio input device for inputting an audio signal, and a user input unit 123 (for example, a touch key, a push key, a mechanical key, a soft key, and the like) for allowing a user to input information. Data (for example, audio, video, image, and the like) is obtained by the input unit 120 and may be analyzed and processed by the controller 180 according to device parameters, user commands, and combinations thereof.
[0032] In an embodiment, the display unit 151 outputs information processed in the mobile terminal 100. The display unit 151 may be implemented using one or more suitable display devices. The display unit 151 may also include a touch sensor which senses a touch input received at the display unit. When a touch is input to the display unit 151, the touch sensor may be configured to sense this touch and the controller 180, for example, may generate a control command or other signal corresponding to the touch. The content input in the touching manner may be text or a numerical value, or a menu item which can be indicated or designated in various modes.
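To make the touch path concrete, here is a small hedged sketch of how a sensed touch could be translated into a control command by the controller; the event type, command types, and the screen-region layout are invented for illustration and are not described in the disclosure.

```kotlin
// Hypothetical sketch: event/command types and the region layout are illustrative only.

data class TouchEvent(val x: Int, val y: Int)

sealed class ControlCommand {
    data class SelectMenuItem(val itemId: Int) : ControlCommand()
    data class EnterCharacter(val character: Char) : ControlCommand()
}

/** Maps a sensed touch to a command based on an assumed on-screen layout. */
fun onTouchSensed(event: TouchEvent): ControlCommand =
    if (event.y < 200) ControlCommand.SelectMenuItem(itemId = event.x / 100)  // menu bar region
    else ControlCommand.EnterCharacter(character = 'a')                       // soft-keyboard region

fun main() {
    println(onTouchSensed(TouchEvent(x = 340, y = 120)))  // SelectMenuItem(itemId=3)
}
```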
[0033] In an embodiment, the first camera 121a can process image frames such as still or moving images obtained by the image sensor in a capture mode or a video call mode. The processed image frames can then be displayed on the display unit 151 or stored in the memory 170. The first and second manipulation units 123a and 123b are examples of the user input unit 123, which may be manipulated by a user to provide input to the mobile terminal 100. The first and second manipulation units 123a and 123b may also be commonly referred to as a manipulating portion, and may employ any tactile method that allows the user to perform manipulation such as touch, push, scroll, or the like. The first and second manipulation units 123a and 123b may also employ any non-tactile method that allows the user to perform manipulation such as proximity touch, hovering, or the like. The second camera 121b can include a plurality of lenses arranged along at least one line. The plurality of lenses may also be arranged in a matrix configuration. Such cameras may be referred to as an “array camera.” When the second camera 121b is implemented as an array camera, images may be captured in various manners using the plurality of lenses, and images of better quality may be obtained.
[0034] In an embodiment, a flash 124 is shown adjacent to the second camera 121b. When an image of a subject is captured with the camera 121b, the flash 124 may illuminate the subject. As shown in Figure 1, the second audio output module 152b can be located on the terminal body. The second audio output module 152b may implement stereophonic sound functions in conjunction with the first audio output module 152a, and may also be used for implementing a speaker phone mode for call communication.
[0035] In an embodiment, the memory 170 is typically implemented to store data to support various functions or features of the mobile terminal 100. For instance, the memory 170 may be configured to store application programs executed in the mobile terminal 100, data or instructions for operations of the mobile terminal 100, and the like.
[0036] In an embodiment, a power supply unit 190 for supplying power to the mobile terminal 100 may include a battery 191, which is mounted in the terminal body or detachably coupled to an outside of the terminal body. The battery 191 may receive power via a power source cable connected to the interface unit 160. Also, the battery 191 can be recharged in a wireless manner using a wireless charger. Wireless charging may be implemented by magnetic induction or electromagnetic resonance.
[0037] Figure 2 illustrates a rear view of the user interface module of the camera system.
[0038] In an embodiment, an interface 194 is configured for releasable, reliable electrical communication and mechanical interlocking between the user interface module 122 and the various expansion modules of the modular camera system. The interface 194 of the user interface module 122 includes a mechanical interface having a mounting surface, support recess 197, and locking protuberances 198. The interface 194 further includes an electrical interface including an electrical connector 196. The mechanical interface is configured to cooperate with the mechanical interfaces of the brain module, adapter module 128 and second interface of the expansion module to fasten the user interface module 122 to the corresponding modules. Moreover, the locking protuberances 198 engage the corresponding locking notches of the support of the brain module, providing enhanced locking of the user interface module 122 and the brain module.
[0039] Figure 3 illustrates a mobile computing device that can use a communication network to upload data to, and download data from, a remote system that includes one or more servers.
[0040] In an embodiment, the mobile computing device 102 (such as the smartphone 202) can use a communication network 302 to upload data to, and download data from, a remote system 312 that includes one or more servers 322. Preferably, the mobile computing device 102 can achieve such uploading and downloading wirelessly. Various communication protocols may be used to facilitate communication between the various components shown in Figure 3. These communication protocols may include, for example, TCP/IP, HTTP, wireless application protocol (WAP), vendor-specific protocols, and customized protocols, but are not limited thereto. While in one embodiment the communication network 302 is the Internet, in other embodiments the communication network 302 may be any suitable communication network including a local area network (LAN), a wide area network (WAN), a wireless network, an intranet, a private network, a public network, a switched network, combinations of these, and the like. The distributed computer network shown in Figure 3 is merely illustrative of a computing environment in which embodiments of the present technology can be implemented, but is not intended to limit the scope of the embodiments described herein. The mobile computing device 102 can upload data to the remote system 312 so that the remote system can generate 3D models based on the uploaded data, and the remote system 312 can download data to the mobile computing device 102 so that the mobile computing device 102 can display 3D models to a user of the mobile computing device.
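For concreteness, the sketch below shows an upload and download exchange of the kind described, using the standard JDK HTTP client over HTTPS (one of the protocol options mentioned). The server address, endpoint paths, and payload are placeholders invented for illustration; nothing here is prescribed by the disclosure.

```kotlin
// Sketch only: the base URL, paths, and payload are hypothetical placeholders.

import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

fun main() {
    val client = HttpClient.newHttpClient()
    val base = "https://example.com/api"  // placeholder address for the remote system 312

    // Upload captured data so the remote system can generate a 3D model from it.
    val upload = HttpRequest.newBuilder(URI.create("$base/scans"))
        .header("Content-Type", "application/json")
        .POST(HttpRequest.BodyPublishers.ofString("""{"frames": 42}"""))
        .build()
    println("upload status: " + client.send(upload, HttpResponse.BodyHandlers.ofString()).statusCode())

    // Download the generated model data back to the mobile computing device for display.
    val download = HttpRequest.newBuilder(URI.create("$base/models/latest")).GET().build()
    println("model bytes: " + client.send(download, HttpResponse.BodyHandlers.ofByteArray()).body().size)
}
```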
[0041] Figure 4 illustrates a conceptual diagram showing the control method.
[0042] In an embodiment, a focal point preset for the first and second cameras 221a and 221b may be set based on the user's selection or a preset algorithm. For example, the preset focal point may be set based on the user's selection. More specifically, the controller 180 can display a first image received from the first camera 221a on the display 151. Once the focal points are set, the controller 180 can store in the memory 170 the first focal point for the first camera 221a and the second focal point for the second camera 221b. Thereafter, when receiving images from the first and second cameras 221a and 221b, the controller 180 can activate the first and second cameras 221a and 221b so as to receive images having the first and second focal points immediately from the first and second cameras 221a and 221b, without detecting the focal points. With the first camera 221a set as the main camera, the controller 180 can receive a user request for setting the second camera 221b as the main camera (S220). With the first camera 221a set as the main camera, the controller 180 can display a first image having a first focal point received from the first camera 221a on the display 151. While the first image is displayed, the controller 180 can receive a user request for setting the second camera 221b as the main camera. The controller 180 can set the second camera 221b as the main camera in response to the user request for setting the second camera 221b as the main camera. The controller 180 can then display a second image having the second focal point on the display 151. For example, the controller 180 can display the first image in the entire output area of the display 151 and then a thumbnail image of the second image over the first image in an overlapping manner. Displaying images overlapping each other is referred to as PIP (Picture in Picture).
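A minimal sketch of the preset-and-switch behaviour described above, assuming hypothetical class names: focal points are stored once per camera, and switching the main camera returns the stored preset so no fresh focus detection is needed. The focal-point representation (a plain Double) is an assumption made for illustration.

```kotlin
// Illustrative sketch; class names and the Double focal-point representation are assumptions.

data class CameraId(val name: String)

class DualCameraController {
    private val focalPresets = mutableMapOf<CameraId, Double>()  // stored focal point per camera
    var mainCamera: CameraId? = null
        private set

    fun storeFocalPreset(camera: CameraId, focalPoint: Double) {
        focalPresets[camera] = focalPoint
    }

    /** Switches the main camera and returns its preset focal point, if one was stored. */
    fun setMainCamera(camera: CameraId): Double? {
        mainCamera = camera
        return focalPresets[camera]  // applied immediately, with no focus detection step
    }
}

fun main() {
    val first = CameraId("221a")
    val second = CameraId("221b")
    val controller = DualCameraController()
    controller.storeFocalPreset(first, 1.2)
    controller.storeFocalPreset(second, 0.8)
    controller.setMainCamera(first)
    println("switched to second camera, focal point = ${controller.setMainCamera(second)}")  // 0.8
}
```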
[0043] Although the description provides implementations of a camera configuration in a handheld device, it is to be understood that the above descriptions are not necessarily limited to the specific features or methods or systems. Rather, the specific features and methods are disclosed as examples of implementations for a camera configuration in a handheld device.
Claims: We claim:
1. A camera configuration in a handheld device comprising:
a camera module configured for capturing images or videos;
a rotatable lens assembly within the camera module;
a processor for controlling the rotation of the lens assembly;
a user interface allowing a user to manually adjust the orientation of the lens assembly.
2. The camera configuration in a handheld device of claim 1, wherein the rotatable lens assembly is configured to rotate at least 180 degrees to enable both front-facing and rear-facing capture.
3. The camera configuration in a handheld device of claim 1, wherein the user interface comprises a touch-sensitive display allowing gesture-based control of the lens assembly orientation.
4. The camera configuration in a handheld device of claim 1, further comprising a gyroscope or accelerometer providing orientation data to the processor, wherein the processor adjusts the lens assembly orientation based on the orientation data.
5. The camera configuration in a handheld device of any preceding claim, wherein the rotatable lens assembly comprises a wide-angle lens for panoramic capture.
6. The camera configuration in a handheld device of any one of claims 1-4, wherein the processor is programmed to automatically adjust the lens assembly orientation based on detected scene conditions.
7. The camera configuration in a handheld device of any one of claims 1-4, further comprising an image stabilization mechanism to compensate for hand movements during image or video capture.
8. The camera configuration in a handheld device of any one of claims 1-4, wherein the user interface includes voice commands for controlling the rotation of the lens assembly.
9. A method for capturing images or videos using a handheld device, comprising the steps of:
adjusting the orientation of a rotatable lens assembly within a camera module;
capturing images or videos based on the adjusted orientation.
10. The method of claim 9, further comprising receiving user input through a touch-sensitive display to control the rotation of the lens assembly.