Abstract: A method and a system for face detection with user intervention are provided. The method includes automatically detecting positions of one or more faces associated with an image in a preview frame. The method further includes manually relocating positions of the one or more faces associated with the image in the preview frame, based on a preference of the user. The method also includes capturing the image in the portable electronic device, the image having relocated positions of the one or more faces. Embodiments also disclose a portable electronic device for face detection with user intervention. The portable electronic device includes a processor for automatically detecting positions of one or more faces and manually relocating positions of the one or more faces, and a capturing unit for capturing the image in the portable electronic device.
FIELD
[0001] The present invention relates to the field of digital communications. More particularly, the present invention relates to capturing an image in a portable electronic device through auto face detection with user intervention.
BACKGROUND
[0002] In recent years, the introduction of digital cameras in portable electronic devices has gained wide acceptance globally. Further, an auto face detection feature is now used in portable electronic devices to detect and track faces in the image that needs to be captured. The purpose of having the auto face detection feature in the portable electronic device is to fix the camera parameters for an image. For example, the camera parameters include, but are not limited to, precise lens focusing, white balancing, image exposure, and flash light intensity control.
[0003] In the existing technique, when the camera is focused on the image, the auto face detection feature automatically detects the one or more faces in the image and sets the camera parameters accordingly, but at least one of the faces detected automatically in the image may not be the face the user intends to capture. In such a scenario, the user needs to change the region of focus. Further, in the existing technique, the image captured by the user may not be clear, and the user may require the image to be processed further to improve its clarity. Moreover, detecting the face in which the user is interested consumes a considerable amount of time. In light of the foregoing discussion, there is a need for a method and system to solve the above-mentioned problems.
SUMMARY
[0004] Exemplary embodiments of the present invention relate to a method and system for face detection with user intervention.
[0005] In one exemplary embodiment, a method for face detection with user intervention includes automatically detecting positions of one or more faces associated with an image in a preview frame. The preview frame is displayed on a screen of the portable electronic device. The method further includes manually relocating positions of the one or more faces associated with the image in the preview frame by a user of the portable electronic device, based on a preference of the user. The method also includes capturing the image in the portable electronic device, the image having relocated positions of the one or more faces.
[0006] In one exemplary embodiment, a system includes a portable electronic device for face detection with user intervention. The portable electronic device includes a processor responsive to instructions for automatically detecting positions of one or more faces associated with an image in a preview frame. The processor, responsive to the instructions, is also used for manually relocating positions of the one or more faces associated with the image in the preview frame by a user. The portable electronic device also includes a capturing unit for capturing the image in the portable electronic device.
BRIEF DESCRIPTION OF THE DRAWINGS
[0007] FIG. 1 illustrates a block diagram of a portable electronic device, in accordance with which various embodiments can be implemented;
[0008] FIG. 2 illustrates face detection with user intervention to capture an image in a portable electronic device, in accordance with an embodiment of the invention;
[0009] FIG. 3 illustrates a flowchart for face detection with user intervention to capture an image in a portable electronic device, in accordance with an embodiment of the invention;
[0010] FIGS. 4a-4b illustrate a flowchart for face detection with user intervention to capture an image in a portable electronic device, in accordance with another embodiment of the invention; and
[0011] FIGS. 5a-5b show screen shots of face detection with user intervention to capture an image in a portable electronic device, in accordance with exemplary embodiments of the invention.
DETAILED DESCRIPTION
[0012] Exemplary embodiments of the present invention provide a method and system for face detection with user intervention.
[0013] FIG. 1 illustrates a block diagram of a portable electronic device 105, in accordance with which various exemplary embodiments can be implemented.
[0014] The portable electronic device 105 includes a storage unit 110, a processor 115, a memory 120, a display unit 125, an input device 130, a cursor control 135, a capturing unit 140, and a face tracking engine 145. Examples of the portable electronic device include, but are not limited to, a digital camera and a mobile phone with a camera.
[0015] In one embodiment, the portable electronic device 105 includes the storage unit 110 for storing information and instructions. The information stored may be a number of faces detected by the face tracking engine 145. The instructions fed by a user of the portable electronic device 105 before capturing an image are also stored in the storage unit 110.
[0016] The portable electronic device 105 includes the processor 115 responsive to the instructions for automatically detecting positions of one or more faces associated with
the image in a preview frame. The preview frame is displayed in the portable electronic device 105 before the image is captured. The processor 115 responsive to the instructions is further used for manually relocating positions of the one or more faces associated with the image in the preview frame of the portable electronic device 105 by a user. The capturing unit 140 captures the image in the portable electronic device with relocated positions of the one or more faces.
[0017] The portable electronic device 105 includes the memory 120 for storing a plurality of images captured by the capturing unit 140.
[0018] The portable electronic device 105 includes the display unit 125 for displaying the image captured by the capturing unit 140. An example of the display unit 125 is a touch screen of the portable electronic device 105.
[0019] The portable electronic device 105 includes the cursor control 135. In some embodiments, the cursor control 135 is, for example, a mouse, a trackball, a joystick, or cursor direction keys for communicating information to the portable electronic device 105. The portable electronic device 105 includes the input device 130. The input device 130 includes one or more navigational keys for communicating information to the portable electronic device 105. The information communicated is related to dragging and dropping one or more face detection frames in the image to relocate the positions of the one or more faces. The information can be communicated to the processor 115 from a machine-readable medium. The term machine-readable medium can be defined as a medium providing data to a machine to enable the machine to perform a specific function. The machine-readable medium can be a storage medium. All such media must be tangible to enable the instructions carried by the media to be detected by a physical mechanism that reads the instructions into the machine.
[0020] The portable electronic device 105 also includes the face tracking engine 145 for detecting the one or more faces and tracking the positions of the one or more faces detected in the preview frame of the image. In one embodiment, the face tracking engine 145 is an algorithm to detect and track one or more faces automatically or manually.
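To make the role of the face tracking engine 145 concrete, the following is a minimal sketch of automatic face detection on a single preview frame. It assumes OpenCV (cv2) with its bundled Haar cascade as a stand-in for the face tracking engine; the function names detect_face_frames and draw_face_frames are illustrative only.

```python
import cv2

# Assumption: OpenCV's bundled frontal-face Haar cascade acts as a stand-in
# for the face tracking engine 145.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def detect_face_frames(preview_frame):
    """Return (x, y, w, h) face detection frames found in one preview frame."""
    gray = cv2.cvtColor(preview_frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return [tuple(map(int, f)) for f in faces]

def draw_face_frames(preview_frame, frames):
    """Overlay rectangular face detection frames on the preview frame."""
    for (x, y, w, h) in frames:
        cv2.rectangle(preview_frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    return preview_frame
```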
[0021] FIG. 2 illustrates face detection with user intervention to capture an image in the portable electronic device 105, in accordance with an embodiment of the invention.
[0022] A user of the portable electronic device 105 desires to capture the image. The capturing unit 140 of the portable electronic device 105 may include a sensor for providing the preview frame of the image before capturing. For example, the preview frame of the image includes one or more faces. The live preview frames are given as an input to the face tracking engine 145. The face tracking engine 145 provides information to the user that a face is detected at a particular coordinate in the image. The face tracking engine 145 uses the information and focuses on that particular location in the preview frame. The face tracking engine 145 then detects one or more faces from the preview frame and tracks the faces automatically in successive frames. Further, during capturing of the image the user may intend to manually relocate positions of one or more faces detected in a preview mode. The preview mode includes previewing of the image on the screen by the user of the portable electronic device 105 and may be configured in the portable electronic device 105. Thus the user may manually relocate positions of one or more faces in a manual mode. The manual mode is selected by the user for manually relocating the positions of the one or more faces detected by the face tracking engine 145 and may be configured in the portable electronic device 105. The user of the portable electronic device 105 sets the positions by adjusting a face frame on the one or more faces. The face frame may be of any shape, for example, a square, a circle, or a rectangle. The face tracking engine 145 then continuously tracks the relocated positions of the one or more faces in the preview frame. The information with the relocated positions of the one or more faces detected in the image is obtained as an output from the face tracking engine 145. The information with the relocated positions of the one or more faces is then transmitted to the later modules 205. The later modules 205 may include a memory for storing the information.
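The information handed to the later modules 205 can be pictured as one record per face detection frame. The following is a hypothetical sketch of such a record; the FaceFrame name and the user_locked flag are illustrative assumptions, not part of the original disclosure.

```python
from dataclasses import dataclass

@dataclass
class FaceFrame:
    # Top-left corner and size of the face detection frame in the preview frame.
    x: int
    y: int
    width: int
    height: int
    # True once the user has manually relocated and locked this frame.
    user_locked: bool = False

# Example: one automatically detected frame and one frame relocated by the user.
auto_frame = FaceFrame(x=120, y=80, width=64, height=64)
relocated_frame = FaceFrame(x=300, y=90, width=64, height=64, user_locked=True)
```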
[0023] FIG. 3 illustrates a flowchart for face detection with user intervention to capture an image in a portable electronic device, in accordance with an embodiment of the invention.
[0024] At step 305, the method starts.
[0025] At step 310, positions of one or more faces associated with the image are automatically detected in a preview frame. The preview frame is displayed on a screen of the portable electronic device 105.
[0026] At step 315, the one or more faces associated with the image are manually relocated in the preview frame by the user of the portable electronic device 105. The one or more faces are manually relocated, based on a preference of the user.
[0027] At step 320, the image is captured in the portable electronic device 105. The image having relocated positions of the one or more faces is captured.
[0028] The method stops at step 325.
[0029] In one exemplary embodiment, the user defines a particular location and a particular face to be focused by the face tracking engine 145 while capturing the image.
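As a rough illustration only, the three steps of FIG. 3 can be sketched as follows, assuming OpenCV for the preview and detection; relocate_by_user is a hypothetical hook standing in for the manual drag-and-drop step.

```python
import cv2

def capture_with_user_intervention(camera_index=0, relocate_by_user=lambda frames: frames):
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    cap = cv2.VideoCapture(camera_index)
    ok, frame = cap.read()                                 # preview frame (step 310)
    cap.release()
    if not ok:
        raise RuntimeError("no preview frame available")
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = list(cascade.detectMultiScale(gray, 1.1, 5))   # auto detection (step 310)
    faces = relocate_by_user(faces)                        # manual relocation (step 315)
    cv2.imwrite("captured.jpg", frame)                     # capture (step 320)
    return faces
```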
[0030] FIGS. 4a-4b illustrate a flowchart for face detection with user intervention to capture an image in a portable electronic device, in accordance with another embodiment of the invention.
[0031] At step 405, the method starts.
[0032] At step 410, positions of one or more faces associated with the image are automatically detected in a preview frame. The preview frame is displayed on a screen of the portable electronic device 105.
[0033] At step 415, the image is sensed. The capturing unit 140 of the portable electronic device 105 includes the sensor for providing the preview frame of the image. The preview frame of the image is then fed as the input to the face tracking engine 145. The preview frame of the image is displayed on the display unit 125 of the portable electronic device 105.
[0034] At step 420, one or more face detection frames are automatically generated in the image for detecting the positions of the one or more faces in an auto detection mode. The auto detection mode is configured in the portable electronic device 105. The processor 115 of the portable electronic device 105, responsive to the instructions, automatically generates the one or more face detection frames in the image for detecting the positions of the one or more faces in the auto detection mode. In one embodiment, the auto detection mode is configured by a manufacturer of the portable electronic device 105.
[0035] At step 425, the positions of the one or more faces are continuously tracked to optimize the image during motion. The face tracking engine 145 of the portable electronic device 105 continuously tracks the positions of the one or more faces in the image.
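One simple way to realise the continuous tracking of step 425 is tracking by re-detection: faces are re-detected in each new preview frame and every previously known frame is snapped to the nearest new detection. The sketch below is an assumption about how such tracking could work, not the engine's actual algorithm.

```python
def track_face_frames(previous_frames, new_detections):
    """Update each known (x, y, w, h) face frame to its nearest new detection."""
    def centre(frame):
        x, y, w, h = frame
        return (x + w / 2.0, y + h / 2.0)

    updated = []
    for prev in previous_frames:
        px, py = centre(prev)
        # Keep the previous position if nothing was detected in the new frame.
        best = min(new_detections,
                   key=lambda d: (centre(d)[0] - px) ** 2 + (centre(d)[1] - py) ** 2,
                   default=prev)
        updated.append(best)
    return updated
```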
[0036] At step 430, the manual mode is selected.
[0037] At step 435, positions of the one or more faces associated with the image in the preview frame are manually relocated by the user of the portable electronic device 105, based on the preference of the user. In one embodiment, the manual mode is selected for manually relocating the positions of the one or more faces detected by the face tracking engine 145. The processor 115 of the portable electronic device 105 relocates the positions of the one or more faces in response to the instructions provided by the user.
[0038] At step 440, the one or more face detection frames are dragged and dropped in the image to relocate the positions of the one or more faces. The user of the portable electronic device 105 drags and drops the one or more face detection frames using one or more navigational keys of the portable electronic device 105. In one embodiment, touch sense detection in the portable electronic device 105 is enabled for dragging and dropping the one or more face detection frames. This has been explained in detail in conjunction with FIGS. 5a-5b.
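The following is a minimal sketch of the drag-and-drop relocation of step 440, using OpenCV mouse events as a desktop stand-in for touch sense detection; the window name and the frame-list layout are illustrative assumptions.

```python
import cv2

frames = [[120, 80, 64, 64], [300, 200, 64, 64]]   # [x, y, w, h] per face frame
dragging = {"index": None}                          # index of the frame being dragged

def on_mouse(event, x, y, flags, param):
    if event == cv2.EVENT_LBUTTONDOWN:
        # Pick up the face detection frame under the pointer, if any.
        for i, (fx, fy, fw, fh) in enumerate(frames):
            if fx <= x <= fx + fw and fy <= y <= fy + fh:
                dragging["index"] = i
                break
    elif event == cv2.EVENT_MOUSEMOVE and dragging["index"] is not None:
        # Move the picked-up frame so that its centre follows the pointer.
        i = dragging["index"]
        frames[i][0] = x - frames[i][2] // 2
        frames[i][1] = y - frames[i][3] // 2
    elif event == cv2.EVENT_LBUTTONUP:
        dragging["index"] = None                    # drop: the relocated position is kept

cv2.namedWindow("preview")
cv2.setMouseCallback("preview", on_mouse)
```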
[0039] At step 445, the position of at least one face detection frame in the image is retained. For example, consider that the user intends to capture a group image. The group image may include one or more faces with four face detection frames. The user may desire to relocate the positions of two face detection frames among the four face detection frames appearing in the group image. The user can perform this task by dragging and dropping only the desired face detection frames and retaining the other two face detection frames in the preview frame.
[0040] At step 450, the relocated positions of the one or more faces in the preview frame are continuously tracked by the face tracking engine 145 associated with the portable electronic device 105 to optimize the image during motion.
[0041] At step 455, the information associated with the relocated positions of the one or more faces is obtained. The information of the relocated positions is obtained for customizing the image before capturing. The information is stored in the storage unit 110 of the portable electronic device 105.
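A minimal sketch of step 455, assuming a simple JSON file stands in for the storage unit 110; the file name is an illustrative assumption.

```python
import json

def store_relocated_positions(frames, path="relocated_faces.json"):
    """Persist the (x, y, w, h) face frames so capture can customise the image."""
    with open(path, "w") as fp:
        json.dump([list(f) for f in frames], fp)

def load_relocated_positions(path="relocated_faces.json"):
    with open(path) as fp:
        return [tuple(f) for f in json.load(fp)]
```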
[0042] At step 460, the image having relocated positions of the one or more faces is captured in the portable electronic device 105. The image captured is stored in the memory 120 of the portable electronic device 105. The image may be captured by the capturing unit 140.
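A minimal sketch of step 460, assuming the relocated face frames are used as metering regions before the frame is written out as the captured image; the mean-brightness print is only a placeholder for real focus and exposure control.

```python
import cv2

def capture_with_relocated_faces(preview_frame, relocated_frames, out_path="capture.jpg"):
    for (x, y, w, h) in relocated_frames:
        roi = preview_frame[y:y + h, x:x + w]
        if roi.size:
            # On a real device this region would drive focus/exposure; here we
            # only report its mean brightness as a stand-in for metering.
            print("metering region mean brightness:", float(roi.mean()))
    cv2.imwrite(out_path, preview_frame)   # store the captured image (memory 120)
    return out_path
```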
[0043] The method stops at step 465.
[0044] FIGS. 5a-5b show screen shots of face detection with user intervention to capture an image in a portable electronic device, in accordance with exemplary embodiments of the invention.
[0045] Fig. 5a shows the screen shot when the user of the portable electronic device 105 intends to manually relocate the position of a face associated with the image in the preview frame, in accordance with an exemplary embodiment of the invention. Consider an example in which the user of the portable electronic device 105 desires to capture a photo of 510. Please note that, for easy understanding of this example, the faces of the persons are the same as the face detection frames around them.
[0046] The face tracking engine 145 of the portable electronic device 105 automatically detects 505 in the photo by generating a face detection frame around 505. Considering that the portable electronic device 105 is a touch screen device, the user may drag and drop the face detection frame 505 to 510 to generate the relocated face detection frame 505 around 510 (as shown in screen shot 2 of Fig. 5a). Thereby, the user manually relocates the position of the face detection frame 505. The relocated face detection frame 505 is now locked by the user and the face associated with the face detection frame 505 is continuously tracked before capturing the photo.
[0047] Fig. 5b shows the screen shot when the user of the portable electronic device 105 intends to manually relocate positions of multiple faces associated with the image in the preview frame, in accordance with another exemplary embodiment of the invention. The user of the portable electronic device 105 desires to capture a photo with 520, 525, and 530. Please note that, for easy understanding of this example, the faces of the persons are the same as the face detection frames around them.
[0048] The face tracking engine 145 of the portable electronic device 105 automatically detects 515, 520, and 525 in the photo by generating the face detection frames around 515, 520, and 525. Considering that the portable electronic device 105 is a touch screen device, the user may drag and drop the face detection frame 515 to 530 to generate the relocated face detection frame 515 around 530 (as shown in screen shot 2 of Fig. 5b). Thereby, the user manually relocates the position of the face detection frame 515. The relocated face detection frame 515 is now locked by the user and the face associated with the face detection frame 515 is continuously tracked before capturing the photo.
[0049] Various embodiments of the present invention provide a method and system for face detection with user intervention to capture the image in a portable electronic device 105. The user can manually relocate positions of one or more faces in the portable electronic device. Thereby, the invention provides an efficient and easy methodology for capturing intended faces in the image.
[0050] In the present specification, the present invention and its advantages have been described with reference to specific embodiments. However, it will be apparent to a person of ordinary skill in the art that various modifications and changes can be made without departing from the scope of the present invention, as set forth in the claims below. Accordingly, the specification and figures are to be regarded as illustrative examples of the present invention, rather than in a restrictive sense. All such possible modifications are intended to be included within the scope of the present invention.
1. A method for face detection with user intervention to capture an image in a portable
electronic device, the method comprising:
automatically detecting positions of one or more faces associated with the image
in a preview frame, the preview frame being displayed on a screen of the portable
electronic device;
manually relocating positions of the one or more faces associated with the image
in the preview frame by a user of the portable electronic device, based on a
preference of the user; and
capturing the image in the portable electronic device, the image having relocated
positions of the one or more faces.
2. The method as claimed in claim 1, wherein the step of automatically detecting one or
more faces comprises:
sensing the image;
automatically generating one or more face detection frames in the image for
detecting the positions of the one or more faces in an auto detection mode, the
auto detection mode being configured in the portable electronic device; and
continuously tracking the positions of the one or more faces to optimize the image
during motion.
3. The method as claimed in claim 1, wherein the step of manually relocating positions
of the one or more faces comprises:
dragging and dropping the one or more face detection frames in the image to
relocate the positions of the one or more faces; and
retaining the position of at least one face detection frame in the image.
4. The method as claimed in claim 1 further comprising:
selecting a manual mode for performing the step of manually relocating positions of the one or more faces, the manual mode being configured in the portable electronic device.
5. The method as claimed in claim 1 further comprising:
continuously tracking the relocated positions of the one or more faces in the
preview frame by a face tracking engine associated with the portable electronic
device to optimize the image during motion; and
obtaining information associated with the relocated positions of the one or more
faces for customizing the image before performing the step of capturing the
image.
6. A portable electronic device comprising:
a processor responsive to instructions for:
automatically detecting positions of one or more faces associated with an image in a preview frame, the preview frame being displayed in the portable electronic device;
manually relocating positions of the one or more faces associated with the image in the preview frame by a user of the portable electronic device; and a capturing unit for capturing the image in the portable electronic device.
7. The portable electronic device as claimed in claim 6 is at least one of:
a digital camera; and
a mobile phone with a camera.
8. The portable electronic device as claimed in claim 6, wherein the manually relocating
positions is performed by using at least one of:
a touch sense detection enabled in the portable electronic device; and
one or more navigation keys enabled in the portable electronic device.
9. The portable electronic device as claimed in claim 6, further comprising:
a face tracking engine for continuously tracking the relocated positions of the one or more faces to optimize the image during motion.
10. A method for face detection with user intervention in a portable electronic device, the portable electronic device as described herein and in accompanying figures.