
Image Processing Device And Image Processing Method

Abstract: To provide an image processing device and the like which allow extraction of attached shadow areas and cast shadow areas from a snapshot-like image without requiring a large-scale system enabling movement of a light source. The image processing device (100) performs processing on a shadow in an image of an object, and includes: an image information obtaining unit (110) configured to obtain information about an image of the object, the information including luminance information which is information about luminance of light from the object and polarization information which is information about polarization of the light from the object; a shadow area extracting unit (120) configured to extract an attached shadow area and a cast shadow area from the image of the object based on the luminance information and the polarization information obtained by the image information obtaining unit (110), the attached shadow area appearing on the surface of the object depending on an angle of incident light, and the cast shadow area appearing on the surface of a material body other than the object when the light is blocked by the object; and an output unit (130) configured to output information identifying the attached shadow area and the cast shadow area extracted by the shadow area extracting unit (120).


Patent Information

Application #
Filing Date
02 March 2009
Publication Number
22/2009
Publication Type
INA
Invention Field
COMMUNICATION
Status
Email
Parent Application

Applicants

PANASONIC CORPORATION
1006, OAZA KADOMA, KADOMA-SHI, OSAKA 571-8501

Inventors

1. SATO, SATOSHI
C/O PANASONIC CORPORATION 1006, OAZA KADOMA, KADOMA-SHI, OSAKA 571-8501
2. KANAMORI, KATSUHIRO
C/O PANASONIC CORPORATION 1006, OAZA KADOMA, KADOMA-SHI, OSAKA 571-8501
3. NAKATA, MIKIYA
C/O PANASONIC CORPORATION 1006, OAZA KADOMA, KADOMA-SHI, OSAKA 571-8501

Specification

DESCRIPTION

IMAGE PROCESSING DEVICE AND IMAGE PROCESSING METHOD

Technical Field

[0001] The present invention relates to devices which perform processing on images of objects, and in particular to devices which perform processing on shadows.

Background Art

[0002] Recently, image processing methods using shadows have been widely performed. For example, Non-patent Reference 1 has proposed an approach for estimating the distribution of light from a light source of real lighting based on the luminance distribution in a cast shadow, which is a "shadow" generated by a material body which is an object having a known shape. This approach is derived from the fact that the relationship between the light source, the object, and the shadow can be represented as a geometrical and optical model. When this concept is applied to an environment about which light source information is known, it is possible to estimate the three-dimensional shape of an object based on the shadow.

[0003] In addition, in image processing, when the luminance is low in a portion of an image due to backlight or a shadow, a shadow correction process for correcting only the luminance of the shadow area is performed to provide a beautiful image.

[0004] Here, a "shadow" appears when light strikes a solid, and includes an "attached shadow" and a "cast shadow". An "attached shadow" appears on the solid itself depending on the angle of incident light, and a "cast shadow" appears on a plane or on another solid when light is blocked by the former solid.

[0005] Since Non-patent Reference 1 is a method for estimating a light source distribution based on such cast shadow areas only, it is obvious that no accurate estimation can be performed in the case where the processing is performed based on a judgment that an attached shadow area is a cast shadow area.

[0006] In addition, in the case of a shadow correction process, an attached shadow area is an element providing a stereoscopic vision of an object.
Thus, it is desirable that such correction is made for the cast shadow area only, and that no processing is performed on the attached shadow area.

[0007] For this, it is very important to classify shadow areas into attached shadow areas and cast shadow areas.

[0008] In order to classify shadow areas into attached shadow areas and cast shadow areas, Non-patent Reference 2 generates a linearized image, which is an image obtainable in an ideal state where no specular reflection occurs, using images of an object lighted by light sources in various directions, and classifies the shadow areas based on the linearized image.

Non-patent Reference 1: "Buttai no inei ni motozuku kogen kankyo suitei (Illumination Distribution from Shadows)", Computer Vision and Imaging Media, the Journal of The Institute of Electronics, Information and Communication, Vol. 41, No. SIG10 (CVIM1), pp. 31-40, 2000, Imari Sato, Yoichi Sato, Katsushi Ikeuchi.

Non-patent Reference 2: "Kogaku gensho no bunrui ni motozuku gazo no senkeika (Photometric Linearization based on Classification of Photometric Factors)", Computer Vision and Imaging Media, the Journal of The Institute of Electronics, Information and Communication, Vol. 44, No. SIG5 (CVIM6), pp. 11-21, 2003, Yasunori Ishii, Kohtaro Fukui, Yasuhiro Mukaigawa, Takeshi Shakunaga.

Disclosure of Invention

Problems that the Invention is to Solve

[0009] However, Non-patent Reference 2 entails a problem of requiring images of an object lighted by light sources in various directions, and thus requires a large-scale device. In addition, Non-patent Reference 2 enables classification of a shadow which appears when a light source is moved, but does not enable classification of a shadow which appears when no light source is moved. Therefore, Non-patent Reference 2 enables neither classification of shadows cast by solar light outdoors nor classification of shadows cast by incandescent lamps used as indoor lighting.
[0010] In consideration of this, the present invention has an aim to provide an image processing device and the like which allow extraction of attached shadow areas and cast shadow areas from a snapshot-like image without requiring a large-scale system enabling movement of a light source.

Means to Solve the Problems

[0011] In order to achieve the above aim, the image processing device according to the present invention performs processing on a shadow in an image of an object, and includes: an image information obtaining unit configured to obtain information about the image of the object, the information including luminance information which is information about luminance of light from the object and polarization information which is information about polarization of the light from the object; a shadow area extracting unit configured to extract an attached shadow area and a cast shadow area from the image of the object based on the luminance information and the polarization information obtained by the image information obtaining unit, the attached shadow area appearing on the surface of the object depending on an angle of incident light, and the cast shadow area appearing on the surface of a material body other than the object when the light is blocked by the object; and an output unit configured to output information identifying the attached shadow area and the cast shadow area extracted by the shadow area extracting unit.

[0012] More specifically, a focus in the present invention is placed on the difference between the polarization characteristics of attached shadow areas and the polarization characteristics of cast shadow areas. The present invention extracts attached shadow areas and cast shadow areas by focusing on the degree of polarization, which is polarization information, and the difference in polarization characteristics, which is an estimated polarization error.
In addition, since it is difficult to separate a black object having a low reflectance from a shadow, the present invention performs area extraction on low luminance areas including shadow areas and low reflectance areas. The use of polarization information in this manner makes it possible to easily extract attached shadow areas and cast shadow areas from low luminance areas including shadow areas.

[0013] It is to be noted that the present invention can be implemented not only as an image processing device, but also as an image processing method, as a program causing a computer to execute the steps included in the method, and as a computer-readable recording medium such as a DVD on which the program is recorded.

Effect of the Invention

[0014] According to the present invention, attached shadow areas and cast shadow areas are extracted using the polarization information of an object. In this way, it becomes possible to extract attached shadow areas and cast shadow areas from a snapshot-like image captured in a general environment without requiring a large-scale system enabling movement of a light source.

[0015] Therefore, the present invention makes it possible to easily extract attached shadow areas and cast shadow areas, enabling high refinement of an image. The present invention is thus highly practical today, when mobile imaging devices such as mobile phones with a camera, digital cameras, digital movie cameras and the like are becoming popular, because image resolutions are important for such mobile imaging devices, whose optical systems and imaging elements are miniaturized.

Brief Description of Drawings

[0016]
[FIG. 1] FIG. 1 is a functional block diagram showing the structure of an optical area dividing device in Embodiment 1 of the present invention.
[FIG. 2] FIG. 2 is a structural diagram of a camera mounting the optical area dividing device in Embodiments 1, 2, and 4 of the present invention.
[FIG. 3] FIG.
3 is a schematic diagram showing the relationship between a patterned polarizer and imaging elements provided in the camera shown in FIG. 2.
[FIG. 4] FIG. 4 is a flowchart of processes performed by the optical area dividing device in Embodiment 1 of the present invention.
[FIG. 5] FIG. 5 is a schematic diagram for illustrating an arrangement state of the patterned polarizer provided in the camera shown in FIG. 2.
[FIG. 6] FIG. 6 is a schematic diagram for illustrating a luminance sinusoidal variation and observed luminance points.
[FIG. 7] FIG. 7(a) is a diagram showing a plastic sphere ball as an object, and FIG. 7(b) to (d) are diagrams which respectively represent, in three images, the degree of polarization ρ, the polarization phase φ, and the estimated polarization error E in the case where the object is imaged.
[FIG. 8] FIG. 8(a) to (d) are schematic diagrams obtained by emphasizing the contrast of the respectively corresponding FIG. 7(a) to (d).
[FIG. 9] FIG. 9 is a graph showing the degree of polarization with respect to the incidence angle of specular reflection components when the refractive indices n of the object equal 1.1, 1.3, 1.5, and 2.0.
[FIG. 10] FIG. 10

I(ψ) = A·sin 2(ψ − B) + C (Expression 2)

Where,

[0048]
[Math 3]
A = √(a² + b²), sin(−2B) = b/√(a² + b²), cos(−2B) = a/√(a² + b²) (Expression 3)

[0049]
[Math 4]
B = −(1/2)·tan⁻¹(b/a) (Expression 4)

[0050] In other words, the sinusoid (Expression 1) is approximated by calculating A, B, and C which minimize the following Expression 5 for the samples (ψi, Ii) of the four pixels. Here, Ii denotes the luminance observed when the rotation angle of the polarizing plate is ψi. In addition, N is the number of samples, and is 4 here.

[0051]
[Math 5]
f(a, b, C) = Σ_{i=1}^{N} (Ii − a·sin 2ψi − b·cos 2ψi − C)² (Expression 5)

[0052] The above processes determine the three parameters A, B, and C of the sinusoidal approximation.
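The approximation in Expressions 2 to 5 can be sketched in a few lines of code. The following is a minimal illustration, not the patent's implementation: it solves the linear least-squares problem of Expression 5 for (a, b, C) and converts the result to the amplitude-phase form of Expressions 2 to 4. The function name and the use of NumPy are assumptions made for illustration.

```python
import numpy as np

def fit_sinusoid(psi, lum):
    """Least-squares fit of I(psi) = a*sin(2*psi) + b*cos(2*psi) + C
    (Expression 5), converted to the form I(psi) = A*sin(2*(psi - B)) + C
    (Expressions 2 to 4). psi: polarizer rotation angles in radians;
    lum: observed luminances at those angles."""
    psi = np.asarray(psi, dtype=float)
    lum = np.asarray(lum, dtype=float)
    # Design matrix for the linear parameters (a, b, C).
    M = np.column_stack([np.sin(2 * psi), np.cos(2 * psi), np.ones_like(psi)])
    (a, b, C), *_ = np.linalg.lstsq(M, lum, rcond=None)
    A = np.hypot(a, b)             # Expression 3: A = sqrt(a^2 + b^2)
    B = -0.5 * np.arctan2(b, a)    # Expression 4 (arctan2 for robustness)
    return A, B, C
```

With the four samples ψi = 0°, 45°, 90°, 135° used in the text, the system has four equations and three unknowns, so the fit is exact when the observations follow the model.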
[0053] The polarization information generating unit 102 generates any one or some of the following as polarization information using the parameters calculated in this way.

- The degree of polarization ρ

[0054]
[Math 6]
ρ = (I_max − I_min)/(I_max + I_min) = A/C (Expression 6)

[0055]
- Polarization phase φ (0 degrees ≤ φ < 180 degrees)

[0056]
[Math 7]
φ = π/4 + B (Expression 7)

[0057]
- Estimated polarization error E

[0058]
[Math 8]
E = Σ_{i=1}^{N} (Ii − A·sin 2(ψi − B) − C)² (Expression 8)

[0059] Here, the degree of polarization is a parameter indicating the degree to which light is polarized. The polarization phase is the angle at which the luminance, which changes depending on the angle of the polarization principal axis, becomes maximum. The estimated polarization error is the total of the differences between the luminance values observed in the four pixel samples and the corresponding luminance values determined from the sinusoid obtained through the approximation.

[0060] FIG. 7 is a diagram representing, in the form of images, the degree of polarization ρ, the polarization phase φ, and the estimated polarization error E in the case where a plastic sphere ball as an object is imaged. In this diagram, FIG. 7(a) shows a plastic sphere ball as an object, FIG. 7(b) shows the degree of polarization ρ of the object of FIG. 7(a), and FIG. 7(c) shows the polarization phase

and detailed descriptions thereof are omitted. When the shadow area detecting unit 301 judges that a target pixel is a shadow area (Yes in S201), the estimated polarization error comparing unit 304 and the degree-of-polarization comparing unit 303 evaluate the estimated polarization error E defined by Expression 8 and the degree of polarization ρ defined by Expression 6, respectively, in order to judge whether the pixel is an attached shadow area or a cast shadow area. In other words, the estimated polarization error comparing unit 304 compares the estimated polarization error E and the threshold value Th_Err, and the degree-of-polarization comparing unit 303 compares the degree of polarization ρ and the threshold value Th_P.

[0101] As the result, when the estimated polarization error E is greater than the threshold value Th_Err, or the magnitude of the degree of polarization ρ is less than the threshold value Th_P (Yes in S208), the area judging unit 305 judges that the pixel is a cast shadow area (S205), whereas when the magnitude of the estimated polarization error E is less than the threshold value Th_Err, and the magnitude of the degree of polarization ρ is greater than the threshold value Th_P (No in S208), the area judging unit 305 judges that the pixel is an attached shadow area (S206). In this way, both the estimated polarization error and the degree of polarization are used to judge whether the pixel is an attached shadow area or a cast shadow area.

[0102] It is to be noted that the threshold value Th_Err of the estimated polarization error E may take a greater value compared to the case of making a judgment by only using the estimated polarization error as in the processes in FIG. 15, and that the threshold value Th_P of the degree of polarization ρ may take a smaller value compared to the case of making a judgment based on the degree of polarization only as in the processes in FIG. 23.
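The two threshold comparisons above reduce to a short piece of branching logic. The sketch below is an illustrative rendering of that judgment, not the patent's implementation; the function name and any concrete threshold values are assumptions.

```python
def judge_shadow_area(err, rho, th_err, th_p):
    """Judge a shadow pixel using the estimated polarization error
    (Expression 8) and the degree of polarization (Expression 6)."""
    # Cast shadow: diffuse-like polarization, i.e. a poor sinusoidal fit
    # (large estimated error) or weak polarization (low degree of polarization).
    if err > th_err or rho < th_p:
        return "cast shadow"      # step S205
    # Attached shadow: specular-like polarization, i.e. a good fit and a
    # high degree of polarization.
    return "attached shadow"      # step S206
```

For example, a pixel with a large estimated error is classified as a cast shadow even if its degree of polarization is high, matching the "or" condition in paragraph [0101].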
In addition, the area judging unit 305 may judge that the pixel is a cast shadow area only when the magnitude of the estimated polarization error E is greater than the threshold value Th_Err, and the magnitude of the degree of polarization ρ is less than the threshold value Th_P (S208).

[0103] In addition, although an estimated polarization error is used to judge whether the pixel is an attached shadow area or a cast shadow area (S204) in the processes of FIG. 15, the judgment may be made based on the fact that the polarization characteristics of the attached shadow area are specular reflection characteristics. For example, the polarization phase φ defined by Expression 7 may be used. As described above, the polarization phase φ shows one component of the normal vector of the object, but the relationship between the polarization phase φ and the one component of the normal vector of the object varies by 90 degrees depending on whether specular reflection components or diffuse reflection components are dominant regarding the object. For example, FIG. 7(c) (FIG. 8(c), which is a schematic diagram of FIG. 7(c)) shows that the polarization phase of the attached shadow area is significantly different from the polarization phase information of the adjacent area. This is because the attached shadow area shows the polarization characteristics of specular reflection components, and the adjacent area shows the polarization characteristics of diffuse reflection components. For this, the pixel indicating the polarization characteristics of specular reflection components is detected, and the attached shadow area is detected by evaluating the continuity of the polarization phase of the object.

[0104] In addition, the present invention can be implemented not only as an independent optical area dividing device as shown in FIG.
1, but also as a device obtained by combining a processing unit for generating normal vector information using output by the optical area dividing device 100 with the optical area dividing device 100.

[0105] FIG. 26 is a functional block diagram showing the structure of the device obtained by combining the processing unit for generating normal vector information (normal vector information generating unit) with the optical area dividing device 100, which is an example of the image processing device according to the present invention. The normal vector information generating unit 104 is a processing unit for generating, for each area divided by the area dividing unit 103, normal vector information identifying a normal vector on the surface of a corresponding object using the polarization information generated by the polarization information generating unit 102.

[0106] FIG. 27 is a flowchart of processes performed by the optical area dividing device and the normal vector information generating device shown in FIG. 26. This diagram includes step S104 for processing normal vector information next to step S103 in the flowchart shown in FIG. 4. The normal vector information generating unit 104 generates, for each area divided by the area dividing unit 103, normal vector information identifying a normal vector on the surface of a corresponding object using the polarization information generated by the polarization information generating unit 102 (S104), after the area dividing process (S103).

[0107] Here, a description is given of an algorithm for generating normal vector information from polarization information. A known method is a method for calculating, based on the polarization phase φ, the one-dimensional degree of freedom of an angle at an emission plane (incidence plane) containing the rays of incident light and reflected light from among the normal vector information of the object.
It is also known that how to calculate normal vector information is totally different depending on whether specular reflection is dominant or diffuse reflection is dominant in the object (for example, see Non-patent Reference 6: "Using polarization to determine intrinsic surface properties", Ondrej Drbohlav and Radim Sara, Proc. SPIE Vol. 3826, pp. 253-263, 1999). In the case where diffuse reflection components are dominant, information of an emission plane of diffuse reflection can be calculated as the angle at which the luminance changed by the rotation of a polarizing plate becomes the maximum. In the opposite case where specular reflection components are dominant, information of an incidence plane of specular reflection can be calculated as the angle at which the luminance changed by the rotation of a polarizing plate becomes the minimum. Here, focusing on the fact that the variation curve of the polarization luminance is a sinusoid with a 180-degree cycle, it is known that the one-dimensional degree of freedom of an estimated normal vector includes an error of 90 degrees in the case where normal vector information is generated without considering whether diffuse reflection is dominant or specular reflection is dominant. Therefore, classification into diffuse reflection and specular reflection is important in the process of generating normal vector information from polarization information.

[0108] FIG. 28 is a functional block diagram showing a detailed structure of the normal vector information generating unit 104 shown in FIG.
26. The normal vector information generating unit 104 is a processing unit for generating normal vector information from polarization information based on the result of the area division performed by the area dividing unit 103, and includes an accumulation unit 306, an area referencing unit 307, a unit for generating normal vector information assuming diffuse reflection 308, and a unit for generating normal vector information assuming specular reflection 309. It is to be noted that, in this diagram, the structural elements common with FIG. 14 are assigned with the same numerical references as those in FIG. 14, and detailed descriptions thereof are omitted.

[0109] The area referencing unit 307 is a processing unit for judging whether diffuse reflection components are dominant or specular reflection components are dominant in a target pixel (whether the target pixel is a diffuse reflection area or a specular reflection area), or whether the pixel is an attached shadow area or not, by referring to the result of the area division accumulated in the accumulation unit 306.

[0110] The unit for generating normal vector information assuming diffuse reflection 308 is a processing unit for generating normal vector information of a pixel corresponding to a diffuse reflection area assuming diffuse reflection. More specifically, the angle of the polarization principal axis at which the luminance becomes the maximum in the sinusoid obtained through the approximation is generated as the normal vector information of the emission plane of the object corresponding to the pixel.

[0111] The unit for generating normal vector information assuming specular reflection 309 is a processing unit for generating normal vector information of pixels corresponding to specular reflection areas and attached shadow areas assuming specular reflection.
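The contrast between the two generating units can be sketched as follows. This is a minimal illustration under the assumption that the fitted sinusoid is I(ψ) = A·sin 2(ψ − B) + C, so the luminance maximum lies at ψ = π/4 + B (Expression 7) and the minimum 90 degrees away; the function name is an assumption for illustration.

```python
import math

def principal_axis_angle(B, assume_specular):
    """Return the polarization principal-axis angle (radians, in [0, pi))
    used as one-degree-of-freedom normal information, for a fitted
    sinusoid phase parameter B."""
    phase_max = (math.pi / 4 + B) % math.pi   # angle of maximum luminance
    if assume_specular:
        # Specular reflection areas and attached shadow areas (unit 309):
        # use the luminance minimum, i.e. the incidence plane, 90 deg away.
        return (phase_max + math.pi / 2) % math.pi
    # Diffuse reflection areas (unit 308): use the luminance maximum,
    # i.e. the emission plane.
    return phase_max
```

The 90-degree offset between the two branches is exactly the error described in paragraph [0107] that arises when the diffuse/specular classification is skipped.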
More specifically, the angle of the polarization principal axis at which the luminance becomes the minimum in the sinusoid obtained through the approximation is generated as the normal vector information of the incidence plane of the object corresponding to the pixel.

[0112] FIG. 29 is a flowchart of processes performed by this normal vector information generating unit 104. First, the area referencing unit 307 judges whether or not diffuse reflection components are dominant in the pixel, based on the result of the optical area division detected by the area dividing unit 103. In this processing, the result of the area division may be read from the accumulation unit 306, in which the result of the area judging unit 305 is accumulated. When it is judged that diffuse reflection components are dominant (Yes in S301), the unit for generating normal vector information assuming diffuse reflection 308 generates normal vector information of the pixel assuming diffuse reflection (S302). More specifically, the one-dimensional degree of freedom of the normal vector on an emission plane is calculated as the angle at which the luminance changed by the rotation of a polarizing plate becomes the maximum value. In other words, the angle of the polarization principal axis at which the luminance becomes the maximum in the sinusoid obtained through the approximation is generated as the normal vector information of the emission plane of the object corresponding to the pixel.

[0113] In addition, when diffuse reflection components are not dominant in the pixel (No in S301), the area referencing unit 307 judges whether specular reflection components are dominant (the pixel is a specular reflection area) or the pixel is an attached shadow area (S303).
As the result, when it is judged that specular reflection components are dominant, or the pixel is an attached shadow area (Yes in S303), the unit for generating normal vector information assuming specular reflection 309 generates normal vector information of the pixel assuming specular reflection (S304). More specifically, the one-dimensional degree of freedom of the normal vector on the incidence plane is calculated as the angle at which the luminance changed by the rotation of a polarizing plate becomes the minimum value. In other words, the angle of the polarization principal axis at which the luminance becomes the minimum in the sinusoid obtained through the approximation is generated as the normal vector information of the incidence plane of the object corresponding to the pixel.

[0114] In contrast, when the pixel is judged to be a cast shadow area, in other words, when neither diffuse reflection components nor specular reflection components are dominant and the pixel is not an attached shadow area (No in S303), this normal vector information generating unit 104 judges that errors are dominant in the polarization information of the pixel and that it is impossible to generate accurate normal vector information, and does not perform any normal vector information generating process (S305).

[0115] As described above, it is possible to achieve a normal vector information generating device which automatically generates normal vector information for each optical area including a shadow area by combining the normal vector information generating unit with the image processing device according to the present invention.

[0116] In addition, the shadow area detecting unit 301 shown in FIG.
14 and the like may use a light emitting device 207 (such as a flash) mounted on a camera 200. This is because, when an object having a sufficiently small reflectance such as a blackout curtain exists, a judgment based on the luminance value is insufficient to distinguish the shadow area and the blackout curtain. With reference to the drawings, a detailed description is given of the optical area dividing device according to the Variation of this Embodiment in which such a flash is used.

[0117] FIG. 30 is a functional block diagram showing the structure of the optical area dividing device 100a according to this Variation. Here, a normal vector information generating unit 104 is also shown in this example of processing in which the result of processing performed by the optical area dividing device 100a is used. This optical area dividing device has the structure of the optical area dividing device 100 shown in FIG. 1 with the light emitting unit 105 added thereto. It is to be noted that, in FIG. 30, the structural elements common with FIG. 26 are assigned with the same numerical references as those in FIG. 26, and detailed descriptions thereof are omitted.

[0118] The light emitting unit 105 is a flash for projecting light onto the object working with the imaging operations performed by the optical area dividing device 100a. At this time, this light emitting unit 105 controls lighting of the flash. The polarized image capturing unit 101 captures two images working with the light emitting unit 105; one of the images is captured in a state where the flash is used, and the other is captured in a state where the flash is not used. At this time, the images are captured in such a manner that the positional relationship between the object and the camera 200 is not changed. For example, such imaging may be performed using a serial imaging function of the camera 200.

[0119] FIG.
31 is a flowchart of shadow detecting processes performed by the optical area dividing device 100a according to this Variation. In other words, FIG. 31 is a flowchart indicating another approach of the shadow area detecting process (S201) in FIG. 15. First, the shadow area detecting unit 301 checks the luminance value of the pixel in a state where the flash is not used (S401). In the case where the luminance value of the pixel is greater than the threshold value (No in S401), the shadow area detecting unit 301 judges that the pixel is not a low luminance area (here, a shadow area) (S402), and ends the processing.

[0120] In the opposite case where the luminance value of the pixel is less than the threshold value (Yes in S401), it is highly likely that the pixel is a shadow area, and thus the shadow area detecting unit 301 generates a differential image between the flash image captured using the flash and the normal image captured without using the flash (S403). Assuming that the lighting position of the flash is sufficiently close to the positions of the imaging elements and the distances between them are approximately equal, a cast shadow caused by the lighting flash does not exist on the image. This is because the sight line direction equals the light source direction. Therefore, direct light appears on the areas when the flash is lighted, although the areas are shadow areas in a no-flash state. Accordingly, the luminance values of the shadow areas increase significantly.

[0121] On the other hand, when the pixel is not a shadow area but a blackout curtain having a low reflectance, the luminance value does not substantially change when the flash is used because of its low reflectance. In other words, when the luminance value of the differential image captured using the flash is equal to or greater than the threshold value (Yes in S404), the shadow area detecting unit 301 judges that the pixel is a shadow area (S405), and ends the processing.
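The flash-assisted detection just described can be sketched per pixel as below. This is an illustrative sketch of the S401 onward flow, not the patent's implementation; the function name and threshold values are assumptions.

```python
def detect_low_luminance_pixel(lum_no_flash, lum_flash, th_dark, th_diff):
    """Flash-assisted discrimination of a dark pixel.
    lum_no_flash / lum_flash: the pixel's luminance without and with flash."""
    if lum_no_flash >= th_dark:
        # S402: the pixel is bright enough; not a low luminance area.
        return "not a low luminance area"
    # S403: differential image between the flash and no-flash captures.
    diff = lum_flash - lum_no_flash
    if diff >= th_diff:
        # The flash brightened the pixel markedly: it was a shadow (S405).
        return "shadow area"
    # The pixel stays dark even under the flash: low reflectance material.
    return "low reflectance area"
```

The key observation is the asymmetry: a shadow brightens strongly under direct flash light, while a blackout curtain stays dark because its reflectance, not its illumination, limits the luminance.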
In contrast, when the luminance value of the differential image by the flash is less than the threshold value, the shadow area detecting unit 301 judges that the pixel is not a shadow area but a low reflectance area (a low reflectance pixel) (S406), and ends the processing.

[0122] Even when an object such as a blackout curtain having a small reflectance exists in this way, it is possible to detect shadow areas accurately, classify the shadow areas into attached shadow areas and cast shadow areas, and generate accurate normal vector information of the shadow areas by assuming specular reflection. In addition, as for shadow areas having polarization information including a lot of errors, for which only extremely poor accuracy is obtained when normal vector information is generated, not performing normal vector information generating processes makes it possible to generate highly accurate normal vector information of areas as large as possible.

[0123] As described above, according to the optical area dividing device in this Embodiment, it is possible to classify the shadow areas of a snapshot-like image into attached shadow areas and cast shadow areas in a general environment, without requiring a large-scale system enabling movement of a light source.

[0124] In this Embodiment, photonic crystals are used as the patterned polarizer 201, but film-type polarizing elements or polarizing elements of wire grid type or using another mechanism may be used. In addition, luminance having different polarization principal axes may be obtained in time series by performing imaging while rotating a polarizing plate mounted in front of the lens of the camera 200 without using a patterned polarizer. This method is disclosed in Japanese Patent Application Publication No. H11-211433 (Patent Reference 1).

[0125] (Embodiment 2) Next, a description is given of an optical area dividing device in Embodiment 2 of the present invention.
[0126] FIG. 32 is a functional block diagram showing the structure of the optical area dividing device 100b in this Embodiment. This optical area dividing device 100b is a device for performing optical area division on the surface of an object by imaging the object, and is characterized by classifying low luminance areas into "attached shadow areas or low reflectance areas" and cast shadow areas. This optical area dividing device 100b includes an area dividing unit 1031 instead of the area dividing unit 103 in the optical area dividing device 100 shown in FIG. 1. Here, the normal vector information generating unit 1041 is also shown in this example of processing in which the result of processing by the optical area dividing device 100b is used. Although the normal vector information generating unit 1041 is not an essential structural element of the image processing device according to the present invention, it is shown as an example of a post-processing unit in which the result of the processing by the image processing device according to the present invention is used. It is to be noted that, in FIG. 32, the same structural elements as those of the optical area dividing device 100 in FIG. 26 are assigned with the same numerical references, and descriptions thereof are omitted.
[0127] The area dividing unit 1031 is a processing unit for dividing a polarized image into plural areas, each of which is a group of image areas having optically common characteristics, using similarity (likeness) between the luminance information of the polarized image and the polarization information generated by the polarization information generating unit 102. At this time, the area dividing unit 1031 compares the luminance of each image area with a predetermined threshold value, and classifies the image area as a low luminance area including a shadow area (the low luminance area including "an attached shadow area or a low reflectance area" and a cast shadow area in this Embodiment) when the luminance is less than the threshold value. In this Embodiment, the area dividing unit 1031 classifies the low luminance areas into "attached shadow areas or low reflectance areas" and cast shadow areas to divide the image into diffuse reflection areas, specular reflection areas, "attached shadow areas or low reflectance areas" and cast shadow areas. [0128] The normal vector information generating unit 1041 is a processing unit for generating normal vector information from polarization information for each of the areas divided by the area dividing unit 1031. This normal vector information generating unit 1041 generates normal vector information assuming that the attached shadow areas are "attached shadow areas or low reflectance areas", unlike the normal vector information generating unit 104 described in Embodiment 1. [0129] FIG. 33 is a flowchart of processes performed by the optical area dividing device 100b and the normal vector information generating unit 1041 in this Embodiment. It is to be noted that, in FIG. 33, the steps common with FIG. 4 in Embodiment 1 are assigned with the same numerical references as those in FIG. 4, and the descriptions thereof are omitted.
[0130] The area dividing unit 1031 classifies the portions of the image into diffuse reflection areas, specular reflection areas, and low luminance areas (in this Embodiment, "attached shadow areas or low reflectance areas" and cast shadow areas) using the polarization information generated by the polarization information generating unit 102 and the luminance information obtained by the polarized image capturing unit 101 (S1031). [0131] The normal vector information generating unit 1041 generates normal vector information from the polarization information based on the result of the area division performed by the area dividing unit 1031 as described later (S104). At this time, no normal vector information generating process is performed for cast shadow areas because a lot of errors are included in the polarization information about cast shadow areas. [0132] First, a description is given of the polarization characteristics of an object having a low reflectance. The internal reflection of an object which has a smooth surface and a low reflectance is approximately 0, and diffuse reflection components are very weak. On the other hand, under the specular reflection condition, light is reflected and thus specular reflection becomes greater. In other words, it is considered that diffuse reflection components are weak in the low reflectance area, and specular reflection components become relatively dominant. This shows that an object having a low reflectance has the same polarization characteristics as the attached shadow areas, as described below.
(1) "Attached shadow areas or low reflectance areas"
- The degree of polarization is high, and estimated polarization errors are small.
- In many cases, specular reflection characteristics are indicated.
(2) Cast shadow areas
- The degree of polarization is low, and estimated polarization errors are large.
- In many cases, diffuse reflection characteristics are shown.
[0133] The low luminance areas are classified into "attached shadow areas or low reflectance areas" and cast shadow areas based on these classification standards. These processes are described in detail below with reference to the drawings. [0134] FIG. 34 is a functional block diagram showing the detailed structure of the area dividing unit 1031 in the optical area dividing device 100b shown in FIG. 32. This area dividing unit 1031 includes a DB 302, a degree-of-polarization comparing unit 303, an estimated polarization error comparing unit 304, an area judging unit 305, an accumulation unit 306, and a low luminance pixel detecting unit 311. It is to be noted that, in FIG. 34, the structural elements common with FIG. 14 in Embodiment 1 are assigned with the same numerical references as those in FIG. 14, and the descriptions thereof are omitted. [0135] The low luminance pixel detecting unit 311 is a processing unit for estimating whether or not the pixels in the image obtained by the polarized image capturing unit 101 are low luminance areas (areas including "attached shadow areas or low reflectance areas" and cast shadow areas). [0136] FIG. 35 is a flowchart of processes performed by this area dividing unit 1031. First, the low luminance pixel detecting unit 311 evaluates the luminance values of the pixels in the image obtained by the polarized image capturing unit 101 (S501). As in the step S201 described above, an evaluation is made to determine whether or not each of the luminance values is less than a threshold value. The threshold value for estimating the low luminance areas like this may be empirically determined; for example, 256 may be set for a 16-bit monochrome image. This threshold value may be held in the DB 302.
In the case where the luminance value is greater than the threshold value (No in S501), the area dividing unit 1031 judges whether diffuse reflection components are dominant in the image or specular reflection components are dominant in the image according to the method as in the step S202 described above (comparison by the degree-of-polarization comparing unit 303) (S502). After the completion of the diffuse reflection/specular reflection classification process (S502), the area judging unit 305 checks whether or not the optical classification of all the pixels has been completed (S503). In the case where there remains a pixel which has not yet been classified (No in S503), the low luminance pixel detecting unit 311 evaluates the luminance value of another pixel (S501). In the case where the optical classification of all the pixels has been completed (Yes in S503), the area dividing unit 1031 completes the processing. [0137] On the other hand, when the luminance value of the pixel is equal to or less than the threshold value (Yes in S501), whether the pixel is "an attached shadow area or a low reflectance area" or a cast shadow area is judged (S504). As described above, this is implemented by the estimated polarization error comparing unit 304 evaluating the magnitude of the estimated polarization error E defined by Expression 8 (by comparing the estimated polarization error E with the threshold value Th_Err). As the result, the area judging unit 305 judges that the pixel is a cast shadow area (S505) in the case where the magnitude of the estimated polarization error E is greater than the threshold value Th_Err (Yes in S504), while the area judging unit 305 judges that the pixel is "an attached shadow area or a low reflectance area" (S506) in the case where the magnitude of the estimated polarization error E is less than the threshold value Th_Err (No in S504). The threshold value Th_Err at this time may be determined according to the above-mentioned method.
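As an illustration only, the classification flow of FIG. 35 (steps S501 to S506) can be sketched as follows. The function name, the concrete degree-of-polarization rule used for S502, and all threshold values are assumptions of this sketch, not values fixed by the Embodiment (the text only states that the thresholds may be empirically determined and held in the DB 302):

```python
# Hedged sketch of the FIG. 35 flow: each pixel carries a luminance value,
# a degree of polarization, and an estimated polarization error E (Expression 8).

LUMINANCE_TH = 256   # example from the text for a 16-bit monochrome image
TH_ERR = 0.1         # assumed illustrative value for Th_Err

def classify_pixel(luminance, degree_of_polarization, estimated_error,
                   dop_threshold=0.5):
    """Classify one pixel following steps S501-S506 (illustrative)."""
    if luminance >= LUMINANCE_TH:                 # No in S501: not low luminance
        # S502: diffuse vs. specular via a degree-of-polarization comparison
        if degree_of_polarization >= dop_threshold:
            return "specular reflection area"
        return "diffuse reflection area"
    # Yes in S501: low luminance pixel -> S504 using the estimated error E
    if estimated_error > TH_ERR:                  # Yes in S504: errors are large
        return "cast shadow area"                 # S505
    return "attached shadow area or low reflectance area"  # S506
```

A full implementation would loop this over all pixels (S503) and accumulate the labels, as the accumulation unit 306 does in the text.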
The result of the area division is accumulated in the accumulation unit 306. [0138] The normal vector information generating unit 1041 generates normal vector information from the polarization information based on the result of the area division performed by the area dividing unit 1031. This normal vector information generating unit 1041 has the same structure as that of the normal vector information generating unit 104 in Embodiment 1; in other words, it includes the accumulation unit 306, the area referencing unit 307, a unit for generating normal vector information assuming diffuse reflection 308, and a unit for generating normal vector information assuming specular reflection 309, as shown in FIG. 26. This normal vector information generating unit 1041 generates normal vector information assuming that the attached shadow areas are "attached shadow areas or low reflectance areas", unlike Embodiment 1. [0139] FIG. 36 is a flowchart of processes performed by this normal vector information generating unit 1041. It is to be noted that, in FIG. 36, the steps common with FIG. 29 are assigned with the same numerical references as those in FIG. 29, and detailed descriptions thereof are omitted. [0140] The area referencing unit 307 of the normal vector information generating unit 1041 judges whether or not diffuse reflection components are dominant in the pixel based on the result of the optical area division detected by the area dividing unit 1031 (S301). In this processing, the result of the area division may be read from the accumulation unit 306 in which the result of the area judging unit 305 is accumulated. When it is judged that diffuse reflection components are dominant (Yes in S301), the unit for generating normal vector information assuming diffuse reflection 308 generates normal vector information of the pixel assuming diffuse reflection (S302).
More specifically, the one-dimensional degree of freedom of the normal vector on an emission plane is calculated as the angle at which the luminance changed by the rotation of a deflecting plate becomes the maximum value. In addition, in the case where not diffuse reflection components (No in S301) but specular reflection components are dominant in the pixel, or the pixel is "an attached shadow area or a low reflectance area" (Yes in S303), the unit for generating normal vector information assuming specular reflection 309 generates normal vector information of the pixel assuming specular reflection (S304). More specifically, the one-dimensional degree of freedom of the normal vector on the incidence plane is calculated as the angle at which the luminance changed by the rotation of a deflecting plate becomes the minimum value. On the other hand, in the case where the pixel is a cast shadow area (No in S303), it is judged that errors are dominant in the polarization information of the pixel and no accurate normal vector information can be generated, and the normal vector information generating unit 1041 does not perform normal vector information generating processes (S305). [0141] As described above, according to the optical area dividing device in this Embodiment, it is possible to classify the shadow areas of a snapshot-like image into attached shadow areas and cast shadow areas in a general environment, without requiring a large-scale system enabling movement of a light source. Further, as for "the attached shadow areas or the low reflectance areas", the normal vector information generating unit generates accurate normal vector information assuming specular reflection.
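As a sketch only, the maximum- and minimum-luminance angles mentioned above can be obtained by fitting the luminance observed under a rotating polarizing element to a sinusoid in twice the rotation angle. The model I(theta) = A + B*cos(2*(theta - phi)) and all names below are assumptions of this sketch, not the Embodiment's notation:

```python
import numpy as np

# Illustrative: fit I(theta) = A + B*cos(2*(theta - phi)) to luminance samples.
# For a pixel judged diffuse, the emission-plane angle is where I is maximal
# (phi); for a pixel judged specular (or "attached shadow or low reflectance"),
# the incidence-plane angle is where I is minimal (phi + 90 degrees).

def fit_polarization_phase(angles_deg, luminances):
    """Least-squares fit of the 2-theta sinusoid; returns (A, B, phi_deg)."""
    th = np.deg2rad(angles_deg)
    M = np.column_stack([np.ones_like(th), np.cos(2 * th), np.sin(2 * th)])
    a, c, s = np.linalg.lstsq(M, np.asarray(luminances, float), rcond=None)[0]
    return a, np.hypot(c, s), np.rad2deg(0.5 * np.arctan2(s, c)) % 180.0

def one_dof_normal_angle(angles_deg, luminances, assume_specular):
    """One-dimensional degree of freedom of the normal (degrees, mod 180)."""
    _, _, phi = fit_polarization_phase(angles_deg, luminances)
    # max-luminance angle for diffuse; min-luminance angle for specular
    return (phi + 90.0) % 180.0 if assume_specular else phi
```

Four samples at 0, 45, 90 and 135 degrees are enough to determine the three fit parameters, which is consistent with a patterned polarizer providing four polarization principal axes per pixel neighborhood.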
In addition, as for shadow areas having polarization information including a lot of errors and for which only extremely poor accuracy is obtained when normal vector information is generated, not performing normal vector information generating processes makes it possible to generate highly accurate normal vector information of areas as large as possible. [0142] It is to be noted that, in step S504, the degree of polarization, both the estimated polarization error and the degree of polarization, or the polarization phase may be used as in Embodiment 1 instead of the estimated polarization error in order to judge whether the pixel is "an attached shadow area or a low reflectance area" or a cast shadow area. [0143] (Embodiment 3) Next, a description is given of an optical area dividing device in Embodiment 3 of the present invention. [0144] FIG. 37 is a functional block diagram showing the structure of the optical area dividing device 100c in this Embodiment. This optical area dividing device 100c is a device for performing optical area division on the surface of an object by imaging the object, and is characterized by generating normal vector information only when accurate normal vector information can be generated. This optical area dividing device 100c includes an imaging condition judging unit 106 in addition to the structure of the optical area dividing device 100 shown in FIG. 1. Here, a normal vector information generating unit 104 is also shown in this example of processing in which the result of processing performed by the optical area dividing device 100c is used. Although this normal vector information generating unit 104 is not an essential element of the image processing device according to the present invention, it is shown as an example of a post-process processing unit which uses the result of the processing performed by the image processing device according to the present invention. It is to be noted that, in FIG.
37, the structural elements common with FIG. 26 are assigned with the same numerical references as those in FIG. 26, and detailed descriptions thereof are omitted. [0145] The imaging condition judging unit 106 is a processing unit for judging whether or not the target scene to be imaged by the polarized image capturing unit 101 satisfies the imaging condition predetermined as an imaging condition under which the area dividing unit 103 can perform accurate area division. [0146] FIG. 38 is a functional block diagram showing the detailed structure of this imaging condition judging unit 106. This imaging condition judging unit 106 includes a DB 302, an optical axis direction detecting unit 312, and an optical axis direction comparing unit 313. [0147] The optical axis direction detecting unit 312 is an angle sensor or the like for detecting an optical axis direction of the optical area dividing device 100c. [0148] The optical axis direction comparing unit 313 is a processing unit for judging whether or not the optical area dividing device 100c faces the upward direction of the horizontal surface (horizon plane). [0149] Here, in this Embodiment, the image scene is required to satisfy the Condition 1 as explained in Embodiment 1. [0150] Condition 1: "an object including a large plane exists near an object in an image scene, and a light source exists in the direction opposite to the object from the large plane." It is to be noted that the above Condition 1 is not always satisfied in a state where the optical area dividing device 100c is placed. For this reason, in this Embodiment, the imaging condition judging unit 106 judges whether or not the above Condition 1 is satisfied. Here, focusing on the great likelihood that a light source is in the upward direction, the Condition 1 is not satisfied under the following Condition 2. [0151] Condition 2: "an image capturing person captures an image in an upward direction."
This Condition 2 is satisfied, for example, in the following image scenes. 1. An outdoor scene of the sky, the moon, or stars. 2. An indoor scene in the direction of the ceiling on which fluorescent lamps are used. [0152] In the case of the above image scene 1, for example, it is considered to image a crescent moon. It is considered that the shadow area of the crescent moon is an attached shadow area. However, this shadow area has a luminance due to the multiple reflection from the Earth called earthshine. Therefore, although it is an attached shadow area, it is considered that the multiple reflected light is incident from an extremely limited range, that is, only from the Earth, and that there are substantially no specular reflection components. Therefore, the optical area dividing device 100c does not function accurately. For this reason, the imaging condition judging unit 106 judges whether or not the optical area dividing device 100c accurately functions (can perform accurate area division). When it is considered that the optical area dividing device 100c accurately functions, processes for area division into attached shadow areas and cast shadow areas are performed, while processes for area division of shadow areas are cancelled and processes for generating normal vector information based on the shadow areas are cancelled when it is considered that the optical area dividing device 100c does not accurately function. [0153] FIG. 39 shows an example of the hardware structure of a camera 200a mounting the optical area dividing device 100c in this Embodiment. This camera 200a is an imaging device including a function for optically dividing the areas of a captured image, and includes a patterned polarizer 201, imaging elements 202, a memory 203, a CPU 204, an angle sensor 205, a display unit 208, and a speaker 209. It is to be noted that, in FIG. 39, the structural elements common with FIG.
2 are assigned with the same numerical references as those in FIG. 2, and detailed descriptions thereof are omitted. [0154] The angle sensor 205 detects the optical axis direction of the camera 200a and outputs the information. [0155] When the imaging condition judging unit 106 judges that the image scene does not satisfy the above Condition 1, the display unit 208 displays a message indicating the fact. [0156] When the imaging condition judging unit 106 judges that the image scene does not satisfy the above Condition 1, the speaker 209 outputs, in the form of speech, the message indicating the fact. [0157] It is to be noted that the optical axis direction detecting unit 312 shown in FIG. 38 is implemented as the angle sensor 205 shown in FIG. 39. The optical axis direction comparing unit 313 shown in FIG. 38 is implemented by the CPU 204 shown in FIG. 39 executing a program stored in the memory 203. [0158] FIG. 40 is a flowchart of processes performed by the optical area dividing device 100c and the normal vector information generating unit 104 in this Embodiment. It is to be noted that, in FIG. 40, the steps common with FIG. 4 are assigned with the same numerical references as those in FIG. 4, and the detailed descriptions thereof are omitted. [0159] In this Embodiment, the optical axis direction detecting unit 312 (angle sensor 205) obtains optical axis direction information indicating the optical axis direction of the optical area dividing device 100c (camera 200a) (S106). Based on the optical axis direction information obtained in this way, a judgment is made as to whether or not the image scene can be captured in an environment allowing generation of normal vector information (S107). This judgment is made by the optical axis direction comparing unit 313 depending on whether or not the optical axis direction of the optical area dividing device 100c (camera 200a) faces upward.
The optical axis direction comparing unit 313 judges that the optical axis direction faces upward when the optical axis faces upward by 45 degrees or more from the horizontal direction. This threshold value of 45 degrees may be determined empirically, and such a threshold value may be held in the DB 302. Here, when the optical axis direction comparing unit 313 judges that the optical axis direction faces upward, the imaging condition judging unit 106 judges that the image scene does not satisfy the Condition 1 (No in S107), and the area dividing unit 103 classifies the portions of the image into diffuse reflection areas, specular reflection areas, and shadow areas, based on the polarization information generated by the polarization information generating unit 102 and the luminance information obtained by the polarized image capturing unit 101 (S108). Since the Condition 1 is not satisfied in this case, shadow areas are not classified into attached shadow areas and cast shadow areas. Subsequently, the normal vector information generating unit 104 generates normal vector information from the polarization information, based on the result of the area division performed by the area dividing unit 103 (S109). FIG. 41 is a flowchart of detailed processes of this processing (S109). It is to be noted that, in FIG. 41, the steps common with FIG. 29 are assigned with the same numerical references as those in FIG. 29, and detailed descriptions thereof are omitted. [0160] The normal vector information generating unit 104 judges whether or not diffuse reflection components are dominant in the pixel based on the result of the optical area division detected by the area dividing unit 103 (S301).
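The upward-direction judgment of S107 described above reduces to a simple threshold test. A minimal sketch, assuming the angle sensor reports the optical axis elevation in degrees from the horizontal plane (the function name and sensor interface are illustrative, not from the Embodiment):

```python
# Illustrative sketch of the S107 judgment by the optical axis direction
# comparing unit 313: Condition 1 is judged unsatisfied when the camera
# faces upward by the threshold angle or more.

UPWARD_TH_DEG = 45.0  # empirically determined threshold, held in the DB 302

def satisfies_condition_1(optical_axis_elevation_deg):
    """True when the optical axis does not face upward (Yes in S107)."""
    faces_upward = optical_axis_elevation_deg >= UPWARD_TH_DEG
    return not faces_upward
```

When this returns False, the flow falls back to the three-way division (S108) and normal vector generation without shadow classification (S109), as described above.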
When it is judged that diffuse reflection components are dominant (Yes in S301), the normal vector information generating unit 104 generates normal vector information of the pixel assuming diffuse reflection (S302). More specifically, the one-dimensional degree of freedom of the normal vector on an emission plane is calculated as the angle at which the luminance changed by the rotation of a deflecting plate becomes the maximum value. In addition, in the case where not diffuse reflection components (No in S301) but specular reflection components are dominant in the pixel (Yes in S307), the normal vector information generating unit 104 generates normal vector information of the pixel assuming specular reflection (S304). More specifically, the one-dimensional degree of freedom of the normal vector on the incidence plane is calculated as the angle at which the luminance changed by the rotation of a deflecting plate becomes the minimum value. On the other hand, in the case where the pixel is a shadow area, that is, neither diffuse reflection components nor specular reflection components are dominant (No in S307), the normal vector information generating unit 104 judges that errors are dominant in the polarization information of the pixel and that no accurate normal vector information can be generated, and thus does not perform the normal vector information generating process (S305). [0161] In contrast, when it is judged that the optical axis direction does not face upward (Yes in S107), the imaging condition judging unit 106 judges that the image scene satisfies the Condition 1, the area dividing unit 103 performs the optical area dividing process (S103), and subsequently, the normal vector information generating unit 104 generates normal vector information (S104). [0162] It is to be noted that, when the imaging condition judging unit 106 judges that the image scene does not satisfy the Condition 1, it is
desirable that the display unit 208 displays, on the display, a message indicating that "No area division can be implemented.", and that the speaker 209 notifies the image capturing person of a similar message by generating an audio signal. [0163] As a matter of course, when the imaging condition judging unit 106 judges that the image scene does not satisfy the Condition 1, it is also good to generate normal vector information of the shadow areas assuming diffuse reflection instead of not performing the optical area dividing processes and the normal vector information generating processes, and it is good that the display unit 208 displays, on the display, a message indicating that "Area dividing processes are unstable.", and that the speaker 209 notifies the image capturing person of the similar message by generating an audio signal. [0164] In addition, when the imaging condition judging unit 106 judges that the image scene does not satisfy the Condition 1, the normal vector information generating unit 104 may synthesize normal vector information by performing an interpolating process using the normal vector information of the adjacent areas for the shadow areas. This interpolating process only requires a conventional approach. [0165] In addition, the imaging condition judging unit 106 does not necessarily have the optical axis direction detecting unit 312, and for example, it is good to use a processing unit having a function for recognizing an environment where the optical area dividing device 100c is placed. This is implemented by, for example, using a sonar or the like. A description is given of a normal vector information generating device according to this Variation of this Embodiment, having such a function for recognizing an environment. [0166] FIG. 42 shows an example of the hardware structure of the camera 200b mounting the optical area dividing device according to this Variation.
This camera 200b has the same structure as that of the camera 200a in this Embodiment shown in FIG. 39 except that the angle sensor 205 is replaced with the sonar 210. It is to be noted that, in FIG. 42, the structural elements common with FIG. 39 are assigned with the same numerical references as those in FIG. 39, and detailed descriptions thereof are omitted. [0167] FIG. 43 is a functional block diagram showing the detailed structure of the imaging condition judging unit 106a included in the normal vector information generating device according to this Variation. It is to be noted that the normal vector information generating device according to this Variation has the same structure as that of the optical area dividing device 100c in the Embodiment shown in FIG. 37 except that the imaging condition judging unit 106 is replaced with the imaging condition judging unit 106a. This imaging condition judging unit 106a includes a sonar for measuring a distance to a nearby object by generating a sound wave and receiving a reflected wave of the sound wave, and judges whether or not there is a material body near the normal vector information generating device using the sonar; it is characterized by judging, when it is judged that there is no such material body, that the image scene does not satisfy the imaging condition. The imaging condition judging unit 106a includes a DB 302, an imaging environment detecting unit 315, and an imaging environment recognizing unit 316. It is to be noted that, in FIG. 43, the structural elements common with FIG. 38 are assigned with the same numerical references as those in FIG. 38, and detailed descriptions thereof are omitted. [0168] The imaging environment detecting unit 315 is a processing unit for measuring the distance to the nearby object and generating the distance information as imaging environment information, and corresponds to the sonar 210 shown in FIG.
42. [0169] The imaging environment recognizing unit 316 is a processing unit for judging whether or not the current environment is an environment where optical area division can be performed on the image scene, using the imaging environment information from the imaging environment detecting unit 315. [0170] FIG. 44 is a flowchart of processes performed by the optical area dividing device and the normal vector information generating unit 104 according to this Variation. It is to be noted that, in FIG. 44, the steps common with FIG. 40 are assigned with the same numerical references as those in FIG. 40, and detailed descriptions thereof are omitted. [0171] In the normal vector information generating device according to this Variation, the imaging environment detecting unit 315 obtains the imaging environment information using the sonar (S111). This sonar 210 is an active sonar which measures the distance to the nearby object by generating an ultrasound wave or a sound wave and receiving the reflected wave. Therefore, the use of the sonar 210 makes it possible to detect whether or not there is a material body near the camera 200b, and when there is a material body, to obtain the distance information to the material body as the imaging environment information. It is to be noted that the sonar 210 is widely used as a fish detector or the like and is publicly known, and thus the detailed descriptions thereof are omitted. [0172] Whether or not the current environment is an environment where optical area division can be performed on the image scene is judged based on the imaging environment information obtained in this way (S107). This is performed by the imaging environment recognizing unit 316 judging whether or not there is a material body near the camera 200b.
More specifically, it is only necessary that the imaging environment detecting unit 315 obtains the distance information to material bodies near (in all directions of) the camera 200b, and evaluates the magnitude of the solid angle at distances shorter than a constant value TH_S. Here, in the case where the magnitude of the solid angle is less than the threshold value TH_SR, the imaging condition judging unit 106 judges that the image scene does not satisfy the Condition 1 (No in S107), and the area dividing unit 103 classifies the portions of the image into diffuse reflection areas, specular reflection areas, and shadow areas using the polarization information generated by the polarization information generating unit 102 and the luminance information obtained by the polarized image capturing unit 101 (S108). Since the Condition 1 is not satisfied, no classification of shadow areas into attached shadow areas and cast shadow areas is performed. Further, the normal vector information generating unit 104 generates normal vector information from the polarization information based on the result of the area division performed by the area dividing unit 103 (S109). On the other hand, when the magnitude of the solid angle is greater than the threshold value TH_SR, the imaging condition judging unit 106 judges that the image scene satisfies the Condition 1 (Yes in S107), and the area dividing unit 103 performs the optical area dividing processes, and further, the normal vector information generating unit 104 generates normal vector information. [0173] It is to be noted that such threshold values TH_S and TH_SR may be empirically determined and held in the DB 302. [0174] As described above, according to the optical area dividing device in this Embodiment, it is possible to classify the shadow areas of a snapshot-like image into attached shadow areas and cast shadow areas in a general environment, without requiring a large-scale system enabling movement of a light source.
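As a sketch only, the solid-angle evaluation of paragraphs [0172] and [0173] might look as follows. The representation of sonar returns as (solid angle, distance) pairs and the concrete values of TH_S and TH_SR are assumptions of this sketch; the text only states that the thresholds may be empirically determined and held in the DB 302:

```python
# Illustrative sketch: the imaging environment information is modelled as
# (solid-angle element, distance) samples from the sonar; the solid angle
# subtended by nearby material bodies is accumulated and compared with TH_SR.

TH_S = 2.0    # metres; assumed example value for the distance threshold
TH_SR = 1.0   # steradians; assumed example value for the solid-angle threshold

def condition_1_satisfied(samples):
    """samples: iterable of (solid_angle_sr, distance_m) sonar returns."""
    near_solid_angle = sum(sa for sa, dist in samples if dist < TH_S)
    # Enough nearby surface (e.g. a large plane near the object) -> Condition 1
    return near_solid_angle > TH_SR
```

When this returns False, the flow proceeds as in S108 and S109 above, without classifying shadow areas into attached and cast shadow areas.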
Furthermore, if it is difficult to perform such classification, the optical area dividing device can perform a highly-reliable area dividing process by not outputting an inaccurate result but notifying the image capturing person of the fact that it is impossible to perform the intended process. [0175] (Embodiment 4) Next, a description is given of an optical area dividing device in Embodiment 4 of the present invention. [0176] FIG. 45 is a functional block diagram showing the structure of the optical area dividing device 100d in this Embodiment. This optical area dividing device 100d is a device for performing optical area division on the surface of an object by imaging the object, and is characterized by not generating an unreliable area division result and normal vector information. This optical area dividing device 100d includes a reliability judging unit 107 in addition to the structure of the optical area dividing device 100 shown in FIG. 1. Here, a normal vector information generating unit 104 is also shown in this example of processing in which the result of processing performed by the optical area dividing device 100d is used. Although this normal vector information generating unit 104 is not an essential structural element of the image processing device according to the present invention, it is shown as an example of a post-process processing unit which uses the result of the processing performed by the image processing device according to the present invention. It is to be noted that, in FIG. 45, the structural elements common with FIG. 26 are assigned with the same numerical references as those in FIG. 26, and detailed descriptions thereof are omitted. In addition, the camera mounting the optical area dividing device 100d in this Embodiment has the same hardware structure as that of the camera 200 in Embodiment 1 shown in FIG. 2.
[0177] The reliability judging unit 107 evaluates the reliability of the result of the optical area division using the result of the optical area division performed by the area dividing unit 103, and when there is no reliability, discards the result of the optical area division and the normal vector information. As a result, the normal vector information of areas without reliability is discarded. [0178] FIG. 46 is a flowchart of processes performed by the optical area dividing device 100d and the normal vector information generating unit 104 in this Embodiment. It is to be noted that, in FIG. 46, the steps common with FIG. 4 and FIG. 40 are assigned with the same reference numerals as those in FIG. 4 and FIG. 40, and the detailed descriptions thereof are omitted. The reliability judging unit 107 evaluates whether or not the above-mentioned Condition 1 is satisfied, in other words, the reliability of the result of the optical area division, using the result of the optical area division performed by the area dividing unit 103. In the case where there is no reliability (No in S107), the reliability judging unit 107 discards the result of the optical area division and the normal vector information generated by the normal vector information generating unit 104 (S110). [0179] Here, in order to judge whether or not the Condition 1 is satisfied, it is good to judge whether or not there is an attached shadow area where specular reflection components are dominant due to the influence of multiple specular reflected light within the shadow area. For this, here, a description is given of a method for evaluating the reliability based on the degree of polarization and the luminance value of the polarized image. It is good to judge that the Condition 1 is not satisfied in the case where no pixel indicating specular reflection polarization characteristics exists in the shadow area, in other words, no attached shadow area exists on the image. [0180] FIG. 
47 is a functional block diagram showing the detailed structure of the reliability judging unit 107. The reliability judging unit 107 includes the DB 302, the accumulation unit 306, and the unit for judging existence of an attached shadow area 314. [0181] The accumulation unit 306 accumulates the result of area division performed by the area dividing unit 103. [0182] The unit for judging existence of an attached shadow area 314 is a processing unit for referring to the result of the area division accumulated in the accumulation unit 306, and judges whether or not an attached shadow area having a sufficient size (equal to or more than a predetermined threshold value) has been divided. [0183] The following are details of the processes (S107 and S110) performed by the reliability judging unit 107 shown in FIG. 46. [0184] The unit for judging existence of an attached shadow area 314 judges whether or not an attached shadow area having a sufficient size has been divided by referring to the result of the area division accumulated in the accumulation unit 306 (S107). In the case where the result shows that no attached shadow area having a sufficient size exists in an image, more specifically, in 100 pixels or more in a VGA image (No in S107), the reliability judging unit 107 judges that the image scene does not satisfy the Condition 1, and the results of the optical area division of the shadow area and the normal vector information are discarded (S110). At this time, it is desirable that an image capturing person is notified of the fact by means that the display unit 208 displays, on a display, a message indicating that "No normal vector information generating processes of shadow areas can be implemented", or the speaker 209 generates an audio signal. In the opposite case where an attached shadow area exists in the image (Yes in S107), the reliability judging unit 107 judges that the image scene satisfies the Condition 1, and outputs the generated normal vector information. 
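The size check performed in S107 can be sketched as follows; a minimal illustration assuming the accumulated area division result is available as a flat list of per-pixel labels (the label strings and function name are hypothetical):

```python
def area_division_reliable(labels, min_attached_pixels=100):
    """Hypothetical sketch of S107: the result is judged reliable only
    if the divided attached shadow area is sufficiently large, e.g.
    100 pixels or more in a VGA image, as the text suggests."""
    attached = sum(1 for label in labels if label == "attached_shadow")
    return attached >= min_attached_pixels
```

A False result corresponds to the No branch of S107, where the shadow area division result and the normal vector information are discarded (S110).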
Here, it is good to empirically determine the threshold value for the size of the attached shadow area, and such threshold value may be held in the DB 302. [0185] As described above, according to the optical area dividing device in this Embodiment, it is possible to classify the low reflectance areas of a snapshot-like image into attached shadow areas and cast shadow areas in a general environment, without requiring a large-scale system enabling movement of a light source. Furthermore, if it is difficult to perform such classification, the optical area dividing device can perform highly-reliable area dividing processes by not obtaining an inaccurate result but notifying the image capturing person of the fact that it is impossible to perform the intended process. [0186] Up to this point, the image processing device according to the present invention has been described using Embodiments 1 to 4 and their Variations, but the present invention is not limited to these Embodiments and Variations. [0187] The present invention includes another embodiment where the structural elements in any of these Embodiments and Variations are arbitrarily combined and implemented, and an embodiment obtainable by making modifications which may be arrived at by a person skilled in the art to any of these Embodiments and Variations. [0188] In addition, in the case where the reliability judging unit 107 judges that the image scene does not satisfy the Condition 1, it is good to perform normal vector information generating processes instead of not performing the processes, and it is good that the display unit 208 displays, on the display, a message indicating that "Area dividing processes are unstable," or the like, or the speaker 209 notifies the image capturing person of the fact by generating an audio signal. 
[0189] In addition, when the reliability judging unit 107 judges that the image scene does not satisfy the Condition 1, all the optical area division results and normal vector information may be discarded, not only the normal vector information of the shadow areas. [0190] In some parts of the above-described Embodiments and their Variations, a normal vector information generating unit is also shown together with the optical area dividing device. However, the image processing device according to the present invention may include or may not include such a normal vector information generating unit. The normal vector information generating unit is a mere example of a processing unit which uses the result of area division by the area dividing unit. Possible processing examples in which such area division results are used include a process of generating various shape information for generating shape models of an object, and a process of generating a beautiful three-dimensional image by correcting shadow areas. [0191] In addition, in Embodiment 4, the reliability of the optical area division result is evaluated based on the result of optical area division performed by the area dividing unit 103, but approaches for evaluating such reliability are not limited to this. For example, the reliability of the optical area division result may be evaluated based on the shape information of the object. [0192] FIG. 48 is a functional block diagram showing the structure of the optical area dividing device 100e according to a Variation of the present invention, in which the reliability of the optical area division result is evaluated based on the shape information of an object. 
This optical area dividing device 100e is a device for performing optical area division on the surface of the object by imaging the object, characterized by not outputting an unreliable division result of low reflectance areas, and includes a reliability judging unit 107a in addition to the structure of the optical area dividing device 100 shown in FIG. 1. [0193] The reliability judging unit 107a is a processing unit for generating shape information of the object, evaluating the reliability of the result of the optical area division performed by the area dividing unit 103 based on the generated shape information, and discarding the result of the optical area division in the case where the result is unreliable. As shown in FIG. 49, the reliability judging unit 107a includes a normal vector information generating unit 317 and a phase information comparing unit 318. [0194] The DB 302 is a storage unit for storing a threshold value which is used for comparison made by the phase information comparing unit 318. [0195] The normal vector information generating unit 317 is a processing unit for generating a normal vector (Nx, Ny, Nz) corresponding to each pixel in a polarized image using the shape information generated by the shape information generating unit 211, and calculating, based on the generated normal vector, a one-dimensional freedom degree φN of the normal vector projected on the surface of the patterned polarizer. [0196] The phase information comparing unit 318 is a processing unit for comparing, for each of the existing shadow areas, the polarization phase angle φ generated by the polarization information generating unit 102 with the one-dimensional freedom degree φN of the normal vector calculated by the normal vector information generating unit 317, and judging whether or not these values are sufficiently close to each other depending on whether or not the difference is less than the threshold value stored in the DB 302. [0197] FIG. 
50 shows the hardware structure of the camera 200c mounting the optical area dividing device 100e like this. This camera 200c is an imaging device having a function for optically dividing the areas of an image captured, and includes a patterned polarizer 201, imaging elements 202, a memory 203, a CPU 204, an angle sensor 205, a display unit 208, a speaker 209, and a shape information generating unit 211. This structure includes a shape information generating unit 211 in addition to the camera 200a shown in FIG. 39. [0198] The shape information generating unit 211 is intended to generate shape information of an object, and is a range finder, a stereo camera, or the like. It is to be noted that the normal vector information generating unit 317 shown in FIG. 49 is implemented as the shape information generating unit 211, the CPU 204, the memory 203, and the like shown in FIG. 50, and the phase information comparing unit 318 is implemented as the CPU 204, the memory 203, and the like shown in FIG. 50. [0199] FIG. 51 is a flowchart of optical area division processes performed by the optical area dividing device 100e like this. It is to be noted that this flowchart is obtained by adding steps S120 to S123 to the flowchart shown in FIG. 27. It is to be noted that step S103 in FIG. 11 is divided into S103a and S103b here. Steps S120 to S123 are described below. [0200] The normal vector information generating unit 317 of the reliability judging unit 107a generates a normal vector (Nx, Ny, Nz) corresponding to each pixel in a polarized image as normal vector information, based on the shape information generated by the shape information generating unit 211 (S120). The normal vector generated here is represented as a camera coordinate system (Xc-Yc-Zc) where the focus position in the optical axis direction is the origin, and the optical axis direction of the imaging element 202 is the Zc direction. 
In addition, the principal axis angle ψ and the polarization phase φ of the patterned polarizer 201 correspond to the Xc axis in the camera coordinate system when ψ = 0. [0201] The one-dimensional freedom degree φN of the normal vector calculated in this way equals the polarization phase φ when diffuse reflection components are dominant. In other words, as for each of the pixels existing in the shadow areas, in the case where the polarization phase φ is sufficiently close to the one-dimensional freedom degree φN of the normal vector calculated based on the shape information from the shape information generating unit 211, it can be judged that diffuse reflection components are dominant in all the pixels in the shadow areas and thus there is no influence of specular reflection multiple reflected light. [0202] Thus, the phase information comparing unit 318 compares the polarization phase φ generated by the polarization information generating unit 102 with the one-dimensional freedom degree φN of the normal vector calculated by the normal vector information generating unit 317, and judges whether or not these values are sufficiently close to each other depending on whether or not the difference is less than the threshold value stored in the DB 302 (S122). In the case where these values are sufficiently close to each other (Yes in S122), the reliability judging unit 107a judges that the image scene does not satisfy the Condition 1, and discards the optical area division result about the shadow areas (S123). In the opposite case where there is an attached shadow area in the image, the reliability judging unit 107a judges that the image scene satisfies the Condition 1 (No in S122), and outputs the result of the optical area division about the shadow areas. As described earlier, since it is known that the polarization phase differs by 90 degrees depending on whether specular reflection components are dominant in an object or diffuse reflection components are dominant in the object, 45 degrees may be set as the threshold value for comparison of phase information. As a matter of course, such threshold value for comparison of phase information may be determined empirically. The threshold value like this may be held in the DB 302. 
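The phase comparison of S122 and S123 can be sketched as follows; a minimal illustration assuming phase angles in degrees for the shadow pixels (the function names are illustrative, not from the specification):

```python
def shadow_division_reliable(phases, normal_phases, threshold_deg=45.0):
    """Hypothetical sketch of S122/S123: when every shadow pixel's
    polarization phase phi is within the threshold of the normal's
    projected phase phiN, diffuse reflection is dominant throughout,
    Condition 1 is judged unsatisfied, and the result is discarded."""
    def close(a, b):
        d = abs(a - b) % 180.0        # polarization phase has a 180-degree period
        return min(d, 180.0 - d) < threshold_deg
    all_diffuse = all(close(p, pn) for p, pn in zip(phases, normal_phases))
    return not all_diffuse            # reliable when some pixel deviates (attached shadow)
```

The 45-degree default reflects the 90-degree phase difference between specular-dominant and diffuse-dominant pixels noted in the text; as the text says, the value may also be determined empirically.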
[0203] It is to be noted that the normal vector information generating unit 317 included in the optical area dividing device 100e is not limited to the normal vector information generating unit 317 which generates normal vector information based on the shape information generated by the shape information generating unit 211, and may be a normal vector information generating unit which generates normal vector information from polarization information. In other words, this optical area dividing device 100e may include the normal vector information generating unit 104 in Embodiment 1 instead of the normal vector information generating unit 317. [0204] In addition, the present invention can be implemented not only as image processing devices but also as application products such as digital still cameras and digital movie cameras each mounting the image processing device according to the present invention. [0205] In addition, the shadow area detecting unit 301 shown in, for example, FIG. 14 first makes judgments on shadow areas based on luminance values to detect cast shadow areas (for example, S201 in FIG. 15), but the present invention is not limited by this procedure, and such cast shadow areas may be detected based on only the polarization information generated by the polarization information generating unit 102. This process is effective for an object such as an oil painting obviously including a cast shadow area but not including an attached shadow area. This is because, in the case where an object, such as black paint, having a sufficiently low reflectance exists, it is impossible to distinguish the shadow areas and the black paint based on only the above-mentioned luminance value. With reference to the drawings, a detailed description is given of the optical area dividing device like this according to the Variation of this Embodiment. 
[0206] Descriptions of the block structures are omitted because the functional block diagrams showing the structure of the optical area dividing device according to this Variation are the same as that of FIG. 26. [0207] FIG. 54 is a flowchart of shadow detecting processes performed by the optical area dividing device according to this Variation. In other words, FIG. 54 is a flowchart of another approach of the shadow area detecting processes (S201, S202, S204 to S206) shown in FIG. 15. As shown in FIG. 54, the degree-of-polarization comparing unit 303 checks whether the degree of polarization of a pixel is less than or greater than the threshold value TH_PS (S407). When the degree of polarization of the pixel is greater than the threshold value TH_PS (No in S407), the shadow area detecting unit 301 judges that the pixel is not a cast shadow area (S402), judges whether diffuse reflection is dominant or specular reflection is dominant in the pixel (S202), and ends the processing. In contrast, when the degree of polarization of the pixel is less than the threshold value TH_PS (Yes in S407), the shadow area detecting unit 301 judges that the pixel is a cast shadow area (S405), and ends the processing. [0208] FIG. 55 is a diagram representing, as an image, the degree of polarization ρ in the case where an object of oil painting is imaged. In this diagram, FIG. 55(a) shows an image of oil painting which is an object, and FIG. 55(b) shows the degree of polarization ρ (which is the polarization information generated by the polarization information generating unit 102) corresponding to the image shown in FIG. 55(a). In addition, FIG. 56 shows a cast shadow area which the optical area dividing device according to this Variation has extracted using the degree of polarization ρ shown in FIG. 55(b) corresponding to the image shown in FIG. 55(a). The black area in this diagram is the extracted cast shadow area. 
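The per-pixel branch of FIG. 54 can be sketched as follows; a minimal illustration in which the function name and label strings are hypothetical:

```python
def classify_low_luminance_pixel(rho, th_ps):
    """Hypothetical sketch of S407: rho is the pixel's degree of
    polarization. Below TH_PS the pixel is judged a cast shadow (S405);
    otherwise it proceeds to the diffuse/specular judgment (S402, S202)."""
    if rho < th_ps:
        return "cast_shadow"
    return "not_cast_shadow"
```

This is the polarization-only detection path the Variation describes, which avoids confusing low-reflectance paint with shadow, as luminance-only detection would.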
It is to be noted that the cast shadow areas extracted in this way may be finally outputted as cast shadow areas, and that areas newly defined may be finally outputted as cast shadow areas by performing contraction and expansion processes on large areas used in the image processing for each of the cast shadow areas extracted in this way. [0209] The following is an advantage of the image processing performed by the optical area dividing device according to this Variation. In each of FIG. 55 and FIG. 56, the area A shows a cast shadow area, and the area B shows an area of black paint. As can be seen from the image shown in FIG. 55(a), the luminance information of the area A and the area B are substantially the same, and thus it is difficult to divide the area A and the area B. However, the use of the polarization information of FIG. 55(b) has enabled the accurate area extraction showing that the area A is a cast shadow area and the area B is not a cast shadow area, as shown in FIG. 56. Industrial Applicability [0210] An information processing device according to the present invention is useful as an information processing device which performs processing on a shadow in an image of an object, for example, as a device which generates three-dimensional shape information of the object, as a device which highly refines the image using the information, and more specifically, as a digital still camera, a digital movie camera, a surveillance camera, or the like. 
CLAIMS 1. An image processing device which performs processing on a shadow in an image of an object, said image processing device comprising: an image information obtaining unit configured to obtain information about the image of the object, the information including luminance information which is information about luminance of light from the object and polarization information which is information about polarization of the light from the object; a shadow area extracting unit configured to extract an attached shadow area and a cast shadow area from the image of the object based on the luminance information and the polarization information obtained by said image information obtaining unit, the attached shadow area appearing on the surface of the object depending on an angle of incidence light, and the cast shadow area appearing on a surface of a material body other than the object when the light is blocked by the object; and an output unit configured to output information identifying the attached shadow area and the cast shadow area extracted by said shadow area extracting unit. 2. The image processing device according to Claim 1, wherein the image of the object includes a plurality of unit images, said image information obtaining unit is configured to obtain, for each of the unit images, the luminance information and the polarization information, said shadow area extracting unit is configured to extract, for each of the unit images, the attached shadow area and the cast shadow area, and said output unit is configured to assign identifiers to the respective unit images included in the attached shadow area and the cast shadow area extracted by said shadow area extracting unit, and output the assigned identifiers. 3. 
The image processing device according to Claim 1, wherein said output unit is configured to output portions respectively corresponding to the attached shadow area and the cast shadow area in the image of the object, as information identifying the attached shadow area and the cast shadow area. 4. The image processing device according to Claim 1, wherein said image information obtaining unit includes: a polarization image capturing unit configured to obtain a polarized image of the object having the luminance information by receiving the light transmitted through a plurality of polarizers each having a different angle of a polarization principal axis, and a polarization information generating unit configured to generate, from the obtained polarized image, the polarization information of each of unit images which makes up the polarized image using a correspondence relationship between the angle of the polarization principal axis of each of the plurality of polarizers and luminance of the light transmitted through the plurality of polarizers, and said shadow area extracting unit is configured to extract the attached shadow area and the cast shadow area by: comparing, for each unit image, luminance of the unit image and a predetermined threshold value based on luminance information of the polarized image; judging that the unit image belongs to a low luminance area including an attached shadow area and a cast shadow area when the luminance is less than the threshold value; and judging, for the unit image belonging to the low luminance area, whether the unit image is the attached shadow area or the cast shadow area, based on the polarization information generated by said polarization information generating unit. 5. 
The image processing device according to Claim 4, wherein said polarized image capturing unit includes a plurality of imaging units for obtaining the polarized image, each of the plurality of imaging units includes: a plurality of polarizers each having a different angle of a polarization principal axis; and a plurality of unit pixels each receiving light transmitted through a corresponding one of the plurality of polarizers; and said polarization information generating unit is configured to generate polarization information using, as the unit images, the images obtained by said imaging units. 6. The image processing device according to Claim 4, wherein said shadow area extracting unit is configured to judge whether or not the polarization information of the unit image belonging to the low luminance area indicates polarization characteristics of specular reflection, and extract the unit image as the attached shadow area in the case where said shadow area extracting unit judges that the polarization information shows polarization characteristics of specular reflection. 7. The image processing device according to Claim 4, wherein said polarization information generating unit is configured to generate, as the polarization information, a degree of polarization which is a parameter indicating a degree of polarization of light, and said shadow area extracting unit is configured to compare the degree of polarization in the unit image belonging to the low luminance area and a predetermined threshold value, extract the unit image as the cast shadow area when the degree of polarization is less than the threshold value, and extract the unit image as the attached shadow area when the degree of polarization is equal to or greater than the threshold value. 8. 
The image processing device according to Claim 4, wherein said polarization information generating unit is configured to generate, as the polarization information, an estimated polarization error which is a difference between the luminance obtained by said polarization image capturing unit and luminance determined from a sinusoidal function approximating a correspondence relationship between the angle of the polarization principal axis and the obtained luminance, and said shadow area extracting unit is further configured to compare the estimated polarization error in the unit image belonging to the low luminance area and the predetermined threshold value, extract, as the cast shadow area, the unit image when the estimated polarization error is greater than the threshold value, and extract, as the attached shadow area, the unit image when the estimated polarization error is equal to or less than the threshold value. 9. The image processing device according to Claim 4, wherein said polarization image capturing unit is configured to obtain a first polarized image obtained in the case where light from a flash is projected on the object and a second polarized image obtained in the case where no light from a flash is projected on the object, and said shadow area extracting unit is configured to calculate, for each unit image belonging to the low luminance area, a difference between the first polarized image and the second polarized image, compare the calculated difference and the predetermined threshold value, and extract the unit image as the attached shadow area or the cast shadow area when the difference is greater than the threshold value. 10. 
The image processing device according to Claim 4, wherein said shadow area extracting unit is configured to judge whether or not the polarization information of the unit image belonging to the low luminance area indicates polarization characteristics of specular reflection, and extract the unit image as "the attached shadow area or the low reflectance area" in the case where said shadow area extracting unit judges that the polarization information shows polarization characteristics of specular reflection. 11. The image processing device according to Claim 4, wherein said polarization information generating unit is configured to generate, as the polarization information, a degree of polarization which is a parameter indicating a degree of polarization of light, and said shadow area extracting unit is further configured to compare the degree of polarization in the unit image belonging to the low luminance area and the predetermined threshold value, extract the unit image as the cast shadow area when the degree of polarization is less than the threshold value, and extract the unit image as "the attached shadow area or the low reflectance area" when the degree of polarization is equal to or greater than the threshold value. 12. 
The image processing device according to Claim 4, wherein said polarization information generating unit is configured to generate, as the polarization information, an estimated polarization error which is a difference between the luminance obtained by said polarization image capturing unit and luminance determined from a sinusoidal function approximating a correspondence relationship between the angle of the polarization principal axis and the obtained luminance, and said shadow area extracting unit is further configured to compare the estimated polarization error in the unit image belonging to the low luminance area and a predetermined threshold value, extract the unit image as the cast shadow area when the estimated polarization error is greater than the threshold value, and extract the unit image as "the attached shadow area or the low reflectance area" when the estimated polarization error is equal to or less than the threshold value. 13. The information processing device according to Claim 4, further comprising an imaging condition judging unit configured to judge whether or not an image scene to be imaged by said polarization image capturing unit satisfies a predetermined imaging condition which allows said shadow area extracting unit to perform accurate area extraction, wherein said shadow area extracting unit is configured to cancel area extraction as a low luminance area in the case where said imaging condition judging unit judges that the imaging condition is not satisfied. 
14. The image processing device according to Claim 13, wherein the predetermined imaging condition is a condition that "a material body including a large plane exists near the object in an image scene, and a light source exists in a direction opposite to the object when the light source is seen from the large plane". 15. The image processing device according to Claim 13, wherein said imaging condition judging unit has an angle sensor for detecting an imaging direction of said polarized image capturing unit, and is configured to judge that the image scene does not satisfy the imaging condition in the case where said angle sensor detects that said polarized image capturing unit faces upward from a horizontal plane. 16. The image processing device according to Claim 13, wherein said imaging condition judging unit includes a sonar for measuring a distance to a nearby target by generating a sound wave and receiving a reflected wave of the sound wave, and is configured to judge whether or not there is a material body near the information processing device using the sonar, and judges that the image scene does not satisfy the imaging condition in the case where said imaging condition judging unit judges that there is no material body. 17. The information processing device according to Claim 4, further comprising a reliability judging unit configured to judge whether area extraction is reliable or not by evaluating a result of the area extraction performed by said shadow area extracting unit, and discard the result of the area extraction performed by said shadow area extracting unit in the case where said reliability judging unit judges that the area extraction is not reliable. 18. 
The image processing device according to Claim 17, wherein said reliability judging unit is configured to judge whether or not an attached shadow area exists in a low luminance area as the result of the area extraction performed by said shadow area extracting unit, and judge that the area extraction is not reliable in the case where no attached shadow area exists in the low luminance area. 19. An image processing method for performing processing on a shadow in an image of an object, said image processing method comprising: obtaining information about an image of the object, the information including luminance information which is information about luminance of light from the object and polarization information which is information about polarization of the light from the object; extracting an attached shadow area and a cast shadow area from the image of the object based on the luminance information and the polarization information obtained in said obtaining, the attached shadow area appearing on the surface of the object depending on an angle of incidence light, and the cast shadow area appearing on a surface of a material body other than the object when the light is blocked by the object; and outputting information identifying the attached shadow area and the cast shadow area extracted in said extracting. 20. A program product for an information processing device which performs processing on a shadow in an image of an object, said program causing a computer to execute the information processing method according to Claim 19.

Documents

Application Documents

# Name Date
1 1160-CHENP-2009-AbandonedLetter.pdf 2017-07-14
2 Form5_As Filed_02-03-2009.pdf 2009-03-02
3 Form3_As Filed_02-03-2009.pdf 2009-03-02
4 Correspondence by Agent_Reply to Examination Report_21-12-2016.pdf 2016-12-21
5 Form2 Title Page_Complete_02-03-2009.pdf 2009-03-02
6 Correspondence by Agent_Form18_01-06-2011.pdf 2011-06-01
7 Form1_As Filed_02-03-2009.pdf 2009-03-02
8 Form18_Normal Request_01-06-2011.pdf 2011-06-01
9 Drawings_As Filed_02-03-2009.pdf 2009-03-02
10 Annexure Form3_After Filing_31-08-2009.pdf 2009-08-31
11 Description Complete_As Filed_02-03-2009.pdf 2009-03-02
12 Correspondence by Agent_Form3_PA_31-08-2009.pdf 2009-08-31
13 Form26_Power Of Attorney_31-08-2009.pdf 2009-08-31
14 Correspondence by Agent_As Filed_02-03-2009.pdf 2009-03-02
15 Claims_As Filed_02-03-2009.pdf 2009-03-02
16 Abstract_As Filed_02-03-2009.pdf 2009-03-02

Search Strategy

1 1160search1_19-12-2016.pdf