
Air Conditioner

Abstract: An indoor unit of an air conditioner is provided with an image pickup device that includes a human body detecting means for detecting the presence or absence of a person and an obstacle detecting means for detecting the presence or absence of an obstacle. A wind direction changing means is controlled based on a detection signal of the human body detecting means and that of the obstacle detecting means. Specifically, at least one obstacle position discriminating region belongs to each of a plurality of human position discriminating regions, and if it is determined that an obstacle is present in an obstacle position discriminating region in front of a human position discriminating region in which a person has been determined to be present, an air current control is conducted to flow air-conditioned air above the obstacle, thereby avoiding it, by controlling vertical wind direction changing blades.


Patent Information

Application #
Filing Date
04 April 2012
Publication Number
23/2013
Publication Type
INA
Invention Field
MECHANICAL ENGINEERING
Status
Parent Application

Applicants

PANASONIC CORPORATION
1006, OAZA KADOMA, KADOMA-SHI, OSAKA 571-8501

Inventors

1. JINNO, YASUSHI
C/O PANASONIC CORPORATION, 1006, OAZA KADOMA, KADOMA-SHI, OSAKA 571-8501
2. SUGIO, TAKASHI
C/O PANASONIC CORPORATION, 1006, OAZA KADOMA, KADOMA-SHI, OSAKA 571-8501
3. MORIKAWA, TOMOTAKA
C/O PANASONIC CORPORATION, 1006, OAZA KADOMA, KADOMA-SHI, OSAKA 571-8501
4. SHIMIZU, TSUTOMU
C/O PANASONIC CORPORATION, 1006, OAZA KADOMA, KADOMA-SHI, OSAKA 571-8501
5. TAKAHASHI, MASATOSHI
C/O PANASONIC CORPORATION, 1006, OAZA KADOMA, KADOMA-SHI, OSAKA 571-8501
6. HASEGAWA, HIROKI
C/O PANASONIC CORPORATION, 1006, OAZA KADOMA, KADOMA-SHI, OSAKA 571-8501
7. KAWANO, YUSUKE
C/O PANASONIC CORPORATION, 1006, OAZA KADOMA, KADOMA-SHI, OSAKA 571-8501
8. SATO, SATOSHI
C/O PANASONIC CORPORATION, 1006, OAZA KADOMA, KADOMA-SHI, OSAKA 571-8501
9. IWAMOTO, KEIKO
C/O PANASONIC CORPORATION, 1006, OAZA KADOMA, KADOMA-SHI, OSAKA 571-8501
10. TSUJIMURA, SATOSHI
C/O PANASONIC CORPORATION, 1006, OAZA KADOMA, KADOMA-SHI, OSAKA 571-8501

Specification

DESCRIPTION

Title of the Invention

Air Conditioner

Technical Field

The present invention relates to an air conditioner having an indoor unit that is provided with a human body detecting device for detecting the presence or absence of a person and an obstacle detecting device for detecting the presence or absence of an obstacle and, more particularly, to a technique for efficiently conveying air-conditioned air to a region where a person has been detected by the human body detecting device depending on the position of an obstacle detected by the obstacle detecting device.

Background Art

A conventional air conditioner has an indoor unit that is provided with a human body detecting device including a human body detecting sensor such as, for example, a pyroelectric infrared sensor and an ultrasonic sensor for detecting the distance to an object. In this air conditioner, air-conditioned air is directed toward a region where no person is present by detecting the position of and distance to a person inside a room with the use of the human body detecting device and by subsequently controlling a wind direction changing means made up of vertical wind direction changing blades and horizontal wind direction changing blades (see, for example, Patent Document 1).

In the air conditioner as disclosed in Patent Document 1, if a region where no person is present coincides with a region in the room where an obstacle such as furniture, which impedes circulation of the air-conditioned air, is present, the air-conditioned air is conveyed toward the obstacle, thereby lowering the air-conditioning efficiency. In order to eliminate such a problem, another air conditioner has been proposed having an indoor unit in which a human position detecting means and an obstacle position detecting means are provided, such that a wind direction changing means is controlled based on both a detection signal from the human position detecting means and a detection signal from the obstacle position detecting means to thereby enhance the air-conditioning efficiency.

In this air conditioner, when a heating operation is started, a determination is first made by the human position detecting means as to whether a person is present or absent in a room. If no person is present, the obstacle position detecting means determines whether an obstacle is present or absent, and if no obstacle is present, the wind direction changing means is controlled to spread the air-conditioned air over an entire space within the room.

If no person is present but an avoidable obstacle has been detected, the wind direction changing means is so controlled as to be directed toward a direction in which no obstacle is present. On the other hand, if an unavoidable object has been detected, the wind direction changing means is controlled so as not to allow the air-conditioned air to directly impinge on the obstacle to thereby spread the air-conditioned air over the entire space within the room.

Further, if a person(s) is present, a determination is made as to whether or not a region of absence is present, and if the region of absence is not present, the wind direction changing means is controlled to allow the air-conditioned air to spread over the entire space within the room. If the region of absence is present, the presence or absence of an obstacle is determined in the region of absence, i.e., the region where no person is present. If an obstacle is present, the wind direction changing means is so controlled as to be directed toward a direction in which the obstacle is present so that the air-conditioned air may not strongly impinge on the obstacle, while if no obstacle is present, the wind direction changing means is so controlled as to be directed toward a direction in which no obstacle is present (see, for example, Patent Document 2).
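The prior-art decision flow just described can be sketched as follows (a minimal Python illustration; the region representation and the returned control actions are hypothetical, not the actual controller of Patent Document 2):

```python
# Sketch of the prior-art heating-operation decision flow described above.
# Region names, data structures, and the returned actions are hypothetical
# illustrations, not the actual controller implementation.

def select_airflow(person_regions, obstacle_regions, all_regions):
    """Return a hypothetical wind-direction target for a heating operation."""
    if not person_regions:
        if not obstacle_regions:
            return "spread over entire room"
        # An avoidable obstacle: aim where no obstacle is present; otherwise
        # spread the air while keeping it off the obstacle.
        free = [r for r in all_regions if r not in obstacle_regions]
        if free:
            return f"direct toward {free}"
        return "spread, avoiding direct impingement on the obstacle"
    # A person is present: look for a region of absence.
    absence = [r for r in all_regions if r not in person_regions]
    if not absence:
        return "spread over entire room"
    blocked = [r for r in absence if r in obstacle_regions]
    if blocked:
        return f"direct toward {blocked} weakly (no strong impingement)"
    return f"direct toward {absence}"
```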

(Prior Art Documents)

• Patent Document 1: Japanese Laid-Open Patent Publication No. 63-143449

• Patent Document 2: Japanese Laid-Open Utility Model Publication No. 3-72249

Summary of the Invention

Problems to be solved by the Invention

In the case of the air conditioner as disclosed in Patent Document 2, the air-conditioning efficiency is enhanced by controlling the wind direction changing means based on the detection signal from the human position detecting means and the detection signal from the obstacle position detecting means. However, because there are a number of objects such as a television set, an audio station, and furniture such as tables, sofas, or the like within a room, there is still room for improvement in terms of optimized air conditioning.

Also, the human position detecting means detects the position of a person based on detection signals from a human body detecting sensor and an ultrasonic sensor, both constituting a human body detecting device, while the obstacle position detecting means detects the position of an obstacle based on, for example, distance information from the ultrasonic sensor when no detection signal is outputted from the human body detecting sensor constituting the human body detecting device. That is, the human body detecting device is used as the human position detecting means and also as the obstacle position detecting means.

Accordingly, if a person is erroneously detected as an obstacle, not only can comfortable air conditioning not be accomplished, but there is also a chance that air-conditioned air may directly impinge on the person, thus leading to a possibility of inefficient and uncomfortable air conditioning control.

The present invention has been developed to overcome the above-described disadvantages.

It is accordingly an objective of the present invention to provide an air conditioner capable of enhancing the air-conditioning efficiency by dividing an area to be air conditioned into a plurality of human position discriminating regions and into a plurality of obstacle position discriminating regions to accurately and efficiently determine the presence or absence of a person and the presence or absence of an obstacle in each region, and by subsequently finely controlling a wind direction changing means made up of horizontal wind direction changing blades and vertical wind direction changing blades based on results of such determination.

Means to Solve the Problems

In accomplishing the above objective, an air conditioner according to the present invention makes use of an image pickup device mounted on an indoor unit to detect the presence or absence of a person (human body detecting means) and the presence or absence of an obstacle (obstacle detecting means), and includes a wind direction changing means for changing a direction of air blown out from the indoor unit, the wind direction changing means including vertical wind direction changing blades to vertically change the direction of air blown out from the indoor unit. The wind direction changing means is controlled based on a detection result of the human body detecting means and a detection result of the obstacle detecting means. When a determination is made based on the detection result of the human body detecting means and the detection result of the obstacle detecting means that an obstacle is positioned closer than a person to the indoor unit, and when a temperature of an indoor heat exchanger mounted in the indoor unit falls within a range from a cutaneous temperature-based reference temperature to a predetermined high temperature, an air current control is conducted to flow air-conditioned air above the obstacle to avoid the obstacle by controlling the vertical wind direction changing blades.

More specifically, an area to be air conditioned is divided into a plurality of human position discriminating regions to be detected by the human body detecting means and into a plurality of obstacle position discriminating regions to be detected by the obstacle detecting means, and at least one obstacle position discriminating region belongs to each of the plurality of human position discriminating regions. When the obstacle detecting means determines that an obstacle is present in an obstacle position discriminating region positioned closer to the indoor unit than a human position discriminating region in which the human body detecting means has determined that a person is present, the air current control is conducted to flow air-conditioned air above the obstacle, thereby avoiding it, by controlling the vertical wind direction changing blades.

Also, an angle from a horizontal line is set for each of the plurality of human position discriminating regions. When the obstacle detecting means determines that an obstacle is present in an obstacle position discriminating region positioned closer to the indoor unit than a human position discriminating region in which the human body detecting means has determined that a person is present, the set angle of the vertical wind direction changing blades is modified to allow the vertical wind direction changing blades to be set upward.
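As a rough sketch of this angle modification (the region names, base angles, and upward correction below are hypothetical illustrations, not values from the specification):

```python
# Sketch of the air current control described above: if an obstacle is
# detected in a discriminating region closer to the indoor unit than the
# region where a person was detected, the set angle of the vertical wind
# direction changing blades is modified upward to carry air over the
# obstacle. All numeric values here are hypothetical illustrations.

# Hypothetical base blade angle (degrees below horizontal) per human
# position discriminating region; farther regions get shallower angles.
BASE_ANGLE = {"near": 40.0, "middle": 25.0, "far": 15.0}
UPWARD_OFFSET = 10.0  # hypothetical correction applied to clear an obstacle


def blade_angle(person_region, obstacle_in_front):
    """Angle of the vertical wind direction changing blades, in degrees
    below horizontal; a smaller value means the blades are set further
    upward."""
    angle = BASE_ANGLE[person_region]
    if obstacle_in_front:
        angle -= UPWARD_OFFSET  # raise the blades to avoid the obstacle
    return max(angle, 0.0)
```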

Each of the plurality of human position discriminating regions is classified, depending on its distance from the indoor unit, into either a first region or a second region farther from the indoor unit than the first region. The human position discriminating region in which the human body detecting means has determined that a person is present belongs to the second region, while the obstacle position discriminating region in which the obstacle detecting means has determined that an obstacle is present belongs to the first region.

Specifically, the vertical wind direction changing blades comprise a plurality of independently controllable blades.

Also, the vertical wind direction changing blades are set further upward with an increase in distance from a person to the indoor unit.

Effects of the Invention

According to the present invention, if an obstacle is present in front of a person, an air current control is conducted to avoid the obstacle from above by controlling the vertical wind direction changing blades. Such an air current control, combined with fine control of the wind direction changing means including the vertical wind direction changing blades, results in an increase in air-conditioning efficiency.

Brief Description of the Drawings

Fig. 1 is a front view of an indoor unit of an air conditioner according to the present invention.

Fig. 2 is a vertical sectional view of the indoor unit of Fig. 1.

Fig. 3 is a vertical sectional view of the indoor unit of Fig. 1, depicting a state in which a movable front panel opens a front opening and vertical wind direction changing blades open a discharge opening.

Fig. 4 is a vertical sectional view of the indoor unit of Fig. 1, depicting a state in which a lower blade constituting the vertical wind direction changing blades has been set downward.

Fig. 5 is a flowchart indicating human position estimation processing in an embodiment of the present invention.

Figs. 6A to 6C are schematic views to explain background difference processing in the human position estimation in this embodiment.

Figs. 7A to 7C are schematic views to explain processing for creating a background image in the background difference processing.

Figs. 8A to 8C are schematic views to explain processing for creating a background image in the background difference processing.

Figs. 9A to 9C are schematic views to explain processing for creating a background image in the background difference processing.

Figs. 10A and 10B are schematic views to explain region division processing in the human position estimation in this embodiment.

Fig. 11 is a schematic view to explain two coordinate systems utilized in this embodiment.

Figs. 12A and 12B are schematic views indicating a distance from an image pickup sensor unit to a position of a center of gravity of a human body.

Figs. 13A and 13B are schematic views indicating human position discriminating regions that are detected by the image pickup sensor unit constituting a human body detecting means.

Figs. 14A and 14B are schematic views indicating the human position discriminating regions that are detected by the image pickup sensor unit constituting the human body detecting means, particularly depicting a case where a human body or human bodies are present.

Fig. 15 is a flowchart for setting a region property to each region shown in Figs. 13A and 13B.

Fig. 16 is a flowchart for finally determining the presence or absence of a person in each region shown in Figs. 13A and 13B.

Fig. 17 is a schematic plan view of a house in which the indoor unit of Fig. 1 has been installed.

Fig. 18 is a graph indicating long-term cumulative results obtained by each image pickup sensor unit with respect to the house of Fig. 17.

Fig. 19 is a schematic plan view of another house in which the indoor unit of Fig. 1 has been installed.

Fig. 20 is a graph indicating long-term cumulative results obtained by each image pickup sensor unit with respect to the house of Fig. 19.

Fig. 21 is a flowchart indicating human position estimation processing in which processing for extracting a human-like region from a frame image is utilized.

Fig. 22 is a flowchart indicating human position estimation processing in which processing for extracting a face-like region from a frame image is utilized.

Fig. 23 is a schematic view of obstacle position discriminating regions that are detected by an obstacle detecting means.

Fig. 24 is a schematic view indicating how to detect an obstacle using a stereo method.

Fig. 25 is a flowchart indicating distance measurement processing to an obstacle.

Fig. 26 is a schematic view indicating a distance from the image pickup sensor unit to a point P.

Fig. 27 is an elevational view of a certain living space, particularly depicting measurement results of the obstacle detecting means.

Fig. 28 is a schematic view indicating the definition of a wind direction at each position of right and left blades constituting the horizontal wind direction changing blades.

Fig. 29 is a schematic plan view of a room to explain a wall detection algorithm that is used to obtain distance numbers of surrounding walls by measuring distances to them from the indoor unit.

Fig. 30 is a front view of an indoor unit of another air conditioner according to the present invention.

Fig. 31 is a schematic view indicating a relationship between the image pickup sensor unit and a light emitting portion.

Fig. 32 is a flowchart indicating distance measurement processing to an obstacle that is executed using the light emitting portion and the image pickup sensor unit.

Fig. 33 is a front view of an indoor unit of another air conditioner according to the present invention.

Fig. 34 is a flowchart indicating processing to be executed by a human body distance detecting means employing the human body detecting means.

Figs. 35A and 35B are schematic views indicating processing for estimating a distance from the image pickup sensor unit to a human body using a v-coordinate v1 of an uppermost portion of an image.

Fig. 36 is a flowchart indicating processing executed by the obstacle detecting means employing the human body detecting means.

Figs. 37A and 37B are schematic views indicating processing for estimating a height v2 of a human body on the image using distance information from the image pickup sensor unit to the human body that has been estimated by the human body distance detecting means.
Figs. 38A and 38B are schematic views indicating processing for estimating whether an obstacle is present or absent between the image pickup sensor unit and the human body.

Figs. 39A and 39B are schematic views indicating the processing for estimating whether an obstacle is present or absent between the image pickup sensor unit and the human body.

Detailed Description of the Embodiments

Embodiments of the present invention are described hereinafter with reference to the drawings.

(Whole construction of air conditioner)

Air conditioners for use in ordinary households include an outdoor unit and an indoor unit connected to each other via refrigerant piping, and Figs. 1 to 4 depict an indoor unit of an air conditioner according to the present invention.

The indoor unit includes a main body 2 and a movable front panel (hereinafter referred to simply as "front panel") 4 to open and close front suction openings 2a defined in the main body 2. When the air conditioner is not in operation, the front panel 4 is held in close contact with the main body 2 to close the front suction openings 2a, while when the air conditioner is brought into operation, the front panel 4 moves away from the main body 2 to open the front suction openings 2a. Figs. 1 and 2 depict a state in which the front suction openings 2a have been closed by the front panel 4, and Figs. 3 and 4 depict a state in which the front suction openings 2a have been opened by the front panel 4.

As shown in Figs. 1 to 4, the main body 2 accommodates therein a heat exchanger 6; an indoor fan 8 operable to blow out into a room the indoor air that has been sucked through the front suction openings 2a and upper suction openings 2b and then heat exchanged by the heat exchanger 6; vertical wind direction changing blades 12 operable to open and close a discharge opening 10, through which heat exchanged air is blown out into the room, and also operable to vertically change the direction of air blown out from the discharge opening 10; and horizontal wind direction changing blades 14 operable to horizontally change the air direction. A filter 16 is disposed between the front and upper suction openings 2a, 2b and the heat exchanger 6 to remove dust contained in indoor air that has been sucked through the front suction openings 2a and the upper suction openings 2b.

The front panel 4 is connected at an upper portion thereof to an upper portion of the main body 2 via two arms 18, 20 provided on respective side portions thereof. The arm 18 is connected to a drive motor (not shown), and when the air conditioner is brought into operation, the front panel 4 is moved forward and obliquely upward from a position (where the front suction openings 2a are closed) at the time of stop of operation of the air conditioner by driving the drive motor.

The vertical wind direction changing blades 12 include an upper blade 12a and a lower blade 12b, both swingably mounted to a lower portion of the main body 2. The upper blade 12a and the lower blade 12b are connected to respective drive sources (for example, stepping motors), and angles thereof are independently controlled by a controller (first substrate 48 described later, for example, microcomputer) accommodated within the indoor unit. As can be seen from Figs. 3 and 4, a range of angles within which the lower blade 12b is allowed to swing is so set as to be greater than a range of angles within which the upper blade 12a is allowed to swing.

A method of driving the upper blade 12a and the lower blade 12b is explained later. The vertical wind direction changing blades 12 may be made up of three blades or more. In this case, it is preferred that angles of at least two blades (in particular, an uppermost blade and a lowermost blade) be independently controlled.

The horizontal wind direction changing blades 14 are made up of a total of ten blades in groups of five each on right and left sides with respect to a center of the indoor unit. These blades are swingably mounted to a lower portion of the main body 2. Each group of five blades is connected to a drive source (for example, a stepping motor) as a unit, and the angle thereof is controlled by the controller accommodated in the indoor unit. A method of driving the horizontal wind direction changing blades 14 is also explained later.

(Construction of human body detecting means)

As shown in Fig. 1, an image pickup sensor unit 24 is mounted as an image pickup device on an upper portion of the front panel 4. The image pickup sensor unit 24 is held by a sensor holder.

The image pickup sensor unit 24 includes a circuit board, a lens mounted on the circuit board, and an image pickup sensor accommodated in the lens. The human body detecting means determines the presence or absence of a person based on, for example, difference processing (explained later) using the circuit board. That is, the circuit board acts as a determination means for determining whether a person is present or absent.

(Estimation of human position by image pickup sensor unit)

A known difference method is utilized to estimate a position of a person using the image pickup sensor unit 24. In this method, difference processing is performed with respect to a background image in which no person is present and an image taken by the image pickup sensor unit 24, and it is estimated that a person is present in a region where a difference occurs.

Fig. 5 is a flowchart indicating human position estimation processing in this embodiment.

At step S101, background difference processing is utilized to detect pixels in which a difference occurs in a frame image. The background difference processing is a method of comparing a background image taken under specific conditions and an image taken under the same imaging conditions such as a field of view, a point of view, a focal length and the like of the image pickup sensor unit 24 as those of the background image to detect an object that is not present in the background image but present in the image taken. To detect a person, an image in which no person is present is first created as the background image.

Figs. 6A to 6C are schematic views to explain background difference processing in the human position estimation according to this embodiment. Fig. 6A depicts a background image. The field of view is set so as to be nearly equal to a space to be air conditioned by the air conditioner. In this figure, 101 denotes a window present in the space to be air conditioned, and 102 denotes a door.

Fig. 6B depicts a frame image taken by the image pickup sensor unit 24. The field of view, the point of view, the focal length and the like of the image pickup sensor unit 24 are the same as those of the background image shown in Fig. 6A. 103 denotes a person present in the space to be air conditioned. The background difference processing creates a difference image between Fig. 6A and Fig. 6B to detect the person.

Fig. 6C depicts the difference image in which white pixels denote pixels having no difference and black pixels denote pixels having a difference. It is found that a region of the person 103 not present in the background image but present in the frame image taken has been detected as a region 104 in which the difference has occurred. That is, the human region can be detected by extracting the region having the difference from the difference image.

The background image referred to above can be created by making use of inter-frame difference processing. Figs. 7 to 9 are schematic views to explain this processing.

Figs. 7A to 7C are schematic views depicting images of three consecutive frames taken by the image pickup sensor unit 24 in a scene in which the person 103 is moving from right to left in front of the window 101. Fig. 7B depicts an image of the next frame of Fig. 7A, and Fig. 7C depicts an image of the next frame of Fig. 7B.

Figs. 8A to 8C depict inter-frame difference images obtained by the inter-frame difference processing with the use of the images of Figs. 7A to 7C. White pixels denote pixels having no difference and black pixels 105 denote pixels having a difference. If the only object moving within the field of view is a person, it is conceivable that no person exists in a region where no difference has occurred in the inter-frame difference images. In view of this, in this embodiment, the background image is replaced with the image of the current frame in the region where no inter-frame difference has occurred, thereby automatically creating a new background image.

Figs. 9A to 9C schematically depict renewal of the background images of the frames of Figs. 7A to 7C, respectively. A shaded region 106 denotes a region where the background image has been renewed, a black region 107 denotes a region where no background image has been created yet, and a white region 108 denotes a region where the background image has not been renewed. That is, the sum of the black region 107 and the white region 108 in Figs. 9A to 9C is equal to the black region in Figs. 8A to 8C, respectively. As shown in Figs. 9A to 9C, if a person is moving, the black region 107 gradually becomes smaller and the background image is automatically created.
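The background difference and background renewal steps described above can be sketched as follows (a minimal NumPy illustration; the per-pixel intensity threshold is an assumed value):

```python
import numpy as np

# Sketch of the background difference and background renewal steps described
# above. The threshold value and image handling are illustrative assumptions.

DIFF_THRESHOLD = 20  # hypothetical per-pixel intensity threshold


def background_difference(background, frame):
    """Binary mask: True where the frame differs from the background."""
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    return diff > DIFF_THRESHOLD


def renew_background(background, prev_frame, frame):
    """Replace the background with the current frame wherever no
    inter-frame difference occurred (no moving object is there)."""
    moving = np.abs(frame.astype(np.int16)
                    - prev_frame.astype(np.int16)) > DIFF_THRESHOLD
    updated = background.copy()
    updated[~moving] = frame[~moving]  # renew only the still regions
    return updated
```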

If a plurality of difference regions have been obtained, they are separated at the next step S102. That is, if a plurality of persons exist, they are separated as a plurality of difference regions using a known clustering method. By way of example, the difference image is separated into different regions in accordance with a rule in which a pixel having a difference and neighboring pixels similarly having a difference lie in the same region.

Figs. 10A and 10B schematically depict this region separation processing. Fig. 10A depicts a difference image calculated by the difference processing, wherein black pixels 111 and 112 denote pixels in which a difference has occurred. When the difference image of Fig. 10A is obtained, the result shown in Fig. 10B is subsequently obtained by region separation in accordance with the aforementioned rule. As can be seen from these figures, a determination is made that a horizontal-striped region 113 and a vertical-striped region 114 are different regions. For this determination, denoising processing such as, for example, morphological processing widely used in the field of image processing may be performed.
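The separation rule described above amounts to connected-component labelling, which can be sketched as follows (a minimal Python illustration of the 8-neighbour rule; a library routine such as scipy.ndimage.label performs the same task):

```python
from collections import deque

# Sketch of the region separation rule described above: pixels having a
# difference that lie next to each other are put in the same region
# (8-neighbour connected-component labelling).


def separate_regions(mask):
    """mask: 2-D list of 0/1; returns a same-shaped list of region labels
    (0 = no difference, 1..N = separated regions)."""
    h, w = len(mask), len(mask[0])
    labels = [[0] * w for _ in range(h)]
    next_label = 0
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not labels[y][x]:
                next_label += 1  # start a new region here
                labels[y][x] = next_label
                queue = deque([(y, x)])
                while queue:  # flood-fill the 8-connected neighbourhood
                    cy, cx = queue.popleft()
                    for dy in (-1, 0, 1):
                        for dx in (-1, 0, 1):
                            ny, nx = cy + dy, cx + dx
                            if (0 <= ny < h and 0 <= nx < w
                                    and mask[ny][nx] and not labels[ny][nx]):
                                labels[ny][nx] = next_label
                                queue.append((ny, nx))
    return labels
```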

At the next step S103, the position of a person detected is determined by calculating the position of a center of gravity of each region obtained. In order to determine the position of a person from the position of the center of gravity of each image, the perspective projection conversion may be used.

Two coordinate systems are introduced to explain the perspective projection conversion. Fig. 11 is a schematic view of these two coordinate systems. An image coordinate system is considered first: this is a two-dimensional coordinate system in the image taken, in which the top-left pixel of the image is set as the origin and the rightward and downward directions are indicated by "u" and "v", respectively. A camera coordinate system, a three-dimensional coordinate system based on the camera, is considered next. In this coordinate system, the position of the focal point of the image pickup sensor unit 24 is set as the origin, and the optical axis direction of the image pickup sensor unit 24 is indicated by "Zc". Also, the upward and leftward directions as viewed from the camera are indicated by "Yc" and "Xc", respectively. Then, the following relationship is established by means of the perspective projection conversion.

[Formula 1]

u = u0 - (f / dpx) * (Xc / Zc)
v = v0 - (f / dpy) * (Yc / Zc)

where f denotes a focal length [mm], (u0, v0) denotes an image center [pixel] in the image coordinate system, and (dpx, dpy) denotes a size [mm/pixel] of one pixel of the image pickup element. Because Xc, Yc and Zc are all unknowns, if a coordinate (u, v) on the image is known, Formula 1 means that the real three-dimensional position corresponding to that coordinate lies on a certain straight line through the origin of the camera coordinate system.

As shown in Figs. 12A and 12B, the position of the center of gravity of a person on the image is denoted by (ug, vg), and the three-dimensional position thereof in the camera coordinate system is denoted by (Xgc, Ygc, Zgc). Fig. 12A and Fig. 12B schematically depict a space to be air conditioned as viewed from the side and from above, respectively. Also, an installation height of the image pickup sensor unit 24 is denoted by H. The direction Xc is horizontal, and the optical axis Zc forms an angle θ with the vertical direction. A direction of the image pickup sensor unit 24 is denoted by an angle α in the vertical direction (angle of elevation, as measured upward from a vertical line) and by an angle β in the horizontal direction (angle as measured rightward from a front reference line as viewed from the indoor unit). If a height of the center of gravity of the person is denoted by h, the three-dimensional position of the person in the space to be air conditioned, i.e., a distance L and a direction W from the image pickup sensor unit 24 to the center of gravity of the person, is calculated by the following formulas.


Here, it is considered that the image pickup sensor unit 24 is generally installed at a height H of about 2 meters and that the height h of the center of gravity of a person is about 80 centimeters. When the installation height H of the image pickup sensor unit 24 and the height h of the center of gravity of the person are thus determined, Formula 3 and Formula 5 mean that the position (L, W) of the center of gravity of the person in the space to be air conditioned is uniquely obtained from the position (ug, vg) of the center of gravity on the image.
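Since Formulas 2 to 5 are not reproduced in this text, the following is only one plausible way to carry out the stated geometry: back-projecting the pixel (ug, vg) to a ray through the camera origin and intersecting it with the horizontal plane at the assumed height h of the center of gravity. The function name and the default values are illustrative assumptions:

```python
import math

# Reconstruction from the stated definitions, not the patent's own formulas:
# the camera sits at height H with its optical axis tilted by theta from the
# vertical; a pixel (ug, vg) is back-projected to a ray and intersected with
# the horizontal plane at the assumed centre-of-gravity height h.


def person_position(ug, vg, f, dpx, dpy, u0, v0,
                    H=2.0, h=0.8, theta=math.radians(60)):
    """Return (L, W): horizontal distance [m] and direction [deg, positive
    to the left as seen from the indoor unit] of the centre of gravity."""
    # Ray direction in camera coordinates (Xc leftward, Yc up, Zc optical axis)
    xc = (u0 - ug) * dpx / f
    yc = (v0 - vg) * dpy / f
    # Rotate into room coordinates (X leftward, Y up, Z horizontal, forward)
    dX = xc
    dY = yc * math.sin(theta) - math.cos(theta)
    dZ = yc * math.cos(theta) + math.sin(theta)
    t = (h - H) / dY            # scale so the ray descends from H to h
    lateral, forward = t * dX, t * dZ
    L = math.hypot(lateral, forward)
    W = math.degrees(math.atan2(lateral, forward))
    return L, W
```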

Figs. 13A and 13B depict a plurality of regions A-G in the space to be air conditioned and, as shown therein, if the center of gravity of the person on the image exists somewhere in those regions A-G, the region where the person is present in the space to be air conditioned can be determined. Also, Figs. 14A and 14B schematically depict where a person or persons are present. In the case of Fig. 14A, because the centers of gravity of the persons are present in the regions A and F, a determination is made that persons are present in the regions A and F shown in Fig. 13B. On the other hand, in the case of Fig. 14B, because the center of gravity of the person is present in the region D, a determination is made that the person is present in the region D shown in Fig. 13B.

Fig. 15 is a flowchart for setting a region property (explained later) for each of the regions A-G using the image pickup sensor unit 24, and Fig. 16 is a flowchart for determining the presence or absence of a person in each region A-G using the image pickup sensor unit 24. A method of determining the position of a person is explained hereinafter with reference to these flowcharts.

At step S1, the presence or absence of a person in each region is first determined at predetermined intervals T1 (for example, 200 milliseconds if a frame rate of the image pickup sensor unit 24 is 5 fps) in the above-described manner. Based on results of such determination, the regions A-G are classified into a first region in which a person is frequently present (place of frequent presence), a second region in which a person is present for a short period of time (transit region, such as a region through which the person merely passes or in which the person stays only briefly), and a third region in which a person is present for a considerably short period of time (non-living region, such as walls or windows, where a person is seldom present). The first, second and third regions are hereinafter sometimes referred to as living sections I, II and III, or as regions of region properties I, II and III, respectively. The living sections may be broadly classified depending on the frequency of the presence of a person by referring to the living sections I and II (region properties I and II) as a living region (region in which a person lives) and to the living section III (region property III) as a non-living region (region in which no person lives).

This determination is made after step S3 in the flowchart of Fig. 15 and explained hereinafter with reference to Figs. 17 and 18.

Fig. 17 depicts a layout of a house called "1LDK" consisting of a Japanese-style room and an LD (living and dining room), with the indoor unit of the air conditioner according to the present invention installed in the LD. Ovals in Fig. 17 indicate places where a subject reported being frequently present.

As described hereinabove, a determination is made as to whether a person is present or absent in each region A-G for every period T1. A response result of 1 (presence of response) or 0 (no response) is outputted after a lapse of each period T1 and, upon repetition of this a plurality of times, all sensor outputs are cleared at step S2.

At step S3, a determination is made as to whether or not a predetermined cumulative period of time of operation of the air conditioner has elapsed. If it is determined at step S3 that the predetermined period of time has not elapsed, the program returns to step S1, but if it is determined that the predetermined period of time has elapsed, each region A-G is determined as one of the living sections I, II, and III by comparing the response results of each region A-G accumulated for the predetermined period of time with two threshold values.

Detailed explanation is made with reference to Fig. 18 indicating long-term cumulative results. A first threshold value and a second threshold value less than the first threshold value are set with which the long-term cumulative results are compared. A determination is made at step S4 whether or not the long-term cumulative results of each region A-G are greater than the first threshold value. If it is determined that the long-term cumulative results are greater than the first threshold value, the region having such long-term cumulative results is determined as the living section I at step S5. On the other hand, if it is determined at step S4 that the long-term cumulative results of each region A-G are not greater than the first threshold value, a determination is made at step S6 whether or not the long-term cumulative results of each region A-G are greater than the second threshold value. If it is determined that the long-term cumulative results are greater than the second threshold value, the region having such long-term cumulative results is determined as the living section II at step S7, and if not, the region is determined as the living section III at step S8.

In the example of Fig. 18, the regions C, D and G are determined as the living section I, the regions B and F as the living section II, and the regions A and E as the living section III.
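The two-threshold classification of steps S4-S8 can be sketched as follows; the threshold values themselves are implementation parameters that the text leaves open, so the numbers used in any call are illustrative.

```python
def classify_region(long_term_count, first_threshold, second_threshold):
    # Steps S4-S8: compare a region's long-term cumulative response count
    # against two thresholds (first_threshold > second_threshold).
    if long_term_count > first_threshold:
        return "I"    # place of frequent presence
    if long_term_count > second_threshold:
        return "II"   # transit region
    return "III"      # non-living region
```

In the example of Fig. 18, the counts for regions C, D and G would exceed the first threshold, those for B and F would fall between the two thresholds, and those for A and E would fall below the second threshold.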

Fig. 19 depicts a layout of another house having an LD in which the indoor unit of the air conditioner according to the present invention has been installed, and Fig. 20 indicates long-term cumulative results of each region A-G. In the example of Fig. 19, the regions B, C and E are determined as the living section I, the regions A and F as the living section II, and the regions D and G as the living section III.

Although the determination for the region property (living section) referred to above is repeated for every predetermined period of time, the results of determination hardly change unless sofas, tables and the like disposed inside the room to be determined are moved.

A final determination of the presence or absence of a person in each region A-G is explained hereinafter with reference to the flowchart of Fig. 16.

Because steps S21 and S22 are the same as steps S1 and S2 in the flowchart of Fig. 15, explanation thereof is omitted. It is determined at step S23 whether or not response results for a predetermined number M (for example, 45) of periods T1 have been obtained. If it is determined that the periods T1 have not reached the predetermined number M, the program returns to step S21, while if it is determined that the periods T1 have reached the predetermined number M, a series of cumulative responses, i.e., the total of the response results over the period T1 × M, is calculated at step S24. The calculation of the series of cumulative responses is repeated a plurality of times, and it is determined at step S25 whether or not calculation results of a predetermined number (for example, N=4) of series of cumulative responses have been obtained. If it is determined that the calculations have not reached the predetermined number, the program returns to step S21, while if it is determined that the calculations have reached the predetermined number, the presence or absence of a person in each region A-G is estimated at step S26 based on the region property that has already been determined and the predetermined number of series of cumulative responses.

It is to be noted here that because the program returns to step S21 from step S27 at which 1 is subtracted from the number (N) of the series of cumulative responses, the calculation of the plurality of series of cumulative responses is repeated.

Table 1 indicates a record of the newest series of cumulative responses (period T1 × M). In Table 1, ΣA0 means the number of a series of cumulative responses in the region A.

When the number of the series of cumulative responses immediately before ΣA0 is denoted by ΣA1, and the number of the series immediately before ΣA1 by ΣA2, then, if N=4, the presence or absence of a person is determined based on the past four records (ΣA4, ΣA3, ΣA2, ΣA1). In the case of the living section I, if the past four records reveal that at least one series of cumulative responses exceeds 1, it is determined that a person is present. In the case of the living section II, if the past four records reveal that more than two series of cumulative responses exceed 1, it is determined that a person is present. In the case of the living section III, if the past four records reveal that more than three series of cumulative responses exceed 2, it is determined that a person is present.
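The per-section decision rule can be sketched as below. The text's thresholds ("exceeds 1", "exceeds 2") are taken literally as strict comparisons here, which is an interpretation; whether the bounds are strict is not stated explicitly.

```python
def person_present(records, section):
    # records: the past N=4 series-of-cumulative-response counts for one
    # region, e.g. [SigmaA4, SigmaA3, SigmaA2, SigmaA1] for region A.
    # Thresholds follow the text literally ("exceeds 1" -> strictly > 1).
    if section == "I":                      # frequent-presence region: lenient
        return sum(r > 1 for r in records) >= 1
    if section == "II":                     # transit region: standard
        return sum(r > 1 for r in records) > 2
    return sum(r > 2 for r in records) > 3  # non-living region: strictest
```

Note how the rule gets stricter from living section I to III: a single qualifying series suffices in a place of frequent presence, whereas all four records must qualify in a non-living region.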

After the period T1 x M from the determination of the presence or absence of a person referred to above, a subsequent determination of the presence or absence of a person is similarly made based on the next four records, the region property, and the predetermined number of series of cumulative responses.

That is, the indoor unit of the air conditioner according to the present invention tries to obtain human position estimation results having a high probability by estimating the human position using the region property, which is obtained upon long-term accumulation of the region determination results for each predetermined period, and the past records indicating the number of N series of cumulative responses in each region, each series indicating the region determination results for a predetermined number of periods.

When the presence or absence of a person is determined in a manner as described above, if T1=0.2 seconds and M=45, a period of time required for estimation of the presence of a person and that required for estimation of the absence of a person are indicated in Table 2.

After an area that is to be air conditioned by the indoor unit of the air conditioner according to the present invention has been classified into a plurality of regions A-G in the above-described manner using the image pickup sensor unit 24, the region property (living sections I-III) of each region A-G is determined, and the period of time required for estimation of the presence of a person and that required for estimation of the absence of a person are changed depending on the region property of each region A-G.

That is, after the setting for air conditioning has been changed, about one minute is needed before the air current reaches its target and, hence, if the setting for air conditioning is changed within a short period of time (for example, several seconds), comfort is lost. In addition, it is preferable in terms of energy saving not to heavily air condition a place that will soon be empty. For this reason, the presence or absence of a person in each region A-G is first detected, and air conditioning is optimized particularly in a region where a person is present.

More specifically, the period of time required for estimation of the presence or absence of a person in a region determined as the living section II is set as a standard. In a region determined as the living section I, the presence of a person is estimated in a shorter period of time than in the living section II, while, when the person has disappeared from the region, the absence of a person is estimated over a longer period of time than in the living section II. In other words, for a region determined as the living section I, the period of time required for estimation of the presence of a person is set shorter and that required for estimation of the absence of a person is set longer. On the other hand, in a region determined as the living section III, the presence of a person is estimated over a longer period of time than in the living section II, while, when the person has disappeared from the region, the absence of a person is estimated within a shorter period of time than in the living section II. In other words, for a region determined as the living section III, the period of time required for estimation of the presence of a person is set longer and that required for estimation of the absence of a person is set shorter.

Further, as described above, the living section set to each region changes depending on the long-term cumulative results, and the period of time required for estimation of the presence of a person and that required for estimation of the absence of a person are both variably set accordingly.

Although in the above discussion the difference method is used to estimate the position of a person by means of the image pickup sensor unit 24, any other method may be used. For example, a human-like region, i.e., a region where a person is likely to be present, may be extracted from a frame image by making use of image data of the entire body of the person. A method utilizing, for example, HOG (Histograms of Oriented Gradients) features is widely known as such a method (N. Dalal and B. Triggs, "Histograms of Oriented Gradients for Human Detection", In Proc. IEEE Conf. on Computer Vision and Pattern Recognition, Vol. 1, pp. 886-893, 2005). HOG features are feature amounts focused on the edge strength in each edge direction in localized portions of an image, and the human region can be detected from the frame image by conducting learning and discrimination with respect to those feature amounts with the use of an SVM (Support Vector Machine) or the like.

Fig. 21 is a flowchart indicating human position estimation processing in which processing for extracting a human-like region from the frame image is utilized. In this figure, the same steps as those in Fig. 5 are designated by the same step numbers and detailed explanation thereof is omitted.

At step S104, the human-like region in the frame image is extracted as a human region by making use of the aforementioned HOG.

At step S103, the position of the person detected is determined by calculating the position of a center of gravity of the human region obtained. In order to determine the position of the person from the position of the center of gravity of the image, Formula 3 and Formula 5 can be used, as described above.

Instead of utilizing image data of the entire body of the person, a face-like region may be extracted from the frame image. A method utilizing, for example, Haar-like features is widely known as such a method (P. Viola and M. Jones, "Robust real-time face detection", International Journal of Computer Vision, Vol. 57, No. 2, pp. 137-154, 2004). Haar-like features are feature amounts focused on a luminance difference between localized portions of an image, and the face region can be detected from the frame image by conducting learning and discrimination with respect to those feature amounts with the use of an SVM (Support Vector Machine) or the like.

Fig. 22 is a flowchart indicating human position estimation processing in which processing for extracting a face-like region from a frame image is utilized. In this figure, the same steps as those in Fig. 5 are designated by the same step numbers and detailed explanation thereof is omitted.

At step S105, the face-like region in the frame image is extracted as a face region by making use of the aforementioned Haar-like features.

At step S103, the position of the person detected is determined by calculating the position of a center of gravity of the face region obtained. In order to determine the position of the person from the position of the center of gravity of the image, the perspective projection conversion can be used, as described above. In this event, when the position of the person is determined from the position of the center of gravity thereof by making use of the region of the entire body of the person, the height of the center of gravity of the person is set to h=about 80 centimeters, but when the face region is utilized, the height of the center of gravity of the face is set to h=about 160 centimeters and the position of the person is determined by making use of Formula 3 and Formula 5.

(Construction of obstacle detecting means)

Obstacle detection is conducted by making use of the image pickup sensor unit 24 and this obstacle detecting means is explained hereinafter. The term "obstacle" as employed throughout this application is defined as an object that generally impedes an air flow blown out from the discharge opening 10 in the indoor unit to provide a resident or residents with a comfortable space, and collectively means objects other than residents such as, for example, a television set, an audio station, and furniture such as sofas, tables or the like.

In this embodiment, the floor face in the living space is divided into a plurality of regions as shown in Fig. 23 by the obstacle detecting means based on the vertical angle α and the horizontal angle β. Each of the plurality of regions so divided is defined as an obstacle position discriminating region or a "position" where the presence or absence of an obstacle is determined. An entire area covering all the positions shown in Fig. 23 substantially coincides with an entire area covering all the human position discriminating regions as shown in Fig. 13B. By making region boundaries of Fig. 13B substantially coincide with position boundaries of Fig. 23, and by making the regions correspond to the positions in the following manner, not only can air conditioning control be easily conducted, but the number of memories for storage of information can also be minimized.

Region A: Position A1+A2+A3
Region B: Position B1+B2
Region C: Position C1+C2
Region D: Position D1+D2
Region E: Position E1+E2
Region F: Position F1+F2
Region G: Position G1+G2
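The region-to-position correspondence above can be held as a small lookup table; checking whether an obstacle lies in front of an occupied region then reduces to a membership test. The function name below is illustrative, not from the patent.

```python
# Mapping from each human position discriminating region to the obstacle
# position discriminating regions ("positions") that belong to it.
REGION_TO_POSITIONS = {
    "A": ("A1", "A2", "A3"),
    "B": ("B1", "B2"),
    "C": ("C1", "C2"),
    "D": ("D1", "D2"),
    "E": ("E1", "E2"),
    "F": ("F1", "F2"),
    "G": ("G1", "G2"),
}

def obstacle_in_region(region, occupied_positions):
    # True if any position belonging to the given human position
    # discriminating region has been determined to hold an obstacle.
    return any(p in occupied_positions for p in REGION_TO_POSITIONS[region])
```

Keeping one shared boundary layout for both Figs. 13B and 23, as the text notes, is what lets a single small table like this serve both the human and obstacle determinations.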

In the region division of Fig. 23, the number of the positions is so set as to be greater than the number of the human position discriminating regions, and at least two positions belong to each of the human position discriminating regions and are positioned side by side as viewed from the indoor unit. However, air conditioning control can be conducted with a region division in which at least one position belongs to each of the human position discriminating regions.

Also, in the region division of Fig. 23, each of the plurality of human position discriminating regions is divided depending on a distance to the indoor unit, and the number of the positions belonging to a human position discriminating region close to the indoor unit is set greater than the number of the positions belonging to another human position discriminating region remote from the indoor unit. However, the positions belonging to each human position discriminating region may be the same in number irrespective of the distance from the indoor unit.

(Detecting operation and data processing by obstacle detecting means)

As described above, in the air conditioner according to the present invention, the presence or absence of a person in the regions A-G is detected by the human body detecting means, while the presence or absence of an obstacle in the positions A1-G2 is detected by the obstacle detecting means, and the vertical wind direction changing blades 12 and the horizontal wind direction changing blades 14 both constituting the wind direction changing means are controlled based on a detection signal (result detected) from the human body detecting means and that (result detected) from the obstacle detecting means, thereby providing a comfortable space.

The human body detecting means makes use of, for example, the fact that a person moves and can detect the presence or absence of the person by detecting an object that is moving in a space to be air conditioned, while the obstacle detecting means detects a distance to an obstacle by means of the image pickup sensor unit 24 and accordingly cannot distinguish between a human body and an obstacle.

If a human body is erroneously detected as an obstacle, a region in which a person is present cannot be air conditioned or air-conditioned air (air current) may directly impinge on the person, thus resulting in inefficient or uncomfortable air conditioning control.

For this reason, the obstacle detecting means is designed so as to detect only an obstacle by executing data processing explained below.

An obstacle detecting means employing image pickup sensor units is first explained. A stereo method is utilized to detect an obstacle with the use of the image pickup sensor units. The stereo method utilizes a plurality of image pickup sensor units 24, 26 to estimate a distance to an object by making use of a disparity between them.

Fig. 24 is a schematic view indicating how to detect an obstacle using the stereo method. In this figure, the image pickup sensor units 24, 26 are utilized to measure a distance to a point P where the obstacle is positioned. In Fig. 24, f denotes a focal length, B denotes a distance between a focal point of the image pickup sensor unit 24 and that of the image pickup sensor unit 26, u1 denotes a u-coordinate of the obstacle on an image of the image pickup sensor unit 24, u2 denotes a u-coordinate of a point corresponding to u1 on an image of the image pickup sensor unit 26, and X denotes the distance from the two image pickup sensor units 24, 26 to the point P. Assuming that a center position of the image of the image pickup sensor unit 24 and that of the image of the image pickup sensor unit 26 are the same, the distance X from the image pickup sensor units 24, 26 to the point P is obtained by the following formula.

[Formula 6]

This formula reveals that the distance X from the image pickup sensor units 24, 26 to the point P or the obstacle depends on a disparity |u1-u2| between the two image pickup sensor units 24, 26.

A block matching method employing a template matching method may be used to search for the corresponding point. As described above, the distance measurements (position detection of obstacles) are conducted using the image pickup sensor units.
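With the quantities defined above (focal length f, baseline B, disparity |u1 − u2|), Formula 6 is presumably the standard stereo triangulation relation X = f·B/|u1 − u2|. The sketch below implements that textbook form, which is an assumption since the formula itself is not reproduced here.

```python
def stereo_distance(f, B, u1, u2):
    # Presumed Formula 6: distance from the stereo pair to point P.
    # f: focal length, B: baseline between the two sensor units,
    # u1, u2: u-coordinates of the corresponding points on each image.
    disparity = abs(u1 - u2)
    if disparity == 0:
        # Zero disparity: point at infinity (or a matching failure).
        return float("inf")
    return f * B / disparity
```

Consistent with the text, the estimated distance X depends only on the disparity |u1 − u2| once f and B are fixed, and larger disparities correspond to nearer obstacles.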

Formula 3, Formula 5 and Formula 6 reveal that the position of the obstacle can be estimated by means of the pixel position and the disparity. In Table 3, "i" and "j" indicate the pixel positions to be measured, and angles in the vertical direction and those in the horizontal direction indicate the angles of elevation α referred to above and angles β measured rightward from a front reference line as viewed from the indoor unit, respectively. That is, each pixel is set to fall within a range of 5 degrees to 80 degrees in the vertical direction and within a range of -80 degrees to 80 degrees in the horizontal direction as viewed from the indoor unit, and the image pickup sensor units measure the disparity of each pixel.

That is, the air conditioner conducts the distance measurements (position detection of obstacles) by measuring the disparity at each pixel from a pixel [14, 15] to a pixel [142, 105].

A range of detection of the obstacle detecting means at the start of operation of the air conditioner may be limited to more than 10 degrees in the angle of elevation. The reason for this is that there is a high possibility that someone is present at the start of operation of the air conditioner, and data measured can be effectively utilized by conducting the distance measurements with respect to only regions where it is highly possible that nobody is detected, i.e., regions where walls exist (because a human body is not an obstacle, data obtained from a region where a person is present are not used, as described later).

The distance measurements to an obstacle are explained hereinafter with reference to Fig. 25.

If a determination is made at step S41 that no person is present in a region (any one of the regions A-G shown in Fig. 13) corresponding to the present pixel, the program advances to step S42, while if a determination is made at step S41 that a person is present, the program advances to step S43. Because a human body is not an obstacle, preceding distance data are used with respect to a pixel corresponding to the region where a determination has been made that a person is present without conducting the distance measurements (distance data are not updated). The distance measurements are conducted with respect to only a pixel corresponding to the region where a determination has been made that no person is present, and newly measured distance data are used (distance data are updated).

That is, in determining the presence or absence of an obstacle in each obstacle position discriminating region, a determination is made whether or not results of determination by the obstacle detecting means in each obstacle position discriminating region should be updated depending on results of determination of the presence or absence of a person in a human position discriminating region corresponding to each obstacle position discriminating region, thus resulting in an efficient determination of the presence or absence of the obstacle. More specifically, in an obstacle position discriminating region belonging to a human position discriminating region where it has been determined by the human body detecting means that no person is present, preceding results of determination by the obstacle detecting means are updated by current results of determination, while in an obstacle position discriminating region belonging to a human position discriminating region where it has been determined by the human body detecting means that a person is present, the preceding results of determination by the obstacle detecting means are not updated by the current results of determination.
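The update-gating rule of steps S41-S43 can be sketched as a small cache: a distance is re-measured only for a region where no person has been detected, and the preceding value (or a default, on the first pass) is reused otherwise. The function and parameter names, and the measure_fn callback, are illustrative.

```python
def gated_distance(region, occupied_regions, measure_fn, cache, default):
    # Steps S41-S43 (sketch): a human body is not an obstacle, so distance
    # data for a region where a person is present are NOT updated.
    if region in occupied_regions:
        # Person present: keep the preceding data (default if none yet).
        return cache.get(region, default)
    # No person: conduct the measurement and update the stored data.
    cache[region] = measure_fn(region)
    return cache[region]
```

This gives the behavior described above: only pixels belonging to empty regions contribute fresh measurements, so a moving person never masquerades as an obstacle in the stored distance data.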

At step S42, the disparity of each pixel is calculated by making use of the aforementioned block matching method, and the program advances to step S44.

At step S44, data are obtained eight times at the same pixel, and a determination is made whether or not distance measurements based on the data obtained have been completed. If it is determined that the distance measurements have not been completed yet, the program returns to step S41. To the contrary, if it is determined at step S44 that the distance measurements have been completed, the program advances to step S45.

At step S45, the reliability of the distance measurements is evaluated to enhance the accuracy of the distance estimation. That is, if a determination is made that the distance measurements are reliable, distance number determining processing is executed at step S46. On the other hand, if a determination is made that the distance measurements are not reliable, another processing in which a distance number of an adjacent pixel is used as distance data of the pixel in process is executed at step S47.

Such processing is executed by the image pickup sensor units 24, 26 and, hence, the image pickup sensor units 24, 26 act as the obstacle position detecting means.

The distance number determining processing at step S46 is explained hereinafter, but the term "distance number" is first explained.

The "distance number" means an approximate distance from the image pickup sensor unit to a position P in a space to be air conditioned. As shown in Fig. 26, when the image pickup sensor unit has been placed two meters above a floor face, and the distance from the image pickup sensor unit to the position P is referred to as "distance X [m] corresponding to the distance number", the position P is represented by the following formulas.

[Formula 7]
[Formula 8]

As indicated by Formula 6, the distance X corresponding to the distance number depends on the disparity between the image pickup sensor units 24, 26. The distance number is represented by an integer between two and twelve, and the distance corresponding to the distance number is set as shown in Table 4.
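Given Fig. 26's geometry (sensor two meters above the floor, a ray at an angle of elevation α measured from a vertical line, and distance X corresponding to a distance number), Formulas 7 and 8 presumably resolve X into a horizontal offset and a height above the floor. The sketch below uses that reading, so its exact form is an assumption; negative heights correspond to the under-floor positions shown in black in Table 4.

```python
import math

def position_from_distance_number(X, alpha_deg, sensor_height=2.0):
    # Presumed Formulas 7/8: resolve distance X along a ray at elevation
    # angle alpha (measured from a vertical line) into a horizontal offset
    # and a height h above the floor. h < 0 means the ray would land
    # below the floor (the black cells of Table 4).
    a = math.radians(alpha_deg)
    horizontal = X * math.sin(a)
    h = sensor_height - X * math.cos(a)
    return horizontal, h
```

For example, a ray pointing straight down (α = 0) with X = 2 m lands exactly on the floor (h = 0), matching the stated two-meter installation height.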

Table 4 shows the position P corresponding to an angle of elevation (α) that is determined by a v-coordinate value of each pixel based on the distance number and Formula 2. An area in black indicates positions under the floor where "h" takes a negative value (h<0). Also, Table 4 is applied to an air conditioner having a capacity of 2.2 kW, and supposing that this air conditioner is installed alone in a six-mat room (width across corners = 4.50 m), a distance number = 9 is set as a limiting value (maximum value D). In the six-mat room, a position corresponding to a distance number ≥ 10 is located on the other side of a wall (outside the room). Although such a distance number can be applied to a room having a width across corners > 4.50 m, it has no meaning in the six-mat room and is indicated in black in Table 4.

Table 5 is applied to an air conditioner having a capacity of 6.3 kW, and supposing that this air conditioner is installed alone in a twenty-mat room (width across corners = 8.49 m), a distance number = 12 is set as a limiting value (maximum value D).

Table 6 indicates limiting distance numbers set depending on the capacity of the air conditioner and the angle of elevation of each pixel.

[Table 6]

The reliability evaluation processing at step S45 and the distance number determining processing at step S46 are explained hereinafter.

As described above, the distance number has a limiting value depending on the capacity of the air conditioner and the angle of elevation of each pixel; even if a distance number estimation result N exceeds the maximum value D, the distance number is set to D unless all the measurement results are "distance number = N".

Eight distance numbers are determined at each pixel. The two largest and the two smallest distance numbers are removed, and the average of the four remaining distance numbers is taken as the distance number. In applications where the stereo method employing the block matching method is used, when an obstacle without any luminance change is detected, the disparity calculation is unstable and widely varying disparity results (distance numbers) are detected at every measurement. In view of this, the four remaining distance numbers are compared at step S45, and if a variation thereof is greater than or equal to a threshold value, a determination is made at step S47 that the distance numbers are not reliable, and the distance estimation at the pixel in process is abandoned. In this case, the distance number that has been estimated at an adjacent pixel is used instead. The average is rounded up to an integer. The positions corresponding to the distance numbers so determined are shown in Table 4 or Table 5.

Although in this embodiment the distance number has been described as being obtained by determining eight distance numbers at each pixel, removing the two largest and the two smallest, and averaging the four remaining distance numbers, the number of distance numbers to be determined at each pixel is not limited to eight, and the number to be averaged is not limited to four.
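The trimmed-average rule just described, together with the variation check of step S45 and the round-up of the average, can be sketched as follows; the spread threshold and the fallback value (the adjacent pixel's distance number) are parameters the text leaves open.

```python
import math

def robust_distance_number(samples, spread_threshold, fallback):
    # samples: eight distance-number estimates at one pixel.
    s = sorted(samples)
    core = s[2:-2]  # drop the two smallest and the two largest
    if max(core) - min(core) >= spread_threshold:
        # Step S45/S47: variation too large, measurement unreliable;
        # reuse the distance number estimated at an adjacent pixel.
        return fallback
    # Step S46: average the remaining four, rounded up to an integer.
    return math.ceil(sum(core) / len(core))
```

A low-texture obstacle that yields scattered disparities thus falls back to a neighboring pixel's estimate rather than producing a spurious distance.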

Although the preceding distance data are used at step S43 in the flowchart of Fig. 25, if a determination by the obstacle detecting means in each obstacle position discriminating region is a first one, a default value is used because no preceding data exist immediately after the air conditioner has been installed. The limiting value (maximum value D) described above is used as the default value.

Fig. 27 is a schematic elevation view (vertical sectional view passing through the image pickup sensor unit) of a living space, depicting measurement results when a floor face is located two meters below the image pickup sensor unit and there are obstacles such as tables at a level of 0.7-1.1 m above the floor face. In this figure, meshing, upward-sloping hatching, and downward-sloping hatching indicate short-distance regions, intermediate-distance regions, and long-distance regions (these distances are described later), respectively, where the presence or absence of an obstacle is determined.

(Obstacle avoiding control)

During heating, the vertical wind direction changing blades 12 and the horizontal wind direction changing blades 14, both employed as the wind direction changing means, are controlled in the following manner based on the determination of the presence or absence of an obstacle referred to above.

In the following discussion, the terms "block", "field", "short distance", "intermediate distance", and "long distance" are used, and these terms are first explained.
Each of the regions A-G shown in
Fig. 13 belongs to the following block.
Block N: region A
Block R: region B, E
Block C: region C, F
Block L: region D, G
Each of the regions A-G belongs to the following field.
Field 1: region A
Field 2: region B, D
Field 3: region C
Field 4: region E, G
Field 5: region F
The distance from the indoor unit is defined as follows.
Short distance: region A
Intermediate distance: region B, C, D
Long distance: region E, F, G
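The grouping above can be transcribed directly as lookup tables; the helper functions are illustrative conveniences, not part of the source.

```python
# The block/field/distance grouping of regions A-G, transcribed as tables.

BLOCKS = {
    "N": ["A"],
    "R": ["B", "E"],
    "C": ["C", "F"],
    "L": ["D", "G"],
}

FIELDS = {
    1: ["A"],
    2: ["B", "D"],
    3: ["C"],
    4: ["E", "G"],
    5: ["F"],
}

DISTANCES = {
    "short": ["A"],
    "intermediate": ["B", "C", "D"],
    "long": ["E", "F", "G"],
}

def block_of(region):
    """Return the block (N, R, C, or L) to which a region belongs."""
    return next(b for b, regions in BLOCKS.items() if region in regions)

def distance_of(region):
    """Return the distance class of a region as seen from the indoor unit."""
    return next(d for d, regions in DISTANCES.items() if region in regions)
```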

Table 7 indicates target angles of five right-side blades and five left-side blades constituting the horizontal wind direction changing blades 14 at each position. Signs attached to the figures (angles) are defined such that a plus sign (+, no sign in Table 7) indicates a direction in which the right- or left-side blades are directed inwards, and a minus sign (—) indicates a direction in which the right- or left-side blades are directed outwards, as shown in Fig. 28.

"Heating region B" in Table 7 is a heating region where an obstacle avoiding control is conducted, and "Normal automatic wind direction control" is a wind direction control in which no obstacle avoiding control is conducted. A determination as to whether or not the obstacle avoiding control is conducted is based on a temperature of the indoor heat exchanger 6. A wind direction control not to cause a wind to impinge on a resident or residents, a wind direction control at a maximum capacity position, and a wind direction control for the heating region B are conducted in the case where the temperature is low, too high, and moderate, respectively. "Low temperatures", "too high temperatures", "wind direction control not to cause a wind to impinge on a resident or residents", and "wind direction control at a maximum capacity position" all used here have the following meanings.

Low temperatures: temperatures (for example, 32 °C) below an optimum temperature of the heat exchanger 6, which is set equal to a cutaneous temperature (33-34°C)

Too high temperatures: temperatures of, for example, above 56°C

Wind direction control not to cause a wind to impinge on a resident or residents: wind direction control in which the angle of the vertical wind direction changing blades 12 is controlled to cause a wind to flow along a ceiling so that no wind may be conveyed to a space around the resident

Wind direction control at a maximum capacity position: wind direction control in which a resistance (loss) generated when the vertical wind direction changing blades 12 or the horizontal wind direction changing blades 14 bend an air current is brought as close to zero as possible (in the case of the horizontal wind direction changing blades 14, this position is a position where they are directed straight forward, and in the case of the vertical wind direction changing blades 12, this position is a position where they are directed 35 degrees downward from a horizontal line)
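The selection among the three wind direction controls by the temperature of the indoor heat exchanger 6 can be sketched as follows, using the example thresholds from the text (32°C and 56°C); the names and return values are illustrative.

```python
# Selection of the heating wind direction control mode from the indoor
# heat exchanger temperature, per the example thresholds in the text.

LOW_LIMIT = 32.0   # below the optimum (cutaneous temperature 33-34 degC)
HIGH_LIMIT = 56.0  # "too high" example value

def select_heating_wind_control(heat_exchanger_temp):
    if heat_exchanger_temp < LOW_LIMIT:
        # Low temperature: flow along the ceiling, away from residents.
        return "no-impinge"
    if heat_exchanger_temp > HIGH_LIMIT:
        # Too high: deliver capacity with minimum loss at the blades.
        return "max-capacity"
    # Moderate temperature: obstacle avoiding control for heating region B.
    return "heating-region-B"
```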

Table 8 indicates target angles of the vertical wind direction changing blades 12 in each field when the obstacle avoiding control is conducted. In Table 8, an angle (y1) of the upper blade and an angle (y2) of the lower blade are angles (angles of elevation) measured upward from a vertical line.

The obstacle avoiding control depending on a position of an obstacle is specifically explained hereinafter, and the terms "swing motion", "position swing motion with pause", and "block swing motion with pause" used in the obstacle avoiding control are first explained.

The "swing motion" is a motion of the horizontal wind direction changing blades 14 in which they swing right and left within a predetermined range of angles centering on a target position without any pause at right and left ends of the motion.

In the "position swing motion with pause", the target angles set at each position (angles indicated in Table 7) are modified using Table 9, and the modified angles are set as those at the right and left ends of the motion. In this motion, a time period of pause (time period for fixing the horizontal wind direction changing blades 14) is provided at each end of the motion. By way of example, when the time period of pause elapses at the left end, the horizontal wind direction changing blades 14 swing toward the right end and maintain the wind direction at the right end until the time period of pause elapses, and after a lapse of the time period of pause, the horizontal wind direction changing blades 14 swing toward the left end and repeat such motion. The time period of pause is set to, for example, 60 seconds.

That is, if an obstacle exists in a certain position and if the target angles set at such position are used without modification, a hot wind impinges on the obstacle, but the modification indicated in Table 9 allows the hot wind to reach a region where a person is present through the side of the obstacle.

In the "block swing motion with pause", the angles of the horizontal wind direction changing blades 14 corresponding to right and left ends of each block are determined based on, for example, Table 10. In this motion, a time period of pause is provided at respective ends of each block. By way of example, when the time period of pause elapses at the left end, the horizontal wind direction changing blades 14 swing toward the right end and maintain the wind direction at the right end until the time period of pause elapses, and after a lapse of the time period of pause, the horizontal wind direction changing blades 14 swing toward the left end and repeat such motion. The time period of pause is set to, for example, 60 seconds, as in the position swing motion with pause. Because the right and left ends of each block coincide with those of a human position discriminating region corresponding to each block, the block swing motion with pause can be referred to as a "swing motion with pause in human position discriminating region".

It is to be noted that the position swing motion with pause and the block swing motion with pause are separately used depending on a size of the obstacle. If an obstacle in front of a person is small, the position swing motion with pause is performed centering on a position where the obstacle is present to thereby convey air-conditioned air while avoiding the obstacle. On the other hand, if an obstacle in front of a person is large and extends, for example, over a whole area in front of a region where the person is present, the block swing motion with pause is performed to convey air-conditioned air over a wide range.

In this embodiment, the swing motion, the position swing motion with pause, and the block swing motion with pause are collectively referred to as a swing motion of the horizontal wind direction changing blades 14.
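The choice between the two motions with pause, and the back-and-forth schedule they share, might be sketched as follows. The angle arguments, cycle count, and function names are assumptions; the actual end angles come from Tables 7, 9, and 10.

```python
# A sketch of the swing motions of the horizontal wind direction changing
# blades 14. End angles and names are placeholders; the 60-second pause is
# the example value given in the text.

import itertools

PAUSE_SECONDS = 60  # example pause at each end of a motion with pause

def choose_swing_motion(obstacle_spans_whole_front):
    """Small obstacle in front of a person -> position swing with pause
    centered on the obstacle; large obstacle covering the whole front of
    the region -> block swing with pause over a wide range."""
    if obstacle_spans_whole_front:
        return "block-swing-with-pause"
    return "position-swing-with-pause"

def swing_schedule(left_end, right_end, with_pause, cycles=2):
    """Yield (target_angle, hold_seconds) commands: swing to one end, hold
    for the pause period (zero for the plain swing motion), then repeat."""
    hold = PAUSE_SECONDS if with_pause else 0
    ends = itertools.cycle([(left_end, hold), (right_end, hold)])
    return [next(ends) for _ in range(2 * cycles)]
```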

Specific examples of the control of the vertical wind direction changing blades 12 and that of the horizontal wind direction changing blades 14 are explained below. If it has been determined by the human body detecting means that a person is present in only one region, and if it has been determined by the obstacle detecting means that an obstacle is present in an obstacle position discriminating region positioned in front of the human position discriminating region where the person has been detected by the human body detecting means, an air current control is conducted to control the vertical wind direction changing blades 12 such that air-conditioned air may flow above the obstacle to avoid the obstacle. Also, if it has been determined by the obstacle detecting means that an obstacle is present in an obstacle position discriminating region belonging to a human position discriminating region where a person has been detected by the human body detecting means, one of a first air current control and a second air current control is selected. In the first air current control, the horizontal wind direction changing blades 14 are caused to swing within at least one obstacle position discriminating region belonging to the human position discriminating region where the person has been detected by the human body detecting means, and no time period for fixing the horizontal wind direction changing blades 14 is provided at the respective ends of the swing motion. In the second air current control, the horizontal wind direction changing blades 14 are caused to swing within at least one obstacle position discriminating region belonging to the human position discriminating region where the person has been detected by the human body detecting means or to another human position discriminating region adjacent to it, and a time period for fixing the horizontal wind direction changing blades 14 is provided at the respective ends of the swing motion.

Although in a discussion below the control of the vertical wind direction changing blades 12 and that of the horizontal wind direction changing blades 14 are separated, the control of the vertical wind direction changing blades 12 and that of the horizontal wind direction changing blades 14 are conducted in a combined fashion depending on the position of a person and that of an obstacle.

A. Control of vertical wind direction changing blades

(1) A case where a person is present in any one of the regions B-G, and an obstacle is present in a position A1-A3 in front of the region where the person is present

The set angles of the vertical wind direction changing blades 12 as indicated in the normal field wind direction control table (Table 8) are modified as indicated in Table 11 so that an air current control may be conducted in which the vertical wind direction changing blades 12 have been set upward.

(2) A case where a person is present in any one of the regions B-G, and no obstacle is present in the region A in front of the region where the person is present (other than the case (1) above)

The normal automatic wind direction control is conducted.
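Cases (1) and (2) can be sketched as a single rule; the 10-degree upward shift is a placeholder for the Table 11 modification, which is not reproduced here.

```python
# Vertical blade rule of cases A(1)/A(2): if an obstacle is present in a
# short-distance position (A1-A3) in front of the occupied region, shift
# the Table 8 field angles upward; otherwise run the normal automatic
# wind direction control. The 10-degree shift is an assumed placeholder
# for the Table 11 modification.

UPWARD_SHIFT = 10  # degrees of elevation added to the Table 8 angles (assumed)

def vertical_blade_targets(field_angles, obstacle_in_front):
    """field_angles: (upper blade angle y1, lower blade angle y2) from
    Table 8, measured upward from a vertical line."""
    y1, y2 = field_angles
    if obstacle_in_front:
        # Raise both blades so air-conditioned air flows above the obstacle.
        return (y1 + UPWARD_SHIFT, y2 + UPWARD_SHIFT)
    return (y1, y2)  # normal automatic wind direction control
```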

B. Control of horizontal wind direction changing blades

B1. A case where a person is present in the region A (short distance)

(1) A case where the number of the positions where no obstacle is present is one in the region A

The first air current control is conducted in which the blades are caused to swing right and left centering on a target angle set at the position where no obstacle is present. By way of example, if an obstacle is present in the positions A1 and A3, and no obstacle is present in the position A2, the blades are caused to swing right and left centering on a target angle set at the position A2 to thereby basically conduct air conditioning with respect to the position A2 where no person is present, but because it may be that there would be a person in the position A1 or A3, the swing motion allows an air current to be conveyed to the positions A1 and A3 to some extent.

More specifically, because the target angles and modification angles (swing range of angles during the swing motion) at the position A2 are determined based on Table 7 and Table 9, both the right-side blades and the left-side blades continue swinging in a range of angles of ± 10 degrees centering on an angle of 10 degrees without pause. However, timing for a turn of the right-side blades and that for a turn of the left-side blades are set to be identical and, hence, the swing motion of the right-side blades and that of the left-side blades are synchronized.
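The endpoint computation of this example can be sketched as follows; the function name is an assumption, and the 10-degree values come from the example above.

```python
# Swing range of the first air current control in the example above: both
# blade groups swing +/- 10 degrees around the 10-degree target at
# position A2, without pause and with synchronized turns.

def swing_endpoints(target_angle, modification=10):
    """Return (left_end, right_end) of the swing range around the target
    angle; the modification angle is the Table 9 value."""
    return (target_angle - modification, target_angle + modification)

right_blades = swing_endpoints(10)  # swing range for the right-side blades
left_blades = swing_endpoints(10)   # identical range, so the motions stay in sync
```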

(2) A case where the number of the positions where no obstacle is present is two in the region A, and the two positions adjoin each other (A1 and A2, or A2 and A3)

The first air current control is conducted in which the blades are caused to swing right and left with the target angles at the two positions where no obstacle is present employed as respective ends, thereby basically air conditioning the positions where no obstacle is present.

(3) A case where the number of the positions where no obstacle is present is two in the region A, and the two positions are spaced away from each other (A1 and A3)

The block swing motion with pause is performed with the target angles at the two positions where no obstacle is present employed as respective ends, thereby conducting the second air current control.

(4) A case where an obstacle is present in all the positions in the region A

Because the target position is not clear, the block swing motion with pause is performed with respect to the block N to conduct the second air current control. Rather than aiming at the entire region, the block swing motion with pause can create a wind having greater directivity that is likely to reach far while avoiding the obstacles. That is, even if the region A is dotted with obstacles, the block swing motion with pause can convey air-conditioned air through spaces between the obstacles.

(5) A case where no obstacle is present in each position in the region A

The normal automatic wind direction control is conducted with respect to the region A.

B2. A case where a person is present in any one of the regions B, C and D (intermediate distance)

(1) A case where an obstacle is present in only one of the two positions belonging to the region where the person is present

The first air current control is conducted in which the blades are caused to swing right and left centering on a target angle set at the position where no obstacle is present. By way of example, if a person is present in the region D, and an obstacle is present in only the position D2, the blades are caused to swing right and left centering on a target angle set at the position D1.

(2) A case where an obstacle is present in each of the two positions belonging to the region where a person is present

The block swing motion with pause is performed with respect to a block containing the region where the person is present to conduct the second air current control. By way of example, if the person is present in the region D, and an obstacle is present in each of the positions D1 and D2, the block swing motion with pause is performed with respect to the block L.

(3) A case where no obstacle is present in a region where a person is present

The normal automatic wind direction control is conducted with respect to the region where the person is present.

B3. A case where a person is present in any one of the regions E, F and G (long distance)

(1) A case where an obstacle is present in only one of the two positions belonging to an intermediate-distance region in front of the region where the person is present (for example, the person is present in the region E, the obstacle is present in the position B2, and no obstacle is present in the position B1)

(1.1) A case where no obstacle is present on respective sides of the position where the obstacle is present (for example, no obstacle is present in each of the positions B1 and C1)

(1.1.1) A case where no obstacle is present behind the position where the obstacle is present (for example, no obstacle is present in the position E2)

The position swing motion with pause is performed centering on the position where the obstacle is present, thereby conducting the second air current control. By way of example, if a person is present in the region E, an obstacle is present in the position B2, and no obstacle is present on respective sides of and behind the position B2, an air current can be conveyed to the region E by causing the air current to pass by the obstacle in the position B2 to avoid the obstacle.

(1.1.2) A case where an obstacle is present behind the position where the obstacle is present (for example, the obstacle is present in the position E2)

The first air current control is conducted in which the blades are caused to swing centering on a target angle set at a position where no obstacle is present and which belongs to an intermediate-distance region. By way of example, if a person is present in the region E, an obstacle is present in the position B2, no obstacle is present on respective sides thereof, but an obstacle is present behind the position B2, it is advantageous to convey an air current through the position B1 where no obstacle is present.
(1.2) A case where an obstacle is present on one side of the position where the obstacle is present and no obstacle is present on the other side

The first air current control is conducted in which the blades are caused to swing centering on a target angle set at a position where no obstacle is present. By way of example, if a person is present in the region F, an obstacle is present in the position C2, another obstacle is present in the position D1 that is one of the two positions on respective sides of the position C2, and no obstacle is present in the position C1, an air current can be conveyed toward the region F through the position C1 where no obstacle is present while avoiding the obstacle in the position C2.

(2) A case where an obstacle is present in each of the two positions belonging to the intermediate-distance region in front of the region where the person is present

The block swing motion with pause is performed with respect to a block containing the region where the person is present to conduct the second air current control. By way of example, if the person is present in the region F, and an obstacle is present in each of the positions C1 and C2, the block swing motion with pause is performed with respect to the block C. In this case, because the obstacle is present in front of the person and accordingly unavoidable, the block swing motion with pause is performed irrespective of the presence or absence of an obstacle in a block adjacent the block C.

(3) A case where no obstacle is present in each of the two positions belonging to the intermediate-distance region in front of the region where the person is present (for example, the person is present in the region F and no obstacle is present in each of the positions C1 and C2)

(3.1) A case where an obstacle is present in only one of the two positions belonging to the region where the person is present

The first air current control is conducted in which the blades are caused to swing centering on a target angle set at the other of the two positions, where no obstacle is present. By way of example, if a person is present in the region F, no obstacle is present in each of the positions C1, C2 and F1, and an obstacle is present in the position F2, a space in front of the region F where the person is present is open. Accordingly, taking the obstacle in the long-distance position into account, air conditioning is mainly directed at the position F1, the long-distance position where no obstacle is present.

(3.2) A case where an obstacle is present in each of the two positions belonging to the region where the person is present

The block swing motion with pause is performed with respect to a block containing the region where the person is present to conduct the second air current control. By way of example, if the person is present in the region G, no obstacle is present in each of the positions D1 and D2, and an obstacle is present in each of the positions G1 and G2, the block swing motion with pause is performed with respect to the block L. The reason for this is that although the region G where the person is present is open on a front side thereof, the obstacles are present in this entire region and, hence, the target position is not clear.

(3.3) A case where no obstacle is present in each of the two positions belonging to the region where the person is present

The normal automatic wind direction control is conducted with respect to the region where the person is present.

(Person-wall proximity control)

If a person and a wall are present in the same region, the person is always positioned in front of and adjacent to the wall. In this case, during heating, warm air is apt to remain in proximity to the wall and make a room temperature in proximity to the wall higher than that in other space. A person-wall proximity control is conducted to avoid such a phenomenon.

In this control, disparities are calculated at pixels different from the pixels [i, j] shown in Table 3 and the distances thereto are then detected, thereby first recognizing the positions of a front wall, a right-side wall and a left-side wall.

That is, a disparity of a pixel positioned substantially horizontally forward is first calculated using the image pickup sensor units 24, 26 to measure the distance to the front wall, thereby obtaining the distance number thereof. Further, a disparity of a pixel positioned substantially horizontally leftward is calculated to measure the distance to the left-side wall, thereby obtaining the distance number thereof. The distance number of the right-side wall is similarly obtained.

A detailed discussion is further made with reference to Fig. 29, which is a plan view of a room in which the indoor unit has been installed, depicting a case where a front wall WC, a left-side wall WL, and a right-side wall WR exist forward and on the right and left sides of the indoor unit, respectively. Numerals on the left side of Fig. 29 indicate distance numbers of corresponding squares, and Table 12 indicates distances from the indoor unit to a close point and to a distant point corresponding to each distance number.

As described above, the term "obstacle" as employed throughout this application refers to, for example, a television set, an audio station, and furniture such as tables, sofas, or the like, and considering the average heights of these obstacles, they are not detected in a range of angles of elevation of more than 75 degrees. Because it can be assumed that what are detected in this range of angles are walls, in this embodiment, the distances to objects existing forward, rightward and leftward of the indoor unit in the range of angles of elevation of more than 75 degrees are detected, and it is determined that the detected objects and objects lying on extensions thereof are walls.

It can also be assumed that in terms of a view angle in the horizontal direction, the left-side wall WL exists at positions of angles of -80 and -75 degrees, the front wall WC exists at positions of angles of -15 to 15 degrees, and the right-side wall WR exists at positions of angles of 75 and 80 degrees. Of the pixels indicated in Table 3, the pixels present within the above view angle in the horizontal direction in the range of angles of elevation of more than 75 degrees are as follows.

Left end: [14, 15], [18, 15], [14, 21], [18, 21], [14, 27], [18, 27]

Next, as indicated in Table 14, unnecessary wall data are removed by removing a maximum value and a minimum value from the wall data, and the distance numbers of the front wall WC, the left-side wall WL, and the right-side wall WR are determined based on the wall data obtained in this way.

[Table 14]

Maximum values (WL=6, WC=5, WR=3) in Table 14 can be employed as the distance numbers of the left-side wall WL, the front wall WC, and the right-side wall WR. The employment of the maximum values results in air conditioning for a room (large room) having a front wall and right- and left-side walls each farther than that of the actual room. That is, a wider space is set as an object to be air conditioned. However, the maximum values are not always employed, and average values may be employed.
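The outlier trimming and the choice between maximum and average might be sketched as follows; the sample data in the test are invented, not the Table 14 values.

```python
# Determination of a wall distance number: remove the maximum and minimum
# values from the raw wall data, then take the maximum (air conditioning a
# slightly larger room than actual) or the average of what remains.

def wall_distance_number(samples, use_average=False):
    """samples: distance numbers measured for one wall (WL, WC, or WR)."""
    trimmed = sorted(samples)[1:-1]  # drop the minimum and maximum outliers
    if not trimmed:
        trimmed = list(samples)  # too few samples to trim
    if use_average:
        return round(sum(trimmed) / len(trimmed))
    # The maximum sets a wider space as the object to be air conditioned.
    return max(trimmed)
```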

After the distance numbers of the left-side wall WL, the front wall WC, and the right-side wall WR have been determined in the above-described manner, the obstacle detecting means determines whether a wall is present or absent in an obstacle position discriminating region belonging to a human position discriminating region where a person has been detected by the human body detecting means. If it is determined that a wall is present, it is conceivable that the person is present in front of the wall and, hence, a temperature lower than a temperature set by the remote controller is newly set during heating.

The person-wall proximity control is explained hereinafter more specifically, taking the case of heating.

A. A case where a person is present in a short-distance region or an intermediate-distance region

Because the short-distance region or the intermediate-distance region is close to the indoor unit and has a small area, the degree of increase of the room temperature becomes high. Accordingly, a temperature lower than the temperature set by the remote controller by a first predetermined temperature (for example, 2°C) is newly set.

B. A case where a person is present in a long-distance region

Because the long-distance region is distant from the indoor unit and has a large area, the degree of increase of the room temperature is lower than that in the short-distance region or the intermediate-distance region. Accordingly, a temperature lower than the temperature set by the remote controller by a second predetermined temperature (for example, 1°C) less than the first predetermined temperature is newly set.

Further, because the long-distance region has a large area, even if a determination has been made that a person and a wall are present in the same human position discriminating region, it may be that the person and the wall would be apart from each other. Accordingly, the person-wall proximity control is conducted only in the case of combinations as indicated in Table 15 to perform a temperature shift depending on a positional relationship between a person and a wall.
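The temperature shifts of cases A and B below (and above, respectively, once both are read) can be combined into one small rule, using the example values of 2°C and 1°C from the text; the names are illustrative and the Table 15 combination check is omitted.

```python
# Person-wall proximity temperature shift: when a person and a wall share
# a human position discriminating region during heating, lower the set
# temperature by 2 degC (short/intermediate distance) or 1 degC (long
# distance). Shift values are the examples given in the text.

SHIFT_BY_DISTANCE = {"short": 2.0, "intermediate": 2.0, "long": 1.0}

def shifted_set_temperature(remote_setting, person_distance, wall_in_region):
    """Return the newly set temperature for the heating operation."""
    if not wall_in_region:
        return remote_setting  # no person-wall proximity control
    return remote_setting - SHIFT_BY_DISTANCE[person_distance]
```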

Although in this embodiment the distance detecting means employs the stereo method, a method of utilizing a light emitting portion 28 and an image pickup sensor unit 24 can be employed in place of the stereo method. This method is explained hereinafter.

Fig. 30 depicts an indoor unit having a main body 2, the light emitting portion 28 mounted on the main body 2, and the image pickup sensor unit 24 mounted on the main body 2. The light emitting portion 28 includes a light source (not shown) and a scanning portion (not shown), and an LED or a laser is used for the light source. The scanning portion employs a galvanomirror and can arbitrarily change a direction of light emission.

Fig. 31 is a schematic view indicating a relationship between the image pickup sensor unit 24 and the light emitting portion 28. In general, the direction of light emission has two degrees of freedom and an imaging area lies on a two dimensional plane, but it is assumed for the sake of brevity that the direction of light emission has a single degree of freedom and the imaging area is a horizontally extending straight line. The light emitting portion 28 emits light in a direction p with respect to an optical axis direction of the image pickup sensor unit 24. The image pickup sensor unit 24 performs the difference processing with respect to a frame image immediately before the light emitting portion 28 emits light and a frame image toward which light is now emitted to obtain a u-coordinate u1 of a point P, which reflects light emitted from the light emitting portion 28, on an image. The distance from the image pickup sensor unit 24 to the point P can then be obtained by triangulation from the direction p of light emission and the coordinate u1. That is, distance information in a space to be air conditioned can be obtained by detecting the point P of reflection of the light while changing the direction p of light emission of the light emitting portion 28.
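The triangulation implied here can be sketched under assumed geometry (camera and light emitting portion separated by a known baseline, a pinhole camera with focal length in pixels); these parameters and sign conventions are assumptions, not taken from the source.

```python
# A hedged sketch of the active triangulation implied above. The baseline,
# focal length, and sign conventions are assumed, not given in the source.

import math

def distance_by_triangulation(u1, focal_length, baseline, emission_angle):
    """Estimate the distance to the reflection point P.

    u1             : image coordinate of P (pixels from the optical axis)
    focal_length   : camera focal length in pixels
    baseline       : assumed separation of the image pickup sensor unit 24
                     and the light emitting portion 28
    emission_angle : direction p of light emission (radians, measured from
                     the camera's optical axis direction)
    """
    camera_ray = u1 / focal_length          # tangent of the camera ray angle
    emitter_ray = math.tan(emission_angle)  # tangent of the emitted ray angle
    # The two rays start from points a baseline apart; their horizontal
    # offsets per unit depth differ by (camera_ray - emitter_ray), so the
    # depth at which they intersect is:
    return baseline / (camera_ray - emitter_ray)
```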

In Table 16, "i" and "j" indicate addresses to be scanned by the light emitting portion 28, and angles in the vertical direction and those in the horizontal direction indicate the angles of elevation α referred to above and angles β measured rightward from a front reference line as viewed from the indoor unit, respectively. That is, each address is set to fall within a range of 5 degrees to 80 degrees in the vertical direction and a range of -80 degrees to 80 degrees in the horizontal direction as viewed from the indoor unit, and the light emitting portion 28 emits light toward each address in turn to scan the living space.

The distance measurements to an obstacle are explained hereinafter with reference to a flowchart of Fig. 32. Because the flowchart of Fig. 32 is quite similar to the flowchart of Fig. 25, only different steps are explained.

At step S48, if a determination is made that no person is present in a region (any one of the regions A-G shown in Fig. 13) corresponding to an address [i, j] to which light is emitted from the light emitting portion 28, the program advances to step S49, but if a determination is made that a person is present, the program advances to step S43. Because a human body is not an obstacle, preceding distance data are used at a pixel corresponding to the region where a determination has been made that a person is present without conducting the distance measurements (distance data are not updated). The distance measurements are conducted only at a pixel corresponding to the region where a determination has been made that no person is present, and newly measured distance data are used (distance data are updated).

At step S49, the distance to the obstacle is estimated by obtaining the point of reflection generated by the aforementioned light emitting processing from the image pickup sensor unit 24. As described above, it is only necessary to execute the distance number determining processing in which the distance number is used.

The human body detecting means may be used as the distance detecting means. In this case, the human body detecting means serves as a human body distance detecting means and also as an obstacle detecting means. This processing is explained hereinafter.

Fig. 33 depicts an indoor unit having a main body 2 and a single image pickup sensor unit 24 mounted on the main body 2. Fig. 34 is a flowchart indicating processing to be executed by the human body distance detecting means using the human body detecting means. In this figure, the same steps as those in Fig. 5 are designated by the same step numbers and detailed explanation thereof is omitted.

At step S201, in each of the regions divided by the human body detecting means, the human body distance detecting means detects a pixel that is located at an uppermost portion of an image from pixels having a difference to obtain a v-coordinate v1 thereof.

At step S202, the human body distance detecting means estimates the distance from the image pickup sensor unit to a human body using the v-coordinate v1 of the pixel located at the uppermost portion of the image.

Fig. 35A and Fig. 35B are schematic views indicating this processing. Fig. 35A depicts a scene in which two persons 121, 122 close to and remote from a camera are present, and Fig. 35B depicts a difference image of an image taken by the image pickup sensor unit in the scene of Fig. 35A. Two regions 123, 124 in which a difference has occurred correspond to the two persons 121, 122, respectively. It is assumed that the heights h1 of all persons in a space to be air conditioned are known and substantially the same. As described above, because the image pickup sensor unit 24 is located at a height of two meters, the image pickup sensor unit 24 takes an image while looking down on the persons from above. At this moment, the closer the person is to the image pickup sensor unit 24, the lower the portion of the image in which the person is imaged, as shown in Fig. 35B. That is, the v-coordinate v1 of an uppermost portion of the image of the person and the distance from the image pickup sensor unit 24 to the person have a one-to-one relationship. Because of this, the distance to a human body can be detected using the human body detecting means by determining the relationship between the v-coordinate v1 of the uppermost portion of the image of the person and the distance from the image pickup sensor unit 24 to the person in advance.

Table 17 is an example in which the average height of persons is h1, and the relationship between the v-coordinate v1 of the uppermost portion of the image of the person and the distance from the image pickup sensor unit 24 to the person has been determined in advance. This table has been obtained using an image pickup sensor unit with VGA resolution as the image pickup sensor unit 24. From this table, if v1 = 70, for example, the distance from the image pickup sensor unit 24 to the person is estimated to be about two meters.
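The table-based distance estimation of steps S201 and S202 can be sketched as follows. The breakpoint values below are hypothetical placeholders except for the pair v1 = 70 → about 2 m, which is the only entry stated in the text; an actual unit would use the calibrated Table 17 for its own camera geometry (VGA sensor mounted at a height of two meters).

```python
def estimate_distance(v1, table):
    """Step S202 (sketch): look up the distance whose v1 breakpoint is
    nearest to the observed uppermost v-coordinate v1."""
    return min(table, key=lambda entry: abs(entry[0] - v1))[1]

# Hypothetical Table 17: (v1 of uppermost person pixel, distance in meters).
# Only the pair (70, 2.0) is given in the text; the rest are placeholders.
TABLE_17 = [(30, 4.0), (50, 3.0), (70, 2.0), (90, 1.5)]

print(estimate_distance(70, TABLE_17))  # prints 2.0
```

A nearest-breakpoint lookup is used here only for simplicity; a real implementation could equally interpolate between calibrated entries.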

The obstacle detecting means, which makes use of the human body detecting means, is explained hereinafter.

Fig. 36 is a flowchart indicating the processing to be executed by the obstacle detecting means employing the human body detecting means.

At step S203, the obstacle detecting means estimates a height v2 of the person on the image by making use of the distance information from the image pickup sensor unit 24 to the person estimated by the human body distance detecting means. Fig. 37A and Fig. 37B are schematic views for explaining this processing, indicating the same scene as in Fig. 35A and Fig. 35B. As described above, the heights h1 of all persons in the space to be air conditioned are known and substantially the same. Because the image pickup sensor unit 24 is located at a height of two meters, it takes an image while looking down on the persons from above, as shown in Fig. 37A. At this moment, the closer the person is to the image pickup sensor unit 24, the larger the person appears on the image, as shown in Fig. 37B. That is, the difference v2 between the v-coordinate of the uppermost portion and that of the lowermost portion of the image of the person and the distance from the image pickup sensor unit 24 to the person have a one-to-one relationship. Because of this, if the distance from the image pickup sensor unit 24 to the person is known, the size of the person on the image can be estimated. In this case, it is only necessary to determine this relationship in advance.

Table 18 is an example in which the relationship between the difference v2 between the v-coordinate of the uppermost portion and that of the lowermost portion of the image of the person and the distance from the image pickup sensor unit 24 to the person has been determined in advance. This table has been obtained using an image pickup sensor unit with VGA resolution as the image pickup sensor unit 24. From this table, if the distance from the image pickup sensor unit 24 to the person is two meters, the difference v2 is estimated to be equal to 85.
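Conversely, step S203 maps an estimated distance back to an expected on-image height v2. A minimal sketch follows, again with hypothetical table entries except for the pair 2 m → v2 = 85 stated in the text.

```python
def expected_height(distance_m, table):
    """Step S203 (sketch): expected on-image height v2 of a person of the
    assumed common height h1, given the estimated distance to the person."""
    return min(table, key=lambda entry: abs(entry[0] - distance_m))[1]

# Hypothetical Table 18: (distance in meters, expected height v2 in pixels).
# Only the pair (2.0, 85) is given in the text; the rest are placeholders.
TABLE_18 = [(1.5, 110), (2.0, 85), (3.0, 60), (4.0, 45)]

print(expected_height(2.0, TABLE_18))  # prints 85
```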

At step S204, in each region of the difference image, the obstacle detecting means detects the pixel having a difference that is located at the uppermost portion of the image and the pixel having a difference that is located at the lowermost portion of the image, and calculates the difference v3 between their v-coordinates.

At step S205, whether an obstacle is present between the image pickup sensor unit 24 and the person is estimated by comparing the height v2 of the person on the image, estimated using the distance information from the image pickup sensor unit 24 to the person, with the height v3 of the person obtained from the real difference image.

Figs. 38A and 38B and Figs. 39A and 39B are schematic views indicating this processing. Fig. 38A and Fig. 38B depict a scene similar to that of Fig. 35A and Fig. 35B, but one in which no obstacle is present between the image pickup sensor unit 24 and the person. On the other hand, Fig. 39A and Fig. 39B schematically depict a scene in which an obstacle is present. As shown in Fig. 38A and Fig. 38B, if no obstacle is present between the image pickup sensor unit 24 and the person, the height v2 of the person on the image that has been estimated using the distance information from the image pickup sensor unit 24 to the person becomes nearly equal to the height v3 of the person 123 obtained from the real difference image. On the other hand, as shown in Fig. 39A and Fig. 39B, if an obstacle is present between the image pickup sensor unit 24 and the person, the person is partially screened and no difference exists in the screened region. Taking notice of the fact that almost all obstacles in a space to be air conditioned are placed on a floor, it is conceivable that it is a lower part of the person that is screened. This means that even if an obstacle is present between the image pickup sensor unit 24 and the person, the distance to the person can still be correctly obtained using the v-coordinate v1 of the uppermost portion of the image in the human region. On the other hand, if an obstacle is present between the image pickup sensor unit 24 and the person, it is estimated that the height v3 of the person 125 obtained from the real difference image becomes smaller than the height v2 of the person on the image that has been estimated using the distance information from the image pickup sensor unit 24 to the person.
In view of this, if a determination is made at step S205 that v3 is sufficiently smaller than v2, the program advances to step S206, at which a determination is made that an obstacle is present between the image pickup sensor unit 24 and the person. In this event, the distance from the image pickup sensor unit 24 to the person is made equal to the distance from the image pickup sensor unit 24 to the person that has been obtained from the v-coordinate v1 of the uppermost portion of the image.
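Steps S204 through S206 then compare the expected height v2 with the measured height v3 from the difference image. The sketch below strings these steps together; the 0.7 ratio used to interpret "sufficiently smaller" is an assumed threshold, as are any table values other than those quoted in the text.

```python
def measured_height(v_coords):
    """Step S204 (sketch): height v3 of the differing region, i.e. the span
    between the uppermost and lowermost differing pixels of the person."""
    return max(v_coords) - min(v_coords)

def obstacle_between(v2_expected, v3_measured, ratio=0.7):
    """Step S205 (sketch): judge that an obstacle screens the lower part of
    the person when v3 is sufficiently smaller than v2. The text says only
    'sufficiently smaller'; the 0.7 ratio is an assumed threshold."""
    return v3_measured < ratio * v2_expected

# Example: a person estimated at 2 m should appear about 85 pixels tall
# (Table 18), but the differing pixels span only 40 pixels, so the lower
# part of the person is presumed screened by an obstacle.
print(obstacle_between(85, measured_height([70, 90, 110])))  # prints True
print(obstacle_between(85, 80))                              # prints False
```

Because only the lower part of the person is screened, the uppermost v-coordinate v1 remains valid, which is why the distance estimate of step S202 can still be trusted in the obstacle case.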

As described above, the human body distance detecting means can be realized by making use of a detection result of the human body detecting means.

In this embodiment, there are provided a plurality of segmentalized human position discriminating regions and a plurality of segmentalized obstacle position discriminating regions, and the wind direction changing means is controlled depending on the positional relationship between the detected obstacles and the detected human bodies. However, in determining the positional relationship between the obstacles and the human bodies in the space to be air conditioned, a known method may be used. Based on the positional relationship between the obstacles and the human bodies detected by the known method, the first air current control or the second air current control may be selectively conducted.

Industrial Applicability

The air conditioner according to the present invention can enhance the air-conditioning efficiency by finely controlling the wind direction changing means based on a determination result as to whether or not an obstacle is present in front of a person, and is accordingly effectively utilized particularly for air conditioners for general home use.

Explanation of Reference Numerals

2 indoor unit body, 2a front suction opening, 2b upper suction opening, 4 movable front panel, 6 heat exchanger, 8 indoor fan, 10 discharge opening, 12 vertical wind direction changing blade, 14 horizontal wind direction changing blade, 16 filter, 18, 20 arm for front panel, 24, 26 image pickup sensor unit, 28 light emitting portion.

CLAIMS

1. An air conditioner comprising:

an indoor unit having an indoor heat exchanger mounted therein;

a human body detecting means mounted to the indoor unit to detect presence or absence of a person;

an obstacle detecting means mounted to the indoor unit to detect presence or absence of an obstacle;

a wind direction changing means mounted to the indoor unit to change a direction of air blown out from the indoor unit, the wind direction changing means comprising vertical wind direction changing blades to vertically change the direction of air blown out from the indoor unit; and

an image pickup device comprising the human body detecting means and the obstacle detecting means,

wherein the wind direction changing means is controlled based on a detection result of the human body detecting means and a detection result of the obstacle detecting means, and when a determination is made based on the detection result of the human body detecting means and the detection result of the obstacle detecting means that an obstacle is positioned closer than a person to the indoor unit, and when a temperature of the indoor heat exchanger falls within a range from a cutaneous temperature-based reference temperature to a predetermined high temperature, an air current control is conducted to flow air-conditioned air above the obstacle to avoid the obstacle by controlling the vertical wind direction changing blades.

2. The air conditioner according to claim 1, wherein an area to be air conditioned is divided into a plurality of human position discriminating regions to be detected by the human body detecting means and into a plurality of obstacle position discriminating regions to be detected by the obstacle detecting means, and at least one obstacle position discriminating region belongs to each of the plurality of human position discriminating regions, and wherein when a determination is made by the obstacle detecting means that an obstacle is present in an obstacle position discriminating region positioned closer to the indoor unit than a human position discriminating region that has been determined by the human body detecting means that a person is present, the air current control is conducted to flow air-conditioned air above the obstacle to avoid the obstacle by controlling the vertical wind direction changing blades.

3. The air conditioner according to claim 2, wherein an angle of the vertical wind direction changing blades with respect to a horizontal line is set for each of the plurality of human position discriminating regions, and when the determination is made by the obstacle detecting means that an obstacle is present in an obstacle position discriminating region positioned closer to the indoor unit than a human position discriminating region that has been determined by the human body detecting means that a person is present, the set angle of the vertical wind direction changing blades is modified to allow the vertical wind direction changing blades to be set upward.

4. The air conditioner according to claim 2 or 3, wherein each of the plurality of human position discriminating regions is classified into any one of a first region and a second region farther than the first region from the indoor unit depending on a distance from the indoor unit, and the human position discriminating region determined by the human body detecting means that a person is present belongs to the second region, while the obstacle position discriminating region determined by the obstacle detecting means that an obstacle is present belongs to the first region.

5. The air conditioner according to any one of claims 1 to 4, wherein the vertical wind direction changing blades comprise a plurality of independently controllable blades.

6. The air conditioner according to claim 1, wherein the vertical wind direction changing blades are set more upward with an increase in distance from a person to the indoor unit.

Documents

Application Documents

# Name Date
1 3072-CHENP-2012 PCT 04-04-2012.pdf 2012-04-04
2 3072-CHENP-2012 FORM-5 04-04-2012.pdf 2012-04-04
3 3072-CHENP-2012 FORM-3 04-04-2012.pdf 2012-04-04
4 3072-CHENP-2012 FORM-2 04-04-2012.pdf 2012-04-04
5 3072-CHENP-2012 FORM-1 04-04-2012.pdf 2012-04-04
6 3072-CHENP-2012 DRAWINGS 04-04-2012.pdf 2012-04-04
7 3072-CHENP-2012 DESCRIPTION (COMPLETE) 04-04-2012.pdf 2012-04-04
8 3072-CHENP-2012 CORRESPONDENCE OTHERS 04-04-2012.pdf 2012-04-04
9 3072-CHENP-2012 CLAIMS 04-04-2012.pdf 2012-04-04
10 3072-CHENP-2012 ABSTRACT 04-04-2012.pdf 2012-04-04
11 3072-CHENP-2012 POWER OF ATTORNEY 03-10-2012.pdf 2012-10-03
12 3072-CHENP-2012 FORM-3 03-10-2012.pdf 2012-10-03
13 3072-CHENP-2012 CORRESPONDENCE OTHERS 03-10-2012.pdf 2012-10-03
14 3072-CHENP-2012-FER.pdf 2018-07-03
15 3072-CHENP-2012-AbandonedLetter.pdf 2019-01-08

Search Strategy

1 3072chenp2012_19-12-2017.pdf