
Multi Aperture Imaging Device

Abstract: The invention relates to a device (10) for sensing an object region (12; 12'), comprising a flat housing. Said housing has a first main side (14), a second main side (16), a border side (18a), and a multi-aperture device (22) having a plurality of optical channels (24a; 24b), which are arranged laterally adjacent to each other and which face the border side (18a), wherein each optical channel (24a; 24b) is designed to sense a partial region (26a; 26b; 26'a; 26'b) of the object region (12; 12') through the border side (18a) or along an optical axis (32a; 32b) of the optical channel (24a; 24b) in question, which optical axis is deflected between a lateral course within the housing and a non-lateral course outside of the housing, wherein the partial regions (26a; 26b; 26'a; 26'b) of the optical channels (24a; …


Patent Information

Application #
Filing Date
25 September 2020
Publication Number
13/2022
Publication Type
INA
Invention Field
ELECTRONICS
Status
Parent Application

Applicants

FRAUNHOFER-GESELLSCHAFT ZUR FÖRDERUNG DER ANGEWANDTEN FORSCHUNG E.V.
Hansastraße 27c 80686 München, Germany

Inventors

1. WIPPERMANN, Frank
Berliner Str. 57 98617 Meiningen, Germany
2. BRÜCKNER, Andreas
Finkenstraße 16a 82194 Gröbenzell, Germany
3. BRÄUER, Andreas
Rabis 19 07646 Schlöben, Germany

Specification

The invention relates to a device for detecting an object area.

Many cell phones or smartphones today are equipped with at least two camera modules. A camera module often has exactly one optical channel for capturing a sub-area of the object area. For example, there is a primary camera, which can be optimized for taking photos or videos, on the front or second main side of the device facing away from the user, and a secondary camera, which can be optimized for video telephony, for example, on the back or first main side facing the user of the device. Two object areas that are independent of one another can therefore be detected: a first object area facing the front of the housing and a second object area facing the rear. For clarification, Fig. 1 shows a conventional camera or …

In the course of increasing miniaturization, one of the main design tasks is to reduce the thickness of smartphones or cell phones. A problem arises here with the integration of the camera module or modules: the physical laws of optics impose, for any given camera module with lateral dimensions X (extent in the x-direction) and Y (extent in the y-direction), a lower limit on the height Z of the entire camera module (extent in the z-direction). When the height Z is aligned along the thickness of the smartphone or mobile phone, this height determines the minimum thickness of the entire device. In other words, the camera often determines the minimum thickness of the smartphone.
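The optical lower bound described above can be illustrated with a simple thin-lens estimate. The following sketch is an illustrative aside and not part of the patent; the function name, the sensor width, and the field of view are assumptions, and a real module adds mechanical overhead (sensor stack, actuators, cover glass) on top of the focal length.

```python
import math

def min_module_height(sensor_width_mm: float, fov_deg: float) -> float:
    """Illustrative lower bound on camera module height Z: the focal
    length needed to image a given full field of view onto a sensor of
    given width, in the thin-lens approximation f = (w/2) / tan(FOV/2).
    Mechanical overhead would add to this value."""
    return (sensor_width_mm / 2) / math.tan(math.radians(fov_deg) / 2)

# A 4.8 mm wide sensor with a 70 degree field of view needs roughly
# 3.4 mm of focal length, regardless of how thin the housing should be:
z = min_module_height(4.8, 70.0)
```

A narrower field of view only makes the problem worse, since the required focal length, and with it Z, grows.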

One possibility of reducing the overall height of cameras is to use multi-aperture cameras, which comprise a large number of imaging channels arranged next to one another. Superresolution processes are used here, whereby the overall height can be halved. Basically, two principles are known: on the one hand, optical channels that each transfer the entire field of view of the system (Pelican Imaging, inter alia WO 2009151903 A3, TOMBO Japan), and on the other hand, optical channels that each image only a partial area of the overall field of view (DE102009049387 and application DE102013222780 based on it).

A multi-aperture device can include an image sensor with one image sensor area per channel. An optical channel of the multi-aperture device is designed to image a partial area of the object area onto a respective image sensor area and has an optical element or imaging optics, such as a lens, a section of a decentered lens or a free-form surface with an optical center.

Furthermore, it is not possible with a single camera module, and thus a single optical channel, to obtain depth information about the object area detected with it. For this purpose, at least two camera modules are necessary, with the depth resolution improving as the distance between the camera modules increases. The minimum overall height Z can possibly be reduced if multi-aperture cameras with superresolution and a linear arrangement of the channels are used (see again DE102009049387 and, based on this, application DE102013222780). However, the reduction depends on a superresolution factor, which as a rule does not exceed 2.
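The dependence of depth resolution on the spacing of the camera modules follows from the standard stereo triangulation relation. The sketch below is an illustrative aside, not the patent's method; the focal length, baseline, and depth values are made-up numbers.

```python
def disparity_px(focal_px: float, baseline_mm: float, depth_mm: float) -> float:
    """Stereo disparity (in pixels) of a point at the given depth for
    two cameras separated by `baseline_mm`. A larger baseline yields a
    larger disparity at the same depth, so the same pixel quantization
    resolves finer depth differences."""
    return focal_px * baseline_mm / depth_mm

# Doubling the module spacing doubles the disparity at the same depth:
d1 = disparity_px(1000.0, 10.0, 2000.0)
d2 = disparity_px(1000.0, 20.0, 2000.0)
```

This is why the maximum achievable depth resolution grows with the distance between the camera modules, at the cost of installation space.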

The integration of several camera modules requires additional installation space, which is limited, for example, when a camera module for capturing an object area is to be accommodated in the side of the housing facing the user.

It is also possible in principle to reduce the overall height Z by reducing the focal length f of the individual camera optics. However, it is well known to the person skilled in the art that this approach requires either reducing the pixel size or reducing the number of pixels, and thus leads to a reduction in image quality in terms of resolution and/or image noise.
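This trade-off can be made concrete with a one-line scaling argument: shrinking the focal length shrinks the image of the scene by the same factor, so keeping the pixel count forces proportionally smaller, and therefore noisier, pixels. The sketch below is an illustrative aside with made-up numbers, not part of the patent.

```python
def required_pixel_pitch(pitch_um: float, f_old_mm: float, f_new_mm: float) -> float:
    """Scaling the focal length from f_old to f_new scales the image
    size by f_new/f_old; keeping the same number of pixels over the
    smaller image requires scaling the pixel pitch by the same factor.
    Smaller pixels collect less light, which increases image noise."""
    return pitch_um * (f_new_mm / f_old_mm)

# Halving the focal length (4 mm -> 2 mm) halves the required pitch:
p = required_pixel_pitch(1.4, 4.0, 2.0)
```

Keeping the original pixel pitch instead would halve the pixel count per image dimension, reducing resolution, which is the other horn of the dilemma named above.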

The object of the present invention is therefore to create a device that has a smaller thickness at the same image quality, or a higher image quality at the same thickness.

This object is achieved by the subject matter of the independent claims.

The core idea of the present invention is the recognition that a device for capturing an object area can be made thinner, or with similar thickness but better imaging quality, by capturing, for each channel, a respective sub-area of the object area through an edge of the device along a respective optical axis which is deflected between a lateral course inside the housing and a non-lateral course outside the housing. An extension of each channel along a depth (z) direction can thus be arranged essentially parallel to a main side, for example the front or rear of the device for object detection.

According to one embodiment, a multi-aperture device, which comprises at least two optical channels and, in each case, an optical axis assigned to a respective optical channel, is arranged in a device for object detection, which has at least one edge side as well as a first and a second main side, such that the detection of a sub-area of an object area can take place through the edge side. If the optical axes are deflected between a lateral course and a non-lateral course, the object area can face the first or second main side, for example.

Further advantageous embodiments are the subject of the dependent claims.

Preferred exemplary embodiments of the present invention are explained below with reference to the accompanying drawings. The figures show:

1 shows a schematic perspective view of a device for detecting an object region, which device has a multi-aperture device with two optical channels;

2 shows a schematic perspective view of a device for detecting an object region, which device has a multi-aperture device with four optical channels;

3a shows a schematic plan view of a multi-aperture device with a multiplicity of optical channels and image sensor areas in a cellular structure;

3b shows a schematic representation of a two-dimensional arrangement of optical channels for detecting an object region, in which the arrangement of the image sensor areas corresponds to a position of the sub-area within the object region;

4a shows a schematic plan view of a detail or section of the multi-aperture device from FIG. 3a;

4b shows a schematic plan view of the section from FIG. 4a, in which the optical channels have a lateral offset perpendicular to a line direction, so that the image sensor areas of the individual optical channels are arranged in the same position perpendicular to the line direction;

4c shows a schematic plan view of a substrate with a multiplicity of image sensor areas having different distances from one another in an arrangement according to FIG. 4b;

4d shows a schematic plan view of a substrate with a plurality of image sensor areas which are equally spaced from one another;

4e shows a schematic plan view of a substrate with a multiplicity of image sensor areas arranged next to one another without spacing;

4f shows an imager according to FIG. 4b, in which further image sensor areas are arranged in the spaces between the image sensor areas;

5a shows a perspective view of a device for detecting an object region in a first state;

5b shows a perspective view of a device for detecting an object region in a second state;

6 shows a perspective view of a device for detecting an object area, which can detect two different object areas simultaneously;

7 shows a perspective view of a device for detecting an object area facing away from the two main sides;

8 shows a perspective view of a device for detecting an object region facing the second main side, which additionally has a flash device arranged in the multi-aperture device;

9a shows a perspective view of a device for detecting two object areas in a first state, which enables an object area facing the second main side to be detected;

9b shows a perspective view of a device for detecting two object areas in a second state, which enables the detection of an object area facing the first main side;

10 shows a perspective view of a device for detecting three different object areas.

Before exemplary embodiments of the present invention are explained in more detail below with reference to the drawings, it is pointed out that identical, functionally identical or identically acting elements, objects and/or structures are provided with the same reference symbols in the different figures, so that the description of these elements given in different exemplary embodiments is interchangeable or mutually applicable.

1 shows a schematic perspective view of a device 10 for detecting an object region 12. The device 10 has a flat housing with a first main side 14 and a second main side 16 arranged opposite one another. The device 10 also has edge sides 18a-d which are arranged between the first main side 14 and the second main side 16.

The first main side 14 and the second main side 16 are each arranged by way of example in a plane that runs parallel to a plane that is spanned by a y-axis and a z-axis. The edge side 18a and the object area 12, on the other hand, are each parallel to a plane that is spanned by an x-axis and the y-axis. In other words, the edge side 18a is a side surface which is arranged between the two main sides 14 and 16 of the device 10. The edge side 18a and the edge sides 18b-d can also be referred to as end faces.

The x-axis, the y-axis and the z-axis are arranged orthogonally to one another in space and form a right-handed system. It goes without saying that the main and/or edge sides can have curvatures and/or can be formed with any desired surface geometry, for example round, elliptical, polygonal, as a free-form surface or as a combination thereof. Alternatively, device 10 can have a different number of main and/or edge sides, for example only one, two or three edge sides.

The device 10 has a multi-aperture device 22, which here comprises two optical channels 24a and 24b merely by way of example. The multi-aperture device 22 can also comprise any other number of optical channels, such as, but not limited to, 6, 8, 10 or more.

Each channel 24a, b comprises optics 29a, b, which are shown here with a circular aperture only by way of illustration and lie in an optics plane 33, and an image sensor area 25a, b. The image sensor areas 25a and 25b are arranged in an image plane 31, and each optical system 29a, b images the respective partial area of the object area onto the respective sensor area. A detailed discussion of the effects of different two-dimensional extents and/or positionings of the image sensor areas follows in the descriptions of FIGS. 3a and 4a-e. The image plane 31 is arranged parallel to the edge side 18a, as are the apertures of the optics 29a, b and the optics plane 33. However, as an alternative, the edge side 18b or one of the other sides could also serve as a viewing passage.

It is further pointed out that the alternative embodiments of multi-aperture devices described below can have similarly constructed optical channels. For the sake of clarity, however, in the following exemplary embodiments, optical channels are only indicated by cuboids drawn with dashed lines.

The multi-aperture device 22 is arranged between the first and second main sides 14 and 16 in such a way that the image plane 31 lies parallel to and between the edge sides 18a and 18c.

The optical axis 32a is assigned to the optical channel 24a and the optical axis 32b is assigned to the optical channel 24b. The optical axes 32a and 32b are shown here, by way of example, as approximately parallel and running primarily along the z-axis in the interior of the housing, although other configurations are also possible, such as, for example, a divergent course of the axes 32a and 32b from the multi-aperture device 22 towards the object area 12 or the side 18a.

The channels 24a, b are arranged next to one another along the transverse direction y and aligned along the lateral direction z, i.e. facing the edge side 18a. They lie, for example, with the optical centers of their optics 29a, b on a common line along y, forming a one-dimensional array or line of optical channels that extends along the longitudinal direction of the edge side 18a.

Each of the optical channels 24a and 24b is designed to capture a partial area 26a or 26b of the object area 12 along the respective optical axis 32a or 32b. The partial areas 26a and 26b could each completely cover the entire object area 12. In the exemplary embodiments described in more detail below, however, each channel covers the object area 12 only partially, and only all channels together cover it entirely. In the latter case, the areas may overlap one another or adjoin one another seamlessly.

It should also be pointed out that the division of the object area 12 into the sub-areas 26a and 26b shown here is only an example. Fig. 1 shows, for example, that the centers of the sub-areas 26a and 26b lie next to one another in the y-direction, in the same sequence as the assigned optical channels 24a and b, so that the sub-areas 26a and b likewise lie along a line parallel to y to form a one-dimensional array. Theoretically, another arrangement would also be conceivable, such as a side-by-side arrangement transverse to the side-by-side direction y of the channels 24a, b, namely along x. If the number of channels or sub-areas is greater, the sub-areas scanned by the channels could also, for example with their centers, form a two-dimensional array. As mentioned before, the sub-areas may or may not overlap one another.

The device 10 furthermore has, by way of example, a beam-deflecting element 28 which is designed to reflectively deflect the optical axes 32a and 32b of the optical channels 24a and 24b and to relocate the object region 12. The beam-deflecting element 28 is arranged in FIG. 1 by way of example on the edge side 18a.

The beam-deflecting element 28 can be mounted stationary, or rotatable or tiltable about an axis of rotation RA running in the direction of the y-axis, in order to enable reflective deflection of the optical axes 32a and 32b. Such tilting or reflective deflection can possibly also take place individually for each individual optical axis 32a or 32b, and/or in such a way that only the course of one of the optical axes is changed, i.e. tilted or deflected.

A reflective deflection of the optical axes 32a and 32b can result in a changed course of the optical axes, towards optical axes 32'a and 32'b, so that the optical channels 24a and 24b are designed to capture an object region 12' with partial areas 26'a and 26'b; this object region is relocated with respect to the object area 12, for example by a rotation about the axis RA or an axis parallel thereto, when the optical axes 32a and 32b are deflected at the beam-deflecting element 28. This means that the detected object region 12 can be displaced in space by means of the reflective deflection by the beam-deflecting element 28. In other words, by means of the reflective deflection, the object area 12 can be mapped onto the object area 12', and vice versa. The deflection also causes the optical axes 32a and 32b to be deflected between a lateral course along a first direction, for example along the z-direction, and a non-lateral course along a second direction, which can be influenced by the beam-deflecting element 28. The partial areas 26'a and 26'b can cover the object area 12'.
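The deflection between a lateral and a non-lateral course at a mirror-like element 28 can be modelled with the standard reflection formula r = d − 2(d·n)n. The following sketch is an illustrative aside, not taken from the patent; the vectors and the 45° mirror tilt are assumptions chosen to match the coordinate convention used here (lateral course along z, rotation axis RA along y).

```python
def reflect(d, n):
    """Reflect a ray direction d off a planar mirror with normal n
    (both 3-vectors, [x, y, z]): r = d - 2 (d . n) n, n normalized."""
    norm = sum(c * c for c in n) ** 0.5
    n = [c / norm for c in n]
    dot = sum(a * b for a, b in zip(d, n))
    return [a - 2 * dot * b for a, b in zip(d, n)]

# A mirror tilted 45 degrees about the rotation axis RA (the y-axis)
# turns a lateral course along +z into a non-lateral course along +x,
# i.e. towards a main side of the housing:
deflected = reflect([0.0, 0.0, 1.0], [1.0, 0.0, -1.0])
```

Tilting the mirror about RA by a different angle sends the deflected axis towards a differently positioned object region, which is the displacement of 12 to 12' described above.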

The advantage of the embodiment shown in FIG. 1 is that the thickness of the device 10 for object detection is determined by the extent of the multi-aperture device 22 in the direction of the x- or y-axis and can be independent of its extent in the direction of the z-axis. In other words, the edge sides 18a-d can consequently have a small extent in the x-direction.

Another advantage of this embodiment is that any desired optical deflection of the optical axes can be used to detect arbitrarily arranged or positioned object areas in space. Alternatively, it is also conceivable that more than one object area, for example two or three differently positioned or oriented object areas, can be detected, which can be achieved by a design of the beam-deflecting element 28 that differs for each channel or channel group. The deflectability of the device 10 is, however, only optional and, as described below, can also be absent.

Both the multi-aperture device 22 and the beam-deflecting element 28 can also be arranged at a different location within the device 10. It is also conceivable that both the multi-aperture device 22 and the beam-deflecting element 28 are arranged in the edge side 18a or in another edge side 18b-d.

The multi-aperture device 22 can furthermore also comprise more than two optical channels, to which a corresponding plurality of optical axes can be assigned. The plurality of optical channels can be designed to detect two or more partial areas of an object area. It is also conceivable that the individual optical channels are arranged in at least two groups, wherein, for example, a first group of optical channels can be designed to capture a first sub-area of the object area and a second group of optical channels can be designed to capture a second sub-area of the object area. This can be used to increase the resolution.

The beam-deflecting element 28 can be, for example, a mirror or a (partially) reflective continuous or discontinuous surface. Alternatively, another beam-deflecting or beam-shaping element could be used, such as a prism, a lens, a refractive or diffractive lens element, or a combination of such elements.

The device 10 can be, for example, a camera, a mobile phone or smartphone, a screen or TV device, a computer screen or any device suitable for image and / or video recording or for capturing an object area.

2 shows a schematic perspective view of a device for object detection 20, which in turn has the first main side 14, the second main side 16 and the edge sides 18a-d. The device 20 also has a multi-aperture device 34. The multi-aperture device 34 here comprises, by way of example, four optical channels 24a, 24b, 24c and 24d. The multi-aperture device 34 is arranged in the device 20 such that the optical axes 37a, 37b, 37c and 37d assigned to the optical channels 24a-d each run laterally, or in the z-direction, towards the edge side 18a. The individual optical axes can be parallel in sections, for example between the multi-aperture device 34 and a beam-deflecting element 40 arranged in the edge side 18a, or they can be divergent. The optical channels 24a-d are designed to detect the object region 12. The object area 12 is arranged parallel to the edge side 18a and comprises the partial areas 38a and 38b, which here partially overlap by way of example. With regard to further alternatives for the arrangement of the partial areas within the object area 12, reference is made to the explanations relating to FIG. 1.

A first group of optical channels includes optical channels 24a and 24c, while a second group of optical channels includes optical channels 24b and 24d. The first group of optical channels is designed to capture the first partial area 38a of the object area 12. The second group of optical channels is designed to detect the second partial area 38b of the object area 12.

The beam-deflecting element 40 comprises sub-elements 42a, 42b, 42c and 42d. By means of the sub-elements 42a-d, a reflective deflection of the optical axes 37a-d towards optical axes 37'a-d can take place, so that the first group of optical channels captures a sub-region 44a of a second object region 12' and the second group of optical channels captures a sub-region 44b of the object region 12'. The sub-regions 44a and 44b can partially or completely overlap.

The multi-aperture device 34 can also comprise any other number of optical channels, such as, but not limited to, 6, 8, 10 or more. The number of optical channels in the first group can equal the number of optical channels in the second group, but any other division of optical channels into a plurality of groups is also conceivable. The respective group of optical channels can be designed in such a way that a plurality of differently positioned and/or oriented object areas can be detected, with each group detecting a sub-area of the respective object area. A partial or complete overlap of the respective partial areas of a captured object area can result, for example, in improved (depth) resolution.

The sub-elements 42a-d can furthermore be designed in such a way that they can deflect the optical axes 37a-d assigned to the individual optical channels 24a-d in a non-lateral direction. The non-lateral courses can be achieved, as described, by different angles of the individual sub-elements, but also by channel-wise different lateral offsets of the respective image areas and their associated optics, as in the previous solutions. The deflection can take place individually for each individual optical axis 37a-d, for example, but also for individual groups of optical channels or axes. Individual deflection of individual optical axes can be achieved, for example, if the sub-elements 42a-d have a different inclination with respect to the edge side 18a. It is conceivable that the sub-elements 42a-d are designed to be tiltable independently of one another about an axis of rotation RA running in the y-direction and arranged in the edge side 18a. Individual tilting of the individual sub-elements 42a-d about an arbitrarily oriented rotation axis, or about a plurality of differently oriented and/or differently positioned rotation axes, is also possible. The individual sub-elements 42a-d can also be shaped in such a way that their adjustment or tilting can take place automatically, for example mechanically, by a user or by a corresponding control device.

The advantage of this embodiment is that the device 20 can detect a variable object area with an unchanged position and orientation. The simultaneous detection of a plurality of differently positioned object areas is also conceivable. A multi-aperture device with at least two optical channels can also be designed to record depth information of a respective detected object region.

The following FIGS. 3a-b and 4a-f each show top views of optical channels. It should be noted in this regard that the apertures of the optics are shown here, by way of example, as squares with solid lines. An aperture is in each case assigned to an optical channel 24a-i. For the sake of clarity, however, only the optical channels 24a-i are provided with reference symbols in the following figures.

3a shows an example of a schematic top view of a multi-aperture device 21 with a plurality of groups 23a, 23b, 23c and 23d of optical channels 24a-i in a lower image area, which scan different partial areas of the object area in groups. Each group 23a-d of optical channels 24a-i is constructed identically here by way of example. All channels of all groups are arranged along one line, so that first the optical channels 24a-i lying next to one another occupy a first section 23a of the multi-aperture device 21, then the next optical channels 24a-i lying next to one another occupy a second section 23b along the line, and so on.

In an upper image area, FIG. 3a shows a schematic detailed view of a section 23a of the multi-aperture device 21; the image sensor areas are indicated by the dashed lines within the optical channels. The area of a respective image sensor region 25a-i of an optical channel 24a-i can be smaller than the area of the respective optical channel 24a-i. The image sensor areas 25a-i of all sections 23 can be arranged on a common substrate (single chip). As indicated by the different orientations or positions of the image sensor areas 25a-i with respect to the optical centers of their optical channels 24a-i, the optical channels 24a-i in each section have viewing angles that differ from one another, i.e. the optical channels 24a-i are designed to capture mutually different sub-regions of the object region. It is assumed in FIG. 3, by way of example, that the optical centers are arranged centrally with respect to the apertures of the optics, but this could also be implemented differently. As is also indicated by the squares drawn with dashed lines and the numbering of the optical channels, the optical channels 24a-i are arranged in such a way that adjacent sub-areas of the object area, such as (7) and (8) or (8) and (9), overlap (the reference numerals (1) to (9) are shown in this figure and the following figures as natural numbers 1 to 9 enclosed by a circle). The overlapping of the sub-areas enables an evaluation of the correspondences, i.e. the same image content in different sub-areas, in order to be able to infer depth information and to extract image information from the partial images, and thus to compile an overall image of the overall object. For simplicity, the image sensor areas 25a-i are shown with a dimension in the x- and y-directions of 50% of that of the optical channels 24a-i. Alternatively, the dimensions of the image sensor regions 25a-i in the x- and/or y-direction can have any ratio to the dimensions of the optical channels 24a-i. Positions of the image sensor areas can be determined at least partially as a function of the position of the optical center of the optical element within the base area of the respective optical channel.

The schematic detailed view of a section 23a in the upper image area shows nine optical channels 24a-i, each of which includes an image sensor area 25a-i. Based on the viewing direction of the respective optical channel 24a-i, which is defined for example by the connecting line between the optical center and the middle of the image area, the image sensor area 25a-i is shifted within the base area of the optical channel 24a-i, as indicated by the dashed lines. The numbering within the optical channels is only used to illustrate the arrangement of the partial areas and to simplify differentiation between the optical channels. As indicated by the numbering (1) to (9) (the reference numerals in the figure are numbers from 1 to 9 enclosed by circles), the optical channels 24a-i are designed to capture nine partial areas of the object area, each according to the orientation, i.e. viewing direction, of the respective optical channel 24a-i. Alternatively, the object area can also be subdivided into any other number of partial areas. Each sub-area is recorded a number of times corresponding to the number of sections 23a-d, i.e. four times in the example shown.

The four sections 23a-d have, by way of example, an identical sorting sequence of the optical channels 24a-i. In other words, each section 23a-d has a respective optical channel 24a, 24b, 24c, ..., 24i, the optical channels 24a-i each being arranged laterally adjacent in a single-row structure. The four sections 23a-d are likewise each arranged laterally adjacent, so that the total number of optical channels is also arranged laterally adjacent in one row. The arrangement of the optical channels 24a-i is thus single-line, which can also be described as a 1xN form. The line runs parallel to the edge side 18a and to the main sides 14 and 16.

The number of sections 23a-d can result from the superresolution factor to be achieved. To achieve an increase in resolution by the desired superresolution factor, a corresponding number of optical channels can be formed in the x-direction, with the respective channels 24g-1, 24g-2, 24g-3 and 24g-4 viewing essentially the same sub-area of the object area. The image sensor areas 25a in the respective sections 23a-d can be shifted in the x- and/or y-direction with respect to their assigned optical channels 24g-1 to 24g-4, for example by half a pixel, i.e. by a pixel pitch that corresponds to half the extent of a pixel in one direction. For example, the image sensor areas 25a of the sections 23a and 23b can differ by half a pixel in the x-direction with respect to their respectively assigned channels 24a and not differ in the y-direction; the image sensor area 25a of the section 23c can differ by half a pixel in the y-direction from the image sensor area 25a of the section 23a; and the image sensor area 25a of the section 23d can differ by half a pixel both in the x- and in the y-direction compared to the image sensor area 25a of the section 23a. The number of sections 23 can thus also be expressed as the product of the superresolution factors in the x- and y-directions, and these integer factors can differ from one another.

The optical channels 24g-1, 24g-2, 24g-3 and/or 24g-4 for capturing an essentially identical sub-area of the object area can have any lateral offset with respect to one another in a direction perpendicular to a line direction, or perpendicular to the direction of a distance X1. If this offset is a fraction, for example 1/4, 1/3 or 1/2, of a distance between two pixels, i.e. partial image areas, it can also be referred to as a subpixel offset. The subpixel offset can be based, for example, on a desired superresolution factor. If, for example, a superresolution factor of 2 is implemented and a sub-area of the object area is recorded twice in the x- and y-directions, the subpixel offset can correspond, for example, to 1/2 the pixel width. The offset can be used, for example, to increase the spatial resolution of the object area. In other words, due to an interleaving of the optical channels, scanning gaps of one optical channel can be captured by an adjacent optical channel. Alternatively, the optical channels 24g-1, 24g-2, 24g-3 or 24g-4 can also be arranged without offset in order to capture an essentially identical sub-area.
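The relation between the superresolution factor and the subpixel offsets can be sketched as follows. This is an illustrative aside, not the patent's method; the function name and the pixel pitch value are assumptions.

```python
def subpixel_offsets(pixel_pitch_um: float, factor_x: int, factor_y: int):
    """Sampling offsets (in micrometers) for channels viewing the same
    sub-area: for superresolution factors Nx and Ny, the channels are
    staggered by pixel_pitch / N in each direction, so that one
    channel's scanning gaps are sampled by its neighbours."""
    step_x = pixel_pitch_um / factor_x
    step_y = pixel_pitch_um / factor_y
    return [(i * step_x, j * step_y)
            for j in range(factor_y) for i in range(factor_x)]

# A factor of 2 in x and y needs four channels (cf. sections 23a-d),
# offset from one another by half a pixel pitch:
offsets = subpixel_offsets(2.0, 2, 2)
```

With factor 2 in both directions, the four resulting offsets (0, 0), (half, 0), (0, half) and (half, half) correspond to the four half-pixel shifts of the image sensor areas 25a described for the sections 23a-d.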

Owing to the subpixel offset of the optical channels 24g-1, 24g-2, 24g-3 and/or 24g-4, which image the same sub-area of the object, an oversampling algorithm (superresolution algorithm) can be used to calculate a high-resolution overall image from a large number of low-resolution microimages, one per optical channel 24g-1, 24g-2, 24g-3 and/or 24g-4. In other words, the center points of the image sensor areas 25g of the optical channels 24g-1, 24g-2, 24g-3 and/or 24g-4 can be arranged shifted so that at least two of the optical channels 24g-1, 24g-2, 24g-3 and/or 24g-4 have different, partially overlapping detection areas, offset by a pixel pitch or a fraction of a pixel pitch, i.e. a (sub)pixel offset. An overlapping area of the detection areas of two optical channels 24g-1, 24g-2, 24g-3 and/or 24g-4 can thus be mapped onto an image detection sensor in an offset manner.

The identical sorting order within the sub-areas 23a-d means that optical channels which cover at least approximately the same sub-area of the object area, such as the optical channels 24g-1, 24g-2, 24g-3 and 24g-4, are at the greatest possible lateral distance from one another along the line structure. As indicated by the distances between the image sensor areas 25a-i along the line structure, optically unused gaps, i.e. intermediate spaces, can be formed between the image sensor areas 25a-i of the optical channels 24a-i. In these intermediate areas, i.e. in the areas between the partial image converters, non-light-sensitive electronic components, such as read-out circuits, analog-to-digital converters (ADCs), amplifiers, etc., can be arranged, for example.

The arrangement of the optical channels in the sections 23a-d is, for example, interleaved and regular, so that a distance X1 is constant for optical channels that cover the same or essentially the same partial area, such as for the optical channels 24g-1 to 24g-4 or 24f-1 and 24f-2.

The distance X1 can be referred to both as a maximum and as an equidistant distance, since it applies to each sub-area 23a-d and each optical channel 24a-i of the respective sub-area.

In other words, optical channels which cover approximately the same sub-areas, offset only by a fraction of the field of view of a pixel, are spaced from one another in the strip-shaped arrangement by the maximum distance X1. From this, a large to maximum disparity can be obtained and, consequently, an improved to optimum depth resolution.
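The link between the distance X1 and the depth resolution follows the standard stereo triangulation relation, which the patent implies but does not state; the following sketch (with hypothetical focal length and distances) illustrates why a larger baseline yields a larger disparity for the same object distance:

```python
def disparity_px(focal_length_px: float, baseline: float, depth: float) -> float:
    """Pinhole stereo relation d = f * B / z: the disparity grows linearly
    with the baseline B (here: the inter-channel distance X1), so a maximal
    X1 gives maximal disparity and thus the finest depth resolution."""
    return focal_length_px * baseline / depth
```

Doubling the baseline from 20 mm to 40 mm, for example, doubles the disparity measured for an object at the same depth.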

Alternative exemplary embodiments are, for example, multi-aperture devices which have a larger number of optical channels. A number of sections 23a-d, in which the optical channels are partially arranged, can be a square of a natural number, such as 2² = 4, 3² = 9 or 4² = 16, according to the superresolution principle. Alternatively, it is also conceivable that a different number of sections is arranged in the multi-aperture device, such as, for example, 2, 3, 5, 7 or 11.

In other words, FIG. 3a shows an imaging system with a small structure in the x direction with optimized extraction of depth information as a result of the largest possible base length X1. The optical channels of the multi-aperture device have a linear arrangement, that is, they are arranged in a row.

FIG. 3b shows, by way of example, for the arrangement of groups or sections of optical channels 24a-i from FIG. 3a, the arrangement of the sub-regions within the object region covered by the optical channels. Such an arrangement is described in DE 10 2009 049387, for example. As mentioned, each of the optical channels 24a-i in each section is designed to capture a different sub-area of the object area, as is indicated by the respectively shifted image sensor areas 25a-i. In other words, each of the optical channels 24a-i has a different viewing direction onto the object region. The sub-areas of adjacent optical channels, such as 24a and 24b, 24e and 24f, or 24d and 24g, partially overlap, that is, adjacent optical channels partially capture the same image content in order to be able to infer an object distance. FIG. 3b merely shows an arrangement of optical channels to illustrate the influence of the different viewing directions of the optical channels; FIG. 3b corresponds to the prior art and serves to illustrate the channel division. The optical channels 24a-i are sorted so that optical channels 24g-1, 24g-2, 24g-3 and 24g-4, which are assigned to an approximately identical object area, are separated from one another within the row, i.e. line structure, by at most the distance X1.

FIG. 4a shows a schematic plan view of the section 23a of the multi-aperture device 21 with the sorting of the optical channels 24a-i as shown in FIG. 3a. Two adjacent optical channels, such as 24g and 24c, 24h and 24b, 24i and 24a, or 24d and 24f, can have a maximum angular distance with respect to the respective position of the image sensor area of the optical channel, for example 180° for the optical channels 24g and 24c. In other words, the viewing directions of two adjacent optical channels are rotated or mirrored by up to 180°. Adjacent optical channels such as 24c and 24h or 24b and 24i are angularly spaced between 90° and 180° from one another.

In other words, adjacent optical channels 24a-i of section 23a are arranged in such a way that they can have a maximum difference in their viewing direction.

The optical channels 24a-i can, as shown in FIG. 4a, be arranged in such a way that centers of the respective optical channels 24a-i, ie the optical centers, are arranged along or on a straight line 17. This means that the distances between the center points of the image sensor areas 25a-i with respect to the line 17 can vary. In other words, the centers of the optical channels 24a-i are collinear.

FIG. 4b shows a schematic plan view of a section 23'a of a multi-aperture device 21'. The sorting order of the optical channels 24a-i along the line 17 is identical to the sorting order of FIG. 4a. In contrast to FIG. 4a, the optical channels 24a-i are offset in the y-direction along the linear arrangement of the line structure in such a way that the center points or centers of the respective image sensor areas 25a-i are arranged collinearly on the line 17.

Alternatively, both the centers of the optical channels 24a-i and the image sensor regions 25a-i can be arranged partially or completely at a distance from the line 17. In the case of square cross-sections of the optical channels 24a-i or the image sensor areas 25a-i, the center points or centers can be determined on the basis of the intersection of the two diagonal lines which each connect two opposite corner points of the square. Alternatively, or in the case of differently shaped optical channels 24a-i or image sensor areas 25a-i, the centers can be determined, for example, on the basis of the geometric centroid or center. Alternatively, a longitudinal center line of an optical channel 24a-i or image sensor area 25a-i can also be used to describe the collinear arrangement or the arrangement spaced apart from the line 17.
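For square cross-sections, the intersection of the two diagonals coincides with the mean of the corner points, so the center determination mentioned above can be sketched as follows (a simplified illustration with an assumed function name; for non-square shapes the geometric centroid mentioned in the text would be used instead):

```python
def center_of(corners):
    """Center of a square cross-section as the mean of its corner points,
    which for a square equals the intersection of the two diagonals."""
    xs = [c[0] for c in corners]
    ys = [c[1] for c in corners]
    return (sum(xs) / len(xs), sum(ys) / len(ys))
```

For a unit square scaled to side length 2, for example, the diagonals from (0, 0) to (2, 2) and from (2, 0) to (0, 2) intersect at (1, 1), which the vertex mean reproduces.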

In other words, the optical channels are arranged in the same sorting order, with the partial imagers centered on the line 17, and an envelope of the active sub-imager surface can therefore have a minimal extent in the y-direction. This makes it possible to achieve the lowest possible height, that is to say a minimal area requirement, for an image sensor shaped, for example, in the form of a strip.

FIGS. 4a and 4b show only a partial view of the multi-aperture device 21 or 21'. Overall, a line consists, possibly depending on a superresolution factor, of several, for example four, cells, i.e. sections 23 or 23', which can be arranged one behind the other and thus in a row. The partial imagers, i.e. image sensor areas 25a-i, are shifted in the x/y direction of the object area by the width of a respective pixel divided by the superresolution factor.

FIG. 4c shows a device 39 with an arrangement of the image sensor regions 25a-i on a substrate 27, as it can result from an arrangement of the optical channels according to FIG. 4b. An arrangement of the center points of the image sensor areas 25a-i along a line 17 can result in a width Y1 of the substrate 27, that is to say in a small sensor surface of the image converter. The arrangement of the image sensor regions 25a-i, which is unchanged in the x-direction in comparison with the arrangement in FIG. 4b, can result in an overall extension in the x-direction, X2. The device 39, consisting of the substrate 27 with the image sensor areas 25a-i, can also be referred to as an imager.

By this arrangement of the center points of the image sensor regions 25a-i, an overall reduction or minimization of the area required by the substrate 27 can be achieved, which can lead to material savings and consequently to a reduction in costs and/or installation space. The fill factor of the image sensor is defined as the ratio of the total area of all the pixels arranged on the image sensor and contributing to the image sensor areas 25a-i to the total area of the image sensor.
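The fill-factor definition in the preceding paragraph translates directly into a one-line computation; the variable names below are illustrative, not from the patent:

```python
def fill_factor(total_pixel_area: float, total_sensor_area: float) -> float:
    """Ratio of the summed area of all pixels contributing to the image
    sensor areas 25a-i to the total area of the image sensor."""
    return total_pixel_area / total_sensor_area
```

A sensor on which half of the substrate area is covered by light-sensitive pixels thus has a fill factor of 0.5; minimizing the substrate width Y1 raises this ratio.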

In other words, the centers of sub-images 25a-i lie on one line, so that an envelope, the total extent of which in the x and y directions can approximately correspond to an extent of the substrate 27 in the x and y directions, has a small, or possibly the minimum extension Y1 perpendicular to the line 17 in the y-direction. This results in a small area requirement of the substrate 27 in the y-direction, on which the partial imagers, ie image sensor areas 25a-i, are arranged, and thus a high area efficiency of the substrate 27 or a high fill factor of the image sensor.

FIG. 4d shows, as a further exemplary embodiment, an imager 39 ′ in which, compared to FIG. 4c, the image sensor regions 25a-i are arranged equidistantly on the substrate 27. The individual image sensor areas have an extension in the x direction X4 with a spacing of X4 '. The latter is measured from a center of a respective image sensor area to a center of the respective laterally closest image sensor area (shown as an example between the image sensor areas 25b and 25i in FIG. 4d). This arrangement can result in an overall expansion in the x direction X3.
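The overall x-extension X3 follows from the width X4 of each image sensor area and the center-to-center spacing X4' of the n equidistant areas; the formula below is implied by, not stated in, the text, and the function name is an assumption:

```python
def overall_extension_x3(n: int, width_x4: float, pitch_x4p: float) -> float:
    """Overall x-extension of n equidistant image sensor areas:
    (n - 1) center-to-center pitches X4' plus one full area width X4."""
    return (n - 1) * pitch_x4p + width_x4
```

For the nine image sensor areas 25a-i with, say, a width of 1 unit and a pitch of 2 units, this would give an overall extension of 17 units.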

FIG. 4e shows a further exemplary embodiment of the imager 39 ″. In contrast to FIG

With a constant expansion in the y-direction Y1, the entire device can be less expanded in the x-direction X5 compared to the imager 39 '(see FIG. 4d).

FIG. 4f shows, by way of example, an imager such as can result from the device 39 'if further image sensor areas are arranged in the spaces between the image sensor areas 25a-i. In this case, there is no need to subdivide it into individual areas. One can then also speak of an elongated and continuous pixel field 41. In other words, the imager can also consist of only one elongated and continuous pixel field. For the sake of completeness, it should also be mentioned that the omission of a subdivision into individual areas can result in the dimensions in the x and y directions X3 and Y1, which are unchanged in comparison with device 39 '.

The advantage of the imagers or image converters shown in FIGS. 4c-f is that by integrating a correspondingly shaped multi-aperture device between the main sides of a device for object detection, a requirement for depth or thickness of the device for object detection can be relaxed. In other words, when the multi-aperture device is accommodated between the main sides of the entire device for image acquisition, the thickness of the device is possibly determined by the extension in the y-direction, Y1.

FIG. 5a shows a perspective view of a device 30. The device 30 comprises the first main side 14, the second main side 16 and the edge side 18a. The device 30 furthermore has the multi-aperture device 22, which comprises the optical channel 24a and the optical channel 24b. The device 30 also has the beam-deflecting element 28, which is arranged by way of example in the edge side 18a. The beam-deflecting element 28 is designed to be rotatable about an axis of rotation RA running along the edge side 18a. Alternatively, the axis of rotation RA can run transversely to the optical axes 32a, b. The beam-deflecting element 28 is tilted by a rigid or variable angle α with respect to the edge side 18a. The angle α is, for example, 45°, but can also assume a different value, such as 50°, 80° or 90°. If the angle α is 45°, for example, the multi-aperture device 22 can capture the object region 12 along the optical axes 32a and 32b. The optical axes 32a and 32b are assigned to the optical channels 24a and 24b, respectively. The object area 12 is arranged parallel to the first main side 14 of the housing. In other words, the beam-deflecting element 28 is designed here to reflectively deflect the optical axes 32a and 32b in a direction facing the first main side 14. The device 30 with the previously described orientation and position of the beam-deflecting element 28 can be referred to as being in a first state. The first state results from the position of the beam-deflecting element 28 in such a way that the first state of the beam-deflecting element 28 also causes the first state of the device 30.

FIG. 5b again shows a perspective view of the device 30. The beam-deflecting element 28 is now tilted in such a way that it assumes the angle α of 135° with respect to the edge side 18a. In other words, the beam-deflecting element 28 in FIG. 5b is tilted by 90° with respect to the beam-deflecting element 28 in FIG. 5a. The beam-deflecting element 28 is designed to enable the detection of a second object region 12' that faces the second main side 16. The object area 12' comprises sub-areas 38'a and 38'b. The configuration of the beam-deflecting element 28 shown in FIG. 5b means that the device 30 is in a second state.

The device 30 can be configured such that it can be switched between the first state and the second state. The switchover can take place, for example, manually by a user or automatically by appropriate control hardware and/or software. The device 30 can further be designed to be used in the first state for video telephony and in the second state for taking photographs or videos. It is advantageous that the multi-aperture device 22 can be used by means of the beam-deflecting element 28 both for detecting the first object area 12 (facing the first main side 14) and for detecting the second object area 12' (facing the second main side 16). Compared to the state of the art, this embodiment stands out due to a reduction in complexity.

In other words, the reflective beam deflection enables two positions: a first position, characterized in that the viewing direction is oriented towards the front, i.e. in the direction of the second main side 16 (beam deflection +90° with respect to the edge side 18a), and a second position, characterized in that the viewing direction is oriented towards the rear, i.e. in the direction of the first main side 14 (beam deflection -90° with respect to the edge side 18a). The advantage of this embodiment is that the different positions can enable the multi-aperture device 22 to take over the function of the first (primary) or the second (secondary) camera module. An imager plane is perpendicular to a screen plane, while the object planes can be parallel to the screen plane. This embodiment can alternatively be referred to as a camera with adaptive beam deflection.
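The two switching states of FIGS. 5a and 5b can be summarized by the tilt angle α of the beam-deflecting element 28; the following sketch (function name and state labels are assumptions, not claim language) captures that mapping:

```python
def deflection_target(alpha_deg: float) -> str:
    """Viewing direction resulting from the tilt angle of the
    beam-deflecting element 28: 45 deg deflects the optical axes towards
    the first main side 14 (FIG. 5a, e.g. video telephony), 135 deg
    towards the second main side 16 (FIG. 5b, e.g. photo capture)."""
    if alpha_deg == 45.0:
        return "first main side 14"
    if alpha_deg == 135.0:
        return "second main side 16"
    return "undefined state"
```

Switching between the two states thus corresponds to tilting the element by 90°, as described for FIG. 5b.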

FIG. 6 shows a perspective view of a device 50 which can be referred to as a modification of the device 20. The device 50 also comprises the first main side 14, the second main side 16 and the edge sides 18a-d, the multi-aperture device 34, which includes the optical channels 24a-d, and the beam-deflecting element 40. In the device 50, the beam-deflecting element 40 is arranged parallel to the edge side 18a.

Optical axes 35a, 35b, 35c and 35d are now assigned to the optical channels 24a-d. These have an approximately parallel course between the multi-aperture device 34 and the beam-deflecting element 40.

The sub-elements 42a-d are now shaped so that a first group (35a and 35c) of the optical axes 35a-d is deflected in the direction of the first main side 14 and a second group (35b and 35d) of the optical axes 35a-d is deflected in the direction of the second main side 16.

An opening 46 is arranged in the first main side 14 such that the optical channels 24a and 24c can detect the object region 12 along the first group (35a and 35c) of optical axes through the main side 14. In the second main side 16, an opening 46' is furthermore arranged in such a way that the optical channels 24b and 24d can detect the object region 12' along the second group (35b and 35d) of optical axes through the main side 16. The openings 46 and 46' can be, for example, a diaphragm, a viewing opening, a window, etc.

The device 50 can detect two different, i.e. differently arranged or positioned, object areas (12 and 12') simultaneously. The interleaved or alternating arrangement of the individual optical channels, as described with reference to, for example, FIG. 4a or 4b, can also allow the recording of depth information of the respective object area. It is not imperative here for channels that view the object area 12 to be arranged in alternation with channels that scan the object area 12'. It is also conceivable that groups of channels that scan the same object area, i.e. 12 or 12', use the same deflection element. For example, 100 channels can be formed in order to capture the object area 12, followed by,

It is also conceivable to shape the beam-deflecting element 40 in such a way that the position of the respective sub-elements 42a-d can be changed individually. For this purpose, the individual elements 42a-d can, for example, each be designed to be tiltable at any desired angle along a common or respective axis of rotation with respect to the edge side 18a. In other words, a cell-by-cell adaptation or modification of the beam-deflecting element 40 can be made possible.

Alternatively, the beam-deflecting element 40 can also be a mirror which has a non-planar shape that is discontinuous or faceted over the extension of the entire imager. This can be different for each channel of a multi-aperture camera. Alternatively, it is also conceivable that the mirror has a non-plane shape that is continuous over the extension of the entire imager.

The device 50 can be, for example, a camera with beam deflection, in which some (optical) channels are aligned to an object plane I and others to an object plane II by means of a beam-deflecting element (e.g. a mirror, plane or curved/freeform). The imager plane is, for example, perpendicular to a screen plane, while the object planes I and II are parallel to the screen plane. In other words, the object plane I is recorded along a main viewing direction of a sub-camera I and the object plane II along a main viewing direction of a sub-camera II.

FIG. 7 shows a perspective view of a device 60 that may be referred to as a modification of the device 10. The device 60 also includes the first main side 14, the second main side 16 and the edge sides 18a-d as well as the multi-aperture device 22, which includes the optical channels 24a and 24b. The optical axes 32a and 32b are assigned to the optical channels 24a and 24b.

The device 60 is shaped to detect an object region 54. The object region 54 is arranged parallel to the edge side 18a and has an object 55 that can be detected by the multi-aperture device 22 along the optical axes 32a and 32b running laterally (in the z-direction). For this purpose, the optical axes 32a and 32b pass through an opening 48 through the edge side 18a. Their course can be described as parallel in sections.

The device 60 also has a screen 52, which is arranged, for example, in the first main side 14. The screen 52 has an extension in the y and z directions and can have a surface extension that is at least half as large as the first main side 14. An area that is smaller than, the same size as or greater than the area of the first main side is also conceivable. The screen 52 can be configured to display an image. The object 55 can, for example, be recorded and provided to a user as an image 55'. This image 55' of the object 55 is shown by way of example on the screen 52.

The device 60 is designed to detect an object area facing away from the two main sides 14 and 16. In other words, an object region arranged parallel to the edge side 18a can be detected. In this case, one can also speak of an imager plane (cf. 31 in FIG. 1) which is arranged perpendicular to a screen plane.

In other words, with the device 60, a camera can capture an object region in a straight line view (without deflection) along a main viewing direction.

Since the object 55 is detected with at least two optical channels 24a and 24b, the image 55 'can also have depth information.

The device 60 can be, for example, a multi-aperture camera with a division of the field of view in a linear design in a smartphone. It is advantageous that the multi-aperture camera can be accommodated on the front of the smartphone. It is also conceivable that the multi-aperture device 22 has an imager which, as shown in FIG. 4f, consists only of an elongated and continuous pixel field.

The advantage of this embodiment is that the camera is formed from a narrow strip. This enables the smartphone to be thinner.

FIG. 8 shows a perspective view of a device 70 which can be referred to as a modification of the device 60. At the position of the multi-aperture device 22 (see FIG. 7), a multi-aperture device 56 is now arranged, which has a flash device 58 in addition to the optical channels 24a and 24b. This is arranged, for example, between the optical channels 24a and 24b and is designed to illuminate an object region 62. The object region 62 is arranged parallel to the second main side 16 and has the object 55.

The arrangement of the flash device 58 in the multi-aperture device 56 can result in a relaxation of design requirements with regard to the depth extension of the flash device. It is also advantageous that a corresponding control and/or linking of the flash device to the other elements of the multi-aperture device 56 can easily take place owing to the spatial proximity. The flash device 58 can be designed, for example, with one or more light-emitting diodes (LEDs), but other embodiments are also conceivable. Alternatively, the multi-aperture device 56 and thus the flash device 58 can be integrated on an end face of the device 70.

A beam-deflecting element 64 is now arranged at the position of the opening 48 (see FIG. 7). The beam-deflecting element 64 is designed to deflect the optical axes 32a and 32b from a lateral course (in the z-direction) to a non-lateral course (in the x-direction), so that the object region 62, and therefore the object 55 arranged therein, can be detected by the optical channels 24a and 24b.

The beam-deflecting element 64 is also designed to deflect electromagnetic waves (light) emitted by the flash device 58 in the direction of the second main side 16. The object area 62 can therefore be illuminated. In other words, the use of beam-deflecting elements (mirrors) can also be made possible for the flash device 58. The flash device 58 can furthermore be designed for use with a reflective and/or refractive beam-deflecting element.

The beam-deflecting element 64 can be rigid, so that the deflection of the optical axes 32a, b and of the light emitted by the flash device 58 takes place in an unchangeable manner. However, it is also conceivable that the beam-deflecting element 64 is designed to be changeable. If the beam-deflecting element 64 is, for example, a mirror or any surface reflecting electromagnetic waves, then it can be mounted rotatably about an axis of rotation, for example. A change in the position of the beam-deflecting element 64 could be made manually or in a controlled and/or automated manner by a corresponding control device or software.

Thus, for example, a beam deflection can be made possible by means of a simple, continuous plane mirror. The mirror plane can be inclined by 45° to the screen plane. This configuration can be described as a deflection of the main viewing direction towards the second main side.

The configuration of multi-aperture device, flash device and beam-deflecting element described here can result in further advantageous embodiments of the device for object detection.

As a further exemplary embodiment, FIG. 9a shows a device 80, which can be derived from the device 30, in a first state. The first state results from the position of the beam-deflecting element 28 in such a way that the first state of the beam-deflecting element 28 also causes the first state of the device 80. The first state can enable the detection of the object region 62 facing the second main side 16.

The beam-deflecting element 28 also enables the deflection of the optical axes 32a and 32b towards optical axes 32'a and 32'b and the detection of a second object region 62 'facing the first main side 14. This configuration can be referred to as the second state.

The device 80 furthermore has a screen 52 which, as in the case of the device 60 (see FIG. 7), can be shaped to display an image 55 ′ of a detected object 55.

FIG. 9b shows the device 80 in a second state. The optical axes 32a and 32b are now mapped onto the optical axes 32'a and 32'b. In this configuration, the device 80 can be designed to detect the object region 62'. The object 55 to be detected is therefore arranged, for example, in the object region 62'.

As a further exemplary embodiment, FIG. 10 shows a device 90 which can be referred to as a modification of the device 80. The beam-deflecting element 28 is now designed to enable detection of three different object areas 72, 72' and 72" along the optical axes 32a and 32b, 32'a and 32'b, and 32"a and 32"b. In other words, the axes 32a and 32b can be mapped onto the optical axes 32'a and 32'b or 32"a and 32"b by means of the beam-deflecting element 28. Alternatively, for example, the optical axes 32'a and 32'b are mapped onto the optical axes 32a and 32b or the optical axes 32"a and 32"b, or the optical axes 32"a and 32"b onto the optical axes 32'a and 32'b, etc.

It is possible to distinguish between three positions: a first position, characterized in that the viewing direction is deflected forwards (deflection +90°); a second position, characterized in that the viewing direction is deflected laterally or not at all (no deflection, i.e. 0°); and a third position, characterized in that the deflection takes place to the rear (deflection -90°). It is therefore possible for three different object planes, which can be positioned and/or oriented differently in space, to be detected with a single multi-aperture device 22. Here, too, it is conceivable that the object area 72, 72' or 72" or a possibly

In other words, an (additional) beam deflection can take place by means of fixed / movable reflective components. Two, three or more positions are possible here. The deflection can take place, for example, on a plane mirror or a mirror that is continuous over all cells, but also via a mirror that is adapted for each cell.

Claims

1. Device (10; 20; 30; 50; 60; 70; 80; 90) for detecting an object area (12; 12'; 54; 62; 62'; 72; 72'; 72"), with the following features:

a flat housing with a first main side (14), a second main side (16) and an edge side (18a); and

a multi-aperture device (22; 34; 21; 21'; 39; 39'; 39"; 56) with a plurality of optical channels (24a; 24b) which are arranged laterally adjacent to one another and face the edge side (18a), wherein each optical channel (24a; 24b) is designed to detect a partial area (26a; 26b; 26'a; 26'b) of the object area (12; 12'; 54; 62; 62'; 72; 72'; 72") through the edge side (18a) or along an optical axis (32a; 32b) of the respective optical channel (24a; 24b) that is deflected between a lateral course inside the housing and a non-lateral course outside the housing, wherein the partial areas (26a; 26b; 26'a; 26'b) of the optical channels (24a; 24b) together cover the object area (12; 12'; 54; 62; 62'; 72; 72').

2. Apparatus according to claim 1, wherein the plurality of optical channels (24a-i) forms a one-dimensional array, while the partial areas of the object area cover a two-dimensional array.

3. Apparatus according to claim 1 or 2, wherein the plurality of optical channels (24a-d) comprises a first group (24a; 24c) of optical channels (24a-d) and a second group (24b; 24d) of optical channels (24a-d), wherein the first group (24a; 24c) of optical channels (24a-d) detects a first partial area (38a; 44a) of the object area (12; 12') and the second group (24b; 24d) of optical channels (24a-d) detects a second partial area (38b; 44b) of the object area (12; 12').

4. Device according to claim 3, wherein the first partial area (38a; 44a) and the second partial area (38b; 44b) at least partially overlap.

5. Apparatus according to claim 3 or 4, wherein a number of the optical channels (24a-i) of the first group is equal to a number of the optical channels (24a-i) of the second group.

6. Device according to one of claims 3-5, wherein centers of pixel arrays of image sensor areas (25a-i) of the optical channels (24a-i) of the first group are laterally displaced relative to one another, with respect to centers of associated imaging optics (29a; 29b) of the optical channels (24a-i) of the first group, by a fraction of a pixel pitch, so that the first sub-area (38a; 44a) is scanned, laterally displaced by a subpixel offset, by at least two of the optical channels (24a-i) of the first group.

7. Device according to one of claims 3-6, wherein the optical channels (24a-i) of the first and second groups are arranged in an interleaved manner in a single-row structure.

8. Device according to one of claims 3-7, wherein optical centers of optics (29a; 29b) of the optical channels (24a-i) of the first and second groups lie on a first line (17), and centers of image sensor areas (25a-i) of the optical channels (24a-i) of the first and second groups are offset, in an image plane (31) in which the image sensor areas of the first and second groups of channels lie, relative to a projection of the optical centers onto a second line in the image plane (31).

9. Device according to one of claims 3-7, wherein the centers of image sensor areas (25a-i) of the optical channels (24a-i) of the first and second groups lie on a first line, and optical centers of optics (29a; 29b) of the optical channels (24a-i) of the first and second groups are offset, in an optics plane (33) in which the optics of the optical channels (24a-i) of the first and second groups lie, relative to a projection of the centers of the image sensor areas (25a-i) onto a second line in the optics plane (33).

10. Device according to one of claims 1-9, wherein the device has a beam-deflecting element (28; 40; 64) which is designed to deflect the optical axes (32a; 32b; 32'a; 32'b; 32"a; 32"b; 37a-d; 37'a-d; 35a-d) of the plurality of optical channels (24a-i) in a first state in a direction facing the first main side (14), and in a second state in a direction facing the second main side (16), wherein the

beam-deflecting element (28; 40; 64) can be switched between the first state and the second state.

11. Device according to claim 10, wherein the beam-deflecting element (28; 40; 64) comprises:

a rigid body with a reflective surface which is rotatably mounted about an axis of rotation (RA) running transversely to the optical axes (32a; 32b; 32'a; 32'b; 32"a; 32"b; 37a-d; 37'a-d; 35a-d) of the plurality of optical channels (24a-i) or along the edge side (18a), in order to switch between the first state and the second state.

12. Device according to one of the preceding claims, wherein the multi-aperture device (22; 34; 21; 21'; 39; 39'; 39"; 56) further comprises a further plurality of optical channels (24a-i), each of which is designed to capture a respective sub-area (26a; 26b; 26'a; 26'b; 38a; 38b; 44a; 44b; 38'a; 38'b) of a further object area (12; 12'; 54; 62; 62'; 72; 72'; 72") along a further optical axis (32a; 32b; 32'a; 32'b; 32"a; 32"b; 37a-d; 37'a-d; 35a-d) that is deflected between a lateral course and a non-lateral course.

13. Device according to one of the preceding claims, wherein the first main side (14) has a screen (52).

14. Device according to one of the preceding claims, wherein the device is a mobile phone, a computer screen or a TV set.

15. A method for detecting an object region with a device according to one of the preceding claims.

Documents

Application Documents

# Name Date
1 202019041813-STATEMENT OF UNDERTAKING (FORM 3) [25-09-2020(online)].pdf 2020-09-25
2 202019041813-REQUEST FOR EXAMINATION (FORM-18) [25-09-2020(online)].pdf 2020-09-25
3 202019041813-FORM 18 [25-09-2020(online)].pdf 2020-09-25
4 202019041813-FORM 1 [25-09-2020(online)].pdf 2020-09-25
5 202019041813-DRAWINGS [25-09-2020(online)].pdf 2020-09-25
6 202019041813-DECLARATION OF INVENTORSHIP (FORM 5) [25-09-2020(online)].pdf 2020-09-25
7 202019041813-COMPLETE SPECIFICATION [25-09-2020(online)].pdf 2020-09-25
8 202019041813-Response to office action [04-11-2020(online)].pdf 2020-11-04
9 202019041813-Annexure [04-11-2020(online)].pdf 2020-11-04
10 202019041813-FORM-26 [08-12-2020(online)].pdf 2020-12-08
11 202019041813-RELEVANT DOCUMENTS [28-01-2021(online)].pdf 2021-01-28
12 202019041813-Proof of Right [28-01-2021(online)].pdf 2021-01-28
13 202019041813-FORM 13 [28-01-2021(online)].pdf 2021-01-28
14 202019041813-FORM 3 [15-02-2021(online)].pdf 2021-02-15
15 202019041813-FER.pdf 2022-04-06
16 202019041813-AbandonedLetter.pdf 2024-02-19

Search Strategy

1 Search202019041813E_05-04-2022.pdf