Abstract: The invention relates to an autostereoscopic display for simultaneously displaying more than two different images, comprising a pixel matrix (11) having a multitude of pixels (15) distributed over different subsets; an optical element (12) which is arranged in front of or behind the pixel matrix (11), which has a grid-like structure and imposes a respective defined propagation direction on light emanating or transmitted from the pixels (15) so that a plurality of viewing zones (16) laterally offset relative to one another is defined, so that each of the viewing zones (16) is associated with exactly one of the subsets and so that the light emanating or transmitted from each of the subsets of pixels (15) is directed into the viewing zone (16) associated with this subset; and a control unit (13) for controlling the pixel matrix (11) in dependence on image data (14) defining a 3D image. In this respect the control unit (13) is configured to carry out the respective following steps, for each of a plurality of strips of pixels (15), for controlling the pixel matrix (11) for an autostereoscopic viewing of the 3D image from a viewing distance (D) differing from the nominal spacing (Dn) in front of the display: determining a value of a location coordinate (x); determining intensity values which are defined by the image data (14) for an image strip corresponding to this strip of a view of the 3D image which corresponds to a direction of gaze from a position defined by the named value; and controlling the pixels (15) of this strip using the intensity values determined in this manner.
Autostereoscopic display and
Method of displaying a 3D image
The invention relates to an autostereoscopic display for simultaneously displaying more than two different images in accordance with the preamble of the main claim and to a method of displaying a 3D image in accordance with the preamble of the independent claim which can be carried out using such a display.
A generic display includes a pixel matrix having a multitude of pixels which are arranged in different rows, wherein a plurality of more than two disjoint subsets of pixels on the pixel matrix are defined such that each of the subsets forms a band of parallel strips which include a non-zero angle with the rows, wherein the strips belonging to the different subsets are interleaved such that strips and/or pixels of the different subsets alternate cyclically in the row direction. In addition, such a display comprises an optical element which is arranged in front of or behind the pixel matrix, which has a grid-like structure orientated parallel to the strips and imposes, for each of the pixels, a defined propagation direction on light emanating or transmitted from the respective pixel such that, at a nominal distance in front of the display predefined by a geometry of the display, a number, corresponding to the named plurality, of viewing zones, which are laterally offset relative to one another and of which each is associated with exactly one of the subsets, is defined such that the light emanating or transmitted from each of the subsets of pixels is directed into the viewing zone associated with this subset.
Displays of this kind are known per se as so-called multiview displays. In the intended use of these displays known from the prior art, a respective one of a number, corresponding to the named plurality, of stereoscopic half-images is displayed on the named subsets of pixels, and a respective two of said half-images which are displayed on subsets having directly adjacent strips combine pairwise to form a stereoscopic image. In this manner, not only a single viewer, but also several viewers positioned next to one another in front of the display can each autostereoscopically perceive an image of the same scene which appears three-dimensional. In addition, a viewer can move in a lateral direction in front of the display without losing the three-dimensional impression. He will rather see the same scene from a perspective changing according to his movement.
It is, however, disadvantageous in this respect that the viewer or each of the viewers can only see a 3D image of satisfactory quality if his eyes maintain the nominal distance from the display predefined by the geometry of the display. Otherwise each eye of the viewer namely sees portions of different regions of the display and, in part, overlaps of different half-images. Please note that the terms nominal distance and nominal spacing are used as synonyms in the present application and denote a particular viewing distance which is given by the geometry of the display and can be regarded as the nominal viewing distance.
It is the underlying object of the invention to develop an autostereoscopic display on which a respective image of three-dimensional effect of a displayed scene can be seen from distances which should be as freely selectable as possible, wherein it should be possible, as in the described prior art, that several viewers simultaneously look at the display and there each see an image of three-dimensional effect of the scene and that a viewer moves laterally without losing the three-dimensional impression. It is furthermore the object of the invention to propose a corresponding method of displaying 3D images on an autostereoscopic display which satisfies these demands.
This object is satisfied in accordance with the invention by an autostereoscopic display having the features of the first claim as well as by a method having the features of the other independent claim. Advantageous embodiments result from the features of the dependent claims.
The proposed display therefore has a control unit for controlling the pixel matrix in dependence on image data which define a 3D image. This control unit is configured to carry out the respective following steps, for each of the strips of pixels, for controlling the pixel matrix for an autostereoscopic viewing of the 3D image from a viewing distance in front of the screen differing from the nominal distance:
- determining a value of a location coordinate which describes a lateral position of locations on a line orientated in the row direction and disposed at a defined height in the viewing distance in front of the display, wherein the value is determined for the location at which light emanating or transmitted from the pixels of this strip - more precisely from the centers of area of the respective pixels - is incident on the named line with the propagation direction imposed by the optical element;
- determining intensity values which are defined by the image data for an image strip corresponding to this strip of a view of the 3D image which corresponds to a direction of gaze from a position defined by the named value;
- controlling the pixels of this strip using the intensity values determined in this manner.
The determination of the respective value of the location coordinate in dependence on the viewing distance is possible in this respect by a simple arithmetic operation and merely represents an application of projective geometry.
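The arithmetic operation mentioned can be sketched, under a simple pinhole/slit model of the grid-like structure, as follows. All names and the choice of measuring D from the optical element are illustrative assumptions, not taken from the source:

```python
def viewing_plane_x(x_pixel, x_grid, a, D):
    """Project a ray from a pixel's center of area through the optical
    element onto the line at viewing distance D (pinhole/slit model).

    x_pixel: lateral position of the pixel's center of area
    x_grid:  lateral position of the grid structure the light passes
    a:       effective distance between pixel matrix and optical element
    D:       viewing distance, measured here from the optical element
    """
    # Similar triangles: past the grid the ray keeps the slope it
    # acquired between pixel and grid structure.
    return x_grid + (x_grid - x_pixel) * D / a
```

For each strip, evaluating this function for the strip's pixels yields the value of the location coordinate x used in the steps above.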
Various coordinate systems can be used as the basis for the definition of the location coordinate. In an expedient coordinate system having an x axis orientated in the row direction, a vertical y axis and a z axis orientated in the direction of a normal of the display plane, the named location coordinate can simply be selected as the coordinate x. In this coordinate system, the viewing distance can be represented as z and a height by the coordinate y. Instead, however, any parameterization, preferably a continuous parameterization, of the named line can be used as the location coordinate.
The named line is typically a section of a horizontal straight line orientated parallel to the display. It is, however, also possible that the line is slanted or curved. In this case, let the named distance designate a distance between the display and a defined point, typically a central point, on the line. In addition, let the line be defined as limited in its length - that is, only covering a space of a correspondingly limited width in front of the display - such that the value of the location coordinate determined in the described manner is unambiguous. That is, light rays should not be taken into consideration which are conducted through the optical element such that they are led not through one of the viewing zones typically lying centrally in front of the display, but rather through so-called secondary zones.
Each value of the location coordinate corresponds to a viewing position on the named line which is here simply called a position. The respective view is defined as the two-dimensional view of the 3D image or of the scene displayed by the 3D image which results from this viewing position or from the direction of gaze thereby predefined. An actual - or virtual - camera position can be associated with each value of the named location coordinate. The view which corresponds to a direction of gaze from the position defined by the respective value of the location coordinate means a view which results or would result from a shot of the named scene taken from the camera position associated with this value of the location coordinate.
The intensity values which are spoken of here can also be called brightness values or control values for the individual pixels. They therefore represent image information on the individual picture elements of the respective view to be displayed by the pixels. In the case of multicolor pixels, they can additionally contain color information or, in the case of a pixel matrix having pixels of different elementary colors, can depend on a color of the respective pixel - then usually called a subpixel. At least some of the views - or, more precisely, the image strips of the views belonging to the respective strips of pixels, and thus at least parts of the views - have to be calculated in dependence on the named image data to determine the required intensity values. These image data admittedly define the 3D image, but do not contain a priori all the image information of all possible views. The views are rather only defined indirectly by the image data and are calculated as required - that is, depending on the values of the location coordinate determined for the different strips. In this respect, various processes known per se can be considered, of which some will be outlined further below.
It is achieved by the proposed measures that a viewer who looks at the correspondingly controlled display from the named viewing distance will also see an image of three-dimensional effect of good image quality when the viewing distance differs from the nominal distance. It is in particular achieved by the described control of the pixel matrix that the viewer sees two mutually complementary stereoscopic half-images which combine into a stereoscopic image at least in a very good approximation despite the viewing distance actually not matching the geometry of the display, with it simultaneously being avoided that conspicuous and disturbing irregularities or jumps occur on a lateral movement. The latter could not be avoided if, instead of the proposed determination of the intensity values for the different strips from correspondingly defined views, ideally defined in each case, only the intensity values of a plurality of stereoscopic half-images defined in an unchanged manner were redistributed between the pixels in response to the changed viewing distance. The display can therefore be used for completely different viewing distances. An adaptation of the display itself to the viewing distance, which under certain circumstances may be predefined by a specific use - e.g. by the positioning in a room of predefined size and division - is not necessary in this respect.
The term "stereoscopic half-images" in the present document should designate respective views of a scene of which two combine into one stereoscopic image of this scene in that they correspond to views from - actual or virtual - camera positions or eye positions which are laterally offset from one another by approximately an average distance between eyes. In the case of a band of more than two views having these properties, the individual views should therefore also be called stereoscopic half-images.
The aforementioned step of controlling the pixels using the determined intensity values can be done by controlling, for each strip, the pixels of the respective strip such that the image strip which corresponds to this strip of the view of the 3D image corresponding to the direction of gaze from the position defined by the named value of the location coordinate is reproduced by the pixels of this strip of pixels, wherein the image strip corresponding to this strip is a strip-shaped extract of said view having, in the complete view, an orientation and position corresponding to the orientation and position of the strip of pixels in the pixel matrix.
The display described can be a simple multiview display which is merely equipped with a special control unit or an especially programmed control unit so that, in addition to the nominal distance, other viewing distances are also possible, at least within certain limits, which are freely selectable. The viewing zones in typical embodiments are expediently dimensioned so that their lateral distance in each case approximately corresponds to an average distance between the eyes - e.g. 65 mm. The named plurality can e.g. be 9 or even more. The pixel matrix can be provided e.g. by an LCD or by an OLED display. The optical element can in particular be a parallax barrier or a lenticular screen. A combination of these screen types is also possible. In the case of a lenticular screen, the grid-like structure is typically formed by a group of parallel cylindrical lenses. Barrier screens, in particular slot screens, can be used as the parallax barrier. Finally, the optical element can also be a Fresnel structure or an LC structure which reproduces a slot screen or another screen type. The pixels can be multicolor pixels or subpixels of different elementary colors - e.g. red, green and blue. In the last named case, typically three respective pixels or subpixels from three mutually following rows will combine to form one color-neutral or true-color picture element.
The described configuration of the control unit of a corresponding display is particularly expedient if each of the strips contains at most one pixel from each of the rows of the pixel matrix, that is has a width of only one pixel. In that case there is namely no possibility of adapting the control to the changed viewing distance by a lateral displacement of centers of brightness within the individual strips.
An advantageous method is also proposed for displaying a 3D image on an autostereoscopic display of the described type which achieves the object set. This method is a particular use of a display having a pixel matrix and an optical element arranged in front of or behind the pixel matrix, wherein the pixel matrix has a multitude of pixels which are arranged in different rows, wherein a plurality of more than two disjoint subsets of pixels on the pixel matrix are defined such that each of the subsets forms a band of parallel strips which include a non-zero angle with the rows, wherein the strips of the different subsets alternate cyclically in the row direction and wherein preferably each of the strips contains at most one pixel from each of the rows of the pixel matrix, and wherein the optical element has a grid-like structure orientated parallel to the strips and imposes, for each of the pixels, a defined propagation direction on light emanating or transmitted from the respective pixel such that, at a nominal spacing from the display predefined by a geometry of the display, a number, corresponding to the named plurality, of viewing zones which are laterally offset relative to one another are defined such that each of the viewing zones is associated with exactly one of the subsets and such that the light emanating or transmitted from each of the subsets of pixels is directed into the viewing zone associated with this subset.
The pixel matrix is controlled in the method, in dependence on image data which define a 3D image, for an autostereoscopic viewing of the 3D image from a viewing distance in front of the display which differs from the nominal distance. For this purpose, the method includes the respective following steps for each of the strips of pixels:
- determining a value of a location coordinate which describes a lateral position of locations on a line orientated in the row direction and disposed at a defined height in the viewing distance in front of the display, wherein the value is respectively determined for the location at which light emanating or transmitted from the pixels of this strip is incident on the named line with the propagation direction imposed by the optical element;
- determining intensity values which are defined by the image data for an image strip corresponding to this strip of a view of the 3D image which corresponds to a direction of gaze from a position defined by the named value;
- controlling the pixels of this strip using the intensity values determined in this manner.
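The three steps above can be sketched as a simple per-strip control loop. The helper names `project` and `render_view` stand in for the projection and image-synthesis steps and are hypothetical, not taken from the source:

```python
def control_strips(strip_ids, D, project, render_view):
    """For each strip: (1) determine the location-coordinate value x at
    which the strip's light meets the line at viewing distance D,
    (2) determine the intensity values of the matching image strip of
    the view seen from x, (3) collect them to drive the strip's pixels.

    project(strip, D) -> x         (hypothetical projection step)
    render_view(x, strip) -> list  (hypothetical image-synthesis step)
    """
    frame = {}
    for s in strip_ids:
        x = project(s, D)               # step 1: location coordinate
        frame[s] = render_view(x, s)    # step 2: intensity values
    return frame                        # step 3: values written to pixels
```

The same loop serves for the display's control unit and for the method, since both carry out the identical three steps per strip.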
What was said on the display above applies accordingly to these method steps. So that an image quality which is as good as possible also results at the viewing distance different from the nominal distance, the location coordinate can be determined - on a scale which is as finely graduated as possible or even has no graduations - in each case so exactly that it adopts a number of different values for the different strips which is larger than the named plurality. The control unit of the proposed display can accordingly be configured to determine the location coordinate, on a scale which is as finely graduated as possible or even has no graduations, so exactly that it adopts a number of different values for the different strips which is larger than the named plurality. The location coordinate namely adopts a number, corresponding to the named plurality, of possible values for those locations whose lateral positions correspond to the previously named viewing zones or which are disposed, viewed from the display, exactly in front of or behind these viewing zones. In the present case, the location coordinate should therefore additionally also adopt or be able to adopt intermediate values between these discrete values. To keep the calculation effort within limits, it can, however, be advantageous if in each case only a limited number of discrete intermediate values is permitted and if the location coordinate is rounded up or down to the respective next closest permitted value or intermediate value. It can thus be achieved that the number of the views required in total, or more exactly of the views of which at least individual image strips are required or may be required, remains manageable.
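The rounding to a limited set of permitted intermediate values can be sketched as follows. The uniform grid of permitted values, with a fixed number of intermediate steps between neighbouring viewing-zone positions, and all names are illustrative assumptions:

```python
def quantize_location(x, zone_pitch, n_intermediate):
    """Round the location coordinate x to the nearest permitted value.

    Permitted values lie on a grid with n_intermediate steps per
    viewing-zone spacing (zone_pitch), so the number of distinct
    views, and hence the calculation effort, stays bounded.
    """
    step = zone_pitch / n_intermediate
    # round up or down to the closest permitted value or intermediate value
    return round(x / step) * step
```

With e.g. `zone_pitch = 65.0` (mm) and `n_intermediate = 5`, at most five distinct views per zone spacing ever need to be synthesized.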
The named views are typically defined so that, for a number, corresponding to the named plurality, of values, they correspond to a number of stereoscopic half-images corresponding to this plurality, of which in each case two, which correspond to mutually closest values from this number of values, combine into one stereoscopic image. They are the half-images which are displayed on the named subsets on a conventional control of the display or when the viewing distance corresponds to the nominal distance. In the present case of a viewing distance differing from the nominal distance, at least one of the views has to be selected which corresponds to an intermediate value of the location coordinate and thus to a direction of gaze disposed between the directions of gaze of two such stereoscopic half-images.
The location coordinate is determined in an expedient embodiment of the method such that, for a number, corresponding to the named plurality, of directly adjacent strips which extend centrally over the pixel matrix, it adopts the number of values named in the previous paragraph, whereas it adopts intermediate values for at least some of the strips disposed further outwardly. The control unit can accordingly be configured to determine the location coordinate in this manner. The control of the pixel matrix at the image center then does not differ, or only differs insignificantly, from the control provided for the nominal distance.
Any desired rendering processes known per se can be used to determine the required intensity values for the different views. It is advantageously sufficient in each case in this respect if, for each of the views, the intensity values are only determined for the image strip or image strips for which the value of the location coordinate corresponding to this view has been determined. The required computation power therefore remains within limits, which also makes the method usable for image sequences which are not defined in advance, but are defined in real time - for example in computer games or in the presentation of live shots which are filmed with stereoscopic cameras.
One possibility is that the intensity values are determined for the different views in that image information for the required image strip or strips of the respective view is determined from a depth map - or from several depth maps - defined by the named image data and from texture values defined by the image data for area segments of a surface represented by the depth map. Details on such a process for acquiring image information of a view not present in advance can be seen e.g. from document DE 10 2010 028 668 A1.
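A minimal sketch of this depth-map-based synthesis for a single image row is given below. The linear shift model, the forward mapping and all names are illustrative assumptions, not the process of the cited document:

```python
def synthesize_strip(depth_row, texture_row, x_view, scale):
    """Sketch of depth-image-based rendering for one image strip:
    each texture sample is shifted laterally in proportion to its
    depth value and to the viewing position x_view.

    depth_row:   per-sample depth values (larger = nearer here)
    texture_row: per-sample texture (intensity) values
    x_view:      location-coordinate value of the desired view
    scale:       assumed geometry factor converting depth to shift
    """
    width = len(texture_row)
    out = [0.0] * width
    for i, (z, t) in enumerate(zip(depth_row, texture_row)):
        # nearer surface segments shift more with the viewpoint
        j = i + int(round(scale * x_view * z))
        if 0 <= j < width:
            out[j] = t  # simple forward mapping; occlusions ignored
    return out
```

A real implementation would additionally handle occlusions and fill disocclusion holes, which this sketch deliberately omits.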
In another embodiment, the intensity values for the different views can be determined in that disparities between at least two stereoscopic half-images, which are defined by the image data, are determined and image information is determined for the required image strip or strips in that the view is defined as an intermediate image between the named stereoscopic half-images, in dependence on the disparities and on the respective value of the named location coordinate, by interpolation and/or by transformation. Instructions on how this can be done can be found e.g. in the document US 6,366,281 B1. Such processes are also called "morphing".
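One row of such an intermediate image might be computed as sketched below. The parameter `alpha` plays the role of the location coordinate normalised between the two camera positions; the simple forward warp and all names are illustrative assumptions, not the procedure of the cited document:

```python
def morph_row(left_row, right_row, disparity_row, alpha):
    """Disparity-guided interpolation ("morphing") of one image row
    of an intermediate view between two stereoscopic half-images.

    alpha in [0, 1]: 0 reproduces the left view, 1 the right view.
    """
    width = len(left_row)
    out = [None] * width
    for i, d in enumerate(disparity_row):
        # move the left-image sample a fraction alpha of its disparity
        j = i + int(round(alpha * d))
        if 0 <= j < width:
            # blend the two half-images at the warped position
            out[j] = (1 - alpha) * left_row[i] + alpha * right_row[j]
    return out
```

For `alpha = 0.5` and zero disparity this degenerates to a plain average of the two half-images, as expected for a mid-position view of a flat scene.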
A further possibility is that the intensity values for the different views are determined in that disparities between at least two stereoscopic half-images, which are defined by the image data, are determined, a depth map is calculated from these disparities and, using this depth map, image information is determined for the required image strip or strips of the respective view.
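The disparity-to-depth conversion in this variant is standard stereo geometry. The relation Z = f·B/d below is textbook pinhole-stereo geometry rather than a formula quoted from the source, and the parameter names are assumptions:

```python
def depth_from_disparity(disparity, focal_px, baseline):
    """Convert a stereo disparity (in pixels) into a depth value via
    the pinhole-stereo relation Z = f * B / d.

    focal_px: assumed focal length of the cameras, in pixels
    baseline: assumed lateral distance between the two cameras
    """
    if disparity == 0:
        return float("inf")  # zero disparity corresponds to infinity
    return focal_px * baseline / disparity
```

The depth map obtained this way can then feed the depth-map-based synthesis described for the first possibility.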
Accordingly, the control unit of the proposed display can be configured, for determining the intensity values for the different views,
- to determine image information for the required image strip or strips of the respective view from a depth map defined by the named image data and from texture values defined by the image data for surface segments of a surface represented by the depth map; or
- to determine disparities between at least two stereoscopic half-images which are defined by the image data and to determine image information for the image strip or strips of the respective view in that the view is defined as an intermediate image between the named stereoscopic half-images, in dependence on the disparities and on the respective value of the named location coordinate, by interpolation and/or by transformation; or
- to determine disparities between at least two stereoscopic half-images which are defined by the image data, to calculate a depth map from the disparities and to determine image information for the required image strip or strips of the respective view using this depth map.
So that the control can be automatically adapted to a current distance of a viewer from the display, the display can include a tracking device for determining a distance between the eyes of at least one viewer and the display, and the control unit can be configured to control the pixel matrix for the viewing distance corresponding to this distance. The control unit is therefore then configured to control the pixel matrix - if the measured distance differs from the nominal distance - in the previously described manner so that the named viewing distance used as the basis for the control corresponds to the distance determined by the tracking device. In the correspondingly advantageously designed method, a distance of an eye pair of at least one viewer from the display is therefore detected, wherein the viewing distance is selected as corresponding to the distance thus detected for determining the values of the location coordinate for the different strips. For this purpose, images of a space in front of the display taken by a stereoscopic camera can e.g. be evaluated using a suitable image evaluation process.
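The selection of the viewing distance from the tracker output can be sketched as follows. The tolerance band and the fallback to the nominal distance when no viewer is detected are assumptions for illustration, not features stated in the source:

```python
def select_viewing_distance(tracked_eye_distance, D_nominal, tolerance=5.0):
    """Choose the viewing distance used as the basis for the control.

    tracked_eye_distance: distance of the viewer's eye pair from the
                          display as reported by the tracking device,
                          or None if no viewer was detected (assumed API)
    D_nominal:            nominal distance given by the display geometry
    tolerance:            assumed band within which no adaptation occurs
    """
    if tracked_eye_distance is None:
        return D_nominal          # no viewer detected: keep nominal control
    if abs(tracked_eye_distance - D_nominal) <= tolerance:
        return D_nominal          # measured distance matches: no adaptation
    return tracked_eye_distance   # otherwise adapt to the measured distance
```

The returned value would then be used as the viewing distance D when determining the location-coordinate values for the different strips.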
The subject of the present application can be described as a method for reproducing a 3D image on an autostereoscopic screen of a type known per se. In typical embodiments, this screen has a pixel matrix with a large number of pixels as well as an optical element arranged in front of the pixel matrix and referred to hereinafter as an optical grid, wherein the pixels in the pixel matrix are arranged such that they form a large number of strips, arranged adjacently in an equidistant manner and referred to hereinafter as columns, with a column direction that is vertical or inclined relative to a vertical, and wherein the optical grid has a group of strip-shaped structures, which are oriented parallel to the columns and are arranged adjacently in an equidistant manner, and predefines, for each of the pixels, at least one defined plane of propagation of the light originating from the respective pixel, said plane of propagation being spanned by a defined horizontal direction of propagation and the column direction, wherein a period length of the optical grid, defined by a lateral offset of adjacent strip-shaped structures, is greater by a factor n·D/(D+a) than a lateral offset of the directly adjacent columns, wherein a denotes an effective distance between the pixel matrix and the optical grid, D denotes the nominal viewing distance of the autostereoscopic screen, and n denotes an integer greater than two corresponding to the number of the aforementioned plurality of more than two viewing zones.
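The period length of the optical grid stated in this paragraph can be computed directly from the given factor; only the function and parameter names are assumptions:

```python
def grid_period(column_pitch, n, D, a):
    """Period length of the optical grid: greater by the factor
    n*D/(D + a) than the lateral offset of directly adjacent columns.

    column_pitch: lateral offset of directly adjacent columns
    n:            number of viewing zones (integer greater than two)
    D:            nominal viewing distance of the screen
    a:            effective distance between pixel matrix and grid
    """
    return column_pitch * n * D / (D + a)
```

Since D/(D + a) is slightly less than one, the period is slightly smaller than n column pitches, which is what makes the n viewing zones converge at the nominal viewing distance.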
In the use proposed here of a screen of this type, the method for reproducing a 3D image may comprise the following steps, which enable autostereoscopic viewing of the 3D image from a viewing distance D deviating from the nominal viewing distance Dn:
- assigning, to each of said columns, a value of a location coordinate in the viewer space and a further location coordinate value, wherein the value of the first-mentioned location coordinate gives a location on a coordinate axis oriented horizontally in front of the screen at the viewing distance D, i.e. a lateral position of a location, said location being defined by the fact that the light originating from the respective column and falling through the optical grid falls onto the coordinate axis at this location, and wherein the further location coordinate value gives a position, in a lateral direction, of the respective column or of the strip-shaped structure of the optical grid through which the light originating from the pixels of this column falls,
- calculating, for each of the columns, an extract of an image by image synthesis, wherein this image is given by a perspective of the 3D image to be reproduced from a position that is defined by the value of the first-mentioned location coordinate assigned to the respective column, and wherein the extract is defined by a strip of this image which has a lateral position in this image that corresponds to the further location coordinate value assigned to the respective column,
- controlling the pixels in the pixel matrix in such a way that the extract thus calculated for each of the columns is recorded in the respective column.
It might be helpful for the understanding to see that the definition of the first-mentioned location coordinate given above implies that its value corresponds to the location at which the plane of propagation, defined by the optical grid, of the light originating from the respective column intersects the aforementioned horizontal coordinate axis. A region within which the 3D image is visible, is of relatively good quality and can be perceived three-dimensionally and autostereoscopically is thus produced at the viewing distance D in front of the screen. Interference reducing the image quality may, however, occur at the edges of this region and is typically visible in the form of strips running in the column direction - for example at an incline - and arranged adjacently in parallel. This interference is caused by crosstalk between adjacent columns or strips of pixels in which extracts of images of relatively vastly different perspectives are reproduced. With the described control, extracts of images whose perspectives or viewing directions generally only differ slightly are reproduced on adjacent columns or strips of pixels. Approximately at every nth column, however, a much greater perspective jump arises in the opposite direction, which may lead to the aforementioned interference.
A measure is outlined hereafter that allows this interference to be at least attenuated. In this case, averaged intensity values are recorded in or reproduced by some of the columns. This preferably concerns the columns in which extracts of images of a perspective or viewing direction deviating vastly from that of the directly adjacent columns are recorded, that is to say the columns of the pixel matrix at which the aforementioned relatively large perspective jumps occur. In this case, contributions of the two perspectives corresponding to the right-hand and left-hand edge of the aforementioned region in front of the screen are averaged. To be more precise, the pixels of selected strips of the strips of pixels are, in this case, controlled using averaged intensity values, the selected strips being determined as the strips, or some of the strips, for which the step of determining a value of the location coordinate results in two solutions - i.e. two values of the location coordinate - being found within a given location coordinate interval due to the fact that the light emanating from the pixels of the respective strip propagates through two adjacent structures of the optical element, wherein each of the averaged intensity values is determined as an average of a first intensity value which is determined for the respective pixel for a first solution of the two solutions and a second intensity value which is determined for the same pixel for a second solution of the two solutions. The control unit can be configured accordingly. Typically, the averaged intensity values are obtained by adding, for each of the pixels of a particular selected strip, the weighted first intensity value and the weighted second intensity value determined for the respective pixel, the first and the second intensity values being weighted depending on the determined value of the location coordinate, a weighting factor used for this averaging being smaller if the determined value is closer to a boundary of the given location coordinate interval and larger if the determined value is less close to a boundary of the given location coordinate interval.
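Such a weighted average can be sketched with a linear cross-fade over the location coordinate interval, so that each intensity value's weight shrinks toward zero as its solution approaches the corresponding interval boundary. The linear weighting function and the parameter names are illustrative assumptions:

```python
def averaged_intensity(i_first, i_second, x, interval):
    """Blend the two intensity values found for a pixel of a selected
    strip whose light passes through two adjacent grid structures.

    i_first, i_second: intensity values for the two solutions
    x:                 determined value of the location coordinate
                       for the first solution
    interval:          (lo, hi) bounds of the location coordinate interval
    """
    lo, hi = interval
    # normalised position of the first solution inside the interval
    w = (x - lo) / (hi - lo)
    # near the lo boundary w is small, so the first value fades out
    # and the second value, belonging to the opposite edge, dominates
    return w * i_first + (1.0 - w) * i_second
```

At the interval center both contributions enter with equal weight, which smooths the otherwise abrupt perspective jump between the two edges of the viewing region.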
Embodiments of the invention will be explained in the following with reference to Figures 1 to 4. There is shown in
Fig. 1 in a schematic representation, a plan view of an autostereoscopic display and a viewing space in front of this display;
Fig. 2 a detail of a pixel matrix of the display of Fig. 1 in a front view;
Fig. 3 in a representation corresponding to Fig. 1, the same display, with here some components of the display having been omitted and only some beam paths being drawn by way of example to explain the proposed control of the display; and
Fig. 4 a front view of a part of the pixel matrix in a particular embodiment.
An autostereoscopic display is shown in Fig. 1 which is in particular suitable as
a multiview display to display a plurality of different images, nine in the
present case, simultaneously. This display has a pixel matrix 11 and an optical
element 12 arranged in front of the pixel matrix 11. In addition, the display
includes a control unit 13 for controlling the pixel matrix 11 in dependence on
image data 14 which define a 3D image. Typically, this 3D image will vary over
time so that it is, more precisely, an image sequence. The image data 14 can in
this respect be stored e.g. on a data carrier and read from there, or can be
defined by a computer game in dependence on its course.
The pixel matrix 11 is an LCD or an OLED display having a multitude of pixels
15 which are arranged in different rows. A detail of this pixel matrix 11 is
shown in Fig. 2. The individual pixels 15 are each shown by rectangles there.
In the present case, the pixels 15 are subpixels of the elementary colors red,
green and blue - marked in Fig. 2 by the letters R, G and B respectively.
A plurality of disjoint subsets of pixels 15, nine in the present case - the
plurality could naturally also be larger or smaller - are defined on the pixel
matrix 11 such that each of these subsets forms a group of parallel strips. The
subsets are numbered continuously from 1 to 9 and in Fig. 2 the pixels 15 are
each provided with the number of the subset to which the pixel 15 belongs. As
can be recognized in Fig. 2, the named strips include a non-zero angle with the
rows, with the strips of the different subsets alternating cyclically in the row
direction and with each of the strips not containing more than one pixel 15 in
each of the rows.
The optical element 12 can e.g. be designed as a slot screen or as a lenticular
screen and has a grid-like structure which is oriented parallel to the strips
and which is indicated by dashed lines in Fig. 2. In this respect, in the
present case,
d = 9 b Dn / (Dn + a)
applies to a period d of this structure in the lateral direction - corresponding
to the row direction - where b is a lateral distance between the area centers of
adjacent pixels 15, a designates a distance between the pixel matrix 11 and the
optical element 12 and Dn stands for a so-called nominal distance. The optical
element 12 in each case thereby defines a respective defined propagation
direction for light emanating or transmitted from the pixels 15. This is done
such that, at the nominal spacing Dn in front of the display, a number,
corresponding to the previously named plurality, of nine viewing zones 16 offset
laterally relative to one another is defined so that each of the viewing zones
16 is associated with exactly one of the subsets, and such that light emanating
or transmitted from each of the subgroups of pixels 15 is directed into the
viewing zone 16 associated with this subset. This is illustrated in Fig. 1 by a
respective dashed line for two outermost pixels 15 of the subgroup 2.
Modifications in which the optical element 12 is arranged behind the pixel
matrix 11 are equally possible. The viewing zones 16 are shown with their
diamond-shaped cross-section in Figure 1 and are numbered continuously from 1 to
9 in accordance with the subgroups. The mutually adjacent viewing zones 16 are
each mutually offset laterally by about 65 mm.
In a conventional mode of operation of the display, a respective one of nine
stereoscopic half-images is displayed on each of the subgroups of pixels 15 so
that one of these stereoscopic half-images is visible from each of the viewing
zones 16. The stereoscopic half-images are then selected so that the two
stereoscopic half-images visible from directly adjacent viewing zones 16 each
combine to form one stereoscopic image corresponding to a view of the 3D image
thus displayed. One or more viewers can then each see one of the views of the
displayed three-dimensional scene with a depth effect from a viewing plane 17
which is disposed at the nominal distance Dn in front of the display.
Another mode of operation of the display will now be described in which the
pixel matrix 11 is controlled for an autostereoscopic viewing of the 3D image
from a viewing distance D differing from the nominal distance Dn.
To measure the viewing distance D, the display in the present embodiment has a
tracking device which is here given by a stereoscopic camera 18 directed to the
viewing space in front of the display and by an evaluation unit 19 for carrying
out an image evaluation process. A head position of at least one viewer is
detected using this tracking device and the viewing distance D is measured as
the distance between an eye pair of this viewer and the display.
The control unit 13 now carries out the steps explained in more detail in the
following, by a corresponding technical program device, in dependence on the
image data 14 and on the viewing distance D determined by the tracking device,
for each of the strips of pixels 15, to control the pixel matrix 11 for an
autostereoscopic viewing of the 3D image from the viewing distance D in front of
the display differing from the nominal distance Dn.
First, a respective value of a location coordinate x is determined for each of
the strips according to a rule which can be described as follows. At a specific
height, which can be selected largely as desired, an imaginary line 20
orientated in the row direction - that is horizontally - is defined at a spacing
in front of the display which corresponds to the determined viewing distance D.
The location coordinate is defined such that it describes a lateral position of
locations on the line 20. In the present case, the line 20 is a section of a
straight line. It could, however, also extend in a slanted manner or be curved.
In this case, let the named spacing designate a distance between the display and
a defined point on the line 20, typically a central point disposed in front of
the display. The value of the location coordinate is now determined by a simple
mathematical operation for each of the strips for the location at which light
emanating or transmitted from the pixels of this strip is incident, with the
propagation direction imposed by the optical element 12, onto the named line 20.
This is illustrated by way of example in Fig. 1 for one of the strips by means
of a dashed line, and indeed for a strip of pixels 15 which belongs to the
subgroup 7 and which is disposed near the left-hand margin of the display.
The location coordinate x adopts nine discrete possible values for such
locations which are disposed exactly in front of the viewing zones 16, seen from
the center of the display. The location coordinate x is scaled in the present
example such that these are the discrete values 1, 2, 3, 4, 5, 6, 7, 8 and 9.
The value of the location coordinate x is in each case determined on a scale
which is finely graduated, or even at least quasi-continuous, exactly so that it
also adopts intermediate values between these discrete values and adopts a
number of different values for the different strips which is considerably larger
than the previously named plurality of nine. Thus, x = 3.5 applies rather
precisely e.g. to the strip for which the determination of the value of the
location coordinate x is illustrated in Fig. 1. As can also be recognized in
Fig. 1, the location coordinate x is defined so that it adopts the discrete
values 1, 2, 3, 4, 5, 6, 7, 8 and 9 for the nine directly adjacent strips which
extend centrally over the pixel matrix, whereas it also adopts intermediate
values disposed therebetween for at least some of the further outwardly disposed
strips.
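The determination of a value of x for a strip can be sketched as follows. This
is a minimal hypothetical illustration, assuming a straight horizontal line 20
and simple similar-triangle ray geometry; the function names, the zone pitch of
65 mm and the centering of the scale on x = 5 are assumptions for the example,
not statements from the text:

```python
def location_on_line(xp, xs, a, D):
    """Lateral position at which a ray from a pixel (lateral position xp on
    the pixel matrix) through the centre of the nearest structure of the
    optical element (lateral position xs, a distance a in front of the
    matrix) hits the line 20 at the viewing distance D.
    By similar triangles, the lateral offset grows by the factor D / a."""
    return xs + (xs - xp) * D / a

def location_coordinate(pos_mm, zone_pitch_mm=65.0, centre_value=5.0):
    """Map that lateral position (0 = display centre) onto the scale on
    which the nine viewing zones lie at x = 1 ... 9."""
    return centre_value + pos_mm / zone_pitch_mm
```

Because the ratio D / a is not quantized, the resulting x values form a
quasi-continuous scale, which is why most strips receive intermediate values
between the nine discrete ones.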
Each value of the location coordinate x therefore stands for a specific position
on the line and thus for a specific viewing position with which in turn a
specific direction of gaze or perspective on the scene defined by the 3D image
can be associated. In a further step, intensity values are now determined for
each of the strips, said values being defined by the image data 14 for an image
strip, corresponding to this strip, of a view of the 3D image which corresponds
to a direction of gaze from a position defined by the named value. The
respective view is in this respect defined as the two-dimensional view of the 3D
image, or of the scene displayed by the 3D image, which results from this
viewing position or from the direction of gaze thereby predefined.
Finally, the pixels 15 of the respective strip are controlled using the thus
determined intensity values which in the present case naturally also depend on
color information contained in the image data 14 and on the color of the
individual pixels 15. In a modification, the pixel matrix 11 could naturally
also have multicolor pixels which are then each controlled using correspondingly
determined intensity values and color values.
For the discrete values 1, 2, 3, 4, 5, 6, 7, 8 and 9, the named views are
defined as the nine stereoscopic half-images which were named above in
connection with the conventional mode of operation of the display for a viewing
from the nominal distance Dn. In the operating mode focused on here, however,
most views, of which only individual strips are needed, are defined for
intermediate values of the location coordinate x which each correspond to a
direction of gaze disposed between the directions of gaze of those nine
stereoscopic half-images. To illustrate this, rays are drawn by way of example
in Fig. 1 for the x values 1, 5 and 9 which show the points on the pixel matrix
11 from which light must emanate to be incident through the optical element 12
onto the location on the line 20 defined by the respective x value. As can
easily be recognized in Fig. 1, these points are, however, only central in one
of the pixels 15 from which light actually emanates in exceptional cases. The
light emanating from the actually present pixels 15 - this always means the area
centers of the pixels 15 - is in contrast incident on the line 20 in most cases
at locations which correspond to intermediate values of the x coordinate. A
representation of the best possible quality is realized for a viewer positioned
at the viewing distance D in front of the display by the control, proposed here,
of these pixels 15 using image information which corresponds to correspondingly
selected intermediate views.
Fig. 3, in which recurring features are again provided with the same reference
numerals, illustrates the relationships again. In a manner of representation
otherwise corresponding to Fig. 1, respective light bundles are drawn by way of
example here for three different regions of the pixel matrix 11, said light
bundles emanating there from three to four respective adjacent subpixels 15 and
illuminating a region on the line 20 about the location which is defined by the
coordinate value x = 5. So that a viewer's eye can see a view from this location
with as few disturbances as possible, said view corresponding to a direction of
gaze from a camera position defined by the value x = 5, the pixels 15 of the
different subgroups have to be controlled as follows due to the geometrical
relationships which are easily recognizable here. At the left-hand end of the
pixel matrix 11, the pixels 15 of the subgroup 9, and at the right-hand end of
the pixel matrix 11, the pixels 15 of the subgroup 1 are controlled using the
intensity values which belong in the normal case - that is on a viewing from the
nominal spacing Dn - to the stereoscopic half-image visible in the fifth viewing
zone 16. In the region of the pixel matrix identified by the reference numeral
21 in Fig. 3, in contrast, none of the pixels 15 is controlled using intensity
values of this view because none of the pixels 15 there is disposed where the
corresponding image information would have to be imaged. Instead, the pixels 15
of the subgroup 3 are controlled there using image information of a calculated
view which belongs to a value x = 4.6, whereas the pixels 15 of the subgroup 4
in the region 21 are controlled using image information of another calculated
view which belongs to a value x = 5.6. By this are meant the views which would
result on recording the displayed scene from camera positions disposed at
corresponding locations between the camera positions of the fourth and fifth or
of the fifth and sixth stereoscopic half-images of the nine stereoscopic
half-images named further above.
The line 20 is limited in its length or width - that is in the x direction -
such that the values of the location coordinate x can be unambiguously
determined in the above-described manner. The parameterization of the line 20 by
the location coordinate x can naturally also have a different scaling than in
the case shown in Fig. 1. It can also be achieved, by a stretching of the
distances between the positions which here correspond to the discrete x values
1, 2, 3, 4, 5, 6, 7, 8 and 9, with an unchanging association of the views with
specific values of x, that two views which are visible from two positions on the
line 20 remote from one another by an average eye distance of about 65 mm also
correspond to two respective perspectives from camera positions correspondingly
remote from one another - and not further remote, for instance, with a smaller
D. The parallax between two views, or more exactly between the views which are
approximated by the described control and which a viewer can see with his two
eyes from the viewing distance D, should therefore correspond as exactly as
possible to the parallax which results on viewing the displayed scene at the
average eye distance.
Provision can be made that the value of the location coordinate x is
respectively rounded up or down to the next closest intermediate value from a
limited number of discrete intermediate values. It would e.g. be possible to
determine the location coordinate respectively only to one decimal place. Nine
respective possible intermediate images are then disposed between the
stereoscopic half-images which correspond to the x values 1, 2, 3, 4, 5, 6, 7, 8
and 9. The number of views needed as a maximum - more precisely the number of
views of which at least individual image strips can be needed - is then reduced
to at most 90. The calculation effort can be advantageously restricted by this
restriction to a discrete number of views - which is, however, larger than the
original number of nine views.
Different processes known per se can be used, by a corresponding programming of
the control unit 13, to construct the needed views or, more precisely, the
required image strips thereof and to determine the intensity values for the
different views.
In particular the following cases are possible:
The image data 14 can e.g. define a depth map and texture values for area
segments of a surface reproduced by the depth map. The intensity values can then
be determined for the different views in that image information for the required
image strip or strips of the respective view is determined from the depth map
and from the texture values for the area segments of the surface reproduced by
the depth map.
In other cases, the image data define two or more stereoscopic half-images. The
intensity values can then be determined for the different views in that
disparities between the already defined stereoscopic half-images are determined
and image information is determined for the required image strip or strips of
the respective view in that this view is defined as an intermediate image
between the named stereoscopic half-images, in dependence on the disparities and
on the respective value of the named location coordinate x, by interpolation
and/or by transformation - by so-called "morphing". Instead, a depth map can
also be calculated from the disparities which result from the already present
half-images. Image information for the required image strip or strips of the
respective view can then in turn be determined using this depth map.
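As a rough illustration of the disparity-based interpolation named here, the
following hypothetical sketch computes one row of an intermediate view by
fractionally shifting pixels of one half-image towards the other and blending
the intensities. It is a deliberately simplified forward-warping scheme under
assumed conventions; real morphing must additionally handle occlusions and
holes:

```python
def intermediate_row(left, right, disparity, t):
    """One row of an intermediate view between two stereoscopic half-images.
    left, right: intensity values of one image row of the two half-images;
    disparity[i]: horizontal offset of left-image pixel i in the right image;
    t in [0, 1]: fractional camera position derived from the location
    coordinate x (t = 0 gives the left view, t = 1 the right view)."""
    width = len(left)
    out = [0.0] * width
    for i in range(width):
        # Target column: pixel i shifted by the fraction t of its disparity.
        j = min(width - 1, max(0, round(i + t * disparity[i])))
        # Corresponding pixel in the right half-image.
        k = min(width - 1, max(0, round(i + disparity[i])))
        # Blend the two intensities according to the camera position t.
        out[j] = (1.0 - t) * left[i] + t * right[k]
    return out
```

In the operating mode described here, only the columns belonging to the
required image strip of a view would actually have to be computed, which keeps
the effort per strip small.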
It must additionally be pointed out that the tracking device is naturally also
configured to determine a lateral location of the at least one eye pair using
the detected head position. The control unit 13 can therefore additionally be
configured to control the pixel matrix 11 in dependence on the at least one
lateral position determined by the tracking device so that a region from which
the 3D image is autostereoscopically visible also includes the eye pair or the
eye pairs of the tracked viewer or viewers. If required, this can be done by a
lateral displacement of the line 20 or of the viewing zones 16.
In the present embodiment, the viewing distance D is determined by the tracking
device and the pixel matrix 11 is controlled in dependence on the viewing
distance D thus determined. Instead, the viewing distance D could naturally also
be selected by a user - e.g. in dependence on the dimensions of a room in which
the display is installed - and be predefined by an input at the control unit 13.
In a special embodiment of the display, the optical element 12 can be
controllable and form lens elements with refractive properties variable in
dependence on a control of the optical element 12. Liquid crystals can be used
for realizing such structures known per se. Independently of whether the viewing
distance D is fixed arbitrarily by an input or in dependence on output signals
of a tracking device, the control unit 13 can be configured in this case to
control the optical element 12 in dependence on the viewing distance D so that
the refractive properties of the lens elements are adapted to this viewing
distance D.
If the pixels 15 of the pixel matrix 11 are controlled as described so far, the
3D image can be seen from a certain region more or less centrally in front of
the display, a distance between this region and the display corresponding to the
viewing distance D. In Fig. 1, this region can be identified with the part of
the line 20 corresponding to x coordinate values of between 1 and 9. However,
interference reducing the image quality may occur at the edges of this region.
This interference is caused by crosstalk between adjacent strips of pixels in
which extracts of images of vastly different perspectives are reproduced. To put
this in other words, the problem occurs because, in some cases, an x value close
to 9 or 10 will be assigned to one of two adjacent strips of pixels 15 while an
x value close to 0 or 1 is assigned to the other of the same two adjacent strips
of pixels 15.
The quality loss caused by this phenomenon can be reduced by using averaged
intensity values for the respective strips of pixels 15. Fig. 4 illustrates how
this is done in a particular embodiment. Fig. 4 shows a part of the pixel matrix
11, this part comprising ten lines and 57 columns of pixels 15. Letters R, G or
B on top of the columns indicate whether the pixels 15 of the respective column
are red, green or blue, respectively. At least one number is plotted in each of
the pixels 15 shown in Fig. 4. These numbers indicate the values of the location
coordinate x determined, by the method described above, for the respective pixel
15 - or, to be more precise, determined for the strip to which the respective
pixel 15 belongs. When determining the values of the location coordinate x, only
values between 0 and 10 are admitted. Even within this given location coordinate
interval, two different values of the location coordinate x are found for some
of the strips of pixels 15. This is due to the fact that the light emanating
from the pixels 15 of the respective strip and falling in said region propagates
through two adjacent structures of the optical element 12.
The strips of pixels 15 for which two different x values between 0 and 10 are
found, i.e. the strips for which the step of determining a value of the location
coordinate x results in two solutions being found, or at least some of them, are
selected to be the strips of pixels 15 which are controlled using averaged
intensity values. The two different x values assigned to these strips -
hereafter referred to as x1 and x2 - are plotted in the "x1 x2" format in Fig.
4. Each of the averaged intensity values is determined as an average of a first
intensity value f(x1) which is, in the step of determining the intensity values,
determined for the respective pixel 15 for a first solution x1 of the two
solutions and a second intensity value f(x2) which is determined for the same
pixel for a second solution x2 of the two solutions. Thus, the first intensity
value f(x1) is an intensity value defined by an image point of a view
corresponding to a direction of gaze from a position defined by the location
coordinate value x1, while the second intensity value f(x2) is an intensity
value defined by an image point of a view corresponding to a direction of gaze
from a position defined by the location coordinate value x2. The averaged
intensity values I are obtained by adding, for each of the pixels 15 of a
particular selected strip, the weighted first intensity value f(x1) and the
weighted second intensity value f(x2) determined for the respective pixel 15,
the first intensity value f(x1) and the second intensity value f(x2) being
weighted depending on the determined value x1 or x2 of the location coordinate.
The weighting factors used for this averaging are defined as being smaller if
the determined value is closer to a boundary of the given location coordinate
interval from 0 to 10 and larger if the determined value is further from a
boundary of this interval. This is illustrated in Fig. 4, for some of the pixels
15 of the selected strips, by the formulae below the visible part of the pixel
matrix 11.
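The weighting described above can be sketched as follows. This is a
hypothetical illustration: the text only states that the weights shrink towards
the interval boundaries, so the concrete weighting function used here -
distance to the nearest boundary, normalized - is an assumption, not the
formula of Fig. 4:

```python
def averaged_intensity(f_x1, f_x2, x1, x2, lo=0.0, hi=10.0):
    """Average of the intensity values f(x1) and f(x2) found for a pixel of
    a selected strip, weighted by the distance of x1 and x2 from the
    nearest boundary of the admitted location coordinate interval
    [lo, hi]."""
    w1 = min(x1 - lo, hi - x1)  # small when x1 is close to a boundary
    w2 = min(x2 - lo, hi - x2)  # small when x2 is close to a boundary
    return (w1 * f_x1 + w2 * f_x2) / (w1 + w2)
```

Since one of the two solutions typically lies near 0 or 10 while the other does
not, the view far from the boundary dominates the blend, which suppresses the
visible perspective jump smoothly instead of abruptly.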
In the examples described here, the viewing distance D was smaller than the
nominal spacing Dn. This is, of course, not necessarily the case. The same steps
can be performed analogously in order to adapt the display for observation from
a viewing distance D which is larger than the nominal viewing distance Dn.
Claims
1. An autostereoscopic display for simultaneously displaying more than
two different images, comprising
a pixel matrix (11) having a multitude of pixels (15) which are arranged
in different rows, wherein a plurality of more than two disjoint subsets
of pixels (15) are defined on the pixel matrix (11) so that each of the
subsets forms a band of parallel strips which include a non-zero angle
with the rows, wherein the strips of the different subsets alternate
cyclically in the row direction;
an optical element (12) which is arranged in front of or behind the
pixel matrix (11), which has a grid-like structure orientated parallel
to the strips and imposes, for each of the pixels (15), a defined
propagation direction on light emanating or transmitted from the
respective pixel (15) such that, at a nominal spacing (Dn) from the
display predefined by a geometry of the display, a number, corresponding
to the named plurality, of viewing zones (16), which are laterally
offset relative to one another, is defined so that each of the viewing
zones (16) is associated with exactly one of the subsets and such that
the light emanating or transmitted from each of the subsets of pixels
(15) is directed in the viewing zone (16) associated with this subset;
and
a control unit (13) for controlling the pixel matrix (11) in dependence
on image data (14) which define a 3D image,
characterized in that
the control unit (13) is configured to carry out the respective
following steps for controlling the pixel matrix (11) for an
autostereoscopic viewing of the 3D image from a viewing distance (D)
differing from the nominal spacing (Dn) in front of the display for each
of the strips of pixels (15):
- determining a value of a location coordinate (x) which describes a
lateral position of locations on a line (20) orientated in the row
direction and disposed at a defined height at the viewing distance (D)
in front of the display, wherein the value is respectively determined
for the location at which light emanating or transmitted from the pixels
(15) of the respective strip is incident on the named line (20) with the
propagation direction imposed by the optical element (12);
- determining intensity values which are defined by the image data (14)
for an image strip corresponding to this strip of a view of the 3D image
which corresponds to a direction of gaze from a position defined by the
named value;
- controlling the pixels (15) of this strip using the intensity values
determined in this manner.
2. The display of claim 1, characterized in that the control unit (13) is
configured to determine the location coordinate (x) so exactly that it
adopts a number of different values for the different strips which is
larger than the named plurality.
3. The display of any of the claims 1 or 2, characterized in that the
views for a number, corresponding to the named plurality, of values are
a number, corresponding to this plurality, of stereoscopic half-images
of which a respective two, which correspond to mutually next closest
values of this number of values, combine to form a stereoscopic image,
whereas at least one of the views which corresponds to an intermediate
value of the location coordinate (x) corresponds to a direction of gaze
disposed between the directions of gaze of these stereoscopic
half-images.
4. The display of claim 3, characterized in that the control unit (13)
is configured to determine the location coordinate (x) such that it
adopts the named number of values for a number, corresponding to the
named plurality, of directly adjacent strips which extend centrally over
the pixel matrix (11), whereas it adopts intermediate values for at
least some of the further outwardly disposed strips.
5. The display of any of the claims 1 to 4, characterized in that the
control unit (13) is configured to determine the intensity values for
the different views in that
- image information for the required image strip or strips of the
respective view is determined from at least one depth map defined by the
named image data (14) and from texture values defined by the image data
(14) for surface segments of a surface represented by the depth map; or
- disparities are determined between at least two stereoscopic
half-images which are defined by the image data (14) and image
information for the required image strip or strips of the respective
view is determined in that the view is defined as an intermediate image
between the named stereoscopic half-images in dependence on the
disparities and on the respective value of the named location coordinate
(x) by interpolation and/or by transformation; or
- disparities are determined between at least two stereoscopic
half-images which are defined by the image data (14), a depth map is
calculated from the disparities and image information for the required
image strip or strips of the respective view is determined using this
depth map.
6. The display of any of the claims 1 to 5, characterized in that each
of the strips contains at most one pixel (15) from each of the rows of
the pixel matrix (11).
7. The display of any of the claims 1 to 6, characterized in that it
includes a tracking device for determining a distance between an eye
pair of at least one viewer and the display, wherein the control unit
(13) is configured to control the pixel matrix (11) so that the named
viewing distance (D) corresponds to the distance determined by the
tracking device.
8. The display of claim 7, characterized in that the tracking device is
configured also to determine a lateral location of the at least one eye
pair, wherein the control unit (13) is configured to control the pixel
matrix (11) in dependence on the at least one lateral location
determined by the tracking device so that the at least one eye pair is
located in a region from which the 3D image is autostereoscopically
visible.
9. The display of any of the claims 1 to 8, characterized in that the
optical element (12) is controllable and forms lens elements having
refractive properties variable in dependence on a control of the optical
element (12), wherein the control unit (13) is configured to control the
optical element (12) to adapt the refractive properties of the lens
elements to this viewing distance (D).
10. The display of any of the claims 1 to 9, characterized in that the
grid-like structure of the optical element (12) is given by periodically
arranged strip-shaped structures such as cylindrical lenses or slots, a
lateral offset of adjacent strip-shaped structures being greater by a
factor n Dn/(Dn + a) than a lateral offset of the directly adjacent
strips of pixels (15), wherein a denotes an effective distance between
the pixel matrix (11) and the optical element (12), Dn denotes the
nominal spacing (Dn) of the autostereoscopic screen, and n denotes an
integer greater than two corresponding to said plurality.
11. The display of any of the claims 1 to 10, characterized in that the
control unit (13) is configured to control the pixels (15) of selected strips
of the strips of pixels (15) using averaged intensity values, the selected
strips being determined as the strips or some of the strips for which
the step of determining a value of the location coordinate (x) has,
within a given location coordinate interval, two solutions due t o the
fact that the light emanating from the pixels (15) of the respective strip
propagates through two adjacent structures of the optical element
(12), wherein each of the averaged intensity values is determined as an
average of a first intensity value which is determined for the respective
pixel (15) for a first solution of the two solutions and a second intensity
value which is determined for the same pixel (15) for a second solution
of the two solutions.
A method for displaying a 3D image on an autostereoscopic display
having a pixel matrix (11) and an optical element (12) arranged in front
of or behind the pixel matrix (11);
wherein the pixel matrix (11) has a multitude of pixels (15) which are
arranged in different rows, wherein a plurality of more than two d is
joint subsets of pixels (15) are defined on the pixel matrix (11) so that
each of the subsets forms a band of parallel strips which include a non¬
zero angle with the rows, wherein the strips of the different subsets al
ternate cyclically in the row direction;
and wherein the optical element (12) has a grid-like structure orient at
ed parallel t o the strips and imposes, for each of the pixels, a defined
propagation direction on light emanating or transmitted from the re¬
spective pixel (15) such that, at a nominal spacing (Dn) in front of the
display predefined by a geometry of the display, a number, corre¬
sponding t o the named plurality, of viewing zones (16), which are lat
erally offset relative t o one another, is defined so that each of the
viewing zones (16) is associated with exactly one of the subsets and
such that the light emanating or transmitted from each of the subsets
of pixels (15) is directed in the viewing zone (16) associated with this
subset;
wherein the pixel matrix (11) is controlled in dependence on image d a
t a (14) which define a 3D image,
characterized in that
the pixel matrix (11) is controlled for an autostereoscopic viewing of
the 3D image from a viewing distance (D) in front of the display differ¬
ing from the nominal spacing (D ), wherein the method includes the
following steps for each of the strips of pixels (15):
- determining a value of a location coordinate (x) which describes a
lateral position of locations on a line (20) orientated in the row
direction and disposed at a defined height at the viewing distance (D) in
front of the display, wherein the value is respectively determined for
the location at which light emanating or transmitted from the pixels
(15) of the respective strip is incident on the named line (20) with the
propagation direction imposed by the optical element (12);
- determining intensity values which are defined by the image data
(14) for an image strip corresponding to this strip of a view of the 3D
image which corresponds to a direction of gaze from a position defined
by the named value;
- controlling the pixels (15) of this strip using the thus determined
intensity values.
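The determining step above is a geometric projection: the ray leaving a pixel strip with the propagation direction imposed by the optical element is extended until it meets the line (20) at the viewing distance (D), and its intercept is the value of the location coordinate (x). A minimal sketch under a simple pinhole model of the grid structure; the function name, arguments, and units are illustrative assumptions, not taken from the claims:

```python
def strip_x(pixel_x: float, lens_x: float, gap: float,
            viewing_distance: float) -> float:
    """Project the ray that leaves a pixel strip at lateral position
    pixel_x and passes through its grid structure (e.g. a lenticule
    centre) at lens_x onto the line (20) at the given viewing distance.
    All lengths share one unit; gap is the pixel-to-grid spacing.
    By similar triangles the lateral offset grows linearly with depth."""
    slope = (lens_x - pixel_x) / gap          # lateral shift per unit depth
    return lens_x + slope * viewing_distance  # value of the coordinate (x)
```

Because the slope depends on the lateral offset between pixel and grid structure, strips at different positions on the matrix map to different values of (x), which is what makes per-strip view selection possible.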
13. The method of claim 12, characterized in that the location coordinate
(x) is respectively determined with such precision that it adopts a number
of different values for the different strips which is larger than the
named plurality.
14. The method of claim 12 or claim 13, characterized in that the
views for a number, corresponding to the named plurality, of values
are a number, corresponding to this plurality, of stereoscopic half-images
of which a respective two, which correspond to mutually next
closest values of this number of values, combine to form a stereoscopic
image, whereas at least one of the views which corresponds to an
intermediate value of the location coordinate (x) corresponds to a
direction of gaze disposed between the directions of gaze of these
stereoscopic half-images.
15. The method of claim 14, characterized in that the location coordinate
(x) is determined such that it adopts the named number of values for a
number, corresponding to the named plurality, of directly adjacent
strips which extend centrally over the pixel matrix (11), whereas it
adopts intermediate values for at least some of the further outwardly
disposed strips.
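One way to realise this assignment is to let the strips in a central band snap to the nearest of the nominal coordinate values while the outer strips keep their exact, geometrically determined intermediate values. This is a hypothetical sketch of that rule; the function and its arguments are illustrative, not part of the claim:

```python
def assign_values(xs, nominal_values, centre_indices):
    """For each strip coordinate in xs: a strip whose index lies in the
    central band adopts the nearest of the nominal values, every other
    strip keeps its intermediate value unchanged."""
    out = []
    for i, x in enumerate(xs):
        if i in centre_indices:
            out.append(min(nominal_values, key=lambda v: abs(v - x)))
        else:
            out.append(x)
    return out
```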
16. The method of any of claims 12 to 15, characterized in that the
intensity values for the different views are determined in that
- the intensity values for the required image strip or strips of the
respective view are determined from at least one depth map defined by
the named image data (14) and from texture values defined by the image
data (14) for area segments of a surface represented by the depth
map; or
- disparities are determined between at least two stereoscopic half-images
which are defined by the image data (14) and intensity values
for the required image strip or strips of the respective view are
determined in that the view is defined as an intermediate image between
the named stereoscopic half-images in dependence on the disparities
and on the respective value of the named location coordinate (x) by
interpolation and/or by transformation; or
- disparities are determined between at least two stereoscopic half-images
which are defined by the image data (14), a depth map is calculated
from the disparities and the intensity values for the required image
strip or strips of the respective view are determined using this
depth map.
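The second alternative, synthesising an intermediate view from two half-images and their disparities, can be sketched as a simple forward warp. This only illustrates the interpolation idea, not the claimed method in full: disocclusion handling is reduced to a crude fallback, and the mapping of the location coordinate (x) to the blend factor alpha is assumed to happen elsewhere:

```python
import numpy as np

def intermediate_view(left: np.ndarray, right: np.ndarray,
                      disparity: np.ndarray, alpha: float) -> np.ndarray:
    """Warp the left half-image by alpha * disparity (alpha in [0, 1],
    derived from the value of the location coordinate x) to obtain a
    view between left (alpha = 0) and right (alpha = 1). disparity holds
    per-pixel horizontal offsets from left to right; pixels the warp
    leaves empty simply keep the right image's value as a fallback."""
    h, w = left.shape
    out = right.copy()                     # fallback for disocclusions
    cols = np.arange(w)
    for y in range(h):
        target = np.clip(np.round(cols + alpha * disparity[y]).astype(int),
                         0, w - 1)
        out[y, target] = left[y, cols]     # forward-warp the left row
    return out
```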
17. The method of any of claims 12 to 16, characterized in that a distance
between an eye pair of at least one viewer and the display is detected,
wherein the viewing distance (D) is correspondingly selected as
the thus detected distance for determining the values of the location
coordinate (x) for the different strips.
18. The method of any of claims 12 to 17, characterized in that the
step of controlling the pixels (15) of the respective strip using the
determined intensity values is done by controlling the pixels (15) such
that the image strip which corresponds to this strip of the view of the
3D image corresponding to the direction of gaze from the position defined
by the named value of the location coordinate (x) is reproduced
by the pixels (15) of this strip of pixels (15), wherein the image strip
corresponding to this strip is a strip-shaped extract of said view having,
in the complete view, an orientation and position corresponding to the
orientation and position of the strip of pixels (15) in the pixel matrix
(11).
19. The method of any of claims 12 to 18, characterized in that the
pixels (15) of selected strips of the strips of pixels (15) are controlled
using averaged intensity values, the selected strips being determined
as the strips or some of the strips for which the step of determining a
value of the location coordinate (x) has, within a given location
coordinate interval, two solutions due to the fact that the light emanating
from the pixels (15) of the respective strip propagates through two
adjacent structures of the optical element (12), wherein each of the
averaged intensity values is determined as an average of a first intensity
value which is determined for the respective pixel (15) for a first
solution of the two solutions and a second intensity value which is
determined for the same pixel (15) for a second solution of the two
solutions.
20. The method of claim 19, characterized in that the averaged intensity
values are obtained by adding, for each of the pixels (15) of a particular
selected strip, the weighted first intensity value and the weighted second
intensity value determined for the respective pixel (15), the first and
the second intensity values being weighted depending on the determined
value of the location coordinate (x), a used weighting factor being
smaller if the determined value is closer to a boundary of the given
location coordinate interval and larger if the determined value is less
close to a boundary of the given location coordinate interval.
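The weighting rule of this claim — smaller weight near a boundary of the coordinate interval, larger weight away from it — can be sketched with the distance to the nearest interval boundary used as the weight, normalised over both solutions. The claim only fixes the monotonicity of the weighting; the concrete weighting function below is an assumption:

```python
def averaged_intensity(i1: float, x1: float, i2: float, x2: float,
                       interval: tuple) -> float:
    """Blend the two intensity values belonging to the two ray solutions
    x1 and x2 inside the given location coordinate interval. Each weight
    is the distance of the respective solution to the nearest interval
    boundary, so a solution close to a boundary contributes less."""
    lo, hi = interval
    w1 = min(x1 - lo, hi - x1)   # distance of solution 1 to nearest boundary
    w2 = min(x2 - lo, hi - x2)   # distance of solution 2 to nearest boundary
    return (w1 * i1 + w2 * i2) / (w1 + w2)
```

A linear ramp like this makes the blend vary continuously as a solution drifts towards a boundary, avoiding visible jumps between adjacent strips.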
| # | Name | Date |
|---|---|---|
| 1 | 6363-DELNP-2014-IntimationOfGrant08-12-2023.pdf | 2023-12-08 |
| 2 | PCT-IB-304.pdf | 2014-08-01 |
| 3 | 6363-DELNP-2014-PatentCertificate08-12-2023.pdf | 2023-12-08 |
| 4 | OTHER RELEVANT DOCUMENT.pdf | 2014-08-01 |
| 5 | FORM 5.pdf | 2014-08-01 |
| 6 | 6363-DELNP-2014-Correspondence-120419.pdf | 2019-04-22 |
| 7 | FORM 3.pdf | 2014-08-01 |
| 8 | 6363-DELNP-2014-Power of Attorney-120419.pdf | 2019-04-22 |
| 9 | FORM 2 + SPECIFICATION.pdf | 2014-08-01 |
| 10 | 6363-DELNP-2014-ABSTRACT [03-04-2019(online)].pdf | 2019-04-03 |
| 11 | 6363-DELNP-2014.pdf | 2014-08-23 |
| 12 | 6363-DELNP-2014-CLAIMS [03-04-2019(online)].pdf | 2019-04-03 |
| 13 | 6363-DELNP-2014-GPA-(18-09-2014).pdf | 2014-09-18 |
| 14 | 6363-DELNP-2014-COMPLETE SPECIFICATION [03-04-2019(online)].pdf | 2019-04-03 |
| 15 | 6363-DELNP-2014-English-Translation-(18-09-2014).pdf | 2014-09-18 |
| 16 | 6363-DELNP-2014-DRAWING [03-04-2019(online)].pdf | 2019-04-03 |
| 17 | 6363-DELNP-2014-Correspondence-Others-(18-09-2014).pdf | 2014-09-18 |
| 18 | 6363-DELNP-2014-FER_SER_REPLY [03-04-2019(online)].pdf | 2019-04-03 |
| 19 | 6363-DELNP-2014-FER.pdf | 2018-07-04 |
| 20 | 6363-DELNP-2014-FORM 3 [03-04-2019(online)].pdf | 2019-04-03 |
| 21 | 6363-DELNP-2014-FORM 4(ii) [28-12-2018(online)].pdf | 2018-12-28 |
| 22 | 6363-DELNP-2014-FORM-26 [03-04-2019(online)].pdf | 2019-04-03 |
| 23 | 6363-DELNP-2014-Information under section 8(2) (MANDATORY) [03-04-2019(online)].pdf | 2019-04-03 |
| 24 | 6363-DELNP-2014-PETITION UNDER RULE 137 [03-04-2019(online)].pdf | 2019-04-03 |
| 25 | 6363-DELNP-2014-OTHERS [03-04-2019(online)].pdf | 2019-04-03 |
| 26 | 6363_DELNP_2014_27-12-2017.pdf | |