Abstract: A system (100) for interacting with 3D data comprising a display device (108), a user input (106), and a processor (104) communicatively connected to the display device (108) and the user input (106). The processor (104) is configured to generate a volume-rendered image (402) shaded from a light direction based on a 3D dataset. The processor (104) is configured to display the volume-rendered image (402) and a model of a solid (406) on the display device (108) at the same time. The processor (104) is configured to display a light direction indicator (405) at a position with respect to the model of the solid (406) to indicate the light direction used for shading the volume-rendered image (402).
FIELD OF THE INVENTION
This disclosure relates generally to a method and system for displaying a light
direction indicator to show the light direction used for shading a volume-rendered image.
BACKGROUND OF THE INVENTION
Volume-rendered images are very useful for representing 3D datasets,
particularly in the field of medical imaging. Volume-rendered images are typically 2D
representations of a 3D dataset. There are currently many different techniques for
generating a volume-rendered image. One such technique, ray-casting, includes
projecting a number of rays through the 3D dataset. Data along each of the rays is
sampled, and then mapped to a color and transparency. Next, the data is accumulated
along each of the rays. According to one common technique, the accumulated data along
each of the rays is displayed as a pixel in the volume-rendered image. In order to gain an
additional sense of depth and perspective, volume-rendered images are oftentimes shaded
based on a light direction. The shading helps a viewer to more-easily comprehend and
visualize the true three-dimensional shape of the object represented in the volume-rendered
image. According to conventional shading algorithms, shading may be used
with volume-rendered images in order to convey the relative positioning of structures or
surfaces in the volume-rendered image.
Some conventional systems allow the user to alter the light direction in order
to more clearly illustrate one or more targeted features in the volume-rendered image.
However, since the light direction may be selected from any angular position in three
dimensions and since the display is only two-dimensional, it is often difficult for a user to
quickly understand the current light direction used for determining the shading of the
volume-rendered image. Additionally, it may be difficult for a user to visualize exactly
how a specific control input will adjust the light direction with respect to the volume-rendered
image.
Therefore, for these and other reasons, an improved system and method for
indicating light direction for volume-rendered images is desired.
BRIEF DESCRIPTION OF THE INVENTION
The above-mentioned shortcomings, disadvantages, and problems are
addressed herein, as will be understood by reading and understanding the following
specification.
In an embodiment, a method of volume-rendering includes generating a
volume-rendered image from a 3D dataset, wherein the volume-rendered image is shaded
from a light direction. The method includes displaying a model of a solid at the same
time as the volume-rendered image. The method includes displaying a light direction
indicator with respect to the model of the solid, wherein the position of the light direction
indicator with respect to the model of the solid corresponds to the light direction used for
shading the volume-rendered image.
In another embodiment, a method of volume-rendering includes displaying a
volume-rendered image that is shaded from a first light direction and displaying a model of a
solid at the same time as the volume-rendered image. The method includes displaying a
light source indicator at a first position with respect to the model of the solid. The first
position of the light source indicator corresponds to the first light direction used to shade
the volume-rendered image. The method includes moving with a user interface the light
source indicator to a second position with respect to the model of the solid. The method
includes automatically updating the shading on the volume-rendered image to correspond
with the second position of the light source indicator with respect to the model of the
solid.
In another embodiment, a system for interacting with a 3D dataset includes a
display device, a memory, a user input, and a processor communicatively connected to
the display device, the memory, and the user input. The processor is
configured to access a 3D dataset from the memory, generate a volume-rendered image
from the 3D dataset that is shaded from a light direction, and display the volume-rendered
image on the display device. The processor is configured to display a model of
a solid at the same time as the volume-rendered image. The processor is configured to
display a light source indicator at a position with respect to the model of the solid to
indicate the light direction used for shading the volume-rendered image.
Various other features, objects, and advantages of the invention will be made
apparent to those skilled in the art from the accompanying drawings and detailed
description thereof.
BRIEF DESCRIPTION OF THE DRAWINGS
FIGURE 1 is a schematic diagram of a system for interacting with 3D data in
accordance with an embodiment;
FIGURE 2 is a schematic representation of the geometry that may be used to
generate a volume-rendered image according to an embodiment;
FIGURE 3 is a flow chart illustrating a method 300 in accordance with an
embodiment;
FIGURE 4 is a schematic representation of a screen shot of a display device
according to an embodiment;
FIGURE 5 is a schematic representation of a light navigator with a light
source icon in a first position in accordance with an embodiment;
FIGURE 6 is a schematic representation of a light navigator with a light
source icon in a second position in accordance with an embodiment; and
FIGURE 7 is a schematic representation of a light navigator with a light
source icon in a third position in accordance with an embodiment.
DETAILED DESCRIPTION OF THE INVENTION
In the following detailed description, reference is made to the accompanying
drawings that form a part hereof, and in which is shown by way of illustration specific
embodiments that may be practiced. These embodiments are described in sufficient
detail to enable those skilled in the art to practice the embodiments, and it is to be
understood that other embodiments may be utilized and that logical, mechanical,
electrical and other changes may be made without departing from the scope of the
embodiments. The following detailed description is, therefore, not to be taken as limiting
the scope of the invention.
FIG. 1 is a schematic diagram of a system for interacting with volumetric or
3D data according to an embodiment. The system 100 includes a memory 102, a
processor 104, a user input 106, and a display device 108. The memory 102 may include
any known medium for storing digital data, including, but not limited to, a hard drive, a
flash memory, random access memory (RAM), read only memory (ROM), a compact
disc (CD), and a compact disc read-only memory (CD-ROM). The processor 104 is
communicatively connected to the memory. The processor 104 may include one or more
separate processing components. For example, the processor 104 may include a central
processing unit (CPU), a microprocessor, a graphics processing unit (GPU), or any other
electronic component capable of processing inputted data according to specific logical
instructions. Having a processor that includes a GPU may be advantageous for
computation-intensive operations, such as volume-rendering large 3D datasets.
According to some embodiments, the memory 102 may be co-located with the processor
104. However, according to other embodiments, the memory 102 may be remotely
located with respect to the processor 104 and accessed through technologies including
wireless networks, the internet, or an intranet.
A user input 106 is communicatively connected to the processor 104. The
user input 106 may include a trackball and one or more buttons according to an
exemplary embodiment. However, according to other embodiments, the user input 106
may include one or more of a mouse, a track pad, a touch screen, rotary controls, or an
assortment of hard or soft keys with defined functions. The display device 108 is
communicatively connected to the processor 104 as well. The display device 108 may
include a monitor or display, such as an LCD screen, an LED screen, a
projector, or any other device suitable for displaying a volume-rendered image. Other
embodiments may include multiple display devices, such as two or more LED screens.
The system 100 may optionally include an acquisition device 110 configured
to acquire one or more 3D datasets. The acquisition device 110 may include any device
configured to acquire 3D data. For example, the acquisition device 110 may include a
medical imaging device such as a computed tomography (CT) system, a magnetic
resonance tomography (MR) system, an ultrasound system, a nuclear medicine system, a
positron emission tomography (PET) system, or any other imaging modality capable of
acquiring 3D data such as optical imaging. According to other embodiments, the
acquisition device 110 may include a non-medical device capable of acquiring 3D data.
According to other embodiments, the system 100 may receive one or more 3D datasets
that were acquired or generated with a device that is separate from the system 100.
Figure 2 is a schematic representation of the geometry that may be used to
generate a volume-rendered image according to an embodiment. Figure 2 includes a 3D
dataset 150 and a view plane 154.
Referring to both Figures 1 and 2, the processor 104 may generate a volume-rendered
image according to a number of different techniques. According to an
exemplary embodiment, the processor 104 may generate a volume-rendered image
through a ray-casting technique from the view plane 154. The processor 104 may cast a
plurality of parallel rays from the view plane 154 to the 3D dataset 150. Figure 2 shows
ray 156, ray 158, ray 160, and ray 162 bounding the view plane 154. It should be
appreciated that additional rays may be cast in order to assign values to all of the pixels
163 within the view plane 154. The 3D dataset 150 comprises voxel data, where each
voxel, or volume-element, is assigned a value or intensity. According to some
embodiments, each voxel may be assigned an opacity as well. According to an
embodiment, the processor 104 may use a standard "front-to-back" technique for volume
composition in order to assign a value to each pixel in the view plane 154 that is
intersected by the ray. For example, starting at the front, that is the direction from which
the image is viewed, the intensities of all the voxels along the corresponding ray may be
summed. Then, optionally, the intensity may be multiplied by an opacity corresponding
to the voxels along the ray to generate an opacity-weighted value. These opacity-weighted
values are then accumulated in a front-to-back direction along each of the rays.
This process is repeated for each of the pixels 163 in the view plane 154 in order to
generate a volume-rendered image. According to an embodiment, the pixel values from
the view plane 154 may be displayed as the volume-rendered image. The volume-rendering
algorithm may be configured to use an opacity function providing a gradual
transition from opacities of zero (completely transparent) to 1.0 (completely opaque).
The volume-rendering algorithm may factor the opacities of the voxels along each of the
rays when assigning a value to each of the pixels 163 in the view plane 154. For
example, voxels with opacities close to 1.0 will block most of the contributions from
voxels further along the ray, while voxels with opacities closer to zero will allow most of
the contributions from voxels further along the ray. Additionally, when visualizing a
surface, a thresholding operation may be performed where the opacities of voxels are
reassigned based on the values. According to an exemplary thresholding operation, the
opacities of voxels with values above the threshold may be set to 1.0, while the
opacities of voxels with values below the threshold may be set to zero. Other types of
thresholding schemes may also be used. For example, an opacity function may be used
where voxels that are clearly above the threshold are set to 1.0 (which is opaque) and
voxels that are clearly below the threshold are set to zero (translucent). However, an
opacity function may be used to assign opacities other than zero and 1.0 to the voxels
with values that are close to the threshold. This "transition zone" may be used to reduce
artifacts that may occur when using a simple binary thresholding algorithm. For
example, a linear function mapping opacities to values may be used to assign opacities to
voxels with values in the "transition zone". Other types of functions that progress from
zero to 1.0 may also be used. According to other embodiments, volume-rendering
techniques other than the ones described above may also be used in order to generate
volume-rendered images from a 3D dataset.
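For illustration only, the front-to-back compositing and transition-zone opacity function described above might be sketched as follows; the threshold value, the transition-zone width, and the sample values are assumptions for the example, not values taken from this disclosure.

```python
import numpy as np

def opacity_function(value, threshold, width):
    """Map voxel values to opacities with a linear 'transition zone'
    around the threshold, instead of a hard binary cut."""
    return float(np.clip((value - (threshold - width / 2.0)) / width, 0.0, 1.0))

def composite_ray(samples, threshold=0.5, width=0.2):
    """Accumulate voxel samples along one ray, front to back.
    `samples` are the interpolated voxel intensities the ray visits."""
    color_acc = 0.0        # accumulated (grey) intensity for this pixel
    alpha_acc = 0.0        # accumulated opacity
    for value in samples:
        alpha = opacity_function(value, threshold, width)
        # standard front-to-back "over" compositing
        color_acc += (1.0 - alpha_acc) * alpha * value
        alpha_acc += (1.0 - alpha_acc) * alpha
        if alpha_acc >= 0.99:   # early ray termination: nearly opaque
            break
    return color_acc

# One ray through a toy dataset: the pixel value comes from the
# accumulated, opacity-weighted samples along the ray.
print(composite_ray([0.1, 0.3, 0.8, 0.9, 0.2]))
```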
The volume-rendered image may be shaded in order to present the user with a
better perception of depth. For example, a plurality of surfaces may be defined based on
the 3D dataset. Next, according to an exemplary embodiment, a gradient may be
calculated at each of the pixels. The processor 104 (shown in Figure 1) may then
compute light reflection at positions on the surface corresponding to each of the pixels
and apply standard shading methods based on the gradients and a specific light direction.
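A minimal sketch of such gradient-based shading follows, assuming central-difference gradients and a simple diffuse (Lambertian) reflection term; neither choice is mandated by the disclosure.

```python
import numpy as np

def shade_lambertian(volume, light_dir):
    """Estimate surface normals from the voxel gradient and compute a
    diffuse shading term for a given light direction."""
    light = np.asarray(light_dir, dtype=float)
    light /= np.linalg.norm(light)
    gx, gy, gz = np.gradient(volume.astype(float))  # central differences
    normals = np.stack([gx, gy, gz], axis=-1)
    lengths = np.linalg.norm(normals, axis=-1, keepdims=True)
    normals = normals / np.maximum(lengths, 1e-12)  # avoid divide-by-zero
    # Diffuse term: cosine of the angle between normal and light direction.
    return np.clip(np.einsum('...k,k->...', normals, light), 0.0, 1.0)

volume = np.random.rand(16, 16, 16)              # stand-in 3D dataset
shading = shade_lambertian(volume, (-1.0, -1.0, 0.5))
print(shading.shape)                             # one shading value per voxel
```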
Figure 3 is a flow chart illustrating a method 300 in accordance with an
embodiment. The individual blocks represent steps that may be performed in accordance
with the method 300. The technical effect of the method 300 is the display of a light
direction indicator and a model of a solid in order to show a light direction used to
determine the shading of a volume-rendered image.
Figure 4 is a schematic representation of a screen shot of a display device
according to an embodiment. The screen shot 400 includes a volume-rendered image
402 and a light navigator 409. The light navigator 409 includes a scale volume-rendered
image 404, a light direction indicator 405, and a model of a solid 406. For purposes
of this disclosure, the term "light navigator" is defined to include the combination of a
light direction indicator and a model of the solid. The light navigator 409 may also
include the scale volume-rendered image 404 according to some embodiments, as shown
in the embodiment of Figure 4. Additional details about the screen shot 400 will be
described hereinafter.
Referring to Figures 1, 3, and 4, at step 302 of the method, the processor 104
accesses a 3D dataset. As described previously, the 3D dataset may be accessed from a
medical imaging device according to an exemplary embodiment. The 3D dataset may
include voxel data where each voxel is assigned a value corresponding to an intensity. At
step 304, the processor 104 generates a volume-rendered image from the 3D dataset.
According to an embodiment, the processor 104 may generate the volume-rendered
image according to one of the techniques described with respect to Figure 2. As part of
the generation of the volume-rendered image during step 304, the processor 104
determines the shading for the volume-rendered image. As described hereinabove with
respect to Figure 2, the shading of the image may include calculating how light from a
specific light direction would interact with the structure represented in the volume-rendered
image. The algorithm controlling the shading may calculate how the light
would reflect, refract, and diffuse based on the orientation of the surfaces represented in
the volume-rendered image. The orientation of the light direction directly affects how the
volume-rendered image is shaded. Shading of volume-rendered images is well-known by
those skilled in the art and will therefore not be described in additional detail.
At step 306, the processor 104 displays the volume-rendered image generated
during step 304 on the display device 108. The volume-rendered image 402 (shown in
Figure 4) is an example of a volume-rendered image that may be generated and displayed
according to an embodiment. Hereinafter, the method 300 will be described according to
an exemplary embodiment illustrated in Figure 4. It should be appreciated by those
skilled in the art that other embodiments may differ from the exemplary embodiment.
Referring to Figures 1, 3, and 4, at step 308, the processor 104 displays a
model of a solid, such as the model of the solid 406. The model of the solid 406 may be
a model of a sphere in accordance with the embodiment depicted in Figure 4. According
to other embodiments, the model of the solid may include a model of a different shape,
such as a model of an ellipsoid or other shapes with a generally convex outer surface.
According to yet other embodiments, the model of the solid may include a model of any
shape with a generally smooth outer surface. Next, at step 310, the processor 104
displays a light direction indicator, such as the light direction indicator 405. According to
an embodiment, the light direction indicator 405 may include both a light source icon 408
and a highlight 411. The light source icon 408 indicates the light direction used to shade
the volume-rendered image 402. According to an embodiment, the light source icon 408
may be a model of an arrow 413 with an arrow head 415. The light source icon 408 may
include a representation of a polygonal model that is directional or shaped so that a user
can clearly identify one end from an opposite end. This way, the directional model can
be used to clearly indicate the light direction used to shade a volume-rendered image. The
highlight 411 includes a bright area on the model of the solid 406. The highlight 411
may show an area of increased reflectivity due to illumination of the model of the solid
from the light direction. The highlight 411 may be included as part of the rendering of
the model of the solid 406.
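One plausible way to organize the state behind the light navigator described above is sketched below; the field names, the vector encoding of the light direction, and the default values are illustrative assumptions rather than terms from the disclosure.

```python
from dataclasses import dataclass, field
import numpy as np

@dataclass
class LightNavigatorState:
    """State behind the light navigator: a light direction indicator
    (light source icon plus highlight), the model of the solid, and the
    optional scale volume-rendered image."""
    light_dir: np.ndarray = field(
        default_factory=lambda: np.array([-1.0, -1.0, 0.3]))  # toward the volume
    solid_opacity: float = 0.3      # semi-transparent model of the solid
    show_scale_image: bool = True   # optional smaller copy of the rendering

    def normalized_light(self) -> np.ndarray:
        d = self.light_dir.astype(float)
        return d / np.linalg.norm(d)

nav = LightNavigatorState()
print(nav.normalized_light())
```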
Optionally, the processor 104 (shown in Figure 1) may generate and display a
scale volume-rendered image such as the scale volume-rendered image 404. The scale
volume-rendered image 404 is a smaller representation of the volume-rendered image
402. The scale volume-rendered image 404 is shown from the same perspective as the
volume-rendered image 402. According to some embodiments, the scale volume-rendered
image 404 may optionally be shaded in the same manner, according to the same
relative light direction as the volume-rendered image 402. That is, the position of the
light direction with respect to the volume-rendered image 402 is the same as the position
of the light direction with respect to the scale volume-rendered image 404. According to
the embodiment shown in Figure 4, the light direction is coming from the upper left side
as indicated by the light source icon 408. The light direction in Figure 4 is also slightly
from the front side. For the purpose of this disclosure, the terms "front side" and "back
side" will be used to describe the orientation of light direction with respect to images. A
view direction is the direction from which the user views the images. The view direction
is typically normal to the display device in the direction from the user towards the display
device. For purposes of this disclosure, the term "front side" is defined to include the
surfaces within a model or a volume-rendering that would be visible from the view
direction. Conversely, for purposes of this disclosure, the term "back side" is defined to
include the surfaces within a model or a volume-rendering that would be visible from a
direction 180 degrees from the view direction. Since the volume-rendered image 402, the
scale volume-rendered image 404, the light source icon 408 and the model of the solid
406 may all be volume-rendered, the terms "front side" and "back side" may be applied
to any of the aforementioned elements. Therefore, the shading 410 is pronounced on the
right and lower sides of the volume-rendered image 402. While not shown in the
schematic representation of the screen shot 400, the shading of the volume-rendered
image 402 may include more subtle effects as well, including areas of reflection.
Additionally, the shading may include one or both of diffuse shading and specular
shading based on the light direction.
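Both the diffuse and specular terms mentioned above and the "front side"/"back side" distinction reduce to dot products. A sketch follows, assuming the common Blinn-Phong form and a particular sign convention for the direction vectors, neither of which is specified by the disclosure.

```python
import numpy as np

def unit(v):
    v = np.asarray(v, dtype=float)
    return v / np.linalg.norm(v)

def shade_point(normal, to_light, to_viewer, shininess=32.0):
    """Diffuse plus specular (Blinn-Phong) shading at one surface point.
    `to_light` and `to_viewer` point from the surface outward."""
    n, l, v = unit(normal), unit(to_light), unit(to_viewer)
    diffuse = max(float(np.dot(n, l)), 0.0)
    specular = max(float(np.dot(n, unit(l + v))), 0.0) ** shininess
    return diffuse, specular

def light_is_front_side(light_travel_dir, view_dir):
    """Light comes from the 'front side' when it travels roughly the same
    way the user looks (into the screen); the sign convention is assumed."""
    return float(np.dot(unit(light_travel_dir), unit(view_dir))) > 0.0

# Surface facing the viewer, light from the upper left and slightly in front.
print(shade_point(normal=[0, 0, 1], to_light=[-1, 1, 1], to_viewer=[0, 0, 1]))
print(light_is_front_side(light_travel_dir=[1, -1, 1], view_dir=[0, 0, 1]))
```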
The model of the solid 406 and the light source icon 408 may both be
generated from a polygonal model according to an embodiment. Using a polygonal
model to generate the light source icon 408 allows the processor 104 (shown in Figure 1)
to accurately portray the three-dimensionality of the light source icon 408 from a variety
of different perspectives with respect to the model of the solid 406. This is useful since,
according to an embodiment, the light source icon 408 may be shown in a variety of
different positions and orientations with respect to the scale volume-rendered image 404
and the model of the solid 406. According to an embodiment, the model of the solid 406
may be transparent or semi-transparent in order to allow the viewer to see the position of the light
source icon 408 and/or the highlight 411 even when the light source icon 408 or the
highlight 411 is on the backside of the model of the solid 406. The processor 104 (shown
in Figure 1) may apply some shading to the light source icon 408 in order to give the user
a better sense of perspective regarding the orientation of the light source icon 408.
Next, at step 315, a user determines if it is desired to move the light direction
indicator 405. If it is not desired to move the light direction indicator 405, then the
method ends. However, if it is desired to move the light direction indicator 405, the
method 300 advances to step 316. At step 316, a user may input a command through the
user input 106 (shown in Figure 1) in order to reposition the light direction indicator 405.
The user may use the user input 106 to move the position of the light direction indicator
405. According to an exemplary embodiment, the user input 106 may include a trackball
or mouse configured to function as a virtual trackball. For example, the movement of the
mouse or trackball may be projected onto a virtual trackball that controls the rotation of
the light direction indicator 405. The translation movement from the user input 106
results in a rotation of the virtual trackball in the direction of the translation movement.
For example, according to an embodiment represented in the screen shot 400, the light
direction indicator 405 includes both the light source icon 408 and the highlight 411. The
virtual trackball may control the position of the light source icon 408 and the highlight
411 with respect to the model of the solid 406. For example, both the light source icon
408 and the highlight 411 may be rotated about the model of the solid 406 in
real-time based on the rotation of the virtual trackball. According to
another embodiment, the user input may include a trackball. The movements of the
trackball may be directly mapped to the virtual trackball. That is, instead of simply
controlling the translation of a pointer on the screen, the processor 104 (shown in Figure
1) may move the light direction indicator 405 in a rotational manner about the model of
the solid 406 that mirrors the user's movements with the trackball. It is very intuitive for
a user to control the light direction in three dimensions using a trackball or a virtual
trackball. According to either embodiment, the user may need to select a button or place
a cursor, pointer or icon at a specific location, such as over the light navigator 409
(shown in Figure 4) in order to initialize the functioning of the user input 106 as the
virtual trackball. It should be appreciated that the user input 106 may control the light
direction in other ways according to other embodiments.
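As an illustrative sketch of the virtual trackball described above, pointer coordinates can be projected onto a sphere and consecutive points converted into a rotation applied to the light direction; the projection and the Rodrigues construction below follow the standard arcball formulation, which is an assumption rather than the specific mapping of this disclosure.

```python
import numpy as np

def project_to_sphere(x, y, radius=1.0):
    """Project normalized screen coordinates (in [-1, 1]) onto a virtual
    trackball: a sphere near the center, a hyperbolic sheet at the edges."""
    d2 = x * x + y * y
    r2 = radius * radius
    z = np.sqrt(r2 - d2) if d2 <= r2 / 2.0 else r2 / (2.0 * np.sqrt(d2))
    return np.array([x, y, z])

def trackball_rotation(p0, p1):
    """Rotation matrix taking trackball point p0 to p1 (Rodrigues formula)."""
    a, b = p0 / np.linalg.norm(p0), p1 / np.linalg.norm(p1)
    axis = np.cross(a, b)
    s, c = np.linalg.norm(axis), float(np.dot(a, b))  # sin and cos of the angle
    if s < 1e-9:
        return np.eye(3)                              # no movement
    k = axis / s
    K = np.array([[0, -k[2], k[1]],
                  [k[2], 0, -k[0]],
                  [-k[1], k[0], 0]])
    return np.eye(3) + s * K + (1.0 - c) * (K @ K)

# Dragging the pointer from (0, 0) to (0.2, 0.1) rotates the current
# light direction about the model of the solid.
light_dir = np.array([0.0, 0.0, 1.0])
R = trackball_rotation(project_to_sphere(0.0, 0.0), project_to_sphere(0.2, 0.1))
print(R @ light_dir)
```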
According to an embodiment, the processor 104 may shade the scale volume-rendered
image 404 based on the same light direction that was used to shade the volume-rendered
image 402 during step 304. For example, as described previously, the scale
volume-rendered image 404 may be a smaller version of the volume-rendered image 402.
This means that the scale volume-rendered image 404 shows the same structures in the
same orientation as the volume-rendered image 402. Additionally, according to an
embodiment, shading may be applied to the volume-rendered image 402 and the scale
volume-rendered image 404 based on the same relative light direction. That is, the
shading on the scale volume-rendered image 404 may be based on a light direction that is
the same as the light direction used to generate the shading on the volume-rendered
image 402. It should be appreciated by those skilled in the art that it may not be
necessary to independently calculate the shading for the scale volume-rendered image
404. Instead, the processor 104 may use a smaller version of the volume-rendered image
402, including any shading, as the scale volume-rendered image 404. The result is that
the shading may be the same in both the volume-rendered image 402 and the scale
volume-rendered image 404. It should also be appreciated that since the scale volume-rendered
image 404 is smaller than the volume-rendered image 402, it may be necessary
for the scale volume-rendered image 404 to be lower in resolution than the volume-rendered
image 402.
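Reusing the already-shaded rendering for the scale image, as noted above, amounts to a downsampling step; the block-averaging method and the factor of 4 below are assumptions for illustration.

```python
import numpy as np

def make_scale_image(rendered, factor=4):
    """Produce the scale volume-rendered image by shrinking the full
    shaded rendering; no second shading pass is needed, at the cost of
    resolution."""
    h, w = rendered.shape
    h, w = h - h % factor, w - w % factor          # crop to a multiple
    blocks = rendered[:h, :w].reshape(h // factor, factor,
                                      w // factor, factor)
    return blocks.mean(axis=(1, 3))                # block average

full = np.random.rand(256, 256)      # stand-in for the shaded rendering
print(make_scale_image(full).shape)  # (64, 64)
```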
According to an embodiment, for the purposes of shading, the processor 104
may use the color, opacity, and location of each of the voxels as inputs in order to
calculate how the light may reflect, refract, and diffuse on the various structures
represented in the volume-rendered image in order to generate a more realistic volume-rendered
image.
According to an embodiment, the model of the solid 406 may be at least semi-transparent.
This means that the model of the solid 406 may be either completely
transparent, in which case the model of the solid 406 would only be visible based on the
way that it interacts with light from the light direction, or the model of the solid 406 may
be partially opaque. For embodiments where the model of the solid 406 is semi-transparent
(i.e., partially opaque), the model of the solid 406 may partially block the scale
volume-rendered image 404. The highlight 411 may be generated by calculating how the
model of the solid 406 would reflect and refract light originating from the light source
icon 408 or the light direction according to an embodiment. It should be appreciated that
while the highlight 411 is shown as a homogeneous area in Figure 4, the area of
reflection may decrease in intensity, for example smoothly, with radial distance away
from the center of the arrow head 415. Other embodiments
may not include the highlight 411 on the model of the solid 406.
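A smooth radial falloff of the highlight could be modeled, for example, with a Gaussian profile; the profile and its width are assumptions, since the disclosure only requires that the intensity decrease with radial distance from the arrow head 415.

```python
import numpy as np

def highlight_intensity(r, peak=1.0, sigma=0.15):
    """Highlight brightness as a smooth function of radial distance r
    (on the solid's surface) from the center under the arrow head."""
    return peak * np.exp(-0.5 * (r / sigma) ** 2)

for r in (0.0, 0.1, 0.2, 0.4):
    print(f"r={r:.1f}  intensity={highlight_intensity(r):.3f}")
```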
Figure 5 is a schematic representation of the light navigator 409 with the light
direction indicator 405 in a first position in accordance with an embodiment. Figure 6 is
a schematic representation of the light direction indicator 405 in a second position in
accordance with an embodiment. Figure 7 is a schematic representation of the light
direction indicator 405 in a third position in accordance with an embodiment. Common
reference numbers are used to identify identical components in the light navigator 409
shown in Figures 4, 5, 6, and 7.
Referring now to Figures 3, 4, 5, 6, and 7, as previously described, the user
may move the light direction indicator 405 (shown in Figure 4) from a first position to a
second position at step 316. For example, the user may move the light direction indicator
405 from a first position, such as that shown in Figure 4, to a second position, including
any of the exemplary positions shown in Figures 5, 6, and 7. For example, a first
exemplary position is shown in Figure 5, a second exemplary position is shown in Figure
6, and a third exemplary position is shown in Figure 7.
Referring to Figure 5, the light direction indicator 405 is positioned at the
right side of the model of the solid 406. This corresponds to a situation where the light
direction is originating from the right. The arrow head 415 is clearly pointing to the left.
This corresponds to a situation where the shading of the volume-rendered image was
done based on a light direction from the right. The scale volume-rendered image 404
includes shading 412 that is consistent with the light direction as indicated by the light
source icon 408 and the highlight 411.
Referring to Figure 6, the second exemplary position has the light source icon
408 positioned in the upper right, approximately in the 2 o'clock position. However,
based on the geometry of the light source icon 408 and by noticing that the model of the
solid 406 is in front of the light source icon 408, it is easy for a user to tell that the light
source icon 408 is positioned slightly behind the model of the solid 406. In other words,
the arrow head 415 is pointed slightly towards the view direction.
Referring to Figure 7, in the third exemplary position, the light source icon
408 is in approximately the 7 o'clock position. However, the light source icon 408 is
positioned in front of the model of the solid 406. This indicates that the light direction is
from the front side of the model of the solid 406. The arrow head 415 is pointing in towards
the model of the solid 406. The scale volume-rendered image 404 includes shading 412
that is consistent with the light direction as indicated by the light source icon 408.
According to an embodiment, the processor 104 (shown in Figure 1) may
control the order of rendering the light source icon 408, the model of the solid 406, and
the scale volume-rendered image 404 in order to obtain the most realistic transparency
values. For example, according to embodiments or configurations when the light source
icon 408 indicates that light is coming from the front side, the processor 104 may perform
the rendering by first rendering the back side of the model of the solid 406. Next, the
processor 104 may render the scale volume-rendered image 404. Then, the processor
104 may render the front side of the model of the solid 406. And finally, the processor
104 may render the light source icon 408.
In contrast, for embodiments or configurations when the light direction is
from the back side, such as in Figure 6, the rendering order would be the reverse. That is,
first the light source icon 408 would be rendered, then the back side of the model of the
solid 406, then the scale volume-rendered image 404, and finally the front side of the
model of the solid 406.
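The perspective-dependent draw order described in the preceding two paragraphs can be reduced to a single front/back test; the layer names and the dot-product sign convention below are illustrative assumptions.

```python
import numpy as np

def navigator_draw_order(light_travel_dir, view_dir):
    """Back-to-front layer order for the light navigator, chosen so that
    transparency composites correctly for the current light position."""
    front_lit = float(np.dot(light_travel_dir, view_dir)) > 0.0  # sign assumed
    if front_lit:
        # Light in front of the solid: icon drawn last, on top.
        return ["solid back side", "scale volume-rendered image",
                "solid front side", "light source icon"]
    # Light behind the solid: icon drawn first, partially hidden.
    return ["light source icon", "solid back side",
            "scale volume-rendered image", "solid front side"]

view = np.array([0.0, 0.0, 1.0])                               # into the screen
print(navigator_draw_order(np.array([0.3, -0.2, 1.0]), view))  # front-lit
print(navigator_draw_order(np.array([0.3, -0.2, -1.0]), view)) # back-lit
```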
Referring to Figures 3 and 4, at step 318, the processor 104 (shown in Figure
1) updates the shading of the volume-rendered image 402 based on the new, or second,
light direction. According to an embodiment, the position of the light source icon 408
with respect to the scale volume-rendered image 404 is the same as the position of the
new, or second, light direction with respect to the volume-rendered image 402. The
processor 104 calculates the shading of the volume-rendered image 402 using the second
light direction.
The processor 104 may then update the shading of the scale volume-rendered
image 404 based on the second light direction. According to an embodiment, the
processor 104 may calculate the shading based on the second light direction
using the volume-rendered image 402 and apply the same shading to the scale volume-rendered
image 404. Next, the processor 104 may update the shading of the model of the
solid 406 based on the second light direction. By moving the light source icon 408 with
respect to the scale volume-rendered image 404 and the model of the solid 406, the user
is able to control the light direction used for shading the volume-rendered image 402.
According to an embodiment, steps 316 and 318 may occur in real-time as the user is
moving the light source icon 408. The result of this is that the shading on the volume-rendered
image 402, the scale volume-rendered image 404, and the model of the solid
406 are all updated to reflect the real-time light direction as controlled by the position of
the light source icon 408. This allows the user to quickly and easily see the effects of any
changes in the light direction produced by movement of the light source icon 408. At
step 320, the user may optionally move the light source icon 408 again. If the user chooses to
move the light source icon 408, then the method 300 returns to step 316, and steps 316,
318, and 320 are repeated.
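Steps 316 and 318 amount to an interactive loop: each pointer movement yields a rotation, the light direction is updated, and every view lit by it is re-shaded. A minimal sketch with the renderers stubbed out; the function names are hypothetical.

```python
import numpy as np

def update_on_drag(light_dir, rotation, renderers):
    """Apply one trackball rotation to the light direction and re-shade
    every view that depends on it, in real-time."""
    new_dir = rotation @ light_dir
    for render in renderers:   # volume-rendered image, scale image, solid
        render(new_dir)
    return new_dir

# Stub renderers standing in for the three shaded views.
renderers = [lambda d, name=n: print(f"re-shade {name} with light {np.round(d, 2)}")
             for n in ("volume-rendered image", "scale image", "model of the solid")]

light = np.array([0.0, 0.0, 1.0])
turn = np.array([[0.0, 0.0, 1.0],    # a 90-degree rotation, as a stand-in for
                 [0.0, 1.0, 0.0],    # one trackball_rotation() result
                 [-1.0, 0.0, 0.0]])
light = update_on_drag(light, turn, renderers)
```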
The combination of the scale volume-rendered image 404, the model of the
solid 406, and the light source icon 408 allow the user to quickly and accurately
comprehend the exact position of the light direction used to generate shading on the
volume-rendered image 402. Since, according to an embodiment, both the scale volume-rendered
image 404 and the model of the solid 406 are shaded based on the light
direction, it is very intuitive for the user to understand the three-dimensional position of
the light source icon 408. Additionally, the use of shading on the model of the solid 406
greatly aids a user in comprehending the relative positioning of the light source icon 408.
For example, as the user moves the light source icon 408, the shading on the model of the
solid 406 may change in real-time corresponding to the changes in light direction.
Shading applied to the model of the solid 406 will create areas of brightness and shadows
in a manner that is very familiar to users. This will make it easier and faster for the user
to accurately position the light source in order to obtain the desired shading on a volume-rendered
image.
This written description uses examples to disclose the invention and to enable
any person skilled in the art to practice the invention, including making and using any
devices or system and performing any incorporated methods. The patentable scope of the
invention is defined by the claims, and may include other examples that occur to those
skilled in the art. For example, the light direction may represent the direction of a light
beam used to shade the volume-rendered image. The light beam used to determine the
shading of the volume-rendered image may be adjustable by a clinician in terms of width
and/or degree of spread (e.g., varied from a collimated pencil beam to a diffuse pattern
like a flood light or anywhere in between), color palette (e.g., varied from "cooler"
blue/green tints to "warmer" red/yellow tints or even natural skin tones) as well as
intensity and/or penetration depth. These and other such examples are intended to be
within the scope of the claims if they have structural elements that do not differ from the
literal language of the claims, or if they include equivalent structural elements with
insubstantial differences from the literal language of the claims.
PART LIST
Figure 1
100 system
102 memory
104 processor
106 user input
108 display device
110 acquisition device
Figure 2
150 3D dataset
154 view plane
156 ray
158 ray
160 ray
162 ray
163 pixels
Figure 3
300 method
302 access 3D dataset
304 generate volume-rendered image
306 display volume-rendered image
308 display model of a solid
310 display light direction indicator
315 move light direction indicator?
316 move light direction indicator
318 update the shading on the volume-rendered image
320 move light direction indicator again?
Figure 4
400 screen shot
402 volume-rendered image
404 scale volume-rendered image
405 light direction indicator
406 model of a solid
408 light source icon
409 light navigator
410 shading
411 highlight
412 shading
413 model of an arrow
415 arrow head
Figure 5
404 scale volume-rendered image
405 light direction indicator
406 model of a solid
408 light source icon
411 highlight
412 shading
415 arrow head
Figure 6
404 scale volume-rendered image
405 light direction indicator
408 light source icon
411 highlight
412 shading
415 arrow head
Figure 7
404 scale volume-rendered image
406 model of a solid
408 light source icon
409 light navigator
411 highlight
412 shading
415 arrow head
We claim:
1. A system (100) for interacting with 3D data comprising:
a display device (108);
a user input (106); and
a processor (104) communicatively connected to the display device (108)
and the user input (106), wherein the processor (104) is configured to:
generate a volume-rendered image (402) based on a 3D dataset, the
volume-rendered image (402) being shaded from a light direction;
display the volume-rendered image (402) on the display device (108);
display a model of a solid (406) on the display device (108) at the
same time as the volume-rendered image (402); and
display a light direction indicator (405) at a position with respect to the
model of the solid (406) to indicate the light direction used for shading
the volume-rendered image (402).
2. The system (100) of claim 1, wherein the model of the solid (406) is at least
semi-transparent.
3. The system (100) of claim 1, wherein the light direction indicator (405)
comprises a light source icon (408).
4. The system (100) of claim 3, wherein the light source icon (408) comprises a
representation of a polygonal model.
5. The system (100) of claim 3, wherein the light source icon (408) comprises an
arrow with an arrow head to indicate the light direction.
6. The system (100) of claim 1, wherein the light direction indicator (405)
comprises a highlight (411) on the model of the solid (406).
7. The system (100) of claim 1, wherein the light direction indicator (405)
comprises both a light source icon (408) and a highlight (411) on the model of
the solid (406).
8. The system (100) of claim 1, wherein the processor (104) is further configured
to display a scale volume-rendered image (404) within the model of the solid
(406) at the same time as the volume-rendered image (402), wherein the scale
volume-rendered image (404) comprises a smaller representation of the
volume-rendered image (402).
9. The system (100) of claim 1, wherein the model of the solid (406) comprises a
model of a sphere.
10. The system (100) of claim 1, wherein the processor (104) is further configured
to shade the volume-rendered image (402) based on a second light direction in
addition to the light direction.
11. The system (100) of claim 10, further comprising displaying a second light
direction indicator at the same time as the light direction indicator (405),
wherein the position of the second light direction indicator with respect to the
model of the solid (406) corresponds to the second light direction.
12. The system (100) of claim 1, wherein the processor (104) is further configured
to update the shading of the volume-rendered image (402) in real-time to
correspond with the position of the light source icon (408) with respect to the
model of the solid (406) as the light source icon (408) is moved based on a
command through the user input (106).
13. A system (100) for interacting with 3D data comprising:
a display device (108);
a memory (102);
a user input (106); and
a processor (104) communicatively connected to the display device (108),
the memory (102) and the user input (106), wherein the processor (104) is
configured to:
access a 3D dataset from the memory (102);
generate a volume-rendered image (402) from the 3D dataset that is
shaded from a light direction;
display the volume-rendered image (402) on the display device (108);
display a model of a solid (406) at the same time as the volume-rendered
image (402); and
display a light source icon (408) at a position with respect to the model
of the solid (406) to indicate the light direction used for shading the
volume-rendered image (402).
14. The system (100) of claim 13, wherein the processor (104) is further
configured to update the shading of the volume-rendered image (402) to
correspond with the position of the light source icon (408) with respect to the
model of the solid (406) as the light source icon (408) is moved based on a
command entered through the user input (106).
MANISHA SING NAIR
Agent for the Applicant [IN/PA-740]
LEX ORBIS
Intellectual Property Practice
709/710, Tolstoy House,
15-17, Tolstoy Marg,
New Delhi-110001