
Player, Recorder, Playing Method, And Recording Method

Abstract: A playback device for playing back a playlist. The playback device determines, as a current sub-playitem, a sub-playitem that is optimum for the current playitem each time the current playitem changes. The playback device continues a playback of a playitem when a clip file referred to by the current sub-playitem has been downloaded and is in the Enable state in the local storage; and stops, by issuing a DataStarved event, the playback of the playitem when the clip file referred to by the current sub-playitem is in a missing state or an invalid state in the recording medium.


Patent Information

Application #
Filing Date
06 October 2009
Publication Number
52/2009
Publication Type
INA
Invention Field
COMMUNICATION
Status
Email
Parent Application

Applicants

PANASONIC CORPORATION
1006, OAZA KADOMA, KADOMA-SHI, OSAKA 571-8501 JAPAN

Inventors

1. TAIJI SASAKI
C/O. PANASONIC CORPORATION 1006, OAZA KADOMA, KADOMA-SHI, OSAKA 571-8501 JAPAN
2. HIROSHI YAHATA
C/O. PANASONIC CORPORATION 1006, OAZA KADOMA, KADOMA-SHI, OSAKA 571-8501 JAPAN
3. KAZUHIRO MOCHINAGA
C/O. PANASONIC CORPORATION 1006, OAZA KADOMA, KADOMA-SHI, OSAKA 571-8501 JAPAN
4. WATARU IKEDA
C/O. PANASONIC CORPORATION 1006, OAZA KADOMA, KADOMA-SHI, OSAKA 571-8501 JAPAN

Specification

DESCRIPTION
PLAYBACK DEVICE, RECORDING DEVICE, PLAYBACK
METHOD, AND RECORDING METHOD
Technical Field
The present invention relates to a technical field of
a playlist playback technology.
Background Art
The playlist is information defining a logical playback
path. In various application formats for storing contents, the
playlist is used as a unit of playback. In general, the playlist
is composed of playitems and sub-playitems. The playitem is a
logical playback section that is defined by specifying a pair
of an in point and an out point in a time axis of a video stream
that constitutes a main video image.
The sub-playitem is a logical playback section that is
defined by specifying a pair of an in point and an out point in
a time axis of a sub-stream. Here, the sub-stream is any of an
audio stream, video stream, and subtitle stream that is not
multiplexed with a video stream, but is played back together with
the video stream.
When a virtual package is constructed from a combination
of contents recorded on a ROM disc and another recording medium,
it is possible to define, with use of the sub-playitem, a playback
section of a sub-stream that is downloaded via a network. Here,
it is possible to use the playitem in the playlist to define a
playback section of an AV stream recorded on a ROM disc, and to
use the sub-playitem in the playlist to define a playback section
of a sub-stream to be downloaded. With such usage, the playlist
achieves a new combination of subtitle, audio and moving image
which is not achieved by optical discs.
As one example, Patent Document 1 identified below
discloses a technology for generating a virtual package.
[Patent Document 1]
Japanese Patent Application Publication No. 2006-109494
Disclosure of the Invention
The Problems the Invention Is Going to Solve
Meanwhile, reading from the ROM disc may be performed
at a bit rate of 54 Mbps. However, in many ordinary user
homes, the bit rate guaranteed for a streaming playback is best
effort, and a bit rate as high as that of reading from a ROM disc
cannot always be guaranteed for streaming playback.
Here, a case will be studied where a playlist including
many sub-playitems is to be played back under a circumstance where
the bit rate for streaming playback is restricted.
Suppose, for example, that, in a specific scene of a
main-feature video image, a playitem is associated with 10
sub-playitems, and the sub-playitems support 10 languages being
options. It is also presumed here that each clip file referred
to by each sub-playitem has a data size of 10 MB. Then, to start
a playback of one playitem, files totaling 100 MB (= 10 languages
× 10 MB) should be downloaded, as well as the AV clips referred
to by the playitem of the main-feature video image. In such a case,
when an application in charge of downloading clip files first
downloads the clip files of a total of 100MB referred to by the
10 sub-playitems, and then downloads the clip files referred to
by the playitem, the supply of the clip files referred to by the
playitem may be delayed under the circumstance where the bit rate
is restricted, resulting in a delayed progress of playback.
One might think that this could be avoided by first
downloading the clip files referred to by the main path and then
downloading the clip files referred to by the sub-playitems.
However, in that case, when the user selects, as a playback target,
elementary streams in the clip files referred to by the
sub-playitems, the elementary streams of the playback target
cannot be supplied to the decoder, and the playback may proceed
while any of the subtitle, audio, and video image is missing.
As described above, when there are a lot of clip files
to be downloaded due to the presence of many sub-playitems, it
is impossible to instruct the application which clip files should
be downloaded first with priority. When this happens under the
circumstance where the bit rate is restricted, an interruption
to the playback may become prominent or the playback may proceed
while any of the subtitle, audio, and video image is missing.
In addition, since the user is not conscious of the way
the playback is performed, namely whether the playback is performed
by reading streams from the ROM disc or by receiving supply of
streams from a network, when the playback is not performed
completely, the user may consider it a defect of the playback
device and may complain about it to the customer service of the
maker of the device. Typically, the procedure of downloading
the clip files referred to by the sub-playitems is made by an
application program created by a producer of a movie, not by a
program embedded in the playback device. It is therefore not
desirable for the maker to receive a complaint when an application
program created by a producer of a movie happens to give priority
to downloading the clip files referred to by the sub-playitems.
An object of the present invention is therefore to provide
a playback device that can instruct an application on which clip
files should be downloaded first with priority when there are
a lot of clip files to be downloaded due to the presence of many
sub-playitems.
Means to Solve the Problems
The above-described object is fulfilled by a playback
device for playing back a playlist, wherein the playlist includes
a plurality of playitems and a plurality of sub-playitems, each
playitem is information defining a playback section in a clip
file that includes a main stream, each sub-playitem is information
defining a playback section in a clip file that includes a
sub-stream, and the clip files defined and referred to by the
sub-playitems are transferred, the playback device comprising:
a playback unit operable to play back the playitems; a specifying
unit operable to specify a current playitem; a determining unit
operable to determine a current sub-playitem each time the current
playitem is specified; and a sub-stream register operable to
indicate the current sub-stream that is to be played back in
synchronization with the current playitem, the current
sub-playitem defining a playback section of the sub-stream
indicated by the sub-stream register, wherein the playback unit
continues playback of the playitem when a clip file being referred
to by the current sub-playitem exists in an accessible recording
medium, and stops playback of the playitem when the clip file
is in a missing state or an unrecognizable state in the accessible
recording medium.
Effects of the Invention
With the above-described structure, the playback device
stops playback when a clip file, which is being referred to by
the current sub-playitem information and includes the current
sub-stream, does not exist in the recording medium.
In other words, the playback device continues playback
when a clip file, which is being referred to by the current
sub-playitem information and includes the current sub-stream,
exists in the recording medium. Therefore, clip files that should
be downloaded first with priority are (i) the clip file including
the main stream and (ii) the clip file that is being referred
to by the current sub-playitem information and includes the current
sub-stream.
When an application downloads these clip files first with
priority, before downloading clip files that are referred to by
other sub-playitems, it is possible to continue playback of a
playitem including many sub-playitems although the bit rate will
become lower than that with the ROM disc.
With this structure, it is possible to instruct an
application on which clip files should be downloaded first with
priority when there are many clip files to be downloaded due to
the presence of many sub-playitems since it is possible to continue
the playback as far as the clip files include a clip file that
is being referred to by the current sub-playitem information and
includes the current sub-stream. Accordingly, even if there are
a lot of sub-playitems and the bit rate is restricted, it is possible
to continue the playback as far as the application performs
downloading following the rule: "while a playitem is played back,
download a clip file that is being referred to by a sub-playitem
that is to be the next current sub-playitem". In this way,
this structure can instruct which clip files should be downloaded
first with priority, thus eliminating an occurrence that the
playback proceeds while any of the subtitle, audio, and moving
image is missing. This minimizes an interruption of the progress
of a playback which happens due to a delay in the supply of streams.
More specifically, when downloading follows
the above-stated rule, the bit rate B that is required for
downloading the clip files is calculated as follows.
Bit rate B =
((data size of clip file constituting next playitem) +
(data size of clip file constituting next sub-playitem))
/ (playback time of current playitem)
In the above equation, the "next playitem" means a playitem that
should be played back next to the current one, and the "next
sub-playitem" means a sub-playitem that should be played back
next to the current one.
With the bit rate required for the downloading being
calculated as indicated above, it is possible to minimize an
interruption of the progress of a playback even if the bit rate
is restricted.
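The calculation above can be sketched in code. The following is a minimal illustration, not part of the specification; the function name and the file sizes and playback time in the example are hypothetical.

```python
def required_download_bitrate(next_playitem_clip_bytes,
                              next_sub_playitem_clip_bytes,
                              current_playitem_seconds):
    """Bit rate B (bits per second) needed so that the clip file of the
    next playitem and the clip file of the next current sub-playitem
    finish downloading while the current playitem is still playing."""
    total_bits = (next_playitem_clip_bytes + next_sub_playitem_clip_bytes) * 8
    return total_bits / current_playitem_seconds

# Hypothetical example: a 10 MB main-path clip and a 10 MB sub-path clip
# must arrive during a 60-second current playitem.
b = required_download_bitrate(10_000_000, 10_000_000, 60)
print(round(b / 1_000_000, 2), "Mbps")  # about 2.67 Mbps
```

If the transmission path sustains at least this bit rate B, the two clips needed next are always available when the current playitem changes, which is exactly the condition under which the playback is never interrupted.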
Also, when the clip file to be referred to by the current
sub-playitem is in an unrecognizable state, the progress of the
playback is stopped. This prevents the user from becoming
conscious of whether the streams are supplied from the ROM disc
or supplied through downloading. It eliminates the need for
verifying how the playback will be without a playback of a
sub-playitem, and lightens the load at the authoring.
The playback device of the present invention can continue
a playback of a playlist without waiting for AV clips to be
downloaded, since the playback device can perform the playback
by downloading minimum clip files required for the playback.
Brief Description of the Drawing
Fig. 1 shows one example of an implementation of a usage
act of a playback device 101.
Fig. 2 shows one example of a playlist.
Fig. 3 shows a playlist used as one example in the explanation
of the operation.
Fig. 4 shows how the playlist shown in Fig. 3 is played
back.
Fig. 5 shows an example of the structure of a playback device
101.
Fig. 6 is a list of the system parameters (SPRM).
Fig. 7 illustrates the streaming-like playback function.
Fig. 8 shows an example of how the streaming-like playback
proceeds with use of sub-playitems.
Fig. 9 shows a state where the playback position is reaching
playitem #2.
Fig. 10 shows a state where, after the user requested a
subtitle change, the current playback position has moved from
sub-playitem #3 that is referring to 10003.m2ts of sub-path (ID=0),
to sub-playitem #3 that is referring to 20003.m2ts of sub-path
(ID=1).
Fig. 11 shows a state where, after a chapter jump occurred,
playitem #3 is the current playitem, and sub-playitem #3 is the
current sub-playitem information.
Fig. 12 is a flowchart showing the procedure of the process
performed by the BD-J application.
Fig. 13 is a flowchart showing the procedure of the process
of downloading AV clips.
Fig. 14 is a flowchart showing the procedure of the playlist
playback process.
Fig. 15 is a flowchart showing the procedure of determining
the current sub-playitem.
Fig. 16 is a flowchart showing the procedure of the
progressive attribute AV clip control.
Fig. 17 is a flowchart showing the procedure of the playback
stop recovery process.
Fig. 18 shows the clip files that are requested to be
downloaded when the current playitem is playitem #1 and the current
sub-playitem information is sub-playitem #1 of sub-path (ID=1).
Fig. 19 shows the clip files that are requested to be
downloaded when the current playitem is playitem #2 and the current
sub-playitem information is sub-playitem #2 of sub-path (ID=1).
Fig. 20 shows the clip files that are requested to be
downloaded when the current playitem is playitem #3 and the current
sub-playitem information is sub-playitem #3 of sub-path (ID=1).
Fig. 21 shows one example of the structure of an AV clip.
Fig. 22 schematically shows how the AV clips are multiplexed.
Fig. 23 illustrates in more detail how the video stream
is stored in the PES packet sequences.
Fig. 24 shows the format of the TS packets ultimately written
in the AV clip.
Fig. 25 explains the data structure of the PMT in detail.
Fig. 26 shows one example of the clip information file.
Fig. 27 shows one example of the stream attribute
information.
Fig. 28 shows one example of the entry map.
Fig. 29 shows one example of the internal structure of the
system target decoder 13.
Fig. 30 shows the data structure of the PlayList information.
Fig. 31 shows the internal structure of the Subpath
information with close-ups.
Fig. 32 shows one example of the entire structure of the
STN_table.
Fig. 33 shows one example of stream_entry for the secondary
video stream, as a part of the entire structure of the STN_table
shown in Fig. 32.
Fig. 34A shows one example of Stream_entry and
Stream_attribute in the primary video stream.
Fig. 34B shows Stream_entry in the secondary video stream.
Fig. 35A shows one example of the bit assignment in the
PSR14.
Fig. 35B shows one example of the bit assignment in the
PSR29.
Fig. 36 shows one example of the internal structure of the
playback control engine.
Fig. 37 is a flowchart showing the procedure for determining
the current secondary video stream performed by the stream
selection procedure.
Fig. 38 is a flowchart showing the procedure for determining
an optimum secondary video stream for the current playitem.
Figs. 39A through 39E show one example of the sequential-type
stereo goggle.
Fig. 40 shows one example of the internal structure of the
primary video stream and the secondary video stream for
stereoscopic viewing.
Fig. 41 shows one example of the internal structure of the
system target decoder 13.
Fig. 42 shows one example of the internal structure of
Primary_audio_stream_entry and Secondary_audio_stream_entry,
and the internal structure of Comb_info_Secondary_video_
Secondary_audio.
Fig. 43A shows one example of the bit assignment in the
PSR1.
Fig. 43B shows one example of the bit assignment in the
PSR14.
Fig. 44 is a flowchart showing the procedure for determining
the current primary audio stream when the current playitem changes.
Fig. 45 is a flowchart showing the procedure for determining
the secondary audio stream.
Fig. 46 is a flowchart showing the procedure for selecting
a current secondary audio stream that is optimum for the current
playitem.
Fig. 47 shows a portion of the STN_table that is especially
related to the PGtextST stream.
Fig. 48A shows one example of the numerical range of the
stream numbers that the current PGtextST stream may have.
Fig. 48B shows one example of the bit assignment in the
PSR2.
Fig. 49 is a flowchart showing the procedure for determining
the current PiP PG textST stream when the current playitem changes.
Fig. 50 is a flowchart showing the procedure for selecting
a current PG textST subtitle stream that is optimum for the current
playitem.
Fig. 51 shows an example of the structure of the BD-ROM.
Fig. 52 shows an example of the internal structure of the
index file.
Fig. 53 shows an example of the internal structure of the
update kit stored in the local storage 103.
Fig. 54 shows an example of the contents of the merge
management information file and the process for building the
virtual package, based on the contents of the merge management
information file, from the BD-ROM file and the update kit file.
Fig. 55 shows one example of the authoring system.
Figs. 56A and 56B illustrate the method used to create the
ROM disc image and the update kit image.
Description of Characters
100 BD-ROM
102 WWW server
103 local storage
104 television
Best Mode for Carrying Out the Invention
[0027] The following describes embodiments of the playback device
and recording device which are provided with the means for solving
the above-described problem, with reference to the drawings.
<Embodiment 1>
The following describes an embodiment of the playback device
and recording device. Firstly, of the implementation acts of
the playback device, a usage act is described. Fig. 1 shows an
implementation of a usage act of a playback device 101. As shown
in Fig. 1, the playback device 101 is used by the user together
with a recording medium 100 that is one example of the first
recording medium, a WWW server 102, a local storage 103, and a
television 104.
The BD-ROM 100 is a recording medium on which a movie
work is recorded.
[0030] The playback device 101 constitutes a home theater
together with the television 104 and plays back the BD-ROM 100.
The playback device 101 has a function to download the data into
the recording medium, thus having a function of a recording device,
as well.
The WWW server 102 is a server device that manages an
official site of the movie distributor, and distributes a set
of files (update kit) to the user via the Internet or the like,
where the set of files achieves a partial replacement or addition
of the movie work recorded on the BD-ROM 100.
The local storage 103 is attached to the playback device
to be used as a storage for storing the content distributed from
the WWW server 102 of the movie distributor. With this structure,
a content that was downloaded into the local storage 103 via the
net can be combined with a content recorded on the BD-ROM 100,
and thus the content recorded on the BD-ROM 100 can be
expanded/updated.
The television 104 provides an interactive operation
environment to the user by displaying the playback images of the
movie work and displaying the menu or the like.
[0034] Up to now, the use form of the playback device has been
described. The following describes the playlist, which is a
target of playback of the playback device.
Fig. 2 shows a playlist, where a playlist is composed
of a "main path" and one or more "sub-paths".
[0036] The "main path" is composed of one or more playitems.
The "sub-paths" are a series of playback paths played
back together with a main path, and are assigned with IDs (sub-path
IDs) in order of being registered in the playlist. The sub-path
IDs are used to identify the sub-paths. The sub-paths include
"synchronized type" and "non-synchronized type", where the
"synchronized type" sub-paths are played back in synchronization
with a main path, and "non-synchronized type" sub-paths are played
back not in synchronization with a main path. The type of each
sub-path is written in the sub-path. A sub-path is composed
of one or more pieces of sub-playitem information. In the case
that the sub-path type is the synchronized type, the playback
start time and the playback end time of the sub-playitem are
expressed using the same time axis as the main path. In the case
that the sub-path type is non-synchronized type, the playback
start time and the playback end time of the sub-playitem are
expressed using a time axis different from that of the main path.
Also, the "playitem" includes a stream number table. The
stream number table is information that indicates stream numbers
of elementary streams that are permitted to be played back in
the playitem. Detailed explanation of the playlist information,
playitem information, sub-playitem information, and stream number
table will be given in other embodiments later.
[0039] In the following description of operation, a certain
playlist is used as one example.
Fig. 3 shows the certain playlist used as one example
in the explanation of the operation.
The playlist is composed of a main path and two sub-paths
(sub-path (ID=0) and sub-path (ID=1)). The main path includes five
playitems #1, #2, #3, #4, and #5. The sub-path with ID=0 includes
five sub-playitems #1, #2, #3, #4, and #5, and the sub-path with
ID=1 includes five sub-playitems #1, #2, #3, #4, and #5.
Both of the sub-paths are the synchronized-type, and refer
to AV clips on which a presentation graphics stream is multiplexed.
The presentation graphics stream multiplexed on the AV clips
referred to by one of the sub-paths is subtitle data of one language.
The AV clips used by the playlist are all contents contained
in the update kit stored in the local storage, and have a progressive
attribute. The progressive attribute is an attribute of an AV
clip indicating that said AV clip does not need to be stored in
the local storage in advance of the playlist playback, as long as
the AV clip has been stored in the local storage immediately before
the sub-playitem referring to it becomes the current sub-playitem
information.
[0043] The five playitems #1, #2, #3, #4, and #5 respectively
refer to 00001.m2ts, 00002.m2ts, 00003.m2ts, 00004.m2ts, and
00005.m2ts.
[0044] The five sub-playitems #1, #2, #3, #4, and #5 of sub-path
ID=0 respectively refer to 10001.m2ts, 10002.m2ts, 10003.m2ts,
10004.m2ts, and 10005.m2ts.
The five sub-playitems #1, #2, #3, #4, and #5 of sub-path
ID=1 respectively refer to 20001.m2ts, 20002.m2ts, 20003.m2ts,
20004.m2ts, and 20005.m2ts.
The playitems in the main path have the same stream number
table such as the one shown on the upper-right corner of Fig.
3. This stream number table has three entries that have been
respectively assigned with stream numbers "1", "2", and "3". The
three entries respectively permit playback of (i) a primary video
stream that is referred to by the playitem information of the
main path, (ii) a presentation graphics stream (PG#1) that is
encompassed by an AV clip that is referred to by the sub-playitem
(sub-path ID=0), and (iii) a presentation graphics stream (PG#2)
that is encompassed by an AV clip that is referred to by the
sub-playitem (sub-path ID=1).
When the current subtitle stream number is "2", the
corresponding stream entry is PG#1 indicated by sub-path ID=0,
and thus PG#1 indicated by sub-path ID=0 is played back in
synchronization with the playback of the playitem.
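The relationship described above, from a current stream number through the stream number table to the clip supplying that stream, can be sketched with simple data structures. This is an illustrative model of the playlist of Fig. 3 only; the class names, field names, and function are the author's assumptions, not part of the BD-ROM format.

```python
# Minimal sketch of the Fig. 3 playlist: a main path of five playitems
# and two synchronized sub-paths of five sub-playitems each.
from dataclasses import dataclass

@dataclass
class PlayItem:
    clip_file: str          # clip referred to by this playitem

@dataclass
class SubPlayItem:
    clip_file: str          # clip referred to by this sub-playitem

main_path = [PlayItem(f"0000{i}.m2ts") for i in range(1, 6)]
sub_paths = {
    0: [SubPlayItem(f"1000{i}.m2ts") for i in range(1, 6)],  # PG#1 subtitles
    1: [SubPlayItem(f"2000{i}.m2ts") for i in range(1, 6)],  # PG#2 subtitles
}

# Stream number table shared by all playitems: stream number -> source.
# Number 1 is the primary video; numbers 2 and 3 map to sub-path IDs 0 and 1.
stream_number_table = {1: ("video", None), 2: ("PG", 0), 3: ("PG", 1)}

def clip_for_stream(stream_number, playitem_index):
    """Resolve the clip file that supplies the stream with this number
    while the playitem with this index is the current playitem."""
    kind, sub_path_id = stream_number_table[stream_number]
    if kind == "video":
        return main_path[playitem_index].clip_file
    return sub_paths[sub_path_id][playitem_index].clip_file

print(clip_for_stream(2, 0))  # 10001.m2ts: PG#1 while playitem #1 plays
```

In this model, a subtitle change by the user simply replaces the current stream number (2 for PG#1 with 3 for PG#2), which makes a clip of the other sub-path the one to be supplied to the decoder.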
Fig. 4 shows how the playlist shown in Fig. 3 is played
back. The right-hand side of Fig. 4 shows the WWW server 102,
and the left-hand side shows the playback device 101. The middle
part of Fig. 4 shows a transmission path which is, for example,
the Internet or an intranet. The 00001.m2ts, 00002.m2ts,
00003.m2ts, 00004.m2ts, and 00005.m2ts shown in Fig. 3 exist in
the local storage 103. As shown in Fig. 4, of the AV clips,
00001.m2ts and 10001.m2ts are transmitted from
the WWW server 102 to the playback device 101 in response to a
download request transmitted to the WWW server 102.
In the database of the server device, the AV clips are
stored and managed in files whose file names are none of 00001.m2ts,
00002.m2ts, 00003.m2ts, 00004.m2ts, and 00005.m2ts. This is
because files that may constitute a virtual package can be accessed
by aliases via manifest files.
The following describes structural elements of the
playback device 101 for performing a playback of a playlist, a
download request, and a download. The structural elements for
performing these processes include the BD-J application and BD-J
object. The following describes these structural elements.

The BD-J object is data that includes an application
management table (ApplicationManagementTable()), and causes the
platform unit to perform the application signaling for switching
titles during a playback of a BD-ROM. More specifically, the
ApplicationManagementTable() includes "application_id" and
"application_control_code", where the application_id indicates
a BD-J application to be executed, and the
application_control_code indicates a control to be performed when
the BD-J application is activated. The application_control_code
defines the first execution state of the application after a title
is selected. Also, the application_control_code can define
AUTOSTART or PRESENT, where with AUTOSTART, the BD-J application
is loaded into the virtual machine and is started automatically,
and with PRESENT, the BD-J application is loaded into the virtual
machine, but is not started automatically.
The following describes the internal structure of the
playback device.
Fig. 5 shows an example of the structure of a playback
device 101. The playback device 101 is composed of a BD-ROM drive
10, a read buffer 11, a read buffer 12, a system target decoder
13, a BD-J execution unit 14, a network interface 15, a virtual
package control unit 16, a state management unit 17, a user event
processing unit 18, a playback engine 19, a playback control engine
20, an HDMI transmission/reception unit 21, a heap memory 22,
a virtual machine interpreter 23, and a PSR set 24. A description
is now given of each of these components.
[0058]
The BD-ROM drive 10 reads data from a BD-ROM disc, and stores
the data in the read buffer 11.
[0059]
The read buffer 11 is a buffer constituted from a memory
or the like that temporarily stores data read out by using the
BD-ROM drive.

The read buffer 12 is a buffer constituted from a memory
or the like that temporarily stores data read out from the local
storage.

The system target decoder 13 performs a demultiplexing
process onto source packets read out into the read buffer 11 or
the read buffer 12, and performs a process of decoding and playing
back the streams. The system target decoder 13 also performs
a process of decoding and playing back graphics data such as JPEG
and PNG for display of a menu or the like by the BD-J execution
unit 14. Details of the system target decoder 13 are given later.

The BD-J execution unit 14 is a program processing engine
that executes the BD-J application transferred from the virtual
package control unit 16. The BD-J execution unit 14 performs
operations in accordance with the program of the BD-J application,
and performs control as follows. (1) The BD-J execution unit
14 requests playlist playback of the virtual package
control unit 16. (2) The BD-J execution unit 14 downloads the
update kit from a WWW server via the Internet into the local storage.
(3) The BD-J execution unit 14 instructs that the virtual package
be constructed by combining the BD-ROM and the update kit. (4)
The BD-J execution unit 14 sets the values of the player variables.
(5) The BD-J execution unit 14 transfers PNG and/or JPEG graphics
for a menu or a game to the system target decoder, to
display a screen. These operations can be performed flexibly
in accordance with the makeup of the programs. What kind of control
is performed is determined by the programming
of the BD-J application in the authoring procedure.

The network interface 15 achieves a communication function
in the playback device. Upon receiving a URL specified from the
BD-J application, the network interface 15 establishes a TCP
connection, FTP connection or the like with the web site of the
URL, enabling the Java™ application to download information from
the web site.

The virtual package control unit 16 controls the BD-ROM
drive 10 and the local storage 103 to construct a virtual package,
and has the function of controlling playback by the player. The
virtual package is a virtual BD-ROM package constructed by
combining BD-ROM contents in the memory, based on contents recorded
on a BD-ROM disc, the difference data stored in the local storage
103, and the merge management information stored in the local
storage 103. The constructed virtual package has the same format
as the data structure of the BD-ROM. The virtual package may
be constructed when a disc is inserted in the playback device
101 or when a virtual package construction instruction is executed
by the BD-J execution unit 14. After the virtual package is
constructed, the virtual package control unit 16 controls the
playback process of the AV clip via the playlist information,
in accordance with a playback instruction from the BD-J execution
unit or a notification by the user event processing unit.
Furthermore, the virtual package control unit 16 also performs
setting and referencing of the player variable, and performs
playback operations.

The state management unit 17 manages the state (any of Missing
state, Enable state, and Disable state) which each AV clip stored
in the BD-ROM or the local storage is in, and performs a control
on whether or not to stop the playback of a playlist.
The Missing state is a state in which an AV clip referred
to by the playitem information or the sub-playitem information
is stored in neither the BD-ROM nor the local storage.
[0067] The Enable state is a state in which the AV clip can be
played back by the virtual package control unit 16, and is
controlled by the API of the BD-J application. When an AV clip
is set to the Enable state by the API, the AV clip comes to have
the read-only attribute, and becomes playable by the virtual
package control unit 16.
The Disable state is the reverse of the Enable state,
a state in which the AV clip cannot be played back by the virtual
package control unit 16. An AV clip is in the Disable state when
the AV clip has never been set to the Enable state by the
BD-J application. When the BD-J application is to delete or
overwrite an AV clip in the Enable state, the AV clip should first
be changed to the Disable state by the API.
AV clips that are in the Missing state or the Disable
state are generically called "Unavailable clips".
The control on whether or not to stop the playback of
a playlist is specifically a control where, when the current
sub-playitem information changes, it is judged whether the AV
clip referred to by the current sub-playitem information is an
Unavailable clip, and when the AV clip is an Unavailable clip,
a DataStarved event is notified to the JMF player instance and
the playback control engine so that the JMF player instance and
the playback control engine are transferred from the playback
state to the stop state.
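The control described above can be sketched as follows. This is a minimal illustration of the three clip states and the Unavailable-clip judgment; the function and variable names are the author's assumptions, not the actual interfaces of the state management unit.

```python
from enum import Enum

class ClipState(Enum):
    MISSING = "Missing"   # stored on neither the BD-ROM nor the local storage
    ENABLE = "Enable"     # downloaded and made playable via the API
    DISABLE = "Disable"   # stored but not (or no longer) enabled

def is_unavailable(state):
    """Missing-state and Disable-state clips are generically
    'Unavailable clips'."""
    return state in (ClipState.MISSING, ClipState.DISABLE)

def on_current_sub_playitem_change(clip_states, clip_file, notify):
    """When the current sub-playitem information changes, judge whether
    the AV clip it refers to is an Unavailable clip; if so, notify a
    DataStarved event so that playback moves from the playback state
    to the stop state."""
    state = clip_states.get(clip_file, ClipState.MISSING)
    if is_unavailable(state):
        notify("DataStarved")
        return "stopped"
    return "playing"

states = {"10003.m2ts": ClipState.ENABLE, "20003.m2ts": ClipState.DISABLE}
print(on_current_sub_playitem_change(states, "10003.m2ts", print))  # playing
print(on_current_sub_playitem_change(states, "20003.m2ts", print))
# prints "DataStarved" and then "stopped"
```

Note that an unknown clip file falls through to the Missing state, so the check stops playback both when a download has not yet happened and when a downloaded clip has been disabled for overwriting.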

The user event processing unit 18, in response to a user
operation input via a remote control, requests the BD-J execution
unit 14 or the virtual package control unit 16 to execute a process.
For example, when a button on the remote control is pressed, the
user event processing unit 18 requests the BD-J execution unit
14 to execute a command indicated by the button. For example,
when a Forward/Backward play button on the remote control is pressed,
the user event processing unit 18 requests the virtual package
control unit 16 to execute a Forward/Backward process onto the
AV clip of the playlist that is currently played back.

The playback engine 19 executes AV playback functions. The
AV playback functions in the playback device are a group of
traditional functions inherited from CD and DVD players. The
AV playback functions include: Play, Stop, Pause On, Pause Off,
Still Off, Forward Play (with specification of the speed), Backward
Play (with specification of the speed), Audio Change, Subtitle
Change, and Angle Change. To realize the AV playback functions,
the playback engine 19 controls the system target decoder to decode
a portion of the AV clip that corresponds to a desired time.

The playback control engine 20 performs playback control
functions for the playlist. The playback control functions for
the playlist specifically mean that, among the AV playback
functions performed by the playback engine 19, the Play and Stop
functions are performed in accordance with the current playlist
information and the clip information.

The HDMI transmission/reception unit 21, via an HDMI (High
Definition Multimedia Interface), receives, from a device
connected to the HDMI, information concerning the device, and
sends non-compressed digital video to the device connected to
the HDMI, together with LPCM and compressed audio data, where
the non-compressed digital video is obtained by decoding by the
system target decoder.

The heap memory 22 is a stack memory reserved for the BD-J
execution unit 14, and stores the JMF player instance generated
by the BD-J application, and byte code generated by performing
class loading on the BD-J application. These are stored in the
form of threads and are supplied for execution by the virtual
machine interpreter 23 in a First-In-First-Out manner.

The virtual machine interpreter 23 converts the byte code
stored in the heap memory 22 into a native code that can be executed
by the CPU, and causes the CPU to execute the native code.

The PSR set 24 is composed of a player setting register
storing player variables, and a player state register. The player
variables fall into system parameters (SPRM) showing the state
of the player, and general parameters (GPRM) for general use.
Fig. 6 is a list of the system parameters (SPRM).
SPRM(0): Language code
SPRM(1): Primary audio stream number
SPRM(2): Subtitle stream number
SPRM(3): Angle number
SPRM(4): Title number
SPRM(5): Chapter number
SPRM(6): Program number
SPRM(7): Cell number
SPRM(8): Selected key name
SPRM(9): Navigation timer
SPRM(10): Playback time information
SPRM(11): Player audio mixing mode for Karaoke
SPRM(12): Country code for parental management
SPRM(13): Parental level
SPRM(14): Player setting value (video)
SPRM(15): Player setting value (audio)
SPRM(16): Language code for audio stream
SPRM(17): Language code for audio stream (extension)
SPRM(18): Language code for subtitle stream
SPRM(19): Language code for subtitle stream (extension)
SPRM(20): Player region code
SPRM(21): Secondary video stream number
SPRM(22): Secondary audio stream number
SPRM(23): Player state
SPRM(24): Reserved
SPRM(25): Reserved
SPRM(26): Reserved
SPRM(27): Reserved
SPRM(28): Reserved
SPRM(29): Reserved
SPRM(30): Reserved
SPRM(31): Reserved
The SPRM (10) is updated every time picture data belonging
to an AV clip is displayed. In other words, if the playback device
causes a new piece of picture data to be displayed, the SPRM (10)
is updated to show the display time (PTS) of the new picture.
The current playback point can be known by referring to the SPRM
(10) .
The language code for the audio stream of the SPRM (16)
and the language code for the subtitle stream of the SPRM (18)
are items that can be set in the OSD of the player or the like,
and show default language codes of the player. For example, it
is possible to include on the BD-ROM disc a BD-J application having
the following function. Namely, if the language code for audio
stream is English, when a playlist is played back, a stream entry
having the same language code is searched for in the stream
selection table of the playitem, and the corresponding audio stream
is selected and played back. These SPRMs are each stored in a
register with a capacity of 32 bits. The numbers in the parentheses
are used to identify the SPRMs, and basically indicate the register
numbers of the corresponding registers (note that SPRM(21) and
SPRM(22) do not indicate the register numbers of the corresponding
registers).
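As a rough sketch of the register model described above (32-bit registers identified by number), the PSR set can be modeled as follows. The class and method names are illustrative and are not taken from any BD-J API.

```python
class PSRSet:
    """Toy model of the PSR set: 32 system-parameter registers,
    each holding a 32-bit value (names are illustrative)."""
    def __init__(self):
        self.sprm = [0] * 32

    def set(self, n, value):
        self.sprm[n] = value & 0xFFFFFFFF   # clamp to the 32-bit capacity

    def get(self, n):
        return self.sprm[n]

psr = PSRSet()
psr.set(10, 123456789)   # SPRM(10): current playback time (PTS)
psr.set(1, 2)            # SPRM(1): primary audio stream number
```

Masking with `0xFFFFFFFF` mirrors the stated 32-bit capacity of each register.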

Referring now to Fig. 7, a description is given of playback
transitioning by the virtual package control unit 16 in the
streaming-like playback function using the virtual package. Fig.
7 illustrates the streaming-like playback function.
The streaming-like playback function performs the
playback operation and the download operation in parallel with
each other so that an AV clip that is referred to by playitem
information or sub-playitem information and that is assigned
the progressive attribute is stored into the local storage
immediately before the playitem information or the sub-playitem
information becomes the current playitem information
or the current sub-playitem information.
The upper row of Fig. 7 shows a playlist. Five playitems
in the playlist refer to 00001.m2ts, 00002.m2ts, 00003.m2ts,
00004.m2ts, and 00005.m2ts, respectively. These are all contents
stored in the local storage, and have the progressive attribute.
Of these, downloading of 00001.m2ts and 00002.m2ts is already
complete, and both are set to the Enable state by the BD-J
application. Each of 00003.m2ts, 00004.m2ts and 00005.m2ts has
either not been downloaded, or has been downloaded but is in the
Disable state.
As shown in the upper row of Fig. 7, when the virtual package
control unit 16 plays back a playlist in order from the top playitem,
the playback device plays back playitems #1 and #2 normally
because these playitems refer to AV clips that are in the Enable
state.
The middle row of Fig. 7 shows the playback position having
further advanced from the upper row in Fig. 7. The middle row
shows that downloading of 00003.m2ts is complete and 00003.m2ts
is put in the Enable state before the playback position moves
to the playitem #3. Therefore, the playback proceeds to the
playitem #3.
The lower row of Fig. 7 shows the playback position having
further advanced from the middle row in Fig. 7. The lower row
shows that 00004.m2ts, which is to be referred to, has not been
downloaded or is in the Disable state. In this case, the
DataStarved event is output. Upon receiving the DataStarved event,
the BD-J application recovers by performing control to complete
downloading of 00004.m2ts, set 00004.m2ts to the Enable state,
and start playback from the point at which the DataStarved event
was received.
As described above, in the streaming-like playback, the
progress of playback is not interrupted as long as the AV clip
of the playitem in the next playback position has been downloaded
and set to the Enable state in advance.
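The behavior shown in Fig. 7 can be modeled as a simple walk over the playlist; the function name, the state strings, and the return values below are hypothetical, chosen to match the terms used in the text.

```python
# Illustrative walk-through of Fig. 7: playback proceeds playitem by
# playitem as long as the referenced clip has reached the Enable state.
def play(playlist, clip_state):
    played = []
    for clip in playlist:
        if clip_state.get(clip) != "Enable":
            return played, "DataStarved"    # stop; the app must recover
        played.append(clip)
    return played, "Done"

states = {"00001.m2ts": "Enable", "00002.m2ts": "Enable",
          "00003.m2ts": "Enable", "00004.m2ts": "Disable"}
result = play(["00001.m2ts", "00002.m2ts", "00003.m2ts",
               "00004.m2ts"], states)
```

With the states above, playback covers the first three clips and then stops with a DataStarved event, matching the lower row of Fig. 7.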
Fig. 8 shows an example of how the streaming-like playback
proceeds with use of sub-playitems.
The third row of Fig. 8 shows the playlist of Fig. 3 with
addition of arrows that indicate positions to which the playback
has proceeded.
The first row of Fig. 8 shows the state of the JMF player
instance. The second row shows the state of the playback control
engine. The third row shows the subject playlist. As shown in
the first row, the JMF player instance in the playback state enters
into the stop state when the DataStarved event is output. The
DataStarved event is an event that indicates that there is no
data to be played back. Upon receiving the DataStarved event,
the JMF player instance and the playback control engine transit
to the stop state. Because it has a function to stop the operation
of the JMF player instance and the playback control engine, the
DataStarved event is also called "playback stop event".
As shown in the second row, the playback control engine
in the playback state, together with the JMF player instance,
enters into the stop state when the DataStarved event is output.
It would be understood from this that the JMF player instance
and the playback control engine transit to the stop state triggered
by the DataStarved event.
In the example shown in Fig. 8, sub-playitem #1 belonging
to sub-path (ID=0) is the current sub-playitem information.
Therefore, the playback proceeds through playitem #1 and
sub-playitem #1. At the stage of the next playitem, playitem #2,
20002.m2ts, which is referred to by sub-playitem #2 corresponding
to playitem #2, is an Unavailable clip. Thus, the DataStarved
event should be output to stop the playback when the current
sub-playitem becomes sub-playitem #2.
Fig. 9 shows a state where the playback position is approaching
playitem #2. In the example shown in Fig. 9, sub-playitem #2,
which belongs to sub-path (ID=0) and corresponds to playitem #2
in which the playback position exists, refers to an AV clip in
the Enable state. However, at this point in time, sub-playitem
#2 belonging to sub-path (ID=1) has either not been downloaded
or is referring to an AV clip in the Disable state. SPRM(2)
indicates sub-playitem #2 belonging to sub-path (ID=0).
Therefore, even if the AV clip to be referred to by sub-playitem
#2 belonging to sub-path (ID=1) is not in the Enable state, the
virtual package control unit 16 can continue the playback.
As described above, in the example shown in Fig. 9, even
if 20002.m2ts is an Unavailable clip, the progress of the playback
is not interrupted. Thus the playback proceeds to playitem #3.
At the stage of the next playitem, playitem #4, 10004.m2ts, which
is referred to by sub-playitem #4 corresponding to playitem #4,
is an Unavailable clip. Thus, the playback stops when the current
playitem information becomes playitem #4.
The following can be said from the above description.
That is to say, when it is intended to download all AV clips referred
to by sub-playitems of all sub-paths (ID=0, ID=1) which correspond
to the playback section of playitem #2, the AV clips to be downloaded
are 10001.m2ts, 10002.m2ts, 10003.m2ts, 10004.m2ts, 10005.m2ts,
20001.m2ts, 20002.m2ts, 20003.m2ts, 20004.m2ts, and 20005.m2ts.
This indicates that the virtual package control unit 16 should
wait for as many AV clips as sub-paths to be downloaded. In this
case, as the number of sub-paths increases, the user must wait
for a longer time during a playback of the playlist.
On the other hand, in the above-described operation with
respect to sub-playitems, when all AV clips including the streams
specified by the Primary audio stream number SPRM(1), the subtitle
stream number SPRM(2), the Secondary video stream number SPRM(21),
and the Secondary audio stream number SPRM(22) are in the
Enable state, the playback can be continued without waiting until
the AV clips of sub-playitems of other sub-paths are in the Enable
state. With this structure, the user can continue playback of
the playlist without waiting for the downloading of unnecessary
AV clips.
Fig. 10 shows a state where, after the user requested
a subtitle change, the current playback position has moved from
sub-playitem #3 that is referring to 10003.m2ts of sub-path (ID=0),
to sub-playitem #3 that is referring to 20003.m2ts of sub-path
(ID=1).
When SPRM(2) is changed to 3 by a user operation or the
like, the subtitle to be played back becomes that of sub-playitem
#3 of sub-path (ID=1). The 20003.m2ts referred to by this
sub-playitem #3 is an Unavailable clip. Therefore, the DataStarved
event is output at the same time as this change occurs. In this
way, when a subtitle change is requested and it is found that the
AV clip including the subtitle stream of the change destination
is either not yet downloaded or is in the Disable state, the virtual
package control unit 16 stops the playback and notifies the BD-J
application of the DataStarved event.
Fig. 11 shows a state where, after a chapter jump occurred,
playitem #3 is the current playitem, and sub-playitem #3 is the
current sub-playitem information. The 20003.m2ts referred to
by sub-playitem #3 is an Unavailable clip. Therefore, the
DataStarved event is output to stop the playback.
The playback stops when the sub-playitem referring to
an Unavailable clip becomes the current sub-playitem information.
In view of this, it is possible to continue the playback by
preferentially downloading a clip file that is referred to
by the next playitem and the next sub-playitem and that includes
the current stream being the playback target.
As described above, in a jump playback such as the chapter
jump, the virtual package control unit 16 notifies the DataStarved
event to the BD-J application when an AV clip that is either
not yet downloaded or in the Disable state is to be referred
to by a sub-playitem that corresponds to any of the Primary audio
stream number SPRM(1), the subtitle stream number SPRM(2), the
Secondary video stream number SPRM(21), and the Secondary audio
stream number SPRM(22). By defining the playback operations
of the virtual package control unit 16 in this way, playback can
continue with the minimum downloading of AV clips.
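A hedged sketch of this minimal-download check, using plain dictionaries in place of the actual player registers and stream tables (all names and the data layout are assumptions):

```python
# Only the sub-playitem clips carrying the streams selected by
# SPRM(1), SPRM(2), SPRM(21) and SPRM(22) must be Enabled for
# playback to continue; other sub-paths' clips are not checked.
def can_continue(sprm, clip_for_stream, clip_state):
    """sprm: register number -> stream number;
    clip_for_stream: stream number -> clip carrying that stream."""
    for n in (1, 2, 21, 22):        # the four selected stream numbers
        clip = clip_for_stream.get(sprm[n])
        if clip is not None and clip_state.get(clip) != "Enable":
            return False            # a DataStarved event would be notified
    return True

sprm = {1: 1, 2: 2, 21: 0, 22: 0}   # stream 0 = nothing selected (assumed)
clip_for_stream = {1: "10003.m2ts", 2: "20003.m2ts"}
states = {"10003.m2ts": "Enable", "20003.m2ts": "Disable"}
```

Here playback would stop, because the selected subtitle stream's clip 20003.m2ts is still Disabled; once it reaches the Enable state, the check passes without any other sub-path's clip being downloaded.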
Fig. 12 is a flowchart showing the procedure of the process
performed by the BD-J application.
[0101] In step S1, an update kit corresponding to a loaded BD-ROM
is downloaded into the BUDA (Binding Unit Data Area) directory
in the local storage. In step S2, a virtual package construction
request specifying the Missing state in the update kit is issued.
Following this step, a loop of steps S3 through S8 is
executed. In this loop, a playlist is selected and a JMF player
instance is created (step S3), the top playitem of the playlist
information is set as the current playitem (step S4), an AV clip
specified by Clip_information_file_name in the current playitem
is selected (step S5), the AV clip is downloaded (step S6), and
the following three threads are established in the virtual machine
and the processes are executed in parallel with each other (step
S7):
(1) playlist playback;
(2) progressive attribute AV clip control; and
(3) DataStarved event recovery process.
After these three processes are completed, the three threads
are ended, and the control returns to step S3.
Fig. 13 is a flowchart showing the procedure of the process
of downloading AV clips. In step S11, a network interface is
instructed to download an AV clip i from a server device on the
WWW to the BUDA directory. In step S12, the AV clip is changed
from the Missing state to the Disable state. In step S13, a
completion of the download is waited for. When the download is
completed, verification is performed in step S14. When the
verification ends normally, the AV clip is put into the Enable
state in step S15.
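Assuming the state names used in the text, the transitions of Fig. 13 can be sketched as follows; `verify` stands in for the unspecified verification step.

```python
# Sketch of the Fig. 13 state transitions: Missing -> Disable when the
# download begins, Disable -> Enable only after verification succeeds.
def download_clip(state, verify):
    assert state == "Missing"
    state = "Disable"       # step S12: download begun, not yet playable
    # ... download completion is waited for (steps S11, S13) ...
    if verify():            # step S14: verification of the downloaded clip
        state = "Enable"    # step S15: the clip becomes playable
    return state
```

A clip whose verification fails therefore stays in the Disable state and remains an Unavailable clip.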
Fig. 14 is a flowchart showing the procedure of the playlist
playback process. In step S21, the current sub-playitem
information is determined, and the control moves to a loop of
steps S22 through S28. The loop is performed as follows. It is
judged whether or not the AV clip specified by
Clip_information_file_name in the current playitem is an
Unavailable clip (step S22). When it is judged that the AV clip
is not an Unavailable clip, source packets between In_Time and
Out_Time in the current playitem are read out from the BD-ROM,
among the source packets constituting the AV clip (step S23).
It is judged whether or not the AV clip specified by
Clip_information_file_name in the current sub-playitem
information is an Unavailable clip (step S24). When it is judged
that the AV clip is not an Unavailable clip, source packets between
In_Time and Out_Time in the current sub-playitem are read out from
the local storage, among the source packets constituting the AV
clip (step S25). Among the read-out source packets, source
packets which are permitted to be played back by the STN_table
of the current playitem are output to the decoder (step S26).
It is judged whether or not the current playitem is the last playitem
(step S27), and if not, the current playitem is changed (step S28).
These steps are performed in each round of the loop. The loop is
ended when it is judged that there is no more playitem in the
playlist (Yes in step S27).
When it is judged that the AV clip specified by
Clip_information_file_name in the current playitem or in the
current sub-playitem information is an Unavailable clip, a
DataStarved event is generated to put the playback control engine
and the JMF player instance into the Stop state (step S29), and
then resumption is waited for (step S30).
Fig. 15 is a flowchart showing the procedure of determining
the current sub-playitem. In step S31, the procedure determines
the current primary audio stream and sets it in SPRM(1). In step
S32, the procedure determines the current secondary audio stream
and sets it in SPRM(22). In step S33, the procedure determines
the current secondary video stream and sets it in SPRM(21). In
step S34, the procedure determines the current PG text subtitle
stream and sets it in SPRM(2).
Following this, all the sub-playitems referring, in
Clip_Information_file_name, to the current primary audio stream,
the current secondary audio stream, the current secondary video
stream, or the current PG text subtitle stream, are set as the
current sub-playitems.
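The selection described above can be sketched as follows; the data layout (each sub-playitem paired with the set of streams its clip carries) is an assumption made for illustration only.

```python
# Illustrative version of Fig. 15: after the four current streams are
# chosen, every sub-playitem whose clip carries at least one of them
# becomes a current sub-playitem (names and layout are hypothetical).
def determine_current_sub_playitems(sub_playitems, current_streams):
    """sub_playitems: list of (name, set of streams in its clip)."""
    return [name for name, streams in sub_playitems
            if streams & current_streams]

subs = [("sub#1", {"audio1", "pg1"}), ("sub#2", {"audio2"})]
current = determine_current_sub_playitems(subs, {"audio1"})
```

With the hypothetical data above, only sub#1 carries a currently selected stream and is therefore set as a current sub-playitem.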
Fig. 16 is a flowchart showing the procedure of the
progressive attribute AV clip control.
[0110] In step S41, an AV clip that is specified by
Clip_Information_file_name in the playitem (next playitem) that
is next to the current playitem is selected. In step S42, a
sub-playitem (next sub-playitem) whose Sync_PlayItem_Id
indicates the next playitem is detected. In step S43, an AV clip
that is specified by Clip_Information_file_name in the next
sub-playitem and that includes an elementary stream corresponding
to the current stream number in the PSR is selected.
In step S44, it is judged whether or not the selected
AV clip exists in the local storage. When it is judged that the
selected AV clip does not exist in the local storage, the selected
AV clip is downloaded in step S45. Then, in step S46, an AV clip
to be deleted is selected, and it is judged whether or not there
is an AV clip to be deleted (step S47). When it is judged that
there is an AV clip to be deleted, the AV clip is deleted in step
S48.
In step S49, a change of the current playitem or the current
sub-playitem is waited for. When it is judged that the current
playitem or the current sub-playitem has changed, it is judged
whether or not the playback of the playlist has ended (step S50).
When it is judged that the playback has not ended, the control
moves to step S41.
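The prefetch loop of Fig. 16 can be sketched roughly as follows, with a Python set standing in for the local storage; all names are illustrative and the download itself is reduced to adding the clip to the set.

```python
# Hedged sketch of the Fig. 16 loop body: ensure the clips needed by
# the next playitem and next sub-playitem are local, then evict any
# clip selected for deletion.
def prefetch_next(next_clips, local_storage, to_delete):
    downloaded = []
    for clip in next_clips:             # steps S41-S45: select, download
        if clip not in local_storage:
            local_storage.add(clip)     # stands in for the real download
            downloaded.append(clip)
    for clip in to_delete:              # steps S46-S48: select and delete
        local_storage.discard(clip)
    return downloaded

store = {"00001.m2ts"}
got = prefetch_next(["00002.m2ts", "20002.m2ts"], store, ["00001.m2ts"])
```

After one round, the clips for the next playback section are local and the already-played clip has been evicted.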
Fig. 17 is a flowchart showing the procedure of the playback
stop recovery process. Step S61 is a loop in which obtainment
of a DataStarved event is waited for. When a DataStarved event is
obtained, the control moves to step S62, a completion waiting loop
in which the completion of downloading of the AV clip of the
obtained DataStarved event is waited for. In step S63, the
playlist starts to be played back from the playback position
at which the DataStarved event was obtained. In step S64, it is
judged whether or not the playback of the playlist has ended.
Next, a description is given of the relationships between
the current playback position in the AV clip and the AV clips
that are targeted for downloading.
Fig. 18 shows the clip files that are requested to be
downloaded when the current playitem is playitem #1 and the current
sub-playitem information is sub-playitem #1 of sub-path (ID=1).
It is necessary that, in the local storage, 00001.m2ts constituting
the current playitem and 20001.m2ts constituting the current
sub-playitem information are both in the Enable state. In this
case, the next playitem should be playitem #2 and the next
sub-playitem information should be sub-playitem #2. Therefore,
the clip files that should be requested to be downloaded while
the current playitem and the current sub-playitem information
are played back are 00002.m2ts constituting the next playitem
and 20002.m2ts constituting the next sub-playitem information.
Fig. 19 shows the clip files that are requested to be
downloaded when the current playitem is playitem #2 and the current
sub-playitem information is sub-playitem #2 of sub-path (ID=1).
It is necessary that, in the local storage, 00002.m2ts constituting
the current playitem and 20002.m2ts constituting the current
sub-playitem information are both in the Enable state. In this
case, the next playitem should be playitem #3 and the next
sub-playitem information should be sub-playitem #3. Therefore,
the clip files that should be requested to be downloaded while
the current playitem and the current sub-playitem information
are played back are 00003.m2ts constituting the next playitem
and 20003.m2ts constituting the next sub-playitem information.
Fig. 20 shows the clip files that are requested to be
downloaded when the current playitem is playitem #3 and the current
sub-playitem information is sub-playitem #3 of sub-path (ID=1).
It is necessary that, in the local storage, 00003.m2ts constituting
the current playitem and 20003.m2ts constituting the current
sub-playitem information are both in the Enable state. In this
case, the next playitem should be playitem #4 and the next
sub-playitem information should be sub-playitem #4. Therefore,
the clip files that should be requested to be downloaded while
the current playitem and the current sub-playitem information
are played back are 00004.m2ts constituting the next playitem
and 20004.m2ts constituting the next sub-playitem information.
When the sub-path type is the synchronized type, the current
sub-playitem information has the same playback time length as
the current playitem. Accordingly, in the above-described
downloading, the clip files constituting the next playitem and
the next sub-playitem information should be downloaded while the
current playitem is played back.
The bit rate (bit rate B) required for the download is
calculated as follows:
Bit rate B =
((data size of clip file constituting the next playitem) +
(data size of clip file constituting the next sub-playitem))
/ (playback time of the current playitem)
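As a worked example of this formula (the clip sizes and playback time below are hypothetical):

```python
# Bit rate B required so that both next-section clips finish downloading
# within the playback time of the current playitem.
def required_bitrate(next_playitem_bytes, next_sub_playitem_bytes,
                     current_playitem_seconds):
    total_bits = (next_playitem_bytes + next_sub_playitem_bytes) * 8
    return total_bits / current_playitem_seconds   # bits per second

# e.g. a 90 MB main clip plus a 10 MB sub clip over a 100-second playitem:
b = required_bitrate(90_000_000, 10_000_000, 100)  # 8,000,000 bps = 8 Mbps
```

If the connection sustains at least this rate, the next playitem and next sub-playitem clips are in place before the playback position reaches them.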
The progress of playback is not interrupted by the occurrence
of a DataStarved event as long as, as described above, the AV clip
to be referred to by the next playitem and the AV clip to be referred
to by the next sub-playitem are downloaded while the current
playitem and the current sub-playitem are played back. This is
because the required minimum of AV clips is stored in the local
storage.
As described above, according to the present embodiment,
when a playlist composed of a plurality of sub-paths is to be
played back, the player variables in the playback device indicating
the selected streams are referred to, and the playback can continue
in succession even if the downloading of AV clips that are not
necessary for the playback has not been completed, thus achieving
an efficient streaming-like playback.

The following technical topics among those shown in
Embodiment 1 may further be improved or modified for implementation.
Note that whether to implement these as described in Embodiment
1 or to implement these after improvement or modification is
arbitrary, and should be determined by the implementor.

In the pre-processing of a playlist playback, priority may
be given to downloading of an AV clip including an audio stream
having the language code written in the language code for audio
stream in SPRM(16) and an AV clip including a subtitle stream
having the language code written in the language code for subtitle
stream in SPRM(18). With this kind of structure, AV clips that
the user is highly likely to use in playlist playback can be
selected as targets for downloading.

In the case of the main path indicating the main feature video
stored on the disc, it is preferable to sequentially select, in
order of proximity to the current playback position, AV clips
that are used by sub-playitems from the current playback position
onwards and that include a stream corresponding to the stream
number shown by the current Primary audio stream number SPRM(1),
the subtitle stream number SPRM(2), the Secondary video stream
number SPRM(21), or the Secondary audio stream number SPRM(22).
Since it is unlikely that the user will change the subtitles
or audio while watching the main feature video, this kind of
structure enables the playlist to be played back without making
the user wait.

The following process may be prepared in the BD-J application
to prevent the occurrence of a state where a DataStarved event
is issued while the main feature video is played back, due to
delay in downloading a subtitle/audio/Secondary video/Secondary
audio stream, and the playback of the main feature video temporarily
stops. That is to say, for example, when an AV clip including
a stream corresponding to the current Primary audio stream number
in SPRM(1) is not in the Enable state, the following may
be performed before the sub-playitem including the AV clip is
reached: (a) the BD-J application writes, into SPRM(1), a special
value indicating that "no stream has been selected"; or (b) the
BD-J application writes into SPRM(1) the stream number of an
AV clip that is already in the Enable state or exists on the
disc (in this case, a dummy content may be prepared on the disc
in advance in preparation for such a case). With such a structure,
the BD-J application can perform a control which prevents the
DataStarved event from being issued. This process is also
applicable to switching between streams performed by a user
operation, such as switching from Japanese audio to English audio.
With a structure where the BD-J application prevents a rewriting
of SPRM(1) for stream switching from actually occurring, it is
possible to prevent a playback of the main feature video from being
stopped temporarily.

A description is now given of the AV clip (XXX.M2TS), the
clip information file (XXX.CLPI), and the system target decoder.

The AV clip is a digital stream having an MPEG-2 transport
stream format.
[0125] Fig. 21 shows one example of the structure of an AV clip.
As shown in Fig. 21, an AV clip is obtained by multiplexing one
or more of each of a video stream, an audio stream, a presentation
graphics stream (PG), and an interactive graphics stream. The
video stream represents a Primary video and Secondary video of
a movie. The audio stream represents a Primary audio portion
of the movie and Secondary audio to mix with the Primary audio
portion. The presentation graphics stream represents subtitles
for the movie. The Primary video is ordinary video displayed
on a screen. The secondary video is video displayed in a small
screen in the main video. The interactive graphics stream
represents an interactive screen created by disposing GUI
components on a screen. The streams in each AV clip are identified
by PIDs. For example, 0x1011 is allocated to the video stream
used as the video of the movie, 0x1100 to 0x111F are allocated
to the audio streams, 0x1200 to 0x121F are allocated to the
presentation graphics streams, 0x1400 to 0x141F are allocated to
the interactive graphics streams, 0x1B00 to 0x1B1F are allocated
to the video streams used as secondary video of the movie, and
0x1A00 to 0x1A1F are allocated to the audio streams used as
secondary audio mixed with the main audio.
<Multiplexing of AV clips>
Fig. 22 schematically shows how the AV clips are multiplexed.
Firstly, a video stream and an audio stream (First row) are
respectively converted into PES packet sequences (Second row),
and further converted into TS packet sequences, respectively
(Third row). Similarly, a presentation graphics stream and an
interactive graphics stream (Seventh row) are respectively
converted into PES packet sequences (Sixth row), and further
converted into TS packet sequences, respectively (Fifth row).
An AV clip (Fourth row) is composed of these TS packets multiplexed
into one stream.
Fig. 23 illustrates in more detail how the video stream
is stored in the PES packet sequences. The first row shows a
video frame sequence of the video stream. The second row shows
a PES packet sequence. The third row shows a TS packet sequence
obtained by converting the PES packet sequence. As shown by arrows
yy1, yy2, yy3 and yy4, the video stream is composed of a plurality
of video presentation units (I pictures, B pictures, P pictures).
The video stream is divided up into the individual pictures, and
each picture is stored in the payload of a PES packet. Each PES
packet has a PES header storing a PTS (Presentation Time-Stamp)
that is a display time of the picture stored in the payload of
the PES packet, and a DTS (Decoding Time-Stamp) that is a decoding
time of the picture stored in the payload of the PES packet.

Fig. 24 shows the format of the TS packets ultimately written
in the AV clip. The first row shows a TS packet sequence. The
second row shows a source packet sequence. The third row shows
an AV clip.
As shown in the first row, each TS packet is a fixed-length
packet consisting of a 4-byte TS header carrying information such
as a PID identifying the stream, and a 184-byte TS payload storing
data. The PES packets are divided and stored in the TS payloads.
[0130] As shown in the second row, each TS packet is given a
4-byte TP_Extra_Header, thus constituting a 192-byte source
packet. Such 192-byte source packets are written in an AV clip.
The TP_Extra_Header stores information such as an ATS
(Arrival_Time_Stamp). The ATS shows a transfer start time at
which the TS packet is to be transferred to a PID filter. The
source packets are arranged in the AV clip as shown on the third
row. The numbers incrementing from the head of the AV clip are
called SPNs (source packet numbers).
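The source packet layout described above can be illustrated by a small parser; the PID extraction follows the standard 13-bit field of the MPEG-2 TS header, and the function name is illustrative.

```python
# Split a 192-byte source packet into its parts, per the layout above:
# 4-byte TP_Extra_Header (holding the ATS) + 188-byte TS packet
# (4-byte TS header + 184-byte TS payload).
def parse_source_packet(packet):
    assert len(packet) == 192
    tp_extra_header = packet[:4]    # carries the ATS
    ts_packet = packet[4:]          # the 188-byte TS packet
    ts_header = ts_packet[:4]       # carries the PID, among other fields
    ts_payload = ts_packet[4:]      # 184 bytes of data
    pid = ((ts_header[1] & 0x1F) << 8) | ts_header[2]   # 13-bit PID
    return pid, tp_extra_header, ts_payload

# A dummy source packet whose TS header encodes PID 0x1011:
pkt = bytes(4) + bytes([0x47, 0x10, 0x11, 0x10]) + bytes(184)
pid, ats_header, payload = parse_source_packet(pkt)   # pid == 0x1011
```

The sync byte 0x47 at the start of the TS header and the 13-bit PID field are standard MPEG-2 transport stream structure; the ATS encoding inside the TP_Extra_Header is left opaque here.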
In addition to TS packets of audio, video, subtitles and
the like, the AV clip also includes TS packets of a PAT (Program
Association Table), a PMT (Program Map Table) and a PCR (Program
Clock Reference). The PAT shows the PID of the PMT used in the AV
clip, and the PID of the PAT itself is registered as 0.
The PMT stores the PIDs of the streams of video, audio, subtitles
and the like, and attribute information corresponding to the PIDs.
The PMT also has various descriptors relating to the AV clip.
The descriptors have information such as copy control information
showing whether copying of the AV clip is permitted or not permitted.
The PCR stores STC time information corresponding to an ATS showing
when the PCR packet is transferred to a decoder, in order to achieve
synchronization between an ATC (Arrival Time Clock) that is a
time axis of ATSs, and an STC (System Time Clock) that is a time
axis of PTSs and DTSs.
Fig. 25 explains the data structure of the PMT in detail.
A PMT header is disposed at the top of the PMT. Information written
in the PMT header includes the length of the data included in the
PMT to which the PMT header is attached. A plurality of descriptors
relating to the AV clip are disposed after the PMT header.
Information such as the described copy control information is
listed in the descriptors. After the descriptors is a plurality
of pieces of stream information (stream information #1 through
#N) relating to the streams included in the AV clip. Each piece
of stream information is composed of stream descriptors, each
listing information such as a stream type for identifying the
compression codec of the stream, a stream PID, or stream attribute
information (such as frame rate or aspect ratio). The stream
descriptors are equal in number to the number of streams in the
AV clip. Up to now, the AV clip has been explained. The following
describes the clip information file in detail.

Fig. 26 shows one example of the clip information file.
Each clip information file is management information for an AV
clip. The clip information files are in one-to-one correspondence
with the AV clips, and are each composed of stream attribute
information and an entry map.
Fig. 27 shows one example of the stream attribute
information. As shown in Fig. 27, a piece of attribute information
is registered for each PID of each stream in the AV clip. Each
piece of attribute information has different information
depending on whether the corresponding stream is a video stream,
an audio stream, a presentation graphics stream, or an interactive
graphics stream.
Each piece of "video stream attribute information" carries
information including what kind of compression codec the video
stream was compressed with, and the resolution, aspect ratio and
frame rate of the pieces of picture data that compose the video
stream.
Each piece of "audio stream attribute information" carries
information including what kind of compression codec the audio
stream was compressed with, how many channels are included in
the audio stream, how many languages the audio stream supports,
and the sampling frequency. The information in the video stream
attribute information and the audio stream attribute information
is used for purposes such as initialization of a decoder before
the player performs playback.
Fig. 28 shows one example of the entry map.
As shown in Fig. 28, the entry map is table information
listing PTSs and SPNs. Each PTS shows a display time of an intraframe
encoded image (hereinafter referred to as an I picture) in the
video stream in the AV clip. Each SPN indicates the position,
in the AV clip, of the source packet at which the I picture starts.
Here, a pair of a PTS and an SPN shown in a same row in
the table are called an entry point. Each entry point has an
entry point ID (hereinafter also referred to as an EP_ID). Starting
with the top entry point, which has an entry point ID "0", the
entry points have successively incrementing entry point IDs.
Using the entry map, the player can specify the location of a
file of an AV clip corresponding to an arbitrary point on the
playback axis of the video stream. For instance, when performing
special playback such as fast forward or rewind, the player can
perform processing efficiently without analyzing AV clips, by
specifying, selecting and playing back an I picture registered
in the entry map. An entry map is created for each video stream
multiplexed in the AV clip. The entry maps are managed according
to PIDs.
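The lookup the player performs with the entry map (finding the entry point at or immediately before an arbitrary point on the playback axis) can be sketched as follows; the table values here are invented purely for illustration:

```python
import bisect

# Illustrative entry map: (PTS, SPN) pairs sorted by PTS; the list index is the EP_ID.
entry_map = [(0, 0), (90000, 1200), (180000, 2550), (270000, 4010)]

def lookup_spn(target_pts):
    """Return the SPN of the I picture at or immediately before target_pts,
    i.e. the file position from which decoding can start for special playback."""
    pts_list = [pts for pts, _ in entry_map]
    ep_id = max(bisect.bisect_right(pts_list, target_pts) - 1, 0)  # last entry <= target
    return entry_map[ep_id][1]
```

For fast forward or rewind, the player would repeatedly pick I pictures this way and decode only them, which is why no analysis of the AV clip itself is needed.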
Up to now, the AV clip and the clip information file have
been explained. Next, the system target decoder 13 will be
described in detail.
Fig. 29 shows one example of the internal structure of
the system target decoder 13. As shown in Fig. 29, the system
target decoder 13 is composed of source depacketizers 32a and
32b, PID filters 33a and 33b, a primary video decoder 34, a primary
video plane 35, a secondary video decoder 36, a secondary video
plane 37, a PG decoder 38, a PG plane 39, an IG decoder 40, an
IG plane 41, a primary audio decoder 42, a secondary audio decoder
43, an audio mixer 44, a BD-J processor 45, a BD-J plane 46, and
an adder 47.
The source depacketizers 32a and 32b each interpret a source
packet transferred to the system target decoder 13, extract the
TS packet, and send the TS packet to the PID filter. In sending
the TS packet, the source depacketizer adjusts the time of input
into the decoder in accordance with the ATS of the source packet.
More specifically, the source depacketizer transfers the TS packet
to the PID filter according to the recording rate of the AV clip,
at the instant when the value of the ATC generated by the ATC
counter and the value of the ATS of the source packet become
identical.
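This timing adjustment can be sketched as below. Each 192-byte source packet carries a 4-byte header whose low 30 bits are the ATS, followed by a 188-byte TS packet; the sketch simply pairs each TS packet with the ATC value at which it is to be released (ATC wrap-around and the recording-rate pacing are ignored for brevity, and the function name is illustrative):

```python
def depacketize(source_packets):
    """Yield (release_atc, ts_packet): the TS packet is transferred to the
    PID filter at the instant the ATC counter reaches the packet's ATS."""
    for sp in source_packets:
        ats = int.from_bytes(sp[:4], "big") & 0x3FFFFFFF  # ATS = low 30 bits
        yield ats, sp[4:192]                              # 188-byte TS packet
```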
The PID filters 33a and 33b transfer TS packets output
from the source depacketizers. More specifically, the PID filters
transfer TS packets having a PID that matches a PID required for
playback to the primary video decoder, the secondary video decoder,
the IG decoder, the PG decoder, the primary audio decoder or the secondary
audio decoder, depending on the PID of the TS packet. For instance,
in the case of the BD-ROM, a TS packet having the PID 0x1011 is
transferred to the primary video decoder, TS packets having
PIDs 0x1B00 to 0x1B1F are transferred to the secondary video decoder,
TS packets having PIDs 0x1100 to 0x111F are transferred to the
primary audio decoder, TS packets having PIDs 0x1A00 to 0x1A1F
are transferred to the secondary audio decoder, TS packets having
PIDs 0x1200 to 0x121F are transferred to the PG decoder, and TS
packets having PIDs 0x1400 to 0x141F are transferred to the IG
decoder.
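The routing rule just quoted for the BD-ROM case can be restated as a small dispatch function (a sketch only; the decoder names are labels, not API identifiers):

```python
def route(pid: int):
    """Return which decoder a TS packet with the given PID is sent to,
    per the example PID assignment above; None means the packet is not
    required for playback and is dropped by the PID filter."""
    if pid == 0x1011:
        return "primary video decoder"
    if 0x1B00 <= pid <= 0x1B1F:
        return "secondary video decoder"
    if 0x1100 <= pid <= 0x111F:
        return "primary audio decoder"
    if 0x1A00 <= pid <= 0x1A1F:
        return "secondary audio decoder"
    if 0x1200 <= pid <= 0x121F:
        return "PG decoder"
    if 0x1400 <= pid <= 0x141F:
        return "IG decoder"
    return None
```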
As shown in Fig. 29, the playback device has two source
depacketizers and two PID filters. One set of a source
depacketizer and a PID filter processes an AV clip transferred
from the read buffer 11, and the other set processes an AV clip
transferred from the read buffer 12. When the sub-path type is
the synchronized type, playback is performed with synchronization
between the AV clip referred to from the primary path and the
AV clip referred to from the sub-path. When the sub-path type
is non-synchronized type, playback is performed without
synchronization between the AV clip referred to from the main
path and the AV clip referred to from the sub-path.
The primary video decoder 34 has a buffer. While
accumulating data in the buffer, the primary video decoder extracts
information such as a TS header and a PES header, extracts a picture
in an encoded state (I picture, B picture, P picture), and decodes
each frame image in a video stream at respective predetermined
decode times (DTS). As a result of these operations, the primary
video decoder creates a plurality of frame images. The primary
video decoder outputs the frame images to the primary video plane
35 in accordance with the respective display times (PTS).
Possible compression encoding formats of the video stream
multiplexed on the AV clip include MPEG2, MPEG4AVC, and VC1, and
therefore the decoding scheme used to decode the compressed video
is changed in accordance with stream attributes.
The primary video plane 35 stores frame images obtained
by the primary video decoder 34.
The secondary video decoder 36, having the same structure
as the primary video decoder 34, performs decoding of an input
secondary video stream, and writes resultant pictures to the
secondary video plane in accordance with respective display times
(PTS).
The secondary video plane 37 stores frame images obtained
by the secondary video decoder 36.
The PG decoder 38 extracts and decodes a presentation
graphics stream from the TS packets input from the source
depacketizers, and writes the resultant non-compressed graphics
data to the PG plane in accordance with respective display times
(PTS).
The PG plane 39 stores non-compressed graphics data.
The IG decoder 40 extracts and decodes an interactive
graphics stream from the TS packets input from the source
depacketizers, and writes the resultant non-compressed graphics
data to the IG plane in accordance with respective display times
(PTS).
The IG plane 41 stores non-compressed graphics data.
The primary audio decoder 42 has a buffer. While
accumulating data in the buffer, the primary audio decoder extracts
information such as a TS header and a PES header, and performs
audio stream decode processing to obtain non-compressed
LPCM-state audio data. The primary audio decoder outputs the
obtained audio data to the audio mixer in accordance with the
respective playback time (PTS). Possible compression encoding
formats of the audio stream multiplexed on the AV clip include
AC3 and DTS, and therefore the decoding scheme used to decode
the compressed audio is changed in accordance with stream
attributes.
The secondary audio decoder 43 has the same structure
as the primary audio decoder. The secondary audio decoder
performs decoding of an input secondary audio stream, and outputs
resultant non-compressed LPCM-state audio data to the audio mixer
in accordance with respective display times. Possible
compression encoding formats of the audio stream multiplexed on
the AV clip include Dolby Digital Plus and DTS-HD LBR, and therefore
the decoding scheme used to decode the compressed audio is changed
in accordance with stream attributes.
The audio mixer 44 mixes (superimposes) the non-compressed
audio data output from the primary audio decoder and the
non-compressed audio data output from the secondary audio decoder
with each other, and outputs the resultant audio to a speaker
or the like.
The BD-J processor 45 decodes graphics data (in PNG or
JPEG format) transferred from the BD-J execution unit, and outputs
the resultant decoded graphics data to the BD-J plane in accordance
with a display time designated by the BD-J application.
The BD-J plane 46 stores graphics data decoded by the
BD-J processor 45.
The adder 47 instantaneously superimposes the data written
in the primary video plane, data written in the secondary video
plane, data written in the IG plane, data written in the PG plane,
and data written in the BD-J plane, and displays the resultant
superimposed data on the screen of a television or the like.
As described above, the present embodiment achieves an
internal structure in compliance with the BD-ROM player model
so that the playlists can be played back.

In the present embodiment, a description is given of a
detailed data structure of the playlist information, and how to
determine the current secondary video stream.
Fig. 30 shows the data structure of the PlayList
information. As shown in Fig. 30, the PlayList information
includes: MainPath information (MainPath()) that defines
MainPath; and SubPath information (SubPath()) that defines
SubPath.

While MainPath is a playback path defined for MainClip being
a primary video, SubPath is a playback path defined for SubClip
that should be synchronized with MainPath.
Fig. 31 shows the internal structure of the SubPath
information with close-ups. As indicated by the arrow hc0, each
SubPath includes: SubPath_type that indicates the type of SubClip;
and one or more pieces of SubPlayItem information
(...SubPlayItem()...).
The lead line hc1 indicates the close-up of the structure
of the SubPlayItem information.
SubPlayItem defines one or more elementary stream playback
paths that are separate from MainPath, and is used to indicate
a type of how the SubPath is synchronized with MainPath. When
SubPlayItem uses a sub-path of the Primary audio/PG/IG/Secondary
audio/Secondary video, the SubPlayItem is synchronized with
MainPath that uses PlayItem in the PlayList. The elementary stream
used by the sub-path for playing back the elementary stream is
multiplexed in a clip (namely, SubClip) that is separated from
the MainClip that is used by PlayItem on the MainPath side.
The following describes the internal structure of
SubPlayItem. As the lead line hc1 in Fig. 31 indicates, the
SubPlayItem information includes: "Clip_information_file_name",
"Clip_codec_identifier", "ref_to_STC_id[0]",
"SubPlayItem_In_time", "SubPlayItem_Out_time",
"Sync_PlayItem_id", and "Sync_Start_PTS_of_PlayItem".
The "Clip_information_file_name" is information that
uniquely specifies a SubClip that corresponds to the SubPlayItem,
by writing the file name of the Clip information.
The "Clip_codec_identifier" indicates an encoding method
of the AVClip.
The "ref_to_STC_id[0]" uniquely indicates an STC_Sequence
that is the target of the PlayItem.
The "SubPlayItem_In_time" is information that indicates
the start point of the SubPlayItem in the playback time axis of the
SubClip.
The "SubPlayItem_Out_time" is information that indicates
the end point of the SubPlayItem in the playback time axis of the
SubClip.
The "Sync_PlayItem_id" is information that uniquely
specifies, among the PlayItems constituting the MainPath, the PlayItem
with which the SubPlayItem is to be synchronized. The
"SubPlayItem_In_time" is present in the playback time axis of
the PlayItem specified by the "Sync_PlayItem_id".
The "Sync_Start_PTS_of_PlayItem" indicates the position
of the start point of the SubPlayItem specified by the
"SubPlayItem_In_time", in the playback time axis of the PlayItem
specified by the "Sync_PlayItem_id", with the time accuracy of
45 kHz. When the "Sync_Start_PTS_of_PlayItem" of a SubPlayItem
indicates a time point in the playback time axis of the PlayItem,
the SubPlayItem achieves a "Synchronization Picture in Picture".
Also, an indefinite value (0xFFF) may be set in the
"Sync_Start_PTS_of_PlayItem". This indefinite value indicates
that a time point at which a locking operation was performed by
the user is set as a sync (synchronization) time point at which
a synchronization with the PlayItem specified by the Sync_PlayItem_id
is performed. When the indefinite value is set in the
"Sync_Start_PTS_of_PlayItem" and a SubPlayItem intends to play
back the secondary video stream, the SubPlayItem achieves a
"Non-Synchronization Picture in Picture".
This completes an explanation of the SubPath information.

What is unique to the PlayList information is the STN_table.
The STN_table is a table that indicates reproducible
streams among a plurality of elementary streams multiplexed in
the AVClips specified by the Clip_Information_file_name in the
PlayItem information, and among Out_of_MUX streams specified by
the Clip_Information_file_name in the SubPlayItem information. More
specifically, the STN_table is generated by associating the
Stream_attributes with the Stream_entries respectively
corresponding to the plurality of elementary streams multiplexed
in the MainClips and to the Out_of_MUX streams multiplexed in
the SubClips.
Fig. 32 shows one example of the entire structure of the
STN_table. Fig. 33 shows one example of stream_entry for the
secondary video stream, as a part of the entire structure of the
STN_table shown in Fig. 32. As shown in Fig. 33, the STN_table
includes n pieces of Secondary_video_stream_entries
(Secondary_video_stream_entry [1] through Secondary_video_
stream_entry[n]) and "number_of_Secondary_video_stream_
entries(=n)" which indicates the number of secondary video
streams.
The lead line hs1 indicates the close-up of the internal
structure of the Secondary_video_stream_entry[1]. That is to
say, Secondary_video_stream_entry[1] through Secondary_video_
stream_entry[n] are a plurality of instances that were generated
from the same class structure, and have the same internal structure
as the one indicated by the lead line hs1. The number in "[]"
that is attached to each Secondary_video_stream_entry indicates
a rank order thereof in the STN_table.
As indicated by the lead line hs1,
Secondary_video_stream_entry[1] includes: "Stream_entry" that
presents, to the playback device, a PID corresponding to the
Secondary Video Stream Number (=1); "Stream_attribute" that
indicates a video attribute corresponding to the Secondary Video
Stream Number (=1); Comb_info_Secondary_Video_Secondary_Audio
that indicates a secondary audio stream that becomes playable
when the Secondary Video Stream Number (=1) is set; and
Comb_info_Secondary_Video_PiP_PG_textST that indicates a PG
stream or a text subtitle stream that becomes playable when the
Secondary Video Stream Number (=1) is set.
As indicated by the lead line hs2,
Secondary_video_stream_entry[2] includes: "Stream_entry" that
presents, to the playback device, a PID corresponding to the
Secondary Video Stream Number (=2); "Stream_attribute" that
indicates a video attribute corresponding to the Secondary Video
Stream Number (=2); Comb_info_Secondary_Video_Secondary_Audio
that indicates a secondary audio stream that becomes playable
when the Secondary Video Stream Number (=2) is set; and
Comb_info_Secondary_Video_PiP_PG_textST that indicates a PG
stream or a text subtitle stream that becomes playable when the
Secondary Video Stream Number (=2) is set.
The structure indicated by the lead line hs3 is similar
to the above. As described above,
Secondary_video_stream_entry[x] that is positioned xth in the
STN_table presents, to the playback device when the secondary
video stream number is set to "x", a PID corresponding to the
Secondary Video Stream Number (=x), a video attribute
corresponding to the Secondary Video Stream Number (=x), and
a secondary audio stream and a PGTextST that can be combined with
it.
The streams representing subtitles include, as well as
the presentation graphics stream that has been described in the
embodiments, the text subtitle stream represented by the text
code. The term "PG text subtitle stream" used here is a name generated
by combining the presentation graphics stream and the text subtitle
stream. Furthermore, the PG text subtitle stream used in the
Picture in Picture is called "PiP_PG_text subtitle stream".
Fig. 34A shows one example of Stream_entry and
Stream_attribute in the primary video stream. The Stream_entry
includes "ref_to_stream_PID_of_mainClip" that indicates a packet
identifier of a PES packet that constitutes the primary video
stream.
The Stream_attribute includes: "Video_format" that
indicates a display method of the video stream; and "frame_rate"
that indicates a display frequency of the video stream.
Fig. 34B shows Stream_entry in the secondary video stream.
As shown in Fig. 34B, the Stream_entry of the secondary video
stream includes: "ref_to_Sub_Path_id" that indicates SubPath
information that is referring to the secondary video stream; and
"ref_to_stream_PID_of_mainClip" that indicates a packet
identifier of a PES packet that constitutes the secondary video
stream.
This completes description of the recording medium in
the present embodiment.
The following describes PSR14 and PSR29. Each PSR is
32 bits long. The bit positions of the bits constituting one word
(32 bits) of the PSR are represented by b0 through b31. In this
notation, the most significant bit is b31, and the least significant
bit is b0.

Fig. 35A shows one example of the bit assignment in the
PSR14.
As shown in Fig. 35A, the eight bits from b8 to b15 among
the 32 bits of the PSR14 represent the stream number of the secondary
video stream, and identify one of a plurality of secondary
video streams whose entries are written in the STN_table of the
current PlayItem. When the value set in the PSR14 changes, the
playback device plays back a secondary video stream corresponding
to the set value after the change. The PSR14 is set to "0xFF"
as the initial value, and then may be set to a value ranging from
"1" to "32" by the playback device. The value "0xFF" is an
indefinite value and indicates that there is no secondary video
stream or that a secondary video stream has not been selected.
When the PSR14 is set to a value ranging from "1" to "32", the
set value is interpreted as the stream number of a secondary video
stream.
The bit "b31" of the PSR14 is disp_v_flag, which indicates
whether the playback device has a capability to play back a
secondary video (1b: Presentation of Secondary Video is enabled)
or not (0b: Presentation of Secondary Video is disabled).

Fig. 35B shows one example of the bit assignment in the
PSR29.
The bit "b0" of the PSR29 indicates whether or not there
is HD_Secondary_video_Capability, namely whether the playback
device has a capability to play back an HDTV-compliant secondary
video (1b: Secondary Video is capable) or not (0b: Secondary Video
is incapable).
The bit "b1" of the PSR29 indicates whether or not there
is 50&25Hz_video_Capability, namely whether the playback device
has a capability to play back a PAL-compliant secondary video
(1b: 50&25Hz Video is capable) or not (0b: 50&25Hz Video is
incapable).
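The bit assignments of PSR14 and PSR29 described above can be illustrated with simple mask-and-shift accessors (a sketch; the function names are illustrative, not part of the player model):

```python
def secondary_video_stream_number(psr14: int) -> int:
    return (psr14 >> 8) & 0xFF   # bits b8 through b15

def disp_v_flag(psr14: int) -> int:
    return (psr14 >> 31) & 1     # b31: 1b = presentation of secondary video enabled

def hd_secondary_video_capability(psr29: int) -> int:
    return psr29 & 1             # b0: HD_Secondary_video_Capability

def capability_50_25hz(psr29: int) -> int:
    return (psr29 >> 1) & 1      # b1: 50&25Hz_video_Capability
```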
This completes an explanation of the PSR set.
Fig. 36 shows one example of the internal structure of
the playback control engine. As shown in Fig. 36, the playback
control engine includes a procedure execution unit 48 and a PID
conversion unit 49.
The procedure execution unit 48 executes a predetermined
stream selection procedure to overwrite PSR14 with a stream number
of a new secondary video stream when a piece of PlayItem information
is switched to another piece of PlayItem information, or when
the user performs an operation for switching the stream number.
The playback device plays back a secondary video stream in
accordance with the stream number written in the PSR14.
Accordingly, a secondary video stream is selected via the setting
in PSR14.
The reason for executing the stream selection procedure
when the PlayItem information is switched is as follows. That
is to say, since the STN_table exists for each piece of PlayItem
information, there may be a case where a secondary video stream
is not played back with a piece of PlayItem information while
it is played back with another piece of PlayItem information.
The PID conversion unit 49 converts a stream number stored
in the PSR set into a PID reference value based on the STN_table,
and sends the conversion-resultant PID reference value to the
PID filters 33a and 33b.
Fig. 37 is a flowchart showing the procedure for
determining the current secondary video stream performed by the
stream selection procedure.
In step S71, it is checked whether a secondary video stream
with a stream number that is equivalent to the number set in
PSR14 satisfies the following conditions (A) and (B).
Condition (A): the playback device has the capability of
playing back the secondary video stream with a stream number that
is equivalent to the number set in PSR14, based on a comparison
among Video_format, frame_rate, HD Secondary Video Capability,
and 50&25Hz Video Capability.
Condition (B): the SubPath_type of the secondary video stream
is "=6 (non-synchronization Picture in Picture)".
It should be noted here that the Video_format and frame_rate
are written in the stream_attribute for the secondary video stream
in the STN_table. Also, whether there is HD_Secondary_
video_Capability or 50&25Hz_video_Capability is indicated by bits
"b0" and "b1" of PSR29. The judgment on whether or not the condition
(A) is satisfied is made by referring to these settings in the
STN_table and the values of b0 and b1 in PSR29.
Step S71 is followed by steps S72 and S73.
In step S72, it is judged whether or not the number of
entries of the secondary video streams in the STN_table of the
current playitem is 0. It should be noted here that when the
number of entries of the secondary video streams in the STN_table
is 0, it means that there is no secondary video stream that has
been permitted to be played back. When it is judged that the
number of entries of the secondary video streams in the STN_table
of the current playitem is 0, the secondary video stream number
in PSR14 is maintained (step S73). This is because the value
currently stored in PSR14 should be maintained when, in the current
playitem, there is no secondary video stream that has been permitted
to be played back.
Step S74 is a judgment step that is performed when it
is judged in step S72 that the number of entries of the secondary
video streams in the STN_table of the current playitem is not
0. In step S74, it is judged whether or not the secondary video
stream number in PSR14 is equal to or smaller than the total number
of entries in the STN_table, and whether the secondary video stream
with that stream number satisfies the condition (A). When the judgment
in step S74 results in Yes, a secondary video stream that is optimum
for the current playitem is selected (step S75).
When the judgment in step S74 results in No, the control
moves to step S76 in which it is judged whether the condition
(B) is satisfied. When it is judged that the condition (B) is
satisfied, 0xFE is set in PSR14 (step S78). Here, the value
0xFE indicates that the secondary video stream number in PSR14
is valid, but that the secondary video stream is not selected.
In the execution of Non-Synchronization Picture in Picture, when
PSR14 has been set to the above-described value, the procedure
for determining the current stream is executed upon receiving
the user operation therefor. However, if the stream number in
PSR14 is invalid, the procedure for determining the current stream
is not executed even if the user operation is received, and the
secondary video stream is not played back. To prevent this from
occurring, 0xFE is set in PSR14 when Non-Synchronization Picture
in Picture is executed.
When it is judged that the condition (B) is not satisfied,
but an effective secondary video stream number has already been
set in PSR14, the number is not changed (step S77).
This completes an explanation of the stream selection
procedure for the secondary video stream.
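The steps of the Fig. 37 procedure can be summarized in a Python sketch. Conditions (A) and (B) and the Fig. 38 selection are passed in as callables, since their evaluation depends on the STN_table and PSR29; all names here are illustrative, and the special values 0xFE/0xFF are as defined above:

```python
def fig37_procedure(psr14, num_entries, cond_a, cond_b, select_optimum):
    """Return the new PSR14 value for the current secondary video stream.
    num_entries: entries in the current playitem's STN_table;
    cond_a(n)/cond_b(): conditions (A) and (B);
    select_optimum(): the Fig. 38 procedure."""
    if num_entries == 0:
        return psr14                 # S72 Yes -> S73: maintain the number
    if psr14 <= num_entries and cond_a(psr14):
        return select_optimum()      # S74 Yes -> S75: pick the optimum stream
    if cond_b():
        return 0xFE                  # S76 Yes -> S78: valid but unselected
    return psr14                     # S77: keep an already-set effective number
```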
Fig. 38 is a flowchart showing the procedure for
determining an optimum secondary video stream for the current
playitem.
In steps S81 through S83, it is checked, for each stream
written in all stream_entries in the STN_table, whether the stream
satisfies the following conditions (a) and (b).
Condition (a): the playback device has the capability
of playing back the secondary video stream with a stream number
that is equivalent to the number set in PSR14, based on a
comparison among Video_format, frame_rate, HD Secondary Video
Capability, and 50&25Hz Video Capability for the secondary video
stream.
Condition (b): the SubPath_type of the secondary video stream
is "=6 (non-synchronization Picture in Picture)".
After the above-described check is performed with respect
to all secondary video streams that are permitted to be played
back in the STN_table, the control goes to step S84.
In step S84, it is judged whether or not there is no
secondary video stream that satisfies the condition (a). When
it is judged that there is no secondary video stream that satisfies
the condition (a), 0xFF is set in PSR14 as the secondary video
stream number (step S85).
When it is judged that at least one secondary video stream
that satisfies the condition (a) exists, the control goes to step
S86. In step S86, it is judged whether or not the top secondary
video stream in the STN_table among the secondary video streams
satisfying the condition (a) satisfies the condition (b). When
it is judged that the top secondary video stream satisfies the
condition (b), 0xFE is set in PSR14 as the secondary video stream
number (step S87).
When it is judged that the top secondary video stream
does not satisfy the condition (b), a secondary video stream whose
corresponding stream_entry is positioned at the top of the STN_table
is selected from among the secondary video streams that satisfy
the condition (a), and the stream number of the selected secondary
video stream is set into the PSR14 (step S88).
With this structure, when a playback section in the current
secondary video stream is defined by SubPlayItem, a sub-playitem
that defines the playback section in the current secondary video
stream is identified as the current sub-playitem.
With these procedures, a current secondary video stream
that is optimum for the current playitem is stored in PSR14. This
completes an explanation of the procedures for selecting an optimum
current secondary video stream.
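The Fig. 38 procedure described above can likewise be sketched as follows (illustrative names; conditions (a) and (b) are again passed in as callables):

```python
def fig38_select_optimum(entries, cond_a, cond_b):
    """entries: stream numbers in STN_table order; cond_a(n)/cond_b(n) model
    conditions (a) and (b). Returns the value to set in PSR14."""
    playable = [n for n in entries if cond_a(n)]   # S81-S83: check every entry
    if not playable:
        return 0xFF                                # S85: no playable stream
    top = playable[0]                              # topmost entry in STN_table
    if cond_b(top):
        return 0xFE                                # S87: valid but unselected
    return top                                     # S88: select the top stream
```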
As described above, according to the present embodiment,
when there is a secondary video stream that cannot be played back
among a plurality of secondary video streams recorded in a BD-ROM
or a local storage, the next stream is selected. This provides
an option for "using a stream, which can be played back by the
device itself, in the Picture in Picture". With this structure,
when the total size of the secondary video varies, and there is
a variation in whether or not the playback device has a capability
to play back the secondary video, it is possible to cause the
playback device to display any secondary video and execute the
Picture in Picture.
When an AV clip, which includes a secondary video stream
corresponding to a secondary video selected in the above-described
manner, is referred to by a second sub-playitem, the playback
is interrupted if the AV clip is an Unavailable clip. Therefore,
by downloading in time an AV clip which includes the current
secondary video stream and is referred to by the sub-playitem
information, it is possible to prevent the playback from being
interrupted and ensure a smooth progress of playback.

The present embodiment relates to an improvement in the
case where the playlist to be played back is a playlist for
a stereoscopic image.
Stereoscopic viewing is achieved by using holography
technology or a parallax image.
The former method, the holography technology, is
characterized in that it can reproduce an object
three-dimensionally in the same manner as a human being
normally recognizes the object, and that, with regard to video
generation, although a technological theory has been established,
it requires (i) a computer that can perform an enormous amount
of calculations to generate the video for holography in real time,
and (ii) a display device having a resolution in which several
thousand lines can be drawn in a length of 1 mm. It is extremely
difficult for the current technology to realize such a product,
and thus a product for commercial use has hardly been developed.
On the other hand, the latter method using a parallax
image has a merit that a stereoscopic viewing can be realized
only by preparing images for viewing with the right eye and the
left eye. Some technologies including the sequential segregation
method have been developed for practical use from the viewpoint
of how to cause each of the right eye and the left eye to view
only the images associated therewith.
To achieve the stereoscopic viewing, the home theater
system described in Embodiment 1, which includes the playback
device and the television, also includes a sequential-type stereo
goggle.
Figs. 39A through 39E show one example of the
sequential-type stereo goggle. Fig. 39A shows a home theater
system that achieves the stereoscopic viewing. As shown in Fig.
39A, the home theater system includes a sequential-type stereo
goggle 105, as an attachment to the television 104.
Fig. 39B shows the sequential-type stereo goggle 105 in
the state where it is worn by the user. The sequential-type stereo
goggle 105 is equipped with a liquid-crystal shutter that enables
the user to view a parallax image by the sequential segregation
method. Here, the parallax image is an image which is composed
of a pair of (i) an image that enters only into the right eye
and (ii) an image that enters only into the left eye, such that
pictures respectively associated with the right and left eyes
respectively enter the associated eyes, thereby achieving the
stereoscopic viewing.
Note that the sequential segregation method is a method
in which images for the left eye and right eye are alternately
displayed in a time axis direction such that left and right scenes
are overlaid in the brain by the effect of residual images of
eyes, and the overlaid image is recognized as a stereoscopic image.
Fig. 39C shows the sequential-type stereo goggle 105 in the state
of being used for viewing a planar image. When a planar image
is to be viewed, the liquid-crystal shutters for both eyes are
set to a light-transmission state. Fig. 39D shows the
sequential-type stereo goggle 105 in the state of being used for
viewing with the left eye. At the instant when an image for viewing
with the left eye is displayed on the display, the liquid-crystal
shutter of the sequential-type stereo goggle 105 for the left
eye is set to the light-transmission state, and the liquid-crystal
shutter for the right eye is set to a light block state. Fig.
39E shows the sequential-type stereo goggle 105 in the state of
being used for viewing with the right eye. At the instant when
an image for viewing with the right eye is displayed on the display,
the liquid-crystal shutter of the sequential-type stereo goggle
105 for the right eye is set to the light-transmission state,
and the liquid-crystal shutter for the left eye is set to the
light block state.
In the sequential segregation method, images for the left
eye and right eye are alternately displayed in a time axis direction.
For this reason, while 24 frames of images are displayed per second
in the case of a regular two-dimensional movie, a total of 48
frames of images for the right and left eyes should be displayed
per second. Accordingly, this method is suitable for displays
that can rewrite the screen at a relatively high speed, but this
method can be applied to any display that can rewrite the screen
a predetermined number of times per second.
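As a rough illustration of this frame-rate doubling, the following sketch (the helper name is hypothetical, not part of any standard) computes the display frame budget required by the sequential segregation method from a base film rate:

```python
def sequential_frame_rate(base_fps: int = 24, views: int = 2) -> int:
    """Frames per second the display must show when left-eye and
    right-eye images are time-multiplexed (sequential segregation)."""
    return base_fps * views

# A 24 fps movie needs 48 display frames per second in total.
print(sequential_frame_rate(24))  # 48
```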
Different from the sequential segregation method in which
images for the left eye and right eye are alternately displayed
in a time axis direction, there is another method in which pictures
for the left and right eyes are aligned vertically in a screen
at the same time, and a lenticular lens is placed on the surface
of the display such that pixels constituting the picture for the
left eye form an image only in the left eye and pixels constituting
the picture for the right eye form an image only in the right
eye. This enables the left and right eyes to see respectively
pictures that have a parallax, thereby realizing a stereoscopic
viewing. Note that this method is achieved not only by the
lenticular lens, but by another device having a similar function
such as a liquid-crystal element. Furthermore, a stereoscopic
viewing can also be realized by a system in which a vertical
polarizing filter is set for pixels for the left eye, a horizontal
polarizing filter is set for pixels for the right eye, and the
user watches the display by using a pair of polarizing glasses,
where a vertical polarizing filter is attached to the left-eye
glass, and a horizontal polarizing filter is attached to the
right-eye glass.
The stereoscopic viewing using such a parallax image has
already been in practical use in play equipment in amusement parks
or the like, and its technology has been established; it is thus
considered likely to be put into practical use at home ahead of
other methods. Note that there have been proposed various methods, such
as the two-color segregation method, for realizing a stereoscopic
viewing as well as the above-described ones, and that although
the sequential segregation method and the polarizing glass method
are used as examples in the present embodiment, the present
invention is not limited to the two methods, but is applicable
to any method that uses the parallax image.
The present embodiment describes a method for storing
the parallax images to be used for the stereoscopic viewing, onto
an information recording medium. In the following description,
an image including a screen for the left eye is referred to as
"image for left eye", an image including a screen for the right
eye is referred to as "image for right eye", and an image including
both screens is referred to as "image for stereoscopic viewing".

The primary video stream in the present embodiment is a
video stream that is played back as an image for planar viewing
in a playback device for planar viewing, and is played back as
an image for left eye when an image for stereoscopic viewing is
played back by a planar/stereoscopic view playback device.
Hereinafter, this video stream is referred to as a "planar/left-eye
view video stream".
The secondary video stream in the present embodiment is
a "right-eye view video stream". The right-eye view video stream
is a video stream that is played back as an image for right eye
when an image for stereoscopic viewing is played back by a
planar/stereoscopic view playback device. The right-eye view
video stream is assigned with PID "0x1012", which is different
from the PID of the primary video stream. Next, a description
is given of the structure of the planar/left-eye view video stream
and the right-eye view video stream.
It should be noted here that the right-eye view video
stream, compared with the planar/left-eye view video stream, can
be reduced greatly in data amount by performing an inter-picture
predictive encoding between the left and right viewpoints since
there is a large correlation between the images of the left and
right viewpoints that view the same subject. Also, the frame
rate for the planar/left-eye view video stream is a frame rate
in the case where a single planar/left-eye view video stream is
played back by a planar view playback device. The value of the
frame rate is stored in the GOP header.
Fig. 40 shows one example of the internal structure of
the primary video stream and the secondary video stream for
stereoscopic viewing.
The second row of Fig. 40 shows the internal structure
of the primary video stream. The primary video stream includes
picture data I1, P2, Br3, Br4, P5, Br6, Br7, and P8. These picture
data are decoded in accordance with the DTS. The first row shows
the image for the left eye. The decoded picture data I1, P2,
Br3, Br4, P5, Br6, Br7, and P8 are played back in the order of
I1, Br3, Br4, P2, Br6, Br7, and P5 in accordance with the PTS,
thereby achieving a display of the image for the left eye.
The fourth row of Fig. 40 shows the internal structure
of the secondary video stream. The secondary video stream
includes picture data P1, P2, B3, B4, P5, B6, B7, and P8. These
picture data are decoded in accordance with the DTS. The third
row shows the image for the right eye. The decoded picture data
P1, P2, B3, B4, P5, B6, B7, and P8 are played back in the order
of P1, B3, B4, P2, B6, B7, and P5 in accordance with the PTS,
thereby achieving a display of the image for the right eye.
The fifth row of Fig. 40 shows how the state of the
sequential-type stereo goggle 105 is changed. As shown in the
fifth row, the shutter for right eye is closed when the left-eye
image is viewed, and the shutter for left eye is closed when the
right-eye image is viewed.
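The shutter alternation shown in the fifth row can be sketched as follows (a simplified illustrative model with hypothetical names, not the goggle's actual control logic): for each displayed frame, the shutter of the opposite eye is closed.

```python
def shutter_states(frames):
    """For each frame tagged 'L' (left-eye image) or 'R' (right-eye
    image), return the (left_shutter, right_shutter) pair: the shutter
    of the eye not being served is closed."""
    return [("open", "closed") if f == "L" else ("closed", "open")
            for f in frames]

# Alternating left/right frames as in the fifth row of Fig. 40:
print(shutter_states(["L", "R", "L", "R"]))
```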
These primary and secondary video streams have been
compressed by an inter-picture predictive encoding that makes
use of the redundancy between the viewpoints, as well as by an
inter-picture predictive encoding that makes use of the redundancy
in a time axis direction. The pictures of the right-eye view
video stream have been compressed by referring to the pictures
of the planar/left-eye view video stream for the same display
time.
For example, the first P picture in the right-eye view
video stream refers to the I picture in the planar/left-eye view
video stream, the B picture in the right-eye view video stream
refers to the Br picture in the planar/left-eye view video stream,
and the second P picture in the right-eye view video stream refers
to the P picture in the planar/left-eye view video stream.
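The reference relationship in the Fig. 40 example can be sketched as a simple lookup (an illustrative model only; actual inter-view prediction operates on coded picture data, not labels):

```python
# Display-order picture labels from Fig. 40: each right-eye picture
# references the left-eye picture with the same display time.
left_order = ["I1", "Br3", "Br4", "P2", "Br6", "Br7", "P5"]
right_order = ["P1", "B3", "B4", "P2", "B6", "B7", "P5"]

inter_view_refs = dict(zip(right_order, left_order))
print(inter_view_refs["P1"])  # I1
print(inter_view_refs["B3"])  # Br3
```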

The time axis in Fig. 40 indicates the relationships between
the display times (PTS) and the decode times (DTS) that are assigned
to each video access unit for the planar/left-eye view video stream
and the right-eye view video stream. More specifically, the DTSs
of each picture data in the planar/left-eye view video stream
and the right-eye view video stream are set such that they
alternately appear in the time axis. Also, the PTS of each picture
data in the planar/left-eye view video stream and the PTS of each
picture data in the right-eye view video stream are set such that
they alternately appear in the time axis. This can be achieved
by setting the pictures of the planar/left-eye view video stream
and the right-eye view video stream, which are in the reference
relationships in the inter-picture predictive encoding,
alternately in the decoding order and the display order.
Also, the time gap between the DTSs for a frame for the
planar/left-eye viewing and the next frame for the right-eye
viewing is set to be a half of one display time gap of frames
for the planar/left-eye viewing. Similarly, the time gap between
the PTSs for a frame for the planar/left-eye viewing and the next
frame for the right-eye viewing is set to be a half of one display
time gap of frames for the planar/left-eye viewing.
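A minimal sketch of this timing rule (the helper name is hypothetical, and the 90 kHz clock is assumed only for illustration): each right-eye timestamp lags the corresponding left-eye timestamp by half the left-eye frame period, so left and right timestamps alternate in the time axis.

```python
def interleaved_pts(left_pts, frame_period):
    """Given the PTS sequence of left-eye frames and their display
    period, return an alternating (eye, pts) schedule in which each
    right-eye frame lags its left-eye frame by half a period."""
    schedule = []
    for pts in left_pts:
        schedule.append(("L", pts))
        schedule.append(("R", pts + frame_period // 2))
    return schedule

# 24 fps on a 90 kHz clock: one frame period is 90000 / 24 = 3750 ticks.
print(interleaved_pts([0, 3750, 7500], 3750))
```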

A stereoscopic viewing display delay is defined as the
difference between the PTS of a picture in the planar/left-eye
view video stream and the PTS of the corresponding picture in
the right-eye view video stream, for the same display time.
The stereoscopic viewing display delay is a half of one display
time gap of frames in the planar/left-eye view video stream.

A sub-path for AV clips for the right-eye view is prepared
in the stereoscopic view playlist. The sub-path is set to refer
to a plurality of AV clips for the right-eye view and to be
synchronized with the main path in the time axis. With this
structure, the planar view playlist and the stereoscopic view
playlist can share an AV clip for the planar/left-eye view, and
the stereoscopic view playlist can associate the left-eye view
with the right-eye view so that they are synchronized in the time
axis.
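The relationship between the main path and the sub-path described above might be modeled roughly as follows (hypothetical field names; a sketch of the structure, not the actual playlist file format):

```python
from dataclasses import dataclass, field

@dataclass
class PlayItem:
    clip: str           # AV clip for the planar/left-eye view
    in_time: int
    out_time: int

@dataclass
class SubPlayItem:
    clip: str           # AV clip for the right-eye view
    in_time: int
    out_time: int
    sync_playitem: int  # index of the main-path playitem to sync with

@dataclass
class StereoscopicPlaylist:
    main_path: list = field(default_factory=list)  # PlayItems (left eye)
    sub_path: list = field(default_factory=list)   # SubPlayItems (right eye)

# The planar playlist and the stereoscopic playlist can share the same
# left-eye clip; only the stereoscopic one carries the right-eye sub-path,
# synchronized with the main path in the time axis.
pl = StereoscopicPlaylist(
    main_path=[PlayItem("left.m2ts", 0, 1000)],
    sub_path=[SubPlayItem("right.m2ts", 0, 1000, sync_playitem=0)],
)
print(pl.sub_path[0].sync_playitem)  # 0
```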

We Claim

1. A playback device for playing back a playlist,
wherein the playlist includes a plurality of playitems and
a plurality of sub-playitems,
each playitem is information defining a playback section
in a clip file that includes a main stream,
each sub-playitem is information defining a playback section
in a clip file that includes a sub-stream, and
the clip files defined and referred to by the sub-playitems
are transferred,
the playback device comprising:
a playback unit operable to play back the playitems;
a specifying unit operable to specify a current playitem;
a determining unit operable to determine a current
sub-playitem each time the current playitem is specified; and
a sub-stream register operable to indicate the current
sub-stream that is to be played back in synchronization with the
current playitem, the current sub-playitem defining a playback
section of the sub-stream indicated by the sub-stream register,
wherein the playback unit continues playback of the playitem
when a clip file being referred to by the current sub-playitem
exists in an accessible recording medium, and stops playback of
the playitem when the clip file is in a missing state or an
unrecognizable state in the accessible recording medium.
2. The playback device of Claim 1, wherein
the sub-stream falls into any of types that are audio stream,
video stream, and subtitle stream,
the sub-stream register includes number registers storing
stream numbers corresponding one-to-one to the types, thereby
indicating a current sub-stream for each type of sub-stream, and
the current sub-playitem is a sub-playitem referring to
a clip file that stores a current sub-stream indicated by a number
register.
3. The playback device of Claim 1 further comprising
a management unit operable to manage a state of a clip file
stored in the recording medium, wherein
the clip file is transmitted to the playback device after
an application sends a download request to a server device,
the playback unit starts the playback after the application
generates a player instance for playlist information, and
the stop of the playback by the playback unit is made when
the management unit outputs an event instructing to stop the
playback, to the playback unit and to the player instance.
4. The playback device of Claim 1, wherein
the clip files referred to by the playitems include video
streams,
the sub-streams are secondary video streams which, when
played back in synchronization with the video streams, provide
a stereoscopic view to a user,
the playlist includes stereoscopic view setting information
which indicates whether the stereoscopic view is set on or off,
when the stereoscopic view setting information indicates
that the stereoscopic view is set on, the playback unit does not
stop playback even if a clip file, which is referred to by a
sub-playitem and includes a secondary video stream, is not stored
in the recording medium or is in an unrecognizable state, and
the playback unit stops playback when the stereoscopic view
setting information in the playlist indicates that the
stereoscopic view is set on and a clip file referred to by the
current sub-playitem is in the missing state in the recording
medium or is in an unrecognizable state.
5. The playback device of Claim 1, wherein
the sub-stream register further indicates whether a subtitle
is set on or off,
the playback unit does not stop playback when the sub-stream
register indicates that the subtitle is set on even if a clip
file including a subtitle stream is in the missing state in the
recording medium or is in an unrecognizable state, and
the playback unit stops playback when the sub-stream
register indicates that the subtitle is set on, and a clip file
referred to by the current sub-playitem is in the missing state
in the recording medium or is in an unrecognizable state.
6. The playback device of Claim 1, wherein
playitem information includes a stream number table
indicating sub-streams that are playable,
the stream number table includes a plurality of sub-stream
entries,
the sub-stream entries are information that indicates a
plurality of sub-streams that can be played back in synchronization
with the playitems,
each of the sub-stream entries is information that indicates,
in association with a piece of sub-playitem information, a
sub-stream that can be played back in synchronization, and an
order of the sub-stream entries in the stream number table indicates
priority ranks of the sub-streams, and
the current sub-stream is, among the plurality of
sub-streams that can be played back in synchronization with the
playitems, a sub-stream that corresponds to a sub-stream entry
having a highest priority rank among the plurality of sub-stream
entries included in the stream number table.
7. The playback device of Claim 1 further comprising
a setting register storing deck information which indicates
a language setting in the playback device, and
the determining unit determines, as the current sub-stream,
a sub-stream which, among sub-streams that are permitted to be
played back, has a language attribute that matches the language
setting indicated by the deck information.
8. A recording device for writing, onto a recording medium, clip
files that are to be referred to when a playlist is played back,
wherein the playlist includes a plurality of playitems and
a plurality of sub-playitems,
each of the playitems is information that defines (i) a
clip file including a main stream and (ii) a playback section
in the clip file,
each of the sub-playitems is information that defines (iii)
a clip file including a sub-stream and defines (iv) a playback
section in the clip file including the sub-stream, as a playback
section to be synchronized with a playitem, and
the clip files that are defined and referred to by the
sub-playitems are transferred via a transmission path, and
substreams are not multiplexed with the clip files that are defined
and referred to by the playitems,
the recording device comprising:
a writing unit operable to receive a clip file via the
transmission path and write the received clip file onto the
recording medium;
a management unit operable to manage state of the clip file
written on the recording medium;
a playback unit operable to play back the playitems included
in the playlist;
a specifying unit operable to specify one of the playitems
that is a current target of playback, as a current playitem;
a determining unit operable to, each time the current
playitem is specified by the specifying unit, determine, as a
current sub-playitem, a sub-playitem that is optimum for the
current playitem; and
a sub-stream register operable to indicate a current
sub-stream that is to be played back in synchronization with the
current playitem,
wherein the current sub-playitem is a sub-playitem that
defines a playback section to be synchronized with the current
playitem and that defines a playback section of the sub-stream
indicated by the sub-stream register, and
the playback unit continues playback of the playitem when
a recording medium accessible by the playback device stores a
clip file being referred to by the current sub-playitem, and stops
playback of the playitem when the recording medium accessible
by the playback device either does not store a clip file being
referred to by the current sub-playitem, or is in an unrecognizable
state.
9. The recording device of Claim 8, wherein
while the current playitem is played back, an application
identifies, as a next sub-playitem, a piece of sub-playitem
information among a plurality of pieces of sub-playitem
information included in playlist information, and requests, from
a server device, a download of a clip file referred to by the
next sub-playitem, and
the clip file written by the writing unit onto the recording
medium is the clip file referred to by the next sub-playitem.
10. A playback method for playing back a playlist,
wherein the playlist includes a plurality of playitems and
a plurality of sub-playitems,
each playitem is information defining a playback section
in a clip file that includes a main stream,
each sub-playitem is information defining a playback section
in a clip file that includes a sub-stream, and
the clip files defined and referred to by the sub-playitems
are transferred,
the playback method comprising the steps of:
playing back the playitems;
specifying a current playitem;
determining a current sub-playitem each time the current
playitem is specified; and
indicating the current sub-stream that is to be played back
in synchronization with the current playitem, the current
sub-playitem defining a playback section of the sub-stream
indicated by the sub-stream register,
wherein the playback step continues playback of the playitem
when a clip file being referred to by the current sub-playitem
exists in an accessible recording medium, and stops playback of
the playitem when the clip file is in a missing state or an
unrecognizable state in the accessible recording medium.
11. A recording method for writing, onto a recording medium,
clip files that are to be referred to when a playlist is played
back,
wherein the playlist includes a plurality of playitems and
a plurality of sub-playitems,
each of the playitems is information that defines (i) a
clip file including a main stream and (ii) a playback section
in the clip file,
each of the sub-playitems is information that defines (iii)
a clip file including a sub-stream and defines (iv) a playback
section in the clip file including the sub-stream, as a playback
section to be synchronized with a playitem, and
the clip files that are defined and referred to by the
sub-playitems are transferred via a transmission path, and
substreams are not multiplexed with the clip files that are defined
and referred to by the playitems,
the recording method comprising the steps of:
receiving a clip file via the transmission path and writing
the received clip file onto the recording medium;
managing state of the clip file written on the recording
medium;
playing back the playitems included in the playlist;
specifying one of the playitems that is a current target
of playback, as a current playitem;
determining, as a current sub-playitem, a sub-playitem that
is optimum for the current playitem, each time the current playitem
is specified by the specifying unit; and
indicating a current sub-stream that is to be played back
in synchronization with the current playitem,
wherein the current sub-playitem is a sub-playitem that
defines a playback section to be synchronized with the current
playitem and that defines a playback section of the sub-stream
indicated by the sub-stream register, and
the playback step continues playback of the playitem when
a recording medium stores a clip file being referred to by the
current sub-playitem, and stops playback of the playitem when
the recording medium either does not store a clip file being
referred to by the current sub-playitem, or is in an unrecognizable
state.

A playback device for playing back a playlist. The playback
device determines, as a current sub-playitem, a sub-playitem that
is optimum for the current playitem each time the current
playitem changes. The playback device continues a playback of
a playitem when a clip file being referred to by the current
sub-playitem has been downloaded and is in the Enable state in
the local storage; and stops, by issuing a DataStarved event,
the playback of the playitem when the clip file being referred
to by the current sub-playitem is in a missing state or an invalid
state in the recording medium.

Documents

Application Documents

# Name Date
1 abstract-3475-kolnp-2009.jpg 2011-10-07
2 3475-kolnp-2009-specification.pdf 2011-10-07
3 3475-kolnp-2009-pct request form.pdf 2011-10-07
4 3475-kolnp-2009-pct priority document notification.pdf 2011-10-07
5 3475-kolnp-2009-others pct form.pdf 2011-10-07
6 3475-kolnp-2009-international publication.pdf 2011-10-07
7 3475-kolnp-2009-gpa.pdf 2011-10-07
8 3475-kolnp-2009-form 5.pdf 2011-10-07
9 3475-kolnp-2009-form 3.pdf 2011-10-07
10 3475-KOLNP-2009-FORM 3-1.1.pdf 2011-10-07
11 3475-kolnp-2009-form 2.pdf 2011-10-07
12 3475-kolnp-2009-form 1.pdf 2011-10-07
13 3475-kolnp-2009-drawings.pdf 2011-10-07
14 3475-kolnp-2009-description (complete).pdf 2011-10-07
15 3475-kolnp-2009-correspondence.pdf 2011-10-07
16 3475-KOLNP-2009-CORRESPONDENCE-1.1.pdf 2011-10-07
17 3475-kolnp-2009-claims.pdf 2011-10-07
18 3475-kolnp-2009-abstract.pdf 2011-10-07
19 3475-KOLNP-2009-FORM-18.pdf 2012-02-23
20 3475-KOLNP-2009-(17-01-2014)-CORRESPONDENCE.pdf 2014-01-17
21 3475-KOLNP-2009-(17-01-2014)-ANNEXURE TO FORM 3.pdf 2014-01-17
22 3475-KOLNP-2009-(31-12-2015)-CORRESPONDENCE.pdf 2015-12-31
23 3475-KOLNP-2009-(31-12-2015)-ANNEXURE TO FORM 3.pdf 2015-12-31
24 3475-KOLNP-2009-(15-03-2016)-ASSIGNMENT.pdf 2016-03-15
25 3475-KOLNP-2009-(15-03-2016)-CORRESPONDENCE.pdf 2016-03-15
26 3475-KOLNP-2009-(15-03-2016)-FORM-1.pdf 2016-03-15
27 3475-KOLNP-2009-(15-03-2016)-FORM-2.pdf 2016-03-15
28 3475-KOLNP-2009-(15-03-2016)-FORM-3.pdf 2016-03-15
29 3475-KOLNP-2009-(15-03-2016)-FORM-5.pdf 2016-03-15
30 3475-KOLNP-2009-(15-03-2016)-FORM-6.pdf 2016-03-15
31 3475-KOLNP-2009-(15-03-2016)-PA.pdf 2016-03-15
32 3475-KOLNP-2009-FER.pdf 2017-11-14
33 3475-KOLNP-2009-AbandonedLetter.pdf 2018-07-11

Search Strategy

1 3475-KOLNP-2009_23-08-2017.pdf