
Recording Medium, Playback Device And Integrated Circuit

Abstract: On a recording medium, stereoscopic and monoscopic specific areas are located one after another next to a stereoscopic/monoscopic shared area. The stereoscopic/monoscopic shared area is a contiguous area to be accessed both in stereoscopic video playback and monoscopic video playback. The stereoscopic specific area is a contiguous area to be accessed immediately before a long jump occurring in stereoscopic video playback. In both the stereoscopic/monoscopic shared area and the stereoscopic specific area, extents of base-view and dependent-view stream files are arranged in an interleaved manner. The extents on the stereoscopic specific area are next in order after the extents on the stereoscopic/monoscopic shared area. The monoscopic specific area is a contiguous area to be accessed immediately before a long jump occurring in monoscopic video playback. The monoscopic specific area has a copy of the entirety of the extents of the base-view stream file recorded on the stereoscopic specific area.


Patent Information

Application #
Filing Date
20 September 2010
Publication Number
48/2010
Publication Type
INA
Invention Field
ELECTRONICS
Status
Parent Application

Applicants

PANASONIC CORPORATION
1006, OAZA KADOMA, KADOMA-SHI, OSAKA-571-8501, JAPAN

Inventors

1. SASAKI, TAIJI
C/O. PANASONIC CORPORATION, 1006, OAZA KADOMA, KADOMA-SHI, OSAKA-571-8501, JAPAN
2. YAHATA, HIROSHI
C/O. PANASONIC CORPORATION, 1006, OAZA KADOMA, KADOMA-SHI, OSAKA-571-8501, JAPAN
3. OGAWA, TOMOKI
C/O. PANASONIC CORPORATION, 1006, OAZA KADOMA, KADOMA-SHI, OSAKA-571-8501, JAPAN

Specification

DESCRIPTION RECORDING MEDIUM, PLAYBACK DEVICE, AND INTEGRATED CIRCUIT [Technical Field] [0001] The present invention relates to a technology for stereoscopic video playback, and especially to the allocation of a video stream on a recording medium. [Background Art] [0002] For the distribution of moving image contents, optical discs such as DVDs and Blu-ray discs (BDs) are widely used. BDs have a larger capacity than DVDs and are thus capable of storing high-quality video images. Specifically, a DVD can store standard definition (SD) images at a resolution of 640 x 480 according to the VGA standard, or 720 x 480 according to the NTSC standard. In contrast, a BD can store high definition (HD) images at a maximum resolution of 1920 x 1080. [0003] In recent years, there has been an increasing number of movie theaters where customers can enjoy stereoscopic (also referred to as three-dimensional (3D)) video images. In response to this trend, development is underway of a technology for storing 3D video images onto an optical disc without degrading their high image quality. Here, the requirement to be satisfied is that 3D video images be recorded on optical discs in a manner that ensures compatibility with playback devices capable only of playing back two-dimensional (2D) video images (also referred to as monoscopic video images). Such a playback device is hereinafter referred to as a "2D playback device". Without this compatibility, it would be necessary to produce two different optical discs per content, one for 3D video playback and the other for 2D video playback. This would cause an increase in cost.
Accordingly, it is desirable to provide an optical disc storing 3D video images in such a manner that a 2D playback device can execute 2D video playback and that a playback device supporting playback of both 2D and 3D video images (hereinafter referred to as a "2D/3D playback device") can execute both 2D and 3D video playback. [0004] FIG. 59 is a schematic diagram illustrating the mechanism for ensuring the compatibility of an optical disc storing 3D video images with 2D playback devices (see Patent Document 1). An optical disc 2401 has a 2D/left-view AV (audiovisual) stream file and a right-view AV stream file recorded thereon. The 2D/left-view AV stream file contains a 2D/left-view stream. The 2D/left-view stream represents video images to be visible to the left eye of a viewer in stereoscopic playback, and can also be used in monoscopic playback. The right-view AV stream file contains a right-view stream. The right-view stream represents video images to be visible to the right eye of a viewer in stereoscopic playback. The two video streams have the same frame rate but presentation times shifted from each other by half a frame period. For example, when the frame rate of the video streams is 24 frames per second, the frames of the left- and right-view streams are alternately displayed every 1/48 second. [0005] As shown in FIG. 59, the 2D/left-view and right-view AV stream files are divided, in units of GOPs (groups of pictures), into a plurality of extents 2402A-2402C and 2403A-2403C, respectively, on the optical disc 2401. That is, each extent contains at least one GOP. Furthermore, the extents 2402A-2402C of the 2D/left-view AV stream file and the extents 2403A-2403C of the right-view AV stream file are alternately arranged on a track 2401A of the optical disc 2401. Each pair of adjacent extents 2402A/2403A, 2402B/2403B, and 2402C/2403C has the same length of playback time.
Such an arrangement of extents is referred to as an interleaved arrangement. Groups of extents recorded in an interleaved arrangement on a recording medium are used both in stereoscopic playback and in monoscopic playback, as described below. [0006] As shown in FIG. 59, a 2D playback device 2404 causes a 2D optical disc drive 2404A to sequentially read the extents 2402A-2402C of the 2D/left-view AV stream file from the optical disc 2401 and a video decoder 2404B to sequentially decode the read extents into left-view frames 2406L. As a result, left views, i.e., 2D video images, are played back on a display device 2407. Note that the arrangement of the extents 2402A-2402C on the optical disc 2401 is designed in view of the seek performance and the reading rate of the 2D optical disc drive 2404A so as to ensure seamless playback of the 2D/left-view AV stream file. [0007] As shown in FIG. 59, a 2D/3D playback device 2405, when accepting the selection of 3D video playback from the optical disc 2401, causes a 3D optical disc drive 2405A to alternately read the 2D/left-view AV stream file and the right-view AV stream file extent by extent from the optical disc 2401, more specifically, in the order of the reference numbers 2402A, 2403A, 2402B, 2403B, 2402C, and 2403C. Of the read extents, those belonging to the 2D/left-view stream are supplied to a left video decoder 2405L, whereas those belonging to the right-view stream are supplied to a right video decoder 2405R. The video decoders 2405L and 2405R alternately decode the received extents into video frames 2406L and 2406R, respectively. As a result, left and right video images are alternately displayed on a 3D display device 2408. In synchronization with the switching between left and right video images, 3D glasses 2409 cause their left and right lenses to opacify alternately. Through the 3D glasses 2409, the video images presented on the display device 2408 appear to be 3D video images.
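The two read patterns described in paragraphs [0006] and [0007] can be sketched as follows. The extent identifiers mirror FIG. 59; the helper function and its name are illustrative assumptions, not anything defined by the specification.

```python
# Interleaved arrangement on the track: left-view (2402x) and
# right-view (2403x) extents alternate, as in FIG. 59.
TRACK = ["2402A", "2403A", "2402B", "2403B", "2402C", "2403C"]

def playback_path(track, mode):
    """Return the sequence of extents a disc drive reads in the given mode."""
    if mode == "2D":
        # A 2D playback device reads only left-view extents, jumping
        # over the right-view extents in between.
        return [e for e in track if e.startswith("2402")]
    # A 2D/3D playback device reads every extent in track order.
    return list(track)

print(playback_path(TRACK, "2D"))  # ['2402A', '2402B', '2402C']
print(playback_path(TRACK, "3D"))  # all six extents in interleaved order
```

This makes the compatibility mechanism of paragraph [0008] concrete: the same physical arrangement serves both devices, differing only in which extents each one reads.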
[0008] As described above, the interleaved arrangement enables an optical disc storing 3D video images to be used both for 2D video playback by a 2D playback device and for 3D video playback by a 2D/3D playback device. [Reference List] [Patent Documents] [0009] [Patent Document 1] Japanese Patent No. 3935507 [Summary of Invention] [Technical Problem] [0010] There are optical discs having a plurality of recording layers, such as dual-layer discs. With such an optical disc, a series of AV stream files may be recorded on disc areas extending over two layers. Even with a single-layer disc, a series of AV stream files may be recorded on separate areas between which a different file is recorded. In such cases, the optical pickup of an optical disc drive needs to execute a focus jump or a track jump while reading data from the optical disc. A focus jump is a jump caused by switching layers, and a track jump is a jump caused by a movement of the optical pickup in the radial direction of the optical disc. These jumps generally involve long seek times and are thus called long jumps. Ensuring seamless video playback across a long jump requires the extent accessed immediately before the long jump to be large enough to satisfy the condition for preventing buffer underflow in the video decoder during the long jump. [0011] However, in order to satisfy the above-mentioned condition in both 2D and 3D video playback when the 2D/left-view AV stream file and the right-view AV stream file are arranged in an interleaved manner as shown in FIG. 59, the area accessed immediately before the long jump needs to contain not only a sufficiently large extent of the 2D/left-view AV stream file but also a correspondingly large extent of the right-view AV stream file, since both extents have the same playback time.
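The buffer-underflow condition of paragraph [0010] can be illustrated with a simplified model: while an extent is read, the buffer fills at the difference between the drive's read rate and the decoder's consumption rate; during the jump, it drains at the consumption rate. The function below and the example rates are our own illustrative assumptions, not figures from the specification.

```python
def min_extent_size(read_rate, consume_rate, jump_time):
    """Smallest extent (in bytes) whose buffered data survives a long jump.

    Reading the extent takes size / read_rate seconds and nets
    (read_rate - consume_rate) bytes/s into the buffer; the jump then
    drains consume_rate * jump_time bytes. Requiring the gain to cover
    the drain and solving for size gives the bound below.
    """
    assert read_rate > consume_rate, "reading must outpace consumption"
    return read_rate * consume_rate * jump_time / (read_rate - consume_rate)

# Illustrative numbers only: a 54 Mbps drive, a 30 Mbps stream, a 0.7 s jump.
size = min_extent_size(54e6 / 8, 30e6 / 8, 0.7)
print(f"{size / 1e6:.1f} MB")  # roughly 5.9 MB
```

The model shows why the interleaved arrangement of FIG. 59 is costly before a long jump: with equal playback times, the right-view extent must grow in step with the enlarged left-view extent, inflating the required buffer.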
As a result, a 2D/3D playback device needs to allocate to its right video decoder a buffer capacity larger than that required merely to satisfy the above-mentioned condition. This is undesirable, since it prevents further reduction in buffer capacity and further improvement in the memory efficiency of a playback device. [0012] An object of the present invention is to provide a recording medium having stream files recorded thereon in an arrangement that allows further reduction in the buffer capacity necessary for stereoscopic playback. [Solution to Problem] [0013] A recording medium according to the invention has a base-view stream file and a dependent-view stream file recorded thereon. The base-view stream file is to be used for monoscopic video playback. The dependent-view stream file is to be used for stereoscopic video playback in combination with the base-view stream file. The recording medium has a stereoscopic/monoscopic shared area, a stereoscopic specific area, and a monoscopic specific area. The stereoscopic/monoscopic shared area is a contiguous area to be accessed both while stereoscopic video is played back and while monoscopic video is played back. The stereoscopic/monoscopic shared area is also an area in which a plurality of extents belonging to the base-view stream file and a plurality of extents belonging to the dependent-view stream file are arranged in an interleaved manner. The stereoscopic specific area and the monoscopic specific area are both contiguous areas located one after another next to the stereoscopic/monoscopic shared area. The stereoscopic specific area is an area to be accessed immediately before a long jump occurring in stereoscopic video playback. The stereoscopic specific area is also an area in which extents belonging to the base-view stream file and extents belonging to the dependent-view stream file are arranged in an interleaved manner.
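The area layout of paragraph [0013], completed by the abstract's statement that the monoscopic specific area holds a copy of the base-view extents of the stereoscopic specific area, can be modelled as below. The extent labels, area contents, and function are illustrative assumptions of ours, not reference signs from the specification.

```python
# B = base-view extent, D = dependent-view extent; "B3'" is the copy of B3.
SHARED          = ["B1", "D1", "B2", "D2"]  # stereoscopic/monoscopic shared area
STEREO_SPECIFIC = ["B3", "D3"]              # read just before the 3D long jump
MONO_SPECIFIC   = ["B3'"]                   # copy of B3, read just before the 2D long jump

def path_before_long_jump(mode):
    """Extents read up to the long jump in each playback mode."""
    if mode == "3D":
        # 3D playback traverses the shared area, then the stereoscopic
        # specific area, skipping the monoscopic specific area.
        return SHARED + STEREO_SPECIFIC
    # 2D playback reads only base-view extents, then the dedicated copy.
    return [e for e in SHARED if e.startswith("B")] + MONO_SPECIFIC

print(path_before_long_jump("3D"))  # ['B1', 'D1', 'B2', 'D2', 'B3', 'D3']
print(path_before_long_jump("2D"))  # ['B1', 'B2', "B3'"]
```

Because the two paths diverge before their respective jumps, the sizes of B3 and D3 can be chosen to satisfy only the stereoscopic seamless-playback condition, and the size of B3' only the monoscopic one, which is the decoupling the invention relies on.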
The extents recorded on the stereoscopic specific area are next in order after the extents recorded on the stereoscopic/monoscopic shared area. The monoscopic specific area is an area to be accessed immediately before a long jump occurring in monoscopic video playback. The monoscopic specific area has a copy of the entirety of the extents that belong to the base-view stream file and are recorded on the stereoscopic specific area. [Advantageous Effects of Invention] [0014] When video images are played back from the recording medium according to the present invention described above, the stereoscopic specific area is accessed immediately before a long jump occurring in stereoscopic playback, whereas the monoscopic specific area is accessed immediately before a long jump occurring in monoscopic playback. Thus, the playback path for stereoscopic playback and the playback path for monoscopic playback are separated immediately before their respective long jumps. This allows the extent sizes of the stream files arranged on the stereoscopic specific area to be determined regardless of the extent size of the base-view stream file arranged on the monoscopic specific area. In particular, the sizes and arrangement of the extents recorded on the stereoscopic specific area can be designed to satisfy only the condition for seamless playback of stereoscopic video images. Independently of that, the sizes and arrangement of the extents recorded on the monoscopic specific area can be designed to satisfy only the condition for seamless playback of monoscopic video images. As a result, further reduction in the buffer capacity necessary for stereoscopic playback can be achieved. [Brief Description of Drawings] [0015] FIG. 1 is a schematic diagram showing a type of usage of a recording medium according to a first embodiment of the present invention; FIG. 2 is a schematic diagram showing the data structure of a BD-ROM disc shown in FIG. 1; FIG.
3 is a schematic diagram showing an index table stored in an index file shown in FIG. 2; FIG. 4 is a schematic diagram showing elementary streams multiplexed in an AV stream file 2046A used for 2D video playback shown in FIG. 2; FIG. 5 is a schematic diagram showing an arrangement of packets in each elementary stream multiplexed in the AV stream file shown in FIG. 2; FIG. 6 is a schematic diagram showing a detail of a method for storing a video stream into PES packets shown in FIG. 5; FIGS. 7A, 7B, 7C are schematic views respectively showing the format of a TS packet, the format of a source packet, and an arrangement of source packets constituting the AV stream file shown in FIG. 5; FIG. 8 is a schematic diagram showing the data structure of a PMT; FIG. 9 is a schematic diagram showing the data structure of a clip information file shown in FIG. 2; FIG. 10 is a schematic diagram showing the data structure of stream attribute information shown in FIG. 9; FIGS. 11A and 11B are schematic views showing the data structure of an entry map shown in FIG. 10; FIG. 12 is a schematic diagram showing the data structure of a playlist file shown in FIG. 2; FIG. 13 is a schematic diagram showing the data structure of playitem information 1300; FIGS. 14A and 14B are schematic views showing the relationship between playback sections specified by the playitem information to be connected when the connection condition 1310 indicates "5" and "6", respectively; FIG. 15 is a schematic diagram showing the data structure of a playlist file when the playback path to be specified includes subpaths; FIG. 16 is a functional block diagram of a 2D playback device; FIG. 17 is a list of system parameters stored in a player variable storage unit shown in FIG. 16; FIG. 18 is a functional block diagram of a system target decoder shown in FIG. 16; FIG. 19 is a schematic diagram showing an arrangement of extents on the disc 101 shown in FIG. 2; FIG. 
20 is a schematic diagram showing the processing channel for converting an AV stream file read from the BD-ROM disc 101 into 2D video data VD and audio data AD in the 2D playback device shown in FIG. 16; FIG. 21 is a graph showing a progression of the data amount DA accumulated in a read buffer 1602 shown in FIG. 20 during a processing period of an AV stream file; FIG. 22 is a table showing an example of the relationship between jump distances and jump times specified for BD-ROM discs; FIG. 23 is a schematic diagram showing an example of an arrangement of extents when videos are continuously played back from three different AV stream files in turn; FIGS. 24A, 24B, 24C are schematic diagrams illustrating the principle of stereoscopic video playback according to a method using parallax images; FIG. 25 is a schematic diagram showing a relationship among an index table 310, a movie object MVO, a BD-J object BDJO, a 2D playlist file 2501, and a 3D playlist file 2502; FIG. 26 is a flowchart showing a selection process of a playlist file to be played back, according to a movie object MVO; FIG. 27 is a schematic diagram showing an example of the structures of the 2D playlist file 2501 and the 3D playlist file 2502; FIG. 28 is a schematic diagram showing another example of the structures of the 2D playlist file 2501 and the 3D playlist file 2502; FIGS. 29A and 29B are schematic diagrams showing elementary streams multiplexed into a 2D/left-view AV stream file and a right-view AV stream file, respectively; FIGS. 30A and 30B are schematic diagrams showing compression coding methods for a 2D/left-view stream and a right-view stream, respectively; FIGS. 31A and 31B are schematic diagrams showing the relationship between DTSs and PTSs allocated to pictures of a 2D/left-view stream 3101 and a right-view stream 3102, respectively; FIG. 32 is a schematic diagram showing the data structure of a video access unit 3200 of the 2D/left-view stream and the right-view stream; FIGS.
33A and 33B are schematic diagrams showing values of a decode counter 3204 allocated to the pictures of a 2D/left-view stream 3301 and a right-view stream 3302, respectively; FIGS. 34A and 34B are schematic diagrams showing two types of arrangements of extents of the 2D/left-view AV stream file and the right-view AV stream file on the BD-ROM disc 101 shown in FIG. 2; FIGS. 35A and 35B are schematic diagrams showing the relationship between playback times and playback paths; FIGS. 36A and 36B are schematic diagrams showing the data structures of clip information files linked to the 2D/left-view AV stream file and the right-view AV stream file, respectively; FIGS. 37A and 37B are schematic diagrams showing the data structure of 3D metadata 3613 shown in FIG. 36A; FIGS. 38A and 38B are schematic diagrams showing the data structure of the entry map 3622 of the right-view clip information file 3602 shown in FIG. 36B; FIG. 39 is a functional block diagram of a 2D/3D playback device 3900; FIG. 40 is a schematic diagram showing a superimposing process of plane data pieces by a plane adder 3910 shown in FIG. 39; FIGS. 41A and 41B are schematic diagrams showing cropping processes by the cropping processing unit 4022 shown in FIG. 40; FIGS. 42A, 42B, 42C are schematic diagrams respectively showing left 2D video images and right 2D video images superimposed by the cropping processes shown in FIGS. 41A and 41B, and 3D video images perceived by a viewer; FIG. 43 is a functional block diagram of the system target decoder 3903 shown in FIG. 39; FIG. 44 is a schematic diagram showing the processing channel for playing back 3D video data VD and audio data AD from a 2D/left-view AV stream file and a right-view AV stream file read from the BD-ROM disc 101; FIGS.
45A, 45B, 45C are schematic diagrams showing the relationship between the physical order of the extents of each AV stream file recorded on the BD-ROM disc 101 in the interleaved arrangement and the progression of the data amounts accumulated in the read buffers 3902 and 3911 during 3D video playback; FIGS. 46A and 46B are schematic diagrams showing two types of the order of the extents belonging to AV stream files; FIGS. 47A and 47B are graphs respectively showing progressions of the data amount DA1 accumulated in the read buffer (1) 3902 and the data amount DA2 accumulated in the read buffer (2) 3911 when the extents of the left and right AV stream files are alternately read from the disc 101; FIG. 48 is a schematic diagram showing an example of the arrangement of extents of the 2D/left-view AV stream file and the right-view AV stream file when a long jump is required while the extents of the files are alternately read; FIGS. 49A and 49B are graphs respectively showing the progressions of the data amounts DA1 and DA2 accumulated in the read buffers 3902 and 3911 in the section including the long jump LJ2 among the sections of the playback path 4822 for 3D video images; FIG. 50 is a schematic diagram showing an example of the arrangement of the extents when the BD-ROM disc 101 is a multi-layer disc and a series of AV stream files is separated across two layers; FIG. 51 is a schematic diagram showing an example of an arrangement of extents of AV stream files in which a 2D video playback path and a 3D video playback path are separated in the area to be accessed immediately before their respective long jumps; FIG. 52 is a schematic diagram showing the correspondence relationship between playlist files and AV stream files for playing back video images from the extents arranged as shown in FIG. 51; FIGS.
53A and 53B are schematic diagrams showing the arrangements of extents in the recording areas on the discs of the first and second embodiments, respectively, the recording areas being accessed before and after a long jump; FIG. 54 is a schematic diagram showing the arrangements of extents in the recording area(s) on the disc of the third embodiment, the recording area(s) being accessed immediately before the long jump; FIG. 55 is a schematic diagram showing the correspondence relationships between playlist files and AV stream files for playing back video images from the extents arranged as shown in FIG. 54; FIGS. 56A and 56B are schematic diagrams showing the relationships between DTSs and PTSs allocated to pictures of a 2D/left-view stream 5601 and a right-view stream 5602, respectively; FIG. 57 is a block diagram of the internal configuration of a recording device according to the fourth embodiment; FIGS. 58A, 58B, 58C are schematic diagrams showing a process of calculating depth information by the video encoder 5701 shown in FIG. 57; and FIG. 59 is a schematic diagram illustrating the mechanism for ensuring the compatibility of an optical disc storing 3D video images with 2D playback devices. [Description of Embodiments] [0016] The following describes a recording medium and a playback device pertaining to embodiments of the present invention with reference to the drawings. [0017] First Embodiment [0018] First, the following describes a usage pattern of a recording medium in accordance with a first embodiment of the present invention. FIG. 1 is a schematic diagram showing a usage pattern of the recording medium. In FIG. 1, a BD-ROM disc 101 is depicted as the recording medium. A playback device 102, a display device 103, and a remote control 104 constitute one home theater system. The BD-ROM disc 101 provides movies to the home theater system. [0019] <> [0035] FIG. 3 is a schematic diagram showing an index table stored in the index file 2043A.
The index table 310 stores items such as "first play" 301, "top menu" 302, and "title k" 303 (k = 1, 2, ..., n). Each item is associated with either a movie object MVO or a BD-J object BDJO. Each time a menu or a title is called in response to a user operation or an application program, a control unit of the playback device 102 refers to the corresponding item in the index table 310 and calls the object corresponding to that item from the disc 101. The control unit then executes the program of the called object. More specifically, the "first play" 301 specifies an object to be called when the disc 101 is loaded into the disc drive. The "top menu" 302 specifies an object for displaying a menu on the display device 103 when a command "go back to menu" is input in response, for example, to a user operation. The "title k" 303 specifies an object for playing back, when a user operation requests a title to be played back, an AV stream file corresponding to the requested title from the disc 101, in accordance with the playlist file 2044A. [0036] <> [0037] The movie object file 2043B generally stores a plurality of movie objects. Each movie object stores a sequence of navigation commands. A navigation command causes the playback device 102 to execute playback processes similar to those of general DVD players. Navigation commands include, for example, a read-out command to read out a playlist file corresponding to a title, a playback command to play back stream data from an AV stream file indicated by a playlist file, and a transition command to make a transition to another title. The control unit of the playback device 102 calls a movie object in response, for example, to a user operation and executes the navigation commands included in the called movie object in the order of the sequence. Thus, in a manner similar to general DVD players, the playback device 102 displays a menu on the display device to allow a user to select one of the commands.
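The dispatch performed via the index table in paragraph [0035] can be sketched as a simple lookup. The dictionary form and the object names are hypothetical illustrations of ours, not structures defined by the document.

```python
# Hypothetical index table: each item maps to a movie object or BD-J object.
INDEX_TABLE = {
    "first play": "MVO_startup",   # called when the disc is loaded
    "top menu":   "MVO_menu",      # called on a "go back to menu" command
    "title 1":    "BDJO_title_1",  # called when title 1 is requested
}

def call_object(item):
    """Look up an item and return the object whose program should be executed."""
    return INDEX_TABLE[item]
```

The control unit would then run the returned object's program: navigation commands for a movie object, or Java application signaling for a BD-J object.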
The playback device 102 then executes a playback start/stop of a title or switching to another title in accordance with the selected command, thereby dynamically changing the progress of video playback. [0038] <> [0039] The BD-J object file 2047A includes a single BD-J object. The BD-J object is a program that causes a Java virtual machine implemented on the playback device 102 to execute the processes of title playback and graphics rendering. The BD-J object stores an application management table and identification information of the playlist file to be referred to. The application management table indicates a list of Java application programs that are to be actually executed by the Java virtual machine. The identification information of the playlist file to be referred to identifies a playlist file that corresponds to a title to be played back. The Java virtual machine calls a BD-J object in accordance with a user operation or an application program, and executes signaling of the Java application programs according to the application management table included in the BD-J object. Consequently, the playback device 102 dynamically changes the progress of the video playback of the title, or causes the display device 103 to display graphics independently of the title video. [0040] <> [0041] The JAR directory 2048 stores the body of each Java application program executed in accordance with a BD-J object. The Java application programs include those for causing the Java virtual machine to execute playback of a title and those for causing the Java virtual machine to execute graphics rendering. [0042] <> [0043] The AV stream file 2046A is a digital stream in MPEG-2 transport stream (TS) format, and is obtained by multiplexing a plurality of elementary streams. FIG. 4 is a schematic diagram showing elementary streams multiplexed in an AV stream file 2046A used for playback of 2D video images. The AV stream file 2046A shown in FIG.
4 has multiplexed therein a primary video stream 401, primary audio streams 402A and 402B, presentation graphics (PG) streams 403A and 403B, an interactive graphics (IG) stream 404, secondary video streams 405A and 405B, and a secondary audio stream 406. [0044] The primary video stream 401 represents the primary video of a movie, and the secondary video streams 405A and 405B represent secondary video of the movie. The primary video is the major video of a content, such as the main feature of a movie, and is displayed on the entire screen, for example. The secondary video, on the other hand, is displayed simultaneously with the primary video by using, for example, a picture-in-picture method, so that the secondary video images are displayed in a smaller window within the full screen displaying the primary video images. Each video stream is encoded by a method such as MPEG-2, MPEG-4 AVC, or SMPTE VC-1. [0045] The primary audio streams 402A and 402B represent the primary audio of the movie. The secondary audio stream 406 represents secondary audio to be mixed with the primary audio. Each audio stream is encoded by a method such as AC-3, Dolby Digital Plus ("Dolby Digital" is a registered trademark), MLP, DTS (Digital Theater System; registered trademark), DTS-HD, or linear PCM (pulse code modulation). [0046] The PG streams 403A and 403B represent subtitles of the movie. The PG streams 403A and 403B each represent subtitles in a different language, for example. The IG stream 404 represents an interactive screen. The interactive screen is created by disposing graphical user interface (GUI) components on the screen of the display device 103. [0047] The elementary streams 401-406 contained in the AV stream file 2046A are identified by packet IDs (PIDs). For example, the primary video stream 401 is assigned PID 0x1011. The primary audio streams 402A and 402B are each assigned one of the PIDs from 0x1100 to 0x111F.
The PG streams 403A and 403B are each assigned one of the PIDs from 0x1200 to 0x121F. The IG stream 404 is assigned one of the PIDs from 0x1400 to 0x141F. The secondary video streams 405A and 405B are each assigned one of the PIDs from 0x1B00 to 0x1B1F. The secondary audio stream 406 is assigned one of the PIDs from 0x1A00 to 0x1A1F. [0048] FIG. 5 is a schematic diagram showing a sequence of packets in each elementary stream multiplexed in the AV stream file 513. First, a video stream 501 consisting of a plurality of video frames is converted into a series of PES packets 502. Then, each PES packet 502 is converted into TS packets 503. Similarly, an audio stream 504 consisting of a plurality of audio frames is converted into a series of PES packets 505, and each of the PES packets 505 is then converted into TS packets 506. Likewise, the stream data of the PG stream 507 and the IG stream 510 are separately converted into series of PES packets 508 and 511, and further into series of TS packets 509 and 512, respectively. Lastly, these TS packets 503, 506, 509, and 512 are arranged and multiplexed into one stream to constitute the AV stream file 513. [0049] FIG. 6 is a schematic diagram showing the details of a method for storing a video stream 601 in PES packets 602. As shown in FIG. 6, in the encoding process of the video stream 601, the video data of each video frame or field is treated as one picture and its data amount is separately reduced. Here, pictures are the units in which video data is encoded. Moving image compression coding methods such as MPEG-2, MPEG-4 AVC, and SMPTE VC-1 reduce the data amount by exploiting spatial and temporal redundancy in the moving images. Inter-picture predictive coding is employed as the method that uses temporal redundancy.
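The PID assignments listed in paragraphs [0047] and [0048] above amount to a range-based classification, which can be sketched as follows; the function and its labels are our illustration, not part of the specification.

```python
def stream_type(pid):
    """Classify an elementary stream by the PID ranges of [0047]-[0048]."""
    if pid == 0x1011:
        return "primary video"
    if 0x1100 <= pid <= 0x111F:
        return "primary audio"
    if 0x1200 <= pid <= 0x121F:
        return "PG (subtitles)"
    if 0x1400 <= pid <= 0x141F:
        return "IG (interactive)"
    if 0x1A00 <= pid <= 0x1A1F:
        return "secondary audio"
    if 0x1B00 <= pid <= 0x1B1F:
        return "secondary video"
    return "other"

print(stream_type(0x1011))  # primary video
print(stream_type(0x1201))  # PG (subtitles)
```

A demultiplexer can use such a mapping to route each TS packet to the appropriate decoder based on the PID in its header.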
In inter-picture predictive coding, first, a reference picture is assigned to each picture to be encoded, the reference picture being a picture earlier or later in presentation time than the picture to be encoded. Next, a motion vector is detected between the picture to be encoded and the reference picture, and motion compensation is performed using the motion vector. Furthermore, the motion-compensated picture is subtracted from the picture to be encoded, and spatial redundancy is then removed from the difference between the pictures. In this way, each picture is reduced in data amount. [0050] As shown in FIG. 6, the video stream 601 contains an I picture yy1, a P picture yy2, B pictures yy3 and yy4, and so on, starting from the top. Here, I pictures are pictures compressed by intra-picture predictive coding, which uses only the picture to be encoded, without any reference picture. P pictures are pictures compressed by inter-picture predictive coding using the uncompressed form of one already-compressed picture as a reference picture. B pictures are pictures compressed by inter-picture predictive coding simultaneously using the uncompressed forms of two already-compressed pictures as reference pictures. Note that some B pictures may be referred to as Br pictures when their uncompressed forms are used as reference pictures for other pictures in inter-picture predictive coding. In the video stream 601, each picture with a predetermined header attached constitutes one video access unit. Pictures can be read from the video stream 601 in units of video access units. [0051] As shown in FIG. 6, each PES packet 602 contains a PES payload 602P and a PES header 602H. The I picture yy1, the P picture yy2, and the B pictures yy3 and yy4 of the video stream 601 are separately stored in the PES payloads 602P of different PES packets 602.
Each PES header 602H stores the presentation time and the decoding time, i.e., a PTS (Presentation Time-Stamp) and a DTS (Decoding Time-Stamp), respectively, of the picture stored in the PES payload 602P of the same PES packet 602. [0052] FIGS. 7A, 7B, and 7C schematically show the formats of a TS packet 701 and a source packet 702 constituting the AV stream file 513. The TS packet 701 is 188 bytes long. As shown in FIG. 7A, the TS packet 701 is composed of a 4-byte-long TS header 701H and a 184-byte-long TS payload 701P. Each PES packet is divided and stored in the TS payloads 701P of different TS packets 701. Each TS header 701H stores information such as a PID. The PID identifies the elementary stream whose data is stored in the PES payload of the PES packet reconstructed from the data stored in the TS payload 701P of the same TS packet 701. When the AV stream file 513 is written on the BD-ROM disc 101, as shown in FIG. 7B, a 4-byte-long header (TP_Extra_Header) 702H is added to each TS packet 701. The header 702H particularly includes an ATS (Arrival_Time_Stamp). The ATS shows the transfer start time at which the TS packet is to be transferred to a PID filter inside a system target decoder, which is described later. In the manner described above, each TS packet 701 is converted to a 192-byte-long source packet and written into the AV stream file 513. Consequently, as shown in FIG. 7C, the plurality of source packets 702 are sequentially arranged in the AV stream file 513. Serial numbers are assigned to the source packets 702 from the top of the AV stream file 513. The serial numbers are called SPNs (source packet numbers). [0053] The TS packets contained in the AV stream file include those converted from elementary streams representing audio, video, subtitles, and the like. The TS packets also include those carrying a PAT (Program Association Table), a PMT (Program Map Table), a PCR (Program Clock Reference), and the like.
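The 192-byte source packet layout of paragraph [0052] can be sketched as follows. This is a minimal parser under stated assumptions: the ATS is taken to occupy the low 30 bits of the 4-byte TP_Extra_Header, and only the 13-bit PID is extracted from the TS header; all other header fields are ignored.

```python
import struct

def parse_source_packet(sp: bytes):
    """Split a 192-byte source packet into its ATS, PID, and TS payload.

    Assumed layout: 4-byte TP_Extra_Header (ATS in the low 30 bits),
    then a 188-byte TS packet = 4-byte TS header + 184-byte TS payload.
    """
    assert len(sp) == 192
    extra_header, = struct.unpack(">I", sp[:4])
    ats = extra_header & 0x3FFFFFFF            # low 30 bits: Arrival_Time_Stamp
    ts_packet = sp[4:]                         # the 188-byte TS packet
    ts_header = ts_packet[:4]                  # 4-byte TS header
    pid = ((ts_header[1] & 0x1F) << 8) | ts_header[2]  # 13-bit PID
    payload = ts_packet[4:]                    # 184-byte TS payload
    return ats, pid, payload
```

The SPN of a source packet is then simply its byte offset in the AV stream file divided by 192.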
The PAT shows the PID of the PMT included in the same AV stream file. The PID of the PAT itself is 0. The PMT stores the PIDs identifying the elementary streams representing video, audio, subtitles, and the like included in the same AV stream file, as well as the attribute information of the elementary streams. The PMT also has various descriptors relating to the AV stream file. The descriptors particularly include information such as copy control information showing whether copying of the AV stream file is permitted or not. The PCR stores information indicating the value of the STC (System Time Clock) to be associated with the ATS of the packet carrying the PCR. The STC is a clock used in a decoder as the reference for the PTS and the DTS. With the use of the PCR, the decoder synchronizes the STC with the ATC, which is the reference clock for the ATS. [0054] FIG. 8 is a schematic diagram showing the data structure of the PMT 810. The PMT 810 includes, from the top thereof, a PMT header 801, a plurality of descriptors 802, and a plurality of pieces of stream information 803. The PMT header 801 indicates the length of the data stored in the PMT 810. Each descriptor 802 relates to the entire AV stream file 513. The aforementioned copy control information is described in one of the descriptors 802. Each piece of stream information 803 relates to a different one of the elementary streams included in the AV stream file 513. Each piece of stream information 803 includes a stream type 803A, a PID 803B, and a stream descriptor 803C. The stream type 803A includes identification information of the codec used for compressing the elementary stream. The PID 803B indicates the PID of the elementary stream. The stream descriptor 803C includes attribute information of the elementary stream, such as a frame rate and an aspect ratio. [0055] <> [0056] FIG. 9 is a schematic diagram showing the data structure of a clip information file. As shown in FIG.
9, the clip information file 2045A is in one-to-one correspondence with the AV stream file 2046A. The clip information file 2045A includes clip information 901, stream attribute information 902, and an entry map 903. [0057] As shown in FIG. 9, the clip information 901 includes a system rate 901A, a playback start time 901B, and a playback end time 901C. The system rate 901A indicates the maximum transfer rate at which the AV stream file 2046A is transferred to the PID filter in the system target decoder, which is described later. The intervals between the ATSs of the source packets in the AV stream file 2046A are set so that the transfer rate of the source packets is limited to the system rate or lower. The playback start time 901B shows the PTS of the video access unit located at the top of the AV stream file 2046A. For instance, the playback start time 901B shows the PTS of the first video frame. The playback end time 901C shows the value of the STC delayed by a predetermined time from the PTS of the video access unit located at the end of the AV stream file 2046A. For instance, the playback end time 901C shows the sum of the PTS of the last video frame and the playback time of one frame. [0058] FIG. 10 is a schematic diagram showing the data structure of the stream attribute information 902. As shown in FIG. 10, pieces of attribute information of the elementary streams are associated with different PIDs 902A. Each piece of attribute information differs depending on whether it corresponds to a video stream, an audio stream, a PG stream, or an IG stream. For example, each piece of video stream attribute information 902B corresponds to a video stream and includes a codec type 9021 used for the compression of the video stream as well as a resolution 9022, an aspect ratio 9023, and a frame rate 9024 of the pictures composing the video stream.
On the other hand, each piece of audio stream attribute information 902C corresponds to an audio stream and includes a codec type 9025 used for compressing the audio stream, the number of channels 9026 included in the audio stream, a language 9027, and a sampling frequency 9028. These pieces of attribute information 902B and 902C are used for initializing a decoder in the playback device 102. [0059] FIG. 11A is a schematic diagram showing the data structure of the entry map 903. As shown in FIG. 11A, one entry map is provided for each of the video streams in the AV stream file 2046A and is associated with the PID of the corresponding video stream. The entry map 9031 of a video stream includes an entry map header 1101 and entry points 1102, in the stated order from the top. The entry map header 1101 includes the PID of the corresponding video stream and the total number of the entry points 1102. Each entry point 1102 is information showing a pair of a PTS 1103 and an SPN 1104 in correspondence with a different entry map ID (EP_ID) 1105. The PTS 1103 indicates the PTS of an I picture in the video stream, and the SPN 1104 indicates the SPN of the first source packet containing that I picture in the AV stream file 2046A. [0060] FIG. 11B schematically shows, out of the source packets included in the AV stream file 2046A, the source packets whose correspondence with EP_IDs is shown by the entry map 903. With reference to the entry map 903, the playback device 102 can specify the SPN within the AV stream file 2046A corresponding to an arbitrary point during the playback of the video stream. For instance, to execute special playback such as fast-forward or rewind, the playback device 102 specifies the source packets having the SPNs corresponding to the EP_IDs by using the entry map 903, and selectively extracts and decodes those source packets. As a result, I pictures can be selectively played back.
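The random-access use of the entry map in paragraphs [0059]-[0060] amounts to a lookup: given an arbitrary PTS, find the last entry point at or before it and start reading at that entry's SPN, which points at an I picture. A minimal sketch, with illustrative (PTS, SPN) values that are not from the specification:

```python
import bisect

# Hypothetical entry points: (PTS, SPN) pairs sorted by PTS, one per
# I picture, mirroring the entry map 903 described above.
entry_points = [(0, 0), (90000, 240), (180000, 510), (270000, 823)]

def spn_for_pts(points, pts):
    """Return the SPN of the last entry point at or before the given PTS,
    i.e., the source packet from which reading starts for random access."""
    idx = bisect.bisect_right([p for p, _ in points], pts) - 1
    if idx < 0:
        raise ValueError("PTS precedes the first entry point")
    return points[idx][1]
```

For fast-forward, the playback device would walk these entry points in order and decode only the I pictures they reference.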
Thus, the playback device 102 can efficiently perform the special playback without analyzing the AV stream file 2046A. [0061] <> [0062] FIG. 12 is a schematic diagram showing the data structure of a playlist file 1200. The playlist file 1200 indicates the playback path of an AV stream file 1204. More specifically, the playlist file 1200 shows the portions P1, P2, and P3 to be actually decoded in the AV stream file 1204 and the decoding order of these portions P1, P2, and P3. The playlist file 1200 particularly defines, with PTSs, the range of each of the portions P1, P2, and P3 to be decoded. The PTSs thus defined are converted to SPNs of the AV stream file 1204 by using the clip information file 1203. As a result, the range of each of the portions P1, P2, and P3 is defined with SPNs. [0063] As shown in FIG. 12, the playlist file 1200 includes at least one piece of playitem (PI) information 1201. Each piece of playitem information 1201 defines a different one of the playback sections in the playback path by using a pair of PTSs respectively representing the start time T1 and the end time T2. Each piece of playitem information 1201 is identified by a playitem ID unique to that piece of playitem information 1201. The pieces of playitem information 1201 are written in the same order as the order of the corresponding playback sections in the playback path. The playback path of the series of playback sections defined by the pieces of playitem information 1201 is referred to as the "main path" 1205. [0064] The playlist file 1200 further includes an entry mark 1202. The entry mark 1202 shows a time point in the main path 1205 to be actually played back. The entry mark 1202 can be assigned to a playback section defined by the playitem information 1201. For example, as shown in FIG. 12, a plurality of entry marks 1202 are assigned to one piece of playitem information PI #1.
The entry mark 1202 is particularly used for searching for the start position of playback when random access is performed. When the playlist file 1200 defines a playback path for a movie title, for instance, the entry marks 1202 may be assigned to the top of each chapter. Consequently, the playback device 102 enables the movie title to be played back starting from any of the chapters. [0065] FIG. 13 is a schematic diagram showing the data structure of playitem information 1300. As shown in FIG. 13, the playitem information 1300 includes reference clip information 1301, a playback start time 1302, a playback end time 1303, a connection condition 1310, and a stream selection table 1305. [0066] The reference clip information 1301 identifies the clip information file that is necessary for converting PTSs to SPNs. The playback start time 1302 and the playback end time 1303 respectively show the PTSs of the top and the end of the portion of the AV stream file to be decoded. The playback device 102 refers to the entry map in the clip information file indicated by the reference clip information 1301, and obtains the SPNs respectively corresponding to the playback start time 1302 and the playback end time 1303. Thus, the playback device 102 identifies the portion of the AV stream file from which to start reading, and plays back the AV stream starting from the identified portion. [0067] The connection condition 1310 specifies a condition on the connection between the video images to be played back in the playback section defined by the pair of the playback start time 1302 and the playback end time 1303 and the video images to be played back in the playback section specified by the previous piece of playitem information in the playlist file. The connection condition 1310 has three types, "1", "5", and "6", for example.
When the connection condition 1310 indicates "1", the video images to be played back from the portion of the AV stream file specified by the piece of playitem information do not need to be seamlessly connected with the video images to be played back from the portion of the AV stream file specified by the previous piece of playitem information. On the other hand, when the connection condition 1310 indicates "5" or "6", the two sets of video images need to be seamlessly connected with each other. [0068] FIGS. 14A and 14B each schematically show the relationship between the playback sections defined by the pieces of playitem information to be connected when the connection condition 1310 indicates "5" or "6". When the connection condition 1310 indicates "5", as shown in FIG. 14A, the STCs of the two pieces of playitem information PI#1 and PI#2 may be inconsecutive. That is, the PTS TE at the end of the first AV stream file 1401F defined by the preceding first playitem information PI#1 and the PTS TS at the top of the second AV stream file 1401B defined by the following second playitem information PI#2 may be inconsecutive. Note that, in this case, several constraint conditions must be satisfied. For example, when the second AV stream file 1401B is supplied to a decoder subsequently to the first AV stream file 1401F, each of the AV stream files needs to be created so that the decoder can smoothly decode the files. Furthermore, the last frame of the audio stream contained in the first AV stream file must overlap the first frame of the audio stream contained in the second AV stream file. On the other hand, when the connection condition 1310 indicates "6", as shown in FIG. 14B, the first AV stream file 1402F and the second AV stream file 1402B must be handled as a single series of AV stream files in order for the decoder to duly perform the decoding processing. That is, the STCs and the ATCs must be consecutive between the first AV stream file 1402F and the second AV stream file 1402B.
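The three connection condition types of paragraphs [0067]-[0068] can be summarized as a check. This is a sketch only: the parameter names are illustrative, the condition values are modeled as integers, and only the subset of constraints described above is encoded.

```python
def connection_ok(cc, stc_continuous, atc_continuous, audio_frames_overlap):
    """Check whether a connection between two adjacent playback sections
    satisfies connection condition "1", "5", or "6" as described above."""
    if cc == 1:
        return True                      # no seamless connection required
    if cc == 5:
        # STCs may be inconsecutive, but the audio frames at the boundary
        # must overlap (other decoder constraints omitted in this sketch).
        return audio_frames_overlap
    if cc == 6:
        # The two files are handled as one series: STC and ATC continuous.
        return stc_continuous and atc_continuous
    raise ValueError("unknown connection condition")
```

So a "5" connection tolerates an STC discontinuity that a "6" connection does not, at the cost of the audio-overlap constraint.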
[0069] Referring to FIG. 13 again, the stream selection table 1305 shows a list of the elementary streams that the decoder in the playback device 102 can select from the AV stream file during the time between the playback start time 1302 and the playback end time 1303. The stream selection table 1305 particularly includes a plurality of stream entries 1309. Each of the stream entries 1309 includes a stream selection number 1306, stream path information 1307, and stream identification information 1308 of a corresponding elementary stream. The stream selection numbers 1306 are serial numbers assigned to the stream entries 1309, and are used by the playback device 102 to identify the elementary streams. Each piece of stream path information 1307 shows the AV stream file to which the elementary stream to be selected belongs. For example, if the stream path information 1307 shows "main path", the AV stream file is the one corresponding to the clip information file indicated by the reference clip information 1301. If the stream path information 1307 shows "subpath ID = 1", the AV stream file to which the elementary stream to be selected belongs is the AV stream file defined by a piece of sub-playitem information included in the subpath whose subpath ID is 1. That piece of sub-playitem information defines a playback section that falls between the playback start time 1302 and the playback end time 1303. Note that subpaths and sub-playitem information are described in the next section of this specification. Each piece of stream identification information 1308 indicates the PID of a corresponding one of the elementary streams multiplexed in the AV stream file specified by the stream path information 1307. The elementary streams indicated by these PIDs are selectable during the time between the playback start time 1302 and the playback end time 1303. Although not shown in FIG. 13, each stream entry 1309 also contains attribute information of the corresponding elementary stream.
For example, the attribute information of an audio stream, a PG stream, and an IG stream indicates the language type of the stream. [0070] FIG. 15 is a schematic diagram showing the data structure of a playlist file 1500 when the playback path to be defined includes subpaths. As shown in FIG. 15, the playlist file 1500 may include one or more subpaths in addition to the main path 1501. The subpaths 1502 and 1503 are each a playback path parallel to the main path 1501. Serial numbers are assigned to the subpaths 1502 and 1503 in the order in which they are registered in the playlist file 1500. Each serial number is used as a subpath ID for identifying the corresponding subpath. Similarly to the main path 1501, which is a playback path of the series of playback sections defined by the pieces of playitem information #1-3, each of the subpaths 1502 and 1503 is a playback path of a series of playback sections defined by pieces of sub-playitem information #1-3. The data structure of the sub-playitem information 1502A is identical with the data structure of the playitem information shown in FIG. 13. That is, each piece of sub-playitem information 1502A includes reference clip information, a playback start time, and a playback end time. The playback start time and the playback end time of the sub-playitem information are expressed on the same time axis as the playback time of the main path 1501. For example, in a stream entry 1309 included in the stream selection table 1305 of the playitem information #2, assume that the stream path information 1307 indicates "subpath ID = 0" and the stream identification information 1308 indicates the PG stream #1. Then, in the subpath 1502 with subpath ID = 0, for the playback section of the playitem information #2, the PG stream #1 is selected as the decode target from the AV stream file corresponding to the clip information file shown by the reference clip information of the sub-playitem information #2.
[0071] Furthermore, the sub-playitem information includes a field called an SP connection condition. The SP connection condition carries the same meaning as the connection condition of the playitem information. That is, when the SP connection condition indicates "5" or "6", the portions of the AV stream files defined by two adjacent pieces of sub-playitem information need to satisfy the same conditions as those described above. [0072] [0073] Next, the configuration for the playback device 102 to play back 2D video images from the BD-ROM disc 101, i.e., the configuration of a 2D playback device, will be described below. [0074] FIG. 16 is a functional block diagram showing a 2D playback device 1600. The 2D playback device 1600 includes a BD-ROM drive 1601, a playback unit 1600A, and a control unit 1600B. The playback unit 1600A includes a read buffer 1602, a system target decoder 1603, and a plane adder 1610. The control unit 1600B includes a dynamic scenario memory 1604, a static scenario memory 1605, a program execution unit 1606, a playback control unit 1607, a player variable storage unit 1608, and a user event processing unit 1609. The playback unit 1600A and the control unit 1600B are each implemented on a different integrated circuit. Alternatively, the playback unit 1600A and the control unit 1600B may be implemented on a single integrated circuit. [0075] When the BD-ROM disc 101 is loaded into the BD-ROM drive 1601, the BD-ROM drive 1601 radiates laser light onto the disc 101 and detects changes in the light reflected from the disc 101. Using the changes in the amount of reflected light, the BD-ROM drive 1601 reads data recorded on the disc 101. The BD-ROM drive 1601 has an optical head, for example. The optical head has a semiconductor laser, a collimate lens, a beam splitter, an objective lens, a collecting lens, and an optical detector.
A beam of light radiated from the semiconductor laser sequentially passes through the collimate lens, the beam splitter, and the objective lens to be collected on a recording layer of the BD-ROM disc 101. The collected beam is reflected and diffracted by the recording layer. The reflected and diffracted light passes through the objective lens, the beam splitter, and the collecting lens, and is collected onto the optical detector. As a result, a playback signal is generated at a level in accordance with the intensity of the collected light, and the data is decoded from the playback signal. [0076] The BD-ROM drive 1601 reads data from the BD-ROM disc 101 based on a request from the playback control unit 1607. Out of the read data, an AV stream file is transferred to the read buffer 1602, a playlist file and a clip information file are transferred to the static scenario memory 1605, and an index file, a movie object file, and a BD-J object file are transferred to the dynamic scenario memory 1604. [0077] The read buffer 1602, the dynamic scenario memory 1604, and the static scenario memory 1605 are each a buffer memory. A memory device in the playback unit 1600A is used as the read buffer 1602. Memory devices in the control unit 1600B are used as the dynamic scenario memory 1604 and the static scenario memory 1605. Alternatively, different areas in a single memory device may be used as these memories 1602, 1604, and 1605. The read buffer 1602 stores an AV stream file. The static scenario memory 1605 stores a playlist file and a clip information file, namely static scenario information. The dynamic scenario memory 1604 stores dynamic scenario information, such as an index file, a movie object file, and a BD-J object file. [0078] The system target decoder 1603 reads an AV stream file from the read buffer 1602 in units of source packets and demultiplexes the AV stream file.
The system target decoder 1603 then decodes each of the elementary streams obtained by the demultiplexing. Information necessary for decoding each elementary stream, such as the codec type and the stream attributes, is transferred from the playback control unit 1607 to the system target decoder 1603. The system target decoder 1603 outputs the decoded primary video stream, secondary video stream, IG stream, and PG stream in units of video access units. The output data are used as primary video plane data, secondary video plane data, IG plane data, and PG plane data, respectively. On the other hand, the system target decoder 1603 mixes the decoded primary audio stream and secondary audio stream, and outputs the resultant data to an audio output device, such as the internal speaker 103A of the display device 103. In addition, the system target decoder 1603 receives graphics data from the program execution unit 1606. The graphics data is used for rendering graphics such as a GUI menu on a screen, and is in a raster data format such as JPEG or PNG. The system target decoder 1603 processes the graphics data and outputs the result as image plane data. Details of the system target decoder 1603 are described later. [0079] The user event processing unit 1609 detects a user operation via the remote control 104 or the front panel of the playback device 102. Based on the user operation, the user event processing unit 1609 requests the program execution unit 1606 or the playback control unit 1607 to perform the relevant processing. For example, when a user instructs the device to display a pop-up menu by pushing a button on the remote control 104, the user event processing unit 1609 detects the push and identifies the button. The user event processing unit 1609 further requests the program execution unit 1606 to execute the command corresponding to the button, i.e., a command to display the pop-up menu.
On the other hand, when a user pushes a fast-forward or rewind button on the remote control 104, for example, the user event processing unit 1609 detects the push and identifies the button. In addition, the user event processing unit 1609 requests the playback control unit 1607 to fast-forward or rewind the playback currently being executed according to a playlist. [0080] The playback control unit 1607 controls the transfer of files, such as AV stream files and the index file, from the BD-ROM disc 101 to the read buffer 1602, the dynamic scenario memory 1604, and the static scenario memory 1605. A file system managing the directory/file structure 204 shown in FIG. 2 is used for this control. That is, the playback control unit 1607 causes the BD-ROM drive 1601 to transfer the files to each of the memories 1602, 1604, and 1605 by using a system call for opening files. The file opening consists of the following series of processes. First, the name of the file to be detected is given to the file system by a system call, and an attempt is made to find the file in the directory/file structure 204. When the detection is successful, the content of the file entry of the target file is transferred to a memory in the playback control unit 1607, and an FCB (File Control Block) is generated in that memory. Subsequently, a file handle for the target file is returned from the file system to the playback control unit 1607. After this, the playback control unit 1607 can transfer the target file from the BD-ROM disc 101 to each of the memories 1602, 1604, and 1605 by presenting the file handle to the BD-ROM drive 1601. [0081] The playback control unit 1607 decodes the AV stream file to output video data and audio data by controlling the BD-ROM drive 1601 and the system target decoder 1603.
More specifically, the playback control unit 1607 reads a playlist file from the static scenario memory 1605 in response to an instruction from the program execution unit 1606 or a request from the user event processing unit 1609, and interprets the content of the file. In accordance with the interpreted content, and particularly with the playback path, the playback control unit 1607 specifies the AV stream to be played back, and instructs the BD-ROM drive 1601 and the system target decoder 1603 to read and decode that AV stream. Such playback processing based on a playlist file is called playlist playback. In addition, the playback control unit 1607 sets various types of player variables in the player variable storage unit 1608 by using the static scenario information. With reference to the player variables, the playback control unit 1607 specifies the elementary streams to be decoded, and provides the system target decoder 1603 with the information necessary for decoding the elementary streams. [0082] The player variable storage unit 1608 is composed of a group of registers for storing player variables. The player variables include system parameters (SPRMs) showing the status of the player 102, and general parameters (GPRMs) for general use. FIG. 17 is a list of the SPRMs. Each SPRM is assigned a serial number 1701, and each serial number 1701 is associated with a variable value 1702. The contents of the major SPRMs are shown below. Here, the bracketed numbers indicate the serial numbers.
[0083]
SPRM (0) : Language code
SPRM (1) : Primary audio stream number
SPRM (2) : Subtitle stream number
SPRM (3) : Angle number
SPRM (4) : Title number
SPRM (5) : Chapter number
SPRM (6) : Program number
SPRM (7) : Cell number
SPRM (8) : Selected key name
SPRM (9) : Navigation timer
SPRM (10) : Current playback time
SPRM (11) : Player audio mixing mode for Karaoke
SPRM (12) : Country code for parental management
SPRM (13) : Parental level
SPRM (14) : Player configuration for video
SPRM (15) : Player configuration for audio
SPRM (16) : Language code for audio stream
SPRM (17) : Language code extension for audio stream
SPRM (18) : Language code for subtitle stream
SPRM (19) : Language code extension for subtitle stream
SPRM (20) : Player region code
SPRM (21) : Secondary video stream number
SPRM (22) : Secondary audio stream number
SPRM (23) : Player status
SPRM (24) : Reserved
SPRM (25) : Reserved
SPRM (26) : Reserved
SPRM (27) : Reserved
SPRM (28) : Reserved
SPRM (29) : Reserved
SPRM (30) : Reserved
SPRM (31) : Reserved
[0084] The SPRM (10) is the PTS of the picture currently being decoded, and is updated every time a picture is decoded and written into the primary video plane memory. Accordingly, the current playback point can be known by referring to the SPRM (10). [0085] The language code for the audio stream in the SPRM (16) and the language code for the subtitle stream in the SPRM (18) show the default language codes of the player. These codes may be changed by a user with use of the OSD (On-Screen Display) of the player 102 or the like, or may be changed by an application program via the program execution unit 1606. For example, if the SPRM (16) shows "English", then in the playback processing of a playlist, the playback control unit 1607 first searches the stream selection table in the playitem information for a stream entry having the language code for "English". The playback
control unit 1607 then extracts the PID from the stream identification information of the stream entry and transmits the extracted PID to the system target decoder 1603. As a result, the audio stream having that PID is selected and decoded by the system target decoder 1603. This processing can be executed by the playback control unit 1607 with use of the movie object file or the BD-J object file. [0086] During playback processing, the playback control unit 1607 updates the player variables in accordance with the status of the playback. The playback control unit 1607 updates the SPRM (1), the SPRM (2), the SPRM (21), and the SPRM (22) in particular. These SPRMs respectively show, in the stated order, the stream selection numbers of the audio stream, the subtitle stream, the secondary video stream, and the secondary audio stream that are currently being processed. As one example, assume that the audio stream number SPRM (1) has been changed by the program execution unit 1606. In this case, the playback control unit 1607 first searches the stream selection table in the playitem information currently being played back for a stream entry including a stream selection number that matches the stream selection number shown by the changed SPRM (1). The playback control unit 1607 then extracts the PID from the stream identification information in the stream entry and transmits the extracted PID to the system target decoder 1603. As a result, the audio stream having that PID is selected and decoded by the system target decoder 1603. This is how the audio stream targeted for playback is switched. The subtitle stream and the secondary video stream to be played back can be switched in a similar manner. [0087] The program execution unit 1606 is a processor and executes programs stored in the movie object file or the BD-J object file. The program execution unit 1606 executes the following controls in particular, in accordance with the programs.
(1) The program execution unit 1606 instructs the playback control unit 1607 to perform playlist playback processing. (2) The program execution unit 1606 generates graphics data for a menu or a game as PNG or JPEG raster data, and transfers the generated data to the system target decoder 1603 to be composited with other video data. The specific contents of these controls can be designed relatively flexibly through program design. That is, the contents of the controls are determined by the programming procedure of the movie object file and the BD-J object file in the authoring procedure of the BD-ROM disc 101. [0088] The plane adder 1610 receives primary video plane data, secondary video plane data, IG plane data, PG plane data, and image plane data from the system target decoder 1603, and composites these data into one video frame or field by superimposition. The resultant composited video data is outputted to the display device 103 and displayed on the screen thereof. [0089] <> [0090] FIG. 18 is a functional block diagram of the system target decoder 1603. As shown in FIG. 18, the system target decoder 1603 includes a source depacketizer 1810, an ATC counter 1820, a first 27 MHz clock 1830, a PID filter 1840, an STC counter (STC1) 1850, a second 27 MHz clock 1860, a primary video decoder 1870, a secondary video decoder 1871, a PG decoder 1872, an IG decoder 1873, a primary audio decoder 1874, a secondary audio decoder 1875, an image processor 1880, a primary video plane memory 1890, a secondary video plane memory 1891, a PG plane memory 1892, an IG plane memory 1893, an image plane memory 1894, and an audio mixer 1895. [0091] The source depacketizer 1810 reads source packets from the read buffer 1602, extracts the TS packets from the read source packets, and transfers the TS packets to the PID filter 1840. The source depacketizer 1810 further adjusts the timing of the transfer in accordance with the ATS of each source packet.
Specifically, the source depacketizer 1810 first monitors the value of the ATC generated by the ATC counter 1820. Here, the value of the ATC is the value of the ATC counter 1820, and is incremented in accordance with the pulses of the clock signal of the first 27 MHz clock 1830. Subsequently, at the instant the value of the ATC matches the ATS of a source packet, the source depacketizer 1810 transfers the TS packet extracted from that source packet to the PID filter 1840 at the recording rate RTS1 of the AV stream file. [0092] The PID filter 1840 first selects, from among the TS packets outputted from the source depacketizer 1810, the TS packets whose PIDs match a PID pre-specified by the playback control unit 1607. The PID filter 1840 then transfers the selected TS packets to the decoders 1870-1875 depending on the PIDs of the TS packets. For instance, a TS packet with PID 0x1011 is transferred to the primary video decoder 1870, and TS packets with PIDs ranging from 0x1B00 to 0x1B1F, 0x1100 to 0x111F, 0x1A00 to 0x1A1F, 0x1200 to 0x121F, and 0x1400 to 0x141F are transferred to the secondary video decoder 1871, the primary audio decoder 1874, the secondary audio decoder 1875, the PG decoder 1872, and the IG decoder 1873, respectively. [0093] The PID filter 1840 further detects the PCR from TS packets by using their PIDs. Upon that detection, the PID filter 1840 sets the value of the STC counter 1850 to a predetermined value. Here, the value of the STC counter 1850 is incremented in accordance with the pulses of the clock signal of the second 27 MHz clock 1860. The value to which the STC counter 1850 is set is specified to the PID filter 1840 in advance by the playback control unit 1607. The decoders 1870-1875 each use the value of the STC counter 1850 as the STC.
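The PID-based demultiplexing of paragraph [0092] reduces to a range check per packet. A minimal sketch of that routing (the returned strings are just labels naming the decoders from the description; they are not part of any real API):

```python
def route_to_decoder(pid):
    """Map a TS packet's PID to the decoder named in paragraph [0092]."""
    if pid == 0x1011:
        return "primary video decoder 1870"
    if 0x1B00 <= pid <= 0x1B1F:
        return "secondary video decoder 1871"
    if 0x1100 <= pid <= 0x111F:
        return "primary audio decoder 1874"
    if 0x1A00 <= pid <= 0x1A1F:
        return "secondary audio decoder 1875"
    if 0x1200 <= pid <= 0x121F:
        return "PG decoder 1872"
    if 0x1400 <= pid <= 0x141F:
        return "IG decoder 1873"
    return None  # not selected by the PID filter
```

A packet whose PID matches none of the pre-specified values or ranges is simply dropped by the filter.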
That is, the decoders 1870-1875 perform decoding processing on the TS packets outputted from the PID filter 1840 at the time indicated by the PTS or the DTS shown by the TS packets. [0094] The primary video decoder 1870, as shown in FIG. 18, includes a TB (Transport Stream Buffer) 1801, an MB (Multiplexing Buffer) 1802, an EB (Elementary Stream Buffer) 1803, a compressed video decoder (Dec) 1804, and a DPB (Decoded Picture Buffer) 1805. The TB 1801, the MB 1802, the EB 1803, and the DPB 1805 are each buffer memory, and each use an area of a memory device internally provided in the primary video decoder 1870. Some or all of the TB 1801, the MB 1802, the EB 1803, and the DPB 1805 may be provided in separate memory devices. The TB 1801 stores the TS packets received from the PID filter 1840 as they are. The MB 1802 stores PES packets reconstructed from the TS packets stored in the TB 1801. Note that when the TS packets are transferred from the TB 1801 to the MB 1802, the TS header is removed from each TS packet. The EB 1803 extracts an encoded video access unit from the PES packets and stores the extracted encoded video access unit therein. The video access unit includes compressed pictures, i.e., I pictures, B pictures, and P pictures. Note that when data is transferred from the MB 1802 to the EB 1803, the PES header is removed from each PES packet. The compressed video decoder 1804 decodes each video access unit in the EB 1803 at the time of the DTS shown by the original TS packet. Herein, the compressed video decoder 1804 changes its decoding scheme in accordance with the compression encoding format, e.g., MPEG-2, MPEG-4 AVC, and VC-1, and the stream attribute of the compressed pictures stored in the video access unit. The compressed video decoder 1804 further transfers the decoded pictures, i.e., video data of a frame or a field, to the DPB 1805. The DPB 1805 temporarily stores the decoded pictures.
When decoding a P picture or a B picture, the compressed video decoder 1804 refers to the decoded pictures stored in the DPB 1805. The DPB 1805 further writes each of the stored pictures into the primary video plane memory 1890 at the time of the PTS shown by the original TS packet. [0095] The secondary video decoder 1871 has the same structure as the primary video decoder 1870. The secondary video decoder 1871 first decodes the TS packets of the secondary video stream received from the PID filter 1840 into uncompressed pictures. Subsequently, the secondary video decoder 1871 writes the resultant uncompressed pictures into the secondary video plane memory 1891 at the time of the PTS shown by the TS packet. [0096] The PG decoder 1872 decodes the TS packets received from the PID filter 1840 into uncompressed graphics data, and writes the resultant uncompressed graphics data to the PG plane memory 1892 at the time of the PTS shown by the TS packet. [0097] The IG decoder 1873 decodes the TS packets received from the PID filter 1840 into uncompressed graphics data, and writes the resultant uncompressed graphics data to the IG plane memory 1893 at the time of the PTS shown by the TS packet. [0098] The primary audio decoder 1874 first stores the TS packets received from the PID filter 1840 into a buffer provided therein. Subsequently, the primary audio decoder 1874 removes the TS header and the PES header from each TS packet in the buffer, and decodes the remaining data into uncompressed LPCM audio data. The primary audio decoder 1874 further outputs the resultant audio data to the audio mixer 1895 at the time of the PTS shown by the original TS packet. The primary audio decoder 1874 changes its decoding scheme in accordance with the compression encoding format, e.g., Dolby Digital Plus and DTS-HD LBR, and the stream attribute of the primary audio stream included in the TS packets.
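The routing performed by the PID filter 1840 in [0092] amounts to a lookup from PID ranges to destination decoders. The following Python fragment is only an illustrative sketch of that rule; the decoder names are hypothetical placeholders standing in for the components 1870-1875.

```python
# Sketch of the PID-based routing of [0092]. PID values and ranges are
# taken from the description; the returned names are illustrative labels.

def route_ts_packet(pid):
    """Return the label of the decoder a TS packet with this PID goes to."""
    if pid == 0x1011:
        return "primary_video_decoder"      # 1870
    if 0x1B00 <= pid <= 0x1B1F:
        return "secondary_video_decoder"    # 1871
    if 0x1100 <= pid <= 0x111F:
        return "primary_audio_decoder"      # 1874
    if 0x1A00 <= pid <= 0x1A1F:
        return "secondary_audio_decoder"    # 1875
    if 0x1200 <= pid <= 0x121F:
        return "pg_decoder"                 # 1872
    if 0x1400 <= pid <= 0x141F:
        return "ig_decoder"                 # 1873
    return None                             # PID not pre-specified: discarded

print(route_ts_packet(0x1011))  # primary_video_decoder
print(route_ts_packet(0x1203))  # pg_decoder
```

Because each elementary stream occupies a disjoint PID range, a single comparison chain of this kind suffices to demultiplex the transport stream.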
[0099] The secondary audio decoder 1875 has the same structure as the primary audio decoder 1874. The secondary audio decoder 1875 decodes the TS packets of the secondary audio stream received from the PID filter 1840 into uncompressed LPCM audio data. Subsequently, the secondary audio decoder 1875 outputs the uncompressed LPCM audio data to the audio mixer 1895 at the time of the PTS shown by the original TS packet. The secondary audio decoder 1875 changes its decoding scheme in accordance with the compression encoding format, e.g., Dolby Digital Plus, DTS-HD LBR, or the like, and the stream attribute of the secondary audio stream included in the TS packets. [0100] The audio mixer 1895 mixes (superimposes) the uncompressed audio data outputted from the primary audio decoder 1874 and the uncompressed audio data outputted from the secondary audio decoder 1875 with each other. The audio mixer 1895 further outputs the resultant composited audio to an internal speaker 103A of the display device 103 or the like. [0101] The image processor 1880 receives graphics data, i.e., PNG or JPEG raster data, along with the PTS thereof from the program execution unit 1606. Upon the reception of the graphics data, the image processor 1880 appropriately processes the graphics data and writes the graphics data to the image plane memory 1894 at the time of the PTS thereof. [0102] [0123] Playback methods of stereoscopic video are roughly classified into two categories, i.e., methods using a holographic technique and methods using parallax images. [0124] The feature of the methods using the holographic technique is to allow a viewer to perceive objects in video as three-dimensional by giving the viewer's visual perception substantially the same information as the optical information provided to the visual perception of human beings by actual objects.
However, although a technical theory for utilizing these methods for moving video display has been established, it is extremely difficult to realize, with present technology, a computer capable of the real-time processing of the enormous amount of calculation required for moving video display, or a display device having a super-high resolution of several thousand lines per millimeter. Accordingly, there is hardly any prospect of when these methods can be realized for commercial use. [0125] On the other hand, the feature of the methods using parallax images is as follows. For one scene, video images for the right eye of a viewer and video images for the left eye of the viewer are separately generated. Subsequently, each video image is played back so as to allow only the corresponding eye of the viewer to perceive it, thereby allowing the viewer to recognize the scene as three-dimensional. [0126] FIGS. 24A, 24B, and 24C are schematic diagrams illustrating the principle of playing back 3D video images (stereoscopic video images) according to a method using parallax images. FIG. 24A shows, from above, a viewer 251 looking at a cube 252 placed in front of the viewer's face. FIG. 24B shows the outer appearance of the cube 252 as perceived by a left eye 251L of the viewer 251. FIG. 24C shows the outer appearance of the cube 252 as perceived by a right eye 251R of the viewer 251. As is clear from comparing FIG. 24B and FIG. 24C, the outer appearances of the cube 252 as perceived by the two eyes are slightly different. This difference in the outer appearances, i.e., the binocular parallax, allows the viewer 251 to recognize the cube 252 as three-dimensional. Thus, according to a method using parallax images, two images with different viewpoints are first prepared for one scene. For example, for the cube 252 placed in front of the face of the viewer 251 as shown in FIG. 24A, two video images with different viewpoints, e.g., FIGS. 24B and 24C, are prepared.
Here, the difference between the viewpoints is determined by the binocular parallax of the viewer 251. Next, each video image is played back so as to allow the corresponding eye of the viewer 251 to perceive it. Consequently, the viewer 251 recognizes the scene played back on the screen, i.e., the video image of the cube 252, as three-dimensional. As described above, unlike the methods using the holographic technique, the methods using parallax images have the advantage of requiring video images from merely two viewpoints. Hereinafter, video images for the left eye are referred to as "left video images" or "left views", and video images for the right eye are referred to as "right video images" or "right views". Additionally, video images including both the video images for the left eye and the video images for the right eye are referred to as "3D video images". [0127] The methods using parallax images are further classified into several methods from the standpoint of how to show the video images for the right or left eye to the corresponding eye of the viewer. [0128] One of these methods is called alternate-frame sequencing. According to this method, right video images and left video images are alternately displayed on a screen for a predetermined time, and the viewer observes the screen through stereoscopic glasses with liquid crystal shutters. Herein, the lenses of the stereoscopic glasses with liquid crystal shutters (also referred to as "shutter glasses") are each made of a liquid crystal panel. The lenses pass or block light in a uniform and alternate manner in synchronization with the video-image switching on the screen. That is, each lens functions as a shutter that periodically blocks an eye of the viewer. More specifically, while a left video image is displayed on the screen, the shutter glasses make the left-side lens transmit light and the right-side lens block light.
While a right video image is displayed on the screen, conversely, the shutter glasses make the right-side lens transmit light and the left-side lens block light. As a result, the eyes of the viewer see afterimages of the right and left video images overlaid with each other, and perceive a stereoscopic video image. [0129] According to the alternate-frame sequencing, as described above, right and left video images are alternately displayed in a predetermined cycle. Thus, for example, when 24 video frames are displayed per second for playing back a normal 2D movie, 48 video frames in total for both right and left eyes need to be displayed for a 3D movie. Accordingly, a display device able to quickly execute rewriting of the screen is preferred for this method. [0130] Another method uses a lenticular lens. According to this method, a right video frame and a left video frame are each divided into reed-shaped small and narrow areas whose longitudinal sides lie in the vertical direction of the screen. On the screen, the small areas of the right video frame and the small areas of the left video frame are alternately arranged in the landscape direction of the screen and displayed at the same time. Herein, the surface of the screen is covered by a lenticular lens. The lenticular lens is a sheet-shaped lens constituted from multiple long, thin hog-backed lenses arranged in parallel. Each hog-backed lens lies in the longitudinal direction on the surface of the screen. When a viewer sees the left and right video frames through the lenticular lens, only the viewer's left eye perceives light from the display areas of the left video frame, and only the viewer's right eye perceives light from the display areas of the right video frame. This is how the viewer sees a 3D video image from the parallax between the video images respectively perceived by the left and right eyes.
Note that according to this method, another optical component having similar functions, such as a liquid crystal device, may be used instead of the lenticular lens. Alternatively, for example, a longitudinal polarization filter may be provided in the display areas of the left video frame, and a lateral polarization filter may be provided in the display areas of the right video frame. In this case, the viewer sees the display through polarization glasses. Herein, for the polarization glasses, a longitudinal polarization filter is provided for the left lens, and a lateral polarization filter is provided for the right lens. Consequently, the right and left video images are respectively perceived only by the corresponding eyes, thereby allowing the viewer to recognize a stereoscopic video image. [0131] A playback method for stereoscopic video with use of parallax images has already been technically established and is in general use for attractions in amusement parks and the like. Accordingly, among playback methods for stereoscopic video, this method is considered to be the closest to practical household use. Thus, in the embodiments of the present invention described in the following, the alternate-frame sequencing method or the method using polarization glasses is assumed to be used. However, as playback methods for stereoscopic video, various other methods such as a two-color separation method have been proposed. Any of these various methods is applicable to the present invention, as is the case with the two methods described above, as long as parallax images are used. [0132] <> [0136] FIG. 25 is a schematic diagram showing relations among an index table 310, a movie object MVO, a BD-J object BDJO, and playlist files 2501 and 2502. In the BD-ROM disc 101 that stores therein 3D video images, the PLAYLIST directory includes the 3D playlist file 2502 in addition to the 2D playlist file 2501.
As is the case with the playlist file 204A, the 2D playlist file 2501 specifies a playback path of 2D video images. For example, when a title 1 is selected by a user operation, the movie object MVO associated with the item "title 1" of the index table 310 is executed. Herein, the movie object MVO is a program for playlist playback that uses one of the 2D playlist file 2501 and the 3D playlist file 2502. The playback device 102, in accordance with the movie object MVO, first judges whether or not the playback device 102 supports 3D video playback, and if judging affirmatively, further judges whether or not the user has selected 3D video playback. The playback device 102 then selects, in accordance with the results of these judgments, one of the 2D playlist file 2501 and the 3D playlist file 2502 as the playlist file to be played back. [0137] FIG. 26 is a flowchart showing selection processing of a playlist file to be played back, the selection processing being executed in accordance with the movie object MVO. [0138] In S2601, the playback device 102 checks the value of SPRM(24). If the value is 0, the process advances to S2605. If the value is 1, the process advances to S2602. [0139] In S2602, the playback device 102 causes the display device 103 to display a menu and makes the user select between 2D video playback and 3D video playback. If the user selects 2D video playback with an operation of a remote control or the like, the process advances to S2605. On the other hand, if the user selects 3D video playback, the process advances to S2603. [0140] In S2603, the playback device 102 checks whether the display device 103 supports 3D video playback. For example, if the playback device 102 is connected with the display device 103 in the HDMI format, the playback device 102 exchanges CEC messages with the display device 103 and asks the display device 103 whether the display device 103 supports 3D video playback.
If the display device 103 does not support 3D video playback, the process advances to S2605. On the other hand, if the display device 103 supports 3D video playback, the process advances to S2604. [0141] In S2604, the playback device 102 selects the 3D playlist file 2502 as the playback target. [0142] In S2605, the playback device 102 selects the 2D playlist file 2501 as the playback target. Note that in this case, the playback device 102 may cause the display device 103 to display the reason why 3D video playback was not selected. [0143] <> [0144] FIG. 27 is a schematic diagram showing an example structure of the 2D playlist file 2501 and the 3D playlist file 2502. A first AV stream file group 2701 is composed of AV stream files LCL_AV#1-3, each storing a video stream of 2D video images, and is independently used for 2D video playback. The video streams of the AV stream files LCL_AV#1-3 are further used as left-view streams in 3D video playback. Hereinafter, such an AV stream file is referred to as a "2D/left-view AV stream file", and the video stream included therein is referred to as a "2D/left-view stream". On the other hand, a second AV stream file group 2702 is composed of AV stream files RCL_AV#1-3, and is used in combination with the first AV stream file group 2701 for 3D video playback. Hereinafter, such an AV stream file is referred to as a "right-view AV stream file", and the video stream included therein is referred to as a "right-view stream". A main path 2501M of the 2D playlist file 2501 and a main path 2502M of the 3D playlist file 2502 each include three pieces of playitem information #1-3. Each piece of the playitem information #1-3 specifies a playback section in the first AV stream file group 2701. On the other hand, unlike the 2D playlist file 2501, the 3D playlist file 2502 further includes a subpath 2502S.
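The selection processing of S2601 through S2605 described in [0138]-[0142] amounts to three successive checks, each of which falls back to 2D playback. A minimal sketch in Python follows; the function and parameter names are illustrative assumptions, not part of the specification.

```python
def select_playlist(sprm24, user_wants_3d, display_supports_3d):
    """Mirror of the flowchart S2601-S2605: return the playlist to play."""
    # S2601: SPRM(24) == 0 means the playback device cannot play 3D video.
    if sprm24 == 0:
        return "2D playlist file 2501"   # S2605
    # S2602: the user chooses between 2D and 3D playback from a menu.
    if not user_wants_3d:
        return "2D playlist file 2501"   # S2605
    # S2603: query the display device (e.g., via HDMI CEC messages).
    if not display_supports_3d:
        return "2D playlist file 2501"   # S2605
    return "3D playlist file 2502"       # S2604

print(select_playlist(1, True, True))   # 3D playlist file 2502
print(select_playlist(1, True, False))  # 2D playlist file 2501
```

Only when all three checks succeed is the 3D playlist file selected; any single failure routes to S2605.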
The subpath 2502S includes three pieces of sub-playitem information #1-3, and each piece of the sub-playitem information #1-3 specifies a playback section in the second AV stream file group 2702. The sub-playitem information #1-3 correspond one-to-one with the playitem information #1-3. The length of the playback section specified by each piece of sub-playitem information is equal to the length of the playback section of the corresponding piece of playitem information. The subpath 2502S further includes information 2502T which indicates that the subpath type is "3D". Upon detecting the information 2502T, the 2D/3D playback device synchronizes the playback processing of the subpath 2502S with that of the main path 2502M. As described above, the 2D playlist file 2501 and the 3D playlist file 2502 may share the same 2D/left-view AV stream file group. [0145] Note that the prefix numbers of the 2D playlist file 2501 and the 3D playlist file 2502 (e.g., "XXX" of "XXX.mpls") may be sequentially numbered. In this manner, the 3D playlist file corresponding to the 2D playlist file can be easily identified. [0146] For each piece of playitem information in the 3D playlist file 2502, a stream entry of the 2D/left-view stream and a stream entry of the right-view stream have been added in the stream selection table 1305 shown in FIG. 13. The stream entries 1309 for the 2D/left-view stream and the right-view stream have the same contents, such as the frame rate, the resolution, and the video format. Note that each stream entry 1309 may further have added therein a flag for distinguishing between the 2D/left-view stream and the right-view stream. [0147] In the first embodiment as described above, it is assumed that the 2D playback device plays back 2D video images from the left-view streams. However, the 2D playback device may be designed to play back 2D video images from the right-view streams. The same similarly applies to the description hereinafter. [0148] FIG.
28 is a schematic diagram showing another example structure of the 2D playlist file 2501 and the 3D playlist file 2502. The STREAM directory of the BD-ROM disc 101 may include two or more kinds of right-view AV stream files for each left-view AV stream file group 2701. In this case, the 3D playlist file 2502 may include a plurality of subpaths corresponding one-to-one with the right-view AV stream files. For example, when 3D video images with different degrees of parallax are expressed for the same scene with use of the differences between the shared left video images and the respective right video images, a different right-view AV stream file group is recorded on the BD-ROM disc 101 for each different set of right video images. In this case, subpaths which respectively correspond with the right-view AV stream file groups may be provided in the 3D playlist file 2502 and used according to the desired parallax. In the example of FIG. 28, the viewpoints of the right video exhibited by a first right-view AV stream file group 2801 and a second right-view AV stream file group 2802 are different. Meanwhile, the 3D playlist file 2502 includes two kinds of subpaths 2502S1 and 2502S2. The subpath 2502S1, having a subpath ID of "0", specifies a playback section in the first right-view AV stream file group 2801. The subpath 2502S2, having a subpath ID of "1", specifies a playback section in the second right-view AV stream file group 2802. The 2D/3D playback device selects one of the two kinds of subpaths 2502S1 and 2502S2 in accordance with the size of the screen of the display device 103 or a specification by the user, and synchronizes the playback processing of the selected subpath with the playback processing of the main path 2502M. This allows comfortable stereoscopic video display for the user. [0149] <> [0150] FIGS. 29A and 29B schematically show elementary streams that are multiplexed into a pair of AV stream files and are used for playing back 3D video images. FIG.
29A shows elementary streams multiplexed into a 2D/left-view AV stream file 2901. The elementary streams are the same as the streams multiplexed into the AV stream file for 2D video images in FIG. 4. The 2D playback device plays back a primary video stream 2911 as 2D video images, while the 2D/3D playback device plays back the primary video stream 2911 as the left video at the time of providing 3D playback. That is, the primary video stream 2911 is a 2D/left-view stream. FIG. 29B shows an elementary stream multiplexed into a right-view AV stream file 2902. The right-view AV stream file 2902 stores therein a right-view stream 2921. The 2D/3D playback device plays back the right-view stream 2921 as the right video at the time of providing 3D playback. To the right-view stream 2921, a PID of 0x1012 is allocated, which is different from the PID of 0x1011 allocated to the 2D/left-view stream 2911. [0151] FIG. 30A is a schematic diagram showing a compression encoding format for a 2D video stream 3000. As shown in FIG. 30A, frames/fields of the 2D video stream 3000 are compressed into a picture 3001, a picture 3002, and so on using an inter-picture predictive encoding format. The encoding format exploits redundancy in the time direction of the 2D video stream 3000 (i.e., similarities between previous and/or subsequent pictures whose display orders are serial). Specifically, the top picture is first compressed into an I0 picture 3001 with use of intra-picture encoding. Here, the numbers shown as subscripts are serial numbers in FIG. 30A and FIG. 30B. Next, as shown by the arrows in FIG. 30A, the fourth picture refers to the I0 picture 3001 and is compressed into a P3 picture 3004. Next, the second and third pictures are compressed into a B1 picture and a B2 picture respectively, with reference to the I0 picture 3001 and the P3 picture 3004. [0152] FIG. 30B is a schematic diagram showing a compression encoding format for 3D video streams 3010 and 3020. As shown in FIG.
30B, a left-view stream 3010 is compressed using the inter-picture predictive encoding format that uses the redundancy in the time direction, as with the 2D video stream 3000. When a right-view stream 3020 is compressed using the inter-picture predictive encoding format, on the other hand, the redundancy between the left and right viewpoints is used in addition to the redundancy in the time direction. That is, as shown by the arrows in FIG. 30B, each picture of the right-view stream 3020 is compressed with reference to a picture having the same display time, or a picture having a close display time, in the 2D/left-view stream 3010, as well as a previous picture and/or a subsequent picture in the right-view stream 3020. For example, the top picture in the right-view stream 3020 is compressed into a P0 picture 3021 with reference to an I0 picture 3011 in the 2D/left-view stream 3010. The fourth picture is compressed into a P3 picture 3024 with reference to the P0 picture 3021 and a P3 picture 3014 in the 2D/left-view stream 3010. Furthermore, the second and third pictures are respectively compressed into a B1 picture and a B2 picture with reference to a Br1 picture 3012 and a Br2 picture in the 2D/left-view stream 3010, in addition to the P0 picture 3021 and the P3 picture 3024, respectively. Thus, pictures of the right-view stream 3020 are compressed with reference to the 2D/left-view stream 3010. Accordingly, the right-view stream 3020 cannot be decoded alone, unlike the 2D/left-view stream 3010. However, since there is a strong correlation between the right video and the left video, the data amount of the right-view stream 3020 is drastically smaller than the data amount of the 2D/left-view stream 3010, thanks to the inter-picture predictive encoding format that uses the redundancy between the right and left viewpoints.
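The prediction structure of FIG. 30B can be modelled as a small dependency table, which also makes clear why only one of the two streams can be decoded on its own. The following Python sketch is illustrative only; the picture names follow the figure, and the helper function is an assumption introduced here for demonstration.

```python
# Dependency table for the prediction structure of FIG. 30B.
# Keys are (stream, picture); values list the pictures each one references.
# "L" is the 2D/left-view stream 3010, "R" is the right-view stream 3020.
refs = {
    ("L", "I0"):  [],
    ("L", "Br1"): [("L", "I0"), ("L", "P3")],
    ("L", "Br2"): [("L", "I0"), ("L", "P3")],
    ("L", "P3"):  [("L", "I0")],
    ("R", "P0"):  [("L", "I0")],                              # inter-view
    ("R", "B1"):  [("R", "P0"), ("R", "P3"), ("L", "Br1")],   # inter-view
    ("R", "B2"):  [("R", "P0"), ("R", "P3"), ("L", "Br2")],   # inter-view
    ("R", "P3"):  [("R", "P0"), ("L", "P3")],                 # inter-view
}

def decodable_alone(stream):
    """True if every picture of the stream references only its own stream."""
    return all(ref_stream == stream
               for (s, _), deps in refs.items() if s == stream
               for (ref_stream, _) in deps)

print(decodable_alone("L"))  # True  -> base-view stream
print(decodable_alone("R"))  # False -> dependent-view stream
```

Every picture of the left-view stream references only its own stream, whereas every picture of the right-view stream has at least one inter-view reference, which is exactly the base-view/dependent-view distinction introduced next.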
Hereinafter, a video stream that can be decoded alone like the 2D/left-view stream 3010 is referred to as a "base-view stream", and a video stream that needs to be decoded with use of the base-view stream is referred to as a "dependent-view stream". [0153] Note that the right-view stream may be compressed into the base-view stream. Furthermore, in that case, the left-view stream may be compressed into the dependent-view stream with use of the right-view stream. In either case, the base-view stream is used as the 2D video stream in the 2D playback device. Also, the frame rate of the 2D/left-view stream is the frame rate at which the 2D/left-view stream is decoded alone by the 2D playback device. The frame rate is recorded in a GOP header of the 2D/left-view stream. [0154] FIG. 31A shows an example of the relation between the PTSs and the DTSs allocated to pictures of the 2D/left-view stream 3101, and FIG. 31B shows an example of the relation between the PTSs and the DTSs allocated to pictures of the right-view stream 3102. In both of the video streams 3101 and 3102, the DTSs of the pictures alternate with one another on the STC. This can be realized by delaying, with respect to the DTSs of the pictures of the 2D/left-view stream 3101, the DTSs of the pictures of the right-view stream 3102 that refer to the corresponding pictures of the 2D/left-view stream 3101 in the inter-picture predictive encoding format shown in FIG. 30B. The interval TD of the delay (i.e., the interval between each picture of the 2D/left-view stream 3101 and the picture of the right-view stream 3102 that immediately succeeds it) is referred to as a 3D display delay. The 3D display delay TD is set to a value corresponding to the interval between previous and subsequent pictures of the 2D/left-view stream 3101 (i.e., a value half a frame period or half a field period TFr). Similarly, in both of the video streams 3101 and 3102, the PTSs of the pictures alternate with one another on the STC.
That is, the interval TD between a PTS of each picture of the 2D/left-view stream 3101 and the PTS of the picture of the right-view stream 3102 that immediately succeeds it is set to a value corresponding to the interval between pictures of the 2D/left-view stream 3101 (i.e., a value half a frame period or half a field period TFr). [0155] FIG. 32 is a schematic diagram showing the data structure of a video access unit 3200 of each of the 2D/left-view stream and the right-view stream. As shown in FIG. 32, each video access unit 3200 is provided with decode switch information 3201. A 3D video decoder 4115 (described later) performs decoding processing of the 2D/left-view stream and decoding processing of the right-view stream for each video access unit, switching therebetween. At that time, the 3D video decoder 4115 specifies the subsequent video access unit to be decoded at the time shown by the DTS provided to each video access unit. However, many video decoders generally ignore the DTSs and keep on decoding the video access units. For such video decoders, it is favorable that each video access unit of the video stream has information for specifying the subsequent video access unit to be decoded, in addition to a DTS. The decode switch information 3201 is information for realizing the switching processing of the video access units to be decoded by the 3D video decoder 4115. [0156] As shown in FIG. 32, the decode switch information 3201 is stored in an expansion area (an SEI message or the like when MPEG-4 AVC is used) in each of the video access units. The decode switch information 3201 includes a subsequent access unit type 3202, a subsequent access unit size 3203, and a decode counter 3204. [0157] The subsequent access unit type 3202 is information indicating to which of the 2D/left-view stream and the right-view stream the subsequent video access unit to be decoded belongs.
For example, when the value shown by the subsequent access unit type 3202 is "1", it indicates that the subsequent video access unit belongs to the 2D/left-view stream. When the value shown by the subsequent access unit type 3202 is "2", it indicates that the subsequent video access unit belongs to the right-view stream. When the value shown by the subsequent access unit type 3202 is "0", it indicates that the subsequent video access unit is at the end of the stream. [0158] The subsequent access unit size 3203 is information indicating the size of the subsequent video access unit that is to be decoded. If the subsequent access unit size 3203 is unavailable in a video access unit, it is necessary to analyze the structure of the access unit in order to specify its size when the video access unit to be decoded is extracted from a buffer. By adding the subsequent access unit size 3203 to the decode switch information 3201, the 3D video decoder 4115 can specify the size of the access unit without analyzing the structure of the video access unit. Accordingly, the 3D video decoder 4115 can easily perform the processing of extracting video access units from the buffer. [0159] The decode counter 3204 shows the decoding order of the video access units in the 2D/left-view stream, starting with an access unit including an I picture. FIG. 33A and FIG. 33B schematically show values each of which is shown by the decode counter 3204 and is allocated to a picture of the 2D/left-view stream 3301 or a picture of the right-view stream 3302. As shown in FIGS. 33A and 33B, there are two manners of allocating the values. [0160] In FIG.
33A, "1" is allocated to an I picture 3311 of a 2D/left-view stream 3301 as the value 3204A shown by the decode counter 3204, "2" is allocated to a P picture 3321 of a right-view stream 3302 to be subsequently decoded as the value 3204B shown by the decode counter 3204, and "3" is allocated to a P picture 3312 of the 2D/left-view stream 3301 to be decoded after that as the value 3204A shown by the decode counter 3204. Thus, the values 3204A and 3204B shown by the decode counter 3204 that are allocated to the video access units of the 2D/left-view stream 3301 and the right-view stream 3302 are alternately incremented. By allocating the values 3204A and 3204B shown by the decode counter 3204 in such a manner, the 3D video decoder 4115 can immediately specify, with use of the values 3204A and 3204B, a missing picture (video access unit) that the 3D video decoder 4115 fails to read due to some error. Accordingly, the 3D video decoder 4115 can appropriately and promptly perform error handling. [0161] In FIG. 33A, for example, suppose that the 3D video decoder 4115 fails to read the third video access unit of the 2D/left-view stream 3301 due to an error, so that a Br picture 3313 is missing. With the Br picture 3313 missing, the Br picture 3313 cannot be referred to during the decoding processing of the third video access unit (B picture 3323) of the right-view stream 3302. Accordingly, the B picture 3323 cannot be decoded properly, and noise is likely to be included in the played-back video. However, if the 3D video decoder 4115 reads and holds therein the value 3204B (shown by the decode counter 3204) of the second video access unit (P picture 3322) of the right-view stream 3302 in the decoding processing of the P picture 3322, the 3D video decoder 4115 can predict the value 3204B (shown by the decode counter 3204) of the video access unit to be subsequently decoded. Specifically, as shown in FIG.
33A, the value 3204B (shown by the decode counter 3204) of the second video access unit (P picture 3322) of the right-view stream 3302 is "4". Accordingly, it is predicted that the value 3204A (shown by the decode counter 3204) of the video access unit to be subsequently read is "5". However, since the video access unit to be subsequently read is actually the fourth video access unit of the 2D/left-view stream 3301, the value 3204A (shown by the decode counter 3204) of that video access unit is "7". In such a manner, the 3D video decoder 4115 can detect that it has failed to read one video access unit. Therefore, the 3D video decoder 4115 can execute error handling of "skipping the decoding processing of the B picture 3323 extracted from the third video access unit of the right-view stream 3302, since the Br picture 3313 to refer to is missing". Thus, the 3D video decoder 4115 checks, for each decoding processing, the value 3204A and the value 3204B shown by the decode counter 3204. Consequently, the 3D video decoder 4115 can promptly detect a read error of a video access unit, and can promptly execute appropriate error handling. [0162] As shown in FIG. 33B, the value 3204C and the value 3204D (shown by the decode counter 3204) of the video stream 3301 and the video stream 3302, respectively, may be incremented separately. In this case, at a time point where the 3D video decoder 4115 decodes a video access unit of the 2D/left-view stream 3301, the 3D video decoder 4115 can predict that "the value 3204C shown by the decode counter 3204 is equal to the value 3204D (shown by the decode counter 3204) of the video access unit of the right-view stream 3302 to be subsequently decoded".
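The counter-based error detection described above for FIG. 33A can be sketched as follows. This is an illustrative Python sketch, not part of the specification; the function name and the (stream, counter) representation of a video access unit are assumptions. In the alternating scheme, counter values increment by one across the two streams, so any gap between the held value and the next value read reveals a missing video access unit.

```python
# Illustrative sketch of the FIG. 33A alternating decode-counter check
# (hypothetical helper; "L"/"R" stand for the 2D/left-view and right-view
# streams). Each decoded unit carries (stream, counter); counters are
# alternately incremented, so a gap indicates a read error.

def detect_missing_units(decoded_units):
    """Yield the decode counter values skipped between consecutive units."""
    expected = None
    for stream, counter in decoded_units:
        if expected is not None and counter != expected:
            # One or more video access units failed to be read.
            for missing in range(expected, counter):
                yield missing
        expected = counter + 1

# Example mirroring [0161]: the left-stream unit with counter "5"
# (the Br picture) is lost, so the gap is detected immediately.
units = [("L", 1), ("R", 2), ("L", 3), ("R", 4), ("R", 6), ("L", 7)]
print(list(detect_missing_units(units)))  # -> [5]
```

The same prediction idea applies to the separately incremented counters of FIG. 33B; only the expected-value rule changes.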
Meanwhile, at a time point where the 3D video decoder 4115 decodes a video access unit of the right-view stream 3302, the 3D video decoder 4115 can predict that "a value obtained by incrementing, by one, the value 3204D (shown by the decode counter 3204) of the video access unit is equal to the value 3204C (shown by the decode counter 3204) of the video access unit of the 2D/left-view stream 3301 to be subsequently decoded". Therefore, at any time point, the 3D video decoder 4115 can promptly detect a read error of a video access unit with use of the value 3204C and the value 3204D shown by the decode counter 3204. As a result, the 3D video decoder 4115 can promptly execute appropriate error handling. [0163] ... [0167] The following will describe conditions on the playback time of the video stream contained in each extent. FIGS. 35A and 35B are schematic diagrams showing the relationship between playback times and playback paths. Assume that an extent 3501 of the 2D/left-view AV stream file and an extent 3502 of the right-view AV stream file are adjacent to each other as shown in FIG. 35A, and that the playback times of the video streams contained in the first extent 3501 and the second extent 3502 are four seconds and one second, respectively. Here, the playback path for 3D video images alternately proceeds through the extent 3501 and the extent 3502 of the respective files by portions having the same playback time (e.g., one second), as shown by an arrow 3510 in FIG. 35A. Accordingly, when extents of the files have different playback times of video streams, a jump occurs between the extents 3501 and 3502 as shown by dashed lines in FIG. 35A. In contrast, in the first embodiment as shown in FIG. 35B, an extent of the 2D/left-view AV stream file and an extent of the right-view AV stream file adjacent to each other on the BD-ROM disc 101 contain portions of the 2D/left-view stream and the right-view stream; the portions are to be played back with the same timing.
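The contrast between FIGS. 35A and 35B can be sketched as a simple check that each pair of adjacent extents carries equal playback times, the condition under which the 3D playback path can proceed without jumps. This is an illustrative Python sketch under assumed names; the duration lists are hypothetical representations of the extents of the two files.

```python
# Illustrative sketch (assumed helper, not from the specification):
# paired extents of the 2D/left-view and right-view AV stream files
# must have equal playback times for jump-free sequential reading.

def sequential_read_possible(left_durations, right_durations):
    """True when each adjacent extent pair has the same playback time."""
    return (len(left_durations) == len(right_durations)
            and all(l == r for l, r in zip(left_durations, right_durations)))

print(sequential_read_possible([1.0, 0.7], [1.0, 0.7]))  # FIG. 35B -> True
print(sequential_read_possible([4.0], [1.0]))            # FIG. 35A -> False
```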
In particular, the portions contained in the extents have the same playback time. For example, the top extent 3501A of the 2D/left-view AV stream file and the top extent 3502A of the right-view AV stream file have the same playback time equal to one second, and the second extent 3501B and the second extent 3502B thereof have the same playback time equal to 0.7 seconds. Thus, in the recording areas storing the 2D/left-view AV stream file and the right-view AV stream file, extents having the same playback time are always adjacent to each other. As a result, the playback path can be designed to run through the extents 3501A, 3502A, 3501B, 3502B, ..., sequentially, starting from the top extent, as shown by arrows 3520 in FIG. 35B. Accordingly, the 2D/3D playback device can continuously read the AV stream files without causing a jump when playing back 3D video images. This enables seamless playback to be reliably performed. [0168] [0169] The top portion of every extent in the recording area for storing an AV stream file contains an I picture of the 2D/left-view stream or a P picture of the right-view stream that has been compressed with reference to the I picture, as shown in FIG. 30B. This allows the size of each extent to be determined by using the entry points in the clip information file. Accordingly, a playback device can simplify the process of alternately reading extents of the 2D/left-view AV stream file and the right-view AV stream file from the BD-ROM disc 101. [0170] [0171] The following will describe conditions for the lower limit of the size of each extent and the upper limit of the interval between extents. As described above, causing the 2D playback device to seamlessly play back 2D video images from an AV stream file requires the size of each extent of the AV stream file to be equal to or larger than the minimum extent size and, in addition, the interval between the extents to be smaller than the maximum jump distance Sjump_max.
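The two conditions for seamless 2D playback stated above can be sketched as follows. This is an illustrative Python sketch, not part of the specification; the function name, the (size, gap) extent representation, and the fixed `min_extent_size` parameter are assumptions (the specification derives the minimum extent size from the distance to the next extent of the same file).

```python
# Illustrative sketch of the two seamless-2D-playback conditions:
# every extent is at least the minimum extent size, and every gap to
# the next extent of the same file does not exceed Sjump_max.

def seamless_2d_ok(extents, min_extent_size, sjump_max):
    """extents: list of (extent_size, gap_to_next_extent_of_same_file)."""
    return all(size >= min_extent_size and gap <= sjump_max
               for size, gap in extents)

# Both extents are large enough, and both gaps are within the jump limit.
print(seamless_2d_ok([(6000, 1500), (8000, 900)],
                     min_extent_size=4096, sjump_max=2000))  # -> True
```

In practice the per-extent minimum would itself depend on the following gap; the flat threshold here is a simplification for illustration.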
Accordingly, the size of each extent of the 2D/left-view AV stream file needs to be set at a value equal to or larger than the minimum extent size calculated based on the distance to the next extent in the same file. In addition, the interval between the extents needs to be set at a value not exceeding the maximum jump distance Sjump_max. This allows the 2D playback device to seamlessly play back 2D video images from the 2D/left-view AV stream file. [0172] Further conditions are required for an interleaved arrangement of extents of the 2D/left-view AV stream

WE CLAIM: 1. A recording medium comprising a base-view stream file and a dependent-view stream file recorded thereon, the base-view stream file used for monoscopic video playback, the dependent-view stream file used for stereoscopic video playback in combination with the base-view stream file, the recording medium having a contiguous area in which a plurality of extents belonging to the base-view stream file and a plurality of extents belonging to the dependent-view stream file are arranged in an interleaved manner, starting from an extent belonging to the dependent-view stream file. 2. The recording medium of Claim 1, wherein pictures stored in the dependent-view stream file are compressed based on inter-frame correlations with pictures stored in the base-view stream file. 3. The recording medium of Claim 1, wherein a transfer rate of packets stored in the dependent-view stream file to a decoder is lower than a transfer rate of packets stored in the base-view stream file to a decoder. 4.
A playback device for playing back video images from a recording medium, the playback device comprising: a reading unit operable to read a base-view stream file and a dependent-view stream file extent by extent, the base-view stream file to be used for monoscopic video playback, the dependent-view stream file to be used for stereoscopic video playback in combination with the base-view stream file; a read buffer unit operable to store extents read by the reading unit; and a decoder unit operable to receive compressed pictures contained in the extents from the read buffer unit and then decode the compressed pictures, the recording medium having a contiguous area in which a plurality of extents belonging to the base-view stream file and a plurality of extents belonging to the dependent-view stream file are arranged in an interleaved manner, starting from an extent belonging to the dependent-view stream file. 5. A playback device for playing back video images from a transport stream having a base-view video stream and a dependent-view video stream multiplexed therein, the base-view video stream to be used for monoscopic video playback, the dependent-view video stream to be used for stereoscopic video playback in combination with the base-view video stream, the playback device comprising: an extracting unit operable to extract the base-view video stream and the dependent-view video stream from the transport stream; a first buffer unit operable to store the extracted base-view video stream therein; a second buffer unit operable to store the extracted dependent-view video stream therein; a decoder unit operable to receive the base-view video stream and the dependent-view video stream from the buffer units and decode the base-view video stream and the dependent-view video stream; and a switching unit operable to switch between the first and second buffer units on a picture-by-picture basis so that the base-view video stream and the dependent-view video stream are
alternately supplied to the decoder unit, wherein the timing of the operation of the switching unit is determined by using decoding time stamps attached to pictures in the base-view video stream and the dependent-view video stream. 6. An integrated circuit to be mounted on a playback device for playing back video images by using a reading unit that reads a stream file from a recording medium extent by extent, the integrated circuit comprising: a decoder unit operable to receive compressed pictures contained in extents read from the recording medium and then decode the compressed pictures; and a control unit operable to control the supply of the compressed pictures to the decoder unit, wherein the recording medium has a contiguous area in which a plurality of extents belonging to a base-view stream file and a plurality of extents belonging to a dependent-view stream file are arranged in an interleaved manner, starting from an extent belonging to the dependent-view stream file, the base-view stream file to be used for monoscopic video playback, the dependent-view stream file to be used for stereoscopic video playback in combination with the base-view stream file. 7.
An integrated circuit to be mounted on a playback device for playing back video images from a transport stream having a base-view video stream and a dependent-view video stream multiplexed therein, the base-view video stream to be used for monoscopic video playback, the dependent-view video stream to be used for stereoscopic video playback in combination with the base-view video stream, the integrated circuit comprising: an extracting unit operable to extract the base-view video stream and the dependent-view video stream from the transport stream; a decoder unit operable to decode the base-view video stream and the dependent-view video stream; a switching unit operable to switch between first and second buffer units on a picture-by-picture basis so that the base-view video stream and the dependent-view video stream are alternately supplied to the decoder unit; and a control unit operable to control the switching unit, wherein the timing of the operation of the switching unit is determined by using decoding time stamps attached to pictures in the base-view video stream and the dependent-view video stream.

Documents

Application Documents

# Name Date
1 3464-KOLNP-2010-AbandonedLetter.pdf 2018-02-16
2 3464-KOLNP-2010-FER.pdf 2017-08-09
3 3464-KOLNP-2010-(15-03-2016)-ASSIGNMENT.pdf 2016-03-15
4 3464-KOLNP-2010-(15-03-2016)-CORRESPONDENCE.pdf 2016-03-15
5 3464-KOLNP-2010-(15-03-2016)-FORM-1.pdf 2016-03-15
6 3464-KOLNP-2010-(15-03-2016)-FORM-2.pdf 2016-03-15
7 3464-KOLNP-2010-(15-03-2016)-FORM-3.pdf 2016-03-15
8 3464-KOLNP-2010-(15-03-2016)-FORM-5.pdf 2016-03-15
9 3464-KOLNP-2010-(15-03-2016)-FORM-6.pdf 2016-03-15
10 3464-KOLNP-2010-(15-03-2016)-PA.pdf 2016-03-15
11 3464-KOLNP-2010-(31-12-2015)-ANNEXURE TO FORM 3.pdf 2015-12-31
12 3464-KOLNP-2010-(31-12-2015)-CORRESPONDENCE.pdf 2015-12-31
13 3464-KOLNP-2010-(13-05-2014)-CORRESPONDENCE.pdf 2014-05-13
14 3464-KOLNP-2010-(13-05-2014)-FORM-3.pdf 2014-05-13
15 3464-KOLNP-2010-(20-01-2014)-ANNEXURE TO FORM 3.pdf 2014-01-20
16 3464-KOLNP-2010-(20-01-2014)-CORRESPONDENCE.pdf 2014-01-20
17 3464-KOLNP-2010-CORRESPONDENCE-1.1.pdf 2011-10-07
18 3464-KOLNP-2010-FORM 18.pdf 2011-10-07
19 3464-KOLNP-2010-FORM 3-1.1.pdf 2011-10-07
20 3464-kolnp-2010-abstract.pdf 2011-10-07
21 3464-kolnp-2010-claims.pdf 2011-10-07
22 3464-kolnp-2010-correspondence.pdf 2011-10-07
23 3464-kolnp-2010-description (complete).pdf 2011-10-07
24 3464-kolnp-2010-drawings.pdf 2011-10-07
25 3464-kolnp-2010-form-1.pdf 2011-10-07
26 3464-kolnp-2010-form-2.pdf 2011-10-07
27 3464-kolnp-2010-form-3.pdf 2011-10-07
28 3464-kolnp-2010-form-5.pdf 2011-10-07
29 3464-kolnp-2010-gpa.pdf 2011-10-07
30 3464-kolnp-2010-specification.pdf 2011-10-07
31 abstract-3464-kolnp-2010.jpg 2011-10-07

Search Strategy

1 SearchStrategy_23-06-2017.pdf