
Sample Array Coding For Low Delay

Abstract: The entropy coding of a current part of a predetermined entropy slice is based not only on the respective probability estimations of the predetermined entropy slice as adapted using the previously coded part of the predetermined entropy slice, but also on probability estimations as used in the entropy coding of a spatially neighboring, in entropy slice order preceding entropy slice at a neighboring part thereof. Thereby, the probability estimations used in entropy coding are adapted to the actual symbol statistics more closely, thus lowering the coding efficiency decrease normally caused by lower-delay concepts. Temporal interrelationships are exploited additionally or alternatively.


Patent Information

Application #
Filing Date
23 December 2013
Publication Number
15/2014
Publication Type
INA
Invention Field
COMMUNICATION
Status
Parent Application
Patent Number
Legal Status
Grant Date
2019-06-06
Renewal Date

Applicants

FRAUNHOFER GESELLSCHAFT ZUR FOERDERUNG DER ANGEWANDTEN FORSCHUNG E.V.
Hansastraße 27c 80686 München

Inventors

1. GEORGE Valeri
John Sieg Str. 24 10365 Berlin
2. HENKEL Anastasia
Fahrenheitstrasse 8 10245 Berlin
3. KIRCHHOFFER Heiner
Gotzkowskystr. 5 10555 Berlin
4. MARPE Detlev
Südwestkorso 70 12161 Berlin
5. SCHIERL Thomas
Dunckerstrasse 72 10437 Berlin

Specification

Sample Array Coding for Low-Delay

Description

The present application is concerned with sample array coding such as picture or video coding. Parallelization of encoder and decoder is very important due to the increased processing requirements of the HEVC standard as well as the expected increase of video resolution. Multi-core architectures are becoming available in a wide range of modern electronic devices. Consequently, efficient methods to enable the use of multi-core architectures are required. Encoding or decoding of LCUs occurs in raster scan, by which the CABAC probability is adapted to the specificities of each image. Spatial dependencies exist between adjacent LCUs. Each LCU (largest coding unit) depends on its left, above, above-left and above-right neighbor LCUs, because of different components, for instance motion-vector prediction, intra-prediction and others. In order to enable parallelization in decoding, these dependencies typically need to be interrupted, or are interrupted, in state-of-the-art applications. Some concepts of parallelization, namely wavefront processing, have been proposed. The motivation for further study is to find techniques which lower the coding efficiency loss and thus reduce the burden on the bitstream for parallelization approaches in encoder and decoder. Furthermore, low-delay processing was not possible with the available techniques. Thus, it is the object of the present invention to provide a coding concept for sample arrays allowing lower delay at comparatively smaller penalties in coding efficiency. This object is achieved by the subject matter of the enclosed independent claims. If the entropy coding of a current part of a predetermined entropy slice is based not only on the respective probability estimations of the predetermined entropy slice as adapted using the previously coded part of the predetermined entropy slice, but also on probability estimations as used in the entropy coding of a spatially neighboring, in entropy slice order preceding entropy slice at a neighboring part thereof, the probability estimations used in entropy coding are adapted to the actual symbol statistics more closely, thereby lowering the coding efficiency decrease normally caused by lower-delay concepts. Temporal interrelationships may be exploited additionally or alternatively. For example, the dependency on probability estimations as used in the entropy coding of a spatially neighboring, in entropy slice order preceding entropy slice may involve the initialization of the probability estimations at the beginning of entropy coding the predetermined entropy slice. Usually, probability estimations are initialized to values adapted to the symbol statistics of a representative blend of sample array material. In order to avoid the transmission of the initialization values of the probability estimations, they are known to encoder and decoder by convention. However, such pre-defined initialization values are, naturally, merely a compromise between side information bitrate on the one hand and coding efficiency on the other hand, as such initialization values naturally - more or less - deviate from the actual sample statistics of the currently coded sample array material. The probability adaptation during the course of coding an entropy slice adapts the probability estimations to the actual symbol statistics.
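To make the two options concrete, the following Python sketch contrasts the convention-based initialization with the adoption of an already-adapted state from a neighboring entropy slice. It is merely illustrative; the context names and default values are hypothetical assumptions and are not taken from any codec specification.

    # Convention-based compromise values, known to encoder and decoder
    # (hypothetical context names and values, for illustration only).
    DEFAULT_INIT = {"split_flag": 0.5, "coeff_sign": 0.5}

    def init_probabilities(neighbor_state=None):
        # Initialize the probability estimations of an entropy slice.
        # If an already-adapted state of a spatially neighboring, preceding
        # entropy slice is available, adopt it; otherwise fall back to the
        # pre-defined compromise values.
        if neighbor_state is not None:
            return dict(neighbor_state)  # copy the adapted estimations
        return dict(DEFAULT_INIT)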
This adaptation process is accelerated by initializing the probability estimations at the beginning of entropy coding the current/predetermined entropy slice using already adapted probability estimations of the just-mentioned spatially neighboring, in entropy slice order preceding entropy slice, as the latter values have already been, to some extent, adapted to the actual symbol statistics of the sample array currently at hand. Low-delay coding may nevertheless be enabled by using, in initializing the probability estimations for the predetermined/current entropy slice, the probability estimations used at the neighboring part thereof, rather than the probability estimations manifesting themselves at the end of entropy coding the preceding entropy slice. By this measure, wavefront processing is still possible. Further, the above-mentioned dependency on the probability estimations as used in the entropy coding of the spatially neighboring, in entropy slice order preceding entropy slice may involve the adaptation process of adapting the probability estimations used in entropy coding the current/predetermined entropy slice itself. Probability estimation adaptation involves the use of the just-coded part, i.e. the just-coded symbol(s), in order to adapt the current state of the probability estimations to the actual symbol statistics. By this measure, initialized probability estimations are adapted at some adaptation rate to the actual symbol statistics. This adaptation rate is increased by performing the just-mentioned probability estimation adaptation not only based on the currently coded symbol of the current/predetermined entropy slice, but also dependent on probability estimations as manifesting themselves at a neighboring part of the spatially neighboring, in entropy slice order preceding entropy slice. Again, by choosing the spatial neighborhood of the current part of the current entropy slice and the neighboring part of the preceding entropy slice appropriately, wavefront processing is still possible. The benefit of coupling the own probability estimation adaptation along the current entropy slice with the probability adaptation of the preceding entropy slice is the increased rate at which the adaptation to the actual symbol statistics takes place, as the symbols traversed in both the current and the previous entropy slices contribute to the adaptation, rather than merely the symbols of the current entropy slice. Advantageous implementations of embodiments of the present invention are the subject of the dependent claims. Further, preferred embodiments are described with respect to the figures, among which: Fig. 1 shows a block diagram of an exemplary encoder; Fig. 2 shows a schematic diagram of a partitioning of a picture into slices and slice parts (i.e. blocks or coding units) along with coding orders defined thereamong; Fig. 3 shows a flow diagram of the functionality of an exemplary encoder such as the one of Fig. 1; Fig. 4 shows a schematic diagram for explaining the functionality of an exemplary encoder such as the one of Fig. 1; Fig. 5 shows a schematic diagram for a parallel operational implementation of encoder and decoder; Fig. 6 shows a block diagram of an exemplary decoder; Fig. 7 shows a flow diagram of the functionality of an exemplary decoder such as the one of Fig. 6; Fig. 8 shows a schematic diagram of an exemplary bitstream resulting from the coding scheme of Figs. 1 to 6; Fig. 9 schematically shows an example of how to compute probabilities with the help of other LCUs;
Fig. 10 shows a graph illustrating the RD results for Intra (4 Threads), in comparison with HM3.0; Fig. 11 shows a graph illustrating the RD results for Low Delay (1 Thread), in comparison with HM3.0; Fig. 12 shows a graph illustrating the RD results for Random Access (1 Thread), in comparison with HM3.0; Fig. 13 shows a graph illustrating the RD results for Low Delay (4 Threads), in comparison with HM3.0; Fig. 14 schematically and exemplarily illustrates possible compounds of entropy slices; Fig. 15 schematically and exemplarily illustrates a possible signaling of an entropy slice; Fig. 16 schematically and exemplarily illustrates encoding, segmentation, interleaving and decoding of entropy slice data via chunks; Fig. 17 schematically and exemplarily illustrates a possible compound between frames; Fig. 18 schematically and exemplarily illustrates a possible usage of collocated information; Fig. 19 schematically shows the possibility of a wavefront running obliquely within the spatio-temporal space spanned by consecutive sample arrays; and Fig. 20 schematically shows another example for subdividing entropy slices into chunks. In order to ease the understanding of the below-outlined measures for improving the achievement of low delay at smaller penalties in terms of coding efficiency, the encoder of Fig. 1 is firstly described in more general terms without, preliminarily, discussing the advantageous concepts of embodiments of the present application and how same may be built into the embodiment of Fig. 1. It should be mentioned, however, that the structure shown in Fig. 1 merely serves as an illustrative environment in which embodiments of the present application may be used. Generalizations of, and alternatives for, encoders and decoders in accordance with embodiments of the present invention are also briefly discussed. Fig. 1 shows an encoder for encoding a sample array 10 into an entropy encoded data stream 20. As shown in Fig. 1, the sample array 10 may be one of a sequence 30 of sample arrays, and the encoder may be configured to encode the sequence 30 into the data stream 20. The encoder of Fig. 1 is generally indicated by reference sign 40 and comprises a precoder 42 followed by an entropy encoding stage 44, the output of which outputs data stream 20. The precoder 42 is configured to receive and operate on sample array 10 in order to describe the content thereof by way of syntax elements of a predetermined syntax, with each syntax element being of a respective one of a predetermined set of syntax element types which, in turn, are associated with a respective semantics. In describing the sample array 10 using the syntax elements, the precoder 42 may subdivide the sample array 10 into coding units 50. The term "coding unit" may, for reasons set out in more detail below, alternatively be denoted "coding tree unit" (CTU). One possibility of how precoder 42 may subdivide the sample array 10 into coding units 50 is exemplarily shown in Fig. 2. In accordance with this example, the subdivision regularly subdivides the sample array 10 into coding units 50, so that the latter are arranged in rows and columns so as to gaplessly cover the complete sample array 10 without overlap. In other words, precoder 42 may be configured to describe each coding unit 50 by way of syntax elements. Some of these syntax elements may form subdivision information for further subdividing the respective coding unit 50.
For example, by way of multi-tree subdivision, the subdivision information may describe a subdivision of the respective coding unit 50 into prediction blocks 52, with the precoder 42 associating a prediction mode with associated prediction parameters with each of these prediction blocks 52. This prediction subdivision may allow for the prediction blocks 52 being different in size, as illustrated in Fig. 2. The precoder 42 may also associate residual subdivision information with the prediction blocks 52 so as to further subdivide the prediction blocks 52 into residual blocks 54 so as to describe the prediction residual per prediction block 52. Thus, the precoder may be configured to generate a syntax description of sample array 10 in accordance with a hybrid coding scheme. However, as already noted above, the just-mentioned way in which the precoder 42 describes the sample array 10 by way of syntax elements has merely been presented for illustration purposes and may also be implemented differently. The precoder 42 may exploit spatial interrelationships between the content of neighboring coding units 50 of sample array 10. For example, precoder 42 may predict syntax elements for a certain coding unit 50 from syntax elements determined for previously coded coding units 50 which are spatially adjacent to the currently coded coding unit 50. In Figs. 1 and 2, for example, the top and left-hand neighbors serve for prediction, as illustrated by arrows 60 and 62. Moreover, precoder 42 may, in an intra-prediction mode, extrapolate already coded content of neighboring coding units 50 into the current coding unit 50 so as to obtain a prediction of the samples of the current coding unit 50. As shown in Fig. 1, precoder 42 may, beyond exploiting spatial interrelationships, temporally predict samples and/or syntax elements for a current coding unit 50 from previously coded sample arrays, as illustratively shown in Fig. 1 by way of arrow 64. That is, motion-compensated prediction may be used by precoder 42, and the motion vectors themselves may be subject to temporal prediction from motion vectors of previously coded sample arrays. That is, precoder 42 may describe the content of sample array 10 coding unit wise and may, to this end, use spatial prediction. The spatial prediction is restricted for each coding unit 50 to spatially neighboring coding units of the same sample array 10, such that when following a coding order 66 among the coding units 50 of sample array 10, the neighboring coding units serving as prediction reference for the spatial prediction have generally been traversed by the coding order 66 prior to the current coding unit 50. As illustrated in Fig. 2, the coding order 66 defined among the coding units 50 may, for example, be a raster scan order according to which coding units 50 are traversed row by row from top to bottom. Optionally, a subdivision of array 10 into an array of tiles may cause the scan order 66 to traverse - in a raster scan order - the coding units 50 composing one tile first before proceeding to the next in a tile order which, in turn, may also be of a raster scan type. For example, the spatial prediction may involve merely neighboring coding units 50 within a coding unit row above the coding unit row within which the current coding unit 50 resides, and a coding unit within the same coding unit row, but to the left relative to the current coding unit.
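The restricted neighborhood just described can be stated compactly. The following sketch (a hypothetical helper, assuming the raster-scan coding order 66 of Fig. 2 with coding-unit positions given as column/row indices) lists the neighbor positions that may serve as spatial references for a current coding unit:

    def available_neighbors(col, row, num_cols):
        # Under raster-scan order 66, the left neighbor of the same row and
        # the above-left/above/above-right neighbors of the row above have
        # already been traversed and may serve as spatial references.
        candidates = [
            (col - 1, row),      # left (arrow 62)
            (col - 1, row - 1),  # above-left
            (col,     row - 1),  # above (arrow 60)
            (col + 1, row - 1),  # above-right
        ]
        return [(c, r) for (c, r) in candidates if 0 <= c < num_cols and r >= 0]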
As will be explained in more detail below, this restriction on the spatial interrelationship/spatial prediction enables parallel wavefront processing. Precoder 42 forwards the syntax elements to the entropy coding stage 44. As just outlined, some of these syntax elements have been predictively coded, i.e. represent prediction residuals. Precoder 42 may, thus, be regarded as a predictive coder. Beyond that, precoder 42 may be a transform coder configured to transform code the residuals of the prediction of the content of the coding units 50. An exemplary internal structure of the entropy coding stage 44 is also shown in Fig. 1. As shown, the entropy coding stage 44 may comprise, optionally, a symbolizer 70 for converting each of the syntax elements received from precoder 42, the number of possible states of which exceeds the symbol alphabet's cardinality, into a sequence of symbols s_i of the symbol alphabet based on which the entropy coding stage 44 operates. Beyond that optional symbolizer 70, the entropy coding stage 44 may comprise a context selector 72, an initializer 74, a probability estimation manager 76, a probability estimation adaptor 78 and an entropy coding core 80. The entropy coding core's output forms the output of entropy coding stage 44. Besides, the entropy coding core 80 comprises two inputs, namely one for receiving the symbols s_i of the sequence of symbols, and another one for receiving a probability estimation p_i for each of the symbols. Owing to the properties of entropy coding, the coding efficiency in terms of compression rate increases with an improvement of the probability estimation: the better the probability estimation matches the actual symbol statistics, the better the compression rate. In the example of Fig. 1, the context selector 72 is configured to select, for each symbol s_i, a corresponding context c_i among a set of available contexts managed by manager 76. It should be noted, however, that the context selection forms merely an optional feature and may be dispensed with, for example by using the same context for each symbol. However, if using the context selection, context selector 72 may be configured to perform the context selection at least partially based on information concerning coding units outside the current coding unit, namely concerning neighboring coding units within the restricted neighborhood discussed above. Manager 76 comprises a storage which stores, for each available context, an associated probability estimation. For example, the symbol alphabet may be a binary alphabet, so that merely one probability value may have to be stored for each available context. The initializer 74 may, intermittently, initialize or re-initialize the probability estimations stored within manager 76 for the available contexts. Possible time instants at which such initialization may be performed are discussed further below. The adaptor 78 has access to the pairs of symbols s_i and corresponding probability estimations p_i and adapts the probability estimations within manager 76 accordingly. That is, each time a probability estimation is applied by entropy coding core 80 so as to entropy code the respective symbol s_i into data stream 20, adaptor 78 may vary this probability estimation in accordance with the value of this current symbol s_i, so that this probability estimation p_i is better adapted to the actual symbol statistics when encoding the next symbol which is associated with that probability estimation (by way of its context).
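As a minimal sketch of this per-symbol adaptation, assume a binary alphabet and a simple exponential-decay update; an actual CABAC implementation uses finite-state transition tables instead, so the update rule below is an illustrative simplification and all names are hypothetical:

    class ContextModel:
        # One probability estimation p_i = P(bit = 1), as stored by manager 76.
        def __init__(self, p_one=0.5):
            self.p_one = p_one

        def adapt(self, bit, rate=1.0 / 32.0):
            # Adaptor 78: move the estimate towards the symbol just coded, so
            # that p_one approaches the empirical frequency of ones.
            self.p_one += rate * (bit - self.p_one)

    def encode_symbols(symbols, contexts):
        # symbols: iterable of (context_id, bit) pairs; context_id is what
        # context selector 72 would derive from the neighborhood.
        for ctx_id, bit in symbols:
            ctx = contexts[ctx_id]
            # ... entropy coding core 80 would code `bit` using ctx.p_one ...
            ctx.adapt(bit)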
Adaptor 78 thus receives the probability estimation for the selected context from manager 76 along with the corresponding symbol s_i and adapts the probability estimation p_i accordingly, so that, for the next symbol s_i of the same context c_i, the adapted probability estimation is used. The entropy coding core 80 is, for example, configured to operate in accordance with an arithmetic coding scheme or a probability interval partitioning entropy (PIPE) coding scheme. In arithmetic coding, entropy coding core 80 would, for example, continuously update its state when coding the sequence of symbols, with the state being defined by a probability interval defined by a probability interval width value and a probability interval offset value, for example. When operating in accordance with the PIPE concept, the entropy coding core 80 would, for example, subdivide the domain of possible values of the probability estimations into different intervals, performing fixed-probability entropy coding with respect to each of these intervals, thereby obtaining a substream for each of the sub-intervals whose coding efficiency is respectively tailored to the associated probability interval. In case of arithmetic coding, the data stream 20 output would be an arithmetically coded data stream signaling to the decoding side information enabling emulating or redoing the interval subdivision process. Naturally, it would be possible for entropy coding stage 44 to entropy code all information, i.e. all syntax elements/symbols s_i, relating to sample array 10 with the probability estimations being initialized merely once at the beginning thereof and then continuously updated by adaptor 78. This would, however, result in a data stream 20 which would have to be sequentially decoded at the decoding side. In other words, there would not be a possibility for any decoder to subdivide the resulting data stream into several sub-portions and decode the sub-portions in parallel. This would, in turn, hamper any low-delay efforts. Accordingly, as will be outlined in more detail below, it is favorable to subdivide the amount of data describing the sample array 10 into so-called entropy slices. Each of these entropy slices would, accordingly, cover a different set of the syntax elements relating to the sample array 10. If the entropy coding stage 44 were, however, to entropy encode each entropy slice completely independently of the others, by firstly initializing the probability estimations once and then continuously updating the probability estimations for each entropy slice individually, then the coding efficiency would be decreased due to the increased percentage of the data relating to, and describing, the sample array 10 for which the probability estimations used are (yet) less accurately adapted to the actual symbol statistics. In order to overcome the just-mentioned problems in accommodating the wish for low coding delay on the one hand and high coding efficiency on the other hand, the following encoding scheme may be used, which is now described with respect to Fig. 3. Firstly, the data describing the sample array 10 is subdivided into portions called "entropy slices" in the following. The subdivision 80 need not be overlap-free. On the other hand, this subdivision may at least partially correspond to a spatial subdivision of the sample array 10 into different portions.
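A subdivision of the kind meant here can be sketched in a few lines; the helper below (hypothetical, following the example of Fig. 2 in which each entropy slice covers one row of coding units) returns, in entropy slice order 16, the run of coding-unit positions of each slice along its entropy coding path 14:

    def subdivide_into_entropy_slices(num_cols, num_rows):
        # One entropy slice per row of coding units; within a slice, the
        # coding units are listed along the entropy coding path (left to right).
        return [[(col, row) for col in range(num_cols)] for row in range(num_rows)]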
According to the subdivision 80, the syntax elements describing sample array 10 may thus be distributed onto different entropy slices depending on the location of the coding unit 50 to which the respective syntax element relates. See, for example, Fig. 2. Fig. 2 shows an exemplary subdivision of sample array 10 into different portions 12. Each portion corresponds to a respective entropy slice. As exemplarily shown, each portion 12 corresponds to a row of coding units 50. Other subdivisions are, however, feasible as well. However, it is advantageous if the subdivision of the sample array 10 into portions 12 follows the aforementioned coding order 66, so that portions 12 cover consecutive runs of coding units 50 along the coding order 66. Even if so, however, the start and end positions of a portion 12 along coding order 66 need not coincide with the left-hand and right-hand edges of the rows of coding units 50, respectively. Even a coincidence with the borders of coding units 50 immediately following each other in coding order 66 is not mandatory. By subdividing sample array 10 in that way, an entropy slice order 16 is defined among the portions 12, along which the portions 12 follow each other along the coding order 66. Moreover, for each entropy slice, a respective entropy coding path 14 is defined, namely the fragment of coding path 66 running through the respective portion 12. In the example of Fig. 2 where the portions 12 coincide with the rows of coding units 50, the entropy coding paths 14 of the entropy slices point along the row direction, in parallel to each other, i.e. here from the left-hand side to the right-hand side. It should be noted that it would be possible to restrict the spatial predictions performed by precoder 42 and the context derivations performed by context selector 72 so as not to cross slice boundaries, i.e. so that the spatial predictions and context selections do not depend on data corresponding to another entropy slice. This way, the "entropy slices" would correspond to the usual definition of "slices" in H.264, for example, which are completely decodable independently from each other, except for the below-outlined probability initialization/adaptation dependency. However, it would also be feasible to allow spatial predictions and context selections, i.e. generally speaking dependencies, to cross slice boundaries in order to exploit local/spatial inter-dependencies, as wavefront parallel processing (WPP) is still feasible even as far as the reversal of the precoding, i.e. the reconstruction based on the syntax elements, and the entropy context selection are concerned. In so far, the entropy slices would somehow correspond to "dependent slices". Subdivision 80 may, for example, be performed by entropy coding stage 44. The subdivision may be fixed or may vary among the arrays of sequence 30. The subdivision may be fixed per convention or may be signaled within data stream 20. Based on the entropy slices, the actual entropy coding may take place, i.e. step 82. For each entropy slice, the entropy coding may be structured into a starting phase 84 and a continuation phase 86. The starting phase 84 involves, for example, the initialization of the probability estimations as well as the triggering of the actual entropy coding process for the respective entropy slice. The actual entropy coding is then performed during the continuation phase 86. The entropy coding during phase 86 is performed along the respective entropy coding path 14.
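In pseudo-Python, the two phases could look as follows. This is only a structural sketch: init_state stands for whichever of the initialization rules discussed in this text is chosen, and code_cu for the entropy coding of one coding unit returning the adapted probability state:

    def entropy_code_slice(slice_cus, init_state, code_cu):
        # Starting phase 84: initialize the probability estimations and
        # trigger the entropy coding of this entropy slice.
        probs = init_state()
        # Continuation phase 86: code along the entropy coding path 14,
        # adapting the estimations with every coded coding unit.
        for cu in slice_cus:
            probs = code_cu(cu, probs)
        return probs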
The starting phase 84 for each entropy slice is controlled such that the entropy coding of the plurality of entropy slices starts sequentially using the entropy slice order 16. Now, in order to avoid the above-outlined penalty which would result from entropy coding each entropy slice completely independently of the others, the entropy coding process 82 is controlled such that a current part, e.g. a current coding unit, of a current entropy slice is entropy coded based on the respective probability estimations of the current entropy slice as adapted using the previously encoded part of the current entropy slice, i.e. the part of the current entropy slice to the left of the current coding unit 50 in case of Fig. 2, and the probability estimations as used in entropy coding the spatially neighboring, in entropy slice order 16 preceding entropy slice at a neighboring part, i.e. a neighboring coding unit, thereof. In order to describe the aforementioned dependency more clearly, reference is made to Fig. 4. Fig. 4 illustrates the (n-1)-th, n-th and (n+1)-th entropy slices in entropy slice order 16 with reference sign 90. Each entropy slice 90 encompasses the sequence of syntax elements describing the portion 12 of the sample array 10 with which the respective entropy slice 90 is associated. Along the entropy coding path 14, entropy slice 90 is segmented into a sequence of segments 92, each of which corresponds to a respective one of the coding units 50 of the portion 12 to which entropy slice 90 relates. As described above, the probability estimations used in entropy coding the entropy slices 90 are continuously updated during the continuation phase 86 along the entropy coding path 14, so that the probability estimations increasingly better adapt to the actual symbol statistics of the respective entropy slice 90 - i.e. the probability estimations are associated with the respective entropy slice. While the probability estimations 94 used to entropy code the entropy slices 90 during the continuation phase 86 are continuously updated, in Fig. 4 merely the states of the probability estimations 94 occurring at the start and end positions of segments 92 are illustrated and mentioned further below. In particular, the state prior to entropy coding the first segment 92, as initialized during the starting phase 84, is shown at 96, the state manifesting itself after coding the first segment is illustrated at 98, and the state manifesting itself after encoding the first two segments is indicated at 100. The same elements are shown in Fig. 4 also for the (n-1)-th entropy slice in entropy slice order 16, and the following entropy slice, i.e. entropy slice n+1. Now, in order to achieve the above-outlined dependency, the initial state 96 for entropy coding the n-th entropy slice 90 is set dependent on an intermediate state of the probability estimations 94 manifesting itself during encoding the preceding entropy slice n-1. "Intermediate state" shall denote any state of the probability estimations 94, excluding the initial state 96 and the final state manifesting itself after entropy coding the complete entropy slice n-1. By doing so, entropy coding the sequence of entropy slices 90 along the entropy slice order 16 may be parallelized with a degree of parallelization being determined by the ratio of the number of segments 92 preceding the state used for initialization of the probability estimations 94 for entropy coding the next entropy slice, i.e. a, and the number of segments 92 succeeding this state, i.e. b.
In particular, in Fig. 4, a is exemplarily set to be equal to 2, with the initialization, i.e. the adoption of state 100 so as to set state 96 of the current entropy slice equal to state 100 of the preceding entropy slice, being illustrated by arrow 104. By this measure, the entropy coding of any segment 92 following state 100 in entropy coding path order 14 would depend on the probability estimations 94 as adapted during the continuation phase 86 based on the preceding segments of the same entropy slice, as well as on the probability estimations as used in entropy coding the third segment 92 of the preceding entropy slice 90. Accordingly, the entropy coding of the entropy slices 90 could be performed in parallel in a pipelined scheduling. The only restriction imposed onto the time scheduling would be that the entropy coding of some entropy slice may begin merely after finishing the entropy coding of the a-th segment 92 of the preceding entropy slice. Entropy slices 90 immediately following each other in entropy slice order 16 are not subject to any other restrictions regarding the time-alignment of the entropy coding procedure during the continuation phase 86. In accordance with another embodiment, however, a stronger coupling is used, additionally and/or alternatively. In particular, as illustrated in Fig. 4 by representative arrows 106, the probability estimation adaptation during continuation phase 86 causes the data of the coding unit corresponding to a certain segment 92 to change the probability estimations 94 from the state at the beginning of the respective segment 92 until the end of this segment 92, thereby improving the approximation of the actual symbol statistics, as denoted above. That is, the adaptation 106 is performed for the entropy slice n-1 merely dependent on data of entropy slice n-1, and the same applies to the probability estimation adaptation 106 of entropy slice n, etc. For example, it would be possible to perform the initialization as explained above with respect to arrows 104 while performing the probability estimation adaptation 106 without any further interference between the entropy slices 90. However, in order to expedite the approximation of the probability estimations to the actual symbol statistics, the probability estimation adaptations 106 of consecutive entropy slices may be coupled so that the probability estimation adaptation 106 of a preceding entropy slice n-1 also influences, or is taken into account in, the adaptation of the probability estimations of a current entropy slice n. This is illustrated in Fig. 4 by an arrow 108 pointing from state 110 of the spatially neighboring probability estimations 94 for entropy coding the (n-1)-th entropy slice 90 to state 100 of the probability estimations 94 for entropy coding the n-th entropy slice 90. When using the above-outlined initialization of state 96, the probability adaptation coupling 108 may, for example, be used at any of the b probability estimation states manifesting themselves after entropy coding the b segments 92 of the preceding entropy slice. To be more precise, the probability estimations manifesting themselves after entropy coding the first segment 92 of the current entropy slice may be the result of the usual probability adaptation 106 combined with a taking into account 108 of the probability estimation states resulting from the probability estimation adaptation 106 during entropy coding the (a+1)-th segment 92 of the preceding entropy slice n-1. The "taking into account" may, for example, involve some averaging operation.
An example will be outlined further below. In other words, state 98 of the probability estimations 94 for entropy coding the n-th entropy slice 90 at the beginning of entropy coding the second segment 92 thereof may be the result of averaging the predecessor state 96 of the probability estimations 94 for entropy coding the current entropy slice n as adapted using adaptation 106, and the state before entropy coding the (a+1)-th segment 92 of the preceding entropy slice n-1 modified according to the probability adaptation 106. Analogously, state 100 may be the result of averaging the result of the adaptation 106 performed during entropy coding the current entropy slice n and the result of the probability adaptation during entropy coding the (a+2)-th segment 92 of the preceding entropy slice n-1, etc. More specifically, let p(n) -> {i,j}, with i,j denoting the position of any coding unit (with (0,0) being the left, top and (I,J) the right, bottom position), i ∈ {1...I} and j ∈ {1...J}, I being the number of columns, J being the number of rows and p() defining the path order 66; let P_{i,j} be the probability estimation used in entropy coding coding unit {i,j}; and let T(P_{i,j}) be the result of the probability adaptation 106 of P_{i,j} based on coding unit {i,j}. Then, the probability estimations of consecutive entropy slices 90 may be combined to replace the usual, entropy slice internal adaptation according to

P_{p(n+1)} = T(P_{p(n)})

by

P_{p(n+1)} = average(T(P_{p(n)}), T(P_{{i,j}_1}), ..., T(P_{{i,j}_N}))

where N may be 1 or more than 1 and {i,j}_{1...N} is/are selected from (lie within) any previous (in entropy slice order 16) entropy slice 90 and its associated portion 12, respectively. The function "average" may be one of a weighted sum, a median function, etc. p(n) = {i,j} is the current coding unit and p(n+1) follows in accordance with the coding order 14 and 66, respectively. In the presented embodiments, p(n+1) = {i+1,j}. Preferably, the {i,j}_{1...N} fulfil, for each k ∈ {1...N}, {i,j}_k = {i_k,j_k} with the positions {i_k,j_k} lying within a preceding entropy slice such that wavefront processing remains possible. The temporal initialization described in the following may be done additionally or alternatively to the spatial initialization described above; it may be done for the first entropy slice (k = 0) only, or for each entropy slice k ∈ {1...K}, with K denoting the number of entropy slices in the current frame. That is, P_{i,j} with {i,j} denoting the first CU 50 in the k-th entropy slice may be set equal to some combination (such as some average) of T(P_{{i,j}'}) and T(P_{{i,j}_spatial}), with {i,j}' denoting a CU within a preceding (previously de/coded) sample array, or a combination of several T(P_{{i,j}'})'s, and {i,j}_spatial denoting a CU within a preceding entropy slice of the current sample array. As far as the location of {i,j}' is concerned, P_{i,j} with {i,j} denoting the first (in entropy coding order 14) CU 50 in the k-th (in entropy slice order) entropy slice of the current sample array may be set equal to T(P_{{i,j}'}) with {i,j}' denoting the last (in entropy coding order 14) CU within the k-th (in entropy slice order) entropy slice in the preceding (in sample array coding order) sample array, or the last CU within the last (in entropy slice order) entropy slice in the preceding (in sample array coding order) sample array. Again, this temporal initialization may be performed for the first entropy slice in the sample array only. The use of the end state of the reference frame together with the method of probability adaptation was tested; the results are illustrated in Fig. 10 to Fig. 13 (Temporal graph).
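The combined update may be illustrated as follows, with "average" instantiated as a weighted sum over per-context probability values; the weights, and the representation of a probability state as a dict from context id to probability, are assumptions made for this sketch:

    def combined_adaptation(own_state, neighbor_states, weights=None):
        # Replaces the slice-internal update T(P_p(n)) by an average with
        # adapted states T(P_{i,j}) taken from preceding entropy slices.
        states = [own_state] + list(neighbor_states)
        if weights is None:
            weights = [1.0 / len(states)] * len(states)
        return {ctx: sum(w * s[ctx] for w, s in zip(weights, states))
                for ctx in own_state}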
Another opportunity to use the data from other frames is to exchange the obtained probabilities between collocated LCUs. The main idea is based on the assertion that the properties of the reference frame do not differ greatly from those of the current frame. In order to quicken the learning of the probabilities along the LCUs in a frame, one can try to pass the end state of each LCU to the appropriate LCU in the current frame. This proposal is illustrated in Fig. 18. Different options can be understood under the term reference frame. For instance, the frame which was coded last can be used as a reference frame. Otherwise, only the last coded frame from the same temporal layer may be appropriated for reference. Moreover, this approach can be merged with the methods already proposed above, such as the usage of the last (slice) information from the reference frame, probability adaptation and the usage of the second LCU of the upper line. The above spatial adaptation process may be modified to be

P_{p(n+1)} = average(T(P_{p(n)}), T(P_{{i,j}_1}), ..., T(P_{{i,j}_N}), T(P_{{i,j}'_1}), ..., T(P_{{i,j}'_M}))

where N may be 1 or more than 1 and {i,j}_{1...N} is/are selected from (lie within) any previous (in entropy slice order 16) entropy slice 90 in the current sample array 10 and its associated portion 12, respectively, and M may be 1 or more than 1 and {i,j}'_{1...M} lie within the preceding sample array 350. It may be that at least one of the CUs 50 {i,j}'_{1...M} is co-located to p(n). With regard to possible selections of the CUs 50 {i,j}_{1...N}, reference is made to the above description. The function "average" may be one of a weighted sum, a median function, etc. The above spatial adaptation process may alternatively be replaced by

P_{p(n+1)} = average(T(P_{p(n)}), T(P_{{i,j}'_1}), ..., T(P_{{i,j}'_M}))

where M may be 1 or more than 1 and {i,j}'_{1...M} lie within the preceding sample array. It may be that at least one of the {i,j}'_{1...M} is co-located to p(n). With regard to possible selections of {i,j}_{1...N}, reference is made to the above description. As a specific extension of the usage of collocated information, an approach to utilize the obtained data from other blocks from one or even more reference frames can be applied. The earlier mentioned techniques use only the obtained information from the direct neighbors in the current frame or in reference frames. However, this does not mean that the gained probabilities in this case are the best ones. The adjacent LCUs, according to the picture partitions (residuals), do not always have the best probability models. It is supposed that the best results can be achieved with the help of the blocks from which prediction will be done, and thus this appropriate block can be used as a reference for the current LCU. Thus, in the above adaptation example, {i,j}_{1...N} and/or {i,j}'_{1...M} may be selected depending on the CUs serving as providers of predictors for p(n). The temporal probability adaptation/initialization schemes presented can also be used without entropy slices, or with a single entropy slice per frame. In accordance with the latter aspect, a gain in probability adaptation speed is achieved by coupling the probability adaptations of temporally neighboring/related frames.
What is thereby described is a decoder such as the one of Fig. 6, wherein the decoder is configured to reconstruct a sequence of sample arrays from an entropy encoded data stream, and is configured to entropy decode a current frame of the entropy encoded data stream so as to reconstruct a current sample array of the sequence of sample arrays, perform the entropy decoding along an entropy coding path and using probability estimations, and adapt the probability estimations along the entropy coding path using a previously decoded part of the current frame, wherein the entropy decoding stage is configured to initialize or determine the probability estimations for the current frame based on probability estimations used in decoding a previously decoded frame of the entropy encoded data stream. That is, for example, the probability estimations for the current frame are initialized based on probability estimations resulting after having finished decoding the previously decoded frame of the entropy encoded data stream. Buffer requirements are, therefore, low, since merely the end state of the probability estimations has to be buffered until the start of the decoding of the current frame. Of course, this aspect is combinable with the aspect of Figs. 1 to 9 in that, for the first parts of each portion 12, not only the probability estimates used for spatially neighboring parts in prior entropy slices (if available) but also, in a weighted manner for example, the end state of the probability estimates of a (for example, spatially) corresponding entropy slice in the previous frame is used. Such data of a corresponding slice in a reference frame may be derived not only at the end position but also from a prior position in the referenced slices, since the parallel wavefront processing may also go over frame boundaries, i.e. while coding a slice of a frame, the coding process of the slice of the preceding frame may not yet be finished. Therefore, signaling may be used to indicate the reference position, or it may be indicated by scheme. Further, for example, the probability estimations used for coding the parts/blocks of the previously decoded frame may all be buffered, not only the end state, and the decoder would, in entropy decoding the predetermined entropy slice (with reference to the above description of spatially coupled probability derivation), perform entropy decoding of the current part (X) of the predetermined entropy slice based on the respective probability estimations of the predetermined entropy slice as adapted using the previously decoded part of the predetermined entropy slice (including p1, for example), and probability estimations as used in the entropy decoding of a spatially corresponding part of an entropy slice of the previously decoded frame, with, optionally, additionally using probability estimations as used in the entropy decoding of a spatially neighboring, in entropy slice order preceding entropy slice (the slice comprising X, for example) at a neighboring part (such as p4) of the spatially neighboring entropy slice, as described above. As has also been described above, the spatial correspondence among parts, and the identification of an appropriate one among the previously decoded frame for probability adoption for the current frame, may be defined by help of motion information, such as motion indices, motion vectors and the like, of the current part/block.
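The low buffer requirement mentioned above can be made concrete with a small sketch; the class below (a hypothetical decoder-side helper) keeps only the end state of the previously decoded frame and uses it to initialize the probability estimations of the next frame:

    class TemporalProbabilityBuffer:
        def __init__(self):
            self.end_state = None  # end state of the previously decoded frame

        def init_for_current_frame(self, default_state):
            # Adopt the previous frame's end state if available; otherwise
            # fall back to the convention-based default initialization.
            if self.end_state is not None:
                return dict(self.end_state)
            return dict(default_state)

        def on_frame_finished(self, final_state):
            # Only this one state is buffered until the next frame starts.
            self.end_state = dict(final_state)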
Up to now, the wavefront extending during wavefront processing has primarily been described as extending obliquely through one sample array 10, with the coding/decoding being performed one sample array after the other. However, this is not a must. Reference is made to Fig. 19. Fig. 19 shows a portion out of a sequence of sample arrays, wherein the sample arrays of the sequence have defined thereamong, and are depicted as being arranged in, a sample array coding order 380 which may or may not coincide with the presentation time order. Fig. 19 exemplarily shows a subdivision of the sample arrays 10 into four entropy slices each. Already coded/decoded entropy slices are shown hatched. Four coding/decoding threads (coding/decoding stages) 382 are currently operating on the four entropy slices 12 of the sample array having index n. However, Fig. 19 shows a further thread, number 5, being left over, and it is possible that this further thread 382 having number 5 in Fig. 19 operates to code/decode the next sample array in line, i.e. n+1, at portions for which it is guaranteed that the respective reference portions in the currently coded/decoded frame n are already available, i.e. have already been processed by any of threads 1 to 4. These portions are referenced in the predictions shown at 64 in Fig. 1, for example. Fig. 19 exemplarily shows with a dotted line 384 a line extending through sample array n+1 which is co-located to the border between the already processed, i.e. already coded/decoded, portion of sample array n, i.e. the hatched portion within sample array n, on the one hand, and the not yet processed portion, i.e. the un-hatched portion of sample array n, on the other hand. With double-headed arrows, Fig. 19 also shows the maximum possible lengths of motion vectors measured in column and row direction, i.e. ymax and xmax, respectively. Accordingly, Fig. 19 also shows with a dash-dotted line 386 a displaced version of line 384, namely a line 386 which is spaced apart from line 384 at the minimum possible distance so that the distance does not fall below ymax in column direction and xmax in row direction. As can be seen, there are coding units 50 in sample array n+1 for which any reference portion in sample array n is guaranteed to be found as being completely contained within the already processed portion of this sample array n, namely those lying in the half of sample array n+1 lying on the upstream side relative to line 386. Accordingly, thread 5 is able to already operate to decode/code these coding units, as shown in Fig. 19. As can be seen, even a sixth thread could operate on the second entropy slice in entropy slice order 16 of sample array n+1. Thereby, the wavefront extends not only spatially, but also temporally, through the spatio-temporal space spanned by the sequence 30 of sample arrays. Please note that the just-mentioned wavefront aspect does also work in combination with the above-outlined probability estimation couplings across entropy slice borders. Further, with respect to the above-outlined chunk aspect, it should also be noted that subdividing entropy slices into smaller pieces, i.e. chunks, is not restricted to being performed in the entropy coded domain, i.e. in the entropy compressed domain.
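The availability rule expressed by the lines 384 and 386 in Fig. 19 reduces to a simple comparison. The following sketch (illustrative only; x_max and y_max are here measured in coding units, and done_cols[r] counts the coding-unit columns of row r of sample array n already processed) decides whether a coding unit of sample array n+1 may already be coded:

    def can_code_in_next_frame(cu_col, cu_row, done_cols, num_cols, x_max, y_max):
        # True if every possible reference region of the coding unit at
        # (cu_col, cu_row) in sample array n+1 lies within the already
        # processed (hatched) portion of sample array n.
        last_row = min(cu_row + y_max, len(done_cols) - 1)
        last_col = min(cu_col + x_max, num_cols - 1)
        return all(done_cols[r] > last_col for r in range(last_row + 1))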
Think of the above discussion: entropy slices as described above have the advantage of reducing the coding efficiency loss despite the enablement of wavefront processing, due to the derivation of probability estimations from previously coded/decoded entropy slices of the same or a previously coded/decoded frame, i.e. an initialization and/or adaptation of the probability estimations based on probability estimations of such previous entropy slices. Each of these entropy slices is supposed to be entropy coded/decoded by one thread in case of wavefront processing. That is, when subdividing the entropy slices, it is not necessary to render the chunks codable/decodable in parallel. Rather, the encoder shall merely be provided with the opportunity to output subparts of its entropy slice's bitstream prior to finalizing the entropy encoding, and the decoder shall be provided with the opportunity to operate on these subparts, i.e. chunks, prior to the reception of the remaining chunks of the same entropy slice. Moreover, the interleaving shall be enabled at the reception side. In order to enable the latter de-interleaving, however, it is not necessary to perform the subdivision in the entropy coded domain. In particular, it is possible to perform the above presented subdivision of entropy slices into smaller chunks without severe coding efficiency loss, by intermittently merely resetting the internal state of the probability interval, i.e. the probability interval width value and the offset value, respectively, of the entropy coding/decoding core. The probability estimations, however, are not reset. Rather, they are continuously updated/adapted from the beginning to the end of the entropy slices, respectively. By this measure, it is possible to subdivide the entropy slices into individual chunks, with the subdivision being performed in the syntax element domain rather than the compressed bitstream domain. The subdivision may follow a spatial subdivision as outlined below in order to ease the signaling of the chunk interfaces to the decoder. Each of the chunks could be provided with its own chunk header revealing, for example, its starting position in the sample array, measured, for example, with respect to the coding order 14 relative to the respective entropy slice's start position along with an index to its entropy slice, or relative to a prominent location of the sample array 10 such as the upper left corner. In order to more clearly describe the subdivision of entropy slices into chunks in accordance with the latter embodiment, reference is made to Fig. 20. Fig. 20 shows, merely for illustrative purposes, sample array 10 as being subdivided into four entropy slices. Currently coded portions of sample array 10 are shown hatched. Three threads are currently operating on the entropy encoding of sample array 10 and output chunks of the entropy slices on an immediate-attention basis: see, for example, the first entropy slice in entropy slice order 16, which corresponds to portion 12 of sample array 10. After having encoded a subpart 12a of portion 12, the encoder forms a chunk 390 therefrom, i.e. the entropy coding core 80 performs, in case of arithmetic coding, some finishing procedure to finalize the arithmetic bitstream produced from sub-portion 12a so far, thereby forming chunk 390. The encoding procedure is then resumed with respect to the subsequent sub-portion 12b of entropy slice 12 in coding order 14, while starting a new entropy bitstream.
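A sketch of this chunk boundary handling, elaborated further in the next paragraph, may look as follows; the encoder-side helper and its core methods finish() and reset_interval() are hypothetical names standing for the finishing procedure and the reset of the interval state:

    def emit_chunk(core, chunks, sub_portion_start):
        # Finalize the arithmetic bitstream produced so far and emit it as a
        # chunk 390, together with location information for header 398.
        chunks.append({"start": sub_portion_start, "payload": core.finish()})
        # Reset only the interval width/offset state; the probability
        # estimations are deliberately left untouched (arrow 392), so that
        # adaptation continues across the chunk boundary.
        core.reset_interval()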
Starting a new entropy bitstream means, for example, that internal states, such as the probability interval width value and the probability interval offset value of entropy coding core 80, are reset. The probability estimations, however, are not reset. They are left unchanged. This is illustrated in Fig. 20 by an arrow 392. In Fig. 20, it is exemplarily shown that the entropy slice or portion 12 is subdivided into more than two sub-portions, and accordingly even the second chunk 1b is subject to some entropy finishing before reaching the end of portion 12 along coding order 14, whereupon a next chunk in line is started, etc. Concurrently, another thread operates on the second entropy slice or portion 12 in entropy slice order 16. Upon finishing a first sub-portion of this second entropy slice/portion 12, a chunk 2a is output, whereupon entropy coding of the remainder of the second entropy slice is commenced, maintaining, however, the probability estimations as valid at the end of chunk 2a. With a time axis 394, Fig. 20 seeks to illustrate that the chunks 390 are output as soon as they have been finalized. This leads to an interleaving similar to the one depicted in Fig. 16. Each chunk may be packetized into a packet and transported to the decoding side via some transport layer in any order. The transport layer is illustrated using arrow 396. The decoder has to reassign the chunks to their sub-portions 12a, 12b and so forth. To this end, each chunk 390 may have a header section 398 revealing the location of the beginning of its associated sub-portion 12a or 12b, i.e. the sub-portion whose describing syntax elements are entropy coded in the respective chunk. By using this information, the decoder is able to associate each chunk 390 with its entropy slice and with its sub-portion within the portion 12 of that entropy slice. For illustration purposes, Fig. 20 also exemplarily shows the possibility that the junction between consecutive sub-portions 12a and 12b of an entropy slice 12 does not have to coincide with the border between consecutive coding units 50. Rather, the junction may be defined at a deeper level of the aforementioned exemplary multi-tree subdivision of the coding units. The location information contained in headers 398 may indicate the beginning of the sub-portion associated with the current chunk 390 precisely enough in order to identify the respective sub-block of the respective coding unit, i.e. the location within the syntax element sequence from which on the respective sub-block is described. As became clear from the above discussion, almost no coding efficiency loss results from the subdivision of the entropy slices into chunks. Merely the entropy finishing processes and the packetizing may involve some coding efficiency loss, but on the other hand the low-delay gains are enormous. Again, please note that the just-mentioned spatial sub-division chunk aspect does also work in combination with the above-outlined probability estimation couplings across entropy slice borders, spatially and temporally. The decoder, such as the decoder of Fig. 6, may undo the chunk interleaving as follows. In particular, the decoder may check as to which entropy slice a current chunk belongs. This check may be done based on the aforementioned location information. Then, it may be checked whether the current chunk corresponds to a first sub-portion of the portion of the corresponding entropy slice along the entropy coding path 14.
If so, the decoder may entropy decode the current chunk while adapting the respective probability estimations, and take the state of the respective probability estimations as manifesting themselves at the end of entropy decoding the current chunk into account when entropy decoding another chunk which corresponds to a second sub-portion of the portion of the predetermined entropy slice along the entropy coding path. The "taking into account" may involve setting the probability estimations at the beginning of chunk 1b equal to the probability estimations manifesting themselves, by the probability adaptation starting from the probability estimation state at the beginning of chunk 1a, at the end of chunk 1a's sub-portion 12a, or equal to a combination thereof with probability estimations from other entropy slices as described above. As far as the probability initialization at the beginning of the first chunk 1a is concerned, reference is made to the above discussion, as this also forms the beginning of the corresponding entropy slice. In other words, if the current chunk is a second or following chunk in order 14, the decoder may entropy decode the current chunk using probability estimations which depend on probability estimations manifesting themselves at the end of entropy decoding a chunk which corresponds to a sub-portion of the portion of the predetermined entropy slice preceding, along the entropy coding path 14, the sub-portion corresponding to the current chunk. The above description reveals different methods which can be useful for parallel encoding and decoding as well as helpful for the optimization of already existing processes in the emerging HEVC video coding standard. A short overview of entropy slices has been presented. It has been shown how they can be formed, which advantages can be achieved by slicing and what penalties can occur from those techniques. A number of methods have been proposed which are supposed to improve the learning process of the probabilities along the LCUs (largest coding units) in a frame, by better exploiting the local dependencies between LCUs, and also the temporal dependencies between LCUs of different frames. It is asserted that different combinations provide improvements for both concepts, with and without parallelization of encoding and decoding. The performance improvement in High Efficiency, for instance, by the best mix of the proposed approaches, is -0.4 % in Intra, -0.78 % in Low Delay and -0.63 % in Random Access in comparison with HM3.0 without use of entropy slices, or -0.7 % in Intra, -1.95 % in Low Delay and -1.5 % in Random Access in comparison with the entropy slice approach with usual re-initialization. In particular, inter alia, the following techniques have been presented above. • to use not only local but also temporal dependencies of LCUs, to optimize the adaptation of the CABAC probabilities before coding each LCU, see Figs. 1 to 9, 17 and 18. • to achieve more flexibility in decoding, entropy slices can also be utilized, so that certain regions in a frame become independent from each other. • to allow minimal signaling of the slice / entropy slice start positions for parallelized, e.g. wavefront, processing, see Fig. 15. • to allow low-delay transport in a parallelized encoder - transmitter - receiver - decoder environment through interleaved transport of entropy slices / slices, see Fig. 16. All the methods which were mentioned above have been integrated and tested in HM3.0.
The obtained results, where the reference point is HM3.0 without any entropy slice implementation, are presented in Tables 1 and 2 (where 2LCU = usage of the second LCU of the upper line; 2LCU+Prob.Adap = 2LCU merged with the method of probability adaptation; Temporal = usage of temporal dependencies (end state of a reference frame) with probability adaptation for each LCU).

Table 1. Summary RD results with 1 Thread.

However, it is interesting to know how the proposed approaches affect the wavefront processing with the re-initialization of probabilities at the beginning of each line of LCUs. These results are illustrated in Tables 3 and 4 (where the reference is the comparison of HM3.0 without use of entropy slices against the usage of entropy slices with re-initialization).

Table 3. Summary RD results with 1 Thread. Reference is new initialization.

The above results show that considerably more use of the dependencies within and between frames and a rational application of the already obtained information prevent the average loss. An approach for wavefront processing for HEVC video encoding and decoding merges the possibility to use dependencies between adjacent LCUs as well as temporal frame dependencies with the concept of Wavefront Parallel Processing. In this way, the loss can be reduced and a performance advance can be achieved. A gain in probability adaptation speed has been achieved by coupling the probability adaptations of spatially neighboring entropy slices. As already mentioned above, all of the above aspects may be combined with each other, and thus the mentioning of certain implementation possibilities with regard to a certain aspect shall, of course, also apply to the other aspects. Although some aspects have been described in the context of an apparatus, it is clear that these aspects also represent a description of the corresponding method, where a block or device corresponds to a method step or a feature of a method step. Analogously, aspects described in the context of a method step also represent a description of a corresponding block or item or feature of a corresponding apparatus. Some or all of the method steps may be executed by (or using) a hardware apparatus, like, for example, a microprocessor, a programmable computer or an electronic circuit. In some embodiments, one or more of the most important method steps may be executed by such an apparatus. The inventive encoded signals mentioned above can be stored on a digital storage medium or can be transmitted on a transmission medium such as a wireless transmission medium or a wired transmission medium such as the Internet. Depending on certain implementation requirements, embodiments of the invention can be implemented in hardware or in software. The implementation can be performed using a digital storage medium, for example a floppy disk, a DVD, a Blu-ray, a CD, a ROM, a PROM, an EPROM, an EEPROM or a FLASH memory, having electronically readable control signals stored thereon, which cooperate (or are capable of cooperating) with a programmable computer system such that the respective method is performed. Therefore, the digital storage medium may be computer readable. Some embodiments according to the invention comprise a data carrier having electronically readable control signals which are capable of cooperating with a programmable computer system such that one of the methods described herein is performed.
As already mentioned above, all of the above aspects may be combined with each other, and thus the mentioning of certain implementation possibilities with regard to a certain aspect shall, of course, also apply to the other aspects.

Although some aspects have been described in the context of an apparatus, it is clear that these aspects also represent a description of the corresponding method, where a block or device corresponds to a method step or a feature of a method step. Analogously, aspects described in the context of a method step also represent a description of a corresponding block or item or feature of a corresponding apparatus. Some or all of the method steps may be executed by (or using) a hardware apparatus, like, for example, a microprocessor, a programmable computer or an electronic circuit. In some embodiments, one or more of the most important method steps may be executed by such an apparatus.

The inventive encoded signals mentioned above can be stored on a digital storage medium or can be transmitted on a transmission medium such as a wireless transmission medium or a wired transmission medium such as the Internet.

Depending on certain implementation requirements, embodiments of the invention can be implemented in hardware or in software. The implementation can be performed using a digital storage medium, for example a floppy disk, a DVD, a Blu-Ray, a CD, a ROM, a PROM, an EPROM, an EEPROM or a FLASH memory, having electronically readable control signals stored thereon, which cooperate (or are capable of cooperating) with a programmable computer system such that the respective method is performed. Therefore, the digital storage medium may be computer readable. Some embodiments according to the invention comprise a data carrier having electronically readable control signals, which are capable of cooperating with a programmable computer system, such that one of the methods described herein is performed.

Generally, embodiments of the present invention can be implemented as a computer program product with a program code, the program code being operative for performing one of the methods when the computer program product runs on a computer. The program code may, for example, be stored on a machine readable carrier. Other embodiments comprise the computer program for performing one of the methods described herein, stored on a machine readable carrier. In other words, an embodiment of the inventive method is, therefore, a computer program having a program code for performing one of the methods described herein, when the computer program runs on a computer.

A further embodiment of the inventive methods is, therefore, a data carrier (or a digital storage medium, or a computer-readable medium) comprising, recorded thereon, the computer program for performing one of the methods described herein. The data carrier, the digital storage medium or the recorded medium are typically tangible and/or non-transitory. A further embodiment of the inventive method is, therefore, a data stream or a sequence of signals representing the computer program for performing one of the methods described herein. The data stream or the sequence of signals may, for example, be configured to be transferred via a data communication connection, for example via the Internet.

A further embodiment comprises a processing means, for example a computer or a programmable logic device, configured to or adapted to perform one of the methods described herein. A further embodiment comprises a computer having installed thereon the computer program for performing one of the methods described herein. A further embodiment according to the invention comprises an apparatus or a system configured to transfer (for example, electronically or optically) a computer program for performing one of the methods described herein to a receiver. The receiver may, for example, be a computer, a mobile device, a memory device or the like. The apparatus or system may, for example, comprise a file server for transferring the computer program to the receiver.

In some embodiments, a programmable logic device (for example a field programmable gate array) may be used to perform some or all of the functionalities of the methods described herein. In some embodiments, a field programmable gate array may cooperate with a microprocessor in order to perform one of the methods described herein. Generally, the methods are preferably performed by any hardware apparatus.

The above described embodiments are merely illustrative of the principles of the present invention. It is understood that modifications and variations of the arrangements and the details described herein will be apparent to others skilled in the art. It is the intent, therefore, to be limited only by the scope of the appended patent claims and not by the specific details presented by way of description and explanation of the embodiments herein.

Claims
1. Decoder for reconstructing a sample array (10) from an entropy encoded data stream, configured to entropy decode a plurality of entropy slices within the entropy encoded data stream so as to reconstruct different portions (12) of the sample array associated with the entropy slices, respectively, with performing, for each entropy slice, the entropy decoding along a respective entropy coding path (14) using respective probability estimations, adapting the respective probability estimations along the respective entropy coding path using a previously decoded part of the respective entropy slice, starting the entropy decoding of the plurality of entropy slices sequentially using an entropy slice order (16), and performing, in entropy decoding a predetermined entropy slice, entropy decoding of a current part of the predetermined entropy slice based on the respective probability estimations of the predetermined entropy slice as adapted using the previously decoded part of the predetermined entropy slice, and probability estimations as used in the entropy decoding of a spatially neighboring, in entropy slice order preceding entropy slice at a neighboring part of the spatially neighboring entropy slice.

2. Decoder according to claim 1, wherein the different portions are rows of blocks of the sample array.

3. Decoder according to claim 1 or 2, wherein the entropy slice order is chosen such that, along the entropy slice order, the different portions follow each other in a direction (16) angled relative to the entropy coding paths (14) of the entropy slices, which, in turn, extend substantially in parallel to each other.

4. Decoder according to any of claims 1 to 3, wherein each entropy slice has entropy encoded therein data for a corresponding portion of the sample array, the different portions forming rows of blocks of the sample array with the blocks being regularly arranged in rows and columns so that the portions corresponding to the entropy slices consist of the same number of blocks and the entropy coding paths point in parallel along the rows of the blocks, wherein the decoder is configured to perform, for each entropy slice (90), an initialization, for a respective entropy slice, of the probability estimations (94) before decoding the first block of the portion (12) corresponding to the respective entropy slice along the respective encoding path (14) with probability estimations manifesting themselves after having entropy decoded the second block (50) of the portion (12) corresponding to the, in entropy slice order (16), preceding entropy slice along the respective encoding path.

5. Decoder according to claim 4, wherein the decoder is configured to store the probability estimations manifesting themselves after having entropy decoded the second block of the portion corresponding to the, in entropy slice order, preceding entropy slice along the respective encoding path, and to use the stored probability estimations for the initialization before decoding the first block of the portion corresponding to the respective entropy slice along the respective encoding path.
6. Decoder according to any of claims 1 to 3, wherein each entropy slice has entropy encoded therein data for a corresponding portion of the sample array, the different portions forming rows of blocks of the sample array with the blocks being regularly arranged in rows and columns so that the portions corresponding to the entropy slices consist of the same number of blocks and the entropy coding paths point in parallel along the rows of the blocks, wherein the decoder is configured to perform, for each entropy slice, the entropy decoding along the respective entropy coding path and the adaptation of the respective probability estimations along the respective entropy coding path such that, after the current part of the predetermined entropy slice has been entropy decoded based on the respective probability estimations (94) of the predetermined entropy slice, the respective probability estimations (94) of the predetermined entropy slice are adapted depending on the current part of the predetermined entropy slice and the probability estimations as manifesting themselves in the entropy decoding of the neighboring part of the spatially neighboring entropy slice.

7. Decoder according to claim 6, wherein the decoder is configured such that the adaptation of the respective probability estimations of the predetermined entropy slice, after the current part of the predetermined entropy slice has been entropy decoded based on the respective probability estimations of the predetermined entropy slice, is performed by a first adaptation depending on the current part of the predetermined entropy slice and an averaging of a result of the first adaptation with the probability estimations as used in the entropy decoding of the neighboring part of the spatially neighboring entropy slice.

8. Decoder according to any of claims 4 to 7, wherein the decoder is configured to steer the entropy decoding of immediately consecutive entropy slices in entropy slice order so that a distance of currently decoded blocks of portions corresponding to immediately consecutive entropy slices, measured in blocks along the encoding paths, is prevented from becoming lower than two blocks.

9. Decoder according to any of claims 4 to 7, wherein the decoder is configured to steer the entropy decoding of immediately consecutive entropy slices in entropy slice order so that a distance of currently decoded blocks of portions corresponding to immediately consecutive entropy slices, measured in blocks along the encoding paths, remains two blocks.

10. Decoder according to any of claims 1 to 9, wherein the entropy slices are subdivided into chunks, and the decoder comprises a de-interleaver to de-interleave the chunks and is configured to start the entropy decoding of the entropy slices in parallel along the entropy decoding paths even before reception of any of the entropy slices as a whole.
11. Decoder according to any of claims 1 to 10, wherein the entropy slices are subdivided into chunks and the decoder is configured to check whether a current chunk corresponds to a first sub-portion of the portion of the predetermined entropy slice along the entropy coding path, and, if so, entropy decode the current chunk while adapting the respective probability estimations and take a state of the respective probability estimations as manifesting themselves at an end of entropy decoding the current chunk into account when entropy decoding another chunk which corresponds to a second sub-portion of the portion of the predetermined entropy slice along the entropy coding path, and, if not, entropy decode the current chunk using probability estimations which depend on probability estimations manifesting themselves at an end of entropy decoding a chunk which corresponds to a sub-portion of the portion of the predetermined entropy slice preceding, along the entropy coding path, the sub-portion corresponding to the current chunk.

12. Decoder according to any of claims 1 to 11, wherein the sample array (10) is a current sample array of a sequence of sample arrays and the decoder is configured to, in entropy decoding a predetermined entropy slice, entropy decode the current part of the predetermined entropy slice based on the respective probability estimations of the predetermined entropy slice as adapted using the previously decoded part of the predetermined entropy slice, probability estimations as used in the entropy decoding of a spatially neighboring, in entropy slice order preceding entropy slice at a neighboring part of the spatially neighboring entropy slice, and probability estimations used in decoding a previously decoded frame of the entropy encoded data stream relating to another sample array than the current sample array.

13. Decoder configured to reconstruct a sequence of sample arrays from an entropy encoded data stream, configured to entropy decode a current frame of the entropy encoded data stream so as to reconstruct a current sample array of the sequence of sample arrays, perform the entropy decoding along an entropy coding path and using probability estimations, and adapt the probability estimations along the entropy coding path using a previously decoded part of the current frame, wherein the decoder is configured to initialize or determine the probability estimations for the current frame based on probability estimations used in decoding a previously decoded frame of the entropy encoded data stream.

14. Decoder according to claim 13, wherein the entropy decoding stage is configured to initialize the probability estimations for the current frame based on probability estimations resulting after having finished decoding the previously decoded frame of the entropy encoded data stream.

15. Decoder according to claim 13 or 14, wherein the entropy decoding stage is configured to perform the adaptation based on the respective probability estimations of the predetermined entropy slice as adapted using the previously decoded part of the predetermined entropy slice, and probability estimations as used in the entropy decoding of a spatially corresponding part of an entropy slice of the previously decoded frame and, optionally, probability estimations as used in the entropy decoding of a spatially neighboring, in entropy slice order preceding entropy slice at a neighboring part of the spatially neighboring entropy slice.
16. Decoder according to any of claims 13 to 15, wherein the entropy decoding stage is configured to select the spatially corresponding part of an entropy slice of the previously decoded frame and/or the neighboring part of the spatially neighboring entropy slice based on prediction references.

17. Encoder for encoding a sample array (10) into an entropy encoded data stream, configured to entropy encode a plurality of entropy slices into the entropy encoded data stream, each entropy slice being associated with a different portion (12) of the sample array, respectively, with performing, for each entropy slice, the entropy encoding along a respective entropy coding path (14) using respective probability estimations, adapting the respective probability estimations along the respective entropy coding path using a previously encoded part of the respective entropy slice, starting the entropy encoding of the plurality of entropy slices sequentially using an entropy slice order (16), and performing, in entropy encoding a predetermined entropy slice, entropy encoding of a current part of the predetermined entropy slice based on the respective probability estimations of the predetermined entropy slice as adapted using the previously encoded part of the predetermined entropy slice, and probability estimations as used in the entropy encoding of a spatially neighboring, in entropy slice order preceding entropy slice at a neighboring part of the spatially neighboring entropy slice.

18. Encoder configured to encode a sequence of sample arrays into an entropy encoded data stream, configured to entropy encode a current sample array of the sequence of sample arrays into a current frame of the entropy encoded data stream, perform the entropy encoding along an entropy coding path and using probability estimations, and adapt the probability estimations along the entropy coding path using a previously encoded part of the current frame, wherein the encoder is configured to initialize or determine the probability estimations for the current frame based on probability estimations used in encoding a previously encoded frame of the entropy encoded data stream.

19. Method for reconstructing a sample array (10) from an entropy encoded data stream, comprising entropy decoding a plurality of entropy slices within the entropy encoded data stream so as to reconstruct different portions (12) of the sample array associated with the entropy slices, respectively, with performing, for each entropy slice, the entropy decoding along a respective entropy coding path (14) using respective probability estimations, adapting the respective probability estimations along the respective entropy coding path using a previously decoded part of the respective entropy slice, starting the entropy decoding of the plurality of entropy slices sequentially using an entropy slice order (16), and performing, in entropy decoding a predetermined entropy slice, entropy decoding of a current part of the predetermined entropy slice based on the respective probability estimations of the predetermined entropy slice as adapted using the previously decoded part of the predetermined entropy slice, and probability estimations as used in the entropy decoding of a spatially neighboring, in entropy slice order preceding entropy slice at a neighboring part of the spatially neighboring entropy slice.
20. Method for reconstructing a sequence of sample arrays from an entropy encoded data stream, comprising entropy decoding a current frame of the entropy encoded data stream so as to reconstruct a current sample array of the sequence of sample arrays, performing the entropy decoding along an entropy coding path and using probability estimations, and adapting the probability estimations along the entropy coding path using a previously decoded part of the current frame, wherein the method comprises initializing or determining the probability estimations for the current frame based on probability estimations used in decoding a previously decoded frame of the entropy encoded data stream.

21. Method for encoding a sample array (10) into an entropy encoded data stream, comprising entropy encoding a plurality of entropy slices into the entropy encoded data stream, each entropy slice being associated with a different portion (12) of the sample array, respectively, with performing, for each entropy slice, the entropy encoding along a respective entropy coding path (14) using respective probability estimations, adapting the respective probability estimations along the respective entropy coding path using a previously encoded part of the respective entropy slice, starting the entropy encoding of the plurality of entropy slices sequentially using an entropy slice order (16), and performing, in entropy encoding a predetermined entropy slice, entropy encoding of a current part of the predetermined entropy slice based on the respective probability estimations of the predetermined entropy slice as adapted using the previously encoded part of the predetermined entropy slice, and probability estimations as used in the entropy encoding of a spatially neighboring, in entropy slice order preceding entropy slice at a neighboring part of the spatially neighboring entropy slice.

22. Method for encoding a sequence of sample arrays into an entropy encoded data stream, comprising entropy encoding a current sample array of the sequence of sample arrays into a current frame of the entropy encoded data stream, performing the entropy encoding along an entropy coding path and using probability estimations, and adapting the probability estimations along the entropy coding path using a previously encoded part of the current frame, wherein the method comprises initializing or determining the probability estimations for the current frame based on probability estimations used in encoding a previously encoded frame of the entropy encoded data stream.

23. Computer program having a program code configured to perform, when running on a computer, a method according to any of claims 19 to 22.
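Purely as an informal illustration (the claims alone define the scope), the chunk hand-over of claim 11, the temporal initialization of claims 13, 14, 20 and 22, and the prediction-reference selection of claim 16 can be sketched in the same simplified single-float model used earlier; every identifier below is a hypothetical choice, not a name from the claims or from any reference implementation.

    def adapt(p, bit, rate=0.05):
        # Simplified stand-in for the CABAC state transition.
        return (1 - rate) * p + rate * bit

    def decode_chunk(bits, p):
        # Claim 11: decode one chunk while adapting the state; the state at
        # the end of the chunk is handed over to the next chunk of the slice.
        for bit in bits:
            p = adapt(p, bit)
        return p

    def decode_sequence(frames_of_bits, default_p=0.5):
        # Claims 13/14 (and methods 20/22): each frame starts from the state
        # left behind at the end of the previously decoded frame.
        p_carry = default_p
        for frame in frames_of_bits:
            p = p_carry
            for bit in frame:
                p = adapt(p, bit)
            p_carry = p
        return p_carry

    def pick_temporal_state(saved_states, block_xy, motion_vector):
        # Claim 16: a prediction reference (here a motion vector) selects the
        # spatially corresponding block of the previously decoded frame.
        x, y = block_xy
        dx, dy = motion_vector
        return saved_states.get((x + dx, y + dy))   # None if unavailable

    p1 = decode_chunk([1, 1, 0, 1], 0.5)
    p2 = decode_chunk([1, 0, 1, 1], p1)             # hand-over between chunks
    print(p1, p2)
    print(decode_sequence([[1, 1, 0, 1], [1, 0, 1, 1]]))
    print(pick_temporal_state({(0, 0): 0.62}, (1, 0), (-1, 0)))  # -> 0.62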

Documents

Application Documents

# Name Date
1 3778-KOLNP-2013-PROOF OF ALTERATION [23-11-2025(online)].pdf 2025-11-23
2 3778-KOLNP-2013-RELEVANT DOCUMENTS [10-08-2023(online)].pdf 2023-08-10
3 3778-KOLNP-2013-PROOF OF ALTERATION [10-09-2022(online)].pdf 2022-09-10
4 3778-KOLNP-2013-RELEVANT DOCUMENTS [05-09-2022(online)].pdf 2022-09-05
5 3778-KOLNP-2013-RELEVANT DOCUMENTS [01-10-2021(online)].pdf 2021-10-01
6 3778-KOLNP-2013-RELEVANT DOCUMENTS [04-05-2020(online)].pdf 2020-05-04
7 3778-KOLNP-2013-REQUEST FOR CERTIFIED COPY [21-06-2019(online)].pdf 2019-06-21
8 3778-KOLNP-2013-IntimationOfGrant06-06-2019.pdf 2019-06-06
9 3778-KOLNP-2013-PatentCertificate06-06-2019.pdf 2019-06-06
10 3778-KOLNP-2013-Information under section 8(2) (MANDATORY) [01-05-2019(online)].pdf 2019-05-01
11 3778-KOLNP-2013-Information under section 8(2) (MANDATORY) [25-03-2019(online)].pdf 2019-03-25
12 3778-KOLNP-2013-Annexure [26-12-2018(online)].pdf 2018-12-26
13 3778-KOLNP-2013-CLAIMS [26-12-2018(online)].pdf 2018-12-26
14 3778-KOLNP-2013-CORRESPONDENCE [26-12-2018(online)].pdf 2018-12-26
15 3778-KOLNP-2013-FER_SER_REPLY [26-12-2018(online)].pdf 2018-12-26
16 3778-KOLNP-2013-FORM 13 [26-12-2018(online)].pdf 2018-12-26
17 3778-KOLNP-2013-OTHERS [26-12-2018(online)].pdf 2018-12-26
18 3778-KOLNP-2013-PETITION UNDER RULE 137 [26-12-2018(online)].pdf 2018-12-26
19 3778-KOLNP-2013-Information under section 8(2) (MANDATORY) [10-07-2018(online)].pdf 2018-07-10
20 3778-KOLNP-2013-FER.pdf 2018-06-26
21 3778-KOLNP-2013-Information under section 8(2) (MANDATORY) [09-03-2018(online)].pdf 2018-03-09
22 3778-KOLNP-2013-Information under section 8(2) (MANDATORY) [08-01-2018(online)].pdf 2018-01-08
23 3778-KOLNP-2013-Information under section 8(2) (MANDATORY) [20-12-2017(online)].pdf 2017-12-20
24 3778-KOLNP-2013-Information under section 8(2) (MANDATORY) [13-10-2017(online)].pdf 2017-10-13
25 3778-KOLNP-2013-Information under section 8(2) (MANDATORY) [28-08-2017(online)].pdf 2017-08-28
26 3778-KOLNP-2013-Information under section 8(2) (MANDATORY) [18-07-2017(online)].pdf 2017-07-18
27 3778-KOLNP-2013-Information under section 8(2) (MANDATORY) [18-07-2017(online)].pdf_1.pdf 2017-07-18
28 Information under section 8(2) [13-06-2017(online)].pdf 2017-06-13
29 Other Patent Document [27-01-2017(online)].pdf 2017-01-27
30 Other Patent Document [07-11-2016(online)].pdf 2016-11-07
31 Other Patent Document [18-10-2016(online)].pdf 2016-10-18
32 Other Patent Document [25-08-2016(online)].pdf 2016-08-25
33 Other Patent Document [21-07-2016(online)].pdf 2016-07-21
34 3778-KOLNP-2013-(22-04-2016)-OTHERS.pdf 2016-04-22
35 3778-KOLNP-2013-(22-04-2016)-CORRESPONDENCE.pdf 2016-04-22
36 3778-KOLNP-2013-(09-03-2016)-OTHERS.pdf 2016-03-09
37 3778-KOLNP-2013-(09-03-2016)-CORRESPONDENCE.pdf 2016-03-09
38 3778-KOLNP-2013-(07-09-2015)-GPA.pdf 2015-09-07
39 3778-KOLNP-2013-(07-09-2015)-FORM-6.pdf 2015-09-07
40 3778-KOLNP-2013-(07-09-2015)-FORM-5.pdf 2015-09-07
41 3778-KOLNP-2013-(07-09-2015)-FORM-3.pdf 2015-09-07
42 3778-KOLNP-2013-(07-09-2015)-FORM-2.pdf 2015-09-07
43 3778-KOLNP-2013-(07-09-2015)-DRAWINGS.pdf 2015-09-07
44 3778-KOLNP-2013-(07-09-2015)-CORRESPONDENCE.pdf 2015-09-07
45 3778-KOLNP-2013-(07-09-2015)-ASSIGNMENT.pdf 2015-09-07
46 3778-KOLNP-2013-(23-06-2014)-FORM-3.pdf 2014-06-23
47 3778-KOLNP-2013-(23-06-2014)-CORRESPONDENCE.pdf 2014-06-23
48 3778-KOLNP-2013-FORM-18.pdf 2014-05-17
49 3778-KOLNP-2013-(02-04-2014)-CORRESPONDENCE.pdf 2014-04-02
50 3778-KOLNP-2013-(02-04-2014)-ANNEXURE TO FORM 3.pdf 2014-04-02
51 3778-KOLNP-2013.pdf 2014-03-06
52 3778-KOLNP-2013-(18-02-2014)-PA.pdf 2014-02-18
53 3778-KOLNP-2013-(18-02-2014)-CORRESPONDENCE.pdf 2014-02-18
54 3778-KOLNP-2013-(18-02-2014)-ASSIGNMENT.pdf 2014-02-18
55 FOA.pdf 2013-12-31
56 F5.pdf 2013-12-31
57 F3.pdf 2013-12-31
58 F2.pdf 2013-12-31
59 DW.pdf 2013-12-31

Search Strategy

1 searchstrategy_24-04-2018.pdf

ERegister / Renewals

3rd: 05 Jul 2019 (covering 16/07/2014 to 16/07/2015)
4th: 05 Jul 2019 (covering 16/07/2015 to 16/07/2016)
5th: 05 Jul 2019 (covering 16/07/2016 to 16/07/2017)
6th: 05 Jul 2019 (covering 16/07/2017 to 16/07/2018)
7th: 05 Jul 2019 (covering 16/07/2018 to 16/07/2019)
8th: 05 Jul 2019 (covering 16/07/2019 to 16/07/2020)
9th: 15 Jul 2020 (covering 16/07/2020 to 16/07/2021)
10th: 13 Jul 2021 (covering 16/07/2021 to 16/07/2022)
11th: 11 Jul 2022 (covering 16/07/2022 to 16/07/2023)
12th: 13 Jul 2023 (covering 16/07/2023 to 16/07/2024)
13th: 15 Jul 2024 (covering 16/07/2024 to 16/07/2025)
14th: 14 Jul 2025 (covering 16/07/2025 to 16/07/2026)