
Neural Network Representation Formats

Abstract: Data stream (45) having a representation of a neural network (10) encoded thereinto, the data stream (45) comprising a serialization parameter (102) indicating a coding order (104) in which neural network parameters (32), which define neuron interconnections (22, 24) of the neural network (10), are encoded into the data stream (45).
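As an illustration of the abstract: the Python sketch below shows one way a serialization parameter signalling a coding order could be written ahead of the neural network parameters it governs. All names here (CODING_ORDERS, encode_stream, decode_stream) and the one-byte layout are hypothetical assumptions for illustration, not the bitstream syntax defined in the specification.

```python
import struct
import numpy as np

# Hypothetical coding orders; the actual set is defined by the specification.
CODING_ORDERS = {0: "row_major", 1: "column_major"}

def encode_stream(weights: np.ndarray, coding_order: int) -> bytes:
    """Prepend a serialization parameter (cf. element (102)) indicating
    the coding order (cf. (104)) of the neural network parameters (32)."""
    if coding_order not in CODING_ORDERS:
        raise ValueError(f"unknown coding order {coding_order}")
    order = "C" if CODING_ORDERS[coding_order] == "row_major" else "F"
    payload = np.asarray(weights, dtype=np.float32).tobytes(order=order)
    # One byte of serialization parameter, then the parameter payload.
    return struct.pack("<B", coding_order) + payload

def decode_stream(stream: bytes, shape: tuple) -> np.ndarray:
    """Read the coding order first, then deserialize the parameters
    in the order it signals."""
    (coding_order,) = struct.unpack_from("<B", stream, 0)
    order = "C" if CODING_ORDERS[coding_order] == "row_major" else "F"
    flat = np.frombuffer(stream, dtype=np.float32, offset=1)
    return flat.reshape(shape, order=order)
```

Signalling the coding order explicitly, rather than fixing it by convention, is what lets an encoder pick whichever traversal of the weight tensors compresses best while the decoder reconstructs them unambiguously.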


Patent Information

Application #: 202518071105
Filing Date: 25 July 2025
Publication Number: 33/2025
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Status:
Parent Application:

Applicants

FRAUNHOFER-GESELLSCHAFT ZUR FÖRDERUNG DER ANGEWANDTEN FORSCHUNG E.V.
Hansastraße 27c 80686 München, Germany

Inventors

1. MATLAGE, Stefan
Grazer Damm 115 12157 Berlin, Germany
2. HAASE, Paul
c/o Fraunhofer-Institut für Nachrichtentechnik, Heinrich-Hertz-Institut, HHI Einsteinufer 37 10587 Berlin, Germany
3. KIRCHHOFFER, Heiner
c/o Fraunhofer-Institut für Nachrichtentechnik, Heinrich-Hertz-Institut, HHI Einsteinufer 37 10587 Berlin, Germany
4. MÜLLER, Karsten
c/o Fraunhofer-Institut für Nachrichtentechnik, Heinrich-Hertz-Institut, HHI Einsteinufer 37 10587 Berlin, Germany
5. SAMEK, Wojciech
c/o Fraunhofer-Institut für Nachrichtentechnik, Heinrich-Hertz-Institut, HHI Einsteinufer 37 10587 Berlin, Germany
6. WIEDEMANN, Simon
c/o Fraunhofer-Institut für Nachrichtentechnik, Heinrich-Hertz-Institut, HHI Einsteinufer 37 10587 Berlin, Germany
7. MARPE, Detlev
c/o Fraunhofer-Institut für Nachrichtentechnik, Heinrich-Hertz-Institut, HHI Einsteinufer 37 10587 Berlin, Germany
8. SCHIERL, Thomas
c/o Fraunhofer-Institut für Nachrichtentechnik, Heinrich-Hertz-Institut, HHI Einsteinufer 37 10587 Berlin, Germany
9. SÁNCHEZ DE LA FUENTE, Yago
c/o Fraunhofer-Institut für Nachrichtentechnik, Heinrich-Hertz-Institut, HHI Einsteinufer 37 10587 Berlin, Germany
10. SKUPIN, Robert
c/o Fraunhofer-Institut für Nachrichtentechnik, Heinrich-Hertz-Institut, HHI Einsteinufer 37 10587 Berlin, Germany
11. WIEGAND, Thomas
c/o Fraunhofer-Institut für Nachrichtentechnik, Heinrich-Hertz-Institut, HHI Einsteinufer 37 10587 Berlin, Germany

Specification

Description: AS ATTACHED

Claims:

I/We Claim:
1. Data stream (45) having a representation of a neural network (10) encoded thereinto, wherein the data stream (45) is structured into individually accessible portions (200), each portion representing a corresponding neural network portion of the neural network, wherein the data stream (45) comprises, for each of one or more predetermined individually accessible portions (200), supplemental data (350) for supplementing the representation of the neural network.
2. Apparatus for encoding a representation of a neural network (10) into a data stream (45), so that the data stream (45) is structured into individually accessible portions (200), each portion representing a corresponding neural network portion of the neural network, wherein the apparatus is configured to provide the data stream (45) with, for each of one or more predetermined individually accessible portions (200), supplemental data (350) for supplementing the representation of the neural network.
3. Apparatus for decoding a representation of a neural network (10) from a data stream (45), wherein the data stream (45) is structured into individually accessible portions (200), each portion representing a corresponding neural network portion of the neural network, wherein the apparatus is configured to decode from the data stream (45), for each of one or more predetermined individually accessible portions (200), supplemental data (350) for supplementing the representation of the neural network.
4. Apparatus of claim 3, wherein the data stream (45) indicates the
supplemental data (350) as being dispensable for inference based on the
neural network.
5. Apparatus of claim 3 or claim 4, wherein the apparatus is configured to decode the supplemental data (350) for supplementing the representation of the neural network for the one or more predetermined individually accessible portions (200) from further individually accessible portions, wherein the data stream (45) comprises, for each of the one or more predetermined individually accessible portions, a corresponding further predetermined individually accessible portion relating to the neural network portion to which the respective predetermined individually accessible portion corresponds.
6. Apparatus of any of claims 3 to 5, wherein the neural network portions comprise neural network layers (210, 30) of the neural network and/or layer portions into which a predetermined neural network layer of the neural network is subdivided.
7. Apparatus of any of claims 3 to 6, wherein the apparatus is configured to decode the individually accessible portions (200) using context-adaptive arithmetic decoding and using context initialization at a start of each individually accessible portion.
8. Apparatus of any of claims 3 to 7, wherein the supplemental data (350) relates to relevance scores of neural network parameters (32), and/or perturbation robustness of neural network parameters (32).
9. Apparatus of any of claims 3 to 8, for decoding a representation of a neural network (10) from a data stream (45), wherein the apparatus is configured to decode from the data stream (45) hierarchical control data (400) structured into a sequence (410) of control data portions (420), wherein the control data portions provide information on the neural network at increasing detail along the sequence of control data portions.
10. Apparatus of claim 9, wherein at least some of the control data portions (420) provide information on the neural network which is partially redundant.
11. Apparatus of claim 9 or claim 10, wherein a first control data portion provides the information on the neural network by way of indicating a default neural network type implying default settings and a second control data portion comprises a parameter to indicate each of the default settings.
12. Apparatus for performing an inference using a neural network, comprising
an apparatus for decoding a data stream (45) according to any of claims 3 to 11, so as to derive from the data stream (45) the neural network, and
a processor configured to perform the inference based on the neural network.
13. Method for encoding a representation of a neural network into a data stream, so that the data stream is structured into individually accessible portions, each portion representing a corresponding neural network portion of the neural network, wherein the method comprises providing the data stream with, for each of one or more predetermined individually accessible portions, supplemental data for supplementing the representation of the neural network.
14. Method for decoding a representation of a neural network from a data stream, wherein the data stream is structured into individually accessible portions, each portion representing a corresponding neural network portion of the neural network, wherein the method comprises decoding from the data stream, for each of one or more predetermined individually accessible portions, supplemental data for supplementing the representation of the neural network.
15. Computer program for, when executed by a computer, causing the computer
to perform the method of claim 13 or claim 14.
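
Read together, claims 1 to 8 describe a container in which each neural network portion (e.g., a layer) travels as an individually accessible portion, optionally accompanied by supplemental data such as relevance scores that a decoder may skip for pure inference. The Python sketch below is a minimal illustration of that structure under assumed names (Portion, pack_portions, iter_portions) and an assumed length-prefixed layout; it is not the bitstream syntax of the actual specification, and the context-adaptive arithmetic coding of claim 7 is reduced to a comment marking where context initialization would occur.

```python
import json
import struct
from dataclasses import dataclass, field

@dataclass
class Portion:
    """One individually accessible portion (200): a neural network
    portion (here, one layer) plus optional supplemental data (350)."""
    layer_name: str
    payload: bytes                                     # encoded layer parameters
    supplemental: dict = field(default_factory=dict)   # e.g. relevance scores

def pack_portions(portions: list[Portion]) -> bytes:
    """Length-prefix each portion so a decoder can seek to, or skip,
    any portion without decoding the others."""
    out = bytearray()
    for p in portions:
        header = json.dumps(
            {"layer": p.layer_name, "supplemental": p.supplemental}
        ).encode()
        # Layout per portion: <portion length><header length><header><payload>
        out += struct.pack("<II", len(header) + len(p.payload) + 4, len(header))
        out += header + p.payload
    return bytes(out)

def iter_portions(stream: bytes, want_supplemental: bool = True):
    """Walk the stream portion by portion. Supplemental data is
    dispensable for inference (claim 4), so a decoder may ignore it."""
    pos = 0
    while pos < len(stream):
        total, hlen = struct.unpack_from("<II", stream, pos)
        header = json.loads(stream[pos + 8 : pos + 8 + hlen])
        payload = stream[pos + 8 + hlen : pos + 4 + total]
        # Per claim 7, an entropy decoder would (re)initialize its coding
        # contexts here, at the start of each individually accessible portion.
        supp = header["supplemental"] if want_supplemental else {}
        yield header["layer"], payload, supp
        pos += 4 + total

# Usage: pack two layers, then decode while skipping supplemental data.
stream = pack_portions([
    Portion("conv1", b"\x01\x02", {"relevance": [0.9, 0.1]}),
    Portion("fc1", b"\x03\x04"),
])
for name, payload, supp in iter_portions(stream, want_supplemental=False):
    print(name, len(payload), supp)
```

Initializing the entropy-coding contexts at each portion boundary is what makes the portions individually accessible: no portion's decoding state depends on a portion the decoder chose to skip.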

Documents

Application Documents

# Name Date
1 202518071105-STATEMENT OF UNDERTAKING (FORM 3) [25-07-2025(online)].pdf 2025-07-25
2 202518071105-REQUEST FOR EXAMINATION (FORM-18) [25-07-2025(online)].pdf 2025-07-25
3 202518071105-POWER OF AUTHORITY [25-07-2025(online)].pdf 2025-07-25
4 202518071105-FORM 18 [25-07-2025(online)].pdf 2025-07-25
5 202518071105-FORM 1 [25-07-2025(online)].pdf 2025-07-25
6 202518071105-DRAWINGS [25-07-2025(online)].pdf 2025-07-25
7 202518071105-DECLARATION OF INVENTORSHIP (FORM 5) [25-07-2025(online)].pdf 2025-07-25
8 202518071105-COMPLETE SPECIFICATION [25-07-2025(online)].pdf 2025-07-25
9 202518071105-Proof of Right [13-08-2025(online)].pdf 2025-08-13