
Apparatus And Method For Rendering An Audio Signal For A Playback To A User

Abstract: An apparatus (10) for rendering an audio signal for a playback to a user, wherein the apparatus (10) is configured to determine information about an orientation of a head of the user using an optical sensor (12); wherein the apparatus (10) is configured to determine information about an orientation of the optical sensor (12) using an orientation sensor (14) which is arranged in a predetermined positional relationship with respect to the optical sensor (12); wherein the apparatus (10) is configured to consider the information about the orientation of the optical sensor (12) when determining the information about the orientation of the head; wherein the apparatus (10) is configured to perform a spatial rendering of an audio signal in dependence on the information about the orientation of the head of the user.


Patent Information

Application #: 202017046185
Filing Date: 22 October 2020
Publication Number: 06/2021
Publication Type: INA
Invention Field: ELECTRONICS
Status:
Email: IPRDEL@LAKSHMISRI.COM
Parent Application:

Applicants

FRAUNHOFER-GESELLSCHAFT ZUR FÖRDERUNG DER ANGEWANDTEN FORSCHUNG E.V.
Hansastraße 27c 80686 München

Inventors

1. HÄUSSLER, Dominik
c/o Fraunhofer-Institut für Integrierte Schaltungen IIS Am Wolfsmantel 33 91058 Erlangen
2. MELVILLE, Frederick
c/o Fraunhofer-Institut für Integrierte Schaltungen IIS Am Wolfsmantel 33 91058 Erlangen
3. ROSENBERGER, Dennis
c/o Fraunhofer-Institut für Integrierte Schaltungen IIS Am Wolfsmantel 33 91058 Erlangen
4. DÖHLA, Stefan
c/o Fraunhofer-Institut für Integrierte Schaltungen IIS Am Wolfsmantel 33 91058 Erlangen

Specification

APPARATUS AND METHOD FOR RENDERING AN AUDIO SIGNAL FOR A PLAYBACK TO A USER

Description

The present invention relates to an apparatus for rendering an audio signal, more specifically, an apparatus which is configured to perform a spatial rendering or sound field rendering of the audio signal of acoustic communication.

Spatial audio processing for binaural rendering of spatial audio data has been widely adopted for headphone use in video gaming and virtual reality (VR), but has yet to break into other applications such as audio communications, e.g., voice calls, conferencing and standard (i.e., non-360-degree) video consumption. Though some applications using static binaural rendering of spatial audio data exist, user acceptance seems limited. The reason behind this is hypothesized to be that, for spatial audio to be convincing, live positional information of the user's perspective must be actively applied during spatial processing. In order for the brain to be successfully tricked, the audio must respond with low latency to even the smallest adjustments of the head position. In a phone call, the remote participant(s)/user(s) can be rendered as a mono object (per participant/user), each with a unique three-dimensional position (e.g., spread out horizontally in front of the participant/user as listener) in order to give a realistic same-room feeling.

VR experiences with headphones achieve this using head tracking data (e.g., in the form of pitch angle, yaw angle, roll angle or as quaternions) obtained from inertial measurement units (IMU), including data from sensors, e.g., gyroscopes and accelerometers, within the user's head-mounted display (HMD). If such sensors were commonly found in consumer headphones, then everyday applications such as phone calls could also benefit from head-tracked spatial processing, but there are currently very few standalone headphones known with these sensors built in, and even fewer that make this data readily accessible to developers.

For example, using the video feed of a camera to extract head tracking data, and using this data for binaural rendering of an audio signal, has already been done on desktop computers in combination with the Microsoft™ Kinect™ camera (see, for instance, Kronlacher, M. (2013), Ambisonics plug-in suite for production and performance usage, retrieved from http://lac.linuxaudio.org/2013/papers/51.pdf). In addition, head tracking data extraction from the video feed of a common webcam is also known (see, for example, Lambers, 2017, https://github.com/marlam/webcam-head-tracker, and FaceTrackNoIR, 2010, https://git.marlam.de/gitweb/?p=webcam-head-tracker.git), but these do not propose to use it for spatial rendering of an audio signal. Furthermore, US 2009/0219224 A1 discloses a system for rendering a virtual environment in a multimedia application which relates to head tracking with a mobile device and adaptive visual audio/video scenes.

However, even considering the above-mentioned known technologies, certain problems are not yet solved, for example how to compensate for motion of the sensor itself, such as usage in a dynamic mobile scenario (e.g., the user walking around or riding in a moving vehicle). It is thus the object of the present invention to provide a concept for accurate and low-latency adjustment for rendering an audio signal for a playback to a user, doing so robustly in a multitude of scenarios.
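A minimal sketch may help to make the head-tracked processing discussed above concrete: a world-fixed talker position is re-expressed in the listener's head frame by applying the inverse head rotation, so the direction used for binaural panning counter-rotates as the head turns. The snippet is illustrative only; the function names and the x-forward/y-left/z-up convention are assumptions, not part of the application.

```python
import numpy as np

def yaw_pitch_roll_to_matrix(yaw, pitch, roll):
    """Rotation matrix for intrinsic yaw (about z), pitch (about y), roll (about x), in radians."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0.0], [sy, cy, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]])
    return Rz @ Ry @ Rx

def head_relative_direction(source_pos, head_rotation):
    """Direction of a world-fixed source as seen from the rotated head (x forward, y left, z up)."""
    v = head_rotation.T @ source_pos                      # inverse rotation = transpose
    azimuth = np.degrees(np.arctan2(v[1], v[0]))          # positive to the listener's left
    elevation = np.degrees(np.arcsin(v[2] / np.linalg.norm(v)))
    return azimuth, elevation                             # e.g., used to select an HRTF pair

# Example: a remote talker is placed 30 degrees to the left; when the listener also
# turns the head 30 degrees to the left, the talker ends up straight ahead (~0, ~0).
source_pos = np.array([np.cos(np.radians(30)), np.sin(np.radians(30)), 0.0])
R_head = yaw_pitch_roll_to_matrix(np.radians(30), 0.0, 0.0)
print(head_relative_direction(source_pos, R_head))
```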
This object is achieved by the subject-matter of an apparatus for rendering an audio signal for a playback to a user according to claim 1, a method for rendering an audio signal for a playback to a user according to claim 23 and a computer program according to claim 24 of the present application.

According to the present invention, an apparatus comprises an optical sensor and an orientation sensor for determining a head position of a user. The apparatus, e.g., a device, is therefore able to determine the position of the head of the user by referencing the positional relationship between the optical sensor and the orientation sensor, and hence it is possible to accurately determine the position of the head of the user. In addition, using the accurately determined position of the head of the user, it is possible to implement a low-latency adjustment for the spatial rendering and to improve the user experience.

In accordance with embodiments of the present application, an apparatus for rendering an audio signal for a playback to a user is provided, wherein the apparatus is configured to determine information about an orientation of a head of the user using an optical sensor, e.g., using a camera or a user-facing moving image capture device, and/or using a depth sensor and/or using a visual face/head tracking sensor, for example, using camera-captured data for head tracking; wherein the apparatus is configured to determine information about an orientation of the optical sensor using an orientation sensor, for example, a gyroscope and/or a magnetic field sensor and/or a gravity sensor and/or an accelerometer and/or an optical sensor, etc., which is arranged in a predetermined positional relationship, e.g., a mechanical relationship, with respect to the optical sensor, for example, to enable the apparatus to be aware of its position and/or orientation in a "real world" or Earth-fixed coordinate system; wherein the apparatus is configured to consider the information about the orientation of the optical sensor when determining the information about the orientation of the head, for example, to obtain at least one parameter about the orientation of the head with respect to an Earth-fixed coordinate system, substantially independent from a current orientation of the optical sensor or from the orientation of the apparatus carrying or comprising the optical sensor; wherein the apparatus is configured to perform a spatial rendering of an audio signal, for example, for playback to the user via a speaker system or via a headset which is in communication with the apparatus, in dependence on the information about the orientation of the head of the user, for example, to adapt a virtual audio environment in dependence on the information about the orientation of the head of the user.

In accordance with embodiments of the present application, the apparatus is configured to perform a binaural rendering, e.g., for a headset worn by the user, or, e.g., of spatial audio data, in dependence on the information about the orientation of the head of the user, for example, considering a yaw angle or an azimuth angle between a head front direction of the user (e.g., a direction into which the user's eyes or nose are pointing) and a direction from the user's head towards the apparatus, or towards the optical sensor included within the apparatus, or towards a display of the apparatus, and/or considering a roll angle of the head of the user, and/or considering a pitch angle of the head of the user.
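The compensation step described above, i.e. taking the optical sensor's own orientation into account when determining the head orientation, can be sketched as a composition of two rotations: the camera-relative head pose delivered by a visual face tracker and the device orientation delivered by the orientation sensor. The following snippet is a hedged illustration under assumed names and quaternion conventions, not the claimed implementation.

```python
from scipy.spatial.transform import Rotation as R

def head_orientation_world(device_quat_xyzw, head_in_camera_quat_xyzw):
    """Compose the sensor-derived device orientation with the camera-relative head pose."""
    R_device_to_world = R.from_quat(device_quat_xyzw)          # e.g., gyroscope/accelerometer/magnetometer fusion
    R_head_in_camera = R.from_quat(head_in_camera_quat_xyzw)   # e.g., from the visual face/head tracker
    # Earth-fixed head orientation, largely independent of how the device is held
    return R_device_to_world * R_head_in_camera

# Example: the device is rolled by 20 degrees while the head is actually upright,
# so the tracker reports a -20 degree roll relative to the camera; the composition
# recovers an (approximately) upright head in the Earth-fixed frame.
device = R.from_euler('x', 20, degrees=True).as_quat()
head_cam = R.from_euler('x', -20, degrees=True).as_quat()
print(head_orientation_world(device, head_cam).as_euler('xyz', degrees=True))  # ~ [0, 0, 0]
```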
In accordance with embodiments of the present application, the apparatus comprises the optical sensor, e.g., a camera or a user-facing moving image capture device, and/or a depth sensor, wherein the optical sensor is arranged to track a head of the user, e.g., a position of the user's face, e.g., when the user is looking at a display of the apparatus.

In accordance with embodiments of the present application, the apparatus is configured to determine, for example, as a part of the information about the orientation of the head of the user, a yaw angle information, for example, an angle value or a rotation matrix or a quaternion, describing an angle between a head front direction of the head of the user and a position of the apparatus, or, equivalently, a direction from the user's head to the apparatus or to the optical sensor; and/or wherein the apparatus is configured to determine, for example, as a part of the information about the orientation of the head of the user, a roll angle information, for example, an angle value or a rotation matrix or a quaternion, describing a roll angle of the head of the user, e.g., with respect to a vertical direction, e.g., with respect to a direction of gravity; and/or wherein the apparatus is configured to determine, for example, as a part of the information about the orientation of the head of the user, a pitch angle information, for example, an angle value or a rotation matrix or a quaternion, describing a pitch angle of the head of the user, e.g., with respect to a horizontal alignment.

In accordance with embodiments of the present application, the apparatus is configured to determine, for example, as a part of the information about the orientation of the head of the user, a yaw angle information
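The yaw angle information discussed above, i.e. the angle between the head front direction and the direction from the user's head towards the apparatus or the optical sensor, can be illustrated by a small helper; the vectors, names and sign convention are assumptions made only for this sketch.

```python
import numpy as np

def yaw_between(head_front, head_to_device):
    """Signed yaw angle in degrees, measured about the vertical (z) axis, between two directions."""
    # Project both directions onto the horizontal plane before measuring the angle.
    ang = np.arctan2(head_to_device[1], head_to_device[0]) - np.arctan2(head_front[1], head_front[0])
    return np.degrees((ang + np.pi) % (2.0 * np.pi) - np.pi)  # wrap to [-180, 180)

# Example: the user looks straight ahead (+x) while the device is held 25 degrees to the left.
print(yaw_between([1.0, 0.0, 0.0],
                  [np.cos(np.radians(25)), np.sin(np.radians(25)), 0.0]))  # ~ 25.0
```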

Documents

Application Documents

# Name Date
1 202017046185-Written submissions and relevant documents [24-02-2025(online)].pdf 2025-02-24
2 202017046185-FORM-26 [07-02-2025(online)].pdf 2025-02-07
3 202017046185-Correspondence to notify the Controller [24-01-2025(online)].pdf 2025-01-24
4 202017046185-Correspondence to notify the Controller [24-01-2025(online)]-1.pdf 2025-01-24
5 202017046185-US(14)-HearingNotice-(HearingDate-11-02-2025).pdf 2025-01-23
6 202017046185-FORM 3 [20-09-2023(online)].pdf 2023-09-20
7 202017046185-FORM 3 [20-09-2022(online)].pdf 2022-09-20
8 202017046185-Information under section 8(2) [20-09-2022(online)].pdf 2022-09-20
9 202017046185-FORM 3 [09-03-2022(online)].pdf 2022-03-09
10 202017046185-CLAIMS [14-12-2021(online)].pdf 2021-12-14
11 202017046185-FER_SER_REPLY [14-12-2021(online)].pdf 2021-12-14
12 202017046185-Information under section 8(2) [14-12-2021(online)].pdf 2021-12-14
13 202017046185-OTHERS [14-12-2021(online)].pdf 2021-12-14
14 202017046185-FER.pdf 2021-10-19
15 202017046185.pdf 2021-10-19
16 202017046185-FORM 3 [10-03-2021(online)].pdf 2021-03-10
17 202017046185-FORM-26 [11-01-2021(online)].pdf 2021-01-11
18 202017046185-Proof of Right [11-01-2021(online)].pdf 2021-01-11
19 202017046185-COMPLETE SPECIFICATION [22-10-2020(online)].pdf 2020-10-22
20 202017046185-DECLARATION OF INVENTORSHIP (FORM 5) [22-10-2020(online)].pdf 2020-10-22
21 202017046185-DRAWINGS [22-10-2020(online)].pdf 2020-10-22
22 202017046185-FORM 1 [22-10-2020(online)].pdf 2020-10-22
23 202017046185-FORM 18 [22-10-2020(online)].pdf 2020-10-22
24 202017046185-NOTIFICATION OF INT. APPLN. NO. & FILING DATE (PCT-RO-105) [22-10-2020(online)].pdf 2020-10-22
25 202017046185-REQUEST FOR EXAMINATION (FORM-18) [22-10-2020(online)].pdf 2020-10-22
26 202017046185-STATEMENT OF UNDERTAKING (FORM 3) [22-10-2020(online)].pdf 2020-10-22

Search Strategy

1 totalpatentoneE_14-06-2021.pdf