
Systems And Methods For Virtual Projection From Exocentric Viewpoint

Abstract: An airborne platform includes an airframe, a first sensor at a first pose on the airframe, a second sensor at a second pose on the airframe, a processing circuit, and a display. The first sensor acquires first information regarding an environment surrounding the airframe. The second sensor acquires second information regarding the environment including at least a portion of the environment not included by the first sensor. The processing circuit is configured to receive the first information from the first sensor and the second information from the second sensor, retrieve pose data including the first pose and the second pose, and a three-dimensional model of the airborne platform, and generate a visual rendering of the environment based on the first information, the second information, the pose data, and the model such that the airborne platform is visible from an exocentric perspective. The display is configured to display the visual rendering.


Patent Information

Application #: 201611030255
Filing Date: 05 September 2016
Publication Number: 10/2018
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Status:
Email: patents@remfry.com
Parent Application:

Applicants

ROCKWELL COLLINS, INC.
400 Collins Road NE, M/S 124-323, Cedar Rapids, Iowa 52498, USA,

Inventors

1. Shibarchi Majumder
Kheya Abasan, 1st Floor, Bandhgora More, Bolpur West Bengal 731204, India

Specification

SYSTEMS AND METHODS FOR VIRTUAL PROJECTION FROM
EXOCENTRIC VIEWPOINT
BACKGROUND
[0001] The inventive concepts disclosed herein relate generally to the field of avionics display
systems. More particularly, embodiments of the inventive concepts disclosed herein relate to
virtual projection of an airborne platform from an exocentric viewpoint for a display system.
[0002] A cockpit of an aircraft may include a display that shows an image of the aircraft
travelling through an environment surrounding the aircraft, where the environment is rendered
by simulation. For example, a synthetic vision system may provide a simulated visualization of
the aircraft while travelling through the air and/or over terrain based on information retrieved
from a pre-loaded database. However, the synthetic vision system does not provide real images
of the environment to an operator of the aircraft, making it less effective for helping the operator
to make flight decisions based on real-time features of the environment not reflected in the
database (e.g., when travelling through environments not mapped to the database) as well as
current weather conditions and visibility. In addition, the cockpit provides a limited range of
visibility that may not cover points of interest such as runways or other landing surfaces, and
may be obscured by weather conditions.
SUMMARY
[0003] In one aspect, the inventive concepts disclosed herein are directed to an airborne
platform. The airborne platform includes an airframe, a first sensor at a first pose on the
airframe, a second sensor at a second pose on the airframe, a processing circuit, and a display.
The first pose includes at least one of a first position or a first orientation of the first sensor. The
second pose includes at least one of a second position or a second orientation of the second
sensor. The first sensor acquires first information regarding an environment surrounding the
airframe. The second sensor acquires second information regarding the environment including at
least a portion of the environment not included by the first sensor. The processing circuit is
configured to receive the first information from the first sensor and the second information from
the second sensor, retrieve pose data including the first pose and the second pose and retrieve a
three-dimensional model of the airborne platform, and generate a visual rendering of the
environment based on the first information, the second information, the pose data, and the three-dimensional
model of the airborne platform, such that the airborne platform is visible from an
exocentric perspective relative to the environment. The display is configured to display the
visual rendering of the environment.
[0004] In a further aspect, the inventive concepts disclosed herein are directed to a display
system. The airborne platform display system includes a sensor interface, a processing circuit
communicatively coupled to the sensor interface, and a display. The sensor interface is
configured to receive first information regarding an environment surrounding an airframe from a
first sensor, and to receive second information regarding the environment from a second sensor.
The processing circuit is configured to retrieve pose data including a first pose of the first sensor
and a second pose of the second sensor and retrieve a three-dimensional model of the airborne
platform, and generate a visual rendering of the environment based on the first information, the
second information, the pose data, and the three-dimensional model, such that the airborne
platform is visible from an exocentric perspective relative to the environment. The display is
configured to display the visual rendering of the environment.
[0005] In a further aspect, the inventive concepts disclosed herein are directed to a method.
The method includes acquiring first information regarding an environment surrounding an
airframe of an airborne platform by a first sensor. The first sensor is at a first pose on the
airframe. The first pose includes at least one of a first position or a first orientation of the first
sensor. The method includes acquiring second information regarding the environment by a
second sensor. The second sensor is at a second pose on the airframe. The second pose includes
at least one of a second position or a second orientation of the second sensor. The second
information includes information regarding at least a portion of the environment not acquired by
the first sensor. The method includes generating a visual rendering of the environment based on
the first information, the second information, the first pose, the second pose, and a three-
dimensional model of the airborne platform, such that the airborne platform is visible from an
exocentric perspective relative to the environment. The method includes displaying the visual
rendering.
BRIEF DESCRIPTION OF THE DRAWINGS
[0006] Implementations of the inventive concepts disclosed herein may be better understood
when consideration is given to the following detailed description thereof. Such description
makes reference to the included drawings, which are not necessarily to scale, and in which some
features may be exaggerated and some features may be omitted or may be represented
schematically in the interest of clarity. Like reference numerals in the drawings may represent
and refer to the same or similar element, feature, or function. In the drawings:
[0007] FIG. 1 is a schematic illustration of an exemplary embodiment of an aircraft control
center according to the inventive concepts disclosed herein;
[0008] FIG. 2A is a top view of a schematic illustration of an exemplary embodiment of an
airborne platform according to the inventive concepts disclosed herein;
[0009] FIG. 2B is a front end view of an exemplary embodiment of an airborne platform
according to the inventive concepts disclosed herein;
[0010] FIG. 3 is a block diagram of an exemplary embodiment of a display system for an
airborne platform according to the inventive concepts described herein; and
[0011] FIG. 4 is a diagram of an exemplary embodiment of a method of generating a visual
rendering of an environment surrounding an airborne platform according to the inventive
concepts disclosed herein.
DETAILED DESCRIPTION
[0012] Before explaining at least one embodiment of the inventive concepts disclosed herein in
detail, it is to be understood that the inventive concepts are not limited in their application to the
details of construction and the arrangement of the components or steps or methodologies set
forth in the following description or illustrated in the drawings. In the following detailed
description of embodiments of the instant inventive concepts, numerous specific details are set
forth in order to provide a more thorough understanding of the inventive concepts. However, it
will be apparent to one of ordinary skill in the art having the benefit of the instant disclosure that
the inventive concepts disclosed herein may be practiced without these specific details. In other
instances, well-known features may not be described in detail to avoid unnecessarily
complicating the instant disclosure. The inventive concepts disclosed herein are capable of other
embodiments or of being practiced or carried out in various ways. Also, it is to be understood
that the phraseology and terminology employed herein is for the purpose of description and
should not be regarded as limiting.
[0013] As used herein a letter following a reference numeral is intended to reference an
embodiment of the feature or element that may be similar, but not necessarily identical, to a
previously described element or feature bearing the same reference numeral (e.g., 1, 1a, 1b).
Such shorthand notations are used for purposes of convenience only, and should not be construed
to limit the inventive concepts disclosed herein in any way unless expressly stated to the
contrary.
[0014] Further, unless expressly stated to the contrary, "or" refers to an inclusive or and not to
an exclusive or. For example, a condition A or B is satisfied by any one of the following: A is
true (or present) and B is false (or not present), A is false (or not present) and B is true (or
present), or both A and B are true (or present).
[0015] In addition, use of the "a" or "an" are employed to describe elements and components
of embodiments of the instant inventive concepts. This is done merely for convenience and to
give a general sense of the inventive concepts, and "a" and "an" are intended to include one or at
least one and the singular also includes the plural unless it is obvious that it is meant otherwise.
[0016] Finally, as used herein any reference to "one embodiment" or "some embodiments"
means that a particular element, feature, structure, or characteristic described in connection with
the embodiment is included in at least one embodiment of the inventive concepts disclosed
herein. The appearances of the phrase "in some embodiments" in various places in the
specification are not necessarily all referring to the same embodiment, and embodiments of the
inventive concepts disclosed may include one or more of the features expressly described or
inherently present herein, or any combination or sub-combination of two or more such features,
along with any other features which may not necessarily be expressly described or inherently
present in the instant disclosure.
[0017] Broadly, embodiments of the inventive concepts disclosed herein are directed to
systems and methods for virtual projection from an exocentric viewpoint. The inventive
concepts disclosed herein can be utilized in a number of display devices and systems for airborne
platforms (e.g., aircraft), including but not limited to flight control and autopilot systems,
navigation systems, and flight display systems. While the present disclosure describes systems
and methods implementable for an airborne platform, the inventive concepts disclosed herein
may be used in any type of environment (e.g., in another aircraft, a spacecraft, a ground-based
vehicle, or in a non-vehicle application such as a ground-based display system, an air traffic
control system, a radar system, a virtual display system).
In some embodiments, an airborne platform includes an airframe, a first sensor at a first pose on
the airframe, a second sensor at a second pose on the airframe, a processing circuit, and a
display. The first pose includes at least one of a first position or a first orientation of the first
sensor. The second pose includes at least one of a second position or a second orientation of the
second sensor. The first sensor acquires first information regarding an environment surrounding
the airframe. The second sensor acquires second information regarding the environment
including at least a portion of the environment not included by the first sensor. The processing
circuit is configured to receive the first information from the first sensor and the second
information from the second sensor, retrieve pose data including the first pose and the second
pose, and generate a visual rendering of the environment based on the first information, the
second information, the pose data, and a three-dimensional model of the airborne platform. The
display is configured to display the visual rendering of the environment. By providing a visual
rendering of an exocentric viewpoint of the airborne platform relative to the environment,
systems manufactured in accordance with the inventive concepts disclosed herein can provide an
operator of the airborne platform with an improved visualization for controlling the airborne
platform, including for performing landing and other difficult maneuvers in situations with low
visibility, where visibility of critical features of the environment (e.g., landing sites such as
runways) is unavailable through cockpit windows, and/or where a synthetic vision system fails to
provide an accurate visualization of the environment. Systems manufactured in accordance with
the inventive concepts disclosed herein can provide a real-time or near real-time visualization
of the airborne platform, including position and attitude of the airborne platform, in the context
of a real view of the surrounding environment (e.g., from an exocentric perspective or view
relative to the environment), to help aircraft crew orient the airborne platform; such systems can be useful
for remotely piloted aircraft, low altitude flights, unusual landings, cargo drop, aerial refueling,
and rotorcraft landing in un-cooperative environments, including improving orienting the
airborne platform with respect to geographic terrain and obstacles or with respect to other aircraft
or vehicles, as well as increasing situational awareness. An operator may change the exocentric
perspective and visualize such a virtually rendered aircraft with respect to the visualization of the
surrounding environment from various points of view.
[0018] Referring to FIG. 1, a perspective view schematic illustration of an aircraft control
center or cockpit 10 is shown according to an exemplary embodiment of the inventive concepts
disclosed herein. The aircraft control center 10 can be configured for an aircraft operator or
other user to interact with avionics systems of an airborne platform. The aircraft control center
10 may include one or more flight displays 20 and one or more user interface ("UI") elements
22. The flight displays 20 may be implemented using any of a variety of display technologies,
including CRT, LCD, organic LED, dot matrix display, and others. The flight displays 20 may
be navigation (NAV) displays, primary flight displays, electronic flight bag displays, tablets such
as iPad® computers manufactured by Apple, Inc. or other tablet computers, synthetic vision system
displays, head up displays (HUDs) with or without a projector, wearable displays, watches, or
Google Glass®. The flight displays 20 may be used to provide information to the flight crew,
thereby increasing visual range and enhancing decision-making abilities. One or more of the
flight displays 20 may be configured to function as, for example, a primary flight display (PFD)
used to display altitude, airspeed, vertical speed, and navigation and traffic collision avoidance
system (TCAS) advisories. One or more of the flight displays 20 may also be configured to
function as, for example, a multi-function display used to display navigation maps, weather
radar, electronic charts, TCAS traffic, aircraft maintenance data and electronic checklists,
manuals, and procedures. One or more of the flight displays 20 may also be configured to
function as, for example, an engine indicating and crew-alerting system (EICAS) display used to
display critical engine and system status data. Other types and functions of the flight displays 20
are contemplated as well. According to various exemplary embodiments of the inventive
concepts disclosed herein, at least one of the flight displays 20 may be configured to provide a
rendered display from the systems and methods of the inventive concepts disclosed herein.
[0019] In some embodiments, the flight displays 20 may provide an output based on data
received from a system external to an aircraft, such as a ground-based weather radar system,
satellite-based system, a sensor system, or from a system of another aircraft. In some
embodiments, the flight displays 20 may provide an output from an onboard aircraft-based
weather radar system, LIDAR system, infrared system or other system on an aircraft. For
example, the flight displays 20 may include a weather display, a weather radar map, and a terrain
display. In some embodiments, the flight displays 20 may provide an output based on a
combination of data received from multiple external systems or from at least one external system
and an onboard aircraft-based system. The flight displays 20 may include an electronic display
or a synthetic vision system (SVS). For example, the flight displays 20 may include a display
configured to display a two-dimensional (2-D) image, a three dimensional (3-D) perspective
image of terrain and/or weather information, or a four dimensional (4-D) display of weather
information or forecast information. Other views of terrain and/or weather information may also
be provided (e.g., plan view, horizontal view, vertical view). The views may include
monochrome or color graphical representations of the terrain and/or weather information.
Graphical representations of weather or terrain may include an indication of altitude of the
weather or terrain or the altitude relative to an aircraft. The flight displays 20 may receive image
information, such as a visual rendering of an environment surrounding the aircraft generated
based on information from sensors, and display the visual rendering with the aircraft shown from
an off-board or exocentric viewpoint.
[0020] The UI elements 22 may include, for example, dials, switches, buttons, touch screens,
keyboards, a mouse, joysticks, cursor control devices (CCDs), menus on Multi-Functional
Displays (MFDs), or other multi-function key pads certified for use with avionics systems. The
UI elements 22 may be configured to, for example, allow an aircraft crew member to interact
with various avionics applications and perform functions such as data entry, manipulation of
navigation maps, and moving among and selecting checklist items. For example, the UI
elements 22 may be used to adjust features of the flight displays 20, such as contrast, brightness,
width, and length. The UI elements 22 may also (or alternatively) be used by an aircraft crew
member to interface with or manipulate the displays of the flight displays 20. For example, the
UI elements 22 may be used by aircraft crew members to adjust the brightness, contrast, and
information displayed on the flight displays 20. The UI elements 22 may additionally be used to
acknowledge or dismiss an indicator provided by the flight displays 20. The UI elements 22 may
be used to correct errors on the flight displays 20. The UI elements 22 may also be used to adjust
the radar antenna tilt, radar display gain, and to select vertical sweep azimuths. Other UI
elements 22, such as indicator lights, displays, display elements, and audio alerting devices, may
be configured to warn of potentially threatening conditions such as severe weather, terrain, and
obstacles, such as potential collisions with other aircraft.
[0021] Referring now to FIGS. 2A-2B, an airborne platform 100 is shown according to the
inventive concepts disclosed herein. The airborne platform 100 can travel through an
environment surrounding the airborne platform 100 (e.g., a three-dimensional environment
including an airspace about the airborne platform 100 and surface features, such as runways,
ground, buildings, and mountains or other elevated structures). The airborne platform 100
includes an airframe 110 including a fuselage 112, at least two sensors 120, and a cockpit 124.
The airframe 110 includes an exterior structure of the airborne platform (e.g., wings, fuselage
112, tail, exterior of cockpit 124).
[0022] The cockpit 124 can be similar to and/or incorporate features of the cockpit or aircraft
control center 10 described with reference to FIG. 1. The cockpit 124 can define a field of view.
For example, if the cockpit 124 includes one or more exterior windows allowing for the
environment to be viewed through the windows, then the field of view can be defined as the
maximum portion (e.g., focal angle, view angle, light cone) of the environment visible through
the windows (or a portion of the environment visible through the windows from a position at
which an operator of the airborne platform 100, such as a pilot or co-pilot, is seated or otherwise
typically positioned within the cockpit 124).
[0023] As shown in FIGS. 2A-2B, the airborne platform 100 includes three sensors 120a,
120b, and 120c on the airframe 110. Each sensor 120 is at a corresponding pose. The pose can
include at least one of a position or an orientation of the sensor 120. For example, given a
coordinate frame of reference (e.g., Cartesian coordinate frame of reference) for the airborne
platform 100 and/or the sensors 120, each sensor 120 can be positioned on the airframe 110
relative to the frame of reference. Each sensor 120 can also be oriented relative to the frame of
reference. For example, in a frame of reference defined by three planes perpendicular to one
another, each sensor 120 can be oriented in terms of three angles corresponding to rotation about
each of the three planes (e.g., Euler angles; pitch, yaw, and roll angles). In some embodiments,
the orientation of the sensor 120 can be changed (e.g., the sensor 120 can receive an instruction
to change orientation or can change orientation to acquire information regarding a point of
interest in the environment); the sensor 120 can communicate the change in orientation (e.g., to a
processing circuit that receives image information from the sensor 120).
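As a non-limiting illustrative sketch (assuming a Cartesian airframe frame and a roll-pitch-yaw convention; the names and values below are illustrative only and not part of the disclosed systems), such a sensor pose may be represented as a position offset and a set of Euler angles:

    import numpy as np
    from dataclasses import dataclass

    @dataclass
    class SensorPose:
        """Pose of a sensor expressed in the airframe's Cartesian frame of reference."""
        position_m: np.ndarray   # (x, y, z) offset from the airframe origin, in meters
        euler_rad: np.ndarray    # (roll, pitch, yaw) orientation angles, in radians

        def rotation_matrix(self) -> np.ndarray:
            """Rotation from the sensor frame to the airframe frame (yaw, then pitch, then roll)."""
            roll, pitch, yaw = self.euler_rad
            cr, sr = np.cos(roll), np.sin(roll)
            cp, sp = np.cos(pitch), np.sin(pitch)
            cy, sy = np.cos(yaw), np.sin(yaw)
            rz = np.array([[cy, -sy, 0.0], [sy, cy, 0.0], [0.0, 0.0, 1.0]])
            ry = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
            rx = np.array([[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]])
            return rz @ ry @ rx

    # Illustrative only: a nose-mounted, forward-facing sensor
    nose_sensor = SensorPose(position_m=np.array([4.0, 0.0, -0.5]),
                             euler_rad=np.array([0.0, 0.0, 0.0]))
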
[0024] The sensors 120 can be posed such that at least one of the sensors 120 acquires
information that is not acquired by the other sensor(s) 120. For example, as shown in FIGS. 2A-
2B, the fields of view 122a, 122b, 122c are at least partially non-overlapping. A field of view of
one sensor 120 can be outside of a field of view of one or more of the other sensors 120, and/or
can be outside of a field of view of the cockpit 124. For example, each field of view can define
a set of points in the environment from which light is received, and at least one of the fields of
view can be such that the corresponding sensor 120 receives light from one or more points in
the environment that is not received by other sensors 120 (or through the cockpit 124).
[0025] In some embodiments, at least one of the first pose or the second pose is set (e.g.,
selected, determined) such that the corresponding at least one of the sensors 120 acquires information
regarding the environment that is outside of the first field of view of the cockpit 124. This can
allow the sensors 120 to acquire information that is not visible to a pilot or other aircraft operator
in the cockpit 124.
[0026] In some embodiments, at least one of the sensors 120 is posed (e.g., at least one of the
first pose or the second pose is set) to acquire information corresponding to a point of interest
located in a portion of the environment. For example, the point of interest can be a runway, a
flight control tower, or a landing structure, such as a helipad or a cliff plateau on which the
airborne platform 100 is to be guided to land.
[0027] The sensors 120 can be image sensors (e.g., cameras, visible light sensors, near-visible
light sensors, infrared sensors, ultraviolet sensors), such as for acquiring image information.
sensors 120 can be radar sensors (e.g., millimeter radar sensors). The sensors 120 can be active
sensors (e.g., laser sensors). The sensors 120 can receive electromagnetic radiation (e.g., light)
from the environment and output image data representing the light. The sensors 120 can output
the image data as still images or as video data (e.g., a sequence of images). The sensors 120 can
include an indication of the type of image data being provided (e.g., an indication of the
wavelength of the electromagnetic radiation being acquired) to facilitate processing of the image
data (e.g., by a display system 300 as discussed with regards to FIG. 3). In some embodiments,
providing the airborne platform 100 with at least two sensors 120 allows for depth information
to be included in the image data from the sensors 120, even while the airborne platform 100 is
stationary. In some embodiments, while the number of sensors 120 included with the airborne
platform 100 can increase the overall field of view of visual information acquired by the sensors
120, increasing the number of sensors 120 may also increase computational load for processing
the visual information and generating the visual rendering based on the visual information.
[0028] The type of sensors 120 used can be determined based on an application of the airborne
platform 100. For example, in applications where the airborne platform 100 is a rotary wing
platform (e.g., a helicopter) that may be tasked with landing on difficult terrain, the sensors 120
can be image sensors that provide visible and/or near-visible light image data that can be
displayed to an operator of the airborne platform 100 to help land on the terrain. In applications
where the airborne platform 100 is operating in remote locations (e.g., in desert locations or other
locations with sandstorms and other obscuring conditions), the sensors 120 can be millimeter
radar sensors.
[0029] The sensors 120 may be placed on the airframe 110 (e.g., on an exterior surface of the
airframe 110). In some embodiments, the sensors 120 can be placed at locations on the airframe
110 having relatively low vibration (e.g., wing root as opposed to wing tip), improving the
image quality from the sensors 120 as the effect of vibrations on the field of view of the sensors
120 between image frames will be reduced.
[0030] As shown in FIGS. 2A-2B, the first sensor 120a is at a first pose where it is positioned
on a nose of the airframe 110 and oriented to define a first field of view 122a; the second sensor
120b is at a second pose where it is positioned at a wing root of the airframe 110 and oriented to
define a second field of view 122b; and the third sensor 120c is at a third pose where it is
positioned at another wing root of the airframe 110 and oriented to define a third field of view
122c. As shown in FIG. 2B, the fields of view 122 of the sensors 120, when aligned with one
another, can be combined to establish a more complete view of the environment surrounding the
airborne platform 100.
[0031] The sensors 120 may be calibrated. For example, output from the sensors 120 may be
recorded under test conditions for later use in generating a visual rendering based on information
from the sensors 120. The output may indicate a view angle (e.g., a light cone corresponding to
a maximum range of light that each sensor 120 can receive, a field of view) for the sensor 120.
Output from the sensors 120 may be correlated to pose when the sensors 120 are installed.
[0032] In some embodiments, calibration data may be determined or stored regarding the
sensors 120. The calibration data may correspond to the pose of the sensors 120. The
calibration data may be registered to a model of the airborne platform 100. For example, the
pose data for the sensors 120 may be mapped, correlated, or assigned to analogous portions of
the model of the airborne platform. The pose data may be mapped to a frame of reference of the
model of the airborne platform 100. If the pose data is acquired or stored in a first frame of
reference that is different than a second frame of reference of the model of the airborne platform
100, the pose data may be transformed to the second frame of reference to be mapped to the
model of the airborne platform 100. In various such embodiments, as an illustrative example, if
the pose data corresponding to the sensor 120 indicates that the sensor 120 is positioned on the
nose of the airborne platform 100 and facing forward (e.g., facing in the same direction as a
direction of travel of the airborne platform 100), the model of the airborne platform 100, or a
map of the sensors 120 that corresponds to the model of the airborne platform 100, will indicate
the pose of the sensor 120.
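As an illustrative sketch of such frame registration (assuming the transform between the calibration frame and the model frame is known; the function and parameter names are illustrative only and not part of the disclosure), a pose recorded in a first frame of reference may be mapped into the model frame by a rigid transform:

    import numpy as np

    def register_pose_to_model(position_src, rotation_src, r_src_to_model, t_src_to_model):
        """Map a sensor pose from a source (calibration) frame into the frame of
        reference of the three-dimensional model of the airborne platform.

        position_src   : (3,) sensor position in the source frame
        rotation_src   : (3, 3) sensor orientation in the source frame
        r_src_to_model : (3, 3) rotation from the source frame to the model frame
        t_src_to_model : (3,) origin of the source frame expressed in the model frame
        """
        position_model = r_src_to_model @ np.asarray(position_src) + np.asarray(t_src_to_model)
        rotation_model = r_src_to_model @ np.asarray(rotation_src)
        return position_model, rotation_model
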
[0033] The sensors 120 may be rigidly fixed to the airframe 110 so that the need for recalibration
is reduced. Where identical sensors 120 are configured for and included on multiple
airborne platforms 100, the same calibration data may be used for each airborne platform 100.
[0034] Referring now to FIG. 3, a block diagram of an exemplary embodiment of a display
system 300 is illustrated in accordance with the inventive concepts described herein. The
display system 300 includes a processing circuit 210, sensors 220 (e.g., first sensor 220a, second
sensor 220b), and a display device 228. The sensors 220 can be similar to the sensors 120
described with reference to FIGS. 2A-2B. While FIG. 3 illustrates two sensors 220, three or
more sensors can be included in systems according to the inventive concepts described herein.
[0035] The processing circuit 210 is shown to include a processor 212 and a memory 214. The
processor 212 may be implemented as a specific purpose processor, an application specific
integrated circuit (ASIC), one or more field programmable gate arrays (FPGAs), a group of
processing components, or other suitable electronic processing components. The memory 214 is
one or more devices (e.g., RAM, ROM, flash memory, hard disk storage) for storing data and
computer code for completing and facilitating the various user or client processes, layers, and
modules described in the present disclosure. The memory 214 may be or include volatile
memory or non-volatile memory and may include database components, object code
components, script components, or any other type of information structure for supporting the
various activities and information structures of the inventive concepts disclosed herein. The
memory 214 is communicably connected to the processor 212 and includes computer code or
instruction modules for executing one or more processes described herein. The memory 214 can include
various circuits, software engines, and/or modules that cause the processor 212 to execute the
systems and methods described herein.
[0036] While FIG. 3 shows the processing circuit 210 to include a single processor 212, in
various embodiments, the processing circuit 210 can include various numbers or arrangements of
processors. For example, the processor 212 can be a multi-core processor. The processor 212
can include a plurality of processors that may be dedicated to different tasks. The processing
circuit 210 can include the processor 212 as well as a graphics processing unit (GPU) (not
shown); the GPU may be configured to retrieve (or be controlled by the processor 212 to
retrieve) electronic instructions for generating a visual rendering (e.g., from rendering circuit
218) and execute the electronic instructions in order to generate a visual rendering for display by
the display device 228.
[0037] In some embodiments, the memory 214 includes a pose circuit 216. The pose circuit
216 can store pose data regarding the sensors 220. The pose circuit 216 can store a first pose
data including at least one of a first position or a first orientation of the sensor 220a. The pose
circuit can store a second pose data including at least one of a second position or a second
orientation of the sensor 220b. The pose circuit 216 can be pre-loaded or pre-programmed with
the pose data (e.g., during an initialization process of the display system 300, the pose data can
be loaded on the pose circuit 216). The pose circuit 216 can receive the pose data as part of an
initialization or calibration procedure for the sensors 220.
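A minimal illustrative sketch of such pre-loaded pose data (sensor identifiers and numerical values are assumed for illustration only, not taken from the disclosure):

    # Illustrative pose registry pre-loaded during initialization of the display system.
    SENSOR_POSES = {
        "sensor_220a": {"position_m": (4.0, 0.0, -0.5), "euler_deg": (0.0, 0.0, 0.0)},
        "sensor_220b": {"position_m": (1.2, -2.1, -0.3), "euler_deg": (0.0, -10.0, -45.0)},
    }

    def get_pose(sensor_id):
        """Return the pre-loaded pose data stored for the given sensor identifier."""
        return SENSOR_POSES[sensor_id]
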
[0038] In some embodiments, the memory 214 includes a rendering circuit 218. The rendering
circuit 218 is configured to generate a visual rendering (e.g., an image, a video, a two-dimensional
or three-dimensional spatial mapping of information received from the sensors 120
to a two-dimensional or three-dimensional arrangement of pixels, such that a display device can
use the arrangement of pixels to display the information to a user) of the environment surrounding
the airborne platform 100 based on the pose data, first information received from the first sensor
220a, and second information received from the second sensor 220b. The visual rendering can
be a re-construction of the environment surrounding the airborne platform 100. In some
embodiments, the visual rendering will appear to a user (e.g., a user in the cockpit 124 viewing
the visual rendering on a display device) to be a real-time or near real-time video of the
environment surrounding the airborne platform 100. In some embodiments, the visual rendering
provides an exocentric or off-board view of the airborne platform 100, such as by including an
image of the airborne platform 100 in the visual rendering (e.g., a perspective or point of view in
which the entire airborne platform 100 is visible, where the airborne platform 100 is visible from
outside the airborne platform 100, from a third-person perspective relative to the airborne
platform 100).
[0039] In some embodiments, the rendering circuit 218 is configured to use the pose data to
generate the visual rendering. For example, the rendering circuit 218 can retrieve the pose data
from the pose circuit 216, and based on the pose data, determine relative poses between the
sensors. For example, if the airborne platform 100 includes the sensors 220a and 220b, the
rendering circuit can retrieve the positions and orientations of the sensors 220a and 220b,
determine a distance between the sensors 220a and 220b based on the corresponding position
data, and determine a relative orientation (e.g., difference in angle between the sensors 220a and
220b; difference in direction that the sensors 220a, 220b point). Given a sphere corresponding to
a maximum possible visualization of the environment surrounding the airborne platform 100
(e.g., a visualization that would occur for a "360 degree" view or omnidirectional view about the
airborne platform 100), the rendering circuit 218 can use the pose data, as well as calibration data
indicating the view angles of the sensors 220, to map the image data received from the
sensors 220 to corresponding portions of the sphere.
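As an illustrative sketch only (assuming a simple pinhole camera model with the boresight along each sensor's +z axis; the function names are illustrative assumptions, not the disclosed implementation), the relative pose computation and the mapping of image data onto the visualization sphere might take the following form:

    import numpy as np

    def relative_pose(pos_a, rot_a, pos_b, rot_b):
        """Baseline distance and relative orientation between two sensors, airframe frame."""
        baseline = np.linalg.norm(np.asarray(pos_b) - np.asarray(pos_a))
        rot_rel = np.asarray(rot_a).T @ np.asarray(rot_b)  # sensor B expressed in sensor A's frame
        return baseline, rot_rel

    def pixel_to_sphere(u, v, width, height, hfov_rad, vfov_rad, rot_sensor, pos_sensor, radius):
        """Map image pixel (u, v) onto a sphere of the given radius about the platform,
        using the sensor's calibrated view angles and its pose on the airframe."""
        x = np.tan(hfov_rad / 2.0) * (2.0 * u / width - 1.0)
        y = np.tan(vfov_rad / 2.0) * (2.0 * v / height - 1.0)
        direction = np.array([x, y, 1.0])
        direction /= np.linalg.norm(direction)
        direction = np.asarray(rot_sensor) @ direction   # rotate the pixel ray into the airframe frame
        return np.asarray(pos_sensor) + radius * direction
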
[0040] In some embodiments, the rendering circuit 218 is configured to generate the visual
rendering based on a three-dimensional model of the airborne platform 100. The three-dimensional
model can include specification data regarding the airborne platform 100, such as
size data, or relative distances between portions of the airborne platform 100. The three-dimensional
model enables the rendering circuit 218 to render or otherwise generate an image of
the airborne platform 100. For example, the rendering circuit 218 can store, include, or retrieve
a CAD model of the airborne platform 100. The rendering circuit 218 can generate the visual
rendering such that the airborne platform 100 is visible in the visual rendering (e.g., the airborne
platform 100 is visible in an exocentric or off-board view).
[0041] The visual rendering may include an action of the airborne platform 100. The visual
rendering can include actions such as opening, closing, or other movement of structures of the
airborne platform 100, such as flaps or landing gear. The rendering circuit 218 can be
communicatively coupled to various avionics systems of the airborne platform 100 that control
operation of the structures of the airborne platform 100, and receive an indication of the action
(e.g., receive an indication that the landing gear has been deployed or is being retrieved from an
avionics system that controls the landing gear). Responsive to receiving the indication, the
rendering circuit 218 can generate the visual rendering to include the action. The rendering
circuit 218 can store image or video data corresponding to the actions, or may retrieve image or
video data corresponding to the actions that is included with the three-dimensional model of the
airborne platform 100. The rendering circuit 218 can retrieve a duration of the action, or
determine a duration of the action based on the image or video data. In response to receiving the
indication, the rendering circuit 218 can generate the visual rendering to include the action for a
subsequent length of time corresponding to the duration of the action. For example, if the image
or video data indicates a pre-determined duration or number of frames for displaying the action,
the rendering circuit 218 can generate the visual rendering to include the action for the predetermined
duration or number of frames after the indication is received. The rendering circuit
218 can also receive additional indications of the action (e.g., action in progress, action
complete) and modify the visual rendering based on the additional indications. For example, the
rendering circuit 218 can modify a relative speed at which the action is displayed based on the
additional indications, such as accelerating the display of the action to a completed action image
based on receiving an action complete indication.
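As an illustrative sketch (the class name and indication strings are assumed for illustration; the disclosed systems are not limited to this form), playback of an indicated action for its pre-determined duration, accelerated upon a completion indication, might be tracked as follows:

    class ActionPlayback:
        """Tracks on-screen playback of a platform action (e.g., landing-gear movement)."""

        def __init__(self, total_frames):
            self.total_frames = total_frames   # pre-determined number of frames for the action
            self.frame = 0
            self.speed = 1

        def on_indication(self, indication):
            """Accelerate toward the completed-action image when completion is reported."""
            if indication == "action_complete":
                self.speed = 4

        def next_frame_index(self):
            """Advance and return the animation frame to render, clamped to the final frame."""
            self.frame = min(self.frame + self.speed, self.total_frames - 1)
            return self.frame
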
[0042] In some embodiments, the rendering circuit 218 is configured to generate the visual
rendering based on the three-dimensional model and calibration data regarding the poses of the
sensors 220. The rendering circuit 218 can retrieve the calibration data or pose data to determine
where the sensors 220 are located on the airframe 110, and the directions that the sensors 220
face. The rendering circuit 218 can use the pose data to determine the sensors 220 to be origin
points for the information received from the corresponding sensors 220, and orient the visual
information in the visual rendering so that the information appears to be properly oriented when
the visual rendering is viewed. The rendering circuit 218 can determine a relative size of the
airborne platform 100 when generating the image of the airborne platform 100 to be included in
the visual rendering based on the pose data or calibration data (e.g., based on the distance(s)
between the sensors 220 and the known poses of the sensors 220 relative to the airborne platform
100).
[0043] In some embodiments, the rendering circuit 218 is configured to generate the visual
rendering based on at least one of attitude or position information regarding the airborne
platform 100. The rendering circuit 218 can receive the at least one of the attitude or position
information from an avionics system of the aircraft (e.g., GPS, GPS data, altitude data from an
altimeter, data from a gyroscope). The display system 300 can include the avionics system from
which the attitude or position information is received, or the display system 300 can be communicatively
coupled by an electronic data bus (not shown) to the avionics system and receive the attitude or
position information via the electronic data bus.
[0044] At least one of the first information or the second information can include a
measurement of altitude or distance from ground or another surface feature below the airborne
platform 100. For example, the sensors 220 may be configured to perform a rangefinding
function (e.g., the sensors 220 can be or include a laser rangefinder). In some embodiments, the
sensors 220 can be posed to face ground (e.g., one or more sensors 220 can be positioned on a
lower surface of the airborne platform 100 and/or oriented in a direction aligned with a gravity
vector), such that a rangefinding output from the sensors 220 represents an altitude of the
airborne platform 100. In some embodiments, the processing circuit 210 can receive a
rangefinding output from the sensors 220 (e.g., a length of a line-of-sight from the sensors 220 to
ground), retrieve attitude information regarding the airborne platform 100, and modify the
rangefinding output based on the attitude information to determine the altitude. The visual
rendering can include text information indicating the altitude. The rendering circuit 218 can be
configured to determine a relative size of the airborne platform 100 in the visual rendering based
on the altitude information (e.g., relative to a size at which ground is displayed).
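As an illustrative sketch of this attitude correction (assuming a rangefinder aligned with the body down-axis and locally flat terrain; this is an assumption for illustration, not the disclosed implementation):

    import math

    def altitude_from_range(slant_range_m, pitch_rad, roll_rad):
        """Estimate altitude above ground from a downward-pointing rangefinder reading,
        corrected for the platform's pitch and roll."""
        return slant_range_m * math.cos(pitch_rad) * math.cos(roll_rad)

    # Illustrative only: a 105 m slant range at 5 degrees of pitch and 3 degrees of roll
    altitude_m = altitude_from_range(105.0, math.radians(5.0), math.radians(3.0))
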
[0045] In some embodiments, the rendering circuit 218 is configured to generate the visual
rendering such that a rendering of the first field of view is visible adjacent to a rendering of the
second field of view. For example, the first information can correspond to a first field of view,
the second information can correspond to a second field of view, and the first sensor 220a can be
posed relative to the second sensor 220b such that the first field of view is adjacent to the second
field of view. The rendering circuit 218 can retrieve pose data from the pose circuit 216 and
calibration data indicating the fields of view, and based on these data determine that the first
field of view is adjacent to the second field of view.
[0046] In some embodiments, the rendering circuit 218 is configured to generate the visual
rendering further based on reusing a previous frame of at least the first information or the second
information as a current frame. This can reduce the computational load required to generate the
visual rendering, as well as allow for a more dense reconstruction of the environment
surrounding the airborne platform 100. The rendering circuit 218 may review a plurality of
previous frames to identify common characteristics of the previous frames, and reuse previous
frames if the volume of common characteristics is greater than a threshold volume.
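As an illustrative sketch (the notion of common characteristics is approximated here by shared image features, which is an assumption rather than the disclosed method), the reuse decision might be expressed as:

    def should_reuse_previous_frame(prev_features, curr_features, threshold=0.9):
        """Reuse a previous frame as the current frame when the fraction of shared
        (common) features between the two frames meets or exceeds the threshold."""
        prev_set, curr_set = set(prev_features), set(curr_features)
        if not prev_set or not curr_set:
            return False
        common = len(prev_set & curr_set)
        return common / len(prev_set | curr_set) >= threshold
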
[0047] In some embodiments, the display system 300 includes a user interface, such as a user
input device 232. The user input device 232 can be similar to or incorporate features of the UI
elements 22 described with reference to FIG. 1. The user input device 232 can receive a user
input. The user input can indicate a desired (e.g., target, selected) angle for viewing the visual
rendering. The rendering circuit 218 can be configured to generate the visual rendering further
based on the user input. For example, the visual rendering can be rotated or shifted based on a
corresponding user input. The rendering circuit 218 can retrieve pose data and calibration data
regarding the sensors 220, compare the desired angle to the pose data, calibration data, and the
information received from the sensors 220, and modify the visual rendering based on the
comparison.
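As an illustrative sketch (assuming the desired angle is expressed as an azimuth and elevation about the rendered platform; the names and geometry are assumptions, not the disclosed implementation), the user-selected viewing angle might place the virtual exocentric camera as follows:

    import numpy as np

    def exocentric_camera_position(azimuth_rad, elevation_rad, distance_m, platform_pos):
        """Place the virtual exocentric camera at a user-selected azimuth and elevation
        on a sphere of the given radius centered on the rendered platform."""
        offset = np.array([
            distance_m * np.cos(elevation_rad) * np.cos(azimuth_rad),
            distance_m * np.cos(elevation_rad) * np.sin(azimuth_rad),
            distance_m * np.sin(elevation_rad),
        ])
        return np.asarray(platform_pos) + offset
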
[0048] In some embodiments, the display system includes a display device 228. The display
device 228 can be similar to the flight displays 20 described herein with reference to FIG. 1.
The display device 228 can receive the visual rendering from the processing circuit 210 and
display the visual rendering. In some embodiments, the display device 228 includes the user
input device 232 (e.g., the display device 228 is a touchscreen).
[0049] The display system 300 can include a sensor interface 224. The sensor interface 224
can be configured to receive information (e.g., image information, video information) from the
sensors 220, and transmit the information to the processing circuit 210. The sensor interface 224
can be communicatively coupled to the sensors 220 directly (e.g., by a wired connection, or by a
wireless connection if the sensor interface 224 includes wireless receiver hardware electronics).
The sensor interface 224 can be communicatively coupled to the sensors 220 via an electronic
bus of the airborne platform 100.
In some embodiments, the airborne platform is a remotely operated vehicle (e.g., a drone, a
vehicle that receives control instructions from a remote location, such as a control station, a
ground control station, or another vehicle).
[0050] Referring now to FIG. 4, an exemplary embodiment of a method 300 according to the
inventive concepts disclosed herein may include the following steps. The method 300 may be
performed using various hardware, apparatuses, and systems disclosed herein, such as the
airborne platform 100, the sensors 120, the display system 300, and/or components thereof.
[0051] A step (310) may include acquiring first information regarding an environment
surrounding an airframe of an airborne platform by a first sensor. For example, the first sensor
can be a visible light sensor (e.g., a camera), an infrared sensor, or a millimeter radar sensor,
located on the airframe. The first sensor can capture images of the environment and output or
transmit the images as images or video data. The first sensor can be at a first pose (e.g., first
position and/or first orientation).
[0052] A step (320) may include acquiring second information regarding the environment by a
second sensor. The second sensor can be similar to the first sensor. The second sensor can be at
a second pose (e.g., second position and/or second orientation). The second information may
include information that includes (or captures visual information regarding) at least a portion of
the environment not acquired by the first sensor. For example, the first sensor may have a first
field of view, and the second sensor may have a second field of view such that the first and
second fields of view are non-overlapping. This can allow the first sensor and second sensor to
acquire a greater field of view than the first field of view or second field of view.
[0053] In some embodiments, the airborne platform includes a cockpit defining a third field of
view of the environment, and at least one of the first sensor or the second sensor outputs information
corresponding to a field of view that is outside the third field of view, such that the sensors can
provide visual information that cannot be acquired by looking out through windows of the
cockpit. In some embodiments, at least one of the first pose or the second pose is set such that
the corresponding at least one of the first sensor or the second sensor acquires information
regarding the environment that is outside the third field of view.
[0054] In some embodiments, acquiring information by the sensors includes acquiring
information corresponding to a point of interest in the environment. 1n some embodiments,
acquiring information by the sensors includes acquiring information indicating a measurement of
altitude.
[0055] A step (330) may include generating a visual rendering of the environment based on the
first information, the second information, the pose data, and a three-dimensional model of the
airborne platform (e.g., a CAD model), such that the airborne platform is visible from an
exocentric viewpoint relative to the environment. For example, the visual rendering can map the
first information and the second information to a two-dimensional or three-dimensional
arrangement of pixels based on the pose data. In some embodiments, generating the visual
rendering includes orienting the visual rendering based on a user input indicating a desired angle
for viewing the visual rendering. In some embodiments, if the first field of view of the first
sensor is adjacent to the second field of view of the second sensor, generating the visual
rendering can include generating a rendering of the first information to be adjacent to a rendering
of the second information, such that the first field of view is visible adjacent to the second field
of view.
[0056] In some embodiments, the visual rendering may be generated to show an action
performed by the airborne platform or a structure thereof, such as movement of flaps or landing
gear.
[0057] Generating the visual rendering can include reusing a previous frame of at least the first
information or the second information as a current frame. For example, if a plurality of previous
frames share a volume of common characteristics, the previous frame may be reused if the
number of common characteristics is greater than a threshold volume.
[0058] A step (340) may include displaying the visual rendering. The visual rendering may be
displayed in real-time or near real-time (e.g., relative to a time lag for reception and processing
of the first information and second information and generation and displaying of the visual
rendering).
[0059] As will be appreciated from the above, systems and methods for virtual projection from
an exocentric viewpoint according to embodiments of the inventive concepts disclosed herein
may improve operation of airborne platforms by providing an operator of the airborne platform
with a real visualization of the environment surrounding the airborne platform, including visual
information that cannot be acquired by looking through the cockpit, and unlike existing synthetic
vision systems, systems and methods according to embodiments of the inventive concepts
disclosed herein can account for real-time, dynamic features of the environment that may be
critical to aircraft operations (e.g., landing, takeoff, fog).
[0060] It is to be understood that embodiments of the methods according to the inventive
concepts disclosed herein may include one or more of the steps described herein. Further, such
steps may be carried out in any desired order and two or more of the steps may be carried out
simultaneously with one another. Two or more of the steps disclosed herein may be combined in
a single step, and in some embodiments, one or more of the steps may be carried out as two or
more sub-steps. Further, other steps or sub-steps may be carried out in addition to, or as
substitutes to one or more of the steps disclosed herein.
[0061] From the above description, it is clear that the inventive concepts disclosed herein are
well adapted to carry out the objects and to attain the advantages mentioned herein as well as
those inherent in the inventive concepts disclosed herein. While presently preferred embodiments
of the inventive concepts disclosed herein have been described for purposes of this disclosure, it
will be understood that numerous changes may be made which will readily suggest themselves to
those skilled in the art and which are accomplished within the broad scope and coverage of the
inventive concepts disclosed and claimed herein.

WHAT IS CLAIMED IS:
1. An airborne platform, comprising:
an airframe;
a first sensor at a first pose on the airframe to acquire first information regarding an
environment surrounding the airframe, the first pose comprising at least one of a first position or
a first orientation of the first sensor;
a second sensor at a second pose on the airframe to acquire second information regarding
the environment including at least a portion of the environment not acquired by the first sensor,
the second pose comprising at least one of a second position or a second orientation of the
second sensor;
a processing circuit configured to:
receive the first information from the first sensor and the second information from
the second sensor;
retrieve pose data including the first pose and the second pose and a three-dimensional
model of the airborne platform; and
generate a visual rendering of the environment based on the first information, the
second information, the pose data, and the three-dimensional model, such that the
airborne platform is visible from an exocentric viewpoint relative to the environment; and
a display configured to display the visual rendering.
2. The airborne platform of claim 1, further comprising a cockpit defining a first field of
view of the environment, the first field of view visible through the cockpit;
wherein at least one of the first pose or the second pose are set such that the
corresponding at least one of the first sensor or the second sensor acquires information regarding
the environment that is outside of the first field of view.
3. The airborne platform of claim 1, wherein at least one of the first sensor or the second
sensor is posed to acquire information corresponding to a point of interest located in a portion of
the environment.
4. The airborne platform of claim 3, wherein the point of interest is at least one of a runway,
a flight control tower, or a landing structure.
5. The airborne platform of claim 1, further comprising a user interface configured to
receive a user input indicating a desired angle for viewing the visual rendering, wherein the
processing circuit is configured to generate the visual rendering further based on the user input.
6. The airborne platform of claim 1, wherein the first sensor and the second sensor are both
either image sensors or millimeter radar sensors.
7. The airborne platform of claim 1, wherein the first information corresponds to a first field
of view, the second information corresponds to a second field of view, the first sensor is posed
relative to the second sensor such that the first field of view is adjacent to the second field of
view, and the processing circuit is configured to generate the visual rendering such that a
rendering of the first field of view is visible adjacent to a rendering of the second field of view.
8. A display system, comprising:
a sensor interface configured to receive first information regarding an environment
surrounding an airborne platform from a first sensor and second information regarding the
environment from a second sensor;
a processing circuit communicatively coupled to the sensor interface, the processing
circuit configured to:
retrieve pose data including a first pose of the first sensor and a second pose of the
second sensor, and retrieve a three-dimensional model of the airborne platform; and
generate a visual rendering of the environment based on the first information, the
second information, the pose data, and the three-dimensional model, such that the
airborne platform is visible from an exocentric viewpoint relative to the environment; and
a display configured to display the visual rendering.
9. The display system of claim 8, wherein the processing circuit is further configured to
generate the visual rendering based on attitude information regarding the airborne platform.
10. The display system of claim 8, wherein the processing circuit is configured to generate
the visual rendering further based on whether the first information and the second information
correspond to information from image sensors, or the first information and the second
information correspond to information from millimeter radar sensors.
11. The display system of claim 8, wherein the processing circuit is further configured to
receive a user input indicating a desired angle for viewing the visual rendering from a user
interface and generate the visual rendering further based on the user input.
12. The display system of claim 8, wherein the first information corresponds to a first field of
view, the second information corresponds to a second field of view, and the processing circuit is
further configured to generate the visual rendering such that a rendering of the first field of view
is visible adjacent to a rendering of the second field of view.
13. The display system of claim 8, wherein the airborne platform is a remotely operated
vehicle, and the display system is included in a control tower.
14. A method, comprising:
acquiring first information regarding an environment surrounding an airframe of an
airborne platform by a first sensor, the first sensor being at a first pose on the airframe, the first
pose including at least one of a first position or a first orientation of the first sensor;
acquiring second information regarding the environment by a second sensor, the second
sensor being at a second pose on the airframe, the second pose including at least one of a second
position or a second orientation of the second sensor, the second information including
information regarding at least a portion of the environment not acquired by the first sensor;
generating a visual rendering of the environment based on the first information, the
second information, the first pose, the second pose, and a three-dimensional model of the
airborne platform, such that the airborne platform is visible from an exocentric perspective
relative to the environment; and
displaying the visual rendering.
15. The method of claim 14, further comprising setting at least one of the first pose or the
second pose such that the corresponding at least one of the first sensor or the second sensor
acquires information regarding the environment that is outside a first field of view, the first field
of view defined as a portion of the environment visible through a cockpit of the airborne platform.
16. The method of claim 14, further comprising setting at least one of the first pose or the
second pose such that the corresponding sensor acquires information corresponding to a point of
interest located in a portion of the environment.
17. The method of claim 14, further comprising:
receiving a user input indicating a desired angle for viewing the visual rendering; and
generating the visual rendering further based on the user input.
18. The method of claim 14, wherein the first sensor and the second sensor are both either
image sensors or millimeter radar sensors.
19. The method of claim 14, wherein the first information corresponds to a first field of view,
the second information corresponds to a second field of view, the first sensor is posed relative to
the second sensor such that the first field of view is adjacent to the second field of view;
wherein generating the visual rendering further comprises generating the visual rendering such
that a rendering of the first field of view is visible adjacent to a rendering of the second field of
view.
20. The method of claim 14, wherein generating the visual rendering further comprises
reusing a previous frame of at least the first information or the second information as a current
frame.

Documents

Application Documents

# Name Date
1 Power of Attorney [05-09-2016(online)].pdf 2016-09-05
2 Form 5 [05-09-2016(online)].pdf 2016-09-05
3 Form 3 [05-09-2016(online)].pdf 2016-09-05
4 Drawing [05-09-2016(online)].pdf 2016-09-05
5 Description(Complete) [05-09-2016(online)].pdf 2016-09-05
6 abstract.jpg 2016-10-04
7 Other Patent Document [23-12-2016(online)].pdf 2016-12-23
8 201611030255-OTHERS-271216.pdf 2016-12-29
9 201611030255-Correspondence-271216.pdf 2016-12-29
10 201611030255-FORM 18 [27-08-2020(online)].pdf 2020-08-27
11 201611030255-FER.pdf 2021-12-29
12 201611030255-ABSTRACT [17-03-2022(online)].pdf 2022-03-17
13 201611030255-CLAIMS [17-03-2022(online)].pdf 2022-03-17
14 201611030255-COMPLETE SPECIFICATION [17-03-2022(online)].pdf 2022-03-17
15 201611030255-DRAWING [17-03-2022(online)].pdf 2022-03-17
16 201611030255-FER_SER_REPLY [17-03-2022(online)].pdf 2022-03-17
17 201611030255-FORM-26 [17-03-2022(online)].pdf 2022-03-17
18 201611030255-OTHERS [17-03-2022(online)].pdf 2022-03-17
19 201611030255-US(14)-HearingNotice-(HearingDate-31-07-2024).pdf 2024-06-27
20 201611030255-Correspondence to notify the Controller [08-07-2024(online)].pdf 2024-07-08
21 201611030255-FORM-26 [08-07-2024(online)].pdf 2024-07-08
22 201611030255-Annexure [14-08-2024(online)].pdf 2024-08-14
23 201611030255-Written submissions and relevant documents [14-08-2024(online)].pdf 2024-08-14

Search Strategy

1 SearchStrategyMatrixE_17-11-2021.pdf