We use our knowledge of spacecraft systems and operations, our human factors expertise, and our programming skills to quickly turn ideas into viable solutions.

  • We define and implement software and hardware human-machine interfaces and operational concepts, in collaboration with system experts and users.
  • We define and implement subsystem simulators using commercial software tools, NASA proprietary tools and in-house developments.
  • We integrate pre-existing prototype interfaces, simulators and custom-made human factors analysis tools to define and support test campaigns.
  • We support a direct evolution from prototype to actual operational system, minimizing development cost and time.

The technologies we use include C/C++, Visual Basic, C#, MFC, OpenGL, GL Studio, IData, HTML and JavaScript.


Aerospace Applications North America is part of NASA’s Cockpit Rapid Prototyping Laboratory (RPL), located at the Johnson Space Center in Texas. The RPL’s main focus is to define and develop the whole suite of over 60 display formats for NASA’s Orion spacecraft.

In this project, our team is responsible for prototyping display formats as part of an integrated cockpit simulation hosted on various Orion cockpit mockups around the Johnson Space Center, and for implementing test scenarios that help resolve final design issues such as:

  • Types of controls during the different phases of the flight.
  • Organization of the display formats and navigation.
  • Hardware selection and ergonomics.
  • Operational concepts.
  • Display format specification and prototyping.

The suite of displays includes:

  • A Primary Flight Display (PFD) that is automatically reconfigured for each flight phase.
  • An electronic procedures system that allows users to execute procedures on the cockpit displays and to interact with each display to fetch telemetry data or cue commands.
  • A Caution and Warning (C&W) display that links fault messages to recovery procedures.
  • A generic display engine that simplifies the implementation of tabular display formats.
  • A series of schematic displays representing the various Orion systems.

The prototype display software also includes its own internal state-based simulator, which allows mission operations specialists to animate the displays without programming complex algorithms. Additionally, the internal simulator provides connectivity with external data sources through a simple UDP-based interface.
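To illustrate the idea of a state-based simulator, the sketch below shows one possible shape for such a driver: each named state pins a set of telemetry parameters to fixed values and hands off to a successor state after a set duration, so a specialist can script display animations declaratively. All names here are hypothetical; this is not the actual RPL implementation.

```cpp
#include <cassert>
#include <map>
#include <string>

// One simulator state: fixed parameter values, plus an optional
// transition to a successor state after a given duration.
struct SimState {
    std::map<std::string, double> values;  // parameter name -> telemetry value
    std::string next;                      // state entered when duration expires
    double duration;                       // seconds to remain in this state
};

// Steps through named states over simulated time; display code reads
// parameter values without knowing any flight algorithms.
class StateSimulator {
public:
    void addState(const std::string& name, SimState s) { states_[name] = s; }
    void start(const std::string& name) { current_ = name; elapsed_ = 0.0; }

    // Advance simulated time; switch states when the current one expires.
    void tick(double dt) {
        elapsed_ += dt;
        const SimState& s = states_[current_];
        if (!s.next.empty() && elapsed_ >= s.duration) {
            current_ = s.next;
            elapsed_ = 0.0;
        }
    }

    double value(const std::string& param) const {
        return states_.at(current_).values.at(param);
    }
    const std::string& state() const { return current_; }

private:
    std::map<std::string, SimState> states_;
    std::string current_;
    double elapsed_ = 0.0;
};
```

A script of such states is enough to animate, say, an altitude tape through ascent and orbit insertion, which is the kind of authoring a state-based approach makes possible without custom code.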

The software includes a graphical representation of the display edge keys for convenient use on a desktop or laptop computer, or can be configured for mockup use, where actual physical keys operate the displays. The same applies to the cockpit switch panels: a fully functional software model of the Orion switch panels is part of the system and can drive the simulations when the hardware panels are not connected.
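The dual desktop/mockup configuration described above suggests routing both input sources through one dispatcher, so display logic never cares whether a press came from a mouse click on a drawn bezel key or from a hardware key driver. A minimal sketch of that pattern, with hypothetical names:

```cpp
#include <cassert>
#include <functional>
#include <map>

// Routes key presses to display commands. Both the on-screen bezel
// (desktop mode) and the physical key driver (mockup mode) call
// press() with the same key IDs, so display logic is input-agnostic.
class EdgeKeyRouter {
public:
    using Handler = std::function<void()>;

    void bind(int keyId, Handler h) { handlers_[keyId] = std::move(h); }

    // Called by either the mouse-click code or the hardware driver.
    void press(int keyId) {
        auto it = handlers_.find(keyId);
        if (it != handlers_.end()) it->second();  // unbound keys are ignored
    }

private:
    std::map<int, Handler> handlers_;
};
```

With this separation, switching a session from desktop to mockup only swaps the event source, not the display code.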

Our team uses the C++ programming language for internal simulator implementation and display format logic, Windows programming for simulator control interfaces and telemetry connectivity, IData for display format graphics definition and simulated display edge keys, and HTML and JavaScript for cockpit switch panel simulation.

Advanced Cockpit Evaluation System

Aerospace Applications North America designed human-machine interface components and implemented visualization and telemetry interfaces for NASA’s Advanced Cockpit Evaluation System (ACES).

The ACES was a remote cockpit located inside a van for complete mobility. It included five display screens and head-mounted displays, a telemetry server providing connectivity to various NASA vehicles and test projects, and an integrated navigation system. ACES’s focus was to research blending video and synthetic vision into an augmented-reality view that assists flight operations.

Our team supported the ACES van with Head-Up Display (HUD) implementation in OpenGL, data connectivity systems, test area virtual modeling, and overall software and hardware component interface and integration. We participated in various NASA test campaigns with the ACES van, including:

  • Extensive stand-alone tests at NASA Johnson Space Center, with telemetry coming from the on-board navigation system and video from the “Virtual Eye” 360-degree camera array located on the roof of the ACES van.
  • Buckeye flight test support at Texas A&M University in College Station, Texas, with video and telemetry downlink from this powered parachute vehicle.
  • Two NASA test campaigns at Meteor Crater, Arizona, with telemetry and video coming from a prototype planetary rover built by NASA. The rover was successfully driven autonomously from the ACES van. Tests were also performed to remotely operate a “thumper” planetary geologic instrument, with video from an infrared camera.
  • X38 parafoil tests at Dryden Flight Research Center in California, with telemetry and video coming directly from the X38 test spacecraft during the drop test. Astronauts were able to sit in the back of the ACES van and experience the approach and landing phase of the prototype space vehicle. Astronauts were also able to steer the parafoil during flight with the controls provided at the back of the ACES van.

The main areas of research included:

  • Reusable library of HUD graphical components.
  • Spherical implementation of HUD for accurate mapping to virtual scene.
  • Stereo virtual reality with head-mounted display.
  • Video stitching in 360-degree wrap-around.
  • 5-screen display in 180-degree configuration.
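The spherical HUD item above refers to a standard technique: rather than drawing symbology on a flat 2D overlay, each symbol is placed on a sphere of fixed radius around the eyepoint, so it stays registered with the 3D scene as the view direction changes. A minimal sketch of the underlying mapping (hypothetical names, right-handed OpenGL-style axes with -Z forward):

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { double x, y, z; };

// Map an azimuth/elevation direction (radians) to a point on the HUD
// sphere around the eyepoint. The resulting point can be rendered as
// 3D geometry so symbology overlays the virtual scene correctly.
Vec3 hudPointOnSphere(double azimuth, double elevation, double radius) {
    return {
        radius * std::cos(elevation) * std::sin(azimuth),  // right  (+X)
        radius * std::sin(elevation),                      // up     (+Y)
        -radius * std::cos(elevation) * std::cos(azimuth)  // forward (-Z)
    };
}
```

For example, a symbol at boresight (azimuth and elevation both zero) lands straight ahead on the -Z axis at the sphere radius, matching where the scene camera is looking.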

X38/CRV Cockpit

Aerospace Applications North America designed prototype human-machine interfaces for the X38/Crew Rescue Vehicle (CRV).

When operational, the CRV would have been an emergency vehicle to return up to seven International Space Station (ISS) crewmembers to Earth. The CRV required an innovative concept to allow intuitive crew interaction with a highly autonomous vehicle. The new interface simplified crew interaction and provided quick access to all information within a small display hierarchy, reducing training time while improving safety and operability.

The project involved:

  • Specification and implementation of a multi-functional display application in C++ and OpenGL.
  • Hosting of 24 display formats for ISS separation, deorbit, reentry and landing, including systems, caution and warning, and procedures displays.

Our team implemented a simulator of the CRV cockpit at the NASA Johnson Space Center, after multiple design iterations on the display formats with system specialists and astronauts. This system, which included a test-subject recording capability, was put through an extensive 3-month evaluation campaign involving 30 astronauts.