
SPLICE (Safe and Precise Landing – Integrated Capabilities Evolution)


NASA is developing advanced precision landing technologies for robotic science and human exploration missions to the Moon, Mars, icy bodies, and other solid-surface destinations. A new suite of lunar landing technologies, called SPLICE, will enable safer and more accurate lunar landings than ever before. Future Moon missions could use SPLICE’s advanced algorithms and sensors to target landing sites that weren’t possible during the Apollo missions, such as regions with hazardous boulders and nearby shadowed craters. SPLICE technologies could also help land humans on Mars. 1)

SPLICE consists of the following technologies and capabilities:

• Navigation Doppler Lidar (NDL): This unit has a small electronics box and laser connected by fiber optic cables to three telescopes mounted at fixed angles optimized for performance. During a landing, NDL sends laser beams to the surface, and the reflected returns are detected to provide an estimate of the lander’s velocity and altitude.

• Terrain Relative Navigation (TRN): This system includes a camera that takes live pictures and compares them to existing orbital images of the surface to determine the spacecraft’s location.

• Hazard Detection Lidar (HDL): The sensor is a laser-based 3D imaging system that scans a surface to create a 3D map of the landing field. The HDL images surface terrain that could be a hazard for landings, such as steep slopes or large rocks.

• Descent and Landing Computer (DLC): This is a high-performance multicore computer processor unit that analyzes all SPLICE sensor data and determines the spacecraft’s velocity, altitude, and terrain hazards. It also computes the hazards and determines a safe landing location.
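The division of labor above can be sketched as a simple fusion loop. This is only an illustrative stand-in: the function name, gains, and update form are assumptions, not the actual SPLICE navigation filter.

```python
import numpy as np

def fuse_state(pos, vel, dt, trn_pos=None, ndl_vel=None,
               k_pos=0.3, k_vel=0.5):
    """One hypothetical fusion step (illustrative only).

    Propagate the state by dead reckoning, then nudge it toward
    whatever sensor fixes are available this cycle. The gains
    k_pos and k_vel are placeholders for a real navigation filter.
    """
    pos = pos + vel * dt                      # propagate position
    if ndl_vel is not None:                   # NDL velocity fix
        vel = vel + k_vel * (ndl_vel - vel)
    if trn_pos is not None:                   # TRN position fix
        pos = pos + k_pos * (trn_pos - pos)
    return pos, vel

# Example: descending at 30 m/s, TRN reports a 12 m lateral offset
pos = np.array([0.0, 0.0, 2000.0])
vel = np.array([0.0, 0.0, -30.0])
pos, vel = fuse_state(pos, vel, dt=0.1,
                      trn_pos=np.array([12.0, 0.0, 1997.0]))
```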

How would SPLICE technologies work together during a lunar landing?

As a spacecraft descends from orbit, it ignites its engine and begins a powered descent toward the surface. Throughout the descent, the SPLICE DLC aboard the spacecraft autonomously operates the SPLICE sensor suite and processes algorithms for navigation, guidance, and hazard detection to enable a precise and safe landing. The “brain” of the DLC is a commercial multicore processor. In the coming years, the DLC processor chip will be replaced with a more advanced processor currently in development by NASA, called the High-Performance Spaceflight Computing (HPSC) processor.

As the spacecraft approaches the Moon, SPLICE’s TRN component uses a camera to take photos of the surface. It compares them to preloaded maps generated from images previously captured by satellites orbiting the Moon. By detecting and tracking the features in these live photos, the DLC knows where the spacecraft is relative to the features on the map. This enables the spacecraft to avoid known geographic features, such as hills and craters. At the same time, TRN allows a spacecraft to land near areas of scientific interest instead of landing far away and driving a rover to a targeted location.

When a lander is about 4 miles (6.4 km) above the surface, the NDL activates. The instrument transmits laser beams to the Moon’s surface. Those beams bounce off the surface and back to the instrument. This feedback allows NDL to detect the lander’s velocity and altitude as it approaches the lunar surface, which increases the precision of the navigation and guidance algorithms running on the DLC. NASA has previously relied on radar sensors for landing vehicles on the Moon and other planets. The NDL provides measurements significantly more precise than those of radar-based sensors, in a smaller package with less mass and lower power consumption.
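The velocity measurement rests on the Doppler effect: the frequency of the reflected beam is shifted in proportion to the line-of-sight velocity. A minimal sketch, assuming a 1.55 µm laser wavelength (an assumption for illustration; the actual NDL wavelength is not given here):

```python
def los_velocity(doppler_shift_hz, wavelength_m=1.55e-6):
    """Line-of-sight velocity from the measured Doppler shift.

    For a monostatic lidar the return is shifted by f_d = 2*v/lambda
    (both the outbound and return paths contribute), so
    v = f_d * lambda / 2. The 1.55 um wavelength is an assumed
    value, not a published SPLICE specification.
    """
    return doppler_shift_hz * wavelength_m / 2.0

# A 129 MHz shift at 1.55 um corresponds to ~100 m/s along the beam
v = los_velocity(129.0e6)
```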

During the final portion of the descent, the lander rotates to a vertical position to prepare for a soft touchdown. When the lander is about 1,640 feet (500 m) from the surface, the HDL system images the surface and generates a 3D terrain map of the landing site. The HDL uses a laser-based 3D imaging system that scans the landscape in real-time and stitches together the terrain model from millions of laser pulses and their returns. The DLC processes this 3D map to identify landing hazards, such as rocks and slopes, and determine the safest landing sites for a touchdown within an approximately 330-foot (100 m) diameter circular space.

This upgraded suite of landing technologies can be used to explore new solar system destinations, providing access to surface regions of scientific interest that missions cannot reach with current landing capabilities. The NDL system will fly on two commercial lunar landers targeted for flights in 2021. Additionally, SPLICE technologies will be considered for use in landing the first woman and the next man on the Moon under NASA’s Artemis program, as well as for future Mars sample return missions.

Fast facts

• For the Apollo landings, astronauts were trained to identify particular lunar surface locations to help them navigate during a landing. Astronauts had to identify landmarks they saw outside the window to help them understand where they were. SPLICE’s TRN function will perform this task by matching real-time images against preloaded lunar surface maps, so astronauts don’t have to rely on their ability to recall many different surface reference points.

• The SPLICE technology suite could also help land humans on Mars as early as the 2030s.

• The SPLICE technologies have been tested extensively on the ground and aboard suborbital rockets over the past decade. They were matured through multiple prior NASA projects, including the former Autonomous precision Landing and Hazard Avoidance Technology (ALHAT) project.


Figure 1: SPLICE Project Manager Ron Sostaric checks out the system (including the descent and landing computer, navigation doppler lidar, terrain relative navigation camera, and inertial measurement unit) at NASA's Johnson Space Center in Houston (image credit: NASA, Robert Markowitz)


• Precursors to the SPLICE sensors, developed under the ALHAT project, underwent suborbital testing aboard a NASA Project Morpheus rocket in 2014.

• NASA’s Flight Opportunities program coordinated additional flight testing of the precursor SPLICE NDL instrument aboard Masten Space Systems’ Xodiac suborbital rocket in 2017.

• In June 2019, high-speed rocket sled tests of NDL in Kern County, California, obtained data at almost 480 miles per hour (250 m/s), successfully demonstrating NDL’s ability to provide accurate high-velocity measurements.

• The project tested the TRN system, developed by Draper, in September 2019 on a launch and landing of Masten’s Xodiac rocket. NASA’s Flight Opportunities program also supported this test.


• NASA's Johnson Space Center in Houston provides project leadership and is developing the DLC, a ground-based hardware-in-the-loop simulation testbed, and the guidance and navigation software.

• NASA’s Langley Research Center in Hampton, Virginia, is developing the NDL and performing mission architecture studies and developing trajectory simulations.

• NASA’s Goddard Space Flight Center in Greenbelt, Maryland, is developing, building and maturing the HDL, providing a flight software system and fiber optics, and developing an HPSC single board computer.

• NASA’s Jet Propulsion Laboratory is providing hazard detection algorithms, developing high-definition hazard detection and TRN models for concept of operations studies, and leading with the HPSC processor development.

• Draper of Cambridge, Massachusetts, is providing and adapting the TRN algorithm, developing navigation software, and implementing guidance algorithms.

• The SPLICE project is funded by NASA’s Game Changing Development program within the Space Technology Mission Directorate.


Figure 2: The NDL instrument comprises a chassis, containing electrooptic and electronic components, and an optical head with three telescopes (image credit: NASA)


Figure 3: The SPLICE system components prepared for inspection before flight tests (image credit: NASA)


Figure 4: Bruce Barnes of NASA/LaRC in Hampton, Virginia, routes fiber optic cables from the NDL lidar laser source (white box in background) to the instrument’s telescopes. Two telescopes, three sighting scopes, and a sighting camera are visible in this picture (image credit: NASA)


Figure 5: The NDL (Navigation Doppler Lidar) transmits laser beams, which bounce off the surface and back to the instrument (image credit: NASA)

Development status

• September 17, 2020: With NASA planning robotic and crewed missions to new locations on the Moon and Mars, avoiding landing on the steep slope of a crater or in a boulder field is critical to helping ensure a safe touchdown for surface exploration of other worlds. To improve landing safety, NASA is developing and testing a suite of precise landing and hazard-avoidance technologies. 2)

Figure 6: A new suite of lunar landing technologies, called Safe and Precise Landing – Integrated Capabilities Evolution (SPLICE), will enable safer and more accurate lunar landings than ever before. Future Moon missions could use NASA's advanced SPLICE algorithms and sensors to target landing sites that weren’t possible during the Apollo missions, such as regions with hazardous boulders and nearby shadowed craters. SPLICE technologies could also help land humans on Mars (video credit: NASA)

- Three of SPLICE’s four main subsystems will have their first integrated test flight on a Blue Origin New Shepard rocket during an upcoming mission. As the rocket’s booster returns to the ground, after reaching the boundary between Earth’s atmosphere and space, SPLICE’s terrain relative navigation, navigation Doppler lidar, and descent and landing computer will run onboard the booster. Each will operate in the same way they will when approaching the surface of the Moon.

- The fourth major SPLICE component, a hazard detection lidar, will be tested in the future via ground and flight tests.


Figure 7: The New Shepard (NS) booster lands after this vehicle's fifth flight during NS-11, 2 May, 2019 (image credit: Blue Origin)

Following Breadcrumbs

- When a site is chosen for exploration, part of the consideration is to ensure enough room for a spacecraft to land. The size of the area, called the landing ellipse, reveals the inexact nature of legacy landing technology. The targeted landing area for Apollo 11 in 1969 was approximately 11 miles by 3 miles, and astronauts piloted the lander. Subsequent robotic missions to Mars were designed for autonomous landings. Viking arrived on the Red Planet in 1976 with a target ellipse of 174 miles by 62 miles.


Figure 8: The Apollo 11 landing ellipse, shown here, was 11 miles by 3 miles. Precision landing technology will reduce landing area drastically, allowing for multiple missions to land in the same region (image credit: NASA)

- Technology has improved, and subsequent autonomous landing zones decreased in size. In 2012, the Curiosity rover landing ellipse was down to 12 miles by 4 miles.
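The shrinking ellipses translate into a dramatic reduction in area. A quick check of the figures quoted above:

```python
import math

def ellipse_area_sq_mi(major_mi, minor_mi):
    """Area of a landing ellipse from its full axis lengths (miles)."""
    return math.pi * (major_mi / 2.0) * (minor_mi / 2.0)

apollo11  = ellipse_area_sq_mi(11, 3)     # ~26 sq mi
viking    = ellipse_area_sq_mi(174, 62)   # ~8,500 sq mi
curiosity = ellipse_area_sq_mi(12, 4)     # ~38 sq mi
```

Even Curiosity's much-improved ellipse covers an area hundreds of times smaller than Viking's, while precision landing aims to reduce the footprint further still.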

- Being able to pinpoint a landing site will help future missions target areas for new scientific explorations in locations previously deemed too hazardous for an unpiloted landing. It will also enable advanced supply missions to send cargo and supplies to a single location, rather than spread out over miles.

- Each planetary body has its own unique conditions. That’s why “SPLICE is designed to integrate with any spacecraft landing on a planet or moon,” said project manager Ron Sostaric. Based at NASA’s Johnson Space Center in Houston, Sostaric explained the project spans multiple centers across the agency.

- “What we’re building is a complete descent and landing system that will work for future Artemis missions to the Moon and can be adapted for Mars,” he said. “Our job is to put the individual components together and make sure that it works as a functioning system.”

Figure 9: Terrain relative navigation provides a navigation measurement by comparing real-time images to known maps of surface features during descent (image credit: NASA)

- Atmospheric conditions might vary, but the process of descent and landing is the same. The SPLICE computer is programmed to activate terrain relative navigation several miles above the ground. The onboard camera photographs the surface, taking up to 10 pictures every second. Those are continuously fed into the computer, which is preloaded with satellite images of the landing field and a database of known landmarks.

- Algorithms search the real-time imagery for the known features to determine the spacecraft location and navigate the craft safely to its expected landing point. It’s similar to navigating via landmarks, like buildings, rather than street names.

- In the same way, terrain relative navigation identifies where the spacecraft is and sends that information to the guidance and control computer, which is responsible for executing the flight path to the surface. The computer will know approximately when the spacecraft should be nearing its target, almost like laying breadcrumbs and then following them to the final destination.

- This process continues until approximately four miles above the surface.
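The map-matching idea can be illustrated with a brute-force patch search. Real TRN tracks landmark features through a navigation filter rather than exhaustively sliding a template, so treat this only as a sketch of the comparison step:

```python
import numpy as np

def locate_patch(live_patch, map_image):
    """Find where a live camera patch best matches a preloaded map.

    Brute-force sum-of-squared-differences search over every offset;
    the offset with the smallest difference is the estimated location.
    """
    ph, pw = live_patch.shape
    mh, mw = map_image.shape
    best, best_rc = None, (0, 0)
    for r in range(mh - ph + 1):
        for c in range(mw - pw + 1):
            ssd = np.sum((map_image[r:r+ph, c:c+pw] - live_patch) ** 2)
            if best is None or ssd < best:
                best, best_rc = ssd, (r, c)
    return best_rc

# Synthetic "map"; the live patch is cut from row 6, col 9
rng = np.random.default_rng(0)
m = rng.random((20, 20))
patch = m[6:11, 9:14].copy()
loc = locate_patch(patch, m)  # → (6, 9)
```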

Laser navigation

- Knowing the exact position of a spacecraft is essential for the calculations needed to plan and execute a powered descent to a precise landing. Midway through the descent, the computer turns on the navigation Doppler lidar to acquire velocity and range measurements that further add to the precise navigation information coming from terrain relative navigation. Lidar (light detection and ranging) works in much the same way as radar but uses light waves instead of radio waves. Three laser beams, each as narrow as a pencil, are pointed toward the ground. The light from these beams bounces off the surface, reflecting back toward the spacecraft.

- The travel time and wavelength of that reflected light are used to calculate how far the craft is from the ground, what direction it’s heading, and how fast it’s moving. These calculations are made 20 times per second for all three laser beams and fed into the guidance computer.
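Combining the three line-of-sight measurements into a full velocity vector is a small linear-algebra problem: each telescope measures the projection of the velocity onto its beam direction, so stacking the three unit vectors gives a 3x3 system. The beam angles below are assumed for illustration, not the actual SPLICE mounting geometry:

```python
import numpy as np

def velocity_from_los(unit_vectors, los_speeds):
    """Recover the 3D velocity vector from three line-of-sight speeds.

    Each row of `unit_vectors` is a beam direction u_i; the measured
    speed along that beam is v_i = u_i . v, so U v = v_los.
    """
    U = np.asarray(unit_vectors)          # 3x3, one beam per row
    return np.linalg.solve(U, np.asarray(los_speeds))

# Three beams canted away from nadir (illustrative geometry)
beams = np.array([[ 0.30,  0.00, -0.954],
                  [-0.15,  0.26, -0.954],
                  [-0.15, -0.26, -0.954]])
beams /= np.linalg.norm(beams, axis=1, keepdims=True)

v_true = np.array([5.0, -2.0, -30.0])     # m/s, for checking
v_est = velocity_from_los(beams, beams @ v_true)
```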

- Doppler lidar works successfully on Earth. However, Farzin Amzajerdian, the technology’s co-inventor and principal investigator from NASA’s Langley Research Center in Hampton, Virginia, is responsible for addressing the challenges for use in space.


Figure 10: Langley engineer John Savage inspects a section of the navigation Doppler lidar unit after its manufacture from a block of metal (image credits: NASA/David C. Bowman)

- “There are still some unknowns about how much signal will come from the surface of the Moon and Mars,” he said. If material on the ground is not very reflective, the signal back to the sensors will be weaker. But Amzajerdian is confident the lidar will outperform radar technology because the laser frequency is orders of magnitude greater than radio waves, which enables far greater precision and more efficient sensing.

- The workhorse responsible for managing all of this data is the descent and landing computer. Navigation data from the sensor systems is fed to onboard algorithms, which calculate new pathways for a precise landing.

Computer Powerhouse

- The descent and landing computer synchronizes the functions and data management of individual SPLICE components. It must also integrate seamlessly with the other systems on any spacecraft. So, this small computing powerhouse keeps the precision landing technologies from overloading the primary flight computer.

- The computational needs identified early on made it clear that existing computers were inadequate. NASA’s high-performance spaceflight computing processor would meet the demand but is still several years from completion. An interim solution was needed to get SPLICE ready for its first suborbital rocket flight test with Blue Origin on its New Shepard rocket. Data from the new computer’s performance will help shape its eventual replacement.


Figure 11: SPLICE hardware undergoing preparations for a vacuum chamber test. Three of SPLICE’s four main subsystems will have their first integrated test flight on a Blue Origin New Shepard rocket (image credit: NASA)

- John Carson, the technical integration manager for precision landing, explained that “the surrogate computer has very similar processing technology, which is informing both the future high-speed computer design, as well as future descent and landing computer integration efforts.”

- Looking forward, test missions like these will help shape safe landing systems for missions by NASA and commercial providers on the surface of the Moon and other solar system bodies.

- “Safely and precisely landing on another world still has many challenges,” said Carson. “There’s no commercial technology yet that you can go out and buy for this. Every future surface mission could use this precision landing capability, so NASA's meeting that need now. And we’re fostering the transfer and use with our industry partners.”

• June 20, 2019: NASA is developing an advanced suite of sensors, avionics and algorithms to avoid hazards and perform extremely safe and precise landings on planetary surfaces. One of those critical landing technologies is navigation Doppler lidar (NDL), which is used to determine precise vehicle velocity and position. 3)

- The new NDL unit, being developed at NASA’s Langley Research Center in Hampton, Virginia, comprises a small electronics box connected by fiber optic cables to three lenses that transmit laser beams to an anticipated distance greater than 4 miles on the Moon and 2.5 miles on Earth. Those beams reflect off the ground to help the sensor determine its speed, direction and altitude. NDL provides ultra-precise measurements that identify exactly how high a human or robotic lander is and how fast it is traveling.

- “The lander uses the NDL measurements during its descent toward the Moon surface to precisely and gently land at the designated location,” said Farzin Amzajerdian, the NDL principal investigator.

Figure 12: NASA will need ultra-precise entry, descent and landing technology to land the first woman and next man safely on the Moon in 2024 (video credit: NASA)

- Engineers at NASA recently tested the performance of NDL’s velocity measurement capability during a high-speed rocket sled test at the Naval Air Weapons Station China Lake in Kern County, California. The objective of the testing was to validate NDL’s ability to accurately track the speed of a target moving at 450 miles per hour. The target is put on a sled and launched down a track while NDL measures its distance and velocity.

- The tests were a part of the SPLICE (Safe & Precise Landing – Integrated Capabilities Evolution) project, which is developing the combination of technologies needed to land more precisely on planetary surfaces. SPLICE technologies will be infused into CLPS (Commercial Lunar Payload Services) missions within the next few years, with NDL providing instruments for both the Astrobotic and Intuitive Machines lander missions planned for 2021.

- During the series of tests, NDL telescopes were fixed to a stationary mount that picked up range and speed of a sled powered by rocket motors that traveled down a rail track at 450 miles per hour. The NDL unit documented accurate speed and range measurements of the sled during each of the eight tests and validated the targeted NDL design performance.

- “This recent test validates the NDL’s ability to provide extremely accurate velocity measurements during descent and landing, which is a part of critical testing required to validate all of the SPLICE technologies for future NASA missions,” said John Carson, principal investigator for SPLICE.

- The major components of SPLICE, along with NDL, are a camera for terrain relative navigation, a hazard detection lidar, and a descent and landing computer that incorporates a surrogate for the in-development NASA HPSC (High-Performance Spaceflight Computing) processor.

- The SPLICE suite of sensors and algorithms uses real-time images and 3D-generated maps to precisely navigate during descent and landing toward safe touchdown locations in close proximity to targeted planetary surface locations. The NASA HPSC chip enables SPLICE computing to rapidly process high volumes of data with complex algorithms that determine precise navigation information, intelligent guidance maneuvers, and the safest landing sites for future missions.

- The HPSC processor architecture provides roughly 100 times the computational capacity of current space flight processors for the same amount of power. The chip also offers greater flexibility, extensibility and interoperability than current processors.

- A test of the terrain-relative navigation capability to capture and compare real-time images with known maps of surface features is planned for late 2019 through NASA’s Flight Opportunities program, which is managed at NASA's Armstrong Flight Research Center in Edwards, California.

- SPLICE’s advanced sensing, computing and algorithm technologies will enable safe and precise landing for future NASA missions.

- Charged with returning astronauts to the Moon within five years, NASA’s Artemis lunar exploration plans are based on a two-phase approach: the first is focused on speed – landing astronauts on the Moon by 2024 – while the second will establish a sustained human presence on and around the Moon by 2028. We will use what we learn on the Moon to prepare to send astronauts to Mars. The technology missions on this launch will advance a variety of future exploration missions.

More description and background of the SPLICE project

• January 2019: Guidance, Navigation and Control (GN&C) technologies for precise and safe landing are essential for future robotic science and human exploration missions to solar system destinations with targeted surface locations that pose a significant risk to successful landing and subsequent mission operations. These Entry, Descent and Landing (EDL) technologies are a part of the NASA domain called PL&HA (Precision Landing and Hazard Avoidance) and are considered high-priority capabilities within NASA space technology development roadmaps to promote and enable new mission concepts. The SPLICE (Safe & Precise Landing – Integrated Capabilities Evolution) project is a multi-center, multi-directorate NASA project focused on continuing the decade-plus of NASA investments and projects focused on PL&HA technology development and infusion. This paper highlights the GN&C technologies in development within SPLICE, along with the simulation and field test plans for validation of the capabilities and Technology Readiness Level (TRL) maturation toward infusion into potential near-term robotic lunar landing missions. 4)

NASA Technology Roadmaps 5) and the NASA Space Technology Mission Directorate (STMD) EDL investment strategy deem precision landing and hazard avoidance (PL&HA) technologies as critical capabilities for future robotic science and human exploration missions to the Moon, Mars, icy bodies and other solid-surface destinations. The PL&HA suite of technologies includes multiple sensors, algorithms, and avionics components that when integrated together enable a spacecraft to safely land in close proximity to specified surface locations. Such locations include landing within topographically diverse terrain consisting of lander-sized hazards (e.g., high slopes and/or large rocks), as well as in regions in close proximity of pre-positioned surface assets (e.g., cached science samples or human mission infrastructure). These technologies support the NASA Strategic Plan to enable exploration of new solar system destinations and allow access to new surface regions of scientific interest that are currently unreachable with current landing capabilities.

The SPLICE project has been initiated with the STMD Game Changing Development (GCD) Program as a three-year project from government fiscal year (FY) 2018 through FY 2020. The project objective is to develop, mature, demonstrate and infuse PL&HA (Precision Landing and Hazard Avoidance) technologies into NASA and potential US commercial spaceflight missions. The SPLICE project is the focal PL&HA project within the agency and is the direct successor of the prior NASA ALHAT (Autonomous precision Landing and Hazard Avoidance Technology) and COBALT (CoOperative Blending of Autonomous Landing Technologies) projects that ended in FY 2015 and FY 2017, respectively.

SPLICE is conducting detailed modeling and ConOps analyses of GN&C and PL&HA systems and multiple candidate mission EDL architectures for robotic and human landings on the Moon, Mars, and other solar system bodies. The work is in partnership with the STMD ESM project. The analyses are utilizing multiple reference landers, combinations of GN&C and PL&HA sensors, and different EDL trajectories to develop a PL&HA Requirements Information Matrix (RIM). The purpose of the RIM is to establish applicability of existing PL&HA technologies to near-term missions, as well as to identify capability gaps that require future NASA investments into next-generation PL&HA technologies.

Figure 13 provides a notional PL&HA EDL or DDL (Deorbit, Descent and Landing) ConOps, highlighting representative GN&C phases and the sensor systems used during each phase. The PL&HA sensing capabilities under evaluation in the studies include Terrain Relative Navigation (TRN), Hazard Detection (HD), Hazard Relative Navigation (HRN), NDL and/or optical velocimetry for velocity, and NDL or other altimeters for ranging. TRN uses a passive optical camera and a reconnaissance map to determine a global navigation state. HD utilizes an optical sensor to determine landing hazards and safe landing sites from either a camera image or a lidar-generated map; SPLICE technologies are focused on active lidar-based HD. HRN utilizes the lidar-generated map from HD to perform a subsequent TRN-like function.


Figure 13: Notional PL&HA ConOps (image credit: NASA)

Focal technologies

NDL (Navigation Doppler Lidar)

The NDL, in development at NASA LaRC, provides ultra-precise and direct velocity measurements, as well as range measurements. 6) 7) 8) The NDL measurements are utilized within a lander GN&C subsystem to minimize navigation error in velocity and position (minimize the landing ellipse) and to tightly control vertical and lateral velocities during terminal descent to ensure a soft and/or controlled touchdown. NDL has been in development within NASA for more than a decade and will be at TRL 6 at the end of 2019, following completion and testing of an Engineering Test Unit (ETU).

The NDL ETU consists of an electronics chassis and a fiber-coupled optical head (Figure 14). The electronics chassis incorporates a custom command and data handling (C&DH) board, seed laser, fiber optic amplifier, and other electrooptical components. The optical head contains three fiber-coupled, transmit/receive telescopes that are rigidly mounted to the vehicle with a clear field of view to the ground.


Figure 14: NDL ETU electronics chassis (left) illustration and example optical head (right), image credit: NASA

The NDL uses a customized laser waveform and optical homodyne detection to obtain both velocity and range measurements along each telescope line of sight (LOS). The NDL ETU is designed to achieve LOS velocity and range performance of 200 m/s and 7+ km (lunar), respectively, with accuracies on the order of 2 cm/s and 2 m, respectively. The Size, Weight, and Power (SWaP) for the NDL ETU are as follows: chassis size 35 x 24 x 17 cm, chassis mass under 10 kg, optic head mass under 4 kg (customizable), power ~85 W.

The NDL ETU is designed for spaceflight, incorporating spaceflight and path-to-spaceflight components, conductive cooling, and provisions to minimize electromagnetic interference (EMI). The telescopes can be separated for packaging advantages with spacecraft design and integration. The low divergence of the NDL laser beams further facilitates packaging options. Environmental tests of the NDL ETU will include thermal, vacuum, vibration, and EMI tests, along with radiation testing of select components. In addition, a high-speed NDL test will be conducted to validate the velocity performance of the sensor.

DLC (Descent and Landing Computer)

The DLC, in development at NASA JSC, is designed as a stand-alone EDL GN&C computer to offload the computationally expensive processing of PL&HA algorithms from the host vehicle (HV) flight critical functions running on the primary flight computer. The design has been done in partnership with the STMD HPSC Project to develop a surrogate processing platform in preparation for the forthcoming NASA HPSC processor. The architecture also leverages design elements of the HD compute element developed during the former ALHAT project.

The DLC manages and time stamps all Input/Output (I/O) data from the EDL GN&C and PL&HA sensors, communicates with the HV flight computer, and provides a hardware-based method for time synchronization between the DLC and the HV time. See the illustration in Figure 15. The DLC incorporates commercial-off-the-shelf (COTS) Xilinx Multi-Processor System on a Chip (MPSoC) devices as surrogates for the in-development NASA HPSC multicore processor. Each MPSoC is hosted on a custom baseboard. Also included are a path-to-spaceflight FPGA (Field-Programmable Gate Array) card for I/O data interfacing, management and time stamping, as well as a path-to-spaceflight DLC power card and a solid state drive (SSD). The MPSoC devices include a quad-core ARM A53 processor cluster, dual ARM R5 real-time processors, and FPGA fabric. The MPSoC FPGA is used for implementing high-speed serial interface firmware to the standalone FPGA, as well as for MPSoC-to-MPSoC inter-board communication for PL&HA applications requiring additional A53 clusters for data and algorithms processing.


Figure 15: Illustration of the DLC architecture with interfaces to PL&HA sensors and terrestrial host vehicle testbeds (image credit: NASA)

Software development within SPLICE is leveraging the NASA core Flight System (cFS) framework, which has been ported to the 64-bit multicore HPSC (High-Performance Spaceflight Computing) surrogate and is running native within the DLC. The DLC EDU is being designed toward spaceflight, including all path-to-spaceflight or equivalent electronics and a path-to-spaceflight chassis enclosure. The DLC EDU will be put through functional and environmental testing relevant for spaceflight environments, as well as through HWIL (Hardware-In-the-Loop) simulation-based testing, to achieve TRL 5+ within the timeline of the SPLICE project.

HDL (Hazard Detection Lidar)

The HDL, in development at NASA/GSFC, is a scan-array lidar system that couples an optical beam steering mechanism with a small detector array to generate a precise three-dimensional terrain map within seconds. The HDL measures millions of ranges per second, each with ~1 cm accuracy, and can accommodate a wide range of return signal intensities. These range measurements are merged onboard in real-time with inertial measurements of position and attitude to create an accurate DEM (Digital Elevation Map) of the landing area. This DEM is then assessed to identify safe landing sites.
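The merge of range measurements into a DEM can be sketched as a gridding step, assuming the ranges have already been converted with the vehicle's position and attitude into ground-frame (x, y, z) points (a simplification of the onboard real-time processing):

```python
import numpy as np

def build_dem(points, cell_size=0.1, size_m=10.0):
    """Bin georeferenced lidar returns into a DEM grid.

    `points` are (x, y, z) ground intersections; each grid cell keeps
    the highest return it receives, and cells with no returns stay
    NaN. A simplified stand-in for onboard DEM assembly.
    """
    n = int(round(size_m / cell_size))
    dem = np.full((n, n), np.nan)
    for x, y, z in points:
        i, j = int(y / cell_size), int(x / cell_size)
        if 0 <= i < n and 0 <= j < n:
            dem[i, j] = z if np.isnan(dem[i, j]) else max(dem[i, j], z)
    return dem
```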

The HDL has the capability to produce medium- and short-range, high-resolution terrain maps, as well as long-range altimetry measurements. The terrain map size, range precision, GSD (Ground Sample Distance), and slant range at execution are customizable to meet the needs of multiple mission scenarios. The configuration for the SPLICE HDL EDU is in active architectural trades through the project ConOps studies of potential near-term robotic lunar lander missions. The targeted specifications are slant ranges on the order of 500 m to 1+ km, map diameters on the order of 50-100 m, GSD of 5-10 cm, and cm-level range precision. A detailed HDL performance simulator has been developed to predict the accuracy and coverage of the DEM over a range of environmental conditions (e.g., position and angular changes during data collection), surface conditions (slope, roughness, distinct features/"hazards", etc.), and sensor performance characteristics (e.g., range precision, pixel size, false detections, pixel dropouts, etc.). A simulated lunar example of an anticipated HDL DEM is provided in Figure 16.

The components within the HDL EDU are primarily high-TRL (Technology Readiness Level), spaceflight-heritage components, or derivatives of subsystems that have flown on multiple missions, such as LOLA (Lunar Orbiter Laser Altimeter) flying on the LRO (Lunar Reconnaissance Orbiter) mission and the GEDI (Global Ecosystem Dynamics Investigation) lidar set to fly on the ISS (International Space Station). The steering mechanism is the only mid-TRL component, and it is in active development and testing within the project. The HDL EDU will be tested to TRL 5+ within the timeline of the SPLICE project.


Figure 16: A simulated HDL lunar DEM (image credit: NASA)

PL&HA (Precision Landing and Hazard Avoidance) Algorithms

Safe and precise landing requires complex algorithms and high-performance computing to fuse sensor data and plan intelligent maneuvers that are subsequently executed with the vehicle propulsion system. The SPLICE project is investigating new and novel methods for onboard HD (Hazard Detection) for safe-site identification, terrain-relative sensor fusion for improved navigation, and 6-DOF (Degree of Freedom) guidance. In addition, SPLICE is leveraging existing NASA and US Government investments in TRN algorithms.

Hazard Detection and Safe Site Identification: Identification of safe landing sites is accomplished with an HD algorithm that analyzes the HD Lidar terrain DEM to determine candidate surface sites with high probability for safe landing (i.e., acceptable slopes and low probability of lander-size hazards). The algorithm considers lander geometry, hazard tolerances, touchdown orientations, and lidar and navigation uncertainty in the determination of a safety probability for each candidate location within the generated terrain map. The identified safe landing sites are then utilized within higher-level GN&C logic to plan and execute a hazard avoidance divert to accomplish a safe and precise landing. The HD phase of PL&HA is time critical, so the HD algorithm must process the HD Lidar map and identify safe landing sites rapidly in real time. The SPLICE HD algorithm work is evolving techniques developed at NASA JPL during the former ALHAT project. 9)
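A minimal sketch of the core idea, assuming a simple slope-plus-roughness safety test (the thresholds, the gradient-based slope estimate, and the 3x3 roughness window are illustrative assumptions, not the SPLICE HD algorithm, which additionally models lander geometry, touchdown orientation, and sensor/navigation uncertainty):

```python
import numpy as np

def safe_sites(dem, cell, max_slope_deg=10.0, max_rough=0.2):
    """Boolean safety map for an evenly gridded DEM (heights in meters).

    A cell is flagged unsafe if its local slope angle exceeds the lander's
    slope tolerance or its height deviates too far from the local mean
    (a crude stand-in for lander-scale rock/roughness hazards).
    """
    gy, gx = np.gradient(dem, cell)                   # local surface gradients
    slope = np.degrees(np.arctan(np.hypot(gx, gy)))   # slope angle per cell
    # Roughness: deviation from a 3x3 local mean (interior cells only).
    local_mean = np.copy(dem)
    local_mean[1:-1, 1:-1] = sum(
        dem[1 + dy:dem.shape[0] - 1 + dy, 1 + dx:dem.shape[1] - 1 + dx]
        for dy in (-1, 0, 1) for dx in (-1, 0, 1)) / 9.0
    rough = np.abs(dem - local_mean)
    return (slope <= max_slope_deg) & (rough <= max_rough)
```

Because the HD phase is time critical, a flight implementation must evaluate every candidate site within the map in real time, which drives the need for the DLC's multicore processing.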

PL&HA Data Fusion for Improving Navigation: SPLICE is supporting fundamental research in navigation algorithms to improve knowledge precision, enhance robustness, and identify promising new methodologies. The modeling of the terrain-relative measurements and the intelligent fusion of those measurements within a Kalman filter is a focal research area. 10) 11)

Investigations are also underway to determine the sensitivity of navigation knowledge to relative-sensor and terrain (e.g., ellipsoids, DEMs) model fidelity, as well as the effect of sequencing different model fidelities during the EDL timeline. 12) Research efforts are also examining the theoretical underpinnings of fusing classic Kalman filtering techniques with image-based localization techniques. 13) 14)
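As a toy illustration of fusing a terrain-relative measurement in a Kalman filter, the sketch below runs one propagate/update cycle for a 1-D altitude/velocity state with a nadir range measurement referenced to a terrain model (all noise values, and the flat terrain lookup, are illustrative assumptions rather than SPLICE filter parameters):

```python
import numpy as np

def kf_step(x, P, dt, range_meas, terrain_height, sigma_r=0.05, q=0.1):
    """One Kalman propagate/update. x = [altitude, vertical rate], P = 2x2."""
    F = np.array([[1.0, dt], [0.0, 1.0]])            # constant-velocity model
    Q = q * np.array([[dt**3 / 3, dt**2 / 2],        # white-accel process noise
                      [dt**2 / 2, dt]])
    x = F @ x                                        # propagate state
    P = F @ P @ F.T + Q                              # propagate covariance
    # Terrain-relative measurement: nadir range = altitude - terrain height.
    H = np.array([[1.0, 0.0]])
    z_pred = x[0] - terrain_height
    S = H @ P @ H.T + sigma_r**2                     # innovation covariance
    K = (P @ H.T) / S                                # Kalman gain (2x1)
    x = x + (K * (range_meas - z_pred)).ravel()      # state update
    P = (np.eye(2) - K @ H) @ P                      # covariance update
    return x, P
```

The modeling questions SPLICE is studying show up directly in `terrain_height`: substituting an ellipsoid, a coarse DEM, or a fine DEM changes the predicted measurement and hence the filter's accuracy.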

Dual-Quaternion 6-DOF Intelligent Guidance: Traditional engineering approaches to guidance policy design consider three-degree-of-freedom (3-DOF) translation-only dynamics and neglect physical state and control constraints. Subsequent high-fidelity 6-DOF simulations and analyses then adapt the 3-DOF guidance policies to meet spacecraft pointing requirements and ensure adherence to physical state and control constraints. This process comes at the cost of potentially reducing the trade space for feasible guidance designs, as well as requiring extensive analyst effort to verify that unconstrained 3-DOF designs subsequently satisfy the constrained 6-DOF requirements.

SPLICE is supporting fundamental research into a 6-DOF DQG (Dual-Quaternion Guidance) algorithm for powered descent that intrinsically incorporates coupled 6-DOF (translation and rotation) constraints, state-triggered constraints, and other relevant state and control constraints. 15) 16)

The DQG algorithm uses convex optimization methods to solve the constrained 6-DOF guidance problem. The DQG formulation is particularly well suited to convexification of both the dynamics and constraints, which offers computational efficiency and a guaranteed solution (provided the problem is feasible). DQG is highly relevant to PL&HA because ConOps phases such as Hazard Detection (Figure 13) have coupled 6-DOF state constraints. During the HD phase, the HD Lidar must be actively pointed to scan a specific terrain region while the spacecraft itself is translating.

In comparison to traditional approaches, the DQG algorithm framework provides the potential to reduce the analysis time required for guidance policy design because the framework is fully 6-DOF and incorporates relevant PL&HA constraints. Additionally, the 6-DOF formulation and convex optimization-based approach allows automatic, and thus broader, searches of the trade space for valid guidance policies. This same formulation is also conducive to onboard implementations because convex algorithms can be solved in polynomial time and to a prescribed level of accuracy.
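To illustrate why a dual quaternion is a natural object for coupled guidance, the sketch below (a textbook construction, not the DQG flight algorithm) packs a rotation quaternion and a translation into one pair and recovers the translation from it, showing that a single algebraic state carries the full pose:

```python
import numpy as np

def qmul(a, b):
    """Hamilton product of quaternions stored as [w, x, y, z]."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2])

def dq_from_pose(q_rot, t):
    """Dual quaternion (real part, dual part) from rotation + translation."""
    t_quat = np.array([0.0, *t])
    return q_rot, 0.5 * qmul(t_quat, q_rot)      # dual part couples t with q

def dq_translation(q_real, q_dual):
    """Recover translation: t = vector part of 2 * q_dual * conj(q_real)."""
    conj = q_real * np.array([1.0, -1.0, -1.0, -1.0])
    return 2.0 * qmul(q_dual, conj)[1:]
```

Because rotation and translation live in one state, constraints that couple them, such as pointing the HD Lidar at a fixed surface region while translating, can be written directly on the dual-quaternion trajectory and convexified.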


Component-level and integrated system-level tests are critical to the PL&HA TRL maturation process. The SPLICE project is conducting numerous component-level tests in lab, ground, and airborne test facilities to evaluate component technology performance. System-level testing is a challenge, however, as the intended operational environment for PL&HA technologies is within a spacecraft GN&C subsystem performing an EDL trajectory profile. To accomplish integrated system-level tests, SPLICE is leveraging two testbeds: an HWIL simulation and suborbital rockets.

HWIL-based Simulations

The SPLICE HWIL simulation testbed (Figure 17) at NASA JSC has been implemented for use in the development, performance testing, and validation of PL&HA subsystems and flight software, as well as for future playback and analysis of field test data. The HWIL testbed provides a low-cost method for system-level development and validation prior to incurring the higher costs of field/flight testing or spaceflight mission infusion. The HWIL testbed incorporates physical avionics, sensor hardware, ground consoles, and a 6-DOF high-fidelity simulation. The simulation is developed using the JSC Trick framework, which integrates 6-DOF dynamic body models, environment models, and sensor and actuator models. The HWIL testbed is being used in the development and performance testing of the DLC EDU architecture, which executes flight software within the NASA cFS framework. Together, the HWIL simulation testbed and the cFS-based flight software provide the SPLICE project with capabilities that support future DLC migration to the HPSC flight processor. They also enable validation of PL&HA technologies in simulated flight-like environments to advance TRL and mitigate risk in future spaceflight applications.
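The testbed's core pattern, truth dynamics driving sensor models at a fixed step, can be sketched in a few lines. The sketch below is a software-only stand-in (the 1-D dynamics, thrust level, and altimeter noise are all illustrative assumptions, not Trick models):

```python
import numpy as np

# Fixed-step descent loop: propagate truth state, then generate the noisy
# sensor measurement that would be fed to the hardware/flight software.
rng = np.random.default_rng(0)
dt, g, thrust_acc = 0.1, 1.62, 2.0       # step [s], lunar gravity, assumed accel
alt, vel = 1000.0, -30.0                 # initial truth altitude and rate
log = []
while alt > 0.0:
    vel += (thrust_acc - g) * dt         # truth dynamics (1-D powered descent)
    alt += vel * dt
    meas = alt + rng.normal(0.0, 0.05)   # altimeter model with 5 cm noise
    log.append((alt, meas))              # (truth, measurement) for analysis
```

In the actual testbed this loop runs in real time, with physical sensor electronics and the DLC EDU in place of the simple models, so the same flight software binary can be exercised end to end.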


Figure 17: Illustration of the SPLICE hardware-in-the-loop simulation testbed (image credit: NASA)

Suborbital Rockets

Suborbital rockets provide another valuable capability for integrated testing and maturation of EDL and PL&HA technologies. These testbeds provide terrestrial, moderate-cost capabilities to evaluate PL&HA technologies (in open-loop or closed-loop) within a vehicle GN&C subsystem performing dynamically-relevant powered descent and landing. This additional test capability provides further risk reduction and systems-level TRL maturation prior to the PL&HA technologies being infused into a high-cost spaceflight mission.

Multiple suborbital rockets have been leveraged within the NASA PL&HA community during previous projects: the ALHAT project [17)] in 2014 tested a prototype NDL and HD system in closed loop onboard the NASA Morpheus vehicle; the JPL ADAPT (Autonomous Descent and Ascent Powered-Flight Testbed) project [18) 19)] in 2014 tested JPL TRN in open loop onboard a Masten Space Systems (MSS) Xombie vehicle; and the COBALT project [20) 21) 22)] in 2017 tested the NDL and JPL TRN in open loop onboard a MSS Xodiac vehicle.

• July 31, 2018: NASA's technology advancement needs for entry, descent and landing call for high-precision, high-rate sensors that can improve navigation accuracy and vehicle control performance. Higher landing accuracy is required for any future human lander missions, and likely, for most robotic missions. Sensors and algorithms that significantly reduce navigation errors and can image the local terrain will enable landing at locations of high scientific interest that would otherwise pose significant risk to the vehicle. 23)

- The SPLICE project is developing PL&HA (Precision Landing and Hazard Avoidance) technologies for NASA and for potential commercial space flight missions. SPLICE technologies include sensors, algorithms, advanced space flight computing capabilities, and simulation tools used to integrate and study guidance, navigation, and control (GN&C) system performance. SPLICE efforts include HWIL (Hardware-In-the-Loop) simulation testing, ground testing, and flight testing, including reuse of hardware from the COBALT (CoOperative Blending of Autonomous Landing Technologies) suborbital flight-test payload.

1) ”Safe and Precise Landing – Integrated Capabilities Evolution (SPLICE),” NASA, 5 August 2020, URL:

2) Margo Pierce, ”NASA Technology Enables Precision Landing Without a Pilot,” NASA 17 September 2020, URL:

3) Hillary Smith, ”Coming in for a Landing with New NASA Technology,” NASA Feature, 20 June 2019, URL:

4) John M. Carson III, Michelle M. Munk, Ronald R. Sostaric, Jay N. Estes, Farzin Amzajerdian, J. Bryan Blair, David K. Rutishauser, Carolina I. Restrepo, Alicia Dwyer Cianciolo, George T. Chen, Teming Tse, ”The SPLICE Project: Continuing NASA Development of GN&C Technologies for Safe and Precise Landing,” AIAA SciTech Forum, 7-11 January 2019, San Diego, CA, USA, published online: 6 January 2019, URL:

5) Steering Committee for NASA Technology Roadmaps; National Research Council of the National Academies, ”NASA Space Technology Roadmaps and Priorities: Restoring NASA's Technological Edge and Paving the Way for a New Era in Space,” The National Academies Press, January 2012, URL:

6) Farzin Amzajerdian, Diego Pierrottet, Glenn Hines, Larry Petway, and Bruce Barnes, ”Fiber-based Doppler Lidar Sensor for Vector Velocity and Altitude Measurements,” Frontiers in Optics 2015, OSA Technical Digest, San Jose, CA, USA, 20 October 2015, URL:

7) Farzin Amzajerdian, Glenn D. Hines, Larry B. Petway, Bruce W. Barnes, Diego F. Pierrottet, John M. Carson III, ”Development and Demonstration of Navigation Doppler Lidar for Future Landing Mission," Proceedings of AIAA Space 2016 Conference & Exposition, Long Beach, CA, 13-16 September 2016,

8) Diego F. Pierrottet, Glenn D. Hines, Bruce W. Barnes, Farzin Amzajerdian, Larry B. Petway, and John M. Carson III, ”Navigation Doppler Lidar Integrated Testing Aboard Autonomous Rocket Powered Vehicles," Proceedings of AIAA 2018 SciTech/GN&C Conference, paper: AIAA 2018-0614, Kissimmee, FL, 8-12 January 2018,

9) T. Ivanov, A. Huertas and J. Carson, ”Probabilistic Hazard Detection for Autonomous Safe Landing," Proceedings of AIAA Guidance, Navigation, and Control Conference, August 19-22, 2013, Boston, MA, USA,

10) Kari C. Ward and Kyle J. DeMars, ”Including Topographical Effects in Slant-Range Modeling,” Proceedings of the AIAA 2018 SciTech/GN&C Conference, paper: AIAA 2018-1333, Kissimmee, FL, USA, 8-12 January 2018,

11) John C. Helmuth, Kari C. Ward, ”Fusion of Multiple Terrain-Based Sensors for Descent-to-Landing Navigation,” Proceedings of the AIAA 2019 SciTech/GN&C Conference, San Diego, CA, USA, 7-11 January 2019, paper: AIAA 2019-0922,

12) Kenneth M. Kratzer, J. Cameron Helmuth, Kari C. Ward, Kyle J. DeMars, ”Impact of Sensor Model Fidelity and Scheduling on Navigation Performance,” Proceedings of the AIAA 2018 SciTech/GN&C Conference, paper: AIAA 2018-1334, 8-12 January 2018, Kissimmee, FL, USA,

13) James S. McCabe and Kyle J. DeMars, ”Robust, Terrain-Aided Landing Navigation through Decentralized Fusion and Random Finite Sets,” Proceedings of the AIAA 2018 SciTech/GN&C Conference, paper: AIAA 2018-1332, Kissimmee, FL, USA, 8-12 January 2018,

14) James S. McCabe, and Kyle J. DeMars, ”Landing Navigation With Terrain Aiding Using Prioritized Features," Proceedings of AIAA 2019, SciTech/GN&C Conference, San Diego, CA, USA, 7-11 January 2019

15) Taylor P. Reynolds, Michael Szmuk, Danylo Malyuta, Mehran Mesbahi, Behcet Acikmese, and John M. Carson, ”A State Triggered Line of Sight Constraint for 6-DoF Powered Descent Guidance," Proceedings of AIAA 2019, SciTech/GN&C Conference, San Diego, CA, USA, 7-11 January 2019,

16) Danylo Malyuta, Taylor P. Reynolds, Michael Szmuk, Mehran Mesbahi, Behcet Acikmese and John M. Carson, ”Discretization Performance and Accuracy Analysis for the Rocket Powered Descent Guidance Problem," Proceedings of AIAA 2019, SciTech/GN&C Conference, San Diego, CA, USA, 7-11 January 2019,

17) John M. Carson III, Edward A. Robertson, Nikolas Trawny, and Farzin Amzajerdian, ”Flight Testing ALHAT Precision Landing Technologies Integrated Onboard the Morpheus Rocket Vehicle," Proceedings of AIAA Space 2015 Conference & Exposition, paper: AIAA 2015-4417, Pasadena, CA, 31 August - 2 September 2015,

18) Nikolas Trawny, Joel Benito, Brent Tweddle, Charles F. Bergh, Garen Khanoyan, Geoffrey M. Vaughan, Jason X. Zheng, Carlos Y. Villalpando, Yang Cheng, Daniel P. Scharf, Charles D. Fisher, Phoebe M. Sulzen, James F. Montgomery, Andrew E. Johnson, MiMi Aung, Martin W. Regehr, Daniel Dueri, Behcet Acikmese, David Masten, Travis O'Neal, and Scott Nietfeld, ”Flight testing of terrain-relative navigation and large-divert guidance on a VTVL rocket,” Proceedings of AIAA SPACE 2015 Conference & Exposition, paper: AIAA 2015-4418, 31 August - 2 September 2015, Pasadena, CA,

19) Daniel P. Scharf, Martin W. Regehr, Geoffrey M. Vaughan, Joel Benito, Homayoon Ansari, MiMi Aung, Andrew Johnson, et al., ”ADAPT Demonstrations of Onboard Large-Divert Guidance with a VTVL Rocket," Proceedings of IEEE Aerospace Conference, Big Sky, MT, USA, 1-8 March 2014,

20) John M. Carson III, Carolina I. Restrepo, Carl R. Seubert, Farzin Amzajerdian, D. F. Pierrottet, S. M. Collins, T. O'Neal and R. Stelling, ”Open-Loop Flight Testing of COBALT Navigation and Sensor Technologies for Precise Soft Landing,” Proceedings of AIAA Space 2017 Conference & Exposition, paper: AIAA 2017-5287, Orlando, FL, 12 September 2017, URL:

21) John M. Carson III, Carl R. Seubert, Farzin Amzajerdian, Chuck Bergh, Ara Kourchians, Carolina I. Restrepo, Carlos Y. Villalpando, Travis V. O'Neal, Edward A. Robertson, Diego F. Pierrottet, Glenn D. Hines, and Reuben Garcia, ”COBALT: Development of a Platform to Flight Test Lander GN&C Technologies on Suborbital Rockets,” Proceedings of AIAA 2017 SciTech/GN&C Conference, paper: AIAA 2017-1496, Grapevine, TX, January 2017,

22) Carolina I. Restrepo, John M. Carson, Farzin Amzajerdian, Carl R. Seubert, Ronney S. Lovelace, Megan M. McCarthy, Teming Tse, Richard Stelling, and Steven M. Collins, ”Open-Loop Performance of COBALT Precision Landing Payload on a Commercial Sub-Orbital Rocket,” Proceedings of AIAA 2018 SciTech/GN&C Conference, paper: AIAA 2018-0613, Kissimmee, FL, USA, 8-12 January 2018,

23) John M. Carson III, Ronald Sostaric, Carolina I. Restrepo, Michelle Munk, ”NASA SPLICE Project: Development and Testing of Precision Landing GN&C Technologies,” Spacecraft Design, Testing and Performance, July 31, 2018, URL:

The information compiled and edited in this article was provided by Herbert J. Kramer from his documentation of: ”Observation of the Earth and Its Environment: Survey of Missions and Sensors” (Springer Verlag) as well as many other sources after the publication of the 4th edition in 2002. - Comments and corrections to this article are always welcome for further updates.
