
3D vision in space

Building on SINTEF's experience in developing 3D measurement systems, we are now addressing the challenges of 3D vision in space.

In The Global Exploration Strategy, the world's major space agencies outline a common strategy for future space exploration. A central point in this strategy is the development of robotic techniques. Robotics in space is expected to increase productivity and enable more complex missions, e.g. the ESA/NASA Mars Sample Return missions, and advanced 3D imaging is a key enabler of such robotics.

Why is this so important?

Human vision
Humans and other animals with stereo vision (two eyes with overlapping fields of view) perceive distance to objects using various techniques. Our stereo vision allows us to use physical techniques, such as vergence (the eyes turn more inward when looking at a close object) and parallax (the relative displacement between close and far-away objects when the viewing angle changes). However, we also use experience-based techniques. We know, for example, that an object obscuring another object is closer, and we have expectations about the relative size of objects we recognise. On top of this, the brain has an amazingly good object detection algorithm. We can easily detect a coffee mug on a crowded table or find a key almost hidden under a book, and we have no problem distinguishing between cats and dogs.

Camera vision
A regular camera records a projection of the world in two dimensions; it does not record information about the distance to objects in the scene. As such, a camera-based perception system lacks the physical 3D perception techniques available to humans. Furthermore, the object detection algorithms of a computer are not nearly as good as the human ones. It says a lot that, until quite recently, algorithms that could successfully distinguish between images of cats and dogs were considered state of the art, a task easily managed by any two-year-old. All in all, navigation, orientation and object detection algorithms would be far more robust if only the camera had access to the third dimension. It is evident that the development of 3D cameras is essential for autonomous or robotic missions.

3D vision techniques
There are two main categories of 3D measurement techniques: triangulation and Time of Flight (TOF). Triangulation-based techniques use two known viewing points, either two cameras (stereo 3D), or one camera and an illuminator (structured light 3D). The illuminator is typically called a projector, as it projects structured light, for example patterns of stripes or dots. Knowing the angles between the camera, the projector and a point in the scene, it is possible to infer the distance to the object. These techniques can provide highly accurate distance measurements at short range. Projection of dot patterns gives simpler systems with a low number of 3D points per image, while full-field pattern projection provides megapixel 3D resolution.
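
To make the triangulation principle concrete, here is a minimal sketch for an idealised, rectified stereo pair; the baseline, focal length and disparity values are assumptions chosen purely for illustration:

```python
def stereo_depth(baseline_m, focal_px, disparity_px):
    """Idealised rectified stereo: depth is inversely proportional to
    the disparity (pixel shift of a point between the two views)."""
    return baseline_m * focal_px / disparity_px

# Assumed example: 10 cm baseline, 1000 px focal length, 50 px disparity.
print(stereo_depth(0.10, 1000, 50))  # -> 2.0 (metres)
```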

TOF-based techniques exploit the known speed of light, c ≈ 3 × 10⁸ m/s. By measuring how long it takes for light to travel from the camera to a point in the scene and back, it is possible to determine the distance. These techniques are most useful at larger distances.
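
Correspondingly, a minimal sketch of the TOF principle; the round-trip time below is an assumed example, and the factor of two accounts for light travelling to the scene and back:

```python
SPEED_OF_LIGHT = 3.0e8  # m/s

def tof_distance(round_trip_time_s):
    """Distance from a measured round-trip time of flight."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2

# Assumed example: a ~667 ns round trip corresponds to roughly 100 m.
print(tof_distance(667e-9))  # -> ~100 (metres)
```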

Back on Earth, 3D vision has enabled robust detection of human pose, used e.g. in the Kinect system for console-free gaming and gesture/pose recognition (based on structured light using dot patterns). In the automotive industry in particular, LIDAR-based 3D vision (TOF) makes camera-based autonomous navigation far more robust than systems based on 2D vision. For robotics, highly accurate, full-field structured light 3D vision is required for robotic manipulation of objects, for example for moving an object (pick and place) or assembling parts and components.

3D vision in space

While console-free gaming might not be ESA's prime motivation for developing 3D sensing for space, the other Earth-based examples are very relevant to space applications. In the Mars sample-return missions, 3D vision is a natural functional requirement. Samples of Martian soil are to be picked up by a robot and placed in a storage vessel, a task very similar to pick and place in a factory on Earth. The vessel is then to be launched into orbit around Mars, where it will be intercepted by a second spacecraft returning it to Earth. This interception is, at least functionally, like the navigation performed by autonomous vehicles on Earth.

While real-time remote navigation and control from Earth is possible for a spacecraft in Earth orbit, these tasks are less feasible for missions on Mars, where a signal takes between 8 and 48 minutes to travel from Mars to Earth and back. As such, there is a solid case for robotization of these missions.

So why can’t we just export our earth-based 3D techniques to space?

For triangulation-based 3D techniques, the projected patterns are simply too weak to be reliably detected outdoors, where the ambient light (sunlight) is strong. Furthermore, high-resolution structured light systems normally use DMD-based projectors containing a million individually programmable micro-mirrors that are mechanically moved to generate the structured light pattern, requiring power-hungry control electronics. All in all, this is not a system one would want to bring to space.

TOF techniques, owing to their application in the automotive industry, have been developed for outdoor use and provide 3D data at ranges up to 200 meters. While this is enough for automotive applications, distances in orbit will likely be longer, e.g. in the km range. Moreover, common car LIDARs measure one data point at a time using a collimated laser beam and scan the scene with moving mirrors. This produces a modest number of 3D points per second, forcing a trade-off between image resolution and frame rate, and it relies on a mechanically complex system, which is generally undesirable in space.
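
The resolution/frame-rate trade-off for a scanning LIDAR follows directly from its fixed point rate. A minimal sketch, where the one-million-points-per-second figure is an assumption for illustration, not a specification of any particular sensor:

```python
# Assumed point rate of a scanning LIDAR, for illustration only.
POINTS_PER_SECOND = 1_000_000

def max_frame_rate(width_px, height_px, points_per_second=POINTS_PER_SECOND):
    """A point scanner must divide its point budget between
    image resolution and frame rate."""
    return points_per_second / (width_px * height_px)

print(max_frame_rate(1000, 1000))  # megapixel images -> only 1 frame/s
print(max_frame_rate(100, 100))    # 100 frames/s, but only 0.01 megapixels
```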

SINTEF's contribution to 3D vision technology development

SINTEF has 20 years' experience in the development of 3D measurement systems, including performing optical design and developing 3D reconstruction algorithms for both structured light and Time of Flight systems. Our spin-off company Zivid supplies state-of-the-art structured light cameras to industry. Now, SINTEF is addressing the above-mentioned challenges in several projects related to 3D measurement in space.

Left: UTOFIA prototype. Photo: SINTEF. Right: UTOFIA camera operated in a subsea environment. Photo: AZTI Tecnalia.

In an ongoing ESA-funded project, SINTEF is developing a Time of Flight based 3D camera intended for Rendezvous and Docking in the Mars Sample Return mission. The project builds on hardware previously developed for a substantially different environment, namely subsea 3D imaging in the project UTOFIA.

Three key requirements are addressed in this project:

  1. Image resolution
  2. Measurement range
  3. “No moving parts”

In contrast to the one-point-at-a-time systems, the project uses a so-called flash LIDAR, which illuminates the entire scene at once and captures high-resolution 3D images at a high frame rate. Furthermore, the system uses a compact, high-power laser giving the required measurement range.
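
A hedged sketch of the flash-LIDAR idea: every pixel records its own round-trip time, so a complete depth image is formed in a single exposure rather than by scanning. The frame size and timing values are assumptions for illustration:

```python
import numpy as np

SPEED_OF_LIGHT = 3.0e8  # m/s

def flash_lidar_depth(round_trip_times_s):
    """Convert a full frame of per-pixel round-trip times to a depth map.
    The whole scene is captured at once; no scanning mirrors are needed."""
    return SPEED_OF_LIGHT * np.asarray(round_trip_times_s) / 2

# Assumed example: a 4x4-pixel frame where every return arrives after ~667 ns.
times = np.full((4, 4), 667e-9)
print(flash_lidar_depth(times))  # ~100 m for every pixel
```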

Structured Light projector
In the project I3DS, SINTEF built a full-field Structured Light projector capable of delivering high-resolution, high-frame-rate 3D images with high accuracy even in direct sunlight. Our fundamental understanding of the 3D measurement process enabled us to predict system performance under varying illumination conditions, and thereby to choose the correct light source for the system. Real-life measurements showed that the performance matched our predictions and met the functional requirements. This demonstrated that it is possible to bring highly accurate Structured Light 3D imaging out of factories and production lines, to the outdoors and even outer space!
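
Such performance predictions essentially compare the projector signal against the sunlight background reaching each pixel. The sketch below is a schematic shot-noise estimate; the photon counts are assumptions for illustration, not I3DS system parameters:

```python
import math

def shot_noise_snr(signal_photons, background_photons):
    """Shot-noise-limited SNR: the signal divided by the combined
    photon noise of signal plus background."""
    return signal_photons / math.sqrt(signal_photons + background_photons)

# Assumed per-pixel photon counts per exposure, for illustration only.
print(shot_noise_snr(1e4, 0))    # dark lab: SNR ~ 100
print(shot_noise_snr(1e4, 1e6))  # strong sunlight background: SNR ~ 10
```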

Left: Researchers Trine Kirkhus and Jostein Thorstensen assembling the I3DS projector. Photo: SINTEF. Right: I3DS projector. Photo: SINTEF.

Left: Breadboard version of the interferometric projector. Right: Pattern generated by the interferometric projector. Photos: SINTEF.

In our current ESA-funded project, Short-range high-resolution 3D real-time imaging for robotic vision, we are developing a new technique for projecting patterns for full-field Structured Light 3D. This technology development project aims at simplifying the hardware used for full-field Structured Light projection. The new technique employs an interferometer with an angular offset between the interferometer arms. The angular offset creates a projected stripe pattern, and the phase and frequency of this pattern can be adjusted by varying the angle and piston movement of the mirror in one of the interferometer arms.
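
A schematic sketch of the underlying two-beam interference, using the small-angle approximation; the wavelength and geometry are assumptions for illustration, not the project's actual design parameters:

```python
import numpy as np

WAVELENGTH = 850e-9  # assumed near-infrared source wavelength, in metres

def fringe_pattern(x_m, beam_angle_rad, mirror_piston_m):
    """Two-beam interference: the angular offset sets the stripe frequency,
    while mirror piston shifts the phase (the path changes by twice the piston)."""
    spatial_freq = beam_angle_rad / WAVELENGTH        # small-angle approximation
    phase = 4 * np.pi * mirror_piston_m / WAVELENGTH  # piston -> phase shift
    return 1 + np.cos(2 * np.pi * spatial_freq * x_m + phase)

x = np.linspace(0, 1e-3, 1000)          # 1 mm cross-section of the pattern
stripes = fringe_pattern(x, 1e-3, 0.0)  # 1 mrad offset -> ~0.85 mm fringe period
shifted = fringe_pattern(x, 1e-3, WAVELENGTH / 4)  # quarter-wave piston inverts the stripes
```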

The mirror component enabling such angle and piston control is developed at SINTEF MiNa-lab. It is a piezo-MEMS micro-mirror using a thin-film PZT material, giving highly accurate actuation at frequencies from DC to several kHz with low-voltage control signals. The current design uses only four control voltages for the mirror, greatly simplifying the control electronics compared with DMDs and their million individually addressable micromirrors.
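
Purely as a hypothetical illustration of why four voltages can suffice, consider a mirror driven by four quadrant actuators: sums and differences of the voltages map to piston and the two tilt axes. This mapping is invented for the example and is not the actual MiNa-lab drive scheme:

```python
def mirror_pose(v1, v2, v3, v4):
    """Hypothetical mapping from four quadrant actuator voltages
    (one per corner) to piston and two tilt commands. Illustration only."""
    piston = (v1 + v2 + v3 + v4) / 4  # common mode lifts the whole mirror
    tilt_x = (v1 + v2) - (v3 + v4)    # top pair vs bottom pair tips the mirror
    tilt_y = (v1 + v3) - (v2 + v4)    # left pair vs right pair tilts the mirror
    return piston, tilt_x, tilt_y

print(mirror_pose(1.0, 1.0, 1.0, 1.0))    # pure piston, no tilt
print(mirror_pose(0.5, 0.5, -0.5, -0.5))  # pure x-tilt, no piston
```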

3D imaging in space is a prerequisite for future robotic missions. Although 3D imaging for Earth-based applications is developing rapidly, some functional requirements are quite unique to space use cases. Efforts are now being made to develop systems that meet these requirements. With our expertise and experience in 3D imaging systems, SINTEF is excited to be leading several projects that contribute to this development!


“The view expressed herein can in no way be taken to reflect the
official opinion of the European Space Agency.”


 
