
Project

DAKARA

Design and application of an ultra-compact, energy-efficient and reconfigurable camera matrix for spatial analysis



Within the DAKARA project, an ultra-compact, energy-efficient and reconfigurable camera matrix is being developed. In addition to standard color images, it provides accurate depth information in real time, forming the basis for various applications in the automotive industry (autonomous driving), in production and beyond. The ultra-compact camera matrix is composed of 4x4 single cameras on a wafer and is equipped with wafer-level optics, resulting in an extremely compact design of approx. 10 x 10 x 3 mm. This is made possible by the innovative camera technology of AMS Sensors Germany GmbH. The matrix configuration captures the scene from sixteen slightly displaced perspectives and thus allows the scene geometry (a depth image) to be calculated from them by means of the light field principle. Because these calculations are computationally intensive, close integration of the camera matrix with an efficient embedded processor is required to enable real-time applications. The depth image calculations, which are researched and developed by DFKI (Department Augmented Vision), can thus be carried out in a resource-conserving manner in real time within the electronic functional layer of the camera system. Applications benefit significantly from the fact that the depth information is made available to them alongside the color information without further calculations on the user side. Thanks to the ultra-compact design, the new camera can be integrated into very small and/or delicate components and used as a non-contact sensor. The structure of the camera matrix is reconfigurable, so that a layout suited to the specific application can be used. In addition, the depth image computation itself can be reconfigured to meet particular requirements on the depth information.
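The light field principle mentioned above can be illustrated with a toy plane sweep: each disparity hypothesis warps the neighboring views onto the reference view, and the hypothesis with the lowest photometric cost wins. The Python sketch below is illustrative only (the function name, the simple sum-of-absolute-differences cost, and integer-pixel baselines are assumptions, not DAKARA's actual pipeline):

```python
import numpy as np

def plane_sweep_disparity(ref, views, baselines, max_disp):
    """Winner-takes-all disparity for the reference view.

    ref       : (H, W) grayscale reference image
    views     : list of (H, W) images from displaced cameras
    baselines : per-view (dy, dx) baseline direction in pixels per
                disparity step, e.g. (0, 1) for a camera to the right
    max_disp  : largest disparity hypothesis to test
    """
    h, w = ref.shape
    cost = np.empty((max_disp + 1, h, w))
    for d in range(max_disp + 1):
        c = np.zeros((h, w))
        for img, (dy, dx) in zip(views, baselines):
            # Undo the apparent shift a point at disparity d has in this view.
            aligned = np.roll(img, (-dy * d, -dx * d), axis=(0, 1))
            c += np.abs(ref - aligned)  # sum of absolute differences
        cost[d] = c
    # Disparity is inversely proportional to depth: z = f * B / d.
    return cost.argmin(axis=0)
```

With sixteen views instead of two, the cost volume aggregates evidence from all baselines, which is what makes the matrix layout more robust than a single stereo pair.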

The innovation of the DAKARA project is the ultra-compact, energy-efficient and reconfigurable overall system that provides both color and depth images. Comparable systems already on the market are generally active systems that emit light in order to compute depth. Major disadvantages of such systems are their high energy consumption, large designs and high costs. Passive systems have much lower energy consumption, but are still at the research stage and generally suffer from large designs and low frame rates. DAKARA offers for the first time a passive camera that combines an ultra-compact design, high frame rates, reconfigurability and low energy consumption, leaving the research stage and entering the market with well-known users from different domains.

To demonstrate the performance and innovative strength of the DAKARA concept, the new camera is used in two different application scenarios: an intelligent rear-view camera in the automotive field and a workplace assistant in manual production. The planned intelligent rear-view camera of the partner ADASENS Automotive GmbH is able to interpret the rear vehicle environment spatially, metrically and semantically, in contrast to currently used systems consisting of ultrasonic sensors and a mono color camera. As a result, even fine structures such as curbs or poles can be recognized and taken into account during automated parking maneuvers. In addition, the system can semantically detect people and trigger warning signals in an emergency. The DAKARA camera thus contributes significantly to the safety of autonomous or semi-automated driving. The workplace assistant is demonstrated on a manual assembly process at Bosch Rexroth AG and DFKI (Department Innovative Factory Systems). The aim is to support the operator in his tasks and to verify their execution. For this purpose, the new camera matrix is mounted above the workplace, and both objects and hands are tracked in space and time by the algorithms of the partner CanControls GmbH. A particular challenge is that objects such as tools or workpieces held in the hand are very difficult to separate from the hand itself; this separation is made possible by the depth information additionally provided by the DAKARA camera. In this scenario, a gripping-path analysis, removal and fill-level monitoring, interaction with a dialog system, and tool position detection are implemented. The camera is designed to replace a large number of sensors currently used in various manual production systems of the project partner Bosch Rexroth, thus reaching a new level of quality and cost.
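The depth-based separation in the workplace scenario can be sketched in a few lines. Assuming an overhead camera at a known, calibrated distance to the work surface (the function name, label values and thresholds below are hypothetical, not the CanControls algorithms), a height map derived from the depth image already distinguishes flat tools on the table from a hand reaching over them:

```python
import numpy as np

# Hypothetical label values for a top-down workplace camera.
TABLE, OBJECT, HAND = 0, 1, 2

def segment_workspace(depth, surface_depth, object_h=0.01, hand_h=0.06):
    """Classify pixels by their height above the work surface.

    depth         : (H, W) depth map in metres (distance from camera)
    surface_depth : calibrated camera-to-table distance in metres
    object_h      : minimum height for flat objects/tools
    hand_h        : minimum height for a hand above the table
    """
    height = surface_depth - depth          # camera looks straight down
    labels = np.full(depth.shape, TABLE, dtype=np.uint8)
    labels[height > object_h] = OBJECT      # e.g. a tool lying on the table
    labels[height > hand_h] = HAND          # e.g. a hand hovering above it
    return labels
```

In a real system the surface depth would be calibrated once, and separating a hand from a tool held *in* the hand would use the full 3D structure rather than two fixed height thresholds; the sketch only shows why depth makes the split tractable where a color image alone does not.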

In the next three years the new camera matrix will be designed, developed and extensively tested in the mentioned scenarios. A first prototype will be realized by late summer 2018. The project "DAKARA" is funded by the Federal Ministry of Education and Research (BMBF) within the framework of the "Photonics Research Germany - Digital Optics" program. The project volume totals 3.8 million euros; almost half of it is provided by the industry partners involved.

Partners

  • AMS Sensors Germany GmbH, Nürnberg (consortium lead)
  • Deutsches Forschungszentrum für Künstliche Intelligenz GmbH (DFKI), Kaiserslautern (technical consortium lead)
  • ADASENS Automotive GmbH, Lindau
  • Bosch Rexroth AG, Stuttgart
  • CanControls, Aachen

Sponsors

BMBF - Federal Ministry of Education and Research

13N14318


Images

© DFKI

Color image from a rear-view camera, in 2D and without depth information.

© DFKI

Color image from a rear-view camera, in 2D and without depth information.

© DFKI

Color image of a workplace without depth information.

© DFKI

Image with depth information. Note the clear separation of tool and hand made possible by DAKARA technology.

Publications about the project

Yuriy Anisimov; Jason Raphael Rambach; Didier Stricker

In: Academic Editor Denis Laurendeau (Ed.). Sensors - Open Access Journal (Sensors), Vol. 22(3), Pages 814-829, MDPI, 1/2022.


Yuriy Anisimov; Gerd Reis; Didier Stricker

In: 29th International Conference in Central Europe on Computer Graphics, Visualization and Computer Vision 2021. International Conference in Central Europe on Computer Graphics, Visualization and Computer Vision (WSCG-2021), May 17-20, Pilsen/Hybrid, Czech Republic, Pages 253-261, Vol. 29, No. 1-2, ISBN 978-80-86943-34-3, Vaclav Skala - UNION Agency, Pilsen, Czech Republic, 2021.


Yuriy Anisimov; Oliver Wasenmüller; Didier Stricker

In: Mario Vento; Gennaro Percannella (Eds.). Computer Analysis of Images and Patterns - 18th International Conference. International Conference on Computer Analysis of Images and Patterns (CAIP-2019), September 2-5, Salerno, Italy, Pages 52-63, Lecture Notes in Computer Science (LNCS), Vol. 11678, ISBN 978-3-030-29887-6, Springer, Cham, Switzerland, 8/2019.
