Master Thesis - Multi-Modal Traversability Estimation for Autonomous Outdoor Navigation

Summary

Location: Stuttgart, Germany

Work type: Internship

Key Benefits

  • Cutting-Edge Technology
  • Hands-On Robots
  • Project Ownership
  • Cake Thursday
  • Top Student Peers

About this Job

This position is advertised for students of fields such as automation technology, electrical engineering, computer science, cybernetics, mechatronics, control engineering, software design, software engineering, technical computer science, or a comparable discipline.

In the Professional Service Robots - Outdoor research group, we develop autonomous mobile robots for a variety of outdoor applications such as agriculture, forestry, and logistics. The focus is on developing an autonomous outdoor navigation solution as well as the robots' hardware.

For mobile robots operating in unstructured outdoor environments with unknown terrain conditions, an accurate representation of the environment is essential. For this purpose, data from multiple sensors must be interpreted and fused to reliably estimate the traversability of the surrounding environment.

Terrain traversability can be evaluated by analyzing the geometry of elevation maps generated from LiDAR scans. This method alone, however, cannot capture semantic information such as surface properties, which could, for example, help prioritize paths along dirt roads over paths through dense vegetation. To capture this, semantic traversability scores can be extracted from the RGB images of a stereo camera using deep neural networks. Relying solely on semantic traversability information, on the other hand, makes the estimate sensitive to domain shifts and weather conditions.
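To make the geometric side concrete, a minimal, hypothetical C++ sketch is shown below. It scores a single elevation-map cell from its local slope; the grid layout, cell size, and slope threshold are illustrative assumptions, not values from the actual project.

    // Hypothetical sketch: per-cell geometric traversability from an elevation grid.
    #include <cmath>
    #include <cstdio>
    #include <vector>

    // Returns a score in [0, 1]: 1 = flat and easy to drive, 0 = too steep.
    double geometricTraversability(const std::vector<std::vector<double>>& elevation,
                                   int row, int col, double cellSize, double maxSlopeRad) {
        // Estimate the local slope from central differences to the 4-neighbours.
        double dzdx = (elevation[row][col + 1] - elevation[row][col - 1]) / (2.0 * cellSize);
        double dzdy = (elevation[row + 1][col] - elevation[row - 1][col]) / (2.0 * cellSize);
        double slope = std::atan(std::hypot(dzdx, dzdy));
        double score = 1.0 - slope / maxSlopeRad;
        return score < 0.0 ? 0.0 : (score > 1.0 ? 1.0 : score);
    }

    int main() {
        // Tiny 3x3 elevation patch (metres); only the centre cell is scored here.
        std::vector<std::vector<double>> elevation = {
            {0.00, 0.02, 0.05},
            {0.00, 0.03, 0.08},
            {0.01, 0.04, 0.10}};
        std::printf("geometric traversability: %.2f\n",
                    geometricTraversability(elevation, 1, 1, /*cellSize=*/0.10,
                                            /*maxSlopeRad=*/0.60));
        return 0;
    }

A semantic score for the same cell would instead come from projecting a camera-based segmentation (e.g. road, grass, bush) onto the map, which is exactly the information the geometric score alone cannot provide.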

Therefore, the objective of this thesis is to develop and test a real-time, tightly coupled traversability estimation algorithm that fuses information from both sensor modalities, thereby providing a more complete understanding of the environment as input for path planning.

This is where you make a difference

In this thesis, you will design a traversability estimation algorithm that fuses geometric information derived from LiDAR elevation maps with semantic annotations inferred from RGB images. You will focus on fusing traversability scores generated by independent sensor modalities and evaluate the effectiveness of different fusion strategies, including early-stage and late-stage approaches.
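The late-stage variant mentioned above can be illustrated with a small, hypothetical combination rule: each modality first produces its own per-cell score, and the scores are merged afterwards. The confidence weighting and the conservative minimum with the geometric score are illustrative design assumptions, not the method prescribed by the thesis.

    // Hypothetical late-fusion rule for two traversability scores in [0, 1] (C++17).
    #include <algorithm>
    #include <cstdio>

    // 'semanticConfidence' can be lowered when the camera-based estimate is
    // unreliable (e.g. under domain shift or bad weather), so the fused score
    // falls back towards the geometric estimate.
    double fuseLate(double geometric, double semantic, double semanticConfidence) {
        double w = std::clamp(semanticConfidence, 0.0, 1.0);
        double blended = (1.0 - w) * geometric + w * semantic;
        // Never exceed what geometry alone allows, so a "road" label cannot
        // override an impassable step or slope.
        return std::min(blended, geometric);
    }

    int main() {
        // Geometry says mostly flat (0.8), semantics says dense vegetation (0.2).
        std::printf("confident semantics: %.2f\n", fuseLate(0.8, 0.2, 0.9));
        // Same geometry, but the semantic network is unsure, so geometry dominates.
        std::printf("uncertain semantics: %.2f\n", fuseLate(0.8, 0.2, 0.1));
        return 0;
    }

An early-stage approach would instead merge the raw or intermediate representations of both modalities before a single traversability score is computed; comparing such strategies is part of the thesis work.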

You will evaluate the accuracy and computational performance of your implementation in real-world scenarios, using both recorded data and live deployments on our mobile CURT robots, to ensure real-time operation.

What you bring to the table

  • Student enrolled at a German university or university of applied sciences (Hochschule)
  • Background in Computer Science, Software Engineering, Mechatronics or similar 
  • Programming knowledge and experience with C/C++
  • Experience with ROS is a plus
  • Analytical mindset
  • Enthusiasm for mobile robotics
  • Fluent in English or German

What we offer you

  • Cutting-edge technology in the field of outdoor mobile robotics
  • Hands-on work with our robots in Stuttgart
  • Responsibility and the freedom to implement your own ideas
  • Work alongside the best students in their discipline
  • A friendly atmosphere, including Cake Thursday

We value and promote the diversity of our employees' skills and therefore welcome all applications, regardless of age, gender, nationality, ethnic and social origin, religion, worldview, disability, sexual orientation, and identity. Severely disabled applicants are given preference in the case of equal suitability. Our tasks are varied and adaptable: together with applicants with disabilities, we will find solutions that make the best use of their abilities.

With its focus on key technologies relevant to the future and on transferring results to business and industry, the Fraunhofer-Gesellschaft plays a central role in the innovation process. As a guide and driving force for innovative developments and scientific excellence, it helps shape our society and our future.

Ready for change? Then apply now and make a difference! After your online application has been received, you will get an automatic confirmation of receipt. We will then contact you as soon as possible to let you know how to proceed.

Ms. Jennifer Leppich

Recruiting

+49 711 970-1415

jennifer.leppich@ipa.fraunhofer.de  

Fraunhofer-Institut für Produktionstechnik und Automatisierung IPA 

www.ipa.fraunhofer.de 

Reference number: 82321                Application deadline:
