GEOMAR Helmholtz Centre for Ocean Research Kiel is a foundation under public law, jointly financed by the Federal Republic of Germany (90 %) and the state of Schleswig-Holstein (10 %), and is one of the internationally leading institutions in the field of marine sciences. GEOMAR currently has an annual budget of approx. 80 million euros and approx. 1,000 employees.
The doctoral project “Learning Underwater Visual Appearance for Robust Seafloor Surveys” exploits machine learning to improve correspondence search under challenging underwater imaging conditions. Light attenuation and scattering in the water body, as well as dynamic illumination, make it extremely difficult to re-identify the same seafloor patch in different images, yet this re-identification is at the heart of visual mapping approaches. Recent advances in machine learning suggest that establishing corresponding points in different underwater images could be improved by training appropriate neural networks.
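To illustrate the appearance problem described above, the following sketch applies the standard Beer-Lambert attenuation model to an RGB seafloor reflectance. The per-channel attenuation coefficients and reflectance values are made-up example numbers, not measured ones or project parameters; the point is only that the same patch looks markedly different at different viewing distances, which is what defeats naive descriptor matching:

```python
import numpy as np

# Per-channel (R, G, B) attenuation coefficients in 1/m.
# Illustrative example values only: red light is absorbed
# much faster in water than green or blue.
beta = np.array([0.60, 0.10, 0.05])

def attenuate(color, distance_m):
    """Beer-Lambert attenuation of an RGB color over a water path."""
    return color * np.exp(-beta * distance_m)

patch = np.array([0.8, 0.6, 0.4])  # example seafloor reflectance
near = attenuate(patch, 1.0)       # same patch seen from 1 m
far = attenuate(patch, 5.0)        # same patch seen from 5 m

# The red channel collapses with distance, shifting the patch's
# apparent color and breaking brightness-based matching.
print(near, far)
```

This ignores scattering, veiling light and artificial illumination, all of which make the real problem considerably harder.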
The position is embedded in the “Helmholtz School for Marine Data Science” (MarDATA), a graduate school financed by the Helmholtz Association that aims to define and educate a new type of “marine data scientist” by introducing and embedding researchers from computer science and mathematics into the ocean sciences, covering a broad range of topics from supercomputing and modelling to (bio)informatics, robotics, statistics and big-data methodologies. The education of doctoral researchers in joint block courses, international summer schools and colloquia goes beyond any single discipline, towards genuine scientific insight into, and a more systematic treatment of, marine data.
Job Description / Project Description
The successful applicant will
- use underwater light propagation models (e.g. scattering phase functions) for realistic simulation of underwater images
- develop, use and extend machine learning methods to learn visual appearance changes in underwater images taken in different conditions
- integrate the learnt robustness into algorithms for estimation of structure and motion from underwater cameras/images
- work with cameras, lights and other equipment in the lab or at sea
- participate in conferences and publish scientific articles
- develop software for visual underwater mapping, calibration and simulation
- participate in research cruises (depending on availability/schedule) and interdisciplinary collaborations with ocean sciences / marine technology
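One common ingredient of the light propagation models mentioned above is a scattering phase function; a widely used analytic choice is the Henyey-Greenstein function. The sketch below is illustrative only, and the anisotropy value g = 0.8 is an example, not a project parameter:

```python
import numpy as np

def henyey_greenstein(cos_theta, g):
    """Henyey-Greenstein phase function p(theta; g).

    g in (-1, 1) is the anisotropy parameter: g > 0 favours
    forward scattering, as is typical for ocean water.
    Normalized so that it integrates to 1 over the unit sphere.
    """
    return (1.0 - g**2) / (
        4.0 * np.pi * (1.0 + g**2 - 2.0 * g * cos_theta) ** 1.5
    )

# Example: a strongly forward-scattering medium.
g = 0.8
angles = np.linspace(0.0, np.pi, 181)
p = henyey_greenstein(np.cos(angles), g)

# Forward scattering (theta = 0) dominates backscatter (theta = pi).
print(p[0] / p[-1])
```

Sampling such a phase function along simulated light paths is one way a renderer can produce realistic synthetic training images of the seafloor.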
Qualification
Applicants are expected to
- hold an M.Sc. degree in computer science, information technology, robotics or a comparable field (with coursework in computer vision, machine learning and signal processing)
- have knowledge of C++ programming and be willing to work with large software libraries
- have a good command of English
If the required degree has not been completed at the time of application, the degree certificate must be handed in before the above start date of the contract, and the application must contain plausible evidence that the degree can be completed before that date.
It is appreciated and beneficial, but not required, if applicants also have experience with
- underwater images, underwater cameras or underwater light propagation
- image features and Structure-from-Motion/SLAM techniques
- raytracing software
- machine learning software
The position is available for a funding period of 3 years. The salary depends on qualifications and can be up to pay grade 13 TVöD-Bund of the German tariff for public employees. This is a full-time position; it cannot be split. Flexible working-time models are generally possible. The fixed-term contract shall comply with Section 2 Paragraph 1 of the German Act on Fixed-Term Scientific Contracts (WissZeitVG).
Please send your application for this position by email as a single PDF file, mentioning the keyword “MarDATA-Learning Underwater Appearance” in the subject line. Please send your application no later than July 4th, 2021 to the following email address: bewerbung(at)geomar.de
For further information regarding the position and research group please contact Dr. Kevin Köser (kkoeser(at)geomar.de).