Open Positions

We are seeking expressions of interest for the open positions via the FONDA portal, which is hosted by the Humboldt Graduate School:

https://hu.berlin/fonda-portal-2020

Please register at this portal, fill in your personal details, and upload a CV, information on two referees, a letter outlining your motivation to join FONDA, and statements of interest for the announced positions you wish to apply for.

We currently have open positions of the following types in FONDA:

Level | Position | Title | Supervisor | Status
A1 | PhD student | Scalable discovery of queries over event streams | Prof. Weidlich | Open
A1 | PhD student | Exploring the Complexity of Event Query Discovery | Prof. Schweikardt | Closed
A2 | PhD student or PostDoc | Adaptive optimization of genomic workflow languages | Prof. Leser | Open
A2 | PhD student or PostDoc | Algorithmic components for genome analysis | Prof. Reinert | Closed
A5 | PhD student | Uncertainty Estimates for Image Analysis Workflows | Prof. Kainmüller | Closed
A5 | PostDoc | Workflows for domain adaptation in neuroimaging data | Prof. Ritter | Closed
A6 | PhD student | Genome analysis with exploratory workflows | Dr. Kehr | Pending
A6 | PhD student | Workflows for exploratory analysis of scientific data | Prof. Weidlich | Open
B1 | PhD student | Adaptive resource management for heterogeneous infrastructures | Prof. Kao | Closed
B1 | PhD student | Scheduling algorithms for distributed data analysis workflows | Prof. Meyerhenke | Closed
B2 | PhD student | Portable and Adaptive 3D Vision | Prof. Eisert | Open
B2 | PostDoc | Portable and Adaptive Data Analysis Workflows for Real-Time 3D | Prof. Koch | Open
B3 | PhD student | Systematic fault localization in data-centric software engineering | Prof. Kehrer | Closed
B3 | PhD student | Debugging distributed data analysis workflows | Prof. Markl | Closed
B4 | PhD student | Exploiting SDNs for Efficient Data Management in Next-Generation DAWs | Prof. Reinefeld | Open
B5 | PhD student or PostDoc | Online scheduling of workflows for large-scale land use mining | Prof. Leser | Open
B6 | PhD student | Distributed Run-Time Monitoring and Control of Data Analysis Workflows | Prof. Rabl | Pending
Inf | PhD student or PostDoc | Monitoring and benchmarking of workflow systems | Prof. Leser | Open
Inf | PhD student or PostDoc | Infrastructures and repositories for workflow systems | Prof. Kao | Closed
Coord | PostDoc | Scientific coordinator | – | Open
Coord | Administration | Administrative support (Sekretariat) | – | Open

Disclaimer

The information on this page is not legally binding. The official announcements will be linked from here as soon as they appear on the respective organization’s site (see status=Open).

 

Research Area A

  • Scalable discovery of queries over event streams
    Subproject A1: Foundations of Data Analysis Workflow Validation
    Level PhD student
    Scientific disciplines Databases, stream processing, query specification
    PI Matthias Weidlich
    Location Humboldt-Universität zu Berlin
    Contact matthias.weidlich@hu-berlin.de
    Link to official call Link
    Expected competencies At least one of the following: data/event stream processing, parallel/distributed data processing, data mining

    Subproject A1 is dedicated to the discovery of event queries over streams of provenance data that are collected during the execution of data analysis workflows (DAWs). These queries shall separate runs that succeed from those that fail, thereby enabling efficient and effective validation. The PhD student will develop algorithms for the discovery of event queries (see prior work on the IL-Miner), including schemes for parallel and approximate discovery. Moreover, the student will devise mechanisms to select a subset of discovered queries for effective validation of DAW execution. Focusing on the algorithmic perspective, the student will work closely with the second PhD student in this subproject, who focuses on the complexity of the discovery problem and who is supervised by Prof. Schweikardt, also at Humboldt-Universität zu Berlin.

  • Exploring the Complexity of Event Query Discovery
    Subproject A1: Foundations of Data Analysis Workflow Validation
    Level PhD student or PostDoc
    Scientific disciplines Theoretical computer science, database theory, logic in computer science
    PI Nicole Schweikardt
    Location Humboldt-Universität zu Berlin
    Contact schweikn@informatik.hu-berlin.de
    Link to official call
    Expected competencies Excellent background in at least one of the areas logic, database theory, and complexity theory, as well as a solid background in database systems

    The position will be part of subproject A1. We are looking for a PhD student or a PostDoc to explore the space and time complexity of event query discovery and incremental query maintenance. This will also involve the design of specific target query languages, the investigation of their expressive power and its limitations, as well as the exploration of the computational complexity of automatically discovering such queries. Successful candidates should hold a degree in computer science or mathematics, with an excellent background in logic, database theory, or complexity theory. In addition, she/he should have a solid background in database systems and show strong interest in interdisciplinary research.

  • Adaptive optimization of workflow languages for genomics workflows
    Subproject A2: Adapting Genomic Data Analysis Workflows for Different Data Access Patterns
    Level PhD student or PostDoc
    Scientific disciplines Bioinformatics, domain-specific languages, workflow systems, query optimization
    PI Ulf Leser
    Location Humboldt-Universität zu Berlin
    Contact leser@informatik.hu-berlin.de
    Link to official call Link
    Expected competencies At least one of the following: genome data analysis, distributed systems, domain-specific languages, workflow systems

    This position will be part of subproject A2, which will study the problems of specifying, identifying, and exposing data access and data exchange (DADE) patterns in data analysis workflows (DAWs). We will develop methods to (semi-)automatically adapt I/O-heavy DAWs in genome data analysis to achieve robust runtimes on different computational infrastructures. The holder of this position will focus on designing workflow languages for genome data analysis, analyzing workflows with respect to their DADE patterns, and optimizing them in a DADE-aware manner on different infrastructures. She/he will cooperate closely with the holder of the second position in this subproject, which will be supervised by Prof. Reinert at Freie Universität Berlin.

  • Making algorithmic components for sequence analysis amenable for self-configuring, genomic data analysis workflows
    Subproject A2: Adapting Genomic Data Analysis Workflows for Different Data Access Patterns
    Level PhD student or PostDoc
    Scientific disciplines Bioinformatics algorithms, next generation sequencing, workflow systems
    PI Knut Reinert
    Location Freie Universität Berlin
    Contact Knut.Reinert@fu-berlin.de
    Link to official call
    Expected competencies At least one of the following: Genome data analysis, C++ skills, algorithms, mathematical optimization

    The PhD student will mainly work on developing cost models for the adaptation of DAWs according to DADE patterns, adapting algorithmic components in the SeqAn library (www.seqan.de), and developing (reference-based) compression techniques. A possible title for a PhD thesis would be “Making algorithmic components for sequence analysis amenable for self-configuring, genomic data analysis workflows”. The PhD student will investigate how to generalize algorithmic components for sequence analysis to make them easily adaptable to varying data access patterns and computational infrastructures. Based on the cost models, s/he will also investigate which components should be chosen to optimize performance. Given the variety of sequence analysis components, the PhD student will choose a smaller set of suitable components for generalization, which also involves choosing between different algorithms that perform better in a given setting of data access pattern and computational infrastructure. These components will then be employed in a metagenomics workflow.
    For further information see “Research Areas and Subprojects” on this website, and visit http://www.mi.fu-berlin.de/en/inf/groups/abi/research/projects/fonda.

  • Leveraging Uncertainty Estimates for Adaptable Image Analysis Workflows
    Subproject A5: Dependability, Adaptability and Uncertainty Quantification for Data Analysis Workflows in Large-Scale Biomedical Image Analysis
    Level PhD student
    Scientific disciplines Machine learning, computer vision / microscopy image analysis, data analysis workflows
    PI Dagmar Kainmueller
    Location Max Delbrück Center for Molecular Medicine / IRI for Life Sciences
    Contact dagmar.kainmueller@mdc-berlin.de
    Link to official call
    Expected competencies Background in computer science, physics, mathematics, or a related discipline; enthusiasm for coding (e.g. Python, C++); experience with machine learning / computer vision / image analysis is a plus

    A PhD researcher to be recruited within subproject A5 of FONDA will focus her research on facilitating uncertainty assessment for ease of development of adaptable data analysis workflows (DAWs). She should have an education in computer science, physics, mathematics, electrical engineering, or a related field. Her work will revolve around the suitability of various kinds of uncertainty estimates for measuring DAW adaptability, where she will have the freedom to define her own focus, e.g. on dataset similarity measures or on machine learning model-inherent uncertainty predictors. The working title of the thesis will be “Leveraging Uncertainty Estimates for Adaptable Machine Learning Workflows”. The position will be situated in the Kainmueller Lab at the MDC Berlin.

  • DAWs for domain adaptation in neuroimaging data
    Subproject A5: Dependability, Adaptability and Uncertainty Quantification for Data Analysis Workflows in Large-Scale Biomedical Image Analysis
    Level PostDoc
    Scientific disciplines Machine learning, neuroimaging, data analysis workflows
    PI Kerstin Ritter
    Location Charité – Universitätsmedizin Berlin
    Contact kerstin.ritter@bccn-berlin.de
    Link to official call
    Expected competencies At least one of the following: Machine learning, python, software engineering, neuroimaging

    For subproject A5 of FONDA, we are looking for a PostDoc to develop a domain-specific language (DSL) to support generalizability of machine learning-heavy DAWs for large-scale biomedical image analysis with a focus on neuroimaging data and transfer learning between multiple (open) data sets. The respective candidate should have a background in computer science, physics, mathematics, computational neuroscience, electrical engineering, or a related field. Additionally, the candidate needs to have both excellent domain knowledge in large-scale MRI data analysis in clinical context as well as ample experience in setting up machine learning analyses on noisy and large data sets. The position will be situated in the Ritter lab at Charité – Universitätsmedizin Berlin.

  • Genome analysis with exploratory workflows
    Subproject A6: Data Analysis Workflows for Interactive Scientific Exploration
    Level PhD student
    Scientific disciplines Bioinformatics, genetic variation, workflow exploration
    PI Birte Kehr
    Location Regensburger Centrum für Interventionelle Immunologie (RCI)
    Contact birte.kehr@bihealth.de
    Link to official call tba
    Expected competencies Skills in at least one of genomic data analysis or C++ and Python programming, and interest in the other.

    In subproject A6, we will develop exploratory workflows for the analysis of large-scale genomic data. The focus of this PhD student will be on instantiating the exploratory workflow model for structural variant (SV) detection from genomic data. SVs are differences in a person’s genome that affect at least 50 base pairs of DNA sequence and can have an impact on a person’s health; SV detection is thus a basis for many biomedical research directions. The PhD student will dissect existing workflows and implement exploration strategies with the aim of improving the process of structural variant detection. Interested candidates should hold a degree in bioinformatics with a strong computational interest or a degree in computer science with an interest in genomics. The PhD position is located in the research group led by Birte Kehr, and the holder of this position will closely collaborate with the other PhD student in this subproject, who will be located in the group of Prof. Weidlich at Humboldt-Universität zu Berlin.

  • Workflows for exploratory analysis of scientific data
    Subproject A6: Data Analysis Workflows for Interactive Scientific Exploration
    Level PhD student
    Scientific disciplines Databases, distributed systems, dataflow infrastructures
    PI Matthias Weidlich
    Location Humboldt-Universität zu Berlin
    Contact matthias.weidlich@hu-berlin.de
    Link to official call Link
    Expected competencies Skills in at least one of: data-intensive systems, large-scale data analysis, distributed infrastructures

    Subproject A6 is dedicated to models and algorithms for exploratory data analysis workflows (DAWs), especially for the analysis of large-scale genomic data. The PhD student will develop a generic model of an exploratory DAW, including exploration primitives, dependencies between exploration options, and strategies for following these options. The student will further ground this model in an actual execution infrastructure and devise mechanisms for interactive exploration of the workflow execution. While this position is located in the research group led by Matthias Weidlich, the student will work closely with the PhD student in this subproject, who is based in the research group of Dr. Kehr at Charité – Universitätsmedizin Berlin.

Research Area B

  • Adaptive resource management for heterogeneous infrastructures
    Subproject B1: Scheduling and Adaptive Execution of Data Analysis Workflows across Heterogeneous Infrastructures
    Level PhD student
    Scientific disciplines resource management, virtualization, cross-layer optimization
    PI Odej Kao
    Location Technische Universität Berlin
    Contact odej.kao@tu-berlin.de
    Link to official call
    Expected competencies At least one of the following: scheduling and placement algorithms, C++, resource optimization, operating systems internals

    Large-scale data analysis systems require scientists to make difficult decisions regarding the different available computational resources and to hand-tune distributed processing jobs, as these settings have a significant impact on runtimes and efficiency. Within FONDA, the Distributed and Operating Systems group of Prof. Dr. Odej Kao at TU Berlin is going to develop new methods for profiling, performance modeling, and task placement that will enable resource management systems to use the available cluster resources efficiently. The research questions focus on matching the user-defined data-parallel tasks to the available cluster resources (CPU, hardware accelerators, memory), efficient learning and sharing of execution profiles, as well as fine-grained partitioning of the available resources. The new methods are to be validated in the context of relevant open-source software, and the results are to be published.

  • Application-oriented scheduling algorithms for distributed data analysis workflows
    Subproject B1: Scheduling and Adaptive Execution of Data Analysis Workflows across Heterogeneous Infrastructures
    Level: PhD student
    Scientific disciplines: Algorithm engineering, scheduling, distributed computing
    PI Henning Meyerhenke
    Location: Humboldt-Universität zu Berlin
    Contact meyerhenke@hu-berlin.de
    Link to official call
    Expected competencies: Solid skills in algorithm design, analysis and implementation

    The PhD student filling this position will perform research on scheduling and load balancing (SLB) algorithms and their efficient realization in practice. His/her primary advisor will be Prof. Henning Meyerhenke from the Department of Computer Science at HU Berlin. The PhD student’s main topic will be to develop and implement cutting-edge decentralized SLB algorithms in the context of FONDA. He/she will work in close cooperation with the other PhD student of this subproject and will thus be co-advised by Prof. Odej Kao (TU Berlin). We expect the PhD student to have an education in computer science (or possibly a closely related field) with an emphasis on algorithm engineering, ideally in parallel and/or distributed computing scenarios.

  • Portable and Adaptive 3D Vision
    Subproject B2:  Portable and Adaptive Data Analysis Workflows for Real-Time 3D Vision
    Level: PhD student
    Scientific disciplines: Computer Vision, Visual Computing, Software Engineering
    PI Peter Eisert
    Location: Humboldt-Universität zu Berlin
    Contact eisert@informatik.hu-berlin.de
    Link to official call Link
    Expected competencies: computer vision, C++, software engineering, distributed systems

    We are looking to fill a doctoral researcher position (TVL E13 100%) at the computer science department of Humboldt-Universität zu Berlin, in the Visual Computing group headed by Prof. Peter Eisert. Candidates shall make scientific contributions to the DFG-funded subproject „Portable and Modular Data Analysis Workflows for Real-Time 3D Vision“ of CRC1404 (“Foundations of Workflows for Large-Scale Scientific Data Analysis”), i.e. to the analysis and optimization of 3D Vision workflows to increase portability of 3D reconstruction and tracking in the field of 3D microscopy.
    Successful candidates should have a degree in computer science, physics, mathematics, or related fields and show strong interest in interdisciplinary research as well as scientific work. Knowledge in Computer Vision (3d reconstruction, image and video analysis, tracking), software engineering and programming skills (C++, Python, CUDA) are highly beneficial. The ability to work in a team, creativity and good English skills are expected.

  • Portable and Adaptive Data Analysis Workflows for Real-Time 3D
    Subproject B2:  Portable and Adaptive Data Analysis Workflows for Real-Time 3D
    Level: PostDoc
    Scientific disciplines: Algorithm engineering, Distributed computing, Resource management
    PI Christoph T. Koch
    Location: Humboldt-Universität zu Berlin,  Department of Physics
    Contact christoph.koch@physik.hu-berlin.de
    Link to official call Link
    Expected competencies: At least one of C/C++, Python, CUDA; 3D reconstruction algorithms; microscopy

    We are looking to fill a postdoctoral researcher position (TVL E13 100%) at the department of physics of Humboldt-Universität zu Berlin, in the structure research and electron microscopy group headed by Prof. Christoph T. Koch. The person holding this position shall make scientific contributions to the DFG-funded subproject „Portable and Modular Data Analysis Workflows for Real-Time 3D Vision“ of CRC1404 (“Foundations of Workflows for Large-Scale Scientific Data Analysis”), namely the analysis, optimization, development, and application of complex modular reconstruction algorithms in the field of 3D microscopy with light and electrons. Successful candidates should have a degree in computer science, physics, mathematics, or related areas of science and be experienced in programming in at least one of the following languages: C/C++, Python, CUDA. Programming experience with image processing or 3D reconstruction algorithms and/or practical experience with imaging systems such as optical or electron microscopes is beneficial.

  • Systematic fault localization in data-centric software engineering
    Subproject B3: Debugging Distributed Data Analysis Workflows
    Level: PhD student
    Scientific disciplines: Software engineering, Programming languages, database and distributed systems
    PI Timo Kehrer
    Location: Humboldt-Universität zu Berlin
    Contact timo.kehrer@informatik.hu-berlin.de
    Link to official call
    Expected competencies: At least one of the following: software testing, debugging and fault localization, monitoring, parallel computing, program analysis

    The PhD student filling this position will focus on our research objectives to develop methods and techniques to (a) support the interactive yet guided localization of errors in distributed data analysis workflows (DAWs) through their controlled and monitored execution, and (b) narrow down the search space for locating errors in DAWs by (semi-)automatically identifying suspicious parts of a DAW that are likely to cause an error. While the research conducted by the PhD student is motivated by FONDA’s focus on scientific DAWs, more generally, the foundations developed within this subproject shall have an impact on the emerging paradigm of data-centric software engineering in a broader sense. The PhD student filling this position will closely collaborate with the second PhD candidate of this subproject on a prototypical implementation of our envisioned debugging framework for selected computational infrastructures such as Apache Flink. The candidate for this position must have an education in computer science or a related area, and should have basic competencies in one or several related areas of software engineering, programming languages, or database and distributed systems.

  • Debugging distributed data analysis workflows
    Subproject B3: Debugging Distributed Data Analysis Workflows
    Level: PhD student
    Scientific disciplines: Software engineering, database and distributed systems, compiler technologies
    PI Volker Markl
    Location: Technische Universität Berlin
    Contact jobs@dima.tu-berlin.de
    Link to official call
    Expected competencies: At least one of the following: Compiler techniques, query execution models, system programming (Java or C++), project management skills

    The Department of Database Systems and Information Management (DIMA) at Technische Universität Berlin is looking for a research assistant (PhD student) for the collaborative research center FONDA. The main focus of the project is the research and development of debugging in distributed data analysis workflows (DAWs). These workflows consist of operations typically derived from relational and linear algebra. Debugging aims at establishing a cause-and-effect relationship between an observed problem and an actual error. Such an error identification serves as a first step to achieve a reliable problem solution. The central research question to be answered is how debugging hypotheses in connection with data-science pipelines can be efficiently formulated, tested and refined on large and fast data sets.

  • Exploiting SDNs for Efficient Data Management in Next-Generation DAWs
    Subproject B4: Exploiting Software-Defined Networks for Efficient Data Management in Next-Generation Data Analysis Workflows
    Level: PhD student
    Scientific disciplines: Distributed data management, software-defined networks, network protocols
    PI Alexander Reinefeld
    Location: Zuse Institute Berlin / Humboldt-Universität zu Berlin
    Contact schintke@zib.de
    Link to official call Link
    Expected competencies: At least one of the following: Distributed algorithms, distributed data management, network programming, SDNs

    Within FONDA, this subproject researches and develops new methods for data management of next-generation data analysis workflows using software-defined networks. In cooperation with project partners, we will develop suitable distributed algorithms, implement them on modern hardware, evaluate their performance, and publish the scientific results at appropriate conferences. You will work in a diverse research team with interdisciplinary partners and actively carry out the project work in coordination with them. A combination of the project work with a doctoral project is desired.

    Interested candidates should have a master’s degree in computer science, mathematics, or a related field, preferably with a focus on distributed algorithms and network programming. They should have good programming skills, and an interest in system development and practical computer science research.

  • Online scheduling of workflows for large-scale land use and land cover mining
    Subproject B5: Adaptive, Distributed and Scalable Analysis of Massive Satellite Data
    Level: PhD student or PostDoc
    Scientific disciplines: Remote sensing, workflow scheduling, satellite image analysis, resource management
    PI Ulf Leser
    Location: Humboldt-Universität zu Berlin
    Contact leser@informatik.hu-berlin.de
    Link to official call Link
    Expected competencies: At least one of the following: distributed systems, workflow systems, machine learning, satellite image analysis

    This position will be part of subproject B5, which will research adaptive methods for analyzing land use and land cover changes over large geographical areas based on satellite data. Our goal is to automate adaptations to changing regions or changing infrastructure as much as possible by means of self-adapting geospatial data analysis workflows (DAWs). Adaptivity will be addressed both from a throughput perspective, by investigating algorithms for dynamic online scheduling of data-intensive DAWs, and from a quality perspective, by studying DAW functionality that adapts the analysis to complex regions and to different time periods. The holder of this position will focus on the former and will work closely with a second project member focusing on the latter, which will be supervised by Prof. Hostert, Humboldt-Universität zu Berlin.

  • Distributed Run-Time Monitoring and Control of Data Analysis Workflows
    Subproject B6: Distributed Run-Time Monitoring and Control of Data Analysis Workflows
    Level: PhD student
    Scientific disciplines: Database systems, modern hardware, stream processing
    PI Tilmann Rabl
    Location: Hasso Plattner Institute
    Contact tilmann.rabl@hpi.de
    Link to official call tba
    Expected competencies: Skills in database systems, stream processing, and modern hardware

    In subproject B6, we will develop a stream processing infrastructure for huge amounts of event and log data generated by large-scale distributed DAW executions for a large number of event queries. The PhD candidate will work in the Data Engineering Systems Group at HPI led by Tilmann Rabl. The goal is to work on efficient distributed stream processing with a focus on multi-query optimization and on utilizing modern hardware capabilities such as remote direct memory access (RDMA).
    Interested candidates should have a master’s degree in computer science with a focus on data processing, good programming skills, and an interest in system development.

Research Area Infrastructure

  • Monitoring and benchmarking of workflow systems
    Subproject S1: Testbeds and Repositories
    Level PhD student or PostDoc
    Scientific disciplines Workflow systems, distributed systems, operating systems
    PI Ulf Leser
    Location Humboldt-Universität zu Berlin
    Contact leser@informatik.hu-berlin.de
    Link to official call Link
    Expected competencies At least one of the following: distributed systems, workflow systems, virtualization, research data management

    This position will be part of central service subproject S1, which provides computational infrastructures, data analysis workflows (DAWs) and their associated data, and development tools to support the research performed in the CRC. The subproject will maintain different workflow systems on different clusters, support the creation of workflow execution benchmarks, build a trace file repository for workflow execution analysis and monitoring, and maintain CRC-wide development tools. The holder of this position will focus on DAW benchmarking and on building and analyzing the trace repository, and will work closely with the holder of a second position in this subproject, supervised by Prof. Kao, Technische Universität Berlin, which will focus on cluster operations and development tools.

  • Building and operating infrastructures and repositories for data analysis workflows
    Subproject S1: Testbeds and Repositories
    Level PhD student or PostDoc
    Scientific disciplines System management, Data curation, DevOps, HPC
    PI Odej Kao
    Location Technische Universität Berlin
    Contact odej.kao@tu-berlin.de
    Link to official call
    Expected competencies At least one of the following: system administration, repository management, C++, Flink, Spark and similar data engines

    The Collaborative Research Center Foundations of Workflows for Large-Scale Scientific Data Analysis (FONDA) will develop new methods to support scientists who use cluster infrastructures to analyze very large datasets. Large-scale data analysis systems routinely require scientists to work with a multitude of hardware platforms, software systems, and repositories to be able to analyze given datasets efficiently. For this reason, FONDA is building a joint infrastructure for all project members that enables the execution of data analysis workflows (DAWs), collects profiling data from these workflows and any other globally available executions, and makes this data available for research. The wide range of tasks for the holder of this position is highly interdisciplinary and will include supporting CRC members in accessing the TUB infrastructures, installing and executing DAWs during DAW development and specific experiments, and creating the new profiling-data sharing system. The repositories of this system must also be set up and populated with DAW traces. The relevant knowledge has to be provided to all FONDA members through lectures and practical workshops.

Service and Administration

  • Scientific coordination of the CRC
    Subproject Z – Central Administration
    Level PostDoc
    Scientific disciplines Computer science, natural sciences
    PI Ulf Leser
    Location Humboldt-Universität zu Berlin
    Contact leser@informatik.hu-berlin.de
    Link to official call Link
    Expected competencies Experience in management of scientific projects and activities

    This position will, in cooperation with the speakers and the central board of the CRC, coordinate the scientific and administrative activities of the research center. This includes the organization of workshops and retreats, design of a high-profile guest researcher programme, coordination with national and international partner organizations and projects, financial forecasting, support in grant writing activities, and design and organization of summer schools. We expect interest and experience in the management of large research projects, a PhD in computer science or a natural science, and profound computational skills in data analysis settings.

  • Administrative support of the CRC
    Subproject Z – Central Administration
    Level Administration
    Scientific disciplines None
    PI Ulf Leser
    Location Humboldt-Universität zu Berlin
    Contact leser@informatik.hu-berlin.de
    Link to official call tba
    Expected competencies Management of university administrative processes

    This position will support all administrative processes (Sekretariatstätigkeiten) of the CRC. Duties include financial accounting, management of employments, support in organization of workshops and conferences, and management of procurement processes.