Explore Our Labs

Find the labs best suited for your research needs

Object Transportation Lab

Experiment with robotic arms for object manipulation in dynamic settings.

Knowledge Graph Lab

Develop and test algorithms for navigation in challenging environments.

Action Cores

Study interactions between humans and robots in shared tasks.

openEASE Knowledge Service Laboratory

openEASE is a cutting-edge, web-based knowledge service that leverages the KnowRob robot knowledge representation and reasoning system to offer a machine-understandable and processable platform for sharing knowledge and reasoning capabilities.

Retail Robotics Lab

Optimize robotic workflows in simulated industrial environments.

Meal Preparation Laboratory

Enable robots to autonomously prepare meals by bridging natural language recipe instructions and robotic action execution.

Interactive Task Learning Laboratory

Empower robots with the ability to understand the ‘what’ and ‘how’ of task demonstrations.

ROS Programming for Virtual Robot

An immersive, practical approach to learning the intricacies of programming robots with ROS.

Visual Programming for Robots

Learn robot programming through a visual interface.

PyCRAM Laboratory

The PyCRAM Laboratory explores PyCRAM, a Python 3 re-implementation of the original CRAM framework that serves as a comprehensive toolbox for creating, implementing, and deploying AI-driven, cognition-enabled control software on autonomous robots.

RoboKudo Perception Executive Laboratory

Our Perception Executive Laboratory is centered around RoboKudo, a cutting-edge cognitive perception framework designed specifically for robotic manipulation tasks.

GISKARD Motion Executive Laboratory

In the Motion Executive Laboratory, our primary focus is on exploring and enhancing Giskard, a cutting-edge open-source framework dedicated to motion planning and control.

The Body Motion Problem

The Body Motion Problem (BMP) is a fundamental challenge in robotics, addressing how robots can compute goal-directed, context-sensitive motions to achieve desired outcomes while adapting to real-world constraints.

Knowledge-Based Servoing

This virtual research lab explores reactive motion generation through reasoning. We propose image schema-based reasoning for decision-making within motion controllers.

The TraceBot Laboratory

The TraceBot Lab offers a pioneering platform for conducting research with robotic systems uniquely designed to have a profound understanding of their actions and perceptions, specifically targeting the automation of the membrane-based sterility testing process.

Systematic Review on Metacognitive Tracing

The Metacognition Lab is an interactive online platform designed as a companion to our systematic review of Computational Metacognitive Architectures (CMAs).

Enhanced Task Learning Using Actionable Knowledge Graphs Laboratory

In this virtual research lab, we aim to empower robots with the ability to understand the ‘what’ and ‘how’ of task demonstrations.

The Assisted Living Laboratory with Ontologies Demo

The Bremen Ambient Assisted Living Laboratory (BAALL) ontology is a repository of knowledge useful for agents providing assisted living.

Video-Enhanced Human Activity Understanding

This project integrates advanced machine learning techniques with Unreal Engine’s MetaHuman avatars, providing a sophisticated platform for robotic agents to acquire knowledge of everyday activities and object manipulation.

SOMA-Lab

SOMA Laboratory

The Dual-Arm Action Selection Laboratory

In the Dual Arm Manipulation & Decision Laboratory, you can observe the capabilities of a dual-arm robot performing table setting tasks under uncertainty.

MUHAI Kitchen Affordances Simulator

The MUHAI Kitchen Affordances Simulator was developed as part of a larger effort by the MUHAI project to benchmark natural language understanding software by parsing recipes into executable robot actions.

Semantic Event and Object Segmentation

Human beings can partition the world in flexible ways that make the task at hand easier to handle.

NEEM Laboratory

Narrative Enabled Episodic Memories (NEEMs) are structured recordings of actions performed by either a robot or a human operator, captured in real-world, simulated, or virtual reality environments.

The FAME Laboratory

The FAME (Future Action Modelling Engine) Lab stands at the cutting edge of robotics research, operating as a virtual research laboratory under the auspices of the ERC project bearing the same name.

URoboSim Artificial World Laboratory

URoboSim is an Unreal Engine 4/5 plugin that allows importing robots into Unreal Engine via the URDF/SDF format and controlling them through various ROS interfaces.

Simulation-Driven Robotic Knowledge Graph Laboratory

In this virtual research lab, we investigate how robots can reason about complex, real-world environments using knowledge graphs in KnowRob.

PyCRORM

Object-Relational Mapping in PyCRAM

EASE Fall School Tutorial

Our hands-on labs provide experience in cognition-enabled robotics.

Faster Simulation With Uncertainty

In robotics, uncertainty often arises from noisy sensors, imperfect models, or unknown material properties, and addressing this through prospection is crucial for ensuring reliability and safety.

Talk 2 Your Knowledge Base Laboratory

The Talk2YourKnowledgeBase system from the fortiss research institute takes natural language input from the user and interprets it through an interface connected to a knowledge graph database.

Kitchen Search Lab

In this virtual laboratory, we investigate how humans search their environment for objects they need during everyday activities.

Eye Fixation Prediction Lab

We present a demonstration of a deep learning model designed to predict human fixation patterns across visual scenes.

Cube Stacking Experiment Lab

We present an experiment in which intuitive stability measurements are tested in 3D space.

Robot Pouring Simulation

We present a simulation used for causal investigations into robot pouring.
