Object Transportation Lab
Experiment with robotic arms for object manipulation in dynamic settings.
Develop and test algorithms for navigation in challenging environments.
Study interactions between humans and robots in shared tasks.
openEASE is a cutting-edge, web-based knowledge service that leverages the KnowRob robot knowledge representation and reasoning system to offer a machine-understandable and processable platform for sharing knowledge and reasoning capabilities.
Optimize robotic workflows in simulated industrial environments.
Enable robots to autonomously prepare meals by bridging natural language recipe instructions and robotic action execution.
Empower robots with the ability to understand the ‘what’ and ‘how’ of task demonstrations.
An immersive and practical approach to learning the intricacies of programming robots using ROS.
Learn robot programming through a visual interface.
The PyCRAM Laboratory focuses on leveraging the capabilities of PyCRAM, a Python 3 re-implementation of the original CRAM framework designed to serve as a comprehensive toolbox for the creation, implementation, and deployment of AI-driven and cognition-enabled control software on autonomous robots.
Our Perception Executive Laboratory is centered around RoboKudo, a cutting-edge cognitive perception framework designed specifically for robotic manipulation tasks.
In the Motion Executive Laboratory, our primary focus is on exploring and enhancing Giskard, a cutting-edge open-source framework dedicated to motion planning and control.
The Body Motion Problem (BMP) is a fundamental challenge in robotics, addressing how robots can compute goal-directed, context-sensitive motions to achieve desired outcomes while adapting to real-world constraints.
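As a toy illustration of the idea behind the Body Motion Problem, the sketch below runs a simple proportional controller that drives a robot's joint configuration toward a goal while respecting joint limits. This is a hedged, generic example of goal-directed motion under real-world constraints, not the lab's actual method; the function `motion_step`, the gain, and the joint values are all invented for illustration.

```python
import numpy as np

def motion_step(q, q_goal, q_min, q_max, gain=0.2):
    """One proportional-controller step toward a joint-space goal.

    The unconstrained update moves a fraction of the remaining error;
    np.clip then enforces the joint limits, modeling a hard
    real-world constraint on the motion.
    """
    q_next = q + gain * (q_goal - q)
    return np.clip(q_next, q_min, q_max)

# Hypothetical two-joint robot: the goal for joint 2 lies outside its limit.
q = np.array([0.0, 0.0])
q_goal = np.array([1.0, 2.0])
q_min = np.array([-1.5, -1.5])
q_max = np.array([1.5, 1.5])

for _ in range(50):
    q = motion_step(q, q_goal, q_min, q_max)

# Joint 1 converges to its goal; joint 2 saturates at its upper limit.
```

Context-sensitivity in a real system would replace the fixed limits and gain with quantities derived from the task and environment; the clamping step stands in for that adaptation here.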
This virtual research lab explores reactive motion generation through reasoning. We propose image schema-based reasoning for decision-making within motion controllers.
The TraceBot Lab offers a pioneering platform for conducting research with robotic systems uniquely designed to have a profound understanding of their actions and perceptions, specifically targeting the automation of the membrane-based sterility testing process.
The Metacognition Lab is an interactive online platform designed as a companion to our systematic review of Computational Metacognitive Architectures (CMAs).
The Bremen Ambient Assisted Living Laboratory (BAALL) ontology is a repository of knowledge useful for agents providing assisted living.
This project integrates advanced machine learning techniques with Unreal Engine’s MetaHuman avatars, providing a sophisticated platform for robotic agents to acquire knowledge of everyday activities and object manipulation.
SOMA Laboratory
In the Dual Arm Manipulation & Decision Laboratory, you can observe the capabilities of a dual-arm robot performing table setting tasks under uncertainty.
The MUHAI Kitchen Affordances Simulator was developed as part of a larger effort by the MUHAI project to benchmark natural language understanding software by parsing recipes into executable robot actions.
Human beings can partition the world in flexible ways that make the task at hand easier to handle.
NEEM Laboratory