Description
Narrative Enabled Episodic Memories (NEEMs) are structured recordings of actions performed by either a robot or a human operator, captured in real-world, simulated, or virtual reality environments. NEEMs include both subsymbolic, low-level data and symbolic information that is semantically annotated with the SOMA ontology and its extensions. Because all NEEMs share this ontology, they can be aligned and compared even when they were recorded with different tools or under varying conditions.
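As a rough illustration of the symbolic layer, the sketch below loads an OWL/RDF export of a single episode with rdflib and lists the recorded actions together with the tasks they execute. The file name is hypothetical, and the DUL terms used here (dul:Action, dul:executesTask) are only an assumption about how a given NEEM is annotated; concrete NEEMs typically use further SOMA classes and properties.

# Minimal sketch: inspect the symbolic layer of a NEEM with rdflib.
# File name and annotation vocabulary are assumptions, not a fixed schema.
from rdflib import Graph, Namespace

DUL = Namespace("http://www.ontologydesignpatterns.org/ont/dul/DUL.owl#")

g = Graph()
g.parse("neem_episode.owl", format="xml")  # hypothetical RDF/XML export of one episode

# List every recorded action and the task it executes.
query = """
SELECT ?action ?task WHERE {
    ?action a dul:Action ;
            dul:executesTask ?task .
}
"""
for action, task in g.query(query, initNs={"dul": DUL}):
    print(action, "->", task)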
The overarching goal is to build and maintain a database of episodic memories that enables robots to learn from past experiences and human demonstrations, thereby improving task performance in real-world scenarios. Since NEEMs store both the execution parameters and the outcomes of actions—successes as well as failures—they provide a valuable basis for selecting effective parameterizations for current tasks or for use in machine learning models. Failure cases also offer insight into which actions or parameterizations should be avoided in specific contexts.
Within the CRAM framework, learning suitable parameterizations for action designators has been a central application of NEEMs. Their effectiveness in this role has been demonstrated in multiple publications (see below).
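To make the selection idea concrete without committing to a particular NEEM schema or CRAM interface, the following sketch operates on hypothetical episode records (task, grasp, arm, success) standing in for execution parameters and outcomes retrieved from NEEMs. It picks the parameterization with the highest observed success rate and discards parameterizations that never succeeded.

# Illustrative sketch only: the episode records below are hypothetical stand-ins
# for execution parameters and outcomes that would be extracted from NEEMs.
from collections import defaultdict

episodes = [
    {"task": "pick-up", "grasp": "front", "arm": "left", "success": True},
    {"task": "pick-up", "grasp": "front", "arm": "left", "success": False},
    {"task": "pick-up", "grasp": "top", "arm": "right", "success": True},
    {"task": "pick-up", "grasp": "top", "arm": "right", "success": True},
]

def best_parameterization(episodes, task):
    """Return the (grasp, arm) pair with the highest observed success rate."""
    stats = defaultdict(lambda: [0, 0])  # (grasp, arm) -> [successes, trials]
    for ep in episodes:
        if ep["task"] != task:
            continue
        key = (ep["grasp"], ep["arm"])
        stats[key][1] += 1
        if ep["success"]:
            stats[key][0] += 1
    # Keep only parameterizations that succeeded at least once.
    viable = {k: successes / trials for k, (successes, trials) in stats.items() if successes > 0}
    return max(viable, key=viable.get) if viable else None

print(best_parameterization(episodes, "pick-up"))  # -> ('top', 'right')

In a full system, the same scoring could be conditioned on context (object type, scene, robot) or replaced by a learned model; the sketch only illustrates why logging failures alongside successes is useful for choosing parameterizations.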
Pipeline for NEEM Generation from Virtual Reality Demonstrations
Software Components
- PyCRAM: The Python 3 re-implementation of CRAM, serving as a toolbox for designing, implementing, and deploying software on autonomous robots.
- CRAM: A software toolbox for implementing autonomous robots.
- KnowRob: A knowledge processing system for robots; a sketch of reading stored NEEM data follows this list.
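NEEMs are commonly distributed as MongoDB dumps and stored through KnowRob's MongoDB backend. As a minimal sketch, assuming such a dump has been restored into a local MongoDB instance, the subsymbolic layer could be inspected with pymongo as below; the database name, the tf collection, and the document fields mirror ROS transform messages but are assumptions that may differ between NEEM versions.

# Sketch, assuming a restored NEEM dump with a ROS-style "tf" collection;
# database, collection, and field names are assumptions, not a fixed schema.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
neem = client["neem_example"]  # hypothetical database name

# Print a few logged transforms (subsymbolic trajectory data).
for doc in neem["tf"].find().limit(5):
    frame = doc.get("child_frame_id")
    stamp = doc.get("header", {}).get("stamp")
    translation = doc.get("transform", {}).get("translation")
    print(frame, stamp, translation)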
Courses and Tutorials
- SUTURO - sudo tidy-up-my-room: A project where students design their own applications to run on real robots.
- AI-based Robot Control: An introductory course on ROS and the frameworks used within the SUTURO project, serving as preparation for the project.
- Robot Programming with ROS: An introduction to the Robot Operating System (ROS).
Authors and Contact Details
- Prof. Michael Beetz, PhD
  Head of Institute
  Tel: +49 421 218 64001
  Email: beetz@cs.uni-bremen.de
  Profile: https://ai.uni-bremen.de/team/michael_beetz
- Alina Hawkin
  Email: hawkin@uni-bremen.de
  Profile: https://ai.uni-bremen.de/team/alina_hawkin
- Sascha Jongebloed
  Tel: +49 421 218 64008
  Email: jongebloed@uni-bremen.de
  Profile: https://ai.uni-bremen.de/team/sascha_jongebloed
- Daniel Beßler
  Tel: +49 421 218 64016
  Email: danielb@cs.uni-bremen.de
  Profile: https://ai.uni-bremen.de/team/daniel_beßler
Publications
- Steps towards generalized manipulation action plans - tackling mixing task by Vanessa Hassouna, Alina Hawkin, and Michael Beetz. Presented at the Workshop on Actionable Knowledge Representation and Reasoning for Robots (AKR3) at the European Semantic Web Conference (ESWC), 2024. https://ceur-ws.org/Vol-3749/akr3-05.pdf
- Between Input and Output: The Importance of Modelling Transients in Meal Preparation Tasks by Michaela Kümpel, Vanessa Hassouna, Alina Hawkin, and Michael Beetz. In Proceedings of the Joint Ontology Workshops (JOWO) - Episode X: The Tukker Zomer of Ontology, and satellite events co-located with the 14th International Conference on Formal Ontology in Information Systems (FOIS 2024), July 15-19, 2024, Enschede, The Netherlands. https://ceur-ws.org/Vol-3882/ifow-5.pdf
- Learning motion parameterizations of mobile pick and place actions from observing humans in virtual environments by Gayane Kazhoyan, Alina Hawkin, Sebastian Koralewski, Andrei Haidu, and Michael Beetz. Presented at the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2020. IEEE, pp. 9736–9743. DOI: 10.1109/IROS45743.2020.9341458
- Automated models of human everyday activity based on game and virtual reality technology by Andrei Haidu and Michael Beetz. Presented at the IEEE International Conference on Robotics and Automation (ICRA), 2019. IEEE, pp. 2606–2612. DOI: 10.1109/ICRA.2019.8793859