Simulation-Driven Robotic Knowledge Graph Laboratory

Description

In this virtual research lab, we investigate how robots can reason about complex, real-world environments using knowledge graphs in KnowRob. These OWL-based knowledge representations are directly linked to a Universal Scene Description (USD) scene graph used in MuJoCo simulation, enabling the integration of symbolic reasoning with real-time subsymbolic perceptual input.
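The link between the two representations can be pictured with a small sketch. The Python snippet below, which is illustrative only and not the lab's actual pipeline, walks a USD stage with the OpenUSD Python bindings and emits simple subject–predicate–object facts that a KnowRob-style knowledge graph could ingest; the scene file name and the semantic:label attribute are assumptions made for this example.

    # Illustrative sketch only (not the lab's actual pipeline): walk a USD stage
    # with the OpenUSD Python bindings and emit simple facts that a KnowRob-style
    # knowledge graph could ingest. The scene file and the "semantic:label"
    # attribute are assumptions made for this example.
    from pxr import Usd

    def usd_to_facts(usd_path):
        """Yield (subject, predicate, object) facts for every prim in the scene."""
        stage = Usd.Stage.Open(usd_path)
        for prim in stage.Traverse():
            path = str(prim.GetPath())
            yield (path, "rdf:type", str(prim.GetTypeName()) or "UsdPrim")
            parent = prim.GetParent()
            if parent and not parent.IsPseudoRoot():
                yield (path, "isChildOf", str(parent.GetPath()))
            label = prim.GetAttribute("semantic:label")  # hypothetical annotation
            if label and label.HasValue():
                yield (path, "hasLabel", str(label.Get()))

    if __name__ == "__main__":
        for fact in usd_to_facts("kitchen.usda"):  # hypothetical scene file
            print(fact)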

Our focus is on enabling autonomous agents to perform table-setting tasks by interpreting semantically labeled kitchen environments through structured queries. The combined knowledge architecture allows the robot to perceive its environment, update its internal model, and make decisions about object selection, grasping, and placement, all resolved through logic-based reasoning in KnowRob.

Reasoning Over the Knowledge Graph

To enable goal-directed behavior in dynamic scenes, we integrate a USD scene graph from the MuJoCo simulator with an OWL-based KnowRob knowledge graph. This linked representation captures both the subsymbolic spatial configuration of objects and their symbolic descriptions, including the following (a sketch of such linked facts is shown after the list):

  • Object identities (e.g., cup, plate, spoon)
  • Functional roles (e.g., container, tool, target)
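
A minimal sketch of what such linked facts could look like is given below, using rdflib in Python purely for illustration; the example namespace, property names, and the cup instance are invented for this sketch, while the lab's actual ontology terms come from KnowRob and its upper ontologies.

    # Minimal illustration with rdflib (an assumption for this sketch; KnowRob
    # stores and reasons over its knowledge differently): link a simulated
    # object, identified by its USD prim path, to a symbolic class, a functional
    # role, and a spatial relation. All names in "ex" are invented.
    from rdflib import Graph, Literal, Namespace, RDF

    EX = Namespace("http://example.org/kitchen#")  # hypothetical namespace

    g = Graph()
    cup = EX["cup_1"]
    g.add((cup, RDF.type, EX.Cup))                                 # object identity
    g.add((cup, EX.hasRole, EX.Container))                         # functional role
    g.add((cup, EX.hasUsdPrim, Literal("/kitchen/table/cup_1")))   # link to USD scene graph
    g.add((cup, EX.isOntopOf, EX["table_1"]))                      # spatial relation from simulation

    print(g.serialize(format="turtle"))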

The robot can query this linked representation in real time using KnowRob to make inferences such as the following (an illustrative query sketch appears after the list):

  • “What object fulfills the role of ‘cup’ and is currently on the table?”
  • “Where should the spoon be placed relative to the plate?”
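
KnowRob itself answers Prolog-style queries; as a rough stand-in, the sketch below phrases the first question as a SPARQL query over a small rdflib graph. Everything in the example namespace is invented for illustration and is not the lab's actual ontology.

    # Rough stand-in for a KnowRob query: "what object is a cup and is currently
    # on the table?", expressed as SPARQL over a toy rdflib graph. All terms in
    # the "ex" namespace are invented for this illustration.
    from rdflib import Graph, Namespace, RDF

    EX = Namespace("http://example.org/kitchen#")

    g = Graph()
    g.add((EX.cup_1, RDF.type, EX.Cup))
    g.add((EX.cup_1, EX.isOntopOf, EX.table_1))
    g.add((EX.spoon_1, RDF.type, EX.Spoon))
    g.add((EX.spoon_1, EX.isOntopOf, EX.counter_1))

    query = """
    PREFIX ex: <http://example.org/kitchen#>
    SELECT ?obj WHERE {
        ?obj a ex:Cup ;
             ex:isOntopOf ex:table_1 .
    }
    """

    for row in g.query(query):
        print("Cup on the table:", row.obj)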

This setup allows symbolic tasks like table setting to be robustly grounded in perceptual data and executed reproducibly.

Software Components

Multiverse Framework: A decentralized simulation framework.

KnowRob: A knowledge processing system for robots.

Courses and Tutorials

See Chapter 1 of the Fall School 2024.

Authors and Contact Details

Related Publication

Giang Nguyen, Daniel Beßler, Simon Stelter, Mihai Pomarlan, and Michael Beetz, “Translating universal scene descriptions into knowledge graphs for robotic environment,” in 2024 IEEE International Conference on Robotics and Automation (ICRA), pp. 9389–9395, 2024.
