Foundational Research (Phase 1)
The EASE CRC was initiated to understand everyday human activities and to replicate them in robot agents. Key milestones include the following.
CRAM's Action Plan Generalization
Proposed and developed a novel generative model that enables a robot to master everyday activities through a single, compact, generalized action plan for fetch-and-place tasks. The generalized action plans are written in the CRAM Plan Language (CPL) and executed by the CPL plan executive.

Different object grasps selected by the generative model based on the object, task, and context.
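To make the idea concrete, here is a minimal Python sketch of a single generalized fetch-and-place plan whose grasp parameter is filled in from the object, task, and context. CRAM plans are actually written in the Lisp-based CPL; all names below (Context, infer_grasp, fetch_and_place) are hypothetical stand-ins, not CRAM's API.

```python
# Illustrative sketch only: CRAM plans are written in the Lisp-based CPL;
# every name below is a hypothetical stand-in.
from dataclasses import dataclass

@dataclass
class Context:
    object_type: str   # e.g. "cup"
    task: str          # e.g. "table-setting"

def infer_grasp(ctx: Context) -> str:
    """Stand-in for the generative model that chooses a grasp
    from the object, task, and context (cf. the figure above)."""
    return "top-grasp" if ctx.object_type == "cup" else "side-grasp"

def fetch_and_place(ctx: Context, target: str) -> None:
    """A single generalized plan body; only its parameters vary per episode."""
    grasp = infer_grasp(ctx)
    print(f"perceive {ctx.object_type}")
    print(f"pick up {ctx.object_type} with {grasp}")
    print(f"place {ctx.object_type} at {target}")

fetch_and_place(Context("cup", "table-setting"), "dining-table")
```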
KnowRob2.0 Introduction
Proposed KnowRob2.0, a second-generation knowledge representation and reasoning (KR&R) framework for robot agents. KnowRob2.0 is an extension and partial redesign of KnowRob that provides the KR&R mechanisms a robot needs to make informed decisions about how to parameterize its motions in order to accomplish manipulation tasks. The extensions and new capabilities include highly detailed symbolic/subsymbolic models of environments and robot experiences, visual reasoning, and simulation-based reasoning. The redesign adds an interface layer that unifies heterogeneous representations behind a uniform entity-centered, logic-based knowledge query and retrieval language.

KnowRob2.0 Architecture.
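The sketch below only illustrates the idea of a uniform, entity-centered query interface over heterogeneous knowledge sources; KnowRob2.0's actual query language is logic-based (Prolog), and the KnowledgeBase class and its ask method here are hypothetical.

```python
# Hypothetical sketch of an entity-centered query layer; KnowRob2.0's real
# interface is a logic-based query language, not this Python API.
class KnowledgeBase:
    def __init__(self):
        # Heterogeneous sources unified behind one interface:
        # symbolic facts, geometric scene data, logged experiences, ...
        self.facts = {
            ("cup-1", "type"): "Cup",
            ("cup-1", "pose"): (1.2, 0.4, 0.85),
            ("cup-1", "graspable-by"): "top-grasp",
        }

    def ask(self, entity: str, relation: str):
        """Uniform query: answers are retrieved per entity, regardless of
        which underlying representation actually holds them."""
        return self.facts.get((entity, relation))

kb = KnowledgeBase()
print(kb.ask("cup-1", "pose"))          # subsymbolic geometric data
print(kb.ask("cup-1", "graspable-by"))  # symbolic/affordance knowledge
```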
SOMA Ontology
Employed a collection of ontologies organized under a very concise foundational (top-level) ontology called SOMA (Socio-physical Model of Activities). SOMA is a parsimonious extension of the DOLCE+DnS Ultralite (DUL) ontology, in which additional concepts and relations provide a deeper semantics of autonomous activities, objects, agents, and environments. SOMA is complemented by various subontologies that provide background knowledge on everyday activity and on robot and human activity, including axiomatizations of NEEMs [48], common models of actions, robots, affordances, execution failures, and so on.

Representation of the physical and computational processes and representations involved in the execution of actions in the SOMA ontology
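As a rough illustration of the layering described above, the mock-up below mimics how SOMA-level concepts refine DUL-level ones. The real ontologies are OWL artifacts with formal axioms; the Python classes and concept placement here are simplified assumptions for illustration only.

```python
# Hypothetical mock-up of the DUL/SOMA layering; the real ontologies are OWL
# artifacts with formal axioms, not Python classes.

# DUL-level (foundational) concepts:
class Event:
    """Something that happens in time."""

class PhysicalObject:
    """An object with spatial extent."""

class Action(Event):
    """An event brought about intentionally by an agent."""

# SOMA-level refinements adding deeper activity semantics:
class Affordance:
    """A disposition of an object that enables a task, e.g. graspability."""

class ExecutionFailure(Event):
    """A modeled way in which executing an action can go wrong."""

class Grasping(Action):
    """A manipulation action refined with everyday-activity semantics."""

# Background knowledge is then stated as axioms and instances over these
# concepts, e.g. "a Cup has an Affordance that enables a Grasping action".
```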
NEEM Hub Establishment
The NEEM Hub is the data store of robot agents: it stores and manages NEEMs (Narrative-Enabled Episodic Memories) and provides a software infrastructure for analyzing and learning from them. NEEMs store the data generated by robot agents during everyday manipulation in a form that enables knowledge extraction. More formally, NEEMs are CRAM's generalization of episodic memory, encapsulating subsymbolic experiential episodic data, motor-control procedural data, and descriptive semantic annotation, together with the accompanying mechanisms in KnowRob for acquiring them and learning from them.

NEEM-HUB (for acquiring, curating, and publishing NEEMs)
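A minimal sketch of the three kinds of data a NEEM bundles, following the description above. The dataclass layout and field names are hypothetical; the actual NEEM Hub stores NEEMs as knowledge-base logs plus associated sensor data, not as Python objects.

```python
# Hypothetical sketch of a NEEM's contents; field names are illustrative only.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class NEEM:
    # Descriptive semantic annotation ("narrative"): what was done and why
    narrative: Dict[str, str]
    # Subsymbolic experiential data: poses, images, force readings over time
    sensor_streams: Dict[str, List[tuple]] = field(default_factory=dict)
    # Procedural data: which plan steps and motion parameters were executed
    executed_plan: List[str] = field(default_factory=list)

episode = NEEM(
    narrative={"task": "fetch-and-place", "object": "cup-1", "outcome": "success"},
    sensor_streams={"gripper_pose": [(0.0, (1.2, 0.4, 0.85))]},
    executed_plan=["perceive cup-1", "pick up cup-1 with top-grasp", "place cup-1"],
)
print(episode.narrative["outcome"])
```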
Automated Modelling of Human Activity
Developed AMEVA (Automated Models of Everyday Activities), a system that observes, interprets, and records fetch-and-place tasks, automatically abstracts the observed activities into action models following the Flanagan model, and represents these models as NEEMs. The resulting NEEMs can be used to answer questions about the observed human activities and to learn generalized knowledge from collections of NEEMs.

Collecting NEEMs from human demonstration
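As a toy illustration of abstracting an observed demonstration into Flanagan-style action phases, the sketch below segments a simplified hand-state log by contact and motion. AMEVA itself works on much richer VR and motion data; the frame format, phase labels, and segmentation rules here are assumptions made purely for illustration.

```python
# Toy segmentation of a demonstration into Flanagan-style action phases.
# The frame format and rules below are hypothetical simplifications.
def segment_phases(frames):
    """frames: list of (time, hand_contact: bool, hand_moving: bool).
    Returns a list of (phase, start_time) entries."""
    phases = []
    prev = None
    for t, contact, moving in frames:
        if not contact and moving:
            # moving without contact: reaching toward or retracting from the object
            phase = "reach" if not phases or phases[-1][0] != "transport" else "retract"
        elif contact and not moving:
            phase = "grasp"
        elif contact and moving:
            phase = "transport"
        else:
            phase = "idle"
        if phase != prev:
            phases.append((phase, t))
            prev = phase
    return phases

demo = [(0.0, False, True), (0.6, True, False), (0.9, True, True),
        (1.8, False, True), (2.2, False, False)]
print(segment_phases(demo))
# [('reach', 0.0), ('grasp', 0.6), ('transport', 0.9), ('retract', 1.8), ('idle', 2.2)]
```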