Cognitive Robotics: PERI.2

PERI.2 is a Perception Enabled Robotic Intelligence: an integration of an arm and hand for manipulation, cameras with integrated image recognition, and predicate-based formal reasoning and planning. Using these elements, the PERI.2 project aims to enable physical, dexterous manipulation of objects and interaction with the environment. The central demonstration of the project is the integration of formal planning with these robotic elements. This integration is aimed at a number of targets, including puzzle solving, non-rigid object manipulation, and operating with impaired perception.

The physical hardware of the system consists of an industrial robot arm fitted with a Barrett hand, a three-fingered end-effector capable of pinching and grasping, with tactile sensors on the fingers and a suite of sensors in the palm. The cameras are three Cognex integrated vision cameras capable of independent image recognition, whose software supports one-shot learning and tuning. PERI.2’s software consists of the Robot Operating System (ROS) integrated with Spectra, a RAIR Lab automated planner. By encoding percepts and the world state as predicates, and physical actions as macro-level, actionable units, the formal reasoner can connect high-level reasoning with the appropriate actions and sensor information.
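
As a minimal illustrative sketch (the class names, detection format, and thresholds here are assumptions for exposition, not the project's actual interfaces), percepts might be lifted into predicates and grounded actions exposed as macro units roughly as follows:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Predicate:
    """A ground atom describing part of the world state, e.g. OnTable(blockA)."""
    name: str
    args: tuple

def percept_to_predicates(detection):
    """Lift a hypothetical camera detection into predicates the planner can reason over.

    `detection` is assumed to be a dict like {"label": "blockA", "pose": (x, y, z)}.
    """
    label = detection["label"]
    x, y, z = detection["pose"]
    preds = {Predicate("Object", (label,)), Predicate("At", (label, (x, y, z)))}
    if z < 0.02:  # assumed table-height threshold in meters
        preds.add(Predicate("OnTable", (label,)))
    return preds

class PickUp:
    """A macro action: one planner-level step that expands into low-level motion commands."""
    def __init__(self, obj):
        self.obj = obj
        self.preconditions = {Predicate("OnTable", (obj,)), Predicate("HandEmpty", ())}
        self.add_effects = {Predicate("Holding", (obj,))}
        self.del_effects = {Predicate("OnTable", (obj,)), Predicate("HandEmpty", ())}

    def execute(self, arm, hand):
        # Placeholder: the real system would issue ROS trajectory and grasp commands here.
        arm.move_above(self.obj)
        hand.close()
        arm.lift()
```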

Among the project objectives there is a particular emphasis on non-trivial reasoning, demonstrated primarily by solving a physical puzzle. The puzzle, Metaforms, consists of a set of small blocks and visually represented rules. The rules must first be observed and considered; by designing the appropriate interactions, PERI.2 can work through the puzzle and produce a valid solution. Arriving at this solution will produce an actionable plan (or at least allow the creation of one). PERI.2 will then arrange the blocks appropriately and formally verify that the physical puzzle is solved given the stated rules.
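
As a sketch of what the final verification step could look like (the specific rules below are hypothetical stand-ins, not the actual Metaforms rule set), each observed rule can be compiled into a constraint over block placements and checked against the perceived arrangement:

```python
# Hypothetical Metaforms-style rules over a 3x3 grid of block placements.
# A placement maps a block name to its (row, col) cell; rules are constraint functions.

def adjacent(placement, a, b):
    """True if blocks a and b occupy horizontally or vertically adjacent cells."""
    (r1, c1), (r2, c2) = placement[a], placement[b]
    return abs(r1 - r2) + abs(c1 - c2) == 1

def rule_red_left_of_blue(placement):
    return placement["red_pyramid"][1] < placement["blue_cube"][1]

def rule_green_not_adjacent_to_yellow(placement):
    return not adjacent(placement, "green_cube", "yellow_pyramid")

def verify(placement, rules):
    """Formally check that an observed arrangement satisfies every stated rule."""
    return all(rule(placement) for rule in rules)

observed = {
    "red_pyramid": (0, 0),
    "yellow_pyramid": (0, 1),
    "blue_cube": (0, 2),
    "green_cube": (2, 0),
}
print(verify(observed, [rule_red_left_of_blue, rule_green_not_adjacent_to_yellow]))  # True
```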

Interaction with non-rigid objects is also a goal, here in the context of field medical applications. Specifically, a Sammy Splint, a long, semi-rigid rectangular sheet that can be bent into a number of shapes, will be manipulated around a (fake) arm to simulate splinting it. The system is to bend the splint into a particular shape and then form it to the arm. We are also considering the feasibility of bandaging the arm and visually checking its health.
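
One way to make the splinting task plannable and checkable (a sketch under assumed representations and tolerances, not the project's settled approach) is to discretize the splint into segments with target bend angles and measure how closely the achieved shape conforms:

```python
import math

# The splint is modeled as a chain of rigid segments; a shape is the list of
# bend angles (radians) between consecutive segments. Values here are assumptions.
TARGET_FOREARM_SHAPE = [0.0, 0.0, math.pi / 2, math.pi / 2, 0.0]  # hypothetical L-shape
ANGLE_TOLERANCE = 0.15  # radians

def shape_error(measured, target):
    """Maximum per-joint deviation between measured and target bend angles."""
    return max(abs(m - t) for m, t in zip(measured, target))

def splint_conforms(measured_angles):
    """True if the splint, as estimated from vision, matches the target shape."""
    return shape_error(measured_angles, TARGET_FOREARM_SHAPE) <= ANGLE_TOLERANCE

# Example: bend angles estimated from vision after the bending actions complete.
print(splint_conforms([0.05, -0.02, 1.52, 1.60, 0.01]))  # True
```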

The final goal concerns occluded vision. While PERI.2 is attempting to observe and solve these problems, some occlusion may occur; here it is introduced with a fog machine. The system will need to recognize that its vision will not always be complete and navigate or reason around that. By observing the world from more than one angle, under varying lighting, and across multiple time steps, PERI.2 will be able to resolve observations that fit valid rules and solutions. The presence of occlusion can be inferred or observed directly, compelling the system to act more cautiously and perform verifying observations.
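
A minimal sketch of this cross-view, cross-time corroboration (the detection format and confirmation threshold are assumptions): a percept is only asserted as a predicate once enough independent observations agree on it, and a surplus of unconfirmed detections signals likely occlusion, prompting further verifying observations.

```python
from collections import Counter

MIN_CONFIRMATIONS = 2  # assumed number of views/time steps needed to assert a fact

def fuse_observations(observation_sets):
    """Each observation set holds the labels detected in one view or time step.

    Returns (confirmed_labels, occlusion_suspected).
    """
    counts = Counter(label for obs in observation_sets for label in obs)
    confirmed = {label for label, n in counts.items() if n >= MIN_CONFIRMATIONS}
    unconfirmed = set(counts) - confirmed
    # If more labels are unconfirmed than confirmed, suspect occlusion and
    # schedule additional, more cautious verifying observations.
    occlusion_suspected = len(unconfirmed) > len(confirmed)
    return confirmed, occlusion_suspected

views = [
    frozenset({"blue_cube", "red_pyramid"}),  # camera 1
    frozenset({"blue_cube"}),                 # camera 2 (fog obscures the pyramid)
    frozenset({"blue_cube", "red_pyramid"}),  # camera 1, later time step
]
print(fuse_observations(views))
```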

In summary, the project is an exercise in demonstrating the capabilities of certain cognitive robotics principles and techniques in the real world, and it is expected to be extensible to further reasoning demonstrations as the hardware matures.