Brown-MIT Hackathon Feb 2012: PR2 Commander

This is the project information page for the PR2 Commander hackathon (also known as “Move that Block!”). The hackathon is a collaboration between the Brown Robotics Group and the Robust Robotics Group at MIT. Stefanie Tellex and Sam Prentice have joined us in the Brown AI lab, where we are hacking on the PR2 robot!

Objective:

The goal of the hackathon is to implement a natural language command interface for the PR2 robot. We want the robot to receive natural language commands (such as “pick up the red block”) and perform the appropriate action. Towards this goal, we will integrate MIT’s Spatial Language Understanding (SLU) system (and its use of Mechanical Turk) with Brown’s Castle Builder using rosbridge, as well as add various additional features.

At the end of this hackathon, we will have preliminary functionality for commanding pick-and-place of toy blocks from natural language, given (as speech or text) through a web interface.

Flickr set

The source code for the hackathon will (eventually) be placed in the following repository:
http://code.google.com/p/brown-mit-pr2-nlu/

Final result video:

Contributors:

Brown:
Arjun Arumbakkam
Stephen Brawner
Trevor Jay
Jihoon Lee
Jonathan Mace
Aaron Silverman
Evan Wallace
MIT:
Stefanie Tellex
Sam Prentice
Advisor:
Chad Jenkins

Initial Schedule

Feb. 27
o Kick off at 10:00am
o Initial Discussion
- Repository
- To-do tasks
- Roles
o Finalize the system design
o Take videos of “pick-up” tasks
o Summary of the day at 5:00pm
Feb. 28
o Progress meeting at 10:00am
o Work, Work, Work
o Make at least 5 videos
o Summary of the day at 5:00pm
Feb. 29
o Progress meeting at 10:00am
o Work, Work, Work
o Summary of the day at 5:00pm
Mar. 01
o Project Summary

Pre-hackathon: Setup

(Photos: MIT painted blocks and the pre-hackathon setup.)

Day 1: Organization

The first day kicked off with a discussion of the project’s goals and technical details. Stefanie and Jihoon presented the MIT and Brown material, respectively.

Breakdown of Work:

  • Figure out the project structure (how to ROS-ify SLU). (SLU is Spatial Language Understanding, the name of our codebase.)
  • Create the new repo; add everyone and check out the code. (Jon)
  • Extract color information from the blocks. (Jihoon)
  • Add color to the visualization. (Jihoon)
  • Object tracking. (Jihoon)
  • Test grabbing and placing. (Stephen, Jihoon)
  • Create videos of the PR2 paired with ROS logs. (Stephen, Sam)
  • Extract object and robot trajectories from the ROS logs and export them as XYZT sequences; see the sketch after this list. (Stephen, Sam)
  • Extract perceptual features from the logs. (Stephen, Sam)
  • Create a state/action model of the PR2 in the SLU codebase. (Stefanie, Arjun, Jon)
  • Implement the SLU ROS node. (Stefanie, Jon)
  • Upload videos to Turk and get commands. (Stefanie, Arjun)
  • Annotate data, train the model, and run inference. (Stefanie, Arjun, Jihoon)
  • Read the PR2 state and send it to SLU. (Jon, Sam, Stefanie)
  • Send the SLU inference results to the PR2. (Stefanie, Sam, Jon)
  • Demo! (Everyone!)
  • Documentation and photographs. (Jon)
  • Front-end. (Jon)
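
One of the log tasks above calls for turning ROS logs into XYZT sequences. Here is a minimal sketch of that extraction using the standard rosbag Python API; the topic name and message type are assumptions for illustration, not the actual topics we record.

    # Sketch: pull an XYZT trajectory out of a rosbag.
    # The topic name and PoseStamped layout are illustrative assumptions.
    import rosbag

    def extract_xyzt(bag_path, topic='/block_tracker/pose'):
        trajectory = []
        bag = rosbag.Bag(bag_path)
        for _, msg, t in bag.read_messages(topics=[topic]):
            p = msg.pose.position  # assumes geometry_msgs/PoseStamped
            trajectory.append((p.x, p.y, p.z, t.to_sec()))
        bag.close()
        return trajectory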


End of day 1 status update:

Jihoon

Working on the controller and manager; it was previously implemented in C++, but he changed course and is now working in Python.
Goal: Get the new manager working and grasping working properly

Arjun

Working on the PR2 state-action representation for the blocks hackathon. Read Stefanie’s paper and looked through her code to figure out how to represent states. The first pass is an implementation that uses the forklift cost function; once that is going, he will swap in a new cost function for the PR2 blocks world.
Goal: Wrap up the state-action space work with Stefanie

Stefanie

Coordinated with Arjun and Jon about ROS and made high-level design choices. Helped Arjun out with the state-action representations.
Goal: Get the state-action space running with Arjun and get end-to-end working with rosbridge

Jon

Design discussion with Stefanie, Chad, and Trevor about rosbridge.py.
Began implementing rosbridge.py.
Goal: Finish the rosbridge implementation

Sam

Began setting up rosbag recording and set up a camera to collect video of the PR2. Started looking at parsing the logs.
Goal: Finish getting the rosbags parsed

Stephen

Hardware support; helped Sam set up the PR2 environment; worked with Jihoon on getting the robot to pick up arbitrarily located blocks based on their rotation. Wrote Python scripts for the videos.
Goal: Finish the video scripts and get grasping working properly

Day 2: Putting the Pieces Together

Tuesday Morning Status:

Stefanie

Rosbridge running. Began the SLU manager, which reads the robot pose as a transform and sends it over rosbridge to SLU, where SLU builds a state. (A sketch of the pose-reading step follows below.)
State-action space in SLU using the forklift cost function.
Made a plan for the messages for communicating with the pick-and-place manager.
Goal: End-to-end demo with a random cost function
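
To give a flavor of the pose-reading step, here is a minimal sketch using the standard ROS tf listener; the frame names are assumptions, and the actual SLU manager code may differ.

    # Sketch: read the robot pose from tf, ready to ship over rosbridge as JSON.
    # Frame names are illustrative assumptions.
    import rospy
    import tf

    rospy.init_node('slu_pose_reader')
    listener = tf.TransformListener()
    listener.waitForTransform('/map', '/base_link', rospy.Time(0), rospy.Duration(5.0))
    trans, rot = listener.lookupTransform('/map', '/base_link', rospy.Time(0))
    pose = {'xyz': list(trans), 'quaternion': list(rot)}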

Stephen

Has the PR2 picking up blocks and putting them down in arbitrary positions and orientations. Brought Sam and Stefanie up to date with the PR2.
Goal: Do some hacking to track blocks as they’re being picked and placed, and improve pick-and-place. Finish the scripts for making the videos.

Jon

Wrote the rosbridge Python client implementation. Now working on the end-to-end demo with Stefanie, and in particular cobbling together an interface.

Aaron

Working on material to help Stephen with robots for education (enclosure escape).
3D visualization modifications for arbitrary block placement.
Goal: Continue both

Arjun

Working with Stefanie on the state-action space; getting a mapping between what the PR2 thinks and what SLU does (a lot of MIT in-house code).
Defined the state as a set of blocks (catchment area / placed-block grid).
Actions defined: pick up and put down. (A rough sketch of this representation follows below.)
Today: Working on connecting the SLU manager to the block builder manager using the rosbridge client.
Goal: Parse the SLU manager output and convert it to a ROS message for sending over rosbridge.
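
For the curious, a rough Python sketch of the kind of state/action representation described above; the class and field names are guesses for illustration, not the actual SLU code.

    # Sketch of the blocks-world state/action model; names are hypothetical.
    from collections import namedtuple

    Block = namedtuple('Block', ['id', 'color', 'x', 'y', 'z'])

    class BlocksState(object):
        def __init__(self, catchment_blocks, grid_blocks):
            self.catchment_blocks = catchment_blocks  # unplaced blocks in the catchment area
            self.grid_blocks = grid_blocks            # blocks already placed on the grid

    # The two actions defined:
    PickUp = namedtuple('PickUp', ['block_id'])
    PutDown = namedtuple('PutDown', ['block_id', 'x', 'y'])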

Jihoon

Finished writing a new build manager in Python. Defined the message types for passing between the controller and SLU with Arjun (a hypothetical example payload follows below). Worked with Stephen on block picking.
Currently: Debugging block placement.
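
The exact fields of those message types weren’t written down here, so purely as a hypothetical illustration, a command from SLU to the build controller might carry a payload along these lines:

    # Hypothetical pick-and-place command payload; the actual message types
    # Jihoon and Arjun defined may have different names and fields.
    command = {
        'action': 'pick_and_place',
        'block': {'id': 3, 'color': 'blue'},
        'target': {'x': 0.45, 'y': -0.10},  # tabletop coordinates in meters
    }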

Sam

Worked on zzz’s

 

Brown Robotics rolls first class

End of Day 2 Update:

Stefanie:

Crazy rosbridge bug.
Goal: Close to a demo; just need to fix the bug and integrate with Jihoon

Aaron

Finished the block builder rotation visualization

Stephen

Mostly complete; just a few IK failures right now.

Jihoon

Debugging pick ‘n’ place with Stephen. The build manager is mostly working now. Will work on color extraction next.

Sam

Working on data extraction for blocks

Jon

Cobbled together a web page and worked on end-to-end passing of messages from the SLU rosbridge to the webpage rosbridge. Investigated HTML5 voice input.

Day 3: Initial System Working

At the start of day 3, we have a first pass at an end-to-end system. The system runs SLU, passing messages through a rosbridge client on the SLU side, to the ROS system running on the robot. Another rosbridge client, running in a web browser, is used to control the overall system; a sketch of the publish pattern appears below. Check out the video to see it in action:
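
For readers who want to try this message-passing pattern themselves: our rosbridge.py client and its wire protocol differed in the details, but the idea of JSON messages to a rosbridge server survives in today’s libraries. A hedged sketch using the modern roslibpy client, where the host, topic, and type names are made up:

    # Sketch: publish a natural language command over rosbridge.
    # Host, topic, and message type are illustrative assumptions;
    # roslibpy is a present-day stand-in for our hackathon-era client.
    import roslibpy

    ros = roslibpy.Ros(host='pr2.example.edu', port=9090)
    ros.run()  # connect to the rosbridge server

    commands = roslibpy.Topic(ros, '/slu/command', 'std_msgs/String')
    commands.publish(roslibpy.Message({'data': 'pick up the red block'}))

    ros.terminate()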

Day 3 Status Update:

Stefanie:
Rosbridge bug fixed and the end-to-end demo complete; now we just need to get the cost function right.
Goal: Get colors correctly into the color manager; reuse the same code for rosbag state extraction and live state extraction

Jon:
Did the end-to-end web work: voice input and the web interface.
Goal: Assist Jihoon with problems; act as curator of the previous work

Stephen:
Working to get block grasping as robust as possible. It is now quite robust and ready for the MTurk videos.

Sam:
Helped get state and pose extraction from rosbags working offline for log post-processing.

Goal: Synchronize messages and message names


End of day status:
Stephen and Sam recorded some videos for turkifying. Stefanie uploaded the videos to Turk and got some initial results. The Turk videos (and some initial responses) are as follows:

“Move the blue block in front of the red block.”
“Move the blue block near the orange block”

“Pick up the block closest to you and move it six inches to the right”
“Move the green block about a foot to the right of the orange block.”
“Pick up the blue block and move it to your right, towards the end of the table.”
“Move the blue block away from the orange block”

Screenshot of the web speech interface:

Day 4: End-to-end system

Summary thus far, in the words of Stefanie: “We have an entire system, we have the ability to make videos. It would be good to make some videos with logs and hand annotate some commands this morning, for process’s sake. Then, we should make an end-to-end demonstration of the whole system.”

Stefanie:
Took Jihoon’s video, uploaded it to Turk, and got 10 commands back from Turkers. Fixed a few bugs in the log parser/extractor. Can now read a rosbag and draw the trajectory of the robot in SLU.
Made stub training targets for SLU, so all that’s needed is the YAML files and then it can just go.
Goals: Get the end-to-end system going. 5 videos, 5 logs.

Stephen:
Worked on picking and placing and is confident it’s as robust as it needs to be. Then started making videos with Sam.

Jihoon:
Worked with Stephen to get it working. Did the block color work. Writing instructions.

Aaron:
Working on the tracking server with Stephen

Jon:
A little bit of help for Jihoon; looked into ripping out the wviz visualization

Goal:
End-to-end system
5 videos, 5 logs, 2 commands
Trained model
Demo video of the end-to-end system
Run everything from the MIT machine and the Brown machine
Next steps: repository, 25 videos


The results!

The video below shows a demonstration of the most basic functionality of the system. With a few more demonstrations (from MTurk) and a wider variety of commands, we will see some very interesting behavior. Watch this space!

Unofficial song of the hackathon

Unofficial slogan of the hackathon


 
