Internal:Bthomas weekly agenda
From Brown University Robotics
- Working on the NSF application; need help with the research plan essay. I don't know the current state of the multi-end-effector field, nor any of the methodology/algorithms associated with it.
- Aggeliki is demonstrating the multirobot coverage work to me today, 11am-2pm; more info after that's done.
- Sarah and I don't know what your schedule is for the SF/DC trips, and we need to make flight reservations.
- Multirobot: Working on infrastructure (one command -> robots go). The necessary parts are essentially done, so I'll start looking at and running Aggeliki's code now.
Aaron Edsinger and Charles C. Kemp: Toward Robot Learning of Tool Manipulation from Human Demonstration
Charles C. Kemp and Aaron Edsinger: Robot Manipulation of Human Tools: Autonomous Detection and Control of Task Relevant Features
- Working on NSF essays
- After NSF, will work on multirobot
- Read LfD and multirobot survey papers:
2/8 Surveys of Robot LfD and Manipulation
* A Survey of Robot Learning from Demonstration.
B. Argall, S. Chernova and M. Veloso.
Robotics and Autonomous Systems. Vol. 57, No. 5, pages 469-483, 2009.
* Challenges for Robot Manipulation in Human Environments.
C.C. Kemp, A. Edsinger, and E. Torres-Jara.
IEEE Robotics &amp; Automation Magazine. Vol. 14, No. 1, pages 20-29, March 2007.
- Updated the overhead system in 404 to use launch files -- simple, clean restarts now!
- Need to discuss NSF GRFP essays, particularly the topic of "propose a research topic" -- what topic should I research?
- overhead_map complete
- overhead_map documentation complete
- Read and reviewed "Collective Decision-Making in Multi-Agent Systems by Implicit Leadership" (Yu, Werfel, and Nagpal / Harvard).
- I talked with Aggeliki about her multirobot system and am trying to think of how I want to tackle the subject.
- Focus for this week: Complete and get NSF GRFP applications out the door, and get work done for classes.
- Other todos: multirobot work, lit. search.
- overhead_map documentation (pretty pictures are there this week)
- Improvements to how overhead_map handles backgrounds
- xml files
- However, AJAX doesn't work over file:// URLs in Chrome, which blocks local-file requests. (Solution: use Firefox, or host the files over HTTP.)
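The HTTP workaround above can be as simple as Python's built-in web server; the directory path here is illustrative, not the actual repository layout:

```shell
# Serve the overhead_map web files over HTTP so Chrome permits the
# AJAX calls that it blocks for file:// URLs.
cd ~/ros/overhead_map/web        # illustrative path to the web files
python3 -m http.server 8000      # (Python 2: python -m SimpleHTTPServer 8000)
# then browse to http://localhost:8000/overhead_map.html
```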
- "Mini" remote lab done
- 1 Lenovo + 1 iRobot Create + 1 PS3 camera => overhead_map (for AR tag recognition) and teleop_trackpad (for web-based teleoperation)
- Robot can be given a border outlined by tape, and it will "bounce" off of it (using virtual_border)
- Lenovo is connected to iRobot Create via bluetooth, so theoretically, all we need to do is get the Create to dock to allow for continuous operation.
- Still todo until "actually done": Calibrate planar_tracker and overhead_map downstairs, set up a tape border, and figure out how the network works downstairs and set up with respect to that. (Waiting on networking and room construction to do that.)
- Todo this week:
- Lit search (I didn't end up doing much on this last week.)
- Start working on NSF fellowship?
Distributed tasks: overhead_map files hosted on server (right), with rosjs and omclock running on "robot" (center), and the overhead_map display running on a laptop's web browser (left).
- Finished overhead_map
- Accepts type Overhead_Map_Objs and publishes specified objects (point, line, arrow, image) to screen.
- Web files: overhead_map.html, overhead_map.js
- Associated nodes: om_msgs (messages), omclock (clock demo), omdc (planar_tracker display demo)
- All of the above are in brown-ros-pkg/experimental.
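To make the Overhead_Map_Objs interface concrete, here is a hedged sketch of the kind of object list such a message might carry before rosjs ships it to overhead_map.js. The field names (`type`, `x`, `y`, `x2`, `heading`) are assumptions for illustration; the real definitions live in om_msgs and may differ:

```python
import json

# Hypothetical sketch: build drawable objects of the four supported
# kinds (point, line, arrow, image) as plain dicts. The real message
# fields are defined in om_msgs; these names are illustrative only.
def make_obj(kind, x, y, **extra):
    """Return one drawable object for the overhead map display."""
    assert kind in ("point", "line", "arrow", "image")
    obj = {"type": kind, "x": x, "y": y}
    obj.update(extra)  # e.g. x2/y2 for a line's endpoint, url for an image
    return obj

objs = [
    make_obj("point", 10, 20),
    make_obj("line", 0, 0, x2=50, y2=50),
    make_obj("arrow", 30, 30, heading=1.57),
]
payload = json.dumps(objs)  # roughly what rosjs would hand to overhead_map.js
```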
- Made documentation
- Installed apache on bold, and put overhead_map files there.
- For CS148, students run rosjs (plus all usual nodes) on robot, and they can view webpages from their own laptops (or on the CS148 laptops, if Chrome is installed).
- Updated front page of brown-robotics.org
- Next goal: Work with Aggeliki and multirobot coordination.
[Meeting will not occur this week due to scheduling conflict with mandatory first-year meeting.]
- Problem: Lenovo is too slow to run everything at once
- Throttled gscam->ar_recog pipe to 1 frame/sec. (gscam still operating at full frequency)
- Reduced PS3 cam to minimum pixels/second (also tried min. frames/sec., but this turned out worse)
- NXT responds within 1-3 seconds, but this isn't fast enough...
- Based on top, rosjs.py now takes up 70% of CPU time.
- Guess: rosjs.py itself is the bottleneck.
- Can test by replacing controller with teleop_twist_keyboard and seeing if that helps.
- Potential solutions:
- Speeding up rosjs, if that's the problem
- Getting a faster computer
- Currently working on restructuring overhead_map.js to accept input from a user's topic
- Points, lines, arrows, and images
- Will also add simple user node, as an example controller.
Lego NXT Mini Remote Lab
All the nodes running on remote lab
All the nodes, minus the unnecessary odometry ones present in nxt_ros
- Continued to merge changes with brown-ros-pkg
- New nodes:
- nxt_robot_mindstorms2_quickstart (brown-ros-pkg/experimental)
- Sensors in same arrangement as found in Mindstorms NXT 2.0 quickstart guide.
- Depends on the nxt_ros stack
- Not fully functional (e.g., model not built for odometry, etc.)
- virtual_border (brown-ros-pkg/experimental)
- (i.e., border_container from last week)
- Threshold now taken as argument
- rosjs_node (not added to repository)
- Trevor is working on making rosjs a node; this is a holdover until then.
- Lenovo being modified to work for remote lab
- gscam is up and running with the modified ps3 driver
- Need to train ar_recog (how?)
- Still need Francois to show me the wifi fix, but have wired ethernet working for now.
- New tutorial: Internal: NXT Mini Remote Lab
- Lenovo is too slow to run everything at once
- Can run ar_recog OR nxt, but not both.
- Running both results in teleop_trackpad being unusable. It appears as if cmd_vel isn't being broadcast. I suspect the real problem is that something in nxt_ros locks up if everything starts running slow.
- Tested, and running both works well enough on my laptop.
- nxt's launch file has lots of odometry nodes built in; I removed as many of them as possible and the situation improved a bit, but not enough.
- Current implementation: python launches chrome with local URL
- Makes launch files easy to run
- Organized inside ROS
- Will probably need to move files for over-the-network use, but this will work for testing for now.
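The "python launches chrome with local URL" step can be sketched in a few lines; the page name and browser binary here are assumptions, and the real launcher may differ:

```python
import os
import subprocess

# Sketch of the current launcher: build a file:// URL for a local
# overhead_map page and open Chrome on it. Over-the-network use would
# swap the file:// URL for an http:// one served from the robot.
def page_url(page):
    """Turn a local web file into a file:// URL."""
    return "file://" + os.path.abspath(page)

def launch_viewer(page="overhead_map.html", browser="google-chrome"):
    return subprocess.Popen([browser, page_url(page)])
```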
- Where are we going, research-wise?
- We have a "remote lab" -- what then?
- Is there a good starting place for reading papers in the field, or is it just "read a lot"?
Lego NXT Mini Remote Lab
Screenshot of NXT Minilab
The NXT Minilab, including the foreign AR tag host
- NXT up and running under ROS (Internal:NXT_ROS)
- Currently uses USB. Bluetooth support "exists", but it is very flaky under Linux. (I got it to work a grand total of once in about 6-7 tries.)
- Decent (but not great) driver that uses Twist is already available in package.
- Uses overhead_map to display top-down view of area.
- Uses teleop_trackpad to control robot.
- Robot was built by instructions provided in Mindstorms kit; sensors were then piled on top where they could fit, and a pole was built on top to support a large AR tag and a long USB cord.
- Lab area can be established using thick masking tape lines.
- border_container intercepts the broadcast Twist messages and uses an intensity sensor to look for lines. If one is run over, it reverses the robot's direction, preventing it from leaving the lab area.
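The border logic above reduces to a small command filter; this is a hedged sketch of the idea (function names, the backup speed, and the dark-tape comparison direction are assumptions, not border_container's actual code):

```python
# Sketch of border_container's behavior: pass the teleop Twist through
# unchanged unless the intensity sensor indicates the robot is over the
# tape line, in which case substitute a reverse command so the robot
# "bounces" back inside the lab area.
def sees_tape(intensity, threshold, dark_tape=True):
    """True if the reading crosses the calibrated tape threshold."""
    return intensity < threshold if dark_tape else intensity > threshold

def filter_cmd(linear_x, angular_z, on_tape, backup_speed=0.1):
    """Return the (linear.x, angular.z) actually sent to the robot."""
    if on_tape:
        return (-backup_speed, 0.0)  # reverse straight back off the line
    return (linear_x, angular_z)     # pass the teleop command through
```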
Robot soccer overhead viewer (overhead_map)
Screenshot of soccer field
Not quite roombas in real life...
- Easy to add new maps and reconfigure existing maps inside of overhead_map.
- Uses canvas element (new to html5) to render graphics.
- Added multicamera_resolution.py to planar_tracker
- Takes AR tags from an arbitrary number of cameras (give topics on command line), and averages the values for each id found. Then, presents a "unified" view of the tags as output.
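The merge step in multicamera_resolution.py can be sketched as a per-id average; this toy version uses plain (x, y) positions for brevity (real tag poses would also need a circular mean for orientation, which is omitted here):

```python
from collections import defaultdict

# Sketch of the "unified view" step: group AR tag observations from
# all cameras by tag id, then average each id's positions into a
# single estimate.
def unify(observations):
    """observations: iterable of (tag_id, x, y) tuples from all cameras."""
    by_id = defaultdict(list)
    for tag_id, x, y in observations:
        by_id[tag_id].append((x, y))
    return {
        tag_id: (sum(p[0] for p in pts) / len(pts),
                 sum(p[1] for p in pts) / len(pts))
        for tag_id, pts in by_id.items()
    }
```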
(Note this meeting didn't actually occur; this is a record of activities up until this date.)
Two demos were made, aimed particularly at iRobot Creates, though they can theoretically be used by any robot that can receive geometry_msgs/Twist publications:
- "Touchpad" controller interface for robots
- Horizontal axis is angular velocity.
- This can be counterintuitive when driving backwards
- A turn angle might be more appropriate
- Vertical axis is linear velocity.
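The axis mapping above is a two-line transform; a minimal sketch, assuming normalized pad coordinates in [-1, 1] and illustrative speed limits (the real controller's scales and sign conventions may differ):

```python
# Sketch of the touchpad mapping: vertical pad position -> linear
# velocity, horizontal pad position -> angular velocity, as in a
# geometry_msgs/Twist. max_linear/max_angular are assumed constants.
def pad_to_twist(px, py, max_linear=0.5, max_angular=1.0):
    linear_x = py * max_linear      # up = forward
    angular_z = -px * max_angular   # right = clockwise (negative z)
    return linear_x, angular_z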
- Programming widget
- Enables people to start from nothing and do something neat in 1+ lines of code
- Inner response loop with globals, so users can implement anything Markovian
- Example code for moving, line following, and bumping (with counter)
publications page of brown-robotics.org
- cron job on unison.cs.brown.edu runs rlab-bibtex2html.bash weekly
- results end up on robotics.cs.brown.edu webspace
- cron job on maiahost grabs results daily from robotics.cs.brown.edu and puts them on the wiki
- called using htmlets extension
- htmlets can also be used for things like embedding rosjs stuff!
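The two cron jobs might look roughly like this; the schedules, paths, and fetch command below are assumptions for illustration, not the actual crontab entries:

```
# On unison.cs.brown.edu -- regenerate the publications HTML weekly:
0 3 * * 0  /path/to/rlab-bibtex2html.bash

# On maiahost -- pull the generated page daily for the wiki's htmlets:
0 4 * * *  wget -q -O /path/to/wiki/pubs.html http://robotics.cs.brown.edu/pubs.html
```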
- Calibrates light intensity sensors
- Follows lines based on readings of either one or two sensors (requires source edit)
- Can either do light-on-dark or dark-on-light
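The calibrate-then-steer logic above can be sketched as follows; the midpoint calibration and the specific steering rule are illustrative assumptions, with `light_on_dark` flipping the comparison as the source-edit option does:

```python
# Sketch of the line follower's decision rule (two-sensor case):
# calibrate a threshold between line and floor readings, then steer
# toward the side whose sensor still sees the line.
def calibrate(line_reading, floor_reading):
    """Pick a threshold halfway between the two calibration readings."""
    return (line_reading + floor_reading) / 2.0

def steer(left, right, threshold, light_on_dark=False):
    def on_line(v):
        # default: dark line on light floor, so "on line" = low intensity
        return v > threshold if light_on_dark else v < threshold
    if on_line(left) and not on_line(right):
        return "turn_left"
    if on_line(right) and not on_line(left):
        return "turn_right"
    return "forward"
```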
irobot_create_2_1 bug fix
The response to high angular and low linear velocity (e.g., 1 and 1e-5) was to stall the robot.
- Fix implemented and published in experimental/irobot_create_2_1 for internal testing.
- Requires a constant for the wheelbase width, but the _exact_ value is not incredibly important.
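A hedged sketch of the fix's core idea: convert the Twist (v, w) into per-wheel speeds with the standard differential-drive equations, so the pathological case (tiny v, large w) spins the robot in place instead of stalling. The wheelbase value below is an approximation, not the constant actually used in the package:

```python
# Standard differential-drive kinematics: each wheel's speed is the
# linear velocity plus/minus the angular contribution across the
# wheelbase. The exact wheelbase only scales the turn rate slightly.
WHEELBASE = 0.26  # meters; approximate for an iRobot Create (assumption)

def wheel_speeds(linear_x, angular_z, base=WHEELBASE):
    left = linear_x - angular_z * base / 2.0
    right = linear_x + angular_z * base / 2.0
    return left, right

# With v = 1e-5, w = 1.0 the wheels counter-rotate (left backward,
# right forward), so the robot turns in place rather than stalling.
```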