Applied Dexterity at ICRA 2014 – Hong Kong

We’ve been very busy at Applied Dexterity in 2014, and we’ll have a lot of news to share with you shortly. To whet your news appetite: we’ll be at ICRA in Hong Kong for the first week of June! We’ll have the RAVEN with us and we’re doing our best to have some cool new toys at our booth, too.

The RAVEN community will be presenting two papers as well. The first is UC Berkeley’s paper Autonomous Multilateral Debridement with the Raven Surgical Robot by Ben Kehoe et al. Ben and his labmates used stereo vision to find and remove simulated dead tissue (debridement). This work presents the automation of a surgically relevant sub-task as a stepping stone towards supervisory control of surgical robots.

Our roboticist, Andrew Lewis, will also be presenting his work on Dynamic Gravity Compensation on the RAVEN. Using an accelerometer on the base of the robot, Andrew implemented a simple and effective method to calculate induced gravity torques on the RAVEN and serial robots in general. The addition of this sensor improves control in any base orientation and any gravitational state.
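For readers curious how a base-mounted accelerometer can feed into gravity compensation, here is a minimal sketch for a planar two-link serial arm. The idea is that the accelerometer measures the gravity vector in the robot's base frame, so the compensation torques stay correct however the base is tilted. The masses, lengths, and sign conventions below are illustrative assumptions, not the RAVEN's actual parameters or Andrew's implementation.

```python
import numpy as np

def gravity_torques(q, g_base, link_masses, link_lengths, com_ratios):
    """Torques needed to cancel gravity on a planar 2-link serial arm.

    q           : joint angles [q1, q2] (rad), counterclockwise positive
    g_base      : gravity vector measured in the base frame (m/s^2),
                  e.g. np.array([0.0, -9.81]) for an upright base
    link_masses : (m1, m2) in kg
    link_lengths: (l1, l2) in m
    com_ratios  : fraction of each link length to its center of mass
    """
    m1, m2 = link_masses
    l1, l2 = link_lengths
    l1c, l2c = com_ratios[0] * l1, com_ratios[1] * l2

    c1, s1 = np.cos(q[0]), np.sin(q[0])
    c12, s12 = np.cos(q[0] + q[1]), np.sin(q[0] + q[1])

    # Jacobians of each link's center-of-mass position w.r.t. the joints
    J1 = np.array([[-l1c * s1, 0.0],
                   [ l1c * c1, 0.0]])
    J2 = np.array([[-l1 * s1 - l2c * s12, -l2c * s12],
                   [ l1 * c1 + l2c * c12,  l2c * c12]])

    # Compensation torque: negative of the generalized gravity force,
    # tau = -(J1^T m1 g + J2^T m2 g)
    return -(J1.T @ (m1 * g_base) + J2.T @ (m2 * g_base))
```

With the arm stretched horizontally and gravity pointing down, this returns positive holding torques; rotate the base so the arm hangs along gravity and the required torques drop to zero, which is the behavior the accelerometer makes possible in any orientation.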

We can’t wait to meet up with the robotics community at large again. And feel free to follow us on Twitter to get the latest AD and RAVEN happenings.

Robotic Hand Developments at Harvard

Professor Robert Howe at Harvard University was recently featured on Engadget for his lab’s low-cost compliant hand. The hand uses a single motor paired with some clever mechanisms, and takes advantage of inexpensive pressure sensors commonly used as barometers in smartphones. By vacuum-forming a rubber skin over the sensors, the team achieves good tactile sensitivity. The photographers made sure to interview Professor Howe with a familiar high-tech robot in the background as well.

http://www.engadget.com/2013/12/02/peripheral-vision-013/

RAVEN’s Big Screen Debut

Ender's Game Cast and Crew

To the delight of all of Applied Dexterity, the RAVEN is making its big-screen debut in the first film adaptation of Orson Scott Card’s sci-fi masterpiece Ender’s Game. The film’s staff got in touch with Blake Hannaford and the BioRobotics Lab at the University of Washington in the spring of 2012 to invite the RAVEN to be used in one of the scenes.

RAVEN on set

Lab members and PhD students Hawkeye King and Lee White spent a week in New Orleans preparing the robot for the scene, which culminated in an all-day shoot with most of the movie’s stars. You can catch their masterful teleoperation skills during a close-up on the RAVEN just under an hour into the movie.

We are huge fans of the story, and we’re proud of the great work done by all of the BioRobotics Lab in making this happen. It truly is a sign that the RAVEN is ready for the future. The film premieres today, November 1st.

Autonomous Block Transfer

UCSC FLS demo

Ji Ma at the University of California Santa Cruz has successfully demonstrated an autonomous, vision-guided block transfer task using the RAVEN research robot. The FLS Block Transfer task is a standard test used to train and test surgeons. Now, it’s being used to train the RAVEN! Ji reports 100% success in block grasping and 94% success in placement. This is a huge step for the RAVEN community and comes hot on the heels of UC Berkeley’s successes with autonomous debridement.

Autonomous Debridement at UC Berkeley

UC Berkeley Autonomous Debridement Scenario

Researchers at the University of California Berkeley have successfully demonstrated autonomous debridement capabilities with their RAVEN robot. This is the first example of a fully autonomous subtask implemented on the RAVEN. In a simplified scenario, the robot is able to detect the locations of dead tissue and remove it to a safe location.

Graduate students, under the supervision of Professors Ken Goldberg and Pieter Abbeel, took advantage of the open source RAVEN architecture to integrate stereo camera hardware and Model Predictive Control software. Together, these systems enable a vision-guided, autonomous application for the robot. One day, this capability could enable surgery in remote locations where communication limitations leave a surgeon with only limited control of the robot. A paper detailing the methods has been submitted to the IEEE for publication in 2014. This work represents a major contribution to the field of robotic surgery and machine learning from a RAVEN team.
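To illustrate the vision side of such a system, here is a minimal sketch of how a rectified stereo camera pair can localize a target (such as a piece of simulated tissue) in 3D by triangulating from the pixel disparity. The camera parameters are hypothetical, and this is a generic textbook construction, not the Berkeley team’s code.

```python
import numpy as np

def stereo_to_3d(uL, vL, uR, f, baseline, cx, cy):
    """Triangulate a 3D point from a rectified stereo pair.

    uL, vL  : pixel coordinates of the target in the left image
    uR      : horizontal pixel coordinate in the right image
    f       : focal length in pixels
    baseline: distance between the two cameras (m)
    cx, cy  : principal point (image center) in pixels

    Uses the standard pinhole relations: disparity d = uL - uR,
    depth Z = f * baseline / d, then back-projects X and Y.
    """
    d = uL - uR                     # disparity in pixels
    Z = f * baseline / d            # depth along the optical axis
    X = (uL - cx) * Z / f
    Y = (vL - cy) * Z / f
    return np.array([X, Y, Z])
```

A target detected at the image center with a 20-pixel disparity, for example, resolves to a point straight ahead of the left camera, with depth inversely proportional to the disparity.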

The Berkeley team has shared these advancements with the RAVEN community by releasing their code, and several teams are actively adapting the software for their own research goals.

Find more information at the team’s RAVEN site.