
Robots fly, swim, fetch, and drive in Johns Hopkins student project demos

May 12, 2016

It took a couple of tries. At one point the robotic helicopter accelerated upward, hit the top of the safety net, then crashed abruptly to the floor.

But later during the robotics demonstration at Johns Hopkins University’s Krieger Hall, the small quadcopter successfully hovered over its target landing pad and settled safely.

A student team controls a quadcopter using virtual-reality headgear and a myoelectric armband. IMAGE: WILL KIRK / HOMEWOOD PHOTOGRAPHY


“If you never crash a robot, you’re probably not pushing the envelope hard enough,” said Louis Whitcomb, the mechanical engineering professor whose students demonstrated their independent robotics projects at labs across the Homewood campus on Wednesday afternoon.

In addition to the quadcopter that found its own target, another was controlled remotely by human motions, via virtual-reality headgear and a myoelectric armband. One arm movement sent the quadcopter into a quick mid-air flip.

Other demos by students in Whitcomb’s graduate-level Robot System Programming course included two small self-driving cars that could independently travel to pre-set destinations, avoiding obstacles like humans and trash cans along the way. Then there were two “turtlebots”—basic personal robots—one programmed to map out and reach a target, the other programmed to pick up and deliver an object by verbal command, such as “Fetch me the water bottle.”

Whitcomb’s students spent the latter half of the semester working in teams to assemble and program their robots. They followed some fairly broad directives: the robots had to be able to perform two tasks, to operate autonomously, and to be equipped with at least two sensors.

Excerpted from The Hub. Read the complete story here.
