Smart Robots: Shape Matters
When designing a robot, the choice of shape can have a tremendous impact on the robot controller. This idea has been pushed to the extreme with the HexBug Nano: it has no sensors and no controller at all! Still, its behavior is remarkably realistic and robust, as shown in Video 1.
Video 1: HexBug Nano Robust Behavior
The Nano has a single motor that makes it vibrate and, as a result, move forward. This kind of locomotion is also used in other robots, such as the Kilobot (from Harvard University), which has two vibration motors to allow steering (see Video 2). The HexBug Nano cannot be steered, but its triangular head makes it change direction when it touches an obstacle. The resulting behavior looks much like the way insects navigate.
Video 2: Introduction to Kilobot
The Nano's back also has a functional shape. This flat pyramid, combined with the motor vibrations, enables the robust locomotion: whenever the robot falls on its side or even on its back, it quickly recovers and stands back on its feet.
The robot's shape is important even when building simple robots in a hobby project. A notable example is the wheeled robots developed by David Anderson from the Dallas Personal Robotics Group (DPRG). In a talk given at the end of 2011 (see Video 3), David describes all facets of robot design, from the physical shape up to the control software. More specifically, he discusses the importance of the robot's geometry and its impact on simplifying the controller.
Video 3: David P. Anderson Talk on Robot Design
David's talk includes plenty of interesting explanations and tips that every roboticist should be aware of. To ease information retrieval from this 2+ hour long video, we provide below a timeline as well as some notes.
At the beginning of the video, David Anderson points to his web page dedicated to robots: http://www.geology.smu.edu/~dpa-www/myrobots.html. This is actually the root of a series of pages with plenty of material on his different robots and their control software. Some are a good complement to the video; we link them in the outline below. Other valuable information is available on the pages related to the DPRG Outdoor Challenges.
00:00 Robot Geometry
-The robot should be able to turn in place = disc shape + 2 driving wheels + 1 rear free wheel
-The rear wheel's contact with the ground should be on the disc perimeter when the robot turns on itself
-The center of gravity should be as close as possible to the driving wheel axis, toward the rear = better robot stability + maximized friction at the driving wheels so the robot can climb small obstacles
00:13 Driving the robot and subsumption [http://www.geology.smu.edu/~dpa-www/robo/subsumption/]
-Command = forward speed + rotation speed
–Derive the individual wheel speeds
—Left = Forward – Rotation
—Right = Forward + Rotation
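The speed mixing above can be sketched in a few lines of Python (a minimal illustration; the function name is ours, not from the talk):

```python
def mix(forward, rotation):
    """Convert a (forward speed, rotation speed) command into
    individual (left, right) wheel speeds."""
    left = forward - rotation
    right = forward + rotation
    return left, right

# Pure forward motion: both wheels at the same speed.
# Pure rotation: wheels at opposite speeds, so the robot turns in place.
```

With a zero rotation command both wheels get the same speed and the robot drives straight; with a zero forward command the wheels spin in opposite directions and the disc-shaped robot turns in place, as the geometry section requires.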
00:23 PID [http://www.geology.smu.edu/dpa-www/robots/doc/speedctl.txt]
-PID to control the motors accurately and compensate for differences in friction or small obstacles
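As a sketch of the idea, here is a minimal textbook-style PID speed controller in Python (the class name and gains are illustrative and not taken from David's code):

```python
class PID:
    """Minimal PID controller: drives a motor toward a target speed
    despite disturbances such as friction differences or small obstacles."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, target_speed, measured_speed, dt):
        """Return a motor command correction for one control cycle."""
        error = target_speed - measured_speed
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```

One such controller per wheel, fed with encoder speed measurements, keeps both wheels at their commanded speeds so the robot actually follows the (forward, rotation) command.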
-Bumpers should be considered more as a touch sense to feel the environment
–As opposed to simply signaling that something is wrong
-They should drive the highest-priority behavior
-When the left bumper is active: go back a little, then turn right
-Symmetrically, when the right bumper is active: go back a little, then turn left
-The default rotation angle on bumping is about 10 degrees
–To avoid getting trapped: the angle should be slowly reduced to try to find a path
–After a while (using the leaky integrator described below), the robot should make a large rotation (more than 90 degrees) and try to escape
-Leaky integrator (other uses at 00:55)
–A counter that is decremented by 1 every cycle if its value is greater than 0
–Each time a bumper is activated, the counter is incremented by some delta value (e.g. 20)
–Once the total goes beyond some threshold, the behavior is changed
–Use with bumpers: if the robot keeps oscillating between left and right for some time, we can consider it trapped by a symmetrically shaped obstacle (chair, corner)
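The leaky integrator described above can be sketched as follows (the class name and the delta/threshold values are illustrative):

```python
class LeakyIntegrator:
    """Counter that leaks by 1 every cycle and is bumped by `delta`
    on each event; crossing `threshold` triggers an escape behavior."""

    def __init__(self, delta=20, threshold=50):
        self.value = 0
        self.delta = delta
        self.threshold = threshold

    def cycle(self, bumped=False):
        """Run one control cycle; return True when the escape
        behavior should fire (e.g. a large rotation)."""
        if self.value > 0:
            self.value -= 1          # leak
        if bumped:
            self.value += self.delta # bumper event
        return self.value >= self.threshold
```

Sparse bumps leak away before the threshold is reached, while the rapid left/right oscillation of a trapped robot accumulates quickly and triggers the large escape rotation.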
01:11 Simple light-following behavior combined with bumpers.
–Uses light sensors
01:16 IR-based obstacle avoidance
01:28 Position calculation [http://geology.heroy.smu.edu/~dpa-www/robo/Encoder/imu_odo/]
-2 virtual location sensors =
–2 points located in space in front of the robot, one on the left and the other on the right
–A function answers true if a given point is inside a given box
–Depending on whether the left or the right virtual sensor is inside a given box, we can decide to turn right or left, as with the IR sensors
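A minimal Python sketch of these virtual sensors, assuming axis-aligned boxes; the function names are ours:

```python
def point_in_box(px, py, box):
    """Return True if point (px, py) lies inside the axis-aligned
    box given as (xmin, ymin, xmax, ymax)."""
    xmin, ymin, xmax, ymax = box
    return xmin <= px <= xmax and ymin <= py <= ymax

def steer(left_pt, right_pt, box):
    """Decide a turn from the two virtual sensor points: if the left
    point falls inside an obstacle box, turn right, and vice versa."""
    if point_in_box(*left_pt, box):
        return "turn_right"
    if point_in_box(*right_pt, box):
        return "turn_left"
    return "straight"
```

Because the two points are computed from the robot's estimated position, this reuses the IR avoidance logic unchanged: an obstacle "detected" by a virtual point triggers the same turn-away reaction as a real sensor reading.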
01:37 Navigation to a target point or through a list of waypoints
01:57 Sonars [http://www.geology.smu.edu/dpa-www/robots/doc/stsonar.txt]
-Grippers fixed on the bumpers to “inherit” bumping behavior
02:09 Acting when reaching waypoints
02:14 Tracking objects to grab, aligning the robot and slowing down