Researchers at the University of Sheffield have fitted a robotic rat with a monkey brain model in a bid to improve the machine’s perception.
Once the monkey brain model was fitted into the robotic rat, which the researchers had developed previously, the machine was able to feel different textured surfaces, such as rough and smooth carpets, as it scuttled across them with its rat-like whiskers. It also made better decisions with its whiskers than any previously tested method.
Experts hope the machine, nicknamed ‘Roomba’, will help advance robot sensing so that the whisker technology can be deployed in disaster zones to help guide rescuers and save lives.
Dr Nathan Lepora of the University’s Department of Psychology said: “Animals far surpass present-day robots in their perceptual abilities. By using methods based on how brains perceive the world, we aim to develop methods for robot perception that would allow them to interact with the world in a more successful way.
“The whisker sensors used in this study have applications to robot sensing in the sort of environments that rats thrive in, such as dark, enclosed spaces as might be found in pipes or disaster zones. This is a general aim of the BIOTACT grant led by Professor Tony Prescott at the University of Sheffield.”
This research suggests that rats, which are masters at sensing with their whiskers, may use brain mechanisms similar to those of monkeys when recognising objects, and that the decision-making processes underlying perception may be common to all mammals, from rodents to humans.
Dr Lepora added: “This particular study is aimed principally at testing biological hypotheses in robots, in particular the theories of decision making developed from recording from the visual cortex of monkeys.
“In terms of the perceptual abilities of whiskered robots, we are currently aiming to generalize the approach from this paper over other touch sensations rather than just texture, such as object shape and position relative to the rat/robot.”
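The decision-making theories Dr Lepora refers to, developed from recordings in monkey visual cortex, are typically sequential evidence-accumulation models: sensory evidence is summed observation by observation until a confidence threshold triggers a choice. The sketch below illustrates the general idea for a rough-versus-smooth texture judgement; the event probabilities, threshold, and surface classes are invented for illustration and are not the study’s actual model.

```python
import math
import random

def classify_texture(observations, p_rough=0.7, p_smooth=0.3, threshold=3.0):
    """Accumulate log-likelihood evidence for 'rough' vs 'smooth' until the
    running total crosses a confidence threshold, then commit to a decision.

    Each observation is 1 (a whisker 'stick-slip' event, assumed more likely
    on rough surfaces) or 0 (no event, assumed more likely on smooth ones).
    Returns the decision and the number of observations it took.
    """
    evidence = 0.0
    for n, obs in enumerate(observations, start=1):
        if obs:
            evidence += math.log(p_rough / p_smooth)
        else:
            evidence += math.log((1 - p_rough) / (1 - p_smooth))
        if evidence >= threshold:
            return "rough", n
        if evidence <= -threshold:
            return "smooth", n
    return "undecided", len(observations)

if __name__ == "__main__":
    random.seed(0)
    # Simulate whisker contacts on a rough surface: events occur with prob 0.7.
    obs = [1 if random.random() < 0.7 else 0 for _ in range(100)]
    decision, samples_used = classify_texture(obs)
    print(decision, samples_used)
```

Raising the threshold makes the decision slower but more reliable; lowering it trades accuracy for speed, the same speed-accuracy trade-off seen in animal behaviour.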
‘Roomba’ is just one of a series of robots that use ‘active touch’ rather than vision to navigate their environment, developed by the University in partnership with other institutions as part of the BIOTACT project.
The research has been published in the Journal of the Royal Society Interface (25 January 2012).