Collective Construction for Autonomous Agents
From 2015 to 2019, our NIH-funded project on autonomous collectives investigated the mound-building behaviour of two species of Macrotermes. These ingenious insects construct elaborate temperature- and humidity-regulated mounds in order to cultivate a delicate fungus that serves as their primary food source. From them, we hope to derive algorithms for autonomous construction that will allow robots to build elaborate, robust, multi-functional structures in hostile environments such as disaster zones or outer space. To support this research, we developed novel multimodal vision systems and sensor arrays that were lightweight, robust, and could be deployed effectively in the remote deserts of Namibia. As a capstone, we created a biomimetic humidity-reactive robot, which will allow us to test algorithms on a small, low-cost swarm of robots that sense humidity and attempt to regulate their environment via a stigmergic process.
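To give a flavour of the kind of control rule such a swarm might follow (this is an illustrative sketch, not the robot's actual firmware), each agent can act on purely local humidity readings: add material when the environment is too dry, open it up when it is too humid, and otherwise do nothing, so that coordination emerges only through the shared environment. The function names, thresholds, and action effects below are hypothetical placeholders.

```python
# Illustrative sketch of a stigmergic humidity-regulation rule for one swarm agent.
# read_humidity, deposit_material, and remove_material are hypothetical stand-ins,
# not the actual robot API; the direction of each action is an assumption.
import random


TARGET_HUMIDITY = 0.60   # desired relative humidity (fraction)
DEADBAND = 0.05          # tolerance before the agent acts


def read_humidity() -> float:
    """Stand-in for the robot's onboard humidity sensor."""
    return random.uniform(0.4, 0.8)


def deposit_material() -> None:
    """Stand-in action: closing up the local structure, assumed to retain moisture."""
    print("depositing material to raise local humidity")


def remove_material() -> None:
    """Stand-in action: opening the local structure, assumed to vent moisture."""
    print("removing material to lower local humidity")


def step() -> None:
    """One sense-act cycle. Agents never communicate directly: each robot reacts
    only to the local humidity left behind by the combined activity of the swarm
    (stigmergy)."""
    h = read_humidity()
    if h < TARGET_HUMIDITY - DEADBAND:
        deposit_material()
    elif h > TARGET_HUMIDITY + DEADBAND:
        remove_material()
    # otherwise the environment is within tolerance; do nothing


if __name__ == "__main__":
    for _ in range(5):
        step()
```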
Responsible Social Robots
How do we integrate robots into society in ways that support, rather than disrupt, our most vulnerable populations? In particular, the proliferation of social robots in care services is raising questions about ethics, privacy, and the nature of future human societies. Together with researchers in Australia and the UK, I am exploring the roles that robots should, and even more critically, should not play in care delivery. Although there is a burgeoning literature on robots in social and care settings, most of this commentary and evidence revolves around their technical efficacy, their acceptability to consumers, or the legal ramifications of such innovations. Yet there remains a serious lack of attention within public policy and public management to the actual implementation of robots in care settings. See my Scientific American blog post for some additional thoughts: Can robots tighten the bolts on ...
Insect-Inspired Vision
In 2010, my colleagues and I at the Centre of Excellence in Cognitive Interaction Technology embarked on a project to develop low-cost polarisation-sensitive vision systems for UAVs, enabling them to navigate using polarised light in the natural world, in the same way that locusts, bees, and other insects do. This led to fruitful ongoing collaborations: at present, Martin Howe at the University of Bristol, Keram Pfeiffer at the University of Würzburg, and I are working on cross-platform integrated vision software that will enable researchers to quickly and accurately reconstruct and analyse polarised images (canopy_movie.mp4). I am also an advisor on the related DFG-funded project HO 950/25-1 (Sky compass signaling of central-complex neurons in locusts exposed to the natural sky).
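To indicate what "reconstructing" a polarised image typically involves, here is a minimal sketch using standard Stokes-parameter arithmetic, assuming four intensity images captured through linear polarisers at 0°, 45°, 90°, and 135°. This illustrates the general technique only; it is not the software described above, and the function name and example data are hypothetical.

```python
# Minimal sketch: degree and angle of linear polarisation from four
# polariser-orientation images (0°, 45°, 90°, 135°), via Stokes parameters.
import numpy as np


def reconstruct_polarisation(i0, i45, i90, i135):
    """Return (degree of linear polarisation, angle of polarisation in radians).

    Each input is a 2-D array of pixel intensities recorded through a linear
    polariser at the stated orientation.
    """
    s0 = 0.5 * (i0 + i45 + i90 + i135)   # total intensity
    s1 = i0 - i90                        # horizontal minus vertical component
    s2 = i45 - i135                      # difference of diagonal components
    dolp = np.sqrt(s1**2 + s2**2) / np.maximum(s0, 1e-12)
    aolp = 0.5 * np.arctan2(s2, s1)
    return dolp, aolp


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    frames = [rng.uniform(0.0, 1.0, size=(4, 4)) for _ in range(4)]
    dolp, aolp = reconstruct_polarisation(*frames)
    print(dolp.shape, aolp.shape)
```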