Scientists use VR with zebrafish to teach robots how to swarm

Fish are masters of coordinated movement. Their schools have no leader, yet each individual manages to hold its place in the formation, avoid collisions, and respond fluidly to changes in the environment. Replicating this combination of resilience and flexibility has been a long-standing challenge for engineered systems such as robots. Now, using a virtual reality environment that lets zebrafish swim freely alongside virtual “holograms” of their own kind, a research team from Konstanz has taken a significant step toward a solution.
“Our work demonstrates that solutions that have evolved in nature over thousands of years have the potential to guide robust and efficient control laws in engineered systems,” says Yi Li of the University of Konstanz, first author of the study. His co-author, Matta Ngi of the University of Ottwasser, adds: “The discovery opens up exciting possibilities for future applications in robotics and autonomous vehicle design.”
Decoding nature's hidden algorithm
In a 3D VR system that simulates natural group swimming, young zebrafish were placed in separate but interconnected arenas, where each fish could freely interact with “holograms” of virtual fish. Each virtual fish was a live projection of a real fish swimming in a different arena, so that, although physically apart, all the fish appeared to share the same world. The setup allowed precise control of the visual stimuli and precise recording of the fish’s responses.
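The article describes the setup only at a high level, so the sketch below is a rough illustration of that “shared world” idea, with each tracked fish mirrored into every other arena as a virtual neighbor. The class and function names are invented for illustration and are not taken from the study’s software.

```python
from dataclasses import dataclass

@dataclass
class TrackedFish:
    arena_id: int   # which physical arena this real fish swims in
    x: float        # tracked position, expressed in a shared world frame
    y: float

def holograms_for_arena(arena_id, tracked_fish):
    """Return the fish to project as holograms inside `arena_id`:
    every real fish from the *other* arenas, so that all fish appear
    to share one world even though they never physically meet."""
    return [f for f in tracked_fish if f.arena_id != arena_id]

# Two fish in separate arenas: each one sees the other as a virtual neighbor.
fish = [TrackedFish(arena_id=0, x=0.12, y=0.30),
        TrackedFish(arena_id=1, x=0.45, y=0.10)]
print(holograms_for_arena(0, fish))  # -> the fish tracked in arena 1
```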
This high level of control allowed the researchers to isolate exactly which visual cues the fish use to guide their behavior within the school. What they found was a simple, robust control law that relies solely on the perceived positions of neighbors, not on their speeds, to steer each fish’s movement within the group.
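The article states the law only in words (positions, not speeds), so the following is a minimal sketch of one plausible position-only rule, not the published one; the repulsion radius, turning gain, and function name are illustrative assumptions.

```python
import numpy as np

def position_only_turn(self_pos, self_heading, neighbor_positions,
                       repulsion_radius=1.0, turn_gain=0.5):
    """Illustrative position-only schooling rule (not the published law):
    turn away from neighbors closer than `repulsion_radius`, otherwise turn
    toward the average direction of the remaining neighbors. Only neighbor
    positions are used -- no speeds or headings."""
    attract, repel = [], []
    for p in neighbor_positions:
        offset = np.asarray(p, dtype=float) - self_pos
        dist = np.linalg.norm(offset)
        if dist < 1e-9:
            continue
        (repel if dist < repulsion_radius else attract).append(offset / dist)

    if repel:                      # collision avoidance takes priority
        desired = -np.mean(repel, axis=0)
    elif attract:                  # otherwise move toward the group
        desired = np.mean(attract, axis=0)
    else:
        return 0.0                 # no visible neighbors: keep heading

    desired_angle = np.arctan2(desired[1], desired[0])
    # signed turn toward the desired direction, wrapped to [-pi, pi]
    error = np.arctan2(np.sin(desired_angle - self_heading),
                       np.cos(desired_angle - self_heading))
    return turn_gain * error

# Example: a single neighbor up and to the right produces a left turn.
print(position_only_turn(np.array([0.0, 0.0]), 0.0, [np.array([2.0, 2.0])]))
```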
“We were surprised by how little information the fish need to coordinate their movements in a school,” says Iain Couzin, senior author and director of the Max Planck Institute of Animal Behavior (MPI-AB). “They use local rules that have minimal cognitive demands, but are highly functional.”
Underwater Turing test
To verify that this law indeed reflects natural behavior, the researchers ran an underwater version of the Turing test: a real fish swam alongside a virtual companion that alternated between replaying the movements of a real fish and being driven by the algorithm. The focal fish could not tell the two apart and behaved the same way in both cases.
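The article does not detail the protocol or the statistics behind this test, so the sketch below only illustrates the alternation idea; the callables are placeholders for the real tracking and projection system, not the study’s API.

```python
import random

def turing_trial(focal_step, replay_next, model_next,
                 n_blocks=10, block_steps=500):
    """Alternate the virtual companion between two drivers -- a replay of a
    real fish and the algorithmic model -- in randomized blocks, recording
    the focal fish's response under each condition for later comparison."""
    responses = {"replay": [], "model": []}
    order = ["replay", "model"] * (n_blocks // 2)
    random.shuffle(order)                        # randomize block order
    for condition in order:
        neighbor_next = replay_next if condition == "replay" else model_next
        for _ in range(block_steps):
            # project the companion's next position, record the fish's reaction
            responses[condition].append(focal_step(neighbor_next()))
    return responses  # similar distributions = the fish cannot tell them apart
```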
From fish to machines
The next step was to embed the law in swarms of robotic vehicles, including drones and boats. Their task was to track a moving target, steered either by the biomimetic algorithm or by Model Predictive Control (MPC), the approach currently common in autonomous vehicles. In all tests, the fish-derived control law performed almost identically to MPC in accuracy and energy consumption, but at a far lower computational cost.
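Neither controller’s details are given in the article, so the sketch below only illustrates the contrast in computational cost: a position-only tracking rule in the spirit of the fish law reduces to a few arithmetic operations per step, whereas MPC re-solves an optimization over a prediction horizon at every step (not reproduced here). All parameters and names are illustrative.

```python
import numpy as np

def track_step(robot_pos, robot_heading, target_pos,
               speed=0.1, turn_gain=0.6, dt=1.0):
    """One step of an illustrative position-only tracker (not the published
    controller): turn toward the target's current position, with no velocity
    estimate and no per-step optimization."""
    offset = target_pos - robot_pos
    desired = np.arctan2(offset[1], offset[0])
    # signed heading error, wrapped to [-pi, pi]
    error = np.arctan2(np.sin(desired - robot_heading),
                       np.cos(desired - robot_heading))
    heading = robot_heading + turn_gain * error * dt
    pos = robot_pos + speed * dt * np.array([np.cos(heading), np.sin(heading)])
    return pos, heading

# Tiny demo: the robot chases a target that drifts slowly to the right.
pos, heading = np.array([0.0, 0.0]), 0.0
for step in range(100):
    target = np.array([0.02 * step, 1.0])
    pos, heading = track_step(pos, heading, target)
print(np.round(pos, 2))  # the robot stays close to the moving target
```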
“The study highlights the interplay between biology and robotics: using robots to investigate biological mechanisms, which in turn inspire new and efficient robotic controllers,” concludes Prof. Oliver Deussen, also from the University of Konstanz.