Research by the University of Konstanz and ELTE Helps Uncover How Fish Swim in Schools

A study led by researchers from the University of Konstanz’s Centre for the Advanced Study of Collective Behaviour and the Max Planck Institute of Animal Behavior (MPI-AB)—with contributions from Eötvös Loránd University (ELTE) and the Massachusetts Institute of Technology (MIT)—is helping to decipher how fish move in schools.

According to a Monday statement from ELTE, the team of biologists and robotics engineers, whose findings were published in Science Robotics, developed a virtual reality system for fish that provides insights into how they coordinate their group movements.

The researchers discovered a natural "control law" that zebrafish use to synchronize their behavior.

Since such collective behavior can have technological benefits—such as in autonomous vehicle navigation—the team tested the algorithm’s effectiveness on robot cars, drones, and aquatic vehicles, concluding that the interaction rules used by fish could offer promising solutions for controlling future robot fleets.

Fish are masters of coordinated motion. Even without a leader, individuals in a school manage to maintain formation, avoid collisions, and respond flexibly to environmental changes. Replicating this robustness and adaptability artificially has long challenged engineers, the summary explained.

The researchers took a new step toward solving this problem by using virtual reality in experiments with freely swimming fish.

“Our work shows that solutions evolved by nature over millions of years can inspire robust and efficient control rules for engineered systems,” said Liang Li, the study’s lead author and researcher at the University of Konstanz.

“Our discovery opens exciting new possibilities for the future design of robotics and autonomous vehicles,” added Máté Nagy, co-author and researcher at ELTE.

The researchers used a virtual reality system that mimics natural schooling behavior. Juvenile zebrafish were placed in connected tanks where they could freely interact with virtual avatars of other fish.

Each virtual fish was a hologram-like projection of a real fish, perceived by the animals as if it were swimming in the same space. This fully immersive 3D environment allowed the researchers to control visual stimuli and observe how the fish responded.

This level of control revealed which visual cues guide fish behavior, shedding light on what underlies schooling and how fish solve the complex problem of coordinating their movement. The solution turned out to be a simple rule based solely on the perceived positions of nearby individuals, not their speed.
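To make the idea concrete, here is a minimal sketch of a position-only coordination rule: each agent steers toward the perceived positions of its neighbors and ignores their velocities entirely. This is an illustrative assumption, not the exact control law reported in the Science Robotics paper; the function and parameter names are hypothetical.

```python
import numpy as np

def position_only_step(own_pos, neighbor_positions, speed=1.0, dt=0.1):
    """Illustrative rule: steer toward the centroid of neighbors'
    perceived positions, using no velocity information at all.
    (A sketch of the position-only idea, not the study's actual law.)"""
    if len(neighbor_positions) == 0:
        return own_pos  # no neighbors: hold position
    centroid = np.mean(neighbor_positions, axis=0)
    direction = centroid - own_pos
    norm = np.linalg.norm(direction)
    if norm < 1e-9:
        return own_pos  # already at the centroid
    return own_pos + speed * dt * direction / norm

# Example: three agents converge while each reacts only to the others' positions
positions = np.array([[0.0, 0.0], [2.0, 0.5], [1.0, 2.0]])
for _ in range(50):
    positions = np.array([
        position_only_step(p, np.delete(positions, i, axis=0))
        for i, p in enumerate(positions)
    ])
print(positions)  # the three agents end up clustered together
```

Even this stripped-down version produces cohesive motion, which is the intuition behind testing how far a purely position-based rule can go.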

To test the rule’s realism, the researchers conducted a VR Turing Test—a method used to determine whether an artificial entity can mimic real behavior convincingly.

In this special test, a real fish swam with a virtual fish. Sometimes the virtual fish mimicked real behavior, while other times it was guided by the newly discovered algorithm. The real fish showed no difference in its behavior, reacting to the algorithm-driven avatar the same way it would to a real companion.

The researchers then implemented the discovered rule in robot cars, drones, and boats. The robots were tasked with following a moving target using either the zebrafish-inspired algorithm or a commonly used model predictive control (MPC) system, as illustrated in the sketch below.
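As a rough illustration of that target-following task, the sketch below has a robot chase a moving target using only the target's current perceived position, with no velocity estimate and no predictive optimization of the kind MPC performs. The controller, gains, and trajectory here are assumptions for illustration, not the setup used in the study.

```python
import numpy as np

def follow_step(robot_pos, target_pos, max_speed=0.5, dt=0.1):
    """Move toward the target's current position, capped at the
    robot's speed limit. Position-only: no target velocity is used."""
    offset = target_pos - robot_pos
    dist = np.linalg.norm(offset)
    if dist < 1e-9:
        return robot_pos
    step = min(max_speed * dt, dist)  # do not overshoot the target
    return robot_pos + step * offset / dist

robot = np.array([0.0, 0.0])
for t in np.arange(0.0, 10.0, 0.1):
    target = np.array([np.cos(t), np.sin(t)])  # target moves on a circle
    robot = follow_step(robot, target)
print(robot)  # the robot ends up trailing the moving target
```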

The natural "control law" matched MPC in accuracy and energy efficiency across nearly all tests while being far simpler to run, the study concluded.

(MTI)
