New Technology: A Virtual Rodent

Humans and animals move with an agility that no robot has been able to match. To help researchers investigate the enigma of how brains control movement, Harvard neuroscientists constructed a virtual rat with an artificial brain that can walk like a genuine rodent.

Bence Ölveczky, a professor in the Department of Organismic and Evolutionary Biology, led a team of academics who partnered with scientists from Google’s DeepMind AI division to build a biomechanically realistic digital model of a rat. Using high-resolution data from real rats, they trained an artificial neural network (the virtual rat’s “brain”) to operate the virtual body in MuJoCo, a physics simulator with gravity and other forces.
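The article does not include code, but as a rough illustration of what “operating a virtual body in MuJoCo” involves, the sketch below uses MuJoCo’s open-source Python bindings to load a deliberately simplified body, apply forces chosen by a placeholder controller, and step the physics forward. The XML model and the controller are stand-ins for illustration only, not the biomechanically detailed rat or the trained network described in the study.

```python
import mujoco
import numpy as np

# A deliberately tiny stand-in body: one capsule "limb" on a hinge joint,
# simulated under gravity. The study's virtual rat is far more detailed.
XML = """
<mujoco>
  <option gravity="0 0 -9.81"/>
  <worldbody>
    <body name="limb" pos="0 0 1">
      <joint name="hinge" type="hinge" axis="0 1 0"/>
      <geom type="capsule" size="0.05" fromto="0 0 0 0.5 0 0"/>
    </body>
  </worldbody>
  <actuator>
    <motor joint="hinge" gear="1"/>
  </actuator>
</mujoco>
"""

model = mujoco.MjModel.from_xml_string(XML)
data = mujoco.MjData(model)

def placeholder_controller(qpos: np.ndarray, qvel: np.ndarray) -> np.ndarray:
    """Stand-in for the trained neural network: a simple PD-style rule."""
    return -1.0 * qpos - 0.1 * qvel

for _ in range(1000):
    # The controller reads the joint state and outputs actuator commands,
    # which MuJoCo turns into forces when the physics is stepped.
    data.ctrl[:] = placeholder_controller(data.qpos, data.qvel)
    mujoco.mj_step(model, data)  # advance the physics by one timestep

print("final joint angle (rad):", float(data.qpos[0]))
```

In the study, the role of `placeholder_controller` is played by the trained artificial neural network, and the XML body is replaced by the detailed rat model built from real anatomical and movement data.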

The researchers reported in Nature that activations in the virtual control network accurately predicted neural activity measured in the brains of real rats producing the same behaviors. Ölveczky, an expert in training (real) rats to learn complex behaviors in order to study their neural circuitry, said the achievement offers a new approach to researching how the brain governs movement. It leverages developments in deep reinforcement learning, AI, and 3D movement tracking in freely behaving animals.
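The article does not say how the network’s activations were compared to recorded neural activity. One common way to frame such a comparison, shown below purely as an assumed illustration with random placeholder data, is a cross-validated linear encoding model that asks how well a real neuron’s firing can be predicted from the artificial network’s hidden-unit activations.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

# Placeholder arrays (randomly generated here): network_activations would be
# (timesteps x hidden units) recorded from the virtual rat's control network,
# and neural_rates the firing of one real neuron during the same behavior.
rng = np.random.default_rng(0)
network_activations = rng.normal(size=(5000, 256))
neural_rates = network_activations @ rng.normal(size=256) + rng.normal(size=5000)

# Cross-validated linear encoding model: how much of the real neuron's
# activity can be predicted from the artificial network's activations?
scores = cross_val_score(Ridge(alpha=1.0), network_activations, neural_rates,
                         scoring="r2", cv=5)
print(f"mean cross-validated R^2: {scores.mean():.2f}")
```

A high cross-validated score for many recorded neurons would indicate that the virtual network organizes its activity in a way that mirrors the real brain; the specific analysis used in the published study may differ from this sketch.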

The collaboration was “fantastic,” Ölveczky added. “DeepMind had created a pipeline for training biomechanical agents to move through complex situations. We just did not have the means to run those simulations and train these networks.”

Working with Harvard researchers was also “a really exciting opportunity for us,” said co-author and Google DeepMind Senior Director of Research Matthew Botvinick. “We’ve learned a lot from the challenge of developing embodied agents: AI systems that must not only think intelligently, but also translate that thinking into physical action in a complex environment.” Applying this strategy in a neuroscience setting, the team reasoned, could provide insights into both behavior and brain function.

Graduate student Diego Aldarondo collaborated closely with DeepMind researchers to teach the artificial neural network to use inverse dynamics models, which experts believe our brains use to control movement. When we reach for a cup of coffee, our brain rapidly calculates the path our arm should take and converts it into motor commands. Similarly, using data from actual rats, the network was fed a reference trajectory of the desired movement and trained to generate the forces required to achieve it. This enabled the virtual rat to emulate a wide range of behaviors, including some it had not been explicitly trained on.
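As a conceptual sketch of that training setup, the code below defines a tracking policy that takes the body’s current state plus upcoming reference poses and outputs joint forces, together with a reward term that scores how closely the resulting movement follows the reference; a reinforcement learning algorithm would then tune the network to maximize that reward over whole rollouts. The architecture, dimensions, and reward shape here are assumptions for illustration, not the published pipeline.

```python
import torch
import torch.nn as nn

class TrackingPolicy(nn.Module):
    """Maps proprioceptive state + upcoming reference poses to joint forces.

    The layer sizes and architecture are illustrative assumptions,
    not the network used in the published study.
    """
    def __init__(self, state_dim: int, ref_dim: int, action_dim: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim + ref_dim, 256),
            nn.Tanh(),
            nn.Linear(256, 256),
            nn.Tanh(),
            nn.Linear(256, action_dim),
        )

    def forward(self, state: torch.Tensor, reference: torch.Tensor) -> torch.Tensor:
        # Concatenate what the body currently feels with where it should go next.
        return self.net(torch.cat([state, reference], dim=-1))


def tracking_reward(pose: torch.Tensor, reference_pose: torch.Tensor) -> torch.Tensor:
    """Reward staying close to the reference trajectory.

    A simple exponentiated negative squared error; reinforcement learning
    adjusts the policy to maximize this reward across a rollout.
    """
    return torch.exp(-((pose - reference_pose) ** 2).sum(dim=-1))


if __name__ == "__main__":
    # Arbitrary example dimensions, chosen only to show the interface.
    policy = TrackingPolicy(state_dim=74, ref_dim=30, action_dim=38)
    forces = policy(torch.zeros(1, 74), torch.zeros(1, 30))
    print(forces.shape)  # torch.Size([1, 38])
```

Because the policy is rewarded for matching recorded rat movements rather than memorizing them, it can generalize and reproduce behaviors beyond those it was explicitly trained on, which is the property the paragraph above describes.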

These simulations may pave the way for an untapped area of virtual neuroscience in which AI-simulated animals trained to behave like real ones provide simple and completely transparent models for investigating brain networks, including how such circuits are damaged in disease. Ölveczky’s lab focuses on understanding the brain, but the platform might also be utilized to improve robotic control systems.

A further stage could be to give the virtual animal autonomy to complete tasks similar to those faced by real rats. “From our experiments, we have a lot of ideas about how such tasks are solved, and how the learning algorithms that underlie the acquisition of skilled behaviors are implemented,” the researcher stated. “We would like to start employing the virtual rats to test these ideas and help advance our understanding of how real brains generate complex behavior.”

Reference – Researchers create realistic virtual rodent
