
Researchers at Stanford Introduce CORNN: A Machine Learning Method for Real-Time Analysis of Large-Scale Neural Recordings

Nov 11, 2023

Technological advances have ushered in a new era in the rapidly evolving field of neuroscience, making it possible to probe the intricate relationship between brain function and behavior in living organisms more deeply than ever before. A central concern of this research is the connection between neuronal dynamics and computational function: scientists use large-scale neural recordings, acquired through optical imaging or electrophysiology, to understand the computational structure of neuronal population dynamics.

Recent developments across recording modalities have greatly expanded the number of cells that can be recorded and manipulated at once. As a result, there is a growing need for theoretical and computational tools that can efficiently analyze the enormous datasets these techniques produce. Manually constructed network models served well when recordings covered single cells or small groups of cells, but they struggle with the massive datasets generated in modern neuroscience.

To derive computational principles from these large datasets, researchers have proposed training data-constrained recurrent neural networks (dRNNs), models whose units are fit directly to the recorded neural activity. Ideally, such training would run in real time, allowing medical applications and research workflows to model and steer interventions at single-cell resolution and to influence specific types of animal behavior. Current dRNN training methods, however, scale poorly and run inefficiently; even in offline settings, this limitation impedes the analysis of large-scale neural recordings.
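As a concrete illustration, the sketch below simulates a small rate-based dRNN. The discrete-time update rule, the variable names, and all parameter values here are our own assumptions for illustration; the article does not specify the model equations.

```python
import numpy as np

# Assumed dRNN form (illustrative, not from the article): each model unit
# is pinned one-to-one to a recorded neuron, and rates evolve as
#   r[t+1] = (1 - alpha) * r[t] + alpha * tanh(W @ r[t] + B @ u[t])
def simulate_drnn(W, B, u, r0, alpha=0.1):
    """Roll out the rate dynamics for inputs u of shape (T, n_in)."""
    T = u.shape[0]
    r = np.zeros((T + 1, W.shape[0]))
    r[0] = r0
    for t in range(T):
        r[t + 1] = (1 - alpha) * r[t] + alpha * np.tanh(W @ r[t] + B @ u[t])
    return r

# Toy usage: 50 units driven by 3 input channels of white noise.
rng = np.random.default_rng(1)
N, n_in, T = 50, 3, 200
W = rng.standard_normal((N, N)) / np.sqrt(N)   # recurrent weights to be fit
B = rng.standard_normal((N, n_in))             # input weights
rates = simulate_drnn(W, B, rng.standard_normal((T, n_in)), np.zeros(N))
print(rates.shape)  # (201, 50)
```

Training a dRNN then means choosing W (and B) so that the rollout reproduces the recorded activity, and it is the cost of doing this with conventional optimization that becomes the bottleneck.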

To overcome these challenges, the team has introduced a training technique called Convex Optimization of Recurrent Neural Networks (CORNN). By recasting training as a convex problem and eliminating the inefficiencies of conventional optimization techniques, CORNN improves both training speed and scalability. In studies on simulated recordings, it trains roughly 100 times faster than conventional optimization techniques while maintaining, and in some cases improving, modeling accuracy.
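The article does not detail CORNN's formulation, but the name points to the core trick: when every unit's activity is observed, inverting the nonlinearity turns the fit of each row of the weight matrix into an independent convex regression problem. The sketch below illustrates that general idea with a closed-form ridge regression on synthetic rates; the actual CORNN loss and solver differ in detail, and all names and values here are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
T, N, alpha, lam = 2000, 100, 0.1, 1e-3
r = np.tanh(rng.standard_normal((T, N)))   # stand-in for recorded rates

# From the assumed dynamics r[t+1] = (1-alpha) r[t] + alpha tanh(W r[t]),
# solve for the pre-nonlinearity target and regress it linearly on r[t].
X = r[:-1]
g = (r[1:] - (1 - alpha) * r[:-1]) / alpha
Z = np.arctanh(np.clip(g, -0.999, 0.999))

# Each column of Z defines one neuron's convex fit; a single matrix
# factorization is shared across all N problems, which is what makes the
# per-unit decomposition fast.
W_hat = np.linalg.solve(X.T @ X + lam * np.eye(N), X.T @ Z).T

print("relative residual:", np.linalg.norm(X @ W_hat.T - Z) / np.linalg.norm(Z))
```

Because the per-unit problems decouple and admit closed-form or cheap iterative solutions, a convex reformulation of this kind helps explain how speedups of the reported magnitude are possible.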

The team evaluated CORNN's efficacy in simulations of thousands of cells performing basic computations, such as executing a timed response or a 3-bit flip-flop, demonstrating its adaptability to demanding network-modeling tasks. The researchers also report that CORNN is highly robust in reproducing attractor structures and network dynamics: it delivers accurate, dependable results even in the face of discrepancies in neural timescales, severe subsampling of the observed neurons, or mismatches between the generator and inference models.
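For reference, the 3-bit flip-flop is a standard benchmark in this literature: three input channels deliver sparse ±1 pulses, and each output channel must hold the sign of the most recent pulse on its channel. A minimal generator for such trials might look like the following (the pulse probability and trial length are arbitrary choices, not values from the paper):

```python
import numpy as np

def flip_flop_data(T=1000, bits=3, pulse_prob=0.02, seed=0):
    """One trial of the N-bit flip-flop task: each target channel latches
    the sign of the most recent pulse on its input channel."""
    rng = np.random.default_rng(seed)
    pulses = rng.random((T, bits)) < pulse_prob        # when pulses occur
    signs = rng.choice([-1.0, 1.0], size=(T, bits))    # pulse polarity
    inputs = np.where(pulses, signs, 0.0)
    targets = np.zeros((T, bits))
    state = np.zeros(bits)
    for t in range(T):
        state = np.where(inputs[t] != 0, inputs[t], state)  # latch last pulse
        targets[t] = state
    return inputs, targets

u, y = flip_flop_data()
print(u.shape, y.shape)  # (1000, 3) (1000, 3)
```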

In conclusion, CORNN is significant because it can train dRNNs with millions of parameters in under a minute on a standard computer. This achievement marks an important first step toward real-time reproduction of networks constrained by large-scale neural recordings. By enabling faster, more scalable analyses of large neural datasets, CORNN stands out as a potent computational tool with the potential to deepen our understanding of neural computation.


Check out the Paper. All credit for this research goes to the researchers of this project.




