Understanding the Brain through Code and Mathematics: An Introduction to Computational Neuroscience

Bhaumik Tyagi
6 min read · Aug 1, 2023


Image source: https://neuroscience.wustl.edu/

The human brain remains one of the most enigmatic and complex structures known to humankind, and unraveling its mysteries has been a longstanding challenge for researchers across disciplines. Computational Neuroscience, an interdisciplinary field that merges neuroscience, computer science, and mathematics, has emerged as a powerful approach to understanding the brain’s functions and mechanisms. This article delves into the world of Computational Neuroscience, highlighting its significance, its methodologies, and its role in deciphering the brain’s inner workings.

1. The Essence of Computational Neuroscience

Computational Neuroscience aims to understand the brain’s functioning by developing and utilizing mathematical models and computer simulations. It embraces both theoretical and experimental approaches, integrating experimental data with computational tools to test and validate hypotheses about brain function.

At its core, Computational Neuroscience seeks to answer fundamental questions:

  • How do neurons process and transmit information?
  • What are the underlying principles of brain function?
  • How does the brain give rise to complex behaviors and cognition?

2. Neuron Modeling

Neurons are the building blocks of the brain, and their behavior is central to understanding brain function. Computational Neuroscience often employs mathematical models to represent neurons and their interactions.

One of the simplest models is the “leaky integrate-and-fire” neuron model, described by the following differential equation:

τ dv/dt = −(v − v_rest) + R · I(t)

where v is the membrane potential, v_rest is the resting potential, τ is the membrane time constant, R is the membrane resistance, and I(t) is the input current.

Python code for simulating a basic leaky integrate-and-fire neuron:

import numpy as np
import matplotlib.pyplot as plt

def leaky_integrate_and_fire(I, v_rest, tau, R, dt, T):
    v = v_rest                      # start at the resting potential
    time = np.arange(0, T, dt)
    v_trace = []

    for t in time:
        dv_dt = (-(v - v_rest) + R * I(t)) / tau  # membrane equation
        v += dv_dt * dt                           # forward Euler step
        v_trace.append(v)

    return time, np.array(v_trace)

# Example usage
I_input = lambda t: 2.5 if t > 20 and t < 60 else 0 # Input current function
v_rest = -70 # Resting membrane potential (mV)
tau = 10 # Membrane time constant (ms)
R = 2 # Membrane resistance (kΩ)
dt = 0.1 # Time step (ms)
T = 100 # Total simulation time (ms)

time, voltage_trace = leaky_integrate_and_fire(I_input, v_rest, tau, R, dt, T)

plt.plot(time, voltage_trace)
plt.xlabel('Time (ms)')
plt.ylabel('Membrane Potential (mV)')
plt.title('Leaky Integrate-and-Fire Neuron')
plt.show()

3. Network Models and Connectivity

Brains are complex networks of interconnected neurons. Computational Neuroscience explores the collective behavior of neural networks using graph theory and network models.
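
For a concrete (if highly simplified) illustration of the graph-theoretic view, the sketch below wires up a toy random connectivity graph with the networkx library and reports a couple of summary statistics. The network size, connection probability, and purely random wiring are illustrative assumptions, not a model of any real circuit.

import networkx as nx

# Toy "connectome": 100 neurons wired randomly with a 5% connection
# probability (an Erdos-Renyi directed random graph, purely illustrative).
n_neurons, p_connect = 100, 0.05
G = nx.gnp_random_graph(n_neurons, p_connect, directed=True)

# Simple graph-theoretic summaries of the network
in_degrees = [deg for _, deg in G.in_degree()]
print("Number of synapses (edges):", G.number_of_edges())
print("Mean in-degree per neuron:", sum(in_degrees) / n_neurons)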

One widely used model is the “artificial neural network” (ANN). ANNs are inspired by biological neural networks and are crucial in various applications, including pattern recognition, memory, and decision-making.

Python code for a simple feedforward neural network using the Keras library:

import numpy as np
import keras
from keras.models import Sequential
from keras.layers import Dense

# Generate some dummy data for illustration
input_data = np.random.rand(100, 10)
output_data = np.random.randint(2, size=(100, 1))

# Create the model
model = Sequential()
model.add(Dense(32, input_dim=10, activation='relu'))
model.add(Dense(1, activation='sigmoid'))

# Compile the model
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])

# Train the model
model.fit(input_data, output_data, epochs=10, batch_size=32)

4. The Role of Data Analysis and Machine Learning

Computational Neuroscience heavily relies on data analysis and machine learning techniques to interpret experimental data and extract meaningful insights. These methods help researchers uncover patterns, relationships, and emergent properties from vast amounts of neural data.
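
As a rough illustration of this kind of analysis, the sketch below applies principal component analysis (PCA) from scikit-learn to simulated population activity; the “recording” is just random Poisson counts standing in for real data, so the numbers themselves carry no scientific meaning.

import numpy as np
from sklearn.decomposition import PCA

# Simulated "recording": spike counts of 50 neurons across 200 trials
# (random Poisson numbers standing in for real experimental data).
rng = np.random.default_rng(0)
firing_rates = rng.poisson(lam=5.0, size=(200, 50)).astype(float)

# Reduce the 50-dimensional population activity to three components
pca = PCA(n_components=3)
low_dim = pca.fit_transform(firing_rates)

print("Explained variance ratios:", pca.explained_variance_ratio_)
print("Low-dimensional activity shape:", low_dim.shape)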

5. Simulating Brain Functions and Cognitive Processes

Computational Neuroscience allows us to simulate various brain functions and cognitive processes, such as visual perception, motor control, learning, and memory. By modeling these processes computationally, researchers can test hypotheses and gain deeper insights into the underlying mechanisms.

6. Synaptic Transmission and Neural Dynamics

The communication between neurons, known as synaptic transmission, is a fundamental process underlying brain function. Computational Neuroscience involves mathematical models to describe synaptic interactions and neural dynamics.

Synaptic inputs are classically described in terms of the “excitatory postsynaptic potential” (EPSP) and the “inhibitory postsynaptic potential” (IPSP): changes in the postsynaptic neuron’s membrane potential in response to neurotransmitter release from presynaptic neurons. We can describe the membrane potential of a neuron with both excitatory and inhibitory inputs using the following conductance-based equation:

C dV/dt = −g_leak (V − V_rest) + g_exc (V_exc − V) + g_inh (V_inh − V) + I_ext

where:

  • C is the membrane capacitance,
  • V is the membrane potential,
  • V_rest is the resting membrane potential,
  • g_leak is the leak conductance,
  • V_exc and V_inh are the reversal potentials for excitatory and inhibitory inputs, respectively,
  • g_exc and g_inh are the conductances of excitatory and inhibitory inputs, respectively, and
  • I_ext is the external current input.

Python code for simulating a neuron with both excitatory and inhibitory inputs:

import numpy as np
import matplotlib.pyplot as plt

def neuron_with_synaptic_inputs(C, V_rest, g_leak, V_exc, g_exc, V_inh, g_inh, I_ext, dt, T):
    time = np.arange(0, T, dt)
    V = V_rest
    V_trace = []

    for t in time:
        dV_dt = (-g_leak * (V - V_rest) + g_exc * (V_exc - V) + g_inh * (V_inh - V) + I_ext) / C
        V += dV_dt * dt
        V_trace.append(V)

    return time, np.array(V_trace)

# Example usage
C = 1.0 # Membrane capacitance (nF)
V_rest = -70 # Resting membrane potential (mV)
g_leak = 0.1 # Leak conductance (μS)
V_exc = 0 # Excitatory reversal potential (mV)
g_exc = 0.5 # Excitatory conductance (μS)
V_inh = -80 # Inhibitory reversal potential (mV)
g_inh = 0.2 # Inhibitory conductance (μS)
I_ext = 2 # External current input (nA)
dt = 0.1 # Time step (ms)
T = 100 # Total simulation time (ms)

time, voltage_trace = neuron_with_synaptic_inputs(C, V_rest, g_leak, V_exc, g_exc, V_inh, g_inh, I_ext, dt, T)

plt.plot(time, voltage_trace)
plt.xlabel('Time (ms)')
plt.ylabel('Membrane Potential (mV)')
plt.title('Neuron with Synaptic Inputs')
plt.show()

7. Spike-Timing-Dependent Plasticity (STDP)

STDP is a form of synaptic plasticity where the strength of a synapse changes based on the precise timing of pre- and postsynaptic spikes. It is a critical process underlying learning and memory in the brain.

Mathematically, the STDP learning window can be described by a pair of exponentials (a short code sketch follows the definitions below):

Δw = A_LTP · exp(−Δt / τ_LTP), if Δt > 0 (pre before post)
Δw = −A_LTD · exp(Δt / τ_LTD), if Δt < 0 (post before pre)

where:

  • Δw is the change in synaptic weight,
  • Δt is the time difference between the pre- and postsynaptic spikes,
  • A_LTD and A_LTP are the amplitudes of long-term depression (LTD) and long-term potentiation (LTP), respectively, and
  • τ_LTD and τ_LTP are the time constants for LTD and LTP, respectively.
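
To make the rule concrete, here is a minimal Python sketch of the exponential STDP window described above; the amplitudes and time constants are arbitrary illustrative values, not fitted to any experimental data.

import numpy as np
import matplotlib.pyplot as plt

def stdp_weight_change(delta_t, A_LTP=0.01, A_LTD=0.012, tau_LTP=20.0, tau_LTD=20.0):
    # Potentiation when the presynaptic spike precedes the postsynaptic one
    # (delta_t > 0), depression when the order is reversed (delta_t < 0).
    delta_t = np.asarray(delta_t, dtype=float)
    return np.where(delta_t > 0,
                    A_LTP * np.exp(-delta_t / tau_LTP),
                    -A_LTD * np.exp(delta_t / tau_LTD))

# Plot the learning window over a range of spike-timing differences
delta_t = np.linspace(-100, 100, 401)
plt.plot(delta_t, stdp_weight_change(delta_t))
plt.xlabel('Δt = t_post − t_pre (ms)')
plt.ylabel('Δw')
plt.title('STDP Learning Window')
plt.show()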

8. Large-Scale Brain Simulations

Computational Neuroscience also ventures into large-scale brain simulations, such as the Blue Brain Project and the Human Brain Project. These projects employ supercomputers and advanced mathematical models to simulate complex brain networks at unprecedented scales.

Conclusion

Computational Neuroscience is a cutting-edge field that employs the power of mathematics and computer simulations to explore the intricate workings of the brain. By developing and testing theoretical models alongside experimental data, it has the potential to unlock the secrets of cognition and revolutionize our understanding of the human mind. As technology and our understanding of neuroscience advance hand-in-hand, Computational Neuroscience is poised to continue contributing significantly to the scientific world and improving our quality of life.

Thanks for reading 🙂

Please follow for more such content 🙌
