
Cluster 1: Brain-Inspired Computing: Learning in Biological and Artificial Neural Networks

Description:

The world is entering a new era of artificial intelligence (AI) technology in which machines will become increasingly capable of performing “human” tasks such as speaking a language, driving an automobile, writing a computer program, or recognizing images in photographs and videos. To perform such tasks, AI systems rely on artificial neural networks that learn from their own experiences in much the same way that humans and animals do. Indeed, AI technology has arisen largely from efforts by scientists and engineers to build machines that mimic the brain, and for this reason, modern AI is sometimes referred to as “brain-inspired computing.”

This cluster will introduce students to the fundamental principles of brain-inspired computing that make it possible for biological and artificial neural networks to learn and solve problems. Each morning, students will attend two lectures on closely related topics from neuroscience and machine learning. Each afternoon, students will attend a “coding clinic” to learn computer programming skills (prior coding experience is not a prerequisite for enrollment), followed by a modeling exercise that challenges students to simulate a biological neural network or create a machine learning system that performs a useful task. In addition to lectures and labs, field trips and discussions with cluster faculty will help students to understand what AI technology can teach us about how the brain works, and vice versa.

Along our journey we will delve into mysteries and questions that are taking on new importance with the advent of AI technology: What is intelligence? How do our brains store and retrieve memories? Why do we sleep and dream? How do people form habits, and how can they break them? Will machines ever become as intelligent as, or more intelligent than, humans? Can a machine become conscious? What are the greatest dangers of AI? How can people help to make sure that democracy, tolerance, and diversity will flourish in a post-AI world?

WEEK 1: FUNDAMENTALS OF NEURAL COMPUTATION

Morning Neuro lectures: 1) Natural and artificial intelligence, 2) The neuronal membrane potential, 3) Synaptic transmission & plasticity, 4) Action potentials and activation functions

Morning ML/AI lectures: 1) Neural coding: population vectors, stimulus features, normalization, 2) Linear regression, mean squared error (MSE), and cost functions, 3) Error-driven learning & gradient descent, 4) Logistic regression (classification) and decision boundaries

Afternoon coding clinic: 1) Plotting functions and displaying images, 2) Operations on scalars, vectors, and tensors, 3) Loops and conditionals, 4) Time series and derivatives

Afternoon modeling challenge: 1) Ocellus illumination encodes flight position in insects, 2) Tuning a retinal bipolar neuron’s preferred stimulus, 3) Train a visual cortex neuron to perform edge detection, 4) Simulating the jump escape reflex in flies
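To give a flavor of the Week 1 ML topics (linear regression, MSE cost functions, and error-driven gradient descent), here is a minimal sketch in plain Python. The toy data and hyperparameters are illustrative inventions, not course materials:

```python
# Error-driven learning sketch: fit y = w*x + b to noisy toy data
# by gradient descent on the mean squared error (MSE).

# Toy data generated roughly from y = 2x + 1 (illustrative values only).
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.1, 2.9, 5.2, 6.8, 9.1]

w, b = 0.0, 0.0   # initial parameters
lr = 0.02         # learning rate (illustrative)
n = len(xs)

for step in range(2000):
    # Gradients of MSE = (1/n) * sum((w*x + b - y)^2) with respect to w and b.
    dw = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
    db = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
    w -= lr * dw   # step downhill on the cost surface
    b -= lr * db

# The learned slope and intercept approach the generating values 2 and 1.
print(round(w, 2), round(b, 2))
```

The same loop — compute an error, follow its gradient downhill — is the core idea behind the learning rules covered in later weeks.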

FIELD TRIP: Visit a neurophysiology lab

WEEK 2: SUPERVISED LEARNING

Morning Neuro lectures: 1) Sensory pathways in the brain, 2) Topographic organization of sensory cortex, 3) Primate visual pathways, 4) Error signals in the brain

Morning ML/AI lectures: 1) Single-layer perceptrons, 2) Multilayer perceptrons & backpropagation, 3) Convolutional neural networks, 4) AlexNet

Afternoon coding clinic: 1) Tool libraries for AI and machine learning, 2) Specification of training schedules, 3) RGB image processing, 4) Curating data

Afternoon modeling challenge: 1) Worm detection in the toad visual system, 2) Train an MLP to discriminate signal from noise, 3) Train a CNN to classify handwritten MNIST digits, 4) Train a CNN to classify CIFAR-10 images
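The simplest supervised learner from the Week 2 lectures, the single-layer perceptron, fits in a few lines. This sketch uses the logical AND function as a stand-in task (an assumption for illustration, not a course exercise):

```python
# A single-layer perceptron learning a linearly separable task (logical AND),
# illustrating supervised, error-driven weight updates.

# Inputs and target labels for AND.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

w = [0.0, 0.0]   # weights
b = 0.0          # bias
lr = 0.1         # learning rate (illustrative)

def predict(x):
    # Step activation function: fire if the weighted sum exceeds zero.
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

for epoch in range(20):
    for x, target in data:
        error = target - predict(x)   # supervised error signal
        w[0] += lr * error * x[0]     # perceptron learning rule
        w[1] += lr * error * x[1]
        b += lr * error

print([predict(x) for x, _ in data])  # -> [0, 0, 0, 1]
```

The multilayer perceptrons and CNNs later in the week extend this same error-driven update through many layers via backpropagation.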

FIELD TRIP: Visit a cellular calcium imaging lab

WEEK 3: UNSUPERVISED LEARNING

Morning Neuro lectures: 1) Memory and the hippocampus, 2) Sleep and dreams, 3) Face coding in the primate brain, 4) Cognitive maps and the hippocampus

Morning ML/AI lectures: 1) Overfitting, generalization, & catastrophic interference, 2) Autoencoders, data compression & latent variables, 3) Generative AI: VAEs, GANs, diffusion models, 4) Attractor networks and reservoir computing

Afternoon coding clinic: 1) Build an autoencoder, 2) Statistical analysis of network performance, 3) Loops and conditionals, 4) Kernels and filtering

Afternoon modeling challenge: 1) Catastrophic interference in a CNN, 2) Preventing catastrophic interference in a VAE that “dreams,” 3) Using VAEs to model bias and discrimination in AI, 4) Training a VAE to learn camera angles and positions
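Of the Week 3 topics, attractor networks admit an especially compact demonstration of memory storage and retrieval. Below is a minimal Hopfield-style sketch (the pattern and network size are made up for illustration): a pattern stored with a Hebbian rule is recalled from a corrupted cue as the network settles into its attractor.

```python
# A tiny Hopfield-style attractor network: one binary pattern is stored
# with a Hebbian rule, then recalled from a corrupted cue by iterating
# the network until it settles into the stored attractor state.

pattern = [1, -1, 1, -1, 1, -1, 1, -1]   # stored pattern (+1/-1 units)
n = len(pattern)

# Hebbian weight matrix: W[i][j] = p_i * p_j, with no self-connections.
W = [[pattern[i] * pattern[j] if i != j else 0 for j in range(n)]
     for i in range(n)]

def recall(state, steps=5):
    # Synchronous update: each unit takes the sign of its summed input.
    for _ in range(steps):
        state = [1 if sum(W[i][j] * state[j] for j in range(n)) >= 0 else -1
                 for i in range(n)]
    return state

cue = list(pattern)
cue[0] = -cue[0]   # corrupt two units of the cue
cue[3] = -cue[3]
print(recall(cue) == pattern)  # -> True
```

This "pattern completion" from a partial cue is the same computational motif the morning lectures link to memory retrieval in the hippocampus.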

WEEK 4: REINFORCEMENT LEARNING

Morning Neuro lectures: 1) Pavlovian versus instrumental conditioning, 2) The dopaminergic system, 3) Value-based decision making, 4) Addiction & depression

Morning ML/AI lectures: 1) Introduction to reinforcement learning, 2) The temporal difference (TD) learning rule, 3) Actor-critic models, 4) Deep reinforcement learning

Afternoon coding clinic: 1) Real-time modeling, 2) Trial-based modeling, 3) Exponential processes, 4) State vectors

Afternoon modeling challenge: 1) Simulations of classical conditioning, 2) Q-learning in gridworld, 3) TD learning, 4) Deep learning to play Atari games
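The gridworld Q-learning exercise can be sketched in miniature with a tabular agent. The one-dimensional world, reward placement, and hyperparameters below are illustrative assumptions, not the course's actual environment:

```python
import random

# Tabular Q-learning in a tiny 1-D gridworld: states 0..4, actions
# right (+1) and left (-1), reward +1 on reaching the terminal state 4.

n_states = 5
actions = [1, -1]   # right listed first so greedy ties favor exploring right
Q = {(s, a): 0.0 for s in range(n_states) for a in actions}
alpha, gamma, eps = 0.5, 0.9, 0.1   # learning rate, discount, exploration

random.seed(0)
for episode in range(200):
    s = 0
    while s != n_states - 1:   # state 4 is terminal
        # Epsilon-greedy action selection.
        a = (random.choice(actions) if random.random() < eps
             else max(actions, key=lambda a: Q[(s, a)]))
        s2 = min(max(s + a, 0), n_states - 1)
        r = 1.0 if s2 == n_states - 1 else 0.0
        # TD update: move Q(s,a) toward reward + discounted best next value.
        best_next = (0.0 if s2 == n_states - 1
                     else max(Q[(s2, b)] for b in actions))
        Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])
        s = s2

# The greedy policy should step right from every non-terminal state.
policy = [max(actions, key=lambda a: Q[(s, a)]) for s in range(n_states - 1)]
print(policy)  # -> [1, 1, 1, 1]
```

The TD error inside the update, `r + gamma * best_next - Q[(s, a)]`, is the quantity the Week 4 lectures relate to dopaminergic reward-prediction-error signals in the brain.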

FINAL PROJECT: Deep reinforcement learning agent for autonomous navigation