Author: Collazos, Steven
Dates: 2019-12-11; 2019-12-11; 2019-08
URI: https://hdl.handle.net/11299/209044
Description: University of Minnesota Ph.D. dissertation. August 2019. Major: Mathematics. Advisor: Duane Nykamp. 1 computer file (PDF); v, 75 pages.

Abstract: Hebbian theory proposes that ensembles of neurons, that is, groups of co-active neurons, form a basis for neural processing. We model the collection of all possible ensembles of neurons---known as permitted sets, $\mathcal{P}_\Phi(W)$---as a collection of binary strings that indicate which neurons are deemed active. In this model, $\Phi$ is a function that prescribes how neurons respond to inputs, and $W$ is a matrix that captures the strengths of the connections among neurons in the network. We construct $\mathcal{P}_\Phi(W)$ by imposing a threshold on the responsiveness of each neuron to input at steady state. We investigate how synaptic strengths shape $\mathcal{P}_\Phi(W)$. When the synaptic weight matrix is almost rank one, we prove two main results about $\mathcal{P}_\Phi(W)$. First, $\mathcal{P}_\Phi(W)$ is a convex code, that is, a combinatorial neural code arising from a pattern of intersections of convex sets. Second, $\mathcal{P}_\Phi(W)$ exhibits nesting, meaning that every permitted set with $k$ co-active neurons contains a permitted subset with $k-1$ co-active neurons. Our results apply to neuronal networks whose activation function is $C^1$ with finitely many discontinuities.

Language: en
Subjects: Asymptotically stable fixed points; Differential Equations; Neural coding; Neuronal networks
Title: Coding Properties of Firing Rate Models with Low-Rank Synaptic Weight Matrices
Type: Thesis or Dissertation
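The abstract does not fix a specific activation function $\Phi$, so the following is only an illustrative sketch, assuming the classical threshold-linear special case in which a set of neurons is permitted exactly when the corresponding principal submatrix of $W - I$ is stable (all eigenvalues have negative real part). The function name `permitted_sets`, the tolerance parameter, and the nearly rank-one example matrix are hypothetical and are not taken from the dissertation.

```python
import itertools
import numpy as np

def permitted_sets(W, tol=1e-9):
    """Enumerate candidate permitted sets of a threshold-linear network.

    A subset sigma is treated as permitted when every eigenvalue of the
    principal submatrix of (W - I) restricted to sigma has negative real
    part.  This is the standard threshold-linear criterion, used here as a
    stand-in for the more general activation function Phi in the thesis.
    """
    n = W.shape[0]
    I = np.eye(n)
    permitted = []
    for k in range(1, n + 1):
        for sigma in itertools.combinations(range(n), k):
            idx = np.ix_(sigma, sigma)
            eigvals = np.linalg.eigvals((W - I)[idx])
            if np.all(eigvals.real < -tol):
                permitted.append(set(sigma))
    return permitted

# Example: a small, nearly rank-one synaptic weight matrix (hypothetical values).
u = np.array([0.4, 0.5, 0.6])
W = np.outer(u, u) + 0.01 * np.random.default_rng(0).standard_normal((3, 3))
np.fill_diagonal(W, 0.0)  # no self-connections
for sigma in permitted_sets(W):
    print(sorted(sigma))
```

In this rank-one-plus-perturbation example one can check the nesting property directly: whenever a printed set of size $k$ appears, some subset of size $k-1$ also appears in the output.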