One primary objective of mining the brain is to learn the inner workings of the mind and how external events become internal perceptions. But to mine the brain also means to mine the continuous web of neural signals that traverses its billions of neurons. Advances in computational neuroscience over the past several decades have provided fundamental clues to how the brain supports memory, movement, and sensory perception. We analyze the responses of a population of neurons recorded simultaneously in guinea pig auditory cortex while various sound stimuli are presented in the free field. By mining the responses of auditory neurons in the awake animal to different acoustic stimuli, we hope to address a few key questions. 1) Do the neurons respond in specific ways to particular features of the stimuli? 2) Is there a clear relation between groups of neurons and a specific sound stimulus? 3) How many neurons are needed to decode the stimuli? 4) What are the optimal algorithms for interpreting the neural responses? 5) How much pre-processing is necessary to account for missing data, noise, and the high variability of neural responses even to similar stimuli? We first introduce techniques that transform the original data set from spike times into identifiable signal waveforms for discrimination analysis. We then demonstrate the complexity of the problem by presenting results obtained with template matching. Finally, the self-organizing map (SOM) is described as a promising technique for extracting the most relevant information from this complex data set.
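The template-matching approach mentioned above can be sketched as follows. This is a minimal illustration, not the authors' exact procedure: it assumes spike times are binned into count vectors (a common first step for discrimination analysis), class templates are formed by averaging the binned responses over trials of each stimulus, and a new trial is assigned to the stimulus whose template lies closest in Euclidean distance. The function names (`bin_spikes`, `build_templates`, `classify`) are hypothetical.

```python
import numpy as np

def bin_spikes(spike_times, t_max, bin_width):
    # Convert spike times (in seconds) to a vector of per-bin spike counts.
    n_bins = int(np.ceil(t_max / bin_width))
    counts, _ = np.histogram(spike_times, bins=n_bins, range=(0.0, t_max))
    return counts.astype(float)

def build_templates(responses, labels):
    # Average the binned responses of each stimulus class to form its template.
    return {lab: np.mean([r for r, l in zip(responses, labels) if l == lab], axis=0)
            for lab in set(labels)}

def classify(response, templates):
    # Assign the stimulus whose template is nearest in Euclidean distance.
    return min(templates, key=lambda lab: np.linalg.norm(response - templates[lab]))
```

Even this simple scheme exposes the difficulty noted in the text: trial-to-trial variability means single-trial responses often sit far from every template, which motivates averaging over repeats and, ultimately, more powerful methods.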
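The self-organizing map referred to above can likewise be sketched in a few lines. This is a generic Kohonen-style SOM, not the specific configuration used in the study: each input vector is matched to its best-matching unit (BMU) on a 2-D grid, and that unit and its Gaussian neighborhood are pulled toward the input, with learning rate and neighborhood radius decaying over iterations. The grid size, decay schedule, and function names are illustrative assumptions.

```python
import numpy as np

def train_som(data, grid_h, grid_w, n_iter=1000, lr0=0.5, sigma0=2.0, seed=0):
    # Train a 2-D self-organizing map on the row vectors of `data`.
    rng = np.random.default_rng(seed)
    weights = rng.random((grid_h, grid_w, data.shape[1]))
    ys, xs = np.mgrid[0:grid_h, 0:grid_w]          # grid coordinates of each unit
    for t in range(n_iter):
        frac = t / n_iter
        lr = lr0 * (1.0 - frac)                     # linearly decaying learning rate
        sigma = sigma0 * (1.0 - frac) + 0.5         # shrinking neighborhood radius
        x = data[rng.integers(len(data))]           # random training sample
        dists = np.linalg.norm(weights - x, axis=2)
        by, bx = np.unravel_index(np.argmin(dists), dists.shape)  # best-matching unit
        # Gaussian neighborhood centered on the BMU pulls nearby units toward x.
        h = np.exp(-((ys - by) ** 2 + (xs - bx) ** 2) / (2.0 * sigma ** 2))
        weights += lr * h[:, :, None] * (x - weights)
    return weights

def bmu(weights, x):
    # Grid coordinates of the unit whose weight vector is closest to x.
    d = np.linalg.norm(weights - x, axis=2)
    return np.unravel_index(np.argmin(d), d.shape)
```

The appeal for this data set is that the map is unsupervised: responses to similar stimuli should land on nearby units, revealing structure without first specifying which stimulus features matter.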