| Affiliation | Johns Hopkins University |
| Date and Time | April 4, 2013, 6:00 p.m. - 7:00 p.m. |
| Location | McGovern Seminar Room 3189 |
I will discuss a framework for a theory of neural computation from several perspectives. From the viewpoint of automata theory, the machines in question are formal neural networks (connectionist nets) that support distributed representations of symbol structures while simultaneously performing stochastic global optimization and quantization to discrete symbolic states. From the perspective of recursive function theory, recursive equations for symbol-mapping functions become recursive equations for the (weight) matrices of linear mappings (networks). Finally, a language-theoretic approach uses optimization over symbol structures to define formal languages, and to support optimization-based approaches to natural languages.
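One standard way such networks encode "distributed representations of symbol structures" is the tensor product representation: each symbol (filler) vector is bound to a positional (role) vector by an outer product, and the bindings are superposed into a single matrix. The sketch below is illustrative only; the dimensions, the orthonormal role vectors, and the symbol names are all assumptions, not details from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup: three positions with orthonormal role vectors
# (here the standard basis) and a random filler vector per symbol.
d = 8                                # filler dimension (arbitrary)
roles = np.eye(3)                    # role vectors r_0, r_1, r_2
fillers = {s: rng.normal(size=d) for s in "ABC"}

# Bind each filler to its role via an outer product, then superpose:
#   T = sum_i f_i (outer) r_i
# T is one distributed representation of the whole string "ABC".
T = sum(np.outer(fillers[s], roles[i]) for i, s in enumerate("ABC"))

# Unbinding: because the roles are orthonormal, T @ r_i recovers
# exactly the filler bound to position i.
recovered = T @ roles[1]
assert np.allclose(recovered, fillers["B"])
```

The point of the construction is that a discrete symbol structure lives in the network as a single numerical object (here the matrix `T`), over which linear maps and optimization can operate directly.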