McClelland and Rumelhart's Parallel Distributed Processing was the first book to present a definitive account of the newly revived connectionist/neural net paradigm for artificial intelligence and cognitive science. While Neural Computing Architectures addresses the same issues, there is little overlap in the research it reports. These 18 contributions provide a timely and informative overview and synopsis of both pioneering and recent European connectionist research. Several chapters focus on cognitive modeling, but most of the work covered revolves around abstract neural network theory or engineering applications, bringing important complementary perspectives to currently published work in PDP.

In four parts, the chapters take up neural computing from the classical perspective, including both foundational and current work; from the mathematical perspective (of logic, automata theory, and probability theory), presenting less well-known work in which the neuron is modeled as a logic truth function that can be implemented directly as a silicon read-only memory; present new material, both as analytical tools and models and as suggestions for implementation in optical form; and summarize the PDP perspective in a single extended chapter covering PDP theory, application, and speculation in US research. Each part is introduced by the editor.

Igor Aleksander is the editor of 'Neural Computing Architectures' (ISBN 9780262011105 and ISBN 0262011107).