There is a battle about to commence. A battle in the world of cognitive modelling. Or at least a bit of a skirmish. Two articles to be published in Trends in Cognitive Sciences debate the merits of approaching cognition from different ends of the microscope.
On the side of probabilistic modelling we have Tom Griffiths, Nick Chater, Charles Kemp, Amy Perfors and Joshua Tenenbaum. Representing (perhaps non-symbolically) emergentist approaches are James McClelland, Matthew Botvinick, David Noelle, David Plaut, Timothy Rogers, Mark Seidenberg and Linda B. Smith. This contest is not short of heavyweights.
However, the first battleground seems to be who can come up with the most complicated diagram. I leave this decision to the reader (see first two images).
The central issue is which approach is the more productive for explaining phenomena in cognition. David Marr’s levels of explanation comprise a ‘computational’ characterisation of the problem, an ‘algorithmic’ description of how it is solved, and an ‘implementational’ explanation which focusses on how the task is actually carried out by real brains. The structured probabilistic camp takes a ‘top-down’ approach, while emergentism takes a ‘bottom-up’ approach.
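To make the ‘top-down’ flavour of the probabilistic camp concrete, here is a minimal, generic sketch (my own toy example, not taken from either article) of a computational-level analysis: cognition framed as Bayesian inference over a small, hypothetical space of hypotheses, with the algorithm and the neural implementation left entirely unspecified.

```python
def posterior(priors, likelihoods):
    """Bayes' rule: P(h|d) is proportional to P(d|h) * P(h),
    normalised over the hypothesis space."""
    unnorm = {h: priors[h] * likelihoods[h] for h in priors}
    total = sum(unnorm.values())
    return {h: p / total for h, p in unnorm.items()}

# Two made-up hypotheses about which rule generated some observed data,
# starting from equal priors. The 'narrow' rule assigns each of three
# observations probability 0.5; the 'broad' rule assigns each 0.1.
priors = {"narrow": 0.5, "broad": 0.5}
likelihoods = {"narrow": 0.5 ** 3, "broad": 0.1 ** 3}

post = posterior(priors, likelihoods)
print(post)  # the narrow rule dominates after only three observations
```

The point of such a model is not how the brain computes the posterior, but that the posterior characterises what problem the learner is solving, which is exactly the level at which the emergentists object that the real explanatory action may lie elsewhere.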