Information processing, computation, and cognition

  • Perspective
  • Journal of Biological Physics

Abstract

Computation and information processing are among the most fundamental notions in cognitive science. They are also among the most imprecisely discussed. Many cognitive scientists take it for granted that cognition involves computation, information processing, or both – although others disagree vehemently. Yet different cognitive scientists use ‘computation’ and ‘information processing’ to mean different things, sometimes without realizing that they do. In addition, computation and information processing are surrounded by several myths; first and foremost, that they are the same thing. In this paper, we address this unsatisfactory state of affairs by presenting a general and theory-neutral account of computation and information processing. We also apply our framework by analyzing the relations between computation and information processing on the one hand and classicism, connectionism, and computational neuroscience on the other. We defend the relevance to cognitive science of both computation, at least in a generic sense, and information processing, in three important senses of the term. Our account advances several foundational debates in cognitive science by untangling some of their conceptual knots in a theory-neutral way. By leveling the playing field, we pave the way for the future resolution of the debates’ empirical aspects.


Notes

  1. The Church–Turing thesis properly so-called, i.e., the thesis supported by Church, Turing, and Kleene’s arguments, is sometimes confused with the Physical Church–Turing thesis [58], which lies outside the scope of this paper. Suffice it to say that the Physical Church–Turing thesis is controversial and in any case does not entail that cognition is computation in a sense that is relevant to cognitive science [29].

  2. Classical theories are often referred to as “symbolic.” Roughly speaking, in the present context, a symbol is something that satisfies two conditions: (1) it is a representation and (2) it falls under a discrete (or digital) linguistic type. As we will argue below, conceptual clarity requires keeping these two aspects of symbols separate. Therefore, we will avoid the term “symbol” and its cognates.

  3. Sometimes, the view that cognition is dynamical is presented as an alternative to the view that cognition is computational (e.g., [4]). This is simply a false contrast. Computation is dynamical too; the relevant question is whether cognitive dynamics are computational.

  4. Did McCulloch and Pitts really offer a theory of cognition in terms of digital computation? Absolutely. McCulloch and Pitts’ work is widely misunderstood; for a detailed study, see [63].

  5. The term “digit” is used in two ways. It may be used for the discrete variables that can take different values; for instance, binary cells are often called bits, and they can take either 0 or 1 as values. Alternatively, “digit” may be used for the values themselves; in this second sense, it is the 0s and 1s that are the bits. We use “digit” in the second sense.

  6. Addition and multiplication are usually defined as functions over numbers. To maintain consistency in the present context, they ought to be understood as functions over strings of digits.
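
To make this concrete, here is a minimal Python sketch (our illustration, not anything from the paper) of addition defined directly over strings of binary digits, with no detour through numbers:

```python
def add_binary_strings(a: str, b: str) -> str:
    """Add two binary numerals digit by digit, never converting to int.

    Illustrates addition understood as a function over strings of
    digits ('0'/'1') rather than over numbers, as in note 6.
    """
    result, carry = [], 0
    i, j = len(a) - 1, len(b) - 1
    while i >= 0 or j >= 0 or carry:
        da = int(a[i]) if i >= 0 else 0   # current digit of a, or 0
        db = int(b[j]) if j >= 0 else 0   # current digit of b, or 0
        total = da + db + carry
        result.append(str(total % 2))     # output digit
        carry = total // 2                # carry digit
        i, j = i - 1, j - 1
    return ''.join(reversed(result))

assert add_binary_strings('101', '11') == '1000'  # 5 + 3 = 8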

  7. The term “classical computation” is sometimes used as an approximate synonym of digital computation. Here, we are focusing on the more restricted sense of the term that has been used in debates on cognitive architecture at least since [10].

  8. Fodor and Pylyshyn [10] restrict their notion of classical computation to processes defined over representations because they operate under assumption 3—that computation requires representation. In other words, the sentence-like symbolic structures manipulated by Fodor and Pylyshyn’s classical computations must have semantic content. Thus, Fodor and Pylyshyn’s classical computations are a kind of “semantic computation.” Since assumption 3 is a red herring in the present context, we avoided it in the main text. We will discuss the distinction between semantic and nonsemantic computation in Section 3.3.

  9. Two caveats: First, neural networks should not be confused with their digital simulations. A simulation is a model or representation of a neural network; the network is what the simulation represents. A digital simulation of a neural network is algorithmic, of course, but it does not follow that the network itself (the system represented by the simulation) is algorithmic. Second, some authors use the term “algorithm” for the computations performed by any neural network. In this broader sense of “algorithm,” any processing of a signal follows an algorithm, regardless of whether the process is defined in terms of discrete manipulations of strings of digits. Since this notion of algorithm is more encompassing than the one employed in computer science (indeed, broader even than the notion of computing Turing-computable functions), our point stands: the notion of computing Turing-computable functions is more inclusive than that of algorithmic computation in the standard sense [30].

  10. For more details on the contrast between digital and analog computers, see [31, Section 3.5].

  11. One terminological caveat. Later on, we will also speak of semantic notions of information. In the case of nonnatural semantic information, by “semantic,” we will mean the same as what we mean here, i.e., representational. A representation is something that can misrepresent, i.e., may be unsatisfied or false. In the case of natural information, by “semantic,” we will mean something weaker, which is not representational because it cannot misrepresent (see below).

  12. Digital computing systems belong to a hierarchy of systems of increasing computational power, ordered by the progressively larger classes of functions each class of system can compute; examples include the functions computable by finite automata, pushdown automata, and Turing machines [62].
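
As an informal illustration of the lower levels of this hierarchy, the following Python sketch (ours, not the paper’s) contrasts a two-state finite automaton with a pushdown-style recognizer, whose stack is abstracted here as a counter:

```python
def dfa_even_ones(s: str) -> bool:
    """Finite automaton: two states suffice to decide 'even number of 1s'."""
    state = 0                      # 0 = even, 1 = odd
    for ch in s:
        if ch == '1':
            state = 1 - state
    return state == 0

def pda_anbn(s: str) -> bool:
    """Pushdown-style check for a^n b^n, which no finite automaton can
    decide: the stack (here just a counter) provides unbounded memory."""
    count, seen_b = 0, False
    for ch in s:
        if ch == 'a':
            if seen_b:
                return False       # an 'a' after a 'b' is malformed
            count += 1             # push
        elif ch == 'b':
            seen_b = True
            count -= 1             # pop
            if count < 0:
                return False       # more b's than a's so far
        else:
            return False
    return count == 0

assert dfa_even_ones('1010') and not dfa_even_ones('100')
assert pda_anbn('aaabbb') and not pda_anbn('aab')
```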

  13. We should also note that there are different associative mechanisms, some of which are more powerful than others; comparing associative mechanisms lies outside the scope of this paper.

  14. Shannon was building on important work by Boltzmann, Szilard, Hartley, and others [89].

  15. The logarithm to the base b of a variable x—expressed as \(\log_b x\)—is defined as the power to which b must be raised to get x. In other words, \(\log_b x = y\) if and only if \(b^y = x\). We stipulate that the expression \(0 \log_b 0\) in any of the addenda of H(X) is equal to 0. Shannon ([39], p. 379) pointed out that in choosing a logarithmic function he was following Hartley [94] and added that logarithmic functions have nice mathematical properties, are more useful practically because a number of engineering parameters “vary linearly with the logarithm of the number of possibilities,” and are “nearer to our intuitive feeling as to the proper measure” of information.
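
For readers who want to compute H(X) directly, here is a small Python sketch of the entropy formula under the stipulation that \(0 \log_b 0 = 0\); the function name and the example distributions are our own illustrative choices:

```python
import math

def entropy(probs, base=2):
    """Shannon entropy H(X) = -sum_i p_i * log_base(p_i),
    stipulating that 0 * log(0) = 0 by skipping zero-probability terms."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin yields 1 bit per outcome (cf. note 18: -log2(0.5) = 1);
# a biased coin yields less, and a certain outcome yields none.
assert math.isclose(entropy([0.5, 0.5]), 1.0)
print(entropy([0.9, 0.1]))   # ~0.469 bits
print(entropy([1.0, 0.0]))   # 0.0 bits
```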

  16. The three mathematical desiderata are the following: (1) the entropy H should be continuous in the probabilities \(p_i\); (2) the entropy H should be a monotonic increasing function of n when \(p_i = 1/n\); and (3) if \(n = b_1 + \cdots + b_k\) with each \(b_i\) a positive integer, then \(H(1/n, \ldots, 1/n) = H(b_1/n, \ldots, b_k/n) + \sum_{i=1}^{k} (b_i/n)\, H(1/b_i, \ldots, 1/b_i)\). Shannon further supported his interpretation of H as the proper measure of information by demonstrating that the channel capacity required for most efficient coding is determined by the entropy ([39], see Theorem 9 in Section 9).
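
Desideratum (3) can be checked numerically. The short Python sketch below (an illustration we added, with an arbitrarily chosen grouping) verifies the identity for n = 6 split into groups of sizes 2 and 4:

```python
import math

def entropy(probs):
    """Shannon entropy in bits, with 0 * log(0) stipulated to be 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Desideratum (3) for n = 6 split into groups of sizes b = (2, 4):
# H(1/6,...,1/6) = H(2/6, 4/6) + (2/6)H(1/2,1/2) + (4/6)H(1/4,...,1/4)
n, groups = 6, (2, 4)
lhs = entropy([1 / n] * n)
rhs = entropy([b / n for b in groups]) + sum(
    (b / n) * entropy([1 / b] * b) for b in groups
)
assert math.isclose(lhs, rhs)   # both sides equal log2(6), ~2.585 bits
```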

  17. Shannon credited John Tukey, a statistician at Bell Telephone Laboratories, with introducing the term in a 1947 working paper.

  18. \(-\log_2 0.5 = 1\).

  19. Furthermore, Shannon’s messages do not even have to be strings of digits of finitely many types; on the contrary, they may be continuous variables. We gave the definitions of entropy and mutual information for discrete variables, but the definitions can be modified to suit continuous variables ([39], part III).

  20. Shannon defined the channel capacity C as follows: \(C = \max_{P(a)} I(X;Y) = \max_{P(a)} [H(X) - H(X \mid Y)]\). The conditional entropy is calculated as follows: \(H(X \mid Y) = \sum_{i=1,j=1}^{n,r} p(a_i, b_j) \log_2 \frac{1}{p(a_i \mid b_j)}\).
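
The following Python sketch (our illustration; the channel and input distribution are assumptions chosen for the example) computes \(I(X;Y) = H(X) - H(X \mid Y)\) from a joint distribution for a binary symmetric channel with crossover probability 0.1 and uniform input, which happens to be the capacity-achieving input for that channel:

```python
import math

def H(probs):
    """Shannon entropy in bits, with 0 * log(0) stipulated to be 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Joint distribution p(a_i, b_j): binary symmetric channel, crossover 0.1,
# uniform input (the capacity-achieving input for this channel).
joint = [[0.45, 0.05],
         [0.05, 0.45]]
p_x = [sum(row) for row in joint]           # marginal of the input X
p_y = [sum(col) for col in zip(*joint)]     # marginal of the output Y

# H(X|Y) = sum_{i,j} p(a_i, b_j) log2(1 / p(a_i | b_j)),
# where p(a_i | b_j) = joint[i][j] / p_y[j].
H_x_given_y = sum(
    joint[i][j] * math.log2(p_y[j] / joint[i][j])
    for i in range(2) for j in range(2) if joint[i][j] > 0
)
print(H(p_x) - H_x_given_y)   # ~0.531 bits = 1 - H(0.1), the capacity C
```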

  21. One terminological caveat. Later on, we will also speak of semantic notions of information. In the case of nonnatural semantic information, by “semantic,” we will mean the same as what we mean here, i.e., representational. A representation is something that can misrepresent, i.e., may be unsatisfied or false. In the case of natural information, by “semantic,” we will mean something weaker, which is not representational because it cannot misrepresent (see below).

  22. This is not to say that conventions are the only possible source of nonnatural meaning. For further discussion, see [96].

  23. For a more precise and detailed theory of probabilistic information, see [36, 37]. The probabilistic notion of information we employ includes all-or-nothing natural information—roughly, the natural information that o is G with certainty—as a special, limiting case. For a critique of approaches focusing solely on all-or-nothing natural information, see [99].

  24. This is not to say that natural information is enough to explain why acting on the basis of received natural information sometimes constitutes a mistake and sometimes does not. We briefly discuss conditions of correctness for information-based behaviors below.

  25. A consequence of this point is that the transmission of natural information entails nothing more than the truth of a probabilistic claim [37]. It follows that our distinction between natural and nonnatural meaning/information differs from Grice’s original distinction in one important respect. On our view, there is nothing objectionable in holding that “those spots carry natural information about measles, but he doesn’t have measles,” provided measles is more likely given those spots than in their absence.
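
A toy Bayesian calculation makes the point vivid; the numbers below are hypothetical, chosen only to illustrate how spots can raise the probability of measles without entailing it:

```python
# Hypothetical numbers, purely illustrative: spots carry natural information
# about measles iff P(measles | spots) > P(measles | no spots), even though
# someone with spots may well not have measles.
p_measles = 0.01
p_spots_given_measles = 0.9
p_spots_given_no_measles = 0.05

p_spots = (p_spots_given_measles * p_measles
           + p_spots_given_no_measles * (1 - p_measles))
p_measles_given_spots = p_spots_given_measles * p_measles / p_spots

print(p_measles_given_spots)               # ~0.154: raised, far from certain
assert p_measles_given_spots > p_measles   # spots carry natural information
```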

  26. There are also imperative representations, such as desires. And there are representations that combine descriptive and imperative functions, such as honeybee dances and rabbit thumps (cf. [47]). For simplicity, we will continue to focus on descriptive representations.

  27. See [37] for an explanation of the limited sense in which we take natural information to be truth entailing.

  28. In this section, we have not discussed the distinction between semantic and nonsemantic notions of computation, as it makes no difference to our present concerns.

  29. At least for deterministic computation, there is always a reliable causal correlation between later states and earlier states of the system, as well as between initial states and whatever caused those initial states—even if the cause is merely the thermal properties of the immediate surroundings. In this sense, (deterministic) digital computation entails natural information processing. But the natural information that is always “processed” is not about the distal environment, which is what theorists of cognition are generally interested in.

  30. For further discussion of some relevant issues, see [28, 29].

References

  1. Perkel, D.H.: Computational neuroscience: scope and structure. In: Schwartz, E.L. (ed.) Computational Neuroscience, pp. 38–45. MIT Press, Cambridge (1990)

  2. Edelman, G.M.: Bright Air, Brilliant Fire: on the Matter of the Mind. Basic Books, New York (1992)

  3. Globus, G.G.: Towards a noncomputational cognitive neuroscience. J. Cogn. Neurosci. 4(4), 299–310 (1992)

  4. Port, R.F., van Gelder, T.: Mind as Motion: Explorations in the Dynamics of Cognition. MIT Press, Cambridge (1995)

  5. Freeman, W.J.: How Brains Make Up Their Minds. Columbia University Press, New York (2001)

  6. Wallace, B., Ross, A., David, J., Anderson, T. (eds.): The Mind, The Body and the World: Psychology after Cognitivism? Imprint Academic, Exeter (2007)

  7. Spivey, M.: The Continuity of Mind. Oxford University Press, Oxford (2007)

  8. Miller, G.A., Galanter, E.H., Pribram, K.H.: Plans and the Structure of Behavior. Holt, New York (1960)

  9. Newell, A., Simon, H.A.: Computer science as empirical inquiry: symbols and search. Commun. ACM 19, 113–126 (1976)

  10. Fodor, J.A., Pylyshyn, Z.W.: Connectionism and cognitive architecture. Cognition 28, 3–71 (1988)

  11. Newell, A.: Unified Theories of Cognition. Harvard University Press, Cambridge (1990)

  12. Pinker, S.: How the Mind Works. Norton, New York (1997)

  13. Gallistel, C.R., King, A.P.: Memory and the Computational Brain: Why Cognitive Science will Transform Neuroscience. Wiley-Blackwell, Malden (2008)

  14. Rosenblatt, F.: The perceptron: a probabilistic model for information storage and organization in the brain. Psychol. Rev. 65, 386–408 (1958)

  15. Feldman, J.A., Ballard, D.H.: Connectionist models and their properties. Cogn. Sci. 6, 205–254 (1982)

  16. Hopfield, J.J.: Neural networks and physical systems with emergent collective computational abilities. Proc. Natl. Acad. Sci. U.S.A. 79, 2554–2558 (1982)

  17. Rumelhart, D.E., McClelland, J.L., et al.: Parallel Distributed Processing: Explorations in the Microstructure of Cognition. MIT Press, Cambridge (1986)

  18. Smolensky, P.: On the proper treatment of connectionism. Behav. Brain Sci. 11(1), 1–23 (1988)

  19. Churchland, P.S., Sejnowski, T.J.: The Computational Brain. MIT Press, Cambridge (1992)

  20. O’Reilly, R.C., Munakata, Y.: Computational Explorations in Cognitive Neuroscience: Understanding the Mind by Simulating the Brain. MIT Press, Cambridge (2000)

  21. Rogers, T.T., McClelland, J.L.: Semantic Cognition: a Parallel Distributed Processing Approach. MIT Press, Cambridge (2006)

  22. Dale, R.: The possibility of a pluralist cognitive science. J. Exp. Theor. Artif. Intell. 20(3), 155–179 (2008)

  23. Edelman, S.: On the nature of minds, or: truth and consequences. J. Exp. Theor. Artif. Intell. 20(3), 181–196 (2008)

  24. Fodor, J.A.: LOT 2: the Language of Thought Revisited. Oxford University Press, Oxford (2008)

  25. Pylyshyn, Z.W.: Computation and Cognition. MIT Press, Cambridge (1984)

  26. Shagrir, O.: Why we view the brain as a computer. Synthese 153(3), 393–416 (2006)

  27. Piccinini, G.: Computing mechanisms. Philos. Sci. 74(4), 501–526 (2007)

  28. Piccinini, G.: Computational modeling vs. computational explanation: is everything a Turing machine, and does it matter to the philosophy of mind? Australas. J. Philos. 85(1), 93–115 (2007)

  29. Piccinini, G.: Computationalism, the Church–Turing thesis, and the Church–Turing fallacy. Synthese 154(1), 97–120 (2007)

  30. Piccinini, G.: Some neural networks compute, others don’t. Neural Netw. 21(2–3), 311–321 (2008)

  31. Piccinini, G.: Computers. Pac. Philos. Q. 89(1), 32–73 (2008)

  32. Piccinini, G.: Computation without representation. Philos. Stud. 137(2), 205–241 (2008)

  33. Piccinini, G.: Computationalism in the philosophy of mind. Philos. Comp. 4(3), 515–532 (2009)

  34. Piccinini, G.: The mind as neural software? Understanding functionalism, computationalism, and computational functionalism. Philos. Phenomenol. Res. 81(2) (2010)

  35. Piccinini, G.: The resilience of computationalism. Philos. Sci. (2010, in press)

  36. Scarantino, A.: A theory of probabilistic information. Unpublished manuscript

  37. Scarantino, A., Piccinini, G.: Information without truth. Metaphilosophy 41, 313–330 (2010)

  38. Edelman, S.: Computing the Mind: How the Mind Really Works. Oxford University Press, Oxford (2008)

  39. Shannon, C.E.: A mathematical theory of communication. Bell Syst. Tech. J. 27, 379–423, 623–656 (1948)

  40. Turing, A.: On computable numbers, with an application to the Entscheidungsproblem. Proc. Lond. Math. Soc., ser. 2, 42, 230–265 (1936–37)

  41. Wiener, N.: Cybernetics: or Control and Communication in the Animal and the Machine. MIT Press, Cambridge (1948)

  42. McCulloch, W.S.: The brain as a computing machine. Electr. Eng. 68, 492–497 (1949)

  43. McCulloch, W.S., Pitts, W.H.: A logical calculus of the ideas immanent in nervous activity. Bull. Math. Biophys. 5, 115–133 (1943)

  44. Dayan, P., Abbott, L.F.: Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. MIT Press, Cambridge (2001)

  45. Eliasmith, C., Anderson, C.H.: Neural Engineering: Computation, Representation, and Dynamics in Neurobiological Systems. MIT Press, Cambridge (2003)

  46. Dretske, F.: Knowledge and the Flow of Information. Blackwell, Oxford (1981)

  47. Millikan, R.G.: Varieties of Meaning. MIT Press, Cambridge (2004)

  48. Floridi, L.: Is semantic information meaningful data? Philos. Phenomenol. Res. 70(2), 351–370 (2005)

  49. Cohen, J., Meskin, A.: An objective counterfactual theory of information. Australas. J. Philos. 84, 333–352 (2006)

  50. Piccinini, G.: Functionalism, computationalism, and mental contents. Can. J. Philos. 34(3), 375–410 (2004)

  51. Baum, E.B.: What is Thought? MIT Press, Cambridge (2004)

  52. Churchland, P.M.: Neurophilosophy at Work. Cambridge University Press, Cambridge (2007)

  53. Church, A.: An unsolvable problem in elementary number theory. Am. J. Math. 58, 345–363 (1936)

  54. Kleene, S.C.: Introduction to Metamathematics. Elsevier, New York (1952)

  55. Copeland, B.J.: Narrow versus wide mechanism: including a re-examination of Turing’s views on the mind-machine issue. J. Philos. XCVI(1), 5–32 (2000)

  56. Wolfram, S.: A New Kind of Science. Wolfram Media, Champaign (2002)

  57. Lloyd, S.: Programming the Universe: a Quantum Computer Scientist Takes on the Cosmos. Knopf, New York (2006)

  58. Pitowsky, I.: The physical Church thesis and physical computational complexity. Iyyun 39, 81–99 (1990)

  59. Jilk, D.J., Lebiere, C., O’Reilly, R.C., Anderson, J.R.: SAL: an explicitly pluralistic cognitive architecture. J. Exp. Theor. Artif. Intell. 20, 197–218 (2008)

  60. Gödel, K.: On undecidable propositions of formal mathematical systems. In: Davis, M. (ed.) The Undecidable, pp. 41–71. Raven, Hewlett (1934)

  61. Post, E.: Finite combinatory processes - formulation I. J. Symb. Log. 1, 103–105 (1936)

  62. Davis, M.D., Sigal, R., Weyuker, E.J.: Computability, Complexity, and Languages. Academic Press, Boston (1994)

  63. Piccinini, G.: The first computational theory of mind and brain: a close look at McCulloch and Pitts’s “Logical calculus of ideas immanent in nervous activity”. Synthese 141(2), 175–215 (2004)

  64. Penrose, R.: Shadows of the Mind. Oxford University Press, Oxford (1994)

  65. Gerard, R.W.: Some of the problems concerning digital notions in the central nervous system. In: Foerster, H.v., Mead, M., Teuber, H.L. (eds.) Cybernetics: Circular Causal and Feedback Mechanisms in Biological and Social Systems. Transactions of the Seventh Conference, pp. 11–57. Macy Foundation, New York (1951)

  66. Rubel, L.A.: The brain as an analog computer. J. Theor. Neurobiol. 4, 73–81 (1985)

  67. Pour-El, M.B.: Abstract computability and its relation to the general purpose analog computer (some connections between logic, differential equations and analog computers). Trans. Am. Math. Soc. 199, 1–28 (1974)

  68. Fodor, J.A.: The mind-body problem. Sci. Am. 244, 124–132 (1981)

  69. Block, N.: Troubles with functionalism. In: Savage, C.W. (ed.) Perception and Cognition: Issues in the Foundations of Psychology, vol. 6, pp. 261–325. University of Minnesota Press, Minneapolis (1978)

  70. Baars, B.J., Banks, W.P., Newman, J.B. (eds.): Essential Sources in the Scientific Study of Consciousness. MIT Press, Cambridge (2003)

  71. Harman, G.: Thought. Princeton University Press, Princeton (1973)

  72. Fodor, J.A.: The Language of Thought. Harvard University Press, Cambridge (1975)

  73. Marr, D.: Vision. W.H. Freeman, San Francisco (1982)

  74. Bechtel, W., Abrahamsen, A.: Connectionism and the Mind: Parallel Processing, Dynamics, and Evolution in Networks. Blackwell, Malden (2002)

  75. Koch, C.: Biophysics of Computation: Information Processing in Single Neurons. Oxford University Press, New York (1999)

  76. Marr, D., Poggio, T.: Cooperative computation of stereo disparity. Science 194(4262), 283–287 (1976)

  77. Smolensky, P., Legendre, G.: The Harmonic Mind: From Neural Computation to Optimality-Theoretic Grammar, vol. 1: Cognitive Architecture; vol. 2: Linguistic and Philosophical Implications. MIT Press, Cambridge (2006)

  78. Horgan, T., Tienson, J.: Connectionism and the Philosophy of Psychology. MIT Press, Cambridge (1996)

  79. Thorndike, E.: The Fundamentals of Learning. Columbia University Press, New York (1932)

  80. Hebb, D.: The Organization of Behavior: a Neuropsychological Theory. Wiley, New York (1949)

  81. Fodor, J.A.: The Modularity of Mind. MIT Press, Cambridge (1983)

  82. Turing, A.M.: Intelligent machinery (1948). In: Ince, D. (ed.) Mechanical Intelligence, pp. 117–127. North-Holland, Amsterdam (1992)

  83. Copeland, B.J., Proudfoot, D.: On Alan Turing’s anticipation of connectionism. Synthese 113, 361–377 (1996)

  84. Trehub, A.: The Cognitive Brain. MIT Press, Cambridge (1991)

  85. Marcus, G.F.: The Algebraic Mind: Integrating Connectionism and Cognitive Science. MIT Press, Cambridge (2001)

  86. Miller, G.A.: Language and Communication. McGraw-Hill, New York (1951)

  87. Minsky, M.: Semantic Information Processing. MIT Press, Cambridge (1968)

  88. Fano, R.M.: Transmission of Information: a Statistical Theory of Communications. MIT Press, New York (1961)

  89. Pierce, J.R.: An Introduction to Information Theory. Dover Publications, New York (1980)

  90. Smith, J.M.: The concept of information in biology. Philos. Sci. 67(2), 177–194 (2000)

  91. Godfrey-Smith, P.: On the theoretical role of “genetic coding”. Philos. Sci. 67, 26–44 (2000)

  92. Griffiths, P.E.: Genetic information: a metaphor in search of a theory. Philos. Sci. 68(3), 394–412 (2001)

  93. Bradbury, J.W., Vehrencamp, S.L.: Economic models of animal communication. Anim. Behav. 59(2), 259–268 (2000)

  94. Hartley, R.V.: Transmission of information. Bell Syst. Tech. J. 7, 535–563 (1928)

  95. Baddeley, R., Hancock, P., et al. (eds.): Information Theory and the Brain. Cambridge University Press, Cambridge (2000)

  96. Grice, P.: Meaning. Philos. Rev. 66, 377–388 (1957)

  97. Struhsaker, T.T.: Auditory communication among vervet monkeys (Cercopithecus aethiops). In: Altmann, S.A. (ed.) Social Communication Among Primates, pp. 281–324. University of Chicago Press, Chicago (1967)

  98. Seyfarth, R.M., Cheney, D.L.: Meaning and mind in monkeys. Sci. Am. 267, 122–129 (1992)

  99. Scarantino, A.: Shell games, information, and counterfactuals. Australas. J. Philos. 86(4), 629–634 (2008)

  100. Fisher, R.: The Design of Experiments. Oliver and Boyd, Edinburgh (1935)

  101. Li, M., Vitányi, P.: An Introduction to Kolmogorov Complexity and its Applications, 2nd edn. Springer, New York (1997)

  102. Winograd, S., Cowan, J.D.: Reliable Computation in the Presence of Noise. MIT Press, Cambridge (1963)

  103. Margolis, E., Laurence, S. (eds.): Concepts: Core Readings. MIT Press, Cambridge (1999)

  104. Quine, W.V.O.: Word and Object. MIT Press, Cambridge (1960)

  105. Stich, S.: From Folk Psychology to Cognitive Science. MIT Press, Cambridge (1983)

  106. Egan, F.: A modest role for content. Stud. Hist. Philos. Sci. (2010, in press)

  107. Rupert, R.D.: Causal theories of mental content. Philos. Comp. 3(2), 353–380 (2008)

  108. Dretske, F.: Explaining Behavior. MIT Press, Cambridge (1988)

  109. Barwise, J., Seligman, J.: Information Flow: The Logic of Distributed Systems. Cambridge University Press, Cambridge (1997)

  110. Fodor, J.: A Theory of Content and Other Essays. MIT Press, Cambridge (1990)

  111. Millikan, R.G.: Language, Thought, and Other Biological Categories. MIT Press, Cambridge (1984)

  112. Harman, G.: (Nonsolipsistic) conceptual role semantics. In: LePore, E. (ed.) New Directions in Semantics, pp. 55–81. Academic Press, London (1987)

  113. Papineau, D.: Reality and Representation. Blackwell, Oxford (1987)

  114. Grush, R.: The emulation theory of representation: motor control, imagery, and perception. Behav. Brain Sci. 27(3), 377–442 (2004)

  115. Ryder, D.: SINBAD neurosemantics: a theory of mental representation. Mind Lang. 19(2), 211–240 (2004)

Acknowledgements

An abbreviated ancestor of some portions of this paper appeared in G. Piccinini and A. Scarantino, “Computation vs. information processing: why their difference matters to cognitive science,” Stud. Hist. Philos. Sci. (2010, in press). Thanks to our audience at the 2009 meeting of the Southern Society for Philosophy and Psychology and to Ken Aizawa, Sonya Bahar, Mark Collier, Carrie Figdor, Robert Gordon, Corey Maley, Alex Morgan, Brad Rives, Martin Roth, Anna-Mari Rusanen, Dan Ryder, Oron Shagrir, Susan Schneider, Eric Thomson, and several anonymous referees for helpful comments and encouragement. Special thanks to Neal Anderson for his extensive and insightful comments. Thanks to Matthew Piper and James Virtel for editorial assistance. This material is based upon work supported by the National Science Foundation under grant no. SES-0924527 to Gualtiero Piccinini.

Author information

Correspondence to Gualtiero Piccinini.


Cite this article

Piccinini, G., Scarantino, A. Information processing, computation, and cognition. J Biol Phys 37, 1–38 (2011). https://doi.org/10.1007/s10867-010-9195-3
