
Cognitive Gadgets

Cognitive Gadgets: The Cultural Evolution of Thinking, by Cecilia Heyes, The Belknap Press of Harvard University Press, Cambridge, Massachusetts and London, England, 2018, 292 pp. (cloth)

(To be published in The European Legacy in 2019)


Cecilia Heyes is a “Cultural Evolutionary Psychologist”, which is to be distinguished from an “Evolutionary Psychologist”, though the two theoretical perspectives have a good deal in common. Her own overall perspective is so succinctly summarized at the beginning of the book that I hope I will be forgiven for quoting from it at some length here, as that will facilitate the subsequent discussion:

“We humans have created not just physical machines – such as pulleys, traps, carts, and internal combustion engines – but also mental machines; mechanisms of thought, embodied in our nervous systems, that enable our minds to go further, faster, and in different directions than the minds of other animals. These distinctively human cognitive mechanisms include causal understanding, episodic memory, imitation, mindreading, normative thinking, and many more. They are “gadgets”, rather than “instincts”, because, like many physical devices, they are products of cultural rather than genetic evolution. New cognitive mechanisms – different ways of thinking – have emerged, not by genetic mutation, but by innovations in cognitive development. These novelties have been passed on to subsequent generations, not via genes, but through social learning: people with a new cognitive mechanism passed it on to others through social interaction. And some of the new ways of thinking have spread through human populations, while others have died out, because the holders had more “students”, not just more “babies”.” (pp. 1-2) Yet later she writes:

“… cognition, as in other biological systems, there are no pure cases of nature or of nurture; no biological characteristic is caused only by “the genes” or only by “the environment”.” (p. 24)

The debate over the genetics/culture dichotomy is fascinatingly undertaken in the chapter on language. As Heyes puts it:

“Language is often described as a Rubicon, a shining threshold in the evolution of human cognition. Ancient legend in many cultures suggests that, once the language boundary was crossed, the minds and lives of our ancestors were forever transformed. Now capable of abstract thought and subtle communication, we became radically different from all other animals – more like gods than beasts…

“The genetic evolutionary account of the origins of language, rooted in the work of Chomsky, (contrasts) with a cultural evolutionary account based on the “constructivist” approach to language… (but) no one doubts that a wide range of genetically inherited resources are needed for language learning including memory capacity…, perceptual and motor skills, and dedicated vocal apparatus” (pp. 169-171), as well as social learning, imitation, and mindreading.

Heyes' all-embracing theory of learning allows for a deep and sophisticated analysis of language learning. Her theoretical conclusions are always underpinned by reference to evidence from empirical experimental research, which is enormous in quantity. One would have to be a specialist in these fields to assess in detail Heyes' theoretical conclusions in the light of such an immense body of empirical evidence.

Without demonstrable links between empirical research findings and theoretical conclusions, the results of any kind of scientific endeavour become speculative and conceptually abstract. I must therefore underline that, without such certainties, the whole exercise is a remarkable exploration of an immensely complex field, about which, however, the non-specialist is not well equipped to make ultimate judgements. Specialists, on the other hand, are evidently in considerable dispute with one another about both the empirical evidence and the conceptual structures necessary to grasp the reality.

Nevertheless, the chains of reasoned argument provided by Cecilia Heyes are convincing at nearly all points. I will draw out some of these points as best I can.

As an example of the differing judgements of specialists, we read that for Heyes the Chomskyan view “of what constitutes a linguistic universal makes it impossible to test for linguistic universals in any way that a cognitive scientist would recognize.” (p. 180) Nevertheless she also says: “The critical period debate (about whether an individual's language learning is scheduled by the genes) has not yielded evidence supporting the cultural over the genetic account of language.” (p. 183) Now consider these arguments against the suggestion that language learning is a critical period phenomenon, genetically scheduled:

“First, it has become evident since the 1960s that, even in the paradigmatic cases involving imprinting and birdsong, transitions from ease to difficulty of learning are typically experience-dependent rather than genetically programmed. Second, it is far from clear what the reproductive advantage of switching off Universal Grammar at puberty could be, especially given signs that bi- or multi-lingualism may be typical of human populations.” (p. 182) These are two most interesting points: one an empirical observation, the other a speculative question.

Let me follow another tack explored by Heyes concerning language learning. She suggests that language enlists a wide range of neural structures, while each of the brain areas involved in language processing has many other, non-linguistic functions. (p. 184) But the discovery that language, rather than being localized, depends on scattered, multi-functional brain areas does not undermine the genetic account of the evolution of language, she avers. Localization of linguistic function is repeatedly claimed as supporting the genetic account over the cultural account, but for Heyes it is not clear why this was ever thought to be the case. I could continue at length with this dispute. Heyes herself remains agnostic in the debate between the genetic and cultural accounts of language.

Something similar is to be seen in the argument about whether human imitation is a cognitive instinct or whether it can be changed by sensorimotor experience. In this case, however, Heyes is convinced by currently accepted empirical evidence that the latter is correct, and she therefore regards it as supporting the cognitive gadget theory of imitation.

I find the enormously broad bird's-eye view of all these issues, so abstract and theoretical, though so consistently and rationally argued through by Cecilia Heyes, bewildering on a certain level. I would have loved to see some historical and pre-historical discussion of the emergence of actual, empirical human languages earlier in the book, for example, in order to link the hugely abstract, speculative realm of conceptualization to actually known facts of pre-historical or historical developments in language, literacy, reading, and so on. But perhaps such facts do not exist in sufficiently exact and trustworthy form? One must wait until the last chapter, “Cultural Evolutionary Psychology”, for a discussion of pre-historical and historical issues.

Heyes concedes that the cognitive gadgets theory needs to be connected to key events in human evolution, using the archaeological record, and that this is a priority for future research; this reinforced the feeling that had grown in my mind throughout my reading of the book up to that point.

But the very interesting discussion in the short section titled “A Little History” did not, I felt, resolve very much. Again I must quote a reasonably long passage, because it is vital to what we are talking about and summarizing it could not make it shorter:

“Simple stone tools began to be used by hominins about 3.2 million years ago (MYA). More complex Acheulian tools appeared about 1.7 MYA, and there are early signs of hafted and blade-like tools about 250 thousand years ago (KYA). Thus, the archaeological record suggests that hominin technology – and, by implication, ways of living – changed relatively little over a three million year period. Then something big happened. Until recently, it was the received view that Homo sapiens underwent an abrupt upgrade in cognitive sophistication – becoming “behaviourally modern” – between fifty and forty KYA, and that this “Upper Palaeolithic Revolution” was driven by genetic change (Mellars, 1989; 2005; Mellars and Stringer, 1989). In the last twenty years, this view has been undermined by signs that the light of cognitive complexity was not suddenly switched on in one place at one time (McBrearty and Brooks, 2000). Instead, there was “flickering”. Over a 100-200 KY period to forty KYA, various signs of cognitive sophistication – for example, evidence of an expanded trade network, bow and arrow technology, jewelry, and ornaments – appeared and then disappeared in different regions of Africa.” (p. 211)

The speculative arguments that follow this, fascinating though they are, do not to my mind resolve the issues concerning the relative importance of genetic or cultural evolution at all. And the author herself asserts, a few pages before the end of the book, that: “Cognitive gadgets theory… suggests that human cognitive mechanisms are shaped primarily by cultural evolution… but I have no illusions that the case is already conclusive. A great deal more work is needed… to develop a deeper understanding of the origins and operating characteristics of human minds.” (p. 219)

But the author raises the very significant suggestion that perhaps over the 100-200 KY prior to forty KYA there began climate-driven demographic changes “that enabled gene-culture co-evolution to get off the ground”. (p. 211) Previously, individual hominins may have been smart enough, possessing the genetically given psychological capabilities necessary for cultural evolution, but because social groups were too small and weakly connected, adaptive innovations were less likely to be passed on. Knowledge of how to make a new tool or tie a particular knot, for example, might easily become extinct. But around 250 KYA human beings started living in larger and better connected bands, and cultural evolution “began to work its magic, albeit in a faltering, flickering way, due to regional and global disturbances that sometimes forced social groups to disperse.” (p. 212) However, might not increases in the sizes of social groups, with greater social and physical connections, increase the rate of genetic evolution within them just as much as the rate of cultural evolution, over this kind of time-span?

In Society in Prehistory: The Origins of Human Culture, a book written a little more than twenty years ago, Tim Megarry wrote:

“… The simple opposition between nature and nurture… largely misconceived since no final distinction or boundary dividing nature from culture can be drawn. The contention that the greater part of human behaviour is learned is not contradicted by the fact that learning occurs only by virtue of a set of biological and psychological mechanisms that have evolved with the development of human and pre-human society.”

I am not sure whether Cecilia Heyes agrees with this formulation or not: consider the debate over the genetic assimilation hypothesis. She writes that “some researchers assume that, even if new cognitive mechanisms are produced by learning in a culture-soaked environment, they will later become genetically assimilated… they may start out as cognitive gadgets, constructed… through social interaction, but then selection will progressively favour genetic mutations that reduce the experience-dependence of the gadgets' development, converting them into cognitive instincts.” (p. 207) Her own view is that the evidence from cognitive science does not support the genetic assimilation hypothesis, but at the present stage of our scientific knowledge it seems uncertain whether we can definitively know, in every case, one way or the other. She suggests, for example, that the absence, in her view, of evidence that identical twins are more alike in imitative ability than fraternal twins argues against the probability of genetic assimilation of imitative ability.

Heyes advances the credible idea that genetic assimilation of culturally transmitted cognitive gadgets, such as the capacity for imitation, is unlikely on current evidence because such mechanisms track targets that move too fast for genetic evolution, though she describes this modestly as her “guess”. Distinctively human cognitive mechanisms need to be nimble, “capable of changing faster than genetic evolution allows, because their job is to track specific, labile features of the environment.” (p. 208) But how fast is “too fast” for genetic assimilation: how “nimble” must a human cognitive mechanism be to be too nimble to become genetically assimilated, and under what varying circumstances?

Cecilia Heyes' book is admirably both multidisciplinary and interdisciplinary, but convincing evidence for one or another hypothesis from the knowledge of prehistory and history seems to this reviewer to be strikingly lacking, and one must wait to see whether future research in these fields will, or can, provide such evidence, as the author clearly hopes it will.

Another fascinating angle on thinking and learning that Cecilia Heyes opens up is that of “mindreading”. This is connected to the human development of morality and to the distinctively human emotions of shame and guilt. Yet anyone who has owned a dog that knows he or she has been “naughty” has surely seen a non-human variety of these forms of “thought” or emotion. Heyes indicates this herself when she speaks of the differences between human and non-human animal minds.

Mindreading involves Cecilia Heyes' wonderfully expressed idea of “thinking about thinking”. Mindreading, for Heyes, is when someone works out “what another agent is thinking or feeling right now”. (p. 144) It is a “special ingredient” of teaching, for both the taught and the teacher. According to her: “Like print reading, mindreading involves the derivation of meaning from signs. In print reading, the signs are usually marks on paper, and their meaning relates to objects and events in the world. In mindreading, the signs are facial expressions, body movements, and utterances – and their meaning relates to the actor's mental states.” (p. 148)

At the end of the book, Heyes reiterates some considerations discussed earlier in this review: “If the adaptiveness of cognitive mechanisms can be due not only to genetic evolution, but to cultural selection, the mere fact that a mechanism is adaptive… does not amount to evidence that the mechanism was shaped by genetic evolution. To be convinced that a mechanism is a cognitive instinct, we need positive evidence of genetic involvement”. (p. 222)

But to me the thrust of this enormously thought-provoking book is something else. I started my adult educational life as a sociology student, and for a long time I thought that biology explained the “origins of the human species” genetically, but that thereafter it all became a matter of cultural-historical “development” (the word “evolution” implying biological-genetic change, which is far too slow to enter into “human history”). Gradually, as the time-periods of human evolution became clearer, and as the scientific understanding of genetics and culture became, let us hope, better, those earlier certainties melted away – though I was never an adherent of “sociobiology”, which tried to leap directly from genes to particular social behaviours and appeared to me to be entirely wrong. But now I am not at all sure whether we are yet able to discern all the differences between “genetic” and “cultural” processes in every case within the phenomena of humanity, and I even wonder whether the desire to do so is not perhaps an example of the “reification” of thought into rigid, fixed concepts or categories, rather than allowing for a fluid, dynamic, dialectical sense of endless movement, mutual entailment, and interaction between two apparent poles.

I feel something similar is at issue with Heyes' assertion that “although we now know that some versions of the nature-nurture debate were deeply misguided, it is important to discover, for any particular feature of human cognition, the ways and extent to which the feature is shaped by: (1) genetically inherited information; (2) culturally inherited information, and (3) information derived directly from the environment in the course of development.” (p. 3) Surely the second and third of these cannot be rigidly separated: must not culturally inherited information inevitably be immensely significant to the ways in which information is derived directly from the environment?

The same issue is discussed by the author later in the book in a slightly different way:

“Asocial and social cognitive mechanisms deliver better living conditions in different ways. Asocial cognitive mechanisms promote the discovery of strategies and technologies to harness and defend natural resources (for example, causal understanding). Social cognitive mechanisms both enable these strategies and technologies to be learned by others, thereby promoting… cultural learning and cooperation among group members.” (p. 202) Perhaps the precise conceptual distinctions involved here help thinking about the processes, but again the two must surely be thoroughly intermingled in reality.

Let us turn now to some comments made early in the book:

“All scientific inquiry about human distinctiveness aims to explain the manifest differences between our lives and the lives of other animals… Some research focuses on our bodies, for example, tracing the effects of bipedalism and our remarkable manual dexterity. Other work zeros in on the brain… and that certain parts have expanded more than others in the course of human evolution. A third focus is on behaviour… use of tools, or control of fire. The final focus… is on the mind… to identify the mental processes, or ways of thinking, that make humans special…

“Imagine someone hunting in the wilderness with a spear. It may be possible, with the help of immensely complex mathematical models, to document what typically happens in the hunter's brain whenever there is a change in the pattern of light entering his eyes as he scans the horizon. It may even be possible to correlate these light-related changes in neural firing with the hunter's behaviour… But this huge, complicated matrix of inputs, brain activities, and outputs would not make sense of the hunter's action… unless it were translated into mental terms… a description of what the hunter “sees”, “misses”, “wants”, and “knows”… without abstraction of the kind that mental terms provide, behavioural science and neuroscience provide information without insight, and precision without predictive power”. (p. 10)

These kinds of observation seem to me to lie at the heart of the completely interdisciplinary science that is needed to understand who we humans are. And of course, understanding mental activity involves analysing structures of economy, society, polity, and culture.



Tim Cloudsley





Cloudsley, Tim, MA; British independent academic researcher and writer, poet, essayist, and short story writer resident in Colombia; formerly lecturer in Sociology at Heriot-Watt University, Edinburgh, Scotland.
