An alien observer of human cognitive development would be struck by a fact he might be tempted to describe as paradoxical. This is that in the first five years or so of life development is rapid and impressive while subsequent learning tends to be slow and laborious. The typical five-year-old already has excellent sensory awareness of the world, a mature language, and a fully functioning conceptual scheme—all without apparent effort. They may be small, but they are smart. The reason for this precocity, we conjecture, is that much of what they have achieved by that age is the unfolding of an innate program or set of programs: all this cognitive sophistication is written into the genes awaiting read-out. It is not picked up by diligent inspection of the environment. It comes quickly because it was already present in substantial outline. Thereafter the child must learn things the hard way—by learning them. Hence school, memorization, studying, instruction, concentration. Knowledge becomes willed, while before it was unwilled, spontaneous, given. Cognitive development turns into work.
It could be otherwise for our alien observers: they are accustomed to school virtually from birth, because their children are born knowing practically nothing. They learn language by painstaking instruction, having no innate grammar; concepts are acquired by something called “deliberate abstraction”, which is arduous and time-consuming; even their senses need years to get honed into something usable. They don’t reach the cognitive level of a typical human five-year-old till the age of fifteen. Empiricism is true of them, and it takes time and effort. However, they have excellent memories and powers of concentration, as well as an aversion to play, so their later cognitive development is rapid and smooth: they are superior to college-educated humans by the age of seventeen and they go on to spectacular intellectual achievements in later life, vastly outstripping human adults. They are slow at first, given the paucity of their innate endowments, but quick later, while humans are quick at first but slow later (our memory is weak and our powers of concentration lamentable). To the alien observers this seems strange, almost paradoxical: why start so promisingly and then lapse into mediocrity? We seem to lose that spark of genius that characterized the first few years of life, while they continue to gain in intellectual strength. That’s just the way the two species are cognitively set up: an initial large genetic boost for us, and a virtual blank slate for them (but excellent capacities of attention, memory, and studiousness). Our five-year-olds outshine theirs, but their adults put ours to shame.
I tell this story to highlight an important point about the human capacity for knowledge—an existential point. The existentialists thought that freedom was the essence of human nature, conditioning many aspects of our lives, individual and social; but a case can be made that human knowledge plays a similar life-determining role. For we suffer under a fundamental ambivalence about knowledge, which is to say about our cognitive nature (which is not confined to non-affective parts of our lives). We are simultaneously very good at knowledge and quite poor at it. Some things come to us naturally and smoothly, especially in our earliest experience (pre-school); but other things tax us terribly, calling for intense effort and leading to inevitable frustration. Rote memory becomes the bane of our lives. Examinations loom over us. School is experienced as a kind of prison. Calculus is hard. History refuses to stick. Geography is boring. What happened to that earlier facility when everything came so easily? We were all equal then, but now we must compete with each other to achieve good test results, which determine later success in life. We seem to go from genius to dunce overnight. Imagine if you could remember your earlier successes and compare them with your current travails: it was all so easy and enjoyable then, as the innate program unfurled itself, but now the daily need to absorb new material has become trial and tribulation. Getting an education is no cakewalk. Wouldn’t it be nice if it could just be uploaded into your brain as you slept, as your genes uploaded all that innate information? It’s like a lost paradise, a heavenly pre-existence (shades of Plato), with school as the fall from blessedness. You are condemned to feel unintelligent, a disappointment, an intellectual hack. 
Maybe you will make your mark in society by dint of great effort and a bit of luck, but you are still a member of a species that has to struggle for knowledge, for which knowledge is elusive and hard-won. Suppose you had to live in a society in which those late-developing aliens also lived: they would make you look like a complete ignoramus, an utter nincompoop—despite their initial slow start.
A vice to which human beings are particularly prone is overestimating their claims to knowledge. It is as if they need to do this—it serves some psychic purpose. Reversion to childhood would be one hypothesis (“epistemic regression”). But the actual state of human knowledge renders it intelligible: within each of us there exists a substantial core of inherited solid knowledge combined with laboriously acquired knowledge, some of it shaky at best. Take our knowledge of language, including the conceptual scheme that goes with it: we are right to feel confident that we have this under control—the skeptic will not meet fertile ground here (I know how to speak grammatically!). Generalizing, we may come to the conclusion that our epistemic skills are well up to par: so far as knowledge is concerned, we are a credit to our species. But this is a bad induction: some of our knowledge is indeed rock solid, but a lot isn’t. Being good at language is not being good at politics or medicine or metaphysics or morals. We are extrapolating from an unrepresentative sample. As young children, our knowledge tends to be well-founded, because restricted to certain areas; but as adults we venture into areas in which we have little inborn expertise, and here we are prone to error, sometimes fantastically so. We know what sentences are grammatical but not what political system is best. But we overestimate our cognitive powers because some of them are exemplary. It would be different if all our so-called knowledge were shaky from the start; then we might have the requisite humility. But our early-life knowledge gives us a false sense of security, which we tend to overgeneralize. We believe we are as clever about everything as we are about some things.
I recommend accepting that we have two sorts of knowledge—that we are split epistemic beings. On the one hand, we have the robust innately given type of knowledge; but on the other hand, we have a rather rickety set of aptitudes that we press into service in order to extend our innately given knowledge. Science and philosophy belong to the latter system. Thus they developed late in human evolution, are superfluous to survival, and are grafted on by main force, not biological patrimony. There is no established name for this distinction between types of knowledge, though it seems real enough, and I can’t think of anything that really captures what we need; still, it is a distinction that corresponds to an important dimension of human life—an existential fact. We are caught between an image of ourselves as epistemic experts and a contrasting image of epistemic amateurishness. We are not cognitively unified. We have a dual nature. We are rich and poor, advantaged and disadvantaged. Other animals don’t suffer from this kind of divide: they don’t strive to extend their knowledge beyond what comes naturally to them. Many learn, but they don’t go to school to do it. They don’t get grades and flunk exams and read books. Reading is in some ways the quintessential human activity—an artificial way to cram your brain with information not given at birth or vouchsafed by personal experience. Reading is hard, unnatural, and an effort. It is an exercise in concentration management. We may come to find it enjoyable, but no one thinks it is a skill acquired without training and dedication (and reading came late in the human story). It is also fallible. And it hurts your eyes. This is your secondary epistemic system in operation (we could label the types of knowledge “primary knowledge” and “secondary knowledge” just to have handy names).
Animals are not divided beings in this way (they do not lament their inability to read); nor do they apprehend themselves as so divided. But we are well aware of our dual nature, and we chafe at it (as the existentialists say that we chafe at the recognition of our freedom). We wish we could return to epistemic Eden, when knowledge came so readily; but we are condemned to conscious ignorance, with little inroads here and there—we are aware of our epistemic limits and foibles. We know how much we don’t know and how hard it would be to know it (think of remote parts of space). We know, that is, that we fall short of an ideal. We can’t even remember names and telephone numbers! Yet our knowledge of convoluted grammatical constructions is effortless. If we are that good at knowledge, why are we so bad? Skepticism is just the extreme expression of what we all know in our hearts—that we leave a lot to be desired from an epistemic point of view. We are both paragons and pariahs in the epistemic marketplace. In some moods we celebrate our epistemic achievements, in others we rue our epistemic failures. The reason is that we are genuinely split, cognitively schizoid. Perhaps in the prehistoric world the split was not so evident, in those halcyon hunter-gatherer days, before school, writing, and transmissible civilization; but modern humans, living in large organized groups, developing unnatural specialized skills, have the split before their eyes every day—the specter of the not-known. We thus experience epistemic insecurity, epistemic neurosis, and epistemic anxiety. Our self-worth is bound up with knowledge (“erudite” is not a pejorative). It is as if we contain an epistemic god (already manifest by age 5) existing side by side with an epistemic savage: the high and the low, the ideal and the flawed. I don’t mean that we shouldn’t value what we acquire with the secondary system, or that it isn’t really knowledge, just that it contrasts sharply with the primary system.
The secondary system might never have existed, in which case no felt disparity would have existed; but with us as we are now we cannot avoid the pang of awareness that our efforts at knowledge are halting and frequently feeble. The young child does not suffer from epistemic angst, but the adult has epistemic angst as a permanent companion. School is the primary purveyor of that angst today. Education is thus a fraught venture, psychologically speaking, in which our dual nature uneasily plays itself out. The existentialists stressed the agony of decision, but there is also the agony of ignorance (Hamlet is all about this subject, as is Othello).
Freud contended that the foundations of psychic life are laid down in the first few years of life (and sex, not freedom or knowledge, is the dominant theme), shaping everything that comes later. The stage was set and then the drama played out. I am suggesting something similar: the first few years of cognitive life lay down the foundations, and they are relatively trouble-free. Knowledge grows in the child quite naturally and spontaneously without any strenuous effort or difficulty. Only subsequently does the acquisition of knowledge become a labor, calling upon will power and explicit instruction. We might view this transition, psychoanalytically, as a kind of trauma: from ease to unease, from self-confidence to self-doubt. Who would have thought knowledge could be so hard! Compare acquiring a first language with learning a second language: so effortless the first time, so demanding the second. What happened? Now learning has become a chore and a trial. It is a type of fall from grace. The reason we don’t feel the trauma more is that it happens at such an early age (I assume there is no active repression)—though many a child remembers the misery of school. Knowledge becomes fraught, a site of potential distress. Cramming becomes a way of life, a series of tests and trials. But all the while the memory of a happier time haunts us, when knowledge came as easily as the dawn. And then there is death, when all that knowledge comes to nothing—when all the epistemic effort is shown futile. Our divided nature as epistemic beings thus has its significance for how we live in and experience the world. It is not just a matter of bloodless ratiocination.
 I won’t rehearse all the evidence and arguments that have been convincingly given for this conjecture, save to mention the existence of critical periods for learning. Would that such periods could occur during high school mathematics training!
 Of course, we still pick up a lot of information without effort just by being in the world, but for many areas of knowledge something like school is required (this is true even for illiterate tribes).
 Logan Pearsall Smith: “People say that life is the thing, but I prefer reading.”
Is it an accident that one of the prime distinguishing characteristics of God is his omniscience? He knows automatically what we can never hope to know.
 The Internet, with its seemingly infinite resources, drives this point home. It also leads to varied and grotesque deformities in our cognitive lives.
 Here you see me lapsing into weak poetry, as all theorists of the meaning of life must inevitably do. Sartre’s Being and Nothingness is one long dramatic poem: who can forget his puppet-like waiter, or the woman in bad faith whose hand remains limp as her would-be suitor grasps it, or Pierre’s vivid absence from the cafe? My illustrative vignette would feature a bleary-eyed student studying in a gloomy library while recollecting her carefree sunlit days of cheerful effortless knowing.
An internationally acclaimed philosopher and teacher, McGinn was educated at Manchester University (Psychology, BA and MA, 1972) and Oxford University (Philosophy, BPhil, 1974), and went on to teach philosophy at University College London, Oxford University, UCLA, Princeton, and Rutgers. He was a philosophical advisor to George Soros from 2008 to 2013.