Our argument starts not with statistics about secondary schools, or even comments about various political initiatives, but rather with a kind of fable.
Of all the animals in the woodland, surely it is the deer that most excites the human imagination. A peaceful herbivore, the deer’s survival has depended on its ability to sniff out danger and run off to safety faster than any other creature. Over millions of years it has developed the sleekest and most powerful combination of bone structure, muscle, and tendon, making it a veritable icon of animal fitness.
It takes all of two years for the young fawn to learn enough about the art of survival from its mother to live on its own. Once responsible for itself, the young deer has learned not to panic when danger approaches, but to stand stock still so as to attract no attention; to sniff the air for the scent of danger; to hold a leg just off the ground to detect the slightest vibration of an approaching predator; and to flex its ears to pick up the faintest of sounds. All those skills have been perfected by its ancestors over vast periods of time and have become part of the instincts that create the character of a deer. A powerful set of survival skills, it seems – but no longer quite enough.
Setting out on its own as dusk creeps over the woodland, the deer comes upon a clearing of unnaturally level and hard ground. Suddenly, around a corner and approaching at high speed, comes a noisy contrivance sporting two bright headlights. The young deer does everything that its instincts have taught it to do – in an instant it becomes immobile, sniffing the air furiously, sensing the vibrations and testing its muscles for action, but unsure of where to go. Mesmerized by the lights, the young deer remains rooted to the spot a split second too long, and that young prince of animals, the ultimate descendant of an ages-old line of evolution, is killed instantly as it is hit head-on by a car. The car is probably a write-off, and the driver – if he is lucky enough to survive – curses the animal for its “lack of intelligence” in not getting out of his way.
Like the deer, humans too are the result of a long saga of evolutionary adaptations, one that has taken us to the point where we have the intelligence and motor skills to build a car, send a man into outer space, and carry out complicated medical operations using the power of the new nanotechnologies. Because humans have evolved big brains, rather than a deer’s athletic anatomy, our young take far longer to grow up. Unlike the fawn, whose brain is nearly fully formed at birth, human babies are born with brains so premature that two thirds of brain growth happens after birth (an evolutionary compromise made necessary by the narrowness of the birth canal that resulted from our species learning to stand upright). Consequently most human brain growth is shaped not simply by genetics, but by the lessons we draw from real-life experience.
Here is the secret of our phenomenal brain: every baby is born with a variety of inherited predispositions that enable it to internalize real-life experiences so thoroughly that it literally grows its own brain (in a way that the fawn does not), coming to reflect the increasing complexity of the environment in which it finds itself. Thus humans have the capability to adapt, in very quick time, to almost any environment – always provided that they keep every one of their mental antennae alert to further environmental change.
Within the past 300 years (a mere split second on our evolutionary time-scale), our ancestors have developed a range of technologies that have enabled our species to spread out across the globe. There are now some 20 times as many of us as there were when the first steam engine was invented in 1728; ten times as many as there were in 1824, when the first railway engine enabled a man to travel faster than on the back of a horse; and two-and-a-half times as many as there were at the start of World War II, less than seventy years ago. This vastly inflated population (some would say a population waiting for Mother Nature to carry out a savage cull) has only been made possible by turning much of the world’s population into “specialists”, people who so concentrate on the efficient production of the individual components of a machine or a process that they have little understanding of how the various parts connect.
Two hundred years ago, most young people learned about growing up by participating in family farms, businesses, or community projects for which they had to learn how every subpart contributed to the usefulness of the final product. They had to know how things worked, for theirs was a world in which connectivity was obvious – the strength of a chain was well understood to be dependent on its weakest link, just as the speed of a convoy depended on the speed of the slowest ship. But over the last two centuries, ever fewer young people have had this experience. Ours is a world of information saturation, where the power of computers doubles every eighteen months, and it is estimated that the world produces about five exabytes of new information per year (an exabyte is a billion gigabytes). That’s about 37,000 times the amount of information held in the Library of Congress. This brings enormous opportunities: ten years ago who would have thought of “Googling” an old friend and five years ago who would have known what a “wiki” was? But it has also brought problems.
In our search for greater material rewards, we seem to have decided that there is no longer any reason for young people to learn, as did the apprentices of old, by working alongside older people. Instead, and especially in the past 60 years, we have decided that youngsters should spend longer and longer studying in ever greater detail – in theory rather than practice – a single aspect of a sub-component, or a sub-discipline, as defined by somebody else. This, we are told, will enable the wonderful productivity of the present technological world to thrive.
In exchange for what was once the satisfaction gained from a job well done (as when shipyard workers cheered while a boat they had built together over two or three years finally slipped into the water), people are now paid good money for work that may hold little or no intrinsic satisfaction. It gives them a wad of cash to spend in their free time, but not the satisfaction of a job well done. Too many of us don’t even realize how vulnerable this makes us, because we have readily traded away any sense of personal responsibility for the future in exchange for wealth today.
To support this materialistic mindset, education has come to mean doing what you are told and not asking awkward questions. But is that what our brain evolved to do?
Pushing the Evolutionary Envelope
“Cultural speciation” – cultural change requiring people to modify their behaviours and attitudes – proceeds infinitely faster than does “biological speciation”, the development of biological adaptations to changed sets of circumstances. In other words, what we are now expecting from individuals in our so-called advanced culture has far outrun those adaptations inherited from the past which, when properly utilized, streamline the operation of the brain. While the human race is wonderfully empowered by its ancestors, it is certainly constrained as well. We are very adaptable, but not infinitely so. Being driven to live in ways that are utterly uncongenial to our inherited traits simply drives people mad.
In the past 20 or 30 years scientists have learned much about the grain of the brain. We now know that, because of our initial physical vulnerability, we learn a whole raft of skills in the first seven or eight years of our lives through closely imitating the actions of our parents and teachers. Like the young deer’s, a young child’s learning is clone-like. But while it is entirely appropriate that a young fawn should grow up as a mirror of its parents, for a human child to grow up as a clone at a time of rapid cultural and economic environmental change would be nothing short of disastrous. Within our lifetimes, the next generation must be equipped to go where no one has gone before. To equip them to do this, we must not forget the past. But at the same time, we have to recognize that to 21st-century man, the past is only a partial clue to the future.
The brain may, in fact, have evolved to help us. Through extensive functional MRI scans, scientists are now discovering massive structural changes in the adolescent brain, changes that apparently shake the internal mechanisms of a teenage brain to its roots. If this is true – and all the signs suggest that it is – these must be seen as essential evolutionary adaptations that ensure the survival of the human race by forcing teenagers to break away from their parents and teachers. “Get off my back,” adolescents down the ages have pleaded. “Leave me alone. Give me space.” Adolescence is about growing up and no longer thinking like a child. It’s about ceasing to be a clone. Sitting still (if only for part of the time!) may be an appropriate learning environment for the pre-pubescent child, but it is largely inappropriate for adolescents, whose biological predispositions, we now know, urge them to find out things for themselves.
And here is the crux of the present advanced world’s dilemma. Little more than 100 years ago, American psychologists started to define this rebelliousness of adolescence as a disease, an aberration that made teenagers a threat to themselves. Psychologists and educational bureaucrats alike concluded that something had to be done to prevent teenagers from threatening the carefully controlled world that teachers had created.
Educational administrators saw only one answer to this problem: put adolescents into school for longer and longer, and give them so much studying to do that they wouldn’t have the time or energy to question what an adult society was actually doing to them. We’re still doing this today. Policymakers, with little background in neurological processes, expected that, by the age of 22 or 23, the next generation of young people would have been “broken in” to the currently defined way of doing things. Their thinking resembled that of horse breeders who, until very recently, thought it necessary to break in a young foal after it had run relatively wild for two years. Now horse breeders carefully study the temperament of every foal, and then define unique training programs that build upon what each can do naturally. Human adolescents crave and deserve no less. Deep down, there stirs within them the urge to climb the mountains of the mind and see what possibilities lie before them; they are innately “big picture” thinkers and frequently upset older generations by questioning the compromised lives so many of us lead. That is their nature; it is what their brains have evolved to do. It is the apparently unreasonable dreams of adolescence that, years later, drive the progress of what we are proud to call our civilization. It has always been so.
And yet, society has so outlawed the natural rebelliousness of adolescence that most people simply accept the specialized roles that have been created for them and have but a limited capability to look beyond their restricted worldview to see the ecological, environmental, and social crises that are hurtling towards them – crises that the unfettered adolescent brain may have evolved to tackle.
Going with the Grain
By misunderstanding teenagers’ instinctive need to do things for themselves, society has created a system of schooling that goes against the natural grain of the adolescent brain and ends up trivializing the very young people it claims to be supporting. By failing to keep up with appropriate research in the biological and social sciences, current educational systems continue to treat adolescence as a problem rather than as an opportunity bequeathed to the young through the genetic transfer of important mental predispositions to learn in particular ways. These predispositions, once activated, transform the clone-like learning of the pre-pubescent child, through adolescence, into the self-directed learning of the mature adult.
By using our schools to subvert the natural processes of growing up in order to fit more comfortably into our present economic state, we have created whole generations of young people and adults who are now mesmerized by the bright lights of a way of living that is hurtling, out of control, towards us. Like the young deer, we too are transfixed by the lights that are about to destroy us. Because we have effectively told young people not to think for themselves, far too many of today’s so-called “educated” people know of no way to find a solution that has not already been prepared for them and described in a textbook.
This kind of thinking gave birth to the modern secondary school, which became a kind of holding ground in which the problems of adolescence could be worked through so that eventually youngsters would be mature enough to deal with adult society. School became the exact opposite of apprenticeship. Schoolchildren were required to sit docilely in classrooms, listening to the received wisdom of the teacher and reproducing that knowledge when tested. Independent and creative thinking was not encouraged, for that threatened the teacher’s control of the rest of the class. Young apprentices, on the other hand, had to be put through their paces so that the older they became, the less dependent they were on the craftsman and the more confident they were in demonstrating their ability to solve problems. Every skill learned, every experience internalized, increased the apprentice’s sense of autonomy.
Recent research in cognitive science and neurobiology makes it obvious that apprenticeship was a culturally appropriate response to the neurological changes in the adolescent brain. Apprenticeship was a form of intellectual weaning whereby the more skillful and thoughtful the apprentice became, the less he or she would depend on the teacher. The German philosopher Nietzsche put it succinctly: “It is a poor teacher whose pupils remain dependent on him.”
If Western society is to survive (and it really is as serious as that), it is essential that all those involved with young people escape from that assumption made a century ago by early psychologists, that adolescence is an aberration and an inconvenience. While the human brain has evolved to enable each of us to function effectively in complex situations – we naturally think big and act small – modern education has become side-tracked into creating specialists who are well-qualified in their own disciplines, but unable to see the wider impact of their actions. Because formal schooling has done its best to neutralize the impact of adolescence, recent generations of young people have been deprived of the strength that comes from fearlessly making difficult decisions – and if necessary picking up the pieces when things go wrong. We have effectively lost the plot: adolescence is an opportunity not a threat. Understand that, and it changes everything.
An education system that truly went with the natural way in which people learn – we call it “going with the grain of the brain” – would prepare children, in their younger, pre-pubescent years, for the self-defining struggle that is adolescence. A delightful story illustrates this well. A man, seeing a butterfly struggling on the sidewalk to break out of its now useless cocoon, bent down and with his pocket knife carefully cut away the cocoon and set the butterfly free. To the man’s dismay the butterfly flapped its wings weakly for a while, then collapsed and died. A biologist later told him that this was the worst thing he could have done, for the butterfly needed the struggle in order to develop the muscles needed to fly. By robbing the butterfly of the struggle, he had inadvertently made it too weak to live.
Children need the struggle of adolescence to sort themselves out and put away those childish behaviours which earlier had served them well. Sometimes alone, often with their peers and supported by the guidance of wise and caring adults, adolescents need a careful mixture of guidance and the space to work things out for themselves. Through the struggle of adolescence they develop the strength for adult life. To waste adolescence is to deny future generations the strength they will need to respond to the serious problems facing our civilization and our planet.
IN BRIEF – During their first seven or eight years, children acquire a multitude of skills by closely imitating their parents and teachers. But it would be disastrous for children to grow up as clones during a period of rapid environmental, cultural, and economic change. We now know that children need the turmoil of adolescence to free themselves from those childish behaviours. According to recent research in cognitive science and neurobiology, apprenticeship – a form of intellectual weaning that reduces the apprentice’s dependence on the teacher as his or her skills and thinking develop – is a culturally more appropriate response to the neurological changes of the adolescent brain than anything our current school systems offer. To waste adolescence is to deny future generations the strength they will need to solve the serious problems confronting our civilization and our planet.