Scientific discoveries and new technologies will change beyond all recognition the way we live, work, think, feel and see the world. As our environment is transformed, our minds will be transformed too. Human nature itself will be transformed. And, for the first time in almost 100,000 years, says Susan Greenfield, we may finally understand and celebrate individuality

As a species, we probably would not take kindly to the idea of a mind make-over. As technologies increasingly dazzle and shock, we typically take comfort in human nature and our obstinate disposition to love, hate and wreak revenge, just as our ancient Greek forebears did.

So why, now, at the turn of the 21st century, should that ultimate of bastions be breached?

To understand how, for the first time in almost 100,000 years, our minds might be dramatically transformed, we need first to explore what the human mind could actually be as a physical reality.

The more we learn from neuroscience, the more it seems that, instead of being an antithetical alternative to the brain, the mind does indeed have a physical substrate in the sludgy grey matter between our ears.

Let’s look first at how the brain is actually organized. The basic working unit is the brain cell, or neuron, but more important still are the circuits that brain cells form – circuits that grow into complex assemblies and, ultimately, into recognizable macro-scale brain structures.

Each of these brain structures is like an instrument in an orchestra or an ingredient in a complex dish: it can serve many functions, according to the combinations in which it acts and the degree to which it is operating at any one moment.

With the act of vision, for example, at least 30 different brain regions are involved. However, at the level of gross anatomy all brains look fairly similar. If we were looking for a mind, or for individuality, we would not find it residing in any single structure. Indeed, the notion of a brain within a brain is as nonsensical as it is unhelpful.

The mind is even less likely to reside in the most basic components that enable brain cells to work within assemblies, namely the genes. Genes make proteins; they do not have an agenda all of their own.

They are not single-mindedly hell-bent on enslaving the brain to their own purposes, but rather are there as bit players, albeit important ones, that are necessary – rather than sufficient – for complex mental traits.

As an example of the interplay between nature and nurture, consider a study from a few years ago on mice that had been deliberately bred to carry the aberrant gene for Huntington’s chorea.

Huntington’s chorea is a severe disorder of movement that, unusually among brain disorders, is related to a single rogue gene. Nonetheless, when mice that were genetically doomed in this way were given environmental stimulation – merely a few cardboard tubes and wheels with which to interact – the age of onset of the disease was far later, and the impairment in movement far less severe.

Just think, then, how indirect the relation between gene and mental behaviour must be in conditions where many genes are involved, and where the human brain is still more sensitive to the environment.

Genes are important in that they make proteins, but those proteins, like brain regions themselves, can have many functions. Within brain regions, genes will be switched on and off throughout life, and the proteins that they make will, in turn, change the configurations of brain cell connections.

What causes these changes is quite simply interaction with the environment on a moment-to-moment basis. Hence the old dichotomy of “nature versus nurture” doesn’t really hold. Rather it is a ceaseless interaction between input from the environment and the chemistry of the brain that will lead to changes in brain cell connectivity that, in turn, will determine, literally, how you see the world.

It is this personalization of brain cell connections through experience, arising from constant dialogue with the outside world, that I like to think of as the mind.

So you are born into a blooming, buzzing confusion, where the world is evaluated in terms of sweet, fast, cold, bright and so on. But gradually these abstract sensations coalesce into faces or objects that eventually acquire labels and meaning, triggering ever more associations as they feature increasingly in your life.

This plasticity of the brain has been documented not just in lab rats, where mere exposure to wheels and ladders can increase the number of connections between brain cells, but also in humans.

Perhaps the most fascinating recent example was a report showing that London taxi drivers, burdened as they are with the extra challenge to their working memory of remembering all the streets of London, do indeed have a part of the brain that is larger than in most other people.

And if the brain is this sensitive to change, it follows that anything in the environment, or anything that targets those brain connections, will change the mind.

Meet your silicon self

So what will happen to our minds in the 21st century, when all the promises of the scientific discoveries of the 20th century are finally delivering a fast and furious wealth of technologies?

A good place to start is the home – an environment of mechanized pets and computers with human interface personas, voice interfacing and virtually invisible devices that are embedded in our clothes and jewellery.

These smart, inanimate objects on and around us will mean that, as we potter around the home, almost anything can be influenced by a spoken command or, indeed, change automatically in tune with our bodies.

Living in a world where objects answer back, and where your every body process triggers an outcome in the outside environment, will surely impact dramatically on how we see the world.

And if it becomes harder to distinguish what is real out there from what is instigated by our own minds, then the barrier between our carbon selves and silicon also looks likely to be breached.

The brilliant work of Peter Fromherz in Germany, among others, has shown that it is now possible for nerve cells to grow on integrated circuits. This immediately gives rise to the idea of the neuro-chip, in which the electrical properties of brain cells can be exploited to make hybrid carbon–silicon devices.

And if the electrical signals generated by brain cells can communicate directly with the electrical signals generated in silicon, then it follows not only that we can have a new type of computer, but also that one could implant electrodes into the brain.

For example, at Emory University a patient paralyzed from the neck down can now “will” a cursor to move on a computer screen: an implant enables a thought to be changed into a movement to compensate for his damaged motor system.

So in the future – where not only will synthetic prostheses heighten our senses, but brain implants will enhance performance – it may be possible to move objects with a thought and perhaps even to change our endocrine and immune systems by thinking. Moreover, all this information could be read out by very small nanotechnological devices in the body.

Another barrier that will therefore be crossed is that between mental and physical, between objective and subjective. And just as the advent of this invasive and exquisitely sophisticated IT will change how we see the world and ourselves, so inevitably it will change how we work.

In a ballooning e-commerce world, robots and databases are gradually replacing experts. Outsourcing and subcontracting will supplant the corporate hierarchy, and the individual will increasingly be a freelancer working from home. Much of daily life will be hard to compartmentalize into recreational mental activity on the one hand and work on the other.

Another important change that IT could bring is an increasingly grey workforce, freed from the need for physical prowess, or even for much mobility. It could well be that we are facing a world where retirement recedes into the past.

Hence the future of work is again one of breaking down barriers – between work and home, work and retirement, and work and leisure. And the work/life narrative itself will be challenged further by advances in technologies that question the notion of life itself.

The biotechnologies promise additional, enhancing genes – eventually perhaps purely synthetic ones – that could sit on artificial chromosomes and therefore not interfere with natural genes.

In addition, “one generation germ-line engineering” will enable more efficient targeting of genes, but without the huge responsibility of perpetuating that trait through the generations.

Now add to this scenario the possibility that genetic material could soon be extracted from any cell in the body, so that anyone of any age could reproduce.

We can then see that it might be the end of the traditional life narrative as we know it. If anyone of any age could have a child with anyone, then barriers between the generations will surely break down.

People of the screen

Another factor in this breakdown of life narrative could be great changes in education.

Voice interface computers would certainly mean that reading and writing eventually became obsolete.

And if all facts were accessible to everyone on a screen, as they are increasingly, then surely the traditional barriers, where one expert is differentiated from another, would start to crumble.

The journalist Kevin Kelly has already summed up the changes that are occurring in “people of the screen” versus the more traditional preceding generation, “people of the book”.

He writes: “Screen culture is a world of constant flux, of endless sound-bites, quick cuts and half-baked ideas. It is a flow of gossip tit-bits, news headlines and floating first impressions. Nations don’t stand alone, but are massively interlinked with everything else. Truth is not delivered by authors and authorities, but is assembled by the audience.”

It is this latter point that is particularly important. No longer will we be told by a teacher, or indeed by the author of a book, how to progress and unfold our thinking and arguments.

Rather, we ourselves will be driving that thinking as we dart around, hyper-texting and manipulating icons. And if we no longer need to learn anything, the emphasis will fall on the experience of interaction rather than on the internalization of learning.

So what will this mean for us, for our much-vaunted and much-prized human nature? Despite the ubiquity of its use, the term is actually quite hard to define. It is not, after all, describable in terms of isolated genes – most of which we share with other creatures and, indeed, with vegetables.

Similarly, any tail-wagging dog or purring cat will show that non-human animals have emotions, although we will never know how it feels to be one.

We are faced with the paradox that human nature is ubiquitous in time and place, and yet unique to the human species, wherever its members are and whenever they existed.

Envy is recognizable as envy, and greed as greed, whatever values have given rise to them, and however disparate those values may be.

The whole point is that we can vary the context both in history and geography, but the response is always uniform: we are always looking at familiar behaviours, albeit ones that result from particular and varying social values. In turn these values will determine your status in society, and hence your sense of self.

Status, then, is arguably a state of mind. As we have seen, the growth of the connections reflects post-natal experience, and mind might be this personalization of the brain.

Status, therefore, will depend on the dynamics of the configuration of those connections, and on the context of the particular time and place in which we live. We have seen how our environment may be about to change forever, to become one where we no longer need to coexist in close, immediate, interactive society with others.

But if we live through a screen, in a world that sanitizes our contact with our own species, then our lives will be reduced to immediate experience – incessant distraction by the mercurial image or icon.

Will there really be a need to treasure status? What would it bring us? Perhaps then we would no longer need, or need to use, our personalized neuronal connections, our minds.

For the first time in 100,000 years, human nature could therefore be radically changed, or even vanish. The vital issue now is to harness these new technologies to gain more insight into human nature. We would then be placed, as never before, not to annihilate individuality but to celebrate it.

Susan Greenfield
Susan Greenfield, CBE, is director of The Royal Institution of Great Britain and Fullerian Professor of Physiology and Comparative Anatomy. Since 1996 she has been Professor of Pharmacology at the University of Oxford. Her research concentrates on understanding brain functions and disorders, such as Parkinson’s and Alzheimer’s diseases, as well as the physical basis of consciousness. She has written, broadcast, lectured and consulted widely on these topics. Her most recent book is Tomorrow’s People: How 21st Century Technology is Changing the Way We Think.