Wednesday, December 31, 2008

The Age of Aging: How Demographics are Changing the Global Economy and Our World


Author: George Magnus

Review: FT

Demography is the senior social science: churchgoers this week will be reminded that Jesus Christ was born in the midst of a census two millennia ago. George Magnus’s The Age of Aging is an account of the great population transitions currently under way around the world. Magnus is a renowned City of London economist, now at UBS, and his new book is a guide for the general reader on how the greying of the world will change everything, everywhere.

Demographic cycles are immensely powerful, and move just fast enough to cause occasional outbursts of panic. The western world intermittently gives in to fears about population extinction and Malthusian famine. The 1930s brought us titles such as The Twilight of Parenthood, while the 1960s gave us The Population Bomb. The new terror for the west is a large, aged population: the popular literature frets that future generations – usually Americans – are already doomed to penury because of the rising costs of medical bills and pension payments.

Magnus’s account avoids the hysteria that often affects this genre. He pairs a level-headed discussion of the west’s ageing problem with an explanation of the developing world’s demographic conundrums. For readers who are new to the topic, some of the chapters are useful primers on these crucial issues.

Longer lives and changes in the birth rate have conspired to make the developed world very old very quickly. Longevity is a blessing for individuals, but the stresses it creates are a real quandary for policymakers. Even if there had been no financial crisis, the developed world would have been lumbered with rising social, healthcare and pension costs.

But demographic problems are not confined to the west. China has a problem with both age and gender; by 2020 it will have the same median age as Australia, while a relative lack of women means as many as 10 per cent of men between 20 and 45 will be unable to find a spouse. China may be old and dysfunctional before it is rich enough to cope.

India, meanwhile, is in a race to educate its people and find them employment in time to claim the demographic dividend it is now due. In the Middle East and North Africa, a region where the employment rate is currently at 47 per cent, the challenge will be to create enough jobs to cope with a surging young population. Magnus, quite rightly, sees HIV/Aids as having reached a scale where it deserves to be seen as a demographic problem. In Africa, it is the biggest killer, responsible for one in five deaths – twice the death toll from malaria and 10 times that of violence and war. It may stop that continent from exploiting a coming demographic dividend. In Russia, where 1m people are probably already infected, it is exacerbating an already weak population position.

The book is a comprehensive survey, tied together with chapters on how the global balance of power, religion and economics will be altered by this change. Its broad reach, however, is both a strength and a weakness. Magnus is a shrewd and serious commentator; he was among the first to identify the ongoing credit crisis as a “Minsky moment”, named after the economist Hyman Minsky and referring to the point at which financial euphoria tips into crisis. But, in producing this book, he has spread himself too thinly.

On a number of occasions, Magnus strays on to fascinating topics, dwelling on them just long enough to pique the reader’s interest before galloping swiftly on to something else. Contentious ideas, such as Gunnar Heinsohn’s hypothesis that particularly young populations are inherently unstable, are dealt with only briefly: he simply reports them and makes little attempt to add to them.

Given Magnus’s stellar record, some of the reflections on globalisation and economics are disappointing. Writing a book with any reference to the future financial outlook that does not immediately get superseded is a nearly impossible task at the moment, but the discussion of capital flows, in particular, is too short and it seems to come from an age when Northern Rock was still able to fund itself.

Magnus is right that younger people in the developed world will be left with a serious demographic burden, creating what he calls the “Boomerangst” generation. The question will surely grow in political importance. But there is a serious philosophical debate on what intergenerational fairness actually means; most recently, it has emerged in the debate about what costs current generations should bear to mitigate climate change.

Magnus has produced an overview of the current state of world demography. It is well-researched and thorough. But it is disappointing, given his calibre, that he has not helped to advance the argument.

The Shadows of Consumption: Consequences for the Global Environment


Author: Peter Dauvergne

Review: Times Higher Education

On Christmas Eve 40 years ago, astronauts Bill Anders and Frank Borman gazed out of Apollo 8 as it swung around the Moon and snapped the first pictures of the whole of our lonely blue-green planet in the middle of black space. This world used to comprise many different cultures and ways of living. Since then, it has grown closer together.

Our six and a half billion people increasingly wish to do the same things, and the model is a high-consumption lifestyle typical of the most industrialised countries. There are 78 cars for every 100 people in the US (and that includes the elderly, infirm and babies) and just 0.7 for every 100 in China. A quarter of the world's population has no access to drinking water; the average Briton consumes 150-200 litres per day, the average American 400 litres and the average Las Vegan (a separate species) some 1,600 litres.

There have always been differences, of course. But now there is a convergence of aspirations. If the whole world consumed at the same levels as the average American, we would need six to eight planets. And global environmental problems are the consequence. The world cannot cope, and it is creaking at the seams.

This is the territory of Peter Dauvergne's intriguing book on the largely hidden environmental consequences of consumption. When the figures are aggregated, we - all of us - consume too much. Yet we seem not to think of the consequences, nor even to care. In exploring the costs of rising consumption, Dauvergne tracks the rise and fall of five commodities: cars, leaded petrol, refrigerators, beef and harp seals. But it is an odd mix. Some are past problems, largely solved; others are eclectic and specific to Canada; others still utterly intractable.

The lead-in-petrol story is worth a reminder, as it was a success. Tetra-ethyl lead was discovered by Thomas Midgley as a much-needed anti-knock agent for petrol in 1921, but by the 1980s evidence was piling up that lead was accumulating in both adults and children and causing ill-health. It was contested, of course. Some said "no problem" to removing it; others were scared. In the mid-1980s, industrialised countries suddenly shifted to lead-free petrol and oil companies simply shifted their leaded petrol to developing countries. Sub-Saharan Africa phased out leaded petrol only in 2006.

The chlorofluorocarbons story is informative, too. The mysterious Midgley reappears as it was he who, in 1928, discovered stable, odourless and non-toxic refrigerants known as CFCs. The price of fridges fell and put them within the reach of almost all consumers. Out went ice packed in sawdust, in came shining boxes that would eventually populate every kitchen in the industrialised world. But poor old Midgley, who almost deserves a story to himself, had come up with another corker. Fifty years later, a gaping hole in the stratospheric ozone layer was found to have been caused by CFCs, and it was letting in harmful ultraviolet radiation. Now those fridges with CFCs are being replaced with safe ones, but the gains are being outdone by the rampant consumption of old models elsewhere.

Dauvergne rightly points out that there is not a lot of point in improving standards in some places if consumption of products that cast a deep environmental shadow grows rapidly elsewhere. But then this book pitches on its face. If convergence is the problem, then the solution lies in divergence, which means letting some local people do what they think is culturally important. And they may choose to do something distasteful to some, like eat meat or hunt seals.

You cannot pick and choose globalisation for the good things, such as moral positions or types of governance, and then target localisation for the bad. Some think localisation may save the world, but if so, people will have to be left to choose their own pathways.

The harp seal chapters are deeply problematic. They are at first interesting, but descend into pre-designed positions. Dauvergne appears to assume that the kind of person who would read a book on the environmental consequences of consumption would be inherently against the seal harvest on Canada's East Coast. Native Canadians are mentioned only once, even though many rely on seals for a variety of products. Activists against the seal hunt are described as "idealistic, imaginative and daring", and the Newfoundlanders "an angry mob".

Some 200,000 to 300,000 seals are culled each year, a figure to make anyone think. Yet the author does not set this against the billion cows, billion pigs, and billion sheep and goats killed every year worldwide, or the number of fish caught from the oceans. None of these is a simple comparison. They need careful analysis and delicate moral positions. Truths almost always lie in the middle of extremes. The lack of discussion about what native peoples think about seals and the hunt is a glaring flaw.

At times, this book contains unnecessary exaggerations and loose language - the world population will not reach 11 billion, and "chemicals" make up everything, not just nasty products sprayed on fields. Moreover, the book mixes narratives - traffic accidents kill more than a million people worldwide every year, a terrible toll. But this is not inherently an environmental problem; the cars could be electric and still kill people.

Dauvergne concludes that consumers can help. Buy less, be different. Don't buy furs, don't eat meat, choose safe fridges, don't drive a car. Buy "natural" beef, if you must, but beware that companies like these terms, and natural may mean an animal that once saw some grass. But are we free to make these decisions? How much are we shaped by advertising, by what friends and neighbours do? How much do the aspirations of others affect these decisions?

This book concludes with calls for changes to navigate towards a bright future, but sadly offers us not much more than a window on to dystopia. And what of heroic Midgley? He went to an early death after contracting polio in the Second World War, without any hint of the troubles his lauded discoveries would cause.

Tuesday, December 30, 2008

The Culture Code: An Ingenious Way to Understand Why People Around the World Live and Buy as They Do



Author: Clotaire Rapaille

Backstory:
Thirty years ago, Clotaire Rapaille was a Parisian psychoanalyst studying autistic children when a businessman at Nestlé suggested that Rapaille’s work decoding subconscious imprinting could be used to market instant coffee to tea-drinking Japan. Not only were Nestlé’s coffee-selling strategies affected, but Rapaille embarked on a new career of applying cultural anthropology to the corporate world. Today, the internationally renowned consultant is based in New York, and his firm, Archetype Discoveries Worldwide, is retained by half of the Fortune 100 companies to assist in product marketing.

Total reading time: 240 minutes

First published: 2006

Key passages:
“Imprints vary from culture to culture. If I could get to the source of these imprints—if I could somehow ‘decode’ elements of culture to discover the emotions and meanings attached to them—I would learn a great deal about human behavior and how it varies across the planet. This set me on the course of my life’s work.”

“American culture exhibits many of the traits consistent with adolescence: intense focus on the “now,” dramatic mood swings, a constant need for exploration and challenge to authority, a fascination with extremes, openness to change and reinvention, and a strong belief that mistakes warrant second chances.”

“If work means ‘who we are,’ then it is perfectly understandable that we seek so much meaning in our jobs. If our jobs feel meaningless, then ‘who we are’ is meaningless as well.”

Synopsis:
This fascinating, accessible read takes a look at how cultural mores shape people’s behavior, using analytical techniques that have helped Rapaille improve the profitability of products from coffee to cars for such companies as Nestlé and Chrysler. Addressing some two dozen archetypes—sex, beauty, youth, home, money, and even America itself—Rapaille shows how and why cultures interpret words associated with these archetypes differently and how that should affect the design and marketing strategies companies adopt in different regions.

By distilling societal biases to their emotional essence, Rapaille reveals a series of cultural codes that work subconsciously to affect our choices as consumers. For example, while working with Chrysler on a Jeep Wrangler campaign, Rapaille asked focus groups not what they wanted in a Jeep, but their earliest memories of the vehicle. “The stories had a strong recurring image … of riding free … the American West or the open plain,” he writes. “I returned to those wary Chrysler executives and told them the code word for Jeep in America is horse.” In France, Jeeps reminded people of American soldiers in World War II, so the code word there became liberator. For Rapaille, those two meanings dictate how Jeeps should be marketed in each country.

While the book is intended for corporate product-design and marketing divisions, it’s also a handy psychological tool for guiding our interactions with other cultures as well as reexamining ourselves. Also, Rapaille’s focus on American archetypes and his thesis that U.S. culture mimics adolescence (as if our obsession with Paris Hilton wasn’t a dead giveaway) offer readers in the States an opportunity for greater self-awareness and foreigners a window into our eccentricities.

The Origin and Evolution of New Businesses


Author: Amar Bhidé

Backstory:
For The Origin and Evolution of New Businesses, Amar Bhidé, a professor at Columbia University’s Graduate School of Business, spent more than 10 years researching hundreds of successful entrepreneurial ventures from the pre-dotcom era. Bhidé had earned an M.B.A. from Harvard but failed twice as an entrepreneur himself. After a five-year stint working for McKinsey, he joined the Harvard Business School faculty in 1988 to teach a course in entrepreneurship. It was there that he discovered an apparent lack of available research on small businesses, which prompted him to compile a textbook-quality study of the field. It will take a scholar’s aptitude and patience to digest the fruits of his labor—summer beach reading this is not.

Total reading time: 1,800 to 2,400 minutes

First published: 2000

Key passages:
“Over the past half-century, business schools have devoted considerable resources to studying the entrepreneurial activities of large companies—how Merck develops new drugs and Intel new microprocessors, how Disney produces and markets The Little Mermaid and McDonald’s introduces Big Macs in China. Little effort has been devoted to systematic research about starting and growing new businesses.”

“Entrepreneurs who start and build new businesses are more celebrated than studied.”

“Most entrepreneurs start their businesses by copying or slightly modifying someone else’s idea. They also usually don’t have deep managerial or industry experience.”

“Overall, the reality of bootstrapped businesses does not bear out the popular image of an entrepreneur as an irrational, overconfident risk-seeker. Quite the contrary. Entrepreneurs can pursue ‘heads I win, tails I don’t lose much’ opportunities because they are less prone than average to irrational ambiguity aversion and they have a talent for exploiting the cognitive biases and defects of other individuals. They require exceptional self-control; they may have to tolerate difficult customers with unreasonable demands and focus on winning orders rather than arguments.”

“By coincidence, our interviewees, the founders of Attronica Computers and Bohdan Associates, had both started their businesses in 1983 about 10 miles away from each other. Both evolved into companies that distributed computers to corporate customers. Attronica’s founders, who had worked together in a large telecommunications company, had decided after extensive market research and planning to purchase a franchise from a computer retail chain for $150,000. After that and other attempts to serve the retail market failed, Attronica developed, through trial and error, into a company serving corporate and government accounts. In 1988 the company had revenues of about $8.1 million. Bohdan’s founder, in contrast, started selling computers out of his home by accident -- he placed an ad to sell his used computer and was surprised by the demand. Simply by ‘reacting’ to his customers, Bohdan grew to a $48 million revenue business by 1988.”

Synopsis:
At least one thing is clear after reading Amar Bhidé’s exhaustively researched book, The Origin and Evolution of New Businesses: Everything you thought about entrepreneurship is probably wrong. Spurred on by what he saw as a lack of understanding of what makes startups successful, Bhidé spent 10 years researching and interviewing the founders of some of the fastest-growing private companies in the country, using the firms named to the Inc. 500 list in 1989 as his base. The results are surprising. Ironically, Bhidé’s book, published in the throes of the dotcom era, blasts many of the notions made popular by the so-called internet millionaires, such as the Silicon Valley stereotype that risk-loving dynamos armed with business plans and backed by a fleet of deep-pocketed venture capitalists create the most successful startups. Bhidé disagrees. He argues that the most successful entrepreneurs usually just copy someone else’s idea—often something the company they work for is already doing—risking little capital in the process. One example: John Katzman, founder of the Princeton Review, started his company after teaching S.A.T. preparation classes at Hunter College in New York City. Over time he differentiated his service by reducing the size of his classes and offering more computer support. “Successful startups usually do not involve a blockbuster innovation,” Bhidé writes. “The entrepreneur’s ability to recognize the promise of someone else’s idea, and to show resourcefulness and ingenuity in solving the problems of its implementation, provide sufficient bases for success.” And unlike a select few examples, such as Bill Gates, most entrepreneurs don’t set out to build multibillion-dollar companies, nor do they “need to have exceptional charm, magnetism, or a capacity to inspire devotion.”

Perhaps Bhidé’s most thought-provoking conclusion is that not all small businesses are created equal: Fewer than 10 percent of all new businesses will become big enough to make a substantial contribution to the U.S. gross domestic product, add to national job growth statistics, or even provide a return on an owner’s initial investment. Only a select few of these small businesses, Bhidé writes, will follow the course of such companies as Microsoft, Hewlett-Packard, and Wal-Mart and become tomorrow’s large corporations. “Few new ventures attain significant longevity and size, because only a very small number of individuals have the willingness and the capacity to both start and build a business.” To build a business for the ages, in other words, you need to be one in a million.

Grown Up Digital: How the Net Generation Is Changing Your World


Author: Don Tapscott

BOOK REVIEW: FT

In 'Grown Up Digital,' Don Tapscott argues that young people's skills in 'playing' with information are undervalued and that the best managers can learn from these employees.
By Richard Donkin

December 29, 2008

It is the privilege -- or possibly the curse -- of each new generation to be different from the last. But rarely has a generational divide been as noticeable as that between those in their early 20s and the baby boomers.

This, at any rate, is the proposition put forward by Don Tapscott, a management professor at the University of Toronto and author a decade ago of "Growing Up Digital: The Rise of the Net Generation." In his latest book, "Grown Up Digital: How the Net Generation Is Changing Your World," he argues that senior corporate managers must strive to understand what he calls the Net Generation -- born between 1977 and 1997 -- often described as Generation Y.

Too often, he says, the generation that grew up with the Internet is derided by employers as ill-informed, Web-addicted, unfocused, poorly read and narcissistic.

But in a long-running, $4-million research project involving thousands of interviews with 16- to 19-year-olds in 12 countries and comparative interview programs with earlier generations, Tapscott and his team reached a different view.

The Net Generation, he notes, was raised on a much more varied media diet than its parents who, in the United States at least, watched an average of 22 hours of television a week in their youth.

The "Net Geners," he says, have access to so much competing media that they are more likely to spend their home time on computers, simultaneously interacting on several screens, while talking on the phone, listening to music, doing homework and reading.

This all-in-one generation cannot be defined as passive learners. "They are the active initiators, collaborators, organizers, readers, writers, authenticators, and even strategists. . . . They do not just observe, they participate. They inquire, discuss, argue, play, shop, critique, investigate, ridicule, fantasize, seek and inform."

Technology is shaping their minds in a different way, he writes. Unlike their predecessors who absorb information sequentially, Net Geners "play" with information -- clicking, cutting, pasting and linking to interesting material.

"They develop hypertext minds," in the words of William Winn, learning center director at the University of Washington's Human Interface Technology Laboratory.

Such behaviors, Tapscott writes, mean this new generation is well equipped for handling information.

He quotes research by father-and-son brain scientists Stanley and Matthew Kutcher, who argue that the scanning practices of young people are developing their potential for analytical thinking.

Henry Jenkins, director of the comparative media studies program at the Massachusetts Institute of Technology, is also cited. He believes so-called digital immersion may be encouraging a new form of intelligence that is strengthened through collaboration with other people and machines.

In a direct challenge to the conventional school syllabus, Tapscott argues that although it is still important that children have certain basic knowledge, the details, such as the date of the Battle of Hastings, are less important when they can be accessed instantly on the Web.

This is controversial thinking with far-reaching consequences for the way young people are taught and employed. But the book aims to tackle Internet prejudices head on.

His conclusion is that "the kids are all right."

The best managers and educators will understand that there is much they can learn from this cohort, as well as the other way around. His seven guidelines for managers include a recommendation to "rethink authority," giving feedback where it is needed but remaining open to learning from young employees.

Other guidelines include encouraging employees to blog and avoiding bans on access to social networking sites. Instead, managers should work on ways to harness these technologies to promote better collaboration.

The book is a thoughtful antithesis to entrenched and sometimes alarmist managerial opposition to Internet-influenced behaviors. Read it next to the computer, scanning, flicking through and annotating it as a valuable addition to the Internet knowledge that is revolutionizing our world.

Buy-ology: Truth and Lies About Why We Buy



Author: Martin Lindstrom

Review: FT
What do smokers think and feel when they see gruesome pictures of diseased body parts and dire predictions of early, painful death on their cigarette packs? Smokers say the warnings put them off. But when their reactions were tested under a brain scanning machine, parts of the brain associated with intense craving flared into action on the scans. Even the most graphic health warnings might unwittingly encourage smokers to light up, the experiment suggested.

It is an intriguing finding from the biggest “neuromarketing” research project so far, a $7m study organised with university academics by marketing consultant Martin Lindstrom and reported by him in a new book, Buy-ology. The study was designed, in his words, to reveal “hidden truths behind how branding and marketing messages work on the human brain”.

Unfortunately, Lindstrom’s book is more speculation than serious science. Little of it actually reports on his own neuro-research; the rest consists of marketing war stories that are rehashed with speculative spin on unrelated topics such as mirror neurons and neurotransmitters.

Dopamine is a powerful neurotransmitter that gives us a sense of pleasure as we anticipate a reward. How and when it works and what its full effects are remain a subject of controversy, but Lindstrom is not deterred. According to him, all you have to do is look at a shiny digital camera and “wham, before you know it,” your brain is flush with dopamine and “a few minutes later, you exit the store, bag in hand”. Why, then, do we not buy every shiny object we see?

Mirror neurons could be the big neuroscientific discovery of the century. They are active in the same parts of the brain when someone observes an action as when someone performs it. Scientists believe mirror neurons could help us gain a deeper understanding of human empathy, learning and imitation. But the research is only just beginning.

A defining feature of mirror neurons is the disconnect between the observer’s internal brain activity and his external, observable actions. But Lindstrom turns this into its opposite. You look at a Gap window display and see a picture of a gorgeous model wearing its clothes. Your mirror neurons make you imagine yourself as equally good-looking and “override” your more rational thoughts. “You just can’t help it,” declares Lindstrom: you go into the store and buy. Many Gap marketers must be thinking “if only Lindstrom were right”.

Lindstrom’s own research did not actually investigate the effects of dopamine and mirror neurons. But it did include functional Magnetic Resonance Imaging brain scans of people looking at brand icons and religious icons. He reports evidence that both trigger activity in the same parts of the brain, and uses this to draw the conclusion that the emotions generated by religious belief and by iconic brands are “almost identical”. That’s a huge claim. But according to Professor Gemma Calvert of Warwick University’s Applied Neuroimaging Group, who actually conducted the research, these particular results were only weakly statistically significant.

This year, Yale University researchers tested the ability of laymen and neuroscience students to distinguish between good and bad explanations of psychological phenomena. They performed well, except when bad explanations were prefaced with “Brain scans indicate ...”, when they accepted invented tosh as plausible. The researchers’ work is a timely warning as to how willing we are to be blinded by science.

There’s little doubt we have a lot to learn from neuroscience. But for this we need thorough, careful research accompanied by thorough, careful analysis and reporting.

Right now, the business world is awash with gurus bearing presentations and project proposals starting “Brain scans indicate ...”. The question for executives is whether the added neuroscience is really there to educate and inform, or to prise open executives’ marketing and research coffers by blinding them with science.

Why We Buy: The Science of Shopping


Author: Paco Underhill

Paco Underhill published his book "Why We Buy: The Science of Shopping" in 1999. Today it remains a modern classic for understanding the psychology of consumer purchasing behavior.

Here is a summary and critique of the book's main points:

1. Purchasing behaviors can be studied: Underhill and his team opened the eyes of CEOs and retail managers everywhere with their unique approach of meticulously observing people as they shopped. They brought techniques from anthropology and merged them with economics to create a new science.

2. Retailers still have much to learn about why people buy: Most CEOs Underhill spoke with knew a great deal about store revenues but very little about what actually made customers purchase. For example, one CEO believed that about 99% of the people who visited his stores made a purchase. When Underhill revealed that the true figure was only 48%, the CEO was, needless to say, enthralled by the possibilities.

3. Sellers can benefit from understanding buyer behavior: Buyers have a certain way of walking through a store: a specific way of using their hands, looking at signs and taking breaks while shopping. Sellers who understand these behaviors can gain a huge competitive advantage.

4. You can always sell more: Your best customers are your current customers. Find ways to upsell. Entice them to the back of the store. Keep them in the store longer.

5. Women and men shop differently: For example, men tend to go into a store, look at a large shelf of items, pick one, and quickly leave. Meanwhile, women are actually more information-intensive, reading the label for each possibility before making a purchase.

6. People use all five senses to decide on a purchase: The more of the five senses to which a seller can appeal, the better. People want verification with their whole body before buying a product.

7. Shopping on the Internet is different: Okay, this one is a "well, duhhh, Mr. Underhill" today, but remember this book was written in the late 1990s. Underhill does list some truisms about the advantages of shopping on the Internet which are helpful reminders.

Underhill's book certainly opened my eyes to what retailers know and do not know about what makes people buy. My only problem with his book is the writing format. Underhill's writing is actually pretty good, but it lacks periodic summaries of main points to really drive home the reader's understanding.

By reading "Why We Buy," you will get an informative, if sometimes wandering, read through the psychology of buying.

Thursday, December 25, 2008

DESCARTES’ BONES: A Skeletal History of the Conflict Between Faith and Reason

Author: Russell Shorto

NYTIMES REVIEW

Making the case for one or another historical moment as the starting point of modernity is a familiar hook for writers of grand chronicles. Was the transformative event World War I, with its fateful consequences for 20th-century warfare, ideology and identity? Or perhaps Einstein’s “miracle year” of 1905, when he published his universe-shattering papers? The appearance of Darwin’s “Origin of Species” offers a bright dividing line, as do the (take your pick) French and American revolutions. The literary critic Harold Bloom reaches still farther back, crediting Shakespeare with the “invention of the human” in its various modern modes. Others find the deepest roots of modernity in the bleak realism of Machiavelli.

Russell Shorto’s “Descartes’ Bones” is a smart, elegantly written contribution to this genre. For Shorto, the pivot upon which the old world yielded to the new was the genius of Descartes, the philosopher who gave us the doubting, analytical, newly independent modern self. The Frenchman’s most famous phrase, “I think, therefore I am,” may strike our own ears as a coffee-mug cliché, but in the 17th century it was a revolutionary declaration. Shorto’s achievement is to complicate this picture, and with it our understanding of modernity, by also describing the religious context of the philosopher’s ideas. Though Descartes’s name has come to be associated with unrelenting rationalism, he was “as devout a Catholic as anyone of his time,” Shorto writes, and looked to theology to support his system. As Shorto recognizes, our own fundamentalists, religious and secular alike, might draw some useful lessons in modesty from Descartes’s example.

Descartes made a cameo appearance in Shorto’s previous book, “The Island at the Center of the World” (2004), a richly detailed revisionist history of 17th-century Dutch Manhattan and its liberalizing influence on America’s British colonies. In those pages, we encountered the philosopher as a celebrity in Holland, where he lived for almost two decades and, in 1637, published his seminal “Discourse on the Method.” Descartes takes center stage in Shorto’s new book, but not in the way one might expect. The action opens in the winter of 1650, with the hapless Frenchman on his deathbed in faraway Stockholm, cursing the fate that had lured him to the Swedish court of Queen Christina. By Page 40, after an instructive synopsis of his controversial career, exit Descartes, a corpse — and enter a large, motley cast of Cartesians, determined to do right by their teacher’s ideas and by his moldering, displaced bones.

Shorto makes deft use of the centuries-long to-and-fro over Descartes’s remains, a tale that involves three different burials, events in six countries and lingering questions, partly resolved by the author himself, about the authenticity of the skeleton, or rather of its scattered parts. As it turns out, the skull of the philosopher was separated mysteriously, at an early date, from the rest of his bones. This macabre fact provides Shorto with the makings of a detective story but also with an irresistible metaphor. Descartes’s chief contribution to modern science and philosophy was his radical focus on epistemology, on defining the boundaries of what we are capable of knowing with certainty. At the center of this project was his assertion of mind-body dualism, the notion, as Shorto explains, that “the mind and its thoughts exist in a different category or somehow on a different plane from the physical world.” For his admirers and for a latter-day scientific establishment aware of its debt to him, what could be more urgent than identifying and uniting the deceased philosopher’s own head and body?

The parade of colorful figures taking part in this drawn-out effort forms the heart of Shorto’s narrative. We meet Hugues de Terlon, a militant Catholic and the French ambassador to Sweden, who in 1666 had Descartes’s bones repatriated, seeing in the philosopher’s famed “method” a superior window into God’s handiwork. Another central character is the waifish, ethereal Alexandre Lenoir, a rationalist aesthete and supporter of the French Revolution who spent the years after 1789 fighting to preserve the artistic and architectural heritage of the old regime, including the Parisian church of Ste. Geneviève, where Descartes was (ostensibly) buried. A number of early-19th-century scientific notables also play significant roles in the story, including Jöns Jacob Berzelius, the Swede who invented modern chemical notation; Jean-Baptiste-Joseph Delambre, an important contributor to the development of the metric system; and Georges Cuvier, a pioneer in comparative anatomy and paleontology.

The religious quarrels in which Descartes’s ideas embroiled both himself and his followers are too numerous to count, ranging from the character of transubstantiation in the Eucharist to the possibility that the animal kingdom might exhibit something other than the Bible’s apparent “fixity of the species.” Most of these disputes concern, in one way or another, the challenge posed by the new mechanistic science to classical notions of nature and its ends — that is, to the teleology inherited from Aristotle and codified by churchmen. But, as Shorto emphasizes, there was another side to Descartes’s project. The philosopher thought he had succeeded not in overturning the true faith but in protecting it from the crumbling edifice of ancient natural science. His mind/body distinction, Shorto notes, has long been invoked in defense of “an eternal realm of thought, belief and ideals that can’t be touched by the prying fingers of science.”

Whether Shorto himself falls into this camp is hard to say, but he offers welcome sympathy to those of us who would like to see today’s discussion of the relationship between science and religion placed on a more civil, informed footing. It is a mistake, he writes, to think that the Enlightenment “set reason firmly against faith and the two have ever since been locked in a death struggle.” Radicals among the trailblazing modern thinkers were more than equally matched by moderates who believed that “reason would function alongside faith to increase human happiness and life span, end disease, reduce suffering of all kinds and give people greater power over nature and greater freedom in their lives.” If the founders of the modern sensibility could bridge this divide, perhaps we can, too.

Shorto overreaches at times in the interest of advancing a strong thesis and weaving an engaging tale. Descartes’s influence was immense, to be sure, but it is a stretch to credit him, as Shorto does, with laying the ground for modern ideas of equality, individual rights and self-government. On the scientific side of the ledger, Shorto’s eagerness to set apart Descartes as a system-builder leads to his unfortunate assertion that the celebrated experimenters and empiricists of early modern science — Galileo, Bacon, Harvey, Kepler — initially sowed “more confusion than clarity.” Melodrama also occasionally intrudes into Shorto’s account, particularly in his sleuthing about Descartes’s skull and his speculation about the philosopher’s feelings for his working-class mistress and their illegitimate daughter.

None of this detracts much, ultimately, from Shorto’s feat of intellectual story-telling. If pressed, he would probably concede that his philosophical hero was not so single-handedly responsible for modernity; Descartes had many capable partners, even peers. But Shorto is right about certain enduring aspects of Descartes’s thought. As he observes in the book’s epilogue, in an especially eloquent passage about dualism: “We are all philosophers because our condition demands it. We live every moment in a universe of seemingly eternal thoughts and ideas, yet simultaneously in the constantly churning and decaying world of our bodies and their humble situations. . . . The result is a nagging need to find meaning.”

Gary Rosen is the chief external affairs officer of the John Templeton Foundation.

Saturday, December 20, 2008

The Politics of Hunger: How Illusion and Greed Fan the Food Crisis

PAUL COLLIER is Professor of Economics and Director of the Center for the Study of African Economies at Oxford University and the author of The Bottom Billion: Why the Poorest Countries Are Failing and What Can Be Done About It.

After many years of stability, world food prices have jumped 83 percent since 2005 -- prompting warnings of a food crisis throughout much of the world earlier this year. In the United States and Europe, the increase in food prices is already yesterday's news; consumers in the developed world now have more pressing concerns, such as the rising price of energy and the falling price of houses. But in the developing world, a food shock of this magnitude is a major political event. To the typical household in poor countries, food is the equivalent of energy in the United States, and people expect their government to do something when prices rise. Already, there have been food riots in some 30 countries; in Haiti, they brought down the prime minister. And for some consumers in the world's poorest countries, the true anguish of high food prices is only just beginning. If global food prices remain high, the consequences will be grim both ethically and politically.

Politicians and policymakers do, in fact, have it in their power to bring food prices down. But so far, their responses have been less than encouraging: beggar-thy-neighbor restrictions, pressure for yet larger farm subsidies, and a retreat into romanticism. In the first case, neighbors have been beggared by the imposition of export restrictions by the governments of food-exporting countries. This has had the immaculately dysfunctional consequence of further elevating world prices while reducing the incentives for the key producers to invest in the agricultural sector. In the second case, the subsidy hunters have, unsurprisingly, turned the crisis into an opportunity; for example, Michel Barnier, the French agricultural minister, took it as a chance to urge the European Commission to reverse its incipient subsidy-slashing reforms of the Common Agricultural Policy. And finally, the romantics have portrayed the food crisis as demonstrating the failure of scientific commercial agriculture, which they have long found distasteful. In its place they advocate the return to organic small-scale farming -- counting on abandoned technologies to feed a prospective world population of nine billion.

The real challenge is not the technical difficulty of returning the world to cheap food but the political difficulty of confronting the lobbying interests and illusions on which current policies rest. Feeding the world will involve three politically challenging steps. First, contrary to the romantics, the world needs more commercial agriculture, not less. The Brazilian model of high-productivity large farms could readily be extended to areas where land is underused. Second, and again contrary to the romantics, the world needs more science: the European ban and the consequential African ban on genetically modified (GM) crops are slowing the pace of agricultural productivity growth in the face of accelerating growth in demand. Ending such restrictions could be part of a deal, a mutual de-escalation of folly, that would achieve the third step: in return for Europe's lifting its self-damaging ban on GM products, the United States should lift its self-damaging subsidies supporting domestic biofuel.

SUPPLY-SIDE SOLUTIONS

Typically, in trying to find a solution to a problem, people look to its causes -- or, yet more fatuously, to its "root" cause. But there need be no logical connection between the cause of a problem and appropriate or even just feasible solutions to it. Such is the case with the food crisis. The root cause of high food prices is the spectacular economic growth of Asia. Asia accounts for half the world's population, and because its people are still poor, they devote much of their budgets to food. As Asian incomes rise, the world demand for food increases. And not only are Asians eating more, but they are also eating better: carbohydrates are being replaced by protein. And because it takes six kilograms of grain to produce one kilogram of beef, the switch to a protein-heavy diet further drives up demand for grain.

The two key parameters in shaping demand are income elasticity and price elasticity. The income elasticity of demand for food is generally around 0.5, meaning that if income rises by, say, 20 percent, the demand for food rises by 10 percent. (The price elasticity of demand for food is only around 0.1: that is, people simply have to eat, and they do not eat much less in response to higher prices.) Thus, if the supply of food were fixed, in order to choke off an increase in demand of 10 percent after a 20 percent rise in income, the price of food would need to double. In other words, modest increases in global income will drive prices up alarmingly unless matched by increases in supply.
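To make the arithmetic concrete, here is a minimal illustrative sketch in Python of the back-of-the-envelope calculation described above; the elasticity values are simply the rough figures quoted in the text, treated as assumptions rather than estimates.

```python
# Back-of-the-envelope illustration of the elasticity arithmetic in the text.
# The elasticity values below are the rough figures quoted above (assumptions).

income_elasticity = 0.5   # demand rises 0.5% for every 1% rise in income
price_elasticity = 0.1    # demand falls roughly 0.1% for every 1% rise in price

income_rise = 0.20        # a 20 per cent rise in incomes
extra_demand = income_elasticity * income_rise          # 0.10, i.e. 10 per cent more demand

# With supply fixed, prices must rise enough to choke off that extra demand.
required_price_rise = extra_demand / price_elasticity   # 1.00, i.e. prices must double

print(f"Extra demand from income growth: {extra_demand:.0%}")      # 10%
print(f"Price rise needed to offset it: {required_price_rise:.0%}")  # 100%
```

Run as written, the sketch reproduces the article's conclusion: a 20 per cent rise in incomes generates 10 per cent more demand, and with supply fixed and a price elasticity of only 0.1, food prices would have to rise by 100 per cent to choke that demand off.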

In recent years, the increase in demand resulting from gradually increasing incomes in Asia has instead been matched with several supply shocks, such as the prolonged drought in Australia. These shocks will only become more common with the climatic volatility that accompanies climate change. Accordingly, against a backdrop of relentlessly rising demand, supply will fluctuate more sharply as well.

Because food looms so large in the budgets of the poor, high world food prices take a severely regressive toll. Still, by no means are all of the world's poor adversely affected by expensive food. Most poor people who are farmers are largely self-sufficient. They may buy and sell food, but the rural markets in which they trade are often not well integrated into global markets and so are largely detached from the surge in prices. Where poor farmers are integrated into global markets, they are likely to benefit. But even the good news for farmers needs to be qualified. Although most poor farmers will gain most of the time, they will lose precisely when they are hardest hit: when their crops fail. The World Food Program is designed to act as the supplier of last resort to such localities. Yet its budget, set in dollars rather than bushels, buys much less when food prices surge. Paradoxically, then, the world's insurance program against localized famine is itself acutely vulnerable to global food shortages. Thus, high global food prices are good news for farmers but only in good times.

The unambiguous losers when it comes to high food prices are the urban poor. Most of the developing world's large cities are ports, and, barring government controls, the price of their food is set on the global market. Crowded in slums, the urban poor cannot grow their own food; they have no choice but to buy it. Being poor, they would inevitably be squeezed by an increase in prices, but by a cruel implication of the laws of necessity, poor people spend a far larger proportion of their budgets on food, typically around a half, in contrast to only around a tenth for high-income groups. (Hungry slum dwellers are unlikely to accept their fate quietly. For centuries, sudden hunger in slums has provoked the same response: riots. This is the classic political base for populist politics, such as Peronism in Argentina, and the food crisis may provoke its ugly resurgence.)

At the end of the food chain comes the real crunch: among the urban poor, those most likely to go hungry are children. If young children remain malnourished for more than two years, the consequence is stunted growth -- and stunted growth is not merely a physical condition. Stunted people are not just shorter than they would have been; their mental potential is impaired as well. Stunted growth is irreversible. It lasts a lifetime, and indeed, some studies find that it is passed down through the generations. And so although high food prices are yesterday's news in most of the developed world, if they remain high for the next few years, their consequences will be tomorrow's nightmare for the developing world.

In short, global food prices must be brought down, and they must be brought down fast, because their adverse consequences are so persistent. The question is how. There is nothing to be done about the root cause of the crisis -- the increasing demand for food. The solution must come from dramatically increasing world food supply. That supply has been growing for decades, more than keeping up with population growth, but it now must be accelerated, with production increasing much more rapidly than it has in recent decades. This must happen in the short term, to bring prices down from today's levels, and in the medium and long terms, since any immediate increase in supply will soon be overtaken by increased demand.

Fortunately, policymakers have the power to do all of this: by changing regulation, they can quickly generate an increase in supply; by encouraging organizational changes, they can raise the growth of production in the medium term; and by encouraging innovations in technology, they can sustain this higher growth indefinitely. But currently, each of these steps is blocked by a giant of romantic populism: all three must be confronted and slain.

THE FIRST GIANT OF ROMANTIC POPULISM

The first giant that must be slain is the middle- and upper-class love affair with peasant agriculture. With the near-total urbanization of these classes in both the United States and Europe, rural simplicity has acquired a strange allure. Peasant life is prized as organic in both its literal and its metaphoric sense. (Prince Charles is one of its leading apostles.) In its literal sense, organic agricultural production is now a premium product, a luxury brand. (Indeed, Prince Charles has his own such brand, Duchy Originals.) In its metaphoric sense, it represents the antithesis of the large, hierarchical, pressured organizations in which the middle classes now work. (Prince Charles has built a model peasant village, in traditional architectural style.) Peasants, like pandas, are to be preserved.

But distressingly, peasants, like pandas, show little inclination to reproduce themselves. Given the chance, peasants seek local wage jobs, and their offspring head to the cities. This is because at low-income levels, rural bliss is precarious, isolated, and tedious. The peasant life forces millions of ordinary people into the role of entrepreneur, a role for which most are ill suited. In successful economies, entrepreneurship is a minority pursuit; most people opt for wage employment so that others can have the worry and grind of running a business. And reluctant peasants are right: their mode of production is ill suited to modern agricultural production, in which scale is helpful. In modern agriculture, technology is fast-evolving, investment is lumpy, the private provision of transportation infrastructure is necessary to counter the lack of its public provision, consumer food fashions are fast-changing and best met by integrated marketing chains, and regulatory standards are rising toward the holy grail of the traceability of produce back to its source. Far from being the answer to global poverty, organic self-sufficiency is a luxury lifestyle. It is appropriate for burnt-out investment bankers, not for hungry families.

Large organizations are better suited to cope with investment, marketing chains, and regulation. Yet for years, global development agencies have been leery of commercial agriculture, basing their agricultural strategies instead on raising peasant production. This neglect is all the more striking given the standard account of how economic development started in Europe: the English enclosure movement, which was enabled by legislative changes, is commonly supposed to have launched development by permitting large farms that could achieve higher productivity. Although current research qualifies the conventional account, reducing the estimates of productivity gains to the range of 10-20 percent, to ignore commercial agriculture as a force for rural development and enhanced food supply is surely ideological.

Innovation, especially, is hard to generate through peasant farming. Innovators create benefits for the local economy, and to the extent that these benefits are not fully captured by the innovators, innovation will be too slow. Large organizations can internalize the effects that in peasant agriculture are localized externalities -- that is, benefits of actions that are not reflected in costs or profits -- and so not adequately taken into account in decision-making. In the European agricultural revolution, innovations occurred on small farms as well as large, and today many peasant farmers, especially those who are better off and better educated, are keen to innovate. But agricultural innovation is highly sensitive to local conditions, especially in Africa, where the soils are complex and variable. One solution is to have an extensive network of publicly funded research stations with advisers who reach out to small farmers. But in Africa, this model has largely broken down, an instance of more widespread malfunctioning of the public sector. In eighteenth-century Great Britain, the innovations in small-holder agriculture were often led by networks among the gentry, who corresponded with one another on the consequences of agricultural experimentation. But such processes are far from automatic (they did not occur, for example, in continental Europe). Commercial agriculture is the best way of making innovation quicker and easier.

Over time, African peasant agriculture has fallen further and further behind the advancing commercial productivity frontier, and based on present trends, the region's food imports are projected to double over the next quarter century. Indeed, even with prices as high as they currently are, the United Nations Food and Agriculture Organization is worried that African peasants are likely to reduce production because they cannot afford the increased cost of fertilizer inputs. There are partial solutions to such problems through subsidies and credit schemes, but it should be noted that large-scale commercial agriculture simply does not face this particular problem: if output prices rise by more than input prices, production will be expanded.

A model of successful commercial agriculture is, indeed, staring the world in the face. In Brazil, large, technologically sophisticated agricultural companies have demonstrated how successfully food can be mass-produced. To give one remarkable example, the time between harvesting one crop and planting the next -- the downtime for land -- has been reduced to an astounding 30 minutes. Some have criticized the Brazilian model for displacing peoples and destroying rain forest, which has indeed happened in places where commercialism has gone unregulated. But in much of the poor world, the land is not primal forest; it is just badly farmed. Another benefit of the Brazilian model is that it can bring innovation to small farmers as well. In the "out-growing," or "contract farming," model, small farmers supply a central business. Depending on the details of crop production, sometimes this can be more efficient than wage employment.

There are many areas of the world that have good land that could be used far more productively if properly managed by large companies. Indeed, large companies, some of them Brazilian, are queuing up to manage those lands. Yet over the past 40 years, African governments have worked to scale back large commercial agriculture. At the heart of the matter is a reluctance to let land rights be marketable, and the source of this reluctance is probably the lack of economic dynamism in Africa's cities. As a result, land is still the all-important asset (there has been little investment in others). In more successful economies, land has become a minor asset, and thus the rights of ownership, although initially assigned based on political considerations, are simply extensions of the rights over other assets; as a result, they can be acquired commercially. A further consequence of a lack of urban dynamism is that jobs are scarce, and so the prospect of mass landlessness evokes political fears: the poor are safer on the land, where they are less able to cause trouble.

Commercial agriculture is not perfect. Global agribusiness is probably overly concentrated, and a sudden switch to an unregulated land market would probably have ugly consequences. But allowing commercial organizations to replace peasant agriculture gradually would raise global food supply in the medium term.

THE WAR ON SCIENCE

The second giant of romantic populism is the European fear of scientific agriculture. This has been manipulated by the agricultural lobby in Europe into yet another form of protectionism: the ban on GM crops. GM crops were introduced globally in 1996 and already are grown on around ten percent of the world's crop area, some 300 million acres. But due to the ban, virtually none of this is in Europe or Africa.

Robert Paarlberg, of Wellesley College, brilliantly anatomizes the politics of the ban in his new book, Starved for Science. After their creation, GM foods, already so disastrously named, were described as "Frankenfoods" -- sounding like a scientific experiment on consumers. Just as problematic was the fact that genetic modification had grown out of research conducted by American corporations and so provoked predictable and deep-seated hostility from the European left. Although Monsanto, the main innovator in GM-seed technology, has undertaken never to market a seed that is incapable of reproducing itself, skeptics propagated a widespread belief that farmers will be trapped into annual purchases of "terminator" seeds from a monopoly supplier. Thus were laid the political foundations for a winning coalition: onto the base of national agricultural protectionism was added the anti-Americanism of the left and the paranoia of health-conscious consumers who, in the wake of the mad cow disease outbreak in the United Kingdom in the 1990s, no longer trusted their governments' assurances. In the 12 years since the ban was introduced, in 1996, the scientific case for lifting it has become progressively more robust, but the political coalition against GM foods has only expanded.

The GM-crop ban has had three adverse effects. Most obviously, it has retarded productivity growth in European agriculture. Prior to 1996, grain yields in Europe tracked those in the United States. Since 1996, they have fallen behind by 1-2 percent a year. European grain production could be increased by around 15 percent were the ban lifted. Europe is a major cereal producer, so this is a large loss. More subtly, because Europe is out of the market for GM-crop technology, the pace of research has slowed. GM-crop research takes a very long time to come to fruition, and its core benefit, the permanent reduction in food prices, cannot fully be captured through patents. Hence, there is a strong case for supplementing private research with public money. European governments should be funding this research, but instead research is entirely reliant on the private sector. And since private money for research depends on the prospect of sales, the European ban has also reduced private research.

However, the worst consequence of the European GM-crop ban is that it has terrified African governments into themselves banning GM crops, the only exception being South Africa. They fear that if they chose to grow GM crops, they would be permanently shut out of European markets. Now, because most of Africa has banned GM crops, there has been no market for discoveries pertinent to the crops that Africa grows, and so little research -- which in turn has led to the critique that GM crops are irrelevant for Africa.

Africa cannot afford this self-denial; it needs all the help it can possibly get from genetic modification. For the past four decades, African agricultural productivity per acre has stagnated; raising production has depended on expanding the area under cultivation. But with Africa's population still growing rapidly, this option is running out, especially in light of global warming. Climate forecasts suggest that in the coming years, most of Africa will get hotter, the semiarid parts will get drier, and rainfall variability on the continent will increase, leading to more droughts. It seems likely that in southern Africa, the staple food, maize, will at some point become nonviable. Whereas for other regions the challenge of climate change is primarily about mitigating carbon emissions, in Africa it is primarily about agricultural adaptation.

It has become commonplace to say that Africa needs a green revolution. Unfortunately, the reality is that the green revolution in the twentieth century was based on chemical fertilizers, and even when fertilizer was cheap, Africa did not adopt it. With the rise in fertilizer costs, as a byproduct of high energy prices, any African green revolution will perforce not be chemical. To counter the effects of Africa's rising population and deteriorating climate, African agriculture needs a biological revolution. This is what GM crops offer, if only sufficient money is put into research. There has as yet been little work on the crops of key importance to the region, such as cassava and yams. GM-crop research is still in its infancy, still on the first generation: single-gene transfer. A gene that gives one crop an advantage is identified, isolated, and added to another crop. But even this stage offers the credible prospect of vital gains. In a new scientific review, Jennifer Thomson, of the Department of Molecular and Cell Biology at the University of Cape Town, considers the potential of GM technology for Africa. Maize, she reports, can be made more drought-resistant, buying Africa time in the struggle against climatic deterioration. Grain can be made radically more resistant to fungi, reducing the need for chemicals and cutting losses due to storage. For example, stem borer beetles cause storage losses in the range of 15-40 percent of the African maize crop; a new GM variety is resistant.

It is important to recognize that genetic modification, like commercialization, is not a magic fix for African agriculture: there is no such fix. But without it, the task of keeping Africa's food production abreast of its population growth looks daunting. Although Africa's coastal cities can be fed from global supplies, the vast African interior cannot be fed in this way other than in emergencies. Lifting the ban on GM crops, both in Africa and in Europe, is the policy that could hold down global food prices in the long term.

The final giant of romantic populism is the American fantasy that the United States can escape dependence on Arab oil by growing its own fuel -- making ethanol or other biofuels, largely from corn. There is a good case for growing fuel. But there is not a good case for generating it from American grain: the conversion of grain into ethanol uses almost as much energy as it produces. This has not stopped the American agricultural lobby from gouging out grotesquely inefficient subsidies from the government; as a result, around a third of American grain has rapidly been diverted into energy. This switch demonstrates both the superb responsiveness of the market to price signals and the shameful power of subsidy-hunting lobbying groups. If the United States wants to run on agrofuel instead of oil, then Brazilian sugar cane is the answer; it is a far more efficient source of energy than American grain. The killer evidence of political capture is the response of the U.S. government to this potential lifeline: it has actually restricted imports of Brazilian ethanol to protect American production. The sane goal of reducing dependence on Arab oil has been sacrificed to the self-serving goal of pumping yet more tax dollars into American agriculture.

Inevitably, the huge loss of grain for food caused by its diversion into ethanol has had an impact on world grain prices. Just how large an impact is controversial. An initial claim by the Bush administration was that it had raised prices by only three percent, but a study by the World Bank suggests that the effect has been much larger. If the subsidy were lifted, there would probably be a swift impact on prices: not only would the supply of grain for food increase, but the change would shift speculative expectations. This is the policy that could bring prices down in the short term.

STRIKING A DEAL

The three policies -- expanding large commercial farms, ending the GM-crop ban, and doing away with the U.S. subsidies on ethanol -- fit together both economically and politically. Lifting the ethanol subsidies would probably puncture the present ballooning of prices. The expansion of commercial farms could, over the next decade, raise world output by a further few percentage points. Both measures would buy the time needed for GM crops to deliver on their potential (the time between starting research and the mass application of its results is around 15 years). Moreover, the expansion of commercial farming in Africa would encourage global GM-crop research on Africa-suited crops, and innovations would find a ready market not so sensitive to political interference. It would also facilitate the localized adaptation of new varieties. It is not by chance that the only African country in which GM crops have not been banned is South Africa, where the organization of agriculture is predominantly commercial.

Politically, the three policies are also complementary. Homegrown energy, keeping out "Frankenfoods," and preserving the peasant way of life are all classic populist programs: they sound instantly appealing but actually do harm. They must be countered by messages of equal potency.

One such message concerns the scope for international reciprocity. Although Americans are attracted to homegrown fuel, they are infuriated by the European ban on GM crops. They see the ban for what it is: a standard piece of anti-American protectionism. Europeans, for their part, cling to the illusory comfort of the ban on high-tech crops, but they are infuriated by the American subsidies on ethanol. They see the subsidies for what they are: a greedy deflection from the core task of reducing U.S. energy profligacy. Over the past half century, the United States and Europe have learned how to cooperate. The General Agreement on Tariffs and Trade was fundamentally a deal between the United States and Europe that virtually eliminated tariffs on manufactured goods. NATO is a partnership in security. The Organization for Economic Cooperation and Development is a partnership in economic governance. Compared to the difficulties of reaching agreement in these areas, the difficulties of reaching a deal on the mutual de-escalation of recent environmental follies are scarcely daunting: the United States would agree to scrap its ethanol subsidies in return for Europe's lifting the ban on GM crops. Each side can find this deal infuriating and yet attractive. It should be politically feasible to present this to voters as better than the status quo.

How might the romantic hostility toward commercial and scientific agriculture be countered politically? The answer is to educate the vast community of concern for the poorest countries on the bitter realities of the food crisis. In both the United States and Europe, millions of decent citizens are appalled by global hunger. Each time a famine makes it to television screens, the popular response is overwhelming, and there is a large overlap between the constituency that responds to such crises and the constituency attracted by the idea of preserving organic peasant lifestyles. The cohabitation of these concerns needs to be challenged. Many people will need to agonize over their priorities. Some will decide that the vision articulated by Prince Charles is the more important one: a historical lifestyle must be preserved regardless of the consequences. But however attractive that vision, these people must come face-to-face with the prospect of mass malnutrition and stunted children and realize that the vital matter for public policy is to increase food supplies. Commercial agriculture may be irredeemably unromantic, but if it fills the stomachs of the poor, then it should be encouraged.

American environmentalists will also need to do some painful rethinking. The people most attracted to achieving energy self-sufficiency through the production of ethanol are potentially the constituency that could save the United States from its ruinous energy policies. The United States indeed needs to reduce its dependence on imported oil, but growing corn for biofuel is not the answer. Americans are quite simply too profligate when it comes to their use of energy; Europeans, themselves pretty profligate, use only half the energy per capita and yet sustain a high-income lifestyle. The U.S. tax system needs to be shifted from burdening work to discouraging energy consumption.

The mark of a good politician is the ability to guide citizens away from populism. Unless countered, populism will block the policies needed to address the food crisis. For the citizens of the United States and Europe, the continuation of high food prices will be an inconvenience, but not sufficiently so to slay the three giants on which the current strain of romantic populism rests. Properly informed, many citizens will rethink their priorities, but politicians will need to deliver these messages and forge new alliances. If food prices are not brought down fast and then kept down, slum children will go hungry, and their future lives will be impaired. Shattering a few romantic illusions is a small price to pay.

Friday, December 19, 2008

Real Utopia: Participatory Society for the 21st Century


Author: Robin Hahnel, Barbara Ehrenreich, Michael Albert, Noam Chomsky, Chris Spannos

What if we had direct control over our daily lives? What if society's defining institutions—those encompassing economics, politics, kinship, culture, community, and ecology—were based not on competition, individual ownership, and coercion, but on self-management, equity, solidarity, and diversity? Real Utopia identifies and obliterates the barriers to an egalitarian, bottom-up society, while convincingly outlining how to build it.

Instead of simply declaring "another world is possible," the writers in this collection engage with what that world would look like, how it would function, and how our commitments to just outcomes are related to the sort of institutions we maintain. Topics include: participatory economics, political vision, education, architecture, artists in a free society, environmentalism, work after capitalism, and poly-culturalism. The catchall phrase here is "participatory society"—one that is directly democratic and seeks institutional solutions to complex sociological and economic questions.

Contributors include: Michael Albert, Barbara Ehrenreich, Steve Shalom, Robin Hahnel, Marie Trigona, Noam Chomsky, Paul Burrows, Justin Podur, Tom Wetzel, Cynthia Peters, Andrej Grubacic, and Mandisi Majavu, among others.

Chris Spannos is an activist, organizer, and anti-capitalist. He is a full-time staff member with the internationally acclaimed ZNet, a web site dedicated to social change, hosting works by many of today's leading social commentators, organizers, activists, and analysts, with 300,000 users weekly. He resides in Woods Hole, MA.

"This is a spectacular book of ideas—brave, adventurous, intriguing ideas that reclaim perhaps the greatest human asset of all, political imagination, and help us realise once again that another world is indeed possible."—John Pilger, author of New Rulers of the World and Freedom Next Time

"Chris Spannos has assembled a volume of hard-hitting, thought-provoking essays which address a critical need on the Left: the creation and elaboration of new theory. Whether in agreement or disagreement, readers will be both excited and challenged by the contents of this book. So pick it up right now!"—Bill Fletcher, Jr., Co-founder of the Black Radical Congress and the Center for Labor Renewal, and former President of TransAfrica Forum

"This book captures what's best in past and most promising in future social practice; no one-size-fits-all miracles but practical suggestions and a huge and warranted display of confidence in peoples' skills and imagination. It's a compendium of healthily head-in-clouds [where the air is purer] but feet-on-ground utopias, and it reinforces our belief that the story of human emancipation is far from over."—Susan George, Board Chair of the Transnational Institute.

"This excellent book fills a huge gap in the thinking and writing about the creation of a better society. It not only outlines how such a society might be organized in theory, but also looks at concrete applications of these ideas around the world, in recent history, and in the U.S., and how we might organize to get there. This book is essential reading for all those who firmly believe that a better world is possible and who want to engage with some of the best ideas and practices for bringing about such a world."—Gregory Wilpert, author of Changing Venezuela by Taking Power: The Policies of the Chavez Presidency and editor of Venezuelanalysis.com

"Now that the idea that 'there is no alternative' has been challenged by the idea that 'another world is possible,' it behooves us to debate what that 'other world' could and should be. This book presents a coherent school of thought with provocative answers to that question—answers that go beyond the traditional shibboleths of the left."—Jeremy Brecher, historian and author of Strike!

"There comes a time in every anarchist's life when she must decide whether her value system has application in the real world or is simply an ideology of lament. For those not content with the low-mileage of the latter, Real Utopia is an inspiring interim report—collated from the four corners of the Earth—on the evolution of the complex adaptive system we commonly refer to as 'anarchism'."—Chris Hannah, Propagandhi

Introducing Biological Rhythms: A Primer on the Temporal Organization of Life, with Implications for Health, Society, Reproduction, and the Natural Environment

Author: Willard L. Koukkari, Robert B. Sothern

Introducing Biological Rhythms is a primer on the field of biological rhythms. It describes the major characteristics of these rhythms and discusses their implications and applications, citing scientific results and references throughout. The primer also includes essays that provide in-depth historical and other background information for readers interested in more specific topics or concepts. It covers a basic cross-section of the field of chronobiology clearly enough to be understood by a novice or an undergraduate student, while remaining sufficiently technical and detailed for the scientist.

Rhythms of Life: The Biological Clocks that Control the Daily Lives of Every Living Thing


Author: Russell G. Foster and Leon Kreitzman

How do birds know when it's time to migrate? Why are we more likely to suffer a heart attack in the morning than at night? Why do some plants open their flowers at the same time every day? The answers lie in biological clocks. As this book explains, biological clocks are in the genes of living things ranging from simple bacteria to people. Earth works on a 24-hour cycle of night and day, so organisms need to stay in tune with that cycle and pace their activities accordingly. Sleep, heartbeat, and body temperature changes are just a few of the functions regulated by our biological clocks.

Foster, a professor of molecular neuroscience, teams with writer Kreitzman to report on advances in research probing the many ways that biological clocks, or circadian systems, regulate life. The authors reveal the evolutionary history of circadian systems in mammals, birds, insects, fungi, and bacteria. They also discuss the benefits of understanding biological clocks: for example, clinicians are learning that certain drugs work better when administered at certain times of the day. Conversely, disrupting a person's biological rhythms with long-distance travel, artificial lighting, and even caffeinated coffee can cause discomfort and some damage. This book is a thorough analysis of a broad field.

Saturday, November 22, 2008

Honest Signals: How They Shape Our World

Author: Alex Pentland

A 'nervous system for humanity'? John Gilbey finds a sting in the tale

A slender hardback book is sitting innocuously in a pool of sunlight on my desk. With fewer than 200 pages, of which almost half are taken up by appendices, you might not expect it to carry the seeds of social revolution - but I strongly suspect that it does.

In Honest Signals: How They Shape Our World, Sandy Pentland and his research team from the Massachusetts Institute of Technology Media Lab examine the different ways we communicate within groups. This may not seem, at first glance, to be anything startling or novel. After all, anyone who has been subjected to a management training course in the past few decades will have been instructed in the importance of body language and non-verbal communication at some level, however basic.

What is innovative about this tranche of research is the development of a set of technology-based tools to enable the automated capture of behavioural information in an intensive and robust way - providing a demonstrable degree of numerical rigour in support of their conclusions and dramatically extending what is possible in experimental terms.

The concept of honest signals is explained by Pentland as those elements of communication and display that are processed by us unconsciously - or are effectively uncontrollable, or are difficult to fake - so that they can provide an intrinsically valid stream of data with which people guide conversations, meetings and decisions. He isolates four examples of honest signalling for closer study: influence, measured as the extent to which someone modifies the pattern of speaking of another person to match their own; mimicry, or the way we copy the behaviours of another through the course of a conversation through smiles, comments and nods; activity - how interested and excited you are is apparently reflected measurably in your level of activity; and consistency - Pentland suggests that consistent levels of emphasis and timing in speech indicate mental focus in the speaker, with any inconsistency leaving us open to influence from others.
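
Purely to illustrate what "measurable" means here, the sketch below (in Python) reduces two of these signals to simple statistics over speech data. It is a hypothetical toy, not Pentland's metrics or the sociometer's actual algorithms; the function names, the input format, and the formulas are assumptions made for this example.

    import statistics

    def activity(speaking_flags):
        # Toy "activity" score: the fraction of time slices in which the
        # wearer is speaking (1 = speaking, 0 = silent). Higher = more animated.
        return sum(speaking_flags) / len(speaking_flags)

    def consistency(utterance_energies):
        # Toy "consistency" score: even vocal emphasis across utterances
        # gives a value near 1; erratic emphasis pushes it toward 0.
        mean = statistics.mean(utterance_energies)
        if mean == 0:
            return 0.0
        variation = statistics.stdev(utterance_energies) / mean
        return 1.0 / (1.0 + variation)

    # A fairly animated, fairly even speaker (made-up numbers).
    print(activity([1, 1, 0, 1, 0, 1, 1, 1]))        # 0.75
    print(consistency([0.8, 0.9, 0.85, 0.8, 0.9]))   # roughly 0.94

The real devices obviously work with far richer audio and motion features, but the principle, turning unconscious behaviour into numbers, is the same.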

Critically, these honest signals can be read clearly - by people or gadgets - in a variety of environments where the finer-grained information of language and expression might be lost: think of discos, bars and crowded streets. Pentland views this as a connection to our remote past - when accurate communication around the campfire or in the depths of a forest was a key survival element.

Forming the core of the book is a discussion of the work carried out by Pentland's group over the past five years, showing the evolution of the research programme. Inventive data capture and analysis, using novel in-house developments and the manipulation of existing technologies, has enabled the mass observation of subjects, generating thousands of hours (330,000 hours is quoted) of data. In the narrative, the various data collection devices are referred to collectively as "sociometers".

The current version of the sociometer is a sophisticated but unobtrusive device that can be worn like an ID badge. It can tell how much time you spend talking face to face with a named person, carry out speech analysis to measure social signals and social context, recognise common activities by measuring body movement, determine where you are in the building, and talk to mobile phones and computer networks to exchange data and measure exactly where you are in relation to other people. Try doing that lot for a roomful of people using a clipboard and a stopwatch.

One slight disappointment is that the tools developed by the research team are not described in great technological detail in the book, so geeks like me have to slide over to the website for more information.

Most of the team's research papers are available in PDF format - but be warned, the website (http://hd.media.mit.edu) is one of the most garishly presented I have ever seen.

Pentland builds the story upwards from a fairly traditional discussion of group roles, and the interactions within teams, through the challenge of "reading" poker players - real and metaphorical - to the questions around how the power of the group can be most effectively harnessed. Strangely, the linear discussion in the main text becomes almost secondary to the material carried in the compendious appendices, which relate how each component of the argument was tested.

I found myself repeatedly flipping back and forth between chapter and support material - but with deepening interest rather than frustration. The appendices are, in many cases, mini-research papers with a hypothesis/method, results and discussion format forming a clever bridge between guidance for lay readers and the more formal expectations of a scientific readership.

The analysis and reporting of the sociometer data makes a persuasive case for the use of this technology in the understanding of social networks, but there is a lot more at stake here than that.

The technology offers, in Pentland's words, a chance to "enable a magnification of our social sense", to step outside our own behaviour and observe it dispassionately as a set of statistics. On a substantially larger scale, it potentially provides a set of techniques on which to base the development of a "social physics" - for which the author provides a convincing proof-of-concept analysis. Readers of Isaac Asimov's "Foundation" stories will be forgiven if they feel a hint of Hari Seldon's discipline of "psycho-history" at this point.

Throughout the book, the methodologies and the academic treatment of the subject are impressive, but there is - for me - a significant sting in the tail (or tale, for that matter). In developing the data-capture technology, the team has sensibly used standard components wherever possible. One of the team's main platforms has been a series of increasingly powerful mobile phones, with Linux operating systems and Bluetooth communications, for which it developed some interesting applications.

The software described in the text and in the supporting research papers is intriguing - Meeting Mediator, for example, uses the sociometer data to provide immediate feedback to enhance group collaboration, feedback that is presented on the screen of each phone, thus "encouraging" participation. My particular favourite, however, is the Jerk-O-Meter - yes, really. This application provides appropriate feedback via phone screen to someone whose interest in a phone call is perceived, by the system, to be waning.

Clever stuff, but this is surely appropriate only in the context of a development tool and an academic exercise. The problem I foresee is that it will be only a matter of time before big organisations decide to include such features by default in the phones used by their staff to provide them with daily (hourly?) reports on how much each person is perceived to be contributing to discussions, and how attentive they are to their phone conversations. Heck, you can see where I am going with this.

In the epilogue to his book, Pentland presents the case for using these developments in the creation of a "nervous system" for humanity, evolving into a structure that enables us "to engineer our societies and entire culture".

The author is obviously aware of the risks this could pose to privacy and individual liberty - and this book provides a timely wake-up call so that we can decide where the boundary lies between "possible" and "desirable".

For once - just once - please let us have that debate in public before the whole issue is dismissed as "inevitable".

The author

Alex (Sandy) Pentland is the Toshiba professor of media, arts and sciences at the Massachusetts Institute of Technology. Specialising in human-centred technology, he is also passionate about creating ventures to take the technology into the "real" world.

To that end, he has founded or co-founded half a dozen institutes and set up ten spin-off companies. His achievements have been recognised not only by his peers - who have cited him so often that he can now claim to be one of the top-cited computer scientists in the world - but also by the mainstream media. In 1997, Newsweek named him one of the 100 Americans likely to shape this century. During his career, he has taught at the University of Rochester and Stanford University and has worked for two non-profit organisations.

Wednesday, November 19, 2008

Outliers

Author: Malcolm Gladwell

Review: Tyler Cowen

The book is getting snarky reviews but if it were by an unknown, rather than by the famous Malcolm Gladwell, many people would be saying how interesting it is. The main point, in economic language, is that human talent is heterogeneous and that the talent of a particular person must mesh with the capital structure of his or her time if major success is to result. The book is best read as a supplement to Ludwig Lachmann's Capital and its Structure. The main enduring insight of both Lachmann and Gladwell is simply how much we live in a world of complementarity rather than substitutability.

Nowhere in the book does the name Dean Keith Simonton appear, nor does the phrase "multiplicative model of human success." A lot of this ground has already been covered with more rigor and empirical support, and in readable form, I might add. Everyone should read Simonton, noting that his hypotheses fare better in the arts than in politics.
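
To make the contrast concrete, here is a minimal sketch of the arithmetic intuition behind a "multiplicative" view of success as opposed to an additive one. The factor names and numbers are invented for illustration; this is not Simonton's actual model.

    import math

    # Invented factor scores on a 0-1 scale; the weak link is "opportunity".
    factors = {"talent": 0.9, "practice": 0.9, "opportunity": 0.1}

    # Additive (substitutability): a weak factor is simply averaged away.
    additive = sum(factors.values()) / len(factors)

    # Multiplicative (complementarity): a weak factor drags the product down.
    multiplicative = math.prod(factors.values())

    print(round(additive, 2))        # 0.63
    print(round(multiplicative, 3))  # 0.081

On the multiplicative reading, a missing complement can sink the whole outcome, which is the sense in which talent must mesh with the capital structure of its time.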

If you ask too much from Outliers, it will fall apart. It is too easy to find contingency in the world, and Gladwell doesn't begin to look for a theory of which contingencies are interesting and which are not. For instance, arguably Ludwig van Beethoven would not have been a great composer if:

1. An extra butterfly had died two million years ago.

2. The outcome of the Thirty Years' War had been different.

3. The Germany of his time had not had fortepianos.

4. His parents had conceived their child one second earlier.

5. Haydn had not paved the way.

#3 and #5 seem more interesting than #1 and #4 but that's because some contingencies just don't help us understand the world very much. Gladwell never gives us enough theoretical structure to see why his contingencies are the relevant ones. Simply showing the contingencies in personal histories is not, taken alone, very enlightening.

Gladwell's contingency stories skid out of control. At one point it seems the main claim is that the steady accumulation of advantages is what matters, but once you ask which advantages end up "counting," the claim collapses into tautology.

There is also a "PC" undercurrent in the book of "don't write anyone off" but if everything is so contingent on so many factors, maybe writing people off isn't such a big deal. It could go either way. It depends.

Gladwell deliberately steers us away from the contingency of genetic endowment (even for a given set of parents, which sperm got through?), but if you hold everything else fixed you can assign a very high marginal product to the genetic factor if you wish, usually up to 100 percent of a person's outcome. That mental exercise is verboten but somehow it is OK to hold the genetic endowment constant and vary some other historical factor and regard that as a meaningful contingency. See the discussion of Beethoven above, especially #4 on the list.

Gladwell descends into the swamp of contingency but he is unwilling to really live in it and take it seriously or, alternatively, to find a way out.

In reality the complementarity concept is easier to work with and also more fruitful for thinking about policy implications, or for that matter the implications for management or talent training. Success is fragile, but foster competing cultures based on clusters of talent motivated by rivalry and emulation. Don't filter out the eccentrics or the risk takers. That's about where David Hume ended up, but Gladwell never gets anywhere close.