Write a paper

Humanities

Pace University - New York

Description

Introduction:

Thesis:

Topic sentence:

Body paragraphs: Three body paragraphs are needed.

Find a QUOTE from each article and explain it, then explain the relationship between the two quotes.

Conclusion:

Unformatted Attachment Preview

Reporting Live from Tomorrow

In Alfred Hitchcock’s 1956 remake of The Man Who Knew Too Much, Doris Day sang a waltz whose final verse went like this: When I was just a child in school, I asked my teacher, “What will I try? Should I paint pictures, should I sing songs?” This was her wise reply: “Que sera, sera. Whatever will be, will be. The future’s not ours to see. Que sera, sera.”1 Now, I don’t mean to quibble with the lyricist, and I have nothing but fond memories of Doris Day, but the fact is that this is not a particularly wise reply. When a child asks for advice about which of two activities to pursue, a teacher should be able to provide more than a musical cliché. Yes, of course the future is hard to see. But we’re all heading that way anyhow, and as difficult as it may be to envision, we have to make some decisions about which futures to aim for and which to avoid. If we are prone to mistakes when we try to imagine the future, then how should we decide what to do? Even a child knows the answer to that one: We should ask the teacher. One of the benefits of being a social and linguistic animal is that we can capitalize on the experience of others rather than trying to figure everything out for ourselves. For millions of years, human beings have conquered their ignorance by dividing the labor of discovery and then communicating their discoveries to one another, which is why the average newspaper boy in Pittsburgh knows more about the universe than did Galileo, Aristotle, Leonardo,i or any of those other guys who were so smart they only needed one name. We all make ample use of this resource. If you were to write down everything you know and then go back through the list and make a check mark next to the things you know only because somebody told you, you’d develop a repetitive-motion disorder because almost everything you know is secondhand. Was Yury Gagarin the first man in space? Is croissant a French word? Are there more Chinese than North Dakotans? Does a stitch in time save nine? Most of us know the answers to these questions despite the fact that none of us actually witnessed the launching of Vostok I, personally supervised the evolution of language, hand-counted all the people in Beijing and Bismarck, or performed a fully randomized double-blind study of stitching. We know the answers because someone shared them with us. Communication is a kind of “vicarious observation”2 that allows us to learn about the world without ever leaving the comfort of our Barcaloungers. The six billion interconnected people who cover the surface of our planet constitute a leviathan with twelve billion eyes, and anything that is seen by one pair of eyes can potentially be known to the entire beast in a matter of months, days, or even minutes. The fact that we can communicate with one another about our experiences should provide a simple solution to the core problem with which this book has been concerned. Yes, our ability to imagine our future emotions is flawed — but that’s okay, because we don’t have to imagine what it would feel like to marry a lawyer, move to Texas, or eat a snail when there are so many people who have done these things and are all too happy to tell us about them. 
Teachers, neighbors, coworkers, parents, friends, lovers, children, uncles, cousins, coaches, cabdrivers, bartenders, hairstylists, dentists, advertisers — each of these folks has something to say about what it would be like to live in this future rather than that one, and at any point in time we can be fairly sure that one of these folks has actually had the experience that we are merely contemplating. Because we are the mammal that shows and tells, each of us has access to information about almost any experience we can possibly imagine — and many that we can’t. Guidance counselors tell us about the best careers, critics tell us about the best restaurants, travel agents tell us about the best vacations, and friends tell us about the best travel agents. Every one of us is surrounded by a platoon of Dear Abbys who can recount their own experiences and in so doing tell us which futures are most worth wanting. Given the overabundance of consultants, role models, gurus, mentors, yentas,ii and nosy relatives, we might expect people to do quite well when it comes to making life’s most important decisions, such as where to live, where to work, and whom to marry. And yet, the average American moves more than six times,3 changes jobs more than ten times,4 and marries more than once,5 which suggests that most of us are making more than a few poor choices. If humanity is a living library of information about what it feels like to do just about anything that can be done, then why do the people with the library cards make so many bad decisions? There are just two possibilities. The first is that a lot of the advice we receive from others is bad advice that we foolishly accept. The second is that a lot of the advice we receive from others is good advice that we foolishly reject. So which is it? Do we listen too well when others speak, or do we not listen well enough? As we shall see, the answer to that question is yes.

Super-Replicators

The philosopher Bertrand Russell once claimed that believing is “the most mental thing we do.”6 Perhaps, but it is also the most social thing we do. Just as we pass along our genes in an effort to create people whose faces look like ours, so too do we pass along our beliefs in an effort to create people whose minds think like ours. Almost any time we tell anyone anything, we are attempting to change the way their brains operate — attempting to change the way they see the world so that their view of it more closely resembles our own. Just about every assertion — from the sublime (“God has a plan for you”) to the mundane (“Turn left at the light, go two miles, and you’ll see the Dunkin’ Donuts on your right”) — is meant to bring the listener’s beliefs about the world into harmony with the speaker’s. Sometimes these attempts succeed and sometimes they fail. So what determines whether a belief will be successfully transmitted from one mind to another? The principles that explain why some genes are transmitted more successfully than others also explain why some beliefs are transmitted more successfully than others.7 Evolutionary biology teaches us that any gene that promotes its own “means of transmission” will be represented in increasing proportions in the population over time. For instance, imagine that a single gene were responsible for the complex development of the neural circuitry that makes orgasms feel so good. 
For a person having this gene, orgasms would feel … well, orgasmic. For a person lacking this gene, orgasms would feel more like sneezes — brief, noisy, physical convulsions that pay rather paltry hedonic dividends. Now, if we took fifty healthy, fertile people who had the gene and fifty healthy, fertile people who didn’t, and left them on a hospitable planet for a million years or so, when we returned we would probably find a population of thousands or millions of people, almost all of whom had the gene. Why? Because a gene that made orgasms feel good would tend to be transmitted from generation to generation simply because people who enjoy orgasms are inclined to do the thing that transmits their genes. The logic is so circular that it is virtually inescapable: Genes tend to be transmitted when they make us do the things that transmit genes. What’s more, even bad genes — those that make us prone to cancer or heart disease — can become superreplicators if they compensate for these costs by promoting their own means of transmission. For instance, if the gene that made orgasms feel delicious also left us prone to arthritis and tooth decay, that gene might still be represented in increasing proportions because arthritic, toothless people who love orgasms are more likely to have children than are limber, toothy people who do not. The same logic can explain the transmission of beliefs. If a particular belief has some property that facilitates its own transmission, then that belief tends to be held by an increasing number of minds. As it turns out, there are several such properties that increase a belief’s transmissional success, the most obvious of which is accuracy. When someone tells us where to find a parking space downtown or how to bake a cake at high altitude, we adopt that belief and pass it along because it helps us and our friends do the things we want to do, such as parking and baking. As one philosopher noted, “The faculty of communication would not gain ground in evolution unless it was by and large the faculty of transmitting true beliefs.”8 Accurate beliefs give us power, which makes it easy to understand why they are so readily transmitted from one mind to another. It is a bit more difficult to understand why inaccurate beliefs are so readily transmitted from one mind to another — but they are. False beliefs, like bad genes, can and do become super-replicators, and a thought experiment illustrates how this can happen. Imagine a game that is played by two teams, each of which has a thousand players, each of whom is linked to teammates by a telephone. The object of the game is to get one’s team to share as many accurate beliefs as possible. When players receive a message that they believe to be accurate, they call a teammate and pass it along. When they receive a message that they believe to be inaccurate, they don’t. At the end of the game, the referee blows a whistle and awards each team a point for every accurate belief that the entire team shares and subtracts one point for every inaccurate belief the entire team shares. Now, consider a contest played one sunny day between a team called the Perfects (whose members always transmit accurate beliefs) and a team called the Imperfects (whose members occasionally transmit an inaccurate belief). We should expect the Perfects to win, right? Not necessarily. In fact, there are some special circumstances under which the Imperfects will beat their pants off. 
For example, imagine what would happen if one of the Imperfect players sent the false message “Talking on the phone all day and night will ultimately make you very happy,” and imagine that other Imperfect players were gullible enough to believe it and pass it on. This message is inaccurate and thus will cost the Imperfects a point in the end. But it may have the compensatory effect of keeping more of the Imperfects on the telephone for more of the time, thus increasing the total number of accurate messages they transmit. Under the right circumstances, the costs of this inaccurate belief would be outweighed by its benefits, namely, that it led players to behave in ways that increased the odds that they would share other accurate beliefs. The lesson to be learned from this game is that inaccurate beliefs can prevail in the belief-transmission game if they somehow facilitate their own “means of transmission.” In this case, the means of transmission is not sex but communication, and thus any belief — even a false belief — that increases communication has a good chance of being transmitted over and over again. False beliefs that happen to promote stable societies tend to propagate because people who hold these beliefs tend to live in stable societies, which provide the means by which false beliefs propagate. Some of our cultural wisdom about happiness looks suspiciously like a super-replicating false belief. Consider money. If you’ve ever tried to sell anything, then you probably tried to sell it for as much as you possibly could, and other people probably tried to buy it for as little as they possibly could. All the parties involved in the transaction assumed that they would be better off if they ended up with more money rather than less, and this assumption is the bedrock of our economic behavior. Yet, it has far fewer scientific facts to substantiate it than you might expect. Economists and psychologists have spent decades studying the relation between wealth and happiness, and they have generally concluded that wealth increases human happiness when it lifts people out of abject poverty and into the middle class but that it does little to increase happiness thereafter.9 Americans who earn $50,000 per year are much happier than those who earn $10,000 per year, but Americans who earn $5 million per year are not much happier than those who earn $100,000 per year. People who live in poor nations are much less happy than people who live in moderately wealthy nations, but people who live in moderately wealthy nations are not much less happy than people who live in extremely wealthy nations. Economists explain that wealth has “declining marginal utility,” which is a fancy way of saying that it hurts to be hungry, cold, sick, tired, and scared, but once you’ve bought your way out of these burdens, the rest of your money is an increasingly useless pile of paper.10 So once we’ve earned as much money as we can actually enjoy, we quit working and enjoy it, right? Wrong. People in wealthy countries generally work long and hard to earn more money than they can ever derive pleasure from.11 This fact puzzles us less than it should. After all, a rat can be motivated to run through a maze that has a cheesy reward at its end, but once the little guy is all topped up, then even the finest Stilton won’t get him off his haunches. Once we’ve eaten our fill of pancakes, more pancakes are not rewarding, hence we stop trying to procure and consume them. But not so, it seems, with money. 
As Adam Smith, the father of modern economics, wrote in 1776: “The desire for food is limited in every man by the narrow capacity of the human stomach; but the desire of the conveniences and ornaments of building, dress, equipage, and household furniture, seems to have no limit or certain boundary.”12 If food and money both stop pleasing us once we’ve had enough of them, then why do we continue to stuff our pockets when we would not continue to stuff our faces? Adam Smith had an answer. He began by acknowledging what most of us suspect anyway, which is that the production of wealth is not necessarily a source of personal happiness. In what constitutes the real happiness of human life, [the poor] are in no respect inferior to those who would seem so much above them. In ease of body and peace of mind, all the different ranks of life are nearly upon a level, and the beggar, who suns himself by the side of the highway, possesses that security which kings are fighting for.13 That sounds lovely, but if it’s true, then we’re all in big trouble. If rich kings are no happier than poor beggars, then why should poor beggars stop sunning themselves by the roadside and work to become rich kings? If no one wants to be rich, then we have a significant economic problem, because flourishing economies require that people continually procure and consume one another’s goods and services. Market economies require that we all have an insatiable hunger for stuff, and if everyone were content with the stuff they had, then the economy would grind to a halt. But if this is a significant economic problem, it is not a significant personal problem. The chair of the Federal Reserve may wake up every morning with a desire to do what the economy wants, but most of us get up with a desire to do what we want, which is to say that the fundamental needs of a vibrant economy and the fundamental needs of a happy individual are not necessarily the same. So what motivates people to work hard every day to do things that will satisfy the economy’s needs but not their own? Like so many thinkers, Smith believed that people want just one thing — happiness — hence economies can blossom and grow only if people are deluded into believing that the production of wealth will make them happy.14 If and only if people hold this false belief will they do enough producing, procuring, and consuming to sustain their economies. The pleasures of wealth and greatness … strike the imagination as something grand and beautiful and noble, of which the attainment is well worth all the toil and anxiety which we are so apt to bestow upon it… . It is this deception which rouses and keeps in continual motion the industry of mankind. It is this which first prompted them to cultivate the ground, to build houses, to found cities and commonwealths, and to invent and improve all the sciences and arts, which ennoble and embellish human life; which have entirely changed the whole face of the globe, have turned the rude forests of nature into agreeable and fertile plains, and made the trackless and barren ocean a new fund of subsistence, and the great high road of communication to the different nations of the earth.15 In short, the production of wealth does not necessarily make individuals happy, but it does serve the needs of an economy, which serves the needs of a stable society, which serves as a network for the propagation of delusional beliefs about happiness and wealth. 
Economies thrive when individuals strive, but because individuals will only strive for their own happiness, it is essential that they mistakenly believe that producing and consuming are routes to personal well-being. Although words such as delusional may seem to suggest some sort of shadowy conspiracy orchestrated by a small group of men in dark suits, the belief-transmission game teaches us that the propagation of false beliefs does not require that anyone be trying to perpetrate a magnificent fraud on an innocent populace. There is no cabal at the top, no star chamber,iii no master manipulator whose clever program of indoctrination and propaganda has duped us all into believing that money can buy us love. Rather, this particular false belief is a super-replicator because holding it causes us to engage in the very activities that perpetuate it.16 The belief-transmission game explains why we believe some things about happiness that simply aren’t true. The joy of money is one example. The joy of children is another that for most of us hits a bit closer to home. Every human culture tells its members that having children will make them happy. When people think about their offspring — either imagining future offspring or thinking about their current ones — they tend to conjure up images of cooing babies smiling from their bassinets, adorable toddlers running higgledy-piggledy across the lawn, handsome boys and gorgeous girls playing trumpets and tubas in the school marching band, successful college students going on to have beautiful weddings, satisfying careers, and flawless grandchildren whose affections can be purchased with candy. Prospective parents know that diapers will need changing, that homework will need doing, and that orthodontists will go to Aruba on their life savings, but by and large, they think quite happily about parenthood, which is why most of them eventually leap into it. When parents look back on parenthood, they remember feeling what those who are looking forward to it expect to feel. Few of us are immune to these cheery contemplations. I have a twenty-nine-year-old son, and I am absolutely convinced that he is and always has been one of the greatest sources of joy in my life, having only recently been eclipsed by my two-year-old granddaughter, who is equally adorable but who has not yet asked me to walk behind her and pretend we’re unrelated. When people are asked to identify their sources of joy, they do just what I do: They point to their kids. Yet if we measure the actual satisfaction of people who have children, a very different story emerges… . couples generally start out quite happy in their marriages and then become progressively less satisfied over the course of their lives together, getting close to their original levels of satisfaction only when their children leave home.17 Despite what we read in the popular press, the only known symptom of “empty nest syndrome” is increased smiling.18 Interestingly, this pattern of satisfaction over the life cycle describes women (who are usually the primary caretakers of children) better than men.19 Careful studies of how women feel as they go about their daily activities show that they are less happy when taking care of their children than when eating, exercising, shopping, napping, or watching television.20 Indeed, looking after the kids appears to be only slightly more pleasant than doing housework. None of this should surprise us. 
Every parent knows that children are a lot of work — a lot of really hard work — and although parenting has many rewarding moments, the vast majority of its moments involve dull and selfless service to people who will take decades to become even begrudgingly grateful for what we are doing. If parenting is such difficult business, then why do we have such a rosy view of it? One reason is that we have been talking on the phone all day with society’s stockholders — our moms and uncles and personal trainers — who have been transmitting to us an idea that they believe to be true but whose accuracy is not the cause of its successful transmission. “Children bring happiness” is a super-replicator. The belief-transmission network of which we are a part cannot operate without a continuously replenished supply of people to do the transmitting, thus the belief that children are a source of happiness becomes a part of our cultural wisdom simply because the opposite belief unravels the fabric of any society that holds it. Indeed, people who believed that children bring misery and despair — and who thus stopped having them — would put their belief-transmission network out of business in around fifty years, hence terminating the belief that terminated them. The Shakers were a utopian farming community that arose in the 1800s and at one time numbered about six thousand. They approved of children, but they did not approve of the natural act that creates them. Over the years, their strict belief in the importance of celibacy caused their network to contract, and today there are just a few elderly Shakers left, transmitting their doomsday belief to no one but themselves. The belief-transmission game is rigged so that we must believe that children and money bring happiness, regardless of whether such beliefs are true. This doesn’t mean that we should all now quit our jobs and abandon our families. Rather, it means that while we believe we are raising children and earning paychecks to increase our share of happiness, we are actually doing these things for reasons beyond our ken. We are nodes in a social network that arises and falls by a logic of its own, which is why we continue to toil, continue to mate, and continue to be surprised when we do not experience all the joy we so gullibly anticipated. The Myth of Fingerprints My friends tell me that I have a tendency to point out problems without offering solutions, but they never tell me what I should do about it. In one chapter after another, I’ve described the ways in which imagination fails to provide us with accurate previews of our emotional futures. I’ve claimed that when we imagine our futures we tend to fill in, leave out, and take little account of how differently we will think about the future once we actually get there. I’ve claimed that neither personal experience nor cultural wisdom compensates for imagination’s shortcomings. I’ve so thoroughly marinated you in the foibles, biases, errors, and mistakes of the human mind that you may wonder how anyone ever manages to make toast without buttering their kneecaps. If so, you will be heartened to learn that there is a simple method by which anyone can make strikingly accurate predictions about how they will feel in the future. But you may be disheartened to learn that, by and large, no one wants to use it. Why do we rely on our imaginations in the first place? Imagination is the poor man’s wormhole. 
We can’t do what we’d really like to do — namely, travel through time, pay a visit to our future selves, and see how happy those selves are — and so we imagine the future instead of actually going there. But if we cannot travel in the dimensions of time, we can travel in the dimensions of space, and the chances are pretty good that somewhere in those other three dimensions there is another human being who is actually experiencing the future event that we are merely thinking about. Surely we aren’t the first people ever to consider a move to Cincinnati, a career in motel management, another helping of rhubarb pie, or an extramarital affair, and for the most part, those who have already tried these things are more than willing to tell us about them. It is true that when people tell us about their past experiences (“That ice water wasn’t really so cold” or “I love taking care of my daughter”), memory’s peccadilloes may render their testimony unreliable. But it is also true that when people tell us about their current experiences (“How am I feeling right now? I feel like pulling my arm out of this freezing bucket and sticking my teenager’s head in it instead!”), they are providing us with the kind of report about their subjective state that is considered the gold standard of happiness measures. If you believe (as I do) that people can generally say how they are feeling at the moment they are asked, then one way to make predictions about our own emotional futures is to find someone who is having the experience we are contemplating and ask them how they feel. Instead of remembering our past experience in order to simulate our future experience, perhaps we should simply ask other people to introspect on their inner states. Perhaps we should give up on remembering and imagining entirely and use other people as surrogates for our future selves. This idea sounds all too simple, and I suspect you have an objection to it that goes something like this: Yes, other people are probably right now experiencing the very things I am merely contemplating, but I can’t use other people’s experiences as proxies for my own because those other people are not me. Every human being is as unique as his or her fingerprints, so it won’t help me much to learn about how others feel in the situations that I’m facing. Unless these other people are my clones and have had all the same experiences I’ve had, their reactions and my reactions are bound to differ. I am a walking, talking idiosyncrasy, and thus I am better off basing my predictions on my somewhat fickle imagination than on the reports of people whose preferences, tastes, and emotional proclivities are so radically different from my own. If that’s your objection, then it is a good one — so good that it will take two steps to dismantle it. First let me prove to you that the experience of a single randomly selected individual can sometimes provide a better basis for predicting your future experience than your own imagination can. And then let me show you why you — and I — find this so difficult to believe. Finding the Solution Imagination has three shortcomings, and if you didn’t know that then you may be reading this bookiv backward. If you did know that, then you also know that imagination’s first shortcoming is its tendency to fill in and leave out without telling us… . No one can imagine every feature and consequence of a future event, hence we must consider some and fail to consider others. 
The problem is that the features and consequences we fail to consider are often quite important. You may recall the studyv in which college students were asked to imagine how they would feel a few days after their school’s football team played a game against its archrival.21 The results showed that students overestimated the duration of the game’s emotional impact because when they tried to imagine their future experience, they imagined their team winning (“The clock will hit zero, we’ll storm the field, everyone will cheer …”) but failed to imagine what they would be doing afterward (“And then I’ll go home and study for my final exams”). Because the students were focused on the game, they failed to imagine how events that happened after the game would influence their happiness. So what should they have done instead? They should have abandoned imagination altogether. Consider a study that put people in a similar predicament and then forced them to abandon their imaginations. In this study, a group of volunteers (reporters) first received a delicious prize — a gift certificate from a local ice cream parlor — and then performed a long, boring task in which they counted and recorded geometric shapes that appeared on a computer screen.22 The reporters then reported how they felt. Next, a new group of volunteers was told that they would also receive a prize and do the same boring task. Some of these new volunteers (simulators) were told what the prize was and were asked to use their imaginations to predict their future feelings. Other volunteers (surrogators) were not told what the prize was but were instead shown the report of a randomly selected reporter. Not knowing what the prize was, they couldn’t possibly use their imaginations to predict their future feelings. Instead, they had to rely on the reporter’s report. Once all the volunteers had made their predictions, they received the prize, did the long, boring task, and reported how they actually felt… . Simulators were not as happy as they thought they would be. Why? Because they failed to imagine how quickly the joy of receiving a gift certificate would fade when it was followed by a long, boring task. This is precisely the same mistake that the college-football fans made. But now look at the results for the surrogators. As you can see, they made extremely accurate predictions of their future happiness. These surrogators didn’t know what kind of prize they would receive, but they did know that someone who had received that prize had been less than ecstatic at the conclusion of the boring task. So they shrugged and reasoned that they too would feel less than ecstatic at the conclusion of the boring task — and they were right! Imagination’s second shortcoming is its tendency to project the present onto the future… . When imagination paints a picture of the future, many of the details are necessarily missing, and imagination solves this problem by filling in the gaps with details that it borrows from the present. Anyone who has ever shopped on an empty stomach, vowed to quit smoking after stubbing out a cigarette, or proposed marriage while on shore leavevi knows that how we feel now can erroneously influence how we think we’ll feel later. As it turns out, surrogation can remedy this shortcoming too. 
In one study, volunteers (reporters) ate a few potato chips and reported how much they enjoyed them.23 Next, a new group of volunteers was fed pretzels, peanut-butter cheese crackers, tortilla chips, bread sticks, and melba toast, which, as you might guess, left them thoroughly stuffed and with little desire for salty snack foods. These stuffed volunteers were then asked to predict how much they would enjoy eating a particular food the next day. Some of these stuffed volunteers (simulators) were told that the food they would eat the next day was potato chips, and they were asked to use their imaginations to predict how they would feel after eating them. Other stuffed volunteers (surrogators) were not told what the next day’s food would be but were instead shown the report of one randomly selected reporter. Because surrogators didn’t know what the next day’s food would be, they couldn’t use their imaginations to predict their future enjoyment of it and thus they had to rely on the reporter’s report. Once all the volunteers had made their predictions, they went away, returned the next day, ate some potato chips, and reported how much they enjoyed them… . Simulators enjoyed eating the potato chips more than they thought they would. Why? Because when they made their predictions they had bellies full of pretzels and crackers. But surrogators — who were equally full when they made their predictions — relied on the report of someone without a full belly and hence made much more accurate predictions. It is important to note that the surrogators accurately predicted their future enjoyment of a food despite the fact that they didn’t even know what the food was! Imagination’s third shortcoming is its failure to recognize that things will look different once they happen — in particular, that bad things will look a whole lot better… . When we imagine losing a job, for instance, we imagine the painful experience (“The boss will march into my office, shut the door behind him …”) without also imagining how our psychological immune systems will transform its meaning (“I’ll come to realize that this was an opportunity to quit retail sales and follow my true calling as a sculptor”). Can surrogation remedy this shortcoming? To find out, researchers arranged for some people to have an unpleasant experience. A group of volunteers (reporters) was told that the experimenter would flip a coin, and if it came up heads, the volunteer would receive a gift certificate to a local pizza parlor. The coin was flipped and — oh, so sorry — it came up tails and the reporters received nothing.24 The reporters then reported how they felt. Next, a new group of volunteers was told about the coin-flipping game and was asked to predict how they would feel if the coin came up tails and they didn’t get the pizza gift certificate. Some of these volunteers (simulators) were told the precise monetary value of the gift certificate, and others (surrogators) were instead shown the report of one randomly selected reporter. Once the volunteers had made their predictions, the coin was flipped and — oh, so sorry — came up tails. The volunteers then reported how they felt. Simulators felt better than they predicted they’d feel if they lost the coin flip. Why? Because simulators did not realize how quickly and easily they would rationalize the loss (“Pizza is too fattening, and besides, I don’t like that restaurant anyway”). 
But surrogators — who had nothing to go on except the report of another randomly selected individual — assumed that they wouldn’t feel too bad after losing the prize and hence made more accurate predictions.

Rejecting the Solution

This trio of studies suggests that when people are deprived of the information that imagination requires and are thus forced to use others as surrogates, they make remarkably accurate predictions about their future feelings, which suggests that the best way to predict our feelings tomorrow is to see how others are feeling today.25 Given the impressive power of this simple technique, we should expect people to go out of their way to use it. But they don’t. When an entirely new group of volunteers was told about the three situations I just described — winning a prize, eating a mystery food, or failing to receive a gift certificate — and was then asked whether they would prefer to make predictions about their future feelings based on (a) information about the prize, the food, and the certificate; or (b) information about how a randomly selected individual felt after winning them, eating them, or losing them, virtually every volunteer chose the former. If you hadn’t seen the results of these studies, you’d probably have done the same. If I offered to pay for your dinner at a restaurant if you could accurately predict how much you were going to enjoy it, would you want to see the restaurant’s menu or some randomly selected diner’s review? If you are like most people, you would prefer to see the menu, and if you are like most people, you would end up buying your own dinner. Why? Because if you are like most people, then like most people, you don’t know you’re like most people. Science has given us a lot of facts about the average person, and one of the most reliable of these facts is that the average person doesn’t see herself as average. Most students see themselves as more intelligent than the average student,26 most business managers see themselves as more competent than the average business manager,27 and most football players see themselves as having better “football sense” than their teammates.28 Ninety percent of motorists consider themselves to be safer-than-average drivers,29 and 94 percent of college professors consider themselves to be better-than-average teachers.30 Ironically, the bias toward seeing ourselves as better than average causes us to see ourselves as less biased than average too.31 As one research team concluded, “Most of us appear to believe that we are more athletic, intelligent, organized, ethical, logical, interesting, fair-minded, and healthy — not to mention more attractive — than the average person.”32 This tendency to think of ourselves as better than others is not necessarily a manifestation of our unfettered narcissism but may instead be an instance of a more general tendency to think of ourselves as different from others — often for better but sometimes for worse. 
When people are asked about generosity, they claim to perform a greater number of generous acts than others do; but when they are asked about selfishness, they claim to perform a greater number of selfish acts than others do.33 When people are asked about their ability to perform an easy task, such as driving a car or riding a bike, they rate themselves as better than others; but when they are asked about their ability to perform a difficult task, such as juggling or playing chess, they rate themselves as worse than others.34 We don’t always see ourselves as superior, but we almost always see ourselves as unique. Even when we do precisely what others do, we tend to think that we’re doing it for unique reasons. For instance, we tend to attribute other people’s choices to features of the chooser (“Phil picked this class because he’s one of those literary types”), but we tend to attribute our own choices to features of the options (“But I picked it because it was easier than economics”).35 We recognize that our decisions are influenced by social norms (“I was too embarrassed to raise my hand in class even though I was terribly confused”), but fail to recognize that others’ decisions were similarly influenced (“No one else raised a hand because no one else was as confused as I was”).36 We know that our choices sometimes reflect our aversions (“I voted for Kerry because I couldn’t stand Bush”), but we assume that other people’s choices reflect their appetites (“If Rebecca voted for Kerry, then she must have liked him”).37 The list of differences is long but the conclusion to be drawn from it is short: The self considers itself to be a very special person.38 What makes us think we’re so darned special? Three things, at least. First, even if we aren’t special, the way we know ourselves is. We are the only people in the world whom we can know from the inside. We experience our own thoughts and feelings but must infer that other people are experiencing theirs. We all trust that behind those eyes and inside those skulls, our friends and neighbors are having subjective experiences very much like our own, but that trust is an article of faith and not the palpable, self-evident truth that our own subjective experiences constitute. There is a difference between making love and reading about it, and it is the same difference that distinguishes our knowledge of our own mental lives from our knowledge of everyone else’s. Because we know ourselves and others by such different means, we gather very different kinds and amounts of information. In every waking moment we monitor the steady stream of thoughts and feelings that runs through our heads, but we only monitor other people’s words and deeds, and only when they are in our company. One reason why we seem so special, then, is that we learn about ourselves in such a special way. The second reason is that we enjoy thinking of ourselves as special. Most of us want to fit in well with our peers, but we don’t want to fit in too well.39 We prize our unique identities, and research shows that when people are made to feel too similar to others, their moods quickly sour and they try to distance and distinguish themselves in a variety of ways.40 If you’ve ever shown up at a party and found someone else wearing exactly the same dress or necktie that you were wearing, then you know how unsettling it is to share the room with an unwanted twin whose presence temporarily diminishes your sense of individuality. 
Because we value our uniqueness, it isn’t surprising that we tend to overestimate it. The third reason why we tend to overestimate our uniqueness is that we tend to overestimate everyone’s uniqueness — that is, we tend to think of people as more different from one another than they actually are. Let’s face it: All people are si milar in some ways and different in others. The psychologists, biologists, economists, and sociologists who are searching for universal laws of human behavior naturally care about the similarities, but the rest of us care mainly about the differences. Social life involves selecting particular individuals to be our sexual partners, business partners, bowling partners, and more. That task requires that we focus on the things that distinguish one person from another and not on the things that all people share, which is why personal ads are much more likely to mention the advertiser’s love of ballet than his love of oxygen. A penchant for respiration explains a great deal about human behavior — for example, why people live on land, become ill at high altitudes, have lungs, resist suffocation, love trees, and so on. It surely explains more than does a person’s penchant for ballet. But it does nothing to distinguish one person from another, and thus for ordinary folks who are in the ordinary business of selecting others for commerce, conversation, or copulation, the penchant for air is stunningly irrelevant. Individual similarities are vast, but we don’t care much about them because they don’t help us do what we are here on earth to do, namely, distinguish Jack from Jill and Jill from Jennifer. As such, these individual similarities are an inconspicuous backdrop against which a small number of relatively minor individual differences stand out in bold relief. Because we spend so much time searching for, attending to, thinking about, and remembering these differences, we tend to overestimate their magnitude and frequency, and thus end up thinking of people as more varied than they actually are. If you spent all day sorting grapes into different shapes, colors, and kinds, you’d become one of those annoying grapeophiles who talks endlessly about the nuances of flavor and the permutations of texture. You’d come to think of grapes as infinitely varied, and you’d forget that almost all of the really important information about a grape can be deduced from the simple fact of its grapehood. Our belief in the variability of others and in the uniqueness of the self is especially powerful when it comes to emotion.41 Because we can feel our own emotions but must infer the emotions of others by watching their faces and listening to their voices, we often have the impression that others don’t experience the same intensity of emotion that we do, which is why we expect others to recognize our feelings even when we can’t recognize theirs.42 This sense of emotional uniqueness starts early. When kindergarteners are asked how they and others would feel in a variety of situations, they expect to experience unique emotions (“Billy would be sad but I wouldn’t”) and they provide unique reasons for experiencing them (“I’d tell myself that the hamster was in heaven, but Billy would just cry”).43 When adults make these same kinds of predictions, they do just the same thing.44 Our mythical belief in the variability and uniqueness of individuals is the main reason why we refuse to use others as surrogates. 
After all, surrogation is only useful when we can count on a surrogate to react to an event roughly as we would, and if we believe that people’s emotional reactions are more varied than they actually are, then surrogation will seem less useful to us than it actually is. The irony, of course, is that surrogation is a cheap and effective way to predict one’s future emotions, but because we don’t realize just how similar we all are, we reject this reliable method and rely instead on our imaginations, as flawed and fallible as they may be.

Onward

Despite its watery connotation, the word hogwash refers to the feeding — and not to the bathing — of pigs. Hogwash is something that pigs eat, that pigs like, and that pigs need. Farmers provide pigs with hogwash because without it, pigs get grumpy. The word hogwash also refers to the falsehoods people tell one another. Like the hogwash that farmers feed their pigs, the hogwash that our friends and teachers and parents feed us is meant to make us happy; but unlike hogwash of the porcine variety, human hogwash does not always achieve its end. As we have seen, ideas can flourish if they preserve the social systems that allow them to be transmitted. Because individuals don’t usually feel that it is their personal duty to preserve social systems, these ideas must disguise themselves as prescriptions for individual happiness. We might expect that after spending some time in the world, our experiences would debunk these ideas, but it doesn’t always work that way. To learn from our experience we must remember it, and for a variety of reasons, memory is a faithless friend. Practice and coaching get us out of our diapers and into our britches, but they are not enough to get us out of our presents and into our futures. What’s so ironic about this predicament is that the information we need to make accurate predictions of our emotional futures is right under our noses, but we don’t seem to recognize its aroma. It doesn’t always make sense to heed what people tell us when they communicate their beliefs about happiness, but it does make sense to observe how happy they are in different circumstances. Alas, we think of ourselves as unique entities — minds unlike any others — and thus we often reject the lessons that the emotional experience of others has to teach us.

Here is the link to the article: https://www.nytimes.com/2010/08/22/magazine/22Adulthood-t.html

What Is It About 20-Somethings?

Why are so many people in their 20s taking so long to grow up? This question pops up everywhere, underlying concerns about “failure to launch” and “boomerang kids.” Two new sitcoms feature grown children moving back in with their parents — “$#*! My Dad Says,” starring William Shatner as a divorced curmudgeon whose 20-something son can’t make it on his own as a blogger, and “Big Lake,” in which a financial whiz kid loses his Wall Street job and moves back home to rural Pennsylvania. A cover of The New Yorker last spring picked up on the zeitgeist: a young man hangs up his new Ph.D. in his boyhood bedroom, the cardboard box at his feet signaling his plans to move back home now that he’s officially overqualified for a job. In the doorway stand his parents, their expressions a mix of resignation, worry, annoyance and perplexity: how exactly did this happen? It’s happening all over, in all sorts of families, not just young people moving back home but also young people taking longer to reach adulthood overall. 
It’s a development that predates the current economic doldrums, and no one knows yet what the impact will be — on the prospects of the young men and women; on the parents on whom so many of them depend; on society, built on the expectation of an orderly progression in which kids finish school, grow up, start careers, make a family and eventually retire to live on pensions supported by the next crop of kids who finish school, grow up, start careers, make a family and on and on. The traditional cycle seems to have gone off course, as young people remain untethered to romantic partners or to permanent homes, going back to school for lack of better options, traveling, avoiding commitments, competing ferociously for unpaid internships or temporary (and often grueling) Teach for America jobs, forestalling the beginning of adult life. The 20s are a black box, and there is a lot of churning in there. One-third of people in their 20s move to a new residence every year. Forty percent move back home with their parents at least once. They go through an average of seven jobs in their 20s, more job changes than in any other stretch. Twothirds spend at least some time living with a romantic partner without being married. And marriage occurs later than ever. The median age at first marriage in the early 1970s, when the baby boomers were young, was 21 for women and 23 for men; by 2009 it had climbed to 26 for women and 28 for men, five years in a little more than a generation. We’re in the thick of what one sociologist calls “the changing timetable for adulthood.” Sociologists traditionally define the “transition to adulthood” as marked by five milestones: completing school, leaving home, becoming financially independent, marrying and having a child. In 1960, 77 percent of women and 65 percent of men had, by the time they reached 30, passed all five milestones. Among 30-year-olds in 2000, according to data from the United States Census Bureau, fewer than half of the women and one-third of the men had done so. A Canadian study reported that a typical 30-year-old in 2001 had completed the same number of milestones as a 25-year-old in the early ’70s. The whole idea of milestones, of course, is something of an anachronism; it implies a lockstep march toward adulthood that is rare these days. Kids don’t shuffle along in unison on the road to maturity. They slouch toward adulthood at an uneven, highly individual pace. Some never achieve all five milestones, including those who are single or childless by choice, or unable to marry even if they wanted to because they’re gay. Others reach the milestones completely out of order, advancing professionally before committing to a monogamous relationship, having children young and marrying later, leaving school to go to work and returning to school long after becoming financially secure. Even if some traditional milestones are never reached, one thing is clear: Getting to what we would generally call adulthood is happening later than ever. But why? That’s the subject of lively debate among policy makers and academics. To some, what we’re seeing is a transient epiphenomenon, the byproduct of cultural and economic forces. To others, the longer road to adulthood signifies something deep, durable and maybe better-suited to our neurological hard-wiring. What we’re seeing, they insist, is the dawning of a new life stage — a stage that all of us need to adjust to. 
JEFFREY JENSEN ARNETT, a psychology professor at Clark University in Worcester, Mass., is leading the movement to view the 20s as a distinct life stage, which he calls “emerging adulthood.” He says what is happening now is analogous to what happened a century ago, when social and economic changes helped create adolescence — a stage we take for granted but one that had to be recognized by psychologists, accepted by society and accommodated by institutions that served the young. Similar changes at the turn of the 21st century have laid the groundwork for another new stage, Arnett says, between the age of 18 and the late 20s. Among the cultural changes he points to that have led to “emerging adulthood” are the need for more education to survive in an information-based economy; fewer entrylevel jobs even after all that schooling; young people feeling less rush to marry because of the general acceptance of premarital sex, cohabitation and birth control; and young women feeling less rush to have babies given their wide range of career options and their access to assisted reproductive technology if they delay pregnancy beyond their most fertile years. Just as adolescence has its particular psychological profile, Arnett says, so does emerging adulthood: identity exploration, instability, self-focus, feeling in-between and a rather poetic characteristic he calls “a sense of possibilities.” A few of these, especially identity exploration, are part of adolescence too, but they take on new depth and urgency in the 20s. The stakes are higher when people are approaching the age when options tend to close off and lifelong commitments must be made. Arnett calls it “the age 30 deadline.” The issue of whether emerging adulthood is a new stage is being debated most forcefully among scholars, in particular psychologists and sociologists. But its resolution has broader implications. Just look at what happened for teenagers. It took some effort, a century ago, for psychologists to make the case that adolescence was a new developmental stage. Once that happened, social institutions were forced to adapt: education, health care, social services and the law all changed to address the particular needs of 12- to 18-year-olds. An understanding of the developmental profile of adolescence led, for instance, to the creation of junior high schools in the early 1900s, separating seventh and eighth graders from the younger children in what used to be called primary school. And it led to the recognition that teenagers between 14 and 18, even though they were legally minors, were mature enough to make their own choice of legal guardian in the event of their parents’ deaths. If emerging adulthood is an analogous stage, analogous changes are in the wings. But what would it look like to extend some of the special status of adolescents to young people in their 20s? Our uncertainty about this question is reflected in our scattershot approach to markers of adulthood. People can vote at 18, but in some states they don’t age out of foster care until 21. They can join the military at 18, but they can’t drink until 21. They can drive at 16, but they can’t rent a car until 25 without some hefty surcharges. If they are full-time students, the Internal Revenue Service considers them dependents until 24; those without health insurance will soon be able to stay on their parents’ plans even if they’re not in school until age 26, or up to 30 in some states. 
Parents have no access to their child’s college records if the child is over 18, but parents’ income is taken into account when the child applies for financial aid up to age 24. We seem unable to agree when someone is old enough to take on adult responsibilities. But we’re pretty sure it’s not simply a matter of age. If society decides to protect these young people or treat them differently from fully grown adults, how can we do this without becoming all the things that grown children resist — controlling, moralizing, paternalistic? Young people spend their lives lumped into age-related clusters — that’s the basis of K-12 schooling — but as they move through their 20s, they diverge. Some 25-yearolds are married homeowners with good jobs and a couple of kids; others are still living with their parents and working at transient jobs, or not working at all. Does that mean we extend some of the protections and special status of adolescence to all people in their 20s? To some of them? Which ones? Decisions like this matter, because failing to protect and support vulnerable young people can lead them down the wrong path at a critical moment, the one that can determine all subsequent paths. But overprotecting and oversupporting them can sometimes make matters worse, turning the “changing timetable of adulthood” into a self-fulfilling prophecy. The more profound question behind the scholarly intrigue is the one that really captivates parents: whether the prolongation of this unsettled time of life is a good thing or a bad thing. With life spans stretching into the ninth decade, is it better for young people to experiment in their 20s before making choices they’ll have to live with for more than half a century? Or is adulthood now so malleable, with marriage and employment options constantly being reassessed, that young people would be better off just getting started on something, or else they’ll never catch up, consigned to remain always a few steps behind the early bloomers? Is emerging adulthood a rich and varied period for self-discovery, as Arnett says it is? Or is it just another term for self-indulgence? THE DISCOVERY OF adolescence is generally dated to 1904, with the publication of the massive study “Adolescence,” by G. Stanley Hall, a prominent psychologist and first president of the American Psychological Association. Hall attributed the new stage to social changes at the turn of the 20th century. Child-labor laws kept children under 16 out of the work force, and universal education laws kept them in secondary school, thus prolonging the period of dependence — a dependence that allowed them to address psychological tasks they might have ignored when they took on adult roles straight out of childhood. Hall, the first president of Clark University — the same place, interestingly enough, where Arnett now teaches — described adolescence as a time of “storm and stress,” filled with emotional upheaval, sorrow and rebelliousness. He cited the “curve of despondency” that “starts at 11, rises steadily and rapidly till 15 . . . then falls steadily till 23,” and described other characteristics of adolescence, including an increase in sensation seeking, greater susceptibility to media influences (which in 1904 mostly meant “flash literature” and “penny dreadfuls”) and overreliance on peer relationships. 
Hall’s book was flawed, but it marked the beginning of the scientific study of adolescence and helped lead to its eventual acceptance as a distinct stage with its own challenges, behaviors and biological profile. In the 1990s, Arnett began to suspect that something similar was taking place with young people in their late teens and early 20s. He was teaching human development and family studies at the University of Missouri, studying college-age students, both at the university and in the community around Columbia, Mo. He asked them questions about their lives and their expectations like, “Do you feel you have reached adulthood?” “I was in my early- to mid-30s myself, and I remember thinking, They’re not a thing like me,” Arnett told me when we met last spring in Worcester. “I realized that there was something special going on.” The young people he spoke to weren’t experiencing the upending physical changes that accompany adolescence, but as an age cohort they did seem to have a psychological makeup different from that of people just a little bit younger or a little bit older. This was not how most psychologists were thinking about development at the time, when the eight-stage model of the psychologist Erik Erikson was in vogue. Erikson, one of the first to focus on psychological development past childhood, divided adulthood into three stages — young (roughly ages 20 to 45), middle (about ages 45 to 65) and late (all the rest) — and defined them by the challenges that individuals in a particular stage encounter and must resolve before moving on to the next stage. In young adulthood, according to his model, the primary psychological challenge is “intimacy versus isolation,” by which Erikson meant deciding whether to commit to a lifelong intimate relationship and choosing the person to commit to. But Arnett said “young adulthood” was too broad a term to apply to a 25-year span that included both him and his college students. The 20s are something different from the 30s and 40s, he remembered thinking. And while he agreed that the struggle for intimacy was one task of this period, he said there were other critical tasks as well. Arnett and I were discussing the evolution of his thinking over lunch at BABA Sushi, a quiet restaurant near his office where he goes so often he knows the sushi chefs by name. He is 53, very tall and wiry, with clipped steel-gray hair and ice-blue eyes, an intense, serious man. He describes himself as a late bloomer, a onetime emerging adult before anyone had given it a name. After graduating from Michigan State University in 1980, he spent two years playing guitar in bars and restaurants and experimented with girlfriends, drugs and general recklessness before going for his doctorate in developmental psychology at the University of Virginia. By 1986 he had his first academic job at Oglethorpe University, a small college in Atlanta. There he met his wife, Lene Jensen, the school’s smartest psych major, who stunned Arnett when she came to his office one day in 1989, shortly after she graduated, and asked him out on a date. Jensen earned a doctorate in psychology, too, and she also teaches at Clark. She and Arnett have 10-year-old twins, a boy and a girl. Arnett spent time at Northwestern University and the University of Chicago before moving to the University of Missouri in 1992, beginning his study of young men and women in the college town of Columbia, gradually broadening his sample to include New Orleans, Los Angeles and San Francisco.
He deliberately included working-class young people as well as those who were well off, those who had never gone to college as well as those who were still in school, those who were supporting themselves as well as those whose bills were being paid by their parents. A little more than half of his sample was white, 18 percent African-American, 16 percent Asian-American and 14 percent Latino. More than 300 interviews and 250 survey responses persuaded Arnett that he was onto something new. This was the era of the Gen X slacker, but Arnett felt that his findings applied beyond one generation. He wrote them up in 2000 in American Psychologist, the first time he laid out his theory of “emerging adulthood.” According to Google Scholar, which keeps track of such things, the article has been cited in professional books and journals roughly 1,700 times. This makes it, in the world of academia, practically viral. At the very least, the citations indicate that Arnett had come up with a useful term for describing a particular cohort; at best, that he offered a whole new way of thinking about them. DURING THE PERIOD he calls emerging adulthood, Arnett says that young men and women are more self-focused than at any other time of life, less certain about the future and yet also more optimistic, no matter what their economic background. This is where the “sense of possibilities” comes in, he says; they have not yet tempered their idealistic visions of what awaits. “The dreary, dead-end jobs, the bitter divorces, the disappointing and disrespectful children . . . none of them imagine that this is what the future holds for them,” he wrote. Ask them if they agree with the statement “I am very sure that someday I will get to where I want to be in life,” and 96 percent of them will say yes. But despite elements that are exciting, even exhilarating, about being this age, there is a downside, too: dread, frustration, uncertainty, a sense of not quite understanding the rules of the game. More than positive or negative feelings, what Arnett heard most often was ambivalence — beginning with his finding that 60 percent of his subjects told him they felt like both grown-ups and not-quite-grown-ups. Some scientists would argue that this ambivalence reflects what is going on in the brain, which is also both grown-up and not-quite-grown-up. Neuroscientists once thought the brain stops growing shortly after puberty, but now they know it keeps maturing well into the 20s. This new understanding comes largely from a longitudinal study of brain development sponsored by the National Institute of Mental Health, which started following nearly 5,000 children at ages 3 to 16 (the average age at enrollment was about 10). The scientists found the children’s brains were not fully mature until at least 25. “In retrospect I wouldn’t call it shocking, but it was at the time,” Jay Giedd, the director of the study, told me. “The only people who got this right were the car-rental companies.” When the N.I.M.H. study began in 1991, Giedd said he and his colleagues expected to stop when the subjects turned 16. “We figured that by 16 their bodies were pretty big physically,” he said. But every time the children returned, their brains were found still to be changing. The scientists extended the end date of the study to age 18, then 20, then 22. The subjects’ brains were still changing even then.
Tellingly, the most significant changes took place in the prefrontal cortex and cerebellum, the regions involved in emotional control and higher-order cognitive function. As the brain matures, one thing that happens is the pruning of the synapses. Synaptic pruning does not occur willy-nilly; it depends largely on how any one brain pathway is used. By cutting off unused pathways, the brain eventually settles into a structure that’s most efficient for the owner of that brain, creating well-worn grooves for the pathways that person uses most. Synaptic pruning intensifies after rapid brain-cell proliferation during childhood and again in the period that encompasses adolescence and the 20s. It is the mechanism of “use it or lose it”: the brains we have are shaped largely in response to the demands made of them. We have come to accept the idea that environmental influences in the first three years of life have long-term consequences for cognition, emotional control, attention and the like. Is it time to place a similar emphasis, with hopes for a similar outcome, on enriching the cognitive environment of people in their 20s? N.I.M.H. scientists also found a time lag between the growth of the limbic system, where emotions originate, and of the prefrontal cortex, which manages those emotions. The limbic system explodes during puberty, but the prefrontal cortex keeps maturing for another 10 years. Giedd said it is logical to suppose — and for now, neuroscientists have to make a lot of logical suppositions — that when the limbic system is fully active but the cortex is still being built, emotions might outweigh rationality. “The prefrontal part is the part that allows you to control your impulses, come up with a long-range strategy, answer the question ‘What am I going to do with my life?’ ” he told me. “That weighing of the future keeps changing into the 20s and 30s.” Among study subjects who enrolled as children, M.R.I. scans have been done so far only to age 25, so scientists have to make another logical supposition about what happens to the brain in the late 20s, the 30s and beyond. Is it possible that the brain just keeps changing and pruning, for years and years? “Guessing from the shape of the growth curves we have,” Giedd’s colleague Philip Shaw wrote in an e-mail message, “it does seem that much of the gray matter,” where synaptic pruning takes place, “seems to have completed its most dramatic structural change” by age 25. For white matter, where insulation that helps impulses travel faster continues to form, “it does look as if the curves are still going up, suggesting continued growth” after age 25, he wrote, though at a slower rate than before. None of this is new, of course; the brains of young people have always been works in progress, even when we didn’t have sophisticated scanning machinery to chart it precisely. Why, then, is the youthful brain only now arising as an explanation for why people in their 20s are seeming a bit unfinished? Maybe there’s an analogy to be found in the hierarchy of needs, a theory put forth in the 1940s by the psychologist Abraham Maslow. According to Maslow, people can pursue more elevated goals only after their basic needs of food, shelter and sex have been met. What if the brain has its own hierarchy of needs? When people are forced to adopt adult responsibilities early, maybe they just do what they have to do, whether or not their brains are ready.
Maybe it’s only now, when young people are allowed to forestall adult obligations without fear of public censure, that the rate of societal maturation can finally fall into better sync with the maturation of the brain. Cultural expectations might also reinforce the delay. The “changing timetable for adulthood” has, in many ways, become internalized by 20-somethings and their parents alike. Today young people don’t expect to marry until their late 20s, don’t expect to start a family until their 30s, don’t expect to be on track for a rewarding career until much later than their parents were. So they make decisions about their futures that reflect this wider time horizon. Many of them would not be ready to take on the trappings of adulthood any earlier even if the opportunity arose; they haven’t braced themselves for it. Nor do parents expect their children to grow up right away — and they might not even want them to. Parents might regret having themselves jumped into marriage or a career and hope for more considered choices for their children. Or they might want to hold on to a reassuring connection with their children as the kids leave home. If they were “helicopter parents” — a term that describes heavily invested parents who hover over their children, swooping down to take charge and solve problems at a moment’s notice — they might keep hovering and problem-solving long past the time when their children should be solving problems on their own. This might, in a strange way, be part of what keeps their grown children in the limbo between adolescence and adulthood. It can be hard sometimes to tease out to what extent a child doesn’t quite want to grow up and to what extent a parent doesn’t quite want to let go. IT IS A BIG DEAL IN developmental psychology to declare the existence of a new stage of life, and Arnett has devoted the past 10 years to making his case. Shortly after his American Psychologist article appeared in 2000, he and Jennifer Lynn Tanner, a developmental psychologist at Rutgers University, convened the first conference of what they later called the Society for the Study of Emerging Adulthood. It was held in 2003 at Harvard with an attendance of 75; there have been three more since then, and last year’s conference, in Atlanta, had more than 270 attendees. In 2004 Arnett published a book, “Emerging Adulthood: The Winding Road From the Late Teens Through the Twenties,” which is still in print and selling well. In 2006 he and Tanner published an edited volume, “Emerging Adults in America: Coming of Age in the 21st Century,” aimed at professionals and academics. Arnett’s college textbook, “Adolescence and Emerging Adulthood: A Cultural Approach,” has been in print since 2000 and is now in its fourth edition. Next year he says he hopes to publish another book, this one for the parents of 20-somethings. If all Arnett’s talk about emerging adulthood sounds vaguely familiar . . . well, it should. Forty years ago, an article appeared in The American Scholar that declared “a new stage of life” for the period between adolescence and young adulthood. This was 1970, when the oldest members of the baby boom generation — the parents of today’s 20-somethings — were 24. Young people of the day “can’t seem to ‘settle down,’ ” wrote the Yale psychologist Kenneth Keniston. He called the new stage of life “youth.” Keniston’s description of “youth” presages Arnett’s description of “emerging adulthood” a generation later.
In the late ’60s, Keniston wrote that there was “a growing minority of post-adolescents [who] have not settled the questions whose answers once defined adulthood: questions of relationship to the existing society, questions of vocation, questions of social role and lifestyle.” Whereas once, such aimlessness was seen only in the “unusually creative or unusually disturbed,” he wrote, it was becoming more common and more ordinary in the baby boomers of 1970. Among the salient characteristics of “youth,” Keniston wrote, were “pervasive ambivalence toward self and society,” “the feeling of absolute freedom, of living in a world of pure possibilities” and “the enormous value placed upon change, transformation and movement” — all characteristics that Arnett now ascribes to “emerging adults.” Arnett readily acknowledges his debt to Keniston; he mentions him in almost everything he has written about emerging adulthood. But he considers the ’60s a unique moment, when young people were rebellious and alienated in a way they’ve never been before or since. And Keniston’s views never quite took off, Arnett says, because “youth” wasn’t a very good name for it. He has called the label “ambiguous and confusing,” not nearly as catchy as his own “emerging adulthood.” For whatever reason Keniston’s terminology faded away, it’s revealing to read his old article and hear echoes of what’s going on with kids today. He was describing the parents of today’s young people when they themselves were young — and amazingly, they weren’t all that different from their own children now. Keniston’s article seems a lovely demonstration of the eternal cycle of life, the perennial conflict between the generations, the gradual resolution of those conflicts. It’s reassuring, actually, to think of it as recursive, to imagine that there must always be a cohort of 20-somethings who take their time settling down, just as there must always be a cohort of 50-somethings who worry about it. KENISTON CALLED IT youth, Arnett calls it emerging adulthood; whatever it’s called, the delayed transition has been observed for years. But it can be in fullest flower only when the young person has some other, nontraditional means of support — which would seem to make the delay something of a luxury item. That’s the impression you get reading Arnett’s case histories in his books and articles, or the essays in “20 Something Manifesto,” an anthology edited by a Los Angeles writer named Christine Hassler. “It’s somewhat terrifying,” writes a 25-year-old named Jennifer, “to think about all the things I’m supposed to be doing in order to ‘get somewhere’ successful: ‘Follow your passions, live your dreams, take risks, network with the right people, find mentors, be financially responsible, volunteer, work, think about or go to grad school, fall in love and maintain personal well-being, mental health and nutrition.’ When is there time to just be and enjoy?” Adds a 24-year-old from Virginia: “There is pressure to make decisions that will form the foundation for the rest of your life in your 20s. It’s almost as if having a range of limited options would be easier.” While the complaints of these young people are heartfelt, they are also the complaints of the privileged. Julie, a 23-year-old New Yorker and contributor to “20 Something Manifesto,” is apparently aware of this. She was coddled her whole life, treated to French horn lessons and summer camp, told she could do anything. 
“It is a double-edged sword,” she writes, “because on the one hand I am so blessed with my experiences and endless options, but on the other hand, I still feel like a child. I feel like my job isn’t real because I am not where my parents were at my age. Walking home, in the shoes my father bought me, I still feel I have yet to grow up.” Despite these impressions, Arnett insists that emerging adulthood is not limited to young persons of privilege and that it is not simply a period of self-indulgence. He takes pains in “Emerging Adulthood” to describe some case histories of young men and women from hard-luck backgrounds who use the self-focus and identity exploration of their 20s to transform their lives. One of these is the case history of Nicole, a 25-year-old African-American who grew up in a housing project in Oakland, Calif. At age 6, Nicole, the eldest, was forced to take control of the household after her mother’s mental collapse. By 8, she was sweeping stores and baby-sitting for money to help keep her three siblings fed and housed. “I made a couple bucks and helped my mother out, helped my family out,” she told Arnett. She managed to graduate from high school, but with low grades, and got a job as a receptionist at a dermatology clinic. She moved into her own apartment, took night classes at community college and started to excel. “I needed to experience living out of my mother’s home in order to study,” she said. In his book, Arnett presents Nicole as a symbol of all the young people from impoverished backgrounds for whom “emerging adulthood represents an opportunity — maybe a last opportunity — to turn one’s life around.” This is the stage where someone like Nicole can escape an abusive or dysfunctional family and finally pursue her own dreams. Nicole’s dreams are powerful — one course away from an associate degree, she plans to go on for a bachelor’s and then a Ph.D. in psychology — but she has not really left her family behind; few people do. She is still supporting her mother and siblings, which is why she works full time even though her progress through school would be quicker if she found a part-time job. Is it only a grim pessimist like me who sees how many roadblocks there will be on the way to achieving those dreams and who wonders what kind of freewheeling emerging adulthood she is supposed to be having? Of course, Nicole’s case is not representative of society as a whole. And many parents — including those who can’t really afford it — continue to help their kids financially long past the time they expected to. Two years ago Karen Fingerman, a developmental psychologist at Purdue University, asked parents of grown children whether they provided significant assistance to their sons or daughters. Assistance included giving their children money or help with everyday tasks (practical assistance) as well as advice, companionship and an attentive ear. Eighty-six percent said they had provided advice in the previous month; less than half had done so in 1988. Two out of three parents had given a son or daughter practical assistance in the previous month; in 1988, only one in three had. Fingerman took solace in her findings; she said it showed that parents stay connected to their grown children, and she suspects that both parties get something out of it. The survey questions, after all, referred not only to dispensing money but also to offering advice, comfort and friendship.
And another of Fingerman’s studies suggests that parents’ sense of well-being depends largely on how close they are to their grown children and how their children are faring — objective support for the adage that you’re only as happy as your unhappiest child. But the expectation that young men and women won’t quite be able to make ends meet on their own, and that parents should be the ones to help bridge the gap, places a terrible burden on parents who might be worrying about their own job security, trying to care for their aging parents or grieving as their retirement plans become more and more of a pipe dream. This dependence on Mom and Dad also means that during the 20s the rift between rich and poor becomes entrenched. According to data gathered by the Network on Transitions to Adulthood, a research consortium supported by the John D. and Catherine T. MacArthur Foundation, American parents give an average of 10 percent of their income to their 18- to 21-year-old children. This percentage is basically the same no matter the family’s total income, meaning that upper-class kids tend to get more than working-class ones. And wealthier kids have other, less obvious, advantages. When they go to four-year colleges or universities, they get supervised dormitory housing, health care and alumni networks not available at community colleges. And they often get a leg up on their careers by using parents’ contacts to help land an entry-level job — or by using parents as a financial backup when they want to take an interesting internship that doesn’t pay. “You get on a pathway, and pathways have momentum,” Jennifer Lynn Tanner of Rutgers told me. “In emerging adulthood, if you spend this time exploring and you get yourself on a pathway that really fits you, then there’s going to be this snowball effect of finding the right fit, the right partner, the right job, the right place to live. The less you have at first, the less you’re going to get this positive effect compounded over time. You’re not going to have the same acceleration.” EVEN ARNETT ADMITS that not every young person goes through a period of “emerging adulthood.” It’s rare in the developing world, he says, where people have to grow up fast, and it’s often skipped in the industrialized world by the people who marry early, by teenage mothers forced to grow up, by young men or women who go straight from high school to whatever job is available without a chance to dabble until they find the perfect fit. Indeed, the majority of humankind would seem to not go through it at all. The fact that emerging adulthood is not universal is one of the strongest arguments against Arnett’s claim that it is a new developmental stage. If emerging adulthood is so important, why is it even possible to skip it? “The core idea of classical stage theory is that all people — underscore ‘all’ — pass through a series of qualitatively different periods in an invariant and universal sequence in stages that can’t be skipped or reordered,” Richard Lerner, Bergstrom chairman in applied developmental science at Tufts University, told me. Lerner is a close personal friend of Arnett’s; he and his wife, Jacqueline, who is also a psychologist, live 20 miles from Worcester, and they have dinner with Arnett and his wife on a regular basis. “I think the world of Jeff Arnett,” Lerner said. 
“I think he is a smart, passionate person who is doing great work — not only a smart and productive scholar, but one of the nicest people I ever met in my life.” No matter how much he likes and admires Arnett, however, Lerner says his friend has ignored some of the basic tenets of developmental psychology. According to classical stage theory, he told me, “you must develop what you’re supposed to develop when you’re supposed to develop it or you’ll never adequately develop it.” When I asked Arnett what happens to people who don’t have an emerging adulthood, he said it wasn’t necessarily a big deal. They might face its developmental tasks — identity exploration, self-focus, experimentation in love, work and worldview — at a later time, maybe as a midlife crisis, or they might never face them at all, he said. It depends partly on why they missed emerging adulthood in the first place, whether it was by circumstance or by choice. No, said Lerner, that’s not the way it works. To qualify as a developmental stage, emerging adulthood must be both universal and essential. “If you don’t develop a skill at the right stage, you’ll be working the rest of your life to develop it when you should be moving on,” he said. “The rest of your development will be unfavorably altered.” The fact that Arnett can be so casual about the heterogeneity of emerging adulthood and its existence in some cultures but not in others — indeed, even in some people but not in their neighbors or friends — is what undermines, for many scholars, his insistence that it’s a new life stage. Why does it matter? Because if the delay in achieving adulthood is just a temporary aberration caused by passing social mores and economic gloom, it’s something to struggle through for now, maybe feeling a little sorry for the young people who had the misfortune to come of age in a recession. But if it’s a true life stage, we need to start rethinking our definition of normal development and to create systems of education, health care and social supports that take the new stage into account. The Network on Transitions to Adulthood has been issuing reports about young people since it was formed in 1999 and often ends up recommending more support for 20-somethings. But more of what, exactly? There aren’t institutions set up to serve people in this specific age range; social services from a developmental perspective tend to disappear after adolescence. But it’s possible to envision some that might address the restlessness and mobility that Arnett says are typical at this stage and that might make the experimentation of “emerging adulthood” available to more young people. How about expanding programs like City Year, in which 17- to 24-year-olds from diverse backgrounds spend a year mentoring inner-city children in exchange for a stipend, health insurance, child care, cellphone service and a $5,350 education award? Or a federal program in which a government-sponsored savings account is created for every newborn, to be cashed in at age 21 to support a year’s worth of travel, education or volunteer work — a version of the “baby bonds” program that Hillary Clinton mentioned during her 2008 primary campaign? Maybe we can encourage a kind of socially sanctioned “rumspringa,” the temporary moratorium from social responsibilities some Amish offer their young people to allow them to experiment before settling down.
It requires only a bit of ingenuity — as well as some societal forbearance and financial commitment — to think of ways to expand some of the programs that now work so well for the elite, like the Fulbright fellowship or the Peace Corps, to make the chance for temporary service and self-examination available to a wider range of young people. A century ago, it was helpful to start thinking of adolescents as engaged in the work of growing up rather than as merely lazy or rebellious. Only then could society recognize that the educational, medical, mental-health and social-service needs of this group were unique and that investing in them would have a payoff in the future. Twenty-somethings are engaged in work, too, even if it looks as if they are aimless or failing to pull their weight, Arnett says. But it’s a reflection of our collective attitude toward this period that we devote so few resources to keeping them solvent and granting them some measure of security. THE KIND OF SERVICES that might be created if emerging adulthood is accepted as a life stage can be seen during a visit to Yellowbrick, a residential program in Evanston, Ill., that calls itself the only psychiatric treatment facility for emerging adults. “Emerging adults really do have unique developmental tasks to focus on,” said Jesse Viner, Yellowbrick’s executive medical director. Viner started Yellowbrick in 2005, when he was working in a group psychiatric practice in Chicago and saw the need for a different way to treat this cohort. He is a soft-spoken man who looks like an accountant and sounds like a New Age prophet, peppering his conversation with phrases like “helping to empower their agency.” “Agency” is a tricky concept when parents are paying the full cost of Yellowbrick’s comprehensive residential program, which comes to $21,000 a month and is not always covered by insurance. Staff members are aware of the paradox of encouraging a child to separate from Mommy and Daddy when it’s on their dime. They address it with a concept they call connected autonomy, which they define as knowing when to stand alone and when to accept help. Patients come to Yellowbrick with a variety of problems: substance abuse, eating disorders, depression, anxiety or one of the more severe mental illnesses, like schizophrenia or bipolar disorder, that tend to appear in the late teens or early 20s. The demands of imminent independence can worsen mental-health problems or can create new ones for people who have managed up to that point to perform all the expected roles — son or daughter, boyfriend or girlfriend, student, teammate, friend — but get lost when schooling ends and expected roles disappear. That’s what happened to one patient who had done well at a top Ivy League college until the last class of the last semester of his last year, when he finished his final paper and could not bring himself to turn it in. The Yellowbrick philosophy is that young people must meet these challenges without coddling or rescue. Up to 16 patients at a time are housed in the Yellowbrick residence, a four-story apartment building Viner owns. They live in the apartments — which are large, sunny and lavishly furnished — in groups of three or four, with staff members always on hand to teach the basics of shopping, cooking, cleaning, scheduling, making commitments and showing up. Viner let me sit in on daily clinical rounds, scheduled that day for C., a young woman who had been at Yellowbrick for three months.
Rounds are like the world’s most grueling job interview: the patient sits in front alongside her clinician “advocate,” and a dozen or so staff members are arrayed on couches and armchairs around the room, firing questions. C. seemed nervous but pleased with herself, frequently flashing a huge white smile. She is 22, tall and skinny, and she wore tiny denim shorts and a big T-shirt and vest. She started to fall apart during her junior year at college, plagued by binge drinking and anorexia, and in her first weeks at Yellowbrick her alcohol abuse continued. Most psychiatric facilities would have kicked her out after the first relapse, said Dale Monroe-Cook, Yellowbrick’s vice president of clinical operations. “We’re doing the opposite: we want the behavior to unfold, and we want to be there in that critical moment, to work with that behavior and help the emerging adult transition to greater independence.” The Yellowbrick staff let C. face her demons and decide how to deal with them. After five relapses, C. asked the staff to take away her ID so she couldn’t buy alcohol. Eventually she decided to start going to meetings of Alcoholics Anonymous. At her rounds in June, C. was able to report that she had been alcohol-free for 30 days. Jesse Viner’s wife, Laura Viner, who is a psychologist on staff, started to clap for her, but no one else joined in. “We’re on eggshells here,” Gary Zurawski, a clinical social worker specializing in substance abuse, confessed to C. “We don’t know if we should congratulate you too much.” The staff was sensitive about taking away the young woman’s motivation to improve her life for her own sake, not for the sake of getting praise from someone else. C. took the discussion about the applause in stride and told the staff she had more good news: in two days she was going to graduate. On time. THE 20S ARE LIKE the stem cell of human development, the pluripotent moment when any of several outcomes is possible. Decisions and actions during this time have lasting ramifications. The 20s are when most people accumulate almost all of their formal education; when most people meet their future spouses and the friends they will keep; when most people start on the careers that they will stay with for many years. This is when adventures, experiments, travels, relationships are embarked on with an abandon that probably will not happen again. Does that mean it’s a good thing to let 20-somethings meander — or even to encourage them to meander — before they settle down? That’s the question that plagues so many of their parents. It’s easy to see the advantages to the delay. There is time enough for adulthood and its attendant obligations; maybe if kids take longer to choose their mates and their careers, they’ll make fewer mistakes and live happier lives. But it’s just as easy to see the drawbacks. As the settling-down sputters along for the “emerging adults,” things can get precarious for the rest of us. Parents are helping pay bills they never counted on paying, and social institutions are missing out on young people contributing to productivity and growth. Of course, the recession complicates things, and even if every 20-something were ready to skip the “emerging” moratorium and act like a grown-up, there wouldn’t necessarily be jobs for them all. So we’re caught in a weird moment, unsure whether to allow young people to keep exploring and questioning or to cut them off and tell them just to find something, anything, to put food on the table and get on with their lives. 
Arnett would like to see us choose a middle course. “To be a young American today is to experience both excitement and uncertainty, wide-open possibility and confusion, new freedoms and new fears,” he writes in “Emerging Adulthood.” During the timeout they are granted from nonstop, often tedious and dispiriting responsibilities, “emerging adults develop skills for daily living, gain a better understanding of who they are and what they want from life and begin to build a foundation for their adult lives.” If it really works that way, if this longer road to adulthood really leads to more insight and better choices, then Arnett’s vision of an insightful, sensitive, thoughtful, content, well-honed, self-actualizing crop of grown-ups would indeed be something worth waiting for. Robin Marantz Henig is a contributing writer. Her last article for the magazine was about anxiety.
Explanation & Answer

Please find the attachment.

SURNAME 1
Name
Instructor’s name
Course code
Date
Article Analysis and Comparison
The social revolution of the modern age has been both disturbing and exhilarating because of its wide-ranging impacts. Observers have examined the effects of social change on individual social groups, yet it remains to be seen how these changes affect different age groups. For instance, one of the articles describes how challenging it has become for modern youth to mature and embrace adulthood and its inherent responsibilities. Young people are no longer committing to relationships, investing in careers, or starting families as early as past generations did. Millennials are averse to these responsibilities and look for ways to postpone adulthood, living with their parents or romantic partners and remaining childless. All this leads to the question: when will 20-somethings mature? The other article looks at the ways beliefs are transmitted in society. The idea is that we should take an analytical approach to how beliefs become part of human society and to the role they play in forming mental patterns. What makes one person believe in an entity or factor, and how is that belief then transmitted to society? The article argues that believing is a ‘menta...

