The oldest known stone-tipped projectiles have been discovered in Ethiopia. The javelins are roughly 280,000 years old and predate the earliest known fossils of our species, Homo sapiens, by about 80,000 years.
These javelins are some 200,000 years older than previous examples of similar weapons, suggesting that modern humans and their extinct relatives had the know-how to create these sorts of complex thrown projectiles much earlier than often thought.
Scientists investigated stone tools unearthed at the Gademotta Formation on the flanks of an ancient, large collapsed volcanic crater in central Ethiopia’s Rift Valley….
"We were only interested in testing the hypothesis that these tools were definitely used to tip spears," Sahle said. "The eureka came much later as we did the analysis and found out that the features we were dealing with were the result of throwing impact, not thrusting."
When pointed artifacts are used as weapons, V-shaped fractures, called fracture wings, can form at the moment of impact; the apexes mark where the cracks started. Past experiments in materials such as obsidian have shown that the narrower the V-shapes of fracture wings, the higher the speed of the fracturing that created them.
The researchers found that the fracture wings seen on a dozen of these obsidian points indicate cracks that spread faster than 1,820 miles an hour (2,930 kilometers an hour), the maximum fracture velocity seen in experiments with thrusting spears. And some of these artifacts apparently developed fractures at impact speeds of up to 3,345 miles an hour (5,385 kilometers an hour), close to the maximum fracture velocity seen with thrown spears.
A number of these artifacts are among the oldest at the site, suggesting that javelins were used as early as 279,000 years ago. Such weapons are considered signs of complex behavior and were pivotal to the spread of modern humans.
"The implication is that certain behavioral traits that are considered complex and mostly only the domains of anatomically modern humans—such as the capacity to make and use projectiles—were not only incorporated into the technological repertoire of the African early Homo sapiens, but also had earlier roots and were present in populations ancestral to Homo sapiens,” Sahle said.
According to an article published in the journal Motivation and Emotion, there are five types of boredom—which is one more than the research team expected to identify. The boredom varieties range from a calm and pleasant experience to something more like depression….
Indifferent boredom: This is a pleasant form of boredom, said Goetz, giving as an example a student who has had a really long day. “You go to a class, you are tired, and the class is boring. However, the boredom is experienced as rather relaxing and even positive. It is still boredom, but you like being bored.” Another example? Zoning out on the couch in front of a marathon of trashy reality TV.
Calibrating boredom: Do you let your thoughts wander? If you are open to new ideas but don’t feel any motivation to actually get up and do something, that’s calibrating boredom. “It is like daydreaming,” said Goetz, “but not actively searching for new actions.”
One of the most exciting findings to emerge from neuroscience in recent years underlines the brain’s inherently social nature. When neuroscientists monitor what’s going on in someone’s brain, they are typically interested in what happens in it when people are involved in an active task, like doing a math problem or reaching for a ball. But neuroscientists have looked more closely at what the brain does during non-active moments, when we’re chilling out and the brain is at rest. Every time we are not engaged in an active task—like when we take a break between two math problems—the brain falls into a neural configuration called the “default network.” When you have down time, even if it’s just for a second, this brain system comes on automatically.
What’s remarkable about the default network, according to Lieberman’s research, is that it looks almost identical to another brain configuration—the one used for social thinking or “making sense of other people and ourselves,” as he writes: “The default network directs us to think about other people’s minds—their thoughts, feelings, and goals.” Whenever it has a free moment, the human brain has an automatic reflex to go social. Why would the brain, which forms only 2 percent of our body weight but consumes 20 percent of its energy, use its limited resources on social thinking, rather than conserving its energy by relaxing?
“Evolution has made a bet,” Lieberman tells me, “that the best thing for our brain to do in any spare moment is to get ready for what comes next in social terms.”
Evolution only makes bets if there are payoffs—and when it comes to being social, there are many benefits. Having strong social bonds is as good for you as quitting smoking. Connecting with other people, even in the most basic ways, also makes you happier—especially when you know they need your help.
Oakley defines pathological altruism as “altruism in which attempts to promote the welfare of others instead result in unanticipated harm.” A crucial qualification is that while the altruistic actor fails to anticipate the harm, “an external observer would conclude [that it] was reasonably foreseeable.” Thus, she explains, if you offer to help a friend move, then accidentally break an expensive item, your altruism probably isn’t pathological; whereas if your brother is addicted to painkillers and you help him obtain them, it is.
As the latter example suggests, the idea of “codependency” is a subset of pathological altruism. “Feelings of empathic caring . . . appear to lie at the core of . . . codependent behavior,” Oakley notes. People in codependent relationships genuinely care for each other, but that empathy leads them to do destructive things.
Yet according to Oakley, “the vital topic of codependency has received almost no hard-science research focus, leaving ‘research’ to those with limited or no scientific research qualifications.” That is to say, it is largely the domain of pop psychology. “It is reasonable to wonder if the lack of scientific research involving codependency may relate to the fact that there is a strong academic bias against studying possible negative outcomes of empathy.”
In an epic case of unintended consequences, government-mandated anti-bullying programs are actually increasing bullying by teaching kids how to bully, according to a new study published in the Journal of Criminology:
The study concluded that students at schools with anti-bullying programs might actually be more likely to become victims of bullying. It also found that students at schools with no such programs were less likely to become victims.
The results were stunning for Jeong [the author]. “Usually people expect an anti-bullying program to have some impact—some positive impact.”
The student videos used in many campaigns show examples of bullying and how to intervene. But Jeong says they may actually teach students different bullying techniques—and even educate about new ways to bully through social media and texting.
America’s lockups are its new asylums. After scores of state mental institutions were closed beginning in the 1970s, few alternatives materialized. Many of the afflicted wound up on the streets, where, untreated, they became more vulnerable to joblessness, drug abuse and crime.
The country’s three biggest jail systems—Cook County, in Illinois; Los Angeles County; and New York City—are on the front lines. With more than 11,000 prisoners under treatment on any given day, they represent by far the largest mental-health treatment facilities in the country. By comparison, the three largest state-run mental hospitals have a combined 4,000 beds.
Put another way, the number of mentally ill prisoners the three facilities handle daily is equal to 28% of all beds in the nation’s 213 state psychiatric hospitals, according to the National Association of State Mental Health Program Directors Research Institute Inc.
"In every city and state I have visited, the jails have become the de facto mental institutions," says Esteban Gonzalez, president of the American Jail Association, an organization for jail employees.
Correctional systems define mental illness differently. Generally, the term is used to describe prisoners who require medication for serious issues ranging from major depressive disorders to schizophrenia and bipolar disorders. Also included are inmates with diagnoses that warrant overnight stays in a mental hospital or who demonstrate serious functional impairment.
To get a snapshot of how the U.S. is grappling with such an explosive societal issue, The Wall Street Journal surveyed all 50 states about mental health within their prison populations. In the 22 states that provided detailed responses, the ratio of mentally ill inmates ranged from one in 10 to one in two. Inmates in all 23 responding states account for 55% of the prisoners in the U.S. under state jurisdiction….
The picture echoes the past. Two centuries ago, reformers were disturbed to find large numbers of the mentally ill in jails, paving the way for the development of state-run institutions. In the 1950s and 1960s, complaints about abuses, advances in medication and a push to give the patients more independence led to another change, this time toward community settings. The weaknesses of that concept—a lack of facilities, barriers created by privacy laws and tightened local and state funding—have brought the picture full circle.
"Society was horrified to warehouse people in state hospitals, but we have no problem with warehousing them in jails and prisons," says Thomas Dart, sheriff of Cook County.
Then there are the supposedly high rates of suicide, post-traumatic stress and sexual aggression, all of which tempt one to regard the military itself as a dehumanizing institution in need of therapeutic intervention.
Soldiers, in this view, are no longer seen as models of self-control, courage and patriotism. Instead they are victims and should be treated as patients. Yet the links between combat, the military and mental health are more complex than the war-as-disease construct allows.
Begin with suicides by servicemen and women, which have increased in recent years—but by dozens of deaths, not in the epidemic fashion that news coverage sometimes seems to suggest. That said, the 349 military suicides in 2012 did exceed the 295 deaths of U.S. soldiers in Afghanistan. The question is: why?
A major study published this month in the Journal of the American Medical Association found that factors such as substance abuse, depression, financial and relationship problems accounted for the rise in soldier suicides—in other words, the same factors that influence civilians to take their own lives. “The findings from this study,” the authors concluded, “are not consistent with the assumption that specific deployment-related characteristics, such as length of deployment, number of deployments, or combat experiences, are directly associated with increased suicide risk.”
Nor does the rate of military suicides differ significantly from suicides in the general population….
Combat stress is a complex phenomenon. But research has confirmed what military commanders have long known: It is possible to identify those who are most prone to stress problems, and that has more to do with nonmilitary issues—again, substance abuse, money and family problems are the culprits—than with the experience of combat or deployment to a war zone.
Compared with other countries, the United States diagnoses PTSD cases at improbably high rates….
[T]he numbers bandied about to show an epidemic of sexual violence in the U.S. military are questionable. In May, Capt. Lindsay Rodman, a judge advocate stationed at U.S. Marine Headquarters in Arlington, Va., reported on this page, for example, that the number of military sexual assaults frequently cited in Congress and elsewhere is based on a badly distorted interpretation of a Defense Department survey. In recent months the American public has often heard that 26,000 service members were sexually assaulted last year. But that statistic comes from an unscientific poll and refers to “unwanted sexual contact,” including touching the buttocks or even attempted touching.
Moreover, as Gail Heriot, a law professor at the University of San Diego and a member of the U.S. Civil Rights Commission, wrote recently in the Weekly Standard, “there is no evidence that the military has a higher rate of sexual assault than, say, colleges and universities. Indeed, what paltry evidence there is suggests the opposite.”
… [W]ar demands unflinching discipline, courage and loyalty in the presence of our deepest animal passions, and in that sense it is anything but dehumanizing. By regarding soldiers, sometimes condescendingly, as victims and patients, we are in danger of foisting our own, very civilian and very modern, therapeutic pathologies on people who don’t need them and whose ability to do their jobs—that is, keep us safe—is likely to be diminished.
Imagine we rewound the tape of your life. Your diplomas are pulled off of walls, unframed, and returned. Your children grow smaller, and then vanish. Soon, you too become smaller. Your adult teeth retract, your baby teeth return, and your traits and foibles start to slip away. Once language goes, you are not so much you as potential you. We keep rewinding still, until we’re halving and halving a colony of cells, finally arriving at that amazing singularity: the cell that will become you.
The question, of course, is what happens when we press “play” again. Are your talents, traits, and insecurities so deeply embedded in your genes that they’re basically inevitable? Or could things go rather differently with just a few tiny nudges? In other words, how much of your fate do you allot to your genes, versus your surroundings, versus chance? This is navel gazing that matters.
In the absence of a time rewinder, the next best experiment is to do what Julia Freund and her colleagues did in a simple, yet remarkable recent study. These investigators placed genetically identical individuals (mice in this case) in a common environment, and asked whether systematic behavioral differences could still develop between them. An answer of “Yes” would mean that there are sources of behavioral variability – “individuality,” if you will – that aren’t accounted for by the combination of genes and common environment.
A key distinction to make is “tastes” versus “flavors.” In terms of taste—the perceptions of salty, sweet, savory, bitter, sour—humans essentially share the same innate preferences the world over. John Prescott notes in his book Taste Matters, “The sweet taste of sucrose in water…is optimally pleasant at around 10-12 per cent by weight (approximately the same as is found in many ripe fruits), regardless of whether you are from Japan, Taiwan or Australia.”
But we do not eat tastes, we eat flavors, and what makes us like flavors, says Small, is “flavor nutrient conditioning.” The upside of this process, she says, “is that we can learn to like the foods that are available to us, and avoid particular foods rather than entire classes of nutrients.” Such learning involves a complex chain of activity in the brain, all oriented around understanding what Small calls “flavor objects.” “Our brain and our behavior are geared toward learning about the object—strawberry, for example—rather than its various components. Did this food make me sick? Did this food give me energy? You learn preferences based on the entire flavor object.” Coffee, for example, is just as bitter the 1,000th time we drink it as the first, but, Small notes, “it becomes coffee. The brain has learned that coffee is not a potentially harmful signal.”
In recently presented work, Small is trying to understand, neurologically, how physiological factors can influence the way we eat: “When does the moment kick in where you like it?” Experimental subjects are exposed to novel flavors that have no calories; over a few weeks, one of the flavors has caloric (but tasteless) maltodextrin added. The “post-oral signal” coming from the gut—which is happily converting the maltodextrin into glucose—can, she suggests, alter the response to a flavor. “These post-ingestive signals are getting into the reward circuits” of the brain, “altering the way reward circuits process the flavor, and doing that quite independently of liking,” she says. In short, our liking grows without our quite knowing why.
Leave quibbling of every kind to lawyers pleading at the bar for the life of a culprit; in society and conversation it is invariably out of place, unless when Laughter is going his merry round. At all other times it is a proof of bad breeding….
Cheerfulness, unaffected cheerfulness, a sincere desire to please and be pleased, unchecked by any efforts to shine, are the qualities you must bring with you into society, if you wish to succeed in conversation. … a light and airy equanimity of temper,—that spirit which never rises to boisterousness, and never sinks to immovable dullness; that moves gracefully from “grave to gay, from serious to serene,” and by mere manner gives proof of a feeling heart and generous mind.
Martine’s Hand-book of Etiquette, and Guide to True Politeness, 1866. I read etiquette handbooks at an impressionable age, and ever since have had a vision of society and proper behavior that is out of step with the modern age. (via Brain Pickings)
The brains of two rats on different continents have been made to act in tandem. When the first, in Brazil, uses its whiskers to choose between two stimuli, an implant records its brain activity and signals to a similar device in the brain of a rat in the United States. The US rat then usually makes the same choice on the same task.
Miguel Nicolelis, a neuroscientist at Duke University in Durham, North Carolina, says that this system allows one rat to use the senses of another, incorporating information from its far-away partner into its own representation of the world. “It’s not telepathy. It’s not the Borg,” he says. “But we created a new central nervous system made of two brains.”
Nicolelis says that the work, published today in Scientific Reports, is the first step towards constructing an organic computer that uses networks of linked animal brains to solve tasks. But other scientists who work on neural implants are skeptical.
Play in our species serves many valuable purposes. It is a means by which children develop their physical, intellectual, emotional, social, and moral capacities. It is a means of creating and preserving friendships. It also provides a state of mind that, in adults as well as children, is uniquely suited for high-level reasoning, insightful problem solving, and all sorts of creative endeavors….
[P]lay is not neatly defined in terms of some single identifying characteristic. Rather, it is defined in terms of a confluence of several characteristics. People before me who have studied and written about play have, among them, described quite a few such characteristics; but they can all be boiled down, I think, to the following five: (1) Play is self-chosen and self-directed; (2) Play is activity in which means are more valued than ends; (3) Play has structure, or rules, which are not dictated by physical necessity but emanate from the minds of the players; (4) Play is imaginative, non-literal, mentally removed in some way from “real” or “serious” life; and (5) Play involves an active, alert, but non-stressed frame of mind….
Because play involves conscious control of one’s own behavior, with attention to process and rules, it requires an active, alert mind. Players do not just passively absorb information from the environment, or reflexively respond to stimuli, or behave automatically in accordance with habit. Moreover, because play is not a response to external demands or immediate strong biological needs, the person at play is relatively free from the strong drives and emotions that are experienced as pressure or stress. And because the player’s attention is focused on process more than outcome, the player’s mind is not distracted by fear of failure. So, the mind at play is active and alert, but not stressed. The mental state of play is what some researchers call “flow.” Attention is attuned to the activity itself, and there is reduced consciousness of self and time. The mind is wrapped up in the ideas, rules, and actions of the game.
This point about the mental state of play is very important for understanding play’s value as a mode of learning and creative production. The alert but unstressed condition of the playful mind is precisely the condition that has been shown repeatedly, in many psychological experiments, to be ideal for creativity and the learning of new skills. Such experiments are normally not described as experiments on play, but it is no stretch to interpret them as that. What the experiments show is that strong pressure to perform well (which induces a non-playful state) improves performance on tasks that are mentally easy or habitual for the person, but worsens performance on tasks that require creativity, or conscious decision making, or the learning of new skills. In contrast, anything that is done to reduce the person’s concern with outcome and to increase the person’s enjoyment of the task for its own sake—that is, anything that increases playfulness—has the opposite effect.
Strong pressure to perform well inhibits creativity and learning by focusing attention strongly and narrowly on the goal, thereby reducing the ability to focus on means. In the pressured state, one tends to fall back on instinctive or well-learned ways of doing things. That way of responding to pressure is adaptive in many emergency situations. When a tiger is chasing you, you use whatever means you have already learned for getting away or hiding; that is not a good time to experiment with new ways. Experts in any realm can usually perform well in the pressured state because they can call on their well-learned, habitual modes of responding and don’t need to learn anything new or act creatively. Their attention can focus on producing the best possible outcome using the repertoire of actions that are already second nature to them.
In “Spontaneous flocking in human groups,” a paper published in the January issue of Behavioral Sciences, Boos and colleagues describe an attempt to isolate underlying flocking mechanisms hinted at by the large-scale behaviors sometimes seen in crowds.
Her team designed an experiment in which test subjects were allowed to move within a virtual space, but with the identities of other people completely hidden. Other people were literally seen as black dots.
Despite the impossibility of exchanging information or social cues, people in the experiment drifted towards each other in predictable, mathematically regular ways. These are hints, said Boos, of the fundamental forces of spatial attraction that exist between people….
“We see collective behavior in many aspects of human society,” said Couzin. “If you observe a crowd from above, you see that pedestrians spontaneously form lanes, following the slipstream of others. There’s lots of patterns that arise from local interactions that we’re not aware of.”
According to Joseph Henrich and his colleagues at the University of British Columbia, most undergraduates are WEIRD. Those who teach them might well agree. But Dr Henrich did not intend the term as an insult when he popularised it in a paper published in Behavioral and Brain Sciences in 2010. Instead, he was proposing an acronym: Western, Educated, Industrialised, Rich and Democratic.
One reason these things matter is that undergraduates are also psychology’s laboratory rats. Incentivised by rewards, in the form of money or course credits, they will do the human equivalents of running mazes and pressing the levers in Skinner boxes until the cows come home.
Which is both a blessing and a problem. It is a blessing because it provides psychologists with an endless supply of willing subjects. And it is a problem because those subjects are WEIRD, and thus not representative of humanity as a whole. Indeed, as Dr Henrich found from his analysis of leading psychology journals, a random American undergraduate is about 4,000 times more likely than an average human being to be the subject of such a study. Drawing general conclusions about the behaviour of Homo sapiens from the results of these studies is risky.
This state of affairs, though, may be coming to an end. The main reasons undergraduates have been favoured in the past are that they are cheap, and easy for academics to recruit. But a new source of supply is now emerging: crowdsourcing.
As a child, were you constantly reminded by teachers to stop daydreaming?
Well, psychological research is beginning to reveal that daydreaming is a strong indicator of an active and well-equipped brain. Tell that to your third-grade teacher [first-grade, in my case — HR].
A new study, published in Psychological Science by researchers from the University of Wisconsin and the Max Planck Institute for Human Cognitive and Brain Science, suggests that a wandering mind correlates with higher degrees of what is referred to as working memory. Cognitive scientists define this type of memory as the brain’s ability to retain and recall information in the face of distractions….
“What this study seems to suggest is that, when circumstances for the task aren’t very difficult, people who have additional working memory resources deploy them to think about things other than what they’re doing,” said Jonathan Smallwood in a press release. In other words, daydreamers’ minds wander because they have spare working-memory capacity beyond what the task at hand demands. These results, the researchers believe, suggest that the mental processes underlying daydreaming may be quite similar to those of the brain’s working memory system. Previously, working memory had been correlated with measures of intelligence, such as IQ score. But this study shows that working memory is also closely tied to our tendency to think beyond our immediate surroundings at any given time.