A Theory of Justice (and the Dog Park).

“That traditional view of morality is beginning to show signs of wear and tear. The fact that human morality is different from animal morality — and perhaps more highly developed in some respects — simply does not support the broader claim that animals lack morality; it merely supports the rather banal claim that human beings are different from other animals…Unique human adaptations might be understood as the outer skins of an onion; the inner layers represent a much broader, deeper, and evolutionarily more ancient set of moral capacities shared by many social mammals, and perhaps by other animals and birds as well.”

In The Chronicle of Higher Education, bioethicist Jessica Pierce and biologist Marc Bekoff suggest what the apparently agreed-upon rules of canid play can teach us about animal morality. (Via FmH.) “Although play is fun, it’s also serious business. When animals play, they are constantly working to understand and follow the rules and to communicate their intentions to play fairly.”

The Politics of Yecccch.

“Likewise, conservatives are more likely than liberals to sense contamination or perceive disgust. People who would be disgusted to find that they had accidentally sipped from an acquaintance’s drink are more likely to identify as conservatives.” The NYT’s Nicholas Kristof examines the hardwired psychological differences between liberals and conservatives. “The larger point is that liberals and conservatives often form judgments through flash intuitions that aren’t a result of a deliberative process. The crucial part of the brain for these judgments is the medial prefrontal cortex, which has more to do with moralizing than with rationality…For liberals, morality derives mostly from fairness and prevention of harm. For conservatives, morality also involves upholding authority and loyalty — and revulsion at disgust.”

We Control The Verti…ooh, new Tweet!

“Over the last several years, the problem of attention has migrated right into the center of our cultural attention. We hunt it in neurology labs, lament its decline on op-ed pages, fetishize it in grassroots quality-of-life movements, diagnose its absence in more and more of our children every year, cultivate it in yoga class twice a week, harness it as the engine of self-help empires, and pump it up to superhuman levels with drugs originally intended to treat Alzheimer’s and narcolepsy…We are, in short, terminally distracted. And distracted, the alarmists will remind you, was once a synonym for insane.”

Or, as Matt Johnson put it 25 years ago, “I’ve been filled with useless information, spewed out by papers and radio stations…Another year older and what have I done? All my aspirations have shriveled in the sun.” And don’t get me started on blogs, e-mails, youtubes, and tweets. In a New York Magazine cover story, Sam Anderson runs the gamut from Buddhism to Lifehacking to ascertain whether technology has really propelled us into a “crisis of attention.” (By way of Dangerous Meta, a blog that’s invariably worth the distraction.) And his conclusion? Maybe, but them’s the breaks, folks. There’s no going back at this point. “This is what the web-threatened punditry often fails to recognize: Focus is a paradox — it has distraction built into it. The two are symbiotic; they’re the systole and diastole of consciousness…The truly wise will harness, rather than abandon, the power of distraction.”

Which just goes to show, the real key to harnessing distraction is…wait, hold on a tick, gotta get back to you. There’s a new funny hamster vid on YouTube.

So Tweet and So Cold.

@JohnnyCash: Hello from Reno. Shot man…just to watch him die, actually. Weird, I know.
@ACamus: Beach lovely this time of year. Also, killed Arab. Oops.

Or something like that. Apparently, a new study suggests that — uh, oh — using Twitter may stunt one’s moral development. “A study suggests rapid-fire news updates and instant social interaction are too fast for the ‘moral compass’ of the brain to process. The danger is that heavy Twitter and Facebook users could become ‘indifferent to human suffering’ because they never get time to reflect and fully experience emotions about other people’s feelings.”

Hmm. I can’t say I’ve found Twitter to be particularly useful yet — to be honest, it all seems rather gimmicky to me; I worry about its Idiocracy-like implications (Why 140 characters? Why not 10?); and, frankly, I often find that neither my life nor anyone else’s (nor, for that matter, that of anyone else’s adorable little children) is all that fascinating from moment to moment. (“Got up. Tired. It’s raining. Maybe I’ll eat some Grape Nuts.”) But I don’t think I can pin any personal reservoir of misanthropy on it either. (For that, I blame FOX News.)

A Hole in the Heart.

“‘This is the part of the brain involved in knowing that you want something,’ she said. ‘When people who are not adjusting well are having these sorts of thoughts about the person, they are experiencing this reward pathway being activated. They really are craving in a way that perhaps is not allowing them or helping them adapt to the new reality.’” It’s darker than you know in those complicated shadows…A new study finds that unrelenting grief works on the brain differently than the usual kind of post-traumatic depression. “The same brain system is involved in other powerful cravings, such as those that afflict drug addicts and alcoholics…It’s like they’re addicted to the happy memories.”

Thanks for the Memories.

“We appear to be bringing the worst affected parts of the brain functionally back to life.” Is Alzheimer’s disease about to go the way of polio? A new drug known as rember, according to scientists in England, seems to halt and even roll back the symptoms of Alzheimer’s. “We have demonstrated for the first time that it may be possible to arrest progression of the disease by targeting the tangles that are highly correlated with the disease. This is the most significant development in the treatment of the tangles since Alois Alzheimer discovered them in 1907.”

World in My Eyes.

“Thoughtcrime is death. Thoughtcrime does not entail death. Thoughtcrime IS death. I have committed even before setting pen to paper the essential crime that contains all others unto itself.” The shape of things to come? Scientists at Berkeley have devised a way to use MRI to “map” images in the brain. “Our results suggest that it may soon be possible to reconstruct a picture of a person’s visual experience from measurements of brain activity alone. Imagine a general brain-reading device that could reconstruct a picture of a person’s visual experience at any moment in time…It is possible that decoding brain activity could have serious ethical and privacy implications downstream in, say, the 30 to 50-year time frame.”

You’re biased! No, really, you are.

“If you are unprepared to encounter interpretations that you might find objectionable, please do not proceed further…I am aware of the possibility of encountering interpretations of my IAT performance with which I may not agree. Knowing this, I wish to proceed with either the Democratic Candidates task or the Republican Candidates task.” As the 2008 Democratic primary season degenerates into a Clintonian morass of identity politics and invective, now seems as good a time as any to test your own internal bias with an Implicit Association Test. (For more info, Slate’s Jay Dixit covered the test and its social implications a few years ago.)

As for me, I took it three times. At first, my reptile-brain displayed a bias for Hillary Clinton, with Barack Obama and John Edwards exactly tied below her, and Bill Richardson lagging considerably behind. (My apologies, Governor Richardson. I think it might be because you look older than the rest of the candidates. At least, I hope that’s the reason.) The second time I took it involved just the candidates’ names, and it was completely inconclusive — all four were tied exactly in the center of the chart. The third time — perhaps because I was growing more used to the interface — Barack Obama was up high, followed by Edwards, then Clinton, then Richardson.

I’ll Sleep When I’m Dead.

“We have to realize that we are already living in a society where we are already self-medicating with caffeine.” This one’s been languishing in the bookmarks for a while, but, via Drudge and blog-twin FmH, comes word that scientists may have discovered a cure for sleep deprivation in Orexin A. “The study, published in the Dec. 26 edition of The Journal of Neuroscience, found orexin A not only restored monkeys’ cognitive abilities but made their brains look ‘awake’ in PET scans. Siegel said that orexin A is unique in that it only had an impact on sleepy monkeys, not alert ones, and that it is ‘specific in reversing the effects of sleepiness’ without other impacts on the brain.” But is it cheaper than my daily Red Bull?

The Dancer Upstairs.

OK, this one’s a bit creepy. By way of Webgoddess, watch the rotating dancer to ascertain whether you’re left-brained or right-brained. I’m pretty right-brained, it seems (which makes sense, since I’m both left-handed and left-footed). But if I switched tasks while the dancer was on (say, clicked over to another window or focused on the list at left), she’d sometimes change direction. Weird…well, I just hope my right brain knows what my left brain is doing.