Saturday, July 30, 2016

7 Secrets of People Who Never Get Sick

Have Breakfast

It's important for a bunch of reasons. It jump-starts your metabolism and stops you from overeating later. Plus, studies show that adults who have a healthy breakfast do better at work, and kids who eat the morning meal score higher on tests. If a big meal first thing isn't for you, keep it light with a granola bar or a piece of fruit. Just don't skip it.

Plan Your Meals

It'll help you save time and money in the long run. Block out some time, then sit down and consider your goals and needs. Do you want to lose weight? Cut back on sugar, fat, or carbs? Add protein or vitamins? Meal prep keeps you in control. You know what you're eating and when. A bonus: It'll be that much easier to skip those donuts in the break room at work.

Drink Plenty of Water

It can do so many good things for you. Staying hydrated is at the top of the list, but it may also help you lose weight. Another reason to go for H2O? Sugary drinks are linked to obesity and type 2 diabetes. If you aren't a fan of plain water, add flavor with slices of orange, lemon, lime, watermelon, or cucumber.

Take an Exercise Break

Don't just grab another cup of coffee -- get up and move. Do some deep lunges or stretches. It's great for your body and mind. Just 35 minutes of walking five times a week may help keep the blues at bay. And if you can't get all those minutes in at once, short bursts help, too.

Go Offline

Checking your email and social media a lot? Sure, your friends' and family's latest updates are just a click away, but do you really need to see pictures of your cousin's latest meal? Let it wait until morning. Set a time to log off and put the phone down. When you cut back on screen time, it frees you up to do other things. Take a walk, read a book, or go help your cousin chop veggies for her next great dinner.

Learn Something New

New skills help keep your brain healthy. Sign up for a dance class or a creative writing workshop. Better yet, master a new language. The mental work it takes can slow the signs of aging and may even delay the effects of Alzheimer's disease.

Don’t Smoke

If you light up, quit. It's a big move toward better health. Your body repairs itself quickly. As soon as 20 minutes after your last cigarette, your heart rate and blood pressure drop. Why wait? Kick the habit today. Your doctor will be happy to help you get started.

Convincing Skeptics to Try Meditation

By Jill Suttie 

Two new books aim to bring mindfulness to two resistant groups: children and lawyers.



Mindfulness meditation has its fair share of skeptics. Perhaps some envision it as a kind of New Age-y practice of sitting on a cushion and contemplating their breath while holding an impossible lotus pose. Who has time for that?
Now, two recently published books make the case that mindfulness meditation can be incorporated into practically anyone’s life. Targeting two notoriously reluctant groups—children and teens, and busy professional attorneys—these books provide plenty of practical advice and specific exercises for starting a mindfulness practice, even when you think it’s just not for you or that you don’t have time.
Christopher Willard’s Growing Up Mindful focuses on children and what parents can do to encourage them to be more mindful. He suggests that parents introduce mindfulness by tapping into the things that concern their kids most—sports, art, friendships, or academics—and then demonstrating how mindfulness can make these more enjoyable or productive, while preparing kids to cope better with setbacks when things go wrong.
Too often kids respond to challenges by checking out, distracting themselves, or lashing out at others, argues Willard. But mindfulness has the potential to help kids understand and soothe their emotions more effectively than turning to drugs, mind-numbing video games, or bullying. Practicing mindfulness gives kids some freedom from their knee-jerk emotional reactions, empowering them to make wiser choices—an argument that Willard thinks will be convincing to kids, especially teens.
“The idea that mindfulness strengthens us is an empowering one that resonates with today’s kids,” he writes. “Offering children the tools to find answers within, rather than by looking outside, is offering them the lifelong gift of independence.”
Teens may also be convinced by the science. Research suggests that practicing mindfulness helps activate our insular cortex, says Willard, which helps with emotional regulation and with self/other awareness. It also appears to affect parts of the brain involved in taking the perspective of others, memory, and learning. Telling kids about these benefits might inspire them to see mindfulness as a scientifically grounded practice rather than a flaky New Age phenomenon.
But encouraging mindfulness in our kids shouldn’t mean pushing an agenda, Willard advises. He encourages parents to begin practicing meditation themselves and to show compassion toward their kids—being a good role model rather than forcing kids to meditate.
“The more authentic we can be, the more authentic and trusting our relationship with kids will be,” he writes.
Readers will find in Willard’s book many of the tried-and-true meditation practices featured in other books on mindfulness—sitting meditation, body scan, loving-kindness meditation, for example—but also less well-known practices that might appeal more to kids—such as mindful coloring, mindfulness games, or mindful movement. He also offers ideas to teachers about how to use mindfulness practices in the classroom. Many of the practices Willard outlines can be incorporated into everyday life, with some taking less than a minute a day—no doubt something parents and kids will appreciate.
In Jeena Cho and Karen Gifford’s book, The Anxious Lawyer, the focus is on convincing lawyers that mindfulness has a lot to offer them, especially when it comes to decreasing anxiety. Not only can mindfulness reduce stress in a traditionally high-stress occupation, it has the potential to help lawyers sleep better, improve emotionally charged interactions with clients and others, and enjoy their jobs more. This research has led the American Bar Association Journal to tout the benefits of mindfulness to lawyers and has led several law schools to offer mindfulness training.
Cho and Gifford, both lawyers by training, write from personal experience about how developing a mindfulness meditation practice has benefitted their work. For example, Cho recalls how staying aware of her own boundaries has helped protect her from burnout, and being mindful of her stress and sense of urgency when meeting with clients has helped her truly listen to them and better earn their trust and cooperation.
“Every client interaction can be an opportunity to manifest our highest intention as attorneys,” she and Gifford write. Yet, “if we aren’t in a healthy space, if we’re running on fumes, if we’re giving more than what is available to us, we can’t be effective. Understanding our own limitations and setting boundaries requires awareness.”
Their book is essentially an eight-week mindfulness course, peppered with practices one can easily do at home. What’s intriguing about the book is how the authors address some of the unique challenges faced by attorneys.
As they acknowledge, most lawyers are a bit obsessive—they practically have to be to become a lawyer in the first place. Yet, on the job, many elements of a case remain out of their control—like unsympathetic judges, incorrect documentation, and uncooperative clients. Mindfulness can help lawyers let go of the need for perfection and stop beating themselves up for losing a case.
Mindfulness “gives us the room to forgive ourselves for mistakes, make room for imperfection, and truly be our own best friends,” they write.
For those lawyers who want to give mindfulness a try—or, for that matter, any professional who feels stressed and overworked on the job—Cho and Gifford’s book could be the tool they need to start meditating.

Daily Inspirational Quote for July 30, 2016

“True religion is the life we lead, not the creed we profess.”

Need I say more? I have witnessed people who attend church faithfully every Sunday and profess themselves Christians be guilty of “casting the first stone” or never practicing what they preach. I especially dislike the riches and treasures some religions seem to consider their “due” while their followers are rife with disease because they can’t afford medicines, starving for lack of food, or left without a roof over their heads or clean drinking water. What’s that all about? Personally, I think it’s obscene, and it really saddens me, as I guess it does you. I try to live my life being the best I can be and to treat other people the way I would like to be treated. It works for me. What about you? What do you think?

by CathiBew.co.uk

A Tale of Two Americas and the Mini-Mart Where They Collided

Ten days after 9/11, a shocking attack at a Texas mini-mart shattered the lives of two men: the victim and the attacker. In this stunning talk, Anand Giridharadas, author of "The True American," tells the story of what happened next. It's a parable about the two paths an American life can take, and a powerful call for reconciliation.

http://www.dailygood.org/story/1348/a-tale-of-two-americas-and-the-mini-mart-where-they-collided-anand-giridharadas/

Friday, July 29, 2016

How to Stop the Racist in You

By Jeremy Adam Smith, Rodolfo Mendoza-Denton

The new science of bias suggests that we all carry prejudices within ourselves—and we all have the tools to keep them in check.



In the wake of racially charged bloodshed in Baton Rouge, Minneapolis, and Dallas, the city of Cleveland hosted the Republican National Convention.
There, Iowa Rep. Steve King argued that only whites had made contributions to civilization, while other “sub-groups” had not. Asked to clarify his remarks, King—who keeps a Confederate flag on his desk—did not back down. “The Western civilization and the American civilization are a superior culture,” he said, deliberately associating “Western” and “American” with white. No leader at the convention publicly disavowed King’s assertion.
Representative Steve King (R-Iowa) with Confederate flag, lower left. Still from Sioux City’s KCAU TV.
This is just the latest example of what seems to be a rise in polarizing public language that meets the dictionary definition of “racist”—“having or showing the belief that a particular race is superior to another.” King’s argument is an example of explicit, conscious prejudice, when someone outwardly expresses, through words or behavior, a view denigrating a particular group.
But what explains the fact that police departments are more likely to use force against black suspects than white ones, at a time when so many departments are consciously trying to reduce these discrepancies? What could explain why companies explicitly committed to diversity show racial bias in hiring decisions? Why would caring teachers punish black students more harshly than white students?
In these cases, and many others, scientific evidence suggests that we’re seeing the effects not of explicit prejudice but of implicit bias—the unconscious, often knee-jerk prejudices that subtly guide our behavior. 

The distinction between explicit and implicit bias is important, because it changes how we address prejudice in every corner of society, from police departments to schools to homes. If the problem is with racists—individuals like Steve King—then the solution is to identify them and limit their influence. That does need to happen; indeed, after Chief David Brown took over the Dallas police department in 2010, he fired over 70 officers from his force—and excessive-force complaints dropped by 64 percent.
But the new science of implicit bias suggests that the problem is not only with bad apples. Instead, prejudice is a conflict that plays out within each and every one of us.
Since we published the book Are We Born Racist? in 2010—which explores racial prejudice as a neurological and psychological process—we’ve seen more and more research into the automatic and measurable associations that people have about others, and the subtle and unconscious behaviors that these associations influence. In many daily circumstances, automatic associations are natural and harmless. Not so when a police officer pulls a car over for a broken tail light, and the negative associations he has with the face of the driver can produce deadly results; or when a black defendant’s facial features can make a jury more likely to give him the death penalty.
Last summer, Greater Good published a series of articles by researchers and law enforcement officials about how to reduce the negative influences of implicit bias in the criminal justice system. But this research isn’t just for cops and judges—it can help all of us to understand how our brains work and why we are not as different as we might like to think from a police officer who shoots an unarmed suspect.
Indeed, the fact that implicit bias occurs outside of our awareness but affects explicit behaviors—from whether we pull a trigger to how we judge a resume to how we discipline young children—can deeply threaten our self-image. If I have implicit bias, does that mean I’m not really committed to fairness and equality? Am I, at a deep and unconscious level, actually a racist?
The answer is both yes and no. We all carry prejudices within ourselves—and we all have the tools to keep them in check.

From explicit to implicit bias

When we think of “racists,” our minds conjure up people like the San Francisco police officers who were recently caught using racially derogatory words in text messages, or perhaps politicians like King. Their pronouncements shock many of us with their old-fashioned racism, in which people’s out-group attitudes are conscious, explicit, and openly endorsed. This type of racism was characteristic of majority group members’ attitudes up until around the 1950s—and today it does indeed appear to be undergoing a vocal revival in public life.
What current discussions about implicit bias recognize, however, is that a great deal of contemporary racism comes from people who say they don’t want to be racist.
Evidence of this tendency emerged when negative attitudes or stereotypes became publicly frowned upon in the 1960s and 70s, and many people felt social pressure to not get “caught” saying something that sounded racist—an extrinsic motivation that many have labeled “political correctness.”
This formulation implies that egalitarian behavior is not real or truly felt, but rather, a social grace to mask an unacceptable attitude. As many supporters have said about GOP presidential candidate Donald Trump, he “says what nearly everybody thinks, but is too fearful or polite to say.” This conception makes someone like Trump sound “honest,” but by implication, suggests that those who speak up for egalitarianism are being somehow “dishonest.”
Things become even more complicated when a person (or institution) sincerely values egalitarianism yet engages in some kind of behavior that nonetheless betrays bias. Many studies find evidence of anti-black bias in pain-killer prescription and other kinds of medical treatment. One study found that job applicants with stereotypically African-American names were less likely to be invited to be interviewed. And, despite the avowed commitment of the courts to “justice for all,” the connection between criminal sentencing and race is well documented. 

For many people, the very possibility that they too might get caught saying one thing but doing another is extremely threatening and aversive. That threat, in fact, has a name: aversive racism. It refers to the type of racism in which a person’s implicit biases are so out of line with their conscious values that social situations where they experience this conflict—such as interracial interactions—are something to fear and avoid.
In a 2008 study, for example, white participants who were about to discuss racial profiling with a fellow study participant who was black literally sat farther away from that person, and this distance was not predicted by their level of racial bias. Instead, it was predicted by their fear of being perceived as racist. In these kinds of situations, we create a self-fulfilling cycle of negative racial interactions—and to avoid them we may avoid contact with different kinds of people altogether.
This dynamic, ironically, can deepen racial segregation and inequality.

Did we evolve to be racist?

These behavioral findings have counterparts in neuroscience.
We often hear descriptions of the brain’s limbic system as our “reptilian brain” that responds to environmental cues with the same level of sophistication as an alligator. Lightning quick and outside of our control, the limbic system has been called the seat of our fight-or-flight responses, perfectly adapted to the eat-or-be-eaten environment of our early ancestors. A central player in this prehistoric narrative is the amygdala, a pair of almond-like structures that form part of the limbic system. Early findings that the amygdala responds strongly to fear conditioning led to the view that the amygdala is the structure that sets in motion the fight-or-flight response.
Researchers like Elizabeth Phelps and Mahzarin Banaji wrote a significant chapter in our understanding of implicit bias when they found that faces of different races trigger different amygdala activation in the brain, and that there’s a relationship between levels of implicit bias and amygdala activity. These findings have fueled a conception of implicit bias as not only unconscious and automatic, but also as biologically determined—part of our ancestral heritage. The implication there is that our only hope is to contain it, but never realistically to overcome it.
Newer research—often by the same people—is beginning to challenge the core assumptions of this narrative. Once again, the amygdala plays a central role. Scientists are beginning to recognize that the amygdala, rather than responding exclusively to negative or fear-inducing stimuli, seems to be exquisitely sensitive to emotionally important information in the environment. This is a subtle but important difference, and suggests that depending on the task or the situation at hand, the amygdala may be able to respond differentially.
In one study, researchers found that the amygdalae of participants activated at levels consistent with how negatively they rated a set of faces, in line with prior findings. However, amygdala activity was also related to their judgments of the positivity of faces. And when they judged faces using a scale that was anchored by both positive and negative endpoints, the amygdala tracked the overall intensity of the responses. In other words, the amygdala is more than just a “fear” center, and its activation doesn’t necessarily indicate prejudice.
In another study, researchers had participants engage in a face-sorting task in one of two different conditions—either by race, or by membership in teams that included people of different races. Interestingly, the amygdala did not only track race information—it tracked the socially relevant membership (team or race) depending on the social task in front of participants. This tells us that the amygdala is not necessarily pre-wired to detect race information, but rather, to track and respond to the category or social grouping that is most relevant at a given time.
Rather than contradicting an evolutionary narrative, however, these findings merely challenge us to think a little more broadly about the usefulness of categorization even in early times—we may have had to quickly recognize a member of an “out-group” on the basis of race, but it would have been just as helpful to quickly track whether an individual of our own race was part of a nearby enemy tribe. When we consider that “in-group” versus “out-group” distinctions don’t neatly fall along racial categories, we can begin to consider that race is not a biological inevitability, but a social construction with social significance that our amygdala tracks.
In other words, if the brain adjusts to quickly process information that is deemed as socially relevant, it may be within our power to redefine what is socially relevant. And, rather than needing to squash or cover up our base biases, perpetually caught in a Freudian tug-of-war between Id and Superego, the current view opens the possibility of redefining our social environment so that it doesn’t need to track race as a socially significant marker.

Six ways to stop the racist in you

What are the implications of this new way of thinking about brain function for our understanding of prejudice—and how can we use it to limit our own biases?
At its most basic level, this new understanding reveals the brain not as a layered organ that displays the stages of our evolution the way layers of sediment record the history of a canyon. Rather than thinking in terms of dualistic structures—primitive/evolved, emotion/thought, limbic system/neocortex—we are coming to understand that the brain is much more interconnected than previously thought.
But beyond this understanding, these new findings show that our automatic processes (including our implicit biases) are not unchangeable, and that we can learn new behaviors that can become second nature.
An everyday example shows how this is possible. Not one of us is born knowing how to drive, yet by adulthood many of us expertly maneuver a car without even thinking about it. One day, with practice, egalitarianism might be like driving a car: a skill learned over time but eventually so automatic as to be second nature.
So what are the tricks that you can use to stop the racist in you? There are many, of course, but here are six to consider that follow from the scientific insights we describe.
  • Consciously commit yourself to egalitarianism.
  • But recognize that unconscious bias is no more “the real you” than your conscious values. You are both the unconscious and the conscious.
  • Acknowledge differences, rather than pretend that you are ignoring them.
  • Seek out friendship with people from different groups, in order to increase your brain’s familiarity with different people and expand your point of view.
  • It’s natural to focus on how people are different from you, but try to consciously identify what qualities and goals you might have in common.
  • When you encounter examples of unambiguous bias, speak out against them. Why? Because that helps create and reinforce a standard for yourself and the people around you, in addition to providing some help to those who are the targets of explicit and implicit prejudice.
Those are steps you can take right now, without waiting for the world to change.
But this research has implications that go well beyond the personal. The split-second reaction of a police officer who shoots an unarmed black man might not be very different from your own. Instead of asking whether a person is or is not racist—because we’re all a mix—we can think about how we might engineer our social environment to address racism and its worst effects, without believing that any one step will be a blanket fix.
Knowing that bias is part of the structure of our minds, we can ask, for example: How can we change policing so that the results of bias are less deadly? How can we address economic inequality between different groups so as to reduce the stress on communities that are historically the targets of racism? What can school districts do to make sure teachers come into daily positive contact with different kinds of people, and receive training in techniques that help them consciously reduce unconscious bias?
There are many fronts in the campaign against bias, both implicit and explicit, but they all have one thing in common: us. We are all potentially part of the problem—and we can all become a part of the solution.
This essay was revised and updated by Smith from a piece by Mendoza-Denton and Amanda Perez in the journal Othering & Belonging, published by the UC Berkeley Haas Institute for a Fair and Inclusive Society.

Why Can’t We Remember Our Early Childhood?

By Jeanne Shinskey

Research into "childhood amnesia" sheds light on how memories are formed and maintained.



Most of us don’t have any memories from the first three to four years of our lives.
In fact, we tend to remember very little of life before the age of seven. And when we do try to think back to our earliest memories, it is often unclear whether they are the real thing or just recollections based on photos or stories told to us by others.
The phenomenon, known as “childhood amnesia,” has been puzzling psychologists for more than a century—and we still don’t fully understand it. But research is starting to suggest an answer: Autobiographical memory might begin with the stories we tell each other.

The journey into language

At first glance, it may seem that the reason we don’t remember being babies is because infants and toddlers don’t have a fully developed memory.
But babies as young as six months can form both short-term memories that last for minutes, and long-term memories that last weeks, if not months. In one study, six-month-olds who learned how to press a lever to operate a toy train remembered how to perform this action for two to three weeks after they had last seen the toy. Preschoolers, on the other hand, can remember events from years earlier. It’s debatable whether long-term memories at this early age are truly autobiographical, though—that is, memories of personally relevant events that occurred in a specific time and place.
Of course, memory capabilities at these ages are not adult-like—they continue to mature until adolescence. In fact, developmental changes in basic memory processes have been put forward as an explanation for childhood amnesia, and it’s one of the best theories we’ve got so far.
These basic processes involve several brain regions and include forming, maintaining, and then later retrieving the memory. For example, the hippocampus, thought to be responsible for forming memories, continues developing until at least the age of seven. We know that the typical boundary for the offset of childhood amnesia—three and a half years—shifts with age. Children and teenagers have earlier memories than adults do. This suggests that the problem may be less with forming memories than with maintaining them.
However, this does not seem to be the whole story. Language also plays a role. From the ages of one to six, children progress from the one-word stage of speaking to becoming fluent in their native language(s), so there are major changes in their verbal ability that overlap with the childhood amnesia period. These changes include using the past tense, memory-related words such as “remember” and “forget,” and personal pronouns, a favorite being “mine.”
It is true to some extent that a child’s ability to verbalize about an event at the time that it happened predicts how well they remember it months or years later. One lab group conducted this work by interviewing toddlers brought to accident and emergency departments for common childhood injuries. Toddlers over 26 months, who could talk about the event at the time, recalled it up to five years later—whereas those under 26 months, who could not talk about it, recalled little or nothing. This suggests that preverbal memories are lost if they are not translated into language.

How stories make memories

However, most research on the role of language focuses on a particular form of expression called narrative, and its social function. When parents reminisce with very young children about past events, they implicitly teach them narrative skills—what kinds of events are important to remember and how to structure talking about them in a way that others can understand.
Unlike simply recounting information for factual purposes, reminiscing revolves around the social function of sharing experiences with others. In this way, family stories maintain the memory’s accessibility over time, and also increase the coherence of the narrative, including the chronology of events, their theme, and their degree of emotion. More coherent stories are remembered better. Maori adults have the earliest childhood memories (age 2.5) of any society studied so far, thanks to Maori parents’ highly elaborative style of telling family stories.
Reminiscing has different social functions in different cultures, which contribute to cultural variations in the quantity, quality, and timing of early autobiographical memories. Adults in cultures that value autonomy (North America, Western Europe) tend to report earlier and more childhood memories than adults in cultures that value relatedness (Asia, Africa).
This is predicted by cultural differences in parental reminiscing style. In cultures that promote more autonomous self-concepts, parental reminiscing focuses more on children’s individual experiences, preferences, and feelings, and less on their relationships with others, social routines, and behavioral standards. For example, an American child might remember getting a gold star in preschool whereas a Chinese child might remember the class learning a particular song at preschool.
While there are still things we don’t understand about childhood amnesia, researchers are making progress. For example, there are more prospective longitudinal studies that follow individuals from childhood into the future. These give more accurate accounts of events than retrospectively asking teens or adults to remember events that were never documented. Also, as neuroscience progresses, there will undoubtedly be more studies relating brain development to memory development. This should help us develop measures of memory that go beyond verbal reports.
In the meantime, it’s important to remember that, even if we can’t explicitly remember specific events from when we were very young, their accumulation nevertheless leaves lasting traces that influence our behavior. The first few years of life are paradoxically forgettable and yet powerful in shaping the adults that we become.