Monday, April 8, 2013

Hunger Strike



Potatoes.  My dad hated potatoes.

Growing up in a Korean household, you eat a lot of varied things – a lot of things that most people would not consider “food” (Sea cucumbers? Congealed cow’s blood? Bellflower root? Yep, we ate them all) – but the one food item that never made an appearance? Potatoes.

It’s not that potatoes are not featured in Korean cuisine – there are plenty of recipes with them. It’s just that my dad hated them…to the point where he forbade my mom from ever buying them (Unless my brother-in-law was coming into the house. He was supposed to have potatoes. He’s Irish.) There was another reason: Hunger.

My dad was born on the island of Cheju in what is now South Korea.[1]  At the time of his birth, during World War II, the island was occupied by Imperial Japan.[2]  By the time my dad was two years old, my paternal grandmother was working in Japan and my paternal grandfather, a radically independent fisherman with Communist leanings, had left to join Kim Il-sung’s Workers’ Party of North Korea. My grandfather was never seen again.
                     
What did this mean for my father? It meant potatoes. Endless amounts of potatoes. Rice was too expensive. Potatoes were cheap. And they were ubiquitous. For those living in the relative comfort of Seoul, food could be found through the black markets of US Army provisions.[3] But for my father, growing up on an island believed to be a hotbed of Communist partisans, it meant NO food. Already on the threshold of severe famine, the island had been drained dry by the burn-and-slash attacks of Nationalist guerrilla groups. Even fishing, which had sustained the island for years, was destroyed as American and anti-Communist Nationalist navies patrolled local waters.

While war is an obvious condition for food insecurity, the hunger we have now in the United States seems unfathomable.  According to the latest statistics from the USDA,[4] 15% of Americans, or about one in seven, are using the Supplemental Nutrition Assistance Program (“SNAP”), better known as food stamps. Despite the “recovery” from the Great Recession of 2008, food stamp usage has remained relatively high, even rising 1.8% from January of 2012.

Food banks and pantries across the US are suffering in what seems to be the perfect storm for hunger: smaller food donations and rising ranks of needy persons. Furthermore, tight food supplies have only made this condition worse. Under the Emergency Food Assistance Program, the USDA buys excess food from suppliers and donates it to local food banks. In the past year, tight supplies, due to drought and skyrocketing worldwide demand, have decreased the amount of food the USDA has bought for the program, and thus the amount donated to food charities.[5] At the same time, food banks have seen demand for their services grow, in some places by double digits, as economic recovery has not meant new jobs for many. While the newest numbers from the US Department of Labor indicate a drop in unemployment from the recent high of 10% in October 2009 to 7.6%,[6] as several economists have noted, the numbers don’t reflect the real story: many of the unemployed have been so discouraged by the job market that they have dropped out altogether, only increasing the pressure on benefits such as SNAP and Social Security.

But what the numbers can’t reflect are the poignancies of hunger. While the physical effects are obvious -- malnutrition, arrested mental and physical development in children, and higher rates of disease -- the psychological effects can and do last just as long. For every person on food stamps, using the Special Supplemental Nutrition Program for Women, Infants and Children (“WIC”) or receiving a subsidized lunch at school, there is a story of pain, humiliation and mental anguish unassuaged even when, or if, they ever come off the welfare rolls. In his famous study on the psychological effects of hunger,[7] Ancel Keys noted that subjects were prone to disordered eating habits, vacillating between devouring their food and eating so slowly as to savor every last bit. While Holocaust survivors exemplified the conclusions of Keys’ study, his observations are no less applicable to those living in food-insecure homes now. Behaviors such as hoarding, gorging or hiding food last far longer than the physical pangs of hunger. And for many, the constant insecurity of not having enough food will mark their own sense of security for years to come.

And for my father, that insecurity came in the form of a lowly potato. It was not a vegetable destined to become a chip or a fry, but a lingering symbol of poverty and hunger, a constant reminder of a childhood he would rather forget. Many of America’s children, parents and so many others can only wish hunger were such a distant memory. Unfortunately for them, it’s an ever-present reality that will leave more scars than they can ever hope to bear, now or in the future. With or without potatoes.

This post is part of Food Bloggers Against Hunger, a collaborative effort of over 200 food bloggers, The Giving Table, Share Our Strength and the documentary A Place at the Table to raise awareness about hunger, protect SNAP dollars for hungry families and push for anti-hunger legislation in Congress. Want to do something to fight hunger? Click here to tell your Representative, Senator or other elected official that you want to end hunger in America by maintaining and prioritizing anti-hunger initiatives in Congress.

This post has been cross-posted at the Huffington Post.


[1] Cheju Island, now a giant tourist destination for much of East Asia, was a small underdeveloped island, mainly populated by fishermen/women and small farmers in the early 20th century.
[2] Korea was annexed in 1910 by Imperial Japan. Much like the colonial empires of 19th and 20th century Europe, Japan sought to dominate the Korean peninsula for strategic purposes.
[3] My mother’s first introduction to cheese and mayonnaise was from the black markets in Seoul during the Korean War. As US troops filled the city, American foodstuffs, such as chewing gum, chocolate and yes, American cheese, were introduced for the first time in Korea.
[5] In 2011, the USDA bought 421 million pounds of surplus food; in 2012, only 129 million. (Source: USDA)
[7] The study, dubbed the Minnesota Starvation Experiment, was conducted in 1944–1945, not only to study the psychological aspects of hunger and starvation, but also to help direct Allied relief efforts in post-World War II Europe. The results were summarized in Keys’ two-volume work: Keys, A.; Brožek, J.; Henschel, A.; Mickelsen, O.; Taylor, H. L. (1950). The Biology of Human Starvation (2 volumes). St. Paul, MN: University of Minnesota Press.

Sunday, March 17, 2013

Nomavirus is Everyone's Virus and Everyone's Cost

Restaurant Noma. Photo Credit: Huffington Post

Last Friday morning, Ekstra Bladet, a Danish tabloid, broke the story: “Noma: 63 hit by Roskildesyge” (norovirus, in Danish). Norovirus, a highly virulent and contagious virus that causes nausea, vomiting and diarrhea, or gastroenteritis, sickened 63 of 435 guests over a two-day period in February, according to reports by the Danish Veterinary and Food Administration.

For the world’s top restaurant, this was not just a case of Noma catching the flu. The story went viral. Food websites such as Eater and Grub Street lapped up the story as soon as it was reported in the Danish papers, and soon other established media outlets, including National Public Radio, AP, UPI, Huffington Post, ABC News and the LA Times, followed suit.

While the Twittersphere was burning through its schadenfreude quota, the actual scientific details about the illness were buried under a pile of snark. Norovirus has been at epidemic levels, causing almost 21 million illnesses each year in the US alone. According to the CDC, there is no specific treatment; prevention, that is, proper hand and food hygiene, is often the best defense.

The problem is that norovirus is one tough bug. According to a paper in the Journal of Infectious Diseases,[1] “Noroviruses are perhaps the perfect human pathogens… highly contagious, rapidly and prolifically shed, constantly evolving, evoking limited immunity, and only moderately virulent, allowing most of those infected to fully recover, thereby maintaining a large susceptible pool of hosts.” In other words, it’s a public health nightmare. Carriers often don’t know they have it, or continue to carry it after they recover, passing it on to unsuspecting victims. It can survive a wide range of temperatures, from below freezing up to 140°F, and can persist for nearly two weeks on many surfaces. And it doesn’t need a high viral load to do its job: fewer than 20 viral particles are enough to cause illness. It takes just one carrier to infect an entire community or institution.

Considering the prevalence, incidence and virulence of norovirus, it seems almost unbelievable that Noma didn’t have an outbreak sooner, or more patrons puking their guts out. What happened at Noma could have happened anywhere and everywhere, as it did in 2009, when 240 diners contracted the virus at The Fat Duck, the three-Michelin-starred restaurant outside London, and in 2008 at a Chipotle outlet near Kent State University in Ohio.

But then the question is why didn’t Noma have an outbreak earlier, or more guests holding their stomachs? Hygiene is one factor. While the Danish authorities cited hygiene problems, specifically a “lukewarm” hand-washing faucet, chefs and waitstaff are given strict instructions to wash their hands thoroughly with hot water and soap on a regular basis. Although norovirus has been known to withstand even a dishwasher, frequent hand washing often cuts transmission rates. But one Noma policy is critically important: paid sick days. The CDC has found that 89% of norovirus outbreaks occur in places where food is prepared and handled on a regular basis: schools, nursing homes, cruise ships and restaurants. As it takes only one infected person to cause an outbreak, quarantining ill or possibly infected workers is paramount. Noma has a strict illness policy under which any ill worker, from the office to the cleaning staff, is immediately sent home at the slightest sign of illness and told to stay home for 48 hours after symptoms subside. And they are paid for those days.

Compare this to the United States. According to the CDC, in 2011, 12% of restaurant workers reported signs of norovirus. The CDC also reports that 50% of norovirus infections can be traced back to food service workers.[2] According to the Institute for Women’s Policy Research,[3] 78% of hotel and food service workers do not have paid sick leave. Another food service workers’ advocacy group, ROC (Restaurant Opportunities Centers) United, estimates that 90% of food service workers lack paid sick leave. Given the low wages and job instability of food service work, many of America’s cooks, busboys and servers can afford neither the lost wages nor the risk of being fired that staying home entails, which only encourages ill workers to come to work and infect their co-workers and patrons. Add the lack of health insurance to the absence of sick days, and you have a recipe for an ongoing epidemic.

But where is the will to change public health and labor policies to prevent such epidemics? Small business owners complain that health insurance and paid sick days are too costly for them. Yet the cost of NOT giving workers sick days is far greater: according to the Integrated Benefits Institute, $227 billion is wasted on lost productivity from illness.[4] In an economy barely recovering from a recession, these are dollars that cannot afford to be squandered. And according to Cornell University economist Sean Nicholson, for every dollar spent on employee health care, employers can save three dollars in costs.

While Noma tries to repair its unfairly damaged reputation, millions of other food service workers at no-name restaurants are just trying to work through another sick day. Too bad they don’t have Eater or Grub Street gleefully sneering at their misfortune. Attention might be the only way that they, and the millions of others working without sick days, will finally be able to serve you and the public better.


[2] Widdowson M-A, Sulka A, Bulens SN, et al. Norovirus and foodborne disease, United States, 1991–2000. Emerg. Infect. Dis. 2005;11:95-102.
[3] Vicky Lovell, Institute for Women’s Policy Research, Women and Paid Sick Days: Crucial for Family Well-Being, 2007.

Thursday, March 7, 2013

Creating MADness

1st MAD Monday (from left to right): Knud Romer, Erwin Lauterbach, Tal R, René Redzepi & Paul Cunningham


A number of years ago, when I was at the Tate Modern in London, I was attacked by a sticky mist of water and the glare from a giant Day-Glo bulb of orange-yellow light in the museum’s Turbine Hall. Confused at first, I just squinted into that searing ball of light. Then I glanced at the floor and saw a bunch of people lying there as if they were at the beach for the day. Yeah… right, I thought. Crazy art people. And then I saw the sign: Olafur Eliasson. "The Weather Project." Epiphany. I immediately lay down on the floor and basked in that light.

I thought of this on Feb. 25 when I attended the inaugural lecture of René Redzepi’s MAD Mondays, a public lecture and discussion series based on his wildly popular annual summertime MAD Symposium. How did Eliasson's concept spring to life? For that matter, where did Picasso’s "Guernica" come from? How did James Joyce create his stream of consciousness for Ulysses? And where does René Redzepi come off serving ants and cricket paste at dinner at his "world's best restaurant," Copenhagen's Noma? The answers to all these questions lie in the nature of creativity.

An understanding of the creative process was what Redzepi attempted to reach at the first MAD Monday, moderating a discussion, held upstairs in the book-lined chefs' common area at Noma, among four people who all have stakes in the creativity game: chef Paul Cunningham, formerly of the Michelin-starred The Paul in Copenhagen; Danish author Knud Romer; Danish-Israeli artist Tal R; and the grand old man of Danish gastronomy, chef and cookbook author Erwin Lauterbach.

Starting with his own failure to understand the creative process, Redzepi declared, "Why creativity? Since 2010, we’ve tried to answer that. And we have no clue." Then, turning to the panel, Redzepi asked about their own challenges with creativity. For Lauterbach, the seeming paucity of fresh ingredients forced creativity upon him. "In the '70s, it was all French cooking," he said. "But I was in Sweden and I needed to be creative making food with just four products, like beets and potatoes."

For Paul Cunningham, the challenge was more mental than physical. After he garnered heaps of accolades for The Paul, the pressure of running a successful restaurant, writing cookbooks, and promoting his own "brand" eventually exacted a huge toll. "I was afraid of saying no," he said. "The stress of creativity created a monster." That monster was a nervous breakdown that led him to close down his restaurant at the height of its fame in order to find his mojo again, leading him to the rural coast of Jutland to cook at the resort inn Henne Kirkeby Kro. And with a new sense of freedom, not only did Cunningham find creativity, he also found happiness. "Creativity and happiness go hand in hand," he added. "And for me, they have to be that way."

But can anyone be creative? "I think the conventional idea of creativity is wrong," Knud Romer told the crowd. "That it is something immediate. It’s nonsense." But if our romantic notion of creativity is not what creates the newest dish or the critically acclaimed novel, then what is? For Romer, creativity does not come from "nowhere"; it is the product of discipline and sweat. "To be creative is hard work," he said. "You have to have a goal. You have to learn about your craft, whether it is cooking or art. And you have to have knowledge."

Artist Tal R didn’t see creativity so much as an issue of knowledge but more in terms of process: "What is creativity?" he asked. "It is the free-fall of an idea." And falling free means that sometimes "you have to break something." That process is not necessarily one of originality, but one that involves a social push from others, whether it is from competition or from mimicry. "There’s a lot of stealing that goes on in creativity — I steal all the time." But does that make one less of an artist? Tal R said no: "It’s not the clichés or the stealing. It’s the desire that makes creativity."

With respect to cooking, Redzepi sees begging, borrowing, and stealing in the same light, but in the end, he believes, it’s still the individual who determines creativity: "You can feel the creativity of the individual in the dish," he said. "Recipes are just a guideline." When the session ended, as the audience filed out the door, the Noma staff distributed chocolate chip cookies and Mikkeller beer to everyone. Each cookie was different — a vivid (and delicious) symbol of what it means to be truly creative.