Tuesday, July 27, 2010

A short skirt doesn't equal consent


BY ERIN K. O'NEILL

Six years ago, a 20-year-old woman, named in court papers only as Jane Doe, went to a bar at Laclede's Landing in St. Louis. While she was dancing, someone pulled her tank top down, and a Girls Gone Wild video crew filmed it all. The footage was distributed on a video called "Girls Gone Wild Sorority Orgy."

On July 22, a St. Louis jury ruled that despite saying “no” when asked to reveal her breasts to the camera, Jane Doe had given “implied consent” because she was there and taking part in the party.

The St. Louis Post-Dispatch reported that Patrick O'Brien, the jury foreman, said: "Through her actions, she gave implied consent ... She was really playing to the camera. She knew what she was doing."

Allow me to express my outrage. This kind of reasoning makes me nauseated. If this case had involved a guy unwillingly having his genitalia exposed to a video camera, I can guarantee the outcome would have been very different.

There is more than an undercurrent in our culture that says a woman “asks” for rape or other forms of sexual assault. It starts with snide comments about a woman’s wardrobe — a short skirt is apparently some kind of invitation to be harassed.

This is the beginning of a slippery slope, which ends in blaming women for rape. In feminism and feminist theory, the term “rape culture” is used to describe the commonness of sexual violence and how social norms, the media and people’s attitudes condone it.

The culture then teaches women how to avoid rape, through PSAs and self-defense classes and carrying mace and traveling in groups. The implication is that a woman who doesn't take these precautions deserves what she gets. Not that these precautions aren't a good idea — I've taken a self-defense class myself — but the notion that a woman must somehow be on guard against sexual attack at all times is ridiculous.

I spent one painful evening begging a friend to go to the police after an incident where her date just didn’t stop when she asked. She decided not to inform authorities and press charges against the guy because of the stigma of rape. Upon reflection, I find that a big part of her decision not to press charges was the fear that others would place the blame on her.

The St. Louis Girls Gone Wild case uses the illogic that Jane Doe was asking for it because she was “playing to the camera.” This is the same absurdity that would say a sexually provocative dress is “implied consent” for a man to rape the woman.

A girls' night out usually isn't a big deal. A woman wants a night out on the town with her girlfriends. She wears less clothing and higher heels than would be acceptable in the daytime. There will be dancing and a few drinks. A fun time will be had by all. I've been on these nights out, and they're harmless.

Until someone pulls down a woman’s tank top in front of a Girls Gone Wild video crew.

That Girls Gone Wild, with its owner Joe Francis, is one of the most repellent companies ever to grace late-night television with its commercials is inconsequential. Pornography has been around for as long as history has been recorded. I would even go so far as to say that pornography, when made or consumed by consenting adults, can be empowering to women.

This jury's decision enshrines in legal precedent the notion that being a woman in front of a camera at a party "implies consent" to having images of her naked breasts distributed for profit. Girls Gone Wild made an estimated $1.5 million from the video in question. Someone pulled down Jane Doe's tank top, and she said "no" to the camera crew. There is no evidence Jane Doe signed a consent form.

As Jane Doe’s lawyer, Stephen Evans, told the St. Louis Post-Dispatch: "Other girls said it was OK. Not one other one said, 'No, no.’ She is entitled to go out with friends and have a good time and not have her top pulled down and get that in a video."

Apparently not.

Erin K. O'Neill is an assistant director of photography for the Missourian and a master's degree candidate at the Missouri School of Journalism.

Published in the Columbia Missourian on Tuesday, July 27, 2010.

Thursday, July 15, 2010

Daily Show accusations of sexism could be as fake as its news


BY ERIN K. O'NEILL

“Men hire men,” my dad said to me when I was home for Independence Day.  “And women hire women. That’s just the way it is.”

Whether my dad was strictly accurate or not is beside the point. The gross generalization — that hiring for jobs is largely based on gender — is at the center of the brouhaha between the blogosphere and the "Daily Show with Jon Stewart," America's premier source for fake news.

I love the "Daily Show." I've been watching since Craig Kilborn was the host, and he left the show in 1998. I plan my life around watching the "Daily Show" four nights a week — because if I miss the 10 p.m. airing, I catch the rerun at 12:30 a.m., 9 a.m., 1 p.m. or 6 p.m. the next day. My biographers, should I ever have any, will probably point out the influence of the "Daily Show" on my chosen career of pursuing real news (hint hint, biographers).

Thus, when the feminist blog Jezebel decided to make a very thorough, if flawed, critique of the “Daily Show’s” dearth of female on-air “correspondents,” I was devastated. Or, to put it in the 140 characters or less I wrote on Twitter: “This is the most upsetting news EVER!!! EVER EVER!!!” (sic).

Because I was just a little worried that it was true.

Irin Carmon, who writes for Jezebel, went to great lengths, and through a lot of anonymous sources, to make the point that institutionalized sexism, or discrimination based on gender, is alive and well at the "Daily Show." Even if it's not active prejudice, it is the result of adherence to existing social norms and organizational rules.

Jezebel’s article quotes the show co-creator and former executive producer, Madeleine Smithberg, as saying that she doesn’t think the show is sexist and blames “larger societal forces” (Jezebel’s words) for the gender disparity.

And, in some ways, the numbers don’t lie: of the 50 “correspondents” the “Daily Show” has featured over the years, only 11 have been women.

Like my dad said: “Men hire men.”

A friend of mine, who shall remain nameless because I'm still angry with him for even suggesting it, said that maybe women just aren't as funny as men. Since the "Daily Show" is predicated on humor, it would make sense that more men make it on-air. He sent me an article by Christopher Hitchens from "Vanity Fair" called "Why Women Aren't Funny." Apparently, the ability to grow tiny humans has a side effect: It kills any ability to be funny.

“For women, reproduction is, if not the only thing, certainly the main thing,” Hitchens wrote. “Apart from giving them a very different attitude to filth and embarrassment, it also imbues them with the kind of seriousness and solemnity at which men can only goggle.”

Excuse me?

Well, it was all just fuel for the fire. I was furious — not only at my friend for sending me such an odious article, but at my own blindness. How could I have been such a fan of the “Daily Show” and not seen what was right in front of me? Were smashingly good and hilarious critiques of Fox News really enough to justify overlooking such discrimination? Was I condoning the male-dominated media landscape by default because I had not even realized that all of my fake news idols were men?

I thought about it. A lot.

And then the backlash in the media started. Jon Stewart himself mentioned on-air that "Jezebel thinks I'm a sexist prick," and Slate's Emily Gould accused Jezebel of using accusations of sexism and the female predisposition to petty jealousy to boost page views. The New York Times wrote a piece on Jezebel's willingness to take a "media heavyweight ... to task."

I found the open letter to “People Who Don’t Work Here,” written by the female staffers of the “Daily Show,” to be most enlightening. “The ‘Daily Show’ isn't a place where women quietly suffer on the sidelines as barely tolerated tokens,” the letter said. “On the contrary: just like the men here, we're indispensable. We generate a significant portion of the show's creative content and the fact is, it wouldn't be the show that you love without us.”

I would rather take their word for it than anyone else’s.

I am, in the end, conflicted. I think the "Daily Show" could have saved itself a lot of agony had it not refused to comment for Carmon's article. I think Jezebel did a huge amount of reporting, but instead of deferring to a journalist's obligation to the truth, it decided it had a bone to pick. (Jezebel may be a media organization, but it's a blog of opinion writing with a feminist slant, which can lead to a lack of fairness.)

This may have all been blown way out of proportion. Welcome to media in the 21st century.

Do I wish that the "Daily Show" would put more women on the air? Absolutely. Do I think that the conspicuous lack of women on the show is the result of deliberate and insidious sexism? Not at all. I will still be watching.

Erin K. O'Neill is an assistant director of photography for the Missourian and a master's degree candidate at the Missouri School of Journalism. In her first semester of graduate school, she wrote a (very bad) academic paper on the Daily Show titled “On Reporting, Irony and Fake News.”

Published in the Columbia Missourian on Thursday, July 15, 2010.

Thursday, July 1, 2010

Twilight is not for lovers


BY ERIN K. O'NEILL

I am not one to throw stones about pop culture addiction. I am a devourer of stories.

I've been a rereader and rewatcher of stories for as long as memory serves. I watched "Cinderella" every day when I was 4. In second and third grade, I read the "Little House" series a bagazillion times, and when I was a little older, I probably flew through "Anne of Green Gables" about as many times. In middle school, I was devoted to "Buffy the Vampire Slayer," "Daria" and the entire canon of John Irving.

Then I discovered Harry Potter. If ever there was a series of books to feed my addictive personality, Harry Potter was it: more than manna for the soul. I came into the series a little late — after the release of the fourth novel, "Harry Potter and the Goblet of Fire," when I was 14 years old. I was instantly hooked and in love with waiting and speculating and tearing each book apart for clues about what would happen.

The night the final Harry Potter was released was probably the best night of my life. I read all 759 pages of “Harry Potter and the Deathly Hallows” between the midnight release and 9:30 a.m. It was the most thrilling and satisfying singular experience of my life thus far. Chapter 34 gives me the chills just thinking about it.

My ardor for the Harry Potter novels can border on humiliating — especially when I start waxing rhapsodic to the uninitiated. Would you like to hear all about how Harry Potter is a classic mythical hero or an analysis of the philosophical implications of magic?

Sorry, I didn’t think so.

It would perhaps seem, to a casual observer, that I would at least enjoy the "Twilight" novels written by Stephenie Meyer. A series of four very long novels about the supernatural and a bookish girl, with some romance and action, could be perceived to be right up my alley.

And the casual observer would be wrong. Oh. So. Very. Wrong. There is no hatred in pop culture like my hatred for “Twilight.” I have read all the books and even seen the movies. The novels are addictive in the worst way: The prose is awful, the content of the story alarming and the heroine a downright bore, and yet I couldn’t stop reading them. I got no joy from the pages, only a sick compulsion to continue.

I could perhaps forgive this if "Twilight" were merely poorly written with an uncompelling narrator. But by the time I got to the fourth book, "Breaking Dawn," I realized these books were an unwitting assault on any ambition a woman may have in this world outside marriage and children.

SPOILER ALERT: Bella, the heroine, decides to skip college, despite her supposed smarts, so that she can marry her immortal and creepily obsessive boyfriend, Edward. After the nuptials, he knocks her up with a half-human, half-vampire child. Bella almost dies in childbirth, so Edward makes her into a vampire.

What is the message to the flocks of devoted young women from all of this? If you have sex, you will get pregnant and die.

I know that personal biography can have a profound effect on the interpretation of literature. Considering I watched my older sister leave college at 19 to get married (and soon divorce), have three children and work in fast food, glamorizing this epic failure of a life choice seems downright foolhardy to me. Things worked out for my sister, who is now a registered nurse and very happily married again (I love you, Katey!), but to say that the "Twilight" series managed to push all of my crazy buttons is an understatement.

There are other disturbing things about the "Twilight" series, including Edward's bizarre infatuation with the smell of Bella's blood; Bella's suicidal mindset when Edward abandons her in the second book; really terrible allusions to classic literature ("Romeo and Juliet"? Seriously? Could a literary allusion be less original?); and, ickiest of all, Jacob the werewolf falling in love with Edward and Bella's infant daughter.

My problem is not so much with the content of these decisions but how they are portrayed. I’ll give Meyer credit for making Bella the one who wants to go all the way more than her vampire boyfriend, but the tone of the romance is sexy love without the actual sex. There is no discussion or thought of realistic consequences. The beauty of the best fantasy is that despite its fantastical flourishes, it reveals something true about the human existence. “Twilight” does not come close to this — it is divorced from any semblance of reality.

"Twilight" teaches young girls that skipping college and marrying as a teenager are the very definition of happily ever after. And this is what makes it worthy of my virulent loathing.

Now, please excuse me. I’d like to go watch the new movie trailer for “Harry Potter and the Deathly Hallows.” Again.

Erin K. O'Neill is an assistant director of photography for the Missourian and a master's degree candidate at the Missouri School of Journalism. Her favorite book of the Harry Potter series is “Prisoner of Azkaban.”

Published in the Columbia Missourian on Thursday, July 1, 2010.

Friday, June 25, 2010

A revised version of adulthood


BY ERIN K. O'NEILL

From the time I was very small, my mother had rules for my life.

  1. You can't get married until you're 32.
  2. College is not optional.
  3. Get a master's degree immediately after college (because once you get into the workforce, you won't go back, my mother says).
  4. No tattoos.
  5. No living in California (because they're weird out there, so says my mother).
  6. Backpack through Europe.

It goes on and on, and occasionally she makes one up that I know wasn't on the list when I was 10. And while other parents' desires for their children's lives are perhaps less specific, they reflect the same middle-to-upper-class ideal: that young people should go to college and establish careers before "settling down" with a spouse, a mortgage and kids.

I think my peers and I bought into this hook, line and sinker.

The Missourian recently asked why Americans are taking longer to grow up. Apparently, we young'uns are still dependent on our parents for money and housing and are making our parents wait longer for grandchildren. This makes us an economic strain on the older generation in hard times, or something.

I say just because I don’t have kids and am still in school at the ancient age of 24 doesn’t mean I’m not a real grown-up. Adulthood is simply being defined differently these days.

It’s true that my parents still pay for my car and I’m still on my father’s health insurance plan. It’s also true that I’m putting myself through graduate school with teaching assistantships and loans (also known as mounds of soul-crushing debt).

When societies demand that young people amass a small hoard of money before becoming truly independent, the age of marriage rises. Stephanie Coontz wrote in "Marriage, a History" that "In England between 1500 and 1700 the median age of first marriage for a woman was twenty-six, which is higher than the median age for American women at any point during the twentieth century."

The expectations placed on young adults among the commoners of this period were not unlike what seems to be expected today. Although college wasn't on the menu in 1500, according to Coontz, the ability to independently support children and a separate household was. Not to mention, many of the trade guilds required apprentices to remain single, so if a man wanted to learn a trade to support a wife and family, he had to wait.

There are other pressures on today's youth that contribute to this supposed delayed adolescence. First, the economy has been rather miserable since 2001 and went from bad to worse in 2008. With unemployment hovering above 9 percent, jobs are scarce for young people with thin resumes.

Furthermore, the cost of higher education has skyrocketed in the past 20 years — it has well outpaced the rate of inflation. My mother, who wrote these rules for me, put herself through the University of Michigan in the late '70s working for $2.35 an hour at a gift shop. That, along with an $800 scholarship from the state of Michigan and a residence-hall assistant job that covered room and board, was enough to pay the $660 per semester it cost to attend school full time.

This is simply not possible these days. Without my parents and the Free Application for Federal Student Aid, in addition to the $7-an-hour job I worked at Shakespeare’s Pizza as an undergrad, I wouldn’t have made it through college, much less a master’s degree.

And so here I am: 24 years old, single, childless, overeducated and on the brink of homelessness and unemployment (or so I think on my cynical days, when the job hunt isn't going well). My mother was married and gainfully employed at my age.

Maybe I’m not an adult by the most prevalent societal standards of adulthood, but society is changing. It has given us a revised standard of adulthood.

Erin K. O'Neill is an assistant director of photography for the Missourian and a master's degree candidate at the Missouri School of Journalism. Her mother, Carol J. Homkes, lives in Georgetown, Ky., and is a manufacturer’s representative in the gift industry.

Published in the Columbia Missourian on Friday, June 25, 2010.

Friday, June 18, 2010

Would you go in front of the lens?


BY ERIN K. O'NEILL

Would you, dear reader, let a photographer into your life?

Considering Columbia has been over-saturated with journalists since the establishment of the School of Journalism in 1908, many of you have probably been photographed by the Missourian or an eager student.

But I speak of allowing a photographer deeper access. Would you truly allow a photographer into your private, intimate moments — for weeks or months at a time?

And if you would, why?

I’ve been asking myself — along with people who have been photo subjects — these questions for the past five months.

I’ve been researching the 61st Missouri Photo Workshop, which took place in September 2009 in Festus and Crystal City and the people who let these workshop photographers into their lives.

The workshop serves as a microcosm of the documentary photojournalism experience. In one week, a photographer must find a photo story and then spend morning until night with their subject.

As a journalist, I ask to see the most intimate, emotional and sometimes difficult moments of my subject’s life — all in the name of getting the story.

As a photojournalist, this is an even bigger undertaking.

A writer does not need, per se, to bear witness to these moments — for them to be recalled in an interview is enough. But a photojournalist must view and capture these moments in real time.

I was told ad nauseam throughout my education that "people want to have their story told," but I suspected this was not the case. As much as I loved journalism, I worried that the profession was infected with a light strain of opportunistic voyeurism.

The nine photo story subjects I interviewed all described the experience as being awkward at first. One of my interviewees, Annette Bauman, said multiple times that she’s “not a picture person.”

Most said that it became less awkward as the week went on, as they became accustomed to their photographer and being in front of the lens.

Jason and Sara O’Shea, who homeschool their four children, were photographed during the workshop by photographer Michele Kraus. The O’Sheas said the experience became a family-like affair.

“I think it very quickly stopped feeling like someone was at our house doing a documentary,” Jason O’Shea said. “It more felt like we had a family member over that we don’t get to see a lot, so she wanted to take a lot of pictures ... I know that it sounds silly, she (Kraus) was only here three or four days, but it was almost like she was more of a little sister and she was just taking pictures of family.”

Jason O’Shea thought the photos that were taken by Kraus were “phenomenal.”

“They capture what our life is really about,” O’Shea said. “I guess in a way (the photos) made a difference because I realized that when I look at those pictures, they really do capture a lot of what is important to me and that helped me to realize, I guess, that the direction our lives are going is the direction I want them to go. ...I felt the pictures were, to a great extent, an affirmation of the fact that our life is really what I want it to be.”

Private moments were also photographed. Laverne Austin, a resident of Crystal City who lives with a rare form of multiple sclerosis, was photographed by John Liau during the workshop.

"John didn't mind coming in the bedroom," Austin said. "I'd be getting ready to put something on and then he'd be with his camera going, click click click. I'd say, 'How long have you been here? I'm going to tell on you to your fiancé.'"

To me, the most amazing story I heard was that of the Bauman family. Annette and Josh Bauman have two sons, Jackson and Kade. Kade, who is now 2 years old, cannot support his head, crawl or talk because of multiple medical conditions, including epilepsy, cortical vision impairment and hypotonia. The Baumans allowed their photographer, Julia Robinson, to go to the emergency room with them when Kade had a seizure.

“It was cool of Julia to come to the ER with us,” Josh Bauman said. “Trips to the hospital with Kade are intense. She didn’t back down, she went right with it. ...We thought it was going to get difficult when we went to the ER because hospitals are picky — we thought it may get hairy, but they didn’t mind. ...We wouldn’t have agreed to do it if there was going to be something off limits.”

I've been photographed by classmates as a class exercise, but every time I send a Missourian photographer out on assignment, I wonder whether I would say yes in a similar situation. If it were my 9-year-old brother in the hospital, would it be OK with me?

What it comes down to for me is this: The power of these stories is extraordinary. They have the power to inform, to enlighten, to show and examine the nature of human emotion. One could debate the measurable effect of these stories (and I have), but I believe that revealing a common humanity is the highest cause that could exist. Photo stories are unique in their ability to do this.

If my story had the capacity to exhibit something so true, how could I possibly say no?

What would you say?

Erin K. O'Neill is an assistant director of photography for the Missourian and a master's degree candidate at the Missouri School of Journalism. She volunteered twice for the Missouri Photo Workshop, and she is weeks away from completing her master’s project about the people who were photographed during the 2009 MPW in Festus and Crystal City.

Published in the Columbia Missourian on Friday, June 18, 2010.

Wednesday, April 7, 2010

Compulsory voting for all


BY ERIN K. O'NEILL

Voting in federal elections should be compulsory for all eligible Americans 18 and older. It should be illegal not to vote. Too few citizens are exercising their constitutional right to vote.

Compulsory voting is not a popular idea. In 2004, an ABC News poll showed that 72 percent of Americans opposed requiring citizens to vote. It was also reported that Americans were similarly averse 40 years earlier, when 69 percent said no to mandatory voting. Apparently, it would be too corrosive to our freedoms.

Although voter turnout increased by 5 million votes in 2008 compared with 2004, the actual percentage of eligible voters who voted remained the same: 64 percent. The 2008 elections saw increases in turnout among black, Hispanic and Asian voters, as well as voters ages 18 to 24, but turnout decreased or was stagnant among other demographic groups. This may still be considered an improvement over the 2000 elections, when overall voter turnout was 55 percent.

No matter how you slice it or spin it, 64 percent is not enough.

There are 21 nations worldwide that make voting in national elections mandatory for most people, according to the CIA World Factbook. These include Argentina, Australia, Belgium, Brazil, Chile, Egypt, Greece, Lebanon, Mexico and Thailand. It's an eclectic list, including many South American countries.

Australia, my favorite nation and continent, made voting in federal elections mandatory in 1924. The Commonwealth Electoral Act 1918, the federal Australian law in question, states: “It shall be the duty of every elector to vote at each election.”

It was a fast road to compulsory voting for Australia. First, voter registration became compulsory in 1911; then the state of Queensland made voting obligatory in its state elections in 1915. Other states followed Queensland's lead: Victoria in 1926; New South Wales and Tasmania in 1928; and Western Australia and South Australia in 1942.

Voter turnout in Australia has been above 90 percent in every federal election since 1924. In 2004, Australia’s federal election had a turnout of 94.34 percent. The rates of voter registration, called “enrollment” down under, are consistently above 95 percent. In 2004, of the 13,098,461 enrolled Australian voters, 12,354,983 turned in ballots.

Australia's total population was 21,262,641 in July 2009. To have more than 12 million citizens vote in an election in such a small country is remarkable. Of the 21 countries that theoretically mandate compulsory voting, Australia is one of the few to actually enforce the law. There are small fines for eligible citizens who fail to enroll, and the fine for failing to vote in person or turn in a ballot by other means is $20 Australian, or about $18.30 in U.S. dollars.

High voter turnout is not the only advantage of compulsory voting. For example, enacting compulsory voting in the U.S. could pull political dialogue away from the extremes on all sides. Political candidates could focus on persuading voters on the issues instead of spending all their time pandering to special interests and political bases in an effort to get people to the polls.

Moreover, incidents of voter disenfranchisement and voter suppression, in their many nefarious forms, could become a thing of the past. Many cases of voter intimidation have been aimed at minorities and low-income areas, where voter turnout is usually lower anyway. Compulsory voting could override these attempts to suppress turnout, making voting more accessible.

The Constitution of the United States requires citizens to pay taxes and submit census forms. We have laws that require children to attend school, and we require the fulfillment of jury duty. Voting is possibly the most important of these civic duties — and one that is constantly taken for granted by Americans. Making voting compulsory is not a detriment to our freedom; it is a method of increasing it.  A government that more accurately reflects the will of the people, and not just the people with the means and motivation to vote, would benefit us all.

Erin K. O'Neill is a former assistant director of photography and page designer for the Missourian. She is a master's degree candidate at the Missouri School of Journalism.

Published in the Columbia Missourian on Wednesday, April 7, 2010.

Wednesday, March 24, 2010

Filling out the census is a duty we all must fulfill


BY ERIN K. O'NEILL

"Representation and direct Taxes shall be apportioned among the several States which may be included within this Union, according to their respective Numbers. ... The actual Enumeration shall be made within three Years after the first Meeting of the Congress of the United States, and within every subsequent Term of ten Years, in such Manner as they shall by Law direct."  — Article I, Section 2 of the Constitution of the United States

You know you hang with nerds when filling out census forms is the talk of the town (it was all over my Twitter feed). The 10-question form arrived last week, and even after attempting to savor my first solo census experience, it took me all of five minutes. After all, there’s only one of me in my household. And I’m sure that by now my name, race, age and the fact that I rent my apartment are already back in the hands of the census people.

This is information that is not hard to find. And these are questions that have been asked of Americans every 10 years since our fine nation was formed — 2010's is one of the shortest forms in history (we don’t ask about slaves anymore).

“The genius of the Founders was taking a tool historically used for government oppression and making it a tool of political empowerment for the governed,” Census Director Robert M. Groves wrote on his official blog. “They accomplished that goal in 1790, and we’re about to continue that tradition in 2010.”

It's Groves' job to be pro-census. Complaints about an inflated budget of approximately $14.7 billion are not unfounded. But I happen to think the budget is justified, especially if Groves is right that every 1 percent increase in U.S. households completing and returning the form saves $85 million.

It is perfectly constitutional for the census to ask questions related to demography. It does not violate the Fourth Amendment (see Morales v. Daley, a federal district court case in the Southern District of Texas). In 1999, the Supreme Court went so far as to describe the census as the "linchpin of the federal statistical system ... collecting data on the characteristics of individuals, households, and housing units throughout the country" (see Dept. of Commerce v. U.S. House of Representatives). The idea that asking whether citizens own their places of residence, or what races they identify with, qualifies as an invasion of privacy is slightly absurd (even if you consider the social construct of race to be an intrusive piece of data).

Moreover, the Department of Justice confirmed that not even the Patriot Act could override the confidentiality of the information gathered by the census. The government is less respectful when looking at what books you check out at the public library (see U.S. H.R. 3162, Public Law 107-56, Title II, Sec. 215).

The census is the backbone of our representative democracy — not just a tool for states to pork-barrel the federal government (all members of Congress want more than their fair share of the $400 billion in federal appropriations sent to their districts based on the census). The writers of the Constitution said we needed two houses of Congress to balance the power of the more populous states with the smaller states. The makeup of the House of Representatives is absolutely dependent on the results of the census. It would follow that we need to know where the population resides. Not to mention, the requirement of a census is in the third paragraph of the Constitution — which implies that it’s pretty darn important.

Consider this a friendly reminder: Fill out your census form and send it in. It’s less painful than voting. Honest.

Erin K. O'Neill is a former assistant director of photography and page designer for the Missourian. She is a master's degree candidate at the Missouri School of Journalism.

Published in the Columbia Missourian on Wednesday, March 24, 2010.

Wednesday, February 24, 2010

NBC taking drama out of Olympics with delayed coverage


BY ERIN K. O'NEILL

I knew Lindsey Vonn won Olympic gold in the women's downhill hours before it aired, in the perpetual present tense, on NBC. I also knew Vonn crashed in the super combined way before NBC even pretended it happened. Sports are not fun to watch on tape delay, especially in the Internet age, when we technophiles are constantly jacked into a stream of news.

I would even go so far as to say that if you are not emotionally attached to your laptop and Twitter feeds, there is still no escaping the dreaded Olympic results spoiler. Between word-of-mouth and incidental media consumption, news of the Olympics can't be avoided if you engage in the modern world.

NBC, the broadcast network home of America’s Olympic coverage for as long as I can remember (that would be the summer games of 1992 in Barcelona, Spain), has not adjusted at all to this new media landscape. Which is funny, considering NBC achieved the first live color television broadcast via satellite by airing the opening ceremonies of the 1964 Summer Olympics in Tokyo.

A typical prime-time night of Olympics is a mixed bag of live and tape-delayed sports, with Bob Costas in a studio with a fake fireplace trying not to get excited when Dutch speed skaters beat Shani Davis. But most annoying, not to mention borderline dishonest, is NBC's insistence on not quite telling the viewer whether an event is live. I was frankly unsure if Shaun "The Flying Tomato" White's victory in the snowboard halfpipe was live until I heard some language that is sure to induce an FCC fine.

The figure skating is sure to be shown live, and every now and again a small icon tells you, the viewer, that you're watching in real time. But the skiing, a glamorous and therefore prime-time-worthy sport, takes place during the day in Vancouver, B.C., and if you just watched the nightly NBC Olympic telecast, you would have no idea. Perhaps daylight is a dead giveaway on TV, but if you're watching ski jumping at 7 p.m. CST, it is 5 p.m. in Vancouver and before sunset. There is no acknowledgment that the sporting events being shown took place hours before. Full and constant disclosure of whether an event is live or tape-delayed is crucial.

Sports are inherently dramatic. And NBC is killing that drama. There is no suspense, no thrill of the win and no urgency when most of the audience already knows the results. The disconnect between the audience and the broadcast presentation is a serious detriment to the coverage.

I get the appeal, for NBC, of having prime events in prime time. But that is a very old media approach. And sure, even though I already knew Vonn bit the big one in the super combined, I still wanted to see it happen (even in prime time). NBC Universal has tons of networks among its free-to-air and cable television holdings, and the potential for beyond-amazing Olympic coverage. What if skiing were shown live in the morning on CNBC or MSNBC, with a highlight reel in prime time? Or what's wrong with showing live events on NBC itself; can't "Days of Our Lives" have two weeks off every two years? I am infinitely frustrated when I turn my TV to NBC and see soap operas while a real-life athletic soap opera is going on, unseen by American eyes.

The Internet could be utilized to much greater effect. For the Vancouver games, NBC is live-streaming all of the hockey and curling and offering replays of other events. But frankly, the video site is much like the broadcast production: flashy and confusing. And the video player made me download a new piece of software. The site lacks the simplicity and elegance of Hulu, a free Internet TV site also owned by NBC Universal.

The Olympics are the most exciting sporting event in the world, with only the possible exception of the FIFA World Cup. One Olympic flame; two weeks; 82 countries; and 258 gold, silver and bronze medals to be won. This is the stuff that dreams are really made of.

If only NBC treated it that way.

Erin K. O'Neill is a former assistant director of photography and page designer for the Missourian. She is also a master's degree candidate at the Missouri School of Journalism.

Published in the Columbia Missourian on Wednesday, February 24, 2010.

Wednesday, February 10, 2010

Statistics show progress, 'Pride and Prejudice' in the women's movement


BY ERIN K. O'NEILL

"It is a truth universally acknowledged, that a single man in possession of a good fortune must be in want of a wife." — Jane Austen, "Pride and Prejudice," 1813

Some interesting statistics came out last week regarding the strides women have made as economic forces in marriage.  “A larger share of men in 2007, compared with their 1970 counterparts, are married to women whose education and income exceed their own ... A larger share of women are married to men with less education and income,” according to the study “Women, Men and the New Economics of Marriage" from the Pew Research Center.

The study found that among American men and women ages 30 to 44, 22 percent of married men had wives who made more money in 2007, compared with just 4 percent in 1970. The study also found that in 2007, 19 percent of married women had husbands with more education, while 28 percent had husbands with less education.

It may not seem revelatory today that women have gained so much ground from 1970 to 2007. Of course, to truly understand the current economics of gender roles and marriage, we must look to the past. Before the women's movement of the 20th century, the economic progress toward equality for women in marriage was tortoise-like.

To wit, I offer the case of Miss Jane Austen, who published four novels between 1811 and 1816 and two more posthumously in 1818. Miss Austen's six novels centered on maintaining England's social structure through marriage. While political action takes place offstage in the novels, the Napoleonic Wars, the War of 1812 and the establishment of a regent at the head of the monarchy were all causes of domestic unrest in Austen's lifetime.

But the "truth universally acknowledged" is that young women were seeking a single man of good fortune in order to secure their own economic and social security (not the other way around; Miss Austen is a noted ironist). A woman could not inherit money or property, and upon marriage any assets she possessed became the property of her husband. Divorce was nearly impossible, and mothers had no custody rights.

It is little coincidence that many of the marriages in Miss Austen's novels are not only for love or passion but also financially advantageous for the woman. After all, Mr. Darcy is worth ten thousand pounds a year (the equivalent of $6 million today), while Elizabeth Bennet's family is in danger of losing its land because the family has no male heir.

All of Miss Austen's novels tend to paint a rosy interpretation of the marriage state in the early nineteenth century. It helps that Hollywood and the BBC have made a specialty of filming these stories so that they are particularly attractive to modern women; after all, Miss Austen was the originator of the rom-com. But modern audiences tend to lose sight of the basic fact that it was once a woman's only destiny to be a man's possession.

Miss Jane Austen’s world is fascinating, but it would be badly done to foster any desire to emulate it.

Which is why the statistics on modern marriage are so telling. Laws guaranteeing equality in heterosexual marriage have been established — it has been generations since a woman was considered the property of her husband. But this change in law did not have nearly as much effect on the economic power of women in marriage as the sea change in cultural expectations did. Once the popularly held belief that a woman's sole purpose was to be a man's wife was expunged, progress came rapidly.

“This reshuffling of marriage patterns from 1970 to 2007 has occurred during a period when women’s gains relative to men’s have altered the demographic characteristics of potential mates,” the Pew Research Center reported. Americans are marrying less, and at later ages, and more women are now graduating from university than men. The strides women have made in education certainly account for much of the economic gain, with and without marriage vows.

Erin K. O'Neill is a former assistant director of photography and page designer for the Missourian. She is also a master's degree candidate at the Missouri School of Journalism.

Published in the Columbia Missourian on Wednesday, February 10, 2010.

Wednesday, January 27, 2010

Benefits of constant media use few and far between


BY ERIN K. O'NEILL

We live in a media age. The young folks are submerged in a complex media landscape that involves being constantly connected to the Internet. Americans ages 8 to 18 spend more than 7 1/2 hours a day in front of a screen, be it television, computer, iPod or smart phone.

Ever watched TV and surfed the web at the same time? According to the Kaiser Family Foundation study that produced those numbers, because more than one medium can be used at a time, a person can actually pack close to 11 hours of media and Internet time into a day.

The modus operandi of the Millennial Generation, those of us somewhere between adolescence and late twenty-something, is to be hyper-connected all the time. If I am in front of a computer, I am also logged into Facebook, Twitter and both of my e-mail accounts (at least). I often boot up my computer as soon as I wake in the morning, and it stays on and open until I go to sleep at night — closing only when I need to relocate to another class or Internet connection.

I have been wondering lately, however, if this normalization of absurd amounts of screen time is all that good. Sure, there are benefits to online attachments. I can find academic articles for classes in moments, and it saves paper to use electronic textbooks and consume news media online. I also talk to my sister a lot more often than I did before we were both on Facebook. But in all honesty, I am not doing any of these laudable things for 95 percent of the time I'm connected to the web.

For example: have you seen Go Fug Yourself? Two chicks in LA make very snarky fun of terrible celebrity fashion. Could there be a more sublime time waster? Or, when I’m not spending time reading Texts From Last Night, where unfortunate text messages are revealed, I am perusing Overheard in the Newsroom, where journalists' unfortunate conversations are posted. I have more than one list of bookmarks in my net browser, organized by category, where more gems of worthless time suck can be found.

Maybe I just need some new blogs to destroy my days with, but this endless Internet routine is just plain boring. Nothing really exciting happens on the Internet — my friends make meaningless observations on Twitter, I get slews of emails from MU administration, and I obsessively read the New York Times (and not usually the really newsy part).

There is a social pressure not only to be constantly connected but also to be constantly seeking stimulation. There is no just having a coffee; you must be online and having a coffee, or texting and having a coffee. It is infinitely frustrating to want to sit and read a book but feel the necessity to check in online every hour. Nothing is really happening, and no one has said anything that can't wait until the book is read, but it is imperative to be present online.

The Internet age has many advantages, but it has also left an entire generation of Americans dependent on portable communications devices to stave off the smallest amounts of ennui. The philosophical standard for existence is no longer "cogito, ergo sum" (not that this standard was ever the final word on the proof of existence). We now subscribe to: I am online, therefore I am.

Erin K. O'Neill is a former assistant director of photography and page designer for the Missourian. She is also a master's degree candidate at the Missouri School of Journalism.

Published in the Columbia Missourian on Wednesday, January 27, 2010.

Wednesday, January 13, 2010

Fear of flying? Blame the bureaucracy


BY ERIN K. O'NEILL

Wasn’t the Department of Homeland Security supposed to fix this mess?

This mess being the epic failure to “collate” the intelligence on Christmas Airplane Underwear Bomber Umar Farouk Abdulmutallab. And there were more indicators of Abdulmutallab’s plans than there are health warnings on cigarette packets.  But only bureaucrats think adding another layer of bureaucracy will fix the problems with bureaucracy.

“The U.S. government had the information — scattered throughout the system — to potentially uncover this plot and disrupt the attack,” President Barack Obama said at a news conference on Jan. 7.  “Rather than a failure to collect or share intelligence, this was a failure to connect and understand the intelligence that we already had.”

Hey, that's super-comforting. Just to be clear, the American intelligence community knew that Yemen is a hot spot for anti-American terrorism. Abdulmutallab's own father tried to turn him in to a CIA agent at the U.S. Embassy in Nigeria two months ago as a potential terrorism threat. He was put on a security watch list in the United Kingdom after his student visa application was flagged.

And then it was reported that Abdulmutallab paid $2,831 in cash for a one-way ticket from Lagos to Detroit (with a layover in Amsterdam), didn’t check any luggage and didn't give the airline any contact information. And yet, none of this information made it anywhere near a no-fly list.

Even without the father’s warning to the CIA or the UK security watch list, isn’t this everything we’re supposed to be looking for after Sept. 11, 2001?

Now, the TSA is trying to fix the security problems with more stringent and invasive searches in airports. Apparently, getting on an airplane is tantamount to probable cause: TSA agents can not only make you take off your shoes and search your carry-on bags but also force you to submit to full-body scans.

This process only makes travel by airplane more onerous for the average citizen, and I see little evidence that it actually makes air travel any safer. Americans are xenophobic enough without having to face long, slow lines that end in pat-down searches.

There’s no way any TSA agent is getting anywhere near my granny panties, even if that is where Abdulmutallab hid the explosives.

What saved Northwest Flight 253 on Christmas Day from the pentaerythritol tetranitrate (PETN) explosives sewn into Umar Farouk Abdulmutallab's underwear was very clearly not the work of intelligence agencies or anti-terrorism bureaucracy, or even airport security checkpoints. It was alert passengers on the flight, who heard the first small explosion, jumped on Abdulmutallab as his lap burst into flame, alerted flight attendants to put out the fire and prevented the PETN from detonating and blowing a hole in the plane, which would have caused it to crash.

Umar Farouk Abdulmutallab shouldn’t have made it on the plane in the first place.

Erin K. O'Neill is a former assistant director of photography and page designer for the Missourian. She is also a master's degree candidate at the Missouri School of Journalism.

Published in the Columbia Missourian on Wednesday, January 13, 2010.