Why are the early voting lines so long… and why did NPR’s Carrie Kahn just report on them like they’re a big block party? We’re talking three to five hours, when anything more than a half hour should be considered undemocratic.

A long voting line is not an act of nature. It is not inevitable. Nor can it be attributed to surprisingly high voter turnout — after the past eight years, there’s no surprise. Besides, election officials should have methods for handling “surprises” within their jurisdictions. Even with early voting.

“This has the potential to disenfranchise a heck of a lot more people than, dare I say it, hacked electronic voting machines,” said Tova Wang, vice president for research at Common Cause, which has been monitoring potential balloting problems ahead of this week’s vote. (LA Times, 11/2/08)

I want to know how and why long voting lines happen. If we don’t ask, and we don’t know, then we can’t press for change. We’re down to the wire now. If it’s true that the presidential race looks different when pollsters ‘narrow the pool of responses from registered voters to likely voters,’ then there should be no block-party reporting where voting isn’t smooth.


I spent last weekend in Western Massachusetts, visiting my parents and enjoying a summer performance in the fresh mountain air at Tanglewood.

Freed from the seemingly endless To Do list of my Cambridge life, I embraced one of those projects that nearly always ends up at the bottom of the pile. I descended into my parents’ basement to winnow a few boxes of personal memorabilia. They were mostly grade school and college notebooks, awards certificates, and personal essays assigned by voyeuristic middle school “English” teachers who know they’re panning for gold during a rush.

Even a media girl will tolerate only so many boxes of dormant paper, because someday her parents will make her store them herself. At least computers arrived partway through this lifelong accumulation of paper, creating the illusion that one is a minimalist (and then making it a reality through malfunction or obsolescence). But, with the paper that remains, it’s so hard to know what to keep and what to throw.* 

However, there was no question as to the fate of one yellowed 8 1/2″ x 11″ sheet of paper. In 1987, a summer school teacher wrote a brief evaluation of my participation in a class where we made our own magazine:

Rekha was a fast learner in the art of “lay-out” and “paste-up.” She has a good logical mind for organizing material, and likes to add her own creative touch to the format of a magazine. I appreciated the leadership she provided as editor and enjoyed working with her on her own articles.

I was thirteen years old. In the years that followed, I considered careers in interior decorating, environmental law, medicine, architecture, photography, international diplomacy, or something that would fulfill my eighth grade class designation as First to Make a Million. I eventually faced my destiny and went into media… and here I am, 20 years later, laying out digital objects and curating content with logic and creativity. I might have saved myself a lot of searching if I had paid attention to these little indicators. Then again, what I do now didn’t exist back then.

[Because I was reminded that truth, edited for narrative flow, will still be treated as truth (see comments).] There’s another story about how my earliest memories include sitting in the back of my parents’ car, listening somewhat involuntarily to NPR. And how, while a teenager, lots of people told me I had a great voice, and that I should be a journalist, and how I agreed but for many years I was too scared to try. But that’s going to have to wait until I find another artifact to peg it on.


*I would be curious as to other people’s criteria for retaining or letting go of personal artifacts over the years.

Not too long ago, most people I knew continued to harbor a certain social prejudice well after the major social prejudices had fallen out of favor among the thinking set. This prejudice was not against a person, or a stereotype. It was against the computer.

No one ever spoke openly of this prejudice. It was transmitted subtly: a gentle roll of the eyes when someone cautiously suggested that a quick Google search might resolve the conversational impasse; a derisive snort when the token geek in the room offered to show the group what he or she was working on. When a friend said in mixed company that her local movie theater used to project video game play on the big screen for public viewing, there was an unmistakable ‘only in Maine’ undercurrent to the response.

Those of us who were less prejudiced almost unconsciously learned to suppress our impulses to introduce a computer into good old human interaction. With labels like “video game addict”, “internet addict”, and “geek” with a “dork” intonation floating around, it was best to keep demonstrations of the non-physical world to oneself.

(This unspoken rule had one exception: the showing of one’s digital photos on one’s computer with everyone gathered around, a descendant of slide carousels and photo albums. Here, eyerolls meant that the photographer had taken twice as many photos as they would have on film and edited nothing.)

I didn’t feel the oppression, as I had internalized the prejudice. There’s one keyboard, one screen, and one pointing device. Computer use is individual by design. Look inside a cafe or office and you get the idea.* It followed that introducing such a private physical relationship into mixed company would be considered unseemly.

About three years ago, however, my unconscious attitudes surfaced and changed. Martin Wattenberg came to MIT to talk about “The Social Life of Visualizations.” He focused on how people socialize around visualizations of data, the most fun example being his NameVoyager, which lets people enter a name and see a graph of its popularity over time:

When we launched the NameVoyager, we expected that it would be of interest to expectant parents… What we did not expect was that it became popular even among people with no interest in children. The site has received millions of visits since launch, and has been the subject of thousands of blog posts and online conversations. The activity around this visualization was one of the inspirations for my current research focus on the social aspects of data visualization and analysis.

Martin showed several other examples of how data visualizations inspire (mostly online) sociality, and how well-designed data visualizations inspired people to interact more deeply with the data in the process of interacting with each other.

I was totally with him on this. Soon after, I was with a pregnant friend, and she let me pull up the NameVoyager on her computer. We had a ball looking up names of everyone we knew and every name she had considered for the baby. With Martin’s emphasis on digital sociality fresh in my mind, I became acutely aware of my own different but related assumption that computers and people shouldn’t mix in physically social space. Yet here we were having so much fun. I began to feel a sort of defiant validation… why NOT bring computers into conversation?

Me in my element

My new liberation was not confined to displays of data; it was about Web surfing in general. In my self-proclaimed role as an avid media curator, I find all sorts of cool stuff online. And since my personal breakthrough, I unapologetically pull this stuff up on the nearest web-enabled computer when the conversation leads me there. I call this social surfing, not to be confused with the Web 2.0 version that assumes people are alone in their physical space and socializing only online.

As you can see from the photo, I am serious about this. I even like to think I’m particularly good at it. Usually I show the coolest new thing I’ve found, but there are some classics as well, which I especially enjoy showing to people who are easily amazed by the Web.

My enduring takeaway from Martin’s talk became much broader than he intended. But I do have him to thank for it.

*Now, of course, people are often very social online. Yet even online socializing has experienced prejudice, that of being viewed as less authentic — or simply less social — than in-person interaction.

I attended Edward Tufte’s “course” on information design a few weeks ago. The hundreds in attendance were clearly hungry for answers to the problems of everyday life. (It reminded me a bit of the time I saw Sri Sri Ravi Shankar when my friend tricked me into it by failing to correct the natural assumption that a Ravi Shankar at a performance hall in Washington, DC, would be bringing his sitar.) Tufte stokes their hunger by titling his events “Presenting Data and Information: A One-Day Course Taught by Edward Tufte” and by styling himself as professor-guru. He holds “office hours” during breaks. He does not invite questions from the audience, nor does he acknowledge those that are occasionally blurted out anyway.

I sat between a guy who designs machines that cut steel, and a woman who works in medical informatics for a hospital. They wanted to learn how to make their presentations better. Instead, they learned that Napoleon took his soldiers on a death march through Russia and that Boeing engineers were afraid to sound the alarm when astronauts’ lives were on the line.

Ok, I’m being a little snarky. I actually have a lot of respect for what Tufte has done. His beautiful books are full of examples that show the artistry and breadth of information design. But his course is little more than a book tour that participants get to pay to attend. It’s fun, but it’s not a course.

Tufte spent a little too much time on a pet idea of his called sparklines: “data-intense, design-simple, word-sized graphics.” Actually, they’re pretty cool.

Sparklines at RetailMeNot

My only concerns were that they pack *too* much information into too small a space, and in the wrong information-intake context. If I’m in a text-reading rhythm, will I — or can I even — break that rhythm, switch to a graphics-reading mode, and then switch back again? I wasn’t so sure, though I remained open to the possibility.

When I got home that evening, I did a bit of online retail therapy to wind down. As ever, I visited RetailMeNot before closing a purchase. And there I saw sparklines, used perfectly.

I’ve never quite understood why a coupon code with a 30% success rate would work for me, and why one with a 90% success rate wouldn’t. I still don’t understand it, but with sparklines, I get a better picture of that strange phenomenon. Because I can see where in the sequence the successes and failures are, I also have a better sense of which coupon to try first. That appeals to my desire for efficiency in the service of optimum discount achievement.

What I like about this usage of sparklines is that it doesn’t break my rhythm. On RetailMeNot, I’m not in a fluid text-reading mode. I’m scanning the page and evaluating several different color, graphic, numeric, and textual indicators to decide which coupon code to try. In that context, the sparkline graphic fits right in.
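A word-sized success/failure strip like the ones on RetailMeNot can be sketched in a few lines. This is purely an illustration with made-up data (the coupon code and attempt history are invented), not RetailMeNot’s actual rendering:

```python
def sparkline(outcomes):
    """Render a word-sized success/failure strip:
    a tall block for 'worked', a short block for 'didn't'."""
    return "".join("█" if ok else "▁" for ok in outcomes)

# Hypothetical recent attempts at a coupon code, oldest first
attempts = [True, True, False, True, False, True, True, True]
print(f"SAVE30  {sparkline(attempts)}  {sum(attempts)}/{len(attempts)} worked")
```

The point of the sketch is the one made above: the graphic sits inline with the text it annotates, so a scanning reader absorbs the success pattern without switching modes.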

Now, enough about consumerism. Can sparklines save the world? (At a discount?)

The other day, a friend asked if I knew anything about Aqua, a restaurant in San Francisco.

My response: “Don’t know it, but I’ll bet it gets 4 stars on Yelp.”

Sure enough, it does. How did I guess that? Because nearly everything I’ve searched for gets four stars on Yelp.

Of course, individual reviews vary. But I’ve always been curious about why listings in many star systems end up with nearly the same average rating over time. With Yelp, I became curious as to why these averages are so high. Looking first at the individual reviews, I saw some psychosocial reasons for this. The site encourages positivity by allowing you to tag other reviews only as Useful, Funny, or Cool.

A typical search result on Yelp.com

(More than once I’ve been tempted to write a review just to call someone else’s Really Dumb.) Reviewers sometimes compensate for a lackluster review with a higher number of stars. Some examples:

– “I’d easily give this place a 3 star [sic], but it gains one star for being the only place to get Sushi in Lincoln Square.” Four stars.

– “Not sure how the hostess sleeps at night with that gigantic stick lodged up her ass.” Four stars.

– “[T]he “scone” was so dry you could sand paint off the walls.” Four stars.

– “I have to say that the drinks I ordered were BAD. My Margarita was so sour and bitter that I had to return it. The vodka tonics must have been made with grade Z tonic water as it tasted like dirty soda water. I won’t even get into the dirty martini.” Four stars.

(Forgive me for taking these clips out of context, but it’s more fun that way.)

The prevalence of positive reviews might also be due to the site’s social networking element, which displays your (ostensibly) real name next to your reviews. Some of these people also get together in person. Do you really want to be the jerk who got all negative over an overcooked burger at the struggling mom-and-pop?

I initially surmised that many in the Yelp community had had those empowered childhoods where criticism was considered demoralizing.* But as I dug further into the reviews, I became impressed by their thoughtfulness. Which suggests another bias: People on Yelp — and elsewhere — tend to review places they like. Farhad Manjoo provides some supporting evidence so I don’t have to.

And yet, individual reviews are not the only cause of high average ratings: Yelp has built the bias into its search engine. At the category search level (e.g. sushi, bars, or salons), “best match” is a weak concept. Lots of results will be highly relevant to a search for sushi. So, what’s the secondary sorting logic? A combination of most reviewed, highest rated, and other special sauce criteria alluded to by a Yelp exec I once spoke with.

When you privilege the most reviewed, highest rated businesses, what happens? Logic indicates that the more reviews there are, the more likely things will average out… and in a community that evaluates matters of high subjectivity with a skew towards positivity, four stars is where the average will land. (Interestingly, All Songs Considered’s now-defunct Open Mic area had anonymous ratings. The song ratings all migrated to a similar average, but they landed more in the middle of the scale.)
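The averaging argument can be sketched with a toy simulation. The two star distributions below are invented to illustrate the contrast between a positively skewed community and a mid-scale anonymous one; they are not drawn from Yelp or Open Mic data:

```python
import random

def average_stars(weights, n_reviews=500, seed=0):
    """Mean of n_reviews star ratings (1-5) drawn with the given skew."""
    rng = random.Random(seed)
    stars = rng.choices([1, 2, 3, 4, 5], weights=weights, k=n_reviews)
    return sum(stars) / len(stars)

# Invented distributions: named reviewers skew positive,
# anonymous raters cluster in the middle of the scale.
named_skew = [1, 2, 5, 12, 8]
anonymous  = [2, 4, 6, 4, 2]

print(round(average_stars(named_skew), 1))  # lands near four stars
print(round(average_stars(anonymous), 1))   # lands mid-scale
```

With a few hundred reviews, the law of large numbers pins each listing’s average close to its community’s typical rating, which is why so many searches surface the same number of stars.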

In addition, the most reviewed will become more reviewed because they appear more often in the top search results, while the less reviewed will continue to lag. Weirdly, the result of this power law distribution is that Yelp falls behind the coolhunters. If Acme Grill had a moment in the spotlight 6 months ago and got tons of reviews, even when its popularity dies down, it will appear higher in Yelp search results than a newer, hipper thing. People will be more inclined to review it, and the situation is perpetuated.
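That feedback loop is essentially preferential attachment, and it can be sketched as a toy urn model. Everything here is invented for illustration; it is not Yelp’s actual ranking logic:

```python
import random

def simulate_reviews(n_businesses=5, n_reviewers=1000, seed=1):
    """Each new reviewer picks a business with probability proportional
    to its current review count (search rank favors the most reviewed)."""
    rng = random.Random(seed)
    counts = [1] * n_businesses  # every business starts with one review
    for _ in range(n_reviewers):
        pick = rng.choices(range(n_businesses), weights=counts)[0]
        counts[pick] += 1
    return sorted(counts, reverse=True)

print(simulate_reviews())  # early leads tend to compound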

I’ve long relied on word of mouth and online reviews to make purchase and entertainment decisions. When review communities first reached critical mass on Amazon, they paralyzed me. I treated any bad review, even when among other good ones, as a veto. By now, however, many of us have learned how to extract what’s useful. I’ve also come to understand that reviews are not just useful for consumers, but fun (and cathartic) for the reviewers to write. That said, Yelp does have a lot of influence, for better and worse. So it’s important to remember that a Yelp star is no Michelin (that’s not entirely a bad thing). And that all stars should be taken with a grain of salt.


*There are likely other variances as well. Geographic, for example: People in the Washington, DC, area seem to me more faux polite than those in New England.

Like anyone interested in food and cooking, I have cookbooks, I have a few issues of cooking magazines… and I have recipes. Recipes clipped and torn from magazines and newspapers, scribbled on scraps of paper, and printed from an email or a Web site.

These recipes form a tattered pile perched atop the cookbooks in my kitchen. The pile occupies a tiny but constant space in the room of my memory palace that is furnished with other personal organizational directives — photos, pantry, clothing, computer files. That room is, naturally, painted a light shade of guilt.

A recent blog post by a foodie friend brought this room to the front of my mind:

…I bought a sturdy-looking accordion file and began going through all the mags to clip out my favorite recipes. It was a herculean task that had to be done over a number of sittings, spread out over several months to allow sufficient recovery time after each brave plunge…

Organizing recipes is what people do, right? I bought an accordion file at Staples, and considered the taxonomy I would use.

  • Appetizers, Entrees, Desserts?
  • Vegetables, Grains, Meats?
  • Sweets and Savories?
  • Large Meals and Small Meals?
  • Quickies and Time-Takers?

I couldn’t commit to an organizing logic. The problem nagged at me, and I began to resent that so much thought was going into a common and ostensibly minor problem. I blamed my difficulty on the years of immersive computer use that had eroded my ability to place items into silos. Online, I search by keyword or browse by tags. On my computer, I keep folders that are loosely structured and highly imperfect because it’s easy enough to copy the same file to several places or (gasp) use the Windows Search when desperate.

But I remained convinced that the physical realm still needs categories, and that the mental exercise of creating them would somehow make me stronger. Until recently, when I had dinner with a few friends and submitted my organizational problem for a collaborative solution.

One (John) has a few tried-and-true recipes that he makes from memory. One is a food blogger, who knows where her recipes are because she turns to them all the time. Both were sympathetic to my organizing drive, but neither seemed compelled to do the same. Their empathetic distance from the issue was rather surprising, as I had not expected the discussion to call the problem itself into question.

Two other friends had no sympathy at all for my recipe problem. One is a core mover in the development of the semantic Web; the other would greatly benefit from its widespread use, judging from his current endeavor. Now, the semantic Web is a complex concept. (Overly) simply put, it’s a way to connect similar types of data across dissimilar digital contexts. One result of this, true believers claim, is that organizing things into categories is a waste of time and shuts information retrievers off from results that might actually be relevant but are in a different category. For example, if a recipe Web site relied on categories alone, an item in the Entree section might make a perfectly nice Appetizer, but the person looking for appetizers would never know.

Fine. I get this, and over the past few years my digital information management methods have shifted significantly from categorizing and browsing to search.

But what about my clippings, my tangible, physical, ratty clippings? The semweb’s got nothing for me here.

Their response: ‘You don’t have that many recipes. If you bother to categorize them, you’re going to end up going through the entire accordion file to get ideas or find the recipe you were looking for, anyway. So why bother filing them in the first place?’

At this point, the brief and powerful manner in which a fundamental assumption of mine — recipes must be organized — was dismissed had an interesting effect: utter acceptance on my part. I also felt a little lighter. That’s one less piece of clutter in the room of personal organizational directives.

The conversation then took a telling turn towards other examples of mild obsessive-compulsive disorder in our lives. I did resent that a little bit. But I could see their point. What if the desire to organize by category yields not a cleaner kitchen, but a prison with guilt-colored walls?

I mulled over this for a few days after that fateful dinner. I came to a couple of other realizations:

-If we took a little survey of the semantic Web developer community, we’d probably find they don’t organize their recipes, they don’t put their photos in albums, I’d hate to see their closets, and they ripped all their CDs and got rid of them as soon as it was technologically possible. They likely have a general aversion to categorizing and culling, but the digital realm is more conducive than the physical realm to a workaround. (If only this crossover fantasy could be a reality without RFID tags everywhere!)

-My need to organize my recipes emerged, at least in part, from a fear of forgetting. I am not a chef or a food blogger, just a reasonably good occasional cook. I will never have enough recipes to lose track of the ones I have. And they’re kept in two places – in that pile in my kitchen, and in my head: there is a recipe-clippings room in my memory palace that I hadn’t realized existed. What brings me there is sometimes rational (I need an appetizer), but more often it’s emotional and sensual. When I think of my beloved grandmother, and I think of her matzo ball soup, I think of the page I wrote it on in a little book given to me by friends on my 22nd birthday. When I remember one of the best dinners I’ve ever hosted, I remember the lamb kofta recipe on its glossy magazine stock in that tattered pile. My pumpkin bread, made hundreds of times, still seems like the perfect thing for every occasion.

In any case, if a recipe does disappear from my memory palace, is that really such a tragedy? It might turn up the next time I rifle through the clippings, or it might not. I’ve always thought of food as experience and memory. So I’ll let it behave like those: fading in and out, taking hold, or slipping away to make room for something new. Here, I finally understand, I do not need a closed system that scales.

Last week, I watched the “60 Minutes” segment about chain restaurants and the growing pressure by New York health officials to prominently display calorie counts.*

My mind turned an ashamed eye back to a Starbucks in Washington, DC, which fed me a small mocha frappuccino a day for an entire, hot summer. Urban legend has it that a frappuccino has ‘as much fat as a Big Mac.’ I was never motivated to research this until now… <researching>… actually, the small one isn’t as bad as that. Phew. Apparently hyperbole goes both ways in the Obesity Wars. But I’ll stick with tea.

Back to “60 Minutes”. Lesley Stahl accuses a calorie count board hanging at a Wendy’s of being “drab and easy to miss.” From what I saw in the video, that seemed a bit unfair. True, the text is small. But there’s lots of stuff to show (which is probably why the Starbucks site is very interactive).

From the show’s online summary:

[A Wendy’s spokesperson] says that because Americans love to customize — adding cheese or extra mayo — providing accurate information is nearly impossible… [He] showed 60 Minutes a Wendy’s menu board that lists the combos.

He then showed Stahl what it would look like [with calorie counts]: a dense, cluttered board, with tiny type. “Obviously … no one can read it. And you would have to see this from eight feet away,” [he] explains.

“Let me see. This is absurd. Oh my gosh,” Stahl remarks.

Seeing the board, I saw an information design problem. A problem for:

Edward Tufte.

As far as I know, Mr. Tufte has never tackled this. How would he — or any information designer — present calorie counts in a way that is accurate, comprehensive, and easy to read at a glance?

Would it change how people order? Would it change how they eat?

In other, more dramatic terms: Could good information design fight obesity? (Has it already, in supermarket labeling?)

*The logic that targets the chains is a bit bizarre, according to “60 Minutes”:

The calorie labeling in New York would not apply to “calorie Meccas,” like Chinese restaurants, delis, and fancy French bistros. The chains were singled out because they already publish nutritional information about their food…

Would I want to be confronted with calorie counts wherever I eat out? No. Would I want to know, when choosing where to eat, that the worst of the offenders had been banned? Sure. For example, while it would be naive to trust that the FDA vets everything, there’s a reason it exists.