Election Night CoveritLive and Webcast Analysis

Almost a week after the election, I’m ready to spend some time thinking about how we all did. What worked, what didn’t, and what lessons I’m carrying forward.

Overall, it was all a bit much. I got the sense this was a chance for news orgs to try out shiny new tools, which is not necessarily a bad thing (the more we play with them, the closer we all get to figuring out how to use them), but it definitely felt like a herculean effort put behind a not-quite-as-herculean happening.

What won the night? For me it was the Guardian’s live blog. It used cool, shiny, new tools, but didn’t feel like an excuse to try out new tools at the expense of comprehensible coverage (like some other things I saw that night). The platform was pretty cool – it was a blog that auto-updated every minute, and you could choose to turn the auto-update on or off. It was easy to monitor and scroll through to catch up with results. Plus, it had a great sense of humor about everything. Maybe the sense of humo(u)r comes from the luxury of being an outsider, but it was a refreshing change. I think this was my favorite piece of online coverage, and the Twitter buzz I saw on it backs that up.
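Mechanically, that kind of toggleable auto-update is a simple pattern: poll for new entries on a timer, and let the reader switch the timer on and off. Here’s a minimal sketch in TypeScript. The Guardian hasn’t published its implementation, so everything here (the `fetchUpdates` callback, the entry shape) is illustrative, not their actual code:

```typescript
// A live blog that can poll for new entries on an interval, with a
// reader-facing on/off toggle. All names here are hypothetical.

type Entry = { id: number; html: string };

class LiveBlog {
  private timer: ReturnType<typeof setInterval> | null = null;
  public entries: Entry[] = [];

  constructor(
    // Returns entries newer than the given id (e.g. from a JSON endpoint).
    private fetchUpdates: (since: number) => Promise<Entry[]>,
    private intervalMs = 60_000, // refresh every minute
  ) {}

  // Pull any entries newer than the latest one we already have.
  async refresh(): Promise<void> {
    const since = this.entries.length
      ? this.entries[this.entries.length - 1].id
      : 0;
    const fresh = await this.fetchUpdates(since);
    this.entries.push(...fresh);
  }

  // Flip auto-update on or off; returns the new state (true = polling).
  toggle(): boolean {
    if (this.timer) {
      clearInterval(this.timer);
      this.timer = null;
    } else {
      this.timer = setInterval(() => void this.refresh(), this.intervalMs);
    }
    return this.timer !== null;
  }
}
```

The point of the toggle is exactly the usability win noted above: readers who want to scroll back and catch up aren’t fighting a page that keeps jumping underneath them.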

Here are some other things that caught my eye:


On mobile

I was really interested to see a number of news orgs put some effort into making their election results accessible on the mobile web (rather than building out flashy new apps, which is where everyone’s mind seems to go lately).

The three I noticed on the night were the Washington Post, with a mobile-accessible results map; the New York Times, with a mobile results dashboard; and NPR, with a special mobile elections page.

I didn’t see this the night of, but 10,000 Words says the Wall Street Journal built their web experience with translation to mobile in mind. Their interactive results map was created without using Flash, which sounds difficult…and pretty cool.

Mobile fail? The New York Times’ SMS service. At 10am Wednesday I was still receiving text alerts for races that had been called the minute the polls closed, and they must’ve sent at least 20 messages in total. It sort of solidified my growing feeling that national (or international) news is rarely a useful thing to send via SMS. SMS is so intrusive that it really only works when it’s highly local or highly personalized.


Twitter

Someone asked me that night what she should look at to help her understand the role Twitter can play. Instead of directing her to a specific Twitter search or hashtag, I made a point of showing her what was being done around aggregation and visualization of tweets. I felt that what was different this time was the effort news orgs were making to tame the firehose and mold it into something digestible and useful (although, if she had wanted to look at hashtags, #ivoted and the various vote report tags were interesting as well).

Some only managed to make something cool, some genuinely managed something useful, but there was definitely a recognition that just slapping up a Twitter widget doesn’t cut it anymore. Not only can we do cooler things, but a stream of tweets without any context or curation just isn’t informative enough.

CNN’s Election Pulse map used tweets to put a finger on the mood of the country – a really interesting experiment that actually produced some intriguing results (in no state did the majority of tweets for or against the Tea Party focus on policy – I wonder how much that has to do with Twitter’s limitations as a medium versus the mood of the elections).

The New York Times also had a cool thing – an interactive chart/timeline tracking how Twitter mentions of different candidates rose and fell over time. You could scroll over a particular candidate and compare them to their opponent, and let the visualization play to see the ebb and flow over time.
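Under the hood, a chart like that is just mention counts bucketed over time. Here’s a hedged sketch of the aggregation step in TypeScript – the tweet shape and field names are my assumptions for illustration, not anything from the Times’ actual pipeline:

```typescript
// Count how often a candidate is mentioned per hour: the raw series
// behind a mentions-over-time chart. Tweet fields are illustrative.

type Tweet = { text: string; timestamp: number }; // timestamp in epoch ms

function mentionsPerHour(tweets: Tweet[], candidate: string): Map<number, number> {
  const buckets = new Map<number, number>();
  const needle = candidate.toLowerCase();
  for (const t of tweets) {
    if (!t.text.toLowerCase().includes(needle)) continue;
    const hour = Math.floor(t.timestamp / 3_600_000); // 1-hour buckets
    buckets.set(hour, (buckets.get(hour) ?? 0) + 1);
  }
  return buckets;
}
```

Run the same tweet stream through this once per candidate and you have two series to lay side by side – which is essentially the comparison the Times’ visualization let you play through over time.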

MSNBC supposedly had some sort of Twitter art made up of avatar photos (which I would say falls on the side of cool rather than useful), but when I accessed their elections page all I saw were two live tweet streams.

Other social media

The Washington Post (and some others) gave Storify a trial run. I’m torn on this one, and on Storify in general. Storify seems to make curation from the web and social media easy, and I like the idea of aggregating all these resources in one place to tell a cohesive story. But I think that the average person will look at a Storify stream and have no idea how to digest it. In fact, I looked at the Washington Post’s Storify stream and had no idea how to digest it.

I felt like I was seeing a whole lot of data points but had no way to weave them together into a story. I simply didn’t know how to analyze them as a whole, which is supposed to be the whole point of Storify. (Maybe I’m being old-fashioned? Youngster @Amadeus3000 says: “I like maps and tweets. Its more real time. Text stories are so 10 minutes ago.”) I heard someone say it might work better for more feature-y coverage, and I think I agree.

More cool things

PBS Newshour was really pushing hard on their use of UStream for live coverage, and encouraging others (pubmedia or not) to embed it. The syndication model seemed to work really well for them, and I’d be interested to know how many embeds they got on websites other than public media stations. WOSU paired the Newshour UStream embed with an NPR news stream and their own live chat. When I checked in at 10:45 the UStream player had nearly 3,000 viewers (@gteresa says the unofficial total was more than 125,000 views, with around 3,150 watching at one time).

The one thing I was surprised about was that Newshour didn’t push any of the social elements of UStream, like the live chat and social stream, which seem to me some of its major benefits. When we looked into using UStream, we actually had a harder time exporting our programming to their player than we would have had simply using our own embeddable Flash player.

There’s even more on:

Nieman Lab

Lost Remote

and 10,000 Words

And how did our stuff go?

It wouldn’t be fair to critique everyone else and not look at what we did. We did a live television broadcast on election night, but our main push was on a Friday postmortem live event/webcast looking at the international implications of the election.

In the week leading up to that event (including election night) we used #usavotes to share stories on the global impact of the elections and drum up buzz for the event.

The event itself was a live stream paired with a CoveritLive live chat/live blog, in which 3 VOA reporters live blogged and commented on the event, and answered any questions that came in.

Here’s the recap that I sent along to the powers-that-be…the PR-friendly version of how this all went (with my additional thoughts that I didn’t share with them in italics).

Here’s what we accomplished:

1) Streaming the video using an embedded player. This means that our audience can watch the video right on our website, rather than having to click a link and be taken offsite to watch it.

I was really pleased about being able to do this.  I know it seems so obvious, but we don’t do our live streams in embedded players.  We provide a .asx link that opens in your desktop Windows Media Player.  Unfortunately I wasn’t able to get a different file format, so the embedded player was a Windows Media Player as well.  Next time I’d at least like to have a Flash player.  My actual intention was to run it through UStream, and I’d really love to figure out how to make that happen.

2) Implementing a live chat. The CoveritLive chat box worked really well. It’s extremely easy to administer and lets a number of people manage a chat together. We had 3 people working the chat – myself, Raza and Doug Bernard – and we were able to talk back and forth with the audience and with each other easily.

No complaints here.  We didn’t expect really wide audience participation, so planned in advance to approach it as more of a liveblog than a place to field audience questions.

3) Getting audience involvement. Like I said, we don’t typically do things like this, so we don’t have an audience that’s in the habit of participating with us. Given that, I felt really good about the questions and comments we were able to solicit in advance using social media, and about the participation we got during the event.

The main lesson here was that if you want to get audience involvement, you have to promote the heck out of it starting far in advance and targeting the right audiences.  That requires some serious organization-wide buy-in and commitment – and strategizing.  It also reinforced the value of getting the audience engaged in small ways all the time so that the community’s there when you need it.

Update: Ran across this great recap of NewsHour’s election night operations.  Apparently there was a lot more going on behind the scenes, and it’s a really interesting read.
