In 2004, the much-anticipated “9/11 Commission Report” was released by the National Commission on Terrorist Attacks Upon the United States, which was created by President George W. Bush and Congress to figure out what the hell had happened — and how it could have happened. The report was a must-read in those days, a best-seller.
Sept. 11, 2001, was still an open wound. We had invaded Afghanistan in late 2001, which seemed to make sense, but also Iraq in 2003, which didn’t. The nation was divided — albeit less starkly and more evenly than it is today. When the 9/11 report was released in July 2004, a presidential election was a few months away. People were eager to connect dots, to read an orderly and factual narrative of what had happened, so “The 9/11 Commission Report” flew off the shelves.
The report was widely criticized for all kinds of reasons, including how the commission approached the work. (For an example of vigorous contemporaneous dissent, read this analysis by Richard A. Posner in The New York Times.) And “The 9/11 Commission Report” was not revelatory in any broad sense. But it had the virtue of being a methodical recitation.
It also had the virtue of being a compelling read, which sort of stunned everyone. In fact, if you didn’t know that it was all true, you might mistake “The 9/11 Commission Report” for a novel. It later was a finalist for a National Book Award.
I was the managing editor of Seattle Weekly at the time, and the artful quality of “The 9/11 Commission Report” prompted me to write a review (which apparently only exists in printed form — in a cardboard box somewhere in my basement — because I cannot find my review online).
As I prepared the review, I emailed Al Felzenberg, who was then the chief spokesman for the 9/11 Commission, to ask who was responsible for the lyrical quality of the narrative. “It’s a terrific work,” I wrote on Aug. 18, 2004. “If it was more than one person, whoever did the editing ensured remarkable consistency in the voice.”
The subject line of my email was, “Who wrote it?” He replied immediately.
From: Al Felzenberg <email@example.com>
Date: Wed, Aug 18, 2004 at 6:24 PM
Subject: RE: Who wrote it?
To: Chuck Taylor <firstname.lastname@example.org>
There were many hands and authors. Here are the basics:
We had 7 teams of about a dozen people. Each team had a leader. The leader supervised certain drafts. Then the team met with [9/11 Commission Executive Director] Philip Zelikow and others. They went at it for hours at a time reviewing facts, arguing policy. Philip and company turned it into a first draft and sent it to the commissioners.
Each of the ten commissioners submitted back their drafts with annotations, notes, etc. They would discuss their changes at lengthy meetings. When they agreed on where they came out, they submitted it back to Philip and company, who turned it into a second draft and recirculated it. This process went on about four or five times per chapter. The final two chapters took even more.
Each staff member can probably point to sections he or she drafted. So can commissioners. But the overall style that reads in one voice was the product of Philip Zelikow with careful touches from Chris Kojm the Deputy Director and Professor Ernest May of Harvard.
I suggest you call [former Washington] Senator Slade Gorton for additional thoughts. He was exceptionally active in the drafting of key sections, but he would never claim authorship. The report is a commission document and product.
So you probably know where I’m going with this.
Who will write the inside story of the special counsel’s investigation into interference by Russia in the presidential election of 2016 — and the many related threads?
I don’t mean the official report Robert Mueller will send the U.S. attorney general. Lawyers will write that, without flourish. I mean the inside account of the special counsel’s office itself. It would likely have to be someone with special access, someone perhaps already documenting the investigation in real time. Or someone with access to official records and the private journals of the participants.
Is such an account even possible? Because if it is, it would be one helluva read.
Election night for newspapers is crazy. It’s the news equivalent of the Super Bowl — multiplied by however many races need coverage. For the print edition, you write and edit a bunch of stories, compile lists of results, lay out a bunch of pages and do all this within the span of a couple of hours. The print edition must roll off the press during the night for delivery in the morning. For the web, people expect instantaneous news, and they expect that report to be updated frequently. The deadline for a news website is now and forever.
That expectation of immediacy is true for broadcasting, too, of course, but textual journalism must be both timely and timeless. I am not going to watch a recording of CNN’s coverage of the presidential race the next day to find out what happened. I am going to seek an account that is smartly organized and can be consumed more efficiently. That is almost certainly going to be a story, or a graphical representation of the outcome. Digital or inked textual journalism must be produced with urgency similar to broadcasting, but it must be more polished and thoughtful. The only way to do that well is to plan well. Portions of stories are written in advance. If it’s a really important race that will require nuanced analysis on deadline, two entire versions might be written before the results are known. Plug in the numbers and away you go.
Sometimes the most basic information is the hardest to prepare for publication. This year The Daily Herald of Everett, Washington, where I am the digital news editor, identified 63 noteworthy races or ballot measures. We needed to make a table of results for national, state and local races, and there would be multiple sources for those data — national media, the state of Washington and several counties.
Sure, The Associated Press would feed us those data in an efficient stream if we were willing to pay several thousand dollars more than our AP membership costs, but that expenditure is tough to justify at a small publication. We needed to be self-sufficient, using free sources of government results online and bringing them together in our own custom list.
I thought about taking the state’s RSS feed of results and parsing it with a text macro, but the structure of the XML made this too challenging — I didn’t have the coding skill and extra time to do this, or the budget to hire someone who did. Besides, our county’s results were disseminated in a different format — flat-file HTML. We needed to accommodate both sets of data on deadline and regurgitate them in separate formats for print and the web.
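For anyone considering the parsing route today, a minimal sketch in Python might look like the following. Everything here is hypothetical — the element names, attributes and sample data are invented for illustration, and the state’s actual feed was structured differently (which is part of what made parsing it impractical on deadline).

```python
import xml.etree.ElementTree as ET

# Hypothetical XML in the general shape of a results feed. The real
# state feed's element and attribute names were different and more
# deeply nested.
SAMPLE_XML = """
<results>
  <race name="U.S. Senate">
    <candidate name="Candidate A" votes="1200"/>
    <candidate name="Candidate B" votes="800"/>
  </race>
</results>
"""

def parse_results(xml_text):
    """Return a flat list of (race, candidate, votes) tuples."""
    root = ET.fromstring(xml_text)
    rows = []
    for race in root.findall("race"):
        for cand in race.findall("candidate"):
            rows.append((race.get("name"),
                         cand.get("name"),
                         int(cand.get("votes"))))
    return rows

for row in parse_results(SAMPLE_XML):
    print(row)
```

Even a sketch this small assumes a stable, documented feed structure — an assumption that rarely holds across a state and several counties, which is why hand entry into a spreadsheet won out.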
I came up with a simple solution. The downside was that it required human data entry, either by typing the numbers into a spreadsheet, or by cutting and pasting them from government-generated results found online (my preferred method). The upside was that we would have our own custom compilation of results which could be exported for presentation however we wished.
If you want to do this yourself, you need only a spreadsheet. We are a Google Docs shop, so we used Sheets, the cloud-based Google spreadsheet app. (Sheets is inferior to Excel, IMO, but it’s the tool available to everyone in the room, which is important.) Since Sheets resides in the cloud, multiple people can enter data into the same document simultaneously. The advantage of that when you’re on a tight deadline should be obvious.
Here’s how I did it, step by step.
We needed to output results in tabular form for import into InDesign for the printed page, and as HTML tables for a web page. That meant we needed the same data exported from Sheets in two distinct formats — tabbed plain text for print and carefully tagged HTML for the web. We also needed a clean interface within the spreadsheet so humans could accurately input the results.
First, I compartmentalized the results, bearing in mind how I wanted the list organized and how people in the newsroom would be finding and compiling the results. I created a separate sheet, or tab, for each silo — one each for president, Congress, statewide races and measures, local races, etc. For example, I created a tab within the sheet for congressional races and designed it to be human-friendly for data entry.
I made the cells for data entry green and locked all the others so that mortals couldn’t break anything. I also made the tabs for the data-entry sheets green to indicate it was OK to edit them.
Formulas in this sheet automatically calculated percentages from the vote totals. I did this for two reasons: it’s one less number to enter per candidate (you don’t have to enter both the vote count and the percentage), and it provides a factual check — you can compare the calculated percentage in the spreadsheet to the percentage published with the government election results to make sure you entered the raw numbers correctly. This approach requires that you also enter the total number of votes cast in a given race, but it’s still less typing: in a two-way race, three numbers instead of two vote counts and two percentages, with the added bonus of quality control during data entry.
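The arithmetic the sheet formulas performed can be sketched in Python — a hypothetical illustration of the calculation and the quality check, not the actual Sheets formula:

```python
def percent(votes, total_votes):
    """Mirror of the sheet formula: a candidate's votes as a share of
    all votes cast in the race, rounded to two decimal places."""
    if total_votes == 0:
        return 0.0
    return round(100.0 * votes / total_votes, 2)

# Quality check during data entry: the calculated percentage should
# match the percentage published alongside the official results.
# These numbers are invented for illustration.
calculated = percent(5234, 10000)
official = 52.34  # percentage as published by the elections office
assert abs(calculated - official) < 0.01  # a mismatch means a typo in the raw numbers
```

The same comparison happened by eyeball in the newsroom: if the sheet’s computed percentage disagreed with the government’s published one, someone had fat-fingered a vote total.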
Then I created two more sheet-tabs for the same data. These would be used for exporting the data that was entered into the first version of the sheet. I used formulas to pull the data from the first sheet into the others. But I designed the other sheets differently, bearing in mind how I wanted the data to be exported for print and the web. I formatted the text in the cells of these duplicate sheets differently to make it easy to troubleshoot any problems.
The sheet for export to print was simple, because an exported tab-separated file (.tsv) was all that was needed. InDesign knows what a tab is, and that file could be imported into an InDesign page pretty much as-is. I made the tabs for print-export sheets blue.
The sheet for export to the web was more complicated, because there would need to be HTML code co-mingled with constantly changing data. (The HTML tags in the cells beside the data would not change.) I could still export this as a .tsv file, because web browsers know to ignore tabs and just read the HTML tags. But the spreadsheet needed to be organized carefully so all the HTML appeared in the right order in relation to the data. I made HTML-export tabs yellow.
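To illustrate how unchanging HTML tags can sit in cells beside changing vote totals — and how the same entered data can feed both the print and web export sheets — here is a small Python sketch. The race, names and vote counts are hypothetical, and the real sheets did this with cell references and formulas rather than code:

```python
# One race's entered data (invented values for illustration).
race = {"title": "U.S. House, District 2",
        "candidates": [("Candidate A", 61234), ("Candidate B", 48321)]}

def print_rows(race):
    """Rows for the print (blue) sheet: tab-separated plain text
    that InDesign can import into a pre-formatted table."""
    total = sum(v for _, v in race["candidates"])
    lines = [race["title"]]
    for name, votes in race["candidates"]:
        pct = round(100.0 * votes / total, 1)
        lines.append("\t".join([name, f"{votes:,}", f"{pct}%"]))
    return "\n".join(lines)

def web_rows(race):
    """Rows for the web (yellow) sheet: static HTML tags in cells
    beside the data. Browsers ignore the tabs and read only the
    tags, so this can still be exported as a .tsv file."""
    total = sum(v for _, v in race["candidates"])
    lines = [f"<tr>\t<th colspan='3'>{race['title']}</th>\t</tr>"]
    for name, votes in race["candidates"]:
        pct = round(100.0 * votes / total, 1)
        lines.append(f"<tr>\t<td>{name}</td>\t<td>{votes:,}</td>"
                     f"\t<td>{pct}%</td>\t</tr>")
    return "\n".join(lines)

print(print_rows(race))
print(web_rows(race))
```

The design point is that both functions read the same entered data, just as both export tabs pulled from the same green data-entry sheet; only the surrounding formatting differs.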
So here’s how the whole process worked:
1. Election returns were entered into the green cells in the green sheets. I prefer to cut and paste the data from the government website, one number at a time, and use the data-only paste option in Google Sheets to prevent foreign HTML or formatting from contaminating the results:
2. The print (blue) version of that tab was exported as a .tsv file:
3. The web (yellow) version of that tab was exported as a .tsv file:
4. The print .tsv files were converted to .txt, then dragged into a shared folder that is visible to InDesign. A colleague, news-desk chief Mark Carlson, figured this part out. He created a macro in InDesign that imported the tabular data from those files into a pre-formatted table on the InDesign page. The file contents, before import into InDesign, looked like this:
5. I opened the web .tsv files and pasted their contents into a page on our website. (We happen to use WordPress for hosting. We bypassed our content-management system, NewsEngin GPS, with this process.) Here’s what the code looked like before pasting:
Once the data were entered into the green sheets, this whole process took just minutes. Of course, the more tabs you have in your spreadsheet, the more files there are to export. Here’s how the end result looked in print and on the web:
Here’s our entire results page on the web, and below is the results page in print:
This setup is ready for next year, obviously. Because odd-numbered years have more races (lots and lots of city councils), I’m glad I tested this method in 2016. But I have a feeling I will come up with something even smarter and more awesome next year. Stay tuned.
Richard Perez-Pena, a journeyman reporter for The New York Times national desk, has written an interesting explanation of the revival of the role of rewrite editors in news. Things have kind of come full circle. The rewrite method of preparing news for publication evolved to overcome a lack of tools. Today it is in use again to cope with an overabundance of tools and sources of information, and with the fact that things keep changing and the digital platform needs to be constantly fed.
As recently as the early 1990s, reporters had a notepad and a pocketful of coins, and that was about it in terms of tools in the field. If there was urgency, you found a landline telephone — usually a pay phone, but sometimes you knocked on a stranger’s door to ask if you could use theirs — to call in your story or dictate your notes. Sometimes you dictated the first draft of a story to whoever was on the other end of the line, but usually what you dictated was a combination of usable newswriting and notes, because you were flipping through the scribbled-on pages of the notepad. (“Oh, here’s a quote for you to insert higher up in the story …”) The person in the newsroom then rewrote this stuff into a polished narrative.
When I had to dictate like this, I used to close my eyes and pretend I was in front of a typewriter. Writing that was generated with typewriters was not so easily polished on the fly; you let your consciousness stream onto the page and refined it later. Writing was a more linear process than it is today. So dictating over the phone was simply a matter of switching from the keyboard to your voice. The result was going to be kind of rough either way. In the field, sometimes you took a few minutes to hand-write a story and then read it over the phone. But more often it was a mix of publishable paragraphs recited into the receiver and raw facts or quotations. In any event, it was quick and dirty.
For those of you not in the business, Perez-Pena’s description of what happens at The New York Times is fairly typical of how newspapers handle breaking news today. Few of us have a dedicated rewrite desk waiting for stuff to happen, but when big news breaks and you have multiple reporters involved or developments in multiple places, one person pulls it together in the newsroom and posts iterations to the Web as events unfold. What Mr. Perez-Pena did not get into is that everything else gets refined, too, as the story develops. We add images, improve the headlines, add links, change the home page layout. At some point late in the day, you settle on a version of a story that will go to print. And then you might update the online version again that night or in the morning.
Throughout this process, we practice a discipline that distinguishes professional journalism from all the crap online: verification. When we want to include something in a story, the operative question is always, “How do we know that?” Good news reporting is very labor intensive, even in modest situations, and high technology has made that no less the case than in the days of pay phones. That’s why it’s so important for you to be willing to pay for news — even a news product you know is occasionally flawed. I know everyone complains about their local paper, but try to imagine it not there.
When South Lake Union is too expensive for the top TV station in town, that’s a sign — of the demise of local TV, which is the next medium to begin a rapid economic descent, after radio and newspapers.
TV stations used to be all about visibility and ostentation — the tower with Christmas lights, the unique building, the talent, the helicopter, the branding, the logo-laden vehicles, the live shot, the reporter or anchor sent to a faraway disaster for no good reason but to preen. (Remember the KIRO news jet?)
But now cable and the Internet are killing them, too. Viewership of local TV news is flat or down. Meanwhile, the population increases steadily.
As it was with newspapers, the first thing to go is the valuable real estate. In five or 10 years, we’ll probably have one station doing news and perhaps sharing that. KING and KOMO already are sharing a helicopter. KOMO’s former parent sold Fisher Plaza. KCPQ and KIRO will be next to sell their dirt in the former media gulch.
It’s the end of an era.
Six years ago, a phrase emerged that grew popular as an explanation of the demise of the journalism business:
“If the news is that important, it will find me.”
Market researcher Jane Buckingham related that quote to The New York Times in 2008, attributing it to a college student who was participating in a focus group. Those of us who make a living by being curious and methodically documenting the world have been brow-beaten with that proud intellectual passivity ever since.
At the time, the growing cottage industry of legacy-journalism jeerleaders seized on “it will find me” as original haiku for the fate of institutionally disseminated news and the rise of social media. The problem is, “it will find me” wasn’t original. It wasn’t even surprising.
As a group, college students have been notoriously self-centered since the invention of college. Young people never embraced newspapers or watched TV news. (Those of us who did were weird.)
Even the famously activist baby-boomer generation didn’t actually leave the dormitory in significant numbers until important news arrived in the form of military draft cards. That was a good reason to leave the dorm and take up signs, to be sure, but it was hardly singular enlightenment on their part. The news found them — without an Internet, even — and they paid attention because their lives were in danger.
So “it will find me” is true, but it always has been. It says nothing about the technological change that endangers the business model of professional journalism. It says a lot about how little humans have changed. Most of us are as incurious as ever, especially when we are young.
The reason this sticks in my craw six years on is that “it will find me” persists as supposedly original wisdom unique to Millennials. The quote popped up in a tweet recently. I retweeted it, with an admittedly sarcastic and untrue introduction:
— Chuck Taylor (@chcktylr) February 22, 2014
This of course prompted an earnest critique.
— Sean Robinson (@seanrobinsonTNT) February 22, 2014
To which, prudently, I did not reply, tempted though I was. “Actually, Sean, you’re right, it will find them — just before they get shipped off to Vietnam” did not seem likely to enlighten. Nor did, “Try seeing the world through the eyes of people who aren’t assholes.”
I wish Sean and the rest of us in this business would recognize that technology has changed, and the commerce of information has changed, but people have not. Let’s not pretend that Millennials are somehow behind a sea change in journalism. They just happen to be there. The natural tendency of young people to be self-absorbed is better served by today’s tools, that’s true. But young people are not what’s new and wise, and they surely shouldn’t get a trophy for being incurious.
If you await news that’s important enough to find “me,” you’ll never encounter or understand anything outside yourself.