Election night for newspapers is crazy. It’s the news equivalent of the Super Bowl — multiplied by however many races need coverage. For the print edition, you write and edit a slate of stories, compile lists of results and lay out pages, all within the span of a couple of hours. The print edition must roll off the press during the night for delivery in the morning. For the web, people expect instantaneous news, and they expect that report to be updated frequently. The deadline for a news website is now and forever.
That expectation of immediacy is true for broadcasting, too, of course, but textual journalism must be both timely and timeless. I am not going to watch a recording of CNN’s coverage of the presidential race the next day to find out what happened. I am going to seek an account that is smartly organized and can be consumed more efficiently. That is almost certainly going to be a story, or a graphical representation of the outcome. Digital or inked textual journalism must be produced with urgency similar to broadcasting, but it must be more polished and thoughtful. The only way to do that well is to plan well. Portions of stories are written in advance. If it’s a really important race that will require nuanced analysis on deadline, two entire versions might be written before the results are known. Plug in the numbers and away you go.
Sometimes the most basic information is the hardest to prepare for publication. This year The Daily Herald of Everett, Washington, where I am the digital news editor, identified 63 noteworthy races or ballot measures. We needed to make a table of results for national, state and local races, and there would be multiple sources for those data — national media, the state of Washington and several counties.
Sure, The Associated Press would feed us those data in an efficient stream if we were willing to pay several thousand dollars more than our AP membership costs, but that expenditure is tough to justify at a small publication. We needed to be self-sufficient, using free sources of government results online and bringing them together in our own custom list.
I thought about taking the state’s RSS feed of results and parsing them with a text macro, but the structure of the XML made this too challenging — I didn’t have the coding skill and extra time to do this, or the budget to hire someone who did. Besides, our county’s results were disseminated in a different format — flat-file HTML. We needed to accommodate both sets of data on deadline and regurgitate them in separate formats for print and the web.
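For shops with a little scripting capacity, the parsing route is worth sketching. Below is a minimal illustration using Python's standard library; the feed structure, race names and vote counts are invented for the example — the real state XML is structured differently, which is exactly what made it a poor fit on deadline.

```python
import xml.etree.ElementTree as ET

def parse_results(rss_text):
    """Parse a hypothetical results feed into {race: {candidate: votes}}.

    Assumes each <item> has a <title> (the race) and a <description>
    holding semicolon-separated "Candidate: votes" pairs. A real
    government feed will use a different, messier structure.
    """
    root = ET.fromstring(rss_text)
    results = {}
    for item in root.iter("item"):
        race = item.findtext("title", "").strip()
        tallies = {}
        for pair in item.findtext("description", "").split(";"):
            if ":" in pair:
                name, votes = pair.rsplit(":", 1)
                tallies[name.strip()] = int(votes.strip().replace(",", ""))
        results[race] = tallies
    return results

sample = """<rss><channel>
<item><title>U.S. Senate</title>
<description>Smith: 1,913,979; Jones: 1,329,338</description></item>
</channel></rss>"""

print(parse_results(sample))
# {'U.S. Senate': {'Smith': 1913979, 'Jones': 1329338}}
```

Even a sketch like this shows the real cost: every source publishes a different structure, and each needs its own parser — which is why hand entry into one spreadsheet won out.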
I came up with a simple solution. The downside was that it required human data entry, either by typing the numbers into a spreadsheet, or by cutting and pasting them from government-generated results found online (my preferred method). The upside was that we would have our own custom compilation of results which could be exported for presentation however we wished.
If you want to do this yourself, you need only a spreadsheet. We are a Google Docs shop, so we used Sheets, the cloud-based Google spreadsheet app. (Sheets is inferior to Excel, IMO, but it’s the tool available to everyone in the room, which is important.) Since Sheets resides in the cloud, multiple people can enter data into the same document simultaneously. The advantage of that when you’re on a tight deadline should be obvious.
Here’s how I did it, step by step.
We needed to output results in tabular form for import into InDesign for the printed page, and as HTML tables for a web page. That meant we needed the same data exported from Sheets in two distinct formats — tabbed plain text for print and carefully tagged HTML for the web. We also needed a clean interface within the spreadsheet so humans could accurately input the results.
First, I compartmentalized the results, bearing in mind how I wanted the list organized and how people in the newsroom would be finding and compiling the results. I created a separate sheet, or tab, for each silo — one each for president, Congress, statewide races and measures, local races, etc. For example, I created a tab within the sheet for congressional races and designed it to be human-friendly for data entry.
I made the cells for data entry green and locked all the others so that mortals couldn’t break anything. I also made the tabs for the data-entry sheets green to indicate they were OK to edit.
Formulas in this sheet automatically calculated percentages from the vote totals. I did this for two reasons: it’s one less number to enter per candidate (the vote total, but not the percentage), and it provides a factual check — you can compare the calculated percentage in the spreadsheet with the percentage published alongside the government’s results to confirm you input the raw numbers correctly. This requires that you also enter the total number of votes cast in a given race, but that’s still fewer numbers: for a two-candidate race, three numbers instead of two vote totals and two percentages, with quality control during data entry as a bonus.
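The arithmetic behind that check is simple enough to sketch outside the spreadsheet. A minimal illustration in Python (the function names, tolerance and sample numbers are my own, not part of the Herald's workflow):

```python
def percent(votes, total):
    """A candidate's share of the vote, rounded to one decimal place."""
    return round(votes / total * 100, 1)

def entry_looks_right(votes, total, published_pct, tolerance=0.1):
    """Flag a likely typo: the computed share should match the
    percentage published with the official results."""
    return abs(percent(votes, total) - published_pct) <= tolerance

# 5,250 of 10,000 votes is 52.5 percent.
print(percent(5250, 10000))                  # 52.5
print(entry_looks_right(5250, 10000, 52.5))  # True
print(entry_looks_right(5205, 10000, 52.5))  # False (transposed digits)
```

In Sheets the same idea is a one-cell formula dividing the vote cell by the total cell; the payoff is that a transposition error shows up immediately as a mismatch against the published percentage.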
Then I created two more sheet-tabs for the same data. These would be used for exporting the data entered into the first version of the sheet. I used formulas to pull the data from the first sheet into the others, but I designed those sheets differently, bearing in mind how I wanted the data to be exported for print and the web. I also formatted the text in the cells of these duplicate sheets differently to make it easy to troubleshoot any problems.
The sheet for export to print was simple, because an exported tab-separated file (.tsv) was all that was needed. InDesign knows what a tab is, and that file could be imported into an InDesign page pretty much as-is. I made the tabs for print-export sheets blue.
The sheet for export to the web was more complicated, because HTML code would need to be commingled with constantly changing data. (The HTML tags in the cells beside the data would not change.) I could still export this as a .tsv file, because web browsers ignore tabs and just read the HTML tags. But the spreadsheet needed to be organized carefully so all the HTML appeared in the right order relative to the data. I made HTML-export tabs yellow.
So here’s how the whole process worked:
1. Election returns were entered into the green cells in the green sheets. I prefer to cut and paste the data from the government website, one number at a time, using the data-only paste option in Google Sheets to prevent foreign HTML or formatting from contaminating the results.
2. The print (blue) version of that tab was exported as a .tsv file.
3. The web (yellow) version of that tab was exported as a .tsv file.
4. The print .tsv files were converted to .txt, then dragged into a shared folder visible to InDesign. A colleague, news-desk chief Mark Carlson, figured this part out. He created a macro in InDesign that imported the tabular data from those files into a pre-formatted table on the InDesign page. Before import, the file contents were plain tab-separated text.
5. I opened the web .tsv files and cut and pasted their contents into a page on our website. (We happen to use WordPress for hosting; this process bypassed our content-management system, NewsEngin GPS.) The pasted code was ordinary HTML table markup.
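To make the two export formats concrete, here is a sketch of what one race might look like on its way out of the blue and yellow sheets. The column layout, tags and numbers are illustrative assumptions, not the Herald's actual templates:

```python
# Illustrative results for one race: (candidate, votes, percent).
race = [("Smith", 5250, 52.5), ("Jones", 4750, 47.5)]

# Print (blue) export: plain tab-separated values, which InDesign
# can pour into a pre-formatted table nearly as-is.
print_tsv = "\n".join(f"{name}\t{votes:,}\t{pct}%" for name, votes, pct in race)

# Web (yellow) export: the same data with unchanging HTML tags in the
# neighboring cells; browsers ignore the tabs and read only the tags.
web_rows = "\n".join(
    f"<tr>\t<td>{name}</td>\t<td>{votes:,}</td>\t<td>{pct}%</td>\t</tr>"
    for name, votes, pct in race
)
web_tsv = f"<table>\n{web_rows}\n</table>"

print(print_tsv)
print(web_tsv)
```

The point of the design is visible here: one set of green cells feeds both strings, so the numbers can never disagree between print and web.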
Once the data were entered into the green sheets, this whole process took just minutes. Of course, the more tabs you have in your spreadsheet, the more files there are to export. Here’s how the end result looked in print and on the web:
Here’s our entire results page on the web, and below is the results page in print:
This setup is ready for next year, obviously. Because odd-numbered years have more races (lots and lots of city councils), I’m glad I tested this method in 2016. But I have a feeling I will come up with something even smarter and more awesome next year. Stay tuned.
Am I an optimist or a pessimist? It depends. The optimist in me thinks I have a lot to say. The pessimist counters that no one wants to hear so much negativity. Hence, one blog post per year.
The optimist in me thinks the Internet has been an enormously good thing for human knowledge and expression. The pessimist notes that uploading humanity’s collective wisdom to the cloud has revealed what foggy, uncritical thinkers we are. For example, Quora.
The fact Quora exists makes my heart soar. How cool is it that anyone anywhere can throw a vexing question into the wind and have experts on that very topic answer it?
On the other hand, who thought it was a good idea to let the stupid and the lazy reveal how ignorant they are by enabling the dissemination of unanswerable questions?
To wit — or, more precisely, too nitwit.
When South Lake Union is too expensive for the top TV station in town, that’s a sign of the demise of local TV — the next medium to begin a rapid economic descent, after radio and newspapers.
TV stations used to be all about visibility and ostentation — the tower with Christmas lights, the unique building, the talent, the helicopter, the branding, the logo-laden vehicles, the live shot, the reporter or anchor sent to a faraway disaster for no good reason but to preen. (Remember the KIRO news jet?)
But now cable and the Internet are killing them, too. Viewership of local TV news is flat or down. Meanwhile, the population increases steadily.
As it was with newspapers, the first thing to go is the valuable real estate. In five or 10 years, we’ll probably have one station doing news and perhaps sharing that. KING and KOMO already are sharing a helicopter. KOMO’s former parent sold Fisher Plaza. KCPQ and KIRO will be next to sell their dirt in the former media gulch.
It’s the end of an era.