this space intentionally left blank

March 3, 2016

Filed under: journalism»industry

Spotlit

Judging by my peers, it's possible that I'm the only journalist in America who didn't absolutely love Spotlight. I thought it was a serviceable movie, but when it comes to this year's Best Picture award I still harbor a fantasy that there's an Oscar waiting in Valhalla, shiny and chrome, for Fury Road (or for Creed, if push came to shove).

But I'm not upset to see Spotlight win, either. The movie may have been underwhelming for me, but its subject deserves all the attention it gets (whether or not, as former NYT designer Khoi Vinh wonders, the Globe fully capitalizes on it). My only real concern is that soon it'll be mostly valuable as a historical document, with the kind of deep reporting that it portrays either dying or dead.

To recap: Spotlight centers on the Boston Globe's investigation into the Catholic Church's pedophilia scandals in the 1990s — and specifically, into how the church covered up for abusive priests by moving them around or assigning them to useless "rehabilitation" sessions. The paper not only proved that the church was aware of the problem, but also demonstrated that the abuse was far more common than anyone suspected. It's one of the most important, influential works of journalism in modern memory, done by a local newsroom.

It's also a rare story of successful data journalism: while my industry niche likes to talk itself up, our track record is shorter than many of us like to admit. The data in question isn't complex — the team used spreadsheets and data entry, not scripting languages or visualizations — but it represents long hours of carefully entering, cleaning, and checking data to discover priests who were shuffled out of public view after reports of abuse. Matt Carroll, the team's "data geek," writes about that experience here, including notes on what he'd do differently now.

So it's very cool to see the film getting acclaim. At the same time, it's a love letter to an increasingly small part of the news industry. Investigative teams are rare these days, and many local papers don't have them anymore. We're lucky that we still have them at the Seattle Times — it's one of the things I really like about working there.

Why do investigative teams vanish? They're expensive, for one thing: a team may spend months, or even a year, working on a story. They may need legal help to pursue evidence, or legal protection once a story is published. And investigative stories are not huge traffic winners, certainly not in proportion to the effort they take. They're one of the things newsrooms do on principle, and when budgets get tight, those principles often start to look more negotiable than they used to.

In this void, there are still a few national publishers pursuing investigations, both among the startups (Buzzfeed, which partnered on our mobile home stories) and the non-profits (ProPublica and the Marshall Project). I'm a big fan of the work they're doing. Still, they're spread thin trying to cover the whole country, or a particular topic, leaving a lot of shadows at the local level that could use a little sun.

It's nice to imagine that the success of Spotlight the movie will lead to a resurgence in funding for Spotlight the investigative department, and others like it. I suspect that's wishful thinking, though. In the end, that Oscar isn't going to pay for more reporters or editors. And if even Hollywood glamor can't get them funded, can anything?

February 17, 2016

Filed under: culture»pop»comics

Excelsior?

Marvel Comics has a digital subscription service called "Marvel Unlimited" that's basically Netflix for their comics: access to most of their archives online for ten bucks a month or so. I decided to give it a shot after Ta-Nehisi Coates kept singing its praises. I buy a few trades a year, but don't always keep them on my shelf, and I figured this was a good chance to go trolling through a few classics that aren't collected in print anymore.

Is it worth it? Well, usually. It turns out that Marvel's back catalog is hardly immune to Sturgeon's Law: most of it is crap. It doesn't help that it's almost all superhero-flavored, which is fine in small doses but starts to feel a little ridiculous when you're exposed to literally thousands of titles and they've all got capes: really, this is all you have? Sure, it's Marvel and that's what they do, but knowing that there's a broad range of other stories being told in this medium makes their genre limitations feel all the more jarring.

Marvel's other bad habit, which only seems to have gotten worse as far as I can tell, is the "special events" that make it impossible to just read through a single storyline. For example, trying to read through the new X-Men titles is an exercise in frustration, since they keep being interrupted or pre-empted by crossover stories from other books. As a way to sell comics to a hardcore faithful, it probably works pretty well. But as a relative newcomer, it's disorienting and irritating, as though a medical drama came crashing into your favorite sitcoms at random intervals.

As a result, it's not surprising that my favorite series to read so far have been either standalone humor titles or oddball takes on the genre. Dan Slott's 2004 run on She-Hulk (often referred to as "Single Green Female") is more legal workplace drama than anything else, and while it sometimes gets too clever with the meta-humor, it delivers a nice, funny, self-contained story arc. Ditto for The Superior Foes of Spider-Man, which ran for 17 issues and follows a set of petty, incompetent super-thieves who get in way over their heads. X-Men: Legacy is another short storyline focusing on Charles Xavier's son, David, who has some legitimate disagreements with his "peaceful" father's violent vigilante organization. With its frequent trips into psychic psychedelia, it makes a great case for the infinite effects budget that comics so rarely exploit.

On the other side of the coin, I went trolling through Walt Simonson's tenure on Thor, which ran back in the 1980s and often gets mentioned as a stellar example of classic comics writing. It's pretty good! But it's also a decidedly weird artifact: while there's overlap with the rest of the Marvel universe from time to time, most of the story is a kind of bonkers faux-Norse legend, with characters taking oaths of honor, pursuing doomed love, and striking off on various quests. The most impressive thing, from a modern perspective, is how many storylines it manages to juggle per issue. There are A, B, and C plots, and sometimes even a D plot, all playing out in 30-page chunks.

But by far my favorite discovery has been the original reason I signed up: Priest's late-'90s Black Panther, which is a really fascinating, thought-provoking bit of work. While parts of the art and dialog have not aged gracefully, a lot of it continues to feel very current, both in terms of subject matter and storytelling.

As early as possible, and throughout the rest of the book, Priest emphasizes that T'Challa (the titular Panther) is not just a vigilante out to fight crime, like other superheroes. He's the king of a country — a legitimate state power with an entirely different set of priorities and concerns. To drive that point home, Priest frames the narrative as a series of progress reports from the US liaison to T'Challa, Everett Ross, a move that turns out to be an elegant narrative hat trick:

  • Being a white State Department functionary, Ross can explain the political element of the books and serve as an audience surrogate for the largely white readership.
  • He's useful as comic relief, which is good, since the story arcs themselves revolve around political coups and international sovereignty, and can get a little byzantine.
  • He's a terrible narrator, which sets up a running gag where each issue opens disastrously in medias res and then unshuffles itself as Ross is forced to double back and explain the situation.

It's a comic book, so of course there are goofy action scenes, and much like the current crop of comic-inspired movies, these rarely rise above "vaguely interesting." But when I think back to the most memorable pages, it's mostly the quieter or more subversive scenes. Most of the real plot happens in dialog: negotiations between the Panther and other governments, discussions of succession and history, sarcastic asides that mock the standard superhero schtick. Along the way, Priest is happy to extend a scene for either pathos or awkward humor, to undercut his own pretension, or to let characters react to The Black Panther's quietly revolutionary core — an African nation that's portrayed as a technological superpower of its own. As Coates says, when talking about his own plans to write for the character:

It's obviously not the case, but T'Challa — the Black Panther and mythical ruler of Wakanda — has always struck me as the product of the black nationalist dream, a walking revocation of white supremacist myth. T'Challa isn't just a superhero in the physical sense, he is one of the smartest people in the world, ruling the most advanced civilization on the planet. Wakanda's status as ever-independent seems to eerily parallel Ethiopia's history as well as its place in the broader black imagination. Maybe it's only me, but I can't read Jason Aaron's superb "See Wakanda And Die" and not think of Adowa.

Comic book creators, like all story-tellers, get great mileage out of myth and history. But given the society we live in, some people's myths are privileged over others. Some of that is changing, no doubt. In the more recent incarnations of T'Challa you can see Christopher Priest invoking the language of the Hausa or Reginald Hudlin employing the legacy of colonialism. These were shrewd artistic decisions, rooted in the fact that anyone writing Black Panther enjoys an immediate, if paradoxical, advantage: the black diaspora is terra incognita for much of the world. What does the broader world really know of Adowa? Of Nanny and Cudjoe? Of the Maji-Maji rebellion? Of Legba and Oshun? Of Shine? Of High John The Conqueror? T'Challa's writers have always enjoyed access to a rich and under-utilized pool of allusion and invocation.

It's a proudly Afrocentric (and Afrofuturist) book, way ahead of its time, and put out by a major comics publisher. I imagine there are a lot of people for whom these throwaway, cheaply printed comics were profound experiences when they were young. It's hard to imagine how much of that material can translate through to the eventual movie version, even when directed by a thoughtful and talented filmmaker like Ryan Coogler. But kids who go looking for the originals after they see it in theaters are in for a real surprise.

February 5, 2016

Filed under: journalism»articles

Catch-up: 2015

The last thing I'd written about here was the paper's investigation into police shootings, so let's take this chance to wander through the rest of 2015.

In October, after a Minnesota dentist shot Cecil the lion and made himself temporarily infamous, one of our reporters put in a records request for all historical animal imports into the USA. The resulting story involved querying seven and a half million rows of data to find out what we import, and how Paul Allen's Initiative 1401 (which banned the resale of several species of animal trophies) would affect those imports (answer: hardly at all). We also got to do some fun visualizations for it.

In November, my teammate Audrey worked with the Seattle Sketcher to create a voiced history of Ravensdale, a boomtown destroyed after a mining accident. In general, audio slideshows aren't hugely successful online, but I think this one was a really pleasant experience, and analytics indicate that a lot of people listened to it.

Every year, during the Seahawks season, the paper does a series of "paper hawks" — foldable paper dolls for players on the team. The last one is blank, so people can put in their own faces. To make things interesting, I put together a paper hawk web app that could use a camera to take a picture of the reader, do all the customization in the page (including changing skin tones and hair color), and then print it out. This was an interesting project in part because the API I used (getUserMedia) is restricted to HTTPS in Chrome. To make it work, we moved all of our projects to secure domains, which was a great test case for encrypting additional content at the paper.
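
For the record, the core of that camera flow is small. Here's a minimal sketch (the element selectors and snapshot function are made up for illustration, and it uses the modern promise-based API; at the time it needed a vendor prefix or a polyfill):

    // minimal camera-to-canvas sketch; not the actual paper hawk code
    var video = document.querySelector("video");
    var canvas = document.querySelector("canvas");

    navigator.mediaDevices.getUserMedia({ video: true }).then(function(stream) {
      video.srcObject = stream;
      video.play();
    });

    // on a button click, freeze a frame so it can be recolored and printed
    var snapshot = function() {
      canvas.width = video.videoWidth;
      canvas.height = video.videoHeight;
      canvas.getContext("2d").drawImage(video, 0, 0);
      // ...adjust skin tone and hair pixels, composite onto the template, window.print()
    };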

For MLK Day, my team revived the Seattle Times' tribute to the great man, which was originally published twenty years ago (and had been last updated in 2011). The new version is responsive and easier to update, so that each year we can add more information to it. It's fitting, of course, that the paper has a page just for Dr. King, since they were a major part of the campaign to rename King County in his honor back in 1995. It's pretty cool to keep that tradition going.

Finally, just this week, we published a Pacific NW Magazine story on modern dating, with an interactive "mini-documentary" that I built with our video team. Based on your answers, it generates a custom playlist from the interviews that we recorded. We were inspired by this great piece done by the Washington Post on "the N word." I really enjoyed putting the interactions and animation together, but honestly, most of the credit goes to our video team, and my work was just the window dressing.

These are just the major interactives, of course. All told, we built 84 projects of all sizes last year, not including various small pages built by the producers using our app template. That's a pretty good rate of production for a two-developer team. Here's to a busy 2016!

January 21, 2016

Filed under: journalism»professional

Unconferencing

How do we level up data journalists? In a few months, we'll have a new digital/data intern at the Times, and so I've been asking myself this question quite a bit, especially in light of our team's efforts to recruit diverse candidates. There are a lot of students and young journalists out there with a little bit of training, but no idea where to go from there: how do we get them across the gap to where they're capable of working on a newsroom development team? There's a catch-22 at work here: it's especially tough for aspiring news devs to get a job without experience, but they can't get experience without the job.

One strategy I've often heard is that young people should attend industry conferences as a way to learn from experienced journalists and build connections. Myself, I'm skeptical of this. Conferences have never really been a part of my professional life. We didn't go to them at CQ, and I never got a chance to go to GDC when I worked in the game industry. After I was hired at the paper, I got to go to SND2015 and Write the Docs, and this year I'm heading to NICAR, SRCCON, and (possibly) CascadiaJS. It's possible I really hate myself.

Visiting conferences is rewarding, but it's also exhausting, expensive, and a huge time-sink. And while host organizations often work to mitigate that through scholarships and grants to disadvantaged communities, it's still a big ask for neophytes. Even if I weren't skeptical of the benefits conferences actually bring, I think it's hard to argue that we don't need better, more accessible solutions.

The way I see it, there are three things that you get out of a conference as a young person:

  1. Mentorship
  2. Training
  3. Exposure to developing industry trends

Of the three, the first is the hardest to duplicate, and yet it's the most crucial. Networks are powerful in this industry, and you can practically watch them develop before your eyes if you look closely: young people who catch a break early with the right people, and find themselves quickly elevated with opportunities to work on well-known teams, fill industry panels, and write insipid Nieman Lab think-pieces on the future of news. Then we all end up competing over hiring those same six people, which I don't really think is healthy.

Ironically, this is something I want to discuss with other newsrooms at the conferences this year, before I retreat into my Seattle cave for the rest of my natural life. But I'm also starting a personal initiative to make myself available for "remote mentorship," and asking other people to do so. If you're in news and would like to join, feel free to add yourself to the sheet, and I'll share it with students or other people who get in touch!

December 28, 2015

Filed under: tech»web

Let's not

Right now you can access my portfolio over a secure, encrypted connection, thanks to Let's Encrypt. Which is pretty cool! On the other hand, if nginx restarts this week, it'll probably crash on a bad config value, temporarily disabling all my public-facing websites. This has been emblematic of my HTTPS experience in general: a mix of triumphs and severe configuration mishaps.

A little background: in order to serve a website over a secure connection, you need a digital certificate to encrypt communication with the browser. You can generate these certificates yourself, but that's really only good for personal use: the self-signed cert has to be manually installed on each machine that accesses the server, or the browser will throw up a big, ugly warning screen. The alternative is to buy a certificate from a "trusted authority," most of which are not particularly trustworthy or authoritative, but buying one will get you a green lock icon in the URL bar. Purchased certs tend to be either expensive or a hassle or both.

After the Snowden leaks, there was a lot of interest in encrypting all web traffic, which meant bypassing the existing certificate authority protection racket run by Symantec et al. Mozilla and some other organizations got together and started Let's Encrypt, with the goal of making trusted certificates free and easy. I figure they're halfway there: I didn't pay anyone for the cert, at least.

There's an official client for the service, but it only works with Apache and it's kind of hefty. My server is set up in an unsupported (but still pretty standard) configuration: I run nginx as a reverse proxy in front of Apache (for PHP scripts) and Node (for various apps, including Weir), both of which I'd like to be secured. So I used acme-tiny instead, which basically just talks to the cert API and is small enough that I could read and understand the whole thing. I wrote a shell script to wrap it up and automate things. Automation is important, because unlike paid certificates, these are only good for 90 days, so you need a cron job set to run every month or so to renew them.
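
In outline, the renewal job boils down to three steps: ask acme-tiny for a fresh signed cert, swap it into place, and reload nginx. Here's a rough sketch of the idea (in Node rather than the shell script described above, with made-up paths; the flag names come from acme-tiny's documentation at the time):

    // hypothetical renewal job, run from cron every month or so
    var { execSync } = require("child_process");
    var fs = require("fs");

    // acme-tiny answers the ACME challenge via files in --acme-dir,
    // then prints the signed certificate to stdout
    var cert = execSync(
      "python acme_tiny.py --account-key /etc/ssl/account.key " +
      "--csr /etc/ssl/example.com.csr --acme-dir /var/www/challenges/"
    );

    // back up the current cert first; rate limits make re-issuing painful
    fs.writeFileSync("/etc/ssl/signed.crt.bak", fs.readFileSync("/etc/ssl/signed.crt"));
    fs.writeFileSync("/etc/ssl/signed.crt", cert);

    // nginx checks the config before reloading, which is the safety net
    execSync("nginx -t && service nginx reload");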

Setting all this up wasn't an easy process. The acme-tiny script is well-written, but it has bugs on the version of Python that comes with CentOS. Then I had to set up nginx to use the certificates manually. My webmail got locked into an infinite redirect once I moved my self-signed cert from Apache out to the proxy. And the restart crash? Turns out that Let's Encrypt is rate-limited on a per-domain basis, and I didn't back up the current certificate before I hit the rate limit, so my update script overwrote it with an empty version. Luckily, nginx keeps the certs it loaded and won't reload over a bad config, so I'm safe as long as it can outlast the seven-day rate-limit window (it probably will: it's been up 333 days so far, after all).

Without literally years of server admin experience, I'm not sure I would have made it through these issues. And as I mentioned, my system is pretty standard — there's no load-balancer, no CDN, and I don't need to host third-party content. I also don't have any business that gets lost if anything is busted and the certificate expires in March. If I were, say, an IT department responsible for a high-traffic site, I'd be a lot more cautious about moving everything over to HTTPS, either through Let's Encrypt or a paid option.

Ultimately, the news industry and other sites are going to have to follow the lead of the Washington Post, even if it takes a while. Even apart from the security benefits, browsers have locked new features (Service Worker, for example) behind HTTPS, and are moving old features behind it as well (geolocation is going to be the biggest disruption there). If you want to develop fast websites in the future (assuming that's something news product management cares about, which is... questionable), and especially if you want to create rich news applications, you're going to have to be encrypted.
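
As a concrete example of that gating, registering a Service Worker is only a few lines, but browsers simply won't expose the API on a plain HTTP page (the sw.js filename here is just a placeholder):

    // feature-detect, then register; navigator.serviceWorker only shows up over HTTPS
    if ("serviceWorker" in navigator) {
      navigator.serviceWorker.register("/sw.js").then(function(registration) {
        console.log("offline caching active for", registration.scope);
      });
    }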

In my case, I wanted to get a head start on developing with new browser features (a Service Worker would clean up a lot of Weir code), so it's worth the hassle. And we will continue to push these boundaries on the Seattle Times interactives team, since we've moved our S3 hosting to HTTPS (the rest of the site will follow eventually).

But I think there's a lot of tension between where we want to be, as a news industry, and where it's possible for us to be right now. Although I've seen people calling for incentives to change it (such as requiring HTTPS for news grants), the truth is that it just isn't that simple. News sites are often built in a baroque, overcomplicated set of layers — the Seattle Times, for example, currently sits behind a CDN, several instances of Varnish, some reverse proxies, and a load balancer, mostly due to a lot of historical baggage. Changing this to run securely is going to be a big process, even for a company of our size (maybe because of our size). I can't imagine the hassle for local papers that might have little or no IT support. It won't happen overnight, and Let's Encrypt hasn't done anything to change that yet.

In the meantime, I think it's worth stepping back and asking what we really want out of a digital news industry, because sometimes it's hard to maintain perspective from in the trenches. Is it important that readers be able to see our sites securely, free from worries that third parties are snooping or altering what they see? Sure, that's important. Is it in the top three things that Americans need from local news, above problems like "a sustainable revenue model" and "a CMS that doesn't actively fight against the newsroom?" Probably not. Given a choice between a cryptographically secure media and a diverse, sustainably funded media, I'm personally going to take the latter every time.

December 22, 2015

Filed under: random»personal

Post-SCC Plans

Last Tuesday was my last day at Seattle Central College, and I turned in my grades over the weekend. If nothing else, this leaves me with 10-20 free hours a week. And while I'll no doubt spend much of that time watching movies, practicing my dance moves, or catching up on my Steam backlog, I do have some projects that I want to finally start (or restart) in my spare time.

  • Rebuild Grue: back when I was leaving Big Fish Games, I spent a couple of weeks working on a text adventure framework called Grue. The main goal of it was to make constructing interactive fiction in JavaScript as easy as possible. I think it was reasonably successful: a sample world is surprisingly readable and intuitive. It also got a bunch of things wrong (weird inheritance system, poor module setup). I'm planning on rebooting Grue in 2016, using ES2015 and a Node-compatible interface.
  • Upgrade Caret's find/replace functionality: A few months ago, in one of my rare open source success stories, a contributor added project-wide search to Caret — a much-requested feature for years now. Unfortunately, we're still stuck with the default Ace dialog for find/replace within a single file. This year, I want to pull that out and re-implement it as a Caret UI widget, which (among other things) will fix a number of regular-expression bugs.
  • Play music: Bass took a backseat to breaking when I started dancing a few years ago. These days, my fingers are noticeably slower on the strings than they used to be, which seems like a shame since I bought a really nice bass before we moved to Seattle. If I can find a laid-back open mike, I might resuscitate Four String Riot for a session or two.
  • Break the web: In a recent project at the paper, I started using the getUserMedia API to access the built-in camera from a web app. It turns out this is, apart from some weirdness and the need to polyfill, pretty great: you don't need native code to access the camera (of course it's not in Safari). Now I want to do some additional mini-apps that use other future-forward web APIs, like Service Worker and gamepads.
  • Write another book/article series: I've gotten a lot of really good feedback on JavaScript for the Web Savvy, both from inside my classes and from random readers around the Internet. Now that I have some time, I'd like to write another book, probably this time packaging up some of the lessons I've learned working in data journalism. I also want to pitch an article series that helps get people from the basics of web production up to more serious news app development.
  • Hacks/Hackers meetup: Finally, last year I took over the local Hacks/Hackers meetup group, but I've been too busy to organize anything for it. Now that I have the time to round people up and gather resources, I want to make good on my goal of holding a day-long event one weekend — either as a hack day or a training session of some kind. More details as I figure it out!

December 7, 2015

Filed under: tech»web

How We're Fast

Over the Thanksgiving holiday, when I wasn't busily digesting as much cornbread stuffing as I could eat, I spent some time running WebPageTest against various projects that the Seattle Times Interactive team has built. The news industry as a whole may not care about speed, but I do, and I want our pages as fast as possible — especially the ones that are embedded in the regular CMS via responsive frames.

After all the testing, I'm generally pleased by how our stuff stacks up, especially when compared against the rest of the site. We have some advantages, of course: our pages typically have fewer ads, and we can strip down the page for maximum efficiency. But it's also the result of a lot of hard work on our news app template, ensuring that every project comes with smart decisions built in. I genuinely think that all news pages could be this fast, so it's worth talking about how we've made it happen, especially for other news organizations that use a similar flat-file approach to their interactives.

Browserify with care

We use Browserify to package up our JavaScript, because we're not savages, and you need some sort of module system for JavaScript these days. Browserify builds all our scripts into a single file, which is important for high-latency connections (which means most cellular networks, even on 4G). We also make sure to load that bundle file with the async attribute at the bottom of the page, so that it won't block rendering.

All of that is pretty standard best practice, but we've also learned that Browserify can be dangerous if you're not careful. A lot of NPM modules are published with the unminified, debug version of the library as the default export from the module. Angular in particular is bad about this: running require("angular") on its own will load a file filled with comments and documentation, totaling more than a megabyte in size (even after gzip, it's still more than 200KB). That's huge!

As a result, one of our production checklist items is to make sure that we are loading the minified version of any external libraries. We also use the browser property in our package.json file to alias common libraries to their minified versions, so that when we require Angular, jQuery, or Leaflet, it automatically defaults to the smallest file.
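
For example, a browser field along these lines tells Browserify to substitute the minified build whenever one of those modules is required (the exact file paths depend on the package versions, so treat this as a sketch rather than our actual package.json):

    {
      "browser": {
        "angular": "./node_modules/angular/angular.min.js",
        "jquery": "./node_modules/jquery/dist/jquery.min.js",
        "leaflet": "./node_modules/leaflet/dist/leaflet.js"
      }
    }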

Gzip on S3

Like a lot of newsroom developers, my team hosts files on Amazon S3, mostly because it's cheap and reliable. People like to think about S3 as though it's just a normal, hierarchical flat-file server, like Apache or Nginx, but it's not. S3 is really a key-value store: you put in a path, and it spits back a prerecorded reply, including the headers.

If you think of S3 as a server, you'll expect it to do a bunch of things that it doesn't actually do. For example, it doesn't set a cache expiration date, and it doesn't know about content types. It also doesn't understand Gzip compression, so it'll merrily serve your files in their uncompressed form, making them way bigger than they need to be, even if the browser requests the compressed version.

We get around this by running a compression stage on any text-based file during deployment, and setting the headers for the stored object to match. This does mean that, theoretically, a browser that doesn't support Gzip will be unable to read that content, because S3 will always respond with compressed content no matter what Accept-Encoding header the browser sends. Luckily, every browser since IE4 supports it.
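
In outline, that stage amounts to something like the sketch below, using the Node AWS SDK (the bucket, paths, and cache lifetime here are placeholders, not our real deploy config):

    // gzip a text asset and store it on S3 with the response headers baked in
    var fs = require("fs");
    var zlib = require("zlib");
    var AWS = require("aws-sdk");

    var s3 = new AWS.S3();
    var compressed = zlib.gzipSync(fs.readFileSync("build/app.js"));

    s3.putObject({
      Bucket: "example-news-apps",
      Key: "projects/example/app.js",
      Body: compressed,
      // S3 replays these headers verbatim on every request
      ContentType: "application/javascript",
      ContentEncoding: "gzip",
      CacheControl: "public, max-age=300"
    }, function(err) {
      if (err) throw err;
    });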

Reduce framework code

I love Angular. If you want to quickly generate a visualization with powerful tools for filtering and data binding, you can't do much better. I personally think it's an order of magnitude better than D3. But Angular can also be brutally slow: its change detection algorithm requires a lot of time and memory as a tradeoff for developer convenience.

On a recent project that looked at animal imports, we started with Angular as a way to test out the visualization, but soon noticed that it was taking three or four seconds just to parse and apply the data. On a desktop, that time is a drag. On mobile, it's likely to get the tab terminated, or convince readers that there's something wrong with it.

When the profiler says that you're spending that much time in JavaScript, there are two options. The first is to try to find ways to work around the framework, which can range from unpleasant to actively painful. The second is to just rewrite in vanilla JS. It sounds more difficult to do the rewrite, but if all you're doing is data-binding and events, you can usually replace it pretty easily with a little templating and some custom data attributes. The resulting code isn't as clean or simple, but in the case of the animal imports, it dropped our JS execution time to under 100ms. That's fast.
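
The pattern for that kind of rewrite is usually just a render function plus a couple of listeners. Here's a simplified sketch (the data shape, markup, and the global data array are invented for illustration, not the actual animal imports code):

    // render a filtered table body from a plain array, no framework required
    var tbody = document.querySelector("tbody");
    var filter = document.querySelector("[data-filter]");

    var render = function(rows) {
      tbody.innerHTML = rows.map(function(row) {
        return `<tr><td>${row.species}</td><td>${row.count}</td></tr>`;
      }).join("");
    };

    filter.addEventListener("input", function() {
      var term = filter.value.toLowerCase();
      render(data.filter(function(row) {
        return row.species.toLowerCase().indexOf(term) > -1;
      }));
    });

    render(data);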

Even jQuery can be optional these days. Because we compile ES6 down with Babel, a lot of DOM code that would otherwise be ungainly becomes elegant. Template strings and arrow functions alone have allowed us to cut out DOM libraries entirely, and as a result many of our interactives ship with no external libraries at all. If you haven't checked out the advantages of using Babel in your build process, it's well worth a look.
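
A couple of one-line helpers cover most of what we used to reach for jQuery to do; a sketch of the idea (the button selector is hypothetical):

    // tiny query helpers: $ for one element, $$ for an array of matches
    var $ = (selector, scope = document) => scope.querySelector(selector);
    var $$ = (selector, scope = document) => Array.from(scope.querySelectorAll(selector));

    // arrow functions keep event wiring terse without $.fn.on()
    $$("button.expand").forEach(button => {
      button.addEventListener("click", () => {
        button.closest("section").classList.toggle("open");
      });
    });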

Reduce third-party code

The number one contributor to page load time is not written by journalists: it's the third-party ad code that runs on the page. There may be only so much you can do about this, since it pays the bills, and of course it may not even apply on embedded graphics. But on our standalone pages, I've taken a strong stance on implementing all code ourselves whenever possible. For example, although our commenting system usually requires multiple scripts loaded synchronously, I wrote a loader that runs through and adds them asynchronously, and only after a user clicks on the "view comments" banner. We can't avoid the hit, but we can delay it until well after the rest of the page has had a chance to render.
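
The loader itself is tiny. Something like this sketch (the script URLs and button selector are placeholders) is enough to hold back the whole commenting stack until the reader asks for it:

    // inject a list of scripts in order, but only after the reader opts in
    var loadScript = function(src) {
      return new Promise(function(resolve, reject) {
        var script = document.createElement("script");
        script.src = src;
        script.onload = resolve;
        script.onerror = reject;
        document.head.appendChild(script);
      });
    };

    document.querySelector(".view-comments").addEventListener("click", function() {
      // chain them so each dependency is in place before the next one loads
      ["//example.com/comments-config.js", "//example.com/comments-embed.js"]
        .reduce(function(chain, src) {
          return chain.then(function() { return loadScript(src); });
        }, Promise.resolve());
    });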

Lazy-load everything

Once you've delayed scripts with the async attribute, trimmed the size of those scripts and compressed them, and deferred as much third-party code as you can, what's left over? In our case, this is where we start getting into the structure of the actual interactive, and how it loads itself. For most interactives, we embed data directly into the page, but beyond a certain size it becomes worthwhile to grab it via AJAX instead.

But there's another way to think about lazy-loading, and that's to consider what format you're actually using to populate the page. I'm as big a fan of progressive enhancement as anyone else, but in the case of my team, what we produce is interactive — there's literally no point if JavaScript is disabled. I've found that moving content into JSON and then templating it onto the page can reduce download times significantly, while the cost of rendering it client-side is negligible. Finding the balance between network speed and JavaScript execution time is a constant process for us.
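
In practice, the decision looks roughly like this sketch (the size cutoff happens at build time; the element ID, file name, and render function are placeholders):

    // small payloads ride along in the page as an inline JSON blob:
    //   <script type="application/json" id="seed-data">[ ... ]</script>
    var seed = document.querySelector("#seed-data");

    if (seed) {
      render(JSON.parse(seed.textContent));
    } else {
      // larger payloads get requested after the page has painted
      fetch("data/rows.json")
        .then(function(response) { return response.json(); })
        .then(render);
    }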

When performance matters

Finally, a note of caution: as much fun as it is to squeeze every last millisecond out of the browser, I'm a little uncomfortable making it the alpha and omega of the job. Ultimately, our goal is to inform people — we'd like that to be fast, but a fast page with bad or misleading reporting is still a failure.

What I like about front-end speed is that it serves as a useful proxy for site quality. A site that's fast can't load too many ads. It can't serve too many tracking scripts. It has to put the reader first. It's easy, much of the time, to chip away at performance in the name of business metrics: loading an additional analytics script to get more information, or an obnoxious ad for a short-term revenue boost.

But if you put speed first, every decision has to start from the perspective of "what's good for the reader?" It's hard to measure the impact of good journalism, but we can have metrics for speed and other technical aspects of the presentation. We can spend more time on the former if we have strong, user-centric guidelines on the latter. If we want people to give us money over the long term, that seems like the only healthy strategy to me.

November 20, 2015

Filed under: tech»coding

Weir, year two

I realized the other day that Google Reader shut down in June of 2013, which means I've been using Weir as my RSS reader for more than two years now. It's my longest-running software project, and still one of the most complex things I've built in Node. And apart from occasional revisions, it's been up and running constantly, in mobile and desktop browsers, that entire time.

I don't log many metrics from Weir, so there's a lot of stuff that I'm not tracking. But I can say that there are currently 113 subscriptions, with around 6,000 stories in the database. The server that hosts the app (as well as my various domains) downloads about 20GB of data each month, most of which is probably Weir (the rest is e-mail and server updates, and I'm frankly not that popular). It also hovers around 10% of available memory, which is pretty good for a garbage-collected language on a piddly little VM.

On the client side of things, the Angular code has definitely started to show its age. This was the project that I used to learn Angular, and since then I've learned a lot about the framework. Would I use it if I were writing Weir from scratch? I'm not sure. I still love the databinding aspect of Angular, but I suspect I could write a smaller, nimbler version of the UI in vanilla JavaScript pretty easily. At some point, I may give it a shot: the server API is clean enough that writing a new client should be relatively straightforward.

As an experiment in self-hosting a cloud service, Weir is a mixed success. But I have grown to love the way that something I wrote has become a fixture of my life. I clear out my stream on the bus in the morning. Throughout the day, Weir's purple tab icon lights up to let me know that new items are available. It feels like wearing clothes that I tailored for myself — using it feels a little nicer than it should, just from the pride in its construction.

November 4, 2015

Filed under: tech»education

Closing the textbook

Last week, when the administration sent out their quarterly "please someone cover these classes, we are very desperate" email, I put in my notice at Seattle Central College (how's that for irony?). I'll be finishing up this quarter teaching ITC 210, and then I'll need to find a new way to occupy 10-20 hours a week. For a start, I'm planning to volunteer for the local Girl Develop It organization as a TA. I'll be able to cook for Belle more often. And I'd like to be more active in managing the local Hacks/Hackers chapter that I took over earlier this year.

SCC does have some deep organizational problems, and I won't pretend they haven't influenced my decision to leave. But I don't regret the time I spent there: there's been little as rewarding as seeing people take the information I can give them and really run with it. Teaching has often pushed me to make sure that I knew every detail of a subject so that I wouldn't mislead students, and it's gotten me to explore new workflows and clarify my thinking on a lot of topics.

The most important thing I've learned isn't anything technical. Early on in my time as an instructor, I would often be surprised when students wouldn't know something basic, even though it might have been in the prerequisites (only later would I find out how porous those prereqs are at SCC). After a little while, I made a conscious decision that my reaction should be enthusiasm instead of surprise. Although I'm not a huge fan of XKCD, I was inspired by this comic.

Approaching ignorance not as a character flaw or personal failing, but as a chance to share something cool, was great for students. It provided a perspective from which basic questions can turn into an enthusiastic deep-dive into a topic — something even advanced students might find valuable. And it kept me engaged far longer than I think I could have managed in a curriculum where the opportunities to teach really high-level, interesting techniques just weren't there.

Although it seems a bit like pablum, and deeply out of character for a cynic like me, I actually believe that enthusiasm will continue to be useful, even when I'm not teaching regularly. After all, I work on the bleeding edge of an industry that's still struggling to figure out the Internet: the least I can do is be positive about it, for my sake if not for theirs.

October 14, 2015

Filed under: journalism»industry

AMPed up

This morning, you can read my opinions (along with those of three other newsroom developers) on AMP, Google's proposed ultra-fast publishing format. I'm the most optimistic of the four, even though I wouldn't say that I'm enthusiastic. I think it's an interesting format, and possibly a kick in the pants for the business side of the industry.

In the last question of the interview, I talk a little bit about how I don't think site performance is a topic of actual discussion for product managers at news organizations, and as a result speed is still not a priority for them. What I didn't get in, but wish I had, is that I'm not sure they're wrong about that. Certainly, performance is important and third-party code has run rampant on mobile pages. But is that really what's killing us?

I think it's worth remembering that this whole conversation started, in part, because Facebook decided that they want to be a publisher. Of course, nobody with a firm grasp on reality would think that handing full control of all their content over to Facebook is a good idea, so Zuckerberg's posse needed to create an incentive. Instant Articles ensued: in a burst of publicity, Facebook announced that the web was "slow" (with a lot of highly suspect numbers quantifying that slowness) and proposed their publication system as a way to speed it up.

Since in general we like nothing more than talking about how awful our industry is, journalists leapt to join in: why yes, now that you mention it, look how slow our sites are! Clearly, that's the problem (and not, say, the fact that Facebook holds our referral traffic hostage). It's the same reaction the industry has every time Apple releases a new device — cue exhaustive (and exhausting) ruminations on how to create compelling smartwatch content. Yuck.

This is not the first time that Facebook has created panic around the open web in order to make its social racket seem more appealing. In 2011, Anil Dash wrote his infamous post Facebook is gaslighting the web, documenting their practice of putting scary warnings on outgoing links while privileging their (short-lived) "seamless sharing" program. I think we should be careful about accepting their premises, even when they seem to jibe with the larger conversations around web technology.

Which brings us back to the question: should we care that news sites are slow?

My thought is that from a technical side, we should obviously care. Everyone on the web cares about speed. It has a proven effect on things like purchases and on-site time. It's an important metric, and one we should absolutely take seriously. But from a product standpoint, is it the most important thing? No. It's a Product X, and Product X will not save journalism (that post is from 2010, and sure enough, I think I've linked to it once a year since). It's easier to pitch a silver bullet than to admit the harder truth: that the key to our success is putting out journalism that is good enough that people will pay for it, one way or another.

It's possible, unfortunately, that there is no general-audience journalism good enough to make people pay for it anymore. And in that case, we are all doomed, with the possible exception of the NYT and whatever hipster media startups can get Comcast to cough up $200 million in funding. So it goes. But if we're going to be doomed, I'd rather be honest about why that is. It's not because we're slow. It's not because the ads are horrible. It's because our readers didn't think what we put out was important enough to pay for. That's enough of a tragedy on its own.
