I've had a full week now to digest my TOC experience in New York and hopefully I will manage to capture in this post the many points of real value that I gleaned from it. For those of you I didn't meet on Twitter, it was actually me behind the @thedigitalist tweets, not Sara. Like many delegates this year, I had the slightly schizophrenic experience of following Sara's keynote on Twitter whilst also watching her deliver it. In a funny way, the main 'take home' of TOC 2009 could be the usage explosion on Twitter. At one point, there was a question from the floor that had been asked via Twitter - this even surprised a roomful of webby technotypes and internet futurologists!
The themes running through the conference this year, as I interpret them, were: product, discovery, behaviour.
The product under discussion is still more ebooks than books. Ebooks are top of mind for digital publishers, although I'm not sure that's the way we'd all like it to be, as there are lots of other things to be getting on with too. Twitter, for example. The encouraging news, to my mind, was that Bill McCoy from Adobe announced increased support for exporting to .epub across the Adobe suite of creative content tools, adding momentum to a standard-format approach. Importantly, this also adds the creation of .epub documents to the general skill set of everyone, in the same way that people have learned to create .pdf. The ability, and tendency, to create documents in .epub will feed adoption of .epub-optimised devices (i.e. e-readers). And that general behavioural change will benefit publishers offering content in .epub.
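For anyone curious what 'creating a .epub' actually involves under the hood: an EPUB is just a ZIP container with a fixed layout - an uncompressed `mimetype` entry first, a `META-INF/container.xml` pointing at a package (OPF) file, and the XHTML content itself. Here's a minimal sketch in Python; the filenames, title and identifier are invented for illustration, and a production EPUB 2 file would also want an NCX table of contents, which I've skipped:

```python
import zipfile

def make_minimal_epub(path, title, xhtml_body):
    """Write a bare-bones EPUB: a ZIP whose first entry is an
    uncompressed 'mimetype' file, as the OCF container spec requires."""
    with zipfile.ZipFile(path, "w") as z:
        # The mimetype entry must come first and be stored uncompressed.
        z.writestr("mimetype", "application/epub+zip",
                   compress_type=zipfile.ZIP_STORED)
        # container.xml tells reading systems where to find the package document.
        z.writestr("META-INF/container.xml", """<?xml version="1.0"?>
<container version="1.0" xmlns="urn:oasis:names:tc:opendocument:xmlns:container">
  <rootfiles>
    <rootfile full-path="OEBPS/content.opf" media-type="application/oebps-package+xml"/>
  </rootfiles>
</container>""")
        # The OPF package document: metadata, manifest, and reading order (spine).
        z.writestr("OEBPS/content.opf", f"""<?xml version="1.0"?>
<package xmlns="http://www.idpf.org/2007/opf" unique-identifier="id" version="2.0">
  <metadata xmlns:dc="http://purl.org/dc/elements/1.1/">
    <dc:title>{title}</dc:title>
    <dc:identifier id="id">example-0001</dc:identifier>
    <dc:language>en</dc:language>
  </metadata>
  <manifest>
    <item id="ch1" href="chapter1.xhtml" media-type="application/xhtml+xml"/>
  </manifest>
  <spine><itemref idref="ch1"/></spine>
</package>""")
        # The actual content, as XHTML.
        z.writestr("OEBPS/chapter1.xhtml", f"""<?xml version="1.0"?>
<html xmlns="http://www.w3.org/1999/xhtml"><head><title>{title}</title></head>
<body>{xhtml_body}</body></html>""")

make_minimal_epub("hello.epub", "Hello", "<p>Hello, .epub world.</p>")
```

The point is that this packaging is entirely mechanical - which is exactly why, once tools like Adobe's automate it, creating a .epub can become as routine as saving a .pdf.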
What surprises me, though, is that ebooks still seem to be tethered, in general discourse, to the print book. Ebook = electronic replica of print book. This is the baseline, surely, not the endpoint. We saw, with magazines and newspapers, that when content goes online it first replicates print, then diverges from print, and soon leaves print behind (or makes it redundant). At what point will publishing models emerge, on a commercial scale, that take advantage of content divergence for ebooks in the same way that guardian.co.uk (for example) has done for newspapers? What or who will lead that divergence - authors writing in a new way for online consumption? Or publishers structuring content differently for new distribution channels and formats? Bob Stein's concept of the networked book speaks to this issue - where the content and activity generated before and after the book is published as an object come to be ingested and engaged with as much as the printed codex itself. For magazines and newspapers, the drivers for divergence have been immediacy, personalisation and localisation. What are the drivers for long-form content divergence online? Not sure yet.
With real energy and aplomb, Cory Doctorow also spoke to the theme of product. Reinforcing his message of 'DRM is no good, please don't use it' (Doctorow's Law: anytime someone puts a lock on something you own, and doesn’t give you the key, they’re not doing it for your benefit), Cory also highlighted the issue, for publishers, of choice. He exhorted delegates to ensure that they chose, and not their retailers, whether or not to apply DRM. The standard User License Agreement, he argued, should be: Don't break copyright law. (This thinking could be applied to so many things. London Underground: Stick to the schedule. Global bankers: Don't take absurd risks. Professional tennis: Just hit the ball.) Again, this is about product: make it attractive, make it interesting, make it easy to get to and use.
Jon Orwant, from Google Book Search, stated at TOC that 'the ultimate goal of Google Book Search is to convert images to “original intent” XML'. He explained the post-processing Google runs to continuously improve the quality of the scanned books and to convert images into structured content. Retro-injecting structure accurately is no mean feat, but once it's done, Google will be able to transform the books into a variety of formats. The content becomes mutable and transportable in a sense it isn't yet, even though it is scanned, online and searchable. Orwant also presented three case studies - McGraw-Hill, OUP, Springer - that demonstrated the benefits publishers can gain from having their books in GBS.
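To make 'retro-injecting structure' a little more concrete: imagine OCR has reduced a page to a flat stream of text lines, and you want to recover which lines are headings and which are body text. This toy heuristic is entirely my own illustration (nothing to do with Google's actual pipeline, which is far more sophisticated); it simply tags short all-caps or 'CHAPTER N' lines as headings and wraps everything else as paragraphs:

```python
import re

def infer_structure(ocr_lines):
    """Toy heuristic: mark short ALL-CAPS or 'CHAPTER N' lines as
    headings, and wrap every other non-blank line as a paragraph."""
    heading = re.compile(r"^(CHAPTER\s+\w+|[A-Z][A-Z .]{3,40})$")
    xml = ["<book>"]
    for line in ocr_lines:
        line = line.strip()
        if not line:
            continue  # skip blank lines left over from page layout
        if heading.match(line):
            xml.append(f"  <heading>{line}</heading>")
        else:
            xml.append(f"  <para>{line}</para>")
    xml.append("</book>")
    return "\n".join(xml)

pages = ["CHAPTER ONE", "It was a bright cold day in April.", ""]
print(infer_structure(pages))
```

Even this crude version shows why the problem is hard: real scans mix running heads, footnotes, hyphenation and multi-column layouts, and the heuristics have to be right across millions of books before the XML can be trusted to regenerate other formats.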
Highlighting the theme of discovery (to my mind), Tim O'Reilly interjected at the end of these case studies to make the point that O'Reilly used to own the top links to their own books in Google search results, but has now lost those links to GBS. Orwant, somewhat simplistically, responded that O'Reilly needed to improve their website to regain the top-ranked link per title, as this spot was determined by Google's search algorithms. This was not a convincing response, and it dodged the issue, which I understood to be that the scale and in-house-ness of GBS could seriously inhibit a publisher's ability to represent their own products online at the consumer's most common point of entry: Google search results. There are many compelling reasons for publishers to own the top search result link, the most obvious being: offer unique additional content around the title, start a conversation with the reader, control the brand.
In his own keynote, Tim O'Reilly spoke across all three themes: product, discovery, behaviour. These points were, in summary (with apologies to Tim for some of my paraphrasing):
- We don't know what's going to happen next because the internet is building into a global intelligence network the likes of which we've never seen before. Build partnerships with people who live and breathe the newest technology, and then you have a chance to do something new.
- Mobile is everywhere. Soon, one of the main pathways to discovery will be mobile, and if you as publisher are not engaging with mobile, then your content will be invisible.
- Do more for authors online. With the array of low cost tools available now, it's important to re-figure the relationship between publisher and author online.
- Curation still matters (i.e. publishers still matter). Publishers confer status on authors, and in time authors confer status back on publishers. People in the head confer status back onto people in the tail. (See Clay Shirky's post on this topic.)
- People pay for access to information. So make content available wherever your readers want to find it, in whatever format they want to consume it.
- Participation drives revenue. Tim shared some data on the Rough Cuts products, showing that Rough Cuts titles on their own sold about as many copies as finished, non-rough titles, but that finished titles that had also been published as Rough Cuts sold 2.5 times as many.
And so we get to 'Googly books' and 'smart content' - my two favourite phrases for 2009. Jeff Jarvis took us on a whistlestop tour of his new book and all the various forms in which it was being released (mostly for money), including the .ppt version and the vbook (sorry Jeff, but, ugh). Jeff's point was that the more you put your content out there, and the more books become process rather than product (a la Google's approach to releasing and shaping software), the better for everyone, the books included. We all need a bit of SEO, said Jeff, a bit of 'Google juice', so that our content can be *discovered* (sorry, just had to emphasise that). There's a new economy and a new ethic out there, and publishers and authors need to adapt to it.
And this is where the conversation moved firmly into the theme of behaviour - of the books, the content, the readers, the publishers... everyone. Or rather, I should say, the conversation returned to the theme of behaviour, as one of the opening keynotes, from Peter Brantley, was a fascinating explication of the ways in which books have become networked social acts, part of the pattern of our global analogue culture being 'uplifted' into a digital one. Nick Bilton from NYT R&D Labs set the conference room alight with his super-cool run-through of the history of human engagement with content, and of the main characteristics of how we deal with content now. We are all inundated with content, said Bilton, and we've developed a swarm intelligence online to navigate that content flow. Smart content adapts to that flow and only puts stuff in front of you that satisfies your immediate interest. Paper is just a device - hybrid forms of narrative can engage readers in the new pathways of content, be it on mobile, in print, or online.
That's a quick run-through of what has been in my head over the past week - now we need to act on some of these points. I'm also still thinking about how to apply some of the insights from Nick Bilton... more to come perhaps.