Thanks to everyone for the comments, and apologies for igniting the whole debate again. I thought I would collect all my responses together and put them out as a post. - My position: personally I think DRM is a pain and try to avoid it as much as possible. Professionally I recognise that, as a publisher, we are obligated in some instances to use it. Before everyone beats us up too much, can I just point out that there aren't many publishers actually selling non-DRM ebooks and actively promoting them, or even embarking on a discussion like this. What I am saying is that I have a lot of sympathy and affinity with the anti-DRM position and strongly support open licences, so I am not droning out some unthinking policy.
- Andrew Savikas makes a good point when he says a pirated copy is not a lost sale. In a related point, Cory Doctorow argues that the viral possibilities of a non-DRM file mean that it can have more blockbuster potential. I agree with both of these points. However, just because a pirated copy does not necessarily equate to a lost sale, that does not mean it never does.
Margins are tight. On big titles, agents will ensure advances are calibrated to the max. That means publishers have to hit very high sales targets to get any kind of return. A 5% loss of sales across big titles over a few years would greatly damage publishers' ability to publish big books. So even if many new readers are being added, even a relatively low number of book buyers lost could cause a lot of damage. The thing publishers should learn is not to hit the panic button at the first whiff of piracy but to have a more considered response that doesn't alienate everyone involved.
As for Cory's point, I think this is true for some works but not all works (as JEB points out). So I agree that having no DRM works exceptionally well for Little Brother (a brilliant book), as indeed it did for NIN and Radiohead. There have now been numerous instances of publishers giving a non-DRM file away, and this leading to a boost in print sales. My argument isn't with the effectiveness of this, but rather that a) there are some authors for whom this will not work as their audience isn't right, and b) this isn't really a solid foundation from which to build a range of digital products. As a model I do think this will become more and more prevalent (all good), but also that it has the potential in the long term to undermine that which it currently supports.
- Regarding paper and DRM: I think this is a good metaphor for the expectations we have when we own a book. We expect a degree of control, but we don't expect to be able to do absolutely anything. Paper/digital isn't what it is about; rather, I am saying that there should be some consistency across how we approach a book, and I freely acknowledge that present DRM is not doing this.
- A common argument here is that DRM doesn't work as it doesn't stop piracy, therefore what's the point, let's get rid of it. This is like saying the police neither deter nor solve all crimes, so what's the point, let's get rid of them. Just because something is not absolutely effective does not mean it is absolutely ineffective.
- Sean Cranbury calls my final comment "disingenuous". All I can say is that it was not written disingenuously at all. I am trying to strike a balance that favours readers compared to what exists now! This is about saying, well given that we are not in a position to scrap DRM for all our ebooks (technically impossible under existing arrangements) what can we be doing to improve things?
- To the many points about how DRM can make life difficult for ordinary readers, I agree and always have agreed. My reasoning for DRM not being 100% bad is that it can help mitigate risk. However we can all, I think, acknowledge that, bluntly, a lot of existing solutions suck. Cory Doctorow made many interesting points about the difficulties in creating a more humane DRM system to which I don't have an immediate answer. What I can say is that these should not stop us from trying even if it is hard, and I would be happy to get involved with standards bodies to fight for a better consumer experience re DRM. It might be challenging but we should give it a shot anyway - better to have tried and failed etc.
- Gary Gibson (a Pan Mac SF author whose non-DRM ebooks are available on the website) made a brilliant suggestion that some kind of digital escrow account is what we need. I could see this being run by an independent body like the IDPF or the BISG in concert with publishers. Such a project might offer a workaround to those objections that focus on the concentration of DRM in the hands of a few big players and the added cost burden DRM places on digital products by being a non-profit. This would also get round many of the difficult scenarios presented by Cory Doctorow. David Smith's point about a rental model being a good way of getting round this is exactly what I mean when I say we need to be open to new business models. A subscription/library style service could work for everyone.
I realise that saying something positive for DRM is not going to win me any friends (in public at least). Most of the objections to DRM are fair and publishing will no doubt follow a path similar to the record industry. Equally though blanket condemnation of DRM without an acknowledgment that it can play a role in maintaining the lifeblood of content industries is telling only one part of the story. I'm not saying the current IP framework is perfect, just that elements of it are important.
Given that the weight of industry opinion demands DRM for our files, isn't it better to try and make sure that DRM is as inclusive, flexible and consumer oriented as possible, instead of just going with the flow?
Many of the questions here seem to be about what the concept of "ownership" is in the digital age, and these are not fully resolved yet. I certainly don't pretend to have all the answers!
... the dizzying range of easily accessible material on the internet conspires with a lack of editorial guidance to make web reading a disjointed experience that works against the sustained concentration required for serious reading.
There is an interesting piece in the London Review of Books from Colin Robinson about the impact of global economic woes on publishing. As the byline has it, "Colin Robinson until recently worked for a large publisher in New York." He outlines the pressures facing the principal cast of the publishing ecosystem (to mix my metaphors), including writers, editors, producers, retailers, and readers.
Robinson's comments on the effect of electronic communication and the internet on the life of books could be judged as accurate or out of step, depending on your perspective. (I'm not going to go there on the "For all the claims of their optical friendliness and handiness, e-books still strain the eyes" remark.) Yes, there is a lot of rubbish content on the internet, and yes society seems to be moving towards a sort of chronic individualism that exhibits itself online. But is that dreadful for publishing and reading?
Robinson points to a possible solution - that the editor's powers of curation and provision of status to some writing over other writing will migrate from paper to internet. "There is opportunity as well as challenge in this model. The roles of editor and publicist, people who can guide the potential reader through the cacophony of background noise to words they’ll want to read, will become ever more important."
Perhaps what Robinson has a sense of losing is, to draw an analogy, a hansom cab for a yellow cab. That would be the other perspective.
"This is not to say that the book is doomed. But publishers will surely have to change the way they do business", writes Robinson, and I'm sure we all agree.
I've had a full week now to digest my TOC experience in New York and hopefully I will manage to capture in this post the many points of real value that I gleaned from it. For those of you I didn't meet on Twitter, it was actually me behind the @thedigitalist tweets, not Sara. Like many delegates this year, I had the slightly schizophrenic experience of following Sara's keynote on Twitter whilst also watching her deliver it. In a funny way, the main 'take home' of TOC 2009 could be the usage explosion on Twitter. At one point, there was a question from the floor that had been asked via Twitter - this even surprised a roomful of webby technotypes and internet futurologists!
The themes running through the conference this year, as I interpret them, were: product, discovery, behaviour.
The product under discussion is still more ebooks than books. Ebooks are top of mind for digital publishers, although I'm not sure that's the way we'd all like it to be as there are lots of other things to be getting on with too. Twitter, for example. The encouraging news, to my mind, was that Bill McCoy from Adobe announced increased support for the export to .epub function in the Adobe suite of creative content tools, adding momentum to a standard format approach. And also, importantly, adding the creation of .epub documents to the general skill set of everyone, all users, in the same way that people have learned to create .pdf. The ability to, and tendency to, create documents in .epub will feed adoption of .epub optimised devices (i.e. readers). And that general behavioural change will benefit publishers offering content in .epub.
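Since .epub comes up so often here, it may help to see how simple the container actually is. The sketch below builds a minimal (deliberately not spec-complete) .epub in Python; the titles, identifiers and file contents are placeholders invented for illustration, but the internal layout follows the EPUB/OCF container rules.

```python
# A minimal .epub is a zip archive with a fixed layout: an uncompressed
# "mimetype" entry first, a META-INF/container.xml pointing at the
# package file, and the content itself. All book content below is a
# placeholder for illustration.
import zipfile

CONTAINER_XML = """<?xml version="1.0"?>
<container version="1.0" xmlns="urn:oasis:names:tc:opendocument:xmlns:container">
  <rootfiles>
    <rootfile full-path="OEBPS/content.opf" media-type="application/oebps-package+xml"/>
  </rootfiles>
</container>"""

CONTENT_OPF = """<?xml version="1.0"?>
<package xmlns="http://www.idpf.org/2007/opf" version="2.0" unique-identifier="id">
  <metadata xmlns:dc="http://purl.org/dc/elements/1.1/">
    <dc:title>A Placeholder Title</dc:title>
    <dc:identifier id="id">placeholder-0001</dc:identifier>
    <dc:language>en</dc:language>
  </metadata>
  <manifest>
    <item id="ch1" href="chapter1.xhtml" media-type="application/xhtml+xml"/>
  </manifest>
  <spine><itemref idref="ch1"/></spine>
</package>"""

CHAPTER = """<html xmlns="http://www.w3.org/1999/xhtml">
<body><h1>Chapter One</h1><p>Placeholder text.</p></body></html>"""

def make_minimal_epub(path):
    with zipfile.ZipFile(path, "w", zipfile.ZIP_DEFLATED) as z:
        # The mimetype entry must come first and must be stored uncompressed
        z.writestr("mimetype", "application/epub+zip",
                   compress_type=zipfile.ZIP_STORED)
        z.writestr("META-INF/container.xml", CONTAINER_XML)
        z.writestr("OEBPS/content.opf", CONTENT_OPF)
        z.writestr("OEBPS/chapter1.xhtml", CHAPTER)
```

Calling `make_minimal_epub("minimal.epub")` produces a file most reading systems will at least open; a production workflow would also add a table of contents and run the result through a validator such as epubcheck.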
What surprises me though, is that ebooks seem to still be tethered, in general discourse, to the print book. Ebook = electronic replica of print book. This is the baseline, surely, not the endpoint. We saw, with magazines and newspapers, that when content goes online it first replicates print and then diverges from print and soon leaves print behind (or makes it redundant). At what point will publishing models emerge, on a commercial scale, that take advantage of content divergence for ebooks in the same way that guardian.co.uk (for example) has done for newspapers? What or who will lead that divergence - authors writing in a new way for online consumption? or publishers structuring content differently for new distribution channels and formats? Bob Stein's concept of the networked book speaks to this issue - where the content and activity generated before and after the book is published as an object come to be ingested and engaged with as much as the printed codex itself. For magazines and newspapers, the drivers for divergence have been immediacy, personalisation and localisation. What are the drivers for long form content divergence online? Not sure yet.
With real energy and aplomb, Cory Doctorow also spoke to the theme of product. Reinforcing his message of 'DRM is no good, please don't use it' (Doctorow's Law: anytime someone puts a lock on something you own, and doesn’t give you the key, they’re not doing it for your benefit), Cory also highlighted the issue, for publishers, of choice. He exhorted delegates to ensure that they chose, and not their retailers, whether or not to apply DRM. The standard User License Agreement, he argued, should be: Don't break copyright law. (This thinking could be applied to so many things. London Underground: Stick to the schedule. Global bankers: Don't take absurd risks. Professional tennis: Just hit the ball.) Again, this is about product: make it attractive, make it interesting, make it easy to get to and use.
Jon Orwant, from Google Book Search, stated at TOC that 'the ultimate goal of Google Book Search is to convert images to “original intent” XML'. He explained the post-processing Google runs to continuously improve the quality of the scanned books, and to convert images to structured content. Retro-injecting structure accurately is no mean feat but when it's done, Google will be able to transform the books into a variety of formats. The content becomes mutable and transportable, in a sense it isn't yet, even though it is scanned, online and searchable. Orwant also presented three case studies - McGraw Hill, OUP, Springer - that demonstrated the benefits publishers can gain from having their books in GBS.
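To make the "mutable and transportable" point concrete: once a scanned book exists as structured XML rather than page images, converting it into other formats becomes a mechanical tree walk. The element names below (`book`, `chapter`, `title`, `para`) are invented for illustration; Google's actual schema is not public.

```python
# Sketch: transforming structured book XML into HTML with a simple
# tree walk. With images alone, none of this is possible; with
# structure, any number of output formats can be generated.
import xml.etree.ElementTree as ET

BOOK_XML = """<book>
  <chapter>
    <title>Chapter One</title>
    <para>It was a dark and stormy night.</para>
  </chapter>
</book>"""

def to_html(xml_text):
    root = ET.fromstring(xml_text)
    parts = []
    for chapter in root.findall("chapter"):
        # Each structural element maps directly onto a target-format element
        parts.append("<h1>%s</h1>" % chapter.findtext("title"))
        for para in chapter.findall("para"):
            parts.append("<p>%s</p>" % para.text)
    return "\n".join(parts)

print(to_html(BOOK_XML))
# <h1>Chapter One</h1>
# <p>It was a dark and stormy night.</p>
```

The same walk could just as easily emit .epub XHTML, plain text for accessibility, or a mobile-optimised fragment, which is exactly why "original intent" structure is worth the post-processing effort.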
Highlighting the theme of discovery (to my mind), Tim O'Reilly interjected, at the end of these case studies, and made the point that O'Reilly used to own the top links to their own books in Google search results, but have now lost those links to GBS. Orwant, somewhat simplistically, responded that O'Reilly needed to improve their website to regain the top ranked link per title, as this spot was determined by Google's search algorithms. This was not a convincing response, and dodged the issue, which I understood to be that the scale and in-house-ness of GBS could seriously inhibit the ability of the publisher to represent their own products online at the most common point of entry by the consumer, Google search results. There are many compelling reasons for publishers to own the top search result link, the most obvious being: offer unique additional content around the title, start a conversation with the reader, control the brand.
In his own keynote, Tim O'Reilly spoke across all three themes: product, discovery, behaviour. These points were, in summary (with apologies to Tim for some of my paraphrasing):
- We don't know what's going to happen next because the internet is building into a global intelligence network the likes of which we've never seen before. Build partnerships with people who live and breathe the newest technology, and then you have a chance to do something new.
- Mobile is everywhere. Soon, one of the main pathways to discovery will be mobile, and if you as publisher are not engaging with mobile, then your content will be invisible.
- Do more for authors online. With the array of low cost tools available now, it's important to re-figure the relationship between publisher and author online.
- Curation still matters (i.e. publishers still matter). Publishers confer status on authors, and in time authors confer status back on publishers. People in the head confer status back onto people in the tail. (See Clay Shirky's post on this topic.)
- People pay for access to information. So make content available wherever your readers want to find it, in whatever format they want to consume it.
- Participation drives revenue. Tim shared some data on the Rough Cuts products, showing that Rough Cuts titles on their own sold about as many copies as finished, non-rough titles, but that finished titles that had also been published as rough cuts sold 2.5 times as many.
And so we get to 'Googly books' and 'smart content' - my two favourite phrases for 2009. Jeff Jarvis took us on a whistlestop tour of his new book and all the various forms in which it was being released (mostly for money), including the .ppt version and the vbook (sorry Jeff, but, ugh). Jeff's point was that the more you put your content out there, and the more books become more process than product (a la Google's approach to releasing and shaping software), the better for everyone and the books. We all need a bit of SEO, said Jeff, a bit of 'Google juice', so that our content can be *discovered* (sorry, just had to emphasise that). There's a new economy and a new ethic out there, and publishers and authors need to adapt to it.
And this is where the conversation moved firmly into the theme of behaviour - of the books, the content, the readers, the publishers... everyone. Or rather, I should say, the conversation returned to the theme of behaviour, as one of the opening keynotes from Peter Brantley was a fascinating explication of the ways in which books have become networked social acts, and are a part of the pattern of our global analogue culture being 'uplifted' into a digital one. Nick Bilton from NYT R&D Labs set the conference room alight with his super-cool run-through of the history of human engagement with content, and the main characteristics of how we deal with content now. We are all inundated with content, said Bilton, and we've developed a swarm intelligence online to navigate that content flow. Smart content adapts to that flow and only puts stuff in front of you that satisfies the immediate interest. Paper is just a device - hybrid forms of narrative can engage readers in the new pathways of content, be it on mobile, in print, or online.
That's a quick run-through of what has been in my head over the past week - now we need to act on some of these points. I'm also still thinking about how to apply some of the insights from Nick Bilton... more to come perhaps.
Whole business empires are now founded upon that most fleeting of things, at once profound and perfunctory, the human gaze. In buzzword bingo "attention economy" is a winning ticket. In this model of super-abundant information, invisibility is a function of excess, and simply being noticed becomes the prerequisite for success, whether this is measured in monetary terms or by other criteria. This is hardly a new phenomenon. Go into any bookstore and what you notice is hardly an absence of choice, title vying against stylishly covered title for our hungry eyes. Indeed Reuters claims that the UK has now overtaken the US as the country with the most books published per annum, with over 206,000 books published in 2005 alone.
Even as our frazzled attention spans are catered for by five-second ad slots and continuous partial attention becomes our default, the deluge of books expands. Media coverage of literature contracts. The result is that the publishing space is crowded, an attention-addicted junkie with not enough eyeballs to satisfy its craving.
And then along comes the web.
Here is the issue: despite the attention competition in the current bibliographic climate, once you've walked into a bookshop your attention is focused on books. OK, some bookshops sell CDs and DVDs, but by and large you are almost exclusively surrounded by print objects in that environment.
In the AIDA (attention, interest, desire, action) model once you are in a bookshop the action is likely to be a book purchase.
On the web this is not the case, as there is no singular destination where books are the sole option (other websites are only a click away).
Take Amazon. You might search for a book, but this doesn't mean that only books will come up. This might not seem a problem, but if attention is a currency it has, in book terms, been devalued in that second's glance.
Google is supposed to create greater attention efficiencies: PageRank is designed to send us where our attention most wants to go. However, let's say we are interested in Doctor Who on a particular day. Say we are standing outside a bookshop. We would likely go in and devote our "Doctor Who attention" to a book in that shop. In that time slice the Doctor Who book has our attention, and possibly the rest of the AIDA sequence too.
In the week that the BBC has upgraded its (hugely popular) iPlayer, Google is more likely to send us there than to the Amazon book page. What's more, even if we were searching for something much more specifically bookish, we are only ever a click away from something beyond the orbit of books.
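For anyone curious how PageRank's attention routing works underneath, here is a toy sketch: rank flows along links, damped by a factor, until it stabilises, so pages that attract links attract attention. The graph, page names and damping value below are illustrative only; Google's production ranking uses many additional signals.

```python
# Toy PageRank by power iteration. "links" maps each page to the
# pages it links to; rank is redistributed along links each round,
# with a damping factor modelling the reader who wanders off.
def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:  # dangling page: spread its rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                for target in outlinks:
                    new_rank[target] += damping * rank[page] / len(outlinks)
        rank = new_rank
    return rank

# A toy web: the blog and the bookshop both link to the iPlayer page,
# which links back only to the bookshop.
toy = {
    "iplayer": ["bookshop"],
    "bookshop": ["iplayer"],
    "blog": ["iplayer"],
}
ranks = pagerank(toy)
# The page with the most inbound links ("iplayer") ends up ranked highest.
```

The point for publishers is the feedback loop: attention follows links, and links follow attention, so a book page that nobody links to is effectively invisible however good the content.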
My point, then, is that the web might exacerbate issues surrounding the value of people's attention for publishers by diluting, even eradicating, the singular focus possible in both physical spaces and traditional formats (literary magazines etc).
When time is a currency, a publisher's main competitors are Playstations, House box sets and Twitter. It's not a new point but one that bears repeating.
The answer is clearly not a retreat from the web, a manoeuvre that would only serve to completely remove the genuine positive opportunities presented by the web in both delivering content and connecting with new and existing readers. In truth there is no easy answer. Yes, it means everyone has to learn how to eke out every last drop of value from the web. It's still a challenge. But it's a challenge we should relish. It means that publishers really have to ensure their stuff is worth its weight in attention gold.
My guess is that this is what publishers, writers, readers (in fact, anyone involved with books and texts at any level) are into anyway.
It all started with a trailer running before showings of last summer's blockbuster Transformers. Handheld, seemingly amateur footage of a party in downtown Manhattan. Lights go out and suddenly a huge roaring resounds across New York. Before long explosions are flattening the other side of the island and the Statue of Liberty's head is rolling down the street. The trailer finished with "From Producer J.J. Abrams" and "In Theatres 1-18-08". That was all. At the time it (Cloverfield, as we now know it) caused a sensation. Because Abrams produced Lost, people knew that his signature was an invitation to find out more, that this trailer held out the promise of a rich, involved information mine that would gradually reveal some answers to the many questions prompted even as more were posed. Media interest in the trailer was piqued, and soon USA Today was arguing that the film was based on the work of H.P. Lovecraft while the Guardian claimed a definite tie-in with Lost (the Observer followed up the story here).
Viewers were first directed to 1-18-08.com, a website that consisted of a photomontage manipulated by the user. Soon other sites were up and running, including those for Slusho!, a fictional soft drink familiar to fans of the Abrams-produced series Alias, and Tagruato, a huge mining conglomerate. Both sites were convincingly done and soon a metanarrative was building that encompassed speculation of all kinds about the as yet un-named film and a tantalising trail of clues as to its plot.
As well as personal pages of characters on myspace and this site (log in jllovesth) the viral evolved an internal logic, so before long an anti-Tagruato environmentalist site for a group called TIDO was found. On the site TIDO deny responsibility for destroying a Tagruato oil rig, a denial which was followed by the release of this trailer showing a rig collapsing into the sea. All of this gives only a snapshot of the overall complexity: as this analysis demonstrates most of the viral work, often taking the form of news reports, was done in languages other than English. Gradually a picture emerged from the dense network of clues and puzzles that pointed to the major events of the film, giving the plot background and substance whilst garnering near priceless acres of newsprint and becoming one of the dominant memes of 2007.
Four days before the film's US release, and almost everything about the film has been leaked. However, in the process of publicising the movie, its content has become part of a narrative that the film both describes and yet which also supersedes the film.
However, the team working with Abrams wasn't done yet. As described in this Read/Write Web article, Lost fans in the Midwest started to see billboard ads for Oceanic Airlines with a URL that directed them here. As fans of the show will know, Oceanic Flight 815 is the plane that crashes, prompting the subsequent drama that is the show. Anyway, the site was a provocation and a new ARG, Find 815, was launched. The iconography used in the game is consistent with that in the programme; new characters have been introduced in the game that harmonise with those on TV, and key mysteries from the series are inextricably worked into the ARG. Find 815, which is currently in full flow, follows Sam, an Oceanic engineer whose girlfriend was aboard Flight 815, as he attempts to find out what happened to the craft when the official investigation is abandoned.
A new model of storytelling is on offer here, a model that just so happens to engage large numbers of people on the internet and generate press interest. As Cloverfield director Matt Reeves has said, "All the stories kind of bounce off one another and inform each other[...]To us, it's just another exciting aspect of the storytelling." Publishers and writers are in the business of storytelling and, given the ravages of the writers' strike on moving-image media, are presented with an historic opportunity to explore this new, metafictional, ludic, reality-bending narratology. Artistically and commercially, even philosophically, a new frontier has been opened in this most fundamental of human attributes.
A series like Harry Potter is already transmedia in that it exists in virtually every medium on earth, and then some. What it doesn't have is a sense of unity between the different media: the film doesn't complement the book, it is the book; the game doesn't complement the film, it is the film. There is one story set down in the books that is then the template for everything else. There is no Harry Potter myspace page other than what is essentially an advert for the film, for example.
Imagine, say, the new Dan Brown as intimately conceived with a viral ARG. Brown's novels have a very definite sense of place, and this could be used to fascinating effect (Google Maps mash-ups etc). Dan Brown tours already take place in London, Paris and Rome; what if they became an integral part of a wider mystery? The gap between the story and the world would vanish as people become part of the story. It's post-structuralist literary theory, literalised.
There are some indications that writers are embracing this model. Boing Boing reported on Shadow Unit, which is creating "a fan site for a show that never existed". The SF writer Elizabeth Bear, mastermind of the project, gives a rough outline: "Over the next couple of months, the site will be updated on a weekly or biweekly basis with new information, vignettes, character sketches, character bios, a community message board, and other exciting things". Moreover, "there will be a series of novellas and novellettes, and one complete novel". Shadow Unit works through a mixture of free and subscription content and breaks ground in terms of offering publishers and writers an example of how to radically expand their storytelling.
Creating fictional myspace profiles is not uncommon for books now and last year the publisher Canongate gestured more fully in this direction on the website for The Raw Shark Texts by Steven Hall.
In his famous discussion of simulacra and hyperreality, the philosopher Jean Baudrillard cites Disneyland as the ultimate simulacrum, a copy for which there is no original. Building Disneyland isn't exactly easy and is probably beyond the marketing budgets of most publishers. However, as Jeff Gomez points out in a recent post on Print is Dead, writers today have, via the internet, a means of getting involved with their texts beyond writing them. Creating a fictional myspace profile is doable and adds an intriguing twist to a reader's relationship with a character. Whilst the resources required for an ARG on the scale of the Cloverfield virals are vast, as are those for full-on ARGs like Perplex City, conceptualising, planning and executing interesting metafictional elements and the creation of a web-based narrative are within reach for many publishing projects.
Doing this requires commitment from writers. What works so well about the Lost ARG is how perfectly it coheres with the vision presented in the series, and this sense of unification is necessary for building a transmedia story. Otherwise it's just stories.
In summary then, I think we are seeing a new breed of story emerging that blends viral marketing campaigns with alternate reality games to produce a narrative that forms part of the story told in traditional media formats. In doing so many old distinctions, between say fiction and reality or marketing and content, are being challenged or breaking down. Recent examples spearheaded by the producer J.J. Abrams have demonstrated the complexity, artistry and publicity boosting potential of the model and publishing companies are making steps in this direction. Our understanding and definition of what stories are and how we tell them are shifting in new and exciting ways. The terms of this post's title are blending into one, a concept we have, as yet, no other word for than the neverending constant that is "story".
There are two videos that are essential viewing just at the moment. They may not change your life: but they really might. Anyway, try here for a video called Shift Happens. For the other video, type Web 2.0 into YouTube and watch the top choice. Both videos work much better with the sound on. Neither is obviously related to publishing. They practically had me weeping with awe though, which either means I am a complete prat (possible...) or they are simply awesome. You decide. Last night I did something rather geeky: I stayed at home and attended a live video, reading and Q&A with the great William Gibson, in Second Life. No, no, don't worry, my life isn't usually quite so sad. Honestly. So the event started with a film about WG, was followed by a reading from his new novel and concluded with a lengthy Q&A session, despite a slight technical issue unlikely to occur in "meatspace". As I said to an in-world journalist writing a piece for the Second Life News Network, it was kind of like watching TV, listening to the radio, playing a computer game, social networking and attending a lecture all at once. Very strange. Talking of which, the Macmillan building is about to be fully operational on Book Island/Publishing Village.
That Mssv piece on the Death of Publishers became quite a controversy, spreading to the Bookseller blogs section and thence, in a slightly altered form, to The Book Depository. And it's unlikely to die down too quickly: a main feature should be appearing in the Bookseller magazine in a few weeks. There's one thing nobody is saying: that the web will not, one way or another, influence decisions made by publishers. They're just not saying it in lots of different ways.
An entry on Publishing 2.0 [link no longer available] suggests that Twittering is a form of publishing. Maybe, if you really, really stretch what counts as a publication. Really, really. More realistic internet publishing apps are those described on the O'Reilly Radar.