Publishing
Technology is transforming publishing. From the way ideas are generated to the packaging of information to the delivery of products, the industry is in the midst of a sea change. We've always considered O'Reilly as much a technology company as a publisher, a belief that's led us to develop information products such as GNN (the first commercial website), Safari Books Online, and the Tools of Change for Publishing conference. As publishers seek a new equilibrium in our networked world, we aim to be both a catalyst and a chronicler of what has inevitably been called Publishing 2.0.
Recent Posts from TOC
Legally Speaking: The Dead Souls of the Google Booksearch Settlement
by Pamela Samuelson | comments: 54
Guest blogger Pamela Samuelson is the Richard M. Sherman Distinguished
Professor of Law and Information at the University of California, Berkeley, as well as a Director of the Berkeley Center for Law & Technology and an advisor to the Samuelson High Technology Law & Public Policy Clinic at Boalt Hall. She has written and spoken extensively about the challenges that new information technologies pose for traditional legal regimes, especially for intellectual property law.
This piece will appear in the July 2009 issue of Communications of the ACM. Readers may also be interested in the slides from Pam's recent presentation, "Reflections on the Google Book Search Settlement."
Google has scanned the texts of more than seven million books from major university research libraries for its Book Search initiative and processed the digitized copies to index their contents. Google allows users to download the entirety of these books if they are in the public domain (about 1 million of them are), but at this point makes available only “snippets” of relevant texts when the books are still in copyright unless the copyright owner has agreed to allow more to be displayed.
In the fall of 2005, the Authors Guild, which then had about 8000 members, and five publishers sued Google for copyright infringement. Google argued that its scanning, indexing, and snippet-providing was a fair and non-infringing use because it promoted wider public access to books and because Google would take out of the Book Search corpus any digitized books whose rights holders objected to their inclusion. Many copyright professionals expected the Authors Guild v. Google case to be the most important fair use case of the 21st century.
This column argues that the proposed settlement of this lawsuit is a privately negotiated compulsory license primarily designed to monetize millions of orphan works. It will benefit Google and certain authors and publishers, but it is questionable whether the authors of most books in the corpus (the “dead souls” to which the title refers) would agree that the settling authors and publishers will truly represent their interests when setting terms for access to the Book Search corpus.
Orphan Works
An estimated 70 per cent of the books in the Book Search repository are in-copyright, but out of print. Most of them are, for all practical purposes, “orphan works,” that is, works for which it is virtually impossible to locate the appropriate rights holders to ask for permission to digitize them.
A broad consensus exists about the desirability of making orphan works more widely available. Yet, without a safe harbor against possible infringement lawsuits, digitization projects pose significant copyright risks. Congress is considering legislation to lessen the risks of using orphan works, but it has yet to pass.
The proposed Book Search settlement agreement will solve the orphan works problem for books—at least for Google. Under this agreement, which must be approved by a federal court judge to become final, Google would get, among other things, a license to display up to 20 per cent of the contents of in-copyright out-of-print books, to run ads alongside these displays, and to sell access to the full texts of these books to institutional subscribers and to individual purchasers.
The Book Rights Registry
Approval of this settlement would establish a new collecting society, the Book Rights Registry (BRR), initially funded by Google with $34.5 million. The BRR will be responsible for allocating $45 million in settlement funds that Google is providing to compensate copyright owners for past uses of their books.
More important is Google’s commitment to pay the BRR 63 per cent of the revenues it makes from Book Search that are subject to sharing provisions. The revenue streams will come from ads appearing next to displays of in-copyright books in response to user queries and from individual purchases of and institutional subscriptions to some or all of the books in the corpus. Google and the BRR may also develop new business models over time that will be subject to similar sharing.
One of the main jobs of the BRR will be to distribute the settlement revenues. The money, less the BRR's costs, will go to authors and publishers who have registered their copyright claims with the BRR. Although the settlement agreement extends only to books published prior to January 5, 2009, the BRR is expected to attract authors and publishers of later-published books to participate in the revenue-sharing arrangement that Google has negotiated with the BRR.
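To make the arithmetic concrete, here is a minimal sketch of the split described above. The 63 per cent figure is the one the settlement specifies; the admin_fee deducted for the BRR's costs is an invented placeholder, since the Registry's actual cost structure isn't fixed by the agreement.

BRR_SHARE = 0.63  # Google pays the BRR 63 per cent of covered Book Search revenue

def split_revenue(gross, admin_fee=0.10):
    """Split gross covered revenue between Google and the BRR.

    admin_fee is a hypothetical deduction for the BRR's operating costs;
    the settlement does not pin this number down.
    """
    to_brr = gross * BRR_SHARE
    to_google = gross - to_brr
    to_rights_holders = to_brr * (1 - admin_fee)
    return to_google, to_brr, to_rights_holders

google_cut, brr_cut, rights_cut = split_revenue(1_000_000)
print(f"Google: ${google_cut:,.0f}")          # $370,000
print(f"BRR (gross): ${brr_cut:,.0f}")        # $630,000
print(f"Rights holders: ${rights_cut:,.0f}")  # $567,000 after the assumed fee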
tags: copyright, google, policy, publishing
What Publishers Need to Learn from Software Developers
by Tim O'Reilly | comments: 29
There was a great exchange on the O'Reilly editors' backchannel the other day, so illuminating that I thought I should share it with the rest of you. We've been discussing the fast-track development we're using to produce The Twitter Book. (We're basically authoring the book as a presentation, after I realized how much more quickly I can put together a slide deck to make my points than I can a normal book. Twitter is also such a fast-moving topic that we need to be able to update the book every time we reprint it.)
Sarah Milstein wrote:
Apropos of everything, the NYT on publishers' speeding up the production process, especially with eBooks: “If this book had gone through the normal publishing procedures,” Mr. Kiyosaki said, “it wouldn’t be worth writing.”

Andrew Savikas replied:
The more I think about it, the more obvious it's becoming to me that the next generation of authoring/production tools will have much more in common with today's software development tools than with today's word processors.

Software developers spend enormous amounts of time creatively writing with text -- editing, revising, and refining multiple interconnected textual works, often in a highly distributed way with many collaborators. Few writers or editors spend as much time with text as developers do, and it only makes sense to apply the lessons developers have learned about managing collaborative writing and editing projects at scale.
'Nuff said. I await said next generation of authoring/production tools.
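In the meantime, here's a small taste of what borrowing from developers' tooling can look like: a sketch that treats two chapter drafts the way a developer treats two revisions of a source file, reviewing the changes as a unified diff. It uses only Python's standard library; the file names and draft text are invented.

import difflib

old_draft = """Twitter is a messaging service.
Posts are limited to 140 characters.""".splitlines()

new_draft = """Twitter is a real-time messaging service.
Posts are limited to 140 characters.
Replies start with an @username.""".splitlines()

# Show exactly what changed between revisions, the way a code review would.
for line in difflib.unified_diff(old_draft, new_draft,
                                 fromfile="ch01.txt (rev 1)",
                                 tofile="ch01.txt (rev 2)",
                                 lineterm=""):
    print(line)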
tags: publishing, tools, twitter
Managing monopolies and dominance in the Net age
by Mike Shatzkin | comments: 11
Guest blogger Mike Shatzkin is Founder and CEO of The Idea Logical Company, where he has focused on supply chain and digital change issues since 1979. Mike has spoken at and organized publishing industry conferences all over the world. He recently launched The Shatzkin Files blog. One of Mike's several books, The Ballplayers, forms the core of BaseballLibrary.com.
Our thinking about "monopoly" may need to be recast in the Internet age. This is a complicated question to consider and we need to start gathering some good minds around it.
Network effects were noticed before there was an Internet. Both the phone company and the electric company were networks, and it became clear about a century ago that everything worked better for everybody if they WERE monopolies and everybody was hooked up to the same network, not competing ones. So phones and electricity became regulated monopolies, with prices and other behavior, including mandated service levels, controlled. Whether because of a changing ethos or because things became more complicated, or both, "competition" has been introduced in both spheres over the past two or three decades. With debatable results.
Amazon's dominance -- which is not a monopoly but which certainly looks like unassailable hegemony in the world of online bookselling -- can be largely attributed to brilliant execution and maintaining a tight focus on serving the customer. But part of their success at eliminating meaningful competition for online book sales has to do with the nature of the Internet. Online likes one winner in many spaces because it serves the users better NOT to fragment aggregations. If Amazon's reader reviews were spread over 1,000 web sites, they wouldn't be as useful to consumers. And their recommendation engine thrives on data: fewer customers would mean less helpful recommendations for the customers who remain, and the concentration of purchase data at Amazon means its retailing competitors can offer only less useful recommendations. This is an edge that may not stay with the retailer forever, though, because the playing field for information about books is being leveled by social networking sites. That's why Amazon is investing in them.
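To see why that data concentration matters, consider a toy co-occurrence recommender of the kind the paragraph above gestures at: every additional purchase history strengthens the pairwise counts, and splitting the same orders across competing retailers weakens what each one can learn. The order data below is invented, and real recommendation engines are far more sophisticated.

from collections import Counter
from itertools import combinations

def build_cooccurrence(orders):
    """Count how often each pair of books appears in the same order."""
    pairs = Counter()
    for basket in orders:
        for a, b in combinations(sorted(set(basket)), 2):
            pairs[(a, b)] += 1
    return pairs

def recommend(pairs, book, n=3):
    """Rank the books most often bought alongside the given one."""
    scores = Counter()
    for (a, b), count in pairs.items():
        if a == book:
            scores[b] += count
        elif b == book:
            scores[a] += count
    return scores.most_common(n)

orders = [
    ["Anathem", "Snow Crash"],
    ["Snow Crash", "Cryptonomicon"],
    ["Anathem", "Snow Crash", "Cryptonomicon"],
]
print(recommend(build_cooccurrence(orders), "Snow Crash"))
# Split the same orders across two retailers and each sees weaker counts:
# that is the fragmentation cost described above.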
This tendency to concentration makes it urgent for publishers to get into niches and start trying to own them while they have legacy advantages. If the history of the Net so far is any guide, each information and interest niche will end up being owned by a very small number of players; often it will boil down to one. We seem to have been pretty fortunate with the dominant players (perhaps we should call them "monopoly threats") that have emerged so far, among them: Amazon, Google, eBay, Craigslist, Wikipedia, and a now-emerging Facebook. They've executed well and kept their eye on the stakeholders they serve. They, so far, have been more benign dominators than were Microsoft and AOL, two big winners on the previous go-round.
tags: business, network effects, publishing, web 2.0
Radar Interview with Clay Shirky
by Joshua-Michéle Ross | comments: 3
Clay Shirky is one of the most incisive thinkers on technology and its effects on business and society. I had the pleasure to sit down with him after his keynote at the FASTForward '09 conference last week in Las Vegas.
In this interview Clay talks about:
- The effects of low-cost coordination and group action
- Where to find the next layer of value when many professions are being disrupted by the Internet
- The necessary role of low-cost experimentation in finding new business models
A big thanks to the FASTForward Blog team for hosting me there.
tags: clay shirky, future at work, innovation, journalism, publishing, social media
The Kindle and the End of the End of History
by Jim Stogdill | comments: 23
This morning I was absentmindedly checking out the New York Times' Bits blog coverage of the Kindle 2 launch and saw this:
“Our vision is every book, ever printed, in any language, all available in less than 60 seconds.”
It wasn't the main story for sure. It was buried in the piece like an afterthought, but it was the big news to me. It certainly falls into the category of big hairy audacious goal, and I think it's a lot more interesting than the device Bezos was there to launch (which still can't flatten a colorful maple leaf). I mean, he didn't say "every book in our inventory" or "every book in the catalogues of the major publishers that we work with." Or even, "every book that has already been digitized." He said "every book ever printed."
When I'm working I tend to write random notes to myself on 3x5 cards. Sometimes they get transcribed into Evernote, but all too often they just end up in piles. I read that quote and immediately started digging into the closest pile looking for a card I had just scribbled about an hour earlier.
I had been doing some research this morning and was reading a book published in 1915. It's long out of print, and may have only had one printing, but I know from contemporary news clippings found tucked in its pages that the author had been well known and somewhat controversial back in his day. Yet, Google had barely a hint that he ever existed. I fared even worse looking for other people referenced in the text. Frustrated, I grabbed a 3x5 card and scribbled:
"Google and the end of history... History is no longer a continuum. The pre-digital past doesn't exist, at least not unless I walk away from this computer, get all old school, and find an actual library."
My house is filled with books; it's ridiculous, really. They are piled up everywhere. I buy a lot of old used books because I like to see how people lived and how they thought in other eras, and I guess I figure someday I'll find time to read them all. For me, it's often less about the facts they contain and more about peeking into alternative world views. Which is how I originally came upon the book I mentioned a moment ago.
The problem is that old books reference people and other stuff that a contemporary reader would have known immediately, but that are a mystery to me today - a mystery that needs solving if I want to understand what the author is trying to say, and to get that sense of how they saw the world. If you want to see what I mean, try reading Winston Churchill's Second World War series.
Churchill speaks conversationally about people, events, and publications that a London resident in 1950 would have been familiar with. However, without a ready reference to all that minutiae you'll have no idea what he's talking about. Unfortunately, a lot of the stuff he references is really obscure today, and today's search engines are hit and miss with it: they only know what a modern Wikipedia editor or some other recent writer thinks is relevant today. Google is brilliant for things that have been invented or written about in the digital age, or that made enough of a splash in their day to have been digitized since, but the rest of it just doesn't exist. It's B.G. (before Google) or P.D. (pre-digital) or something like that.
To cut to the chase, if you read old books you get a sense for how thin the searchable veneer of the web is on our world. The web's view of our world is temporally compressed, biased toward the recent, and even when it does look back through time to events memorable enough to have been digitally remembered, it sees them through our digital-age lens. They are being digitally remembered with our world view overlaid on top.
I posted some of these thoughts to the Radar backchannel list and Nat responded with his usual insight. He pointed out that cultural artifacts have always been divided into popular culture (on the tips of our tongues), cached culture (readily available in an encyclopedia or at the local library) and archived culture (gotta put on your researcher hat and dig, but you can find it in a research library somewhere). The implication is that it's no worse now because of the web.
I like that trichotomy, and of course Nat's right. It's not like the web is burying the archive any deeper. It's right there in the research library where it has always been. Besides, history never really operates as a continuum anyway. It's always been lumpy for a bunch of reasons. But as habit and convenience make us more and more reliant on the web, the off-the-web archive doesn't just seem hard to find, it becomes effectively invisible. In the A.G. era, the deep archive is looking more and more like those charts used by early explorers, with whole blank regions labeled "there be dragons".
So, back to Bezos's big goal... I'd love it to come true, because a comprehensive archive that is accessible in 60 seconds is an archive that is still part of history.
tags: big hairy audacious goals, emerging tech, publishing
For-Profit, Non-Profit, and Scary Humor
by Michael Jon Jensen | comments: 6
Guest blogger Michael Jon Jensen, Director of Strategic Web Communications for the Office of Communications of the National Academies and National Academies Press, has been at the interface between digital technologies and scholarly/academic publishing since the late 1980s.
Tim was kind enough to suggest that I expand on a longish comment I made on his recent post Stuff That Matters: Non-profit to For-profit.
My argument wove together two threads: first, I pushed back at his conventional framing of the non-profit vs. for-profit sectors. But what I think caught his attention most was my description of a project that's trying to "find the funny" in the grinding, slow-motion collapse of our natural world.
An easy knee-slapper, eh?
I'll get back to that second theme after some musings on non-profit vs. for-profit:
Tim: The heart of my message is that work on stuff that matters is a great hedge in down times: even if there isn't a huge monetary payoff, you've done something that needs doing. And it's certainly true that non-profit enterprises are often a good way to tackle hard problems that the marketplace doesn't seem to be addressing.

But I want to make clear that I'm not just talking about charity work. I'm talking about the creation of real economic value. There are huge opportunities for entrepreneurs in solving hard problems, and in so doing creating new markets that can be exploited not just by themselves but by those that follow in their footsteps.
I certainly can't disagree with most of that statement -- but we need to do better at clarifying the roles and mission-driven goals underlying the nonprofit and the for-profit worlds, especially on "stuff that matters."
Non-profits vs. For-profits
Tim comes to his benign perspective on the for-profit sector honestly: O'Reilly has historically been a responsible for-profit, building immense social value at the same time that it profits from its actions. But O'Reilly Media is a somewhat exceptional company.
In the main, the for-profit world has a different "maturation goal" than the non-profit world, and it affects nearly every decision made in either kind of enterprise.
I heard my favorite summation of the distinction from Peter Likins at an Online Computer Library Center conference years ago. He was then President of the University of Arizona; I first used this quote more than a decade ago, in a presentation I gave entitled "Entrepreneurs of Social Value":
"A for-profit's mission is to create as much value for its stockholders as possible, within the constraints of society. The non-profit's mission is to create as much value for society as possible, within the constraints of its money."
Of course there are, as Tim mentions, great overlaps betwixt the two, and the more that the for-profit world addresses the "stuff that matters," the better. But quite frequently -- at least in publishing, and online, and in the "public good" sector -- when a for-profit takes advantage of that overlap, the pattern has been to decrease the public good.
Take, for example, scientific publishing: in the post-WWII economy, most non-profit scientific journals were bought up by a handful of smart for-profit publishers who, over the following decades, ratcheted up prices far beyond what university libraries could afford. The result was a dramatic shift in library resource use: an increasing share of nonprofit money went to for-profit scholarly publishing. One could argue that $50,000 a year is a fair price for a really important specialty journal, but it's not an argument that fits into the "stuff that matters" or "social value" meme.
In that instance, smart, rapacious for-profit cherry-picking decreased the means that nonprofit publishers had to fund their other, less profitable work in the humanities, the social sciences, or even the sciences themselves.
A for-profit takeover of formerly nonprofit work could also describe what has happened with Blackwater, and the privatization of the military in general -- higher costs, less accountability, and unintended consequences.
I've worked in nonprofit publishing for more than 20 years, and while I recognize the need for a risk-reward economy, some care needs to be taken to acknowledge that the "public good" is rarely profit-making. It can be sustainable, but it is rarely super-profitable.
That said, over those 20+ years, I've always had side projects of some kind -- "stuff that matters" projects that I hoped would end up being profitable, or potentially commercial ones that might be fabulously so.
My hoped-for goals for those projects have changed over time, and recently shifted drastically. For the last 18 months my side project has been with my oldest, bestest friend -- a project which has changed my entire thinking on "what *really* matters," and what "breakthroughs" we need in the next decade -- from the Web 2.0 community, from myself, and from the world at large.
Yeah, it's time for phase II of this guest blog: about trying to turn the onrushing apocalypses into laughter -- or at least a knowing grin.
tags: publishing, web 2.0
Four short links: 27 Jan 2009
by Nat Torkington | comments: 0
Fantasy, feedback, facts, and flies, all will be revealed in today's links of loops and life:
- Blueful - a story told in text, but delivered through the medium of web sites. It's like an xkcd cartoon embodied in the web. Interesting, artistic, and makes you look at web sites in a new way. From Aaron A. Reed.
- The Case Against Candy Land - Steven Johnson talks about how dull the children's games of our youth are. "What’s irritating about the games is that they are exercises in sheer randomness. It’s not that they fail to sharpen any useful skills; it’s that they make it literally impossible for a player to acquire any skills at all." Every process in life should have a feedback loop that lets you get better at it.
- Journo Data - a Guardian journalist publishes data resources about the US economy as Google spreadsheets. This is the start of something interesting, where the raw data is available from journalists, not just the (textual or programmatic) interpretation. As mentioned in the fantastic presentation Tim just linked to, access to the data behind our world view is essential if we are to critically assess that world view.
- Userfly - a usability tool that records and then recreates your users' sessions on your web site, so you can see what they type, where they click, and where they backtrack.
tags: book related, games, journalism, publishing, usability, web
Making Site Architecture Search-Friendly: Lessons From whitehouse.gov
by Vanessa Fox | comments: 10
Guest blogger Vanessa Fox is co-chair of the new O'Reilly conference Found: Search Acquisition and Architecture. Find more from Vanessa at ninebyblue.com and janeandrobot.com. Vanessa is also entrepreneur in residence at Ignition Partners, and Features Editor at Search Engine Land.
Yesterday, as President-elect Obama became President Obama, we geeky types filled the web with chatter about change. The change of change.gov becoming whitehouse.gov, that is. The new whitehouse.gov robots.txt file opens everything up to search engines, while the previous one had 2,400 lines! The site has a blog! The fonts are Mac-friendly! That Obama administration sure is online savvy.
Or is it?
An amazing amount of customer acquisition can come from search (a 2007 Jupiter Research study found that 92% of online Americans search monthly and over half search daily). Whitehouse.gov likely doesn't need the kind of search visibility that most sites need, but when people search for information about today's issues, such as the economy, the Obama administration surely wants the whitehouse.gov pages that explain its position to show up.
The site has a blog, which is awesome, but the title tag, the most important tag on the page, contains only the text "blog". Nothing else. That might help the page rank well for people searching for blog, but that's probably not what they're going for. And this doesn't just hurt them in search, of course; the title is also what shows up in the browser tab and in bookmarks.
The site runs on IIS 6.0. Does the site developer know about the tricky configuration needed to make its redirects search-engine-friendly?
Search engines are text-based, so they can't read text hidden in images. Some whitehouse.gov pages get around this issue well, by making the text look image-like while leaving it as actual text.
However, other pages have text in images and don't use ALT text to describe them. (This, of course, is an accessibility issue as well, as it keeps screen readers from being able to access the text in the images.) An example of this is the home page, which may be part of why whitehouse.gov doesn't show up on the first page in a search for President Obama.
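Both of the on-page problems described above, a generic title tag and images with no ALT text, are easy to check mechanically. Here's a minimal sketch using only Python's standard-library HTML parser; the sample page is a stand-in, not the real whitehouse.gov markup.

from html.parser import HTMLParser

class SEOAudit(HTMLParser):
    """Collect the page title and any <img> tags missing ALT text."""

    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.images_missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
        elif tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):
                self.images_missing_alt.append(attrs.get("src", "?"))

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

page = '<html><head><title>blog</title></head><body><img src="/header.jpg"></body></html>'
audit = SEOAudit()
audit.feed(page)
print("title:", repr(audit.title))            # 'blog' -- too generic to rank
print("images missing ALT:", audit.images_missing_alt)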
There are all kinds of technical issues, big and small, that impact whether your site can be found in search results for what you want to be found for. (whitehouse.gov uses underscores rather than dashes in URLs, the meta descriptions are the same on every page...) Probably the biggest issue in this case is the lack of 301 redirects between the old site and the new site. When you change domains and move content to the new domain, you don't want to have to rebuild the audience and links all over again. (Not that Obama or whitehouse.gov will have a problem attracting an audience, but we can't all be president!) When you use a 301 redirect, both visitors and search engines know to replace the old page with the new one.
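For readers who want to see what a 301 actually looks like on the wire, here's a minimal sketch using Python's standard-library HTTP server: every request to the old host is answered with a permanent redirect to the same path on the new domain. It illustrates the mechanism, not how change.gov is actually configured.

from http.server import BaseHTTPRequestHandler, HTTPServer

NEW_DOMAIN = "https://www.whitehouse.gov"

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(301)                       # Moved Permanently
        self.send_header("Location", NEW_DOMAIN + self.path)
        self.end_headers()

if __name__ == "__main__":
    # Try: curl -I https://localhost:8080/agenda/iraq/
    HTTPServer(("", 8080), RedirectHandler).serve_forever()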
In the case of change.gov, it's unclear if they intend to maintain the old site. The home page asks people to join them at whitehouse.gov, but all the old pages still exist (even the old home page at https://change.gov/content/home).
And in many cases, the same content exists at both change.gov and whitehouse.gov (see, for instance, https://change.gov/agenda/iraq_agenda/ and https://www.whitehouse.gov/agenda/iraq/).
As Matt Cutts, Googler extraordinaire, pointed out, give them a few days to relax before worrying so much about SEO. And I certainly think the site is an excellent step toward better communication between the president and the American people. But not everyone has the luxury of having one of the most well-known names and sites in the world, so the technical details are more important for the rest of us.
If you want to know more about technical issues that can keep your site from being found in search and tips for making sure that you don't lose visibility in a site move, join us for the O'Reilly Found conference June 9-11 in Burlingame. And if you're in Mountain View tomorrow night (Thursday, January 22nd), stop by Ooyala from 6pm to 9pm for our webdev/seo meetup, and get all your search questions answered. Hope to see you there! (Macon Phillips and the whitehouse.gov webmasters are welcome, but my guess is that they're a little busy.)
tags: publishing, search, seo, web 2.0, whitehouse.gov
Wikipedia and RNA Biology
by Nat Torkington | comments: 11
I love the RNA Biology journal's new guidelines for submissions, which state that you must submit a Wikipedia article on your research on RNA families before the journal will publish your scholarly article on it:
This track will primarily publish articles describing either: (1) substantial updates and reviews of existing RNA families or (2) novel RNA families based on computational and/or experimental results for which little evolutionary analysis has been published. These articles must be accompanied by STOCKHOLM formatted alignments, including a consensus secondary structure or structures and a corresponding Wikipedia article. Publication in the track will require a short manuscript, a high quality Stockholm alignment and at least one Wikipedia article, each centered around the RNA in question.
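For readers unfamiliar with the Stockholm format those guidelines mention, it's a plain-text alignment format (used by Rfam and Pfam) in which sequence lines sit alongside "#=GC" per-column annotations such as the SS_cons consensus secondary structure. Here's a minimal sketch of a parser in Python; the alignment itself is invented for illustration.

def parse_stockholm(text):
    """Return (sequences, per-column annotations) from one Stockholm record."""
    seqs, gc = {}, {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line == "//" or line.startswith("# STOCKHOLM"):
            continue
        if line.startswith("#=GC"):           # per-column annotation, e.g. SS_cons
            _, feature, value = line.split(None, 2)
            gc[feature] = gc.get(feature, "") + value
        elif not line.startswith("#"):        # sequence line: name, then aligned seq
            name, aligned = line.split(None, 1)
            seqs[name] = seqs.get(name, "") + aligned
    return seqs, gc

record = """\
# STOCKHOLM 1.0
seq1/1-20     GGCUAAGU..UUCGACUUAGCC
seq2/5-26     GGCUAAGUCAUUCGACUUAGCC
#=GC SS_cons  <<<<<<<<......>>>>>>>>
//"""

seqs, gc = parse_stockholm(record)
print(seqs["seq1/1-20"])
print(gc["SS_cons"])   # the consensus secondary structure the guidelines require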
As my source for this points out, Nature (the publishing organisation behind the RNA Biology journal, and co-producer of Science Foo Camp with O'Reilly and Google) already synchronises a database with Wikipedia. Apparently there's a core of scientists who do most of the edits, but also a lot of other scientists who pop in sporadically to fix or add information.
Kudos to Nature, the publishers of RNA Biology, for doing something imaginative to increase the commons. Journals wield a huge amount of power in the scientific world, and it's wonderful to see them using that power to incentivize good.
tags: nature, publishing, science, wikipedia
Did you read the book from that movie?
by Brett McLaughlin | comments: 15
New Radar blogger Brett McLaughlin is the executive editor of O'Reilly's Head First books and a Java developer-turned-author.
It doesn't take a rocket scientist to realize that media is changing the way books are viewed. In fact, video - and YouTube in particular - has already changed how books are sold. Most big fiction releases are heralded by short "book trailers" that give an almost movie-like feel to the contents of the book.
But in a recent article published by the Christian Science Monitor, I was surprised to see that there's an even more notable link between movies and the sale of books:
In the upcoming Christian movie “Fireproof”, screenwriters created a book as a plot point. The movie tells the story of Caleb Holt, a firefighter with a troubled marriage. To help prevent divorce, Caleb’s dad suggests he read a book called “The Love Dare.” The book changes Caleb’s view of marriage and transforms his life. As soon as preview audiences saw the film, they began flooding bookstores with inquiries.
The only problem: The book didn’t exist.
It does now, however.
Brothers and associate pastors Alex and Stephen Kendrick, also co-directors and producers of “Fireproof,” sat down and penned such a book in the space of a few weeks. It hasn’t hit bookstores yet but has already sold 300,000 copies and may go on to become the bestselling Christian book of 2008.
This is pretty remarkable. Keep in mind, we've long seen books-turned-into-movies re-released with movie-centric covers. We've seen movies come out, and then books released that are adaptations of the movie, in cases where the movie's based on an original screenplay. But books that happen to be featured in movies? That's a new one.
Is this an isolated case? Or perhaps a phenomenon related more to religion and self-help tomes? Not so much; from the same article:
On the opposite end of the spectrum, there’s the story of the “Sex and the City” book. When Carrie Bradshaw (Sarah Jessica Parker) sat in bed reading a book called “Love Letters from Great Men” in a scene in the film, women viewers everywhere decided they needed a copy.

Again: As the press was quick to report, the book didn’t actually exist. (At least not with that title.)
But there was something close enough: a 1920s title called “Love Letters of Great Men and Women” reissued last year by Kessinger Publishing. On the strength of the movie, the book suddenly became a hot item for booksellers.
So what does this mean for publishing as an industry? More pointedly, what does this mean for learning books, the sort of books that O'Reilly and other technology, math, science, and educational publishers routinely put out?
I'm not completely sure, although I plan on positing a few ideas in the coming days... but one thing is clear: the competition for a book sale is no longer just other good books. Movies, videos on YouTube, even the latest Metal Gear Solid game on PlayStation 3 are increasingly key competitors. They're informing buyers about what to buy, in new and surprising ways.
And when the competition is no longer just books, everything changes... whether we acknowledge it or not. Anyone - or any company - that doesn't realize this and react is going to be hurting before decade's end.
tags: learning, new media, publishing
I Am Trying To Believe (that Rock Stars aren't Dead)
by Jim Stogdill | comments: 36
Last Friday night I attended a Nine Inch Nails concert in Philadelphia with Chris Cera of Vuzit (thanks Chris for your help with this post). At 43, Trent Reznor can certainly still grab an audience by the throat and shake it. It was a fantastic show; the kind of show that has you checking to see if there are other tour dates within driving distance.
During a short break in the sonic and visual mayhem, Reznor spoke for a moment and told us emphatically to steal his music. Later, on my way to the car after the show, a member of the band Cube Head was handing out Sharpie-labeled, home-burned demo CDs in the parking lot, complete with a hand-drawn "copywrong" marking. It was an interesting contrast between an established artist and emerging talent, and how they are both figuring out how to make their way in the post-vinyl, post-jewel-case economy.
I'll come back to that theme in a second, but first a brief aside. Chris (who has some background in real-time video processing) and I were blown away by the amazing stage show; it was geekery made manifest and a video-processing tour de force. During about a third of the show the band played sandwiched between at least two giant video monitors, the one in the foreground transparent when its pixels were dormant and opaque when lit up.
The source video for the display was sometimes heavily processed live camera input, sometimes prerecorded, and sometimes electronically generated. Whatever the source, it was frequently and heavily modified by the audio inputs or by the movements of the artists on the stage. With a sweep of his hand Trent would wave away the static hiding him from the audience, and moments later it would fill back in. It's hard to explain, but the effect was very cool. Cool enough that trying to figure it out started to distract both of us from the music. There are some videos out there of it in action, but none that I found really capture the full effect. Let me know in the comments if you find one.
The next day, still curious about how the stage show was done, and with Reznor's call to "steal my music" still in my head, I poked around on the web looking for more info. One of the most interesting things I found was this story about Nine Inch Nails' Year Zero alternate reality game. The way Reznor used this new gaming medium as an extension of his canvas rather than as a promotional stunt (and the nascent geekness it suggests) makes me think he has a much better than average chance of figuring out the post-RIAA world. Or it may just be that, with the state of distribution being what it is, he realized that while promotion might move more units, it would do so in a way so loosely coupled to monetization as to be pointless.
His comments in the story's sidebar make me think it is probably the latter. In particular: "So a couple years ago I realized that music essentially is free now. I'd prefer it wasn't, but it is. And hey, I've had a pretty good run. I can still make a living touring." .... "I feel that the right model hasn't revealed itself yet."
Here's the thing, I'm not convinced it's going to reveal itself. Or, more likely, it has revealed itself and he already knows what it is: "I can still make a living touring."
tags: just plain cool, music, nin, publishing, riaa
Social Networking for Books: One Ring, or Loosely Joined?
by Tim O'Reilly | comments: 24
I have to confess that one of the social networking tools I find most valuable is Goodreads. (It's a close second to Twitter, and way ahead of Facebook, FriendFeed, or Dopplr.) Unlike Twitter, where I follow hundreds of people (possible because of Twitter's minimalism) and am followed by thousands, on Goodreads I follow and am followed by a small circle of friends and people whose taste in books I trust. As someone who loves books, it is the pinnacle of private social networking for me.
So it was with some interest that I read about Amazon's acquisition of Shelfari. Much of the resulting commentary has focused on the problems this poses for LibraryThing, in which Amazon also has an investment (via their recent purchase of AbeBooks). I'm a bit surprised that the articles have seemingly ignored the fact that Goodreads appears to be the market leader, at least based on traffic data from compete.com.
Of course, that could change quickly if Amazon throws their muscle behind Shelfari and integrates it into their overall service. And there's the rub: we're entering a period of Web 2.0 consolidation. After all, Web 2.0 is all about network effects in applications that get better the more people use them. And that means that companies with dominant share tend to get more dominant over time; that dominance need not be organic to start with (though it helps). Over time, I expect to see companies who've achieved dominant market share in one segment use it to dominate a related segment.
But here's the counter: open and interoperable applications, including open social networks. When are companies with "point applications" of social networks going to realize that their best option, in the face of inevitable competition from big companies looking to dominate their market, is to join forces via shared social networks?
Some of my friends prefer LibraryThing. Others may prefer Shelfari. But I only network with those on Goodreads because that's the service I ended up using first. What a shame that I can't see what my friends on LibraryThing and Shelfari might be reading! I'd love to see a firm commitment to cross-application connectivity, with the social network as infrastructure rather than application.
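To make that concrete, here's a purely hypothetical sketch: if each service exported friends' reading lists in some agreed-upon JSON shape, a thin client could merge shelves across Goodreads, LibraryThing, and Shelfari. No such shared format or feeds exist; everything below is invented to illustrate the "social network as infrastructure" idea.

import json

# Invented per-service feeds in an assumed common export shape.
goodreads_feed = json.dumps(
    {"service": "goodreads", "user": "tim",
     "friends": [{"name": "sarah", "reading": ["The Twitter Book"]}]})
librarything_feed = json.dumps(
    {"service": "librarything", "user": "tim",
     "friends": [{"name": "mike", "reading": ["The Ballplayers"]}]})

def merge_shelves(*feeds):
    """Collapse per-service friend feeds into one cross-service reading list."""
    merged = {}
    for raw in feeds:
        feed = json.loads(raw)
        for friend in feed["friends"]:
            merged[(friend["name"], feed["service"])] = friend["reading"]
    return merged

for (name, service), books in merge_shelves(goodreads_feed, librarything_feed).items():
    print(f"{name} ({service}): {', '.join(books)}")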
This applies to other specialized social networks as well. Sorry, even though I'm an investor in TripIt, I'm not going to try to rebuild the social network I've already got on Dopplr, just because TripIt thinks they'd better add this hot functionality to what was already a unique and interesting product.
I've argued for years that one of the critical architectural decisions we can make about Web 2.0 applications is whether they are built on the "one ring to rule them all" model that we saw with Microsoft Windows and Office, a game where network effects drive a winner-takes-all marketplace, or the Unix/Internet model of "small pieces loosely joined," in which cooperating applications come together to build value greater than any of the pieces do alone.
We're entering the critical phase of that decision. Application developers need to embrace the "small pieces loosely joined" model, or they will be picked off one by one by dominant companies who've already reached scale, and who are practicing the "one ring" model. As Benjamin Franklin said during the American Revolution, "Gentlemen, we must all hang together, or we shall assuredly all hang separately." Now is a good time for LibraryThing and Goodreads to start talking about interoperability.
tags: publishing, web 2.0
Recent Posts
- A Graphic Designer Puts Print on Demand Through Its Paces | by Tim O'Reilly on August 23, 2008
- Lessons on Blogging from Jon Stewart | by Tim O'Reilly on August 17, 2008
- O'Reilly Ebook Bundles Now Available | by Andrew Savikas on July 16, 2008
- Select O'Reilly Books Soon on Kindle, and as Digital Ebook Bundles | by Andrew Savikas on June 18, 2008
- The Kindle: "Looks Like a Million to Me" | by Tim O'Reilly on June 9, 2008
- Twittering D Conference, and Kindle Sales Stats, finally | by Tim O'Reilly on May 28, 2008
- Boycotting Amazon | by Allison Randal on May 21, 2008
- Amazon Accused of Anti-Trust Violations "Tied" to Print-On-Demand Terms | by Andrew Savikas on May 19, 2008
- When Authors Ask Us About the Consequences of "Piracy" | by Andrew Savikas on April 28, 2008
- Publishers Beware: Amazon has you in their sights | by Tim O'Reilly on April 16, 2008
- Amazon Gets Demanding with Print-on-Demand Publishers | by Andrew Savikas on March 28, 2008
- Goodbye, New York Times | by Jimmy Guterman on March 24, 2008