Tim O'Reilly

Tim O'Reilly is the founder and CEO of O'Reilly Media, Inc., thought by many to be the best computer book publisher in the world. O'Reilly Media also hosts conferences on technology topics, including the Web 2.0 Summit, the Web 2.0 Expo, the O'Reilly Open Source Convention, and the O'Reilly Emerging Technology Conference. Tim's blog, the O'Reilly Radar, "watches the alpha geeks" to determine emerging technology trends, and serves as a platform for advocacy about issues of importance to the technical community. Tim is an activist for open source and open standards, and an opponent of software patents and other incursions of new intellectual property laws into the public domain. Tim's long-term vision for his company is to change the world by spreading the knowledge of innovators.
Mon, Sep 8, 2008
Twitter Epigrams and Repartee
I recently encountered the following zinger on twitter:
@jayrosen_nyu: Scoble is like a guest at a hotel for one, where a huge staff is trying to anticipate his every need. And he's angry.

Shades of noted wits from the past! As when Dorothy Parker, asked to use the word horticulture in a sentence, said "You can lead a whore to culture, but you can't make her think"; or when Nancy Astor said to Winston Churchill, "Winston, if I were your wife, I'd put poison in your coffee," and he replied, "Nancy, if I were your husband, I'd drink it"; or, less meanly, Oscar Wilde on his deathbed: "Either that wallpaper goes or I do." (My all-time favorite, from the collection Viva la Repartee, tells how the Earl of Sandwich braced reformist politician John Wilkes with the insult, "Upon my soul, Wilkes, I don't know whether you'll die upon the gallows, or of the pox," to which Wilkes cuttingly replied, "That will depend, my lord, on whether I embrace your principles, or your mistress." Ouch!)
It occurred to me that twitter, with its 140-character limit and its dialogue between people who may be rivals as well as friends, is a breeding ground for the rebirth of repartee and of the epigram. So I started keeping track of some of these sparks of wit. Some of these are actual rejoinders; others are simply clever insights. Here are a few I've captured recently:
@sacca: You can't really appreciate the vapidity of most people's taste in music until you live directly above a traffic signal.

@tempo: Evolution is a sorting process that is the very antithesis of random. (in response to https://is.gd/1Do1 )
@davorg: Conference Driven Development - submitting talks to conferences so that it will galvanise you into doing the work you're talking about.
@sourcegirl: procrastination is our brain's way of saying that something is not as important as we may think it is...
@gstein: OH: isn't a smoking area in a restaurant like a peeing area in a pool?
@amandachapel: @lisahoffmann "I've had conversations with people I never would have met otherwise." Like hanging out at the bus terminal.
I only follow a few hundred people out of millions of twitter users, so I'm thinking that there must be tens of thousands of great lines waiting out there to be captured into a book of twitter one-liners. If you know of any, and want to share them, either tweet them to the attention of @timoreilly or leave them in the comments here. A book of twitter wit and wisdom would make a fun conference giveaway, don't you think?
Keep in mind that, as Aristotle said, "Wit is educated insolence." I'm not looking for abuse, per se, but cleverness and concise expression of insight. When I asked for suggestions on twitter, several people pointed me to feeds of people who routinely insulted others, often crudely. A great insult may be appropriate, but it's far from the soul of wit. Consider the examples above, and give me more like that!
P.S. And be sure to give me the link to the individual status message if you have it. I can hopefully find it via twitter search without one, but that's more work than you might think, especially if the quote isn't exact. As I discovered finding the links above, one wrong word that you're sure you remembered correctly can get you a seemingly mysterious "no results."
tags: epigrams, repartee, twitter
| comments: 22
Fri, Sep 5, 2008
Microsoft Missing the Boat on Mobile?
Yesterday's Microsoft Watch had an incisive article about Microsoft's failure to compete in the mobile phone marketplace. Echoing my own assertions that Microsoft's obsessive focus on competition with Google in search is a massive distraction, while open mobile is Google's most strategic initiative, Joe Wilcox notes:
Microsoft must change its priorities. The company has wasted too much time chasing Google in search. The search wars are over, and Google won. Microsoft must accept this. Where Microsoft should have been pushing hard is the device category where search will be the killer application: the cell phone.

Instead, Windows Mobile has fallen way behind competing products. Windows Mobile is a mess. The user interface is too complicated, and there are few—I say no—capabilities that distinguish it from other mobile operating systems....
It's time for Microsoft to launch a mobile Manhattan Project, something on the scale of Internet Explorer in 1996. If Microsoft cedes the mobile market to Apple and Google, the PC will be the software giant's final—and declining—legacy...
The mobile market has dramatically changed over the last 12-15 months. But Windows Mobile hasn't moved with it. Apple's iPhone is exciting and has raised end user expectations about mobile user interfaces. Apple's iPhone platform has huge potential to woo developers, too, mainly because of the App store.
Now along comes Google, carrying two nuclear missiles: Android and Chrome. Both are immediate problems for Microsoft. Let me be absolutely clear: Chrome is not a Web browser, it's an application runtime. Chrome is really Google Gears with a browser facade. Sure, Chrome is based on Webkit and has browser legacy, but the product's core capabilities—and Google's objectives for them—is running Web applications. Chrome is a development platform, but in the cloud instead of on the PC.
Implicit in the argument is that while both Google and Microsoft are subsidizing their mobile initiatives with cash from their core businesses, in Google's case, succeeding on mobile is aligned with and strengthens their core revenue stream in search, while in Microsoft's, it competes with and undercuts their core revenue in operating systems. This becomes clear when Joe turns his mind to lightweight laptops:
Microsoft's problem isn't just mobile phones. The next-generation PCs aren't big, they're small. Yesterday I was looking at the pink MSI Wind Netbook and thought it would be a perfect Web application computer. "Net" is in the name for a reason.

Microsoft's Windows Vista is a fiasco that just keeps on giving trouble. I'm not one of the Vista haters, but that doesn't mean I don't recognize its foibles. The biggest: Vista demands too much hardware at a time when the market has shifted to lower-powered notebooks and now netbooks and ultra-low-cost PCs. The latter two really can't run Windows Vista, which is why Microsoft has licensed Windows XP Home for them.
Microsoft had to do something. The company couldn't abandon the emerging netbook and ultra-low-cost PC markets to Linux, so it licensed Windows XP Home for these devices. Now Google comes along with Chrome, which is more application runtime than Web browser. Chrome should run just fine on netbooks running XP Home, even with the resources consumed by running each tab as a separate process.
What do you bet that Microsoft comes up with a new "improved" release of XP Home that has features deliberately designed to block Chrome? This is, after all, what they did against Netscape in 1996, with Windows NT Workstation allowing no more than 10 TCP/IP connections so that it couldn't be used with Netscape web servers.
But this kind of backward-looking, defensive competition doesn't do more than buy you time. Yes, Microsoft killed Netscape, but they missed the deeper, stronger competition that would come from true web applications like Google. The future is not like the past, and any strategy that is designed to protect the past will eventually fail.
What's so ironic is that if Microsoft started thinking about the user again, instead of thinking about protecting their business, they could do great things. There are many problems yet to be solved in online software, but they won't be solved without bold leaps into the future.
Tue, Aug 26, 2008
Social Networking for Books: One Ring, or Loosely Joined?
I have to confess that one of the social networking tools I find most valuable is Goodreads. (It's a close second to Twitter, and way ahead of Facebook, Friendfeed, or Dopplr.) Unlike twitter, where I follow hundreds of people (possible because of twitter's minimalism) and am followed by thousands, on Goodreads, I follow and am followed by a small circle of friends and people whose taste in books I trust. As someone who loves books, it is the pinnacle of private social networking for me.
So it was with some interest that I read about Amazon's acquisition of Shelfari. Much of the resulting commentary has focused on the problems this poses for LibraryThing, in which Amazon also has an investment (via their recent purchase of Abebooks). I'm a bit surprised that the articles have seemingly ignored the fact that Goodreads appears to be the market leader, at least based on data from compete.com:
Of course, that could change quickly if Amazon throws their muscle behind Shelfari and integrates it into their overall service. And there's the rub: we're entering a period of Web 2.0 consolidation. After all, web 2.0 is all about network effects in applications that get better the more people use them. And that means that companies with dominant share tend to get more dominant over time; that dominance need not be organic to start with (though it helps). Over time, I expect to see companies that have achieved dominant market share in one market segment use it to dominate a related segment.
But here's the counter: open and interoperable applications, including open social networks. When are companies with "point applications" of social networks going to realize that their best option, in the face of inevitable competition from big companies looking to dominate their market, is to join forces via shared social networks?
Some of my friends prefer LibraryThing. Others may prefer Shelfari. But I only network with those on Goodreads because that's the service I ended up using first. What a shame that I can't see what my friends on LibraryThing and Shelfari might be reading! I'd love to see a firm commitment to cross-application connectivity, with the social network as infrastructure rather than application.
This applies to other specialized social networks as well. Sorry, even though I'm an investor in Tripit, I'm not going to try to rebuild the social network I've already got on dopplr, just because Tripit thinks they'd better add this hot functionality to what was already a unique and interesting product.
I've argued for years that one of the critical architectural decisions we can make about Web 2.0 applications is whether they are built on the "one ring to rule them all" model that we saw with Microsoft Windows and Office, a game where network effects drive a winner-takes-all marketplace, or the Unix/Internet model of "small pieces loosely joined," in which cooperating applications come together to build value greater than any of the pieces do alone.
We're entering the critical phase of that decision. Application developers need to embrace the "small pieces loosely joined" model, or they will be picked off one by one by dominant companies who've already reached scale, and who are practicing the "one ring" model. As Benjamin Franklin said during the American Revolution, "Gentlemen, we must all hang together, or we shall assuredly all hang separately." Now is a good time for LibraryThing and Goodreads to start talking about interoperability.
| comments: 20
Sat, Aug 23, 2008
A Graphic Designer Puts Print on Demand Through Its Paces
A report on the UnderConsideration blog outlines a fascinating experiment called Dear Lulu. From the blog coverage:
This past July, fourteen students attended a two-day workshop at Germany's Hochschule Darmstadt University of Applied Sciences with Prof. Frank Philippin and London-based designer James Goggin. The brief, as explained by Goggin:

"My plan for the workshop is to investigate the visible and tangible parameters of graphic design — type specimens, halftone screens and, in particular, colour tests and calibration charts — and make a book of our own self-produced tests which we will send to print on Friday afternoon using the online print-on-demand system Lulu. The book project will therefore act as a colour/type/pattern test of the very system with which it is produced. "Print-on-demand" is an increasingly important production system which can serve to make us designers rethink the impact our profession has on the environment and to question the often wasteful print volumes and production methods requested of us by our clients. Graphic designers, and especially students, have a chance to use and subvert these relatively new (and fairly cheap) technological systems to our advantage."
The result of the workshop is Dear Lulu, a fantastic and imaginative resource that puts digital printing to the test through a Do-It-Yourself presentation that fits right in with the philosophy of print on demand that makes it such an alluring proposition for designers looking to publish with little financial risk and with pretty decent results in return.
The report is not only a fascinating analysis of how far Print on Demand has come, but also a great tool for evaluating printers in general, as the output of the process is a book designed to stress the capabilities of any printer. As Amrita Chandra wrote on twitter in response to my post there, "what is great is you can send the book to other printers for comparison."
Food for thought for research firms: what if the output of a research firm were not just a report but a tool for putting a company's own systems through their paces, evaluating them against the standards outlined in the report?
tags: lulu, print on demand, publishing
| comments: 2
Mon, Aug 18, 2008
Is Linking to Yourself the Future of the Web?
Last year, Bill Janeway really got my attention (pdf) when he noted that "over time, Wall Street 'firms began to trade against their clients for their own account, such that now, the direct investment activities of a firm like Goldman Sachs dwarf their activities on behalf of outside customers.'" As I wrote in my blog post at the time, Trading for Their Own Account, "I thought, whither Google, Yahoo! and Amazon?"
At the time, I noted the way that more and more information that was once delivered by independent web sites was now being delivered directly by search engines, and that rather than linking out to others, there were strong signs of a trend towards keeping the link flow to themselves.
This thought re-surfaced when Techcrunch launched Crunchbase. Now, rather than linking directly to companies covered in its stories, Techcrunch links to one of its own properties to provide additional information about them. I noticed the same behavior the other day on the New York Times, when I followed a link, and was taken to a search result for articles on the subject at the Times (with lots of ads, even if there were few results).
Journalism professor Jay Rosen noticed this too, and wrote the tweet that sparked this post:
@NYTimesComm Could you try to find out for me why Week in Review pieces do not link out even when vital to the story? https://is.gd/1Hzd
Follow Jay's link and you come to a story that indeed doesn't have any outbound links, except to other Times stories. Now, I understand the value of linking to other articles on your own site -- everyone does it -- but to do so exclusively is a small tear in the fabric of the web, a small tear that will grow much larger if it remains unchecked.
Business Week is also getting into the act, per a New York Times article entitled Topic Pages to Be Hub of New BusinessWeek Site:
The core of Business Exchange is hundreds of topic pages, on subjects as broad as the housing market and as narrow as the Boeing 787. Plans call for the number of topic pages to grow quickly into the thousands. (The first one created, which may or may not be in the public version of Business Exchange, was “BlackBerry vs iPhone.”)
Want to bet on whether articles in the magazine will link exclusively to these "topic pages"? At least Business Week plans to have outbound links from the topic pages. (Crunchbase does this too, just siphoning off the first step in the link stream, unlike the NYT roach-motel links.)
Each Business Exchange topic page links to articles and blog posts from myriad other sources, including BusinessWeek’s competitors, with the contents updated automatically by a Web crawler. Nearly all traditional news organizations offer only their own material, spurning the role of aggregator as an invitation to readers to leave their sites.
When this trend spreads (and I say "when", not "if"), this will be a tax on the utility of the web that must be counterbalanced by the utility of the intervening pages. If they are really good, with lots of useful, curated data that you wouldn't easily find elsewhere, this may be an acceptable tax. In fact, they may even be beneficial, and a real way to increase the value of the site to its readers. If they are purely designed to capture additional clicks, they will be a degradation of the web's fundamental currency, much like the black hat search engine pages that construct link farms out of search engine results.
I'd like to put out two guidelines for anyone adopting this "link to myself" strategy:
- Ensure that no more than 50% of the links on any page are to yourself. (Even this number may be too high.)
- Ensure that the pages you create at those destinations are truly more valuable to your readers than any other external link you might provide.
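As a thought experiment, the first guideline is easy to check mechanically. Here's a minimal sketch of a self-link audit in Python, using only the standard library; the helper names, sample page, and domain are hypothetical, and a real audit would also need to handle redirects and subdomain policy:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkAuditor(HTMLParser):
    """Collects the href target of every anchor tag on a page."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.hrefs.append(value)

def self_link_ratio(html, own_domain):
    """Fraction of a page's links pointing back at its own domain.
    Relative links (no host) count as self-links."""
    parser = LinkAuditor()
    parser.feed(html)
    if not parser.hrefs:
        return 0.0
    own = sum(
        1 for href in parser.hrefs
        if urlparse(href).netloc in ("", own_domain)
    )
    return own / len(parser.hrefs)

# Hypothetical page: two self-links, two outbound links.
page = (
    '<a href="/story/1">one</a>'
    '<a href="https://example.com/story/2">two</a>'
    '<a href="https://elsewhere.org/post">three</a>'
    '<a href="https://another.net/article">four</a>'
)
print(self_link_ratio(page, "example.com"))  # 0.5 -- right at the suggested ceiling
```

A page scoring above 0.5 would fail the first guideline; the second guideline, about the value of the destination pages, is of course a judgment no script can make.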
The web is a great example of a system that works because most sites create more value than they capture. Maybe the tragedy of the commons in its future can be averted. Maybe not. It's up to each of us.
tags: linking, web 2.0
| comments: 50
Sun, Aug 17, 2008
Lessons on Blogging from Jon Stewart
The New York Times today has a fascinating profile of Jon Stewart, host of The Daily Show, entitled Is This The Most Trusted Man in America? The article is a wonderful celebration of the person and the spirit of the show he's created.
But perhaps more interestingly in the internet context, this article is a must-read for anyone who cares about the future of journalism. It shows how the informality and attitude that we take as characteristic of blogging can be combined with the tough-mindedness, research, and craft that is displayed by the best investigative reporters.
Let's start with passion about stuff that matters, something top bloggers and top journalists ought to have in their genes:
MR. STEWART describes his job as “throwing spitballs” from the back of the room and points out that “The Daily Show” mandate is to entertain, not inform. Still, he and his writers have energetically tackled the big issues of the day — “the stuff we find most interesting,” as he said in an interview at the show’s Midtown Manhattan offices, the stuff that gives them the most “agita,” the sometimes somber stories he refers to as his “morning cup of sadness.” And they’ve done so in ways that straight news programs cannot: speaking truth to power in blunt, sometimes profane language, while using satire and playful looniness to ensure that their political analysis never becomes solemn or pretentious.

“Hopefully the process is to spot things that would be grist for the funny mill,” Mr. Stewart, 45, said. “In some respects, the heavier subjects are the ones that are most loaded with opportunity because they have the most — you know, the difference between potential and kinetic energy? — they have the most potential energy, so to delve into that gives you the largest combustion, the most interest. I don’t mean for the audience. I mean for us. Everyone here is working too hard to do stuff we don’t care about.”
Much like blogging, a key to the show's success is its authentic, personal voice, and its ability to synthesize news with viewpoint:
Ms. Corn [the show's executive co-producer] noted that while things “may be exaggerated on the show, it’s grounded in the way Jon really feels.”

“He really does care,” she added. “He’s a guy who says what he means.”
Unlike many comics today, Mr. Stewart does not trade in trendy hipsterism or high-decibel narcissism. While he possesses Johnny Carson’s talent for listening and George Carlin’s gift for observation, his comedy remains rooted in his informed reactions to what Tom Wolfe once called “the irresistibly lurid carnival of American life,” the weird happenings in “this wild, bizarre, unpredictable, hog-stomping Baroque” country.
“Jon’s ability to consume and process information is invaluable,” said Mr. Colbert. He added that Mr. Stewart is “such a clear thinker” that he’s able to take “all these data points of spin and transparent falsehoods dished out in the form of political discourse” and “fish from that what is the true meaning, what are red herrings, false leads,” even as he performs the ambidextrous feat of “making jokes about it” at the same time.
But there's also a lesson for bloggers that the show, however personal, is finely honed, with lots of research:
“We often discuss satire — the sort of thing he does and to a certain extent I do — as distillery,” Mr. Colbert continued. “You have an enormous amount of material, and you have to distill it to a syrup by the end of the day. So much of it is a hewing process, chipping away at things that aren’t the point or aren’t the story or aren’t the intention. Really it’s that last couple of drops you’re distilling that makes all the difference. It isn’t that hard to get a ton of corn into a gallon of sour mash, but to get that gallon of sour mash down to that one shot of pure whiskey takes patience” as well as “discipline and focus.”
And:
The day begins with a morning meeting where material harvested from 15 TiVos and even more newspapers, magazines and Web sites is reviewed. That meeting, Mr. Stewart said, “would be very unpleasant for most people to watch: it’s really a gathering of curmudgeons expressing frustration and upset, and the rest of the day is spent trying to mask or repress that through whatever creative devices we can find.”

The writers work throughout the morning on deadline pieces spawned by breaking news, as well as longer-term projects, trying to find, as Josh Lieb, a co-executive producer of the show, put it, stories that “make us angry in a whole new way.” By lunchtime, Mr. Stewart (who functions as the show’s managing editor and says he thinks of hosting as almost an afterthought) has begun reviewing headline jokes. By 3 p.m. a script is in; at 4:15, Mr. Stewart and the crew rehearse that script, along with assembled graphics, sound bites and montages. There is an hour or so for rewrites — which can be intense, newspaper-deadlinelike affairs — before a 6 o’clock taping with a live studio audience.
What the staff is always looking for, Mr. Stewart said, are “those types of stories that can, almost like the guy in ‘The Green Mile’ ” — the Stephen King story and film in which a character has the apparent ability to heal others by drawing out their ailments and pain — “suck in all the toxins and allow you to do something with it that is palatable.”
What a call to action! What a way forward for all of those trying to understand the future of news! Point of view fused with fact checking, bluntness and informality fused with ruthless editing, a humanistic vision that acts as a filter to make sure that the stories covered actually matter!
tags: blogging, daily show, jon stewart, journalism, trust
| comments: 5
Fri, Aug 15, 2008
Why We're Failing in Math and Science
Norman Mailer's brilliant novel Why Are We in Vietnam? doesn't talk explicitly about the Vietnam war; it tells a story about American culture and the American psyche, thereby producing a devastating critique of the war with the title and last line alone.
In a similar way, it may be easier to understand why America is falling behind at math and science with a few simple stories.
Last week, Robert Bruce Thompson, author of An Illustrated Guide to Home Chemistry Experiments, wrote a guest blog post on makezine.com, Home Science Under Attack, which told the sad story of how a retired chemist was arrested and his lab confiscated because he was doing experiments:
The Worcester Telegram & Gazette reports that Victor Deeb, a retired chemist who lives in Marlboro, has finally been allowed to return to his Fremont Street home, after Massachusetts authorities spent three days ransacking his basement lab and making off with its contents. Deeb is not accused of making methamphetamine or other illegal drugs. He's not accused of aiding terrorists, synthesizing explosives, nor even of making illegal fireworks. Deeb fell afoul of the Massachusetts authorities for ... doing experiments.

Authorities concede that the chemicals found in Deeb's basement lab were no more hazardous than typical household cleaning products. Despite that, authorities confiscated "all potentially hazardous chemicals" (which is to say the chemicals in Deeb's lab) from his home, and called in a hazardous waste cleanup company to test the chemicals and clean up the lab.
Pamela Wilderman, the code enforcement officer for Marlboro, stated, "I think Mr. Deeb has crossed a line somewhere. This is not what we would consider to be a customary home occupation."
Allow me to translate Ms. Wilderman's words into plain English: "Mr. Deeb hasn't actually violated any law or regulation that I can find, but I don't like what he's doing because I'm ignorant and irrationally afraid of chemicals..."
I forwarded this message to Dave Farber's IP list (which is now searchable via markmail, the amazing mailing list search engine!), and got back some great stories that I wanted to share.
Armando Stettner wrote one story that illustrates just how much our culture has changed. His story also involves the cops, but here, they understand and support science. Too bad that was 40+ years ago:
When I was about 13 or so, I also had a chemistry set in my basement. I was living on Long Island - Freeport, to be exact. I also remember the hobby shop with ALL sorts of glassware and little labeled bottles of chemicals. I had some really neat stuff: all sorts of chemicals - I seem to remember potassium ferrocyanide with which I did some chemoluminescence (I think that's one of the ingredients), sodium in liquid form, various acids, a few rolls of magnesium - not to mention all the paraphernalia: lots of pyrex stuff, triple beam balances, etc. All the chemicals were neatly arranged in this cabinet.

One day, I had mixed a concoction and was carrying it (premixed!) in a tin coffee can. Myself and a friend were carrying the stuff to the train tracks to test it out (light it) where it was relatively safe. The stuff started getting warm but I thought it was the sun heating the can up. Then it started getting REALLY warm. As it got hot, I dropped it in the middle of the street. The stuff flashed over. It was VERY cool.
But, I decided I didn't want to stay around any more and left.
Unfortunately for me, this all occurred in front of the house of someone who knew me (she was a 'friend' of my parents). She called the cops.
The Freeport police came to my house, questioned me and my parents, joined a little while later by some county detectives. They were very polite. We took them down to the basement where I showed them all the stuff. The uniformed police left and the detectives continued to look at all the stuff and ask questions. They called somebody to ask some advice. It turns out they called the county labs. The guy got off the phone and asked, 'You're not making any drugs down here, are you?' I said no!! He smiled - he winked at my parents. Then he said the most unexpected thing: he said the gang at the labs offered to give me a tour of the labs anytime I wanted.
Then they left asking me to be careful. For me, it was actually a positive experience.
Today, I'm sure I'd face a visit from the Hazmat teams and the DHS. And, because of the triple beam balance, my house (or my parents') would be confiscated under the forfeiture rules.
At Maker Faire earlier this year, Robert Bruce Thompson gave a talk (video unfortunately truncated at both ends) that highlighted how attitudes towards chemistry have changed since he was a kid, starting with a tour of the powerful chemistry sets available in 1964 (courtesy of the Sears Catalog), and tracing the dumbing down and rising fear of liability that doomed them, until, as Kevin Kelly noted in a recent review of Robert's book, we reached "the so-called chemistry sets today which boldly (and insanely) advertise they contain 'No Chemicals!'" (Review sent out in Cool Tools email, up on the Cool Tools site soon.)
Why are we failing at math and science? Because it isn't fun any more. When you put safety on the highest altar, what do you give up? When fear of lawsuits -- not to mention fear of technology -- drives product design, marketing, and public policy, you eliminate science at its roots, in the natural experimentation of kids who want to know how the world works.
tags: chemistry
| comments: 44
Thu, Aug 7, 2008
Al Gore Joins Web 2.0 Summit Lineup
As I wrote last month in What Good is Collective Intelligence if it Doesn't Make Us Smarter?, at this year's Web 2.0 Summit, we're focusing on how what we've learned from the web over the past decade can be applied to solve the world's hard problems. That's why I'm really excited to see that John Battelle has persuaded Al Gore to join us.
One of those hard problems that requires all the intelligence we can throw at it is global warming. And there's no one who deserves as much credit as Al Gore for getting it on our collective radar. Through persistence, vision, and hard work, and a real mastery of the new tools of global media, he made all of us pay attention. His work has been a textbook demonstration of the power of media to change the way people think.
That's Gore's continuing focus, with his role at Current TV. He's also joined Kleiner Perkins as a partner involved in cleantech investing.
When I first saw Gore talk about climate change at the TED conference in early 2006, everyone wanted to know what we could do about it. People are still struggling to answer that question, but it's clear that technology can play a large role: helping us to monitor and measure the rate of change in crucial environmental variables, creating feedback loops that change behavior at both macro levels (like carbon markets) and personal levels (like home energy monitoring); creating green data centers and low-power devices; creating new forms of renewable energy generation or storage, new materials that require less energy to create; alternative fuels and vehicles. The list goes on and on. (Reminder: we're looking for innovative "web meets world" startups for the Web 2.0 Summit Launchpad.)
Of course, global warming is far from the only "web meets world" theme that we're exploring. The conference will cover everything from the latest trends on the web (the rediscovery of e-commerce as a business model, cloud computing, social networking, mobile applications, and the inevitable platform wars) to politics, global disease detection, personal genomics, private space industry, and even military infotech. Speakers I'm particularly excited to see, in addition to Vice President Gore, include Tony Hsieh (@zappos, for those of you who see him continually on twitter), Elon Musk (who's got to have the coolest portfolio of investments since retiring from PayPal, with SpaceX, SolarCity, Tesla Motors all under his wing), and Michael Pollan, who's completely changed the way many of us think about food. Check out the confirmed speaker list, but keep in mind that there are more yet to come as John and I firm up the program.
tags: al gore, web 2.0, web2summit
Thu, Jul 31, 2008
Open Source and Cloud Computing
I've been worried for some years that the open source movement might fall prey to the problem that Kim Stanley Robinson so incisively captured in Green Mars: "History is a wave that moves through time slightly faster than we do." Innovators are left behind, as the world they've changed picks up on their ideas, runs with them, and takes them in unexpected directions.
In essays like The Open Source Paradigm Shift and What is Web 2.0?, I argued that the success of the internet as a non-proprietary platform built largely on commodity open source software could lead to a new kind of proprietary lock-in in the cloud. What good are free and open source licenses, all based on the act of software distribution, when software is no longer distributed but merely performed on the global network stage? How can we preserve freedom to innovate when the competitive advantage of online players comes from massive databases created via user contribution, which literally get better the more people use them, raising seemingly insuperable barriers to new competition?
I was heartened by the program at this year's Open Source Convention. Over the past couple of years, open source programs aimed at the Web 2.0 and cloud computing problem space have been proliferating, and I'm seeing clear signs that the values of open source are being reframed for the network era. Sessions like Beyond REST? Building Data Services with XMPP PubSub, Cloud Computing with BigData, Hypertable: An Open Source, High Performance, Scalable Database, Supporting the Open Web, and Processing Large Data with Hadoop and EC2 were all full. (Due to enforcement of fire regulations at the Portland Convention Center, many of them turned people away, since standing room was not allowed. Brian Aker's session on Drizzle was so popular that he gave it three times!)
But just "paying attention" to cloud computing isn't the point. The point is to rediscover what makes open source tick, but in the new context. It's important to recognize that open source has several key dimensions that contribute to its success:
- Licenses that permit and encourage redistribution, modification, and even forking;
- An architecture that enables programs to be used as components wherever possible, and extended rather than replaced to provide new functionality;
- Low barriers for new users to try the software;
- Low barriers for developers to build new applications and share them with the world.
This is far from a complete list, but it gives food for thought. As outlined above, I don't believe we've figured out what kinds of licenses will allow forking of Web 2.0 and cloud applications, especially because the lock-in provided by many of these applications comes from their data rather than their code. However, there are hopeful signs like Yahoo! Boss that companies are beginning to understand that in the era of the cloud, open source without open data is only half the application.
But even open data is fundamentally challenged by the idea of utility computing in the cloud. Jesse Vincent, the guy who's brought out some of the best hacker t-shirts ever (as well as RT) put it succinctly: "Web 2.0 is digital sharecropping." (Googling, I discover that Nick Carr seems to have coined this meme back in 2006!) If this is true of many Web 2.0 success stories, it's even more true of cloud computing as infrastructure. I'm ever mindful of Microsoft Windows Live VP Debra Chrapaty's dictum that "In the future, being a developer on someone's platform will mean being hosted on their infrastructure." The New York Times dubbed bandwidth providers OPEC 2.0. How much more will that become true of cloud computing platforms?
That's why I'm interested in peer-to-peer approaches to delivering internet applications. Jesse Vincent's talk, Prophet: Your Path Out of the Cloud describes a system for federated sync; Evan Prodromou's Open Source Microblogging describes identi.ca, a federated open source approach to lifestreaming applications.
We can talk all we like about open data and open services, but frankly, it's important to realize just how much of what is possible is dictated by the architecture of the systems we use. Ask yourself, for example, why the PC wound up with an ecosystem of binary freeware, while Unix wound up with an ecosystem of open source software? It wasn't just ideology; it was that the fragmented hardware architecture of Unix required source so users could compile the applications for their machine. Why did the WWW end up with hundreds of millions of independent information providers while centralized sites like AOL and MSN faltered?
Take note: All of the platform as a service plays, from Amazon's S3 and EC2 and Google's AppEngine to Salesforce's force.com -- not to mention Facebook's social networking platform -- have a lot more in common with AOL than they do with internet services as we've known them over the past decade and a half. Will we have to spend a decade backtracking from centralized approaches? The interoperable internet should be the platform, not any one vendor's private preserve. (Neil McAllister provides a look at just how one-sided most platform as a service contracts are.)
So here's my first piece of advice: if you care about open source for the cloud, build on services that are designed to be federated rather than centralized. Architecture trumps licensing any time.
But peer-to-peer architectures aren't as important as open standards and protocols. If services are required to interoperate, competition is preserved. Despite all of Microsoft's and Netscape's efforts to "own" the web during the browser wars, they failed because Apache held the line on open standards. This is why the Open Web Foundation, announced last week at OSCON, is putting an important stake in the ground. It's not just open source software for the web that we need, but open standards that will ensure that dominant players still have to play nice.
The "internet operating system" that I'm hoping to see evolve over the next few years will require developers to move away from thinking of their applications as endpoints, and more as re-usable components. For example, why does every application have to try to recreate its own social network? Shouldn't social networking be a system service?
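To make the "system service" idea concrete, here is a minimal sketch of what a shared social-graph service might look like if it were a reusable component rather than something every application rebuilds. All the class and method names here are invented for illustration; this is a toy in-memory stand-in, not any real service's API.

```python
# Hypothetical sketch: a social graph as a shared system service,
# so multiple applications reuse one network instead of each
# recreating its own. Names are invented for illustration.

class SocialGraphService:
    """Toy in-memory stand-in for a network-wide social graph API."""

    def __init__(self):
        self._following = {}  # user -> set of users they follow

    def follow(self, user, target):
        self._following.setdefault(user, set()).add(target)

    def following(self, user):
        return set(self._following.get(user, set()))

    def followers(self, user):
        return {u for u, targets in self._following.items() if user in targets}

# Two different "applications" sharing the same service:
graph = SocialGraphService()
graph.follow("alice", "bob")   # a photo-sharing app records a follow
graph.follow("carol", "bob")   # a microblogging app records another
print(graph.followers("bob"))  # both apps see the same relationships
```

The point of the sketch is the interface boundary: if the graph lives behind a service like this, a new application gets an existing social network for free instead of starting from zero.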
This isn't just a "moral" appeal, but strategic advice. The first provider to build a reasonably open, re-usable system service in any particular area is going to get the biggest uptake. Right now, there's a lot of focus on low level platform subsystems like storage and computation, but I continue to believe that many of the key subsystems in this evolving OS will be data subsystems, like identity, location, payment, product catalogs, music, etc. And eventually, these subsystems will need to be reasonably open and interoperable, so that a developer can build a data-intensive application without having to own all the data his application requires. This is what John Musser calls the programmable web.
Note that I said "reasonably open." Google Maps isn't open source by any means, but it was open enough (considerably more so than any preceding web mapping service) and so it became a key component of a whole generation of new applications that no longer needed to do their own mapping. A quick look at programmableweb.com shows Google Maps with about 90% share of mapping mashups. Google Maps is proprietary, but it is reusable. A key test of whether an API is open is whether it is used to enable services that are not hosted by the API provider, and are distributed across the web. Facebook's APIs enable applications on Facebook; Google Maps is a true programmable web subsystem.
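The mashup pattern above can be sketched in a few lines: an application delegates geocoding to an external mapping API instead of building its own. The endpoint URL and response shape below are invented for illustration, not Google's actual interface; the structure (build a request URL, parse a JSON response) is what a programmable-web subsystem looks like from the consumer's side.

```python
# Illustrative sketch of consuming a mapping API as a reusable
# subsystem. The endpoint and JSON shape are hypothetical.
import json
from urllib.parse import urlencode

API_BASE = "https://maps.example.com/geocode"  # hypothetical endpoint

def geocode_url(address, api_key):
    """Build the request URL a mashup would fetch for an address."""
    return API_BASE + "?" + urlencode({"q": address, "key": api_key})

def parse_geocode(body):
    """Extract (lat, lng) from a JSON response body."""
    doc = json.loads(body)
    point = doc["results"][0]["location"]
    return point["lat"], point["lng"]

# A canned response standing in for the actual network call:
sample = '{"results": [{"location": {"lat": 37.7749, "lng": -122.4194}}]}'
print(parse_geocode(sample))  # (37.7749, -122.4194)
```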
That being said, even though the cloud platforms themselves are mostly proprietary, the software stacks running on them are not. Thorsten von Eicken of Rightscale pointed out in his talk Scale Into the Cloud that almost all of the software stacks running on cloud computing platforms are open source, for the simple reason that proprietary software licenses have no provisions for cloud deployment. Even though open source licenses don't prevent lock-in by cloud providers, they do at least allow developers to deploy their work on the cloud.
In that context, it's important to recognize that even proprietary cloud computing provides one of the key benefits of open source: low barriers to entry. Derek Gottfried's Processing Large Data with Hadoop and EC2 talk was especially sweet in demonstrating this point. Derek described how, armed with a credit card, a sliver of permission, and his hacking skills, he was able to put the NY Times historical archive online for free access, ramping up from 4 instances to nearly 1,000. Open source is about enabling innovation and re-use, and at their best, Web 2.0 and cloud computing can be bent to serve those same aims.
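The programming model behind Derek's talk can be sketched in miniature. This is a toy, single-process rendition of the map/reduce pattern, not Hadoop's actual API: you write a mapper that turns each record into key/value pairs and a reducer that combines each key's values, and the framework's job (the part Hadoop scales across hundreds of EC2 instances) is the grouping and distribution in between.

```python
# Toy MapReduce in one process. Hadoop's value is running the same
# mapper/reducer pair over thousands of machines; the model is the same.
from collections import defaultdict

def map_phase(records, mapper):
    grouped = defaultdict(list)
    for record in records:
        for key, value in mapper(record):
            grouped[key].append(value)
    return grouped

def reduce_phase(grouped, reducer):
    return {key: reducer(key, values) for key, values in grouped.items()}

# Word count, the canonical example:
def mapper(line):
    return [(word, 1) for word in line.split()]

def reducer(word, counts):
    return sum(counts)

lines = ["open source in the cloud", "the cloud meets open data"]
counts = reduce_phase(map_phase(lines, mapper), reducer)
print(counts["the"], counts["open"], counts["cloud"])  # 2 2 2
```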
Yet another benefit of open source, try-before-you-buy viral marketing, is also possible for cloud application vendors. During one venture pitch, I was asking the company how they'd avoid the high sales costs typically associated with enterprise software. Open source has solved this problem by letting companies build a huge pipeline of free users, whom they can then upsell with follow-on services. The cloud answer isn't quite as good, but at least there's an answer: some number of application instances are free, and you charge after that. While this business model loses some virality, and transfers some costs from the end user to the application provider, it has a benefit that open source now lacks: a much stronger upgrade path to paid services. Only time will tell whether open source or cloud deployment is a better distribution vector, but it's clear that both are miles ahead of traditional proprietary software in this regard.
In short, we're a long way from having all the answers, but we're getting there. Despite all the possibilities for lock-in that we see with Web 2.0 and cloud computing, I believe that the benefits of openness and interoperability will eventually prevail, and we'll see a system made up of cooperating programs that aren't all owned by the same company: an internet platform that, like Linux on the commodity PC architecture, is assembled from the work of thousands. Those who are skeptical of the idea of the internet operating system argue that we're missing the kinds of control layers that characterize a true operating system. I like to remind them that much of the software that is today assembled into a Linux system already existed before Linus wrote the kernel. Like LA, famously "72 suburbs in search of a city," today's web is 72 subsystems in search of an operating system kernel. When we finally get that kernel, it had better be open source.
tags: cloud computing, open source, web 2.0
Wed, Jul 30, 2008
Suggestions for Web 2.0 Summit Charity Auction?
At this year's Web 2.0 Summit, we're holding a charity auction as part of our "web meets world" focus.
From the press release:
The Web 2.0 Summit team will solicit donations, and donation ideas, from individuals and companies within the community and then choose the 10 most promising and unique offerings to auction after the conference dinner. Lance Armstrong, the seven-time Tour de France winner and founder of the Lance Armstrong Foundation and livestrong.com, will donate an autographed bicycle that he signs on-stage during his interview with John Battelle. All proceeds from the event will benefit three charities, including witness.org, which uses video and online technologies to open the eyes of the world to human rights violations. Members of the Web community can contribute to the success of the Web Meets World auction by joining the Web 2.0 Summit Facebook community and suggesting which charities should benefit from the auction and what you would consider a priceless donation. Individuals or companies who would like to offer auction items should email auction@techweb.com.
We're looking for suggestions as well as donations. What might O'Reilly donate that would bring a big price for the target charities? For example, how much would you donate to have us organize a mini Foo Camp for a company, bringing together cool hackers in the company's area of interest? (Suggestions are best if you have some kind of angle on actually helping to make them happen.)
Feel free to leave suggestions in the comments as well as by email or Facebook, as outlined above.
For more details, see the Summit auction page.