Four short links: 25 May 2009
by Nat Torkington | comments: 3
- China is Logging On -- blogging 5x more popular in China than in USA, email 1/3 again as popular in USA as China. These figures are per-capita of Internet users, and make eye-opening reading. (via Glyn Moody)
- The Economics of Google (Wired) -- the money graf is Google even uses auctions for internal operations, like allocating servers among its various business units. Since moving a product's storage and computation to a new data center is disruptive, engineers often put it off. "I suggested we run an auction similar to what the airlines do when they oversell a flight. They keep offering bigger vouchers until enough customers give up their seats," Varian says. "In our case, we offer more machines in exchange for moving to new servers. One group might do it for 50 new ones, another for 100, and another won't move unless we give them 300. So we give them to the lowest bidder—they get their extra capacity, and we get computation shifted to the new data center."
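Varian's internal auction is essentially a reverse (procurement) auction: each team names the compensation it demands for moving, and the operator pays the cheapest. A minimal sketch of that mechanism, with hypothetical team names and the bid figures quoted in the article:

```python
def run_reverse_auction(bids):
    """Pick the team willing to migrate for the fewest extra machines.

    bids maps a team name to the number of new machines it demands
    in exchange for moving its workload to the new data center.
    """
    if not bids:
        raise ValueError("no bids submitted")
    winner = min(bids, key=bids.get)  # lowest demand wins
    return winner, bids[winner]

# Hypothetical bids matching the 50/100/300 figures in the quote.
bids = {"search": 50, "ads": 100, "mail": 300}
winner, cost = run_reverse_auction(bids)
print(winner, cost)  # search 50
```

The design choice is the interesting part: rather than mandating a migration schedule, the auction lets each group reveal its true cost of disruption, and capacity flows to whoever values staying put the least.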
- Why Washington Doesn't Get New Media -- Things eventually improved, but despite the stunning advances in communications technology, most of federal Washington has still failed to grasp the meaning of Government 2.0. Indeed, much is mired in Government 1.5. Government 1.5? That’s a term of art for the vast virtual ecosystem taking root in Washington that has set up the trappings of 2.0 — the blogs, the Facebook pages, the Twitter accounts — but lacks any intellectual heartbeat. Too many aides in official Washington are setting up blogs and social media pages because they understand that is what they are supposed to do. All the while, many are sweating the possibility that they might actually have to say something substantive or engage the public directly. It is the nature of midlevel know-nothings to grinfuck any idea that would force them to substantially change their behaviour. We incentivize this when we talk about "you must have a blog" (ok, I'll get comms to write it), or "put up a wiki for this" (ok, but there'll be no moderation so it'll be ignorable chaos). Describe the behaviour you want and not a tool that might produce it. (via timoreilly on Twitter)
- On the Information Armageddon (Mind Hacks) -- Vaughn points out that the much-linked-to New York Magazine article on attention is a crock. I didn't like it because it was wordy and self-indulgent, Vaughn because it didn't actually cite any studies other than one which was described incorrectly. History has taught us that we worry about widespread new technology and this is usually expressed in society in terms of its negative impact on our minds and social relationships. If you're really concerned about cognitive abilities, look after your cardiovascular health (eat well and exercise), cherish your relationships, stay mentally active and experience diverse and interesting things. All of which have been shown to maintain mental function, especially as we age.
tags: attention, brain, china, democracy, economics, google, government, internet, web
Local forums to implement high-speed networks (broadband): proposal open for votes
by Andy Oram | @praxagora | comments: 7
I've posted a proposal titled Local forums to implement high-speed networks (broadband) to a forum on open government put up by the White House. I ask this blog's readers to tell other people who might be interested, and vote up the proposal if you like it.
The Open Government Dialog site where this proposal appears is part of the White House's implementation of the Memorandum on Transparency and Open Government that Obama signed on his first day in office. Hundreds of ideas have already been posted. Many are very specific and some look quite worthy, but I think mine stands out for the reasons listed in my justification:
First, one of the Administration's major goals is to bring high-speed networking to every resident of the country.
Second, this goal is fundamental to the other goals in the Memorandum on Transparency and Open Government. Members of the public need continuous access to the Internet and the ability to handle video and sophisticated graphical displays in order to make full use of the resources provided in open government efforts.
The local community aspect is also crucial, for reasons I list in my justification.
Many readers will note that the people who need my proposal the most are the ones who have the most trouble participating in the forums--people who can't afford computers, who have only intermittent dial-up Internet access, etc. I deal with this ironic problem in the proposal in several ways (public terminals, face-to-face meetings, partnering with newspapers and television).
Because the formatting came out a mess, I'm reprinting the proposal below.
Local forums to implement high-speed networks (broadband)
Municipalities and regions undertaking projects in high-speed networking should be encouraged to create online forums that:
- Post regional maps showing the demographic features, geographical features, patterns of network use, and technological facilities relevant to the project
- Accept proposals, provide comment and rating systems, and run polls
- Provide public terminals and low-bandwidth versions of data, so that people who are currently on the disadvantaged side of the digital divide can offer input to help cross that divide
- Are supplemented by face-to-face gatherings
- Collaborate with newspapers and with television and radio news programs to publicize proposals, meetings, and opportunities for public comment
- Create visitor accounts, perhaps with validation procedures for determining local residence, and allow visitors to identify their expertise and credentials
- Provide tools for mapping proposed facilities and for calculating the reach, bandwidth, and costs of proposed facilities
- Provide data about ongoing deployments in standardized, open formats on maps and in downloadable form
The federal-level initiative can support these efforts by:
- Mandating the types of information that participating municipalities and companies should provide, such as the capabilities of current facilities, statistics on current usage, demographic information such as income and connectivity on a neighborhood basis, and detailed implementation plans with measurable milestones
- Funding the development of software tools, such as programs that can estimate the quality of wireless coverage for different terrains, or the time period required to recoup the costs of building out networks
- Providing formats and quality standards for the data provided
- Publicizing successful initiatives, the tools they used, and their best practices
Why Is This Idea Important?
High-speed digital networking (also known as "broadband") should concern open government advocates in two ways.
First, one of the Administration's major goals is to bring high-speed networking to every resident of the country.
Second, this goal is fundamental to the other goals in the Memorandum on Transparency and Open Government. Members of the public need continuous access to the Internet and the ability to handle video and sophisticated graphical displays in order to make full use of the resources provided in open government efforts.
Why do I stress the local nature of these forums?
All networking is (on one level) local. Given the limited resources available for any network deployment, and the trade-offs that must be made during planning, decision-makers need to take into account local demographics, geography, topology, social and economic priorities, and existing facilities. Here are just a few examples of the many local issues typically considered:
- Which neighborhoods are already relatively well served or poorly served
- Where it's cost-effective to string fiber, versus serving a neighborhood through a high-bandwidth wireless solution
- Whether there are existing facilities and lines that could be repurposed or upgraded for high-speed networking
- How much public funding to invest and which private firms to contract with to provide infrastructure or Internet service
- Whether a non-essential service, such as wireless for spots where tourists or businesspeople congregate, could generate enough new jobs or revenue to be worth an investment
- What public and private partners and sources of investment are available
- Whether people in potential new markets have the desire and education to use new network services, and how to create the conditions under which those populations would use the services
Innumerable issues like these require local knowledge and judgment. That is why many innovative and successful initiatives to provide digital networks have been launched by local governments or local private service providers.
Local collaboration to promote network penetration can also build bonds that support local communities in other ways. The global reach of the Internet has long been stressed, but the role of digital networks in connecting people within geographical communities and improving their way of life may be even more important and is beginning to be recognized.
Although complex, the issues are no more complex than many other issues being considered for implementation of the open government directive. With proper organization and support, community members could make these decisions and monitor their implementation.
Local community forums also attract participants more easily than geographically distributed "communities of interest." People are likely to respond to the invitations of friends and neighbors, and to be more loyal to the forums when they know the participants personally. So local forums are good ways to initiate the general public to the notion of transparent and participatory governance.
A note on current federal broadband initiatives
The American Recovery and Reinvestment Act (ARRA) includes a Broadband Technology Opportunities Program (BTOP), operated through the NTIA, that creates a 4.7-billion-dollar program to promote broadband, particularly for unserved areas and populations.
The implementation does not involve any of the innovative aspects of the open government directive, such as collaborative online forums and data exposed through open formats and APIs. Like other programs in ARRA, the focus on providing a fast economic stimulus led to a schedule that does not accommodate time to set up and accept comments in this manner. A public comment period was held from March 12 to April 13, 2009, and proposals must be submitted by September 2009.
The FCC adopted the goal of expanding broadband access many years ago, and cites this goal in many opinions concerning competition. The FCC also continues to offer funds for broadband under the Universal Service Fund (USF), which was expanded by the 1996 Telecommunications Act to include Internet access. The USF does not involve public online forums or open data access.
The FCC also plans to publish a national broadband plan by February 2010. Because the funds from BTOP will probably be disbursed by then, this plan could be a locus for the kinds of forums described in this proposal.
Quick disclaimer: broadband adoption is hard to measure--it depends on such fuzzy factors as the minimum speed defined as "broadband," the difference between potential and usual speeds, and uncertainties about actual availability versus official penetration rates--but recent estimates suggest that half of the United States population has always-on, high-speed network access. Although this reflects a substantial increase in recent years, it still leaves the US behind many other developed nations. Further improvements will require more intensive planning and careful resource allocation, as we try to extend adoption to populations with fewer resources or greater geographical challenges.
Summary of benefits
- When the public can evaluate the options available to their community and the trade-offs required, they can reach agreement on a digital networking policy that reflects the values of many constituents and communities.
- Tools for measuring the impacts of different proposals can help everyone in the community agree on what trade-offs exist, and provide a factual basis for decision-making.
- Technically trained members of the community can use the measurement and visualization tools on the forum to educate those who are less technically sophisticated and ensure that everyone has the opportunity to make valid and appropriate input.
- Progress in implementation can be followed by the public, who can demand accountability in spending and results.
- Collaboration in building local networks can lead to continued collaboration in using those networks to improve economic, educational, and policy initiatives in the communities. It can also give visitors the skills and interest to join larger, national efforts in fulfillment of the Memorandum on Transparency and Open Government.
- Standardization and information sharing between communities can help later communities reach successful conclusions more quickly and with less wasted effort.
- Finally, the public participation fostered by local forums can educate the public about telecommunications issues that have a national or even international scope, such as expanding major access points, fostering technological innovation, and changing national policies and laws.
Welcoming Eric Ries to the Radar Team
by Tim O'Reilly | @timoreilly | comments: 5
The Radar blog is a community of thinkers organized around the O’Reilly mission to change the world by spreading the knowledge of innovators. Some of the folks with posting privileges on Radar are O'Reilly employees: Brady Forrest organizes the ETech, Where 2.0, and Web 2.0 Expo events; Mike Loukides, Andy Oram, Brett McLaughlin, and Mike Hendrickson are editors of many of the books you know and love; Ben Lorica does data analysis in our research group; Andrew Savikas heads up our digital publishing efforts; Dale Dougherty is the publisher of Make:; and Sara Winge runs the Radar group and organizes our annual Foo Camp.
Others work part-time with us, such as our open source maven Alison Randal, who co-chairs the Open Source Convention, and “Master of Disaster” Jesse Robbins, who co-chairs the Velocity conference on large scale web operations. Some are alumni such as Nat Torkington and Marc Hedlund, who have gone on to other jobs but remain very much part of the O'Reilly family.
But others are interesting people we have met along the journey like Artur Bergman, Jim Stogdill, and Nick Bilton. These are people who've stimulated our thinking and helped us reflect on areas we want to learn about. In each case the goal is the same - talk about "Stuff That Matters" and generate meaningful conversation. With that in mind, I wanted to welcome Eric Ries to the Radar community.
I met Eric a few months ago, and immediately realized that he was someone I could learn a lot from, and whose ideas I wanted to spread as widely as possible. Eric has been championing the concept of The Lean Startup: a methodology that helps startups learn and adapt faster than the competition. Startups get lean through a mixture of agile development, leveraged product development, and direct, tight customer feedback loops. The result is a new type of company - one that uses operational excellence to drive down costs and accelerate learning.
Eric’s methodology has been honed by running successful startups (and learning from running unsuccessful ones) along with experience gathered through consulting, mentoring, and advising entrepreneurs. The Lean Startup is deeply prescriptive and practical; it is a vision for a new way to start, build and grow your company—starting on day one.
One of the things that excites me about the Lean Startup is that it doesn’t just apply to the traditional “two guys in a garage.” The questions that I have seen technology startups face time and again are increasingly relevant to institutions of all kinds: Who exactly is my customer? What exactly do they want? How do I deliver my product quickly and effectively at lower cost? Lessons learned in the crucible of entrepreneurship are applicable to enterprise and to government as both struggle to do more with less, to grow to reach new markets, and to innovate.
You will find Eric here occasionally on Radar as well as on his blog. Additionally, Eric has partnered with O’Reilly to produce a series of upcoming workshops intended to help people master the concepts of The Lean Startup.
Here is a video that Radar’s Joshua-Michéle Ross shot with Eric recently.
tags: ericries, startups, agile
Four short links: 22 May 2009
Villainous Javascript, Funding the Arts, Peak Web, and Crowdsourced Quality Control at a Museum
by Nat Torkington | comments: 0
- Hiding Dirty Deeds: "Encrypted" Client-Side Code -- obfuscated Javascript from a Facebook phishing site, deconstructed and reconstructed, parsed and glossed for understanding. It reminds me of the best obfuscated Perl: Latin, string substitution, runtime and compile-time semantics ... a work of evil art. (via waxy)
- Kickstarter -- artistic commercial version of PledgeBank. You say "I want to do [X] by Y and it takes $Z" and people can donate to your goal. (via waxpancake on Twitter)
- Peak Web (Chris Heathcote) -- My biggest problem is that people always perceive the near-past, present and near-future as having the most technological change, and the speed of decline of the old new media feels wrong. I am, however, thinking that there’s something true in one reading of the graph: we may be at or past Peak Web.
- Crowdsourcing the Cleanup with Freeze Tag -- The Awe-Worthy Brooklyn Museum, like all cultural institutions, has more objects than it can add metadata to. It lets users provide metadata through tagging, but all crowdsourcing projects attract vandals. Its solution: crowdsource the cleanup. My only question is whether this will become a game between vandals and janitors. The Brooklyn Museum is noteworthy for its insanely great use of the web; check them out and please support them if you like what you see.

Warning sign of peak web
tags: crowdsourcing, culture, javascript, money, programming, security, web
Social Science Moves from Academia to the Corporation
by Joshua-Michéle Ross | @jmichele | comments: 4
This is the latest of a series of posts addressing questions regarding social technologies. Previous posts: The Evangelist Fallacy, Captivity of the Commons and The Digital Panopticon. These topics will be opened to live discussion in an upcoming webcast on May 27 with a special guest to be announced.
In order to control a thing you must first classify a thing -- and we are seeing a massive classification of social behavior. While that classification falls under the guise of making life easier (targeted ads, locating a nearby pizza joint using your mobile), history tells us that we should be leery of motives and masters of our social data (see Captivity of the Commons).
Social sciences (behavioral psychology, sociology, organizational development), whose historical lack of data and scientific method left them open to ridicule from the “hard” sciences, finally have enough volume of data and analytics and processing power (see Big Data) to make “social” much more scientific. But this time social science is going to be coming to you not courtesy of Princeton, but courtesy of Google. Not through small studies on willing subjects, but through massive multivariate testing and optimization upon (largely) unknowing test subjects. The corporation, in other words, will hold the keys to social science at a level of precision only dreamed of by the academic and state institutions of yore.
This recent New York Times article highlights just how much social science, psychology, and personal data converge when a credit card company wants its debts repaid (via Andy Oram’s Radar post).
Should we be concerned about this shift from academia to the corporation?
I hold the current structure of government and corporations in equal regard in terms of how well they adhere to Google’s maxim, “Don’t be Evil.” So in some ways, I shouldn’t really be troubled that social science has moved from academia (which has often been a handmaiden of government) to the corporation (which really just wants to understand what moves you to click that “buy” button, or bump up your average order size by $10, etc.). Except...
Except if you believe that consumer culture is wreaking havoc upon the systems that support life and that the application of social science on behalf of the corporation is intended to simply turbo charge the status quo...
We find ourselves in 2009 facing deep, structural challenges -- peak oil, environmental degradation, climate change, and financial meltdown.
That's why the notion of social science in service of accelerating the existing system troubles me. Tim has spoken about the need to “Work on Stuff that Matters.” How might we apply social science toward "stuff that matters" instead of toward "buying more stuff that doesn't matter?"
tags: culture, social science, social web, technology
Time Lapse of Galactic Center of Milky Way rising over Texas Star Party
by Jesse Robbins | @jesserobbins | comments: 13
Galactic Center of Milky Way Rises over Texas Star Party from William Castleman.
According to William Castleman: The time-lapse sequence was taken with the simplest equipment that I brought to the star party. I put the Canon EOS-5D (AA screen modified to record hydrogen alpha at 656 nm) with an EF 15mm f/2.8 lens on a weighted tripod. Exposures were 20 seconds at f/2.8 ISO 1600 followed by 40 second interval. Exposures were controlled by an interval timer shutter release (Canon TC80N3). Power was provided by a Hutech EOS203 12v power adapter run off a 12v deep cycle battery. Large jpg files shot in custom white balance were batch processed in Photoshop (levels, curves, contrast, Noise Ninja noise reduction, resize) and assembled in Quicktime Pro. Editing/assembly was with Sony Vegas Movie Studio 9.
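Castleman's cadence numbers imply how long the final clip runs. Reading "20 seconds at f/2.8 ISO 1600 followed by 40 second interval" as one 20 s exposure plus a 40 s gap (one frame per minute; the exact cadence is my assumption), the arithmetic is simple:

```python
def timelapse_playback_seconds(shoot_hours, exposure_s=20, gap_s=40, fps=24):
    """Length of the assembled clip for a given shooting time and cadence.

    Assumes each frame consumes exposure_s + gap_s of wall-clock time,
    which is one reading of Castleman's "20 s exposure followed by a
    40 second interval" description; fps is the playback frame rate.
    """
    cadence_s = exposure_s + gap_s            # 60 s of real time per frame
    frames = shoot_hours * 3600 // cadence_s  # frames captured
    return frames / fps

# Three hours of shooting yields 180 frames: 7.5 s of video at 24 fps.
print(timelapse_playback_seconds(3))  # 7.5
```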
[via the Primary Tentacle @ Laughing Squid]
tags: astronomy, astrophotography, just plain cool, make, maker, photography, space
Ignite Show: @MSG on Some Images are REALer Than Others
by Brady Forrest | @brady | comments: 0
Snopes became a household word by debunking fake internet images -- like this wonderful shark attack. These images are the product of programs like Photoshop or Aviary. In this Ignite Show Michael Galpert shows us how to detect these fake images. For more reading on this check out eHow's article.
This work is licensed under a Creative Commons Attribution-Share Alike 3.0 United States License. Subscribe to the Ignite Show on iTunes.
tags: ignite, photoshop
Four short links: 21 May 2009
by Nat Torkington | comments: 0
- Us Now -- UK documentary, available streaming or on DVD, about how open government and digital democracy make sense. It's good to watch if you've not thought about how government could be positively changed by technology, but I don't think it's radical enough in the future it describes.
- It's Gonna Be The Future Soon -- great video for the Jonathan Coulton song that's the Radar theme song, my theme song, and probably works well as an anthem for most of us goofy future-loving freaks. Taken from the DVD of a live show. (via BoingBoing)
- Jetpack -- Mozilla Labs' new extension system. Mozilla Labs is building quite the assemblage of interesting hack tools, and it's interesting how significantly they're aimed at the developer and encouraging lots of add-ons and after-market extensions for the browser. I wonder whether this is a deliberate strategy ("community will beat off Chrome!") or whether it's a simple consequence of the fact that Mozilla is a developer organisation.
- Sci Bar Camp -- Science topics, Palo Alto, 7 July 2009.
tags: future, government, mozilla, open government, science
The Digital Panopticon
by Joshua-Michéle Ross | @jmichele | comments: 16
This post is part three of a series raising questions about the mass adoption of social technologies. Here are links to part one and two. These posts will be opened to live discussion in an upcoming webcast on May 27. (special guest to be announced shortly)
In 1785 utilitarian philosopher Jeremy Bentham proposed architectural plans for the Panopticon, a prison Bentham described as "a new mode of obtaining power of mind over mind, in a quantity hitherto without example." Its method was a circular grid of surveillance; the jailors housed in a central tower being provided a 360-degree view of the imprisoned. Prisoners would not be able to tell when a jailor was actually watching or not. The premise ran that under the possibility of total surveillance (you could be being observed at any moment of the waking day) the prisoners would self-regulate their behavior to conform to prison norms. The perverse genius of the Panopticon was that even the jailor existed within this grid of surveillance; he could be viewed at any time (without knowing) by a still higher authority within the central tower - so the circle was complete, the surveillance - and thus conformance to authority - total.
In 1811 the King refused to authorize the sale of land for the purpose, and Bentham was left frustrated in his vision to build the Panopticon. But the concept endured - not just as a literal architecture for controlling physical subjects (there are many Panopticons that now bear Bentham’s stamp) - but as a metaphor for understanding the function of power in modern times. French philosopher Michel Foucault dedicated a whole section of his book Discipline and Punish to the significance of the Panopticon. His take was essentially this: the same mechanism at work in the Panopticon - making subjects totally visible to authority - leads to those subjects internalizing the norms of power. In Foucault’s words, "the major effect of the Panopticon: to induce in the inmate a state of conscious and permanent visibility that assures the automatic functioning of power. So to arrange things that the surveillance is permanent in its effects, even if it is discontinuous in its action; that the perfection of power should tend to render its actual exercise unnecessary." In short, under the possibility of total surveillance the inmate becomes self-regulating.
The social technologies we see in use today are fundamentally panoptical - the architecture of participation is inherently an architecture of surveillance.
In the age of social networks we find ourselves coming under a vast grid of surveillance - of permanent visibility. The routine self-reporting of what we are doing, reading, thinking via status updates makes our every action and location visible to the crowd. This visibility has a normative effect on behavior (in other words we conform our behavior and/or our speech about that behavior when we know we are being observed).
In many cases we are opting into automated reporting structures (Google Latitude, Loopt, etc.) that detail our location at any given point in time. We are doing this in exchange for small conveniences (finding local sushi more quickly, gaining “ambient intimacy”) without ever considering the bargain that we are striking. In short, we are creating the ultimate Panopticon - with our data centrally housed in the cloud (see previous post on the Captivity of the Commons), our every movement and up-to-the-minute status is a matter of public record. In the same way that networked communications moved us from a one-to-many broadcast model to a many-to-many one, so we are seeing the move to a many-to-many surveillance model: a global community of voyeurs ceaselessly confessing to "What are you doing?" (Twitter) or "What's on your mind?" (Facebook).
Captivity of the Commons focused on the risks of corporate ownership of personal data. This post is concerned with how, as individuals, we have grown comfortable giving our information away; how our sense of privacy is changing under the small conveniences that disclosure brings; how our identity changes as an effect of constant self-disclosure. Many previous comments have rightly noted that privacy is often cultural -- if you don't expect it, there is no such thing as an infringement. Yet it is important to reckon with the changes we see occurring around us and decide what kind of culture we wish to create (or contribute to).
Jacques Ellul’s book, Propaganda, had a thesis that was at once startling and obvious: Propaganda’s end goal is not to change your mind at any one point in time - but to create a changeable mind. Thus when invoked at the necessary time - humans could be manipulated into action. In the U.S. this language was expressed by catchphrases like, “communism in our backyard,” “enemies of freedom” or the current manufactured hysteria about Obama as a “socialist”.
Similarly the significance of status updates and location based services may not lie in the individual disclosure but in the significance of a culture that has become accustomed to constant disclosure.
tags: identity, panopticon, social graph, social media, social web
Yahoo! Placemaker - Open Location, Open Data and Supporting Web Services
by Brady Forrest | @brady | comments: 5
Today at Where 2.0 Tyler Bell, the Head of Yahoo's Geo Technologies Group, launched Placemaker (this link should be live at posting). Placemaker is a webservice that takes in text and returns the locations found within, via either XML or enhanced GeoRSS. The locations Placemaker returns come in the form of WOEIDs (Radar post). You might be cautious about relying on Yahoo's ID system for your locations. To alleviate those fears, Yahoo! is announcing the release of GeoPlanet Data: all of the WOEIDs, available as a free download under a Creative Commons license in June. Woot!
Placemaker's geoparsing API will return WOEIDs and place names for all of the locations detected in the submitted text. This text can be structured or unstructured. If there are multiple locations detected, it will return a common ancestor called the Doc Scope. For example, if San Francisco and Los Angeles are in the text, then the Doc Scope will be "California". If San Francisco and Sacramento were in unstructured text, then the Doc Scope would return the colloquial term "Northern California". There are no explicit limits on the API as long as your usage is "nice" -- if it's not, you may find yourself shut off for a while.
Placemaker is an updated version of the geoparsing engine currently available through Yahoo! Pipes. This release rightfully makes geoparsing a stand-alone API. If you want to learn more about Placemaker Yahoo has posted the following instructions:
1. Read the online documentation at developer.yahoo.com/geo/placemaker/guide
2. Get an Application Id at developer.yahoo.com/wsregapp
3. POST your content to wherein.yahooapis.com/v1/document
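Once the POST to wherein.yahooapis.com/v1/document comes back, the client's job is to walk the XML and pull out the WOEID/name pairs. A minimal parsing sketch: the element names in the sample below (document, placeDetails, place, woeId, name) are my assumption about the response shape, not copied from the official schema, so check the documentation linked above before relying on them.

```python
import xml.etree.ElementTree as ET

# Illustrative response fragment only -- the element names here are an
# assumption about Placemaker's XML, not the official schema.
SAMPLE = """\
<document>
  <placeDetails>
    <place><woeId>2487956</woeId><name>San Francisco, CA, US</name></place>
  </placeDetails>
  <placeDetails>
    <place><woeId>2442047</woeId><name>Los Angeles, CA, US</name></place>
  </placeDetails>
</document>"""

def extract_woeids(xml_text):
    """Return (woeid, name) pairs for every place in a response document."""
    root = ET.fromstring(xml_text)
    return [(p.findtext("woeId"), p.findtext("name"))
            for p in root.iter("place")]

print(extract_woeids(SAMPLE))
# [('2487956', 'San Francisco, CA, US'), ('2442047', 'Los Angeles, CA, US')]
```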
The WOEIDs will be made available under the CC-Attribute license. It will ultimately include over 5 million entities in multiple languages. Relationships between the entities will be included.
Until now, Geonames IDs have been used as place IDs by many apps. All of Geonames' data is freely available for download, which made it tough for Yahoo to compete with this open data solution. Today's release and announcement really ups the game. By making the data freely available, developers will no longer have much fear about using it. WOEIDs were first released as a webservice a year ago. I expect the free release of the WOEID data to greatly increase the uptake of these supporting webservices and make Yahoo an integral part of mapping mashups.
Yahoo! has more info over on their Geo Technologies Blog.
tags: geo, where 2.0
Google Launches Maps Data API
by Brady Forrest | @brady | comments: 1
The crowd at Where 2.0 was expecting an API announcement and Google delivered one. Lior Ron and Steve Lee announced their Maps Data API, a service for hosting geodata. As they describe it on the site:
What is it?
The Google Maps Data API allows client applications to view, store and update map data in the form of Google Data API feeds using a data model of features (placemarks, lines and shapes) and maps (collections of features).
Why Use the Google Maps Data API?
- Storage scales simply with usage. You shouldn't have to worry about maintaining a data store to build a cool Google Maps mashup. Focus on building the client, and we'll provide hosting and bandwidth for free.
- Geodata is accessible across platforms and devices. With many client libraries and clients, accessing stored geodata is possible from anywhere, whether it's on the web, a mobile phone, a 3D application, or even a command line.
- Realtime geodata requires realtime indexing. For a lot of geographic content, freshness is important. Geodata from the Google Maps Data API can be instantly indexed and made searchable in Google Maps.
- Rendering geodata is better and faster with the right tools. Through JavaScript, Flash, 3D, static images and more, we'll continue to provide better ways to render your content to meet platform and latency demands.
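Since this is a Google Data API, working with it means POSTing Atom entries to a GData feed. The sketch below builds an entry describing a new map (a collection of features); the feed URL and entry shape follow the conventions of other Google Data APIs, so treat both as illustrative assumptions and check Google's Maps Data API documentation for the real endpoint and schema.

```python
from xml.sax.saxutils import escape

# Illustrative feed URL following the usual GData pattern; the
# authoritative endpoint is in the Maps Data API documentation.
MAPS_FEED = "http://maps.google.com/maps/feeds/maps/default/full"

def map_entry(title, summary):
    """Build an Atom entry describing a new map.

    Creating the map would then be an authenticated HTTP POST of this
    entry to MAPS_FEED, per the standard Google Data API protocol;
    features (placemarks, lines, shapes) are added to the new map's
    own feed afterwards.
    """
    return (
        '<entry xmlns="http://www.w3.org/2005/Atom">'
        f"<title>{escape(title)}</title>"
        f"<summary>{escape(summary)}</summary>"
        "</entry>"
    )
```

The appeal of the GData approach is that any existing Google Data client library can speak to it; you get authentication, feeds, and CRUD semantics for free.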
Google is launching with some sample apps:
- My Maps Editor for Android allows users to create and edit personalized maps from an Android mobile phone. Integration with the phone's location and camera makes it easy to document a trip with photos and text on a map.
- ConnectorLocal is a service that informs users about the places where they live, work and visit by gathering trusted hyperlocal information from many sources. Using the Google Maps Data API, ConnectorLocal makes it easy for users to import and export geodata in and out of Google Maps, and also improves their ability to have data indexed in Google Maps for searching.
- My Tracks enables Android mobile phone users to record GPS tracks and view live statistics while jogging, biking, or participating in other outdoor activities. Stored with Google Maps Data API, these tracks can be accessed, edited and shared using the My Maps feature in Google Maps.
- Platial, a social mapping service for people and places, uses the Google Maps API to host geodata for community maps on both Platial and Frappr.
Geodata can get very large very quickly, and serving it can get expensive. This Data API will help NGOs, non-profits and developers make their data available without breaking the bank. Google's goals for doing this are obvious: if the data is on their servers, they can index it more easily and make it readily available to their users. There will be concern that Google ends up with too much of everyone's data, but as long as Google does not block other search engines and allows developers to remove their data, I think this will be a non-issue.
The crowd was also hoping for a formal Latitude API to be announced (Google hinted at one at the beginning of May). When I asked Lior and Steve about it, we got some smiles. I think we'll see more movement in this area, but not *just* yet.
| comments: 1
Four short links: 20 May 2009
Cognitive Surplus, Data Centers=Mainframes, Django Microframework, and a Visit To The Future
by Nat Torkington | comments: 1
- Distributed Proofreaders Celebrates 15000th Title Posted To Project Gutenberg -- a great use of our collective intelligence and cognitive surplus. If I say one more Clay Shirkyism, someone's gonna call BINGO. (via timoreilly on Twitter)
- Datacenter is the New Mainframe (Greg Linden) -- wrapup of a Google paper that looks at datacenters in the terms of mainframes: time-sharing, scheduling, renting compute cycles, etc. I love the subtitle, "An Introduction to the Design of Warehouse-Scale Machines".
- djng, a Django powered microframework -- update from Simon Willison about the new take on Django he's building. Microframeworks let you build an entire web application in a single file, usually with only one import statement. They are becoming increasingly popular for building small, self-contained applications that perform only one task—Service Oriented Architecture reborn as a combination of the Unix development philosophy and RESTful API design. I first saw this idea expressed in code by Anders Pearson and Ian Bicking back in 2005.
- Cute! (Dan Meyer) -- photo from Dan Meyer's classroom showing normal highschool students doing something that I assumed only geeks at conferences did. I love living in the future for all the little surprises like this.
[Figure: Approximate distribution of peak power usage by hardware subsystem in one of Google's datacenters (circa 2007)]
tags: book related, datacenter, django, education, future, open source, programming
| comments: 1