James Turner

James Turner, contributing editor for oreilly.com, is a freelance journalist who has written for publications as diverse as the Christian Science Monitor, Processor, LinuxWorld Magazine, Developer.com, and WIRED Magazine. In addition to his shorter writing, he has written two books on Java web development ("MySQL & JSP Web Applications" and "Struts: Kick Start"). He is the former Senior Editor of LinuxWorld Magazine and Senior Contributing Editor for Linux Today. He has also spent more than 25 years as a software engineer and system administrator, and currently works as a Senior Software Engineer for a company in the Boston area. His past employers include the MIT Artificial Intelligence Laboratory, Xerox AI Systems, Solbourne Computer, Interleaf, and the Christian Science Monitor, along with contracting positions at BBN and Fidelity Investments. He is a committer on the Apache Jakarta Struts project and served as the Struts 1.1B3 release manager. He lives in a 200-year-old Colonial farmhouse in Derry, N.H., with his wife and son. He is an open-water diver and instrument-rated private pilot, as well as an avid science fiction fan.
Tue, Jun 22, 2010
Does the world need another programming language?
Rob Pike on how and why Google's new Go language was developed.
by James Turner | comments: 24
Rob Pike has certainly been places and done things. In the early 1980s, he worked with Brian Kernighan and Ken Thompson at Bell Labs, where he co-wrote "The Unix Programming Environment" with Kernighan and co-developed the UTF-8 character encoding standard with Thompson. Pike is now a principal engineer at Google, where he's co-developed Go, a new programming language. Pike, who will discuss Go at next month's OSCON convention, talks about Go's development and the current state of programming languages in the following interview.
What were the motivations for creating Go?
Rob Pike: A couple of years ago, several of us at Google became a little frustrated with the software development process, and particularly using C++ to write large server software. We found that the binaries tended to be much too big. They took too long to compile. And the language itself, which is pretty much the main system software language in the world right now, is a very old language. A lot of the ideas and changes in hardware that have come about in the last couple of decades haven't had a chance to influence C++. So we sat down with a clean sheet of paper and tried to design a language that would solve the problems that we have: we need to build software quickly, have it run well on modern multi-core hardware and in a network environment, and be a pleasure to use.
Although we targeted Go for a particular kind of problem, it turned out to be a much more general and adaptable programming language than we had thought. So we're using it for a lot of different things now. I think it might have an interesting future in any number of directions.
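To make "run well on modern multi-core hardware" concrete, here is a minimal sketch (ours, not Pike's) of the style Go encourages: fan work out to lightweight goroutines and collect the partial results over a channel. The worker example is invented for illustration.

package main

import (
    "fmt"
    "runtime"
)

// Sum the numbers in each chunk concurrently, one goroutine per chunk,
// collecting the partial sums over a channel.
func main() {
    runtime.GOMAXPROCS(runtime.NumCPU()) // use all cores (later Go versions made this the default)

    nums := []int{3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5, 8}
    chunks := [][]int{nums[:4], nums[4:8], nums[8:]}

    results := make(chan int)
    for _, c := range chunks {
        go func(c []int) { // each chunk is summed in its own goroutine
            s := 0
            for _, n := range c {
                s += n
            }
            results <- s
        }(c)
    }

    total := 0
    for range chunks { // gather one partial result per chunk
        total += <-results
    }
    fmt.Println("total:", total) // total: 52
}

The same fan-out/fan-in shape scales from toy sums to the large server software Pike describes.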
tags: emerging languages, google books, oscon2010
Thu, May 13, 2010
When it comes to new media, the Smithsonian is all in
Michael Edson on how the Smithsonian uses crowdsourcing and transparency to further its mission
by James Turner | comments: 4
The Smithsonian Institution epitomizes the phrase "an embarrassment of riches." With 137 million physical objects in its collection, and 28 distinct museums and research centers, you could spend the rest of your life there and not see everything.
Michael Edson, who serves as director of web and new media strategy for the Smithsonian, got his start cleaning cases in one of the art museums. He now oversees the Institution's online presence, which he talks about in the following interview. He'll expand on many of these same topics at the upcoming Gov 2.0 Expo.
Where the Smithsonian's online content comes from:
Michael Edson: Each museum collection -- each sub-collection, the zoo, the portrait gallery, the natural history museum -- really functions as its own world. And 99 percent of the content creation happens deep within the Institution's 28 separate research and collection units. It's a real innovation-at-the-edge environment. We argue in our web and new media strategy that so much great work can happen in those edge environments, where you've got subject matter experts, collections, the public, and a little bit of technology expertise really close together. That's the hothouse where these great things grow.
Pretty much any model of web and new media production you can think of is being done somewhere in the Smithsonian. We have 6,000 employees, 137 million physical objects, and incredibly brilliant and talented people. So we do outsourcing. We do in-sourcing. We do lightweight development. We do great big monolithic development. The trick now is establishing a framework -- we call it a Commons -- where these edge innovators can have the basic tools they need to be successful.
tags: gov 2.0, gov 20, museums
Mon, May 3, 2010
The spy who came in from the code
Carmen Medina talks about tech, the CIA, and why government agencies don't play well with others
by James Turner | comments: 15
If you were going to pick an adjective to describe the Central Intelligence Agency, "open" wouldn't immediately spring to mind. But according to Carmen Medina, who recently retired from the CIA and will speak at Gov 2.0 Expo, openness is just what the agency needs.
Medina's Role at the CIA:
Carmen Medina: I just retired after 32 years at the CIA. I spent 25 years as a manager of analysts. In the middle part of this decade, I was sort of the number two in charge of analysis, and I ended up in charge of the Center for the Study of Intelligence, which is kind of like the Agency's think tank and lessons-learned center. During my career, I was a bit of a heretic in the organization, though a successful one I guess, in that I always questioned how things were done. From the beginning, I was really interested in how information technology and the Internet had the potential to change the way we did our business. So back in the late '90s, I was pushing hard to get all of our work online, even though a lot of people in the agency were skeptical about it.
Social media and extreme views:
CM: What the Internet allows, if you're an individual who holds an extreme view, is the ability to broadcast that view from the privacy of your den. You can get information to support your view without having to go to any unusual places that would attract suspicion. You can find other people who hold the same views that you do. You're able to hide in plain sight, basically, while you're doing that. So while I'm a strong believer in the Internet and social networking, like everything else that's happened in human history, they also offer a lot of potential to people who are not well-intentioned.
tags: cia, gov2.0, gov20
Tue, Apr 27, 2010
The military goes social
Letters from the front have been replaced with Facebook updates
by James Turner | comments: 3
For most of the 20th century, a soldier in the field could only communicate with his or her family and friends via letters that might take weeks or months to reach the recipient. But as the battlefield goes high tech, so have the ways soldiers can talk to the outside world.
Managing how social media interacts with the military is the job of Price Floyd, Principal Deputy Assistant Secretary of Defense for Public Affairs. Floyd, a speaker at O'Reilly's upcoming Gov 2.0 Expo, discusses how the public face of the military is changing in the following interview.
The role of the Public Affairs office:
Price Floyd: We're responsible not just for external but also internal communication. Depending on how you count it, there are about 2.5 million members of the Defense Department at large: all of the civilian employees, contractors, and the men and women in service. If you count those who have retired and their families, it comes to about 10 million. Then there are the external audiences, which could be U.S. and foreign-based.
How social media is being used by the Defense Department:
PF: At the Defense Department, what we have done is embrace social media, and the technology behind it, to engage with all our audiences. That's everything from veterans' groups to foreign publics to people who follow me on Twitter. And it's a two-way engagement. The idea that social media is simply a better way to reach a broader audience with our message -- that kind of one-way communication -- is not what we want to do. We want to engage with our audience, all of them, on the whole host of issues and policies that we deal with.
Does social media mean losing control of the message?
PF: I think that we need to become much more comfortable with taking risk, much more comfortable with having multiple spokesmen out there -- thousands of spokesmen, in essence. But, for me, there's nothing more credible than having the men and women who are out there on the front lines, fighting the wars we're in, send messages back to their family and friends. As you know, when you send a tweet or make a post on Facebook, it doesn't necessarily stay there. It could be forwarded around. Other people you never thought would see it will see it, even the media. And I'm okay with that. I'm okay with us no longer controlling exactly what people say to the media and then trying to work with the media to make sure they get their story exactly the way we may want it.
tags: department of defense, gov2.0, interviews, military, skype, social media
Mon, Apr 12, 2010
Citizens as public sensors
The co-founder of SeeClickFix on how crowdsourcing can help local government
by James Turner | comments: 24
When people talk about the effects of Gov 2.0, the discussion tends to center on transparency and making data available to the general public. But information can flow in both directions.
SeeClickFix believes citizens may have as much to offer local government as government has to offer the people. By letting the man (literally) on the street report issues to city or town departments, and by making those issues trackable, the service shifts some of the management burden to the people most affected. Ben Berkowitz, one of the co-founders, talked to us about how the program is working.
How SeeClickFix came to be:
Ben Berkowitz: It started a little over two years ago when I was dealing with graffiti on my neighbor's building. After deciding that it was hopeless dealing with my neighbor, I went to call city hall. I left something like five different messages with the department that I presumed was responsible. At some point, while waiting on hold and getting different answers about what I could do, I started thinking: "I bet a lot of my neighbors have complained about similar issues and have had a lot of trouble, and it would be nice to know what their experiences were when attempting to contact city hall."
So we sat down on a Sunday night and said, "We're going to give ourselves four hours to try to come up with something." If, at the end of those four hours, we were happy and liked the tool, we'd keep working on it. If not, we'd can it. At the end of four hours, we had a little Google Map where you could post issues, and that was it. We showed it off to friends. They thought it was cool. And the rest is kind of history.
How cities respond to SeeClickFix:
BB: The first city was New Haven, Conn., and the mayor and the chief administrative officer were both very receptive. So receptive that the mayor wrote a letter to about 100 other mayors around the country. The majority of responses since have been really positive. You get a few where they'll say, "Oh, but we already have a website where people can report issues." And, of course, our response is, "Yes, you do. But that website does not display issues publicly when you post them."
We have a ton of features that go beyond standard city websites, and that helps move the ball forward in terms of acceptance of public, transparent, collective reporting. But in the beginning, really the only one-up we had on a city website was that we were a map-based, transparent web reporting tool, while they usually offered just a closed web form that was no better than a phone call. You still had the same black-box syndrome.
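The contrast Berkowitz draws is essentially a data-model choice: a report is a public, trackable record rather than a ticket in a black box. A hypothetical Go sketch of that model -- the Issue type and its fields are invented for illustration, not SeeClickFix's actual code:

package main

import "fmt"

// Issue is a hypothetical public report: unlike a closed web form,
// every field is visible and trackable by neighbors and city hall alike.
type Issue struct {
    ID       int
    Desc     string
    Lat, Lng float64
    Status   string // "open", "acknowledged", "closed"
    Votes    int    // neighbors who reported or confirmed the same problem
}

func main() {
    board := []Issue{
        {1, "Graffiti on Orange St building", 41.31, -72.92, "open", 5},
        {2, "Pothole at State and Chapel", 41.30, -72.93, "acknowledged", 12},
    }
    // Anyone can read the whole board; there is no black box.
    for _, i := range board {
        fmt.Printf("#%d [%s, %d votes] %s\n", i.ID, i.Status, i.Votes, i.Desc)
    }
}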
tags: cities, citizen engagement, gov2.0, interviews
Thu, Apr 8, 2010
Brian Aker on post-Oracle MySQL
A deep look at Oracle's motivations and MySQL's future
by James Turner | comments: 1
Brian Aker parted ways with the mainstream MySQL release, and with Sun Microsystems, when Sun was acquired by Oracle. These days, Aker is working on Drizzle, one of several MySQL offshoot projects. In time for next week's MySQL Conference & Expo, Aker discussed a number of topics with us, including Oracle's motivations for buying Sun and the rise of NoSQL.
The key to the Sun acquisition? Hardware:
Brian Aker: I have my opinions, and they're based on what I see happening in the market. IBM has been moving their P Series systems into datacenter after datacenter, replacing Sun-based hardware. I believe that Oracle saw this and asked themselves "What is the next thing that IBM is going to do?" That's easy. IBM is going to start pushing DB2 and the rest of their software stack into those environments. Now whether or not they'll be successful, I don't know. I suspect once Oracle reflected on their own need for hardware to scale up on, they saw a need to dive into the hardware business. I'm betting that they looked at Apple's margins on hardware, and saw potential in doing the same with Sun's hardware business. I'm sure everything else Sun owned looked nice and scrumptious, but Oracle bought Sun for the hardware.
The relationship between Oracle and the MySQL Community:
BA: I think Oracle is still figuring things out as far as what they've acquired and who they've got. All of the interfacing I've done with them so far has been pretty friendly. In the world of Drizzle, we still make use of the InnoDB plugin, though we are transitioning to the embedded version. Everything there has gone along swimmingly. In the MySQL ecosystem you have MariaDB and the other distributions. They're doing the same thing that Ubuntu did for Debian: taking something that's there and creating a different sort of product around it. Essentially, though, it's still exactly the same product. I think some patches are flowing from MariaDB back into MySQL, or at least I've seen some notice of that. So for the moment it looks like everything's as friendly as it is going to be.
tags: databases, geodata, geolocation, interviews, mysql, nosql, operations, oracle, sun
Tue, Mar 23, 2010
Joe Stump on data, APIs, and why location is up for grabs
The SimpleGeo CTO and former Digg architect discusses NoSQL and location's future
by James Turner | comments: 6
I recently had a long conversation with Joe Stump, CTO of SimpleGeo, about location, geodata, and the NoSQL movement. Stump, who was formerly lead architect at Digg, had a lot to say. Highlights are posted below. You can find a transcript of the full interview here.
Competition in the geodata industry:
I personally haven't seen anybody that has come out and said, "We're actively indexing millions of points of data. We're also offering storage, and we're giving you tools to leverage that." I've seen a lot of fragmentation. Where SimpleGeo fits is, I really think, at the crossroads or the nexus of a lot of people who are trying to figure out this space. ESRI is a perfect example. They have a lot of data. Their stack is enormous. They answer everything from logistics down to POI things, but they haven't figured out the whole cloud, web, infrastructure, turn-key approach. They definitely haven't had to worry about real time. How do you index every single tweet and every single Twitter photo without blowing up?
With the data providers, there are a couple of people coming out with APIs. I think largely, things are up for grabs. One issue I see, as people come out with their location APIs -- NAVTEQ is coming out with an API, for instance -- is that as a developer, in order to do location queries and whatnot, especially on a mobile device, I don't want to have to do five different queries. Those round trips add up quickly when you're on high-latency, slow networks. So while there are a lot of people orbiting the space, and I know a lot of people are going to come out with location and geo offerings, a lot of them are still figuring out how they're going to use location and how they're going to expose it.
tags: digg, interviews, joe stump, location, mysql, nosql, sql
Thu, Mar 11, 2010
Personalization and the future of Digg
A recommendation model could quell competition for Digg's front page
by James Turner | comments: 7
I recently talked to Joe Stump, CTO of SimpleGeo, about a number of topics related to location and databases. In the course of the interview, we also got around to discussing Digg. Previous to launching SimpleGeo, Stump was the lead architect at Digg, and he has a lot of insight into where the site is heading. We'll be running the rest of the interview soon, but what Stump told me about Digg got me thinking.
We've all heard about citizen journalism. Digg, in principle, is citizen editing. One of the primary jobs of an editor is to decide what's important, and what's dross. Digg uses crowdsourcing to determine what rises above the noise to move onto the topic and main front pages. Being "dugg" can make or break a geek-related news story.
But the general consensus these days is that Digg has been gamed into uselessness. Gangs of Digg assassins work to vote down stories on topics they dislike. Mac folks slam Windows stories. Windows guys sink Linux stories. Cabals of content creators collaborate to vote each other's work onto the front page. It's like a guerrilla war for control of the front page of your local newspaper.
According to Stump, Digg is aware of this and plans to address it in an upcoming rework of the site. "Digg has always aggressively pursued ways of evening the field and avoiding those turf wars," Stump said. "I think the next iteration is going to do a lot to answer that problem, because really you need to build consensus."
What Stump believes is going to happen is that Digg is going to move to more of a recommendation model than a universal up/down voting system:
I think the way that Digg is going to answer it, and the way that the internal thoughts were when I was there, is that rather than allowing a small group to yank a story from the bigger group, that we give people better tools so they don't even see those stories to begin with. If all you're going to do is bury Palin stories or Obama stories, maybe the answer isn't to figure out that you're a Palin or an Obama hater and to not count your buries, but maybe the better answer is to give you a tool where you can say, 'Screw that Obama guy. I don't want to see anything about him.' Or, 'Screw Palin. I don't want to see anything about her.' I think that at some point in the future, you'll probably see where those negative votes carry a much more personal connotation as opposed to a group connotation.
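In data terms, Stump is describing a shift from one global score to per-user filters. A hypothetical Go sketch of that separation -- the Story and User types and the frontPage function are invented for illustration, not Digg's actual code:

package main

import "fmt"

// Story is a hypothetical Digg-style submission.
type Story struct {
    Title string
    Topic string
    Score int // global up/down votes still exist
}

// User carries personal preferences instead of contributing global buries.
type User struct {
    Name   string
    Hidden map[string]bool // topics this user never wants to see
}

// frontPage filters the shared pool per user: a "bury" becomes a personal
// hide-this-topic rule rather than a vote that sinks the story for everyone.
func frontPage(pool []Story, u User) []Story {
    var page []Story
    for _, s := range pool {
        if !u.Hidden[s.Topic] {
            page = append(page, s)
        }
    }
    return page
}

func main() {
    pool := []Story{
        {"Kernel release roundup", "linux", 812},
        {"Palin speech transcript", "politics", 455},
        {"Obama signs tech bill", "politics", 390},
    }
    u := User{Name: "alice", Hidden: map[string]bool{"politics": true}}
    for _, s := range frontPage(pool, u) {
        fmt.Println(s.Title) // prints only "Kernel release roundup"
    }
}

The key consequence is the one Stump names next: with per-user pages, there is no single front page left to fight over.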
Delivering more personalized news will probably be great for Digg users. But without a single front page to vie for, it will also blunt the power Digg has to make or break stories.
"That's fed a lot of the problems that Digg's had up to this point," said Stump. "If there's not one unified front page, and everybody's front page is different based on their behavior and their interests and their niche categories, it removes a lot of the incentive to get something on the front page. Because you really don't know whose front page you're on once you get promoted."
tags: digg, interviews, new media, web 2.0
Wed, Mar 10, 2010
Gov 2.0 invades Harvard: A report from #gov20ne
Politicians, advocates, techies and citizens discuss open government
by James Turner | comments: 2
Last Saturday (March 6), several hundred folks gathered at the Harvard Kennedy School of Government to spend the day discussing open government. O'Reilly's own Laurel Ruma was one of the organizers, and she sends in this report:
To geeks, bar camps are nothing new. But we're seeing a surge in civic-minded camps, including Transparency Camp, Participation Camp, Change Camps in Canada, City Camp, Congress Camp, and the ongoing Crisis Camps. One overarching topic encompasses all of these granular subjects: Gov 2.0, and the camp scene has Gov 2.0 Camps of its own. Started in Washington, D.C., last year by the Government 2.0 Club, Gov 2.0 Camps are popping up across the country. These camps are free and open to the public, and they're helping to connect citizens to government officials in a casual and engaging format. Organized by groups of community members from across many fields and experiences, each camp is unique to its geographic area.
Gov 2.0 Camp D.C. hosted close to 500 engaged campers, including policy and government folks, technologists, nonprofits, and government contractors. Gov 2.0 Camp Los Angeles was held in downtown L.A. last month with a focus on "work[ing] to make 'Gov 2.0' more accessible to the public, share advice, and solve common problems." Now, we've just held Gov 2.0 Camp New England this past weekend at the Harvard Kennedy School, and we're just hearing about Denver's own Gov 2.0 Camp Rocky Mountains coming in June.
Gov 2.0 Camp New England was brainstormed late one night, as many good ideas are, by Yasmin Fodil (a master's student at the Harvard Kennedy School), Sarah Bourne (Mass.Gov technology strategist), and yours truly, Laurel Ruma (Gov 2.0 Evangelist for O'Reilly Media). We pulled in our friends Rob Goodspeed, who's finishing up a PhD in urban planning at MIT, and Jess Weiss, who works for Mass.Gov as a project and social media coordinator.
tags: barcamp, gov 2.0, open government
Tue, Feb 23, 2010
When it Comes to Tweets, the Key is Location, Location, Location!
Raffi Krikorian works to make geotagging tweets fast and efficient
by James Turner | comments: 0
When you only have 140 characters to get your message across, you have to depend a lot on context. For Twitter, a big part of that context has become location. Knowing where someone is tweeting from can add a lot of value to the experience, and it's Raffi Krikorian's job to integrate location into Twitter. Raffi will be talking about this and other location-related topics at the upcoming Where 2.0 conference. We began by asking him how Twitter determines location, and whether it will always be an opt-in option.
Raffi Krikorian: I think part of it is based on the philosophy of Twitter itself. We only publish information that you've explicitly given to us on a tweet-by-tweet basis. So for location on your tweets, it's all opt-in. You have to give us that location information, and we'll put it out. There are other things we do behind the scenes, like our local trends information, that don't actually tie to an individual person. We might do some IP look-ups. We look at your user location field. But anything that's tied to an individual is all opt-in.
James Turner: 140 characters is a restriction that Twitter's famous for. Location is fairly high bandwidth information. Have you considered carrying location data out of band from the 140 characters?
Raffi Krikorian: We do that right now. Originally, when people wanted to tweet their location, they put a URL in the text field that linked to a map or to a service that might show where they were. But ever since we launched our geotagging API in November, we store the latitude and longitude for your tweet out of band. It's completely metadata on top of the tweet. A bunch of clients, such as Tweetie and Seesmic Web, implement it; they can read that metadata and will show you either a map or attempt a reverse geocode and give you an actual place name.
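To make "out of band" concrete: a geotagged tweet carries its coordinates as structured metadata alongside the 140-character text. A rough Go sketch of reading that metadata follows; the geo object mirrors the GeoJSON-style shape of the era, but the exact field names here are illustrative rather than a guaranteed schema.

package main

import (
    "encoding/json"
    "fmt"
)

// Point mirrors the GeoJSON-style "geo" object attached to a geotagged
// tweet (field names are illustrative of the 2009-era API).
type Point struct {
    Type        string     `json:"type"`
    Coordinates [2]float64 `json:"coordinates"` // latitude, longitude
}

// Tweet shows how location rides alongside the text, not inside it.
type Tweet struct {
    Text string `json:"text"`
    Geo  *Point `json:"geo"` // nil when the user hasn't opted in
}

func main() {
    raw := []byte(`{
        "text": "Great show tonight!",
        "geo": {"type": "Point", "coordinates": [42.89, -71.33]}
    }`)

    var t Tweet
    if err := json.Unmarshal(raw, &t); err != nil {
        panic(err)
    }
    if t.Geo != nil { // a client would map or reverse-geocode this
        fmt.Printf("%q was sent from lat %.2f, long %.2f\n",
            t.Text, t.Geo.Coordinates[0], t.Geo.Coordinates[1])
    }
}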
James Turner: What value do you see location bringing to social networking? Usually, if someone is talking about a location, it's explicit in the message or in the blog, "I'm at so-and-so and the show is really nice tonight." If you imagine that people are pervasively providing their geolocation, how does that aid social networking?
Raffi Krikorian: I think, one, it helps people like us at Twitter give more relevant contextual information to other people. Especially in our 140-character-constrained lifestyle, you can't necessarily put fully structured information about where you might be or what you might be talking about. But since we're now trying to expand the dimensionality of our platform to include place, we can now store that structured data and, therefore, analyze it better. We can deliver it to the right people better, and we can do more interesting high-level analytics. Therefore, we can deliver relevant search or relevant information to the people who want it.
I think one of the dreams would be, not necessarily for Twitter but for someone out there, to be able to look at status update streams with geotagging on top of them and figure out what the hot bars are tonight, or to cross-reference with my foursquare check-ins, for example. I want to be able to ask the service, "What bar should I go to right now that my friends have liked, that I think I'll probably like, and that has no line?" And you'd only do that kind of high-level query if you have some really good way to analyze the data or to get structured data out of the system. Analysis is going to be hard, especially in a world where you only have 140 characters to express yourself; providing metadata, or meta ways to include structured information, becomes a UI problem of getting that information into the system. Once it's there, it should become a lot easier for other people to build applications on top of it. So I think that's where geo-type stuff will go for networking: better recommendations, better information delivery, better stuff within social networks.
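A toy version of the kind of high-level query Krikorian imagines -- assuming you already have a stream of geotagged statuses -- is just a distance filter over that metadata. A hedged Go sketch using the haversine formula; the Status type and sample data are invented:

package main

import (
    "fmt"
    "math"
)

// Status is a hypothetical geotagged update from any service.
type Status struct {
    Text     string
    Lat, Lng float64
}

// distKm returns the great-circle distance between two points
// using the haversine formula.
func distKm(lat1, lng1, lat2, lng2 float64) float64 {
    const R = 6371 // mean Earth radius, km
    toRad := func(d float64) float64 { return d * math.Pi / 180 }
    dLat := toRad(lat2 - lat1)
    dLng := toRad(lng2 - lng1)
    a := math.Sin(dLat/2)*math.Sin(dLat/2) +
        math.Cos(toRad(lat1))*math.Cos(toRad(lat2))*
            math.Sin(dLng/2)*math.Sin(dLng/2)
    return 2 * R * math.Asin(math.Sqrt(a))
}

// nearby keeps statuses within radiusKm of the user's position.
func nearby(all []Status, lat, lng, radiusKm float64) []Status {
    var out []Status
    for _, s := range all {
        if distKm(lat, lng, s.Lat, s.Lng) <= radiusKm {
            out = append(out, s)
        }
    }
    return out
}

func main() {
    stream := []Status{
        {"No line at the bar downtown!", 41.308, -72.928},
        {"Band just started on Chapel St", 41.310, -72.925},
        {"Quiet night in Boston", 42.360, -71.058},
    }
    // "What's happening within 2 km of me right now?"
    for _, s := range nearby(stream, 41.309, -72.927, 2) {
        fmt.Println(s.Text) // prints the two nearby updates, skips Boston
    }
}

A real service would index points spatially rather than scanning the stream, but the filter captures the idea: structured location data turns free-text updates into queryable signals.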
tags: geodata, geolocation, hashtags, interviews, twitter
Recent Posts
- Google Enters the Home Broadband Market on February 10, 2010
- When it Comes to News, Why Won't People Eat Their Vegetables? on January 27, 2010
- Bringing e-Books to Africa and the Middle East on January 19, 2010
- The Best and the Worst Tech of the Decade on December 17, 2009
- Innovation from the Edges: PayPal Taps the Developer Community to Build Next-Gen Payment Apps on December 14, 2009
- Steve Souders: Making Web Sites Faster in the Web 2.0 Age on November 30, 2009
- The iPhone: Tricorder Version 1.0? on November 17, 2009
- The Minds Behind Some of the Most Addictive Games Around on November 9, 2009
- Why Google and Bing's Twitter Announcement is Big News on October 21, 2009
- Life With TED - Micromanaging Your Carbon Footprint on October 19, 2009