James Turner

James Turner, contributing editor for oreilly.com, is a freelance journalist who has written for publications as diverse as the Christian Science Monitor, Processor, LinuxWorld Magazine, Developer.com, and WIRED Magazine. In addition to his shorter writing, he has also written two books on Java web development ("MySQL & JSP Web Applications" and "Struts: Kick Start"). He is the former Senior Editor of LinuxWorld Magazine and Senior Contributing Editor for Linux Today. He has also spent more than 25 years as a software engineer and system administrator, and currently works as a Senior Software Engineer for a company in the Boston area. His past employers have included the MIT Artificial Intelligence Laboratory, Xerox AI Systems, Solbourne Computer, Interleaf, the Christian Science Monitor, and contracting positions at BBN and Fidelity Investments. He is a committer on the Apache Jakarta Struts project and served as the Struts 1.1B3 release manager. He lives in a 200-year-old Colonial farmhouse in Derry, NH, along with his wife and son. He is an open water diver and instrument-rated private pilot, as well as an avid science fiction fan.
Thu, Dec 17, 2009
The Best and the Worst Tech of the Decade
It was the best of decades, it was the worst of decades...
by James Turner | comments: 46
With only a few weeks left until we close out the 'naughts and move into the teens, it's almost obligatory to take a look back at the best and not-so-best of the last decade. With that in mind, I polled the O'Reilly editors, authors, Friends, and a number of industry movers and shakers to gather nominations. I then compiled the nominations and looked for trends and common threads. So here then, in no particular order, are the best and the worst that the decade had to offer.
The Best
AJAX - It's hard to remember what life was like before Asynchronous JavaScript and XML came along, so I'll prod your memory. It was boring. Web 1.0 consisted of a lot of static web pages, where every mouse click was a round trip to the web server. If you wanted rich content, you had to embed a Java applet in the page, and pray that the client browser supported it.
Without the advent of AJAX, we wouldn't have Web 2.0, GMail, or most of the other cloud-based web applications. Flash is still popular, but especially with HTML 5 on the way, even functionality that formerly required a RIA like Flash or Silverlight can now be accomplished with AJAX.
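For anyone who has forgotten just how little machinery is involved, here's a minimal sketch of the pattern: the browser's XMLHttpRequest object fetches data in the background and updates one piece of the page in place, with no full reload. The URL and element id are placeholders, not taken from any particular application.

```javascript
// Minimal AJAX round trip: fetch data asynchronously and update one
// element in place, instead of reloading the whole page.
// "/latest-headlines" and "headlines" are placeholder names.
function refreshHeadlines() {
  var xhr = new XMLHttpRequest();
  xhr.open("GET", "/latest-headlines", true);   // true = asynchronous
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
      document.getElementById("headlines").innerHTML = xhr.responseText;
    }
  };
  xhr.send(null);
}

// Poll every 30 seconds; the user keeps working while updates arrive.
setInterval(refreshHeadlines, 30000);
```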
Twitter - When they first started, blogs were just what they said, web logs. In other words, a journal of interesting web sites that the author had encountered. These days, blogs are more like platforms for rants, opinions, essays, and anything else on the writer's mind. Then along came Twitter. Sure, people like to find out what J-Lo had for dinner, but the real power of the 140 character dynamo is that it has brought about a resurgence of real web logging. The most useful tweets consist of a Tiny URL and a little bit of context. Combine that with the use of Twitter to send out real time notices about everything from breaking news to the current specials at the corner restaurant, and it's easy to see why Twitter has become a dominant player.
tags: agile, ajax, decade, intellectual property, json, mobile, mpaa, music, retrospective, riaa, scrum, soap, twitter
Mon, Dec 14, 2009
Innovation from the Edges: PayPal Taps the Developer Community to Build Next-Gen Payment Apps
Developer Challenge offers big prizes for best apps using new APIs
by James Turner | comments: 4
Two enduring tenets of Web 2.0 are "A platform beats an application every time" and "All the smart people don't work for you." Companies that take those bits of wisdom to heart find ways to engage developer communities to extend their products--and the result can be creative, surprising new applications that would never have been developed from within. Online payment giant PayPal recently announced the PayPal X APIs, a new group of developer APIs designed to enable new applications that can more tightly integrate with PayPal services. To encourage developers to create some awesome applications with the APIs, PayPal is offering prizes of $100,000 and $50,000 (in cash plus waived transaction fees) for the best new applications. We caught up with Naveed Anwar, director of PayPal's Developer Network, as he prepared to deliver a talk in Beijing, and he filled us in on what the new PayPal APIs bring to the table for application designers, and laid out the details of the challenge.
James Turner: In the last week or so, you've released new APIs. Can you talk a little bit about what's different with them, in comparison to how people have interacted with PayPal in the past as developers?
Naveed Anwar: We have always had APIs to help our developers at PayPal. The difference is that all the original APIs resulted in actions on PayPal.com; you had to go through the web checkout flow on our website. On November 3rd we announced the first ever truly open global payments platform. Part of that announcement was new APIs that let developers build PayPal’s payment service into their applications. We made the adaptive payments APIs available to everyone and gave attendees at our conference access to other APIs that make it easy to create user accounts (individually or in batches) within a developer’s application. We’ve had an overwhelming response from our developers to these Adaptive Accounts APIs, and so we decided to open them up to the rest of the community as soon as we could.
Very briefly, Adaptive Payments Platform has a lot of core features. In addition to the traditional features like send/receive money in P2P and B2B market segments, the Adaptive Payments Platform APIs provide new ways to make parallel and chain payments, and pre-approvals with PIN authorizations that enable several new possibilities. To provide a better user experience for both our customers and merchants, the Adaptive Accounts and Authentication APIs provide ways to make the account creation process contextual and allow merchants to authenticate their PayPal customers as part of their own order management or account related flows. These are just a few very high level sets of APIs that enable new capabilities and uses of PayPal’s payment service.
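To make the idea of a parallel payment concrete, here's a rough sketch of what a split-payment call might look like: one sender's payment divided between two receivers in a single request. The endpoint URL, header, and field names below are illustrative assumptions, not taken from PayPal's documentation, so check the actual PayPal X API reference before writing code against it.

```javascript
// Hypothetical sketch of a parallel payment: one sender's payment is
// split across two receivers in a single API call. Endpoint, headers,
// and field names are illustrative assumptions, not PayPal's actual API.
const payRequest = {
  actionType: "PAY",
  currencyCode: "USD",
  senderEmail: "buyer@example.com",
  receiverList: [
    { email: "marketplace@example.com", amount: "5.00" },  // platform's cut
    { email: "seller@example.com", amount: "45.00" }       // merchant's share
  ],
  returnUrl: "https://example.com/paypal/success",
  cancelUrl: "https://example.com/paypal/cancel"
};

// Placeholder endpoint and credential header; both are assumptions.
fetch("https://api.example-payments.com/AdaptivePayments/Pay", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    "X-API-CREDENTIALS": "replace-with-real-credentials"
  },
  body: JSON.stringify(payRequest)
})
  .then(function (res) { return res.json(); })
  .then(function (result) { console.log("payment result:", result); });
```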
tags: apis, credit cards, developers, e-commerce, interviews, paypal
Mon, Nov 30, 2009
Steve Souders: Making Web Sites Faster in the Web 2.0 Age
How huge JavaScript libraries, rich content, and lame ad servers are slowing the web down
by James Turner | comments: 9
As much as anything else, a user's impression of a web site has to do with how fast the site loads. But modern Web 2.0 websites aren't your father's Oldsmobile. Chock-full of rich Flash content and massive JavaScript libraries, they present a new set of challenges to engineers trying to maximize the performance of their sites. You need to design your sites to be Fast by Default. That's the theme of the upcoming Velocity Online Conference, co-chaired by Google performance guru Steve Souders. Souders is the author of High Performance Web Sites and Even Faster Web Sites, and spent some time discussing the new world of web site performance with me.
James Turner: There's been an evolution of the whole area of web performance, from the old days when it was all about having a bunch of servers and then doing round robin or just spreading the load out. How is web performance today different than it was, say, ten years ago?
Steve Souders: Well, what's happened in the last five years or so is that Web 2.0 and DHTML and AJAX have really taken off. And that's really been in the last two years. Five years ago, we started seeing a lot of Flash and bigger images. So basically what's happened is our web pages have gotten much richer, and that also means heavier. There's more content being downloaded, more bytes. And then in the last two years, with the explosion of Web 2.0, we're seeing not only a lot more bytes being downloaded into web pages, but we're seeing that that involves a lot of complex interaction with JavaScript and CSS. And so whereas maybe five or ten years ago, the main focus was to build back-end architectures that would support these websites, we're seeing today that we need a lot more focus on the front-end architecture of our websites and web applications.
So that's where Velocity comes in, and my work comes in. Whereas ten or twenty years ago, you had people looking at collecting and evangelizing best practices for back-end architectures like databases and web servers, Velocity and my work is about identifying and evangelizing best practices for building really fast high-performance front-end architectures.
James Turner: I know, as someone who's been doing AJAX development, AJAX is a very different kind of paradigm for how you're interacting with the server. It's a lot more chatty. Are the current generations of web servers really designed for that kind of interaction?
Steve Souders: I think that the chattiness of AJAX applications isn't really the issue with regard to performance. I mean, anything can become an issue, but looking across the top 1,000 websites in the world, that's not the issue. The issue is that these web applications that do AJAX require a lot of JavaScript to set up that framework on the client, inside the browser. And to set that up, to download all of that JavaScript and parse it and execute it just takes a lot of time. A user is downloading a complex photo or mail or calendaring application, and before they've even done any chatty AJAX requests, they're just waiting for the application to actually load and start up. That's the frustrating period: you just want to get in and start working on your slides or reading your mail, and you're waiting for this start up to happen. Typically, once these AJAX frameworks have loaded, the AJAX work that we're doing in the background is not that big of a problem either from a back-end perspective or from the client side perspective.
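Souders' books cover the remedies in depth; one common mitigation, sketched below rather than quoted from the interview, is to load the heavy framework code asynchronously so the page can render before all of that JavaScript has been downloaded, parsed, and executed. The bundle URL and callback are placeholders.

```javascript
// Load a large framework bundle asynchronously so initial rendering
// is not blocked while the script downloads, parses, and executes.
// The URL is a placeholder for whatever bundle your app ships.
function loadScript(url, onLoad) {
  var script = document.createElement("script");
  script.src = url;
  script.async = true;          // hint to the browser not to block parsing
  script.onload = onLoad;
  document.getElementsByTagName("head")[0].appendChild(script);
}

// Defer the heavy application code until after the page has painted.
loadScript("/js/big-app-bundle.js", function () {
  // Framework is ready; start up the rich UI here.
  console.log("application framework loaded");
});
```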
James Turner: One of the things we see a lot these days is people using libraries like YUI or Google's libraries or jQuery. They have compressed versions, but they're still pretty large. To what extent do you think there's a need to really go in and pick and choose out of those libraries?
Steve Souders: Well, myself personally, I do that frequently because I only need usually one small feature, like I need a carousel or I need a drop-down menu or something like that. And I'll go to the work of pulling out just the code that I need. But I'm working on small website projects. If you're building a whole web application, you're probably using many parts of these JavaScript frameworks. There still might be some benefit in pulling out just the pieces that you need. But that's extra work. And when you need to upgrade, there's the likelihood of introducing bugs or other problems. So certainly, I wouldn't avoid doing that. It should be evaluated, pulling out just the JavaScript that you need from the frameworks, so long as the licensing even supports that.
But something else that helps address that problem is the Google AJAX Libraries API. This is where Google is actually hosting versions of jQuery and Dojo and YUI and Google's Closure JavaScript framework and Scriptaculous and Ext and others. What happens is you can have multiple websites that don't have any interaction with each other, Sears.com and WholeFoods.com, but they both might be using jQuery 1.3.2. And if they're both downloading that library from the Google AJAX Libraries API, then the URL is exactly the same. So a user that visits multiple of these websites only has to download that JavaScript once during their entire cache session. That further mitigates the need or motivation or benefit of pulling out just the parts that you need.
At first, I didn't think there would be that much of a critical mass of people adopting the Google AJAX Library CDN and the actual version numbers of these JavaScript frameworks, but it's actually taken off really well; a lot of websites are using them. Users are actually getting this critical mass benefit where when they go to some website, Sears.com, that's using jQuery, they already have that in their cache from a visit they made a previous day to a different website. So I think in general, I would recommend to developers that they not change the JavaScript frameworks they're using. And if they're using a framework that's hosted on Google, download it from there. It's hosted on Google's infrastructure, so it's going to be fast, reliable, and users will actually get the benefit of having a greater probability of the framework being in their cache, because multiple websites are taking advantage of loading it from there.
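The mechanics are nothing more than a shared URL: if two unrelated sites reference the same library version at the same address on Google's CDN, a visitor who already fetched it on one site gets a cache hit on the other. A minimal sketch using the jQuery 1.3.2 version mentioned above (normally you'd just put a script tag in the page head; it's injected here to keep the example in plain JavaScript):

```javascript
// Reference jQuery 1.3.2 from the shared Google AJAX Libraries CDN.
// Because the URL is identical across sites, a visitor who already
// downloaded it elsewhere reuses the cached copy instead of fetching
// it again.
var s = document.createElement("script");
s.src = "https://ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js";
document.getElementsByTagName("head")[0].appendChild(s);
```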
James Turner: I have to put my security hat on for a second and ask, when you get into a situation like that, the flag that comes up for me is if someone managed, by some kind of injection or fakery, to deliver a version of a library that had vulnerabilities so that it appeared to be coming from Google, you could get into a situation where someone would be using a poisoned library. Do you think that's at all a realistic concern?
Steve Souders: Well, I think it depends on who's doing that. I work at Google so I don't want to come off as sounding like a fan boy who's only going to say great things about what Google is doing. I'm as cautious as the next person with what passwords I use and what information I give to any web company. But when it comes to something like this, I've built stuff that's running on Google App Engine or Amazon AWS. It's always possible that these big companies, these big web hosting providers are going to go down. But there's probably a greater chance that my website is going to go down than Google or Amazon. And the same thing with security. There's probably a greater chance that my website is going to be hacked than Google or Amazon. So I think it is a possibility. But I think the odds are pretty small of that happening. And that would not be a concern that would stop me from taking advantage of these services because of the performance benefits I get from them.
tags: ad servers, dojo, flash, google closure, interviews, javascript, performance, steve souders, velocity, yui
Tue, Nov 17, 2009
The iPhone: Tricorder Version 1.0?
by James Turner | comments: 7
The iPhone, in addition to revolutionizing how people thought about mobile phone user interfaces, also was one of the first devices to offer a suite of sensors measuring everything from the visual environment to position to acceleration, all in a package that could fit in your shirt pocket.
On December 3rd, O'Reilly will be offering a one-day online edition of the Where 2.0 conference, focusing on the iPhone sensors, and what you can do with them. Alasdair Allan (of the University of Exeter and Babilim Light Industries) and Jeffrey Powers (Occipital) will be among the speakers, and I recently spoke with each of them about how the iPhone has evolved as a sensing platform and the new and interesting things being done with the device.
Occipital is probably best known for Red Laser, the iPhone scanning application that lets you point the camera at a UPC code and get shopping information about the product. With recent iPhone OS releases, applications can now overlay data on top of a real time camera display, which has led to the new augmented reality applications. But according to Powers, the ability to process the camera data is still not fully supported, which has left Red Laser in a bit of a limbo state. "What happened with the most recent update is that the APIs for changing the way the camera screen looks were opened up pretty much completely. So you can customize it to make it look any way you want. You can also programmatically engage photo capture, which is something you couldn't do before either. You could only send the UI up and the user would have to use the normal built-in iPhone UI to capture. So you can do this programmatic data capturing, and you can process those images that come in. But as it turns out, at the same time, shortly after 3.1, the method that a lot of people were using to get the raw data while it was streaming in became a blacklisted function for the review team. So we've actually had a lot of trouble as of late getting technology updates through the App Store because the function we're using is now on a blacklist. Whereas it wasn't on a blacklist for the last year."
Powers is hopeful that the next release of the OS will bring official support for the API calls that Red Laser uses, based on the fact that the App Store screeners aren't taking down existing apps that use the banned APIs. Issues with the iPhone camera sensors pose more of a problem for him. "In terms of science, it's definitely a really bad sensor, especially if you look at the older iPhone sensor, because it has what's called a rolling shutter. A rolling shutter means that as you press capture or rather as the camera is capturing video frames or as you capture a frame, the camera then begins to take an image. And it takes a finite number of milliseconds, maybe 50 or so, before it has actually exposed the entire frame and stored it off the sensor. Because it's doing something that's more like a serial data transfer instead of this all-at-once parallel capture of the entire frame, what that causes is weird tearing and odd effects like that. For photography, as long as it's not too dramatic, it's not a huge deal. For vision processing, it's a huge deal because it breaks a lot of assumptions that we typically make about the camera. That has gotten better in the 3GS camera, but it's still not perfect. It is getting better, especially when the camera's turned to video mode."
tags: augmented reality, image recognition, interviews, iphone, science, sensors, webcast, where 2.0
Mon, Nov 9, 2009
The Minds Behind Some of the Most Addictive Games Around
If you've wasted half your life playing Peggle, Bejeweled, Zuma or Plants vs. Zombies, blame these guys!
by James Turner | comments: 5
You may also download this file. Running time: 38:21
Subscribe to this podcast series via iTunes. Or, visit the O'Reilly Media area at iTunes to find other podcasts from O'Reilly.
The gaming industry tends to focus on high-end products: first-person shooters that crank out a bazillion polygons a second and RPGs which spend more time developing the plot in cut scenes than in actual gameplay. But for every person playing Borderlands, there are scores playing casual games like Bejeweled and Zuma. PopCap Games has been at the forefront of casual game development, with a catalog that includes bestselling titles like Peggle and Plants vs Zombies, in addition to the two previously mentioned. I recently had a chance to talk to Jason Kapalka, one of the founders and the creative director of PopCap. We discussed the evolution of PopCap, how the casual gaming industry differs from mainstream gaming, and the challenges of creating games that can be engaging without being frustrating.
James Turner: Could you start by talking a little bit about your background and how you came to PopCap and what you did before then?
Jason Kapalka: My career in computer games started back in the early '90s, when I was writing for the magazine, Computer Gaming World, doing various reviews and articles. In '95, one of the editors from the magazine left to join an internet dotcom start-up in San Francisco called TEN, the Total Entertainment Network. He invited me to come down there and work there, which I did. And TEN evolved over the dotcom boom and bust cycle, from a very hardcore gaming service into what eventually turned into Pogo.com around 1999. I worked there initially on hardcore games. One day, I was working on Total Annihilation tournaments, and then the next day, someone said, "Hey, design bingo." And I was sort of like, "Oh. Bingo? Okay."
That was the beginning of my casual game design career, I guess. And yes, I was there at Pogo. I helped design a lot of the structure for their casual games until around 2000 when I left, and Pogo eventually went on to get bought by Electronic Arts, of course. I left in 2000 and started PopCap with two other guys, Brian Fiete and John Vechey who are these guys from Indiana that I'd met earlier, around '97. They had made an internet action game called ARC that we'd produced on TEN, and we stayed in touch. In 2000, we all thought we wanted to try something different. So we all left our respective companies to start PopCap. As you might remember, 2000 was not the best year for internet companies. So we didn't really realize that the entire industry was collapsing. We had an interesting time initially. Luckily, our ignorance protected us, I guess.
PopCap started from there, just the three of us working out of our apartments. And over time, we'd say, "Well, I guess we need to hire an artist." And I'd say, "Well, I guess we need to hire maybe another guy here to program this stuff." And then eventually, maybe someone should look at the books or whatever, so we'll hire someone to take care of the bookkeeping. And it kept going like that until eventually we thought that maybe we needed an office. And from there, suddenly, we've got nearly 300 employees now in 2009. So it's been an interesting kind of experience. We never really intended PopCap to get anywhere near as big as it has today.
James Turner: How would you describe PopCap's place in the market today?
Jason Kapalka: I guess it's a bit odd. Casual game companies exist in these strange spaces where they're often the developer and the publisher at the same time. And then they also publish stuff with other guys, where they're sort of rivals, but also they're partners. There's a lot of this co-opetition thing going on. PopCap is obviously a developer, and we develop a lot of games. We used to publish other people's games. And we still do indirectly, in that we have SpinTop Games, which is a company we bought a couple of years back. They distribute a lot of other people's games through their site. But primarily, I think we develop and then publish titles. But we primarily focus on publishing our own titles. So we're kind of a self-publisher, I suppose.
James Turner: That's actually something I wanted to ask you about because one of your distribution channels now is Steam, which is another company's portal for their games and others. How do you see that relationship?
Jason Kapalka: Steam's been really good. We work with lots of different portals. Steam is one of many that our typical game would go out on. On Steam, on Real Arcade, Big Fish Games, Yahoo Games, MSN, WildTangent, a whole bunch of smaller channels. So Steam was just one of several. It's been interesting in that it was developed differently than a lot of those other ones. Steam is definitely much more of a hardcore game distribution channel than something like Real Arcade. So initially, when we started on Steam, it was uncertain whether our games were going to really fit in. Initially, a lot of the ones we tried on Steam didn't really work too well for their audience. Hidden object games don't do especially well with Steam users, for example.
The turning point for Steam was probably when we did Peggle Extreme with Valve. I don't know if you remember that. Peggle had just come out, and the guys at Valve really liked it. We were talking and we had some weird ideas. Someone had the odd suggestion to do sort of a miniature-themed version of Peggle that featured all of the Orange Box's characters, the Half-Life and Team Fortress guys. It was a really strange idea, because that was a fairly mature, violent kind of franchise. And certainly, it didn't seem like the obvious fit for Peggle. But, on the other hand, we thought, "Well, what the heck? We can try it and it's only going to go on Steam anyway so it's not like it'll offend the soccer moms necessarily." So we tried that out, and it went up. And we were all kinds of nervous because we didn't know -- it had launched initially as a free download with the Orange Box. And even though it didn't cost people anything, we were still kind of wondering if there was going to be this big backlash from the hardcore community about, "What the hell is this cheap little pinball thing doing in the middle of my Orange Box product."
But actually, the response was really good. I mean, the Orange Box guys all really liked Peggle a lot. And ultimately, that led them to go and seek out and buy the regular versions of Peggle which made Peggle suddenly this fairly big success on Steam. Which a month or two ago, before that, didn't seem very likely that this game with unicorns and rainbows would be selling well on Steam. So after that, that sort of seemed to kind of be -- it sort of opened the floodgates a little bit. And now a variety of our games do very well on Steam. Obviously, Plants Vs Zombies was the last one that had quite a hit there. Not everything. There's still some of our games that are clearly more casual and that don't particularly work well on Steam. But the ones that do work there seem to really work well.
James Turner: There seems to be a fairly different expectation level for casual games in terms of graphics and such. Do you think that's a natural result of how they're produced and what they're intended for? Or could you see something like Plants Vs Zombies but with the graphics levels of a Half-Life?
Jason Kapalka: It's certainly possible. I mean in some cases, we're not intentionally trying to make the games low fidelity. We try to do the best art direction we can. Although the usual contradiction, or decision to be made, there is we also want to make games as accessible as possible. So we want Plants Vs Zombies to play on every crummy netbook and seven-year-old computer your mom has and all of these types of things. And so that tends to mean that we try to work and have good art, but usually make the technical requirements very modest. We've been working at making things that can scale well so that on a good computer, you'll get a really nice experience and it'll still scale down to play on a lower-end computer. But that can be challenging in itself. So usually, we err on the side of not worrying about the graphics being too high-end because our experience is showing that a good game with not very fancy graphics can sell very well, like Plants Vs Zombies. And I think that game has good graphics, but it's definitely limited. It's only got 800X600 resolution and so forth. But on the other hand, we've seen plenty of games in the casual space that have really good graphics and they sell very poorly if they're not a fun game. So accessibility and fun definitely, for us, end up being a first priority over graphics. And especially 3-D or technically impressive graphics versus just good art direction.
James Turner: You would think Nethack and Rogue would be the ultimate proof that you can have good game play without good graphics.
Jason Kapalka: Sure, I love Roguelike games. We have lots of Nethack fans over at PopCap, which seems a bit weird in that they're obviously not very casual in many regards. But yeah, they're good exemplars of that principle that graphics are not as important as game play.
tags: development, flash, games, gaming, interviews, iphone, popcap, software, steam
Wed, Oct 21, 2009
Why Google and Bing's Twitter Announcement is Big News
Tweets will finally become first class web citizens
by James Turner | comments: 11
Lurking innocently on Google's blog this afternoon, like many of their big announcements, was the bombshell that they have reached an agreement with Twitter to make all tweets searchable. This followed an earlier announcement at the Web 2.0 conference by Microsoft that Bing has also arranged to make tweets searchable.
This is not only a huge thing for Twitter, it is also well past due. Until now, Twitter really hasn't been a first class web citizen, because you're not really part of Web 2.0 until you're searchable by Google (and, I suppose, Bing). Sure, you can read someone's tweets from Twitter, or get a thread via a #tag, but the full text searching capabilities that make things really usable on the web, largely powered by Google, have been missing.
Making tweets searchable is a major usability improvement as well. Twitter handles are cute, but sometimes obscure as well. Perhaps people will start using more full names in their tweets in addition to @ references, which would let you find tweets about people without having to know what their handle happened to be.
It appears that Twitter is going out of their way not to play favorites in the search space, by cutting deals with both Microsoft and Google. Microsoft seems to be ahead of the game right now, since they have a live site up, whereas the announcement from Marissa Mayer of Google only hints at things to come over the next few months.
The Bing interface is interesting; it seems to be a hybrid of a web search engine and a Twitter search. Typing in a term gets you back both the latest tweets that match the keywords, as well as web pages that multiple tweets have in common that also match the keywords. This is a tacit acknowledgement that a lot of the useful content of Twitter is found in the web pages that are linked from the tweets.
If I had to guess, I'd say that tweets will show up more traditionally on Google, as just another kind of search result that can be narrowed in the same way that you can narrow results to just images or movies. I guess we'll have to wait and see on that.
tags: bing, google, microsoft, twitter
Mon, Oct 19, 2009
Life With TED - Micromanaging Your Carbon Footprint
I've spent three days watching my power consumption like a hawk; here's how it's going
by James Turner | comments: 3
I've been interested in having a better handle on my electrical consumption for a long time. Our family regularly goes through 1100-1200 kWh a month, and it's been frustrating that I couldn't really get a grip on where or when the power was really being used. I want to get my power usage under control for three reasons:
- I want to reduce my $180-a-month-and-climbing power bill. Public Service Company of NH (PSNH) has one of the higher electricity rates in the country (we have a nuke we're still paying off, among other things.)
- I'm seriously investigating adding solar to the mix, now that a 30% federal tax credit, a $6,000 state rebate, and lower prices for the panels have converged. It would be great to get my usage down into the 600-800 kWh average output I've been told I can expect a month from a system, and zero out my PSNH bill on a yearly basis.
- I'm a firm believer in reducing carbon emissions; I'd like my 14-year-old son to have a world to grow up in. I've already cut my fuel oil use in half (to a still awful 250 gallons a month in the winter, but it's a huge house...) Cutting my electricity is the next low-hanging piece of fruit on the tree.
I had been tracking Google PowerMeter, a Google initiative that lets people monitor their energy usage online, but it was only available to customers of electric providers using so-called "Smart Meters" that send usage data back to the provider, and PSNH isn't one of them.
Then, this week, Google announced on their blog that normal mortals could now order a device called The Energy Detective (or TED, as he's known by his friends...) TED is made by Energy, Inc. out of South Carolina, and consists of a minimum of two components. The first piece is an inductive current measuring device that lives out in your circuit breaker box. The second is a gateway device that plugs into a wall socket and has an Ethernet jack. Optionally, you can also get a stand-alone display, so that you don't need a computer to view your usage.
Wiring the sensor device into your box is fairly straightforward. You clamp the two sensors around the mains as they come into the box. You also have to wire the device to the two "hot" phases of your 220V service (which requires two free breakers in your box on different phases), and a third wire running to neutral. If you have some basic electrical savvy, you can do it yourself, but since my box is so crowded (after-effects of having a transfer switch put in for a generator...), I decided to wimp out and shelled out the $85 to have an electrician put it in.
The gateway unit communicates with the sensor unit via signals sent over the house AC. As with anything using the power lines to communicate, I found the unit was very particular about which outlet I plugged it into. It really doesn't like to share a circuit with a computer, for example. Neither of the two outlets actually next to a network hub would pick up a signal, but one in an adjacent room that happened to have a network jack did.
Once you have the gateway talking to the sensors and plugged into the network (it uses DHCP to get an address), you can surf to it using any browser. I can even get to it using Safari on my iPhone. The "home" screen is a dashboard, showing various statistics about current demand and your daily, weekly and monthly averages. You can view the data in terms of kWh, dollars (once you tell TED how much you pay for power, it can even handle peak period and tiered pricing models), or pounds of CO2.
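The conversions behind those alternate views are just multiplications by two constants. Here's a back-of-the-envelope sketch; the electricity rate and CO2 factor below are assumed placeholder values, not PSNH's actual numbers, so substitute your own.

```javascript
// Convert a month of kWh usage into dollars and pounds of CO2.
// Both factors are assumed placeholders: plug in your real utility
// rate and your grid's emissions factor.
var RATE_DOLLARS_PER_KWH = 0.15;   // assumed electricity rate
var LBS_CO2_PER_KWH = 1.2;         // assumed grid emissions factor

function summarizeUsage(kwh) {
  return {
    kwh: kwh,
    dollars: (kwh * RATE_DOLLARS_PER_KWH).toFixed(2),
    lbsCO2: Math.round(kwh * LBS_CO2_PER_KWH)
  };
}

// Roughly the household usage described above:
console.log(summarizeUsage(1150));
// -> { kwh: 1150, dollars: '172.50', lbsCO2: 1380 }
```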
All of the ranges on the dials and bar-graphs are configurable, so if you want 3kWh to be "red", you can set it up that way. You can also configure refresh rates. Clicking on the "Graphing" tab lets you view your usage second by second, minute by minute, or in daily or weekly aggregates.
It's these graphs that I have found to be most useful. You can start to see all sorts of interesting patterns, like the "heartbeat" of my furnace turning on and off at night, when the rest of the house is otherwise quiet.
I can also see the huge hump when my son wakes up in the morning, and proceeds to turn on every first floor light in the house. I was even able to tell that my wife had turned on the dishwasher before she left for school one morning.
tags: google, green tech, power management, powermeter
Tue, Sep 29, 2009
David Hoover's Top 5 Tips for Apprentices
Finding a Good Mentor is Key
by James Turner | comments: 1
If you're a senior developer with years of experience under your belt, it may be hard to remember what it was like coming out of college with a newly minted CS degree, and entering the workplace. But as David Hoover argues, helping these newcomers to the workforce to succeed can be the difference between effective, motivated developers and confused, discouraged ones. Hoover is the author of the new O'Reilly book Apprenticeship Patterns, and he says that people coming right out of college may, in fact, be less motivated than someone who has been working for a while. "One of my theories is computer science education is really hard, and it's expensive. And so when you're done with it, you're ready to cash in and sit back for a little while. 'Hey, I just spent a lot of money. I spent a ton of time and effort and pain on four years of getting this certificate and okay, now it's time to make that pay off.' You're definitely going to be less incentivized to start a new job, and now realize that you've got so much more to learn still. As opposed to someone who's just coming up, who's going to be at a big disadvantage knowledge-wise, but is probably actually going to be at a big advantage motivation-wise because they're going to be hungry, and just assume that they have to learn everything on their own. Whereas, like I said, some computer science people are going to be disincentivized. They're going to be surprised that they've come into their first job and, geez, they have to learn source control and they have to learn unit testing and they have to learn about these different processes that we use. And some programs prepare you for that stuff; some programs are very theoretical and very outdated. And you just have a ton to learn in your first gig."
According to Hoover, one way to ease the transition into real-life development is to use an apprenticeship model. His book draws on his own experience moving from being a psychologist to a developer, and the lessons he's learned running an apprenticeship program at a company called Obtiva. "We have an apprenticeship program that takes in relative newcomers to software development, and we have a fairly loose, fairly unstructured program that gets them up to speed pretty quickly. And we try to find people that are high-potential, low-credential, that are passionate and excited about software development, and that works out pretty well."
Hoover says that most developers have benefited from one or two key people in their career who helped them move along. "For people that had had successful careers, they only point back to one or two people that mentored them for a certain amount of time, a significant amount of time, a month, two months, a year in their careers." He also points out that finding that person may mean looking outside your company. "For me personally, I wasn't able to find a mentor at my company. I was in a company that didn't really have that many people who were actually passionate about technology, and that was hard for me. So what I did is I went to a user group, a local Agile user group, or you could go to a Ruby user group or a .NET user group, whatever it is, and find people that are passionate about it and have been doing it for a long time. I've heard several instances of people seeking out the leader to be mentored; for me that was the case. One of our prospective apprentices right now was mentored by the leader of a local Ruby user group. And that doesn't necessarily mean you're working for the person, but you're seeking them out and maybe you're just saying, 'Hey, can you have lunch with me every week or breakfast with me every other week.' Even maybe just talking, maybe not even pairing. But just getting exposure to people that have been far on the path ahead of you, to just glean off their insights."
tags: agile, apprenticeship, interviews, mentorship, peer programming
Thu, Jul 16, 2009
How NPR is Embracing Open Source and Open APIs
Daniel Jacobson Will Talk About the NPR Open API at OSCON
by James Turner | comments: 7
You may also download this file. Running time: 14:14
Subscribe to this podcast series via iTunes. Or, visit the O'Reilly Media area at iTunes to find other podcasts from O'Reilly.
News providers, like most content providers, are interested in having their content seen by as many people as possible. But unlike many news organizations, whose primary concern may be monetizing their content, National Public Radio is interested in turning it into a resource for people to use in new and novel ways as well. Daniel Jacobson is in charge of making that content available to developers and end users in a wide variety of formats, and has been doing so using an Open API that NPR developed specifically for that purpose. Daniel will talk about how the project is going at OSCON, the O'Reilly Open Source Convention. Here's a preview of what he'll be talking about.
James Turner: Can you start by explaining what NPR Digital Media is and what your role with it involves?
Daniel Jacobson: Sure. NPR is a radio organization, of course, and the Digital Media Group, of which I'm a part, handles, essentially as I describe it, everything that is publishable by NPR that does not go to a radio. So that includes the website, podcasts, API, mobile sites, HD radios, anything that has some sort of visual component to it. So Digital Media as a group is responsible for producing that content, producing all of those distribution channels, managing all of those relationships.
James Turner: And what is your particular role there?
Daniel Jacobson: I manage the application development team that is responsible for all the functional aspects of all of the systems, which includes our CMS, all of the templating engines for the website, for the API, for the podcasts, all of the engines that drive that.
James Turner: Now NPR is an organization that consists of a lot of member stations kind of flying in close formation. What's your relationship with the content producers? To what extent do they have their own stuff, and to what extent do you work together?
Daniel Jacobson: Those member stations are really exactly that; they are members of NPR. They essentially buy NPR programming. They're distinct organizations from us. NPR is a content producer and distributor. They buy our programming and broadcast it out to the world. They also have their own corresponding web teams that can take NPR content and also produce their own content and create their own websites. So in the Digital Media Team, we take a lot of pride and effort in providing services that help those member stations better serve their communities and their listeners and audiences, using NPR content and using their own content. We work with them to try and satisfy their missions. And to the extent that they need NPR services or content, we work hard to try and provide those. The API is one massive step, I think, in making it much easier for them to do what they need to do without a whole lot of intervention from us, where previously they would have to pull in content in much more arduous ways. So the API, I think, is a step in the right direction to make it more of a self-service model.
James Turner: Since you've mentioned the API, that's what you're going to be talking about at OSCON. We've already talked to the New York Times and the way they're opening up their content through APIs. What are you doing with yours?
Daniel Jacobson: Well, we launched ours formally at OSCON last year. And at that time, we essentially opened up our entire archive. So anything that you can get on npr.org is available through the API, to the extent that we have the rights to distribute it. There are some rights restrictions, for example, for receiving photos or stories from sources that we have not cleared rights to redistribute. Those are getting suppressed through a rights filtering engine on our API. Everything else that you can get on npr.org, you can get through the API. That includes full text. It includes images, audio, video, everything like that. Throughout the last year, we have added more features. We included the layer of "mix your own podcast", for example, which allows people to not only get the content in audio form, but also to download it as a podcast-type item. And all of that is available through search terms or totally customized queries. So what the API really does is it enables people to take the content, make widgets, or do whatever they want with essentially everything that is on npr.org and get to audiences that we are not getting to.
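For developers who want to kick the tires, here's a rough sketch of querying an NPR-style content API for stories and printing their titles. The api.npr.org/query base URL matches NPR's published pattern, but the parameter names and the response shape shown here are assumptions to verify against NPR's API documentation, and the API key is a placeholder for a real registered key.

```javascript
// Sketch: query an NPR-style open content API for stories matching a
// search term and print their titles. Parameter names and the JSON
// shape are assumptions to check against the API docs; the apiKey
// value is a placeholder for a real registered key.
const params = new URLSearchParams({
  searchTerm: "open source",
  output: "JSON",
  apiKey: "YOUR_API_KEY"
});

fetch("https://api.npr.org/query?" + params.toString())
  .then(function (res) { return res.json(); })
  .then(function (data) {
    // Assumed response shape: a list of stories, each with a title.
    var stories = (data.list && data.list.story) || [];
    stories.forEach(function (story) {
      console.log(story.title && story.title.$text ? story.title.$text : story.title);
    });
  });
```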
tags: interviews, news, npr, open apis, opensource, oscon
Tue, Jul 14, 2009
Making Government Transparent Using R
Danese Cooper thinks it will be an important tool in Open Gov
by James Turner | comments: 7
You may also download this file. Running time: 26:58
Subscribe to this podcast series via iTunes. Or, visit the O'Reilly Media area at iTunes to find other podcasts from O'Reilly.
With Open Source now considered an accepted part of the software industry, some people are starting to wonder if we can't bring the same degree of openness and innovation into government. Danese Cooper, who is actively involved in the open source community through her work with the Open Source Initiative and Apache, as well as working as an R wonk for Revolution Computing, would love to see the government become more open. Part of that openness is being able to access and interpret the mass of data that the government collects, something Cooper thinks R would be a great tool for. She'll be talking about R and Open Government at OSCON, the O'Reilly Open Source Convention.
James Turner: Why don't you start by describing where you came from, what you're involved in, and what your interests are?
Danese Cooper: Okay. I'm Danese Cooper. I serve on the board of the Open Source Initiative. I have been serving for the last eight years. And I'm also currently employed by Revolution Computing, which is a start-up focusing on an open source language called R, as in the letter R, that is very useful for analytics and statistical analysis. I'm also an Apache member. And I also serve on an advisory board for Mozilla.
James Turner: One of the two panels you're going to be speaking on at OSCON is on open source and open government. If you could talk a little bit about what interests you about open government and also what open government means to you.
Danese Cooper: Sure. Well, along with a lot of open source people, I got interested in the Obama campaign and in helping President Obama get elected. And part of why he was so compelling was that the vision of how Washington needed to change is pretty close to the way that we think about working collaboratively in open source. The night that he was elected, there was a great little clip on CNET of a Republican commentator actually explaining open source as exactly what I just said. It was a really brilliant little two-minute clip. He pointed at The Cathedral and the Bazaar, that canonical document about how open source works. And he said, "Microsoft is the cathedral. It's their way or the highway. And the bazaar is a bunch of people working together grassroots to collaboratively build the things that they need. And so Obama's basically asking for the government to become open source, and the problem is Washington isn't really like that right now."
So anyway, that's the transformation that has to happen in order for government to really be transparent. To me, open source government is transparent government. There's been an awful lot of shenanigans in recent political history; the last decade has been pretty crazy in terms of things happening that couldn't be traced back to any source. Even just the way we vote and the way that voting is managed, and the fact that the software that runs the machines that we vote on is not open source, so it can't be inspected. And nobody knows quite what it does. There are all of these stories of weird updates to the software that happened right before major elections in states where there are strange results. Transparency, in the same way that it helped the software industry transform, could really help the government transform. So that's what I'm talking about. There's a bunch of other people on that panel. My good friend, Brian Behlendorf, and I co-proposed it. And he's actually taken the next step. He helped found Apache. And he's run off to Washington to work on projects that are interesting to the Obama administration to try to figure out how to help them move to more open source solutions. And he'll be talking about his progress on that panel. So I think it's a pretty exciting panel.
tags: interviews, open government, open source, oscon, r, statistics
Recent Posts
- Sequencing a Genome a Week on July 13, 2009
- Open Source is Infiltrating the Enterprise on July 7, 2009
- Patrick Collison Puts the Squeeze on Wikipedia on July 2, 2009
- Want A Job? Learn SharePoint, Says Gary Blatt on June 29, 2009
- Walking the Censorship Tightrope with Google's Marissa Mayer on June 16, 2009
- John Viega Talks About Beautiful Security on June 10, 2009
- Google Squared is an Exponential Improvement in Search on June 3, 2009
- Velocity Preview - The Greatest Good for the Greatest Number at Microsoft on May 18, 2009
- Google Engineering Explains Microformat Support in Searches on May 12, 2009
- Velocity Preview - Keeping Twitter Tweeting on May 7, 2009