Emerging Tech
Emerging Technology is our catch-all phrase for the innovations and trends we're watching (and the name of our popular conference). We seek out and amplify the "faint signals" from the alpha geeks who are creating the future, and report on them here. We're currently intrigued by data visualization and search, hardware hacking, alternative energy, personalized medicine, immersive online environments, nanotechnology, body modification, and other mindbending new technology that will spark new industries.
ETech Preview: Creating Biological Legos
by James Turner | comments: 15
Running time: 00:15:25
If you've gotten tired of hacking firewalls or cloud computing, maybe it's time to try your hand with DNA. That's what Reshma Shetty is doing with her Doctorate in Biological Engineering from MIT. Apart from her crowning achievement of getting bacteria to smell like mint and bananas, she's also active in the developing field of synthetic biology and has recently helped found a company called Ginkgo BioWorks, which is developing enabling technologies to allow for rapid prototyping of biological systems. She will be giving a talk entitled Real Hackers Program DNA at O'Reilly's Emerging Technology Conference, March 9-12, in San Jose, California. And she's joining us here today. Thank you for taking the time.
RESHMA SHETTY: No problem. Happy to be here.
JAMES TURNER: So first of all, how do you make bacteria smell nice, and why? I get an image of a commercial, "Mary may have necrotizing fasciitis, but at least her hospital room smells minty fresh."
RS: Well, the original inspiration for the project was the fact that for anybody who works in a lab, who works with E. coli, when you grow cultures of the stuff, it just smells really bad. It smells really stinky, basically. And so our thought was, "Hey, why don't we reengineer the smell of E. coli? It'll make the lab smell minty fresh, and it's also a fun project that gets people, who maybe aren't normally excited about biology, interested in it because it's a very tangible thing. I can smell the change I made to this bacteria."
JT: So what was the actual process involved?
RS: So the process was, you basically take a gene, we took a gene from the petunia plant, which normally provides an odor to the flower, and you place that gene into the E. coli cell. And by supplying the cell with an appropriate precursor, you make this minty smell as a result. So it's fairly straightforward.
JT: Your degree, biological engineering, is a new one to me. How is it different from biochemistry or microbiology or genomics or any of the other traditional biotech degrees?
RS: Well, biology and biochemistry, and so on, are concerned with studying the natural world. So I'm going to go out and figure out how the natural world works. Biological engineering, instead, is really all about saying, "Hey, we have this natural world around us. Biology is, in some sense, a new technology through which we can build new engineered biological systems." Right? So the idea is, what's the difference between physics and electrical engineering? Electrical engineers want to go build. So in biological engineering, we're interested in going and building stuff, too. But using biology, rather than physics, as the underlying science of it.
tags: biology, emerging tech, synthetic biology
ETech Preview: Why LCD is the Cool New Technology All Over Again
by James Turner | comments: 5
Running time: 00:43:53
In an early test of the OLPC XO in Nigeria, the student users dropped every laptop several times a day. Despite the laptops' rugged construction, they occasionally needed fixing, and a group of six-year-old girls opened up a "hospital" to reseat cables and do other simple repairs. Mary Lou Jepsen, the One Laptop per Child project's CTO, had this response: "I put extra screws underneath the battery cover so that if they lost one, they could have an extra one. And kids trade them almost like marbles, when they want to try to get something fixed in their laptop."
Mary Lou led the development of the OLPC's breakthrough low-power transflective display, which combines a traditional backlit color display with a black-and-white display that can be used outdoors. She left OLPC to form Pixel Qi and bring the revolutionary engineering used in the XO to the broader consumer market. In this interview, she discusses lessons learned from OLPC and shares her vision of "cool screens that can ship in high volume, really quickly, at price points that are equivalent to what you pay for standard liquid crystal displays."
At ETech, Mary Lou's keynote presentation delves further into Low-Cost, Low-Power Computing.
JAMES TURNER: I'm speaking today with Mary Lou Jepsen, Founder and CEO of Pixel Qi. Dr. Jepsen previously served as chief technology officer for the One Laptop per Child program, where she was an instrumental player in the development of the OLPC's revolutionary hybrid screen. She also previously served as CTO of Intel's display division. Dr. Jepsen was also recently named by Time Magazine as one of the 100 most influential people in the world for 2008. She'll be speaking at the O'Reilly Emerging Technology Conference in March, and we're pleased she's taken the time to talk to us. Good evening.
MARY LOU JEPSEN: Hi. Nice to speak with you tonight.
JT: So in some ways, you're kind of uniquely qualified to comment on the current travails of the OLPC since you've been in highly influential positions both in the OLPC effort itself and at Intel, who some believe tried to sabotage the OLPC. Do you think that the OLPC would've had wider acceptance if the Intel Classmate wasn't competing against it?
MLJ: It is interesting. I think the OLPC, and I haven't seen the latest numbers, sold a lot more than the Classmate. I think head-to-head there's no comparison which is the better machine, and I'm not saying that just because I'm the architect. But what's really happened has been extraordinary. I think OLPC's impact in sort of spearheading the movement to Netbooks is fairly undisputed, although OLPC is not the best selling Netbook; 17 million Netbooks shipped in 2008 and that's through companies like Acer, Asus, MSI, HP, Dell. And that impact on the world is starting to be felt.
JT: What were the factors that led you to leave the OLPC program and start Pixel Qi?
MLJ: You know, I started OLPC with Nicholas in his office in the beginning, in January of 2005. And at that point, right after that Bill Gates, Steve Jobs, Michael Dell, all said it was impossible. So it became my job to sort of take that, create an architecture, invent a few things, convince the manufacturers to work with me to develop it, get a team together, and take it into high-volume mass production. And then it got to the point where my days were spent getting safety certifications for various countries.
And I just realized, it's time for me to continue doing this; this is the best job I've ever done, but to keep going, why not make these components that are inside of the XO and let everybody buy them rather than just exclusively making and designing them for the OLPC laptop. If you make more of something, you can sell it for less. So rather than just serving the bottom of the pyramid, why not take the fantastic technology that we developed at OLPC and serve the whole pyramid? Everybody wants their batteries to last a lot longer. Everybody wants screens that are e-paper-like and high resolution and sunlight readable. So why not make these for the whole world?
tags: displays, emerging tech, etech, green tech, lcd, olpc, pixel qi
The Kindle and the End of the End of History
by Jim Stogdill | comments: 21
This morning I was absentmindedly checking out the New York Times' Bits blog coverage of the Kindle 2 launch and saw this:
“Our vision is every book, ever printed, in any language, all available in less than 60 seconds.”
It wasn't the main story for sure. It was buried in the piece like an afterthought, but it was the big news to me. It certainly falls into the category of big hairy audacious goal, and I think it's a lot more interesting than the device Bezos was there to launch (which still can't flatten a colorful maple leaf). I mean, he didn't say "every book in our inventory" or "every book in the catalogues of the major publishers that we work with." Or even, "every book that has already been digitized." He said "every book ever printed."
When I'm working I tend to write random notes to myself on 3x5 cards. Sometimes they get transcribed into Evernote, but all too often they just end up in piles. I read that quote and immediately started digging into the closest pile looking for a card I had just scribbled about an hour earlier.
I had been doing some research this morning and was reading a book published in 1915. It's long out of print, and may have only had one printing, but I know from contemporary news clippings found tucked in its pages that the author had been well known and somewhat controversial back in his day. Yet, Google had barely a hint that he ever existed. I fared even worse looking for other people referenced in the text. Frustrated, I grabbed a 3x5 card and scribbled:
"Google and the end of history... History is no longer a continuum. The pre-digital past doesn't exist, at least not unless I walk away from this computer, get all old school, and find an actual library."
My house is filled with books, it's ridiculous really. They are piled up everywhere. I buy a lot of old used books because I like to see how people lived and how they thought in other eras, and I guess I figure someday I'll find time to read them all. For me, it's often less about the facts they contain and more about peeking into alternative world views. Which is how I originally came upon the book I mentioned a moment ago.
The problem is that old books reference people and other stuff that a contemporary reader would have known immediately, but that are a mystery to me today - a mystery that needs solving if I want to understand what the author is trying to say, and to get that sense of how they saw the world. If you want to see what I mean, try reading Winston Churchill's Second World War series.
Churchill speaks conversationally about people, events, and publications that a London resident in 1950 would have been familiar with. However, without a ready reference to all that minutiae you'll have no idea what he's talking about. Unfortunately, a lot of the stuff he references is really obscure today, and today's search engines are hit and miss with it - they only know what a modern Wikipedia editor or some other recent writer thinks is relevant today. Google is brilliant for things that have been invented or written about in the digital age, or that made enough of a splash in their day to get digitized since, but the rest of it just doesn't exist. It's B.G. (before Google) or P.D. (pre digital) or something like that.
To cut to the chase, if you read old books you get a sense for how thin the searchable veneer of the web is on our world. The web's view of our world is temporally compressed, biased toward the recent, and even when it does look back through time to events memorable enough to have been digitally remembered, it sees them through our digital-age lens. They are being digitally remembered with our world view overlaid on top.
I posted some of these thoughts to the Radar backchannel list and Nat responded with his usual insight. He pointed out that cultural artifacts have always been divided into popular culture (on the tips of our tongues), cached culture (readily available in an encyclopedia or at the local library) and archived culture (gotta put on your researcher hat and dig, but you can find it in a research library somewhere). The implication is that it's no worse now because of the web.
I like that trichotomy, and of course Nat's right. It's not like the web is burying the archive any deeper. It's right there in the research library where it has always been. Besides, history never really operates as a continuum anyway. It's always been lumpy for a bunch of reasons. But as habit and convenience make us more and more reliant on the web, the off-the-web archive doesn't just seem hard to find, it becomes effectively invisible. In the A.G. era, the deep archive is looking more and more like those charts used by early explorers, with whole blank regions labeled "there be dragons".
So, back to Bezos's big goal... I'd love it to come true, because a comprehensive archive that is accessible in 60 seconds is an archive that is still part of history.
tags: big hairy audacious goals, emerging tech, publishing
A Climate of Polarization
by Gavin Starks | comments: 10
Guest blogger Gavin Starks is founder and CEO of AMEE, a neutral aggregation platform designed to measure and track all the energy data in the world. Gavin has a background in astrophysics and over 15 years of Internet development experience.
We're all aware of the emotive language used to polarize the climate change debate.
There are, however, deeper patterns which are repeated across science as it interfaces with politics and media. These patterns have always bothered me, but they've never been as "important" as now.
We are entering a new era of seismic change in policy, business, society, technology, finance and our environment, on a scale and at a speed substantially greater than previous revolutions. The sheer complexity of these interweaving systems is staggering.
Much of this change is being driven by "climate science", and in the communications maelstrom there is a real risk that we further alienate "science" across the board.
We need more scientists with good media training (and presenting capability) to change the way that all sciences are represented and perceived. We need more journalists with deeper science training - and the time and space to actually communicate across all media. We need to present uncertainty clearly, confidently and in a way that doesn't impede our decision-making.
On the climate issue, there are some impossible levers to contend with:
- Introducing any doubt into the climate debate stops any action that might combat our human impact.
- Introducing "certainty" undermines our scientific method and its philosophy.
When represented in political, public and media spaces, these two levers undermine every scientific debate and lead to bad decisions.
Pascal's Wager is often invoked, and this is entirely reasonable in this case.
It is reasonable because of what's at stake: the risk of mass extinction events. If there is a probability that anthropogenic climate change will cause the predicted massive interventions in our ecosystem, then we have to act.
The nature of our actions must be commensurate with both the cause and the effect. The causes are many: population, production, consumption - as are the effects: war, poverty, scarcity, etc.
Our interventions will use all our means to address both cause and effect, and those actions will run deep.
Equally, we must allow science to do what it's designed to do: measure, model, analyse and predict.
From a scientific perspective we must allow more room for theories to evolve, otherwise we'll only prove what we're looking for.
However, if we ignore the potential need to act, the consequences are not something anyone will want to see.
It's not something we can fix later (for me, "geo-engineering" is not a fix, it's a pre-infected band-aid).
Given the massive complexity of the issues, and that - really - anthropogenic climate change is only one of many "peak consumption" issues that we face, there is no way we can accurately communicate all the arguments that would lead to mass understanding.
However, the complexity issues are no different from those we face in politics. They are not solvable, but they are addressable.
We can communicate the potential outcomes, and the decisions that individuals need to make in order to impact the causes.
Ultimately it's your personal choice.
My choice is based on my personal exposure to the science, business, data, policy, media, and broader issues around sustainability. That choice is to do my best to catalyse change as fast as I possibly can.
We all need to actively engage in improving communication, so that everyone - potentially everyone on Earth - can make informed choices about the future of the planet we inhabit.
--
Recommended reading:
https://www.realclimate.org/ is a great resource.
Today, the UK Government launched a campaign "to create a more science literate society, highlighting the science and technology based industries of the future".
tags: climate change, communication, emerging tech, media
What Will Change Everything?
by Brady Forrest | comments: 16
Regular Radar contributor Linda Stone sent this in to be posted today.
What game-changing scientific ideas and developments do you expect to live to see? The Internet, television, antibiotics, automobiles, electricity, nuclear power, space travel, and cloning - these inventions were born out of dreams, persistence, and imagination. What game-changing ideas can we expect to see in OUR lifetimes?
tags: edge, emerging tech, question
The State of Transit Routing
by Jim Stogdill | comments: 12
My brother called me a week ago and during the course of our conversation mentioned that he made the trek to the Miami Auto Show. He was complaining that he really wanted to take Tri-Rail (the commuter rail that runs along Florida's South East coast) but it was just too hard to figure out the rest of the trip once he got off the train. "One web site for train schedules, another for buses, and another for a city map to tie it all together. It was just too much trouble to figure out, so I drove. I just want to go online and get directions just like I do for driving, but that tells me which train, which bus, etc."
Coincidentally, later in the day I downloaded the iPhone 2.2 upgrade with the new walking and public transit directions. So far, at least where I live, it's useless. The little bus icon just sits there grayed out, taunting me. I guess because SEPTA (our local transit authority for bus and regional rail) isn't giving data to Google?
My brother hadn't heard of Google Transit, but it turns out to have some coverage in Miami. Its coverage at this point seems to be transit-authority centric, and it doesn't seem to have great support for mixed-mode trips or routes that cross transit system boundaries. I am curious, though: is it being used? Let me know in the comments if you are using it to good effect.
Anyway, my brother's call on the same day as the iPhone update piqued my interest in the current state of the art for mixed-mode transit routing. After some mostly fruitless web searches, I reached out to Andrew Turner. I knew he'd know what was going on. This is what he had to say:
Routing is definitely one of the emergent areas of technology for the next generation of applications. So far, we've done a great job getting digital maps onto the web and onto mobile locative devices, and getting users comfortable with them. One problem for a while has been the lack of data. You can have a great algorithm or concept, but without data it's useless. Gathering this data has been prohibitively expensive - companies like NAVTEQ drive many of the roads they map for verification and additional data. Therefore, if you wanted to buy road data from one of the vendors, you had to have a large sum of money in the bank and know how you were going to monetize it. This stifled experimentation and the creation of niche applications.
Now that the data is becoming widely, and often freely, available, innovation is happening at an increased pace.
For one example, consider typical road navigation. The global OpenStreetMap project has always had topology (road connectivity), but the community is now adding attribute data to ways, such as the number of lanes, stop lights, turn restrictions, speeds, and directionality. Anyone can download this data to use with a variety of tools such as pgRouting. As a result, people are rethinking standard routing mechanisms that assume travel from A to B via the fastest, or shortest, route. What if a user wants to take the "greenest" route as determined by lowest total fuel consumption, or the most scenic route based on community feedback?
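To make the "swap the cost function" idea concrete, here is a minimal sketch in Python, using networkx as a stand-in for a routing engine like pgRouting (which is SQL-based). The road segments, distances, times, and fuel figures are invented for illustration.

```python
# A minimal sketch (not from the original post) of cost-based routing.
# The graph stays the same; only the edge attribute used as the weight changes.
import networkx as nx

G = nx.DiGraph()
# Each edge is a road segment carrying several candidate "costs".
G.add_edge("A", "B", km=5.0, minutes=6, fuel_l=0.45)  # flat arterial road
G.add_edge("B", "D", km=4.0, minutes=5, fuel_l=0.35)
G.add_edge("A", "C", km=3.5, minutes=9, fuel_l=0.60)  # shorter but hilly
G.add_edge("C", "D", km=3.0, minutes=8, fuel_l=0.55)

shortest = nx.shortest_path(G, "A", "D", weight="km")       # ['A', 'C', 'D']
fastest  = nx.shortest_path(G, "A", "D", weight="minutes")  # ['A', 'B', 'D']
greenest = nx.shortest_path(G, "A", "D", weight="fuel_l")   # ['A', 'B', 'D']
print(shortest, fastest, greenest)
```

The shortest route and the greenest route come out differently here, which is exactly the kind of choice a user could be offered once the attribute data exists.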
An area that has strongly embraced this idea is disaster response. Agencies and organizations deploy to areas with little on-the-ground data, or with data that is now obsolete due to the disaster they're responding to. Destroyed bridges, flooded roads, and new or temporary infrastructure are just some of the things lost on typical navigation systems. Giving responders the capability to correct the data and instantly get new routes is vital. And these routes may need to be based on attributes different from those used by typical engines - it's not about the fastest route, but about which roads will handle a 5-ton water truck.
This scheme was deployed in the recent hurricane response in Haiti in conjunction with the UNJLC, CartOng, OpenStreetMap and OpenRouteService.
Beyond simple automotive routing, we can now incorporate multi-modal transit. With 50% of the world's population now living in urban areas, the assumption that everyone is in a car is no longer valid. Instead, people use a mixture of cars, buses, subways, walking, and bicycling. This data is also being added to OpenStreetMap, as well as to other projects such as Bikely or EveryTrail. GraphServer is one routing engine that will incorporate these various modes and provide routes.
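Below is a toy sketch, not GraphServer itself, of what multi-modal routing means in graph terms: every leg is an edge carrying a mode and a duration, so a single shortest-path query can mix walking, bus, and rail. Real engines also handle timetables and transfer penalties, which are omitted here, and the stops and times are hypothetical.

```python
# A minimal multi-modal routing sketch: modes live on the edges,
# so one query can stitch together walking, bus, and rail legs.
import networkx as nx

G = nx.DiGraph()
G.add_edge("home", "bus_stop", minutes=7, mode="walk")
G.add_edge("bus_stop", "rail_station", minutes=12, mode="bus")
G.add_edge("home", "rail_station", minutes=25, mode="walk")
G.add_edge("rail_station", "downtown", minutes=18, mode="rail")
G.add_edge("bus_stop", "downtown", minutes=45, mode="bus")

path = nx.shortest_path(G, "home", "downtown", weight="minutes")
legs = [(u, v, G[u][v]["mode"], G[u][v]["minutes"]) for u, v in zip(path, path[1:])]
print(legs)  # fastest door-to-door trip here mixes walk -> bus -> rail
```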
And we're interfacing with all these engines using a variety of devices: laptop, PND (Personal Navigation Device), GPS units, mobile phones, and waymarking signs. PointAbout recently won an award in the Apps for Democracy contest for their DC Location Aware Realtime Alerts mobile application, which displays the route to the nearest arriving metro.
What's also interesting is the potential of these routing tools beyond actual specific individual routes. Taken in amalgamation the routing distances form a new topography of the space. Given a point in the city, how far can I travel in 20 minutes? in 40 minutes? for less than $1.75? This type of map is known as an isochrone. Tom Carden and MySociety developed London Travel Time Maps that allow users to highlight the spots in London given a range of house prices and travel times.
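As a rough sketch of how an isochrone can be computed, run Dijkstra outward from an origin and keep every node reachable within the budget; the network and travel times below are invented, and a real implementation would also interpolate along edges to draw the actual boundary.

```python
# A toy isochrone: every node reachable from the origin within 20 minutes.
import networkx as nx

G = nx.Graph()
edges = [("origin", "a", 8), ("a", "b", 7), ("origin", "c", 15),
         ("c", "d", 10), ("b", "d", 12)]
for u, v, minutes in edges:
    G.add_edge(u, v, minutes=minutes)

within_20 = nx.single_source_dijkstra_path_length(
    G, "origin", cutoff=20, weight="minutes")
print(within_20)  # {'origin': 0, 'a': 8, 'b': 15, 'c': 15}; 'd' is out of reach
```

The same computation with a fare attribute instead of minutes would answer the "for less than $1.75?" version of the question.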
Despite these apparent benefits, there is a large hurdle. Like road data, there has been a lack of openly available transit data to power applications and services. Providers like NAVTEQ and open projects like OpenStreetMap are possible because public roads are observable and measurable by anyone. By contrast, the many, varied local transit agencies own and protect their routing data and are reluctant to share it. Google Transit has made great strides in working with transit authorities to expose their information in the Google Transit Feed Specification - at least to Google. The data does not have to be publicly shared, and in many cases it hasn't been.
However, not even the allure of the widely admired Google Transit can induce every transit authority to share its prized data. The Director of Customer Service of the Washington Metropolitan Area Transit Authority (WMATA) plainly states that working with Google is "not in our best interest from a business perspective."
Hopefully, this situation will change, first through forceful FOIA requests and later through cooperation. One step in this direction has been TransitCamps. And Portland's TriMet is a shining example, with a Developer Resources page detailing data feeds and APIs.
These experiments are just the beginning of what is being pushed in this space. Routing is one of those features that users may not realize they need until they have it, and then they'll find it indispensable. The ability for a person to customize their valuation of distance, to assist in making complex decisions and searches, is very powerful.
For more projects and tools, check out the OpenStreetMap routing page, Ideas in transit and the OGC's OpenLS standards.
tags: emerging tech, geo
DIY Appliances on the Web?
by Jim Stogdill | comments: 9
Or, My Enterprise is Appliancized, Why Isn't Your Web?
I wrote a couple of posts a while back that covered task-optimized hardware. This one was about a system that combined Field Programmable Gate Arrays (FPGAs) with a commodity CPU platform to provide the sheer number-crunching performance needed to break GSM encryption. This one looked at using task-appropriate, efficient processors to reduce power consumption in a weather-predicting supercomputer. In these two posts I sort of accidentally highlighted two of the three key selling points of task-specific appliances: sheer performance and energy efficiency (the third is security). The posts also heightened my awareness of the possibilities for specialized hardware, and some of my more recent explorations of the appliance market in particular got me wondering if there might be a growing trend toward specialized appliances.
Of course, specialized devices have been working their way into the enterprise ever since the first router left its commodity Unix host for the task-specific richness of specialized hardware. Load balancers followed soon after, and then devices from companies like Layer 7 and DataPower (now IBM) took the next logical step and pushed the appliance up the stack to XML processing. These appliances aren't just conveniently packaging intellectual property inside commodity 1U blister packs; they are specialized devices that process XML on purpose-built Application Specific Integrated Circuits (ASICs), accelerate encryption and decryption in hardware, and encapsulate most of an ESB inside a single tamper-proof box whose entire OS is in firmware. They are fast, use a lot less power than an equivalent set of commodity boxes, and are secure.
Specialization is also showing up in the realm of the commodity database management system. At last year's Money:Tech, Michael Stonebraker described a column-oriented database designed to speed access to pricing history for back testing and other financial applications. In this case the database is still implemented on commodity hardware. However, I think it's interesting in the context of this conversation on specialized computing because it speaks to the inadequacy of commodity solutions for highly specific requirements.
A device from Netezza is also targeted at the shortcomings of the commodity DBMS. In this case the focus is on data warehousing, but it takes the concept further with an aggressive hardware design that is delivered as an appliance. It has PostgreSQL at its core, but it takes the rather radical step of coupling FPGAs directly to the storage devices. The result, for at least a certain class of query, is a multiple-order-of-magnitude boost in performance. I think this device is noteworthy because it puts the appliance way up the stack and is perhaps a harbinger of further penetration of the appliance into application-layer territory.
While appliances are expanding their footprint in the enterprise, it seems like the exact opposite might be happening on the web. Maybe the idea of a closed appliance is anathema to the open source zeitgeist of the web, but in any case, the LAMP stack is still king. Even traditional appliance-like tasks such as load balancing seem to be trending toward open source software on commodity hardware (e.g. Perlbal).
I can't help but wonder though, at the sheer scale that some web properties operate (and at the scale of the energy cost required to power them), can the performance and cost efficiency of specialized hardware appliances be ignored? Might there be a way to get the benefits of the appliance that is in keeping with the open source ethos of the web?
If you've ever uploaded a video to YouTube and waited for it to be processed, you have an idea of how processor-hungry video processing is on commodity hardware. I don't know what Google's hardware and energy costs are for that task, but they must be significant. Same goes for Flickr's image-processing server farm, and I would guess for Google's voice processing now that its new speech services have launched. If the combined hardware and electricity costs are high enough, maybe this is a good place to introduce specialized appliances to the web?
But how to do that in a way that is consistent with the prevailing open source ethos and that still lets a firm continue to innovate? I think an answer might be sort of DIY writ large: a confluence of open source and open hardware that works like an undocumented joint venture based on the right to fork. Think Yahoo and the Hadoop community, or JP Morgan and friends with AMQP, but with hardware, and you get the idea. Such a community could collaborate on the design of the ASICs and the appliance(s) that host them, and even coordinate production runs in order to manage unit costs. Perhaps more importantly, specifying the components openly would support cost sharing across these companies while still allowing flexibility in how they are deployed and, ultimately, generativity and innovation for future uses.
There are probably a bunch of reasons why this is just silly speculation, but Google's efforts with power supply efficiency might be seen as at least a bit of precedent for web firms dabbling in hardware and hardware specifications. In fact, Google's entire stack, from its unique approach to commodity hardware to software infrastructure like GFS, might be thought of as a specialized appliance that suits the specific needs of search. It's just a really, really big one that "ships" in a hundred-thousand-square-foot data center form factor.
tags: emerging tech, energy, open hardware, open source
Apps for Democracy
by Jim Stogdill | comments: 3
Vivek Kundra, the District of Columbia's CTO, isn't just talking about transparent government and participative democracy, he's working hard to make DC's massive data stores transparent and inviting your participation.
I first heard about Vivek's push for transparency when he spoke at an Intelligence Community Conference in September (I just happened to be speaking on a panel thanks to a twitter-induced serendipitous introduction to one of the conference organizers - @immunity). He was there in a sort of "one government entity to another" role to demonstrate that data could be shared and that it is valuable to do so.
I was impressed with the risks he was taking to push hard for the "democratization of data" and for what he was calling The Digital Public Square. What came through really clearly was that he didn't just view this as a technology exercise, but as a way for citizens to participate in real ways and to increase government accountability. It was an engaging and refreshing talk.
It's not exactly news at this point that he came up with $20,000 to offer prizes for the best applications to be built on top of the district's data. After all, the submissions have been long closed and the winners have already been announced. However, I thought it might be worth pointing out that you still have until tomorrow to vote for the two people's choice award winners.
I thought it was kind of fun to just poke around in the list of submissions and see what people came up with. As you can imagine, many of them are geo-spatial displays of district data stores, but there are some other ideas in there as well. Take a look, see what you think, and get your people's choice vote in.
And just because the contest is over doesn't mean it's too late to build something. Take a look at the catalog of data and see what comes to mind. This is just the beginning (Mayor Nutter of Philly, I'm looking at you...).
tags: emerging tech
My Apple Holiday Wish
by Jim Stogdill | comments: 20
I've been searching for a personal backup solution that doesn't suck for, well, pretty much since I got my first computer in the 80's, and I'm still looking.
A few years ago I was cleaning out old crap and ran across boxes and boxes of 800KB floppies labeled "1988 backup disk x." The trash/recycling picker-uppers got those, along with a pile of Zip disks, various CDs, DVDs, a USB drive or two, and a couple of bare SATA drives that I was too cheap to buy housings for. Oh, and there was even a pile of tapes in some long-forgotten format in there.
After a few years of manually copying stuff to multiple USB drives, last year I was completely seduced by the "it's like RAID but you don't need identical drives" beauty of the Drobo. Three failures later (including one with smoke), plus a nasty virtual tinnitus that comes and goes as its disks transition through a perfect cabinet-resonating frequency, incompatibility problems with Time Machine and Airport Extreme, and access speeds that are too slow to serve Final Cut - screw it. Now it mostly just sits there powered down, making a Drobo-shaped dust-free spot on my desk. It's too buzzy to listen to but too expensive to Freecycle.
Next up, Time Capsule. Still (even more) useless for Final Cut and that sort of thing, but it's doing an ok job with backups - at least of the straight Time Machine variety. There are still a few issues though...
First off, I don't really trust that single spinning platter. It will die some day. Plus, it's in my house about ten feet from where my laptop is usually parked so my eggs are all in a single fire / theft / flood basket.
Apple's Mobile Me and the Backup program that comes with it theoretically provide a solution to this issue, but unfortunately it sucks. It's slow, much slower than a local Time Capsule backup, because it relies on an Internet connection. Also, it effectively requires my machine to be running all the time so that it can conduct its backups in the middle of the night when I won't be competing for bandwidth or CPU cycles.
Even worse, it fails all the time. I don't know why, but it's finicky. A brief connectivity hiccup (or whatever) and I wake up the next day to find that my multi-hour backup died. Finally, it's too small to be useful for more than a few key critical files. I have a few hundred gigabytes of data I'd like to secure, and my Mobile Me account is limited to twenty.
So Apple, I don't usually resort to begging, but here's your chance to fix backup for me once and for all. Just update the firmware in my Time Capsule so that my fast Wi-Fi-based local backups can be incrementally streamed to either an expanded Mobile Me account or to a separate S3 account (or whatever) whenever it's sitting at home with my network connection to itself.
I can't leave my laptop connected for the days it would take to stream all those hundreds of gigs, but Time Capsule is just sitting there with my Internet connection doing nothing while I'm at work anyway, so give it something to do. This way I'll have the best of both worlds: fast, reasonably secure backups to my local Wi-Fi-connected Time Capsule when I'm home, and don't-need-to-think-about-it remote storage that can take its time when I'm not. At the risk of way overreaching, it could even work in both directions, so that if I'm on the road for an extended period, Time Machine could back up critical changes directly to Mobile Me, which could then in turn incrementally stream them back to my Time Capsule.
Ok, that's it. A simple idea I think. Can I have it by Christmas?
By the way, if the thought of all those gigabytes in your Mobile Me data centers makes you blanch (and the idea of using S3 is anathema to Apple's do-it-all culture), how about a Time Capsule-based distributed hash overlay network? If every Time Capsule shipped with the option of turning on a separate partition representing about a third of the disk, you could put a PlanetLab-like distributed file system in there. My files would be split into chunks, encrypted, and distributed around to other people's Time Capsules while some of their stuff was on mine. Sort of an inverted BitTorrent for backups, no data center required.
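As a toy illustration of that idea (purely my sketch, not anything Apple has built), the pieces are simple: split a file into chunks, encrypt each chunk with a key that never leaves the owner's machine, and pick a peer for each chunk by hashing. The peer names are hypothetical, and the example uses the third-party Python cryptography package.

```python
# A toy sketch of chunk-encrypt-distribute backup; no single peer ever
# holds a readable copy of the file.
import hashlib
from cryptography.fernet import Fernet  # third-party 'cryptography' package

CHUNK_SIZE = 1 << 20  # 1 MB chunks
peers = ["capsule-17", "capsule-42", "capsule-99"]  # hypothetical peer IDs

key = Fernet.generate_key()  # stays on the owner's machine
box = Fernet(key)

def place_chunks(data: bytes):
    """Yield (peer, chunk_id, encrypted_chunk) placements."""
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        chunk_id = hashlib.sha256(chunk).hexdigest()
        peer = peers[int(chunk_id, 16) % len(peers)]  # naive consistent-ish hashing
        yield peer, chunk_id, box.encrypt(chunk)

for peer, chunk_id, blob in place_chunks(b"example backup payload" * 100000):
    print(peer, chunk_id[:12], len(blob))
```

A real system would add replication and a way to rebalance chunks when peers come and go, but the basic shape is no more exotic than that.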
That would be cool but I know you won't do it. And, from the category of "things you are even less likely to do," if you opened up the Time Capsule firmware to third parties someone else probably would.
tags: emerging tech
PICNIC Network 2008
by Brady Forrest | comments: 2
The week of September 22nd I am going to be flying across the Atlantic for the third PICNIC Network. Ever since I heard about the conference last fall, I've wanted to attend. My friends' stories from last year focused on the many RFID-enabled art pieces. As discussed in this interview, these were developed by Mediamatic, a digital art lab.
Last year Mediamatic offered PICNIC delegates some fun and intriguing services. An RFID tag was added to the PICNIC badges and linked to each delegate's profile on the PICNIC network. A team of top-notch hackers, developers, and dreamers got involved and came up with cute, fun, and relevant new services.
"Last year we had the first version of the hackers camp where we built some cool physical interfaces for the PICNIC social network. We had the Photobooth, Badger, The Friend-Drink station, I-tea and of course a whole range of inspiring designs that we only could realise later."
Mediamatic is going to be hosting another hacking workshop, and I'll be volunteering. Additionally, there are a number of sessions and day-long minitracks:
Surprising Africa has a great selection of content. I've never been to Africa, but there is an increasing amount of tech heading there. Google, Nokia and Vodafone are going to be describing their initiatives. Ethan Zuckerman will be discussing citizen journalism in Africa; Ethan gave an amazing talk at last year's ETech on this topic (Radar post).
Visible City is exploring the data available from cities. As their page states "What if an entire city could be visible from above, like we see it from an airplane? Not simply buildings and squares, but also the aggregation of people who populate it, outdoor as well as indoor. We could detect public gatherings and traffic jams, estimate which neighborhoods are most crowded, reconstruct commuting patterns during the day." This is very relevant for this year's ETech.
In the Internet of Things speakers from SAP, OpenSpime, and ThingM (amongst others) will be discussing how they are moving devices and objects online. This session makes me think of the soon-to-be released Fitbit, a personal web-enabled activity tracker. Soon it won't be just things online, but our passive data as well.
Perhaps the most significant event of the week will be the Green Challenge. The winner will get €500,000 from Sir Richard Branson. Last year's winner was Qurrent, an energy consumption monitoring company.
If you're going to be there, drop me a line.
tags: emerging tech, etech, web 2.0
Disaster Technology for Myanmar/Burma aid workers
by Jesse Robbins | comments: 8
There is an ongoing crisis in Myanmar (Burma) in the aftermath of cyclone Nargis. The ruling military junta is finally allowing humanitarian organizations into the region after denying access for almost a week. The situation is grim, and you can help by donating to organizations like: Doctors without Borders, Direct Relief, and UNICEF.
There has been some incredible discussion on the humanitarian tech and Geo lists in the past 24 hours around adapting/improving existing collaboration services to work with the tools in the field. Mikel Maron and I will be speaking about this at Where2.0 next week, and it looks like some exciting work will be happening there and at WhereCamp.
Eduardo Jezierski from InSTEDD is currently working to localize the Sahana Disaster Management System.
Jonathan Thompson's organization, Humanlink, has been working on adapting technology for aid workers for some time. You can follow recent developments on the Aid Worker Daily blog.
Update: Paul Currion posted a big list of other projects now underway to the humanitarian.info blog:
- A Sahana instance is being set up for the use of anybody who needs it, with the support of INSTEDD and possible uptake by NetHope members.
- Direct Relief International have done up a KMZ file of health facilities in-country, based on the WHO 2002 Global Health Atlas.
- OCHA are prepping a HIC to support the existing Myanmar Information Management Unit, who have already put out some W3 maps.
- UNOSAT have also got their sat on with a KMZ file of the cyclone path and the usual satellite mapping.
- Ditto ITHACA, who have released a series of satellite maps showing the impact of Nargis.
- ReliefWeb’s info stream on Cyclone Nargis is of course like drinking water from a hose, with their map filter probably most useful.
- The WorldWideHelp blog roars into action with all the news that’s fit to blog.
- A couple of the mailing list discussions that I’m on are talking about ways in which we might leverage cellphone and/or satellite phone communications if they become available, particularly for tracking relief and relief personnel.
- Digital Globe and Geo-Eye have hopped the NASA satellite for an updating KML layer on the cyclone.
- Microsoft apparently have a team on standby to deploy the refugee tracking software that was developed for Kosovo (no reference yet).
- Telecoms sans Frontieres are also on standby out of Bangkok, waiting for access to free up.
- Also Infoworld points out that - with regards to early warning - IT didn’t fail Myanmar, people did.
tags: disastertech, diy, emerging tech, emerging telephony, etech, geo, hacks, make, open source, operations, web 2.0
roBlocks: Simple Blocks To Make Robots
by Brady Forrest | comments: 9
roBlocks are small, computerized cubes that can be combined to make robots. They began as a research project at Carnegie Mellon. They look like great fun for fooling around or for teaching programming concepts.
The catalog page shows about twenty different blocks. Each of those blocks has a single purpose. There are four types of blocks: Sensors (light, sound), Actuators (movement), Operators (negative, min/max) and Utility (power). When put together they can be made to perform complex actions.
The creators provide an example of roBlocks' interactions in their paper "The Robot is the Program: Interacting with roBlocks":
It is easy to understand the basic idea of roBlocks by considering a simple light seeking robot made of two roBlocks: a light sensor block placed atop a tread block. The sensor measures the ambient light level and produces a number. The tread block gets that number from the light sensor block that sits on it, and runs its motor with a speed that corresponds to the magnitude of that number. To make the robot avoid light, take the two blocks apart and insert a red Inverse block between them. This operator block takes the number produced by the light sensor block, inverts it and transmits it to the tread block at the bottom. The new three-block robot moves away from a light source just as the previous robot moved toward it. This sort of modularity is possible because each of the blocks operates independently without knowing its place within the construction.
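Here is a small conceptual model in Python (my own sketch, not roBlocks firmware) of that example: each block is an independent function that hands a number to the block below it, so inserting the Inverse block flips the robot's behavior without reprogramming anything.

```python
# Conceptual model of the light-seeking / light-avoiding roBlocks example.
def light_sensor(ambient_lux: float) -> float:
    # Measure ambient light and normalize it to a 0..1 value.
    return max(0.0, min(ambient_lux / 1000.0, 1.0))

def inverse(value: float) -> float:
    # The red Inverse operator block: flip the number it receives.
    return 1.0 - value

def tread(value: float, max_speed: float = 0.3) -> float:
    # The tread block: run the motor at a speed proportional to its input (m/s).
    return value * max_speed

bright = 800.0
light_seeker = tread(light_sensor(bright))            # sensor stacked on tread
light_avoider = tread(inverse(light_sensor(bright)))  # Inverse block in between
print(light_seeker, light_avoider)
```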
The creators are going to be commercializing the blocks later this year. To see the prototypes in action, check out this video, or play with their online simulator. No word on whether or not they will open source the hardware.
The world of programmable hardware is expanding. Between roBlocks, IPRE (the open-source robot kit that was at ETech), BugLabs (the programmable, open-source gadgets - Radar post), and Lego MINDSTORMS NXT, there is starting to be something for every sophistication level and wallet size.
tags: emerging tech, etech