Jim Stogdill

Jim Stogdill is a group CTO for a large technology consultancy where he advocates the development of open source software in government and defense. He believes, perhaps naively, that open source can help break the proprietary lock-in business model that is the norm in that space. In previous lives he built B2B reverse auction systems, brought heuristic-based optimization and online trading to the corporate treasury, and traveled the world as a Navy officer. Unfortunately, from his vantage point it all looked like the inside of a submarine. He spends his free time hacking silver halides with decidedly low-tech gear. He's on Twitter as @jstogdill.
Wed, Jun 9, 2010
Streamlining craft in digital video
by Jim Stogdill | @jstogdill | comments: 2
I ran across an article this morning in the New York Times about The 48 Hour Film Project. I thought it was cool and it got me thinking about how a digital workflow makes filmmaking so much more accessible -- even before the new iPhone puts iMovie in our palms.
In short, the 48 Hour crew comes to your town and runs a contest that gives you two days to complete a four- to seven-minute film from script to screen. Some of the films are surprisingly watchable and engaging in a DIY-meets-media-culture kind of way.
I'm a still photographer but I occasionally dabble in film and video. Before it died under the weight of the digital avalanche, I used to subscribe to a magazine for 8mm filmmakers called "Small Film" (I think the German version still exists, but I can't find the link at the moment). It regularly featured filmmaking contests like "make a super 8 film with only in-camera cuts" or "make a film with a budget of $x." They could never have run a contest that lasted only 48 hours, though. It simply wouldn't have been possible with a film-based workflow.
tags: digital content, DIY, filmmaking
Tue, Apr 13, 2010
The iPad isn't a computer, it's a distribution channel
by Jim Stogdill | @jstogdill | comments: 42
"You don't want your phone to be an open platform..." and with that brief statement, Apple justified the closed iPhone and then quickly followed it with the monitored and controlled app store. But Steve, the iPad isn't a phone at all so why not open it up again? If people are concerned about the safety of their apps or need you to protect them from porn, you can do an "app store approved" program or something can't you? And really, do we even need an app store to tell us which apps are good in an era of ubiquitous user feedback and preferential attachment?
The thing is, Jobs' argument was always a bit disingenuous. Closed follows from his brain architecture, not from an argument on behalf of his customers or their network providers. Those are post facto justifications supporting an already-held point of view. And the reason the iPad is going to stay closed isn't because it is good for users, it's because it is good for Apple.
The bottom line is that the iPhone was a relatively open phone and we accepted it, but the iPad is a relatively closed computer, and that's a bummer. Jobs probably believes that he is doing it for the users, finally giving them a post-crank-the-handle-to-start-it experience, but it doesn't take a genius to see how it benefits Apple. Beliefs and self interest usually go hand in hand and here's what I think is really happening...
Microsoft in the 1980s was the perfect business. The kind of business every MBA would like to invent. It had network effects to drive adoption, products with near-zero marginal cost, and a distribution channel that was controlled and constrained enough in the days of floppy disks in boxes to enforce direct monetization. In short, it had leverage. The kind of leverage that delivers a very steep-sloped relationship between enterprise valuation and market penetration. Or, put the way an economist would put it, the kind of leverage that captures greater-than-economic rents. We paid more than we had to for Windows and Office during all those years, but at least we can take some comfort in the fact that a big part of it turned out to be an involuntary tithe for Bill's charitable efforts.
The music industry worked the same way. The constrained distribution channel of the vinyl record gave artists and the music companies that distributed their work tremendous leverage, and got them paid for it. Even if a lot of that economic surplus ended up going up the noses of label executives...
tags: generativity, ipad, open distribution
Fri, Mar 5, 2010
Yammer: Will viral work in the enterprise?
by Jim Stogdill | @jstogdill | comments: 16
I work for a very large company and at some point or another someone started a Yammer account based on our email domain. Starting on whatever day that was, Yammer commenced its viral expansion and its spread has really been quite impressive and rapid. Last time I looked we were approaching 3000 users.
The usage demonstrates all the scale-free behaviors you'd expect, though, so not everyone is yammering away. Still, both the growth and the impact have been impressive. We are developing a nice network of the kind of weak connections that tend to "small world" a big enterprise like ours. It's always difficult to quantify the benefits of "soft" collaboration, but I'm really happy with what I see and I've personally enjoyed the interactions and my expanded network.
I think Yammer has done so well because it's a really good product with well-thought-out features that make Twitter seem kinda retro. It has a nice slick interface, threaded conversations, and no pesky 140-character limit (though that's countered by a "return key = submit" behavior that inhibits multi-paragraph posts). They are also working to create the kinds of features that enterprises need to feel comfy: an API that includes directory integration, an Outlook module, and so on.
However, despite all that, I'm bummed to say I don't think they are going to make it.
The question of data privacy and ownership comes up over and over in our Yammer discussions. The last time it came up the thread ran for nearly 100 responses. Even though the typical post is something like "Who is using Grails?" or "Is the X application slow for everyone today or just for me?" data privacy is simply one of the biggest concerns going for a lot of companies these days. The mere suggestion that our data isn't under our control is a big deal.
This point was demonstrated to me in a personal and compelling way during my first week on Yammer. I mentioned a client meeting so that I could share a few tidbits with colleagues. Hours later I was surprised and dismayed when a Google search revealed that my comments had been re-posted to the FriendFeed of someone I didn't even know. Someone on our network had written a quick and dirty app to follow his Yammer RSS feed and re-post everything to FriendFeed. Then for good measure he followed everyone in our network. When I "politely suggested" he take it down, he equally politely explained to me that I just didn't get Web 2.0.
Despite that kind of hiccup, I don't think data privacy is the death knell. After all, no one has told us to stop using it yet. The real problem is that Yammer thinks viral works the same way in the enterprise that it works on the web. It doesn't.
Yammer, by being free and viral, is demonstrating in that soft-benefit kind of way to lots of enterprises like ours that networks of weak connections and "ambient collaboration" are useful. Usage is creating a pool of users and even executives who "get it." But they are playing their cards too early and are probably going to end up as little more than a contribution to someone else's cost of sales.
Recently a thread started with "does anyone know how to remove people from Yammer that left the company?" Well, it turns out that's an admin function and only available to paying customers.
While we have grown rapidly and virally, the "admin issue" is coming to a head with only about 1% of the company holding an account and probably more like 0.1% actively posting. There is no way this is a level of usage that an enterprise like ours sees as lock-in. And it won't be for anyone else either.
If the average company has an attrition rate of 10%, it means that EVERY company that adopts Yammer virally is going to start to have this conversation well before adoption has locked them in. Every company will face the problem of removing ex-employees by the time they reach relatively low penetration rates. If it's a 25-person shop it may be easier to just pay the $3/employee per month than worry about it, but for any reasonably sized enterprise this is going to force an off-budget-cycle decision that involves real dollars.
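To put rough numbers on that, here's a back-of-the-envelope sketch (mine, not Yammer's math; the headcount and attrition figures are invented for illustration):

```python
# How fast does the "remove the ex-employee" conversation start?
# Toy model: departures arrive at roughly headcount * annual_attrition / 12.

def departures_per_month(network_users, annual_attrition=0.10):
    """Expected users on the network who leave the company each month."""
    return network_users * annual_attrition / 12

users = 3000                                   # hypothetical viral network
rate = departures_per_month(users)
print(f"{rate:.0f} departures per month")               # ~25
print(f"first stale account in ~{30 / rate:.1f} days")  # days, not quarters
```

In other words, at any realistic size the stale-account problem shows up almost immediately, long before penetration approaches lock-in.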
The other problem with viral adoption as a strategy is this: I may love using Yammer, but I'm not Yammer's customer, our IT department is. And they already have SharePoint. What Yammer doesn't understand, and what Microsoft has known for years, is that IT makes these decisions, not the users.
While Yammer is going viral with users out at the edge, Microsoft perfected its S1P1 virus to attack the very core of the IT enterprise. So, when it comes to enterprise microblogging, Microsoft Office SharePoint Server (MOSS) and its various add-ons may be mediocrity in code form, but it's already there. And being there counts.
tags: enterprise, networking, social
Tue, Mar 2, 2010
Apps for Army Launches - The Hybrid Enterprise?
by Jim Stogdill | @jstogdill | comments: 2
This week the U.S. Army announced the launch of Apps for Army. It is modeled on the Apps for Democracy contests in the District of Columbia and is being run by the same indefatigable Peter Corbett and his iStrategyLabs. It looks to uncork the Army's cognitive surplus and let soldiers start solving their own problems in code without the personal risk of going off the reservation to do it.
An overview of the program can be found here, or in video form here. Army Deputy CIO Mike Krieger discusses the Army's goals for the program here, and Lt. General Sorenson, Army CIO, will discuss the program at 1:30pm EST on March 3rd during a round table. You can listen in here.
Now that the obligatory fact dump is out of the way, I want to delve in a bit more. While Apps for Army looks a lot like Apps for Democracy, the similarity is only half the story. The differences are interesting too, as is the path the Army took to get here.
What Apps for Army and Apps for Democracy share is an explicit attempt to harvest latent intellectual capacity toward emergent rather than planned outcomes. They are intentionally establishing the conditions for emergence by making data and platforms available and essentially saying "if you have a good idea, and you can execute on it with this stuff, please do." The big difference is who they are saying it to. D.C. said it to citizens effectively external to the "enterprise" while the Army is bringing intentional emergence inside.
The Army has always had innovation in the field; it just couldn't bring itself to completely sanction or acknowledge it. It's a simple reality: when the unmet needs of mission-oriented soldiers in the field build up enough, innovation starts shorting to ground with whatever tools are at hand. Terms like "county options" and "drive-by fieldings" express the enterprise's grudging ambivalence toward these difficult-to-integrate-and-support local developments. The Army and other services have long wrestled with the reality that these local options are critical to the mission, but that they are expensive and inefficient (pdf). They also just aren't readily provided by a Big Army centrally planned acquisition system, which is focused on shepherding the big stuff through the process and has little bandwidth left for big volumes of little things.
Apps for Army is a first step in a long-term effort to reconcile enterprise control with local initiative. Apps for Army might not claim such high goals for itself, but in effect it is the Army's attempt to have its cake and eat it too. Open platforms, source, and data should greatly increase generativity and the potential for innovation inside the enterprise. However, by supplying and certifying the platforms and the code that runs on them, the Army hopes it can do that without sacrificing enterprise controls or security the way it does with today's unmanaged local point solutions. This makes it look a bit more like Apple's app store processes embedded inside the enterprise than like the original Apps for Democracy. It also partially explains why it took six months to launch rather than the two months originally planned. Apps for Army simply couldn't happen before the platform provisioning and application certification stuff was worked out. The processes behind Apps for Army are necessarily more complicated than those involved in the outside-the-firewall Apps for Democracy.
Fri, Jan 29, 2010
The iPad is the iPrius: Your Computer Consumerized
by Jim Stogdill | @jstogdill | comments: 24
Eugene Shimalsky in his short piece "One Small iPad for Man, One Giant Leap for Apple" declares that the iPad is interesting primarily because it isn't a computer. As he puts it:
Yesterday, Apple got all of the geeks glued to their screens waiting for the "Jesus Tablet," iPad. An hour later, they were twittering that it did not come. Or maybe it just wasn't their Jesus?
It turns out it was his Mom's.
It's been a long time since most of us have used our computers to do anything approaching "computing," but the iPad explicitly leaves the baggage behind, leaps the conceptual gulf, and becomes something else entirely. Something consumery, media'ish, and not in the least bit intimidating.
The automobile went through a similar evolution. From eminently hackable to hood essentially sealed shut. When the automobile was new, you HAD to be a mechanic to own one. Later, being a mechanic gave you the option of tinkering and adapting it to your specific interests. In fact, that's how most people up until about 1985 learned to be mechanics. The big changes came with the catalytic converter and electronic ignition (and warranty language to match). Now the automobile has reached the point in its development where you don't even have to know whether it has a motor or an engine to use it, but to tinker at all requires highly specialized skills.
So, in some ways this evolution of the computer to the iPrius seems completely natural. I don't care all that much if the iPad is hermetically sealed, but I wonder uncomfortably if in a few years the MacBook and the PC will be too. Or, more likely, we'll just wake up one day to a world without MacBooks or PCs, as we continue our shift en masse to the mobile device ecosystem and the laptop as we know it goes the way of the desktop, banished to special-purpose niches.
In mobile land, closed carrier heritage combined with Apple's product vectors may leave us with only closed options. A confluence of interests - commercial (get your pure non-pirated content only from me!), governmental (cyber defense!), and user (I want to be safe!) - will find that outcome attractive. Our generative and hacker-friendly world will be replaced by a sterile world of sealed aluminum.
No doubt the iPad will be hacked by someone to prove it is still possible. They'll run Linux on it within a week of launch, but that's not where they will have learned those skills. They learned them on the highly generative PC they probably bought for something else. Slight differences in approachability and "ease of mastery" (as Zittrain puts it) make a big difference. The curves are steep. And tomorrow the people who buy the iPad's descendants will be less likely to develop those skills. Who's going to buy a developer's license just to screw around?
For your phone, Apple could make a strong argument that this kind of control was necessary: they needed to make sure it was reliable first and foremost as a phone (that it wasn't secretly a snooping device and wouldn't crash every time you really needed to make a call). The argument is being extended to the iPad more because of Apple's culture than real need, and if I were Steve Jobs looking at iTunes receipts I would do the same thing. But... directionally this is a vector toward CompuServe, not away from it. The iPad is Steve's Minitel terminal.
Just for the heck of it, imagine for a minute that the MacBook Pro was locked up like the iPad. The apps that run on the iPhone have been mostly trivial; one person for a few weeks is probably the average effort. Eugene Lin may be willing to build apps on spec and hope for the best after they are submitted, but will Adobe? Imagine Adobe investing $X million building Lightroom for a year only to have it rejected because Apple launches Aperture the same week.
tags: iPad, Apple, hacking, generativity
Mon, Jan 4, 2010
Skinner Box? There's an App for That
by Jim Stogdill | @jstogdill | comments: 31
If you are reading this post it means that after countless misfires, I finally kept my attention focused long enough to finish it. That may seem like no big deal, a mere trifling effort, but I'm basking in the moment. In fact, I'll probably tweet it.
It didn't start out to be about digital Skinner boxes. It was a Radar backchannel email about the infamous Web 2.0 Expo Twitterfall incident. I got all curmudgeonly and ranted about continuous partial attention, Twitter as a snark amplifier, and the "Ignite'ification" of conferences (with apologies to Brady). In short, I demonstrated myself unfit to contribute to a blog called Radar.
I swear I'm not a Luddite. I'm not moving to Florida to bitch about the government full time and I'm not in some remote shack banging this out on an ancient Underwood. However, I guess I count myself among the skeptics when it comes to the unmitigated goodness of progress. Or at least its distant cousin, trendiness.
Anyway, I sent the email, inexplicably Jesse said "post!", and I tried reworking it. I still am. This piece has been grinding away like sand in my cerebral gears since, and along the way it has become about something else.
In The Anthologist, Nicholson Baker describes writing poetry as the process of starting with a story and building a poem around it. I try to do that with photography and build pictures around narrative and metaphor. After the work takes shape the story is carved back out and what remains hints at the story's existence, like a smoke ring without the mouth.
He says it better: "If you listen to them, the stories and fragments of your stories you hear can sometimes slide right into your poem and twirl around in it. Then later you cut out the story and the poem has a mysterious feeling of charged emptiness, like the dog after the operation." Don't worry about the dog, it lived and it isn't relevant. My point is that this post isn't about the Twitterfall fail story, that was just a catalyst. The inchoate uneasiness still twirling around in here is what's left of it.
This all began with these lingering questions: "Why are we conference attendees paying good money, traveling long distances, and sitting for hours in chairs narrower than our shoulders only to stare at our laptops? Why do we go to all that trouble and then spend the time Twittering and wall posting on the overwhelmed conference wifi? Or, more specifically, why are we so fascinated with our own 140 character banalities pouring down the stage curtain that we ignore, or worse, mob up on, the speakers that drew us there in the first place?"
As I kept working away on what has become this overlong post, the question eventually turned into, "why the hell can't I finish this?" This has become the post about distraction that I've been too distracted to complete. It's also about ADHD and the digital Skinner box that makes it worse, narcissism's mirror, network collectivism and the opt-in borg, and an entropic counter-argument for plugging in anyway. So, here goes...
tags: attention, distraction, entropy, evolution, singularity, twitter
Tue, Oct 27, 2009
Defense Department Releases Open Source Memo
by Jim Stogdill | @jstogdill | comments: 11
I've been holding my breath for so long waiting for this memo that I may not remember how to start breathing again, but here it is. The Department of Defense Deputy CIO Dave Wennergren has signed and released "Clarifying Guidance on Open Source Software."
Written primarily by my friend Dan Risacher at the Office of the Secretary of Defense, the memo is intended to clear up common misconceptions and make it easier for DoD program managers to include OSS in their programs. Its goals are to improve agility, eliminate lock-in, and reduce cost.
One of the memo's key points comes from Dave Wheeler at IDA: OSS is considered "commercial off-the-shelf" software as far as DoD acquisition rules are concerned, and therefore OSS must by law be considered on an equal footing whenever a program is doing market research prior to technology selection.
Some will argue that it doesn't go far enough by only encouraging, not demanding, the use of OSS on government programs (I certainly have some sympathy for that point of view), but my hope is that this will at least provide some counter to the FUD machine - you know who you are - and keep moving OSS in defense toward a tipping point of acceptance.
By the way, if you are interested in open source in government and are in or near DC, make sure you check out GOSCON next Thursday, Nov 5. Dave Wennergren will be giving the breakfast keynote and you can ask him all about this memo.
tags: defense, opensource
Wed, Aug 5, 2009
Three Quick Open Source in Defense Links (and then one other)
by Jim Stogdill | @jstogdill | comments: 0
Next week I'll be participating in the inaugural Military Open Source Software Working Group Conference in Atlanta, Georgia. Open source conferences that focus on the defense market are often salesy, have a dearth of actual developers, and tend toward sartorial blandness - a sea of dark blue suits worn by open source vendor salespeople trying to convince hesitant buyers that their wares are just like the other guys'. Look, we even license it by the seat!
This grassroots event, which will be held at the Georgia Tech Research Institute Conference Center, is designed to answer the question raised by those other conferences: "where the geeks at?" It will even have a dress code to match: no suits allowed. There is still space available, so if you are having the kind of ridiculously cool summer that makes August in Atlanta sound appealing, pack your shorts and sandals and head down.
If you aren't familiar with the defense software space, it buys and builds an immense amount of software. Quite a lot of it is actually pretty cool, too, because it is designed to solve interesting problems. We're still waiting for the defense market to have its IBM/Apache moment, but when this market inevitably tips hard into open source, the impact is going to be tremendous. Open source methods and licensing will be a conduit for technology transfer from the DoD into commercial use on a vast scale. However, what I think is really cool is the opportunity it will offer for important participation in the other direction.
A couple of projects at the vanguard of this trend that just opened up are FalconView and Open CPI.
FalconView started life as a moving map for USAF mission planning and was already a great example of user innovation in the military. Recently the team at Georgia Tech took the next logical step and open sourced the bulk of the project.
My colleague John Scott (@johnmscott) and his team at Mercury Computer Systems just opened up the distinctly different Open CPI project. Sort of a middleware for FPGAs, it grew out of the signals-processing field and, if it picks up community support, should make it simpler to develop and build hybridized hardware platforms for special-purpose applications. I've written before at Radar about the trend away from pure commodity hardware in areas where performance and energy consumption are a priority. I think projects like Open CPI will contribute to this trend by making the development of specialized platforms more approachable.
This last link isn't related to open source software except for the fact that Gunnar Hellekson (@ghelleks) of Red Hat pointed me to it. We were chatting over lunch about the epidemiology of virus and vulnerability propagation and the fact that the removal term is too low to keep populations small. All too often, once a system on the network (whether in the enterprise or at home) is infected, it stays infected until it is removed from the network (and hopefully responsibly recycled) sometime after it has been fully depreciated.
Furthermore, in a large enterprise with as many as millions of machines to deal with, it is simply impossible to manage the process of consistently hardening machines to prevent infection in the first place. If Population = (rate of infection - rate of removal) * t, you can see that these two issues conspire to help the bot herders and other nefarious characters keep populations large.
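Here's that relationship as a trivial sketch (mine, with hypothetical rates, just to show how a weak removal term keeps the population growing):

```python
# Infected population under the linear model above:
# Population(t) = (rate of infection - rate of removal) * t

def infected_population(infection_rate, removal_rate, months):
    """Machines still infected after `months`; never negative."""
    return max(0.0, (infection_rate - removal_rate) * months)

# Hypothetical enterprise: 50 machines infected per month, only 5 retired.
for t in (6, 12, 24):
    print(t, "months:", infected_population(50, 5, t), "infected machines")
```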
To deal with the second problem (and perhaps someday enable a solution to the first), NIST has been developing the Security Content Automation Protocol (SCAP). Basically, it is an extensible XML schema for defining the hundreds of security configuration parameters, and their values, that need to be managed. Once those are defined and rolled into profiles, agents running on various platforms can implement the profiles automagically. In DoD parlance, this means that Security Technical Implementation Guides (STIGs) can be implemented broadly, efficiently, and, perhaps most importantly, in an ongoing manner.
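Conceptually (and only conceptually - real SCAP content is expressed in XML specifications such as XCCDF and OVAL, not Python), the profile-plus-agent idea boils down to something like this sketch, with parameter names invented for illustration:

```python
# Toy version of the SCAP idea: a hardening profile is a set of required
# parameter values, and an agent audits a machine's settings against it.

PROFILE = {                       # hypothetical profile, not a real STIG
    "password.min_length": 14,
    "firewall.enabled": True,
    "telnet.enabled": False,
}

def audit(actual):
    """Return {parameter: (required, actual)} for every deviation."""
    return {k: (want, actual.get(k))
            for k, want in PROFILE.items()
            if actual.get(k) != want}

print(audit({"password.min_length": 8,
             "firewall.enabled": True,
             "telnet.enabled": True}))
# -> {'password.min_length': (14, 8), 'telnet.enabled': (False, True)}
```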
tags: defense, map, open source
Wed, Jul 1, 2009
The Hacker Ethic - Harming Developers?
by Jim Stogdill | @jstogdill | comments: 26
On Monday Neil McAllister posed the question "is the hacker ethic harming American developers?" Slashdot picked it up and Tim forwarded it to the Radar list. As you might expect, it resulted in some spirited discussion.
James Turner kicked things off with this response (it has been slightly edited from its email form). After James lays out his argument I'll reply with my thoughts. Then we hope to hear from you. Let us know what you think.
I've worked in a lot of organizations that thought the kind of rigid, deforesting paradigms Nayar is referring to were the magic bullet for keeping all three of the variables (dollars, time, features) under control. Without exception, all they did was get in the way and reduce the productivity of the most senior people to the level of the most junior. All of them exhibited some degree of failure, some catastrophic.
The India shops *love* methodologies like UML and the like, specifically because the problems have been reduced to a simplistic enough granularity that they can be doled out to junior-level staff, who may have only been onboard a few weeks because of the massive churn over there.
At least 3 times at 3 different companies, I've seen major pieces of work brought back in-house because the Bangalore team had fallen so far behind or proved so unable to get beyond the literal description of work that they were endangering the project. When you combine the time difference with a tendency to halt dead in their tracks as soon as they hit a stumbling block, it can be a recipe for disaster.
There are certainly some good Indian shops, and I know some outstanding Indian developers (most of whom have come to the States). But I find Nayar's comments hilarious. It's akin to someone saying that American football players aren't employable in Jamaica because they aren't able to limbo well. Look at the most successful Web 2.0 companies today: most of them started as garage enterprises with a few talented developers, not a 60-person team of UML jockeys following some Arthur Andersen project management program. Heck, look at Google Labs.
In huge projects, you obviously need some master planning and coordination to make sure the tracks meet at the right place to drive in the spike, but I don't see any effort being made these days to right-size the amount of project overhead to the needs of the projects. Instead we get a one-size-fits-all approach that smothers anything but the largest project in paperwork. Even some of the original authors of the Agile manifesto, when I've talked to them, point out that part of being Agile is picking and choosing the right components of project management that make sense for a given task.
Nayar's remarks are incredibly self-serving. "We're the best, because we can mindlessly follow some arbitrary and flawed development process." Or is he claiming that Indian projects do better QA? Not in my experience...
This entire debacle is representative of a problem I think is endemic in the industry these days: the inability or unwillingness to engage in rapid prototyping. On every successful project I've ever worked on (and I've worked on some fairly large enterprise-sized projects), we started by designing and coding a quick "throw-away" skeleton of the application that let us look at how the thing worked, where the unseen warts were, where the vendors had lied about their APIs, etc. This is the crucial and neglected stage in project design, one that most modern design paradigms ignore or actively discourage. Even Agile tries to jump in and start coding the finished application from the get-go, although if project teams were willing to aggressively refactor (a tenet of Agile), early project work could be a rapid prototype (although the story model of Scrum really doesn't fit well with this, unless you make the prototype a story...)
This is also something I've never seen an offshored team do particularly well...
Well, I'll be damned if I'm going to jump in and take the side that says hacking is bad for American programmers. First off, I don't need that kind of flame bait and second, I don't believe it. I think approachable programming is hugely important because that's how many people get into the field in the first place. However, my reaction to the article was very different than James' and I might as well try to explain.
I'm not going to take an opposing position, but it's not really an orthogonal position either. Maybe it has a power factor of about .7 or so. Here's my (also edited) response...
When I read McAllister's piece, at least some of it resonated with me. Before we were bought by a large firm, we were a small company that grew from nothing to 250 people, about 200 of them programmers. So, a whole lot of my time over three years was spent hiring programmers and building cohesive teams that could deliver to our customers.
In our hiring we aggressively hired hackers into the mix. We wanted outside-our-industry thinking and we thought they brought in creativity. We called it "hiring weird, but not weird weird." Occasionally we pushed it to weird and a half. For our efforts we got creative problem solving and interesting (but frequently weird) dinnertime conversation when we traveled.
However, our Pollyanna idea of "disciplined teams catalyzed with a bit of weird" didn't always work out.
That leads to the bit that resonated with me: the sense that hacker = distilled essence of American individualism combined with a lot of ISTJ on the Myers-Briggs Type Indicator. Individualism is a trait that I hold dearly, but it can make a cohesive team effort difficult if people are unwilling to suborn themselves to the goals of the team. Remember those tee shirts the football team always wore at your university? "There is no I in team"? I sometimes joked that I was going to make a batch that said "I'm the Me in team."
Maybe we were just growing fast and it was going to take more storming and norming than I had patience for, but at times it was a struggle to get everyone to see past their individual biases and focus on what we were trying to achieve, and we couldn't do what we were trying to do with teams of one.
But it really wasn't a hacker problem, if hacker means self-taught as McAllister implies. We had a lot of people with CS degrees and we used to talk a lot about whether and how their degrees had prepared them for their jobs.
Separate from the individualist approach to development, few of our recent graduates came to us prepared with the terminology and practices of any development methodology (or engineering practices like continuous integration, etc.). They knew how to code, but not how teams coded.
At one point I gave a talk on agile software development to about 100 CS students at a university in Philadelphia and I asked them to raise their hands if they had ever done a team project with greater than two people on the team. I don't recall anyone raising a hand. Then I asked if they had ever covered development methodologies in their classes and a few acknowledged they had, but it had been abstract classroom stuff only. That part surprised me.
I'm not sure that the "sanctity of engineering" argument really makes that much difference. I have little faith in McAllister's scheme to do computer engineering instead of computer science coursework.
My undergraduate degree was in Mechanical Engineering and I can only imagine how useless I would have been to a firm that actually did engineering, and for mostly the same reasons. I knew how to take integrals and I still know the packing ratio of a hexagonal close-packed material, but I didn't know squat about how a complex machine actually got designed in a team setting. It's interesting to note that the Engineer-in-Training exam I took (a precursor to the professional engineer's exam) didn't probe my knowledge of team practices at all.
Maybe there just isn't time in an undergraduate degree to teach everything that an engineer needs to know. Plus, can you imagine the dropout rate in CS/CE if ITIL were a required course?
Since James mentioned Google I'll switch gears and muse about ecosystems for a moment... I guess I tend to bristle when I hear that everyone should just develop software the way Google does.
Google is to computing what LA was to aerospace and electronics in the early '60s. Its gravitational force attracts five-sigma talent (probably a bunch of six too) in ways that the rest of us can only envy. More generally, Silicon Valley has had programmer talent flowing into it for the last twenty years the way Hollywood sucks pretty people out of the Midwest.
Maybe it's not as obvious because you can't spot brains the way you can spot an oddly beautiful wait staff, but the valley has been the vortex of a talent-laden embarrassment of riches for a long time and, if you work there, you might not even notice it. However, I think that at some level this affects what kinds of processes work when you build software. I think it's at least a little part of the reason why an ERP system in a manufacturing town gets implemented differently than MapReduce (there are other reasons too, having to do with software-as-product vs. software-as-supporting-infrastructure). Combine that with the very clear shared vision of "let's do something great and get rich together" that valley firms often have, and, well, it's easy to see how smart people coalesce to build amazing stuff.
It's easy to denigrate Arthur Andersen's progeny or the offshore firms they compete with, but they do different work, with a different talent pool, for different ends, and with a very different set of personal and organizational incentives. Or, put another way, Kelly Johnson didn't build the SR-71 with General Motors' engineers, and General Motors didn't design the Chevy Cavalier with the Skunk Works' processes. However, even at the Skunk Works, Johnson's brilliant engineers did conform to a process and work together as a team toward a shared vision. And, conversely, I bet a lot of talent is left on the table at General Motors because of processes too restrictive in their attempt to remove all uncertainty.
So... maybe it's possible that Google's (or the valley's in general) processes are appropriate to an ecosystem that, because of the intellectual environment and potential for riches, is rich in IQ and initiative. So it ends up feeling more "special forces" and less "infantry regiment." And over there closer to the hump of the normal distribution curve, or in a different cultural environment outside of the valley, might a different flavor of processes be effective?
The counter-argument to that, which I'll go ahead and provide, is that I once helped teach a team of engineers at a midwestern defense contractor how to do agile development. The effect was amazing and immediate, and their productivity and satisfaction went up tremendously - until their management freaked out and shut it down when they "perceived" that it created too much uncertainty in their processes.
Well, it's obvious that I don't know the answers here, so, with that, I'll stop thinking out loud.
What do you think?
tags: hacking
Fri, Jun 19, 2009
The Web^2: it's Exponential, but is it Contracting or Expanding?
by Jim Stogdill | @jstogdill | comments: 5
The theme for the Web 2.0 Summit this year is Web Squared. It is rooted in the idea that as the web morphs from a hub-and-spoke distribution model into a network of connected people and things, innovation and opportunity on it are growing exponentially. There has been a little bit of discussion on the Radar back channel about exactly what this means, or should mean, and Nat started things off with a thoughtful response that probably should be blogged as well. In particular he introduced feedback loops into the discussion, and with Nat's prodding, I decided to share my response to his email here. I've edited it to make it a *bit* more cohesive, and while it isn't as structured as I would like, these are my thoughts on the exponential future of the web and a little bit about how that future might also impinge on the future of government...
I agree with Nat that feedback loops are a great mental filter through which to view the world. I read a little bit of Wiener and now I see feedback loops everywhere. Furthermore, what I like about them as a mental model is that they help me understand the web at the ecosystem level rather than at the level of a specific technology. Wiener defined a cybernetic system the way engineers define a thermodynamic system. In thermodynamics, a system is closed if no energy crosses its boundary. A cybernetic system is closed when no messages or information cross. Since messages are the lifeblood of feedback, these boundaries are important. As an example, the open government stuff is so exciting to me because once computing systems connect between the web and government, the boundaries of previously isolated cybernetic systems (e.g., a people and its government) begin to be permeable. And once they are permeable to computing messages they will also be permeable to cultural signals that can create cultural feedback loops. That will cause state to change on both sides of the boundary. Two small isolated cybernetic social systems become one larger integrated one with new feedback loops in place.
Regarding the exponential theme, I'm not sure that innovation is progressing as an exponential over time - although, in fairness, I'm still working on my unabashed-optimism credentials. But... in the 1920s, automobile companies were springing up like crazy in America. It was the era before production methods became the dominant competitive weapon, and anyone with a good idea for a better combustion chamber design or a valve train or a styling cue could still try their hand at building a car company. With access to tools, labor, and know-how, Detroit in the '20s was a very generative environment for automobile innovation. But by 1980 even DeLorean with a trunk full of coke couldn't afford the startup costs - a combination of more sophisticated design requirements and changes in production-scale economics made it impossible.
Are Data Centers the Economic Equivalent of Manufacturing Plants?
The interesting parallel with the web (or computing and software more generally) is the rise of the data center as a key piece of competitive know-how and, perhaps more importantly, capital cost. The question in my mind is whether utility computing enhances generativity or, by making it contingent on powerful interests, effectively stifles generativity in the long term despite the generative potential of the technology (I'm shamelessly borrowing the idea of contingent generativity from Jonathan Zittrain). And a related question: does the introduction of capital cost as a major factor in the ecosystem eventually make the web feel more like Detroit in 1980? Will it fundamentally change the web by tying it firmly to those who can access sufficient capital? (Google spent over $800m on data center capital improvements last year. That's a number that even makes the Defense Department wistfully declare "we just can't afford to do what Google does.")
Or, to look at it another way: the electric utilities made innovation with electrical devices more possible, but it doesn't necessarily follow that utility computing will always do the same. After all, electrical utilities ship their power to us, where we use it in situ for whatever purpose we want, but utility computing requires us to send our "loads" to them, where it is much easier to implement perfect mechanisms of surveillance and enforcement. Homeowners associations used association charters to turn neighborhoods into little fascist fiefs, and data centers have the potential to do the same with EULAs.
Scale and Concentration (or, is the Universe Expanding or Contracting?)
As scale on the web increases there are competing concentrating and generative factors at work (any of which might be exponential). The concentrating factors (the need for capital, sophisticated expertise, ...) tend, like gravity, to collapse the system down on itself in a variety of ways. I don't mean that it becomes less relevant or makes less money; I mean that it ends up feeling more like AT&T in the '60s, with centralized control and vested interests and strict contingencies on generativity. Just like Apple's oversight of the app store. On the other hand, the factors that tend toward expansion are feedback loops that span organizational boundaries, ready access to seed funding, standards for cloud computing that encourage true commodity availability of non-contingent generative environments, etc.
Figuring out which force will dominate is like trying to figure out whether the universe will expand forever or eventually contract. The balance between the factors is quite subtle, depends on minute variations in initial conditions, and is very difficult to predict. But, we can still ask ourselves, "how can we influence the broader cybernetic ecosystem of the web to encourage policy, practices, cultural values, etc. that will promote generative expansion rather than scale-driven contraction?"
Exponential Effects and Social Structures
Shifting gears for just a moment: complexity science is the other idea I tend to come back to as a frame for viewing the web. While not directly related to the exponential theme, it is at least peripheral. The web is fascinating in the way it has become the cybernetic substrate on which both technical and social patterns are emerging. Stripes form on a zebra because "black" and "white" chemical messengers from adjoining cells interact with each other differently over distance. Out of that simple mechanism complex patterns emerge. The web is transport for human messages that don't decay with geospatial distance. This geo- and time-independent messaging is enabling human "striping" that is no longer geo-ethnically dependent.
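If you want to watch that mechanism work, here's a toy one-dimensional version of it (my sketch, not anything from the Summit material): each cell is pulled toward the color of its near neighbors and pushed away from the color of its farther ones, and bands typically emerge from random noise:

```python
# Local activation, long-range inhibition on a ring of cells.
import random

N = 80
cells = [random.choice([-1, 1]) for _ in range(N)]   # random start

def step(state):
    new = []
    for i in range(N):
        near = sum(state[(i + d) % N] for d in range(-2, 3))          # activation
        far = sum(state[(i + d) % N]
                  for d in list(range(-6, -2)) + list(range(3, 7)))   # inhibition
        new.append(1 if near - 0.4 * far > 0 else -1)
    return new

for _ in range(30):
    cells = step(cells)

print("".join("#" if c > 0 else "." for c in cells))  # e.g. ####....####....
```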
Within a geography, the existing striping can become more severe as the web enables self-selected and self-reinforcing pockets of auto-propaganda that combine with social graph clusters; clusters that only infrequently span value systems. The situation is reminiscent of 1930s-era Spanish political parties and their newspapers, but operating at photomultiplier-tube speed. We consume the stuff that reinforces our world view and segregate ourselves into more and more strident neighborhoods of belief. We remain physically in our geo-defined country, but in our chosen echo chamber we each live a very different intellectual and emotional experience in a whirlpool of exponentially hardening world view. Perhaps someday we'll live in "nation states" that are stripes of psychographic and value alignment instead of stripes in geography.
Of course, it's true that as long as we are physical beings we will continue to stripe locally in our physical world. The cybernetic overlay in human relationships provided by the web doesn't replace that reality, but by augmenting it and letting us stripe along lines of affinity and value system without regard to geography, it contributes to fissures in our geo loyalties. These fissures are important because states exist to govern the physical world (trade, law, taxes, defense...) but depend on shared values and culture to function effectively. Just look at Iran today to see the effect of incongruent value systems on co-located peoples.
tags: web 2.0