Educational technology needs to grow like a weed
Want to scale education reform? Plant a tech seed and help it flourish.
by Marie Bjerede | comments: 6
Why do so many well-conceived education reform designs fail in implementation? For the same reason that old-school top-down software development fails in today's rapidly evolving Internet-based marketplaces.
In both cases there is an implicit false assumption that the designers can accurately predict what users will need in perpetuity and develop a static one-size-fits-all product. In response to that fallacy, both software development and education reform have developed agile models of adapting to unpredictable environments. Independently, these have failed to scale to their potential in the real-world trenches of the U.S. educational system. Interdependently, could they achieve the results that have so far eluded each?
Traditional education reform, like traditional engineering development, invests heavily in up-front design. In engineering, this makes sense when dealing with deliverables that are hard to change, like silicon, or when mistakes are not an option, as with space flight or medical technology. However, when the deliverable is malleable, as with consumer software, once the market starts to change the implementer is trapped between piling modification upon modification until the initial design is completely obscured, and plowing ahead unswervingly only to deliver a product that is obsolete on delivery. Either way, the software developer is destined to be outperformed by more nimble developers who can adapt effectively to changing market needs, new information, and an evolving industry.
Similarly, education reform interventions are rigidly constrained. To prove a treatment's effectiveness, research needs to demonstrate that one particular variable in a messy, dynamic human environment is responsible for a change in student outcomes. This means that an educator and his or her students must behave precisely as designed in order for the research to be valid. Tremendous resources are spent in these kinds of trials to ensure "fidelity of implementation." In this situation, the educator is trapped between corrupting trial data by changing the implementation to meet the changing needs of students and the environment, and plowing ahead only to limit the good he or she can do for students to the lowest common measurable denominator.
tags: edu 2.0, edu2tech, emergence, emerging tech
Four short links: 18 May 2010
Multitouch Medical Errors, Scaling, Javascript Charts, and Fighting Credit Crunches with Open Data
by Nat Torkington | @gnat | comments: 0
- Tondo Interactive Table to Analyze Medical Errors (MedGadget) -- use of a multitouch table to help clinical staff identify and track medical errors. (via IVLINE on Twitter)
- Steve Huffman Lessons Learned While at Reddit (SlideShare) -- uptime and scale. It's interesting that most everyone reinvents tuples as a way to scale databases, hence the popularity of NoSQL systems.
- HumbleFinance -- JavaScript library to render dynamic charts as per Google Finance. (via carlos_d_hoy on Delicious)
- Hernando de Soto: Shadow Economies -- de Soto is an economist, and this ends up talking about the need for transparency and open data. As long as you don’t know who owns the greatest amount of your assets, there’s no info as to who owns what, who is related to what, you have a shadow economy. We live in one, and it has as a characteristic a permanent credit crunch. We know more about it than you do. Credit crunch is where you don’t know who you’d be lending to, so you don’t lend. It’s permanent, we live with it, and now you’re going to have to learn to live with it too, because until you know who is solvent how can you give anybody credit? You’re flying blind. (via Jon Udell)
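The "reinventing tuples" pattern mentioned in the Reddit link — storing objects as loose (entity, key, value) rows rather than one rigid table per object type — can be sketched in a few lines. This is an illustrative entity-attribute-value store with invented names, not Reddit's actual schema:

```python
# A minimal entity-attribute-value (EAV) store: every object is a bag of
# (thing_id, key, value) tuples, so adding a new attribute never requires
# a schema migration -- the property that makes this pattern scale-friendly.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE data (thing_id INTEGER, key TEXT, value TEXT)")

def set_attr(thing_id, key, value):
    conn.execute("INSERT INTO data VALUES (?, ?, ?)",
                 (thing_id, key, str(value)))

def get_thing(thing_id):
    rows = conn.execute("SELECT key, value FROM data WHERE thing_id = ?",
                        (thing_id,))
    return dict(rows.fetchall())

set_attr(1, "title", "Lessons learned at Reddit")
set_attr(1, "ups", 42)   # a brand-new attribute, no ALTER TABLE needed
print(get_thing(1))
```

The trade-off is that the database no longer enforces per-type structure, which is exactly the territory NoSQL systems formalized.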
tags: credit crunch, healthcare, javascript, multitouch, open data, opensource, programming, scale, ui
Mobile operating systems and browsers are headed in opposite directions
As the mobile OS market fragments, mobile browsers are consolidating
by Jason Grigsby | @grigs | comments: 2
During a panel at Web 2.0 Expo, someone asked if the panelists saw any signs that suggest mobile operating system fragmentation might decrease.
One of the panelists had a blunt answer: "No. There will be more fragmentation."
It is striking to see the different trajectories mobile operating systems are on when compared to the mobile web.
In 2006, two smartphone operating systems accounted for 81 percent of the market. There were really only four platforms to worry about: Symbian, Windows Mobile, RIM, and Palm OS. These represented 93 percent of the market.
| OS | 2006 | 2007 | 2008 | 2009 | 2010 |
|---|---|---|---|---|---|
| Symbian | 67 | 63.5 | 52.4 | 46.9 | ? |
| RIM | 7 | 9.6 | 16.6 | 19.9 | ? |
| Windows Mobile | 14 | 12.0 | 11.8 | 8.7 | ? |
| iPhone | 0 | 2.7 | 8.2 | 14.4 | ? |
| Linux | 6 | 9.6 | 7.6 | 4.7 | ? |
| Palm OS | 5 | 1.4 | 1.8 | | ? |
| Android | | | 0.5 | 3.9 | ? |
| WebOS | | | | 0.7 | ? |
| Windows Phone 7 | | | | | ? |
| Bada OS | | | | | ? |
| MeeGo | | | | | ? |
| Other OSs | 1 | 1.1 | 2.9 | 0.6 | ? |

Sources: Canalys (2006); Gartner (2007, 2008, 2009).
Fast-forward to the present and the picture is different. No single operating system has more than 50 percent market share. There are seven operating systems being tracked, and even within operating systems there are fragmentation concerns.
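The fragmentation trend in the table can be quantified with a standard concentration measure such as the Herfindahl-Hirschman index (the sum of squared market shares). This calculation is an illustration on the table's own figures, not part of the original analysis:

```python
# Herfindahl-Hirschman index over percentage shares:
# 10,000 = monopoly; lower values = a more fragmented market.
def hhi(shares):
    return sum(s * s for s in shares)

# Shares from the table above (Symbian, RIM, Windows Mobile, iPhone,
# Linux, Palm OS, Other); absent platforms contribute zero.
shares_2006 = [67, 7, 14, 0, 6, 5, 1]
shares_2009 = [46.9, 19.9, 8.7, 14.4, 4.7, 1.8, 3.9, 0.7, 0.6]

print(hhi(shares_2006))              # 4796
print(round(hhi(shares_2009), 2))    # 2920.06
```

The index falls from about 4,800 to about 2,900 over three years: the same "no, there will be more fragmentation" trajectory the panelist described, in a single number.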
tags: internet operating system, mobile systems, webkit
Four short links: 17 May 2010
MapReduce and Hadoop Papers, Privacy Problems, School Data, and Crowdsourcing Info
by Nat Torkington | @gnat | comments: 0
- MapReduce and Hadoop Algorithms in Academic Papers -- a collection of such papers, interesting for those who wrangle big data. (via tlockney on delicious)
- Facebook and Radical Transparency: A Rant (danah boyd) -- well-argued and well-written piece about what is becoming the tech issue of the year. The key to addressing this problem is not to say “public or private?” but to ask how we can make certain people are 1) informed; 2) have the right to choose; and 3) are consenting without being deceived. I’d be a whole lot less pissed off if people had to opt-in in December. Or if they could’ve retained the right to keep their friends lists, affiliations, interests, likes, and other content as private as they had when they first opted into Facebook. Slowly disintegrating the social context without choice isn’t consent; it’s trickery.
- Schooloscope -- interesting new Berg project to help parents make sense of the long and complex reports on British schools produced by the relevant government department. Notable for what it doesn't do (leaderboards), and what it does (the face visualisations). See Matt Webb's description.
- Expert Labs Grand Challenges First Results -- they gathered the results of the Office of Science and Technology Policy's call for "Grand Challenges in science and technology that could yield significant breakthroughs in the future". Interesting for anyone planning crowdsourcing efforts, because there's a detailed and thoughtful summation of lessons learned. And even those in the science and technology communities who might have ready responses would have to acclimate to the huge new idea of being asked for their feedback, as well as the big new idea that they could give feedback using common social networking tools. If there is an area for improvement in our efforts, this is clearly an important one to focus on. Even relatively minor variables like the time of day when a social networking prompt is sent can have significant impact on results, both in terms of the quality of responses, as well as the speed with which the responses are submitted. More significantly, the terse wording and distracted attention environment of social networks can amplify ambiguities in a prompt.
tags: big data, crowdsourcing, danah boyd, facebook, multicore, privacy, schools, visualization
Disintermediation: The disruption to come for Education 2.0
by Rob Tucker | @Rob_Tucker | comments: 33
On the largest of scales, we rarely have the luxury of designing technological systems. Instead, technologies happen to us - our experience of them being ragged, volatile, turbulent and rife with unexpected interactions. Tim’s posts about the emerging internet operating system (here and here) describe a great example of this - the winner of that particular fight being very much TBD and the factors determining victory or defeat being themselves the subject of lively debate. When we talk about Education 2.0, though, we are prone to think that we can design it - that we can consciously and deliberately lay the groundwork for its effective implementation. Our deliberation, though, may be less powerful than the larger forces driving its rapid evolution. One such force will certainly be disintermediation.
Disintermediation is a process in which a middle player poised between service or product providers and their consumers is weakened or removed from the value chain. Disintermediation is driven by the fact that middle players consume resources and in removing them from the chain, these resources are recovered to enable either lower cost for the consumer, better value from the provider, or both. Disintermediation can be total, in which case a middle player is removed entirely. It can also be partial, in which case an intermediary is carved up and the different ways in which they formerly added value are segmented, replaced, or done away with as circumstances permit. Understanding the process of disintermediation is critical to understanding the ways in which Education 2.0 will evolve.
tags: change, disintermediation, edu 2.0
Gov 2.0 Week in Review
Open government, open data, moving .gov into the cloud
by Alex Howard | @digiphile | comments: 2
So what is Gov 2.0? This past week saw wide-ranging discussion about the meaning, substance and relevance of the term, along with plenty of other news related to social media in government, open data, improved crisis response through technology and a move to the cloud.
Tim O'Reilly went live online to talk about his paradigm for government as a platform, including how technologists can play key roles in this important transformation. You can also read Tim's thoughts on open government at O'Reilly Labs. The recording is embedded below, after the jump. J.D. Lasica wrote a terrific review of both "Open Government" and the webcast.
Mark Drapeau took a look back at the three phases of government 2.0 as the first Gov 2.0 Expo draws near. Federal News Radio anchor Chris Dorobek also stepped back and published an important "mile high" perspective in his Gov 2.0 status report. Time to get ready for the Gov 2.0 Expo!
tags: gov 2.0, gov 20, week in review
Using technology to support global education
by Lucy Gray | comments: 0
In the summer of 2006, I was very fortunate to travel to Europe with colleagues in the Apple Distinguished Educator program, and it proved to be an inspiring, life-changing event for me. Tasked with writing a global awareness curriculum infused with digital content, we spent 10 days in Berlin and Prague, constantly photographing, filming, and discussing our experiences. As a group, we had been moved to action after reading such books as The World Is Flat and A Whole New Mind, and I think our travel experiences reinforced our beliefs that kids (and adults) need to connect to other cultures in order to fully understand and participate in the world. We clearly understood that there has never been an easier time to make these connections via technology, and that these technological possibilities will only improve with time.
Around the same time, edublogger Steve Hargadon started Classroom 2.0, an online community built on the Ning platform. Classroom 2.0 is designed to help educators investigate new and emerging technologies, and it's proven to be a great community with a membership of over 40,000 educators. Inspired by Classroom 2.0 and pleased with Ning's ease of use, I also started a community focused on bringing educators and other interested parties together around the topic of global education. Primarily, this has been a networking space where teachers and students can find partners for global collaborative projects.
Steve has since become a consultant at Elluminate, an e-learning company, and he uses their web conferencing platform to host a myriad of free events including Classroom 2.0 Live and the Future of Education. He asked me a few months ago what we could do to significantly impact education using Elluminate, and I suggested an online virtual conference, similar to what other educational technology colleagues have created in the K12 Online Conference.
Either we are completely out of our minds or very brave (take your pick), but we are now attempting to make this a reality. After a couple of months of brainstorming and discussion with groups such as the Asia Society, IEARN, and ePals, we announced this week a preliminary call for participation in the proposed 2010 Global Education Conference. This event is a collaborative effort to significantly increase opportunities for global collaboration in education.
Our idea is to host free conference sessions related to Teachers, Students, Pedagogy, Leadership and Policy, and Change over the course of five days in November of this year. Sessions will be scheduled around the clock, as well as archived, in order to accommodate time zone differences. We still have a great deal of work to do around logistics, so this preliminary call is really focused on getting people involved to help and soliciting ideas for reaching educators around the world. In the first 24 hours of publicizing this event, we received over 1,000 inquiries, and we're absolutely thrilled with the response.
You can help by signing up to participate in some way, or by simply passing on this information. Stay tuned for updates as we begin our efforts to connect educators and students around the world!
tags: edu 2.0, edu2tech, global education
Four short links: 14 May 2010
Personalised Healthcare, Academic Link Shorteners, Journalism Futures, Security
by Nat Torkington | @gnat | comments: 2
- Genome Scan Gives Man Insight Into Future Health Risks -- the first completely mapped genome of a healthy person aimed at predicting future health risks. The scan was conducted by a team of Stanford researchers and cost about $50,000. The researchers say they can now predict [his] risk for dozens of diseases and how he might respond to a number of widely used medicines. Personalized medicine takes a step closer, and all powered by massive computational power.
- Long Handle on Shortened Digital Objects -- digital object identifiers, and their relationship to shortener services like bit.ly (in which O'Reilly is an investor). The Handle System is relatively inexpensive, but the costs are now higher than the large-scale URL shorteners. According to public tax returns, the DOI Foundation pays CNRI about $500,000 per year to run the DOI resolution system. That works out to about 0.7 cents per thousand resolutions. Compare this to Bit.ly, which has attracted $3.5 million of investment and has resolved about 20 billion shortened links, for a cost of about 0.2 cents per thousand. It remains to be seen whether bit.ly will find a sustainable business model; competing directly with DOI is not an impossibility.
- We Are In The Information Business -- A well-architected news website leads to content that will keep on providing value, rather than leaving stories to wither away when their immediate news value has faded. Structured content is the stuff that makes a website malleable, rather than cementing you into certain ways of doing things. Structured content is like a big undo button that allows you to reverse decisions and change how your website looks and behaves. Since none of us can predict the future, the freedom to change course as often as we please and not having to worry about escalating legacy costs, well, that’s pretty close to heaven.
- Sacramento Credit Union FAQ -- The answers to your Security Questions are case sensitive and cannot contain special characters like an apostrophe, or the words “insert,” “delete,” “drop,” “update,” “null,” or “select.” (via Simon Willison)
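The credit union's keyword blocklist strongly suggests answers are being concatenated directly into SQL strings. With parameterized queries, the driver treats user input purely as data, so apostrophes and words like "drop" or "select" need no special handling. A minimal sketch (schema and names invented for illustration):

```python
# Contrast to a keyword blocklist: with parameterized queries ("?"
# placeholders), the database never interprets the answer as SQL,
# so no words or characters need to be banned.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE answers (user TEXT, answer TEXT)")

def save_answer(user, answer):
    # The tuple values are bound as data, not spliced into the SQL text.
    conn.execute("INSERT INTO answers VALUES (?, ?)", (user, answer))

save_answer("alice", "my dog's name is O'Select Drop")  # stored verbatim
row = conn.execute("SELECT answer FROM answers WHERE user = ?",
                   ("alice",)).fetchone()
print(row[0])  # my dog's name is O'Select Drop
```

Case-sensitive answers are a separate red flag, hinting the answers may be compared (or stored) as raw strings rather than hashed.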
tags: bit.ly, genomics, healthcare, journalism, media, security, web
White House moves Recovery.gov to Amazon's cloud
by Alex Howard | @digiphile | comments: 0
Earlier today in a blog post on WhiteHouse.gov, federal CIO Vivek Kundra announced that Recovery.gov would be moving to the cloud. The Recovery Accountability and Transparency Board's primary contractor, Smartronix, chose Amazon's Elastic Compute Cloud (EC2) to host the site. NASA has used EC2 for testing, but this will be the first time a government website -- a ".gov" -- has been hosted on Amazon's EC2. Kundra estimated the savings to the operational budget to run Recovery.gov at approximately $750,000, with $334,000 coming in 2010 alone.
"This is a production system," said Kundra, during a press briefing today. "That's a critical difference from other agencies that have been testing or piloting. We don't have data that's sensitive in nature or vital to national security here."
The recovery board plans to redirect more than $1 million in computer hardware and software that were being used to host Recovery.gov to fraud oversight operations. It's a move that Earl Devaney, chairman of the recovery board, said will help identify fraud, waste and abuse in the recovery program.
tags: amazon ec2, cloud computing, gov 2.0, gov 20
White House deputy CTO Noveck on next steps for open government
Transparency gets all the press, but participation and collaboration are equally important
by Alex Howard | @digiphile | comments: 4
Podcast running time: 13:29.
Transparency initiatives at the White House, one of the three elements of the Open Government Initiative, have received ample attention from mainstream media and groups like the Sunlight Foundation. The implementation of the other two elements, participation and collaboration, have not. Can citizens be empowered to participate and collaborate in governance?
To begin to answer that question, I spoke with Beth Simone Noveck, professor of law at New York Law School, director of the White House Open Government Initiative, and U.S. deputy chief technology officer. Noveck is the author of "Wiki Government," where she wrote about using social networking technology to connect people to policymakers.
President Obama's first executive action on January 21, 2009 was to sign the Memorandum on Transparency and Open Government. Last December, Peter R. Orszag, director of the Office of Management and Budget, issued the Open Government Directive. In April, all federal agencies delivered initial open government plans and an independent coalition released an open government plans audit.
Noveck has been at the heart of open government theory and application for years, starting with her groundbreaking work on the Peer to Patent project. That effort -- which began in 2005 and became the subject of Noveck's 2009 book, "Wiki Government" -- was aimed at applying the expertise of individual members of the public to the heavily burdened U.S. Patent and Trademark Office. Now, she's directing the implementation of the open government agenda of President Obama.
tags: gov 2.0, gov 20, open government
When it comes to new media, the Smithsonian is all in
Michael Edson on how the Smithsonian uses crowdsourcing and transparency to further its mission
by James Turner | comments: 4
The Smithsonian Institution epitomizes the phrase "an embarrassment of riches." With 137 million physical objects in its collection, and 28 distinct museums and research centers, you could spend the rest of your life there and not see everything.
Michael Edson, who serves as director of web and new media strategy for the Smithsonian, got his start cleaning cases in one of the art museums. He now oversees the Institution's online presence, which he talks about in the following interview. He'll expand on many of these same topics at the upcoming Gov 2.0 Expo.
Where the Smithsonian's online content comes from:
Michael Edson: Each museum collection -- sub collection, the zoo, the portrait gallery, natural history museum -- really functions as its own world. And 99 percent of the content creation happens deep within the Institution's 28 separate research and collection units. It's a real "innovation at the edge" environment. We argue in our web and new media strategy that so much great work can happen in those edge environments, where you've got subject matter experts, collections, the public and a little bit of technology expertise really close together. That's the hothouse where these great things grow.
Pretty much any model of web and new media production you can think of is being done somewhere in the Smithsonian. We have 6,000 employees, 137 million physical objects, and incredibly brilliant and talented people. So we do outsourcing. We do in-sourcing. We do lightweight development. We do great big monolithic development. The trick now is establishing a framework -- we call it a Commons -- where these edge innovators can have the basic tools they need to be successful.
tags: gov 2.0, gov 20, museums
Four short links: 13 May 2010
Open Facebook, Internet Stats, Handling Interviews, and Textual Relationships
by Nat Torkington | @gnat | comments: 1
- Don't Simply Build a More Open Facebook, Build a Better One -- Most people don’t care so much about whether technology is “open” or “closed” so long as it works. (Case in point: iPhone.) Rather than starting your plans by picking which “open” standards you’ll use, start by designing a better social networking service and then determine how “open” specs will help you build that service. (via David Recordon)
- Internet Stats from Google -- very nice categorized factoids about internet use, technology, trends, etc. 64% of C-level executives conduct six or more searches per day to locate business information.
- Qualitative Methods for IS Research -- summary of qualitative methods (interviews, documents, observation data) as applied to IS. Written for academics, so you have to choke back passive voice vomit (sorry, "passive voice vomit must be choked back") but it's got lots of useful information on approaches and tools. (via johnny723 on Twitter)
- Social Signaling and Language Use -- turns out that stopwords like "to", "be", and "on" are the ones that indicate manager-subordinate relationships. In so many fields I see again and again that you keep data at each stage of transformation, because transforming for one use prevents others. (via terrycojones on Twitter)
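The function-word signal in that last link can be sketched by simply comparing stopword rates across messages; the word list and sample texts below are invented for illustration, not the paper's actual data or method:

```python
# Rate of function words ("to", "be", "on", ...) per message -- the kind
# of cheap lexical feature the linked research found to correlate with
# manager-subordinate relationships.
from collections import Counter

FUNCTION_WORDS = {"to", "be", "on", "the", "a", "of", "i", "you"}

def function_word_rate(text):
    words = text.lower().split()
    counts = Counter(w for w in words if w in FUNCTION_WORDS)
    return sum(counts.values()) / len(words)

boss = "send the report to me"
report = "i will be happy to get that to you on monday"

print(round(function_word_rate(boss), 2))    # 0.4
print(round(function_word_rate(report), 2))  # 0.55
```

Note the analysis runs on the raw text: this is also a small example of Nat's point about keeping data at each stage of transformation, since stripping stopwords "for one use" would have destroyed exactly the signal another use needed.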
tags: data mining, facebook, internet, interviews, machine learning, numbers, open, research, text
O'Reilly Home | Privacy Policy © 2005 - 2010, O'Reilly Media, Inc. | (707) 827-7000 / (800) 998-9938
All trademarks and registered trademarks appearing on oreilly.com are the property of their respective owners.