Thursday, August 18, 2011

Ctrl+Alt+Del: A Higher Ed Reboot

That was not the classroom I sat in, but pretty close.
An unadorned room with bad lighting. A long wooden table surrounded by slightly uncomfortable chairs. A professor---white-bearded, tweed-jacketed. Twelve students. A photocopied packet of great works, from Rousseau to Darwin to Veblen.

That was the set-up for one of the best academic courses I have taken. Ever.



HSCI E-113 Science, Technology, and the Good Society (21314) (Syllabus)
Peter Buck, PhD, Senior Lecturer on the History of Science, Harvard University.
Graduate seminar. 4 units. Graduate credit $1,325. Limited enrollment.
Thursday, Jan. 31, 5:30-7:30 pm, 51 Brattle Street, Room 219. Spring term
Seminar on the hopes and fears associated with scientific and technological change since the beginning of modern times. Ideas about how advances in science and technology will improve the human condition in the future. Explanations of why technical progress has not produced promised social and political benefits in the past. Readings include classic descriptions—More, Bacon, Condorcet, Mill, Marx—of what the good society will look like, when and if it arrives, and classic accounts—Hobbes, Weber, Veblen—of why expectations have not yet been met and, perhaps, cannot be realized.
Francis Bacon is way smarter than you.
Professor Buck, who joined the Harvard faculty in 1966, is now retired. He had one primary rule: we had to admit that all of the thinkers we were reading were way smarter than we were.

The aim of the course was to figure out, by close reading, exactly what they were saying and why they might be saying it, not to claim that Mill or Marx was "wrong."

We were not allowed to go beyond the text and bring in contemporary or personal assessments.

Such constraints were liberating and led to spirited discussions that he marvelously kept in check.

I admit to having many moments of... "This is why I love higher education!"

Alas, with e-readers and social networking sites ... students cast as consumers and faculty doubling as entertainers ... and the glow of the screen replacing the quiet white of pulp and ink ... this throwback to a simpler time in academia may be in jeopardy.

In fact, according to two thinkers, the entire American higher education system is in crisis. Harvard Business School's Clayton M. Christensen and his co-author Michael B. Horn believe all is not right in the ivory tower.
More fundamentally, the business model that has characterized American higher education is at—or even past—its breaking point.
Christensen, the guru on disruptive innovation, believes that with the advent of online learning, even venerable institutions like Harvard are vulnerable. That said, he's excited about what's to come.
What is exciting about this emerging reinvention is that it has significant potential to help address the challenges facing American higher education by creating an opportunity to rethink its value proposition—its cost and quality.
The ease of creating and distributing content (just look at the fantastic TED talks or MIT's OpenCourseWare or CS 50 right here at SEAS) has the potential to drive down costs while at the same time upping distribution. That may sound like higher education should embrace the Walmart model---philosophy for 50% off and name-brand professors for less!

Christensen is, however, not suggesting that the popular for-profit online educational institutions of the day (many of which are now under attack for their aggressive recruiting methods and questionable outcomes) are the model. Instead, places like Harvard should embrace the new frontier and help set the standards for Higher Ed 2.0.

A case in point is Stanford's radical move to open up a popular course in Artificial Intelligence to anyone who wishes to take it. Here's the kicker: all the virtual participants can submit homework, take exams, and receive a grade. All eyes are going to be on that class for sure.

Before we move into an argument on the pros and cons of online learning, the sacredness of the campus, and the idyllic professor-to-student experience (as I reveled in above), I'd like to take a step back.
"Former lecturer" Eric Mazur in action.
At SEAS we are very fortunate to have the insights of a brilliant applied physicist and teaching expert, Eric Mazur.

Professor Mazur has been on a mission to disrupt another type of tradition: the lecture.

In "Farewell, Lecture" (Science Magazine), he talks about his personal discover that his teaching, for lack of a better word, stunk. He writes:
My lecturing was ineffective, despite the high evaluations ... The traditional approach to teaching reduces education to a transfer of information ... However, education is so much more than just information transfer, especially in science. New information needs to be connected to preexisting knowledge in the student's mind. Students need to develop models to see how science works. Instead, my students were relying on rote memorization.
In short, the students in his intro physics course were not learning. They could not apply basic concepts to new problems. Mazur, thinking as a scientist would, decided to apply the same methodology he used in the lab to the classroom.
Since this agonizing discovery, I have begun to turn this traditional information-transfer model of education upside down. The responsibility for gathering information now rests squarely on the shoulders of the students. They must read material before coming to class, so that class time can be devoted to discussions, peer interactions, and time to assimilate and think. Instead of teaching by telling, I am teaching by questioning.

Mazur went one step further, not only abandoning the traditional lecture but bringing in whole new ways of engagement. He was an early adopter of clicker technology---a way for students to answer questions and then see individual and class results in real time. Now he has moved on to mobile devices and is even using GPS to track how clusters of students tackle a given problem, or how certain students influence others, helping or hindering one another on the way to the correct answer on a problem set.

Mazur will soon take a sabbatical to help refine his teaching methods and empower other faculty at Harvard, especially in the sciences and engineering, to integrate his methods into their courses. He warns, however, that the technology is not the game-changer.
However, it is not the technology but the pedagogy that matters. Unfortunately, the majority of uses of technology in education consist of nothing more than a new implementation of old approaches, and therefore technology is not the magic bullet it is often presumed to be.
To quote Eric Clapton, "it's in the way that you use it." Meaning, before we rush into a YouTube utopia for higher education, it might be wise to figure out what is and is not working in the traditional classroom set-up. After all, even if all courses end up online one day (which I think is very likely), they still need to be well-crafted and thought out.

In parallel to further exploring and implementing Mazur's findings, our dean is committed to putting more of our course content online and using technology to enhance teaching and learning (from virtual office hours to social networking sites). For engineers, this just seems natural---why not use the same technology you teach about to enhance teaching itself?
David Malan makes a profound point.

David Malan, the hero behind the CS 50 revolution, will help lead the effort across the school and the entire College.

Malan has been incredibly smart in how he has used technology to enhance teaching. The aim is not to replace anything per se, but to find ways that an online video or a mobile app can help the student experience.

An online course selection tool developed through CS 50, for example, has transformed the way students shop and choose courses.

Virtual office hours have made it easier for undergraduates, many of whom keep odd hours, to connect. (In fact, one of our faculty now holds in-person office hours in the late evening, as that is when students are most likely to need the help.)

Sometimes, by improving the basic clunky mechanics behind the educational engine, you get more bang for your buck than by investing in a multimedia experience for a course on, say, Shakespeare.

In the case of the course I took from Professor Buck, he insisted that everyone come to class prepared, making a reaction paper (which he graded prior to the class) a requirement for entry. Such simple housekeeping kept the discussions focused and productive---and the solution was no-tech and free. Having taken dozens of courses, I am still puzzled that more professors do not adopt these simple solutions.

Online learning can be fun too!
Believe it or not, one of my other favorite academic experiences was online---completely online. I took a sequence of technical writing courses from Northeastern University in what were the early days of the online learning revolution (University of Phoenix did not exist then.) In that case, given the material and the need of many taking the course to manage demanding full-time jobs, the format was ideal.

In the case of both the seminar format of Professor Buck and the online system offered by NEU, as Mazur advocated, pedagogy took the lead, and the right methodology and format followed.

I agree that disruption to higher education is coming. I'd even admit that it is likely needed. I was chatting with CS faculty member and Extension School affiliate Henry Leitner last night about putting courses online. While we both were extolling all of the benefits, he summed it up in a way that I had not thought of before. "This is simply a great way for Harvard to give back. To share its intellectual wealth with the world. It's the right thing to do."

Again, online learning, while disruptive, should not be seen as the substitute teacher (or the substitute academic experience).

Decision-theory guru Jonah Lehrer wrote a brilliant piece on social networking that gets to the heart of the matter in his assessment of social networking versus actual socializing.
This doesn't mean that we should stop socializing on the web. But it does suggest that we reconsider the purpose of our online networks. For too long, we've imagined technology as a potential substitute for our analog life, as if the phone or Google+ might let us avoid the hassle of getting together in person...
These limitations suggest that the winner of the social network wars won't be the network that feels the most realistic. Instead of being a substitute for old-fashioned socializing, this network will focus on becoming a better supplement, amplifying the advantages of talking in person.
For years now, we've been searching for a technological cure for the inefficiencies of offline interaction. It would be so convenient, after all, if we didn't have to travel to conferences or commute to the office or meet up with friends. But those inefficiencies are necessary. We can't fix them because they aren't broken.
The point of Peter Buck's course on Science, Technology, and the Good Society was just that: an exploration of "why expectations have not yet been met and, perhaps, cannot be realized." That should be kept in mind before higher education is given a reboot.

Wednesday, August 3, 2011

Superheroes and such

Jack Ryan and Captain Ramius in The Hunt for Red October.
I love the scene in The Hunt for Red October when Jack Ryan is ungraciously air-dropped onto a submarine during a torrential storm. The film's hero, who professes a fear of flying and turbulence, has no special powers, weapons, or Conan-like physical attributes.

Instead, Ryan, the man sent to save the world from possible nuclear annihilation, is an academic. "I'm not an agent, I just write books for the CIA," he says, half-embarrassed.

Riffing on the same topic, the first Indiana Jones flick opens with Indy teaching archeology in a stuffy classroom at the fictional Marshall College. That even some of his foes later refer to him as "Dr. Jones"---without any irony---cements his status as a professorial superhero (with a whip).

Indeed, knowledge is the answer.
The motif of the higher-ed adventurer-hero with formidable mental powers shows up again in The Da Vinci Code, the television series Fringe, and even in the fabulously titled The Librarian. Iron Man could also be included, but the protagonist is as much industrialist as engineer. (Let's not even get started with Harry Potter, but sheesh, most of the films take place inside the high school/college we all really wanted to attend---and what are wizards and witches other than faculty with magical powers?)

On the print side, academics get a less warm welcome, especially in works like John Updike's Roger's Version (about chaos theory and proving the existence of God) and Ian McEwan's latest, Solar (about a revolution in the physics of energy production).

Despite their grand achievements in world-altering science, morally, Roger Lambert and Michael Beard are less than superheroes. (In Solar, McEwan, I think quite cheaply, thinly fictionalizes former Harvard president Larry Summers's women-in-science-and-engineering fiasco.)

So, are you expecting me to suggest that, despite their faults, we should elevate academics to modern superhero status? Out with Captain America and in with the Chair of American Studies! Turn the libraries and labs into fantastic fortresses. After all, Jack Ryan and Indiana Jones did best the Russians and the Nazis, respectively, and, well, saved the world.

Imagine entirely new movie franchises (in 3D) with academics saving the day with their brilliant retorts! Oh the faculty meetings would offer up a bevy of brilliance. I suspect that would, alas, end up being more Monty Python than Michael Bay.

UCSD's library kind of looks like a fortress suitable for a superhero.
I am going to go one better. Should universities attempt to put a stake in the ground and help make the world a better place? Dare I say it: should they declare they are going to "save the world"?

Reading all of the PR copy (some of which I wrote), one gets the sense that the aim of most research universities is to use knowledge and its applications to solve our most challenging problems. (Just read some of President Faust's speeches or the recent op-ed from the SEAS dean, Cherry A. Murray.)

Tackle global warming. Discover cleaner, greener sources of energy. Cure disease. End hunger. Promote tolerance. Protect our privacy and security. End poverty. And even revitalize the entire economy.

Aspirational, yes, but not without evidence or success. Especially during WWII, universities like Harvard became hotbeds for practical science, cultivating everything from computing to cryptography to medical imaging to, eventually, the Internet.

While slightly less grandiose, our Harvard friends at Public Health promoted the designated driver program, reducing drunk driving in the U.S.; the Ed school helped create Sesame Street, educating and inspiring generations of kids; HMS invented one of the first "cures" for certain types of cancer with the drug Gleevec; and at SEAS, we have a lot to pat ourselves on the back about (see our list of favorite milestones; baking powder is my personal favorite).

MIT has long been a game-changer.
MIT recently celebrated its 150th. And, well, wow. It is hard to deny that without MIT, the world would be a very, very different place. Email. Biotech. Radar. Quarks. The Roomba.

Case closed. Universities have long been superheroes! Hooray.

And yet. And yet. Other than in times of great distress (such as war), universities do not, in fact, boast that they are going to change (let alone save) the world or direct all of their energies towards a single-minded goal. Doing so is not only presumptuous, it is dangerous.

Any good Kuhnian knows that revolutions do not work in an orderly fashion and that "enemies" do not come in brightly colored tights. And most professors, especially in the sciences, are very wary of suggesting that their latest discoveries will lead to a manufacturing revolution, a cure, or the next Google or Facebook. When such things happen, the normal reaction is utter surprise. "I wasn't trying to change the world, I was just solving an interesting problem" is a common refrain.

Is picking low-hanging intellectual fruit enough?
It does seem disappointing. Universities do make a huge impact, but they have to be careful about over-promising the fruits of knowledge (except after the fact---as that is what alumni are for). And yes, knowledge for its own sake is a lovely, wonderful thing. You cannot, however, base fundraising campaigns or calls for massive government or corporate investments on picking low-hanging fruit.

Moreover, given the multi-billion dollar endowments of some of our most well-known institutions, underselling impact doesn't play all that well to alumni, budget-minded politicians eager to slash federal funding for research, or to the public. 

To hedge, Harvard, in particular, takes a different tack, employing a kind of six-degrees-of-separation scheme. The tagline is: We educate leaders, who then go out and lead governments, companies, and institutions that end up making a difference.

This is reminiscent of BASF's former tagline: "We don't make a lot of the products you buy. We make a lot of the products you buy better." (A typical advertisement is below.)


MIT, as it did for its 150th celebration, shows its contributions in a similar way: the number of spin-off companies and overall economic impact of various inventions and technologies.

These are all fine indicators of success. What they lack in the "wow" department they make up for in celebrating the long-term dynamism and overall value proposition of American higher education.

That said, I am a sucker for a good moonshot. And given our current government's preoccupation with debt, I doubt it is going to come from Washington---at least not soon.

Couldn't universities step up, or better, band together, and declare: We will cure cancer. We will stop global warming. We will invent an ideal form of energy. We will revolutionize transportation.

IBM's Watson was a big-risk, big-reward endeavor.
Believe it or not, companies have been far less bashful about getting out in front of problems. IBM set out to create a machine that could trounce the world's leading Jeopardy! players. Guess what? Watson did so with a flourish.

Google, honestly, wants to put everything its programmers can get their hands on online. Amazon is changing the way we read.

Apple changed music, forever. And private entrepreneurs are building planes that can fly into space, and Virgin Galactic is signing up customers right now. On a bigger scale, consider that sequencing the human genome was a collaborative effort of a private company, the government, and higher education.

(And if you want to read about companies going a bit too far, check out Fordlandia, Henry Ford's quest to create an American Midwest-in-miniature in South America or books about Milton Hershey, the candy man who wanted to develop the perfect little utopia.)

So where does that leave universities in the world-bettering equation? Will we have to continue to look to fiction and film for our superheroes? Or wait for companies to tie profits to issues of great promise?

Instead, I think universities should think big and dream big.
  • They can identify the kinds of problems that need to be solved---and show how, even in small ways, they are making progress.
  • They can collaborate and join together, as in the case of Internet2, and partner with industry to tackle Moore's law, quantum computing, and other deeply challenging frontiers.
  • With vast projects like the Large Hadron Collider, universities can continue to convince governments that it is worth the investment to get to the bottom of how our universe works and spark the public's imagination.
  • They can work with countries to solve infrastructure and energy problems and become the 'hubs' (virtual and physical) where people are brought together to get their hands dirty and come up with not just policy solutions, but real engineering solutions.
  • They can encourage their students to enter competitions with big goals, like RoboCup, which seeks to create a team of robotic soccer players that can compete with humans by 2050. Better, they can create such competitions themselves---if it worked for Netflix to crowdsource a smarter algorithm, it can work for higher education.
  • They can, as MIT did, celebrate their contributions, showing how they make the world not just better, but far more interesting and exciting. (To be inspired, just check out all those TEDx talks.)
Ultimately, even with all of the excitement of flash and explosions at the movies, the down-to-earth superheroes are far more compelling. In fact, the recent trend has been to unmask the heroes and show their humanity (from the reboot of Batman to the latest Spider-Man to shows like Heroes).

Believability is potent stuff. Jack Ryan works as a latter-day hero precisely because it is his book smarts that win the day, all with a bit of humility and humor. The world and its problems are way too complex for solutions that offer only bludgeoning as a response.

To wit, you cannot blow up global warming without destroying the earth. Thankfully, moviegoers are smart enough to realize these nuances, as superheroes now operate in a grayer, tougher landscape.

An assessment of the modern battlefield by SEAS's Kit Parker, my favorite bioengineer-soldier, offers precisely this dose of reality: "It is the most intellectually complex environment that probably our military has ever operated in."

Blue sky thinking in higher ed should be more than just about caps.
This is not to say that universities should not attempt to go for the gusto.

In fact, it may be better if they encourage some blue-sky thinking---just blue-sky thinking at a slightly lower elevation and with a net.

"The choice we have is not between reasonable proposals and an unreasonable utopianism. Utopian thinking does not undermine or discount real reforms. Indeed, it is almost the opposite: practical reforms depend on utopian dreaming." - Russell Jacoby, Picture Imperfect Utopianism