The End of the Textbook

End of March 2008

What would be the situation of a $5 billion a year industry that had a product that hadn’t substantially changed in decades, a product that over that same span had been increasing in price at twice the rate of inflation, and that operated in a market where it could remain almost completely deaf to the opinions of its primary consumers? Pretty dire, right?

Wrong. If it’s the college textbook publishing industry, these problems have caused some discomfort, but they haven’t changed the industry and they haven’t affected the bottom line. Publishers have a captive, reliable, and slightly growing audience from year to year. Consolidation into four corporate giants has reduced the overall level of competition. The industry is unique in that the consumer is not the adopter: students buy the increasingly expensive books that professors recommend, but the professors are immune from the pain of rising costs.

It gets worse. Used book sales, on the web and at college bookstores, have eaten into the publishers’ profits. This has in turn spurred an arms race, where publishers put out new editions every couple of years and shrink-wrap the books with ancillaries like CDs and software, all to try to force instructors to adopt only the most “recent” materials. There is little or no price pressure on publishers because they are marketing to the professors, and the professors either don’t care or rationalize the situation by saying that students “have to buy a book anyway.” Publishers also give professors adoption rewards like DVD sets of TV shows and expensive software. These amount to little more than bribes, but getting 100 students to buy a $100 textbook generates $10,000 in revenue, so a $100 bribe is just good business.

This situation has recently attracted unwelcome attention from student groups and Congress. The average student spends about $900 per year on textbooks, about 20% of tuition and fees at a four-year public institution, so this is a substantial cost for students and their parents. The GAO investigated textbook prices in 2005, and politicians in nearly 30 states have conducted hearings or raised the issue for public debate. The grassroots Student Public Interest Research Groups have become vigorous critics of the textbook industry, pointing out that books are often much cheaper in Europe, that very few professors use the bundled extra materials, and that new editions are not needed on such a rapid timescale. At the GAO hearing, one congressman sardonically pointed out that new calculus texts were churned out every few years, but he “didn’t think that calculus had changed much since the time of Newton.”

Where does the textbook dollar go? The Association of College Bookstores creates and updates a useful chart. Of every dollar a student spends, 11 cents goes to the authors, 66 cents to the publisher, and 23 cents to the college bookstore. The royalty rate for authors is standard and, after all, without the authors there would be no authoritative material to help students study. The publisher takes the lion’s share, two-thirds, of which about 25 cents goes to marketing and administrative costs, which seems like pure overhead. The bookstore takes only a quarter, but bookstores are not blameless in this game. They buy back used copies at 25 cents on the dollar and resell them at half price, pocketing 25 cents on every original dollar (a 100% markup) by selling the same books over and over again. That’s pure gravy, so their claim of altruism toward students rings hollow: they are acting as profit-skimming brokers, and if only used copies changed hands there would soon be no new textbooks at all.

My perspective on this subject is as both an insider and an outsider. I wrote textbooks for a decade before leaving the game, and now I have college-age kids myself, so I know from first-hand experience about the big bills. I quit writing textbooks because the industry was becoming steadily more impersonal: small publishers with long-time editors were consumed by conglomerates that put the money into marketing. Authors were increasingly treated like paid help rather than as creators of content and pedagogy and full partners in the publishing process. For me, it became a grind and not much fun. Besides, I still drive a 15-year-old car; very few instructors who write books do more than add a modest supplement to their income.

What does the future hold? It would seem that in the age of information and the Internet, printed books are an anachronism. People do like to read, and students still like to mark up their books with highlighting and notes, but they also all have laptops and could easily migrate to electronic alternatives. The textbook is still in rude health, but it shows signs of terminal illness. Alternatives, and a glimpse of the future, will be covered in a future post.

Comments off

The End of the University

Middle of March 2008

When Plato founded the Academy in 385 B.C., he had no idea that the concept of the university would permeate the civilized world. Until the nineteenth century, universities were religious European institutions that catered only to the elite and taught a narrow range of disciplines ill-suited to people who might have to learn a trade. By the twentieth century, the secular German model had propagated to other countries, and now nearly half the population of industrialized countries experiences tertiary education.

American universities are victims of their own success. Serious, high-level career employment is almost impossible without a university degree; it has become the high school diploma of the twenty-first century. Research universities have become bloated and bureaucratic, and the quaint tradition of tenure means that the professoriate is almost immune from market forces. As a result, students and their families can be gouged by tuition that’s rising at twice the rate of inflation. This is justified by the extra earning power that a college education still conveys. However, the bang for the buck declines when students find themselves in huge lecture classes that use old Germanic teaching methods, and when so few classrooms are informed by the latest research on pedagogy.

The gravy days are nearly over. Parents and government regulators are starting to draw lines in the sand on further tuition increases. Regents and university administrators are demanding that their highly paid research faculty pay more attention to teaching, and to the undergraduates who pay most of the bills. Students are demanding that universities be responsive to the changing workplace; forty years ago a graduate typically held a single job over a career, while today the number is six or seven. The largest high school graduating class will be in 2010. Thereafter, enrollments will decline and the landscape will transition into a buyer’s market.

I teach at a large Land Grant university with 40,000 undergraduates and thousands of different classes. It’s primarily a residential campus, and the first year away from home, that time of roiling hormones and excess and experimentation, is still a rite of passage for many 18-year-olds. However, the percentage of adult or non-traditional learners in U.S. colleges has grown steadily, from 15% of students in 1990 to over 50% in 2004. The 18-year-old freshman newly away from home is now a minority.

The informal sector of higher education is the only one that will grow in the next decade. The graying of America, and the spryness of all those retired people, means that lifelong education will be more than a catchphrase. Over 100 million people, 50% of all adults, now enroll in a college course in any 12-month period, an increase of 35% since 1991. Another 40 million adults participate in courses occasionally for personal interest. These older students have a lot of disposable income, so their preferences and their spending patterns matter.

Technology is also transforming the landscape of the university. You might not know it from visiting a campus like mine, where most large classes are still taught in the old-fashioned way, by lecturing, and where the use of computers, classroom responders, and other learner-centered tools is still rare. At its worst, this is a pitiful mode of transmission of information, little better than babysitting. Even at its best, it is inefficient and poorly suited to the way students actually learn.

The biggest change wrought by technology is the erosion of the sense of place. Distance learning began with the growth of the postal service and correspondence courses in the nineteenth century. Britain’s Open University was established in 1969; with very little fanfare it has grown to 180,000 students enrolled worldwide. The rapidly growing, for-profit University of Phoenix in the U.S. has 250,000 students spread over 79 campuses. These universities cater to vocational needs and they offer classes when working people need them, or they offer online learning anytime, anyplace. 

The wired and wireless Internet, and the growth of bandwidth sufficient to deliver video, are creating a situation where the classroom experience can be simulated online. Add to that immersive technologies like Second Life and game-play learning, and the need to sit in a classroom will dissipate. Adult learners will use video on demand to view lectures, tools like Wikipedia instead of a library, and discussion boards and wikis to discuss and collaborate with their fellow students. It sounds detached and austere but it’s already the way most teenagers learn and interact. 

Does all this presage the end of the university? Not necessarily. The jeremiads have been written before and been found to be premature. The cultural heritage that posits years at university as a rite of passage and maturation will not go away overnight. The technology that facilitates distance education will have a lurching and painful learning curve. The corps of disembodied learners will grow steadily but not dramatically. And Plato got one thing exactly right when he founded the Academy in an olive grove outside Athens. He employed the Socratic dialog, the best method ever developed for challenging ideas and homing in on the truth. The virtual university of the mid-twenty-first century will discard this at its peril.


Comments off

My So-Called Cyber-Life

End of February 2008

It’s a familiar lament to any parent of a modern teenager: they are so multiply connected to electronic inputs that the real world slips away and might become irrelevant. If you tell a fifteen year old that when they were eight there were no iPods and when they were born there was no Internet, they’ll look at you either blankly or aghast, as if you had said there was no food.

Many newspaper inches and web electrons have been spent dissecting the onrush of digital media and their effect on young people, but the biggest mine of real data is the web site of the Pew Internet and American Life Project. This charitable organization has had its finger on the pulse of the Internet and its impact on the popular culture for nearly a decade. Their reports cover everything from education and privacy issues to the effect of the Internet on the political discourse of the nation.

There’s a YouTube video currently posted with a message that scrolls up, Star Wars style, saying “For years, parents could not text message. They could not figure out how to record a voice mail. They could not even connect to the Internet without using AOL.” After a warning that parents are adapting to new technologies, there’s a clip of a man figuring out how to use the video capabilities of his cell phone. “Watch with caution,” the video closes, “and pray that your parents do not gain these powers.” The teen mind may recall the opening scene of an earlier science fiction classic, 2001: A Space Odyssey, when apes confront the enigmatic monolith that will change them forever.

I have relatives in their eighties who have iPods and know how to text their nieces and nephews, but young people are ascending the rapid curve of technology much faster than their parents. Teenagers now spend six and a half hours a day with video input. That total is not much higher than it was a decade ago, but the mix has changed; TV watching is down, and playing video games and watching video clips are way up. It’s axiomatic that all teenagers multiplex on a computer, doing their homework in parallel with text-messaging, watching online videos, checking out MySpace or Facebook, and surfing the Internet.

The most recent Pew report notes the emergence of “super-communicators,” the quarter of all teens who use all the major communication channels—texting, cell phones, social network web sites, and instant messaging—to connect to their friends. A sub-theme of this blog is projecting the future of IT usage. What is the end point of this extraordinarily rapid assimilation of technology? Will it saturate at some point due to limitations of the brain or the finite number of hours in the day? Will the distancing aspect outweigh the ease of making new connections? Will these technologies be used to sustain a froth of popular and often trivial culture or will they be used to promote learning?

One hopeful sign from the Pew study is the fact that teenagers are not passive users of these new technologies. Two thirds have created some kind of online content, compared to only 15% of adults. A third have shared artistic creations or songs, or have created or worked on blogs and web pages for school or groups they belong to. A quarter of them have created their own web page or online journal. And a quarter of them have remixed online content into their own creations, creating what is known as a mash-up. Blogging is hot, nearly doubling among teens between 2004 and 2006. There are interesting gender effects too. Girls are twice as likely to blog as boys while boys are twice as likely to post videos. The new modes of content creation and posting facilitate further communications: 90% of teens report feedback from others or online conversation as a result of posting a photo or video.

At the pinnacle of young people harnessing these new capabilities are some amazing success stories. Catherine Cook founded her social networking site while she was in high school as a way of keeping in touch with friends after she graduated; it has 2 million members and has attracted $4 million of venture capital. Ashley Qualls was 14 when she created a web site to help teens “express themselves” with art and graphics. Her site now gets 60 million page views a month. Ben Cathers was only 12 when he started his first business on the Internet. By age 19 he had his own syndicated radio show and had founded a search engine technology company. For each of these teenage titans there are many mini-moguls.

Internet use is saturating, for the simple reason that 93% of teens are online, versus 73% eight years ago. Their use is intensifying; a third go online multiple times a day. However, technology has not been able to use Special Relativity to create more hours in the day, so we can anticipate that the new technologies will become like “wallpaper,” unremarkable parts of the everyday flow of life. And despite the fear voiced up front, teens still value face-to-face contact. As for parents, you’re on your own. But it’s a comfort to know that almost any average eight-year-old could help you program your web phone.

Comments off

The Wonderful World of Wikipedia

Middle of February 2008

To 99% of the population, Wikipedia has become wallpaper, a free source of information so useful and ubiquitous that it’s uninteresting. To academicians, however, the idea of an encyclopedia that anyone can edit is anathema. It sounds like a recipe for misinformation and opinion dressed as facts. They worry about the blind leading the blind.

This criticism was always overdone. Head to head comparisons show that Wikipedia is marginally less accurate and reliable than Microsoft’s Encarta and Britannica Online but the difference is unlikely to be noticeable to the average user. Occasionally a journalist has “salted” Wikipedia with errors or falsehoods and complained that they persist. But that’s only true if the problems are in rarely-visited byways; on average a deliberate error lasts just a day. The truth is that several thousand dedicated Wikipedians and hundreds of thousands of occasional editors keep the resource remarkably free of major problems.

The worst hotspots occur where you might imagine, in articles that deal with politics or contentious social issues like abortion. A more subtle and insidious form of bias comes from corporations or individuals editing articles out of self-interest, financial or personal. To deal with this, the architects can lock down articles on contentious issues or require editors to reveal themselves and disclose conflicts of interest. In many technical and esoteric fields, the quality is very high. I’ve found only a few minor errors in the hundred or so astronomy articles that I’ve read. After all, who would vandalize an article on stellar nucleosynthesis, even if they knew how? Another complaint concerns the proportionality and balance of a resource where the article on Britney Spears is two-thirds of the length of the article on Albert Einstein. So what? If the information on Einstein is correct, why should we cavil at the inclusion of the numbing details of Britney’s life, as long as they’re also accurate?

Wikipedia has become so ubiquitous as a one-stop shop for information that it’s turned the 800-pound gorilla called Google into its tame poodle. I entered 100 words or phrases that a student might use in their studies, ranging from “Napoleon” to “subjunctive” to “magnetic field,” and Wikipedia was outside the top three results only 10% of the time. This extraordinary result means that the giant search engines have largely become proxy search tools for the Wikipedia web site. It also points to Wikipedia’s Achilles’ heel: the lack of any consistent hierarchy for the information. Articles are assigned to categories with as many as 22 levels, but a third of the articles aren’t categorized, and classifiers can’t keep up with the rapid growth of the resource.

Articles on major topics were written years ago, so time serves to hone the information and weed out the diminishing number of errors. Wikipedia is growing laterally. Most of the new articles populate a long tail of obscure topics and obscure people. The fact that few people will care about such articles is beside the point; they’re all there in one web location, and nobody will ever have to create that information again. The other thing Wikipedia has going for it is currency. News is updated almost instantly, and journalists have become agitated at this competition from rank amateurs. Articles on major topics also get “churned” to ensure currency, a process that’s impossible in the standard model of publishing.

Wikipedia is also becoming less monotone. Foreign language articles are being added faster than articles in English, and 250 languages are represented. In addition to the 2.2 million articles, the resource has over a million images and many useful maps, charts, and tables. Unknown to most users, there is an effort to create structured data tables on many topics, so that information can be retrieved by structured queries akin to natural language. This is a step in the direction of the semantic web envisaged by Sir Tim Berners-Lee, the architect of the World Wide Web.

Nobody foresaw the spontaneous creation of such a wonderful free resource in a context where the web was being increasingly dominated by commerce. The sense of community embodied by Wikipedia is inspiring, and that alone is enough to forgive its faults. Like a living entity, Wikipedia will continue to grow and adapt and evolve in surprising ways.

Comments off

Scientific Computing for the Masses

End of January 2008

The temple of scientific computing has been opened to the unwashed heathens. Science has always pushed the envelope of computation and the state of the art of computers. Until a decade ago, supercomputers built by IBM and Hitachi and Cray duked it out to claim the title of fastest calculator; their speeds were measured in teraflops, where a teraflop is a mind-bending trillion floating point operations per second. Such machines were power-hungry, cost tens of millions of dollars, and had sleek housings made of anodized aluminum. It took chilled water or Freon to soothe their fevered circuits. Supercomputers were cool, literally and metaphorically.

Supercomputers still exist, and they’re used to tackle the toughest problems in science: how galaxies form, how proteins fold, what happens when an atomic bomb goes off underground, and how to embarrass mere mortals at the game of chess. The current record-holder is an IBM machine called Blue Gene that clocks in at a staggering 500 trillion operations per second. It could do your taxes in a nanosecond, if it would ever stoop that low.

But the relentless pace of Moore’s law means that a high-end desktop today is like the supercomputer of only a decade ago. Moreover, blazing speed in a supercomputer is achieved by clustering processors and running them in parallel. As PCs gained speed, researchers realized they could be harnessed into a highly distributed supercomputer, where the calculations are parceled out over the Internet and the answer is assembled afterwards.

Rather than a supercomputer in a room, this is a supercomputer in thousands of rooms scattered across the world. There are half a billion PCs around the world. They spend most of their time idling, and rarely approach their computational capacity even when they are being used by their owners. Scientists have figured out how to harness this vast excess capacity on a volunteer basis. Most people are delighted to be able to help solve problems at the cutting edge of science.

Volunteer or networked computing has projects across all scientific disciplines, spurred by an open-source platform that makes such projects easy to set up and manage, called, with onomatopoeic whimsy, BOINC, the Berkeley Open Infrastructure for Network Computing. Projects include searches for prime numbers and cancer cures, and tests of drugs that can combat tropical diseases. IBM has harnessed 800,000 volunteer computers for a variety of philanthropic and humanitarian causes. A group from Stanford studies protein folding, which is the key to a variety of diseases such as Alzheimer’s, and late last year they used the processing power of 40,000 PlayStation 3s to pass the magical petaflop barrier, a quadrillion floating point operations per second.
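The core trick is easy to sketch in a few lines of code. This toy example (my own illustration, not BOINC’s actual protocol) parcels a prime-counting job into independent work units, farms them out to local worker processes standing in for volunteer machines, and assembles the partial answers:

```python
# Toy sketch of volunteer computing: carve a big job into independent
# work units, hand them out, and assemble the results. Local processes
# stand in for volunteer PCs; the real BOINC platform also handles
# scheduling, result validation, and redundancy.
from multiprocessing import Pool

def count_primes(unit):
    """One work unit: count primes in the half-open range [start, stop)."""
    start, stop = unit
    count = 0
    for n in range(max(start, 2), stop):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

if __name__ == "__main__":
    # Parcel the range 0..100,000 into 20 independent work units.
    step = 5_000
    units = [(i, i + step) for i in range(0, 100_000, step)]
    with Pool(4) as volunteers:        # four stand-in "volunteer" machines
        partials = volunteers.map(count_primes, units)
    print(sum(partials))               # assemble the answer: 9592 primes
```

The key property is that each unit is independent, so it doesn’t matter which machine computes it, in what order, or how slowly; the server just adds up whatever comes back.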

Astronomy features heavily in this new form of “citizen” computing. The granddaddy of distributed computing projects is SETI@home, which started in 1999 and has had over 6 million participants. This screensaver analyzes chunks of radio spectral data looking for artificial signals from extraterrestrial civilizations; none has been found so far. Some of the more interesting distributed computing projects tap the skill and judgment of their participants, not just their CPUs. Stardust@home enlisted 24,000 volunteers to search images of the porous aerogel from the Stardust comet sample return mission for the telltale tracks of interstellar dust grains. This needle-in-a-haystack project was very successful; forty million searches resulted in fifty new dust particles recovered.

The Galaxy Zoo has been even more of a hit. This British-American collaboration recruits citizen scientists to classify the shapes and types of galaxies in Sloan Digital Sky Survey images, of which there are too many for professionals to handle. In under a year, 100,000 volunteers have classified over a million galaxies. Project managers had to buy new servers to handle the overwhelming public response. Each galaxy is classified by as many as 30 volunteers, and the results are just as accurate as classification by a professional.
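The redundancy is what makes amateur classifications trustworthy: with up to 30 votes per galaxy, a simple consensus rule washes out individual mistakes. Here is a minimal sketch of the vote-counting idea (my own illustration with hypothetical labels, not the project’s actual pipeline):

```python
# Reduce redundant volunteer classifications to a consensus label.
from collections import Counter

def consensus(labels, threshold=0.8):
    """Return (label, agreement, confident) for one galaxy's votes.
    The galaxy counts as confidently classified when the winning
    label's share of the votes reaches the threshold."""
    votes = Counter(labels)
    label, count = votes.most_common(1)[0]
    agreement = count / len(labels)
    return label, agreement, agreement >= threshold

# 30 hypothetical volunteer votes for one galaxy:
votes = ["spiral"] * 26 + ["elliptical"] * 3 + ["merger"]
label, agreement, confident = consensus(votes)
print(label, round(agreement, 2), confident)   # spiral 0.87 True
```

With a handful of votes a single careless click can flip the answer; with thirty, the majority label is stable, which is why the pooled amateur results can match a professional’s.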

It’s unabashed good news that computing power is making science less of a priesthood, and letting members of the public participate in cutting-edge research or help find the answers to real-world problems. If you are not doing research right now, you’re almost certainly wasting the computer in front of you.

Comments off

Getting Used to Exponential Change

Middle of January 2008

We’re captives of linear thinking, because time flows like a river and our lives play out linearly. The closest most of us get to an appreciation of non-linear growth is the wonder of compound interest. Exponential change is so dramatic that intuition fails us. We smile at the story of the precocious child who normally gets $10 a week in pocket money, but tells her father that she’ll cut him a break by accepting a penny at the beginning of the year and doubling the amount each week thereafter. Wise fathers decline the offer: the weekly payment passes $10 in March, and instead of $520 for the year they’d be in for more than the GNP of the world, some $45,000,000,000,000, by year’s end.
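It’s easy to check the arithmetic of the allowance story:

```python
# A penny in week 1, doubling every week for a 52-week year.
payment, total = 0.01, 0.0
for week in range(1, 53):
    total += payment
    if payment >= 10 > payment / 2:
        # First week the payment exceeds the usual $10 allowance.
        print(f"weekly payment passes $10 in week {week}")   # week 11, mid-March
    payment *= 2
print(f"total for the year: ${total:,.0f}")   # roughly $45 trillion
```

The total is a penny times (2^52 − 1), about $45 trillion, which is why wise fathers decline.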

Some technologies change slowly or improve only marginally. The internal combustion engine is mostly unchanged in a century, and the rockets that power the Space Shuttle are close relatives of the chemical rockets that launched Sputnik fifty years ago. Information technology is different. The speed of microprocessors has been doubling every twelve to eighteen months since the mid-1960s, and Moore’s law is accompanied by a shrinkage in size that decreases the power requirement and increases the portability of computers and computational devices. Experts differ on how long this virtuous trend can continue. The fundamental limits of the architecture of integrated circuits are being approached, but by using light instead of electrons Moore’s law might continue for another decade. Beyond that, as the famous physicist Richard Feynman pointed out, there’s “room at the bottom.” If bits can flip at the level of individual atoms, the engine of ever-increasing computer power won’t run out of steam, if you’ll pardon the anachronistic metaphor.

Moore’s law is paralleled by two trends equally relevant to information technology. One is the exponential increase in the capacity of memory and digital storage. The other is the rapid increase in the bandwidth of most wired and wireless networks. These trends create an extraordinary convergence of benefits for anyone who uses information technology, which is to say: everyone. The manipulation, storage, and transmission of bits are all getting ever easier and ever cheaper.

Of course it doesn’t always feel that way. Bloated and Byzantine software seems capable of absorbing all the gains on offer from Moore’s law (for more information, try Googling “Wintel conspiracy”). Images, music, and particularly video can fill the most capacious hard drive, and the sharing of such material within rapidly growing social networks will tax bandwidth even as bandwidth rapidly grows. Despite these teething pains, our digital lives are much richer and more capable than they were a decade ago.

What would happen if we were to anticipate exponential change, rather than just riding it breathlessly? Ironically, if you’re in any business that involves solving problems or doing calculations, you should sit on your hands and do nothing. Procrastination is the logical answer. Why? Because anything you can do right now will be subsumed and superseded in the future. In my field of astronomy, this is the situation of the people who search for radio signals from hypothetical intelligent aliens. They scan the radio spectrum hertz by hertz and look at as many stars as possible; the experiment is limited by both bandwidth and computation speed. The survey started last year using the Allen Telescope Array will, within a year, exceed the sum of all the surveys that preceded it over fifty years. Such is the power of riding the exponential curve.

Many people do in fact wait on technology. The first cell phones of the 1970s were the size of a small briefcase, very power-hungry, and served by limited networks. The first PCs of the 1980s were prone to crashing, and their speed limited them to simple tasks and games. The rapid maturation of technology is a natural byproduct of Moore’s law, because a lower price leads to economies of scale in manufacturing and larger potential markets. We sensibly jump into a new technology when it has evolved beyond the first few steps along the exponential curve.

A modest extrapolation of current trends, and of technologies currently in the lab, leads to the following prediction. In fifteen, or maybe twenty, years we’ll only dimly remember cursing slow downloads off the Internet, getting frustrated because we’ve filled our memory stick or iPod, or staring at the hourglass icon (another wonderfully anachronistic metaphor) while our computer chews on some process. Fifteen years of eighteen-monthly doubling is two to the power ten, or a factor of a thousand. Let’s be conservative and say the gain is only a factor of a hundred.

When the average microprocessor runs at 100 GHz, the average memory device holds 1000 Gbytes, and the wireless Internet runs at 10 Gbits per second, the situation will fundamentally change. When your personal computing device holds all the music you could ever listen to or all the photos you could ever take, and when any web page or video clip can be delivered instantly to your handheld, the organization and navigation of information becomes more important than your degree of access to information. The limitations of human sensory input and human processing power will dominate. Without tools to make sense of the cyber-babble, Moore’s law risks turning into a curse of plenty. In this important respect the Information Age is still in its infancy.
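The extrapolation itself is a one-liner. In this sketch the 2008 baseline figures are illustrative ballpark numbers of my own choosing, not measurements, scaled by the conservative factor of a hundred:

```python
# Doubling every 18 months, projected 15 years forward.
years, doubling_time = 15, 1.5
factor = 2 ** (years / doubling_time)   # 2**10 = 1024, the "factor of a thousand"
print(f"gain after {years} years: {factor:.0f}x")

# Illustrative 2008 baselines (ballpark assumptions), scaled by the
# conservative factor of 100:
conservative = 100
for name, value, unit in [("CPU clock", 1, "GHz"),
                          ("storage", 10, "GB"),
                          ("wireless", 100, "Mbit/s")]:
    print(f"{name}: {value} {unit} -> {value * conservative} {unit}")
```

Multiplying a 1 GHz processor, a 10 GB device, and a 100 Mbit/s link by a hundred gives exactly the 100 GHz, 1000 GB, and 10 Gbit/s figures above.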

Comments off