One of my favorite scholars is James Bessen, a lecturer at Boston University and a fellow at Harvard’s Berkman Center. A Harvard graduate, he founded a company that created one of the first desktop publishing systems and helped revolutionize the publishing industry. He sold that company in 1993 and has since become a self-trained academic economist.
Some of his most important work has been on patents. He wrote an excellent paper on software patents with future Nobel laureate Eric Maskin. And with Michael Meurer, he wrote Patent Failure, a fantastic book I have promoted at every opportunity.
Patents are one aspect of Bessen’s larger research agenda, which is focused on understanding the process of innovation and the policies that encourage it. To that end, he has been doing some in-depth research into the history of innovation. In this interview, he talks about his findings on the history of weaving technology, his own experiences in the desktop publishing industry, and what those experiences tell us about the alleged “great stagnation” of our own era. My questions are in bold, and his responses are in ordinary type. I’ll post a bit more of the interview tomorrow.
Timothy B. Lee: I think a lot of people have a sense that the rise of the Internet and the software industry are pretty exceptional. On the other hand, Tyler Cowen has argued that the changes of the last 40 years are actually less dramatic than those of his grandmother’s lifetime. Where do you come down on this question?
James Bessen: Cowen argues that we’ve picked all the low-hanging ideas and that we’re running out of good ideas. Other people have a sense that we’re sort of in the midst of a technological revolution. The central paradox is that for the past three decades, during the rise of the personal computer, wages, at least, have stagnated. There’s this sense that we’re doing all this innovation, we’re coming up with new technology, but we’re not seeing the economic fruits of it like we did in the past.
So maybe this is just frivolous technology, not “real” innovation. Grandma got indoor plumbing and we’re getting social networking.
Cowen trots out things like patent statistics, but he unfortunately gets it wrong. For starters, patents are not measuring innovation; they’re measuring industrial strategies. In terms of the number of patents granted, even just to domestic innovators, it’s at an all-time high. If you weight it per capita, it’s a little bit less than it was in the late 19th century, but not very much. So patents are not a clear indication of a great stagnation.
But isn’t the relatively slow growth of wages and GDP evidence that we’re not producing as many good ideas as we used to?
People think these great inventors have these great ideas which then just go out and immediately revolutionize society and produce all of these benefits. So the fact that we’re seeing lots of technology, lots of innovation, and yet not seeing the economic benefit seems to say, “well, something’s wrong with those ideas.”
But if you look in the past, technology has never been about simple inventions revolutionizing society directly. It’s always been about them providing an opportunity, but that opportunity requires the development of all sorts of new knowledge by large numbers of people–people who are going to use it, people who are going to work with it, people who are going to build it, and that’s very often a process that takes decades.
You’ve studied 19th century weaving technology as an example of this process, right?
It’s interesting. Some people view the story as one where Cartwright invented the power loom in 1785, and that revolutionized weaving and immediately de-skilled the job from the artisan hand-weaver to the unskilled factory worker. But in fact, if you look at the actual timing of things, it was a couple of decades after Cartwright before the looms really even got into production in any substantial way. It was a couple of decades more until the knowledge was really standardized.
So even if you understood the machines and had access to the machines, you needed to understand how to maintain them, how to modify them, how to use them, how to train people on them. It turns out that while weavers were unskilled when they walked in the factory door, they acquired essential learning on the job, and that skill was critical to making the new technology profitable. There was a period of several decades where the successful mills were the ones with the knowledge of how to operate the machinery.
One example is William Gilmore, an English mechanic who had worked on looms in England. He came to the US in 1815 and was one of the first people to build looms here. It took him a year or longer to build his first looms, and he couldn’t get them to work until finally he talked to an Englishman who had been a loom operator. The Englishman told Gilmore his machinery was fine; he just didn’t know how to operate it. The knowledge you needed to run this thing with any sort of efficiency was not obvious even to someone who understood the machinery intimately.
In the present day, we have lots of computers around and people expect that because everybody’s got a computer on their desk, people should be more productive.
It’s not the computer itself, it’s the computer applications. To build successful computer applications typically requires organizational changes. It also requires custom adaptations to the particular needs of the user–what Tim Bresnahan called co-invention. So there’s a lot of adaptation, there’s a lot of very specific investments going along with that. It’s never enough just to have the equipment just like it was never enough just to have the loom.
One economic historian found that the mill owners who simply purchased the machinery from others almost all failed. The ones who succeeded were the ones who had some capacity to build and modify the equipment themselves. It’s a similar thing here. Not that you need to build the computer yourself, but there’s an awful lot of custom software development that goes into adapting applications to the needs of particular businesses. And very often that’s where the productivity payoff comes in.
And this is something you’ve had personal experience with in the past, right? Can you tell us about your experience creating desktop publishing software in the 1980s?
That was the big shock about going into business. I was expecting that we would create this software and people would just buy it. But people would call up wanting us to adapt it to particular needs.
And they needed to adapt not only the software but also the way they worked to a new method of production. In the extreme case, it meant complete re-organization of a workplace. At the Sears Catalog they eliminated something like a hundred jobs.
For those of us who have always done our publishing on computers, can you briefly describe how things were done before the advent of publishing software?
Take the Sears Catalog, for example. They would have artists who would draw up a page, copywriters who would write up text, and they would do various mockups, and then send it off to a typesetter, an outside agency who would typeset the various materials. There were little pieces of photo paper that could be pasted together on a page layout, and then those would come back and be proofed, and modifications of that would have to be re-typed, and re-output, and re-pasted-up, until they finally had a camera-ready mockup. They would then take that to a camera room and shoot a large photograph that was used to make a printing plate.
People began doing computerized typesetting in the early 1960s. They began with the easier applications and in the 1970s you had some very large and sophisticated operations–very expensive equipment. What you saw in the early 1980s was that you could do it on a personal computer, which made it much more accessible and much more cost-effective.
What effect did this have on the labor market?
Beginning in the late 1970s, particularly with the newspapers, it started to lead to the elimination of jobs in the typesetting and composition departments–in some cases leading to large strikes, in other cases leading to complete buyouts of people’s jobs.
The International Typographical Union was one of the most powerful unions. This technology replaced an awful lot of those jobs. Those were well-paying blue-collar jobs. But it meant a greater demand for graphic designers and editors. A lot of the work was shifted upstream in a way.
And so it depressed wages for one group and increased wages for another group. But then things started moving to the web so we started getting some of those graphic designers replaced by web designers, and now we’re seeing web designers replaced by mobile designers. There are all sorts of new specialties.
All of this is very expensive. From society’s point of view, there’s a lot of investment going on, a lot of learning going on. You’re changing the nature of whole organizations. A lot of this is not standardized knowledge, so the schools aren’t necessarily able to teach the latest technology. Even after you get an MFA in interactive design, you still need to be learning new skills every other year because there’s new technology you need to be learning.
This leads to greater wage disparities. There are a small number of people who can excel in an environment without a lot of standardization. They can work as freelancers, they can teach themselves new technologies, there’s not as much competition as in a well-established profession. On the other hand, there are other people in the middle who may be experiencing some downward pressure on wages, because what they did in the past is simply less valuable.
So should we view this as a grim development? Are we doomed to a future where a lot of people have trouble finding work?
The lesson of history is that you need to be careful about projecting current trends permanently into the future. Things change. From the 1820s to the 1860s, weavers’ wages were pretty much flat at the same time that the productivity of the technology was improving dramatically. This was partly because the technology was changing, and partly because the knowledge wasn’t standardized.
In the beginning, the weavers were hired as mill girls. They were brought in and boarded in town for a year or two. There were relatively few mills, so you couldn’t learn it on the job and then get employment easily anywhere else. The number of new hires who had previous experience was very small.
Over time, a residential workforce developed and the number of people who had previous experience with weaving started to grow. Gradually over a period of decades, it developed this workforce of experienced weavers. So by the time we get to the 1870s or 1880s, almost all of the new hires had previous experience. So there was a labor market for those skills that didn’t exist before.
The first schools for learning weaving only appeared after the Civil War. The employers had to pay for the training in the earlier years. In the later years, the weavers paid for their own training, but they also got the benefit of it. So wages started rising and weavers started getting a larger share of the benefits of the technology.
You could project a similar change happening in, say, publishing technology. The situation we’re in right now is that every couple of years, people acquire the words they read through a different medium–a different technology–and so there’s a constant churning of new skills. It’s possible, of course, that that could go on forever. On the other hand, it’s possible that things may settle down to some sort of configuration where there is a dominant way that’s stable over a long period of time. Then you would expect to see a similar transition to greater wage growth.