Eric Raymond predicts that smartphones will disrupt the traditional PC market:
Here’s what I think my computing experience is going to look like, oh, about 2014:
All my software development projects and personal papers live on the same device I make my phone calls from. It looks a lot like the G1 now sitting on the desk inches from my left hand; a handful of buttons, a small flatscreen, and a cable/charger port. My desk has three other things on it: a keyboard about the size of the one I have now, a display larger than the one I have now, and an optical drive. Wires from all three run to a small cradle base in which my phone sits; this also doubles as a USB hub, and has an Ethernet cable running to my house network. And that’s my computer…
When I leave the house, I pull the phone from its cradle and put it in my pocket. At that point, the onboard screen becomes its display. I’m limited to low resolution and a soft keyboard through the phone’s touchscreen…until I get to my local internet cafe, which is full of display-keyboard combinations much like the one I have at home, awaiting my use. If for some reason I need an optical drive, I borrow one and plug it into the device hub that’s servicing my phone.
I think the trend he’s describing is plausible, but the timescale seems wrong. I’m already doing basically what he describes with my laptop. I’ve got one MacBook and two keyboard/display setups: one on my desk at home and one on my desk at school. It’s extremely convenient to have exactly the same computing environment everywhere I go.
However, I don’t think convergence with phones is going to happen quite as quickly as ESR is predicting. My iPhone is an order of magnitude slower and has an order of magnitude less storage than my MacBook. The gap is closing, but it’s happening pretty slowly. You could run a desktop OS on your smartphone, but it would feel like you’d time-traveled back to 2001. I think it’ll be another decade before phones are fast and capacious enough that people will be willing to give up their traditional PCs.
Another possibility is that we’ll see just the opposite: a proliferation of cheap, varied devices, with everything synced to “the cloud.” On this model, everyone will have a bunch of $200 devices in a variety of shapes and sizes scattered around (desktops, laptops, tablets, and smartphones), and they’ll be able to access all their stuff from any device because everything is stored on a server run by Google, Microsoft, or Dropbox.
This outcome seems more likely for a couple of reasons. First, there are a lot of different potential form factors for computing devices. ESR’s model only works well for the two ends of the spectrum. Second, the key economic trend is that these devices are going to keep getting cheaper. People are going to find it convenient to treat them as basically disposable, much as pocket calculators and land-line telephones are treated today. In contrast, ESR’s model suggests that the phone will get more and more valuable as it takes on more and more functions. Given how bad most people are about backups, they’re not going to want to put all their files on a device they could leave in the back of a taxicab.
I’m not too enthusiastic about the “cloud” scenario because I like control over my data. But a lot of people basically work this way today, and if I had to bet money, I’d say that’s how most people will be computing a decade in the future.
(Thanks to Chaim for the pointer to the ESR article)
Does everyone’s move to a cloud model have to be incompatible with people storing their own data?
Why can’t you just keep everything hosted on your PC at home, and use cloud-friendly hand-held devices to get remote access to it?
I have to agree with Pete. I’m already wishing that I had the setup at home to simply access anything on either my PC or my NAS from any display.
I don’t know if either solution you’re describing is likely to get adopted by programmers (I know I’m not keen on either), which will slow their adoption and innovation considerably.
Agreeing with Pete here as well. I don’t think programmers will adopt a solution to this very soon, and I don’t see why they would want to in the first place (except for the benefits to us as consumers). But only time will tell, eh…
George