An oral history of USB, the port that changed everything
In the olden days, plugging something into your computer—a mouse, a printer, a hard drive—required a zoo of cables. Maybe you needed a PS/2 connector or a serial port, the Apple Desktop Bus, or a DIN connector; maybe a parallel port or SCSI or Firewire cable. If you’ve never heard of those things—and even if you have—thank USB. When it was first released in 1996, the idea was right there in the name: Universal Serial Bus. And to be universal, it had to just work. “The technology that we were replacing, like serial ports, parallel ports, the mouse and keyboard ports, they all required a fair amount of software support, and any time you installed a device, it required multiple reboots and sometimes even opening the box,” says Ajay Bhatt, who retired from Intel in 2016. “Our goal was that when you get a device, you plug it in, and it works.”
It was at Intel in Oregon where engineers made it work, at Intel where they drummed up the support of an industry that was eager to make PCs easier to use and ship more of them. But it was an initial skeptic that first popularized the standard: in a shock to many geeks in 1998, the Steve Jobs-led Apple released the groundbreaking first iMac as a USB-only machine. The faster speeds of USB 2.0 gave rise to new easy-to-use peripherals too, like the flash drive, which helped kill the floppy disk and the Zip drive and CD-Rs. What followed was a parade of stuff you could plug in: disco balls, head massagers, security keys, an infinity of mobile phone chargers. There are now, by one count, six billion USB devices in the world.
Now a new cable design, Type-C, is creeping in on the typical USB Type-A and Type-B ports on phones, tablets, computers, and other devices—and mercifully, unlike the old USB cable, it’s reversible. The next-generation USB4, coming later this year, will be capable of achieving speeds upwards of 40Gbps, which is over 3,000 times faster than the highest speeds of the very first USB. Bhatt couldn’t have imagined all of that when, as a young engineer at Intel in the early ’90s, he was simply trying to install a multimedia card. The rest is history, one that Joel Johnson plugged in to with some of the key players. Their reminiscences have been edited for clarity. —The Ed.
“I knew that computers could be made easy to use”
Ajay Bhatt: It was probably in 1992—I had joined Intel in 1990—that I started looking at the PC. I always felt that they were too difficult to use. I based that on watching my family struggle with computers while doing something as simple as printing a document.
I also struggled, even as a technologist. I struggled with upgrading my PC when the multimedia cards first started coming out. I looked at the architecture, and I thought, you know what? There are better ways of working with computers, and this is just too difficult.
Bala Cadambi (an I/O architecture manager, now Intel’s director of IO Technologies/Standards): If you go back to the creation of the PC, it was based on the IBM design and the documentation of the hardware, the BIOS, and the interfaces. It was put together with the mindset that it would be used by somewhat computer-knowledgeable users. By the early ’90s, it was clear that the PC as it was evolving needed to become easier to use.
AB: The original goal was to attract a new class of users and promote new uses of computers. That’s where it all began in 1992. I came to work, proposed this idea to a few managers, and didn’t get much interest, actually. People didn’t grasp the usefulness of something like USB, but I was quite passionate about it. I knew that computers could be made easy to use, and you didn’t need an IT guy to install a printer or configure a keyboard or a mouse or support multiple input devices.
First: microprocessors
BC: Both Intel and Microsoft were seeing business that was set to go well beyond those first 10 million users, which required PC hardware and software to be much easier, much more seamless, and much more standardized.
The first of those initiatives was PCI [Peripheral Component Interconnect]. PCI was intended to make some systems within the PC, within the box, easy to install, initialize, upgrade, and maintain. That initiative, starting with PCI, evolved to become plug and play. PCI was the first of the standardized 32-bit interfaces, which has since evolved to become PCI Express.
Even under PCI, each computer peripheral had different and specific characteristics in terms of how data moved in and out of the PC. In some cases that required the addition of adapter jacks on the outside of a PC, or additional cards on the inside.
AB: The basic concern was that at that time, there were multiple ways to interface with hardware. That actually meant that any time you changed something, it would result in significant changes to the operating system and applications themselves.
Everybody saw that as the most difficult part, and my immediate supervisor basically told me that changes cannot be done: “I don’t think you’ll succeed. You don’t understand PC architecture.” I said to him, I said, “No, no, no. We can fix that. Believe me. This can be done.” I had a hard time convincing him.
BC: The PC in this instance was not a notebook, keep in mind. It was a desktop. This was the era when notebooks were just barely beginning, and they were luggable, big boxes. Literally, I would say that if it had a handle, it was portable. If it didn’t have a handle, it was a desktop.
So as you plugged in these devices, things started getting cluttered—a different standard for audio, another for a serial modem, a different one for a SCSI printer. Each of these had artifacts in terms of how they would work. The orientation of the plug. Whether you could hot-plug or you had to reboot the PC. Whether the software was there inside the PC, or you had to load it from a floppy drive. Whether they would work if you moved the peripheral from one interface to another.
AB: I didn’t get any positive response, so I decided to make a lateral move within the company to a sister group, and that’s when I started working for a gentleman named Fred Pollock. At that time, there were a handful of Intel Fellows in the company. These are the topmost technical folks at Intel. He’s an incredibly smart person and one of the top computer scientists. I spoke to him, and his view was, “I don’t know. You know what? Go convince yourself.” That’s all I needed. I needed somebody who would be open-minded enough to allow me to take this risk.
I didn’t just rely on him. I started socializing this idea with other groups at Intel. I talked to business guys, and I talked to other technologists, and eventually, I even went out and talked to Microsoft. And we spoke to other people who ultimately became our partners, like Compaq, DEC, IBM, NEC, and others.
Basically, I had to not only build allies inside the company, but we had to ally with people outside, and obviously, each company or each person that I spoke to had their own perspective on what it ought to be. One thing that was common was that everybody agreed that PCs were too hard to use and even hard to design around. Something had to be done, and that’s where it all began.
“A lot of storming and norming”
In February of ’92, as Intel and others were working on the PCI rollout and the plug and play initiative, a group of companies met in Redmond to discuss how to standardize external interfaces on the PC.
BC: That was a fairly simple ad hoc meeting. It was clear that we had a problem that was not well understood—let alone what solution we would want for it. At that point, I was already managing the team that did PCI and plug and play at Intel, so the knowledge base was there for us to appreciate how difficult this would be.
We did not have an appreciation in terms of what the potential for something could be in terms of future applications and requirements. So solving the current problem may have been somewhat obvious, but anticipating what the PC interfaces might look like several years down the road required us to start doing research. Meeting other companies. Talking to analysts. Talking to end users. Looking at where the trends were in terms of the business and the consumer marketplace for applications.
AB: Slowly but surely, I started convincing people [inside Intel] of the subtle requirements that we had, which ultimately became the USB. I think somewhere around 1993, we had achieved internal alignment, and we were off and running.
This whole convincing part took about a year, a year and a half. There was a lot of storming and norming happening in the early part of development, where we had to work on people’s skepticism and get people aligned in solving this problem.
By late ’93, maybe early ’94, I had formed a small team. We had internal work groups to generate ideas at Intel, and also to do analysis and to write the specs. Then we were also working with external partners on a regular basis.
At that time, it was called the “Serial Box”; it didn’t have an official name yet. It was a technology-driven thing.
Digging into the Serial Box at Denny’s at 2 a.m.
Ajay and Bala, who had met while working together on the plug and play initiative, were joined by Jim Pappas, an input/output expert, along with others in marketing at Intel. They formed a working group in the summer of 1994.
Jim Pappas (engineering manager, now director of technology initiatives at Intel): We were a very focused and committed team. The four of us, myself, Bala, Ajay, and Steve Whalley, were very, very active and very close—in particular, Bala and I. I haven’t called him at home in a decade and a half, but I would pick up a phone and my fingers would just dial his number because we talked so much. It was incredible.
We would do what we called a power breakfast. Let’s say we were meeting with a company the next day to get them going on the technology. We might all fly in at different times from different cities. At 1 a.m., 2 a.m., we would meet at a Denny’s or something and have our power breakfast: “Okay, what are we going to say tomorrow? What do we need?” We had this habit of not only working very closely together but doing it almost around the clock, and it was an incredibly fun experience.
BC: We went, literally, on a road trip for about a year. We visited about 50 companies across a whole range of industries—printing, scanning, communications, industrial controls, keyboards, mice, joysticks, modems, and so on. I’m just amazed there was that much interest in something as simple as a standardized external interface. The enthusiasm came because every one of those companies had a requirement, and they felt there was a market opportunity that was restrained by the current interfaces.
The biggest challenge, however, was that there was no existing problem to solve other than ease of use. Everything [already] had a place to plug in, which made it extra hard to ask: why do you need to standardize before the next thing comes along?
Building the team
Ajay and Bala and Jim’s team evolved into a larger group within Intel. The group was subdivided into different disciplines: overall protocols, how the bits would be organized, electromechanical issues—like connectors and cables—and groups devoted to business and adoption.
AB: We had organized ourselves to go attack all different aspects of a technology and what it takes to make it successful in the market because we wanted to cover all the bases. We not only wanted to do the spec, but we also wanted to help the developers develop products around this technology.
We didn’t quite stop after finishing the specs; we came up with the recipe to design various applications. We also created something called “interoperability programs,” such that when different vendors were brought together, they would get tested according to predetermined tests, and we would make sure that everybody was following the spec and that these devices would interoperate without any hiccups.
Even though we were an alliance, we were like a startup, paying attention to every aspect of not only the specs but the product development and, ultimately, the introduction to the market.
JP: Ajay was the technical spec lead, Bala was the engineering lead, and I was running the overall program. We were forming an engineering group. In Intel speak, “Two in a Box” is where you have two managers. I asked Bala to join me running the program. It’s almost like we put ourselves back-to-back. He was looking into Intel and driving the engineering efforts, I was looking after driving the industry efforts.
BC: Jim drove the external communication, the external alignment of industries, putting the promoter group together, the industry forum together. He’s a natural at the communication aspect of technology relative to the marketplace. Often, he would defer to me to give keynote talks, but we tag teamed on that. Routinely, Jim and I would touch base at the end of every day. It was an 11 p.m. call. That went on for years. The phone would ring, and our spouses would think, “That must be Bala. That must be Jim.” It was that routine.
AB: I think we succeeded because everything that we did was well implemented. I knew that this would be the case because we had multidisciplinary teams where we had people who were experts in software and operating systems. We had people who knew how to build systems, like IBM and Compaq. We had people who knew how to build chips, like Intel and NEC. We had Nortel that knew how to build the telephony and other things that eventually became very important. By assembling a team of experts, we were able to reduce the risk and see to it that these were very broadly applicable specs to a variety of applications.
BC: Ajay was central to developing the spec itself. That’s where his passion was. He was also passionate in terms of understanding the requirements to make sure that the spec met the requirements. [Engineer] Jeff Morris, at this point, had relocated from Santa Clara to Oregon to work on my team. He felt strongly that this was something he wanted to work on. He wrote a good chunk, I would say more than half, of the first spec along with all the white papers that went with it to develop the technology.
AB: The spec got done somewhere in 1995. There was a [trade] show called COMDEX. Our goal was to finish the spec right around [COMDEX’s November dates] at the end of 1995. Thereafter, we started working on the products and stuff. It was a long journey, but ultimately the industry realized that this was something that really addressed all the pain points of a personal computer.
The problem with Firewire and other interfaces
Computer companies were eager to make plugging in easier, but one application on the horizon that seemed to demand a faster interface in particular was video. Digital multimedia was still in its infancy, but getting video on and off computers would become a major focus for computer and peripheral makers. As Intel engineers worked on the interface that would become USB, they were also examining possibly faster alternatives.
BC: In general, streaming multimedia, getting video in and out of the PC, was the one [area] that we could put at the forefront and say, “You need to be able to do something like this.” I think in general, most companies appreciated the fact that if things were easier to use, the purchasing experience, the upgrade experience, the service and support would become easier. They would also have lower returns and fewer frustrated service calls.
As we gathered the requirements, we were also in parallel evaluating technologies that might fit these requirements. We clearly didn’t want to go out and invent something new if there was something that was close enough or good enough.
We explored about 12 different technologies. The most obvious of them was IEEE 1394, which subsequently got labeled as Firewire. When I first started attending those committee meetings to see whether that technology would work, 1394 was a 10-megabyte interface. It felt like a technology looking for a problem to solve. They already had something, but they weren’t quite sure what to use it for, and it was evolving. It was also a little more complicated and expensive than what the PC cost point was. On the other hand, it had elements that would be potentially usable.
We looked at that generation of Ethernet-like technologies. We looked at audio interfaces. Apple had an interface called GeoPort back then. We actually talked to Apple to see if they might be interested in evolving that. It didn’t quite transpire. Access Bus was another industry standard.
AB: I personally went to so many different forums. I talked to people in adjacent areas and said, “Guys, let’s all consolidate our applications.” For music, there’s an interface called MIDI, and a lot of synthesizers and keyboards used it. I remember hosting a meeting with the key telephony vendors in Dallas, because there were a whole bunch of potential allies outside the PC industry. We were trying to convince people that you could do computer telephony using something like USB, and a lot of people thought that we could not support things like that. There was a view at HP that your printers would talk to your computer using this infrared data link [IrDA].
BC: I would say it was USB, 1394, and Access Bus that were running on parallel tracks for about a year or two. In that ’93, ’94 time frame. By then, there was a groundswell on USB. Back then, it was called the Serial Bus. We didn’t have the name identified yet.
How USB got its name
BC: The naming of USB itself was a significant committee effort. The question of naming went three ways. There was a school of thought that said anything that was numbered wouldn’t succeed. It was too techy. Don’t make it like “1394”—that was a spec number. It needed to be something with a handle that users could relate to. We tried coming up with consumer-ish names, but then we felt they were too far out from what USB was.
Intel is very big on acronyms, in case you haven’t picked that up already. If you look at our organization, a lot of team names, organization names, project names, technology names—a lot of them are acronyms. That seemed to be a place for us to start, along with the universality of the solution. We played around with that. How would we expand on that?
On the other hand, the word “bus” seemed counterintuitive, but it was something the industry knew the meaning of, so we stuck with that. Other interfaces were going parallel—SCSI, the parallel port, and so on. [Our new standard] was slim. It was economical and simple. You want to bring out some of those elements in the description.
That’s what brought out the “universal serial bus.” There was this idea around the usage of the word “bus” in that time frame as being something that gets you from here to there, efficiently and consistently.
In the end, I think the universality of it is what really took off. That was what we were trying to do.
JP: COMDEX was a huge show in Las Vegas, and [in 1998], we rented a big hall, and we actually had a big showcase as well. We rented a big pavilion where we did a press event, and we attached the full 127 devices to one PC, and we hired Bill Nye the Science Guy to plug in the last one, to kind of show how this one port on the PC could support—we had a whole stage full of different printers! We went around and shook one mouse and then shook a different mouse, or printed something here and printed something there.
Okay, but: why wasn’t the plug reversible?
AB: Good question. We had looked at it, but the whole goal here was to make it very inexpensive, and at that point, we were trying to solve all the USB problems with two wires. To make things flippable, you have to add wires, and you also have to add a lot of silicon. Wires and pins cost real money, so we decided to keep it as cheap as possible. With the serial port and the parallel port, there were versions that were 25 pins and 36 pins and so on and so forth. The cables were very thick and expensive. We were trying to solve all of those problems, so we went in favor of fewer wires. In hindsight, a flippable connector would have been better.
AB: Our goal was to say that this interface should be such that it should work on a mouse and it should also work on a high-end printer or on digital cameras. That’s what we were looking at, the range of products. At one end, we wanted it simple enough, so there could be very low costs. At the other end, we wanted to make sure that it could be scaled and, just as we speak today, we’re running the USB at tens of gigs. The original one was running at 12 megs. We’ve come a long way in scaling.
The call from Microsoft’s Betsy Tanner that saved USB
JP: One of the people we met at Microsoft was Betsy Tanner, and at the time, she was the engineering manager for the mouse. I talked to Betsy and said, “if there ever comes a day that you’re not going to use USB for your next Microsoft mouse, I need to know.” And she says, “okay, that’s a fair request.”
We were designing USB—originally, it was supposed to be a five megabit-per-second bus, which at the time was faster than anything else that normally would come at the back of the PC. It’s not fast by today’s standards, but at the time, it seemed fast. And the reason we wanted high speed was so you could fan it out through hubs, and basically, however many devices would be attached to that single port would be sharing that bandwidth—not necessarily all being used simultaneously, but we wanted it to be fairly robust. Well, Betsy called me one day and said, “Jim, you asked me to call you if we’re not going to be using USB for the mouse. I’m calling you to tell you we’re not going to be able to do it because we have a problem.”
And I said, “what’s the problem?” She says, “Well, 5 megabits is just too fast.”
She said, “For a mouse, we don’t need that much bandwidth, and secondly, I’m really afraid of whether we can pass the electromagnetic interference specifications. Signals going through a wire become an antenna. Am I going to have too much EMI radiation coming off, creating digital noise?”
She said, “We could solve it by putting a shield around it, but it adds 4 cents per foot to the cost of a cable. If I’ve got a six-foot cable, that adds 24 cents, so I can’t do that. Secondly, a mouse needs to have a supple cable. The cable can’t affect the movement of the mouse, and I’m afraid that if I put a shield on it, it becomes too stiff.”
So I said, “Betsy, what could you live with?” She said, “We’d be comfortable with two megabits per second.”
And I said, “Damn, that’s just so slow. Give me a week—can you do that?”
She said yes. I came back to the team, and we discussed Microsoft’s problem, and that’s where we actually split it, so we had a high speed and a low speed on the bus. At the high end, we brought it up to 12 megabits per second. And then we took the low speed down to one and a half megabits per second—three quarters of the 2 megabits she had said was her maximum.
We saved Microsoft, we saved the mouse. And I think that that call from Betsy saved the program. One of the reasons why USB was so successful is because it hit the cost point that was required. It didn’t add any significant cost to the PC. You can even make the argument that it reduced the cost, over time.
“Apple had no interest in working with us”
Among the USB confederacy that brought the standard into the world, one big company was notably missing. But in 1998, with the release of the iMac, Apple became the first to include USB as the only plug on its computers. It was Apple, not Intel, that would become the first prominent computer company to be associated with USB.
AB: It’s interesting. There was no Apple on the list, and they had a competing product called 1394, or Firewire. Apple also had their own interfaces. They were known for ease of use even then. Once the spec got done, it was actually Apple that came out with the first product. The Windows-based system was transitioning from DOS to Windows and from Windows 3.1 to Windows 98.
Remember, we were not the marketeers. We had a vision to bring about a profound change to the computer industry. That’s what my motivation was as a computer scientist. I wanted some of the clunky interfaces to go away, because they were limiting internal extensions as well as some of the applications of the computer.
As a matter of fact, when we started this thing, we had approached Apple, and they had no interest in working with us, and they wanted to go in a different direction. When they adopted the spec, we knew that we had done the right thing, and we had addressed the right problem. We were nothing but happy about it. Our view was that this pie needs to get bigger, and everyone will have a significant piece of pie. We were not at all disappointed. We were elated, and every time new stuff came out, it made us even more happy, and it validated our vision that we were solving the right problem.
Everything was USB
JP: Starting in the fall of 1996, [USB ports] started appearing on PCs. In the fall, Microsoft had [Windows 95] OSR 2.1, if I remember right. It had support for USB, but it had to be preinstalled by OEMs on new machines; you couldn’t buy it off the shelf for an existing PC. There were peripherals coming out, but it wasn’t like what happened in 1998.
When Windows 98 shipped, it was like a dam burst. The world was flooded with USB devices. I remember Steve Whalley and I were actually in Tokyo. We went to Akihabara, the electronic district of Tokyo, and we walked into one of the large electronic stores. We started walking around to see if they had USB stuff there yet. This was prior to Windows 98 or right about that time. We were walking around, we didn’t really see much. Somebody came up and asked us if they could help, and we said, “yeah, we’re looking for USB devices,” and he says, “oh, well that’s the fifth floor. The fifth floor is all USB devices!”
There was an entire floor of this electronics superstore dedicated to USB devices. That was a pretty exciting moment as well. You walked up there, and there were aisles of the stuff. Everything was USB.
BC: Who would have thought that a connector we defined in the early ’90s would still be usable today? That’s very rare. We had cost constraints, performance constraints. It was designed for a desktop, not a smartphone. Looking back at it, it was wonderful that we accomplished what we did, that it withstood the test of time—that we were able to build on it, enhancing the power delivery, the performance, all the things we did for USB 2.0 and USB 3.0.
The miniaturization of USB has taken us beyond the PC era right into the mobile era. We’ve also built on other protocols that have evolved since, beyond USB itself. We brought the goodness of all of those into Type-C.
JP: With USB-C, you can actually charge your laptop with a single USB port. You don’t even have to have a dedicated power port on your laptop anymore. It’s been a big deal. Who could have predicted?
BC: Defining a new connector is always a transition issue. We thought about it extremely carefully. It took us a good six years of work to put the whole Type-C capability together. A lot of industry work, again, went into it.
There are two aspects that we talk about when we talk about the standard. One is the interface that you plug in, and that is changing. But the interface behind it—between the device and its drivers, and between the drivers and the operating system—there’s no change to that. That has held seamlessly for 20 years, and it carries through to USB-C.
JP: Bala put together a great slide at one point. Sean Maloney was one of our senior vice presidents, and Bala took a picture of the back of Sean’s computer. It was a complete rat’s nest of cables. That was one visual of the industry at the time, and that mess just kind of conveyed the problem visually.
But the other slide that he did was USB—the first hundred million devices. I remember we put that slide up in one of my keynotes or something, and the audience would laugh, you know?
And then a few years later, we were shipping 2.2 billion units a year. Nobody was laughing anymore. The success of this thing has been phenomenal. It became the ubiquitous connector.