
Dr. Dobb's Special Report, Spring 1997

by Michael Swaine

Michael Swaine is Dr. Dobb's Journal's editor-at-large.
Gary Kildall
In the early days of the personal-computer revolution, the atmosphere at those shoestring startup companies with names like "Golemics" and "Loving Grace Cybernetics" was often more academic than businesslike. This collegiate ambiance touched everything, from the ways in which decisions were made and respect allocated, right down to sophomoric pranks and styles of dress.

There's a fairly obvious reason for this, or at least for some of it: Microcomputers were a new field, ripe for rapid advances, and that's a situation that fits neatly into a collegial atmosphere in which information is openly shared. When discoveries are freely shared, it's easier to build quickly on those discoveries; conversely, when progress is rapid, there is less reason to hold onto yesterday's discoveries. This natural synergy between rapid progress and information sharing is one key factor in the spectacular growth in the use and acceptance of computers over the past 20 years. It's one of the reasons that the personal-computer revolution really has been a revolution.

In time, companies like Apple and Microsoft would emphasize this synergy, explicitly calling their corporate headquarters "campuses." Even today, computer hardware and software companies often have a lot of the look and feel of grad school. But this predilection for a collegial atmosphere predates Apple and Microsoft. And while it didn't start there either, it was nowhere more evident in the early days than at one of the first personal-computer software companies—Digital Research. Digital Research could hardly have been anything but collegial: The company that gave personal computers their first operating system was the brainchild of a perpetual academic and born teacher. His name was Gary Kildall.


Gary Kildall seemed fated to be a teacher. His uncle would later claim that it was what Gary had always wanted. Teaching certainly was in his blood: The Kildall family actually owned and operated a small college. More precisely, it was a school for the teaching of navigation, based in Seattle, Washington. The Kildall College of Nautical Knowledge, the family called it; it was founded in 1924 by Gary's grandfather. Many Kildalls taught or worked at the school, including Gary himself, for a while, after graduating from high school.

But he had decided that year that he was going to be a math teacher, so he enrolled at the University of Washington. Newly married to high-school sweetheart Dorothy McEwen, he buckled down and applied himself to his studies, trying to put a childhood of mediocre grades, fast cars, and pranks behind him.

Somewhere along the way to a math degree he got hooked on computers. On finishing his degree, Gary went on to graduate school in computer science. He was still headed for a career in teaching, only now it would be teaching computer science at one of the few colleges that had programs back then. But there was a hitch. He had joined the Naval Reserve, and it was the '60s, with the Vietnam war in full flower. The Navy gave him a choice: Go to Vietnam or take a faculty position at the Naval Postgraduate School in Monterey, California.

Gary thought about it for a microsecond and chose Monterey. Even when the Navy told him what to do, the answer was the same: Teach.


It was in Monterey that Gary created CP/M, the program that brought him success and that became the unquestioned standard operating system throughout the microcomputer industry. CP/M was a good product and deserved, for many technical reasons, to be the standard. But getting there first always helps, too. And CP/M actually appeared a year before the first commercial microcomputer arrived on the scene.

Unlike operating systems before and since, CP/M was not the result of years of research by a team of software engineers. It was, like most software of its time, the invention of one individual. That individual was Gary Kildall, and if chance put Kildall in just the right place at just the right time, you would have to say, in retrospect, that chance chose well. As it did with Bill Gates, chance spoke to Gary Kildall through a note on a college bulletin board, college bulletin boards apparently being the Schwab's Drug Store of personal-computer fame.

The note talked about a $25 "microcomputer," a pretty good deal even at 1972 prices. It was actually describing not a computer but the first microprocessor, the 4004 that Ted Hoff had designed at Intel. Presumably, this note was an advertisement torn from a recent issue of Electronic News. Intel had hired Regis McKenna to write the ad at Hoff's urging. Hoff was convinced that techies would see the virtue of this new device, this general-purpose processor, and urged that it be advertised, extravagantly but not altogether inaccurately, as a "microcomputer." This would make it absolutely clear that it was not just another limited-purpose device, but something fundamentally different. Hoff was sure that engineers and programmers would get it.

Kildall got it, literally, sending off his $25 for one of the first Intel 4004 chips.

It was 1972. Kildall was busy teaching computer science at the United States Naval Postgraduate School in Monterey. He and Dorothy (and son Scotty) had moved into a house in neighboring Pacific Grove. The Seattle natives loved this scenic coastal town, with its laid-back, fog-draped ambiance. The place suited the easy-going professor. Whether in class or among family and friends the lanky, shaggy-maned Kildall spoke with the same soft voice, the same disarming wit. Although he was teaching at a naval installation, he wouldn't have been out of place on any college campus in his customary sport shirts and jeans. When he had a point to make he would often cast about for chalk or a pencil; he was an incurable diagram drawer.

Gary was happy in his marriage, happy to be living by the ocean, happy not to have gone to Vietnam, and most definitely happy in his job. He loved teaching, and the work left him time to program. Nothing in his life was preparing him to run a business, to handle a spectacularly successful software company supplying the essential software for hundreds of different computer models in an industry running wild. Everything argued for his staying right where he was forever, teaching and programming. At first, the 4004 seemed to fit in with that scenario.

Gary started writing programs for the 4004. His father, up at that little navigation school in Seattle, had always wanted a machine that would compute navigation triangles. Gary made that his project, writing some arithmetic programs to run on the 4004, thinking idly that he might come up with something that his father could use. He was really just fooling around with the device, trying to see how far he could push it, and with what speed and accuracy.

Not all that far, he soon learned. The 4 in 4004 meant that the device dealt with data in 4-bit chunks—less than a character. Getting anything useful done with it was a pain, and performance was pitiful. Although he was frustrated by the limitations of the 4004, he was fascinated by what it promised. Early in 1972 he visited Intel and was surprised to see how small the microcomputer division (dedicated to the 4004 and the new 8008) was: The company had set aside only a few small rooms for the entire division. Gary and the Intel microcomputer people got along well, though, and he began working there as a consultant on his one free day a week. He spent months programming the 4004 in this day-a-week mode until he "nearly went crazy with it." He realized—and it was a radical idea for the time—that he would never go back to "big" computers again. Which is not to say that he stopped using "big" computers. With both the 4004 and the significantly more powerful 8008 that he soon moved on to, he was doing his development work on a minicomputer, much as Bill Gates and Paul Allen did later in writing software for the breakthrough MITS Altair computer. Like Paul Allen, he wrote programs to simulate the microprocessor on the "big" minicomputer, and used this simulated microprocessor, with its simulated instruction set, to test the programs he wrote to run on the real microprocessor.

But unlike Gates and Allen, Gary had the benefit of a development system, essentially a full microcomputer spun out around the microprocessor, so he could try out his work on the real thing as he went along. In a few months he had created a language implementation called "PL/M," a version of the mainframe language PL/I that was significantly more sophisticated than Basic.

The Lab

As partial payment for his work, Gary received a development system of his own, which he immediately set up in the back of his classroom. This allowed him to combine his new obsession with microcomputers and his love of teaching. The system in the back of the classroom became the Naval Postgraduate School's first—if not the world's first—academic microcomputer lab.

And academic it was. This was not just Gary's toy; he used it to teach students about the technology, and encouraged them to explore it. His curious students took him up on it, spending hours after class tinkering with the machine. When Intel upgraded this Intellec-8 from an 8008 to its new 8080 processor and gave Gary a display monitor and a high-speed paper tape reader, he and his students were working with a system comparable to—favorably comparable to—the early Altair computer before the Altair was even conceived.

Gary realized, though, that he was missing an essential ingredient of a really useful computer system—an efficient storage medium. In the early '70s, paper tape was one of the standard storage media, along with the infamous punched card. Neither was very efficient, and the issue was particularly critical on microcomputer systems because the relatively slow microprocessors couldn't offset the inherent slowness of the mechanical process of punching holes in pieces of paper.

IBM had recently introduced a new storage medium that was much faster and more efficient. It was based on the existing technology of recording data as patterns of magnetization on large rapidly spinning disks, a medium that had everything going for it except price. But IBM engineers figured out how to scale down this technology to something smaller and more affordable, creating the floppy-disk drive.

One $5 floppy disk held as much data as a 200-foot spool of paper tape, and a floppy-disk drive could be had for around $500. The combination of the microprocessor and the floppy disk drive meant that, in Kildall's words, "It was no longer necessary to share computer resources." In other words, the elements of a personal computer were at hand. Well, most of the elements. Gary soon found that some important components were still annoyingly missing.

By this time, an industry was developing to create these floppy-disk drives in volume, and Shugart was the pioneer of this industry. Once again, Gary traded some programming for some hardware, getting himself (and the microcomputer lab) a Shugart disk drive. But for the disk drive to work with the Intellec-8, another piece of hardware was needed, a controller board that fit in the Intellec-8 and handled the complicated communication between the computer and disk drive. This piece of hardware, unfortunately, did not exist.

Gary tried his hand more than once at building the controller. When that proved more challenging than he expected, he explored the idea of using a different magnetic medium—ordinary audio tape, mounted on a conventional tape recorder. His efforts in interfacing a tape recorder with the Intellec-8 were no more successful than his efforts to build a disk controller. It soon became clear that his considerable programming expertise was no substitute for the hardware knowledge needed to build a device that would connect the Intellec-8 with an efficient storage device. It is worth noting that Kildall was well ahead of his time: When MITS, IMSAI, and other companies began marketing microcomputers, they began with paper-tape or magnetic-tape storage. It would be several years yet before disk drives came into common use on microcomputers.

Finally, in 1973, admitting hardware defeat, Gary turned to an old friend from the University of Washington, John Torode. Torode would later found his own computer company, but in 1973, he was just doing a favor for his old friend. "John," Gary said, "we've got a really good thing going here if we can just get this drive working." Torode got the drive working.


Meanwhile, Gary found himself involved with another hardware engineer on another microprocessor-based project. This project, for all its apparent frivolousness, was the first hint of any genuine commercial ambitions on the part of Gary Kildall. The project was the ill-fated Astrology Machine.

Ben Cooper was a San Francisco hardware designer who had worked with George Morrow on disk systems and later would, like Torode, start his own computer company, Micromation. In the early '70s, he thought he could build a commercially successful machine to cast horoscopes, and he enlisted Gary's help.

The business was not a success—"a total bust," Gary later called it. Still, the Astrology Machine gave Gary the first field test of several programs he had written and rewritten over the past months: a debugger, an assembler, and part of an editor. He also wrote a Basic interpreter that he used to program the Astrology Machine. Since, for Gary, there was little distinction between his academic work and his commercial or consulting work, he passed the tricks he came up with along to his students; the tricks from writing the Basic interpreter went to a young naval officer named Gordon Eubanks (today, president and CEO of Symantec). All the programs, with the exception of the interpreter, became part of the disk operating system he was writing to control the controller that Torode was building.


As they worked on the hardware and software to make the computer and disk drive work together, Kildall and Torode traded thoughts on the potential of these microprocessors. Neither of them thought that the computer system in the back of Gary's classroom would have a very large market. Microprocessors, they thought, as most everyone at Intel itself thought, would see their chief use in smart consumer devices, like blenders and carburetors. Kildall and Torode did see a small market for development systems like the Intellec-8, but only among the engineers who would be designing and developing those smart blenders and carburetors. This view was fostered by Intel management. In fact, Intel's top brass was even more conservative about the potential market for the devices than Kildall. When Gary and some Intel programmers wrote a game that ran on the 4004 and suggested that Intel market it, Intel chief Bob Noyce vetoed it. The future of microprocessors was elsewhere, he told them; "It's in watches."

When Intel passed on marketing the game, Gary wasn't fazed. He more or less agreed with Noyce about the market. But when the company turned down a piece of software closer to Gary's heart, he began to think that he might have a better sense of the value of microcomputer software than the powers-that-be at Intel.

When Torode finished the controller, Gary polished the software to control it. This was a disk operating system, the first such for a microcomputer, and Gary called it CP/M, for "Control Program/Monitor" or "Control Program for Microcomputers." He presented it proudly to Intel and suggested a reasonable price for it: $20,000. Intel passed. The thinking, apparently, was that the target market for Intel development systems was people like Gary, and since Gary had written some impressive software for the 4004, 8008, and 8080 without an operating system, clearly an operating system was not necessary for the target market. Not $20,000 necessary, anyway. Intel did buy Gary's system programming language, PL/M, but not CP/M.

Gary had been doing his consulting and development work under the name "MAA," or "Microcomputer Applications Associates." MAA (that is, Kildall) completed CP/M in 1974. It was a spartan system, containing only what was essential. It was also remarkably simple, reliable, and well suited to the limited microcomputers of the day. Gary believed in CP/M, and if Intel didn't want it, he was sure there were a lot of hardware hackers and engineers who would. He could sell it himself.


Gary might have been content to run a small ad in the back of one of the electronics magazines, maybe putting a note on that bulletin board. What actually ensued was a little more ambitious. At Dorothy's urging, the Kildalls formed a corporation. Gary would do the programming and Dorothy would run the business. She started using her maiden name, McEwen, so she wouldn't be seen as just "Gary's wife." They incorporated, dropping the MAA name and calling their corporation "Intergalactic Digital Research Inc." This was later shortened to Digital Research Inc. And they started selling CP/M.

It was the beginning of the personal-computer revolution. (Everything Gary had been doing up to then was prerevolution.) The Altair had been announced, and a flock of other startup companies were starting work on microcomputers, usually kits but sometimes assembled systems, some with a paper tape interface but many with no satisfactory provision for data storage. They needed disk drives, and they needed a disk operating system.

In those days, there was no model for software pricing, so Digital Research's first customers got some pretty good deals. When Tom Lafleur came to them wanting a license for his company, GNAT Computers, Dorothy gave him unlimited rights to use CP/M as the operating system on any product his company produced, for a whopping $90. Within a year the price had gone up by a factor of 100.

The deal with IMSAI in 1977 was the turning point, and Dorothy knew it. Until 1977, Digital Research was, like most of the industry, little more than a hobby. And until 1977, IMSAI had been purchasing CP/M from Digital Research on a single-copy basis. But IMSAI, with its grandiose plans to sell thousands of floppy-disk-based microcomputers for use in businesses, wanted to restructure the deal. Marketing director Seymour Rubinstein (later of WordStar fame), a shrewd negotiator, haggled with Dorothy and Gary, ultimately arriving at a license fee of $25,000. Rubinstein gloated. He felt that he had virtually stolen CP/M from them. He respected Kildall's programming expertise, but thought the Kildalls were babes in the woods when it came to business. Perhaps they were, but the Kildalls' perspective was a bit different. After the IMSAI deal, Digital Research was a real, full-time business. The IMSAI deal also solidly established CP/M as the standard, and other companies followed IMSAI's lead in licensing it. CP/M quickly became and remained so solid a standard that, until IBM introduced a personal computer, Digital Research faced no serious competition.

And the programmers who would provide that competition were still working at MITS in Albuquerque.


After the IMSAI deal, Digital Research began to grow rapidly. Although it wasn't a financial necessity, Gary continued to teach at the Naval Postgraduate School for years after the founding of DRI. DRI itself felt very academic. Relationships tended to be collegial, the atmosphere casual, discussions animated and cerebral. Or not so cerebral: The atmosphere sometimes was less like a college classroom than a college dorm. Gary liked to rollerskate through the halls, and once conducted an employee interview in a toga.

The staff was young, eager and deeply loyal.

"Gary built a campus in Monterey," Alan Cooper would later remember. DRI "was collegial in every respect." It was only when the company didn't function like a college that Gary got frustrated. Employees would come to him expecting him to solve business problems, marketing problems, personnel problems. He didn't know the answers; didn't really want to think about the problems. What he wanted to do was write code. "Code was his element," Cooper says.

So he wrote code, keeping out of the business end of things as much as possible. He improved CP/M, making it more portable. Certain features of the program were logically independent of the hardware, while others were intimately dependent on the exact features of the machine the program was running on. Gary shrewdly carved out the smallest possible set of machine-dependent elements, and made them easily field customizable. The result was that DRI could write one version of CP/M, and hardware vendors, field engineers, or whoever could customize it to their particular hardware configuration. This approach would be reinvented years later as the "hardware abstraction layer," but Gary had it down cold in 1978.
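The layering described above can be sketched in modern terms. This is not actual CP/M code, and the class and method names are illustrative only: the point is that the portable core (CP/M's BDOS) talks to hardware solely through a small machine-dependent interface (the BIOS), so porting to new hardware means rewriting only that interface.

```python
class BIOS:
    """Machine-dependent layer: each hardware vendor reimplements only this."""
    def console_out(self, ch: str) -> None:
        raise NotImplementedError
    def read_sector(self, track: int, sector: int) -> bytes:
        raise NotImplementedError

class TeletypeBIOS(BIOS):
    """One hypothetical hardware port of the BIOS, simulated in memory."""
    def __init__(self):
        self.output = []                    # stands in for a console device
        self.disk = {(0, 1): b"HELLO"}      # stands in for a disk drive
    def console_out(self, ch):
        self.output.append(ch)
    def read_sector(self, track, sector):
        return self.disk.get((track, sector), b"")

class BDOS:
    """Portable core: written once, runs unchanged atop any BIOS."""
    def __init__(self, bios: BIOS):
        self.bios = bios
    def print_string(self, s: str) -> None:
        for ch in s:                        # only ever touches the BIOS interface
            self.bios.console_out(ch)
    def load(self, track: int, sector: int) -> bytes:
        return self.bios.read_sector(track, sector)

bios = TeletypeBIOS()
core = BDOS(bios)
core.print_string("OK")
print("".join(bios.output), core.load(0, 1).decode())
```

Swapping in a different `BIOS` subclass is the modern analogue of a field engineer customizing CP/M for a new machine.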

Even his second-in-command was—let's not mince words—a total code geek. Tom Rolander was exactly the sort of person Gary liked to have around him: just a kid in a candy store when it came to programming, without a business bone in his body. There weren't many business-boned bodies at DRI. But the company did have the operating system that you pretty much had to run on your computer system. It had the market because it had the technology.

DRI didn't actually have the entire market. In the early '80s, the Apple II was the largest-selling machine that did not run CP/M, but it was also the largest-selling machine, period. The base of software for CP/M systems was large and growing, and Microsoft, seeing an opportunity, made an uncharacteristic move into hardware: It developed a SoftCard for the Apple II that would let it run CP/M. Then it licensed CP/M from DRI to sell with the SoftCard. Soon Microsoft was selling as much CP/M as DRI.


Gary had moments of doubt about whether this was what he wanted to be doing with his life. In one of the darkest of those moments in the late '70s, Gary passed the parking lot by on his way in to work, and continued around the block, realizing that he just couldn't bring himself to go in the door. He circled the block three times before he could force himself to confront another day at DRI.

Later, in frustration, he offered to sell the company to friends Keith Parsons and Alan Cooper. Parsons and Cooper were running one of the first companies to deliver business software for microcomputers, a kitchen-table startup named "Structured Systems Group." Gary was fed up with all the pointless games and distractions of business. They could have the whole operation for $70,000, he told them. As for him, he would go back to teaching.

It was a dream: There was no way it would have happened. Keith and Alan had little hope of coming up with $70,000, and Dorothy would never have okayed the deal. Dorothy's self-taught business skills would be sorely tested in the near future, but in the late '70s, she knew well enough that the Kildalls had something worth a lot more than $70,000 in DRI. By 1981, it was obvious to the dullest wit that she was right: In that year, there were some 200,000 microcomputers running CP/M, in more than 3000 different hardware configurations, a spectacular testament to the portability that Gary had designed into CP/M. That year, the company took in $6 million. Digital Research employed 75 people in 1981 in various capacities. It had come a long way since its inception only seven years earlier in Gary and Dorothy's house.

That was also the year that IBM announced its plan to build a personal computer.

The story has been told often—and variously—of how Digital Research lost the IBM operating-system contract to Microsoft, and how this made Microsoft's success. It had a big impact on DRI, too.


From that point on, DRI was going in several directions at once. DRI was one of the first personal-computer companies to seek venture-capital funding to go public. The VCs were willing, but insisted that strong management be brought in to get the business under control. Gary was thrilled by the idea of bringing in someone on whom he could unload all the annoying business decisions. John Rowley got that job. Gary quickly disappeared into the fold of Tom Rolander and the developers, and was rarely seen elsewhere. In particular he was rarely seen with John Rowley.

Personable, bright, enthusiastic, Rowley nevertheless struck some around him as just a bit unfocused. He was routinely late to meetings, and he called a lot of them. If there was an overall strategy to his actions, it wasn't obvious. Sometimes, one employee later recalled, he forgot to pay his bills—his own bills, that is—and dunning letters would find him at work. He was boundlessly enthusiastic, but as company direction shifted from week to week, the optimism got old quickly.

But Rowley may very well have been doing the best anyone could under the circumstances. The circumstances being that the company remained Gary's company, and actions Gary took or authorized could drive the company into one market or another. And did.

Gary wrote a version of the programming language LOGO for his son Scott. He just thought it would be a cool thing and that Scott could learn about programming and logic from it. Then he handed it to Rowley, saying, what should we do with this? And that was how DRI LOGO became a product. Tom Rolander was fascinated with the Apple LISA (the slo-ow predecessor to the Mac), and set one up in his office. He messed with it for quite a while, but nothing ever came of that. Fortunately. Gary was also intrigued by the LISA/Mac user interface, and began exploring that realm. The company's focus was supposed to be operating systems, but the result of Gary's interest in user interfaces was that one of the many varieties of CP/M then under development got sidetracked into a user-interface shell that would sit atop an operating system. That was "GEM," a Mac-like UI for non-Mac computers. Apple thought it was a little too Mac-like and threatened to sue, and DRI caved. It couldn't have been lost on Gary that Microsoft, which also had a Mac-like UI called "Windows," did not (then, at least) get sued.

The company was making lots of money at first, but it was also making some serious mistakes. Not keeping customers happy was one of the worst. DRI in the early '80s occupied a role similar to Microsoft in the '90s: Everybody depended on it and resented it for that. But DRI just wasn't sufficiently responsive to customer complaints and requests.

Alan Cooper blames Gary. When anyone would tell Gary that he ought to add a particular feature, "Gary would try to argue you out of it." He didn't want to pollute good code with kludged-on features. The PIP command exemplified his attitude. In CP/M, you "Pipped" to drive B from drive A; in MS-DOS, you "Copied" from A to B. Gary thought that there was nothing wrong with using the command PIP to copy, and that any halfway intelligent person could master the concept that you copied (or pipped) from right to left. Bill Gates let people do it the way they wanted. "That difference in attitude," Cooper says, "is worth twenty million dollars." Gary didn't care. What Gary was interested in was inventing.
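As a concrete illustration of the difference described above (the file name here is hypothetical), the same copy operation looked like this on the two systems:

```
A>PIP B:=A:REPORT.TXT         CP/M:   destination on the left
A>COPY A:REPORT.TXT B:        MS-DOS: source first, then destination
```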

On Cooper's first day at DRI, he recalls, Gary took him to Esther Dyson's high-level industry conference. "He gave me John Rowley's badge, and we climbed into his Aerostar and flew [to Palm Springs]. I remember running into Bill Gates and saying I had just joined Gary Kildall in research. I said I was working in research and development. He chuckled that Gary had set up an R&D department. He considered R&D to be part of what everybody did. Bill was right."

Gary, however, wanted to segregate R&D from the mundane concerns of the business. He wanted a skunkworks, a small crew that pursued projects on the basis of interest, just as pure academic researchers follow the interesting idea rather than worry about someone's bottom line. And he did.

Some good ideas came out of the skunkworks, although the best mostly came from Gary. He did groundbreaking work on CD-ROM software and on interfacing computers and video disks. A company, KnowledgeSet, came out of that work. So did a CD-ROM-based Grolier's Encyclopedia, a product that showed everyone how to do CD-ROM content. Microsoft's later enviable position in the CD-ROM content market owes a lot to Gary Kildall's good ideas and Bill Gates' ability to spot a good idea and pounce on it.

In the midst of the rest of the confusion, Gary and Dorothy split up. It was more than a private, personal matter, since both of them stayed at DRI. So did the other woman. The atmosphere grew more tense than it already was.


As Digital Research floundered and flailed, Microsoft flourished. Sometimes Microsoft flourished in ground cleared by Gary Kildall, as in the case of MS-DOS, as in the case of multimedia/CD-ROM technology. The legend of Bill Gates as the technological genius who invented everything in the personal computer realm grew, while a dwindling percentage of computer users had even heard of Gary Kildall.

Kildall was always gracious about this.

At least publicly he was gracious. Inwardly, he hid a bitterness that few ever saw. One day, though, Cooper got a glimpse of Gary's depth of feeling about proper credit for invention.

"Kildall took me aside once, about '83. [He started] talking about Apple. He opened this door, and I saw the bitterness: 'Steve Jobs is nothing. Steve Wozniak did it all, the hardware and the software. All Jobs did was hang around and take the credit.'" Cooper was not blind to the implications of this. Kildall resented that Gates, this dropout, this businessman, was getting credit for things that Kildall had invented. "All of a sudden there was this cauldron of resentment. It must have tortured Gary that Bill Gates [got all the credit]."

Whether Kildall's resentment of Bill Gates was fair or not—and it is important to repeat that it was never publicly expressed—it was probably inevitable. When you look at the allocation of credit in the computer industry from a collegial, academic perspective, it does seem that Bill Gates and Microsoft have, now and then, got credit that rightfully belonged to others. It's hard to defend the idea that this is the right perspective to use in looking at an industry, but a collegial, academic perspective was exactly the perspective from which Gary Kildall viewed his world. He could hardly help but feel wronged.


Gary never went back into academia, staying with DRI to its end, when it was sold to Novell in 1991. At Novell, all traces of DRI products and projects quickly dissolved and were absorbed like sutures on a healing wound.

Gary then moved to the West Lake Hills suburb of Austin, Texas. The Novell deal had made him a wealthy man. His Austin house was a sort of lakeside car ranch, with stables for 14 sports cars and a video studio in the basement. He owned and flew his own Lear jet and had at least one boat. In California, he kept a second house: a mansion with a spectacular ocean view on 17 Mile Drive in Pebble Beach. He started a company in Austin to produce what he called a "home PBX system," called "Prometheus Light and Sound." He did charitable work in the area of pediatric AIDS. It should have been a good life, but all was not sublime. His second marriage was ending in divorce, and there were signs that lack of credit was continuing to eat at him.

Then, on July 11, 1994, Gary Kildall died from internal bleeding, three days after a fall at the Franklin Street Bar and Grill in downtown Monterey. He was 52.


And there the history of Gary Kildall and Digital Research ends. But it is more than mere politeness to say that a legacy remains. In March, 1995, the Software Publishers Association posthumously honored Gary for his contributions to the computer industry. They listed some of his accomplishments:

  • He introduced operating systems with preemptive multitasking and windowing capabilities and menu-driven user interfaces.
  • Through DRI, he created the first diskette track buffering schemes, read-ahead algorithms, file directory caches, and RAM disk emulators.
  • In the 1980s, through DRI, he introduced a binary recompiler.
  • He defined the first programming language and wrote the first compiler specifically for microprocessors.
  • He created the first microprocessor disk operating system, which eventually sold a quarter million copies.
  • He created the first computer interface for video disks to allow automatic nonlinear playback, presaging today's interactive multimedia.
  • He developed the file system and data structures for the first consumer CD-ROM.
  • He created the first successful open-system architecture by segregating system-specific hardware interfaces in a set of BIOS routines, making the whole third-party software industry possible.
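The last accomplishment on the list, segregating machine-specific hardware interfaces into a set of BIOS routines, is the idea that made the same OS and application software run on hundreds of different machines. A minimal sketch of the concept in modern Python terms follows; the class and method names here are invented for illustration and are not CP/M's actual BIOS entry points:

```python
# Sketch: portable OS code calls a fixed hardware interface (the "BIOS");
# each machine port supplies its own hardware-specific implementations.

class Bios:
    """Abstract hardware interface; one subclass per machine."""
    def con_out(self, ch):        # write one character to the console
        raise NotImplementedError
    def disk_read(self, sector):  # read one sector from the disk
        raise NotImplementedError

class MachineA(Bios):
    """Hypothetical port: console is a list, 'disk' is a dict of sectors."""
    def __init__(self):
        self.console = []
        self.disk = {0: b"hello"}
    def con_out(self, ch):
        self.console.append(ch)
    def disk_read(self, sector):
        return self.disk.get(sector, b"")

def os_type_sector(bios, sector):
    """'Portable' OS code: runs unchanged on any machine implementing Bios."""
    for byte in bios.disk_read(sector):
        bios.con_out(chr(byte))

machine = MachineA()
os_type_sector(machine, 0)
print("".join(machine.console))  # -> hello
```

Porting to a new machine means writing only a new `Bios` subclass; the OS and the applications above it never change, which is the property that let a third-party software industry target one interface instead of each vendor's hardware.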

That's a good list, as far as it goes. But friends and students might make a different list, citing his gift for explaining, his patience, his high standards in his work, his generosity. Those who knew him in later years in Austin might cite his pediatric AIDS work.

These things are worth remembering, and represent a real positive impact on the world, whether remembered or not.

As for Kildall's place in computer history, it certainly shouldn't be as The Man Who Wasn't Bill Gates. "It was," as his friend and colleague Alan Cooper puts it, "Gary's bad luck that put him up next to the most successful businessman of a generation. Anyone is a failure standing next to Bill Gates."

He was by any measure an admirable man, a business success, an inventor of importance, a humanitarian.

And, above all, a teacher. "When Fortune magazine writes about Gary Kildall," Cooper said, "they don't see him in his natural habitat: a university." Kildall was never happier than when he was in that academic habitat, solving tough problems and sharing the solutions that he discovered openly with others.

He should have stayed in academia, a relative later said. It's what he loved. But in a sense, he never really left.

Copyright © 1998, Dr. Dobb's Journal
Dr. Dobb's Web Site Home Page

The true story of ... Gary Kildall, inventor of the smart phone

by George Wenzel

I try to tell Gary's story to anyone who will listen. I tried to have everything I know about Gary added to the Wikipedia page, but there I was outvoted... If it is reported in the press, it has more weight than if someone who was there reports what really happened.

Remember the story about Gary flying his plane instead of meeting with IBM when they wanted to put CP/M on the PC? And how missing that meeting caused IBM to go with msdos?

That story is rubbish.

The PC was shipped with a crappy, buggy version of msdos because msdos was cheap. The user was expected to pay $250 for CP/M if they wanted to run the current commercial software (WordStar and others). CP/M was an upgrade path for the PC. The real story was that IBM thought of CP/M as the "good stuff," a premium offering. But that isn't even a tiny part of the whole story...

Did you ever own an original IBM AT? On the IBM AT, the power-up POST routine printed the words "AT Multiuser System" at the top of the screen during power up. What the heck is the AT Multiuser System??? The AT prototypes didn't boot msdos; they booted a secret new OS. This new OS (I don't know its code name) was a multiuser dos. The design was simple: you buy an AT, and it comes with one or more 8-port serial cards. You buy Wyse terminals (the ones with the PC emulation mode, as opposed to modes like VT100 emulation), put one on each desk, and run serial cables to your AT. This OS was a true multitasking OS, and the AT was a small, cheap minicomputer. THAT was what the AT was designed to be. Period.

So what the heck happened?

IBM had a successful project: the first manufacturing run filled the warehouse, the software was ready, and it was time to start packaging the retail boxes. The first of the manufactured systems used up the last of the initial supply of 286 chips before Intel switched to a lower-cost run of the 286. The new 286 chips had some minor changes to increase yields, allowing the price to be lowered; apparently the yields were low on the original version that IBM started manufacturing with. At some point IBM found a bug in the OS that had slipped through the QA tests... It was a subtle bug that only had an impact when the machine was multitasking with multiple users, and as it turns out, it was only present on the systems with the new 286 chips.

There was a lot of finger pointing and hand-wringing going on over it. Something was broken in the protected-mode functions. IBM sales execs were either going to have to insist Intel use the more expensive, low-yield version of the 286 chip, OR IBM was going to have to scrap the multiuser system. IBM is managed by the sales department. Someone in sales did the math: we use the expensive 286 chip and sell one AT to each customer's office of 15 to 20 people... OR we sell them 15 or 20 machines with a cheaper chip. Which makes more money? It was decided that scrapping the multiuser system would, in theory, increase system sales volume. Even if the 15 to 20 machines were PCs instead of ATs, more money was made selling many systems than selling one.

So what happened to the multiuser system? The 8-port serial boards were ground into dust, except for a few units that escaped with the beta testers (I used to have a bunch of these). The Wyse terminals still had the PC emulation mode, but the expected sales volume of those never happened, so Wyse suffered as their R&D investment in the project was scrapped. DR got paid for their work on the OS, but they never saw any revenue from sales. Bill Gates went from being a footnote in history to being who he is today. The first widely distributed multitasking OS never saw the light of day for several more years, and Gary went from being the grandfather of the small-computer industry to a footnote. The operating system sat on a shelf for a couple of years, but was dusted off and used when IBM started developing scanner cash registers. The new cash registers needed some sort of network or centralized computer, so DR got to sell a couple hundred copies of an OS that was supposed to sell millions (billions by today). The software got a new name: ConcurrentDOS.

Did Gary miss an IBM meeting because he was flying a plane? Sure; he missed meetings all the time because travel prevented timely arrival. He was a pilot and liked to fly, and sometimes that caused travel delays. Did that meeting cause Bill Gates' OS to win over CP/M? Not even close!

In engineering circles, there is a great deal of secrecy with regard to unreleased projects. The press had no way to know the real story back then. Some low-level IBMer may have been aware of a missed meeting, made a connection (that didn't exist) between that and the fact that CP/M was not dominant, and leaked that to the press. That may even have become the semi-official story, since the real story was secret. But you and I have more proof that my version of the story is true than the press has for theirs. I don't have my boxed set of ConcurrentDOS any more (it was lost along the way), but you have a copy of it in your archive... and there is probably more than one IBM AT around that still displays the AT Multiuser System banner. ConcurrentDOS went on sale to the general public as soon as 386 chips were available, as they had a properly working protected mode. But by then it was too late: IBM didn't want to sell cheap minicomputers any more, and the whole industry had already grown accustomed to msdos.

If Intel hadn't broken the 286 chip, Microsoft would be remembered as the company that wrote software for the Apple ][. I purchased my first copy of CP/M from Microsoft... they were selling a Z80 board for the Apple and shipping it with a copy of CP/M. That would have been my last memory of Microsoft; I would never have heard the name Bill Gates, and we would have had several generations of multitasking and graphical interfaces years earlier than the ones Microsoft eventually created. Gary already had the graphical OS; he would simply have dropped it on top of ConcurrentDOS.

This is just one of the stories I have; the others are more widely known... But the story of ConcurrentDOS and the AT Multiuser System goes untold. That story had a big impact on Gary. He died a rich man, but he also died a defeated man. Bill Gates' and Microsoft's anti-competitive activities haunted Gary. He didn't like talking about it much; he preferred to innovate something better, with the hope of securing a technology patent that would reestablish his deserved status. He did create the next killer app, but he died too soon to finish the push to get it into the hands of the public.

I have two of my prototype wire-wrap boards from the Intelliphone project, as well as one of the first printed-circuit prototypes. I'm the only one who knows how it was supposed to work, because I'm the one who designed it, and I never documented my hardware beyond schematics and verbal meetings with the engineers and customers. Gary should have patented his UI. We should also have patented my memory design.

The Intelliphone hardware consisted of a board called the PPU, and a PC. The PC was mainly used to program the PPU, and to hold a bunch of PPUs to form a communications mainframe to be located in telco central offices. The PPU was also designed to be packaged into a single chip and located in a cell phone. The PPU did not do RF; the RF component was considered transient technology, expected to change frequently with different phone networks. The PPU was the computer and interface controller for the phone.

The PPU was also a subsystem inside the central-office mainframe. In the mainframe, the PPUs resided on a PC bus, but the PC was just a power supply and file server to the PPUs. The PC could not transfer data fast enough for the application, so an additional bus was designed into the PPU. The PPUs had two communications paths in addition to the PC bus: multiple T1 lines, and a shared memory path. The shared memory path was essentially a crossbar memory switch. The crossbar switch allowed large blocks of data to be transported from one PPU to another instantly, without wasting time on the PC bus. The T1s carried internal voice data, giving 12 PPUs full-duplex voice paths between them for each T1, and the design allowed as many T1s to be added as needed for the size of mainframe being built.

My crossbar memory switch was actually more complex than a "simple" crossbar switch, as the intent was to have other capabilities, such as blocks of memory that are "broadcast" to multiple CPUs, in addition to the simpler bank switching that most crossbar switches do. But my hardware design was dependent on the software engineers writing software to make use of it. The prototypes only memory-switched between two PPUs, so much of the memory design went unimplemented, because I quit before any large mainframe systems were prototyped.
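The crossbar memory switch described above can be loosely modeled in software: memory banks whose mapping to processors is switched (handing a whole block to another PPU without copying it over the PC bus) or broadcast (mapping one bank into several processors at once). Since the real PPU design was never documented, everything in this sketch, including the class and method names, is invented for illustration:

```python
# Illustrative model of a crossbar memory switch: data moves between
# processors by remapping banks, not by copying bytes over a shared bus.

class CrossbarSwitch:
    def __init__(self, n_banks, bank_size=16):
        self.banks = [bytearray(bank_size) for _ in range(n_banks)]
        self.mapping = {}            # processor id -> bank index

    def attach(self, cpu, bank):
        """Simple bank switching: map one bank to one processor."""
        self.mapping[cpu] = bank

    def broadcast(self, bank, cpus):
        """'Broadcast' mode: map the same bank into several processors."""
        for cpu in cpus:
            self.mapping[cpu] = bank

    def write(self, cpu, data):
        self.banks[self.mapping[cpu]][:len(data)] = data

    def read(self, cpu):
        return bytes(self.banks[self.mapping[cpu]])

xbar = CrossbarSwitch(n_banks=4)
xbar.attach("ppu0", 0)
xbar.write("ppu0", b"voice frame")
# Hand the block to ppu1 by remapping, not by copying over the bus:
xbar.attach("ppu1", 0)
print(xbar.read("ppu1")[:11])  # -> b'voice frame'
```

The point of the design is visible even in this toy: the transfer from ppu0 to ppu1 is a single mapping change, independent of block size, whereas moving the same block over the shared PC bus would cost time proportional to its length.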