
                                                                                   ARMAND VAN DORMAEL HOME PAGE




                                                                REFLECTIONS ON THE HISTORY OF COMPUTING                               



Several eminent historians have drawn attention to the gaps they detected in the history of computing. Michael Mahoney noted:


It should be easy to do the history of computing... The major problem is that we have lots of answers but very few questions, lots of stories but no history, lots of things to do but no sense of how to do them or in what order. Simply put, we don't yet know what the history of computing is really about. A glance at the literature makes that clear. We still know more about the calculating devices that preceded the electronic digital computer - however tenuously related to it - than we do about the machines that shaped the industry. We have barely begun to determine how the industry got started and how it developed.1

History helps us to know where we might be going by establishing where we are and how we got there… It is especially important to bear that in mind with respect to computers and computing because they have always been surrounded by hype… and hype hides history.2


Alfred D. Chandler has observed that the evolution of the computer industry is a largely untold story, one that offers business historians of the twenty-first century opportunities for further investigation. Current literature focuses almost exclusively on the history of the American computer industry. And he concludes:


I hope that this review of the opportunities for writing the history of electronic-based businesses and industries…will encourage economic historians trained as historians to return to the history of business enterprises and their industries. If they do, they will be able to open a new field of historical investigation.  Moreover, because so few enterprises were involved in commercializing the products of the new electronic devices, historians will be able to analyze the competitive successes and failures not only of companies but of major industries—successes and failures that led to worldwide domination or the near death of crucial national industries.3


History is a matter of interpretation as well as evidence, of judgment as well as knowledge. That is what makes it so open to gaps and inaccuracies. It should be the result of free investigation and criticism. The most we can hope for is a partial rendering, an approximation, of the real truth about the past.

The history of computing, as it has been recorded, is the 'winner's version' of the past. Hundreds of books and research papers have been published covering the computer industry as it developed in the United States. The American computer industry is part and parcel of a global development; but the United States - the economic and academic superpower - has established the paradigms and the contours of the evolution of computer technology, leaving out essential developments that have not been adequately studied and that leave a trail of question marks. The lack of scholarly literature on the transistor and the microcomputer - the foundational building blocks of the computer revolution - is truly astonishing. Substantially under-investigated, their invention remains misunderstood and colored by a range of myths. Documents recently brought to light render accepted wisdom and established truth obsolete. This should be thought-provoking and raise new insights, a challenge to the historical profession as a whole. A fundamental reappraisal of the issues would open a new field of historical investigation. Michael Mahoney and Alfred D. Chandler would give it their stamp of approval.

Historical truth can be very elusive, even to professionals. It is tempting to believe that events have been accurately recorded and that the story is well known. But a number of problems arise in any attempt to record the past. It is not just a matter of marshalling facts, but a question of seeing what holds the story together. We have to be extremely careful with chronology: what happened, and when? The history of computing is strewn with false claims of priority. Some disputes were settled in court; some by mutual agreement. The logic of history has clarified a number of misunderstandings.

American scholars, swayed by 'pride in our heritage' and - consciously or unconsciously - hegemonic, have focused on developments within the United States. In Europe, the history of computing is a backwater.

The seminal contributions and pioneering breakthroughs in computer architecture by British, German and French scientists are practically unknown. Computer science arose in several places simultaneously and needs an international perspective. Research in the history of science implicitly presupposes a world-view whose purpose is to build a global picture and a single shared history from fragmented pieces, putting every piece into context and catalyzing the whole into a verifiable account. Their juxtaposition should add a new dimension to the scope of research and highlight previously unrecognized connections, allowing historians to judge their verisimilitude, their objectivity and their trustworthiness. Such an enquiry will open several areas of potential disagreement. It will require mental flexibility and a selective abandonment of conventional wisdom. Current historical orthodoxy is strongly embedded in the popular mind and in the history books, and has never been submitted to peer review.


The program-controlled electromechanical computer, the programmable electronic computer, the general-purpose computer, the transistor, the transistor radio, the microcomputer and the World Wide Web originated in the Old World. Their designers and builders did not receive the credit they deserve. Nor indeed did they receive a fraction of the staggering financial returns raked in by those who benefited from the commercial exploitation of their inventions.

Initially conceived for military use, the computer was first successfully commercialized in Britain. The technology was appropriated and perfected by a few large U.S. companies that soon dominated the global computer industry. In Europe, the legacy of the scientific innovators was squandered, largely because of industrial and commercial mismanagement and political and bureaucratic meddling. The seminal contribution made by European scientists to the Information Revolution is practically unknown.


On December 5, 1941, when he pushed the button to start the Z1, his experimental automatic calculating machine, Konrad Zuse heralded the computer age. In 1949, he established the Zuse KG and developed a line of computers that soon became popular with scientists and engineers. Since he did not have the financial resources to develop large machines, he focused production on minicomputers. His Graphomat was the first computer-controlled automatic drawing board. Unable to compete with the large companies, he was forced to sell his firm, which was finally taken over by Siemens. Zuse must be credited with several fundamental inventions: the use of the binary system for numbers and circuits; the use of floating-point numbers, along with the algorithms for translation between binary and decimal and vice versa; the look-ahead, which reads a program two instructions in advance and tests whether memory instructions can be performed ahead of time; and the mechanical binary cells that make up the memory.

In 1946, the French government recruited two German scientists, Heinrich Welker and Herbert Mataré, to set up a production line of diodes intended to replace vacuum tubes as rectifiers in radar receivers. Both were interested in developing a semiconductor amplifier for use in radio and television and in telephone transmission. In the spring of 1948, they were able to produce germanium crystals of sufficient purity to generate a functional amplification effect. Samples were successfully tested in radio and television sets and in the telephone network. Presented to the press by the Secretary of Post, Telegraph and Telephone, they were hailed as "the fathers of the transistron." The French government concentrated on atomic energy and the project was abandoned. Both scientists returned to Germany. Mataré established a company producing diodes and transistors and, in 1953, demonstrated the world's first transistor radio at the Düsseldorf radio fair.

He migrated to the United States and worked as an engineer in several major companies. On 24 February 2003, John Markoff drew the public's attention to Mataré's contribution to the history of computing with an article in The New York Times titled "A Parallel Inventor of the Transistor Has His Moment." Finally, the "French" transistor was rescued from oblivion.

In January 1973, a French government agency, the INRA, took possession of the world's first microcomputer, the Micral. It had been designed and built by François Gernelle and a small team of engineers who developed a microprocessor-based operating system for Intel's 8008. Production was limited and delivered mainly to government departments and large companies. In 1974, the Micral was demonstrated, without success, at the National Computer Conference in Chicago. The editor of Popular Electronics picked up all the technical information he could lay his hands on and gave it to Ed Roberts, the owner of a small electronics company, who put together a computer kit that had the looks of the Micral. They called it the Altair. Paul Allen and Bill Gates added the software. Everyone connected with the Altair project tells a different story of how it got started.

For more than five years, the owner of the French company had within his grasp a world monopoly. His clumsy and erratic management and poor marketing led to bankruptcy. He sold the majority of the shares to Bull. The microcomputer was a disruptive technology and Bull was not ready to give up its traditional production. For several years, the machines were marketed separately under the name Bull-Micral. Gernelle was dissatisfied with Bull's policy. In 1983, he set up his own company, but was unable to compete with the U.S. and Japanese companies. The Micral faded away.


During the years following World War II, a technological gap developed between the United States and Europe. Governments observed with mounting concern their plants and offices operating with American computers, their airlines flying American planes, their people cured by American drugs. Was the gap really technological, or something quite different? On the one side, there was within the United States an apparently irresistible force of invention and application of invention, creating new demand for goods and services. Using their superior financial, technological and organizational capacity, huge and growing corporations could afford the costs of basic research and development and the risks of innovation. Young entrepreneurs were establishing small enterprises for the development and exploitation of advanced technologies. Europe had no such tradition of broad-based industrial entrepreneurship. The European capital market was not organized to provide the initial financing needed by small, innovative enterprises. Venture capital for exploiting new ideas was practically nonexistent. All this, combined with rigid social structures, hampered the formation of new businesses based on new technologies. Governments and business leaders were increasingly worried, without knowing quite what to do about it.

The computer age began in 1953, when IBM switched from punched cards to vacuum tubes for its data-processing machines. IBM set up production in Europe, followed by several competitors. By 1970, American manufacturers supplied 80 percent of the mainframes, while the European office-machine industry produced the remainder, mostly under license. Alarmed by the technology gap, in the early 1980s the European Commission took responsibility for computer technology and decided on a strategy to take on IBM and the other American giants. The Commission's leadership had not noticed that the U.S. computer industry was in the throes of a shake-up, that a new breed of entrepreneurs had eclipsed the vertically integrated companies by developing open systems, and that Silicon Valley had invented an entirely new way of doing business.

For years, European computer companies were kept alive with massive public subsidies, to be eventually undone by the onslaught of American and Japanese manufacturers. Europe is a high-tech disaster area entirely dependent on American technology and Asian productive skills. The extinction of the computer industry is due to a mixture of political and industrial bungling. It covers a multitude of sins and reflects the lack of managerial expertise, the high cost of employing labor, the regulatory burden, the counterproductive bureaucratic and legislative interference in the economy and the anti-entrepreneurial climate.

For decades, Europe's best and brightest scientific minds left in droves. Jean Hoerni studied in Geneva and Cambridge. He invented the planar process, thus building a bridge between the transistor and the integrated circuit. Andy Grove came from Hungary. As chairman of Intel in the early 1980s, he stood up to the Japanese chipmakers who flooded the American market with cheap memory chips. Intel was the only manufacturer to refuse government assistance. Grove's survival instincts turned the company into a sweatshop. In his book Only the Paranoid Survive, he observed that "the balance between American and Japanese producers now depends on their relative capacity to endure pain." Federico Faggin, who had studied at the University of Padua before joining Intel, headed the team of researchers that designed the microprocessor, thus assuring American hegemony in computer technology.


Historical data can be viewed, sampled and weighted in many different ways. The history of computing is a tale largely untold. Several blank spots and discrepancies need to be addressed. Fundamental inconsistencies, ambiguities and contradictions will have to be resolved. My forthcoming book, The Silicon Revolution, sheds new light on the invention of the transistor and the microcomputer, and on the travails of the European computer industry that led to its extinction. It pricks a few bubbles and casts into doubt much of what is generally known about the history of computing. It will take some time, however, before the validity of my documentation is acknowledged, accepted and incorporated into a global, coherent and comprehensive account of the first 50 years of the computer's tumultuous existence.

Little scholarly attention has been focused on the origin of the transistor, the product of decades' worth of human genius and innovation. Its prehistory is practically unknown. The concept of the microprocessor-based computer germinated in the mind of François Gernelle, a young French engineer, when he saw the specifications of Intel's 8008. Together with three software engineers, he built the Micral, the first microcomputer, four years before Steve Jobs and Steve Wozniak showed the Apple II at the West Coast Computer Faire.

What has been missing is an in-depth and across-the-board global analysis. In order to fill this gap, it will be necessary to bring together and authenticate information scattered across multiple clusters, collect and collocate materials of historical value that support, complement and even contradict each other. Historiography is a testing ground not only for factual knowledge but for common sense. It requires an evaluation and critical analysis of commonly held assumptions. It can also become emotionally charged and lead to a biased presentation of facts.

History is shaped by the information that is accessible at the time and in the place it is written. Evidence available in some countries remains unknown elsewhere. The language barrier narrows the horizon, the scope of research and the historian's perspective. Writing world history from a single vantage point inevitably leads to misconstruction. Based on partial information, one's view of historical reality is fragmentary and largely subjective. Even the most authentic account of past events can only be envisioned, narrated and represented by using the "glue" which joins clusters of facts together in such a way that they acquire meaning. This process is inevitable, and poses to the historian the main challenge of his profession, a challenge which he seldom completely overcomes.

Many people in many countries have contributed to the genesis and the development of the computer. Over the centuries, the cumulative learning of mathematicians, philosophers, physicists, chemists and engineers - overwhelmingly German - laid the foundations of the theoretical and practical knowledge that contributed to the automation of arithmetical processes. In the early 20th century, a stay in Germany was de rigueur for any aspiring young physicist. It took a long time to gain insight into the material structures involved in all of these theoretical constructions. Solid-state physics was a domain in which there was only a very modest scientific competence outside Germany, where the term Festkörperphysik was commonly used among specialized physicists.

For many years after the war, the world was not in a mood to give credit to German scientists. Yet their pioneering contributions are central to the subsequent development of applied computing technology. Legend has it that with the exodus of scientists, German technological output lost momentum. In fact, wartime forced-march imperatives and the policy of autarky did wonders for the advancement of civilian and military technology. But ever since the end of the war, German historians and publicists have kept a politically correct low profile about the extraordinary scientific achievements of the 1930s and 1940s, such as the invention of the computer, the jet engine, the radar technology that would lead to the transistor, nuclear fission, long-range missiles, the tape recorder and many other technical innovations that would later benefit mankind. Because of the break between nationalist Germany and the internationalist Nobel institution, followed by the law prohibiting German scientists from accepting the Prize, the names of many of the inventors are hardly known outside Germany.

Solid-state physics progressed immensely before and during the war. Microelectronics came to the forefront of engineering science. But the names of major scientists such as Clemens Winkler, Karl Ferdinand Braun, Walter Schottky, Karl Bädeker, Oskar Heil, Robert Pohl and Rudolf Hilsch are seldom mentioned in the English literature. With the occupation of Germany, the Allies put scientific research under their control. In the United States, the invention of the transistor gave a huge impetus to the study of semiconductors and gradually introduced the discipline into university curricula.

The first functional transistors were developed in France in June 1948 by two German scientists. They were the result of advanced scientific research in the field of semiconductor physics, financed by a government intent on acquiring radar technology. The French authorities presented the "transistron" as "a brilliant realization of French research", and laid it aside. For more than five decades, the memory of the transistron was buried in oblivion - and occasionally suppressed - until The New York Times brought it to public attention.

The personal computer was also invented in France, but because the owner of the small company that produced it was unable to commercialize it successfully, the PC became part of the American heritage. Taken at face value and routinely translated into many languages, the history of their origin has been codified from a purely American perspective, one unaware that transistors and personal computers had been made in France several years before production started in the United States.

The subject is arcane and esoteric and has given rise to legends. A book claiming that computer technology was brought to earth by extraterrestrials whose spaceship crashed somewhere in the desert was a popular bestseller. An amateur engineer who had never worked with computers is credited with the invention of the microcomputer. A fast-talking Frenchman, a friend of the owner of the company that sold the Micral but who never met the engineers who built it, was able to convince American audiences and historians that he wrote the software. A beau mentir qui vient de loin - he who comes from afar may lie with ease.

The transistor was a legacy of radar technology. The development of radar and its employment during World War II are swamped by myths. Truth is the first victim of war. To the victor belong the spoils of war history, the patent and technology booty, and the moral high ground. Truth and credibility become his appanage. In the long run and in most cases, history is self-correcting, thanks to detached and unprejudiced research. Rooting out historical misrepresentation in a sensible way should be a primary responsibility for the scholarly community.


The names and the accomplishments of several scientists whose inventions determine the architecture of the computer as we know it are practically unknown. The documentary heritage is very scant. The interaction between invention, technological innovation and the applications that fueled industrial take-up of information technology has been insufficiently explored. The impact of government technology policies in the United States, Europe and Asia has hardly been probed. Documents that have been laid aside and overlooked for decades occasionally turn up, bringing a new historical perspective and shedding new light on unexplored episodes and half-forgotten names, raising doubts and questions about some of the taken-for-granted truths enshrined in current historiography. The first-hand accounts related in my book should clarify the issues.

It took time, perseverance and a certain amount of luck to discover that transistors were developed and produced in June 1948 by Herbert Mataré and Heinrich Welker, two German scientists employed by a small company under contract with the French government. The microcomputer, the Micral, was invented by François Gernelle in 1972. Both accomplishments represented quantum leaps in human knowledge. Although they are at the origin of the digital age, Herbert Mataré and François Gernelle have been airbrushed from history. As precursors and innovators, they had the misfortune to work under unfavorable circumstances. Nobody remembers the losers. The visionary entrepreneurs who changed the world by commercializing their inventions accumulated enormous wealth and legendary fame, while the pioneering scientists came out empty-handed and ignored, even in their homelands. The journey was worth taking.

The idea of a personal computer was unquestionably thought up by François Gernelle. He had studied physics, chemistry and electronics and held a doctorate in applied physics. Starting in January 1973, the Micral computer went into serial production. It served as a model for the Altair, which spawned Apple and all the rest. The owner of the small company that produced the first personal computer had for several years a world monopoly on an invention made by his employees. Clumsy management forced him to sell out to Bull. François Gernelle is in his sixties and lives south of Paris.



                                                           [Photo: A. Van Dormael; J.M. Ackermann; A. Lacombe; F. Gernelle; J.C. Beckmann]


The personal computer has spawned one of the most unfounded stories of the Information Age. It proves that despite the Internet and instantaneous global communication networks, information does not travel well. Historiography should be a cooperative and cosmopolitan exercise that thrives on inquisitive curiosity, open interaction and the exchange of ideas, information and materials. It should gradually transcend national boundaries and tend toward universality and a world-history standard. By lessening the national focus, we would gain a more reliable and trustworthy picture of the whole.

Who invented the transistor? This simple question has a simple answer that requires a very complex explanation. In June 1948, Herbert Mataré and Heinrich Welker, after more than two years of experimentation in the laboratory of a small French company, Compagnie des Freins et Signaux Westinghouse, located near Paris, succeeded in synthesizing germanium crystals of sufficient purity to bring about transconductance and thus the transistor effect. Both held doctorates in semiconductor physics. Their research relied on engineering work extending over a two-year period and consisting of dozens of small operations which ultimately made it possible to produce a reliable electronic amplifier. The company was under contract with the French Post and Telegraph Ministry. They installed these solid-state devices in experimental telephone lines and in radio and television sets. The French authorities showed no interest. Only after Bell Labs announced the invention of the transistor did a few government officials visit the laboratory. In 1952, Mataré set up a company in Düsseldorf and supplied transistors to the nascent German electronics industry. Together with Welker, he invented the transistor.



                                                           [Photo: H. F. Mataré; Mrs. Van Dormael; Mrs. Mataré; A. Van Dormael]


Dr. Mataré holds more than 80 patents. He has published several books and a vast number of scientific articles. In Von der Radartechnik zur modernen Kommunikationstechnik, he describes how the development of radar technology during WW2, especially receiver development, gave him the idea that a solid-state amplifier was a technical possibility. But the transistor effect based on minority-carrier injection was only achieved in Paris in June 1948, with a better understanding of materials properties and when crystals of higher quality could be obtained. His autobiography, Erlebnisse eines Deutschen Physikers und Ingenieurs von 1912 bis zum Ende des Jahrhunderts, provides a very interesting account of his career. He emigrated to the United States, but never mentioned his contribution to the invention of the transistor until approached by a journalist of The New York Times.

The "transistron" had a long prehistory. It was an extraordinary technological breakthrough and one of the most complex man-made creations in history. It was made possible by the knowledge gained in solid-state physics, a new materials science which had been developed in Germany. Between 1930 and 1945, interdisciplinary research had contributed immensely to the understanding of semiconductors. Until 1938, solid-state research remained essentially academic. It was conducted to advance fundamental knowledge with no thought of practical use, until it seemed to have military and industrial potential as a substitute for the vacuum tube.

In 1957, the "pocket radio" catapulted Sony to the cutting edge of semiconductor technology. In 1953, the company had paid $25,000 to Western Electric for a license to produce transistors. The management of Western Electric had made it clear that the transistor could only be used in hearing aids. Solid-state technology was completely unknown in Japan. It took three years for a team of 40 electronics engineers to familiarize themselves with the arcana of solid-state technology and to learn how to apply it. In 1955, Sony produced its first transistor radio, followed in 1957 by the TR-63, which took the world by storm and established Sony as a world leader in consumer electronics. To achieve this, they had to re-invent the transistor almost from scratch. Sony's chief engineer, Leo Esaki, would later receive the Nobel Prize.

In 1956, William Shockley, Walter Brattain and John Bardeen were awarded the Nobel Prize “for their researches on semiconductors and their discovery of the transistor effect.” Their discovery became an inspiring tale of scientific prowess and was elevated to mythical status in the collective memory. It was also a jewel in the crown of Bell Labs. The Nobel committee is not an investigative body. Moreover, Mervin Kelly, Bell Labs’ director of research, was a foreign member of the Swedish Academy of Sciences.

Crystallography was a new materials science in the United States. Solid-state electronics was an unknown discipline, beyond the realm of generalist engineers. The research that led to the transistor started at Purdue University. The role played by Purdue in the development of the transistor has been vastly underrated. During the war years, research on the germanium project was carried out under strict secrecy. Results were communicated in reports only to those with appropriate clearance.

But towards the end of 1945, all of the Purdue research on semiconductors was declassified. When the Bell Labs group started to operate, the results of the research program became known through Purdue's publications and frequent meetings. In 1947, Purdue supplied the germanium crystals that would enable Brattain's hands-on tinkering to build a transistor. Several of the graduate students who earned their Ph.D.s for their findings in semiconductor research, and who witnessed how Bell Labs took advantage of their work, have offered a highly plausible explanation of the circumstances that led to the invention of the transistor.

The invention of the transistor, as it has been written up, leaves us with a puzzle that poses a challenging test for historians. It should provoke some soul-searching about the story's ability to stand up to rational analysis. Compared with the huge amount of literature devoted to the first-generation electronic computers, very little research has taken place. The technical complexity of the technology is a powerful deterrent to scrutiny. The products are mysterious and the terminology is inaccessible to anyone who does not have a thorough understanding of the basic concepts of solid-state physics.

Personal considerations and loyalties occasionally take precedence over factual substantiation in determining opinions and assessments of historical facts. But historians should not leave themselves open to the suspicion of manipulation. Getting the facts right is essential, both the technical facts and the chronological facts. Sixty years after it was announced, the invention of the transistor is still a riddle wrapped in mystery. Documents recently brought to light fly in the face of conventional wisdom. Journalists write history's first draft. In 2003 - 55 years after the fact - The New York Times revealed that, in 1948, two German scientists had produced transistors on behalf of the French government.

This should be thought-provoking and raise new issues and insights, a challenge to historians committed to broadening the history of computing, and to the historical profession as a whole.  It should stimulate solid-state scientists and historians to take a searching look behind the conventional accounts, dispel a string of misconceptions and recast, wherever necessary, our understanding of the evolution of computing. Should we take conventional history on faith? Endless curiosity and a healthy scepticism are the hallmarks of the true historian.

Every invention generates its own history. The accounts dealing with the invention of the transistor and of the microcomputer are incomplete, inaccurate and in need of revision. Historians will have to make their way across the tortuous road of oblivion, retrieve whatever documentation is available and rely on common-sense reasoning based on empirical evidence. By compiling and confronting documents written in German, French, English and Japanese, and by highlighting connections and interactions where none have been recorded so far, a more accurate vision of the history of computing will emerge. What was once unknown will become obvious. The logic of scientific discovery should transcend historical interpretations of the past, allowing historiography to be rewritten to suit that logic.

The dividing line between historical folklore and history occasionally gets blurred. It should be possible to challenge vested intellectual interests in mainstream history, to question common wisdom, make a logical analysis of the available documentation, engage in an open enquiry and agree on a bona fide historical account of the origin of the transistor and of the microcomputer. Up until now, history has been unfair to the inventors. The winners are those who, by luck or vision, managed to bring the inventions to market, thereby showering themselves with fame and fortune. If The Silicon Revolution gives the inventors their due, it will have achieved its purpose.



1  Mahoney, M. S. Issues in the History of Computing. Forum on History of Computing, ACM/SIGPLAN, Cambridge, MA, 20-23 April 1993.

2  Mahoney, M. S. The Histories of Computing(s). Interdisciplinary Science Reviews 30.2 (2005), pp. 119-135.

3  Chandler, A. D. Gaps in the Historical Record: Development of the Electronics Industry. Harvard Business School, Working Knowledge, October 20, 2003.

   Markoff, J. A Parallel Inventor of the Transistor Has His Moment. The New York Times, Feb. 24, 2003.


Armand Van Dormael

33A Drève de la Meute

B 1410 Waterloo


