
Evolution of the microchip

27 July 2020

Latest Library Blog post. This month we are very pleased to welcome guest writer David Fegan, MRIA, who takes us through the history of the microchip.

The ubiquitous microchip

Since the 1960s the technological advancement of society has been impressive across many fields and disciplines. It is difficult to identify any aspect of early 21st-century living that has not undergone radical transformation through the proliferation of intelligent devices incorporating miniaturised electronic circuitry and flexible embedded software which orchestrates functionality. Nowhere is this more evident than within the modern sophisticated mobile phone. For those of a certain age group, it is almost surreal to recall how things were a quarter of a century ago, particularly in Ireland, where the primitive land-line telephone was our sole means of interpersonal electronic communication, assuming of course that one was prepared to wait one to two years for connection! How times have changed.

Consider the contemporary mobile phone. Routine capability embraces communication, messaging, photography, navigation, email, internet browsing, remote commerce, bill-payment, data-hosting, access to the cloud and a host of other applications (apps), many of which are free to use and simple to download and install. Truly, your personal cellphone is a breathtaking example of the pervasiveness of integrated circuitry in the guise of the microchip.

Figure 1: Contemporary (2020) integrated circuit (IC) controller board for a robot. Credit: Neil Mac Giolla Phadraig

A contemporary mobile phone is far too valuable and expensive to deconstruct simply for the sake of a photograph. As an alternative, Figure 1 displays the internal details of a contemporary robotic controller circuit card, featuring a myriad of electronic components including individual microchips: the very small black rectangular objects. What follows is an overview of how microchips evolved from earlier generations of electronic developments and how they have come to dominate modern electronic design and manufacture, underpinning the new industrial revolution.

First generation electronic technology - Vacuum tubes (Valves)

During the first half of the 20th century, the humble radio was probably the most emblematic and pervasive electronic device to be found in homesteads across much of the world. As the art of broadcasting slowly improved, both in quality and transmission range, radio began to transform the quality of individual lives, delivering news, documentaries, debate, music, art, culture and much more. Often the radio (or perhaps a radiogram) was more than just a communications receiver facilitating regeneration and conversion of broadcasts into audible sound through some apparently magical process. It was also an elegant piece of furniture, featuring an illuminated glass plate upon which were printed the names of many of the world’s great broadcasting stations, exotic locations such as Hilversum, Stavanger, Strasbourg, Athlone, Droitwich, Kalundborg and many others. Tuning into the world was simply a matter of rotating a knob until a moving pointer aligned with the station of interest. By operating at night, and preferably with a connected external long-wire antenna, success was almost always guaranteed, despite random bouts of atmospheric interference. The aura and experience of listening on vintage cabinet radios was special, expressed beautifully in the mid-1970s by Linda Ronstadt singing the introductory line of Warren Zevon’s song ‘Carmelita’ – ‘I hear mariachi static on my radio and the tubes they glow in the dark ...’. Indeed the tubes did glow in the dark, frequently visible through a slotted grill at the rear of the radio.

Invented in America very early in the 20th century, electronic vacuum tubes became important in underpinning the emergence of what in time came to be called consumer electronics. Essentially evacuated glass envelopes with heated metallic cathodes, these tubes (or valves) allowed a large flow of internal charge carriers (electrons) to be controlled by a much smaller current, essentially a process of signal amplification. Used in association with resistors and capacitors, these power-hungry, bulky, high-voltage valve devices became essential to the successful development of radio, TV, avionics, early vintage computers, military hardware and a host of related industrial and technological fields. World War II stimulated the invention of specialised tubes, leading to some modest miniaturisation as devices became smaller, lighter and more robust, driven particularly by the demands of the world’s air forces and navies for greater compactness coupled with enhanced reliability. Not unexpectedly, wartime requirements also stimulated financing and research into alternative approaches to electronic component design and use.

Second generation electronic technology - Transistors

In particular, the potential of materials such as Germanium and Silicon was studied and explored, in the hope that they might offer some alternative to valves, which were constrained by their failure rates and unreliability. Naturally occurring elements, Germanium and Silicon were known to facilitate electron flow within the solid material itself – hence their designation as semiconductors. Just before Christmas 1947, at Bell Laboratories, New Jersey, William Shockley, Walter Brattain and John Bardeen invented the world's first transistor. A device possessing three terminals (connections), the transistor exhibited amplification in a controlled fashion. It could also be used as an electronic switch, alternating between a state of full conduction and one of complete cut-off extremely rapidly. The invention was formally announced by Bell Laboratories on June 30th, 1948; the press release heralded the development with the following words - “known as the transistor, the device works on an entirely new principle, and may have far-reaching significance in electronics and electrical communication ... its essential simplicity, however, indicates the possibility of widespread use, with resultant mass-production economics.” Eight years later, in 1956, the discovery team shared the Nobel Prize in Physics in recognition of the importance of their invention. Since that time, semiconductor materials have been the focus of intense international research, leading to the commercial proliferation of many of the modern era’s electronic wonders.

Why was the transistor so important? The transistor was almost certainly the most transformative invention of the immediate post-World War II era. Here was a novel device of remarkably small bulk, fabricated from cheap, common, naturally occurring elements, which could be made to change its conduction state effortlessly. Designed to operate at low voltage (typically 5 to 12 volts DC), the transistor was safe to use and not constrained, like valves, by the excess heat produced from wasteful power dissipation. The inherent reliability of this solid-state device was to prove a huge attraction to both industrial and research communities. Apart from military applications, two immediate post-war large-scale projects were on the developmental horizon. Increasingly, the construction of reliable large-scale computers had become constrained by the operational limitations and reliability of valves. Frequently, many hundreds of valves, all with limited operational lifetimes, were required to perform a myriad of computational tasks. So, as with incandescent light bulbs, the next component failure was never far away. In addition, building on the successful wartime invention of radar, the emerging challenge of designing and constructing a dependable air traffic control (ATC) system demanded extreme reliability, given the implicit safety requirements. Also, from the perspective of consumer electronics, it appeared that the transistor offered new and unique opportunities in the development of hearing aids, miniaturised portable radios, aspects of TV circuitry and much more. Here then was a unique opportunity offering potentially attractive returns to farsighted financial investors.
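The switching behaviour described above is ultimately what made transistorised computers possible: treat each transistor as a voltage-controlled on/off switch and logic gates, and hence arithmetic, follow. The short Python sketch below is purely illustrative (an idealised model of my own, not anything taken from the devices discussed here), showing how such switches compose into NAND gates, from which any other logic can be built.

```python
# Idealised sketch: a transistor treated purely as an on/off switch,
# composed into the logic gates from which computers are assembled.
# This is an illustrative model only, not a circuit-level simulation.

def npn_switch(base_high: bool) -> bool:
    """An idealised NPN transistor used as a switch: it conducts fully
    when its base is driven high, and is fully off otherwise."""
    return base_high

def not_gate(a: bool) -> bool:
    # A conducting transistor pulls the output low; when it is off,
    # a pull-up resistor holds the output high.
    return not npn_switch(a)

def nand_gate(a: bool, b: bool) -> bool:
    # Two switches in series to ground: the output is pulled low
    # only when both transistors conduct.
    return not (npn_switch(a) and npn_switch(b))

# NAND is functionally complete: every other gate, and hence an entire
# digital computer, can in principle be assembled from it.
def and_gate(a: bool, b: bool) -> bool:
    return not_gate(nand_gate(a, b))

def or_gate(a: bool, b: bool) -> bool:
    return nand_gate(not_gate(a), not_gate(b))

if __name__ == "__main__":
    for a in (False, True):
        for b in (False, True):
            print(f"{a!s:>5} NAND {b!s:>5} -> {nand_gate(a, b)}")
```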

As is often the case with radically novel inventions, it took time before the invention manifested itself in day-to-day products. The transistor was no exception, primarily due to the practicalities of growing pure, high-quality semiconductor crystals. In addition, there was the question of patent rights to be sorted out. The patent-holding company, Western Electric, only began offering manufacturing licences and training in 1951, to interested companies prepared to pay a $25,000 licence fee in advance against royalties. Alongside the giants of US electronics, a small, far-sighted Texas company reincorporated itself as Texas Instruments (TI) and, with a small but highly talented team of engineers, physicists and chemists, gained an early lead in the black art of making silicon transistors of quality and worthwhile yield. By mid-1954, the company had developed a prototype NPN silicon transistor, effectively a sandwich of negative-positive-negative structure, with a central region about one-thousandth of an inch thick. The actual transistor constituted a small bar (about a quarter of an inch long), cut from the much larger grown-junction silicon crystal. An important aspect of the manufacturing process was the development team’s ability to selectively “dope” the crystal melt (add impurities to it) in a controlled manner as it was growing.

Figure 2 depicts the design sketch of TI’s first silicon transistor. A typical transistor may be described as a semiconductor sandwich of positive-negative-positive (PNP) or negative-positive-negative (NPN) regions within Germanium or Silicon, featuring the emitter, base and collector segments of what are known as bipolar transistors. Nothing was ever to be quite the same after the successful manufacture of the transistor, the term “semiconductor” becoming synonymous with the new technology.

Figure 2: Design sketch of Texas Instruments' first junction silicon transistor. Image courtesy of DeGolyer Library, Southern Methodist University, Texas Instruments Records.

A smash-hit, the first transistor radio

Perhaps not surprisingly, the market response to TI's immediate and successful demonstrations of silicon-transistor capability was somewhat underwhelming, since cautious potential users and manufacturers were sceptical of incorporating the novel technology into existing or planned consumer products. This was a matter of real concern for TI, with now-proven new devices in both germanium and silicon, but no immediately obvious market. Patrick Haggerty, chief executive of TI, realised that the company needed some dramatic exposition of the power and flexibility of the transistor, aimed preferably at a mass consumer market. He hit upon the idea of a shirt-pocket-sized portable radio, engineered from transistors, and tasked his TI engineering team to rapidly design and build a prototype. A critical limiting factor was the sheer cost of early transistors, about $8 each in 1953, by comparison with the cost of a typical radio valve (tube), about $1. The original portable radio concept envisaged a six-transistor germanium implementation which, through ingenious brainstorming, was quickly reduced to just four transistors, very important in keeping the cost down for potential customers. Assembled and built by an Indianapolis-based company operating under the trade name Regency, the radio was launched in time for the Christmas market of 1954, with a price tag of $49.95. While the radio may have fitted all shirt-pockets, it may not have fitted all US pocket-wallets! Figure 3 displays, side by side, the cramped internal components and the front visual aspect of the Regency. Nevertheless, in excess of 100,000 Regency TR-1 portables were sold in the first year. Thereafter, the sheer novelty, compactness and performance of the product boosted sales in spectacular fashion. Serendipitously, the launch of the pocket portable radio was to have an inestimable influence on emerging teenage youth culture, coinciding with the arrival of “Rock ’n’ Roll” and teenage rebellion.

The vision of Patrick Haggerty for his company may have been realised through the success and impact of the Regency TR-1, but it was not a profitable venture for TI, a consequence of trying to keep the price below $50. Much more important was his focus on the fast-moving computer market, in particular IBM (International Business Machines) and its formidable president, Thomas Watson Jr. When the TR-1 hit the market, Watson bought 100 of them and passed them out to his sceptical managers and executives, exhorting them with the words - “If that little outfit down in Texas can make these radios work for that kind of money, they can make transistors that will make our computers work also.” Contracts were signed and TI became a successful supplier of transistors to IBM for many years thereafter, as IBM began building its range of commercial computers with transistors. Ironically, it was the Japanese electronics manufacturers that quickly became the world leaders in the production of transistor radios in the latter half of the 1950s, with many brands becoming household names, such as Sony, Toshiba, Hitachi and many more.

Figure 3: Regency transistorised radio, internal components and front aspect. Image courtesy of author.

Third generation electronic technology - The Microchip

Design and employment of transistors rapidly proliferated into communications devices, colour television and less-than-compact reel-to-reel tape recorders. The great attractive features of the transistor were compactness, reliability and rapidly diminishing unit cost, as designers and engineers came to exploit novel and innovative possibilities. Miniaturisation drove the evolution of aviation, communication satellites, and the fledgling space programs where payload constraints had previously been paramount in determining what was technically feasible. The possibilities appeared endless as the transistor revolution got seriously underway in the 1960s. However, other exciting possibilities were on the horizon.

Jack Kilby - engineer

Jack Kilby was an engineer of unusual vision and talent who joined Texas Instruments (TI) in May 1958 and was destined to become an innovator of exceptional originality and influence, literally changing the world of electronic engineering through his design flair and tenacious approach to problem-solving. His remarkable style of researching many different aspects of semiconductor physics embraced both failure and success, as attested by the detail and scope of his investigative laboratory notebooks. He frequently admitted to learning what might be technically possible through a thorough understanding of what was impossible, based on his methodical investigations. Tall, slow-talking and originally from Great Bend, Kansas, Kilby moved to work with TI in Dallas as a member of a team involved in the Micro-Module approach to shrinking circuit bulk and simplifying connectivity. His new employer operated a staff mass-vacation program, but since he had not yet accrued any significant vacation credits he was expected to work through the vacation and come up to speed on the company’s existing plans for the Micro-Module. However, based on experience garnered with his previous employer, he was sceptical and pessimistic about the underlying approach. Left somewhat to his own devices during the vacation, he was inspired to think deeply about whether Germanium or Silicon might offer the possibility of manufacturing resistors and capacitors, in addition to the proven transistors and diodes. The essence of the new approach was the recognition that pure Silicon, being closer to an insulator than a good conductor, is in effect a resistor in its own right. Through selective doping with chemical impurities, conduction may be enhanced in proportion to the degree of doping, so in principle a piece of Silicon could be made to exhibit controlled conduction. Likewise, by carefully controlled doping, a P-N junction could be made to function electrically as a capacitor. Kilby appreciated that neither such unorthodox resistors nor capacitors would match the quality of conventionally manufactured devices, but realised that, given sufficient development time, greater quality and tighter manufacturing tolerances might be attainable. The real gain, however, would be the inherent miniaturisation, whereby all components of a circuit could be laid down within a tiny silicon block, with no internal wiring required. Irrespective of the complexity of a circuit’s design, no soldering would be necessary and the package might involve multiple transistors, determined by the desired functionality of each particular design. Kilby’s notebook entry for July 24th, 1958 included the following modest statement - “The Monolithic Idea: The following circuit elements could be made on a single slice [of silicon]: resistors, capacitors, distributed capacitor, transistor.”
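To make Kilby's point about doped silicon concrete, the short sketch below works through a back-of-the-envelope resistance calculation. It is my own illustration using the standard textbook relations (resistivity ρ ≈ 1/(q·n·μ) and R = ρL/A, with a constant electron mobility), none of which appear in the original post; a real device would also need the mobility's dependence on doping to be taken into account.

```python
# Illustrative sketch (assumptions noted above, not from the original article):
# how the resistance of a silicon bar falls as the doping level rises,
# which is the effect Kilby proposed to exploit for monolithic resistors.

Q = 1.602e-19      # electron charge, coulombs
MU_N = 1350.0      # electron mobility in lightly doped silicon, cm^2/(V*s)

def resistivity_n_type(n_per_cm3: float) -> float:
    """Approximate resistivity (ohm*cm) of n-type silicon: rho ~ 1/(q*n*mu_n).
    Assumes a constant mobility, which overstates conduction at heavy doping."""
    return 1.0 / (Q * n_per_cm3 * MU_N)

def bar_resistance(n_per_cm3: float, length_cm: float, area_cm2: float) -> float:
    """Resistance R = rho * L / A of a uniformly doped bar of silicon."""
    return resistivity_n_type(n_per_cm3) * length_cm / area_cm2

# Example: a bar 0.1 cm long with a 0.01 cm x 0.01 cm cross-section.
for n in (1e14, 1e16, 1e18):            # donor concentrations per cm^3
    r = bar_resistance(n, length_cm=0.1, area_cm2=1e-4)
    print(f"doping {n:.0e} cm^-3  ->  about {r:,.0f} ohms")
```

Being able to span several orders of magnitude of resistance simply by choosing the doping level is what makes silicon usable as a built-in resistor.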

Figure 4: Jack Kilby, electronic engineer. Image: James R. Biard CC-BY-SA-4.0

Following the vacation period, Kilby (Figure 4) shared his theorising with his new boss, Willis Adcock (an expert in semiconductor physics). With the somewhat reluctant agreement of Adcock, Kilby’s speculative ideas quickly became laboratory research investigations. He patiently carved a tiny resistor from a small strip of silicon and then, with a tiny bipolar strip of silicon, made a PN-junction capacitor. On September 12th, 1958, the first functional laboratory device (a phase-shift oscillator) was successfully demonstrated to colleagues, and the era of the monolithic integrated circuit or microchip was born. Kilby’s proto-device is shown in Figure 5.

Figure 5: The first integrated microchip circuit, built by Jack Kilby in 1958. Image courtesy of DeGolyer Library, Southern Methodist University, Texas Instruments Records.

While visually not particularly attractive, the miniaturisation was quite noteworthy, given that the circuit was just a tiny, thin bar of germanium, 7/16 by 1/16 of an inch. Quoting Kilby, “What we didn’t realise then was that the integrated circuit would reduce the costs of electronic functions by a factor of a million to one. Nothing like that had been done before.”

Robert Noyce - physicist

Born in 1927, Robert Noyce spent his early life in the small town of Denmark, Iowa, USA. He graduated from Grinnell College, Iowa, in 1949 with a degree in Physics. An inveterate tinkerer possessed of an overarching desire to understand how the world functions, he was fortunate that his undergraduate years coincided with the emergence of the transistor. Physical electronics and the physics of the solid state were to become the focus of his postgraduate research at MIT, culminating in the award of a PhD in 1953.

Ambitious, flamboyant and extroverted, his career trajectory was meteoric, propelled by his electronic expertise and an innate business acumen. Never short of self-confidence, he set an early employment target of Shockley Semiconductor Laboratories in Palo Alto, California. There is an apocryphal story describing how he rented an apartment for his family before the job interview, or indeed any offer of a position. Co-inventor of the transistor and about-to-be Nobel Laureate, William Shockley had departed Bell Laboratories in order to launch his own plant for transistor fabrication, with prime emphasis on the design and development of high-performance double-diffusion transistors. Semiconductor fabrication plants in the latter part of the 1950s were effectively licences to print dollars. However, despite the prestige of being part of a team working with a Nobel Prize winner, Shockley’s business organisation and management know-how paled into insignificance in comparison with his forte of transistor design. Abruptly, in 1957, Noyce, along with seven other members of the engineering design group, departed Shockley’s operation, collectively investing their savings in the fledgling Fairchild Semiconductor Corporation, in San Jose.

Fairchild quickly became a very successful enterprise, and during the first couple of years Noyce became increasingly aware of shortcomings arising from wiring and interconnection issues, especially in large pieces of electronic equipment such as mainframe computers. Here was a real irony, whereby the beneficial attributes of the transistor enabled the building of large and reliable products, only for them to be compromised by veritable forests of interconnecting wiring, often unflatteringly referred to as “rat’s nests”. Quoting Noyce from that era, “Here we were in a factory that was making all these transistors [thousands] in a perfect array on a single [silicon] wafer, and then we cut them apart into tiny pieces and had to hire thousands of women with tweezers to pick them up and try to wire them together into electronic circuitry.”

The solution, not obvious at the time, was simply not to shred the wafer in the first place. Fairchild had perfected a process for transistor manufacture in the late 1950s, based on the controlled layering of silicon dioxide onto the surface of NPN transistors as a means of reducing unwanted contaminants. Fairchild patented this “planar process” in 1959, but perhaps more importantly, Robert Noyce had the insight to realise that a thin track of copper could be printed on top of the oxide passivation layer, that a resistor could be fabricated in a channel of un-doped silicon, and that a capacitor might be constructed from a PN junction. Furthermore, the metal layer would facilitate connection with the underlying solid-state components, by drilling down and dropping wires to precisely the desired locations in the chip of silicon. After sharing his deepening insights with Fairchild colleague Gordon Moore, Noyce gathered all the threads into four pages of his laboratory notebook, completed on January 23rd, 1959, which formed the basis of a complete integrated circuit template. Effectively, here was a blueprint for what Fairchild considered to be the world’s first monolithic integrated circuit. Figure 6 shows Noyce some time afterwards, holding a template (many times magnified) of an integrated circuit with clearly delineated metal tracks.

Figure 6: Robert Noyce with enlargement of an integrated circuit (IC) design. Image: Intel Free Press CC-BY-SA-2.0.

The patent war

Given two large electronics corporations vying to solve the connectivity bottleneck in dense component circuitry, it was inevitable that progress should be incremental, extremely competitive and veiled in secrecy as to the precise technical details. The commercial roll-out of monolithic solid-state integrated circuits (colloquially, ICs) was destined to be anything but rapid, with many additional worldwide competitors driven by the potentially astronomical profits, should they be allowed to participate. Initially, neither Texas Instruments (TI) nor Fairchild knew precisely what the other company was up to, despite rumours. So, at TI, while Kilby made the successful demonstration of his semiconductor phase-shift oscillator (PSO) circuit in September 1958, little additional progress was made with the technology, as TI was preoccupied with the production of silicon transistors, the mainline product at that time. It was early 1959 before TI publicly announced Kilby’s accomplishment, hailing the breakthrough as the most significant development by the company since the commercial development of the silicon transistor. TI president Patrick Haggerty was forthright in his advocacy of the merits of the microchip, predicting that early applications would be in the miniaturisation of computers, in missile technology, in space vehicles and in artificial satellites. If TI wished to pioneer such developments then the company had to protect Kilby’s concept and working proof of concept, and so a patent application was urgently completed and submitted to the US Patent Office on February 6th, 1959. A serious weakness of the application was that TI had not addressed the practical question of how such a device might actually be engineered, a requirement of patent law. While TI emphasised the originality of the concept and the limitless complexity of circuitry that would be possible through integration of components within the chip, the question of internal connectivity within the chip was vague to say the least. A crude drawing of a hand-wired chip accompanied the patent application, but it was rather more suggestive of a spider crawling over the underlying silicon slab than of a high-technology breakthrough. The vagueness of what the drawing conveyed was a matter of much concern to Kilby, forcing him to add a descriptive paragraph outlining how gold wires might be added as conductive connections, the gold being laid down on an underlying oxide layer.

At Fairchild, a new start-up company, the major focus was on transistor production and marketing, in order to establish the organisation. It was four months after Noyce’s laboratory notebook summary of his concept for the integrated circuit before he was galvanised into filing a patent application, prompted by TI going public with Kilby’s successful demonstration. Fairchild realised that TI had almost certainly already filed for patent protection. The Fairchild application was prepared to emphasise that it was revolutionary in the sense that both component integration and interconnection formed part of a single process. Noyce was exploiting Fairchild’s in-house invention and development of the planar process, enabling the printing of metal strips onto chip surfaces. The patent was submitted to the US Patent Office on July 30th, 1959. Patent assessment is never a speedy business, and while the process trundled on, both companies' major focus remained transistor manufacture, a huge and still-growing market.

From the perspective of bringing a new product to market, however, both companies recognised the need to plan and draw up possible strategies long before the patent issue might be settled, something that could take several years. So, in addition to recruiting talented innovators, engineers and scientists, funding and contracts would be absolutely essential if the integrated circuit was to be successfully brought to market. TI had pre-existing collaborative transistor programs with the U.S. Air Force (USAF) Electronics Components Laboratory. Unofficially, a modest funding initiative from that source enabled TI to begin exploratory research on possible methods of IC fabrication. Additional “unofficial” funding made it possible for TI to develop and demonstrate a pilot IC fabrication facility that produced a small number of devices with the phenomenal price tag of $450 apiece. That encouraging enterprise resulted in the USAF injecting $600,000 into a TI project to build a small digital computer capable of performing assorted mathematical operations. Built from TI integrated circuits, the computer comprised 587 ICs and weighed just 10 ounces, a marvel of integration and miniaturisation. When successfully completed, the enterprise was considered to be years ahead of its time. To convince sceptics of the breathtaking capability of the integrated-circuit approach, TI built a second unit of identical capability, constructed entirely from transistors and other discrete individual components. This latter computer, when constructed, was 150 times larger than the integrated-circuit version, 50 times heavier and used 8,500 discrete components, 14.5 times the component count of the novel microchip version. By 1961, TI ICs were being employed in defence equipment and, by the following year, in the Minuteman missile program.

In contrast, after patent submission the situation at Fairchild was shaped by Noyce’s reluctance to have military interests or defence contractors dictate the terms and objectives of the company’s research, which he preferred to fund from secured private sources. Noyce, however, was more sanguine about selling integrated circuitry to US military customers should the outcome of the company's research strategy lead to a product of direct interest or value to the military, a subtle differentiation of mindset. With the Apollo space project expanding, and with the Minuteman II defence program by 1962 planning to switch entirely to microchips, defence contracts worth $25 million over the subsequent three years seemed to guarantee a future for the microchip, driven effectively by government funding. Beyond that, the anticipation was that computer manufacturers would also gravitate away from transistors to chips, but that was slow to come about. Figure 7 is an enlarged die photograph of a Fairchild logic chip, showing the complexity of component layout in early-generation devices.

Large corporations, among them IBM, had already invested hugely in transistor design for product development and to some extent stood back from moving to integrated circuits, preferring to adopt a wait-and-see policy. In 1969, the IBM 360/85 became the first mainframe from that manufacturer to include some integrated circuitry. Figure 8 features the UCD IBM 360 computing facilities director, Frank Anderson, in the early 1970s.

Figure 7: Greatly magnified die photograph of a Fairchild 74 series logic chip, circa mid-1970s. Image: Robert.Baruch CC-BY-SA-4.0.


Figure 8: UCD IBM mainframe model 360, circa 1971, Frank Anderson, director. Image courtesy of author.

The IBM 370 was to make extensive use of integrated circuits when commissioned in 1971. Directly or indirectly, defence-sourced R&D funding had a huge beneficial impact on commercial electronics businesses in those years. In addition, many new, smaller, specialised companies became independently established on the coat-tails of the technical know-how and experience garnered by scientists and engineers learning the practicalities of chip manufacture on defence contracts. Signetics, founded in 1961 by former Fairchild employees (referred to within the business as “Fairchildren”), was an example of a small company that made a significant impact during the 1960s, with some excellent specialist functional devices. It was just one of many such niche companies that were to ride the microchip wave in those early and exciting days as the new technology mushroomed.

The microchip patent evaluation was to rumble on until April 1961, when Jack Kilby received news that a patent for the invention of the first integrated circuit was being awarded, but not to him at Texas Instruments. The award had been granted to Robert Noyce at Fairchild, the person who was in fact second to conceive of the concept. Naturally, the award gave rise to deep consternation at TI, and an objection was prepared and submitted to the US Board of Patent Interferences, a body charged with the investigation of such conflicts. That board is guided by the principle that priority in invention has to be the prevailing factor. The investigative process was to ebb and flow, back and forth, through minefields of industrial-strength legalese, oscillating from one side to the other until finally, ten years and ten months after Jack Kilby first applied for the patent, Robert Noyce was judged to be the victor in the celebrated case of Kilby versus Noyce. But by that time a whole semiconductor industry had sprung up as the market for integrated circuits took off exponentially. As a matter of pragmatism, in 1966, long before the final patent judgement, Fairchild and TI negotiated with other interested electronics companies and worked out a practical interim agreement on how to proceed. Fairchild and TI agreed to grant licences to one another for IC fabrication. Other companies with ambitions to manufacture chips were issued licences by the two principals, provided they agreed to pay between 2% and 4% of the profits accruing from their chip manufacture. The pathway to the manufacture of microchips had been opened up to all those eager to be involved.

Now, in 2020, more than half a century since the US patent award to Robert Noyce, the scientific and engineering communities largely regard the microchip as a joint accomplishment of Robert Noyce and Jack Kilby, and both men are viewed as co-inventors. Both received many awards during their lives. Noyce passed away in 1990, while in 2000 the Nobel Prize in Physics was awarded to Jack Kilby, who was to pass away five years later. Between them, the two men launched the second Industrial Revolution. Their genius and insight have since permeated the lives of virtually every person on the planet, yet ironically, few people could name the inventors of the integrated circuit. In the contemporary world of computers, mobile devices, the internet, Zoom, Facebook, Twitter and so on, perhaps fewer than one person in a hundred is aware of the microchip and its importance in empowering the society in which we live. Is this lack of awareness a reflection of our society?

Hand-held electronic calculators

When I enrolled as a Science undergraduate at UCD in 1962, almost all undergraduates personally purchased at least some textbooks. The very first purchase that I made, however, was a slide rule, a venerable aid to calculation, essential for solving mathematical problems and for estimating parameters arising from practical laboratory observations and measurements in Physics and Chemistry. The slide rule was invented in 1622 by William Oughtred. Essentially, the device allowed fast calculations to be made which would otherwise involve tediously looking up tables of logarithms. By the 1960s, the utility and functional complexity of top-of-the-range slide rules was impressive, requiring a significant investment of effort on the part of the user in order to master the tricks and sleights-of-hand which in time bestowed effortless numerical wizardry. However, the inherent useful accuracy of the slide rule was typically only three significant digits, and keeping track of where the decimal point should sit in any calculation was a bit of a black art, unless one knew a deeper trick. It is interesting to look back at scientific or engineering magazine adverts dating from the post-World War II era and note how frequently they featured formally dressed research scientists and design engineers enthusiastically manipulating slide rules. Figure 9 illustrates some personal slide rule heirlooms from earlier times.

Figure 9: Slide rules, featuring the fabled Faber-Castell Novo-Duplex, circa 1974. Image courtesy of author.
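For readers who never handled one, the slide rule's trick is worth spelling out: because log(a×b) = log a + log b, multiplication reduces to adding two lengths on logarithmic scales, read off to roughly three significant digits. The short Python sketch below is my own illustration of that principle (it is not code from the article, and a real slide rule of course involved no computation at all, only engraved scales).

```python
# Illustrative sketch of the slide rule principle: multiply by adding logarithms,
# then round the reading to about three significant figures, as a scale permits.
# Placing the decimal point is still left to the user, as it was on a slide rule.

import math

def round_sig(x: float, sig: int = 3) -> float:
    """Round x to 'sig' significant figures, mimicking a slide-rule reading."""
    if x == 0:
        return 0.0
    return round(x, sig - 1 - int(math.floor(math.log10(abs(x)))))

def slide_rule_multiply(a: float, b: float) -> float:
    """Multiply via log10(a) + log10(b), then truncate the precision."""
    product = 10 ** (math.log10(a) + math.log10(b))
    return round_sig(product)

print(slide_rule_multiply(3.14, 2.72))   # about 8.54 (exact answer 8.5408)
print(slide_rule_multiply(365, 24))      # about 8760 (read as 8.76, scaled by the user)
```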

In the early 1960s, it was apparent to some in the electronics business that if the new technology was to prosper, then some application of mass commercial appeal and pervasiveness was essential. Just as the Regency pocket radio had heralded the arrival of the transistor in 1954, something similar might hasten mass public interest in, and engagement with, the microchip. An obvious market possibility was replacement of the mechanical calculator, at that time a clunky, noisy, slow electro-mechanical device of appreciable weight and size and prohibitive cost. By 1964, the Friden EC-130 electronic calculator was just entering the market, its construction based on germanium transistors. Calculations were displayed on a miniature cathode ray tube, essentially a TV screen. The price tag, however, was $2,100. (There is speculation that this may have been the world’s first such device.)

However, the redoubtable Patrick Haggerty at Texas Instruments (TI) was intrigued by the concept of a hand-held electronic calculator, and in 1965 a team was assembled around Jack Kilby to design such a device, small and light enough to fit in the palm of one’s hand, but with the capability to perform basic mathematical calculations. Here was a departure from TI’s established business model, as it planned to enter the consumer market with its own calculator. By late 1966 a working model existed, based on a set of specially designed integrated circuits capable of performing the four basic mathematical functions – addition, subtraction, multiplication and division (+, -, x and /). Within a year the team had made a patent application, but it took a further eight years before it was granted, in 1974. With an integral keyboard and thermal printer, the device was code-named “Cal-Tech”. Although it was never marketed commercially, TI became expert at the design and fabrication of calculator chip-sets, which were later sold on to interested third parties.

In 1972, Hewlett-Packard designed and built the world’s first pocket scientific calculator, the HP-35, illustrated in Figure 10.

One of the ironies concerning electronic calculators is that the design of the earliest models demanded many complicated calculations, a good number of which were made using slide rules. In the process, the slide rules were guaranteeing their own demise! The HP-35 was a wonderfully innovative piece of technology, designed to do everything a slide rule could do, and much more, all very rapidly. A quote from the instruction manual reads –

Figure 10: The HP-35, the first pocket scientific calculator, 1972. Image: Holger Weihe CC-BY-SA-3.0.

“Our objective in developing the HP-35 was to give you a high-precision portable electronic slide rule. We thought you’d like to have something only fictional heroes like James Bond, Walter Mitty or Dick Tracy are supposed to own.” That sentiment alone indicates just how long ago the calculator came into existence. Now, almost 50 years on, hand-held calculators have become largely obsolete, given that, like much else, such functionality hides discreetly within our mobile phones (Figure 11). Yet another triumph of the versatility of the integrated circuit and its many manifestations – microprocessors, memory chips, logic gates, control devices, sensors, communications, the internet and much more.

Figure 11: The iPhone’s integral scientific calculator screen display. Image courtesy of author.

This brief account of the microchip’s evolution began with Figure 1 and the display of a robot’s controller card, designed around integrated circuits. Perhaps it is not beyond the bounds of possibility that, any time soon, the robots of the world will unite and herald the next wave of electronic innovation, in the process rendering obsolete the very microchips which empowered the robots themselves.

David Fegan, MRIA, Emeritus Professor of Astrophysics, UCD.

Former Senior Vice-President and Science Secretary of the Royal Irish Academy (RIA). Former Robert Noyce Fellow at Grinnell College, Iowa, USA.  Author of “Cherenkov Reflections, Gamma-Ray Imaging and the Evolution of TeV Astronomy” [World Scientific, 2019]. 
