Electrical Engineering’s Identity Crisis

Image: Retro File; Image manipulation: Richard Tuschman

More than a century ago, electrical engineering was so much simpler. Basically, it referred to the technical end of telegraphy, trolley cars, or electric power. Nevertheless, here and there members of that fledgling profession were quietly setting the stage for an era in industrial history unparalleled for its innovation, growth, and complexity.

That decades-long saga was punctuated early on by spark-gap radios, tubes, and amplifiers. With World War II came radar, sonar, and the proximity fuze, followed by electronic computation. Then came solid-state transistors and integrated circuits: originally with a few transistors, lately with hundreds of millions. Oil-filled circuit breakers the size of a cottage eventually gave way to solid-state switches the size of a fist. From programs on punch cards, computer scientists progressed to programs that write programs that write programs, all stored on magnetic disks whose capacity has doubled every 15 months for the past 20 years [see “Through a Glass”]. In two or three generations, engineers took us from shouting into a hand-cranked box attached to a wall to swapping video clips over a device that fits in a shirt pocket.
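
As a rough back-of-the-envelope check on that figure, doubling every 15 months compounds to an enormous factor over two decades. The short Python calculation below is illustrative only and does not appear in the original article.

    # Back-of-the-envelope: what "doubling every 15 months for 20 years" adds up to.
    months = 20 * 12              # 240 months in 20 years
    doubling_period_months = 15   # one doubling every 15 months
    doublings = months / doubling_period_months
    growth_factor = 2 ** doublings

    print(f"{doublings:.0f} doublings -> about {growth_factor:,.0f}x the starting capacity")
    # Prints: 16 doublings -> about 65,536x the starting capacity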

Today, at its fringes, electrical engineering is blending with biology to establish such disciplines as biomedical engineering, bioinformatics, and even odd, nameless fields in which, for example, researchers are interfacing the human nervous system with electronic systems or striving to use bacteria to make electronic devices. On another frontier—one of many—EEs are joining forces with quantum physicists and materials scientists to establish entirely new branches of electronics based on the quantum mechanical property of spin, rather than the electromagnetic property of charge.

What EEs have accomplished is amazing by any standard. “Electrical engineers rule the world!” exclaims David Liddle, a partner in U.S. Venture Partners, a venture capital firm in Menlo Park, Calif. “Who’s been more important? Who’s made more of a difference?”

But as the purview of electrical engineering expands, does the entire discipline risk a kind of effacement by diffusion, like a photograph that has been enlarged so much that its subject is no longer recognizable? For those in the profession, and those at universities who teach its future practitioners, this is not an abstract issue. It calls into question the very essence of what it means to be an EE.

Through a Glass: As electrical engineering has become more complex, it has also become more abstract. Nowadays, circuit designers usually work on representations of their designs, rather than on the physical realizations of those designs. In 1969 (top), an RCA engineer checked a pattern for a layer of an integrated circuit, at a time when ICs had just thousands of transistors. In 1984 (bottom), a worker at Bell-Northern Research viewed the layout for a circuit board on a computer.
Photos: Top: Henry Groskinsky/Time Life Pictures/Getty Images; Bottom: Roger Ressmeyer/Corbis

“I remember hearing the same sort of words 20 years ago,” says Fawwaz T. Ulaby, professor of electrical engineering and computer science and vice president for research at the University of Michigan, in Ann Arbor. Indeed, two decades ago, in its 20th anniversary issue, IEEE Spectrum ran an article describing how the drive toward abstraction and computer simulation was reshaping electrical engineering [see “The Engineer’s Job: It Moves Toward Abstraction,” Spectrum, June 1984]. Breadboards and soldering irons were out; computer simulations and other abstractions were in.

If anything, the variety of things EEs do has actually increased since then. If you are an EE, you might design distribution substations for an electric utility or procure mobile communications systems for a package delivery company or plan the upgrade of sprawling computer infrastructures for a government agency. You might be a project manager who directs the work of others. You might review patents for an intellectual property firm, or analyze signal strength patterns in the coverage areas of a cellphone company. You might preside over a company as CEO, teach undergraduates at a university, or work at a venture capital or patent law firm.

Maybe you work on contract software in India, green laser diodes in Japan, or inertial guidance systems in Russia. Maybe, just maybe, you design digital or—more and more improbably—analog circuits for a living. Then there are the offshoots: field engineering, sales engineering, test engineering. Lots of folks in those fields consider themselves EEs, too. And why not? As William A. Wulf, president of the National Academy of Engineering (NAE), in Washington, D.C., notes, the boundaries between disciplines are a matter of human convenience, not natural law.

If your aim is to define the essence of the electrical engineering profession, you might ask what all these people have in common. Perhaps what links them is the connection, however indirect, between their livelihoods and the motion of electrons (or photons). But is such a link essential to defining an EE? Not to Ulaby.

“Engineers tend to be adaptive machines,” he says. Even though there’s little resemblance between the details of what he learned in school and the work he does now, Ulaby, who is also editor of the Proceedings of the IEEE, has no doubt that he himself is an EE.

Engineers are doing less and less design of circuits and getting further from the MESSINESS—and SATISFACTIONS—of the real world

David A. Mindell of the Massachusetts Institute of Technology, in Cambridge, says the perception that the field is heading toward unrecognizability is a constant. (This associate professor of the history of engineering and manufacturing also designs electronic subsystems for underwater vehicles.) Perhaps the biggest change to the electrical engineering field occurred in 1963, when engineers who worked with generators and transmission lines and engineers who worked with tubes and transistors finally agreed that they were all part of the same discipline.

That was the year the American Institute of Electrical Engineers (AIEE), whose membership consisted largely of power engineers, agreed to merge with the Institute of Radio Engineers (IRE) to form the IEEE. In the 1980s, jokers were already suggesting that the IEEE should become the Institute of Electrical Engineers and Everyone Else. Then, as now, many observers worried that such mainstay specialties of the profession as power engineering and analog circuit design were stagnating, while all of the interesting progress took place at the boundaries between electrical engineering and other fields.

Forced to choose a single core activity of electrical engineering, many technologists would probably pick circuit design, in all its various manifestations. It wouldn’t be anything like a unanimous choice, of course, but it would make sense in much the same way as identifying surgery as the archetype of the medical profession, say, or litigation as the heart of lawyers’ work. Circuit design is, after all, what non-EEs tend to associate with electrical engineering, if only in a vague way. And if a connection to moving electrons is a fundamental characteristic of an EE’s occupation, then circuit designers must be counted among the elite.

By that standard, Tom Riordan is an EE’s EE. Now a vice president and general manager of the microprocessor division at chip conglomerate PMC-Sierra Inc., in Santa Clara, Calif., Riordan started his career in the late 1970s, when circuit design was king and designing your own microprocessor, he says, “was the be-all and end-all” of an electrical engineering career. Riordan helped design a single-chip signal processor at Intel Corp. and created special-purpose arithmetic units at Weitek Corp. He then played a key role in developing the design for the central processing unit of the single-chip reduced instruction set computer (RISC) that made what was then MIPS Computer Systems Inc. a commercial success in the early 1990s.

That kind of deeply technical 14- to 16-hour-a-day work, mixing intimate knowledge of architectural principles with the intricacies of semiconductor layout required to get a chip working at speed, is what Riordan still thinks of as engineering. He designed a floating-point unit for MIPS and oversaw the architecture of a couple more generations of CPUs before starting his own company, Quantum Effect Devices Inc., where he guided about a dozen engineers over the hurdles of creating MIPS-compatible custom processors. On the side, he negotiated with customers and dealt with investors and investment bankers.

After PMC-Sierra bought Quantum in 2000, Riordan dropped much of the CEO side of his job. This shift, he says, gives him roughly one day a week of what he calls “real engineering”—helping to make complex tradeoffs in CPU architecture or reviewing the niceties of yet another reduction in the size of a chip feature. He may not get into the same level of technical detail on every project as he once did, but he asserts that knowing the ins and outs of nanometer-scale circuit design is still part of his job.

Of course, the definition of hands-on has changed drastically in the past 20 or 30 years. Designers in the 1970s and 1980s still built prototypes out of parts they could see with the naked eye. And when those prototypes didn’t work, they attached oscilloscope probes to suspect points until they found the source of the problem. Those days are fast becoming a fond memory.

For the past 10 or 15 years, at least, “you couldn’t debug a system into working,” says John Mashey, a former chief scientist at Silicon Graphics Inc., in Mountain View, Calif. When you’re building on silicon, the first chip out of production has to “more or less work,” he adds, maybe not at the full speed or with all the functions intended. But if the chip doesn’t do most of what it was designed to do, a project will lose months getting to market while waiting for a new fabrication cycle. So design now means endless rounds of simulation and modeling. And design engineers effectively become programmers as they type the “source code” representing their circuits into the tools that will ultimately generate a layout.
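
To make that shift concrete, the toy Python sketch below treats a circuit the way a simulator does: as a description to be evaluated exhaustively in software, not a board to be probed. It is purely illustrative; working designers describe circuits in hardware description languages and commercial design tools, and none of the names here come from any actual tool.

    # Illustrative only: a "circuit" that exists solely as source text and is
    # verified by running a model, not by attaching an oscilloscope probe.

    def half_adder(a: int, b: int) -> tuple[int, int]:
        """Gate-level description of a half adder: sum = a XOR b, carry = a AND b."""
        s = a ^ b
        carry = a & b
        return s, carry

    # Exhaustive simulation run: check every input combination against expectations.
    for a in (0, 1):
        for b in (0, 1):
            s, carry = half_adder(a, b)
            assert (s, carry) == ((a + b) % 2, (a + b) // 2)
            print(f"a={a} b={b} -> sum={s} carry={carry}")

The point is not the half adder itself but the workflow: correctness is established by exercising a model of the design, long before any physical hardware exists to poke at.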

Where designers once built, breadboarded, poked, and probed, they now simulate. And almost all of the modeling, analysis, and synthesis that designers do, Riordan points out, would be unthinkable without the nearly two orders of magnitude by which computing power has increased in the past decade.

As Moore’s Law continues its relentless advance, engineers who build systems—whether chips or boards—seem to be doing less and less actual design of circuits and ever more assembly of prepackaged components. Circuit designers are working with bigger and bigger functional blocks, assembling them with increasingly powerful tools, and getting further from both the messiness and the simple satisfactions of working in the real world.

Cold Warrior: In 1960, an electrical engineer at a Radio Free Europe transmitting station in Munich, Germany, analyzed broadcast signals. The work was part of these stations’ constant struggle to be heard over Soviet-bloc jamming efforts, which cost an estimated US $35 million—roughly double the cost of running the stations.
Image: Bettmann/Corbis

Mashey points out that for a system on a chip, or SOC, designers don’t even lay out blocks of circuitry. Instead they stitch together CPU blocks, network and video interfaces, cache memory, and other pieces of intellectual property from multiple vendors—each with software instructions that handle the detailed interconnections—to create a custom chip for a set-top box, a toy, or a smart refrigerator. Designers may put together complex systems containing billions of transistors without ever seeing a physical circuit; to the designer, the chip or populated circuit board is merely a collection of files stored on a desktop computer.
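
A rough sketch of that stitching-together appears below, in Python rather than any real SoC design flow. The block names, vendors, and methods are all hypothetical; the sketch is meant only to show a chip assembled as a parts list of licensed intellectual property rather than as hand-drawn circuitry.

    # Hypothetical sketch: an SoC described as a catalog of licensed IP blocks.
    # No real vendor's blocks or tools are represented.
    from dataclasses import dataclass, field

    @dataclass
    class IPBlock:
        name: str
        vendor: str

    @dataclass
    class SoC:
        name: str
        blocks: list[IPBlock] = field(default_factory=list)

        def add(self, block: IPBlock) -> None:
            # In a real flow, "adding" a block also means configuring its bus
            # interface, clocks, and address map; here it is just list membership.
            self.blocks.append(block)

        def bill_of_materials(self) -> str:
            return "\n".join(f"{b.name} (licensed from {b.vendor})" for b in self.blocks)

    # Assemble a set-top-box chip entirely from off-the-shelf pieces.
    soc = SoC("set_top_box_soc")
    for blk in (IPBlock("cpu_core", "VendorA"),
                IPBlock("video_decoder", "VendorB"),
                IPBlock("ethernet_mac", "VendorC"),
                IPBlock("l2_cache", "VendorA")):
        soc.add(blk)

    print(soc.bill_of_materials())

The designer's artifact, in other words, is a configuration of other people's blocks; the transistors themselves never come into view.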

Although such an abstract, project management-style view of engineering may be what the future holds, it could well leave current generations of engineers behind. Some technologists have always embraced management; others (such as Riordan) have taken on management tasks only reluctantly. If managing becomes what engineers do, might a very different kind of person make up most of the engineering population? The NAE’s Wulf doesn’t think so: he politely scolds his interviewer for parroting the old stereotype of engineers as gizmo-focused loners. As long as engineering involves using technology to make new things, he argues, that’s what engineering types will do, even if it involves work that looks like a combination of anthropology, marketing, and project management.

Some engineering schools and departments have been bowing to these trends for years. Rosalind H. Williams, director of the MIT Program in Science, Technology, and Society, helped oversee the institution’s curriculum retooling in the second half of the 1990s. She suggests that assembling parts from disparate sources and cobbling together abstractions makes engineering more akin to project management than to design. Some of the changes in MIT’s curriculum were designed to prepare engineering students for management-related careers. Others, like the addition of biology to the core curriculum, respond to changes in the world where students will live and work.

Already, she says, many of the roughly one-third of MIT students who major in electrical engineering and computer science, or EECS, view it as a sort of technical liberal arts degree that prepares them for a wide range of technical and nontechnical jobs. Indeed, after earning their undergraduate degrees, about a quarter of MIT students go directly into jobs in finance or management consulting.

One crucial problem, Michigan’s Ulaby says, is giving students a sense of the potential breadth of their field without sacrificing solid training in its fundamentals. It takes time for students to absorb the mathematical rigor associated with the material, he says. With demand for both a broad perspective and a rigorous grounding in an ever-enlarging set of core subjects, it is not surprising that the four-year engineering degree is under pressure, as it has been for decades. Wulf, for example, states flatly that the four-year engineering degree should not suffice as a first professional qualification. A. Richard Newton, dean of the College of Engineering at the University of California, Berkeley, proposes that students take a fifth year tackling real-world problems far from home to improve their practical and cultural understanding of their discipline’s role in society.
