Magnus Schaefer

 

Magnus Schaefer is a Ph.D. candidate in the Department of Art History and Communication Studies at McGill University and a 2024–2025 Consortium Research Fellow.

The field of digital signal processing (DSP) comprises a wide range of techniques for transforming and analyzing images, sound recordings, and other media representing physical vibration phenomena. My dissertation traces a historical trajectory spanning from the commercial introduction of electronic digital computers for industrial signal processing applications in the early 1950s to the emergence of DSP as a distinct branch of electrical engineering in the late 1960s. Over the course of these two decades, researchers and engineers developed methodologies for signal processing that fundamentally relied on high-speed computation: digital computers stored data as discrete numerical values and could perform mathematical operations on them far faster than any human or earlier machine, so that calculations that had previously taken weeks to complete became a matter of minutes, if not seconds. Many of these techniques are in wide use today and form the basis for contemporary applications such as medical imaging, machine listening, and video streaming.


My dissertation examines how the historical configurations, goals, and professional cultures of exploration geophysics, seismology, and communication engineering have informed the relations between digital signals and the phenomena they stand in for. I situate the development of digital signal processing and the history of 1960s scientific computing in relation to the oil-centric energy regime that has fundamentally shaped U.S. politics and culture since the 19th century, as well as to questions of labor, infrastructure, and the production of knowledge. Digital signals offer a highly generalizable mode of representation, equally suitable for, say, a quick voice note and high-resolution images of far-away galaxies. Their historical emergence, however, was contingent on a specific mid-20th-century configuration of mathematics, business interests, and Cold War anxieties. The affordances of digital signals were not self-evident or clearly defined when, in the early 1950s, scientists and engineers began applying unfamiliar, unreliable, and expensive high-speed digital computer technology to the problem of identifying meaningful patterns in sequences of numerical values. These early actors found that recent developments in statistical time-series analysis and electronic computation vastly expanded the practical feasibility of interpreting physical vibrations on the basis of their frequency spectra. In theory, this applied to any kind of vibration; in practice, the emergence of digital signal processing was driven by a narrow set of use cases.


The analysis of seismic vibration was one of the earliest major applications of digital signal processing, predating the term itself. My first chapter discusses the proto-digital signal processing techniques developed in the early 1950s by the Geophysical Analysis Group (GAG), an oil-industry-funded group of graduate researchers at the Massachusetts Institute of Technology. In conjunction with his graduate work, GAG’s first director, Enders A. Robinson, had developed a proof of concept for treating seismograms as sequences of discrete values and processing them to bring out spectral characteristics that could indicate the positions of potentially oil-bearing subsurface rock formations. Expanding on Robinson’s research, the group attempted, albeit largely without success, to convince its sponsors that the combination of statistical analysis and digital computation could be a viable technology for oil prospecting.
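To give a rough sense of the kind of operation involved, the sketch below treats a digitized trace as a sequence of discrete values and estimates its power spectrum from its autocorrelation, broadly in the spirit of the correlogram-style spectral estimates of the period. It is a modern Python illustration rather than a reconstruction of GAG’s computations; the trace, sampling interval, lag window, and taper are invented for demonstration.

```python
import numpy as np

# Hypothetical digitized trace: discrete amplitude values at a fixed sampling
# interval (all values here are chosen only for demonstration).
dt = 0.004                                   # sampling interval in seconds
t = np.arange(0, 2.0, dt)
trace = np.sin(2 * np.pi * 30 * t) * np.exp(-2 * t) + 0.1 * np.random.randn(t.size)

# Estimate the autocorrelation of the trace up to a maximum lag.
max_lag = 100
acf = np.array([np.sum(trace[:trace.size - k] * trace[k:]) for k in range(max_lag)])
acf /= acf[0]

# Correlogram-style power spectrum: cosine-transform the tapered autocorrelation.
freqs = np.arange(0, 1 / (2 * dt), 1.0)      # frequencies in Hz, up to Nyquist
taper = 0.5 * (1 + np.cos(np.pi * np.arange(max_lag) / max_lag))
power = np.array([
    acf[0] + 2 * np.sum(taper[1:] * acf[1:]
                        * np.cos(2 * np.pi * f * dt * np.arange(1, max_lag)))
    for f in freqs
])

print(freqs[np.argmax(power)])               # dominant frequency of the trace, in Hz
```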


In my second chapter, I examine how, within a decade or so after GAG’s dissolution in 1957, the oil and exploration geophysics industries came to embrace what they called a “digital revolution”: the rapid and widespread shift from analog to digital data collection and processing in oil prospecting. Digital techniques for spectral analysis made it more feasible to image the subsurface geographies of the Gulf of Mexico and other areas where the standard analog recording and playback equipment available in the early 1960s had been insufficient for obtaining useful data. By the middle of the decade, more and faster computation meant more oil. Advertisements in industry-adjacent publications index the rapid emergence of a market for digital computers and computing services, which drew in established and new manufacturers such as IBM and Control Data Corporation (CDC).


The third chapter follows the migration of knowledge, expertise, and technology from exploration geophysics into Project VELA Uniform. This ARPA initiative supported seismological research that could contribute to deterring the Soviet Union from conducting nuclear tests by creating techniques and infrastructure for remotely monitoring seismic activity. One of the technological outcomes of this research was the cepstrum algorithm for spectral analysis, published in 1963. Though initially devised for seismic problems, the cepstrum was quickly adapted by engineers for speech recognition, and it remains in use today, e.g., in music classification.
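For orientation, the core idea behind the cepstrum can be stated briefly: taking the spectrum of the logarithm of a signal’s spectrum turns an echo, which shows up as a ripple in the log spectrum, into a peak whose position corresponds to the echo delay. The sketch below is a minimal modern formulation in Python, not the 1963 procedure; the test signal and delay are invented for illustration.

```python
import numpy as np

def power_cepstrum(x):
    """Power cepstrum: inverse Fourier transform of the log power spectrum."""
    spectrum = np.fft.rfft(x)
    log_power = np.log(np.abs(spectrum) ** 2 + 1e-12)   # small offset avoids log(0)
    return np.fft.irfft(log_power, n=len(x))

# Invented test signal: a short random pulse plus a delayed, attenuated echo.
rng = np.random.default_rng(0)
pulse = rng.standard_normal(64)
delay = 200                                   # echo delay in samples (made up)
signal = np.zeros(1024)
signal[:64] += pulse
signal[delay:delay + 64] += 0.5 * pulse

ceps = power_cepstrum(signal)
# The echo appears as a peak in the cepstrum near the delay; the lowest
# "quefrencies," dominated by the pulse itself, are skipped in the search.
print(np.argmax(ceps[100:512]) + 100)         # expected to lie near 200
```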


The final chapter focuses on another spectral analysis algorithm, the Fast Fourier Transform, or FFT. Published in 1965, the FFT drastically sped up the computation of frequency spectra for digital signals, opening up possibilities for processing techniques that had until then been prohibitively computation-intensive. IBM sponsored the algorithm’s development, though the project initially appeared to be a minor endeavor with unclear commercial potential. By 1966–67, seismologists and oil industry researchers were showing interest in the FFT, but its use did not gain real momentum until the end of the decade, when electrical engineers working in speech analysis and synthesis took the lead in popularizing it as a general-purpose tool for both existing problems and new applications in a wide variety of fields. The FFT functioned as a catalyst for the emergence of digital signal processing as a discipline with a distinct, teachable methodology and remains one of its most fundamental conceptual and technological building blocks.
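To make the scale of that speed-up concrete: evaluating the discrete Fourier transform of N samples directly requires on the order of N² multiplications, whereas the divide-and-conquer approach published by Cooley and Tukey reduces this to roughly N log N by repeatedly splitting the transform into smaller ones. The sketch below is a minimal radix-2 illustration of that splitting in Python, not a reconstruction of the 1965 implementation.

```python
import numpy as np

def naive_dft(x):
    """Direct evaluation of the discrete Fourier transform: ~N**2 multiplications."""
    N = len(x)
    n = np.arange(N)
    return np.array([np.sum(x * np.exp(-2j * np.pi * k * n / N)) for k in range(N)])

def fft_radix2(x):
    """Recursive radix-2 Cooley-Tukey FFT for N a power of two: ~N log N operations."""
    x = np.asarray(x, dtype=complex)
    N = len(x)
    if N == 1:
        return x
    even = fft_radix2(x[0::2])               # transform of the even-indexed samples
    odd = fft_radix2(x[1::2])                # transform of the odd-indexed samples
    twiddle = np.exp(-2j * np.pi * np.arange(N // 2) / N)
    return np.concatenate([even + twiddle * odd, even - twiddle * odd])

# Both routines agree with NumPy's built-in FFT on a random test signal.
x = np.random.randn(1024)
assert np.allclose(fft_radix2(x), np.fft.fft(x))
assert np.allclose(naive_dft(x), np.fft.fft(x))
```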

 

The Consortium for History of Science, Technology, and Medicine generously supported research trips to the American Philosophical Society in Philadelphia and the Massachusetts Institute of Technology in Cambridge. I am deeply grateful to the Consortium for sponsoring my research. I would also like to thank the librarians and archivists at the APS and MIT, who facilitated access to their collections and pointed me to relevant materials.


The holdings of the APS were essential for gaining a better understanding of the academic and business networks that served as conduits for expertise on time-series analysis and (proto-)digital signal processing in the early 1960s. The APS holds the papers of John W. Tukey, a statistician with parallel appointments at Princeton University and Bell Laboratories, the research and development arm of AT&T. Though he stayed somewhat in the background, Tukey played a pivotal role in the history of digital signal processing. He devised a computationally efficient method for spectral analysis that was central to GAG’s work, and he later co-authored the papers that introduced the cepstrum and the FFT. At the APS, I reviewed his correspondence with, among others, Enders A. Robinson of GAG and with Bruce Bogert and M. J. R. Healy, who co-wrote the cepstrum paper with Tukey, as well as materials related to the development and history of the FFT.


Furthermore, I was able to access the Richard W. Garwin papers. As the director of applied research at IBM’s Thomas J. Watson Research Center, Garwin was the driving force behind the implementation of the FFT algorithm: he initiated the collaboration between Tukey and the IBM mathematician and programmer James W. Cooley and eagerly promoted the algorithm to senior IBM management and among his expansive peer network, albeit initially with limited success. The Garwin papers contain numerous memos and letters from the 1960s documenting these efforts and are an invaluable primary source for reconstructing the origins and early reception of the FFT. Garwin’s papers at the APS also include manuscripts of presentations he gave later in his life, in which he lays out his views of the development of the FFT.


Another collection I benefited from at the APS was the Herman H. Goldstine papers. In the 1960s, Goldstine was the director of the Mathematical Sciences Department at IBM. His correspondence and other materials from this era offer insights into IBM’s scientific computing business and the company’s interest in marketing its products for applications in the earth sciences.


At the Massachusetts Institute of Technology, I focused on records related to the school’s Department of Geology and Geophysics, particularly the papers of Professor Robert Shrock, who had been instrumental in establishing the Geophysical Analysis Group and in fostering connections between the department and the oil and exploration geophysics industry. I was especially interested in the materials that Shrock compiled in the 1970s for a history of the department. While only fragmentary records of GAG’s meeting minutes, correspondence, and computer code appear to survive, at MIT or elsewhere, the Shrock papers allowed me to reconstruct or confirm many key details of GAG’s history in the 1950s and the group’s impact on 1960s exploration geophysics. The archival holdings at MIT also comprise an extensive collection of documents detailing scientific, technological, and administrative aspects of Project VELA Uniform. This collection gave me a better understanding of the overall structure of the program and of the range of academic and commercial collaborators that ARPA contracted with. It also allowed me to corroborate the close connections between GAG and VELA Uniform (for example, several former GAG members contributed research to VELA).
